Volume control: Statoil’s data management strategy

17 August 2012 (Last Updated August 17th, 2012 06:30)

Sound data management in E&P ought to be a no-brainer, but the challenge lies in the sheer volume of information and the range of users who need it to produce actionable decisions. Nigel Ash quizzes Tore Hoff, leading adviser for exploration data management at Statoil, about the Norwegian multinational's unique approach.


One of the first things that Tore Hoff says about data and information management is that it is definitely not an IT issue. Moreover, companies that have chosen to treat it as such have made their lives unnecessarily complicated.

"At Statoil, data and information management is driven more from the business side," he explains. "There is a synthesis there with the IT. What you need to think about is the constant of what the information represents, regardless of the tools and the technology that are being used."

Statoil's enterprise architecture approach works on the basis that, from the acquisition of land through to the final exploitation of a reservoir, there is a constant flow of information, some of which is critical.

"Of course, along the way you produce something new [from this data]," he says. "So regardless of whether the information is in the best format, form or representation, you need some business objectives. Then, of course, you need to have some tools to handle it. So the tools are a function of the information and the information is a function of the business processes. That's our main approach, and that's why I call it a business process rather than a technology or an IT focus."

Statoil's approach, says Hoff, is that all technical work during exploration is based upon creating value by interpreting, combining and aggregating data to produce information and knowledge that can support sound business decisions.

"The biggest challenge is related to enabling end-users to have all the relevant data in time," he says. "As projects progress, the volume of data increases, while at the same time the deadlines and the time required for doing technical work is shrinking."

Hoff likes to use the image of an ever-narrowing funnel to describe the multiple challenges facing data managers as prospects are identified, analysed, reviewed and, in many cases, discarded.

"You have fewer and fewer projects but with ever-higher data volumes and an increasing investment," he notes. "At some point you step over an invisible barrier and you need agility, speed and accuracy to be able to move fast and deliver data on which crucial decisions can be made.

"All of a sudden high levels of accuracy, precision and quality are required as well as a different quality of data."

Quality control

Data quality management (DQM) is, in Hoff's view, a process that is never complete, because it must always be monitored to ensure it is still producing reliable information and knowledge that lead to the right operational decisions.

Within the entire organisation, he believes it is essential to develop and promote data quality awareness and to define clearly what data quality really is. However, even when a standard has been set, it is imperative that it is profiled, analysed and assessed according to clearly defined metrics.

Beyond the data itself, it is also necessary to understand how it will be used and, moreover, whether the business is asking the right questions. Thus, it is important to define data quality business rules, test and validate data quality requirements, and, finally, use this information to set and evaluate data quality service levels.

Hoff maintains that it cannot be emphasised enough that not only must the quality of data be checked constantly, but also that clear procedures must be in place to manage issues as they arise, and quickly clean and correct data quality defects.

"We're trying to automate as much as possible," he explains. "For this, we need to have sensible work processes, and best practices and guidelines."
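The rule-driven quality checks Hoff describes can be automated along these lines. The sketch below is purely illustrative (the rule names, record fields and tooling are assumptions, not Statoil's actual standards): business rules are expressed as small check functions, and records are profiled against them to count defects per rule.

```python
# Minimal sketch of rule-based data quality profiling.
# Rule and field names are hypothetical, for illustration only.

from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True if the record passes

def profile(records: list, rules: list) -> dict:
    """Assess every record against every rule; return defect counts per rule."""
    defects = {rule.name: 0 for rule in rules}
    for record in records:
        for rule in rules:
            if not rule.check(record):
                defects[rule.name] += 1
    return defects

# Example business rules for well-header records (illustrative only)
rules = [
    QualityRule("well name present", lambda r: bool(r.get("well_name"))),
    QualityRule("depth is positive", lambda r: r.get("total_depth", 0) > 0),
]

records = [
    {"well_name": "34/10-A-1", "total_depth": 2500.0},
    {"well_name": "", "total_depth": -1.0},
]

print(profile(records, rules))  # {'well name present': 1, 'depth is positive': 1}
```

A report like this feeds the clean-up step Hoff mentions: each non-zero count flags a defect class for correction, and the same rules can be re-run after the fix to confirm the service level is met.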

To illustrate his point, he offers the example of a geologist, a primary producer of value-added information, who is interpreting and performing technical work in order to produce a new updated geological map of the UK Continental Shelf.

"So then I need those geologists both to document what they have done and to put in some naming standards, attributes and quality stamps to say what this data and this interpretation is good for," he explains.
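The documentation, naming standards and quality stamps Hoff asks of his geologists amount to structured metadata on each deliverable. A minimal sketch of what such a stamp might look like follows; the field names and the naming convention are assumptions for illustration, not Statoil's actual schema.

```python
# Hypothetical "quality stamp" metadata attached to an interpretation
# deliverable; all field names and conventions here are illustrative.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class QualityStamp:
    interpreter: str                  # who produced the interpretation
    interpreted_on: date              # when the work was done
    confidence: str                   # e.g. "preliminary", "reviewed", "approved"
    fit_for: list = field(default_factory=list)  # uses the data is good for

@dataclass
class Interpretation:
    name: str                         # follows an agreed naming standard
    horizon: str
    stamp: QualityStamp

surface = Interpretation(
    name="UKCS_TopBalder_2012_v01",   # assumed standard: area_horizon_year_version
    horizon="Top Balder",
    stamp=QualityStamp(
        interpreter="J. Smith",
        interpreted_on=date(2012, 6, 1),
        confidence="reviewed",
        fit_for=["regional mapping"],
    ),
)

print(surface.stamp.confidence)  # reviewed
```

Because every deliverable carries the same stamp structure, information managers can align inputs from different specialists without reverse-engineering what each dataset is good for.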


This way, every input from primary information producers can be aligned in the same way by information managers to ensure a high level of standardisation. As Hoff explains it, the data management function is not merely a customer for specialist input, but a partner dedicated to understanding everything that is being provided. Such care at the input stage enhances value and effectiveness at the business decision stage, when the data is combined with other data.

The Statoil way

Hoff sums up the 'Statoil way' of conducting successful data and information management as resting firstly on clear direction and ownership from top management. Upon this is built thinking at a global exploration portfolio level, which avoids the build-up of discrete information silos and enables company-wide data and data management best practice.

No less important is a management system with clear requirements and standardised work processes, coupled with clearly defined data and information management roles. Below Hoff is the project data manager, who coordinates the management of information through E&P, ensuring the data's validity, uniqueness and completeness. Next comes a central project data manager who oversees the data flow between IT platforms, establishes routines and procedures, and supervises training and the use of data management tools.

Thereafter come the data administrators, each of whom is responsible for data management in one of the main subsurface data stores. The function of the data definition owner is to approve all changes to standards. Then there is the actual data owner, such as the exploration manager or the PeTech leader, who decides who else will own the data, as well as on management and information budgets and legal issues.

When it comes to the actual systems and technology that it uses for data management, Hoff says that Statoil seeks the best and the most reliable solution - which does not always mean the newest products on offer.

"There is a delicate balance between being a fast follower and a pioneer," he says. "Leading edge is not necessarily a bad thing, because by being a step in front of some of the rest of the pack, you can actually harvest something good out of the new. But at the same time, there is also a delicate balance between leading edge and bleeding edge."


This article was first published in our sister publication World Expro.