With significant reserve discoveries an increasingly rare occurrence, oil and gas companies have been focusing instead on optimising their existing reservoir assets. However, before any reservoir can be optimised it must have a technology surveillance infrastructure that delivers essential data to support timely and effective decisions about the reservoir’s changing condition and to assess its long-term value.

A quick-fire decision-making process helps to reduce costs and minimise risks. A technology infrastructure provides a continuous flow of accurate real-time reservoir data that underpins effective decision-making, which is the essence of any reliable reservoir asset management system.

“Data-handling work is still the most time-intensive aspect of routine reservoir surveillance.”

According to Trevor Grose, a senior petroleum engineer at BP, the company’s two key focus areas are the management of base production (i.e. existing reservoirs and wells) and the delivery of new subsea well intervention programmes. The latter, he says, is taking BP into uncharted territory.

“With our use of monohull vessels, we’re moving into a domain that is entirely new to the industry as a whole, in both a business sense and a functional sense,” Grose explains. “The challenge is how to realise and exploit the best recovery programme in subsea fields, where recovery factors are generally lower. It all comes down to ease of access. At the moment subsea access is high cost, so our big plan for 2008 is to identify and deliver an optimal solution.”

As Grose admits, this will involve a radical shake-up in the way BP operates as a business. However, it is a plan that the company has already set in motion in the Shetland fields alongside suppliers and partners.


The energy industry has struggled when it comes to putting a value on data, particularly when the data in question lacks a direct connection to production delivery in the current year. “We’re always faced with a short-term need for production delivery actions and the work aligned to that, versus long-term recovery,” says Grose.

“Historically, the sector has struggled to deliver the long-term surveillance required. Instead of concentrating only on the immediate benefits, you need to look at where else you can use the data to realise more value in future reserves. You can’t just wait for the future to happen before you choose to take action on it.”

The role of new data-mining methods as a complement to conventional reservoir surveillance tools looks significant here. Emerging technologies such as neural networks offer capabilities that are of special interest for automated reservoir surveillance, enabling engineers to execute trained models faster in real time, detect trend violations in high-dimensional problems and account for missing values.
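As an illustration only, and using a much simpler statistical stand-in for the neural-network approach described above, a trend-violation check on streamed gauge data might look like the following sketch. The function name, window size and threshold are hypothetical; `None` stands in for a missing gauge sample.

```python
from statistics import mean, stdev

def detect_trend_violations(readings, window=5, threshold=3.0):
    """Flag the index of any reading that deviates from the trailing
    window of good samples by more than `threshold` standard
    deviations. `None` marks a missing value and is skipped rather
    than breaking the scan."""
    violations, history = [], []
    for i, value in enumerate(readings):
        if value is None:        # missing gauge sample: carry on
            continue
        if len(history) >= window:
            recent = history[-window:]
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                violations.append(i)
        history.append(value)
    return violations
```

A real surveillance model would replace the rolling statistics with a trained model, but the shape of the problem, streaming input, tolerance of gaps and an index of flagged events, stays the same.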

Previously, spreadsheet applications were widely used for this purpose, but the approach now looks distinctly outdated. Such systems cannot cope with the vast volumes of data, the degree of automation required or time-critical operations. Data-handling work is still the most time-intensive aspect of routine reservoir surveillance, and spreadsheets, with their limited storage capacity, can hardly be maintained in a way that ensures all engineers work from the same data with the same software tools.


Available supervisory control and data acquisition (SCADA) systems enable information gathering on individual parameter values. Nevertheless, a single pressure gauge does not provide the complete picture of field performance and cannot be used directly for production optimisation. Therefore, real-time surveillance models that include reservoir performance and surface facilities constraints look like an attractive option.
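To make the idea of combining per-well performance with a surface facilities constraint concrete, here is a deliberately simple sketch. The well names, rates and the greedy ranking rule are all hypothetical; a production-grade model would use a numerical optimiser rather than this shortcut.

```python
def allocate_production(wells, water_capacity):
    """Greedy allocation: rank wells by oil produced per unit of
    water handled and open them until the surface water-handling
    limit is reached. `wells` maps name -> (oil_rate, water_rate)."""
    ranked = sorted(wells.items(),
                    key=lambda kv: kv[1][0] / kv[1][1],
                    reverse=True)
    plan, water_used, total_oil = [], 0.0, 0.0
    for name, (oil, water) in ranked:
        if water_used + water <= water_capacity:
            plan.append(name)
            water_used += water
            total_oil += oil
    return plan, total_oil
```

Even this toy version shows why a lone pressure gauge is not enough: the decision about which wells to produce depends jointly on every well’s performance and on what the topsides can handle.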

“Before any reservoir can be optimised it must have a technology surveillance infrastructure.”

“In trying to understand a vast array of data, you need to know where all the interdependencies are,” says Grose. “Sometimes you find there is so much information that you can’t see the wood for the trees. That’s why for the past four to five years BP has been moving towards real-time data collection, analysis and management, where we aspire to be the industry leader.

“We have our own internal systems for capturing and managing data flow and providing that in real time to the engineer working in the office. What the offshore workers see is now replicated in real time for the onshore engineers. It’s a great way to improve operational efficiency, and something we are keen to roll out internationally.”

Harding, a mature field in the North Sea, provides a strong example of how long-term recovery processes can be optimised through the use of emerging technologies. Harding came on-stream over a decade ago, quickly achieving a two-year production plateau before suffering a sharp decline during 2000 as a result of rapidly increasing water cuts.

Using new technology, the Harding team was able to view both downhole information and topsides data simultaneously, which enabled the onshore support team to monitor reservoir performance to within a fraction of a psi (pound-force per square inch). The effect of the daily changes made to each well could then be assessed and modified accordingly.
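As a hypothetical sketch of the daily assessment loop described above (the well names, readings and tolerance are invented for illustration), a day-over-day comparison of per-well gauge pressures might be as simple as:

```python
def daily_pressure_deltas(today, yesterday, tolerance=0.5):
    """Compare per-well gauge pressures (psi) against the previous
    day's values and return the wells whose change exceeds
    `tolerance` psi, together with the signed delta."""
    flagged = {}
    for well, p_now in today.items():
        p_prev = yesterday.get(well)
        if p_prev is None:       # no baseline for this well yet
            continue
        delta = p_now - p_prev
        if abs(delta) > tolerance:
            flagged[well] = round(delta, 2)
    return flagged
```

With downhole and topsides data streamed to the same place, a report like this can be produced for every well every day, which is exactly the feedback loop the Harding team used to assess and modify its changes.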

Better use of downhole data has enabled precise reservoir management through optimal well performance and has opened up new windows of opportunity for further reservoir management improvements. Because the effect of even very small increases in reservoir pressure can be monitored and understood, the true value of pressure support can be better quantified.

For example, enhanced reservoir understanding enabled the sanction of an alternative water-injection project during platform turnaround. One of Harding’s aquifer wells on electrical submersible pump lift was fixed directly to the field’s three water injectors. Exactly as predicted, the effects of the water injection could be seen in real time from the gauge data as the turnaround progressed, with the increased reservoir pressure reducing gas cones in several key wells, giving a dramatic boost to dry oil production.

Five years ago the Harding field reached a second production plateau, with average output in 2002 higher than the figures for 2001.


BP’s experience with Harding shows the benefits of a real-time data acquisition and control system. The key is secure communication with a remote data-management system, streaming high-quality data from existing and new downhole monitoring devices and enabling event notification and remote advice for the control of well-site functions.
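The event-notification element can be sketched with a minimal publish-subscribe pattern. The class, method names and alarm limit below are illustrative, not part of any vendor’s API:

```python
class GaugeMonitor:
    """Minimal event-notification sketch: subscribers register
    callbacks and are notified whenever a streamed reading
    crosses the configured high-pressure alarm limit."""

    def __init__(self, high_limit):
        self.high_limit = high_limit
        self.subscribers = []

    def subscribe(self, callback):
        """Register a callback taking (well_name, pressure_psi)."""
        self.subscribers.append(callback)

    def ingest(self, well, pressure_psi):
        """Feed one streamed reading; notify on limit breach."""
        if pressure_psi > self.high_limit:
            for notify in self.subscribers:
                notify(well, pressure_psi)
```

In a deployed system the callbacks would send alerts to onshore engineers over the secure link; the point of the pattern is that new consumers of the data can be added without touching the acquisition side.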

“A quick-fire decision-making process helps to reduce costs and risks.”

Systems can be quickly deployed at the well site using only standard physical communication links, such as satellite, fibre, mobile telephone or the internet, making them independent of communication service and legacy infrastructure. The data can then be securely accessed and downloaded by multiple users via a web browser on any standard desktop personal computer using an intranet or the internet, helping companies like BP to realise the promise of a collaborative workspace.

By allowing authorised personnel (both onshore and offshore) access to the same project information through an online internet workspace, an enhanced data acquisition, monitoring and delivery system can promote teamwork and collaboration in the daily and weekly production decision-making processes.