Information Management in the Upstream

Introduction

The Data Room carried out benchmark research in E&P Information Management during 1997 and 1998 for Total SA, who have subsequently agreed to release this information. Ten international major oil and gas companies took part in the study, which comprised a questionnaire and follow-up interviews in Europe and North America. This paper draws from the benchmark study and other sources including conference attendance and The Data Room’s Technology Watch publications. The analysis does not reflect Total’s opinions and any errors or misrepresentations are attributable to The Data Room.

Why Information?

For this study we focussed initially on the material that is ‘difficult’. If you have a well location and status, you probably know who to see in your organisation to have this information properly captured and made available to your co-workers. If you have a piece of paper that your CEO scribbled down on a rainy golf course during a conversation with a Middle East oil sheikh, then what to do with it may prove rather trickier. We set out in this study to look at the area of New Ventures where, in our experience, the sort of material that is collected is about as heterogeneous as it gets. We then expanded from this to see how these ‘difficult’ data types integrate into the corporate workflow, and in particular what best practices were in place to help in this task.

Why a benchmark?

The computerisation of data capture, management and distribution is a recent field. Oil companies have adopted quite different strategies in managing their data, and the companies that took part in this study were interested in taking stock and seeing how they were performing with respect to their peers. Of course the degree to which this can be quantified and presented as a cost per item managed is extremely limited. While we do offer qualitative metrics as to the degree to which data is managed digitally, we believe that focusing on a ‘bottom line’ analysis of the cost of data management misses the point.

What is a ‘best practice’?

A best practice can be defined as an industry-wide accepted way of doing things. The data management industry is young, however, and best practices are embryonic. A major learning from the first few years of data management conferences and courses is that there are in reality two constellations of best practices:

  1. Best practice for the archive
  2. Best practice for application workflow and assets

The key question is: what are you trying to do? Offer quick gratification to the asset team, or ensure that quality data will be available ten years hence when a long-forgotten property becomes hot? Recognition of this dichotomy is essential to best practice design.

Data Management undersold

Data Management as a separate discipline was born at a low point in the oil industry cycle. This has coloured the way that the discipline is ‘sold’ to the industry by vendors, and internally to management. While cost savings were demonstrated spectacularly by drillers and seismic acquisition contractors, data managers could not really show cost savings because, before the discipline existed, there was no cost! Consequently, data management is often sold on the basis of savings in storage costs, sometimes in a rather contrived manner. The end result is that the whole data management industry has talked itself into being a ‘poor boy’. So I’d like to present some anecdotal evidence of how good management of data can be orders of magnitude more cost-effective than saving on storage costs.

Tales from the data managers

  1. Outsourcing the management of data is a popular way of handling library data. But one participating company reported that seeking a low up-front cost for storage translated into a high unit cost of retrieval. Conventional wisdom has it that this is OK because the Asset pays. But the reality is that the Asset never looks at the data because retrieval is so expensive.
  2. At the 1998 SMI E&P Data Management Conference Mairead Boland from Shell presented an excellent paper describing the complexity of naming a North Sea well cluster and the very high cost of an error.
  3. One company inadvertently re-shot a seismic survey because of an error in navigation data management.
  4. Another company re-opened a subsidiary and drilled a well, only to find that the petroleum system was a non-starter. They actually had this information in their archives, but the archives went unused because of cost cutting in the library.
  5. It’s not all bad news, though. Many participants report ‘world class’ libraries and information retrieval. One has developed a ‘pizza party’ approach to migrating unstructured data from boxes to the screen. Participants also report enthusiastically on the use of the Intranet to distribute heterogeneous data such as that on offer in a bidding round.

Practices are People

In the aftermath of the last bout of downsizing, many support functions disappeared, although there were marked differences between participating companies. What is clear is that those companies which have maintained or enhanced the library function with data management tools are in much better shape than those which have not. Contrary to another widely promoted view, companies claim significant competitive advantage – especially at bidding round time – for their best practices.

I’d like to conclude with a tilt at another sacred cow of the data management industry. I have lost count of the number of talks I have heard which start with a statistic of the time lost through looking for data. This is variously quoted as being 30-70% of an engineer’s time. Now while this may be true today, it was most definitely not the case when I was working in the North Sea, more years ago than I care to admit. If I did not have my data ready, it was because I had forgotten to tell the Technical Assistants to prepare it for me in time. In those halcyon days, we had TAs and draughtspeople doing well-defined jobs according to established best practices, even if we did not call them such. In those companies which have lost librarians and TAs through cost cutting, the results are absolutely clear. Without people, there are no practices, best or otherwise.

 Neil McNaughton

 © 1999 The Data Room – all rights reserved

 January 1999