
Do you really treat data as an asset?

Rockflow approach – How to treat data as a real oil and gas asset

How often have you heard leaders state that we need to treat data as an asset? Yet if we're honest with ourselves, we rarely treat data with the same care as our other assets in oil and gas.

That is not the case for every kind of data. Production data, for instance, is treated very diligently because it translates directly into revenue, as are inventory and sales data, where strict accounting, financial and regulatory controls apply.

Nonetheless, whilst our oil and gas sales data may be reliable, other data – such as well reports, checkshots or schedules – are often treated with less care and attention. As a result, forecasts of gross and net production – which lead directly to sales – can have a wider range of outcomes depending on the quality of the data used to generate them.

However, when production forecasts miss the mark, there is a knock-on effect on dividend policies and planned investment programmes. Any forecast can be reset – but how can we generate more robust and less volatile forecasts in the first place, so that every forecast is improved upon year on year?

 

1. Assigning leadership accountability

One gap between the way oil and gas companies treat data and the way they treat physical assets is leadership accountability.

Every asset manager knows they are responsible for the performance of their asset, and they have the ability to influence across functions, departments and disciplines so that the asset is supported. Within company-specified limits they have the power to change existing systems and structures. They can, for instance, own and execute changes to the field development strategy, with authority to gain support from others.

By contrast, overseeing data is usually a responsibility given solely to an IT or Information Management (IM) department. In this case IT staff, more familiar with hardware and software, sometimes struggle to put a value or priority on the data – or there are too few trained IM staff within the business to make their work sufficiently visible and business critical. Either way, they don't have enough sway with the rest of the organisation to make the necessary changes.

“Every system is perfectly designed to get the result it gets.” If you want different results, you must change your existing systems and structures, and leaders are the only ones with the power to instruct that change and make sure it stays in place. The process should yield the results you want. If it doesn't, the leader needs to go and inspect the process and make changes so that data is created in the right format, the right way, every time.

Yes, an IT or data department can help to standardise an approach to managing data, but they can’t own the responsibility for the quality of the data teams generate. Those who create the data in the first place need to be held accountable for the data being high quality. Leaders need to be intolerant of receiving bad data and hold themselves and their organisations accountable by providing clear feedback when receiving data with known errors.

 

2. Developing a business case

With oil and gas assets, a business case is usually required to justify kicking off a project. If it's not going to result in a net positive impact, the project won't go ahead. However, if it's going to be a wildly profitable venture, the project will go ahead at pace.

The same approach should be taken to improving data. When a business case is put forward – whether it's for improved business insights, low-cost operatorship or increased productivity – there's an opportunity for the initiative to be taken seriously.

The impact can be monumental. For instance, we know of one drilling department that committed one extra hour per week to putting more controls on their data. For every one hour they spent, roughly 100 hours were saved downstream of drilling elsewhere in the company.

Now in this case, the drilling team weren't seeing the benefits firsthand – it was the other departments that benefited from the time savings. And that's another reason why leadership is so vital to the process. It is often the case that the projected benefits are substantial, but occur further downstream from the department that creates the data. Leadership is required to manage the end-to-end flow of data, often across departments and functions.

Usually these “data driven” business case initiatives have many interlinked parts, and unless these are clearly articulated, leaders question the value. If, for instance, they know that better data controls will result in a 20% improvement in overall staff productivity – an extra day per week of output – those leaders will ensure that all departments get on board, whether they stand to benefit directly or not.

 

3. Undertaking root cause analysis – and applying operational discipline

When data problems rear their heads, we often have a habit of creating “data heroes” out of the people who spend their nights and weekends trying to correct bad data or find missing data. But far better than having data heroes fly in to solve a mounting problem is to tackle the root cause issues – and to do that, we need to invest in causal analysis to logically inspect where and why “bad” data is being generated.

One common source of that bad data is the disconnect between structured and unstructured data. For example, if you have results from a fluid sample test, it's easy for the lab team to enter numerical data like viscosity and gas-oil ratio into a standard database.

But often the comments that come with the sampling and lab reports get separated from the numbers because there is no standard way of capturing them. When that happens, vital context that might reveal issues in the analysis is lost, and the data becomes unreliable.
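As a minimal sketch of what keeping that context attached might look like – the field names, units and quality-flag keywords below are illustrative assumptions rather than any standard schema – a sample record can carry its lab comments alongside the numerical results, so the caveats travel with the data:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FluidSampleResult:
    """A fluid sample lab result that keeps the lab's comments with the numbers.

    Field names and units are illustrative assumptions, not a standard schema.
    """
    well_id: str
    sample_id: str
    viscosity_cp: Optional[float] = None            # viscosity in centipoise
    gas_oil_ratio_scf_stb: Optional[float] = None   # GOR in scf/stb
    lab_comments: List[str] = field(default_factory=list)  # free-text context from the lab report

    def is_flagged(self) -> bool:
        """Flag the sample if any comment hints the result may be unreliable."""
        keywords = ("contaminat", "leak", "unreliable")
        return any(k in comment.lower() for comment in self.lab_comments for k in keywords)

# The caveat stays attached to the record, so downstream users see it automatically.
sample = FluidSampleResult(
    well_id="A-12",
    sample_id="FS-003",
    viscosity_cp=1.8,
    gas_oil_ratio_scf_stb=950.0,
    lab_comments=["Possible mud filtrate contamination noted during sampling."],
)
print(sample.is_flagged())  # True – the context is no longer lost in a separate document
```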

In the role of a data customer, we should be intolerant of accepting bad data and provide clear feedback to those who created it so that they can improve the data for the next business cycle. As with an asset, constructive feedback and good collaboration allow the data to be delivered in the right format, the right way, every time.

 

4. Measuring what matters most to your organisation

With an O&G asset, an operator will always work with KPIs aligned to business performance. These are used as a vehicle to communicate the performance of the asset and ensure objectives are linked to critical success factors.

With data, there’s usually no such tracking. However, what gets measured gets done. And once you start measuring data quality, it makes it visible for everyone to see and often leads to an operational change in behaviour.
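As a hedged illustration of what such a measure could look like – the required fields, example well records and scoring rule below are assumptions made for this sketch, not an industry standard – a few lines of code can turn data completeness into a visible KPI:

```python
# Required fields and example well records are illustrative assumptions.
REQUIRED_FIELDS = ["well_id", "spud_date", "total_depth_m", "checkshot_available"]

wells = [
    {"well_id": "A-01", "spud_date": "2021-03-14", "total_depth_m": 3250, "checkshot_available": True},
    {"well_id": "A-02", "spud_date": None,         "total_depth_m": 2980, "checkshot_available": False},
    {"well_id": "A-03", "spud_date": "2022-07-02", "total_depth_m": None, "checkshot_available": True},
]

def completeness(record: dict) -> float:
    """Fraction of the required fields that are actually populated."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

scores = {w["well_id"]: completeness(w) for w in wells}
kpi = sum(scores.values()) / len(scores)      # portfolio-level data quality KPI
print(scores)                                 # per-well scores make the gaps visible
print(f"Average completeness: {kpi:.0%}")     # reported alongside the asset's other KPIs
```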

Not all data is equal, so you want to look for your value drivers. Where will better data quality make the most difference, and how will this be measured?

For instance, much of the talk around data in oil and gas centres on seismic data. That's understandable from a data storage and transfer point of view, since the volume of data the subsurface discipline generates is so large.

But seismic data has limited influence over the rest of the value chain: it's only used by geophysicists. That's not to say they're not important, but comparing seismic to well data – which is vital for drilling, geoscience and finance alike – helps define which data objects provide the most value. Most of the value for the company lies in understanding forecasts, costs and inventory – so make that the focus.

5. Sustaining your improvements through continuous learning

When it comes to oil and gas assets, lessons are learned over time. As an industry, we’re always looking to capture learning so that the next business cycle can be improved upon.

But have you ever performed a look-back on your data quality? Have you ever tried to learn and improve so that you're ready for the next data type? And if not, why not? Leaders primarily have two responsibilities: first, to run the business and, second, to improve how the business is run – and data quality is vital to both. Getting high quality data first time avoids unnecessary rework and delivers more reliable business insights.

It’s tempting to look at new technologies to help modernise and unlock new ways of working. But ultimately the technology will be of limited use if its inputs are not correct. The other side of the coin also holds true – once you have systems generating high quality data, you’ll be able to plug in AI and machine learning technologies and find real traction.

For more on how to create value from your data processes, see our podcast on digital innovations in oil and gas with Lewis Gillhespy and Geoffrey Cann, available on Spotify or Apple Podcasts.
