
15 companies are following this data roadmap – how do you compare?

Lewis Gillhespy, Executive Advisor at Rockflow, is one of our resident experts on digital transformation – applying Lean Management and Continuous Improvement principles to simplify subsurface process, data, and technology.

Lewis recently undertook a survey of 15 O&G companies – carrying out 45-minute Q&A sessions with senior stakeholders – to map the state of progress in their digital journeys.

Modern data management principles originally evolved in the 1960s, but at the time the perceived business value was modest and not all practices were adopted. Today, the digital age has dramatically reshaped the importance of data, unlocking substantial business value. This has prompted many organisations to revisit and embrace those legacy best practices with renewed urgency.

Many of us feel how rapidly the way data is used is shifting – yet, paradoxically, the pace of change will never be this slow again. While it is hard to predict what changes the next five years will bring, we can say with confidence that high-quality data will be the foundation as geotechnical teams transition from traditional workflows to data-enriched environments, demanding new tools, skills and data management practices.

As the industry adapts to the accelerating pace of change, oil and gas companies are recognising the need to move beyond traditional, industrial-era ways of working. The transition to the digital era is underway, but organisations are progressing at very different speeds depending on their size, resources, and priorities.

At Rockflow, we sought to understand this variation by surveying 15 mid-sized and smaller operators. The results highlighted a broad range of digital maturity. While some companies are starting to explore modern data practices, their peers have already invested in transformative ways of working. Across the board, effective data management is emerging as a critical low-cost differentiator and a competitive advantage. 

This article offers insights into how some of the companies have successfully transitioned. 

 

Although at different stages, most upstream mid-sized companies are on a similar journey

Most companies that are actively seeking improvement focus on core, well-established areas of the business. This enhances existing workflows and makes critical data visible to all, including senior leaders. Not all companies have embarked on a transformation, but those that have are pursuing time-saving measures for subsurface users – access to data in the right format at the right time. Focus areas for improved business intelligence have been the highest-value data within organisations, often leading with production, operational and budgeting data.

None of the six stages listed below involves rocket science. However, advancing through them requires sustained commitment and top-down leadership – which is why, as the blueprints for change become established, smaller, more agile organisations are particularly well positioned to gain momentum and outperform their peers.

Our survey indicates clear signs of progress and commitment across the sector. Many companies are engaging external expertise to enhance their data management capabilities or develop tailored dashboards. Most efforts are driven by technical leaders with accountability for the subsurface, working closely with senior leaders and IT or data teams to ensure alignment and support.

Some organisations are advancing rapidly, moving through transformation stages and achieving measurable return on investment – often recovering costs within a single budget cycle. 

So let’s look closer at the journey they’re undertaking – one that transforms data from a burden into a strategic asset.

 

1. A call to action

Characterised by:

  • Low trust in data
  • High technical debt
  • Low data utilisation

At this point, people are spending significant time trying to locate and re-interpret data in preparation for use. One company told us that “at least 50% of the team’s time is spent getting the data ready for interpretation.”

When users encounter multiple sources of the same data (e.g. two spreadsheets displaying different versions of the truth, or multiple well picks for the same horizon), trust in the data erodes. This leads users to re-interpret the same data, and the end result is compounding technical debt and ever-lower trust.
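To make the problem concrete, here is a minimal Python sketch of the kind of reconciliation check that exposes “two versions of the truth”. The well names, depths and tolerance are invented for illustration, not drawn from the survey:

```python
# Hypothetical example: two sources of horizon picks for the same wells,
# e.g. a project spreadsheet versus a corporate database export.
picks_spreadsheet = {"WELL-01": 2150.0, "WELL-02": 2310.5, "WELL-03": 2475.0}
picks_database    = {"WELL-01": 2150.0, "WELL-02": 2298.0, "WELL-04": 2600.0}

def reconcile(a, b, tolerance=1.0):
    """Flag wells where the two sources disagree, or where only one source has a pick."""
    conflicts = {w: (a[w], b[w]) for w in a.keys() & b.keys()
                 if abs(a[w] - b[w]) > tolerance}
    only_a = sorted(a.keys() - b.keys())
    only_b = sorted(b.keys() - a.keys())
    return conflicts, only_a, only_b

conflicts, only_a, only_b = reconcile(picks_spreadsheet, picks_database)
print(conflicts)  # {'WELL-02': (2310.5, 2298.0)}
print(only_a)     # ['WELL-03']
print(only_b)     # ['WELL-04']
```

Even a simple check like this makes the scale of duplication visible, which is often the trigger for the call to action described above.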

Recognising this stage can help leadership issue a call to action. To be effective, this involves a change to the structure and systems that govern the data, and a transition to a new way of working.

This normally requires leadership-level input to work across teams and functions, and to overcome cultural resistance to adopting a new way of working.

This call to action may be triggered when frustration mounts or when there’s a catalyst for change – such as a change in leadership, adopting best practices within a JV, a period of M&A activity, or an IT upgrade.

 

2. Planning

Characterised by:

  • Assigning roles and responsibilities
  • Establishing a leadership steering group
  • Garnering third-party support
  • Identifying high-value / critical data
  • Pilot project
  • Governance tools

As with any project, there’s a degree of planning upfront. Key questions include: what kind of governance will you apply, and with what tools? Who will oversee the project? Will you set up a steering committee, and what will your organisational structure be? You’ll also consider whether you need third-party support.

Most companies have a formal or informal steering committee that includes leadership representatives. As well as helping to provide resources and adopt new ways of working, a steering committee offers an opportunity to bring others along on the journey and to see first-hand the art of the possible. It also provides a platform to extend the approach to other parts of the organisation, with buy-in from leadership across the company rather than from an isolated segment.

The other key aspect of the planning phase is identifying the critical data. By pinpointing the highest-value data, it is possible to see where an improvement will drive the greatest business results.

Whilst supermajors may be able to afford broad change across the organisation, for small-to-mid-sized companies a steadier approach that least disrupts operations is more sustainable. Companies can begin by targeting one area of data management and ensuring it is functioning effectively. Once tangible benefits and returns can be demonstrated, organisations are in a stronger position to expand to additional data objects with confidence and clarity.

At this stage it’s important to ask: what are the highest-value data objects for your company, and which are most likely to succeed? Identifying these is key not just in itself, but because the highest-value object is the most likely to win support from a change leader. As mentioned, we often see companies start with production and production forecasting.

 

3. Piloting a high-value data object

Characterised by: 

  • Identifying customer needs and pain points
  • Standardising source data and rationalising IT
  • Setting up dashboard prototypes (e.g. Power BI)
  • Training team members and managing change

Once a company has identified a high-value data object, the next step is to enhance its quality and utility. This means defining how an improvement will be achieved. This needs to start with the people who rely on the data (data customers) rather than those who generate it (data creators).

Go to the customers of the data, not the people who generate it. As an example, a geophysicist’s customer may be the geologist or reservoir engineer. For the reservoir engineer it may be the production engineer. And for the production engineer, it may be those working the actual operations in the field. Following this chain reveals how data flows and where quality matters the most.

Understanding a data customer’s needs and pain points helps you focus on what must be addressed. This often leads to standardising the source data, identifying which IT system is the source of truth, and establishing a process to update it whenever errors are found.

For instance, a geologist may extract well data from WellView into Petrel. Any issues with the data will likely be resolved locally in Petrel or in a personal Excel file. While that solves the immediate problem, the error persists in WellView and the root cause remains unresolved. Identifying the source systems and correcting errors at source ensures improvements are systematic rather than superficial, resulting in data that is more highly trusted.
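The fix-at-source principle can be sketched in a few lines of Python. Everything here – the system, well, field and reason – is invented for illustration; the point is that corrections are logged against the designated master record rather than patched in a local copy:

```python
# Hypothetical sketch of "fix at source": the correction is recorded and
# applied to the master record, so every downstream consumer benefits and
# an audit trail of who changed what, and why, is preserved.
source_of_truth = {"WELL-07": {"kb_elevation_m": 31.2}}  # stands in for the master DB
correction_log = []

def correct_at_source(well, field, new_value, reason):
    """Log the correction, then update the master record."""
    old_value = source_of_truth[well][field]
    correction_log.append(
        {"well": well, "field": field, "old": old_value,
         "new": new_value, "reason": reason}
    )
    source_of_truth[well][field] = new_value

correct_at_source("WELL-07", "kb_elevation_m", 32.0,
                  reason="Resurveyed KB; supersedes local spreadsheet fix")

print(source_of_truth["WELL-07"]["kb_elevation_m"])  # 32.0
print(len(correction_log))                           # 1
```

In practice the master record sits in a system like WellView rather than a Python dictionary, but the pattern – one authoritative store, plus an auditable correction process – is the same.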

 

4. Visualisation of data: “What gets measured, gets managed”

Characterised by:

  • Data is a trusted and valuable asset
  • Data is shared and accessible

Once data has been corrected, standardised, and rationalised, it is far better positioned to serve as a reliable input for decision-making. But improvement doesn’t stop with clean data – it must also be accessible, understandable, and clearly presented to those in a position to make decisions based on it. 

This is where business intelligence (BI) reporting plays a pivotal role. By creating visual representations of the data (often through dashboards), organisations can surface intuitive insights and guide strategic choices.

Among the companies we surveyed, many are turning to dashboards because they make decisions scientific and measurable, with less risk of human bias. In addition, when these dashboards are visible throughout the organisation and at the executive level (e.g. C-suite), they become powerful drivers of accountability. Individual teams are far more likely to address data quality issues when they know their outputs are being scrutinised by leadership – in essence, the system becomes self-policing.

Reliable data plugged into well-designed BI dashboards enables faster, more confident decisions across the business. 
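One common self-policing device is a per-team data-completeness score surfaced on the dashboard itself. A minimal Python sketch of such a metric follows; the records and required fields are invented for illustration:

```python
# Hypothetical completeness metric of the kind a BI dashboard might show
# per team: the fraction of required cells that are actually populated.
records = [
    {"well": "WELL-01", "status": "producing", "rate_bopd": 850.0},
    {"well": "WELL-02", "status": "producing", "rate_bopd": None},
    {"well": "WELL-03", "status": None,        "rate_bopd": 410.0},
]
required = ["status", "rate_bopd"]

def completeness(rows, fields):
    """Fraction of required cells that are populated, from 0.0 to 1.0."""
    cells = [row[f] for row in rows for f in fields]
    return sum(value is not None for value in cells) / len(cells)

print(round(completeness(records, required), 2))  # 0.67
```

A score like this, trending on an executive dashboard, gives teams an immediate, visible incentive to close their own data gaps.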

 

5. Replication

Characterised by:

  • The next high value data objects
  • Enriching existing dashboards

With a proven framework in place, teams can quickly add additional data objects using a similar methodology. This stage is characterised by expanding into new domains. Whether that be reservoir characterisation data, HSE incident metrics, or financial reporting, these data objects can follow the same blueprint.

Of the 15 companies we interviewed, three had reached this stage and the results were striking:

  • Time to value improved noticeably, with less rework and faster business decisions
  • Business decisions were increasingly data-driven, reducing reliance on management judgement calls
  • Benchmarking capabilities became more robust, especially around critical metrics like pre-drill resource estimates
  • Visible data drove self-led improvements in quality – creating a virtuous cycle

With better subsurface data and enhanced visibility, organisations are able to move from lengthy decision processes to rapid, scientific business decisions.

 

6. Continuous Improvement

Characterised by:

  • Culture of improvement
  • More efficient data reporting workflows
  • Automation of routine work
  • Higher utilisation of dashboards

Only a few companies had entered this stage – a state of continuous improvement where data is both reported and visualised, and actively refined at source. Better data leads to better insights, and better insights drive a desire for further enhancements.

In this phase, routine cycles like drilling, planning, work program and budgets, and joint venture reviews become significantly faster and leaner. Time spent building PowerPoint presentations for meetings is replaced by live dashboards that reflect the current state of play.

Once a new data-led culture takes hold, organisations are positioned to leverage more advanced tools, whether that be AI, machine learning, automated workflows or simply continuing to encompass other data objects.

It’s worth stressing, though: you don’t need to reach this point before realising meaningful value. With the right guidance, digital transformation need not take years. Companies can pilot data objects and see returns within six months. This need not be a massive change – just a well-placed first step.

Lewis Gillhespy and Rockflow are working to accelerate organisations on their data journey. For more on how we can help, see our transformation and change services. Or for more insights on data management, see Lewis’ articles on bringing data into the boardroom and why the data customer is always right.


Lewis Gillhespy

Technical Excellence Practice Leader

Lewis’ passion in applying Lean Management and Continuous Improvement principles to simplify subsurface process, data, and technology is increasingly sought after by clients wanting to thrive in the digital age. By reducing waste and generating highly trusted data,…
