Navigating the Digital Transformation
The advancement of the oil and gas industry hinges on its ability to transform digitally by applying high-quality data to illuminate its path forward.
Long before the energy transition was top of mind and tip of the tongue for many, digital transformation claimed the lion's share of attention in articles, board meetings, conferences, and discussions.
But on 2 November, it retook center stage at Corvacon 2023. The annual event hosted by the Houston-based energy software development company Corva brought together stakeholders in the oil and gas industry to discuss the intersection of energy and technology.
‘Long, Long Journey’
With a history that traces back to the late 1970s, when computer-aided design and manufacturing systems were first used in business applications, digital transformation took root in business lingo in October 2013, when the MIT Sloan Management Review and Capgemini Consulting published the results of a survey of more than 1,500 executives and managers to better understand how businesses succeeded or failed in using digital technology to improve business performance.
The key finding was that companies faced a digital imperative: Adopt new technologies effectively or face competitive obsolescence.
A decade later, that key finding still rings true for the oil and gas industry as it seeks to integrate digital technologies into its operations, processes, and business models to become more efficient, competitive, and flexible in a rapidly changing market. Pinpointing where the industry stands in that integration and transformation is—like the topic—a complex exercise.
Matthias Gatzen, executive director of digital well construction for Baker Hughes, said he sees the transformation as a “long, long journey” and a “key pathway” on the industry's way toward autonomous execution.
“Automation is a huge topic for the industry, and it’s a journey to get there,” he shared with attendees. “The starting points are processes and work flows and ensuring you have the best setup possible that’s been collected over many years.”
For Baker Hughes, those processes and work flows are digitized into digital twins capable of running continuous simulations in prejob environments to improve execution, he noted.
“The third layer is to start automating by feeding real-time data for the next site into the same digital twin used in your preplanning phase. Then, as you keep executing, you add machine learning and similar logic.”
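The layered approach Gatzen describes—one twin serving both prejob simulation and real-time execution, with a learning step layered on top—can be sketched in miniature. The model, field names, and numbers below are purely illustrative assumptions, not Baker Hughes code:

```python
# Toy "digital twin" sketch: the same model is used for prejob
# simulation, then fed real-time data during execution, with a simple
# learned correction standing in for the machine-learning layer.

class WellTwin:
    """Predicts rate of penetration (ROP) from weight on bit (WOB)."""

    def __init__(self, rop_per_kkgf: float):
        self.rop_per_kkgf = rop_per_kkgf  # planning-phase calibration
        self.correction = 0.0             # learned from real-time data

    def predict_rop(self, wob_kkgf: float) -> float:
        return self.rop_per_kkgf * wob_kkgf + self.correction

    def update(self, wob_kkgf: float, observed_rop: float, lr: float = 0.1):
        # Crude stand-in for the ML layer: nudge the correction toward
        # the residual between observed and predicted ROP.
        error = observed_rop - self.predict_rop(wob_kkgf)
        self.correction += lr * error

twin = WellTwin(rop_per_kkgf=2.0)
plan_rop = twin.predict_rop(10.0)  # prejob simulation of the plan

# Execution phase: stream real-time measurements into the same twin.
for wob, rop in [(10.0, 18.0), (10.0, 18.5), (10.0, 18.2)]:
    twin.update(wob, rop)
```

After a few updates the twin's predictions drift toward what the rig is actually delivering, which is the essence of the feedback loop described above.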
David Forbes, general manager for global wells at ConocoPhillips, sees it differently.
“I don’t think we have transformed the industry yet,” he said. “We have gotten more data into the hands of engineers and geoscientists, and they’ve been able to do more with it. We’re still doing many things we’ve done before, but now we’ve been able to optimize performance.”
He noted that he sees great possibilities with efforts such as the predictive drilling technology under development by Nabors Drilling and ConocoPhillips' own automation journey, adding that a goal is to “take a lot of the variability” out of the process and to be more predictable.
According to Moji Karimi, chief executive and co-founder of Cemvita, the transformation is “not something that happens just because time goes by. It’s facets of creating the transformation, starting with setting the vision first.”
He encouraged the industry to “take a step back and think about what kinds of insights companies will require,” noting, for example, the regulatory point of view for validation of sequestration technologies.
“It is important to think about the details but also what are the new industries and the new categories being created,” he said.
Underpinning the entire digital transformation is the data, with critical importance placed on the quality of the data. High-quality data not only helps management make more-informed decisions but also ensures better regulatory compliance, prevents significant financial setbacks, and optimizes operations to help deliver cost savings.
Put simply, bad data makes it significantly more challenging to make accurate choices.
Gatzen said that, when looking at data quality, his team starts at the source and then maps out the entire pathway from the sensor being used all the way to the final application.
“There’s a hardware case, too, that is tied to the data quality that we also look at,” he said. “Ultimately, you need good, high-quality data to execute all the KPIs [key performance indicators]. For us, that's a huge focus area.”
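Checking quality along the whole pathway, from the sensor to the final application, usually means screening the raw stream before it feeds any KPI. A minimal sketch of that idea, with field names and thresholds that are illustrative assumptions only:

```python
# Hedged sketch: validate a stream of sensor readings before they feed
# downstream KPIs, flagging out-of-range values and broken timestamps.

def validate_readings(readings, lo=0.0, hi=500.0):
    """Return (index, issue) pairs for suspect readings."""
    issues = []
    last_ts = None
    for i, r in enumerate(readings):
        if not (lo <= r["value"] <= hi):
            issues.append((i, "out_of_range"))
        if last_ts is not None and r["ts"] <= last_ts:
            issues.append((i, "timestamp_not_increasing"))
        last_ts = r["ts"]
    return issues

stream = [
    {"ts": 1, "value": 120.0},
    {"ts": 2, "value": 9999.0},  # sensor spike
    {"ts": 2, "value": 130.0},   # duplicate timestamp
]
print(validate_readings(stream))
# [(1, 'out_of_range'), (2, 'timestamp_not_increasing')]
```

Real pipelines add many more checks (units, calibration drift, gaps), but the principle—catch bad data at the source before it reaches the KPI layer—is the same.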
Forbes noted that data quality is a big topic for ConocoPhillips internally, adding that the company addresses it by broadening the data it collects—for example, drilling rates—and bringing that data in centrally.
“It helps us on the quality because we've got people looking at it to ensure that the quality is there but also from its source. We have very strict protocols that we need people to follow regarding how the data is entered so we can use it,” Forbes said.
One example he cited of the company's success with this approach comes from the new assets it acquired in the Permian Basin.
“We were able to take their data, change the view, and clean it up a little bit. As soon as we got those data to our standards, we could put it straight into our tools, provide analytics backup on all that new data that we had, and put it back to the people that had joined the company,” he said.
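Bringing an acquired asset's data "to our standards" typically boils down to renaming fields and converting units so the records drop straight into existing tools. The field names, mappings, and values below are hypothetical, sketched only to illustrate the pattern Forbes describes:

```python
# Illustrative sketch: normalize an acquired asset's records to an
# internal standard (rename fields, convert imperial units to metric).

FIELD_MAP = {"well": "well_id", "depth_ft": "depth_m", "rop_fph": "rop_mph"}
FT_TO_M = 0.3048

def standardize(record: dict) -> dict:
    """Rename known fields and convert feet-based values to metres."""
    out = {FIELD_MAP.get(k, k): v for k, v in record.items()}
    for key in ("depth_m", "rop_mph"):
        if key in out:
            out[key] = round(out[key] * FT_TO_M, 2)
    return out

acquired = {"well": "PB-014", "depth_ft": 10000.0, "rop_fph": 65.6}
print(standardize(acquired))
# {'well_id': 'PB-014', 'depth_m': 3048.0, 'rop_mph': 19.99}
```

The payoff of strict entry protocols is exactly this: once the mapping exists, new data flows into the established analytics tools with no per-record handling.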
“It was impactful, too, because we've got the processes in place and the protocols in place to be able to do that,” he said. “I think the other thing is that we were very active in an industry group trying to standardize data quality protocols. We put considerable time and effort into it because it is the foundation for what we must do.”