Data management

Gaining a Competitive Edge by Digitalizing Subsurface Operations

The E&P subsurface sector today faces the lowest reserve replacement ratios in decades as well as an increasing need to maximize recoverable oil and gas from developed fields. Many of the challenges in these areas stem from legacy systems that silo data and information critical for investment decisions. To turn the tide, this sector should call on a reliable, though perhaps under-recognized ally: digital technology driven by open, accessible data.

Whether in exploration, field development, or drilling, digital technology powered by open data can help the upstream industry access and contextualize big data and scalable computing power to make trustworthy time- and cost-saving decisions.

With the right technology and software, the E&P subsurface sector can realistically avoid repetitive tasks, remove human biases, reduce processing time, enhance collaboration, and empower workers to become innovators. Liberated data take the guesswork out of E&P subsurface operations, making them more effective and efficient for everyone involved.

Minimizing Uncertainty

Though some software products claim to minimize uncertainty in E&P subsurface decision-making processes, a degree of uncertainty always exists. Decisions should therefore be based on the widest possible range of information. A study of 97 wells drilled between 2003 and 2013 in the UK sector of the North Sea found that more than 50% failed because of poorly integrated data and insights, improperly applied domain science, and a lack of context and effective peer review. Digital products and software that run on liberated, contextualized data could have addressed these failures by putting subsurface information to use across stakeholders, enabling data-driven decision-making, and maximizing the wells’ uptime and delivery.

The largest independent company on the Norwegian Continental Shelf is democratizing access to all subsurface and drilling data across all its teams. Data and information are shared and made accessible via a map-based polygon search. In this way, every team works from the same, most complete context, and each expert can enrich it to generate more domain-specific insights.
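
A map-based polygon search of this kind can be approximated with standard geospatial tooling. The sketch below is a minimal illustration, assuming hypothetical well-metadata records and using the shapely library to filter wells whose surface locations fall inside a user-drawn polygon; the names and coordinates are invented.

```python
# Minimal sketch of a map-based polygon search over well metadata.
# Assumes each record carries a surface location in lon/lat; the
# polygon vertices would come from the user's selection on the map.
from shapely.geometry import Point, Polygon

# Hypothetical catalog entries: (name, longitude, latitude)
wells = [
    ("25/4-A-1", 2.51, 59.32),
    ("25/4-A-2", 2.55, 59.35),
    ("30/6-B-1", 3.10, 60.01),
]

# Polygon drawn on the map (lon/lat vertices)
search_area = Polygon([(2.4, 59.2), (2.7, 59.2), (2.7, 59.4), (2.4, 59.4)])

# Return every well whose surface location lies inside the polygon
hits = [name for name, lon, lat in wells
        if search_area.contains(Point(lon, lat))]
print(hits)  # -> ['25/4-A-1', '25/4-A-2']
```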

Reliability of Data Quality

Data and information should always be auditable. If not, subsurface interpretations and models run a high risk of being incorrect, leading to decisions based on overestimated reserves calculations that drive up investment and costs. By contrast, liberated data with enriched quality tags referencing source systems, users, and history can provide the foundation for best practices in automated enterprise data governance. Users will always know which datasets are validated and can run their digital workflows with confidence.
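
What such quality tags might look like in practice can be sketched as a simple data structure. The example below is illustrative only; the field names (source system, validation flag, audit history) are assumptions, not the schema of any particular governance product.

```python
# Sketch of an auditable dataset record with quality and lineage tags.
# Field names are illustrative, not drawn from any specific product.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetRecord:
    name: str                  # e.g., a well-log curve or seismic volume
    source_system: str         # system the data were liberated from
    validated: bool = False    # set by an automated or human QC step
    history: list = field(default_factory=list)  # audit trail

    def log(self, user: str, action: str) -> None:
        """Append a timestamped entry so every change stays auditable."""
        self.history.append((datetime.now(timezone.utc), user, action))

rec = DatasetRecord("25/4-A-1 gamma-ray log",
                    source_system="legacy-petrophysics-db")
rec.log("asmith", "ingested from source system")
rec.validated = True
rec.log("asmith", "passed automated QC")
```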

Many implementation projects have shown that when data are liberated and ingested into cloud-based environments, data-quality issues surface that would be impossible to detect were the data still stored in a legacy system. Once data are liberated and accessible to everyone, cloud technology also enables hosting environments to create, deploy, and publish data management “functions” that provide continuous, semi- or fully automated data QC and standardization.
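
A data management “function” of this kind can be as simple as a routine that flags missing or physically implausible values. The sketch below assumes a gamma-ray log curve and illustrative bounds; real QC rules would follow a company’s own standards.

```python
# Sketch of an automated QC "function" of the kind that could be
# deployed in a cloud environment. Bounds are illustrative: real
# limits depend on the curve type and the company's standards.
from typing import Optional

def qc_gamma_ray(samples: list[Optional[float]],
                 lo: float = 0.0, hi: float = 300.0) -> dict:
    """Flag missing and out-of-range values in a gamma-ray curve."""
    missing = [i for i, v in enumerate(samples) if v is None]
    out_of_range = [i for i, v in enumerate(samples)
                    if v is not None and not (lo <= v <= hi)]
    return {
        "n_samples": len(samples),
        "missing": missing,
        "out_of_range": out_of_range,
        "passed": not missing and not out_of_range,
    }

print(qc_gamma_ray([45.2, None, 120.0, 512.0]))
# -> {'n_samples': 4, 'missing': [1], 'out_of_range': [3], 'passed': False}
```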

Fidelity of Data

Data must be consistently packaged, contextualized, validated, and shared throughout a multifunctional organization’s entire decision-making process. This process typically involves multiple teams of experts and organizational functions, and to avoid confusion, they need a “single source of truth.” Information and data flows should be open and accessible to all stakeholders. This enables data reutilization and enrichment without losing quality, resolution, or other important attributes. Only open-data platforms and software that contextualize data can make this possible.

An end-to-end subsurface-topside digital twin of the Valhall field combines subsurface and topside data analytics, physics-driven models, live data monitoring, and an end-user notification system. The digital twin predicts and identifies early signs of chalk influx and other abnormal events; it continuously monitors the model outputs and alerts reservoir engineers if a chalk-influx event is predicted. Since the system was installed at Valhall during the first half of 2019, no chalk-influx events have stopped production from the monitored wells.
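
The article does not describe the Valhall system’s internals, but the monitor-predict-notify pattern it names can be sketched generically. In the toy example below, the risk model and the notifier are stand-ins for the physics-driven models and the end-user notification system; the threshold and the input signal are assumptions.

```python
# Generic sketch of the monitor-predict-notify pattern the Valhall
# digital twin is described as using. The model and notifier here
# are stand-ins for the real physics-driven models and alerting.
ALERT_THRESHOLD = 0.8  # illustrative probability cutoff

def predict_chalk_influx(readings: dict) -> float:
    """Stand-in for the predictive model: returns a risk score in [0, 1]."""
    # e.g., a rising solids indicator suggests chalk entering the well
    return min(1.0, readings["solids_ppm"] / 500.0)

def notify(well: str, score: float) -> None:
    """Stand-in for the end-user notification system."""
    print(f"ALERT {well}: chalk-influx risk {score:.2f}")

def monitor(stream) -> None:
    for well, readings in stream:          # live data feed
        score = predict_chalk_influx(readings)
        if score >= ALERT_THRESHOLD:
            notify(well, score)            # e-mail/SMS in production

monitor([("A-12", {"solids_ppm": 450})])   # -> ALERT A-12: ... 0.90
```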

Elasticity in Data Connection

Volatility and uncertainty seem to be defining characteristics of the 2020s thus far, so it is critical for energy-adjacent sectors such as E&P subsurface to be able to adapt and minimize risk in a changing environment. Throughout the entire subsurface lifecycle, data should be evergreen, continuously adapting to new streams of data (new seismic imagery, production data, real-time drilling data, etc.), building resilience, and replacing static, inflexible processes that slow organizations down.

By leveraging the connection of real-time automated drilling to an open, standardized, and structured digital drilling ecosystem, Aker BP believes it can reduce drilling time by 15 to 25%. This will create substantial cost savings and thus allow smaller reservoirs to be more profitable. Central to this ecosystem is a “smart hub” that receives and centrally masters updated plans, checks each plan’s conformance to the schema on which it is based, and distributes each committed plan to registered consumers. The smart hub provides a loose coupling between itself and connected plan consumers and publishers, which enables the easy connection of new systems or the replacement of one system with another.
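
The smart hub’s two core behaviors, checking each plan’s conformance to its schema and fanning committed plans out to loosely coupled consumers, can be sketched in a few lines. The plan schema and the callback shapes below are assumptions for illustration, using the jsonschema library; they are not Aker BP’s actual data model.

```python
# Minimal sketch of the "smart hub" pattern: validate each incoming
# plan against its schema, then distribute it to registered consumers.
from jsonschema import validate, ValidationError

# Illustrative plan schema; the real schema would be far richer.
PLAN_SCHEMA = {
    "type": "object",
    "required": ["well", "sections"],
    "properties": {
        "well": {"type": "string"},
        "sections": {"type": "array", "items": {"type": "object"}},
    },
}

class SmartHub:
    def __init__(self):
        self.consumers = []      # loose coupling: hub knows only callbacks

    def register(self, callback) -> None:
        self.consumers.append(callback)

    def publish(self, plan: dict) -> bool:
        try:
            validate(instance=plan, schema=PLAN_SCHEMA)  # conformance check
        except ValidationError as err:
            print(f"rejected plan: {err.message}")
            return False
        for consumer in self.consumers:  # distribute the committed plan
            consumer(plan)
        return True

hub = SmartHub()
hub.register(lambda p: print("rig system received plan for", p["well"]))
hub.publish({"well": "A-12", "sections": [{"md_start": 0, "md_end": 500}]})
```

Because the hub addresses consumers only through registered callbacks, a new system can be connected, or an old one swapped out, without touching the publishers, which is the loose coupling the ecosystem depends on.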

Accelerate Time to Value

With information and data effortlessly available and “ready to use,” organizations will find that they can run company-specific automated or semiautomated standardized processes and best practices, accelerating time to value.

This is possible only if E&P organizations fully control their data. We have not yet seen this anywhere in the sector, because data and information have been locked in multiple legacy software applications and stored in proprietary file formats understood only by people originally involved with that software. This has over the years created multiple inter- and intradomain silos, which still represent the greatest tech bottleneck that the industry faces. It is not surprising that E&P organizations spend more than half of their data science time and resources on identifying, assembling, and formatting the data needed for subsurface analyses.

In an EAGE paper, Caso et al. describe Aker BP and Cognite’s implementation of a seismic datastore in Aker BP’s cloud environment. The datastore provides fast, tool-independent access to Aker BP’s seismic data through an API architecture for ingesting, storing, querying, visualizing, and consuming them.
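
A client of such a datastore might look like the sketch below. The endpoint paths, parameters, and payloads are hypothetical, invented purely for illustration; they are not the actual API described in the paper.

```python
# Hypothetical client sketch for a cloud seismic datastore API of the
# kind the paper describes. All URLs and parameters are invented.
import requests

BASE = "https://example-datastore/api/v1"   # placeholder host
HEADERS = {"Authorization": "Bearer <token>"}

# Query surveys intersecting an area of interest (WKT polygon)
resp = requests.get(
    f"{BASE}/surveys",
    params={"geometry": "POLYGON((2.4 59.2, 2.7 59.2, 2.7 59.4, "
                        "2.4 59.4, 2.4 59.2))"},
    headers=HEADERS,
)
surveys = resp.json()

# Fetch a slice of traces from one survey, independent of any
# interpretation tool or proprietary file format
traces = requests.get(
    f"{BASE}/surveys/{surveys[0]['id']}/traces",
    params={"inline": 1200, "xline_from": 400, "xline_to": 600},
    headers=HEADERS,
).json()
```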

Reclaiming Power and Minimizing Risk

The E&P subsurface sector has historically had to adapt to vendors’ specific applications and file formats to enrich its data and generate insights, which has made the decision-making process unnecessarily dependent on one or a few preferred software or technology stacks. This type of risk is no longer affordable.

In a recent hackathon, geoscience subject-matter experts tested and deployed a set of self-developed codes on top of liberated data.

Open-data technology is now laying the foundation for the sector to reclaim its power and minimize the risk of relying on a single technology, product, or vendor. E&P organizations can establish their own vendor-neutral data architectures, where they decide on the best technology to implement based on their own business priorities and competitive landscape. Liberating data and information from legacy applications and making them accessible via open, standard formats is the first crucial step. Accessing open libraries of technology for subsurface interpretation, modeling, reporting, and visualization, as well as deploying proprietary or third-party vendors’ microservices via APIs, enables organizations to build an operational ecosystem that best fits their business objectives.

The E&P subsurface sector has every opportunity to improve, optimize, and succeed in the years to come. Whether the sector is able to do so will depend heavily on decision-making processes driven by democratized access to data and technology, enabling subsurface workers to innovate and do their jobs more confidently, resiliently, and rapidly, with a competitive edge.

Reference

Caso, C., Aursand, P., and Stray, T. Accelerating Seismic Data Access, QC, and Vendor-Independent Automated Workflows With Cloud-Based Seismic Datastore and API. European Association of Geoscientists & Engineers.