Business Runs at the Speed of Data
Getting useful data into your analytics and decision software faster can improve your competitive edge. Is it true that whoever has the most data wins? That is only part of the formula; a better formulation is “whoever has the most trusted data wins.”
Data volume, variety, and velocity will continue to grow, and leveraging all of that data effectively must be based on data transparency and trust. But, in too many organizations, data veracity continues to be a challenge.
Data has been both the bane and the blessing of industry for decades. Every year, experts identify new kinds of data that will help industry become more effective, efficient, safe, and acceptable.
Annually, industry invests trillions of dollars to create and analyze data, but only a tiny fraction of that investment goes to ensuring that the data itself is properly stewarded. It is important to understand why this investment is so low and what the consequences are.
The petroleum industry is cyclical, a nice way of saying that, as the price of oil fluctuates, industry priorities tend to shift. Economic challenges result in operational fat being trimmed, and functions deemed nonessential are outsourced, scaled back, or even eliminated. All too often, data disciplines are viewed as noncore functions. Experienced data professionals are laid off, and critical data stores are relegated to tactical management inside operational silos and proprietary software. While this can be an effective short-term cost-cutting mechanism, the long-term consequences are often devastating.
Data attenuation, borrowing a term from signal processing, refers to the way a signal degrades as it decays over time, travels across distance, or passes through interference. Working out how attenuation happens and then preventing or reversing it takes a great deal of time and patience. But it is worth the effort to solve the problem, because doing so strengthens consumer trust in corporate data systems. This is a critical function of strategies that manage and steward data.
Sadly, without intentional intervention, data attenuates a little every time it moves from stakeholder to stakeholder or from process to process. Here are some of the actions that result in attenuation unless strategically placed data professionals mitigate the consequent risks:
- Data is segmented into different operational systems (usually proprietary software) that accept only the data relevant to the tactical role each system plays for its business users. Important contextual information is lost.
- Data is aggregated or summarized to reduce overall data volume and thereby improve performance or simplify access.
- Data is used to modify other data or to create new interpreted or calculated information, without direct inclusion of or reference to the original source data and without metadata describing the provenance of the changes (a lightweight mitigation is sketched in the example after this list).
- Data vocabularies are changed to the preferred vocabulary of the recipient; often the meaning of the data becomes ambiguous or even incorrect.
- Data relationships are ignored or simplified to accommodate software limitations or performance expectations.
- Data formats or structures become tactical and reflect the needs of the user rather than the business processes that created the data.
- Data versions are untracked, so changes to important information (e.g., asset identifiers and locations) are cascaded into some tactical systems but not others.
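As a simple illustration of how some of these losses can be mitigated, the sketch below attaches provenance and version metadata to a data item each time it is transformed. This is a minimal, hypothetical example in Python; the record structure and field names (source_system, transformation, and so on) are assumptions made for illustration, not part of any PPDM or vendor standard.

```python
# Minimal sketch: carrying provenance and version metadata with data as it
# moves between systems, so downstream consumers can trace how a value was
# derived. All names here are illustrative, not a standard.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ProvenanceEntry:
    source_system: str      # which system applied the change
    transformation: str     # what was done (e.g., "unit conversion ft -> m")
    timestamp: str          # when the change was applied (UTC, ISO 8601)
    original_value: object  # the value before the change


@dataclass
class DataItem:
    name: str
    value: object
    version: int = 1
    provenance: list = field(default_factory=list)

    def transform(self, new_value, source_system, description):
        """Apply a change while recording what it replaced and where it happened."""
        self.provenance.append(ProvenanceEntry(
            source_system=source_system,
            transformation=description,
            timestamp=datetime.now(timezone.utc).isoformat(),
            original_value=self.value,
        ))
        self.value = new_value
        self.version += 1


# A measured depth captured in feet is converted to metres by a downstream
# system; the original value, the reason, and the source survive the move.
depth = DataItem(name="measured_depth", value=8200.0)
depth.transform(2499.4, source_system="ops_db", description="unit conversion ft -> m")
print(depth.version, depth.provenance[0].original_value)  # 2 8200.0
```

Carrying even this much context with the data means a later consumer can see what a value was before a conversion, which system changed it, and why.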
These data movements happen constantly in our industry because every stakeholder is heavily dependent on data that comes from not one but many outside sources and software systems. Each group that has its own definition of what good, well-formed data looks like must also accept the attendant consequence: verifying and amending every bit of data it receives.
That is expensive and inefficient, but many data managers report that they spend 50% or more of their time doing exactly that—even when data comes to them in digital form from trusted vendors. That is because each data creator has its own unique rules for what “good” data should look like, and theirs are not the same as yours.
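To make the point concrete, here is a small, hypothetical Python sketch of the kind of local validation a data receiver might run on an incoming well header record. The rule set (required identifier, allowed status values, coordinate range) is an assumption for illustration only; every organization maintains its own version of these rules, which is exactly why the same record can pass the sender’s checks and fail the receiver’s.

```python
# Illustrative only: one receiver's local definition of "good" well header data.
# A different organization would apply different rules to the same record,
# which is why incoming data is so often re-verified and amended on arrival.
ALLOWED_STATUSES = {"ACTIVE", "SUSPENDED", "ABANDONED"}


def validate_well_record(record):
    """Return the list of problems found by this receiver's local rules."""
    problems = []
    if not record.get("uwi"):
        problems.append("missing well identifier (uwi)")
    if record.get("status") not in ALLOWED_STATUSES:
        problems.append(f"unrecognized status: {record.get('status')!r}")
    latitude = record.get("latitude")
    if latitude is None or not -90.0 <= latitude <= 90.0:
        problems.append("latitude missing or out of range")
    return problems


# The sender considered this record clean; our local rules still flag the status value.
incoming = {"uwi": "100/01-01-001-01W1/0", "status": "Producing", "latitude": 51.05}
print(validate_well_record(incoming))  # ["unrecognized status: 'Producing'"]
```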
Unless we work together to resolve the underlying causes of data problems, we will continue to struggle with the time and effort needed to get the foundations of data preparedness sorted out. That means that industry must accept that data is strategic rather than tactical. It also means that each organization must broaden its expectations to address the overall needs of industry rather than only its own.
What Happens When Users Want To Use Data?
Users who require data run into critical delays or even showstoppers because the data requires intervention before it is suitable for what they need to do. Problems arise when frustrated and impatient business units circumvent data control systems in order to use analytics tools.
The Professional Petroleum Data Management Association (PPDM) works with data scientists from many companies. The message from these scientists is consistent: “We were hired to do analytics, but we can’t because data is simply not in a state of readiness. It is scattered between systems, and the problems we encounter in preparing it can’t be solved with data science. That’s a job for data management professionals.”
Delays caused by reworking digital data to make it fit-for-purpose slow business down and can cost your company competitive advantage. On the flip side, the risk introduced by bad data can be incalculable.
Considerations for Data Strategies
Data professionals are tasked with figuring out where and how data attenuation occurs and then preventing it from happening, repairing the damage, or mitigating the impact on our many stakeholders and users.
Data attenuation that happens before you receive data can be difficult, and sometimes impossible, to prevent, particularly when the attenuation results from proprietary systems, vocabularies, and processes applied to the data by another stakeholder, such as a field service company. Often, the data receiver has limited or no control over the systems and processes applied before the data arrives.
Clearly, the best way to fix an expensive and time-consuming problem is to get it sorted out before it becomes a problem. That’s why data professionals get involved with organizations such as the PPDM Association.
Is Technology a Solution to Data Problems?
Technology has a shelf life, often 2 or 3 years. And most technology is aimed at addressing specific tactical needs. Even master data management software systems have scope limitations that must be considered by consumers.
Data also has a shelf life, of course, but that life is measured in decades rather than years. It must be available to many users for purposes that may not be fully imagined today. It is often attractive to hope that the right technology will solve your data problems, but that doesn’t work because data must survive many technology advances and systems.
Strategic data solutions require strategic data governance programs that consider the holistic nature of the purposes for data. The tactical solutions used to implement the governance programs must support both the strategic and tactical demands on data systems.
This, of course, is why industry standards are developed, particularly when they are technology neutral. The PPDM Data Model captures strategic needs for data systems and has been expressed in relational form for convenience. Separating the strategic content from the legacy relational expression is an objective of the PPDM Association.
Other PPDM standards, such as “What is a Well” and “What is a Completion,” are fully technology neutral.
An Ounce of Prevention
Data attenuation has deep roots in the social, cultural, and technological history of the oil and gas industry. Increasingly, the negative consequences to industry stakeholders are resulting in unacceptable levels of inefficiency and redundancy. Data professionals help resolve these problems at various stages of exploration and production workflows and processes and strive to resolve root causes in the future.
Prevention requires industry to collectively identify the sources of data attenuation and develop processes to prevent it where appropriate. That means we need to work together, because attenuation that is not a problem for one group of users may be catastrophic for others or may delay the adoption of new systems and technologies in the future.
The prize we seek is worth it. Oil and gas prices are always going to be volatile. A strong foundation will help us position industry for efficiency, safety, and effectiveness. Having good, trusted data helps industry make good decisions that balance the needs of all stakeholders.