Best Practices for Managing Data, From Acquisition to Archive
Data-intensive work flows at the field level have always presented challenges in the energy industry. As data sets collected in the field have grown exponentially, data management challenges have both evolved and compounded. This article addresses these pain points and presents strategies and solutions that should be implemented to ensure data compliance and entitlement and promote ease of accessibility, both internally and externally.
For every segment of the energy industry—upstream, midstream, and downstream—the ability to aggregate, transport, and analyze critical data faster is a competitive advantage. For upstream applications specifically, when it comes to data, time is of the essence. Whether the data are used to assess subsurface geologic features or to identify potential risks downhole or in surface facilities, it is crucial to capture and process large digital data sets with the highest possible speed and integrity.
The following sections of this article discuss the various phases involved in upstream data management, including:
- Presurvey leasing, permitting, and contract management
- Data acquisition in the field
- The challenges presented by raw, unstructured mass data sets
- Optimizing legacy data
After addressing the key data management issues operators typically encounter in each of these phases, the article introduces best practices designed to address each of them.
Work Flow Phase 1: Seismic Data Acquisition
Every aspect of exploration and production company operations—from geophysical surveys to drilling, hydraulic fracturing, logging and coring, and well testing—depends on geophysical data. Managing that data, however, has become increasingly complex, as the evolution of data acquisition technologies has outpaced approaches to data management in the field. In order to build a focused and sustainable data management process, it is important to implement a logical, systematic approach to data management processes, beginning at the seismic data acquisition phase.
Presurvey Leasing, Permitting, and Contract Management
Before a survey can even begin, operators must efficiently review, negotiate, finalize, and execute leases, permits, and seismic data acquisition contracts, all of which depends on a streamlined approach to managing legacy exploration and production data. On top of this, operators recognize the importance of geophysical and geological data entitlement, making accounting for potential legal ramifications and determining proactive data management solutions a top priority. Siloed work flows at this stage can restrict productivity and limit business opportunities, sometimes preventing a project from getting off the ground in the first place.
Data Acquisition in the Field
When it comes to the actual execution of a seismic shoot, whether on land or at sea, it is essential to adopt and maintain a continuous, iterative approach to data consolidation, copy, transfer, and ingest. Before energy companies can transform data into a deliverable, they must first establish a physical data storage strategy that guarantees effortless and affordable scalability in the field, frictionless physical data transfer from edge to data center to cloud, and coordinated, efficient data management along the way.
Key issues include the following:
- Inability to scale (as data collection requirements increase)
- Limited storage in the field for applications
- Limited options in data transfer because of large data volumes
- Data organization
- Traceability throughout the data life cycle
Data Management Best Practices
When it comes to managing data in the field, upstream operations can significantly reduce their hardware footprint by recording seismic, survey, and surveillance data to storage arrays rather than to hundreds of tapes. This approach substantially increases available disk space while reducing IT support requirements at the edge and maximizing the resiliency of edge data storage.
To achieve these goals, upstream companies should leverage high-capacity, ruggedized, and securely encrypted storage arrays to create secure copies of raw data so that, before data is manipulated or moved, one copy of the raw data is securely archived. Using high-performance storage arrays not only enables operators to keep up with simultaneous recording, backup, and processing work flows but also allows them to maintain a small data storage infrastructure footprint, making physically shuttling data from field to ingest logistically straightforward. Using securely encrypted drives that implement a user key management service layer protects data both in the field and in transit, providing robust security against both cyber and physical threats.
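The "archive one pristine copy before anything touches the data" practice can be sketched in code. The following is a minimal, hypothetical Python example (the function names and directory layout are illustrative assumptions, not part of any vendor's toolkit): it streams a SHA-256 checksum of a raw recording, copies it to an archive location, and verifies the copy bit-for-bit before the original is ever manipulated. Encryption would be handled by the drive or array itself, as described above, so it is not modeled here.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1-MB chunks so large field recordings never sit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def archive_raw_copy(source: Path, archive_dir: Path) -> Path:
    """Copy a raw recording into the archive and verify the copy against the source."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    destination = archive_dir / source.name
    original_hash = sha256_of(source)
    shutil.copy2(source, destination)  # copy2 preserves timestamps and metadata
    if sha256_of(destination) != original_hash:
        raise IOError(f"checksum mismatch while archiving {source.name}")
    return destination
```

In practice, the recorded checksum would also be logged alongside the archive so that integrity can be re-verified at any later point in the data life cycle.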
Additionally, these operations should take advantage of a physical data transport service to simplify data management logistics. By using a cost-effective data transfer services model, operators benefit from just-in-time device delivery to and from any location, reducing data management logistical overhead and maximizing existing budgets by ensuring individual projects pay only for the hardware they need. Leaning on the additional strengths of high-capacity, portable devices to aggregate mass data sets from the edge, operators can expect optimized performance in remote environments and robust data security, as well as scalable and efficient in-field storage for business insights. Implementing well-defined, consistent, and documented procedures to organize data management between parties also ensures data isn’t misplaced along the way.
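One common way to make handoffs between parties documented and auditable is a transfer manifest: the sending side records every file's name, size, and checksum, and the receiving side verifies the shipment against that record. The sketch below is a hypothetical illustration of that procedure (the JSON layout and function names are assumptions for this example, not a standard format):

```python
import hashlib
import json
from pathlib import Path

def build_manifest(data_dir: Path, manifest_path: Path) -> dict:
    """Record the name, size, and SHA-256 of every file being shipped."""
    entries = []
    for f in sorted(data_dir.rglob("*")):
        if f.is_file():
            entries.append({
                "name": str(f.relative_to(data_dir)),
                "bytes": f.stat().st_size,
                "sha256": hashlib.sha256(f.read_bytes()).hexdigest(),
            })
    manifest = {"file_count": len(entries), "files": entries}
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest

def verify_manifest(data_dir: Path, manifest_path: Path) -> list:
    """On the receiving side, return a list of any missing or corrupted files."""
    manifest = json.loads(manifest_path.read_text())
    problems = []
    for entry in manifest["files"]:
        f = data_dir / entry["name"]
        if not f.is_file():
            problems.append(f"missing: {entry['name']}")
        elif hashlib.sha256(f.read_bytes()).hexdigest() != entry["sha256"]:
            problems.append(f"corrupted: {entry['name']}")
    return problems
```

An empty result from `verify_manifest` gives the receiving party a documented confirmation that the shipment arrived complete and intact, which supports traceability throughout the data life cycle.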