Geomechanics in the petroleum industry is a multidisciplinary field, collaboratively drawing in the brightest minds in the geosciences, engineering, and mathematics to solve quantitative subsurface problems with the common foundation of stress, strain, and rock or soil mechanics. Disciplines that utilize geomechanics techniques include structural geology, drilling engineering, reservoir production engineering, rock physics, and seismology.
As a technical specialization, geomechanics has thrived for at least the past century (for example, A. A. Griffith's seminal work on fracture mechanics in 1921), with subsurface petroleum applications persisting at the forefront of the science. Whereas fundamental theoretical research in geomechanics seems to progress asymptotically (i.e., incremental discovery of more complete governing physics equations), in recent years applications development in reservoir geomechanics has grown exponentially, to a large extent by taking advantage of significant computing breakthroughs. A few examples include advances in downhole fiber-optic sensing, such as distributed acoustic sensing (DAS) and distributed temperature sensing (DTS), 4D time-lapse seismic imaging, and microseismic analysis. In many cases, the value of the technological advancement has centered on data integration and multidisciplinary strategies. With that in mind, I highlight five key innovation themes that relate to advancing integrated reservoir geomechanics capabilities.
Unified Digital Platforms. To facilitate data sharing and cross-functional collaboration, geologists, geophysicists, completions engineers, petrophysicists, reservoir engineers, data statisticians, etc., need to have shared access to the most current data and software applications. Many geomechanics problems that are critical to subsurface operations follow nonlinear workflows that require iteration between diverse data sets. Consider the example of a geomechanics workflow for conducting structural containment analysis, a risk assessment of the mechanical uncertainties surrounding fault stability, fault permeability, or hydrocarbon trap retention. The analysis requires building a structural framework from 2D and 3D seismic, basement surveys, and well ties. The steps require a velocity model, stratigraphic interpretation, fault interpretation, and detailed reconciliation of fault-horizon offsets. Mechanical modeling of this framework requires derivation of mechanical properties, interpretation of mechanical stratigraphy, and stress solutions at the wellbore or in full 3D over the volume.
Solving this sort of analysis requires a team of workers and access to multiple data types. If every step happens in isolation, using incompatible tools, the workflow becomes tortuous, and the robustness of the solution and the ability to cross-validate it are compromised. The remedy is to perform the work from a centralized working space, a common or unified platform. A unified digital platform can consist of a single host software package with a robust data management system and embedded geomechanics modules, such as Petrel, SKUA-GoCAD, MOVE, TrapTester T7, or DSG, or it can be a streamlined integration methodology linking several mutually compatible tools. Starting with a common analysis platform is critical for reducing workflow gaps, facilitating seamless data access, and ensuring two-way connectivity across analysis steps. As reservoir geomechanics solutions become increasingly comprehensive, shared, multidisciplinary efforts, a unified digital platform serves as the central enabler of collaborative problem solving and accelerated innovation.
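As a minimal illustration of one computational step in the containment workflow described above, the sketch below resolves an assumed Andersonian stress state onto a fault plane and reports its slip tendency (the ratio of shear stress to effective normal stress). The stress magnitudes, SHmax azimuth, pore pressure, and fault orientation are hypothetical placeholders; in a platform-based analysis these inputs would come directly from the shared structural and mechanical model.

```python
import numpy as np

def fault_slip_tendency(strike_deg, dip_deg, shmax, shmin, sv,
                        shmax_azimuth_deg, pore_pressure):
    """Resolve a stress tensor onto a fault plane and return slip tendency
    (shear stress / effective normal stress). Coordinates are North-East-Down;
    angles in degrees; stresses in any consistent unit (e.g., MPa)."""
    a = np.radians(shmax_azimuth_deg)
    # Columns of R are the principal stress directions (SHmax, Shmin, vertical) in NED coordinates.
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    S = R @ np.diag([shmax, shmin, sv]) @ R.T      # stress tensor in geographic coordinates

    phi, delta = np.radians(strike_deg), np.radians(dip_deg)
    # Unit normal to the fault plane (right-hand-rule strike convention).
    n = np.array([np.sin(phi) * np.sin(delta),
                  -np.cos(phi) * np.sin(delta),
                  np.cos(delta)])

    t = S @ n                                      # traction vector acting on the plane
    sigma_n = float(n @ t)                         # total normal stress
    tau = float(np.sqrt(max(t @ t - sigma_n**2, 0.0)))  # shear stress magnitude
    return tau / (sigma_n - pore_pressure)         # slip tendency on effective normal stress

# Hypothetical example: a fault striking 060 and dipping 75 degrees in a strike-slip regime.
print(fault_slip_tendency(strike_deg=60, dip_deg=75,
                          shmax=75, shmin=45, sv=60,
                          shmax_azimuth_deg=110, pore_pressure=25))
```

In practice, this calculation would be repeated over every mapped fault segment in the structural framework, which is precisely why shared access to the fault interpretation, stress model, and pore pressure data matters.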
Automated Assistance. Many steps in a geomechanical analysis can be tedious and inefficient when performed using ad hoc or manual model-building workflows. Although the foundational technology currently exists to solve a range of problems (for example, wellbore deformation, stress perturbations around faults, or hydraulic fracture propagation), the steps for building these models depend heavily on the biases of the user. For example, geometric modeling workflows tailored to drilling engineering problems might adopt a wellbore-schematic view of the subsurface. Intensive multiphase flow applications might focus on complex multiphysics equations while assuming idealized topologies that ignore key geologic structures. Even though geomechanics specialists may be aware that the tools tend to emphasize certain parts of the workflow while glossing over other key controls, most current geomechanics solutions are deterministic. This leads to overly constrained solutions and insufficient treatment of uncertainties. A key goal of automated assistance is to help the user test different scenarios without getting bogged down by mundane or repetitive processes.
For example, a structural deformation problem solved using the 3D finite element method requires input of horizons, faults, mechanical properties, and boundary conditions. A typical workflow treats each of these inputs deterministically. In reality, however, horizons and faults may be interpreted in more than one way and still be structurally admissible, and the seismic data have a resolution of a few tens of meters. The geometric architecture alone carries uncertainties that may affect the stress solution. Similarly, mechanical properties and far-field stress assumptions are typically derived from limited wellbore data, whereas their three-dimensional distribution exists in the uncertainty space of empirical correlations and probability density functions. Automated assistance is thus needed to (a) efficiently interpret admissible geometric variations; (b) rapidly mesh different horizon and fault configurations; (c) set up batch files to conduct accelerated scenario testing; and (d) process results for statistical analysis (e.g., experimental design matrices) or adaptive simulation (e.g., machine learning). As subsurface data collection techniques continue to improve, and as subsurface models become more complex, advances in automated assistance for data interpretation and modeling will help to resolve the intrinsic uncertainties.
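A minimal sketch of item (c) is shown below, assuming a hypothetical finite element solver that reads JSON scenario files: mechanical properties, far-field stress, and a simple geometric perturbation are drawn from assumed probability distributions and written out as a batch of input decks, so the analyst reviews an ensemble of stress solutions rather than a single deterministic run. The parameter names, distributions, and solver command are placeholders, not the interface of any specific commercial tool.

```python
import json
import pathlib
import numpy as np

rng = np.random.default_rng(seed=42)
n_scenarios = 50
out_dir = pathlib.Path("scenarios")
out_dir.mkdir(exist_ok=True)

for i in range(n_scenarios):
    # Draw uncertain inputs from assumed (hypothetical) distributions; in practice these
    # would come from log-derived correlations and their fitted probability density functions.
    scenario = {
        "youngs_modulus_gpa": float(rng.lognormal(mean=np.log(25.0), sigma=0.25)),
        "poissons_ratio": float(rng.uniform(0.20, 0.30)),
        "shmax_azimuth_deg": float(rng.normal(110.0, 10.0)),
        "shmin_gradient_mpa_per_km": float(rng.normal(15.0, 1.5)),
        "horizon_pick_shift_m": float(rng.normal(0.0, 15.0)),  # geometric uncertainty ~ seismic resolution
        "mesh_file": f"mesh_{i:03d}.msh",  # placeholder: meshes regenerated per geometric variant
    }
    (out_dir / f"scenario_{i:03d}.json").write_text(json.dumps(scenario, indent=2))

# A scheduler or shell script would then launch the (hypothetical) solver over the ensemble, e.g.:
#   for f in scenarios/scenario_*.json; do my_fe_solver --input "$f" --output "${f%.json}_results.h5"; done
```

The value of this kind of automation is less in the sampling itself than in removing the manual rebuild-and-rerun loop, so that the ensemble of results can feed directly into statistical post-processing or a machine learning surrogate.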
Streamlined Data Pre-Conditioning. Machine learning and other data analytics approaches require access to large quantities of quality-controlled data. Many companies and research groups are starting to realize the challenge of archiving and filtering through decades of legacy data, ranging from production records to geologic thin sections to rock mechanics tests, while simultaneously collecting and analyzing new data. All subsurface data must go through rigorous pre-conditioning, quality control, and archiving processes before being considered suitable for data analytics.
For example, in many fields, triaxial compression tests on plugs from rock core are, by standard convention, loaded under estimated in situ confining stresses corresponding to the depth and pore pressure of the formation. However, it is well known that rock strength is a function of the level of confinement. In other words, if a set of confined rock strength tests is run under variable confinement (rather than at identical confining stress), the peak strengths measured in the different tests cannot be compared directly without accounting for confinement. Another example is that image log resolution and data quality are known to vary by tool and mud type (oil- vs. water-based). Because of these resolution differences, natural fracture density interpreted from a vintage image log acquired in oil-based mud should not be compared directly with that from a modern high-resolution imaging tool run with carefully optimized mud chemistry. Given the variance in quality, resolution, and completeness of subsurface mechanical data, machine learning in geomechanics is not plug-and-play; it requires streamlined processes and a significant data management infrastructure.
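As one illustration of the kind of pre-conditioning step implied here, the sketch below fits a linear Mohr-Coulomb envelope, sigma_1_peak = UCS + q * sigma_3, to triaxial peak strengths measured at different confining pressures, then projects each test to a common reference confinement so that strengths from different depths can be compared on a like-for-like basis. The data values are invented for illustration, and a real workflow would also screen for sample quality, saturation, and loading rate.

```python
import numpy as np

# Hypothetical triaxial results: confining stress sigma_3 and peak axial stress sigma_1 (MPa),
# each test loaded at the estimated in situ confinement for its own depth.
sigma_3 = np.array([5.0, 12.0, 20.0, 35.0])
sigma_1_peak = np.array([92.0, 121.0, 158.0, 219.0])

# Fit the linear Mohr-Coulomb envelope sigma_1 = UCS + q * sigma_3.
q, ucs = np.polyfit(sigma_3, sigma_1_peak, 1)

# Friction angle implied by the slope, q = (1 + sin(phi)) / (1 - sin(phi)).
phi_deg = np.degrees(np.arcsin((q - 1.0) / (q + 1.0)))

# Project every test to a common reference confinement so strengths become comparable.
sigma_3_ref = 15.0                                  # MPa, arbitrary reference for comparison
residuals = sigma_1_peak - (ucs + q * sigma_3)      # scatter about the fitted envelope
sigma_1_at_ref = ucs + q * sigma_3_ref + residuals  # confinement-normalized strengths

print(f"UCS ~ {ucs:.1f} MPa, friction angle ~ {phi_deg:.1f} deg")
print("Strengths normalized to sigma_3 =", sigma_3_ref, "MPa:", np.round(sigma_1_at_ref, 1))
```

This sort of normalization, metadata capture, and unit checking is exactly the pre-conditioning that has to happen before heterogeneous legacy test data can be pooled for analytics.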
Optimized Computing. High-end process-based forward modeling and machine learning have the potential to unlock many mysteries of the subsurface. Using standard commercial software, it is now relatively straightforward to perform advanced multiphysics simulations on geologically reconciled models with tens of millions of cells, ranging from coupled thermo-poroelasticity to large-strain inelastic deformation to anisotropic seismic inversion of geomechanical attributes. Running these sorts of complex physics simulations on small-scale problems, or problems of limited extent, is tractable on a standard desktop workstation. However, to take advantage of high-performance computing (HPC) capabilities, companies have to invest in infrastructure while ensuring accessibility at the user level. In addition to investments in hardware, software license fees also increase significantly with parallel or cluster computing. In some cases, larger companies develop their own internal geomechanics software to allow unlimited use of specialized reservoir applications, thus bypassing licensing limits. More companies are starting to utilize cloud-based technology to increase computational power while decreasing infrastructure costs. Even so, widespread access to optimized computing remains a major bottleneck on the path to innovation; in my experience, many geomechanics modelers are still running their simulations on desktop workstations where computational power is limited.
Broad Access to Technical Education. Cross-functional geomechanics training is becoming more important to keep up with a growing industry trend toward more collaborative teams. For example, petrophysicists who work with sonic logs have more impact on their teams if they have had exposure to the latest geological methods for modeling mechanical stratigraphy, and they add even more value if they can knowledgeably help coordinate rock mechanics laboratory testing. Geophysicists who produce seismic volume attributes are better equipped to help explain how interpreted geologic features drive the production response if they have exposure to production simulation modeling. The days of technical silos, "handing off" data, or "tossing results over the fence" are fading.
Because reservoir geomechanics encompasses such a broad range of scientific disciplines, it is challenging for most people to get exposure to all the types of geomechanical analysis done in the subsurface. While on-site classroom instruction, field-based training, and learning through conferences and industry consortia are still effective, online training fills a crucial education gap with fit-for-purpose modules and flexible scheduling. Moving forward, there is an opportunity for the geomechanics community to embrace and sponsor a wider range of accessible training options to keep up with increasing demand for fundamental skills, such as refresher courses on solid mechanics, thermodynamics, or rock physics, as well as application skills such as MATLAB programming, creating Petrel plug-ins, or commercial software training.