Reservoir characterization

Anadarko Works To Balance Physics With Data Science

Merging tried-and-true physics-based models with data science is bolstering the Houston independent’s reservoir-engineering work on its deepwater and shale assets.

Credit: Anadarko.

Momentum has been building lately in the reservoir-engineering space for the merger of physics and data science in an effort to provide a more complete view of what’s occurring in the subsurface.

The goal of combining the disciplines is to pair the high fidelity of physics-based models, which provide good physical insight, with the speed of data-driven models, which support quick decisions.

During the recent Data-Driven Drilling and Production Conference in Houston, Sathish Sankaran, engineering manager in Anadarko’s Advanced Analytics and Emerging Technologies program, noted three themes underscoring why data analytics is increasingly being applied to more traditional methods in reservoir engineering.

“The first one is around maybe a lack of a good conceptual model where we do not have a good understanding of the underlying physics,” Sankaran explained. “For example, in the context of unconventionals, maybe we do not adequately understand the fluid flow through a fractured reservoir network. And, therefore, we have to rely on alternate techniques to effectively make decisions.

“Even if you do understand the physics, there could still be difficulty in just characterizing the inputs that need to go into those models themselves. In the context of fluid and core, we may not be collecting them in all our wells, for example,” he said. This can be especially true for large-field unconventional developments where fewer fluid samples, log data, and core data may be gathered.

And, “if you do end up having a physics-based model, and if you have all the inputs, you may still have difficulty in just running them at scale because of the complexity or maybe the large size of the models,” he said. This can make performing daily runs on a model for a large field difficult.

Recently, there has been a push to reduce the complexity of physics-based models, while preserving the physics, so that problems can be solved much faster. “The starting point here is still the physics-based simulation—we are trying to reduce the dimensionality of the problems so that we can run them in much faster time,” Sankaran said.

Elsewhere, there has been a push for the use of data-physics models, or hybrid models. “The idea is, let’s try to incorporate physics starting from a data-driven model,” which can be done in two ways, he said.

One entails introducing physics-inspired features into data-driven models. Anadarko has used this approach internally as a replacement for pressure transient analysis, engineering features that mimic the characteristic flow regimes used in well-testing analysis. The other is to introduce “physics-constrained optimization functions,” he said. When an engineer is solving a machine-learning problem, this means adding a term that regularizes on the basis of physical consistency or inconsistency.
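As a rough illustration of the second idea—not Anadarko’s actual formulation—the sketch below fits a trend to synthetic pressure data with a loss that combines a data misfit and a penalty term that activates only when the fit is physically inconsistent (here, a hypothetical constraint that pressure should not increase in a depleting well):

```python
import numpy as np

# Hypothetical data: bottomhole pressure declining over time (days, psi)
t = np.linspace(0, 100, 50)
rng = np.random.default_rng(0)
p_obs = 5000 - 12 * t + rng.normal(0, 50, t.size)

def loss(theta, lam=10.0):
    """Data misfit plus a physics-consistency penalty.

    The penalty regularizes toward physically consistent behavior:
    pressure should not rise with time in a depleting well, so a
    positive slope is penalized and a negative slope costs nothing.
    """
    a, b = theta
    p_pred = a + b * t
    misfit = np.mean((p_pred - p_obs) ** 2)
    physics_penalty = max(b, 0.0) ** 2  # nonzero only if slope is positive
    return misfit + lam * physics_penalty

# A crude grid search stands in for a proper optimizer
candidates = [(a, b) for a in np.linspace(4800, 5200, 41)
                     for b in np.linspace(-20, 5, 51)]
best = min(candidates, key=loss)
print(best)  # the fitted slope should be negative (physically consistent)
```

In a real application the penalty would encode a governing equation or material-balance residual rather than a simple sign constraint, but the structure of the objective is the same.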

Deepwater, Unconventional Case Studies

Ultimately, the main objective for Anadarko in combining data and physics, Sankaran said, is to solve business problems. One case study that he outlined involved production control optimization for five wells producing from channelized reservoirs in a deepwater field in the Gulf of Mexico. Anadarko wanted to determine the different combinations of ways the wells should be flowing.

The team started with a physics-based simulation and ran a handful of carefully designed scenarios with different bottomhole pressure profiles as the design inputs. It then collected and stacked all of its pressures and saturations at every cell block and converted them into a reduced-order modeling method to achieve a much smaller dimension, “which now runs like 30 to 40 times faster,” he said.
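The snapshot-stacking step described above resembles proper orthogonal decomposition (POD); the article does not name the specific reduction method, so the following is a minimal SVD-based sketch on synthetic snapshot data:

```python
import numpy as np

# Hypothetical snapshot matrix: each column stacks pressures and
# saturations at every cell block for one simulated scenario/timestep.
rng = np.random.default_rng(1)
n_cells, n_snapshots = 2000, 40
# A low-rank synthetic field plus noise stands in for simulator output
basis_true = rng.normal(size=(n_cells, 3))
coeffs = rng.normal(size=(3, n_snapshots))
snapshots = basis_true @ coeffs + 0.01 * rng.normal(size=(n_cells, n_snapshots))

# Proper orthogonal decomposition: SVD of the stacked snapshots
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1  # modes capturing 99% of energy
basis = U[:, :r]

# Any state is now represented by r coefficients instead of n_cells values,
# which is the source of the order-of-magnitude speedup
reduced = basis.T @ snapshots            # shape (r, n_snapshots)
reconstructed = basis @ reduced
err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
print(r, err)
```

The design choice is the energy cutoff: a looser cutoff gives a smaller, faster model at the cost of reconstruction accuracy.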

“We can estimate now the pressures and saturations for any future state, starting with any initial conditions, for completely different bottomhole pressure profiles,” he explained. This is merged with machine learning—a random forest regression model—to predict the wells’ oil-, gas-, and water-flow rates over time.
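A hedged sketch of the machine-learning half: a random forest regression, as named in the article, trained here on entirely synthetic features standing in for reduced-model states near each well, with rate as the target:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training set: reduced-order model states (e.g., pressure
# and saturations in each well's drainage zone) as features, observed
# oil rate as the target. All values here are synthetic.
rng = np.random.default_rng(2)
X = rng.uniform(size=(500, 6))
y = 1000 * X[:, 0] * (1 - X[:, 1]) + rng.normal(0, 10, 500)  # synthetic rate

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Predict rates for new simulated states from the reduced model
X_new = rng.uniform(size=(5, 6))
print(model.predict(X_new))
```

In practice one model per phase (oil, gas, water) or a multi-output regressor would be trained, and the features would come from the reduced-order simulation rather than random draws.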

Another example supported well performance and forecasting for a large unconventional field. Anadarko sought to use surface pressure data, which it converted into bottomhole pressure data, and then do other forms of automated analysis. “The key challenge here,” Sankaran said, “is that we do not collect PVT [pressure/volume/temperature] for all our wells—less than 5% of our wells have actual fluid samples taken. We needed to have a way of characterizing the fluid first to do our PVT calculations.”

The team used a nonparametric regression technique along with routinely collected flowback data to estimate PVT. With that in hand, it applied the diffusive time-of-flight method to estimate production drainage volume over time. After creating a productivity index and measuring it vs. time, the team detected a frac hit at a well. However, in this case, it so happened that the frac hit improved the well’s productivity. “Now we can go and apply different kinds of decline curves, etc., to see what kind of a forecast we can expect from this well,” he said.
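The article does not say which decline model was applied; a common choice for unconventional wells is the Arps hyperbolic form, sketched below with synthetic rate data and a hypothetical forecast window:

```python
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / np.power(1.0 + b * di * t, 1.0 / b)

# Hypothetical monthly oil rates for one unconventional well
t = np.arange(1, 37)                       # months on production
rng = np.random.default_rng(3)
q_true = arps_hyperbolic(t, qi=800, di=0.15, b=1.1)
q_obs = q_true * (1 + rng.normal(0, 0.03, t.size))

# Fit the decline parameters to observed rates
popt, _ = curve_fit(arps_hyperbolic, t, q_obs,
                    p0=[1000, 0.1, 1.0],
                    bounds=([1, 1e-3, 0.1], [5000, 2, 2]))
qi_fit, di_fit, b_fit = popt

# Extend the fitted curve as a forecast, months 37-60
forecast = arps_hyperbolic(np.arange(37, 61), *popt)
print(qi_fit, di_fit, b_fit)
```

An event such as the frac hit described above would show up as a break in the fit residuals, which is one reason the team tracks productivity index over time before forecasting.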

“None of these techniques needs to be used in isolation,” Sankaran said. “You can also mix and match them, depending on, at the end of the day, your modeling purpose.”

Sarath Ketineni, Chevron lead reservoir engineer, noted after Sankaran’s presentation that several physics-based modeling techniques are gaining popularity in the unconventional space, including embedded discrete fracture modeling and discrete fracture network modeling. “And on the other extreme, we have the type-curve sort of approach where we say, if you want to drill a well [in a certain region], you’ll get this amount of oil.” However, those models need to be normalized for lateral length and completion design.

“Unconventionals is a place that can definitely take the best of both worlds, and data physics is a right combination that will work great,” Ketineni said. “We need to do a little bit of both. We need to understand singular models using physics the best we can in terms of understanding where the fractures are propagating” but also zoom out to a basin-wide perspective where a broader model is needed, he said.