Unconventional/complex reservoirs

Does Physics Really Matter in Unconventionals?

A lot has been learned about shale, but those working to eke out oil from that ultratight rock still extract more value from data than physics.

A cored section of hydraulically fractured rock that was examined as part of the private-public hydraulic fracturing test site program in the US.
Source: US Department of Energy.

When a panel of fracturing technology leaders was asked if classic physics-based engineering matters in fracture design, the answer was a qualified “sometimes.”

The group of three engineers speaking at the start of the SPE Hydraulic Fracturing Technology Conference was not going to dismiss the need for physics-based modeling. Still, applying the physics of flow in a complex, fractured reservoir sounded like a wrong turn.

To explain further, Cameron Rempel, vice president for subsurface engineering at Occidental Petroleum, compared analysis at the fracture level to trying to understand rush hour traffic by tracking each person as they pack up in their cubicle and head for their car at the end of the day.

That example, which will someday again represent office reality, is incredibly hard to measure and analyze, and it does not offer a direct path to the analogous question that matters to oil producers: How can we measure the time it takes for all those cars to flow out of downtown and find ways to speed them up?

Or, for a panel of oil company executives: How do you maximize your return when draining a reservoir using technology and innovation?

Rob Fast, chief technology officer for Hess, described how the company used multiple instrumented test wells in the Bakken Shale to track the flow of oil from the reservoir to the production well. It showed a well in the ultratight formation was producing from a wider area than expected, including the Three Forks formation just below it.

That research investment yielded an idea, which Hess is now piloting, that could significantly increase the value of the wells it develops: widening spacing from as tight as 350 ft to 700 ft or more and limiting the number of wells specifically targeting the Three Forks.

As for the questions about the role of physics, Fast said it is needed for modeling how these challenging petroleum systems work because learning by drilling is so expensive.

Trey Lowe, vice president for technology at Devon Energy, said that while physics is required for modeling, Devon’s shale development has been driven by working with data science.

That change began in 2015 with a push for remote monitoring. Lowe said there was a strong business case for focusing on data-driven change, one not shaken by the fact that “it was completely disorganized at the time.”

Gaining control of its data required Devon to create systems to gather and build an easily accessible database—using automation when possible—that reduced the time and skill needed for the analysis. And it built a staff of technical professionals able to effectively use the data.

Two pressure pumping charts showed how the company added value with data.

The first, from February 2015, showed that periods of active fracturing were separated by long breaks. When the engineers analyzed the data, they found the reasons for those gaps ranged from waiting for sand deliveries to working on wellheads while shifting flow from well to well during a zipper frac.

The second chart showed a recent zipper frac with almost no time lost between stages. Over the 5 years from 2015 to 2020, those delays shrank thanks to changes ranging from contracts that ensured timely sand delivery to quick-connect wellheads that sped switching between wells.

Recently, the company was actively fracturing for 22.7 hours in a single day in the Powder River Basin, Lowe said.
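The gap analysis Lowe described can be approximated with a few lines of code. The sketch below is not Devon’s actual workflow; it assumes a simple time series of slurry-rate readings and a hypothetical pumping-rate cutoff (`PUMPING_THRESHOLD`) to separate active fracturing time from idle gaps, the two quantities those charts compared.

```python
# Hypothetical 1-minute samples of slurry rate (bbl/min) from a frac job.
# Any reading above a small threshold counts as active pumping; this is a
# simplified stand-in for the kind of gap analysis described above.
PUMPING_THRESHOLD = 1.0  # bbl/min; assumed cutoff, not an industry standard

def pumping_summary(samples, sample_minutes=1):
    """Return (active_hours, gaps), where gaps lists idle-run lengths in minutes."""
    active = sum(1 for rate in samples if rate > PUMPING_THRESHOLD)
    gaps, run = [], 0
    for rate in samples:
        if rate <= PUMPING_THRESHOLD:
            run += sample_minutes
        elif run:
            gaps.append(run)  # an idle run just ended
            run = 0
    if run:
        gaps.append(run)  # idle run that lasted to the end of the record
    return active * sample_minutes / 60.0, gaps

# Toy day: 8 hours pumping, a 90-minute wait for sand, 8 more hours pumping.
day = [5.0] * 480 + [0.0] * 90 + [5.0] * 480
hours, gaps = pumping_summary(day)  # 16.0 active hours, one 90-minute gap
```

Plotting `gaps` by cause, as Devon apparently did, is what turns a pressure-pumping chart into a list of fixable delays.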

Devon’s prime example of the value of this work was a discovery by a pair of alert engineers who noticed one day that pressure changes in a sealed wellbore closely followed readings by fiber-optic cable during fracturing. That led to development of a method now widely used by Devon for tracking and analyzing pressure changes during fracturing, which costs a fraction of doing so with fiber-optic cable, he said.
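The observation those engineers made — that sealed-wellbore pressure tracked the fiber-optic signal — is, at its simplest, a correlation check between two time series. The sketch below shows one minimal way to quantify such agreement with a Pearson coefficient; the signals are illustrative, not field data, and the article does not describe Devon’s actual method.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy signals: gauge pressure (psi) in a sealed monitor well and a
# fiber-optic strain proxy that tracks it; values are made up for illustration.
sealed_well = [5000, 5010, 5040, 5100, 5150, 5140, 5120]
fiber_proxy = [0.10, 0.11, 0.14, 0.20, 0.25, 0.24, 0.22]
r = pearson(sealed_well, fiber_proxy)  # close to 1.0 when the signals track
```

A consistently high correlation is what would justify substituting the cheap pressure gauge for the expensive fiber installation on later wells.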

Data has driven some changes, but it is far from a sure thing.

Rempel’s presentation recounted the terabytes of subsurface data gathered and freely available from fracturing test sites that were paid for by $85 million in US government and industry investment.

Occidental was among the supporters of these extraordinary efforts, such as drilling a slant well through a fractured zone to gain a rare look at actual fractures and how infrequently they are held open with proppant.

But did the industry earn a good return on that massive investment of money and time?

“The answer is a little muddled; it is somewhat yes and somewhat no, and a perhaps in the future,” he said.

On the yes side, the industry could see the sort of complex fractures it could only imagine before, and terabytes of free data are now available, covering everything from fracture interference to measuring how the area drained by fractures shrinks after production begins.

But in terms of payoff for the money spent, Rempel said the answer is no.

The work has not inspired advances in fracture design, where the changes have been “subtle,” he said. Spacing and development decisions are still based mostly on comparisons with historical production, and design changes are still likely to be based on something that worked for a nearby operator.

For the many smaller companies in shale, the approaches described by those three large publicly traded independents are beyond their budgets and staff skill sets.

On the other hand, the shale industry was created by those who figured out things as they went and cleverly used existing technology. Rempel pointed out that engineers used Darcy’s law to understand flow long before anyone understood the physics behind it.
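Darcy’s law itself makes the panel’s point about ultratight rock. In consistent Darcy units, flow rate through a core is q = kA ΔP / (μL), and plugging in illustrative numbers (not measurements from any well) shows why shale permeabilities in the nanodarcy range demand fracturing at all:

```python
# Darcy's law, q = k * A * dP / (mu * L), in consistent Darcy units:
# k in darcies, A in cm^2, dP in atm, mu in centipoise, L in cm -> q in cm^3/s.
def darcy_flow(k_darcy, area_cm2, dp_atm, mu_cp, length_cm):
    """Volumetric flow rate (cm^3/s) through a core plug."""
    return k_darcy * area_cm2 * dp_atm / (mu_cp * length_cm)

# Same geometry and fluid; only permeability changes.
conventional = darcy_flow(0.1, 10.0, 2.0, 1.0, 5.0)   # 100 mD rock -> 0.4 cm^3/s
ultratight = darcy_flow(1e-7, 10.0, 2.0, 1.0, 5.0)    # 100 nD shale -> 4e-7 cm^3/s
```

A millionfold drop in flow rate for the same pressure drop is the gap that fracture surface area has to make up, and it is why the simple law remains useful even where the pore-scale physics is still debated.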

And looking ahead, Fast and Rempel foresaw fully automated completion systems that will adjust in real time to the data flowing in while a job is pumped, without human intervention.

So over time “perhaps we will get value in ways not imagined,” Rempel said.