Industrial computing and modeling is taking the first steps of a renaissance. We can now sit in a coffee shop and pull up the same data that, 8 years ago, we could only access at a desktop terminal. To put things in perspective, 40 years ago the world's most advanced computer was the size of a car and could perform 80 million floating point operations per second (FLOPS). Today, an iPhone zips along at 76.8 gigaflops. That is 960 times faster in 40 years. In 10 years' time, a computer the size of a silver dollar will be powerful enough to run a manned mission to the moon or beat a chess champion. In those 40 years, technology in oil and gas has also come a long way, but are we seeing the same ratio of improvement? Is the industry adopting technology at even a quarter of this speed?
In relation to modeling, not only can we read wellbore trajectories and drilling and reservoir reports on our phones and tablets, but, to a greater degree, we can also examine and predict scenarios, avoiding costly and dangerous situations before they happen. We now rely on dynamic data and, increasingly, on predictive data. But where does the state of the art stand for the two interdependent disciplines of drilling and reservoir engineering? Where is the technology leading us? And, more importantly, why is cutting-edge technology finding it so hard to reach the wider industry?
Human Capital
The industry faces a never-before-seen shortage of trained personnel entering its ranks. It is the "graying of the oil industry." The reasons for this are left for a different article, but I strongly recommend a Wipro report at http://www.wipro.com/Documents/casestudy/Pennenergy-Wipro-Oil-and-Gas-Research-Report.pdf. The main thing to take from that report is that the oil and gas industry is looking to software to maximize the efficiency of a smaller workforce.
A survey by Rice University and Ernst & Young (http://www.starktalent.com/2011/09/oil-and-gas-companies-talent-shortage-is-a-major-concern/) found that 88% of human resource managers at the world's top oil companies agreed that the persistent shortage of people will slow financial growth and performance in the industry over the next 10 years. So when we talk about technology for oil and gas modeling, we need to keep the practical requirement for advancement in mind: allow fewer people to do more.
Looking Back to Look Forward
The late 1970s saw the beginnings of personal computing; Apple was shipping its first computers, and James H. Clark and a group of Stanford graduates would soon found Silicon Graphics (now called SGI), focused on high-end hardware and software for 3D graphics computing.
Fast forward about 5 years, and John Mouton, Royce Nelson, Bob Limbaugh, and Andy Hildebrand decided to take the next level of computing to seismic interpretation. "We were turned down by nearly every venture capitalist on the east and west coasts of America, but we were approached by Sevin Rosen, who had also invested in Silicon Graphics," said Mouton. "The way it was done in the industry at this time, we had huge 2D printouts of the geo reports spread out all over the floor. You had to have good eyes. Engineers would get down on their hands and knees and literally eyeball the trends on the printout."
In a spring 2013 conversation, Gene Ennis, former chief executive officer of Landmark Graphics, told me, "When we brought this onto 3D graphics, we set up a demo for these guys in the oil company, switched it on in the meeting, and everyone was just blown away."
Mouton continued, "We were two software guys and two hardware guys. Our objective was generic: we wanted to bring the advantages of personal computing to how people did their work in oil and gas. Nelson was really pushing for the big oil companies to use the technology for seismic data. Shell was acquiring 3D seismic from boats. The problem was that the data sets were so large that the different layers had to be colored in with pencil on the printouts. The process simply became counterproductive. Our approach was to handle all the data and then apply simple pattern recognition, color, and 3D to the seismic sections. No pencils.
“We had a grass roots approach for selling this. Get the guys on the ground to use this. Soon, so many wanted this as standard that they forced the companies to buy it. We practically incited a revolution in the geosciences workforce.”
Today, Landmark and SGI are industry standards, and software globally has moved to a new frontier. The cloud is not something new in oil and gas; rigs across the Gulf of Mexico (GOM) and the North Sea have been hard-line networked since the late 1970s. What is different about the cloud today is that it has further reach and distributed processing power, and it uses a "client" device (i.e., a smartphone or tablet) to access the data.
There is a conflict of interest at play in today's oil and gas industry, though. Do I adopt a new technology and see how it goes, potentially reaping the rewards of needing fewer people to drill more efficient and productive holes, or do I stick with what works all day, every day, and accept certain losses?
When hundreds of millions, if not billions, of dollars a day are at stake, the answer is clear. The sales timelines of some software companies show that they have to prove their worth over time before making those first big sales, let alone before the "new way" becomes the new norm.
Things Not to Leave Behind
I remember discussing with my dad a tough situation from his early career. The crew was "stuck in the hole during drilling." His account of the experience was not one defined by software solutions or advanced analysis, but one of having a feel for what was going on downhole in order to free up the drill bit. "Bring up the bit, rotate to a certain speed, and bring it down again," he said. With all the technology and data analysis, sometimes you have to bring in your gut and let it steer the operation.
Indeed, a huge amount of knowledge capital is lost with the retirement of each geophysicist and senior drilling manager, and no worthwhile technology can be created without this field-experience "know-how" in mind.
New and Latest
Kevin McClurd of HawkEye tells me about HawkEye3D, a software package widely used by service companies and directional drilling engineers around the world. "We have 3D visual software that provides a view of the formation and the well, in 3D, while it is being drilled. What's exciting is that the visualization is in real time and is coming to mobile soon."
Such technology was historically available only to large operators because of its high ticket cost. Now, a small directional driller can use this software and be up and running within minutes, with a real-time view of the drilling from anywhere in the world. That capability is brought to you by today's cloud.
Facebook famously has one of the best technology infrastructures in the world for data handling. Much of that capability rests on Hadoop, an open-source framework for storing and processing very large data sets across clusters of servers.
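To give a flavor of the idea, here is a minimal, single-process sketch of the map/reduce pattern that Hadoop distributes across a cluster. The records and field names are hypothetical, and real Hadoop jobs are written against its own APIs rather than plain Python.

```python
# A single-process sketch of the map/reduce pattern Hadoop distributes
# across a cluster. The records and field names are hypothetical.
from itertools import groupby
from operator import itemgetter

# (well id, measured depth of one survey report) -- toy input records
records = [("well-A", 2150.0), ("well-B", 980.5), ("well-A", 2180.0)]

def mapper(record):
    well, _depth = record
    return (well, 1)            # map: emit (key, 1) per record

def reducer(key, counts):
    return (key, sum(counts))   # reduce: total records per well

shuffled = sorted(map(mapper, records), key=itemgetter(0))  # "shuffle": group by key
results = [reducer(key, [c for _, c in group])
           for key, group in groupby(shuffled, key=itemgetter(0))]
print(results)  # [('well-A', 2), ('well-B', 1)]
```

The point of the pattern is that the map and reduce steps need no shared state, so a framework like Hadoop can spread them over thousands of machines.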
So who is using this type of technology in oil and gas? Schlumberger, for one. It does not use Hadoop clusters; instead, it offers the Petrel E&P software platform. Petrel is the Schlumberger answer not only to providing a backbone for new advanced software, but also to future-proofing support for new software through a "shared-earth model" approach. Shared-earth modeling is the process of integrating static and dynamic data from two or more disciplines to construct a single model and visualize all relevant data in one multidisciplinary environment, via an elegant user interface.
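As a toy illustration of that idea (not Schlumberger's implementation), the sketch below keeps static and dynamic data from several disciplines in one container keyed to common measured depths; every name in it is hypothetical.

```python
# A toy sketch of the shared-earth idea: one container holding static and
# dynamic data from several disciplines against common measured depths.
# This is an illustration, not Schlumberger's implementation; all names
# here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SharedEarthModel:
    well: str
    layers: dict = field(default_factory=dict)  # discipline -> {md: value}

    def add(self, discipline: str, data: dict) -> None:
        self.layers[discipline] = data

    def at_depth(self, md: float) -> dict:
        """Everything each discipline knows at one measured depth."""
        return {d: vals.get(md) for d, vals in self.layers.items()}

model = SharedEarthModel("well-A")
model.add("geology", {1500.0: "shale", 1530.0: "sandstone"})  # static data
model.add("drilling", {1500.0: {"rop_m_per_hr": 18.2}})       # dynamic data
print(model.at_depth(1500.0))
# {'geology': 'shale', 'drilling': {'rop_m_per_hr': 18.2}}
```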
The Petrel system utilizes a technology called WITSML, short for Wellsite Information Transfer Standard Markup Language. It is an open, XML-based data-exchange standard for the energy industry, developed by Energistics. WITSML gives energy companies, service companies, drilling contractors, application vendors, and regulators a common standard for data transfer, and it opens up the marketplace for third-party software developers to build applications that provide next-generation interpretation.
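Because WITSML is plain XML, any language with an XML parser can read it. Below is a minimal sketch in Python; the element names follow the WITSML trajectory object, but the sample is a simplified illustration, not a schema-valid file.

```python
# Reading trajectory stations from a WITSML-style document with Python's
# standard library. Element names follow the WITSML trajectory object,
# but this sample is a simplified illustration, not a schema-valid file.
import xml.etree.ElementTree as ET

SAMPLE = """
<trajectory uidWell="W-001" uid="T-01">
  <nameWell>Example Well</nameWell>
  <trajectoryStation uid="S-1">
    <md uom="m">1500.0</md>
    <incl uom="dega">12.5</incl>
    <azi uom="dega">215.0</azi>
  </trajectoryStation>
  <trajectoryStation uid="S-2">
    <md uom="m">1530.0</md>
    <incl uom="dega">13.1</incl>
    <azi uom="dega">216.2</azi>
  </trajectoryStation>
</trajectory>
"""

root = ET.fromstring(SAMPLE)
print("Well:", root.findtext("nameWell"))
for station in root.iter("trajectoryStation"):
    md = float(station.findtext("md"))      # measured depth, metres
    incl = float(station.findtext("incl"))  # inclination, degrees
    azi = float(station.findtext("azi"))    # azimuth, degrees
    print(f"MD {md:7.1f} m  incl {incl:4.1f} deg  azi {azi:5.1f} deg")
```

A common standard means every vendor's survey, log, and report can be read this way, which is exactly the integration problem described next.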
Integration is Key
While researching the marketplace to build my company, Skynet Labs, and talking to drillers and well planners, I found that their biggest pain in the field was integrating all the well plans and reservoir data sets they were sent: one from the geologist, one from the geophysicist, one from the drilling department, and more from the electrical and fluids departments. Have I left anyone out? Oh, yes, and to make things even harder for our driller, each contributor had developed their plans in different software, so integration was impossible. This forced the drillers to fall back on "gut." Schlumberger's software with WITSML is powerful, and a game changer.
Another issue that 99% of all drilling and well-planning operations face is the planned trajectory. I have spoken to more than 100 senior drillers, presidents of drilling companies, well planners, and reservoir analysts over the last year while developing Skynet's products. When they mention the "planned trajectory," they always have a small chuckle and say, "Well, you know, it's planned, but we never drill to plan." The answers to the problems at the coalface of exploration and production companies are found by talking to the guys in the field, not always those in the office. The shout from the field is integration and simplification of the data: specifically, the integration of reservoir projections and plans with wellbore plans across the team.
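To make "drilling to plan" measurable, directional surveys are usually converted into 3D positions with the industry-standard minimum curvature method and then compared against the planned line. The sketch below shows that calculation in plain Python; the station values are hypothetical.

```python
# Minimum curvature: the industry-standard way to turn survey stations
# (measured depth, inclination, azimuth) into 3D positions. Station
# values here are hypothetical; angles in degrees, depths in metres.
from math import radians, sin, cos, acos, tan

def min_curvature_positions(stations):
    """stations: list of (md, incl_deg, azi_deg). Returns (north, east, tvd)."""
    positions = [(0.0, 0.0, 0.0)]
    for (md1, i1, a1), (md2, i2, a2) in zip(stations, stations[1:]):
        i1, a1, i2, a2 = map(radians, (i1, a1, i2, a2))
        # Dogleg angle between the two station directions (clamped for acos)
        beta = acos(min(1.0, max(-1.0,
            cos(i2 - i1) - sin(i1) * sin(i2) * (1 - cos(a2 - a1)))))
        rf = 1.0 if beta < 1e-9 else (2 / beta) * tan(beta / 2)  # ratio factor
        dmd = md2 - md1
        n, e, tvd = positions[-1]
        positions.append((
            n + dmd / 2 * (sin(i1) * cos(a1) + sin(i2) * cos(a2)) * rf,
            e + dmd / 2 * (sin(i1) * sin(a1) + sin(i2) * sin(a2)) * rf,
            tvd + dmd / 2 * (cos(i1) + cos(i2)) * rf,
        ))
    return positions

planned = [(0, 0, 0), (500, 5, 210), (1000, 12, 215)]
drilled = [(0, 0, 0), (500, 6, 208), (1000, 13, 214)]
for p, d in zip(min_curvature_positions(planned), min_curvature_positions(drilled)):
    offset = sum((pc - dc) ** 2 for pc, dc in zip(p, d)) ** 0.5
    print(f"offset from plan: {offset:6.1f} m")
```

Run station by station, the offset shows exactly where a well is walking away from plan, which is what makes accurate placement, and collision avoidance, possible.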
Steve Devereux, founder of Drillers.com, calls it like it is: "Improved wellbore trajectory and reservoir modeling achieves better, more accurate wellbore placement. Collisions are avoided, and drillers can drill faster relief wells," such as the one required after the Macondo disaster in the GOM in April 2010.
In looking at the delivery of these integrated data to their stakeholders and consumers, there are three questions:
1. Do I want to be able to see it on mobile?
2. Do I want to be able to manipulate the process from anywhere?
3. Do I feel secure about it?
The last question is by far the single most prominent blocker to the proliferation of new systems, but it is nothing new. Leaders in oil companies have some of the highest-level security concerns in the world, on par with governments.
One company looking at this issue is Secure-Nok, based in Houston. Its software is deployed in oil and gas control systems at a core level, actively hunting for any breaches in the system. Headed by Siv Hombe, a renowned white-hat hacker from Norway, the stellar team is addressing an exciting, high-value open question for the industry.
What’s Ahead?
One small startup looking to the future is Waveseis, founded by Mark Roberts, an ex-BP geophysicist. Roberts has developed an algorithm that gives greater visibility into presalt reservoirs. Another early-stage startup is Petrolance. Its team is focused on using cloud processing and distributed computing to give geoscientists stronger visualization when they want to look at reservoir data from the cloud, an ability currently limited by poor pre- and post-processing visual programs.
Yaroslav Bashenko of Petrolance said, "Most companies focus on good simulators, but the problem is that their products have weak pre- and post-processor power. Petrolance aims to make it all in one: one cloud package providing the higher-spec processing and visualization."
The oil and gas industry is admittedly slow in adopting digital technologies, and for reasons I respect; security and compromised data systems are the main concerns. But when you match a paradigm shift in the wider software world with a dwindling workforce in such a high-value industry, it is difficult to see the profit in maintaining such a slow change in attitude.
In closing, there are still some questions left open. Do we need to kick the horse in the belly and catch up with where it’s at? Or should we stay trotting? Is big data useful or are we drowning in big bytes? Whatever the interpretation may be, I know one thing: I don’t have a clue how I will get any typing done on a computer the size of a silver dollar!