Onshore/Offshore Facilities

People and Processes Define the Facility of the Future

Developing the facility of the future is a technological and infrastructural challenge, but the people behind the technology play just as critical a role in actualizing complex, innovative asset designs. How will future facilities affect the way companies operate?

Graphic with binary code overlaying a technical drawing
Getty Images

Designing the facility of the future is a significant technical challenge that incorporates several complex technological developments, but the keys to actualizing that design lie in the people behind the technology. It may require organizations to change the way they operate, how they train their staff, and how they look at pilot projects, among a host of other considerations.

These considerations were examined during a panel discussion on the facility of the future held during the 2019 OilComm Conference. The panel touched on digital twins, the virtual representations of physical facilities that are the talk of the industry. Hani Elshahawi, digitization lead for deepwater technology at Shell, said that, while the term may be overused these days, understanding the concept of a digital twin is important for companies looking to adapt facility design for a changing energy landscape. To that end, companies need to agree on what, exactly, a digital twin is.

Elshahawi said a digital twin should have three components: a model of the company’s understanding of how the asset being replicated works, data fed in such a way that the model becomes temporally sensitive and predictive, and a visualization element. A true digital twin, he said, is a system of systems, so its components must be reducible on their own and combinable into higher-level representations.

“It needs to have those three elements,” Elshahawi said. “If it just has visualization as a 3D model, it’s not a digital twin, and if it’s just a predictive time-series fit, that’s not a digital twin either. It’s just a model. If it’s just a theoretical model that’s not fed with any data, it’s just a point in time. It really needs to have all three elements to be a true digital twin that is capable of being evergreen and continuously meaningful.”
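The three-component test Elshahawi describes can be sketched in code. This is an illustrative sketch only; the class and field names below are hypothetical and do not come from any vendor's digital-twin framework.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class DigitalTwin:
    """A digital twin per the three-component definition:
    a model of the asset, live data that keeps the model predictive,
    and a visualization element."""
    model: Optional[Callable[[dict], dict]] = None          # physics/ML model of the asset
    data_feed: Optional[Callable[[], dict]] = None          # live sensor/operational data
    visualization: Optional[Callable[[dict], None]] = None  # e.g., a 3D-viewer hook
    subtwins: list = field(default_factory=list)            # "system of systems": component twins

    def is_true_twin(self) -> bool:
        # A 3D visualization alone, or a model with no data feed,
        # does not qualify -- all three elements must be present,
        # recursively, for every component twin.
        return all([self.model, self.data_feed, self.visualization]) and all(
            t.is_true_twin() for t in self.subtwins
        )

    def predict(self) -> dict:
        # Feed current data into the model so the twin stays "evergreen."
        return self.model(self.data_feed())
```

A twin missing any one element, such as a pure 3D model with no data feed, fails `is_true_twin()`, which mirrors the distinction Elshahawi draws between a true digital twin and a mere model.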

When talking about digital twins, the conversation often centers on the streaming of sensor data from a key part of an asset, or the digitization of analog data. However, the panelists spoke about the need to consider the complete spatial context of the asset as part of the construction of its virtual representation. How components fit within the overall context of the industrial asset is almost as important as the spatial variations of the individual components themselves.

Skip Davis, a principal at Siemens, said that the spatial component is important for developing operational digital twins.

“If I have to dispatch a technician, I want to know if that pump is under a girder that’s only 6 ft high or something like that. The spatial is a key component in my mind,” he said.

People Are the Top Priority in Achieving Success

People can play as big a role in developing the facility of the future as artificial intelligence algorithms, and the panelists spoke about how holding people accountable for growing the skills needed to advance facility design can be crucial. Davis said that, within Siemens, the human resources department has been a primary driver in this regard. Getting people to work in concert with the technology means orchestrating processes from beginning to end, sometimes through business process management software.

“Businesses are organized vertically, but they’re run horizontally. We get an order, we put cash in the bank. That’s how we run a business. So, it’s great that you can have an analytic that tells you an asset’s going to fail in 2 days, and I may send a text message to someone saying that asset’s going to fail. That person’s on vacation. That text message doesn’t get acted on. The thing still fails in 2 days. You’ve got to use some software where, basically, the idea is to orchestrate the work from beginning to end,” Davis said.
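The failure mode Davis describes, an alert sent to one person who happens to be on vacation, is what end-to-end orchestration is meant to prevent. A minimal sketch of that idea follows; the function and field names are hypothetical and do not represent any specific BPM product's API.

```python
def dispatch_alert(asset_id: str, message: str, roster: list, availability: dict) -> dict:
    """Walk an on-call roster and assign the work order to the first
    available person, so one vacation does not stall the alert.

    roster: ordered list of names to try.
    availability: name -> True if that person can act on the alert now.
    """
    for person in roster:
        if availability.get(person, False):
            return {
                "asset": asset_id,
                "assignee": person,
                "status": "assigned",
                "note": message,
            }
    # Nobody available: escalate explicitly rather than letting the
    # predicted failure happen silently in 2 days.
    return {"asset": asset_id, "assignee": None, "status": "escalated", "note": message}
```

The point is not the routing logic itself but that the work item has a tracked status from creation to completion, instead of ending at an unread text message.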

Elshahawi said that organizations derive value by changing the business processes that lead to decisions, and that it is critical for them to figure out how people work within those processes. He cited value stream mapping, a lean-management method for analyzing and improving the steps needed to deliver a product, as an effective tool to these ends. The design thinking process, a nonlinear, iterative solution-based design methodology, could also be useful.

“You have to understand the process, build a user story, and figure out what do I need to do to enable the users so that they can reimagine this design process and deliver a better outcome with the better data that they now have,” he said. “Then you can actually deliver value at the end of all this. We’re starting to focus more on that and less on just building data banks, which is how everybody got started.”

Digital Pilot Studies: Don't Chase Ideas for the Fun of It

The panelists also discussed the role pilot projects play in advancing facility design. Brent Gage, a cybersecurity specialist at SecurityGate, said that small, paid pilots that are both high risk and high return are the best way to test the feasibility of design ideas. However, pilots are not always easy to get off the ground, and it is not always easy to identify exactly what value they may bring to a company.

Gage warned that pilots seeking to break down data siloes may face opposition, and that vendors may resist ideas that force them to change their approach. Elshahawi said useful pilots require a certain amount of organizational agility, particularly within larger organizations.

“In the process of identifying pilots and getting them kicked off, it is very useful to change the paradigm a little bit from organizations as these inflexible machines of hierarchical layers, to these pools of squads, of teams. Think of a sports team. It has all of the capabilities within it, but it doesn’t have the defensive team, the offensive team, and the midfield. It doesn’t work that way. They need to be self-contained, empowered, and somewhat self-governing,” Elshahawi said, noting that Shell has begun successfully employing this approach in piloting several digital projects.

Elshahawi described an agile organization as one where different elements of the business are embedded within one team, and where the people within those separate elements are empowered to make decisions. Rob Nolan, chief information security officer at Noble Energy, said building familiarity between the IT teams and the people in the oil field is vital.

“We took a group of IT people and made them specifically address some field-related things, and understand what it means to pull hydrocarbons out of the ground, ship them around, and sell them,” Nolan said. “If they don’t understand that process, there’s no tie-in, there’s no connectivity to ownership around what we do from a business perspective. That breaks down a considerable amount of siloes because now they know exactly who they’re dealing with, why that data is valuable, and why that boundary—wherever it may exist—doesn’t really need to be there.”

Elshahawi said pilots need to be developed with the end in mind, with the project serving as part of a bigger, longer-range organizational objective. Davis said this means thinking about scale: whether an idea can apply across the organization in a broader sense.

“Is this something I can scale across all my assets, across basins? Doing a science experiment and calling it a proof of concept just for the fun of it, because someone else had a bright, shiny object and a couple of people got excited about it—I’ve seen so many of those in my career. A few of them have scaled. A lot of them have just been done and they don’t go anywhere, so that’s what you’ve got to keep in mind,” he said.