Societies Tackle Human Challenges of Automation
A joint webinar conducted by the Human Factors and Ergonomics Society and the Society of Petroleum Engineers addressed the role of human factors in automation in the oil and gas industry.
Following a recently signed letter of cooperation between the two groups, the Human Factors and Ergonomics Society (HFES) and the Society of Petroleum Engineers (SPE) came together to address the human side of automation. While automation is spreading in the oil and gas industry, humans are not out of the loop—not by a long shot. And they won’t be any time soon, if ever.
But humans are tricky, and the human part of automation creates some challenges.
Marcin Nazaruk, former chairperson of SPE’s Human Factors Technical Section; Shashi Talya, Halliburton’s global production manager for drilling automation; and Mica Endsley, president of SA Technologies and former chief scientist of the US Air Force, laid out the challenges in a webinar moderated by Camille Peres, an associate professor at Texas A&M University, and Julie Gilpin-McMinn, a technical fellow in human factors/ergonomics at Spirit AeroSystems.
Looking primarily at drilling—although the challenges can apply to all automation—Talya presented three key challenges he sees in implementing drilling automation.
The first is the connection between the human and the automation on site. Driller’s cabins can be “pretty crowded real estate,” Talya said. There’s not a lot of room to add more sources of information. “You can’t keep adding more and more information in that limited space. There needs to be some way to consolidate and concisely show the information that the driller is interested in or wants to know in terms of what’s happening on the rig floor.”
Remote operation appears to be an attractive solution, but that creates a challenge of its own.
“One of the drivers for drilling automation is to enable us to do remote execution,” Talya said, “to get to a point where, eventually, we would have people in remote centers that are able to autonomously, automatically access or view what’s happening on the rig floor.”
That’s all well and good, but “imagine having a similar setup in a remote center but now you’re trying to do the same work across multiple rigs,” Talya said. “So your challenge suddenly multiplies. If you maintain the same status quo in terms of type of information that’s being displayed, the type of interaction a human has with the automation system, then you’re suddenly overloading the human or the operator with a lot of information that they now have on top of what they’ve already been doing before.”
Yes, the driller is still drilling, even with automation.
“The human is still responsible for some of the manual tasks in addition to interfacing with the automation system and trying to figure out when he or she needs to step in when the automation system asks for help,” Talya said. “So, in a sense, you’re kind of in a hybrid mode where the human is not only doing their manual tasks that they’ve always been doing but they’re also looking at ‘how do I interface with this automation system and be able to recognize what’s going on and take action as needed?’ ”
Therein lies Talya’s third challenge: training. “In addition to the whole drilling process, you now need training and competency in the automation system itself to the extent that you have a pretty reasonable or good understanding of what the automation system is doing in the background while you’re trying to monitor the overall process that’s happening,” he said.
“Systems are never fully autonomous,” added Endsley. “People need to be able to interact with them and oversee their performance. Just like no man is an island, no autonomy is an island.”
Perhaps technology can solve this problem, too. “I know there’s a big trend now that, ‘Oh, AI [artificial intelligence] is going to solve all these problems,’ ” Endsley said. “The reality is it doesn’t. Even with AI, it still only does some things to a certain degree. We still need people to be part of that loop.”
The consistent need for humans to be engaged with the automation means that, counterintuitively, increasing automation increases the human factors challenges. This is what Endsley calls the automation conundrum: “The more automation is added to a system, and the more reliable and robust that automation, the less likely that human operators overseeing the automation will be aware of the critical information and able to take over manual control when needed.”
Increasing the reliability of automated systems helps, Endsley said, but that affects how operators allocate their attention and engagement. A more reliable system means operators put their attention on other aspects of the job and can miss critical information. This challenge, Endsley said, “is not something we can just engineer our way out of. We can’t just piecemeal it and snap in pieces of automation.”
The automation systems have built-in challenges, too. Having the right data at the right time is critical. “No matter how good your software is, it’ll go wrong if the data is not being correctly picked up by the sensors,” she said.
Endsley suggests a holistic approach to these problems, which she laid out in her 2017 paper “From Here to Autonomy: Lessons Learned From Human/Automation Research.”
“What we really have to do is look at the integrated whole and look at the job they’re doing where automation is a part of that with all of the information, all the systems that they’re managing, and say, ‘How do we approach this holistically so that the entire system makes sense?’
“We can do a lot to improve human performance with automation if we make it much more transparent about not only what it’s doing now but what it’s capable of doing and what it’s going to be able to do in the near future.”