Virtual and Augmented Reality are certainly maturing, finding their way into business after a gentle transition into our homes. About a year ago, I had a go at a VR-enabled tool used to train engineers to service an offshore turbine on a wind farm (that’s me in the picture above).
The physical constraints were easily dealt with – I found myself at the top of the turbine on a small, metal-floored deck, which included a desk holding a number of tools used to work with the equipment. The turbine itself stood in the sea, with a coastline perhaps a mile or two away. When I was ready to get started, the software prompted me on which tool to pick up and highlighted where it should go, presumably in the correct order to service the turbine. All was going well: a procedural training exercise in which you could only advance to the next step once the previous one was complete. That was until I used the paddles to pick up a wrench and threw it off the deck.
I moved carefully to the edge of the deck to watch the wrench descend into the waves below. Then I noticed the silence, broken by some gentle laughter from my colleagues in the room outside my earphones, as the VR engineers reset the training and carefully lifted the headset away.
No harm done, of course; we could just reset and go again. But the software didn’t respond at all to the loss of a tool from the desk. In some respects, the same would apply in real-life training – the wrench is gone and nothing happens. But procedurally, the software hadn’t been designed to account for someone interrupting the training with what we might call an “edge” use case: one that occurs so rarely that, understandably, it receives little design attention.
Let’s continue the real-life parallel for a moment. If I had truly been on a platform and thrown my wrench into the sea, I’d have been without a key tool for the task. A brief health and safety check or risk assessment at that moment would render the task impossible to complete, so my assignment would end. The VR software couldn’t handle this; we remained at an impasse, arguably exposing the infancy of these products.
Maybe the focus has been too much on the headset and the 3D technology needed to enable it. It is, after all, a very immersive experience through headset and headphones. They’re probably not easy to develop, and with potential investors blown away by that first moment the headset goes on, there’s little time left to consider how the software should behave. This is where video games come in: they’ve been solving this problem for years, particularly in open world environments. You build a fixed world for the player to operate in with a level of autonomy, but with objects and interactions that can manage themselves. Like thrown wrenches.
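To make that concrete, here is a minimal sketch of how a game-style event handler might cope with a lost tool rather than deadlocking. Every name here (`Tool`, `TrainingScenario`, `on_tool_left_play_area`) is hypothetical – it’s an illustration of the pattern, not anyone’s actual training software.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ScenarioState(Enum):
    IN_PROGRESS = auto()
    COMPLETE = auto()
    ABORTED = auto()


@dataclass
class Tool:
    name: str
    required: bool = True


@dataclass
class TrainingScenario:
    tools: dict
    state: ScenarioState = ScenarioState.IN_PROGRESS

    def on_tool_left_play_area(self, tool: Tool) -> None:
        # Event handler: fired when the physics engine reports a tool
        # has left the playable area (e.g. thrown off the deck).
        self.tools.pop(tool.name, None)
        if tool.required:
            # Mirror the real-life risk assessment: no wrench, no task.
            self.state = ScenarioState.ABORTED
            print(f"Required tool '{tool.name}' lost overboard - "
                  "task cannot be completed, scenario ends.")


# The wrench goes over the side, and the scenario responds instead of
# silently stalling.
scenario = TrainingScenario(tools={"wrench": Tool("wrench")})
scenario.on_tool_left_play_area(scenario.tools["wrench"])
assert scenario.state is ScenarioState.ABORTED
```

The point is less the code than the design habit: game developers assume players will do the unscripted thing, so the world raises events and the scenario reacts, whatever the player throws.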

More recently, I attended the launch of the Immersive Training Institute (ITI), an initiative delivered jointly by Central Bedfordshire College and VR specialist A.I.Solve. The institute is going all-in on mixed reality, betting that it will be the next revolution in immersive training. A.I.Solve, however, is developing VR solutions with artificial intelligence. This opens the door to scenario-based training, where your actions can determine a number of different outcomes. Using a technique known as decision tree learning, a computer can build a predictive model from a user’s behaviour and use it to predict an outcome.
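As a rough illustration of what decision tree learning looks like in practice, here is a tiny sketch using scikit-learn. The features and outcomes are entirely invented for illustration; a real system would log far richer telemetry from the headset and controllers.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per training session:
# [steps_out_of_order, tools_dropped, minutes_over_time_budget]
observations = [
    [0, 0, 0],
    [1, 0, 2],
    [3, 1, 10],
    [0, 1, 1],
    [2, 2, 15],
]
# Outcome of each session: did the trainee complete the service safely?
outcomes = ["pass", "pass", "fail", "pass", "fail"]

# Fit a shallow decision tree to the logged behaviour.
model = DecisionTreeClassifier(max_depth=3).fit(observations, outcomes)

# Predict the likely outcome of a new session from behaviour so far,
# e.g. to branch the scenario or trigger extra guidance.
print(model.predict([[2, 1, 8]]))
```

The tree itself is just a cascade of if/else splits learned from the data, which is what makes it a natural fit for branching, scenario-based training.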
This is all computer games have ever done. While not true AI, your interactions with a game determine outcomes. I always loved the way the later versions of Grand Theft Auto did this: within its open world environments, you could complete the same mission repeatedly but in completely different ways – particularly in the online mode, with the random addition of human interaction. And in the same way that servicing a wind turbine is probably never the same twice, a gaming engine could inject some much-needed variance, challenge and realism into the virtual reality experience. Then it could be ready for enterprise.
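Injecting that variance can be as simple as randomising the conditions of each run. A minimal sketch follows; the parameter names and ranges are invented, purely to show the shape of the idea.

```python
import random


def generate_scenario(seed=None):
    # Randomise the conditions for one training run.
    rng = random.Random(seed)
    return {
        "wind_speed_ms": rng.uniform(4, 22),        # affects deck sway
        "visibility_km": rng.choice([0.5, 2, 10]),  # fog rolls in sometimes
        "fault": rng.choice(["gearbox", "pitch_motor", "sensor"]),
        "tool_misplaced": rng.random() < 0.2,       # a colleague borrowed it
    }


# Every session differs, just as no two real turbine visits are alike.
for i in range(3):
    print(generate_scenario(seed=i))
```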