7 October 2020
Virtual Reality (VR) is now considered a legitimate and widely accepted technology; it is becoming increasingly ambitious and immersive, with greater utilisation outside of the gaming industry. Healthcare, tourism, manufacturing, education and defence are just some of the industries using VR for a range of applications, including training, the creation of new experiences and detailed simulations.
Here at BMT, we strive to take the latest innovations from the VR consumer gaming sector — both hardware and design principles — and apply them to serious VR applications. Our team have been developing VR applications for some time, and our project experience includes showcase demos, training applications and research projects on using VR for design. Whether you are already using VR, or are completely new to the medium, this article is your guide on how to take VR principles from gaming to training.
Owlchemy Labs, a video game developer based in the US, has developed some interesting VR games built around giving the player a narrative as well as tasks to complete. At the 2017 Game Developers Conference, they presented “Rick and Morty: Virtual Rick-ality Postmortem: VR Lessons *burrrp* Learned” (Owlchemy Labs, 2017). They tackled problems such as simulating eye contact with non-player characters (NPCs) to increase immersion, picking up objects using dynamic snapping techniques, dynamically changing the layout of the environment depending on physical room size, and novel ways of handling failure states (i.e. death/game over).
One of the problems that Owlchemy Labs explored was how to deal with picking up objects off the floor. Dropping an object on the floor is a frustrating problem in the world of VR; when the user bends down to pick it up with the controller, there is a chance that this may not work, as the real-world floor does not always correspond with the virtual floor. This can be caused by a calibration error or by sensors being accidentally bumped. Owlchemy Labs’ solution was to make the object hover off the ground when the user moves their hand/controller closer to it, making it easier to ‘grab’. Small considerations like this make a big difference.
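As a rough illustration of this style of grab assist, the sketch below (our own Python example, not Owlchemy Labs’ implementation; the function name, radius and hover height are all assumptions) lifts an object off the floor as the controller approaches it:

```python
import math

def hover_offset(controller_pos, object_pos, grab_radius=0.3, hover_height=0.15):
    """Return a new object position, lifted off the floor when the
    controller comes within grab_radius (all distances in metres)."""
    dist = math.dist(controller_pos, object_pos)
    if dist > grab_radius:
        return object_pos  # too far away: leave the object on the floor
    # Scale the lift so the object rises smoothly as the hand approaches
    lift = hover_height * (1.0 - dist / grab_radius)
    x, y, z = object_pos
    return (x, y + lift, z)
```

Run every frame, this makes a dropped object float gently upwards as the hand nears it, so the user never has to fight a miscalibrated floor height.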
Figure 1: Slide from GDC 2017 presentation by Owlchemy Labs titled “Rick and Morty: Virtual Rick-ality Postmortem: VR Lessons *burrrp* Learned” (Owlchemy Labs, 2017)
Some may argue that mechanics such as hovering objects above the floor to make them easier to interact with detract from realism. For the team at BMT, the purpose of our VR training is to train and reinforce knowledge of processes. We look at game development techniques to provide some degree of realism in relation to the task, but not so much as to make the process tedious. We always like to include as many interactive elements as possible, even if they add nothing to the actual task. We also like adding ‘wrong answers’ into our scenarios to encourage experimentation. For example, in a health and safety training VR application, the trainee may be required to extinguish an electrical fire. They are provided with different types of extinguishers which they can use freely. Using the incorrect extinguisher could cause the fire to remain unchanged or become a greater threat. This method is preferable to simply indicating to the user that they have picked up the wrong extinguisher.
Imagine simulating a console one-to-one, with many small buttons clustered in a small area. As the user is not physically pressing buttons with a finger, it can be difficult to move the controller precisely to the intended button. The user ends up spending more time grappling with the mechanics and nuances of a VR controller than focussing on the task. There are visual tricks which can aid the user, including highlighting objects, laser pointers, or enlarging interaction points when the hand is in proximity.
The game “Star Trek Bridge Crew VR” (Ubisoft, 2018) requires the player to operate functions on a spaceship by interacting with dynamic interfaces. A mixture of simulated hand gestures and auto-snapping functions makes precise, minute interactions feel natural and effortless. The popularity of such a game demonstrates that it is possible to accommodate fine interactions with mainstream out-of-the-box controllers.
In VR training applications, it is possible to use these controllers for precise interface interactions by implementing soft-snapping techniques: the virtual representation of the user’s hand ‘snaps’ to a location just above the nearest interactive virtual object. This is implemented in such a way that it does not fully break the expected one-to-one movement, thus making it easier to interact with precision.
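One way to sketch soft-snapping is as a blend between the tracked hand position and the nearest interactive target. The Python below is illustrative only (the function name, snap radius and blend strength are our assumptions, not any engine’s API):

```python
import math

def soft_snap(hand_pos, targets, snap_radius=0.05, strength=0.7):
    """Blend the rendered hand position toward the nearest interactive
    target within snap_radius, preserving some one-to-one movement."""
    nearest = min(targets, key=lambda t: math.dist(hand_pos, t), default=None)
    if nearest is None or math.dist(hand_pos, nearest) > snap_radius:
        return hand_pos  # nothing nearby: render the hand exactly where tracked
    # Linear blend: strength=1.0 would snap fully; lower values keep the
    # user's real motion visible, which avoids a jarring 'teleporting hand'
    return tuple(h + strength * (t - h) for h, t in zip(hand_pos, nearest))
```

Because the blend is partial, the user still feels in control of their hand while small buttons become far easier to hit.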
There is inspiration to be found in modern conventional games whose features could function perfectly in a VR context. “Apex Legends” (EA Games, 2020) is a game in which squads of two to three players compete against each other in a ‘last-team-standing’ style of battle. The game was developed under the assumption that not all players would have voice communication, so the developers created a simple mechanic which allows players to communicate important information with one button via “Pings”. According to the Apex Legends wiki, Pings are “callouts for communication with teammates without the need for voice communication”. This minimalist approach to inter-player communication could be hugely beneficial in virtual training applications, particularly if a trainer is trying to communicate with one or more of their students, or if VR-based designers are communicating with one another. The Ping system shows how effective a non-verbal communication system can be and could be incorporated into one of our multi-user VR training applications.
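The appeal of a Ping-style mechanic is that one button press packages everything a teammate needs into a single message. A minimal sketch of such a message, in Python (the field names and kinds are our own illustrative assumptions, not Apex Legends’ actual data model), might look like:

```python
from dataclasses import dataclass
import time

@dataclass
class Ping:
    """A minimal non-verbal callout, loosely modelled on Ping systems.
    All field names here are illustrative assumptions."""
    sender: str          # who made the callout (e.g. the trainer)
    position: tuple      # world coordinates of the pinged location
    kind: str = "look"   # e.g. "look", "danger", "objective"
    timestamp: float = 0.0

def make_ping(sender, position, kind="look"):
    # One-button action: bundle sender, location, intent and time together
    return Ping(sender, position, kind, time.time())
```

In a multi-user training application, such a message could be broadcast to all participants and rendered as a waypoint marker in each user’s view.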
Locomotion, the ability to move from one place to another, is still one of the most pressing problems in VR games. There are established guidelines, such as the ‘Oculus Best Practices’ guide (Oculus, 2019), which describe how VR applications should be implemented to reduce the risk of discomfort to the user. Many of these rules relate to locomotion. Control of the camera (first-person view) should never be taken from the user - for example, through animated head-bobbing when the character is walking in the environment. Should there be a need to control the user’s movement, such as driving a car or using a thumb stick to move around the virtual environment, special measures should be taken to reduce the risk of inducing motion sickness, for example rotating the character in 30-degree increments.
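The incremental-rotation idea (often called ‘snap turning’) can be sketched in a few lines of Python. This is an illustrative example under our own assumptions about names and thresholds, not a quote from any guideline:

```python
def snap_turn(current_yaw, stick_x, increment=30.0, deadzone=0.5):
    """Rotate the player's yaw (degrees) in fixed increments when the
    thumbstick is pushed past the deadzone, instead of rotating smoothly.
    Discrete jumps avoid the smooth visual rotation that commonly
    triggers motion sickness."""
    if stick_x > deadzone:
        current_yaw += increment
    elif stick_x < -deadzone:
        current_yaw -= increment
    return current_yaw % 360.0
```

A real implementation would also debounce the input so holding the stick produces one turn per push rather than one per frame.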
Implementing a fast-paced first-person shooter in VR requires a different approach. Games like Onward (Downpour Interactive, 2019) attempt to implement military simulators in VR using traditional movement mechanics – for example, using the directional thumb stick or trackpad on the controller to move and strafe the player as in a conventional game. Games like this rely on players understanding that repeated exposure will help them to overcome feelings of sickness. They use a smoothing technique which makes the movement less jarring to the visual and vestibular systems.
The most common method for long-distance locomotion is ‘teleporting’. This is typically done by using the controller to aim a reticle onto the ground, indicating the location the user will travel to. Teleporting is implemented in different ways depending on the type of application: the movement from one position to the next can happen instantaneously, with a “blink” effect, or by having the player “dash” towards the destination point. Many fast-paced (shooter) games such as “Doom VFR” (id Software, 2020) adopt this approach as it provides simple locomotion without disorientating the player.
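The teleport reticle is usually placed by tracing a ballistic arc from the controller until it meets the ground. A simplified Python sketch (the function name, launch speed and time step are assumptions for illustration; a real engine would raycast against geometry rather than a flat ground plane):

```python
import math

def teleport_arc_landing(origin, direction, speed=8.0, gravity=9.81, step=0.02):
    """Trace a ballistic arc from the controller and return the point
    where it meets the ground plane (y = 0) - the reticle position."""
    x, y, z = origin
    dx, dy, dz = direction            # assumed to be a normalised vector
    vx, vy, vz = dx * speed, dy * speed, dz * speed
    t = 0.0
    while t < 5.0:                    # cap the trace time as a safety limit
        t += step
        px = x + vx * t
        py = y + vy * t - 0.5 * gravity * t * t   # projectile motion
        pz = z + vz * t
        if py <= 0.0:
            return (px, 0.0, pz)      # landed: this is the teleport target
    return None                       # arc never reached the ground
```

The curved arc gives the user an intuitive ‘lobbing’ gesture for choosing a destination, and the sampled points along it can be rendered as the visible arc.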
Figure 2: An example of a teleport arc in a BMT VR application (©BMT)
Games such as “Robo Recall” (Epic Games, 2020) make teleporting more accessible by allowing the player to specify not only a destination position, but also a destination orientation. This gives the player better agency when moving around the environment. We also use the teleport system in most of our VR demonstrations and applications; it is the simplest and most effective way to move around our virtual worlds. As we evolve our capability, we seek to develop more challenging applications which will force us to rethink how we approach locomotion.
Figure 3: An example of a teleport arc where the user can also specify their destination orientation as well as position (Valem, 2019)
One of the ways we are doing this is by dividing our environments up into teleportable areas, transporting the user to a set point in the VR environment. This forces the user to actively and physically walk around and use their body, thus contributing to a more immersive experience.
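A minimal sketch of this zoned approach in Python (the zone names, layout and nearest-zone selection are purely illustrative assumptions):

```python
import math

# Hypothetical zone layout: each teleportable area maps to one set arrival point
ZONES = {
    "bridge":      (0.0, 0.0, 0.0),
    "engine_room": (12.0, 0.0, -4.0),
    "deck":        (5.0, 0.0, 9.0),
}

def teleport_target(aim_point):
    """Snap an aimed ground point to the set arrival point of the nearest
    zone, rather than allowing free placement anywhere in the scene."""
    return min(ZONES.values(), key=lambda p: math.dist(aim_point, p))
```

Constraining arrivals to set points means the user must physically walk the last few steps within each area, which is exactly what drives the added immersion.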
Asymmetric VR games refer to applications where VR and non-VR players collaborate or compete against each other in the same game.
“Keep Talking and Nobody Explodes” (Steel Crate Games, 2016) features one VR player and several non-VR players. A large printable manual is provided with the game. The objective is for the VR player to defuse a virtual bomb whilst the other players, who cannot see the bomb, use the manual to guide the VR player.
“CtrlShift” (CtrlShift Games, 2015) demonstrates collaboration between an immersed VR player and a PC/Mobile-based support player. The support player has access to an overhead map, guiding the VR player in a ‘spy’ scenario through the level. This style of game is a good reference for implementing mechanics which allow a trainer to observe and instruct a trainee who is participating in a VR-based training exercise. Good communication between the players is required with the help of visual aids, such as the support player leaving waypoints and messages in the VR player’s environment.
Competitive asymmetric games, such as “Panoptic” (Team Panoptes, 2017), also provide insightful design principles which can be applied in VR training. In “Panoptic”, the VR player takes the role of a giant floating overseer looking down at the game world. Their task is to search for and eliminate the hidden ‘traitor’, who is played by a non-VR player disguised as an NPC. This design can be applied when implementing smarter, organic virtual training by having virtual characters played by actual players who are not required to be immersed with a VR headset. This helps to simplify and streamline VR training development, ensuring that VR headsets are only used in multi-user training when required by a particular role.
Figure 4: A BMT application using asymmetric design, where the instructor is monitoring and inputting into the VR trainee's environment. The trainer has set a waypoint to the next objective (©BMT)
Figure 5: The VR trainee's viewpoint in detail (see Figure 4 above) (©BMT)
BMT’s approach to VR development for training, design and R&D closely follows game design principles. We utilise these innovative techniques and technologies in our VR projects for defence and security customers. The same elements that make a VR game fun, comfortable and distinct from any other modality should be present when developing serious industrial applications. These range from approaches to gamification and quality-of-life considerations to enabling naturalistic interaction and communication with other virtual collaborators. As we look to expand our capability with better hardware, a wider range of interaction modalities and more ambitious projects, we continue to look to advancements in the gaming industry.