VR Locomotion: How to Move in a VR Environment
January 12, 2021
With the strong launch of the Oculus Quest 2 (Facebook/Oculus has recorded a record number of monthly active users over the last 7 weeks), adoption of VR devices is growing rapidly. More than 100 VR games are already making over $1 million each, and enterprise VR projects are progressing at breakneck speed. VR is here, and the number of apps, titles, and games is only going to rise.
The biggest driver of this growth has been affordable pricing along with hardware upgrades, including better tracking systems, resolution, and comfort, and even new technologies such as hand tracking. But VR applications themselves have also improved, with better standards for interactions, locomotion, and UX.
In this article we focus on locomotion, the art of movement in a virtual environment. You are going to learn:
- What locomotion is
- What the main locomotion methods are
- How the different locomotion methods work
- How to set up locomotion in Unity 3D
What is VR Locomotion
By definition, locomotion is the ability to move from one place to another in physical space. The term derives from the Latin locō (place) and mōtiō (movement).
Virtual reality locomotion is the set of techniques that let the user (in this case you, in first person) or their avatar move through an entire virtual world while using only a small real-world space. Locomotion is one of the pillars of a great VR experience.
Why is VR Locomotion Important
Since the resurgence of AR/VR in 2015, when hardware and technology finally allowed developers to experiment with building VR worlds, achieving immersion has been (and still is) the biggest goal.
VR replaces your reality with digitally constructed worlds. The problem is that even while the digital environment is tricking your brain in one way, your other senses are being disrupted by the "glitches in the matrix".
Your inner ear governs your sense of presence in a room, your balance, and your body's direction and stability during movement. To avoid sickness and nausea, movement in VR should mimic movement in physical space.
That is how we achieve realism, right?
However, how do you "walk" in VR if you're in a seated experience? Even sophisticated omnidirectional treadmills (e.g. the Virtuix Omni platform) can only approximate real-world walking.
In the 2019 University of California user study "A User Experience Study of Locomotion Design in Virtual Reality Between Adult and Minor Users", the researchers found that locomotion relying exclusively on physical body movement mirroring the real world was the least preferred by participants, both adults and minors.
Most artificial movement in VR is driven by the hand-held controllers bundled with headsets such as the Quest 2, Oculus Rift, HTC Vive, and others.
According to the Oculus for Developers guidelines, you should take several elements into account when developing and designing experiences that use artificial locomotion:
- Acceleration - fast, unexpected movements lead to discomfort or even nausea. Pay special attention to torso and head movement.
- Speed - to stay in the golden range of human perception, Oculus recommends natural human locomotion speeds: between 1.4 m/s and 3 m/s.
- Direction and Positional Tracking - with a motion-controller setup, the user defines the range of their play space (the guardian boundary). Designers and developers must understand that backward and lateral movements are rare.
- User Control - let users control the motion as much as possible. This is extremely important when it comes to gameplay and enjoyment.
- Visual Quality - the quality of the visuals is crucial as well. There is a thin line between quality and performance, and rapid movements such as head bobbing may break the immersive experience.
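As a tiny illustration of the speed guideline above, a locomotion system could clamp whatever speed it is asked for into that comfort range. This is a hypothetical helper, not part of any Oculus or Unity SDK:

```csharp
using UnityEngine;

// Illustrative helper for the Oculus comfort guideline: keep artificial
// locomotion between 1.4 m/s (walking pace) and 3 m/s.
public static class ComfortSpeed
{
    public const float MinSpeed = 1.4f; // m/s, lower comfort bound
    public const float MaxSpeed = 3.0f; // m/s, upper comfort bound

    // Returns the requested speed clamped into the comfort range.
    public static float Clamp(float requestedSpeed)
    {
        return Mathf.Clamp(requestedSpeed, MinSpeed, MaxSpeed);
    }
}
```

A move provider would call `ComfortSpeed.Clamp` on its configured speed before applying movement each frame.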
VR Locomotion Techniques
There are several methods for transporting the user through the environment, each with its own set of benefits and challenges.
Locomotion techniques have been tested and refined for some time, aiming for seamless, user-friendly navigation in a variety of virtual environments, and we can now classify them as follows:
Room-Scale
This approach uses only the player's physical movement in the real-world space, so applications must be designed around this constraint. Beat Saber is a good example: your only movements are for dancing and dodging those pesky walls.
Motion-Based
This technique uses extra sensors, which for now aren't included with the major headsets, to detect some kind of physical movement and translate it into VR movement.
Controller-Based
For generations, gamers have controlled their characters using a trackpad or thumbstick, and using the same method in VR applications seems only logical. However, the perception of moving in the virtual world while remaining stationary in the real world may cause motion sickness, or more specifically, something called "vestibular mismatch".
So, to make this kind of locomotion comfortable, extra techniques had to be added, such as reducing the field of view while moving in VR. This limits the amount of detail the player notices and, in turn, reduces the triggers for vestibular mismatch.
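To sketch the field-of-view reduction idea, here is a minimal, illustrative Unity script that scales a vignette's strength with movement speed. The full-screen vignette material and its `_Intensity` property are assumptions for the example (e.g. a shader on a quad parented to the camera), not a real toolkit API:

```csharp
using UnityEngine;

// Comfort vignetting sketch: darken the edges of the view while the player
// moves artificially. Attach to the XR Rig root.
public class ComfortVignette : MonoBehaviour
{
    public Material vignetteMaterial;  // full-screen vignette material (assumption)
    public float maxIntensity = 0.7f;  // vignette strength at top speed
    public float maxSpeed = 3f;        // speed (m/s) at which the effect peaks

    Vector3 lastPosition;

    void LateUpdate()
    {
        // Estimate artificial movement speed from frame-to-frame displacement.
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        // Scale vignette intensity with speed; zero when standing still.
        float t = Mathf.Clamp01(speed / maxSpeed);
        vignetteMaterial.SetFloat("_Intensity", t * maxIntensity);
    }
}
```

A smoothing step (e.g. `Mathf.MoveTowards` on the intensity) would avoid the vignette popping in and out on short movements.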
Teleportation
This is the only non-continuous locomotion technique: when the player teleports, they are instantaneously repositioned at the target location, with no in-betweens. The player selects the target location by aiming with the controller, and sometimes they can also select a facing direction for the teleport.
Like the controller-based technique, a raw implementation of this locomotion has some drawbacks: suddenly changing the player's position, with no context at all, will leave them completely disoriented. For that reason, there are a few variations of this technique.
Blink
In this variation, upon selecting the target position, the player's view fades out, they are relocated in the virtual world, and finally their vision fades back in. It is almost as if the player blinked and voilà! They are in a new location.
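The blink sequence can be sketched as a simple Unity coroutine. The `rig` and `fadeOverlay` references here are assumptions for the example (a full-screen black Image with a CanvasGroup on a camera-space canvas), not part of the XR Interaction Toolkit:

```csharp
using System.Collections;
using UnityEngine;

// Blink teleport sketch: fade out, reposition the rig while the screen is
// black, then fade back in.
public class BlinkTeleport : MonoBehaviour
{
    public Transform rig;            // root of the XR Rig (assumption)
    public CanvasGroup fadeOverlay;  // full-screen black overlay (assumption)
    public float fadeDuration = 0.2f;

    public void TeleportTo(Vector3 target)
    {
        StartCoroutine(Blink(target));
    }

    IEnumerator Blink(Vector3 target)
    {
        yield return Fade(0f, 1f); // vision fades out
        rig.position = target;     // instant reposition, unseen by the player
        yield return Fade(1f, 0f); // vision fades back in
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            fadeOverlay.alpha = Mathf.Lerp(from, to, t / fadeDuration);
            yield return null;
        }
        fadeOverlay.alpha = to;
    }
}
```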
Blink locomotion is by far the most used, but it can hinder the player's immersion, and for that there is the dash variation.

Dash
In this version, instead of blinking into a different location, the player is rushed there at super speed.
For this technique, we must be really mindful of the vestibular mismatch problem, as sudden movement like this could easily trigger it. Raw Data deals with it by adding dash animation effects that serve as a visual anchor for the player, much like a car cockpit would.
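A dash could be sketched as a short, constant-velocity move toward the target; keeping the speed constant (no acceleration) is what makes it more tolerable than a free-form glide. The `rig` reference and `DashTeleport` class are illustrative, not toolkit API:

```csharp
using System.Collections;
using UnityEngine;

// Dash teleport sketch: move the rig to the target at a fixed, high speed
// instead of an instant jump.
public class DashTeleport : MonoBehaviour
{
    public Transform rig;          // root of the XR Rig (assumption)
    public float dashSpeed = 20f;  // m/s; tune for comfort

    public void DashTo(Vector3 target)
    {
        StartCoroutine(Dash(target));
    }

    IEnumerator Dash(Vector3 target)
    {
        while (Vector3.Distance(rig.position, target) > 0.01f)
        {
            // MoveTowards never overshoots, so the dash ends exactly on target.
            rig.position = Vector3.MoveTowards(
                rig.position, target, dashSpeed * Time.deltaTime);
            yield return null; // one constant-speed step per frame
        }
    }
}
```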
Unity Implementation Example
This tutorial will be focused on Oculus, but here are some links to get Valve headsets working with the XR Interaction Toolkit:
1) Enable Developer Mode on your Oculus Headset
Follow the directions provided by Oculus.
2) Unity Android Build Support
Make sure that Unity is installed with Android Build Support.
3) Create a new Unity Project
From the Unity Hub, create a new 3D project.
With the project created, switch to the Android platform through File > Build Settings.
4) Install Unity Package Manager Packages
Install the following packages from the Package Manager.
- XR Plugin Management
- Oculus XR Plugin
After installing these packages, open the Project Settings window from Edit > Project Settings and enable the Oculus plug-in provider under XR Plug-in Management.
Install XR Interaction Toolkit
This is still a preview package, so to install it from the Package Manager you must enable the "Show preview packages" option.
During the installation, Unity will prompt you to switch to the new Input System; accept it.
After installing this package, import the Default Input Actions from the XR Interaction Toolkit package.
XR Rig Setup
An XR Rig is an object that houses the VR device's camera and controllers. To add it to the scene, simply select Room-scale XR Rig (Action-based) under the GameObject > XR menu.
This simple setup already gives us the Room-Scale locomotion explained earlier.
Locomotion System Setup
The XR Interaction Toolkit comes with solutions for Controller and Teleportation based locomotion out of the box.
Add the following components to the XR Rig gameObject:
- Input Action Manager
- Character Controller
- Character Controller Driver
- Locomotion System
- Teleportation Provider
Under the new Input Action Manager component, add a new Action Asset and set it to the XRI Default Input Actions.
Then, from the folder Assets > Samples > XR Interaction Toolkit > 0.10.0-preview.7 > Default Input Actions, drag the XRI Default Continuous Move and XRI Default Snap Turn assets onto the XR Rig gameObject.
XR Controllers Setup
The controller GameObjects are children of the XR Rig and can be found at XR Rig > Camera Offset > LeftHand / RightHand Controller in the Hierarchy window. We need to set up the input mappings for the XR Controller component on them; luckily for us, the input samples we imported earlier cover this too.
Delete the current XR Controller (Action-based) component on each of the hand controller gameObjects, then drag the XRI Default Left/Right Controller from the folder Assets > Samples > XR Interaction Toolkit > 0.10.0-preview.7 > Default Input Actions onto its respective controller gameObject.
With our XR Rig done, we can now focus on creating the world the player will explore. With regard to the locomotion features, there are 2 things you need to consider:
Teleportation Area / Anchor
The Toolkit includes 2 components for defining where the player can teleport. The Teleportation Area lets the player choose any point on an object to teleport to, while the Teleportation Anchor always teleports the player to the same position and orientation.
Just add one of these components to any gameObject with a collider and you are good to go. It is also good practice to add a feedback reticle showing where the teleport action will move you. To do that, create a prefab with the desired visuals for the reticle and assign it to the Custom Reticle field on the Teleportation Area / Anchor component.
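The same setup can also be done from code instead of the Inspector. This sketch assumes the toolkit's `TeleportationArea` component and its `customReticle` field behave as in the preview version this tutorial targets; check your package version before relying on it:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Turns the gameObject this is attached to into a teleportable surface,
// with an optional custom reticle prefab for visual feedback.
public class TeleportFloorSetup : MonoBehaviour
{
    public GameObject reticlePrefab; // feedback visuals at the target point

    void Awake()
    {
        // The surface needs a collider for the teleport ray to hit.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        var area = gameObject.AddComponent<TeleportationArea>();
        area.customReticle = reticlePrefab;
    }
}
```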
Walls / Objects colliders
The Continuous Move Provider handles the controller locomotion aspect, but with it alone the player would be able to walk through solid objects and walls. That is where the Character Controller component comes into play: just add colliders to the objects in the scene, and this system will handle collisions and locomotion seamlessly.
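If the scene has many objects, adding colliders one by one gets tedious. A small, illustrative setup script (not part of the toolkit) can add a MeshCollider to every rendered object that is missing one:

```csharp
using UnityEngine;

// Scene setup sketch: give every rendered object a collider so the
// Character Controller cannot walk through it. Run once at startup.
public class SceneColliderSetup : MonoBehaviour
{
    void Awake()
    {
        foreach (var renderer in FindObjectsOfType<MeshRenderer>())
        {
            // Skip objects that already have a collider of any kind.
            if (renderer.GetComponent<Collider>() == null)
                renderer.gameObject.AddComponent<MeshCollider>();
        }
    }
}
```

For anything the player should pass through (foliage, particles), either skip it here or mark its collider as a trigger.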
Here you will find a sample scene with the full locomotion setup done in this tutorial, on a simple house environment.
Looking to learn more?
Check our free workshops for video tutorials.
If you're serious about becoming a professional AR/VR developer/designer, download our 10-week XR Development with Unity syllabus to see exactly what you'll learn.