How to develop motion sickness-free applications?
You may have felt sick when playing a game, and it wasn't because of its story. The reason was probably a poorly designed locomotion system that causes motion sickness. I have experienced this feeling myself several times. Apart from a few experiences with browser games made by beginners, I especially remember the nausea and malaise lasting several days caused by just a few tens of minutes of playing on my HTC Vive VR headset sometime in 2016. At the time, the feeling was not so much the fault of the VR headset itself as of the surrounding factors. Nevertheless, I ended up associating nausea with VR in general, which made me afraid to try anything else. This would not have happened if the developers of those games had followed the best practices for eliminating motion sickness described in this article.
How does motion sickness arise?
The human body perceives its position and motion in space through two independent but interconnected systems:
- Vestibular system - put simply, a sensory system providing information about the motion, position, and spatial orientation of the head. Detection is based on the behaviour of fluid in the inner ear.
- Visual perception - put simply, the signal from the eyes, from which the brain derives information about the motion, position, and spatial orientation of the head.
When the perceptions coming from these two systems differ, the body does not know exactly what is happening or how to react - this is so-called motion sickness. In virtual reality, it is very easy to reach this state: typically the user watches a dynamic image through the VR headset (visual perception ✓) while their body is at rest (vestibular system ✗).
What are locomotion systems in Virtual Reality?
VR applications can be divided into stationary and spatial. In stationary applications, no movement is allowed - the user stays in one place for the whole time the application is used. In contrast, spatial applications work with an active space that allows the user to move independently. Such movement can be provided by the following locomotion systems:
Intuitive locomotion
Virtual reality headsets with 6 degrees of freedom (6DoF) contain technology for tracking the user's real movement in the physical world and transferring it into the virtual reality application. This is a completely natural and intuitive way of moving in VR; however, on its own it can only be used in applications requiring very little space. The space limit is determined primarily by the free area available in the physical world - the standalone Oculus Quest headset can safely track movement over an area of up to 25ft x 25ft (about 58 m²), and HTC devices with additional base stations can handle an even larger area.
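To illustrate the space limit, here is a minimal engine-agnostic sketch in TypeScript that checks how close the user's tracked position is to the edge of a rectangular play area. The area dimensions and warning margin are hypothetical example values, not something reported by a specific headset.

```typescript
// A rectangular play area centred on the origin of the tracking space.
interface PlayArea {
  width: number; // metres along the X axis
  depth: number; // metres along the Z axis
}

// Distance from the user's tracked position to the nearest edge of the
// play area. A negative value means the user has left the safe area.
function distanceToBoundary(x: number, z: number, area: PlayArea): number {
  const dx = area.width / 2 - Math.abs(x);
  const dz = area.depth / 2 - Math.abs(z);
  return Math.min(dx, dz);
}

// Example: a ~25ft x 25ft (about 7.6 m x 7.6 m) area and a 0.4 m warning margin.
const area: PlayArea = { width: 7.6, depth: 7.6 };
if (distanceToBoundary(1.2, 3.5, area) < 0.4) {
  console.log("Show a boundary warning to the user");
}
```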
Teleportation
Teleportation is the ability to move from place to place within a single frame. How the new location is selected and how the transition is performed can be arbitrary, for example:
- By clicking on a specific place in the virtual reality space
- Using the user interface (UI) / a map
- Using an interactive object in the space - a door, a vehicle
There is no actual movement during a teleport - you can imagine it as a blink during which everything around you changes completely. There is no contradiction between the perceptions of the two systems, and thus no motion sickness occurs.
Teleportation is often used in combination with the intuitive locomotion described above. With teleportation, the user moves between more distant places, while intuitive movement is used for closer exploration. The active space for intuitive movement is defined by the free space in the physical world, for example a free area of 2 x 2 metres in a living room.
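One common way to implement the "blink" is to fade the view to black, move the player rig in a single step, and fade back in. Below is a minimal, engine-agnostic sketch in TypeScript; the setScreenFade and movePlayerTo callbacks are hypothetical hooks you would wire to your engine, not an existing API.

```typescript
// Hypothetical hooks to be wired to the rendering engine.
type SetScreenFade = (opacity: number) => void; // 0 = clear, 1 = black
type MovePlayerTo = (x: number, y: number, z: number) => void;

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Teleport as a "blink": fade out, relocate the player instantly,
// then fade back in. No continuous motion is ever shown to the user.
async function blinkTeleport(
  target: { x: number; y: number; z: number },
  setScreenFade: SetScreenFade,
  movePlayerTo: MovePlayerTo,
  fadeMs = 150,
): Promise<void> {
  setScreenFade(1); // quick fade to black
  await sleep(fadeMs);
  movePlayerTo(target.x, target.y, target.z); // instant relocation
  setScreenFade(0); // fade back in
  await sleep(fadeMs);
}
```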
Locomotion platform
The locomotion platform can be imagined as a recessed treadmill allowing free movement in any direction, with the person's movement on the belt replicated in virtual reality. Using a locomotion platform makes it possible to achieve unrestricted freedom of movement within the very small space of the belt.
The movement again feels natural and does not cause any motion sickness.
Movement via controller
Buttons and especially the controller joystick should be used for movement only when necessary. During this kind of movement, the user's brain receives information about movement from the eyes, while the information from the vestibular system does not confirm it. This clash leads to motion sickness. If you really need to allow this type of movement, it's a good idea to find ways to eliminate the feeling of nausea as much as possible - see the best practices in the How to design motion sickness-free VR applications? section below.
How to design motion sickness-free VR applications?
Performance optimization for maximum FPS
VR headsets can display 60, 90 or more frames per second (Hz). The human brain is able to process at most about 40-60 Hz through the eye, but it works with high losses (critical flicker fusion) of up to 50%. VR headsets themselves must therefore reach much higher frequencies. If we take the worst-case scenario - the maximum perceptible number of frames combined with the lowest processing efficiency in the brain - we arrive at the need for a frequency of at least 120 Hz. In practice, however, we usually work with a minimum value of 90 Hz, and even this value is far from being achieved by all devices. For example, the popular Oculus Quest headset works at a frequency of 72 Hz.
Motion sickness occurs when the frequency of frame processing in the brain is higher than the frame rate generated by the application and shown on the VR headset display. That's why it's necessary to keep the application's FPS above the level of processing in the brain. The choice of the minimum application frequency is simple - optimally 90 Hz under all circumstances. The current FPS of an application can easily be measured with the help of a small script (or via the profiling tools of the game engine you are developing in), as in the sketch below.
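As a rough illustration, the following TypeScript sketch measures the current FPS as a moving count of frames over the last second. It assumes a browser-style requestAnimationFrame loop and is meant only as a starting point, not a complete profiler.

```typescript
// Moving-window FPS counter based on frame timestamps.
const frameTimes: number[] = [];

function onFrame(now: number): void {
  frameTimes.push(now);

  // Keep only the timestamps from the last 1000 ms.
  while (frameTimes.length > 0 && now - frameTimes[0] > 1000) {
    frameTimes.shift();
  }

  const fps = frameTimes.length; // frames rendered in the last second
  if (fps < 90) {
    console.warn(`FPS dropped to ${fps} - below the 90 Hz comfort target`);
  }

  requestAnimationFrame(onFrame);
}

requestAnimationFrame(onFrame);
```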
Elimination of unwanted movements
The terrain surface must be free of any elements that would cause the camera in the virtual space to jump, get stuck, or lag behind the real movement of the user's head.
An example could be a curb or stairs. With a mechanic such as walking up stairs, the camera would jump by the step height with each step in the virtual space, while the user's head makes no such movement. It is therefore good practice to dampen these sudden changes in camera position caused by the virtual terrain. For example, the stairs may be "replaced" by an inclined plane without any visual impact. Technically, a plane with 0% opacity can be placed over the stairs, which keeps the visual look of the stairs in the terrain while the walking itself is processed on the inclined plane, without unwanted camera movements.
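Besides the invisible ramp, the same damping idea can be expressed directly in code. The sketch below is a hypothetical TypeScript example that low-pass filters the camera's vertical position, so a sudden height change caused by the terrain (a curb, a step) is spread over several frames instead of appearing as a jump.

```typescript
// Exponential smoothing of the camera height. targetY is the height
// dictated by the terrain; the returned value is what the camera uses.
function smoothCameraHeight(
  currentY: number,
  targetY: number,
  deltaTime: number,      // seconds since the last frame
  smoothingPerSecond = 8, // higher = snappier, lower = softer
): number {
  // Frame-rate independent interpolation factor.
  const t = 1 - Math.exp(-smoothingPerSecond * deltaTime);
  return currentY + (targetY - currentY) * t;
}

// Example: a 0.15 m curb is crossed; the camera rises gradually over a few frames.
let cameraY = 1.7;
for (let i = 0; i < 5; i++) {
  cameraY = smoothCameraHeight(cameraY, 1.85, 1 / 90);
  console.log(cameraY.toFixed(3));
}
```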
Another example is the ground shaking after a grenade explosion. In PC shooters, the camera shakes violently at this moment. How would a user feel if the same camera movement were used in a VR experience? Better not to think about it - it would hurt. A much better choice is to use post-processing and shake only the edges of the screen: the player gets a peripheral feeling of shaking while the main field of view stays free of unwanted and unpleasant movement.
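A hypothetical TypeScript sketch of the edge-only shake: instead of offsetting the whole camera, a shake amplitude is computed per screen position so that the centre of the view stays perfectly still and only the periphery moves (for example as an input to a post-processing pass or overlay). All thresholds are assumed example values.

```typescript
// Shake amplitude for a point on the screen, where radial is the distance
// from the screen centre normalised to 0..1 (0 = centre, 1 = corner).
// The centre region stays completely still; the shake grows toward the edges.
function edgeShakeAmplitude(
  radial: number,
  timeSeconds: number,
  intensity: number, // overall strength, e.g. driven by explosion distance
  innerRadius = 0.6, // everything inside this radius is untouched
): number {
  if (radial <= innerRadius) return 0;

  // 0 at the inner radius, 1 at the very edge of the screen.
  const edgeFactor = (radial - innerRadius) / (1 - innerRadius);

  // A simple decaying oscillation for the shake itself.
  const oscillation = Math.sin(timeSeconds * 40) * Math.exp(-timeSeconds * 3);

  return intensity * edgeFactor * oscillation;
}
```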
How to move a user if needed?
The best ways to move a user are natural self-movement, teleportation, or a locomotion platform. Often, however, a VR application is built around forced movement of the user - and such an application is a perfect candidate for motion sickness. Examples include a roller coaster ride, a moving platform, or a simple virtual tour across a city.
There is one important rule for camera design in such applications: "no quick and unexpected movements!" Forget crazy camera (user) movements such as frantic acceleration, braking, or changes of direction. The camera movement must be completely smooth, in all directions. The impression of speed or acceleration must instead be delivered through post-processing - typically by reducing the field of view, blurring the edges, creating a "speed tunnel" along the edges of the screen (you know it from the Need for Speed games), and so on.
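A minimal sketch, assuming a simple forward-speed model: acceleration is clamped so the camera never changes speed abruptly, and the current speed drives a vignette (tunnel) strength that a post-processing pass could use to narrow the field of view. The thresholds are hypothetical.

```typescript
// Limit how quickly the camera speed may change, so the motion stays smooth.
function stepSpeed(
  currentSpeed: number,
  targetSpeed: number,
  maxAcceleration: number, // m/s^2, keep this low for comfort
  deltaTime: number,       // seconds since the last frame
): number {
  const maxChange = maxAcceleration * deltaTime;
  const diff = targetSpeed - currentSpeed;
  return currentSpeed + Math.max(-maxChange, Math.min(maxChange, diff));
}

// Map the current speed to a vignette strength in 0..1 that a
// post-processing pass can use to darken / narrow the edges of the view.
function vignetteStrength(speed: number, comfortSpeed = 2, maxSpeed = 10): number {
  if (speed <= comfortSpeed) return 0;
  return Math.min(1, (speed - comfortSpeed) / (maxSpeed - comfortSpeed));
}

// Example: accelerating toward 8 m/s at no more than 3 m/s^2.
let speed = 0;
for (let i = 0; i < 5; i++) {
  speed = stepSpeed(speed, 8, 3, 1 / 90);
}
console.log(speed.toFixed(3), vignetteStrength(speed));
```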
Appropriate environment design - the existence of orientation points
In space, the brain orients itself by means of certain distinctive elements - a particular house, a tree, and the like. This is not possible in a desert, where one can quickly become disoriented and lost. The same is true in VR space. Design your scenes so that the user's brain always has something to hold on to, to reduce the unpleasant feeling of disorientation.
A suitable structure of the terrain and environment also helps the user judge speed and general orientation when turning and looking around. When rotating, having objects in multiple layers is always beneficial (something in the foreground, something else in the background, ...). Perceived speed depends heavily on the distance of objects from the player's camera (sight) - the closer the objects are, the more pronounced the feeling of speed. A poor sense of orientation and excessive speed with fast-changing objects cause motion sickness.
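The distance effect can be made concrete with a tiny sketch: for an object passing by at the same linear speed, the angular speed across the user's field of view is roughly the speed divided by the distance, so an object twice as close appears to move twice as fast.

```typescript
// Approximate angular speed (radians per second) of an object passing
// sideways at linear speed v (m/s) at distance d (m) from the camera.
function angularSpeed(v: number, d: number): number {
  return v / d;
}

// The same 5 m/s movement feels twice as fast at 2 m as at 4 m.
console.log(angularSpeed(5, 4)); // 1.25 rad/s
console.log(angularSpeed(5, 2)); // 2.5 rad/s
```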
User interface as part of the terrain
UI elements such as menus, buttons, and various descriptive and informational texts should be designed as part of the terrain. If they have to be attached to the camera movement, they should not follow it 1:1; instead the UI elements should follow the camera movement slowly and smoothly, with a slight delay, as in the sketch below.
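A minimal sketch of the delayed follow, assuming a hypothetical 3D vector type: each frame, the UI panel is interpolated a small amount toward an anchor point in front of the camera instead of being re-parented to it, so it trails head movement smoothly.

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Move the UI panel a fraction of the way toward its anchor point in
// front of the camera each frame, instead of locking it to the camera.
function lazyFollow(
  panelPosition: Vec3,
  anchorInFrontOfCamera: Vec3,
  deltaTime: number,
  followSpeed = 4, // lower = more lag, higher = tighter follow
): Vec3 {
  const t = 1 - Math.exp(-followSpeed * deltaTime); // frame-rate independent
  return {
    x: panelPosition.x + (anchorInFrontOfCamera.x - panelPosition.x) * t,
    y: panelPosition.y + (anchorInFrontOfCamera.y - panelPosition.y) * t,
    z: panelPosition.z + (anchorInFrontOfCamera.z - panelPosition.z) * t,
  };
}
```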
Avoid excessive flickering
Excessive use of flashing lights, muzzle flashes when shooting a pistol, and other such effects increases the user's fatigue and the likelihood of unpleasant feelings.
Give the user a choice
Each user is different: different preferences, a different VR headset, and a different threshold for motion sickness. Giving the user a simple choice (which locomotion systems to use, which effects should be active, which graphics quality is preferred, etc.) is always good practice.
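In practice this often comes down to a small comfort-settings structure exposed in the options menu. The TypeScript sketch below is a hypothetical example of what such settings could look like; the exact options depend on your application.

```typescript
// Hypothetical comfort settings a VR application could expose to the user.
type LocomotionMode = "teleport" | "smooth" | "room-scale-only";

interface ComfortSettings {
  locomotion: LocomotionMode;
  speedVignette: boolean;   // narrow the FOV while moving
  flashingEffects: boolean; // flashes, flickering lights, ...
  graphicsQuality: "low" | "medium" | "high";
}

// Conservative defaults for users who have not chosen anything yet.
const defaultSettings: ComfortSettings = {
  locomotion: "teleport",
  speedVignette: true,
  flashingEffects: false,
  graphicsQuality: "medium",
};
```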
Target time in the VR application
VR has made great progress, but it's still not perfect. The same applies to your application. The longer you force the user to stay in virtual reality, the higher the chance of motion sickness. So design your VR application in such a way that the user does not have to spend more time in it than is absolutely necessary. For games, it's ideal to pause and save the game automatically each time the user takes off the headset. This gives the user the freedom to play intermittently - the option to decide when to stop.
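How headset removal is detected depends on the platform. In WebXR, for example, the session's visibility state typically changes when the user takes off the headset or the app loses focus, though the exact behaviour depends on the runtime. A minimal sketch, assuming hypothetical pauseAndSaveGame / resumeGame functions:

```typescript
// Minimal shape of a WebXR session for this example (real projects would
// use the WebXR type definitions instead).
interface XRSessionLike {
  visibilityState: "visible" | "visible-blurred" | "hidden";
  addEventListener(type: "visibilitychange", listener: () => void): void;
}

// Hypothetical game hooks - replace with your own pause/save logic.
function pauseAndSaveGame(): void { /* persist progress, stop the clock */ }
function resumeGame(): void { /* continue from the saved state */ }

// Pause and save whenever the session is no longer fully visible,
// resume when the user puts the headset back on.
function watchHeadsetPresence(session: XRSessionLike): void {
  session.addEventListener("visibilitychange", () => {
    if (session.visibilityState === "visible") {
      resumeGame();
    } else {
      pauseAndSaveGame();
    }
  });
}
```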