
Tips for Avoiding Sim Sickness in VR App Design

Sky Nite

Updated: Apr 25, 2020

VR apps can be very comfortable if designed in the right way. I've compiled here all the tricks I know of to remove or reduce sim sickness in Virtual Reality.


(Note: This post is an excerpt from my book VR / AR Enterprise Insider, available on Amazon.)


As we talked about earlier in the book, modern VR hardware is capable of avoiding sim sickness in the vast majority of people. The software ultimately determines whether a user gets sick, and is the area designers and developers have the most control over.


Motion Sickness is a subset of Sim Sickness that is very common, so we’ll start with some design tips to either remove or reduce motion sickness in VR apps. Then we’ll talk about other sim-sickness-reducing tips to keep in mind when designing VR apps.


The focus for this section is on VR because AR has a different relationship with sim sickness. Most AR apps don’t move you through virtual space, and keep you grounded in physical reality. That said, many of these tips probably apply to AR if you design your AR apps to behave more like VR. Also note, I’ve personally never worn an AR HMD without getting sim sickness, probably due to the hardware’s low frame rate and tracking latency (although uncomfortable ergonomics may also be to blame). Handheld AR usually doesn’t cause sim sickness since the field of view is so small.


I expect AR frame rate and tracking to improve to the point of being comfortable like VR within a year or two of this book’s release. However, since writing these principles in the context of AR would be guesswork (there are still too many confounding factors and the hardware is changing so fast), I’ve decided to focus on the principles from a VR perspective to keep things clear and in the realm of what is well tested.


Artificial Locomotion


Any disconnect between a user’s vestibular system (i.e. their body’s sense of motion) and what they see is capable of causing motion sickness. How then do users move around in a VR app?


We’ll start out by discussing locomotion (i.e. movement) techniques that essentially cause no motion sickness. Because most enterprise users aren’t trying to maximize gaming immersion, I recommend these “full-comfort” locomotion techniques as the default for most enterprise apps.


If these “full-comfort” solutions don’t fit your use case or you want users who aren’t as sensitive to have different options, I’ve included a number of locomotion styles and tips that can reduce motion sickness while having freer movement. It’s important to keep in mind that not everyone suffers from sim sickness, and that it’s a spectrum where increased exposure can reduce its effects.


Teleport


It turns out that if you instantly move a player from one location to another (known as Teleporting), they don’t experience motion sickness. This makes sense because our brains don’t recognize teleporting as movement. Our minds aren’t used to the experience of suddenly shifting several meters away without any vection (i.e. sensation of movement), so we can move in VR very comfortably by teleporting.


In some ways, sim sickness can be thought of as the uncanny valley effect of immersion. If our brains’ expectations of reality don’t line up with what we see in the VR headset, our brain thinks something is wrong. However, if we completely trick the brain into thinking the immersive experience is real, or do something that our brain knows is not real, we generally won’t get sick.


Free Teleport


One of the most popular ways to implement teleportation is to use a laser beam with a parabolic arc attached to the hand. The user aims their hand (and thus the laser) where they want to go, sees an indicator on the floor showing where they will arrive, then presses (or in most cases releases) the button to teleport. The user then appears at the spot indicated.


Some developers make the teleport happen instantly with no transition. The danger of this is that it can be a bit jarring and if you allow rapid teleports the illusion of motion can be strong enough to still cause motion sickness. A common trick to avoid this is a quick fade to black then back up. The fade happens very quickly and is barely perceivable, but gives the brain a transition cue to make teleporting less jarring. You can also set a limit on how many times a user can teleport in a set amount of time, but need to be careful with this as it can be frustrating for players who want to move quickly.
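
To make this concrete, here’s a minimal, engine-agnostic sketch of a fade-then-teleport with a rate limit. The class, function names, and timing values (FadeTeleporter, request_teleport, the fade and cooldown durations) are my own illustrative assumptions, not any particular engine’s API.

```python
class FadeTeleporter:
    """Sketch of a fade-to-black teleport with a cooldown (illustrative values)."""

    def __init__(self, fade_seconds=0.15, cooldown_seconds=0.5):
        self.fade_seconds = fade_seconds          # quick, barely perceivable fade
        self.cooldown_seconds = cooldown_seconds  # limits rapid re-teleports
        self.position = (0.0, 0.0, 0.0)
        self._last_teleport = -float("inf")

    def request_teleport(self, target, now):
        """Call when the user releases the teleport button.
        'now' is the current time in seconds; returns True if the teleport happened."""
        if now - self._last_teleport < self.cooldown_seconds:
            return False                          # too soon; ignore to avoid vection build-up
        self.fade_out(self.fade_seconds)          # screen fades to black...
        self.position = target                    # ...the camera rig moves while hidden...
        self.fade_in(self.fade_seconds)           # ...then fades back up at the new spot
        self._last_teleport = now
        return True

    def fade_out(self, duration):
        print(f"fading to black over {duration}s")

    def fade_in(self, duration):
        print(f"fading back up over {duration}s")

tp = FadeTeleporter()
print(tp.request_teleport((2.0, 0.0, 3.5), now=0.0))   # True: teleports
print(tp.request_teleport((4.0, 0.0, 1.0), now=0.2))   # False: still in cooldown
```

The two details worth copying are that the move happens while the screen is dark, and that back-to-back teleports inside the cooldown window are simply ignored.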


Another solution is the Dash Teleport. Instead of instantly moving the user, Dash Teleport moves the user very quickly over time, often with some combination of the motion-reduction tips from later in this chapter. It gives users the feeling of traversing through the space, which can lead to them being more aware of their new surroundings. The motion is so fast that our brains have trouble recognizing it as motion. Although this technique works for many people, it can still cause issues for some, so you need to be careful using it as your default “comfort” option.


Free teleport has become an industry standard for apps that need a comfortable-for-everyone option combined with the ability to move freely throughout a VR environment.


Waypoint Teleport


Some apps need to restrict user movement to certain locations. In these instances, it is common to use a teleport waypoint system. Essentially, the user can see indicators for specific locations they can teleport, rather than being able to teleport anywhere with a laser arc. The indicator could be in the form of a glowing light, a 3D icon, or any number of other representations.


For example, you may have the teleportable locations light up or show an indicator when you look at them. Then you press a button, or keep looking for a few seconds, to teleport to the new location. Alternatively, you might give the user a laser pointer on their hand and let them click on a teleport location icon to go there.
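
As a rough illustration of the gaze-dwell variant, here’s a hypothetical sketch that selects a waypoint once the user has looked near it for a couple of seconds. The waypoint list, cone angle, and dwell time are placeholder values, not recommendations.

```python
import math

# Hypothetical, designer-placed waypoint positions in world space.
WAYPOINTS = [(0.0, 0.0, 5.0), (4.0, 0.0, 2.0), (-3.0, 0.0, 6.0)]

DWELL_SECONDS = 2.0    # how long the user must keep looking to trigger the teleport
GAZE_CONE_DEG = 10.0   # how closely the gaze ray must point at a waypoint to count

def angle_to(gaze_origin, gaze_dir, point):
    """Angle in degrees between the (unit) gaze direction and the direction to a point."""
    to_point = [p - o for p, o in zip(point, gaze_origin)]
    mag = math.sqrt(sum(c * c for c in to_point)) or 1e-9
    dot = sum(d * c for d, c in zip(gaze_dir, to_point)) / mag
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def update_gaze_teleport(gaze_origin, gaze_dir, dwell_time, dt):
    """Call once per frame. Returns (new_dwell_time, selected_waypoint_or_None)."""
    for wp in WAYPOINTS:
        if angle_to(gaze_origin, gaze_dir, wp) < GAZE_CONE_DEG:
            dwell_time += dt
            if dwell_time >= DWELL_SECONDS:
                return 0.0, wp            # dwell complete: teleport here
            return dwell_time, None       # still dwelling on this waypoint
    return 0.0, None                      # not looking at any waypoint: reset the timer

# Example: the user keeps looking toward the first waypoint until the dwell completes.
dwell = 0.0
for frame in range(40):
    dwell, selected = update_gaze_teleport((0.0, 1.7, 0.0), (0.0, -0.32, 0.95), dwell, dt=0.1)
    if selected:
        print("teleport to", selected)
        break
```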


Although waypoint teleport is more restrictive, it can ensure your user goes to the places you want, and in some cases, sees the start of the location how you want them to see it.


Roomscale and Scene-Based


Another option for avoiding any motion sickness is to never move the user artificially. “Roomscale” is a term used to describe a VR experience where the user can move around their environment physically in 360 degrees. Because the motion is one-to-one (due to the great tracking), no vestibular disconnect occurs. The motion you are seeing reflects the motion of your own body, and thus of your own inner ear.


Obviously, this can be fairly restrictive. The recommended minimum space for a roomscale VR experience is 2 meters by 1.5 meters. This means that the user in the virtual world is also somewhat restricted to that amount of space. If you need the user to use more virtual space, you can do so by using a larger physical space. Such “warehouse scale” VR is possible with the current technology, and can make sense for certain use cases, but is also potentially expensive. How many companies have a large empty room they can use for this purpose?


A way to get around limited space is to design for roomscale but allow transitions (or teleports) to specific locations. For example, if you are training someone at McDonald’s how to flip burgers in the kitchen, but then want them to practice with the cash register out front, the scene can transition the user from the kitchen to the register counter via a button on a menu, a teleport waypoint, or completion of the burger flipping lesson.


With larger spaces, it is also possible to use “redirected walking”, a technique that allows a user to feel like they are walking in a straight line when in actuality they are turning. Redirected walking can make a warehouse space feel like it has infinite size. The bare minimum space required to use redirected walking is about 6x6 meters, but more is preferable. Tricks can also be used to make a smaller space seem bigger using turning.
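
The core trick is usually a “rotation gain”: the virtual scene is rotated slightly more (or less) than the user’s real head rotation, so a physically curved path can feel like a straight virtual line. Here’s a toy sketch of just that idea; the gain bounds are purely illustrative, and a real system would steer the gain continuously based on where the user is in the physical room.

```python
# Purely illustrative gain bounds; real systems tune these from perception studies.
MIN_GAIN = 0.8
MAX_GAIN = 1.2

def injected_yaw(real_head_yaw_delta_deg, steering_gain):
    """Return the extra yaw (in degrees) to silently add to the camera rig this frame,
    given how much the user really turned their head."""
    gain = max(MIN_GAIN, min(MAX_GAIN, steering_gain))
    return real_head_yaw_delta_deg * (gain - 1.0)

# The user turns their head 10 degrees; with a 1.2 gain the world turns an extra
# ~2 degrees, gradually bending their physical walking path without them noticing.
print(injected_yaw(10.0, 1.2))
```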


If you are going to constrain your user to only physical space locomotion within a scene, but need the user to access objects outside of their physical space, telekinesis can be a useful method.


In apps that make use of telekinesis, the user points or looks at an object that is distant (sometimes with a laser pointer or cursor), then the object either glows or has some indicator pop up to show that it is highlighted. Pressing the grab button then floats the object into the user’s hand, or allows them to move it in relation to how their hand moves, with a trackpad or stick controlling how close or far the object moves from the hand. This technique requires more training and practice than just picking objects up naturally with your hands, so I’d hesitate to use it in an enterprise setting, but it can be an interesting solution.
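
A stripped-down sketch of that grab-and-reel behavior might look like the following: a grabbed object floats along the hand’s pointing ray, and the thumbstick reels it in or out. The distances, speeds, and names here are illustrative assumptions rather than any real toolkit’s API.

```python
from dataclasses import dataclass

@dataclass
class GrabbedObject:
    distance: float          # current distance from the hand along the pointing ray

# Illustrative tuning values.
MIN_DIST, MAX_DIST = 0.2, 10.0
DIST_SPEED = 2.0             # meters per second at full stick deflection

def update_telekinesis(obj, hand_pos, hand_dir, stick_y, dt):
    """Keep a grabbed object floating along the hand's (unit) pointing ray.
    stick_y in [-1, 1] pushes the object away from or pulls it toward the hand."""
    obj.distance += stick_y * DIST_SPEED * dt
    obj.distance = max(MIN_DIST, min(MAX_DIST, obj.distance))
    # New world position: hand position plus the kept distance along the pointing ray.
    return tuple(p + d * obj.distance for p, d in zip(hand_pos, hand_dir))

obj = GrabbedObject(distance=3.0)
print(update_telekinesis(obj, (0.0, 1.5, 0.0), (0.0, 0.0, 1.0), stick_y=0.5, dt=0.016))
```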


Smooth Locomotion (reducing sim sickness for other movement techniques)


Whatever your reason, you may decide teleport and roomscale don’t work for what you’re trying to accomplish with VR. Gaming and narrative uses especially can benefit from smoother locomotion techniques (i.e. locomotion that involves vection). If that’s the case, there are still a ton of tricks you can employ to make users more comfortable.


Vection Reduction


Vection is the feeling of movement produced by visual stimulation. Since vection in VR doesn’t match what our bodies (specifically our vestibular system) feel, it can make us experience discomfort (i.e. sim sickness).


One way to reduce discomfort from vection is to use it sparingly. Perhaps the user performs most of their tasks from a standstill, and only locomotes through the environment to go to a new activity.


Reducing movement speed also reduces vection (to a certain point). There is actually a balance between how long you experience vection and vection magnitude, and I think the topic is not well understood. The traditional advice is that if you move the user more slowly, they will experience less discomfort than moving them more quickly, and this is often true.


However, as discussed with Dash Teleport, there is a speed above which our brains don’t recognize motion the same way, and thus, once that threshold is reached, faster may actually be better. Unfortunately, it’s not clear cut because this “threshold” is more likely a spectrum, and that spectrum is different for every individual person.


It may be possible that moving someone at a speed of 6m/s for 10 seconds is more comfortable in the long run than moving someone 3m/s for 20 seconds, because even though the vection magnitude is greater for 6m/s, the user experiences vection for double the time at 3m/s. If possible, you probably want to allow user movement speed to be adjustable in settings, since different speeds are comfortable for each person. You also will want to test different speeds and see what feels good for the majority of people.


When I first started in VR, I used the Oculus recommended 1.4m/s for walking in a game I was making. It was very comfortable for my short experience. However, users repeatedly reported frustration with how slow the movement felt. If you’re going to have smooth locomotion, you probably want to avoid user frustration (and wasting user time) by employing as fast a movement speed as comfortable along with the techniques later in this section.


Acceleration / Deceleration Reduction


Acceleration and deceleration are far more problematic than vection when it comes to motion induced sim sickness. To rephrase, changes in motion are more uncomfortable than motion itself.


The general rule of thumb is that shorter periods of acceleration are better than longer periods. This means that instant acceleration to a set speed and instant deceleration can be more comfortable than gradual speeding up and slowing down.


It’s important to keep in mind that changes in direction constitute a velocity change, meaning that if the app requires strafing (moving side to side) via artificial locomotion, the user is experiencing constant velocity change.


Traditional console game thumb stick movement uses variable speed, meaning that the user can have a speed anywhere from 0 to their max speed in any direction (by pushing the thumbstick in that direction by a variable amount). This finely-variable speed of movement is generally less comfortable than quick or instant acceleration to a max speed.


So, where possible you want to make your velocity changes instant or very quick (if trying to minimize movement discomfort).
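
The difference is easy to see in code. Below is a hedged comparison of the traditional analog stick mapping against a snap-to-max-speed mapping; the speed and deadzone numbers are placeholders, not recommendations.

```python
MAX_SPEED = 3.0   # m/s, illustrative
DEADZONE = 0.2

def analog_speed(stick_forward):
    """Finely variable speed: constant small velocity changes as the stick wavers."""
    return 0.0 if abs(stick_forward) < DEADZONE else stick_forward * MAX_SPEED

def snap_speed(stick_forward):
    """Instant acceleration: the user is either standing still or moving at full speed."""
    return MAX_SPEED if stick_forward > DEADZONE else 0.0

for s in (0.1, 0.3, 0.7, 1.0):
    print(s, analog_speed(s), snap_speed(s))
```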


Perceived Movement Reduction


How much movement we perceive is affected by the environment around us. So, by reducing movement cues in our scene, we can lower discomfort.


For example, how tall a user is affects how fast the ground seems to be moving, as well as how much of our vision is taken up by the moving ground. Placing a user close to the ground is not recommended for comfort.


In VR development, the user’s view is called a “camera”. One of the popular implementations for a user camera is to set its height from the floor equal to that of the user (so the user appears as tall in VR as in physical life). This is generally a good idea for reasons other than sim comfort, such as increased immersion, being able to reach objects on the floor, and balance. In this case you can’t control the camera height, since it simply matches the user’s physical height.


However, there are also times where for the sake of design consistency you need the user to be a certain height (or a height within certain bounds). For times like this, you want to choose a height that feels comfortable for a majority of your demographic, but that isn’t too close to the ground.


The floor is just one surface that can cause perceived motion. If you are out in an open space, you’ll often see a skybox, which doesn’t move as you get closer to it (since a skybox will always appear the same from any distance). Objects that are very distant also don’t appear to move much as you move. It’s safe to conclude then that the most comfortable types of environments are those where you’re in an open space without walls or a ceiling.


Once you’re inside, walls and ceiling can increase your sense of movement, since they change in your periphery. That said, you can still have a comfortable experience with an indoor environment. Just avoid tight enclosed spaces where possible.


A good example of this principle in action comes from Echo Arena, a 0-gravity VR game on 6DOF headsets. While playing the tutorial, which consists of a series of narrow rooms, people report increased motion sickness compared to the game’s lobby and play arena, which take place in much more open and expansive environments (despite being indoors).


Peripheral Blinders (FOV Limiting)


A common technique used to reduce motion sickness with smooth locomotion is the use of peripheral blinders. Essentially, when a user moves, the edges of their vision fade to black. When the user stops moving, the black at the edge of their view disappears. This probably works because the blinders block or reduce the motion our peripheral vision sees. Reducing how much of the field of view (FOV) perceives motion is one of the most surefire ways to reduce discomfort.


Although this technique can reduce motion sickness in most people, some report that it increases discomfort. Whether or not that’s truly the case (their discomfort might have been even worse without the blinders), the technique can be implemented with varying degrees of intrusiveness. Generally, the more pronounced and immersion-breaking the black rim, the more effective it is at preventing motion discomfort.


Thus, there is a fine line to walk between keeping users immersed and not annoyed, and keeping them physically comfortable. Users who don’t suffer from sim sickness generally hate peripheral blinders, and even those who need it will complain if it is too pronounced, so be careful with this technique, possibly making it an option that can be disabled.
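
A typical implementation scales the vignette with locomotion speed and exposes an intensity setting, so sensitive users can strengthen it and immune users can switch it off. Here’s a sketch of that logic under those assumptions; the FOV numbers and parameter names are mine, not a standard.

```python
FULL_FOV_DEG = 100.0   # clear view when stationary (illustrative)
MIN_FOV_DEG = 60.0     # clear view at full locomotion speed (illustrative)
MAX_SPEED = 3.0        # m/s at which the vignette is fully applied

def vignette_fov(speed, intensity=1.0):
    """Return the angular size of the un-darkened window for the current speed.
    intensity=0 disables the effect entirely for users who hate blinders."""
    t = min(abs(speed) / MAX_SPEED, 1.0) * intensity
    return FULL_FOV_DEG - t * (FULL_FOV_DEG - MIN_FOV_DEG)

print(vignette_fov(0.0))       # 100.0 - no blinders while standing still
print(vignette_fov(3.0))       # 60.0  - fully applied at max speed
print(vignette_fov(3.0, 0.0))  # 100.0 - the user has turned the option off
```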


Movement in Viewpoint Direction


If the user is looking in the direction they are moving, they will experience less discomfort than moving in a different direction from where they are looking.


A common implementation of smooth locomotion for VR is to press a button (or push the joystick / trackpad forward) and have the user move in the direction they are looking. For games with gravity, the up / down tilt of the head is ignored. This is in contrast to schemes where, once you’ve started moving, the head can turn freely without affecting your direction. That said, there are still plenty of apps that allow you to move in directions other than where you are looking (including strafing side to side and backing up).
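
In code, “move where you look” usually means flattening the head’s forward vector onto the horizontal plane before applying speed, so looking up or down doesn’t send the user into the air or the floor. A minimal sketch, with a made-up movement speed:

```python
import math

def head_direction_velocity(head_forward, stick_forward, speed=2.0):
    """Move in the direction the head is looking, ignoring its up/down pitch.
    head_forward is the head's forward vector (x, y, z); stick_forward is in [0, 1]."""
    x, _, z = head_forward                 # drop the vertical component (gravity games)
    mag = math.hypot(x, z) or 1e-9
    flat = (x / mag, 0.0, z / mag)         # normalized, horizontal look direction
    return tuple(c * stick_forward * speed for c in flat)

# Looking slightly downward and to the right still moves the user horizontally.
print(head_direction_velocity((0.7, -0.3, 0.65), stick_forward=1.0))
```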


Design-wise, you want to encourage your users to move in the direction they are looking, or look in the direction they are moving, since other movement will increase discomfort. Perhaps instead of allowing artificial strafing, make it so the user has to physically move side to side in the VR app. A great example of this is the game Pistol Whip, which moves the user slowly forwards (in one set direction) through an environment, but encourages the user to dodge side to side with their physical body.


Give User Control of Movement


As much as possible, it’s better to allow users to control their artificial locomotion rather than moving them unexpectedly. When a user anticipates that a control input (such as a button press) will move them, sim sickness is reduced. Although the mechanism for this is not fully understood, it’s plausible that when our brains expect motion to occur, the vestibular disconnect that comes from artificial locomotion is not as unexpected, and so not as uncomfortable. Ultimately, what matters most is that giving users control of their own movement definitely increases comfort, and possibly allows them to build a mental connection between the input and the motion (which, as we’ll talk about later, can build resistance over time).


Roller-coaster VR simulators are a reliable way to make anyone who isn’t immune sick, because in addition to the fast speed and many turns, the user has no control over the motion happening to them. The visual indicator of the track can provide some expectation of movement, but because the coaster goes so fast, that movement is generally not anticipated the way a button-press-to-response is.

Author’s Note: Please, stop showing new VR users roller coaster apps. They are a terrible intro to VR, especially on a 3DOF headset.


Provide Cues for Non-User-Controlled Locomotion


When the user has to be moved without their own input, providing sound and haptic cues can reduce discomfort by giving their brains an action-reaction expectation.


For example in Echo Arena (the 0-gravity Oculus game) you have control of your movement most of the time. However, if you hit your head on a wall while flying, you bounce off it. Usually when you hit your head, it isn’t an intentional thing, and so was not an anticipated change in movement. Despite this, bouncing off something in Echo Arena feels fairly comfortable.


This is because when you hit your head on something in the game, a distinct “thump” sound is played, coinciding with a haptic rumble on both your hand controllers. The result is a visceral “thump” of your head that you instantly recognize, turning a situation of unexpected velocity change into “oh dang, I just hit my head”.
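
The pattern generalizes: whenever the simulation moves the user unexpectedly, fire an audio and haptic cue in the same frame as the velocity change. The sketch below uses placeholder sound and haptics functions rather than any real engine API, and the scaling of feedback with impact speed is my own assumption.

```python
def play_sound(name):
    print(f"audio: {name}")

def pulse_haptics(hand, amplitude, seconds):
    print(f"haptics: {hand} amp={amplitude:.2f} for {seconds}s")

def on_head_collision(impact_speed):
    """Call from your physics callback when the head collider hits geometry."""
    strength = min(impact_speed / 5.0, 1.0)   # scale feedback with impact (assumption)
    play_sound("head_thump")
    pulse_haptics("left", strength, 0.08)
    pulse_haptics("right", strength, 0.08)
    # ...then apply the bounce velocity to the player rig as usual.

on_head_collision(impact_speed=3.2)
```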


This is the kind of presence and embodiment that VR can provide. Increasing our sense of embodiment can make our brains more comfortable in VR. So, wherever possible make unexpected direction changes feel like an expected result of something within the VR simulation. Sound and haptics can go a long way to increasing immersion and building brain expectations.


Consistent Orientation


We’ve talked about artificial movement, but what about rotation? Artificial rotation is in some cases the most sim-sickness inducing thing that can occur (dropped frames and bad hardware can be as bad or worse).


Imagine you are sitting here reading this book when the whole world suddenly starts spinning around you. That is the experience of smoothly turning the user in VR, and for many people it’s an instantly bad time. That said, I’ll remind once again that comfort is a spectrum, with some users able to turn with traditional first-person shooter joystick controls just fine.


How then do we turn the user comfortably? Well, one sure-to-work option is to have the user turn their head physically. This can work very well for standing or roomscale experiences, but fails if your user is seated. Another option is to use peripheral blinders whenever the user turns, but this solution isn’t perfect.


You could design your app so that the user never needs to turn. My first few VR prototypes used this method, but it was admittedly lackluster. People don’t like to feel restricted, and a forward-facing VR app with no artificial turning is hard to design and only suits certain situations. It can certainly work; games like Space Pirate Trainer and my own Meme Dragons drop waves of enemies in front of you to shoot, making it so you don’t ever need to turn. Users don’t miss the turning in this case because they don’t have a reason to turn.


But, as soon as your experience takes place in a 3D world with free movement, turning becomes necessary. So far, the industry has come up with one effective solution: snap turning.


Snap Turning


Snap Turning is a technique where flicking a joystick or touchpad right or left instantly turns the user a pre-set number of degrees (often 45-90). There is no acceleration, because the view changes to the new orientation in a single frame. This trick appears to be a very comfortable way to turn for most people. However, it can be fairly jarring for new users, and has a bit of a learning curve. Because it breaks the smooth motion we’d expect from reality, it can also reduce immersion.


That said, with practice users can learn to quickly reorient themselves after a snap turn, building the muscle-memory expectation of “if I press the stick this direction, I’ll turn 45 degrees and be looking over here”. If the degree increments are too small, turning can be frustrating and appear like motion (which defeats the whole purpose).
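
Here’s a minimal snap-turn sketch. The 45-degree increment and stick thresholds are illustrative; the important details are that the rotation happens in a single frame and that the stick has to return toward center before another snap can fire.

```python
SNAP_DEGREES = 45.0
STICK_THRESHOLD = 0.7   # how far the stick must be flicked to trigger a snap

class SnapTurner:
    def __init__(self):
        self.rig_yaw = 0.0
        self._stick_centered = True   # require re-centering between snaps

    def update(self, stick_x):
        if self._stick_centered and abs(stick_x) > STICK_THRESHOLD:
            self.rig_yaw = (self.rig_yaw + SNAP_DEGREES * (1 if stick_x > 0 else -1)) % 360
            self._stick_centered = False          # one snap per flick
        elif abs(stick_x) < 0.2:
            self._stick_centered = True
        return self.rig_yaw

turner = SnapTurner()
for x in (0.9, 0.9, 0.0, -0.9):      # hold right, release, flick left
    print(turner.update(x))          # 45.0, 45.0, 45.0, 0.0
```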


Snap Turning is comfortable and pervasive enough that I’d recommend it as an option for any VR app that uses artificial turning (since for many people, other artificial turning methods are untenable). Just keep in mind, users who can handle smooth turning usually prefer that.


Avoid Pitch and Roll


Turning around to look behind you is a turn on the Y-axis, which is known as “yaw”. Turns on the X-axis (i.e. somersaulting forwards) are known as “pitch”, and turns on the Z-axis (i.e. a cartwheel) are known as “roll”. The term “stick yaw” is commonly used in the VR and gaming industry in reference to the stick / trackpad turning we talked about in the last section. Smooth yaw turning is already very uncomfortable, but pitch and roll are even more so.


Living our day to day lives, we rarely experience significant pitch and roll, and our experience of it is almost entirely initiated by our heads. The few exceptions are events like riding a roller coaster or getting shoved to the ground, events that are generally disorienting. Astronauts undergo significant training to prepare for the common pitch and roll of 0-gravity, and even still often have to adjust to being in space before getting over initial nausea (usually taking drugs like dramamine).


So, to create a comfortable experience you should avoid artificial turning in the pitch and roll directions. Obviously, if you want to create a realistic flight simulator this is not possible, but the vast majority of experiences don’t require pitch or roll.
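
In practice this usually means the app only ever applies yaw to the camera rig, while pitch and roll come exclusively from the user’s own head tracking. A hypothetical sketch, using the Y-up axis convention described above:

```python
def apply_artificial_rotation(rig_euler_deg, input_yaw_deg, input_pitch_deg=0.0, input_roll_deg=0.0):
    """rig_euler_deg is (pitch_x, yaw_y, roll_z). Pitch and roll inputs are ignored,
    so artificial rotation can never tilt the user's horizon."""
    pitch, yaw, roll = rig_euler_deg
    return (pitch, (yaw + input_yaw_deg) % 360.0, roll)   # only yaw ever changes

print(apply_artificial_rotation((0.0, 90.0, 0.0), input_yaw_deg=45.0, input_pitch_deg=30.0))
# -> (0.0, 135.0, 0.0): the requested pitch never reaches the rig
```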


Fixed Horizon


Related to the pitch and roll discussion, having a fixed horizon can help ground the user and reduce discomfort. If the user is floating in a void of space with no clear horizon, they can become disoriented and feel increased sim sickness. The same thing can occur if the horizon starts moving.


Visible skyboxes are the easiest way to accomplish this. A skybox appears the same no matter how far the user is away from it, providing a consistent far-away grounding reference for our eyes. Other far off non-moving pieces of geometry (such as a wall) can also provide this.


An example of breaking this tip would be to have a massive castle rise directly in front of a user, or to put them in an elevator where they can see the walls moving past. Space sims with yaw and roll break this tip as well, since the horizon spins depending on what direction you turn.


Fixed Focal Point


If your vision is focused on a specific object and able to follow it, some motion sickness can be reduced. For example, when driving we are usually focused on the car in front of us or the horizon, which barely move relative to the road. If you were to watch the road directly below you instead you would feel much more discomfort.


Another example comes from Echo Arena, where you chase a disc around a 0-gravity environment. Even though you are moving quickly through the environment with artificial locomotion, much of your visual focus is the disc or the other players around you. You’re so focused on a fixed object, you barely notice the vection of your peripheral vision.


HUD and Cockpits (Fixed Objects and FOV motion reduction)


A Heads Up Display (HUD) constitutes a user interface (UI) made up of elements fixed to a user’s face. For example, a racing simulator might attach your car’s speed to the lower right of your vision so that you can check your speed with a glance. As you turn your head, the UI stays in the same relative location, and so appears to never move.


Having too many visual elements fixed to your head can be uncomfortable. However, subtle use of HUD elements can decrease discomfort by providing a constant fixed reference point.
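
Implementation-wise, a head-locked element is simply re-positioned relative to the head pose every frame. The sketch below keeps a small element toward the lower right of the view; the offset values are placeholders, and it ignores head pitch and roll for brevity.

```python
import math

HUD_OFFSET = (0.25, -0.2, 0.7)   # right, down, forward in meters, relative to the head

def hud_world_position(head_pos, head_yaw_deg):
    """Place the HUD element relative to the head each frame (yaw only, for brevity)."""
    yaw = math.radians(head_yaw_deg)
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    forward = (math.sin(yaw), 0.0, math.cos(yaw))
    ox, oy, oz = HUD_OFFSET
    return (head_pos[0] + right[0] * ox + forward[0] * oz,
            head_pos[1] + oy,
            head_pos[2] + right[2] * ox + forward[2] * oz)

print(hud_world_position((0.0, 1.7, 0.0), head_yaw_deg=0.0))   # sits low and to the right
```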


For example, Echo Arena uses subtle UI images attached to the face to give the user the feeling that they are wearing a helmet. The presence of these images reduces sim sickness (probably because of reduced perception of vection, but the reason is not fully understood). Generally I’d recommend keeping fixed UI elements in the peripheral vision rather than the areas of the visual field where you spend most of your time focusing. However, this is just a rule of thumb.


As with the peripheral blinders we talked about, limiting how much of the vection we perceive can also reduce sim sickness. An effective way to do this can be through the use of cockpits, which block part of our field of view from seeing movement. The feeling of being in a vehicle may also contribute to tricking our brain into accepting artificial locomotion.


While piloting a spaceship in the VR version of No Man’s Sky, the cockpit takes up much of your vision and there are HUD elements fixed to the viewport, helping to reduce sim sickness. Other games like Vox Machinae, a mech piloting game, use similar tactics.


Using Embodied Movement


When we involve our body in artificial movement, sim sickness appears to be reduced. This isn’t fully understood, but it’s likely because moving our physical body tricks our brain into thinking our physical actions are causing our virtual movement.


One-to-one mapping of hand movement to body movement seems to work very well. This “climbing” or “crawling” movement works by having users press a “grab” button, then move their hand in the direction opposite to where they want to go. As you move your hand down, your body (and with it your head) moves up. Move the hand right, and you go left. This movement behaves how we’d expect our actions to work in the physical world (if you pull yourself up a rock wall, you bring your hands closer to your body, which moves your body up).
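
A bare-bones version of this grab-and-pull locomotion moves the whole camera rig by the inverse of the grabbing hand’s frame-to-frame motion. The sketch below assumes the hand position is sampled in tracking (rig-local) space, so the rig’s own movement doesn’t feed back into the delta; the class and names are my own.

```python
class ClimbLocomotion:
    def __init__(self):
        self.rig_position = [0.0, 0.0, 0.0]
        self._last_hand = None

    def update(self, grab_held, hand_tracking_pos):
        """Call once per frame with the grab button state and the hand's tracking-space position."""
        if grab_held and self._last_hand is not None:
            for i in range(3):   # apply the hand's movement to the rig with the sign flipped
                self.rig_position[i] -= hand_tracking_pos[i] - self._last_hand[i]
        self._last_hand = hand_tracking_pos if grab_held else None
        return tuple(self.rig_position)

climb = ClimbLocomotion()
climb.update(True, (0.0, 1.5, 0.0))         # grab begins: nothing moves yet
print(climb.update(True, (0.0, 1.3, 0.0)))  # hand pulled 0.2 m down -> the rig rises 0.2 m
```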


Echo Arena makes use of this trick, allowing you to climb around the environment and throw yourself into 0-G off of walls and other objects. A race game mode in Rec Room uses a similar technique to allow you to climb up walls.


Another embodied movement technique is “arm running”. Users pump their arms as if they are running, and this causes them to move forward through the environment. The games Sprint Vector and Climbey are good examples of this.


So, it appears that connecting physical movement to artificial locomotion in a natural way increases our immersion and stops our brains from freaking out as much. These are just two methods of embodied virtual locomotion that the industry has thought of so far. There are still plenty of possibilities waiting to be discovered or popularized.


Reducing Overstimulation


VR environments can be overstimulating, meaning the user experiences mental fatigue from the complexity of how much is going on. Additionally, fine detail can be lost on a VR display, adding visual noise that instead creates increased vection.


If you wanted to design an environment that would be as uncomfortable as possible, you would use lots of bright colors and color variation. The geometry would be very complex, with textures (aka color arrangements) that contained a plethora of fine details and lines.


Very bright colors can be exhausting for users, so it’s better to use tones that aren’t too bright (so neon green as a major part of the environment is probably out). This is a common design principle even outside of VR for the same reason, but VR emphasizes the eye strain.


Similarly, in any design too much complexity can make it difficult to see the shapes of the art clearly. Too many lines or colors can make an object difficult to parse (i.e. make out). In VR this problem is exacerbated because of relatively low pixels per degree compared to flat monitors. Very detailed art ends up looking messy and unclear because there aren’t enough pixels to display it.


Even with better displays, too many lines and color / shape changes increase the feeling of vection when a user is artificially moving. The most comfortable environments use simple geometry with relatively few colors.


That doesn’t mean comfortable VR environments have to be unsophisticated. Cartoony art works well in VR for the reasons above (and some optimization reasons we’ll talk about later). However, you can also achieve high realism without making the 3D objects overly complex.


“VR Legs” (Repeated Exposure)


Repeated use of VR and artificial locomotion can increase resistance to sim sickness. People can earn their “VR legs” by a combination of persistence and gradual intensity increase.


Gained resistance to the effects of sim sickness is not overly surprising. About half of astronauts sent to the space station experience 0-gravity sickness their first few days. At first their vestibular system is very confused. However, as they live their lives in 0-gravity and gain familiarity with it, their bodies start to relax.


VR can work the same way. However, it is unclear what frequency of use improves comfort at what rate (it is definitely variable for each person), and for how long the resistance lasts. You may do a VR gaming or training binge one week and have an iron stomach by the end, but if you then stay away from VR for a few days and go back, how resistant will you be? This is not well understood.


Relying on users to get experience with VR vestibular disconnect is an uncertain business. One method could be to start a user with a very comfortable experience, then slowly graduate them to more and more extreme forms of artificial locomotion. However, many people don’t want to feel the effects of sim sickness, and would rather not use VR than train their bodies to be able to use your app.


Generally it’s advised to not fight through the discomfort and to instead take a break when you notice it. Doing otherwise can result in a very unpleasant experience and potentially form an automatic association in a person’s mind between VR use and that unpleasant feeling.


I think where possible it is best to offer options for users. This allows them to build resistance at their own pace, and allows users with immunity to jump right into the style of movement they desire.


Final Thoughts


When tourists go up in a plane that simulates 0-gravity (the same kind used to train astronauts), about 60% experience motion sickness. Should we then expect about 60% of the population to be susceptible to the discomfort?


This chapter contains many tricks for avoiding motion sickness in VR, and these are only the practical ones I’ve seen so far. As the industry develops and more experimentation is done, we’ll probably find more! In the long run, if kids start to use VR at a relatively early age, they may gain immunity to vestibular disconnect just like kids who grow up playing high-action video games gain resistance to its motion sickness effects.


Note: If you know of another design trick for avoiding motion sickness that I've missed, please let me know of it at sky@moduxr.com.


If you enjoyed this article, you may enjoy the rest of VR / AR Enterprise Insider, available on Amazon.


