VR Game Review: Eve Valkyrie

A case study on how to build cross-platform UI & UX

Billy Vacheva
Published in Virtual reality UX
8 min read · Mar 14, 2018


I am a huge fan of the EVE Online universe. I still remember the first time I saw the game; the scale of the open world was mesmerizing. You can imagine how excited I was when EVE Valkyrie was announced for the Oculus Rift. The game was built specifically for VR, but CCP later dropped the VR requirement and made it available for PC and PS4 as well. I decided to review the EVE Valkyrie UI and UX because the game is a great example of a truly cross-platform UI. Designing a UX that works for both VR and non-VR platforms is a topic I am very passionate about, and one I plan to build on in an upcoming article about how the Witcher 3 UI could be redesigned to be VR-ready. That said, let's look at some good practices in Valkyrie!

Game Interactions

The game was released before the Oculus Touch controllers were introduced, so all interactions were designed for gamepad and gaze selection. Consequently, when the game was released for PC and PS4, the interaction patterns were easy to adapt. I got my hands on Valkyrie when the game already supported Touch. At first, I couldn't figure out how to select a menu, as I expected to be able to use the Touch controllers for that. I could hear the sound of switching between menus, but I was not sure how I was doing it. I actually had to leave the game and do some research to realize that I had to gaze at a menu in order to select it. Still, being aware of this technique did not make navigation much easier. The main menu was split into two rows of tabs placed one in front of the other, which made selecting the right button quite challenging. You can see in the screenshot how both buttons in the back row seem to be selected simultaneously. This is because the game could not track where exactly I was looking. To select only one of the buttons, I had to turn my head a bit further left or right.

Main Menu

In general, if you plan to release an experience on high-end VR devices, I would recommend avoiding gaze selection. If you target Gear VR or Cardboard, make the buttons large and leave significant distance between them.

In the gif above you can see what the hover effect looked like. It was distinctive and smooth, using both a visual cue and a sound effect. Designing a good hover effect is key in VR because it helps inexperienced users get used to the controls. The best hover effects I've seen so far use a combination of visual cues such as glow, pop-up, and underlining together with sound.
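To make the gaze-selection problem concrete, here is a minimal sketch of how a gaze-hover check could work. This is not Valkyrie's actual code; the angular threshold and all names are my own assumptions. The key idea is to hover only the single target closest to the gaze ray, which also prevents the "two buttons selected at once" problem described above.

```typescript
type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (v: Vec3) => Math.sqrt(dot(v, v));

// Angle in degrees between the gaze ray and the direction to a target.
function angleTo(gaze: Vec3, target: Vec3): number {
  const cos = dot(gaze, target) / (len(gaze) * len(target));
  return (Math.acos(Math.min(1, Math.max(-1, cos))) * 180) / Math.PI;
}

// Hover only the single closest target within the threshold, so two
// overlapping buttons can never appear selected at the same time.
function hoveredTarget(gaze: Vec3, targets: Vec3[], thresholdDeg = 5): number {
  let best = -1;
  let bestAngle = thresholdDeg;
  targets.forEach((t, i) => {
    const a = angleTo(gaze, t);
    if (a < bestAngle) {
      bestAngle = a;
      best = i;
    }
  });
  return best; // index of the hovered target, or -1 for none
}
```

On Gear VR or Cardboard the same logic applies; a larger threshold effectively means larger buttons, which is why spacing them far apart matters.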

How to achieve VR immersion?

EVE Valkyrie is one of the first AAA VR titles, and it achieves an insane level of immersion. How did the team do it? By paying attention to the small details. For instance, when starting a training simulation you enter a room for a tactical briefing, surrounded by AI teammates and an awesome augmented reality map in the center of the room. It reminds me a lot of the map and common space in Echo Arena, which was released later, so I suppose its designers drew some inspiration from here. In video games that require players to move fast (flying, driving a car, etc.) it is common to show a tactical map of the terrain beforehand. This helps them orient themselves more easily.

AR map in the training simulation

Character customization

One of my favorite parts of the game is the so-called hangar. The UI is clean and minimalistic, and it makes good use of the 120-degree field of view by arranging all available spaceships in a carousel. If you think about it, there is no reason for this UI not to work on 2D screens as well; the experience just would not be as cool as in VR. Still, if you start from the opposite direction, adapting a 2D UI to VR, there are many potential pitfalls that could ruin the experience, for example positioning UI elements so that they fall outside the player's focus zone in VR.
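The carousel layout can be sketched as simple arc math: spread the ships evenly along a horizontal arc in front of the viewer so that none falls outside the comfortable viewing zone. The 120-degree default mirrors the field of view mentioned above; the function and its parameters are illustrative, not taken from the game.

```typescript
// Place n items evenly along a horizontal arc in front of the viewer.
// arcDeg = 120 keeps every item inside a comfortable viewing zone.
// Convention: 0 radians points straight ahead, along -z.
function arcLayout(n: number, radius: number, arcDeg = 120): { x: number; z: number }[] {
  if (n === 1) return [{ x: 0, z: -radius }];
  const arc = (arcDeg * Math.PI) / 180;
  const start = -arc / 2;
  const step = arc / (n - 1);
  return Array.from({ length: n }, (_, i) => {
    const a = start + i * step;
    return { x: radius * Math.sin(a), z: -radius * Math.cos(a) };
  });
}
```

On a 2D screen the same positions can simply be projected onto the view, which is one reason this particular UI translates so cleanly between platforms.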

The character customization system in Valkyrie, unlike the one in EVE Online, is quite basic. Still, personalization and character customization are powerful user experience techniques, which is why I want to focus a bit more on the benefits they bring. Game designers discovered that by allowing players to personalize their character, they improve engagement and the player's motivation to return to the game. It also increases immersion by making it easier for the player to identify with the character. Could this technique be applied outside of games, though? VR applications such as AltspaceVR, Oculus Avatars, and Facebook Spaces already offer different options for avatar customization. Especially in social or collaborative experiences, being able to bring your personality into VR is crucial for a good user experience.

When we talk about VR browsing, I see the creation of a consistent virtual personality as a potential problem browser vendors or third parties should look to resolve. Imagine browsing in VR: should you be able to build a persona once and use it to log in to different websites or even make payments? Or should there be a separate avatar system for each website that regards this as a crucial part of the experience (Twitter, Amazon, Tinder)?

Player position and locomotion

One of the big reasons the UX and UI of EVE Valkyrie translate so well to non-VR is that your character is seated the whole time. The view you see in VR is very similar to the one you see on a PC screen. One difference is player height: while this does not matter on PC, it is essential in VR, as some users might be very short while others are quite tall. If user height is likely to affect the experience, a good practice is to offer an option to adjust the point of view before starting. Unfortunately, I could not find a way to adjust my height in the game, so I ended up playing it standing in order to see the screen well.
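One simple way to implement such an adjustment is to offset the camera rig by the difference between the player's measured eye height and the eye height the cockpit was authored for. The sketch below is my own assumption of how this could look; the authored height and clamp range are illustrative values, not the game's.

```typescript
// Assumed seated eye height the cockpit was authored for, in meters.
const AUTHORED_EYE_HEIGHT = 1.2;

// Vertical offset to apply to the camera rig so every player sees the
// cockpit from the authored eye height, regardless of real-world height.
// The clamp keeps extreme calibrations from putting the camera through
// the seat or the canopy.
function rigYOffset(measuredEyeHeight: number, maxOffset = 0.5): number {
  const off = AUTHORED_EYE_HEIGHT - measuredEyeHeight;
  return Math.max(-maxOffset, Math.min(maxOffset, off));
}
```

Measuring the eye height once at startup (or exposing a "recenter" button) would have saved me from playing standing up.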

Although I love the EVE Online universe, space sim games may not be the best fit for VR, as you are literally flying a spaceship! The first time I tried it I felt sick just one minute into the gameplay, and I am a person who uses VR on a daily basis. Having something static in front of you should reduce motion sickness, they said… but not really. It's true that the motion sickness got better with time; on my third attempt I managed to chase enemies for 15 minutes before giving up. Alas, I have no ambition to become an Air Force pilot, so I intend to continue playing the game on PC. I wonder whether the locomotion problem was the reason CCP decided to close their VR studios and shift focus back to PC and mobile.

In-flight UX

As already mentioned, the in-flight experience of Valkyrie is designed to make you feel like a true pilot. The UI is almost entirely diegetic, and its elements are positioned so that they do not block your view of the battle. The only thing that did not make sense was the battle log, placed in the upper left corner of the screen. I could read it only if I looked straight at it, and I did not have time for that. In multiplayer it would be hard to follow a team chat this way; a better practice might be to move this communication to voice chat.

In game UI: battle log

There were two things I found particularly helpful. All ships were marked with an energy trail, blue for my team and yellow for the enemy, which really helped me orient myself in the complete chaos of the battle. The second was a set of red triangles guiding me towards the positions of the other players. I honestly could not pay attention to anything else; the battle was that intense.
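The guiding triangles are a classic off-screen indicator pattern. As a rough sketch (my own simplification to a single yaw axis, not the game's implementation), the logic boils down to computing the wrapped angle between where you are looking and where the target is, and showing an indicator whenever that angle exceeds half the field of view.

```typescript
// Decide whether a target needs an off-screen indicator and which way the
// guiding triangle should point, as a yaw angle from the view center.
// fovDeg is the assumed horizontal field of view.
function indicator(forwardYawDeg: number, targetYawDeg: number, fovDeg = 90) {
  // Wrap the difference into [-180, 180) so the triangle always points
  // along the shorter way around.
  const delta = ((targetYawDeg - forwardYawDeg + 540) % 360) - 180;
  return { offscreen: Math.abs(delta) > fovDeg / 2, pointAtDeg: delta };
}
```

A full implementation would of course handle pitch as well and project the indicator onto the edge of the visible area, but the wrap-to-shortest-angle step is the part people most often get wrong.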

In game UI

I loved the way character health was communicated: when your ship is hit, a red glow appears around the edges of the cockpit, then an alarm turns on, the glass starts to crack, and in the end it breaks apart. Take a look at the video by The Mighty Jingles, who seems to be much better at the game than I am.
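The escalation of cues can be modeled as a simple mapping from remaining hull to a set of layered effects. The thresholds below are my own guesses for illustration, not Valkyrie's actual tuning; the point is that each cue switches on at a different severity, so the player always knows roughly how much trouble they are in without reading a health bar.

```typescript
type DamageCue = {
  glowAlpha: number;  // intensity of the red edge glow, 0..1
  alarm: boolean;     // audible alarm
  cracked: boolean;   // glass crack overlay
  shattered: boolean; // canopy breaks apart
};

// Map remaining hull (1 = full, 0 = destroyed) to layered cockpit cues.
function damageCues(hull: number): DamageCue {
  const h = Math.max(0, Math.min(1, hull));
  return {
    glowAlpha: 1 - h,   // red glow ramps up continuously as hull drops
    alarm: h < 0.5,     // alarm kicks in at half hull
    cracked: h < 0.25,  // glass starts cracking when critical
    shattered: h <= 0,  // canopy breaks apart at zero
  };
}
```

This kind of diegetic, graduated feedback is exactly why the cockpit never needs a numeric health readout.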

Player health

One thing that made my life hard was that the controls seemed inverted: I expected the ship to turn right, but it turned left. At first I thought the problem was me, but when one of my colleagues tried the game, his experience was the same. Of course, I got used to the reversed controls after a while, but UX that makes the user feel stupid is generally bad UX.
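The usual fix is to make axis inversion a per-player setting rather than a baked-in convention. A minimal sketch of such a mapping, with an added deadzone (all names and defaults are illustrative, not the game's):

```typescript
// Normalize a raw stick axis value in [-1, 1]: apply a deadzone, then the
// player's inversion preference. Exposing invert as a setting avoids
// forcing either flight-sim or arcade conventions on everyone.
function mapAxis(raw: number, invert: boolean, deadzone = 0.1): number {
  const v = Math.abs(raw) < deadzone ? 0 : raw;
  return invert ? -v : v;
}
```

Flight-sim veterans often expect inverted pitch while newcomers do not, which is probably why neither default satisfies everyone; the setting resolves the conflict cheaply.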

Despite the inverted controls and the motion sickness, I got my first big victory on my third attempt at the game. I completed the combat arena training simulation by shooting down 5 enemies and won a capsule!

Rewarding players with loot boxes is a popular game mechanic used in many titles, including Overwatch, Counter-Strike: Global Offensive, and Dota 2. The basic idea is that by achieving targets you can win (or purchase) a secret box containing random items that enhance the game. The mechanic seems applicable outside the gaming world as well; for example, Oculus uses it in Rift Core 2.0. I am not a huge fan of loot boxes, and I think they warrant caution: the idea is based on gambling mechanics and could lead to addiction.

In conclusion, EVE Valkyrie is an example of a VR game that builds a truly immersive user experience, making you feel like a true pilot.

The game is a good example of how a UI can be designed to fit both VR and non-VR platforms, but it also highlights some VR-specific challenges. It's worth giving it a try, even if you end up playing the PC version instead. Until next time, space cowboys!

Originally published at productmanageronthego.wordpress.com on March 14, 2018.
