Part 2: Metaverse Building Blocks — Hardware

Aaron Farr
9 min read · Oct 20, 2022


This is Part 2 of Agya’s series on Metaverse Building Blocks.

In this edition we discuss the importance of hardware in the Metaverse.

Let’s dive in.

Metaverse hardware encompasses the physical technologies used to experience, interact with, and engage in the Metaverse. Most of this hardware is still in its nascent stages and will improve drastically over the next few years; experiencing the Metaverse may look very different in the future than it does today.

Hardware today varies significantly by use case: hardware for gaming involves high-powered consoles like the PS5, while physicians use enterprise-grade AR headsets from the likes of Magic Leap. Hardware also varies by end user. Hardware designed primarily for consumers typically prioritizes comfort, flexibility, portability, and design, while hardware built for enterprise users prioritizes output and performance above all else.

The following subsectors give a sense of what is available today for accessing early forms of the Metaverse: gaming consoles, virtual reality, augmented reality, mixed reality, and emerging hardware.

A. Gaming Consoles

Gaming consoles have been the primary on-ramp to early Metaverse experiences for many consumers. Unlike other consumer devices, gaming consoles have the computing and streaming capabilities designed to deliver immersive, low-latency graphics, enabling 2D Metaverse experiences. Metaverse games and platforms such as Axie Infinity, Fortnite, Roblox, and titles from Mythical Games rely not only on advanced graphics processing units (GPUs) but also on the high-speed streaming capabilities that enable simultaneous social engagement and monetization. Two of the most common consumer gaming consoles that have enabled early Metaverse experiences are Sony’s PlayStation and Microsoft’s Xbox. Traditional gaming consoles also exist alongside other hardware such as gaming PCs. Gaming PCs differ from consumer PCs in that their primary purpose is to run graphically intensive games at higher quality; this calls for high-bandwidth components, large amounts of memory for local data storage, cooling and airflow to avoid overheating, improved graphics, and more.

B. Virtual Reality

Virtual reality (VR) is the technology most commonly associated with the Metaverse due to its fully immersive nature. The hardware in this space centers on VR headsets, which enable users to experience virtual worlds through sight, sound, and movement. Around 80% of VR headset sales in Q4 2021 were accounted for by the Meta Quest 2, a consumer-grade headset that provides access to VR content at a relatively low cost ($300 USD).

While the Meta Quest 2 has been successful in introducing basic VR applications to consumers, VR headsets still have a long way to go to achieve mainstream adoption. Meta Quest 2 users still frequently complain of nausea and motion sickness, primarily due to relatively high latency and a low refresh rate. The Meta Quest 2 (when not connected to a gaming PC) has a refresh rate of 90 Hz, while the minimum threshold to avoid nausea is around 120 Hz. Several upcoming devices are expected to improve on the Meta Quest 2. Apple has hinted at an AR/VR headset to be released in early 2023. Meta is also reportedly developing a premium headset, dubbed Project Cambria, which should offer several hardware improvements, alongside its retinal-resolution display prototype, Butterscotch. With these next iterations, headsets will continue to improve the immersive experiences necessary for the Metaverse.
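To make the refresh-rate point concrete, here is a rough back-of-the-envelope sketch in TypeScript of how refresh rate translates into the time a headset has to present each frame. The 90 Hz and 120 Hz figures come from the paragraph above, 60 Hz is the familiar desktop-monitor baseline, and the function name is ours for illustration.

```typescript
// Back-of-the-envelope frame-time arithmetic: a display refreshing at
// `hz` frames per second has 1000 / hz milliseconds to present each frame.
function frameBudgetMs(refreshRateHz: number): number {
  return 1000 / refreshRateHz;
}

for (const hz of [60, 90, 120]) {
  console.log(`${hz} Hz -> ${frameBudgetMs(hz).toFixed(1)} ms per frame`);
}
// 60 Hz  -> 16.7 ms per frame (typical desktop monitor)
// 90 Hz  -> 11.1 ms per frame (Meta Quest 2, standalone)
// 120 Hz ->  8.3 ms per frame (the threshold cited above)
```

A faster panel is only part of the picture, since end-to-end motion-to-photon latency also depends on tracking and rendering, but shrinking the per-frame window is a prerequisite for the smoother experiences described above.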

Horizon Workrooms is one of Meta’s VR experiences made for virtual work and collaboration

C. Augmented Reality

Augmented reality (AR) headsets allow users to engage primarily with the physical world around them, with a superimposed digital layer to supplement the experience. Most AR headsets on the market today take on a similar shape and design to VR headsets, but with a sleeker, less bulky look.

One of the first mainstream, consumer-grade AR headsets was Google Glass, first announced in 2012 and, after receiving a great deal of criticism, discontinued in 2015. Despite being one of the first widely available AR headsets, it failed because of a clear lack of compelling use cases: there were few things users would have preferred to do on Glass rather than on their smartphones. Since the consumer product’s discontinuation, Google has released two subsequent editions of Google Glass Enterprise, which address the use-case challenges of the original. Most mainstream AR headsets today target enterprise use cases including healthcare, manufacturing, maintenance, construction, and education. Microsoft’s HoloLens 2 and the Magic Leap 2, both competitors to the Google Glass Enterprise 2, also provide enterprise users with a range of features including immersive eye and hand tracking, voice commands, video collaboration, enterprise apps, and more.

For AR to become mainstream among consumers, there must be clear reasons to engage with AR content rather than alternatives such as a smartphone or smartwatch. While Apple’s VR headset is rumored for release in early 2023 as a likely competitor to the Meta Quest 2, an AR-specific headset from Apple is expected to come later down the line. Tim Cook has stated that VR and AR are both “incredibly interesting”, but that his own view is that “augmented reality is the larger of the two, probably by far”.

The Magic Leap 2 enterprise AR headset is set to be released in September 2022
Microsoft’s HoloLens 2 is designed primarily for enterprise use cases across a variety of industries

D. Mixed Reality

Mixed reality (MR) hardware is physical technology that integrates aspects of both VR and AR, allowing the user to manipulate the digital layers they experience with physical actions. An example would be writing on a virtual whiteboard with a physical marker in hand, or tightening a virtual screw with a physical wrench. There are currently no mainstream MR-specific headsets on the market, though several AR and VR headsets allow for MR engagement. The Meta Quest 2 and HoloLens 2 offer mixed reality experiences in which users can engage with digital applications by moving their hands or fingers in specific ways, without the use of controllers. Like AR, MR has immense potential across a wide range of industries.

E. Emerging Hardware

While the most well-known hardware for the Metaverse today is the AR/VR headset, a whole range of emerging hardware will also provide additional layers of experience. These devices deepen immersion in the Metaverse by engaging senses beyond sight and sound. The following areas exist primarily for enterprise use today: haptic wearables, haptic non-wearable devices, olfactory devices, and 3D imaging and display.

a. Haptic Wearables

Haptic wearables are devices that, through kinaesthetic feedback to the body, supplement VR, AR, or MR to provide more immersive virtual experiences. They mimic the feeling of touch by applying forces, vibrations, or motions to the user wearing the device. Some of the most common haptic wearables are full-body suits, vests, and gloves. Teslasuit is one of the leading haptic suits on the market. The full-body suit includes over 14 inertial measurement unit (IMU) sensors to identify, track, and record the user’s body movements, while its haptic points provide feedback to the user. In addition to haptic feedback, the suit also captures biometrics such as pulse and oxygen saturation. Other emerging companies in the space include Actronika (haptic vest) and SenseGlove (haptic gloves). Meta has also been developing its own haptic wearables; the company stated in November 2021 that “the team is developing haptic gloves: comfortable and customizable gloves that can reproduce a range of sensations in virtual worlds, including texture, pressure and vibration”.
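As a purely illustrative aside, and not a description of Teslasuit’s actual implementation, the TypeScript sketch below shows one common way a single IMU’s gyroscope and accelerometer readings can be fused into an orientation estimate using a complementary filter; every name and number here is hypothetical, and a real suit would fuse many IMUs placed across the body.

```typescript
// Hypothetical illustration: fusing one IMU's gyroscope and accelerometer
// readings into a pitch-angle estimate with a complementary filter.

interface ImuSample {
  gyroPitchRateDegPerSec: number; // angular velocity around the pitch axis
  accelPitchDeg: number;          // pitch inferred from the gravity vector
}

function fusePitch(
  previousPitchDeg: number,
  sample: ImuSample,
  dtSec: number,
  alpha = 0.98, // trust the gyro short-term, the accelerometer long-term
): number {
  // Integrate the gyro rate, then gently correct drift toward the
  // accelerometer's absolute (but noisy) estimate.
  const gyroEstimate = previousPitchDeg + sample.gyroPitchRateDegPerSec * dtSec;
  return alpha * gyroEstimate + (1 - alpha) * sample.accelPitchDeg;
}

// Example: three samples arriving at 100 Hz (dt = 0.01 s).
let pitch = 0;
for (const sample of [
  { gyroPitchRateDegPerSec: 30, accelPitchDeg: 0.4 },
  { gyroPitchRateDegPerSec: 28, accelPitchDeg: 0.7 },
  { gyroPitchRateDegPerSec: 25, accelPitchDeg: 1.0 },
]) {
  pitch = fusePitch(pitch, sample, 0.01);
}
console.log(`estimated pitch: ${pitch.toFixed(2)} degrees`);
```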

Haptic wearables today are designed almost exclusively for enterprise use and are typically not as affordable as consumer headsets. The enterprise use cases span a broad range of industries: in a virtual military training exercise, for example, a user might experience physical feedback to the body while carrying out training tasks, while a construction worker might feel the weight or recoil of heavy machinery or materials. As wearables continue to improve, use cases will expand, opening up new possibilities for VR training, learning, and value creation.

The Teslasuit includes 68 haptic points capable of simulating a wide variety of physical sensations, from the subtle trickle of water to the intense G-force of a jet fighter engine
Meta is in the process of developing its own haptic gloves, which utilize small motors and air pressure to provide tactile feedback

b. Haptic Non-wearable Devices

Beyond haptic wearables, there are non-wearable devices that also provide the user with physical feedback in different ways. The most basic and well-understood example is a controller paired with an AR/VR headset. The Meta Quest 2 comes with a controller for each hand that provides basic haptic feedback in the form of vibrations. A more advanced enterprise example is Touch, the haptic device produced by 3D Systems, which gives industrial and healthcare professionals the means to design and 3D print models and artificial replicas for their clients.
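For a sense of how basic controller vibration is driven in practice, here is a minimal TypeScript sketch using the WebXR and Gamepad haptic-actuator APIs, one way browser-based experiences trigger rumble on headsets such as the Quest 2. The function name and intensity values are ours, the sketch assumes WebXR type definitions are available, and actuator support varies by browser and runtime.

```typescript
// Minimal sketch: pulsing the vibration motor in each tracked controller
// during a WebXR session. The haptic actuator may be absent, and TypeScript
// lib typings for it vary, hence the defensive cast and optional chaining.

function pulseControllers(session: XRSession, intensity = 0.8, durationMs = 100): void {
  for (const source of session.inputSources) {
    // pulse(intensity 0..1, duration in ms) is the Gamepad-extension call
    // documented for WebXR controllers; support varies by browser/runtime.
    const actuator = (source.gamepad as any)?.hapticActuators?.[0];
    actuator?.pulse?.(intensity, durationMs);
  }
}

// e.g. call pulseControllers(session) when a virtual hand touches an object,
// so the contact is felt as well as seen.
```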

A second example beyond AR/VR controllers is the haptic feedback panel, such as the Emerge Wave 1. The panel emits ultrasonic waves that let users feel virtual objects and sensations in mid-air, so they can engage with virtual objects and environments in new ways without physical controllers. Despite the nascency of these panels, one could imagine much larger panels surrounding a user in AR/VR to provide full-body ultrasonic feedback.

The Emerge Wave 1 emits ultrasonic waves that let users feel virtual objects without the need for wearable devices

c. Olfactory Devices

Olfactory devices have also recently emerged as promising Metaverse technologies. These devices attach to the bottom of AR/VR headsets (closest to the nose) and emit scents that stimulate users’ olfactory receptors. One of the promising players in the space is OVR Technology, whose device produces nano-particles of scent that activate in millisecond increments.

OVR Technology’s industry applications are aimed primarily at preventing negative health outcomes and financial losses in high-risk industries such as defense, firefighting, oil & gas, and aviation. More immersive training that incorporates scent would improve trainees’ recognition of dangerous smells, resulting in better responses.

OVR Technology utilizes the science behind scent memory to improve performance across various industries

d. 3D Imaging and Display

3D imaging combines capturing live video, compressing it, and generating a 3D-rendered display to view. When viewed through a 3D display, physical objects or people appear three-dimensional even though the underlying imagery is 2D. Google’s Project Starline has pioneered this technology with a hardware-based booth that allows individuals to have video conversations that feel face-to-face. These interactions are made possible through a combination of 3D sensors and cameras for capture, and a fabric-based, multi-dimensional light-field display to generate the output.

This form of AR is unique in that it doesn’t involve any wearable technology; it works purely through sensors and light-field displays. In the short term, enterprise use cases for this technology will likely center on office work and collaboration. In the long term, however, it could serve a variety of use cases requiring improved imagery, such as education, healthcare, construction, industry, and more.

Google’s Project Starline uses light field displays and loads of data to create hologram-like communications
Google employees have used Starline for “thousands of hours” to “onboard, interview and meet new teammates, pitch ideas to colleagues and engage in one-on-one collaboration”

Should you have any questions or would like to discuss more, please don’t hesitate to contact Aaron Farr (aaron@agyaventures.com).
