Building the Haptics Future through Collaboration: The MPEG Haptics Standards submission (Process Update #2)

Interhaptics · Published in Geek Culture · 9 min read · Jul 6, 2021

Where is the haptics standardization process today?

Two important milestones were achieved in October 2021, and Interhaptics is at the heart of both.

The first milestone is the upcoming recognition of haptics as a first-order media type, giving haptics the same status as audio and video in the technological landscape. This means that haptic files can be encapsulated and delivered alongside audio and video in standardized MPEG media. It is the first step toward content adoption: your streaming service will be able to carry a haptics track within the current MPEG formalism.
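As a rough illustration (not the actual MPEG signaling), the sketch below models such a container as a simple list of tracks, with a haptics track sitting next to the audio and video tracks; the handler and codec labels are placeholders of mine.

```python
# Illustrative sketch only: an MPEG-style container modeled as a list of tracks,
# with haptics as a first-order track next to audio and video.
# Handler and codec labels below are placeholders, not real MPEG signaling.
from dataclasses import dataclass

@dataclass
class Track:
    handler: str        # track type, e.g. video, audio, or a future haptics handler
    codec: str          # placeholder codec label
    bitrate_kbps: float

movie = [
    Track(handler="video",   codec="avc1",        bitrate_kbps=4500.0),
    Track(handler="audio",   codec="mp4a",        bitrate_kbps=128.0),
    Track(handler="haptics", codec="haptic-rm0",  bitrate_kbps=16.0),  # hypothetical haptics track
]

for t in movie:
    print(f"{t.handler}: {t.codec} @ {t.bitrate_kbps} kb/s")
```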

The second milestone was MPEG's issuance of the Reference Model 0 (RM0) for haptics encoding. The RM0 is the base format upon which the final haptic encoding standard will be built.

The success of the haptics standard depends on its adoption by industry players, and that largely comes down to who writes the specifications.

The RM0 was issued following a Call for Proposals from MPEG. Several companies submitted technology proposals, and a challenging selection process identified the best candidate technology for the RM0.

Among the sponsors and judges of the haptics Call for Proposals were:

- Apple

- Sony

- Immersion

- Interdigital

Apple owns the largest haptics platform in the world, the iPhone.

Sony is a leading innovator in haptics with the DualSense controller of the PS5, and a big player in the XR world with the PSVR.

Immersion is the historical major player in haptics technologies.

Interdigital is an IP and technology company with major patents and technologies in video compression and 3D scenes.

No better sponsors existed to kickstart the adoption of the MPEG standard.

Interhaptics will be the technology platform upon which the RM0 will be built.

Interhaptics demonstrated superior technology in perceptual rendering fidelity, signal compression, and platform compatibility, for both vibrotactile and kinesthetic feedback.

Our encoding standard will be wrapped in a compatibility layer based on the proposals of other contributors, to ensure future compatibility with 3D scenes and specific high-performance profiles.

Interhaptics owns IP both in the form of patents and of fundamental software that enables the implementation of these technologies in the haptics ecosystem.

Interhaptics’ proposal is the basis of the RM0 and is a game-changer:

- It is 100% compatible with Apple’s AHAP format (see the sketch after this list).

- It perfectly transcodes the Immersion IVS format, the basis of the haptics implementation of many smartphones.

- It connects with Sony’s DualSense multi-channel needs, both for force feedback (adaptive triggers) and for the wideband haptic voice coil.

- It has by far the best compression technology, paving the way for multi-actuator haptics in XR. (Nanoport partnership: https://bit.ly/3oqdZ3B) (Saarland University partnership: https://bit.ly/3bRT6rW) (Syntouch partnership: https://bit.ly/3kgqBJk)

- It is based on perceptual encoding, which can easily be expanded for extended reality.
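To give a concrete idea of the vectorial input formats mentioned above, here is a minimal AHAP-style pattern written as a Python dict, following the structure Apple documents publicly for Core Haptics; the timing and intensity values are illustrative only.

```python
import json

# Minimal AHAP-style pattern (structure as documented by Apple for Core Haptics):
# one sharp transient "tap" at t=0, then a softer continuous buzz of 0.3 s.
# Values are illustrative only.
ahap_pattern = {
    "Version": 1.0,
    "Pattern": [
        {
            "Event": {
                "Time": 0.0,
                "EventType": "HapticTransient",
                "EventParameters": [
                    {"ParameterID": "HapticIntensity", "ParameterValue": 1.0},
                    {"ParameterID": "HapticSharpness", "ParameterValue": 0.8},
                ],
            }
        },
        {
            "Event": {
                "Time": 0.05,
                "EventType": "HapticContinuous",
                "EventDuration": 0.3,
                "EventParameters": [
                    {"ParameterID": "HapticIntensity", "ParameterValue": 0.4},
                    {"ParameterID": "HapticSharpness", "ParameterValue": 0.2},
                ],
            }
        },
    ],
}

print(json.dumps(ahap_pattern, indent=2))  # AHAP files are plain JSON on disk
```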

What are the next steps?

Interhaptics and the other contributors to the RM0 will provide a working software reference model for MPEG 137, scheduled for January 2022. The model will undergo several rigorous tests during the Core Experiment phase to ensure its quality and reliability, with a first commercial version of the standard to follow shortly after.

Current MPEG Phase 1: Submission Material

The submission consisted of the challenge of encoding and compressing, in our Interhaptics Haptic Material format, three different types of wideband signals:

  • Short Vibrations (< 100 ms)
  • Long Vibrations (> 100 ms)
  • Forces in 3D

The challenges of the call for proposals lie in the high compression rates necessary to meet the validation requirements and in the formats of the haptic signals to be digested by the encoding process.

The signals to digest were of multiple kinds, some of them vectorial and some of them signal-based:

  • AHAP from Apple
  • IVS from Immersion
  • PCM files from Interdigital

A specific challenge was posed by the PCM encoding: the starting signal was an 8 kHz, 16-bit stream, which needed to be compressed to:

  • 64 kb/s
  • 16 kb/s
  • 2 kb/s

The 2 kb/s target requires a compression ratio of 64 (the 8 kHz, 16-bit source is 128 kb/s), which means 64 times less memory consumption than the original!
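The arithmetic behind these targets is straightforward; the short check below reproduces the compression ratios, assuming a mono stream as implied by the stated factor of 64.

```python
# Source PCM stream from the call: 8 kHz sample rate, 16 bits per sample, mono (assumed).
sample_rate_hz = 8_000
bits_per_sample = 16
source_kbps = sample_rate_hz * bits_per_sample / 1_000  # 128 kb/s

for target_kbps in (64, 16, 2):
    ratio = source_kbps / target_kbps
    print(f"{target_kbps:>2} kb/s  ->  compression ratio {ratio:g}x")
# 64 kb/s -> 2x, 16 kb/s -> 8x, 2 kb/s -> 64x (i.e. 64 times less memory)
```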

Interhaptics delivered the totality of the encoding requirements with 108 encoded files, meeting all the compression requirements of the call for proposals.

The submitted encoding proposals were evaluated by a pool of haptics experts, who provided an unbiased report on the quality of each encoding format to the call proponents. The MPEG experts then gathered the submissions and selected the most promising characteristics of each proposal to synthesize the final standard.

The standardization process is long, and it will take a few years before the standard becomes operative on the market.

Next step: MPEG meeting to establish the RM0

The MPEG experts have selected the most promising characteristics of each proposal. Interdigital, Nanoport Technologies, Lofelt, and Interhaptics will have a new meeting with MPEG. The goal of this meeting is to establish the RM0, which stands for Reference Model 0. Each participant will have to argue for their submitted material.

At this meeting, MPEG will present its previous test results, and each proposal will be reviewed with its strengths and weaknesses. A brainstorming session based on the test results will then establish the RM0 from among the four proposals; some proposals may be merged into one if their features are complementary.

Following the establishment of the RM0, each participant will have a 6-month process in which to evaluate and improve the RM0.

The implementation of accepted standards in a fragmented market has always accompanied the growth and consolidation of an industry. This has happened multiple times throughout the history of technology, with large impacts on the economics of the industries involved. One example is the implementation of the MP3 coding format standard, which allowed the widespread adoption of digital audio players such as the iPod. This event cascaded into today's predominance of digital audio streaming services such as Spotify. None of it would have been possible without an agreed, common standard allowing the interoperability of encoded audio signals between devices of different manufacturers.

The implementation of these standards is led by different organizations. The one responsible for the MP3 and the upcoming digital haptic encoding standard is MPEG.

What is MPEG?

MPEG, the Moving Picture Experts Group, “is an alliance of working groups of ISO and IEC that sets standards for media coding, including compression coding of audio, video, graphics, and genomic data, and transmission and file formats for various applications”. MPEG helps industry partners agree upon shared encoding formats and manages the licensing of relevant patent pools to enable third-party software producers to include the enhanced functionalities.

Thanks to MPEG, you can listen to Spotify on your Amazon Fire TV, because all these services agreed on a common musical language.

The Need for Haptics Standards

The haptics market is fragmented into silos created by the wildly different technologies that form the haptics ecosystem. Take vibrations as an example: the implemented technologies range from vibrations generated by Eccentric Rotating Masses (ERMs) to refined wideband actuators like Apple’s Taptic Engine or the one included in the newest PlayStation 5 DualSense controller. ERM and wideband actuators accept and execute signals that are fundamentally different; they speak different languages.

Source: PlayStation 5 DualSense Controllers
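To make the “different languages” point concrete, here is a hedged sketch expressing one and the same abstract effect (a 200 ms buzz at half strength) in the two vocabularies: an ERM essentially understands a single drive level held for a duration, while a wideband actuator expects a sampled waveform. All names are hypothetical, not a real driver API.

```python
# Illustration of why ERM and wideband actuators "speak different languages".
# All names below are hypothetical; real driver APIs differ per platform.
import math

# One abstract effect: 200 ms buzz at 50 % perceived strength.
duration_s, strength = 0.2, 0.5

# ERM vocabulary: a single drive level held for a duration (no frequency control).
erm_command = {"drive_level": strength, "duration_ms": int(duration_s * 1000)}

# Wideband vocabulary: a sampled waveform, e.g. a 150 Hz sine at the target amplitude.
sample_rate_hz, carrier_hz = 8_000, 150
wideband_waveform = [
    strength * math.sin(2 * math.pi * carrier_hz * n / sample_rate_hz)
    for n in range(int(duration_s * sample_rate_hz))
]

print(erm_command)                         # {'drive_level': 0.5, 'duration_ms': 200}
print(len(wideband_waveform), "samples")   # 1600 samples
```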

In practice, this means that if you build a game for Apple’s Taptic Engine through Core Haptics, you will have to rebuild your entire haptics implementation for a controller like the DualShock. As a result, game developers often simply do not invest in great haptics because of the higher cost of implementing them.

This is why the ecosystem needs a haptics standard: to tie together all the different languages that each actuator speaks and to allow content creators to build once for haptics and deploy on all the devices implementing these technologies.

MPEG Phase 1: Organizers and Participants

The first Call for Proposals to standardize vibrotactile haptics, issued by MPEG, is open, and proponents can submit their proposals until the 5th of July 2021.

The call was proposed by Immersion Corp with the support of other organizations such as Apple. The original encoding submissions for this first round came from Interdigital, Nanoport Technologies, Lofelt, and Interhaptics.


The Importance for Interhaptics

The mission of Interhaptics is to enable creators to build wonderful haptics content in no time. The ability to deploy their creations on all the available haptics interfaces is part of the execution of this vision.

We are aware of the challenges of a fragmented ecosystem, and for this reason we began the development of Interhaptics back in 2016 by creating a haptics encoding format, our Haptic Materials, which could encompass the differences between the various haptics technologies. Haptic Materials can effectively store not only vibrotactile haptics but also force feedback, spatial texture, and temperature changes. Being able to influence the haptics standard through the submission of a part of our Haptic Material encoding will help the Interhaptics ecosystem be compliant with the upcoming standard even before it is accepted.
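As an illustration of what such a multi-modal container has to hold, here is a sketch of a possible structure in Python; the field names and layout are guesses for illustration, not the actual Interhaptics Haptic Material format.

```python
# Hypothetical sketch of a multi-modal haptic material container.
# Field names and layout are illustrative guesses, not the actual
# Interhaptics Haptic Material format.
from dataclasses import dataclass, field
from typing import List, Tuple

Keyframe = Tuple[float, float]  # (time in seconds or position 0..1, value 0..1)

@dataclass
class HapticMaterial:
    vibration: List[Keyframe] = field(default_factory=list)    # vibrotactile amplitude over time
    stiffness: List[Keyframe] = field(default_factory=list)    # force feedback over penetration depth
    texture: List[Keyframe] = field(default_factory=list)      # spatial texture over surface position
    temperature: List[Keyframe] = field(default_factory=list)  # temperature change over time

rubber_button = HapticMaterial(
    vibration=[(0.0, 0.0), (0.02, 1.0), (0.1, 0.0)],  # short click
    stiffness=[(0.0, 0.2), (1.0, 0.9)],               # gets firmer as it is pressed
)
print(rubber_button)
```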

The MPEG challenge allowed Interhaptics to develop key technologies in the area of high-compression haptics encoding, as well as wideband haptics compatibility with different formats. One in particular is our perceptually lossless original encoding format, which allows extreme compression of wideband signals without any perceived information loss.

Creators looking to implement haptics in their software will benefit directly from this activity in the short term. The developed compression technologies will be available in the upcoming Haptic Composer 2.0, allowing creators to import sound into the Haptic Composer.

The designer’s workflow will include the ability to import a sound file and translate it into haptics, add their own haptic touches thanks to the composer, and export the result in their format of choice. The supported formats will include AHAP from Apple, WAV for sound-to-haptics conversion, and the Interhaptics Haptic Material format, which gives the best quality of encoded signal for native integration on the supported platforms.
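To illustrate the sound-to-haptics step of that workflow, here is a deliberately simple sketch: an amplitude envelope extracted from a WAV file and turned into vibration keyframes. It is a simplification of mine, not the algorithm used in Haptic Composer, and the file name is a placeholder.

```python
# Rough sound-to-haptics sketch: extract an amplitude envelope from a mono
# 16-bit WAV and turn it into vibration keyframes (time, intensity 0..1).
# This is a simplification for illustration, not Haptic Composer's algorithm.
import wave
import numpy as np

def wav_to_vibration_keyframes(path: str, window_ms: float = 10.0):
    with wave.open(path, "rb") as wav:
        rate = wav.getframerate()
        samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
    samples = samples.astype(np.float32) / 32768.0              # normalize to [-1, 1]
    window = max(1, int(rate * window_ms / 1000.0))
    keyframes = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        rms = float(np.sqrt(np.mean(chunk ** 2)))                # loudness of this window
        keyframes.append((start / rate, min(1.0, rms * 3.0)))    # crude gain, clamped to 1
    return keyframes

# Placeholder file name; any short mono 16-bit WAV will do.
print(wav_to_vibration_keyframes("tap.wav")[:5])
```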

What does it mean for haptics creators and designers?

The upcoming standards will help the haptics ecosystem grow. One of the key drivers for growth is the ability to create efficient and effective haptics content. The Haptic Composer from Interhaptics is a tool addressed to designers and developers for creating high-definition haptics content. Its future compatibility with the MPEG-proposed haptics standard ensures that the investment you are making today in our content creation toolchain will be ready for the innovations coming to the haptics market tomorrow.
