Announcing the Wit-Unity Plug-In for Building Voice-Enabled AR and VR Experiences

Pan Wangperawong
Published in Wit.ai
Jul 15, 2021

At the F8 2021 Wit.ai Hackathon, we challenged developers to build voice-enabled AR and VR experiences. We received many submissions that integrated voice in creative ways, such as interactive storytelling, guided dance lessons, and voice-based game controls (see the announcement post for the winning hackathon projects).

To continue supporting AR and VR developers who want to build immersive voice experiences, we are happy to announce that the Wit-Unity plug-in is now available. With the plug-in, you can easily integrate voice into your Unity app, letting you focus on the creative and functional aspects of your experience rather than on configuration. All of the code is open-sourced and available on GitHub. To get started, visit https://github.com/wit-ai/wit-unity.
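To give a sense of the plumbing the plug-in takes care of, here is a minimal sketch (not the plug-in's own API) of sending a transcribed utterance to Wit.ai's /message endpoint from a Unity script. The class name and the serialized token field are placeholders; use your own Wit.ai app's token.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Minimal sketch: send a text utterance to Wit.ai's /message endpoint
// with UnityWebRequest. The plug-in wraps this kind of request (plus
// microphone capture and audio streaming) so you don't write it by hand.
public class WitMessageExample : MonoBehaviour
{
    // Placeholder -- set this to your own Wit.ai app token.
    [SerializeField] private string serverAccessToken = "YOUR_WIT_TOKEN";

    public void SendUtterance(string utterance)
    {
        StartCoroutine(QueryWit(utterance));
    }

    private IEnumerator QueryWit(string utterance)
    {
        string url = "https://api.wit.ai/message?v=20210715&q="
                     + UnityWebRequest.EscapeURL(utterance);

        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            request.SetRequestHeader("Authorization", "Bearer " + serverAccessToken);
            yield return request.SendWebRequest();

            if (request.result == UnityWebRequest.Result.Success)
            {
                // Raw JSON containing the resolved intents and entities.
                Debug.Log(request.downloadHandler.text);
            }
            else
            {
                Debug.LogError("Wit.ai request failed: " + request.error);
            }
        }
    }
}
```

In practice the plug-in handles the voice capture and request handling for you, so your code mostly reacts to the parsed response.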

The repository also includes a voice-enabled VR sample app where users can change the color of 3D shapes through voice commands. Accompanying it is an in-depth tutorial with step-by-step instructions.
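As a rough illustration of what the sample's command handling boils down to, here is a sketch of a component that tints a shape once a color name has been extracted from the Wit.ai response. Passing the color as a plain string and the method name are assumptions made here for illustration; the tutorial walks through the exact wiring.

```csharp
using UnityEngine;

// Sketch of a color-change handler: given a color name resolved by Wit.ai
// (for example from a color entity -- the entity setup is covered in the
// tutorial), tint the shape this script is attached to.
public class ShapeColorChanger : MonoBehaviour
{
    private Renderer shapeRenderer;

    private void Awake()
    {
        shapeRenderer = GetComponent<Renderer>();
    }

    // Call this from the voice response callback with the resolved color value.
    public void SetColor(string colorName)
    {
        // ColorUtility understands common color names ("red", "blue", ...)
        // as well as hex strings like "#00FF00".
        if (ColorUtility.TryParseHtmlString(colorName.ToLower(), out Color color))
        {
            shapeRenderer.material.color = color;
        }
        else
        {
            Debug.LogWarning("Unrecognized color: " + colorName);
        }
    }
}
```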

Wit-Unity 3D shapes demo

For more information about integrating voice into your AR and VR apps, watch our F8 session, where we discuss how developers are using Wit.ai to provide conversational user experiences in different ways.

F8 2021 — Wit.ai: Take your customer experience to the next level with voice and natural language

Keep in mind that in augmented and virtual reality environments, the laws of physics no longer apply, so don’t feel constrained by the limitations of physical reality. We can’t wait to see what you will build with the Wit-Unity plug-in!
