Motion Sense (Project Soli), Litho, Google Home and Augmented Reality

Rajat Kumar Gupta
Published in Voice Tech Podcast
Feb 24, 2020 · 4 min read

Today I came across an interesting project called Soli by Google.

What is Project Soli?

It launched back in 2015, one year before I became a fresher. Check out the video👇🏼

Project Soli by Google

After watching that video I asked my assistant, “Hey Google, what happened to this project?” I assumed it was dead until I researched a little more and came across the Motion Sense technology in Google Pixel 4 phones. In layman’s terms, Motion Sense is gesture technology that lets you control your phone’s basic apps.

So I tweeted this👇🏼

Let’s understand how effective this can be -

Two months ago, Google announced the ARCore Depth API. They had put out a call-for-collaborators form for the Depth API, but it has since stopped accepting responses. My prediction is that they will release the API to all developers at this year’s Google I/O, you know, after testing it out with a few close partners.
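For the developers reading, here is a minimal Kotlin sketch of how I imagine the Depth API will be used once it goes public. I am going off the announcement material here, so treat the exact names (`Config.DepthMode.AUTOMATIC`, `acquireDepthImage`) as my assumption until the SDK actually ships:

```kotlin
// A sketch based on the announced ARCore Depth API - details may differ at release.
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

fun enableDepth(session: Session) {
    val config = session.config
    // Depth needs supported hardware, so check before opting in.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

fun readDepth(frame: Frame) {
    try {
        // Each pixel of the depth image is a distance from the camera in millimetres.
        val depthImage: Image = frame.acquireDepthImage()
        // ... feed depthImage into occlusion, physics, etc. ...
        depthImage.close() // always release the image back to ARCore
    } catch (e: NotYetAvailableException) {
        // Depth is computed asynchronously; early frames may not have it yet.
    }
}
```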

So the Depth API announcement and the frequent updates to the ARCore API show Google’s slow and steady progress in AR. I am not sure whether Apple iPhones have gesture technology, but their ARKit SDK is better than ARCore; one reason is its body-tracking feature. However, I have not seen any gesture-controlled AR experience built with ARKit. So the ARCore SDK and the Pixel 4’s Motion Sense technology seem like a powerful combination for the future of AR, one that could produce results like this -

Credits: https://litho.cc

Looks awesome, right? Did you notice something in his hand? That device is called Litho. If you don’t know about it, check out my video on Litho, and trust me when I say this: it’s a great tool to experiment with for all you AR developers. So order one today.

Let’s compare Litho and Motion Sense in the Pixel 4, and dive into how the two actually differ on the basis of interaction design.


Motion Sense & Litho

Motion Sense is about close-range gestures, i.e. your hand and your phone have to be pretty close for you to operate apps with gestures. Litho, on the other hand, works via Bluetooth and has a much better range. Moreover, the Litho device would be great for AR glasses or headsets. Why? Sure, you can make gestures like the air tap they show in this Microsoft HoloLens video, but I have a feeling those gestures will lack accuracy. This is where the Litho controller comes in. Its accuracy will prevent accidental or unexpected interactions with virtual objects and simply improve the overall UX, i.e. user experience.
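To make that UX argument concrete, here is a purely hypothetical Kotlin sketch - neither Motion Sense nor Litho exposes this API, and the `GestureEvent` and `InteractionFilter` names are mine. The idea is simply that a free-hand gesture should clear a confidence threshold before it touches a virtual object, while a physical controller press can be trusted outright:

```kotlin
// Hypothetical interaction-design sketch, not a real Motion Sense or Litho API.
sealed class GestureEvent {
    data class AirTap(val confidence: Float) : GestureEvent() // recognizer's best guess
    object ControllerTap : GestureEvent()                     // deliberate button press
}

class InteractionFilter(private val minConfidence: Float = 0.85f) {
    fun shouldTrigger(event: GestureEvent): Boolean = when (event) {
        // Free-hand gestures are noisy: drop anything the recognizer is unsure of,
        // so virtual objects don't get poked by accident.
        is GestureEvent.AirTap -> event.confidence >= minConfidence
        // A controller press is unambiguous, so always act on it.
        GestureEvent.ControllerTap -> true
    }
}

fun main() {
    val filter = InteractionFilter()
    println(filter.shouldTrigger(GestureEvent.AirTap(confidence = 0.6f))) // false
    println(filter.shouldTrigger(GestureEvent.ControllerTap))             // true
}
```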

Alright, so where will Motion Sense help? It will help with interactions in basic apps, sparing you the thumb pain of swiping across your phone’s touch screen all day. [Wow, now that I think about it, that would open up a whole new world of UX opportunities.] I also feel Motion Sense will let you control virtual objects in a much more natural fashion than the phone’s touch screen. Google’s design guidelines for AR say you should encourage user movement, but me being one of those lazy users, I think Motion Sense will help me interact with objects without actually walking close to them - see the sketch below. This is one use case I can think of.
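Here is what I mean, as a tiny sketch. The node math (`Quaternion.axisAngle`, `localRotation`) comes from Google’s Sceneform library for ARCore, but Motion Sense has no public SDK, so the `onSwipe` callback is entirely my invention:

```kotlin
// Hypothetical: rotate a placed AR object from across the room instead of walking
// up to it. Sceneform's node API is real; the swipe source is assumed.
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Quaternion
import com.google.ar.sceneform.math.Vector3

enum class SwipeDirection { LEFT, RIGHT }

fun onSwipe(direction: SwipeDirection, selected: Node) {
    val degrees = if (direction == SwipeDirection.RIGHT) 15f else -15f
    // Spin the object around the vertical axis by 15 degrees per swipe.
    val delta = Quaternion.axisAngle(Vector3.up(), degrees)
    selected.localRotation = Quaternion.multiply(selected.localRotation, delta)
}
```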

Project Soli in Android TV

Another use case would be integrating Project Soli into Android TV so that I don’t have to type with a remote the next time I browse YouTube videos. The number of times I have to press a button just to reach the video I want is torture. Don’t even get me started on the lag. However, I still think the range might be a concern.
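If Google ever did this, I imagine the integration could be as simple as mapping in-air swipes onto the D-pad events Android TV already understands. To be clear, there is no Soli API for Android TV; the `KeyEvent` codes below are real Android constants, but the `SoliSwipe` type is made up for illustration:

```kotlin
// Hypothetical: translate a radar-detected swipe into a standard D-pad key code.
import android.view.KeyEvent

enum class SoliSwipe { LEFT, RIGHT, UP, DOWN }

fun toKeyCode(swipe: SoliSwipe): Int = when (swipe) {
    SoliSwipe.LEFT  -> KeyEvent.KEYCODE_DPAD_LEFT
    SoliSwipe.RIGHT -> KeyEvent.KEYCODE_DPAD_RIGHT
    SoliSwipe.UP    -> KeyEvent.KEYCODE_DPAD_UP
    SoliSwipe.DOWN  -> KeyEvent.KEYCODE_DPAD_DOWN
}
```

An app (or the system) could then wrap that code in a `KeyEvent` and hand it to the existing `dispatchKeyEvent` path, so the TV UI would never know a remote wasn’t involved.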

Google Home and Augmented Reality

What if bulky hardware like the television could be condensed into the Google Home, which pops up an interface (check out my conceptual design below for reference) when you point your phone towards it? Or what if you could see such an interface on demand, i.e. put on your AR glasses and view the interface whenever you need or want to? Removing the television would actually free up space for something else important in your living room. What if it became capable of running even more powerful applications, the kind that run on a computer?

Above is a concept I designed a few months ago of what a Google Home with a visual interface could look like. You can view it here on Uplabs. You may also relate it to the following real-world scenario -

  1. Think of a song. You cannot remember the name of the song’s artist.
  2. You point your phone towards the assistant and get an interface with all the details of the song as well as the artist.
  3. Sure, you could have just asked the assistant for the artist in the first place, but what if the song is already playing and you don’t want to interrupt it?
  4. In this case, Project Soli or Motion Sense would shape the user interaction for browsing songs or navigating around the augmented reality interface, provided the user is viewing the AR experience on a phone - see the sketch after this list.
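For step 2, ARCore’s Augmented Images API could plausibly handle the recognition part today. To be clear, spotting a Google Home from a reference photo is my assumption rather than a shipped feature, and the “now playing” card itself is left out of this sketch:

```kotlin
// Speculative sketch of step 2, built on ARCore's real Augmented Images API.
import android.graphics.Bitmap
import com.google.ar.core.AugmentedImage
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun configureRecognition(session: Session, speakerPhoto: Bitmap) {
    val config = session.config
    val database = AugmentedImageDatabase(session)
    database.addImage("google_home", speakerPhoto) // reference photo of the speaker
    config.augmentedImageDatabase = database
    session.configure(config)
}

fun onFrame(frame: Frame) {
    for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
        if (image.trackingState == TrackingState.TRACKING && image.name == "google_home") {
            // Anchor the song-details card just above the recognized speaker;
            // Soli / Motion Sense gestures would then drive the navigation.
            val anchor = image.createAnchor(image.centerPose)
            // ... attach the "now playing" UI to this anchor ...
        }
    }
}
```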

Do you think this will be the future of the intersection of Voice Interaction and Augmented Reality?

All I can say is “Okay Google (ahem ahem, Assistant), what are you thinking?”
