Integrating Gesture Controls into your Applications — Kai SDK

Sanskar Biswal
Jul 27 · 5 min read
Fig. 1 Kai Gesture Controlled Applications

It has been a little over seven decades since the first computers were invented. At the time, they were bulky and expensive, but their spread was limited by another major factor: user interactivity. Scientists working on those computers literally had to punch holes into boards to give binary instructions. From this crude, cumbersome and rather humble origin, our computers, and the ways we interact with them, have come a long way.

Gesture — The Future of UI

The human brain is hardwired for the three-dimensional real world. Interacting with anything in flat 2D is not natural for us, especially not with technology that we use so frequently.

To understand this better, we need to separate the concepts of a computer and its user interface. A computer is essentially a processing device, a module that can do a lot of calculations very fast. Its user interface, on the other hand, is how we interact with it.

The Kai gives us the liberty to interact with our computers via a simple set of gestures. These gestures can be mapped to keyboard or mouse actions; the limit here is your imagination. The Kai Control Center ships with a set of 8 gestures and 2 pinch actions, configurable by the user. The keyboard mappings can be configured separately for each application of the user's choice.
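Conceptually, these per-application mappings are just a lookup from gesture to action. Here is a minimal sketch of that idea; the gesture names and action labels are illustrative, not the official Kai Control Center identifiers:

```python
# A sketch of per-application gesture-to-action mappings.
# All names here are hypothetical placeholders.
GESTURE_KEYMAPS = {
    "media_player": {
        "swipe_left": "prev_track",
        "swipe_right": "next_track",
        "pinch_index": "play_pause",
    },
    "slideshow": {
        "swipe_left": "page_up",
        "swipe_right": "page_down",
    },
}

def resolve_action(app, gesture):
    """Look up the action mapped to a gesture for a given application.
    Returns None when the app or gesture has no mapping."""
    return GESTURE_KEYMAPS.get(app, {}).get(gesture)
```

A real integration would feed the resolved action into a keyboard/mouse emulation layer; the table itself is what the Control Center lets you edit per application.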

Kai SDK

Real Time 3D Interaction

The Kai Software Development Kit interfaces with the Kai Dongle, giving users and developers full access to the data being sent from the Kai Controller. These gestures and data can be interpreted and used however you choose, either to improve the performance of the existing device or to add gesture controls to your apps, games and software.

The Kai SDK operates by reading from a web socket to which the incoming data from the Kai is written. Any commands sent to the Kai also go through this web socket. The web socket can be read from any programming language. Currently, we natively provide support for the following programming languages:

  • C# (or any .NET based language)

Hardware Prerequisites

  • Kai Gesture Controller

Software Requirements

  • Kai Control Center

Setting Up Python Environment

Create a Project Directory and open a terminal inside the directory.

Step 1. Create a fresh Virtual Environment

$ virtualenv venv

Step 2. Activate Virtual Environment

$ venv\Scripts\activate        # Windows
$ source venv/bin/activate     # Linux/macOS

Step 3. Clone Kai SDK into Project Directory

(venv) $ git clone https://github.com/vicara-hq/kai-python

Step 4. Setup Kai SDK and Install into Virtual Environment

(venv) $ cd kai-python
(venv) $ python setup.py build
(venv) $ python setup.py install

At this point, the Python environment is set up with the Kai SDK.

Setting up Config File and Starting Web Socket

We set up a new file called ‘config.ini’ to store the web socket ID and access credentials.

# File-name : config.ini
# ID can be any name; leave SECRET as is
[MODULE]
ID = "KaiApplication"
SECRET = "qwerty"
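As an aside, this file can be read back with Python's standard-library configparser, which is what the SDK example below does. One subtlety worth knowing: configparser treats the surrounding quotes as part of the value. A self-contained sketch using an in-memory copy of the file:

```python
import configparser

# An in-memory copy of the config.ini above, so this sketch
# runs without touching the filesystem.
CONFIG_TEXT = """
[MODULE]
ID = "KaiApplication"
SECRET = "qwerty"
"""

config = configparser.ConfigParser()
config.read_string(CONFIG_TEXT)

# configparser keeps the double quotes as part of the value,
# so strip them if you want the bare string.
module_id = config.get("MODULE", "ID").strip('"')
module_secret = config.get("MODULE", "SECRET").strip('"')
```

If the quotes are unwanted, the simplest fix is to write the values unquoted in config.ini in the first place.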

Create a new file ‘main.py’.

"""
File: main.py
Purpose : Kai Data Control Center (python)
"""
import configparser
from KaiSDK.WebSocketModule import WebSocketModule

############# Global Variables #######
module = WebSocketModule()

############# Functions ##############
def get_config():
    config = configparser.ConfigParser()
    config.read("config.ini")
    return config


def start_web_socket(config):
    moduleID = config.get("MODULE", "ID")
    moduleSecret = config.get("MODULE", "SECRET")

    global module
    success = module.connect(moduleID, moduleSecret)

    if not success:
        print("Unable to authenticate with Kai SDK")
        exit(1)

Set Kai Capabilities and Event Handlers

The SDK operates on an event-based model to optimize performance. There are a few types of events generated in the SDK.

  • Gesture: Makes the Kai send a notification when a gesture is performed.

In addition to these, the Kai can also respond with data from the built-in sensors.

  • Finger Position Data: Makes the Kai send the absolute positions of the 4 fingers

Note: All of the above scenarios trigger events within the SDK which need to be handled by the application code.
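Conceptually, this register/notify flow can be pictured with a tiny dispatcher. This is a generic sketch of the pattern, not the SDK's internal code:

```python
class EventDispatcher:
    """Minimal register/notify sketch of the event model the SDK uses.
    The real KaiSDK manages listeners per connected Kai device."""

    def __init__(self):
        # Maps an event type to the list of callbacks registered for it.
        self._listeners = {}

    def register_event_listener(self, event_type, callback):
        self._listeners.setdefault(event_type, []).append(callback)

    def dispatch(self, event_type, event):
        # Incoming data triggers every handler registered for that type;
        # event types nobody registered for are simply ignored.
        for callback in self._listeners.get(event_type, []):
            callback(event)

dispatcher = EventDispatcher()
seen = []
dispatcher.register_event_listener("gesture", seen.append)
dispatcher.dispatch("gesture", "swipe_left")
dispatcher.dispatch("accelerometer", (0, 0, 9.8))  # no handler registered
```

Your application code plays the role of the callbacks: the SDK calls them whenever matching data arrives from the Kai.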

Additionally, to improve battery performance, the Kai responds only with data that has been requested. The following code snippet adds event handlers and sets up requests for data from the Kai.

# main.py
from KaiSDK.DataTypes import KaiCapabilities
import KaiSDK.Events as Events
import time

######### Evt handlers ###############
def gestureEvt(evt):
    print(evt.gesture)


def accelEvt(evt):
    print(evt.accelerometer.x)
    print(evt.accelerometer.y)
    print(evt.accelerometer.z)


############# Functions ##############
def main():
    global module

    module.setCapabilities(module.DefaultKai, KaiCapabilities.GestureData | KaiCapabilities.AccelerometerData)

    # Register Evt Listeners
    module.DefaultKai.register_event_listener(Events.GestureEvent, gestureEvt)
    module.DefaultKai.register_event_listener(Events.AccelerometerEvent, accelEvt)

    time.sleep(30)  # Testing duration; can be replaced by an ESC break

    # Save Kai battery by stopping requests when not needed
    module.unsetCapabilities(module.DefaultKai, KaiCapabilities.AccelerometerData)
    time.sleep(30)
    module.close()

Note: Multiple capabilities can be subscribed to at once, in which case the module receives both data streams in parallel.
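The `|` in `setCapabilities` suggests the capabilities behave like bit flags that can be combined and cleared independently. A minimal sketch of that idea with Python's `enum.IntFlag`; the flag values here are illustrative, not the SDK's real constants:

```python
from enum import IntFlag

class KaiCapabilities(IntFlag):
    # Illustrative values only; the real SDK defines its own constants.
    GestureData = 1
    FingerShortcutData = 2
    FingerPositionData = 4
    AccelerometerData = 8

# Subscribe to two data streams at once with bitwise OR.
subscribed = KaiCapabilities.GestureData | KaiCapabilities.AccelerometerData

# Membership tests work bitwise: both streams are active in parallel.
wants_gestures = bool(subscribed & KaiCapabilities.GestureData)
wants_accel = bool(subscribed & KaiCapabilities.AccelerometerData)

# Unsetting one capability clears only that bit, leaving the rest intact.
subscribed &= ~KaiCapabilities.AccelerometerData
```

This is why `unsetCapabilities(module.DefaultKai, KaiCapabilities.AccelerometerData)` in the snippet above stops only the accelerometer stream while gesture events keep flowing.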

Run and Test Module

if __name__ == "__main__":
    config = get_config()
    start_web_socket(config)
    main()

Finishing Notes

The output console will print the accelerometer data once every 9 ms on average. The gesture will be printed only if the Kai is able to recognize a gesture.

The Python SDK allows developers to bring the benefits of gesture control to their own applications. These applications may range from gaming and 3D modelling tools all the way to an active interaction medium for AR and VR devices.

The potential is truly unlimited…

In order to optimize battery life, the recommended approach is to request this data only when required. An example of such a flow would be as follows. Let’s say we need to rotate an object in real time on the screen.

  1. We would first subscribe for finger shortcut data.
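Assuming the later steps of this flow feed orientation data from the Kai into the on-screen object, the accumulation step might be sketched like this; the event payload and names are illustrative, not the SDK's real types:

```python
class RotatableObject:
    """Sketch: accumulate rotation deltas from the Kai into an on-screen
    rotation angle. The event payloads here are hypothetical."""

    def __init__(self):
        self.angle = 0.0  # current rotation, in degrees

    def on_rotation_event(self, yaw_delta):
        # Each event rotates the object by the reported change,
        # wrapped into the [0, 360) range.
        self.angle = (self.angle + yaw_delta) % 360.0

obj = RotatableObject()
# Simulated stream of rotation deltas, as events might deliver them.
for delta in (30.0, 45.0, -15.0):
    obj.on_rotation_event(delta)
```

In a real application the handler would be registered with the SDK exactly like `gestureEvt` above, and unsubscribed once the rotation interaction ends, to save the Kai's battery.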

Useful Links

Web Socket Info:

  • The Kai publishes and listens to the web socket at localhost on port 2203


Sanskar Biswal

Written by

Electronics Engineer | Firmware Developer | Programmer | Poet | Writer

Vicara

Vicara is an Immersive Technology Company that develops hardware products and solutions for Mixed Reality based Industrial applications.
