Track Your Smartphone in 2D With JavaScript

Imagine what you can do with this new medium of interaction

Sanjeet Chatterjee
Jun 3 · 6 min read
Photo by the author.

With a fundamental shift to the web, we are able to do really cool things right from our browser. In this tutorial, we will be utilising the Generic Sensor API to turn your smartphone into a pointer with real-time tracking.

Here’s what we will be making:


  • As of writing, the Generic Sensor API is not yet supported on iOS. In addition, some Android smartphones don’t have the required sensors. However, it is still possible to work through this tutorial with the simulated sensors found in Chrome DevTools (under More tools → Sensors).
  • It is also possible to view the console output of Chrome from your Android smartphone via USB, although this requires some further setup.
  • The Generic Sensor API requires a secure context, so HTTPS is required. You can either work on localhost with the sensor simulator or use an online code editor (such as Repl.it) with your smartphone.

Note: Everything for the tutorial can be found at this Repl, where you can browse and edit the code as well as try out the demo.

Tracking Your Smartphone

Let’s start off with a generic controller.html file and corresponding controller.js script:
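A minimal version of the page might look like this (the exact markup is a sketch; any page that loads the script will do):

```html
<!-- controller.html: a bare page whose only job is to load controller.js -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Controller</title>
  </head>
  <body>
    <script src="controller.js"></script>
  </body>
</html>
```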

The Generic Sensor API exposes multiple sensors. For our requirements, we will be using the AbsoluteOrientationSensor.

According to MDN web docs, the AbsoluteOrientationSensor is a sensor fusion API that “describes the device's physical orientation in relation to the Earth's reference coordinate system.”

By combining data from multiple real sensors, new virtual sensors can be implemented that combine and filter the data so that it’s easier to use — these are known as fusion sensors. In this case, data from the onboard magnetometer, accelerometer, and gyroscope are used for the AbsoluteOrientationSensor’s implementation.

Below is a sketch of the code for interfacing with this virtual sensor.
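Something along these lines in controller.js (the 60 Hz frequency and the handleSensor name are choices for this tutorial, not requirements of the API; the feature check makes the script a harmless no-op where the API is unavailable):

```javascript
// controller.js — log each sensor reading, a quaternion [x, y, z, w].
function handleSensor(event) {
  console.log(event.target.quaternion);
}

// Feature-detect: the API is absent on iOS and outside the browser.
if (typeof AbsoluteOrientationSensor !== "undefined") {
  const sensor = new AbsoluteOrientationSensor({ frequency: 60 });
  sensor.addEventListener("reading", handleSensor);
  sensor.start();
}
```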

First, the sensor object is initialised with a set frequency: the rate at which the sensor is read and the corresponding handleSensor callback fired. Calling start() then begins the reading process.

After refreshing your page, move your phone around and you should see a stream of quaternions in the console:


But what are quaternions?

“Quaternions are a number system that extends the complex numbers.” — Wikipedia

They can be used as an alternative to Euler angles for describing the orientation of an object in space. Quaternions are used extensively in game development, as calculations with them are less computationally expensive.

However, for simplicity, let’s convert these to the more intuitive Euler angles. Following the conversion formula, below is a JavaScript implementation. As we are tracking in two dimensions, pitch has been omitted:
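A JavaScript version of the standard conversion might look like this (the function name is a choice for this tutorial; the quaternion is the API's [x, y, z, w] ordering):

```javascript
// Convert a quaternion [x, y, z, w] to Euler angles in radians.
// Pitch is omitted since we are only tracking two dimensions.
function quaternionToEuler([x, y, z, w]) {
  const roll = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));
  const yaw = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));
  return { roll, yaw };
}
```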

Update your handleSensor function to print the converted Euler angles using the function above:
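For example, assuming quaternionToEuler from the previous snippet is in scope:

```javascript
// handleSensor now logs Euler angles instead of raw quaternions.
function handleSensor(event) {
  const { roll, yaw } = quaternionToEuler(event.target.quaternion);
  console.log(roll, yaw);
}
```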


You should see the angle output in radians, which should change intuitively as you rotate your phone around. Below are the dimensions we will be using to track our pointer in 2D.

Photo by the author.

We have successfully interfaced with the Generic Sensor API to obtain the sensor data required to track your smartphone’s orientation in real time. We now need to translate these changing angles into movement projected on the screen.

But first, we need a method of calibration — setting an initial start position from which all distances are measured:
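A sketch of that calibration step (the variable names are assumptions; current is kept up to date by handleSensor, and the guard makes the listener a no-op outside the browser):

```javascript
// Calibration: store the orientation at the moment of a tap as the
// start point; all subsequent angles are measured relative to it.
let start = null;
let current = { roll: 0, yaw: 0 };

function calibrate() {
  start = { ...current };
}

if (typeof document !== "undefined") {
  document.body.addEventListener("click", calibrate);
}
```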

When you click on the controller page body, the current orientation of the phone is set as the start point from which all angles — and therefore distances — are measured.

Photo by the author.

Calculating the relative distance moved requires only simple trigonometry, using the change in angle from the start point.

When taking differences, we need to wrap around at the 180°/-180° boundary to ensure a correct value.

Here’s the code:
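A sketch of both pieces, assuming the tangent projection described below (the function names are choices for this tutorial):

```javascript
// Smallest signed difference between two angles, wrapping at the
// 180°/-180° (i.e. ±π radians) boundary.
function angleDiff(a, b) {
  let d = a - b;
  if (d > Math.PI) d -= 2 * Math.PI;
  if (d < -Math.PI) d += 2 * Math.PI;
  return d;
}

// Project the change in angle onto the screen: the distance moved on
// a plane 800 units away from the controller.
function angleToDistance(angle, startAngle) {
  return Math.tan(angleDiff(angle, startAngle)) * 800;
}
```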

Note: The number 800 in the final calculation determines the virtual distance of the controller from the canvas. It has no strict real-world meaning; in practice, it simply changes the sensitivity of movement.

Update your handleSensor function to print out the calculated distance using the function above:
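For example, assuming the helpers and the start/current state from the previous snippets are in scope (mapping yaw to horizontal and roll to vertical movement is an assumption of this sketch):

```javascript
// handleSensor now logs the projected distances instead of angles.
function handleSensor(event) {
  const { roll, yaw } = quaternionToEuler(event.target.quaternion);
  current = { roll, yaw };
  if (start === null) return; // not calibrated yet
  const dx = angleToDistance(yaw, start.yaw);
  const dy = angleToDistance(roll, start.roll);
  console.log(dx, dy);
}
```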

And that’s it!

You now have the ability to track your smartphone’s movement in real-time and translate it into a distance measurement for on-screen movement.

Pointing With Your Smartphone

With a simple Node.js server, some SocketIO magic, and an HTML canvas element, the distance measurements above can turn your smartphone into a digital pointer with support for multiple controllers.

Note: As this article focuses on utilising the Sensor API, SocketIO explanations have been glossed over, although the code should be self-explanatory. For more information, take a look at the documentation.

The server

This simple server serves our HTML pages from the public directory and sends controller data to all connected web clients via SocketIO for the pointers to be rendered on canvas.

By storing a list of connected controller clients, it is possible to add support for multiple controllers.

The array of controllers and their corresponding distances moved is then sent to all connected clients at a constant rate with setInterval.
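A sketch of such a server, assuming the express and socket.io packages are installed (the port, broadcast rate, and event names like "controller-move" and "controllers" are choices for this sketch; the requires are guarded so the file degrades to a no-op without the dependencies):

```javascript
// server.js — serve the public directory and relay controller data.
const controllers = []; // latest distances per connected controller

let express = null;
let SocketServer = null;
try {
  express = require("express");
  SocketServer = require("socket.io").Server;
} catch (err) {
  // Dependencies not installed — skip server startup.
}

if (express && SocketServer) {
  const app = express();
  app.use(express.static("public"));
  const server = app.listen(3000);
  const io = new SocketServer(server);

  io.on("connection", (socket) => {
    // Each controller page emits "controller-move" with its distances.
    socket.on("controller-move", (distances) => {
      const entry = controllers.find((c) => c.id === socket.id);
      if (entry) entry.distances = distances;
      else controllers.push({ id: socket.id, distances });
    });

    // Drop the controller from the list when it disconnects.
    socket.on("disconnect", () => {
      const index = controllers.findIndex((c) => c.id === socket.id);
      if (index !== -1) controllers.splice(index, 1);
    });
  });

  // Broadcast the controller list to every client at a constant rate.
  setInterval(() => io.emit("controllers", controllers), 1000 / 30);
}
```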

The controller

The controller reads sensor data, calculates distances, and sends them to the server to be broadcast. Most of this was covered above; the new part is the communication with our SocketIO server.
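That addition to controller.js might look like this (it assumes the socket.io client script is loaded on the page and the "controller-move" event name from the server sketch; the guard makes it a no-op where io is unavailable):

```javascript
// Forward each computed distance pair to the server for broadcasting.
const socket = typeof io !== "undefined" ? io() : null;

function sendDistances(dx, dy) {
  if (socket) socket.emit("controller-move", { dx, dy });
}
```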

The digital canvas


The canvas page renders the calculated distances received from the server as circular pointers on a canvas.

Most of it is self-explanatory, but let’s look at the draw function:
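A sketch of it, assuming the distances arrive in the shape sent by the server above (the colour list and pointer radius are choices for this sketch):

```javascript
// canvas.js — draw each controller's pointer as a coloured circle.
const COLOURS = ["tomato", "royalblue", "seagreen", "goldenrod"];
let controllers = []; // kept up to date by the "controllers" socket event

function draw(ctx, width, height) {
  ctx.clearRect(0, 0, width, height);

  controllers.forEach((controller, i) => {
    // Offset so a distance of (0, 0) maps to the centre of the canvas
    // rather than the top-left corner, where canvas coordinates begin.
    const x = width / 2 + controller.distances.dx;
    const y = height / 2 + controller.distances.dy;
    ctx.beginPath();
    ctx.arc(x, y, 20, 0, 2 * Math.PI);
    ctx.fillStyle = COLOURS[i % COLOURS.length];
    ctx.fill();
  });

  // Re-run before the next repaint so pointers move in real time.
  if (typeof requestAnimationFrame !== "undefined") {
    requestAnimationFrame(() => draw(ctx, width, height));
  }
}
```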

For multiple controllers, the controller array is iterated through, and each distance is offset so the pointer starts at the centre of the canvas rather than the top-left corner, where the canvas coordinate system begins.

These are then rendered as circles using the Canvas API with the next available colour. For more information, see the docs.

Finally, the requestAnimationFrame function is called to tell the browser to perform the previous operation again before the next repaint. This cycle continues to allow the pointer to move in real-time with your smartphone.


If you have a working smartphone pointer, congratulations! But what next?

Maybe a tool of communication? A smarter alternative to a laser pointer? Or perhaps something like Paintr — a collaborative digital canvas where your smartphone becomes your paintbrush:

This is a new medium of interaction with something we all carry around in our pockets. The possibilities with this kind of setup are endless.

Thanks for reading. Let me know your thoughts below.

Better Programming

Advice for programmers.
