Using Sensors in React Native
With great power comes great responsibility
This has led to a couple of problems with React Native packages containing native code. Often they are iOS- or Android-only, forcing the user to pick multiple libraries to do the same job. Another problem is the API of native modules. Some are great, but others simply expose the primitives React Native provides, e.g. the NativeEventEmitter, without offering a nice-to-use developer interface.
When I wanted to write a little demo using the gyroscope sensor of my device, I stumbled upon both of these problems. As there were already libraries out there doing the job on one native side or the other, I decided to combine them. Sprinkle a little RxJS on top and boom:
react-native-sensors was born.
What does it provide?
react-native-sensors provides an RxJS-based interface for the Accelerometer, Gyroscope, and Magnetometer. Its API is consistent across the different sensor types and platforms. Is it easy to use? You will see:
Let’s go through the important parts:
- Line 20: Create a new accelerometer. This returns a promise, which fulfills if the sensor is available. We subscribe to the observable and set the state.
- Line 33: In the render method we just access the state to show us the raw sensor data, which looks like this:
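Both steps can be sketched in plain JavaScript. The stub below stands in for the real Accelerometer, and its sample reading shows the shape of the data; the stub itself, the sample values, and the setState shim are assumptions for illustration — only the promise-then-subscribe pattern comes from the text:

```javascript
// Stub mimicking the promise-then-observable flow described above: the
// constructor-like factory resolves once the sensor is "available", and the
// resolved value is an observable emitting { x, y, z, timestamp } readings.
function createStubAccelerometer() {
  return Promise.resolve({
    subscribe(onNext) {
      // Emit one fake reading; the real sensor streams continuously.
      onNext({ x: 0.01, y: -0.02, z: 9.81, timestamp: Date.now() });
      return { unsubscribe() {} };
    },
  });
}

// Component-style usage: subscribe and push each reading into state,
// where the render method can pick it up.
const state = {};
function setState(partial) {
  Object.assign(state, partial);
}

createStubAccelerometer().then((observable) => {
  observable.subscribe(({ x, y, z }) => setState({ x, y, z }));
});
```

Swapping the stub for the library's real sensor keeps the component code identical, which is the point of the observable-based interface.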
What are the possibilities?
The project I wanted to do when I first developed this library was fairly simple: I wanted to use the gyroscope to let the user interact with an image that is too wide to display on the screen entirely. So instead of using gestures to go left or right, I wanted to pan my phone and just see the other parts of the image.
As you can see, the example is quite similar. To calculate how far to move the image, I simply sum the sensor values for the y coordinate, so that if you keep your phone tilted to the left you move further left in the image.
In the render function, I use
translateX to position the image in the middle of the screen and then add the calculated offset to determine where the sensor data should move us. By dividing the sensor value by 10 I make the movement smoother; you can play with this value to see how it affects the behavior.
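Put as code, the arithmetic looks roughly like this. The function and constant names, the image and screen widths, and the sample readings are mine; only the sum-the-y-values and divide-by-10 ideas come from the text:

```javascript
// Damping factor from the text: dividing by 10 smooths the movement.
const SMOOTHING = 10;

// Accumulate gyroscope y readings into a running offset.
let offsetY = 0;
function onGyroscopeReading({ y }) {
  offsetY += y;
}

// translateX: center the image on the screen, then shift it by the
// accumulated, damped offset.
function translateXFor(imageWidth, screenWidth) {
  const centering = -(imageWidth - screenWidth) / 2;
  return centering + offsetY / SMOOTHING;
}

// Simulate a phone held tilted to the left: repeated positive y readings
// keep pushing the view further in the same direction.
[1, 1, 1, 1, 1].forEach((y) => onGyroscopeReading({ y }));
```

Because the offset is a running sum rather than the instantaneous reading, holding the phone tilted keeps the image drifting instead of snapping back.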
The only difference in how I construct the sensor is a higher update rate, which makes the movement feel more fluid during quick motions. Here you can see what the end result looks like:
Wow, working with sensor data seems easy, where can I start?
You can go and check out react-native-sensors on GitHub; it's all Open Source and ready to use. The examples I have shown can be found in the examples folder. If you need real-world examples, we maintain a list of Open Source projects using this library, so go ahead and see how they do it.
So go ahead and check out our new website: react-native-sensors.github.io
How can I help?
You like the project and you would like to make it even better? There are plenty of ways you can help:
- Built something cool with it? Do a PR to add your project to our list
- Wrote a small fun demo to show what can be done? Do a PR and add it to the examples
- Found a typo in the docs or the website? Do a PR, the website will be automatically deployed after the merge
- Want a new sensor we currently don't support? Make sure it is available on both platforms and open an issue. I will guide you through the process of doing a PR and help you land it.
- Want any other feature? I will do my best to help you land a PR for whatever seems nice to have.
- I am new to coding and I don't feel comfortable doing a PR, how can I help? If you would like to learn how to do a PR, just ping me in an issue and I will make sure to help you. Otherwise, I am always in need of people to test pre-releases; just open an issue saying that you would like to test and I will put you on my list and ping you.