Photo by Andrea Zanenga on Unsplash

React Native Camera: Enhancing Stability with Accelerometer - Part 1

Fadi Shaar
5 min read · Jan 10, 2023


React Native is a powerful framework for building mobile applications using React. One of the most popular features in mobile applications is the ability to take pictures, but capturing images on a mobile device can be challenging, especially if the device is not held steady. In this article (part 1), we’ll show you how to build a stabilized camera in React Native that ensures that the device is level before allowing the user to take a picture.

Different approaches can be used to check the stability of a device in React Native:

  • Pitch and Roll Angles
  • Standard Deviation of Accelerometer Data
  • Kalman filter and complementary filter

This first part focuses on a single stability approach: using the pitch and roll angles. These angles represent the tilt of the device relative to the horizontal plane and can be calculated from the accelerometer data. The pitch angle represents the tilt of the device around the x-axis, and the roll angle represents the tilt around the y-axis.
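
As a quick illustration of these formulas before we wrap them in a hook, here is a small sketch that converts a single raw accelerometer reading into pitch and roll angles (assuming the reading's x, y, and z values are expressed in g units):

// Simplified tilt angles from one accelerometer reading.
const toDegrees = (radians) => (radians * 180) / Math.PI;

const getTiltAngles = ({ x, y, z }) => ({
  pitch: toDegrees(Math.atan2(x, z)), // tilt around the x-axis
  roll: toDegrees(Math.atan2(y, z)), // tilt around the y-axis
});

// A device lying flat on a table reads roughly { x: 0, y: 0, z: 1 },
// which gives pitch ≈ 0° and roll ≈ 0°, i.e. the device is level.
console.log(getTiltAngles({ x: 0, y: 0, z: 1 })); // { pitch: 0, roll: 0 }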

Requirements

To build this application, we’ll need to install the following libraries:

  • react-native-camera: This library provides the RNCamera component, which we use to display the camera view and take pictures.
  • react-native-sensors: This library allows us to access the device's sensors, like the accelerometer, which we'll use to get the device's pitch and roll angles.
  • react-native-camera-focus: This library allows us to get the focus score of the camera.

Implementation

The first thing we’ll do is create a custom hook that gets the device’s pitch and roll angles using the react-native-sensors library. We'll call this hook usePitchAndRoll.

import { useEffect, useState } from 'react';
import { accelerometer } from 'react-native-sensors';

export const usePitchAndRoll = () => {
  const [pitch, setPitch] = useState(0);
  const [roll, setRoll] = useState(0);

  useEffect(() => {
    // react-native-sensors exposes the accelerometer as an observable,
    // so we subscribe to it here and unsubscribe when the component unmounts.
    const subscription = accelerometer.subscribe(({ x, y, z }) => {
      setPitch((Math.atan2(x, z) * 180) / Math.PI);
      setRoll((Math.atan2(y, z) * 180) / Math.PI);
    });

    return () => subscription.unsubscribe();
  }, []);

  return { pitch, roll };
};

This hook uses useEffect to subscribe to the accelerometer observable exposed by react-native-sensors. Each new reading updates the pitch and roll state variables with the device's current angles, and the subscription is cleaned up when the component unmounts.
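
Before wiring the hook into the camera, here is a minimal standalone usage sketch: a LevelIndicator component (a hypothetical name, not part of the final camera code) that displays the live angles and whether the device counts as level, using the same 10-degree threshold we apply below.

import React from 'react';
import { Text, View } from 'react-native';
import { usePitchAndRoll } from './usePitchAndRoll';

// Hypothetical debug component: shows the live angles and a simple level check.
export default function LevelIndicator() {
  const { pitch, roll } = usePitchAndRoll();
  const level = Math.abs(pitch) < 10 && Math.abs(roll) < 10;

  return (
    <View>
      <Text>Pitch: {pitch.toFixed(1)}°</Text>
      <Text>Roll: {roll.toFixed(1)}°</Text>
      <Text>{level ? 'Device is level' : 'Tilt the device until it is level'}</Text>
    </View>
  );
}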

Then, we’ll use this hook in the main component to get the device’s pitch and roll angles, and we’ll also subscribe to focus updates from the react-native-camera-focus library to get the camera’s focus score.

import React, { useEffect, useRef, useState } from 'react';
import { View, TouchableOpacity, Text, Alert } from 'react-native';
import { RNCamera as Camera } from 'react-native-camera';
import { usePitchAndRoll } from './usePitchAndRoll';
import FocusScore from 'react-native-camera-focus';

export default function CameraView({ onPictureTaken }) {
  const cameraRef = useRef(null);
  const [cameraType, setCameraType] = useState(Camera.Constants.Type.back);
  const [focusScore, setFocusScore] = useState(0);
  const { pitch, roll } = usePitchAndRoll();

  useEffect(() => {
    // Subscribe once to focus updates and keep the latest score in state.
    FocusScore.addEventListener(FocusScore.Events.FOCUS_CHANGED, (data) => {
      setFocusScore(data.score);
    });
  }, []);

  // The device counts as level when both tilt angles are within 10 degrees.
  const level = Math.abs(pitch) < 10 && Math.abs(roll) < 10;

  // The shutter is enabled only when the device is level and the camera is in focus.
  const enabled = level && focusScore > 0.5;

  return (
    <View style={{ flex: 1 }}>
      <Camera
        ref={cameraRef}
        style={{ flex: 1 }}
        type={cameraType}
        flashMode={Camera.Constants.FlashMode.off}
        autoFocus={Camera.Constants.AutoFocus.on}
        zoom={0}
        whiteBalance={Camera.Constants.WhiteBalance.auto}
        focusDepth={0}
      >
        <View style={{ flex: 1, backgroundColor: 'transparent' }}>
          <View style={{ flex: 1, flexDirection: 'row', justifyContent: 'space-between' }}>
            <TouchableOpacity
              onPress={() => {
                setCameraType(
                  cameraType === Camera.Constants.Type.back
                    ? Camera.Constants.Type.front
                    : Camera.Constants.Type.back
                );
              }}
              style={{ alignSelf: 'flex-start', margin: 30 }}
            >
              <Text style={{ fontSize: 18, color: 'white' }}> Flip </Text>
            </TouchableOpacity>
            <TouchableOpacity
              disabled={!enabled}
              style={{ alignSelf: 'flex-end', margin: 30 }}
              onPress={async () => {
                if (cameraRef.current) {
                  const options = { quality: 0.5, base64: true };
                  const data = await cameraRef.current.takePictureAsync(options);
                  onPictureTaken(data.base64);
                } else {
                  Alert.alert('Error', 'Camera not found');
                }
              }}
            >
              <Text style={{ fontSize: 18, color: 'white' }}> Take Picture </Text>
            </TouchableOpacity>
          </View>
        </View>
      </Camera>
    </View>
  );
}

Here is a detailed breakdown of the code:

  • const cameraRef = useRef(null);: a ref that gives us access to the camera instance so we can call takePictureAsync on it when the shutter button is pressed.
  • const [cameraType, setCameraType] = useState(Camera.Constants.Type.back);: a state variable that keeps track of the current camera type (front or back) and a function to update it.
  • const [focusScore, setFocusScore] = useState(0);: a state variable that keeps track of the latest focus score reported by the camera.
  • const { pitch, roll } = usePitchAndRoll();: we get the current pitch and roll angles from the usePitchAndRoll hook.
  • FocusScore.addEventListener(FocusScore.Events.FOCUS_CHANGED, ...): inside a useEffect, we subscribe once to the focus score change event and store each new score in state.
  • const level = Math.abs(pitch) < 10 && Math.abs(roll) < 10;: we check whether the device is level, using a threshold of 10 degrees on both angles.
  • const enabled = level && focusScore > 0.5;: the picture-taking functionality is enabled only if the device is level and the focus score is greater than 0.5.
  • <Camera>: wraps the camera view and its settings, such as flashMode, autoFocus, whiteBalance, and focusDepth, and receives the cameraRef.
  • <View>: wraps the TouchableOpacity buttons that let the user flip the camera and take a picture.
  • <TouchableOpacity>: a pressable component used for touch events. There are two of them: one to flip the camera and one to take a picture. The shutter button uses the enabled flag (via the disabled prop) to decide whether it can be pressed.
  • onPress={async () => {: the callback that runs when the shutter button is pressed. It checks that the camera ref is available and, if so, calls the camera's takePictureAsync method, which returns a promise with the image data (base64 encoded in this case) and passes it to onPictureTaken. A minimal usage sketch of the whole component follows this list.
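
To show how CameraView fits into an application, here is a minimal usage sketch. The PhotoScreen name and the way the captured base64 image is previewed are just assumptions for illustration, not part of the article's code.

import React, { useState } from 'react';
import { View, Image } from 'react-native';
import CameraView from './CameraView';

// Hypothetical screen: renders the stabilized camera and previews the last capture.
export default function PhotoScreen() {
  const [lastPhoto, setLastPhoto] = useState(null);

  return (
    <View style={{ flex: 1 }}>
      <CameraView onPictureTaken={(base64) => setLastPhoto(base64)} />
      {lastPhoto && (
        <Image
          style={{ position: 'absolute', bottom: 20, left: 20, width: 100, height: 100 }}
          source={{ uri: `data:image/jpeg;base64,${lastPhoto}` }}
        />
      )}
    </View>
  );
}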

In conclusion, building a stabilized camera in React Native is a great way to ensure that your users have a consistent and reliable experience when capturing images. By using the device’s accelerometer to get the pitch and roll angles, together with the camera’s focus score, we can enable the picture-taking functionality only when the device is level and the image is in focus. The example code provided in this article is a starting point for building your own stabilized camera in React Native and can be easily integrated into an existing application. With the power of React Native, the possibilities are endless when it comes to building user-friendly and stable camera experiences.

In Part 2, we’ll cover more advanced approaches that can be used to check the stability of a mobile device in React Native.

Stay tuned for the continuation of this article in Part 2.
