An Intuition on how Smartphone Sensors work

-By Sushanth Shajil

Most Android devices have sensors that measure motion, orientation, and other environmental conditions around you.

These sensors deliver data with high precision and accuracy.

For example, a really cool racing game you just downloaded from the Play Store might use the gravity sensor to detect when you swing, rotate, and tilt your phone.

Android developers get access to the sensors on your phone through something called the “sensor framework”.

The sensor framework helps a developer:

  • Determine which sensors are present on your phone.
  • Learn about each sensor’s capabilities, such as its maximum range.
  • Acquire data from the sensors and set the minimum rate at which readings are delivered.
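The three steps above map fairly directly onto the framework’s API. Here’s a rough sketch in Java (the class name SensorSurvey is my own invention, and the code needs an Android Context, so it only runs inside an actual app):

```java
// Sketch of the sensor-framework calls described above.
// Sensor, SensorManager, etc. come from the android.hardware package.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import java.util.List;

public class SensorSurvey {
    public static void listSensors(Context context) {
        SensorManager manager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);

        // 1. Determine which sensors are present on the device.
        List<Sensor> sensors = manager.getSensorList(Sensor.TYPE_ALL);

        // 2. Query each sensor's capabilities, e.g. its maximum range.
        for (Sensor s : sensors) {
            System.out.println(s.getName()
                    + " | vendor: " + s.getVendor()
                    + " | max range: " + s.getMaximumRange());
        }
    }
}
```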

A developer first starts off by getting to know the sensors present on your device.

This is done by getting an instance of the SensorManager class: you call getSystemService() and pass in the SENSOR_SERVICE argument.

Didn’t make sense at all, did it? Let’s not get too deep into the technical side of things.
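For the curious, though, here is roughly what acquiring data looks like once you have that SensorManager; this is a sketch, not production code (a real app would also unregister the listener to save battery, and AccelerometerReader is a made-up name):

```java
// Sketch: reading the accelerometer via the sensor framework.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AccelerometerReader implements SensorEventListener {
    public AccelerometerReader(Context context) {
        SensorManager manager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor accel = manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // SENSOR_DELAY_NORMAL controls how often readings are delivered.
        manager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds acceleration along x, y, z in m/s^2.
        float x = event.values[0], y = event.values[1], z = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```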

Sensors are mainly used to make a user’s experience more comfortable and more personalized.

Let’s say you’re in a dark room and your phone’s screen is uncomfortably bright. Rather than making you adjust it by hand every time, Android comes built in with a feature called “Adaptive Brightness”.

This feature uses the light sensor, one of the environmental sensors, to measure the intensity of the light falling on it. Depending on that intensity, your screen turns dimmer or brighter.
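At its core, the idea boils down to mapping measured light intensity (in lux) to a brightness level. Here’s a toy model in plain Java; the thresholds are invented for illustration and are much simpler than Android’s actual brightness curve:

```java
// Toy model of adaptive brightness: map ambient light (lux) to a
// screen brightness level between 0 and 255.
// The thresholds below are made up for illustration only.
public class AdaptiveBrightness {
    public static int brightnessFor(float lux) {
        if (lux < 10)   return 30;   // dark room: keep the screen dim
        if (lux < 100)  return 90;   // typical indoor lighting
        if (lux < 1000) return 160;  // bright indoors / overcast outdoors
        return 255;                  // direct sunlight: full brightness
    }

    public static void main(String[] args) {
        System.out.println(brightnessFor(5f));    // dark room -> dim screen
        System.out.println(brightnessFor(500f));  // bright room -> brighter screen
    }
}
```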

Now you know more about how sensors work on smartphones.

If you’d like to know more, write your questions/queries below.