Getting Oriented

Ben Kolin
Sep 3, 2018

A quick study of orientation sensing on Android.

Hello World!

Disclaimer: I am not a professional Android developer, though as an enthusiast I am past the Hello World stage. Take everything in this article with a grain of salt — it may very well be trivial, incorrect, or widely known. It took me a while to get my head around this information so I’ve written my findings here for posterity.

I recently modernized an old project of mine where I needed to gain a simple piece of knowledge from an Android device — its pitch and roll orientation with regard to the ground. Phones and tablets these days are bristling with sensors so this should be extremely simple, emphasis on should.

I ended up putting together a simple app to help me visualize the differences between a number of different methods of surfacing this data. I have not researched Android’s internal implementations of the requisite sensors, so there may be discrepancies here that anyone willing to dig a level deeper could easily explain. I also used longs for all the values instead of floats to keep things simple for the UI, but floats are available for those who need them.

The Sensors

Without further ado, Method 1: The Orientation Sensor

The orientation sensor happens to be a software-based composite sensor that uses input from various hardware sensors, specifically the accelerometer, the magnetometer, and, where available, the gyroscope.

An orientation sensor, it would seem, would be the perfect piece of hardware to use to sense orientation. It is right there in the name! Take a look at the Javadoc, though:

“Avoid using deprecated methods or sensor types. Several methods and constants have been deprecated. In particular, the TYPE_ORIENTATION sensor type has been deprecated. To get orientation data you should use the getOrientation() method instead.”

Boooo. The orientation sensor was deprecated in Android API level 8, which https://source.android.com/setup/start/build-numbers identifies as the “Froyo” release, available on May 20th, 2010. That means that as of this writing, the perfectly good orientation sensor has been deprecated for over 8 years.

All that said, on most (but not all) devices I tested it still works well. The code to use this sensor is very simple:

public void onSensorChanged(SensorEvent event) {
    if (event.sensor == orientationSensor) {
        // Values arrive already in degrees:
        // [0] = azimuth, [1] = pitch, [2] = roll
        long pitch = Math.round(event.values[1]);
        long roll = Math.round(event.values[2]);
        // <whatever code uses pitch and roll goes here>
    }
}

TAADAAAH! Orientation. Easy. Deprecated.
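
For completeness, here is a minimal sketch of the setup the snippet above assumes, registered from an Activity that implements SensorEventListener. The field name orientationSensor matches the snippet; the rest is standard SensorManager plumbing, not anything specific to the original app.

private SensorManager sensorManager;
private Sensor orientationSensor;

@Override
protected void onResume() {
    super.onResume();
    sensorManager =
            (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    // TYPE_ORIENTATION is deprecated; this still compiles, with a warning
    orientationSensor =
            sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
    if (orientationSensor != null) {
        sensorManager.registerListener(
                this, orientationSensor, SensorManager.SENSOR_DELAY_UI);
    }
}

A matching unregisterListener call in onPause is the usual counterpart, to avoid burning battery in the background.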

Next up? Method 2: The Rotation Vector Sensor

Like the orientation sensor, this is a software-based composite sensor that uses input from various hardware sensors, specifically the accelerometer, the magnetometer, and, where available, the gyroscope. So far so good.

Let’s check out the doc on this sensor; it must be pretty straightforward to use.

“The orientation of the phone is represented by the rotation necessary to align the East-North-Up coordinates with the phone’s coordinates. That is, applying the rotation to the world frame (X,Y,Z) would align them with the phone coordinates (x,y,z).

The rotation can be seen as rotating the phone by an angle theta around an axis rot_axis to go from the reference (East-North-Up aligned) device orientation to the current device orientation. The rotation is encoded as the four unit-less x, y, z, w components of a unit quaternion.”

All of a sudden things are sounding very technical and I have a headache. I just want to know the pitch and roll, and now we are talking about quaternions, which are a way of representing rotations in 3D space. How do we use this monster? Like so:

public void onSensorChanged(SensorEvent event) {
    if (event.sensor == mRotationSensor) {
        float[] rotationVector = event.values;
        float[] rotationMatrix = new float[9];
        SensorManager.getRotationMatrixFromVector(
                rotationMatrix, rotationVector);

        float[] orientation = new float[3];
        SensorManager.getOrientation(rotationMatrix, orientation);

        // Convert radians to degrees; negate roll to match the
        // orientation sensor's sign convention
        long pitch = Math.round(Math.toDegrees(orientation[1]));
        long roll = -Math.round(Math.toDegrees(orientation[2]));
        // <whatever code uses pitch and roll goes here>
    }
}


That does not read well: mutable array parameters being passed around, magical methods provided by the sensor infrastructure doing who knows what, unit conversions, and so on. Note also the negation of the roll result, added to match the orientation sensor’s sign convention. That said, this solution does seem to work on most (but not all) devices tested, though it behaves quite differently from the orientation sensor near the limits. More on that later.
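
Incidentally, you never have to touch the quaternion itself, but the platform will hand it to you if you want it. A minimal sketch, assuming the same listener as above:

// Inside onSensorChanged, for a TYPE_ROTATION_VECTOR event:
float[] quaternion = new float[4]; // filled as {w, x, y, z}
SensorManager.getQuaternionFromVector(quaternion, event.values);
// quaternion now holds the unit quaternion from the doc quote above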

Now serving #3! Method 3: The Magnetometer + Accelerometer Sensors

Alright, so earlier in this article there was a note about the orientation and rotation vector sensors being composite sensors. The mag+accel method more or less trades away that composite magic for some increased local complexity. It effectively compares directional accelerations of the device against the sensed geomagnetic field, much as a compass app would. Let’s take a look at the setup for this one. Note that because two low-level sensors are used, some bookkeeping is involved to make sure both values have been populated before use.

public void onSensorChanged(SensorEvent event) {
    if (event.sensor == mMagSensor) {
        // Copy the values; the event's array may be reused by the system
        mLastMag = event.values.clone();
        if (mLastAccel != null) {
            // mLastAccel starts as null and is set when
            // accelerometer results arrive
            float[] R = new float[9];
            float[] I = new float[9];

            boolean success = SensorManager
                    .getRotationMatrix(R, I, mLastAccel, mLastMag);

            if (success) {
                float[] orientation = new float[3];
                SensorManager.getOrientation(R, orientation);

                // Convert radians to degrees; negate roll as before
                long pitch =
                        Math.round(Math.toDegrees(orientation[1]));
                long roll =
                        -Math.round(Math.toDegrees(orientation[2]));
                // <whatever code uses pitch and roll goes here>
            }
        }
    } else if (event.sensor == mAccelSensor) {
        // Store accelerometer results for combination with mag
        mLastAccel = event.values.clone();
    }
}

That’s a handful. There are magical calls, and there is no easy way to understand what is supposed to happen and when without finding someone who has done it before and copying from them. Note the same negation of roll as in the rotation vector solution.
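
For reference, the bookkeeping fields and the double registration the snippet assumes might look like this; it is a sketch, with the field names matching the snippet above:

private float[] mLastAccel; // most recent accelerometer reading
private float[] mLastMag;   // most recent magnetometer reading
private Sensor mAccelSensor;
private Sensor mMagSensor;

private void registerSensors(SensorManager sm) {
    mAccelSensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    mMagSensor = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    sm.registerListener(this, mAccelSensor, SensorManager.SENSOR_DELAY_UI);
    sm.registerListener(this, mMagSensor, SensorManager.SENSOR_DELAY_UI);
}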

After implementing methods 1–3 I thought I was done. I had covered all the devices I could test in person, and although there were some oddities, I had solutions in hand. Then I posted the app to the Play Store and watched its automated pre-launch testing run, which gave me some strange results. Some of the test devices, including name brands like the Samsung Galaxy J7, the Galaxy J1 Ace, and the Moto G4 Play, failed on all three orientation methods.

How could this be?

I broke down and bought a used Moto G4 Play so I could explore it up close. Mystery solved: it has no magnetometer. This means compass apps will not work, nor will various other positioning apps, including a fair number of games. It is a bizarre situation: a modern device missing a modern sensor that, in bulk, should be incredibly cheap.
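
The practical lesson is to never assume a sensor exists. A minimal runtime probe (the helper name is mine) is just a null check:

private boolean hasMagnetometer(SensorManager sm) {
    // getDefaultSensor returns null when the device has no such sensor
    return sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;
}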

Sigh.

Break out the black magic. Method 4: Accelerometer Only

We now have a very limited data stream, and it comes with functional limitations that will be described later. For now, let’s get the best orientation values we can from this sensor. Variations of this implementation are floating around on the internet; my favorite was https://stackoverflow.com/questions/38711705/android-device-orientation-without-geomagnetic, and this example is largely that solution reformatted.

public void onSensorChanged(SensorEvent event) {
    if (event.sensor == mAccelSensor) {
        float x_accel = event.values[0];
        float y_accel = event.values[1];
        float z_accel = event.values[2];

        // Normalize by standard gravity (9.81 m/s^2)
        double gx = x_accel / 9.81f;
        double gy = y_accel / 9.81f;
        double gz = z_accel / 9.81f;

        float pitchForRotationMatrix =
                (float) -Math.atan(gy / Math.sqrt(gx * gx + gz * gz));
        float rollForRotationMatrix =
                (float) -Math.atan(gx / Math.sqrt(gy * gy + gz * gz));
        float azimuthForRotationMatrix = 0; // Cannot be determined

        float[] mAccMagOrientation = new float[3];
        mAccMagOrientation[0] = azimuthForRotationMatrix;
        mAccMagOrientation[1] = pitchForRotationMatrix;
        mAccMagOrientation[2] = rollForRotationMatrix;

        float[] mRotationMatrix =
                getRotationMatrixFromOrientation(mAccMagOrientation);

        float[] orientation = new float[3];
        SensorManager.getOrientation(mRotationMatrix, orientation);

        // Convert radians to degrees; negate roll as before
        long pitch = Math.round(Math.toDegrees(orientation[1]));
        long roll = -Math.round(Math.toDegrees(orientation[2]));
        // <whatever code uses pitch and roll goes here>
    }
}
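
As a quick sanity check of those atan formulas (my arithmetic, not part of the original solution): flat on a table, the accelerometer reads roughly (0, 0, 9.81), both atan arguments are zero, and pitch = roll = 0°. Stood upright on its bottom edge, it reads roughly (0, 9.81, 0), so pitch = -atan(1/0) approaches -90° while roll stays at 0°, matching the convention the other methods produce.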

There is a lot of math and magic there but nothing compared to the supporting functions, which to be perfectly honest I did not take the time to try to understand:

public static float[] getRotationMatrixFromOrientation(float[] o) {
    float[] xM = new float[9];
    float[] yM = new float[9];
    float[] zM = new float[9];

    float sinX = (float) Math.sin(o[1]);
    float cosX = (float) Math.cos(o[1]);
    float sinY = (float) Math.sin(o[2]);
    float cosY = (float) Math.cos(o[2]);
    float sinZ = (float) Math.sin(o[0]);
    float cosZ = (float) Math.cos(o[0]);

    // rotation about x-axis (pitch)
    xM[0] = 1.0f; xM[1] = 0.0f;  xM[2] = 0.0f;
    xM[3] = 0.0f; xM[4] = cosX;  xM[5] = sinX;
    xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;

    // rotation about y-axis (roll)
    yM[0] = cosY;  yM[1] = 0.0f; yM[2] = sinY;
    yM[3] = 0.0f;  yM[4] = 1.0f; yM[5] = 0.0f;
    yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;

    // rotation about z-axis (azimuth)
    zM[0] = cosZ;  zM[1] = sinZ; zM[2] = 0.0f;
    zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
    zM[6] = 0.0f;  zM[7] = 0.0f; zM[8] = 1.0f;

    // rotation order is y, x, z (roll, pitch, azimuth)
    float[] resultMatrix = matrixMultiplication(xM, yM);
    resultMatrix = matrixMultiplication(zM, resultMatrix);
    return resultMatrix;
}
public static float[] matrixMultiplication(float[] A, float[] B) {
    // Standard row-major 3x3 matrix product: result = A * B
    float[] result = new float[9];

    result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
    result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
    result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];

    result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
    result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
    result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];

    result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
    result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
    result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];

    return result;
}

Wow. This is one of those chunks of code that once you have it working, you never want to see or touch it again. That is enough stress for method 4, accelerometer only.


The Pros and Cons

Now that we have all these options available to us, which should we use? Let’s take one more look at each.

Rotation Vector Sensor

This is theoretically the best, most modern, most future-compatible way to go, but the details tell a slightly different story.

Pros: Theoretically the best, most modern, most future-compatible way to go (sounds familiar).

Cons: It just did not work on a Nexus 7 tablet. Pitch and roll numbers were produced, but they were erratic, often incorrect, and did not match the clean orientation sensor numbers. No solution was found, even though others had reported similar results online.

Orientation Sensor

The original.

Pros: On all devices that support it, the numbers are clean and reliable. The code is easy to write, maintain, and understand.

Cons: This is totally deprecated and has not been officially supported by Android versions shipped in the last 8 years. No one knows when it might be removed completely.

Magnetometer + Accelerometer Sensors

Pros: This setup works about as well as the rotation vector sensor. The numbers are usually pretty much a perfect match when compared to rotation vector, though there are occasionally small amounts of drift evident using mag+accel.

Cons: The code complexity goes up with this method for no observable gain. Any available gyroscope also goes unused, which is unfortunate because it could help keep the results accurate and stable.

Accelerometer Only

Pros: This method might be the only method available on some low-end devices that shipped without magnetometers.

Cons: The code is complex, parameters are passed everywhere, there are magical arrays and transformations, and the best you can do is hope that it all works. With only one sensor in play the results themselves are passable, but there is a major issue, the largest con of all, which depending on your use case might be a blocker or just a minor inconvenience: upside-down detection is NOT POSSIBLE using the accelerometer alone. The lack of a magnetometer means that although you can loosely track pitch and roll, you cannot determine whether the device’s screen or back is primarily facing the ground.


Final Thoughts

The orientation space on Android is a bit of a mess, even if we constrain our study to simple pitch and roll.

This investigation is not exhaustive. There are other sensors that appear on some devices and would be interesting to add to the mix: the game rotation vector, the geomagnetic rotation vector, and the raw gyroscope. The possibility is always there to write other composite solutions by hand as well.

My apps will use these sensors in the following fallback order, following the pro/con and device compatibility notes: Orientation, Rotation Vector, Magnetometer + Accelerometer, Accelerometer Only.
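
A sketch of how that fallback selection might look (the helper and its return convention are mine, and the magnetometer entry stands in for the mag+accel pairing, so the accelerometer must also be present for Method 3):

private int pickOrientationMethod(SensorManager sm) {
    // Return the sensor type driving the best available method
    if (sm.getDefaultSensor(Sensor.TYPE_ORIENTATION) != null) {
        return Sensor.TYPE_ORIENTATION;     // Method 1 (deprecated)
    }
    if (sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR) != null) {
        return Sensor.TYPE_ROTATION_VECTOR; // Method 2
    }
    if (sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null) {
        return Sensor.TYPE_MAGNETIC_FIELD;  // Method 3 (mag + accel)
    }
    return Sensor.TYPE_ACCELEROMETER;       // Method 4, last resort
}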

Android Platform Team: If you are listening, I would love to know the story behind the deprecation of the original orientation sensor.
