Differences between iOS and Android implementations as seen in an educational kids app

Carmen Popa
4 min read · Sep 9, 2018

This summer I was lucky enough to be selected as a participant in Google Summer of Code, a program which pairs contributors with organisations all over the world for an awesome 3-month learning experience. When I received my acceptance email, I jumped 5 feet into the air. I was going to contribute to an open source organisation for real, to a project I found entertaining and useful. Catrobat is an Austrian organisation which develops a phone app, called Pocket Code on Android and Catty on iOS. It uses brick-style instructions to teach children how to code. Long story short, kids all over the world can install the app, add their favourite characters and drag and drop code instructions to create any kind of game they have in mind.

What’s in a sensor? That which we call a compass by any other name would point as far.

The games created on the Android app did not behave the same way on iOS, and vice versa. That happened because of the device and object sensors, which returned different values depending on the operating system. This is where I came in and synchronised them as best as I could. Note that, throughout the article, whenever I say Android or iOS, I mean the Android/iOS implementation of the app (i.e. Pocket Code/Catty).

I worked with both device sensors (date, inclination, acceleration, compass, audio, face recognition, touch etc.) and object sensors (colour, layer, position, size, rotation etc.). You would think that the date is the same everywhere, and yes, it is, but the weekday sensor was slightly different: Sunday is considered the first day of the week on iOS, whereas Monday is the first day on Android. Another difference I noticed is that whenever something is in degrees on Android, it will almost certainly be in radians on iOS. Position and touch are very funny indeed, because the point (0, 0) is in the centre of the screen on Android, but in the top-left corner on iOS. The inclination axes were not only reversed, but on Android they had values between (-180, 180), whereas on iOS, x was between (-pi, pi) and y was between (-pi/2, pi/2). I needed a lot of luck and patience to sort them out.
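To give a flavour of what these conversions look like, here is a minimal Swift sketch of the three differences just mentioned: the weekday offset, radians versus degrees, and the touch coordinate origin. The function names and the screen-size handling are my own illustration, not the actual Catty code.

```swift
import UIKit

// iOS reports Sunday as weekday 1; shift so that Monday = 1 ... Sunday = 7,
// which is what Pocket Code on Android expects.
func androidStyleWeekday(from date: Date = Date()) -> Int {
    let iosWeekday = Calendar.current.component(.weekday, from: date) // 1 = Sunday
    return iosWeekday == 1 ? 7 : iosWeekday - 1
}

// iOS sensors tend to report angles in radians; Pocket Code works in degrees.
func degrees(fromRadians radians: Double) -> Double {
    return radians * 180.0 / .pi
}

// A touch at (0, 0) is the top-left corner on iOS; Pocket Code's stage puts
// (0, 0) in the centre of the screen, with y growing upwards.
func stagePosition(fromTouch touch: CGPoint, in screenSize: CGSize) -> CGPoint {
    return CGPoint(x: touch.x - screenSize.width / 2,
                   y: screenSize.height / 2 - touch.y)
}
```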

By the time I started working on the loudness sensor, I thought nothing could surprise me. Well, I was wrong: who would have thought that a value of 0 on iOS means an extremely loud noise, whereas something around -160 means absolute silence? Background and transparency both had a range of [-1, 1] on iOS, whereas on Android the first had a range of [0, 200] and the second a range of [0, 100]. I didn't even know whom to judge anymore; I just wrote two functions to change the intervals, but little did I know that transparency was descending on iOS and ascending on Android.
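The interval changes themselves boil down to a linear remap, plus a sign flip where the scales run in opposite directions. The sketch below uses the ranges quoted above; it is an illustration of the idea rather than the actual functions in Catty.

```swift
// Map a value linearly from one closed interval onto another.
func remap(_ value: Double,
           from source: ClosedRange<Double>,
           to target: ClosedRange<Double>) -> Double {
    let ratio = (value - source.lowerBound) / (source.upperBound - source.lowerBound)
    return target.lowerBound + ratio * (target.upperBound - target.lowerBound)
}

// Transparency: [-1, 1] on iOS, [0, 100] on Android, with the scales running
// in opposite directions, so the iOS value is negated before remapping.
func androidTransparency(fromIOS value: Double) -> Double {
    return remap(-value, from: -1.0...1.0, to: 0.0...100.0)
}

// Background: [-1, 1] on iOS, [0, 200] on Android.
func androidBackground(fromIOS value: Double) -> Double {
    return remap(value, from: -1.0...1.0, to: 0.0...200.0)
}
```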

Anything that could be different was different. If a sensor's function was periodic on one operating system, it was not on the other. If the sensor showed a positive value when you moved the phone to the left on Android, it was bound to show a negative one on iOS. To make things worse, the modulo operator does not work on float data types! Why were the iOS objects always two and a half times bigger than those on Android? Who would have thought that the function to convert the loudness sensor from iOS to Android was exponential, and that the constant of the power would be 0.05? Not me, but some dude on StackOverflow.
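For the curious, here is roughly what those last two surprises look like in Swift. The loudness step mirrors the standard decibel-to-amplitude formula (10 raised to 0.05 times the dB value), scaled onto a 0 to 100 range; the exact scaling in Catty may differ. The second function shows the float remainder workaround: Swift's % operator is not defined on Double, so periodic sensors need truncatingRemainder(dividingBy:) instead.

```swift
import Foundation

// Loudness: iOS gives an average power in decibels (0 = very loud,
// around -160 = silence); the exponential with constant 0.05 turns it into
// a linear amplitude, which is then scaled to Pocket Code's 0 ... 100 range.
func androidLoudness(fromDecibels dB: Double) -> Double {
    let amplitude = pow(10.0, 0.05 * dB)   // 0 ... 1, exponential in dB
    return amplitude * 100.0
}

// Wrap an angle into [0, 360) using the float-friendly remainder,
// since `angle % 360.0` does not compile for Double in Swift.
func wrappedDegrees(_ angle: Double) -> Double {
    let wrapped = angle.truncatingRemainder(dividingBy: 360.0)
    return wrapped < 0 ? wrapped + 360.0 : wrapped
}
```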

Now that I look back at it, the sensors seem nice and simple. But while I was working on them I had a hard time figuring out how they behaved, how to test them and how to compare them. Once I knew how they worked on iOS and how I wanted them to work, making the conversion was fairly easy, but discovering the differences in the first place was challenging and time-consuming. I added some photos to show how the objects' size and colour are evaluated on the two operating systems, now that I have synchronised them.

Apart from the sensors, I had some tasks related to code refactoring. The iOS app is written in Objective-C and I was asked to convert some of the files to Swift. This was the most challenging part of the program, because I had linking problems and sometimes I couldn't find the right function, but where there's a challenge, there's also Carmen. I decided to keep contributing even after the summer program finished, and I will offer a helping (or disturbing) hand in converting the whole app to Swift.

All in all, participating in Google Summer of Code was an unbelievable experience! I was assigned an awesome mentor who helped me a lot, I learnt many things (I greatly improved my Git skills), and I had lots of fun discovering ways to test and synchronise the sensors, all this while helping an open source organisation create an app that teaches children to code.

You can read more about the project description on the Google Summer of Code official page, and you can check out the improvements I made on GitHub — Sensor documentation.
