Android Things and Touch Display compatibility
Getting the non-official RPi screens up and running just for display output together with Android Things was quite a hassle, as you can read in my previous posts on Android Things and Displays (Waveshare and Kedei). But in the end it worked, although the touch input did not…
It turns out that the Waveshare uses an XPT2046 touch controller. For the Kedei display it was a little harder to find out which touch controller it uses, but it seems to be some kind of rip-off of the XPT2046. I started with the Waveshare. The first hurdle to overcome was how to send any detected x-y press to the Android OS. I stumbled upon some code on GitHub for a 7" Waveshare USB touch display driver for Android. Android has the InputDriver class that you can instantiate with a builder: set the source to touchscreen, set the name and version of your custom driver, and set the maximum values for the display's X and Y axes. That's it. As soon as you are able to retrieve the touches from the touch screen, you can just emit those values to the InputDriver instance. To do that I created a Service that I launch from the Application. In that Service I create the InputDriver instance and start a background thread with a never-ending loop. Before the loop starts I get a reference to the screen, so that inside the loop I can fetch the touch coordinates from it and pass them on to the InputDriver.
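The structure described above can be sketched roughly like this in plain Kotlin. Note that `TouchScreen` is a hypothetical interface standing in for the real SPI-backed screen class, and the `emit` callback stands in for `InputDriver.emit()`; none of these names are the actual Android Things API.

```kotlin
// Hypothetical stand-in for the SPI-backed screen class: returns the
// latest (x, y, pressed) reading, or null when nothing new is available.
interface TouchScreen {
    fun readTouch(): Triple<Int, Int, Boolean>?
}

// The background thread's never-ending loop: fetch coordinates from the
// screen and pass each reading on to the injected emit callback (which,
// on Android Things, would forward them to the InputDriver instance).
class TouchPollingLoop(
    private val screen: TouchScreen,
    private val emit: (x: Int, y: Int, pressed: Boolean) -> Unit
) {
    @Volatile
    var running = true

    fun run() {
        while (running) {
            val t = screen.readTouch() ?: continue
            emit(t.first, t.second, t.third)
        }
    }
}
```

In the real Service, `run()` would be called on a background thread and `running` flipped to false when the Service is destroyed.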
That was the easy part! Now I had to get my readings from somewhere. After a lot of Googling I found out that the XPT2046 touch controller (and thus the Waveshare's) is available on an SPI bus. Enumerating the SPI devices on Android Things is easy; reading data from a device or writing to it is much more difficult, because it depends on the manufacturer's implementation. Many hours later I was lucky to find a StackExchange post about a Waveshare 3.5" display, which led me to another GitHub project: SPI_LCD. That is a project by Larry Bank, written in C as a layer on top of the SPI bus, so you don't have to do the heavy lifting yourself:
"SPI_LCD is a C library for working with the SPI-connected LCD displays which use the ST7735, ILI9341 or HX8357 controller chips. The idea is to provide a simple interface for C programmers to make use of those low-cost LCD displays without having to read the data sheet or figure out SPI programming."
Just a few weeks ago, Larry Bank added support for reading touches from the XPT2046 touch controller. After converting the core of his solution to the Android Things platform I actually got some output. But the output was totally weird… The maximum (x,y) pair that came out of it was (2018, 2031), while my display is only 800x480 pixels… The only thing that came out right was the second byte of the buffer: zero when nothing was touching and non-zero while touching. That one is important to get right, because you have to emit that touching boolean to the InputDriver along with the x and y values.
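As a minimal sketch, the press detection on the raw buffer looks like this. The buffer layout is an assumption drawn from my observations above (byte index 1 is the touch indicator), not a documented format:

```kotlin
// Press detection on the raw SPI buffer read back from the XPT2046.
// Assumption from the measurements above: the second byte (index 1) is
// zero while nothing touches the panel and non-zero during a touch.
fun isPressed(buffer: ByteArray): Boolean = buffer[1].toInt() != 0
```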
I found out that if I divided the x values by 2018 and multiplied them by 800, and divided the y values by 2031 and multiplied them by 480, touching the center of the screen matched the calculated (x,y) quite well. I tuned these divisors a little further and ended up with 2030 for X and 2100 for Y. Then my center point was perfect! But moving my pointer from the center point towards the edges gave me a small offset. Horizontally (all the way to the left or right) there was an offset of about 10 to 20 pixels for X; vertically the offset for Y was about 16 to 24 pixels. This offset appears to be consistent, and it grows linearly from the center point towards the edges. Thus I can easily calculate the number of pixels to compensate for, depending on how far between the center point and the edge the touch coordinate lies.
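Put into code, the scaling plus the linear edge compensation could look like the sketch below. The edge offsets of 15 and 20 pixels are illustrative midpoints of the ranges above, and whether the correction should be added or subtracted may differ per panel:

```kotlin
// Map a raw XPT2046 reading to screen pixels, then compensate for the
// offset that grows linearly from 0 at the center to `edgeOffset` at
// the edges. The sign of the correction is an assumption — tune it
// (and the constants) against your own panel.
fun scale(raw: Int, rawMax: Float, screenSize: Int, edgeOffset: Float): Int {
    val p = raw / rawMax * screenSize
    val rel = (p - screenSize / 2f) / (screenSize / 2f)  // -1 at one edge … +1 at the other
    return (p - rel * edgeOffset).toInt().coerceIn(0, screenSize - 1)
}

// For this 800x480 panel: x = scale(rawX, 2030f, 800, 15f)
//                         y = scale(rawY, 2100f, 480, 20f)
```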
That is already quite good, but in practice I noticed two things…
1. When I started typing on the keyboard, other keys got pressed than the one I was touching. Sometimes it was the key next to mine, sometimes a key on the other side of the keyboard, nowhere near where I had touched.
2. And when dragging a slider from left to right, while I was slowly moving to the right the slider kept jumping slightly left and right around my pen.
After analysing a lot of the X and Y values read from the touch controller, I saw that for one single press on a button, for example, the Android Things OS collects a few milliseconds of touch data which it somehow uses to determine where to click. But quite often the very last reading was tens of pixels (sometimes even 200 pixels or so) away from the position I was clicking; sometimes the X value was off, sometimes the Y value, and sometimes both. With long touches I saw the same: after a variable number of readings there was exactly one reading that was totally off. So I came up with quite a simple solution. Every time I detect a reading that is more than a certain number of pixels away from the previous reading (while the time between the two readings is only a few milliseconds and the press was not released in between), I ignore that reading and just pass the previous (x,y) coordinates on to the InputDriver. I also keep a flag so that on the next pass I do not ignore the reading if it jumps again. That is because in the measurements I inspected this happened at most once in a row and the next reading was fine again; but if you move around quickly (drag your pointer across the screen) the algorithm would otherwise swallow the movement, since you will most likely move more than 12 pixels per reading. Oh yeah, the threshold for detecting a flaky reading turned out to be 12 pixels, determined after a lot of testing.
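A sketch of that spike filter in plain Kotlin (the few-milliseconds timing check is left out for brevity; names and structure are mine, not the library's exact code):

```kotlin
import kotlin.math.abs

// A reading that jumps more than JUMP_THRESHOLD px from the previous one
// while the press is held is replaced by the previous coordinates — but
// only once in a row, so a genuinely fast drag is not swallowed.
const val JUMP_THRESHOLD = 12

class SpikeFilter {
    private var prevX = -1
    private var prevY = -1
    private var skippedLast = false

    fun filter(x: Int, y: Int, pressed: Boolean): Pair<Int, Int> {
        if (!pressed || prevX < 0) {          // release or fresh press: accept as-is
            prevX = x; prevY = y; skippedLast = false
            return Pair(x, y)
        }
        val jump = abs(x - prevX) > JUMP_THRESHOLD || abs(y - prevY) > JUMP_THRESHOLD
        if (jump && !skippedLast) {
            skippedLast = true                 // drop this single flaky reading
            return Pair(prevX, prevY)
        }
        skippedLast = false                    // a second jump in a row is real movement
        prevX = x; prevY = y
        return Pair(x, y)
    }
}
```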
Then the other problem: the shivering of the slider control has to do with the inaccuracy of the touch screen. When you keep pressing at almost exactly one point, the readings will report that point but also wander around it: a little up, down, left, right, or spot on. So I came up with the inverse of the previous solution: when a movement of less than 12 pixels is detected on the x or y axis, you are most likely not moving the pointer at all, so we don't use the incoming x or y value but simply keep the previous one. This is applied on every reading, and the result is tremendous. The slider now moves perfectly from left to right and back, and you can also hold the slider at one position without it jumping around.
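The dead-band can be sketched per axis like this (again with illustrative names, and 12 pixels as the threshold from the text):

```kotlin
import kotlin.math.abs

// Movements smaller than DEADBAND px on an axis are treated as sensor
// noise: the previous value for that axis is kept instead.
const val DEADBAND = 12

class JitterFilter {
    private var prevX = -1
    private var prevY = -1

    fun filter(x: Int, y: Int): Pair<Int, Int> {
        if (prevX < 0) { prevX = x; prevY = y; return Pair(x, y) }
        if (abs(x - prevX) >= DEADBAND) prevX = x   // real movement on X
        if (abs(y - prevY) >= DEADBAND) prevY = y   // real movement on Y
        return Pair(prevX, prevY)
    }
}
```

Note how the two filters complement each other: the spike filter rejects single large jumps, while this one suppresses small wobble.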
Next I wanted to get the Kedei 3.5" screen working as well, of course. There is less information to be found on that screen and its touch controller. After some looking around I concluded it must be the same one as on the Waveshare, the XPT2046. So I gave it a try. Initially it did nothing, but after I switched from channel 1 (which I used for the Waveshare display) to channel 0, I had output immediately.
There were, however, three things to adapt:
1. First, the way to detect presses did not work anymore. For the default XPT2046 touch controller implementation you check the second byte in the buffer: if it differs from 0, a touch is captured. On this board the value has to differ from 127 instead…
2. Secondly, the X and Y readings were swapped, so that's an easy one.
3. And last but not least, after swapping the readings, the Y value itself also turned out to be inverted. Again an easy fix: invert the value.
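The three tweaks can be sketched like this; using 2100 as the raw Y maximum is carried over from the Waveshare calibration above and is an assumption for this board:

```kotlin
// Kedei-specific adaptations on top of the plain XPT2046 reading.
// RAW_MAX_Y is assumed to match the Waveshare calibration value.
const val RAW_MAX_Y = 2100

// 1. A press is signalled by the status byte differing from 127, not from 0.
fun kedeiIsPressed(statusByte: Int): Boolean = statusByte != 127

// 2. The X and Y readings arrive swapped; 3. after the swap, Y is inverted.
fun kedeiAdapt(rawX: Int, rawY: Int): Pair<Int, Int> {
    val x = rawY
    val y = RAW_MAX_Y - rawX
    return Pair(x, y)
}
```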
That was basically it to get touch input working on that display as well!
I have put all of this code in an easy-to-use library that is available on GitHub: TouchInputDisplayDriver. It currently supports only these two touch input screens, but any screen can be added. So feel free to contribute your display driver to the library, and please let me know your findings on the Waveshare and Kedei drivers with screens of other dimensions.