Smart Signs & Android Things: A case study

During the spring semester, three students in the Electrical & Computer Engineering program at Rowan University worked with the developer previews of Android Things and the Google Assistant SDK to prototype a new way of interacting with devices you see every day.

At the university, there is the concept of an engineering clinic. Juniors and seniors must take a clinic each semester. These can be research projects initiated by students, professors, or companies, and they give students hands-on experience with real-world engineering problems.

The Android Things platform has evolved from a developer preview to the recently released version 1.0, the first stable version of the framework. The platform gives developers the ability to build not just embedded devices, but intelligent, interactive ones. The Internet of Things is not just about putting computers into everyday objects; it is about creating novel and useful experiences for the people who use those objects.

From Digital Signage to Interactive Signage

Digital signage is a growing industry and is attractive to many businesses, restaurants, and other facilities because it lets them easily control and change the content that is displayed. However, the user experience has not improved much in the move from bulletin boards to digital screens.

Digital signs present users with a lot of information, forcing them to parse and make sense of everything on the screen. Giving the business control over the content takes control away from the user. Content on the signs can also change suddenly, such as transitioning to a new set of information, causing readers to lose their place and feel even more lost.

What should digital signage look like instead? A user should get an experience suited to them. When they walk up to a digital map in the mall, they are looking for a specific place. When they check a digital calendar of upcoming events, they want the time of a specific event. How can these signs become interactive and give a tailored experience?

One easy way is to use the Google Assistant SDK. With it, users are able to ask questions and the Google Assistant can use natural language processing to identify their intent and give them an answer. In addition to all of the general knowledge of the Assistant, developers can add domain-specific knowledge and capabilities through custom device actions.

If you’re in a mall, asking for the restroom could prompt the sign to highlight its location on a map and show directions to it. You could ask about upcoming events and get the time of a specific event or a list of events on a topic of your choice.

Many more use cases come to mind: local weather, news, and advertised discounts are all potential features that would create a more personalized and richer user experience.

Prototyping

The group of students began with the Android Things starter kit, based on the i.MX7D and running Developer Preview 0.6. Early in the semester, they wrote a design document specifying what they would build and which technologies they would use. Over the course of the semester, they put together a proof of concept and demonstrated the completed project at the end-of-year clinic showcase.

Example of the sign’s zero state

Starting from the Assistant sample project, they added a display that shows information before a question is asked. This zero state shows the current time and the local weather, as seen in the image above.
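For illustration, here is a minimal Kotlin sketch of what such a zero-state screen might look like. It assumes a layout (activity_zero_state) with a clock view and a weather view; the names and the weather hook are hypothetical, not the students’ actual code.

import android.app.Activity
import android.os.Bundle
import android.os.Handler
import android.os.Looper
import android.widget.TextView
import java.text.SimpleDateFormat
import java.util.Date
import java.util.Locale

// Hypothetical zero-state screen: a clock that refreshes every minute plus a
// weather line that is filled in whenever a forecast fetch completes.
class ZeroStateActivity : Activity() {

    private val handler = Handler(Looper.getMainLooper())
    private lateinit var clockView: TextView
    private lateinit var weatherView: TextView

    private val clockTick = object : Runnable {
        override fun run() {
            clockView.text = SimpleDateFormat("h:mm a", Locale.US).format(Date())
            handler.postDelayed(this, 60_000L) // refresh once a minute
        }
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_zero_state) // assumed layout with the two views
        clockView = findViewById(R.id.clock)
        weatherView = findViewById(R.id.weather)
    }

    override fun onResume() {
        super.onResume()
        handler.post(clockTick)
    }

    override fun onPause() {
        handler.removeCallbacks(clockTick)
        super.onPause()
    }

    // Called by whatever weather client the app uses once a forecast arrives.
    fun showWeather(summary: String, temperatureF: Int) {
        weatherView.text = "$summary, $temperatureF°F"
    }
}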

Custom device actions expanded the capabilities of the app to handle custom grammars specific to the students’ use case. Query patterns were placed in the action package, as shown below:

“Where is (room)? $SchemaOrg_Number:number”

“How do I get to (room)? $SchemaOrg_Number:number”

When someone asks a question like “How do I get to 101” or “Where is room 216”, the app receives an action named Room_Number with the room number as a parameter.
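Exactly how the action is delivered depends on the Assistant SDK sample code the app is built on, but handling it can be as simple as a dispatch on the action name. The Kotlin sketch below assumes a callback that hands over the action name and its parameters as JSON; onDeviceAction and showDirectionsTo are hypothetical names, not part of the SDK.

import org.json.JSONObject

// Hypothetical dispatch for custom device actions: the action name matched by
// the query pattern arrives with its captured parameters, and the app routes
// it to the directions screen.
fun onDeviceAction(actionName: String, params: JSONObject) {
    when (actionName) {
        "Room_Number" -> {
            // The $SchemaOrg_Number:number capture from the query pattern.
            val room = params.optInt("number", -1)
            if (room > 0) {
                showDirectionsTo(room) // assumed helper that renders the map and steps
            }
        }
        else -> {
            // Unrecognized actions fall through to the Assistant's spoken response.
        }
    }
}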

From this, the app is able to show you directions to your destination. The sign’s location, which is not expected to change, can be hardcoded for each device. An example of visual directions is shown below.

Example of viewing directions from the sign
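Since each sign’s position is fixed, the directions themselves can be as simple as a hardcoded table keyed by room number. The sketch below is hypothetical (the location and routes are made up), but it shows the idea; a more configurable version could load the table from a per-device configuration file.

// Hypothetical: the sign's own position, fixed per device.
data class SignLocation(val building: String, val floor: Int, val nearRoom: Int)

val THIS_SIGN = SignLocation(building = "Engineering Hall", floor = 1, nearRoom = 100)

// Hypothetical routing table: room number -> step-by-step directions from this sign.
private val directionsByRoom: Map<Int, List<String>> = mapOf(
    101 to listOf("Walk straight past the lobby", "Room 101 is the second door on the left"),
    216 to listOf("Take the stairs to the second floor", "Turn right", "Room 216 is at the end of the hall")
)

// Returns the steps to display, falling back to a friendly message for unknown rooms.
fun directionsTo(room: Int): List<String> =
    directionsByRoom[room] ?: listOf("Sorry, I don't know where room $room is.")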

The right tools for the job

Android Things makes it pretty easy to build these kinds of IoT devices. The group was able to take existing sample code written for mobile apps and incorporate it into their IoT app, reusing familiar libraries and tools like ConstraintLayout without any specific modifications, as sketched below.
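As a rough illustration (module setup and versions are assumptions, not taken from the students’ project), the dependencies for an Android Things app look just like a phone app’s, with one extra compileOnly entry for the Things support library:

// app/build.gradle.kts (sketch)
dependencies {
    // Android Things support library, provided by the device at runtime.
    compileOnly("com.google.android.things:androidthings:1.0")
    // The same UI libraries a phone app would use, e.g. ConstraintLayout.
    implementation("com.android.support.constraint:constraint-layout:1.1.0")
}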

The final result was a single APK that handles the entire user experience end to end. People using the signs get an intelligent experience with high-quality answers and a rich visual interface. When there is a bug fix or a new feature, it can be added to the APK and delivered through the platform’s integrated over-the-air update capability.

What’s next?

Now that the group has put together a proof of concept, their focus for the fall semester will be to make it easier to integrate the signs into the university’s engineering building and to make the software more configurable so it can be used in other places. Even though the signs will be in a public space, the security features built into the platform should keep that from being a problem.