Microsoft Event: Mobile + AI

Tony Law
Tech @ Domain
4 min read · May 22, 2018

A few of us Domain developers recently attended this half-day Microsoft event in the Sydney CBD. The two main speakers of the event, Erik Polzin and Colby Williams, are based in the Bay Area of the US, and they joined Microsoft via the Xamarin acquisition. A few other Microsoft employees of the Asia Pacific region were also in attendance, and each spoke briefly about their favourite mobile app that uses artificial intelligence (AI), as well as their favourite app in general. None of them mentioned the Domain app, to our disappointment.

The event was broken down into three segments:

Excite

Here we first learnt about the history of Microsoft.

The speaker talked about how AI has changed the way we go about our everyday lives, from using Google Assistant to perform tasks to Spotify creating automated, personalised playlists.

I then learnt a new industry buzzword: Appification, which of course means that a person is qualified to sell apples, e.g.

Person looking for a job at the supermarket (or Apple Store): “Yes I’ve got the appification for this job.”

(but seriously, I think this ‘word’ just means that everything is an app now)

Speaking of apps, we were reminded how important first impressions are when a user installs an app. If the experience isn't good, chances are the user won't open the app a second time.

We were then shown a clip of Minority Report, which demonstrated how a lot of things in that futuristic movie actually exist in the present day.

The first part wrapped up with a Turing test of sorts. Two songs were played: one composed by a human, the other by an AI program. The room was pretty much split 50/50 on which song was which, showing that AI is capable of making music comparable to a human-made composition.

Explore

“Our industry does not respect tradition, it only respects innovation” — Satya Nadella

This segment was mostly about some of the innovative services that Microsoft offers. We learnt that a lot of money (billions!) is invested into Microsoft's R&D department.

We learnt a little bit about Azure Functions, Microsoft's equivalent of AWS Lambda, and how it makes it possible to scale just the compute-intensive part of your app.
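The idea, as I understood it, is to carve the CPU-bound work out of your app so the platform can scale just that piece. A minimal sketch of that pattern in Python (the prime-counting task and the route are my own illustrative examples, not from the talk):

```python
# A hypothetical compute-heavy task you might carve out into its own function:
def count_primes(n: int) -> int:
    """Naive sieve of Eratosthenes -- the kind of CPU-bound work worth isolating."""
    sieve = [True] * max(n, 2)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return sum(sieve)

# In an Azure Function (Python model, requires the azure-functions package)
# the same logic would sit behind an HTTP trigger, roughly:
#
#   import azure.functions as func
#
#   def main(req: func.HttpRequest) -> func.HttpResponse:
#       n = int(req.params.get("n", "1000"))
#       return func.HttpResponse(str(count_primes(n)))
```

The app's web tier stays put while Azure spins up more function instances only when this endpoint gets hammered.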

We were also made aware of Cosmos DB, a NoSQL database on Azure.

Next up was a live demo of setting up automated builds using App Center. It supports a wide variety of platforms, including iOS, macOS and Android.

Cognitive Services was next on the agenda. The services include:

  • Vision
  • Speech
  • Language
  • Knowledge
  • Search

We then saw a live demo of Seeing AI. By pointing the camera at a person, the app was able to estimate that person's age and facial expression. It was also able to read a barcode and determine the product. Most impressively, it could read out text handwritten on a whiteboard.
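Capabilities like the age and face estimation in that demo map onto the Vision REST APIs. A stdlib-only sketch of building (but not sending) an Analyze Image request, roughly as the API looked around 2018 — the region, key and image URL are all placeholders:

```python
# Sketch of a Cognitive Services Vision "Analyze Image" REST request.
# Endpoint region, subscription key and image URL are placeholders.
import json
import urllib.request

def build_analyze_request(endpoint: str, key: str, image_url: str,
                          features: str = "Description,Faces") -> urllib.request.Request:
    """Construct (but don't send) the Analyze Image request."""
    url = f"{endpoint}/vision/v2.0/analyze?visualFeatures={features}"
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            # auth header used across the Cognitive Services APIs
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_analyze_request(
    "https://westus.api.cognitive.microsoft.com",  # placeholder region
    "YOUR_KEY",
    "https://example.com/whiteboard.jpg",
)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns JSON describing the image, including face rectangles with estimated ages when `Faces` is requested.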

Finally, to wrap up the Explore segment, we briefly looked at Bot Framework and how it can integrate with Cognitive Services.

Experiment

After a quick break, a live demo of QnA Maker was shown. There were a few technical issues with the demo, which showed the importance of having video walkthroughs as a backup measure.
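For the curious: once a QnA Maker knowledge base is published, querying it is a single POST to its generateAnswer endpoint. A stdlib-only sketch — the service host, knowledge-base ID, endpoint key and question are all placeholders of mine:

```python
# Sketch of querying a published QnA Maker knowledge base.
# Host, KB id and endpoint key below are placeholders.
import json
import urllib.request

def build_qna_request(host: str, kb_id: str, endpoint_key: str,
                      question: str) -> urllib.request.Request:
    """Construct (but don't send) a generateAnswer request."""
    url = f"{host}/qnamaker/knowledgebases/{kb_id}/generateAnswer"
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"EndpointKey {endpoint_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_qna_request(
    "https://my-qna-service.azurewebsites.net",   # placeholder service host
    "00000000-0000-0000-0000-000000000000",       # placeholder KB id
    "YOUR_ENDPOINT_KEY",
    "What are your opening hours?",
)
```

The response is JSON with a ranked list of candidate answers and confidence scores, which is what makes it a natural back end for a Bot Framework bot.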

Another live demo (customvision.ai) was shown, which utilised image recognition with machine learning. Once again, there were some minor technical difficulties with the demo. As the room was full of developers, I think we all sympathised with the presenter when things didn't work as intended. "That's OK, we believe you," one gentleman called out, which showed the camaraderie between us tech people (and also because it was close to 5pm).

Overall, it was a very informative afternoon. The key takeaway for me is the importance of machine learning, which Cognitive Services rely upon. It echoes what our CTO said at an internal meeting: we should all strive to learn more about ML, as it is the present and future of our industry.
