What I learned from Google I/O 2016

I just got back from Google I/O 2016 in Mountain View and want to share what I learned.

Day 0

With so many parties on the evening of Day 0, I ended up going to only two: the Women Techmakers dinner at Garfield Park near the Googleplex, and the Android Study Group (ASG) mixer at Quora in Mountain View. The Women Techmakers dinner was an amazing experience, with over 1,000 women in tech! It was fun hanging out with my Android lady friends.

Day 1

The Google I/O keynote on the Shoreline Amphitheatre main stage was incredible. We were so excited as we entered the theater. The live paper airplane game before the keynote was really fun. And I made my very first sketchnote of the keynote!

My sketchnote highlights this year’s I/O announcements: Google Assistant, Google Home, Allo, Duo, Android Wear 2.0, Daydream VR, Instant Apps, and Firebase. Firebase now has a new logo! No longer just a realtime database, Firebase has been expanded with many new features: analytics, cloud messaging, notifications, and crash reporting. Android Studio 2.2 now has a test recorder that generates Espresso code automatically. You can read more details on the official Google blog here.
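To give a flavor of what the test recorder produces, here’s a minimal Espresso test along those lines; MainActivity and the R.id view ids are hypothetical placeholders standing in for your own app.

```java
import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.action.ViewActions.typeText;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.ViewMatchers.isDisplayed;
import static android.support.test.espresso.matcher.ViewMatchers.withId;

@RunWith(AndroidJUnit4.class)
public class MainActivityTest {

    // Launches the (hypothetical) MainActivity before each test.
    @Rule
    public ActivityTestRule<MainActivity> activityRule =
            new ActivityTestRule<>(MainActivity.class);

    @Test
    public void greetingIsShownAfterClick() {
        // Type a name, tap the button, then check the greeting appears.
        onView(withId(R.id.name_field)).perform(typeText("Ada"));
        onView(withId(R.id.greet_button)).perform(click());
        onView(withId(R.id.greeting_text)).check(matches(isDisplayed()));
    }
}
```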

After lunch I attended What’s New in Android?, which covered ConstraintLayout, multi-window support, drag and drop between activities, updates to notifications, Doze improvements, and more. Take a look at the Android N Preview to play with the new features hands-on.
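One of those notification updates is direct reply, which lets users respond inline from the notification shade. Here’s a rough sketch of how an N-era app might wire it up; ReplyReceiver, the ids, and the strings are hypothetical.

```java
import android.app.Notification;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.app.RemoteInput;
import android.content.Context;
import android.content.Intent;
import android.graphics.drawable.Icon;

public class ReplyNotification {
    static final String KEY_TEXT_REPLY = "key_text_reply";

    static void show(Context context) {
        // The RemoteInput defines the free-form text field shown
        // inline in the notification.
        RemoteInput remoteInput = new RemoteInput.Builder(KEY_TEXT_REPLY)
                .setLabel("Reply")
                .build();

        // ReplyReceiver is a hypothetical BroadcastReceiver that reads
        // the reply text back out with RemoteInput.getResultsFromIntent().
        PendingIntent replyIntent = PendingIntent.getBroadcast(
                context, 0, new Intent(context, ReplyReceiver.class), 0);

        // Attaching the RemoteInput to an action lets the user reply
        // without ever opening the app.
        Notification.Action replyAction = new Notification.Action.Builder(
                Icon.createWithResource(context, android.R.drawable.ic_menu_send),
                "Reply", replyIntent)
                .addRemoteInput(remoteInput)
                .build();

        Notification notification = new Notification.Builder(context)
                .setSmallIcon(android.R.drawable.ic_dialog_email)
                .setContentTitle("New message")
                .setContentText("Hi there!")
                .addAction(replyAction)
                .build();

        NotificationManager nm = (NotificationManager)
                context.getSystemService(Context.NOTIFICATION_SERVICE);
        nm.notify(1, notification);
    }
}
```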

I was lucky to get into the session How to build a smart RasPi Bot with Cloud Vision and Speech API. It was pretty cool to see the cute little robot, built with a Raspberry Pi and the Google Cloud Vision and Speech APIs, recognize different objects and respond to questions in different languages. It was also interesting to learn that the use of deep learning at Google has been trending up significantly.
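For a sense of how such a bot might do its object recognition, here’s a rough sketch of a label-detection request against the Cloud Vision REST API; the API_KEY placeholder and the lack of error handling are deliberate simplifications, and this is my guess at the approach rather than the demo’s actual code.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;
import java.util.Scanner;

public class VisionLabels {
    public static void main(String[] args) throws Exception {
        // Base64-encode an image, e.g. a frame from the robot's camera.
        byte[] image = Files.readAllBytes(Paths.get(args[0]));
        String content = Base64.getEncoder().encodeToString(image);

        // Ask for the top five labels describing the image.
        String body = "{\"requests\":[{"
                + "\"image\":{\"content\":\"" + content + "\"},"
                + "\"features\":[{\"type\":\"LABEL_DETECTION\",\"maxResults\":5}]}]}";

        URL url = new URL(
                "https://vision.googleapis.com/v1/images:annotate?key=API_KEY");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }

        // The response JSON lists labelAnnotations with descriptions
        // and confidence scores for the recognized objects.
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            System.out.println(in.useDelimiter("\\A").next());
        }
    }
}
```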

Day 2

I attended What’s New in Android Wear 2.0? As summarized in my sketchnote, Android Wear 2.0 introduces new UI elements such as a navigation drawer, keyboard input, standalone apps that no longer need a phone connection, smart messaging replies, more interactive watch faces, and better fitness tracking. With Android Wear 2.0, hopefully my “smart” watch actually becomes smart and useful, and no longer just something for showing notifications.

I learned many useful tips and best practices from the Advanced Espresso and The experts’ guide to Android development tools sessions. During Supercharge Firebase with Google Cloud Platform, we saw a demo of an app that combines the Firebase, YouTube, Speech, and Translate APIs.

The popular Speechless session was fun and relaxing. Chet Haase gave some invaluable advice on how (not) to use selfie sticks.

Day 3

While riding the shuttle from Cupertino to Mountain View, stuck in traffic, I watched Google’s Vision on Machine Learning, which had very informative Q&As with the experts. Here are a few sample Q&As:

Q: Machine learning is not new. Why are we hearing about it a lot now?

A: We are in an AI spring. Improvements in ML are making it more accurate, more powerful, and more available to developers.

Q: How can we use machine learning in products?

A: 1) Turbocharge current products: speech recognition and translation, for example. 2) Unlock new product use cases: thanks to mobile, many real-world problems can now be solved by ML; transportation is one example.

The ATAP session, “Bridging the physical and digital. Imagine the possibilities.”, was really cool! We got updates on Project Jacquard, Project Soli, Project Ara, and 360° VR films. I would love to try on the Levi’s Commuter jacket with Project Jacquard when it comes out in 2017!

Later, I learned how RecyclerView came about and how best to use it from the session RecyclerView Ins and Outs. I didn’t get into Breakthroughs in Machine Learning even though I arrived 45 minutes before the session start time, so I hung out at the codelab area and went through Hello, Beacons! Proximity & Context-aware Apps instead.
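The core idea behind RecyclerView is the adapter/ViewHolder pattern, which recycles scrolled-off rows instead of re-inflating them. Here’s a minimal sketch of an adapter; the item layout R.layout.item_row and its R.id.title TextView are hypothetical.

```java
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;

import java.util.List;

public class TitleAdapter extends RecyclerView.Adapter<TitleAdapter.ViewHolder> {

    private final List<String> titles;

    public TitleAdapter(List<String> titles) {
        this.titles = titles;
    }

    // The ViewHolder caches view lookups so recycled rows can be
    // rebound cheaply without calling findViewById again.
    static class ViewHolder extends RecyclerView.ViewHolder {
        final TextView title;

        ViewHolder(View itemView) {
            super(itemView);
            title = (TextView) itemView.findViewById(R.id.title);
        }
    }

    @Override
    public ViewHolder onCreateViewHolder(ViewGroup parent, int viewType) {
        // Called only when no recycled row is available to reuse.
        View view = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.item_row, parent, false);
        return new ViewHolder(view);
    }

    @Override
    public void onBindViewHolder(ViewHolder holder, int position) {
        // Called for every row as it scrolls into view.
        holder.title.setText(titles.get(position));
    }

    @Override
    public int getItemCount() {
        return titles.size();
    }
}
```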

Overall, I’m excited that there were so many sessions at I/O on machine learning, and especially excited that machine learning is now widely available to app developers without a background in ML. Firebase, expanded with tons of new features, will give app developers a solid backend infrastructure. Android Wear 2.0 will make our watches smarter, and the new tools in Android Studio 2.2 will make Android developers’ lives easier.


If you didn’t get to attend Google I/O this year or missed a session, you can catch up with the recorded sessions on YouTube here.