Google held its I/O 2017 event at the Shoreline Amphitheatre on May 17.
Sundar Pichai kicks off the event with a few stats.
- Google Search, Android, Google Chrome, YouTube, Google Maps, Google Play, and Gmail have a billion monthly active users each.
- 1 billion hours of video are watched each day, on YouTube.
- 1 billion kilometers are navigated each day, on Google Maps.
- Google Drive has 800 million monthly active users.
- 3 billion objects are uploaded to Google Drive each week.
- Google Photos has 500 million monthly active users, with users uploading 1.2 billion photos each day.
- Active Android devices cross 2 billion.
- Google Assistant is on 100 million devices.
Gmail gets a Smart Reply feature, which suggests replies based on the content of the emails.
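Conceptually, reply suggestion maps the content of an incoming email to a handful of likely responses. Gmail's real Smart Reply is driven by a neural sequence-to-sequence model; the toy keyword-based sketch below (all rules and canned replies invented for illustration) only shows the input-to-suggestions shape of the feature:

```python
# Toy sketch of reply suggestion -- illustrative only. Gmail's Smart Reply
# uses a sequence-to-sequence neural model, not keyword rules like these.

REPLY_RULES = [
    ("meeting", ["Sounds good, see you there.", "Can we reschedule?", "What time works for you?"]),
    ("thanks", ["You're welcome!", "Happy to help.", "Anytime."]),
    ("question", ["Let me check and get back to you.", "Good question.", "I'll find out."]),
]

def suggest_replies(email_body, max_suggestions=3):
    """Return up to max_suggestions canned replies matched by keyword."""
    body = email_body.lower()
    suggestions = []
    for keyword, replies in REPLY_RULES:
        if keyword in body:
            suggestions.extend(replies)
    return suggestions[:max_suggestions]

print(suggest_replies("Are we still on for the meeting tomorrow?"))
```

The real system's advantage is that a learned model generalizes to emails that share no keywords with its training data, which a rule table never can.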
Google Pixel will soon be able to remove large obstructions from a photo, such as a chain-link fence between the photographer and the subject.
Google Lens can understand what you are looking at, and can help you take actions based on that information. It will first be available through Google Assistant and Google Photos.
Google.ai is a collection of efforts and teams across the company, focused on bringing the benefits of AI to everyone. Google.ai will focus on state-of-the-art Research, Tools and Infrastructure, and Applied AI.
Google.ai has been applied to disease research, to predicting the behaviour of molecules in chemistry, and to improving Google Search and Google Assistant.
Scott Huffman takes the stage to talk about what’s new in Google Assistant.
Now, each member of the family gets personalized responses, based on their voice. Google Assistant now allows typed queries. It can translate what it sees, and carry on conversations about what it sees. Google Lens, integrated with Google Assistant, helps with setting reminders and adding items to calendars. The Google Assistant SDK allows third-party hardware manufacturers to build the Google Assistant into their devices. Google Assistant also gets support for new languages.
Google Assistant is now coming to the iPhone!
Valerie Nygaard talks about how Google Assistant helps in performing actions across various services on the Google Platform.
Third-party developers can perform transactions through the Google Assistant. Smart home devices can now be controlled through conversations with Google Home.
Rishi Chandra takes the stage to update us on Google Home.
Google Home now supports individual voice recognition for up to 6 members of a family, and offers a personalized user experience for each member. You can get your personal reminders, appointments, commutes, and news sources based on your voice.
Proactive Assistance — Google Home will now let you know about upcoming plans and traffic, based on your calendar, without you asking for it.
Hands-Free calling — You can call anyone in the US and Canada, for free, by asking Google Home to call anyone in your contacts. The voice determines whose contacts Google Home chooses to use. You can link your mobile number to the Google Assistant, so that the call receiver knows it’s you.
Streaming — Google Home now supports new music-streaming services from Spotify, SoundCloud, and Deezer, and new video-streaming services, including HBO Now.
Visual Responses — Google Home now utilizes the devices around you, like sending directions to your phone. The new Chromecast update allows Google Home to answer your queries regarding your calendar, weather, or video services, directly on your TV screen.
Anil Sabharwal now talks about the updates to Google Photos.
Suggested Sharing — Google Photos will remind you to share the new photos, and also suggest the photos and people to share the photos with.
Shared Libraries — You can choose to automatically share all new photos, or just a subset of your photos, with whomever you want. The shared photos are automatically included in search results, collages, and movies.
Photo Books — Google Photos picks the best shots from your selection, and you can order a printed book containing them. Google Photos also creates Photo Books automatically to save you time.
Google Lens is now integrated into Google Photos. Google Lens can help you identify and use the contents of your photos, like buildings, paintings, screenshots, or receipts.
Susan Wojcicki now takes the stage to give us updates on YouTube.
YouTube on the TV — Susan gives a few minor updates about YouTube viewer increase, and invites Sarah Ali to talk about YouTube on the TV.
The YouTube TV app now gets 360-degree-video support. The TV remote helps in the 360-degree navigation.
Super Chat — Barbara Macdonald (crazy alert!) talks about Super Chats, where viewers can purchase higher-priority comments, and creators can earn more. There is a new developer API integration that triggers actions in the real world, which means Super Chat comments, for example, could trigger events in the creators’ studio.
Susan invites Dave Burke to talk about Android. (Finally!)
Dave talks about two main themes.
- Picture-In-Picture — A floating window allows a video to keep playing on the screen while the user continues to use the phone uninterrupted.
- Notification Dots — A dot is shown on an app’s icon, to indicate that some activity has taken place. Long-pressing an app icon shows a popup on the screen, listing the app’s notifications.
- Autofill with Google — Form autofilling is now available within apps. APIs are also available for developers to customize the autofill experience.
- Smart Text Selection — Double-tapping any part of a phone number, house address, email address, or name automatically selects the whole entity, removing the need to drag the text-selection handles. The selection also suggests apps relevant to the selected content.
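The behaviour described above — expanding a tap into the span of a recognized entity — can be sketched with plain regular expressions. Android's actual implementation uses an on-device machine-learning model, not regexes; the patterns and sample text below are invented for illustration:

```python
import re

# Rough sketch of entity-aware selection: given a tap offset in the text,
# expand the selection to the span of a recognized entity (phone/email).
# Android's real Smart Text Selection uses on-device ML, not these regexes.

ENTITY_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),   # email address
    re.compile(r"\+?\d[\d\s().-]{6,}\d"),     # phone number
]

def smart_select(text, tap_index):
    """Return (start, end) of the entity containing tap_index, or None."""
    for pattern in ENTITY_PATTERNS:
        for match in pattern.finditer(text):
            if match.start() <= tap_index < match.end():
                return match.start(), match.end()
    return None

text = "Call me at +1 650-253-0000 or mail sundar@google.com"
span = smart_select(text, 14)  # tap somewhere inside the phone number
print(text[span[0]:span[1]])   # the whole phone number is selected
```

A tap anywhere inside the number yields the same full span, which is exactly the "no handle-dragging" experience the feature promises.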
Dave introduces TensorFlow Lite, a specialized version of the machine-learning library TensorFlow. TensorFlow Lite helps apps stay fast and small while still enabling state-of-the-art techniques, like convolutional networks (ConvNets) and LSTMs.
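One reason on-device models can be small and fast is weight quantization: storing 8-bit integers instead of 32-bit floats. The from-scratch sketch below illustrates the general technique only — it is not the TensorFlow Lite API, whose tooling performs this kind of conversion for you:

```python
# Illustrative 8-bit linear quantization of model weights -- one of the
# techniques that lets on-device frameworks like TensorFlow Lite keep
# models small. A from-scratch sketch, not the TensorFlow Lite API.

def quantize(weights):
    """Map float weights to 8-bit values plus a (scale, zero_point) pair."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 or 1.0      # avoid division by zero
    zero_point = lo
    q = [round((w - zero_point) / scale) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Approximate the original floats from the 8-bit representation."""
    return [v * scale + zero_point for v in q]

weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
print(q)                          # small integers in [0, 255]
print(dequantize(q, scale, zp))   # close to the original weights
```

Storing each weight in one byte instead of four cuts model size roughly 4x, at the cost of the small rounding error visible in the round-trip above.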
Dave calls upon Stephanie Saad Cuthbertson to talk about the second theme, Vitals, which helps keep system behaviour in a healthy state, maximizing battery life, performance, security, and reliability.
Google Play Protect — Scans all your apps, and is available out-of-the-box with Google Play.
OS Optimizations — New limits on background location and background execution reduce the consumption of system resources.
For Developers:
- Play Console Optimizations — Developers can scan apps to pinpoint the top issues which cause battery drain, crashes, and slow UI.
- Android Studio Profilers — Developers can profile network, memory, and CPU usage, and trace a process down to the exact line of code.
- New Programming Language — Kotlin is now an officially supported language in Android.
The first beta release of Android O is available today. You can check it out at android.com/beta.
Sameer Samat takes the stage to talk about Android Go, a new experience for entry-level devices.
Android Go focuses on the OS, apps, and Google Play, and will be available for devices with 1 GB of RAM or less, on all Android versions starting with Android O.
The system UI and kernel have been optimized to allow for a smooth experience on lower-memory devices.
Data Saver functionality has been integrated into the settings, and into apps like Chrome and YouTube Go, which offer offline modes and offline peer-to-peer sharing to save data.
Gboard now supports 191 languages. It can take in phonetic sentences and convert them into proper-script sentences, and it also has the Google Translate feature built in, to translate typed content into other languages.
To optimize their apps for Android Go, developers need to follow instructions on developer.google.com/billions.
Clay Bavor now takes the stage to talk about VR and AR.
Clay introduces standalone VR headsets, which are designed just for VR and do not need any cables, a phone, or a PC.
Clay takes some time to talk about Google’s Visual Positioning Service, or VPS, which is useful in AR for tracking objects and providing short-distance directions based on the objects visible in the vicinity.
Sundar is back to give a short introduction to Google For Jobs.
Job searching is easier than ever, with job listings now integrated into Google’s search results, along with filters to help you find what suits you best.
And with that, we come to the end of the keynote.
MY TAKE ON THE KEYNOTE
I enjoyed the keynote. A lot of the stuff announced was very creative, helpful, and useful. It made me think “Just give Apple 2 years to catch up to all this…” a couple of times. The event was packed with new releases, the exact opposite of Apple’s events these days.
Did you watch the event? What did you like the most? Let me know in the comments.
Thanks for reading.