Android Summit 2018! — Recap
Another eventful Android Summit took place this year on Thursday, August 16th, 2018. A strong lineup of speakers, inspired by what the new tech has to offer, presented talks on Kotlin, Flutter, the Android Jetpack libraries, Machine Learning, and more, and dove deep into other new concepts announced this year at Google I/O. It was an exciting one-day conference, with the objectives of sharing knowledge with the community and showing support for Women Who Code.
This one-day event was packed with three different tracks: one for developers, one for designers, and a testing track for quality engineers. Although it was not possible to attend every session, we could pick and attend the ones of our choice with the help of the Android Summit 2018 app, which was actually developed by interns at Capital One using Flutter and Firebase.
The summit kicked off with an opening keynote by Claire Slattery, Director of Performance at Speechless. Yes, it was an improv session for engineers! The format was non-traditional, yet engaging and very interactive. We learned "how improv helps clear doubts and changes the volume of brain levels," as Claire Slattery put it. The keynote went on to focus on how to collaborate, learn, and grow within your respective areas, followed by some fun exercises.
These interactive exercises encouraged us to step out of our comfort zones and engage in imaginary conversations with our neighbors. To mention a few: one exercise was to talk with a partner about an imaginary vacation that never took place; another was to discuss an imaginary event that never occurred on that vacation; and the most interesting was to step into a group and announce your presence. I believe the point was to get comfortable in an awkward situation.
Out of the sessions on offer, I decided to first attend the Android Navigation session by Britt Barak, a Developer Advocate at Nexmo. The Navigation Architecture Component is a library created to keep navigation consistent within and across Android apps.
The session covered managing app navigation behavior with the help of navigation components, and how to simplify your implementation in order to provide a coherent, predictable experience to users. In particular, it explained how Android tasks (collections of activities) can be managed with launch modes and intent flags to provide a symmetric push-and-pop model that keeps the back stack clean and consistent.
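That symmetric push-and-pop model can be illustrated with a small, plain-Kotlin sketch. This is a simplified simulation of a back stack, not the actual Navigation component API, and the destination names are hypothetical:

```kotlin
// Simplified simulation of an Android navigation back stack.
// Real apps would use NavController; this just models the push/pop idea.
class BackStack {
    private val stack = ArrayDeque<String>()

    // Navigating forward pushes the destination onto the stack.
    fun navigateTo(destination: String) {
        stack.addLast(destination)
    }

    // Pressing Back pops the current destination, revealing the previous one.
    fun popBack(): String? = stack.removeLastOrNull()

    // The destination currently on screen is the top of the stack.
    fun current(): String? = stack.lastOrNull()
}
```

Because every forward navigation is matched by exactly one pop, the stack never accumulates stale or duplicate screens, which is the "clean and consistent" property the talk emphasized.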
My next talk was by Doug Stevenson, a Developer Advocate on the Google Firebase team. The Android Jetpack Architecture Components were announced this year at Google I/O 2018. This session focused on the LiveData and ViewModel components, and how they work effortlessly with Firebase.
My takeaways from this class were as follows:
- LiveData makes it easy to keep the UI fresh.
- It ensures that an activity or fragment receives data only while it is on screen.
- LiveData can manage database listeners according to the state of the activity.
- It helps avoid boilerplate code and memory leaks.
- A ViewModel holds LiveData objects in order to retain data across activity configuration changes.
- Testability and readability are optimal.
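The lifecycle-aware delivery described in these bullets can be sketched in plain Kotlin. This is a deliberately minimal stand-in, not the real androidx.lifecycle.LiveData class, but it shows the core behavior: values are only delivered while the UI is active, and the latest value is replayed when the UI comes back on screen:

```kotlin
// Minimal pure-Kotlin sketch of LiveData's lifecycle-aware delivery.
// The real androidx.lifecycle.LiveData does much more (threading, multiple
// observers, lifecycle owners); this only models the on/off-screen behavior.
class SimpleLiveData<T> {
    private var value: T? = null
    private var active = false          // mirrors whether the UI is on screen
    private var observer: ((T) -> Unit)? = null

    fun observe(onChanged: (T) -> Unit) {
        observer = onChanged
    }

    // Called when the activity/fragment comes on screen:
    // the latest value is delivered immediately, keeping the UI fresh.
    fun onActive() {
        active = true
        value?.let { observer?.invoke(it) }
    }

    // Called when the UI goes off screen: updates stop being delivered,
    // which is what avoids leaks and wasted work.
    fun onInactive() {
        active = false
    }

    fun setValue(newValue: T) {
        value = newValue
        if (active) observer?.invoke(newValue)
    }
}
```

In the real library, a ViewModel would own objects like this so the held value survives configuration changes such as screen rotation.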
Overall the talk was very interesting, and I got the opportunity to meet the speaker at the end of the class. I introduced myself to Doug and asked his opinion about the summit: “I really enjoyed this conference as it was small and intimate; I felt comfortable sharing the knowledge with the community” — Doug Stevenson.
Other key points mentioned…
- Architecture written in Kotlin is shareable and testable.
- A large and growing active community.
- JetBrains provides robust tooling.
- 100% native for Android
Later in the afternoon, I attended the ML Kit! session by Pete Varvarezis, Mobile Platform Architect at Capital One. Getting started with artificial intelligence or machine learning, implementing the models in your app, or even just using the frameworks can seem complex and challenging for many developers. However, Pete's talk broke the concepts down nicely and paved a path for listeners to get their hands dirty with Google's beta release of ML Kit.
As Pete put it, “ML Kit is the beginning of AI for the people.” The talk explained how a large collection of mathematical algorithms can help build better products. ML Kit gives you access to both on-device and cloud APIs. Production-ready APIs are available for common use cases, including text recognition, face detection, barcode scanning, image labeling, and landmark recognition. You simply pass data to these APIs and receive intuitive responses. Pete also covered how one can integrate ML into an app in a few simple steps, and how it is all available through Firebase.
Meeting Hour / Reception
Later in the evening, the day ended with a closing keynote by Nitya Narasimhan, an independent software consultant. This keynote was about how to cope with stress and practice “self-care” by actively participating in protecting one's own well-being and happiness. The session was very relatable to our day-to-day lives, and I plan to put in the effort to practice what I learned here, via Control, Cope, and Comfort.
The closing keynote was followed by a reception, which gave us plenty of time to mingle and make a few new connections. Overall, it was a great turnout and I got what I was hoping for. Another great advantage of this event was networking with other tech enthusiasts and simply being part of the community. Having learned so many new concepts, I really enjoyed my time at the summit, and I was thrilled to know more than I did before. Now it is time to put things into perspective and practice all the new tech knowledge I gained at this event. I look forward to Android Summit 2019!