When I started researching Kotlin Multiplatform, the idea of using Firebase in my multiplatform projects soon occurred to me. At first there was no library that could accomplish this, until TouchLab released an implementation that lets you use Firebase and Firestore from shared business-logic code. That library is FirestoreKMP.
In a previous post, I talked about creating an app using Kotlin Multiplatform and the moko-mvvm library developed by IceRock, which provides Model-View-ViewModel architecture components for UI applications. …
During KotlinConf 2019 there were several talks about Kotlin Multiplatform. In one of them, Alexander talked about the different libraries developed by IceRock for applying the MVVM pattern and for sharing resources or widgets using the same code across platforms (Android, iOS, JS or web).
This talk sparked my curiosity to try the MVVM pattern + LiveData for Android and iOS apps, with Clean Architecture in the common shared code.
So I decided to get down to work. …
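As a first taste of what that shared code can look like, here is a minimal sketch of a ViewModel living in `commonMain`. The class and property names (`LoginViewModel`, `isLoading`) are hypothetical, and the import paths follow moko-mvvm's documented package layout, so treat the exact API surface as an assumption:

```kotlin
// commonMain — a hypothetical ViewModel shared between Android and iOS.
import dev.icerock.moko.mvvm.livedata.LiveData
import dev.icerock.moko.mvvm.livedata.MutableLiveData
import dev.icerock.moko.mvvm.viewmodel.ViewModel

class LoginViewModel : ViewModel() {
    private val _isLoading = MutableLiveData(false)
    val isLoading: LiveData<Boolean> = _isLoading

    fun onLoginPressed() {
        _isLoading.value = true
        // ...call a shared use case from the Clean Architecture domain layer,
        // then post the result back to the UI through LiveData
    }
}
```

The platform-specific UI (an Android Activity or an iOS ViewController) then only observes `isLoading`; the state and the logic stay in the shared module.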
On 5 March 2019, Google announced the release of Android Jetpack WorkManager 1.0 stable. In this post we will explain how to use WorkManager and why it is a good option for executing tasks in the background.
Usually, you need to run work on a background thread because the task is an expensive operation (like applying filters to a bitmap) or because it depends on a network request.
Until now, if you wanted to execute a long-running operation in the background, you had these options:
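Whichever of those options you pick, WorkManager now covers the same ground with far less boilerplate. A minimal sketch, assuming a hypothetical `UploadWorker` and using the `androidx.work` API:

```kotlin
import android.content.Context
import androidx.work.Constraints
import androidx.work.NetworkType
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters

// WorkManager calls doWork() on a background thread for us.
class UploadWorker(context: Context, params: WorkerParameters) : Worker(context, params) {
    override fun doWork(): Result {
        // the expensive or network-dependent task goes here
        return Result.success()
    }
}

// Enqueue the work, requiring network connectivity before it runs.
fun scheduleUpload() {
    val request = OneTimeWorkRequestBuilder<UploadWorker>()
        .setConstraints(
            Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build()
        )
        .build()
    WorkManager.getInstance().enqueue(request)
}
```

Note that `WorkManager.getInstance()` with no arguments is the 1.0 API; later releases prefer a variant that takes a `Context`.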
Android Oreo (8.0) came with many cool new features like Picture in Picture, Smart Text Selection and much more. But one of the most important changes came in notifications.
Now with Android Oreo, developers can separate the behaviour of notifications by creating a different channel for each kind. It is important to note that if no channel is specified for a notification, that notification will not appear in the status bar.
All notifications that use the same channel will have the same behaviour, so…
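Creating a channel is a one-off call before posting the first notification. A minimal sketch, where the `"downloads"` id and its description are made-up examples; the id must match the one later passed to `NotificationCompat.Builder`:

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import android.os.Build

fun createDownloadsChannel(context: Context) {
    // Channels only exist from API 26 (Oreo) onwards.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        val channel = NotificationChannel(
            "downloads",                        // channel id
            "Downloads",                        // user-visible name
            NotificationManager.IMPORTANCE_DEFAULT
        )
        channel.description = "Progress of file downloads"
        val manager = context.getSystemService(NotificationManager::class.java)
        manager.createNotificationChannel(channel)
    }
}
```

Calling `createNotificationChannel` with an id that already exists is a no-op, so it is safe to run this on every app start.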
Until now, if we wanted to recognize a user’s current activity 🚴 🏃 🚶 🚌, such as walking, driving, or standing still, we had to use the ActivityRecognitionApi.
But this API was deprecated some time ago, and we now have to use ActivityRecognitionClient.
In this post I will try to explain how we recognized the user’s activity before the new API was launched, and the changes you need to make to use the new ActivityRecognitionClient API.
First of all, in both cases we have to add a uses-permission to our AndroidManifest.xml so we can read the user’s activity:
Remember the activities…
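With the permission in place, subscribing through the new client boils down to one call. A sketch, assuming a hypothetical `DetectedActivitiesService` of ours that receives the `ActivityRecognitionResult` updates through the `PendingIntent`:

```kotlin
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognition

fun startActivityUpdates(context: Context) {
    // DetectedActivitiesService is a hypothetical IntentService that will
    // unpack ActivityRecognitionResult.extractResult(intent) on each update.
    val intent = Intent(context, DetectedActivitiesService::class.java)
    val pendingIntent =
        PendingIntent.getService(context, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT)

    ActivityRecognition.getClient(context)
        .requestActivityUpdates(30_000L, pendingIntent) // detection interval: 30 s
        .addOnSuccessListener { /* updates scheduled */ }
        .addOnFailureListener { /* missing permission or Play Services issue */ }
}
```

Unlike the old `ActivityRecognitionApi`, the client does not require you to manage a `GoogleApiClient` connection yourself; the returned `Task` reports success or failure directly.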
In the previous post I explained some changes you have to make if you want to call “startService()” in Android Oreo.
In this post, I will try to explain the changes you need to make if you want to keep using BroadcastReceivers. But remember to keep in mind all the Android components affected when you target Android Oreo:
Essentially, the problem with BroadcastReceivers in Android Oreo is that all the BroadcastReceivers we have declared in our AndroidManifest.xml …
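For most implicit broadcasts the fix is to register the receiver at runtime instead of in the manifest. A sketch, where `ConnectivityReceiver` is a hypothetical receiver of ours and `CONNECTIVITY_ACTION` stands in for any implicit broadcast your app listens to:

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.net.ConnectivityManager

// On Oreo, implicit broadcasts like CONNECTIVITY_ACTION no longer reach
// manifest-declared receivers, so we register this one in code instead
// (for example in an Activity's onStart()).
class ConnectivityReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        // react to the connectivity change
    }
}

fun registerConnectivityReceiver(context: Context): ConnectivityReceiver {
    val receiver = ConnectivityReceiver()
    context.registerReceiver(receiver, IntentFilter(ConnectivityManager.CONNECTIVITY_ACTION))
    return receiver // remember to call context.unregisterReceiver(receiver) later
}
```

A runtime-registered receiver only lives as long as the component that registered it, which is exactly the trade-off Google wants: no more waking every installed app for a single system-wide event.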
In the coming months, new changes are arriving for all Android developers. Google wants to ensure the best performance across all applications in Google Play, and for that purpose it is going to require Android developers to target their apps to Android Oreo.
During Google I/O 2018 it was announced that from August 2018, all new apps that developers upload to Google Play must target Android Oreo (8.0). Starting in November 2018, the requirement will apply to updates of existing apps as well.
The changes we have to keep in mind for this migration are: