<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Paradox Cat Tech Hub - Medium]]></title>
        <description><![CDATA[Stories about Android, HMI and Artificial Intelligence written by our colleagues. - Medium]]></description>
        <link>https://medium.com/paradox-cat-tech-hub?source=rss----4466e279ddff---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>Paradox Cat Tech Hub - Medium</title>
            <link>https://medium.com/paradox-cat-tech-hub?source=rss----4466e279ddff---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sat, 16 May 2026 17:07:52 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/paradox-cat-tech-hub" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Building Apps for Cars in 2026: Android Auto, AAOS, and Vehicle Data APIs]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/building-apps-for-cars-in-2026-android-auto-aaos-and-vehicle-data-apis-ac627ba9fc8e?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/ac627ba9fc8e</guid>
            <category><![CDATA[android-auto]]></category>
            <category><![CDATA[vehicle-properties]]></category>
            <category><![CDATA[carpropertymanager]]></category>
            <category><![CDATA[car-app-library]]></category>
            <category><![CDATA[androidautomotiveos]]></category>
            <dc:creator><![CDATA[Viktor Mukha]]></dc:creator>
            <pubDate>Thu, 11 Dec 2025 14:51:44 GMT</pubDate>
            <atom:updated>2025-12-11T14:51:32.711Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*18cdbcy2x6vmiq8b41YeAQ.jpeg" /><figcaption>Source: <a href="https://www.press.bmwgroup.com/deutschland/photo/detail/P90626683/der-neue-bmw-ix3-50-xdrive-space-silver-12/2025">https://www.press.bmwgroup.com/</a></figcaption></figure><p>In this article, we will discuss car infotainment ecosystems from the perspective of an application developer, covering their past, present, and future.</p><p>We will also explore how to interface with a car and retrieve information, such as vehicle speed, from its network.</p><h3>Do cars need software applications?</h3><p>When cars first went into mass production, they did not have radios. The driving experience was exciting enough on its own.</p><p>The very first commercial car radio appeared in 1930, and it took several years for it to become established as standard equipment. People clearly saw the value of the radio, and radio stations were happy about it too.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*kDFKmcbxres8ITSz6dVCTw.jpeg" /><figcaption>We’ve come a long way since then!</figcaption></figure><p>Today, large, colorful displays with touch and speech control are all the rage. Much as we love analog technology and physical buttons, we cannot deny that such user interfaces offer drivers far greater value.</p><p>These days, people expect cars to offer more than just a radio.</p><h3>Why now?</h3><p>Displays are now just as much a part of any new car as a radio. But why are we talking about this now? What changed?</p><p>Until recently, in order to develop an application for a car’s built-in system, you had to talk to OEMs directly (e.g., to BMW, GM, Ford, Volvo, etc.) and use their proprietary frameworks. This resulted in virtually no market for car-specific applications. 
This is still the case for most OEMs today.</p><p>However, this is changing.</p><h3>Android Auto &amp; Apple CarPlay</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6NwgWWzEpxJhVv3314F6Kg.png" /><figcaption>Apple CarPlay (source: <a href="https://www.apple.com/de/ios/carplay/">https://www.apple.com/de/ios/carplay/</a>)</figcaption></figure><p>Today, there is an established market around Android Auto and Apple CarPlay. These are so-called projection modes: simply connect your phone to your car and enjoy your favorite apps on the go.</p><p>When it comes to developing car applications, these are clearly the best platforms to target today. That is where the users are. Consequently, they receive the most attention from both Google and Apple as platforms for automotive apps.</p><p>However, a new platform has emerged: Android OS installed directly in a car. This is the future.</p><h3>Native Android in cars</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5qY4fx6coRgfOgyfhoN7mA.png" /><figcaption>Google built-in (source: <a href="https://built-in.google/cars/">https://built-in.google/cars/</a>)</figcaption></figure><p>In recent years, there has been a dramatic shift in the technology used for car infotainment systems. This change was enabled by the release of Android Automotive OS (AAOS), which added many useful car-specific features to the Android Open Source Project (AOSP).</p><p>Previously, most OEMs used their own customized Linux solutions; now, most support Android APIs. We will not go into the motives for this change. 
Instead, let us focus on the opportunity to develop for it.</p><h3>What do the cars support?</h3><p>A developer who wants to build an Android application specifically for cars, or add certain in-car use cases to an existing Android application, faces the following choice of target platforms today:</p><ul><li>Cars with support for <strong>Android Auto</strong>,</li><li>Cars running <strong>Android Automotive OS (AAOS)</strong> with <strong>Google Automotive Services (GAS)</strong> — these are now marketed as “<a href="https://built-in.google"><strong>Google built‑in</strong></a>”, and</li><li>Cars which run an OS based on the <strong>Android Open Source Project (AOSP) or AAOS</strong> without GAS.</li></ul><p>For a more in-depth overview of these platforms, take a look at these great articles by <a href="https://www.roa-valverde.com/">Antonio Roa-Valverde</a>:</p><ul><li><a href="https://www.roa-valverde.com/2024/10/navigating-the-android-automotive-os-third-party-app-ecosystem-a-developers-perspective/">Navigating the Android Automotive OS Third Party App Ecosystem: a Developer’s Perspective</a>,</li><li><a href="https://www.roa-valverde.com/2025/01/android-automotive-os-app-stores-understanding-the-ecosystem/">Android Automotive OS app stores: understanding the ecosystem</a>.</li></ul><h3>Car API overview</h3><p>At the highest level, Google provides an abstraction called the <a href="https://developer.android.com/training/cars/apps">Android for Cars App Library</a>, also known as the <strong>Car App Library (CAL)</strong>. It works for both Android Auto and AAOS.</p><p><strong>Ideally, all car APIs would be exposed exclusively via CAL.</strong></p><p>Unfortunately, there is still some fragmentation, even among Google’s own APIs. 
AAOS has certain APIs that can (or have to) be used instead of CAL.</p><p>Let’s categorize all of the Android car APIs into three groups:</p><ol><li>Car APIs that mirror standard Android APIs</li><li>Car-specific APIs for user interaction</li><li>Car-specific APIs for vehicle data</li></ol><h3>1. Car APIs which mirror standard Android APIs</h3><h4><a href="https://developer.android.com/training/cars/apps#carsensors">CarSensors</a> (CAL)</h4><p>Suppose you want to access an accelerometer.</p><p>Since Android Auto runs on a phone, developers have access to two accelerometers: one on the phone, and one in the car. CarSensors gives you access to the one in the car.</p><p>AAOS does not have this ambiguity. Therefore, we must use <a href="https://developer.android.com/reference/android/hardware/SensorManager">SensorManager</a> and <a href="https://developer.android.com/reference/android/location/LocationManager">LocationManager</a> there instead.</p><p>The choice is straightforward:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/958/1*yrIHFoLxpMXdOVZN_WbbcQ.png" /><figcaption>APIs for the accelerometer and the like: CarSensors vs SensorManager/LocationManager</figcaption></figure><p>To target both Android Auto and AAOS, you must check which platform your application is running on at runtime, or build separate versions of the app.</p><p>The reason CarSensors returns an <a href="https://developer.android.com/reference/androidx/car/app/hardware/common/CarValue#STATUS_UNIMPLEMENTED()">unimplemented status</a> on AAOS instead of wrapping calls to SensorManager and LocationManager remains unclear.</p><h4><a href="https://developer.android.com/reference/androidx/car/app/media/CarAudioRecord">CarAudioRecord</a> (CAL)</h4><p>This is a car-specific subset of the <a href="https://developer.android.com/reference/android/media/AudioRecord.html">AudioRecord API</a>. 
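</p><p>As a rough illustration of the shape of this API, recording a short clip might look like the sketch below. Treat it as a sketch only: the method names follow our reading of the androidx.car.app.media reference, and a CarContext from a CAL session plus a granted RECORD_AUDIO permission are assumed.</p>

```kotlin
// Sketch only -- method names per the androidx.car.app.media reference;
// verify the exact signatures against the current CAL docs before use.
fun recordSnippet(carContext: CarContext): ByteArray {
    val record = CarAudioRecord.create(carContext) // requires RECORD_AUDIO
    val buffer = ByteArray(CarAudioRecord.AUDIO_CONTENT_BUFFER_SIZE)
    record.startRecording()
    val bytesRead = record.read(buffer, 0, buffer.size)
    record.stopRecording()
    return buffer.copyOf(maxOf(bytesRead, 0))
}
```

<p>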
The API is minimal and integrates well with other CAL APIs.</p><p>There is also the generic <a href="https://developer.android.com/reference/android/media/MediaRecorder">MediaRecorder API</a>, but its scope is different.</p><h3>2. Car-specific APIs for user interaction</h3><p>This cluster of car APIs is new to Android. Cars are not phones, and they need different standards for distraction optimization and general user interaction.</p><p>These APIs include <a href="https://developers.google.com/cars/design/create-apps/app-types/overview">App Types</a>, <a href="https://developers.google.com/cars/design/create-apps/apps-for-drivers/overview">Templates</a>, <a href="https://developers.google.com/cars/design/create-apps/media-apps/overview">Media Apps</a>, etc.</p><p>Depending on your app’s category, target platform, and whether it must be usable while driving, you will see which of these APIs are available and possibly even enforced.</p><p>For a great introduction, see Section 4, “Building your App,” in the Appning Automotive Apps Market <a href="https://appning.com/developers/#documentation">developer documentation</a>.</p><p>Google has documented these APIs quite well: <a href="https://developers.google.com/cars/design/create-apps">https://developers.google.com/cars/design/create-apps</a></p><p>Both the libraries and the documentation are in active development, so it is best to check these resources regularly. This is also why we omit what would arguably be the largest section of this article and focus on data APIs instead.</p><h3>3. Car-specific APIs for vehicle data</h3><p>The usage of vehicle data is an important feature that sets a car-specific Android application apart. 
Examples of such vehicle-data-centric use cases include:</p><ul><li>Fleet management: fuel and charge levels, pending maintenance, location</li><li>Insurance: driving performance evaluation, tire pressure, speeding and braking events, indicator usage, timely maintenance</li><li>Entertainment: driving feedback, control of peripherals such as ambient lights</li></ul><p>One specific example is the <a href="https://carchestra.com/">Carchestra</a> app, a media application created by us at <a href="https://paradoxcat.com">Paradox Cat</a> that reacts to your driving style. It won the <a href="https://appning.com/developers/">Appning App Challenge</a>, so give it a try!</p><p>Currently, there are the following ways of accessing car data:</p><ol><li><a href="https://developer.android.com/training/cars/apps#carinfo"><strong>CarInfo</strong></a> as part of <strong>CAL</strong>. It works for both Android Auto and AAOS.</li><li><a href="https://developer.android.com/reference/android/car/hardware/property/CarPropertyManager"><strong>CarPropertyManager</strong></a>, which only works for AAOS-based cars, because it uses Car Service and the Vehicle HAL.</li></ol><p>Here is how to choose:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*yZMcjZWqzX5mUgHGWbbfxA.png" /><figcaption>APIs for typical car data: CarInfo vs CarPropertyManager</figcaption></figure><p>From a software architecture perspective, it is questionable whether we even need a CarPropertyManager. Again: ideally, all car APIs would be exposed exclusively via CAL. 
Perhaps Google will address this fragmentation eventually, but for now, this is the state of things.</p><p>Now, let’s take a deeper look at both <strong>CarInfo</strong> and <strong>CarPropertyManager</strong>.</p><h3>CarInfo</h3><blockquote>Starting with Car App API level 3, the Car App Library has APIs that you can use to access vehicle properties and sensors.</blockquote><p>This quote is about CarInfo, which does not expose as much data as most use cases would require. Here’s the full list of APIs as of today:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8FYeoEe8hYbjCpxPikfQFQ.png" /><figcaption>Car data surfaced by the <a href="https://developer.android.com/reference/androidx/car/app/hardware/info/CarInfo">CarInfo</a> APIs</figcaption></figure><p>This is definitely not all the data that could be made available to Android applications. Unfortunately, even automotive OEMs cannot expose additional data via this library, because the interface is specified by Google for all cars.</p><p>In our experience using CarInfo with Android Auto on non-Android cars, most OEMs only deliver the car model and speed. The rest of the CarInfo methods usually do not return actual data.</p><p>On AAOS, CarInfo is implemented using the CarPropertyManager API. 
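</p><p>To make the CAL side concrete, here is a sketch of subscribing to speed via CarInfo. It assumes a CarContext obtained from a CAL session and the relevant permissions already granted; the types come from androidx.car.app.hardware, but verify the exact signatures against the reference docs.</p>

```kotlin
// Sketch only: subscribing to speed through CAL's CarInfo.
// Assumes a CarContext from a CAL Session/Screen and granted permissions.
fun observeSpeed(carContext: CarContext) {
    val carInfo = carContext.getCarService(CarHardwareManager::class.java).carInfo
    carInfo.addSpeedListener(ContextCompat.getMainExecutor(carContext)) { speed: Speed ->
        // Every CarInfo value is wrapped in a CarValue with a status field;
        // check it before use, since many OEMs do not deliver the data.
        val carValue = speed.displaySpeedMetersPerSecond
        if (carValue.status == CarValue.STATUS_SUCCESS) {
            println("Speed: ${carValue.value} m/s")
        }
    }
}
```

<p>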
CarInfo essentially exposes a subset of the CarPropertyManager APIs in a slightly different form.</p><p>Therefore, more data is available via CarInfo on AAOS cars than on non-Android cars.</p><p>However, AAOS directly exposes the CarPropertyManager API, so let’s explore the benefits of using it.</p><h3>CarPropertyManager</h3><p><em>Note: CarPropertyManager is an Android Automotive OS (AAOS) feature; it does not exist on Android Auto.</em></p><p>Have you ever wondered how OEMs who use the Android OS in their cars are supposed to access car data points themselves in all of their first-party applications, such as Climate, Telephony, or Navigation?</p><p>The answer is: the <a href="https://developer.android.com/reference/android/car/hardware/property/CarPropertyManager">CarPropertyManager API</a>.</p><h4>What is a Car Property?</h4><p>Google also refers to car properties as <a href="https://source.android.com/docs/automotive/vhal/previous/properties">vehicle properties</a>. A vehicle property is a data point that represents the state of a car.</p><p>This data is sent from the automotive network to the Vehicle Hardware Abstraction Layer (VHAL), where it is mapped to properties. CarPropertyManager allows application developers to access these properties.</p><p>Properties can be read-only, write-only, or read-write.</p><p>Each property has an area (such as VEHICLE_AREA_TYPE_SEAT), a strong data type (such as Integer), and a change mode (such as VEHICLE_PROPERTY_CHANGE_MODE_CONTINUOUS) associated with it. Additionally, each property declares the permissions necessary to access it.</p><p>Below are some examples of vehicle properties:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pZZdThWawoeXU6omBNvnkw.png" /><figcaption>Vehicle Properties</figcaption></figure><p>The full list of property IDs can be found <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds">here</a>. 
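</p><p>Here is what using the raw API looks like in Kotlin: a minimal sketch, assuming an AAOS target with the android.car library available and the speed permission already granted.</p>

```kotlin
// Sketch only: raw CarPropertyManager subscription on AAOS (android.car).
// Assumes android.car.permission.CAR_SPEED has already been granted.
val car = Car.createCar(context)
val propertyManager = car.getCarManager(Car.PROPERTY_SERVICE) as CarPropertyManager

propertyManager.registerCallback(
    object : CarPropertyManager.CarPropertyEventCallback {
        override fun onChangeEvent(value: CarPropertyValue<*>) {
            println("Speed: ${value.value} m/s")
        }
        override fun onErrorEvent(propertyId: Int, areaId: Int) {
            println("Error reading property $propertyId in area $areaId")
        }
    },
    VehiclePropertyIds.PERF_VEHICLE_SPEED,
    CarPropertyManager.SENSOR_RATE_ONCHANGE
)
```

<p>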
As of today, there are over 250 different vehicle properties, which far surpasses the number provided by CarInfo.</p><p>For more information on how it works under the hood, please refer to this <a href="https://medium.com/@mmohamedrashik/vehicle-hal-and-car-api-in-android-automotive-os-cfca60c7edd0">article by Rashik</a>.</p><h4>System Properties vs Vendor Properties</h4><p>By design, Google expects OEMs to provide data via a predefined set of so-called <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds">system properties</a>. This set contains over 250 properties, as shown above.</p><p>At the same time, Google recognizes that this set may not be ideal for all OEMs and provides a mechanism to extend it with <a href="https://source.android.com/docs/automotive/vhal/special-properties#vendor-properties">vendor properties</a>. It is important to note that OEMs that want to be <a href="https://source.android.com/docs/compatibility/overview">officially compatible with Android</a> must adhere to these rules:</p><blockquote>Always try to use system properties first, vendor properties should be used as a last resort when none of the system properties feeds your requirement.</blockquote><blockquote>To prevent ecosystem fragmentation, vendor properties must not be used to replicate vehicle properties that already exist in the SDK <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds">VehiclePropertyIds</a>. To learn more, see <a href="https://source.android.com/docs/compatibility/13/android-13-cdd#25_automotive_requirements">Section 2.5, Automotive Requirements</a> in the CDD.</blockquote><p>It is unclear whether all car manufacturers who use AOSP comply with this requirement.</p><p>So, there are even more vehicle data points. 
But who has access to them?</p><h4>Permissions: a word of warning</h4><p>If a property requires a “signature” or “signature | privileged” permission level, you cannot obtain the permission without the platform being updated: each OEM must update their AOSP image to whitelist your application package, and you will only receive the permission once the OS image is updated in each car. Not to mention, a trusted relationship with an OEM is required.</p><p>This is a topic that we are currently addressing with many different automotive OEMs at the <a href="https://covesa.atlassian.net/wiki/spaces/WIK4/pages/39068159/Automotive+AOSP+App+Framework+Standardization+Expert+Group">Connected Vehicle Systems Alliance (COVESA)</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*v5lcD_HA4uYm_9XgEhEUNg.jpeg" /></figure><p>One of the goals there is to avoid fragmentation by standardizing all vehicle APIs for Android: no need to talk to each OEM separately, a flexible permission model, the ability to reuse the Vehicle Signal Specification (VSS), and so on. The group is open, and the software is open source.</p><p>Given the current permissions situation, three categories of developers could use the CarPropertyManager API:</p><ul><li>OEMs developing first-party applications;</li><li>third-party application developers who have agreements with OEMs to have their applications whitelisted for access to specific properties;</li><li>third-party application developers who are only interested in properties with “normal” or “dangerous” (runtime) permissions.</li></ul><blockquote>Disclaimer</blockquote><blockquote>The following tables are merely a snapshot of the current API. 
Please always refer directly to <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds">https://developer.android.com/reference/android/car/VehiclePropertyIds</a></blockquote><p>The following is a list of the 37 properties with “normal” permissions today:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/2002d88ccbd8ed011f2ac6509ddae497/href">https://medium.com/media/2002d88ccbd8ed011f2ac6509ddae497/href</a></iframe><p>Below is a list of the 33 properties that require “dangerous” permissions to be granted at runtime:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/e7dc2e3191cd1a42212ccd2ea7aa0615/href">https://medium.com/media/e7dc2e3191cd1a42212ccd2ea7aa0615/href</a></iframe><h4>Code example</h4><p>Now that we have the permissions out of the way, we can move on to the actual code example.</p><p>At <a href="https://paradoxcat.com">Paradox Cat</a>, we have recently released an open-source (MIT-licensed) wrapper library for the CarPropertyManager API. This library greatly simplifies integration into a Kotlin codebase and handles all connection logic out of the box.</p><p>This is how easy it makes subscribing to vehicle speed (omitting the permission request):</p><pre>val kpm = KarPropertyManager(context, scope)<br>val speedFlow = kpm.getProperty&lt;Float&gt;(<br>        VehiclePropertyIds.PERF_VEHICLE_SPEED, 0, 60F).valueFlow</pre><p>Here is the GitHub repository: <a href="https://github.com/Paradox-Cat-GmbH/KarPropertyManager">https://github.com/Paradox-Cat-GmbH/KarPropertyManager</a></p><p>We hope you’ll find our library useful! 
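</p><p>One note on the permission request omitted above: PERF_VEHICLE_SPEED is guarded by a “dangerous” permission, so it can be requested at runtime like any other Android permission. A sketch, assuming an Activity (the constant Car.PERMISSION_SPEED maps to “android.car.permission.CAR_SPEED”):</p>

```kotlin
// Sketch only: runtime request for the vehicle-speed permission on AAOS.
const val SPEED_PERMISSION_REQUEST = 42 // arbitrary request code for this example

fun ensureSpeedPermission(activity: Activity) {
    if (ContextCompat.checkSelfPermission(activity, Car.PERMISSION_SPEED)
        != PackageManager.PERMISSION_GRANTED
    ) {
        ActivityCompat.requestPermissions(
            activity, arrayOf(Car.PERMISSION_SPEED), SPEED_PERMISSION_REQUEST
        )
    }
}
```

<p>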
Feel free to report any issues or suggest features you’d like to see; we would appreciate your feedback.</p><h3>Challenges</h3><p>From an application developer’s perspective, the situation is pretty dire.</p><h4>Confusing API surface</h4><p>It’s simply too much for a typical Android application developer to grasp. Just look at the size of this article, and we mostly focused on the Google experience.</p><h4>Lack of Testing Tools</h4><p>The Google landscape is more or less covered by the official Android Automotive emulator (with GAS). Additionally, one could rent cars of the following brands and expect a similar experience: <a href="https://built-in.google/cars/#explore-cars">https://built-in.google/cars/#explore-cars</a></p><p>It would also be helpful to feed the emulator with real car data. There is a free demo recording provided by <a href="https://www.remotivelabs.com/blog/feeding-your-aaos-emulator">RemotiveLabs</a>, but Google has not yet provided anything like that. A comprehensive car simulator that includes all displays and interaction capabilities would also be helpful.</p><p>As for non-Google OEMs using AAOS, it is certainly difficult to find a representative platform. COVESA recognized this, and we supported the effort by setting up a vanilla AAOS emulator CI pipeline. See <a href="https://github.com/COVESA/aosp_device_covesa_emulator">https://github.com/COVESA/aosp_device_covesa_emulator</a>.</p><p>Next steps: <a href="https://github.com/COVESA/aosp_device_covesa_emulator/issues">https://github.com/COVESA/aosp_device_covesa_emulator/issues</a></p><p>Discussions: <a href="https://github.com/COVESA/aosp_device_covesa_emulator/discussions">https://github.com/COVESA/aosp_device_covesa_emulator/discussions</a></p><p>As of this writing, further development and maintenance of this emulator has been paused due to lack of interest from automotive OEMs. Each OEM has its own emulator, but they differ, and most are not public. 
This makes it difficult for developers to verify whether their apps would work on non-GAS cars.</p><p>Once the issue of a standard reference emulator is resolved, non-GAS OEMs will face further challenges in standardizing their interfaces if they cannot be identical to Google’s.</p><p>With regard to renting non-GAS cars, we have also found that there is no official channel for sideloading an application for testing purposes. ADB is usually disabled.</p><p>Google has an internal test track in the Play Store, but it is only usable with GAS. Perhaps other app stores should offer a similar publicly available test channel.</p><h4>OEM support of Template Host</h4><p>Whether CAL’s template UIs are supported depends on whether the OEM has integrated the so-called <a href="https://source.android.com/docs/automotive/hmi/aosp_host">Automotive App Host</a>. Cars with Google built-in (GAS) come with the <a href="https://play.google.com/store/apps/details?id=com.google.android.apps.automotive.templates.host&amp;hl=en">Google Automotive App Host</a>, which Google keeps up to date via Play Store updates. Without GAS, there is no guarantee that the OEM has integrated their own version of the <a href="https://source.android.com/docs/automotive/hmi/aosp_host">Automotive App Host</a> that supports the latest Car App Library version.</p><p>Fragmentation occurred when Google updated the GAS version of the Automotive App Host, but not the open-source version.</p><ol><li>OEMs running AAOS without Google Automotive Services (GAS) are stuck with Car App Library API level 4.</li><li>Some OEMs and App Stores started to extend it. 
Unfortunately, we are not aware of any Template Host implementation that supports the latest Car App Library APIs except for Google’s proprietary one.</li></ol><p>Hopefully, Google will continue to update the open-source Template Host to support the latest Car App Library APIs.</p><p>To ensure compatibility today:</p><ul><li><strong>Wait for OEM support:</strong> Avoid using features that require API level 5 or higher until all manufacturers have updated their systems.</li><li><strong>Build with fallbacks:</strong> If you need newer features now, check the API level at runtime and provide alternative implementations for cars running older API versions.</li></ul><h4>OEM support of Media Apps</h4><p>The MediaBrowserService and MediaSession APIs are already supported in AAOS. Therefore, it is unlikely that OEMs would use something else.</p><p>However, OEMs typically implement their own media player, which may or may not function as expected.</p><h4>Access to Vehicle Data</h4><blockquote><strong>Issue #1</strong></blockquote><blockquote>Car manufacturers are not required to support all vehicle properties.</blockquote><p>Currently, even cars with “Google built-in” are <a href="https://source.android.com/docs/compatibility/16/android-16-cdd#73_sensors">only required to implement 4 (!) 
vehicle properties</a> out of the 70 publicly available (with “normal” and “dangerous” permissions):</p><blockquote>[<a href="https://source.android.com/docs/compatibility/16/android-16-cdd#73_sensors">7.3</a>/A-0–1] MUST implement and report <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds.html#GEAR_SELECTION">GEAR_SELECTION</a>, <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds.html#NIGHT_MODE">NIGHT_MODE</a>, <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds.html#PERF_VEHICLE_SPEED">PERF_VEHICLE_SPEED</a> and <a href="https://developer.android.com/reference/android/car/VehiclePropertyIds.html#PARKING_BRAKE_ON">PARKING_BRAKE_ON</a>.</blockquote><p>Therefore, application developers cannot rely on CarInfo or CarPropertyManager to deliver more than these four properties.</p><p>Another issue is that some OEMs do not even use AAOS; they use the tablet version of AOSP, which lacks CarPropertyManager.</p><p>The solution will take time. 
The more apps that start using these APIs, the greater the incentive for OEMs to support them.</p><blockquote><strong>Issue #2</strong></blockquote><blockquote>The permission model is not flexible enough for OEMs to adopt.</blockquote><p>Currently, 180 out of the 250 standard vehicle properties are only available with an OEM signature.</p><p>There is an initiative at COVESA to enable a better permission model with minimal AOSP changes: <a href="https://covesa.atlassian.net/wiki/spaces/WIK4/pages/262766594/COVESA+AOSP+Vehicle+Data">https://covesa.atlassian.net/wiki/spaces/WIK4/pages/262766594/COVESA+AOSP+Vehicle+Data</a></p><p>Ideally, this will be merged into AOSP, which would avoid fragmentation.</p><blockquote><strong>Issue #3</strong></blockquote><blockquote>Most OEMs require more than Google’s set of properties.</blockquote><p>These end up as vendor properties, as indicated above, to which third-party applications have no access.</p><p>If all OEMs agreed to reuse the <a href="https://covesa.github.io/vehicle_signal_specification/">Vehicle Signal Specification (VSS)</a> or any other standard, developers would have access to more car data points without needing OEM-specific software development kits (SDKs).</p><h3>Summary</h3><p>The world needs <strong>more </strong>car-specific applications, and <strong>now</strong> is the time to develop them, because almost all OEMs are using Android as the main OS for infotainment.</p><p>Begin by consulting <a href="https://appning.com/developers/#documentation">Appning’s documentation</a>, followed by <a href="https://developers.google.com/cars/design/create-apps">Google’s</a>.</p><p>Refer to this article for details on how to work with <strong>vehicle data</strong> and what to be aware of.</p><p>If you are using Kotlin, check out our <a href="https://github.com/Paradox-Cat-GmbH/KarPropertyManager"><strong>KarPropertyManager</strong></a> wrapper.</p><p>As a third-party developer, you can only access the properties with <strong>“normal”</strong> or 
<strong>“dangerous”</strong> (runtime) <strong>permissions</strong>. Do not assume that you have access to all the data.</p><p>Both developers and car manufacturers face many challenges, but most of these will be solved once the market matures and stabilizes.</p><p>So go ahead and develop some apps for cars! <a href="https://paradoxcat.com/kontakt/">Contact us at Paradox Cat</a> if you need any help.</p><hr><p><a href="https://medium.com/paradox-cat-tech-hub/building-apps-for-cars-in-2026-android-auto-aaos-and-vehicle-data-apis-ac627ba9fc8e">Building Apps for Cars in 2026: Android Auto, AAOS, and Vehicle Data APIs</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How Can UWB Radar Protect Children Left Alone in Cars?]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/how-to-use-uwb-radar-to-protect-kids-left-alone-in-the-car-45395a05c9c1?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/45395a05c9c1</guid>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[radar]]></category>
            <category><![CDATA[child-presence-detection]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[in-cabin-automotive]]></category>
            <dc:creator><![CDATA[Mayara Bonani]]></dc:creator>
            <pubDate>Tue, 09 Dec 2025 08:36:32 GMT</pubDate>
            <atom:updated>2025-12-09T08:49:31.904Z</atom:updated>
<content:encoded><![CDATA[<p>Every year, a significant number of children lose their lives after being left alone in a vehicle, unintentionally or on purpose. In the USA, for instance, 39 children die each year from vehicular heatstroke. This fact may not be widely known, but many factors contribute to this situation:</p><ul><li>Even on a day with a moderate outside temperature, the inside of a vehicle may reach dangerously high temperatures within minutes. Measures that may look like partial solutions, such as leaving a window slightly open, parking in the shade, or using air conditioning, do not prevent the vehicle from accumulating heat.</li><li>Children’s physiology makes them more susceptible to high temperatures, because their thermoregulatory system, such as their ability to sweat and to regulate their body temperature, is different from that of adults. Therefore, children cannot dissipate heat effectively, and their body temperature rises three to five times faster. The situation is even more serious for newborns and toddlers.</li><li>Since the temperature inside a car can reach dramatic levels in a short time, and a child absorbs heat faster without being able to regulate it, heatstroke conditions develop quickly, leading to a life-threatening scenario.</li><li>A child may be unable to exit the vehicle on their own.</li></ul><p>At the same time, while parents have the primary responsibility to take care of their children and their health, society also has a responsibility to develop systems and mechanisms to prevent such situations from happening.</p><p>To address the risk of children being left alone in cars, the European New Car Assessment Programme (Euro NCAP) and China NCAP (C-NCAP) have recognized Child Presence Detection (CPD) systems inside vehicles as a feature in their safety rating systems since 2023. 
CPD may become mandatory in the future, depending on the outcome of ongoing EU and UNECE regulatory processes.</p><p>A CPD system should be capable of detecting unattended children anywhere inside the vehicle, including all seating positions (optional and removable seats) and unusual locations such as footwells. It should operate reliably under diverse lighting conditions, including darkness, shadows, or glare, and handle occlusions, varying postures, and other challenging scenarios such as children covered by blankets, objects on seats, or atypical seating arrangements.</p><p>According to Euro NCAP’s 2026 protocol, a CPD system must rely on direct sensing methods, such as detecting movement, breathing, or even heartbeat. In addition, it must issue a visible and audible warning within 15 seconds if a child is detected after the vehicle has been locked.</p><p>To meet these requirements, radar-based technologies, such as millimeter-wave and Ultra-Wideband (UWB) radars, are among the most suitable options. Our company offers solutions for both, including a CPD system based on 60 GHz mmWave radars. However, because UWB is already widely used in vehicles, particularly for applications like keyless entry, integrating it into existing automotive platforms is often simpler, more cost-effective, and faster. Beyond ease of integration, UWB also offers several technical advantages, such as lower power consumption, better penetration, and occlusion handling. Still, its presence in current automotive systems remains one of the strongest reasons for choosing UWB for CPD.</p><h4>How do UWB radars work, and how can they detect if a child or newborn is left alone in a car?</h4><p>To start, let us recall the physical behavior of electromagnetic waves, such as radar waves, in simple terms. Imagine going to a lake and throwing a rock into the water: you can see patterns of ripples spreading across the surface. 
In this analogy, the rock represents the part of the radar called the transmitter, which generates invisible waves in the air. The pulse repetition frequency of the transmitter corresponds to how many rocks you throw in one second. The distance between the ripples created by each rock corresponds to the radar wavelength, which is determined by its carrier frequency (6.5 to 8 GHz for UWB radars).</p><p>Just like an echo when you scream in a cave, radar waves are reflected when they hit an object: if you scream, your ears recognize your own voice bouncing back. With radars, the receiver antenna recognizes the reflected wave pulses, but the radar “shout” travels almost a million times faster than sound, and the “ears” measure nanoseconds. The time between transmitting a pulse and receiving its reflection is called Time-of-Flight (ToF). Since the pulse travels to the object and back at the speed of light c, the distance is simply c · ToF / 2, and the ToF can also be used to estimate the speed of objects. In this way, the reflections read by the radar carry information about the location and movement of surrounding objects.</p><p>The UWB radar periodically emits a sequence of extremely short pulses. The reflection pattern acts like a map, capturing how reflections appear at each moment. If an object moves, this pattern changes. UWB radar can “see” tiny movements by detecting small periodic changes in the reflected signal. This makes it ideal for recognizing a child or adult breathing inside the car, as each breath causes the chest to move around 4 to 8 mm and follows a periodic pattern with a frequency of approximately 0.2 to 1 Hz. The breathing rate of an adult is different from that of a child or a newborn.</p><p>When two antennas are used to receive the signal, it is possible to get information about the direction of arrival by comparing the time difference of the received echoes. 
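</p><p>This time-difference comparison can be made concrete with a small, illustrative Python sketch. The antenna spacing and the delay below are assumed example values for illustration, not measurements from a real system:</p>

```python
import math

C = 299_792_458.0  # speed of light in m/s

def angle_of_arrival_deg(delta_t, spacing):
    """Direction of an echo, in degrees from broadside, estimated from the
    arrival-time difference `delta_t` (seconds) between two receive
    antennas placed `spacing` metres apart."""
    ratio = C * delta_t / spacing       # path-length difference over spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Example: antennas 2 cm apart, echo reaches one antenna 33.4 ps later
print(round(angle_of_arrival_deg(33.4e-12, 0.02), 1))  # roughly 30 degrees
```

<p>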
The technical term is angle-of-arrival estimation, and it also allows us to determine the location of an individual inside the car.</p><p>How strongly an object reflects depends on its material and thickness. Metal, water, and the human body reflect UWB strongly, while soft fabrics such as blankets and clothes, foam, or thin plastic partially absorb or scatter the pulses, reflecting very little. As a result, much of the signal penetrates through a blanket, allowing the radar to ignore the fabric and receive the echo from the body. This property of the UWB frequency band enables us to measure the reflections of people and children, focusing on our goal: to detect whether a child is alone inside the car.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*GFqrjqEz_eUohQn9ifothQ.jpeg" /></figure><p>Once we have received the UWB radar signal, within the maximum duration of 15 seconds required by Euro NCAP, we extract the relevant information for each moment in time. First, we filter the received data to remove unwanted components and enhance the parts of the signal that correspond to life presence. For instance, we want to remove all reflections related to static objects, such as car seats, as well as reflections associated with locations outside the car. This step is called pre-processing.</p><p>The received UWB radar signal contains multiple forms of information, commonly referred to as <em>features</em>. The subsequent processing stage focuses on extracting those features that are relevant and discriminative for CPD. There are amplitude-based features, which describe how strong the reflections are by measuring the intensity of the returned signal. If the radar uses more than one antenna, we can also calculate the angle or direction of arrival of those reflections, which is crucial to distinguish between signals from inside and outside the car. 
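</p><p>The pre-processing step described earlier, removing reflections from static objects, can be sketched in a few lines of Python. The frames below are made-up numbers: one range bin holds a constant “seat” reflection, the other a small oscillation standing in for chest movement:</p>

```python
import math

def remove_static_clutter(frames):
    """Subtract the per-range-bin mean over slow time, suppressing
    reflections from stationary objects such as car seats."""
    n_bins = len(frames[0])
    means = [sum(frame[b] for frame in frames) / len(frames) for b in range(n_bins)]
    return [[frame[b] - means[b] for b in range(n_bins)] for frame in frames]

# Simulated frames sampled at 20 Hz for 15 s: bin 0 is a strong static
# reflection, bin 1 oscillates at 0.3 Hz like a breathing chest.
fs, n_frames = 20, 300
frames = [[5.0, 0.004 * math.sin(2 * math.pi * 0.3 * k / fs)]
          for k in range(n_frames)]
clean = remove_static_clutter(frames)

static_residual = max(abs(frame[0]) for frame in clean)  # near zero
motion_signal = max(abs(frame[1]) for frame in clean)    # oscillation kept
```

<p>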
Additionally, by extracting the repeating patterns in the signal over time, we compute frequency-domain features, which help us detect periodic motions, such as breathing or other rhythmic activity, from otherwise still objects in the car.</p><p>After extracting the relevant features, we combine them into an input for a machine learning (ML) model, which classifies whether the radar data represents reflections of a child, of an adult, or of an empty car. The ML model learns subtleties of the data that may not be obvious when we analyze it with a simple mathematical model. It acts as the decision-making step, built on top of all the extracted features. Once the model is trained, it can perform real-time inference, indicating to the CPD system whether a child has been left alone inside the car.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*gc6DV35UITMBLNQ7JJ25wg.jpeg" /></figure><p>With their low power consumption and ease of integration, UWB radars are therefore a good solution for a CPD system. The combination of UWB radar sensing and artificial intelligence allows us to implement a reliable and cost-efficient solution.</p><p>Radar-based sensing opens many possibilities in vehicles: what applications do you think it could unlock? Let us know in the comments!</p><p>For companies looking to improve vehicle safety with Child Presence Detection (CPD) solutions, get in touch with us today to explore how our cutting-edge radar technology can help you bring life-saving features to market. 
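</p><p>To illustrate the frequency-domain features described earlier, here is a minimal Python sketch that scans the 0.2 to 1 Hz breathing band for the strongest periodic component. The simulated slow-time signal and all parameters are assumptions for illustration only:</p>

```python
import math

def dft_magnitude(signal, fs, freq):
    """Magnitude of the discrete Fourier transform of `signal`
    (sampled at `fs` Hz) evaluated at `freq` Hz."""
    n = len(signal)
    re = sum(signal[k] * math.cos(2 * math.pi * freq * k / fs) for k in range(n))
    im = sum(signal[k] * math.sin(2 * math.pi * freq * k / fs) for k in range(n))
    return math.hypot(re, im) / n

def strongest_breathing_freq(signal, fs, f_lo=0.2, f_hi=1.0, step=0.05):
    """Frequency in the breathing band [f_lo, f_hi] Hz with the
    strongest spectral response."""
    n_steps = int(round((f_hi - f_lo) / step))
    candidates = [f_lo + i * step for i in range(n_steps + 1)]
    return max(candidates, key=lambda f: dft_magnitude(signal, fs, f))

# Simulated slow-time signal: 15 s at 20 Hz with a 0.3 Hz "breathing" component
fs, duration = 20, 15
signal = [math.sin(2 * math.pi * 0.3 * k / fs) for k in range(fs * duration)]
print(round(strongest_breathing_freq(signal, fs), 2))  # peak near 0.3 Hz
```

<p>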
We are flexible in sensor choice and positioning, and our algorithms can be adapted to work with different radar models and configurations, ensuring seamless integration into your vehicle systems.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=45395a05c9c1" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/how-to-use-uwb-radar-to-protect-kids-left-alone-in-the-car-45395a05c9c1">How Can UWB Radar Protect Children Left Alone in Cars?</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introducing our new app: Carchestra — winner of the Appning App Challenge]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/introducing-our-new-app-carchestra-winner-of-the-appning-app-challenge-8b54b16540b7?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/8b54b16540b7</guid>
            <category><![CDATA[android-automotive]]></category>
            <category><![CDATA[cars]]></category>
            <category><![CDATA[music]]></category>
            <category><![CDATA[android-automotive-app]]></category>
            <category><![CDATA[android-app-development]]></category>
            <dc:creator><![CDATA[Adrian Tappe]]></dc:creator>
            <pubDate>Thu, 12 Jun 2025 11:08:51 GMT</pubDate>
            <atom:updated>2025-06-12T11:08:51.034Z</atom:updated>
            <content:encoded><![CDATA[<h3>Introducing our new app: Carchestra — winner of the Appning App Challenge</h3><p>We are excited to announce that our new app, <strong>Carchestra</strong>, is now available for Android Automotive OS (AAOS)! This release is the first preview version with more features to follow soon. Carchestra is an audio app where the music adapts to your driving style in real time, reacting to speed, acceleration, braking, turning, and more! You can read more about it on the project <a href="https://carchestra.com/">website</a> and in our video:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FvCSt8PzF6yU%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DvCSt8PzF6yU&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FvCSt8PzF6yU%2Fhqdefault.jpg&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/35ccd236d3c655c8b9e8ef4b9cffd263/href">https://medium.com/media/35ccd236d3c655c8b9e8ef4b9cffd263/href</a></iframe><p>In this blogpost, we, the development team of Carchestra, will give you an insight into how this project came about, how it works, and what technical challenges we faced while creating the app.</p><h3>How to use it</h3><p>To download Carchestra, click on the <a href="https://play.google.com/store/apps/details?id=com.paradoxcat.carchestra">link</a>, or alternatively find it in the Play Store in your car:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*f_PLEjO5d0A-pv9hf35vBw.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QIWVScX73YB8SlRIGoyk4g.png" /><figcaption>Install it remotely via the web</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*c1GF-3XtiV6ciaa_MPA6Mg.png" /></figure><figure><img alt="" 
src="https://cdn-images-1.medium.com/max/1024/1*HQTgtIWVVU_echJVvaDeog.png" /><figcaption>Install it through the car’s Play Store</figcaption></figure><p>Then, find the Carchestra icon in the app drawer or select it as a source in your media player.</p><p>You will now have to accept our license agreement, and grant the location permission, so that Carchestra can access vehicle sensors such as speed, accelerometer, and gyroscope.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ub0GGhTVmXtSe_Aeo5eeCA.png" /><figcaption>Accept the location permission</figcaption></figure><p>Finally, you can hit play and start driving!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*PAOp6rGnHHs8l9BqPhTTLQ.png" /><figcaption>Driving with Carchestra</figcaption></figure><h3>Automotive app stores</h3><p>In the past, most “apps” on automotive infotainment systems were provided by the OEMs themselves. More recently, CarPlay and Android Auto, which rely on a connection to the driver’s phone, brought a wider choice of apps to cars. Now, with more and more OEMs choosing AAOS, third-party apps can be downloaded from app stores to run natively on the vehicle, creating a more seamless experience.</p><p>Depending on the OEM, there are different major app stores available on AAOS vehicles. 
In a future blog post, we will go into more detail about the automotive app store ecosystem.</p><ul><li><a href="https://play.google.com/"><strong>Google Play Store</strong></a> for Google’s flavor of AAOS (Google Automotive Services / “Google built-in”) such as Volvo, Polestar, Renault, GM, Ford</li><li><a href="https://appning.com/"><strong>Appning by Forvia</strong></a> (previously Faurecia Aptoide) – white-label app stores for brands such as BMW, Mercedes-Benz, Dacia, Lynk &amp; Co and many others</li><li><a href="https://ignitedevelopers.harman.com/"><strong>Harman Ready Link Marketplace</strong></a> (previously Harman Ignite) – white-label app stores for Audi, Porsche and others</li></ul><p>Carchestra is now available in the Google Play Store; it is currently in review at Appning, and we are planning to submit it to Harman as well.</p><h3>How it started</h3><p>Every project starts with an idea: As a software development service provider specializing in HMI, we at PARADOX CAT are already working on AAOS systems and apps in customer projects, such as BMW’s latest Android-based infotainment system. At one of our regular Ideathons, we were thinking of a new side project for our company: building our own AAOS apps and distributing them through the automotive app stores. Soon, the idea of a music app that adapts to your driving style was born and became popular among the developers, data scientists and music enthusiasts in our team. Shortly after, we found a great name for the app as well: Carchestra!</p><p>The <a href="https://appning.com/developers/">Appning App Challenge</a> was an additional motivator for us to start this fun project. With a project start on January 7, 2025, and a submission deadline for the App Challenge on February 23, this gave us just shy of 7 weeks to complete the first MVP. 
The development team was formed, and the fun began!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*k8_H-KrHKnLJIfu8dd_YSg.jpeg" /><figcaption><em>The team behind Carchestra, consisting of Android developers, data scientists, UX designer, product owner, as well as IT and QA experts</em></figcaption></figure><h3>Challenges</h3><p>And fun we had! Of course, not everything went smoothly:</p><h4>Testing on cars:</h4><p>An Android developer’s testbench usually looks like this:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/790/1*L-cywtpMYVbQUGJtXdF7iA.jpeg" /><figcaption><em>Android device testbench. Credit: </em><a href="https://github.com/openstf/stf/blob/master/doc/shelf_closeup_790x.jpg"><em>Simo Kinnunen</em></a><em> (</em><a href="http://www.apache.org/licenses/LICENSE-2.0"><em>Apache 2.0 License</em></a><em>)</em></figcaption></figure><p>Unfortunately, we can’t afford this as Android Automotive developers:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*iRmiO9fZgHhGjIkc38cGlA.jpeg" /><figcaption><em>Parking lot full of cars. Credit: </em><a href="https://www.pexels.com/photo/parking-lot-of-a-car-dealership-from-birds-eye-view-11502452/"><em>Joshua Santos</em></a></figcaption></figure><p>Still, we got access to various cars and tried to make the most of the time we had with them:</p><ul><li>Carchestra needs <strong>real-time vehicle data</strong> to work, so it is hard to test it on an emulator. To address this, we built a separate data recording app and built a collection of test drive data and videos from different cars. Then, we added functionality to replay recorded data synchronized with the videos on the emulator.</li><li>During the test drives, we also noticed quite a few <strong>differences between the AAOS implementations</strong> on the cars that we tested, such as certain sensors not being available or returning incorrect data, or different quirks in the behavior of the media app. 
Our collection of replayable data recordings made it possible to identify and address these differences.</li></ul><h4>Being a third party</h4><p>As a third-party app developer, you usually cannot get a direct Android Debug Bridge (ADB) connection to a production car’s head unit to install and debug apps, as this is restricted by the OEM. Our workaround was to use Google Play’s internal test track, which allows us to upload app updates and make them available in the car within a couple of minutes. For debugging, we also implemented a solution to upload logs to the cloud. Still, this is not ideal, and we would love it if OEMs provided ways to unlock ADB for approved developers.</p><h4>Cars are not phones</h4><p>Building user interfaces for third-party AAOS apps also differs from phone apps: Depending on the app category, the system provides distraction-optimized UI templates that are safe to use while driving. In our case, as a media app, we use the MediaBrowserService and MediaSession APIs to expose our capabilities to the OEM’s own media app. Our settings, built with standard Activities and Jetpack Compose, are only available while the car is at a standstill.</p><h4>Real-time audio is hard</h4><p>Finally, we also encountered a challenge that was not specific to automotive: implementing real-time audio mixing based on Android standards for low-latency audio using Oboe and the NDK. It is like laying tracks without power tools while the train is running at full speed.</p><h3>Carchestra Preview Release</h3><p>Despite these challenges, we managed to build a working MVP and submit it to the contest. This first version of Carchestra includes one song, “Without You” by Michael Badal, with different intensity levels triggered by speeds up to 60 km/h (we call it the “urban mode”), and effects like turning or coming to a standstill.</p><p>Speaking of challenges, the final challenge was of course the Appning App Challenge. 
When we got an email in mid-April, we were excited to learn that we had won the challenge! Thanks a lot to Appning and the jury for recognizing our work. We are looking forward to working with the Appning team to bring Carchestra to BMW and other OEMs.</p><p><a href="https://www.linkedin.com/posts/appning-by-forvia_faapchallenge-activity-7317537864366411778-z_WR">𝗜𝘁&#39;𝘀 𝗼𝗳𝗳𝗶𝗰𝗶𝗮𝗹 - we have our winner! | Appning by FORVIA</a></p><p>If you have read this far, you are probably interested in some more technical details. We are planning to release more blog posts about the technical aspects of developing for cars in general and Carchestra specifically.</p><h3>Future plans</h3><p>We already have many ideas for Carchestra: we are planning to add more music and more sophisticated effects, and, as mentioned above, we hope to make the app available in more cars in the near future through all the major AAOS app stores.</p><p>But first, we would like to hear your feedback: Which features and what type of music would you like to see in Carchestra? Is everything working smoothly on your car? You can reach us at <a href="mailto:carchestra@paradoxcat.com">carchestra@paradoxcat.com</a> or through the contact form on the <a href="http://carchestra.com/">Carchestra website</a>.</p><p>Is your company interested in Android apps for cars? 
Then please get in touch with us – we can support you with the development of new apps or the porting of existing ones.</p><p><em>Authors: </em><a href="https://medium.com/u/e1bd9945b957"><em>Johan von Forstner</em></a><em>, </em><a href="https://medium.com/u/e9d632ce1251"><em>Adrian Tappe</em></a><em>, Lucía León Milán</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=8b54b16540b7" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/introducing-our-new-app-carchestra-winner-of-the-appning-app-challenge-8b54b16540b7">Introducing our new app: Carchestra — winner of the Appning App Challenge</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Gatti pazzi a Milano —  PARADOX CAT at #ECCV2024 ]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/gatti-pazzi-a-milano-paradox-cat-at-eccv2024-1e1918b0c60b?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/1e1918b0c60b</guid>
            <dc:creator><![CDATA[Johan von Forstner]]></dc:creator>
            <pubDate>Thu, 17 Oct 2024 13:25:26 GMT</pubDate>
            <atom:updated>2024-12-05T09:50:21.712Z</atom:updated>
            <content:encoded><![CDATA[<h3>Gatti pazzi a Milano — PARADOX CAT at #ECCV2024 🍕</h3><p>After having a great time <a href="https://medium.com/paradox-cat-tech-hub/paradox-cat-at-iccv2023-a65528060274">at last year’s ICCV in Paris</a> learning about the latest scientific advances in Computer Vision, we at <a href="https://paradoxcat.com/ai">Paradox Cat’s AI team</a> were delighted to see that another major computer vision event, the <a href="https://eccv2024.ecva.net/">European Conference on Computer Vision (ECCV)</a> in October 2024, took place even closer to Munich, in the beautiful city of Milan in Northern Italy 🇮🇹. This time, our working student Priya and I took the trip together using our Paradox Cat <a href="https://paradoxcat.com/en/career/">training budget</a> and it was a lot of fun!</p><p>After a scenic train ride through the Alps we arrived in Milan on a Saturday afternoon with beautiful sunny weather and 26 °C 🌞. The weather wouldn’t stay quite as nice over the rest of the week, but that didn’t matter too much to us as the conference was jam-packed with interesting content. 😸</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*otJ2SW36mnzJa5bu_VCpuQ.jpeg" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*S1oWidkBkEz9I1IiIbTsMw.jpeg" /><figcaption>Welcome to ECCV!</figcaption></figure><p>Sunday and Monday were dedicated to workshops and tutorials, which could be regarded as mini-conferences focusing on specific subjects within the field of Computer Vision, organized independently from the main conference and with a few varying formats (invited talks, challenges, poster and oral sessions). 
We enjoyed many different engaging workshops and tutorials such as the annual workshop on <a href="https://hands-workshop.org/workshop2024.html">observing and understanding hands in action</a>, Meta’s <a href="https://www.projectaria.com/events/eccv2024/">Project Aria tutorial</a>, or the workshops on <a href="https://sites.google.com/view/t-cap-2024">human analysis</a>, <a href="https://sites.google.com/view/spatial-ai-eccv24">spatial AI</a>, and <a href="https://nianticlabs.github.io/map-free-workshop/2024/">map-free visual localization</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*frFoB2Bx2IVT8hdelUTTQw.jpeg" /><figcaption>Fancy lighting in the “Gold Room” lecture hall, where the opening ceremony is about to start</figcaption></figure><p>This year’s ECCV was the largest to date with about 7,000 participants — and the scale of this event only really became clear on Tuesday, the first day of the main conference, where the opening ceremony took place and was streamed into all three large lecture halls. But the organizers did a great job in preparing for the crowds, with ample seating available in all the oral sessions and also a generous spacing of posters in the exhibition hall.</p><p>The opening ceremony also included the Best Paper Award, which was presented to <a href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/08113.pdf"><em>Minimalist Vision with Freeform Pixels</em></a>, an intriguing approach to solve vision problems with a minimalist camera with as few as 8 pixels of irregular shapes, providing benefits for privacy and power efficiency.</p><p>The main conference was again filled with so many great presentations that it’s impossible to list all of them. Among the most popular topics were generative AI (image and video), multimodal models (vision &amp; language, vision &amp; audio), applications of foundation models, 3D and multi-view vision techniques, and many more. 
I’ll mention two papers that I was particularly impressed by:</p><p><a href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/00529.pdf"><em>Sapiens: Foundation for Human Vision Models</em></a><em><br></em>Self-supervised pretraining of large foundation models has been a big trend fueling the recent breakthroughs in text and image generation models. This paper from Meta marks the first success in bringing this concept to the realm of human-centric vision tasks: They pretrain a Vision Transformer model on a large dataset of high-resolution pictures of humans using the masked autoencoder (MAE) technique. This pretrained model can then quickly be fine-tuned for downstream tasks like body pose estimation, body part segmentation, or depth and surface normal estimation, surpassing previous state-of-the-art results.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/1*DJceH4u3oi2dAmZbtbfSpA.gif" /><figcaption>Demo of pose, segmentation, depth and normals predicted by Sapiens [<a href="https://github.com/facebookresearch/sapiens/blob/main/assets/01.gif">Source</a>]</figcaption></figure><p><a href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/09080.pdf"><em>Grounding Image Matching in 3D with MASt3R​</em></a><br>This paper won the map-free visual localization challenge, surpassing baseline methods by such a large margin that the organizers initially suspected the authors had cheated. Given just two images of the same scene under challenging conditions (outdoor scenes captured on different days with different lighting and extreme viewpoint changes), the task of the challenge is to accurately reconstruct the relative camera pose between the two images. 
MASt3R solves the problem in 3D, taking into account not just local feature descriptors but also the 3D geometry via a depth map estimated from each image.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*tWRMsOe0HKIVy6-T" /><figcaption>Point correspondences in challenging scenes predicted by MASt3R [<a href="https://github.com/naver/mast3r/blob/main/assets/examples.jpg">Source</a>]</figcaption></figure><p>Of course, there were also a lot of popular sessions about new trends in generative AI and multimodal foundation models. But besides these highlights, we also found many more niche presentations in the poster sessions that were still quite relevant for our work in automotive in-cabin sensing.</p><p>In addition to the scientific program, ECCV included three keynote talks on Tuesday, Wednesday and Thursday. The first one was from Lourdes Agapito and Vittorio Ferrari, presenting the journey of their startup Synthesia and their newest advances in AI avatar video generation, with some impressive and funny demonstrations at the end. In the second keynote, law and AI ethics expert Sandra Wachter gave insights into fairness and transparency in AI systems and how current anti-discrimination legislation in the EU is not quite prepared for dealing with these issues. 
Finally, on Thursday, Stanford professor Sanmi Koyejo talked about distribution shifts in AI and described how, despite the advances in foundation models with improved generalization capabilities, distribution shifts are still an issue in many Machine Learning applications and domain adaptation/domain generalization techniques are still a crucial field of research.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*JNQQ3rq509r3gR00pf7Eog.jpeg" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*yDssSQBrMh7zvd1f2A-gZA.jpeg" /><figcaption>ECCV gala dinner &amp; party on Thursday night</figcaption></figure><p>After a fun and insightful week in Milan, we headed back to Munich on Friday afternoon. Thanks a lot to the ECCV 2024 team for organizing such a great conference and to all the authors who presented their awesome work. And of course thank you to Paradox Cat for allowing us to participate! 🐱</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=1e1918b0c60b" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/gatti-pazzi-a-milano-paradox-cat-at-eccv2024-1e1918b0c60b">Gatti pazzi a Milano —  PARADOX CAT at #ECCV2024 🍕</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Everything you need to know about Android on Raspberry Pi]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/android-on-raspberry-pi-aa4b8eea72c6?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/aa4b8eea72c6</guid>
            <category><![CDATA[android-bsp]]></category>
            <category><![CDATA[android]]></category>
            <category><![CDATA[raspberry-pi]]></category>
            <category><![CDATA[diy]]></category>
            <category><![CDATA[aosp]]></category>
            <dc:creator><![CDATA[Viktor Mukha]]></dc:creator>
            <pubDate>Tue, 24 Sep 2024 13:44:05 GMT</pubDate>
            <atom:updated>2024-09-24T13:44:05.590Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*KAYy7AIVJ7_7UrsP" /><figcaption>Generated by AI using ChatGPT’s DALL·E</figcaption></figure><h3>In this article we are going to cover:</h3><ul><li>4 Reasons you should use Android on Raspberry Pi</li><li>Where should you get Android for Raspberry Pi?</li><li>Which one to choose?</li><li>Future prospects.</li></ul><h3>4 Reasons you should use Android on Raspberry Pi</h3><p>Raspberry Pi is arguably one of the world’s most popular single-board computers. You probably already have one, don’t you?</p><p>Android is the <a href="https://www.businessofapps.com/data/android-statistics/#:~:text=Android%20is%20the%20most%20popular,manufacturers%20in%20the%20early%202010s.">most popular operating system in the world</a>. There is a huge variety of applications to choose from.</p><p>This popularity, combined with the increasing power of Raspberry Pi, opens up many possibilities for using Android on Raspberry Pi. Let us explore these briefly.</p><h4>Reason #1: Gaming</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*C61Pzn-gWN6Ad7d9zLpcCg.png" /><figcaption>Get retro vibes from your Raspberry Pi with Android</figcaption></figure><p>Raspberry Pi with Android is one of the most affordable gaming platforms on the market, whether using a keyboard and mouse with your desktop monitor, or using controllers with your TV. Simply install an Android game as an APK or use another app to emulate one of the retro consoles.</p><h4>Reason #2: Home Entertainment</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*04hffYoSgi2BrfHq.jpg" /><figcaption>TV with bias lighting</figcaption></figure><p>Once your TV’s built-in functionality gets old, there are plenty of affordable ways to breathe some life back into it. 
You could obviously get something like a Fire TV Stick, Apple TV, the <a href="https://www.theverge.com/2024/8/6/24214471/google-chromecast-line-discontinued">discontinued</a> Google Chromecast, or the all-new Google TV Streamer. If you feel adventurous, you could also take a look at the myriad of Chinese streamers sold on Amazon and AliExpress. However, none of these would give you the flexibility of a Raspberry Pi.</p><p>The Raspberry Pi opens up so many more possibilities. It is a great centerpiece for home entertainment.</p><p>You can choose an Android TV Lineage OS build or use any Android flavor and simply install an app like <a href="https://www.plex.tv">Plex</a> to have a centralized streaming console.</p><p>Raspberry Pi officially supports Widevine, which means that Netflix, Disney+ and all the major streaming services work.</p><p>With the help of <a href="https://www.hifiberry.com">HiFiBerry</a>, you could do multi-room audio, turn passive speakers into active ones, correct room acoustics with DSP, build a high-quality streamer, and dive deep into DIY audio.</p><p>Projects like <a href="https://github.com/shrocky2/Hyperion">Hyperion</a> enable DIY lighting solutions for <a href="https://en.wikipedia.org/wiki/Bias_lighting">bias lighting</a>.</p><p>Simply plug a webcam into a USB port of a Raspberry Pi and your TV turns into a video-call-enabled device. Since all major videoconferencing software runs on Android phones, the chances are high that it would work really well.</p><h4>Reason #3: Low-Volume Product</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*bAp2AXL8v6b5Z_RCfJij_w.jpeg" /><figcaption>Point of Sale System</figcaption></figure><p>Let’s say you need to build a point-of-sale terminal and you only need 10 of them. Maybe you already have an Android application that you could at least partially reuse. 
Or you could quickly deploy a solution using apps like <a href="https://squareup.com/">Square</a> or <a href="https://www.lightspeedhq.com/vend/">Vend</a>.</p><p>Not every company needs a professional and scalable hardware solution from NXP, Qualcomm, or other vendors right away. Sometimes you need a quick and inexpensive way to test your proof of concept.</p><p>Android is a great platform for an embedded solution with a touch display: it offers a lot of features that you would otherwise need to implement yourself in a more standard Yocto distribution.</p><p>Thus, a combination of Raspberry Pi and Android is a very attractive option for such prototyping.</p><h4>Reason #4: Car Navigation System</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*rNOjgCZLrd67yhWKyW2E6g.png" /><figcaption>Car Navigation</figcaption></figure><p>Once your car’s embedded system cannot keep up with technology, the typical solution is to use your phone either directly or via a so-called projection mode such as CarPlay (for iPhones) or Android Auto, effectively projecting your phone’s screen onto the car’s display(s).</p><p>Not all cars support projection modes, so there are things you can do with a Raspberry Pi and an external display to get around this.</p><p>There are many off-the-shelf solutions on the market, mostly from the Far East. However, if you want an up-to-date Android system without built-in backdoors and with some DIY capabilities, Raspberry Pi with an external touch display is a great place to start.</p><p>The main goal of the project would undoubtedly be navigation with Google Maps. 
But again, just like in home entertainment, you could pimp your sound system, install some ambient lighting, and configure it all to play well with your Android system.</p><h4>Bonus: Other Reasons</h4><p>There are plenty of other projects that can be realized with a combination of Raspberry Pi and Android.</p><p>However, this may not be the best solution for organizations due to licensing issues or insufficient hardware performance.</p><p>Feel free to contact us at <a href="https://paradoxcat.com">Paradox Cat</a> if you have a project in mind. We can help you understand the costs and benefits involved, and we can build a custom AOSP-based solution that fits your needs precisely.</p><h3>Where should you get Android for Raspberry Pi?</h3><p>To run Android, Raspberry Pi needs an Android Board Support Package (Android BSP). This BSP is a collection of tools and drivers that allow specific hardware to boot into vanilla AOSP. Normally, it is the hardware vendor’s responsibility to create such a BSP.</p><p>However, the Raspberry Pi vendor (Raspberry Pi Foundation) only officially supports the Raspberry Pi OS.</p><p><strong>There is no official Android support for Raspberry Pi.</strong></p><h4>Unofficial Android Support</h4><p>Fortunately, Google provides the source code for the main part of the Android OS via the Android Open Source Project (AOSP). This, combined with the open-source nature of most of the Raspberry Pi drivers for Linux, has enabled people to build their own Android BSPs for Raspberry Pi.</p><p>The development of an Android BSP for the original Raspberry Pi <a href="https://phys.org/news/2012-08-raspberry-pi-android.html">dates back to 2012</a>. 
Over time, the developer community has centered around the <a href="https://groups.google.com/g/android-rpi">android-rpi Google group</a>, which is still active today.</p><p>Let us take a look at the landscape of Android BSPs for Raspberry Pi today.</p><blockquote>Disclaimer</blockquote><blockquote>What follows is our humble research into available Android BSPs. We are not affiliated with any of the authors, and would be happy to be corrected. It is also difficult to determine the original authorship of the code, as some of it has often been copied without proper attribution. It seems that all the BSP developers have been working in parallel, looking at each other’s changes and cherry-picking some, but not all.</blockquote><h4>android-rpi</h4><p>android-rpi is a developer community found on GitHub (<a href="https://github.com/android-rpi">https://github.com/android-rpi</a>), as well as Google Groups (<a href="https://groups.google.com/d/forum/android-rpi">https://groups.google.com/d/forum/android-rpi</a>).</p><p>The history of this community goes back to 2015 with the release of Android 5.0 on Raspberry Pi 2. The community is still active and continues to release the source code of Android BSPs for new Raspberry Pi versions.</p><p>There are no binary images available for download.</p><p>The entry point for building your own image for Raspberry Pi 5, for example, can be found at <a href="https://github.com/android-rpi/device_arpi_rpi5">https://github.com/android-rpi/device_arpi_rpi5</a>.</p><p>This build is based on Android TV and comes with its own open source launcher called <a href="https://github.com/peyo-hd/RpLauncher">RpLauncher</a> and TV settings app called <a href="https://github.com/peyo-hd/LbSettings">LbSettings</a>. 
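</p><p>For orientation, building such a BSP image typically follows the standard repo-based AOSP workflow. The sketch below is illustrative only: the manifest URL, branch, and lunch target are placeholders, and each project’s README documents the exact values and any additional local-manifest steps.</p><pre># Fetch the source tree (URL and branch are placeholders; see the BSP’s README)
repo init -u https://android.googlesource.com/platform/manifest -b &lt;branch&gt;
repo sync -j8

# Set up the build environment and select the device target
source build/envsetup.sh
lunch &lt;device_target&gt;-userdebug

# Build the system images
make -j$(nproc)</pre><p>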
If you want to know more about Android launchers, we have written <a href="https://medium.com/paradox-cat-tech-hub/custom-android-launcher-why-and-how-do-i-build-one-6a1b3af89d43">another article</a> about them.</p><h4><strong>raspberry-vanilla</strong> and <strong>lineage-rpi by KonstaKang</strong></h4><p><a href="https://github.com/KonstaT">KonstaKang</a> has been consistently providing ready-to-use Raspberry Pi Android images for years. On <a href="https://konstakang.com/">https://konstakang.com/</a>, alongside images for other devices, one can find many different binary Raspberry Pi Android images for free download:</p><ul><li>Raspberry Pi 4: <a href="https://konstakang.com/devices/rpi4/">https://konstakang.com/devices/rpi4/</a></li><li>Raspberry Pi 5: <a href="https://konstakang.com/devices/rpi5/">https://konstakang.com/devices/rpi5/</a></li></ul><p>Most of the source code for these images, along with build instructions, can be found in two different GitHub “organizations”:</p><ol><li>The <a href="https://github.com/raspberry-vanilla">raspberry-vanilla</a> collection of repositories has everything you need to build a vanilla AOSP for Raspberry Pi 4 or 5. 
You can build the kernel and the Android OS all by yourself.</li><li>The <a href="https://github.com/lineage-rpi">lineage-rpi</a> collection of repositories is used to build an Android OS distribution based on <a href="https://lineageos.org">LineageOS</a>, which provides a little more than vanilla AOSP, including its own TV launcher.<br><em>NOTE: these repositories became private after LineageOS 17.1, so you can only build the kernel, but not the rest.</em></li></ol><h4>GloDroid</h4><p>The project was initiated by <a href="https://www.linkedin.com/in/roman-stratiienko-03474a92">Roman Stratiienko</a> at <a href="https://www.globallogic.com">GlobalLogic</a> with the main goal of creating a platform for training Android BSP developers [<a href="https://www.youtube.com/watch?v=Em_ITP3Y2ik">#44 Proof My Concept : GloDroid</a>]. GitHub profile: <a href="https://github.com/rsglobal">https://github.com/rsglobal</a></p><p>It was originally started for Orange Pi; Raspberry Pi support was added in 2023: <a href="https://github.com/GloDroidCommunity/raspberry-pi/commit/7c0e3b199f9c7f5a43ad76cf9688811300ce0490">https://github.com/GloDroidCommunity/raspberry-pi/commit/7c0e3b199f9c7f5a43ad76cf9688811300ce0490</a></p><p>Source code: <a href="https://github.com/GloDroidCommunity/raspberry-pi">https://github.com/GloDroidCommunity/raspberry-pi</a></p><p>Prebuilt images: <a href="https://github.com/GloDroidCommunity/raspberry-pi/releases">https://github.com/GloDroidCommunity/raspberry-pi/releases</a></p><p>Project status: <a href="https://github.com/GloDroidCommunity/raspberry-pi/issues/1">https://github.com/GloDroidCommunity/raspberry-pi/issues/1</a></p><p>Discord channel: <a href="https://discord.gg/5H8cW5xA">https://discord.gg/5H8cW5xA</a></p><p>GloDroid also provides builds based on LineageOS, but you have to build it yourself using this script: <a href="https://github.com/GloDroidCommunity/raspberry-pi/blob/main/unfold_lineageos.sh">unfold_lineageos.sh</a>. 
See <a href="https://github.com/GloDroidCommunity/raspberry-pi/blob/main/README.md">README.md</a> for details.</p><h4>OmniROM</h4><p>OmniROM is a custom ROM distribution founded in 2013. The OmniROM team also founded TWRP (Team Win Recovery Project), which is included in the <strong>raspberry-vanilla</strong> and <strong>lineage-rpi</strong> builds as the default recovery partition. Android 13 builds for the Raspberry Pi 4, when tested by us, resulted in an unstable experience, with frequent OS crashes and visual stuttering. Due to these unresolved issues, we would not recommend using this distribution for this particular version of Android. The latest stable release for Raspberry Pi 4 appears to be Android 12.1.</p><h4>Emteria Android OS</h4><p>Emteria customizes and maintains the Android OS for popular off-the-shelf hardware and industrial platforms, including Raspberry Pi. This BSP is closed-source and commercial.</p><p>More info: <a href="https://emteria.com/">https://emteria.com/</a></p><h3>Which one to choose?</h3><p>It depends on your requirements and your use case. Here is a checklist:</p><ul><li>What version of Raspberry Pi do you target? Different versions have different hardware and require different Android BSPs.</li><li>Can you build it yourself? Current AOSP builds <a href="https://source.android.com/docs/setup/start/requirements#hardware-requirements">require</a> an Ubuntu machine with at least 64 GB of RAM.</li><li>Do you need support for all the hardware that Raspberry Pi offers? Is anything in particular more important than the rest?</li><li>Do you want to use it commercially? Check the license.</li><li>Do you want the vanilla AOSP experience, LineageOS, or something else?</li></ul><p>As you can see, the KonstaKang and GloDroid BSPs are the most prominent AOSP distributions for the Raspberry Pi 4 and 5. However, while similar, these images are built differently, offer different experiences, and are intended for different applications. 
To help you make your choice, we have compiled the following table with the functionality of various components:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/a39075949c64eaf1081510d149d1fbf1/href">https://medium.com/media/a39075949c64eaf1081510d149d1fbf1/href</a></iframe><p>The flashing procedures also differ between the two: when flashing to an SD card, each project has its own scripts and instructions.</p><p>Besides using an SD card, a nice addition in GloDroid is fastboot over USB. Simply call:</p><pre>adb reboot bootloader</pre><p>and off you go, fastboot is enabled.</p><p>KonstaKang builds use a TWRP recovery partition for easy OTA updates. To enter this recovery mode, simply flip a switch in the Raspberry Pi submenu of the modified Settings app and reboot. This submenu also offers some nice features, such as the ability to view and change the screen resolution.</p><p>Another difference is that GloDroid uses U-Boot, the “Universal Boot Loader”, while the KonstaKang builds boot directly from the standard Raspberry Pi EEPROM bootloader.</p><h3>Future prospects</h3><p>It seems that the fragmentation of Raspberry Pi Android BSPs is mostly caused by the different goals of the projects. 
We do not see these projects converging in the near future.</p><p>It is also highly unlikely that the Raspberry Pi Foundation will start officially supporting Android.</p><p>Fortunately, we have a community where we can find support and play around with Android on Raspberry Pi.</p><p><em>Special thanks to </em><a href="https://medium.com/u/63d80726cb34"><em>James Gatt</em></a><em> for co-authoring this story.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=aa4b8eea72c6" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/android-on-raspberry-pi-aa4b8eea72c6">Everything you need to know about Android on Raspberry Pi</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Immersive Showroom Experiences: The Future of Car Buying with spatial computing and augmented…]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/immersive-showroom-experiences-the-future-of-car-buying-with-spatial-computing-and-augmented-64bcb29d0371?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/64bcb29d0371</guid>
            <dc:creator><![CDATA[Paradox Cat GmbH]]></dc:creator>
            <pubDate>Tue, 18 Jun 2024 13:07:37 GMT</pubDate>
            <atom:updated>2024-07-10T12:41:09.046Z</atom:updated>
            <content:encoded><![CDATA[<h3><strong>Immersive Showroom Experiences: The Future of Car Buying with spatial computing and augmented reality (AR)</strong></h3><p><strong>Spatial computing, an emerging technology that encompasses the digital integration of physical and virtual environments, has transformed numerous industries by enhancing user engagement, increasing operational efficiency and enabling innovative applications. With the rise of new headsets like the Apple Vision Pro and Meta Quest 3, spatial computing has significant potential in the automotive sector. The car showroom experience and configurator can become highly immersive, opening doors to a new kind of experience, particularly through augmented reality (AR). <br>This Insight explores the use of spatial computing by means of augmented reality applied to interactive car configurators, highlighting its benefits in enhancing customer experience, increasing sales, and strengthening brand loyalty.</strong></p><h3>What is spatial computing?</h3><p>Spatial computing is a paradigm that involves the interaction of digital and physical worlds through the integration of spatial and contextual data. It includes technologies such as augmented reality (AR), virtual reality (VR) and mixed reality (MR), which enable users to interact with digital content in a spatially aware environment.</p><h3>Importance in the automotive industry</h3><p>The automotive industry is increasingly adopting digital technologies to meet evolving consumer expectations. AR, as a component of spatial computing, offers unique opportunities to engage customers more deeply, providing interactive and immersive experiences that traditional digital interfaces cannot match.</p><h3>Augmented reality online car configurator</h3><p>An interactive car configurator allows customers to personalize and visualize their vehicle choices in real-time. 
Integrating AR into these configurators elevates the user experience by enabling customers to see a life-size, interactive 3D model of their customized vehicle in their own environment.</p><h3><strong>Benefits of immersive <br>car configurator experience</strong></h3><h3>1. Best use of retail space: More cars in limited retail space</h3><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FB3l1yM1dkUQ%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DB3l1yM1dkUQ&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FB3l1yM1dkUQ%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/f2475def3f03042ab0b6f8b3e34ad925/href">https://medium.com/media/f2475def3f03042ab0b6f8b3e34ad925/href</a></iframe><ul><li><strong>Digital exhibition space for city car showrooms:</strong> The modern city car showroom has limited retail space and is usually focused on one or two example vehicles. <br>AR eliminates the need for physical display models, allowing dealerships to display an unlimited number of car models and configurations digitally. This reduces the physical space required for car exhibitions, making room for a more engaging digital showroom.</li><li><strong>Cost efficiency:</strong> Reducing the number of physical cars on display lowers storage and maintenance costs while maximizing the use of available space. <br>This reduction in physical footprint can lead to significant cost savings in real estate and facility management.</li></ul><h3>2. Ready to “WOW” your customers: Immersive experience</h3><ul><li><strong>Immersive visual and interactive appeal: </strong>AR creates an immersive experience that captures customers’ attention and imagination. 
<br>The ability to visualize their customized vehicle in a realistic 3D space, walk around the car, view it from different angles and in different environments, and interact with it to see how different customization options look in real life significantly enhances the “wow” factor.</li><li><strong>Interactive experience: </strong>Customers can engage with the car’s features in real-time, exploring different configurations and options in a highly interactive manner, such as changing colors and rims and adjusting interior settings, leading to a more satisfying, enjoyable and memorable shopping experience.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*1z1NP-K3BZ7A9vuO.png" /></figure><h3>3. Innovative Features: AI + Personalization</h3><ul><li><strong>Customization and personalization: </strong>AR technology allows for extensive customization options, enabling customers to tailor every aspect of their vehicle to their preferences.</li><li><strong>New AI features:</strong> The possibilities with AR are vast, including integrating AI for personalized recommendations, overlaying information about car features, and even simulating driving experiences in a virtual environment.</li></ul><h3>4. Increase brand experience at the showroom</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*uRvpNWMtkIFOiTZq.png" /></figure><ul><li><strong>Emotional experience: </strong>Due to the highly immersive character of the experience, customers can explore different realities at the point of sale in a multisensorial environment and at a 1:1 scale (the real size of the car). <br>This can have a high impact on the customer’s choice and on the emotional experience with the car.</li><li><strong>Brand differentiation: </strong>Adopting cutting-edge AR technology distinguishes a brand as innovative and customer-centric, enhancing its competitive edge in the market as well as its reputation and appeal.</li></ul><h3>5. 
Increased sales and conversion rates</h3><ul><li><strong>Informed decision-making: </strong>By offering a detailed and realistic view of the vehicle, AR helps customers make more informed decisions, reducing uncertainty and increasing confidence in their purchase. In an immersive experience, the customer can see their own personalized variant of the product at actual size, rather than relying on samples.</li><li><strong>Upselling opportunities:</strong> AR can visualize the appeal of additional features and options, encouraging customers to consider higher-end configurations and accessories via a true-to-scale representation of the vehicle.</li></ul><h3>6. Easy to use and roll out in the car showroom</h3><ul><li><strong>Plug &amp; Play: </strong>Modern AR solutions are ready to use with minimal setup, allowing dealerships to quickly integrate AR configurators into their sales processes. Thanks to technological advancements, the new spatial computing technologies have become more user-friendly. Out-of-the-box solutions that need neither controllers nor advanced rigging systems improve the user experience and reduce the complexity of past VR systems.</li><li><strong>Intuitive interfaces:</strong> AR platforms are designed to be intuitive, requiring little to no training for customers and sales staff to use effectively.</li></ul><h3>7. Data &amp; customer relationships</h3><ul><li><strong>Data-driven insights:</strong> AR configurators can collect data on customer preferences and behaviors, providing valuable insights for personalized marketing, improving customer relationships and tracking customer preferences.</li><li><strong>Market expansion:</strong> The flexibility of AR allows brands to reach a wider audience, displaying real-scale digital car models without the limitations of physical showrooms. 
This makes it easier to expand into different markets thanks to its out-of-the-box roll-out.</li></ul><h3>Technical implementation challenges</h3><h4>AR technology stack</h4><p>Implementing an AR car configurator requires a robust technology stack that includes:</p><ul><li><strong>3D modeling and rendering:</strong> High-quality 3D models of vehicles and customizable parts.</li><li><strong>AR SDKs and frameworks:</strong> Tools such as ARKit (iOS), ARCore (Android), or Web-based AR solutions.</li><li><strong>Backend infrastructure:</strong> Efficient data management systems to handle user configurations, preferences, and interactions.</li><li><strong>User Interface (UI) design:</strong> An intuitive and user-friendly interface to facilitate seamless interaction with the AR environment.</li></ul><h4>Integration with existing systems</h4><p>To ensure a smooth and effective implementation, the AR car configurator must be integrated with the brand’s existing digital infrastructure:</p><ul><li><strong>E-commerce platforms</strong>: Synchronization with online stores for real-time pricing, availability, and purchasing options.</li><li><strong>Customer Relationship Management (CRM) systems</strong>: Integration to track user preferences, interactions and feedback for personalized marketing and support.</li><li><strong>Analytics tools</strong>: Deployment of analytics tools to monitor user behavior, engagement metrics, and configurator performance.</li></ul><h4>Future prospects</h4><p>The future of AR in car configurators is promising, with advancements in spatial computing poised to offer even more sophisticated and immersive experiences. 
Future developments may include:</p><ul><li><strong>Enhanced realism</strong>: Improved rendering technologies will enable even more lifelike and detailed visualizations.</li><li><strong>Expanded interactivity</strong>: Increased functionality, such as test-driving virtual vehicles in AR environments.</li><li><strong>Integration with AI</strong>: Artificial intelligence can provide personalized recommendations and adjustments based on user preferences and behaviors.</li></ul><h3>Conclusion</h3><p>Augmented reality represents a transformative opportunity for the automotive industry, particularly in the realm of online car configurators. Thanks to its intuitive user interface and easy-to-roll-out solutions, this technology has the potential to take car personalization at the dealership to a new dimension. By providing an immersive and personalized experience, dealers can enhance customer satisfaction, drive sales, and improve the customer experience at the showroom.</p><p>Automotive brands that adopt AR technology have a high potential to redefine the usual showroom experience for the customer and make better use of retail space.</p><p>More than this, by adopting AR in online car configurators, automotive companies can significantly improve the customer journey, from initial immersive exploration at home to the final showroom purchase, establishing themselves as first movers in the digital transformation of the automotive retail experience.</p><p><strong>Authors</strong></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/164/0*wgBNi5uCSNkuvgjg.png" /></figure><p><strong>Vladimir Moldovanu</strong><br>Creative Director <a href="https://www.rpc-partners.com/">rpc</a></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/185/0*1AC2NJeo_akEm3SV.png" /></figure><p><strong>Alvaro Alonso</strong><br>Sales Director <a href="https://paradoxcat.com/">ParadoxCat</a></p><p>Can we help you with your project? 
Contact us at <a href="mailto:sales@paradoxcat.com">sales@paradoxcat.com</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=64bcb29d0371" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/immersive-showroom-experiences-the-future-of-car-buying-with-spatial-computing-and-augmented-64bcb29d0371">Immersive Showroom Experiences: The Future of Car Buying with spatial computing and augmented…</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[PARADOX CAT at #ICCV2023 ]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/paradox-cat-at-iccv2023-a65528060274?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/a65528060274</guid>
            <category><![CDATA[scientific-conference]]></category>
            <category><![CDATA[computer-vision]]></category>
            <dc:creator><![CDATA[Johan von Forstner]]></dc:creator>
            <pubDate>Thu, 14 Dec 2023 11:47:40 GMT</pubDate>
            <atom:updated>2023-12-14T11:47:40.260Z</atom:updated>
            <content:encoded><![CDATA[<p>In October 2023, I had the great opportunity to join the International Conference on Computer Vision (ICCV), one of the top scientific conferences in the Computer Vision field, which was held in Paris this year. Our <a href="https://paradoxcat.com/ai/">AI division</a> at Paradox Cat is working on new technologies for interior sensing and HMI interaction in vehicles, so joining ICCV was a perfect way to learn about the latest research in the field using my Paradox Cat <a href="https://paradoxcat.com/en/benefits-en/">training budget</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*GC2zBXdOFvinWDjv" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/0*4t_YcPlF5eJtJpQR.jpg" /><figcaption>a) Finally arrived at the entrance to ICCV — b) One of the oral sessions, Photo: <a href="https://iccv2023.thecvf.com/oral.sessions-48-11-203.php">ICCV</a></figcaption></figure><p>After getting off the train on which I had been speeding through the French countryside at 320 km/h, I headed directly to the conference venue at Paris Expo Porte de Versailles, where the workshops and tutorial sessions were already ongoing. I joined the tutorial on <a href="https://www.eecs.yorku.ca/~mbrown/ICCV2023_Brown.html">Understanding the In-Camera Rendering Pipeline and the role of AI/Deep Learning</a>, which gave an extensive background on how modern cameras perceive the world and which algorithms are needed to turn the raw sensor output into beautiful images.</p><p>The following three days constituted the main conference, with two tracks of oral sessions, as well as poster sessions in between. By now, it had certainly become clear that the field of Computer Vision and AI is really exploding — with many poster sessions rivaling the Paris Métro in terms of crowding, it was not always easy to push one’s way through towards the poster one wanted to see. 
Presentations from the well-known Big Tech companies, as well as the award-winning papers like <a href="https://omnimotion.github.io/">Tracking Everything Everywhere All at Once</a>, <a href="https://segment-anything.com/">Segment Anything</a> and <a href="https://openaccess.thecvf.com/content/ICCV2023/papers/Zhang_Adding_Conditional_Control_to_Text-to-Image_Diffusion_Models_ICCV_2023_paper.pdf">ControlNet</a>, received their well-deserved attention.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ziUCpy-Qx8WYn7REQy-gBA.jpeg" /><figcaption>Packed poster session at ICCV 2023</figcaption></figure><p>It was also interesting to see that Computer Vision is becoming very interdisciplinary, in the sense that many methods are not just bound to specific problems, but are often applied all across the field and combined in new creative ways. This also meant that it probably became pretty hard for the conference organizers to group the papers into distinct sessions, and there really was something interesting in almost every session.</p><p>Besides the oral and poster sessions, the final two days also incorporated two invited keynote sessions. 
The talk from Pushmeet Kohli (Google DeepMind) was especially inspiring, giving an insight into how DeepMind is using AI to help solve hard problems in science, such as their recent success in predicting protein structures with <a href="https://deepmind.google/technologies/alphafold/">AlphaFold</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*syk0GUflBzKd6RkegNP3Ng.jpeg" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*otdwUVOAp7Sfa-QHrbFsNg.jpeg" /><figcaption>a) HuggingFace 🤗 Open Source AI community event at STATION F — b) trying to talk to the <a href="https://ai.meta.com/llama/">LLaMA</a>s at the entrance 😉</figcaption></figure><p>Au revoir, et merci beaucoup, Paris 🥐🇫🇷!<br>Thanks a lot to the ICCV 2023 organizers for this great event, and thank you to Paradox Cat for letting me participate! I hope that we can also present some of our own work at a future conference.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=a65528060274" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/paradox-cat-at-iccv2023-a65528060274">PARADOX CAT at #ICCV2023 🥐</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[From apps to Operating Systems, how an AOSP training changed our perspective.]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/from-apps-to-operating-systems-how-an-aosp-training-changed-our-perspective-4216cde22ce5?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/4216cde22ce5</guid>
            <category><![CDATA[android]]></category>
            <category><![CDATA[software-engineering]]></category>
            <category><![CDATA[automotive]]></category>
            <category><![CDATA[aosp]]></category>
            <category><![CDATA[software-development]]></category>
            <dc:creator><![CDATA[Fernando Gallego]]></dc:creator>
            <pubDate>Fri, 10 Nov 2023 14:39:34 GMT</pubDate>
            <atom:updated>2023-11-10T14:39:34.039Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*tgD4Ggb6LAEAwANrER5MTA.png" /><figcaption>Image generated with DALL-E 3</figcaption></figure><p><strong>Being part of Paradox Cat means that from time to time you get to do specialisation trainings on technologies that not so many people get access to.</strong> In our case, we had the opportunity to participate in an exclusive online training course about the topics most of us need for working with our customers. This time we did a 40-hour training on the Android Open Source Project (AOSP), with a dedicated day on Android Automotive, given by an embedded-systems expert who has been in the industry for many years.</p><p>During this course we were able to listen to the lessons and then get our hands dirty trying the concepts ourselves on a preconfigured remote machine in the cloud (which made the setup for the course a breeze). When you want to study something new, listening to a few lessons is not enough. We developers learn by doing and tinkering, and to do so for this course, we had a safe sandboxed environment to try out the commands we had just learned, change some parameters, experiment, etc., and see the results ourselves without the fear of breaking our system.</p><p>We touched on many topics involved in the building and customization of an operating system as complex as Android. We started by creating a new “product”: this means we created a new device configuration that allowed us to customize the build by adding new packages, services and features.</p><p>Once we had this new “product”, we applied all the things we saw during the lessons. From replacing the kernel with a new one we built from scratch, looking at the startup scripts and adding a new booting animation, to adding new modules and system services, modifying SELinux policies, adding a new HAL and calling JNI code. 
Then, for the Automotive part, we looked at the Vehicle HAL, vendor properties and the Car Service.</p><p>This knowledge lets us be more effective on our customer project and make more informed decisions about the parts of the system where we are involved. Since AOSP doesn’t really provide a lot of public documentation, it is comforting to have someone who understands it explain the most difficult parts to you and to be able to ask questions. Even better, you get to try it yourself.</p><p>All in all, it was a lot of content condensed into just 40 hours, which means we had to review and experiment on our own afterwards to get the most out of it. However, I think it was very useful and allowed us to understand at a lower level how AOSP works and how the overall architecture is designed to work on mobile devices. As an app developer, it is a completely different Android world that you don’t get to see when you create mobile applications. It means really getting into the nitty-gritty details of an operating system, and if you have never done it before, it is exciting and daunting at the same time; but if you are eager to learn new things outside of your comfort zone, this is your place. I am very happy that Paradox Cat gave me the opportunity to take this training and expand my knowledge, enabling me to grow as a Software Engineer.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=4216cde22ce5" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/from-apps-to-operating-systems-how-an-aosp-training-changed-our-perspective-4216cde22ce5">From apps to Operating Systems, how an AOSP training changed our perspective.</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Machine Learning systems for HMI interaction — insights from our World Café session at Auto.AI]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/machine-learning-systems-for-hmi-interaction-insights-from-our-world-caf%C3%A9-session-at-auto-ai-156dca4cb65a?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/156dca4cb65a</guid>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[automotive]]></category>
            <category><![CDATA[human-machine-interface]]></category>
            <category><![CDATA[in-cabin-automotive-ai]]></category>
            <category><![CDATA[sensors]]></category>
            <dc:creator><![CDATA[Johan von Forstner]]></dc:creator>
            <pubDate>Fri, 10 Nov 2023 10:13:27 GMT</pubDate>
            <atom:updated>2023-11-10T10:13:27.731Z</atom:updated>
            <content:encoded><![CDATA[<h3>Machine Learning systems for HMI interaction — insights from our World Café session at Auto.AI Europe 2022</h3><p>Recently, PARADOX CAT took part in <a href="https://www.auto-ai.eu/">Auto.AI Europe</a> 2022 in Berlin. Auto.AI is one of the leading conferences on the applications of Deep Learning in the automotive industry, including but not limited to Levels 4 and 5 of autonomous driving. As a company that specializes in in-cabin AI applications, we hosted a World Café session on the topic of architecting robust machine learning systems for HMI interaction within autonomous vehicles, moderated by Johan von Forstner (Principal Machine Learning Engineer) and Alexander Schaub (Director Development).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*pKgxsAfB9cih6Yg6.jpg" /><figcaption>Johan and Alexander moderate our World Café session at Auto.AI</figcaption></figure><p>Our World Café session focused on several important challenges in the development of AI systems for natural interaction in the vehicle, and many attendees participated in the interesting discussions around these topics.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*T2SsgDI4YlDFHr54.jpg" /><figcaption>The participants’ ideas were collected on the board and further discussed in following World Café rounds</figcaption></figure><h3>Auto.AI takeaways</h3><h4>Leveraging new and existing sensors</h4><p>Safety-critical in-cabin sensing applications such as driver monitoring or child presence detection, but also convenience features like gesture control, greatly benefit from dedicated sensors such as interior cameras with RGB and near-infrared capabilities or depth sensors like time-of-flight cameras. Modern millimeter-wave radars, which are traditionally used as exterior sensors for autonomous driving, are also quickly gaining popularity in the interior space due to their low cost and power consumption. 
They are even accurate enough to detect vital signs, including breathing and heart rate. On the other hand, participants stressed that it should be possible to leverage the existing sensors in the cabin for new use cases with machine learning and sensor fusion, including microphones, steering wheel and seat occupancy sensors.</p><h4>Avoiding false positives</h4><p>In gesture or voice interaction systems, false positives can easily lead to frustration and increased driver distraction if they trigger unwanted actions in the HMI. There is always a tradeoff between making the system as natural and easy to use as possible and keeping the gestures or voice commands distinct enough to avoid triggering them accidentally. In other words: should the system be trained to work well for every user, or should the user be trained to use the system correctly? This problem is hard to solve in general, so it requires tailored solutions depending on the application. Possible approaches include personalization of the system to the user through active learning, as well as combination with other modalities like gaze and pose detection to provide additional context.</p><h4>Deploying active and federated learning systems</h4><p>Just like autonomous driving systems, in-cabin sensing applications usually benefit from large and diverse datasets collected from the fleet. However, sending data back from customer vehicles may raise privacy concerns, especially in the case of interior cameras. Federated learning, where only gradients of the deep learning model are transmitted, provides a privacy-friendly approach to this; however, it is not directly applicable in cases where we cannot benefit from self-supervised techniques to label the data. In these cases, it could be combined with active learning, where the user can directly help to improve the AI system by providing labels for a couple of data samples. 
In the car, special care needs to be taken to choose a suitable UX design for such active learning sessions.</p><h4>PARADOX CAT and in-cabin AI</h4><p>PARADOX CAT has created a new division (PARADOX AI) with a focus on artificial intelligence and machine learning, which has a startup character within our already established software development business. We strongly believe that this technology will have the greatest impact on everyone’s lives in the future. Focusing on new in-cabin applications of AI, we enable our automotive customers to develop prototypes in this field based on cutting-edge research. In addition, with our experience in series development of automotive HMI systems, we can support customers with the integration of AI applications into these platforms to bring them into production. Do you want to learn more about our PARADOX AI team and services? We look forward to <a href="https://paradoxcat.com/en/contact/">getting in touch</a>.</p><p><em>Originally published at </em><a href="https://paradoxcat.com/ai/auto-ai-europe-2022/"><em>paradoxcat.com</em></a><em> on September 29, 2022.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=156dca4cb65a" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/machine-learning-systems-for-hmi-interaction-insights-from-our-world-caf%C3%A9-session-at-auto-ai-156dca4cb65a">Machine Learning systems for HMI interaction — insights from our World Café session at Auto.AI</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Custom Android Launcher — Why and How do I build one?]]></title>
            <link>https://medium.com/paradox-cat-tech-hub/custom-android-launcher-why-and-how-do-i-build-one-6a1b3af89d43?source=rss----4466e279ddff---4</link>
            <guid isPermaLink="false">https://medium.com/p/6a1b3af89d43</guid>
            <category><![CDATA[android-app-development]]></category>
            <category><![CDATA[mobile-app-development]]></category>
            <category><![CDATA[aosp]]></category>
            <category><![CDATA[android-launcher]]></category>
            <category><![CDATA[android]]></category>
            <dc:creator><![CDATA[Viktor Mukha]]></dc:creator>
            <pubDate>Tue, 07 Nov 2023 16:56:01 GMT</pubDate>
            <atom:updated>2023-11-08T11:22:40.575Z</atom:updated>
            <content:encoded><![CDATA[<h3>Custom Android Launcher — Why and How do I build one?</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*WdedIjmgwZnG_ys4tCoL2w.jpeg" /></figure><blockquote>Disclaimer. I have presented most of this material at DroidCon 2023 in Berlin. If you prefer a video, you can find it here: <a href="https://www.droidcon.com/2023/08/01/custom-launcher-why-and-how-do-i-build-one/">https://www.droidcon.com/2023/08/01/custom-launcher-why-and-how-do-i-build-one/</a></blockquote><h3><strong>What is a launcher?</strong></h3><p>According to the source code of Android OS, a Launcher, or Home App, is an <strong>application</strong> which “replaces the home screen and gives access to the contents and features of your device”. It is usually responsible for <em>listing and starting other applications</em> and for <em>hosting widgets</em>.</p><p>Here is an example of a launcher:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/299/1*oUEMJvtE-BjGo-O6e62QyQ.png" /></figure><p>Guess what your phone screen would look like without a launcher? You would only see what is called System UI:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/296/1*eVk-JnNDcCuYA7-evFttpw.png" /><figcaption>No launcher in system (only System UI is visible)</figcaption></figure><p>A home app is essential for any Android system. Try to start your applications without it!</p><p>Every Android device comes with a pre-installed launcher. Device manufacturers may create their own home app, or simply reuse the vanilla launcher that comes with AOSP (Android Open Source Project). Since it is Open Source, you can actually inspect the original launcher source code. 
For instance, Launcher3 for Android phones can be found here: <a href="https://cs.android.com/android/platform/superproject/main/+/main:packages/apps/Launcher3/src/com/android/launcher3/Launcher.java">https://cs.android.com/android/platform/superproject/main/+/main:packages/apps/Launcher3/src/com/android/launcher3/Launcher.java</a></p><p>The user can choose the default home app under Settings → Apps → Default Apps → Home app:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/290/1*JZrEq-9jktq3GTg1-e6NaA.png" /><figcaption>Default home app chooser</figcaption></figure><p>The default launcher handles the “HOME” action, which can be triggered by a hardware button or a swipe-up-from-bottom gesture. The selected default home application will always be at the bottom of the activity back stack. In simple words, if you keep going “BACK”, you will always arrive at the default launcher.</p><h3><strong>Why create a new launcher?</strong></h3><p>There are multiple possible reasons:</p><ul><li>Performance</li><li>Features</li><li>Usability</li><li>Design</li><li>Customization</li></ul><p>These are the reasons behind every custom launcher you see in the Google Play Store. In fact, most stock launcher features today were once only available in custom launchers.</p><p>If you check Wikipedia, you will find a table with more than 60 launchers for phones and tablets alone. Obviously, not all of them can keep up with Android development, and some became obsolete when their features were integrated into stock launchers.</p><p>Another reason to create a new home app is the plethora of Android devices out there: phones, foldables, tablets, TVs, cars, wearables. They all need a launcher.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/497/1*EcreQUhPPVsKM2euMAkIHg.png" /><figcaption>Source: <a href="https://www.android.com/everyone/">https://www.android.com/everyone/</a></figcaption></figure><p>There are many possible applications for Android OS. 
Here are just a few of them.</p><ul><li>Vending machines, coffee machines, beer machines</li><li>Public transport: entertainment, info screens, ticket machines</li><li>Shared &amp; private transportation: cars, bikes, scooters</li><li>Smart home, IoT devices</li></ul><p>Of course, not all of them run Android (yet), but if they do, they all need a launcher.</p><h3><strong>How to develop a launcher?</strong></h3><p>First of all, make your application a launcher. Add this category to your intent filter in <strong>AndroidManifest.xml</strong>:</p><pre>&lt;category android:name=&quot;android.intent.category.HOME&quot; /&gt;</pre><p>The actual meaning of this category and other related ones can be found in <a href="https://cs.android.com/android/platform/superproject/main/+/main:frameworks/base/core/java/android/content/Intent.java">Intent.java</a>; here are example definitions extracted from this file:</p><blockquote><strong>android.intent.category.HOME<br></strong>This is the home activity, that is the first activity that is displayed when the device boots.</blockquote><blockquote><strong>android.intent.category.HOME_MAIN (hidden API, only for platform)<br></strong>This is the home activity that is displayed when the device is finished setting up and ready for use.</blockquote><blockquote><strong>android.intent.category.SECONDARY_HOME<br></strong>The home activity shown on secondary displays that support showing home activities.</blockquote><blockquote><strong>android.intent.category.SETUP_WIZARD (hidden API, only for platform)<br></strong>This is the setup wizard activity, that is the first activity that is displayed when the user sets up the device for the first time.</blockquote><blockquote><strong>android.intent.category.LAUNCHER_APP (hidden API, only for platform)<br></strong>This is the home activity, that is the activity that serves as the launcher app from there the user can start other apps. 
Often components with lower/higher priority intent filters handle the home intent, for example SetupWizard, to setup the device and we need to be able to distinguish the home app from these setup helpers.</blockquote><p>The next step in becoming a launcher is to list the apps. To get a list of apps, use <strong>Package Manager’s </strong><a href="https://developer.android.com/reference/android/content/pm/PackageManager#queryIntentActivities(android.content.Intent,%20android.content.pm.PackageManager.ResolveInfoFlags)"><em>queryIntentActivities</em>()</a> API:</p><pre>val intent = Intent(Intent.ACTION_MAIN)<br>intent.addCategory(Intent.CATEGORY_LAUNCHER)<br>val flags = PackageManager.ResolveInfoFlags.of(<br>    PackageManager.MATCH_ALL.toLong())<br>val activities: List&lt;ResolveInfo&gt; =<br>    context.packageManager.queryIntentActivities(intent, flags)</pre><p>For ease of use, map this list to your own app objects:</p><pre>data class App(<br>    val name: String,<br>    val packageName: String,<br>    val icon: Drawable?<br>)<br><br>val installedApps = activities.map { resolveInfo -&gt;<br>    App(<br>        name = resolveInfo.loadLabel(packageManager).toString(),<br>        packageName = resolveInfo.activityInfo.packageName,<br>        icon = resolveInfo.loadIcon(packageManager)<br>    )<br>}</pre><p>Start an app via such an object:</p><pre>fun App.launch(context: Context) {<br>    val intent = context.packageManager.getLaunchIntentForPackage(packageName) ?: return<br>    context.startActivity(intent)<br>}</pre><h3><strong>How to display applications?</strong></h3><p>Let us start by looking at the existing user interface modalities. It seems that the world has settled on multi-screen navigation with a configurable grid of icons grouped with the app names.</p><p>However, a grid is not the only option. 
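</p><p>As a small, platform-independent sketch of one alternative (illustrative plain Kotlin, not code from any real launcher): a flat list with fast scrolling can be driven by an alphabetical index built from the app objects. The trimmed-down <code>App</code> model below omits the icon so the snippet has no Android dependency:</p>

```kotlin
// Trimmed-down App model (icon omitted to keep the sketch Android-free).
data class App(val name: String, val packageName: String)

// Sort apps case-insensitively by display name, then group them by the
// first letter so the UI can render section headers and a fast-scroll bar.
fun buildIndex(apps: List<App>): Map<Char, List<App>> =
    apps.sortedBy { it.name.lowercase() }
        .groupBy { it.name.first().uppercaseChar() }
```

<p>The fast-scroll bar then only needs the map keys; tapping a letter scrolls the list to that section.</p><p>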
Experiment with simple flat lists, embedded search or filtering using text or speech, fast scrolling using first letters, etc.</p><p>Take a look at <a href="https://play.google.com/store/apps/details?id=bitpit.launcher&amp;hl=de&amp;gl=US">Niagara Launcher</a>’s approach:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/291/1*qUAebM-LWA_8MTLzaaFJKA.gif" /></figure><p>Or it could be as simple as <a href="https://github.com/tanujnotes/Olauncher">OLauncher</a>:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/291/1*9K63RGiZbHZrnHiTf_h6rA.gif" /></figure><p>Also, remember to give users a way to see all of their installed apps in one place. For example, using an App Drawer, which one can “draw” from the bottom of the screen with a swipe-up gesture:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/299/1*1Kis35V7_UnVLTqLi-KlGQ.png" /></figure><p>That said, do not limit yourself to application icons and app widgets only. <strong>Get creative </strong>and build the user interface around what your users actually need. For example, a SpaceX rocket launcher (pun intended) might look something like this:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*0cVdqLhPTqvtUj2HjTZwNw.png" /><figcaption><em>Source: </em><a href="https://www.figma.com/community/file/855715967691534013"><em>https://www.figma.com/community/file/855715967691534013</em></a></figcaption></figure><h3>App Widgets</h3><p>A story about launchers would be incomplete without talking about App Widgets.</p><p>This is an example of a standard clock widget:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/299/1*ML0mWWEzL2Kjty4Xebc7MA.gif" /></figure><p>I will not bother you with implementation details or widget types; you can find great official documentation <a href="https://developer.android.com/develop/ui/views/appwidgets/overview">here</a>. 
If you use Jetpack Compose, check out the <a href="https://developer.android.com/jetpack/androidx/releases/glance">Glance</a> library; it is truly great!</p><p>Instead, let us discuss <strong>the uses </strong>and <strong>limitations </strong>of App Widgets.</p><p>It is clear that widgets are quite useful for applications like calendar, clock, and weather. However, if you want to have something like a fully interactive map on your home screen all the time, App Widgets unfortunately are not going to cut it.</p><p>And why is that?</p><ul><li>Interaction. Gestures are limited to only <strong>touch </strong>and <strong>vertical swipe</strong>, so as not to conflict with Home App gestures.</li><li>UI. Not all UI elements are available, so you cannot create a full-featured home screen app as an App Widget.</li><li>Battery. By default, App Widgets can refresh at most once every <strong>30 minutes (!)</strong>. That says a lot about the design decisions.</li></ul><p>But wait a minute, isn’t that what the default launcher of Android Automotive OS does? Showing a giant interactive map in a launcher?</p><p>Yes, it does:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*1TjoSrwJrD3-TZ-P3monTw.png" /><figcaption>CarLauncher in AAOS based on Android 12</figcaption></figure><p>If it is not an App Widget, what is it?</p><p>In the <a href="https://cs.android.com/android/platform/superproject/main/+/main:packages/apps/Car/Launcher/src/com/android/car/carlauncher/CarLauncher.java">source code</a>, Google calls these widgets “Cards”. Cards are implemented on top of a lower-level window management facility. This facility used to be an <a href="https://cs.android.com/android/platform/superproject/+/android-11.0.0_r48:packages/apps/Car/Launcher/src/com/android/car/carlauncher/CarLauncher.java;l=41"><strong>ActivityView</strong></a>, which was deprecated and eventually removed in Android 12. 
Now it is a <a href="https://cs.android.com/android/platform/superproject/+/main:packages/apps/Car/Launcher/src/com/android/car/carlauncher/CarLauncher.java;l=58"><strong>TaskView</strong></a>, which is rightfully a <strong>hidden platform API</strong>.</p><p>What does <strong>“hidden platform API” </strong>mean for application developers? Simply put:</p><ul><li>it should not be used by non-system applications,</li><li>it will likely change in a future Android version.</li></ul><p>Yes, there are ways to expose this API to applications, but if you want your application to survive Android updates, I urge you to consider a better design alternative.</p><p>Implement the element you want to “embed” directly in the launcher. As long as you have control over the providing app, you can extract the relevant code parts into common libraries and design proper communication between the “Widget” and the “App” yourself.</p><p>That said, your launcher should probably still support embedding App Widgets as well; some users depend on them.</p><p>We have covered a lot of ground in this story. 
To summarize, we have learned the following.</p><ul><li>A <strong>launcher</strong> is a home screen <strong>application</strong>.</li><li>The world needs new launchers that are fast and stable.</li><li>Now you know how to <strong>make an application a launcher</strong> and how to <strong>list other applications</strong>.</li><li>You should support <strong>App Widgets</strong>, but do not let them limit your design.</li></ul><p>Be creative and focus on what your users need.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6a1b3af89d43" width="1" height="1" alt=""><hr><p><a href="https://medium.com/paradox-cat-tech-hub/custom-android-launcher-why-and-how-do-i-build-one-6a1b3af89d43">Custom Android Launcher — Why and How do I build one?</a> was originally published in <a href="https://medium.com/paradox-cat-tech-hub">Paradox Cat Tech Hub</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>