Casting custom content from Android app to TV (2020 approach)

Iqaan · Published in The Startup · Jun 28, 2020 · 8 min read

If you’re anything like me, you’ve never created a screen-casting app before and want to explore the options for doing it. Let’s dive in together.

Things you should know already:
- Creating Android apps (Kotlin is used in this post)

Screen mirroring or screen casting

For most users, screen mirroring and screen casting might mean the same thing: presenting content from a phone or another small device on a TV. It’s crucial to know the difference, though, if you’re integrating one of these into your own apps.

Screen Mirroring refers to showing your phone’s display on the TV as is. That means any clutter on the phone, navigation icons, and every other clickable item appear on the TV as well, even though on most TVs you can’t interact with those UI elements.

Screen Casting means showing only the meaningful content on the TV. It’s used when you want to play a full-screen video on TV while your phone is showing controls, or maybe show graphs and charts on TV while your phone displays data in tabular format.

Protocols

The most common protocols used for casting are:
1. ChromeCast (Google Cast)
2. MiraCast
3. AirPlay
4. Amazon Fling
5. DLNA
6. DIAL
7. Samsung SmartTV

We’ll focus on the first two of the list. I might soon add AirPlay and Amazon Fling as well (although AirPlay works on Apple devices only).

MiraCast

MiraCast is essentially a screen-mirroring protocol. The typical use case is to mirror the entire screen of your small device (phone or tablet) on a bigger device (TV). It is possible to override the system behaviour and display your own (custom) views using MiraCast as well, which from a developer’s perspective is a second screen displayed on top of the mirrored screen.

On Android, MiraCast devices aren’t visible to the app unless that MiraCast device has already been selected for casting from the system settings (at least as of Android 8.0).

Once a MiraCast device is selected from settings, the app can use the MediaRouter API or the DisplayManager API to get hold of a Display object, which can then be provided to a Presentation object.
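For the DisplayManager route, a minimal sketch could look like the following (findPresentationDisplay is a hypothetical helper name; selectedDisplay and showPresentationDialog are from the snippets below):

// A sketch of the DisplayManager alternative. DISPLAY_CATEGORY_PRESENTATION
// filters for displays that are suitable for a Presentation.
private fun findPresentationDisplay() {
    val displayManager = requireContext().getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    val displays = displayManager.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION)
    if (displays.isNotEmpty()) {
        selectedDisplay = displays.first()
        showPresentationDialog()
    }
}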

With the MediaRouter API, you can look for the selected MiraCast device by checking the selected route in the onResume method of your Activity or Fragment.

override fun onResume() {
    super.onResume()
    findSelectedDevice()
}

private fun findSelectedDevice() {
    // mediaRouter is assumed to be initialized with MediaRouter.getInstance(context)
    val route = mediaRouter.selectedRoute
    if (route.playbackType != MediaRouter.RouteInfo.PLAYBACK_TYPE_LOCAL && route.presentationDisplay != null) {
        selectedDisplay = route.presentationDisplay
        showPresentationDialog()
    }
}

A Presentation object is just a common Android dialog. When you call its show() method, instead of being presented on the phone’s screen, the dialog opens in fullscreen mode on the TV. You can now put any layout in that presentation dialog.

private fun showPresentationDialog() {
    if (selectedDisplay == null)
        return

    if (presentationDialogFragment != null && presentationDialogFragment!!.presentationDisplay != selectedDisplay) {
        dismissPresentationDialog()
    }

    if (presentationDialogFragment == null) {
        presentationDialogFragment = MyCustomPresentation(selectedDisplay!!)
        try {
            presentationDialogFragment!!.show(activity!!.supportFragmentManager, "MY_PRESENTATION_DIALOG_FRAGMENT")
        } catch (e: WindowManager.InvalidDisplayException) {
            Toast.makeText(context, "Invalid cast display", Toast.LENGTH_LONG).show()
        }
    }
}
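The dismissPresentationDialog() helper used above isn’t shown in the original snippet; a minimal version could simply be:

private fun dismissPresentationDialog() {
    // Close the presentation on the TV and forget the reference
    presentationDialogFragment?.dismiss()
    presentationDialogFragment = null
}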

Now create your own MyCustomPresentation class that inherits from Presentation.

You can also inherit from DialogFragment and then, inside its onCreateDialog, return a Presentation object.

class MyCustomPresentation(val presentationDisplay: Display) : DialogFragment() {
    override fun onCreateDialog(savedInstanceState: Bundle?): Dialog {
        // Returning a Presentation here makes the dialog render on the given Display
        return Presentation(requireContext(), presentationDisplay)
    }

    override fun onCreateView(inflater: LayoutInflater, container: ViewGroup?, savedInstanceState: Bundle?): View? {
        super.onCreateView(inflater, container, savedInstanceState)
        return inflater.inflate(R.layout.my_layout, container, false)
    }
}

Then you can add views inside my_layout.xml however you want.
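For example, you could populate a view on the TV from the presentation fragment (my_text_view is a hypothetical id inside my_layout.xml):

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    // Anything rendered here shows up on the TV, not on the phone
    view.findViewById<TextView>(R.id.my_text_view).text = "Hello from the phone!"
}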

To test this, run the application and go to Settings > Cast (it may be called Smart View or something similar). Choose your MiraCast device, come back to the application, and you should see your layout on the TV.

It’s reasonable to think that all the rendering of the views for MiraCast support has to be done by the phone itself rather than by the TV. Under the hood, the phone is probably rasterizing this Presentation view into a video stream and sending it out to the TV.

ChromeCast

Until recently, ChromeCast supported this MiraCast-like behaviour as well. You could get hold of a Display object using the CastRemoteDisplayLocalService class, provide that Display to a Presentation object, and work the same way as with MiraCast. However, this behaviour is no longer supported and doesn’t work now.

Instead, with ChromeCast you have two options: one is to use the Default Media Receiver (or a Styled Media Receiver, which is the same as the Default Media Receiver except that it can be styled using CSS); the other is to use a Custom Receiver.

With the Default Media Receiver, you can present media content (videos, photos, and audio) in supported formats (MP4, JPEG, and other common formats) on the TV without doing any hard work on the TV side. You can even present live video content in one of the three supported streaming protocols: HLS, DASH, and SmoothStreaming.
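As a rough sketch (assuming you already have a connected CastSession, which is covered in the sender app section below, and using a placeholder video URL), loading media onto the Default Media Receiver looks roughly like this:

// Sketch only: the URL is a placeholder; castSession comes from the SessionManager below
val mediaInfo = MediaInfo.Builder("https://example.com/video.mp4")
    .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
    .setContentType("video/mp4")
    .build()
castSession?.remoteMediaClient?.load(mediaInfo, MediaLoadOptions.Builder().setAutoplay(true).build())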

However, if you need to show your own custom content on ChromeCast (that is, non-media elements like text, charts, or virtually anything), you need to create an app for the ChromeCast.

The ChromeCast app is essentially a web application running inside a Google Chrome instance in fullscreen mode, without the address bar, menus, and all the other chrome.

To create a ChromeCast application, you need a cast application ID, which you can get by registering on the Google Cast Dev Console and creating a new application in that console. The registration fee is US $5 as of the writing of this post. The receiver application URL is also set up here.

For your application to be seen by your ChromeCast device while it is not yet published, you also need to register your ChromeCast device’s serial number in the Cast Dev Console.

Once registered, you can start coding both the sender side (the phone application) and the TV side (the web application) in order to establish custom communication channels.

Sender App
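Before writing any sender code, the Cast framework and MediaRouter libraries need to be on the classpath. A sketch of the Gradle (Kotlin DSL) entries follows; the version numbers here are assumptions, so use the latest stable releases:

// build.gradle.kts (module level) — versions are assumptions
dependencies {
    implementation("androidx.mediarouter:mediarouter:1.1.0")
    implementation("com.google.android.gms:play-services-cast-framework:18.1.0")
}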

Create a class that inherits from OptionsProvider. This class is referenced from AndroidManifest.xml.

import com.google.android.gms.cast.framework.CastOptions
import com.google.android.gms.cast.framework.OptionsProvider
import com.google.android.gms.cast.framework.SessionProvider
// other imports

class CastOptionsProvider : OptionsProvider {

    companion object {
        fun getCastAppIdFromContext(context: Context): String {
            return context.resources.getString(R.string.cast_app_id)
        }
    }

    override fun getCastOptions(context: Context): CastOptions {
        return CastOptions.Builder()
            .setReceiverApplicationId(getCastAppIdFromContext(context))
            .build()
    }

    override fun getAdditionalSessionProviders(context: Context?): MutableList<SessionProvider>? {
        return null
    }
}

Add these lines to AndroidManifest.xml, inside the application element. Make sure the class path matches your own project.

<meta-data
android:name="com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME"
android:value="com.mycompany.myapp.utils.CastOptionsProvider" />

<meta-data
android:name="com.google.android.gms.version"
android:value="@integer/google_play_services_version" />

Create a menu layout that contains a cast button. Use the AndroidX library’s MediaRouteActionProvider, which will handle all the cast-related plumbing and change the state of the button automatically.

<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">

<item android:id="@+id/media_route_menu_item"
android:title="@string/media_route_menu_title"
app:actionProviderClass="androidx.mediarouter.app.MediaRouteActionProvider"
app:showAsAction="always"
/>
</menu>

Then, inside your Fragment or Activity, override onCreateOptionsMenu. For a Fragment, you might also need to call setHasOptionsMenu(true) inside onCreate.
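For the Fragment case, that call is just (a sketch, assuming your cast button lives in the Fragment’s options menu):

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // Without this, a Fragment's onCreateOptionsMenu is never called
    setHasOptionsMenu(true)
}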

override fun onCreateOptionsMenu(menu: Menu, inflater: MenuInflater) {
    inflater.inflate(R.menu.trainee_conference_menu, menu)

    val mediaRouteMenuItem = menu.findItem(R.id.media_route_menu_item)
    val mediaActionProvider = MenuItemCompat.getActionProvider(mediaRouteMenuItem) as MediaRouteActionProvider
    mediaActionProvider.setAlwaysVisible(true)

    mediaRouteSelector.also(mediaActionProvider::setRouteSelector)
}

The last line of code will attach this cast button to a MediaRouteSelector object. You can configure the type of devices that should be visible to the app using mediaRouteSelector.

val mediaRouteSelector = MediaRouteSelector.Builder()
    .addControlCategory(MediaControlIntent.CATEGORY_LIVE_VIDEO)
    .addControlCategory(CastMediaControlIntent.categoryForCast(CAST_APP_ID))
    .build()

By using CastMediaControlIntent.categoryForCast(CAST_APP_ID), you make the devices that can launch the receiver application you registered in the Dev Console visible to the app.

Create a MediaRouter instance and a SessionManager instance and register their callbacks. The MediaRouter callbacks handle the cast device being selected, unselected, or switched. The SessionManager callbacks let us know when the ChromeCast receiver app is loading, loaded, closing, or closed.

val mediaRouter = MediaRouter.getInstance(application.applicationContext)

private var castSessionManager: SessionManager =
    CastContext.getSharedInstance(application.applicationContext).sessionManager

override fun onResume() {
    // ... lines above ...
    mediaRouter.addCallback(mediaRouteSelector, mediaRouterCallback, MediaRouter.CALLBACK_FLAG_PERFORM_ACTIVE_SCAN)
    castSessionManager.addSessionManagerListener(sessionManagerListener)
}

override fun onPause() {
    // ... lines above ...
    mediaRouter.removeCallback(mediaRouterCallback)
    castSessionManager.removeSessionManagerListener(sessionManagerListener)
}

Define the MediaRouter callback functions:

// You can write your own logic in each of these functions
private val mediaRouterCallback = object : MediaRouter.Callback() {
    override fun onRouteSelected(router: MediaRouter?, route: MediaRouter.RouteInfo?) {
    }

    override fun onRouteUnselected(router: MediaRouter?, route: MediaRouter.RouteInfo?) {
    }

    override fun onRoutePresentationDisplayChanged(router: MediaRouter?, route: MediaRouter.RouteInfo?) {
    }

    override fun onRouteChanged(router: MediaRouter?, route: MediaRouter.RouteInfo?) {
    }
}

Define the SessionManager callback functions:

// Assumed properties on the enclosing class:
// private var castSession: CastSession? = null
// private var castSessionEnabled = false
private val sessionManagerListener = object : SessionManagerListener<Session> {
    override fun onSessionStarted(session: Session?, sessionId: String?) {
        castSession = session as? CastSession
        castSessionEnabled = true
        sendTestLogMessage()
    }

    override fun onSessionResumeFailed(p0: Session?, p1: Int) {
    }

    override fun onSessionSuspended(p0: Session?, p1: Int) {
        castSessionEnabled = false
    }

    override fun onSessionEnded(p0: Session?, p1: Int) {
        castSessionEnabled = false
    }

    override fun onSessionResumed(p0: Session?, p1: Boolean) {
        castSessionEnabled = true
    }

    override fun onSessionStarting(p0: Session?) {
    }

    override fun onSessionResuming(p0: Session?, p1: String?) {
    }

    override fun onSessionEnding(p0: Session?) {
    }

    override fun onSessionStartFailed(p0: Session?, p1: Int) {
    }
}

Send a message to the cast device

// I used Timber for logging. You can just use Log.d(...)
fun sendMessageToChromecastDevice(namespace: String, jsonMessage: String, successCallback: (() -> Unit)? = null) {
    if (castSessionEnabled) {
        try {
            castSession?.sendMessage(namespace, jsonMessage)
                ?.setResultCallback { status ->
                    Timber.d("Message to $namespace $status")
                    if (status.isSuccess) {
                        successCallback?.invoke()
                    }
                }
        } catch (e: Exception) {
            Timber.e(e)
        }
    }
}

Namespaces are crucial for establishing communication channels. For ChromeCast, namespaces must start with “urn:x-cast:”. For example, you can define a namespace like this:

val LOG_NAMESPACE = "urn:x-cast:com.mycompany.myapp.Log"

Let’s see what sendTestLogMessage looks like:

// You can call this function on some button click events
fun sendTestLogMessage() {
    val jsonMessage = Gson().toJson(LogMessageOut(message = "Screen touched at ${LocalDateTime.now()}"))
    sendMessageToChromecastDevice(LOG_NAMESPACE, jsonMessage) {
        // success callback code
        Toast.makeText(context, "Logged", Toast.LENGTH_LONG).show()
    }
}

LogMessageOut class:

data class LogMessageOut(
    @SerializedName("message")
    var message: String? = null
)
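For reference, Gson serializes this class into a flat JSON object; that object is what the receiver later reads as customEvent.data:

// Produces the string: {"message":"hello"}
val json = Gson().toJson(LogMessageOut(message = "hello"))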

At this point, your sender app is able to tell the ChromeCast to open the application (the web application whose URL is defined in the Dev Console), respond to cast device select/unselect events, respond to receiver app lifecycle events, create communication channels using namespaces, and send test log messages.

Receiver app

Building a receiver app that listens to log messages from the sender app and prints them on the screen is trivial.

First, create an HTML file (preferably index.html) with the following content:

<html>
<head>
    <script type="text/javascript" src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
</head>
<body>
    <div style="color: red; font-size: 108pt;">My App</div>
    <div id="logger" style="color: white;"></div>
    <script src="cast_script.js"></script>
</body>
</html>

Create cast_script.js with the following content:

var LOG_NAMESPACE = "urn:x-cast:com.mycompany.myapp.Log";

var castReceiverContext = cast.framework.CastReceiverContext.getInstance();

var logElement = document.querySelector("#logger");
logElement.innerText = "Logging Events\n\n";

castReceiverContext.addCustomMessageListener(LOG_NAMESPACE, function (customEvent) {
    logElement.innerText += LOG_NAMESPACE + " - " + customEvent.data.message + "\n";
    console.log(LOG_NAMESPACE + " - " + customEvent.data.message);
});

// JSON is the default message type, but you can set it explicitly to be sure
var castReceiverOptions = new cast.framework.CastReceiverOptions();
castReceiverOptions.customNamespaces = Object.assign({});
castReceiverOptions.customNamespaces[LOG_NAMESPACE] = cast.framework.system.MessageType.JSON;

castReceiverContext.start(castReceiverOptions);

Host the receiver app files at the URL you provided in the Dev Console. For hosting on a local server, you can use a localhost tunnelling service like ngrok or PageKite, because the ChromeCast normally won’t recognize local servers.

For debugging the ChromeCast receiver application, you can go to chrome://inspect in Google Chrome, or edge://inspect in Microsoft Edge. (For me, the Chrome inspector wasn’t working well with a 1st gen ChromeCast, but worked fine in Edge.)

Also, the ChromeCast 1st gen wasn’t updated after Chrome 70.x, while the ChromeCast 2nd gen was running Chrome 76.x. So some ES6 features didn’t work on the ChromeCast 1st gen and caused a lot of trouble for me. To get around that, I used Babel to convert the ES6 code into ES5.
