Roku Test Automation

More lessons from the field

Mike Theiss
Globant
14 min read · Nov 11, 2022

Overview

In my previous articles about Roku test automation, I covered the basics of getting started with Roku WebDriver and more advanced topics, including working with dynamic content, testing video playback scenarios, and developing solutions to mitigate crashes during test runs. In this third installment of my series on Roku test automation, I will cover some additional topics of interest for Roku test automation developers: running test automation against Beta and Public channels, incorporating automation in CI workflows, incorporating screen capture capabilities in your solution, and device compatibility testing.

In case you missed the first two articles in the series, you can read them here:

Using Roku WebDriver with Sideloaded Channels vs. Beta or Public Channels

Different Ways to Install Test Channels with Different Test Implications

The most common way to test Roku channels during development is to sideload them onto the device from a zip file containing the source output. This is where you should start when you build a Roku WebDriver solution, as it is the most versatile way to test Roku channels. However, Roku also provides an option for development teams, referred to as "beta channels," which allows channels under development to be loaded onto devices through a user's Roku account, much like the public channels consumers download from the Roku Channel Store. You may even want to run automated tests against a public channel in the Roku Channel Store that your team is responsible for developing. Here we will discuss the implications of running automated tests depending on how your channel has been deployed to the device.

Sideloaded Channel Testing Considerations

Using sideloaded channels is generally the way to go when using test automation because this option gives you full access to Roku's debugging utilities and has the fewest hoops to jump through when setting up your tests. You need to sideload your build if you want to use telnet to monitor your device's debug output on port 8085 and use the interactive debugger. The same applies if you want to use the screen capture utility built into the device's developer services.
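As a sketch of how that debug output can be consumed programmatically, the snippet below streams lines from port 8085 over a plain TCP socket (the function name and line-buffering approach are my own; this only works against sideloaded builds):

```python
import socket

def stream_debug_log(device_ip: str, port: int = 8085, timeout: float = 5.0):
    """Yield decoded lines from the device's BrightScript debug console.

    Port 8085 carries the main debug output for the sideloaded channel.
    """
    with socket.create_connection((device_ip, port), timeout=timeout) as conn:
        buf = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:  # connection closed by the device
                break
            buf += chunk
            # Emit complete lines; keep any partial line in the buffer.
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                yield line.decode("utf-8", errors="replace").rstrip("\r")
```

A test framework could run this generator on a background thread and archive the output alongside each test's results.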

However, there are some additional considerations to be aware of. When running automated tests that rapidly exit and re-open sideloaded channels, I have observed unexpected errors and crashes related to read and write operations on the device registry. The design of your channel will likely be a factor here, but the issues I encountered were not reproducible in the same way on beta or public channels. Adding a timed delay of several seconds after exiting the channel, before relaunching to start a new set of tests, seems to reliably prevent this type of odd behavior. I also think you will find that the debugging capabilities present for sideloaded channels increase the resource demands on the Roku devices under test, which may be a factor when testing low-memory devices.
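A minimal sketch of that exit-settle-relaunch pattern, using Roku's documented External Control Protocol (ECP) endpoints on port 8060 (the helper names and the five-second default are my own choices; "dev" is the channel ID assigned to a sideloaded build):

```python
import time
import urllib.request

ECP_PORT = 8060  # Roku External Control Protocol (ECP) port

def ecp_url(device_ip: str, path: str) -> str:
    """Build an ECP request URL for the device under test."""
    return f"http://{device_ip}:{ECP_PORT}/{path}"

def ecp_post(device_ip: str, path: str) -> None:
    """ECP commands are POST requests with empty bodies."""
    req = urllib.request.Request(ecp_url(device_ip, path), data=b"", method="POST")
    urllib.request.urlopen(req, timeout=5)

def relaunch_with_settle(device_ip: str, channel_id: str = "dev",
                         settle_seconds: float = 5.0) -> None:
    """Exit to the home screen, let registry I/O settle, then relaunch."""
    ecp_post(device_ip, "keypress/Home")   # exit the channel
    time.sleep(settle_seconds)             # settle window before relaunch
    ecp_post(device_ip, f"launch/{channel_id}")
```

Calling `relaunch_with_settle("192.168.1.20")` between test suites has the same effect as the manual delay described above.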

Finally, some features that integrate with Roku's store, such as on-device subscription upgrade and downgrade paths, are not testable with sideloaded channels. These paths are generally the most challenging to automate for logistical reasons, so it may be easiest to test these flows manually. Still, if you want to test them via automation, you must do so with a beta channel or public channel tied to Roku's services.

Packaging Channels for Beta and Public Channel Publication

To publish a beta or public Roku channel, you need to use a Roku device to "package" the channel. The packaging process encrypts the channel source code so that the internal logic is not exposed. The output is a file with a .pkg extension which can be uploaded to Roku's developer site for distribution. Typically, the packaging process would be handled in your team's build process, preferably in an automated continuous integration solution.

Testing with a beta channel is the most common way to vet the experience an end consumer will have before submitting your channel to the Roku Channel Store. It is also the only way to test certain functionality, such as Roku Pay integrations. However, there are some significant limitations to keep in mind when testing beta and public channels. You will not be able to use Roku's telnet debugging utilities or the screen capture function on the device when testing a beta or public channel. You also have to set up your test devices in a very specific way for automation to work in this context.

Beta/Public Channel Packaging Requirements for Automation

Though it may seem counterintuitive, to run Roku WebDriver automation in a Beta or Public channel (as opposed to sideloaded), you will need to go through Roku's packaging process on each and every one of the test devices that you intend to use to run your test automation. The Roku WebDriver documentation states the following:

"To test production channels with the Roku Web Driver APIs, package the channel on your Roku device using the same Roku developer account linked to the production version of the channel."

If you try to use Roku WebDriver with a Public or Beta channel that was not packaged on the unique device you are using to test, you will only be able to send remote commands; you will not be able to query elements or retrieve XML source from the channel. This is a security mechanism to prevent you from seeing how other developers' channels work under the hood. It also prevents you from implementing any test automation solution of value without source code and the proper permissions for a channel deployed via Roku's services.

Though you must package the channel on your unique test device to run automation in this way, you do not need to re-upload the binary to the Roku developer portal each time you run the package utility. You can, as an example, package the build using your build system, upload the binary to a beta channel via the Roku developer site, then repackage the same build on the devices you are using to run test automation.

You will need access to the Roku developer account used to publish the channel under test, the packaged channel binary, the source code, and the password used to encrypt the channel when it was originally packaged. You can then 'rekey' the channel on your device using the instructions here. In addition to generating a pkg file that you could upload to Roku, this process gives your test device the authorization it needs to use the packaging utility built into the device and to execute Roku WebDriver queries against the channel.

Note: the process of uploading beta channels to the Roku developer site cannot be fully automated currently, as Roku implements a "captcha" mechanism. There are also limitations to the number of users who can access these builds and a limit to the duration that a beta channel is operational. However, there may still be value in incorporating beta channels into your team's workflow.

Figure 1. Roku build and packaging workflow as applicable when automating Beta or Public channels.

Incorporating Test Automation into Continuous Integration Workflows

Integration of Roku Hardware

As touched on above, packaging Roku builds requires direct access to Roku hardware, and this is perhaps the biggest obstacle to continuous integration (CI) workflows for Roku. You either need a test device on the same network the CI instance runs on, or you need the device's IP and applicable ports reachable by the CI solution via a static address. Many Roku teams use Jenkins or other CI solutions to generate Roku builds, and you can integrate your test automation scripts similarly if your CI solution has network access to Roku hardware. However, there are some logistical considerations to factor in if this is part of your test automation plan.
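A cheap preflight check in a CI job can confirm that network access before any build or test step runs; the sketch below (the helper name is mine) simply attempts a TCP connection to the device's ECP port:

```python
import socket

def roku_reachable(device_ip: str, port: int = 8060, timeout: float = 3.0) -> bool:
    """Preflight check: can this host open a TCP connection to the Roku?

    Port 8060 is Roku's ECP port; port 8085 (the debug console for
    sideloaded builds) could be probed the same way.
    """
    try:
        with socket.create_connection((device_ip, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unroutable
        return False
```

Failing the CI job early with a clear "device unreachable" message is far easier to triage than a wall of downstream test failures.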

Remote Datacenter Considerations

If you are creating builds or targeting tests against devices in a remote data center, you should plan for a way to reboot your devices remotely. Roku does not provide a remote command to initiate a device restart. Technically, remote commands can be used to navigate the system menus, but there is no way to assert state against the system menu contents, and different device models or Roku OS versions may present different menu options, so automating the system menus reliably poses challenges. You may occasionally get your device into a state where it will not respond to remote commands at all. The best recovery method is a power switch that can be controlled remotely over your network.

Even then, you may encounter situations that require physical access to the device or a way to view the screen output remotely. If Roku detects an OS update that has not been installed, it will block sideloaded build installation until the OS has been updated. I'm not aware of any way to manage this without being able to see the device screen (given enough time, the device should install the update and reboot itself, but that will be of little consolation if you are trying to get a build out to a client or run an automated test pass on a deadline). Roku devices, like most 10-foot experience devices, are designed strictly for consumers, so they lack the IT management features common on devices that integrate into corporate IT workflows.

Though Roku provides a screen capture feature that retrieves an image of an active sideloaded development channel, note that it is designed to capture screens only from your sideloaded channel builds. It does not allow you to capture the screen while the system menus are displayed, so keep this in mind as you plan your solution.
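Since the power switch's API will vary by vendor, here is a sketch of the recovery loop with the vendor-specific pieces injected as callbacks (all names are mine; `power_cycle` would wrap whatever your PDU or smart plug exposes):

```python
import time
from typing import Callable

def recover_device(is_responsive: Callable[[], bool],
                   power_cycle: Callable[[], None],
                   boot_wait: float = 60.0,
                   max_attempts: int = 3) -> bool:
    """Power-cycle an unresponsive Roku until it answers again.

    `is_responsive` might probe the ECP port; `power_cycle` is whatever
    your network-controlled power switch exposes (a PDU HTTP call, a
    smart-plug API, etc.). Returns True once the device responds.
    """
    for _ in range(max_attempts):
        if is_responsive():
            return True
        power_cycle()
        time.sleep(boot_wait)  # give the device time to boot
    return is_responsive()
```

Wiring this into the start of a scheduled test job means a hung device costs you one boot cycle instead of a lost overnight run.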

Device Allocation

For CI solutions, I recommend reserving at least one device exclusively for build operations and running test automation on separate, dedicated devices. Automated tests are more likely to crash or lock up devices, and you won't want test automation activities to block your build jobs. Similarly, if you are running a test automation job, you will want to avoid a scenario where team members trigger a build operation that relaunches a channel and interrupts a test automation pass.

Video Playback and HDCP

Video playback tests will need to run on a device connected to a live display, or you will otherwise need to manage HDCP (High-bandwidth Digital Content Protection). Normally, your video playback requests will fail if the HDCP chain is broken, which happens when your device is connected to an HDMI switch that isn't actively pushing the Roku's signal to a display. Unfortunately, this precludes running a bank of headless Rokus for parallel automation jobs.

Gating Build Process Steps

It probably goes without saying that if you want to gate builds with automated UI tests, you should be selective about which tests you incorporate into these workflows. You wouldn't want failures from volatile or incidental tests to block builds or to delay the build process unnecessarily. You will want to make sure any failures impacting the build process are reported clearly and concisely so that development team members can analyze them quickly.

I would avoid using Roku test automation to gate builds and would limit CI integrations to scheduled or on-demand jobs. These kinds of UI tests can be time-consuming, so it is useful to have a job that runs nightly and reports the results each morning, or one that kicks off after a new build and messages your team with the results once the job finishes. However, you should also have a configuration that allows your tests to be run locally. Simply watching the tests run is often the best way to understand failures, and you may also catch UI issues that your assertions will miss.

Use of Screen Capture

Limited Screen Capture Capabilities on the Platform

It is commonplace to incorporate screen capture into test automation solutions, and most modern development platforms provide an easy way to get screen captures from test devices. Typical applications are storing images at regular intervals to reference for debugging when failures occur, or leveraging visual testing tools that analyze the image content and make assertions against it as tests execute. Regretfully, the Roku WebDriver solution does not have an API method for collecting screenshots.

A Screen Capture Workaround

Though Roku WebDriver does not include a screen capture method (nor does Roku's External Control Protocol API support it), it is possible to retrieve screenshots from a Roku device remotely without external video capture hardware if you are testing against a sideloaded channel.

You can connect your Web browser to the IP address of your Roku and manually collect a screen capture from your Roku channel using the same built-in Web server application for developers that allows you to sideload builds. With widely available Web debugging tools (including those built into most Web browsers), you can observe the methods that Roku's built-in Web service uses to query the device and collect screen capture binaries. With a bit of reverse engineering, you can add a module to your test framework which will replicate these interactions so that your test framework can collect screenshots from your device.
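As an illustration only, the sketch below replicates the interactions I observed from the developer web server with browser debugging tools: HTTP digest auth as user "rokudev" on port 80, a form POST to /plugin_inspect requesting a screenshot, then fetching the captured image. These endpoints are undocumented and may differ across Roku OS versions, and the real /plugin_inspect handler may expect a multipart/form-data body, so treat this as a starting point rather than a finished module:

```python
import urllib.parse
import urllib.request

def make_dev_opener(device_ip: str, password: str) -> urllib.request.OpenerDirector:
    """Build an opener for the developer web server (digest auth, user 'rokudev')."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, f"http://{device_ip}/", "rokudev", password)
    return urllib.request.build_opener(urllib.request.HTTPDigestAuthHandler(mgr))

def capture_screenshot(device_ip: str, password: str) -> bytes:
    """Trigger a capture of the active sideloaded channel, then download it.

    Endpoint paths were observed with web debugging tools and are
    undocumented; adjust them for your device's Roku OS version.
    """
    opener = make_dev_opener(device_ip, password)
    # Ask the dev server to render a screenshot of the sideloaded channel.
    form = urllib.parse.urlencode({"mysubmit": "Screenshot", "archive": ""}).encode()
    opener.open(f"http://{device_ip}/plugin_inspect", data=form)
    # The captured frame is then served as a static image.
    with opener.open(f"http://{device_ip}/pkgs/dev.jpg") as resp:
        return resp.read()
```

The password is the developer password you set when enabling developer mode on the device.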

Figure 2: The Web-based utility shown above is available on Roku devices configured for developer use and can be used as a model for screen capture interactions.

Screen Capture Reliability Challenges

Though screen capture is possible directly from the device using the process described above, I would caution against relying on it heavily in your Roku test automation solution. My team implemented a screen capture module that we used to collect screenshots during test runs for debugging, but we found that the requests to retrieve screenshots would sometimes fail inexplicably. I have also seen this when using Roku's developer tooling to retrieve screenshots manually. Rebooting the test device will typically resolve the issue. Still, this sort of issue may be disruptive if you design your tests to depend on images retrieved via screen capture.
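One mitigation is to wrap screenshot collection in a small retry helper so a single flaky request doesn't fail the run; a generic sketch (names are mine):

```python
import time
from typing import Callable, Optional, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 3, delay: float = 2.0) -> T:
    """Retry a flaky operation (e.g., a screenshot fetch) a few times.

    If every attempt fails, re-raise the last error so the run can flag
    the device as needing a reboot instead of silently continuing.
    """
    last_exc: Optional[Exception] = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as exc:  # screenshot failures surface as HTTP/OS errors
            last_exc = exc
            if i < attempts - 1:
                time.sleep(delay)
    raise last_exc
```

Used as `with_retries(lambda: capture_screenshot(ip, password))`, this turns an occasional hiccup into a logged retry rather than a failed test.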

If leveraging visual testing services is a goal of your Roku test automation solution, I would suggest starting with a proof of concept implementation and perhaps considering using external image capture hardware. Still, that approach comes with its own set of challenges to overcome (such as finding a way to work around HDCP content protection if you are capturing images from a device's HDMI output).

Also, please note that the screen capture function built into the device will not include images rendered by the Roku video player. If you attempt to capture the screen while the video player is active, you will see overlay elements rendered by your application (buttons, text, etc.) in the output, but not the actual video content rendered by the player.

Device Selection For Roku Test Automation

Device Compatibility Testing on Roku

Device compatibility is a key testing consideration on the Roku platform. Roku requires applications submitted to the Roku Channel store to maintain compatibility with current and updatable devices as detailed in their Hardware Specifications list. One of the biggest and most expensive mistakes a team developing a new Roku channel can make is to fail to test the experience on older, slower devices, thereby allowing performance concerns to go undetected.

Fast Devices for Fast Feedback

When starting a new Roku test automation initiative, I suggest starting with top-of-the-line Roku hardware. This is currently the Roku Ultra series, with snappy processors and Ethernet connectivity. One of your goals should be fast feedback, so you will want your functional tests to run as briskly as your channel can keep up with, then slow things down if you start to see failures that aren't representative of real-world use. The Roku Streaming Stick+ devices, recent-model Roku Express devices, and newer smart TVs that integrate the Roku operating system should also work fine for test automation, as should the higher-end devices from the last several years (Premiere, Premiere+, Ultra). Using a variety of these devices is an excellent way to expand the device coverage of your automated tests.

Resource Limitations Impacting Automation

I would be less inclined to target older, slower devices for general functional test automation (including any of the Express devices or Streaming Stick models that are 3+ years old, and the older numbered models: Roku 1, Roku 2, and Roku 3). Roku does not deprecate devices often, and part of its market strategy has been to sell devices at low prices to reduce the barrier to entry for consumers. Some currently supported devices went to market 8+ years ago and struggle to keep up even when navigating the operating system's menus, and some are still supported on the platform but no longer widely used by consumers. Roku WebDriver queries, along with development tooling (such as logging), increase the demands on the hardware beyond a typical consumer workflow, so you may find failure scenarios that don't occur when a real user drives the device with a remote control. On these older, slower devices, you may also encounter GUI rendering issues that are only evident when watching the screen, even when Roku WebDriver reports that the elements are present in the XML.

Ultimately, which devices to run test automation against is a business decision for your team. Still, due to the issues above, I feel that low-end device compatibility testing is an area best suited for manual test execution. That said, I do see value in a few targeted automated tests on lower-end devices. Roku channel certification requirements specify performance metrics for two particular devices, the first-generation Roku Express ("Littlefield" 37xxX) and the Roku Streaming Stick+ ("Amarillo-2019" 3810X), so you could create basic tests that launch your channel on these devices and analyze the log output to measure launch time using Roku's signal beacons, if nothing else. "Littlefield" devices generally perform poorly, so I would be reluctant to use them broadly in automated functional tests. However, as long as you have the devices and the screens to render their output, you should be able to run these tests in parallel with other devices.
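For example, assuming the beacon lines appear in the port-8085 debug log in roughly the documented format (the sample lines below are illustrative, not captured from a device, so adjust the regex for what your devices actually emit), a small parser can pull launch times out of the log:

```python
import re
from typing import Dict, Optional

# Illustrative beacon lines in roughly the documented debug-log format.
SAMPLE_LOG = """\
[beacon.signal] |AppLaunchInitiate ---------> TimeBase(0 ms)
[beacon.signal] |AppSplashComplete ---------> Duration(1010 ms)
[beacon.signal] |AppLaunchComplete ---------> Duration(2850 ms)
"""

BEACON_RE = re.compile(r"\|(\w+) -+> (?:TimeBase|Duration)\((\d+) ms\)")

def parse_beacons(log_text: str) -> Dict[str, int]:
    """Map each signal beacon name to its reported millisecond value."""
    return {m.group(1): int(m.group(2)) for m in BEACON_RE.finditer(log_text)}

def launch_time_ms(log_text: str) -> Optional[int]:
    """Launch time reported by the AppLaunchComplete beacon, if present."""
    return parse_beacons(log_text).get("AppLaunchComplete")
```

A nightly job could run this over captured logs from a "Littlefield" device and fail only when launch time regresses past a threshold, giving low-end coverage without full functional suites.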

Roku Automation Alternatives

Beyond the tips above, based on my experience working on the platform, I would be remiss not to mention a few alternative Roku automation solutions that merit consideration:

  • Rokul Runnings — this TypeScript solution was featured in a guest post on the Roku Developers blog some time back. It extends the basic Roku WebDriver solution with features you don't get out of the box from Roku. Check out its screen capture component, which appears to use a similar approach to the one I described above.
  • RTA (Roku Test Automation) — this npm solution takes a different approach altogether, as it does not leverage Roku's Go-based WebDriver server. Instead, it uses Roku's External Control Protocol in conjunction with a custom BrightScript add-on that you incorporate into your manifest for test builds. This solution also offers a screen capture feature along with others, such as proxy functionality.

Final Thoughts

To summarize, we've learned that:

  • Sideloaded builds offer the most convenience and flexibility when running automated tests, but test automation can also be executed against public (production) channels and beta channels with some limitations. This is the only way to automate features that require Roku Channel Store integrations, like subscription workflows.
  • Like other automation solutions, Roku test automation can be integrated into continuous integration workflows. However, doing so requires that the CI solution has direct access to Roku devices. Some use scenarios require you to be able to view the Roku's screens or to have a display constantly powered up and connected to the Roku device, which presents special challenges that require consideration as you define your CI solution.
  • Though the Roku WebDriver framework does not include a screen capture method, we can implement our own by replicating the requests that Roku's developer web service uses, with some limitations and considerations of note.
  • Device compatibility testing is an important step for channel distribution. Still, though automated tests can be run against most Roku devices, the additional resources required by test automation execution and debugging tooling make higher-end and mid-range devices more suitable for the bulk of our functional Roku test automation duties.
  • There are some great open-source alternative Roku automation solutions with features beyond the base solution which we can leverage in our projects or analyze and learn from.
