Nutanix Xi IoT Application Containers for Developers


by Heiko Koehler, Sr. Staff Engineer, Xi IoT R&D

Edge computing with the Nutanix Xi IoT platform frees developers from the complexity of deploying edge devices and data streams, letting them focus instead on creating useful data analysis and transformation logic through easy-to-use Application Programming Interfaces (APIs), reusable data pipelines, and a pluggable machine learning architecture.

The article Nutanix Data Pipelines and AI for Developers explains how the Xi IoT platform uses the concept of a data pipeline to route edge data sources to pluggable transformation functions, and in turn, route transformed data to cloud storage or additional pipelines.

In this article I’ll discuss another option supported by Xi IoT: containerized applications that enable you to configure and deploy custom services and runtime environments for your edge computing solution.

The Xi IoT Application Model

Nutanix Xi IoT is a cloud-based end-to-end platform that allows you to easily create solutions for gathering data from remote sensors and edge appliances and performing real-time processing on that data. An IoT solution starts with edge hardware, connected to sensors, generating data. The data streams (routed through protocol gateways as needed) are configured as data sources, which can be connected as input to applications or data pipelines for analysis and transformation. Transformed data can be routed to storage or as input data sources to other applications or data pipelines.

Xi IoT provides some standard, built-in data source protocols you can simply select and configure to handle common incoming sensor data formats. The data pipeline interface also lets you code transformation functions in Go, JavaScript, or Python using standardized runtime environments. This makes it easy to create powerful, pluggable libraries of processing, transformation, and AI functions that you can connect to data sources and get running almost immediately.
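To give a sense of what one of these pluggable functions looks like, here is a generic Python sketch. The entry-point name and message format are placeholders for illustration, not the exact contract the Xi IoT runtime expects; see the data pipelines article for the real details.

```python
# Generic sketch of a pipeline transformation function.
# The function name and payload format are illustrative placeholders,
# not the exact Xi IoT runtime contract.
import json

def transform(payload: bytes) -> bytes:
    """Convert a raw temperature reading from Fahrenheit to Celsius."""
    reading = json.loads(payload)
    reading["temp_c"] = round((reading["temp_f"] - 32) * 5 / 9, 2)
    return json.dumps(reading).encode()
```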

However, if you need runtime environments, applications, or services beyond what’s provided out of the box, Xi IoT provides the option to build and deploy your own custom containerized applications.

Application Container Basics

The Xi IoT edge environment supports running custom applications and services that provide data processing and AI inference. On Xi IoT, applications are simply Docker containers and associated services running in a hosted Kubernetes environment. Containers deployed as part of a Xi IoT solution are primarily used for applications that require dependencies or environments not available to data pipeline functions.

If you’re not already familiar with the concept of application containers, I’ll give you a quick primer.

Docker is a popular open-source software platform for creating lightweight, containerized applications that can be deployed and run independently in a shared execution environment. Docker containers are software packages that include all of the software dependencies and settings required to run an application.

Kubernetes is an open-source platform for managing — or “orchestrating” — containers and their related networking and management services.

You use tools such as Docker Compose and Kompose to create, convert and manage the YAML configuration files used to define and run multi-container applications in Docker and Kubernetes environments.
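As a quick illustration of that workflow, a hypothetical single-service Compose file might look like the one below; running kompose convert against it generates the equivalent Kubernetes Deployment and Service manifests. The service name, image, and ports are placeholders.

```yaml
# docker-compose.yml: a minimal, hypothetical single-service application.
# 'kompose convert' translates a file like this into Kubernetes manifests.
version: "3"
services:
  web:
    image: nginx:1.25      # any public image works here
    ports:
      - "8080:80"          # a published port prompts Kompose to create a Service
```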

If you’re already familiar with using Docker, Kompose and Kubernetes, you’ll find that deploying to the edge with Xi IoT is just as easy as deploying to the datacenter.

Using Containers with Xi IoT

To keep things simple, Xi IoT supports deployment of containers from well-known public cloud-based registries like Docker Hub, Amazon Elastic Container Registry (ECR), and Google Container Registry.

If you’d like to use a publicly available container, just specify the image as part of your application’s YAML configuration. If the container is stored in a private registry, you can securely store the credentials as part of a Container Registry Profile, assign them to any required Xi IoT Projects, and instantly use them across your Project’s fleet of edges.
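For instance, a minimal application manifest that pulls a public image might look like the sketch below. The names, labels, and image are illustrative placeholders rather than anything Xi IoT requires.

```yaml
# Minimal sketch of an application manifest using a public Docker Hub image.
# All names, labels, and the image itself are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-dashboard
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sensor-dashboard
  template:
    metadata:
      labels:
        app: sensor-dashboard
    spec:
      containers:
        - name: dashboard
          image: nginx:1.25   # public image; a private registry would use a Container Registry Profile for credentials
          ports:
            - containerPort: 80
```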

With the exception of privileged mode (of course), nearly all of the Kubernetes specification is supported on a Xi IoT-based edge, including PersistentVolumeClaims, and even NVIDIA GPUs for accelerated inference at the edge.
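As a rough illustration using standard Kubernetes constructs, the snippet below claims a persistent volume and requests a single NVIDIA GPU for an inference container. The storage size, names, and image are placeholders, and the GPU request assumes the usual nvidia.com/gpu resource exposed by the NVIDIA device plugin.

```yaml
# Illustrative only: a PersistentVolumeClaim plus a Deployment that mounts it
# and requests one NVIDIA GPU. Names, sizes, and the image are placeholders.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: model-cache
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: inference-app
  template:
    metadata:
      labels:
        app: inference-app
    spec:
      containers:
        - name: inference
          image: registry.example.com/inference:latest   # placeholder image
          resources:
            limits:
              nvidia.com/gpu: 1        # one GPU via the NVIDIA device plugin
          volumeMounts:
            - name: models
              mountPath: /models
      volumes:
        - name: models
          persistentVolumeClaim:
            claimName: model-cache
```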

Managing Applications in Xi IoT

To create and manage containerized applications in Xi IoT, you can use the web interface or the REST APIs.

In the web interface, select Apps And Data > Applications from the navigation menu. You’ll see a list of available applications.

Click an existing application name to view details of that application, edit the application properties, or launch an application.

To set up a new application, go to the Applications page and click the Create button. You’ll be asked to provide some basic descriptive information about the application and its project. The project determines which edges are available for configuration, and you can then select an edge.

Click New to continue. The next screen is where you type, paste or upload your Kubernetes-formatted YAML.

Click Create and your application is ready.

You can also create and manage applications programmatically through the /applications endpoint of the Xi IoT REST API. The API allows you to create, update and delete applications, list available applications or get the details for a specific application.

API management of applications requires only that you know the IDs of the edges available to the application, which are available through the /edges endpoint of the API. To create or update applications, you can pass orchestration YAML as part of the request.
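A hedged Python sketch of that flow follows. The endpoint paths come from this article, but the base URL, authentication header, and payload field names are assumptions for illustration only; check the API reference for the exact schema.

```python
# Hedged sketch: create an application via the REST API.
# Endpoint paths are from the article; the base URL, auth handling, and
# payload field names are illustrative assumptions.
import requests

BASE = "https://iot.nutanix.com/v1.0"                 # assumed base URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}     # token for your API user

# 1. Look up the IDs of the edges the application should run on.
edges = requests.get(f"{BASE}/edges", headers=HEADERS).json()
edge_ids = [edge["id"] for edge in edges]             # field name assumed

# 2. Read the Kubernetes YAML that defines the application.
with open("sensor-dashboard.yaml") as f:
    app_yaml = f.read()

# 3. Create the application, passing the orchestration YAML in the request.
payload = {
    "name": "sensor-dashboard",
    "edgeIds": edge_ids,       # field name assumed; see the API docs
    "appManifest": app_yaml,   # field name assumed; see the API docs
}
resp = requests.post(f"{BASE}/applications", headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json())
```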

The article Getting Started with Xi IoT API explains how to create an API user and authorize access to the APIs. See the Create an application section of the API documentation for details on using the /applications endpoint.

Application Alerts and Logs

Alert summaries are displayed on the dashboard for a given application. These cover conditions like edge connectivity errors, unavailable nodes, or even the occasional application error you didn’t catch during development or testing. To view these, click Apps and Data > Applications, then click the application you want to view.

To view alert details, click the View All link or the Alerts tab for the application.

Application log data can be generated at any time from the Log Bundles page. Among other logs for platform support, the bundle includes any application logs generated by your code, as well as the typical Docker and Kubernetes log output.

To generate log data, go to Apps and Data > Applications, click the application you want to view, and then click the Log Bundles tab. Click the Run Log Collector button, select the edge instances running the app, and click Collect Logs. Xi IoT then collects the log data, creates a log bundle, and automatically uploads it to the cloud when connectivity is available. If you’ve ever debugged applications deployed to the edge, where connectivity and console access can be sporadic, you’ll understand what a lifesaver this feature really is!

Finally, download a selected log bundle to your local machine for analysis. Simply select a log bundle from the list and click the corresponding download link. To view the logs, untar the file and open the contents in a text editor or log analyzer.

You can also access application status and logs through the Xi IoT APIs. You can query application status for all or individual applications through the /applicationstatuses endpoint. See Get applications status in the API documentation for details.
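In the same hedged spirit, polling status could look like the short sketch below; the base URL and the shape of the response are assumptions for illustration.

```python
# Hedged sketch: list status for all applications via the REST API.
# The base URL and response structure are illustrative assumptions.
import requests

BASE = "https://iot.nutanix.com/v1.0"                 # assumed base URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}

statuses = requests.get(f"{BASE}/applicationstatuses", headers=HEADERS).json()
for status in statuses:
    print(status)   # per-application status as reported by the platform
```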

Various endpoints of the Log APIs enable you to list, upload, download or delete logs.

About the Author

Heiko Koehler, Sr. Staff Engineer, Xi IoT R&D

Heiko has worked on infrastructure-related products and services for the past 14 years. During his career he has helped bootstrap new product initiatives like Xi IoT, VM migration across different hypervisors, a hypervisor-based storage cache, as well as a hierarchical storage management solution. He leads the Xi IoT data plane efforts, focusing on containerized applications support and serverless data pipelines at the edge.

Originally published at https://developer.nutanix.com on May 2, 2019.
