Get started with Jetson AGX Orin. Quickly and easily, in a matter of minutes!

Usually, when new hardware launches, we developers tend to suffer for hours (or even days) until things start working correctly. Even a “hello world” can become a pain when you are dealing with pre-alpha versions and a bunch of libraries that are not always pre-compiled for your platform.

SegNet example using ResNet18-Cityscapes, at 1024x512 resolution and running at 200 FPS. (Scroll down to see how easy it is to run something like this on Jetson hardware.)

That’s why I wrote this quick-start tutorial for the Jetson AGX Orin, with the intention of saving some time for any developer, from enthusiasts and the merely curious to the most advanced researchers in the field.

Step 0 — Flashing the Jetson AGX Orin

  1. Use Ubuntu 20.04 (download it here, install it, or run it from a USB drive)
  2. Install the NVIDIA SDK Manager from here, and launch it
  3. Connect a USB-C cable to the second (data) USB-C port of the AGX Orin, unplug the USB-C power cord, hold down the central (recovery) button, and plug the power cord back in. A new device should be detected in the SDK Manager
  4. Flash it. When the flashing is done, it will ask you to connect to the new Orin to install the extra software (CUDA, etc.). I did this by logging in to my Orin with an external monitor, keyboard, and mouse, getting its IP address, and entering it in the IP address field. It installed everything straight away.
Installation and flashing process with Ubuntu 20.04 + NVIDIA SDK Manager
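
As an optional sanity check (my own habit, not an official SDK Manager step), you can confirm the Orin really is in recovery mode before flashing: in that state it enumerates as an NVIDIA USB device on the host PC. The orin_in_recovery helper name below is my own, purely for illustration:

```shell
# Hypothetical helper: succeeds if a captured `lsusb` listing
# contains an NVIDIA device (how the Orin shows up in recovery mode).
orin_in_recovery() {
  echo "$1" | grep -qi "NVIDIA Corp"
}

# On the host PC, call it against the live lsusb output:
if orin_in_recovery "$(lsusb 2>/dev/null)"; then
  echo "Orin detected in recovery mode"
else
  echo "No NVIDIA USB device found - re-check the recovery button sequence"
fi
```

If the device never shows up, repeat the button-and-power sequence from step 3 before retrying the flash.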

Step 1 — Let’s prepare everything for the demos!

First things first, I always need to have everything up to date, so:

sudo apt update
sudo apt upgrade

I created a repository that includes all the scripts to launch the demos in one click. These are the few steps you should follow to have everything working in minutes:

git clone https://github.com/dusty-nv/jetson-inference
git clone https://github.com/asierarranz/AGX_Orin_Demos
cd AGX_Orin_Demos

Let’s install the software and dependencies in one click:

cd 1_Install
./Step1_installation.sh

When the menu to install additional models appears, I recommend adding the PedNet model, because we are going to use it in our demos. You can always bring this menu back later by running:

./Step1b_Download_additional_models.sh

The installation will ask you whether you want to install PyTorch. You don’t need it unless you plan to do training on the AGX Orin, so you can leave that option unchecked.
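
Once the installer finishes, a quick way to confirm the CUDA toolkit landed is to ask nvcc for its version. This check is just my own habit, not part of the repo scripts; /usr/local/cuda is the usual JetPack install location:

```shell
# Look for nvcc on the PATH or in the default JetPack location,
# and print its version banner if found.
check_cuda() {
  if command -v nvcc >/dev/null 2>&1; then
    nvcc --version
  elif [ -x /usr/local/cuda/bin/nvcc ]; then
    /usr/local/cuda/bin/nvcc --version
  else
    echo "nvcc not found - CUDA may not be installed yet"
  fi
}

check_cuda
```

If nvcc is missing, re-run the SDK Manager step that installs the extra software (CUDA, etc.) on the Orin.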

Step 2 — Let’s run some quick demos

Let’s run our first demo. As I promised, it is one click! :-)
And it runs at 100–160 FPS:

./Step2_Pedestrians_DetectNet.sh

Now let’s run the same demo, but with a model that detects pedestrians exclusively:

./Step2b_Pedestrians_DetectNet_PedNet_model.sh

And finally, let’s try a different model, in this case a segmentation one called ResNet18-Cityscapes (1024x512 resolution, running at 200 FPS):

./Step3_DashCam_SegNet.sh

Bonus track! — Keep the demos running continuously

Let’s imagine you want your AGX Orin to run inference on some pre-recorded videos during an exhibition at a booth.

This will keep the default example running in a loop until you close the app.

To run it:

cd 3_Demos_in_loop
./Pedestrians_demo.sh
./Dashcam_demo.sh

And this simple script adds an extra feature: you can edit it to add multiple videos of your own, each processed in a different way with a different model, all running in an infinite loop. Very useful for displaying your AGX Orin demos!

./Demo_Playlist_editable.sh
Running all the demos you want in a loop.

It is basically a shell script where you can add your own demos to the loop, just by placing them between the do and done keywords:

cd ~/jetson-inference/python/examples
while :
do
  # Add your own demo commands here; each one runs to completion
  # before the next starts, and the whole list repeats forever.
  python3 detectnet.py --network=pednet /opt/nvidia/vpi2/samples/assets/pedestrians.mp4
  python3 segnet.py --network=fcn-ResNet18-Cityscapes-1024x512 /opt/nvidia/vpi2/samples/assets/dashcam.mp4
done
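
If the playlist grows, I find it cleaner to keep the commands in an array and loop over that, instead of pasting each line into the loop body. Here is a minimal sketch of that idea; the demos array and the run_playlist function name are my own additions for illustration. It only echoes each command (a dry run) so it is safe to try anywhere; on the Jetson you would execute the command instead, and wrap the call in the same while-loop as above for an endless loop:

```shell
# A playlist of demo commands, one per array entry (bash syntax).
demos=(
  "python3 detectnet.py --network=pednet /opt/nvidia/vpi2/samples/assets/pedestrians.mp4"
  "python3 segnet.py --network=fcn-ResNet18-Cityscapes-1024x512 /opt/nvidia/vpi2/samples/assets/dashcam.mp4"
)

# Print (dry run) every demo in the playlist, in order.
run_playlist() {
  for cmd in "${demos[@]}"; do
    echo ">> $cmd"    # on the Jetson, replace this echo with: $cmd
  done
}

run_playlist
```

Adding a new demo is then a one-line change: append another quoted command to the demos array.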

I hope these examples are helpful. I have created this article to cover the basic demo examples. Soon, I will continue explaining how to run more advanced demos (in an easy way) using SDKs like DeepStream and RIVA.

Become part of the NVIDIA developer community

If you want to learn more about Jetson, DeepStream, and the most advanced topics in deep learning and robotics, join the NVIDIA developer community. It is free, and you will be able to earn official NVIDIA certifications, access the best tutorials, and interact with thousands of developers and experts worldwide.

Plus, you can share your AI projects and receive feedback from the community!

Also, before March 20th, you can register here for GTC and join the Jetson Edge AI Developer Day sessions during the event.


Robotics & Embedded AI at @NVIDIA | GenerativeAI Dev | Formerly at IBM Quantum | I love AI, XR, robotics, and specially humans! | Bio: asierarranz.com

Asier Arranz
