Uploading a Locally Trained DeepRacer Model to AWS

ARCC
4 min read · Jul 22, 2019


NOTE: This is a manual and non-automated way to upload your model. Better guides exist here and the upload process can be further streamlined with the DeepRacer local console.

In a previous post about how to run DeepRacer locally, I mentioned that I would write a second post on uploading that model to AWS manually. I decided to write this for two reasons: first, I could not get the upload script to work out of the box, and second, doing it by hand provides insight into how the DeepRacer S3 system is set up. With that stated, let's dive in.

This tutorial assumes that you have run DeepRacer locally and have files that you would like to upload to the official console. Note that I am not responsible for anything that goes wrong; you assume some risk when deleting from and uploading to your DeepRacer S3 bucket.

Step 1: Start a 5-minute DeepRacer training session to allocate the proper resources and create the S3 bucket that we will use to upload our local model. Make a note of the time at which you started the simulation, as it will come in handy later.

Step 2: Go to AWS S3 and open your most recently created deepracer bucket.

Step 3: Navigate into that bucket and find the most recent DeepRacer-SageMaker-RoboMaker-comm-* folder: click the Last modified column header until the arrow points up, then select the bottom entry in the group of folders whose names start with DeepRacer-SageMaker-RoboMaker-comm. If you have any doubts, open the folder, find a file with a timestamp, and verify it against when you started the 5-minute simulation.
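If you prefer to do the sorting in steps 2 and 3 programmatically, here is a small sketch. The selection logic is a pure function so you can sanity-check it; the boto3 call and the bucket name in the commented-out usage are assumptions you would fill in yourself:

```python
from datetime import datetime, timezone

COMM_PREFIX = "DeepRacer-SageMaker-RoboMaker-comm"

def latest_comm_prefix(objects):
    """Given (key, last_modified) pairs from an S3 listing, return the
    top-level DeepRacer-SageMaker-RoboMaker-comm-* folder whose contents
    were modified most recently, or None if there is no such folder."""
    newest = {}
    for key, last_modified in objects:
        top = key.split("/", 1)[0]
        if top.startswith(COMM_PREFIX):
            if top not in newest or last_modified > newest[top]:
                newest[top] = last_modified
    if not newest:
        return None
    return max(newest, key=newest.get)

# With boto3 (bucket name below is a hypothetical placeholder):
# import boto3
# s3 = boto3.client("s3")
# resp = s3.list_objects_v2(Bucket="aws-deepracer-<your-bucket>")
# objs = [(o["Key"], o["LastModified"]) for o in resp.get("Contents", [])]
# print(latest_comm_prefix(objs))
```

This is equivalent to sorting by Last modified in the console and taking the newest comm folder, just less error-prone if you have many of them.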

Step 4: Enter the folder and you should see two folders, ip and model; these are the folders that we will replace with the ones from our locally trained model. On your local machine, they should be located in deepracer-for-dummies/docker/volumes/minio/bucket/rl-deepracer-sagemaker. Go ahead and replace the folders in S3 with the folders from your local machine.
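The copy in step 4 can also be scripted. Below is a sketch that mirrors a local folder's layout into S3 keys; the path-to-key mapping is a plain function, while the actual upload (bucket name and prefix are placeholders you would take from steps 2 and 3) is shown as hedged boto3 comments:

```python
import os

def s3_keys_for_folder(local_root, folder, s3_prefix):
    """Map every file under <local_root>/<folder> to the S3 key it should
    get under <s3_prefix>/<folder>, mirroring the local layout."""
    pairs = []
    base = os.path.join(local_root, folder)
    for dirpath, _dirs, files in os.walk(base):
        for name in files:
            local_path = os.path.join(dirpath, name)
            rel = os.path.relpath(local_path, local_root).replace(os.sep, "/")
            pairs.append((local_path, f"{s3_prefix}/{rel}"))
    return pairs

# Upload with boto3 (bucket and prefix are hypothetical placeholders):
# import boto3
# s3 = boto3.client("s3")
# local_root = "deepracer-for-dummies/docker/volumes/minio/bucket/rl-deepracer-sagemaker"
# for folder in ("ip", "model"):
#     for local_path, key in s3_keys_for_folder(
#             local_root, folder, "DeepRacer-SageMaker-RoboMaker-comm-<id>"):
#         s3.upload_file(local_path, "aws-deepracer-<your-bucket>", key)
```

Dragging the folders into the S3 console achieves the same thing; the script just makes the operation repeatable.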

model and ip folders on the local machine

Step 5: Go back to the DeepRacer S3 bucket's root directory and go into the corresponding folder titled DeepRacer-SageMaker-rlmdl-*. If you have the folders sorted the same way as above, it should be the bottom one. Again, if you have any doubts, navigate to a file with a timestamp in the folder and verify it against when you started the 5-minute simulation.

Step 6: Download the tar.gz from S3 and replace the model.pb inside it with a model from your local machine (this will likely be the model_x.pb with the largest x in your deepracer-for-dummies/docker/volumes/minio/bucket/rl-deepracer-sagemaker/model directory). Also replace the model_metadata.json in the tar.gz with the file found on your local machine in the deepracer-for-dummies/docker/volumes/minio/bucket/custom_files directory.
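Rather than repacking the archive by hand, step 6 can be done with Python's tarfile module. This sketch assumes the files to be replaced sit at the top level of the archive (as the step describes); the paths in the example usage are placeholders:

```python
import os
import shutil
import tarfile
import tempfile

def replace_in_targz(archive, replacements):
    """Rebuild a .tar.gz in place, swapping top-level members whose archive
    name appears in `replacements` (a dict of archive name -> local file)."""
    tmp_dir = tempfile.mkdtemp()
    try:
        # Unpack the downloaded archive, overwrite files, then repack.
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(tmp_dir)
        for name, local_path in replacements.items():
            shutil.copyfile(local_path, os.path.join(tmp_dir, name))
        with tarfile.open(archive, "w:gz") as tar:
            for entry in sorted(os.listdir(tmp_dir)):
                tar.add(os.path.join(tmp_dir, entry), arcname=entry)
    finally:
        shutil.rmtree(tmp_dir)

# Example (paths are hypothetical placeholders):
# replace_in_targz("model.tar.gz", {
#     "model.pb": "rl-deepracer-sagemaker/model/model_29.pb",
#     "model_metadata.json": "custom_files/model_metadata.json",
# })
```

After rebuilding the archive, upload it back to S3 as described in step 7.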

Step 7: Replace the tar.gz in S3 with the one you created in step 6.

Step 8: Navigate to the root directory of the DeepRacer S3 bucket one more time and go to model-metadata. Find the folder with the same name as the model you trained for 5 minutes in step 1 and replace the JSON in that folder with the JSON on your local machine in deepracer-for-dummies/docker/volumes/minio/bucket/custom_files.

Step 9: Make sure that your s3_bucket/s3_prefix/model/checkpoint file (create it if needed) has content similar to what is below. The model_checkpoint_path determines which iteration of the model is used.

model_checkpoint_path: "29_Step-118027.ckpt"
all_model_checkpoint_paths: "25_Step-100629.ckpt"
all_model_checkpoint_paths: "26_Step-104883.ckpt"
all_model_checkpoint_paths: "27_Step-109206.ckpt"
all_model_checkpoint_paths: "28_Step-113940.ckpt"
all_model_checkpoint_paths: "29_Step-118027.ckpt"
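If you would rather generate this file than type it by hand, a small sketch follows. It only assumes the iteration_Step-steps.ckpt naming pattern visible above and points model_checkpoint_path at the highest iteration:

```python
import re

def checkpoint_file_content(ckpt_names):
    """Build the contents of the `checkpoint` file from a list of
    <iteration>_Step-<steps>.ckpt names, pointing model_checkpoint_path
    at the checkpoint with the highest iteration number."""
    def iteration(name):
        m = re.match(r"(\d+)_Step-\d+\.ckpt$", name)
        if not m:
            raise ValueError(f"unexpected checkpoint name: {name}")
        return int(m.group(1))

    ordered = sorted(ckpt_names, key=iteration)
    lines = [f'model_checkpoint_path: "{ordered[-1]}"']
    lines += [f'all_model_checkpoint_paths: "{n}"' for n in ordered]
    return "\n".join(lines) + "\n"

# print(checkpoint_file_content(["29_Step-118027.ckpt", "25_Step-100629.ckpt"]))
```

Write the returned string to s3_bucket/s3_prefix/model/checkpoint to select which iteration the console loads.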

That's all! Now you can experiment with your model in the official DeepRacer simulator and submit it to the league! I am sure it would not take long to get a script automating this process working (maybe I just did not look or try hard enough), but I found it educational to go through it by hand, since it gave me some insight into how the different parts of DeepRacer work together on AWS.


ARCC

Organization founded to inspire and teach people about AI and Machine Learning (ML) through the application of autonomous race cars. See arcc.ai