Misty II Project Directory, Part 2: Sample Code

Learn about the sample code for Misty’s REST API, JavaScript SDK, and .NET SDK (Beta) that’s maintained by the Misty Robotics organization.

Johnathan Ortiz-Sonnen
MistyRobotics
11 min read · Feb 18, 2020

Welcome back!

In Part 1 of this series, we published a list of community-shared skills and robot applications for you to explore. Part 2 continues by gathering links to skills and sample code maintained by the Misty Robotics organization. Whether you’re learning to use Misty’s REST API, JavaScript SDK, or .NET SDK (Beta), these examples are a great place to get started.

We host these samples on GitHub in the JavaScript-SDK, .NET-SDK, and REST-API repositories, as well as in individual skill repositories in the MistySampleSkills organization. When you find something to try, you can download the code by navigating to the top level of the GitHub repository and clicking the Clone or download button.

Misty’s ready. Are you?

A quick note on the categories below. Most of the REST API examples run in your web browser (we use HTML script tags to embed the client code for subscribing to Misty’s WebSocket connections and issuing REST requests when the page loads). Examples that use Misty’s JavaScript SDK require you to load the skill files onto Misty via the Skill Runner web page, and if you’re using the .NET SDK, you may prefer to deploy your code to Misty directly from Visual Studio. You can read more about the different ways to code Misty in the developer documentation.
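
To make that browser pattern concrete, here's a minimal sketch, with <robot-ip> standing in for your robot's IP address. (This is an illustration of the approach the samples take, not a sample itself; see the developer documentation for the full list of commands and event types.)

```javascript
// Minimal sketch of the browser pattern: embed code like this in a script
// tag so it runs when the page loads. Replace <robot-ip> with your Misty's
// IP address on your local network.
const ip = "<robot-ip>";

// Subscribe to an event stream over Misty's WebSocket server.
const socket = new WebSocket("ws://" + ip + "/pubsub");
socket.onopen = function () {
    socket.send(JSON.stringify({
        Operation: "subscribe",
        Type: "BatteryCharge",   // the event type to stream
        DebounceMs: 1000,        // minimum interval between messages
        EventName: "battery"     // a name for this subscription
    }));
};
socket.onmessage = function (message) {
    console.log("Event received:", JSON.parse(message.data));
};

// Issue a REST request to change Misty's chest LED.
fetch("http://" + ip + "/api/led", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ red: 0, green: 255, blue: 255 })
});
```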

We do our best to keep these samples up-to-date. Please create an issue on GitHub if you run into trouble!

JavaScript SDK

This section highlights skills and sample code built with Misty’s JavaScript SDK. You can run most of these skills without making any changes to the skill files, but some examples require you to complete the face training process and update the relevant labels in the skill code before the skill will work. We’ve loaded up the skill files with code comments to help developers who are new to the platform, and you can usually find more information about a code sample in the README file that’s associated with it.
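
If you’re new to the JavaScript SDK, it helps to see the basic shape of a skill before diving into the list. Here’s a hedged sketch (not one of the samples below) that plays one of Misty’s default system sounds when a bump sensor is pressed; every skill pairs a code file like this with a .json meta file, and both are uploaded via the Skill Runner web page.

```javascript
// Sketch of a minimal JavaScript SDK skill: play a sound on bump sensor press.
misty.Debug("Skill started");

// Filter the event stream so the callback only fires on contact, not release.
misty.AddPropertyTest("BumpPressed", "IsContacted", "==", "true", "boolean");

// Register for bump sensor events, debounced to one message per 200 ms;
// passing true keeps the registration alive after the first message.
misty.RegisterEvent("BumpPressed", "BumpSensor", 200, true);

// By default, event callbacks use the event name prefixed with an underscore.
function _BumpPressed(data) {
    misty.PlayAudio("s_Joy.wav", 100); // a default system audio file
    misty.ChangeLED(0, 255, 0);        // green chest LED
}
```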

audioLocalization — Shows how to use audio localization events. Misty starts recording and localizing audio, and prints the degree of arrival of the detected speech to debug listeners. (These messages show up in the JavaScript console for the Skill Runner web page).
bumpSensor — Shows how to use bump sensor events. Misty plays different default system audio files when you press her bump sensors.
Bump Sensors (Tutorial) — The code for a step-by-step tutorial that teaches you how to build a bump sensor skill from scratch. (Like the other bumpSensor example on this list, Misty plays different sounds when you press each bump sensor.) You can read the tutorial in the developer documentation.
capTouch — Shows how to use capacitive touch sensor events. Touch different parts of Misty’s head to hear the robot play different sounds.
Compliance — This skill puts capacitive touch events to good use. When it runs, Misty stops engaging her neck motors when you touch her head, so you can manually place her head in any position you like. When you release the sensors, Misty re-engages her neck motors and moves her head back to its original position.
driveCircle and driveSquare — Shows how to use Misty’s DriveArc and DriveHeading commands to drive Misty in a circle or square pattern. These commands use data from Misty’s inertial measurement unit (IMU) to move the robot in the desired direction. Modify the sample code to customize Misty’s locomotion speed or the size of the shape.
Head & Arm Movement (Tutorial) – Sample code for a tutorial that teaches how to use different head and arm movement commands. You can read the full tutorial in the developer documentation.
Hello World (Tutorial) – Sample code for Misty’s Hello World tutorial series. When it runs, Misty moves her head and arms, plays sounds, breathes her LED, rotates on her treads, and starts recognizing faces. You’ll need to train Misty to recognize you and update the skill code with the label associated with your face before you run this skill. Read the full tutorial series in the developer documentation.
EmoteCapTouchLED – A basic example that shows how to program Misty to express her personality. Misty plays different sounds and changes the expression on her display when you touch her head and chin. (You’ll need to customize the skill code to choose the sounds Misty plays when different capacitive touch sensors activate.)
externalRequest_getAudio – Shows how to get data from the web to use in your locally-running skills. This skill has Misty download an audio file from a web URL and play it back through her speakers.
externalRequest_getData – Shows how to use third-party REST APIs in your skills. This sample has Misty get information about the local weather from the Weatherstack API, and print the current weather to the JavaScript console of the Skill Runner web page. Check out the tutorial for a step-by-step walkthrough.
faceDetection and faceRecognition – Shows how to use face detection and face recognition events. In the first skill, Misty reacts when she detects any face; in the second, Misty reacts differently to known vs. unknown faces.
Face Detection (Tutorial) – Sample code for an in-depth tutorial on using face detection in your skills. Read the full tutorial in the developer documentation.
FaceRecEmote – More fun with face recognition. When this skill runs, Misty reacts differently to different faces. You’ll need to train the robot on one or more faces and update this code with the labels you assign to each one. You’ll also need to define which audio files Misty should play for each face.
hazardNotification – Shows how to use data from hazard notification events in your skills. When this skill runs, Misty prints a list of any sensors that are in a hazard state to debug listeners. This sample is useful when coding Misty to navigate around obstacles and avoid high ledges during autonomous exploration.
lookAround – This skill shows how you can use timer events to loop through unique patterns of behavior. When it runs, Misty continuously moves her head in a random and lifelike way.
LooksAtFace – Shows how you can use data from face recognition events to issue commands that have Misty move her head and maintain eye contact with the face she detects in her field of view.
Misty-Multiple-Personality – Another basic example of Misty’s programmable personality. Misty shows different emotions by changing her display image and playing sounds when you press different bump sensors.
MistyReads_AzureTutorial – The code for a skill that integrates with Microsoft Azure Functions and Microsoft Cognitive Services to give Misty the ability to read handwritten text. You’ll find code for Misty and the Azure cloud functions at this link, and there’s a tutorial to show you how to use them on the Misty blog.
mistySeesYou – When this skill runs, Misty raises her arms, plays a happy sound, and changes her display image each time she sees a face.
Play Audio (Tutorial) – The code for a tutorial that demonstrates how to access Misty’s audio assets and play sounds through her speakers. Read the full tutorial in the developer documentation.
propertyTest – Property tests let you filter out unwanted event messages, so that your event callback functions only trigger when certain criteria are met (for example, when a specific capacitive touch sensor is pressed, or when a specific hazard is triggered). To help you understand how they work, this sample shows how to apply property tests to a time-of-flight event listener and ignore irrelevant distance data. (For a quick illustration of the pattern, see the sketch after this list.)
randomLED – Another example of how you can use timer events to loop through patterns of behavior. This time, Misty changes her chest LED to a new, random color once every second.
Record Audio (Tutorial) – Sample code for a tutorial that shows how to use Misty’s API to record audio. You can read the full tutorial in the developer documentation.
recordVideo – Demonstrates how to use Misty’s API to create a new video recording.
RoamLook – Misty loops through a locomotion pattern with face recognition enabled. On seeing a known face, she greets the person and expresses joy. On seeing an unknown face, she takes a picture of the person and saves it to her local storage.
serialReadWrite – This resource includes the JavaScript skill code and the .ino sketch code for programming Misty to send and receive messages to and from the Misty Backpack for Arduino. The backpack can communicate with Misty over hardware or software serial; the .ino sketch files at this link show how to use both.
takePicture – Shows how to code Misty to take a picture with her RGB camera, save it to her local storage, and show the picture on her display.
timeOfFlights – Demonstrates how to use time-of-flight event messages in your skill code. When this skill runs, Misty reacts when she detects an object within a certain distance of her range time-of-flight sensors.
Time-of-Flight (Tutorial) – Sample code for a tutorial that teaches you how to code Misty to change her LED, drive forward, and stop when her time-of-flight sensors detect an obstacle within a certain distance. You can read the full tutorial in the developer documentation.
Timer Events (Tutorial) – Sample code for a tutorial on using timer events to loop through patterns of behavior. As with the randomLED sample above, this skill randomly changes the color of Misty’s LED on a timed interval. Read the full tutorial in the developer documentation to learn more about timer events.
Trigger Skill (Tutorial) – Did you know event messages from Misty’s sensors can trigger other skills to run? This link has the sample code for a tutorial that shows you how. Read the full walkthrough in the developer documentation.
TurnToSound – When this skill runs, Misty uses audio localization data to calculate drive commands and pivot in the direction of the speech she detects.
wakeWord – Shows how to register for key phrase events to trigger actions when Misty hears her wake word.
Wander – When this skill runs, Misty randomly explores her environment, using bump and time-of-flight sensor data to detect obstacles in her path. (Note that Misty drives differently on different types of flooring. You may need to customize this skill code for best performance in your environment.)
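
Here’s the sketch promised in the propertyTest entry above: a hedged example combining a filtered time-of-flight listener (as in the propertyTest and timeOfFlights samples) with a timer event loop (as in randomLED). The property names and sensor position value are illustrative; check Misty’s event documentation for the exact fields your robot reports.

```javascript
// Hedged sketch: property tests and timer events in one skill.

// Only pass along messages from a front-facing range sensor...
misty.AddPropertyTest("FrontTOF", "SensorPosition", "==", "Center", "string");
// ...and only when the reported distance is half a meter or less.
misty.AddPropertyTest("FrontTOF", "DistanceInMeters", "<=", 0.5, "double");
// Return just the distance value with each event message.
misty.AddReturnProperty("FrontTOF", "DistanceInMeters");
misty.RegisterEvent("FrontTOF", "TimeOfFlight", 250, true);

function _FrontTOF(data) {
    // AdditionalResults holds the values requested with AddReturnProperty.
    misty.Debug("Object at " + data.AdditionalResults[0] + " meters");
}

// Timer events invoke their callback on an interval (here, every second).
misty.RegisterTimerEvent("LedTimer", 1000, true);

function _LedTimer() {
    misty.ChangeLED(
        Math.floor(Math.random() * 256),
        Math.floor(Math.random() * 256),
        Math.floor(Math.random() * 256));
}
```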

.NET SDK (Beta)

Misty’s .NET SDK is still in its early stages, and we expect the list of C# samples to grow with time. Currently you can find the collection of Misty-maintained .NET skills in the .NET-SDK repository on GitHub. This repo has a sample C# project called IntroSkillsTask that includes a library of C# skills you can explore to learn about Misty’s .NET SDK.

We encourage you to read more about the IntroSkillsTask project in Misty’s developer documentation, but here’s a quick primer. You’ll want to start by cloning the .NET-SDK repository to a computer with Windows 10 and Visual Studio installed. Open the Misty.Skill.IntroSkills.sln file with Visual Studio, and update the LoadAndPrepareSkill method in the StartupTask.cs code file with the name of a skill from the SkillLibrary directory to deploy that skill to Misty. The SkillLibrary includes the following examples:

ForceDriving — Demonstrates how to use the .NET SDK to build skills that have Misty engage with her environment in different and interesting ways. On start, the skill listens for data from Misty’s time-of-flight (ToF) sensors. Place your hand in front of a range ToF sensor to make Misty drive in the opposite direction. The closer you bring your hand, the faster Misty moves.
HelloWorldSkill — On start, Misty loops through head and arm movement, audio playback, and image changes to greet the world.
HelloAgainWorldSkill — On start, Misty loops through head and arm movement, audio playback, and image changes to greet the world. (The functionality is similar to the HelloWorldSkill, but the implementation is different.)
HelloLocomotionSkill — Demonstrates how to use basic driving commands and handle data from Misty’s hazard system in your .NET skills. On start, Misty drives around the room and avoids obstacles.
InteractiveMistySkill — Demonstrates how to register, unregister, and handle events in your .NET skill code, and shows some of the different ways to listen for data from events and callbacks. Touch Misty’s head and press her bump sensors to see her express different emotions.
LookAroundSkill — Demonstrates how to use timer callbacks to loop through behaviors in .NET skills. On start, the skill registers a timer callback that sends Misty randomized head movement, arm movement, and change LED commands.
MostlyHarmlessSkill — On start, this skill loops to randomly change Misty’s chest LED until the skill is cancelled or times out.
MostlyHarmlessTooSkill — On start, uses a timer callback to invoke change LED commands on a loop until the skill is cancelled or times out. (The functionality is similar to the MostlyHarmlessSkill, but the implementation is different.)
SkillTemplate — A template for developing .NET skills in C#. This template implements the IMistySkill interface so that you can invoke commands and use data from Misty. It provides an interface for declaring meta information about the skill, and it instantiates methods you can populate with your code to determine how Misty behaves when the skill starts, pauses, and cancels. (You can also use the Misty Skill Extension to install a Visual Studio project template for building your own C# skills.)

REST API

These samples show how to send commands and stream data using Misty’s REST API and WebSocket connections. Most of these samples run in your web browser, but you can apply the same principles to any client code you write to issue commands to Misty from an external device.
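
For instance, a sketch like the following (with <robot-ip> standing in for your robot’s IP address) reads device information with a single GET request; Misty’s REST responses typically wrap their payload in a result field.

```javascript
// Hedged sketch: read device information from Misty's REST API.
// Replace <robot-ip> with your robot's IP address.
fetch("http://<robot-ip>/api/device")
    .then((response) => response.json())
    .then((data) => console.log(data.result));
```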

Changing Misty’s LED (Tutorial) — Sample code for a tutorial that teaches you how to change Misty’s LED using REST API commands. Read the full tutorial in the developer documentation.
Exploring Computer Vision (Tutorial) — Sample code for a tutorial that teaches you how to subscribe to Misty’s face recognition WebSocket connection and issue commands that have the robot detect, recognize, and learn new faces. Read the full tutorial in the developer documentation.
Find Face — Do Something — Code that runs in your web browser and has Misty play a sound when she detects a face.
Using Sensors, WebSockets, and Locomotion (Tutorial) — Sample code for a tutorial that teaches you how to subscribe to and use data from Misty’s time-of-flight and locomotion command WebSocket connections. Read the full tutorial in the developer documentation.
Taking Pictures (Tutorial) — Sample code for a tutorial that teaches you how to write an application that runs in your browser to have Misty automatically take a picture and save it to her local storage when she detects a face. Read the full tutorial in the developer documentation.
MistyReads_Python — Sample code that uses the mistyPy wrapper for Misty’s REST API with Microsoft’s Vision and Speech APIs to have Misty read handwriting out loud. You’ll need to update this code with your own API keys before you can use it with your robot.

We hope these lists help accelerate your development with Misty II. If there’s a particular sample you’d love for us to add to the library, let us know by posting in the Community forums. And stay tuned for Part 3 of this series, where we’ll share a list of projects, tools, and experiments from the Misty community that don’t fit neatly under the skill umbrella!
