Nose theremins and light painters

Use your body as a pointer in p5.js

Luisa Pereira
Oct 23, 2018 · 3 min read

The Creatability experiments, which include several musical instruments, a text-to-musical-speech toy and a music-to-visuals tool, offer a range of input and output modes for people to choose from. Having multiple interaction modes can make creative coding projects more accessible, expressive, or engaging; designing them often leads to new ideas and inquiry paths. We hope these experiments will inspire you to explore multiple interaction modes in your own projects.

One key feature of the Creatability experiments allows people to use their torso, nose, wrists, and other body parts as pointers, instead of the mouse.

Nose as pointer in a p5 sketch. (Image description: an animated screen capture of the nose pointer sketch running on the p5 editor. On the left pane is the code, too small to read. On the right is the running sketch: Luisa sways her face left and right as a blue circle follows the position of her nose.)

To use body parts as pointers, the Creatability experiments use PoseNet, a machine learning model for real-time human pose estimation. Adding this feature to your p5 sketches takes only a few lines of code. Here are three quick examples I made using the nose as a pointer:

These links lead to the p5 editor, where you can try the examples and play with their code. Please try them using Chrome.

Nose Scribbler example. Play + Edit Code here (Image description: an animated screen capture of the light painter sketch on the p5 editor. On the left pane is the sketch code, too small to read. On the right is the sketch: as Luisa moves left and right, a yellow trail follows her nose)

Adding body pointers to your p5 sketches

To add body pointers to your sketches you can either duplicate the p5 boilerplate or follow the steps below.

1. Add a reference to the Creatability library to your index.html file:
<script src="https://storage.googleapis.com/gweb-creatability.appspot.com/acc-components.js"></script>

2. In your setup() function, create a pose input element, set its dimensions to match the dimensions of your canvas, and pick the body part you want to track:

function setup() {
  canvas = createCanvas(640, 480);

  // Create and initialize a pose input element
  input = document.createElement('acc-pose-input');
  input.initialize();

  // Set its dimensions to match the dimensions of the canvas
  input.contentElement = canvas.elt;

  // Pick a body part to track
  input.part = 'nose';
  // Other body parts to track:
  // sternum, leftWrist, rightWrist,
  // leftKnee, rightKnee, leftAnkle, rightAnkle
}

3. In your draw() function, draw the camera input, get the pointer coordinates, and draw a pointer at them:

function draw() {
  // Make sure the pose input component has been loaded
  if (input.isReady) {
    // Draw the camera input to the canvas
    canvas.elt.getContext('2d')
      .drawImage(input.canvas,
        0, 0,
        input.canvas.width,
        input.canvas.height);

    // Get the position of the pointer
    let x = input.targetPosition[0];
    let y = input.targetPosition[1];

    // Draw an ellipse at the position of the pointer
    noStroke();
    fill(0, 0, 255);
    ellipse(x, y, 60, 60);
  }
}

4. If your user is not positioned at the center of the screen, it can be useful to reset the tracking's center point and the amplification applied to the motion. The easiest way to see how these work is to change them in the Camera Settings section of any of the experiments, for example Sampler.

To change the amplification, set the input.amplification property:

input.amplification = 2;
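Conceptually, amplification scales the pointer's displacement from the tracking center, so small head movements can reach the whole canvas. A rough sketch of the idea (an illustration only, with a hypothetical amplify helper, not the library's actual implementation):

```javascript
// Illustrative only: scale a raw tracked coordinate away from a
// center point by an amplification factor. `amplify` is a
// hypothetical helper, not part of the Creatability library.
function amplify(raw, center, amplification) {
  return center + (raw - center) * amplification;
}

// With amplification = 2, a nose tracked 50px right of a 320px
// center lands 100px right of center on screen.
console.log(amplify(370, 320, 2)); // 420
```

With amplification = 1 the pointer simply follows the tracked position.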

To set the center point of the tracking to the current position of the cursor, call input.setCenterToCurrentPosition(). For example, to reset the center whenever a key is pressed:

function keyPressed() {
  // Reset the center point
  input.setCenterToCurrentPosition();
}

If you are interested in diving deeper into the components library we created and used in the project, please visit the Creatability component repository on GitHub.

Also check out ml5.js, a friendly library for machine learning on the web that is fully compatible with p5.js and comes with plenty of examples, tutorials, and sample datasets, including PoseNet.
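For comparison, a minimal nose-follower built directly on ml5's poseNet might look like the following. This is a browser-only sketch that assumes p5.js and ml5.js (0.x) are loaded via script tags, and it follows the shape of ml5's documented poseNet examples:

```javascript
// Minimal nose tracking with ml5.js poseNet (runs in the browser;
// p5.js and ml5.js must be included via script tags).
let video;
let noseX = 0;
let noseY = 0;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  // Load the model and listen for pose results
  const poseNet = ml5.poseNet(video, () => console.log('model ready'));
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) {
      noseX = poses[0].pose.nose.x;
      noseY = poses[0].pose.nose.y;
    }
  });
}

function draw() {
  image(video, 0, 0);
  noStroke();
  fill(0, 0, 255);
  ellipse(noseX, noseY, 60, 60);
}
```

Note that unlike the Creatability component, this raw version has no built-in centering or amplification; you would add those yourself.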

Luisa Pereira

Written by

music, code, design • faculty @itp_nyu • @NEWINC, #CS4AllNYC, @p5js • ⚡️🎛 http://bit.ly/the-counterpointer
