Making ml5.js Accessible

Interview with Bomani Oseni McClendon, ml5.js Fellow 2020

A person sits in a chair. Their entire face is covered by a black shape, with white dots positioned over key facial features.
Automatic facial occlusion using the ml5.js Facemesh model and p5.js. Original base photo by Edrick Chu.

Bomani Oseni McClendon is an engineer living in Brooklyn. He builds software to support artists and teaches students in grades 2–5 about electronic circuitry through craft activities. Through his creative practice, he studies the ways that Black health outcomes are influenced by a history of scientific racism. Bomani's GitHub is here. Bomani was mentored by Joey K. Lee.

Releasing v0.5.0 of ml5.js

Centralizing the version release process into single-repo changes

Merge examples from ml5-examples into the core ml5-library repo

  • Consolidating the ml5-library repo and the ml5-examples repo means one fewer repo to handle in our library release process: releases of the core ml5.js library and the ml5.js examples can now be coordinated through changes to a single repo.
  • Features for the core library can be developed, tested, and submitted alongside a working example showing their use case. Not only does this make it easier to keep our examples in sync with the core library, it also means that contributors don't need to context-switch between two separate repos. Additionally, our examples can be used as manual tests during development to double-check that changes to the library aren't breaking examples or causing perceptible regressions.
A person opens and closes their hand in front of a camera. Green dots are drawn over key positions on their palm and fingers.
Testing handpose detection using the ml5.js Handpose model.

Preparing for the next release

  • Facemesh Model: The next release of ml5.js will include the new TensorFlow Facemesh model, which predicts 486 3D facial landmarks.
  • Handpose Model: We're also adding the new TensorFlow Handpose model, which predicts 21 3D hand keypoints.
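Both models return predictions as arrays of [x, y, z] keypoints, which sketches typically reduce to overlays like the dots and boxes shown in the demos above. As a rough sketch of that pattern, the helper below collapses a keypoint array into a 2D bounding box; the helper name is our own, and the ml5.js calls in the comments are assumptions based on the library's usual `ml5.model(video, callback)` + `.on("predict", …)` convention rather than the finalized API for these new models.

```javascript
// Hypothetical helper: reduce an array of [x, y, z] keypoints
// (as returned by a Facemesh or Handpose prediction) to a 2D
// bounding box suitable for drawing an overlay rectangle.
function keypointBoundingBox(keypoints) {
  let minX = Infinity, minY = Infinity;
  let maxX = -Infinity, maxY = -Infinity;
  for (const [x, y] of keypoints) {
    minX = Math.min(minX, x);
    maxX = Math.max(maxX, x);
    minY = Math.min(minY, y);
    maxY = Math.max(maxY, y);
  }
  return { x: minX, y: minY, width: maxX - minX, height: maxY - minY };
}

// In a p5.js sketch this might be used roughly like this
// (browser-only; API names are assumptions, not confirmed):
//   const handpose = ml5.handpose(video, () => console.log("model ready"));
//   handpose.on("predict", (results) => {
//     for (const hand of results) {
//       const box = keypointBoundingBox(hand.landmarks);
//       // draw keypoint dots, then rect(box.x, box.y, box.width, box.height)
//     }
//   });
```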
Demonstration of the new “npm run develop” command. This command boots the local development workflow with Webpack file watching and live reloading for the examples index. [image description: On the left side of the screen, the command “npm run develop” is typed into a black terminal window. A few lines of output are listed in the terminal output before the ml5 Examples Index website appears on the right side of the screen.]
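The "npm run develop" command above is wired up through the repo's npm scripts. As a purely illustrative sketch of what such an entry could look like (the script names, config filename, and tools shown here are assumptions, not the actual contents of ml5-library's package.json):

```json
{
  "scripts": {
    "watch": "webpack --watch --config webpack.dev.js",
    "serve:examples": "http-server ./examples",
    "develop": "concurrently \"npm run watch\" \"npm run serve:examples\""
  }
}
```

The idea is simply that one command runs the Webpack file watcher and the examples server together, so library changes rebuild automatically while the examples index stays live in the browser.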



Processing Foundation

The Processing Foundation promotes software literacy within the visual arts, and visual literacy within technology-related fields.