DJ and VJ All by Yourself in Seconds on the Web

Beact: the only thing you need to become a DJ + VJ is a browser!

Vibert Thio
4 min read · Oct 30, 2017

Play Beact Here | Source Code

Video shooting for “Beact”. (2017.7)

1. Basic Usage

Keyboard & Sequencer Pads

  1. Click a drum pad to create your own pattern.
  2. Press space to start/stop playback.
  3. Press up/down to change the BPM.
  4. Press left/right to change the sound samples.
  5. Press 1~8 to trigger preset patterns.
  6. Press a~z to trigger an animation and a sound together, just like Patatap (see the sketch after this list).
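For readers curious how the a~z mapping might be wired up, here is a minimal sketch in the spirit of Patatap, assuming a Tone.js (v13-era) build; the sample table and the playAnimation hook are illustrative names, not Beact's actual code.

```javascript
// Minimal sketch of a Patatap-style letter mapping (assumed names, not Beact's source).
import Tone from 'tone';

// Hypothetical sample table: one audio file per letter key.
const samples = { a: 'samples/kick.mp3', b: 'samples/snare.mp3', c: 'samples/hihat.mp3' };
const players = new Tone.Players(samples).toMaster();

document.addEventListener('keydown', (e) => {
  const key = e.key.toLowerCase();
  if (players.has(key)) {
    players.get(key).start(); // play the sample mapped to this letter
    playAnimation(key);       // hypothetical hook that fires the matching Two.js animation
  }
});
```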

Sidebars

  1. Start / Stop
  2. Pattern: create a pattern, type in a name, and press Add to upload it to the server for storage (a minimal save sketch follows this list).
  3. Chain: chain several patterns into a song.
  4. Recorder: record the drum-machine pattern and the keyboard playing together into a recording, and upload it to the server to share and replay.
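To give a sense of what the Pattern upload might involve, here is a hedged sketch; the /api/patterns endpoint and the payload shape are hypothetical, not Beact's actual API.

```javascript
// Illustrative only: the endpoint and payload shape are assumptions, not Beact's real API.
// Saves the current drum-machine pattern under a user-chosen name.
function savePattern(name, matrix) {
  return fetch('/api/patterns', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name, matrix }), // matrix: rows of on/off steps per track
  }).then((res) => res.json());
}

// Example: an 8-track by 16-step pattern stored as a boolean matrix.
savePattern('my first beat', Array.from({ length: 8 }, () => Array(16).fill(false)));
```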

2. Introduction

Beact is not only an audio/visual artwork but also an instrument that anyone can play to become a DJ + VJ.

Based on the idea of Patatap, “Beact” uses Tone.js and Two.js as its audio and visual engines respectively. It combines a step sequencer with the keyboard, giving musicians a complete set of tools to perform their work in a browser. The project merges creative coding into the software architecture while keeping conceptual integrity.
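As a rough illustration of how the two libraries can cooperate, the sketch below drives a 16-step Tone.js sequence and flashes a Two.js shape on every hit; the pattern and shapes are assumptions for the example, not Beact's actual scenes.

```javascript
// A minimal sequencer-plus-visuals sketch (assumed structure, not Beact's source).
import Tone from 'tone';
import Two from 'two.js';

const two = new Two({ fullscreen: true, autostart: true }).appendTo(document.body);
const drum = new Tone.MembraneSynth().toMaster();

// 16-step pattern: true = hit on that step.
const pattern = [true, false, false, false, true, false, false, false,
                 true, false, false, false, true, false, true, false];

const seq = new Tone.Sequence((time, step) => {
  if (pattern[step]) {
    drum.triggerAttackRelease('C2', '8n', time);                      // audio hit
    const circle = two.makeCircle(two.width / 2, two.height / 2, 40 + step * 4);
    circle.fill = '#FF8000';                                          // visual flash for the same step
  }
}, [...Array(16).keys()], '16n');

seq.start(0);
Tone.Transport.bpm.value = 120; // the up/down arrows would change this value
Tone.Transport.start();         // the space bar toggles the transport
```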

3. Inspiration

In my spare time, I follow artists, hackers, and musicians who use the browser and the internet as creative media. Sometimes I come across works so astonishing that I can't close my mouth in front of my computer. The concept behind Patatap is fascinating: each letter corresponds to a sound and an animation, transforming abstract words into something beautiful.

After a while, I wanted to use it as an instrument for my performances and compose songs with it. However, the system didn't allow that, so I started thinking about modifying it. That was the starting point of “Beact”.

Beact showcase video. (2017.7)

4. Cooperation and Git

In the beginning, I developed “Beact” on my own. After I finished the front end, which contains the core logic of the project, I asked two friends, Yu-An Chan and Joey Huang, to help me with the back end and the CSS respectively. I became the technical lead and project manager of the three-person team, and was in charge of all the front-end design, including the visual and audio effects.

We used Git for version control. The size of the source code made collaboration significantly harder and produced some conflicts that were difficult to resolve. I learned a lot about Git workflows on this project and became familiar with teamwork patterns in larger-scale software engineering.

Screenshot of one scene of Beact. (2017.7)

5. Beact with Performer/Audience

There are several existing libraries on the node package manager (npm) that support audio or visual effects, so I had to find the ones most suitable for my project. I used React as the framework and webpack as the module bundler.

In the beginning, users could only access the functions with the keyboard. Later, I added MIDI controller support so that musicians can play “Beact” with their own instruments; a rough sketch of the idea follows.
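As an illustration, a Web MIDI listener can route note-on messages into the same trigger path as the keyboard; triggerPad and the note-to-pad mapping below are hypothetical, not Beact's real function names.

```javascript
// Rough sketch using the standard Web MIDI API; triggerPad is a hypothetical handler.
navigator.requestMIDIAccess().then((midi) => {
  midi.inputs.forEach((input) => {
    input.onmidimessage = (msg) => {
      const [status, note, velocity] = msg.data;
      const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
      if (isNoteOn) {
        triggerPad(note % 16); // map incoming notes onto the 16 drum-machine pads
      }
    };
  });
});
```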

Beact is not merely a simple app for people to have fun with, but a powerful audio/visual tool for performers that can be accessed with nothing more than a browser.

Screenshot of one scene of Beact. (2017.7)

6. UI/UX Analysis

When I finished the first stable version of Beact and published it on Heroku, I got feedback that some users did not find it intuitive enough to use without a text-based manual. I reached out to a friend, Ms. Fang, who studies UI/UX at National Tsing Hua University in Taiwan. We are currently discussing how to make the UI/UX friendlier and more intuitive so that users can enjoy it without a manual.

The user interface of Beact 0.0.1
