Neurosynths: Harnessing Mental States to Sculpt Sonicscapes
If the rich history of synthesizers has proven anything, it’s that these instruments are destined to break new ground. What began as walls of patchable analog modules and simple waveform generators has evolved into software behemoths and iOS apps. Now, a pioneering group of biohackers and electronic musicians is taking synthesis further than ever imagined, making brainwaves the control source.
Through apps like NeuroSynth and Encephalon, users can now control synthesizers directly with their minds. Electrodes placed on the scalp read electroencephalography (EEG) signals, decoding subtle changes in concentration, relaxation, and other cognitive states to trigger sonic changes in real time. The results mark a new frontier for brain-computer music interfaces (BCIs) and biotechnological performance.
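To make that pipeline concrete, here is a minimal sketch of the core idea under some assumed specifics (none of them drawn from the apps named above): a single-channel EEG stream at 256 Hz, a "relaxation index" computed from the alpha/beta band-power ratio, and a synth filter cutoff mapped from that index. The EEG here is simulated noise; a real rig would read from the headset's SDK and push the value to the synth over MIDI or OSC.

```python
import numpy as np

FS = 256          # assumed EEG sample rate (Hz)
WINDOW_SEC = 2    # length of each analysis window

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` between lo and hi Hz (via FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def relaxation_index(eeg_window, fs=FS):
    """Alpha/beta power ratio: rises as the wearer relaxes (a common heuristic)."""
    alpha = band_power(eeg_window, fs, 8, 12)
    beta = band_power(eeg_window, fs, 13, 30)
    return alpha / (alpha + beta)

def to_cutoff(index, lo_hz=200.0, hi_hz=8000.0):
    """Map a 0..1 relaxation index onto a filter cutoff in Hz (log scale)."""
    index = float(np.clip(index, 0.0, 1.0))
    return lo_hz * (hi_hz / lo_hz) ** index

if __name__ == "__main__":
    # Simulated EEG: broadband noise plus a dash of 10 Hz "alpha" activity.
    t = np.arange(FS * WINDOW_SEC) / FS
    eeg = np.random.randn(len(t)) + 0.8 * np.sin(2 * np.pi * 10 * t)

    idx = relaxation_index(eeg)
    print(f"relaxation index: {idx:.2f} -> filter cutoff: {to_cutoff(idx):.0f} Hz")
    # A performance patch would repeat this every few hundred milliseconds,
    # sending the result to the synth as a MIDI CC or an OSC message.
```

The mapping itself is the musical decision: a log-scaled cutoff feels more natural to the ear than a linear one, and smoothing successive values would keep the filter from jittering with every noisy window.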
It’s still early days for EEG-controlled synthesis, but the possibilities are dazzling. As the technology progresses, BCI setups may let players sculpt timbres fluidly through thought alone, giving new meaning to the notion of “mind melding” with one’s instrument. Live performers could modulate generative patches on the fly, conjuring evocative soundscapes in dialogue with their own mental state.
And things need not stop at sound design. Proposals exist for using emotion-sensing BCIs…