Interactive Media Art | Visualize “Noise” From Space | Inspired by Space Sound Effects (SSFX)

What is SSFX?

What does space sound like? It’s a mysterious question. Not until I saw the SSFX website did I realize how close we are to remote outer space. SSFX (Space Sound Effects) is a Physics & Astronomy project at Queen Mary University of London. It ran a worldwide short-film competition in 2017 (yeah, I know that’s a long time ago) and incorporated those films into an anthology film, which was released on 16 October 2018 by Martin Archer on YouTube.

Where does our inspiration come from?

I was deeply impressed and inspired by their full film, including all the fascinating space sounds, narrative scenes, and so on. Of course I love all the short films made by those independent filmmakers, but I am particularly interested in how Martin and his team opened the full film and connected all the subsequent short films.

They shot from a first-person perspective to create a really engaging and immersive feeling, and they created signal-interruption effects on the screens with green-screen technology in postproduction. Their idea is that space weather and space sounds are affecting and interfering with the technology in the person’s apartment.

Screenshot from the full film on YouTube.

After watching their behind-the-scenes footage, I learned that all the screens in the apartment were marked with tracking markers.

Green screens and tracking markers shown in their behind-the-scenes footage on YouTube.

What did we do in our project?

I really like these cool flickering noise effects on screen, and I wondered whether we could achieve a similar effect with Python code and hardware instead of postproduction. Our crew discussed this and decided to use a field-programmable gate array (FPGA). To enhance the system’s interactivity, we also decided to synchronize the screen with the sound intensity recognized by the PC.
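As a rough illustration of the PC side, here is a minimal Python sketch of how the sound intensity could be measured from the microphone. It assumes the numpy and sounddevice packages; the sample rate and block length are placeholder values, not necessarily the ones we used.

    import numpy as np
    import sounddevice as sd  # assumed dependency for microphone capture

    SAMPLE_RATE = 44100   # Hz (placeholder value)
    BLOCK_SECONDS = 0.05  # analyze the input in 50 ms blocks (placeholder value)

    def read_intensity():
        """Record one short block from the default microphone and return its RMS level."""
        frames = int(SAMPLE_RATE * BLOCK_SECONDS)
        block = sd.rec(frames, samplerate=SAMPLE_RATE, channels=1, dtype="float32")
        sd.wait()  # wait until the recording has finished
        return float(np.sqrt(np.mean(np.square(block))))

The RMS value is then compared against a threshold before anything is sent to the FPGA, as described in the process below.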

This is a brief demonstration of the effect we achieved.

If you are interested, please visit my YouTube channel and check the full video.

Process

The system block diagram
  • The PC will pick up the sound wave with its microphone.
  • It will then quantize the signal and analyze the sound intensity.
  • When the sound intensity reaches the threshold, the switch in the FPGA will be turned on through the serial link (see the sketch after this list). The display frame data will then be transmitted to the VGA controller and appear on screen as random flickering effects driven by clock pulses.
  • If the sound intensity does not reach the threshold, the display frame process module still holds data, but it is not sent to the VGA controller, so there is no signal and nothing is displayed on the screen.
  • Because the sound intensity data is also sent to the display frame process module, the algorithm will adjust the black-and-white ratio on screen according to the recognized sound intensity.
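To make the serial step above concrete, here is a minimal, hypothetical sketch of the PC-to-FPGA loop using pyserial. The port name, baud rate, threshold, and two-byte message format are illustrative assumptions rather than our actual protocol; read_intensity stands for a level-measuring function like the one sketched earlier.

    import serial  # pyserial, assumed here for the UART link to the FPGA board

    PORT = "COM3"      # hypothetical serial port name
    BAUD = 115200      # hypothetical baud rate
    THRESHOLD = 0.05   # hypothetical RMS level that turns the FPGA switch on

    def forward_intensity(read_intensity):
        """Poll the microphone level and forward it to the FPGA over serial."""
        with serial.Serial(PORT, BAUD, timeout=1) as link:
            while True:
                level = read_intensity()
                if level >= THRESHOLD:
                    # Switch-on byte plus the intensity scaled to 0-255, which the
                    # display frame process module could map to a black-and-white ratio.
                    link.write(bytes([0x01, min(255, int(level * 255))]))
                else:
                    # Switch-off byte: the FPGA then sends no frame to the VGA controller.
                    link.write(bytes([0x00, 0x00]))

A fixed two-byte message like this keeps the FPGA-side serial receiver simple to implement.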

If you are interested in our source code, please check Mengrou’s GitHub page.

Please also follow my Twitter account @VMaggieee