Featured Articles, Podcasts, Bio
Micah Blumberg, programmer, journalist https://www.linkedin.com/in/worksalt
Other ways to connect
From 2012 to 2014 I ran a neurofeedback business in San Francisco that also did research to develop new brain-computer interface experiences, with light and sound patterns driven by EEG brainwaves.
In 2015 I pivoted to become a full-time journalist, intent on learning how to build a magazine in VR (with WebVR technology) and how to develop VR applications that could be used to visualize sensor data.
In 2017 I started the “Neural Lace Podcast” series on YouTube. The discussions explore how future brain-computer interfaces might work and touch on topics in neuroscience, computational biology, deep learning, and more.
The Neural Lace Podcast
For two years, from January 2018 until December 2019, I was the organizer of NeurotechSF in San Francisco, part of NeuroTechX.
In 2018, for NeurotechSF, I organized coding meetups with the San Francisco hacker community at Noisebridge to bring EEG into WebXR.
In 2019 for NeurotechSF: I hosted a series of neuroscience talks at the Red Victorian Hotel.
In 2018, as the organizer of NeurotechSF, I led a weekly code night community project at Noisebridge, a maker space in San Francisco, to bring EEG (brainwave) signals into virtual reality via WebXR.

In the beginning we had an OpenBCI EEG device and an HTC Vive, and we tried to create 3D-printed parts to attach the OpenBCI sensors to the Vive. Our 3D printing efforts were not successful, so we ended up pairing a Brainduino EEG device with an Oculus Go. To start, we served the EEG data from a desktop running a Debian distro of Linux, using a server written in Python (later replaced with a Go server), and imported the EEG signals into the webpage over a WebSocket. We used the Fast Fourier Transform to divide each signal into frequency ranges such as Delta, Theta, Alpha, Beta, and Gamma, so with our two sensors we had ten incoming signals to visualize. We then used a combination of A-Frame and Three.js scripting to change the height and color of objects representing the incoming data, effectively creating a 3D time series in virtual reality. Users could walk around inside this representation of their brain activity with the Lenovo Mirage Solo VR headset, and they could click the visualization to rotate it in space.
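The band-splitting step can be sketched as follows. Assuming the server hands the page an array of FFT magnitudes with a known sample rate, the power in each band is just a sum over the bins that fall inside that band. The band edges below are typical textbook values, not necessarily the exact cutoffs we used:

```javascript
// Typical EEG band edges in Hz (assumed values, for illustration).
const BANDS = {
  delta: [0.5, 4],
  theta: [4, 8],
  alpha: [8, 13],
  beta: [13, 30],
  gamma: [30, 45],
};

// Sum squared FFT magnitudes into per-band powers.
// magnitudes[i] corresponds to frequency i * sampleRate / fftSize.
function bandPowers(magnitudes, sampleRate, fftSize) {
  const binHz = sampleRate / fftSize;
  const powers = {};
  for (const [name, [lo, hi]] of Object.entries(BANDS)) {
    let sum = 0;
    for (let i = 0; i < magnitudes.length; i++) {
      const f = i * binHz;
      if (f >= lo && f < hi) sum += magnitudes[i] * magnitudes[i];
    }
    powers[name] = sum;
  }
  return powers;
}
```

Calling this once per sensor yields the ten values (2 sensors × 5 bands) that drove the heights and colors of the objects in the scene.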
This video shows one of the first results of the WebVR EEG scatterplot work we did at NeurotechSF meetups: https://twitter.com/worksalt/status/1055343502005923840?s=20
After that I significantly improved the A-Frame side of the code; here is a later result: https://photos.google.com/share/AF1QipP_l79r-yc8ghELZykQDL1eI7Xw82uumsFM7QNsqAAJNiEzZTXWhvnleticudX6jQ?key=aUpyWE81U3YxSC1SdXRKS0dtdFJUdklaelQ4MXJ3
Updates to Neurohaxor EEG WebVR: added a VR console that shows the user new buttons to teleport, turn right or left, or laser-click items, and converted the boxes to plain buffer geometry, which makes the app run a lot smoother. Watch the new video here:
In 2019 I began to create more advanced UI/UX in A-Frame. In this video you can see 3D models and 2D pictures moving on a 3D carousel I wrote, which moves items between visible and invisible points when the user clicks next or previous; the entities change scale, rotation, and position with every click. It works in both AR and VR.
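The bookkeeping behind a carousel like this can be sketched as a pure function: each item is mapped to either a visible anchor point or a hidden parking point, and "next"/"previous" simply shift an offset before every entity animates toward its new target. The names below are illustrative, not the original component's API:

```javascript
// Hidden parking point for items that are currently off the carousel
// (illustrative values, not the original component's).
const HIDDEN = { x: 0, y: -10, z: 0, visible: false };

// Map each item to a visible slot or the hidden point, based on the
// current offset. `slots` holds the visible anchor positions.
function layout(items, slots, offset) {
  const n = items.length;
  return items.map((item, i) => {
    const slot = ((i - offset) % n + n) % n; // wrap around the ring
    return slot < slots.length
      ? { item, ...slots[slot], visible: true }
      : { item, ...HIDDEN };
  });
}

// "next" and "previous" just shift the offset; each entity then
// animates its position, rotation, and scale toward its new target.
const next = (offset) => offset + 1;
const prev = (offset) => offset - 1;
```

In A-Frame, the per-click animation toward each target would be handled by the `animation` component or a custom `tick` handler on each entity.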
This code demonstration shows Tilt Brush content inside WebXR with A-Frame, working on an iPad. It runs on Oculus Go, Oculus Quest, Lenovo Mirage Solo, and Windows Mixed Reality with all the VR controllers working; on iPhone, iPad, and Android with a touch-screen UI; and in AR mode on iPhone, iPad, and Android. Given other devices to test, I would eventually get it working on those as well.
I continued to host meetups in 2019, but they centered on presentations from people creating novel brain-computer interfaces. In 2020 I decided to shift my focus back to writing WebXR code full time, with the goal of improving my programming skills so that I could eventually use that knowledge to build next-generation brain-computer interfaces with devices such as functional near-infrared spectroscopy.
In 2020 I led a project to bring gravity gloves (originally featured in the videogame Half-Life: Alyx) into WebXR: you point at an object you want, click, and it flies toward you so you can catch it. I also implemented game physics so the objects could be thrown, and helped develop virtual 3D post-it notes and persistent objects whose properties were stored on a server and available to anyone who visited the webpage.
In 2020 I hosted the Aframe WebXR Online Hacknight for 32 weeks, sometimes twice a week. We built the Aframe Gravity Gloves Component, inspired by the Half-Life: Alyx gravity gloves, along with other projects. https://github.com/n5ro/n5ro.github.io/tree/master/gravitygloves
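The "pull" mechanic described above can be sketched as follows. This is an illustrative sketch, not the code in the linked repo: each frame, the grabbed object is given a velocity along the vector toward the hand, with a small upward bias so it arcs into the palm rather than sliding along the floor:

```javascript
// Illustrative gravity-gloves pull step (names and constants are
// assumptions, not the actual component's API).
// Returns the velocity to apply to the object's physics body this tick.
function pullVelocity(objPos, handPos, speed = 4, upBias = 1.5) {
  const dx = handPos.x - objPos.x;
  const dy = handPos.y - objPos.y;
  const dz = handPos.z - objPos.z;
  const len = Math.hypot(dx, dy, dz) || 1; // avoid divide-by-zero
  return {
    x: (dx / len) * speed,
    y: (dy / len) * speed + upBias, // arc upward toward the catch point
    z: (dz / len) * speed,
  };
}
```

In an A-Frame physics setup, this velocity would be applied to the object's body on every tick until it is close enough to snap into the hand, at which point normal grab-and-throw physics takes over.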