Micah Blumberg

Silicon Valley Global News SVGN.io
6 min read · Jan 8, 2018


I develop next-generation computing technologies based on the study of neuroscience. I'm an author, patent creator, institute founder, programmer, neuro-physicist, theoretical physicist, and NFT creator. I am Creole & Jewish, neurodivergent, and autistic, and I use he/him pronouns.

I'm a matrix hacker (math, code, physics, WebXR, neural networks) and a neurohacker: I write code for brain-machine interfaces, design new brain-computer interface technologies, and write about the quantum physics of neurophysics and quantum gravity, and I recently began designing propulsion systems and aerospace vehicles. I'm a book author (in progress), a patent creator (in progress), a science institute founder (in progress), and a software architect.

I am writing a book about the quantum physics of neurophysics: how human phenomenal consciousness works and how to hack the human mind with technology. The book includes a new extension of quantum gravity.

This year (2021) I am starting a new science institute.

I work on climate change and global economic issues, help to resolve international conflicts, consult for governments and corporations on advanced technologies, and support artists and cancer research.

I've been writing JavaScript since 2010, and since 2017 I have been creating VR and AR applications with WebXR, A-Frame, Three.js, and related code libraries. As a community organizer I have hosted coding meetups and talks about neuroscience, the metaverse, VR/AR technology, and artificial intelligence.

I design code and micro-processors for novel self-aware neural networks.

Always seek wisdom, continually improve, learn from mistakes & successes, goodwill towards others is the best path to long term success, fight to do right!

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — -

In addition, I have been writing JavaScript code since around 2010. I co-created software for neuroimaging (EEG in VR), and since 2017 I have created Virtual Reality, Augmented Reality, Deep Learning, and BCI applications (with experience in EEG, fNIRS, EIT, and more) using WebXR, A-Frame, Three.js, MediaPipe / TensorFlow.js, and other code libraries. I'm a researcher of deep neural networks, computational neuroscience, computational biology, neuroscience, human and artificial cognition, and brain-computer interfaces. My ultimate mission in life is to create full-dive nerve gear (next-gen BCI) and self-aware networks (sentient artificial brains, also referred to as AGI). I am also a community organizer who hosts coding meetups and gives talks on neuroscience and other tech topics.

Here is a link to a page with featured articles

Email: micah@vrma.io

Other ways to connect

In 2010 I was coding with JavaScript, HTML, CSS, and PHP inside Adobe Dreamweaver to build websites for my clients, and I was writing simple animations in ActionScript with Adobe Flash.

From 2012 to 2014 I ran a neurofeedback business in San Francisco that also did research to develop new brain-computer interface experiences with light and sound patterns driven by EEG brainwaves.

In 2015 I pivoted to become a full-time journalist, intent on learning how to build a magazine in VR (with WebVR technology) and how to develop VR applications that could visualize sensor data.

In 2017 I started the "Neural Lace Podcast" series on YouTube; the discussions explore how future brain-computer interfaces might work and touch on topics related to neuroscience, computational biology, deep learning, and more.

The Neural Lace Podcast


For two years, from January 2018 until December 2019, I was the organizer of NeurotechSF in San Francisco, part of NeuroTechX.

In 2018, for NeurotechSF, I organized coding meetups with the San Francisco hacker community at Noisebridge to bring EEG into WebXR.
In 2019, for NeurotechSF, I hosted a series of neuroscience talks at the Red Victorian Hotel.

In 2018, as the organizer of NeurotechSF, I led a weekly code-night community project at Noisebridge, a maker space in San Francisco, to bring EEG (brainwave) signals into Virtual Reality (WebXR). In the beginning we had an OpenBCI EEG device and an HTC Vive, and we tried to create 3D-printed parts to attach the OpenBCI sensors to the Vive. Our 3D-printing efforts were not successful, so we ended up using a Brainduino EEG device with an Oculus Go.

To start, we served the EEG data to a desktop running a Debian distribution of Linux. We used Python to create a server (later replaced with a Go server), imported the EEG signals into the webpage over a WebSocket, and used the Fast Fourier Transform to divide the signal into frequency ranges such as Delta, Theta, Alpha, Beta, and Gamma, so that with our two sensors we had 10 incoming signals to visualize. We then used a combination of A-Frame and Three.js scripting to change the height and color of objects representing the incoming data, effectively creating a 3D time series in virtual reality. Users could walk around inside this representation of their brain activity with the Lenovo Mirage Solo VR headset, and they could click the visualization to rotate it in space.
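The band-splitting step described above can be sketched roughly as follows. This is a minimal illustration with hypothetical names (`bandPowers`, `BANDS`), not the actual meetup code: it groups FFT magnitude bins into the standard EEG frequency bands, yielding five values per sensor (so ten signals for two sensors).

```javascript
// Hypothetical sketch: sum FFT magnitude bins into EEG frequency bands.
// `binHz` is the frequency resolution of each FFT bin (sampleRate / fftSize).
const BANDS = {
  delta: [0.5, 4],
  theta: [4, 8],
  alpha: [8, 13],
  beta: [13, 30],
  gamma: [30, 45],
};

function bandPowers(magnitudes, binHz) {
  const powers = { delta: 0, theta: 0, alpha: 0, beta: 0, gamma: 0 };
  magnitudes.forEach((mag, i) => {
    const freq = i * binHz; // frequency represented by this bin
    for (const [band, [lo, hi]] of Object.entries(BANDS)) {
      // power is proportional to magnitude squared
      if (freq >= lo && freq < hi) powers[band] += mag * mag;
    }
  });
  return powers;
}
```

In the visualization, each of these band powers would then drive the height and color of one column of objects in the A-Frame scene.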

This video shows one of the first results, the WebVR EEG scatterplot work we were doing at NeurotechSF meetups: https://twitter.com/worksalt/status/1055343502005923840?s=20

After that I significantly improved the code on the A-Frame side, and this was a later result: https://photos.google.com/share/AF1QipP_l79r-yc8ghELZykQDL1eI7Xw82uumsFM7QNsqAAJNiEzZTXWhvnleticudX6jQ?key=aUpyWE81U3YxSC1SdXRKS0dtdFJUdklaelQ4MXJ3

Updates to the Neurohaxor EEG WebVR app: I added a VR console that shows the user new buttons to teleport, turn right or left, or laser-click items, and I turned the boxes into plain buffer geometry, which makes the app run a lot smoother. Watch the new video here:

In 2019 I created a 3D carousel that cycles through items (2D pictures and 3D objects) when the user clicks next / previous, and I built a button interface for a video player inside WebXR. This involves some fairly advanced JavaScript.

In 2019 I began to create some advanced UI/UX in A-Frame. In this video you can see 3D models and 2D pictures moving on a 3D carousel I wrote, which moves items between visible and invisible points when the user clicks next or previous; the entities change scale, rotation, and position with every click. It works in both AR and VR.
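The carousel's core bookkeeping can be sketched like this (hypothetical names, not the real component, which also animated scale, rotation, and position in A-Frame): an index into the item list that wraps around in both directions when the user clicks next or previous.

```javascript
// Hypothetical sketch of a carousel's index bookkeeping with wraparound.
// In A-Frame, each index change would reposition entities between the
// visible point and the hidden points, animating scale and rotation.
function makeCarousel(items) {
  let index = 0;
  return {
    current: () => items[index],
    next() {
      index = (index + 1) % items.length;
      return items[index];
    },
    previous() {
      index = (index - 1 + items.length) % items.length;
      return items[index];
    },
  };
}
```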

This code demonstration shows Tilt Brush inside WebXR with A-Frame working on an iPad. It runs on Oculus Go, Oculus Quest, Lenovo Mirage Solo, and Windows Mixed Reality with all the VR controllers working; on iPhone, iPad, and Android with a touch-screen UI; and in AR mode on iPhone, iPad, and Android. If I had other devices to test, I would eventually get it working on those as well.

I continued to host meetups in 2019, but they shifted toward presentations from people creating novel brain-computer interfaces. In 2020 I decided to move my focus back to writing WebXR code full time, with the goal of improving my programming skills so that I can eventually use that knowledge to build next-generation brain-computer interfaces with devices such as functional near-infrared spectroscopy.

In 2020 I led a project to bring gravity gloves (originally featured in the video game Half-Life: Alyx) into WebXR: you can point at an object you want, click it, and it flies toward you so you can catch it. I also implemented game physics so the objects could be thrown, and I helped develop virtual (3D) post-it notes and persistent objects whose properties were stored on a server and available to anyone who visited the webpage.
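The pull mechanic can be sketched as a simple velocity calculation. This is a hedged illustration with a hypothetical function name, not the component's actual code: the targeted object gets a velocity aimed at the player's hand plus an upward boost, so it arcs toward the catch rather than flying in a flat line.

```javascript
// Hypothetical sketch: compute the launch velocity for a "gravity glove"
// pull. The object is sent toward the hand at `speed`, with an extra
// `upBoost` on the y-axis so the physics engine arcs it into the catch.
function pullVelocity(objPos, handPos, speed = 4, upBoost = 2) {
  const dx = handPos.x - objPos.x;
  const dy = handPos.y - objPos.y;
  const dz = handPos.z - objPos.z;
  const len = Math.hypot(dx, dy, dz) || 1; // avoid dividing by zero
  return {
    x: (dx / len) * speed,
    y: (dy / len) * speed + upBoost,
    z: (dz / len) * speed,
  };
}
```

Applied as the object's velocity in a physics system, gravity then pulls the arc back down toward the waiting hand.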


In 2020 I hosted the Aframe WebXR Online Hacknight for 32 weeks, sometimes twice a week. We built the Aframe Gravity Gloves Component, inspired by the Half-Life: Alyx gravity gloves, along with other cool things. https://github.com/n5ro/n5ro.github.io/tree/master/gravitygloves


