Micah Blumberg

Programmer + Science & Tech Journalist.

SVGN.io
Jan 8, 2018 · 5 min read

Programmer & Journalist. Writing JavaScript since 2010. Co-created software for neuroimaging (EEG in VR). Since 2017 I have been creating Virtual Reality, Augmented Reality, Deep Learning, and BCI applications (with experience in EEG, fNIRS, EIT, and more) using WebXR, Aframe, Threejs, Mediapipe / Tensorflow (js), and other code libraries. I am a researcher of deep neural networks, computational neuroscience, computational biology, human and artificial cognition, and brain computer interfaces. My ultimate mission in life is to create full dive nerve gear (next-gen BCI) and self-aware networks (sentient artificial brains, also referred to as AGI). I am also a community organizer who hosts coding meetups, talks on neuroscience, and talks on tech topics.

Here is a link to a page with featured articles

Email: micah@vrma.io

Other ways to connect

In 2010 I was coding with JavaScript, HTML, CSS, and PHP inside Adobe Dreamweaver to build websites for my clients, and writing simple animations in ActionScript with Adobe Flash.

From 2012 to 2014 I ran a neurofeedback business in San Francisco that also did research to develop new brain computer interface experiences: light and sound patterns driven by EEG brainwaves.

In 2015 I pivoted to become a full-time journalist, intent on learning how to build a magazine in VR (with WebVR technology) and how to develop VR applications that could visualize sensor data.

In 2017 I started the “Neural Lace Podcast” series on YouTube; the discussions explore how future brain computer interfaces might work and touch on topics related to neuroscience, computational biology, deep learning, and more.

The Neural Lace Podcast

https://www.youtube.com/playlist?list=PLkuAPx_OL_kE35pOKBfEd2TIWWX_tfuvM

For two years, from January 2018 until December 2019, I was the organizer of NeurotechSF in San Francisco, part of NeuroTechX.

In 2018, for NeurotechSF, I organized coding meetups with the San Francisco hacker community at Noisebridge to bring EEG into WebXR.
In 2019, for NeurotechSF, I hosted a series of neuroscience talks at the Red Victorian Hotel.

In 2018, as the organizer of NeurotechSF, I led a weekly code night community project at Noisebridge, a maker space in San Francisco, to bring EEG (brainwave) signals into Virtual Reality (WebXR). In the beginning we had an OpenBCI EEG device and an HTC Vive, and we tried to create 3D printed parts to attach the OpenBCI sensors to the Vive. Our 3D printing efforts were not successful, so we ended up using a Brainduino EEG device with an Oculus Go.

To start, we served the EEG data from a desktop running a Debian distro of Linux. We used Python to create a server (later replaced with a Go server) and imported our EEG signals into the webpage over a websocket. We used the Fast Fourier Transform to divide the signal into frequency ranges such as Delta, Theta, Gamma, Alpha, and Beta, so with our two sensors we had 10 incoming signals to visualize. We then used a combination of Aframe and Threejs scripting to change the height and color of objects representing the incoming data, effectively creating a 3D time series in virtual reality. Users could walk around inside this representation of their brain activity with the Lenovo Mirage Solo VR headset, and they could click the visualization to rotate it in space.
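As a rough illustration of how the browser end of that pipeline fit together, here is a minimal sketch (not our actual Noisebridge code). It assumes the server pushes JSON frames of raw samples over a websocket and estimates the five band powers per sensor with a naive DFT; the endpoint URL, frame format, and sample rate are all invented for the example.

```javascript
// Minimal sketch of the browser side of the EEG -> WebXR pipeline.
// Assumes the Python/Go server pushes frames like {"sensor": 0, "samples": [...]};
// the endpoint and sample rate are hypothetical.
const SAMPLE_RATE = 250; // Hz (assumed)
const BANDS = {
  delta: [0.5, 4], theta: [4, 8], alpha: [8, 13], beta: [13, 30], gamma: [30, 50]
};

// Naive DFT band-power estimate: fine for a sketch; a real app would use a
// proper FFT library plus windowing.
function bandPowers(samples) {
  const n = samples.length;
  const powers = {};
  for (const [name, [lo, hi]] of Object.entries(BANDS)) {
    let power = 0;
    const kLo = Math.ceil(lo * n / SAMPLE_RATE);
    const kHi = Math.floor(hi * n / SAMPLE_RATE);
    for (let k = kLo; k <= kHi; k++) {
      let re = 0, im = 0;
      for (let t = 0; t < n; t++) {
        const angle = -2 * Math.PI * k * t / n;
        re += samples[t] * Math.cos(angle);
        im += samples[t] * Math.sin(angle);
      }
      power += (re * re + im * im) / n;
    }
    powers[name] = power;
  }
  return powers; // five bands per sensor -> ten signals for two sensors
}

const socket = new WebSocket('ws://localhost:8080/eeg'); // hypothetical endpoint
socket.onmessage = (event) => {
  const frame = JSON.parse(event.data);
  // Hand the per-band values to the Aframe scene (see the component sketch below).
  document.querySelector('a-scene').emit('eeg-bands', {
    sensor: frame.sensor,
    powers: bandPowers(frame.samples)
  });
};
```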

This video shows one of the first results: the WebVR EEG scatterplot work we were doing at NeurotechSF meetups https://twitter.com/worksalt/status/1055343502005923840?s=20

After that I significantly improved the code on the Aframe side, and this was a later result: https://photos.google.com/share/AF1QipP_l79r-yc8ghELZykQDL1eI7Xw82uumsFM7QNsqAAJNiEzZTXWhvnleticudX6jQ?key=aUpyWE81U3YxSC1SdXRKS0dtdFJUdklaelQ4MXJ3
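The heart of that visualization can be sketched as a small Aframe component: each box listens for the band data emitted by the websocket handler above and updates its height and color. This is a simplified, hypothetical version; the scaling constants and markup are invented, and the shipped version eventually merged the boxes into plain buffer geometry for speed, as noted below.

```javascript
// Hypothetical Aframe component: one bar per (sensor, band) pair, with height
// and color tracking the incoming band power. Scaling constants are invented.
AFRAME.registerComponent('eeg-bar', {
  schema: {sensor: {type: 'int'}, band: {type: 'string'}},
  init: function () {
    this.el.sceneEl.addEventListener('eeg-bands', (event) => {
      const {sensor, powers} = event.detail;
      if (sensor !== this.data.sensor) { return; }
      const height = Math.min(2, Math.log1p(powers[this.data.band]) * 0.25);
      this.el.setAttribute('height', height); // a-box geometry height
      this.el.object3D.position.y = height / 2; // keep the bar grounded
      // Stronger signals shift the hue from blue toward red.
      const hue = 240 - Math.min(240, height * 120);
      this.el.setAttribute('color', `hsl(${hue}, 80%, 50%)`);
    });
  }
});
// Usage: <a-box eeg-bar="sensor: 0; band: alpha" width="0.1" depth="0.1"></a-box>
```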

Updates to Neurohaxor EEG WebVR: I added a VR console that shows the user new buttons to teleport, turn right or left, or laser-click items, and I turned the boxes into plain buffer geometry, which makes the app run a lot smoother. Watch the new video here:

In 2019 I created a 3D carousel that cycles through items (2D pictures and 3D objects) when the user clicks next or previous, and I built a button interface for a video player inside WebXR. Both required some genuinely tricky JavaScript.

In 2019 I began to create some advanced UI/UX in Aframe. In this video you can see 3D models and 2D pictures moving on a 3D carousel I wrote that moves items between visible and invisible points when the user clicks next or previous; the entities change scale, rotation, and position with every click. It works in both AR and VR.
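The idea behind the carousel can be sketched roughly as follows (a hypothetical simplification, not the production component): keep a ring of slots, show only the first few, and on every next/previous click re-lay-out each item's visibility, position, rotation, and scale. The slot count, arc spacing, and scale factors here are invented.

```javascript
// Hypothetical sketch of the carousel: items slide one slot forward or back on
// next/previous, toggling visibility and updating position/rotation/scale.
AFRAME.registerComponent('carousel', {
  schema: {radius: {default: 2}, visible: {default: 3}},
  init: function () {
    this.items = Array.from(this.el.children); // assumes children exist at init
    this.offset = 0;
    this.el.sceneEl.addEventListener('carousel-next', () => this.step(1));
    this.el.sceneEl.addEventListener('carousel-prev', () => this.step(-1));
    this.layout();
  },
  step: function (direction) {
    const n = this.items.length;
    this.offset = (this.offset + direction + n) % n;
    this.layout();
  },
  layout: function () {
    this.items.forEach((item, i) => {
      const slot = (i + this.offset) % this.items.length;
      const onStage = slot < this.data.visible;
      item.setAttribute('visible', onStage); // off-stage items sit at invisible points
      if (!onStage) { return; }
      // Spread visible items along an arc in front of the user.
      const angle = (slot - (this.data.visible - 1) / 2) * 0.6; // radians, arbitrary
      item.object3D.position.set(
        Math.sin(angle) * this.data.radius, 1.5, -Math.cos(angle) * this.data.radius
      );
      item.object3D.rotation.y = -angle;
      // Feature the center slot at full scale, shrink the rest.
      item.object3D.scale.setScalar(slot === Math.floor(this.data.visible / 2) ? 1 : 0.6);
    });
  }
});
```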

This code demonstration shows Tilt Brush inside WebXR with Aframe working on an iPad. It runs on Oculus Go, Oculus Quest, Lenovo Mirage Solo, and Windows Mixed Reality with all the VR controllers working; on iPhone, iPad, and Android with a touch-screen UI; and in AR mode on iPhone, iPad, and Android. If I had other devices to test, I would eventually get it working on those as well.

I continued to host meetups in 2019, but they were more about presentations from people creating novel brain computer interfaces. In 2020 I decided to shift my focus back to writing WebXR code full time, with the goal of improving my programming skills so that I can eventually use that knowledge to build next-generation brain computer interfaces with devices such as functional near-infrared spectroscopy.

In 2020 I led a project to bring gravity gloves (originally featured in the famous videogame Half-Life: Alyx) into WebXR: you point at an object you want, click, and it flies toward you so you can catch it. I also implemented game physics so the objects could be thrown, and I helped develop virtual (3D) post-it notes and persistent objects whose properties were stored on a server and available to anyone who visited the webpage.

https://www.facebook.com/worksalt/videos/3673215699371717/

In 2020 I hosted the Aframe WebXR Online Hacknight for 32 weeks, sometimes twice a week. We built the Aframe Gravity Gloves Component, inspired by the Half-Life: Alyx gravity gloves, and other cool things. https://github.com/n5ro/n5ro.github.io/tree/master/gravitygloves
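The core gravity-gloves interaction can be sketched in a few lines; this is a hypothetical simplification rather than the component in the repo above. It leans on stock Aframe pieces (laser-controls, the raycaster component, and the controllers' triggerdown event); the pull speed and catch radius are invented.

```javascript
// Hypothetical sketch of the gravity-gloves interaction: point with the
// controller's raycaster, pull the trigger, and the target flies to the hand.
// Assumes hand and objects share a coordinate space (a real version would use
// getWorldPosition and proper physics for throwing).
AFRAME.registerComponent('gravity-glove', {
  init: function () {
    this.hovered = null;
    this.target = null;
    // Emitted by Aframe's raycaster component while pointing at objects.
    this.el.addEventListener('raycaster-intersection', (event) => {
      this.hovered = event.detail.els[0] || null;
    });
    // Emitted by the controller components when the trigger is pressed.
    this.el.addEventListener('triggerdown', () => {
      this.target = this.hovered;
    });
  },
  tick: function (time, delta) {
    if (!this.target) { return; }
    const handPos = this.el.object3D.position;
    const objPos = this.target.object3D.position;
    objPos.lerp(handPos, Math.min(1, delta / 200)); // step toward the hand each frame
    if (objPos.distanceTo(handPos) < 0.1) {
      this.target = null; // "caught" -- a real version would attach it to the hand
    }
  }
});
// Usage: <a-entity laser-controls="hand: right"
//                  raycaster="objects: .grabbable" gravity-glove></a-entity>
```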

Written by

Silicon Valley Global News SVGN.io

Silicon Valley Global News: Stories, Research, Advanced Concepts RE: Virtual Reality, AR, WebXR, AI Semantic Segmentation on 3D volumetric data, Medical Imaging, Neuroscience, Brain Machine Interfaces, Blockchain, Cryptocurrency, Drones, Light Field Video, Homomorphic Encryption.
