deep tech must be open sourced

elon musk noted that the best way to hedge against the potential negative effects of ai is to open source and distribute the tools and techniques as widely as possible. this is true.

however, the better reason for open sourcing deep tech is not the fear that elon alludes to but peer review: other people double checking the work itself.

simply put, there arent enough truly independent thinkers in the world who are able to understand a technology, question it down to first principles, and, in all likelihood, come across the work in the first place. with 7.3b+ people in the world, i know, it seems counterintuitive.

i wont step on elons toes and discuss ai here, but let me explain in terms of a field that is just finding its legs — biology.

unfortunately, theranos is a perfect example. ive been reluctant to write this for more than a year now.

first of all, their entire product, marketing, and mission are oriented around collecting a tiny sample of blood. that core design decision makes it impossible [see this paper] for them to actually deliver repeatable, reproducible results (you know, aka science).

after the press beating and government scrutiny, they touted their new minilab. from the pictures, you can clearly see that this altered version of a 3d printer chassis is an open system, has no sensor redundancy, and might stream data to the cloud (there is a wired ethernet cable).

why does this matter?

besides the fact that the whole company approach is (literally, scientifically) impossible, an open system means that environmental factors such as temperature, humidity, and pressure are not calibrated into the data readings. a minilab in california vs brazil vs china vs africa vs europe will sit in very different environments, which greatly affect the chemical reactions and biological activity of the samples, and the device wont be able to account for those differences. further, if this portable minilab is in an ambulance, or out in the field somewhere, erratic motion could very easily contaminate and/or destroy the samples, along with the entire system.
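to make that concrete, heres a toy sketch (mine, not theirs) of what environmental compensation looks like in software. every name and coefficient below is invented for illustration; real values would come from characterizing each device against known references:

```python
# a toy sketch, not theranos code: correcting a raw reading for
# ambient conditions. all coefficients here are invented; real ones
# would come from per-device calibration runs against references.

REFERENCE = {"temp_c": 25.0, "humidity_pct": 40.0, "pressure_kpa": 101.3}

# hypothetical linear sensitivities: signal change per unit of drift
COEFFS = {"temp_c": 0.012, "humidity_pct": 0.004, "pressure_kpa": 0.02}

def compensate(raw_signal, ambient):
    """apply a first-order environmental correction to a raw reading."""
    corrected = raw_signal
    for factor, ref in REFERENCE.items():
        corrected -= COEFFS[factor] * (ambient[factor] - ref)
    return corrected

# the same raw signal in a hot, humid field site vs a controlled lab:
field = compensate(1.00, {"temp_c": 34.0, "humidity_pct": 85.0, "pressure_kpa": 99.0})
lab = compensate(1.00, {"temp_c": 25.0, "humidity_pct": 40.0, "pressure_kpa": 101.3})
print(field, lab)  # roughly 0.758 vs 1.0: same sample, different numbers
```

an open system cant even attempt this correction: if the ambient conditions arent measured and the chassis isnt characterized, there is nothing to plug into the formula.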

streaming data to a central repository, in order to cross-calibrate devices and data readings, should be the new standard for all scientific and health instrumentation. hopefully the ethernet cable in the photo means theranos is actually doing this.
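heres a sketch of what a per-reading report might look like. the endpoint, schema, and field names below are entirely hypothetical; the point is that each reading carries device identity and ambient context, so a central service can cross-calibrate units against each other over time:

```python
# a minimal sketch of per-reading telemetry. the endpoint and schema
# are hypothetical, not any real theranos api.
import json
import time
import urllib.request

def report(device_id, assay, value, ambient):
    payload = {
        "device_id": device_id,  # which physical unit produced this
        "assay": assay,          # which test was run
        "value": value,          # the raw, uncorrected reading
        "ambient": ambient,      # temp/humidity/pressure at read time
        "timestamp": time.time(),
    }
    req = urllib.request.Request(
        "https://example.com/readings",  # hypothetical central repository
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # in practice: auth, retries, buffering

report("minilab-0042", "potassium", 4.1,
       {"temp_c": 31.5, "humidity_pct": 70.0, "pressure_kpa": 100.2})
```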

over time, as happens with every physical object, the components will deteriorate. because theranos hasnt built in redundant sensors, there will be no way to tell if the collective data in the centralized repository is accurate or not. shit.
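heres a toy illustration of why redundancy matters (all numbers invented). with three sensors measuring the same thing, a drifting unit sticks out against the consensus of the other two; with a single sensor, the same drift is indistinguishable from a real result:

```python
# why redundant sensors matter: a drifting unit is detectable only
# when there are peers to disagree with. numbers are made up.
from statistics import median

def check_redundant(readings, tolerance=0.05):
    """flag any sensor that disagrees with the group consensus."""
    consensus = median(readings)
    flags = [abs(r - consensus) > tolerance for r in readings]
    return consensus, flags

# sensor 2 has started to drift; the other two expose it.
print(check_redundant([1.01, 1.23, 0.99]))  # (1.01, [False, True, False])

# with a single sensor there is no consensus to check against:
print(check_redundant([1.23]))  # (1.23, [False]) -- drift looks "fine"
```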

basically, theranos has hundreds of millions of dollars and still doesnt have a core team of engineers who understand the intricacies of developing scientific hardware.

deep tech must be built in the open, at the very least, so that outside opinions can prevent mistakes that cost hundreds of millions, if not more.

you dont know me, but you can (hopefully) follow my logic. outside opinions dont necessarily come from well known names or carry grandiose titles, but they are absolutely necessary to the practical development of deep tech.