Tim Knowles
Aug 24, 2017 · 1 min read

Just me here, the coding illiterate, but I think the existential threat is not so much machine learning, AI, or AGI, but the weaponization of software.

I really don’t care what tool you used to develop the software for a deadly system; the threat is that you will create a weapon powerful enough to make nuclear weapons seem pleasant.

Nuclear war is obviously too scary to consider, but hamstringing a country’s infrastructure and releasing autonomous killing machines seems like a viable alternative. I imagine that Stuxnet and autonomous drones are just the tip of the iceberg. I don’t think either needs AI, but AI will make them deadlier and easier to market.

TEK


Tim Knowles
Worked in our nation’s space programs for more than 35 years