
Coral — The Human-Machine Merger Age
The Spiral Dynamics vMeme system has certainly provided new layers, dimensions, and insights into the long arc of humankind's development.
However, its co-creators, Beck and Cowan, essentially hit a roadblock at some point, because the future is cloudy and uncertain.
The pace of development in 2018 moves, in my estimation, at least 10 times faster than it did in 1980. If you factor in the generation of information (approximately 2.5 quintillion bytes of data every single day), it's no wonder predictions are becoming increasingly difficult.
But where they couldn't see, I believe we can now make out land through the "Fog of War": the battle to survive as a species into the future.
But in order to touch ground in the Coral Age, we humans will have to give up a LOT more than our "old & outdated beliefs, values, traditions, and customs."
We might have to give up many of the things that make us “human” after all.
The Expansion
“Cortex, meet your new friend, Artificial Intelligence.”
Elon Musk has succinctly put a phrase into the social ether by saying human brains have a "bandwidth problem."
This is where we can receive plenty of information visually, but our best output is speaking or typing. Nowadays, we mostly type with our thumbs.
In an age of hyper-connectedness, this makes a human a rather tight bottleneck when it comes to the flow of information.
Here's the thing: machines, computers, and Artificial Intelligence depend on the near-instantaneous flow, and computation, of data and information.
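To put the "bandwidth problem" in rough numbers, here is a back-of-the-envelope sketch. The typing speed, the five-characters-per-word convention, and the network link speed are all illustrative assumptions on my part, not measurements:

```python
# Back-of-the-envelope comparison: human output bandwidth vs. a machine's.
# All figures below are illustrative assumptions, not measurements.

TYPING_WPM = 40        # assumed average typing speed, words per minute
CHARS_PER_WORD = 5     # common convention for one "word"
BITS_PER_CHAR = 8      # one ASCII byte per character

# Bits per second a human can push out by typing.
human_bits_per_sec = TYPING_WPM * CHARS_PER_WORD * BITS_PER_CHAR / 60

GIGABIT_LINK = 1e9     # a commodity 1 Gb/s network link, bits per second

ratio = GIGABIT_LINK / human_bits_per_sec
print(f"Human typing: ~{human_bits_per_sec:.0f} bits/s")
print(f"A 1 Gb/s link is ~{ratio:,.0f}x faster")
```

Even with generous assumptions, the gap is tens of millions to one, which is the bottleneck the rest of this section is about.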
So, in a world where more and more of our lives are run by A.I., it would seem that machines themselves will have a choice to make.
Do they make humans obsolete? Or is there some way to enhance a human to retain their "human-ness" while augmenting them to have a higher bandwidth?
I'm sure that, if given the choice, humans would rather not engage in full-scale removal by our machine overlords.
So, what can be done?
A Human-Machine Interface

Rather than focus on the specific technologies one could use to make this system, let’s instead look at what this system would accomplish.
Brain Monitoring + Pattern Recognition

In monitoring brain activity, it is possible to generate highly visual information, as seen here in a real-time video of streaming brain activity.
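The "pattern recognition" half of this idea can be sketched with a toy example: given a known activity signature, find where it best occurs inside a longer, noisier signal. Every number below is invented for illustration; real brain-monitoring pipelines are far more sophisticated than a sliding-window match:

```python
# Toy pattern recognition: locate a known "signature" inside a noisy signal
# using a sliding window and a sum-of-squared-differences score.
# Both sequences are made-up numbers, purely for illustration.

template = [0.0, 1.0, 3.0, 1.0, 0.0]   # the activity pattern we want to spot

signal = [0.1, 0.0, 0.2, 0.1, 1.1, 2.9, 0.9, 0.1, 0.0, 0.3]

def best_match(signal, template):
    """Return the offset where the template fits the signal best
    (smallest sum of squared differences)."""
    n = len(template)
    scores = [
        (sum((signal[i + j] - template[j]) ** 2 for j in range(n)), i)
        for i in range(len(signal) - n + 1)
    ]
    return min(scores)[1]   # tuple comparison: lowest score wins

offset = best_match(signal, template)
print(f"Pattern most likely starts at sample {offset}")
```

The point is not the algorithm itself but the principle: once activity is turned into data, a machine can reliably re-find the same pattern, which is exactly what Computer Vision tooling industrializes.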
This essentially allows one to tap into an amazing set of tools developed for industrial applications, and most commonly used in self-driving cars: Computer Vision.
Below, we see a Computer Vision system mapping an entire wing of the University of California, Berkeley.
Essentially, the machine has no idea what the terrain will look like, but it is able to receive enough data to create a map of the building.
This information would be usable for an autonomous vehicle to navigate the halls with minimal error.
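The mapping idea can be sketched in miniature, assuming a simple grid world: the mapper starts knowing nothing about the layout and fills in an "occupancy grid" by sensing neighboring cells as it moves, much as the vehicle above builds its map of the building. This is a toy illustration of the general technique, not the actual system in the video:

```python
# Toy occupancy-grid mapping: explore an unknown layout by sensing
# adjacent cells, flood-fill style. Purely illustrative.

# The "true" building layout, unknown to the mapper: 1 = wall, 0 = free.
TRUE_MAP = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

UNKNOWN, FREE, WALL = -1, 0, 1

def explore(true_map, start):
    """Walk every reachable cell from `start`, recording what the
    'sensors' report about each neighboring cell."""
    rows, cols = len(true_map), len(true_map[0])
    learned = [[UNKNOWN] * cols for _ in range(rows)]
    learned[start[0]][start[1]] = FREE
    frontier = [start]
    while frontier:
        r, c = frontier.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and learned[nr][nc] == UNKNOWN:
                if true_map[nr][nc] == 1:
                    learned[nr][nc] = WALL   # sensor reports an obstacle
                else:
                    learned[nr][nc] = FREE   # open hallway; keep exploring
                    frontier.append((nr, nc))
    return learned

learned_map = explore(TRUE_MAP, (1, 1))
```

Every cell the mapper has sensed now agrees with the true layout, even though it never saw the floor plan, which is why a map built this way is good enough for repeatable navigation.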

So when we start to think of this Artificial Intelligence as a "perfect stranger" type of fellow, we can drop it right into the cortex and allow it to "move" around and map out the territory.
Now, we as humans might not really be able to make perfect sense of what the machine is seeing and interpreting, but that doesn't entirely matter if it can repeatedly reach the same location.
But, you might be wondering… “How would a machine ‘move’ inside of the cortex?”
Ah, great question!
We must give it a pair of legs to walk around with, of course!
Giving Our A.I. ‘Legs’
An autonomous driving vehicle wouldn’t be very useful if it had no tires, right?
So an intelligent, sensing A.I. connected to your brain would likewise be useless if it could not navigate the territory it was witnessing.
So how does one give this vehicle its legs?
Well, that will be the subject of my next post!

