Thanks. That’s another huge topic I didn’t try to address here.
Chris Dixon

The thing about automation is that we haven’t yet learned how to AI human insight or decision making (yes, I verbed “AI,” and I verbed “verb” too; I think the trend should grow). This is for two reasons: 1) insight requires that something (a human) decides what is important, then aggregates data (more and more, a machine), sifts data (again, more and more, a machine), and finally derives insight according to that decision about what is important (again, a human); 2) decision making requires emotion, and I think we are a long way off from creating AI with human emotion (although machine learning is developing ways to predict action based on emotion). The “best practice” of decision making is for cold logic to compile the facts, and then, in cases of tie-breakers, for emotion to pull the trigger. But you can’t forget that emotional want carries real weight in the value of any option, regardless of logic.

My point? Humans will be needed to decide what humans want in life. Automation always kills jobs and skills, but humans always adapt to fill the real need in life: deciding what that real need actually is (and innovating amazing new futures and automating more stuff along the way).
