Hey Ev, I definitely agree that “It’s rarely as nefarious as you can make it seem from the outside.” That’s so true of most “conspiracy theories.” They’re just a side-effect of our brains’ biases and pattern recognition.
But consider that perhaps the product team believes they are trying to make the best product they can; and that they likely don’t have clear access to their own intentions, nor the agency to keep those intentions ethically sound.
I once visited a doctor about a thing on my foot. He mentioned that he had a new laser machine, and that he could remove the thing nearly instantly, but that it would be pretty expensive. I asked him about other options — over-the-counter stuff or prescriptions — and, while convincing himself of his impartiality with phrases like “I don’t care either way, this is my professional opinion,” he painted a bleak picture about the prospects of the thing ever going away without his laser machine.
Fortunately, I was in the middle of reading Dan Ariely’s book The Honest Truth About Dishonesty, and I remembered his own story about a doctor who got angry at him for not wanting to try an experimental treatment. It turned out that doctor only needed one more test subject to publish his paper, and that’s why he was so pushy.
So, I opted against the laser treatment, and in a few weeks, the thing disappeared.
Interestingly, a few months later, the doctor was out of business.
I believe this doctor believed that he wasn’t biased, but he really was. He may not have thought that the recent expense of the new machine, or the pressure of high city rents, was influencing his recommendations, but they were.
Even if the product and the money-making teams are separate teams, with separate intentions, they are body parts of the same beast. That beast, as Jesse has pointed out in his response, is a publicly-traded company.
As much as members of the product team may be convinced they’re trying to make the best product, they are still beholden to the desires of the beast.
The beast has to make money to survive. In this case, the beast needs advertising dollars to survive. If the beast increases time spent on its platform, it will make more advertising dollars.
This may subtly redefine best in the minds of product team members to mean that-which-increases-time-spent on the platform, and cause the employment market to favor those who believe in that metric. (I myself checked out of Silicon Valley because of these broken metrics.)
That which is good for us and that which we spend lots of time on are different things. A human mind is full of weaknesses — like joints in the armor of a medieval knight — that can be exploited, which seems to be the point of the amazing Tristan Harris article you shared.
You already know the attention economy is unsustainable, and you certainly have more experience than I in creating the best products. But consider for a moment how you yourself may be biased on this matter. Claiming that an algorithmic feed may actually reduce time spent on a platform, and that a switch to an algorithmic feed is not driven by a desire to maximize monetization — despite massive economic pressure to the contrary, and the power of the biases within all of the players involved — makes it sound like you’re fooling yourself.
This may seem like something over which we have no control — that the market has spoken, and that we have no choice but to mine the collective attention of humanity the same way we might deface a mountain of iron ore. But this could turn out to be a very big problem, and everyone in technology should be thinking about how to solve it.
(With 100% respect from a fellow Nebraska kid.)