I never put together that community management (and, in this particular case, content-focused contributions from a community) can be a form of representative democracy.
For a long time, other community pros and I have worked to separate signal from noise while faithfully representing the sentiment and voices of those in our greater community. We know it can't all be done by bots or algorithms, but those tools do help humans, or in your words, editors, scale their work.
I think you're right to focus on Google's work as an example, because so much of what they're seeking centers on trust (is this a trusted source?) rather than cut-and-dried popularity. I would push back on your argument that there isn't wisdom in crowds. The problem is really in the crowds these sites create. Social sites like Facebook have self-selecting crowds that don't meet the requirements of a healthy crowd: diversity, independent thinking, and aggregated knowledge.
I wonder whether actively diversifying a person's feed with trusted or edited sources would improve the quality of that feed and possibly increase engagement too. Bassey Etim, the NYTimes community editor, gets this. He found that their new generation of readers wants to compare opinions. Maybe Facebook isn't accounting for the possibility that the older generation now joining is skewing its algorithms away from a comparison-hungry younger audience. Maybe diverse feeds hurt their ability to target promoted content. I'm not sure. But I do think the social-based news site that figures this out will be a force to be reckoned with, and will hopefully take down sites that are willfully or mistakenly acting as propaganda machines.
Very interesting read and thoughts! Thanks for sharing and sparking investigative thinking around this topic. Found via Eric Friedman, so thanks to him too.