Solving misinformation for the public, not for experts.
(a poorly written, unedited, brain dump — because I need to publish more often and stop criticising myself into inaction)
Solving the misinformation problem for experts looks very different to solving it for the public.
Experts can inspect the system, question why things were done, understand the reasoning, and then decide whether or not to trust the system. They also have a base of knowledge about the information being arbitrated, which gives them confidence in the system when it produces results that match their expert knowledge.
Lay-people generally can’t do either of these things, so they end up, once again, dependent on experts telling them whether the system is trustworthy. If that worked, we wouldn’t need a system to tell people what to trust in the first place; they would already be listening to the experts. But instead we have almost 50% of the population who refuse to believe that climate change is man-made, and as much as 40% of the US population who believe that all life was created by God in its current form.
The public don’t trust experts any more, and they have no interest in any ‘solution’ to misinformation which tells them what is and is not true, or who can and can’t be trusted. This is why every attempt to address misinformation that takes this approach is not solving it at all.
To be clear — these tools are useful. They are useful to experts and organisations who need to quickly (or automatically) decide whether they will publish information or not. They are useful to companies who have a reputation for truth and accuracy (amongst certain crowds), and wish to maintain that reputation. They are useful to people trying to make sense of the misinformation problem that the world is currently facing. They just aren’t a solution to the problem.
The Real Problem
The real problem is with the public, the global population of humans, and how they continue to believe and promote things which are not true.
The misinformation problem isn’t a problem with misinformation, it is a problem with false beliefs.
No matter how hard you make it for people who believe in things which are not true to spread their ideas, they will continue to find ways to do it. In fact, the more you exclude them from online spaces, and cut their funding and ostracise them from the community, the more persecuted and justified they will feel in breaking the rules and using every technique available to them to push their ideas out there.
You don’t correct false beliefs by telling the believers “You are wrong.” That has never worked, and never will.
You also don’t correct false beliefs by ostracising the believers from society, and isolating them together in little echo chambers that no one else dares go into ‘because of all the stupid.’ That just allows the ideas to circulate and reinforce themselves without resistance or criticism.
The only way that people have ever changed their minds (on important, core beliefs) is through ongoing dialectic engagement with people who disagree with them, over the course of (usually) years.
No single argument changes a mind. No single authority figure changes a mind. Minds are changed piece by piece, argument by argument, crack of doubt by crack of doubt. Eventually, the believer simply realises that they no longer believe, and they move on with their new life with their new beliefs, whatever they may be.
Solving the Problem
Changing the mind of a true believer is very hard. Luckily, changing the mind of someone who has believed for a week isn’t so hard. Better yet, preventing someone from forming a false belief in the first place is just a matter of being in the right place at the right time with the right information.
So solving the misinformation problem for the public (not for experts) must allow people to take their beliefs into the public sphere and engage with others in a way which doesn’t make them feel persecuted or vilified. They must be allowed to hold those beliefs while simultaneously being regularly prompted to challenge them. And any system which does this must do so equally for all beliefs, for, as far as I know, humanity hasn’t yet figured anything out with 100% certainty.
The internet brought us into the information age. All information. All the time. Everywhere. Now we just need to fix it so that it applies a small, constant counter-pressure to how it presents that information, reminding us fallible humans and our weak, easily manipulated brains that no one knows anything for sure, that lots of people disagree with what you are reading, and that critical reading is crucial for all the information we consume.
And it had better give everyone access to a simple solution which helps them take the next step when they don’t know what to do on their own. For example, take them to a high-quality critique of whatever they are reading, since that is critical analysis in action: a demonstration of what they, perhaps, should be doing themselves. Ideally, that critique checks that the claims are accurate, adds in any omitted facts, contextualises those facts, assesses the credibility and motivations of the author(s) and/or publisher(s), and breaks down the logical fallacies and tricks used in the piece.
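One way to picture that "next step" helper is as a simple lookup from a page to the critiques written about it, where each critique carries the elements listed above. This is a minimal sketch in Python; every name here is invented for illustration, since the post doesn't describe any concrete system:

```python
from dataclasses import dataclass, field

# Hypothetical record for one critique of a published piece.
# Field names mirror the elements of a good critique described
# in the post; none of this is a real, existing API.
@dataclass
class Critique:
    source_url: str                                          # the piece being critiqued
    claim_checks: list[str] = field(default_factory=list)    # accuracy of individual claims
    omitted_facts: list[str] = field(default_factory=list)   # relevant facts the piece left out
    context_notes: list[str] = field(default_factory=list)   # context for the facts it did include
    credibility_notes: list[str] = field(default_factory=list)  # author/publisher motivations
    fallacies: list[str] = field(default_factory=list)       # rhetorical tricks identified

def critiques_for(url: str, index: dict[str, list[Critique]]) -> list[Critique]:
    """Return all known critiques of a page, or an empty list if none exist."""
    return index.get(url, [])

# Example: a browser extension could query an index like this
# whenever the reader opens a page, and surface any matches.
example = Critique(
    source_url="https://example.com/article",
    claim_checks=["The headline claim is not supported by the cited study"],
    fallacies=["Appeal to authority in paragraph 3"],
)
index = {example.source_url: [example]}

assert critiques_for("https://example.com/article", index) == [example]
assert critiques_for("https://example.com/other", index) == []
```

The key design point is that the index maps pages to critiques, not to verdicts: it never tells the reader what is true, it only makes dissenting analysis one click away.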
Read enough critical analyses, and you too can learn the sort of thinking, the tricks of the trade, and the steps taken to verify content before accepting it as true.
And an Internet which always makes it easy to access critical content for any webpage or link you see will be an Internet which normalises skepticism, promotes critical thinking, and teaches those skills by demonstration.
That is how you solve the misinformation problem. You normalise and promote skepticism and critical thinking for everyone, equally, and let them figure out the rest for themselves. You let them change their own minds. You let them make up their own minds.
Because that is what they are going to do anyway. You might as well give them the best tools possible to help guide them to good thinking, since good thinking is the only thing we have any handle on so far.