Fixing Facebook
The architecture of social media platforms like Facebook poses severe threats to humanity. Here are some solutions to reform them and limit the harm.
What if the law didn't mandate wearing seat belts? What if tobacco companies were not forced to add warning labels? What if the FDA did not regulate the sale of drugs? All of these have direct implications for human lives, so citizens, at least the majority of them, agree to have regulations in place to govern the activities of corporations that stand to profit from products sold in those realms.
Who would have imagined even ten years ago that the world's oldest democratic system could come under attack? Who knew that an opinion from a largely anonymous source would lead to hundreds of members of a minority religion being killed in the world's largest democracy? Or that a vaccine scientifically proven to stop a pandemic would become a topic of political debate? In every one of these scenarios and more, Facebook had an enormous role to play. What started as a way to connect the world's seven billion people has become the de facto news platform for many, with algorithms feeding confirmation bias. Without moderation, with little to no regulation, and with content creators who, for the most part, have no journalistic ethics, we may have created an information Frankenstein. The early mission of Meta, the social technology company that owns Facebook, was to "give people the power to share and make the world more open and connected." However, the social platform has turned into an agent for targeted harassment, human trafficking, terrorist recruitment, emotional exploitation, and genocide.
As responsible citizens and advocates for information privacy, we are truly at a turning point where we must shift the conversation toward regulating these platforms and establishing policies that curb the harmful effects of social media.
How do we rewire the social media system?
The NHTSA (National Highway Traffic Safety Administration) conducts crash tests on new cars and publishes its findings and procedures. Auto safety features are tested and certified by this agency, making car safety regulations work uniformly. Take the case of medical journals, where submissions must be peer-reviewed. The best solutions for safety come from a robust, competitive search for knowledge, whereby disparate researchers critique one another's work and try to improve it. Social media platforms need a similarly defined, organized framework. Here are some solutions from my point of view.
Hold platforms accountable for their algorithmic designs that amplify false information.
Frances Haugen, the Facebook whistle-blower, disclosed that the platform's algorithms are designed to promote content that generates the most engagement. Facebook knows that the more time users spend on its platforms, the more data it can collect and monetize. The platform should consider redesigning its products to optimize for more holistic measures: turn off engagement-based ranking and gravitate toward designs that show posts in chronological order. Additionally, content interventions could be used to create awareness of false information and point users toward accurate information. Accuracy nudges could ensure that not all content is treated equally: false information is reported so that algorithms can be trained to identify lies automatically. Fake news can be labeled, sources can be made transparent, and advertising can be prohibited next to false content, with resharing limited. Facebook could adopt an approach similar to the one WhatsApp took to slow the spread of Covid-19 misinformation.
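To make these interventions concrete, here is a minimal sketch in Python, using hypothetical names of my own (a Post record and a build_feed function); it illustrates chronological ordering combined with labeling and reshare limits for flagged content, and is not a description of Facebook's actual system.

```python
# Illustrative sketch only: chronological feed plus accuracy nudges.
# All names (Post, build_feed, flagged_false, etc.) are hypothetical.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    predicted_engagement: float   # what an engagement-optimized ranker would use
    flagged_false: bool = False   # set by fact-checkers or a trained classifier
    label: str = ""
    resharing_enabled: bool = True

def build_feed(posts: List[Post], chronological: bool = True) -> List[Post]:
    """Return a feed: chronological by default, engagement-ranked otherwise."""
    for post in posts:
        if post.flagged_false:
            # Accuracy nudge: label the post and limit its amplification.
            post.label = "Disputed by independent fact-checkers"
            post.resharing_enabled = False
    if chronological:
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```

The point of the sketch is that the ranking objective is a design choice: the same feed can be ordered by time rather than by predicted engagement, and flagged content can lose its reshare button without being removed.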
Pledging transparency
Kids aged 7 to 9 are using social media apps these days, and we don't have enough information about what social media companies are doing with their data. We need stronger laws to protect our children online. Instagram has played a huge role in directing kids to accounts that glorify eating disorders, making it clear that online algorithms can lead to real-world harm. If Apple can give its users the option to decide whether their data is tracked, why can't Facebook provide more transparency about how users' data is collected and used? There should be laws requiring that this data be shared with an independent panel of researchers who can examine peer-to-peer sharing of misinformation and other trends.
Public representatives should oversee Facebook from the inside.
Just as the Federal Reserve embeds examiners inside large banks and the SEC oversees public companies, Congress should start by creating a new digital regulator. A comprehensive federal privacy law could be written, or Section 230 could be reformed to require that companies like Meta demonstrate compliance with best practices for countering illegal content. A national privacy regulation and an internet governance framework for social media platforms should be established.
Quitting countries where the company cannot devote resources or establish cultural competence
The power of people to use the platform to express themselves, join forces, and challenge authority is most visible in places where institutions are weak or corrupt and where citizens have never had a voice. It is also in those places that Facebook has done the most harm and where the company and the world have paid the least attention. In Myanmar, the military turned the platform into a propaganda tool for genocide, and in Ethiopia it was used to incite ethnic violence. Employees at Facebook had flagged that human traffickers in the Middle East were using the platform to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work. One solution, advocated by Shira Ovide, a N.Y. Times technology columnist, in her September 21 column titled "Shrink Facebook To Save the World," is for Facebook to get out of some countries. The platform did nothing to stop violence in these areas, so it should simply leave countries like Myanmar and Azerbaijan until it can allocate the same level of money, attention, and cultural competence to those places as it devotes to its presence in the U.S. and France.
Reviving competition
Wouldn't it be a win-win if social media giants competed against each other to safeguard consumers' privacy? But while competition can push platforms toward designs that protect privacy and society's norms and values, the market forces that push social media companies toward monopolies will remain even if Facebook is dismantled.
Civilizing design
Facebook's own data shows that 64% of people who join extremist Facebook groups do so because of the platform's recommendations. When platforms are designed for use at such a global scale, policy must precede design. Each product on the platform must be regulated for safety, ensuring that the engineers, executives, corporations, and designers who work on it are certified with security and safety clearances. The engineering core must be diversified to include social scientists, cognitive scientists, and people trained in different kinds of history.
Finding the balance between free speech and hate speech
Section 230 makes a free and open internet possible. Removing it would limit internet freedom and make many of the world's largest online businesses unworkable. Free speech must be protected, while harmful speech and its amplification must be curtailed. Social media companies must have the tools to distinguish between speech and reach: protecting speech does not confer a right to the amplification of harmful speech. There must be laws so that businesses can draw these sensible boundaries while protecting people's civil and social norms.
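As an illustration of the speech-versus-reach distinction, here is a minimal sketch with hypothetical names of my own (DistributionDecision, decide_distribution); it shows how lawful-but-harmful content could stay visible while losing amplification, and does not describe any platform's actual policy engine.

```python
# Illustrative sketch only: speech is preserved, reach is a separate decision.
from dataclasses import dataclass

@dataclass
class DistributionDecision:
    keep_visible: bool           # the speech itself is not removed
    allow_recommendation: bool   # eligible for algorithmic boosting
    allow_reshare: bool          # eligible for viral resharing
    allow_adjacent_ads: bool     # ads may appear next to it

def decide_distribution(is_illegal: bool, is_harmful_but_legal: bool) -> DistributionDecision:
    if is_illegal:
        # Illegal content (e.g., trafficking, incitement to violence) is removed outright.
        return DistributionDecision(False, False, False, False)
    if is_harmful_but_legal:
        # Lawful-but-harmful speech remains up, but its amplification is withheld.
        return DistributionDecision(True, False, False, False)
    return DistributionDecision(True, True, True, True)
```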
Conclusion
Facebook started with a powerful objective: to share information and connect people globally. I, for one, have used it to reconnect with long-lost friends and distant family members and to join groups of like-minded people in new geographies. Like Google for search and Amazon for retail, Facebook is and will continue to be the largest networking platform; there is a critical mass that cannot be displaced. It is a product of American innovation, and we wouldn't want to stymie that system.
Maybe the same system that brought it to life could help put up guardrails. In this regard, I believe advertisers can exercise their power by refusing to spend or by penalizing Facebook for the propagation of false content. In addition, large investors in Facebook could take a stand akin to the positions hedge funds are taking on their prior investments in fossil fuel companies. Government should then reinforce these measures by putting in place common-sense regulations to prevent corrosive effects on democratic and social norms.
Ultimately, we, the people who engage with these platforms, need to realize the role they play, and shouldn't play, in our lives. So, as the world opens up from the pandemic, I am hopeful that humans will be able to see each other eye to eye, have empathy for each other's positions and circumstances, and develop an appreciation for the limitations of digital networks in driving human relationships.