On a hot, muggy day in October, an exotic pet trapper in an Indonesian forest snatched a young Javan gibbon from its mother, stuffed it in a sack and took to his heels. A day later the gibbon, a protected species, was offered for sale on Facebook. A scroll down on the trafficker’s timeline reveals more gibbons, birds, and other endangered species offered for sale.
The sale of gibbons and other endangered species is illegal under Indonesian law. In March of this year, traffickers in Indonesia were arrested and tried for the illegal sale of Komodo dragons on Facebook.
The same would not be true for another accomplice in this crime sequence: Facebook. When it comes to crime on social media, the enabler always walks free. It’s time for regulators to take steps to hold online platforms accountable for facilitating the illegal trafficking of wildlife.
For traffickers engaging in some of the world’s biggest black-market trades, Facebook Inc. is the enabler. The company serves as a vehicle for thousands of traffickers who use Facebook, WhatsApp, and Instagram to market illegal goods, connect and negotiate sales with buyers, and even receive payments.
More than two decades ago, the U.S. Congress passed the Communications Decency Act (CDA), which included Section 230. The bill was meant to mitigate the risk for firms of hosting third-party content on Internet platforms. Senator Ron Wyden, one of the bill’s sponsors, said that CDA 230 was envisioned to provide a “sword and a shield.” The “sword” was meant to enable technology firms to self-police content on their platforms as they saw fit. The shield provided those platforms with sweeping immunity from liability for content posted by third parties. As it turned out, the sword was made of rubber while the shield was Teflon.
Tech firms broadly, and Facebook in particular, have failed to hold up their end of the bargain. Huge cyberspace marketplaces exist where buyers and sellers trade illegal products ranging from drugs, wildlife, antiquities, and human remains to human beings themselves. Facebook’s closed and secret groups provide insulated environments for transnational criminals to connect, advertise, and move material.
Facebook has a set of policies banning illegal activity, laid out in its Terms of Service and Community Standards. But these are only as effective as their enforcement, and the company’s content moderation leaves much to be desired.
Facebook, and other social media firms, mainly rely on algorithms and artificial intelligence to moderate harmful content. But investigations by the Alliance to Counter Crime Online (ACCO), where I am a contributing member, show time and again how these algorithms actually connect traffickers faster than moderators can remove them. They suggest friends and recommend groups, putting illicit actors in touch with one another, continually expanding networks of users engaging in similar illegal activities.
In response to increasing pressure from wildlife organizations, Facebook and Instagram banned the sale of all animals in 2017, but a cursory search on either platform will turn up countless groups and individuals still advertising domestic and exotic pets, even zoo animals, for sale. Ivory and rhino horn are sold, using code words, in closed groups. The same is true on other online platforms such as Alibaba, Taobao (owned by Alibaba), Google, Baidu and more. Along with Facebook and Instagram, these platforms are all founding members of the Global Coalition to End Wildlife Trafficking Online. Launched in May 2018, the coalition’s stated goal is to reduce online wildlife trafficking by 80 percent by 2020. The technology companies involved in the Coalition are very far from achieving that goal.
The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) has 183 Parties, representing nearly all of the world’s nations. The organization, grappling with how to address wildlife cybercrime, recently introduced a revised resolution that included key recommendations to deal with these online issues. The amended resolution was accepted, making clear that CITES Parties recognize that it is up to them, and not to the online platforms, to develop measures to control the illegal trade in wildlife.
A crucial revision noted that the resolution recommends that Parties “identify key contacts at online technology and data companies that can facilitate the provision of information upon request from Parties in support of investigations.”
In November 2016, one such key contact emailed me after seeing an article in the New York Times regarding my work identifying the role of Facebook Inc.’s platforms in the rampant illicit ape trade. Max Slackman, Facebook’s animals policy manager, wrote to me, in part: “We would like to learn more about your investigation and if there are additional learnings you can share …. In the meantime, please send us over any Facebook and Instagram accounts, Pages, or groups that offer to sell endangered animals. We will investigate immediately.”
I sent him information and asked if there was any way that he or Facebook could help in the investigation of the dealers and assist in arrests and prosecutions. As ACCO researchers have pointed out time and again, closing accounts doesn’t stop the trafficking; traffickers simply set up new accounts with tighter privacy. Not only that, but Facebook’s practice of deleting accounts, rather than archiving and disabling them, erases years of valuable evidence that could actually help prosecute these criminals and disrupt the illicit trade.
Individuals selling illegal commodities online can be prosecuted, but there are not yet any legal pathways in the U.S. that formalize and regulate cooperation between online service providers and law enforcement, mainly due to privacy policies and regulations.
The crux of the problem in getting effective cooperation from the titans of Silicon Valley to push illegal and dangerous activities offline is the uncomfortable fact that such cooperation conflicts with their business model. The online service providers make their money from user engagement and ‘clicks,’ whether it be to purchase a commodity, to read an advertisement, or, in the case of online black markets, to do business with a trafficker. Any actions that service providers take to reduce user engagement reduce their bottom line.
Big tech has made clear it isn’t interested in wielding its sword. It’s up to governments to enact legislation that compels social media firms to modify their algorithms to detect illegal activity instead of facilitating it. Facebook, Google, and other technology firms are sophisticated and rake in billions in annual revenue. They’re more than capable of combating the crime on their platforms, but there’s nothing in the law that requires them to do so. That needs to change. The U.S. Congress has started the discussion. Other countries where platforms implicated in illegal wildlife trade are based, particularly China, need to step up their enforcement actions as well.
Legislatures will have to take on the admittedly difficult task of assigning responsibility to online service providers: deciding how to balance free speech, legitimate commerce, and user privacy against dangerous communications and illegal trade, and requiring platforms to report user abuse to the appropriate national and international authorities. The current service provider response of simply removing posts or closing accounts does not solve the problem.