The world’s largest social media company does more than just connect people. Facebook has also become a repository for massive online criminal markets and terrorist groups. Outdated technology laws have created an environment where social media giants generate billions of dollars in revenue without any accountability for this and other illegal activity occurring on their platforms.
One-third of the world’s population logs onto Facebook platforms, benefiting from a digital space for shared ideas and a network for global activism. Those same platforms have become ground zero for organized crime syndicates to connect with buyers, market their illegal goods, and move money, using the same ease of connectivity enjoyed by ordinary users. Facebook and its related companies are also used by terrorist groups as a megaphone for propaganda, a platform for recruiting new members, and even a source of financing. This illegal activity often occurs out in the open through Facebook groups and pages, two staple features of the platform.
Instead of acknowledging that Facebook's technology is being used for illegal purposes and fixing the problem, CEO Mark Zuckerberg has hidden behind the immunity he claims is provided by Section 230 of the Communications Decency Act of 1996, which courts have interpreted to mean that tech companies shouldn't be held liable for content posted by third parties.
As the House Committee on Homeland Security prepares to question policy leaders at major tech companies, including Facebook, we want to point out the fundamental problem with this approach. The algorithms Facebook has touted to connect the world have connected criminals and terrorists faster than Facebook’s own beleaguered moderators can delete them. The negative impact of this illegal activity is affecting our communities, our cultures, and our environment, and it’s happening in the same digital spaces where our children play, our families connect, and our companies advertise.
Facebook has enabled sex trafficking, both inside the United States and abroad. A Texas woman who was lured into prostitution at age 16 by another Facebook user sued the platform for allowing traffickers to "stalk, exploit, recruit, groom and extort children."
Rare and endangered species from tigers to reptiles are also widely trafficked on Facebook. More than 80 percent of the ape trade is now on social media, and multiple tons of elephant ivory are being sold monthly across Facebook.
The platform’s black markets are not limited to crimes affecting the living; they also affect the dead. Groups trading in human remains on Facebook and Instagram exchange macabre items of questionable legality and origin ranging from Tibetan skull caps to babies in jars.
Archaeologists investigating the illicit antiquities trade on Facebook have recorded tens of thousands of artifacts trafficked from conflict regions including Syria, Iraq, and Yemen, a war crime. The ATHAR Project is monitoring almost 2 million regular users who log onto a collective 95 groups serving as digital black markets for priceless artifacts plundered from across the Middle East and North Africa.
Investigators at the Global Health Policy Institute have tracked illegal drug sales, from prescription opioids like OxyContin to fentanyl-laced pills, all widely available on Instagram and Facebook. Facebook itself admitted to finding and removing more than 1.5 million listings for illegal drug sales, including heroin, cocaine and meth, within the past six months. The narcotics marketplace on Instagram is targeted to teenagers. Its scale remains unknown.
Putting this in perspective, the notorious dark web platform called the Silk Road never posted more than 250,000 ads at any given time. Facebook is hosting six times that many. In other words, Zuckerberg has succeeded in bringing the Silk Road to Main Street.
In light of all this, Zuckerberg's announcement at F8 that he plans to shift the platform design to focus on groups, and Facebook's plan to launch a cryptocurrency, are downright alarming.
There’s no reason to believe these changes will make user data any more secure. After all, Facebook hasn’t changed its fundamental business model. But the changes will make it harder for authorities and civil society groups to track and counter illegal activity on the platform. Groups are already the epicenter of black-market activity on Facebook.
The firm’s continued negligence in the moderation of criminal content makes clear that the time for self-regulation has passed.
The challenge is that federal laws take time, something that human trafficking victims, drug addicts and endangered species don’t have. But there are other ways U.S. regulators can address crime on social media. Facebook’s IPO may hold the key to effective regulation.
When Facebook went public in 2012, the firm voluntarily entered into a strict regulatory regime that negates CDA 230 immunity in the context of Facebook’s obligations under securities law. The firm’s lack of internal controls and effective compliance programs implicate potentially serious securities law violations. Congress can influence immediate action by asking the Securities and Exchange Commission to utilize its existing regulatory power.
As a result of Facebook’s failure to establish appropriate internal controls, criminal activity has accelerated on its platform and continues to grow. It’s time to make social media a safer space for all. Congress and the SEC must act before it’s too late.
Gretchen Peters is executive director of the Alliance to Counter Crime Online. Amr Al-Azm is a co-founder of the Antiquities Trafficking and Heritage Anthropology Research (ATHAR) Project and is a professor of Middle East history and anthropology at Shawnee State University in Ohio.
Correction: A previous version of this op-ed misstated the nature of Facebook’s drug advertising.