
We’ve Tracked Extremist Content on Facebook for Years: It Doesn’t Get Removed for Long

As Facebook bends to mounting pressure to stem the spread of domestic extremist content and conspiracy theories, declaring an expansion of hate-speech policies and shutting down recruiting efforts by militia groups, it’s pertinent to look at Facebook’s track record of responding to foreign extremism on its platforms. The Alliance to Counter Crime Online and the Counter Extremism Project have spent years tracking how violent groups ranging from Mexican cartels to ISIS use Facebook.

Our organizations have jointly concluded that the world’s largest social media company has neither the capacity nor the will to comprehensively remove violent extremist content and misinformation, despite the fiduciary risks this brings.

An excellent example is the designated terror group Hezbollah, which has used Facebook to broadcast propaganda, recruit for attacks, report on the activities of its leadership and even shill for money.

Facebook has enabled Hezbollah to cheaply and instantaneously reach a global audience, addressing local, national and international issues in real time. Some analysts even suggest Hezbollah’s influence in the Levant and beyond depends a great deal on the party’s sophisticated use of social media to conduct grassroots advocacy.

As with emerging violent groups here in the United States, Hezbollah accounts are pretty easy to locate, operating in plain sight rather than in private or secret Facebook groups. Multiple designated Hezbollah entities link to their Facebook pages from their official websites, and those pages can be quickly found by searching for the entity’s name.

A Facebook page from Hezbollah’s “Electronic Resistance” news site, logged by an ACCO researcher in June 2019, provides a striking example of just how integral Facebook became to Hezbollah’s public outreach. The site even integrated Facebook’s logo into the Hezbollah logo, the iconic hand holding an assault rifle.

As with its recent ban of the militia indicted for plotting to kidnap the Michigan governor, Facebook does occasionally remove pages linked to Hezbollah. But the pages tend to reappear weeks or months later and quickly build a solid following. Last month, for example, we could easily find a Facebook page for Jihad Al Bina, a fundraising entity that the U.S. Treasury designated in 2007, and another Facebook page linking to “Al-Emdad,” a social service organization created by Iran in the 1980s that helps recruit new members and operates youth training camps. The U.S. government designated that group a decade ago.

These may be the most flagrant examples of oversight failure, but they’re merely the tip of the iceberg in terms of Hezbollah content available on Facebook. There are additionally hundreds of Facebook accounts for Hezbollah-affiliated schools, municipalities, scout groups, supporters and propaganda outlets, some under fake names, as well as accounts aimed at developing the terror group’s overseas outreach.

Our advocacy work indicates that Facebook only removes Hezbollah pages as a direct result of media attention, congressional hearings or direct intervention from concerned organizations. And once Facebook kicks off accounts and pages, it doesn’t appear to use sophisticated means to prevent them from coming back.

In April 2018, for example, nine Hezbollah-related Facebook pages were removed after the Counter Extremism Project publicized links to them, including a tribute page to martyrs that had more than 60,000 followers. Within two weeks, a replacement popped up.

Hezbollah also uses social media to promote fund-raising campaigns, meaning Facebook is effectively facilitating terror financing. In 2019, for example, Hezbollah’s Islamic Resistance Support Association ran a crowdfunding campaign on Facebook to “Equip a Jihadi.”

The campaign urged sympathizers to give money or items of value in order to help Hezbollah fighters purchase necessary equipment, including boots, weapons and vests. Pro-Hezbollah accounts on Facebook circulated stories glorifying everyday people donating jewelry and other valuables to the campaign, and encouraging further donations.

Facebook executives have repeatedly asserted that their artificial intelligence capabilities are successful at taking down terror content. In an October 2019 speech, for example, Facebook CEO Mark Zuckerberg said, “Our AI systems identify 99 percent of the terrorist content we take down before anyone even sees it. This is a massive investment.”

The careful phrasing of this oft-repeated contention could be easily misconstrued to convey that Facebook’s AI tools remove 99 percent of overall terror content on its platform, an interpretation that would sound reassuring, but would be false. Rather, Facebook discloses that its AI systems proactively identify 99 percent of all the terror content that the company removes. Facebook never identifies how much overall terror content it believes it removes, and it won’t disclose the number of user reports it receives about terror content.

In fact, earlier this year, ACCO researchers filed a whistleblower complaint noting that 33 percent of the 68 U.S.-designated terror groups, or their leaders, operated official pages or groups on Facebook. CEP, meanwhile, has questioned Facebook’s claim of removing 99 percent of terror content from its platforms.

This problem is not limited to terror networks either. ACCO researchers have tracked how Mexican drug trafficking networks like Los Zetas, and the multinational gang Mara Salvatrucha, also use Facebook to broadcast their propaganda, fundraise, recruit new members, extort and put out hits on individuals.

The extent to which designated terror groups and other violent organizations leverage Facebook products to advance their aims and operations is a threat to public safety in the United States and beyond. It’s high time for Congress to reform the laws governing tech and ask regulators to sanction Facebook.


Gretchen Peters is executive director of the Alliance to Counter Crime Online. Josh Lipowsky is a senior research analyst with the Counter Extremism Project.
