March 1, 2021 at 5:00 am ET
Last month, Twitter stated unequivocally that its ban on former President Donald Trump’s account was permanent. “The way our policies work,” the company’s chief financial officer explained in an interview, “when you’re removed from the platform, you’re removed from the platform.”
It’s a decisive move — one that Facebook hasn’t yet taken. While Mark Zuckerberg also announced a ban on Trump in the days after the Jan. 6 insurrection, that decision has now been kicked over to Facebook’s new Oversight Board for a review process. Sometime in the next 90 days, the board will decide whether Trump should be allowed back on Facebook or whether, like Twitter, the platform should definitively uphold the ban. In doing so, the board will also be asked to guide a more general policy for how political actors should be treated on Facebook.
The board is too new to allow for much of an educated guess about what its deliberation process will conclude. But if the five cases it has already decided are any indication, the board may come down on the side of reinstating Trump, as it has thus far shown a preference for prioritizing the right to free expression above other competing concerns.
This would be a mistake. While deplatforming political actors is a prospect that activists are appropriately cautious about, a ban on Trump is more than justified under Facebook’s Community Standards and under international human rights law — the standards that the board is using to evaluate each case.
It is also justified under U.S. domestic law. While the United States has a proud tradition of protecting free speech — and most importantly, political speech — under the First Amendment of the U.S. Constitution, the right to free speech has never been absolute. Indeed, courts have found that speech that incites “imminent lawless action” can be banned. Furthermore, free speech extends to all Americans, not simply the loudest and most aggressive voices. If women, minorities and vulnerable populations are harassed out of online forums, we have, in effect, failed to uphold their right to free speech. Finally, the First Amendment constrains government, not business. (Admittedly, there are occasions in which a law mandating equitable treatment is desirable, but that is separate from the First Amendment.)
Some may argue that Trump’s ban was justified only temporarily. This logic holds that Trump was banned because of his ongoing incitement to violence and stated intention to undercut the results of the U.S. election — but that circumstances changed with the inauguration of President Joseph Biden.
Unfortunately, this is not true. Large sections of the population still believe Trump’s lies, despite repeated court rulings and contrary statements from local and federal Republican leaders. These people have been radicalized and pose an ongoing threat of violence. Trump is their lodestone and, while correlation isn’t causation, the benefits to society from banning Trump already appear clear.
Observers are, and should be, concerned that the tools turned against Trump could be turned against them. After all, political advocates and rights activists have long been censored online. But in this case, it’s a false equivalence to claim that Trump is a beleaguered, powerless activist speaking truth to power.
However, there is one critical issue that all this surface-level debate obscures: Looking forward, what does deplatforming Trump mean for the future of social media?
The Oversight Board is a particularly interesting initiative from Facebook’s side. Facebook is trying to walk a fine line: It agrees with the position that private firms shouldn’t be making unilateral decisions about political speech, while at the same time it works to shape the regulation, the conversation and the structures through which public debate about content moderation takes place.
Perhaps, rather than focusing on the fact that Trump has finally been deplatformed, we should instead focus on the fact that Facebook was warned repeatedly that its services were being used to foment disinformation and violence. We should focus on the fact that it shaped community standards rules to allow Trump to remain on the platform for more than five years.
Ultimately, while a ban may be necessary in Trump’s case, bans and content takedowns are reactive measures, not proactive ones. Facebook is not without other tools, and it should use them to change the culture on the platform, including by altering its ad structure, changing its algorithms and using the evidence and metrics it holds to understand which actions cause damage. Furthermore, simply removing objectionable content creates problems of its own. In other words, a ban is a band-aid, not a solution.
Facebook should retain its ban on Donald Trump. But this does not mean the case is closed.
Tatyana Bolton, the former senior policy director of the U.S. Cyberspace Solarium Commission and the former cyber policy lead of the Cybersecurity and Infrastructure Security Agency, is director of cybersecurity and emerging threats at the R Street Institute. Mary Brooks is a senior research associate of cybersecurity and emerging threats at the R Street Institute.
Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas.