Opinion

The Potential Unintended Consequences of Passing SESTA

On Tuesday, the Senate Commerce Committee is scheduled to hold a hearing to discuss the Stop Enabling Sex Traffickers Act, or “SESTA” (S. 1693), a bill intended to hold rogue websites accountable for sex trafficking crimes occurring on their platforms. Though everyone agrees with SESTA’s fundamental goal, a growing coalition of academics, legislators and startups is concerned that the bill as currently written will give rise to serious unintended consequences.

Contrary to how the bill’s proponents are making it seem, the debate about SESTA is not about a choice between protecting victims from being sexually exploited and protecting tech companies from economic harm. Rather, SESTA’s opponents, like the startups we work with, are concerned that the bill will actually make it harder to fight online sex trafficking while also subjecting law-abiding internet companies to expensive and meritless litigation. It is certainly understandable that legislators are eager to put in place tools to fight this horrific crime, but in the rush to pass SESTA, its supporters are ignoring the many problems with the currently-drafted legislation.

To understand the problems with SESTA, it’s necessary to understand the legislation SESTA amends: Section 230 of the Communications Decency Act. At its heart, Section 230 states that websites that do not exercise editorial control over the statements of their users cannot be held legally liable for those statements. This may seem like a relatively minor rule, but its implications are massive. Without Section 230, any website that hosts user content would be at risk of ruinous legal liability any time a user posted something illegal. If Section 230 didn’t exist, Yelp would be liable every time a user posted a defamatory statement about a business, and Kickstarter would be liable if someone posted a fake campaign to defraud users.

But Section 230 is not unlimited. It does not serve as a blanket shield against all legal claims, as SESTA’s proponents suggest. Most critically, Section 230 provides absolutely no immunity for violations of federal criminal law.

SESTA doesn’t make it any more illegal than it already is for online platforms to facilitate sex trafficking. Rather, SESTA would change Section 230 to make it possible for individuals to bring civil lawsuits against websites that have “knowledge” of trafficking activity. The trouble, of course, is that what constitutes “knowledge” of trafficking is difficult to define, and the ambiguity could end up making it harder for honest websites to fight trafficking. Currently, many startups engage in voluntary measures to search for and remove illegal content, using tools like Spotlight, Memex and LaborLink. Unfortunately, automated tools for filtering content are far from perfect. At best, they can only identify content by examining the physical characteristics of particular media (e.g., image, sound, and text); they cannot actually determine whether particular content violates the law. No filtering tool can determine the intent behind a particular post or analyze whether a piece of content was posted with the subject’s consent. Those determinations require human intervention and are often costly, subjective and difficult. And even then, it’s simply not possible to accurately identify all potentially illegal content on a website, particularly considering that traffickers are actively working to evade these measures.

This doesn’t mean that startups, and the tech community generally, shouldn’t try to help law enforcement end trafficking. But it does mean that punishing them for failing to fully remediate trafficking activity may be counterproductive. Under SESTA, if a startup tries, but fails, to remove all trafficking content, it may face a lawsuit for having “knowledge” of trafficking activity it didn’t identify. Even if such a claim is ultimately meritless, the cost of defending against it can be high enough to bankrupt an early stage company. In light of these risks, startups may be discouraged from taking proactive steps to fight trafficking.

Rather than making it harder for platforms to proactively fight trafficking, policymakers and law enforcement should focus on going after bad actors by more effectively using existing laws like the SAVE Act, which was signed into law just two years ago as part of a federal effort to crack down on online advertising that supports human trafficking.

No matter the outcome, the tech community will continue to be a committed partner to stop human trafficking. We share Congress’ desire to deliver justice to the perpetrators of terrible crimes, but we hope it does not lead to hastily crafted legislation that unintentionally works against this laudable goal.
Evan Engstrom is the executive director of Engine, a policy, advocacy, and research organization that supports tech startups.

Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas. Updated submission guidelines can be found here.