For many, it seems that everything suddenly went wrong for Facebook during the 2016 election, and that things have only gotten worse since. But that isn’t so: the crisis engulfing Facebook – and other digital platforms such as Google – has been building over the last decade. And the root cause is simple: these companies’ resistance to taking responsibility for the content that appears on their sites.
The torrent of criticism Facebook has faced over the last 18 months is well-documented: fake, manipulative and divisive content; infiltration by Russian interests; and now revelations that the company turned over personal information on up to 50 million users without their consent. CEO Mark Zuckerberg is promising to do better, but only after Facebook lost $40 billion in stock value.
And while it’s Facebook under the microscope today, over the last year Google has faced an advertiser revolt over the proliferation of objectionable content, including videos promoting terrorism, and been criticized for promoting stolen credit cards, the sale of painkillers without a prescription, and websites that peddle pirated content in order to infect computers. That, too, cost Google billions in stock value.
Both companies have long taken a hands-off approach to the content on their platforms. That approach, whether intentional or not, made it easy for criminals and other bad actors to exploit the platforms, which in turn has blurred the lines between mainstream sites and the “Dark Web.” That blurring is a key reason why many Americans now question the trustworthiness of digital platforms.
Google’s response to criticism that its platforms harbor inappropriate content is telling. When confronted, the company said that “review teams respond to videos flagged for our attention around the clock, removing any content that violates our policies.” In other words, Google relies on others to alert it to inappropriate content rather than actively policing its networks itself.
The reason? Google and Facebook have treated self-monitoring as a slippery slope: the more they did, the more they would be expected to do. They also had a financial disincentive to act – policing content would put lucrative advertising revenue at risk. That may have seemed a savvy legal and business position, but look where it has gotten us today.
Digital Citizens surveys found that trust in Facebook is eroding. In a survey completed on March 20, 39 percent said Facebook wasn’t a reliable company; by March 22 that number jumped to 45 percent. What should really alarm Facebook: Forty percent said they were either getting off the site or scaling back their usage.
What’s next could be even more daunting for Facebook and Google. For several years, Digital Citizens has warned that if the digital platforms don’t take the lack of trust seriously, they are inviting regulation.
Now, Facebook, Google and others are facing exactly that. In its new guidelines, the European Commission is demanding that platforms remove illegal and objectionable content within one hour of notification; if they don’t, the EC says it is poised to make removal a legal requirement. But it’s not just Europe. In the United States, the Federal Trade Commission has opened its own investigation, and there is growing support for regulation among policymakers, businesses and consumer-protection advocates – especially in light of the Cambridge Analytica controversy.
And if regulation comes, Google and Facebook will have no one but themselves to blame.
Even before the recent controversy, former President Barack Obama, once a close ally of the tech sector, was raising questions about the future. “I do think the large platforms—Google and Facebook being the most obvious, Twitter and others as well, are part of that ecosystem—have to have a conversation about their business model that recognizes they are a public good as well as a commercial enterprise,” he told the MIT Sloan Sports Analytics Conference in late February.
Now others, Republicans and Democrats alike, are weighing in as well. Missouri Attorney General Josh Hawley has launched an investigation of Google’s business practices, including whether its dominant position enables it to squeeze out smaller competitors. And Facebook faces privacy investigations from the Federal Trade Commission and state attorneys general.
So, what can Google and Facebook do to avoid regulation?
First, they have to show consumers, other businesses and policymakers that they are serious about regaining trust. That includes hiring a more diverse workforce to monitor and clean up their platforms. Second, there should be a cross-platform initiative to identify and ban bad actors. Finally, they should commit that they will not use their dominant position to harm would-be competitors.
Google and Facebook have to decide what company they want to keep. It’s not too late, and regulation should be a last resort. But we live in an era where the public is distrusting – of government, institutions, and now, large tech companies – and Google and Facebook ignore that fact at their own risk.
Tom Galvin is the executive director of the Digital Citizens Alliance.
Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas.