When the North American Free Trade Agreement was first written, the internet was a brand-new phenomenon. Only companies on the cutting edge of technology had a website, and email was not anyone's primary method of communication. It's no surprise, then, that NAFTA renegotiators are looking to draft proposals to address the huge amounts of data businesses now use on a daily basis.
Though it might seem a little arcane to the average person, in reality, allowing financial data to flow across national borders is something consumers across North America should be paying attention to. By permitting financial institutions to store and process data in the location that’s best for them, we can improve the financial products we all consume, help secure our personal information from cyberattacks, and help make it possible for the under-insured and under-banked populations to gain access to the financial system.
Several decades ago, a limited number of data points shoehorned financial customers into standardized mortgages, uniform auto insurance policies, and any number of other unimaginative financial products. Today, with advancements in financial technology, ever-expanding computing power, and an increase in available data, people can buy customized financial products tailored to a person's or family's specific needs. To maximize the benefits of these technological advancements, insurers, banks, and other financial firms need to be able to access, store, and process the data that undergirds these improvements in real time and through a centralized data management system. Forcing a company to coordinate among several data systems only makes its services slower and more cumbersome for consumers.
Choosing the location that best fits a business's IT model can also go a long way toward keeping consumers' personal information safe. As one insurance company leader put it, maintaining one or two data hubs that follow uniform procedures is far easier than trying to cover all of the potential weak points in a patchwork of local servers run by any number of different IT vendors. As cyberattacks grow more sophisticated and more frequent, governments should be making it easier, not harder, to protect consumers' data.
Lastly, as each data center can cost millions of dollars a year, maintaining a centralized hub can vastly reduce the cost of storage. When companies' costs aren't artificially driven up by unnecessary government requirements, consumers reap the benefits in several ways: lower prices, increased company investment in R&D that delivers more product options, and higher-quality products. Whichever path a company takes, the choice should be the company's to make, not a government's.
In a 2016 article in The Economist, Brookings Institution scholar Joshua Meltzer noted, “One thing is clear: there is a gulf between the experience of firms, which insist data flows are crucial, and policymakers, who have no sense of their macroeconomic importance.” NAFTA negotiators can significantly bridge this gap with a commitment to the free flow of data and a promise to consumers to keep their data secure.
The United States has made it clear that NAFTA will be the template for future trade pacts, and the outcomes in NAFTA inevitably will influence future trade agreements that all three countries negotiate going forward. As digital protectionism creeps around the world, strong commitments in NAFTA could be the answer.
Steve Simchak serves as vice president of international affairs for the American Insurance Association.
Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas. Updated submission guidelines can be found here.