With academic institutions and technology companies hunkering down to create apps to track those who could have been exposed to COVID-19, polling shows potential users would be most likely to trust federal agencies and university researchers with keeping the data collected about them secure and private.
About half of the 2,200 U.S. adults in a new Morning Consult survey said they had at least some trust in federal agencies like the Centers for Disease Control and Prevention and researchers at universities to keep their personal information such as health and location data protected. Thirty-five percent said the same about large technology companies.
At the same time, adults were also more inclined to say they trusted agencies and researchers to develop an effective contact-tracing app — technology aimed at notifying people who were in close contact with someone who later tested positive for COVID-19 so they can quarantine. Fifty-five percent said they had at least some trust in researchers to build an effective tool, while 54 percent said the same for federal agencies.
Forty-one percent said they had the same level of trust in large technology companies.
The survey, conducted April 22-24, has a margin of error of 2 percentage points.
Apple Inc. and Google are expected to release the first version of their contact-tracing API on Tuesday, which software developers can use to build apps on behalf of public health agencies. And a team of researchers from the Massachusetts Institute of Technology, Harvard University and the Mayo Clinic is beta-testing an app, called Private Kit, which Reuters reported earlier this month had deals with at least three local governments and was being considered by an additional 17 states and municipalities.
Despite the growing attention to contact tracing, sentiment among adults over sharing their location data with the government for COVID-19 tracking hasn’t changed in the last month. Fifty-nine percent said they would be at least somewhat uncomfortable with technology companies sharing their location data if it meant the government could better track the spread of the virus, compared with 57 percent who said the same last month.
Lorrie Cranor, director of the CyLab Security and Privacy Institute at Carnegie Mellon University and a professor of security and privacy technologies, said the level of privacy risk involved with contact-tracing technologies depends on the model being used by app developers. For instance, Cranor said, systems that collect and re-create the precise location of each person’s cellphone — showing where everyone is going and where they’ve been, along with identifiable information — would be the most extreme violation of privacy.
On the other end of the spectrum are systems where someone’s phone collects data about other phones that are nearby and then sends out codes to anyone who’s been near someone who tested positive for COVID-19.
“The devil’s in the details,” said Cranor, who served as chief technologist for the Federal Trade Commission during the last year of the Obama administration. “There are proposals out there where basically the data stays on my phone and doesn’t get released at all, it’s only used to inform me. And there are proposals out there where the data goes to a central source. That makes a big difference.”
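The decentralized end of the spectrum Cranor describes — data staying on the phone, matching done locally — can be sketched in a few lines. The `Phone` class, token format and method names below are illustrative assumptions for this article, not the protocol of any real app or the Apple/Google API:

```python
import secrets

class Phone:
    """Toy model of a decentralized proximity-notification scheme.

    Each phone broadcasts short-lived random tokens and remembers tokens
    it hears nearby. Nothing leaves the device until a user tests
    positive, and even then only that user's own random tokens are
    published, for other phones to match against locally.
    """

    def __init__(self):
        self.sent_tokens = []      # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens overheard from nearby phones

    def broadcast(self):
        # A fresh random token, rotated frequently so it cannot be used
        # to track the phone over time.
        token = secrets.token_hex(16)
        self.sent_tokens.append(token)
        return token

    def hear(self, token):
        self.heard_tokens.add(token)

    def report_positive(self):
        # Shared after a positive test; the random tokens reveal
        # nothing about identity or location on their own.
        return list(self.sent_tokens)

    def check_exposure(self, published_tokens):
        # Matching happens entirely on the device.
        return bool(self.heard_tokens & set(published_tokens))


# Two phones pass each other; B overhears A's broadcast.
a, b = Phone(), Phone()
b.hear(a.broadcast())

# A later tests positive and publishes its tokens; B matches locally.
print(b.check_exposure(a.report_positive()))  # True

# A phone that was never nearby finds no match.
c = Phone()
print(c.check_exposure(a.report_positive()))  # False
```

The privacy distinction Cranor draws is visible in `check_exposure`: the comparison runs on the user's own device, so a central server only ever sees the random tokens of people who chose to report a positive test.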
And stakeholders in Washington have started sounding the alarm to ensure privacy safeguards are in place in these new technologies. Last week, Sen. Ed Markey (D-Mass.) laid out his nine principles for establishing a national contact-tracing system, which call for transparency about what information is collected, voluntary participation and thorough data security processes. And Sen. Josh Hawley (R-Mo.) wrote a letter to Apple Chief Executive Tim Cook and Google Chief Executive Sundar Pichai calling on them to take personal responsibility for protecting the data collected by their contact-tracing system.
Adam Schwartz, a senior staff attorney at the Electronic Frontier Foundation, said in an interview that for these apps to be effective, people need to trust that the system isn’t doing more surveillance than is necessary to protect public health.
“In our democratic society, we don’t compel people to turn their most intimate tool — their cellphone — into their parole officer,” Schwartz said. “And if you try to do that, then people will leave their phones at home or turn off the Bluetooth connectivity.”
As these new tools emerge, Cranor said it’s important that government officials create oversight for these projects — regardless of whether they come from researchers, technology companies or federal agencies.
“There’s a big question of how do we know they’re going to do what they say they’re going to do,” she said. “That’s important to consider beyond just, ‘How does the technology work?’”