January 28, 2021 at 5:00 am ET
In the Information Age, a time when our data offers a detailed look into our lives with alarming accuracy, can there still be a reasonable expectation of privacy?
It’s a question that everyone with a smartphone and an internet connection has considered in some form or fashion. For people who’ve endured discrimination and hate for living true to their identities, that question holds even greater significance. As a gay man and an advocate for LGBTQ+ communities in technology and public policy spaces, I’ve become increasingly concerned with technologies that circumvent basic privacy rights, make people less safe and enable systemic biases to persist.
Thursday, Jan. 28, is Data Privacy Day, an opportunity to reflect on the importance of data privacy and to take action against potentially harmful data collection practices. In the long fight for LGBTQ+ rights, infringements on our privacy that seek to disenfranchise us have sadly been the norm. Though many innovations have equipped us with helpful tools to more easily navigate our lives, unique threats to privacy and safety have emerged as personal data becomes more intertwined with society’s daily rhythms. Now is the time for everyone to carefully consider these threats to our reasonable expectation of privacy so they can be prioritized in any policy actions relating to data privacy.
There’s no lack of threats to data privacy, but one particular technology that has caught my attention – for the distinct risks it poses to the LGBTQ+ community as well as its Orwellian features – is a rider surveillance platform called Mobility Data Specification. Currently being implemented in Washington, D.C., and Los Angeles, among other cities, MDS requires mobility companies to share real-time, individual trip data with city governments. This raw data could allow anyone traveling in a connected vehicle to be identified and tracked based on where they’ve been. Designed as a tool for “smart cities,” MDS creates a citywide surveillance apparatus for transportation officials to use with little oversight.
Personal movements are by nature a private matter. For LGBTQ+ people, the sensitivity of where they travel and for what purpose carries much higher stakes, especially if they have not come out publicly. But everyone, regardless of identity, has a stake in securing their location data. The existence of technologies that track movements in real time, whether it’s a ride to a medical appointment, a place of worship, or your own home, opens too many doors for abuse. This is especially true when city governments, which are notoriously unprepared to prevent data breaches, have failed to provide concrete rules for safe and transparent data storage and sharing practices.
The problem goes well beyond one policy or program. Smart-city technologies, facial recognition software and artificial intelligence are being deployed with haste, well ahead of the time it takes to carefully deliberate policies that ensure basic privacy rights are protected. Law enforcement agencies in cities around the country have begun installing camera networks that can scan faces and identify people on the spot. Perhaps more alarming is that officials are often unaware that data collected in a similar fashion is being shared with outside entities, such as federal immigration authorities and other law enforcement agencies.
The lack of structural privacy protections surrounding the use of biometric data could also mean that existing inequalities are perpetuated. The growing interest in facial recognition and artificial intelligence among governments, law enforcement and private entities could embed systemic biases and discrimination into everyday practice if public policy does not keep pace. For example, faulty facial recognition matches have led to the arrest and incarceration of innocent people and caused transgender people to be frequently misgendered.
We have taken numerous steps to advocate for stronger data privacy protections for all, from raising awareness of rider surveillance in our nation’s capital to signing an American Civil Liberties Union letter calling on Congress to place a moratorium on facial recognition technology. But the work is just beginning.
For this year’s Data Privacy Day, we renew our calls on elected officials to learn more about these dangerous technologies and consider the damage they could inflict absent thoughtful, equitable regulation. We also ask the new administration and Congress to prioritize bipartisan data privacy legislation that ensures people’s location, biometric and other data remains private, and that reflects the unique needs of minority communities. The LGBTQ+ community has already suffered years of oppression and pain caused by unchecked, uneducated and uninformed regulations. This is an opportunity to pursue a unifying effort that brings together a wide array of voices, communities and political persuasions to solve real problems.
The longer we wait to act, the greater the chance that threats to privacy, safety and equality will pervade emerging technologies and data collection practices.
Christopher Wood is the executive director of LGBT Tech, a nonpartisan group working to ensure that technology issues of specific concern to LGBT communities are addressed in public policy conversations.
Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas. Updated submission guidelines can be found here.