September 03, 2020

The Razor’s Edge: Liberalizing the Digital Surveillance Ecosystem

By Kara Frederick

Executive Summary

The COVID-19 pandemic is accelerating global trends in digital surveillance. Public health imperatives, combined with opportunism by autocratic regimes and authoritarian-leaning leaders, are expanding personal data collection and surveillance. This tendency toward increased surveillance is taking shape differently in repressive regimes, open societies, and the nation-states in between.

China, run by the Chinese Communist Party (CCP), is leading the world in using technology to enforce social control, monitor populations, and influence behavior. Maximizing this control depends in part on data aggregation and a growing capacity to link the digital and physical worlds in real time, where online offenses result in swift repercussions. Further, China is increasing investments in surveillance technology and attempting to influence the patterns of technology’s global use through the export of authoritarian norms, values, and governance practices. For example, China champions its own technology standards to the rest of the world, while simultaneously peddling legislative models abroad that facilitate state access to personal data. Today, the COVID-19 pandemic offers China and other authoritarian nations the opportunity to test and expand their existing surveillance powers internally, as well as make these more extensive measures permanent.

Global swing states are already exhibiting troubling trends in their use of digital surveillance, including establishing centralized, government-held databases and trading surveillance practices with authoritarian regimes. Amid the pandemic, swing states like India seem to be taking cues from autocratic regimes by mandating the download of government-enabled contact-tracing applications. Yet, for now, these swing states appear responsive to their citizenry and sensitive to public agitation over privacy concerns.

Open societies and democracies exhibit some of the same surveillance trends as authoritarian regimes and swing states, including the expansion of digital surveillance in the name of public safety and growing private sector capabilities to collect and analyze data on individuals. Yet these trends toward greater surveillance still occur within the context of pluralistic, open societies that feature ongoing debates about the limits of surveillance. However, the pandemic stands to shift the debate in these countries from skepticism over personal data collection to wider acceptance. Thus far, the spectrum of responses to public surveillance reflects the diversity of democracies’ citizenries and processes.

Summary of Recommendations

The United States should respond to the illiberal use of surveillance technology in four ways:

  1. Draw in like-minded democratic partners to work multilaterally, with a particular focus on global swing states.
  2. Enshrine data privacy protections in statute for American use—demonstrate what “right” looks like.
  3. Support U.S. tech companies in establishing a ruleset for engaging with authoritarian government use of surveillance tech abroad.
  4. Partner with tech companies to develop a “privacy solution” (commercially viable, privacy-preserving digital products), which can act as the free-world alternative to the illiberal use of surveillance tech.

The United States should:

1. Actively engage other democracies, especially swing states, to articulate and establish a set of democratic principles that undergird state use of digital surveillance.

  • The State Department should convene a bloc of democratic allies to debate and craft a specific set of democratic norms. Advanced democracies can use Britain’s “D10” approach as a blueprint for this summit of democracies. The United States can then leverage its position as a founding member of the Open Government Partnership (OGP)—and similar bodies—to draw countries at risk of instituting repressive digital surveillance policies further into democratic governance models for technology.
  • The United States and other advanced democratic nations should engage the other OGP founding members, including Brazil, Indonesia, and the Philippines, using the OGP’s Peer Learning and Exchange Subcommittee of the Steering Committee as a vehicle for policy discussions. The United States should lobby to include surveillance issues at the top of the agenda for the 2021 OGP Summit.

2. Develop a secure and privacy-protecting, free-world alternative for the handling of digital data—establishing a model for what “right” looks like.

  • Congress should enshrine data protections for American citizens and consumers by creating a national data protection framework that incentivizes consistent, open, and transparent data practices.
  • Congress should adopt the Cyberspace Solarium Commission’s Key Recommendation 4.7 and pass a “national data security and privacy protection law establishing and standardizing requirements for the collection, retention, and sharing of user data.”

3. Map the foreign digital surveillance ecosystem to help U.S. tech companies assess risks of abuse of their technology abroad.

  • The State Department should establish a scorecard for U.S. tech companies to reference when operating abroad. The scorecard would assess levels of risk associated with specific uses of surveillance technologies. Tech companies can refer to this scorecard when making policy decisions regarding the potential illiberal use of their technology by foreign governments for surveillance purposes.
  • This scorecard should be accompanied by a risk-based compliance framework for export controls of surveillance-enabling technology, as defined by the State Department, in coordination with the Commerce Department and other relevant agencies.

4. Fund the development of freedom-enhancing and privacy-preserving technologies and export democratic models of surveillance technology.

  • The U.S. government, via the Defense Advanced Research Projects Agency, the Intelligence Advanced Research Projects Activity (IARPA), the State Department, In-Q-Tel, and the National Institute of Standards and Technology (NIST), should research and fund the development of privacy-preserving technology solutions. The private sector will drive development in surveillance tech and the ability to exploit it, but private sector incentives for privacy-preserving solutions are too often insufficient. The United States must be at the forefront of these developments to provide an alternative to Chinese technology.
  • The United States can leverage existing technology research and development initiatives via In-Q-Tel or IARPA or through updated exercises like NIST’s 2018 “Differential Privacy Synthetic Data Challenge” to help develop the “privacy solution” hand in hand with the U.S. private sector. Solutions can serve as democratic models of surveillance technology, imbued with privacy protections, for export abroad.
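NIST’s challenge centered on differential privacy, a technique that releases aggregate statistics with calibrated random noise so that no individual’s presence in a dataset can be confidently inferred from the output. The following is a minimal illustrative sketch of the core mechanism, not code from any challenge entry; the function names are hypothetical:

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float, rng: random.Random) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the noise scale is 1 / epsilon.
    """
    return true_count + laplace_noise(1.0 / epsilon, rng)

# Example: publish how many people visited a location without exposing
# whether any particular person did. A smaller epsilon means stronger
# privacy but a noisier answer.
rng = random.Random(0)
released = dp_count(100, epsilon=1.0, rng=rng)
```

A single release hides any individual’s contribution, while the noise averages out across many queries of aggregate data; deployed systems must also track cumulative privacy loss across repeated queries.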

Introduction

The world is saturated with sensors. These sensors, along with a projected 6 billion people with access to the internet, will generate between 80 and 90 zettabytes of data worldwide by 2025. By next year alone, approximately 30 billion devices will be connected to the web, creating a prodigious “digital exhaust.” This digital exhaust consists of the data trail left behind when users rely on connected devices: from wearables that capture running routes to cameras that double as doorbells to social media applications that provide geolocation data. Nearly every facet of the daily life of an individual—wittingly and unwittingly—is logged and captured by the sensors and digital applications in the new information environment. DNA is collected by medical companies, voices are collected by banks, faces are collected by phones, and sleep cycles are collected by rings and watches. As the surveillance infrastructure expands, this connectivity, when used to identify individuals and match them to the data they emit, will make privacy much harder to maintain. And systems that do the matching will be in much higher demand. As one Chinese technology company put it in an advertisement to Chinese police: “People pass and leave a shadow. The phone passes and leaves a number.” Now systems are built expressly to “connect the two.” Further, surveillance technology, driven by artificial intelligence (AI) and machine learning (ML), will play an increasingly significant role in the digital environment. From facial recognition to the Internet of Things (IoT), technology will alter the ways nation-states fight wars and police their citizens, and even the way populations behave.

Legitimate uses for surveillance exist across the globe. Public safety and health, counterterrorism measures, and quality of life are all critical to a functioning society. Facial recognition systems to detect stalkers at concerts, drones that aid in disaster response, and software that identifies criminals after a shooting spree are elements of surveillance that can contribute to public good. Digital surveillance technologies like contact-tracing apps can help at-risk individuals avoid COVID-19 hotspots during the pandemic. Commercial uses of surveillance technology can also make services more efficient, such as reducing wait times at popular amusement park rides or advertising a product perfectly tailored to an individual’s needs. Surveillance that makes travel more convenient at airports and on subways during rush hour, decongests the roads, and improves emergency response times enhances quality of life in real ways.

But the risk of potential abuse of these and related technologies is profound. Facial recognition helps Chinese authorities implement ethnic targeting in the Xinjiang region, where approximately 1.5 million Uighurs are imprisoned in reeducation camps. Sources cited by Human Rights Watch and the International Consortium of Investigative Journalists point to license plate trackers at gas stations, “data doors” that collect SIM card information, facial recognition checkpoints erected to monitor the Uighur minority, and a mass data system employing artificial intelligence to flag “entire categories” of Xinjiang residents for detention. The risk of abuse exists in democracies as well. Citizens are rightfully wary of opaque algorithms, the potential for unlawful detentions based on false positives generated by facial recognition software, and privacy invasions exploited for commercial gain.

Yet calibrating a U.S. response requires understanding how surveillance technology is used around the world. First, understanding the global digital surveillance ecosystem is foundational. A mutually reinforcing ecosystem of sensors, networks, and processing capabilities advances as each technical component improves. Soon, surveillance technology will be cheaper, easier to deploy, and everywhere. Recognizing how political systems influence the way countries adopt these technologies will also be critical to formulating a U.S. approach. Authoritarian regimes, with China leading the way, are using digital surveillance to enforce and enhance internal control, as well as to peddle their methods to other countries on the cheap. China is leading the world in using technology to monitor its population and influence behavior. Open societies exhibit some of the same troubling trends, but are ultimately making use of a democratic process to stall rapid deployment of some surveillance technologies and debate their use. Initial responses to the pandemic are reinforcing and accelerating these trends and offer inchoate lessons for how surveillance will be employed in the future.

All of this occurs in the context of a global contest between freedom and authoritarianism. Democracies like the United States maintain a system of checks and balances to provide a bulwark against the abuse of technology within their own borders. Properly applied, democratic systems offer institutions and practices—the rule of law, independent judiciaries, a free press, and engaged citizenries—to act as guardrails on the state’s internal use of technology. But this is no longer sufficient. Democracies must also decide how to respond to increasing abuse of these technologies abroad. Democracies must establish a model for the use of these technologies globally that helps ensure the preservation of individual freedom and liberty. Due to the pace of technological advancement, China’s success at spreading surveillance technology far and fast, and the eagerness of autocrats to employ it against their own populations, the window to get this right is closing. Understanding the ecosystem—not just the technology, but the actors, markets, and governance that shape its use—is critical to establishing policies the free world can export.

The Digital Surveillance Ecosystem: Easy, Inexpensive, and Everywhere

Surveillance technology is best understood as an ecosystem of mutually reinforcing technologies. The surveillance ecosystem is loosely composed of sensors, networks, and processing capabilities. In order to conduct surveillance, sensors are needed to detect, generate, and collect data. In turn, a network must transmit these data. Methods to exploit the raw data gathered by sensors and transmitted on networks are then required to draw value from the process. Sets of technologies and techniques, including AI, can help manage this sensor-derived information, organize it, and turn it into insights.

All of these components are evolving as technological developments improve. Sensors are more distributed, leading to the collection of more data (volume), along with different types of data (variety) and at higher rates (velocity). Faster networks with lower latency provide quick transmission and higher throughput to handle these formidable data flows. More computing power, and more computing options such as cloud and edge computing to sort and process the data, are advancing in concert. Developments in ML and sophisticated analytics that extract value from data are also growing exponentially. These improvements fit together in mutually reinforcing ways. For example, fifth-generation telecommunications (5G) will provide application-heavy smart cities the necessary bandwidth to function; IoT devices are optimized by edge-computing capabilities; AI parses the data generated by IoT devices on 5G networks to detect patterns. In aggregate, these advancements amount to surveillance that is less personnel-intensive, less expensive, more capable, and ubiquitous.

New avenues of surveillance are cropping up in the wake of the COVID-19 pandemic. From Facebook-sponsored disease migration maps to hospital bed availability applications to QR codes that act as health status indicators, developers and—often—their government sponsors are responding to the virus with more expansive data collection. MIT Technology Review’s COVID-19 Tracing Tracker documented 25 robust global automated contact-tracing efforts in early May 2020. Iran claimed to collect location data from 4 million of its citizens via its first state-sponsored contact-tracing application before Google pulled the app from its digital store. South Korea relaxed its infectious disease law to allow automation and give government officials access to patient information. If the United States and its allies are not careful, this environment can exacerbate political and social effects deleterious to democracy that digital surveillance already drives. In fact, the pandemic appears to be accelerating some of these existing trends, especially in autocracies.
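Not all contact-tracing designs demand centralized data collection. Decentralized protocols, such as DP-3T and the Apple/Google exposure notification framework, have phones exchange rotating random identifiers and keep the contact graph on users’ devices. A toy sketch of the idea follows; the class and method names are hypothetical simplifications, not the real protocols:

```python
import secrets

class TracingDevice:
    """Toy decentralized contact-tracing client (hypothetical names)."""

    def __init__(self) -> None:
        self.own_tokens: list[bytes] = []      # pseudonyms we broadcast
        self.heard_tokens: set[bytes] = set()  # pseudonyms overheard nearby

    def rotate_token(self) -> bytes:
        # Broadcast a fresh random identifier; frequent rotation keeps
        # any single token from tracking a person over time.
        token = secrets.token_bytes(16)
        self.own_tokens.append(token)
        return token

    def observe(self, token: bytes) -> None:
        # Tokens from nearby devices stay on this device only.
        self.heard_tokens.add(token)

    def exposure_check(self, published_tokens: list[bytes]) -> bool:
        # Diagnosed users publish only their own random tokens; matching
        # happens locally, so no server learns who met whom.
        return any(t in self.heard_tokens for t in published_tokens)

# Simulated encounter: Bob's phone overhears Alice's rotating token.
alice, bob = TracingDevice(), TracingDevice()
bob.observe(alice.rotate_token())

# If Alice is diagnosed and uploads her tokens, Bob learns of his
# exposure; a bystander who never met Alice learns nothing.
exposed = bob.exposure_check(alice.own_tokens)
```

The design choice that matters is where the matching happens: because exposure checks run on the device, the state never assembles a central map of who was near whom.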

Authoritarian Regimes

Prior to the pandemic, authoritarian regimes exhibited certain distinct trends of using digital surveillance for internal political and social control. China is at the “bleeding edge” of using data aggregation and data exploitation for this control. As security scholar Sheena Greitens argues, China is the “index case” for mass surveillance, precipitating its spread around the globe. Though not perfectly implemented, Beijing’s philosophy for control is characterized by “standardized” and “integrated” “information collection,” in the words of Chinese authorities. To do this effectively, the party-state aims to aggregate enough useful data—synching multiple data sets will allow authorities to look for patterns in behavior—with the goal of “totalizing control,” as one New York Times reporter observed. China’s data collection practices help influence behavior in the service of this goal.

Attempts to link the digital and physical world—where online transgressions equate to near real-time, physical confrontation by authorities—further tighten the party-state’s grip on its citizens. Concurrently, China’s investments in surveillance technology abroad and its export of norms, values, and governance practices represent a push to set the terms for global use. Specifically, the party-state seeks to influence technology standards and provide legislative models that allow broad access to data by governments.

China as an Advanced Surveillance State

China at the “bleeding edge” of data aggregation for social control. China’s data practices demonstrate the bleeding edge of what is possible when surveillance tools are turned inward on a population. China’s 1.34 billion citizens are watched by around 200 million cameras on the mainland today. By the end of 2021, China is expected to house over half of the world’s more than 1 billion surveillance cameras. And by 2022, China could have one public CCTV camera for every two people across the mainland. Its “Golden Shield” project of widespread censorship and surveillance; “Sharp Eyes,” the state-sanctioned rural surveillance program; and “Skynet,” its urban counterpart, help lay the groundwork for a massive surveillance net. This is punctuated by more targeted surveillance test beds in high schools, grocery stores, banks, and public spaces that hoover up biometric data, experiment with micro-expression technology, and expose jaywalkers. The Social Credit System is also used to monitor and assess the conduct of individuals and influence behavior on the basis of reward and punishment (e.g., redlists and blacklists enabled by information-sharing platforms). According to China’s Supreme People’s Court, 8.8 million individuals were blacklisted in the period from late 2013 to late 2017, at times limiting their government subsidies, job advancement, real estate purchases, ability to get a loan, and more. Data extracted via spyware (malware unwittingly installed on devices that mines them for data) also adds to this pool. Data extraction software called MFSocket, employed as a routine security check on citizens’ phones, can allow access to “image and audio files, location data, call logs, messages and the phone’s calendar and contacts,” according to the Financial Times.
Other more experimental methods of data aggregation under this authoritarian regime include a reported voice biometric database, which is augmented by an artificial intelligence program known as automatic speaker recognition to accelerate matching between this and other databases. Human Rights Watch has reported that these police-held databases link voice data to:

the person’s identification number, which in turn can then be linked to a person’s other biometric and personal information on file, including their ethnicity, home address, and even their hotel records. … Official tender documents and police reports suggest that police are collecting voice patterns together with other biometrics—fingerprints, palm prints, and profile photos, as well as urine and DNA samples.

The Xinjiang Uighur Autonomous Region is the consummate test lab for China’s digital surveillance tools. Chinese officials demand voice samples, 360-degree photos and videos, and eyelash samples from some Xinjiang natives in order to monitor the Uighur population for “dissident” behavior. Further, Xinjiang’s predictive policing platform, the Integrated Joint Operations Platform, synchronizes biometric and behavioral data to detect and ultimately identify individuals for potential detention.

However, these attempts at instituting effective large-scale surveillance—throughout China and in the Xinjiang region—are far from perfect. The lack of interoperability, or systems’ ability to communicate with other systems, will hamper efforts to synchronize disparate data sets and identify patterns in behavior. Maya Wang of Human Rights Watch describes Chinese authorities’ travails with “incompatible datasets and systems developed by different companies,” especially those born out of discrete municipalities run by local officials, as stymieing efficiency. New York Times reporter Paul Mozur has detailed how police struggle to consolidate location data collected in their local jurisdictions due to lack of data sharing by Chinese telecommunications companies. Local police are forced to build their own trackers to obtain data, resulting in duplicative databases that operate on different networks and can contain conflicting information. Further, class differences expose the “holes” in the surveillance blanket, with upper- and middle-class Chinese citizens more likely to register with the government and make use of technologies that can be linked to their personal identity, like high-speed trains or the internet. Additionally, not all cameras are created equal. Some of China’s hundreds of millions of CCTV cameras are equipped with facial recognition; most are not. Yet overall, China’s developing surveillance regime, combined with low-tech human surveillance efforts, amounts to an inordinate—and growing—amount of control. Moreover, the holes in China’s surveillance dragnet do not necessarily reduce its effectiveness as a deterrent on individual behavior, since individuals are unlikely to know which cameras have facial recognition and which do not, which databases are linked and which are not. The uncertainty and capriciousness of government power can be an asset in creating a climate of fear, since citizens never know when they are being watched.

Interconnecting the digital with the physical. The ability to link online behavior with reality is a trend that will increasingly be leveraged by governments with malign intent. In China, police cultivate efforts to respond to digital behavior in real time. According to Wired, if Uighurs are contacted on their smartphones by non-Chinese numbers, the result can be “instant arrest.” Research laboratory Citizen Lab found evidence that, since at least 2018, certain users of China’s most popular social media platform, WeChat, have been arrested for insulting police officials or threatening government facilities online, in addition to facing automated real-time censorship. The New York Times’ Paul Mozur offers corroborating observations regarding physical responses to online content considered unacceptable by Chinese officials:

China’s internet police are increasingly responding in real time to question people who have said things deemed questionable online. Eventually the goal is to link all online and offline behavior. … What the police are doing is putting in the ground floor on a system to control reality as tightly as the internet.

Increasing investments in surveillance technology. In the past five years, the Chinese government and private companies have scaled up financial investments in surveillance technology at home and abroad. The China Security & Protection Industry Association predicted that surveillance and safety product sales would rise to an estimated 800 billion yuan ($114.8 billion) in 2020, up from 490 billion yuan ($70 billion) in 2015. Specifically, a 2018 International Data Corporation report estimated that Chinese investments in smart-city initiatives would surpass $38 billion by 2023. In 2019, Huawei more than doubled its smart-city technology reach to 90 countries, up from 40 countries in 2017.

Proliferation of the Chinese Model

China’s focus on internal control does not exclude deliberate and opportunistic expansion of China’s surveillance technology outside its borders. China is narrowing the digital divide between countries that possess high-tech surveillance capabilities and those that do not by exporting both its tools and its policies abroad.

Beijing is shortcutting its access to the world’s data reserves by exporting Chinese technology and, at times, transmitting data to Chinese-owned servers. The list of these incursions is extensive. For instance, in Venezuela, Chinese company ZTE stores data generated by a smart chip–based ID card used by its citizens, the creation of which was inspired by a trip Venezuelan officials took to China over a decade ago. In 2018, the Chinese AI firm CloudWalk Technology sold Zimbabwe’s government a mass facial recognition system, giving the firm access to part of that country’s biometric data pools. Starting in 2011, the government-subsidized technology company Huawei and the state-run China National Electronics Import & Export Corporation built most of Ecuador’s surveillance system, ECU-911. Malaysian police forces use automated facial recognition cameras from Chinese AI start-up Yitu. A Yitu executive also declared “big potential” in Singapore in 2018, opening up an office there that year, in addition to a research and development center in 2019. Chinese AI start-up SenseTime also explored a bid for Singapore’s facial recognition software market, which includes a planned 110,000 facial recognition cameras in addition to the 80,000 cameras already installed in public areas. In 2019, SenseTime reportedly established an office in Abu Dhabi; there have also been reports of meetings between SenseTime’s representatives and the United Arab Emirates’ smart-city officials in 2018. Belgrade, Serbia, intends to import a Huawei system consisting of 1,000 cameras for 800 locations. Accelerants of this trend include the ability of Chinese firms such as Huawei to undercut competitors on price, thanks to government support, and the incentive for those firms to siphon off local data once entrenched in those countries. In addition to its commercial footholds, China is laying the technical foundation for access to a high volume and variety of data collected by its surveillance technology across the globe.

In total, China is exporting AI surveillance technology to more than 60 countries, including 36 Belt and Road Initiative (BRI) countries, and possibly more, according to a report from the Carnegie Endowment for International Peace. And the spread is not limited to typical BRI countries. Eighteen countries, including Germany, are using “intelligent monitoring systems” made in China, according to New York Times reporting. The list of countries importing Chinese surveillance technology even includes the United States. As recently as April 2020, Amazon ordered 1,500 thermal cameras from blacklisted Chinese company Dahua, with plans to use at least 500 of the cameras in the United States.

Along with its technology, China is also exporting its norms, values, and governance practices to the rest of the world. It does this by actively attempting to promote its own indigenous technology standards to international standards bodies. Simultaneously, Beijing is inspiring the rapid expansion of legislation with low thresholds for access to data by the state. China makes no secret of its ambitions to promote the “global influence” of its “national champions,” industrial conglomerates heavily subsidized by the government, and push its “proposition of internet governance toward becoming an international consensus,” according to the Cyberspace Administration of China. China is also investing in patents linked to the overall advancement of surveillance systems, in addition to setting standards to influence global technology development. China published 530 video surveillance–related patents in 2017, compared with 96 in the United States that year. China submitted every standard related to surveillance technology to the United Nations in the past three years, in an attempt to influence how this technology is used throughout the world and to displace existing U.S. influence.

Other countries are also adopting China’s legal frameworks in order to build stronger foundations for their own authoritarianism. This includes imitating laws and policies that offer broadly scoped data localization, storage, and retention provisions and make it easier for governments to conduct surveillance on their people. For instance, Vietnamese officials attempted to implement a cybersecurity law akin to China’s in June 2018, with a key provision that service providers must “disclose user data to authorities without … a court order.” This draft law contained strict data storage provisions (giving access to a government “task force”); a mandate to open offices in Vietnam if requested by Vietnam’s Public Security Ministry; and overarching definitions of content, including “information chosen for upload, sync or import from device” and information about “friends [and] groups that users connect with or interact with.”

Uganda and Tanzania implemented similar cybersecurity laws, and Singapore’s “fake news” law has a similarly broad data provision. Russia is also getting in on the game. By legally requiring a number of private technology companies in Russia to install equipment for its targeted surveillance mechanism, System for Operative Investigative Activities, the government provides its own agencies with access to users’ identifying information, including email and internet protocol (IP) addresses.

Global Swing States

How “swing states”—non-Western nations like India, Brazil, Turkey, the Philippines, and Indonesia that exhibit forms of both democratic and illiberal governance—choose to use surveillance technology will be key to how it is adopted globally. Thus far, nations with large populations, technical capability, and current spikes of unrest exhibit troubling trends; the danger is especially acute for nations undergoing democratic backsliding, such as Hungary.

Swing states such as Brazil and India, poised to become the world’s biggest security camera markets, are imitating authoritarian surveillance practices. According to Human Rights Watch, multiple Brazilian officials visited China in 2019 to observe and learn more about its surveillance techniques. On their return, a policymaker introduced an August 2019 bill to mandate the installation of facial recognition cameras in all public spaces, citing public safety as the justification. Additionally, Brazilian President Jair Bolsonaro signed two 2019 decrees to establish a central database of biometric and other personally identifiable data on Brazilian citizens. Likewise, in the Philippines, lawmakers ultimately relented on the “Safe Philippines” contract—signed after CCP leader Xi Jinping traveled to the country in 2019—which seeks to install 12,000 CCTV cameras from Chinese telecom company Huawei for $400 million in Manila. Similarly, Indian government proposals for a centralized facial recognition system, data localization practices, forays into internet shutdowns, and a biometric identity database of its 1.3 billion people augur bad news for privacy and civil liberties in swing states.

Open Societies and Democracies

Open societies are demonstrating some of the same trends as swing states, and even elements of these authoritarian practices, including the expansion of digital surveillance in the name of public safety and the growing capability of the private sector to collect and analyze data on individuals. As in authoritarian regimes, some surveillance applications are legitimate and some are not. Genuine tradeoffs exist between public safety/health and privacy, convenience and commercialization, and rapid product rollouts and product security. What distinguishes surveillance practices in democracies from those in repressive regimes is that how best to balance these tradeoffs is up for debate: the process to settle these questions takes place in courts of law, through an engaged citizenry, civil society groups, and the free press.

Expansion in the Name of Public Safety

Even in democracies where widespread surveillance is already entrenched, the expansion of its use will continue under a stated public safety imperative. The United States nearly matches China in its surveillance coverage, with one camera for every 4.6 people, compared to China’s one for every 4.1 individuals. Facial recognition is currently used for entry at borders (Global Entry), Transportation Security Administration checkpoints (CLEAR), and more than 20 sports stadiums. Success stories include the detection of Maryland’s Capital Gazette shooter in 2018 and the detention of multiple individuals using false passports at Dulles International Airport in the Washington, D.C., area that same year. For over a decade, the FBI has used driver’s license and visa photos to look for criminals. Many large-scale surveillance projects are underway. In 2019, MIT Lincoln Laboratory licensed camera-cluster technology that provides up to a 360-degree field of view, billed as the future of “high-tech aerial surveillance,” to Consolidated Resource Imaging. In December 2019, the city of Baltimore announced the deployment of a pilot program for wide-area motion imagery, providing about 90 percent coverage of the city with three aircraft flying simultaneously. Baltimore Police Commissioner Michael Harrison claimed Baltimore would be “the first American city to use this technology in an attempt to solve and deter violent crime.” Other cities are following suit. In 2020, San Diego plans to fly a General Atomics MQ-9B SkyGuardian, a version of a drone used by the U.S. military, in test sorties over the city. The Center for the Study of the Drone at Bard College estimates that as of March 2020, more than 1,500 U.S. public safety agencies had procured some form of drone.


Other Western democracies are expanding their use of surveillance technologies as well, with an eye toward harnessing the latest tech developments. In January 2020, the United Kingdom’s Metropolitan Police Service announced it would use live facial recognition technology to identify potential suspects in real time. The technology will be employed in areas “where intelligence suggests [they] are most likely to locate serious offenders.” Other metropolises around the world employ drones, social media aggregators that monitor sentiment, and beacons that connect to smartphones in the name of public safety.

Private Sector Access to and Commercialization of Personal Information

U.S.-based tech companies are using emerging technology to extract more value from consumer data. As both the amount of data and the ability to exploit it increase, companies are finding ways to use it more effectively. For example, applying artificial intelligence to parse big data repositories that include voices, faces, and other personal identifying information allows them to analyze the data for targeted advertising purposes, often without notifying consumers. To illustrate the reach of these companies, one German researcher installed 14 apps on his phone and, over a 24-hour period, recorded 7,305 data transmissions to 636 servers, with 64 percent of transmissions occurring while his screen was locked and 18 percent overnight. Outside of such manually downloaded apps, databases like Google’s SensorVault store device ID information scooped up in major “geofence” searches (i.e., passive collection). As of April 2019, Google was collecting “detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade”—demonstrating “the whole pattern of life” through this type of geofencing and its SensorVault storage. In December 2019, Facebook admitted to collecting user data even when users are not logged in to the Facebook application. As early as 2015, Democratic Senator Sheldon Whitehouse noted that “the greatest collector of data on ordinary Americans is not our government, but the private sector entities gathering personal data for marketing and commercial purposes.” Additionally, tech companies’ ability to exploit this information is vast and expanding. Johns Hopkins’ Thomas Rid explains:

The big revolution in intelligence is happening in start-up-y spaces in the private sector due to the combination of vast amounts of customer (& other) data: amazing engineering, stunning analytics, deep regional expertise, native language skills, intel feeds, graphing, machine learning. . .

Process: Debating, Litigating, and Reworking Digital Surveillance

In democracies, surveillance practices are up for debate. The process to settle these questions involves civil society groups, public commentary and pushback, legislative deliberation, and courts of law. For instance, some tech rollouts and developments are facing increasing resistance among various groups, lawmakers, citizens, and even technology companies themselves, especially those based in the United States. In 2019 and 2020, a wave of grassroots bans targeted government use of facial recognition. As of January 2020, 10 state legislatures had introduced bills to limit, study, or generally increase transparency surrounding the use of facial recognition. Cities in California (Oakland and San Francisco) and Massachusetts (Brookline, Somerville, and Cambridge) have enacted bans on the use of facial recognition technology by local law enforcement and municipal agencies. And as of early 2020, Indiana, Michigan, Maryland, New Hampshire, New Jersey, South Carolina, and Washington were considering varying forms of restrictions and regulation. In the U.K., public outcry against the use of facial recognition in London’s transportation system forced an Information Commissioner’s Office investigation and ultimately resulted in the cessation of the program. Even private companies are engaging the public, lawmakers, and external stakeholders to influence their internal decisions. U.S. tech company IBM announced in May 2020 that it would end its “general purpose” facial recognition program, citing mass surveillance and potential racial profiling as factors in its decision.

Other potential privacy invasions invite further scrutiny under democratic processes. Litigation as a method to temper potential abuse of surveillance technology will grow as strains on civil liberties increase. For example, Facebook and Twitter suspended social media monitoring start-up Geofeedia’s access after a challenge from the American Civil Liberties Union (ACLU) in 2016. Clearview AI, a U.S. tech start-up that bases its facial recognition app on over 3 billion images scraped from public websites, became the subject of another ACLU lawsuit in May 2020. In addition to privacy issues, the potential exacerbation of inequality and the social impacts of widely deployed facial recognition systems, including the repeated misidentification of minorities, high false-positive rates, and the potential for algorithmic bias, will engender additional legal challenges—yet another tool in the democratic toolkit to address the public’s privacy concerns.


The debate over personal data highlights the differences in how authoritarian regimes and democracies approach surveillance technology. These differences are an opportunity for the United States and other democratic nations to articulate the right way to imbue technology with privacy protections from the outset, in addition to maintaining a strong system of checks and balances to redress privacy infringements that do occur. Repressive regimes, if competing on this playing field, will come up deficient. For example, many democracies are developing and deploying digital surveillance systems with some form of safeguards built in. The governments of Germany, France, Italy, and the United Kingdom are all rolling out their own versions of surveillance in the name of public safety and—concurrently—some accompanying version of checks and balances. Just as the U.K. Metropolitan Police’s facial recognition systems come with a multi-year trial period by Scotland Yard, Germany’s Safety Station Südkreuz project comes with the EU’s General Data Protection Regulation protections. Though not perfect, these measures at least provide initial constraints or attempts to shield privacy for the individual citizen. They demonstrate an intent by democratic governments to consider individual privacy in these rollouts, in stark contrast to repressive regimes around the world.

COVID-19 Response Lessons

Attempts to surveil populations using digital tools as a response to the COVID-19 pandemic offer a glimpse into the future of surveillance. The way countries are adopting COVID-19-related technology is determined by the political systems and governance structures that incubate or adopt them. China is accelerating trends underway, such as mass surveillance for social control, and is seeking to make them permanent. Swing states appear to be taking cues from China’s authoritarian model, while the spectrum of democratic responses reflects the diversity of democracies’ citizenry and processes.

Authoritarian Regimes: Expand Powers and Make Them Compulsory and Permanent

China is taking advantage of the crisis to expand its surveillance powers and make these more extensive measures permanent. Its Alipay Health Code system is a “black box,” using QR codes to indicate health status without clear justification for the algorithm’s decision. The developers have not publicly revealed how the assessments of green (unrestricted freedom of movement), yellow (possible seven-day quarantine), or red (mandatory two-week quarantine) are determined. Further, The New York Times reported that the application was developed in partnership with police, and its data is possibly transferred to law enforcement and other servers without the user’s explicit consent. In May 2020, the health commission of the city of Hangzhou—one of the first cities to use the app—suggested making this tool permanent, and even expanding its capabilities to assign health scores that encompass additional health records, physical exams, and lifestyle choices like smoking or alcohol consumption.

In Russia, in addition to a mandatory, Communications Ministry–designed program that tracks COVID-19 patients (and accesses device data like location and storage information), the government also introduced a compulsory QR code–based permit in April 2020 for use on public transportation in Moscow. This digital tracking method requires the input of travel routes and an upload of identification documents, tax information, and license plate details. In Iran, developers suspected of building tools for Iranian intelligence agencies helped release an application, AC-19, in early March. The program functioned as a surveillance tool, requesting real-time geolocational information, before its removal from the Google Play store, likely due to Google’s quality control. Its new version, Mask.ir, also collects location information, signaling the regime’s intent to add to the reported 4 million individuals whose data was obtained by the original app.

Swing States: Still Mostly Democratic but Taking Cues from Dictators

Hungary, in a continued expansion of illiberal policies, is using the pandemic to grant Prime Minister Viktor Orbán “extraordinary powers” to “rule by decree” for an indefinite time period. At the technical level, other crackdowns in swing states include the Turkish Health Minister’s mandatory order for infected persons aged 65 years or older to download the “Life Fits in the Home” app, part of a larger effort to isolate and monitor COVID-19 patients. The app uses personal data to track users’ movements and send them text messages if they violate quarantine. If these messages are ignored, repeat offenders are reported to law enforcement and face jail time or other sanctions. Thus far, the Indonesian government’s contact-tracing application, PeduliLindungi, developed by a state-owned telecommunications company, is not mandatory. But if installed, it can help notify national police when large crowds are detected, monitor foreign nationals and citizens returning from recent travel, and feed into a Web-based dashboard operated by the Health Ministry. Finally, India continues to straddle the free world and illiberal practices. Despite officially declaring its COVID-19 detection app, Aarogya Setu, voluntary, India requires government workers to download it; the app collects Bluetooth and location information and determines infection risk using a color-coded system. According to the Home Affairs Ministry, certain cities threaten fines and potential jail time if the app is not used. However, in a display of democracy at work, at least 45 organizations and a critical mass of individuals complained about the “mandatory imposition” of the system, according to India’s Internet Freedom Foundation. In late May 2020, India’s Ministry of Electronics and Information Technology announced that it would release the source code of the Aarogya Setu app.

Democratic and Open Societies: A Chance to Get It Right

While the pandemic has also accelerated surveillance trends in democracies, elements of society are pushing back to slow the wheels of the surveillance machine. This pushback includes resistance from civil society groups, appeals to an independent judiciary and a free press to amplify privacy concerns, and active opposition at the individual level. Consistent with late-April polling by multiple organizations on public attitudes toward contact-tracing technologies developed by tech companies, civil society groups like the ACLU are already pushing back against unfettered use of public health surveillance technology. In a white paper released in May 2020, the organization admonished government officials and private sector leaders to consider the negative impacts of building widespread surveillance infrastructures. In Israel, the Supreme Court in late April 2020 blocked the country’s internal security service, Shin Bet, from using location data from COVID-19 patients. The decision was contingent on the ratification of privacy protection legislation and followed a parliamentary oversight committee’s decision to stall similar use by police the week prior.

Democracies are responding to the crisis in different ways. Some are electing to institute more invasive public health–centered surveillance measures, with minimal nods to privacy. Although South Korea has elected to integrate virus tracking with location data and financial transaction records—among the more expansive COVID-19-related data integration practices in democracies today—local officials have been at least partly responsive to privacy concerns. Korea Centers for Disease Control and Prevention (KCDC) officials turned Korea’s smart-city technologies into expanded surveillance nets during the pandemic but ultimately rejected facial recognition as a part of this data pool due to privacy issues.


The way these new surveillance technologies collect and store data is also the subject of robust debate among policymakers, tech companies, civil society, and other groups. Friction between Apple and the French government illustrates democratic processes at work. In May 2020, France’s government lobbied Apple to make an exception to one of the company’s privacy-enhancing protocols, claiming this “technical hurdle” impeded the French government’s efforts to streamline the efficiency of its “StopCovid” app. According to French officials, Apple deferred to its default settings, upholding its existing privacy standards. French Digital Minister Cédric O confirmed the conflict, stating that French authorities intend to develop a “sovereign European health solution that will be tied to [their] health system.” Nevertheless, French Prime Minister Édouard Philippe allowed representatives in the lower house of Parliament to vote on whether to release the app, in the hope of increasing buy-in and the tool’s legitimacy in the eyes of the French public—a concession the government was previously unwilling to make. Meanwhile, Germany, an original adherent to the centralized architecture, decided in late April 2020 to adopt a mostly privacy-protecting, decentralized approach to digital public health surveillance.
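The decentralized architecture Germany adopted can be made concrete with a short sketch. The Python below is an illustrative approximation of a DP-3T-style exposure-notification scheme, not the code of any deployed app, and every function name here is hypothetical. Phones broadcast short-lived identifiers derived from a secret daily key, and matching against the published keys of diagnosed users happens entirely on the device:

```python
import hashlib
import secrets

def daily_key():
    # Each phone generates a fresh random secret every day; it leaves
    # the device only if the user tests positive and consents to upload.
    return secrets.token_bytes(32)

def rolling_ids(key, intervals=96):
    # Derive short-lived broadcast identifiers (e.g., one per 15-minute
    # interval) from the daily key. Without the key, observers cannot
    # link the identifiers to each other or to a person.
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

def exposure_check(heard, published_keys):
    # Matching runs on-device: re-derive identifiers from keys published
    # by diagnosed users and intersect them with locally heard broadcasts.
    return any(heard & set(rolling_ids(k)) for k in published_keys)
```

Because only the keys of diagnosed, consenting users are ever published, no central server learns who met whom; the absence of a server-side social graph is what separates this decentralized approach from the centralized architecture France pursued.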

Recommendations and Conclusion

The United States should respond to the illiberal use of surveillance technology in four ways:

  1. Draw in like-minded democratic partners to work multilaterally, with a particular focus on global swing states.
  2. Enshrine data privacy protections in statute for American use—demonstrate what “right” looks like.
  3. Support U.S. tech companies in establishing a ruleset for engaging with authoritarian government use of surveillance tech abroad.
  4. Partner with tech companies to develop a “privacy solution,” or commercially viable, privacy-preserving digital products, which can act as the free-world alternative to the illiberal use of surveillance tech.

Recommendation: The United States should actively engage other democracies, especially swing states, to articulate and establish a set of democratic principles that undergird state use of digital surveillance.

  • The State Department should convene a bloc of democratic allies to debate and craft a specific set of democratic norms. Advanced democracies can use Britain’s “D10” approach as a blueprint for this summit of democracies. Once drafted, the United States can leverage its position as a founding member of the Open Government Partnership (OGP)—and similar bodies—to draw countries at risk of instituting repressive digital surveillance policies further into democratic governance models for technology. The United States and other advanced democratic nations should focus on OGP founding governments, including Brazil, Indonesia, and the Philippines, using the OGP’s Peer Learning and Exchange Subcommittee of the Steering Committee as a vehicle for policy discussions. The United States should lobby to include surveillance issues at the top of the agenda for the 2021 OGP Summit. Within these bodies, advanced democracies can:
  • Formalize and fund information-sharing mechanisms for fledgling democracies or aspiring democracies. Advanced democracies can help develop information-sharing processes and tech exchanges between democratic actors from polities on the frontlines (e.g., Hong Kong and Taiwan) to fight back against the authoritarian use of technology. For instance, resource the exchange of surveillance-defeating technology standard operating procedures between vetted dissidents. The State Department can invest in specific disinformation-combating efforts with threatened democracies through the Global Engagement Center.
  • Open dialogue about increasing democratic states’ engagement and cooperation in international standards bodies and intergovernmental organizations to combat the uptick of authoritarian influence in these institutions.

Recommendation: Develop a secure and privacy-protecting alternative for the use of digital data—establishing a model for what “right” looks like.

  • Congress should enshrine data protections for American citizens and consumers by creating a national data protection framework that incentivizes consistent, open, and transparent data practices.
  • To mitigate public safety and security tradeoffs, any government use of surveillance technology in the United States should require keeping the public informed on how their personal information is being used and protected. As part of this framework, Congress should also articulate clear policies and limits around data retention by the federal government, such as strict time limits and no indefinite data storage. Further, biometric data should be classified as “sensitive data,” and additional protections around this data, including limited interoperability, should be added.
  • Any identity management systems used by the federal government must be secure and reliable, based on appropriate standards and measurements, and in accordance with National Institute of Standards and Technology (NIST) guidelines. Congress should also enforce data protection inspections and oversight among agreed-upon parties.
  • Congress should approve and ratify the Cyberspace Solarium Commission’s Key Recommendation 4.7 and pass a “national data security and privacy protection law establishing and standardizing requirements for the collection, retention, and sharing of user data.”

Recommendation: Map the foreign digital surveillance ecosystem to help U.S. tech companies assess risks of abuse of their technology abroad.

  • The State Department should establish a scorecard for U.S. tech companies to reference when operating abroad. The scorecard would assess levels of risk associated with specific uses of surveillance technologies. Tech companies can refer to this scorecard when making policy decisions regarding the potential illiberal use of their technology by foreign governments for surveillance purposes.
  • This scorecard should start by defining what constitutes “abuse” of technology for surveillance purposes (e.g., specific activities or data collection architectures) and provide assessments to help tech companies create their own rulesets to respond to nation-state governments.
  • The scorecard should include a key set of indicators for this potential abuse, especially by repressive regimes or swing states, including but not limited to the following criteria:
    • Data aggregation with established intent of political or social control (including assessments of the level of interoperability and data integration between technical systems).
      • Whether a “national architecture” for data collection on citizens is present or in advanced stages of development. Is a repressive regime laying the groundwork for state surveillance and widespread data aggregation with punitive elements? Is this data collection compulsory, and are these mandates enforced? Are elements of algorithmic scoring present in systems that influence social behavior? Do that nation’s private companies work with the central government to integrate overseas records with data on its citizens held domestically?
    • A significant increase in surveillance technology investments, domestically or abroad.
    • Significant evidence of government subsidizing private industry.
    • A track record of exporting to and investing in surveillance systems for regimes consistently ranked low by Freedom House, the World Bank, and similar indices, or those found to have committed gross violations of human rights as defined in the Foreign Assistance Act of 1961.
    • Systemic risk due to a nation’s political institutions, including lack of an independent judiciary, free press, and mechanisms for recourse against government demands for private data. For instance, China lacks sufficient rule-of-law protections, specific corporate governance practices, and democratic features that would allow companies to resist arbitrary requests for information from the Chinese government.
  • This scorecard should be accompanied by a risk-based compliance framework for export controls, as defined by the State Department, in coordination with the Commerce Department and other relevant agencies.
  • Tech companies, in turn, should assess the systemic as well as reputational risk associated with aiding authoritarian governments and incorporate these factors as standard in their due diligence practices.

Recommendation: Fund the development of freedom-enhancing and privacy-preserving technology and export democratic models of surveillance technology.

  • The federal government and private companies should stack the deck in favor of democratic principles and values by developing commercial solutions to the illiberal spread of surveillance technology.
  • U.S. tech companies should devote substantial engineering capacity to designing protocols with built-in data privacy protections. This “privacy by design” concept builds in safeguards for users at the outset. This can be done by promoting user control of data through differential privacy, federated models of machine learning, or encrypted domain name servers.
  • The U.S. government, via the Defense Advanced Research Projects Agency, the Intelligence Advanced Research Projects Activity (IARPA), the State Department, In-Q-Tel, and NIST, should research and fund the development of privacy-preserving technology solutions. The private sector will drive development in surveillance tech and the ability to exploit it, but too often there are insufficient private sector incentives for privacy-preserving solutions. The United States must be at the forefront of these developments in order to provide an alternative to Chinese technology. The United States can leverage existing technology research and development initiatives via In-Q-Tel or IARPA or through updated exercises like NIST’s 2018 “Differential Privacy Synthetic Data Challenge” to help develop the “privacy solution” hand in hand with the U.S. private sector. Solutions can serve as democratic models of surveillance technology, imbued with privacy protections, for export abroad.
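One of the privacy-preserving techniques named above, differential privacy, can be illustrated in a few lines. The sketch below is a toy example under stated assumptions (the `dp_count` function and its parameters are hypothetical, not drawn from any NIST challenge entry): it answers an aggregate count query with calibrated Laplace noise, so the result is nearly unchanged whether or not any one person’s record is present.

```python
import random

def dp_count(values, threshold, epsilon=0.5):
    # Differentially private count of values above `threshold`.
    # A count query has sensitivity 1, so adding Laplace noise with
    # scale 1/epsilon yields epsilon-differential privacy. The
    # difference of two independent exponentials with rate epsilon
    # is distributed as Laplace(0, 1/epsilon).
    true_count = sum(1 for v in values if v > threshold)
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Averaged over many queries the noisy answer tracks the true count, while any single answer remains deniable; a smaller epsilon buys stronger privacy at the cost of noisier results, which is the kind of tunable tradeoff a commercially viable “privacy solution” would expose.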


From public safety to pandemic response, few governments appear immune to the allure of “tech-solutionism.” Even democracies are falling prey to the siren song of expanded surveillance. But early responses to the pandemic offer lessons for a longer-term battle. Democracies must resist the impulses to build permanent digital surveillance infrastructures or risk losing a broader global contest between open societies and repressive regimes. They must instead fight back on the merits of innovation, using technology and policy solutions grounded in democratic principles.

Acknowledgments

Thank you to Maura McCarthy, Loren DeJonge Schulman, Paul Scharre, Col. Sarah Albrycht, and Megan Lamberth for their reviews and clarifying conversations on all stages of the draft. Maura McCarthy and Melody Cook provided excellent assistance in editing and graphic design. Finally, CNAS would like to thank the Quadrivium Foundation for its generous support of this and other digital freedom initiatives.

  1. “UK seeks alliance to avoid reliance on Chinese tech: The Times,” Reuters, May 28, 2020, https://www.reuters.com/article/us-britain-tech-coalition/uk-seeks-alliance-to-avoid-reliance-on-chinese-tech-the-times-idUSKBN2343JW; The “D10” consists of members of the Group of Seven (G7), which are Canada, France, Germany, Italy, Japan, the United Kingdom, and the United States, plus the addition of critical technology partners Australia, India, and South Korea.
  2. “OGP Global Summit 2019: Ottawa, Canada,” OGP, May 29, 2019, https://www.opengovpartnership.org/events/ogp-global-summit-2019-ottawa-canada/.
  3. Senator Angus King and Representative Mike Gallagher, “Cyberspace Solarium Commission Report” (March 2020), https://drive.google.com/file/d/1ryMCIL_dZ30QyjFqFkkf10MxIXJGT4yv/view, 93.
  4. “Draft U.S. Government Guidance For The Export Of Hardware, Software And Technology With Surveillance Capabilities And/Or Parts/Know-How” (U.S. Department of State, September 2019), https://www.eff.org/files/2019/10/29/draft-guidance-for-the-export-of-hardware-software-and-technology-with-surveillance-capabilities.pdf.
  5. The “privacy solution” can be commercially viable, high-performing digital products or technologies with proper privacy protections engineered early in the design phase. “2018 Differential Privacy Synthetic Data Challenge,” National Institute of Standards and Technology (NIST) Public Safety Communications Research Division, press release, September 18, 2018, https://www.nist.gov/ctl/pscr/open-innovation-prize-challenges/past-prize-challenges/2018-differential-privacy-synthetic.
  6. David Reinsel, Jon Gantz, and John Rydning, “Data Age 2025: The Digitization of the World From Edge to Core,” IDC White Paper No. US44413318 (International Data Corporation, November 2018), https://www.seagate.com/files/www-content/our-story/trends/files/idc-seagate-dataage-whitepaper.pdf; “The Growth in Connected IoT Devices Is Expected to Generate 79.4ZB of Data in 2025, According to a New IDC Forecast,” International Data Corporation, press release, June 18, 2019, https://www.idc.com/getdoc.jsp?containerId=prUS45213219.
  7. David Goldfein, keynote address, Air Force Association Air Warfare Symposium (U.S. Air Force, Orlando, Florida, February 23, 2018), https://www.af.mil/Portals/1/documents/csaf/CSAF_AFA_Orlando-23Feb18.PDF; Gil Press, “IoT Mid-Year Update from IDC and Other Research Firms,” Forbes, August 5, 2016, https://www.forbes.com/sites/gilpress/2016/08/05/iot-mid-year-update-from-idc-and-other-research-firms/#16ec3e8355c5.
  8. Paul Mozur and Aaron Krolik, “A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers,” The New York Times, December 17, 2019, https://www.nytimes.com/2019/12/17/technology/china-surveillance.html.
  9. Surveillance technology comprises a broader ecosystem than the technology alone. It is highly globalized, providing access points for a multitude of players via forced tech transfer, hacking, and other illegitimate and legitimate avenues. Similar to the global banking system, this new digital surveillance ecosystem is an interconnected system of technologies, actors, and markets. For instance, Australian Strategic Policy Institute analyst Samantha Hoffman details the extent of this interdependence in what she calls the “data collection ecosystem,” which can look like one private company using neural machine translation to support a bigger global conglomerate, taking funding from the Chinese party-state, and signing agreements with a next-generation wireless provider. Another typical snapshot of this ecosystem can be illustrated by academic and private sector researchers developing the latest advancements in the technology, video surveillance suppliers like Chinese company Hikvision depending on U.S.-designed and U.S.-imported chips to break into regional markets, and the patronage of the Chinese government and foreign regimes keeping the suppliers in business. Private industry researcher Brady Wang summed up the nature of this ecosystem when he told Nikkei in 2019 that “if one link is broken, the whole industry will feel the pain.” Other transactions within the ecosystem take place at the individual level. For instance, in the fall of 2019, Stanford Internet Observatory Director Alex Stamos emphasized how the “leverage” and influence certain actors have over individual people with potential access to data can play a role in how data is handled. 
Other potential examples of impacts on the ecosystem include governments that put pressure on individuals with families living in authoritarian countries, governments that attempt to convince the World Bank to fund their data collection schemes, and cybercriminals and foreign spies who steal data and personally identifiable information from unsecured surveillance systems.
  10. Select AI-driven technologies to process and analyze data include facial recognition, voice biometrics, emotion recognition, micro-expression recognition technology, etc.
  11. “AI and Compute,” OpenAI blog on OpenAI.com, May 16, 2018, https://openai.com/blog/ai-and-compute/.
  12. Analysis of the surveillance ecosystem components in this section builds on the author’s working paper. Kara Frederick, “Inundata’d: Solving the U.S. Military’s Imbalance Between Data Collection and Processing,” (CNAS, expected publication 2020); Kara Frederick, “How network tools can improve base security,” C4ISRNet, April 12, 2018, https://www.c4isrnet.com/opinion/2018/04/12/how-network-tools-can-improve-base-security/.
  13. Patrick Howell O'Neill, Tate Ryan-Mosley, and Bobbie Johnson, “A flood of coronavirus apps are tracking us. Now it’s time to keep track of them,” MIT Technology Review, May 7, 2020, https://www.technologyreview.com/2020/05/07/1000961/launching-mittr-covid-tracing-tracker/.
  14. David Grout, Richard Weaver, and John Doyle, “The Security and Privacy Implications of COVID-19 Location Data Apps,” FireEye Blogs: Industry Perspectives, May 5, 2020, https://www.fireeye.com/blog/executive-perspective/2020/05/security-privacy-implications-of-covid-19-location-data-apps.html.
  15. Liza Lin and Timothy W. Martin, “How Coronavirus Is Eroding Privacy,” The Wall Street Journal, April 15, 2020, https://www.wsj.com/articles/coronavirus-paves-way-for-new-age-of-digital-surveillance-11586963028.
  16. Sheena Greitens, “‘Surveillance with Chinese Characteristics’: The Development & Global Export of Chinese Policing Technology” (paper presented at Princeton University’s International Relations Faculty Colloquium, Princeton, New Jersey, October 7, 2019), http://ncgg.princeton.edu/IR%20Colloquium/GreitensSept2019.pdf, 2.
  17. “China: Voice Biometric Collection Threatens Privacy,” Human Rights Watch, October 22, 2017, https://www.hrw.org/news/2017/10/22/china-voice-biometric-collection-threatens-privacy.
  18. Paul Mozur (@paulmozur), “The world is not always harmonious, and in China attempts to make it so come through intimidation and often crushing social controls. Now that Beijing’s authoritarianism has the help of very average tech, it’s putting a whole new world of totalizing control within reach.” December 18, 2019, 12:13 a.m. Twitter, https://twitter.com/paulmozur/status/1207166857079181314?s=20.
  19. Paul Scharre (Center for a New American Security Senior Fellow and Director of the Technology and National Security Program), conversation with Kara Frederick, May 20, 2020.
  20. Liza Lin and Newley Purnell, “A World With a Billion Cameras Watching You Is Just Around the Corner,” The Wall Street Journal, December 6, 2019, https://www.wsj.com/articles/a-billion-surveillance-cameras-forecast-to-be-watching-within-two-years-11575565402?mod=hp_listb_pos1; Oliver Philippou, “Video Surveillance Installed Base Report—2019,” IHS Markit, December 5, 2019, https://technology.ihs.com/607069/video-surveillance-installed-base-report-2019.
  21. Paul Bischoff, “Surveillance camera statistics: which cities have the most CCTV cameras?” Comparitech, August 15, 2019, https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/.
  22. Associated Press, “China’s Sharp Eyes surveillance system puts the security focus on public shaming,” South China Morning Post, October 30, 2018, https://www.scmp.com/news/china/politics/article/2170834/chinas-sharp-eyes-surveillance-system-puts-security-focus-public.
  23. “Inside China’s surveillance state,” Financial Times Magazine, 2018; https://www.ft.com/content/2182eebe-8a17-11e8-bf9e-8771d5404543; “High School Students in Eastern China to Get Facial Monitoring in Class,” Radio Free Asia, May 18, 2018, https://www.rfa.org/english/news/china/high-school-students-in-eastern-china-to-get-facial-monitoring-in-class-05182018113315.html.
  24. Kara Frederick, Fellow at Center for a New American Security, testimony to the Subcommittee on Crime and Terrorism, Judiciary Committee, U.S. Senate, November 5, 2019.
  25. Jane Li, “How people in China are trying to evade Beijing’s digital surveillance,” Quartz, August 6, 2019, https://qz.com/1659328/chinese-people-are-pushing-back-on-beijings-digital-surveillance/; Elliot Alderson, “MFSocket: A Chinese surveillance tool,” Medium, June 25, 2019, https://medium.com/@fs0c131y/mfsocket-a-chinese-surveillance-tool-58e8850c3de4.
  26. Christian Shepherd and Yuan Yang, “Chinese police use app to spy on citizens’ smartphones,” Financial Times, July 4, 2019, https://www.ft.com/content/73aebaaa-98a9-11e9-8cfb-30c211dcd229.
  27. “China: Voice Biometric Collection Threatens Privacy,” Human Rights Watch.
  28. “China: Voice Biometric Collection Threatens Privacy,” Human Rights Watch.
  29. Chris Buckley and Paul Mozur, “How China Uses High-Tech Surveillance to Subdue Minorities,” The New York Times, May 22, 2019, https://www.nytimes.com/2019/0....
  30. Megha Rajagopalan, “They Thought They’d Left The Surveillance State Behind. They Were Wrong,” BuzzFeed News, July 9, 2018, https://www.buzzfeednews.com/article/meghara/china-uighur-spies-surveillance; Josh Chin and Clément Bürge, “Beijing Squeezes Exiles in U.S. by Detaining Family Back Home,” The Wall Street Journal, March 30, 2018, https://www.wsj.com/articles/beijing-squeezes-exiles-in-u-s-by-detaining-family-back-home-1522402202.
  31. “China’s Algorithms of Repression: Reverse Engineering a Xinjiang Police Mass Surveillance App,” Human Rights Watch, May 1, 2019, https://www.hrw.org/report/2019/05/02/chinas-algorithms-repression/reverse-engineering-xinjiang-police-mass.
  32. Maya Wang, “China’s Bumbling Police State,” The Wall Street Journal, December 26, 2018, https://www.wsj.com/articles/chinas-bumbling-police-state-11545869066.
  33. Paul Mozur (@paulmozur), “The trackers solve a key problem for police: consolidating data. China’s telcos don’t share location data for mass surveillance w/ local police, mostly out of fear the data will be sold. So police build their own. They can interfere with telco networks, which annoys the telcos.” December 17, 2019, 11:35 p.m. Twitter, https://twitter.com/paulmozur/status/1207157455085457408?s=20.
  34. Rui Zhong and James Palmer, “Wuhan’s Virus and Quarantine Will Hit Poor Hardest,” Foreign Policy, January 22, 2020, https://foreignpolicy.com/2020/01/22/wuhan-coronavirus-quarantine-china-will-hit-poor-hardest/.
  35. Lin and Purnell, “A World With a Billion Cameras Watching You Is Just Around the Corner.”
  36. Isobel Cockerell, “Inside China’s Massive Surveillance Operation,” Wired, May 9, 2019, https://www.wired.com/story/inside-chinas-massive-surveillance-operation/.
  37. Jeffrey Knockel, Lotus Ruan, Masashi Crete-Nishihata, and Ron Deibert, “(Can’t) Picture This: An Analysis of Image Filtering on WeChat Moments,” The Citizen Lab, August 14, 2018, https://citizenlab.ca/2018/08/cant-picture-this-an-analysis-of-image-filtering-on-wechat-moments/.
  38. Paul Mozur (@paulmozur), “China’s internet police are increasingly responding in real time to question people who have said things deemed questionable online. Eventually the goal is to link all online and offline behavior. … What the police are doing is putting in the ground floor on a system to control reality as tightly as the internet.” December 17, 2019, 11:12 p.m. Twitter, https://twitter.com/paulmozur/status/1207163596691558401.
  39. Lauly Li, Coco Liu, and Cheng Ting-Fang, “China’s ‘sharp eyes’ offer chance to take surveillance industry global,” Nikkei Asian Review, June 5, 2019, https://asia.nikkei.com/Business/China-tech/China-s-sharp-eyes-offer-chance-to-take-surveillance-industry-global.
  40. Samantha Hoffman, “Engineering Global Consent: The Chinese Communist Party’s data-driven power expansion,” Australian Strategic Policy Institute, October 14, 2019, https://www.aspi.org.au/report/engineering-global-consent-chinese-communist-partys-data-driven-power-expansion.
  41. Danielle Cave, Samantha Hoffman, Alex Joske, Fergus Ryan, and Elise Thomas, “Mapping China’s Tech Giants,” Australian Strategic Policy Institute, April 18, 2019, https://www.aspi.org.au/report/mapping-chinas-tech-giants.
  42. “China: Voice Biometric Collection Threatens Privacy,” Human Rights Watch.
  43. Paul Mozur, Jonah M. Kessel, and Melissa Chan, “Made in China, Exported to the World: The Surveillance State,” The New York Times, April 24, 2019, https://www.nytimes.com/2019/04/24/technology/ecuador-surveillance-cameras-police-government.html.
  44. Li Tao, “Malaysian police wear Chinese start-up’s AI camera to identify suspected criminals,” South China Morning Post, April 20, 2018, https://www.scmp.com/tech/social-gadgets/article/2142497/malaysian-police-wear-chinese-start-ups-ai-camera-identify.
  45. Aradhana Aravindan and John Geddie, “Singapore to test facial recognition on lampposts, stoking privacy fears,” Reuters, April 13, 2018, https://www.reuters.com/article/us-singapore-surveillance-idUSKBN1HK0RV; “China AI firm Yitu opens R&D centre in Singapore,” The Straits Times, February 1, 2019, https://www.straitstimes.com/business/china-ai-firm-yitu-opens-rd-centre-in-spore.
  46. “Freedom on the Net 2018: The Rise of Digital Authoritarianism,” Freedom House, 2018, https://pbs.twimg.com/media/Du4Nb84XgAEBF-L.jpg; Lin and Purnell, “A World With a Billion Cameras Watching You Is Just Around the Corner”; and Aravindan and Geddie, “Singapore to test facial recognition on lampposts, stoking privacy fears.”
  47. Archana Narayanan, “World’s Largest AI Startup SenseTime to Open Abu Dhabi Hub,” Bloomberg, July 23, 2019, https://www.bloomberg.com/news/articles/2019-07-23/world-s-largest-ai-startup-sensetime-to-set-up-hub-in-abu-dhabi; Megha Rajagopalan, “Facial Recognition Technology Is Facing a Huge Backlash in the US. but Some of the World’s Biggest Tech Companies Are Trying to Sell It in The Gulf,” Buzzfeed News, May 29, 2019, https://www.buzzfeednews.com/article/meghara/dubai-facial-recognition-technology-ibm-huawei-hikvision.
  48. “Chinese facial recognition tech installed in nations vulnerable to abuse,” CBS News, October 16, 2019, https://www.cbsnews.com/news/china-huawei-face-recognition-cameras-serbia-other-countries-questionable-human-rights-2019-10-16/.
  49. Steven Feldstein, “The Global Expansion of AI Surveillance,” Carnegie Endowment for International Peace, September 17, 2019, https://carnegieendowment.org/files/WP-Feldstein-AISurveillance_final1.pdf.
  50. Mozur, Kessel, and Chan, “Made in China, Exported to the World.”
  51. Krystal Hu and Jeffrey Dastin, “Exclusive: Amazon turns to Chinese firm on U.S. blacklist to meet thermal camera needs,” Reuters, April 29, 2020, https://www.reuters.com/article/us-health-coronavirus-amazon-com-cameras/exclusive-amazon-turns-to-chinese-firm-on-u-s-blacklist-to-meet-thermal-camera-needs-idUSKBN22B1AL?utm_source=Twitter&utm_medium=Social.
  52. Elsa Kania, Samm Sacks, Paul Triolo, and Graham Webster, “China’s Strategic Thinking on Building Power in Cyberspace,” New America, September 25, 2017, https://www.newamerica.org/cybersecurity-initiative/blog/chinas-strategic-thinking-building-power-cyberspace/.
  53. Li, Liu, and Ting-Fang, “China’s ‘sharp eyes’ offer chance to take surveillance industry global.”
  54. Anna Gross and Madhumita Murgia, “China shows its dominance in surveillance technology,” Financial Times, December 26, 2019, https://www.ft.com/content/b34d8ff8-21b4-11ea-92da-f0c92e957a96.
  55. “Vietnam: Big Brother Is Watching Everyone,” Human Rights Watch, December 20, 2018, https://www.hrw.org/news/2018/12/20/vietnam-big-brother-watching-everyone; Frederick, testimony to the Subcommittee on Crime and Terrorism, Judiciary Committee.
  56. Alina Polyakova, “Russia Is Teaching the World to Spy,” The New York Times, December 5, 2019, https://www.nytimes.com/2019/12/05/opinion/russia-hacking.html.
  57. According to the January 2016 issue of the Journal of Democracy, democratic backsliding is the “state-led debilitation or elimination of the political institutions sustaining an existing democracy.” Nancy Bermeo, “On Democratic Backsliding,” Journal of Democracy 27, no. 1, 5-19.
  58. Lin and Purnell, “A World With a Billion Cameras Watching You Is Just Around the Corner.”
  59. Maria Laura Canineu, “High-tech surveillance: from China to Brazil?” Human Rights Watch, May 31, 2019, https://www.hrw.org/news/2019/05/31/high-tech-surveillance-china-brazil.
  60. Canineu, “High-tech surveillance: from China to Brazil?”
  61. Ana Ionova, “Brazil takes a page from China, taps facial recognition to solve crime,” The Christian Science Monitor, February 11, 2020, https://www.csmonitor.com/World/Americas/2020/0211/Brazil-takes-a-page-from-China-taps-facial-recognition-to-solve-crime; and Aiuri Rebello, “Bancada do PSL vai à China conhecer sistema que reconhece rosto de cidadãos,” Folha de S. Paulo, January 16, 2019, https://www1.folha.uol.com.br/mercado/2019/01/bancada-do-psl-vai-a-china-importar-sistema-que-reconhece-rosto-de-cidadaos.shtml.
  62. Niharika Mandhana, “Huawei’s Video Surveillance Business Hits Snag in Philippines,” The Wall Street Journal, February 20, 2019, https://www.wsj.com/articles/huaweis-video-surveillance-business-hits-snag-in-philippines-11550683135; CNN Philippines Staff, “DILG launches Chinese CCTV surveillance system in Metro Manila,” CNN Philippines, November 22, 2019, https://cnnphilippines.com/news/2019/11/22/DILG-Chinese-CCTV-Manila-Safe-Philippines.html.
  63. Scharre, conversation with Kara Frederick.
  64. Lin and Purnell, “A World With a Billion Cameras Watching You Is Just Around the Corner.”
  65. CLEAR Homepage, 2020, https://www.clearme.com/.
  66. Gretta L. Goodwin, Director of Homeland Security and Justice, testimony to the Committee on Oversight and Reform, U.S. House of Representatives, June 4, 2019, https://www.gao.gov/assets/700....
  67. Arthur Holland Michel, Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All (New York: Houghton Mifflin Harcourt, 2019); Christopher Mims, “When Battlefield Surveillance Comes to Your Town,” The Wall Street Journal, August 3, 2019, https://www.wsj.com/articles/when-battlefield-surveillance-comes-to-your-town-11564805394.
  68. Regina Garcia Cano, “Police surveillance planes to fly above Baltimore in 2020,” Associated Press, December 20, 2019, https://apnews.com/537d25e5269f08477a42a94d66784233.
  69. Katy Stegall, “Military-grade drone will fly over San Diego next year,” San Diego Union Tribune, December 26, 2019, https://www.sandiegouniontribune.com/news/watchdog/story/2019-12-26/city-council-and-public-were-unaware-of-military-grade-drone-test-flight; Patrick Tucker, “Look for Military Drones to Begin Replacing Police Helicopters by 2025,” Defense One, August 28, 2017, https://www.defenseone.com/technology/2017/08/look-military-drones-replace-police-helicopters-2025/140588/.
  70. Dan Gettinger, “Public Safety Drones, 3rd Edition,” (Center for the Study of the Drone at Bard College, March 2020), https://dronecenter.bard.edu/projects/public-safety-drones-project/public-safety-drones-3rd-edition/. Widespread public safety surveillance efforts are not distinct to aerial surveillance. Municipalities such as Louisville, Kentucky, and Kansas City, Missouri, offer wearable devices to pinpoint poor air quality and place sensors on streetlights to provide alternative traffic suggestions. Other cities, including New Orleans, use predictive analytics to improve emergency response times.
  71. Charlotte Hathaway, “Met Police begins operational use of Live Facial Recognition (LFR) technology,” Land Mobile, January 24, 2020, https://www.landmobile.co.uk/news/metropolitan-police-service-nec-live-facial-recognition/.
  72. Kim Hart and Aïda Amer, “The privacy worries with smart cities,” Axios, December 24, 2019, https://www.axios.com/toronto-sidewalk-labs-smart-cities-stalled-distrust-big-tech-government-95cd21c4-39f6-4c9e-ae98-bc393ca85e75.html.
  73. Grassroots efforts to rein in surveillance do not stop at government use. Skepticism toward surveillance technology deployed by private companies is also growing. A 2019 Pew Research Center survey found that 81 percent of Americans believe the risks of data collection by companies outweigh the benefits, and 66 percent say the same of data collection by the government. Majorities also report concern about how their data is used by companies (79 percent) and by the government (64 percent), and most feel they have little or no control over how these entities use their personal information. Scandals, such as the one that resulted from IBM’s 2019 use of millions of photos from unwitting citizens on the photo hosting site Flickr to improve its algorithms, hurt public trust in this type of data collection, especially facial recognition. The hacking of a border surveillance contractor in Texas and the BioStar 2 leaks of biometric data in the United Kingdom are similarly troubling for public trust trajectories. Only 36 percent of Americans say they trust tech companies—and only 18 percent say they trust advertisers—to use facial recognition responsibly. This trend will likely result in increased pushback that makes use of the democratic system.
  74. Wolfie Christl (@WolfieChristl), “German @SZ took an Android/Xiaomi phone, installed 14 typical apps and found 7305 data transmissions to 636 servers in 24 hours - thereof 64% during locked screen and 18% during the night. In the night, Xiaomi received a list of apps used + length of use.” December 15, 2019, 4:28 p.m. Twitter, https://twitter.com/WolfieChristl/status/1206325297777455106.
  75. Jennifer Valentino-DeVries, “Tracking Phones, Google Is a Dragnet for the Police,” The New York Times, April 13, 2019, https://www.nytimes.com/interactive/2019/04/13/us/google-location-tracking-police.html; Thomas Brewster, “Google Hands Feds 1,500 Phone Locations In Unprecedented ‘Geofence’ Search,” Forbes, December 11, 2019, https://www.forbes.com/sites/thomasbrewster/2019/12/11/google-gives-feds-1500-leads-to-arsonist-smartphones-in-unprecedented-geofence-search/#628b8f4427dc.
  76. Carrie Cordero, “Corporate Data Collection and U.S. National Security: Expanding the Conversation in an Era of Nation State Cyber Aggression,” Lawfare, June 1, 2018, https://www.lawfareblog.com/corporate-data-collection-and-us-national-security-expanding-conversation-era-nation-state-cyber; Sheldon Whitehouse, “Why Americans Hate Government Surveillance but Tolerate Corporate Data Aggregators,” Lawfare, June 2, 2015, https://www.lawfareblog.com/why-americans-hate-government-surveillance-tolerate-corporate-data-aggregators.
  77. Layering our national technical means with these techniques and other open source information is a good starting point.
  78. Tom Simonite, “Amazon Joins Microsoft’s Call for Rules on Facial Recognition,” Wired, February 7, 2019, https://www.wired.com/story/amazon-joins-microsofts-call-rules-facial-recognition/.
  79. This figure was compiled by the Center on Privacy and Technology at Georgetown Law. Georgetown Privacy (@GeorgetownCPT), “So far in 2020, 10 state legislatures have introduced bills involving #FacialRecognition: [graphic].” January 17, 2020, 12:25 p.m. Twitter, https://twitter.com/GeorgetownCPT/status/1218222879097049088.
  80. Kate Conger, Richard Fausset, and Serge F. Kovaleski, “San Francisco Bans Facial Recognition Technology,” The New York Times, May 14, 2019, https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html?smtyp=cur&smid=tw-nytimes; Sarah Ravani, “Oakland bans use of facial recognition technology, citing bias concerns,” The San Francisco Chronicle, July 17, 2019, https://www.sfchronicle.com/bayarea/article/Oakland-bans-use-of-facial-recognition-14101253.php; Sarah Wu, “Somerville City Council passes facial recognition ban,” The Boston Globe, June 27, 2019, https://www.bostonglobe.com/metro/2019/06/27/somerville-city-council-passes-facial-recognition-ban/SfaqQ7mG3DGulXonBHSCYK/story.html; and Nathan Sheard, “Victory: Brookline Votes to Ban Face Surveillance,” Electronic Frontier Foundation, December 20, 2019, https://www.eff.org/deeplinks/2019/12/victory-brookline-votes-ban-face-surveillance.
  81. Orion Rummler, “2020’s first wave of facial surveillance bills,” Axios, January 18, 2020, https://www.axios.com/facial-surveillance-legislation-2020-47063834-a7fb-47bf-b53c-e770b0e16d1a.html.
  82. Matthew Keegan, “Big Brother is watching: Chinese city with 2.6m cameras is world's most heavily surveilled,” The Guardian, December 2, 2019, https://www.theguardian.com/cities/2019/dec/02/big-brother-is-watching-chinese-city-with-26m-cameras-is-worlds-most-heavily-surveilled.
  83. “IBM CEO’s Letter to Congress on Racial Justice Reform,” IBM, press release, June 8, 2020, https://www.ibm.com/blogs/policy/facial-recognition-susset-racial-justice-reforms/.
  84. Frederick, “How network tools can improve base security”; Curtis Waltman, “Meet Babel Street, the Powerful Social Media Surveillance Used by Police, Secret Service, and Sports Stadiums,” Vice, April 17, 2017, https://www.vice.com/en_us/article/gv7g3m/meet-babel-street-the-powerful-social-media-surveillance-used-by-police-secret-service-and-sports-stadiums.
  85. Kashmir Hill, “The Secretive Company That Might End Privacy as We Know It,” The New York Times, January 18, 2020, https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html; and “ACLU Sues Clearview,” American Civil Liberties Union, press release, May 28, 2020, https://www.aclu.org/press-releases/aclu-sues-clearview-ai.
  86. Contact tracing makes use of Bluetooth or location data to notify users if they come in contact with an individual who tests positive for COVID-19.
  87. Paul Mozur, Raymond Zhong, and Aaron Krolik, “In Coronavirus Fight, China Gives Citizens a Color Code, With Red Flags,” The New York Times, March 1, 2020, https://www.nytimes.com/2020/03/01/business/china-coronavirus-surveillance.html.
  88. Liza Lin, “China’s Plan to Make Permanent Health Tracking on Smartphones Stirs Concern,” The Wall Street Journal, May 25, 2020, https://www.wsj.com/articles/chinas-plan-to-make-permanent-health-tracking-on-smartphones-stirs-concern-11590422497?mod=tech_lead_pos4.
  89. “Mobile Location Data and COVID-19: Q&A,” Human Rights Watch, May 13, 2020, https://www.hrw.org/news/2020/05/13/mobile-location-data-and-covid-19-qa.
  90. Mary Ilyushina, “Moscow rolls out digital tracking to enforce lockdown. Critics dub it a 'cyber Gulag',” CNN World, April 14, 2020, https://edition.cnn.com/2020/04/14/world/moscow-cyber-tracking-qr-code-intl/index.html.
  91. Jon Fingas, “Iran's coronavirus 'diagnosis' app looks more like a surveillance tool,” Engadget, March 14, 2020, https://www.engadget.com/2020-03-14-irans-coronavirus-diagnosis-app-looks-more-like-a-surveillanc.html; Catalin Cimpanu, “Spying concerns raised over Iran's official COVID-19 detection app,” ZDNet, March 9, 2020, https://www.zdnet.com/article/spying-concerns-raised-over-irans-official-covid-19-detection-app/; and Victoria Song, “Google Pulls Iran's Official Coronavirus App from Play Store,” Gizmodo, March 10, 2020, https://gizmodo.com/google-pulls-irans-official-coronavirus-app-from-play-s-1842235444.
  92. O'Neill, Ryan-Mosley, and Johnson, “A flood of coronavirus apps are tracking us. Now it’s time to keep track of them.”
  93. Nick Thorpe, “Coronavirus: Hungary government gets sweeping powers,” BBC News, March 30, 2020, https://www.bbc.com/news/world-europe-52095500.
  94. Kareem Fahim, Min Joo Kim, and Steve Hendrix, “Cellphone monitoring is spreading with the coronavirus. So is an uneasy tolerance of surveillance,” The Washington Post, May 2, 2020, https://www.washingtonpost.com/world/cellphone-monitoring-is-spreading-with-the-coronavirus-so-is-an-uneasy-tolerance-of-surveillance/2020/05/02/56f14466-7b55-11ea-a311-adb1344719a9_story.html.
  95. “Mobile Location Data and COVID-19: Q&A,” Human Rights Watch; Fahim, Kim, and Hendrix, “Cellphone monitoring is spreading with the coronavirus. So is an uneasy tolerance of surveillance.”
  96. Tara Marchelin, “Minister Encourages Indonesians to Install Covid-19 Surveillance App,” Jakarta Globe, April 8, 2020, https://jakartaglobe.id/news/minister-encourages-indonesians-to-install-covid19-surveillance-app; and Rizki Fachriansyah and Ardila Syakriah, “COVID-19: Indonesia develops surveillance app to bolster contact tracing, tracking,” The Jakarta Post, March 30, 2020, https://www.thejakartapost.com/news/2020/03/30/covid-19-indonesia-develops-surveillance-app-to-bolster-contact-tracing-tracking.html.
  97. Patrick Howell O'Neill, “India is forcing people to use its COVID app, unlike any other democracy,” MIT Technology Review, May 7, 2020, https://www.technologyreview.com/2020/05/07/1001360/india-aarogya-setu-covid-app-mandatory/.
  98. Payal Dhar, “COVID-19 Could Turn India Into a Surveillance State,” Slate Future Tense, May 11, 2020, https://slate.com/technology/2020/05/covid19-india-surveillance-aargoya-setu.amp.
  99. “45 organizations and more than 100 prominent individuals push back against the coercion of Aarogya Setu,” Internet Freedom Foundation, May 2, 2020, https://internetfreedom.in/45-organizations-and-105-prominent-individuals-push-back-against-the-coercion-of-aarogya-setu/.
  100. Manish Singh, “India’s contact-tracing app is going open-source,” TechCrunch, May 26, 2020, https://techcrunch.com/2020/05/26/aarogya-setu-india-source-code-release/.
  101. After an initial surge of rhetorical enthusiasm, Americans appear to be developing some resistance to contact-tracing apps, with particular concern over who controls the data. According to a joint Axios-Ipsos poll in 2020, only around a third of Americans would likely opt in to cellphone-based contact-tracing systems created by major tech companies. Further, a poll conducted by The Washington Post and the University of Maryland concluded in April that nearly three in five Americans would be unable or unwilling to use the application programming interface (API) developed by Apple and Google. In the same poll, Americans demonstrated skepticism over tech companies’ handling of health data privacy, with only 43 percent stating they “trust” big tech companies. The Kaiser Family Foundation concluded in late April that twice as many individuals would be willing to download a contact-tracing application if it was “managed by public health agencies rather than tech companies.”
  102. Jay Stanley, “Temperature Screening and Civil Liberties During an Epidemic,” American Civil Liberties Union, May 19, 2020, https://www.aclu.org/aclu-white-paper-temperature-screening-and-civil-liberties-during-epidemic; “Coronavirus: Israeli court bans lawless contact tracing,” BBC News, April 27, 2020, https://www.bbc.com/news/technology-52439145; and “Coronavirus: Israel halts police phone tracking over privacy concerns,” BBC News, April 23, 2020, https://www.bbc.com/news/technology-52395886.
  103. “Coronavirus: Israeli court bans lawless contact tracing”; and “Coronavirus: Israel halts police phone tracking over privacy concerns.”
  104. Hyonhee Shin, Hyunjoo Jin, and Josh Smith, “How South Korea turned an urban planning system into a virus tracking database,” Reuters, May 21, 2020, https://www.reuters.com/article/us-health-coronavirus-southkorea-tracing/how-south-korea-turned-an-urban-planning-system-into-a-virus-tracking-database-idUSKBN22Y03I.
  105. At the contact-tracing application level, these fissures can be loosely characterized as a choice between centralized and decentralized architectures in the design of the technology. Centralized methods transfer personal data, such as location information or identifying information like a phone number, to a central data repository, likely located on a large server. Decentralized applications use short-range signals like wireless Bluetooth “handshakes,” in which interactions between devices stay at the device level, without tracking location data through the use of cell towers or GPS.
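The decentralized design described in this note can be sketched in a few lines of toy code. This is an illustrative model only, with hypothetical class and method names; it is not any real protocol (such as the Apple-Google API), but it shows the core privacy property: devices exchange rolling random tokens locally, and exposure checks happen on-device against a published list of a patient's tokens, so no location data or identities reach a central server.

```python
import secrets

class Device:
    """Toy model of a decentralized contact-tracing device (hypothetical)."""
    def __init__(self):
        self.my_ids = []        # random tokens this device has broadcast
        self.heard_ids = set()  # tokens received from nearby devices

    def next_broadcast_id(self):
        # A fresh random token per interval prevents long-term tracking.
        token = secrets.token_hex(8)
        self.my_ids.append(token)
        return token

    def handshake(self, other):
        # Simulated Bluetooth proximity exchange: each side records the
        # other's current token locally; nothing is sent to a server.
        self.heard_ids.add(other.next_broadcast_id())
        other.heard_ids.add(self.next_broadcast_id())

    def exposed(self, published_positive_ids):
        # After a positive test, only the patient's random tokens are
        # published; each device checks for matches on-device.
        return bool(self.heard_ids & set(published_positive_ids))

alice, bob, carol = Device(), Device(), Device()
alice.handshake(bob)        # Alice and Bob were in proximity
published = bob.my_ids      # Bob tests positive; only his tokens are shared
print(alice.exposed(published))  # True: Alice was near Bob
print(carol.exposed(published))  # False: Carol never met Bob
```

A centralized design would instead upload each user's identifiers and contact or location logs to the server, which performs the matching itself; that is the architectural fault line the note describes.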
  106. Sudip Kar-Gupta and Michel Rose, “France accuses Apple of refusing help with ‘StopCovid’ app,” Reuters, May 5, 2020, https://www.reuters.com/article/us-health-coronavirus-france-tech/france-accuses-apple-of-refusing-help-with-stopcovid-app-idUSKBN22H0LX.
  107. Helene Fouquet, “France Says Apple Bluetooth Policy Is Blocking Virus Tracker,” Bloomberg Technology, April 20, 2020, https://www.bloomberg.com/news/articles/2020-04-20/france-says-apple-s-bluetooth-policy-is-blocking-virus-tracker; Leo Kelion, “Coronavirus: Apple and France in stand-off over contact-tracing app,” BBC News, April 21, 2020, https://www.bbc.com/news/technology-52366129. This default setting acts as a privacy-preserving measure, according to Apple, and blocks Bluetooth access if a user is not actively running the application.
  108. Fouquet, “France Says Apple Bluetooth Policy Is Blocking Virus Tracker.”
  109. Kelion, “Coronavirus: Apple and France in stand-off over contact-tracing app”; Romain Dillet, “French contact-tracing app StopCovid passes first vote,” TechCrunch, May 27, 2020, https://techcrunch.com/2020/05/27/french-contact-tracing-app-stopcovid-passes-first-vote/.
  110. Douglas Busvine, Andreas Rinke, “Germany flips to Apple-Google approach on smartphone contact tracing,” Reuters, April 26, 2020, https://www.reuters.com/article/us-health-coronavirus-europe-tech/germany-flips-on-smartphone-contact-tracing-backs-apple-and-google-idUSKCN22807J.
  111. These recommendations draw heavily from the author’s congressional testimony in November 2019 and her paper submitted to the Cyberspace Solarium Commission on October 1, 2019, “Reclaiming Cyber Governance as a Bulwark Against—and Not a Tool of—Illiberalism,” for their 2020 report.
  112. “UK seeks alliance to avoid reliance on Chinese tech,” Reuters.
  113. “OGP Global Summit 2019: Ottawa, Canada,” OGP.
  114. Ely Ratner et al., “Rising to the China Challenge,” (Center for a New American Security, January 28, 2020), https://www.cnas.org/publications/reports/rising-to-the-china-challenge.
  115. “NIST Testimony: Facial Recognition Technology (FRT),” NIST, March 22, 2017, https://www.nist.gov/speech-testimony/facial-recognition-technology-frt.
  116. Miles Brundage et al., “The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation” (Future of Humanity Institute, February 2018), https://arxiv.org/pdf/1802.07228.pdf.
  117. King and Gallagher, “Cyberspace Solarium Commission Report,” 93.
  118. For example, this scorecard could help WhatsApp assess its response to India’s request to weaken the Facebook-owned company’s privacy restrictions so Indian officials could, in principle, identify individuals who send messages the central government deems problematic.
  119. Similarly, Congress can consider these characteristics when assessing whether a surveillance system or exported surveillance technologies will be used for human rights abuses.
  120. Samantha Hoffman, “Managing the State: Social Credit, Surveillance and the CCP’s Plan for China,” The Jamestown Foundation, August 17, 2017, https://jamestown.org/program/managing-the-state-social-credit-surveillance-and-the-ccps-plan-for-china/.
  121. “Draft U.S. Government Guidance For The Export Of Hardware, Software And Technology.”
  122. “Draft U.S. Government Guidance For The Export Of Hardware, Software And Technology.”
  123. The “privacy solution” can be commercially viable, high-performing digital products with proper privacy protections engineered early in the design phase; “2018 Differential Privacy Synthetic Data Challenge.”

Authors

  • Kara Frederick

    Fellow, Technology and National Security Program

    Kara Frederick is a Fellow for the Technology and National Security Program at the Center for a New American Security (CNAS). Prior to joining CNAS, Kara helped create and lea...
