Does Covid-19 Contact Tracing Pose a Privacy Risk? Your Questions, Answered

Apple and Google's Bluetooth-based system isn't perfect. But many of the biggest concerns have solutions.

When Google and Apple announced last week that the two companies are building changes into Android and iOS to enable Bluetooth-based Covid-19 contact tracing, they touched off an immediate firestorm of criticisms. The notion of a Silicon Valley scheme to monitor yet another metric of our lives raised immediate questions about the system's practicality and its privacy. Now it's time to seek answers.

Apple and Google say that starting next month they'll add new features to their mobile operating systems that make it possible for certain approved apps, run by government health agencies, to use Bluetooth radios to track physical proximity between phones. If someone later receives a positive Covid-19 diagnosis, they can report it through the app, and any users who have been in recent contact will receive a notification. The system is Bluetooth-only, fully opt-in, collects no location data from users, and no data at all from anyone without a positive Covid-19 diagnosis. Apple and Google chose perhaps the most privacy-friendly of the many different schemes that could allow automated smartphone contact tracing.

But that doesn't necessarily mean it's private enough or practical. Security and privacy-focused technologists have pointed to a long list of potential flaws in Apple and Google's system, including techniques that could reveal the identities of Covid-19 positive users or help advertisers track them, false positives from trolls, mistaken self-diagnoses, and faulty signals between phones.

Those problems are real—but some have solutions. WIRED spoke to cryptographers and security experts about the potential pitfalls of Bluetooth contact tracing, and then posed those issues to a few of the technologists helping to build the contact-tracing systems at Apple, Google, and a consortium of more than a dozen groups focused on Bluetooth-based contact tracing called the TCN Coalition, including groups like Covid Watch, Co-Epi, and Novid.


The result is a complicated picture—an unproven system whose imperfections could drive users away from adopting it, or even result in unintended privacy violations. And yet it may also preserve privacy in the most important ways, while also serving as a significant tool to help countries around the world prevent new outbreaks.

The criticisms of the Bluetooth-based system outlined below don't encompass some of the larger sociological and political issues surrounding smartphone contact tracing. Any effective contact tracing will require testing for Covid-19 to ramp up far past current levels. Diagnosed or exposed individuals need the economic freedom and space to self-quarantine. And many low-income or older folks—those who appear to be most at risk—are less likely to have smartphones. Instead, we'll examine the more immediate question of potential technical vulnerabilities in the system.

Can It Be Used to Track People?

The likeliest concern for anyone taking part in a contact-tracing system is whether they're signing up for more surveillance. Bluetooth-based contact tracing is perhaps the least surveillance-friendly option, but its protections aren't perfect.

To understand those flaws, first a refresher on how Google and Apple's scheme—and the similar one proposed by the TCN Coalition—will work. Contact-tracing apps will constantly broadcast unique, rotating Bluetooth codes that are derived from a cryptographic key that changes once each day. At the same time, they'll constantly monitor the phones around them, recording the codes of any other phones they encounter within a certain range for a certain length of time—say, within six feet for 10 minutes. (Both numbers are "tunable" based on new data about how Covid-19 infections are occurring.) When a user reports a positive Covid-19 diagnosis, their app uploads the cryptographic keys that were used to generate their codes over the last two weeks to a server. Everyone else's app then downloads those daily keys and uses them to recreate the unique rotating codes they generated. If it finds a match with one of its stored codes, the app will notify that person that they may have been exposed, and will then show them information about self-quarantining or getting tested themselves.
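That key-and-code dance can be sketched in a few lines of Python. This is a simplified illustration, not the actual Apple/Google specification, which defines its own key-derivation functions, code lengths, and rotation schedule; HMAC-SHA256 stands in for the real derivation here.

```python
import hashlib
import hmac
import os

def daily_key() -> bytes:
    # Each phone generates a fresh random key once per day.
    return os.urandom(16)

def rolling_codes(key: bytes, codes_per_day: int = 96) -> list:
    # Derive the short-lived broadcast codes from the daily key,
    # one per 15-minute interval. (Stand-in for the real KDF.)
    return [hmac.new(key, i.to_bytes(2, "big"), hashlib.sha256).digest()[:16]
            for i in range(codes_per_day)]

# Phone A broadcasts its rotating codes; phone B logs the one it
# hears while the two phones are near each other.
key_a = daily_key()
heard_by_b = {rolling_codes(key_a)[40]}

# A tests positive and uploads its daily keys. B downloads them,
# regenerates every code each key produced, and checks for overlap
# with its own log of codes it heard.
published_keys = [key_a]
exposed = any(code in heard_by_b
              for k in published_keys
              for code in rolling_codes(k))
```

Because the codes are derived deterministically from the daily key, B can confirm an encounter without A's key ever revealing who A is or where the two phones met.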

The system involves every phone constantly broadcasting Bluetooth codes, but limits any snoop's ability to eavesdrop on those codes to track a person's movements by switching up the numbers every 10 or 15 minutes. Even so, Ashkan Soltani, former chief technologist for the Federal Trade Commission, has pointed out that a so-called "correlation attack" could still allow some forms of tracking.

To demonstrate the problem, Soltani imagines a nosy neighbor setting up a camera outside their window and recording the face of everyone who walks by. The same neighbor also "roots" their phone so they can see all the contact-tracing Bluetooth signals it picks up from other users. When one of those passersby later reports that they're Covid-19 positive, the snoop's app will receive all their keys from the contact-tracing server, and they'll be able to match up the codes the user broadcast at the moment they passed the camera, identifying a stranger as Covid-19 positive. They might go as far as posting the picture of that infected person on Nextdoor to warn neighbors to watch out for them.

"While the system itself has anonymous properties, the implementation—because it's broadcasting identifiers—isn't anonymous," Soltani says. "If you know you might end up on Nextdoor as someone who's infected, you might not be willing to use one of these apps."

Neither the contact-tracing developers at Google and Apple's joint project nor the TCN consortium had an easy answer to this question. But both teams suggested that this sort of correlation attack would be difficult to pull off at a large scale. A spokesperson for the Google/Apple team pointed out that an adversary willing to use surveillance cameras could more easily point them at the entrances to clinics and other testing sites to capture people's faces.

The head of one contact-tracing project, Co-Epi founder Scott Leibrand, went so far as to say that the correlation attack is inextricable from an intended function of the contact-tracing protocol. Some versions of a Bluetooth-based contact tracing app may choose to alert you with information about the exact time and place when you crossed paths with a person who was later diagnosed as infected, so that you might better assess your risk. That could also help you determine the identity of the person who later tested positive. "One of the things that we will have to do is make it very clear to people that if they choose to submit a report, they're possibly disclosing to their friends and random strangers the fact of this exposure," Leibrand says.

Will the Tech Be Used for Ads?

The good news is that ad-targeting firms wouldn't be allowed to directly implement Google and Apple's Bluetooth contact-tracing protocol to track users. But another scenario suggested by Johns Hopkins University cryptographer Matthew Green points to a variant of the "correlation attack" above that might be useful for commercial tracking. An advertising firm could put Bluetooth beacons in stores that collect contact-tracing codes emitted by visiting customers. The firm could then use the public health app to download all the keys of people who are later diagnosed as Covid-19 positive and generate all their codes for the last two weeks. That method could hypothetically determine which trail of codes represented a single person, and follow them from store to store.

But even as Green described that scenario, he was quick to downplay it himself. First, the attack would only allow retailers to track people who reported themselves as Covid-19 positive, not the vast majority of users. It would also only allow those few infected people to be tracked for just the two weeks prior to their diagnosis. Besides, Green notes, advertisers already have plenty of tools to track movements from store to store, from credit card transactions to sneaky ultrasonic signals sent from apps. Would they really risk the scandal of specifically surveilling Covid-19-positive people just to add one more tracking method to their arsenal?

"It's definitely possible that some evil advertiser could use this to augment their data sets," Green says. "But, gosh, it really requires a lot of evil. And it seems to me like a small case."

Whether ad tracking remains an unlikely scenario, of course, depends on Apple and Google continuing to deny advertisers access to the API—or deprecating the feature altogether—after the coronavirus threat fades.

Will Contact-Tracing Apps Also Ask for Location Data?

Tracing Covid-19 infections based on Bluetooth contacts rather than GPS location data avoids a huge privacy concern. The latter, after all, can be used as evidence of everything from extramarital affairs to political dissent. But some critics have pointed out that contact-tracing apps that use Google and Apple's Bluetooth-tracing functionality will inevitably ask for location data anyway.

They may want to do so to make the system more efficient, argued cryptographer Moxie Marlinspike, creator of the popular encrypted communications app Signal, in a series of tweets following Apple and Google's announcement. According to the initial description of Apple and Google's API, every app user's phone would have to download the keys of every newly diagnosed Covid-19 person every day, which would quickly add up to a significant load of data. "If moderate numbers of smartphone users are infected in any given week, that's 100s of [megabytes]" for every phone to download, Marlinspike wrote. "That seems untenable." Instead, apps could better determine who needs to download which keys by collecting location data, sending users only the keys relevant to their area of movement.

Representatives from Google and Apple's joint project and the TCN Coalition had the same response to this point: If the app simply asks the user for their region, that very general location would allow the app to download a manageable number of keys. By both groups' back-of-the-napkin math, telling the app what country you're in would reduce the daily key download to just a megabyte or two, no GPS tracking required.
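The back-of-the-napkin math is easy to reproduce. Assuming 16-byte daily keys and a two-week reporting window (both illustrative figures, not the published spec), the per-phone download scales linearly with the number of new diagnoses whose keys a phone must fetch:

```python
KEY_BYTES = 16        # size of one daily tracing key (assumed)
KEYS_PER_REPORT = 14  # two weeks of daily keys per diagnosis (assumed)

def daily_download_mb(new_cases_per_day: int) -> float:
    # Total key data every phone must fetch each day, in megabytes.
    return new_cases_per_day * KEYS_PER_REPORT * KEY_BYTES / 1e6

# Fetching every key worldwide adds up fast, as Marlinspike warned...
global_mb = daily_download_mb(500_000)   # ~112 MB per phone, per day
# ...but filtering keys by country keeps the payload modest.
country_mb = daily_download_mb(5_000)    # ~1.1 MB per phone, per day
```

Under those assumptions, narrowing the download to one country's keys cuts the daily payload by two orders of magnitude, which is the gap between "untenable" and a megabyte or two.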

That doesn't mean some apps using Google and Apple's API won't ask for location data anyway. Health care organizations may miss the point of a system that avoids using GPS, or simply want the extra data to help better track infections. Google and Apple point out that if a location-tracing app wants to use GPS, it will need to first ask permission from the user, just as any app does.

But the question of location data points to a larger issue: Google and Apple can only point developers toward the most privacy-preserving approach. Every app will need to be judged independently on how it implements that framework. "There are a lot of additional problems that an app developer would need to work through in order to ship a product," Marlinspike wrote. "That can possibly be done responsibly, but Apple/Google aren't doing it for us."

Can the App Itself Identify Covid-19 Patients?

Bluetooth-based Covid-19 contact-tracing schemes are designed to upload no data from most users, and only anonymous data from people who are infected. But they still upload some data from users who report themselves as positive. That raises the question of whether those uploads can truly be anonymous, given how hard it is to move any data across the internet without someone learning where it came from.

Even if the keys that the app uploads to a server can't identify someone, they could, for instance, be linked with the IP addresses of the phones that upload them. That would let whoever runs that server—most likely a government health care agency—identify the phones of people who report as positive, and thus their locations and identities.

Apps can prevent anyone other than the server from eavesdropping on those IP addresses and identifying diagnosed users by using HTTPS encryption and also padding data they upload to obscure it, says Johns Hopkins' Green. But you still have to trust the app server itself not to collect and store identifying data from those uploads.

The TCN Coalition and the Google/Apple project both say the server shouldn't collect those IP addresses as a matter of policy. But it's up to the app developer to follow that policy.

In fact, many health care agencies will want to identify Covid-19-positive people. On that point, however, a representative from the Google/Apple project argued that trying to keep the Covid-19 status of infected patients secret from health care agencies themselves may be an unrealistic goal. After all, these are likely the same agencies administering Covid-19 tests. As such, the public has already entrusted them with identifying data about Covid-19-positive people.

What About False Positives?

Aside from surveillance issues, there's also the problem of making sure a Bluetooth contact-tracing app doesn't overwhelm people with incorrect warnings that they've been exposed. Those false positives could come from users self-diagnosing incorrectly or, worse, from trolls spamming the system. University of Cambridge computer scientist and cryptographer Ross Anderson warned that "the performance art people will tie a phone to a dog and let it run around the park" to create canine contact-tracing chaos.


Cristina White, the executive director of contact-tracing project Covid-Watch and a Stanford computer scientist, suggests a solution to those problems: Only allow people to report a positive diagnosis with a health care provider's approval. To create that safeguard, Covid-Watch would distribute a separate app to health care providers that generates unique confirmation codes. When doctors or nurses have determined that a patient is Covid-19-positive, they would tap a button to generate a confirmation code and give it to the patient, who then enters it into their contact-tracing app. A representative from Apple and Google's joint contact-tracing project said that their system similarly envisions that patients can't declare themselves infected without the help of a health care professional, who would likely confirm with a QR code.
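A minimal sketch of that confirmation-code safeguard, using hypothetical function names: the provider's app issues a one-time code, and the patient's app can file a report only by redeeming it.

```python
import secrets

# Codes the health care provider's app has issued but not yet redeemed.
ISSUED = set()

def issue_confirmation_code() -> str:
    # Provider side: a doctor or nurse taps a button after confirming
    # (or presuming) a diagnosis, producing a random one-time code.
    code = secrets.token_hex(4)
    ISSUED.add(code)
    return code

def redeem(code: str) -> bool:
    # Patient side: the contact-tracing app accepts a positive report
    # only if it carries a valid, unused code, then burns the code so
    # it can't be replayed by trolls.
    if code in ISSUED:
        ISSUED.remove(code)
        return True
    return False
```

The point of the design is that the server never learns who got which code, yet a troll without a provider-issued code can't inject a fake diagnosis into the system.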

Critics have pointed out that approach seems to depend on the widespread availability of testing. But Stanford's White says that doctors could provide confirmation codes to patients without an actual test result, relying instead on observed symptoms. "Even without testing, doctors can say 'this looks like Covid to me,'" White says. "It could be a 'presumed' Covid-19 diagnosis, and we just let the doctor decide that." But White concedes this is a less than ideal backup plan, and would only be put into practice if tests remain tough to access for a certain contact-tracing system's users.

Other false positives could come from an entirely different problem: Bluetooth leaks through walls, while viruses don't. It's hardly useful to be warned that you were exposed to Covid-19 just because your upstairs neighbor or someone in the adjoining apartment building was infected.

On this point, the TCN Coalition and the Apple/Google joint project argue that Bluetooth signal strength nonetheless serves as a proxy for sharing airspace with someone. Apple and Google plan to use Received Signal Strength Indication as a metric for determining if phones are in proximity, calibrated to account for the Bluetooth radios and ranges of different phones. Both distance and obstacles like walls diminish RSSI, meaning someone in the neighboring apartment would likely appear equivalent to someone well outside of Covid-19 transmission range. Google and Apple say they're also considering blending in other factors, like using proximity sensors to determine if a phone is inside a bag or a pocket, which might diminish RSSI but not Covid-19 transmission.
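To see why signal strength can stand in for distance, consider the log-distance path-loss model, one common way to turn an RSSI reading into a rough range estimate. The constants below are illustrative; Apple and Google describe per-device calibration rather than fixed values like these.

```python
def estimated_distance_m(rssi_dbm: float,
                         tx_power_dbm: float = -59.0,
                         path_loss_exp: float = 2.0) -> float:
    # Log-distance path-loss model: rssi = tx_power - 10*n*log10(d),
    # where tx_power is the calibrated RSSI at 1 meter for this phone
    # model and n is the path-loss exponent (2.0 in free space).
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def likely_contact(rssi_dbm: float, threshold_m: float = 2.0) -> bool:
    # Walls and obstructions attenuate the signal, lowering RSSI, so
    # an upstairs neighbor tends to read as "far" even when the two
    # phones are physically close.
    return estimated_distance_m(rssi_dbm) <= threshold_m
```

A strong reading of -59 dBm maps to roughly one meter and counts as a contact, while a wall-attenuated -80 dBm maps to a distance well past the threshold, which is the filtering behavior the projects are counting on.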

All that said, a representative from Google and Apple's joint project conceded that any contact-tracing system will have a false-positive rate, just as Covid-19 tests themselves do. In fact, there will be a false negative rate, too, based on everything from viruses left on surfaces rather than contact-based transmission to the fact that many groups of people either don't have smartphones or won't opt in to smartphone-based contact tracing.

In other words, the system will be imperfect. No one should expect otherwise. But done right, with full knowledge of the real risks and finite rewards it provides, Bluetooth contact tracing serves as one more tool to detect and fight an invisible adversary. The world may need every tool it's got.

