Apple and Google Team Up To Speak On Privacy Concerns During Coronavirus Tracking

Publish date: April 26, 2020
When the Chinese and South Korean authorities introduced mobile apps to track the coronavirus, the world didn’t blink an eye; after all, the citizens of those countries are used to having their privacy trampled. In Europe and the US, however, coronavirus tracking apps are raising both eyebrows and concerns that privacy rights could fall by the wayside.

As Apple and Google put their heads together to create an international coronavirus tracking app, privacy issues are among their highest priorities, but that hasn’t stopped some “staunch privacy advocates” from fearing the worst.

Apple and Google’s smartphone solution comes hot on the heels of the UK’s controversial National Health Service app and, in its first phase, takes the form of an API designed to be integrated into such pre-existing apps. The second phase will see it operating at the OS level. But how does it work, and will it be safe to use?

Apple and Google Team Up to Speak on Privacy Concerns During Coronavirus Tracking

Apple and Google are usually on opposite sides of the app war, but the coronavirus pandemic has seen them come together, using their joint technological know-how to create an app that can help curb the spread of the coronavirus. The idea behind the app is that it will notify users if they have been in contact with someone who has tested positive, potentially leading people to take extra precautions or self-quarantine to slow any further spread.

While “using phone tracking for contact tracing has already been used in China to great effect… privacy compromises have prevented more open take-up of such technologies” until now. Rather than relying on location tracking as they have in Israel, the app Apple and Google have developed relies on Bluetooth LE technology – something that’s familiar to many and already used, somewhat controversially, in US election campaigns.

The key to the efficacy of Apple and Google’s coronavirus tracking app is the Bluetooth LE proximity profile (PXP) “that Bluetooth relies on for device locating and tracking purposes”. You know how you get a pop-up asking if you’d like to connect to a nearby device? Well, that couldn’t happen without Bluetooth’s PXP.

While there are advantages to using Bluetooth for tracking purposes, there are also limits to its accuracy, and the more obstacles there are between two devices, “the worse Bluetooth LE is at accurately tracking something”. Apple and Google will need to tweak its capabilities if they want their app to be effective. “Notifying everyone who’s been in contact with someone who is infected with the novel coronavirus within 100 feet won’t be useful information, but only alerting you after your phones are just inches apart won’t be helpful either”.
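To give a rough sense of why Bluetooth-based distance estimates are so fuzzy, here is a minimal sketch (purely illustrative, not part of the Apple and Google framework) of how distance is commonly inferred from received signal strength. The tx_power and path-loss values are assumed example constants; walls, bags, and bodies shift the readings dramatically, which is exactly the accuracy problem described above.

```python
# Illustrative sketch only: estimating distance from Bluetooth LE signal
# strength with a simple log-distance path-loss model. Real contact-tracing
# systems work with coarse attenuation buckets rather than exact distances;
# tx_power_dbm and path_loss_exponent below are assumed example values.

def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate in metres from a received RSSI reading.

    tx_power_dbm is the expected RSSI at 1 metre; path_loss_exponent
    models the environment (about 2.0 in free space, higher indoors).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


if __name__ == "__main__":
    # The same few dBm of attenuation (an obstacle, a pocket) can swing the
    # estimate from "inches apart" to "across the street".
    for rssi in (-55, -65, -75, -85):
        print(f"RSSI {rssi} dBm  ->  ~{estimate_distance_m(rssi):.1f} m")
```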

It does have advantages over the QR code system employed in China, however, as Bluetooth can operate over greater distances than the optical scanners required for a QR code system. Furthermore, since neither the virus nor Bluetooth signals travel easily through walls, it should, theoretically, be more effective when it comes to notifying those who are most at risk.

Are Apple and Google’s Privacy Safeguards Enough?

“Privacy, transparency, and consent are of the utmost importance in this effort,” according to a joint statement by Apple and Google, but will their efforts be enough to assuage the unease of privacy advocates?

As far as Apple and Google are concerned, their coronavirus tracking app will not collect any user location data or personally identifiable information. Furthermore, the anonymous identifier beacons allocated to each device will be randomly generated and will change every 15 minutes, ensuring that no one can track the device.
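As a purely illustrative sketch of what “random identifiers that change every 15 minutes” means in practice, the snippet below rotates a random 128-bit beacon on that schedule. It is not the actual Apple and Google Exposure Notification key schedule, and the names are hypothetical; it simply shows why an observer cannot link broadcasts from the same phone across time windows.

```python
# Illustrative sketch only: an anonymous beacon identifier that rotates every
# 15 minutes. NOT the real Exposure Notification key derivation; just the idea
# of random, frequently changing IDs that cannot be traced back to a device.

import os
import time

ROTATION_SECONDS = 15 * 60  # identifiers change every 15 minutes


class RollingIdentifier:
    def __init__(self) -> None:
        self._current = os.urandom(16)      # 128-bit random beacon payload
        self._issued_at = time.time()

    def current(self) -> bytes:
        # Replace the identifier once the 15-minute window has elapsed, so
        # observers cannot link broadcasts across windows.
        if time.time() - self._issued_at >= ROTATION_SECONDS:
            self._current = os.urandom(16)
            self._issued_at = time.time()
        return self._current


if __name__ == "__main__":
    beacon = RollingIdentifier()
    print(beacon.current().hex())  # the value nearby phones would record
```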

Not only must users opt in to use the coronavirus tracking app, but they must also give their explicit consent before the system can share information about a positive COVID-19 test result. While this is good news in terms of privacy, it also means that users could withhold information about their health status, thereby limiting the system’s accuracy.
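As a hypothetical sketch of that consent flow (the names below are assumptions, not the real API), nothing about a positive result is published unless both the initial opt-in and the explicit per-result consent are present:

```python
# Illustrative sketch only: publishing a positive result is gated on an
# explicit, separate consent on top of the initial opt-in to the app itself.
# Function and variable names are hypothetical.

from typing import List


def identifiers_to_publish(opted_in: bool,
                           consented_to_share: bool,
                           own_rotating_ids: List[bytes]) -> List[bytes]:
    """Return the user's own rotating identifiers for publication,
    or an empty list if either form of consent is missing."""
    if not (opted_in and consented_to_share):
        return []                   # nothing leaves the phone without consent
    return own_rotating_ids         # only the user's own anonymous IDs are shared
```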

Another benefit of this coronavirus tracking app is that the information wouldn’t be “administered centrally by governments”; instead, it uses a system in which users are informed but that information never leaves their smartphones. The app will also inform other users that they’ve been in contact with someone with coronavirus, but without revealing the individual’s identity.
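The decentralized idea can be pictured with the following minimal sketch: the phone keeps a local log of beacons it has overheard, pulls down the identifiers that confirmed-positive users have consented to publish, and performs the match entirely on the handset. The function and variable names are illustrative assumptions, not the real API.

```python
# Illustrative sketch only: decentralised exposure matching done entirely on
# the phone. "published_positive_ids" stands in for identifiers that users who
# tested positive consented to share; "locally_observed_ids" is the log of
# beacons this phone has overheard. Nothing here is uploaded anywhere.

from typing import Iterable, Set


def check_exposure(locally_observed_ids: Iterable[bytes],
                   published_positive_ids: Iterable[bytes]) -> bool:
    """Return True if this phone overheard any identifier that was later
    published as belonging to a confirmed positive case."""
    positives: Set[bytes] = set(published_positive_ids)
    return any(beacon in positives for beacon in locally_observed_ids)


# Usage: a match only triggers a local notification; the observed log and the
# result of the comparison never leave the handset.
if check_exposure(locally_observed_ids=[b"\x01" * 16, b"\x02" * 16],
                  published_positive_ids=[b"\x02" * 16]):
    print("You may have been in contact with a confirmed case.")
```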

Apple and Google have both given assurances that, once the pandemic is over, the coronavirus tracking tools will be shut down, pacifying some of the concerns about governments adopting the technology to conduct aggressive mass surveillance in the future.

With the app addressing many of the related privacy concerns, it currently sounds miraculous, so what could go wrong? Well, there are a few potential stumbling blocks and we only need to look to South Korea for some disturbing insights into just how wrong coronavirus tracking apps can actually go.

When Anonymized Data Identified the Man with No Name

There’s been a lot of talk about anonymized data and just how anonymous it actually is. The ability to de-anonymize data is a cause for some concern, although not so much with the Google and Apple offering, which uses randomized device identifiers to protect anonymity.

A closer look at the UK’s coronavirus tracking app, however, revealed that ministers could, potentially, use an individual’s unique device identifier “to enable de-anonymization if ministers judge that to be proportionate at some stage”. The memo didn’t discuss what circumstances would be judged proportionate!

If done properly, anonymization should involve a lot more than simply removing a person’s name or device identifier from the profile, as the attempts to track coronavirus in South Korea, although largely successful in terms of curbing the spread of the virus, have indicated.

The South Korean government introduced travel logs and text alerts to help its citizens stay away from areas known to have outbreaks of the coronavirus. The idea was to help people avoid infection by telling them where COVID-19-positive individuals had been before entering quarantine.

While the messages didn’t identify people by name, they gave away enough information to fuel social stigma and even arouse speculation of extramarital affairs and other questionable behavior.

One text alerted people to the fact that a man in his 30s who had tested positive had arrived at the main train station in Seoul, after which the authorities had been unable to track his movements. Online discussions erupted as people speculated that, since his trail had been lost in an area known for prostitution, the likelihood was that he’d been paying for sex. It turned out he’d simply had dinner in a local restaurant and the earlier confusion was the result of a technical glitch.

While the design of Apple and Google’s coronavirus tracking app means such technical glitches are unlikely to expose such intimate details and embarrassing home truths, it’s far from perfect. Despite Apple and Google’s joint efforts to prioritize privacy, some still feel the coronavirus tracking app could potentially create a “real-time walking health report”.

Sergio Caltagirone, a former National Security Agency official and now vice president of threat intelligence at Dragos, fears the app will be used to “discriminate against people, as fear of coronavirus will rise as we leave large-scale quarantine”.

Others argue that, if we must trace coronavirus, then a Bluetooth app is the best way to go. Henry de Valence, a cryptographer at the Zcash Foundation, said, “Bluetooth-based contact tracing is the best technology, because compared to other methods like GPS or cell-tower location it’s more difficult to repurpose for surveillance, and it has the potential to be a user-respecting system”.

Nevertheless, the line between coronavirus tracing and mass surveillance is thinner than a supermodel after detox, and, when it comes down to it, there would inevitably come a point at which we would need to consider sacrificing our privacy and freedoms for the health of the general public.

Will You Exchange Your Privacy for Coronavirus Tracking Efforts?

South Korea and China have both illustrated just how effective contact tracing can be in the fight against the coronavirus, while an app like the one created by Apple and Google could have other useful applications. It could, for example, warn you if you’re spending too much time outside or if you’re closer to someone than you ought to be during social distancing.

One of the main problems facing Apple and Google at this stage is whether enough people will opt in to make the coronavirus tracking app effective. It will only work if at least half the population is willing to “optionally trade freedoms for the greater good [by] installing apps to monitor movements and warn on contact”.

Perhaps more worrying is that, although Apple and Google say they will shut the app down once the need for it has passed, there’s no knowing when that might be. For the foreseeable future, it seems likely we will need to continue to monitor infection rates, trace contacts, and enforce isolation where necessary to avoid future lockdowns. Therefore, it would seem, “from a surveillance perspective, this new normal is only just beginning”. Have you ever wondered if your computer was spying on you? If it isn’t now, chances are, it soon will be!

It seems, however, that while privacy advocates are tearing their hair out at the thought of such violations, the general public is taking more of a “Keep Calm and Let Big Brother Watch” approach to the situation. A poll on TechRadar showed that one in four respondents “said they were okay with sacrificing a portion of their personal privacy in exchange for some form of cellphone tracking that could – in theory – reduce coronavirus infection rates and save lives”.

That still leaves 75% valuing privacy over public health, however, and a further 58% said that even if there was an app that could tell them who in their neighborhood was infected, the privacy implications were such that they would refuse to use it.

Others, like Thomas Hatch, co-founder of the software company SaltStack, argue that, “Given the extraordinary nature of the COVID-19 pandemic, I do believe that, in this case, public health does outweigh privacy”. What do you think?

Is Our Privacy Safe in Apple and Google’s Collective Hands?

You can only trust an app as far as you can trust its developers, so how comfortable do you feel about Apple and Google and their respective reputations when it comes to user privacy and data protection?

Apple has always prided itself on its commitment to user privacy and CEO Tim Cook not only “made privacy a key selling point for the iPhone” but has tried his utmost to position the company “as a model of responsible tech”.

Google, on the other hand, has a considerably more tarnished reputation when it comes to privacy. In 2009, Google came under fire for its data collection practices and user tracking. In 2013, Edward Snowden hinted at the NSA having access to Google user data and, last year, Google was fined $57 million under the General Data Protection Regulation (GDPR) “for not properly disclosing to users how data is collected across its services”.

Needless to say, Google’s reputation has taken quite a blow but, surprisingly enough, Americans still feel safer sharing their sensitive information with Google than with Apple. That’s not saying a great deal, however, and, as one expert said, “When one hears ‘Google and Apple’ together, privacy and security are not the first things that come to mind”.

ZP Hou, CEO of the blockchain privacy technology company Suterusu, said that, although he believed both companies to have the right intentions, “in reality, these are two of the largest tech mega-corporations in existence, and their historical commitment to privacy is lackluster at best”.

Bluetooth, Coronavirus Tracing, and Accuracy

Bluetooth and proximity tracking are certainly more privacy-friendly than tracking location data, especially with the introduction of random identifiers, but they are vulnerable to error.

Say, for example, someone cycled past the window of your apartment and happened to have the coronavirus tracking app installed. Their phone could well communicate with the Bluetooth on your device, making it appear as though the two of you had been in contact, even though you were never closer than six feet apart.

If people who don’t have the virus are forced into isolation on the basis of such technical errors, chances are this app will have a short lifespan. As in China, where a red QR code would force you into quarantine, such false positives could result in widespread discontent and frustration, especially if it means not being able to return to work.
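One common way to weed out fleeting drive-by encounters, sketched below purely as an illustration with assumed thresholds rather than Apple and Google’s actual parameters, is to require both a minimum contact duration and a sufficiently strong signal before flagging an exposure.

```python
# Illustrative sketch only: reducing drive-by false positives by requiring a
# minimum contact duration and a strong enough (close enough) signal before
# flagging an exposure. The thresholds are assumed example values.

from dataclasses import dataclass


@dataclass
class Encounter:
    duration_minutes: float   # how long the other beacon was heard
    median_rssi_dbm: float    # rough proxy for how close the devices were


MIN_DURATION_MINUTES = 10.0   # a cyclist passing a window lasts only seconds
MIN_RSSI_DBM = -70.0          # weaker signals suggest walls or real distance


def is_meaningful_exposure(encounter: Encounter) -> bool:
    return (encounter.duration_minutes >= MIN_DURATION_MINUTES
            and encounter.median_rssi_dbm >= MIN_RSSI_DBM)


if __name__ == "__main__":
    drive_by = Encounter(duration_minutes=0.2, median_rssi_dbm=-60.0)
    shared_office = Encounter(duration_minutes=45.0, median_rssi_dbm=-62.0)
    print(is_meaningful_exposure(drive_by))       # False: too brief to matter
    print(is_meaningful_exposure(shared_office))  # True: long, close contact
```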

Conclusion

Apple and Google may not have squeaky clean reputations when it comes to user privacy, but the fact that they’ve been willing to put their professional rivalry to one side and come together to help in this unprecedented global crisis must be applauded.

Any app, be it a VPN, a fitness app, or a coronavirus tracking app, is bound to have its limitations but it does seem that Apple and Google have done all they can to minimize the impact on user privacy. The ever-changing device identifiers go a long way to protecting users while the fact that no information leaves the user’s smartphone is a huge plus in terms of security.

There are, nonetheless, privacy concerns regarding the app. As Michael Kwet of Yale Law School’s Information Society Project notes, “we now have to weigh the fate of our lives and economy against trust in Apple and Google, the ad-tech industry they support, and government intelligence agencies”. For Kwet, this is nothing short of a “nightmare”.

It would seem that some kind of coronavirus tracking app is unavoidable, especially if we want our lives to regain some modicum of normality, and, to be fair, if we must trace infections, then Apple and Google’s app is one of the best, most privacy-conscious ways of doing it. Will it impinge on the privacy rights we currently have? Probably. Could it be used for mass surveillance projects afterward? Maybe. Should a public health crisis be a justifiable reason for trampling over privacy laws? Possibly not.

Times are certainly changing but whether they’re changing enough for over 50% of the world’s population to put their faith in Apple and Google and sacrifice privacy for public health remains to be seen.
