- Tom Loosemore is a partner at Public Digital, which helps governments and large organizations around the world adapt to the internet era. He was a cofounder of the UK’s Government Digital Service.
- Google and Apple made a huge global health policy decision in April by setting boundaries on how governments build their national contact-tracing apps, he writes.
- Although it’s a positive that Google and Apple designed these limits with people’s privacy in mind, their decision raises alarming questions about the power unaccountable Silicon Valley firms hold over public health decisions.
- The Apple and Google approach may also undermine governments’ attempts to link contact-tracing apps with their wider, real-world contact-tracing efforts.
- This is an opinion column. The thoughts expressed are those of the author.
On 10 April, in the midst of the largest public health crisis in over 100 years, Apple and Google quietly revealed the true extent of their political power. The way they did so has left me questioning some long-held beliefs.
Via a couple of choreographed blog posts, Google and Apple dictated to 197 countries around the world how the world’s estimated 3.2 billion smartphones could — and could not — be used to combat the spread of coronavirus.
Their diktat curtailed the ambitions harbored by many governments to place smartphone apps at the front line of the battle to limit the spread of COVID-19.
As political power plays go, it was quietly brutal.
Yet not one government fought back with the legislative tools available to them.
Maybe they knew it would be pointless in the short time available? Or maybe they feared the legal muscle the two tech giants bring to any party? De facto, Google and Apple laid down the law to each and every government around the globe.
As COVID-19 spreads, governments are exploring how contact-tracing apps might help them track the pandemic
Here’s the backstory.
From March onwards, governments around the world have been exploring how the sensors on our mobile phones could be harnessed to alert people that they have been in close contact with a confirmed case of COVID-19, and are thus at risk themselves.
The sensors on a modern smartphone are stupendously powerful. They can detect how close the device is to another phone, whether it is inside or outside, whether people are speaking nearby, and if so how loudly, whether its owner is walking, cycling, in a car, bus or train, and a myriad of other nuggets of knowledge.
All of this can help discern if you have been in close enough contact with a confirmed case to take steps to self-isolate and get tested, through a contact-tracing app.
The possibility that ‘an app might save us’ generated much excitement — if not over-excitement — amongst politicians and senior public health officials.
Many saw contact-tracing apps as the perfect tool with which to contain localized outbreaks, thus easing the path out of lockdown. Those alerted could then immediately self-isolate and get tested, swiftly putting a lid on the spread of the virus.
But such apps were also a potential privacy nightmare.
Many governments saw contact-tracing apps as a means to collect and analyze detailed data about the spread of the infection through the population, individual by individual.
And for some governments, the prospect of a centralized lake of real-time contact-tracing data was enticing. Possibly too enticing for comfort, if your own government had a poor human rights record.
Before the pandemic, Google and Apple strictly forbade third-party mobile apps from using Bluetooth and other sensor data in such an intrusive fashion.
They locked away access to such features deep inside Android and iOS, their respective mobile operating systems, and only allowed themselves access to such sensitive capabilities. Those Google adverts don’t target themselves, after all.
But Google and Apple were facing huge pressure — not least from many of their employees — to allow governments’ nascent contact-tracing apps to access this smartphone sensor data. This was an unprecedented global public health crisis. They should do their bit.
But how to show willing while protecting their corporate interests?
They didn’t want to risk losing the trust of their users. And if they only allowed governments with impeccable human rights records to access the sensitive sensor data, that would doubtless harm their commercial interests in countries less blessed with democratic checks and balances. Apple sells a lot of iPhones in China.
So the normally fierce competitors decided to work together. Through early April they debated their options behind closed doors and finally came to a conclusion.
No governments would be allowed access to sensitive contact information. No personal data would be allowed to reach government servers. None.
Google and Apple allowed governments to each build one contact-tracing app — and one app only. This could make use of a new joint Google and Apple feature, the Exposure Notification API, which would itself alert others who might be at risk, without routing the information through government servers.
Such a decentralized approach protected the privacy of individuals but meant governments would struggle to align the alerts triggered by their app with their wider manual contact-tracing and testing services.
Worse, the apps’ risk-scoring algorithms could only use a crude Bluetooth proximity estimate, scored between 0 and 8. This algorithm is impossible for governments to tune as they learn more about how COVID-19 is transmitted in the real world, as there is no feedback loop.
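The decentralized design the two companies mandated can be sketched roughly as follows. This is an illustrative toy, not their actual Exposure Notification protocol, which uses proper cryptography (AES, HKDF) and Bluetooth signal-attenuation data; the names and the plain-hash ID derivation here are invented for clarity. The key property it demonstrates is the one described above: matching happens on the phone, so no contact data ever reaches a central server.

```python
import hashlib
import os

def rolling_ids(daily_key: bytes, intervals: int = 144) -> list:
    """Derive rotating broadcast IDs from a device-local daily key.
    (Illustrative only; the real protocol uses HKDF/AES, not a plain hash.)"""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

class Phone:
    def __init__(self):
        # The daily key never leaves the device unless its owner is diagnosed.
        self.daily_key = os.urandom(16)
        self.broadcast = rolling_ids(self.daily_key)
        self.heard = set()  # IDs observed over Bluetooth, stored locally

    def observe(self, other_id: bytes):
        """Record a rotating ID heard from a nearby phone."""
        self.heard.add(other_id)

    def check_exposure(self, published_keys) -> bool:
        """Download diagnosis keys, regenerate their rotating IDs, and match
        locally. The server never learns who met whom."""
        return any(self.heard & set(rolling_ids(key)) for key in published_keys)

# Alice and Bob meet; Bob later tests positive and publishes his daily key.
alice, bob, carol = Phone(), Phone(), Phone()
alice.observe(bob.broadcast[0])   # Alice's phone hears one of Bob's rotating IDs
published = [bob.daily_key]       # only diagnosis keys go to the server
print(alice.check_exposure(published))  # True: Alice is alerted on-device
print(carol.check_exposure(published))  # False: Carol never met Bob
```

This also makes the governments' frustration concrete: the only signal a health authority ever sees is whether a diagnosis key was uploaded, so there is no per-contact data to feed back into manual tracing or to retune a risk model against real-world transmission.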
Who should have the power to dictate global health policy?
What’s happened since is sadly predictable.
Some ill-advised governments, the UK’s included, tried and failed to circumvent Google and Apple’s strict control of their phones’ sensors.
Other governments, such as Germany, have launched contact-tracing apps that adhere to Google and Apple’s mandated approach. Their effectiveness has yet to be reported.
I hope I am wrong, but I fear that Google and Apple’s approach will not prove particularly valuable in the messy real world of contact tracing. It is just too crude. Google and Apple have given governments an abacus in an era of machine learning.
I’ll admit I was instinctively pleased when I heard of Google and Apple’s decision. Throughout my career, I’ve defended people’s privacy from your typical state’s propensity to collect ever more data about its citizens, often without reason.
But in the weeks since April 10, I’ve reflected more on the nature of power. Who has power? And how is it held to account?
What Google and Apple did on April 10 was to make a huge, global public health policy decision — a decision that I believe should be the preserve of elected governments.
They alone had determined where the balance between privacy and public health should lie. And they plumped firmly on the side of individual privacy. Governments were not to be trusted.
That their decision aligned well with both their commercial interests and the prevailing libertarian instincts of Silicon Valley should come as no surprise. They see themselves as accountable only to their shareholders.
The coming era will be defined in large part by an ongoing power battle between governments and Big Tech. Think hard before you pick a side. Power corrupts. And absolute power corrupts absolutely.