A surveillance pandemic? Results of the community listening post on risks for freedom in the wake of COVID-19

The situation is highly fluid. Over the past weekend, the Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) project collapsed over privacy concerns. Score one for the “non-solutionist” approach advocated in our call vs. the magic app.

1 Like

It looks like the scientific and tech communities are increasingly disassociating themselves from the contact tracing app trope. I have news of open letters from these communities expressing concern in:

  • Italy, led by Turin Polytechnic. This one is urgent enough that I received a personal invitation to sign it from Francesca Bria, currently the chairwoman of Italy’s largest public VC fund.
  • Belgium, led by KU Leuven.
  • The Netherlands

@teirdes has filed a “JO-anmälan” (a complaint to the Swedish Parliamentary Ombudsman) against Göteborg, one of the largest cities in Sweden, over its collection of information in a public cloud service, Microsoft Office 365. It touches on many of the topics we have discussed across several threads here.

I listened to the whole interview and strongly recommend it. It’s in Swedish.


I get so tired whenever I hear about this kind of thing. For a country that is supposed to be at the forefront of digitalisation this stuff happens remarkably often. Why exactly are our public institutions on effin microsoft???

1 Like

Good question.

The real answer is probably that our political leaders do not care about their community members enough to take these issues seriously.

Force of habit and resistance to change, as well as ridiculously high institutional fears of being blamed when something goes wrong. Additionally, even if there are knowledgeable staff they frequently don’t get backing from their bosses and there is a tendency to reward behaviour that approaches “I went to a conference and saw a powerpoint about a software tool from a vendor and now I really want to buy it”.

I was personally slightly disillusioned when I saw an elected official from the region where I was living at the time share (on some social medium, I forget which) their “progressive IT intervention on health IT systems”. It looked something like this: slide 1, lots of arrows going back and forth; slide 2, only three arrows going back and forth, with the “ingenious” political point: “having fewer arrows going back and forth is great! Let’s pay a lot of money for that!!!”

Sorry if the presentation of this grievance isn’t very precise. Sigh.


Well, it looks like Apple and Google have forced something of a standard for contact tracing apps over the wishes of Germany (and likely many other nations).

I go into detail here: The State of COVID-19 Exposure Notifications.

I’m curious about other people’s thoughts on this development. It looks like a ‘middle way’ that is better than much of what was on the table. The two red flags for me:

  1. The solution is corporate controlled and the actual implementation is closed-source.
  2. Many epidemiologists are skeptical of automated contact tracing’s effectiveness.

There are also ways to abuse this solution, which concern me. From a cryptographic perspective, the engineering does seem quite sound.

1 Like

Hello @schmudde, welcome! That’s a hell of a post, thanks for writing it.

I am impressed that you consider the Apple-Google exposure notification tech CCC-compliant. That is more than I hoped for. Also, I like your list of dangers, and the point about these apps being somewhere between “potentially helpful, given cheap and rapid testing” and “totally useless” was brought up by several people, also during the Surveillance Pandemic call. In general, the Magic Covid App reminds me of “if we had a loaf of bread, we could make a chicken sandwich, if we had some chicken”; where the app is the bread, and testing capacity is the chicken. Your final list of privacy-infringing government actions is also chilling.

I also have a question for you. If the Apple-Google code is not open source, on what basis is the infosec community evaluating its cryptographic soundness?

Glad to be here. I’m looking forward to continued conversations.

I used some pretty imprecise language when talking about CCC compliance. Your post made me revisit that footnote and include the Nexa Institute’s Open Letter.

My original tone was favorable because I was a little surprised how many of the boxes Apple/Google managed to check. I didn’t expect them to open source the effort. But I also didn’t expect them to come up with a decentralized solution. It makes sense in hindsight. Apple has demonstrated that they really do not want to be caught holding the keys and answering a government subpoena.

Obviously infosec cannot vouch for the soundness of this implementation. It only “seems like” sound cryptography (using my words in the article). The problems will arise post-facto. An analyst like Ben Thompson would argue that Apple absolutely must deliver the promises they make in their white paper. For example, they have taken the reputation of iMessage very seriously - it may be their most valuable software asset outside of OSX/iOS. Failing to do so could affect their bottom line and shareholder value.

The only other thing we have to go on is their inspiration, the open-source DP-3T protocol.
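The decentralized design that DP-3T and the Apple/Google API share is easy to sketch: phones broadcast short-lived identifiers derived from a daily key, and only the daily keys of users who test positive are ever published; matching happens entirely on the handset. The snippet below is my own simplified illustration of that idea, not the actual specification — the real protocol derives identifiers with HKDF and AES-128 and also encrypts associated metadata, and the function name `rolling_id` is invented for this sketch.

```python
import hmac, hashlib, os

def rolling_id(tek: bytes, interval: int) -> bytes:
    """Derive the ephemeral identifier broadcast during one ~10-minute interval.
    (Simplified: the real spec uses HKDF + AES-128; plain HMAC-SHA256 here.)"""
    msg = b"EN-RPI" + interval.to_bytes(4, "big")
    return hmac.new(tek, msg, hashlib.sha256).digest()[:16]

# --- Alice's phone: generates a random daily key, broadcasts rolling IDs ---
alice_tek = os.urandom(16)          # Temporary Exposure Key, never leaves the phone
                                    # unless Alice tests positive and consents

# --- Bob's phone: records the identifiers it hears over Bluetooth ---
heard = {rolling_id(alice_tek, 37)}  # Bob was near Alice during interval 37

# --- Later: Alice tests positive and uploads only her daily key ---
# Bob downloads the published keys, re-derives all 144 rolling IDs for the day
# locally, and checks for a match:
exposed = any(rolling_id(alice_tek, i) in heard for i in range(144))
print(exposed)  # True: Bob learns he was exposed
```

The privacy property of interest: the server only ever sees the short daily keys of consenting diagnosed users, never the contact graph, which exists only on individual handsets.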

So yeah, the fact it’s not open source is a real shame. I think you and I would both agree that Apple/Google could benefit tremendously by making it open source. I’m not sure why it is closed. Any ideas? They both run plenty of open source projects.

And indeed, testing is the chicken.

This makes plenty of sense. But still, there is the NSA backdoor problem… or has that gone away since the PRISM days? I admit I lost track of that debate.

Haha, I signed that one myself. Good people, and I respect the MEP that got in touch with me to propose that (Irene Tinagli).

NSA requests for backdoors are a little less fashionable after the CIA spied on the Senate in 2014. But this pendulum has swung back and forth between Congress and the intelligence community since the 1970s. I’m sure it will continue.

I was referencing the Apple/FBI fight after the 2015 San Bernardino shooting. If you remember, the FBI used the courts to try to make Apple hand over information - essentially forcing open a back door. Apple changed iMessage encryption after this incident, which offers two advantages.

  1. They can’t hand over keys if they don’t have them.
  2. They can’t build a backdoor if they don’t even have a front door.

I don’t have any predictions about how this will play out in the future, but this is the precedent from the last five years. So it makes sense that Apple would adopt a technology where they have no access to user data. It’s the safest for them.

I can certainly see a day when lawmakers try to force Apple to build a less secure iMessage. We will see.

1 Like

This is true.

Tracing apps were a big topic during the session, therefore I thought this might be interesting here:

1 Like

@alberto is going to question data scientists from DeLab in this webinar next week (3rd of June) about how to use their data to validate the four results from the surveillance pandemic session listed below.

Result 1: there is cause to worry, but also leverage for defense

Result 2: contact tracing apps are ineffective against COVID-19, but may help in the next pandemic

Result 3: immunity passports are an unworkable idea

Result 4: locational data are impossible to anonymize, and of limited utility; capacity for data governance is bad

Join us if you are interested in that discussion and want to discuss how to understand and interpret data rather than just receiving it. And of course, you are also welcome to ask about the origin of that data.

I want to revamp this thread in view of a couple of disturbing developments I am seeing. They are not directly related to the “magic COVID app”, but they do resonate with some of the arguments we explored in this event.

1. Restricted freedoms in Asia-Pacific

Earlier this week, the Open Government Partnership held this meeting. They found robust evidence that states inclined towards authoritarianism exploited the pandemic to increase surveillance and restrict liberties. Most countries in the region are also restricting the publication of data on the pandemic itself, presumably so that they control the narrative and keep their nations in a perpetual state of fear. Authoritarians love fear, because it makes government overreach more acceptable.

I am not convinced that this is only an Asian phenomenon. In my own native Italy, 19 regions out of 20 refuse to publish open data on COVID-19 ([thoughtful post by Matteo Brunati](Il governo (non è) aperto durante la pandemia: serve una spinta - Casual.info.in.a.bottle - Il blog di Matteo Brunati), in Italian).

2. Smart city infrastructure re-deployed to counter-demonstration surveillance in the USA

The city of San Diego, in progressive California, has 4,000 smart traffic lights. Which have cameras. Which store their footage on some server farm. Turns out San Diego police are now accessing that footage in search of material to incriminate Black Lives Matter protestors. The whole thing had been predicted by the EFF in 2017, and this lends some strength to slippery slope-type arguments whenever someone wants more monitoring and more data on the spaces where humans live.

We are still very far from a bottom line on the consequences of SARS-CoV-2 for the Internet and how humans use it. But the shadows are deepening, and I fear that the opportunists thinking the pandemic is a good excuse for getting more surveillance tech in place… are mostly right.

1 Like

I find that this thread is a nice place to store information and reflections about how the COVID-19 pandemic interacts with surveillance matters. Most of this information has a technological side.

The latest is this: the government of Romania has decided to “transmit the entire database” of clinical data treated by the health authorities, currently stewarded by the Ministry of Health, to the Ministry of Internal Affairs (link, Romanian).

I know of another case where the Ministry of Internal Affairs ended up making health policy: Italy. As lockdown was decided, the government initially allowed daily exercise outdoors, done individually. This is, of course, consistent with the recommendations of any doctor: exercise is an essential component of health. However, the initial regulation was, in a few days, superseded by a new one, after the Minister of Internal Affairs complained that “the only way to check whether people were compliant was to lock them inside at all times”. Doctors protested, but could not get the regulation overturned.

In general, when health care is entrusted to the police, and powerful digital surveillance tech is brought in… I don’t know, it just smells bad to me.

EDIT: some trust issues are indeed showing in the preliminary coding of the NGI Forward corpus. :roll_eyes:

I suspect that one worry is this: if you give the government temporary powers that restrict your basic freedom of movement, will those powers be given back after the danger passes?

In the domain of Internet tech it’s more surveillance than restrictions. But yes.

1 Like

I’m trying to think of a time when a government voluntarily gave back powers when they didn’t need them after some crisis, excluding adding and then taking away troops or police for a given situation. No luck so far.

I was doing some prep work for the event on the 29th and landed in this thread again - how long ago it feels…

1 Like