2020 has been a very turbulent year. This is also true with regard to European surveillance politics, both at the EU level and in national politics. Like most years, it was largely characterised by one central conflict, which in simple terms goes like this: a push for ever more technologically advanced surveillance practices by both industry and government actors on the one hand, and fierce resistance from civil society, academia, and some regulators on the other, attempting to rein in or even roll back perceived faits accomplis by capital and state power. 

Covid, and the seeming need to combat the pandemic with technological solutions, raised the stakes of this conflict considerably, without, however, changing its underlying dynamics and arguments. “Security”, the government agencies tasked with providing it perpetually claim, “is a fragile flower and we need to use all the firepower we can get to protect it”. But: “civil liberty must not be sacrificed in pursuit of security”, civil society responds, “and it is not a zero-sum game anyway”. 

There is great merit in listening to all sensible arguments in this conflict. After all, there is a strong case to be made, and about:intel has made it, that only the kind of conversation that involves all relevant stakeholders can help us build and maintain resilient, modern democracies. Only through the democratisation of surveillance governance as a policy field can we hope to bring about the kind of structural change that the Snowden revelations showed us we need.

That is what we set out to do with about:intel. Consequently, we looked at the central policy debates of 2020 from as many angles as possible, seeking wherever we could to include the generally more reclusive voices from within the circles of power: government, intelligence, industry. Bringing together those who regularly make important decisions and those who regularly criticise them for it will not resolve the underlying conflict outlined above; in fact, it makes this conflict and the diverging interests that fuel it more visible. Yet it may help us not only to navigate surveillance policy in a more constructive and mutually engaged manner but also to enable a space where the better argument prevails, rather than the bigger wallet or the greater claim to power. In this spirit, below is a summary of the themes and contributions we featured in 2020. 


Predictive Policing

One of the controversial practices this year was big-data or predictive policing, a method of algorithmic risk mapping that has increasingly been used by law enforcement departments around Europe. Whether this is because they are trying to harness new technology to provide a 21st-century police service, trying to cut costs and do more with less, merely jumping on the shiny-tech bandwagon, or a bit of everything, algorithmic analysis tools are proliferating. While they vary in method, many seek to anticipate future crimes (types, geographic areas, and time windows) and identify possible victims and offenders. Despite contested evidence for the effectiveness of the approach, more and more police departments around Europe are adopting predictive policing tools, often in the absence of clear regulation on the use of data analytics in our criminal justice systems. While proponents claim algorithmic tools eliminate human bias, voices flagging the self-fulfilling nature of using historical crime data, which can lead to over-policing and the profiling of racial minorities, are growing louder. How these tools sit with the presumption of innocence and civil liberties is yet to be determined.

Jamie Grace, Senior Lecturer in Law in the Department of Law and Criminology at Sheffield Hallam University and member of the independent Data Analytics Ethics Committee established by the West Midlands Police and Crime Commissioner, argues that data-driven policing can only be as good as the data feeding it. If data-driven policing is to become a fair and ethical reality, he argues, more work needs to be done to figure out what a code of practice for data analytics in criminal justice should look like.

Chris Todd, Chief Superintendent at West Midlands Police and the National Police Chiefs’ Council’s lead for Data Analytics, sees the independent ethics committee, of which Grace is a member, as essential in mitigating the risks of predictive policing, such as bias leading to the over-policing of racialised communities. Like Grace, he also stresses the importance of good data and transparency. With these factors present, he believes predictive policing could be used for the public good and reduce crimes like domestic homicide.

Nina Galla strongly disagrees. She is senior advisor to the study commission "Artificial Intelligence – Social Responsibility and Economic, Social and Ecological Potential" for the parliamentary group The Left Party (Die Linke) in the German Bundestag. Galla argues that predictive policing trials – especially those utilising facial recognition – had demonstrated an extreme lack of effectiveness. Even more worrying was the risk of fundamental rights violations and the lack of attention paid to the ‘human factor’, i.e. the rights and competencies of the people using machine-learning systems based on insufficient data.

Richard Helson, Head of UK operations at Chorus Intelligence, a predictive policing software provider, weighs in by explaining the priorities his company sets for its work. He wants his company’s tools to simply augment the judgement of experienced law enforcement officers. Police still had a responsibility, he states, to ensure that the data and tools used possess enough integrity to stand up in court and to draft ethical frameworks to win back the public’s trust. 

Yet Fieke Jansen from the Data Justice Lab at Cardiff University says studies actually showed that there was no clear relationship between the use of predictive policing tools and crime reduction. The reason police forces still jumped on the bandwagon was that they didn’t want to appear unmodern, were curious about new policing technologies, or simply fell prey to sales pitches. Across Europe, predictive policing had been the result of political crises which gave rise to more interventionist, and at times racialised, notions of security. Given the risk of increased stigmatisation and over-policing of specific communities, Jansen demands that we examine not only the tools themselves but the entire legitimacy of such forms of state intervention.


Export of Surveillance Technology

Another controversial practice some industry and state actors are jointly engaged in is the proliferation of surveillance technology to repressive regimes around the world. These regimes persecute journalists and dissidents and violate the rights of minorities with sophisticated surveillance tools, from government malware to facial recognition and IMSI catchers such as Stingrays. European countries, particularly the UK, Germany, and France, are among the key suppliers of these technologies. Despite the implementation of stricter export controls for European companies since 2011 — after the Arab Spring exposed the degree to which European technology aided the crackdown on protests — much government oppression continues to be “Made in the EU”. With global trade in AI-enabled surveillance flourishing, what are the regulatory options for ensuring that surveillance technology produced in Europe will not be used to assail fundamental human rights elsewhere? And what are the practical obstacles to their effective implementation and enforcement? 


Mark Bromley, Director of the Dual-use and Arms Trade Control Programme at the Stockholm International Peace Research Institute (SIPRI), surveys the existing export control regimes: the Wassenaar Arrangement and the EU dual-use regulation. With evidence of questionable transfers casting doubt on their efficacy, the success of these controls remained unclear, Bromley says. He argues that to make them more effective, public reporting obligations were needed, as well as a more inclusive process for reviewing existing policy and formulating new policies, also in light of alternative mechanisms other than export controls.

For Edin Omanovic, Advocacy Director at Privacy International, global trade in surveillance technology, driven by a craze for profits, is a race to the bottom, in which liberal democratic governments continue to criticise the behaviour of others while using it to justify their own. Controlling the spread of surveillance technology was nonetheless a moral, legal, and strategic imperative, and Europe had to do better, not only through reformed export controls but also by making its international security partnerships transparent and leveraging them to improve legal and governance standards around the world.

Speaking of leveraging its influence: Europe could also capitalise on its economic power to incentivise proper use of the dual-use technology it exports, argues Katrina Lampert, assistant editor of about:intel. By putting respect for human rights at the forefront of trade agreements, not unlike the EU’s Generalised Scheme of Preferences (GSP), the EU could combine ‘sticks’ – existing sanctions and controls – with a complementary carrot approach.


Covid-19 and Surveillance

Another set of contentious practices this year were surveillance-based government responses to Covid-19. There was great concern that the surveillance infrastructure (potentially) built and expanded to respond to a public health need would not be walked back once in place. In April and May, we reached out to experts in the UK, Germany, France, South Africa, and on the EU level to understand which changes to the legal framework, policy, and use of surveillance had been triggered by Covid-19 and whether these changes were necessary and legitimate. 

Javier Ruiz, Policy Director at the UK-based advocacy organisation Open Rights Group, recognises that in this novel crisis there is no direct precedent for using surveillance powers targeting such a large proportion of the population. He believes, however, that “social compliance with control measures was the most important aspect for success in stopping the pandemic, and minimising any interference with human rights, however lawful, would be better in the long term to build the trust required”. That was why radical transparency, trust, and participation were much better ways to fight the pandemic than data-driven automated enforcement tools. 

Jan-David Franke, Editor of about:intel, voices concern about a possible normative shift in what would be considered acceptable levels of privacy intrusion, propelled by China and its advertisement of mass surveillance in response to the crisis. He argues that in order to escape the ideational pull of China, Europe had to stick to the principles of proportionality and necessity and remember that the real trailblazers aren’t the countries that implement the most extreme ideas or engage in the most intimidating security theatre but those that get results with the least invasive measures.

Jane Duncan, professor and Head of the Department of Journalism, Film and Television at the University of Johannesburg, remarks that the South African government had issued relatively strong regulations for the use of location data. These included purpose specification, user notifications, and a sunset clause. Given the country’s history and its weak metadata controls, the strength of these stipulations came as something of a surprise to Duncan. She attributes this to the pressure civil society had put on the country’s surveillance regime before the crisis. 

Alexandra Paulus and Sven Herpig (both at Stiftung Neue Verantwortung) join the criticism of China, although for a different reason: the attempts by China’s intelligence agencies to leverage Covid-19 for their own cyber operations, which were actually hurting the Chinese government’s international leadership ambitions. Since it was often in states’ own interest to respect international norms, governments — including their intelligence agencies — should consider self-restraint when it comes to cyber operations. Paulus and Herpig argue that the case of China showed that this was particularly true during the unfolding Covid-19 pandemic.

Elspeth Guild, law professor at Queen Mary University of London, and Elif Kuskonmaz, lecturer at the School of Law of the University of Portsmouth, look at the legal grounds on which EU states may enact exceptional powers in light of public health crises, namely the GDPR, the Data Protection Directive, the European Convention on Human Rights, and rulings by the European Court of Human Rights. They also stress that it was of utmost importance that these exceptions remained exceptions and didn’t chisel away at existing fundamental rights.


Automated Video-Surveillance

Facial recognition technology has captured both the imagination and the concern of many. But it is only one form of biometric surveillance, which itself is merely one application of automated video-surveillance. The field of video analytics is vast and can cover nearly every type of action and occurrence imaginable, even human sentiment. The pair of human eyes once tasked with passively watching CCTV footage has been replaced by artificial intelligence programmes. Law enforcement and other security agencies, which increasingly resort to automated video-surveillance, tout the technology’s aid in reducing crime and increasing public safety, but critics have long raised the alarm. They may highlight its supply-driven market background or point to all the problems AI itself is fraught with — racial biases, false positives, and algorithmic inscrutability, among others. Fundamentally, they worry that full-scale automated video-surveillance in public spaces will create a point of no return, after which we will be unable to live our lives anonymously and assert our essential civil liberties, such as freedom of assembly or freedom of speech, with dire consequences for democracy. With the spotlight usually on how to regulate technology after it has already been introduced, we wanted to take a step back and ask: Do we even want to allow this kind of technology? How can we enforce our democratic will in the face of ever-faster technological change?

We should ban the use of video surveillance, says Bojan Perkov, Policy Researcher at the SHARE Foundation. Studying the rollout of thousands of cameras in his hometown of Belgrade, he argues that we need to resist the total surveillance of urban public spaces for law enforcement purposes if we want to keep living in free and democratic societies. In the case of the so-called “Safe Society” project in Belgrade, surveillance cameras had been installed without any public debate, nor did they come with a strong legal framework protecting digital and civil rights. 

Thorsten Frei, Member of the German Bundestag since October 2013 and Deputy chairman of the CDU/CSU parliamentary group, has a different take. If used to target crimes of a grave nature, limited to crime hotspots, and if high standards are observed to protect against discrimination, he argues, facial recognition can and should be used to prevent crime. Using the technology in Germany would represent a major step up in the fight against terrorism and serious crime. 

Still, what should be prevented in any case is that facial recognition databases are established through the backdoor, without parliamentary debate or scrutiny, says Niovi Vavoula, Lecturer in Migration and Security at Queen Mary University of London. Yet she shows that this is exactly what is currently happening as the next Prüm framework – which regulates the exchange of data between police forces in different EU Member States – is being devised. Moreover, there was a slew of accuracy challenges around image quality, image age, and database size that needed addressing. 


BND-Reform 2.0

One of this year’s most prominent attempts to roll back existing, possibly unconstitutional practices of industry or state security actors was the successful lawsuit against the BND Act of 2016 by Reporters Without Borders, the Gesellschaft für Freiheitsrechte (GFF) and others before the German Constitutional Court. In May 2020, the German Constitutional Court ruled that key provisions in the current legal framework on the German foreign intelligence service (BND Act) are unconstitutional and that the Bundestag has until December 2021 to rectify a long list of deficits. The basic premise of the Court’s judgement is that the right to private communication and the right to press freedom under Germany’s Basic Law are rights against state interference that ought to extend to foreigners in other countries, too. In its new foreign intelligence bill, German state authority must honour these rights not just with respect to its own citizens and residents but also with regard to non-nationals the world over. 

Drafting a new legal framework for Germany’s foreign intelligence collection requires a substantial overhaul of the provisions on the surveillance of foreign telecommunications, on the sharing of intelligence thus obtained with other bodies, and on the cooperation with foreign intelligence services as well as the design of effective judicial and administrative oversight. Many legal, technical and political decisions that now need to be made are open questions in other countries, too. This concerns, for example, the mandate for bulk collection, oversight requirements, the rights and protections afforded to non-nationals, or special protections for journalists. 

With the German cabinet passing the draft bill in December 2020, we reached out to all key stakeholders in Germany and asked them: Does this BND reform 2.0 manage to protect both fundamental rights and security? Does it provide a rights-based and modern framework for foreign intelligence or will it only be a matter of time before this reform, like its predecessor, is quashed in court? 

Patrick Sensburg, member of the German Parliament for the CDU/CSU and member of the Parliamentary Oversight Panel, says the bill goes too far. He worries about the additional hurdles it will introduce for the BND’s surveillance of foreigners and how that might hurt the service’s operational effectiveness and ability to cooperate. In striking a compromise between fundamental rights protection and the interests of the BND, the latter’s capacity to keep Germany safe should not be undermined.

André Hahn, member of the German Parliament for Die Linke and Sensburg’s colleague on the Parliamentary Oversight Panel, contends, however, that it does not go nearly far enough. Instead, it left gaping loopholes for the BND to creatively interpret, failing to effectively regulate and oversee the service’s military intelligence activities, its surveillance of communications abroad when performed from outside German territory, its data collection through suitability tests, and more. By further fragmenting an already splintered oversight structure, the German government was also flouting the protection of fundamental rights in Germany and abroad.

Lisa Dittmer, Advocacy Officer for Internet Freedom at Reporters Without Borders Germany, agrees. She says that the bill is far from a radical overhaul of the mass surveillance of online communications practiced by the BND and its partners. As is, the bill would “severely harm foreign journalists’ right to privacy and confidentiality, the protection of their sources, and at worst, their safety. It would likely also damage German media outlets, which cooperate with international partners on a regular basis”. Substantial changes were necessary if this bill were to meet the standards set by the Court for a democratic compromise between the pursuit of security interests and the protection of press freedom and fundamental rights. 


Spotlight

The growing importance and normalisation of intelligence politics has entailed a whole range of captivating developments, from litigation to legislation, from oversight reports to political processes. In Spotlight we keep checking in on intelligence law and practice around Europe and give context to intelligence news. Be it litigation, legislation, politics, reviews, leaks and scandals, or stand-alone investigative features, Spotlight is our feed for contemporary intelligence journalism.

Here, too, we looked at a whole range of existing surveillance practices by industry and government actors which had either just come to light, were being publicly reviewed, were insufficiently overseen, were being litigated before a national or European court, or had come out of litigation and required constructive proposals.


Thorsten Wetzling, editor-in-chief of about:intel, looks at the long and ambitious list of safeguards that the Bundestag has been ordered to write into its new BND Act by the end of 2021 and points to potential legislative solutions that could prevent future findings of unconstitutionality. If done right, he concludes, the next BND reform may significantly contribute to the harmonisation of good SIGINT standards in Europe.

The tension between the state seeking greater surveillance powers and civil society cautioning against them was perhaps nowhere more visible this year than in Germany. The German government tried not only to defend vast surveillance powers in the context of its foreign intelligence service, the BND, but also to expand them for its domestic intelligence agency, the Bundesamt für Verfassungsschutz (BfV). Kilian Vieth and Charlotte Dietrich (both at Stiftung Neue Verantwortung) come to the conclusion that it tried to do so without convincing evidence and that the marginal changes proposed to the oversight framework could not make up for it. They show that the draft law failed to meet international standards for good oversight and would, with the introduction of the so-called “state trojan”, end up undermining IT security for all.

Floran Vadillo, former security advisor to the President of the National Assembly’s Laws Committee and involved in the drafting of the 2015 Intelligence Act in France, lets us in on the current reform debate in France. With new legislation due sometime in 2021, Vadillo surveys the changes different stakeholders would like to see. His argument is that to preserve the overall balance struck in 2015, only light-touch interventions were warranted. 

Following the CJEU’s invalidation of the Privacy Shield agreement in its ‘Schrems II’ decision and the German Constitutional Court’s BND ruling, the opportunity has arisen to establish more progressive and internationally harmonised standards for intelligence governance and safeguards against disproportionate government access to data. Charlotte Dietrich and Thorsten Wetzling (both at Stiftung Neue Verantwortung) are convinced that how personal data can and will be protected will remain a decisive question not just in transatlantic data politics but more broadly in global human rights protection. They argue that it was high time to establish better intelligence governance standards among democracies — such as the protection of non-nationals or oversight unimpeded by the Third Party Rule — to both guarantee fundamental individual rights and ensure national security.

The landscape of security politics is full of opportunities, however, and by virtue of our market society it is predominantly private corporations that seize them, or create them in the first place. One such corporation is the US big data analytics provider Palantir, which is also heavily involved with US intelligence. Palantir has also been making headway into the EU’s security infrastructure, collaborating with Europol, EASA, and national law enforcement as well as deepening its political relationships. According to Sophie in ‘t Veld, MEP for the Renew Europe Group, this reflects a larger naiveté within European politics towards foreign tech companies and an imbalance in EU-US relations. Not only should Palantir be kept out of our institutions and security fabric, she demands, it was also overdue for the EU to gain more strategic technological independence, so that it may defend and assert its status as the last bastion of privacy.

Bastien Le Querrec, PhD student in public law at Université Grenoble-Alpes and member of La Quadrature du Net, analyses the Advocate General’s opinion in a range of cases before the CJEU regarding data retention and access to metadata, brought by digital rights organisations from France, Belgium, and the UK. He outlines that the Advocate General rejects bulk data retention schemes but is in favour of limited retention of, and real-time access to, data. 

Barbara Grabowska-Moroz, former project coordinator at the Helsinki Foundation for Human Rights in Warsaw, tells us about the Polish surveillance regime, whose powers the government expanded in 2016 while ignoring a 2014 decision by the Polish Constitutional Tribunal. That decision had demanded that an independent oversight body be established, that individuals who had been subject to intrusive surveillance methods be notified, and that procedural safeguards for secret surveillance be tightened. In the absence of these safeguards, two cases have been brought before the ECtHR by lawyers and NGOs, arguing that Polish law violates Articles 8 and 13 of the European Convention on Human Rights by failing to provide an effective legal remedy against violations of the right to privacy.

Megan Goulding, lawyer at the UK human rights organisation Liberty, writes about one of the biggest oversight scandals in recent history. For nearly a decade, MI5 knowingly mishandled data collected through surveillance in violation of statutory safeguards. The service also failed to inform the UK surveillance watchdog, the Investigatory Powers Commissioner’s Office (IPCO), of these unlawful errors. The safeguards and oversight system contained in the Investigatory Powers Act of 2016 were therefore little more than window dressing, Goulding argues.

Liberty is not the only UK NGO suing MI5 over its conduct. Guidelines revealed in March 2018, known as the ‘third direction’, permit MI5 agents to become involved in crime while undercover. Reprieve, Privacy International, the Committee on the Administration of Justice, and the Pat Finucane Centre think this is illegal. Sam Johnston Hawke, Legal and Policy Officer at Reprieve, argues that “involvement in serious abuses of human rights damages the agencies’ ability to do their vital work of keeping the UK safe”. He is also concerned that the government has actively avoided legislating further on this issue despite urgent calls to do so.

Jan Jirat and Lorenz Naegeli, Swiss journalists working for independent weekly WOZ – Die Wochenzeitung in Zurich, dig deep into the Club de Berne, the elusive intelligence sharing forum between the domestic intelligence services of the EU member states, Norway, and Switzerland. With great concern, they come to the conclusion that the once informal organisation is morphing into a proper institution, which evades the few legal and regulatory frameworks that exist on international intelligence cooperation.

Eleftherios Chelioudakis, co-founder of the Greek digital rights organisation Homo Digitalis, analyses the surge of police-led intelligence and border management in his country and situates it in a larger European trend of fundamental rights challenges. Despite austerity, Hellenic police and border authorities were experimenting with and implementing facial recognition and other problematic biometric processing technologies, often at the expense of already marginalised groups. They were also making largely unregulated use of drones, he writes. 

Kilian Vieth and Thorsten Wetzling, both at Stiftung Neue Verantwortung, contend that traditional oversight mechanisms are reaching their limits in the age of digital, transnational surveillance practice. The question then becomes how to close the gap between high-tech intelligence techniques and low-tech and inefficient oversight processes. Summarising their comprehensive report on the matter, they present seven ideas for data-driven intelligence oversight. 

Annika Hansen, senior researcher and Deputy Head of the Analysis Division at the Center for International Peace Operations, looks at digital technologies in peacekeeping operations. In today’s era of fake news and hate speech, conflict moves fluidly between the ground and cyber realm, she argues. Peace operations currently lacked the ability to pre-empt or counter these new dimensions of strife, but they might be able to do so if the technical capacity of staff were increased and a culture of data collection and use were fostered.

Jesús Cordero, lawyer at the law firm Leegaltech, outlines what the Spanish government needs to do better in its Digital Agenda, announced in the summer of 2020. Spain, he argues, would do well to conceive of ‘the digital’ as a space where security, economic, and democratic interests meet and can be aligned. As the Covid-19 pandemic had underscored, greater inter-departmental coordination and a more open and attentive public debate on questions of surveillance and digital rights were therefore key to shaping a healthy digital society.

Didier Bigo, professor of International Political Sociology at Sciences Po Paris-CERI, considers the fundamental sociological assumptions and implications of Covid tracking apps. Digital contact tracing was being presented as the ideal way to differentiate between the sick and the healthy, he writes. But in reality it added another political problem, that of an enlarged surveillance apparatus, without solving the underlying one, the lack of an effective public health strategy.

Besides the day-to-day policy debate, it is indispensable to consider the big picture of where we are, how we got here, and what that tells us about where we’re heading. François Thuillier, former intelligence practitioner in the French services and now an associate researcher at the CELS in Paris, does just that by retracing the transformation of French society from a universalist and secular republic to an anti-terrorist one. He argues that the new public narrative that had taken hold misjudged the complexities of the world and blurred the line between liberalism and authoritarianism. French democracy would now have to pay a hefty political price for this clearance sale of its values in light of surging extreme right-wing politics and an invasive and opaque surveillance apparatus, Thuillier cautions. 


Outlook and Acknowledgement

In 2021 about:intel will continue to shed light on key questions in European surveillance in the spirit of inclusive dialogue. These will include the priorities of European and national surveillance research programmes; the practice of intelligence agencies buying data from commercial vendors, circumventing oversight and accountability safeguards; the role and oversight of Europol; and the ramifications of and regulatory options for automated video-surveillance. The issues of adequacy and data sharing as well as international oversight and intelligence cooperation are evergreens, but Brexit will surely spice them up. Then, the ECtHR is set to hand down some decisive rulings regarding bulk surveillance powers in the Centrum för Rättvisa v. Sweden and Big Brother Watch v. UK cases. Moreover, military intelligence policy will receive more attention, especially with regard to procedural safeguards and cooperation with civilian intelligence agencies. 

These upcoming discussions, as well as all those revisited in this article, would not be possible without the many fantastic, voluntary submissions from all over Europe, and sometimes beyond. We owe a ton of gratitude to everyone who has shared their perspective on about:intel, joining and enabling a pluralistic conversation on intelligence, technology, and democracy. If you would like to contribute in the future, please do not hesitate to contact us at info@aboutintel.eu. Please also get in touch if you would like to give us feedback on our content or structure, which would be much appreciated, or if you would like to suggest new themes that deserve more attention from your point of view.

A warm thank you to you all. Happy new year and may the vaccine be with you.