Conor O’Kane, PhD Student and Lecturer in Economics at Bournemouth University, offers his perspective on the recent Amsterdam Privacy Conference.
The Amsterdam Privacy Conference (APC) took place over four days, from 23–26 October 2015. Organised by the Amsterdam Platform for Privacy Research (APPR), an initiative of the University of Amsterdam, this interdisciplinary conference brought together leading experts in the field of privacy from a diverse range of disciplines, including philosophy, law, economics and informatics, as well as the social, medical and media sciences. The conference was divided into seven themes: (1) Privacy and security; (2) Privacy and the information society; (3) Privacy and healthcare; (4) Privacy and technology; (5) The commercial value of privacy; (6) The transformation of the public space and personalized communication; and (7) The value and ethics of privacy.
The APC took place against the backdrop of the European Commission’s expected imminent publication of a new data protection regulation that promises to deliver a comprehensive reform of data protection rules in the EU. The long-overdue reform will replace the existing and outdated Data Protection Directive 95/46/EC and seeks to clarify how established data protection principles apply to the new technologies that have emerged over the last 20 years.
As a PhD student researching in this area, I approach privacy from an economics perspective and conduct online experiments examining the impact that prompts, or ‘nudges’, have on users’ decisions to disclose personal information. Empirical research continues to make an important contribution to the policy debate here, and experimental studies have often produced counter-intuitive findings, such as the ‘control paradox’: research has shown that increasing individuals’ perceived control over some aspects of privacy leads them to reveal more personal information about themselves, to the point where they end up more vulnerable as a result of measures meant to protect them.
The conference opened with ‘Guest of Honour’ Max Schrems addressing delegates. Max has been very much in the media spotlight in recent weeks, as the case he took against Facebook resulted in the European Court of Justice (ECJ) ruling that the ‘Safe Harbour’ principle (a legal agreement that allowed companies such as Facebook, Google and Microsoft to transfer EU citizens’ data to the US for processing) is no longer valid. In his presentation, he gave an overview of his case and the technicalities of the judgement. Max received a warm welcome from delegates, and personally I thought his participation brought great energy to the conference. His story has an appealing ‘David and Goliath’ resonance. Who would have thought an individual student could successfully take on a technology giant like Facebook?
The opening keynote speaker, Julie Brill of the US Federal Trade Commission (FTC), confirmed the impact of Max’s case across the Atlantic, describing the ECJ judgement as “measuring 7.8 on the Richter scale”. Providing a US view on the ECJ decision, Julie interestingly rejected the argument held by some of her compatriots that the ruling amounted to a form of EU protectionism. In fact, she explicitly agreed that the Safe Harbour provisions needed to be improved, adding that the US Department of Commerce was working hard to achieve these improvements. Most of the rest of Julie’s presentation was devoted to the various actions the FTC had taken against Google, Facebook and others to protect consumers’ privacy. Her concluding comment that “we [FTC] are doing an awesome job” of protecting consumers’ privacy rights left me wondering who she was trying to convince, the audience or herself!
A central issue addressed by many of the presenters was the challenge of balancing the individual’s right to privacy against the potential benefits that analysis of the very same data may hold for wider society. A keynote from Anita Allen addressed this with regard to medical data: how do we balance the privacy rights of patients against the wider benefits that analysis of their medical data may deliver in terms of the development of new medicines and treatments? Deirdre Mulligan from the Berkeley Center for Law & Technology perhaps best captured the tension, framing the privacy debate as the desire for personal privacy versus the ability of intelligence services to maintain national security.
Anonymity and Tracking
Ashkan Soltani, Chief Technologist at the FTC, delivered an interesting talk on how sophisticated online tracking tools have become. I was familiar with some of his research on issues such as the use of persistent cookies by web tracking companies. Ashkan provided an overview of how the tools used by web tracking companies operate, and how they can circumvent users’ attempts to prevent such tracking. However, while the talk was informative, it failed to explain what the FTC was doing to help consumers combat these practices.
A Role for Regulation?
Viktor Mayer-Schönberger, Professor of Internet Governance at the Oxford Internet Institute, made the case for more regulation to protect privacy. Viktor told us that “privacy is not dead”, but that the mechanisms we use to protect it are not working properly and need to be revised. Individuals’ privacy is at the mercy of big companies and government agencies, and although we rely on ‘informed consent’ to manage personal privacy, we have discovered that this does not work effectively. He argued that we need laws limiting what can be done with personal data, citing the regulation of car safety, food safety and the like, and contending that we should do the same in relation to privacy: where a strong power imbalance exists, we must intervene in the market. In summary, Viktor argued that we need old-fashioned regulation – ‘a framework for big data usage’, as he calls it.
In a wide-ranging talk, Professor Helen Nissenbaum also made the case for more regulation. She argued that privacy is a solution to certain societal problems, for example preventing the flow of information that enables discrimination. Her core message was that we should regulate the use of personal data, not the flow of data.
A New Privacy Doctrine
While much of the debate at the conference centred on issues such as how transparency could be improved, or on changes to what we define as sensitive or personal data, Professor Amitai Etzioni delivered an interesting, philosophical presentation in which he called for a comprehensive rethink of how we approach privacy. He is calling for what he termed a new ‘Cyber Age Privacy Doctrine’, whose key characteristics would include: (a) different rules for the collection and processing of more versus less sensitive personal information; (b) a clear differentiation between point collection and dossier building of personal information by tracking firms; and (c) rules that limit cybernation, especially where non-sensitive information is used to infer sensitive information. Privacy, he argued, should be viewed as a ‘personal bubble’ that travels with a person rather than being tied to specific physical or virtual places.
The Future of Privacy
A panel proposal by Google invited delegates to examine ‘privacy and future challenges’ in the form of an ‘exclusive fireside chat with Peter Fleischer, Google’s Global Privacy Counsel’. The talk essentially amounted to a description of how Google managed the Google Spain ruling (e.g. setting up processes to deal with take-down requests to remove links to content, escalation procedures, etc.). In terms of ‘future predictions’, we were told to expect US intelligence agency surveillance to continue, a ‘Safe Harbour 2.0’ to come, and differing and contradictory rulings from individual DPAs on disputed take-down requests. Oh, and Android will allow privacy settings on a per-app basis in the future.
While the call for a more fundamental rethink in the form of a ‘new privacy doctrine’ is appealing and interesting on a theoretical level, it seems unlikely that it could be implemented in practice. To me, the way forward will require regulators to address two key areas. For Data Collectors, new regulations governing how personal data is collected, stored, processed and shared are needed, together with effective enforcement of those rules. For Data Subjects, greater transparency about how personal information is collected and used, including how we are tracked online, is essential to allow users to make informed decisions about when, and to whom, they choose to disclose personal information. More empirical research demonstrating causal links with respect to the effectiveness of ‘privacy-enhancing’ features has an important role to play in the policy debate. CREATe has been a leader in establishing the important role empirical research can play in prompting changes to copyright policy, an approach that could contribute significantly to the digital privacy debate.