News / July 2023

How to uphold privacy for the long run? A continuing fight for data protection, agency, and accountability

In May 2023, the Irish Data Protection Commission fined Meta 1.2 billion euros for violating the General Data Protection Regulation (GDPR), which restricts the transfer of EU citizens' data. Meta was also ordered to stop transferring data collected from EU users to the U.S.

This is not the first time EU regulators have levied heavy fines on U.S. tech companies for mishandling user data. Indeed, Meta had already been hit with hundreds of millions of euros in fines by the same Irish regulator for data breaches involving its WhatsApp and Instagram services. In other recent cases, Amazon was fined 746 million euros by Luxembourg for non-compliance with GDPR's data processing rules, and Clearview AI was fined by the French data protection agency, CNIL, for its illegal collection and processing of biometric data.

This wave of cases serves as a reminder of how fragile user privacy is today. While the GDPR established legal guardrails for data protection, regulators, researchers, and practitioners need to think proactively about how to steer the tech industry away from privacy violations, data misuse, and erosion of autonomy. So how can firms be led to uphold privacy for the long term? What is the north star for privacy advocates, regulators, and researchers? 

To weigh these questions, the Center for Long-Term Cybersecurity partnered with the Center for Digital Trust at EPFL and convened a panel at this year’s Computers, Privacy, and Data Protection conference. 

Hanlin Li speaks at CPDP Conference
From left to right: Robin Wilton, Internet Society (moderator); Carmela Troncoso, EPFL Security and Privacy Engineering Lab; Hanlin Li, UC Berkeley Center for Long-Term Cybersecurity; and Paul Nemitz, European Commission

The panel brought together four panelists from diverse geographies and disciplinary backgrounds, along with a moderator:

  • Jhalak Kakkar, Executive Director, Centre for Communication Governance & Visiting Professor, National Law University Delhi (IN); 
  • Paul Nemitz, Principal Advisor on Justice Policy, European Commission (EU); 
  • Carmela Troncoso, Tenure Track Assistant Professor & Head of the Security and Privacy Engineering Lab, Swiss Federal Institute of Technology, Switzerland (CH); 
  • Hanlin Li, Postdoctoral Researcher, Center for Long-Term Cybersecurity, UC Berkeley (US); 
  • Robin Wilton, Director, Internet Trust, Internet Society (UK), as the moderator. 

Wilton first invited the panelists to discuss why privacy is important. First and foremost, they agreed, privacy is a fundamental human right. Because today's technologies are designed to vacuum up vast troves of data about users, from demographic information to browsing history, it has become increasingly difficult for individuals to withhold information about themselves. Nemitz stressed that, given the sensitivity and comprehensiveness of the data technology companies collect about citizens, governments have a responsibility to support citizens' privacy rights and protect them from surveillance and harmful inferences.

Nemitz and Troncoso also highlighted how a just, democratic society necessitates strong privacy protections. Nemitz sees privacy as a crucial pillar of democracy because, with access to large-scale data about citizens' consumption behaviors, views, and preferences, companies are equipped with unprecedented power to influence many aspects of public life. Governments have a duty to strengthen individual privacy rights and thereby minimize corporate influence on elections, social well-being, and public health. Troncoso further stressed that privacy should not be the end goal, but should serve as a defense mechanism citizens can leverage against the exploitation of personal information and abuses of power by technology companies. For example, privacy protections could allow users to set restrictions on what their data can be used for. If scaled, this mechanism would give users the power to shape what data gets aggregated and what technologies are developed.

According to Li, strengthening privacy rights is a means toward not only a more democratic society but also a more equitable market in which those who contribute data can benefit from the aggregation of data. Currently, tech companies collect large troves of personal data, often in exchange for providing free or low-cost products and services. Yet users have no way of knowing whether this transaction is a fair value exchange. As members of the public become more concerned about how their personal data is monetized, stronger privacy protection would give more power to citizens to make meaningful decisions about when and how to share their personal data. 

The stakes are high, so what can we do to mitigate the erosion of privacy? Kakkar described solutions drawn from India's 2022 Digital Personal Data Protection Bill, including placing fiduciary duties on data processors and collectors and potentially establishing data trusts: public entities that represent the interests of communities. Individuals exercising their privacy rights, for example by deleting their data, will not make a meaningful impact on tech companies. Community-based approaches, such as data trusts, are necessary to organize collectives of citizens and tip the balance. Kakkar also noted that in exploring data stewardship models such as data trusts, we urgently need to map how different types of datasets fall along the dichotomy between community data and personal data: Where should data rights rest between individuals and communities?

Troncoso offered one direction toward a more privacy-oriented tech future: making privacy more of a priority for developers through engineering education. Users should not be the ones to worry about privacy, as they are rarely equipped with the knowledge and resources to make meaningful decisions about their personal data. As such, educating developers who are in a better position to shape technologies based on privacy best practices, with awareness of implications and risks, is a natural next step. 

Nemitz drew attention to the importance of public awareness of the value of data protection for empowering legislators and advocates to continue the fight for privacy. Data protection is a public good, as it serves to uphold individual privacy rights, support democratic values, and promote economic growth. 

In the end, all panelists agreed that the fight for privacy requires concerted, continuing efforts across geographies. Citizens should be a key group of stakeholders in shaping the future of privacy and technology. After all, anyone who contributes data about themselves to the tech ecosystem is, in turn, affected by the ecosystem. As Wilton summarized, “You don’t have to be in the statistics to be affected by the statistics.” We are all in this together.