Who We Are
Society needs a new paradigm for dealing with digital harms, such as attention extraction, excessive commercial surveillance, cybercrime, and discriminatory algorithmic profiling.
UC Berkeley’s Center for Long-Term Cybersecurity supports a research program, Data Metabolism, to develop new approaches to accountability around digital harms. We refer to data “metabolism” to reflect the ways in which data practices and policies can generate both value (similar to animal metabolism forming energy from food) and harm (toxic inputs yield negative outcomes).
Our research on data metabolism develops new approaches to communicating and accounting for digital harms, and suggests solutions that amplify the benefits of good data practices. The purpose of this program is to generate insights and interventions for the global information system through novel forms of communication and accounting, and through study of social and cultural norms.
Motivating this program is the idea that digital harms behave like negative externalities: costs imposed on society that are not factored into the price of a data activity, such as discriminatory consumer profiling or unwanted tracking. Digital harms related to privacy and security are negative externalities that need to be managed at the system level, rather than the individual level. Yet prevailing approaches to digital harms in the United States and other Western jurisdictions have focused on individual rights and remedies. As one paper produced by our program has argued, we need policy, legal, and cultural models that address our data economy's collective and societal harms, not just individual harms, in order to mirror evolving social norms.
What We Do
A critical step toward shifting the data economy paradigm is to strengthen norms around corporate accountability and communication for security and data practices. Toward this end, the Data Metabolism program is exploring how norms evolve in communication between companies and their stakeholders with respect to corporate responsibility, risks, and opportunities inherent in their data and cybersecurity practices.
In this era of digital transformation for business, companies are at the forefront of responsibility for data and security, and technology firms are perhaps the most visible actors in this shift toward accountability. As social norms about privacy and security evolve, tech firms must respond to activists, investors, consumers, and the broader public in order to maintain their license to operate.
Investors and NGOs also have a role to play. In recent decades, activists and institutions have demanded more and better information from companies about social externalities, and they have sparked a sustained focus on corporate transparency. The Data Metabolism research program examines this information exchange, with a particular focus on corporate communication in the form of ESG (environmental, social, and governance) disclosure.
A CLTC workshop in June 2022 convened specialists to create a forward-looking agenda for research and collaboration on improving communication about digital harm in rapidly changing areas of corporate disclosure related to ESG reporting, sustainability, and corporate responsibility. Driving this agenda forward, a series of working groups in fall 2022 identified critical knowledge gaps that could be better understood by interviewing professionals working at the intersection of companies, NGOs, and investors in ESG communication. The interview project is ongoing. Its results are expected to inform collaborative development of a communication tool, designed to meet expectations in the ESG investing community, for assessing and benchmarking firms’ contributions to societal outcomes through their data and security practices. This body of work is generously supported in 2022–23 by Omidyar Network.
Future Directions in Corporate Disclosure on Digital Responsibility
- Jordan Famularo. “Corporate Social Responsibility Communication in the ICT Sector: Digital Issues, Greenwashing, and Materiality.” International Journal of Corporate Social Responsibility 8, no. 8 (2023).
- Jordan Famularo. “A template for voluntary corporate reporting on data governance, cybersecurity, and AI: Designed for the mobile health market.” CLTC White Paper + Template. August 2023.
- Jordan Famularo. “Platform-related Harms.” In Platform Governance Terminologies, edited by Mehtab Khan and Brianna Yang. Yale Law School Information Society Project and Wikimedia Initiative on Intermediaries and Information. June 28, 2023.
- Jordan Famularo. “Future directions in corporate disclosure on digital responsibility.” CLTC White Paper. June 2023.
- Jordan Famularo and Richmond Wong. “How the tech sector can protect personal data post-Roe.” Brookings TechStream.
- Jordan Famularo. Public comment to U.S. Federal Trade Commission for advance notice of proposed rulemaking, Trade Regulation Rule on Commercial Surveillance and Data Security, ANPR R111004. October 19, 2022.
- Jordan Famularo. “Sustainability reporting on digital harm: State of play and future agenda.” CLTC Bulletin. July 21, 2022.
- Jordan Famularo. “Better understanding of data lifecycles can reduce digital harms.” Brookings TechStream. March 30, 2022.
- Steven Weber, Ann Cleaveland, Sekhar Sarukkai, and Sundar Sarukkai. “Dealing with digital waste: A blueprint for managing the data waste stream generated by the digital economy.” CLTC Bulletin. May 5, 2021.
- Steven Weber, Ann Cleaveland, Sekhar Sarukkai, and Sundar Sarukkai. “Reducing the waste from our digital lives.” Noema. April 8, 2021.