Corporations are increasingly called upon to disclose their practices around technology, including how they manage data, cybersecurity, and artificial intelligence. Yet no clear standard prescribes what such reporting should look like.
A new white paper from the Center for Long-Term Cybersecurity (CLTC) aims to fill this gap. The paper, A Template for Voluntary Corporate Reporting on Data Governance, Cybersecurity, and AI — Designed for the Mobile Health Market, authored by Jordan Famularo, PhD, Postdoctoral Scholar at CLTC, presents a disclosure template for companies in the mobile health market, a market that includes providers of healthcare-related products and services ranging from mobile health apps to fitness wearables and medical devices.
“Without solid norms about what this reporting should look like, there is no recognizable consistency across companies or comparability over time, raising doubts about the effectiveness of the information for augmenting investor decision-making and catalyzing better corporate practices,” Famularo explains in the report’s executive summary. “This paper presents a template for such disclosure in the mobile health market, designed to guide more consistent reporting by companies and to propel better information for investors concerned about financial materiality, human rights, and equity.”
Famularo has conducted extensive research into how companies communicate aspects of digital responsibility to investors, and how norms for corporate disclosure evolve. In a previous report, Future Directions in Corporate Disclosure on Digital Responsibility, she examined how institutional investors, technology firms, and civil society are shaping norms around disclosure, including the dynamics that deter or motivate reporting.
The new white paper introduces a template (available for download as a Microsoft Excel spreadsheet) designed to guide companies through their reporting on data governance, cybersecurity, and AI, based on a series of 26 prompts. To develop the template, Famularo used a Delphi study (a structured consensus-building process), conducting interviews with a panel of 20 experts from eight countries who represent a range of professional domains, including law, institutional investing, consulting, pharmaceutical and technology companies, non-profit organizations, and academia. The template is intended to supply investors and companies involved in the mobile health market with a framework for monitoring, disclosing, and evaluating risks and opportunities related to data governance, cybersecurity, and AI. “The goals were to select and systematize the most critical disclosure recommendations, organize them into a user-friendly reporting template, and aid reporting practice by appending commentary and definition updates,” Famularo explains.
The template is designed for reporting by companies on a spectrum from Series C (high-growth startup with established market presence) to public (mature company with publicly traded shares). The paper focuses on the mobile health market, but the template could be adapted for other industries. “Though mobile health is a specific market with distinct digital responsibility disclosure needs, this work could also inform adaptations for different markets, verticals, and industries,” Famularo says.
A variety of stakeholders are likely to find the template useful, Famularo notes, including:
- Investors seeking to understand financially material risks, human rights risks, and salient equity issues across the B2C mobile health market;
- Sustainability, corporate responsibility, and ESG practitioners at companies in the B2C mobile health market seeking to understand stakeholder priorities regarding disclosure;
- Executive management and boards reviewing expectations in the voluntary reporting landscape for B2C mobile health companies and/or disclosures on data governance, AI governance, or cybersecurity;
- Civil society organizations engaging companies and governments on priority human rights and equity issues in the B2C mobile health market;
- Regulators and policy-makers seeking to explore interventions in corporate reporting regarding data governance, AI governance, cybersecurity, and/or mobile health;
- Human rights assessors seeking to help companies conduct human rights due diligence;
- Standard-setters exploring development of reporting guidelines relevant to data governance, AI governance, cybersecurity, and/or mobile health companies; and
- Consultants and advisors providing services to any of the above.
“The purpose of the research was to illuminate ties between corporate disclosure and digital responsibility, and to produce a tool that offers some reporting guideposts for companies, investors, and their observers,” Famularo writes. “Our first aim is to offer a systematic, empirically tested, user-friendly template that guides voluntary disclosure by companies on data governance, cybersecurity, and AI, and that informs investors’ expectations for such reporting. Our second aim is to provide a groundwork for future efforts by the private, public, and non-profit sectors to clarify expectations for corporate disclosure on data governance, cybersecurity, AI, and related topics.”
For more information, contact Jordan Famularo at jordan.f@berkeley.edu.