Rebuilding Trust in the Digital Ecosystem: New Mechanisms for Accountability

A report by the International Digital Accountability Council & GMF Digital

Summary

A patchwork of privacy laws globally and the lack of baseline protections in the United States undermine users’ trust in the digital ecosystem and limit the transformative potential of technology. Rebuilding trust requires big changes at both the domestic and international levels.

Congress must pass a federal privacy law coupled with nimble implementation measures that can move at the speed of the Internet. Privacy legislation should provide for accountability mechanisms that go beyond traditional law enforcement. Independent watchdogs can play an important role in monitoring privacy practices and enforcing codes of conduct. Additionally, those offering digital products and services need an advanced understanding of privacy rules. Education and certification programs can ensure that they adhere to these rules effectively and consistently. Independent watchdogs could help inform the curriculum.

Internationally, differing legal frameworks for privacy rules must be interoperable to ensure that personal data remains protected when transferred across borders. An enforceable code of conduct that fulfills the requirements of privacy laws in multiple jurisdictions could support interoperability.

Introduction

The open nature of the Internet has allowed it to flourish, giving billions of people access to information and connecting us in ways that had never before been possible in human history. This free flow of information has enabled digital technologies to transform economies and societies, but it has also created unique governance challenges. As digital technologies have dramatically affected every aspect of our lives—from the economy to the health-care system to education to romantic relationships—governments have struggled to implement rules that enable the growth of digital innovation while protecting consumers, users, and democratic institutions.

Increasingly, the patchwork of governance structures and accountability mechanisms seems outmatched by the challenges that emerge from the digital landscape. Cybersecurity threats leave governments, businesses, and users vulnerable to the abuses of malicious third parties, some of whom are connected to governments. Online disinformation has pushed democratic institutions worldwide to a breaking point. And a lack of trust in data privacy threatens to undermine the transformative promise that digital technologies offer for commerce, health care, entertainment, and education.

The largest economies have taken differing approaches to commercial data privacy, reflecting competing philosophies. The approach in the European Union, codified through the landmark General Data Protection Regulation (GDPR), differs considerably from the orientation in the United States, which has a patchwork of state- and sector-specific laws but lacks baseline protections. Emerging economic giants such as Brazil, China, and India have each taken different paths toward digital privacy. The net result of this lack of harmonization is governance gaps that leave users exposed to considerable privacy risks.

The result of this global patchwork system is a lack of trust. Eighty-one percent of Americans say that they have very little or no control over the data that companies collect about them, and 79 percent say that they are concerned about how companies use their personal data.1 This lack of trust emerges from an ecosystem bedeviled by myriad risks and harms. As the Cambridge Analytica scandal demonstrated, violations of best practices in consumer data privacy can have a profound effect on democratic discourse. The recent example of a fertility app surreptitiously transmitting user data to unseen third parties affects vulnerable users in a deeply intimate way, further eroding trust.2

Lack of trust in the digital ecosystem can undermine the transformative possibilities the digital revolution creates for improving people’s lives. One survey found that over half of Americans do not use a product or service due to concerns about their privacy.3 If users do not trust apps with sensitive health data, they will not enroll in digitally enabled public health programs meant to contain the global coronavirus pandemic.4 People may be more reluctant to download their health data in connection with innovative programs like Blue Button,5 which makes it easier for patients and their doctors to unleash the potential of health data to personalize their care. Parents may not want their children to use distance-learning apps that enable learning during a pandemic and hold the possibility of dramatically improving the efficacy of teaching and learning by tailoring pedagogy to the needs of individual learners.6 In the commercial space, it is easy to see how fundamental trust is to online retail, digital entertainment, and increasingly data-rich homes and businesses.

Rebuilding trust in the digital ecosystem requires big changes. The next section lays out some models for reform, including past efforts at updating U.S. privacy rules. The section after that outlines the critical elements of a path forward on commercial data privacy. The need for federal baseline privacy legislation is clear, and this brief describes the necessary contours of such a law, but there are other elements critical to effective reform.

Read the full report here.