By Gauri Gupta and Ginny Kozemczak
In her recent keynote address at a Future of Privacy Forum event, Federal Trade Commission (FTC) Acting Chairwoman Rebecca Slaughter outlined how pandemic-induced privacy and security issues surrounding education technology and healthcare will be key priorities for the FTC under the new administration.
The sudden shift to digital learning and healthcare brought on by the COVID-19 pandemic led to the rapid development of education and health mobile apps. In the absence of federal privacy legislation, the FTC’s commitment to deterring problematic privacy and data security practices in these areas is critical.
The FTC’s initiatives are an important step toward bringing privacy issues to the forefront of the national policy agenda. These efforts are a much-needed step in the right direction from the new federal administration and are in line with clear industry needs revealed through past investigations by the International Digital Accountability Council (IDAC).
The FTC has issued privacy-related guidance for parents, schools and providers in education technology, or EdTech. Last December, it launched an industry-wide study that will survey various social media and video streaming platforms to learn more about the EdTech services they provide. The FTC is also taking important steps around enforcement by actively working to clarify how the Children’s Online Privacy Protection Act (COPPA) will be applied to the EdTech space. COPPA requires “operators of commercial websites and online services directed to children under 13 or knowingly collecting personal information from children under 13 to obtain verifiable parental consent prior to the collection, use, or disclosure of children’s personal information.” While COPPA’s notice-and-consent requirements apply only to children under the age of 13, the FTC proposes that these protections should also extend to users of EdTech services who are 13 or older.
The proliferation of digital learning demands strong privacy protections for students. During the initial days of the pandemic last year, as students around the world were just beginning to adapt to remote learning, IDAC investigated 496 EdTech apps being used across 22 countries. Our findings revealed that many EdTech apps fell short of best privacy and security practices: they failed to disclose the collection and sharing of user data with third parties, circumvented user privacy controls, or embedded potentially invasive and unnecessary software development kits (SDKs).
In line with the FTC’s guidance, IDAC strongly recommends that EdTech companies be more transparent about data protection policies and inform users when their information is being collected and how it will be used.
The FTC is right to encourage schools to consult with authoritative sources on the suitability of the privacy and security policies of the services they use. But that should just be the first step. The FTC must take a proactive stance to deter privacy harms in digital learning spaces, while helping companies and developers improve student privacy.
Slaughter also announced that health apps would be a priority for the FTC, given the similar rise and essential nature of telehealth during the pandemic. Citing the FTC’s recent fem-tech case and settlement with Flo Period & Ovulation Tracker, a menstruation and fertility app, Slaughter underscored that effective consumer notice is critical to promoting user privacy. The FTC’s settlement prohibits Flo from misrepresenting its data collection and sharing practices, after the agency found that Flo had disclosed user health data to third-party marketing and analytics companies despite assurances otherwise. Although prioritizing consumer notice is, in theory, a move toward user privacy, in practice these regimes still fall short of meaningful protection: consumers should have explicit rights over their data without having to navigate complex privacy policies. The FTC should build on these measures with even stronger consumer protections in the near future.
IDAC is similarly focused on ensuring that sensitive health apps take all necessary steps to secure the extremely personal data they collect. Last year, IDAC investigated the popular fertility app Premom, which we believed was violating federal and state laws and the Google Play terms of service, and referred the case to the FTC and the Illinois Attorney General. The FTC should pay particular attention to apps that handle sensitive health data and disproportionately affect vulnerable populations.
Slaughter noted that the FTC will closely examine health apps more generally, including telehealth and contact tracing apps. IDAC’s 2020 investigation into COVID-19-related mobile apps highlights many of the concerns surrounding user data protection in health apps. We investigated 108 global apps spanning 41 countries, and found several instances in which apps fell short of best privacy and security practices and posed potential risks to users. In particular, some apps were not transparent about their data collection and third-party sharing practices, some sent unencrypted transmissions, and others requested unnecessary permissions.
IDAC supports the FTC’s guidance regarding mobile healthcare. Developers should practice the following:
- Data Minimization: Apps should limit the information they collect and share about users to only data that is absolutely necessary for the app to provide its services.
- Limiting Access and Permissions: App developers should incorporate privacy by design principles and set default privacy settings for apps to give users more control over their data, and ensure that permission requests are narrowly tailored.
- Accessibility and Transparency: Developers must use accessible and transparent privacy policies that clearly explain how users’ personal data is collected, used, retained, stored, and shared.
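The data-minimization principle above can be made concrete in code. The sketch below is a hypothetical illustration (the feature names and fields are invented, not drawn from any real app): each feature declares the fields it strictly needs, and everything else is stripped from a payload before it leaves the device, so new code paths must opt in explicitly rather than leaking data by default.

```python
# Hypothetical sketch of data minimization: an allowlist of the fields
# each feature genuinely needs, applied before any data is transmitted.

REQUIRED_FIELDS = {
    "sync_progress": {"user_id", "lesson_id", "completed_at"},
    "submit_quiz": {"user_id", "quiz_id", "answers"},
}

def minimize_payload(feature: str, payload: dict) -> dict:
    """Return only the fields the named feature actually needs.

    Unknown features get an empty allowlist, so nothing is sent
    unless a developer has explicitly declared it necessary.
    """
    allowed = REQUIRED_FIELDS.get(feature, set())
    return {k: v for k, v in payload.items() if k in allowed}

raw = {
    "user_id": "u42",
    "lesson_id": "algebra-1",
    "completed_at": "2021-03-01T10:00:00Z",
    "device_contacts": ["..."],        # never needed for lesson sync
    "precise_location": (40.7, -74.0), # never needed for lesson sync
}
print(minimize_payload("sync_progress", raw))
```

Keeping the allowlist as data rather than scattering field checks through the code also makes it auditable: a reviewer (or a privacy policy) can see at a glance exactly what each feature collects.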
These recommendations help protect users from harmful data collection practices and help them retain control over their personal information. Notably, by ensuring that user data is handled responsibly, developers can build the trust necessary to facilitate public participation in critical pandemic response efforts.
The privacy risks surrounding digital learning and healthcare have the potential to impact a substantial number of users. The FTC must continue to respond quickly and effectively to concerns as they arise. One way it can do that is by working closely with technologically savvy watchdog organizations like IDAC that can monitor emerging privacy issues and remediate them at an early stage.
We also recommend that the FTC, under any acting or Senate-confirmed commissioner, look at ways to protect personal health data that falls outside the scope of the Health Insurance Portability and Accountability Act (HIPAA). The FTC must be proactive in paying particular attention to pandemic-related apps that are likely to emerge in the coming months, such as vaccine passports. Relatedly, the FTC should focus on the data retention policies of apps that have collected — and are continuing to collect — large amounts of personal health data related to COVID-19.
IDAC will continue its work to ensure that EdTech and healthcare app developers follow best practices to safeguard user data and strengthen the digital ecosystem.