By Alex Psilakis
When it comes to protecting children’s privacy online, there remains much more that developers, platforms, and government regulators can and must do. In a recent policy statement, the Federal Trade Commission (FTC) took an important step in that direction, clarifying how it will uphold the Children’s Online Privacy Protection Act (COPPA) amid the rapid growth of education technology tools that children use.
Passed in 1998, COPPA seeks to protect the data privacy of children under 13 years of age. It places strict limitations on the types of data that websites or online services can collect from children, as well as on how long those services can store children’s data. COPPA also mandates that companies implement strong security measures to protect children’s data.
To adapt to rapid technological advancements, the FTC has occasionally updated COPPA. For instance, in 2013, the FTC revised COPPA by expanding its definition of “personal information” to include persistent identifiers – long-lasting references to digital objects – used to target advertising to children.
The FTC’s recent statement clarifies that it will use COPPA to uphold children’s privacy as Ed Tech programs rise rapidly in prominence. Because students relied heavily on Ed Tech during the pandemic – and will continue to do so moving forward – the FTC makes clear that protecting children’s privacy in this space is especially important.
In the statement, the FTC outlined specific COPPA provisions it aims to enforce, such as the prohibition against mandatory collection, which prevents COPPA-covered companies from requiring students to share more personal information than is necessary to participate in an activity. The FTC also reiterated its commitment to retention prohibitions, which bar COPPA-covered companies from retaining the personal information collected on children longer than is reasonably needed.
The FTC concluded the policy statement by reaffirming its dedication to upholding children’s privacy protections, stating,
“Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of ed tech tools. Going forward, the Commission will closely scrutinize the providers of these services and will not hesitate to act where providers fail to meet their legal obligations with respect to children’s privacy.”
We applaud the FTC’s commitment to prioritizing children’s privacy in Ed Tech. Our research has shown the need for action. In 2020, IDAC investigated the privacy practices of nearly 500 Ed Tech apps spanning 22 countries, focusing in particular on apps that teachers or parents may have found useful as tools to aid in remote learning. Although IDAC’s investigation did not discover any unlawful privacy practices, we did uncover some areas of concern. For example, apps like Hello Talk and Learn Python collected location data without a clear reason for doing so. Some apps also shared personal information with third parties. The iOS app Homer, for instance, shared users’ names, email addresses, and Identifier for Advertisers (IDFA) with 24 third-party domains.
Given the rise in use of Ed Tech apps, as well as the presence of troubling data privacy trends, it is essential that both developers and regulators work to prioritize the protection of children’s privacy. Developers must take substantive steps to protect children’s privacy, such as limiting the collection of location data, maintaining transparent privacy policies, and minimizing the amount of data apps share with third parties, especially advertisers and data brokers. When violations do occur – specifically COPPA violations – the FTC must step in and respond accordingly.
IDAC will continue to monitor the Ed Tech space, educating developers on how best to protect children’s privacy, and notifying regulators of any suspected violations we discover.