
Children’s Apps Unknowingly Collecting Data Pose Compliance Risk


Some apps used by children are collecting more data than even app developers know, posing problems for compliance with U.S. privacy law.

Under federal children’s privacy law, app makers that know children under 13 are using their apps must take steps to protect their data. Issues can arise when apps incorporate third-party software that collects data from users without the developer’s knowledge, often because of misconfigured settings, raising questions about liability under the Children’s Online Privacy Protection Act.

Updating COPPA’s so-called knowledge standard, as advocates have called for and some lawmakers have proposed, could force makers of such software to bear more responsibility for compliance.

The Federal Trade Commission is still considering updates to its COPPA rules as part of a review that launched in 2019. The commission’s last update to COPPA rules took three years to complete.

Even with no rewrite of the law, or of the rules for carrying it out, on the near-term horizon, legal thinking in federal courts is starting to shift toward heightened compliance obligations for the companies that provide apps’ underlying technology, not just the apps themselves.

A set of recent legal settlements reached over popular gaming apps represents “a sea change” in how courts enforce children’s privacy law, according to Amy Lawrence, counsel to the privacy and data security group at Frankfurt Kurnit Klein & Selz PC.

“For a long time, it was clear tech providers don’t have an obligation to comply with COPPA unless the content creator says it’s a kids app,” Lawrence said.

Now, as targeted advertising faces growing public scrutiny, responsibility for protecting children’s data isn’t falling only on content creators like The Walt Disney Co. and Viacom International Inc., which were involved in the recent gaming app settlements. Under those settlements with consumers, the companies agreed to go beyond what COPPA requires in restricting how children’s data is collected and used for advertising.

“It’s pushing obligations down,” Lawrence said.

Data Collection
COPPA gives parents control over what information online platforms can collect about their kids. Its privacy protections apply when a tech platform is directed at children or when companies have so-called actual knowledge that they’re handling data on users under 13.

Children’s advocates have argued that this knowledge standard can incentivize tech companies to look the other way if children lie about their age to create accounts on platforms like Facebook Inc.’s Instagram. Facebook has faced backlash over a planned new version of the photo-sharing app for children.

“There’s a view throughout the app ecosystem that the best course is not to look and thus not to know,” said Leslie Harris, acting president of the International Digital Accountability Council, a watchdog group that investigates mobile apps.

“We need to turn that paradigm on its head and build looking into the apps ecosystem,” Harris said.

A legislative proposal from Sens. Edward Markey (D-Mass.) and Bill Cassidy (R-La.) would revise the privacy law’s standards so that companies must seek consent for data collection when they “reasonably know” that children are on their platforms.

Previous efforts to update decades-old online protections for children have fallen into a partisan divide over whether parents should be given the right to sue companies over alleged privacy violations. Democrats have favored this private right of legal action, but Republicans haven’t supported the provision, saying it exposes companies to lawsuits that could harm smaller startups without large legal departments.

Such suits remain a sticking point for broader consumer privacy proposals, which have likewise failed to gain traction among federal lawmakers.

App Developers
Sometimes even app developers find it challenging to figure out where their apps send data because of third-party components built into them, according to a study by Serge Egelman, research director for usable security and privacy at the University of California, Berkeley.

Makers of these components might not be transparent about what they’re doing with user data, or the components might just be configured in a way that doesn’t account for COPPA obligations, Egelman’s study showed.

Egelman built tools to inspect 5,855 Android apps directed to children and found that more than half appeared to be violating COPPA. Many of the potential violations stemmed from the data collection behaviors of third-party software rather than from code written by the app developers themselves, according to the research.

The software can be configured to enable COPPA-compliant data collection. But Egelman’s study said some app developers weren’t correctly configuring software to disable personal data collection for profiling and ad targeting. Others were using software that, according to its terms of service, wasn’t meant to be used in child-directed apps.
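For a concrete sense of what correct configuration looks like in practice, here is a minimal sketch for an Android app that uses Google’s Mobile Ads SDK, one common third-party component (the SDK choice and the application class below are illustrative assumptions, not details from Egelman’s study). The SDK exposes a child-directed-treatment tag; setting it tells the SDK to disable personalized advertising, while leaving it unset is exactly the kind of misconfiguration the study describes.

```kotlin
import android.app.Application
import com.google.android.gms.ads.MobileAds
import com.google.android.gms.ads.RequestConfiguration

// Hypothetical application class for illustration only.
class KidsGameApp : Application() {
    override fun onCreate() {
        super.onCreate()

        // Tag every ad request as child-directed so the SDK disables
        // personalized advertising. The default value is "unspecified,"
        // so a developer who never sets this flag can end up collecting
        // data for ad targeting without intending to.
        val config = RequestConfiguration.Builder()
            .setTagForChildDirectedTreatment(
                RequestConfiguration.TAG_FOR_CHILD_DIRECTED_TREATMENT_TRUE
            )
            .build()
        MobileAds.setRequestConfiguration(config)

        // Apply the configuration before initializing the ads SDK.
        MobileAds.initialize(this)
    }
}
```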

Egelman, who testified before lawmakers in a recent congressional hearing on children’s privacy law, has argued for a knowledge standard that forces third-party recipients of data to determine whether the data comes from child-directed apps. In his view, that would shift the burden of COPPA compliance away from many small app developers and toward a relatively limited set of data recipients.

He’s also called for stricter standards under the Federal Trade Commission’s safe harbor program for COPPA. The program allows certain self-regulatory organizations, such as BBB National Programs’ Children’s Advertising Review Unit, known as CARU, to certify apps or other online services as compliant with children’s privacy law.

But there’s a lack of transparency about which apps are certified and how, Egelman argues. In May 2020, the FTC settled with digital game maker Miniclip S.A. for misleading consumers about its membership in the CARU safe harbor.

Egelman suggested that safe harbor certifications should rely on independent forensic evaluations of an app’s privacy behaviors, with the FTC developing standards for the evaluations.

App Stores
App stores such as those run by Apple Inc. and Alphabet Inc.’s Google have their own policies concerning content and the handling of personal information on child-facing apps. The stores could be “more rigorous” in how they enforce these policies, Egelman said.

“They’re not testing apps to see how they’re handling children’s data,” he said. “They offload that burden to app developers.”

Advocacy groups have urged the Federal Trade Commission to investigate whether the Google Play Store misrepresents some apps as safe and appropriate for children when they’re not.

The Campaign for a Commercial-Free Childhood and the Center for Digital Democracy argue that focusing on the app store could help the FTC address systemic issues with children’s apps, rather than enforcing against one app at a time.

Harris likewise suggested that app stores could serve more of a gatekeeping role for apps labeled as kid-friendly. Developers should be required to audit their software’s data practices before gaining access to an app store, she said.

“If no knowledge is all you need for no legal liability, then we’ve got to take it beyond what the law says and just do it for the ethical reason that we want to have good apps for children,” Harris said.