More Accountability Needed to Protect Children Online

By: Katherine Zhuo

In the past 20 years, despite drastic changes in the social media landscape and in children’s online presence, federal laws protecting children online have not changed.

During a recent Senate hearing on online protections for children, Instagram CEO Adam Mosseri addressed issues around parental controls, the nature of content reaching children on Instagram, and the lack of transparency from tech companies, including Instagram.

When questioned about the motivation behind the temporarily paused Instagram for Kids app, Mosseri explained that because children will be on the internet and social media regardless of the restrictions tech companies try to impose, the company wanted to create a separate platform specifically for kids that would address the concerns that parents and policymakers have about their safety and privacy online. 

Senator Richard Blumenthal (D-CT) expressed concern that the existing mechanisms of self-policing and self-regulation do not adequately protect children online. Despite attempts to verify age, children under 13 are still registering for social media platforms that collect their data, and anyone under 18 remains at risk of being targeted.

There was a bipartisan demand for Instagram to be transparent about the algorithms used to display content and about its data collection and storage policies, as well as for children to have the right to have their data deleted. Mosseri claimed that, contrary to reports that Instagram is hurting teenagers, the company’s internal research suggests the opposite effect. He indicated that he supports an industry body that could regulate and set standards for platform policies, but declined to say whether he believes independent researchers are also necessary for greater transparency and accountability.

When asked whether he would support federal legislation addressing violations of children’s privacy rights, Mosseri expressed support for federal legislation that would limit the degree to which companies can target customers. He also announced several new measures Instagram would implement to protect children’s privacy online: verifying users’ ages, defaulting young people to private accounts, limiting who can message teenagers, and restricting displayed content to what is age-appropriate for teenagers.

Instagram is just one of many tech companies that face growing concerns about how they protect children online. There is broad consensus that children require stronger privacy protection, yet there is a disconnect when it comes time to take action. Congress must step up and pass legislation that adequately protects children online and provides an enforcement mechanism to ensure that tech companies comply with these laws.

Unfortunately, the existing laws are outdated, and tech companies face little accountability. While social media platforms and tech companies agree that stronger measures are needed to protect children online, they have not yet expressed support for specific proposals or legislation that would hold them legally responsible for protecting children’s privacy online.

As a first step, updating the Children’s Online Privacy Protection Act (COPPA) to give 13- to 15-year-olds control over their data, as some senators have already proposed – along with additional federal legislation – could better protect kids online today.

When COPPA was enacted in 1998, the digital age had only just begun, and the consumer internet was in its infancy. While states like California have since passed a number of privacy laws, there is no federal privacy law comparable to the General Data Protection Regulation (GDPR), the comprehensive data protection and privacy law of the European Union and European Economic Area.

Today, social media platforms exercise a tremendous amount of power over users, especially children who are malleable and easily influenced by social media and the internet. As such, it’s particularly important that those platforms are held accountable when privacy laws are violated. 

Without expanded privacy laws and adequate enforcement, tech companies will continue to take advantage of loopholes and compromise the privacy of their users. As such, we recommend that:

  1. Congress should work together with tech companies and civil society organizations, including independent watchdogs like IDAC, to educate parents and children about social media and their privacy rights on these platforms. 
  2. At a minimum, Congress must expand COPPA to fit today’s evolving digital landscape, including provisions for holding tech companies accountable and punishments for noncompliance. 
  3. Tech companies should adhere to the privacy principles set forth by the Global Network Initiative and commit to supporting specific legislation regarding protecting children’s privacy online. 
  4. Independent organizations and civil society organizations should hold both Congress and tech companies accountable by rating (1) the strength and effectiveness of privacy laws, and (2) compliance with those laws. 

In its investigations, the IDAC team has discovered concerning trends in data collection by children’s apps, including three popular apps that Google removed from its Play Store based on the findings of our investigation. It is time for Congress, app developers, and platforms to take action to better protect children online.