
Google’s Efforts to Fight Bad Apps & Developers Are a Good Start. Now It Must Build on These Efforts


By Ginny Kozemczak

Last year, Google took impressive steps to better protect users and guard against inappropriate and malicious behavior in its mobile app ecosystem. These measures are an important move toward keeping user data secure.

Now is the time for Google to build on its policies and procedures to further reduce the risk of user harm. 

Monitoring Apps 

Google has taken several steps to monitor apps during the pre- and post-publication stages, including an enhanced app review process to prevent policy-violating apps from being published on the Google Play store, limits on developers’ access to sensitive permissions such as location data, and a focus on software development kit (SDK) enforcement.

In our investigations, IDAC has identified SDK behavior that creates real potential for abuse. We have seen that many developers do not fully understand the data collection and sharing practices of the SDKs embedded in their apps.
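To illustrate why this opacity matters, here is a simplified sketch of how an embedded analytics SDK can collect and transmit device data from a single initialization call. The SDK name, its fields and its endpoint are our own illustration, not a real library.

```kotlin
import android.content.Context
import android.os.Build
import java.util.TimeZone

// Hypothetical SDK: the host developer adds one init() call, but everything
// below runs inside the vendor's library, invisible to the app's own code.
object AnalyticsSdk {

    fun init(context: Context) {
        // Device metadata harvested at startup, with no explicit action by
        // the developer beyond calling init().
        val payload = mapOf(
            "package" to context.packageName,
            "device_model" to Build.MODEL,
            "os_version" to Build.VERSION.RELEASE,
            "timezone" to TimeZone.getDefault().id
        )
        // The data leaves the device here; the developer sees only a
        // packaged library, not this code path.
        sendToVendor(payload)
    }

    private fun sendToVendor(payload: Map<String, String>) {
        // Network call to the SDK vendor's servers (omitted in this sketch).
    }
}
```

From the developer’s side, the integration is typically a single line such as `AnalyticsSdk.init(this)` in the app’s `Application.onCreate()`, which is why many developers never see what their embedded SDKs actually collect and share.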

Google must continue to monitor and take enforcement action against the inappropriate use of SDKs. It should also work with developers to give them clear information about its policies and procedures, and feature trusted developers’ apps more prominently in the Play Store.

Sensitive and Protected Apps 

Google has also taken additional steps in the last year to secure user data in sensitive and protected categories: COVID-19 apps, children’s apps and political campaign apps.

For COVID-19 apps, Google introduced specific requirements: an app must be endorsed by a government entity or healthcare organization, must include prominent disclosure and consent requirements and a prominently displayed privacy policy, and must not access personal or sensitive data beyond what supports the public health emergency response.

In our investigation into more than 100 COVID-19 related apps, IDAC found that although many developers were well-intentioned and attempted to incorporate privacy-by-design principles, the hurried pace of responding to the pandemic left room for additional measures to protect user privacy.

Google should continue to enhance its processes around COVID-19 apps by enforcing its rules on permissions and requests for additional data. It should also ensure that the SDKs these apps embed are narrowly tailored and do not transmit inappropriate data, and it should require developers to encrypt communications in these apps.

When it comes to children’s apps, Google has also taken additional steps, including launching a dedicated tab for children’s apps and working with academic experts and teachers to evaluate trusted apps.

In 2020, IDAC investigated almost 500 global ed tech apps across 22 countries and found that while most ed tech apps we tested act in ways that align with users’ privacy expectations, there are gaps that developers and platforms should review and remedy in order to promote user trust and encourage widespread adoption.

In another investigation, we found that three children’s apps (Princess Salon, Number Coloring and Cats & Cosplay) were violating Google’s policies by accessing the Android ID and the Android Advertising ID (AAID) and by using suspicious SDKs. Google took immediate action and removed the three apps from the Google Play store.
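For readers unfamiliar with these identifiers, the sketch below shows roughly what the flagged behavior looks like in code. It is our own simplified illustration, not code taken from the three apps: the Android ID is a persistent identifier that survives advertising ID resets, while the AAID is the resettable advertising ID exposed through Google Play services.

```kotlin
import android.content.Context
import android.provider.Settings
import com.google.android.gms.ads.identifier.AdvertisingIdClient

// Simplified illustration of the two identifier reads flagged in the
// investigation; not code taken from the apps themselves.
fun readIdentifiers(context: Context): Pair<String?, String?> {
    // Persistent identifier, scoped to the device and app signing key,
    // resettable only by a factory reset.
    val androidId = Settings.Secure.getString(
        context.contentResolver,
        Settings.Secure.ANDROID_ID
    )
    // Resettable Android Advertising ID (AAID); must be fetched off the
    // main thread and requires the Play services ads-identifier library.
    val aaid = AdvertisingIdClient.getAdvertisingIdInfo(context).id
    return androidId to aaid
}
```

Reading these identifiers in a child-directed app is exactly the kind of behavior that app review and SDK vetting need to catch.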

While we applaud Google’s immediate action to address the concerning behavior in these three apps, it is clear to us that this is part of a broader problem. One 2018 analysis of more than 5,800 of the most popular free children’s apps found that the majority might be violating the Children’s Online Privacy Protection Act (COPPA).

Google must allow third-party auditing of apps to ensure there is no inappropriate behavior and should continue to expand its use of SDK vetting. 

Around the 2020 elections, Google created dedicated election teams, including support for app review, and took steps to encourage more explicit privacy disclosures and tailored permissions and to discourage the use of geolocation data by campaign apps.

In 2020, IDAC conducted an investigation into U.S. political campaign apps. These apps deserve particular scrutiny given the potential for abuse and the implications for our nation’s elections. One area of major concern, where Google must continue to focus its enforcement efforts, is ID bridging, a practice that allows apps to identify and track users across devices even when permissions do not allow for it.
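Conceptually, one common form of ID bridging pairs the resettable advertising ID with a persistent identifier, so that resetting the ad ID never actually breaks the link. The sketch below is our own simplified illustration of that logic on the receiving end, not code from any particular campaign app or SDK.

```kotlin
// Simplified illustration of ID bridging on the receiving end: profiles are
// keyed by a persistent identifier (for example the Android ID or a device
// fingerprint), so resetting the advertising ID does not break tracking.
data class Profile(val persistentId: String, val adIds: MutableSet<String>)

class BridgingStore {
    private val profilesByPersistentId = mutableMapOf<String, Profile>()

    fun record(persistentId: String, advertisingId: String): Profile {
        val profile = profilesByPersistentId.getOrPut(persistentId) {
            Profile(persistentId, mutableSetOf())
        }
        // Even after the user resets their advertising ID, the new value is
        // simply added to the same profile, linked by the persistent ID.
        profile.adIds.add(advertisingId)
        return profile
    }
}
```

The reset control only works if the resettable ID is the sole link to a user’s profile; bridging re-creates that link through identifiers the user cannot reset, which is why continued enforcement matters.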

Broader Platform Changes

While Google has taken important steps toward greater first-party policing, giving users somewhat more control over how their data is used, it must make additional efforts to ensure data is not misappropriated.

Google should not only perform spot audits itself but should work with third-party groups like IDAC to audit and spot-check apps, particularly those that raise sensitive privacy concerns, such as health, education and elections apps. Google must also increase its efforts to promote developer education and the incorporation of privacy-by-design principles, and it should encourage the development of self-regulatory processes, including enforceable codes of conduct.

Ultimately, though, efforts to increase accountability and trust must be paired with government intervention. Federal privacy legislation that directly addresses the challenges posed by an increasingly app-centric digital ecosystem is long overdue.

As we continue to see expanded use of mobile apps, platforms must be aware of practices that put sensitive personal data at risk. Google has the chance to take additional steps to increase the trustworthiness of the apps on its platform.