By Holden Williams
In late March, the Financial Times reported that mobile apps using software development kits (SDKs) created and operated by the Russian tech giant Yandex risked sharing user data with the Russian government. Over 52,000 apps, including gaming, messaging and virtual private network (VPN) apps with hundreds of millions of installations, use SDKs that potentially transmit information to the Russian government.
Unfortunately, this is not the only example of mobile apps collecting data that is exposed to foreign adversaries. In our report, “Digital Health is Public Health: Consumers’ Privacy & Security in the Mobile Health App Ecosystem,” IDAC discovered a handful of femtech and mental health apps that transmitted user information, including identifiers, to Yandex servers located in Russia — a country with weak data protection laws and an active history of human rights abuses.
One of the most concerning aspects of this discovery is that apps using third-party SDKs are not entirely in control of what those third parties do with user data once it is transmitted. Each SDK has its own terms of service and privacy policy, separate from those of the app that uses it. In addition, SDKs can engage in harmful behavior independently of the app. For example, some of these Yandex transmissions shared the user's Android identifier (SSAID), a persistent identifier, simultaneously with the resettable Android advertising identifier (AAID). This practice is what we call "shadow profiling," which enables users to be identified even when they make efforts to reduce tracking.
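To see why pairing a persistent identifier with a resettable one is so damaging, consider a minimal Python sketch of hypothetical tracker-side logic (not any SDK's actual code). When both the SSAID and the AAID arrive in the same transmission, the tracker can key its profiles on the persistent ID, so resetting the advertising ID accomplishes nothing:

```python
# Hypothetical illustration of "shadow profiling": a server that receives
# both a persistent device ID (like Android's SSAID) and a resettable
# advertising ID (AAID) can re-link a profile after the AAID is reset.

profiles = {}    # persistent_id -> profile data
aaid_index = {}  # resettable_id -> persistent_id

def record_event(ssaid: str, aaid: str, event: str) -> None:
    """Store an event keyed by the persistent ID; index the resettable ID."""
    profile = profiles.setdefault(ssaid, {"events": [], "aaids_seen": set()})
    profile["events"].append(event)
    profile["aaids_seen"].add(aaid)
    aaid_index[aaid] = ssaid  # the profile survives any AAID reset

# A user browses, then resets their advertising ID hoping to start fresh.
record_event("ssaid-123", "aaid-old", "viewed_health_article")
record_event("ssaid-123", "aaid-new", "opened_app")  # after the reset

# Both AAIDs resolve to the same profile: the reset accomplished nothing.
assert aaid_index["aaid-old"] == aaid_index["aaid-new"]
assert len(profiles["ssaid-123"]["events"]) == 2
```

This is exactly why Google Play's policy (discussed below) prohibits linking persistent identifiers to resettable ones: once the two are ever seen together, the privacy benefit of the reset button is gone.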
This behavior is particularly concerning because Russian law allows the government to compel access to information stored on servers in Russia.
U.S. policymakers have taken steps to curb the sharing of consumer information with foreign adversaries. In June 2021, the Biden Administration issued an Executive Order directing federal departments, led by the Department of Commerce, to produce a report outlining how the U.S. may better protect residents' data from adversaries.
Specifically, the Order targeted software applications “designed, developed, manufactured, or supplied by persons owned or controlled by, or subject to the jurisdiction or direction of, a foreign adversary.” The Order also urged the U.S. to hold accountable those that utilize software to commit human rights abuses.
Additionally, U.S. Senator Ron Wyden has called on major platforms like Google and Apple to better regulate apps that risk sending consumer information to adversaries such as Russia.
Google and Apple both have requirements, or are implementing requirements, that mandate that apps be transparent with users about data flows to third parties. But transparency in a privacy policy may not provide enough protection when SDKs operate in the background and do not require explicit user consent to collect data.
For example, we continue to see this concerning "shadow profiling" behavior even after Google Play updated its User Data policy to prohibit linking persistent device identifiers with other personal and sensitive data or with resettable device identifiers. As part of our health investigation, we tested mobile health apps both before and after Google Play's policy change went into effect. Before the policy took effect, we observed 28 apps sending multiple identifiers to third parties. After it took effect, all 28 apps were still simultaneously transmitting persistent and resettable identifiers.
All of this amounts to distressing behavior. When apps share data and personal information with third parties based in nations with poor data protection laws and a track record of human rights abuses, users are put at risk and trust in the digital ecosystem erodes. Russia's recent invasion of Ukraine, for instance, underscores that its record of human rights abuses is not a thing of the past.
The U.S. must take more concrete steps, such as passing robust federal privacy legislation, to guard against these kinds of privacy issues. If not, we risk having our data fall into dangerous hands.