By Ginny Kozemczak
Last year, Apple took an important step forward and announced changes designed to boost user privacy. Starting with iOS 14.5, developers were required to ask users for permission before tracking them across apps and websites. Shortly after that announcement, developers began to find ways to circumvent these rules.
Recently, Ars Technica reported that two companies, ByteDance, which owns TikTok, and Tencent, one of the world's largest and most profitable tech companies, are testing tools in China to bypass Apple's new rules and track iPhone users without their consent. Apple has pledged to block any apps that attempt to use the Chinese Advertising ID.
App developers want to track users for a variety of reasons, including some beneficial purposes such as analytics and improved security. But just as often, developers are looking to serve mobile advertisements and drive revenue. Unfortunately, the ad tech industry incentivizes amassing as much user data as possible in order to show targeted advertisements. Tracking users is not a new problem, either: several past IDAC investigations have shown its harmful impacts, including cases where users of apps targeting children, and couples trying to conceive, were tracked without their permission.
Even under Apple’s new policy, apps can continue to collect persistent identifiers for purposes other than advertising. This data may be misused for targeted advertising by either first parties or other third-party data brokers down the data sharing stream.
Still, platforms like Apple have a responsibility to police their app ecosystem, prevent invasive tracking, and protect user data.
While its new policy gives users some increased control, Apple must take additional steps to ensure that data, such as permanent identifiers, is not misappropriated. Apps on Apple's platform should be subject to spot auditing by the company, and Apple should welcome third-party groups like IDAC to audit apps on the platform to increase app accountability and transparency. Apple must also encourage the development of self-regulatory processes, including enforceable codes of conduct. More extensive developer education is also necessary.
Ultimately, though, efforts to increase accountability and trust must be paired with government intervention. Federal privacy legislation that directly addresses the challenges posed by an increasingly app-centric digital ecosystem is long overdue.
Our mobile phones have become a part of our daily lives. Extensive user tracking erodes trust in the digital ecosystem and puts sensitive personal data at risk. Platforms must take steps to increase the trustworthiness of the apps they host, including by welcoming third-party investigations and audits.