By TIM STARKS
07/06/2020
The push to use smartphone apps to track the spread of coronavirus is creating a potential jackpot for hackers worldwide — and the U.S. offers a fat, loosely defended target.
In the Qatar Covid-19 app, researchers found a vulnerability that would’ve let hackers obtain more than a million people’s national ID numbers and health status. In India’s app, a researcher discovered a security gap that allowed him to determine who was sick in individual homes. And researchers uncovered seven security flaws in a pilot app in the U.K.
The U.S. is just starting to use these contact tracing apps — which track who an infected person may have had contact with — but at least one app has already experienced a data leak. North Dakota conceded in May that its smartphone app, Care19, had been sending users’ location data to the digital marketing service Foursquare. The issue has since been fixed, according to the privacy app developer that discovered the leak.
To date, the public debate about whether to use contact tracing apps — a potentially crucial strategy for reopening economies during the pandemic — has centered mostly on what data to collect and who should have access to it, but cybersecurity insiders say the apps are also highly vulnerable to attacks that could expose data ranging from user names to location data.
And the U.S. has its own unique vulnerabilities: a fragmented collection of apps, tiny state cybersecurity budgets and stalled legislation in Congress that makes federal government rules unlikely anytime soon.
As the world has adapted to the coronavirus pandemic, other areas of life have already become magnets for massive hacking efforts: Hackers have exploited security gaps in videoconferencing software to “Zoom Bomb” meetings and tracked logins into companies’ secure networks to target employees working from home. Cyberattacks on the World Health Organization have more than doubled during the outbreak. And China has allegedly tried to steal U.S. vaccine research.
The drive to deploy many of the contact tracing apps quickly enough to flatten the curve of the outbreak means developers haven’t always made stress-testing security their highest priority, while spotty government oversight means officials aren’t necessarily looking out for problems once apps are deployed.
“There’s no denying that contact tracing is integral to tracking and, ultimately stopping, the spread of Covid-19,” said Kelvin Coleman, executive director of the National Cyber Security Alliance, a public-private partnership that works with the Department of Homeland Security. “While the apps are designed to help scale human efforts to do so, they’re also a double-edged sword when seen through a lens of individual privacy and security.”
It’s a trade-off that Americans may start to feel soon, as more states lift lockdown restrictions and deploy these apps as part of their efforts to identify disease hot spots and limit the spread of the virus.
Cybersecurity experts and researchers say data from contact tracing apps could be a particularly attractive target for a range of groups. Agenda-driven “hacktivists” could try to take down the apps in a bid for notoriety. Cybercrime gangs could extract lucrative identity information. And nation states could use it as a covert surveillance tool.
“You can think of all sorts of potential abuses for this information: finding out who visited a psychiatrist regularly, who sat near the pro-democracy activists at university, who wasn’t alone when they said they were,” said Vanessa Teague, a security researcher in Australia who found the vulnerabilities in the U.K.’s pilot app.
Cyber espionage is also a worry.
“If I want to target you, if you are someone with a high position, it’s interesting to know your contacts,” said Baptiste Robert, a security researcher who discovered the Indian app’s flaws. “If you’re a lawyer, who are your clients? Or if you’re a politician. This kind of information is very valuable for an attacker.”
In one of the most extreme scenarios experts are tossing around in the U.S., foreign government hackers could attempt to disrupt the 2020 election by manipulating the apps to display a surge in cases in one area in order to deter voters from visiting a polling place.
So far, researchers have found that many of the apps lack basic security.
In a recent study of 17 government-sponsored apps, mobile app security firm Guardsquare found that less than a third had a kind of encryption that protects sensitive information in the source code, and less than half had the ability to detect when attackers access restricted data on the phone.
Among discovered app vulnerabilities, the Qatar app flaw had the most dire potential consequences: It would have allowed hackers to obtain sensitive information on more than one million users, including their names, national IDs, health status and location data.
In some cases, the rush may have been the problem. Local officials say that the fact that the North Dakota app was produced in a hurry led to its woes. After an app in the Netherlands exposed about 200 people’s names, email addresses and encrypted passwords, one of its co-developers said the breach was due to a rush to publicly release the app’s code.
“The speed and scale transitioned at such a rate that they didn’t really consider security in the beginning,” Coleman said of apps rushed to market. “It was a topic for them, but not top of mind.”
France is one of a handful of countries known to have done extensive hacking tests on its app. The country held two “bug bounty” competitions in which security researchers tried to break into the app — one before its release and one after — with up to 2,000 euros (about $2,250) in prize money.
Others have done little to test vulnerabilities before releasing the apps. For instance, researchers quickly found vulnerabilities in Australia’s app, which didn’t initiate a bug bounty program in advance.
In the U.S., security oversight is particularly weak. Congress has so far been unable to agree on security protocols and unlike many countries, the U.S. has no national app. Instead, states and cities are developing their own apps — each of which could have different security problems.
Major bug bounty operators Bugcrowd, HackerOne and Synack told POLITICO they weren’t running any programs for U.S. states to identify security vulnerabilities in their apps. And Coleman said that the U.S. patchwork of state-by-state apps means that no one’s focusing all their assets on ensuring the security of one offering.
Most states also have minuscule cybersecurity budgets. According to an organization that represents state chief information officers, most states allocate between zero and 3 percent of their IT budgets to cyber, compared to 10 percent in the private sector. Most also report flat cyber budgets or increases of less than 5 percent since 2014.
Tim Brookins, the Microsoft engineer who developed North Dakota’s app via his independent company ProudCrowd, said his security emphasis for that app was protecting the data server, which was not the source of the leak.
“It’s like Fort Knox,” Brookins said of the central server, which uses security offered by Microsoft’s Azure cloud service. He’s the only one with regular login privileges, he said. While there was no bug bounty program for the app, Brookins said he had informal consultations about protections for the app with a security expert he connected with via a New York Times journalist.
The one national coronavirus app that the U.S. government has put out — a symptom tracker from the Centers for Disease Control and Prevention — sent unencrypted transmissions, according to a recent investigation by the International Digital Accountability Council. “Although we could not determine the content of the transmissions, metadata about the user’s activity can be correlated with the device metadata that we were able to observe (e.g., mobile carrier, operating system, device resolution, etc.),” the council said.
A CDC spokesperson said the unencrypted transmissions aren’t an issue to be fixed because it publicly disclosed that it communicated with third-party apps.
Some software initiatives have made efforts to think ahead about data security — especially one from Apple and Google, which teamed up to offer a software framework for app developers that would work seamlessly with their phones. The companies set rules about how those apps can operate, for example, prohibiting the apps from collecting information from users in a centralized database. As a result, a number of European countries, including the U.K. and Germany, abandoned plans to collect and store user information on government servers, instead leaving the information on users’ phones. With a decentralized system like this, there’s no giant trove of data to hack. Researchers say big databases are a richer target because it only takes one break-in to get away with a large haul.
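The decentralized idea can be illustrated with a highly simplified sketch. This is not the actual Apple-Google Exposure Notification cryptography — the real framework derives rotating Bluetooth identifiers from daily keys — but it shows why, when matching happens on each phone, no central database of contacts ever exists for hackers to steal:

```python
import secrets

class Phone:
    """A device in a decentralized exposure-notification scheme (simplified)."""

    def __init__(self):
        # Random daily keys generated locally; they never leave the device
        # unless the user tests positive and chooses to share them.
        self.my_keys = [secrets.token_bytes(16) for _ in range(14)]
        # Identifiers overheard from nearby phones via Bluetooth.
        self.heard_keys = set()

    def observe(self, other_key: bytes) -> None:
        self.heard_keys.add(other_key)

    def keys_if_positive(self) -> list:
        # Only on a positive diagnosis are these keys published; the server
        # sees keys of diagnosed users, never who met whom.
        return self.my_keys

    def check_exposure(self, published_keys) -> bool:
        # Matching runs on the device itself, against the public key list.
        return any(k in self.heard_keys for k in published_keys)

alice, bob, carol = Phone(), Phone(), Phone()
bob.observe(alice.my_keys[0])          # Alice and Bob were near each other
published = alice.keys_if_positive()   # Alice tests positive and uploads keys
print(bob.check_exposure(published))   # True  -- Bob learns he was exposed
print(carol.check_exposure(published)) # False -- Carol met no one
```

A centralized design would instead upload every `observe` event to a government server, creating exactly the single large trove the researchers warn about.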
U.S. states began to launch apps in May and are continuing to roll them out. At least nine states have released or begun to develop contact tracing apps, and more than a dozen are actively considering them, according to a recent POLITICO review.
While North Dakota’s app, which launched before Apple and Google released their software, uses a centralized database, many of the newer apps are going with the decentralized approach in order to use that software.
But the Google-Apple software has its own security issues. It relies on Bluetooth, a frequent target of hackers, who can exploit its vulnerabilities to snoop on conversations or take over a device.
Regardless of which types of databases or software frameworks U.S. apps use, they are prey to another security problem: division in Congress that has left them with no federal oversight. That’s a difference from Europe, where the European Commission has developed data protection guidance for coronavirus apps, such as a prerequisite that the data be stored on individuals’ devices and that it should be encrypted.
U.S. lawmakers have proposed a variety of data security provisions for Covid-19 apps in three bills that have so far been stymied by long-standing partisan divisions over the broader question of how to protect data privacy, such as differing views on whether to allow states to enact their own laws.
And while the disagreements over data security are not as deep, they are still substantial.
One bipartisan bill, sponsored by Sens. Maria Cantwell (D-Wash.) and Bill Cassidy (R-La.), would institute specific requirements for data breach notifications and would outlaw interference, such as transmitting signals that trigger inaccurate notifications. It would also require app makers to include a risk and vulnerability assessment for their product, along with steps for lessening those risks and vulnerabilities and ways to notify users of data breaches.
“For an exposure notification system to be useful, enough people have to choose to adopt it — and people aren’t going to do that if their data isn’t secure. I wouldn’t,” Cantwell told POLITICO.
Legislation introduced by Senate Republicans, in contrast, only requires app makers to release a “general description” of their data security practices.
Bicameral Democratic legislation contains similar language to the Republican bill, but also establishes that app makers may not collect, use or disclose emergency health data unless a user consents, or it’s necessary for detecting, preventing or responding to information security incidents and threats. It further says that app makers must “implement reasonable data security policies, practices, and procedures.”
Aides say talks to unify and advance the app legislation are only at the beginning stages.