Coronavirus: Security flaws found in NHS contact-tracing app
Numerous security flaws have been reported in the Covid-19 contact-tracing app being trialled on the Isle of Wight.
The security researchers involved warn that the problems pose risks to users' privacy and could be abused to prevent contagion alerts from being sent.
GCHQ’s National Cyber Security Centre (NCSC) told the BBC that it was already aware of most of the issues raised and was in the process of addressing them.
But the researchers suggest that a more fundamental rethink is needed.
In particular, they are calling for new legal protections to prevent officials from using the data for purposes other than identifying those at risk of infection, or from retaining it indefinitely.
Furthermore, they suggest the health service consider moving from its current “centralized” model – where contact matching takes place on a computer server – to a “decentralized” version – where the matching happens on people’s phones.
“There may still be security bugs and vulnerabilities in decentralized or centralized models,” said Dr Vanessa Teague, chief executive of Thinking Cybersecurity.
“But the big difference is that a decentralized solution wouldn’t have a central server with the recent face-to-face contacts of every infected person.
“So there is a much lower risk of loss or abuse of that database.”
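The difference Dr Teague describes can be sketched in a few lines. In a decentralized design, each handset keeps the random IDs it has heard over Bluetooth, downloads the IDs later published by users who test positive, and performs the match itself. The sketch below is a simplified illustration of that idea, not the NHS app’s actual code; the function name and the IDs are hypothetical.

```python
def local_exposure_check(heard_ids, infected_ids):
    """Decentralized matching: runs entirely on the handset.

    heard_ids: random IDs this phone has observed nearby over Bluetooth.
    infected_ids: IDs published by users who reported a positive test,
    downloaded by every phone. Because the intersection is computed
    locally, no central server ever holds the full contact graph.
    """
    return bool(set(heard_ids) & set(infected_ids))

# Hypothetical example: this phone heard three IDs, one of which
# belongs to a user who later reported a positive test.
heard = {"a3f1", "9bd2", "c044"}
infected = {"c044", "77e0"}
print(local_exposure_check(heard, infected))  # True -> show an alert
```

In the centralized model, by contrast, the equivalent of `heard_ids` for every infected user is uploaded, so the server can reconstruct who met whom.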
Health Secretary Matt Hancock said on Monday that a new law “is not needed because the Data Protection Act will do the job.”
And NHSX – the health service’s digital innovation unit – said that using the centralized model would both make it easier to improve the app over time and allow alerts to be triggered on the basis of people’s self-diagnosed symptoms rather than medical test results.
The researchers describe seven different problems they encountered with the app, including:
- weaknesses in the registration process that could allow attackers to steal encryption keys, letting them prevent users from being notified when a contact tests positive for Covid-19 and/or generate spoofed transmissions to create fake contact-event logs
- the storage of unencrypted data on handsets, which could potentially be used by law enforcement officers to determine when two or more people had met
- the generation of a new random ID for each user once a day, rather than once every 15 minutes as in the rival model developed by Google and Apple. The longer gap theoretically makes it possible to work out whether a user spent time with a work colleague or met up with someone after work, it is suggested
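The last point, about how often the broadcast ID rotates, can be illustrated with a toy derivation scheme (purely hypothetical; neither app derives its IDs this way): an observer who logs broadcasts at two different times of day can link them to the same person only if both sightings fall within the same rotation epoch.

```python
import hashlib

def broadcast_id(device_secret: bytes, epoch: int) -> str:
    """Derive the random-looking ID broadcast during one epoch.

    With 15-minute epochs (as in the Google/Apple model), a device seen
    at 09:00 and again at 18:00 broadcasts two unrelated IDs. With daily
    epochs, both sightings share one ID and can be linked together.
    """
    return hashlib.sha256(device_secret + epoch.to_bytes(4, "big")).hexdigest()[:8]

secret = b"hypothetical-device-secret"
minutes_per_day = 24 * 60
morning, evening = 9 * 60, 18 * 60  # minutes since midnight

# Daily rotation: both sightings fall in the same epoch -> linkable.
daily = (broadcast_id(secret, morning // minutes_per_day),
         broadcast_id(secret, evening // minutes_per_day))
print(daily[0] == daily[1])  # True

# 15-minute rotation: different epochs -> the IDs look unrelated.
quarter = (broadcast_id(secret, morning // 15),
           broadcast_id(secret, evening // 15))
print(quarter[0] == quarter[1])  # False
```

This is why the researchers argue the longer daily window leaks more about a user’s movements than the 15-minute window does.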
“The overall risks are varied,” said Dr Chris Culnane, the report’s co-author.
“In terms of logging problems, it’s a fairly low risk because it would require an attack on a well-protected server, which we don’t think is particularly likely.
“But the risk on unencrypted data is greater, because if someone were to access your phone, they might be able to learn some additional information because of what is stored on it.”
NCSC technical director Ian Levy wrote a blog post thanking the two researchers for their work and promising to address the problems they had identified.
But he said several versions of the app may be needed before all the problems are resolved.
“Everything that is reported to the team will be properly assessed (even if it takes longer than normal),” he wrote.
An NCSC spokesman said: “It was always hoped that measures such as releasing the code and explaining the decisions behind the app would generate significant discussions with the security and privacy community.
“We look forward to continuing to work with security and cryptography researchers to make the app the best it can be.”
But Dr Culnane said politicians also needed to revisit the issue.
“I am confident that they will solve the technical problems,” he said.
“But there are wider problems around the lack of legislation protecting the use of this data, [including the fact that] there is no hard limit on when the data must be erased.
“This is in contrast to Australia, which has very strict limits requiring its app’s data to be deleted at the end of the crisis.”
Meanwhile, Harriet Harman, who chairs Parliament’s human rights committee, said she was seeking permission to present a private member’s bill that would limit who could use the data collected by the app and how, and create a watchdog to handle related complaints from the public.
“Personally, I would download the app myself, even though I’m concerned about what the data would be used for,” the Labour MP told BBC News.
“But my committee’s view was that this app shouldn’t go ahead unless [the government] is willing to put privacy protections in place.”