Outlining how Singapore's TraceTogether app works, Teague, the chief executive of Thinking Cybersecurity, said in a post on GitHub that when one phone was near another, the app sent random-looking beacons over Bluetooth. These TempIDs were generated and encrypted by a central server, a model she said had several problems.
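The scheme Teague describes can be sketched roughly as follows. In Singapore's BlueTrace design, a TempID packs the user's ID and a validity window, encrypted under a symmetric key held only by the health authority's server. The toy stream cipher below (a SHA-256 keystream standing in for the real AES-256-GCM, and with illustrative field sizes) is purely a sketch, not the actual implementation:

```python
import os
import time
import base64
import hashlib

SERVER_KEY = os.urandom(32)  # symmetric key held only by the central server


def _keystream_xor(key: bytes, iv: bytes, data: bytes) -> bytes:
    # Toy stream cipher standing in for AES-256-GCM: XOR the data with a
    # SHA-256-derived keystream. Illustrative only, not secure.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + iv + counter.to_bytes(4, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))


def issue_temp_id(user_id: bytes, lifetime_s: int = 15 * 60) -> str:
    """Server side: wrap (user_id || start || expiry) into an opaque TempID."""
    start = int(time.time())
    plaintext = (user_id
                 + start.to_bytes(4, "big")
                 + (start + lifetime_s).to_bytes(4, "big"))
    iv = os.urandom(12)
    return base64.b64encode(iv + _keystream_xor(SERVER_KEY, iv, plaintext)).decode()


def resolve_temp_id(temp_id: str) -> bytes:
    """Server side again: only the key holder can map a beacon back to a user."""
    raw = base64.b64decode(temp_id)
    iv, ciphertext = raw[:12], raw[12:]
    return _keystream_xor(SERVER_KEY, iv, ciphertext)[:-8]  # strip timestamps


# The phone simply broadcasts the opaque string; it cannot inspect it.
beacon = issue_temp_id(b"user-0021")
```

The properties Teague criticises are visible even in this sketch: the phone has no way to check that a TempID it downloaded really encodes its own ID, and anyone who obtains `SERVER_KEY` can decrypt every beacon ever logged.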
For one, Teague said, a user would be unable to tell whether the IDs being sent to their contacts were really their own.
"When you download your encrypted IDs, you are relying on them to be a truthful reflection of your ID. If a software bug, security problem, or network attack gives you someone else's encrypted IDs instead, you have no way to notice," she said. "If you send IDs that are not yours, then when someone near you tests positive, you will not be notified."
With the centralised model, the server was a single point of failure, she said.
A third issue was that the server could work out whether a user was running the app, by logging the issuing of daily batches of TempIDs. While it could not tell whether someone was using the app throughout the day, the authorities would know if it had not been turned on at all.
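The inference Teague describes could look something like this on the server side. This is a hypothetical sketch; the fetch log and function names are illustrative, not taken from TraceTogether's actual code:

```python
from datetime import date

# Hypothetical server-side record: user -> dates on which a daily TempID
# batch was downloaded. Issuing batches lets the server build this log
# as a side effect of normal operation.
fetch_log: dict[str, set[date]] = {}


def record_batch_fetch(user: str, day: date) -> None:
    # Called whenever the app comes online to collect its daily TempIDs.
    fetch_log.setdefault(user, set()).add(day)


def app_was_off(user: str, day: date) -> bool:
    # No fetch that day => the app never came online to collect TempIDs,
    # so the server can infer it was not running.
    return day not in fetch_log.get(user, set())
```

Because batches are issued daily rather than per contact event, the log only reveals whether the app fetched that day, not how long it ran, which matches the limit Teague notes.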
Another problem was that if a list of encrypted contacts leaked or was obtained through a compromise, the IDs could be decrypted by anyone who held the key, even if the user never tested positive for COVID-19.
A further risk was that if the server used weak or broken encryption, the encrypted IDs could easily be recovered by third parties.
"Crucially, even if we get the source code for the Australian app, we cannot test that the encryption is being computed properly, since it is not being computed by the app," Teague said.
"Singapore's server code is openly available (good), but an Australian server could decide to downgrade its encryption at any time, even after deployment, and could do so for some people but not others. This would make you easily tracked through shopping malls and other public (and private) spaces, even if you never test positive."
The Singapore app trusted Google's Firebase cloud service to hold all the information collected, and Firebase employees were able to access private information.
Teague said the alternative was for Australia to run the server as a government IT service, leading to two options:
- "We assume that Google (or some other corporate partner) won't abuse its detailed personal information about us, our contacts and our illnesses to make an extra buck, or,
- "We assume that the Commonwealth Government won't mess it up, crash the server, leak the information, or post it all on the web in not-really-de-identified form."
She said both assumptions were "squarely contradicted by past behaviour".
Teague pointed out that consenting to install the app was not the same as agreeing to run it all the time, nor the same as agreeing to have its running constantly monitored.
"In its TraceTogether form, I would be happy to run it on the train but refuse to run it in my home or office," she said. "I need to see the details of Australia's version before I decide. Informed consent requires telling us what we're consenting to. Open source code is a minimal requirement."
In conclusion, Teague said the government needed to publish source code and specifications for the app. "When they do, we can begin a genuine democratic discussion of whether we will tolerate a centralised app, insist on a decentralised one, or refuse to install either."