Friday, September 17, 2021

Two of Australia's largest states test facial recognition software to enforce pandemic rules

Two of Australia's most populous states are testing facial recognition software that lets police check whether people are at home during COVID-19 quarantine, expanding trials of the technology that have already proved controversial.


Genvis Pty Ltd, a little-known tech company, said on a website for its software that New South Wales (NSW) and Victoria were trialling its facial recognition products. Genvis said the trials were voluntary.

The Perth-based startup developed the software in 2020 with Western Australia (WA) state police to help enforce pandemic movement restrictions.

South Australia began testing a similar, non-Genvis technology last month, prompting concerns from privacy advocates around the world about possible surveillance overreach. Those concerns may be amplified by the involvement of Victoria and New South Wales, which had not previously disclosed plans to test facial recognition technology.

Gladys Berejiklian, the NSW Premier, said in an email that the state was "close" to piloting home quarantine options for returning Australians, but did not directly respond to questions about the Genvis facial recognition software. NSW Police referred questions to the premier.

Victoria Police referred questions to the Victorian health department, which did not respond.

The system being tested lets people respond to random check-in requests by taking a "selfie" at their registered home quarantine address. The software checks the image against a stored "facial signature" and records location data; if the image does not match, police may follow up by visiting the address to verify the person's whereabouts.
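Genvis has not published the details of how its matching works. Purely as an illustration of the kind of check described above, the sketch below compares a selfie's face embedding against an enrolled "facial signature" and verifies that the reported location falls within a small radius of the registered quarantine address. The embedding function, similarity threshold, and geofence radius are all assumptions for the example, not details of the actual product.

```python
import hashlib
import math
import numpy as np

CHECKIN_RADIUS_M = 50    # assumed geofence radius around the quarantine address
MATCH_THRESHOLD = 0.7    # assumed cosine-similarity threshold for a face match


def get_face_embedding(image_bytes: bytes) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding model.

    A real system would run the image through a trained network; here we
    derive a deterministic dummy vector from the image bytes so the sketch runs.
    """
    seed = int.from_bytes(hashlib.sha256(image_bytes).digest()[:4], "big")
    return np.random.default_rng(seed).normal(size=128)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, 1.0 meaning identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


def verify_checkin(selfie: bytes,
                   reported_loc: tuple[float, float],
                   enrolled_signature: np.ndarray,
                   home_loc: tuple[float, float]) -> bool:
    """Return True if the selfie matches the enrolled signature and the
    reported location is inside the geofence; otherwise the check-in fails
    and would be flagged for manual follow-up."""
    face_ok = cosine_similarity(get_face_embedding(selfie),
                                enrolled_signature) >= MATCH_THRESHOLD
    loc_ok = haversine_m(*reported_loc, *home_loc) <= CHECKIN_RADIUS_M
    return face_ok and loc_ok
```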

Although the technology has been in use in WA since November last year, it is being pitched nationally as a tool for reopening the country's borders. That would end a system, in place since the start of the pandemic, that requires international arrivals to spend at least two weeks in hotel quarantine under police guard.

Police have also expressed interest in using facial recognition software beyond the pandemic, prompting a backlash from rights groups over the potential targeting of minorities.

While the technology has been used in China, no other democracy has been reported to be considering its use for coronavirus containment.
"Keep communities safe"

Kirstin Butcher, Chief Executive of Genvis, declined to comment beyond what was disclosed on the product's website.

She said that home quarantine cannot be implemented without compliance checks if communities are to be kept safe.

"Physical compliance checks cannot be performed at the scale required to support (social-economic) reopening plans, so technology must be used."

Rights advocates cautioned that the technology could be inaccurate and, in the absence of specific laws, could open the door for law enforcement agencies to use people's data for other purposes.

Toby Walsh, a professor of artificial intelligence at the University of NSW, said: "I am troubled not only by the use here, but also by the fact that this is an instance of the creeping usage of this type of technology in our lives."

Walsh raised concerns about the reliability of facial recognition technology, and said that it could be hacked to provide false location reports.

He said, "Even though it works here... it validates that facial recognition can be a good thing." "Where does it end?"

The Western Australian government said it has barred police from using data collected for COVID-related purposes for anything else. According to WA police, 97,000 people have completed home quarantine under the facial recognition system without incident.

Edward Santow, a former Australian Human Rights Commissioner who now leads an artificial intelligence ethics project at the University of Technology Sydney, said the law should prevent a quarantine monitoring system from being used for any other purpose.

Although facial recognition technology may seem like a convenient way of monitoring people in quarantine... the risk of harm with this technology is high, he said.
