October 6, 2024

Science & Technology

New facial recognition devices called into question

By: Emily Wang

Imagine that every day, the police send a message to your phone telling you to send them a selfie. And every day, you oblige. It’s not like you have a choice. If you ignored the text, the police would come to your house and fine you for disobeying the law.

In many parts of the world, laws have been enacted that require quarantined citizens to send selfies from inside their homes. The requirement is meant to make sure that citizens with COVID do not go out in public and spread the virus.

The app G2G, made by Genvis, tracks your location using the selfies you take on it, and more than 150,000 people have used it. The same type of technology is also used in many other countries, with Australia being the only democracy among them.

However, other places have been reluctant to accept this new type of technology. San Francisco declared a moratorium on the use of facial recognition in 2019. Soon after, other cities such as Oakland, California, and Somerville, Massachusetts, also outlawed the use of facial recognition technology.

It’s not only cities that are pushing back against facial recognition. Many companies are joining in. Amazon, IBM, Microsoft, and Google have stated that they will not sell facial recognition products until their use is regulated by federal law. Meta, Facebook’s parent company, deleted billions of faceprints and stopped allowing people to tag others in their photos while it weighed whether the benefits of facial recognition outweighed growing concerns that the technology invades people’s privacy.

“The pandemic created all these new justifications for using facial recognition technology,” says Mark Andrejevic, a professor of media studies at Monash University in Australia. “Everything went online and organizations were trying to make things work very quickly. But the implications haven’t been thought through. Do we want to live in a world where everything is rendered and there are no private spaces? It creates a whole new level of stress that does not lead to a healthy society.”

In 2019, a law authorizing the use of facial recognition was proposed in Australia, but it was shelved after a parliamentary committee review found that it lacked sufficient privacy protections. One of its most active critics was the then Australian Human Rights Commissioner, Edward Santow.

Santow is now working on a project to figure out how Australia can use facial recognition technology effectively while protecting its citizens’ privacy. Part of the project involves observing how other countries regulate their facial recognition systems. One of the most common approaches relies on a small set of limited privacy protections, but Santow argues that this method would not fully address the issue in Australia’s case.

Santow says that the goal in Australia is to form an approach that encourages positive applications while adding guardrails to prevent harm. The worst thing that could happen would be for the government to turn to a social credit system like the one used in China, where the government tracks people’s whereabouts to determine how trustworthy they are.

Santow wants a law that allows the use of facial recognition technology while upholding basic international human rights, such as privacy. For example, the law would require free and informed consent before facial recognition could be deployed. But if the technology caused discrimination, even consent would not be enough to justify its use. When the technology was first tested, it was tested mostly, or only, on white people, so applying it to people of other ethnicities and backgrounds creates problems.

The technology can mistake one person for another. If it is searching for a criminal, for example, it could identify someone else as the criminal and alert officials, creating an unsafe situation for everyone involved. The technology also relies heavily on mathematics, which means the underlying calculations have to be highly accurate for it to function properly.

Facial recognition works by mapping your face into a set of geometric shapes and recording the distances between features such as the nose, eyes, and mouth. That pattern of measurements is then compared against other faces. In fact, facial recognition apps can recognize your face through sunglasses or a facemask, and can do so from more than a kilometer away.
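
To make the idea above concrete, here is a minimal sketch of the matching step: reduce a face to a vector of distances between landmark points, then call two faces a match when their vectors are close. This is only an illustration; the landmark names, coordinates, and the 0.6 threshold are invented for this example, and modern systems typically rely on learned embeddings from neural networks rather than hand-measured distances.

```python
# Toy illustration of distance-based face matching (not any vendor's real code).
import math
from itertools import combinations

def faceprint(landmarks: dict[str, tuple[float, float]]) -> list[float]:
    """Turn named landmark coordinates into a vector of pairwise distances."""
    names = sorted(landmarks)
    return [math.dist(landmarks[a], landmarks[b]) for a, b in combinations(names, 2)]

def same_person(print_a: list[float], print_b: list[float], threshold: float = 0.6) -> bool:
    """Declare a match when the gap between the two faceprints is small."""
    return math.dist(print_a, print_b) < threshold

# Hypothetical landmark positions (in arbitrary image units) for two photos.
photo_1 = {"left_eye": (30.0, 40.0), "right_eye": (70.0, 40.0),
           "nose": (50.0, 60.0), "mouth": (50.0, 80.0)}
photo_2 = {"left_eye": (30.2, 40.1), "right_eye": (69.9, 40.0),
           "nose": (50.1, 60.3), "mouth": (49.8, 80.1)}

print(same_person(faceprint(photo_1), faceprint(photo_2)))  # True: nearly identical landmarks
```

The threshold is what makes mistaken identity possible: set it too loosely and two different people can fall within it, which is the false-match problem described above.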

In some cases, these facial recognition devices can be useful. They could be used in schools throughout the United States to monitor visitors coming through the doors, which many believe could reduce the risk of school shootings by recognizing the faces of expelled students. In fact, facial and object recognition technology has been tested at multiple schools. Australia has also used these cameras in many stadiums to prevent terrorists or banned people from entering.

Live facial recognition is also being used by some police forces. For example, London’s Metropolitan Police uses it to scan certain areas for people on a “watchlist” of wanted or potentially dangerous individuals.

“CCTV is often criticized for only allowing evidence after the fact, whereas facial recognition creates actionable information in real time in order to pre-empt crime,” says Andrejevic. “That’s a very different concept of security.”

As technology becomes more and more advanced, so do hackers’ attempts to breach it. Deepfakes, which manipulate images or video to make something appear entirely different, have become one of the most effective methods hackers use to fool the technology.

“It used to take several hours to create a deepfake using animation tools. Now it takes a couple of minutes,” says Francesco Cavalli, the co-founder of Sensity AI. “All you need is one photo to create a 3D deepfake. You don’t even need to be a developer or an engineer. You can do it yourself. There are tons of apps that allow you to replicate anyone’s face.”

Fortunately, Sensity AI has found a way to detect these fake faces. However, Cavalli predicts that hackers will eventually find a way to get past it.

Even with the many challenges that regulation has faced to date, Santow believes that Australia could become a world leader in the use of facial recognition.

“Australia could provide a good model for a number of reasons,” Santow says. “We have a strong institutional and corporate respect for human rights. It may not be perfect, but it is fundamental to who we are as a country. We’re also a sophisticated developer and adopter of technology.”

Santow says, “I perceive the greatest challenge not to be drafting a watertight law, but in making sure that the law itself isn’t ignored.”
