Companies are hacking facial recognition to make it safer

Written by South China Morning Post

Facial recognition testing companies like Qingfei are producing sophisticated 3D-printed faces and even robots to reveal the technology’s vulnerabilities.

Facial recognition is seemingly used for everything these days: bank accounts, passwords, boarding flights, entering offices, and the ubiquitous AI-powered cameras in public spaces. But what happens when these systems fail us?

Last December, a company called Kneron went around the world trying to break into facial recognition systems used to authenticate people at airports, railway stations, and even on payment apps. As it turned out, many of them could be fooled, including self-check-in terminals at the Netherlands’ largest airport, where the team tricked a sensor just by using a photo displayed on a smartphone.

Companies like Kneron aren’t here to steal your identity, though. This is all part of a process that keeps your face safe from hackers. It’s called presentation attack testing, and it helps keep strangers from checking in to your flights.

“Basically, the more elaborate the attack and the more realistic [the image], the more dangerous the attack is,” said Sébastien Marcel, head of biometrics security and privacy at Switzerland’s Idiap Research Institute.

This testing has become ever more important as facial recognition shows up in everything from police surveillance to toilet paper dispensers. It allows testers to determine whether a system can distinguish between you and an imposter. To carry it out, experts “attack” the software with a fake image that could be as simple as a piece of paper with a face printed on it or as elaborate as a 3D-printed model made of plastic or silicone.
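To get a feel for how such a test is scored, here is a minimal, illustrative sketch in Python. It is not Qingfei’s or Idiap’s actual tooling; the scores are made up and `liveness_score` is a hypothetical stand-in for the system under test. The metric names follow the standard presentation attack detection convention: APCER counts spoofs wrongly accepted, BPCER counts genuine users wrongly rejected.

```python
# Illustrative sketch of how a presentation attack test is typically scored:
# feed the system genuine faces and spoof artefacts, then measure how often
# each is misclassified at a chosen decision threshold.

def apcer(attack_scores, threshold):
    """Attack Presentation Classification Error Rate:
    share of spoof attempts wrongly accepted as genuine."""
    return sum(s >= threshold for s in attack_scores) / len(attack_scores)

def bpcer(bona_fide_scores, threshold):
    """Bona fide Presentation Classification Error Rate:
    share of genuine users wrongly rejected."""
    return sum(s < threshold for s in bona_fide_scores) / len(bona_fide_scores)

# Made-up scores: higher means the system thinks the face is more genuine.
bona_fide = [0.91, 0.87, 0.95, 0.78, 0.88]   # real faces presented to the sensor
attacks   = [0.12, 0.40, 0.65, 0.09, 0.72]   # printed photos, masks, 3D models
threshold = 0.6

print(f"APCER: {apcer(attacks, threshold):.2f}")     # spoofs that got through
print(f"BPCER: {bpcer(bona_fide, threshold):.2f}")   # real users locked out
```

The trade-off between the two numbers is the crux of the exercise: a stricter threshold keeps more spoofs out but also rejects more legitimate users, which is why testers probe with increasingly realistic fakes.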


If the facial recognition algorithms are doing what they’re supposed to, the facial models get even more elaborate. Beijing-based Qingfei Technologies, for instance, can recreate your face in such detail that it’s enough to fool Apple’s Face ID.

So does this mean you’re in danger of having your iPhone hacked? It’s actually more complicated than that.

According to Qingfei’s R&D director Tim Li, Face ID meets the demands of everyday use cases. But there’s no such thing as unhackable facial recognition software.

“It takes a lot of time, manpower, technology and money,” Li explained. “Reliable and unreliable are actually relative. The reliability of a technology is actually based on the difficulty of cracking it.”

Lifelike models

To crack the most sophisticated algorithms, the facial models need to be almost eerily lifelike. Some of Qingfei’s models are so similar to the real thing that you can actually zoom in on individual hairs, blemishes, pores, and all the small imperfections that make our faces human.

The process also requires different technologies. Special equipment is needed to take incredibly detailed photographs to be replicated during 3D printing, explained Wei Yufei, Qingfei’s chief marketing officer. Then there’s the chemical engineering required to create the right materials for these models.

But it’s also kind of an art: The more elaborate models have to be hand sculpted, and the entire process can take up to 40 working days.

Needless to say, this entire process would be very difficult for someone who just wants to crack your iPhone.

Fingerprint recognition

Companies like Qingfei provide laboratory testing services for a growing number of companies that are using biometric authentication. The process is straightforward: If the model can easily trick the software, then the algorithms need more work.

Qingfei also tests vein recognition and fingerprint recognition software, with the latter sometimes requiring the creation of special materials that mimic certain characteristics of human skin, such as electrical conductance.

The company’s clients include mobile payment companies, public security systems, and China’s many smartphone makers, according to Wei. But not all companies seek outside advice.

Many companies that currently sell biometric software still do their own testing rather than seeking out independent certification. The problem is that no one is verifying what they claim.


“Testing face recognition or biometrics in general, anyone can do it, but it can be done very badly,” Marcel said.

And not all facial recognition systems are made alike, as has become clear from various incidents over the years.

Amazon’s Rekognition, software the company sells to police in the US, notoriously matched 28 members of the US Congress to mug shots in a 2018 test conducted by the ACLU. More recently, the National Institute of Standards and Technology (NIST) tested algorithms from 99 developers, including Intel, Microsoft, Toshiba, and Chinese firms Tencent and DiDi Chuxing. The results suggested the systems were much less accurate at identifying Asians and African-Americans than Caucasians.

Even without the now well-known weaknesses in detecting darker skin tones, facial recognition systems have been behind plenty of blunders, some more comical than others. In China, an incident involving facial recognition software for clocking in to work went viral after the machine kept matching one employee’s face to his colleagues, both male and female. People joked that the man must simply have an exceptionally common-looking face.

Better detection

More companies have started seeking certification under mounting pressure from clients, but this also means the technology has to keep up. Facial recognition is still evolving, and the tests need to evolve with it.

Qingfei’s 3D-printed models may look lifelike, but it’s clear that they’re not alive. That’s about to change. Facial recognition algorithms are starting to get better at detecting whether the image in front of the sensor is fake. One way to do this is motion detection, with algorithms that determine whether someone’s eyes are blinking or their mouth is moving.
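As a rough illustration of the blink-detection idea, the sketch below uses the common eye aspect ratio trick: the ratio of the vertical to the horizontal spread of eye landmarks dips sharply when the eyelids close, something a printed photo or a static 3D model never does. The landmark points and thresholds here are assumptions for illustration only, and landmark extraction itself (for example with dlib or MediaPipe) is left out of the sketch.

```python
# Rough sketch of blink-based liveness checking. `eye` is assumed to be six
# (x, y) landmark points ordered around one eye, supplied per video frame by
# an upstream face landmark detector (not shown here).

from math import dist

def eye_aspect_ratio(eye):
    """Ratio of eyelid opening to eye width; it collapses when the eye closes."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def blink_detected(ear_per_frame, closed_threshold=0.2, min_closed_frames=2):
    """Flag a blink if the eye aspect ratio stays low for a few consecutive frames."""
    run = 0
    for ear in ear_per_frame:
        run = run + 1 if ear < closed_threshold else 0
        if run >= min_closed_frames:
            return True
    return False

# A static spoof yields a flat ratio over time; a live face shows a brief dip.
print(blink_detected([0.31, 0.30, 0.32, 0.30]))        # False -> suspicious
print(blink_detected([0.31, 0.18, 0.15, 0.17, 0.30]))  # True  -> likely live
```

A check like this is exactly why static masks are no longer enough for testers, and why the next generation of test artefacts has to move.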

So the next move for testing companies is creating robots that add movement to 3D-printed faces. Qingfei said it also plans to work on skeletons that mimic the bone structure of a face, as well as models that emit heat for cameras that can detect temperature.

This article was originally published in the South China Morning Post. Featured photo credit: Chris Chang via South China Morning Post.
