Deepfakes are not a threat to facial payments, for now
Sep 11, 2019
A new Chinese app allowing users to implant their faces into scenes from well-known movies and TV shows landed at the end of last month. Zao quickly stormed to the top of China’s free app download charts for both Android and iPhone. However, the app’s success was short-lived.
Zao’s rapid rise sparked concerns not only over questionable data collection practices but also over how the content could be used maliciously to bypass security systems, such as those on facial recognition payment platforms.
Programs created solely to entertain users may not appear as technologically sophisticated as professional face-mapping software, but significant privacy risks still exist. “Given that these technologies use biometric data, which are irrevocable by nature, once the data is leaked or abused, it could bring severe and permanent consequences for users,” said Dong Jing, an executive committee member of the IEEE Asia Pacific office, at a media event held by the association last Friday.
“It could even impact judicial investigations, insurance appraisals, and other serious and sensitive areas,” added Dong, who holds a Ph.D. in pattern recognition and serves as deputy secretary-general of the Chinese Artificial Intelligence Association.
Spotlight on facial recognition
As face-scanning tech becomes increasingly prevalent in China, it is unsurprising that the Zao pushback has spilled over to payment providers, forcing leading player Alipay to issue a public statement (in Chinese) defending its facial recognition capabilities. Alipay said that deepfake apps pose no risk to its payment tools and cannot be fraudulently used on its payment devices.
“Before scanning a user’s facial features, the device will detect whether the facial information is from an image, a video clip or generated by software, which can effectively avoid cases of identity fraud enabled by fake facial information (our translation),” the company said. Alipay, which has over one billion users globally, launched its facial recognition system for commercial use in 2017.
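Alipay does not disclose how its anti-spoofing check works, but the general idea of such “liveness detection” can be sketched with a toy example. Everything below is illustrative, not Alipay’s method: the function names, the threshold, and the fake pixel data are all assumptions. The premise is that a printed photo held up to a camera produces near-identical frames, while a live face shows small natural motion such as blinks.

```python
# Toy liveness check: compare successive video frames for natural motion.
# This is an illustrative sketch only, NOT Alipay's actual algorithm,
# which would rely on depth/infrared sensing and trained classifiers.

def frame_difference(frame_a, frame_b):
    """Mean absolute pixel difference between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def looks_live(frames, motion_threshold=2.0):
    """A static photo yields near-zero frame-to-frame change;
    a live face shows small motion that exceeds the threshold."""
    diffs = [frame_difference(frames[i], frames[i + 1])
             for i in range(len(frames) - 1)]
    return max(diffs) >= motion_threshold

# A static "photo" attack: every frame identical.
photo_frames = [[100] * 16 for _ in range(5)]
# A "live" capture: frames vary slightly over time.
live_frames = [[100 + (i % 3) * 4] * 16 for i in range(5)]

print(looks_live(photo_frames))  # False
print(looks_live(live_frames))   # True
```

Real systems layer many such signals, which is why Alipay argues a face-swapped video played to the camera would be caught before the recognition step even runs.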
Using Zao’s face-swapping technology to bypass facial recognition systems may be a little farfetched, according to Dong. Although the technology has been around for some time, using it to crack security systems would not be an easy task, she told TechNode.
Although Zao’s developers have not confirmed whether they used open-source deepfake technology, Dong said the app’s results bear a strong resemblance to it. The app likely trained its algorithms on a database of TV and movie clips, with the developers refining the face-swap feature so that a user’s face looks realistic. However, the app cannot implant a user’s facial features onto video content of the user’s choosing. It has its limitations and is not as advanced as one might think, she added.
Dong is currently working on an artificial intelligence tool that can detect whether an image or video has been doctored by deepfake or morphing technology. In a world where face-scanning applications are increasingly common, there should be “defense mechanisms” that can, for example, tell software-generated videos from real ones. Such technology could also be used to determine whether a person standing in front of a facial recognition camera is in disguise, said Dong.
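Dong did not describe how her detection tool works, but one simple signal such “defense mechanisms” can examine is image noise: camera captures carry sensor noise, while some synthesized or heavily processed frames are conspicuously smooth. The sketch below is a toy heuristic built on that assumption, not Dong’s tool, which would be a trained model; the function names, threshold, and pixel rows are invented for illustration.

```python
# Toy forgery-detection heuristic: flag frames that look "too smooth"
# to be a raw camera capture. Illustrative only; real deepfake
# detectors are trained neural networks, not hand-set thresholds.

def local_noise_estimate(pixels):
    """Average absolute difference between neighboring pixels in a
    1-D pixel row: a crude proxy for sensor noise."""
    return sum(abs(a - b) for a, b in zip(pixels, pixels[1:])) / (len(pixels) - 1)

def flag_as_possibly_synthetic(pixels, noise_floor=1.0):
    """Flag a row whose noise level falls below what a real camera
    capture would normally show."""
    return local_noise_estimate(pixels) < noise_floor

camera_row = [120, 123, 119, 124, 121, 125, 118]  # noisy, camera-like
smooth_row = [120, 120, 121, 121, 121, 122, 122]  # suspiciously smooth

print(flag_as_possibly_synthetic(camera_row))  # False
print(flag_as_possibly_synthetic(smooth_row))  # True
```

A single heuristic like this is easy to fool, which is why detection research combines many cues; the point is only to make concrete what “telling software-generated videos from real ones” can mean in practice.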
Addressing doubts over the use of facial recognition in financial services, Dong said the technology is already quite mature and has seen widespread use in China. However, she attributed its slower adoption in the financial sector to the degree of risk involved: institutions are unwilling to bear these risks or deem them unnecessary. “This is not so much a technology problem; on a broader application level, the market is not yet mature,” she said, adding that, for now, the regulatory framework remains murky.