How deepfakes threaten KYC (Know Your Customer) | Kaspersky official blog
While humanity is trying to figure out how to recoup the hundreds of billions of dollars invested in generative AI, cybercriminals are already adopting the technology. For example, they’ve discovered that AI can be used to create virtual money mules — dummy accounts used to transfer stolen funds. Deepfakes allow criminals to successfully bypass customer identity verification (KYC, Know Your Customer) procedures used by financial institutions, thereby eliminating the need for living accomplices. Let’s delve into the details.
What is KYC?
The KYC procedure is a financial-sector practice for verifying a customer’s identity that’s used to combat various illegal financial activities — including fraud, money laundering, tax evasion, financing terrorism, and more.
More specifically, KYC often refers to biometric identity verification systems in fully remote services — that is, when a customer signs up online without any personal contact with employees of the financial institution.
Typically, this procedure requires the customer to upload photos of their documents and take a selfie, often holding the documents. An additional security measure has also recently become popular: the customer is asked to turn on their smartphone camera and turn their head in different directions, following instructions.
This method is sometimes also used to verify transactions, but it’s generally designed to protect against authentication using static photos that might have been stolen somehow. The problem is that criminals have already figured out how to bypass this protection: they use deepfakes.
AI tools for fraud
Not long ago, experts from the deepfake detection startup Sensity released an annual report describing some of the most common ways that cybercriminals maliciously use AI-generated content.
In this report, the experts publish the total number of AI content creation tools worldwide. They counted 10,206 tools for image generation, 2,298 tools for replacing faces in videos and creating digital avatars, and 1,018 tools for generating or cloning voices.
The report also highlights the number of specialized utilities designed specifically to bypass KYC: they counted as many as 47 such tools. These tools allow cybercriminals to create digital clones that successfully pass customer identity verification. As a result, fraudsters can remotely open accounts in financial institutions — banks, cryptocurrency exchanges, payment systems, and more.
These accounts are later used for various criminal activities — mainly for direct financial fraud, as well as laundering profits from illegal operations.
Digital clone store
Recently, 404 Media reported on an underground website selling photos and videos of people for bypassing KYC. According to the journalists, traders of digital duplicates have entire collections of such content. They find volunteers in disadvantaged countries and pay them relatively small amounts ($5-$20) for the footage.
The resulting content is then sold to anyone interested. The collections are quite extensive and include people of different ages, genders, and ethnicities. The site’s services are fairly inexpensive: for example, the journalists purchased a set for only $30. The sets include photos and videos in different clothing, as well as images with a white card and a blank sheet of paper in hand, which can be replaced with an ID or some other document.
The service is extremely customer-oriented. The website has reviews from grateful buyers, and even features a special mark on photos and videos that have been purchased the fewest times. Such “fresh clones” are more likely to successfully pass anti-fraud system checks.
In addition to ready-made digital identities, the site’s administrators offer exclusive content sets created individually for the buyer — on demand and probably for more serious money.
AI-generated fake documents
Journalists from the same media also discovered a website specializing in selling realistic photos of fake documents created using AI.
According to an expert from a company that deals with such fraud, some services of this kind sell ready-to-use sets that include both fake documents and photos and videos of their fake owners.
Thus, AI tools and such content collections make the work of fraudsters much easier. Just a few years ago, money mules — real people who directly handled dirty money by opening accounts, making transfers, and withdrawing cash — were the weakest link in criminal operations.
Now, such “physical” mules are rapidly becoming unnecessary. Criminals no longer need to interact with unreliable “flesh bags” who are vulnerable to law enforcement. It’s just a matter of creating a certain number of digital clones for the same purposes and then targeting those financial services that allow you to open accounts and conduct transactions completely remotely.
So what’s next?
In the future, the ease of bypassing current KYC procedures will likely lead to two consequences. On the one hand, financial organizations will introduce additional mechanisms that check the photos and videos supplied by remote customers for telltale signs of AI generation.
On the other hand, regulators will likely tighten requirements for fully remote financial operations. So it’s quite possible that the simplicity and convenience of online financial services, which we’ve already become accustomed to, will be threatened by artificial intelligence.
Unfortunately, the problem doesn’t end there. As noted by experts, the widespread availability of AI tools for generating photo, video, and audio content fundamentally undermines trust in digital interactions between people. The higher the quality of AI creations, the harder it becomes to believe what we see on our smartphones and computers.