The iProov Threat Intelligence Report 2024 describes how new technologies have accelerated the digital arms race between threat actors and those charged with stopping them.
Dr. Andrew Newell, iProov’s chief scientific officer, said deep fakes have been around for about five years, but tools such as digital emulators and techniques like metadata spoofing have lowered the skill level needed to commit fraud. Emulators are software tools that mimic devices such as mobile phones, and more threat actors are using them to attack mobile web, iOS, and Android platforms.
Why threat rates are surging
“We’re engaged in an arms race,” Newell said. “We have always expected that the threats against us will evolve, and we’ve built the team in a way around this arms race idea.
“We’ve been talking about things like deep fakes for about five years, injection attacks for many, many years. For quite a lot of that, people looked at us and said these things are quite hard to do; that won’t ever happen.”
They’re not saying that anymore. Gone are the days when people could spot fakes with the naked eye. Many mistakenly assumed the threat would stop there.
But it was only just beginning. Newell said visual and audio technologies have rapidly advanced over the past 18 months. At the same time, they have become easier to use.
That’s a recipe for proliferation, and that’s what happened. Newell said iProov tracks around 110 face-swapping technologies alone. New versions appear almost weekly.
“You can download these tools often for free, and can be up and running within an hour,” he explained. “The ease of use of these things is just incredible. So they’ve gone from being what was a quite advanced attack to now being something that you have to class it as a low-effort attack.”
Newell said these tools give attackers complete control, which threatens critical identity systems: an attacker can direct the actions of the face seen in a video and apply those actions to different faces.
How to fight back
Defenders must fight fire with fire, with solutions that can detect synthetic imagery. iProov’s technology uses the user’s device to illuminate the face with a different sequence of colors each time; how that light interacts with the face provides critical clues about whether a live person is present. The process is seamless and requires no effort from the user.
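The details of iProov’s system are proprietary, but the general challenge-response idea can be sketched. The snippet below is a minimal, hypothetical illustration in Python, not iProov’s implementation: a verifier picks a one-time color sequence, the device plays it at the face, and the verifier checks that the color reflected from the face follows that same unpredictable sequence, something a pre-recorded or injected video cannot reproduce.

```python
import secrets
import numpy as np

# Hypothetical, simplified sketch of a one-time color challenge-response.
# This is NOT iProov's actual algorithm; it only illustrates the general idea
# of screen-light liveness: flash an unpredictable color sequence and check
# that the light reflected from the face follows the same sequence.

COLORS = ["red", "green", "blue", "yellow", "cyan", "magenta"]

def generate_challenge(length: int = 8) -> list[str]:
    """The verifier picks a one-time, unpredictable color sequence per session."""
    return [secrets.choice(COLORS) for _ in range(length)]

def reflected_color_signal(frames: list[np.ndarray], face_mask: np.ndarray) -> np.ndarray:
    """Average color of the face region in each captured frame (one frame per flash)."""
    return np.array([frame[face_mask].mean(axis=0) for frame in frames])

def verify_challenge(challenge, measured, expected_responses, threshold=0.9):
    """Compare measured per-frame face color against the reflections the flashed
    colors should produce. A replayed or injected video does not know the one-time
    sequence, so its color trajectory will not correlate with the challenge."""
    expected = np.array([expected_responses[c] for c in challenge])
    e = (expected - expected.mean(axis=0)).ravel()
    m = (measured - measured.mean(axis=0)).ravel()
    score = float(np.dot(e, m) / (np.linalg.norm(e) * np.linalg.norm(m) + 1e-9))
    return score >= threshold
```

Because the sequence is unpredictable and used only once, even a convincing synthetic face cannot be prepared in advance with the correct reflections.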
Newell said threat detection systems must be designed to be far more responsive if they are to keep pace with a rapidly changing threat environment: they must adapt to the speed of advancement and be updated frequently.
“We have to start thinking about the world in a completely different way and accept that timescales are really short,” Newell said. “In the past, you had a lot of people who were keen about on-prem deployments and things like that.
“In the future, these aren’t going to work. The timescales are just too long. We have to think about how we architect the whole system, such that from detection of the threat through the adaptation of the defence and through to the deployment of the update everywhere, how do we make sure that we complete this in a very short term?”
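One way to read Newell’s point is as an argument for centrally delivered, continuously updated defences rather than slow on-premises release cycles. The sketch below is purely illustrative, with hypothetical endpoint and function names rather than any iProov API, of how a deployment might poll for and hot-swap the latest detection model so an update propagates everywhere shortly after it is published.

```python
import json
import time
import urllib.request

# Hypothetical endpoint publishing the latest detection-model manifest.
MODEL_FEED_URL = "https://example.com/liveness/models/latest.json"

def fetch_latest_manifest() -> dict:
    """Download the manifest describing the most recently published model."""
    with urllib.request.urlopen(MODEL_FEED_URL, timeout=10) as resp:
        return json.load(resp)

def update_loop(current_version: str, apply_update, poll_seconds: int = 300):
    """Poll the central feed and hot-swap the detection model as soon as a newer
    version appears, so every deployment converges within minutes of a threat
    being detected and a defence being adapted, rather than on a long on-prem
    release cycle."""
    while True:
        manifest = fetch_latest_manifest()
        if manifest.get("version") != current_version:
            apply_update(manifest)  # download and activate the new model
            current_version = manifest["version"]
        time.sleep(poll_seconds)
```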
The use of deepfake injection attacks, in which criminals feed synthetic imagery into a system through a virtual camera, increased by more than 700% in the second half of 2023. Injection attacks overall surged 255% over the same period, with emulator use up 353%. The increased availability of simple tools gets much of the credit.
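As a deliberately simplistic illustration of why virtual-camera injection is hard to stop, one weak signal a system might check is whether the reported camera device name matches known virtual-camera software. The Python sketch below is hypothetical and easily defeated by renaming the device, which is exactly why challenge-based detection of the kind described above matters.

```python
# Illustrative only, not a real defence: flag camera names associated with
# common virtual-camera software. Attackers can rename devices, so a match
# only raises suspicion; it is not proof of an injection attack.

KNOWN_VIRTUAL_CAMERAS = {"obs virtual camera", "manycam", "snap camera", "xsplit vcam"}

def looks_like_virtual_camera(device_name: str) -> bool:
    name = device_name.lower()
    return any(v in name for v in KNOWN_VIRTUAL_CAMERAS)
```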
In addition to more accessible technology, criminals are getting smarter by sharing knowledge. There is a surge in the number of nefarious groups, with half created in the last year. The median membership is 1,000.
The three main threat actors
There are three main types of threat actors. Opportunists seek financial gain through basic tools. Popular tactics are phishing, social engineering, and identity theft.
Commercial actors have the financial resources, patience, and knowledge to inflict greater damage, and their actions are more targeted. They will experiment with a system to find an exploit and, once they do, sell it to others.
Nation-state actors play the long game. Newell said that as more countries move to national identity schemes, they become attractive targets.
That makes it even more imperative to design systems that evolve rapidly. There is no perfect system, so whatever you have must be constantly assessed, and vulnerabilities must be addressed immediately, because an adversary may have already found one and be biding their time.
“You want to make sure that when they come back, you know that the system has advanced so that it will not work anymore,” Newell said. “Make sure that they’re dealing with a moving target while making sure that the effort bonafide users had to go through is very low.”