(Injection attack syringe image from Imagen 3)
Having realized that I have never discussed injection attacks on the Bredemarket blog, I decided I should rectify this.
Types of attacks
When considering falsifying identity verification or authentication, it’s helpful to see how VeriDas defines two different types of falsification:
- Presentation Attacks: These involve an attacker presenting falsified evidence directly to the capture device’s camera. Examples include using photocopies, screenshots, or other forms of impersonation to deceive the system.
- Injection Attacks: These are more sophisticated, where the attacker introduces false evidence directly into the system without using the camera. This often involves manipulating the data capture or communication channels.
To be honest, most of my personal experience involves presentation attacks, in which the identity verification/authentication system remains secure but the information, um, presented to it is altered in some way. See my posts on Vision Transformer (ViT) Models and NIST IR 8491.

Injection attacks and the havoc they wreak
In an injection attack, the identity verification/authentication system itself is compromised. For example, instead of receiving data from the camera, the system receives data from some other source that is, um, injected so that it looks like it came from the camera.
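If you want to see what a defense against this looks like in code, here is a minimal Python sketch. It is entirely my own illustration rather than any vendor's method, and it assumes a hypothetical setup in which the genuine capture SDK signs each camera frame with a provisioned key, so the server can reject bytes that arrived by some other route:

```python
import hashlib
import hmac

# Hypothetical shared secret provisioned to the genuine capture SDK.
# Real deployments would use per-device keys or asymmetric signatures,
# ideally backed by attested hardware; this is only an illustration.
CAPTURE_SDK_KEY = b"example-shared-secret"

def sign_frame(frame_bytes: bytes) -> str:
    """What a trusted capture SDK might attach to each camera frame."""
    return hmac.new(CAPTURE_SDK_KEY, frame_bytes, hashlib.sha256).hexdigest()

def is_from_trusted_capture(frame_bytes: bytes, signature: str) -> bool:
    """Server-side check: reject frames whose signature doesn't verify,
    i.e., data injected into the channel without the capture SDK."""
    expected = hmac.new(CAPTURE_SDK_KEY, frame_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# A genuine frame passes; an injected payload with no valid signature fails.
genuine = b"raw-camera-frame-bytes"
print(is_from_trusted_capture(genuine, sign_frame(genuine)))          # True
print(is_from_trusted_capture(b"injected-deepfake-bytes", "bogus"))   # False
```

The point isn't the particular cryptography; it's that the server needs some way to prove the bytes traveled the camera path rather than some other path.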
Incidentally, I should tangentially note that injection attacks differ greatly from scraping attacks, in which content from legitimate blogs is stolen and injected into scummy blogs that merely rip off the original writers. Speaking for myself, it is clear that this repurposing is not an honorable practice.
Note that injection attacks don’t only affect identity systems, but can affect ANY computer system. SentinelOne digs into the different types of injection attacks, including manipulation of SQL queries, cross-site scripting (XSS), and other types. Here’s an example from the health world that is pertinent to Bredemarket readers:
In May 2024, Advocate Aurora Health, a healthcare system in Wisconsin and Illinois, reported a data breach exposing the personal information of 3 million patients. The breach was attributed to improper use of Meta Pixel on the websites of the provider. After the breach, Advocate Health was faced with hefty fines and legal battles resulting from the exposure of Protected Health Information (PHI).
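To make the SQL query manipulation variety concrete, here's a small self-contained Python sketch of my own (the table and values are invented and have nothing to do with the breach above), showing how a classic SQL injection slips through string concatenation but not through a parameterized query:

```python
import sqlite3

# Toy database standing in for a patient records system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (name TEXT, ssn TEXT)")
conn.execute("INSERT INTO patients VALUES ('Alice', '123-45-6789')")

user_input = "' OR '1'='1"  # attacker-supplied "name"

# Vulnerable: the attacker's input is spliced into the SQL string,
# so the WHERE clause always evaluates true and every row leaks.
unsafe = conn.execute(
    f"SELECT * FROM patients WHERE name = '{user_input}'"
).fetchall()
print(unsafe)  # [('Alice', '123-45-6789')]

# Safer: a parameterized query treats the input as data, not SQL,
# so the injected clause matches nothing.
safe = conn.execute(
    "SELECT * FROM patients WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # []
```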
Returning to the identity sphere, Mitek Systems highlights a common injection attack.
Deepfakes utilize AI and machine learning to create lifelike videos of real people saying or doing things they never actually did. By injecting such videos into a system’s feed, fraudsters can mimic the appearance of a legitimate user, thus bypassing facial recognition security measures.
Again, this differs from someone with a mask getting in front of the system’s camera. Injections bypass the system’s camera.
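One building block vendors use here is a challenge-response handshake on the capture session. The sketch below is my own illustrative Python rather than any vendor's API: the server issues a fresh nonce that the capture client must return within a short window, which at least rules out replaying a payload that was recorded or fabricated before the challenge existed. It is one layer among several, not a complete defense on its own.

```python
import secrets
import time

# Hypothetical per-session challenge store.
issued_challenges = {}

def issue_challenge(session_id: str) -> str:
    """Server issues a fresh, unpredictable nonce for this capture session."""
    nonce = secrets.token_hex(16)
    issued_challenges[session_id] = (nonce, time.time())
    return nonce

def accept_capture(session_id: str, echoed_nonce: str, max_age_s: float = 30.0) -> bool:
    """Reject captures that don't echo the current nonce in time.
    A payload prepared before the challenge was issued can't contain it."""
    record = issued_challenges.pop(session_id, None)  # single use
    if record is None:
        return False
    nonce, issued_at = record
    fresh = (time.time() - issued_at) <= max_age_s
    return fresh and secrets.compare_digest(nonce, echoed_nonce)

# Usage: in practice the nonce would be bound into the signed capture payload.
challenge = issue_challenge("session-42")
print(accept_capture("session-42", challenge))           # True
print(accept_capture("session-42", "stale-or-guessed"))  # False (already consumed)
```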
Fight back, even when David Horowitz isn’t helping you
So how do you detect that you aren't getting data from the camera or capture device that is supposed to be providing it? Many vendors offer tactics to attack the attackers; here's what ID R&D (part of Mitek Systems) proposes.
These steps include creating a comprehensive attack tree, implementing detectors that cover all the attack vectors, evaluating potential security loopholes, and setting up a continuous improvement process for the attack tree and associated mitigation measures.
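The "comprehensive attack tree" in that list is a concrete artifact you can actually build. Here's a toy Python sketch of my own (the node names and detectors are invented, not ID R&D's) in which each attack vector carries the detectors meant to cover it, so the vectors with no coverage, i.e., the potential loopholes, fall right out:

```python
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    """One node in a toy attack tree: an attack vector, the detectors
    assigned to it, and any more specific sub-vectors beneath it."""
    name: str
    detectors: list = field(default_factory=list)
    children: list = field(default_factory=list)

def uncovered(node: AttackNode) -> list:
    """Return leaf attack vectors with no detector assigned (candidate loopholes)."""
    if not node.children:
        return [] if node.detectors else [node.name]
    gaps = []
    for child in node.children:
        gaps.extend(uncovered(child))
    return gaps

tree = AttackNode("bypass face verification", children=[
    AttackNode("presentation attack", children=[
        AttackNode("printed photo", detectors=["passive PAD model"]),
        AttackNode("3D mask", detectors=["passive PAD model"]),
    ]),
    AttackNode("injection attack", children=[
        AttackNode("virtual camera feed", detectors=["capture channel integrity check"]),
        AttackNode("API payload replay"),  # no detector assigned yet
    ]),
])

print(uncovered(tree))  # ['API payload replay']
```

Rerunning a check like this whenever the tree or the detector set changes is a cheap way to operationalize the "continuous improvement process" in the same list.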
And as long as I’m on a Mitek kick, here’s Chris Briggs telling Adam Bacia about how injection attacks relate to everything else.
As you can see, the tactics to fight injection attacks are far removed from the more forensic “liveness” procedures, such as detecting whether a presented finger is from a living, breathing human.
Presentation attack detection can only go so far.
Injection attack detection is also necessary.
So if you’re a company guarding against spoofing, you need someone who can create content, proposals, and analysis that can address both biometric and non-biometric factors.
Learn how Bredemarket can help.
Not that I’m David Horowitz, but I do what I can. As did David Horowitz’s producer when he was threatened with a gun. (A fake gun.)
