Us, Them, Pornographic Deepfakes, and Guardrails

(Image generated by Imagen 4)

Some of you may remember the Pink Floyd song “Us and Them.” The band had a history of examining things from different perspectives, to the point where Roger Waters and the band later conceived an entire double album (The Wall) from a single incident in which Waters spat on a member of the audience.

Is it OK to spit on the audience…or does this raise the threat of the audience spitting on you? Things appear different when you’re the recipient.

And yes, this has everything to do with generative artificial intelligence and pornographic deepfakes. Bear with me here.

Non-Consensual Activity in AI Apps

My former IDEMIA colleague Peter Kirkwood recently shared an observation on the pace of innovation and the accompanying risks.

“I’m a strong believer in the transformative potential of emerging technologies. The pace of innovation brings enormous benefits, but it also introduces risks we often can’t fully anticipate or regulate until the damage is already visible.”

Kirkwood then linked to an instance in which the technology is moving faster than the business and legal processes: specifically, Bernard Marr’s LinkedIn article “AI Apps Are Undressing Women Without Consent And It’s A Problem.”

Marr begins by explaining what “nudification apps” can do, and notes the significant profits that criminals can realize by employing them. Marr then outlines what various countries, including the United States, the United Kingdom, China, and Australia, are doing to battle nudification apps and their derived content.

But then Marr notes why some people don’t take nudification all that seriously.

“One frustration for those campaigning for a solution is that authorities haven’t always seemed willing to treat AI-generated image abuse as seriously as they would photographic image abuse, due to a perception that it ‘isn’t real’.”

First they created the deepfakes of the hot women

After his experiences under the Nazi regime, in which he went from sympathizer to prisoner, Martin Niemöller frequently discussed how those who first “came for the Socialists” would eventually come for the trade unionists, then the Jews…then ourselves.

And I’m sure that some of you believe I’m trivializing Niemöller’s statement by equating deepfake creation with the persecution of socialists. After all, deepfakes aren’t real.

But the effects of deepfakes are real, as Psychology Today notes:

“Being targeted by deepfake nudes is profoundly distressing, especially for adolescents and young adults. Deepfake nudes violate an individual’s right to bodily autonomy—the control over one’s own body without interference. Victims experience a severe invasion of privacy and may feel a loss of control over their bodies, as their likeness is manipulated without consent. This often leads to shame, anxiety, and a decreased sense of self-worth. Fear of social ostracism can also contribute to anxiety, depression, and, in extreme cases, suicidal thoughts.”

And again I raise the question: if it’s OK to create realistic-looking pornographic deepfakes of hot women you don’t know, or of children you don’t know, then is it also OK to create realistic-looking pornographic deepfakes of your own family…or of you?

Guardrails

(Image generated by Imagen 4)

The difficulty, of course, is enforcing guardrails to stop this activity. Even if most governments agree, and most of the major platform businesses (such as Meta and Alphabet) agree, “most” does not equal “all.” And as long as there is a market for pornographic deepfakes, someone will satisfy the demand.
