Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of them until it happened to me. It felt like something anecdotal that happened in other people’s lives, not something that would happen in mine,” recalls Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. The subject line: “Realistic?” “We wonder which photo looks most like you,” she reads.

Attached were five photos of her.

In the original photos, posted on her social media accounts, Julia poses fully clothed. Before her eyes are the same images, except this time Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

  • natecheese@kbin.melroy.org · 9 months ago

    I think the issue is that sexual imagery of the person is being created and shared without that person’s consent.

    It’s akin to taking nude photos of someone without their consent, or sharing nude photos with someone other than their intended audience.

    Even if there were no stigma attached to nudes, that doesn’t mean someone would want their nudes to exist or be shared.