Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet but are actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • UnderpantsWeevil@lemmy.world · 8 months ago

    I assume that’s what you’d call OnlyFans.

    That said, the irony of these apps is that it’s not the nudity that’s the problem, strictly speaking. It’s taking someone’s likeness and plastering it on a digital mannequin. What social media has done is become the online equivalent of going through a girl’s trash to find an old comb, pulling the hair off, and putting it on a Barbie doll that you then use to jerk/jill off.

    What was once the domain of 1980s perverts from comedies about awkward high schoolers has now become a commodity we’re supposed to treat as normal.

    • CaptainEffort@sh.itjust.works · 8 months ago

      Idk how many people are viewing this as normal; I think most of us recognize all of this as being incredibly weird and creepy.

      • UnderpantsWeevil@lemmy.world · 8 months ago

        > Idk how many people are viewing this as normal

        Maybe not “Lemmy” us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.

        • CaptainEffort@sh.itjust.works · 8 months ago

          Yeah damn, that’s true.

          An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize are bad because they were taught correctly.

          Unfortunately, due to how puritanical our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.