Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.

In September alone, 24 million people visited undressing websites, according to the social network analysis company Graphika.

Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps has increased more than 2,400% on social media, including on X and Reddit, the researchers said. The services use AI to recreate an image so that the person appears nude. Many of the services work only on images of women.

These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.

  • QuarterSwede@lemmy.world · 11 months ago

    The only real response options for a celebrity or public figure are to 1) say nothing, or 2) make light of it by saying something like, “I’m flattered, they made me look better than I do!”

    • daredevil@kbin.social · 11 months ago

      I’d imagine this will be very problematic for non-celebrities from all sorts of backgrounds as well. The harassment potential is very concerning.

      • QuarterSwede@lemmy.world · 11 months ago

        Agreed. We just need to teach our kids that it literally isn’t the end of the world and how to deal with it.

        • pinkdrunkenelephants@lemmy.world · 11 months ago

          All that’s going to do is give abusers power over others.

          What really needs to happen is for that kind of tech, and more importantly the use of any software, AI or not, to do such a thing, to be outlawed. It shouldn’t be protected under the First Amendment.

            • uranibaba@lemmy.world · 11 months ago

              It would prevent a child in school from creating such an image and openly sharing it because it would be illegal.

              • sylver_dragon@lemmy.world · 11 months ago

                Yes. Because kids never do anything illegal.
                And I certainly wouldn’t know anything about the misuse of fireworks, rocket engines or household chemicals which mysteriously happened in my hometown when I was young.

                Yes, by all means, it should be illegal to make and distribute these sorts of images. But the tech is out there, and it’s going to happen. We’re going to need to teach children the mental resilience to deal with these images when they happen, and try to make it much less of a big deal.

          • Sybil@lemmy.world · 11 months ago

            “All that’s going to do is give abusers power over others”

            No, they’re proposing NOT giving abusers power.