• PaupersSerenade@sh.itjust.works · 7 months ago

    I’ll be a minority voice considering the other comments, but maybe just pay for OnlyFans or whatever you guys use. I’m a generally attractive woman (as I can surmise from interactions while trying to date), and I really don’t like the idea of my likeness being used for something like this. Get your jollies off, but try to be a bit consensual about it. Is that so much to ask?

    • GrymEdm@lemmy.world · 7 months ago (edited)

      It isn’t too much to ask. According to Dr. K of HealthyGamerGG (a Harvard psychiatrist and instructor), research shows that the release of non-consensual porn leaves its unwilling subjects suicidal over half the time. Non-consensual porn = deepfakes, revenge porn, etc. It’s seriously harmful, with other effects including depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that is made with consent, and if someone doesn’t want to be publicly sexually explicit, then that’s their choice.

      I’m not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses, as with deepfakes, there’s clear proof of harm, and that’s enough for me to oppose it. I don’t believe there’s some inherent right to see specific people naked against their will.

      • fidodo@lemmy.world · 7 months ago

        I think it would be too big of a privacy overreach to ban it outright: what people do on their own computers is their own business, and there’s no way to enforce a full ban without being incredibly intrusive. But as soon as it gets distributed in any way, I think it should be prosecuted as heavily as real non-consensual porn that was taken against someone’s will.

      • HakFoo@lemmy.sdf.org · 7 months ago (edited)

        I wonder if part of the emotional risk is due to the general social stigma attached to porn. It becomes something that has to be explained and justified.

        If done to grand excess, deepfakes could crash the market on that, so to speak. Yeah, everyone saw your face on an AI-generated video. They also saw Ruth Bader Ginsburg, their Aunt Matilda, and for good measure, Barry Bonds, and that was just a typical Thursday.

        The shock value is burnt through, and “I got deepfaked” ends with a social stigma on the level of “I got in a shouting match with a cashier” or “I stumbled into work an hour late recently.”

        • fidodo@lemmy.world · 7 months ago

          My main concern is for kids and teenagers. They’ll bully people for no damn reason at all, and AI porn allows bullies to do more fucked-up psychological abuse, which could be made much worse if victims have no recourse to fight back.

    • MuchPineapples@lemmy.world · 7 months ago

      AI porn isn’t deepfake porn. The default is just a random AI-generated face and body; unless you deliberately set out to, it’s difficult to deepfake a specific person.

            • starman2112@sh.itjust.works · 7 months ago

              You can’t just say “excellent question” when someone asks you to clarify your point lmfao

              “They’re trying to force our kids to get vaccines so they can manipulate them with 5g wifi”

              How could they manipulate your kids with 5g signals?

              “That’s a good question innit”

              • prole@sh.itjust.works · 7 months ago

                I guess this was just one level of abstraction too much for you huh?

                The entire issue here is AI being trained on people’s data without them knowing or giving permission. The question of whose likenesses and which photos are being used is an excellent question, and it’s a big part of the problem here.

    • ArbiterXero@lemmy.world · 7 months ago

      So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact, I’d bet that it’s just AI-generated “people” who don’t exist.

      What about AI porn of a person that doesn’t exist?

      • Arbiter@lemmy.world · 7 months ago

        However, one of Salad’s clients is CivitAi, a platform for sharing AI-generated images which has previously been investigated by 404 Media. It found that the service hosts image-generating AI models of specific people, whose image can then be combined with pornographic AI models to generate non-consensual sexual images.

    • BudgetBandit@sh.itjust.works · 7 months ago

      I know someone who’s into really dark romance stuff, like really hardcore stuff, but she’d never do some of this due to safety reasons. I can totally see her generating scenes of herself in those situations.

    • VaultBoyNewVegas@lemmy.world · 7 months ago

      It shouldn’t be, but I’ve been downvoted here for speaking against deepfakes. Some people really don’t want to see the problem with them.

    • CleoTheWizard@lemmy.world · 7 months ago

      I have a question and I hope that people here will discuss this because I really want to understand the general opinion on this.

      Is it wrong to deepfake someone without their consent so long as you don’t share the content and it’s all stored locally? I’ve seen this come up, and my general opinion is that it isn’t. I know that isn’t the case in the article; I just want to hear why people would disagree.

      My angle is that making a deepfake of someone in private hurts zero people and is an extension of fantasy. I don’t see the creation of fake nudes as any different from writing fantasy erotica about someone, or from creating fake nude art of them by hand or with Photoshop. If you do it in your head anyway, which is completely normal, then aren’t we just worried about the outside effects and not the fantasizing itself?

      • William@lemmy.world · 7 months ago

        It’s at least as wrong as fantasizing about them if they aren’t already romantically involved with you.

        How wrong that is, is up for debate. It will definitely creep them out and they can never find out about it.

        If it’s just in your head, at least there’s no physical way they could ever find out. You’d have to admit it. But if you have it on your hard drive, a hacker could get it and blackmail you with it, or just distribute it.

        So my stance is that there’s a non-zero chance of doing harm to them, and so it’s wrong. I wouldn’t do it. I also wouldn’t create it with Photoshop, or by hand, for the same reason.

        If you want to jerk off, do it to existing porn, or imaginary people porn. Don’t create porn of real people without their permission, even if you think nobody will ever see it other than you. Accidents happen, and they don’t deserve to bear the cost of that.

        • fidodo@lemmy.world · 7 months ago

          You’d have to admit it. But if you have it on your hard drive, a hacker could get it and blackmail you with it, or just distribute it.

          There are lots of sick fucks that will distribute it themselves and even send it to their victims to harass them directly. It’s already happening.

          I don’t think it’s possible to ban it outright, and I think what people do on their own computer is their own business so long as they aren’t connecting to other computers. But we should have strong laws against distributing it, and treat it the same as distributing secretly taken real nudes against someone’s will. Victims need recourse against harassment.

        • CleoTheWizard@lemmy.world · 7 months ago

          The first part, absolutely. But I think a lot of that is biological so I don’t see fantasy as a problem. You should keep it to yourself though.

          The second part I think I’d somewhat agree with, except the hacker can’t blackmail you with it, because it’s just as likely that they created it themselves. And even if they did blackmail you, I would view that as damage caused by the hacker, not by the individual.

          Like if someone put something nasty about me down in their diary where they expected it to be private, and a hacker sent me an email of that diary page, that’s entirely the hackers fault. The diary writer was expressing an emotion or desire or whatever in complete privacy. Was their creation wrong? No, I don’t think so.

          And to be clear, I’m not saying people should go to this type of fantasy; this is all a thought exercise in ethics. But I think a lot about this stuff because, as much potential for bad as it has, it also has some potential for good. All of the women I know experience behaviors such as stalking, obsession, and unwelcome sexual advances on a regular basis, and there is a reason those men don’t turn to free porn. Incel behavior is also just as bad in many ways. So could AI and deepfake stuff result in many of those men keeping that stuff to themselves more? Maybe.

          And before you say that these perverts will just send fake nudes to you and harass you that way, we should absolutely be prosecuting people that do so. That’s an entirely separate convo tho.

        • AstralPath@lemmy.ca · 7 months ago

          It will definitely creep them out and they can never find out about it.

          And that’s all that’s required for it to be considered wrong IMO.

            • AstralPath@lemmy.ca · 7 months ago

              How anyone could think that invoking thoughtcrime is relevant to this discussion is beyond me. It should be self-evident to anyone that fantasies are a thing; they’ve been a thing for the entire history of the human race. In no way do fantasies compare to creating reproducible and shareable media of someone in a pornographic situation without their consent.

              You can’t transplant your fantasies into someone else’s head. Your fantasies literally cannot hurt anyone. On the other hand, imagine if you found out that someone was distributing pornographic material depicting one of your loved ones. It can quite literally ruin someone’s reputation to be seen in a pornographic situation.

              Your argument is some slippery slope fallacy shit.

              • notfromhere@lemmy.ml · 7 months ago (edited)

                Reread the comment I replied to, and then reread my comment. You are putting words in my mouth. I never mentioned sharing anything, nor implied anything of the sort.

    • oozynozh@lemm.ee · 7 months ago

      Deepfake pornography is super goony but if I had to look for a silver lining, at least nobody had to undergo the actual physical degradation of making porn. It’s still gross in its own way, but it’s a different kind of gross that seems worse in some ways but better in others.

      I don’t know… Am I off base here?

          • otp@sh.itjust.works · 7 months ago

            Ah, right, sorry. The first part of your comment makes it seem like you’re leaning the other way.

            • oozynozh@lemm.ee · 7 months ago

              I’m not sure I feel strongly enough about it to have a consequential opinion either way, but I’m trying to at least judge the situation objectively.

              I think you raised a valid point. The non-consensual nature of deepfakes pushes it into the realm of abuse material and maybe that’s worse overall than the general exploitation of women going on in the adult film industry, even if those are supposed to be “consensual” on paper.

      • William@lemmy.world · 7 months ago

        Judging by another comment here, non-consensual porn is far worse, causing suicidal thoughts and more.

        So I’d say it has all the “gross” of regular porn (which is subjective) and the additional “gross and horrifying” of violating someone.