‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!
Direct image link (instead of a Google link): https://imgur.com/qFItKA9.jpg
Saving people a click:
Formatting example based on what I did above:
![](https://i.imgur.com/qFItKA9.jpeg)
I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.
Post nut clarity can be truly eye opening
or closing depending where you get it
I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.
A lot of the technology challenges around AI are old concerns about things we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to try to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.
I’ve seen ads for these apps on porn websites. That ain’t right.
Any moron can buy a match and a gallon of gasoline, freely and legally, and that’s a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that’s already a huge win.
I mean, you’ve been able to do a cursory search and get dozens of “celeb lookalike” porn for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?
Edit: To be clear, it’s scummy as all fuck, but still.
This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to/suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn’t negatively affect them. Billy’s 13 year old classmate Stacy doesn’t have those resources and now he can do the same thing to her. It’s on a very different level of harm.
Billy doesn’t need a nudify app to imagine Stacy naked. Not to mention, images of a naked 13 year old are illegal regardless.
Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely do not.
Underage teenagers already HAVE shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen year old sounds (assuming they get caught, prosecuted, and convicted) that still leaves another kid traumatized.
So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?
Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.
I think most of this is irrelevant because the tool that is AI image generation is inherently hard to limit in this way and I think it will be so prevalent as to be hard to regulate. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made easily and shared easily. It’s already too late. These tools, as was said earlier, already exist and are here. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change. So in that light, we need to legislate based on long term impact instead of short term reactions.
And?… There’s a major difference between “a lookalike of a grown adult” and “AI-generated child porn,” as I’m sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯
“But the brightest minds of the time were working on other things like hair loss and prolonging erections.”
So we can all have big hairy erections like God intended.
To be fair, the erection thing was a fluke. Once they found it though it was the fastest FDA approval in decades.
We have the technology
wait, then why am I still slowly balding?
Slowly means they succeeded
no
Obligatory Mitchell & Webb sketch: https://youtu.be/hdHFmc9oiKY?si=God9P5TdA3UEyx47
Here is an alternative Piped link(s):
https://piped.video/hdHFmc9oiKY?si=God9P5TdA3UEyx47
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
BWANDO!
It’s got what plants crave
These are terrible, but I’m honestly curious what it thinks I look like naked. Like, I’m slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?
Are they just head-swapping onto model bodies, or does it actually approximate? I am legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).
Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don’t send the images anywhere, I just make them to satiate my own curiosity).
You’re essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It’s not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don’t match known information from the original photo. So, with current technology, you’re not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you’d need to already know that’s what you want to portray and load in a custom data set, like a LoRA.
Once you know what’s going on under the hood, making naked photos of celebrities or other real people isn’t the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future’s getting pretty weird.
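To be concrete about why these apps can’t “reveal” anything: masked inpainting replaces the selected region entirely with sampled content, so every pixel under the mask comes from the model’s training prior, not from the person in the photo. Here’s a minimal sketch of that final compositing step in plain NumPy (the function name and toy values are my own, not any particular app’s code):

```python
import numpy as np

def composite_inpaint(original: np.ndarray, generated: np.ndarray,
                      mask: np.ndarray) -> np.ndarray:
    """Blend a generated patch into an image: inside the mask the output
    comes entirely from the model's guess, outside it from the photo."""
    mask = mask.astype(float)
    return mask * generated + (1.0 - mask) * original

# Toy 1-D "image": pixels outside the mask are untouched,
# pixels inside are whatever the model dreamed up.
original = np.array([10.0, 20.0, 30.0, 40.0])
generated = np.array([99.0, 99.0, 99.0, 99.0])  # model's best guess
mask = np.array([0, 1, 1, 0])                   # the selected region

result = composite_inpaint(original, generated, mask)
# Masked pixels carry zero information from the original photo;
# they're purely the model's prior.
```

The point being: whatever the real Stable Diffusion pipeline does during denoising, the masked region is filled from learned statistics, which is exactly why the output defaults to a generic body.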
Yeah man it’s uh… it’s the future that’s getting weird 😅
Hey, I’ve maintained a baseline weird the whole time, I’m pretty sure the future is catching up.
You’ll have your moment when the lone elite ex Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, “I actually know a guy who might be able to help.”
You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn’t be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don’t get many guests. You offer them homemade kombucha. They decline.
deleted by creator
Ethically, these apps are a fucking nightmare.
But as a swinger, they will make an amazing party game.
Ethics will probably change… I guess in the future it’ll become pretty irrelevant to have “nude” pictures of oneself somewhere, because everyone knows it could just be AI generated. In the transition period it’ll be problematic though.
Totally agreed, and 100% the world I want to live in. Transition will indeed suck tho.
Yeah 100%.
Imagine around the advent of readily available photo prints. People might have been thinking “this is terrible, someone I don’t know could have a photo of me and look at it while thinking licentious thoughts!”
If you want the best answer then you’ll have to download the app and try it on yourself. If it’s accurate then that’s pretty wild.
Two days and still no “pics or it didn’t happen” comment. This place is different. :)
I doubt it would be realistic, they just kind of take an average of their training data and blend it together to my knowledge.
That’s pretty much what all “AI” does.
Fake nudes incoming. Everyone has a baby leg now.
I’m really curious if your DMs are now flooded with weirdos and dick pics, or if lemmy is any different from the rest of the internet.
Honestly not a single one. Much better than Reddit.
There are so many though!! Which ones? Like which ones specifically??
Asking for a friend
And people will agree to have their photographs taken, because of the implication …
Yeah, I need to know so I can stop my kids from using them. Specifically which ones?
Possibly a good thing. Oversaturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top. Ending this once and for all.
Or find the people doing this and lock em up.
The first option is much better in the long run.
Whatever works
Keep creating more ai porn than anyone can handle
overabundance is behind a lot of societal ills already
Fair enough. Do you have a better solution? I was just spitballing.
What were you thinking when you wrote your first version? That sounds like a creepy scenario. What if I don’t want to see it and it’s everywhere? I could click “Not Interested” and flood social media with reports, but if there are “billions upon billions” of AI nudes, who would be able to keep them out of their feed? I’d say that, while locking people up won’t change the sexist system that pushes this behavior, it’s a far less creepy and weird scenario than having billions of nonconsensual nudes online.
Why would you see them on social media? It depends what you look at. There are already billions of naked people on the internet. Do you see them?
deleted by creator
Could we stop pushing articles monetizing fear and outrage to the top of this community and post about actual technology?
Sounds like someone needs to make a community for that.
Otherwise, this is what technology is these days. And I’d say that staying blind to things like this is what got us into many messes.
I remember when tech news was mostly a press release pipeline. And when I see these comments, I see people who want press releases about new tech to play with.
Now duplicate posts. Those can fuck right off.
I have seen a rise in techno-absolutists lately, complaining that anyone else is complaining about the dangers of tech. They just want to go back to hearing about all the cool new things coming out, and it really speaks to people who don’t actually want to interact with the real world anymore and would rather live in an illusory optimism bubble. I get it. It’s exhausting to be aware of all the negatives, but this stuff is real and needs to be recognized.
I support this idea.
None of this is consensual
Honestly, we’re probably just going to have to get over it. Or pull the plug on the whole AI thing, but good luck with that.
Can’t put the genie back in the bottle
I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?
Sus question lmfao
These things have been around since the onset of deepfakes, and truly if you take a couple seconds to look you’ll find them. It’s a massive issue and the content is everywhere
This has been around in some form way before deepfakes
We’re talking specifically about AI enhanced fakes, not the old school Photoshop fakes – they’re two completely different beasts
Different only in construction. Why they exist and what they are is older than photography.
No, I disagree: before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.
That is a quality improvement, not a shift in nature.
Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.
I’m not saying that it’s a shift in nature? All I’ve been saying is:
A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing
B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and therefore can have more detrimental effects
The difference is that we can now do video. I mean, in principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of people who use AI don’t have an artistic bone in their body.
There was a brief period between now and the invention of photography when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagine without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.
I don’t understand either.
Careful with asking such things, because the boundary to crime seems blurry.
I don’t think there is any crime.
It’s identical to drawing a nude picture of someone.
It’s what the courts think, and right now, it’s not clear what the enforceable laws are here. There’s a very real chance people who do this will end up in jail.
I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and the models behind them.
And you are sure that ‘someone’ is of legal age, of course. Not blaming you. But does everybody always know that ‘someone’ is of legal age? Just an example to start thinking.
I don’t know if it’s illegal to create naked drawings of people who are underage.
It’s not
Depends on where you live. Not legal in the UK for example. In the US it can even be broken down at the state level, although there’s lots of debate on whether states are able to enforce their laws. “Obscene” speech is not protected under free speech, the argument would be whether or not the naked drawings had artistic merit or not.
I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.
It tells me we’re less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress’ nudity is real or simulated.
Not sure how this will be enforceable, any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren’t measured by the media exposure of their bodies or their history of lovers. Maybe in another age or two.
In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to make it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and by doing so wreck her career.
Porn doesn’t bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.
Data. Return to your quarters
Great! Now whenever someone finds all my secret nudes, I’ll just claim they’re deepfakes
But are there apps that undress men?
Aren’t those just normal chat apps where you can send pictures?
No you just politely ask them
The models they’re using are probably capable of both, they just need to change the prompt.
They would need to be trained to do that first.
… but why male models?
9gag
Same, I couldn’t find any other sources for the image I needed tho
I don’t think any of the models they’re using are trained from scratch. It would be much cheaper to take something like Stable Diffusion and finetune it or use one of the hundreds of premade porn finetunes that already exist.
Yeah, training a foundation model from scratch is stupid expensive. You’re talking about a team of data scientists each pulling at least six figures.
Only one way to find out
Pretty sure straight men are the most likely ones to use apps like this.
Though the picture suggests we should also create really-a-robot or really-a-cyborg edits of celebrities.
As an afterthought, really-a-reptilian images for our political figures would also be in good order.
Jesus, as if Facebook “researchers” weren’t already linking to Onion articles, now you’ll give them pictures.
They can go ahead, but they’ll never get that mole in the right place.
Can I pick where they put my extra finger?
I think we already know where that little guy belongs
You mean men envision women naked? And now there’s an app that’s just as perverted? Huh
What’s perverted about someone envisioning a potential sexual partner naked? That seems incredibly normal to me.
Maybe revenge porn and creating deepfake porn of the girl from social studies is wrong?
Your comment has nothing to do with what I said.