Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] wewbull@feddit.uk 54 points 1 week ago (5 children)

Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

[–] lath@lemmy.world 46 points 1 week ago (2 children)

A school setting generally means underage individuals are involved, which makes any such content CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

[–] LostXOR@fedia.io 20 points 1 week ago (4 children)

Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

[–] lath@lemmy.world 7 points 1 week ago

I don't know personally. The admins of the fediverse likely do, considering it's something they've had to deal with from the start. So, they can likely answer much better than I might be able to.

[–] lka1988@lemmy.dbzer0.com 5 points 1 week ago* (last edited 1 week ago)

I would consider that as qualifying, because it's targeted harassment in a sexually explicit manner. All the girl would have to do is claim it's her.

Source: I'm a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

[–] wewbull@feddit.uk 6 points 1 week ago (4 children)

Disagree. Not CSAM when no abuse has taken place.

That's my point.

[–] Zak@lemmy.world 20 points 1 week ago (1 children)

I think generating and sharing sexually explicit images of a person without their consent is abuse.

That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

[–] kemsat@lemmy.world 5 points 1 week ago

Harassment sure, but not abuse.

[–] atomicorange@lemmy.world 8 points 1 week ago (2 children)

If someone put a camera in the girls' locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?

If so, how is the psychological effect of a convincing deepfake any different?

[–] General_Effort@lemmy.world 9 points 1 week ago

If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.

It also means that someone observed you when you were doing "secret" things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

I would think it is very different. Unless you're only thinking about the psychological effect on the viewer.

[–] BombOmOm@lemmy.world 8 points 1 week ago* (last edited 1 week ago) (1 children)

Taking secret nude pictures of someone is quite a bit different than....not taking nude pictures of them.

It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

[–] atomicorange@lemmy.world 2 points 1 week ago (1 children)

How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

[–] BombOmOm@lemmy.world 4 points 1 week ago* (last edited 1 week ago) (1 children)

It's absolutely sexual harassment.

But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

[–] atomicorange@lemmy.world 10 points 1 week ago

Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn” people wouldn’t be so hesitant to call this what it is.

[–] lka1988@lemmy.dbzer0.com 6 points 1 week ago* (last edited 1 week ago)

Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child's identity.

Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material... CSHAM, or maybe just CSAM, you know, to remember it more easily.

[–] lath@lemmy.world 3 points 1 week ago

There's a thing that happened in the past. I'm not sure it's still happening; there's been no news about it lately. It was called "glamour modeling," I think, or an extension of it.

Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.

Nothing untoward directly happened to the children. They weren't physically abused. They were treated as regular fashion models. And yet, it's still CSAM. Why? Because of the intention behind making those pictures.

The intention to exploit.

[–] LadyAutumn@lemmy.blahaj.zone 24 points 1 week ago* (last edited 1 week ago) (4 children)

Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

[–] FishFace@lemmy.world 15 points 1 week ago (1 children)

It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

This is different because it's easier. It's not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

[–] LadyAutumn@lemmy.blahaj.zone 2 points 1 week ago (19 children)

It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. The number of teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kinds of long-term psychological harm that is caused by being exploited in this way. It was also exploitative and fucked up when it was done in Photoshop; this is many orders of magnitude more sophisticated and accessible.

You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

[–] rottingleaf@lemmy.world 4 points 1 week ago (2 children)

> Can you stop trying to find a silver lining in the sexual exploitation of teenage girls?

Can you please use words by their meaning?

Also, I'll have to be blunt: every human has their own sexuality, with their own level of "drive," so to say, and their own dreams.

And it's absolutely normal to dream of other people. Including sexually. Including those who don't like you. Not only men do that, too. There are no thought crimes.

So by talking about that being easier or harder, you are not making any argument at all.

However, as I said elsewhere, the actions that really harm people should be classified legally and addressed, like sharing such material. But not as making child pornography, because it isn't, and not as sexual exploitation, because it isn't that either.

It's just that your few posts I've seen in this thread seem to say that certain kinds of thought should be illegal, and that's absolute bullshit. And laws shouldn't be made based on such emotions.

[–] atomicorange@lemmy.world 11 points 1 week ago

I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.

[–] jjlinux@lemmy.ml 2 points 1 week ago* (last edited 1 week ago) (1 children)

~~"thought crime"? And you have the balls to talk about using words "by their meaning"?~~

This is a concrete action with a product to show for it, not a thought, and it impacts someone's life negatively without their consent, with potentially devastating consequences for the victim. ~~So, can you please use words by their meaning?~~

Edit: I jumped the gun when I read "thought crime", effectively disregarding the context. As such, I'm scratching the parts of my comment that don't apply, and leaving the ones that do apply (not necessarily to the post I was replying to, but to the whole thread).

[–] rottingleaf@lemmy.world 5 points 1 week ago* (last edited 1 week ago) (1 children)

The author of those comments wrote a few times about what, in their opinion, happens in the heads of others and how that should be prevented, or something.

Can you please stop interpreting my words exactly the way you like? That's not worth a gram of horse shit.

[–] jjlinux@lemmy.ml 2 points 1 week ago (2 children)

Yes I can, more so after your clarification. I must have misread it the first time. Sorry.

[–] General_Effort@lemmy.world 7 points 1 week ago (3 children)

Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.

If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

The implicit message here is simply harmful to girls and women.

That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.

[–] LadyAutumn@lemmy.blahaj.zone 10 points 1 week ago (1 children)

This is just apologia for the sexual commodification and exploitation of girls and women. There literally is no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean "boys and men can turn all women into personal masturbation aids". This ENFORCES patriarchy and subjugation of women. It literally teaches girls that their bodies do not belong to them, that it's totally understandable for boys to strip them of humanity itself and turn them into sex dolls.

[–] General_Effort@lemmy.world 4 points 1 week ago

The most deepfaked women are certainly actresses or musicians: attractive people who appear on screens and are known by much of the population.

In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that's one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can't be free either.

At no point have I claimed that anyone is being liberated here. I do not know what will happen. I'm just pointing out how your message is harmful.

[–] rottingleaf@lemmy.world 7 points 1 week ago (1 children)

> This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but these are logically not the same.

In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.

> If the person in the image is underage, then it should be classified as child pornography.

No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.

> If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, the subject of some laws. There are no new fundamental legal entities involved.

> Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.

There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.

Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.

[–] Lv_InSaNe_vL@lemmy.world 3 points 1 week ago (1 children)

Hey so, at least in the US, drawings can absolutely be considered CSAM

[–] rottingleaf@lemmy.world 2 points 1 week ago (1 children)

Well, US laws are all bullshit anyway, so makes sense

[–] Lv_InSaNe_vL@lemmy.world 2 points 1 week ago (1 children)

Normally yeah, but why would you want to draw sexual pictures of children?

[–] rottingleaf@lemmy.world 2 points 1 week ago (6 children)

Suppose I'm a teenager attracted to people my age. Or suppose I'm medically a pedophile, which is not a crime, and then I would need that.

In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".

[–] atomicorange@lemmy.world 4 points 1 week ago

Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

[–] lurch@sh.itjust.works 9 points 1 week ago* (last edited 1 week ago) (1 children)

I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be a deepfake. There were cases where a surfacing dirty pic was used for blackmail, ruined someone's career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past soon.

[–] wewbull@feddit.uk 10 points 1 week ago (2 children)

That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.

[–] Adderbox76@lemmy.ca 7 points 1 week ago

Sure. That might end up being a socially healthy place for adults to end up.

But it will never work that way for young teens. Their brains aren't done baking yet. They don't have the emotional maturity to understand that enough to be "okay with it because it's just a fake".

That's why we protect kids rather than just telling them "hey it's okay...it's only a fake."

[–] BombOmOm@lemmy.world 2 points 1 week ago

Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.

[–] Adderbox76@lemmy.ca 4 points 1 week ago (1 children)

I'm not even going to begin describing all the ways that what you just said is fucked up.

I'll just point out that online deepfake technology is FAR more accessible to the average 13-year-old to use on their peers than "porno mags" were in our day.

You want to compare taking your 13-year-old classmate's photo off of Facebook, running it through an AI, and creating photo-realistic adult content featuring them in five seconds, to getting your dad's skin mag from under his mattress when he's not home, cutting your classmate's face out of a yearbook, taping it on, sneaking THAT into the computer lab at school so you can photocopy it and pass it around in homeroom, and then putting the skin mag BACK under the mattress before your dad finds out?

Is that right... is THAT what you're trying to say? Are those the two things that you're trying to say are equivalent?

[–] SheeEttin@lemmy.zip 14 points 1 week ago

Yes, we all know it's fucked up. The point is that we don't need a new class of laws just because it's harassment and bullying ✨with AI✨.
