this post was submitted on 01 Apr 2025
256 points (97.1% liked)

Technology

[–] jaschen@lemm.ee -4 points 2 weeks ago (1 children)

I'm no pedo, but what you do in your own home that hurts nobody is your own business.

[–] reseller_pledge609@lemmy.dbzer0.com 6 points 2 weeks ago (3 children)

Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.

So regardless of whether there's direct harm, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people "get used to" it.

[–] Mnemnosyne@sh.itjust.works 43 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal adult porn, and then it can put those together.

This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.

[–] reseller_pledge609@lemmy.dbzer0.com 0 points 2 weeks ago (1 children)

Fair enough. I still think it shouldn't be allowed though.

[–] figjam@midwest.social 10 points 2 weeks ago

Why? Not pressing but just curious what the logic is

[–] Kusimulkku@lemm.ee 7 points 2 weeks ago (1 children)

I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn in the data, it knows what kids look like, and it can merge the two. I think that works for anything the AI knows about: make this resemble that.

[–] reseller_pledge609@lemmy.dbzer0.com 2 points 2 weeks ago (2 children)

That's fair, but I still think it shouldn't be accepted or allowed.

[–] Kusimulkku@lemm.ee 4 points 2 weeks ago

It seems pretty understandable that companies wouldn't allow it, it's more that if it is illegal (like in some places) then that gets into really sketchy territory imo.

[–] jaschen@lemm.ee 2 points 2 weeks ago

I agree it shouldn't be accepted, but I disagree on being allowed. I think it should be allowed because it doesn't hurt anyone.

[–] MonkderVierte@lemmy.ml 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

> needs to be stopped before it slips and gets worse because people "get used to" it.

Ah, right, I almost forgot the killer games rhetoric.

[–] reseller_pledge609@lemmy.dbzer0.com 0 points 2 weeks ago (1 children)

I also don't agree with the killer games thing, but humans are very adaptable as a species.

Normally that's a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So if AI is being used for something like this and it's being reported on, isn't it possible that people might slowly get desensitized to it over time?

[–] MonkderVierte@lemmy.ml 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

But what if pedophiles in therapy are less likely to commit a crime if they have access to such porn? Even better, then, if it can be AI generated, no?

[–] jaschen@lemm.ee 1 points 1 week ago

Japan is a country where drawn CP is legal. It's available in physical stores and online. Yet Japan has much lower rates of actual child sexual abuse than most developed countries in the world.