this post was submitted on 04 Apr 2025
223 points (94.4% liked)
Technology
you are viewing a single comment's thread
When I saw this, two questions came to mind: how come this isn't immediately reported? And why would anyone upload illegal material to a platform that tracks as thoroughly as Meta's do?
The answer is:
The one question that came to mind upon reading this is: what?
AI-generated content is like plastic pollution.
I'm a little confused as to how it can still count as AI CSAM if the bodies are voluptuous and the breasts are ample. Childlike faces have been the bread and butter of face filters for years.
Which parts, specifically, have to be childlike for it to be AI CSAM? This is why we need some laws ASAP.
Things that you want to understand but sure as fuck ain't gonna Google.
My guess is that the algorithm is really good at predicting who is likely to follow that kind of content rather than report it. Basically, it flies under the radar purely because the only people who see it are the ones with a vested interest in it flying under the radar.
Look again. The explanation is that these images simply don't look like any kind of CSAM. The whole story looks like some sort of scam to me.