
[–] Live_your_lives@lemmy.world 5 points 9 hours ago (3 children)

Can anyone give me a tldr on how LLMs are linked to mental health breakdowns?

[–] Schmoo@slrpnk.net 3 points 6 hours ago

There was a spate of news articles recently about AI feeding people's delusions to the point that they started believing they were the chosen one, with several pieces emphasising that it was happening to people with no prior history of mental health issues. It was supposedly linked to an update to ChatGPT that made it even more sycophantic than usual.

It's a mix of sensationalist media and moral panic with just a hint of truth, IMO. If people can fall for scams, pseudoscience, and conspiracy theories, then they can definitely get pushed off the deep end by an AI saying "yes, and" to their every thought.

[–] Muaddib@sopuli.xyz 6 points 8 hours ago

People get attached to a machine with no emotions, no morals, and no ability to think before it speaks. A 14-year-old boy killed himself because his AI girlfriend told him to. An LLM doesn't know what's okay and what's abuse. A person's heart doesn't know it either.