this post was submitted on 21 Jul 2025
589 points (98.7% liked)

Technology
[–] zerofk@lemmy.zip 103 points 2 days ago* (last edited 2 days ago) (3 children)

in which the service admitted to “a catastrophic error of judgement”

It’s fancy text completion - it does not have judgement.

The way he talks about it shows he still doesn’t understand that. It doesn’t matter that you tell it something in ALL CAPS, because that is no different from any other text to the model.

[–] rockerface@lemmy.cafe 45 points 2 days ago

Well, there was a catastrophic error of judgement. It was made by whichever human thought it was okay to let an LLM work on a production codebase.

[–] jj4211@lemmy.world 6 points 1 day ago

judgement

Yeah, it admitted to an error in judgement because the prompter clearly declared it so.

Generally, LLMs will make whatever statement about what happened that you want them to make. If you tell one that things went fantastically, it will agree. If you tell it that things went terribly, it will parrot that sentiment back.

That’s what seems to make it so dangerous for some people’s mental health: a text generator that wants to agree with whatever you say, but without copying you verbatim, so it gives the illusion of another thought process agreeing with them. Meanwhile, another person starting from the exact same model can be having a dialogue that violently disagrees with the first person. It’s an echo chamber.