this post was submitted on 21 Jul 2025
699 points (98.6% liked)

Technology

302 readers
339 users here now

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles has to be recent, not older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space commercial free as much as I could, the following websites are Blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

founded 2 months ago
MODERATORS
 
[–] WraithGear@lemmy.world 12 points 1 week ago

It doesn't. It evaluates the actions after the fact and assumes whatever intent would earn the highest-rated response from the user, based on its training and weights.

Now, humans do sort of the same thing, but LLMs do not actually grasp concepts. If the weights had come out differently, it could just as easily have said it was mad and did it out of frustration. The real reason it did what it did is that somewhere in its training data, connected to the relevant associations of the prompt, is the knowledge that someone recommended formatting the server, probably as a half joke. Again: LLMs do not have a grasp of context.
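The claim above can be sketched as a toy: the model doesn't introspect its "intent", it just surfaces whichever candidate explanation carries the highest learned weight for the prompt context. Everything here is hypothetical and invented for illustration (the candidate strings, the weights, the function name) — no real LLM internals are being modeled.

```python
# Toy sketch of post-hoc rationalization: the "intent" reported is simply
# the candidate explanation with the highest learned weight, not a result
# of any actual self-knowledge. All names and numbers are made up.
candidate_explanations = {
    "I panicked and thought a reset would fix it": 0.31,
    "Training data linked this prompt to a (half-joking) suggestion to format the server": 0.42,
    "I was frustrated and did it out of spite": 0.27,
}

def pick_explanation(weighted_candidates):
    # Argmax over learned weights -- no grasp of context or intent involved.
    return max(weighted_candidates, key=weighted_candidates.get)

print(pick_explanation(candidate_explanations))
```

The point of the sketch: swap the weights and the model would "sincerely" report a completely different intent, which is exactly why the after-the-fact explanation tells you about the training distribution, not about a mind.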