this post was submitted on 21 Jul 2025
699 points (98.6% liked)

Technology


Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles must be recent, no older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space as commercial-free as possible, the following websites are blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

[–] RedPandaRaider@feddit.org 0 points 6 days ago* (last edited 6 days ago) (2 children)

Lying does not require intent. All it requires is to know an objective truth and say something that contradicts or conceals it.

As far as any LLM is concerned, the data they're trained on and other data they're later fed is fact. Mimicking human behaviour such as lying still makes it lying.

[–] kayohtie@pawb.social 13 points 6 days ago (1 child)

But that still requires intent, because "knowing" in the way that you or I "know" things is fundamentally different from merely holding pattern-matching vectors that happen to include truthful arrangements of words. It doesn't know "sky is blue". It simply contains indices that frequently arrange the words "sky is blue".

Research papers that overlook this are still personifying a series of mathematical matrices as if it actually knows any concepts.

That's what the person you're replying to means. These machines don't know goddamn anything.
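The distinction can be sketched with a toy example (the corpus and names here are invented, and this is a drastic simplification of any real model): a simple frequency table over word pairs will happily emit "sky is blue" without representing the concept of a sky, a colour, or truth at all.

```python
from collections import Counter, defaultdict

# Toy "training" corpus: no facts are stored, only counts of
# which word tends to follow which.
corpus = "the sky is blue . the sky is blue . the grass is green .".split()

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word(word):
    # Pick the most frequent continuation; no concept of truth involved.
    return follows[word].most_common(1)[0][0]

print(next_word("sky"))  # "is"
print(next_word("is"))   # "blue"
```

The output matches true statements only because the corpus did; feed it "the sky is green" often enough and it will "lie" with exactly the same mechanism.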

[–] ech@lemmy.ca 8 points 6 days ago

Except these algorithms don't "know" anything. They convert their input data into a framework for generating (hopefully) sensible text from literal random noise. At no point in that process is knowledge used.
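The "random noise" part can be illustrated in miniature (the scores and vocabulary below are made up, standing in for what a trained network would emit): the next token is a random draw from a probability distribution, nothing more.

```python
import math
import random

# Hypothetical next-token scores, as a trained network might emit them.
logits = {"blue": 2.0, "green": 0.5, "loud": -1.0}

def softmax(scores):
    # Turn raw scores into probabilities that sum to 1.
    exps = {w: math.exp(s) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

def sample(probs, rng):
    # A single random number selects the token: randomness in, text out.
    r = rng.random()
    acc = 0.0
    for word, p in probs.items():
        acc += p
        if r < acc:
            return word
    return word  # guard against floating-point rounding

rng = random.Random(0)
print(sample(softmax(logits), rng))
```

Different seeds yield different words from the same "knowledge-free" distribution, which is why the same prompt can produce contradictory answers.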