[–] Perspectivist@feddit.uk 12 points 1 day ago (1 children)

It literally isn’t. It’s a Large Language Model - an AI system designed to generate natural-sounding language.

The fact that it gets any answers right at all isn't because it knows things - it's because it's been trained on data that contains a lot of correct information. Its answers are based on statistical probabilities, and since it operates on tokens rather than individual characters, it's physically incapable of looking at a word and counting the letters in it.
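To make the token point concrete, here's a minimal sketch using OpenAI's tiktoken library (an assumption for illustration - other models use different tokenizers): the model only ever receives integer token IDs, not letters.

```python
# Minimal sketch, assuming the tiktoken library (pip install tiktoken).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/4-era models

word = "strawberry"
token_ids = enc.encode(word)

# The model sees these integer IDs, not characters, so "count the r's"
# has no direct mapping to what it actually processes.
print(token_ids)
print([enc.decode([t]) for t in token_ids])
```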

[–] jj4211@lemmy.world 5 points 1 day ago

Also, it fires off the prompt (or various processed versions of it) to various backends and then "stuffs" the prompt with the results before the model ever sees it.

So you ask "What team won a sports game last night?" and the LLM ultimately gets a prompt like:

"Summarize the following sports results to indicate a winning team and statistics: : ESPN: The Lakers held off the Bulls to finish with a 119 to 117 victory FoxSports: The Bulls lose to the Lakers by two points ...."

So you end up asking a question, your question may be sent to a traditional search engine, and the LLM summarizes the traditional results - usually doing an OK job of it, except it can mess up by mixing the results together in nonsensical ways. For example, one source cites Canada as having the most lakes by a certain criterion, with around 300,000 lakes, another result says Alaska has 3 million lakes, and it ends up telling the user "Canada has significantly more lakes than the United States, with 300,000 while the USA has 200,000, with Alaska contributing the most to the US with 3 million lakes".
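A rough sketch of that "prompt stuffing" flow is below. The search function and the prompt wording are hypothetical placeholders for illustration, not any particular product's API:

```python
# Rough sketch of stuffing search results into an LLM prompt.
# search_web and the instruction text are made-up stand-ins.

def search_web(query: str) -> list[str]:
    # Stand-in for a real search backend; returns raw result snippets.
    return [
        "ESPN: The Lakers held off the Bulls to finish with a 119 to 117 victory",
        "FoxSports: The Bulls lose to the Lakers by two points",
    ]

def build_stuffed_prompt(user_question: str) -> str:
    snippets = search_web(user_question)
    # The user's question is rewritten into an instruction plus the raw
    # search results; this combined text is what the LLM actually receives.
    return (
        "Summarize the following sports results to indicate a winning team "
        "and statistics:\n" + "\n".join(snippets)
    )

print(build_stuffed_prompt("What team won a sports game last night?"))
```

If the snippets disagree (300,000 lakes vs. 3 million lakes), the model has no ground truth to arbitrate between them, which is how the nonsensical blends above come about.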