this post was submitted on 14 Jul 2025
Hacker News
Posts from the RSS feed of Hacker News. The feed sometimes contains ads and posts that have been removed by the mod team at HN.
Presumably bullshit.
And yet:
This kind of nonsense is what's already feasible with an AI everyone agrees is "not really intelligent." No model thinks about what code does, but you can ask what code does, and it will try to tell you. You can also describe what the code is supposed to be doing, and it will try to make the code do that thing. Looping that might turn GetAnAlbumCover into GetAnalBumCover. But the result should return an anal bum cover.
"What's supposed to be happening here?", "Is that what's happening here?", and "What would make that happen here?" are all questions a neural network could answer. Any functionality can be approximated. Any functionality. LLMs almost kinda sorta do it already, and they're only approximating "What's the next word?" It is fucking bonkers that these models are so flexible.
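For what it's worth, the ask/check/fix loop being described is easy to sketch. This is a toy illustration, not any real LLM API: `ask_model` is a hypothetical stand-in, stubbed here with canned responses so the loop itself actually runs.

```python
# Sketch of the loop: "What's supposed to happen?" (the spec),
# "Is that what's happening?" (the check), "What would make it happen?"
# (re-asking the model). `ask_model` is a placeholder for a real LLM call.

def ask_model(prompt: str, attempt: int) -> str:
    """Hypothetical LLM query. Stubbed: returns canned candidate code,
    a buggy version first, then a corrected one."""
    candidates = [
        "def get_album_cover(albums, i):\n    return albums[i + 1]",  # off by one
        "def get_album_cover(albums, i):\n    return albums[i]",      # fixed
    ]
    return candidates[min(attempt, len(candidates) - 1)]

def behaves_as_intended(source: str) -> bool:
    """'Is that what's happening here?' -- run the candidate against a spec test."""
    namespace: dict = {}
    exec(source, namespace)
    fn = namespace["get_album_cover"]
    try:
        return fn(["a", "b", "c"], 0) == "a" and fn(["a", "b", "c"], 2) == "c"
    except Exception:
        return False

def repair_loop(spec: str, max_attempts: int = 5):
    """Ask for code matching the spec, test it, and re-ask until it passes."""
    for attempt in range(max_attempts):
        candidate = ask_model(f"Write code so that: {spec}", attempt)
        if behaves_as_intended(candidate):
            return candidate
    return None

fixed = repair_loop("get_album_cover(albums, i) returns the i-th cover")
```

With the stub, the first candidate fails the check and the second passes, so the loop converges on attempt two. A real version would feed the failing test output back into the next prompt.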