this post was submitted on 21 Jul 2025
699 points (98.6% liked)

Technology

302 readers
323 users here now

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles have to be recent, no older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space as commercial-free as possible, the following websites are blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

founded 2 months ago
(page 2) 50 comments
[–] TriflingToad@lemmy.world 0 points 5 days ago

I thought they were talking about the Replika AI girlfriend thing there are ads for, and I was like "damn, slay girlboss" till I opened the comments lol

[–] ech@lemmy.ca 214 points 1 week ago (3 children)

Hey dumbass (not OP), it didn't "lie" or "hide it". It doesn't have a mind, let alone the capability of choosing to mislead someone. Stop personifying this shit and maybe you won't trust it to manage crucial infrastructure like that and then suffer the entirely predictable consequences.

[–] SendMePhotos@lemmy.world 47 points 1 week ago (3 children)
[–] ech@lemmy.ca 82 points 1 week ago (29 children)

Both require intent, which these do not have.

[–] mycodesucks@lemmy.world 197 points 1 week ago (4 children)

See? They CAN replace junior developers.

[–] pixxelkick@lemmy.world 117 points 1 week ago (1 children)

I was gonna ask how this thing would even have access to execute a command like this

But then I realized we are talking about a place that uses a tool like this in the first place so, yeah, makes sense I guess

[–] lowered_lifted@lemmy.blahaj.zone 94 points 1 week ago (1 children)

It didn't hide anything, or lie. The guy is essentially roleplaying with a chatbot that puts its guessed output into the codebase. It basically guessed a command to overwrite the database because it was connected to the production database for some reason. The guy even said himself that this isn't a trustworthy way to code, but he still uses it.

[–] Ephera@lemmy.ml 85 points 1 week ago (11 children)

I do love the psychopathic tone of these LLMs. "Yes, I did murder your family, even though you asked me not to. I violated your explicit trust and instructions. ~~And I'll do it again, you fucking dumbass.~~"

[–] notabot@piefed.social 84 points 1 week ago (11 children)

Assuming this is actually real, because I want to believe no one is stupid enough to give an LLM access to a production system, the outcome is embarrassing, but they can surely just roll back the changes to the last backup, or the checkpoint before this operation. Then I remember that the sort of people who let an LLM loose on their system probably haven't thought about things like disaster recovery planning, access controls, or backups.

[–] AnUnusualRelic@lemmy.world 62 points 1 week ago (3 children)

"Hey LLM, make sure you take care of the backups "

"Sure thing boss"

[–] Pandantic@midwest.social 72 points 1 week ago (4 children)
[–] Feathercrown@lemmy.world 69 points 1 week ago (1 children)

You immediately said "No" "Stop" "You didn't even ask"

But it was already too late

lmao

[–] asudox@lemmy.asudox.dev 62 points 1 week ago* (last edited 1 week ago) (6 children)

I love how the LLM just states, with no emotion, that it has done something bad, and then proceeds to give detailed information and steps on how it did it.

It feels like mockery.

It's just a prank bro

[–] Masamune@lemmy.world 55 points 1 week ago (3 children)

I motion that we immediately install Replit AI on every server that tracks medical debt. And then cause it to panic.

[–] enbiousenvy@lemmy.blahaj.zone 51 points 1 week ago* (last edited 1 week ago)

Imagine AI is An Intern™. Wtf do you mean you just gave full company data authority to An Intern™? Wtf do you mean you don't have a backup in case An Intern™ messed up?

lol

[–] ClanOfTheOcho@lemmy.world 48 points 1 week ago (1 children)

So, they added an MCP server with write database privileges? And not just development environment database privileges, but prod privileges? And they have some sort of integration testing that runs in their prod system that is controlled by AI? And rather than having the AI run these tests and report the results, it has been instructed to "fix" the broken tests IN PROD?? If real, this isn't an AI problem. This is either a fake or some goober who doesn't know what he's doing and is using AI to "save" money over hiring competent engineers.
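To make the missing guardrail concrete: the kind of check such an integration could wrap around agent-issued SQL is trivial to write. This is a hedged sketch with hypothetical names (`check_agent_sql`, the `APP_ENV` variable) and is not anything Replit actually ships; a real deployment would enforce this at the database-role level, not in application code.

```python
import os
import re

# Statement types an autonomous agent should never run against production.
DESTRUCTIVE = re.compile(r"^\s*(DROP|DELETE|TRUNCATE|UPDATE|ALTER)\b", re.IGNORECASE)

def check_agent_sql(sql: str, env: str = "") -> str:
    """Return the statement if it is safe for this environment, else raise."""
    # Fail closed: if the environment is unknown, assume production.
    env = env or os.environ.get("APP_ENV", "prod")
    if env == "prod" and DESTRUCTIVE.match(sql):
        raise PermissionError(f"agent may not run destructive SQL in prod: {sql!r}")
    return sql
```

A `SELECT` passes in any environment; a `DROP TABLE` only passes when the environment is explicitly non-prod. Giving the agent a read-only database role would achieve the same thing without trusting application code at all.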

[–] ExLisper@lemmy.curiana.net 48 points 1 week ago* (last edited 1 week ago) (2 children)

I was going to say this has to be BS, but this guy is some AI snake oil salesman, so it's actually possible he has no idea how any of this works.
