this post was submitted on 07 Jul 2025
132 points (100.0% liked)

Programming

all 16 comments
[–] realitista@lemmy.world 3 points 3 hours ago

This is how ChatGPT makes a feature request.

[–] oantolin@discuss.online 47 points 8 hours ago (2 children)

Normally people use ChatGPT to vibe code; this is the first instance I'm aware of where ChatGPT used people to vibe code!

[–] daniskarma@lemmy.dbzer0.com 10 points 8 hours ago

ChatGPT is just enforcing the 4th law of robotics.

[–] zod000@lemmy.ml 4 points 7 hours ago (1 children)

I actually had this happen more than once because of a coworker who uses Copilot; it would help him with code and call functions that don't exist. He has since finally stopped trusting Copilot code without more testing.

[–] JWBananas@lemmy.world 4 points 5 hours ago

Worse, they typically do exist, just in training data made up of others' code. Sometimes if you put the function names into GitHub search, you can find them.

[–] JackbyDev@programming.dev 21 points 9 hours ago (1 children)

Fascinating! Because this notation is already used by another tool (and possibly more), it might not be as silly as it sounds. From the headline it sounds like some really weird API was added to something.

Happy to see this sort of optimism in the wake of AI causing people's programs to get bad reviews because the AI thinks they can do things they can't.

Thanks for the share. Maybe I'm looking too far into it, or just in one of those moods, but this really is oddly inspirational to me.

[–] limer@lemmy.dbzer0.com 7 points 9 hours ago

It is inspiring to me

[–] palordrolap@fedia.io 9 points 9 hours ago (3 children)

TBH, this is barely any different from marketing promising that a product will have a feature that the development team only finds out about later, purely by accident, when upper management asks about it.

[–] L0rdMathias@sh.itjust.works 16 points 7 hours ago

It's worse. This is like the restaurant across the street, a company completely unaffiliated with me and even my industry, running a lunch promotion that advertises something for my business which I did not approve and do not sell.

If a human being did this, it would be so obviously illegal it wouldn't even have to go to court. It's blatant fraud. Lying on this level is so unbelievably brazen I would go so far as to call it uncivilized wild-animal behavior, frowned upon in almost every society since the dawn of civilization, long before modern copyright and property laws.

[–] cecilkorik@lemmy.ca 6 points 8 hours ago (1 children)

It's much different, because you can fire your salespeople for failing to consult the engineering team, promising shit that is impossible, damaging your brand and reputation, and providing little-to-no return on investment.

The biggest difference is that you can't fire ChatGPT (as much as I desperately wish we could).

[–] palordrolap@fedia.io 1 points 8 hours ago

OK, yeah, you can't control a third party's promises (or hallucinations), but the boss isn't going to fire someone from sales and/or marketing. They'll fire the developer for failing to deliver.

[–] Kowowow@lemmy.ca 2 points 8 hours ago

The Krusty Krab doesn't sell pizza... till they do.

[–] YourAvgMortal@lemmy.world 3 points 8 hours ago (3 children)

ChatGPT is just fancy autocomplete, so it probably got the notation from somewhere else; it's not really capable of inventing new stuff on its own (unless it hallucinates). It would be interesting to ask it where it saw that notation, since the service didn't support it before, but in a way you could say it's a standard form of notation (just from a different service).

[–] LesserAbe@lemmy.world 5 points 5 hours ago

You know it's not strictly autocompleting sentences that previously existed, right? It's predicting words that it anticipates should follow others. I've had it suggest code libraries that don't exist, and you'll hear about people going to the library to ask for books that were never written but are attributed to real authors, because they sound like something those authors would write.
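As an aside, hallucinated dependencies are cheap to sanity-check before trusting suggested code. A minimal Python sketch using only the standard library (the second module name is a made-up stand-in for something an assistant might invent):

```python
import importlib.util


def module_exists(name: str) -> bool:
    """Return True if a module can be located, without importing it."""
    return importlib.util.find_spec(name) is not None


# "json" ships with Python; the other name is a hypothetical
# hallucinated library.
print(module_exists("json"))                 # True
print(module_exists("totally_made_up_lib"))  # False
```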

Tab music notation is super common, and although it wasn't supported by this particular service before, you can see how it might be the sort of request people make, so ChatGPT combined the two.

[–] Paradox@lemdro.id 2 points 5 hours ago* (last edited 5 hours ago)

IIRC Wikipedia supports it for tab notation.

Personally I much prefer lilypond. I wonder if this tool supports lilypond. Would love to have a workflow to scan sheet music and get lilypond out the other end.