Telling a writer "You probably would love this thing that spits up gibberish" is tantamount to punching yourself in the face
chapotraphouse
Having a profession centered around writing and being in favor of the gibberish machine really is telling on yourself lol
NPR "journalist"
I mean, it tracks
Marg bar NPR (death to NPR)
NPR is a result code on the Voight-Kampff test, it means No Personhood Readings
I like this place, I see it on /all/ often with some good stuff (I’ve never listened to the podcast… yet?) but this reads like a foreign language to me, or maybe I’m having a stroke?!
it's a movie reference
clip from Blade Runner (1982), based on the book Do Androids Dream of Electric Sheep? by Philip K. Dick
First of all, thank you.
Second of all, BLADE RUNNER IS BASED ON ELECTRIC SHEEP?! I loved that book and have never seen Blade Runner!
Added to my list, thanks again!
Also lawl, my last comment before this a little bit ago on another post was praising Electric Sheep. Weird coincidence.
Also again: Electric Sheep was a dope screensaver thingy when I was a stoner-ass 20s bitty
Some shitlibs have standards it seems.
the only use of ai i think is probably remotely useful is programmers using it to help write new code. not people who aren't experienced at software development mind you, they don't get much out of chatgpt, but someone that knows what they're doing using copilot to copy-paste a completely correct implementation, that seems useful. at least to people i've talked to.
It is very useful for coding because that is one of the few places where unoriginal repetitive solutions are often desirable. But even with coding you have to know what to tell the LLM to do and you have to be able to read and understand the output to make sure it works as intended.
LLMs are a useful tool for programmers to automate repetitive tasks, but they're nowhere near being able to produce usable applications by themselves. I am not worried that I'll be replaced by a robot anytime soon.
Those who should be worried about their jobs are people in places like customer support or government services directed at people who don't matter to the ruling class. In these cases the powers that be have little holding them back from replacing human interactions with significantly worse interactions with an LLM. Nobody important gives a shit if some schmuck can't cancel their cable subscription or gets their employment benefits cut because the computer had a hiccup.
IMO no, for two reasons:
- reading code is harder than writing it. If the AI writes you a standard implementation, you still have to read it to make sure it's correct. So that's more work than just doing it yourself
- AI will produce code that looks right. Since it can't understand anything, that's all it does: next most likely token == most correct-looking solution. But when the obvious solution is not the right one, you now have deceptively incorrect code, specifically and solely designed to look correct.
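A toy sketch of that "looks right but isn't" failure mode (the function and example are mine, hypothetical, not from any real model output): the plausible version passes a casual skim but fails on edge cases you'd only catch by actually reading it carefully.

```python
# Hypothetical illustration: code that *looks* correct at a glance
# but is subtly wrong.
def is_leap_year_plausible(year: int) -> bool:
    # Reads as obviously right, but misses the century rule:
    # 1900 is NOT a leap year, while 2000 is.
    return year % 4 == 0

def is_leap_year(year: int) -> bool:
    # The full Gregorian rule.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year_plausible(1900), is_leap_year(1900))  # True False
```

The bug only surfaces on inputs like 1900, which is exactly the point: verifying generated code means checking the cases the obvious-looking solution quietly gets wrong.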
I've never used Copilot myself but pair programmed with someone who used it, and it seemed like he spent more time messing with the output than it would have taken to write it himself.
More like helping programmers write e-mails to needy project managers who need a status update on that feature ticket every 11 hemiseconds
I use JetBrains "local LLM" thingy and it's good at suggesting the very obvious, trivial code that I would write anyway, so it just saves me keystrokes
It's clearly become a crutch for some programmers. I remember talking to someone who does ai research who openly admitted that most of the people in their lab couldn't code and that the outputs from chatgpt were sufficient to do their work.
"You're right, this is great! It's never been so easy to make sure I'm not just throwing up stale 'art by committee' tropes and drivel. What a time saver! Wait, you meant to actually use them?"
I always say with AI “don’t they have anything more important to automate?”
If we are told that art is silly and only a lucky few can ever make a career out of it, then why is it that automating art is top priority?
I love that Shapiro gives an example of one of the things AI is worst at doing with creative writing. AI is terrible at linking two unrelated scenes together. All AI can really do with a script is pad it with samey nonsense, it can't come up with a clever twist or a good segue.