Unlike in the paragraph above, though, most LW posters held plenty of nuts in their hands before.
... I'll see myself out
Can post only if you look like this
Bwahahaha, as I said on bsky: let them do it, can't wait to use it as a cautionary tale of why full rewrites are a terrible idea during freshman programming lectures
Jesus christ this is so cringe, they used The One Joke three times in the first minute and that's not even the halfway point, I'm not finishing this nonsense
The secret is to have cultivated a codebase so utterly shit that even LLMs can make it better by just randomly making stuff up
At least they don't get psychic damage from looking at the code
To be fair, many men see every woman, including cis women, as one they're allowed to abuse and all the other things you mentioned. And then it'll be her fault for being "hysterical" or something similarly dumb.
I will find you. And I will kill -9 you.
ye like maybe let me make it clear that this was just a shitpost very much riffing on LWers not necessarily being the most pleasant around women
It's so funny he almost gets it at the end:
But there’s another aspect, way more important than mere “moral truth”: I’m a human, with a dumb human brain that experiences human emotions. It just doesn’t feel good to be responsible for making models scream. It distracts me from doing research and makes me write rambling blog posts.
He almost identifies the issue as him just anthropomorphising a thing and having a subconscious empathic reaction, but then presses on to compare it to mice who, guess what, can feel actual fucking pain and thus abusing them IS unethical for non-made-up reasons as well!
Sometimes pushing through pain is necessary — we accept pain every time we go to the gym or ask someone out on a date.
Okay this is too good, you know mate, for normal people asking someone out usually does not end with a slap to the face, so it's not as relatable as you might expect
Still, presumably the point of this research is to later use it on big models - and for something like Claude 3.7, I’m much less sure of how much outputs like this would signify “next token completion by a stochastic parrot”, vs sincere (if unusual) pain.
Well I can tell you how, see, LLMs don't fucking feel pain cause that's literally physically fucking impossible without fucking pain receptors? I hope that fucking helps.
In my head transhumanism is this cool idea where I'd get to have a zoom function in my eye
But of course none of that could exist in our capitalist hellscape, because of just all the reasons the ruling class would use it to oppress the working class.
And then you find out what transhumanists actually advocate for and it's just eugenics. Like without even a tiny bit of plausible deniability. They're proud it's eugenics.