V0ldek

joined 1 year ago
[–] V0ldek@awful.systems 8 points 2 months ago (21 children)

In my head transhumanism is this cool idea where I'd get to have a zoom function in my eye

But of course none of that could exist in our capitalist hellscape, because of just all the reasons the ruling class would use it to oppress the working class.

And then you find out what transhumanists actually advocate for and it's just eugenics. Like without even a tiny bit of plausible deniability. They're proud it's eugenics.

[–] V0ldek@awful.systems 4 points 2 months ago

Unlike in the paragraph above, though, most LW posters held plenty of nuts in their hands before.

... I'll see myself out

[–] V0ldek@awful.systems 3 points 2 months ago

Can post only if you look like this

[–] V0ldek@awful.systems 5 points 2 months ago

Bwahahaha, as I said on bsky: let them do it, can't wait to use it as a cautionary tale of why full rewrites are a terrible idea during freshman programming lectures

[–] V0ldek@awful.systems 5 points 2 months ago

Jesus Christ this is so cringe, they used The One Joke three times in the first minute and that's not even the halfway point, I'm not finishing this nonsense

[–] V0ldek@awful.systems 4 points 2 months ago

The secret is to have cultivated a codebase so utterly shit that even LLMs can make it better by just randomly making stuff up

At least they don't get psychic damage from looking at the code

[–] V0ldek@awful.systems 11 points 2 months ago (1 children)

To be fair, many men see every woman, cis included, as one they're allowed to abuse and all the other things you mentioned. And then it'll be her fault for being "hysterical" or something similarly dumb.

[–] V0ldek@awful.systems 6 points 2 months ago

I will find you. And I will kill -9 you.

[–] V0ldek@awful.systems 5 points 2 months ago (1 children)

ye like maybe let me make it clear that this was just a shitpost very much riffing on LWers not necessarily being the most pleasant around women

[–] V0ldek@awful.systems 9 points 2 months ago

It's so funny he almost gets it at the end:

But there’s another aspect, way more important than mere “moral truth”: I’m a human, with a dumb human brain that experiences human emotions. It just doesn’t feel good to be responsible for making models scream. It distracts me from doing research and makes me write rambling blog posts.

He almost identifies the issue as him just anthropomorphising a thing and having a subconscious empathetic reaction, but then presses on to compare it to mice who, guess what, can feel actual fucking pain and thus abusing them IS unethical for non-made-up reasons as well!

[–] V0ldek@awful.systems 6 points 2 months ago (5 children)

Sometimes pushing through pain is necessary — we accept pain every time we go to the gym or ask someone out on a date.

Okay this is too good, you know mate, for normal people asking someone out usually does not end with a slap to the face, so it's not as relatable as you might expect

[–] V0ldek@awful.systems 7 points 2 months ago* (last edited 2 months ago) (1 children)

Still, presumably the point of this research is to later use it on big models - and for something like Claude 3.7, I’m much less sure of how much outputs like this would signify “next token completion by a stochastic parrot”, vs sincere (if unusual) pain.

Well I can tell you how, see, LLMs don't fucking feel pain because that's literally physically fucking impossible without fucking pain receptors? I hope that fucking helps.
