this post was submitted on 22 Jul 2025
19 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

See our twin at Reddit

top 8 comments
[–] scruiser@awful.systems 8 points 19 hours ago* (last edited 19 hours ago) (1 children)

Some of the comments are, uh, really telling:

The main effects of the sort of “AI Safety/Alignment” movement Eliezer was crucial in popularizing have been OpenAI, which Eliezer says was catastrophic, and funding for “AI Safety/Alignment” professionals, whom Eliezer believes to predominantly be dishonest grifters. This doesn't seem at all like what he or his sincere supporters thought they were trying to do.

The irony is completely lost on them.

I wasn't sure what you meant here, where two guesses are "the models/appeal in Death with Dignity are basically accurate, but should prompt a deeper 'what went wrong with LW or MIRI's collective past thinking and decisionmaking?'" and "the models/appeals in Death with Dignity are suspicious or wrong, and we should be halt-melting-catching-fire about the fact that Eliezer is saying them?"

The OP replies that they meant the former... but the latter is the better answer: Death with Dignity is kind of a big reveal of a lot of flaws with Eliezer and MIRI. To recap, Eliezer basically concluded that since he couldn't solve AI alignment, no one could, and everyone is going to die. It is like a microcosm of Eliezer's ego and approach to problem solving.

"Trigger the audience into figuring out what went wrong with MIRI's collective past thinking and decision-making" would be a strange purpose from a post written by the founder of MIRI, its key decision-maker, and a long-time proponent of secrecy in how the organization should relate to outsiders (or even how members inside the organization should relate to other members of MIRI).

Yeah, no shit, secrecy is bad for scientific inquiry and open and honest reflection on failings.

...You know, if I actually believed in the whole AGI doom scenario (and bought into Eliezer's self-hype), I would be even more pissed at him and sneer even harder. He basically set himself up as a critical savior to mankind, one of the only people clear-sighted enough to see the real dangers and the most important question... and then he totally failed to deliver. Not only that, he created the very hype that would trigger the creation of the unaligned AGI he promised to prevent!

[–] Soyweiser@awful.systems 6 points 8 hours ago

The irony is completely lost on them.

So not only has Yud failed to properly align AI, he also failed to align the AI aligners. Time to burn down the sequences and start over.

[–] swlabr@awful.systems 15 points 1 day ago (2 children)

with like 15 links for context/back references, this is basically impenetrable. splendid find

[–] zogwarg@awful.systems 10 points 16 hours ago

It can't be that stupid, you haven't read the sequences hard enough.

[–] Soyweiser@awful.systems 8 points 1 day ago* (last edited 1 day ago) (2 children)

Followed one of the links, which gave me a very "is yud ok?" feeling: https://www.lesswrong.com/posts/PCfaLLtuxes6Jk4S2/fighting-a-rearguard-action-against-the-truth

(Talking about the way it is written, not the actual content, which seems to be 'looks like I was wrong, need to rethink some things and check my assumptions' but using thousands of words that make it sound like he is having some weird mental episode).

E:

One is Zack Davis documenting endorsement of anti-epistemology ... [Links to articles removed] ... to placate trans ideology even many important transgender Rationality community members overtly reject

Euh, ok. Always good when people bring up 'trans ideology'.

[–] blakestacey@awful.systems 10 points 1 day ago (1 children)
[–] froztbyte@awful.systems 7 points 1 day ago

Always good when people bring up ‘trans ideology’.

I wish I'd stored the link somewhere, but some years back I hit one of those Twitter Expository Threads about (one of?) the roots of this shit, the read focusing on "the nation question" (and then breaking it out both in historical context and the various ways it got latterly repurposed by ghouls)

and every single fucking time I see some fucker post about "the $x ideology", I flash back to that thread, and the knowledge gained from it