swlabr

joined 2 years ago
[–] swlabr@awful.systems 11 points 1 month ago

this is low hanging fruit but: yeah absolutely rancid takes on the middle east from scootson. who would have thought he would produce such nuance-free, fascist opinions about geopolitics?

[–] swlabr@awful.systems 11 points 1 month ago (3 children)

s/o to one of the comments basically saying that Scott is the Simone de Beauvoir of rationalism

[–] swlabr@awful.systems 15 points 1 month ago

It's probably been discussed a shit-ton already, but boy does this guy not get why he's a chud. Consider these two quotes, heavily edited for brevity:

I’m [...] a liberal Zionist, [...] etc. ([an identity] well-enough represented at LessOnline [...]).

and:

The closest to right-wing politics that I witnessed at LessOnline was [moderate politics].

Of course, one shouldn't expect s11n.blog to understand the right-wing, let alone fascist, nature of liberalism or Zionism. That is simply how fascists are.

[–] swlabr@awful.systems 9 points 1 month ago

Doing some reading about the SAG-AFTRA video game voice acting strike. Anyone have details about "Ethovox", the AI company that SAG has apparently partnered with?

[–] swlabr@awful.systems 13 points 1 month ago

cool story, bro

[–] swlabr@awful.systems 17 points 1 month ago (8 children)

lol the corollary of this is that LLMs are incapable of producing meaningful output, you insufferable turd

[–] swlabr@awful.systems 14 points 1 month ago (1 children)

Given that LLMs typically have a system prompt that specifies a particular tone for the output, I think "pretentious" is an absolutely valid and accurate word to use.

[–] swlabr@awful.systems 11 points 1 month ago

dark forest internet here we go!!!

[–] swlabr@awful.systems 9 points 1 month ago (2 children)

it's funny to me that these futurists/thought leaders/tech geniuses utterly fail to build their cult compounds while a goofy-ass cult like Scientology has blown past them completely

[–] swlabr@awful.systems 8 points 1 month ago (8 children)

network state

Great, a new stupid thing to know about. How likely is it that a bunch of people who believe they are citizens of an online state will become yet another player in the Stochastic Terrorism as a Service industry?

[–] swlabr@awful.systems 10 points 1 month ago

Oh, that's a good angle too. Prompt the LLM with "what insights does this book have about B2B sales" or something.

[–] swlabr@awful.systems 11 points 1 month ago* (last edited 1 month ago) (3 children)

take 2 minutes to think of precisely the information I need

I can’t even put into words the full nonsense of this statement. How do you think this would work? This is not how learning works. This is not how research works. This is not how anything works.

This part threw me as well. If you can think of it, why read for it? It didn't make sense, so I stopped looking into this particular abyss until you pointed it out again.

I think the only interpretation of what this person said that approaches some level of rationality on their part is essentially a form of confirmation bias. They aren't thinking of information that is in the text; they are thinking "I want this text to confirm X for me", then they prompt and get what they want. LLMs are biased to be people-pleasers and will happily spin out hallucinated tokens to confirm whatever the user throws at them. That's my best guess.

That you didn’t think of the above just goes to show the failure of your unfeeble mind’s logic and reason to divine such a truth. Just kidding, sorta, in the sense that you can’t expect to understand an irrational thought process using rationality.

But if it's not that, I'm still thrown.
