swlabr

joined 2 years ago
[–] swlabr@awful.systems 17 points 1 month ago (8 children)

lol the corollary of this is that LLMs are incapable of producing meaningful output, you insufferable turd

[–] swlabr@awful.systems 14 points 1 month ago (1 children)

Given that the LLMs typically have a system prompt that specifies a particular tone for the output, I think pretentious is an absolutely valid and accurate word to use.

[–] swlabr@awful.systems 11 points 1 month ago

dark forest internet here we go!!!

[–] swlabr@awful.systems 9 points 1 month ago (2 children)

it's funny to me that these futurists/thought leaders/tech geniuses utterly fail to build their cult compounds while a goofy-ass cult like Scientology has blown past them completely

[–] swlabr@awful.systems 8 points 1 month ago (8 children)

network state

Great, a new stupid thing to know about. How likely is it that a bunch of people who believe they are citizens of an online state will become yet another player in the Stochastic Terrorism as a Service industry?

[–] swlabr@awful.systems 10 points 1 month ago

Oh, that's a good angle too. Prompt the LLM with "what insights does this book have about B2B sales" or something.

[–] swlabr@awful.systems 11 points 1 month ago* (last edited 1 month ago) (3 children)

take 2 minutes to think of precisely the information I need

I can’t even put into words the full nonsense of this statement. How do you think this would work? This is not how learning works. This is not how research works. This is not how anything works.

This part threw me as well. If you can think of it, why read for it? Didn’t make sense and so I stopped looking into this particular abyss until you pointed it out again.

I think the only interpretation of what this person said that approaches some level of rationality on their part is essentially a form of confirmation bias. They aren’t thinking of information that is in the text, they are thinking “I want this text to confirm X for me”, then they prompt and get what they want. LLMs are biased to be people-pleasers and will happily spin whatever hallucinated tokens the user throws at them. That’s my best guess.

That you didn’t think of the above just goes to show the failure of your feeble mind’s logic and reason to divine such a truth. Just kidding, sorta, in the sense that you can’t expect to understand an irrational thought process using rationality.

But if it’s not that I’m still thrown.

[–] swlabr@awful.systems 14 points 1 month ago

Dune

Prompts ChatGPT, skims the output because of muh ‘fishency

The Omelas Hole sure sounds like a paradise, I want to live there!

[–] swlabr@awful.systems 6 points 1 month ago

That’s reasonable, and especially achievable if you don’t use chatbots or digital assistants!

[–] swlabr@awful.systems 9 points 1 month ago

I wonder if some of those chuds think those waymos might be conscious

[–] swlabr@awful.systems 19 points 1 month ago* (last edited 1 month ago) (2 children)

what kind of semen retention scheme is this

[–] swlabr@awful.systems 8 points 1 month ago

Note to the peanut gallery: this guy knows about paperclip-maxxing but not this far more famous comic. Curious. lmfao
