swlabr

joined 2 years ago
[–] swlabr@awful.systems 7 points 1 month ago (1 children)

Incredible work as always, self

[–] swlabr@awful.systems 15 points 1 month ago* (last edited 1 month ago) (5 children)
[–] swlabr@awful.systems 13 points 1 month ago

that being said… AI does literally help some people. for many things. google search was my favourite AI tool 25 years ago, but it’s definitely not right now.

lol

[–] swlabr@awful.systems 7 points 1 month ago (1 children)

Lol, I’m a decision theorist because I had to decide whether I should take a shit or shave first today.

What's your P(doodoo)?

[–] swlabr@awful.systems 16 points 1 month ago (4 children)

time to donate my money to a different wiki that only has the noblest of intentions, wikifeet (jk)

[–] swlabr@awful.systems 3 points 1 month ago

Children really shouldn’t be left with the impression that chatbots are some type of alternative person instead of ass-kissing google replacements that occasionally get some code right

I agree! I'm more thinking of the case where a kid might overhear what they think is a phone call when it's actually someone being mean to Siri or whatever. I mean, there are more options than "be nice to digital entities" if we're trying to teach children to be good humans, don't get me wrong. I don't give a shit about the non-feelings of the LLMs.

[–] swlabr@awful.systems 27 points 1 month ago (7 children)

hey dawg if you want to be anti-capitalist that’s great, but please interrogate yourself on who exactly is developing LLMs and who is running their PR campaigns before you start simping for AI and pretending like a hallucination engine is a helpful tool in general and specifically to help people understand complex topics where precision and nuance are needed and definitely not fucking hallucinations. Please be serious and for real

[–] swlabr@awful.systems 11 points 1 month ago (6 children)

Very off topic: The only plausible reason I’ve heard to be “nice” to LLMs/virtual assistants etc. is if you are being observed by a child or someone else impressionable. This is to model good behaviour if/when they ask someone a question or for help. But also you shouldn’t be using those things anyhoo.

[–] swlabr@awful.systems 10 points 1 month ago (1 children)

decidedly unsexy. gerard is finally beating the allegations

[–] swlabr@awful.systems 27 points 1 month ago

Making LLMs safe for mentally ill people is very difficult

Arguably, they can never be made "safe" for anyone, in the sense that presenting hallucinations as truth should be considered unsafe.

[–] swlabr@awful.systems 17 points 1 month ago

For you, the day Disney graced your independent research lab was the most important day of your life. But for Disney, it was Tuesday.

-M. Bison (the M stands for Mickey Mouse)

[–] swlabr@awful.systems 10 points 1 month ago (1 children)

If you want to hang out, you've gotta take her out, tech brain

If you want to get down, down on the ground, tech brain

AGI, say goodbye, ROI, tech brain

-Eric “Slophand” Slopton
