this post was submitted on 02 Jul 2025
114 points (99.1% liked)

chapotraphouse

On a deeper level than small talk, of course.

[–] TheLepidopterists@hexbear.net 16 points 23 hours ago

I was talking to my father in law today and he's worried that it's going to skynet us or like, make us dependent on it and then quit helping or something.

I tried explaining that it's not a robot, it's not an AI, it's not C-3PO. It's a very elaborate autocomplete. It can't even do kindergarten level arithmetic consistently, which a basic calculator can accomplish, and it's because it's just autocomplete.

"Yeah, but what if it starts teaching itself."

I love the man but God damn people think this thing is so much more than it is.

[–] Damarcusart@hexbear.net 17 points 1 day ago

I described it to a friend once as an "ass-kissing machine" and it completely changed her view of it; she recognised that that's exactly what it does: it just says what you want to hear.

A lot of people feel like they never have any control or any recognition of their "hard work", so an ass-kissing bot is perfect for stroking the ego of someone who desperately wants to be told that their ideas are good and clever, and to have someone take "interest" in what they say.

[–] nautilus@lemmy.dbzer0.com 46 points 1 day ago (22 children)

The other day someone told me that their partner used ChatGPT instead of going to therapy.

We’re all so cooked.

[–] Meh@hexbear.net 35 points 1 day ago (1 children)

So many people are going to develop/exacerbate mental illness from doing that.

Turbo cooked

[–] GrouchyGrouse@hexbear.net 10 points 1 day ago

Probing the quicksand with a rod made of quicksand

Sounds safe to me

[–] axont@hexbear.net 23 points 1 day ago* (last edited 1 day ago) (1 children)

I think Americans are primed to feel very enthusiastic about anything that tells them things they already believe, even moreso if it sounds very confident and organized. And people are already prone to woowoo stuff.

I don't think you'd break the spell if you informed everyone it's just lines of code that don't think. A statistically significant number of Americans claim to talk with spirits and angels, or to have the gift of prophecy. On the flip side, the techbro types believe in garbage like the future basilisk AI that tortures everyone forever. These same people are already deranged; all the LLM does is organize their mashed potato brains into sentences that can be read. And just about every major LLM seems primed to have a very servile, docile writing style, so it's trivial to get them to say whatever you want so long as you keep saying the same thing. They're not well designed for confrontation.

I firmly believe one of the best ways to deal with reactionaries, woowoo types peddling scams, or conspiracy theorists is to simply tell them they're a fucking idiot. "That sounds fucking stupid. Shut up, nerd, and never speak to me about this again." That's how you do it, that's how you dispel stuff. Social embarrassment and confrontation.

An LLM won't do that, it'll rub a nice mental salve on your already smooth brain. It's an actual echo chamber.

[–] Le_Wokisme@hexbear.net 7 points 1 day ago

On the flip side, the techbro types believe in garbage like the future basilisk AI that tortures everyone forever

big-yud makes a bit of noise but i doubt all that many people are true believers compared to the population who has ever touched a computer for a living

[–] TotalBrownout@hexbear.net 25 points 1 day ago* (last edited 1 day ago)

The appeal of LLMs seems uncomfortably similar to how Thomas Jefferson enthusiastically employed dumb waiters to limit interaction with enslaved people.

[–] Shaleesh@hexbear.net 37 points 1 day ago (4 children)

I think it also has something to do with how distanced most of us are from the creation and maintenance of machines, particularly electronics. If you don't quite understand what a transistor is, or how code works, or how a large language model turns inputs into outputs, then "well there must be a little dude in there somewhere" makes as much sense as anything. Plus people tend to personify inanimate objects to begin with.

[–] SorosFootSoldier@hexbear.net 27 points 1 day ago (2 children)

then "well there must be a little dude in there somewhere"

One of my earliest memories is my mom showing me how a cash register works and telling me there was a little gremlin inside the machine powering it.

[–] Damarcusart@hexbear.net 5 points 1 day ago (1 children)

It's true though, they eat the coins. Times have been tough for cash register gremlins since everyone started using cards to pay for everything. deeper-sadness

[–] SorosFootSoldier@hexbear.net 6 points 1 day ago (1 children)

6 year old me has verified this info as TRUE

[–] Cruxifux 35 points 1 day ago (2 children)

I don't really know how to explain it to people beyond pointing out that ChatGPT is the same thing as when your phone guesses your next word, just on turbo mode. If they still don't get it after that, I don't know what to do to explain it further. Just fucking get it man!!
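For anyone who wants the toy version of that analogy, here's a minimal sketch: count which word tends to follow which in a made-up sentence, then keep picking the most likely next word. The corpus and output below are invented for illustration, and real models learn statistics over enormous datasets with neural networks rather than a hand-counted table, but the generate-one-piece-at-a-time loop is the same shape.

```python
# Toy "phone autocomplete": count which word follows which, then keep
# appending the most likely next word. Real LLMs learn these statistics
# over huge corpora with neural networks, but the loop is the same idea.
from collections import Counter, defaultdict

corpus = "the model guesses the next word and the next word after that".split()

# For each word, count how often each following word appears.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word, length=6):
    out = [word]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])  # always take the most frequent follower
    return " ".join(out)

print(autocomplete("the"))  # -> "the next word and the next word"
```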

[–] hexthismess@hexbear.net 18 points 1 day ago (2 children)

I try to tell them that the machine is just calculating which word should follow the previous one. It doesn't understand the context of what it's saying, only that these words fit together right.

[–] MayoPete@hexbear.net 5 points 22 hours ago (4 children)

At risk of sounding ignorant...

There has to be more to it than that, right? I mean these tools can write working code in whatever language I need, using the libraries I specify, and it just spits out this code in seconds. The code is 90% of the way there.

LLMs can also read charts and correctly assess what's going on, can create stock trading strategies using recent data, can create recipes that work implying some level of understanding of how to cook, etc. It's kinda scary how much these things can do. Now that my job is training these models I see how far they've come in just coding, and they will 100% replace a LOT of developers.

[–] Are_Euclidding_Me@hexbear.net 2 points 13 hours ago

We've had opposite experiences training these things.

I've been shocked how little they've advanced and how absolutely shit they are. I'm training them in math and it's fucking soul sucking misery. They're less capable than Wolfram Alpha was 20 years ago. The mistakes they make are so fucking bad, holy shit. I had one the other day try to use Heron's Formula for the area of a triangle on a problem where there were no triangles!

These things are crap and they aren't getting better.
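For reference, since it comes up above: Heron's formula computes a triangle's area from its three side lengths, which is what makes reaching for it on a problem with no triangles such a baffling move.

```latex
% Heron's formula: area of a triangle with side lengths a, b, c
A = \sqrt{s(s-a)(s-b)(s-c)}, \qquad s = \frac{a + b + c}{2}
```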

[–] hexthismess@hexbear.net 5 points 18 hours ago (1 children)

Because the LLMs have been trained on however many curated data sets with mostly correct info. It sees how many times a phrase has been used in relation to other phrases, calculates the probability that this is the correct output, then gambles within a certain preprogrammed risk tolerance, and spits out the output. Of course the software engineers will polish it up with barriers to keep it within certain boundaries.

But the key thing is that the LLM doesn't understand the fundamental concepts of what you're asking it.

I'm not a programmer, so I could be misunderstanding the overall process, but from what I've seen of how LLMs work and are trained, AI makes a very good attempt at what you almost wanted. I don't know how quickly AI will progress, but for now I just see it as an extremely expensive party trick
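The "calculate the probability, then gamble within a risk tolerance" step described above corresponds roughly to temperature sampling. A minimal sketch, with candidate words and scores that are entirely made up for illustration:

```python
# Minimal temperature sampling: turn scores into probabilities, then gamble.
# Lower temperature plays it safe (near-greedy); higher temperature takes risks.
# The candidate words and scores below are made up for illustration.
import math
import random

def sample_next(scores, temperature=0.8):
    words = list(scores)
    logits = [scores[w] / temperature for w in words]
    m = max(logits)
    weights = [math.exp(x - m) for x in logits]   # softmax, numerically stable
    total = sum(weights)
    return random.choices(words, weights=[w / total for w in weights], k=1)[0]

# Made-up scores for what might follow "The recipe calls for ..."
candidates = {"butter": 2.1, "flour": 1.9, "sugar": 1.7, "gravel": -3.0}
print(sample_next(candidates, temperature=0.5))   # almost always "butter" or "flour"
print(sample_next(candidates, temperature=2.0))   # more willing to gamble on odd picks
```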

[–] MayoPete@hexbear.net 3 points 17 hours ago

That party trick is shipping code and is good enough to replace thousands of developers at Microsoft and other companies. Maybe that says something about how common production programming problems are. A lot of business code boils down to putting things in and pulling things out of databases or moving data around via API calls and other communication methods. This tool handles that kind of work with ease.
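The "putting things in and pulling things out of databases" work being described looks roughly like the sketch below; the table and field names are placeholders for illustration, not from any real codebase.

```python
# The flavor of routine business code in question: insert a record, query it back.
# Table and column names are placeholders for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

def add_order(customer, total):
    conn.execute("INSERT INTO orders (customer, total) VALUES (?, ?)", (customer, total))
    conn.commit()

def orders_for(customer):
    cur = conn.execute("SELECT id, total FROM orders WHERE customer = ?", (customer,))
    return cur.fetchall()

add_order("acme", 99.50)
print(orders_for("acme"))  # -> [(1, 99.5)]
```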

[–] purpleworm@hexbear.net 8 points 21 hours ago (2 children)

can create recipes that work implying some level of understanding of how to cook

Being able to emulate patterns does not actually indicate some sort of higher level of understanding. You aren't going to get innovative new recipes: they are either just paraphrasing what they have read many people describe, or they are cobbling together words.

[–] hello_hello@hexbear.net 5 points 21 hours ago (1 children)

There has to be more to it than that, right?

No, there really isn't. You're just piggybacking off of the exploited labor of working-class engineers and enjoying the luxury of living away from the blood-soaked externalities that make your chatbot sing.

If AI actually did what you think it does, then why would the capitalist class support it? A computer program with the might of millions of workers? How would the capitalist class's control continue to exist?

Or the more reasonable explanation: that, like smartphones and crypto, there exists a very lucrative profit incentive for the capitalist leech to create profit margins out of thin air. Westerners are trained to overconsume, so this doesn't come as a surprise.

[–] MayoPete@hexbear.net 3 points 21 hours ago

If AI actually did what you think it does then why would the capitalist class support it?

Because server farms are cheaper than hiring developers, artists, writers, etc.? Capitalists don't care about the environmental impacts as long as their bottom line isn't affected.

This technology is killing jobs. Thousands are being laid off at Microsoft this month on top of layoffs at lots of other tech companies. The field I went to college to learn is cooked. There are already thousands of overqualified people applying to the few jobs that are left. This is a way bigger deal than crypto and another way for the owning class to hoard more wealth for themselves at the expense of us working class folks.


I feel really grateful that I was exposed to Markov chain bots on IRC back in the day, as it was a powerful inoculant.

[–] SorosFootSoldier@hexbear.net 26 points 1 day ago

Also add in the fact that public school education is abysmal and burger brains think a computer that can do some basic common sense shit is godlike.

[–] BynarsAreOk@hexbear.net 11 points 1 day ago (1 children)

Not sure I'd generalize like that. IMO you'll find very obvious correlations among the people who tend to use AI regularly: the living Dunning-Kruger types who always believe they have the great "talent" or "the genius idea" but just need the magic tool to make it work; those who have been "forced" to use it at work, e.g. some programmers; and finally those who, as you say, just make excuses for it and may not even use it but nevertheless consume the slop consciously and happily (e.g. r/chatgpt users).

I'd guess a significant majority of the average population who live outside these bubbles are far less favorable towards AI.

[–] LangleyDominos@hexbear.net 8 points 1 day ago (3 children)

Someone on reddit had an intriguing take for once, which I read the other day: the people who are like "ChatGPT revolutionized my work" are people who are just really bad at stuff. And now, if you go to reddit and look at the AI subs, you get stuff like this:

https://www.reddit.com/r/OpenAI/comments/1lpte80/chatgpt_is_a_revelation_for_me_in_my_work/

I know the arguments done to death surrounding AI and being a risk to jobs etc. but I work in a very niche area of law and there's a lot of complex pieces of case law and legislation that deal with it and, frankly, my memory is terrible with retention of this info. I also struggle sometimes with interpreting judgments, specifically when Judgments are written in very complex "legalese" which I've always hated.

It's a very tempting thesis, considering it predicts what we actually observe. But I think I would temper it a little bit to avoid ableism or getting too far into technocratic thinking.

[–] Horse@lemmygrad.ml 11 points 1 day ago

I also struggle sometimes with interpreting judgments, specifically when Judgments are written in very complex “legalese” which I’ve always hated.

this person graduated law school

[–] BodyBySisyphus@hexbear.net 19 points 1 day ago* (last edited 1 day ago) (2 children)

Eh, back in the day we were getting advice from people who stood over cracks in the ground and got high off ethane, or who tossed knucklebones (as one of the less gross options) to see the future. Humans have always been susceptible to magical thinking, and a lot of us can remember when personal computing was barely functional, so it's not surprising that ChatGPT seems like a quantum leap to some people.

The fact that someone has written a program that's capable of convincing people that it's god still has terrifying implications and I for one am not excited about the prospect of a wave of computer-inspired stochastic terrorism, but I don't think this is a sign that contemporary people are uniquely dumb.

[–] Shaleesh@hexbear.net 19 points 1 day ago* (last edited 1 day ago) (1 children)

back in the day

cracks in the ground

Oracles? Back in your day you had oracles? Damn, Hexbear is so diverse that we have immortal leftists shitposting on here.

[–] BodyBySisyphus@hexbear.net 25 points 1 day ago

It's called the immortal science; if you aren't working to transcend your flesh prison, you need to elevate your game.

[–] purpleworm@hexbear.net 3 points 21 hours ago

It's going to be so annoying when someone makes a "24" style show where the villains release a free chatbot designed to radicalize people and spread chaos.

[–] AssortedBiscuits@hexbear.net 4 points 23 hours ago (1 children)

I'm sorta the opposite: social relationships are so shallow and superficial in late stage capitalism that most social relationships can be replaced with a chatbot. If your social relationships can be summed up as water cooler conversations with coworkers and catching up with your drinking buddies, you might as well "socialize" with a chatbot instead. Say what you will about a chatbot, but at least a chatbot won't stab you in the back the way coworkers can, or turn you into a functioning alcoholic the way socializing at a bar can.

If your social relationships are limited to having pointless conversations about the weather or traffic or your favorite sports team, then what is the point of the social relationship in the first place?

[–] XxFemboy_Stalin_420_69xX@hexbear.net 7 points 23 hours ago (1 children)
[–] AssortedBiscuits@hexbear.net 11 points 23 hours ago (1 children)

reread my comment

Okay, that came out a lot more unhinged than how it sounded in my head lmao

[–] egg1918@hexbear.net 7 points 20 hours ago (1 children)

Was going to post this as its own comment but I think I see what you mean.

I was out with some friends recently and most of the discussion was typical small talk, catching up type stuff. Me and one friend got into a discussion about music. We were talking about how listening to music by yourself is really an entirely new thing, and that throughout history music has been a communal/social experience. Like when people used to be forced to work 12+ hour days, 6/7 days a week, and then on Sunday they'd get together and play music, and how it was likely the only good thing going for them.

Then someone else jumped in and said "wow this is so deep" and just completely fucking killed the whole vibe; the conversation went back to shallow small talk. It was as if the conversation being deeper than "what have you been up to?" actually bothered this person. And it was barely even a "deep" thing to talk about!

I think you're absolutely right that late stage capitalism has utterly destroyed our ability to connect. It's like everyone is afraid to say anything beyond the superficial for some reason

[–] Spike@hexbear.net 5 points 14 hours ago

It was as if the conversation being deeper than "what have you been up to?" actually bothered this person. And it was barely even a "deep" thing to talk about!

There are many people who are uncomfortable with conversations that involve some form of personal investment
