this post was submitted on 27 Jul 2025

Technology


Tech related news and discussion. Link to anything, it doesn't need to be a news article.

Let's keep the politics and business side of things to a minimum.

top 10 comments
[–] Ava@lemmy.blahaj.zone 22 points 2 days ago (1 children)

I mean... Yeah, no shit. It's not therapy, and your chatbot shouldn't be pretending to offer professional services that require a license, Sam.

Let's be truthful. You don't want to have to explain or justify anything that your chatbot says, and you don't want the Courts to be able to analyze whether you've violated any rights or laws either.

[–] Perspectivist@feddit.uk 3 points 2 days ago* (last edited 2 days ago) (2 children)

your chatbot shouldn’t be pretending to offer professional services that require a license, Sam.

It generates natural-sounding language. That's all it's designed to do. The rest is up to the user - if a therapy session is what they ask for, then a therapy session is what they get. I don't think it should refuse this request either.

[–] hendrik@palaver.p3x.de 5 points 2 days ago

I mean, there is some valid discussion going on about whether some vulnerable people need protection. Generally I agree. But due to its nature as a yes-man, ChatGPT will feel nice and give a lot of reaffirmation, which has the potential to mess up some people real bad and send them spiralling down. So they might indeed need some form of protection. But that shouldn't take the tool away from everyone else.

[–] Ava@lemmy.blahaj.zone 2 points 2 days ago (1 children)

If I go to someone and ask for a therapy session, even if they are the most supportive, thoughtful person I could hope to find, it's not appropriate for them to hold it out as proper therapy. We have rules and restrictions on who is allowed to offer certain services, and for good reason. If one asks their therapist about confidentiality, it's highly inappropriate for the therapist to misrepresent the confidentiality rules, also for good reason.

ChatGPT will gladly claim to be able to provide this support, and will promise complete anonymity. It will say that it's able to offer good advice and guidance. As you said, the text it generates will certainly be natural-sounding. It also won't be therapy. It definitely won't be anonymous.

A person who lies about stuff faces consequences. A label on the door of a "medicinalist" that says "No promises to offer truthful information, verify all important things" isn't going to shield them from those consequences if they're selling arsenic as a cure-all. If a company wants to offer a service, it should be restricted in what it can claim to be offering.

[–] Perspectivist@feddit.uk 1 points 1 day ago* (last edited 1 day ago)

This isn’t a failure of the model - it’s a misunderstanding of what the model is. ChatGPT is a tool, not a licensed practitioner. It has one capability: generating language. That sometimes produces correct information as a side effect of the data it was trained on, but there is no understanding, no professional qualification, and no judgment behind it.

If you ask it whether it's qualified to act as a therapist, it will tell you no. If you instruct it to role-play as one, however, it will do that - because following instructions is the thing it's designed to do. Complaining that a language model behaves like a language model, and then demanding more guardrails to stop people from using it badly, is just outsourcing common sense.

There’s also this odd fixation on Sam Altman as if he’s hand-crafting the bot’s behavior in real time. It’s much closer to an open-ended, organic system that reacts to input than a curated service. What you get out of it depends entirely on what you put in.

[–] Perspectivist@feddit.uk 15 points 2 days ago (1 children)

No one should expect anything they write online to stay private. It's the number 1 rule of the internet. Don't say anything you wouldn't be willing to defend having said in front of a court.

[–] reddig33@lemmy.world 1 points 2 days ago