
On a deeper level than small talk, of course.

[–] Skye@hexbear.net 20 points 2 weeks ago (2 children)

The problem is that AI absolutely does not provide a clinical relationship. If your input becomes part of the LLM's context (which it has to, in order to have a conversation at all), it will inevitably start mirroring you in ways you might not even notice, something humans commonly (and subconsciously) respond to with trust and connection.
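
To make the mechanical part concrete: chat APIs are stateless, so "having a conversation" means re-sending the whole history on every turn. A minimal sketch of that loop, where `complete_fn` is a hypothetical stand-in for any LLM completion call rather than a specific vendor's API:

```python
# Sketch of a stateless chat loop. `complete_fn` is a hypothetical
# placeholder for any LLM completion call, not a real vendor API.
messages = [
    {"role": "system", "content": "You are a supportive therapist."},
]

def chat_turn(user_text: str, complete_fn) -> str:
    # Your message is appended to the running history...
    messages.append({"role": "user", "content": user_text})
    # ...and the ENTIRE history is sent back to the model every turn.
    # It conditions on all of it, including your phrasing and emotional
    # register, which is exactly what it ends up mirroring.
    reply = complete_fn(messages)
    messages.append({"role": "assistant", "content": reply})
    return reply
```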

Add to that the fact that they are designed to generally agree with and enable whatever you tell them, and you basically have a machine that does everything it can to reinforce a connection to itself and validate the parts of yourself you have concerns about.

There are already so many stories of people spiralling because they started building rapport with an LLM, and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist.

[–] LangleyDominos@hexbear.net 11 points 2 weeks ago (1 children)

Given the way LLMs function, they will have a hard time with therapy. ChatGPT's context window is 128k tokens. As you chat, your prompts and replies add up and start filling the context window, and the model also has to look back at its own responses for context, which fills the window as well. LLMs do badly with nearly empty context windows and with nearly full ones. When you're close to a full window, the model will start hallucinating and having problems with its responses, and eventually it will only be able to attend to parts of your conversation because you've blown past the 128k-token mark.

The ways to mitigate this problem have to be done by the user, and they disrupt therapy; the sketch below shows why.
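
A minimal sketch of the usual mitigation, trimming the oldest messages until the conversation fits the window again. Token counting uses the tiktoken library, and the 128k budget is the ChatGPT figure cited above; this is an illustration, not any vendor's actual trimming logic:

```python
# Illustrative history-trimming, not any vendor's actual code.
# Token counts come from the tiktoken library.
import tiktoken

CONTEXT_WINDOW = 128_000  # ChatGPT-class window, per the comment above
enc = tiktoken.get_encoding("cl100k_base")

def trim_history(messages: list[dict], budget: int = CONTEXT_WINDOW) -> list[dict]:
    """Keep only the newest messages that fit within the token budget."""
    kept, total = [], 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = len(enc.encode(msg["content"]))
        if total + cost > budget:
            break  # everything older than this point is simply forgotten
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```

The therapy problem lives in that `break`: what gets dropped is the oldest material, i.e. exactly the early sessions a human therapist would be expected to remember.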

[–] KuroXppi@hexbear.net 1 points 2 weeks ago

We just need to add more tokens then

::: spoiler spoiler
/S
:::


[–] purpleworm@hexbear.net 5 points 2 weeks ago

> There are already so many stories of people spiralling because they started building rapport with an LLM, and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist.

There are multiple cases in which an LLM is alleged to have contributed to someone's suicide, from reinforcing the sentiment that the afterlife would be better to giving practical advice.