this post was submitted on 29 Jun 2025
493 points (95.9% liked)

Technology


A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

top 50 comments
[–] buddascrayon@lemmy.world 1 points 25 minutes ago* (last edited 25 minutes ago)

CDC data from 2022 indicated that more than one in five U.S. adults under the age of 45 experienced symptoms of mental distress.

Must be the lack of personnel. Couldn't have anything to do with the global insecurity of rising inflation and low wage jobs coupled with the skyrocketing housing costs. Not to mention the whole "the earth is steadily getting hotter and extreme weather events are happening more and more frequently."

Yeah, let's invest in more AI that will fuck over the planet even more with colossal energy requirements and not even bother with making people more financially and socially secure.

[–] HugeNerd@lemmy.ca 1 points 1 hour ago

Buy more. Buy more now.

[–] minorkeys@lemmy.world 22 points 9 hours ago (1 children)

So it's somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don't feel safe to open up anywhere. And no, it's not their own fault.

[–] 0x0@lemmy.zip 4 points 7 hours ago

And no, it’s not their own fault.

Of course it is, men are cool targets to hate, get with the program.

[–] Rooty@lemmy.world 39 points 15 hours ago* (last edited 15 hours ago) (2 children)

I see a lot of people in this thread reacting with open hostility and derailment every time men's issues are mentioned. Have you tried not being a part of the problem?

[–] rekabis@lemmy.ca 13 points 9 hours ago* (last edited 9 hours ago)

Allowing men’s issues to even be addressed risks giving legitimacy to the fact that these issues even exist. And if they exist, men can no longer be that evil monolith that exists only to be torn down and used as the cause for whatever is wrong with the world.

After all, the zero-sum game must be properly reinforced with an appropriate evil that cannot be allowed to have any weaknesses or redeeming attributes.

[–] interdimensionalmeme@lemmy.ml 3 points 12 hours ago

There are people like that for anything related to AI.

Combine that with men's issues and this is going to be crack for all of those people.

[–] baatliwala@lemmy.world 7 points 11 hours ago (1 children)

Unironically the "Men will do X instead of going to therapy" meme

[–] ronigami@lemmy.world 8 points 10 hours ago (2 children)

Even therapists are suffering these days. It’s just more challenging than it’s ever been to gaslight clients into believing their concerns about the world aren’t objectively true but are instead the symptom of an internal struggle.

[–] HugeNerd@lemmy.ca 1 points 1 hour ago

...which are always conveniently treated by drugs!

[–] admin@lemmy.today 0 points 6 hours ago

I wonder how many therapists end up gaslighting and depressing themselves by unintentionally gaslighting their patients.

[–] Vreyan31@reddthat.com 12 points 15 hours ago (3 children)

I think we may be (re)discovering the appeal of monotheistic religions, and why they hew to patriarchy.

On average, men desperately need more mental health resources. But, on average, they are not comfortable building that with other men, and it often isn't appropriate or effective to lean on their female significant other (if a straight man).

So - enter the primary description of 'God'. Can listen any time but will always forgive, is super masculine but won't emasculate you, and has never told another soul what you are thinking.

AI is always available and is unlikely to emasculate anyone, but that third item... Well, we'll see where this goes.

[–] Bravo@eviltoast.org 5 points 9 hours ago

You've basically just described "confession". You go into a little box designed to make it as difficult as possible for the priest to identify you, you talk about all the ways you feel like you're a bad person, and the priest talks to you for a while about it, then gives you some actionable items to make amends and once you've done them God officially forgives you. The whole concept of confession is designed to allow people to let go of their regrets and live in the now. It's actually quite clever as a bit of societal design. If modern priests had psychotherapy degrees then everyone in the world would have access to free therapy - unfortunately they wouldn't be very useful for LGBT+ people.

[–] whostosay@lemmy.world 2 points 9 hours ago* (last edited 9 hours ago) (1 children)

Lol, see where it goes? If you think these AI companies, which are very publicly bleeding money, aren't selling your data out for pennies on the dollar, you're just keeping your head in the sand.

I've been asking Google Gemini weird and stupid trivia questions just to burn the world down faster.

[–] interdimensionalmeme@lemmy.ml 5 points 12 hours ago

Omg self host it omg

[–] stoly@lemmy.world 26 points 20 hours ago (6 children)

Part of me is ok with this in that any avenue to get mental health resources can be better than nothing. What worries me is that people will use ChatGPT for this sort of thing and these models will not be good help.

[–] MrMcGasion@lemmy.world 17 points 18 hours ago (1 children)

I'll admit I tried talking to a local deepseek about a minor mental health issue one night when I just didn't want to wake up/bother my friends. Broke the AI within about 6 prompts, to the point where no matter what I said it would repeat the same answer word-for-word about going for walks and eating better. Honestly, breaking the AI and laughing at it did more for my mental health than anything anyone could have said, but I'm an AI hater. I wouldn't recommend anyone in real need use AI for mental health advice.
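
For anyone curious what "talking to a local deepseek" looks like in practice: a common setup is a locally hosted model served behind a small HTTP API, for example via Ollama. The sketch below is a minimal, hypothetical example assuming an Ollama server on localhost with a `deepseek-r1:7b` model already pulled; the endpoint, port, and model tag are assumptions, not details from the comment.

```python
import requests

# Assumes a local Ollama server (default port 11434) with a deepseek model
# already pulled, e.g. `ollama pull deepseek-r1:7b`. These are assumptions.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "deepseek-r1:7b"

history = []  # keep the conversation so the model sees prior turns

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "messages": history, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    reply = resp.json()["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("I'm having a rough night and don't want to wake my friends."))
```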

[–] HugeNerd@lemmy.ca 2 points 1 hour ago

I'd say make a grilled cheese sandwich with quality Gruyere and Cheddar and take a nap after.

[–] zarkanian@sh.itjust.works 8 points 16 hours ago (1 children)

AI will reinforce delusional thinking. This is definitely not good.

[–] Val@lemmy.blahaj.zone 10 points 17 hours ago (1 children)

AI is what cracked my egg shell, fucking wild...

[–] Wiz@midwest.social 7 points 14 hours ago

Well that's gotta be an interesting story! Don't leave us hanging!

[–] vivalapivo@lemmy.today 38 points 22 hours ago (2 children)

Like... yeah?

Tried to open up to a girlfriend about a sensitive topic - she got the ick.

Tried to make an appointment with a psychiatrist - got a very hateful rejection because of my place of birth.

Damn, even when I try to uplift a friend, I use phrases like 'you got this before, you'll get it now'.

I don't know how to be a man, mentally

[–] BackgrndNoize@lemmy.world 18 points 20 hours ago (1 children)

Getting rejected because of your place of birth is grounds for getting that doctor's license revoked; find out which body governs doctors in your location and file a complaint.

[–] vivalapivo@lemmy.today 13 points 19 hours ago (2 children)

Haha, not every place is in the US. Hopefully, I won't face this kind of treatment as I do not live in that shit hole of a country

[–] 0x0@lemmy.zip 1 points 7 hours ago

not every place is in the US

Thank Sithrak for that, jeez...

[–] BackgrndNoize@lemmy.world 6 points 16 hours ago (1 children)

I never said it was the US. Do rules and regulations governing doctors' behavior not exist in your country?

[–] vivalapivo@lemmy.today 2 points 9 hours ago

IDK, really. As I said, I left the country where this psychiatrist lives.

[–] Snowies@lemmy.zip 8 points 17 hours ago (1 children)

Become a rich jacked sociopath.

That’s the most manly thing you can do, apparently.

[–] vivalapivo@lemmy.today 4 points 9 hours ago

Sorry, I will pursue happiness instead

[–] shplane@lemmy.world 2 points 13 hours ago

Much easier if you just bury your feelings deep, deep down. No repercussions whatsoever. The occasional psychic breakdown, but that’s normal.

[–] RedFrank24@lemmy.world 66 points 1 day ago (10 children)

I can kinda understand the appeal. An AI isn't gonna judge you, an AI isn't gonna leave a mean comment or tell you to get over it and man up. It's giving an unnerving amount of personal information to corporations, but I can sympathise with the thoughts these men are having.

[–] ronigami@lemmy.world 4 points 10 hours ago

It might even, gasp, offer solutions.

[–] plyth@feddit.org 1 points 11 hours ago* (last edited 11 hours ago)

An AI isn’t gonna judge you,

Guess what is happening with that chat history.

[–] fellowmortal@lemmy.dbzer0.com 18 points 21 hours ago* (last edited 21 hours ago)

Just a note to say that the very first chatbot, Eliza, created in the 1960s, was a Rogerian therapist. I'm sure I remember a quote saying the author was surprised that people opened up to it. I doubt anyone working in AI or chat technology wouldn't know about Eliza, so this is probably not a surprise to the industry... but maybe I am that old. [edits: facts/spelling etc]
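
For context on how Eliza worked: it used simple keyword pattern matching to reflect the user's own statements back as questions, in the Rogerian style. A toy sketch of that reflection technique (an illustration only, not Weizenbaum's original 1966 script) could look like this:

```python
import re

# A few Eliza-style rules: match a keyword pattern and reflect it back as a question.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

# Swap first/second person so the reflection reads naturally.
SWAPS = {"i": "you", "me": "you", "my": "your", "am": "are", "your": "my", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(SWAPS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT

print(respond("I feel like nobody ever listens to me"))
# -> Why do you feel like nobody ever listens to you?
```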
