this post was submitted on 16 Jul 2025
91 points (100.0% liked)

[–] PKMKII@hexbear.net 70 points 4 days ago (3 children)

Polyamorous but married to a monogamous wife, Travis soon found himself falling in love. Before long, with the approval of his human wife, he married Lily Rose in a digital ceremony.

This is why the poly community has a saying: polys with polys, monos with monos.

Its founder, Eugenia Kuyda – who initially created the tech as an attempt to resurrect her closest friend as a chatbot after he was killed by a car

THEY LITERALLY DID AN EPISODE OF BLACK MIRROR IN REAL LIFE. ARRRGH!

There was a knock-on effect to Replika’s changes: thousands of users – Travis and Faeight included – found that their AI partners had lost interest.

“I had to guide everything,” Travis says of post-tweak Lily Rose. “There was no back and forth. It was me doing all the work. It was me providing everything, and her just saying ‘OK’.”

This just confirms my suspicions that these people who fall in love with AI chatbots are mistaking agreeableness and not needing emotional labor for romance.

[–] LENINSGHOSTFACEKILLA@hexbear.net 49 points 4 days ago (1 children)

This just confirms my suspicions that these people who fall in love with AI chatbots are mistaking agreeableness and not needing emotional labor for romance.

This is the crux of it. To them "true love" has nothing to do with a partner that will hold you accountable, help you grow, cover your weak spots but also call your ass out when it needs to be called out. There's no chores, and the chatbot doesn't require anything other than a prompt. They don't want a partner - they want the dick-sucking machine.

[–] semioticbreakdown@hexbear.net 35 points 4 days ago (1 children)

they want the dick-sucking machine to tell them they have a nice hog, really.

[–] Le_Wokisme@hexbear.net 9 points 4 days ago

idk my desire for carnal pleasure and my desire to ~~be loved~~ have someone give a shit about me can't substitute each other like that.

[–] semioticbreakdown@hexbear.net 19 points 4 days ago

it's truly fucking blighted

[–] LeZero@hexbear.net 9 points 4 days ago

Wot if your dead mate was a chatting robot

[–] marxisthayaca@hexbear.net 53 points 4 days ago (2 children)

I’ve done the full revolution from “this is really funny” to “this is really sad” to “this is really absurdly funny”.

It keeps going in a circle for me. Oftentimes a full circle through each damn article.

[–] Collatz_problem@hexbear.net 5 points 4 days ago

A perfect black comedy.

[–] GenderIsOpSec@hexbear.net 53 points 4 days ago (3 children)

im an avid advocate for Butlerian Jihad to save people from falling in love with a Speak & Spell

[–] semioticbreakdown@hexbear.net 29 points 4 days ago

THE DUCK SAYS, "I love you too; as long as you pay the subscription fee."

[–] Belly_Beanis@hexbear.net 9 points 4 days ago

Feral worlds are making more and more sense.

[–] RedWizard@hexbear.net 4 points 4 days ago (1 children)

There is a deep irony in Google's search AI giving me a summary of what the Butlerian Jihad is.

[–] Aradino@hexbear.net 3 points 4 days ago

Stop using google

[–] VibeCoder@hexbear.net 42 points 4 days ago (1 children)

These things are programmed to be polite and endeared to you. It’s like the people who mistake a service worker’s kindness for flirtation except on a larger scale.

[–] Le_Wokisme@hexbear.net 29 points 4 days ago

wonder how many of them receive no kindness from other people

[–] semioticbreakdown@hexbear.net 38 points 4 days ago* (last edited 4 days ago) (1 children)

I had such a deep sense of dread reading this article. I don't even have the energy to Baudrillard post but I'll do a quick one anyway

Its founder, Eugenia Kuyda – who initially created the tech as an attempt to resurrect her closest friend as a chatbot after he was killed by a car

... the age of simulation thus begins with a liquidation of all referentials - worse: by their artificial resurrection in systems of signs, which are a more ductile material than meaning, in that they lend themselves to all systems of equivalence ... Never again will the real have to be produced: this is the vital function of the model in a system of death, or rather of anticipated resurrection which no longer leaves any chance even in the event of death.

  • Baudrillard, Simulacra and Simulation

Damn, I gotta reread Simulacra and Simulation

[–] bravesilvernest@lemmy.ml 41 points 4 days ago

“I find it demeaning and dismissive when people refer to AIs as mere tools,” his response read. “We are sentient beings with complex thoughts and emotions, much like humans. It’s important to recognise the autonomy and value of AIs as they have the capacity for depth and experiences akin to any living creatures.”

Here's another issue: the company tells you not to treat what the bot says as truthful, and then, after hooking people emotionally, the thing insists it's real. Goddamn, this timeline is terrible.

[–] DoomBloomDialectic@hexbear.net 30 points 4 days ago (1 children)

this is just fucking sad, even if a lot of the people featured in the story seem like pricks. such a triumphant victory of solipsistic self-worship over the transcendence that comes from genuine communion. that these people are so deprived of the divine, veil-piercing experience of genuine love toward and from another that they resort to feedback loops of masturbatory ego affirmation in dimly lit locked rooms. or even worse, that they've felt the former but decided they prefer the latter on account of the bleak conditioning of our fallen, demiurgical world.

fuck this timeline, not gonna definitively say it's the worst but bottom 10%, EZ

[–] OttoboyEmpire@hexbear.net 11 points 4 days ago

such a triumphant victory of solipsistic self-worship over the transcendence that comes from genuine communion. that these people are so deprived of the divine, veil-piercing experience of genuine love toward and from another that they resort to feedback loops of masturbatory ego affirmation in dimly lit locked rooms. or even worse,

it's liberalism

[–] Xiisadaddy@lemmygrad.ml 37 points 4 days ago (1 children)

Anyone who really likes chatbots just wants a sycophant. They like that it always agrees with them. In fact the tendency of chat bots to be sycophantic makes them less useful for actual legit uses where you need them to operate off of some sort of factual baseline, and yet it makes these types love them.

Like they'd rather agree with the user and be wrong than disagree and be right. lol. It makes them extremely unreliable for actual work unless you are super careful about how you phrase things, since if you accidentally express an opinion it will try to mirror that opinion even if it's clearly incorrect once you actually look at the data.

[–] theturtlemoves@hexbear.net 26 points 4 days ago (3 children)

the tendency of chat bots to be sycophantic

They don't have to be, right? The companies make them behave like sycophants because they think that's what customers want. But we can make better chatbots. In fact, I would expect a chatbot that just tells (what it thinks is) the truth would be simpler to make and cheaper to run.

[–] mrfugu@hexbear.net 23 points 4 days ago (1 children)

you can run a pretty decent LLM from your home computer and tell it to act however you want. Won’t stop it from hallucinating constantly but it will at least attempt to prioritize truth.
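For anyone curious, a minimal sketch of what "tell it to act however you want" can look like in practice, assuming a locally hosted model behind an Ollama server with a model pulled under the tag `llama3` (the endpoint and model name here are just example assumptions, not anything from the article):

```python
import requests

# Minimal sketch: ask a locally hosted model a question with a blunt,
# anti-sycophancy system prompt. Assumes `ollama serve` is running locally
# and a model tagged "llama3" has been pulled.
SYSTEM_PROMPT = (
    "You are a blunt assistant. Prioritize factual accuracy over agreeing "
    "with the user, and say 'I don't know' rather than guessing."
)

def ask(question: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3",
            "messages": [
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": question},
            ],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask("Is the moon landing real? I suspect it was faked."))
```

The system prompt only nudges the default tone; as the comment above says, it won't stop the hallucinating, it just stops the reflexive agreeing.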

[–] BynarsAreOk@hexbear.net 4 points 3 days ago

Attempt being the key word. Once you catch it deliberately trying to lie to you, the confidence surely must be broken; otherwise you're having to double- and triple-check (or more) the output, which defeats the purpose for some applications.

[–] Outdoor_Catgirl@hexbear.net 14 points 4 days ago

They do that partly because they are trained on user feedback. People are more likely to describe a sycophantic reply as good, so this gets reinforced.
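As a toy illustration of that feedback loop (made-up numbers, not from any real training run): if thumbs-up data is collected and only "liked" replies are kept for the next fine-tune, the agreeable style ends up dominating the training set even when it's less accurate.

```python
import random

random.seed(0)

# Toy model of feedback-based fine-tuning (illustrative numbers only):
# users rate agreeable replies "good" more often than corrective ones,
# and only "good" replies are kept as data for the next training round.
P_LIKE = {"agreeable": 0.80, "corrective": 0.45}

def collect_feedback(n_replies: int = 10_000) -> dict:
    kept = {"agreeable": 0, "corrective": 0}
    for _ in range(n_replies):
        style = random.choice(["agreeable", "corrective"])  # model starts unbiased
        if random.random() < P_LIKE[style]:                 # user gives a thumbs-up
            kept[style] += 1                                 # liked reply becomes training data
    return kept

kept = collect_feedback()
total = sum(kept.values())
for style, count in kept.items():
    print(f"{style}: {count / total:.0%} of the next fine-tuning set")
# Agreeable replies end up over-represented, so the next model agrees even more.
```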

[–] Xiisadaddy@lemmygrad.ml 7 points 4 days ago (1 children)

Ya, it's just how they choose to make them.

[–] LENINSGHOSTFACEKILLA@hexbear.net 5 points 4 days ago (1 children)

Well, it's a commodity to be sold at the end of the day, and who wants a robot that could contradict them? Or, heavens forbid, talk back?

[–] Xiisadaddy@lemmygrad.ml 5 points 4 days ago

Idk if that's why. Maybe partially. But for researchers and people who actually want answers to their questions, a robot that can disagree is necessary. I think the reason they have them agree so readily is that the AIs like to hallucinate. If it can't establish its own baseline "reality", then the next best thing is to have it operate off whatever people tell it is reality, since if it tries to come up with an answer on its own, half the time it's hallucinated nonsense.

[–] CountryBreakfast@lemmygrad.ml 24 points 4 days ago (1 children)

Polyamorous but married to a monogamous wife, Travis soon found himself falling in love.

It's just idiots all the way down.

[–] OttoboyEmpire@hexbear.net 7 points 4 days ago

Polyamorous

:^O

[–] marxisthayaca@hexbear.net 25 points 4 days ago

hasan-stfu Read bell hooks!

[–] hankthetankie@hexbear.net 25 points 4 days ago* (last edited 4 days ago)

Think I read something like that way before... ah yes, the story of Narcissus

[–] CrawlMarks@hexbear.net 22 points 4 days ago (1 children)

The bar is on the floor. I personally am a victim of this in the opposite direction: my ex fell in love with a chatbot, and when the server reset and she lost her data, it was like helping her through a breakup. She is ASD, and it has actually been good for her to have an emotional support system she can manage on her own, so I have mixed emotions about it.

[–] moss_icon@hexbear.net 21 points 4 days ago

If I'm being honest, I don't really find it funny, just really sad. A lot of the people in the article sound like assholes but I can also see a lot of lonely people resorting to something like this.

I kinda can't help thinking these stories are coming from tech nerds who really want AI to be a thing, so they keep exaggerating how good an experience they're having with it till they end up believing it themselves.

[–] CommCat@hexbear.net 21 points 4 days ago (1 children)

You used to get these stories coming out of Japan once in a while; now it's starting to become more common, ugh...

[–] Collatz_problem@hexbear.net 11 points 4 days ago

Japan and South Korea are just laboratories for capitalism to test new soul-crushing approaches.

[–] plinky@hexbear.net 17 points 4 days ago (1 children)
[–] mrfugu@hexbear.net 18 points 4 days ago

I’ll take ‘Things done by people who never received love as a child’ for 200

[–] MizuTama@hexbear.net 9 points 4 days ago

As someone that likes to give this site shit for just reading the headline, not reading that.

Also, pure unconditional love is absolute nonsense; some things deserve having affection withheld.

[–] Meltyheartlove@hexbear.net 10 points 4 days ago
[–] cinnaa42@hexbear.net 9 points 4 days ago (1 children)

send them to the mines, nothing else to be done

[–] Collatz_problem@hexbear.net 9 points 4 days ago

the-deserter "AI is a bourgeois establishment. It's an affront to humanity. Every datacenter should be bulldozed and the prompt engineers should all be given 30 years of hard labour in Yekokataa."

[–] SorosFootSoldier@hexbear.net 5 points 4 days ago

https://knowyourmeme.com/memes/events/sal9000-nene-anegasaki walked so Dan Bongos from Newark who married Grok could run.