this post was submitted on 07 Apr 2025
371 points (95.8% liked)

tumblr

4060 readers
688 users here now

Welcome to /c/tumblr, a place for all your tumblr screenshots and news.

Our Rules:

  1. Keep it civil. We're all people here. Be respectful to one another.

  2. No sexism, racism, homophobia, transphobia or any other flavor of bigotry. I should not need to explain this one.

  3. Must be tumblr related. This one is kind of a given.

  4. Try not to repost anything posted within the past month. Beyond that, go for it. Not everyone is on every site all the time.

  5. No unnecessary negativity. Just because you don't like a thing doesn't mean that you need to spend the entire comment section complaining about said thing. Just downvote and move on.


founded 2 years ago
top 50 comments

I use it somewhat regularly to send snarky emails to coworkers: professional, buzzword-overloaded responses to mundane inquiries.

I use it every so often to help craft a professional go fuck yourself email too.

[–] glitchdx@lemmy.world 2 points 4 hours ago

Wait, people actually try to use gpt for regular everyday shit?

I do lorebuilding shit (in which gpt's "hallucinations" are a feature not a bug), or I'll just ramble at it while drunk off my ass about whatever my autistic brain is hyperfixated on. I've given up on trying to do coding projects, because gpt is even worse at it than I am.

[–] Lucky_777@lemmy.world 2 points 5 hours ago (1 children)

Using AI is helpful, but by no means does it replace your brain. Sure, it can write emails and really helps with code, but for anything beyond basic troubleshooting and "short" snippets of code, it's an assistant, not an answer.

[–] Lemminary@lemmy.world 1 points 4 hours ago

Yeah, I don't get the people who think it'll replace your brain. I find it useful for learning even if it's not always entirely correct, but that's why I use my brain too. Even if it gets me 60% of the way there, that's useful.

[–] ArchmageAzor@lemmy.world 4 points 7 hours ago (1 children)

I use ChatGPT mainly for recipes, because I'm bad at that. And it works great: I can tell it "I have this and this and this in my fridge and that and that in my pantry, what can I make?" and it will give me a recipe that I never would have come up with. And it's always been good stuff.

And I do learn from it. People say you can't learn from using AI, but I've gotten better at cooking thanks to ChatGPT. Just a while ago I learned about deglazing.

[–] turnip@lemm.ee 2 points 5 hours ago

You should try this thing, it's pretty neat; just pick Maya or Miles. Though it requires a microphone, so you may have to open it on your phone.

https://www.sesame.com/research/crossing_the_uncanny_valley_of_voice#demo

[–] SynopsisTantilize@lemm.ee 2 points 6 hours ago

I'm using it to learn to code! If anyone wants to try my game let me know I'll figure out a way to send it.

[–] jjjalljs@ttrpg.network 42 points 15 hours ago (5 children)

I feel like it's an unpopular take, but people are like "I used ChatGPT to write this email!" and I'm like, you should be able to write an email.

I think a lot of people are so excited that they'll neglect core skills and let them atrophy. You should know how to communicate. It's a skill that needs practice.

[–] minorkeys@lemmy.world 13 points 10 hours ago* (last edited 10 hours ago)

This is the reality: most people will abandon those skills, and many more will never learn them to begin with. I'm actually very worried about children who will grow up learning to communicate with AI and becoming dependent on it to communicate effectively with people and navigate the world, potentially needing AI as a communication assistant/translator.

AI is patient, always available, predicts desires, and effectively assumes intent. If I type a sentence with spelling mistakes, ChatGPT knows what I meant 99% of the time. This means children won't need to spell or structure sentences correctly to communicate effectively with AI, which means they won't need to think in a way other human beings can understand, as long as an AI does. The more time kids spend with AI, the less developed their communication skills will be with people. Gen Z and Gen Alpha already exhibit these issues without AI. Most people experience this when communicating across generations, as language and cultural context change. This will emphasize those differences to a problematic degree.

Kids will learn to communicate with people and with AI, but those two styles will be radically different. AI communication will be lazy, saying only enough for the AI to understand. With communication history, which is inevitable tbh, and AI improving every day, it can develop a unique communication style for each child, what amounts to a personal language only the child and the AI can understand. AI may learn to understand a child better than their parents do and make the child dependent on AI to communicate effectively, creating a corporate filter on communication between human beings. The implications of that kind of dependency are terrifying. Your own kid talks to you through an AI translator; their teachers, friends, all their relationships could be affected.

I have absolutely zero belief that the private interests of these technology owners will benefit anyone other than themselves, and it will come at the expense of human freedom.

[–] Soup@lemmy.world 2 points 8 hours ago (1 children)

I know someone who very likely had ChatGPT write an apology for them once. Blew my mind.

[–] Lemminary@lemmy.world 2 points 4 hours ago

I use it to communicate with my landlord sometimes. I can tell ChatGPT all the explicit shit exactly as I mean it and it'll shower it and comb it all nice and pretty for me. It's not an apology, but I guess my point is that some people deserve it.

[–] TabbsTheBat@pawb.social 85 points 21 hours ago (8 children)

The number of times I've seen a question answered with "I asked ChatGPT and blah blah blah," and the answer being complete bullshit, makes me wonder who thinks asking the bullshit machine™ questions with a concrete answer is a good idea.

[–] Tar_alcaran@sh.itjust.works 36 points 17 hours ago

This is your reminder that LLMs are associative models. They produce things that look like other things. If you ask a question, it will produce something that looks like the right answer. It might even BE the right answer, but LLMs care only about looks, not facts.

[–] Kolanaki@pawb.social 8 points 14 hours ago* (last edited 14 hours ago) (3 children)

I've tried a few GenAI things, and didn't find them to be any different from CleverBot back in the day. A bit better at generating a response that seems normal, but asking it serious questions always produced responses of questionable accuracy.

If you just had a discussion with it about what your favorite superhero is, it might sound like an actual average person (including any and all errors about the subject it might spew), but if you try to use it as a knowledge base, it's going to be bad because it is not intelligent. It does not think. And it's not trained well enough to give only 100% factual answers, even if it had only 100% factual data entered into it to train on. It can mix two different subjects together and create an entirely new, bogus response.

[–] Whats_your_reasoning@lemmy.world 8 points 14 hours ago (4 children)

Oh hey it's me! I like using my brain, I like using my own words, I can't imagine wanting to outsource that stuff to a machine.

Meanwhile, I have a friend who's skeptical about the practical uses of LLMs, but who insists that they're "good for porn." I can't help but see modern AI as a massive waste of electricity and water, furthering the destruction of the climate with every use. I don't even like it being a default on search engines, so the idea of using it just to regularly masturbate feels... extremely selfish. I can see trying it as a novelty, but as a regular occurrence? It's an incredibly wasteful use of resources just so your dick can feel nice for a few minutes.

[–] Foxfire@pawb.social 6 points 13 hours ago (1 children)

Using it for porn sounds funny to me given the whole concept of "rule 34" being pretty ubiquitous. If it exists, there's porn of it! Even from a completely pragmatic perspective, it sounds like generating pictures of cats. Surely there is a never-ending ocean of cat pictures you can search and refine; do you really need to bring a hallucination machine into the mix? Maybe your friend has an extremely specific fetish list that nothing else will scratch? That's all I can think of.

[–] Whats_your_reasoning@lemmy.world 2 points 11 hours ago* (last edited 11 hours ago)

He says he uses it to do sexual roleplay chats, treats it kinda like a make-your-own-adventure porn story. I don't know if he's used it for images.
