we used to do this thing called "learning".
It's called git gudding now.
git -f gud
Isn't that gud gitting?
I think using ChatGPT for learning is okay, assuming the user is actually interested in learning. If you just want to get something done, you're absolutely cheating both the task at hand and your future self.
ChatGPT truly shines when you ask it follow-up questions about the thing you want to learn and really "delve" (hate that AI-ass word) into different aspects to internalize them yourself.
The dangerous part is that it makes stuff up and you won't have the knowledge to tell.
Exactly. A colleague of mine asked an LLM about a retail math formula. It returned a result that was close but slightly wrong. She spent two weeks with incorrect numbers for her baseline, so all of her forecasts were off. When I reviewed her numbers, everything was just a little wonky, so I dug in and found her mistake. She was absolutely dumbfounded that the "AI" could even be wrong and tried to argue that I was the one who was incorrect, until I dug out my old retail math cheat sheet and showed her the correct formula.
I haven't used LLMs for anything since. Gotta validate all that shit anyways, so why use it at all?
These things will be fantastic for taking my order at the drive-thru and in a few other applications, but if you're trying to learn from them: don't.
I was a kid in the era of separate pocket calculators, so I've heard this song and dance plenty of times before. Even with deterministic tools that always work barring user error, you need enough understanding to tell when something is off and to frame the problem properly.
Well, yeah, you have to have a brain and actually verify things. It's like Wikipedia circa 2004.
Except when it lies. Then it is the opposite of what you want it to be.