ChatGPT, although it sometimes lies, so you need to take it with a grain of salt.
It is slightly confusing to me why everyone was insisting ChatGPT was a threat to Google and that it would replace Google search. They don’t seem like comparable products.
If you imagine users asking questions to Google just to receive a bunch of crappy listicles or the wiki page, versus ChatGPT, it makes more sense.
ChatGPT enables you to have a dialogue: ask follow-up questions, request more detail, or have information summarized in two sentences. Google can’t compete with that using a page-rank algorithm alone. It is incredibly powerful and it’s getting exponentially better.
I’d caution anyone who just dismisses it by calling it a chatbot or says it hallucinates too much. I found the jump in accuracy from 3.5 to 4 pretty astonishing, to the point where I now fear the AGI apocalypse.
It can help, but its data is sometimes too old.
Honestly, for certain things that I was already knowledgeable about, it provided a really great and accurate summary when I asked. Other times not so much, so I don’t feel comfortable using it for research like that.
I think if it could base its output on real sources and direct you to them, it would be a bit better.
LLMs are far better at writing things that look like answers than they are at writing actual answers.
Hah. Imagine if you went to Wikipedia and had to account for the fact that 40% of all the info on there was straight-up made up. Like the Scots Wikipedia.