I mean, the tech is changing faster than science can analyze it, but isn't this now outdated?
I don't use AI, but a friend showed me a query that returned its sources, most of which were academic and appeared trustworthy.
"We did it, Patrick! We made a technological breakthrough!"
Now guess how much power it took for each one of those wrong answers.
The upper limit for AI right now has nothing to do with the coding or with the companies programming it. The upper limit is dictated by the amount of power it takes to generate even simple answers (and it doesn't take any less power to generate wrong answers).
Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.
https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption
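The comparison above is easy to verify with a quick back-of-envelope sketch. The figures (1,300 MWh for GPT-3 training, 0.8 kWh per streamed hour of Netflix) are the estimates quoted in the comment, not independent measurements:

```python
# Back-of-envelope check of the GPT-3 training vs. Netflix comparison.
# Both input figures come from the Verge article quoted above.
gpt3_training_mwh = 1300        # estimated energy to train GPT-3, in MWh
netflix_kwh_per_hour = 0.8      # estimated energy per streamed hour, in kWh

gpt3_training_kwh = gpt3_training_mwh * 1000   # 1 MWh = 1,000 kWh
hours_of_netflix = gpt3_training_kwh / netflix_kwh_per_hour

print(f"{hours_of_netflix:,.0f} hours")  # prints "1,625,000 hours"
```

So the 1,625,000-hour figure checks out, given those two estimates.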
If the AI wars between powerful billionaire factions in the United States continue, get ready for rolling blackouts.
Time for nuclear to make a comeback.
It's a drop in the bucket compared to what's actually causing damage like vehicles and plane travel.
Estimates for [training and building] Llama 3 are a little above 500,000 kWh, a value that is in the ballpark of the energy use of a seven-hour flight of a big airliner.
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans-and-large-language-models/
That's around the monthly electricity use of 570 average American homes.
That being said, it's a malicious and poorly formed comparison. It's like comparing the cost of building a house to the cost of staying in a hotel for one night.
The model, once trained, can be constantly reused and shared; the Llama models have been downloaded millions of times. It would be better to compare it to the cost of making the movie.
An average film production with a budget of $70 million leaves behind a carbon footprint of 3,370 metric tons – that’s the equivalent of powering 656 homes for a year!
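For what it's worth, the "570 homes" figure above works out if you read it as home-months. The ~500,000 kWh training estimate is from the ACM blog quoted earlier; the ~877 kWh average monthly US household consumption is my own assumption (an EIA-style average), not a number from the thread:

```python
# Sanity check on the Llama 3 training-energy comparison.
# llama3_training_kwh comes from the ACM blog quoted above;
# avg_home_kwh_per_month is an assumed US average, not from the thread.
llama3_training_kwh = 500_000
avg_home_kwh_per_month = 877

home_months = llama3_training_kwh / avg_home_kwh_per_month
print(round(home_months))  # prints 570
```

Read annually instead, the same energy is closer to ~50 average homes for a year, which is why the framing of these comparisons matters so much.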
The water consumed by data centers is a much bigger concern. They're straining already strained public water systems.
I like how, when you go Pro with Perplexity, all you get is more wrong answers.
That's probably why I end up arguing with Gemini. It's constantly lying.
Identifying the source of an article is very different from the common use case for search engines.
1:1 quotes of web pages is something conventional search engines are very good at. But usually you aren't quoting pages 1:1.
AI can be a load of shite but I’ve used it to great success with the Windows keyboard shortcut while I’m playing a game and I’m stuck or want to check something.
Kinda dumb but the act of not having to alt-tab out of the game has actually increased my enjoyment of the hobby.
Go figure, the one providing sources for its answers was the most correct... but it's pretty wild how it basically leaves the others in the dust!