this post was submitted on 04 Jul 2025
225 points (100.0% liked)
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
Tech giants will ruin AI with monetization. We are in the happy honeymoon growth phase now. The ball will drop like it did on other tech of the past (Uber, Netflix, DoorDash): shareholders will want their money back.
This is so funny, I don't think I've seen this before
Like imagine a cryptobro circa 2020 being like "no, we're not early, this is actually the honeymoon phase and it'll just get worse"
The happy honeymoon phase of turning the internet into an AI-slopfest.
this is the happy honeymoon phase?
this marriage is fucked
Surely having a baby together will save it
AI is completely unaffordable right now. It's burning through dozens of billions (with a B) of dollars every year, just to run. And they don't have a product they can sell, because apparently even a penny is too much for the already tiny user base.
Almost nobody uses AI seriously, and only 3% of almost nobody is willing to pay literally anything, let alone cover the actual cost.
I'd amend that to already tiny intentional userbase. Google, Samsung, Microsoft, Apple, and others are gleefully shoving it down their users' throats, hoping they'll get hooked, so there's a massive userbase. I suspect this is exactly why only 3% are willing to pay - they're a portion of the tiny group who actually signed up.
Snapshot of ChatGPT 5.0 when you say you're depressed and that your life has no meaning.
For the most part Netflix is still OK. I don't have ads, but the UI and ratings have gotten worse over time.
Spotify is currently dipping hard.
But you can run models locally too; they'll need to offer something worth paying for compared to hosting your own.
Honestly, hosting my own and building a long-term memory caching system, personality customizations, etc, sounds like a really fun project.
Edit: Is ChatGPT downvoting us? 😂
You're just in a place where the locals are both not interested in relitigating the shortcomings of local LLMs and tech-savvy enough to know that "long-term memory caching system" is just you saying stuff.
Hosting your own model and adding personality customizations is just downloading ollama and inputting a prompt that maybe you save as a text file after. Wow what a fun project.
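For what it's worth, that "project" really is about this much code - a minimal sketch, assuming ollama is already installed and serving its default REST API on localhost:11434. The "personality" is just a system prompt kept in a hypothetical persona.txt, and the model tag is a stand-in for whatever you've actually pulled.

```python
# Minimal sketch: "personality customization" as a saved system prompt,
# sent to a locally hosted model through ollama's default chat endpoint.
# Assumes ollama is running on localhost:11434 and the model tag below has
# already been pulled; persona.txt is a hypothetical text file.
from pathlib import Path

import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # ollama's default chat endpoint
MODEL = "llama3"  # stand-in tag; use whatever model you've pulled locally


def ask(user_message: str) -> str:
    # The prompt you "maybe save as a text file after".
    persona = Path("persona.txt").read_text()
    payload = {
        "model": MODEL,
        "stream": False,  # get one JSON object back instead of a token stream
        "messages": [
            {"role": "system", "content": persona},
            {"role": "user", "content": user_message},
        ],
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    print(ask("Explain what a compose.yaml is in one paragraph."))
```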
no, you fuckers wandered into an anti-AI community and started jacking off about local models
It's a factual statement regardless of what you think of AI. People won't pay for something if the free option that can't be taken away from them is just as good.
Maybe that will at some point kill off the big overvalued companies
what the numbers show is that nobody gives a shit. nobody’s paying for LLMs and nobody’s running the models locally either, because none of it has a use case. masturbating in public about how invested you are in your special local model changes none of this.
Tonight, I installed Open WebUI to see what sort of performance I could get out of it.
My entire homelab is a single N100 mini PC, so it was a bit of a squeeze to add even Gemma3n:e2b onto it.
It did something. Free ChatGPT is better performance, as long as I remember to use placeholder variables. At least for my use case: vibe coding compose.yamls and as a rubber duck / level 0 tech support for troubleshooting. But it did something. I'm probably going to re-test when I upgrade to 32 GB of RAM, then nuke the LXC and wait until I have a beefier host, though.
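In case "placeholder variables" isn't obvious: I'm taking it to mean scrubbing real hostnames, paths, and secrets out of a compose.yaml before pasting it into a hosted chatbot, then swapping them back into whatever it suggests. A minimal sketch of that habit, with an entirely hypothetical mapping:

```python
# Minimal sketch of the "placeholder variables" habit: swap real values for
# neutral tokens before pasting config into a hosted chatbot, then swap them
# back into its answer. Every value in the mapping below is made up.
PLACEHOLDERS = {
    "nas.example.lan": "HOST_1",         # real hostname -> placeholder
    "/srv/media/library": "PATH_1",      # real mount path -> placeholder
    "s3cr3t-api-key": "SECRET_1",        # real secret -> placeholder
}


def scrub(text: str) -> str:
    """Replace real values with placeholders before sending text off-box."""
    for real, token in PLACEHOLDERS.items():
        text = text.replace(real, token)
    return text


def restore(text: str) -> str:
    """Put the real values back into whatever the chatbot returns."""
    for real, token in PLACEHOLDERS.items():
        text = text.replace(token, real)
    return text


if __name__ == "__main__":
    snippet = (
        "volumes:\n"
        "  - /srv/media/library:/media\n"
        "environment:\n"
        "  - API_KEY=s3cr3t-api-key\n"
    )
    print(scrub(snippet))             # safe(r) to paste into a hosted LLM
    print(restore(scrub(snippet)))    # round-trips back to the original
```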
statements dreamed up by the utterly deranged
case in point: you jacked off all night over your local model and still got a disappointing result