[–] shirro@aussie.zone 8 points 9 hours ago* (last edited 9 hours ago) (1 children)

It's just another money/power grab. Tech bros want to spy on your kids to train their mythical AGI God, make shitloads of money, and be in a position of influence.

Take away people's economic independence. Take their intellectual capital. Take their capacity to learn and think independently, and ultimately their capacity to play and imagine.

The sheep are being led towards a very fucking grim world.

Let's be clear: it isn't the tech that is bad. Self-hosting a model for a task that suits it, like speech recognition for a disabled person, is righteous and liberating.
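
To make that concrete, here is a minimal sketch of local speech-to-text with the open-source openai-whisper package. The package, model size, and file name are illustrative assumptions, not something from the comment; the point is that everything runs on your own machine and no audio leaves it.

```python
import whisper  # pip install openai-whisper (also needs ffmpeg installed on the system)

# Download (once) and load a small model that runs on modest hardware.
model = whisper.load_model("base")

# Transcribe a local recording; nothing is sent to a remote service.
result = model.transcribe("recording.wav")
print(result["text"])
```

Once the model weights are cached, this works fully offline, which is exactly the self-contained, privacy-preserving kind of use being described.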

I hate to agree with the Marxists of Lemmy, but the problem is very much capitalism, as it is with global warming, pollution, and social inequality. Where I split with them historically is that I still believe in liberal democracy's capacity to regulate capitalism for the common good. But when everyone has outsourced their thinking to a corporate AI from birth to death, democracy is fucked, isn't it?

We need to be aware of how we are being fucked over, sceptical of the people pushing this shit, and engaged politically to make sure it is regulated appropriately; otherwise these cashed-up AI fuckers are going to write the legislation for our politicians to rubber-stamp.

[–] utopiah@lemmy.world 5 points 8 hours ago

> it isn’t the tech that is bad. Self hosting a model for a task that suits it like speech recognition for a disabled person is righteous and liberating.

Even that is tricky. One must check how the model itself was trained, namely:

rather than relying solely on the use case being positive and privacy-preserving. One way to start that kind of check is sketched below.
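
As an illustration of that checking step (not something from the thread itself), here is a minimal sketch that pulls a model's published card from the Hugging Face Hub so you can read what its authors disclose about training data, license, and intended use. The huggingface_hub package and the "openai/whisper-base" repo id are assumptions chosen for the example.

```python
from huggingface_hub import ModelCard  # pip install huggingface_hub

# Fetch the model card the publishers attached to the repository.
card = ModelCard.load("openai/whisper-base")

# Structured metadata the publishers declared (fields may be None if not declared).
print(card.data.license)
print(card.data.datasets)

# The card text usually describes training data, limitations, and intended use.
print(card.text[:1000])
```

A model card is only self-reported documentation, so treat it as a starting point for that kind of scrutiny rather than a guarantee.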