this post was submitted on 30 Jun 2025
24 points (96.2% liked)

[–] hendrik@palaver.p3x.de 3 points 1 week ago* (last edited 1 week ago) (1 children)

Right. Plus, big things tend to end up differently from what we anticipated. Even if we arrive at Terminator-level AI doom some day far in the future... it'll be the one thing we didn't anticipate. It's been that way with most big, disruptive changes in history. Or it won't be doom at all, but something like the transition from horses to cars. People back then couldn't predict how that was going to turn out either. Main point: we don't know, we mainly speculate.

[–] brucethemoose@lemmy.world 2 points 1 week ago* (last edited 1 week ago)

To me, our "AI doom" scenario is already obvious: a worsening attention-optimization epidemic, misinformation, consolidation of wealth, and so on. It's no mystery. It's going to suck unless open source takes off.

In other words, it's destabilizing society because of corporate enshittification and a lack of literacy and regulation, not because of AI itself.