ignirtoq

joined 11 months ago
[–] ignirtoq@fedia.io 31 points 17 hours ago (1 children)

I think you've already answered your own question. Trump's first presidency was very different from his second. And the key difference is his advisors. No one knew how to deal with Trump in his first presidency, and the overarching pattern above the chaos was his close advisors constantly working against him to protect the system.

In the break between his terms, he found people who would follow any of his directives, no matter how stupid or damaging. Or I should say, they found him. Trump is now being manipulated himself by a group of "loyalists" who stand to gain from exactly the chaos Trump was thwarted from creating the first time around. These people know enough about how the systems work to be actually dangerous. And now that his advisors are philosophically aligned with him, he's actually doing what he says he's going to do. I would argue the four-year reprieve from Trump may have been the worst of all possibilities, because it gave Trump and his new in-group time to find each other and prepare.

But most people weren't paying attention. Many saw Trump in 2024 just as they saw him in 2016: the counterweight, the spoiler, the outsider set to upset Washington and get real change happening. They thought Democrats weren't helping them enough, and so wanted to upset the apple cart and put someone different in the high seat. They weren't paying close enough attention to see that this time was radically different from all the elections before, and I saw many, many people dismissing the warnings as "oh, everyone claims their opponents are fascists or are going to destroy the country."

So it's a combination of people not paying attention, as usual, and Trump actually changing how he's doing things this time in the worst way possible.

[–] ignirtoq@fedia.io 31 points 1 day ago (6 children)

I never stopped. The pandemic never ended. People just got tired of acknowledging it and left the vulnerable to fend for themselves.

[–] ignirtoq@fedia.io 17 points 1 day ago

Amber Hart, the co-founder of a research and advisory firm, the Pulse, that specializes in federal contracting, said it’s simply not possible to create a real-time accounting of contract savings with the data the team has used — as DOGE has promised on its website.

“There’s no way for them to make it possible unless they completely overhaul the way the data is reported — which would be awesome,” she said. “I would absolutely love for them to break that. They’re breaking the wrong things.”

Whether "they're breaking the wrong things" or "hurting the wrong people," this is what you get when you vote narcissistic idiots who portray themselves as geniuses with inscrutable knowledge of how to fix things at zero cost. It always, always comes back to hurting the poor and minorities to benefit the wealthy. Or at least hurting the out-groups to benefit the in-group. And you are never in the in-group.

[–] ignirtoq@fedia.io 1 points 3 days ago (1 children)

Ah, I think I misread your statement of "followers by nature" as "followers of nature." I'm not really willing to ascribe personality traits like "follower" or "leader" or "independent" or "critical thinker" to humanity as a whole based on the discussion I've laid out here. Again, the possibility space of cognition is bounded, but unimaginably large. What we can think may be limited to a reflection of nature, but the possible permutations that can be made of that reflection are more than we could explore in the lifetime of the universe. I wouldn't really use this as justification for or against any particular moral framework.

[–] ignirtoq@fedia.io 0 points 3 days ago (3 children)

I think that's overly reductionist, but ultimately yes. The human brain is amazingly complex, and evolution isn't directed but keeps going with whatever works well enough, so there's going to be incredible breadth in human experience and cognition across everyone in the world and throughout history. You'll never get two people thinking exactly the same way because of the sheer size of that possibility space, even though over 100 billion people have lived across history and today.
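
Just to illustrate the scale (a purely back-of-the-envelope sketch, not a model of the brain; the figure of 300 binary "traits" is an arbitrary assumption):

```python
# Illustrative only: a toy "mind" described by just 300 independent yes/no traits
# already has vastly more possible configurations than the ~100 billion humans
# who have ever lived.
configurations = 2 ** 300          # ~2e90 possible trait combinations
humans_ever = 100_000_000_000      # ~1e11 people, across all of history

print(f"configurations: {configurations:.2e}")            # ~2.04e+90
print(f"per person: {configurations / humans_ever:.2e}")  # ~2.04e+79
```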

That being said, "what works" does set constraints on what is possible with the brain, and evolution went with the brain because it solved a bunch of practical problems that enhanced the survivability of the creatures that possessed it. So there are bounds to cognition, and there are common patterns and structures that shape cognition because of the aforementioned problems they solved.

Thoughts that initially reflect reality but that can be expanded in unrealistic ways to explore the space of possibilities an individual can effect in the world around them have clear survival benefits. Thoughts that spring from nothing and relate in no way to anything real strike me as not useful at best and, at worst, disruptive to what the brain is otherwise doing. Thinking along those lines more, given the powerful levels of pattern recognition in the brain, I wonder if the creation of "100% original thoughts" would result in something like schizophrenia, where the brain's pattern recognition systems reinterpret (and misinterpret) internal signals as sensory signals of external stimuli.

[–] ignirtoq@fedia.io 3 points 3 days ago (5 children)

The problem with that reasoning is it's assuming a clear boundary to what a "thought" is. Just like there wasn't a "first" human (because genetics are constantly changing), there wasn't a "first" thought.

Ancient animals had nervous systems that could not come close to producing anything we would consider a thought, and through gradual, incremental changes we get to humanity, which is capable of thought. Where do you draw the line? Any specific moment in that evolution would be arbitrary, so we have to accept a continuum of neurological phenomena that span from "not thoughts" to "thoughts." And again we get back to thoughts being reflections of a shared environment, so they build on a shared context, and none are original.

If you do want to draw an arbitrary line at what a thought is, then that first thought was an evolution of non-/proto-thought neurological phenomena, and itself wasn't 100% "original" under the definition you're using here.

[–] ignirtoq@fedia.io 4 points 4 days ago (7 children)

From your responses to others' comments, you're looking for a "thought" that has absolutely zero relationship with any existing concepts or ideas. If there is overlap with anything that anyone has ever written about or expressed in any way before, then it's not "100% original," and so either it's impossible or it's useless.

I would argue it's impossible because the very way human cognition is structured is based on prediction, pattern recognition, and error correction. The various layers of processing in the brain are built around modeling the world around us in a way that generates predictions; higher layers then compare those predictions with the actual sensory input to identify mismatches, and the layers above that reconcile the mismatches and adjust the prediction layers. That's a long-winded way to say our thoughts are inspired by the world around us, and so are a reflection of the world around us. We all share our part of this world with at least one other person, so we're all going to share commonalities in our thoughts with others.
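
As a toy illustration of that predict-compare-adjust loop (hypothetical code, not a claim about how any actual neural circuit is implemented; the scalar "world" and the learning rate are arbitrary assumptions):

```python
import random

# Toy predictive-processing loop: a "layer" holds a running prediction of a signal,
# compares it to incoming "sensory" input, and nudges its model by the prediction error.
prediction = 0.0
learning_rate = 0.1

def sense_world() -> float:
    """Stand-in for sensory input: a noisy reading of some true value (here, 5.0)."""
    return 5.0 + random.gauss(0.0, 0.5)

for _ in range(100):
    observation = sense_world()
    error = observation - prediction       # mismatch between prediction and input
    prediction += learning_rate * error    # "higher layer" adjusts the predictive model

print(f"final prediction: {prediction:.2f}")  # settles near the true value of 5.0
```

The point of the toy version: every state the loop can reach is anchored to the signal it's fed, which is the sense in which thoughts end up being reflections of the world rather than arriving from nowhere.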

But for the sake of argument, assume that's all wrong, and someone out there does have a truly original thought, with absolutely no overlap with anything that has come before. How could they possibly express that thought to someone else? Communication between people relies on some kind of shared context, but any shared context for this thought means it's dependent on another idea, or "prior art," so it couldn't be 100% original. If you can't share the thought with anyone, nor express it in any way to record it (because that again is communication), it dies with you. And you can't even prove it without communicating, so how would someone with such an original thought convince you they've had it?

[–] ignirtoq@fedia.io 3 points 4 days ago

Ctrl+F coup

Nope, still not calling it what it is.

[–] ignirtoq@fedia.io 2 points 6 days ago

The options from other responses are better (gel, cold shrink tubing), but just for your edification, sand in a box can work as an extremely effective insulator for a short period. So heat up the soldering iron and stick it in a bed of sand in a box to take it in with you. Most of the heat won't escape the box, but it will spread through the tool, so you'll definitely want gloves.

[–] ignirtoq@fedia.io 3 points 1 week ago

More infections mean more chances to mutate into being able to spread human-to-human. I would expect a greater chance from the strain that's already spreading mammal-to-mammal, but human infections with any strain have a chance.
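
The arithmetic behind "more chances" is just repeated draws from the same lottery (the probability below is a made-up placeholder, not a real estimate of mutation rates):

```python
# Hypothetical illustration: if each infection independently has probability p of
# producing a mutation that enables human-to-human spread, the chance of at least
# one such event across n infections is 1 - (1 - p)**n, which climbs quickly with n.
p = 1e-6  # placeholder per-infection probability (NOT a real estimate)

for n in (1_000, 100_000, 10_000_000):
    at_least_one = 1 - (1 - p) ** n
    print(f"{n:>10,} infections -> {at_least_one:.2%} chance of at least one such mutation")
```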

We have the technology and logistical infrastructure to stop this, but the incentive structure in this country is completely backwards for protecting health and safety. And that's very unlikely to change any time soon. Get your masks ready.

[–] ignirtoq@fedia.io 6 points 1 week ago

Math, physics, and to a lesser extent, software engineering.

I got degrees in math and physics in college. I love talking about counterintuitive concepts in math and things that are just way outside everyday life, like transfinite numbers and very high-dimensional vector spaces.

My favorite parts of physics to talk about are general relativity and the weirder parts of quantum mechanics.

My day job is software engineering, so I can also help people get started learning to program, and then move to the next level of building a solid, maintainable software project. It's more "productive" in the traditional sense, so it's satisfying to help people be more productive, but when it's just free time to shoot the shit, talking about math and science is way more fun.

[–] ignirtoq@fedia.io 6 points 1 week ago (1 children)

Not autistic but my partner is. One of her hacks is having me put alarms on my phone so I can gently remind her of things. The alarms are too jarring for her to have them on her own phone.
