I look forward to fact-checker services that interface right into the browser or OS and immediately recognize and flag comments that might be false or misleading. Some might provide links to deeper dives where it's complicated and you want to know more.
People are thinking of the firewall here as something external. You can do this without outside help.
Who is this source? Why are they telling me this? How do they know this? What information might they be omitting?
From that point you have enough information to make your own judgement about what a piece of information is worth.
you already have that firewall. it's your experiences and human connections, your understanding of media, your personal history and learning and the feelings you experience.
you don't need a firewall to keep you from being manipulated, you need to learn to fucking read and think and feel. to learn and question, to develop trusted friends and family you can talk to.
if it feels like your emotional backdoors are being exploited then maybe you're thinking or behaving like a monster and your mind is revolting against itself.
Having a will means choosing what to do. Denying the existence of a person’s will is dehumanizing.
that which influences you is more powerful than your will. You can't really choose what to do.
And dehumanization is a bad thing. I have a will. I choose what to do.
The idea that free will is an illusion is becoming more popular in philosophy. We can be human and not exactly have free will.
Since I was young I've had a mantra: if you don't know why you're doing something, someone else does. It's not always conspiracy or malice; it's literally the basis of the idea of memetics, shareable and spreadable ideas that form the basis of who we are and what we know.
Yeah IF you don’t know why you’re doing something. Key word.
Do we have an iamverysmart community yet?
i have a general distaste for the mind/computer analogy. no, tweets aren't like malware, because language isn't like code. our brains were not shaped by the same forces that computers are, they aren't directly comparable structures that we can transpose risks onto. computer scientists don't have special insight into how human societies work because they understand linear algebra and network theory, in the same way that psychologists and neurologists don't have special insight into machine learning because they know how the various regions of the human brain interact to form a coherent individual mind, or the neural circuits that go into sensory processing.
i personally think that trying to solve social problems with technological solutions is folly. computers, their systems, the decisions they make, are not by nature less vulnerable to bias than we are. in fact, the kind of math that governs automated curation algorithms happens to be pretty good at reproducing and amplifying existing social biases. relying on automated systems to do the work of curation for us isn't some kind of solution to the problems that exist on twitter and elsewhere, it is explicitly part of the problem.
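to make that concrete, here's a toy sketch of that feedback loop. everything in it is invented for illustration (the two posts, the 50% click rate, the "rank purely by past engagement" rule; no real platform is this crude), but it shows the basic dynamic: a one-click head start becomes total dominance once exposure is allocated by engagement.

```python
import random

random.seed(0)

# Two equally good posts; B starts with one extra click of "bias".
# These numbers are arbitrary, chosen only for illustration.
clicks = {"A": 10, "B": 11}

for impression in range(1000):
    # Curation rule (assumed): always surface whichever post has the
    # most engagement so far, a crude stand-in for "rank by engagement".
    shown = max(clicks, key=clicks.get)
    # Users click whatever they're shown at a fixed rate, so exposure
    # alone generates new engagement for the post already in the lead.
    if random.random() < 0.5:
        clicks[shown] += 1

print(clicks)  # A stays at 10; B picks up ~500 more clicks
```

run it and post A never gets seen again. the system didn't detect quality; it amplified an accident of the initial conditions.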
twitter isn't giving you "direct, untrusted" information. it's giving you information served by a curation algorithm designed to maximize whatever it is twitter's programmers have built, and those programmers might not even be accurately identifying what it is that they're maximizing for. assuming that we can make a "firewall" that maximizes for neutrality or objectivity is, to my mind, no less problematic than the systems that already exist, because it makes the same assumption: that we can build computational systems that reliably and robustly curate human social networks in ways that are provably beneficial, "neutral", or unbiased. that just isn't a power that computers have, nor is it something we should want as beings with agency and autonomy. people should have control over how their social networks function, and that control does not come from outsourcing social decisions to black-boxed machine learning algorithms controlled by corporate interests.