The truth should not be partisan. Despite what people would have you believe, hard facts alone are not political. Facts are what remains true whether you believe in them or not. I don't even understand how truth went from being somewhat partisan, to extremely partisan, to something that even splits up families (go look up r/QAnonCasualties if you don't believe me).
@Elizafox This is a thing I've been studying for a while.
I have a bunch of scattered observations about what it is that causes people to be susceptible to repetition of blatant untruth.
appeals to comforting beliefs: yes, you are in control of your life, you deserve everything you have, anyone who worked as hard as you would have achieved the same thing -- therefore anyone who isn't doing well is pretty much to blame for that (the Cult of Personal Responsibility)
appeals to vanity: by always questioning any conclusion which is either widely accepted or seemingly inescapable, you are proving that you are cleverer and more open-minded than the Sheeple -- and you can prove your worthiness and non-sheeplyness by repeating ideas which articulately take down such widely-accepted and/or inescapable conclusions, regardless of whether those arguments make sense on closer examination, in the name of being sure to question everything and examine all sides (Cheap Talk Skepticism)
inability to filter facts: these people have been raised with the idea that sources are either 100% trustworthy or else The Enemy. Critical thinking is discouraged and not taught -- so it is essentially impossible for them to question a source they have decided is trustworthy.
Have you read The Authoritarians by Bob Altemeyer?
@woozle This is a brilliant analysis btw and spot-on. Confirmation bias isn't really on the list and I'm not sure it fits in there, but that's a huge part of it too. People want to believe things that confirm their world view. Leftists want signs Capitalism is truly in its final stage and about to crumble literally any day now. Right-wing people want evidence that any day now the tide will turn and they'll win the culture war and the cult of personal responsibility will come out on top.
@Elizafox Confirmation bias is a specific example of cognitive errors in general -- which is something most of us recognize as a problem and try to contain. Counterfactualists, however, embrace any cognitive error they can find which seems to reinforce their beliefs, because feeling safe and comfortable with their beliefs is not just more important than having accurate beliefs -- it's more or less the only way they understand the idea of "truth". (See "truthiness".)
We all have harmful biases. Most of us try to contain them; a bigot is someone who embraces those biases and gives them a cognitive platform.
That's an interesting analysis. In my experience most people who are susceptible to untruths masked as "comforting beliefs" are more likely to chase after the idea that they have no personal responsibility and that they're _not_ in control of their lives; they're more likely to attribute their failure to the actions of systems outside of their control than attempt to take control themselves.
There may be two polar opposite types who are each susceptible in different ways:
One clings desperately to the idea that they are in control, and accepts any nonsense which confirms this ("personal responsibility", meritocracy, etc.); the other feels completely out of control and adrift, and will cling to any piece of flotsam that seems to offer a restoration of control (e.g. QAnon theories, "we are the storm", etc.).
Presumably those of us who understand that our lives are affected by both merit and pure dumb luck are already skeptical of simple answers, and therefore less vulnerable to such things.