Unsurprisingly, it turns out that the Trump leadership is just as hilarious as I predicted. Over the past few weeks, people have been amused by the notion of “alternative facts,” and how that is obviously hogwash and propaganda. It’s so easy to recognize alternative facts. How about non-alternative facts? Actual facts?
My thesis is that it is very hard to identify actual facts. So hard, in fact, that it is a bit ridiculous to make so much fun of “alternative facts.” How do you know that your facts are not the alternative facts?
Let’s take an example: “God created man.” Fact. It is easily proven by Genesis 1:27 (I totes had to Google that), which says that “God created man in His own image…” It even comes with the bonus fact that “God is a dude.”
Another example: “Man evolved from the great apes by means of natural selection.” Also a fact. It is proven by the existence of fossils of intermediate forms.
Those two cannot both be true at the same time (for the sake of this argument, shut up and go pick some nit elsewhere). Yet, they are both supported by evidence. Of course the person subscribing to things going “monkey, hillbilly, ape, man” would say that the bible is allegorical, or not a believable source, being written hundreds of years after events took place. At the same time, people in support of the “abracadabra: man” theory would say that god just faked the evidence, and that natural selection cannot be true because the bible says otherwise.
In essence, it all comes down to this: which evidence we believe determines what we consider facts. There is an answer: man did appear somehow. There is an absolute truth. But we might not be able to find it. I like to separate questions into four categories: the unanswerable, the unknowable, the unknown, and the known. The unanswerable have no answer: “Which is better? Chocolate or Britney?” Better how? The question is too vague to have an answer. The same goes for paradoxical questions. Unknowable questions are ones for which there is an answer, but no theoretical (or practical) way of knowing it: “How many people are alive right now?” “What hurts more – giving birth or getting kicked in the testicles?” “Is man-flu real or imagined?” The origin of man probably falls into that category as well. The unknown can be answered, just hasn’t been yet: “Who will win at sports next Sunday?” The known is “Who won at sports last Sunday?” (well known to some, I guess). I treated this categorization of questions in much more depth in an earlier post.
We might be able to convince ourselves that one of the two prevalent theories for where we all came from is true. We do so by accepting one piece of evidence and rejecting another. We might decide that an unknown or unknowable question is really known. That it is a “fact.”
A month or two ago, I was reading an essay about arguments in social science. The book was all high and mighty and said that a proper social argument is not just an exchange of opinions, but instead starts by posing a question, moves on to making claims, and finally provides evidence for the claims. Then the authors went on to draw conclusions based on their opinions, which I found hilarious.
I have a feeling that many just base arguments on opinions without feeling the need to present evidence in support of their claims, but I don’t have numbers to support that. Social scientists try to improve on that by basing their claims on evidence. But as we have just seen, even with evidence-based arguments we managed to argue for two completely contradictory views of the world. My social science book did the same: it presented arguments both for and against the claim that big supermarkets are good for small societies.
And then it all falls down: now, instead of discussing the original claims, we throw evidence at each other, and it all comes down to whether or not we accept the evidence. Or in other words: opinions and feelings. A social science argument, while at the outset intended to be more rigorous than your average bar discussion (yours, not mine; mine are always brilliant and deep), falls back on non-evidence-based judgement of the quality of evidence.
A proper scientific argument is a bit more involved. It starts with a hypothesis and continues on to a claim (prediction). It then tries to disprove the hypothesis by showing that the predictions do not hold. This might seem like a double-negatory way of saying the same thing, but it really isn’t. There are two reasons this is a better argument than a social science argument: it can, if not answer, then at least give an indication of the answer to unknowable questions, and it reverses the incentive: instead of just trying to find evidence in support of your claim, you have to make an honest effort of finding evidence against it.
The question “Is there a god?” (and the corresponding hypothesis “there is no god”) is probably unknowable. If there is no god, there is really no way of proving that. Aside from watching what Christina Aguilera did to herself in the past 5-10 years. It is possible to prove the hypothesis false, though: all you have to do is produce a god, and you will have proven it wrong. The hypothesis is falsifiable; there is a (simple) test that can prove it false. The harder you try proving it false and fail, the more probable it is that it is true.
But even that has a hard time producing facts. Proper scientific reasoning still relies on interpretation of evidence. Some bible-enthusiast producing Jesus from the bible will claim to have produced a god, and thereby to have proven the hypothesis false.
There are a couple of psychological effects that make it very hard for us to convince others of a fact we know. Those effects are confirmation bias and the backfire effect.
Confirmation bias means we are more likely to remember things in support of something we already believe. Somebody who believes in anthropogenic global warming will remember evidence that shows a correlation between human activity and temperature. They will not remember claims that instead correlate sunspots with temperature, but will perhaps remember claims that that claim has been discredited. We have a definite tendency to just discard arguments we disagree with as propaganda.
The backfire effect means that when confronted with evidence against something we believe strongly in, we have a tendency to believe in it even more strongly. A Trump supporter confronted with him being a bit of a nazi will either dismiss it as leftist propaganda or rationalize that Hitler went overboard but had some sensible ideas, like the Autobahn and Volkswagen. A Hillary supporter confronted with her sending state secrets using Hotmail (roughly but not really) will dismiss that as less important (even though it is very important when Trump does the exact same thing, because it’s different somehow). This is closely related to the sunk cost fallacy, where one is much more likely to stand behind a decision if one is already invested in it.
Studies show we are notoriously bad at seeking evidence. It is easy to make fun of people getting news from Fox or Breitbart, but if you non-ironically cite an RT article, you’re no better yourself. Studies show that the most neutral news outlets are often judged as the most biased. The reason is that a neutral news outlet is more likely to present more than one perspective of a case, and most people will largely forget the arguments they agree with (not the arguments themselves, but that they heard them there) and only remember the ones they disagree with. A super-biased medium, but one we agree with, is less likely to contain a lot of things we disagree with, and is therefore less likely to be categorized as biased. If you agree with Fox, you will view it as much less biased than CNN. Same with RT. That does not mean that Fox is the epitome of neutrality just because so many find it biased; the implication works in the other direction: if neutral, then viewed as biased.

In Denmark, the state broadcaster, Danmarks Radio, is often viewed as extremely left-wing by the right wing. At the same time, it is viewed as ultra-liberalist by the extreme left. All in all, it is probably not that bad (says I, who for a long time viewed it as pretty much Stalin-TV). I see the same arguments about the EU: it is simultaneously the pinnacle of capitalism and of communism. It’s probably ok. I think the big take-away is that if you always agree with the media you get your news from, they are probably biased.
If we all get our evidence from outlets we find less biased, they are probably among the more biased ones and reinforce our own biases. That is of course true for everybody else, but also for you and me. If it is so obvious that Putin and the Russian media tell lies, if Comical Ali was so obviously lying to the Baghdad population, if Fox is so obviously propaganda, how can we be so certain the sources we use ourselves tell the truth? I don’t mean that in a conspiracy-theory way, just that we are all wrong to some extent. How can it be that everything I read in the newspaper is so profound and deep, unless I know about the topic, in which case it is full of factual errors?
A fact is probably just an opinion supported by carefully cherry-picked evidence. That goes for “alternative facts,” but also for what you or I view as the truth. A tree might not be green to a colorblind person. And to be stoner-deep, how do we know that your green and my green are the same? (That last one has been treated very interestingly in a podcast.)
Time person of the year 2006, Nobel Peace Prize winner 2012.