I wonder how many times I’ll say this. This will be my last Bayes’ theorem post. At this point a careful reader should be able to extract most of the following post from the past few, but it is definitely worth spelling out in detail here. We’ve been covering how academics have used Bayes’ theorem in their work. It is also important to see how Bayes’ theorem could be useful to you in your own life.
For this post I’m going to take as my starting point that it is better in the long run for you to believe true things rather than false things. Since this is an academic blog and most academics are seeking truth, they hold to some sort of rationalist or skeptical philosophy. Whole books have been written defending the claim that this position improves society, but wanting to believe true things shouldn’t be that controversial a position.
Honestly, for people who haven’t spent a lot of time learning about bias, it is probably impossible to overestimate the role it plays in decision making. Lots of well-educated, logical, careful people can look at both sides of the evidence on some question and honestly think they are making an objective decision about what is true based on that evidence, when in reality they are just reconfirming a belief they formed for totally irrational reasons.
We’ve all seen this type of picture before:
Even though you know the following things:
1. What the optical illusion (i.e. bias) is.
2. How and why it works.
3. The truth, which is easily verified using a ruler, that the lines are the same length.
This knowledge does not give you the power to overcome the illusion/bias. You will still continue to see the lines as different lengths. If bias can do this for a sense as objective as sight, think about how easily tricked you can be if you go off of intuition or feelings.
This exercise forces us to confront a startling conclusion. In order to form a true belief, we must accept the conclusion that looks and feels wrong, because we reached it by verifiably more objective means. The same is true of your opinions and beliefs. You probably hold false beliefs that will still look and feel true to you even after you’ve changed your mind about them. You need to trust the evidence and arguments.
A Bayesian analysis of this example might run as follows. From looking at the picture, you believe the lines are different lengths. You could reasonably set the prior probability that this belief is true quite high: although your eyesight has been wrong in the past, you estimate that around 99% of the time it wouldn’t make such an obvious and large error. The key piece of evidence comes when you measure with a ruler and find the lines are the same length. This evidence is so strong compared with your confidence in your eyesight that it vastly outweighs the prior probability, and you confidently conclude your first belief was false.
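To make this concrete, here is a small sketch of the update. The 99% prior is the one from above; the ruler’s error rates are numbers I’ve made up purely for illustration:

```python
# Illustrative Bayes update for the optical illusion example.
# B = "the lines really are different lengths" (what your eyes report).

prior = 0.99             # P(B): trust in your eyesight, as in the text

# Evidence E: the ruler reads "same length".
# The ruler's error rate is my own assumption, not from the post.
p_e_given_b = 0.001      # P(E | B): ruler says "same" even though they differ
p_e_given_not_b = 0.999  # P(E | not B): ruler says "same" and they really are

# Bayes' theorem: P(B | E) = P(E | B) P(B) / P(E)
p_e = p_e_given_b * prior + p_e_given_not_b * (1 - prior)
posterior = p_e_given_b * prior / p_e

print(f"P(lines differ | ruler says same) = {posterior:.3f}")  # about 0.09
```

One careful measurement drags a 99% prior down to roughly 9%: the likelihood ratio of the ruler evidence is about 1000-to-1 against the belief, which swamps the 99-to-1 prior odds.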
You probably came to many of the beliefs you have early on in life. Maybe your parents held them. Maybe your childhood friends influenced you. Maybe you saw some program on TV that got you thinking a certain way. In any case, all of these are bad reasons to believe something. Now you’ve grown up, and you think that you’ve re-evaluated these beliefs and can justify them. In reality, you’ve probably just reconfirmed them through bias.
Once you’ve taken a position on something, your brain has an almost incomprehensible number of tricks it can do in order to prevent you from changing your mind. This is called bias and you will be totally unaware of it happening. The rational position is to recognize this happens and try to remove it as much as possible in order to change an untrue belief to a true belief. Trust me. If done right, this will be a painful process. But if you want true beliefs, it must be done and you must be willing to give up your most cherished beliefs since childhood even if it means temporary social ostracization (spell check tells me this isn’t a real word, but it feels right).
What this tells us is that if we really want true beliefs we need to periodically revisit our beliefs and do a thorough sifting of the evidence in as objective a way as possible to see if our beliefs have a chance at being true.
Since there are hundreds of documented cognitive biases, I can’t go through all the ones you might encounter, but here are a few. One is confirmation bias. When you look at evidence for and against a belief, you will tend to remember only the evidence that confirmed your already-held belief (even if the evidence against is overwhelmingly larger!). I can’t stress this enough: you will not consciously throw the evidence out. You will not be aware that this happened to you. You will feel as if you evenly weighed both sides of the evidence.
One of my favorite biases, which seems to receive less attention, is what I call the many-against-one bias (I’m not sure if it has a real name). Suppose you have three solid pieces of evidence for your belief. Suppose the counter-evidence is much better: there are seven solid pieces of it. When you look through this, what you will tend to do is look at the first piece of counter-evidence and think, “Well, my side has these three pieces of evidence, so although that looks good on its own, it isn’t as strong as my side.” Then you move on to piece of counter-evidence two and do the same thing.
All of a sudden you’ve dismissed tons of good evidence that when taken together would convince you to change your mind, but since it was evaluated separately in a many-against-one (hence the name!) fashion you’ve kept your old opinion. Since you can’t read all the counter-evidence simultaneously, and you probably have your own personal evidence well-reinforced, it is extremely difficult to avoid this fallacy.
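A toy calculation makes the trap visible. Every number here is invented for illustration; the point is only the contrast between piecewise and joint evaluation:

```python
# Toy numbers illustrating the many-against-one trap, in odds form:
# posterior odds = prior odds x product of likelihood ratios.

prior_odds = 20.0        # your three solid pieces, baked into strong prior odds
counter_lrs = [0.5] * 7  # seven counter-pieces, each a modest factor of 2 against

# Piecewise: weigh each counter-piece against the full prior on its own.
# Every single comparison leaves you still favoring the belief.
piecewise = [prior_odds * lr for lr in counter_lrs]
assert all(odds > 1 for odds in piecewise)  # "still not as strong as my side"

# Jointly: let the counter-evidence accumulate.
joint_odds = prior_odds
for lr in counter_lrs:
    joint_odds *= lr     # ends at 20 / 2**7, about 0.16: the belief flips

posterior = joint_odds / (1 + joint_odds)
print(f"joint posterior = {posterior:.2f}")  # about 0.14
```

Seven mild factors of 2 multiply out to 128, which crushes 20-to-1 prior odds, even though no single factor could.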
And on and on and on and on and on … it goes. Seriously. This should not be thought of as “bad” or something. Just a fact. It will happen, and you will not be aware of it. If you just simply look at both sides of the argument you will 99.99% of the time just come out believing the same thing. You need to take very careful precautions to avoid this.
Enter Bayes’ theorem. Do not misconstrue what I’m saying here as a totally objective way to arrive at the truth. It is just one approach you could try as a starting point. Here’s how it works. Take a claim/belief, which we call B. Look at the best arguments and evidence for the claim you can find. Write each one down, clearly numbered, with lots of space between. Next, find the best counterarguments and counter-evidence to those points and write them down next to the originals. Then do the exact same thing with the best arguments/evidence you can find against the claim/belief.
One at a time, totally bracket off your feelings and thoughts about the total question at hand. Just look at evidence 1 together with its counter-evidence. Supposing the claim is true, what are the chances that you would see this evidence? This is your P(E|B) estimate for that piece. You will also want the corresponding P(E|not B), the chance of seeing that evidence if the claim were false, since Bayes’ theorem weighs the two against each other. Don’t think about how it will affect the total P(B|E) calculation. Stay detached. Find people who hold the opposite opinion and try to convince them of your number on just this one tiny thing. If you can’t, maybe you aren’t weighting it right.
Go through every piece of evidence this way weighing it totally on its own merits and not in relation to the whole argument. Having everything written down ahead of time will help you overcome confirmation bias. Evaluating the probabilities in this way one at a time will help you overcome the many-against-one bias (you’ll probably physically feel this bias when you do it this way as you start to think, “But it isn’t that good in relation to this.”) This will also overcome numerous other biases, especially ones involving poor intuitions about probability. But do not think you’ve somehow overcome them all, because you won’t.
One of the hardest steps is then to combine your calculations into Bayes’ theorem. You should think about whether or not pieces of evidence are truly independent if you want a proper calculation. But overall you’ll get the probability that your belief is true given the evidence, and it will probably be pretty shocking. Maybe you were super confident (99.99% or something) that there was no real reason to doubt it, but you find out it is more like 55%.
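As a sketch of that combination step, here is one way to fold per-piece estimates into a single posterior. The likelihood ratios are made up, and the function assumes the pieces of evidence are independent, which, as noted above, you should actually check:

```python
# Combining several pieces of evidence via the odds form of Bayes' theorem.
# All numbers are invented for illustration.

def posterior_prob(prior: float, likelihood_ratios: list[float]) -> float:
    """Posterior P(B | all evidence), assuming the pieces are independent.

    Each likelihood ratio is P(E_i | B) / P(E_i | not B).
    """
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# A "99.99% confident" prior...
prior = 0.9999
# ...meets four pieces of evidence that each favor not-B (ratios below 1).
lrs = [0.2, 0.1, 0.3, 0.02]

p = posterior_prob(prior, lrs)
print(f"P(B | E) = {p:.2f}")  # drops from 0.9999 to roughly 0.55
```

With these invented numbers, near-certainty collapses to a coin flip, which is exactly the kind of shock the calculation is supposed to deliver.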
Maybe something you believe only has a 5% chance of being true and you’ve just never weighed the evidence this objectively. You need to either update what you think is true or, at the very least, if it still seems like it could go either way, be much more honest about how sure you are. I hope more people start doing this, as I am one of those people who think the world would be a much better place if people stopped confidently clinging to beliefs taught to them in childhood.
Changing your mind should not have the cultural stigma it does. Currently, people who change their minds are perceived as weak and not knowing what they are talking about. At the very least, they give the impression that since their opinion changes, it shouldn’t be taken seriously, as it might change again soon. What needs to happen is for us to recognize changing one’s beliefs as an honest endeavor, a mark of intellectual integrity, and something that anyone who really seeks to hold true beliefs does frequently. These people should be held up as models, not the other way around.