Mini Blog Post 13: Beliefs are for True Things

(Note: This draws heavily on the ideas of my previous post, on the Map and the Territory - I recommend skimming that first)

There are a lot of things I care about in this world, and it’s often useful to divide them into instrumental and terminal values. Terminal values are ends: things I care about intrinsically, eg my own happiness, or saving lives. Instrumental values are means to an end: things I value only because they help me achieve my terminal goals, rather than because they are intrinsically important to me.

I think this is a useful mental distinction to have, but it can often go wrong. Our minds are bad at understanding the world, and will often only optimise for an instrumental value when we can see how it directly achieves our goals. This can be a virtue, but humans are bad at long-term thinking, and there are some instrumental values we systematically neglect. For example, I think taking breaks and maintaining your mental health are key to being effective in the long term, but we often neglect them for short-term benefit. Or we’ll lie because it’s convenient in the short term, and ignore the harm to our long-term reputation and relationships.

I think the solution to this is to take on “fake” terminal values. If I can act as though my mental health is intrinsically important, I will better achieve my other goals, because I am incapable of respecting it when it’s just an instrumental goal. But this is easier said than done. It’s not enough to just say it’s important to me - I need to live and feel as though it’s important to me. It can’t feel like a second-class terminal value; it needs to be something I will prioritise, and sometimes sacrifice other terminal values for, because this will help me achieve my true terminal values in the long term. And one of the more important “fake” terminal values I’ve taken on is to seek the truth, and strive to have true beliefs - internalising the idea that “Beliefs are for true things”. The rest of this post will try to convince you that this is a good idea!

I think having true beliefs is a key first step to achieving almost any goal. If a friend is toxic, and makes me unhappy to be around, realising this is key to ensuring that I’m happy. If I take a new job, it would be helpful to know if this will systematically make me less happy. If I’m choosing courses based on what will help me get a job, it’s important to know which ones will actually help. It is really, really easy to make mistakes, and systematically lose out on things that I care about if I don’t have true beliefs.

And finding truth is hard. This is one of the things that almost everyone fails at (including me!). Our minds are full of biases! We want certain beliefs to be true. We want to believe what the people around us believe. We neglect the long-term future. We don’t have a good intuition for large numbers, or small probabilities. The underlying point is that truth lives in the territory, but all I have is the map. I look at the world through the flawed lens of my mind, with all of its inherent biases. But from the inside, they aren’t labelled as biases - they just feel like how the world is.

In fact, I think the entire notion of a bias is framing this wrong. A bias makes it feel like I have an accurate view of the world, with a few flaws. I find it more helpful to say that the default is having false beliefs. This is not a personal flaw, or me not trying hard enough. Having false beliefs is the default state of the world. And if you care about truth, there is a constant fight against entropy.

I think this was made especially visceral to me by coronavirus. This was a really big deal that has meaningfully affected my life, and been a massive catastrophe for the world. And the world had all the information to realise that this was a big problem, and worth preparing for, but didn’t. And on a personal level - even though I care a lot about truth, and try to understand my own biases, and could see people I respected freaking out about it, it took a long time until I properly internalised how big a deal this was. Because I was seeing the world through the lens of optimism bias, where I refused to believe that things could actually be this bad. And social conformity bias, where most people weren’t taking it seriously, so it felt like it couldn’t be that big a deal. And this was subtle, and insidious, because these didn’t feel like biases from the inside. I wasn’t asking myself “is coronavirus a big deal?”, thinking hard about it, and concluding that it wasn’t. It never felt like there was a question to be asked in the first place.

A common counterpoint to this: true beliefs hinder effectiveness - eg, if I realise that my project isn’t going to succeed, I won’t be as motivated as if I lie to myself about it. And this is true! This is a genuine cost, and having false beliefs is often useful. But this is the same error as not taking breaks because they don’t feel important. Having true beliefs is hard, and incredibly important. Almost everybody gets this wrong, and there is a constant fight against entropy. And if you’re willing to sacrifice truth for short-term motivation, then you will lose this fight. Maybe this cashes out in obvious ways - like fighting on with a doomed project. But more realistically, you won’t know in advance how the truth will help you, because the world is uncertain and imperfect, and you need to make decisions anyway. And so, even when it would be really useful to believe false things, I have found that it is key to my long-term goals to still seek the truth. Even if a specific belief didn’t matter, often just preserving my identity as somebody who seeks truth is key. There is a valuable skill in holding true beliefs while hacking my motivation to feel as though I believe the convenient falsehood - but having the grounding of true beliefs deep down remains key.

Having true beliefs is hard. If you are not actively trying, you will have false beliefs. Even if you are actively trying, you will still have some false beliefs! Because this is the default state of the world. And one of the most important ways to fight against this, is to always remember that beliefs are for true things.

So, realising that truth is important is just the first step - how do you act on it? This is far too broad a question to be fully handled in a blog post, but I’ll try to outline the important points of the framework.

One of the most important bits of research on this is the work of Philip Tetlock, who studied human ability to predict the future. He found that this was a measurable skill, and studied the people who were best at it (an excellent summary). This is a small subset of truth-seeking, but predicting the future is hard, and having true beliefs is a vital sub-skill, so I think this is highly worth reading about. And one finding I find fascinating is that the people who did well tended to have an intuitive grasp of Bayes’ Theorem.

Bayesian probability is a formal mathematical framework for understanding the world. You have a proposition you’re uncertain about, and that uncertainty is captured by a probability. You start with a prior probability that the proposition is true, and then observe evidence which you use to update your probability. Priors are hard, because they ultimately need to come from your understanding of the world, and your intuitions. One of the most successful techniques Tetlock found was using base rates - looking for similar things to that proposition in history, and finding the proportion which were true, and taking that as your prior. Priors are important, but difficult to do in a principled way, and very much the kind of thing upon which reasonable people disagree.
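The base-rate-as-prior technique above can be sketched in a few lines of Python. All the numbers here are invented for illustration (they are not from Tetlock’s research): a hypothetical prior from a base rate, updated on one piece of evidence via Bayes’ Theorem.

```python
# A minimal sketch of a single Bayesian update. Illustrative numbers only.

# Prior from a base rate: suppose 30% of comparable past projects succeeded.
prior = 0.30

# Evidence: an early milestone is hit. Suppose (as an assumption) this happens
# for 80% of projects that eventually succeed, but only 40% of those that fail.
p_evidence_given_true = 0.80
p_evidence_given_false = 0.40

# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) sums over both ways the evidence could occur.
p_evidence = (p_evidence_given_true * prior
              + p_evidence_given_false * (1 - prior))
posterior = p_evidence_given_true * prior / p_evidence

print(f"prior = {prior:.2f}, posterior = {posterior:.2f}")  # posterior ≈ 0.46
```

Observing the milestone moves the probability from the 30% base rate up to roughly 46% - the evidence shifts the belief, but the prior still anchors it.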

But here I want to focus on evidence, which I think is much more actionable. Evidence, formally defined, is something which is more likely to happen if the proposition is true than if the proposition is false. This is a really important definition, and one worth internalising. Truth lives in the territory, but my beliefs live in the map, and by default there is no correlation between them. Evidence is anything that entangles the map and the territory. The default state of the world is that things in the map are equally likely to be there regardless of what’s in the territory, and to fight against this I need to find a way to relate them.
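This definition has a neat quantitative consequence, which a tiny sketch makes visible: an observation that is equally likely either way moves your probability nowhere at all, so by the definition above it is not evidence. The numbers are illustrative assumptions.

```python
def update(prior, p_obs_given_true, p_obs_given_false):
    """Posterior probability via Bayes' Theorem, from a prior and two likelihoods."""
    p_obs = p_obs_given_true * prior + p_obs_given_false * (1 - prior)
    return p_obs_given_true * prior / p_obs

# An observation equally likely whether the proposition is true or false
# carries no information: the posterior stays at the prior (up to rounding).
print(update(0.3, 0.5, 0.5))

# An observation twice as likely if the proposition is true shifts belief up.
print(update(0.3, 0.5, 0.25))
```

Only the *difference* in likelihoods does any work - which is exactly what “entangling the map and the territory” means in this framework.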

And this is something our intuitions are super wrong about! When someone I dislike and consider to have incorrect beliefs, like Donald Trump, says something, my intuition is to assume it’s wrong. But believing the opposite of whatever they say is still changing my beliefs - I am treating their words as evidence. Whereas if I think someone is bad at seeking truth, that means their beliefs are not entangled with the world, so I learn nothing from them. Reversed stupidity is not intelligence. We do not have good intuitions for how evidence works.

And this correlation is precious, and fragile, and easy to break. And it needs to be grounded. By definition, a scientific experiment is something whose results are more likely to happen if the hypothesis is true than if it is false. A patient is more likely to recover if the drug works than if it doesn’t. The grounding there is empirical reality, and an experiment is a way to entangle that with my beliefs. But even here, this is extremely fragile - p-hacking, publication bias, seeking out studies that confirm my beliefs - there are many ways for this to stop being evidence. The entire field of psychology failed at this for a long time. The entanglement with reality is weak and fragile, and takes effort to preserve.

And evidence doesn’t have to be formal, scientific evidence. It’s anything that entangles with reality. If I’m trying to solve a maths problem, and I have an intuition to try symmetry arguments, this is evidence. I have a lot of experience cached in those intuitions, and they’re recognising subtle cues in the problem - this is grounded. But this is an entanglement I’ve built over time, and I need to be careful - often my intuitions are not entangled with reality, especially in a newer field. Listening to experts can often be entangled with reality, because their beliefs come from all of their experiences. And for complex topics, this is often my best form of evidence. But this is easy to disrupt, if I’m not careful to listen to the right experts.

In practice, to find truth I need to cultivate habits that notice the things hindering me from finding truth, and counteract them. I easily fall prey to the planning fallacy, and this stops me from having true beliefs. And this is not labelled as the planning fallacy from the inside. But I can learn to detect subtle emotional cues, like the subtle feeling of overconfidence, and to double my estimated time when this happens. I find it is often useful to actively err in the direction of trying new things, doing something awkward or scary, or caring about my long-term self, because I know that I am systematically bad at doing so. And the times when I notice this are the times when my intuitions are most uncertain, and by shifting that point I can become better entangled with reality. Noticing subtle cues is hard, but it is a learnable skill. And even here, this doesn’t work by default - other people should err the other way. I’ve become entangled with reality by grounding myself in my experience, tracking things, and noticing a systematic bias.

Another fruitful source of true beliefs is discussions with people I disagree with, especially if I can be empathetic, and understand where they’re coming from. My lens on the world is flawed, and often I can only detect a flaw by talking to somebody outside of my head. This one is easy to screw up: it’s easy to be defensive and not listen to unpleasant ideas. It’s also easy to just go through the motions - to focus on being polite, and reasonable, but not on actually listening. Always remember to Be Deliberate. Your goal is to find truth. Don’t try to be reasonable, just be reasonable. And maybe their beliefs are wrong! If your goal is to seek truth you should talk things through, hear them out, seek their grounding, and see if that can bring you closer to truth. I strongly dislike formal debating, in large part because I do not think it is a discussion that optimises for truth - it instead rewards far too much that is not entangled with reality.

And ultimately, it’s impossible to be right about everything, or even to try to be right about everything. Seeking truth takes time, effort and energy. Just as you cannot save every life in the world, you cannot detect every false belief. Optimisation takes effort; you need to conserve resources and allocate them well. And finding this balance is itself a difficult skill.

But seeking truth is key when it matters. And hopefully I have outlined why it’s so important to me that truth feels like an end in and of itself, rather than a means to an end. Beliefs are for true things. And if you ever lose sight of that, you’ll lose the ability to fight for the things that matter to you.
