I’m not going to try to argue this rigorously. None of this is new - I just want to give some examples. If you want to see actual philosophy, go read actual philosophy!

In this post, I’m only going to talk about empirical beliefs, rather than moral/normative beliefs. The latter case is much more obvious - see my previous blog posts: “A modest view of doing good” and “Nobody has any idea whether wild-caught fishing is good or bad”.

When I observe things in my day-to-day life, I often notice empirical statements (that is, statements that could be falsified in principle) that people assert and believe without subjecting them to rigorous testing.

Some examples:

  • Folk psychology. “I’m tired because I didn’t sleep well.” Maybe, maybe not - can you define “well”? Have you validated a link between sleep quality and fatigue for you personally? What is the effect size and level of confidence? Have you excluded other possible explanations?
  • Life wisdom. “If you want to go far in life, you need to apply yourself.” Well, can we operationalise “go far” and “apply oneself”? What is the effect size and level of confidence? Are there moderating variables?
  • Political or policy views. “If we want to protect wild animals, then preserving nature is the way to go.” This one has an underlying assumption that “protect wild animals” is equivalent to “preserve the population size of a specific few types of animals”, and that “nature” is a valid concept. Both have been challenged (see e.g. Brian Tomasik).
  • News headlines. Don’t even get me started!
  • Scientific findings. There was a YouTube video a few years back called “Are most scientific findings wrong?”. It argued that, because of the way p-values are used and the incentives facing professional academics, the majority of published scientific conclusions could be incorrect (see the back-of-the-envelope sketch after this list).
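
To make that last argument concrete, here is a back-of-the-envelope calculation in the spirit of that video. The specific numbers are my own illustrative assumptions, not figures from the video or from any study:

```python
# Back-of-the-envelope: what fraction of "significant" findings are true?
# All three numbers below are illustrative assumptions.

prior_true = 0.10   # assume 1 in 10 tested hypotheses is actually true
alpha = 0.05        # false-positive rate (the usual p < 0.05 threshold)
power = 0.50        # assume a 50% chance of detecting a true effect

true_positives = prior_true * power          # 0.050
false_positives = (1 - prior_true) * alpha   # 0.045

ppv = true_positives / (true_positives + false_positives)
print(f"Fraction of significant findings that are true: {ppv:.0%}")
# -> roughly 53% under these assumptions. With p-hacking, lower power,
#    or a lower prior, the false positives dominate and the majority
#    of published "findings" can be wrong.
```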

It’s pretty easy to imagine going back in time, equipped with our present body of scientific “knowledge”, and readily finding many profound conflicts between our “knowledge” today and people’s “knowledge” in the past. Why would we think that the same will not be true for us in 100 years’ time?

Well, you might say, the solution is simple. If somebody makes a statement, then I know it’s probably wrong, so all I have to do is invert that statement in my head and then I’ll end up with a statement that is probably right.

I don’t think this works. It’s not just a matter of flipping the sign on any given statement. It’s not as though a statement is either completely right or completely wrong. When I say “most statements are wrong”, I don’t mean “most statements are exactly the opposite of the truth”. If a statement is exactly the opposite of the truth, then that statement carries perfect information about what the truth is - you just have to flip the sign. Rather, I mean “most statements carry little-to-no information about truth”. As an analogy, if you have a coin and you can see which side is heads, then you can deduce with near-certainty which side is tails. My view is more like a twenty-sided die, where every face is blank except for the 20. If all you can see is one face, it usually gives you very little information about where the 20 might be.
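
To put rough numbers on that analogy (a quick illustrative calculation of my own, nothing rigorous):

```python
import math

# Information gained, in bits, when a uniform space of n possibilities
# shrinks to k remaining possibilities: log2(n / k).

# Coin: seeing "heads" on top pins down which side is tails exactly.
coin_bits = math.log2(2 / 1)    # 1.0 bit - the full answer

# Blank d20: seeing one blank face only rules out one of the 20
# possible positions for the "20", leaving 19 candidates.
die_bits = math.log2(20 / 19)   # about 0.074 bits - almost nothing

print(f"coin: {coin_bits:.3f} bits, die: {die_bits:.3f} bits")
```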

I think it is possible to begin to approach truth. Experts on a topic can often make statements that come close to something true.

When I say “expert”, I don’t mean somebody who simply thinks about a topic a lot. I mean somebody who spends years or decades training themselves to think rigorously and using that skill to deliberately try to shoot down their own beliefs. This is what science/philosophy/research is, when it is done well (though I’d guess that only a minority of scientists/philosophers/researchers see it this way - in the pages of peer-reviewed journals, I’ve seen some mind-bogglingly incoherent arguments!). The best astronomers are an example I particularly enjoy - they know just how little information humanity can obtain about what is happening in space, so they are very, very careful to think rigorously, to design their instruments and data-processing tools well, and to adjust their level of confidence accordingly. A side-effect of this process is that the researcher in question often builds a deep appreciation of just how little they (and the rest of humanity) actually know (the Dunning-Kruger effect).

It is weird to me that we have a tool (rigorous thinking and falsification) that is widely respected as a way to access truth in one context (science), yet people frequently fail to apply that same tool in other contexts (day-to-day life).

I think some types of Buddhist meditation and Christian mysticism can achieve a similar thing. E.g. Thich Nhat Hanh’s favourite question: “Are you sure?”. Or the parable of the farmer and the horse.

It’s not as though simply being a scientist, writing something in a peer-reviewed paper, and so on, makes a statement likely to be true. These things are superficial markers of status, not predictors of truth. Here are three examples:

  1. My dear PhD supervisor, whom I do respect greatly, once said “When people look at a dog, men look at the dog’s genitals to see if they’re a threat.” I wish I had asked what experimental setup and statistical test he used to deduce this!
  2. A manager in a former job (again, someone who held a PhD) said “We usually give the graphic design tasks to females, as they’re better at it.” He did not supply an effect size or a p-value.
  3. A journal article I once read said “It’s hard to imagine how a hen [living on a caged egg farm] could be so productive if it [!] has low welfare.” That one is a head scratcher.

Now here’s a question: if I believe that most statements are wrong, then is that belief also probably wrong?

Maybe. Probably! The point is that it’s worth training yourself to think rigorously and, as a consequence, being extremely selective about which sources of information you choose to consume. The mere fact that a nerd with a laptop has something to say rarely makes that statement worth any credence.

It almost makes you see the appeal of fundamentalism.

Recommended reading: actual philosophy (e.g. Socrates, Thich Nhat Hanh, contemporary secular philosophy on the topic of delusion, or Tetlock’s Superforecasting for further applications of these same ideas)