Scott H Young

Learning to Doubt


Human nature is to reason in certainties. It takes training to rid yourself of that handicap. Nobel laureate Richard Feynman said it best:

“I can live with doubt and uncertainty. I think it’s much more interesting than to live with answers which might be wrong. I have approximate answers and different degrees of certainty about various things, but I’m not absolutely certain of anything.”

Thinking in uncertainties is unnatural. Like reading, arithmetic or operating computers, it is a learned skill, not something that comes built into our mental hardware. But uncertainty is all there ever is, so it is a useful skill, even if it doesn’t come installed at birth.

Most people intuitively understand that some things are uncertain. You can see the weather forecast predict a 75% chance of rain. Yet, if it doesn’t rain tomorrow, you’ll hear people say that the weather report was wrong.

The weather report wasn’t wrong; your belief was. By rounding the probability of 0.75 up to 1, you made the error of reasoning in certainties.
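To make that concrete, here is a minimal Python sketch (the numbers are purely illustrative): simulate many days that all carry a 75% rain forecast. Any single day may stay dry, yet over the long run it rains on roughly three quarters of them, which is exactly what the forecast claimed.

```python
import random

random.seed(42)

FORECAST_PROB = 0.75  # the forecast: a 75% chance of rain
DAYS = 10_000         # number of simulated days carrying this exact forecast

# Simulate whether it actually rains on each forecast day.
rainy_days = sum(random.random() < FORECAST_PROB for _ in range(DAYS))

print(f"Fraction of rainy days: {rainy_days / DAYS:.3f}")
# Prints roughly 0.75: about a quarter of the forecasts are followed by dry
# days, and none of those dry days make the 75% forecast "wrong".
```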

Uncertainty is the Only Correct Way to Think

Weather is a trivial example. You’ll see the same fallacious reasoning in the abundant comments on my article on atheism. Many critical commenters say that I’m being irrational for rejecting the idea of a god without proof.

The very nature of their comment betrays a more fundamental ignorance: those people can’t reason in uncertainties. I don’t reject anything; I simply declare that, with the evidence available to me, the proposition is unlikely. I can’t put an explicit percentage on it like the weatherman, but I’m making the forecast that there probably isn’t a god.

What if new evidence comes to me in the future, suggesting a higher intelligence? I would adjust my relative uncertainty to be more in favor of the possibility. That doesn’t make my original judgement unjustified, since I had always included the possibility of new evidence changing my position.

Many people have a false notion of reasoning that suggests if something is uncertain, they are free to believe whatever they want. While you’re always free to believe anything, including falsehoods, that doesn’t mean those beliefs are justified. My claim is that appropriately weighted uncertainties are the only justifiable beliefs to have.

For many things in life, the uncertainty is so small we can ignore it. I don’t regularly consider the possibility that gravity will cease to function and I’ll float into space, because the doubt is minuscule. But just because we can ignore uncertainty in practical contexts doesn’t mean we can omit it wholesale.

Texas Hold ‘Em Logic

Poker is a fantastic game for training this counter-intuitive way of thinking about uncertainty. Good players quickly learn that there is a difference between winning and making correct decisions. Novice players who bet incorrectly on hands will sometimes win, but that doesn’t make their decisions justified.

Being a good player means being able to reason in uncertainties. Good players also realize that this uncertainty doesn’t afford them the luxury of deciding to believe whatever they want. Ultimately, a 50% chance of winning a particular hand is just that: 50%. Believing anything other than this uncertainty is wrong.

Realizing that uncertainty exists encourages players to think not just about which decisions paid off, but about which decisions were valid strategies given the available evidence. Few activities will train you better in the difference between a valid argument with a false conclusion and an invalid argument with a true conclusion than poker.
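Here is a rough sketch of that point, with invented numbers rather than any real hand: compare the expected value of a call with the outcome of a single hand. A call that is profitable on average is the correct decision even though it loses most of the time.

```python
# Hypothetical numbers: a $100 pot, a $20 call, and a 30% chance of winning.
pot = 100.0        # chips already in the pot
call_cost = 20.0   # cost to stay in the hand
win_prob = 0.30    # estimated probability of winning the hand

# Expected value of calling: win the pot 30% of the time, lose the call 70%.
ev_call = win_prob * pot - (1 - win_prob) * call_cost
print(f"EV of calling: {ev_call:+.2f}")  # +16.00, so calling is correct on average

# Yet on any single hand the call still loses 70% of the time.
# Winning or losing one hand says little about whether the decision was right.
```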

Degrees of Belief

I like the poker example because it shows why thinking in uncertainties is useful. If you can’t think in uncertainties, you’ll lose a lot of money playing poker. Similarly, failing to think in uncertainties means you’ll almost certainly make bad decisions in your life.

The challenge is that uncertainties are rarely calculable like they are in games of chance. This is why I think in terms of qualitative degrees of certainty, rather than trying to pin down beliefs to 1%, 50% or 99%.

For example, if I hear gossip about a friend that has been filtered through many sources, I place a hazy certainty on that knowledge. Is it 75%? 60%? 50/50? Who knows? But it’s mentally labelled with an ambiguous quality.

What if I read a single scientific study about a phenomenon in psychology? Again, the exact percentage of uncertainty is probably too difficult to calculate, but I can rank it as more reasonable than speculation, yet less certain than a truth confirmed repeatedly.
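For readers who want to see that middle ground expressed numerically, here is a small Bayes’-rule sketch with made-up numbers: a modest prior that the effect is real, a study that is more likely to come out positive if the effect exists than if it doesn’t, and a posterior that rises noticeably without coming anywhere near certainty.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Illustrative numbers only: a 30% prior that the effect is real, an 80%
# chance of a positive study if it is real, and a 20% chance of a (false)
# positive study if it isn't.
posterior = bayes_update(prior=0.30, p_evidence_if_true=0.80, p_evidence_if_false=0.20)
print(f"Belief after one positive study: {posterior:.2f}")  # about 0.63
```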

I try to apply the same metrics of uncertainty to my own pet theories and ideas. Holistic learning is popular here and there have been scientific studies which suggest it may be right. But I’m not under the misconception that a reasonable hypothesis is equivalent to a theory which has been rigorously tested.

Knowing something to the confidence that you can safely ignore uncertainty is extremely difficult, perhaps impossible for many domains of knowledge. It’s the ability to make use of abundant circumstantial evidence and reasonable hypotheses which allows us to function, while at the same time realizing that they don’t amount to unquestionable proof.

What About Healthy Delusions?

Am I contradicting myself from last week, where I suggested that some delusions (believing things which aren’t justified) are useful? It’s a finer point, but because it might confuse, I’ll try to clarify.

Some beliefs are useful but unjustified. Overconfidence with women is a possible example. Even if I have no evidence to show I’m the most desirable candidate, immersing myself in an irrational belief of confidence may improve my success.

There are two ways these instrumentally rational beliefs fit into the uncertainty scheme I’ve presented. First, if I “irrationally” believe I’m good with women, and that makes me good with women, then the belief was actually justified. Such self-fulfilling prophecies may be cases of apparent irrationality which aren’t.

Perhaps thinking I’m Casanova makes me better with women, but not to the degree I believe it does. In this sense, the instrumentally rational belief is useful, but technically unjustified. This is an example of a healthy delusion, where it might be beneficial to believe a lie. That doesn’t make the lie true, or mean that unjustified beliefs suddenly become justified, simply that some degree of self-delusion may be advantageous.

Uncertainties are the only valid way of reasoning, in that they maximize true beliefs. However, there may be exceptions where maximizing true beliefs conflicts with other goals.

The Inescapability of Doubt

Once you accept that certainty is just a useful simplification, and that uncertainties are the only correct way to reason about things, life becomes much easier. Doubts, fears and worries still exist, but they stop being unnatural entities that need to be avoided and become qualities of reality that should be embraced.

Instead of avoiding doubt, learn the skills to work within it. There are many good algorithms for making smart decisions in uncertain situations. Familiarity defeats fear. If you get in the habit of reasoning with uncertainties, doubt becomes a tool, not just an anxiety.
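One such algorithm, which comes up again in the comments, is minimax regret: for each option, ask how much you would regret choosing it under each possible future, then pick the option whose worst-case regret is smallest. A minimal sketch with invented payoffs:

```python
# Hypothetical payoffs for three options under two uncertain futures.
payoffs = {
    "option_a": {"boom": 100, "bust": -40},
    "option_b": {"boom": 60,  "bust": 10},
    "option_c": {"boom": 20,  "bust": 20},
}

# Regret = best achievable payoff in that future minus the payoff you got.
best_in_future = {f: max(p[f] for p in payoffs.values()) for f in ("boom", "bust")}
worst_regret = {
    name: max(best_in_future[f] - outcomes[f] for f in outcomes)
    for name, outcomes in payoffs.items()
}

# Minimax regret: choose the option whose worst-case regret is smallest.
choice = min(worst_regret, key=worst_regret.get)
print(worst_regret)  # {'option_a': 60, 'option_b': 40, 'option_c': 80}
print(f"Minimax-regret choice: {choice}")  # option_b
```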





24 Responses to “Learning to Doubt”

  1. JS says:

    “Uncertainty is the only correct way to think”

    Do you see the ironic inconsistency of this statement?

    This same inconsistency underpins the self-contradiction of relativism’s absolutist belief that “there is no objective truth”.

  2. Max says:

    Scott,

    With this mindset of uncertainty what are you comparing your beliefs to? If you are comparing your current beliefs with the perfect set of “true” beliefs you are ultimately presupposing a standard. This standard is something that is outside the minds of humans. People can have absolute ideas but they can also have ideas that haven’t been validated in full context.

  3. Vinay Bhat says:

    Can you intuitively explain to me, or present a hypothesis that explains, how and why “overconfidence” (or massive self-belief) actually helps?

  4. Phil says:

    “There are many good algorithms for making smart decisions, given uncertain situations.”

    care to expand? or link to some that goes into this more?

  5. Jonathan says:

    Well said and smartly reasoned. With certainty – Jonathan

  6. Tanner says:

    Interesting article. I just had a couple thoughts/questions/ideas. Within this doubt reasoning idea is there any room for faith? You talk about healthy delusions, so couldn’t we say that, for example, faith in god can be a healthy “delusion” for some? If my belief in god causes me to do good in the world for others, live my life in a better way, and generally make the world a better place is it not healthy? Just a thought…

    Also, I like the idea of dealing in “percentages” of doubt. I can see how that can make life decisions easier or less daunting or intimidating. At the same time saying that something “probably isn’t” a certain way it seems might breed more doubt about a decision, couldn’t it? I am intrigued by this subject and appreciate your comments here. Keep up the good work!

  7. [...] a post from the blog Get More from Life prompted us to revisit the question “Is it better to live full of doubts or amid answers [...]

  8. JB says:

    I never really thought of irrationality as ever being useful, because the truth never makes you unhappy; it is how you take it. We are all humans and certain truths/ideas, however abstract, can have an emotional effect. I separate the two and say that realism can only be useful and the emotional impact can be calculated on the side. For the overconfidence-with-women example I wouldn’t lie to myself; I would simply admit that I am merely pretending, but that this is the best option available to me. It’s the same conclusion but the difference is in the thought process.

    I think it is a little dangerous to be willing to accept irrationality on such a subjective basis. Whenever someone tries to justify religion to me they say ‘well it helps me live’ or ‘it’s my truth’. This reasoning just creates a loophole that logic cannot protect you from.

  9. Adam Isom says:

    Phil,

    I recommend the book Smart Choices. It’s about $10 and short. It’s written by three authors very experienced in decision theory and applying it to business situations; one of the authors pioneered some aspect of decision theory. I cannot fully endorse it because I’m only halfway through it (reading slowly), but the book’s contents definitely fit under the heading “good algorithms for making smart decisions (given uncertain situations)”.

  10. Arjan says:

    Makes me think of the Tao Te Ching chapter 71:

    “To know how little one knows is to have genuine knowledge.
    Not to know how little one knows is to be deluded.
    Only he who knows when he is deluded can free himself from such delusion.
    The intelligent man is not deluded, because he knows and accepts his ignorance, and accepts his ignorance as ignorance, and thereby has genuine knowledge.” – Translated by Archie J. Bahm

  11. Anthony Lee says:

    I agree totally with all of this. However, I’m interested to know what you think of absolutes. If there are ONLY degrees of uncertainty, then there can be no absolutes. This means humans can never truly know anything (even if gravity will always hold). Furthermore, no absolutes means no basic axioms. No building blocks. This means infinite progression and regression.
    As an objectivist, this is a hard pill to swallow.

  12. Phil says:

    Adam,

    thanks, I’ll check it out

  13. Max says:

    Anthony,

    The fact that there are words and entire systems of language presupposes that there are absolute ideas. Reality is the denominator from which all ideas come. If you accept contradictions of reality, then you’re right, you won’t achieve very much intellectual progress in your life……

  14. Simon says:

    Are you having trouble with women, Scott? :-)

  15. Simon says:

    You said women say they want equality but respond to dominant behaviour…Maybe you’re right…But maybe you’re targeting the wrong women…Who knows ;-)

  16. Scott Young says:

    JS,

    It’s a more nuanced point, but when I speak of uncertainties I’m talking about a posteriori beliefs, or those which are based inherently on the gathering of evidence.

    I skipped the mathematical justification because it’s a little baroque for practical considerations, but what I’m essentially arguing in favor of is a Bayesian system of reasoning which updates beliefs based on evidence but mathematically has both 0% and 100% as asymptotes which are never reached.
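    To illustrate the asymptote point with a toy calculation (the 10:1 likelihood ratio below is made up), each new piece of favorable evidence pushes a belief closer to 1, but it never actually arrives there:

```python
# Illustrative only: repeated Bayesian updates drive a belief toward 1
# (or 0) but never let it reach either endpoint exactly.
def update(prior, likelihood_ratio):
    """One Bayes update expressed with a likelihood ratio for the evidence."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio
    return odds / (1 + odds)

belief = 0.5
for i in range(1, 6):
    belief = update(belief, likelihood_ratio=10)  # each piece of evidence is 10:1 in favor
    print(f"after {i} pieces of evidence: {belief:.10f}")
# 0.9090909091, 0.9900990099, 0.9990009990, 0.9999000100, 0.9999900001 ... never exactly 1
```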

    Of course, in any system of reasoning, some a priori axioms must be taken as a given (including the logical underpinnings of the theory itself). I think there may be some room for meta-rationality to discuss whether those axioms are valid, but, once again, that’s getting a little obtuse.

    Uncertainty != relativism. Any uncertainty would be because of bounded rationality, not the lack of objective truths.

    Anthony,

    That’s the wrong way to think about it. Uncertainties are knowledge too. If I know something with a 99.999999999999% certainty (say, gravity, for example) then I know it nearly with the same precision as if I had 100% certainty.

    In fact, reasoning with uncertainties means we have more knowledge than we would otherwise, since it allows us to “know” things for which the evidence is not perfectly conclusive. If we restrict knowledge to include *only* things in which we have 100% confidence, we must either lie to ourselves or limit our reasoning to very few things.

    Max,

    I’m not sure I understand your point. I’m not arguing for relativism here. Uncertainty here has to do with the finite nature of the human brain, not that absolute truths don’t exist.

    Phil,

    Here’s a couple:

    -ABZ Planning
    -Minimax regret
    -Worst-case planning

    Tanner,

    My main arguments in favor of healthy delusions were mostly game-theoretic: that is, you genuinely believe a lie or exaggeration because it enables you to compete with other people. Believing in God may fit into that schema in some limited contexts, but I don’t feel it’s necessary.

    Even in my last article I doubted whether delusions are ever “healthy”, since the consequences of abandoning rationality, even for special cases, are worrisome.

    Simon,

    Haha, I wouldn’t read too much into it. :)

    The reason I bring up dating is because this is probably one of the few clear examples where a healthy delusion can be beneficial, since signaling is so critical. Most other cases (say, being overly optimistic about a business success) have a whole lot of caveats that make the writing muddier.

    JB,

    I completely agree–I’m concerned about accepting the possibility of instrumental irrationality, since if you do, where does it stop? Clearly wholesale foregoing of rationality is undesirable, and since the possible exceptions are a decided minority of cases, the risk of contaminated beliefs might exist.

    That said, I think there are some indications from game theory of where irrationality may be beneficial, and the human brain doesn’t seem to suffer from compartmentalized beliefs, so I’m left with mixed feelings on the issue.

    -Scott

  17. Max says:

    Scott,

    I’m really objecting to the overall theme of the post: that personally I can never be absolutely sure of anything, and that the circumstances that life presents to me are never absolutely knowable. On that note, reasoning could never be a valid mode of knowing, and this leaves an individual in a situation no different than accepting and using faith……..

    Perhaps it’s just the way the article is written, but I disagree that if you know something 99.999999% it’s the same as knowing it completely. This is where I disagree with Feynman. I am absolutely certain of some things, and new ideas will only be accepted into my frame of thought once they have been validated in the full context of those ideas and future validated ideas…….

    Last point I wanted to mention…… doesn’t living in uncertainty decrease one’s self-esteem? Wouldn’t someone end up not accepting anything except that they are wrong?

  18. Scott Young says:

    Max,

    I didn’t really spend much time arguing it, but the uncertainty of empirical knowledge is basically impossible to get around. From a philosophical standpoint, see Hume’s problem of induction.

    From a scientific perspective, take the validity of scientific experiments. The accepted threshold for discovery in physics is five-sigma, which means there is still a small chance (less than one in a million) that a particular experiment’s result is actually incorrect. Repeating the experiment pushes the confidence higher, but it never reaches 1 exactly. And that’s in physics, the most reliable of all disciplines of empirical knowledge.

    The problem isn’t with uncertainties, which are demonstrably the only valid way to reason about empirical phenomena, but with your subjective sense of uncertainty. The truth is, a 99.9999999999% belief is practically identical to a 100% belief; the only reason people feel it isn’t is that they haven’t trained themselves to reason in uncertainties.

    -Scott

  19. Max says:

    Scott,

    We can go on and on in circles, but this is where you and I differ on the validity of beliefs. A belief measured to be 99.9999% true isn’t the same as a 100% validated belief. Let’s just say, for instance, that in my particular belief system I have 100 axioms that are absolute and validate all my ideas. If just a single idea doesn’t agree with one of those axioms, then it isn’t a valid idea. It isn’t one hundred percent accepted as true and isn’t valid. Now, I agree with you about the “uncertainty” of whether sense experience is the same for people. Except the relationships among the sense data must remain the same for everyone. Hence why mathematics was established before any discipline of science was created. …..Now you’re probably right with your proposition that people are unable to reason under uncertain circumstances, but with all due respect… what makes your thinking process so correct? Ideally, I would rather have a small number of justified beliefs than a large number of unjustified beliefs…..

  20. Scott Young says:

    Max,

    First, whenever I write anything it’s simply my opinion, and I’m not afraid to change it in the face of good arguments. I just choose not to append everything I currently believe with “on the other hand I might be wrong.” So don’t take it as hubris.

    More to your point: name an empirical phenomenon (that is, something which is logically indeterminate from axioms) that you know with 100% certainty. I can’t think of any, and since this article is speaking about empirical phenomena, which vastly outweigh mathematical or philosophical axioms and their consequents, degrees of certainty are the only correct way to reason about these things.

    The only alternative to uncertainty is to propose something is 100% true and ignore unlikely or unpalatable alternatives. For example, why aren’t we a hallucinating Boltzmann brain instead of living in a real universe? Even the physical justification for rejecting this is based on probabilities, not axiomatic proofs.

    -Scott

  21. Max says:

    Scott,

    It’s interesting that you separate logic and experience into two unrelated fields. I always considered the two interconnected.

    Anyways, the answer to your question would be that I agree and disagree simultaneously. I can’t think of any empirical phenomenon that I know with 100% certainty that is logically indeterminate from axioms. Except, at the same time, how can any of my experiences not be derived from axioms, especially implicit ones? The fact that you are conceptualizing and using language as a means of communicating presupposes the primary axiom of “consciousness”. If there were no such thing as consciousness, the phenomenon under observation couldn’t be experienced. Anyways, back to your question……the only empirical phenomenon that I am 100% certain of is the mass sum of experience itself: my entire set of sense data entering my mind. The first conclusion I would make is that all of it is 100% certain. If this weren’t the case, how could I make any further certain or uncertain conclusions? There wouldn’t be a standard for my ideas. To put it into better words, the reason why our ideas are certain is from observing an orderly universe. Even if we in fact lived in a disorderly universe, it wouldn’t cease to exist. Therefore, disregarding conceptual and mathematical knowledge, we have to be certain of the primal experiences that happen moment to moment. This is the starting point for everything. Otherwise, what’s the point of establishing probabilities or mathematical relationships if the foundation of everything isn’t concrete?

    I do agree that it’s impossible for someone to successfully reason what is going to happen next in the universe. Except this doesn’t change the fact that the present phenomenon that I am experiencing now is absolutely true. If this weren’t the case, where would you derive knowledge from, if you didn’t have sense perception of objects and events?

    -Max

  22. Anthony Lee says:

    Max
    I agree with most of your points here….but, to play devil’s advocate, I have to ask: how do you know your senses are valid and can be trusted? I mean, you are basing your knowledge of absolutes (the same ones I believe…mostly….except consciousness isn’t a primary axiom….it’s a corollary to existence, which is primary) on sense perception. True, that perception is our only window to reality…..but how can you prove you aren’t in the Matrix? Or schizophrenic?

    Scott
    Thanks for clarifying. I see it from your perspective now (I think). This “degrees of uncertainty” is the scientific way to view models of reality. The issue that I think people are struggling with is thinking that way philosophically….in the realm of beliefs. In science there are no beliefs. There is evidence. When dealing with beliefs, it becomes difficult for people to operate within degrees of uncertainty…. this would rather contradict the definition of belief.

    Thanks so much for this fantastic discourse either way.

  23. Max says:

    Hey Anthony,

    It’s great to have other people in the conversation. Just to answer quickly: the premise behind the Matrix actually provides a means of saying that the senses are valid. Do you agree with this?

    If all data or information is perceptual data, and perceptual data is liable to error, then that means perceptual data is fallible. Except, apply this to all of existence, and reality doesn’t behave this way. Thus, if sense data is wrong, how would you go about replicating it into a perfect mirror of reality that couldn’t be recognized or identified, when your basic means of information is never correct? It’s a fun idea to talk about, and the films did a terrific job of capturing the main themes.

    -Max

  24. [...] written before that the only appropriate way to look at knowledge is through degrees of belief. This means that almost nothing (aside from logical truths) is known perfectly. Instead everything [...]

