In my first post on the Standards of Evidence, I pointed out that it is rational to adjust our standards of evidence depending on how outlandish or unlikely a claim is. I concluded that with "believable" claims, our brains work quickly to weigh the small amount of evidence we are given and take a reasonably rational stand on the merits of the claim, while with extraordinary claims, we require more evidence before accepting them.
This post will examine another aspect of a claim that we can rationally use to determine the level of evidence we should require before accepting it: risk.
Risk is one of my favorite board games, even though I've never played it in person with anyone — only online, and often. The idea behind the game is that you must constantly decide what actions to take based on what your opponents are likely to do and what you stand to lose if you make a misstep. If you overextend or make a bad move on the map, you can easily be overrun. A popular strategy (especially early on) is called "turtling": you conserve your army and build up numbers until the very end, when you overpower the remaining players. This is a minimal-risk strategy.
In the arena of truth claims, minimal-risk strategies are also worth considering. Consider the following scenarios:
In the first scenario, a close friend tells you that he bought a big dog. He wouldn't have to try hard to convince you of this, not only because buying a dog is a common thing to do, but also because if he's lying, it makes little difference to you. Finding out you're wrong about your friend owning a dog is virtually harmless. Disbelieving him may damage your trust in your friend, but the risk of not believing him may be even greater: if you tell your friend you don't believe him, demand proof, and remain incredulous of every claim he makes, he might decide it's not worth his time to be around you. Constantly having to prove every minor claim would get old, and you would likely lose friends that way.
In the second scenario, a close friend tells you that your wife and daughter are being held hostage in a nearby building. He hands you a shotgun, gives you the location, and says you have five minutes to bust into the building and take out all the captors before they kill your family. Not only is this an unlikely scenario, but accepting your friend's claim is also extremely risky. If he's wrong and you believe him, you could endanger many innocent lives, go to prison, and worse.
On the other hand, if he’s right and you do nothing, you could lose your wife and daughter. So the claim is at least worth examining. Merely by alerting you that something valuable to you is at risk, your friend has forced you to examine his claim at least somewhat seriously. This is why it’s illegal to scream “fire” in a crowded theater.
So here we have two scenarios: in the first, little is at stake whether you believe correctly or not; in the second, a great deal is at stake either way.
The risk of losing something valuable is a highly motivating factor when examining claims, and it forces us to apply at least some standard of evidence to claims where there is much to lose if we're wrong.
In a later post, I will examine how these two criteria for raising our standards of evidence can help us sort through various God claims. This will definitely include a discussion about Pascal’s Wager. For now, thanks for reading!