## Saturday, February 26, 2005

The other day I thought of a paradox that arises from admitting fallibility. For each of my beliefs, I believe it to be true (that's just what belief is). So I believe that all my beliefs are true. But I don't believe this at all! Rather, I know I'm fallible - that I have some false beliefs. So we have a contradiction.

This reminds me of the lottery paradox:
> The lottery paradox begins by imagining a fair lottery with a thousand tickets in it. Each ticket is so unlikely to win that we are justified in believing that it will lose. So we can infer that no ticket will win. Yet we know that some ticket will win.

(They go on to describe the 'preface paradox', which sounds equivalent to my fallibility paradox, though I had not heard of it before.)

In both cases, the problem involves joining many individual beliefs into one big conjunction. It seems plausible that if you believe that X and you believe that Y, then you believe that X and Y. But I think this general closure principle is mistaken.

The most obvious explanation is that belief comes in degrees. Of each individual lottery ticket, I believe with 99.9% certainty that it will lose. Conjoin five of them, and I believe the conjunction with only 99.5% certainty. Conjoin all 1000, and my 'belief' drops to zero - since the lottery is fair, some ticket must win, so I am certain that not all of them will lose.
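The arithmetic here can be checked with a short sketch. (This assumes, as the quoted paradox does, a fair lottery with exactly one winner, so the tickets' outcomes are not independent: the conjunction "tickets 1 through k all lose" holds exactly when the winner is among the other tickets.)

```python
def p_all_lose(k, n=1000):
    """Probability that k specific tickets all lose in a fair
    n-ticket lottery with exactly one winner: the winning ticket
    must be one of the other n - k tickets."""
    return (n - k) / n

print(p_all_lose(1))     # one ticket: 0.999
print(p_all_lose(5))     # conjunction of five: 0.995
print(p_all_lose(1000))  # all thousand: 0.0
```

Note that treating the tickets as independent (multiplying 0.999 by itself 1000 times) would instead give about 0.37; it is the fairness of the lottery that drives the conjunction all the way to zero.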

A similar solution will also explain my fallible beliefs. I will have more confidence in individual beliefs than in the conjunction of several of them. Make the group too large and my confidence level could fall so low that I wouldn't even affirm a 'belief' in the conjunction any more.

So, I've two general questions:
1) Are there any problems with this solution?
2) Are there any alternatives?

1. "Rather, I know I'm fallible - that I have some false beliefs." Is it possible to believe that you have false beliefs? Call the belief that you have at least one false belief B. If it turned out that all of your other beliefs were true, then B would be false, but then it would be true that you have a false belief. That liar's paradox is following you around again, and this time it's dressed up to look vaguely like a disjunctive version of a Gettier case (since your evidence is for the first half of "one of my other beliefs is false, or B is false," but it's the second half that keeps B from being determinately false).

As far as your questions go, one potential problem with the degrees-of-certainty approach is that it seems to make belief secondary to probability in cases where you can calculate probabilities. If you can calculate the probability of X, p(X), then whether you believe X seems to depend only on the quantity p(X) and some cutoff standard (which may vary from case to case) for how high p needs to be for it to count as a belief. Whether this is a problem depends on how well it fits with the rest of your views on belief and knowledge.
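The cutoff picture the comment describes can be made concrete in a few lines. (The 0.95 threshold and the specific probabilities below are arbitrary illustrative choices, not anything from the post; the point is just that threshold-based belief fails the closure principle from the main post.)

```python
THRESHOLD = 0.95  # hypothetical cutoff for a probability to count as belief

def believes(p):
    """Belief as 'probability at or above some cutoff standard'."""
    return p >= THRESHOLD

# Two independent propositions, each believed individually...
p_x, p_y = 0.97, 0.97
print(believes(p_x), believes(p_y))  # True True

# ...but their conjunction falls below the cutoff (0.97 * 0.97 = 0.9409),
# so "believe X and believe Y" does not yield "believe X-and-Y".
print(believes(p_x * p_y))  # False
```

This makes vivid why, on this picture, the conjunction step in both the lottery and the preface paradox is the step to reject.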

2. "The lottery paradox begins by imagining a fair lottery with a thousand tickets in it. Each ticket is so unlikely to win that we are justified in believing that it will lose. So we can infer that no ticket will win. Yet we know that some ticket will win."

This argument isn't logically sound, because it conflates two different things: an individual's belief about a single ticket, and the certainty that some one of the 1000 tickets will win. I think it's just a case of bad reasoning to believe with absolute certainty that a given ticket won't win. When it says "we can infer that no ticket will win", it should explain why!
