Wednesday, May 14, 2008

Epistemic Conservatism and Meta-coherence

Some commenters in the thread on 'structure and similarity' proposed that we should work with whatever concepts we happen to start with -- green rather than grue, or vice versa -- and only change if there's a compelling reason to do so: "I do not have to assume that the way I cut up the world now is the best way (in fact I'm pretty sure it isn't), I just have to consider if the [change] that you are proposing has useful consequences."

However, this runs headlong into a problem that I highlighted in a previous post: that principles of meta-coherence mean that such humble meta-beliefs rationally undermine our first-order beliefs. Or, as I put it in my comment:
If you think that both inductive arguments are equally reasonable, and that there's no sense in which X is objectively more similar to Y than to Z, then you have no grounds for believing one conclusion rather than the other (e.g. that future emeralds will be green rather than grue). By the principle of meta-coherence, you should be agnostic.

I introduce this notion in my global rationality paper:
Sometimes we may be in a position to realize that our initial judgments should be revised. I may initially be taken in by a visual illusion, and falsely believe that the two lines I see are of different lengths. Learning how the illusion worked would undercut the evidence of my senses. I would come to see that the prima facie evidence was misleading, and the belief I formed on its basis likely false. Principles of meta-coherence suggest that it would be irrational to continue accepting the appearances after learning them to be deceptive, or more generally to hold a belief concurrently with the meta-belief that the former is unjustified or otherwise likely false.

As I further explain in my post 'Meta-Coherence vs. Humble Convictions':
It's [meta-]incoherent to believe that P whilst also believing that the weight of evidence fails to support P, since this is just to judge both [on the lower level] that P is true and [on the higher level] that P is probably not true after all.

If, faced with alternative possibilities, you nonetheless retain your belief in P, you must represent this to yourself as being because you judge that P is more likely true, and not just because you happened to have the belief in the past and don't feel like revising it. You can't consciously be an epistemic conservative, because the realization that you believe P partly for non-truth-related reasons will instantly lower your degree of belief in P, to match your assessment of the (subjective) probability that P is really true.
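
To put the constraint in rough credal terms (a sketch of my own here, not a formula from the quoted papers, and assuming that degrees of belief can be modelled by a credence function Cr): writing S_P for the proposition that the weight of evidence supports P, meta-coherence requires something like

\[
  \mathrm{Cr}(P \mid \neg S_P) \;\le\; \tfrac{1}{2},
\]

or, in a stronger reflection-style form,

\[
  \mathrm{Cr}\bigl(P \mid \text{the evidence supports } P \text{ to degree } x\bigr) \;=\; x.
\]

Either way, once you become confident that your evidence fails to support P, you cannot coherently remain confident in P: your credence must fall to match your own estimate of the probability that P is true.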

Some form of epistemic conservatism may be true nonetheless, in the trivial sense that we think our existing beliefs are true, and it will take positive work to convince us otherwise. But from a first-personal perspective, the reason we hold on to our existing beliefs is not the egoistic fact that they are our beliefs, but rather the "fact" (as it seems to us) that they are true. If you lose your confidence in the truth of a belief, and are "pretty sure" it's mistaken, then you can't sincerely believe it any more. You can't now appeal to epistemic conservatism as a way to hold onto your belief without judging it to be true. That's incoherent.

5 comments:

  1. This comment has been removed by a blog administrator.

  2. Thanks Rachael, I've moved your comment to a new post: here.

You start with a 'should' argument in paragraph one. In that context, should one always be logical (in this sense) if you know that doing so results in useless consequences*?

    "If you think that both inductive arguments are equally reasonable"

What if you think it is most likely that they are equal, but possible that they are not, and that if they are not, then X is similar to Y (maybe because it is more useful)?

*Or is there no disagreement here at all?

  4. I suspect that this is where we are going to fundamentally disagree.

    In your final paragraph you suggest that there is a trivial sense in which we do hold on to our beliefs, but that is because we think that they are true. I agree that this is the case, but I think that this is all we have access to. In fact from this 'trivial' level I build all my ideas about what is true. In other words all of my beliefs are fallible in that they could, given enough evidence, be altered.

This does not force me into denying that some or all of my beliefs are true; rather, I accept as true those beliefs that have satisfied a certain amount of testing. In addition, some beliefs are more useful (or expedient) to hold as true. The key issue here is to understand truth as something that is part of human activity, and not as something that stands outside that activity.

  5. "The key issue is here is to understand truth as something is part of human activity, and not as something that stands outside that activity"

    I have no idea what distinction you are trying to draw here. (Neither option you mention sounds particularly coherent. Truth does not "stand" anywhere, and it is only a "part" of human activity in the trivial sense that we have beliefs which may in turn be true or false.) But that's not the issue, anyway.

    The key issue is meta-coherence, between (1) your belief that future emeralds will be green rather than grue; and (2) your meta-beliefs about the reliability of that #1 belief.

If you believe that green is no more natural than grue, and that a grue-speaker who believes that your #1 is false (and that future emeralds will instead be grue) would be doing just as well as you, then your beliefs are incoherent. You should give up either your first-order belief that future emeralds will be green, or your meta-belief that this first-order belief is no more likely to be true than its negation.

    This is all quite consistent with working from the beliefs you have. I'm pointing out that the beliefs you have are inconsistent, so you have to get rid of one of them. Start from your 'green' belief, and conclude that green is natural. Or start from your denial of naturalness and conclude with agnosticism about the colour of future emeralds. Those are your options.

    My point is simply that, insofar as you are rational, you cannot hold onto your 'green' beliefs merely on the grounds that you started off with them. You must further think that they are most likely true. But a precondition for this is that green is more projectible than grue. So you can't accept the one without the other. Conservatism isn't enough. You must think that you have real grounds for your beliefs.

