Tuesday, February 17, 2009

Rational Akrasia

I must say, I'm a huge fan of Nomy Arpaly's book Unprincipled Virtue. In the second chapter she argues against the common assumption that our deliberative judgments about what's rational (for us to do or believe) have any special normative authority or significance in determining what would in fact be rational for us. That is:
[S]ometimes an agent is more rational for acting against her better judgment than she would be if she acted in accordance with her best judgment... there are cases where following her best judgment would make the agent significantly irrational, while acting akratically would make her only trivially so. (p.36)

This is so in cases where one's "best judgment" is itself completely irrational -- a point I previously made in my post on 'Subjective Oughts'. Though this fact is often neglected, I don't think there's any sane view of rationality on which 'S believes that φ-ing is rational' implies 'S rationally ought to φ'. Even on the most 'internalist' views, e.g. coherentism, it's possible to have false beliefs about what would be most coherent. That is, I might think that believing P is rational (and would increase the coherence of my belief set), when in fact this is false and my belief set would be more coherent on the whole were it to contain not-P instead. Coherentism then straightforwardly entails that I ought to believe not-P, despite the fact that I believe that I rationally ought to believe P. The higher-order judgment is just wrong. And the same may be true in the case of practical rationality, i.e. I may hold false views about what I rationally ought to do or desire. Following my judgment, in such a case, would not in fact be rational.

Once stated explicitly like this, the point seems very obviously correct. I guess one possible point of resistance could arise in readers who fail to distinguish what Arpaly calls an "account" of rationality from an "instruction manual". (Cf. the traditional distinction between a 'criterion of rightness' and a 'decision procedure'.) Obviously an instruction manual can't contain advice of the form, "Go against your all-things-considered better judgment", or "Do X, for a reason other than that this rule advises it", or "Don't think of an elephant". None of these is advice you can follow. Nonetheless, each may be a true statement of what it would in fact be most rational for you to do in the circumstances. As I keep stressing, trying hard is no guarantee of rationality. Sometimes (though hopefully not often) one's efforts may even be counterproductive* -- this is perhaps most familiar in the case of neurotic "over-analyzing", but another example would be overriding one's reliable (reasons-responsive) gut instincts with bad reasoning or rationalizations.

One virtue of Arpaly's discussion is that she highlights how (implicitly) familiar this point is in everyday life. We're all familiar with the idea of "a man who has some 'crazy notions' sometimes but whose common sense prevails 'in real life.'" (p.49) Most people aren't philosophers, or even particularly competent reasoners, so their explicit judgments may end up being downright dopey. If they fully internalized and acted on these dopey explicit beliefs, we might consider these folks fanatics. But because Uncle Bob doesn't really 'live out' or act on his dopey "best judgment" -- being instead restrained by his implicit "common sense", practical wisdom, and basic human decency (though he doesn't consciously realize it) -- we may judge that he is rational enough on the whole. His irrationality is restricted to his explicit beliefs, and he's otherwise ("in real life") not so bad. Akrasia -- failing to act on his explicit 'better judgment' -- thus makes him comparatively more rational than the fanatic he otherwise would have been. (Though of course he'd be even more rational if he didn't have such dopey explicit beliefs in the first place.)

* Stronger still: sometimes any attempt at explicit deliberation might prove to be essentially less-than-optimally rational. This point is more familiar in the context of ethics, where we may be required to respond directly to a person in need rather than mediating our response by any kind of deliberate moral theorizing. The person who considers the permissibility of saving his drowning wife before jumping in clearly has "one thought too many", as Bernard Williams puts it. We want people to be sensitive to moral considerations, but that doesn't require -- and sometimes precludes -- consciously deliberating about such things.

8 comments:

  1. You seem to be assuming that one cannot be acting upon 'judgment' when jumping in to save one's wife. Even though he is not considering permissibility, I don't think it's obvious that he's not following his 'better judgment' when he jumps in. Judgment, like reason, is an activity that uses many of our faculties simultaneously.

    And I don't see why the 'instruction manual' case isn't sufficiently compelling. It is clearly not rational for me to not do what I think is rational -- it's my best guess at what I should do. If you think there's a 90% chance that option A will win and a 10% chance that option B will win, then A is the rational choice, even if you're mistaken. If that's not the case, then it seems like people can never be rational (or whether we're being rational is effectively random, which does violence to the concept of rationality), as we're often mistaken about a great number of things.

  2. I'm mainly interested in the instruction manual problem, not in the external evaluation problem. I'd like a name for the problem I care about, so I can avoid confusion with people who think I am talking about the other problem. If "Is he rational" isn't clear enough, what would be clear enough?

  3. Robin - how about: "what advice should an instruction manual offer to wannabe rationalists?"

    But I'm not sure if you're going to be able to evaluate agents ("is he...") in any sensible way, once we recall that a person might violate a rule of the instruction manual without realizing it. For example, asking "Is he successfully following the rules of the instruction manual?" may yield the verdict 'No' in case of the person who follows his ill-formed (i.e. unwittingly anti-instruction manual) best judgment. If you want the answer to be straightforwardly 'yes' in such a case, perhaps your question is simply "Is he being rational by his own lights?" -- i.e. "Does he believe himself to be rational?" But it's hard to see why the answer to a question that subjective should have any real significance at all. (Even the most insanely irrational people might believe themselves to be rational. Such delusions should not be counted as a point in their favour.)

    Thom - "You seem to be assuming that one cannot be acting upon 'judgment' when jumping in to save one's wife."

    No, I didn't mean to suggest that the jumping-in case was itself an example of akrasia. It was simply a footnote demonstrating how explicit deliberation "might prove to be essentially less-than-optimally rational".

    "It is clearly not rational for me to [fail to] do what I think is rational"

    Erm, the whole point of my post was to demonstrate that such a claim is actually (indeed, "clearly") false.

    Note that the question is not whether you're mistaken about some fact (what option will win), but whether you're mistaken in your interpretation of the available evidence (what option is best supported by all your beliefs and desires).

    For an extreme example, suppose a new mother (in a bout of temporary insanity or depression) judges that her newborn baby will ruin her life, so that she "rationally" ought to abandon it. Further suppose that this judgment of hers is completely crazy -- in fact she's very attached to her child, and it will bring her great joy in the years to come, as is obvious to all her friends and should be obvious to her even now. Now suppose the mother, against her "better judgment", fails to abandon her child because her subconscious attachment prevents her (emotionally, she just "can't bring herself to do it"). Here it seems that her subconscious knows better than her conscious mind what is best for her. She keeps her baby, and does so for a good reason (namely, love -- not that this was transparent to her conscious mind).

    My claim is that her action here is far more rational than it would have been if she had followed her explicit "best judgment". Just consider the likely reaction of her friends. In cases like this, informed observers will say things like, "Thank goodness her 'common sense' showed through in the end, despite her 'crazy notions'."

    We all (implicitly) recognize that people's explicit judgments about 'what would be rational' can be mistaken, and in such cases their inner "compass" might steer them towards more rational action than their explicit deliberations can.

    (None of this implies that rationality is impossible.)

  4. (In other words, I'm respecting the standard understanding of rationality as supervening on one's mental states -- beliefs and desires. The point is just that rationality really does depend on all of one's mental states, and not just some "privileged" subset of beliefs that are salient during deliberation.)

  5. Strange - when you put it that way, I agree with you completely; but I still feel I reacted appropriately to your post. I wonder if there's some difference in how we're using 'judgment'?

  6. Which part of your original comment do you still endorse? I suggested that (i) your first paragraph was non-responsive (i.e. objecting to a claim I didn't actually make), and (ii) the claims in your second paragraph are false (i.e. an agent may be rational in failing to do what they believe to be rational). Do you disagree on either point?

  7. I'm a fan of Unprincipled Virtue too, for those very reasons.

  8. I endorse the entirety of my original comment; I just no longer take it to be an appropriate response to what you meant to say. I was taking 'best judgment' to include all of the things that you were including in 'rationality' -- I was basically pointing out what you said in your second comment.

    I normally take one's actions to be evidence for what's best supported by one's beliefs and desires. Fallible evidence, of course.

    And before I go, Webster was entirely wrong about 'judgment'. It should be 'judgement'.

