Friday, June 22, 2007

Coherent Persuasion

Peter argues that no-one has "ever been convinced to change their mind by a rational argument". I'm not convinced.
He reasons: "No rational argument then can be constructed to ever change a person’s mind, because we can never get to premises that people must accept."

Even if we grant this premise (must we?), the conclusion simply doesn't follow. At most, it shows that arguments won't necessarily rationally convince everyone. It remains an open possibility that some people will indeed be rationally convinced, since they may well be more committed to the truth of the premises than to the conclusion's falsity.

Still, I think there is something artificial about an argument's directionality, as demonstrated by the adage that "one man's modus ponens is another's modus tollens." Valid arguments are easily inverted, simply by switching the conclusion with a premise and negating each. Validity is preserved; the contraposed argument is logically equivalent to the original. (This refutes the common claim that logic tells us how to reason. At most, logic can provide us with wide-scope norms against inconsistency, but it cannot tell us which of the conflicting claims to give up.)
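
To make the inversion concrete, here is a schematic pair -- a generic illustration of my own, not one drawn from Peter's post (the LaTeX assumes the amsmath package):

```latex
% A schematic valid argument and its inversion.
% (Compile with \usepackage{amsmath} for \text.)
\[
\begin{array}{ll}
\textbf{Modus ponens:}             & \textbf{Modus tollens (the inversion):}\\
\text{(1) If } P \text{, then } Q. & \text{(1) If } P \text{, then } Q.\\
\text{(2) } P.                     & \text{(2') not-}Q.\\
\text{So, (3) } Q.                 & \text{So, (3') not-}P.
\end{array}
\]
```

Whoever is more confident of P than of not-Q will run the left-hand argument; whoever is more confident of not-Q than of P will run the right-hand one. The logic itself doesn't decide between them.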

So, philosophical debate should be thought of as producing... not "arguments" per se, but logical maps -- "inconsistent triads" and the like -- to show which claims cohere best with certain others, which ones rise and fall together, etc.
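
For an idea of what such a map looks like, here is a stock inconsistent triad -- my illustration, not one from the exchange above:

```latex
% A stock inconsistent triad: any two claims can be held together, but not all three.
% (Compile with \usepackage{amsmath} for \text.)
\[
\begin{array}{rl}
(1) & \text{Knowledge requires certainty.}\\
(2) & \text{We are not certain of anything about the external world.}\\
(3) & \text{We know some things about the external world.}
\end{array}
\]
```

The map doesn't say which claim to give up: the Cartesian rejects (2), the skeptic rejects (3), and the fallibilist rejects (1). It only displays the cost of keeping all three.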

It should be quite clear that this process can be rationally persuasive. We feel rational/psychological pressure to have a coherent belief set. So if someone can show that some claim P coheres better with our other beliefs than our present belief that not-P does, this may bring us to change our minds about P. And, indeed, this happens all the time.

People often hold opinions due to conceptual misunderstandings. (Think of the popular false dilemma between relativism and dogmatism.) These are easily cleared up, and any half-rational agent will change their mind upon learning of their error. Much if not most philosophy is simply a matter of overcoming sloppy thinking -- appreciating possibilities or implications that we'd previously missed.

Peter is implicitly assuming that we already have maximally coherent belief sets, so that no argument (logical map) would have any new information to give us cause to update our beliefs. But this assumption is patently false. We can - and do - learn things from others' arguments, and change our minds accordingly.

P.S. I've previously, in response to a reader's challenge, given examples of changing my mind in response to rational arguments. The issue of normativity and ultimate ends is the big one. More recently, learning about 2-D semantics radically changed my opinion of conceivability arguments. So, those are two very fundamental changes right there. (Of course, it's open to Peter to insist that the changes had non-rational causes. But that would seem unmotivated and uncharitable. I certainly think that my views have improved with time, and didn't merely "shift" in a rationally neutral fashion.)

Question for regular readers: have any arguments on this blog ever led you to revise your beliefs?

9 comments:

  1. I never believed that there were non-denumerable sets until I read Cantor's proof.

  2. The problem is that making your beliefs more coherent doesn't make them more likely to be true, unless you are a coherentist about truth. So all I see you as demonstrating here is that we have the ability to build a castle of abstractions in the sky, but not any way to make that castle match up with reality. And that matching to reality is what I care more about, and thus why I am irrationally motivated to abandon argument.

  3. I don't think you changing your mind proves that that change was fundamentally caused by logic; it's a harder nut to crack than that.

    GNZ

  4. Richard--you changed my mind about libertarianism; and a bit about intellectual property.

    Peter--First, there is one sense in which I agree with your rejoinder. If you hold P and an argument satisfactorily shows that ~P is more coherent with your beliefs, it may still be that rationality does not require you to change your mind. But elsewhere your thoughts seem muddled.

    You seem to vacillate, but we need to stick either to rationality or to truth. If we stick to rationality (which makes more sense), then this is a very plausible principle: for some belief P that you hold, upon encountering some ingenious argument, rationality would require that you change your mind to ~P. I especially like sdf's example of beliefs about the existence of non-denumerable sets in reaction to a diagonalization argument. Other examples are easy to come by. To reject this principle, which is formulated as a principle about you, Peter, would seem an act of incredible intellectual arrogance. Are your beliefs really such that if you were to examine them closely you would never find incompatibilities? You would never think, "Huh, it doesn't seem rational to hold these two beliefs?" If there are two beliefs that are incompatible like this (or twenty beliefs which are incompatible like this), couldn't an argument to that same effect be exactly the sort of thing which would rationally require of you that you change your beliefs?


    Of course, ~P might nevertheless be false. Rationality may, at times, diverge from truth. But if we stick to truth (which seems very peculiar to me), then again this is a very plausible principle: if you hold a false belief Q, then upon encountering a sound argument to the effect that ~Q, truth maximization requires that you change your mind.

    In any case, it seems that argumentation can occasionally both rationally require that you change your mind and give you truth-maximization reasons to do so.

  5. On second thought, it is probably foolish of me to think that I can change your mind about whether argumentation can change people's mind by argumentation. (And foolish of you, from your perspective, to try to convince me, and hence change my mind, that argumentation cannot change someone's mind.)

  6. Peter, one can be a coherentist about justification without being a coherentist about truth. As Jack effectively points out, incoherence is at least a sign of falsehood (and thus to be avoided on truth-seeking grounds), even if coherence alone is no guarantee of truth. So it should come as no surprise that pointing out an incoherence could lead someone to rationally revise their beliefs. (We may further support this by pointing out that, in appealing to other claims in their belief set, we are appealing to premises that they take to be true. Again, it should be clear that people may rationally revise their beliefs upon learning what's entailed by something they take to be true!)

    But in any case, your comment is beside the point. It merely raises a standard skeptical worry, about whether philosophical argumentation is a good route to truth. That's an interesting question in its own right (coherent argumentation strikes me as plainly superior to the alternatives, but I look forward to reading your future posts on this topic), but it clearly fails to address the question of whether rational persuasion is possible. I think I've pretty clearly established that your answer to this question was riddled with error, and we in fact have every reason to expect that rational persuasion is possible (and actual). It's a straightforward implication of the existence of hidden incoherence in our belief sets.

    Jack - "it is probably foolish of me to think that I can change your mind about whether argumentation can change people's mind by argumentation."

    Perhaps not. We think that argumentation can change minds, in general, and so we should expect this case to be no exception (well, unless our interlocutor is exceptionally stubborn or unreceptive -- as may be encouraged by the content of his thesis here).

  7. Richard,

    I agree with you that Peter's view that rational argument has never changed anyone's mind is quite absurd. It happens all the time, and arguments showing that several of one's beliefs are incompatible are a paradigm case. Coherence is constitutive of rationality, and since many of us aim to be rational, we will be disposed to respond to inconsistencies in our beliefs that are pointed out to us by others by revising our beliefs accordingly. You're right that logic cannot tell us which of the conflicting claims to abandon; and rationality can't either.

    One very interesting issue that arises out of these points concerns the normative status of rationality. Peter points out that what he is fundamentally concerned with is matching his beliefs to reality; and though this aim seems to conflict with his view that people (presumably including himself) only change their minds when they come to "like" another view more or "want" a view they don't yet hold to be true, it seems to me the right aim to have when doing philosophy or engaging in any other intellectual inquiry.

    But since rationality demands that we shift our views so as to be coherent, and since often enough we will have false beliefs that we will be strongly inclined not to give up, conforming to the demands of rationality will sometimes lead us to have more false beliefs, and perhaps fewer true beliefs as well. That is, sometimes we would do better in terms of maximizing our number of true beliefs and/or minimizing our number of false beliefs by failing to conform to the demands of rationality. Some have suggested that this implies that rationality is not normative. I'm skeptical of this conclusion, but I'm also not convinced that a coherent set of beliefs that includes many false ones is better than an inconsistent set of beliefs with fewer false ones.

  8. "You're right that logic cannot tell us which of the conflicting claims to abandon; and rationality can't either."

    I'd question the latter claim. Kolodny convincingly argues that there are some narrow-scope (directional) rational requirements. Consider the meta-incoherence involved in cases where one believes that P but also that the evidence (all things considered) is against P. Then, to quote myself, "Rationality requires us to go where our assessment of the evidence takes us, rather than revise our assessments to match the conclusions we’d like to reach. The latter sort of revision amounts to wishful thinking, not reasoning."

    A simpler counterexample is where one possible revision fits better than the others with your overall 'web of beliefs'. That particular revision then seems rationally superior, given the circumstances.

    "One very interesting issue that arises out of these points concerns the normative status of rationality."

    Indeed! (Follow my previous link for a whole essay on the topic.)

  9. So, philosophical debate should be thought of as producing... not "arguments" per se, but logical maps -- "inconsistent triads" and the like -- to show which claims cohere best with certain others, which ones rise and fall together, etc.


    I think this is right (as indeed I think your whole argument is right); but I think it's important to note that in the sense that philosophical debate does this, all debate does this. Two physicists debating a theory, for instance, are doing precisely the same thing. The reason it is never merely a set of castles in the air is that some claims in this 'logical map' are reasonably seen as related to actual facts by observation, experiment, and the like. (Duhem has some lovely discussions in which he shows this in operation.) In other words, not all claims are equally evident. Someone who rejected this would be in such a state that they couldn't be persuaded of anything. But it's consistently breaking the connection between reality and claims used in argument that does this; and virtually no one does this consistently.

    What I think Peter's reasoning actually shows is how utterly silly it is to think terms like 'rationally compelling' are a useful way to evaluate arguments. Arguments do not compel people to accept their conclusions; and rejecting an argument because it is not 'rationally compelling' in anything like this sense is a sure sign of irrationality. And yet it seems to have become a fairly common thing.

