Kolodny argues that we don't have conclusive reason to be rational. He appeals to the "bootstrapping objection": rationality requires us to do what we have apparent conclusive reason to do. Suppose (for reductio) that we have conclusive reason to be rational. Then we have conclusive reason to do whatever we have apparent conclusive reason to do. But that's absurd: you can't just drop the 'apparent' like that. Apparent reasons need not be real reasons. From the mere fact that you take yourself to have a reason, it doesn't follow that you really do have one; it's possible to be mistaken about such things! So the supposition is false: we do not have conclusive reason to be rational.
Still, it seems to me that we have reasons of sorts to be rational, and not just instrumental ones (e.g. the rule-utilitarian-style suggestion that rationality is the most reliable strategy in the long run). Here's why:
Suppose an angel comes along and offers you the following choice. She will alter your brain in one of two ways. (1) She will make you perfectly rational. You will be able to reason perfectly well, and never suffer from weakness of the will, so you will always intend to do what you have apparent conclusive reason to do, and so on. Or (2) she will make you totally irrational but perfectly lucky. Knowing the future, the angel will set up your brain so that you always pick the option that you in fact have conclusive reason to choose. But this "choice" will not result from consciously reflecting on those reasons. Instead, you will engage in terribly bad reasoning, and make fallacious inferences, all of which just happen by good fortune (and angelic tampering) to yield true conclusions.
Which option would you choose? It seems to me that there is something very much preferable about option #1. This shows that we have reason to be rational. It's a good disposition or characteristic to have. I wouldn't want to be incapable of rational reflection; that would suck much of the value out of life. It would be like losing your free will. (Indeed, on some conceptions, it would be to lose your free will!)
Now suppose the angel adds that, whatever you pick, she'll re-offer these options in one week's time. Now you should pick option #2. Why? Because you can't be sure which is the best permanent option, but if you temporarily pick option #2 then the angel will guarantee that your later decisions -- including next week's decision about which of these two options to keep -- are always right.
So, what will the angel make you pick for your permanent choice? Like I said above, it seems that option #1 is the better one. By restoring (and improving!) your rationality and free will, this option provides you with the most worthwhile life. Sure, you'll make some mistakes later on, but that's better than being a coincidentally perfect robot. Suppose I'm right about this, so that when you temporarily choose option #2, the angel would set you to pick option #1 the second time around. What does this show? Well, the angel sets you up to pick whatever you have conclusive reason to do. So it would show that you have conclusive reason to pick #1: to make yourself rational.
Does this mean you have conclusive reason to be rational? This looks like the same thing, but perhaps it isn't. Making yourself rational seems like a kind of acting upon yourself. It might be like the distinction between believing P and bringing it about that you believe P. These are two distinct types of action, as I explain in my essay on reasons for belief. So we might grant the above argument and still deny that we have reason to be rational. On the other hand, if we accept that there can be 'external' or 'inaccessible' reasons, then we might say that we have reason to be rational (there is something to be said in favour of it), even if this is not a reason that we could actually recognize and act upon.
(Kolodny thinks it's "fetishistic" to value rationality for its own sake. But I don't know about that. It seems to me that there really is something intrinsically valuable about being rational in general, or having rational capacities, at least. Though perhaps I'd agree that it's fetishistic to care intrinsically about doing what is rational in any particular case.)
So I guess I haven't really made much progress here. Maybe you should go read Clayton instead. Then come back and leave me some helpful comments :)