Friday, September 26, 2014

Where QALYs Go Wrong

My paper 'Against "Saving Lives": Equal Concern and Differential Impact' defends the use of QALYs (Quality-Adjusted Life Years) in medical resource allocation against several traditional objections. But along the way, I note several respects in which (it seems to me) not all life years -- even in perfect health -- are equal, and hence a straightforward QALY-maximization approach falls short.  I'll briefly outline them below, and invite readers to suggest any further examples I may have missed...

Wednesday, September 24, 2014

Bostrom's Superintelligence - Does AI constitute an Existential Risk?

The folks at OUP kindly sent me a review copy of Nick Bostrom's new book Superintelligence, exploring AI risk.  It's a topic that lends itself to eye-rolls and easy mockery ("Computers taking over the world? No thanks, I already saw that movie.") -- but I don't think that's quite fair.  So long as you accept that there's a non-trivial chance of an Artificial General Intelligence (AGI) eventually being designed that surpasses human-level general intelligence, Bostrom's cautionary discussion is surely one well worth having.  For he makes the case that an imperfectly implemented AGI would constitute an existential risk more dangerous than asteroids or nuclear war. To mitigate that risk, we need to work out in advance if/how humanity could safely constrain or control an AGI more intelligent than we are.

Thursday, September 18, 2014

The "Double Jeopardy" Objection to QALYs

I've previously discussed Harris's (1987) famous objection that the use of Quality-Adjusted Life Years (QALYs) in medical resource allocation is unjustly "discriminatory". Harris's second objection is that the use of QALYs gives rise to an unfair kind of “double jeopardy” (p. 190):
QALYs dictate that because an individual is unfortunate, because she has once become a victim of disaster, we are required to visit upon her a second and perhaps graver misfortune. The first disaster leaves her with a poor quality of life and QALYs then require that in virtue of this she be ruled out as a candidate for lifesaving treatment, or at best, that she be given little or no chance of benefiting from what little amelioration her condition admits of. Her first disaster leaves her with a poor quality of life and when she presents herself for help, along come QALYs and finish her off!
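To see the structure of the worry, consider some purely illustrative numbers (my own, not Harris's): suppose a prior disability leaves one patient with a quality-of-life weight of 0.5, while another patient is in full health (weight 1.0). If a lifesaving treatment would give each of them ten further years, treating the first yields 0.5 x 10 = 5 QALYs against the second's 10, so straightforward QALY-maximization directs the scarce resource to the already better-off patient. The disabled patient is thus counted against twice: once by the original misfortune, and again when that misfortune weakens her claim to treatment.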

Sunday, September 07, 2014

An Obligation to Abort? Moral Guidance vs. Reaction

Dawkins was widely condemned for his tweet a couple of weeks ago claiming that it would be "immoral" not to abort a fetus with Down Syndrome. The claim seems pretty implausible on its face if we read "immoral" in the "reactive" sense, indicating blameworthiness or moral criticizability. But if we instead address the question of first-personal moral guidance -- if faced with this situation, what should I do? -- Dawkins' response strikes me as more defensible. (Dawkins' own elaboration suggests that something in this vicinity was indeed his intended meaning.)