Comments on Philosophy, et cetera: "Harms, Benefits, and Framing Effects" (Richard Y Chappell)

Gunnar Zarncke (2017-10-30):

I don't know about the full context of the question, but I wonder whether nonlinearity plays a role. For example, losing the last members of a group may play a role, and the chosen answers seem to prevent that.

Brandon (2017-10-03):

That seems plausible. The thing that's tricky is that (for this) you need the qualification simultaneously to neutralize any causal implicature and not ruin the symmetry that's important for the Kahneman-Tversky set-up. Offhand, I can't think of any problem a "still" or "nevertheless" qualification would cause for either one, so it looks like it would work.

I think the guarantee requires that the causal context already be neutralized for any specification to work (otherwise the specification gets confined to the causal context), but with "still" or "nevertheless" that doesn't look like it would be a problem, either.

Richard Y Chappell (2017-10-02):

Hmm, good point. I wonder if further refinements could avoid the natural causal reading? (The guarantee we can add in easily enough by specifying that 'exactly 400 will die'.) Maybe something like: "If program A is adopted, there will still / nonetheless be 400 people who die of the disease." That no longer sounds like A causes the deaths, right?

Brandon (2017-10-02):

I don't think your proposed revision actually addresses the issue, which is that a very common natural reading in ordinary contexts of

"If A is done, B will happen"

is that by doing A, one causes B. This is not affected by simply revising the consequent. That is, at least one natural reading of

"If program A is adopted, 400 people will die of the disease"

is that the adoption of program A causes 400 people to die of the disease. Same problem, assuming the diagnosis is correct: literally the only information people have about the program is that it would (on one reading) cause 400 people to die of the disease. And surely it's not surprising that they would be inclined to think *that* a bad deal.

Independently, I wonder if there might be an effect I notice with my students on trolley problems: if they are not explicitly told that something's guaranteed, they often assume that it need not be. So it could well be that people are reading '400 will die' as 'at least 400 will die' and '200 will be saved' as 'at least 200 will be saved', which also breaks the symmetries.