Comments on Philosophy, et cetera: Conscious Causation
Blogger comment feed, Richard Y Chappell (updated 2023-10-29)

Richard Y Chappell (22 August 2005, 5:52 AM):

You're right that it's possible for similar actions to have different causes, but I don't think that's what's going on in this case. Note that Zichard is behaviourally indistinguishable from me (by the definition of a 'zombie'), so when you ask him why he ran away, he'll say "because I saw a lion!" (He'll even <A HREF="http://pixnaps.blogspot.com/2005/08/zombie-philosophers.html" REL="nofollow">philosophize about consciousness</A>, completely unaware that he doesn't have any.)

Assuming Zichard has no consciousness, the real reason he ran away must be some physical fact (perhaps about light from the lion impacting on his retina, being processed by his brain in certain ways, computing that there is danger, and so on).

But I'm physically identical to him, so all those exact same events take place in <I>my</I> brain too. So it just seems <I>unnecessary</I> to add consciousness into the mix. The physical facts alone are sufficient to cause my action.

Richard Y Chappell (22 August 2005, 5:11 AM):

"<I>you say that Zichard having same physical structure as yours cannot perform same tasks as you coz you are performing them being conscious.</I>"

Actually, I was suggesting the opposite.
Zichard <I>would</I> (in some possible world) perform the exact same actions as me, simply in virtue of our physical identity, whether or not he was conscious. But our actions have the same physical causes. So if my action was a result of my consciousness, then my consciousness must consist in my physical facts, and Zichard must be conscious too (since he's physically identical to me). Hence the conclusion that zombies are impossible.

Richard Y Chappell (22 August 2005, 4:22 AM):

I don't find determinism problematic -- see <A HREF="http://pixnaps.blogspot.com/2005/08/red-pill-choosing-determinism.html" REL="nofollow">here</A>.

But I don't think it matters for present purposes anyway. Nothing I've said here assumes that any physical replica of the universe <I>must</I> turn out the same way. Rather, the dualist is committed to saying that there <I>could</I> be a physical replica of our universe lacking consciousness. More pressingly, my physical actions have physical causes. So if consciousness is non-physical, it seems superfluous when explaining my actions. Indeterminism doesn't change any of this.

Richard Y Chappell (13 August 2005, 9:06 PM):

Covaithe, I was thinking of a robot that merely managed to match my <I>actual</I> behaviour. If it were impossible <I>in principle</I> to tell us apart, then it would seem odd to say that we might nevertheless have different mental states. Hmm. Maybe functionalism isn't so bad after all?
;)

Anonymous (13 August 2005, 1:15 PM):

Richard: If you look inside the robot and find that its behavior is the result of a physical system that is really simpler than your own, then it should in principle be possible to find a test to demonstrate this fact. It should be possible to design a test where the robot behaves physically differently than you do. If, on the other hand, you claim that the robot is totally behaviorally identical to you under every conceivable test, I fail to see in what meaningful way you can describe its internal mechanism as "simpler" than your own, no matter what it is.

Richard Y Chappell (13 August 2005, 12:38 AM):

(This point is made by Jack Copeland in his excellent introduction to Artificial Intelligence -- highly recommended reading if you're interested in this stuff.)

Richard Y Chappell (13 August 2005, 12:35 AM):

Genius, I think you've misunderstood my argument. I wasn't suggesting that robots cannot be conscious. Rather, I was suggesting that it's possible for a robot to behave identically to me without being conscious.

"<I>we have all the evidence that it is conscious as we have for you except that it is a machine.</I>"

No, the crucial difference isn't <I>that</I> it's a machine, but rather, the exact details of <I>how</I> the mechanism works.
Functionalists want to treat agents as a "black box", ascribing consciousness purely on the basis of behaviour. I was suggesting that we need to look <I>inside</I> the black box, at the causal mechanisms that underlie the behaviour. If the robot's mechanisms are of sufficient complexity, we might be justified in ascribing genuine understanding and even consciousness to it. But if we find the behaviour is arising from simplistic chatbot-style heuristics, then we can conclude that it's just a big fake. It's not just behaviour that matters; the underlying "programming" is crucial too.

Genius (13 August 2005, 12:23 AM):

Richard,
I'm not sure your argument is very strong. You are saying that the robot does all the same things as you, and thus we have all the evidence that it is conscious as we have for you, except that it is a machine.

For an external actor to deny it the status of consciousness, they would have to also deny you that status.
Unless you use the "understanding" argument, which I will detail as follows.

The "understanding" argument is an implied argument, one that one often finds in the justice system and in religion. There is a complex system, and you declare that if you can understand it, it is (let us say) "not conscious", and if you can't understand it, it is "conscious".

Similarly, in religion: if you can't understand it, it is god; if you can, it is nature. And in law: if you can understand it, it is OK; if you can't, it is "evil".

In this case you might say: I can see why the computer comes up with those answers, as a simple function of some inputs. The problem is that understanding is internal to you, so how can it be valid for judging?

The other possibility is that you define consciousness arbitrarily as "what I have", or "what I and every natural-born human has", in which case you can deny consciousness to a robot that exceeds you on every test, merely because it is not identical (in every respect) to you.

It is valid for you to exclude the robot because it can't do something you can do (at the risk of being arbitrary again), but that implies the robot does something different to you.

Otherwise, when you say "Our behaviours both have physical causes, but they are different physical causes. Maybe mine are of the right type to give rise to consciousness, whereas Bichard's aren't," you are using the tool of defining a hypothetical that is potentially impossible, and then marvelling at the fact that it is possible as a result of your hypothetical.

For example, I might say: imagine two things being pulled towards the earth on the same trajectory, one by gravity, the other by a different but absolutely identical force. You look and find that one is being affected by gravity while the other is being affected by another force; therefore you cannot say that if something obeys gravity, it is being affected by gravity.

Now I guess you can say "yeah, that's true", but few people use that hypothetical to dispute gravity. I think you quite often, to some extent, use such arguments, which emerge naturally from the impossibility of some hypotheticals.

Richard Y Chappell (12 August 2005, 10:16 PM):

I don't quite follow your previous comment. The denial of (1) is physicalism, which makes zombies impossible. Anything physically identical to myself must be just as conscious as I am, because my consciousness consists in some of those physical facts.

Now, physicalism does not entail behaviourist functionalism. Let me illustrate why.

Suppose I had a behavioural twin 'bot' called Bichard. Bichard exhibits identical behaviour to myself, but if you look inside his head you find not a brain but, rather, a clever yet superficial Eliza-like bot program. (Okay, it seems unlikely that a superficial program could exhibit such complex behaviour, but never mind that for the moment.)

I take it functionalism is the claim that if I am conscious, then my behavioural duplicate Bichard must also be conscious.
But clearly the denial of (1) does not entail any such thing, for Bichard and I are physically very different. Our behaviours both have physical causes, but they are <I>different</I> physical causes. Maybe mine are of the right type to give rise to consciousness, whereas Bichard's aren't.

Richard Y Chappell (12 August 2005, 8:37 PM):

Covaithe, I guess you're right that the dualist could say something along those lines. Interesting.

Pulp, I have no idea what you're trying to say. It seems fairly clear that we <I>are</I> able to imagine other beings as conscious or not, even if the only consciousness we have first-hand experience of is our own. I just don't see the problem here.

Richard Y Chappell (12 August 2005, 8:33 PM):

Paul, I don't think the rejection of (1) commits one to functionalism. We might still hold that it matters <I>which</I> physical causes were behind the action. For example, we can look inside chat-bots like 'Eliza' and see straight away that they are merely cleverly programmed so as to give the false impression of understanding, when there is no deep parsing of semantics going on. So we can hold mere behaviour (even passing the Turing test) to be insufficient to establish consciousness, for the behaviour must also have the right kinds of causes.

(I don't find the Chinese Room objection very compelling, though.
I'm inclined to simply insist that the system <I>does</I> understand Chinese, even if the conscious agent doesn't.)

Genius (12 August 2005, 4:09 PM):

It seems to me you are a zombie in one sense and not in another. For example, insofar as you define what you have as consciousness, you are not a zombie. HOWEVER, in reality what you have is far less than what we might philosophically define as consciousness or free will, etc.

Anonymous (12 August 2005, 1:51 PM):

Regarding 2', perhaps a dualist might argue that a non-zombie's non-physical consciousness is capable of overriding, but may choose not to override, the physical processes that translate physical causes into physical actions. In other words, where consciousness is involved, physical actions are *not* caused solely by physical causes, but have an additional, non-physical cause. This is exactly the negation of assertion 2.

What I'm saying, perhaps not very well, is that I don't see how a dualist, in rejecting 2, is necessarily forced to accept 2'. Of course, IANAD(ualist), so maybe I'm missing something.

Fun reading, though. :)

Steve (12 August 2005, 11:48 AM):

I'm sorry that I didn't read your earlier post before commenting.
I see you addressed my point there.

Steve (12 August 2005, 11:11 AM):

Hi Richard. I agree with you that zombies are impossible, for the same reason. But this is not an argument for physicalism, IMO. The problem with physicalism is a failure to explain why this is so. I don't think it is satisfying to say that a physical system put together the way I am "just is" conscious. Nowhere in the third-person methodologies of present-day physics do we find an entity or process which would explain why first-person experience exists in the world.

I also agree with your saying that a reductionist argument makes sense. But reductionist approaches which eliminate the explanandum aren't explanations at all. There must be some fundamental entity and/or process in the world we can reduce our experience to.
 - Steve Esser

Anonymous (12 August 2005, 8:13 AM):

But to impute that you are conscious and Zichard is, hypothetically, non-conscious does not compute in the first place, because your 'consciousness' has no reference other than yourself.

If you cannot define, grasp, or identify 'consciousness', or 'what it is to be conscious', without self-reference, then you're never really going to be in a position to imagine a Zichard, much less another human being, as conscious or non-conscious.

In which case, for all you know, Zichard could be conscious or non-conscious, or you could be your very own Zichard, but you wouldn't be able to tell the difference.
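Several comments above contrast genuine understanding with "superficial Eliza-like" heuristics that merely fake it. A minimal sketch of what such shallow pattern matching looks like, in Python (the rules and replies below are hypothetical illustrations, not ELIZA's actual script):

```python
import re

# Each rule pairs a surface pattern with a canned reply template.
# There is no parsing of semantics anywhere: matched words are simply
# echoed back inside a fixed template.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "How long have you felt {0}?"),
]
DEFAULT = "Please go on."

def respond(utterance: str) -> str:
    """Reply by trying each shallow rule in order."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1))
    return DEFAULT
```

Looking "inside the black box" here reveals only a rule table and string echoing, which is the sense in which behaviour alone, even convincing behaviour, is claimed above to leave the question of understanding open.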