Tuesday, May 16, 2006

Extended Mind-Control

I went to a fascinating talk yesterday from Neil Levy, on the ethical implications of the extended mind thesis.

A large part of the talk was concerned with motivating the thesis itself, drawing on intriguing research in cognitive psychology (e.g. change-blindness) suggesting that our internal representations are far less complete than we typically realize. (It vaguely reminds me of the Dennettian ideas discussed here.) Instead of wasting mental effort creating detailed internal representations, we typically just let the world represent itself. If we need more detail, we can focus attention on the appropriate part of the world, accessing the external information much as we might access our internal memories or information stores. Levy also mentioned typical examples like using pen and paper to do arithmetic calculations, etc. This all adds up to an understanding of the brain as no mere thinking organ, but rather a tool or interface which enables us to think with the external world. (Levy didn't put it quite like that. But I rather like the idea.)

I suppose one might embrace the same insight in more conservative terms by speaking of "cognitive scaffolding", or the idea that our environment can amplify or augment our cognitive capacities. One might insist that, strictly speaking, all the real thinking goes on in the head, with the external parts merely "assisting" or "enabling" this process. But this may be a mere terminological quibble. In any case, I think the insight is more vividly captured by the idea that we actually offload cognitive work on to the external world, so that not all cognition occurs in the head.

Neil went on to point out how silly it is for people to get so much more upset about "internal" manipulations than "external" ones. He cited the example of moral panic over students using Ritalin to get a slight boost in grades, when the indirect effects from disparities in nutrition, socio-economic status, cultural capital, etc., are so much greater. Or people get worried about brain scans "reading their mind", when all the scans do is identify neural correlates of already observable behaviour (that would come out in, say, cognitive psychological tests). Or they worry about neuroscientists coming up with mind-control devices, and ignore the far more advanced research into manipulation conducted by marketers and social psychologists.

Now, as Dave Chalmers noted in discussion, the ethical parity claims don't really depend on the extended mind thesis. One might simply recognize that the brain/mind is subject to external causal influences, and then say we should be just as concerned about those as we are about direct neural manipulation, since they're simply two means to the same end. But again, so long as one agrees on the relative unimportance of the internal/external distinction, it isn't clear that much hinges on the question of how liberal one wants to be with the word "mind".

(I should add that since any kind of manipulation will eventually affect the brain, it isn't clear how the internal/external distinction is meant to apply here. It really seems to be a matter of direct vs. indirect meddling with the brain. I suppose one who accepts the extended mind thesis will hold that some of these apparently "indirect" manipulations are actually direct manipulations of one's extra-cranial mind. But again, this seems a merely semantic difference, and doesn't have any obvious ethical implications, besides reinforcing what we should have already accepted, i.e. the relative unimportance of the direct/indirect distinction.)

Much of the discussion focussed on the question of why people typically think the distinction has ethical significance. I suspect that many people simply fail to realise that the environment has a causal impact on the brain. (Recall the recent AP release claiming that brain studies "add weight to the idea that homosexuality has a physical basis and is not learned behavior" -- as if learned behaviour had a non-physical basis!) We don't normally think of conversation as a way to alter another person's brain, for instance.

Jeanette proposed that concerns about 'authenticity' might play an intuitive role here too. Taking drugs, whether for your muscles or your neurons, strikes many as "cheating". One should earn self-improvement through hard work and effort. How that makes drugs relevantly different from good nutrition, I'm not sure. Perhaps such judgments draw on preconceptions of what's "natural".

I wouldn't be surprised if concerns about "unnaturalness" played a large role in people's special fear of neuroscience. Perhaps we can even develop a coherent argument out of it. Here's one I proposed in question time: it seems like direct neural manipulation has more radical potential. People have long been manipulating each other indirectly (i.e. via behavioural or psychological means), and we've evolved various natural defences accordingly. But neurotechnology promises an entirely new form of manipulation, against which we have no defence. Because we are specially vulnerable to internal manipulations, then, we should be specially concerned about them. Even though the internal/external distinction has no intrinsic import, it may have instrumental significance.

In response, Neil pointed out that social psychology wasn't around back in the EEA either, so we might be just as vulnerable to the new manipulative strategies that it uncovers. (Apparently research shows that we really are quite vulnerable to "ego depletion" and other effects.) Still, it seems plausible that there is more potential for radical manipulation by direct than by indirect means. Even so, I think Neil could take this on board easily enough by simply pointing out the instrumental nature of our concern. In the end, it doesn't much matter whether a manipulation is directly 'internal' or 'external' to the brain. At best, this distinction might correlate with other differences that really matter. But then we might as well just focus directly on those.

One proposal that came up in discussion was the idea of manipulations "bypassing our rational faculties". That could be one reason for direct neural manipulation being worse than, say, polemical arguments. But if we found some 'external' means of bypassing the rational faculties (perhaps used by advertisers), then we should consider such psychological manipulation on a par with the neural sort.

A final idea Levy proposed is that we should consider attacks on external cognitive facilities (e.g. closing libraries, or cuts to education funding) to be equivalent to imposing mental retardation. Restricting people's access to information is effectively (or, on Levy's view, literally) to shrink their minds. Though again, we can capture the idea more conservatively by denying the significance of the negative/positive freedom distinction. (Doing/allowing might also need to go, e.g. if we want to say that neglecting to provide adequate education is equivalent to giving kids lobotomies.)

Lots of interesting issues raised there, anyway. I'm looking forward to Levy's forthcoming book on Neuroethics...



  1. Yup, it's interesting stuff, at least partly for the potential to increase our effective intelligence by using certain tools. It seems that there's a limitation to how much those can help, though, because our interaction with the external world takes place in an extremely small bandwidth compared to the total bandwidth within our brains. The faster you can interact with external symbols, the more you can rely on them. But we can't really do that very fast yet.

  2. I think the limits there might be ethics (instead of technology).

  3. Gee Richard, your description makes it sound like a good paper! New research suggests that cultural scaffolding is responsible for the innate powers of the brain itself. I'm sure there's some cool way to embed links, but since I don't know it...

