Friday, October 28, 2005

Native Empiricists

Why are we the way we are? Each of us currently possesses a wide range of dispositions, but where did they come from? The standard answer appeals to a combination of 'nature' and 'nurture' — genetic heritage and environmental influence. We may illuminate the problem by conceiving of it in terms of a 'search space'. Given all the possible actions open to an agent or animal, how does it decide what to do? Or, taking a step further back, how does evolution design organisms that achieve their biological goals? Put in this light, we can see that the solution involves knowledge of a (perhaps implicit) sort. The organism needs to be sensitive to the means by which it can achieve these goals. This would seem to open up two broad options. The organism might have the necessary knowledge "built in" - the 'nativist' solution - or it might instead learn from experience, as the 'empiricists' would have it. Put into an evolutionary framework, the 'innate knowledge' posited by nativists would have a phylogenetic explanation, i.e. one arising over evolutionary timescales. This can be contrasted with the more familiar ontogenetic explanations provided by empiricists, wherein knowledge acquisition occurs over an individual's lifespan.

Of course, we obviously do learn much from experience, so no-one would seriously propose that all of our knowledge is innate. The extreme nativist would have no sense organs at all, invariably performing one pre-programmed action after another, entirely insensitive to environmental contingencies. It's an absurd image. But the extreme empiricist would fare no better. A purely 'blank slate', lacking the drive of any prior dispositions whatsoever, would never do or learn anything. Even if provided with some innate desires and a general ability to learn (perhaps through trial and error, induction, etc.), such a creature would still have no basis on which to choose one action over another. A random choice, much like a random mutation, is very unlikely to be beneficial. The sheer vastness of phylogenetic timescales allows the latter strategy to eventually yield results for a species despite its unreliability. But an individual organism has no such luxury. It requires some guidance as to how to achieve its goals: what actions are worth 'trialling', which properties are projectible and thus apt targets for induction, and so forth.

In light of the multiplicity of natural properties, and the indefinitely varied gerrymandered objects one might identify, it seems doubtful whether the blank slate creature could even make sense of the preconceptual content delivered by its sensory organs. We need innate perceptual processes to cut up the world into identifiable chunks, drawing our attention to the properties and objects that matter, and neglecting the rest so as to avoid information overload. But even then, there are vastly many things in the world we could learn about. (How many blades of grass in this field? What does the inside of that lion's mouth look like?) How do we choose what to focus on? We clearly cannot learn what is worth learning about, in advance of learning it. Successful learning itself requires the guidance of prior (i.e. innate) knowledge and dispositions.

The empirical evidence backs up this theoretical result. For example, Garcia tested lab-raised rats (lacking any prior experience with such problems) by feeding them a novel food that caused nausea three hours later. The rats formed an immediate aversion to the food, despite the three-hour separation, indicating that they were innately prepared to associate nausea with novel foods rather than with more temporally proximate stimuli. Such 'innate constraints' aid learning by shrinking the search space, thus making it easier to find the solution. In general, innate biases can provide a scaffold for further learning, by drawing our attention to biologically significant features of the world that we might not otherwise appreciate. (For a human example, we might expect babies to be born with a rough 'face' template to guide their attention and enable them to learn their parents' faces.)
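
To make the point about shrinking the search space concrete, here is a toy sketch in Python. Everything in it is invented for illustration (the number of candidate cues, the cue names, the idea of testing hypotheses at random); it is not a model of Garcia's actual experiment. The point is just that a learner who treats every co-occurring stimulus as an equally live hypothesis faces a far larger search than one innately 'prepared' to consider only taste-like cues.

```python
import random

def trials_to_learn(candidate_cues, true_cue, rng):
    """Trial-and-error search: test hypotheses in random order
    (without repetition) until the cue that predicts nausea is found."""
    remaining = list(candidate_cues)
    rng.shuffle(remaining)
    return remaining.index(true_cue) + 1

rng = random.Random(0)

# A 'blank slate' learner treats every co-occurring stimulus
# (lights, sounds, places, tastes...) as an equally live hypothesis.
all_cues = [f"cue_{i}" for i in range(1000)]   # toy number
true_cue = "cue_7"                             # the novel taste, say

# An innately 'prepared' learner only entertains the taste-like cues.
prepared_cues = ["cue_3", "cue_7", "cue_42"]

runs = 200
blank = sum(trials_to_learn(all_cues, true_cue, rng) for _ in range(runs)) / runs
prepared = sum(trials_to_learn(prepared_cues, true_cue, rng) for _ in range(runs)) / runs

print(f"blank slate learner: ~{blank:.0f} trials on average")
print(f"prepared learner:    ~{prepared:.0f} trials on average")
```

On illustrative numbers like these, the unbiased learner needs hundreds of trials on average, while the prepared learner needs only a couple.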

So, empiricists must accept this 'minimal nativism'. But we may still capture the spirit of their position by suggesting that "culture is our nature", that our innate dispositions are geared towards learning, and that we do not generally come 'hardwired' with knowledge that could instead be acquired from experience. This position can then be contrasted with that of the 'rich nativist' who suggests that we have extensive innate knowledge of specific facts, perhaps relating to the Pleistocene environment that our ancestors adapted to.

From an evolutionary perspective, we should expect innate encoding to be favoured in cases of slow or no environmental change, as the robustness of an innate and invariable disposition can be relied upon in such circumstances. Environmental change that is too fast for genetic evolution to track, yet slow enough that a parent's solution still works for its offspring, should tend to favour social learning (i.e. cultural transmission) of adaptive information. And for extremely unpredictable environments, which change from one generation to the next, a heavier reliance on individual learning would make sense. Learning is of course the more flexible option, being sensitive to environmental contingencies in a way that innate "knowledge" or dispositions are not. It may also be more efficient for an organism to simply be equipped with the general cognitive tools to extract information from the world, rather than loading extensive and specific information into the mind right from the start.
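
This comparative logic can be made vivid with another toy simulation. The payoffs and learning costs below (0.05 for copying a parent, 0.3 for trial-and-error) are arbitrary assumptions chosen purely for illustration, and the model of each strategy is deliberately crude; it is a sketch of the reasoning, not a serious evolutionary model.

```python
import random

def average_payoff(strategy, p_change, generations=10_000):
    """Average per-generation payoff for one strategy, where p_change
    is the probability that the environment flips between generations."""
    rng = random.Random(1)   # same environment sequence for every strategy
    env = 0                  # current state of a simple two-state environment
    genetic = 0              # behaviour fixed before the simulation starts,
                             # i.e. over phylogenetic timescales
    parent_soln = 0          # what the previous generation ended up knowing
    total = 0.0
    for _ in range(generations):
        if rng.random() < p_change:
            env = 1 - env
        if strategy == "innate":
            total += 1.0 if genetic == env else 0.0
        elif strategy == "social":
            # copy the parent cheaply; works if the world hasn't moved on
            total += (1.0 if parent_soln == env else 0.0) - 0.05
        elif strategy == "individual":
            # trial-and-error always finds the answer, at a higher cost
            total += 1.0 - 0.3
        parent_soln = env    # assume each generation eventually catches up
    return total / generations

for p in (0.0, 0.01, 0.2, 0.5):
    scores = {s: average_payoff(s, p)
              for s in ("innate", "social", "individual")}
    best = max(scores, key=scores.get)
    print(f"p_change={p:.2f}: best strategy = {best}")
```

With these made-up parameters, the innate strategy wins when the environment never changes, social learning wins at intermediate rates of change, and individual learning wins when each generation may face a new world.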

Interestingly, humans evolved to have an extended juvenile stage of development, compared to other primates. As Dr. Sean Rice explains (HT: Fido), "we have not adapted to grow rapidly during adolescence, but rather to grow slowly before it; thus stretching out our childhood." This adaptation makes sense from an empiricist / minimal nativist viewpoint -- it creates room for a long period of learning and skill acquisition. It is less clear whether rich nativists can explain our long childhoods, and it certainly seems an uncomfortable result for those who would downplay the biological importance of individual development.

Humans seem to be uniquely well adapted for learning. Our best understanding of the matter thus leads us away from the old dichotomy of "nature vs. nurture", instead suggesting that our nature is to be nurtured.

4 comments:

  1. Yep. Ever read Matt Ridley's Nature Via Nurture (a.k.a. The Agile Gene)? That's pretty much what it says too.

  2. Who first coined the phrase "nature vs nurture"?
    It is a bit like saying
    "feet vs legs... - which causes walking?"

    I can just imagine all the experiments regarding people with sore feet or sprained leg muscles as scientists try to figure out whether it is indeed the feet or the legs.

  3. You've summed things up nicely. One caveat: Rice demonstrates what appears to be a synapomorphy (novel growth phase in Homo and Pan infants) as well as some apomorphies (sequential hypermorphosis and prepubescent neoteny in Homo sapiens).

  4. If you caught the last of Kim Hill's Face-to-Face shows, this was put so beautifully I almost cried (OK, I'm a sucker for a good argument). She interviewed one of the leading lights behind the truly remarkable Dunedin study of (I think) about 1500 people, now aged 32. He spoke inspiringly (how often does a scientist do that?) about the complex and fascinating relationship between genes and environments, relating to depression, anger, drug use, childhood poverty, and more.

