Researchers from the Max Planck Institute and the University of St Andrews have come up with some fresh evidence that chimps have a theory of mind (ToM) – that is to say, that they are aware that other individuals possess knowledge, and that what others know doesn’t always match what they themselves know.
The researchers placed dummy snakes in the path of wild chimps: the chimps gave warning calls more frequently in the presence of others who, so far as the calling chimp could tell, had no prior knowledge of the presumed hazard.
This kind of research is fraught with difficulty. Morgan’s Canon tells us that we should only use consciousness as an explanation for some item of behaviour where no simpler explanation is available, and similarly we should be reluctant to grant chimps ToM unless there is no alternative. Couldn’t the explanation be, for example, that chimps who are alone are more likely to give warning calls, either because that response is just hard-wired, or because they are more fearful when alone? Alternatively, perhaps the observed behaviour could be largely explained if chimps are programmed to give a warning call, but only one, for each member of the troop they spot or hear approaching?
Although I think Morgan’s Canon is absolutely the right kind of principle to apply, it is difficult to satisfy, and if read too literally perhaps impossible. We know from all the discussions of philosophical zombies that there are plenty of thoughtful people who find it conceivable that all of human behaviour could be produced without consciousness (at any rate, without the kind of consciousness that requires actual phenomenal subjective experience). If that’s really true then there are surely no cases in which behaviour can strictly be explained only by consciousness. It’s equally hard, going on impossible, to rule out every conceivable alternative explanation for the chimps’ behaviour – but the researchers were well aware of the problem and the key point of the research is the observation of circumstances where, for example, chimp A could be presumed to have heard an earlier warning, but chimp B could not. So we can take the claims they make as well grounded. It seems that with some inevitable margin of doubt we can reasonably take it as established that chimps do have ToM.
So what? We might have been willing to assume that that was probably the case anyway. We already know chimps are extremely bright, and there are many who believe they can develop language skills which approach human levels. Language is what makes it so much easier to know for sure that human beings have ToM – they can tell us about it – so if chimps are anywhere near that level it’s really no surprise that they also have ToM. (Interesting, by the way, that the current research uses the chimps’ proto-linguistic warning calls.) One further conclusion offered by the researchers themselves is that ToM must have emerged in the primate lineage at a point before the divergence of chimp and human ancestors: but that ain’t necessarily so. It could equally be that each lineage has developed a functionally comparable capacity in parallel, one which the last common ancestor need never have had.
Do we and our pongid cousins have the same ToM? In some respects obviously not. For one thing, we humans really do have actual academic theories of mind; and we write novels filled with the putative contents of minds that never existed. We have ToM on levels which completely transcend the mental lives of chimps. Are these, though, just fancy overlays on an underlying ability which remains essentially the same? Alas, there’s no easy way of telling without knowing what’s going on in the chimp’s mind – what it is like to be a chimp – and Nagel long ago told us that that was impossible.
Attempting to know the unknowable is nothing new for us, though, so let’s at least briefly try to achieve the impossible. There are lots of possibilities for what might be passing through the chimp’s mind: by way of illustration it could be any of the following.
- A cloudy sense of something indefinable but importantly snake-related which is missing in Chimp B.
- A mental picture of Chimp B continuing to advance and stumbling on the snake.
- A brief empathetic sense of being Chimp B, and a recollection that seeing the snake or hearing a warning has not occurred.
- Routine enumeration of the troupe and its whereabouts leading to a realisation that Chimp B hasn’t been around for a while.
- Occurrence of proto-verbal content equivalent to uttering the sentence “Look there’s B, who doesn’t know about the snake yet!”
There are plenty of other possibilities: cataloguing them would in itself be a challenging task. Moreover, humans are clearly capable of operating on two or more of these levels at once, and it would be mere speciesism to assume that chimps are not. Still, can we pare it down a bit: given that chimps lack full-blown human linguistic abilities and are relatively limited in their foresight, can we plausibly hypothesise that cases like the proto-verbal one, and others involving relatively complex levels of abstraction, are probably absent from the chimp experience? I’m not sure, and even if we can it doesn’t help all that much.
So instead I ask myself what state obtained in my own mind the last time I warned someone about a potential hazard. Luckily I do remember a couple of occasions, but interestingly introspection leaves me quite uncertain about the answer. This could be a result of hazy memory, but I think it’s worse than that: I think the main problem is that, so far as conscious thought goes, I could have been thinking anything. It feels as if there is no distinct single state of mind which corresponds to noticing that somebody needs to be warned about something; curiously, I feel tempted to examine my own behaviour and conclude that if I did go on to warn someone, I must have been thinking that they needed warning.
That kind of approach is another option, I suppose: we can take a behaviourist tack and say that if chimps behave in a way that displays ToM, then they have it, and that’s all there is to be said about it. If we can’t formulate clearly what kind of behaviour that would be, that just means ToM itself turns out to be mentalistic nonsense. The snag with that is that ToM is pretty certainly mentalistic nonsense to behaviourists anyway; so if we think the question is worth answering we have to look elsewhere.
We could get neuronal on this: we might, for example, be able to scan human and chimp brains and detect some distinctive patterns of activity which occur just when the relevant primate appears to be getting ready to issue a warning. If these patterns of activity occurred in the corresponding sections of the chimp and human brain (perhaps involving some of those special mirror neurons) we should be inclined to conclude that our ToMs were basically the same: if they occurred in different places we should be very tempted to conclude that evolution had recruited different sections of the two species’ brains to carry out the same function. This latter case is quite plausible – in human brains, for example, the areas used for speech don’t match the bits of the chimp brain used for vocalisations (which apparently correspond to areas used by humans only for involuntary gasps and cries and, strangely enough, for swearing).
Results like that might settle the evolutionary question; but not the deeper philosophical one. Even if we did use a different set of neurons, it wouldn’t prove we weren’t running the same ToM. Different human beings certainly use somewhat different arrays of neurons – no two brains are wired identically. If we came across the yeti and found he was fully up to human levels of consciousness, able to hold an impeccably normal human-style conversation with us and discuss ToM just as we do, and then we made the astonishing discovery that he had no prefrontal cortex and was using what in humans would have been his cerebellum to do his conscious thinking with, we would not on that account alone say he had a different kind of consciousness (at least, I don’t think we would).
So it looks to me as if we have a radical pattern of variation at both ends. All sorts of neuronal wiring (or maybe silicon or beer cans and string – why not?) will do at the bottom level; all sorts of cogitative content will do at the top levels. Somewhere in the middle is there a level of description where deciding that someone needs to be warned is just that and nothing else, and where we can meaningfully compare and contrast human and chimp? I suspect there is, but I also suspect that it resides in something analogous to a high-level mental metacode of a kind we should need a proper theory of mind even to begin imagining.