Where did consciousness come from? A recent piece in New Scientist (paywalled, I’m afraid) reviewed a number of ideas about the evolutionary origin and biological nature of consciousness. The article obligingly offered a set of ten criteria for judging whether an organism is conscious or not…

  • Recognises itself in a mirror
  • Has insight into the minds of others
  • Displays regret after making a bad decision
  • Heart races in stressful situations
  • Has many dopamine receptors in its brain to sense reward
  • Highly flexible in making decisions
  • Has ability to focus attention (subjective experience)
  • Needs to sleep
  • Sensitive to anaesthetics
  • Displays unlimited associative learning

This is clearly a bit of a mixed bag. One or two of these have a clear theoretical base; they could be used as the basis of a plausible definition of consciousness. Having insight into the minds of others (‘theory of mind’) is one, and unlimited associative learning looks like another. But robots and aliens need not have dopamine receptors or racing hearts, yet we surely wouldn’t rule out their being conscious on that account. The list is less like notes towards a definition and more of a collection of symptoms.

They’re drawn from some quite different sources, too. The idea that self-awareness and awareness of the minds of others have something to do with consciousness is widely accepted, and the piece alludes to some examples in animals. A chimp shown a mirror will touch a spot that has been covertly placed on its forehead, which is (debatably) said to prove it knows that the reflection is itself. A scrub jay will re-hide food if it was seen doing the hiding the first time – unless it was seen only by its own mate. A rat that pressed the wrong lever in an experiment will, it seems, gaze regretfully at the right one (‘What do you do for a living?’ ‘Oh, I assess the level of regret in a rat’s gaze.’) Self-awareness certainly could constitute consciousness if higher-order theories are right, but to me it looks more like a product of consciousness and hence a symptom, albeit a pretty good one.

Another possibility is hedonic variation, here championed by Jesse Prinz and Bjørn Grinde. Many animals exhibit a raised heart rate and dopamine levels when stimulated – but not amphibians or fish (who seem to be getting a bad press on the consciousness front lately). There’s a definite theoretical insight underlying this one. The idea is that assigning pleasure to some outcomes, and letting that drive behaviour instead of just running off fixed patterns instinctively, allows an extra degree of flexibility which on the whole has positive survival value. Grinde apparently thinks there are downsides too, and on that account considers it unlikely that consciousness evolved more than once. The basic idea here seems to make a lot of sense, but the dopamine stuff apparently requires us to think that lizards are conscious while newts are not. That seems a fine distinction, though I have to admit that I don’t have enough experience of newts to make the judgement (or of lizards either, if I’m being completely honest).

Bruno van Swinderen has a different view, relating consciousness to subjective experience. That, of course, is notoriously unmeasurable according to many, but luckily van Swinderen thinks it correlates with selective attention, or indeed is much the same thing. Why on earth he thinks that remains obscure, but he measures selective attention with some exquisitely designed equipment plugged into the brains of fruit flies. (‘Oh, you do rat regret? I measure how attentively flies are watching things.’)

Sleep might be a handy indicator: van Swinderen believes it is the creatures that do selective attention that need it. They also, from insects to vertebrates (fish are in this time), need comparable doses of anaesthetic to knock them out, whereas nematode worms need far more to stop them in their tracks. I don’t know whether this is enough. I think if I were shown a nematode that had finally been drugged up enough to make it keep still, I might be prepared to say it was unconscious; and if something can become unconscious, it must previously have been conscious.

Some think by contrast that we need a narrower view; Michael Graziano reckons you need a mental model, and while fish are still in, he would exclude the insects and crustaceans van Swinderen grants consciousness to. Eva Jablonka thinks you need unlimited associative learning, and she would let the insects and crustaceans back in, but hesitates over those worms. The idea behind associative learning is again that consciousness takes you away from stereotyped behaviour and allows more complex and flexible responses – in this case because you can, for example, associate complex sets of stimuli and treat them as one new stimulus, quite an appealing idea.

Really it seems to me that all these interesting efforts are going after somewhat different conceptions of consciousness. I think it was Ned Block who called it a ‘mongrel’ concept; there’s little doubt that we use it in very varied ways, from describing the property of a worm that’s still moving at one end, to the ability to hold explicit views about the beliefs of other conscious entities at the other. We don’t need one theory of consciousness, we need a dozen.