Consciousness = Entropy

Is consciousness a matter of entropy in the brain? An intriguing paper by R. Guevara Erra, D. M. Mateos, R. Wennberg, and J. L. Perez Velazquez says

normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values.

What the researchers did, broadly, was identify networks in the brain that were operative at a given time, and then work out the number of possible configurations those networks were capable of. In general, conscious states were associated with high numbers of possible configurations – that is, with high levels of entropy.
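To make the counting idea concrete, here is a minimal sketch of my own – not the authors' actual pipeline – of how a 'number of possible configurations' can be turned into an entropy figure: given some recording channels and an observed number of synchronised pairs, count the ways those pairs could be chosen from all possible pairings and take the logarithm. The channel and pair counts below are made up purely for illustration.

```python
from math import comb, log2

def configuration_entropy(n_channels: int, n_connected_pairs: int) -> float:
    """Entropy (in bits) as the logarithm of the number of ways the observed
    number of pairwise connections could be distributed among all possible
    channel pairs -- more admissible configurations means higher entropy."""
    total_pairs = n_channels * (n_channels - 1) // 2   # every possible pairing
    n_configurations = comb(total_pairs, n_connected_pairs)
    return log2(n_configurations)

# Illustrative numbers only: 9 channels showing 12 synchronised pairs
print(configuration_entropy(9, 12))   # about 30 bits with these toy numbers
```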

That makes me wrinkle my forehead a bit because it doesn’t fit well with my layman’s grasp of the concept of entropy. In my mind entropy is associated with low levels of available energy and an absence of large complex structure. Entropy always increases, but can decrease locally, as in the case of the complex structures of life, by paying for the decrease with a bigger increase elsewhere, typically by using up available energy. On this view, conscious states – and large numbers of possible configurations – look like they ought to be low entropy; but evidently the reverse is actually the case. The researchers also used the Lempel-Ziv measure of complexity, one with strong links to information content, which is clearly an interesting angle in itself.
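For a feel of what the Lempel-Ziv measure is getting at, here is a toy version of my own – a simplified dictionary-parsing variant, not necessarily the exact algorithm the paper uses: a binarised signal is scanned for distinct phrases, and the more new phrases keep turning up, the less compressible and the more 'complex' the signal.

```python
import random

def lz_complexity(bits: str) -> int:
    """Simplified Lempel-Ziv parsing: scan left to right and count the
    distinct phrases encountered. A repetitive (compressible) signal yields
    few phrases; an irregular one yields many."""
    seen, phrase, count = set(), "", 0
    for b in bits:
        phrase += b
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)   # count any trailing partial phrase

random.seed(0)
random_bits = "".join(random.choice("01") for _ in range(200))
print(lz_complexity("0" * 200))     # completely uniform ('lockstep'): low score
print(lz_complexity(random_bits))   # irregular: noticeably higher score
```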

Of the nine subjects, three were epileptic, which allowed seizure states to be compared with waking and sleeping states. Interestingly, REM sleep showed relatively high entropy levels, which intuitively squares with the idea that dreaming resembles waking a little more than fully unconscious states do – though I think the equation of REM sleep with dreaming is no longer thought to be as perfect as it once seemed.

One acknowledged weakness of the research is that it was not possible to establish actual connection, so the assumed networks were based on synchronisation instead. However, synchronisation can arise without direct connection, and an absence of synchronisation is not necessarily proof of an absence of connection.
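A toy illustration of the point, entirely my own and not the paper's method: if a pair of channels is flagged as 'connected' whenever some measure of synchronisation (here, plain correlation) is high, then two channels driven by a common hidden source look connected even though no direct link between them exists.

```python
import random

def pearson(x, y):
    """Plain Pearson correlation, used here as a crude stand-in for
    'synchronisation' between two recorded channels."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(1)
source = [random.gauss(0, 1) for _ in range(1000)]       # hidden common driver
chan_a = [s + random.gauss(0, 0.5) for s in source]      # one channel driven by it
chan_b = [s + random.gauss(0, 0.5) for s in source]      # another, with no direct link to chan_a
print(round(pearson(chan_a, chan_b), 2))  # high correlation despite no direct connection
```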

Still, overall the results look good and the picture painted is intuitively plausible. Putting all talk of entropy and Lempel-Ziv aside, what we’re really saying is that conscious states fall in the middle of a notional spectrum: at one end of this spectrum is chaos, with neurons firing randomly; at the other we have them all firing simultaneously in indissoluble lockstep.
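Put back in the configuration-counting terms sketched earlier, the middle-of-the-spectrum intuition has a simple arithmetic core (toy numbers again, not the paper's): with no synchronised pairs at all, or with every pair locked together, only one configuration is possible, while intermediate levels of coupling allow enormously more.

```python
from math import comb

total_pairs = 36   # e.g. 9 channels -> 36 possible pairings (illustrative)
for connected in (0, 9, 18, 27, 36):
    print(connected, comb(total_pairs, connected))
# No coupling at all (independent, random-looking firing) and total coupling
# (lockstep) each admit exactly one configuration; intermediate levels of
# coupling admit millions to billions.
```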

There is an obvious resemblance here to Integrated Information Theory (IIT), which holds that the level of consciousness corresponds to the quantity of integrated information, known as Phi. In fact, the authors of the current paper explicitly situate their work within the context of earlier research suggesting that a general principle of natural phenomena is the maximisation of information transfer. The read-across from the new results into information-processing terms is quite clear. The authors do acknowledge IIT, but only barely; they may be understandably worried that their new work could end up being interpreted as mere corroboration of IIT.

My main worry about both is that they are very likely true, but may not be particularly enlightening. As a rough analogy, we might discover that the running of an internal combustion engine correlates strongly with a raised internal temperature. The presence or absence of that raised temperature proves to be a pretty good practical guide to whether the engine is running, and we’re tempted to conclude that raised temperature is the same thing as running. Actually, though, raising the temperature artificially does not make the engine run, and there is in fact a complex story about running in which raised temperature is not really central. So it might be that high entropy is characteristic of conscious states without that telling us anything useful about how those states really work.

But I evidently don’t really get entropy, so I might easily be missing the true significance of all this.