John Stewart has written an interesting paper on the future of consciousness – where will evolution take it next? His discussion is strongly rooted in the Global Workspace theory, but even if you don’t subscribe to that school of thought it makes good sense. The prime function of consciousness, he says, is to give us new and ultimately better adaptive responses. In his view, it is the Global Workspace that allows the human mind to bring together resources which were not previously linked, but which can be used together in new and effective ways. This process becomes especially powerful when, in human beings, it gives rise to declarative knowledge – explicit, conscious thought. It’s this ability that makes human beings able to frame complex and long-term plans, a huge benefit so far as survival is concerned.
But the use of declarative reasoning, thus far, is limited: in particular, while humans use it to find means of achieving their goals, they don’t make very effective use of it in determining their goals in the first place. Moreover, where they do think explicitly about their long-term objectives, they are very vulnerable to interference from hedonic impulses. In simpler language, the goals which you have chosen for good reasons of principle tend to be undermined by the pursuit of short-term pleasures thrown up by the more primitive parts of the mind. Passion ends up dominating reason.
This has a very familiar ring to it: for thousands of years philosophers and religious leaders have been enjoining us to raise our minds above ephemeral pleasures and seek more lofty goals. Stewart is happy to make the connection with religious thought, and suggests that certain religious and contemplative traditions have developed techniques which we could use to help us develop our consciousness to a new level.
Interesting stuff, but there are a few points which seem dubious. In the first place, what he seems to be talking about is self-improvement rather than evolution: nothing we learn during our lifetime is transmitted with our genes. I suppose it might be the case that if enlightened thinking became a skill we all had to learn for our own good, those who had genes which especially fitted them for it might acquire an advantage and so the gene pool might move in that direction.
Second, does the kind of detached thinking he mentions really have survival value? I don’t think Buddhist monks take up meditation in order to give their genes a better chance, and in fact some religious traditions have definite leanings towards celibacy and the sacrifice of one’s own reproductive prospects. Stewart seems to think that if we choose our goals rationally, we will choose ones that best serve our survival: but there’s no purely rational reason to choose one goal rather than another – that’s why we have built-in drives to begin with. It seems quite likely to me that a meditative, rational human race, operating on an unemotional level, might easily decide not to reproduce – or perhaps even not to eat. It may be that we need to stay stupid enough for our own good.
The question of where consciousness might go is an interesting one, though. What mystery gift might Mother Nature have in store for us? There is of course some debate about whether human beings have escaped from evolutionary pressures for the foreseeable future, either by controlling their own environment, or perhaps just by moving around so much that there are none of the isolated populations which seem to play an important part in speciation.
Since it’s nearly Christmas, I propose to address the issue on an entirely frivolous level and present a Christmas wish-list of my top ten favourite improvements to consciousness…
Coming in at number 10 is Distributed Processing. Why can’t we bud off a separate train of thought to deal with some subsidiary problem while we go on thinking about something else? We could have a whole tapestry of different threads separating and re-converging.
At number 9 I’d like an Indexed Memory. I don’t need anything complex here – just a date-based system will do, so that I can remember what happened at any specified time and date.
Number 8 is a Vivid Memory. I’d like to be able to run personal flashbacks the way people in films do, where you relive the events being recalled in realistic detail.
At 7, I want Control of My Dreams. Not that I have terrible nightmares, you understand – I rarely recall any dreams at all. Some people apparently do have some control of some of their dreams, but I’d like to be able to script them completely. Purely for research purposes, of course. Well, partly for research purposes.
Number 6 is the ability to enter Altered States of Consciousness at will, without having to do any of that starving, meditating, or consuming of ayahuasca. If this is too much, I’d settle for a degree of control over my own endorphins.
At 5, I’ve got (or rather, I haven’t yet got) Control of the Attention; the ability to concentrate on things, or indeed, ignore them.
Number 4 is Automation. There are many things I have to do laboriously in the forefront of my consciousness – calculations, writing routine letters, that sort of thing – which I wish I could delegate to unconscious functions which I’m sure are perfectly up to the job.
At 3, I want, ahem, Control of My Emotions. I don’t actually want to be Mr Spock, but when someone treads on my foot I should like to be able to stop being annoyed about it immediately instead of an hour and a half later.
Number 2 is Control of Unconscious Functions. I’d like to be able to summon up placebo effects without being put through a confidence trick first; to stop the body’s exaggerated and unhelpful response to burns and some other injuries; and to rev my own heart up a bit when I know it’s going to be needed but my unconscious mind doesn’t.
At number 1 (since this is an intellectually oriented site) is Transparency, or Being Able To See How It Works. You cannot observe yourself in the process of composing your own thoughts, no matter how quickly you look over your shoulder. But wouldn’t it be good if you could?