Besides being the author of thoughtful comments here – and sophisticated novels, including the great fantasy series The Second Apocalypse – Scott Bakker has developed a theory which may dispel important parts of the mystery surrounding consciousness.
This is the Blind Brain Theory (BBT). Very briefly, the theory rests on the observation that from the torrent of information processed by the brain, only a meagre trickle makes it through to consciousness; and crucially that includes information about the processing itself. We have virtually no idea of the massive and complex processes churning away in all the unconscious functions that really make things work and the result is that consciousness is not at all what it seems to be. In fact we must draw the interesting distinction between what consciousness is and what it seems to be.
There are of course some problems about measuring the information content of consciousness, and I think it remains quite open whether in the final analysis information is what it’s all about. There’s no doubt the mind imports information, transforms it, and emits it; but whether information processing is of the essence so far as consciousness is concerned is still not completely clear. Computers input and output electricity, after all, but if you tried to work out their essential nature by concentrating on the electrical angle you would be in trouble. But let’s put that aside.
You might also at first blush want to argue that consciousness must be what it seems to be, or at any rate that the contents of consciousness must be what they seem to be: but that is really another argument. Whether or not certain kinds of conscious experience are inherently infallible (if it feels like a pain it is a pain), it’s certainly true that consciousness may appear more comprehensive and truthful than it is.
There are in fact reasons to suspect that this is actually the case, and Scott mentions three in particular: the contingent and relatively short evolutionary history of consciousness, the complexity of the operations involved, and the fact that it is so closely bound to unconscious functions. None of these proves that consciousness must be systematically unreliable, of course. We might be inclined to point out that if consciousness has got us this far it can’t be as wrong as all that. A general has only certain information about his army – he does not know the sizes of the boots worn by each of his cuirassiers, for example – but that’s no disadvantage: by limiting his information to a good enough set of strategic data he is enabled to do a good job, and perhaps that’s what consciousness is like.
But we also need to take account of the recursively self-referential nature of consciousness. Scott takes the view (others have taken a similar line) that consciousness is the product of a special kind of recursion which allows the brain to take into account its own operations and contents as well as the external world. Instead of simply providing an output action for a given stimulus, it can throw its own responses into the mix and generate output actions which are more complex, more detached, and, in terms of survival, more effective. Ultimately only recursively integrated information reaches consciousness.
The limits to that information are expressed as information horizons, strangely invisible boundaries: like the edge of the visual field, the contents of conscious awareness have asymptotic limits – borders with only one side. The information always appears to be complete even though it may be radically impoverished in fact. This has various consequences, one of which is that because we can’t see the gaps, the various sensory domains appear spuriously united.
This is interesting, but I have some worries about it. The edge of the visual field is certainly phenomenologically interesting, but introspectively I don’t think the same kind of limit comes up with the other senses. Vision is a special case: it has an orderly array of positions built in, so at some point the field has to stop arbitrarily; with sound the fading of farther sounds corresponds to distance in a way which seems merely natural; with smell position hardly comes into it; and with touch the built-in physical limits mean the issue of an information horizon doesn’t seem to arise. For consciousness itself, spatial position seems to me at least to be irrelevant or inapplicable, so that the idea of a boundary doesn’t make sense. It’s not that I can’t see the boundary or that my consciousness seems illimitable; more that the concept is radically inapplicable, perhaps even as a metaphor. Scott would probably say that’s exactly how it is bound to seem…
There are several consequences of our being marooned in an encapsulated informatic island whose impoverishment is invisible to us: I mentioned unity, and the powerful senses of a ‘now’ and of personal identity are other examples which Scott covers in more detail. It’s clear that a sense of agency and will could also be derived on this basis and the proposition that it is our built-in limitations that give rise to these powerfully persuasive but fundamentally illusory impressions makes a good deal of sense.
More worryingly, Scott proceeds to suggest that logic and even intentionality – aboutness – are affected by a similar kind of magic that turns out to be mere conjuring. Again, processes we have no direct access to produce results which consciousness complacently but quite wrongly attributes to itself, and it is thereby deluded as to their reliability. It’s not exactly that they don’t work (we could again make the argument that we don’t seem to be dead yet, so something must be working); more that our understanding of how or why they work is systematically flawed, and in fact, as we conceive of them, they are properly just illusions.
Most of us will, I think, want to stop the bus and get off at this point. What about logic, to begin with? Well, there’s logic and logic. There is indeed the unconscious kind we use to solve certain problems, and that certainly is flawed and fallible; we know many examples where ordinary reasoning typically goes wrong in peculiar ways. But then there’s formal explicit logic, which we learn laboriously, which we use to validate or invalidate the other kind, and which surely happens in consciousness (if it doesn’t, then really I don’t think anything does and the whole matter descends into complete obscurity). It’s hard not to feel that we can see and understand how that works too clearly for it to be a misty illusion of competence.
What about intentionality? Well, for one thing, to dispel intentionality is to cut off the branch on which you’re sitting: if there’s no intentionality then nothing is about anything, and your theory has no meaning. There are some limits to how radically sceptical we can be. Less fundamentally, intentionality doesn’t seem to me to fit the pattern either; it’s true that in everyday use we take it for granted, but once we do start to examine it the mystery is all too apparent. According to the theory it should look as if it made sense; on the contrary, the fact that it is mysterious and that we have no idea how it works is all too clear once we actually consider it. It’s as though the BBT is answering the wrong question here: it wants to explain why intentionality looks natural while actually being esoteric, when what we really want to know is how the hell that esoteric stuff can possibly work.
There’s some subtle and surprising argumentation going on here and throughout, which I cannot do proper justice to in a brief sketch, and I must admit there are parts of the case I may not yet have grasped correctly – no doubt through density (mine, not the exposition’s), but also, I think, because some of the later conclusions are so severely uncongenial. Even if meaning isn’t what I take it to be, I think my faulty version is going to have to do until something better comes along.
(BTW, the picture is supposed to be Thomas Aquinas, who introduced the concept of intentionality. The glasses are supposed to imply he’s blind, but somehow he’s just come out looking like a sort of cool monk dude. Sorry about that.)