Existential Comics raises an interesting question (thanks to Micha for pointing it out). In the strip, a doctor with a machine that measures consciousness (rather like Tononi’s new machine, except that that one measures awareness) tells an unlucky patient he lacks the consciousness-producing part of the brain altogether. Consequently, the doctor says, he is legally allowed to harvest the patient’s organs.

Would that be right?

We can take it that what the patient lacks is consciousness in the ‘Hard Problem’ sense. He can talk and behave quite normally, it’s just that when he experiences things there isn’t ‘something it is like’; there’s no real phenomenal experience. In fact, he is a philosophical zombie, and for the sake of clarity I take him to be a strict zombie; one of the kind who are absolutely like their normal human equivalent in every important detail except for lacking qualia (the cartoon sort of suggests otherwise, since it implies an actual part of the brain is missing, but I’m going to ignore that).

Would lack of qualia mean you also lacked human rights and could be treated like an animal, or worse? It seems to me that while lack of qualia might affect your standing as a moral object (because it would bear on whether you could suffer, for example), it wouldn’t stop you being a full-fledged moral subject (you would still have agency). I think I would consequently draw a distinction between the legal and the moral answer. Legally, I can’t see any reason why the absence of qualia would make any difference. Legal personhood, rights and duties are all about actions and behaviour, which takes us squarely into the realm of the Easy Problem. Our zombie friend is just like us in these respects; there’s no reason why he can’t enter into contracts, suffer punishments, or take on responsibilities. The law is a public matter; it is forensic in the literal sense, dealing with what is done in the public forum; and it follows that it has nothing to say about the incorrigibly private matter of qualia.

Of course the doctor’s machine changes all that and makes qualia potentially a public matter (which is one reason why we might think the machine is inherently absurd, since public qualia are almost a contradiction in terms). It could be that the doctor is appealing to some new, recently-agreed legislation which explicitly takes account of his equipment and its powers. If so, such legislation would presumably have to have been based on moral arguments, so whichever way we look at it, it is to the moral discussion that we must turn.

This is a good deal more complicated. Why would we suppose that phenomenal experience has moral significance? There is a general difficulty because the zombie has experiences too. In conditions when a normal human would feel fear, he trembles and turns pale; he smiles and relaxes under the influence of pleasure; he registers everything that we all register. He writes love poetry and tells us convincingly about his feelings and tastes. It’s just that, on the inside, everything is hollow and void. But because real phenomenal experience always goes along with zombie-style experience, it’s hard for us to find any evidence as to why one matters when the other doesn’t.

The question also depends critically on what ethical theories we adopt. We might well take the view that our existing moral framework is definitive, authorised by God or tradition, and therefore if it says nothing about qualia, we should take no account of them either. No new laws are necessary, and there can be no moral reason to allow the harvesting of organs.

In this respect I believe it is the case that medieval legislatures typically saw themselves not as making new laws, but as rediscovering the full version of old ones, or following out the implications of existing laws for new circumstances. So when the English parliamentarians wanted to argue against Charles I’s levy of Ship Money, rather than rest their case on inequity, fiscal distortion, or political impropriety, they appealed to a dusty charter of Ine, ancient ruler of Wessex (regrettably they referred to Queen Ine, whereas he had in fact been a robustly virile King).

Even within a traditional moral framework, therefore, we might find some room to argue that new circumstances called for some clarification; but I think we would find it hard going to argue for the harvesting.

What if we were utilitarians, those people who say that morality is acting to bring about the greatest happiness of the greatest number? Here we have a very different problem because the utilitarians are more than happy to harvest your organs anyway if by doing so they can save more than one person, no matter whether you have qualia or not. This unattractive kind of behaviour is why most people who espouse a broadly utilitarian framework build in some qualifications (they might say that while organ harvesting is good in principle actual human aversion to it would mean that in practice it did not conduce to happiness overall, for example).

The interesting point is whether zombie happiness counts towards the utilitarian calculation. Some might take the view that without qualia it had no real value, so that the zombie’s happiness figure should be taken as zero. Unfortunately there is no obvious answer here; it just depends what kind of happiness you think is important. In fact some consequentialists take the utilitarian system but plug into it desiderata other than happiness anyway. It can be argued that old-fashioned happiness utilitarianism would lead to us all sitting in boxes that directly stimulated our pleasure centres, so something more abstract seems to be needed; some even just speak of ‘utility’ without making it any more specific.

No clear answer then, but it looks as if qualia might at least be relevant to a utilitarian.

What about the Kantians? Kant, to simplify somewhat, thought we should act in accordance with the kind of moral rules we would want everyone else to adopt. So we should be right to harvest the organs only so long as we were content that, if we ourselves turned out to be zombies, the same thing would happen to us. Now I can imagine that some people might attach such value to qualia that they could convince themselves they should agree to this proposition; but in general the answer is surely negative. We know that zombies behave exactly like ordinary people, so for the most part they would not agree to having their organs harvested; and so we can say with confidence that if I were a zombie I should still tell the doctor to desist.

I think that’s about as far as I can reasonably take the moral survey within the scope of a blog post. At the end of the day, are qualia morally relevant? People certainly talk as if they are in some way fundamental to value. “Qualia are what make my life worth living” they say: unfortunately we know that zombies would say exactly the same.

I think most people, deliberately or otherwise, will simply not draw a distinction between real phenomenal experience on one hand and the objective experience of the kind a zombie can have on the other. Our view of the case will in fact be determined by what we think about people with and without feelings of both kinds, rather than people with and without qualia specifically. If so, qualia sceptics may find that grist to their mill.

Micha has made some interesting comments which I hope he won’t mind me reproducing.

The question of deontology vs consequentialism might be involved. A deontologist has less reason — although still some — to care about the content of the victim’s mind. Animals are also objects of morality; so the whole question may be quantitative, not qualitative.

Subjects like ethics aren’t easy for me to discuss philosophically with someone of another faith. Orthodox Judaism, like traditional Islam, is a legally structured religion. Ethics are therefore not discussed in the same language as in western society, since the way the legal system processes revelation shapes the conclusions.

In this case, it seems relevant that the Talmud says that someone who kills adnei-hasadeh (literally: men of the field) is as guilty of murder as someone who kills a human being. It’s unclear what the Talmud is referring to: it may be a Roman mythical being which is like a human but has an umbilicus that grows down into roots in the earth, or perhaps an orangutan (from the Malay for “man of the jungle”), or some other ape. Whatever it is, only actual human beings are presumed to have free will. And yet killing one qualifies as murder, not the killing of an animal.


  1. David Duffy says:

    I probably shouldn’t comment on this, since the idea of p-zombies seems completely incoherent to me. One close natural model might be sleepwalking – I don’t think we will be harvesting organs from such people, even if we know they aren’t going to wake up. As you pointed out, some consequentialists are fine with maximizing non-human-animal happiness, where it is pretty easy to discount qualia completely.

    PS It always seemed to me that we actually know quite a bit about qualia scientifically, even though philosophers’ definitions are deliberately constructed to exclude things like intensity (the Weber–Fechner law). I enjoyed the slides from Tsuchiya and Kanai, where they point out that Japanese speakers lack separate qualia for the sounds of L and R. Fortunately, like Mary…

  2. Hunt says:

    I have to start with the same caveat as David Duffy. I’m convinced of the possibility of “objective” philosophical zombies, zombies that would behave, by elaborate simulation, precisely as a person experiencing qualia would; but not of a “subjective” zombie, a “person” who would not know her own status and could not report it even if she chose to. So the Tononi machine would amount to a kind of Voight-Kampff machine able to suss out replicants (zombies), but it would only be necessary because of the zombie’s deceitfulness. (Note that one of the central points of Blade Runner was the question of whether replicants actually were zombies, so the comparison here is not exact.)

    Once identified, a zombie would probably best be considered property, to be disposed of (“retired”, in Blade Runner lingo) according to property law. Maybe I’m being a little hardhearted.

  3. calvin says:

    Doesn’t this beg the question of what awareness is? Qualia, as you imply here, are not objective things, but qualia are experienced by persons. So what does that mean? Even if qualia are private experiences, qualia are still experiences, and by that we mean a person is aware of them. Awareness is the fundamental problem: how can you be aware of anything? Awareness itself is the extra bit. Qualia are a kind of content of awareness.

    Colors, words, sounds, ideas, numbers, shapes, smells are all qualia… but so are atoms and wavelengths and pressure and molecules. A zombie or machine may respond to a color or shape, but does it “see”? No, the response is a program or a mechanism. The only reason machines can respond to colors or shapes or sounds is that human beings made, or programmed, them that way. It is the awareness and representations of human beings that make machines work at all.

    Zombies are thought problems because we don’t see any real zombies that are not machines. Either way, machines and zombies are not aware. That is the joke of the comic: the man is aware, the machine is not, and it is the doctor who is acting as the zombie.

  4. Philosopher Eric says:

    Hello Peter,
    I’m very excited to finally comment on one of your articles. In the past I’ve attempted to explore philosophical questions in a solitary fashion, so that the various (presumed) biases found in standard work would not infect mine. Now that my own ideas have solidified, however, I come to you “intentionally uneducated.” I do thank you for your teachings, and in payment of this debt I also hope to give your readers interesting new insights.
    My “zombie” comment follows:

    I implore you, Peter, never to ask the question “Does (whatever) have rights?”; that makes you seem no greater than, for example, the founding fathers of America. In order not to limit yourself in this manner, I’d have you consider “rights” as something that may or may not be given. Thus the right to “vote,” or “speak,” or “kill,” becomes an arbitrary issue rather than something “inherent.” I would have you instead ask “Should this zombie be given any rights?”

    An ideology would be needed in order to answer this question intelligently (given the “should” part), as well as a subject: exactly whose good are we trying to promote here? Many ideologies (such as my own) imply that existence is no more relevant to this zombie than it is for a block of wood, so the zombie would not be a valid subject. But if “social good” is what we are trying to promote, and harvesting these organs would actually harm society (through society’s empathy, perhaps?), then it might be beneficial for society to give this zombie a few rights. (As you may now have guessed, I am indeed a Utilitarian.)

  5. Callan S. says:

    “lack of qualia might affect your standing as a moral object (because it would bear on whether you could suffer, for example)”

    So animals can’t suffer?

    Or animals are conscious or have qualia? But the guy talking and writing poems doesn’t have those?

  6. Notebook » Blog Archive » The Rights of Zombies says:

    […] article by Peter at Conscious Entities considers the question of whether philosophical zombies have rights. It seems to me that under most […]

  7. A spot of consciousness says:

    Existential comics are sweet!

  11. john davey says:

    A human without a mind is classed as brain dead. There are human zombies – serious injury victims in persistent vegetative states. Various jurisdictions allow such people to be “let die”, and few insist they be kept alive forever. The other type of zombie is a foetus. Foetuses have so few rights that they can even be killed in most jurisdictions.

    If you are suggesting that there can be minds that form intentional states sufficient to enter contracts, but which somehow aren’t conscious, you might as well start talking about what the world would be like if all the trees were made of chocolate, or imagine a planet made of bubblegum. Consciousness is the precursor of agency: no consciousness, no agency. All you have is a machine, to which moral considerations just don’t apply. Does an observer-relative functional object like a cup of tea warrant a moral thought? No, and neither does a robot (which is what the zombie question is usually about).
