Interesting exchange about Eric Schwitzgebel’s view that we have special obligations to robots…


  1. Sci says:

    It seems odd to me he connects our obligation to both what gods owe their creations and to what parents owe their children. Both of these imply isolated agents whose moral obligation comes from their choosing to create life.

    As such, I’m confused as to what I personally would owe to sentient robots, beyond what is owed to my fellow humans, if someone else created them. If anything, we owe robots less IMO, to the point where the only guarantee after whatever “childhood” we grant to machines would be preservation of physical form, without any obligation to “feed” it via tax dollars.

    Better to expand care to humans & animals before we waste tax dollars on sentient machines that need not be created at all?

  2. Callan S. says:

    Of course if you were such a sentient machine, Sci, the whole privilege argument might not sound so convincing!

    I’m not sure why the notion of ‘morality’ seems to be taken as a thing where X is treated this way and Y is treated that way, as if fundamental rules of the universe say it is so, as opposed to just imagining yourself in the other person’s shoes? Or the other thing’s shoes?

    Even if the universe somehow says you have to treat them in X way, imagine being treated in X way yourself, perhaps?

  3. Callan S. says:

    Peter, I seem to have a comment locked in moderation – could you look into it, please? 🙂

    [Sorry about that – I’ve approved it. No idea what the problem was. Peter]

  4. Sci says:

    Well I didn’t say we owe robots nothing.

    We just don’t owe sentient robots more – they aren’t my children or my creation. A robot sitting in a warehouse waiting for someone to power it up is better off than the homeless people who will freeze on the street.

    In a post-scarcity society, of course, things could easily change, but I’d rather also have a policy that disincentivizes the creation of sentient machines.
