
2/26/11

Angels or Slugs

Tim Mulgan's book Future People  is full of tricky, ingenious, subtle ideas.  It's mainly an attempt to hammer out the ethical rules that govern creating new people.  Here's the rule he winds up proposing  (p. 170)--
The Flexible Lexical Rule. Reproduce if and only if you want to, so long as you are reasonably sure that your child will enjoy a life above the lexical level, and very sure that the risk of your child falling below the zero level is  very small. 
The zero level is the level at which a life is not worth living.  The lexical level is...well, that's what I want to talk about.  "Lexical" connotes "like letters."  All of the As come before any of the Bs.  In our value systems some things take lexical priority over others.  Suppose you are an ardent Apple fan.  You are offered both Apple products and Dell products.  In every case, you choose the Apple product.  You would rather have one Apple product than any number of Dell products.  But when they run out of Apple products (there aren't that many), you're happy to accept Dell products.  They do have some value to you. Apple stuff has lexical priority (for you) over Dell stuff.
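
(If it helps to see the comparison rule spelled out, here is a minimal sketch in Python--not from Mulgan or the book, just an illustration.  It treats a bundle as an (apples, dells) pair: Apple products decide the comparison first, and Dell products only matter as a tie-breaker, which is exactly the "one Apple beats any number of Dells, but Dells still count for something" idea.)

    # A minimal sketch of lexical (lexicographic) priority as an ordering.
    # A "bundle" is just (apples, dells); Apple products decide first,
    # and Dell products only matter as a tie-breaker.

    def prefers(bundle_a, bundle_b):
        """True if bundle_a is strictly preferred to bundle_b under lexical priority."""
        apples_a, dells_a = bundle_a
        apples_b, dells_b = bundle_b
        if apples_a != apples_b:
            return apples_a > apples_b   # more Apple products always wins...
        return dells_a > dells_b         # ...Dells only count when Apples are tied

    # One Apple product beats any number of Dell products:
    assert prefers((1, 0), (0, 1_000_000))
    # But Dell products still have some value once the Apples run out:
    assert prefers((1, 5), (1, 0))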

Do some lives have lexical priority over others?  Mulgan thinks so.  Suppose God has already created a world with both angels and slugs.  (There's already variety in this world, so we don't have to worry about that as a desideratum.)  Should God create all the angels he can, before any more slugs?  Or is there some kind of an exchange rate?  Is creating one angel the same as creating a million slugs?  It's plausible to think that angels are lexically prior to slugs.  God should exhaust his angel-making powers before making any slugs, but he should go on to make some slugs. (Don't worry about omnipotence--the God talk here is obviously not to be taken seriously.)

Take a more real-worldish application of the idea: Tom Regan's famous case of four adults and a dog on a sinking lifeboat.  They all have inherent value and rights, he thinks.  In ordinary circumstances, the adults can't eat the dog.  But in this unusual situation, where all will die if nothing is done, he says the dog should be thrown overboard.  The dog has less to lose, in the way of valuable future experiences.  Dog-years just aren't as rich in value as people-years (assuming they're all normal, and other things are equal).

In fact, Regan says it would be better to throw a million dogs overboard, rather than one person.  (Why would we need to throw that many?  Presumably, they're teeny tiny dogs!)  Quick way to capture how Regan values people vs. dogs: he thinks in the lifeboat type situation, people take lexical priority over dogs.  We should save all the (normal) people before any of the (normal) dogs, but then we should go on to save dogs (if we can).

Mulgan, then, is saying that some people's lives take lexical priority. The lives above the lexical line are different--they're autonomous lives; the ones below are non-autonomous.  Don't worry. That doesn't mean he's for using non-autonomous people as slaves or food.  The way Mulgan uses the concept is not drastic.  Before creating a child, we ought to be reasonably sure he or she will live a life above the lexical line. Failing that, we can create children below the lexical line, but above the zero level.

Should we buy into the lexical priority of autonomous people?   How would the line affect the way we treat people once they're born?  Would we have to throw less-autonomous people off of lifeboats?  Would a special-ed program have to put extra resources into the highest-achieving kids, the ones with the best chance of getting above the lexical line?  And do less for those way below the line, since they'll never have the super-valuable type of life?

But back to basics.  All the angels before any of the slugs?  All the people (on the lifeboat) before any of the dogs?  Do any lives ever take lexical priority over other lives?  This is the most fundamental question, and the one that keeps rolling around in my head.

When the going gets tough, I say "make a poll."  Assume angels are like people, only even better.  And assume slugs have a tiny bit of mental life--only enough to enjoy a good slither.  So what about creating angels and slugs?  Four possibilities-- 

(1)  God should create all the angels before any of the slugs, though if he can't create any more angels, he should create some slugs (the lexical view).

(2)  It's all the same whether God creates one angel or some vast number of slugs (the exchangeability view).

(3) It's all the same whether God creates one angel or one slug (the egalitarian view).

(4)  God should create angels only, because slugs have no value (the elitist view).

The poll is in the right column.

5 comments:

  1. I feel Parfit lurking..lurking..

    I bounce between 1 and 4; I reject 2 and 3.

    2 strikes me as the repugnant conclusion; 3 seems like nonsense.

    I struggle with 4 for this reason: it already contains an assertion of "no value," but I don't think you mean it this way.

    You've already noted that "slugs have just enough mental life to enjoy a good slither." So what 4 REALLY means is "because having just enough mental life to enjoy a good slither is insufficient value to warrant being created even though there are no other constraints."

    Since no harm is specified, it seems to me "why not create slugs if there is no corresponding harm?" On the other hand, I'm enough of an elitist that creating Angels seems like the only activity of interest.

  2. Yes, one of Mulgan's arguments for (1) is that (2) leads to the repugnant conclusion.

    Whoops, I shouldn't have called it a "good" slither. I just meant that colloquially. (4) says a mere "good" slither isn't really good--has no real value!

    Yes, (3) sounds really nutty to me...but a lot of animal rights people seem to seriously believe it. They're appalled by Regan's verdict on the lifeboat case.

    Well, leaving out the "good" slither, the crucial bit is they ARE enjoying the slither, yes? It's the "minimal enjoyment" placeholder just above zero, yes? So it's between a world with JUST enjoyment level 10 vs. a world with enjoyment level 10 + a bunch of enjoyment level 1s.

    Out of curiosity, since you know the landscape better than I: are the #3 type people ALSO the people who tend to think "the world" would be better off if all people just got wiped out so "mother nature" could just live on "in peace"? Or is that going too far?

  4. The above was for Charlie.

    Faust, we are not to assume that the only valuable thing is pleasure. The angels have a lot of that going for them, but much else besides. They are highly autonomous, super-smart, etc. You can take that into account to any extent you like (or not at all).

    The #3 people--I don't think they necessarily want to see humans become extinct, but I would think some of them do. They might say we humans are wiping out too many other creatures who are no better than ourselves.

    Well, if we already have some slugs, and it's a choice between more angels, or angels and slugs, then it seems like 1 would be the optimum answer... 2 and 3 are pretty easily rejected; like you said, 2 leads to the repugnant conclusion, and 3 just isn't true.

    I think we can easily reject 4 for the same reason we reject 3: it just isn't true. Just because slugs have less value doesn't mean they have no value. Their value comes, at the very least, from their instrumental value to the angels as novelty.

