12 in 12

Great things I discovered in 2012 (possibly after everyone else)--
  1. The state of California.
  2. Trees (the fall issue of The Philosophers' Magazine contains my paean to trees).
  3. Björk--esp. Homogenic and Biophilia. Can't get enough of Jóga and Cosmogony.
  4. Joanna Newsom--daughter turned me into a total fan in the past 24 hours!  Try The Book of Right On.
  5. The problem of personal identity. It used to be one of my least favorite philosophy problems, but no more.
  6. Philosophy books: Eric Olson's books on animalism (The Human Animal and What Are We?). Also Zoopolis by Will Kymlicka and Sue Donaldson.
  7. Magnolia, the Paul Thomas Anderson movie. Favorite 2012 movie:  Argo.
  8. Magnolia soundtrack, featuring songs by Aimee Mann (here she is singing Save Me at the White House).
  9. Melancholia--wonderfully strange Lars von Trier movie.
  10. Fort Davis-Marfa area of West Texas. Wonderfully far from everything else and great for nature, history, art, astronomy.
  11. Twitter--when news is breaking, there's no better way to find out how people are reacting (e.g. during the presidential debates).  Then again, distraction, distraction, distraction.  May have to pull the plug in 2013.
  12. Barack Obama. During the campaign the case he made for himself turned me into a bigger fan than I'd been before.  Runner up: Hillary Clinton.  The champagne is already in the fridge for 2016. 
And one thing I'm really tired of--
  1. Watching the brawl over feminism in the atheist community. The tendency to brawl is the problem, not feminism. Few seem to see that.  I hope to pay much less attention in 2013.
Happy New Year!


West Texas

You must, if you live in Texas, visit the Fort Davis area.  Who knew there were so many riches a mere (ahem) 10 hours' drive from Dallas?  Most of the drive is either boring or hideous, but there's something pretty neat about seeing bumps, and then hills, and then high mountains rise out of the plain. By the time you get to the far west end of the state, the landscape has become truly beautiful, in a stark sort of a way (that I adore). You're also hundreds of miles from the nearest Starbucks, Barnes and Noble, or Panera--how refreshing!  You'll be at the mercy of the locals for food, coffee, and books, but you won't be deprived.

Night One--attend a star party at the McDonald Observatory. Do make reservations ahead of time, and try to schedule your trip when there's no full moon.  We went the day after Christmas, which made the crowd pretty reasonable ("just" 400 people, compared to 1000 a few nights later!).  Even with the full moon, we had some amazing views: Jupiter plus four moons, the Orion Nebula, and (of course) the moon in all its glory. Plus assorted star clusters. There's a cafe on the premises where you can have dinner and warm up with hot chocolate.

We stayed in tiny Fort Davis at the oldest hotel in West Texas (according to the proprietor)--a very charming B&B called The Veranda.  Another option would be staying in Davis Mountains State Park a few miles from town. At least from the outside, Indian Lodge looks delightful.

Day One--The Fort itself has been restored to its former glory, so you can see how officers lived at the end of the 19th century. You can also see the hospital, complete with instruments and explanations of medical procedures.  Behind the fort there are trails up to a bluff with splendid views. Then drive back for the afternoon tour at the observatory--you'll get to see the biggest telescopes up close and learn about the research going on there (exoplanets...dark energy...wow).

Day Two--Marfa, 20 miles away, is another dust mote of a town, but (believe it or not) it's a major art mecca.  Make reservations ahead of time to tour the collection of the Chinati Foundation, which houses major works by Donald Judd, Dan Flavin, and 10 others, in abandoned military buildings.  You can't see the collection without a tour, and never fear--the tour guides maintain a low profile. They mainly just guide you around the premises and give basic information.

The town accommodates the art crowd with an appropriately elegant book store, restaurants, and the smallest NPR radio station in the country. Remember: dust mote.  Marfa ain't Santa Fe. But in a way, it's better.  Talk about totally quirky and unexpected. 

Alpine is the third point on the Fort Davis-Marfa-Alpine triangle. It's worth the trip for the bookstore, museum, and a few restaurants, but mainly because the drive is dramatically beautiful. Another beautiful 20 miles takes you back to Fort Davis. The Bistro is a pretty little restaurant with good food (even a vegetarian special!).  Mexican food is down the road--with chiles rellenos to die for.

Day Three--Now you have to visit the Chihuahuan Desert Nature Center, which has some lovely trails--one into a canyon, and another to a hilltop with a terrific geology exhibit.

Next question--should we open our own little bookstore in Fort Davis, since it doesn't have one? Can we? Can we? The family says No, which means we'll just have to plan another visit. There's actually much more to do--more hikes, drives, more astronomy, more art, and even more in the way of performing arts. I hear Grizzly Bear performed at The Ballroom in Marfa last year. I'm already starting to plot and scheme .... there must be a trip #2.


A Nation of Idiots

Idiot #1 - Wayne LaPierre
I knew yesterday's announcement would be depressing, but underestimated the man. Armed guards. That's the way to prevent another Newtown.  Let the bad guys keep their guns, and give more guns to the good guys. He wants trained volunteers to guard our schools.  Now--I would not reject that idea entirely. If off-duty police officers wanted to guard our neighborhood elementary school, I'd think--"can't hurt!" But anybody can see that's not a complete solution -- not even close.  The kids come out of the school at regular intervals.  Anyone with an appropriate weapon could easily slaughter them as they leave at the end of the day and congregate in front of the school. They come out for recess. They come out to board buses that take them to another school to participate in a talented and gifted program.   Even when kids are inside the school, sitting in classrooms, it's not obvious that an armed guard could stop an intruder. The element of surprise, plus having the right weapons, would give the next killer a huge advantage.  Which is not to say security guards are completely useless--they're just (obviously) not the whole answer. 

Idiot #2 - Charlotte Allen
This completely boggles the mind.  She says the problem at Newtown was the "feminized" environment.  All the employees at the school were female. Not good, she says, because
a feminized setting is a setting in which helpless passivity is the norm. Male aggression can be a good thing, as in protecting the weak — but it has been forced out of the culture of elementary schools and the education schools that train their personnel. Think of what Sandy Hook might have been like if a couple of male teachers who had played high-school football, or even some of the huskier 12-year-old boys, had converged on Lanza.
Passivity?  But several of these teachers bravely confronted the shooter and paid the ultimate price. One of them, Victoria Soto, died by staying in her classroom to misdirect the shooter away from the children.  In the face of this sort of bravery, Ms. Allen thinks 12-year-old boys would have been more effective?  What a grotesque insult to the memory of these courageous teachers.  And besides, it's bullshit.  For one, there actually were two male employees at the school. For another, men probably wouldn't have helped. At Aurora there were lots of men in the movie theater, and yet the shooter killed a dozen people and wounded dozens more. The men mostly did what the female teachers did--they shielded their loved ones and died in the process. At Virginia Tech there were plenty of male students, and they died like the female students. What people like Allen can't admit to themselves is that lethal weapons make everyone powerless, whether male or female.  It's the assault weapons that make the death toll so high in these situations, not the gender of the victims.

Idiot #3 - Mike Huckabee
Every time I turn on the car radio, there's Mike Huckabee, proposing yet another ludicrous way of not saying guns are to blame for the Newtown massacre.  What's to blame is not enough prayer in the schools.  On another day, what's to blame is that parents aren't using enough corporal punishment on their kids. Kids don't have enough respect for their elders.  The funny thing about his reasoning is that he allows himself any speculation on earth, so long as it doesn't involve guns.  When guns come up, suddenly he's a tough-minded critical thinker.  He wants proof, proof, proof!  As far as I know, that proof is easy to find (lots of good facts and links from Massimo Pigliucci, here), but it's never enough. And then he's back to the prayer issue, and respect, and video games, and of course no evidence is needed!

Idiot #4 -- The Liberal Ideologue
This will be an unnamed idiot because I have in mind various and sundry people, not one specific voice.  The liberal ideologue is as adamant as the conservative about what needs to be taken off the table.  We must not talk about mental illness, since we have compassion for the mentally ill and legitimate concerns about their rights.  We must not talk about violent video games, because we care so much about having a free and open entertainment industry.  We must not talk about ...  Well, basically we must not talk about anything but guns.  For the liberal ideologue it's not enough to view weapons as a central part of the problem. It's got to be the whole problem.


In the face of such a horrendous, heart-breaking event--the brutal murder of all those little kids and their teachers--you'd think people would be willing to set aside biases and ideologies.   Hopefully the idiots are on the fringes of the post-Newtown debate, and they're not going to prevail. 


All I want for Christmas is ... life and death

So ... I'm being ridiculed by my family just because three items on my holiday wishlist include the words "life" and "death" --
  • Katherine Boo, Behind the Beautiful Forevers: Life, Death, and Hope in a Mumbai Undercity
  • Jill Lepore, The Mansion of Happiness:  A History of Life and Death
  • Bernd Heinrich, Life Everlasting: The Animal Way of Death
Gloomy, me? Oh come on. I also asked for the book Capital, by John Lanchester, which looks downright fun, not to mention Sweet Tooth by Ian McEwan.

Just to prove I am not overly angst-ridden, I shall not be asking for the "Sisyphus" watch I found at The Nation's online store.  It certainly is tempting though ....


Law and Order (and the vigil)

New York Times
Our political leaders need to reinstitute the assault weapons ban that lapsed in 2004.  This might really be possible, in the aftermath of the Newtown massacre, especially if gun control advocates sell the idea in the right way. When President Clinton signed the assault weapons ban in 1994, he was something new and different--a "law and order" Democrat.  In addition to signing the assault weapons ban, he also took other crime prevention measures, like increasing the size of the police force in major cities. I think Democrats today ought to take note.  We want the assault weapons ban not because we're against personal rights and liberties, but because we're for law and order.  Say what you're for, folks.  Don't stress what you're against, stress what you're for.

But as to rights and liberties--I do think the NRA has gotten away with a whole lot of bullshit about them.  The Second Amendment, unfortunately, is right there after the First Amendment. That makes it easy to suppose that the right to bear arms is something like the right to assemble or worship or speak out: i.e., a virtually absolute right, outweighed by other considerations in very few instances.  But no, the right to bear arms is subject to far more restrictions than rights of conscience. We obviously don't get to arm our "militias" (what militias?!) by keeping rocket-propelled grenade launchers in our backyards or nuclear weapons in our driveways. I can't buy myself a cannon and point it toward my neighbor across the creek (who does have too many loud parties).  Why not? Because my right to have such weapons is trumped by the community's need for "law and order".  (Let's use that phrase as often as possible!  It's really good...)

The assault weapons owner, even if perfectly law-abiding, cannot perfectly control who uses the weapons. Nancy Lanza, who was murdered by her son with her own weapons, put her whole community at risk by having them in her possession. Her right to have these weapons, whether she wanted them for self-defense or recreation, should have been balanced against the security needs of the community.  Not because her rights don't matter at all, but because law and order takes priority.  We Democrats need to make that our rallying cry--LAW AND ORDER!--instead of spending too much time laughing at and decrying gun nuts.


The Newtown vigil last night got me thinking about religion and its alternatives.  I don't fault anyone for approaching grief in any way that "works" for them. This isn't the time to ask whether there are really ... angels, and the like.  If I were suffering the devastation of the Newtown families, I could be believing in angels right now too. Or ... or maybe not.  It seems unfortunate that no one got up at the vigil and spoke words of comfort in a secular key.  I'm just about sure some of the bereaved were unable to connect with all the religious talk, as "inter-faith" as it was. I meet people all the time, and not just in philosophy settings, who don't think about life in those terms.

As an alternative to the Jewish, Methodist, Catholic, Muslim, and Baha'i speakers, what comes to mind is ... what?  An atheist?  A God-denier?  That's not helpful. There's no more reason to ask a God-denier to comfort the bereaved than to ask a Jesus-denier or an angel-denier.  We need people to get up and talk about loss in terms of what they do believe, not in terms of what they don't believe. The naturalistic alternative is ... what?  Not atheism, but (I suppose) secular humanism. Perhaps we non-believers should be doing more to make that positive outlook a part of the national landscape, and thus part of an occasion like this. I say that not in a "we have rights too!" spirit, but only because, in all honesty, I think secular language is needed to meet the needs of some grievers.



Last night the rabbi spoke about light in a time of darkness as people lit hundreds of menorahs in our temple's lovely sanctuary. It's hard to hold onto a feeling of life's basic goodness in the face of yesterday's massacre of completely vulnerable little children and their teachers. I can't think of anything insightful to say except the obvious:  gun control, now.


Losing My Religion

It's Friday, so let's have a song. This, from The Sandy Relief concert on 12-12-12, was amazing.


Must Philosophers Be Parents?

Philosopher Justin E. H. Smith rebels against pressure to be a parent here. He resents the fact that philosophers are always telling him there are things he just can't know, because he's childless.

All the hyperventilation about parenthood can be excessive, but how can it not be the case that parenthood gives people special experiences and insights?   For example, parenthood makes you think about values and priorities. That's because your own agenda may be altered to accommodate children and because, in raising children, you're constantly forced to make up your mind about what will make a child's life go better (or worse). Should you focus on happiness? Achievement? Autonomy? Whatever the child happens to want? Also, parenthood gives you first-hand experience with a certain type of love. Without first hand experience of parenthood, I don't think Harry Frankfurt could have written the book The Reasons of Love (to give just one example).

Now, non-parents have more time for other things.  So they may be able to claim special opportunities for insight and reflection too. Surely being an adventurous traveler gives people unique experiences and insights, as does being a musician, or flying airplanes, or whatever.  And non-parents have more time for such things, as well as for ... philosophy! So there are plenty of advantages to having no children.  But Smith isn't content to defend non-parenthood in that manner. Not only is parenthood not needed for philosophical insight, on his view; it's actually harmful to the philosophical way of life!  Maybe he's only semi-serious, but he does expound on this idea.  Nietzsche, he says, found pictures of domesticity comical--
Thus he tells us what he thinks about the domestic life in The Genealogy of Morality:
Heraclitus, Plato, Descartes, Spinoza, Leibniz, Kant, Schopenhauer– they were not married, and, further, one cannot imagine them as married. A married philosopher belongs to comedy, that is my rule.
Married philosophers are funny, married with children--HILARIOUS!  Granted, Nietzsche wasn't quite the best role model, Smith admits, but he's still got a point--

Now Nietzsche was of course a raving and sorry case, who precisely failed to implement philosophy as a practice of the good life. But his historical point is unassailable, and the truth of it helps us to bring into relief the exceptional situation of current academic philosophy. The life of the philosopher was traditionally something akin to life in a monastic order; it placed an extremely high demand on its initiates, and forced them to choose between different and competing fundamental goods.

This is particularly clear in the Indian tradition, where the choice was explicit between being a ‘householder’ and being a ‘world-renouncer’, as the two ways of expressing Hindu devotion. It was also explicit that the former figure would necessarily be prevented from advancing as far as the latter in matters of illumination. Philosophy was an askesis, and as such was incompatible with the domestic life.

Of course these days the fashion is to reconceptualize domesticity so as to fit whatever image we prefer to maintain of ourselves, so that, for example, when a new mother has to cut back on her yoga sessions, she can announce that she is now engaged in ‘the yoga of being a mom’. I have read Patañjali’s Yoga Sutras, and I have not found anything about that.

Nor have I found any convincing evidence that the figure of the philosopher can be transplanted from the cloister to the household without serious deformation, let alone any evidence that –again, as I’ve been told three times over the past year– one cannot fully realize one’s potential as a philosopher unless one is a parent.
"Serious deformation."  I'd like to know what the "serious deformation" consists in, apart from the fact that philosopher-parents don't fit the stereotype of the guru or yogi or monk. I have no idea, because after making the deformation point, Smith goes on to make a different point.  He says philosophers have written perceptively about childhood by just recalling their own childhoods. Since we all used to be children, everyone's on an equal footing--parents and the childless.

Ahhhh, but that doesn't work. The insight we get by having children isn't necessarily about what it is to be a child. As I said above, it's (partly) about values, and about love, and this crucially involves bringing up a child, not just recollecting being a child.  In any event, it does jog the memory about childhood to watch a child grow up.  I remember being a teenager much more vividly now than I did 10 years ago.  That's because I have two teenagers, and I'm continually being brought back to that time of my own life by watching them grow up.

I'm against a certain sort of over-zealous egalitarianism about value, where you can't recognize the unique value of doing X if it's inevitably the case that some people don't do X.  I think that's politically correct but not truthful.  Parenthood is one of those things with special value, but which (for a whole host of reasons), not everyone will experience or even want to experience.


12 X 6 = Time for ... What?

It's just about 12:12:12 on 12/12/12, so it's an auspicious time to ... what? That's the only problem. I can't figure out how to use this obviously precious moment!


The Metaphysics of Corpses

Continuing to explore whether I was ever a zygote (or embryo, or fetus) ....

Current favorite picture of things:  I am an organism, essentially (so: Animalism). But must I think I began to exist as a zygote? Maybe not. First there was a zygote, which developed and grew into an embryo, and then a fetus (etc.), and at some point the fetus became me.  There's a lot to say about why that picture is attractive, and why it might also be unattractive ... but I want to focus on just one issue here: does it make sense to suppose that a zygote becomes a mature organism, without being identical to it? Can A become B, in the absence of an underlying entity C, such that A=C and B=C?  Or to put it another way, does it make sense to think of A becoming B, when it's also the case that A goes out of existence?

One indication that this might make sense involves monozygotic twinning.  Suppose a zygote Z becomes two embryos, E1 and E2.  Z can't be identical to either of the successor embryos (since they're not identical to each other), so Z goes out of existence.  Nevertheless, it's fair to say that Z becomes E1 and Z becomes E2. This is becoming without identity. There's no underlying entity that persists, as Z becomes E1 (and also E2). Jeff McMahan points out that this sort of going out of existence via division isn't problematic in The Ethics of Killing:
There does not, however, seem to be anything problematic about the claim that, if an organism begins to exist at conception and twinning occurs, the organism simply ceases to exist. Ceasing to exist through division is not the same kind of event as death and does not leave dead remains behind. Thus, for example, when an amoeba divides, it ceases to exist though it does not die. While living entities may cease to exist by dying, some may also cease to exist in another way, by dividing. (p. 27)
Could it be the case that even without twinning, a zygote/fetus goes out of existence when it becomes a baby? If "ceasing to exist through division" is a possibility, why not also "ceasing to exist through transformation"? This, again, would be the sort of ceasing to exist that doesn't leave remains behind.  When there's radical transformation, you might say, A becomes B, but A also goes out of existence.  There's no underlying entity, C, that endures "underneath" the transformation.

In fact, isn't this exactly how we think about death (or at least could think about it)?  There is a living organism--A.  It dies, and there are so many radical transformations involved that the corpse, B, is something new.  A both goes out of existence and becomes B.  Surprisingly (to me, anyway), McMahan thinks if you take A as essentially an organism, you can't make out the idea that it goes out of existence when death occurs. Since that's absurd, you shouldn't (on his view) take A to be essentially an organism.

But why can't you think of a living organism as ceasing to exist, when death produces a corpse?  He writes,
 If, however, an organism ceases to exist when it dies, what exactly is the corpse and where does it come from? Merely labeling it the 'remains' of the organism is unilluminating (p. 20).

He comes up with four possibilities: (1) The corpse is "an entirely new entity, one that springs into existence in the area of space that the organism previously occupied immediately upon the organism's death." (p. 20) His assessment: No, of course not, that's silly. (2) The corpse was there all along, a second entity in addition to the living organism, A. His assessment: Also silly! (3) The living organism, A, is just a phase in the existence of a longer-lasting entity, and the corpse is another phase. So the organism goes out of existence in the same sense that a child goes out of existence when she becomes an adult.  His assessment: This is inconsistent with thinking that A is essentially an organism, so it's ruled out too. (4) Upon death, the organism, A, basically disintegrates, leaving behind no further entity, B.  In a nutshell: there are no corpses. Again: Silly.

But isn't there another option?  Let's call it (1A), since it's close to (1).

(1A)  The corpse is a new entity, but not "entirely new". Most of the molecules in the corpse come from the initial living body. Much of the organization of the corpse is owing to things that happened to the prior living body. The living body becomes the corpse, but also goes out of existence, since the transformation is too great to sustain identity.

If division of Z into two embryos is a way for Z to go out of existence, and we shouldn't think of twins E1 and E2 as "entirely new" in some mysterious or absurd way, then why not also think of ordinary death as a way for a living organism to go out of existence, though also becoming a corpse? So: becoming without identity, in both instances. And then, why not use the notion of becoming without identity to understand us and our origins? Zygotes become you and me, but we're not identical to them.

When all is said and done, I fear there may be more than one coherent way to think about these things.  All I say (so far) is that this might be one of them.


More Kerfuffling

Let us not kerfuffle endlessly, especially during winter break, when all good academics get vast amounts of work done (right?).  But I can't resist noting that Jerry Coyne weighs in here on the topic du jour.  If I'm reading between the lines correctly (I'm not betting large sums on that), he thinks Rebecca Watson is too rough on evolutionary psychology.* I'm prepared to believe what he says on that score, but it would have been nice for him to also say two other things--(1) Her main goal was to talk about how bad science is picked up and propagated by the media and thus puts women under stereotype threat (that's what I said here and she says here). So while she did excoriate EP quite generally, her main contentions were about other matters.  And they were supported by some great examples. (2) The kind of thing she was doing--feminist science criticism--has been supported by Coyne in the past. So he is not in the camp that dismisses all feminist inspired responses to science.  There are a bunch of people with that anti-feminist stance making a lot of noise these days in skeptic/atheist circles, so it seems worth pointing out that he isn't one of them. 

* Update: He says over there that he never even saw her talk and so his post is not about it at all.


Feminist Science Criticism, 300 BC

Yesterday I happened to be reading Aristotle's account of reproduction, and came upon a nice example where having a feminist science critic on the scene would have been helpful.  Let us imagine one Kallista, an imaginary champion of women from 300 BC, responding to Aristotle's account in Generation of Animals (Book I, ch. 21, trans. Platt).

Aristotle says here that a female is passive and a male is active.  Thus, the matter of an embryo comes from the female, but the form comes from the male.  The semen doesn't wind up being a part of the offspring.  The male rather gives form to the matter supplied by the female--he is like the carpenter, and she merely supplies the wood. The form, function, and essence of the embryo, and later the baby, are therefore generated by the male.

Now Kallista, let's suppose, is sensitive to a woman's role in Athenian society.  She knows that women lived in veritable purdah in Aristotle's time, playing no role in the public affairs of the city. So the male-active, female-passive story sounds suspicious to her. That's just how we're supposed to live, says this society, not necessarily how things really are. She also notes that Aristotle buys into female inferiority in a big way. For example, later in Generation of Animals he says "For the female is as it were a male deformed, and the menses are seed but not pure seed; for it lacks one thing only, the source of the soul." (Balme trans., in A New Aristotle Reader).

Only the male is the source of the soul, says that passage. That's a very big deal. There are multiple souls, for Aristotle. Without ensoulment there can be no growth, nourishment, perception, or thought. Unensouled entities are not even alive, let alone human. So Aristotle's giving the male virtually all the credit for our nature.

Kallista suspects Aristotle is biased--he's got a deeply ingrained notion about the role of women, and he's projecting it onto nature.  That's step one in her thinking.

The second step is to take the suspicion and run with it, searching for actual errors in Aristotle's science. She scratches her head .... How is it that the male's role in reproduction is anything like a carpenter's role? How can that really be?  How (on earth) could depositing semen inside the female be anything at all like what happens when a carpenter imposes form on a block of wood?  The carpenter thinks about the form and function he's after. How could the semen do anything of the kind? What--does semen think? Her feminist suspicions lead her to some very reasonable doubts about Aristotle's account.

The third step is to produce a better theory. Now Kallista  shouldn't be dogmatic.  She shouldn't presume to know more than she really does.  It would be more egalitarian if the female and male both contributed the same amount of matter to the embryo (and fetus and baby), but it's obvious that they don't. (This amusing editorial by Greg Kamikian estimates that males have contributed 1 pound to the mass of humanity, over the whole history of homo sapiens reproduction--107 billion babies thus far.  If females contributed just the same--1 pound--we'd be missing another 800 billion pounds!)

It would be more egalitarian if males and females contributed equally to the ensoulment and the form, function, and essence of the embryo/baby, but she shouldn't fully commit herself to that a priori. She has legitimate doubt about Aristotle's view, but not a replacement.

Still, feminist science criticism does move things forward.  It creates reasonable suspicion that Aristotle's just projecting male and female social roles. It puts his details under heightened scrutiny, helping us see problems with the idea that semen plays the role of carpenter, with women providing the wood.

So much for defending FSC.  It seems like skeptics about it (I've run into some recently) really have to be doctrinaire anti-feminists. It just can't be that Kallista's reaction to Aristotle wouldn't be a good thing--good both for science and for women.


Feels Like We Only Go Backwards

It's Friday, so time for a great song...and great video!  Makes me think about John Lennon. Finder's credit: RAG



Was I ever a fetus? I continue to read and think about this.  So far I'm inclined to go along with Eric Olson's animalism, which says that the entity I am right now did start to exist as a fetus, way back when.  I think there are some excellent arguments for that view in his two books. But it does make me a bit queasy.  It seems odd to suppose that I was once a millimeter long. The oddity is enough to make me take a very close look at the competing views.  (There are also worries about defending abortion, but I'm going to make a concerted effort to set them aside. It's a good general principle that we should let metaphysics steer ethics, not the other way around.)

The Constitution View, the view defended by Lynne Rudder Baker in Persons and Bodies, avoids the oddity by saying that I am a person merely constituted by an organism. The organism is one thing, and it did begin to exist as a fetus.  But the person is another, and it began to exist only at the point when self-awareness emerged. I am a person, not an organism, and I never existed as a fetus.  The problem is, I find aspects of that view completely unintelligible, as I explain here.

How else can we avoid saying I was once a millimeter long? I think this is what I believed, before I really thought it through--  I started to exist as a baby, or maybe as a late-term fetus.  That would have been roughly in the month of March, since I was born in May. I didn't exist in February, January, December, etc., but of course "my" early-term fetus did. So what did I used to think about the connection between my early-term fetus (F1) and my late-term fetus (F2)? I used to think F1 became F2, of course.  Duh!

This, then, raises a question of logic:  If F1 becomes F2, must there be an underlying entity that endures, first as F1 and then as F2? In other words, must it be the case that F1=F2? If that is the case, and I think I did exist as F2, then I'm forced to think I existed even earlier, as F1.  In other words, my old way of thinking about this is untenable. I can't say I started to exist as a baby, and deny ever having been a fetus.
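For what it's worth, the identity step here can be made fully explicit. Below is a toy formalization in Lean, where the contested principle--that becoming entails identity of the underlying entity--is taken as a premise rather than proved, and all the names are illustrative:

```lean
-- Toy formalization of the argument above (all names are illustrative).
-- The contested premise: if A becomes B, there is one enduring
-- entity, so A = B.
variable (Entity : Type)
variable (becomes : Entity → Entity → Prop)
variable (becoming_is_identity : ∀ A B : Entity, becomes A B → A = B)
variable (F1 F2 me : Entity)

-- Given that F1 became F2, and that I existed as F2 (me = F2),
-- it follows by transitivity of identity that I existed as F1.
example (h1 : becomes F1 F2) (h2 : me = F2) : me = F1 :=
  h2.trans (becoming_is_identity F1 F2 h1).symm
```

The formal point is just that once the premise is granted, there is no room to accept "I was F2" while rejecting "I was F1"; the only way out is to reject the premise itself.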

So... what is the logic of becoming? Is it really true that if A becomes B, then there's some underlying, enduring entity, so that in fact A=B?  Well, it's at least often true.  When a child becomes a teenager, the identity holds: the teenager is the child.  When caterpillar Charlie becomes butterfly Bob, Bob is Charlie. Is there any such thing as a case of becoming, where A becomes B, so there's a very intimate connection between A and B, yet it's not true that A=B?

Don't say "a frog becoming a prince" if you mean the frog disappears and a prince takes his place.  That's not the sort of "becoming" that's involved when an early-term fetus becomes a late-term fetus.  We need a very intimate relation, where the properties of F1 are pretty continuous with the properties of F2, yet there's some coherent reason to acknowledge two separate entities.  Is becoming ever that way--a transition from one entity to a bona fide second entity? If you can think of an example like that, pray tell.


Feminist Science Criticism

Update 12/6: Here's a vastly more exhaustive response to Watson and Clint than mine. Great stuff.

For your viewing pleasure, I give you a controversial talk by feminist skeptic Rebecca Watson:

A lot of people seem to be impressed with this excoriation of Watson, by one Ed Clint, but I'm not so impressed. In fact, I'm amazed.  If Watson's talk amounts to "science denialism" then there are piles of books and articles that belong in the trash with the classic instances of denialism--Holocaust denialism, climate change denialism, and evolution denialism.   If Watson is a denialist, so is Cordelia Fine, author of Delusions of Gender--Fine also subjects a huge pile of science to a withering critique.  If Watson is a denier, so is Sarah Blaffer Hrdy, author of The Woman that Never Evolved--she also critiques existing science for being biased by prevailing gender norms.

By this standard, Jerry Coyne is a denialist cheerleader: he writes "I am a fan of 'feminist science criticism': the idea that women can sometimes point out male biases in research strategies and in the interpretation of scientific results" in a post supporting Slate writers Emily Yoffe and Amanda Schaffer, who had trashed a science article on rape.  And let's not forget people who have critiqued science for its racist biases--if feminist critics of science are denialists, we'd better call Stephen Jay Gould a denialist too. That's a lot of denialists!

Right. None of them are denialists.  Rebecca Watson is not a science denialist.  She's simply engaging in feminist science criticism, with a focus on how media and business interests stoke the fires of sexism. It's a separate question whether she's doing what she's doing well, but the kind of thing she's doing is perfectly legitimate, and in fact valuable.

The other objection we get early in Ed Clint's post is that she has the wrong credentials. "Watson is known for her blog website, as co-host of a popular skeptic podcast, and for speaking at secular and skeptic conferences. But Watson holds no scientific training or experience." (Holds?  Whatever!)  Again, you have to be consistent. If it's a problem that Watson has insufficient science credentials, it's got to be a problem that Emily Yoffe and Amanda Schaffer aren't scientists, and neither are my favorite science journalists, like Robert Wright, Matt Ridley, and Natalie Angier.  Many people make excellent pundits and popularizers without first getting degrees in the relevant subject. No--come on!--Watson's lack of science training isn't really an appropriate basis for complaint.

I'm afraid I lost interest in the post soon after the bits about denialism and Watson's credentials, so can't tell you what I think about the 50 billion errors Clint claims to have found in the talk. Listen for yourself. It's fun and interesting, and you simply have to love the way Watson's hair and top match the lectern.


Taking Persons too Seriously

In Persons and Bodies Lynne Rudder Baker says the Constitution View takes persons seriously, and other accounts of what we are don't take them seriously enough. I would say, rather, that the Constitution View takes persons too seriously. On the Constitution View, persons (like us) are persons essentially.

What makes a person a person, Baker contends, is having a first person perspective (FPP). When a human organism starts having an FPP, something that's a person essentially starts to exist.  The organism doesn't have personhood essentially; rather it starts to constitute a distinct entity, an entity that is essentially a person.  I am such an entity and you are too. Non-human animals aren't persons, fetuses aren't persons, and at the end of life, if our FPP is extinguished before death, we go out of existence before organic death.

Now you could quarrel with the science here. Maybe some non-human animals do have FPPs. But let's not go there--let's suppose Baker is right about human uniqueness. Even granting that, I'm out of synch with her eagerness to intensify that difference, to make it "ontological"--
My claim is this: However the first-person perspective came about, it is unique and unlike anything else in nature, and it makes possible much of what matters to us. It even makes possible our conceiving of things as mattering to us. The first-person perspective -- without which there would be no inner lives, no moral agency, no rational agency -- is so unlike anything else in nature that it sets apart the beings that have it from all other beings. The appearance of a first-person perspective makes an ontological difference in the universe. (p. 163)
She's not satisfied with the thought that humans and dogs are dramatically different.  It's not enough to simply say "normal, mature humans have an FPP and dogs don't." That, to her, doesn't do justice to the difference.  We must be in an entirely different ontological category from dogs. I suspect she would actually like to be able to consider us immaterial, to make the difference even more dramatic, but she settles for our essential personhood, because she's a materialist. The idea is not only that you should be able to look at your dog and see something in a different ontological category, though that's important to her. You must also be able to say that you were never an entity in that category, and you will never become an entity in that category.

I personally feel perfectly satisfied with merely factual differences between myself and dogs.  I have an FPP and probably they don't.  I don't need to ontologize the difference and make it starker (this makes me think of Instagram filters, for some reason!). If at the end of life my FPP fades, and very old JK starts seeing the world as a dog does, I don't think that will prevent that individual from being me. If baby JK once saw the world as a dog does, that fact doesn't make me think "that wasn't me."  Baker seems to be guided by some sort of moral imperative to draw the sharpest possible lines between persons and non-persons, but I think there are moral costs to doing so.  The heavy-duty line-drawing would surely undermine fellow feeling between ourselves and animals, our future elderly selves, our elderly parents, people with severe cognitive disabilities, etc.

It seems to me we've given enough significance to personhood if we just say that human beings are a kind of creature the normal, mature members of which are persons throughout a major stage of their lives. It's part of my humanity that, if all goes well, I'll spend a lot of my life with the characteristics of personhood. That strikes me as a more accurate representation of the facts than saying that I am essentially a person--that my very existence hinges on retaining the properties associated with personhood.


And now for a more technical objection. Baker thinks persons are constituted by animal bodies, but not identical to them.  Persons have some of their properties derivatively--on account of the properties of their animal bodies. We have our weights, for example, derivatively.  On the other hand, she thinks persons also have some of their properties non-derivatively or independently. Our FPPs and the abilities associated with them are non-derivative. In fact, she thinks, I have an FPP independently, and the organism that constitutes me has it derivatively. This is taking persons very, very seriously--as having their own independent causal powers.
The Constitution View allows that many of our causal powers are independent of the causal powers of our bodies (i.e., are independent of the causal powers that our bodies would have if they did not constitute persons). Dean Jones [the person] has the power to cut the departmental budget; twenty-one-year-old Smith has the power to buy beer; I have the power to send e-mail from home (p. 218)
Imagine the development of a human being over time. At some time or other, the baby crosses the Rubicon. At t minus 1, there was no FPP. At t there is an FPP.  There's a change in causal powers. The baby's not cutting budgets, buying beer, or sending e-mail--so what are the new powers?  Well, maybe reacting to a mirror differently. Suddenly, a person is on the scene, merely constituted by the baby-organism, says Baker. And (Baker says) these new causal powers belong to the new person independently; they are not derived from the constituting body.

Why not derived? I think the gloss in the parentheses makes very little sense. The new causal powers are independent, she says, because they are "independent of the causal powers that our bodies would have if they did not constitute persons."  That is, they're independent (period) because they're independent of a certain portion of our bodily causal powers. Which ones? The "animal" ones, the ones we'd have if we didn't constitute persons.  I would have thought they could only be independent (period) if they're independent of all of our bodily causal powers, including those associated with the abilities that make us persons.

The whole story here starts to fall apart when you think about the moment when a person comes into existence. The baby must surely have some new brain activity for the FPP to emerge. This, says Baker, makes another entity come into being, an entity that's essentially a person. But the FPP emerges in the brain. It's got to be true that the baby-organism has that FPP, even if the new entity, the person that's essentially a person, does as well.  Why doesn't the brain property that generates an FPP simply confer on the baby the further property of being a person, instead of inducing the existence of another entity? You can say it's more intuitive to suppose a new person exists, since that allows for essential personhood (which the organism lacks), but that doesn't make it so.

Favorite Philosophers

Philosophy Bites asked a lot of philosophers "Who's your favorite philosopher?"  Most of them chuckled first and then said ...

Montaigne & Nietzsche
Sartre, None
Hobbes, Rousseau
Hybrid of Wittgenstein, Marx, Mill
Hybrid of Armstrong, Smart, Lewis
Fred (Fred Nietzsche)
Fodor (for his wit!)
The last woman I talked to, whoever she is
Socrates, Wittgenstein
Hume, Bentham
Hume, Wittgenstein
Mill (because he'd be interested in talking to a woman)
Hume (for many reasons, but for one: he was a good cook)
Kant (he was so damned good)
Hume, Williams
Dummett, Hume, Wittgenstein, Chomsky (Hume #1)
Panaetius (a late Stoic), Gandhi
Kant (triggers salivation)

Have fun guessing who chose whom.  Where did I go wrong? Everyone loves Hume. I find him boring. I'll go for ..... Aristotle. No, Plato. Agh. Just not Hume.