Guest Post: A Deserved Toast To Christopher Hitchens

Josh: Today, we’re excited to have a guest poster, Pat Andriola, who is elegizing famed author, atheist, and (unfortunately) cancer sufferer Christopher Hitchens.

———————–

To the dumb question “Why me?” the cosmos barely bothers to return the reply: Why not?

"To me, Christopher Hitchens is what we classically think of as a superhero, but he uses words and reason instead of lasers and guns; throughout his life, those have been his true weapons."

The news of Christopher Hitchens having esophageal cancer, a disease that Hitchens describes as “not a good cancer to get,” hit me hard. If you’ve ever heard Hitchens speak, you probably have an opinion of him. His English wit, high intelligence, and propensity to say what he feels have made him something of a polarizing figure. A former Trotskyist who now aligns more politically with the moderate right, Hitchens has written for Vanity Fair since 1992. A passionate atheist who has been entirely critical of religion, Hitchens rose to fame mostly for his persistent attacks on adored public figures. He once called Ronald Reagan a “stupid lizard,” wrote a scathing book on Mother Teresa entitled The Missionary Position, and said on the day of the Reverend Jerry Falwell’s death that he “wished there was a hell” awaiting the late pastor in the afterlife.

Hitchens has led a fascinating life that is seemingly coming to a very mundane ending; he calls his cancer “boring,” and the always cool-headed author has been exceedingly rational throughout the process, even though the cancer was most likely brought on by decades of drinking and smoking. As he puts it: “I have been taunting the Reaper into taking a free scythe in my direction and have now succumbed to something so predictable and banal that it bores even me. Rage would be beside the point for the same reason.”

Hitchens may be best known for his stance on religion. In 2006, many publications began touting him as one of the Four Horsemen of the “new atheism,” alongside evolutionary biologist Richard Dawkins, Stanford graduate student and best-selling author Sam Harris, and one of my professors at Tufts University, Daniel Dennett. Hitchens began debating theists such as Dinesh D’Souza and Al Sharpton across the country, completely blunt in his criticism of Christianity. He threw down with the likes of Bill O’Reilly and Sean Hannity on Fox News and called the current Roman Catholic Pope, Joseph Ratzinger, an “elderly criminal.” As the news of his imminent mortality spread, many began wondering if the devout antitheist (as he calls himself) had softened his position on the irrationality of belief in the divine. When CNN’s Anderson Cooper asked him about a potential deathbed conversion, Hitchens said it would be possible only if a combination of drugs and pain made him delusional. “Not while I’m lucid, no.”

But as much as Christopher Hitchens ripped on every pious person across the globe, he was an entirely endearing and charming figure to many others. His intellectual arrogance and ever-present glass of scotch gave him a “bad ass” demeanor that wooed the hearts and minds of fans from all over. For me, reading Hitchens’ 2007 best-seller, God Is Not Great: How Religion Poisons Everything, was an eye-opening experience, one I began as a passionate Catholic and ended as a non-believer. As I struggled with the realization of losing my faith, which was an integral part of my identity, I was consoled by the words of Hitchens:

Thus, dear reader, if you have come this far and found your own faith undermined—as I hope—I am willing to say that to some extent I know what you are going through. There are days when I miss my old convictions as if they were an amputated limb. But in general I feel better, and no less radical, and you will feel better too, I guarantee, once you leave hold of the doctrinaire and allow your chainless mind to do its own thinking.

But now, Christopher Hitchens is dying an ugly death, one that has made one of the proudest thinkers of our generation a shell of his former self. Still, he says he hopes to live long enough to read, or possibly even write, the obituaries of “villains” such as Henry Kissinger and Ratzinger. He says he sees no irony in his manner of demise, but one cannot help but chuckle, sadly and a bit pathetically, at Hitchens’ description of drinking alcohol, one of his favorite pastimes, from early 2009:

What the soothing people at Alcoholics Anonymous don’t or won’t understand is that suicide or self-destruction would probably have come much earlier to some people if they could not have had a drink.  We are born into a losing struggle, and nobody can hope to come out a winner, and much of the intervening time is crushingly tedious in any case.  Those who see this keenly, or who register the blues intently, are not to be simplistically written off as “dysfunctional” cynics or lushes.  Winston Churchill put it very squarely when he defined the issue as, essentially, a wager.  He was a lifelong sufferer from the depression that he nicknamed his “black dog”, but he could rouse himself to action and commitment and inspiration, and the brandy bottle was often a crucial prop.  I have taken more out of alcohol, he said simply, than it has taken out of me… But something in the Puritan soul is committed to making and keeping people miserable, even when it is *not* for their own good.  Some of us have at least an inkling of the pursuit of happiness, as well as of happiness as a pursuit.

Hitchens pursued happiness throughout his life, in the end succumbing to the consequences of relying too heavily on that pursuit. Hopefully, he has a few more obituaries to write and bad guys to get. To me, Christopher Hitchens is what we classically think of as a superhero, except that he uses words and reason instead of lasers and guns; throughout his life, those have been his true weapons. I watched him take on evildoers and defend truth during my atheistic adolescence, at one point even having himself waterboarded to show just why and how it’s torture. So watching our Batman or Superman die such a humiliating death is truly shocking. But Hitchens detailed, long before his battle with cancer, the beauty behind his life:

Our place in the cosmos is so unimaginably small that we cannot, with our miserly endowment of cranial matter, contemplate it for long at all. No less difficult is the realization that we may also be quite random as presences on earth. We may have learned about our modest position on the scale…but then, the awareness that our death is coming and will be succeeded by the death of the species and the heat death of the universe is scant comfort. Still, at least we are not in the position of those humans who died without ever having the chance to tell their story, or who are dying today at this moment after a few bare, squirming minutes of painful and fearful existence.

The potentially abrupt end to Christopher Hitchens’ life, denying society an important presence and the scholar decades more of deserved living, is certainly tragic. But the man both told his story (he found out about his cancer during the book tour for his memoir) and certainly lived his life to the fullest. Raise a glass to him.


The Wrongly Decided Injunction Against Embryonic Stem Cell Research Funding.

::Edit:: The injunction that was the subject of this post was reversed on appeal by the D.C. Circuit in Sherley v. Sebelius, 644 F.3d 388 (D.C. Cir. 2011).

On Monday, a federal court issued an injunction against NIH funding of embryonic stem cell research. This decision has severe impacts on ongoing and proposed research projects. Although labs that have already received grant disbursements for embryonic stem cell research will be able to spend that money, review of new grant applications has ceased, and annual renewals of existing awards have also been suspended. Additionally, NIH is still figuring out whether “no cost extensions,” a common request to spend disbursed money beyond the proposed project years, will be allowed. This injunction should never have been granted.

In 1996, Congress passed the Balanced Budget Downpayment Act, which contained a rider, the Dickey-Wicker Amendment, prohibiting the use of federal money in projects involving the creation of embryos for research, or research in which a human embryo is destroyed or discarded. From then on, the Amendment was included in every major appropriations bill involving the Department of Health and Human Services, most recently in 2009.

SEC. 509. (a) None of the funds made available in this Act may be used for–
(1) the creation of a human embryo or embryos for research purposes; or
(2) research in which a human embryo or embryos are destroyed, discarded, or knowingly subjected to risk of injury or death greater than that allowed for research on fetuses in utero under 45 CFR 46.208(a)(2) and Section 498(b) of the Public Health Service Act (42 U.S.C. 289g(b)).
(b) For purposes of this section, the term “human embryo or embryos” includes any organism, not protected as a human subject under 45 CFR 46 (the Human Subject Protection regulations) . . . that is derived by fertilization, parthenogenesis, cloning, or any other means from one or more human gametes (sperm or egg) or human diploid cells (cells that have two sets of chromosomes, such as somatic cells).

HeLa cells have been maintained in culture since 1951.

Fortunately and unfortunately for scientists, Congress has a very limited understanding of scientific research. Embryonic stem cells are not taken from embryos each time an experiment is to be performed, just as cancer cells do not need to be freshly taken from a tumor every time they are studied in a lab. HeLa cells, for example, are the predominant cells used for the study of human cell biology. The HeLa cells used in labs around the world today are all descended from a sample taken in 1951 from a patient with an aggressive cervical adenocarcinoma, Henrietta Lacks. Those cancer cells now constitute an immortal “cell line” and can replicate indefinitely in vitro. Embryonic stem cells are similarly immortal: once the original cell lines were established from embryos, no further embryos were needed to replenish the stock of those lines. This created a loophole in the Amendment. Stem cell lines were created by private companies that destroyed embryos, while researchers with government grants used only the stock cell lines. It’s important to note that even after this loophole was established and basically continued as precedent, Congress took no steps to modify the Amendment’s language to close it. It’s also interesting to remember that even though HeLa cells were procured (basically stolen) from Henrietta Lacks in what would today be seen as a clearly unethical manner, no one argues that the use of HeLa cells, which is vitally important to biology research, should be discontinued as a sort of fruit of the poisonous tree.

In 2001, George W. Bush announced a policy of limited funding for stem cell research, under which only embryonic stem cell lines created prior to 2001 could be funded. In 2009, Barack Obama lifted all limitations on embryonic stem cell research. Under the new policy, new cell lines could be derived only from embryos that had been created for in vitro fertilization, were no longer needed for reproductive purposes, and were donated by individuals who gave voluntary written consent.

Like many interested parties, I do not believe this injunction should have been granted. As D.C. District Court Chief Judge Royce Lamberth writes in his opinion:

A preliminary injunction is “an extraordinary remedy that should be granted only when the party seeking the relief, by a clear showing, carries the burden of persuasion.” Cobell v. Norton, 391 F.3d 251, 258 (D.C. Cir. 2004). A party carries this burden of persuasion by establishing: (1) that there is a substantial likelihood of success on the merits; (2) that the plaintiff would suffer irreparable injury absent an injunction; (3) that an injunction would not substantially injure other interested parties; and (4) that an injunction would further the public interest.

The Court found that each of these weighed in favor of the plaintiff doctors. I want to skip 3 and 4 (because I think scientists and society have both clearly suffered from this injunction) and focus on 1 and 2.

I do not believe there was a substantial likelihood of success on the merits. As quoted, the language of the Dickey-Wicker Amendment bans federal funding for projects involving “the creation of a human embryo or embryos for research purposes; or research in which a human embryo or embryos are destroyed, discarded, or knowingly subjected to risk of injury or death.” As I stated above, most of the embryonic stem cells used in labs today are derived from existing immortal cell lines created by other labs. There is simply no need to destroy new embryos in order to use established cell lines. At a minimum, then, an injunction should not apply to projects that destroy no embryos and simply use old stem cell lines; it should only block funding for the creation of new embryonic cell lines, which does destroy embryos. The embryonic stem cell lines approved by George Bush, for example, should continue to receive research funding. The Court writes that “ESC research is clearly research in which an embryo is destroyed. To conduct ESC research, ESCs must be derived from an embryo. The process of deriving ESCs from an embryo results in the destruction of the embryo. Thus, ESC research necessarily depends upon the destruction of a human embryo.” On the contrary: the embryos in question were destroyed in the past, and the research once depended upon the destruction of a human embryo, but it no longer does. It is entirely possible, and most often the case, for embryonic stem cell research to continue without any human embryo being destroyed by either a private company or a federally funded actor.

Next we have the “irreparable injury absent an injunction.” The two plaintiffs in this case were Dr. James Sherley and Dr. Theresa Deisher. They are researchers of adult stem cells who claimed competitor standing; they were eligible to sue because the embryonic stem cell policy would “result in increased competition for limited federal funding.” To put it simply, I doubt these fine researchers will get NIH funding after this. Peer review of grant applications is already highly difficult to pass, with many researchers struggling to receive funding (according to Science, “The NIH typically receives between 35,000 and 40,000 proposals a year, while the NSF gets roughly half that number. Only a quarter to a third of these proposals are ever funded”), and it is also highly political. NIH review committees use five core review criteria: significance, investigators, innovation, approach, and environment. The reputation of the investigators plays a role in grant review, as it does in peer review for scientific journals, and grant proposals must include a “Biosketch” with the CVs of the PI and other key personnel. I can’t imagine these plaintiffs helped their cases before NIH review committees, especially when the high rate of rejected grant proposals provides plausible deniability, and given the past grant history of one of the plaintiffs, discussed below.

Interestingly, the plaintiffs have had very different experiences in receiving NIH funding. I used the NIH RePORTER engine to search for historical grants given to these researchers (if you try this, make sure to expand the fiscal year search to all years and uncheck the ‘active projects’ box). I’ve never really used this tool before, so I tested it by looking up the grant history of a professor I worked for at Columbia University, Dr. Darcy Kelley. She had multiple grants listed, going all the way back to 1986, so it looks like RePORTER returns both current and historic NIH grants. A search for Theresa Deisher, however, found no grants ever awarded. Dr. Theresa Deisher, who graduated with a PhD from Stanford in 1990, has never been awarded an NIH grant. Did she really suffer competitively from any embryonic stem cell funding? After two decades of futility, I don’t think this stunt will help her in her quest for NIH money. Side note: Deisher is clearly also in this for religious, and not personal financial, reasons. She’s been trying to establish a connection between abortion, vaccines, and autism with funding from a pro-life group. On the other side of things is Dr. Sherley, who, despite hunger-striking after being denied tenure by MIT, has managed to assemble a steady stream of NIH grants since 1988, including a grant in 2010. Has he really suffered from embryonic stem cell funding after Bush’s 2001 policy and Obama’s 2009 policy? I think standing should have been denied to these plaintiffs, one of whom couldn’t get a grant before competition from ESC researchers existed, and one of whom got grants in spite of it. There is no evidence that their success or lack thereof in obtaining grants was changed in any way by Bush’s or Obama’s policies.

The U.S. Department of Justice will appeal this incorrect decision to the D.C. Circuit Court of Appeals, and regardless of how one feels ethically about the use of embryonic stem cells, I think the scientific facts and legal background make the correct ruling quite clear. Hopefully there can be a speedy review and decision from the D.C. Circuit that will restore funding to embryonic stem cell researchers, so they can continue their important work in battling the diseases that plague humanity.

Continue reading

Ask the Audience: Marky Mark – TV Dynamo?

A feature we used to run on StoneSoup each Monday was to ask our readership a question aimed at prompting interesting responses. I’m not sure we’re going to reintroduce the Ask the Audience item permanently, but I did have something I’ve been curious about that I figured I’d crowdsource to readers to see if anyone knew the answer: What is with Mark Wahlberg’s burgeoning TV empire?

For those of you who don’t know (which I think is most people), in addition to being a successful actor, Mark Wahlberg is also the producer of four shows on HBO: Entourage (loosely based on his life), In Treatment, How to Make It in America, and the upcoming Boardwalk Empire. In fact, on IMDb, he’s actually listed as an executive producer of each of these shows (so either his title is completely misleading or his role is substantial). Moreover, he is also developing a new HBO show about the pornography industry. It’s worth pointing out that Mark Wahlberg has produced more shows for HBO than most of the auteurs traditionally associated with the network: David Milch (2), Alan Ball (2), and David Chase (1). The only producer who has done as many is David Simon (with The Corner, The Wire, Generation Kill, and Treme). Obviously, Mark Wahlberg does not have the same role in his shows as the better-known HBO names have in theirs, but it’s still an impressive (and rarely mentioned) feat.

Nevertheless, I’ve found it difficult to find sources about Wahlberg’s role in the shows he produces, or really any information about the subject at all. Thus, I’m throwing it out to the StoneSoup audience to see if you have more luck: What’s Mark Wahlberg’s role in creating all these shows, and why does it go essentially unmentioned?

Your Home Theater sucks.

Leo stands in awe, like I do at the movies.

The phrase “home theater” is a very sad one. 60-inch LCD 1080p televisions with massive Bose sound systems are all well and good, but they absolutely pale in comparison with the real thing. This is something I’ve always believed, but only recently have I begun to appreciate movie theaters as the absolute necessity in my life that they are.

Movies are meant to envelop people in a world that is not their own, often telling an incredibly complex story in the span of two to three hours. The goal of so many motion pictures is to evoke a genuine emotional response from the audience about people who, until a few hours before, were essentially complete strangers. A good film will let a character evolve on screen in front of you in a way that makes you feel as if you understand that person’s innermost motivations, a feat that many of us never accomplish even for our closest friends. Home theater setups provide an environment that simply doesn’t allow this kind of experience to happen in the same manner. First, there is something to be said for looking at a picture that taxes your peripheral vision in order to appreciate it all. Instead of looking to the left and catching a glimpse of a coffee table, all you get in a theater is more movie or unobtrusive darkness. Even a 60-inch television will seldom produce this effect unless you’re sitting four feet away. But more importantly, distractions like computers, cell phones, lights, and other such things constantly remind people that what they are watching really is just a movie, and that these are simply characters and not people to invest in.

A proper movie experience is different; it’s more visceral, more real. Recently, I saw Inception with fellow Stone Soup poster Josh Morrison at the Boston Common IMAX here in Boston. From the very first moment of that movie, the pure scope of the theater allowed me to experience it in a way that simply isn’t possible with your average residential setup. The bass of the IMAX speakers not only roared through my ears but literally shook the entire theater. When you can literally *feel* the soundtrack of a movie, it evokes a different response than simply hearing the same sound at home. It makes you feel like you’re there with the characters; it makes you want to know more about the world they live in. I knew when I left that theater that no matter how many times I saw Inception over the course of my life, in many ways I was seeing it for the first and last time.

The real driving force behind this post, however, is neither the sound nor the picture… it’s the people. In the past 12 months I’ve discovered a local gem near my apartment known as the Coolidge Corner Theater. The Coolidge, as its patrons affectionately call it, is a small, independent, non-profit theater that shows the kind of independent fare that doesn’t normally make it to the AMCs of the world until months later, if at all. They also have a propensity, as many independent theaters do, for frequently screening classic movies. In the past few months, I’ve seen showings of both Raiders of the Lost Ark and The Big Lebowski on the big screen, and let me tell you, the experience has been awesome. The idea of viewing these films in a converted opera house does pique my interest for reasons of pure nerdiness, but more importantly, these films get the community to come out en masse. The shows sell out days in advance, and the folks who come out are true enthusiasts. At The Big Lebowski, there were several excellent Walter Sobchak look-alikes who kept reminding people that “This is not Nam, there are rules.” The crowd didn’t stop participating once the film began to roll, either: there was raucous laughter for every funny quip, and absolutely nothing was missed by anyone. During Raiders of the Lost Ark, people cheered wildly when Indiana Jones first looked up from beneath the brow of his fedora in the opening scene, and again when he shot the giant sword-wielding goon in the square.

The crowd goes wild.

When you know that everyone around you is enjoying the movie in the same way that you are, that feeling of elation can only be described as contagious. Seeing a movie with four or five friends at home is also good, but it simply isn’t the same. Admittedly, these moments certainly aren’t experienced during every movie you see in theaters, as most films aren’t of that quality. Nevertheless, I’ve always felt that movies are a communal experience, and the collective gasp that occurs in theaters during tense scenes in great movies is a tough thing to replicate.

Anyway, go see a movie this weekend. I will.

Foam

Ah, foam. Once the height of culinary innovation, now the unloved stepchild of the molecular gastronomy movement. It has been assailed as too frou-frou by the bulk of the dining public and as passé by the epicurean elite, so I’d like to take this post to sing its praises and explain (away) its faults.

Harold McGee’s On Food and Cooking defines foam as “a portion of liquid filled with air bubbles, a moist, light mass that holds its shape.” That definition appears in Dr. McGee’s discussion of dairy-based foams, which are common, unexceptional, and delightful. They include meringue (egg whites suffused with air bubbles) and whipped cream. More recently popular is the use of foamed milk for cappuccino. These are not, of course, usages of foam that anyone finds objectionable. Instead, restaurant patrons are responding to the use of substances like agar or lecithin to allow ingredients like mushroom, garlic, or raspberry to be shot onto an otherwise inoffensive plate in the form of a delicate, airy foam. This was perhaps the first widely copied technique to come out of Ferran Adrià’s El Bulli, a restaurant that helped kickstart the molecular gastronomy trend, but it might also be the most controversial.

For those readers unfamiliar with the term, molecular gastronomy basically refers to a movement in fine dining that radically alters ingredients (typically changing their texture) by using new and unusual preparation techniques. For example, Grant Achatz of Chicago’s Alinea (perhaps the best American restaurant at the moment) spherifies olives by reducing the olive’s juice, injecting it into a gelatin casing, and then watching the fun as diners eat and puncture the sphere, bursting open an impossibly vivid, concentrated olive flavor. Clio, the Boston restaurant most associated with the molecular gastronomy movement, sometimes serves a dish of strawberries with dehydrated chocolate “soil.” When you first have a bit of the chocolate, its impossibly chocolatey taste combines with a crumbled-cookie-like texture; but then, when it comes into contact with liquid, it becomes creamy and rich, changing as you eat it.

In some ways, I think, transforming culinary ingredients into foam is one of the best and most useful of the innovations associated with this school of cooking. Most obviously, it is a very straightforward way of injecting a flavor into a dish: you can add the flavor of sesame (as at O Ya, a fantastic(ally expensive) sushi restaurant in Boston) without needing to make it into an oil or a sauce, which matters when the ingredient might not combine well with a condiment that’s already being used, or when another way of getting the flavor into the dish would be too heavy or too unevenly dispersed. Perhaps more subtly, the lightness of a foam (it evaporates as you eat it) combines with the vividness of the flavor to create something magical. You wouldn’t want to never feel the solidity of the thing you are tasting, but sometimes, as a change of pace, the evanescent essence of a flavor is a delightfully shocking experience. Finally, the reason I go to nice restaurants (or attempt unreasonable feats of cooking on my own) is to experience something surprising, delightful, and different from what I usually eat. Basically anyone who knows how to turn a dial on the stove can follow a Mark Bittman recipe to the letter and make something delicious, but going out to a great dinner allows us to try something new we wouldn’t otherwise have; foam accomplishes that.

So why is it so hated? One reason can uncharitably be described as willful philistinism. Eating is a comforting experience, and people don’t want to be pushed out of their comfort zones. When they go to a nice restaurant, they’ll try something they might not generally eat at home (diners are much more likely to order fish at a restaurant than they are to cook it themselves), but they also don’t want to try something really weird and hoity-toity. Foam, to these diners, is an easy example of chefs going too far in the direction of the exotic, so it is easy to define their tastes against it (“the butter-poached lobster at Clio is great, but I don’t like how they use all those foams”). Foam makes a convenient target because it has become a fairly widely adopted culinary trend in the realm of fine dining.

But, of course, foam also gets it from the other side: highly knowledgeable food critics and gastronomes who declare themselves sick of the idea, which they see as old and overdone. Part of this can be explained by a psychological mechanism similar to the one above. Here the foodie elite are turning on a technique that was exciting when it was new (at least ten to fifteen years ago, maybe even longer) but which has now aged and, in a weird way, become popularized to an extent inconsistent with being the cool, authentic new thing. Think of Bob Dylan when he played Like a Rolling Stone on the electric guitar: the early-adopting folk fans were furious, but it’s not as if the establishment was very happy to accept him either.

Moreover, in defense of the foodies who’ve made their stand against foam, there are indeed many instances of chefs who use foam to make their cooking seem avant-garde and new when (as mentioned) the technique is anything but. The best example of this is the villain of Season Two of Top Chef, Marcel Vigneron, a pretentious prig who insisted on putting foam atop every one of his stupid dishes. Even four years ago this was old hat and failed to impress the judges, but many viewers probably thought the technique indicated innovation and sophistication. I’ve also definitely noticed restaurants with no actual claim to creativity putting foam on the menu in an attempt to project sophistication to those who don’t know any better, so I suppose I can forgive fellow foodies for feeling galled. Still, don’t fear the foam. It’s a good, useful culinary technique that should be more widely adopted, not less.

Fired for Paying Respects to the Controversial Dead.

In July, Octavia Nasr was fired by CNN following a tweet mourning the death of Lebanese Shiite cleric Sayyed Mohammed Hussein Fadlallah.

Recently an important Muslim religious figure passed away. During his life, he stated that the United States “probably deserved” to have been attacked on September 11th, 2001, and additionally blamed the attacks on the sins of American homosexuals, non-believers, and abortionists. He once proclaimed that God “does not hear the prayers” of Jews, and only prayers offered in the name of his Prophet warranted God’s attention. He denounced secular teaching in education, and was in favor of religious organizations one day taking over schools.

Now my question is this: Should CNN journalist Octavia Nasr have been fired for praising the life of this controversial religious leader? Nasr was fired, after 20 years of working at CNN, for tweeting “Sad to hear of the passing of Sayyed Mohammad Hussein Fadlallah.. One of Hezbollah’s giants I respect a lot...”

What do you think?

Well, the religious leader I talked about in the first paragraph wasn’t Fadlallah. In fact, I lied about this leader being Muslim. The person described in the opening paragraph was, in fact, Rev. Jerry Falwell, who died in 2007. Jerry Falwell accused America of bringing about the Sept. 11th attacks through a host of sins. He once attacked civil rights by opposing the Brown v. Board of Education decision, and told a crowd that the Antichrist was walking the earth as a male Jew. He was in favor of religious control of American education. He hated homosexuals, and made such comments as “AIDS is not just God’s punishment for homosexuals, it is God’s punishment for the society that tolerates homosexuals” and “Gay folks would just as soon kill you as look at you.”

Yet after his death, news organizations did not fail to praise him for the good he brought the world. Paula Zahn of CNN, before introducing Christopher Hitchens on her show, said “you won’t believe what my next guest is saying about [Falwell’s] legacy.” Would CNN have introduced an outspoken critic of Fadlallah like that? Zahn opened by grilling Hitchens: “Falwell was controversial. Isn’t that a good thing? It gets people thinking and discussing and debating… I’m curious why you think he’s such an idiot. There were thousands and thousands of people who followed him and believed in him…” On Fox News, Sean Hannity questioned whether Hitchens’ criticisms went overboard because they could have hurt Falwell’s family. Hannity said, “I know the good work this man has done… I know what he did for unwed mothers, I know what he did for alcoholics, I know what he did for drug addicts…”

It’s clear that for Jerry Falwell, the media was able to distinguish the good of his life from the bad in examining his legacy. And of course, we do this all the time for the dead. Ted Kennedy was remembered for being the Lion of the Senate and a great liberal champion, not for the Chappaquiddick incident that, for someone less politically connected, might have resulted in a homicide investigation. The Washington Post famously wrote about the Chilean dictator Augusto Pinochet:

Continue reading

Value for money in the Major Leagues: the “Moneyball Cup”

I’ve had a lifetime love for baseball. Born a Yankees fan, I remember staying up late into the night watching the Yankees win the World Series in 1998, 1999, and 2000 (only to watch the Yankees’ heartbreaking collapse in 2001) with some of the best teams in baseball history. I loved listening to baseball games on mute, pretending to be an announcer with my younger brother (also a fanatic) doing the color commentary. We watched all the games, played backyard baseball, created countless fantasy teams, played countless video games, and collected baseball cards, assembling a collection that would later make my brother a fortune on eBay. But baseball, like most things I was interested in when I was 10 or 11, eventually faded in favor of nerdier pursuits.

Moneyball: The Art of Winning an Unfair Game, by Michael Lewis

It was about two years ago when I became interested in baseball again, following the league and discussing the game with my friends and family. My brother was still an expert, and I started living with Will, a formidable Boston Red Sox fan who I had to deal with on Red Sox Nation turf. Part of what drove my interest in baseball was Michael Lewis’ famous book Moneyball, an investigative journalist-style inquiry into baseball’s sabermetric revolution, its impact on the low-budget Oakland A’s, and Oakland’s eccentric General Manager Billy Beane (Brad Pitt will play Beane in a Moneyball movie in 2011).

I had already been familiar with the statistical revolution that was overtaking baseball, being a statistical nerd myself, but the book sparked my interest once again: I began spending way too much of my time on Baseball-Reference.com, FanGraphs, and various other baseball blogs.

Part of what interested me in the statistics was how they played into baseball as a business: with the explosion of statistical sports commentary on the web, teams were constantly being criticized for their front office moves. Did an aging slugger’s statistical performance merit his new contract? Using advanced statistics, the decisions of General Managers could be analyzed not just for their contribution to performance on the field but also for their cost-effectiveness. The change had an impact on me as a fan, but it had a bigger impact on front offices: though Moneyball chronicled a low-budget team trying to catch up with the big boys, even big-spending teams like the Red Sox and the Yankees began using advanced statistics to find undervalued and underrated players.

Baseball-Reference.com recently added more sabermetric stats. It's a great tool for quickly looking up players, teams, records, etc.

But which teams are doing the best job of getting value for money in the Major Leagues today? One could look at the standings, divide each team’s salary by its number of wins, and determine which teams have spent the most (and least) per win, but a central insight of the statistical era in baseball is that many wins and losses are due to statistical noise; in other words, luck. The statistic “batting average on balls in play” (BABIP), for example, measures how often a batter gets a hit when he puts the ball in play. Since BABIP tends to be roughly uniform across Major League hitters from season to season (with slight variations for players who beat out a lot of infield hits), one can use the statistic to determine whether a hitter has been lucky (missing fielders or taking advantage of bad ones) or whether his batting average reflects genuinely better performance.
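For the curious, the standard BABIP formula is (hits minus home runs) divided by (at-bats minus strikeouts minus home runs plus sacrifice flies), and it's simple enough to sketch in a few lines of Python. The numbers in the example are invented for illustration, not taken from any real player:

```python
def babip(hits, home_runs, at_bats, strikeouts, sac_flies):
    """Batting average on balls in play: non-homer hits divided by
    plate appearances that ended with the ball in play."""
    balls_in_play = at_bats - strikeouts - home_runs + sac_flies
    return (hits - home_runs) / balls_in_play

# An invented hitter: 150 hits, 20 HR, 550 AB, 100 K, 5 sacrifice flies
print(round(babip(150, 20, 550, 100, 5), 3))  # 0.299, right around typical league levels
```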

Wins Above Replacement (WAR) is a great statistic for determining how much of a team’s performance is the result of luck and how much is the result of prudent front office decisions: it measures how many wins a player contributes to his team compared to a replacement-level player the team could have picked up off the waiver wire or from Triple-A (click the link above for a fuller explanation; I’ve probably oversimplified it a bit). Overall, it correlates better with winning than most other stats, and when it doesn’t, the difference is likely due to luck.

So what teams get the most WAR for the money? Baseball-Reference makes that calculation easy by providing salary and WAR information for all Major League teams. According to my calculations, the team that’s spent the least money per WAR so far this season (winning what I like to call the “Moneyball Cup”) is the San Diego Padres: spending just $1.9 million per WAR. It’s not just because they’re bargain hunters, either: The Padres are currently in first place in the NL West. Here’s the rest of the top 10:

The Padres, bolstered by a few surprises and some very talented young players, are getting the most value for money this season.

1.  San Diego Padres - $1.9 million per WAR
2.  Toronto Blue Jays - $2.9 million per WAR
3.  Oakland Athletics - $3.4 million per WAR
T-4.  Texas Rangers - $3.6 million per WAR
T-4.  Cincinnati Reds - $3.6 million per WAR
6.  Tampa Bay Rays - $4.1 million per WAR
7.  Minnesota Twins - $4.2 million per WAR
8.  Kansas City Royals - $4.3 million per WAR
9.  Cleveland Indians - $4.5 million per WAR
10.  Atlanta Braves $4.6 million per WAR

Unfortunately, as a Yankees fan, I have no bragging rights in this contest (not that I expected any): While the Red Sox are around league average at $6.5 million per WAR, the Yankees pay quite a bit more: $9.1 million.

Interestingly, it’s a little tricky to determine which team is the worst. Both the Houston Astros and the Pittsburgh Pirates have compiled negative WAR this year. What does that mean (after all, you can’t lose negative games…)? By compiling -2.2 and -2.6 WAR, respectively, the players on the Astros and Pirates have done worse than what one would expect from a team assembled off the waiver wire, the Minor Leagues, and free agency. Ouch. Both teams have thus basically wasted the money they’ve spent on payroll this season, though it’s probably worse for the Astros, whose payroll is over $91 million (the Pirates’ payroll is approximately $34 million).
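Since I worked these rankings out by hand from Baseball-Reference’s salary and WAR pages, here is a rough Python sketch of the arithmetic. The payroll and WAR totals below are approximations back-filled from the dollars-per-WAR figures quoted in this post (illustrative only, not the exact values from the site), and teams with negative WAR are set aside rather than ranked, since dollars per negative win isn’t a meaningful number:

```python
# Approximate 2010 payrolls (in $ millions) and team WAR totals; WAR values are
# rough back-calculations from the dollars-per-WAR figures quoted above.
teams = {
    "Padres":  {"payroll": 37.8,  "war": 20.0},   # ~$1.9M per WAR
    "Red Sox": {"payroll": 162.5, "war": 25.0},   # ~$6.5M per WAR
    "Yankees": {"payroll": 206.0, "war": 22.6},   # ~$9.1M per WAR
    "Pirates": {"payroll": 34.4,  "war": -2.6},   # negative WAR: the ratio is meaningless
    "Astros":  {"payroll": 91.5,  "war": -2.2},
}

ranked, wasted = [], []
for name, t in teams.items():
    if t["war"] > 0:
        ranked.append((t["payroll"] / t["war"], name))
    else:
        # Sub-replacement teams: the whole payroll bought worse-than-free production.
        wasted.append((t["payroll"], name))

for cost, name in sorted(ranked):
    print(f"{name}: ${cost:.1f}M per WAR")
for payroll, name in sorted(wasted, reverse=True):
    print(f"{name}: ${payroll:.1f}M spent for negative WAR")
```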

Here are the rest of the worst:

5.  Seattle Mariners - $15.1 million per WAR
4.  Los Angeles Angels - $18.7 million per WAR
3.  Chicago Cubs - $39.4 million per WAR
T-1.  Pittsburgh Pirates - $34.4 million for negative 2.6 WAR
T-1.  Houston Astros - $91.5 million for negative 2.2 WAR

Of course, the Major League standings don’t mirror these standings. The Yankees have the best record in baseball (though several of the best-value teams, like the Rays and Rangers, are contenders too), and plenty of other teams use big budgets to win as well. Since teams like the Sox and Yankees have more money, their use of statistical analysis has allowed them to continually assemble competitive teams. Meanwhile, teams like the A’s struggle (still under Billy Beane, they haven’t made the playoffs since 2006). Baseball, the only major professional sports league in America without a salary cap, remains a game that favors big-market teams and free-spending owners (heh… go Yankees!). For everyone else, though, there’s always the Moneyball Cup!

Beyond the jump, I have the full standings.

Continue reading