The Wrongly Decided Injunction Against Embryonic Stem Cell Research Funding.

::Edit:: The injunction that was the subject of this post was reversed on appeal by the D.C. Circuit in Sherley v. Sebelius, 644 F.3d 388 (D.C. Cir. 2011).

On Monday, a federal court issued an injunction against NIH funding of embryonic stem cell research. The decision has severe consequences for ongoing and proposed research projects. Labs that have already received disbursements for embryonic stem cell research may spend that money, but review of new grant applications has ceased, and annual renewals of existing awards have been suspended. NIH is also still deciding whether “no-cost extensions”, a common request to spend already-disbursed money beyond the proposed project years, will be allowed. This injunction should never have been granted.

In 1996, Congress passed the Balanced Budget Downpayment Act, which contained a rider, the Dickey-Wicker Amendment, prohibiting the use of federal money for the creation of embryos for research, or for research in which a human embryo is destroyed or discarded. The Amendment has since been included in every major appropriations bill covering the Department of Health and Human Services, most recently in 2009:

SEC. 509. (a) None of the funds made available in this Act may be used for–
(1) the creation of a human embryo or embryos for research purposes; or
(2) research in which a human embryo or embryos are destroyed, discarded, or knowingly subjected to risk of injury or death greater than that allowed for research on fetuses in utero under 45 CFR 46.208(a)(2) and Section 498(b) of the Public Health Service Act (42 U.S.C. 289g(b)).
(b) For purposes of this section, the term “human embryo or embryos” includes any organism, not protected as a human subject under 45 CFR 46 (the Human Subject Protection regulations) . . . that is derived by fertilization, parthenogenesis, cloning, or any other means from one or more human gametes (sperm or egg) or human diploid cells (cells that have two sets of chromosomes, such as somatic cells).

HeLa cells have been maintained in culture since 1951.

Fortunately and unfortunately for scientists, Congress has a very limited understanding of scientific research. Embryonic stem cells are not taken from embryos each time an experiment is performed, just as cancer cells do not need to be freshly taken from a tumor every time they are studied in a lab. HeLa cells, for example, are among the most widely used cells for the study of human cell biology. The HeLa cells used in labs around the world today are all descended from a sample taken in 1951 from a patient with an aggressive cervical adenocarcinoma–Henrietta Lacks. Those cancer cells now constitute an immortal “cell line” and can replicate indefinitely in vitro. Embryonic stem cells are similarly immortal. Once the original cell lines were established from embryos, no further embryos were needed to replenish the stock of those lines. This became the loophole in the Amendment: stem cell lines were created by private entities that destroyed embryos, while researchers with government grants used only the stock cell lines. It’s important to note that even after this loophole was established and effectively continued as precedent, Congress took no steps to modify the language of the Amendment to close it. It’s also interesting to remember that even though HeLa cells were procured (essentially stolen) from Henrietta Lacks in what would today be seen as a clearly unethical manner, no one argues that the use of HeLa cells (which is vitally important to biology research) should be discontinued as a sort of fruit of the poisonous tree.

In 2001, George W. Bush announced a policy of limited funding for stem cell research, under which only embryonic stem cell lines created prior to 2001 could receive federal funding. In 2009, Barack Obama lifted the 2001 restrictions on federal funding for embryonic stem cell research. Under the new guidelines, new cell lines could be derived only from embryos that had been created for reproductive purposes through in vitro fertilization, were no longer needed for that purpose, and were donated by individuals who gave voluntary written consent.

Like many interested parties, I do not believe this injunction should have been granted. As D.C. District Court Chief Judge Royce Lamberth writes in his opinion:

A preliminary injunction is “an extraordinary remedy that should be granted only when the party seeking the relief, by a clear showing, carries the burden of persuasion.” Cobell v. Norton, 391 F.3d 251, 258 (D.C. Cir. 2004). A party carries this burden of persuasion by establishing: (1) that there is a substantial likelihood of success on the merits; (2) that the plaintiff would suffer irreparable injury absent an injunction; (3) that an injunction would not substantially injure other interested parties; and (4) that an injunction would further public interest.

The Court found that each of these factors weighed in favor of the plaintiff doctors. I want to skip (3) and (4) (because I think scientists and society have both clearly suffered from this injunction) and focus on (1) and (2).

I do not believe there was a substantial likelihood of success on the merits. As quoted, the language of the Dickey-Wicker Amendment bans federal funding for projects involving “the creation of a human embryo or embryos for research purposes; or research in which a human embryo or embryos are destroyed, discarded, or knowingly subjected to risk of injury or death.” As I stated above, most of the embryonic stem cells used in labs today are derived from existing immortal cell lines created by other labs. There is simply no need to destroy new embryos in order to use established cell lines. At a minimum, then, an injunction should not reach projects in which no embryo is destroyed and an old stem cell line is simply used; it should block, at most, funding for the creation of new embryonic cell lines and the destruction of new embryos. The embryonic stem cell lines approved by George W. Bush, for example, should continue to receive research funding. The Court writes that “ESC research is clearly research in which an embryo is destroyed. To conduct ESC research, ESCs must be derived from an embryo. The process of deriving ESCs from an embryo results in the destruction of the embryo. Thus, ESC research necessarily depends upon the destruction of a human embryo.” On the contrary: the embryos in question were destroyed in the past, and the research once depended upon the destruction of a human embryo, but it no longer does. It is entirely possible, and most often the case, that embryonic stem cell research continues without any human embryo being destroyed by either a private company or a federally funded actor.

Next we have the “irreparable injury absent an injunction.” The two plaintiffs in this case were Dr. James Sherley and Dr. Theresa Deisher, adult stem cell researchers who claimed competitor standing; they were eligible to sue because the embryonic stem cell policy would “result in increased competition for limited federal funding.” To put it simply, I doubt these fine researchers will get NIH funding after this. Peer review of grant applications is already hard to pass, with many researchers struggling to receive any funding at all–according to Science, “The NIH typically receives between 35,000 and 40,000 proposals a year, while the NSF gets roughly half that number. Only a quarter to a third of these proposals are ever funded”–and it is also highly political. NIH review committees use five core review criteria: significance, investigators, innovation, approach, and environment. The reputation of the investigators plays a role in grant review, as it does in peer review for scientific journals, and grant proposals must include a “Biosketch” with the CVs of the PI and other key personnel. I can’t imagine these plaintiffs helped their cause before an NIH review committee, especially when the high rate of rejected proposals gives a committee plausible deniability, and given the past grant history of one of the plaintiffs, revealed below.

Interestingly, the plaintiffs have had very different experiences with NIH funding. I used the NIH RePORTER engine to search for historical grants awarded to these researchers (if you try this, make sure to expand the fiscal year search to all years and uncheck the ‘active projects’ box). I hadn’t really used it before, so I tested it by looking up the grant history of a professor I worked for at Columbia University, Dr. Darcy Kelley. She had multiple grants listed, going all the way back to 1986, so it looks like RePORTER returns both current and historical NIH grants. A search for Theresa Deisher, however, found no grants ever awarded. Dr. Deisher, who received her PhD from Stanford in 1990, has never been awarded an NIH grant. Did she really suffer competitively from any embryonic stem cell funding? After two decades of futility, I don’t think this stunt will help her in her quest for NIH money. Side note: Deisher is clearly also in this for religious, not personal financial, reasons; she has been trying to establish a connection between abortion, vaccines, and autism with funding from a pro-life group. On the other side is Dr. Sherley, who, despite hunger-striking after being denied tenure by MIT, has managed to assemble a steady stream of NIH grants since 1988, including a grant in 2010. Has he really suffered from embryonic stem cell funding under Bush’s 2001 policy and Obama’s 2009 policy? I think standing should have been denied to these plaintiffs, one of whom couldn’t get a grant before there was competition from ESC researchers, and one of whom got grants in spite of it. There is no evidence that their success or lack thereof in obtaining grants was changed in any way by Bush’s or Obama’s policies.
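
(If you’d rather script this lookup than click through the web form: NIH now also exposes a RePORTER web API. Below is a minimal sketch of the kind of query I ran by hand; the v2 endpoint and the exact criteria and result field names are my assumptions based on the published API documentation, so double-check them before relying on the output.)

```python
# Hypothetical sketch: list NIH grants for a given PI via the RePORTER v2 API.
# The endpoint and field names ("pi_names"/"any_name", "project_num", etc.) are
# assumptions from the public docs; verify against the current documentation.
import requests

def grants_for_pi(name: str, limit: int = 100):
    payload = {
        "criteria": {"pi_names": [{"any_name": name}]},  # no fiscal-year filter: search all years
        "offset": 0,
        "limit": limit,
    }
    resp = requests.post(
        "https://api.reporter.nih.gov/v2/projects/search",
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

if __name__ == "__main__":
    for project in grants_for_pi("Deisher"):
        print(project.get("fiscal_year"), project.get("project_num"), project.get("project_title"))
```

If my manual searches are any guide, a query for Kelley or Sherley should return decades of awards, while a query for Deisher should come back empty.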

The U.S. Department of Justice will appeal this incorrect decision to the D.C. Circuit, and regardless of how one feels ethically about the use of embryonic stem cells, I think the scientific facts and the legal background make the correct ruling quite clear. Hopefully a speedy review and decision from the D.C. Circuit will restore funding to embryonic stem cell researchers, so they can continue their important work in battling the diseases that plague humanity.


Safe Science.

Recently a friend convinced me to watch Breaking Bad, a show on AMC. (Mild Spoilers Ahead!)

xkcd

The protagonist of the show is a high school chemistry teacher named Walter White who, upon discovering he has lung cancer, decides to start a life of crime in order to earn money for his family before he dies. Specifically, he decides to apply his chemistry expertise to become a crystal meth supplier, and with his Caltech-honed technique his meth quickly surpasses the competition in quality, while his synthesis route obviates the need to scrounge around for pseudoephedrine. I’ve never purchased Sudafed, but my more cold-prone (drug-addicted?) friends tell me that it’s impossible to buy in quantity from the same drug store. When Walter gets into trouble, he uses chemistry to get out of it, sort of like a MacGyver or even a Michael Westen, but more rooted in pure science. A great example of this, one that actually made me regret not studying more chemistry, is when Walter uses the highly explosive mercury fulminate to dominate a local mob boss. Suffice it to say, we never made that in chem lab (although I believe we learned the formulas for making meth, or was it synthetic cocaine, in orgo).

Besides being a great show, Breaking Bad might also inspire previously bored high school students to pay attention in chemistry class by revealing the subject’s practical, sometimes everyday, applications. Of course, a chemical engineering degree is a great way to make money even without resorting to drug dealing.

Watching Breaking Bad and thinking about the practical applications of chemistry reminded me of a thrilling story I once heard about hiding gold. As in the Purloined Letter, the best place to hide anything is in plain sight.

During World War II, Jewish scientists from Germany fled to (among other places) Copenhagen’s Niels Bohr Institute. Two Nobel Prize-winning scientists, Max von Laue and James Franck, brought with them their Nobel Prize medals, which at that time were still minted in 23 carat gold (since 1980 they’re just 18 carat with 24 carat plating; hardly worth winning anymore, I’d say). When the Nazis invaded Denmark, another (future) Nobel laureate and chemist George de Hevesy decided to hide their gold medals. Instead of burying them, or secreting them in some hidden drawer, he thought like a scientist and dissolved the gold in aqua regia; stored the mundane, reddish-brown solution in his lab among all the other bottles; then fled to Sweden. The Nazis, upon searching his lab, saw only bottles of common chemicals. After the war, de Hevesy returned to Copenhagen and found his bottles undisturbed. He precipitated the gold out of solution and the Nobel Foundation recast the medals for von Laue and Franck.

So instead of hiding gold Krugerrands or kilobars in expensive safes, easy pickings for whatever Neal Caffrey comes along to crack them, you should be hiding your gold dissolved, in bottles of “red wine vinegar” or “maple syrup”!
Just for fun (and since I know amateur gold scavengers Google this stuff), here’s How to Dissolve and Precipitate Gold Using Aqua Regia:

Aqua regia is a roughly 1:3 molar mixture of concentrated nitric acid and concentrated hydrochloric acid (one part nitric to three parts hydrochloric). Its pH is near zero, so it’s HIGHLY acidic. It is so named (“royal water”) because it has the unique ability to dissolve noble medals like gold and platinum (see what I did there?).

Au (s) + 3 NO3- (aq) + 6 H+ (aq) → Au3+ (aq) + 3 NO2 (g) + 3 H2O (l), and
Au3+ (aq) + 4 Cl- (aq) → AuCl4- (aq).

Solid gold reacts first with the nitrate from the nitric acid, which oxidizes the gold into the gold(III) ion. However, this equilibrium lies far to the left, meaning very little gold is actually dissolved… yet. (This is why nitric acid alone can be used to dissolve away other metals and compounds during the purification of gold: the gold itself stays put.) The second reaction removes Au3+ from the first by converting it into the tetrachloroaurate ion (AuCl4-), which drives the first reaction to the right (Le Chatelier’s principle).
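
To put rough numbers on this (a back-of-the-envelope sketch only; I’m assuming, for illustration, a pre-1980 medal of about 175 g of 23-carat, i.e. roughly 96%, gold):

175 g × 0.96 ≈ 168 g Au ≈ 168 g ÷ 197 g/mol ≈ 0.85 mol Au
HNO3 needed (3 per Au): 3 × 0.85 ≈ 2.6 mol ≈ 160 mL of ~15.8 M concentrated nitric acid
HCl needed (4 per Au, supplying the Cl- and, together with the nitric acid, plenty of H+): 4 × 0.85 ≈ 3.4 mol ≈ 280 mL of ~12 M concentrated hydrochloric acid

In practice you’d use a healthy excess of aqua regia, but the point stands: a medal or two dissolves into well under a liter of unremarkable-looking liquid.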

Now, to precipitate the gold and get it back into a solid state, I found this helpful patent online. You add a strong base like NaOH until the pH of the solution rises to 2.8-3.0. At this higher pH, the nitric acid remaining in the solution will no longer oxidize gold and redissolve what we precipitate. After filtration, add butyl stearate (an emulsifying agent that prevents coalescence of the gold particles during precipitation) and sodium sulfite (I’ve also seen sodium metabisulfite used elsewhere). The sodium sulfite (or metabisulfite) forms sulfur dioxide, a strong reducing agent:

Na2SO3 + 2 H+ → 2 Na+ + H2O + SO2

Which leads us to (I think):

2 AuCl4- + 3 SO2 + 6 H2O → 2 Au (s) + 3 SO42- + 8 Cl- + 12 H+
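
Continuing the rough arithmetic from above (same assumed ~0.85 mol of dissolved gold), the last equation calls for 1.5 mol of SO2, and therefore 1.5 mol of sodium sulfite, per mole of gold:

0.85 mol Au × 1.5 ≈ 1.3 mol Na2SO3 ≈ 1.3 × 126 g/mol ≈ 160 g of sodium sulfite

Again, in practice you’d add a modest excess to make sure all the gold comes down.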

Gold dust after precipitation. I suppose you could also hide your gold like this, as "clay".

Filter, wash the gold powder with acetone and water, bake, melt, and you’re back to your solid gold at ~98% recovery!

The Antibiotics Shortage and How to Solve It.

update (11/5/2010) :: The NYTimes published a good article about subsidizing antibiotic research: http://www.nytimes.com/2010/11/06/health/policy/06germ.html?ref=antibiotics

The U.S. Should Establish a No-Fault Antibiotic Injury Program.

Few inventions in the history of mankind have saved more lives than antibiotics. Antibiotics are compounds that can kill, or inhibit long enough for our immune systems to kill, microorganisms–bacteria and, in the broader antimicrobial sense, fungi and protists. In 1928, Alexander Fleming discovered penicillin, a substance exuded by a humble fungus that would spawn the modern pharmaceutical industry and revolutionize our lives in both war and peace. Manufactured in time for WWII, penicillin saved countless lives that, in previous wars, would have been lost to bacterial infection. Despite the ever-increasing deadliness of modern weapons, equally rapid innovations in medicine and pharmaceuticals, especially antibiotics, have steadily lowered the likelihood of death in war. In every American war until WWII, more soldiers died of disease (or accident) than on the battlefield: in the Civil War, 224,097 died of disease or accident compared to 140,414 battle deaths, and in WWI, over half of all deaths were attributable to disease. WWII was the first war in which more soldiers were killed by fellow man than by microorganism, with under 30% of deaths due to disease. In times of peace, antibiotics are used from the beginnings of our lives to the very end, everywhere from combating the bacteria that kill women in childbirth to fighting the pneumonia that wracks our aged lungs.

And yet these vital drugs that we daily take for granted, no longer fearing every minor scratch or major gash, are quickly running out.

The first problem is bacterial resistance. Over time, bacterial strains mutate and, through natural selection, evolve to become immune to once-effective antibiotics. This process is exacerbated when patients do not take their full course of antibiotics: feeling better, they stop taking their medicine early, allowing the small number of bacteria still alive (the ones most resistant to the drug) to grow, restart the infection, and spread to other hosts. Antibiotics are also sometimes overprescribed (for the common cold, for example), which gives bacteria exposure to the antibiotic and starts the resistance clock. Bacteria get further exposure through our use of antibiotics in agriculture–approximately 60% of antibiotic usage in the United States. Bacteria can exchange genes that confer resistance across strains, and even across species, through plasmid transfer. This lets bacterial strains accumulate resistances and leads to the development of multiple-drug-resistant (MDR) bacteria. Bacteria can also increase their expression of resistance genes, for example by producing more of the efflux pumps that expel the antibiotic from the cell. Microbiologist Kenneth Todar writes,

70 percent of the bacteria that cause infections in hospitals are resistant to at least one of the drugs most commonly used for treatment. Some organisms are resistant to all approved antibiotics and can only be treated with experimental and potentially toxic drugs. An alarming increase in resistance of bacteria that cause community acquired infections has also been documented, especially in the staphylococci and pneumococci (Streptococcus pneumoniae), which are prevalent causes of disease and mortality. In a recent study, 25% of bacterial pneumonia cases were shown to be resistant to penicillin, and an additional 25% of cases were resistant to more than one antibiotic.

A few famous examples of MDR bacteria include multiple-drug-resistant tuberculosis (MDR TB) and methicillin-resistant Staphylococcus aureus (MRSA). Drug-resistant TB requires treatment with more expensive and more dangerous second-line TB drugs, and if the TB develops resistance to those second-line drugs as well, there are few good options for a third line. MDR TB is a serious public health problem in the developing world, where the WHO is seeking to eradicate TB. Resistant staph infections are a pressing concern in the First World: in the United States, a Department of Health and Human Services study estimated 390,000 MRSA-related hospitalizations in 2005, and researchers attributed an estimated 17,000-19,000 deaths to MRSA. It’s important to remember that, despite antibiotics, infections remain the second-leading cause of death in the world, and this is a problem not limited to the Third World but present right in our backyards, in our local hospitals and community health clinics.

Despite the steadily shrinking arsenal of effective antibiotics, drug companies are not rushing out new antibiotics, nor do they have many candidates in the clinical trial pipeline. For example, despite rising resistance among gram-negative bacteria, a “study released about a year ago by the Infectious Diseases Society of America found no drugs in middle- or late-stage clinical trials directed specifically at Gram-negative organisms.”

Why are drug companies unexcited about inventing life-saving products? Antibiotics are less profitable than drugs like Lipitor and Viagra, which treat chronic conditions and are chronically consumed; antibiotics are taken in short treatment courses, and doctors, cognizant of the resistance problem, are loath to overprescribe them. Bacterial resistance can render a drug obsolete, so drug companies must bear the risk of losing all profitability even before the life of the patent expires.

Because of the low profit margins on antibiotics, drug companies are particularly sensitive to profits lost to drug liability suits–the cost of hiring lawyers and the risk of losing millions. These conditions mirror those faced by vaccine manufacturers in the 1980s, who threatened to abandon the market because of the threat of lawsuits over vaccine-related injuries. Congress, fearing a vaccine shortage and a public health crisis, passed the National Childhood Vaccine Injury Act of 1986, which shielded vaccine manufacturers from many tort claims and established a Vaccine Court to adjudicate payments from a no-fault injury fund. Instead of suing a vaccine manufacturer and proving negligence, individuals must first file a claim before the Vaccine Court (in the Court of Federal Claims) and merely demonstrate that they were injured by the vaccine. If successful, they are paid damages from the Vaccine Fund, which is financed by small surcharges on every vaccine purchase. This, apparently, assuaged the fears of vaccine manufacturers, and our nation’s (and the world’s) vaccine crisis passed. A similar system should be created for antibiotics; perhaps it alone would be sufficient to encourage antibiotic research.

This system benefits not just drug companies, but plaintiffs as well. Under the vaccine injury program, for example, the legal fees for bringing a claim forward are reimbursed by the Vaccine Fund, not out of the plaintiff’s pocket, or out of the plaintiff’s damages check. Additionally, victims have a difficult time winning tort claims against large multinational corporations and their legions of lawyers. Companies have a huge incentive to resist and drag out even valid injury claims, for fear that one winning suit will become the seed for hundreds of others. As with medical malpractice suits, even if the odd plaintiff wins, the vast majority will lose and end up with nothing. With medical malpractice, the doctors “lose” as well because they have to spend so much money on lawyers/insurance. Analogously, drug companies lose when they are forced to spend money on lawyers despite the low profit margins of antibiotics. With an injury fund, the majority of legitimately injured claimants can receive compensation and funds for future medical care, and drug companies can keep their profits.

While shielding drug companies from liability goes against the sentiment in last year’s Wyeth v. Levine ruling, this plan is also beneficial from a regulatory point of view. If antibiotic suits are preempted, the safety judgments on antibiotics will be performed by FDA instead of state juries. Where juries see only the terrible, individual harms in front of them and may be careless in punishing drug companies (and disincentivizing their activities), FDA is in a better position to perform a holistic cost-benefit analysis of drug safety vs. drug accessibility. They see all those who would suffer if deprived of a drug, even a drug that carries dangerous risks. This concern is true of all drugs, but I think particularly true in cases of vaccines and antibiotics, where the public health concerns are tremendous.

Under this plan, the vast majority of legitimate injury victims would receive fair compensation through a claims court, without needing to endure a long and painful tort process. The plan could save millions of lives if antibiotic manufacturers, like vaccine manufacturers in the 1980s, find a no-fault liability system sufficient to re-stimulate their investment in antibiotic research. Government subsidy of pharmaceutical research on antibiotics and increased government grants for university development of antibiotics could also go a long way toward restoring healthy profitability and incentives, and this tort scheme would supplement any additional incentives the government chose to enact. These life-saving drugs could perhaps become even cheaper as drug companies no longer need to build legal war chests against tort suits, or no longer need to front the whole sum of R&D costs, increasing the availability of these drugs in the Third World and saving millions more lives abroad.

In these times of corporate repugnance, it seems distasteful to erect yet another shield for large businesses to protect themselves from judgment at the hands of the people, before twelve angry men. Yet the alternative, a shortage of novel antibiotics to combat newly-mutated bacterial strains, is far more perilous to the public health.

My quick take on Cryonics.

I think Josh pretty much covered everything in his final summary. Like Josh and others, given the technology, I wouldn’t hesitate to freeze myself, especially if my assets would otherwise be eaten up by the estate tax. But I do have two thoughts to add.

First, when do you freeze yourself? Time and timing are both important. To maximize your chance at future resurrection, you should probably freeze yourself before you die, as Josh points out. As soon as you die, your brain cells are deprived of oxygen and begin a rapid cascade into death. To take a more extreme example, if you’re shot in the face, chances are future scientists won’t bring you back, cryonics or no. This means you basically have to decide, at some point before death, that you want to go into the deep freeze. There’s a chance you won’t ever be brought back, in which case you’re exchanging a few hours, days, or perhaps months or years of certain life for the possibility of a presumably longer stretch of life later. I’d have to weigh the expected value of the cryonically extended life against the probable amount of life I’d be giving up by freezing myself before death, and I’d be far more worried about that sacrifice than about the specific monetary cost; I don’t know if I’d want to “die early.”
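
To make that trade-off concrete with purely illustrative numbers: say freezing early costs L years of near-certain remaining life, revival has probability p, and revival buys Y extra years. The gamble pays off in expectation only if p × Y > L.

p = 5%, Y = 100 years, L = 1 year: 0.05 × 100 = 5 > 1, so freeze early
p = 0.5%, Y = 100 years, L = 1 year: 0.005 × 100 = 0.5 < 1, so wait

The hard part, of course, is that p and Y are pure guesswork.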

Second, in response to Tom’s idea of downloading a digital copy of your brain: I’m not sure I’d benefit from that. Josh might blog later about how he believes the idea of a unified consciousness is false, but the fact is that we perceive ourselves as a unified being, a unified self. I am the same person I was when I was 5 years old, even though my neuronal patterns are very different. More importantly, I’m making decisions right now… right now, n-now… er, now… that will affect me, my future self. So the downloading thing is only valuable if I think that I, a unified self, will be able to appreciate being alive in the future. One of Josh’s links had a great example of why this might not be the case. Suppose I (call me David 1) downloaded my exact neuronal signaling into a computer and then uploaded it perfectly into a cloned body. The clone (David 2) would think that it was me; it would perceive itself to be David 1, the unified self. But what if I were still alive? It would be clear to me, the true and original David 1, that David 2 was an imposter, even though he might genuinely believe himself to be the original. So if I died and my memories were simply transplanted into a cloned body, I think that clone (or even several clones) would consider himself to be “me”, but they’d still be imposters. That means I wouldn’t enjoy the fruits of the process; I’d still fear eternal death just the same. There is a difference between being able to replicate oneself and being able to live forever as a unified self. And if the only advantage of the “digital download” is that there’s someone running around with my genetic material thinking he’s me… well, like Josh, I think I might as well just have natural genetic progeny.

Genius of Crowds

We already expect hearty collaboration among PhDs and literati. Even in the middle of the Cold War, Soviet and American scientists collaborated in medicine and public health. This year’s biggest breakthrough in science, as profiled by the journal Science, was the reconstruction of a 4.4 million year-old skeleton of Ardipithecus ramidus, a human evolutionary ancestor, and its environment. According to the editor of Science, “it represents the culmination of 15 years of highly collaborative research. Remarkably, 47 scientists of diverse expertise from nine nations joined in a painstaking analysis of the 150,000 specimens of fossilized animals and plants.”

The age of the internet has allowed unprecedented communication and interconnection, and it now enables everyday people to cooperate in ways unimaginable even a few years ago. In a bit of a deviation from my previous post criticizing the occasional irrationality and short-sightedness of the American public, here’s a post on the newly tapped potential of the masses.

Every year for the past nine years, the NYTimes has assembled a list of the best ideas of the year. One of my favorites this year was titled ‘Massively Collaborative Mathematics’. A Cambridge professor, Timothy Gowers, posted a difficult math problem on his blog (proving the Density Hales-Jewett theorem) and invited commenters to help solve it.

“The resulting comment thread spanned hundreds of thousands of words and drew in dozens of contributors, including Terry Tao, a fellow Fields Medalist, and Jason Dyer, a high-school teacher. It makes fascinating, if forbiddingly technical, reading. Gowers’s goals for the so-called Polymath Project were modest. ‘I will regard the experiment as a success,’ he wrote, ‘if it leads to anything that could count as genuine progress toward an understanding of the problem.’ Six weeks later, the theorem was proved.”

I read about another, in some ways even more impressive, example of this phenomenon in the Marginal Revolution blog. The Defense Advanced Research Projects Agency (DARPA) stationed 10 weather balloons at random locations across the United States, and announced a prize for anyone, or any group, who could find the coordinates of all 10 balloons. An MIT group created a website/pyramid scheme attracting thousands of collaborators. As one of the team members explained it, “Each balloon had a value of $4,000. If you came directly to us without a referral, you got $2,000, and the charity got $2,000. If you came with one referral, if one person referred you to us, then you still got your $2,000, the referrer gets $1,000 and the charity gets $1,000. This goes on from $2,000, $1,000, $500, $250, $125, and so on.” The MIT team ended up locating all 10 balloons in under 9 hours.
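
The arithmetic is what keeps the scheme solvent: the referral payouts for a single balloon form a geometric series that can never exceed the $4,000 bounty, no matter how long the referral chain grows.

$2,000 + $1,000 + $500 + $250 + $125 + … = $2,000 × (1 + 1/2 + 1/4 + …) = $2,000 × 2 = $4,000

So even an arbitrarily long chain of referrers costs at most $4,000 per balloon, and whatever isn’t claimed goes to charity.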

I’m excited to see what other projects can harness the wisdom of crowds in the future. From Yahoo Answers and Wikipedia, to balloon-finding and scientific collaboration on advanced projects, the potential seems limitless.