In December 1982, Dr. Jack Yoffa of Syracuse, New York, took Zomax, a painkiller, just before driving to the hospital for minor surgery. About halfway there, Yoffa began to itch and turn red. Within 60 seconds, he was unconscious. His car hit a guardrail, crossed a three-lane highway (narrowly missing several cars), knocked over a light pole, and landed in a ditch. Yoffa had experienced anaphylactoid shock, a not uncommon—yet often fatal—reaction to Zomax, which the drug’s manufacturer, McNeil, had neglected to publicize despite numerous reports of similar responses among other users. While McNeil had managed to keep reports of several deaths associated with Zomax from the public, a local TV newscaster’s March 1983 interview with Yoffa finally forced the company to withdraw the drug from the market.

Yoffa’s story is perhaps the most disturbing of the many examples of scientific misconduct documented in Robert Bell’s Impure Science: Fraud, Compromise, and Political Influence in Scientific Research. Yet even in cases where the consequences are not life-threatening, Bell, a professor of economics at Brooklyn College, makes it clear that “malfeasance and compromise subvert our scientific and technological base, thereby weakening the competitiveness of the economy in which we earn our living.” Bell maintains that science, long treated as a “sacred cow,” is really only as “pure” and unbiased as the “political machinery that dispenses its patronage and funding.”

The events surrounding the marketing of Zomax provide a telling example of how the scientific process is corrupted by conflict of interest, which Bell defines thus: “when the scientist who is supposedly making an objective judgment stands to benefit or lose by that decision.” In this case, the McNeil representatives charged with gathering information on the adverse effects of Zomax were also promoting the drug to doctors. Conflict of interest also occurs when scientists, funded by grants from corporations, conduct their research so as to help market or ensure approval of a drug or medical device. Such ineffective and/or potentially dangerous products as amoxicillin, the sedative Versed, and the Bjork-Shiley convexo-concave heart valve were all promoted by scientists who should have known better. But the prime example of compromise in scientific research is the Pentagon, whose advisory Defense Science Board consists largely of top executives from the Pentagon’s major contractors. In the Pentagon, conflict of interest is institutionalized to the point that negligence, secrecy, and downright deceit constitute a modus operandi. “Concurrency,” the “practice of performing fundamental research and development while simultaneously mass producing the item being researched,” is especially prevalent here (although it is also common in other areas of science like the pharmaceutical industry). The Apache helicopter, the Navy’s A-12 Stealth plane, and the B-1B bomber were all victims—at taxpayer expense—of this approach to research and development.

Political influence results in similar abuse of public trust. Politics apparently played a role in the National Science Foundation’s 1986 decision to locate its Earthquake Engineering Research Center not at the University of California-Berkeley, home to the top researchers in the field and the logical spot for such a facility, but rather at the State University of New York-Buffalo. Involved in this decision (although the NSF panel denied being pressured) was Buffalo Congressman Jack Kemp, who “made certain that the White House knew how important the project was to him,” according to his press secretary. Political influence also figures in the multimillion (and even billion) dollar “super science” projects funded directly by Congress, which “are political science by definition, since Congress is constitutionally designed to respond to political pressure.” Enterprises like the Space Shuttle, the Hubble Space Telescope, the Superconducting Supercollider, and the Strategic Defense Initiative are what Bell calls “casualties of patronage.” The dubious Superconducting Supercollider, for instance, gathered 94 percent of its support from representatives of states submitting proposals for its construction and their neighbors. The pork barreling surrounding science projects means politicians blindly push for projects that subsequently balloon out of their control, moving quickly from what Ernest Fitzgerald (an Air Force cost cutter) calls the “too early to tell” stage of development to the “too late to stop” one.

As political pressure leads to questionable allocation of funds, pressure to publish or perish leads to questionable, if not fraudulent, research results. Perhaps the most publicized example of fraud in scientific research was the case of Dr. David Baltimore, a Nobel Prize-winner, who contributed to a paper based on faked data in the 1980s. The scientific community’s reaction to this case—bombarding the media and Congress with calls to end the investigations of Baltimore—outrages Bell. Yet even more, he decries this same community’s silence in other cases of compromise, political influence, and fraud. In fact, the lack of reaction to scientific misconduct is a secondary yet significant theme running throughout his book. Bell emphasizes again and again that silence equals guilt. As the hero of one of his case studies, an anthropologist who lost his NSF grant because of the rumors his rivals on the peer review panel circulated about him, reasons: “Ultimately . . . every scientist that individually and collectively fails to confront abuses and wrongdoing in the system is contributing to corruption in the system.” Bell also champions the victimized whistleblowers in these cases, who are often denounced and even investigated themselves for bringing unfavorable attention to (and thus threatening the power of) universities and funding agencies. Margot O’Toole (who could not find work after she exposed the Baltimore scandal), Dr. Erdem Cantekin (whose superiors moved his office, erased data from his hard drive, and tried to revoke his tenure when he challenged a colleague’s endorsement of amoxicillin), and Ernest Fitzgerald (who was fired by President Nixon from his job as an Air Force cost analyst for exposing the deficiencies of the C-5A transport) illustrate that fraud frequently pays while whistleblowing does not.

By combining breadth (in the variety of cases he examines) and depth (in his analysis of each case), Bell provides an excellent overview—for the scientist and the layperson alike—of the causes and consequences of scientific misconduct. He thoroughly surveys the available evidence—court cases, government investigations and reports, testimony given under oath before congressional committees, documents requested through the Freedom of Information Act, and personal interviews, as well as the more usual newspaper and journal articles, books, and radio transcripts. His tone may be too alarmist for some, but his work nevertheless raises critical questions about the practices and ethics of the scientific community.

One of these questions is “Where Do We Go From Here?”—the title of Bell’s concluding chapter. While Bell could have said more on the subject, he does address himself to three potential solutions. Self-regulation and bureaucratic oversight are obviously inefficient (the number and names of the committees, reports, and agencies that he cites throughout his book are enough to strike horror in the heart of any decentralist). Bell therefore proposes: one, separating funding and control in scientific research; two, requiring universities that receive federal research money to prevent or at least publicize conflicts of interest; and three, refining the Federal False Claims Act, which allows one to sue an individual or organization that has defrauded the government, to further protect the whistleblower from retaliation. He hopes that these remedies will force scientists “to live up to the strictures of the scientific method.” Yet he neglects to mention perhaps the most essential ingredient in any plan to reform science: vigilant media coverage of the type of shenanigans Impure Science describes. For if misconduct is as rampant in the scientific community as Bell suggests, something more than a handful of new regulations is needed to shake up its patronage system. Indeed, the future of science may depend on similar exposés.
[Impure Science: Fraud, Compromise, and Political Influence in Scientific Research, by Robert Bell (New York: John Wiley & Sons) 301 pp., $22.95]