II. WHAT ARE THE RULES OF THE GAME?


Schmaus reports that Harriet Zuckerman suggests we "distinguish between fraud as a deviation from the moral norms of science and negligence as a deviation from the methodological norms. She sees two sorts of cognitive errors in science: reputable ones and disreputable ones."(12) Thus, one may consider these deviations as two separate cases: 1) the scientist does not follow the methodological rules of the game, or 2) the scientist does not follow the moral rules of the game.

The Conventionally Accepted Methodological Norm:

The Scientific Method

We now have the opportunity to consider the commonly accepted methodological and moral norms of science. The methodological norms in their simplest form are generally recognized as the Scientific Method. The scientific method is often listed as a discrete series of steps that all scientists should follow in order to do "proper science." A conventional definition of the method is "systematic, controlled observation or experiment, whose results lead to hypotheses; which are found valid, or found invalid, through further work; leading eventually to reliable theories [or a renewal of the cycle beginning with a reconsideration of the original hypothesis]."(6) The scientific method says to the impressionable reader, "Follow me and you simply cannot go wrong in the performance of your laboratory experiments." It has been argued elsewhere, and for good reason, that the scientific method is a myth.(1,6) The reader will, one hopes, agree that the scientific method must be treated as an ideal set of recommendations that scientists should strive to follow, and that there is no guarantee it will be applicable to every instance of scientific research. It is an ideal that each individual scientist must choose to incorporate into their work. There are many examples that clearly illustrate this argument.(6)

In many general science texts, the scientific method is stated quite explicitly. Students in American public schools, who may never have the opportunity to participate in a research environment, are frequently taught that science works only by following these methodological norms. This is surely harmful to the public image of science: science is not as well-defined an undertaking as the method suggests. It is refreshing to discover a text that exposes the myth of the method. Brown and LeMay have written

"There is no fail-proof, step-by-step scientific method that scientists use. The approaches of various scientists depend on their temperament, circumstances, and training. Rarely will two scientists approach the problem in the same way. The scientific approach involves doing one's utmost with one's mind to understand the workings of nature. Just because we can spell out the results of science so concisely or neatly in textbooks does not mean that scientific progress is smooth, certain, and predictable.".(13)

Although not expressly stated in their text, Brown and LeMay have indicated that there are two types of science: textbook and frontier. Bauer argues that "The scientific method doesn't allow for us to distinguish between reliable textbook science and unreliable frontier science."(6) He has stated that "Textbook science is uncontroversial" and that "We all learn science from textbooks, and we can hardly fail to be impressed by the range and reliability of the knowledge that has been amassed..."(6) Textbook science and frontier science are differentiated from one another in that "... we rarely remain excited by this [textbook science], at least not for long. What we do get excited about is the very latest stuff [frontier science]... what gets into the newspapers and magazines and onto television."(6) Bauer offers the following illuminating summary: "The trouble is that we use the same word 'science' to describe both the reliable textbook stuff and the exciting frontier stuff."(6)

Returning to the work of Brown and LeMay, we find that they also note that

"The path of any scientific study is likely to be irregular and uncertain; progress is often slow and many promising leads turn out to be dead ends. ... we will see that serendipity has played an important role in the development of science. What we will often miss discussing are the doubts, conflicts, clashes of personalities, and revolutions of perception that have led to our present ideas.".(13)

These authors are to be commended for painting so accurate a portrait. Both the student and the general public are done a grave disservice when the methodological norms of science are explained only in terms of the scientific method.

The Conventionally Accepted Moral Norms:

Merton's Contribution

R.K. Merton first suggested in 1942 that the behavior of scientists could be characterized by a coherent set of norms. Ziman reminds us that "The Mertonian norms are not, of course, a precisely defined and standardized code..."(14) Schmaus advises that "These norms are shared only among members of the scientific community and not by society at large."(12) These norms are universalism, communalism, disinterestedness, and organized skepticism. R.H. Brown offers the following insightful interpretations:

"Universalism requires that science be independent of race, color, or creed and that it should be essentially international. Communalism requires that scientific knowledge should be public knowledge; that the results of research should be published; that there should be freedom of exchange of scientific information between scientists everywhere, and that scientist should be responsible to the scientific community for the trustworthiness of their published work. Disinterestedness requires that the results of bona fide scientific research should not be manipulated to serve considerations such as personal profit, ideology, or expediency, in other words they should be honest and objective; it does not mean that research should not be competitive. Organized skepticism requires that statements should not be accepted on the word of authority, but that scientists should be free to question them and that the truth of any statement should finally rest on a comparison with observed fact.".(15)

Ziman has suggested that originality should be considered a primary norm as well.(14) Even though originality is an essential characteristic of science, it was not included in Merton's initial listing. Originality requires that scientific research be novel. Ziman explains that "An investigation that adds nothing new to what is already well known and understood makes no contribution to science."(14) He also notes that organized skepticism is a misnomer: "although skepticism is strongly encouraged within the scientific community, it is not organized very systematically..."(14) Incidentally, Ziman offers the acronym CUDOS (communalism, universalism, disinterestedness, originality, skepticism) as a convenient reminder of these five norms.

For each of the proposed norms, it is possible to point out existing flaws and weaknesses. The requirement of universalism, that science be international, seems to hold up well. However, Ziman notes that "The tragic experience of Soviet genetics under Lysenko is direct evidence of what can happen when the norm of universalism is not respected."(14) He also suggests that there is a "...tendency for groups of specialists to discriminate against the opinions of outsiders and lay-persons." The norm of disinterestedness is particularly flawed: as long as scientists are human, science cannot truly proceed in a disinterested and objective manner. Schmaus has concluded that

"The growth of scientific knowledge does not depend on a norm of disinterestedness peculiar to the scientific profession. And it may be unjust or unfair to require that scientists be disinterested -- unlike the rest of us.".(12)

It has been suggested that one can distinguish between special moral norms (for instance, the Mertonian norms) and general moral norms (for example, honesty).(12) Schmaus argues that "Scientists who commit fraud thus violate not a moral rule that applies only to them and enjoins them from self-interested activity, but a moral rule that applies to everybody and requires scientists to do their duty."(12) Bauer claims that "The system works better, the more each member of it strives to behave according to the ideals expressed in the myth of the scientific method and the Mertonian norms," and that "Scientists like all other professional people should strive to the utmost to behave ethically, for without that science will not work properly."(6) In general, there seems to be agreement on the validity and importance of both special and general moral norms. There is much less agreement, however, on how violators of the norms should be disciplined.

How are offenders to be punished?

When scientific negligence does come to light, how is it conventionally dealt with? If the erroneous work is discovered by its originator after it has entered the public domain, then corrections to the work or, in more severe cases, retractions are typically announced. For published accounts of the research, this is an embarrassing but relatively painless mechanism for righting the wrong. If the erroneous work is discovered by the perpetrator's colleagues, then corrections or retractions are made in a similar fashion. There may be those who think badly of the negligent scientist, but there are undoubtedly many more who will be saying, "I'm glad that was the other fellow, and not me."

This practice is commonly accepted among scientists. Typically, such mistakes will not cause much of a fuss; however, numerous "errors" originating from a single scientist's laboratory will undoubtedly arouse suspicion. Baltimore has noted that "Authors of scientific papers risk everything when they publish their work. If a researcher develops a reputation for error, other scientists scrutinize his or her work all the more closely. And because funding decisions are made on the basis of peer review, error-prone scientists soon lose their financial support and are no longer able to function. This process is the ultimate safeguard against errors, whether they are consciously perpetrated or not."(9) Thus, for minor infractions, disciplinary actions resulting from negligence may consist of mere formalities, customarily exercised internally; the general public and press need not interfere. At the opposite end of the spectrum, however, there are those who believe that "negligence should [also] be regarded as grounds for debarment."(12)

Debarment of a negligent researcher might at first seem an extreme punishment, and obviously not every instance of negligence should be dealt with in this manner. By analogy, American criminal law distinguishes two forms of manslaughter: voluntary and involuntary. It is conventionally agreed that persons guilty of voluntary manslaughter receive harsher penalties than those responsible for involuntary manslaughter. Realistically, this is an oversimplified analogy; there are many "degrees" of wrongdoing associated with each type of misconduct. A jury of impartial persons is always required to decide the nature and degree of punishment in each individual case. The accused are innocent until proven guilty, and they are aware of their specific rights from the moment accusations are made. This system of justice cannot be perfect, but it is a standardized system, and we believe it is the best system of justice presently available. Would it not be best to handle instances of scientific misconduct in a similar fashion? Bauer has remarked that "Fraud in science should be dealt with just as it is dealt with in business, government or anywhere else: unhysterically, under the rules we have evolved that safeguard the accused until such time as guilt may have been proven beyond reasonable doubt."(6)

Disciplinary actions against fraudulent researchers do tend to be more serious than those for negligence; the punishment normally fits the crime, just as it should. Resignation and debarment are common.(1,2) After reformed bank robbers have paid their debt to society, they are not offered employment as security guards. Likewise, deceitful scientists are no longer allowed to participate in science, for the institution of science demands mutual trust and cooperation among its players. Schmaus has argued that "...negligent work gives rise to occasions in which scientists are tempted to commit fraud."(12) He suggests that "The attempt to control fraud indirectly through discouragement of negligence may prove more fruitful than the search for a way to prevent scientists from acting selfishly in response to career pressures." This analysis seems flawed; it would be unrealistic and extremely difficult to place serious sanctions on researchers guilty only of carelessness in the laboratory. First of all, whom would we appoint to act as the "science police"? Perhaps a better question is: who would even want the job? Furthermore, the argument that negligence breeds fraud is unconvincing, and Schmaus offers no suitable evidence in support of this claim.

Until recent years, it was assumed that science could look after these matters by itself. Science was considered autonomous; outsiders were not to concern themselves with its affairs. Presumably, the rare instances of scientific misconduct were dealt with fairly, judiciously, and in a timely manner. Evidently, this is not the case. Congressional investigations into allegations of foul play in science have become commonplace, and an "Office of Scientific Integrity" has been established within the U.S. National Institutes of Health. Perhaps scientists are beginning to lose portions of their autonomy.

Who are the "watchdogs of science"?

The Office of Scientific Integrity (OSI) was created in March 1989 to investigate allegations of scientific misconduct. David Hamilton reports that the OSI has

"...adopted an unusual strategy for conducting its investigations. Instead of collecting evidence for use in a public hearing before an administrative or criminal law judge, the office employs an approach that director Jules Hallum calls a "scientific dialogue," aimed at ferreting out the scientific truth behind an allegation of misconduct. The scientific dialogue is an ambitious attempt to keep the process of investigating misconduct out of the hands of lawyers by keeping the focus on the scientific issues in any given dispute.".(16)

Many scientists are critical of this tactic, for the OSI investigation gives accused scientists no right to confront their accusers or to review potentially incriminating evidence gathered by the OSI. The OSI first drafts a report based upon its findings. The document is then reviewed for thoroughness by the NIH director and the Office of Scientific Integrity Review, an office within the Public Health Service. The conclusions of the OSI eventually come before the assistant secretary for health, who renders the final verdict of guilt or innocence and chooses the manner of punishment. Researchers found guilty of misconduct may be forced to work under supervision, be suspended from NIH committees, or be "debarred" from receiving federal grants for several years.(16)

Hamilton asks, "If OSI fails, what will replace it?" He suggests that "the scientific community could end up bringing upon itself a federal investigative apparatus more similar to the 'science police' than the OSI." The OSI directors have defended the scientific dialogue approach, arguing that "a legalistic approach to misconduct investigations would reduce scientists to 'expert witnesses'... -- a serious loss to the interests of science."(16) Apparently many scientists would like to secure the same kind of protection that they would receive in a court of law. On the other hand, OSI deputy director Clyde Watkins reminds us that "In a legal forum, you lose the ability to distinguish misconduct from honest error."(16)

Technology Review has published an article titled "John Dingell: Dark Knight of Science."(17) Its headline ran: "While scientists cringe, one congressman crusades to rout misconduct in U.S. research universities." Exactly who is this Dark Knight of Science? Rep. John D. Dingell, Jr. (D-Mich.) is chair of the House Energy and Commerce Committee as well as its Subcommittee on Oversight and Investigations. Wade Roush, a doctoral student in MIT's Program in Science, Technology, and Society, reports that

"Dingell has taken on such prominent scientists as Nobel Prize-winning biologist and Rockefeller University president David Baltimore, Stanford president Donald Kennedy, and new director of the National Institutes of Health, Bernadine Healy. All three crossed Dingell in hearings conducted by his subcommittee -- and all three are still licking their wounds.".(17)

According to Roush,

"The issue in the clashes between Dingell and the scientific community is not whether fraud or misconduct occurs in science -- which is, after all, a human enterprise -- but whether scientists, universities, and funding agencies like NIH can respond adequately when they do happen. Are allegations of misconduct thoroughly investigated? Can scientists judge each other without conflicts of interest? Are whistle blowers protected from reprisals? Dingell says that he wants scientists to police themselves. He construes their failure to do so as an invitation to step in on behalf of the taxpayers who are footing the research bill.".(17)

Roush suggests that Dingell will "capitalize on tips from disaffected insiders" in order to expose wrongdoing; his next move is to call in the media. During the hearings, Dingell applies skills acquired in his former position as a prosecutor to trip up witnesses. Just as with the OSI investigations, the subcommittee hearings are not a court of law. Due process is therefore not officially recognized, and scientists continue to argue that they are being victimized.

Evidently, Dingell's hearings were sparked by a 1990 General Accounting Office investigation of several institutions, which revealed that Stanford University had used the overhead, or indirect cost, allowance accompanying government research grants for such items as antique silverware, depreciation on a yacht, and the operating costs of a university-owned shopping center. Roush notes that MIT has since repaid the government nearly $800,000 in overhead allowances that were similarly misused. Is this not a good thing for scientists? If smaller portions of federal grants are consumed by administrative expenses, more money remains for the research itself.

Roush mentions that

"One root of scientist's concern about the subcommittee has been the notion that Dingell wants the government to police science. Dingell has repeatedly called this charge nonsense. His aim, he says, is "seeing to it that the scientific community has the tools, devices and mechanisms at hand to police itself, and to see to it that the results of government-sponsored research are of a character that can be trusted. We'd like to see the scientists and the profession deal with that matter themselves. They have demonstrated a considerable and continuing lack of interest in doing that."".(17)

Walter Gilbert, a Nobel Prize-winning geneticist at Harvard, has agreed: "The problem is with us. Until the profession and the universities show that they're policing themselves, the issue isn't going to go away. [A system of adjudication] needs to satisfy the scientists as just, and it needs to satisfy the Congress that it's effective."(17) It should be added that the general public must be satisfied as well.

