In Science, Utter Honesty Isn’t Always a Good Thing

Photo: Sean Benesh/Unsplash


  • Honesty is a virtue in science because it is essential for the satisfaction of curiosity; it is a practiced disposition that is important for discovering truths about the natural world.
  • Sometimes doing the right thing may mean not being completely honest in a larger social setting in order to prevent a great harm.
  • The scientist who recognises the possibility of such cases is accepting that professional scientific values may sometimes need to be overridden if they come into conflict with broader human values.

In his famous 1974 commencement address at the California Institute of Technology, physicist Richard Feynman set out to express his understanding of what integrity means for science.

He began by recounting several amusing stories of pseudoscientific beliefs, including the curious Cargo Cult religions of Melanesia that arose following World War II, wherein tribespeople imitated the American soldiers they had encountered whose planes had brought valuable supplies, in the hope of once again receiving their bounty. For instance, they made headphones out of wood and bamboo to wear while sitting in imitation control towers by fire-lit runways they built, all with the goal of causing the return of the planes.

Feynman used this and a couple of similar cases of what he called “cargo cult science,” such as reflexology and telekinetic spoon-bending — goofy, unscientific ideas that he encountered among Californians — to illustrate how easy it is for human beings to fool themselves. He noted that the difficulty extends to pedagogy, where he suggested that teachers are often no better than “witch doctors” when it comes to validating their techniques. We may talk about educational science and the like, but we need to admit that such fields are not (or at least not yet) scientific, because we have not yet tested our pedagogical practices.

With this kind of difficulty in mind, he offered would-be science students some advice that gets to the essence of what it means to be a scientist. He admonished students to cultivate:

a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty — a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid — not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked — to make sure the other fellow can tell they have been eliminated …

In summary, the idea is to try to give all the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.

Feynman first cited a core value — honesty — which is a central scientific character virtue, and then went on to show an example of what this means for behaviour. In saying that this requires a kind of “leaning over backwards,” Feynman clearly recognised that this prescription goes well beyond what is normally done or expected. It is an ideal. It may not be impossible to achieve, but certainly it will be very difficult.

One might reasonably object that utter honesty cannot always mean full reporting in science, at least not in all the detailed ways that Feynman recommends. For instance, while it might indeed be a helpful contribution to the progress of science to be able to access all the data from unsuccessful studies, this may not be feasible or desirable in practice. Who would publish those data or make them available, and how, if not through publication? Every working scientist knows that behind any successful bit of research that leads to a discovery and a published paper lie many failed experiments and rejected hypotheses.

Ceremonial cross of John Frum cargo cult, Tanna island, New Hebrides (now Vanuatu), 1967. Photo: Tim Ross/Wikimedia Commons, CC BY 3.0

It would seem far too easy to publish scientific papers detailing such negative results. It is sometimes said that the secret to science is to make mistakes quickly, but is a scientist who has more failures more productive? Would it be economically viable for journals to publish what would be a massive literature of false starts and blind alleys? Moreover, would anyone really care to subscribe to, let alone publish in, the American Journal of Discarded Hypotheses, the Annals of Failed Experiments, or PLOS Dumpster? This seems absurd on its face. Indeed, the idea is sufficiently amusing to scientists that there is a real Journal of Irreproducible Results that is devoted to scientific satire and humour.

That said, Feynman is surely right that in many cases such information would be useful for other scientists and important to include for the reasons he listed, namely, that it helps them judge the value of whatever positive result you are presenting. Indeed, most of the specific examples Feynman gave involve tests of alternative hypotheses or possible confounding factors that a good experimenter should check as part of the process of hypothesis testing.

It would indeed be dishonest to neglect to report the results of such tests, but notice that Feynman was not just saying that it would be wrong to be dishonest. Rather, he was arguing for a positive standard of honesty, one that sets a higher bar and that scientists must actively commit to as a matter of scientific integrity.

Integrity is the right word here, for the kind of utter honesty that Feynman is talking about involves the integration of values and methods in just the way that is required for the exemplary practice of science. Given that the goal of science is to answer empirical questions to satisfy our curiosity about the world, it is only by striving for the highest level of rigour and care in our methods and practices that a scientist, and the scientific community, can be confident that a question has been satisfactorily answered, and we have indeed made a real discovery.

However, this is not equivalent to publishing every failed experiment. There are many hypotheses that seem reasonable, or even likely, given the current state of understanding, that scientists would be very excited to know did not stand up to empirical test — some failures are interesting failures and worth publishing. If someone were to found the International Journal of Interesting Negative Results, it would surely find a receptive scientific audience. But other failures are uninteresting and of no particular value for the scientific community. Not publishing such failed studies is no more dishonest than not publishing uninteresting successful studies.

Of course, there will be many borderline cases where the degree of interest is a judgment call. We might even agree that publishers should take a broader view and publish more than they have in the past, especially now that electronic publishing has greatly mitigated the economic constraints of journal publication. Thought-provoking questions and trade-offs of social and professional policy arise here, but in general this sort of issue is not really a moral counterexample to the idea of utter honesty. Other issues, however, are more challenging.

Harder cases

Photograph of the 1946 colloquium on the Super at Los Alamos. Front row left to right: Norris Bradbury, John Manley, Enrico Fermi and J.M.B. Kellogg. Second row left to right: Colonel Oliver G. Haywood, unknown, Robert Oppenheimer, Richard Feynman, Phil B. Porter. Third row left to right: Edward Teller, Gregory Breit, Arthur Hemmendinger, Arthur Schelberg. Photo: Los Alamos National Laboratory

Should utter honesty, which some think implies completely open communication of scientific results, be upheld in cases, for example, where there is a reasonable fear that significant social harm would result from dangerous scientific knowledge becoming known?

Many of the Manhattan Project physicists, including Feynman, who had solved a sweet theoretical and technical problem, came to regret their roles in releasing the nuclear genie. Leaving aside the question of whether they should have pursued this research in the first place, surely everyone would agree that honesty does not compel full publication of the details that would allow others to replicate that work.

In 2001, two British journalists reported that among the materials found in the rubble of a Taliban military building after the fall of Kabul were not only weapons but also instructions for building an atom bomb. That the document was later shown to be a version of a satirical piece from none other than the aforementioned Journal of Irreproducible Results is amusing, but it does not negate a legitimate and serious concern.

Biological research can also pose significant threats, and the US government has issued policy guidelines requiring oversight of and limits on what is termed Dual Use Research of Concern (DURC). It has at times even restricted funding of certain types of scientific investigation, such as so-called gain-of-function research in virology that investigated ways to increase the pathogenicity and/or transmissibility of viruses like influenza and SARS. A critic might point to such examples, where science must be kept secret, as a way of questioning whether honesty is always the best policy for scientists.

Colourised transmission electron micrograph of SARS virus particles (orange) found near the periphery of an infected cell (green). Image: NIAID/Flickr, CC BY 2.0

There are several things we should say in response to this criticism of the virtue of honesty in science. The first is to distinguish the question of whether dangerous research should be pursued at all from the question of whether, if the research is done, it should be published openly and honestly. The first question is important, but it is not a question about honesty. Rather, it is about the ethical limits of curiosity. Perhaps Feynman and the other Manhattan Project scientists were right that they should not have released the nuclear genie. For the current case, the presumption is that, for whatever reason, the research has been done, and the issue is whether it violates scientific honesty to not publish the findings or perhaps even to publish misleading results to throw others off the track.

A virtue-theoretic approach helps one begin to think through such cases. Honesty is a virtue in science because it is essential for the satisfaction of curiosity; it is a practiced disposition that is important for discovering truths about the natural world. In this sense, honesty remains a core virtue even if we conclude that there are instances where the possibility of severe public harm requires secrecy: honest science is, after all, what discovered the danger in the first place. What is really going on in such cases is that scientific honesty is taken for granted but must be weighed against other, more general social interests that come into play and ought also to be taken into account.

The discovery of empirical truths is one important end for human beings, but it is not the only one. To say that veracity is a core value in science is not to say that scientific findings should never be kept hidden. Sometimes doing the right thing may mean not being completely honest in a larger social setting in order to prevent a great harm. The scientist who recognises the possibility of such cases is not denigrating or undermining this core value but rather is properly accepting that professional scientific values may sometimes need to be overridden if they come into conflict with broader human values. While all these cases show the need to develop scientific and ethical judgment, they also affirm veracity as a core scientific value. A more difficult immediate question is how to understand cases where veracity breaks down within science itself. It is to that issue that we must now turn.

When honesty breaks down

A photo of Jan Hendrik Schön in a YouTube video. Source: BobbyBroccoli/YouTube

In the early 2000s, Jan Hendrik Schön was a rising star in physics. His research was at the intersection of condensed matter physics, nanotechnology, and materials science. While working at Bell Labs he published a series of papers — an incredible 16 articles as first author in Science and Nature, the ‘top’ science journals in the world, over a two-year period — giving results of work using organic molecules as transistors. One of his papers was recognised by Science in 2001 as a “breakthrough of the year.” He received several prestigious prizes, including the Outstanding Young Investigator Award from the Materials Research Society. His research looked like it would revolutionise the semiconductor industry, allowing computer chips to continue to shrink in size beyond the limits of traditional materials. The only problem was that none of it was true.

Other researchers tried to build on Schön’s findings but were unable to replicate his results. One scientist then noticed that two of Schön’s papers included graphs with the same noise in the reported data, which was extraordinarily unlikely to happen by chance. This was suspicious, but Schön explained it away as an accidental inclusion of the same graph. But then other scientists noticed similar duplications in other papers. It began to appear that at least some of the data were fraudulent. Bell Labs launched a formal investigation and discovered multiple cases of misconduct, including outright fabrication of data. All of Schön’s Nature and Science publications had to be withdrawn, plus a dozen or so more in other journals. Schön was fired from his job, his prizes were rescinded, he was banned from receiving research grants, and in 2004 the University of Konstanz stripped him of his PhD because of his dishonourable conduct.
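It is worth pausing on why the duplicated noise was such a giveaway. The sketch below is purely illustrative — the signals, noise levels and analysis are hypothetical inventions, not anything from the actual investigation — but it shows the underlying statistical point: once each curve’s smooth trend is removed, the residual noise from two independent experiments should be essentially uncorrelated, whereas reused data correlate almost perfectly.

```python
# A toy illustration (hypothetical data, not from the Schoen case) of why
# identical noise across two supposedly independent experiments is damning.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)

def residuals(y, degree=5):
    """Remove a smooth polynomial trend, leaving only the noise."""
    trend = np.polyval(np.polyfit(x, y, degree), x)
    return y - trend

signal_a = 1.0 + 2.0 * x - 3.0 * x**2   # made-up "experiment A" curve
signal_b = 0.5 * x**3                   # made-up "experiment B" curve
shared = rng.normal(0, 0.05, x.size)    # one noise trace, reused twice

honest_a = signal_a + rng.normal(0, 0.05, x.size)   # independent noise
honest_b = signal_b + rng.normal(0, 0.05, x.size)
reused_a = signal_a + shared                        # duplicated noise
reused_b = signal_b + shared

r_honest = np.corrcoef(residuals(honest_a), residuals(honest_b))[0, 1]
r_reused = np.corrcoef(residuals(reused_a), residuals(reused_b))[0, 1]
print(f"independent experiments: r = {r_honest:+.2f}")  # near 0
print(f"duplicated noise:        r = {r_reused:+.2f}")  # near +1
```

In the real case the duplication was spotted by eye, but the principle is the same: nature’s randomness does not repeat itself, so two honest measurements should never share their noise.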

Schön’s fraud was an especially egregious example of scientific misconduct, but unfortunately it is not unique. Every few years there is some case of scientific misconduct that makes headline news. Are these exceptions? It is hard to get good data, but a 2009 analysis of published studies on the issue found that an average of 1% of scientists admitted to having fabricated or falsified research data at least once in their career, and about 12% reported knowing of a colleague who had. What is one to make of such cases where honesty breaks down in science?

Self-correction and trust

A part of ‘The Creation of Adam’ by Michelangelo. Photo: Public domain

The Schön case is demoralising as a breach of scientific ethics, but it also leads one to question how 16 fraudulent papers could have made it through the peer review process of the two premier science journals in the world, and 12 more in other top journals. Peer review is an imperfect process. It generally weeds out papers where the presented evidence is inadequate to support the offered conclusion, though, shockingly, there have even been a few papers with creationist elements that somehow made it through the peer-review filter when reviewers were not paying attention. However, even a careful reviewer is likely to miss individual cases where someone fabricated the presented data.

The naturalist methodology of science is predicated on the idea that findings are replicable — that is implied by the idea of a world governed by causal laws — but for obvious practical reasons it is impossible for a journal reviewer to actually repeat an experiment to check the accuracy of the data. Journals very rarely require that a lab demonstrate its data-production process; mostly they trust researchers to have collected and reported their data accurately. Trust is the operative term here.

There are various circumstances that can justify trust. One is history — trust can be earned through experience. Another is trust based on interests — knowing that someone shares your interests means that you can count on their actions. A third is character — knowing that someone has certain character traits means you can count on their intentions. In general, scientists assume that other researchers share these values. In the vast majority of cases, it is completely reasonable to do so. Of course, this kind of prima facie trust provides an opening for the unscrupulous, which is why someone like Schön could infiltrate science the way he did. In the short term, it is hard to prevent such intentional deceptions, but science has a built-in safeguard. Reality is the secret sauce.

Photo: Davide Cantelli/Unsplash

Adhering to scientific methods can help one avoid or correct one’s own errors. But even if one fails to self-correct in an individual case, science has a way of self-correcting such deceptions, intentional or unintentional, in the aggregate. This is because true scientific discoveries make a difference. The first difference is the difference one sees in a controlled experiment, which is a basic test for the truth of some individual causal hypothesis. They also make a difference to the fabric of science as a whole, adding an interlinking strand to the network of confirmed hypotheses. Because of the interdependencies within science, any discovery will be connected to many others that came before and others yet to come.
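To make that first kind of difference concrete, here is a minimal sketch — with invented numbers, standing in for no particular study — of the basic logic of a controlled experiment: if the causal hypothesis is true, the treated group should differ measurably from the control group, and a simple significance test quantifies how surprising the observed difference would be if the hypothesis were false.

```python
# A toy controlled experiment (all numbers invented for illustration):
# does a hypothetical treatment shift the measured outcome?
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=10.0, scale=2.0, size=50)  # untreated group
treated = rng.normal(loc=11.5, scale=2.0, size=50)  # built-in effect of +1.5

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"observed difference = {treated.mean() - control.mean():+.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value says a difference this large would be very unlikely
# if the treatment in fact made no difference at all.
```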

Because of those interdependencies, fraudulent claims will not remain unnoticed. Other scientists pursuing investigations in the same field will at some point try to use a fraudulent finding as part of some other study. At first, they may be puzzled by unexpected anomalies or inconsistent results in their experiment and recheck their measurements, assuming that their own work was wrong. However, if the problem persists, they will start checking factors that they had initially taken for granted. The test of reality will reveal that the original “finding” cannot be trusted. Even if investigators do not go on to uncover the original fraud, they will note the discrepancy, and future researchers will be able to avoid the error. Indeed, the more significant the purported finding, the more likely it is that an error, fraudulent or accidental, will be discovered, for it affects the work and stimulates the interest of that many more researchers.

From this point of view, Schön’s case is heartening in that it shows how the scientific process does work to self-correct. His meteoric success was short-lived; within two years, scientists discovered the deception as his fraudulent findings failed the reality test. A second heartening point is that in the aftermath of such cases, the scientific community typically reassesses its practices and attempts to improve its methods.

The Schön case led the scientific community to discuss ways that peer review could be improved to better detect fraud prior to publication. Although we typically think of scientific methodology in terms of experimental protocols, statistical methods and so on, procedures of community assessment such as peer review are also an important aspect of its methodological practice. Science never guarantees absolute truth, but it aims to seek better ways to assess empirical claims and to attain higher degrees of certainty and trust in scientific conclusions.

The most important thing

Photo: Ousa Chea/Unsplash

The kind of scientific integrity that is fundamental in science involves more than refraining from prohibited behaviours. It is a mistake to think of ethics as a list of thou-shalt-nots. Feynman’s description of what he meant by “utter honesty” in science, remember, was couched entirely in positive terms. Scientific integrity involves “leaning over backwards” to provide a full and honest picture of the evidence that will allow others to judge the value of one’s scientific contribution. For the most part, this sort of behaviour is part of the ordinary practice of science and occurs without special notice.

In my book, I explore the scientific mindset — the cultural ideals that define the practice. As a practice that aims to satisfy our curiosity about the natural world, veracity is a core value. Scientific integrity is the integration of all the values involved in truth seeking, with intellectual honesty standing at the centre. Feynman’s notion of “utter honesty” is but one expression of this. To reconstruct science’s moral structure, one must identify and integrate other virtues that orbit the bright lodestar that guides scientific practice.

Scientists do not always live up to these ideals, but the scientific community recognises them as aspirational values that define what it means to be a member of the practice. Those who flout them do not deserve to be called scientists. Those who embody them with excellence are properly honoured as exemplars.

Feynman closed his commencement address with a wish for his listeners, the graduating science students: “So I have just one wish for you — the good luck to be somewhere where you are free to maintain the kind of integrity I have described, and where you do not feel forced by a need to maintain your position in the organisation, or financial support, or so on, to lose your integrity. May you have that freedom.” Fulfilling this wish requires more than individual virtue; it requires vigilant support from the scientific community as a whole. Integrity in science involves a community of practice, unified by its shared values.

Robert T. Pennock is University Distinguished Professor of history, philosophy and sociology of science at Michigan State University in the Lyman Briggs College and the departments of Philosophy and Computer Science and Engineering. He is the author of Tower of Babel and An Instinct for Truth, from which this article is adapted. It was first published by MIT Press Reader and was republished here with permission.
