Echoes of the recent Korean stem cell scandal continue to reverberate, most recently in a report commissioned by the journal Science to examine how it can keep from falling victim to future frauds. But finding protection from the perils of potentially disastrous scientific deceptions is an important issue not only for journal editors but also for early-career researchers. The pressure some of them face to produce the "right" result is not only well documented but also, according to some observers, on the rise. Savvy young scientists therefore need to know how to safeguard their consciences, their research, and their reputations.

Science was "intentionally deceived" when it published the Hwang team's fabrications, and "no realistic set of procedures" can make a journal "completely immune to deliberate fraud," states the report released in late November. Indeed, "incentives for work that is misleading" appear to be increasing, said Science Editor-in-Chief Donald Kennedy in a telephone press conference about the report. That rise is "a side effect of the shortage of positions, the taste of tenure committees, and the highly competitive nature of science," he told Next Wave in a subsequent e-mail.

The research enterprise traditionally has run on trust, with both journal editors and their readers assuming that articles "are honestly conceived and written," the report states. That trust in fact goes far beyond mere confidence in researchers' general truthfulness to the core of what scientists believe about the scientific method. "Science is a self-correcting enterprise, and self-correction works," report committee chair John Brauman of Stanford University in Palo Alto, California, said during the telephone news conference. "Fraud as well as normal mistakes in science will be discovered" when further research reveals that they "are not correct."

The limits of trust

But scant evidence bears out that traditional optimism, according to molecular biologist Adil Shamoo of the University of Maryland Medical School in Baltimore, co-author of the textbook Responsible Conduct of Research and the founder and, for nearly 2 decades, the editor of the journal Accountability in Research. "Science in part is self-correcting and in part is not self-correcting," he told Next Wave by telephone. "I differ from scientists [who] think it's all self-correcting, period." Because researchers generally prefer pursuing their own work to redoing others', relatively few studies are ever replicated. Despite a theoretical risk of exposure, therefore, an unknown number of careers have advanced through deceptive or distorted work, he believes. "Any good sociologist will tell you for every [case] that becomes a scandal, there's probably ... 10 to 20" others that go undetected.

Still, "the overwhelming majority of people want to do the right thing," Shamoo says. Truly "pathological" deceivers form only a tiny fraction of scientists. Another 5% to 10% of work, however, falls into an intermediate "gray zone" of lesser dishonesty, he believes. Authors take "small liberties" such as drawing conclusions "that go beyond the data," "chopping off outliers," or using references in biased ways, says George Lundberg, editor of Medscape General Medicine. "Soft plagiarism"--failing to credit ideas to their true originators--is another common "gray zone" abuse, adds Shamoo. "It is up to editors and reviewers to recognize [when scientists] stretch the truth a bit," Lundberg notes.

But before a deceptive paper ever gets into an editor's hands, a number of scientists have worked to develop its ideas and data. If that process had any shady elements, some of those people--most likely including the junior researchers who do much of the bench work--know about it. "I hear at least two dozen heart-wrenching stories [a year], where [the] adviser might steal [a subordinate's] work or fudge the data, and [the subordinates] feel very uncomfortable," Shamoo said.

If an early-career scientist witnesses such a violation, what should he or she do? Taking the situation to higher university authorities can be both risky and ineffective. "Institutions tend to circle the wagons," Lundberg says. "If they have a star who might be blemished somewhat by the disclosure of hanky-panky, institutions tend ... [to] protect their star." In one case he cited, a revelation of plagiarism "resulted in the immediate loss of the job by the person who had plagiarized." But in another, similar case that same year, "the institution didn't do anything. ... The author of the first instance was a relatively small player [who] didn't bring in a lot of money. The other was a large player with a lot of political clout ... who brought in lots of money."

"To me, there is a conflict of interest" when universities police such violations, "especially if the guy brings in $5 million a year," Shamoo agrees. "Universities unfortunately have not dealt with this issue in the past 15 or 20 years in a forthright manner."

The need for self-protection

It's therefore vital that young scientists take steps to protect themselves, Shamoo and Lundberg agree. To avoid the risk of involvement in a deception, "the first thing," Lundberg advises, "is to be absolutely sure that they agree with everything that's in a paper that goes in with their name, that they have seen the final version before it is submitted, that they know about any revisions that were made during the review process before it ultimately is published, and that they are able to take public responsibility for the essence of the manuscript."

Next, he urges all researchers of every career stage to become familiar with a document called Uniform Requirements for Manuscripts Submitted to Biomedical Journals. Prepared by the International Committee of Medical Journal Editors, it lays out the ethical standards that should govern scientific publication. Several hundred biomedical journals explicitly subscribe to this code of conduct and require their authors to do likewise. The principles, however, fit every field of science, says Lundberg, who advises all early-career researchers, regardless of field, to bring the document to their supervisor's attention as a basis for discussing the ethics of article writing. Such discussions, he believes, can often head off problems before they happen by raising investigators' ethical awareness.

Shamoo agrees that labs need a common frame of reference for thinking about ethics. He advocates 30 mandatory hours of ethics training for all researchers, including lab chiefs, rather than current requirements that cover only trainees. Such explicit training "sensitizes the professors into what ethical norms are. ... I think it becomes more difficult [to cheat] when the student knows what the professor knows and the professor knows what the student knows." He also believes a system of spot-check data audits--similar to IRS random audits--would reduce violations.

Ethics Issue Hits Home for Postdoc

Science magazine reports this week (subscription required) that Jong Hyuk Park, who was a postdoc in the lab of University of Pittsburgh researcher Gerald Schatten, has been barred for 3 years from any relationships with U.S. agencies. The Office of Research Integrity of the U.S. Public Health Service imposed the sanctions on Park for faking figures in a paper on monkey cloning.

Postdocs, of course, can't impose standards on their supervisors. But if young scientists believe that they've observed an ethical violation that demands corrective action, one approach is to contact the editor who published the work, who has several tools available to set the record straight, Lundberg says. Editors "can be more helpful than the people who are closer to the actual action ... because they have a different employer, and they have as their job ferreting out the truth as much as possible." They hear such concerns "rather regularly" and view "correct[ing] the literature" as one of their principal functions, he adds.

Letters to the editor that critique or correct published articles are a commonplace feature of journals, Lundberg says. Usually "the letter writer will have a name," but if the situation merits, "an editor will agree to publish a letter with the byline as 'name withheld upon request.' " Lundberg once published an anonymous letter whose author, a citizen of a dictatorial country, "was risking his life" by speaking out, but Lundberg also believes that risk of career damage, not just a death threat, can justify anonymity. Scientists flagging potential problems in published work need to inform the editor up front whether they will permit the use of their names.

If publishing a letter is not feasible, editors can also protect informants by publishing a notice of concern, which states that the journal has reason to doubt a previously published paper. "It means that the editor is concerned about an article [but] hasn't been able to get a retraction, hasn't been able to get a correction, hasn't been able to get a letter to the editor that's publishable," Lundberg explains. Before taking this "rare" and rather drastic step, however, the editor will contact the authors to resolve the issue through lesser means such as "retraction [or] correction," because "publishing a notice of concern [is] the least good" alternative, Lundberg says. Such an inquiry from an editor "upsets the authors no end" but generally produces results. "I've never had to publish a notice of concern," he adds.

Weighing the risk

An editor's efforts to provide anonymity cannot, however, guarantee that the supervisor will not guess an informant's identity. Whether to take this risk, or the ones inherent in approaching university authorities about the case, is "an individual decision," Shamoo says, because young scientists who run afoul of a superior "can lose their career." Sometimes there is "a moral imperative, like in a clinical trial" with patients' safety at stake, he says. But in every case, each person must "ask your own moral compass [whether] it is worth the risk I'm taking. Some people will swallow and say no and be disappointed, and some people will do something about it."

Should the decision be to proceed, Shamoo recommends "go[ing] through the system" rather than outside it and also documenting everything. "I'm not a believer that you have to break the system. See if the system will take care of [the problem] for you." In his experience, quite often it will. At every step, "I would keep a paper trail" for self-protection. He advises starting with "an e-mail to your boss" raising the issue "without being accusatory. This will make the boss very sensitive" to the issue and cause him "to think about it. [Say] something like, 'I wonder if this is right,' [or] 'I feel a little concerned and uncomfortable doing [such and such].' " The fact of having first raised the issue with the supervisor will redound to the young scientist's benefit if questions arise later, he said.

Often people weighing difficult decisions appreciate the counsel of someone who has given the issue long and serious thought. "If you want support," Shamoo offers, "CC [your paper trail] to Adil Shamoo as an ethics consultant." Whether to make this CC blind or not is the sender's decision. Once contacted about an issue, Shamoo will discuss the scientist's concerns by phone but will take no independent action. "There's nothing wrong" with consulting an ethicist, he says. And, whatever those who consult him decide, "I will not condemn them either way. It is their decision."

Letting the supervisor know that an ethicist has been consulted can have the advantage of putting the supervisor on notice that a disinterested party has an eye on the situation, Shamoo says. Depending on the situation, however, sharing this information may alienate the supervisor. But, Shamoo emphasizes, senior investigators consult him, too. "I hear from quite a number of professors who come to me because they know they are about to be in trouble" and need to find out how to proceed. Many of the cases he hears about get "resolved nicely. Sometimes [professors] will feel guilty and do the right thing," and sometimes "they really weren't aware of" the problem and simply needed it brought into focus.

Such lack of awareness may be rarer in the future, however, Science's Kennedy suggests. "I think the increase in the number of fraud cases in the news has sensitized everyone," he tells Next Wave by e-mail. "Young scientists will have to get used to an environment in which there's marginally less trust and also get used to harder questions about their own work." In light of the issues that some young scientists have faced, however, any increased attention to the ethics of research can only be a welcome change.

Beryl Lieff Benderly writes from Washington, D.C.

10.1126/science.caredit.a0700008