Lab safety was back in the news recently, but thankfully not because of some horrific new mishap. In August, the U.S. Occupational Safety and Health Administration (OSHA) sent a letter informing Yale University that its investigation into the April death of student Michele Dufault had found several safety deficiencies in the campus workshop where she died. In September, the American Chemical Society’s main deliberative body devoted a portion of its annual meeting to considering ways to make the nation’s academic labs safer.

That discussion, like countless previous ones on the subject, considered ways to improve the often inadequate safety culture prevailing at many institutions. It’s a truth universally acknowledged that a strong safety culture -- something the U.S. Nuclear Regulatory Commission defines as combining “the necessary full attention to safety matters” and the “personal dedication and accountability of all individuals engaged in any activity which has a bearing on safety” -- plays a vital role in doing science safely.

But attention and dedication, though necessary, can’t suffice to bring about the needed change, says a man who spent nearly a decade managing labs at major academic institutions. Now president and CEO of BioRAFT, a company that develops and sells software for managing university laboratories, Nathan Watson also serves on the board of directors of the nonprofit Laboratory Safety Institute. During a recent conversation with Science Careers, he took care not to promote his firm’s products. The experience-based idea he promoted in that conversation could, however, advance the discussion significantly.

Small facts, big effects

“Everyone is saying lab safety culture is the problem,” Watson begins. “But why is lab safety culture so bad?” After 12 years working in labs, first doing bench research and then overseeing day-to-day operations, Watson believes that running a safe lab requires a great deal of specific, detailed information on the people, processes, equipment, and supplies involved in doing the research. Yet the people ostensibly in charge of safety, be they lab chiefs, lab managers, or university safety professionals, rarely have a practical means of gathering that information and using it effectively.


Nathan Watson (CREDIT: Courtesy of Nathan Watson)

How do small facts make a large difference to safety? In the cases of both Dufault and Sheri Sangji, a young University of California, Los Angeles (UCLA), researcher who died in 2009 of preventable burns from a lab fire, better control of certain crucial bits of information could have meant different outcomes. OSHA’s letter to Yale noted that the lathe that killed Dufault was built in 1962 but lacked safety features mandated by a standard established in 1984. Inspections over the years, the letter went on, failed to note that the machine was decades overdue for that upgrade. No one can say whether bringing it up to modern standards would have prevented a needless death. It seems clear, however, that such a modification would have reduced the risk Dufault faced on the fatal night she used it.

Faulty records may also have played a role in Sangji’s death. The proximate cause was a lack of protective apparel while she worked with a dangerously flammable substance. In a citation against UCLA, California’s division of OSHA noted the apparent lack of required training for her hazardous task. Sangji’s lab chief claimed that she had received it but could produce no proof.

Records flagging these dangerous deficiencies could have made a difference. Why didn’t the universities have them? The logistical difficulty of collecting, keeping, and -- especially -- retrieving and using such facts effectively, Watson says, has long been a major obstacle to improving safety.

A university’s labs can number in the hundreds or even thousands, each with a team ranging from a handful to scores of members, he notes. Annual turnover can top 20% as students and postdocs come and go. “That’s a lot of people and a lot of change” to keep track of, especially when each of those individuals needs particular safety training and safety gear based on his or her specific work and the equipment, tasks, and supplies involved. Beyond that, the university’s thousands of pieces of machinery need monitoring, maintenance, and sometimes modification as conditions change. Potentially hazardous supplies and materials require careful tracking, as does access to sensitive areas based on people’s roles and competence.

"I do not think it’s possible to manage lab safety without an integrated software system that manages all the issues [and] the interconnections between the various aspects,” Watson has concluded. The system “needs to tie the people who use equipment with training requirements and delivery of training.”

Building relationships reduces risk

But beyond the traditionally bothersome, cumbersome, and time-consuming process of collecting safety-related information, another factor complicates its effective use, he says. In academic cultures that emphasize intellectual independence and creativity, scientists often view such record-keeping not as a tool for becoming safer but as an intrusive, annoying, and largely pointless bureaucratic ordeal, something to be put off until the dreaded day when deadlines or inspectors come around.

The people and procedures associated with safety information therefore often lack prestige with researchers. “For many years, lab safety has [been seen as] being born out of facilities management,” Watson says. “While there’s nothing wrong with facilities, from the researcher’s standpoint there’s nothing scientific about it, either.” One “very famous” principal investigator, Watson says, called the university safety officer “just a facilities guy who worked in a lab for a summer and fancies himself a scientist.” The “key disconnect” in this attitude, he says, is that safety professionals indeed are “not scientists, but risk managers [with] a different job.” (Watson, by the way, prefers the term lab safety “professionals” to “officers,” with its connotation of checking and giving out tickets.)

Ideally, Watson believes, the relationship between those focused on advancing research and those focused on reducing risk should be a collaboration with a common goal. “In my experience,” he goes on, “the more interaction you see between the lab safety professionals and the researchers, the better the safety culture.” If risk managers came to labs not as occasional inspectors but as frequent visitors, and “people were used to seeing them and it wasn’t such an abrasive interaction,” they could then spend time “talking to researchers and understanding the research and observing what’s going on and helping” to minimize risks.

“The logistics side of the job, the data management and paper management,” traditionally has kept safety professionals at their desks rather than talking with researchers, Watson says. But with today’s laptop and tablet computers, portable, convenient, and simple-to-use information systems can help align the concerns of scientists, lab managers, and safety professionals. These technologies can streamline the processes of gathering, keeping, and using facts and of alerting lab members about matters needing attention. They can integrate safety information into the daily life of a lab not as a bother but as a useful tool for raising standards, and for spotting and avoiding problems.

A culture change

But even if all researchers, lab managers, and campus safety professionals had access to such systems, more knowledge would still be needed to achieve high safety standards. “We don’t yet know very well how effective lab safety training is,” Watson explains. “Is it best to tell people about regulations, or is it best to show people a video where somebody makes a mistake? We don’t know such things yet, and it’s critical that we act and learn.” The University of California’s new Center for Laboratory Safety is a hopeful sign of progress, he says. Established in the wake of Sangji’s death, it aims “to sponsor and support research in laboratory safety [and] develop and translate research into applied best practices” that labs across the country can adopt, according to its Web site.

The keys to integrating modern information systems and research-based best practices into the nation’s academic labs are investment and leadership, Watson says. He hopes for a culture change akin to the campaign that transformed auto seatbelts from novel gadgets into commonplace items: with the backing of government and private organizations, buckling up became today’s nearly universal habit. In the same way, Watson says, modern information technology can turn keeping and consulting complete and up-to-date safety records from a contentious bother into a routine feature of life in academic labs. When that happens, he believes, the people doing research will be much safer.

Beryl Lieff Benderly writes from Washington, D.C.

10.1126/science.caredit.a1100110