Earlier this week, the American Chemical Society released a report on “Advancing Graduate Education in the Chemical Sciences.” ACS president and University of Wisconsin, Madison, chemistry professor Bassam Z. Shakhashiri commissioned the report, charging the commission with defining the purposes of graduate education in the chemical sciences and identifying what steps should be taken to ensure that programs “address important societal issues as well as the needs and aspirations of graduate students.”
One of the five report conclusions was:
Academic chemical laboratories must adopt best safety practices. Such practices have led to a remarkably good record of safety in the chemical industry and should be leveraged.
The commission could easily have folded safety under another conclusion: “Current educational opportunities for graduate students…do not provide sufficient preparation for their careers after graduate school.” Clearly the commission members felt strongly that laboratory safety needed to be called out as a separate point.
The report notes that “students’ lack of familiarity with best practices in laboratory safety … represents a significant gap, regardless of the type of employment the student ultimately pursues,” whether students are looking at academic, industrial, or government positions. The report emphasizes that institutions should develop a culture of working safely rather than just following rules and regulations. In that respect, it jumps off from and references the ACS Safety Culture Task Force report Creating Safety Cultures in Academic Institutions released earlier this year. And the report recommends that ACS develop a comprehensive safety curriculum based on best practices.
The report addresses the finances of safety, too:
The costs of safety practices for research should be built into the indirect costs charged by universities; they should be adequate to provide what is needed (including supplies, equipment, skilled personnel, training, and more). The direct-cost budgets of research grants do not seem to provide the appropriate mechanism for funding safety measures. The top down approach to handling the costs of safety is imperative to make certain there is uniform implementation of safety practices and hardware across all chemical laboratories of a university and to eliminate conflicts of interests among individual PIs making financial decisions regarding safety implementation in their own laboratories.
The costs of safety practices outside research laboratories, most notably in teaching facilities, are inevitably an institutional responsibility. Suitable standards should govern them, and appropriate mechanisms should fund them.
Based on the University of California’s definition of indirect costs–“those that are better calculated on an institutional basis rather than costed-out by project (e.g. research administration and accounting, purchasing, library, space, maintenance)”–safety definitely should be part of overhead. But who pays for what in academic departments can be the subject of intense debate, so it’s nice to see the ACS commission take a clear stand. The commission included two chancellors and one dean, along with many professors and some industry representatives.
Last but not least, a few quotes from the report on the importance of lab safety in graduate education:
Progress would afford better protection to students and other workers at all academic levels and would better prepare students to meet the natural expectations of their future colleagues and employers.
[T]oday’s companies demand safety performance from their employees that far exceeds what students are accustomed to in academic settings. There are many safety skills that are easily taught, such as doing hazard analyses, but the core issue is that students must be “grown” to value safety in a manner that is “bone deep” and can drive the highest level of performance, known as interdependent behavior. This culture of safety is often a surprise to newly hired students. It should not be.
[T]here is a demonstrated, strong correlation between occupational safety and operating performance of factories.30 A great many industrial organizations have found safety to be powerfully coupled in a general way to productivity. They are not committed just because a safety culture reduces their exposure to liability, but in much greater degree because a bone-deep safety culture protects their people and because workers who consistently think carefully about what they are doing perform better.
30Veltri, A.; Pagell, M.; Behm, M.; Das, A. A Data-Based Evaluation of the Relationship between Occupational Safety and Operating Performance. Jour. SH&E Res. 2007, 4, feature 2.
The University of California posted video of last week’s webinar on “Creating Safety Cultures in Academic Institutions” on YouTube, and I’ve embedded the video below. Still haven’t had time to watch it myself!
Also, there’s another webinar coming up tomorrow on “Enhancing a Culture of Safety Through the Development of a Chemical Safety Committee.” The presenter will be Robert Emery, the University of Texas Health Science Center at Houston’s vice president for safety, health, environment, and risk management. The webinar is scheduled for 11 a.m. Pacific/2 p.m. Eastern.
This is old, but I didn’t flag it at the time and I think readers might find it useful: Back in August, Chemjobber and Janet Stemwedel of San Jose State University and Doing Good Science had a (long!) conversation about lab safety, which Chemjobber recorded and posted as a podcast. Stemwedel, who got her PhD in chemistry before transitioning to philosophy, followed up by posting transcripts from parts of the discussion. Here are the links:
[on incorporating safety into tenure decisions] … if it became a matter of “Show us the steps you’re taking to incorporate an awareness and a seriousness about safety into how you train these graduate students to be grown-up chemists,” that’s a different kind of thing from, “Oh, and did you have any accidents or not?” Because sometimes the accidents are because you haven’t paid attention at all to safety, but sometimes the accidents are really just bad luck.
It really does seem that the commenters who are coming from industry are saying, “These conditions that we’re hearing about in the Harran lab (and maybe in academic labs in general) are not good conditions for producing knowledge as safely as we can.” And the academic commenters are saying, “Oh come on, it’s like this everywhere! Why are you going to hold this one guy responsible for something that could have happened to any of us?” It shines a light on something interesting about how academic labs building knowledge function really differently from industrial labs building knowledge.
Something bad happened, and the reason something bad happened, I think, is because of a culture in academic chemistry where it was acceptable for a PI not to pay attention to safety considerations until something bad happened. And that’s got to change.
The U.S. Chemical Safety & Hazard Investigation Board released a video a couple of weeks ago on “Inherently Safer: The Future of Risk Reduction.” Although the video stems from CSB and National Research Council investigations into the Bayer CropScience explosion in 2008, the principles of inherently safer processes can also be applied to research-scale experiments.
As outlined in the video, those principles are:
- Minimize – reduce the amount of hazardous material in the process
- Substitute – replace one material with another that is less hazardous
- Moderate – use less hazardous process conditions, such as lower pressure or temperature
- Simplify – design processes to be less complicated and therefore less prone to failure
“It’s not a specific technology or a set of tools and activities, but it’s really an approach to design and it’s a way of thinking,” said Dennis Hendershot, a consultant with the American Institute of Chemical Engineers Center for Chemical Process Safety, at a 2009 CSB meeting. “The safety features are built right into the process, not added on. Hazards are eliminated or significantly reduced rather than controlled or managed.”
The video goes on to say that the goal of inherently safer process design is not only to prevent an accident but to reduce the consequences of an accident should one occur. A research lab experiment gone wrong, of course, is unlikely to affect the surrounding community in the way that a manufacturing incident might. But research lab incidents have cost millions of dollars and caused personal injuries in the form of lost eyes, hands, and fingers; burns and other unspecified injuries; and deaths of several researchers (for more, see the Laboratory Safety Institute’s Memorial Wall).
UMN is one of the universities benefiting from a program Dow announced last year in which the company is investing $25 million per year for 10 years in research programs at 11 academic institutions. The new safety program is independent of that effort but germinated in the relationship established between Dow and the university, says Frank S. Bates, head of UMN’s chemical engineering and materials science department.
The safety program also extends beyond research programs sponsored by Dow. Central to the effort is a Joint Safety Team (JST) made up of the safety officers from every chemistry and chemical engineering research group. “All of those safety officers will be interacting with Dow and working together to learn best safety practices” from the company, says William B. Tolman, chair of the chemistry department.
At a kick-off meeting a few weeks ago, representatives from Dow and the university agreed that their focus would be on building and sustaining a good safety culture. UMN already seems to have some good procedures and protocols in place, says Pankaj Gupta, senior strategy leader for research and development at Dow. The task is how to raise awareness of those and how to share Dow’s best practices and adapt them to a university setting.
To that end, in the next couple of weeks, Dow and UMN plan to survey chemistry and chemical engineering faculty, postdocs, and students to get their feedback on the current state of laboratory safety and what needs to be improved. Then the program will try to address those concerns by having Dow representatives visit the campus to work with members of the JST. Some or all JST members will also visit Dow, where they will be exposed to things like Dow’s training program, its laboratory audits, and how scientists approach experiments, Gupta says. Repeat surveys will help determine how the program progresses.
Gupta has already surveyed recently hired Dow employees to get their input on the differences between academic and Dow safety culture. “The number one theme that came up again and again was awareness,” Gupta says, adding that other concerns included specifications for protective equipment, protocols, and pre-task analysis. “When our new employees come in, they spend about 30 hours in mandatory training before they can set foot in the lab to do an experiment,” providing an immediate lesson that safety comes first, Gupta says. Monthly safety meetings and pre-task analysis, in which peer groups discuss the hazards of new procedures and what to do if something goes wrong, also reinforce that safety is an integral part of laboratory experiments.
One of the things the pilot program will work on is creating an environment in which it is both expected and comfortable for people to raise questions and work with each other around hazard assessment, says Lori Seiler, associate director for environmental health and safety in research and development at Dow.
The pilot program will run through the summer. Then Dow and UMN will take stock of the effort and figure out how to proceed. Two UMN alumni now employed at Dow—one chemist and one chemical engineer—are on the core team working with the university.
Neither Dow nor UMN comes to the program with the expectation that the university will duplicate Dow’s safety program, Bates says. “But there’s a lot of room between what we’ve done in the past and what they do at Dow,” he says. “Our intention is to make things better in a university setting.”
Key to the effort is the JST, Tolman adds. “We decided early on that it would be actual students and postdocs who would lead the effort, since they’re the ones in the labs,” he says. And the interdepartmental nature of the team should strengthen it, by providing both a common goal and a wider range of experience.
The team should also help address the problem of high turnover in academic labs, Tolman says. Even as some JST members leave every year, their replacements will learn from and be supported by veteran members. And if the safety officers are trained well, they in turn will do a better job of training new research group members, Tolman says.
“My own safety officer from my group came in my office two days ago and she told me flat-out, ‘This is going to make my job easier,’” Bates adds. He hopes that the JST will add some professionalism to the safety officers and promote their authority in the research groups they serve. “And to have a partner at Dow who they can consult with and make contact with occasionally as a resource? That’s just fantastic,” Bates says.
Bates and Tolman say that their faculty members are enthusiastic about the program, even though it means a big time commitment for the safety officers. “We agree it takes time, but it needs to take time. This is important and a high priority for us,” Tolman says.
And although the safety officers may have some busy weeks ahead, in six months or a year from now, “it’s not going to take any more time. I think it will take less time and less concern on the part of the safety officers,” Bates says.
On NPR today, Morning Edition’s Renee Montagne interviewed Charles Duhigg, a New York Times business reporter and author of “The Power of Habit: Why We Do What We Do in Life and Business.” In the interview, Duhigg discussed habit research and how Paul O’Neill turned around aluminum producer Alcoa when he became CEO of the company in 1987: Not by focusing on profits or efficiency, but by emphasizing worker safety.
Discussing habits generally, Duhigg gave a three-part habit framework. There’s a cue, or trigger for an automatic behavior to start; a routine, which is the behavior itself; and finally a reward, which tells our brains to repeat the behavior. In a recent NYT story, Duhigg wrote about a habit he had formed–going to the cafeteria every afternoon for a chocolate-chip cookie–and how he analyzed it to figure out the reason for the behavior. He wasn’t hungry, he found; instead, he wanted to socialize with his colleagues. Importantly, the key to changing habitual behavior is to build it into what you already do, Duhigg wrote:
To shift the routine — to socialize, rather than eat a cookie — I needed to piggyback on an existing habit. So now, every day around 3:30, I stand up, look around the newsroom for someone to talk to, spend 10 minutes gossiping, then go back to my desk. The cue and reward have stayed the same. Only the routine has shifted. It doesn’t feel like a decision… It’s now a habit.
Focusing on Alcoa and O’Neill, Duhigg told NPR that Alcoa managers and employees had been at odds, and O’Neill’s directive to focus on safety got both parties to sit down at the same table. As they studied production processes to make them safer, they also made them more efficient and improved product quality. Safety was a “keystone habit” at Alcoa, Duhigg said. “If you can change a keystone habit, you unlock all these other patterns in someone’s life or in an organization.”
O’Neill’s legacy lives on at Alcoa, Duhigg says in his book (quote courtesy of Amazon’s “Search Inside This Book”):
Even in his absence, the injury rate has continued to decline. In 2010, [a decade after O'Neill retired,] 82 percent of Alcoa locations didn’t lose one employee day due to injury, close to an all-time high. On average, workers are more likely to get injured at a software company, animating cartoons for movie studios, or doing taxes as an accountant than handling molten aluminum at Alcoa.
As of 2006, Alcoa was one of the world’s top three aluminum producers, according to the Economist. The company currently employs 61,000 people in 31 countries and is “the world’s leading integrated aluminum company,” according to its website.
Are there lessons here for research laboratory safety? I’m curious to hear from Safety Zone readers: How would you apply the cue/routine/reward framework to development of safer work habits? And as Alcoa saw, is it possible that improving lab safety would have side benefits, such as improving understanding of reactions or communication between labmates?
Other stories on O’Neill, Alcoa, and worker safety:
- Businessweek – How O’Neill Got Alcoa Shining (2001)
- Harvard Business Review – Paul O’Neill: Values into Action (2002)
Update: Duhigg was also interviewed on Fresh Air on March 5. Listen here.
We have a couple of letters regarding academic lab safety in this week’s issue of C&EN. From one:
I am disappointed to learn from the article “Academic Lab Safety under Exam” that most of the conventional industrial laboratory concepts and practices of decades ago have yet to be implemented in colleges and graduate schools in 2011 (C&EN, Oct. 24, page 25). Maybe I’m prejudiced because I come from a history of dangerous research, went through years of “training” with high-pressure reactions, and ended up teaching industrial safety to college faculty and students (as well as industrial and municipal investigators).
And the other:
Most of the people in [academic] labs are relative rookies. … By contrast, an industrial lab will frequently have people with 10, 15, or 20 years’ experience in the lab. You cannot teach years of experience. So it’s not surprising that mistakes occur in a lab full of people where the most senior person may only have four or five years of experience. In some rare cases, a professor sending students off to do experiments with life-threatening reagents or procedures may tend toward negligence. The culture of having labs populated by inexperienced people has to change or more people will be hurt or killed.
Go read them in full. We now have commenting available for online stories, so feel free to add your thoughts.
Following the American Chemical Society’s National Meeting in Anaheim in March, a Safety Culture Task Force was established by the ACS Committee on Chemical Safety, Society Committee on Education, Committee for Professional Training, and the Division of Chemical Health & Safety. Although the title of the task force doesn’t say so, its focus is specifically on safety culture in academic laboratories.
At a retreat in June, task force members identified some things that members believe are critical for strengthening safety cultures (per the pdf of the Council agenda, page 74):
- Teaching basic laboratory and chemical safety (shop safety included)
- Safety ethic/attitude/awareness
- Learning lessons from laboratory incidents
- Collaborative interactions
- Promoting and communicating safety
- Encouraging institutional support of safety by budgeting for safety programs and supplies
The task force then asked the ACS Council to take up the matter at the Denver meeting, to get comments and suggestions from councilors on ways that ACS could assist colleges and universities in developing better safety cultures and practices.
While we certainly missed our friends who were unable to make it to Denver because of the hurricane on the East Coast, I’d still say it has been an excellent meeting for both the Committee on Chemical Safety (CCS) and the Division of Chemical Health & Safety (CHAS).
Safety culture in academic laboratories has become a popular topic. Efforts by CCS to raise the issue’s profile within ACS will culminate in today’s Council discussion of the topic. Both CHAS & CCS meetings included presentations from the Chemical Safety & Hazard Investigation Board on their investigation of academic safety incidents and causes. I also saw several National Academies senior staff members in Denver; they continue to work on raising the necessary money to follow through on their initial meeting on the topic. It looks like this will continue to move forward on multiple levels. Hopefully there will be a consolidated effort to advance this important cause.
On another subject, several former ACS Presidents have approached me about CHAS developing an online laboratory safety certificate program for graduate students. The objective is to give graduate students a “leg up” on preparing for life after academia. As many of you know, a major complaint by industry is that students don’t have the safety experience they need to succeed when they’re hired. By developing a comprehensive course with testing and a certificate, these graduates could add something helpful to their resumes. I’ll throw a disclaimer right here that hands-on experience in using safety equipment and PPE is also necessary, but a well-designed program could be a strong basis. I’ll be talking soon with both ACS staff and outside providers to determine the best approach. Feel free to chime in if you have ideas!
Last but not least, thanks to the C&EN staff, particularly Jyllian and Amanda Yarnell, for including me in their get-together this weekend. I had a great time and would say they are not only professional and hard-working in their efforts to keep C&EN’s high profile, they’re also fun people!
I’ve got no major problem with working alone, so long as the person doing so uses good judgment in deciding what type of work is reasonable in these situations. When alone, it is prudent to limit yourself to experiments that don’t require especially hazardous reagents, dangerous conditions, or large scales. That said, I don’t think there are any black-and-white rules you can institute. Experience should also enter the analysis; you don’t want to try something dodgy for the first time when you are alone.
There are a bunch of other questions that can arise with respect to any outright ban on working alone. First off, what counts as “alone”? The institutional policies I’ve come across aren’t specific. Must researchers be located in the same bay? The same room? Same floor? Same building?
Another thing Gallagher highlights about his department’s safety culture is a prohibition on working alone—something that can be tricky to get right, he says. One approach is that no one works outside the hours of 8 a.m. to 6 p.m., because no one else will be there. Another is to get people to collaborate to enable longer or later time in the lab. In Gallagher’s department, “We have a culture where students will work with one another to enable their experiments,” he says, noting that people must work within the line of sight of another person. Having someone in an office down the hall doesn’t cut it.
On one level, I agree with Paul that the work in question should dictate the circumstances–that is, after all, what risk assessment is all about. But are there not still some fundamental rules that should be in place? I always wear my seat belt in a car, for example, even when it’s daytime, the roads are dry, and the driver is sober, well-rested, and has been driving for many years without incident. (Full disclosure: I did work alone as a graduate student, principally to collect magnetic circular dichroism spectra. I’ve also been in a car accident.) It’s not always the hazards you know about that will cause a problem, as Gallagher also illustrated:
An incident at Bristol [in 2009] left a student’s face and hands badly cut when an experiment exploded and shattered the safety glass on the fume hood. With the benefit of hindsight, Gallagher says that the most likely cause was a side reaction that produced a small amount of an alkyl peroxide, which detonated when it came into contact with a ground-glass joint. But the peroxide formation was not something anyone had foreseen.
Are there basic safety policies that you think should be in place for all labs, all the time?