Category → Safety Culture
From a New York Times profile of McKinsey & Co. CEO Dominic Barton and his efforts to change the company’s rules and culture regarding personal investment after insider trading scandals:
At McKinsey’s London office last fall, a recently hired associate sat at a computer for an orientation session. The associate worked at McKinsey as a business analyst several years earlier and then left the firm for a nongovernmental organization. During her first stint, she simply signed a form confirming that she understood McKinsey’s investing rules. This time, though, she had to walk through a 45-minute interactive program.
When McKinsey first introduced this tutorial, six employees refused to complete it, saying it was a sign that the firm was turning into a “nanny state.” They left the firm. To push recalcitrant employees to complete the test, McKinsey cuts off their email access until they comply.
The story says that all McKinsey consultants–not just new ones–have to complete tutorials such as the one described, and senior partners in particular weren't too happy about it. Barton persevered.
Following the lab safety anecdotes that I posted on Monday, there was some discussion on Twitter about how to evaluate whether academic lab safety culture is changing. The answer seems to be that it's hard, and that developing good metrics will take time.
It's hard to say, but there are still serious problems, according to people posting in Reddit's chemistry forum:
I mean these are supposed to be some of the brightest student chemists around, yet the attitude is really disappointing. Whenever I’ve called people out on it they typically say that they “know” what they are doing. The brazen attitude is really pathetic. I’ve heard other excuses like saying how it takes too long to put on lab coats. I’ve seen people refuse to evacuate during fire alarms to finish their columns. I’ve seen people eat while doing chemistry at their hoods.
Everything from “not caring about the equipment/techniques” like not bothering to fill the vac traps with liquid N2, setting up distillations with pressurized nitrogen instead of with a N2 bubbler, letting dishes become overwhelming before dealing with them, leaving wrappers out; to “legitimate dangerous practices” like using a low temp oil bath heated to 325C instead of a sand bath until the oil burned and blackened, dumping 20-30 g of sodium in ketyl stills because they don’t want to clean it, trying to throw away broken Hg thermometers in the trash.
I watched this girl work with god only knows what ligand and metal with gloves still on, proceed to pull out her iPhone and text her friends. Also headphones in lab, we bought a radio so you didn’t have to wear those and block out the world. What if an emergency comes up and I’m yelling for help but your in your own little world listening to music??
And some responses when Chemjobber tweeted about the discussion:
@Chemjobber Thankful my lab doesn't run this way! Outside people have made fun of my group for always wearing lab coats. It's crazy.
— LN (@ChemistLN) December 15, 2013
@Chemjobber People come into my lab with cups of coffee, wearing flip-flops, no lab coats, no gloves,wiping spills on their jeans. Terrible!
— Lau (@Lau__1985) December 15, 2013
The University of Minnesota chemistry department released a new promotional video last week. The department sets a pretty high standard for showing proper personal protective equipment. I spy only one person who is obviously in a wet lab without eye protection.
UMN was one of the schools involved with Dow in the company’s safety partnership with universities. They now have a paper out in the Journal of Chemical Education, so you can read about their experiences in their own words (J. Chem. Educ. 2013, DOI: 10.1021/ed400305e).
Speakers for the open part of the meeting include Massachusetts Institute of Technology chemistry professor and safety committee chair Rick L. Danheiser. I spoke with Danheiser about MIT's safety program for the story "Learning from UCLA." Also on the agenda is William B. Tolman, chair of the chemistry department at the University of Minnesota and one of the people involved in Dow's academic lab safety partnerships. And then there's Susan S. Silbey, who is head of anthropology at MIT and studies "the creation of management systems for containing risks, including ethical lapses, as well as environment, health and safety hazards."
I can’t attend the meeting, but if anyone else who does would like to recap it for the Safety Zone, please let me know!
One of the things that came up at the National Academy of Science’s “Safety Culture in Academic Laboratories” committee meeting a couple of weeks ago was the idea that safety compliance leads to a better safety culture.
Many safety professionals say that a culture of compliance is definitely not the best safety culture. Compliance is about box-ticking on things like standardized training and lab inspections. A good safety culture means that people are thinking through, talking about, and paying attention to what they’re doing so they’re actually working safer. Compliance will come from a good safety culture, but a good safety culture will not necessarily arise from compliance.
Others argue, however, that safety culture can be improved through compliance. “It’s worked well for us to develop our safety culture through ensuring compliance,” because the compliance component promoted interactions between researchers and safety professionals, said Robert Eaton, director of environmental health and safety at the University of California, San Francisco.
That only works if those interactions on compliance are positive, I suspect. In an organization in which researchers do not respect or understand the role of safety staff, then compliance is unlikely to do much for the overall safety culture.
But perhaps compliance is an essential step en route to a better safety culture? Maybe organizations need some sort of base-level safety compliance to be able to move people to the next level–maybe people can’t be brought to think critically about what they’re doing when they’re not even bothering with the basics of eye protection and closed-toe shoes. Representatives from Sandia and Lawrence Berkeley national laboratories presented what they’re doing to push their organizations beyond what sounded like more of a compliance culture to more of a critical thinking culture. To the academics in the room, “You’re at a state we were at 20 years ago,” said J. Charles Barbour, director of the Physical, Chemical, & Nano Sciences Center at Sandia. Even if compliance culture is a necessary phase, though, perhaps academia can take advantage of the knowledge in industry and government labs to move people faster to critical thinking and safer work practices.
One more meeting tidbit: Stanford University chemistry professor Robert Waymouth's suggestion for how to get recalcitrant faculty on board with lab safety programs was to appeal to their egos–in his words, their "desire for excellence"–with the explicit goal of being better than and informing industry rather than the other way around. (Along with, I hope, a desire not to have their lab members get hurt.)
A final note: At the start of the open session, committee chair Holden Thorp noted that topics discussed during information-gathering do not necessarily indicate what will wind up in the final report.
Last month, the National Academy of Sciences kicked off a yearlong study of “Safety Culture in Academic Laboratories.” The project is supposed to focus not so much on what should be done to improve safety in academic labs, but on how to get people to actually do it. C&EN’s Jeff Johnson attended and reported on the first meeting of the committee, which is chaired by H. Holden
Thorp. Thorp transitions at the end of this month from chancellor of the University of North Carolina to provost at Washington University in St. Louis.
The second Safety Culture committee meeting is this week, Wednesday and Thursday (June 26 and 27) at the University of California, Berkeley. The agenda is here. Since it’s local to me, I plan to attend, and I’m sure at least one blog post will result.
Earlier this week, the American Chemical Society released a report on “Advancing Graduate Education in the Chemical Sciences.” ACS president and University of Wisconsin, Madison, chemistry professor Bassam Z. Shakhashiri commissioned the report, charging the commission with defining the purposes of graduate education in the chemical sciences and what steps should be taken to ensure that programs “address important societal issues as well as the needs and aspirations of graduate students.”
One of the five report conclusions was:
Academic chemical laboratories must adopt best safety practices. Such practices have led to a remarkably good record of safety in the chemical industry and should be leveraged.
The commission could easily have folded safety under another conclusion: "Current educational opportunities for graduate students…do not provide sufficient preparation for their careers after graduate school." Clearly the commission members felt strongly that laboratory safety needed to be called out as a separate point.
The report notes that “students’ lack of familiarity with best practices in laboratory safety … represents a significant gap, regardless of the type of employment the student ultimately pursues,” whether students are looking at academic, industrial, or government positions. The report emphasizes that institutions should develop a culture of working safely rather than just following rules and regulations. In that respect, it jumps off from and references the ACS Safety Culture Task Force report Creating Safety Cultures in Academic Institutions released earlier this year. And the report recommends that ACS develop a comprehensive safety curriculum based on best practices.
The report addresses the finances of safety, too:
The costs of safety practices for research should be built into the indirect costs charged by universities; they should be adequate to provide what is needed (including supplies, equipment, skilled personnel, training, and more). The direct-cost budgets of research grants do not seem to provide the appropriate mechanism for funding safety measures. The top down approach to handling the costs of safety is imperative to make certain there is uniform implementation of safety practices and hardware across all chemical laboratories of a university and to eliminate conflicts of interests among individual PIs making financial decisions regarding safety implementation in their own laboratories.
The costs of safety practices outside research laboratories, most notably in teaching facilities, are inevitably an institutional responsibility. Suitable standards should govern them, and appropriate mechanisms should fund them.
Based on the University of California's definition of indirect costs–"those that are better calculated on an institutional basis rather than costed-out by project (e.g. research administration and accounting, purchasing, library, space, maintenance)"–safety definitely should be part of overhead. But who pays for what in academic departments can be the subject of intense debate, so it's nice to see the ACS commission take a clear stand. The commission included two chancellors and one dean, along with many professors and some industry representatives.
Last but not least, a few quotes from the report on the importance of lab safety in graduate education:
Progress would afford better protection to students and other workers at all academic levels and would better prepare students to meet the natural expectations of their future colleagues and employers.
[T]oday’s companies demand safety performance from their employees that far exceeds what students are accustomed to in academic settings. There are many safety skills that are easily taught, such as doing hazard analyses, but the core issue is that students must be “grown” to value safety in a manner that is “bone deep” and can drive the highest level of performance, known as interdependent behavior. This culture of safety is often a surprise to newly hired students. It should not be.
[T]here is a demonstrated, strong correlation between occupational safety and operating performance of factories.30 A great many industrial organizations have found safety to be powerfully coupled in a general way to productivity. They are not committed just because a safety culture reduces their exposure to liability, but in much greater degree because a bone-deep safety culture protects their people and because workers who consistently think carefully about what they are doing perform better.
30Veltri, A.; Pagell, M.; Behm, M.; Das, A. A Data-Based Evaluation of the Relationship between Occupational Safety and Operating Performance. Jour. SH&E Res. 2007, 4, feature 2.
The University of California posted video of last week’s webinar on “Creating Safety Cultures in Academic Institutions” on YouTube, and I’ve embedded the video below. Still haven’t had time to watch it myself!
Also, there’s another webinar coming up tomorrow on “Enhancing a Culture of Safety Through the Development of a Chemical Safety Committee.” The presenter will be Robert Emery, the University of Texas Health Science Center at Houston’s vice president for safety, health, environment, and risk management. The webinar is scheduled for 11 a.m. Pacific/2 p.m. Eastern.
This is old, but I didn’t flag it at the time and I think readers might find it useful: Back in August, Chemjobber and Janet Stemwedel of San Jose State University and Doing Good Science had a (long!) conversation about lab safety, which Chemjobber recorded and posted as a podcast. Stemwedel, who got her PhD in chemistry before transitioning to philosophy, followed up by posting transcripts from parts of the discussion. Here are the links:
[on incorporating safety into tenure decisions] … if it became a matter of “Show us the steps you’re taking to incorporate an awareness and a seriousness about safety into how you train these graduate students to be grown-up chemists,” that’s a different kind of thing from, “Oh, and did you have any accidents or not?” Because sometimes the accidents are because you haven’t paid attention at all to safety, but sometimes the accidents are really just bad luck.
It really does seem that the commenters who are coming from industry are saying, “These conditions that we’re hearing about in the Harran lab (and maybe in academic labs in general) are not good conditions for producing knowledge as safely as we can.” And the academic commenters are saying, “Oh come on, it’s like this everywhere! Why are you going to hold this one guy responsible for something that could have happened to any of us?” It shines a light on something interesting about how academic labs building knowledge function really differently from industrial labs building knowledge.
Something bad happened, and the reason something bad happened, I think, is because of a culture in academic chemistry where it was acceptable for a PI not to pay attention to safety considerations until something bad happened. And that’s got to change.