If you ask any company executive or manager what they feel is more important, profit or their employees’ safety, you will get the same answer one hundred percent of the time: employees’ safety.
Sending employees home safe at the end of the day is always the priority over making an extra buck, unless you work for Ebenezer Scrooge (and even he changed his ways eventually!).
However, despite the best of intentions, employees still do get hurt at work – sometimes fatally.
In Risky Rewards, academics Andrew Hopkins (Emeritus Professor in the School of Sociology at The Australian National University) and Sarah Maslen (Assistant Professor of Sociology at the University of Canberra) examine why this may be.
Through interviews with several senior safety leaders from various companies and industries, the authors explore whether modern bonus structures actually encourage the pursuit of profits over safety, despite companies’ obvious preference for the reverse.
The Impact of Rare but Catastrophic Events
One of the most famous recent examples of the pursuit of profits over safety is the Deepwater Horizon disaster, in which 11 people died, 17 were injured, and almost five million barrels of oil spilled into the Gulf of Mexico.
No one, especially not the executives of British Petroleum (BP), the company responsible for the disaster, wanted that outcome.
That particular rig was behind schedule and costing the company hundreds of thousands of dollars for each day it wasn’t operating. Management – especially executives whose bonuses relied entirely on the company’s profitability – was therefore incentivized to overlook several glaring safety warnings and get the rig into operation as soon as possible.
The book notes that the disaster ultimately cost BP $40 billion – a third of the company’s entire value. It is a grimly ironic outcome: not only did many people lose their lives, but the company also lost money – the one thing decision makers were looking to avoid at all costs. The book puts it succinctly:
Major accidents are rare, and underinvestment can continue for years without giving rise to disaster. On the other hand, managers are judged on their annual performance, especially with respect to profit and loss.
Because catastrophic events are so improbable, many managers are incentivized to cut short-term costs, including spending on safety processes. The probability of a Deepwater Horizon-type disaster? Very low, almost negligible. The probability of not meeting profit goals and missing out on a bonus? Extremely high. What would you choose?
The Swiss Cheese Model
The inherent problem that the authors found with modern bonus systems is that financial performance is encouraged much more than, and often at the expense of, safety performance. Even in the cases where safety is measured and rewarded, the wrong performance indicators are measured (more on this in the next section).
However, as the Deepwater Horizon example aptly demonstrates, financial performance is absolutely tied to safety performance: one catastrophic event nearly toppled a $120 billion company, which never fully recovered (its share price remains 25% lower than before the disaster).
The book presents the Swiss Cheese Model as a way for companies to manage risk:
The prevention of major accidents depends on defence-in-depth, that is, a series of barriers or defences to keep hazards under control…Accidents occur when all these barriers fail simultaneously. The Swiss Cheese Model developed by Professor Jim Reason conveys this idea. Each slice of cheese represents a fallible barrier and accidents only occur when all the holes line up.
The concept is simple enough and, when used properly, can provide a way to develop indicators of how well major hazard risks are being managed.
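To make the model concrete, here is a minimal sketch of its arithmetic. The failure probabilities below are invented for illustration (they are not from the book), and the barriers are assumed to fail independently – in that simplified case, a disaster requires every barrier to fail at once, so its probability is the product of the individual failure rates:

```python
def accident_probability(failure_probs):
    """Probability that all barriers fail simultaneously,
    assuming independent barrier failures."""
    p = 1.0
    for fp in failure_probs:
        p *= fp
    return p

# Four well-maintained barriers, each failing 5% of the time:
# the holes almost never line up.
well_maintained = accident_probability([0.05, 0.05, 0.05, 0.05])
print(f"{well_maintained:.8f}")  # 0.00000625

# Let deferred maintenance erode each barrier to a 50% failure
# rate, and the odds of disaster grow ten-thousand-fold.
neglected = accident_probability([0.5, 0.5, 0.5, 0.5])
print(f"{neglected:.4f}")  # 0.0625
```

The sketch also shows why underinvestment can go unpunished for years: even the neglected system avoids disaster most of the time, which is exactly the false reassurance the book warns about.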
It has been said that lightning never strikes twice, but for BP, it did.
Five years before the Deepwater Horizon disaster, BP experienced another one: the Texas City Refinery explosion. (Interestingly, the Texas City Refinery had to be sold off in 2011 to help pay for the Deepwater Horizon catastrophe.) Had anyone been paying attention, the Swiss Cheese Model would have flagged it.
In the petroleum industry, a loss of containment (meaning a flammable liquid or gas is released into the atmosphere) is a precursor to an explosion. Thankfully, there are many layers of containment barriers between the gas or liquid and an explosion. In the years leading up to the explosion, the Texas City Refinery experienced hundreds of loss-of-containment events, far more than is customary in the industry. With that many containment failures, it’s easy to see how the holes in the Swiss cheese might line up, producing the perfect storm.
Of course, hindsight is 20/20. The chances of that explosion actually happening were minimal, and the costs of making the necessary repairs to containment barriers were high. BP executives were incentivized to focus on cost-cutting, not safety improvements.
Not to paint BP as a callous, uncaring, profit-driven monster – in its defense, safety incentives were in place. The problem was that the focus was on incentivizing the wrong kind of safety.
Personal vs. Process Safety
A major component of almost all bonus programs is the employee’s ability to achieve certain key performance indicators (KPIs). Some of these KPIs are highly objective, such as “increase sales by 10% in North America” – not a lot of ambiguity there: either sales went up or they didn’t. Rewarding safety, however, is much more complicated.
To try and bring safety into the bonus mix, many companies started rewarding managers based on their injury rates – the lower the injury rates, the higher the bonus! Sounds logical. The problem is that this measure doesn’t typically result in managers trying to make workplaces safer – it results in managers manipulating the measure.
The authors found several instances of companies purposefully underreporting injuries to secure a safety bonus, pressuring employees not to come forward. To combat this, some companies instituted a policy that rewarded employees for reporting hazards or ‘near misses’ – incidents that could have resulted in serious damage or injury but didn’t. In one case, a company increased the number of Significant Potential Incidents (SPIs) reported from 300 to 1,400 over a three-year period.
The problem with focusing on personal safety measures is that it ignores process safety measures. Injury rates could have been zero at both of BP’s disasters and each would still have happened, because it was process safety, not personal safety, that failed.
One of the reports following the Texas City Refinery explosion, the Baker Report, distinguishes the two as follows:
Personal or occupational safety hazards give rise to incidents – such as slips, falls, and vehicle accidents – that primarily affect one individual worker for each occurrence. Process safety hazards give rise to major accidents involving the release of potentially dangerous materials, the release of energy (such as fires and explosions), or both. Process safety incidents can have catastrophic effects and can result in multiple injuries and fatalities, as well as substantial economic, property, and environmental damage. Process safety in a refinery involves the prevention of leaks, spills, equipment malfunctions, over-pressures, excessive temperatures, corrosion, metal fatigue, and other similar conditions.
In simplistic terms, process safety hazards can be thought of as major hazards. An example of an industry that does a very good job of managing process hazards is the airline industry. The book describes the airline industry’s approach as follows:
No airline assumes that having good personal injury statistics implies anything about how well aircraft safety is being managed. The reason, no doubt, is that there is just too much at stake. When a passenger airline crashes, hundreds of people are killed. The financial and reputational costs to the airline are enormous and there is the real risk that passenger boycotts might threaten the very existence of the business…For all these reasons, airlines have developed distinctive ways of managing aircraft safety and would never make the mistake of using workforce injury as a measure of aircraft safety.
So how then do you incentivize process safety? It’s not an easy thing and few of the companies interviewed in the book had a solution.
The authors put forth a solution that may better align company interests with bonus structures: a percentage of every bonus paid to top executives would be placed in a trust fund. Should a catastrophic incident occur, the trust fund would be drawn upon to cover the costs.
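A hypothetical sketch of how such a trust fund might work follows. The escrow rate, bonus amounts, and incident costs are all invented for illustration – the book proposes the mechanism, not these numbers:

```python
# Assumed fraction of each bonus withheld into the trust fund
# (an illustrative figure, not a recommendation from the book).
ESCROW_RATE = 0.25

def settle(bonuses, incident_costs):
    """Return (cash paid out now, trust-fund balance remaining
    after catastrophe costs are drawn from the escrowed portion)."""
    escrowed = sum(b * ESCROW_RATE for b in bonuses)
    paid_now = sum(b * (1 - ESCROW_RATE) for b in bonuses)
    remaining = max(0.0, escrowed - incident_costs)
    return paid_now, remaining

# No incident: the escrowed portion survives and can eventually
# be released to the executives.
print(settle([1_000_000, 2_000_000], incident_costs=0))
# (2250000.0, 750000.0)

# A catastrophic year wipes out the escrowed portion entirely.
print(settle([1_000_000, 2_000_000], incident_costs=5_000_000))
# (2250000.0, 0.0)
```

The design point is simple: the executive now carries a personal financial stake in the tail risk, so the remote catastrophe is no longer someone else’s problem.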
Think about this in terms of Deepwater Horizon. Without such a mechanism, there was very little in place to discourage executives from pushing ahead with the rig. Huge bonuses were on the line, hundreds of thousands of dollars were being lost each day, and the chances of something really bad happening seemed so remote they were hardly worth considering.
Except something really bad did happen.
Now, imagine if those executives had been on the hook financially should a catastrophic event occur. That changes the perspective. Imagine if, when you bought a lottery ticket, your chance of losing $50 million were the same as your chance of winning that amount – it would certainly change your motivation. Would you risk total financial ruin – albeit a small chance – for an equally small chance at untold riches?
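The lottery analogy can be made explicit with a little expected-value arithmetic. The probabilities and payoffs below are invented for illustration:

```python
def expected_value(outcomes):
    """Sum of probability-weighted payoffs over
    (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# The symmetric lottery: equal chances of winning and losing $50M.
# On paper it is a "fair" bet with zero expected value, yet few
# would take it -- the small chance of total ruin dominates.
symmetric = [(0.5, 50_000_000), (0.5, -50_000_000)]
print(expected_value(symmetric))  # 0.0

# The bet the executives actually faced: a near-certain bonus,
# with the remote catastrophe costing them personally nothing.
bonus_only = [(0.999, 500_000), (0.001, 0)]
print(expected_value(bonus_only))
```

Tying bonuses to catastrophe costs converts the second, one-sided bet into something closer to the first – which is exactly why it changes behavior.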
When considering how the probability of improbable events should affect your behavior (should you skip paying your utility bill in case an asteroid strikes Earth tomorrow, causing an apocalypse?), consider the story of ‘The Thanksgiving Turkey’ from Nassim Nicholas Taleb’s The Black Swan (no, it has nothing to do with the movie about the ballerina):
A turkey is fed for 1,000 days by a butcher, and every day that passes confirms to the turkey and the turkey’s economics department and the turkey’s risk management department and the turkey’s analytical department that the butcher loves turkeys, and every day brings more confidence to that belief. But on day 1,001 something unexpected will happen to the turkey. It will incur a revision of belief.
That is because day 1,001 is the Wednesday before Thanksgiving.
Consider that [the turkey’s] feeling of safety reached its maximum when the risk was at the highest!
But the problem is even more general than that; it strikes at the nature of empirical knowledge itself. Something has worked in the past, until – well, it unexpectedly no longer does, and what we have learned from the past turns out to be at best irrelevant or false, at worst viciously misleading.
The point of Risky Rewards is similar to Taleb’s: Don’t be a turkey.
In this case, it’s more like “don’t let your employees become turkeys.” Ensure that incentive plans aren’t focusing attention on short-term financial goals at the expense of proper long-term process safety and major hazard planning.
Thanksgiving is coming.