The dictionary defines “complacency” as tranquil pleasure or self-satisfaction, especially when uncritical or unwarranted. Groups are prone to complacency when events occur as expected over an extended period of time. They let their guard down, assuming that events will continue to turn out as they have in the recent past. It is one of the most important jobs of leaders and managers to prevent their groups from falling into a culture of complacency. Otherwise, the group will be set up for failure, perhaps catastrophic.
Success breeds complacency
Offshore oil and gas exploration and exploitation is a highly complex and sophisticated technology. It is also fraught with danger. If it is not done right, every time, people can be injured or killed, property damage can be immense, and ecological degradation can be extensive.
The Deepwater Horizon explosion and sinking (along with the associated Macondo oil spill) is one of the most recent examples of how badly things can go wrong when individuals and groups become complacent. The two space shuttle losses are other examples. The loss of the Challenger and its seven crew members in 1986 was due in large part to the disregard by launch officials of warnings about the effect of freezing weather on vital O-ring seals in the solid rocket boosters. The 2003 loss of the Columbia and its seven crew members was due in large part to the disregard by senior personnel of warnings that insulating foam from the external fuel tank had struck the shuttle during launch and might have compromised the craft’s integrity. There were a number of warnings and anomalies during the drilling of the Macondo well indicating that risks were increasing. These went unheeded by those involved aboard the mobile offshore drilling unit, ashore, and within the government.
Despite the dangers inherent in taking things out of context (and because the full report of the Commission covers 380 pages), I will attempt to quote from and highlight various provisions of that report.
Though it is tempting to single out one crucial misstep or point to one bad actor as the cause of the Deepwater Horizon explosion, any such explanation provides a dangerously incomplete picture of what happened – encouraging the very kind of complacency that led to the accident in the first place.
Absent major crises and given the remarkable financial returns available from deepwater reserves, the business culture succumbed to a false sense of security. The Deepwater Horizon disaster exhibits the costs of a culture of complacency.
But that complacency affected government as well as industry.
It should come as no surprise under such circumstances that a culture of complacency with regard to NEPA [the National Environmental Policy Act] developed within MMS [the Minerals Management Service], notwithstanding the best intentions of many MMS environmental scientists.
Moreover, increased citizen involvement before a spill occurs could create better mechanisms to utilize local citizens in response efforts, provide an additional layer of review to prevent industry and government complacency, and increase public trust in response operations.
The changes necessary will be transformative in their depth and breadth, requiring an unbending commitment to safety by government and industry to displace a culture of complacency.
Piper Alpha explosion and fire
For those who might contend that this casualty was an isolated event in an industry that is otherwise good at risk management, I would point to two other incidents. In July 1988, the offshore oil platform Piper Alpha in the North Sea suffered a catastrophic explosion and fire. Of the 226 workers on board at the time, only 61 survived. The UK Government convened an inquiry, chaired by Lord Cullen. Twenty years after the casualty, Stephen McGinty authored a book entitled “Fire in the Night: The Piper Alpha Disaster”. Summarizing Lord Cullen’s inquiry, Mr. McGinty wrote:
The report was damning, in effect an indictment of a culture of complacency at Occidental where the monitoring of work was inadequate in an environment where mistakes proved lethal. The permit-to-work system was ‘knowingly and flagrantly disregarded’, relying on ‘informal communication’ between personnel instead of strict observance of proper procedure. Lord Cullen also found there was ‘no formal training in the permit-to-work system’. Occidental left the responsibility for training workers to their contractors. The permit-to-work system was supposed to be monitored and audited, but this had not been done in the twelve months prior to the disaster.
Occidental’s assessment of risk was considered unsatisfactory, while the ability of management to review and monitor safety procedures was lacking. They failed to perceive how changes in equipment and activities had serious safety implications. The decision to maintain oil and gas production during a massive period of ongoing construction and crucial maintenance work was described by Lord Cullen as ‘puzzling’.
Similar failures in training, oversight, continuity, and regulation were found by the Australian Commission of Inquiry established following the 21 August 2009 blowout at the Montara Wellhead Platform in the Timor Sea. Observed anomalies during the well-drilling operation were not examined to determine their cause. On-coming personnel were insufficiently briefed during hitch changes. Safety protocols were ignored because there had not been any recent casualties. Fortunately, the blowout, which lasted for ten weeks, did not involve an explosion, and no lives were lost. But an estimated 1.2 to 9 million gallons of crude oil spilled into the sea, affecting an area as large as 2,300 square miles. As the Commission’s Report noted, the knowledge and the means with which to accomplish the drilling operation in a safe manner were readily available, but were not utilized in this instance.
I am not contending that another major casualty within the offshore oil and gas industry is inevitable. I am saying, though, that the activity is highly complex. It relies on constant attention to detail and has various redundancies built into the system. Those redundancies have proven quite successful in preventing casualties when one or even two safeguards fail. The problem is that, after years of success, people come to rely on those redundancies to excuse shortcuts or the disregard of standard procedures. When that occurs, the redundancies can quickly erode to the point where risks become unacceptably high.
The US Navy developed the “SUBSAFE” program to guard against that very hazard, given the complex nature of submarine operations and the fleet’s long (though not unblemished) history of safety. The program constitutes a special effort on the part of the Navy’s nuclear submarine force to guard against complacency. The commercial nuclear power industry and the Nuclear Regulatory Commission have implemented a similar program. As the Presidential Commission noted in its report: “The risk-management challenges presented by nuclear power are in some respects analogous to those presented by deepwater drilling: the dependence on highly sophisticated and complex technologies, the low probability/catastrophic consequences nature of the risks generated, and the related tendency for a culture of complacency to develop over time in the absence of major accidents.” The programs of both the Navy’s submarine fleet and the nuclear power industry have been quite successful to date in combating the hazards presented by complacency.
It is incumbent upon the offshore oil and gas industry and its regulators not only to take the steps necessary to prevent another Deepwater Horizon casualty, but also to adopt wholeheartedly a safety culture that will root out complacency. We are smart enough to develop technologies and processes that significantly reduce risk. Only time will tell if we are smart enough to overcome the human failing of complacency in the face of those risks.
(As published in the November 2011 edition of Maritime Reporter & Engineering News - www.marinelink.com)