Kicking The Gasoline & Petro-Diesel Habit

Nuclear Power Accidents & Our Ability To Predict Peak Oil Impacts

March 24, 2011

In response to the recent earthquake and tsunami, the resulting breakdowns at the Fukushima Daiichi nuclear power plant, and the ensuing environmental releases of radioactive materials, one Japanese government official claimed that contingency plans “failed to anticipate the scale of the disaster.” In 2001, Australian nuclear engineer Tony Wood indicated that probabilistic risk assessments (PRAs) failed to anticipate the events that led to the world’s worst nuclear disaster, at Chernobyl (in 1986, in what is now Ukraine). He also indicated that PRAs did not anticipate the worst reactor accident in the UK (in 1957, at Windscale), nor the worst nuclear accident in the USA (in 1979, at Three Mile Island in Pennsylvania). There appears to be a pattern here.

Probabilistic risk assessment involves estimating the probability that a serious threatening scenario will actually take place. Although some related ISO standards have been released, there are no generally agreed-upon standards for conducting PRAs. Likewise, there is no requirement that completed PRAs be updated in light of new information, such as the problems encountered in Japan. Nor is there any requirement that PRAs be accurate: according to a 2002 report written by the (US) Nuclear Regulatory Commission (NRC), the quality of these risk assessments varies considerably from one nuclear power plant licensee to another.
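To make the mechanics concrete, consider a minimal sketch (in Python, with entirely hypothetical numbers) of the kind of calculation at the heart of a PRA: individual component failure probabilities are combined through AND and OR gates into a single top-level accident probability.

```python
# Minimal fault-tree sketch: component failure probabilities are
# combined through AND/OR gates into one top-level probability.
# All numbers are hypothetical, chosen only to illustrate the arithmetic.

def p_and(*probs):
    """Probability that all independent events occur (AND gate)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs):
    """Probability that at least one independent event occurs (OR gate)."""
    none_occur = 1.0
    for p in probs:
        none_occur *= (1.0 - p)
    return 1.0 - none_occur

# Hypothetical annual failure probabilities for one reactor.
p_grid_loss = 0.01        # off-site power is lost
p_backup_fails = 0.05     # on-site diesel backup fails to start
p_pump_fails = 0.002      # a cooling pump fails mechanically

# Cooling is lost if the pump fails, OR if both power sources fail.
p_cooling_lost = p_or(p_pump_fails, p_and(p_grid_loss, p_backup_fails))
print(f"Estimated annual probability of losing cooling: {p_cooling_lost:.6f}")
```

Note that every step above assumes the failures are independent. A single earthquake that knocks out both the grid and the diesel generators violates that assumption, which is one reason such models can badly understate real-world risk.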

Typically, PRAs are based on many assumptions and subjective estimates, and the combination of all these factors can, not surprisingly, be far off the mark. Even credible sources can come up with unbelievably optimistic estimates. For example, a 2003 multi-disciplinary study done at the Massachusetts Institute of Technology (MIT) estimated that the risk of an accident damaging the core of a nuclear reactor in the US was about 1/10,000 per reactor per year. The Japanese nuclear disaster reminds us that the actual likelihood is much greater than this study indicates.
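It is worth checking what even that optimistic figure implies for a whole fleet of reactors. The sketch below does the arithmetic; the fleet size and time horizon are illustrative assumptions, not figures from the MIT study.

```python
# Back-of-the-envelope check on the MIT figure of 1/10,000 core-damage
# events per reactor per year. The fleet size and time horizon are
# illustrative assumptions, not figures taken from the study.

p_per_reactor_year = 1.0 / 10_000
reactors = 100            # roughly the size of the US reactor fleet
years = 40                # a typical operating lifetime

reactor_years = reactors * years
# Probability of at least one core-damage event across the whole fleet,
# treating each reactor-year as an independent trial.
p_at_least_one = 1.0 - (1.0 - p_per_reactor_year) ** reactor_years
print(f"P(at least one core-damage event): {p_at_least_one:.0%}")  # ~33%
```

In other words, even taken at face value, the MIT estimate implies roughly a one-in-three chance of a core-damage accident somewhere in a US-sized fleet over a single operating lifetime, and the historical record suggests the true rate is higher still.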

So why are these PRA estimates so wildly optimistic? There are a number of serious problems with this risk assessment approach, but this author specifically calls out five of them below. Notably, all of these problems also apply to peak oil, and to the disastrous consequences that we are all on track to experience unless businesses, non-profits, and governments wake up to the very serious dangers that peak oil poses. These dangers include: a precipitous fall-off in demand for products and services, unexpected supplier bankruptcies, dramatic stock market crashes, financial system lock-ups, widespread unemployment, localized famines, and serious civil unrest.

The first of these problems has to do with who actually conducts a PRA. In many cases, the risk assessment is conducted by employees or consultants paid by the very organization promoting the nuclear power plant. A bias in favor of their benefactor is no doubt built into the assessments these analysts perform. The pressure is for the risk assessment to serve as a marketing tool rather than an objective review of the actual risks involved. The way around this bias is to have independent third parties, such as government regulators, either perform the analyses themselves or hire independent expert risk assessment consultants to do the work. A process to establish true independence, rather than sham independence, also needs to be in place.

The second problem involves groupthink, where established organizational biases color the way analysts look at threat events such as a nuclear accident. It should not be surprising that the chosen risk analysts are often insiders, and/or are known by, and accepted by, those being assessed. Because it may adversely affect their careers, these insiders are loath to “rock the boat,” and loath to be the messengers bearing bad news. Nuclear plants are run by utilities, which are for-profit operations under great pressure to keep costs down, and this too may cause operators to choose insiders, particularly those who can offer the lowest-cost deliverables. Groupthink, augmented by efforts to minimize costs, will in turn lead to cutting corners wherever possible. Cutting corners in a PRA is particularly dangerous because the likely result is that top management is not aware of what it doesn’t know. This insider approach can lead to “surprising events,” where top management claims that it couldn’t imagine that something like this would happen (as was apparently the case with the Japanese official mentioned at the beginning of this article).

A third problem involves the increased variability surrounding rarely encountered events. The world is entering a phase of increasing volatility in many sectors. The financial meltdown of 2007-2008 showed that the economic world is becoming more variable in its ups and downs. Hurricane Katrina revealed the climate variability that we are increasingly experiencing around the globe. The recent revolutions in Egypt and Tunisia indicate how unpredictable the public in many countries has become in its acceptance of governmental control. The increasing incidence of maritime piracy, now taking place off the coast of Somalia, shows that the delivery of oil to major oil-consuming nations like China is becoming increasingly unpredictable. Wars fought over oil, such as the US invasion of Iraq, likewise indicate that supplies of fossil fuels are increasingly tenuous, and that the availability of these fuels will be more variable in the future than it has been over the last few decades. These and many other types of variability need to be incorporated directly into PRAs if the PRAs are to come anywhere close to accuracy.
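One way to fold this kind of variability into a PRA is to treat the event rate itself as uncertain rather than fixed and known. The hypothetical sketch below shows how two models with the identical average rate can differ enormously in the likelihood they assign to a rare cluster of events.

```python
# Hypothetical Monte Carlo sketch: when the event rate itself is
# uncertain, rare clusters of events become far more likely, even
# though the long-run average rate is unchanged.
import random

TRIALS = 100_000
HORIZON = 50   # years

def count_events(rate):
    """One simulated history: one Bernoulli trial per year."""
    return sum(1 for _ in range(HORIZON) if random.random() < rate)

# Case 1: a "1-in-100-year" event with a fixed, known rate.
fixed = sum(count_events(0.01) >= 5 for _ in range(TRIALS))

# Case 2: the same average rate (0.01), but uncertain: with 10%
# probability the true rate is ten times higher (0.1), else zero.
uncertain = sum(
    count_events(0.1 if random.random() < 0.1 else 0.0) >= 5
    for _ in range(TRIALS)
)

print(f"P(5+ events in {HORIZON} yrs), fixed rate:     {fixed / TRIALS:.3%}")
print(f"P(5+ events in {HORIZON} yrs), uncertain rate: {uncertain / TRIALS:.3%}")
```

The fixed-rate model puts the chance of five or more events in fifty years at a few hundredths of a percent; the uncertain-rate model, with the identical average, puts it near six percent, hundreds of times higher.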

A fourth problem that has caused PRAs to be wildly off the mark is a great faith in technology, the belief that it will save the day. When we play with dangerous technologies such as nuclear power, oil drilling (don’t forget the Gulf of Mexico oil spill), or, still more dangerous, natural gas drilling (fracking has its own serious environmental side effects), then acting responsibly requires that we accurately predict, in advance, the downsides and side effects that come along with these powerful and dangerous technologies. In the nuclear power realm, these downsides and side effects include, for example, the need to safeguard nuclear waste for hundreds of thousands of years. To be more accurate, risk assessments should be performed by, or at least actively involve, technology skeptics and cynics. To get a more balanced PRA, at least some of the risk analysts should seriously doubt the merits of complex technology, and they should have diligently studied the historical record of side effects of the technologies involved.

A fifth problem with many PRAs is the failure to adequately consider the systemic interactions that accompany an accident, an attack, or a natural disaster. Disasters like the one recently unfolding in Japan involve disruptions caused by the failure of multiple centralized systems, such as those providing water and electrical power. These systems are unfortunately often interdependent and linked by feedback loops. Because the interactions among these systems are difficult to understand, this level of analysis is often left out of PRAs, or at least unduly truncated; yet this multi-level complexity must be closely examined and modeled. For example, if nuclear reactors need water cooling in order to remain safe, and if water cooling systems require electrical grid power in order to operate, what happens when the grid is down due to a natural disaster such as an earthquake? How will the water cooling systems continue to run? Perhaps diesel-powered generators will do the trick, but only until they consume the fuel stored on-site. What then? What if the roads are blocked by the earthquake? How will more fuel come in? And what if the cooling pipes are broken by an explosion or an earthquake? Much more thought needs to go into the analysis of simultaneous multiple system failures, and how we will deal with them, as the toy model below suggests. Of course this is going to take more money, time, and expertise.
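The toy simulation below walks through exactly this cascade (grid failure, generators with a finite fuel stock, road access needed for resupply); all of its probabilities and durations are invented, but it shows the kind of multi-system modeling that PRAs need.

```python
# Toy Monte Carlo of the cascade described above: grid power fails,
# diesel generators carry the cooling load until on-site fuel runs
# out, and resupply depends on road access. Every probability and
# duration here is invented purely for illustration.
import random

TRIALS = 100_000
FUEL_HOURS = 48   # two days of on-site diesel fuel

def cooling_lost_after_quake():
    """Return True if cooling is lost in one simulated disaster."""
    grid_out_hours = random.uniform(24, 72)    # grid outage duration
    if random.random() < 0.10:                 # generators fail outright
        return True
    roads_blocked = random.random() < 0.50     # quake blocks resupply roads
    if roads_blocked:
        return grid_out_hours > FUEL_HOURS     # no resupply: fuel must last
    return False                               # more fuel can be trucked in

failures = sum(cooling_lost_after_quake() for _ in range(TRIALS))
print(f"P(cooling lost | major earthquake): {failures / TRIALS:.1%}")
```

The specific number means nothing; the point is that once a common-cause event ties these systems together, the conditional probability of a serious failure is enormous compared with what independent per-component analyses would suggest.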

Based on the current Japanese nuclear situation and many other technologically induced disasters, it is clear that our collective ability to accurately predict problems through PRAs is somewhere between weak and non-existent. Unless we markedly upgrade the way we do risk assessment, and take the process much more seriously, this deficiency will continue to cause extended business interruptions, unnecessarily large financial losses, needless deaths, and avoidable public health problems.

It is time that we came right out and said it: “It is simply not believable for top management to claim that they couldn’t have imagined that certain serious problems could take place.” The Japanese nuclear accidents, and many other technological disasters, could of course take place. And the PRAs that management paid for should have seriously examined and planned for these occurrences.

If modern societies are going to use dangerous and powerful technologies, such as nuclear power or, for that matter, petroleum, then there is a significant price to be paid, a price that is currently not being paid adequately. This price includes the increased cost of performing accurate risk assessments, assessments that honestly estimate the probability of various serious attacks, accidents, and mishaps. It additionally includes extensive up-front research that examines the downsides and side effects of the proposed technologies. It furthermore includes adding more safeguards and controls, so that the downsides and side effects are dealt with as part of the initial design rather than added later. What we don’t need now is still more of the “build it now and deal with the consequences later” approach to the deployment of complex technologies.

—–
Charles Cresson Wood is a technology risk management consultant with Post-Petroleum Transportation, in Mendocino, California. He is the author of the book entitled “Kicking The Gasoline & Petro-Diesel Habit: A Business Manager’s Blueprint For Action” (see www.kickingthegasoline.com).
