My interest in nuclear power as a citizen, engineer, and past employee was a factor in establishing this blog, and it's terrific to see people asking about it seriously again. And because I'm an engineer whose resume includes 10+ years onsite at various nuclear power plants, I know a bit more about it than John Q. Public.
What follows is largely from memory. The numbers might be a bit off, but not substantially enough to mislead. I want to get this out while the iron is hot on this topic, so there will be few if any links. Kindly bear with me.
To measure the cost-efficiency of nuclear power plants, it helps to have a yardstick for comparing the costs of plants of varying sizes and ages. For our purposes, the best measurement is probably cost per kilowatt (kW) of generating capacity. You would expect well-designed plants to have very similar costs per kW, with a slight edge toward the larger plants. Coal, hydro, solar photovoltaic (PV), and wind farms can also be measured this way, so it's a handy concept for energy economics.
In the late 1970s and early 1980s, the costs per kW for nuclear power plants were scattered all over the map. Plants like LaSalle County Station came in for around $1000/kW, while Clinton Power Station (CPS), about 80 miles away in the same state (Illinois), was about $5500/kW. CPS was to have had two 950MW units for $485M, but various problems led to a final installation of one 950MW unit for about $5B - roughly 20 times the original per-unit estimate. How did this happen?
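To make the yardstick concrete, here's a quick back-of-the-envelope calculation in Python using the rough figures above (remember, these are recollections, not audited numbers):

```python
# Back-of-the-envelope cost-per-kW arithmetic, using the approximate
# figures cited above (rough recollections, not audited data).

def cost_per_kw(total_cost_dollars, capacity_mw):
    """Capital cost per kilowatt of generating capacity."""
    return total_cost_dollars / (capacity_mw * 1000)

# CPS as originally planned: two 950 MW units for $485M total.
planned = cost_per_kw(485e6, 2 * 950)   # ~$255/kW

# CPS as actually built: one 950 MW unit for about $5B.
actual = cost_per_kw(5e9, 950)          # ~$5,263/kW

print(f"planned ${planned:,.0f}/kW vs. actual ${actual:,.0f}/kW")
print(f"overrun factor: {actual / planned:.1f}x")  # roughly 20x
```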
There were several problems. Some affected the industry at large. Some were peculiar to the Illinois regulatory environment. And some were purely local, the result of bad management.
First, the at-large problems. The generation of nuclear power plants built at that time was based on load projections dating back to the 1960s. Those were boom times for electric power, and it looked like demand would keep increasing at the same rate indefinitely. So utilities overestimated both how much power they would need in the future and the cash flow they would have to finance the construction.
Then came the 1970s. The economy headed south, and demand for power fell. Utilities had less income and had to borrow more, just as interest rates climbed to historic highs. The 1975 fire at Browns Ferry gave nuclear power some bad publicity. The federal Atomic Energy Commission split into what became the Department of Energy and what is now the Nuclear Regulatory Commission, and the new NRC was less chummy with the plant operators. Environmentalists got stronger, raised more issues, and developed new legal and regulatory intervention strategies. The worst came in 1979, when "The China Syndrome", a movie portraying a hypothetical nuclear power plant accident, was released just weeks before the Three Mile Island (TMI) accident.
TMI lit a fire under the NRC, leading to a huge number of new regulations. Some of this was necessary, but much more was ordered without regard to cost-effectiveness or alternatives. Plants that were otherwise ready for operation had to have entirely new systems installed, or required vast changes to existing systems. Necessary or not, the impact on construction schedules was horrendous, and that in turn sent costs through the roof.
But despite all that, some companies brought in their plants for costs of less than $1000/kW in that time frame. The best performer was probably Duke Power, based in the Carolinas.
Commonwealth Edison (CECo, now Unicom) was probably the next best performer, and like Duke Power had already built and operated other units before LaSalle and the later Byron and Braidwood units. So the availability of experienced hands certainly helped. However, CECo had to deal with the Illinois regulatory environment, which grew harsher over time.
Regulation of the utilities permitted rates based on the capital they had invested in their generating capacity. Regulators were obliged to disallow costs they considered imprudent, because otherwise a utility would earn the same rate of return on capital for poor performance as for good performance. And regulators did not permit any return on the capital until the assets had actually begun generating electricity.
Because of problems like those above, however, the capital investment required to build a plant was far higher than before, including the interest on the funds used for construction. The result was that when the new units were included in the rate base, rates would have to be permitted to rise. But the politics of rate increases in Illinois and elsewhere dictated that regulators be especially restrictive compared to the past, disallowing expenses and generally delaying permission for historically lawful rate increases.
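As a simplified illustration of the rate-base mechanics (the dollar figures and the flat rate-of-return model here are hypothetical; real ratemaking also involves depreciation, taxes, fuel clauses, and more):

```python
# Simplified rate-base arithmetic. All figures are hypothetical, and real
# ratemaking also involves depreciation, taxes, fuel clauses, and more.

allowed_return = 0.12   # allowed rate of return on capital in the rate base
old_rate_base = 2e9     # capital already reflected in rates ($)
new_plant_cost = 5e9    # a costly new unit entering the rate base ($)

# Return-on-capital component of the utility's allowed revenue,
# before and after the new unit is added to the rate base.
return_before = old_rate_base * allowed_return
return_after = (old_rate_base + new_plant_cost) * allowed_return

print(f"return on capital: ${return_before / 1e6:.0f}M/yr "
      f"-> ${return_after / 1e6:.0f}M/yr")
# A 3.5x jump in this component is why folding an expensive new unit
# into rates meant politically painful rate increases - and why
# disallowances and rate-base delays hurt the utilities so badly.
```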
Then another wrinkle was added. Illinois legislators subjected CECo and Illinois Power (IP, now Illinova) to "prudency" audits. This is where the over-optimistic past projections of power demand came in - intervenors like the Citizens Utility Board claimed that the utilities had built excess capacity and were not entitled to a return on the corresponding investments. It amounted to changing the rules in the middle of the game, which is a good way to ruin financial projections.
Anyway, the resulting delays and disallowances hit CECo hard, leading to layoffs in 1993 (including me). But it was especially bad for smaller operators like IP downstate. Having less cash flow, IP had to finance more of the cost of CPS. With less institutional knowledge about building a plant, they took much longer to build CPS than was warranted. And having never worked under such a regulatory environment before, they experienced quality assurance problems so bad that the NRC ordered them to stop some kinds of work entirely until they could prove they were doing things right. Because of those delays, the interest mounted to the point that it was well over half the cost of building the plant (see the sketch below).
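Here's a rough sketch of why that happens. The direct cost, spending profile, and interest rate below are made-up assumptions, not CPS's actual figures, but they show how doubling a construction schedule at double-digit interest rates can push financing costs past half the total:

```python
# How schedule slip compounds financing costs. The direct cost, even
# spending profile, and interest rate are made-up assumptions.

def interest_share(direct_cost, years, rate):
    """Spread direct_cost evenly over the build; each year's outlay accrues
    compound interest until startup. Returns interest as a fraction of the
    total (direct + interest) cost."""
    spend = direct_cost / years
    total = sum(spend * (1 + rate) ** (years - y) for y in range(1, years + 1))
    return (total - direct_cost) / total

rate = 0.14  # borrowing costs ran well into double digits in that era

print(f"on-schedule 6-year build: interest = {interest_share(1.2e9, 6, rate):.0%} of total")
print(f"delayed 12-year build:    interest = {interest_share(1.2e9, 12, rate):.0%} of total")
# Roughly 30% vs. 56%: doubling the schedule alone pushes interest past
# half the total cost, in the same ballpark as what happened at CPS.
```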
But IP hung on and managed to get the plant online and running around 1987. They had no choice - at this time the plant accounted for about 17% of their generating capacity, but about 80% of their assets. Giving up meant writing it off, which would have been corporate suicide - the writeoff would have exceeded their capital. (I'm not an accountant, so take it easy on me with the nomenclature).
So now I hope you have some idea of how the costs of nuclear power plants grew out of control for the most recent generation of plants. Regulations changed precipitously and caused costly delays. Interest rates were high. The public grew hostile. And some plants simply were mismanaged.
Is the above relevant to any possible nuclear renaissance? Regulations should be more stable nowadays. Interest rates have calmed down. The public may have mellowed out by now. We have many more people who know how to manage a nuclear power plant under construction, and one manager in particular did a tremendous job with the St. Lucie plant in Florida. The industry has circled the wagons and shares information very effectively through organizations like the Institute of Nuclear Power Operations (INPO). The technology has matured significantly. And environmentalists have shown by now that they oppose virtually any form of power, so there's no point in trying to please them (although some are starting to recognize the benefits of nuclear power vs. the alternatives).
What we do know is that power correlates well with human wealth and welfare, and we'll have to get it from somewhere.