Comment by rgmerk

16 hours ago

That’s not how the maths works, unfortunately.

Basically, you end up either overbuilding generation to crazy levels or building insane amounts of battery storage that only gets used a few days a year.
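
To make that concrete, here is a rough toy sketch of the trade-off. All the numbers (the 1% chance of a dark, still spell starting, how long spells persist, the 25% output on dark days) are made up for illustration, not taken from any real grid study:

    import random

    random.seed(0)

    DAYS = 365 * 1000       # simulate 1000 years of daily energy balances
    DEMAND = 1.0            # average daily demand, arbitrary units

    def shortfall_fraction(overbuild, storage_days, persistence=0.6):
        """Fraction of days with unmet demand in a toy weather model.

        A 'dark and still' spell starts on ~1% of days and then persists
        day to day with the given probability, so long spells are rare but
        have a geometric tail. Output on dark days is 25% of normal.
        """
        charge = storage_days   # start with a full battery
        dark = False
        short = 0
        for _ in range(DAYS):
            dark = random.random() < (persistence if dark else 0.01)
            gen = overbuild * (0.25 if dark else random.uniform(0.8, 1.2))
            charge = min(storage_days, charge + gen - DEMAND)
            if charge < 0:
                short += 1
                charge = 0.0
        return short / DAYS

    for overbuild in (1.2, 2.0, 4.0):
        for storage in (2, 10, 50):
            f = shortfall_fraction(overbuild, storage)
            print(f"overbuild {overbuild:.1f}x, {storage:2d} days storage: "
                  f"{f:.4%} of days short")

With toy numbers like these, the shortfall only vanishes once you either overbuild several-fold or carry a week or more of demand in storage that sits idle almost all the time.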

That is right (if rather exaggerated, and I will note that it was you who originally picked the figure of two percent). In practice, we accept a certain risk that we will not always have all the capacity we want, even though (or because) we cannot precisely predict how big or how frequent these events will be. There is no particular reason to think this specific case is any different.

  • Why can't we predict how big or how often those events would be? We have a clear understanding of the probability distributions for all kinds of weather scenarios - see, for example, 1-in-50/100/1000-year floods and droughts.

    • I'm not saying we cannot do it, just that we cannot always get it right, and there is plenty of empirical evidence for that.

      The second point is that the distribution has a long tail, especially once you allow for multiple independent incidents overlapping in time. At some point it becomes infeasible to be prepared to carry on operating as if nothing had happened in every conceivable scenario, no matter how accurately we can predict their likelihood. (There is a rough sketch of that overlap arithmetic at the end of this subthread.)

    • We can and do, and there are detailed plans based on those weather scenarios (e.g. AEMO’s Integrated System Plan for the Australian east coast grid).

      Things in the US are a bit more of a mixed bag, for better or worse, but studies there suggest you can get to very high renewables levels cost-effectively, though not to 100% without new technology (e.g. “clean firm” power like geothermal, new nuclear being something other than a clusterfumble, long-term storage like iron-air batteries, etc etc etc).
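
As a rough illustration of how independent rare events stack up, here is a small sketch. The ten hazards, the 1-in-50-year return periods, and the 50-year horizon are made-up inputs, and treating "overlap" as landing in the same calendar year is a crude simplification:

    def p_at_least_one(T, years):
        """Chance of at least one 1-in-T-year event over a planning horizon,
        treating each year as an independent 1/T trial."""
        return 1 - (1 - 1 / T) ** years

    def p_some_overlap(T, hazards, years):
        """Chance that, in at least one year of the horizon, two or more of
        `hazards` independent 1-in-T-year events land in the same year."""
        p = 1 / T
        p_year = 1 - (1 - p) ** hazards - hazards * p * (1 - p) ** (hazards - 1)
        return 1 - (1 - p_year) ** years

    for T in (50, 100, 1000):
        print(f"1-in-{T}-year event: P(at least one in 50 years) = "
              f"{p_at_least_one(T, 50):.1%}")

    # e.g. ten independent 1-in-50-year hazards (heatwave, transmission loss,
    # plant outages, ...) over a 50-year asset life:
    print(f"P(two or more coincide in some year) = {p_some_overlap(50, 10, 50):.1%}")

Events that are rare in any given year become quite plausible over an asset lifetime, and with enough independent hazards some coincidence of two of them ends up more likely than not, which is where the long tail comes from.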
