Comment by epistasis

5 months ago

Some key parameters for new grid storage tech:

- Round trip efficiency: how much of the electricity put in comes back out

- $/kWh capacity: lower is better, how does the battery cost scale as additional energy capacity is added?

- $/kW capacity: lower is better, how does the battery cost scale as additional power capacity is added?

- power to energy ratio: higher is better, up to a point, but not usually at the expense of $/kWh capacity. If your ratio is 1:100, you're in the range of 4 days' duration, which means at most about 90 full discharges in a year, and that sharply limits the revenue the system can earn.

- Leakage of energy per hour, when charged: does a charged battery hold for hours? Days? Weeks?

These all add up to the $/kWh delivered back to the grid, which determines the ultimate economic potential of the battery tech.
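As a rough sketch of how the parameters above combine into a cost per delivered kWh (zero discount rate; every number here is an illustrative placeholder, not a quote for any real technology):

```python
# Sketch: levelized cost per kWh delivered back to the grid,
# combining capex per kWh, capex per kW, round-trip efficiency,
# cycle life, and the cost of charging energy. Leakage is ignored
# for simplicity. All inputs are made-up illustrative values.

def cost_per_kwh_delivered(capex_per_kwh, capex_per_kw, power_kw,
                           energy_kwh, round_trip_eff, cycles_lifetime,
                           charge_cost_per_kwh):
    """Levelized cost per delivered kWh, zero discount rate."""
    capex = capex_per_kwh * energy_kwh + capex_per_kw * power_kw
    kwh_delivered = energy_kwh * round_trip_eff * cycles_lifetime
    # Delivering 1 kWh requires buying 1 / round_trip_eff kWh.
    energy_cost = charge_cost_per_kwh / round_trip_eff
    return capex / kwh_delivered + energy_cost

# Hypothetical 4-hour system: $100/kWh, $200/kW, 85% round trip,
# 5000 cycles, charging on $0.02/kWh surplus power.
print(cost_per_kwh_delivered(100, 200, 25, 100, 0.85, 5000, 0.02))
# → roughly $0.059 per delivered kWh
```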

Lithium ion is doing really great on all of these, and is getting cheaper at a tremendous rate, so to compete a new tech has to already be beating it on at least one metric, and have the hope of keeping up as lithium ion advances.

Some tech has notably separate $/kW and $/kWh pricing.

For example, the often-mentioned seasonal European setup: green hydrogen produced in summer, injected into former methane storage caverns, and fed to gas turbines in winter.

Though I guess it's hard to pin down the $/kWh there, since the storage relies on natural formations.

Then there's the up-and-coming opportunity for green iron refining (ore to metal), which becomes financially practical when fed with curtailed summer surplus from integrated PV/battery deployments whose entire AC and grid side is undersized relative to PV generation capacity. Day/night shifting with local storage and peak shaving feed the iron electrolyzers, with some of the day/night-shifting battery's capacity used to raise the electrolyzers' duty cycle over the year.

For reference, we're looking at electrolyzer capex (assuming a 30% average duty cycle over the year and a zero discount rate over a 20-year expected lifespan) of around $0.1/kg of iron (metal), and electricity usage of around 3 kWh/kg of iron (metal).
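Working backwards from those figures, the amortized $0.1/kg capex share together with 3 kWh/kg implies a capex per kW of electrolyzer capacity; the arithmetic (my own derivation, not a figure from the comment):

```python
# Back out the implied capex per kW of electrolyzer capacity from
# the comment's figures: $0.1/kg amortized capex, 30% duty cycle,
# 20-year life, zero discount rate, 3 kWh per kg of iron.

DUTY_CYCLE = 0.30
LIFETIME_YEARS = 20
HOURS_PER_YEAR = 8760
KWH_PER_KG = 3.0
CAPEX_PER_KG = 0.10  # $/kg iron, amortized over the lifetime

# One kW of electrolyzer capacity delivers this much energy over its life:
kwh_per_kw_lifetime = DUTY_CYCLE * HOURS_PER_YEAR * LIFETIME_YEARS  # 52,560 kWh
# ... which reduces this much iron:
kg_per_kw_lifetime = kwh_per_kw_lifetime / KWH_PER_KG               # 17,520 kg
implied_capex_per_kw = CAPEX_PER_KG * kg_per_kw_lifetime
print(round(implied_capex_per_kw))  # → 1752 ($ per kW of electrolyzer capacity)
```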

I keep seeing comments that Li-ion is getting cheaper at an amazing rate, but somehow the 18650 cells I see online keep getting more expensive. Anyone have a source?

  • Might be the form factor. I think most of the big companies have moved away from 18650 cells. The cheapest full packs (not cells) in the US are $800 for 5 kWh. Search “Server Rack Battery” on eBay, Amazon, or Alibaba. These things are way cheaper than they were 12 months ago. The raw cells can be had even cheaper, but they require more specialized knowledge and equipment to use.

  • To see whether something is getting cheaper over time, especially long term, it's useful to adjust for inflation: if everything is getting more expensive fast, but Li-ion prices rise more slowly than other goods, then adjusted for inflation Li-ion is getting cheaper.
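The adjustment described above is just deflating a nominal price by a price index; a minimal sketch (the prices and index values are made-up placeholders):

```python
# Convert a nominal price to base-year dollars by deflating with a
# price index (e.g. CPI). All numbers below are hypothetical.

def real_price(nominal_price, index_then, index_base):
    """Nominal price rescaled into base-year terms."""
    return nominal_price * index_base / index_then

# Hypothetical cell: $5 in year A, $6 in year B, while the price
# index rose from 100 to 130 over the same period.
print(real_price(5.0, 100, 100))  # → 5.0 (base year, unchanged)
print(real_price(6.0, 130, 100))  # → ~4.62: nominally pricier, cheaper in real terms
```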

TFA says 75% round trip efficiency, compared to 85% for batteries.
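That efficiency gap translates directly into extra charging energy, since delivering 1 kWh requires buying 1/η kWh; a quick sketch using the two figures above:

```python
# Energy purchased per kWh delivered back to the grid is 1/eta,
# where eta is the round-trip efficiency.

def kwh_in_per_kwh_out(eta):
    return 1.0 / eta

for eta in (0.75, 0.85):
    print(f"eta={eta:.2f}: {kwh_in_per_kwh_out(eta):.3f} kWh in per kWh out")
# The 75%-efficient system needs about 13% more input energy than the
# 85%-efficient one (1.333 vs 1.176 kWh in per kWh delivered).
```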

While there is no leakage as such, the storage vessels might require continuous cooling unless they are buried deep in the ground and very well insulated.

For large enough capacities, where the costs of the turbo-generator and the compressor become relatively small, the cost per stored kWh should become significantly lower than for batteries, especially considering the far longer lifetime.

For small capacities, batteries are certainly preferable, but for very large capacities this should be a very good solution.

There is also the question of security of access to the rare-earth metals needed for those batteries.