Solar has an average capacity factor in the US of about 25%. Naively, you might think that to turn this into a highly available power source, you just need 4x the solar capacity, plus enough batteries to store 75% of a day’s worth of power. E.g., for each continuous megawatt you want to supply, you need 4 MW of solar panels and 18 MWh of batteries. During the day, you supply 1 MW from the panels and use the other 3 MW to charge the batteries. Overnight, you discharge the batteries to supply continuous power.
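In a toy model where the panels produce at full nameplate for exactly 6 hours a day and nothing otherwise (a crude stand-in for a 25% capacity factor), the naive sizing works out like this:

```python
# Naive back-of-envelope sizing for 1 MW of continuous supply.
# Toy assumption: full nameplate output for 6 h/day, zero otherwise.

capacity_factor = 0.25
load_mw = 1.0

panels_mw = load_mw / capacity_factor     # panels needed to cover daily energy
sun_hours = 24 * capacity_factor          # 6 h of production per day
battery_mwh = load_mw * (24 - sun_hours)  # storage to cover the other 18 h

print(f"{panels_mw} MW of panels, {battery_mwh} MWh of batteries")
# → 4.0 MW of panels, 18.0 MWh of batteries
```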

Turns out it’s not quite that simple. First, the capacity factor varies throughout the year, as the days get shorter in winter. So you at least need to build enough that even on the shortest day of the year, you can charge up enough to last through the longest night. (Hopefully you’re not inside one of the polar circles… at the poles, you’d need enough batteries to last through 6 months of darkness.)

Second, cloud cover will reduce the amount of power received in irregular and hard-to-predict ways. Here’s a chart from Casey Handmer showing the fluctuations in daily power at one site in Texas, which shows both seasonal and random variation:

Third, the load you need to supply is often not continuous. Residential loads vary a lot throughout the day and throughout the year. The more the load is weighted towards nighttime and winter, the more batteries and extra solar panels you’ll need. And the load can also vary in hard-to-predict ways.

So the cost of the system depends on a few factors. One is your latitude. But another is exactly how available you want the system to be. And it turns out that costs rise steeply as you approach 100%. That is, if you want to prepare for the absolute worst case (a run of very cloudy days in winter, combined with higher-than-expected load occurring at inconvenient times), you have to build a lot of panels and batteries.

Brian Potter recently modeled a simple solar-plus-battery configuration. In his model, for a house in Atlanta, solar with no batteries costs 5.7¢/kWh, but of course only covers about 25% of the needed electricity. To supply 80%, a kWh of electricity now costs 19¢. At 90%, it’s about 25¢; and at 99%, it’s 40¢:

Casey Handmer found a similar curve:

As did this report on off-grid solar for data centers:

[Chart: Off-Grid Data Center, LCOE vs Lifetime Renewable Percent]

How much reliability do we need?

The US power grid has about 3-nines uptime (that is, 99.9%). This is considered unreliable: it means several hours of outages per year. Places like Germany, Japan and Singapore achieve closer to 5 nines: only a few minutes of outage per year. Let’s say 3 nines is a minimum and at least 4 nines (just under an hour of outage per year) is a better target.
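The arithmetic behind those nines, as a quick sanity check:

```python
# Converting "nines" of availability into expected outage time per year.
hours_per_year = 8760

for nines in (2, 3, 4, 5):
    availability = 1 - 10 ** -nines
    downtime_h = hours_per_year * (1 - availability)
    print(f"{nines} nines ({availability:.5%}): "
          f"{downtime_h:.2f} h/yr (~{downtime_h * 60:.0f} min)")
```

Three nines is about 8.8 hours of outage a year; four nines is about 53 minutes; five nines is about 5 minutes.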

Brian’s chart only goes up to 2 nines. I can’t eyeball the other two charts because they’re not on a logarithmic x-axis, so anything past about 2 nines is too close together. But in Brian’s model, even getting 2 nines increased costs about 7x from the baseline with no batteries. Given how steeply that curve is rising at the end, I can only guess that getting to 3 or 4 nines would drive up costs even more.
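One crude way to guess at the extrapolation: take Brian Potter's three firmed Atlanta data points (19¢ at 80%, 25¢ at 90%, 40¢ at 99%) and fit cost as linear in the number of nines, i.e. linear in the negative log of the unserved fraction. This is purely a curve-fit assumption on my part, not a physical model, and the real curve could easily be steeper:

```python
# Curve-fit guess: extrapolate Brian Potter's cost-vs-availability points
# assuming cost grows linearly per "nine" of availability. Assumption only.
import math

points = [(0.80, 19.0), (0.90, 25.0), (0.99, 40.0)]  # (availability, ¢/kWh)
xs = [-math.log10(1 - a) for a, _ in points]          # availability in nines
ys = [c for _, c in points]

# Ordinary least-squares line through the three points.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

for nines in (3, 4):
    print(f"{nines} nines: ~{intercept + slope * nines:.0f}¢/kWh")
# → 3 nines: ~56¢/kWh
# → 4 nines: ~72¢/kWh
```

Under that (generous) linear-in-nines assumption, 3 nines lands around 56¢ and 4 nines around 72¢, i.e. roughly 10x and 12x the no-battery baseline.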

But wait, supplying power to the grid is not the only use case for solar. Solar is being used or considered for a variety of smaller, standalone, industrial applications. Casey’s company Terraform Industries is using it for methane production, and is designing the system to work intermittently, with no batteries. Even a data center doesn’t need very high uptime if it’s being used for offline processing, such as training AI models, instead of an online service.

Casey built a model of such use cases. His analysis is that what matters is the capex of the equipment you are powering and the amount of power it takes to run; specifically, the ratio between the two. What he found was that if your equipment costs less than about $1/W, you're best off using little to no batteries and just running it for a fraction of the day. For more expensive equipment (like a data center full of GPUs), you want to build enough solar and batteries to run it almost continuously.
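A toy version of that trade-off: for 1 W of load, compare the amortized cost per productive hour of running sun-hours-only (no batteries) versus nearly 24/7 (oversized solar plus storage). Cheap equipment can afford to sit idle overnight; expensive equipment cannot. The solar and battery prices below are my own illustrative assumptions, not Casey's actual inputs:

```python
# Toy model of intermittent vs continuous operation. All prices are
# illustrative assumptions; the structure, not the numbers, is the point.

SOLAR_CAPEX_PER_W = 0.30       # assumed $/W of panel
BATTERY_CAPEX_PER_KWH = 150.0  # assumed $/kWh of storage
LIFETIME_H = 20 * 8760         # 20-year amortization
SUN_H_PER_DAY = 6              # toy stand-in for a 25% capacity factor

def cost_per_productive_hour(equip_capex_per_w, duty_cycle):
    """$/hour of useful operation for 1 W of load at a given duty cycle."""
    panels_w = duty_cycle * 24 / SUN_H_PER_DAY  # panels to cover the energy
    storage_kwh = max(0.0, duty_cycle * 24 - SUN_H_PER_DAY) / 1000
    capex = (equip_capex_per_w
             + panels_w * SOLAR_CAPEX_PER_W
             + storage_kwh * BATTERY_CAPEX_PER_KWH)
    return capex / (LIFETIME_H * duty_cycle)

for equip in (0.5, 20.0):  # cheap electrolyzer vs pricey GPU rack, in $/W
    intermittent = cost_per_productive_hour(equip, 0.25)
    continuous = cost_per_productive_hour(equip, 1.0)
    best = "intermittent" if intermittent < continuous else "continuous"
    print(f"${equip}/W equipment → run {best}")
```

With these assumed prices the break-even works out to about $0.90/W, close to Casey's ~$1/W figure, though that agreement depends entirely on the assumed inputs.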

OK, but what about the grid? Well, here’s what Michael Cembalest has to say in a recent report:

I’ve been critical of marginal levelized costs of electricity cited for wind and solar since such measures typically do not incorporate costs of backup power or energy storage. I also don’t trust the rigor of back-of-the-envelope efforts by Lazard and others to estimate “grid firming costs” for wind and solar. That said, I also see little value in computing levelized cost of wind or solar in isolation even when including necessary overbuilding and storage, since no one would build a wind/storage-only or solar/storage-only grid in the first place.

What matters most is the systemwide cost of deeply decarbonized grids. Our grid optimization model uses real-world data on hourly generation by source, demand and reserve margins for the five largest ISO regions. The goal: determine the configuration of solar, wind, gas, carbon capture and battery storage, combined with existing nuclear and hydro, that can meet demand at the lowest cost and reduce carbon intensity. For those interested in the mechanics, we describe the entire exercise in supplementary materials linked below.

The results: we estimate that systemwide levelized costs of electricity would increase by 15%-35% in today’s dollars to increase the zero-carbon share of power by ~30% [points], with abatement costs of $85-$165 per metric ton of CO2. I consider these results to be a lower bound since they exclude future increases in load due to electrification of transport and home heating, and due to increasing demand from data centers.

But the costs of both solar power and battery storage are dropping rapidly. Maybe soon they will be cheap enough that 2, 3 or even 4 nines of uptime can compete with reliable, dispatchable natural gas?

This is unclear to me, because (1) again, I’m not sure how much 3 or 4 nines costs, and (2) it’s unclear to me how fast the cost of power and storage will continue to fall. Even if we assume PV panels and lithium-ion batteries continue to get rapidly cheaper with no bottom in sight, we’re approaching the point where Baumol/Amdahl/bottleneck effects will take over. Already the modules themselves are a minority of the cost of a solar installation; eyeballing this chart from an IRENA report, it looks like they’re about a third. Racking, mounting, and other equipment; labor; and soft costs each make up a significant fraction. Of course, these can be reduced too, and the cheaper the panels get, the more pressure there will be to do so.
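The Amdahl-style limit here is easy to make concrete. If modules are about a third of installed cost and only the module price keeps falling, the total system cost has a floor:

```python
# Amdahl-style limit on solar installation costs: only the module share
# of system cost benefits from falling module prices.
module_share = 1 / 3  # eyeballed from the IRENA chart

def relative_system_cost(module_price_decline):
    """System cost relative to today if module prices fall by the given
    fraction and everything else (racking, labor, soft costs) stays flat."""
    return (1 - module_share) + module_share * (1 - module_price_decline)

print(relative_system_cost(0.5))  # modules halve → system only ~17% cheaper
print(relative_system_cost(1.0))  # free modules → still ~67% of today's cost
```

So even free panels only cut installed cost by a third, unless the balance-of-system costs fall too.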

But my tentative bottom line is: solar + batteries can be great for specific applications that don’t need very high availability; and they can work as part of an overall system of supplying power to the grid; but they won’t supply all power to the grid in the foreseeable future. (And of course this is unchanged if you add in wind or any other intermittent, non-dispatchable source.)

Anyway, mostly putting this out there to collect what I’ve found so far, and so that people with better information or opinions can chime in and clarify or fill in missing pieces.

Comments

Yes, this lines up with current average prices for solar at time of production vs firmed. We're only now starting to see firmed green power prices get discussed much, even by experts, as penetration rates rise and companies realize they made big noises about 2030 and 2050 goals before actually making any kind of plan to achieve them.

But, unless you're at Kardashev-1 levels of power demand (we're not), why would you try to run a grid on all solar? Who is proposing doing that, even in the world's sunniest regions? The most cost-effective way to decarbonize involves a locally-optimized combo of sources, some mix potentially including solar, wind, hydro, geothermal, nuclear, biomass, gasified MSW, maybe wave and tidal if they start making more sense, whatever is available.

Also consider that as EVs continue to grow, electricity demand increases and the demand curve changes, but any place that actually had the foresight to plan for this will find that a large percentage of homes now have a 2-day battery in the garage, at no capex to the utility, either as DER (V2G) or as dispatchable demand with smart charging. Similarly, as we electrify more industrial processes, there's a much larger market for load-shifting and efficiency-increasing technologies like phase change materials, thermal storage and heat recovery, and air and ground source heat pumps, all of which create more avenues to increase grid resilience.

Unfortunately, from a regulatory perspective, almost nowhere has set themselves up to be able to manage or incentivize this anywhere close to intelligently, and almost no one is sufficiently empowered to convene and negotiate with the full set of stakeholders needed to fix that. And even internally, companies often cannot motivate themselves to take even very obvious high-ROI energy and carbon saving measures (like duct sealing) if it involves even the slightest short-term inconvenience. 

Also, I think these kinds of models often assume a strong desire to get non-renewable power down to zero very soon, which I think is in many ways a mistake. For context, right now I live in an RV. When I'm off grid, I have 1050W of solar, ~10kWh of batteries, a 3kW hybrid inverter, and a 5.5kW gasoline generator. In spring and fall I can easily go a week without needing shore power or the generator. In summer and winter, I can't, so I use the generator a bit each day if I'm off grid, and manage demand as best I can, and stay on-grid more often. But it would make no sense, financially or (at this point) environmentally, to try to add enough solar or batteries to not need anything else. Instead I'd first replace my AC and supplement my furnace with mini-split heat pumps. Then when I become stationary again (probably soon) I'm considering getting a wood gasifier generator. And I'm assuming my next car will be electric, and hoping maybe by then I'll be able to make use of that extra storage capacity intelligently. I can certainly load-shift much of my demand. And after that, if I'm still getting 10% of my much-reduced electricity needs from fossil fuels, it's not really urgent to fix that. If you reduce the growth rate of a cumulative problem by 90%, you now have 10x longer to fix the rest.

I have 1050W of solar, ~10kWh of batteries, a 3kW hybrid inverter, and a 5.5kW gasoline generator. In spring and fall I can easily go a week without needing shore power or the generator. In summer and winter, I can't

Sorry naive question, I get that you can't do it in winter, but why not summer? Isn't that when solar peaks?

These are very reasonable questions that I learned about the hard way camping in the desert two years ago. I do not recommend boondocking in central Wyoming in August. 

First, because when you live in an aluminum box with 1" thick R7 walls you need more air conditioning in summer than that much solar can provide. It doesn't help that RV air conditioners are designed to be small and light and cheap (most people only use them a handful of days a year), so they're much less efficient than home air conditioners, even window units. I have 2x 15k BTU/hr AC units, and can only run one at a time on my inverter (they use 1400-1800W each). On very hot days (>90-95F) I need both at least some of the time.

Second, because the conversion efficiency of silicon PV falls at high temperatures, so hot and sunny summer days are actually not my days of peak production.

Third, my batteries and inverter are unfortunately but unavoidably placed in a closed compartment with limited airflow, covered in black-painted aluminum. And consumer-grade inverters are not great; there's something like 15-20% loss (heat generation). That means on hot days it's sometimes challenging to keep these from overheating, and running the generator to give the inverter a break while the batteries recharge can be helpful.

Fourth, in addition to low solar production in winter, electricity consumption in an RV is higher than you might expect in cold weather. The propane furnace draws electric power for the fan. Since the plumbing is exposed to air, you need electric tank and line heaters for the fresh water tank, waste water tanks, and water lines to avoid freezing. I also use electric tank warmers for my propane tanks, since when the weather drops below freezing a partially-empty 20 lb tank can't supply the steady 30k BTU/hr the furnace needs (it normally relies on ambient heat to boil off liquid propane, and at low T in a small tank that doesn't happen fast enough, which can cut supply and even freeze the regulator). On a cold winter day, I'm probably drawing an average of 300-600 watts just to keep the plumbing and furnace working well. Granted, not many people winter in an RV in Massachusetts, I'm an unusual case. I wouldn't have this problem in most of the Southwest or Florida where other RVers go.

AC demand, most likely 

Yes, you really need to mention seasonal storage, or just using 15% FF, to get a holistic picture. That applies to electricity alone, and especially to the whole global economy. For example, low-efficiency/RTE, low-capex, high-storage-capacity sources really bring the costs down compared to battery storage if the aim is to get to 100% RE, and they start to matter around 85%.

For example, H2 with 50% RTE, used for 10% of total electricity and stored for 6 months, doesn't end up costing much with really cheap solar and a low-capex H2 electrolyser/fuel cell combo. Similarly, if you just put a price on CO2, then getting the grid to 100% when other sectors of the economy aren't there isn't worth it. Of course, direct air capture cost gives a hard upper cap on the cost of a 100% RE grid, since you can just suck the CO2 emissions back out of the air.
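The blending arithmetic is worth spelling out: even if the hydrogen path is several times pricier per kWh than direct solar, serving only 10% of demand through it barely moves the system average. All the prices below are my own assumptions for the sketch, not the commenter's figures:

```python
# Illustrative blend: 90% of demand served directly by cheap solar+battery,
# 10% via a hydrogen round trip. All prices are assumptions.

solar_cost = 0.05  # assumed $/kWh for directly-used solar+battery power
rte = 0.5          # round-trip efficiency of electrolysis + fuel cell
h2_adder = 0.05    # assumed $/kWh for electrolyser/fuel-cell/storage capex

h2_cost = solar_cost / rte + h2_adder  # $/kWh delivered via hydrogen
avg = 0.9 * solar_cost + 0.1 * h2_cost

print(f"H2 path: {h2_cost * 100:.0f}¢/kWh, "
      f"system average: {avg * 100:.1f}¢/kWh")
```

With these assumed inputs the hydrogen-delivered power costs 15¢/kWh, three times the direct-solar price, yet the blended system average only rises from 5¢ to 6¢.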

Holistically, after getting to ~90% non-FF for industry, there is far more benefit from synthetic protein for meat (the likes of https://solarfoods.com/) and precision fermentation for, say, cheese/dairy than from pursuing it further. E.g., if you reduce the land use requirements for farming by 75%, then simply planting trees can solve the rest of the CO2 problem.
