The cost of generating power from the sun has dropped more than 80% in the last decade, making it competitive with plants powered by fossil fuels like coal and natural gas. Solar energy now accounts for the largest share of annual new generating capacity in the United States, according to government data.
My understanding is that these sorts of comparisons generally cover the pure variable cost of generating a kWh of electricity, but do not include the higher capital investment requirements that follow from "renewables" being intermittent.
Stated more clearly: when solar is operating for an hour, perhaps it can generate power at a direct variable cost competitive with a fossil fuel plant operating for an hour. But solar only produces when the sun is shining. So assume (generously) that it produces half the time. During that operating period it has to produce twice as much power as its downstream customers demand, and that extra power must be stored so it can supply them during the half of the time it is not producing.
Consequently a "renewable" energy source has to be built to significantly greater capacity than peak demand, and with significant energy storage capacity; neither requirement exists for fossil-fuel-generated power. The additional capital costs relegate "renewables" back to a severely disadvantaged cost position compared to fossil-fuel-generated power, but a pure-variable-cost comparison hides that reality.
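The overbuild-plus-storage argument above can be put into back-of-envelope numbers. This is only an illustrative sketch: the flat 100 MW demand, the 50% capacity factor, and the assumption of lossless storage are all simplifying assumptions, not real plant data.

```python
# Illustrative numbers only -- assumed, not taken from any real system.
peak_demand_mw = 100        # assume flat, constant demand for simplicity
capacity_factor = 0.5       # the "sun shines half the time" assumption
hours_per_day = 24

# Total energy demanded per day.
daily_demand_mwh = peak_demand_mw * hours_per_day

# Solar only generates during its operating hours, so nameplate capacity
# must be scaled up by 1 / capacity_factor to meet total daily demand.
required_capacity_mw = peak_demand_mw / capacity_factor

# During the non-producing hours, demand must be served from storage
# (losses in charging/discharging are ignored here).
storage_needed_mwh = peak_demand_mw * hours_per_day * (1 - capacity_factor)

print(f"Nameplate capacity needed: {required_capacity_mw:.0f} MW "
      f"(vs. {peak_demand_mw} MW for a dispatchable plant)")
print(f"Storage needed: {storage_needed_mwh:.0f} MWh per day")
```

Under these assumptions, meeting a steady 100 MW load requires 200 MW of solar nameplate capacity plus 1,200 MWh of daily storage, and none of that extra capital shows up in a pure-variable-cost comparison.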
As always, I stand to be corrected by our resident Professors of Thermodynamics @thackney and @Joe Wooten.