It's surprising to see 6 pages in a thread about wind turbines and, so far, not one mention of the terms EROEI or EPR: Energy Return On Energy Invested, aka Energy Profit Ratio.
This is the fundamental measure of worth for all energy sources, and much more important than cost of installation, or subsidies in dollars.
A simple measure: how much energy does a society as a whole have to invest in making that energy source available for final use, compared to how much energy the source actually delivers for final use? This has nothing to do with dollars or economics; it's a question of basic thermodynamics - does the energy source return a worthwhile energy profit? If the ratio is below one, the source consumes more energy than it delivers, so it's obviously unworkable. Slightly more surprising is that ratios below about 10 to 15 are also unworkable, as they require too large a proportion of the total social structure to be dedicated to maintaining the energy extraction infrastructure.
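As a toy illustration, the ratio and the rough viability threshold above can be written out directly. All figures here are made-up examples, not data for any real energy source:

```python
# EROEI is simply energy delivered to society divided by the energy
# invested to deliver it.  The numbers below are illustrative only.
def eroei(energy_delivered, energy_invested):
    """Energy Return On Energy Invested (dimensionless ratio)."""
    return energy_delivered / energy_invested

# A hypothetical source delivering 500 PJ over its life for 25 PJ invested:
ratio = eroei(500, 25)   # 20.0
viable = ratio >= 10     # clears the rough 10-15 societal threshold
```

A ratio of exactly 1 would mean the source merely repays its own cost, leaving nothing for the rest of society.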
Iirc, in the early days of the industrial revolution (coal powered) coal had an EROEI of around 20. Only just good enough. (It's listed at 80 in refs I see on the net now - presumably modern mining methods.)
In the early days of oil, oil's EPR was over 150: easily extracted, high grade crude, near to refineries and final use locations. Now oil's EPR is below 50 and falling fast, as the easy resources are exhausted, leaving mostly difficult fields: deep water, shale oil, tar sands, very deep wells, high sulfur content, long distance transport, etc. We'll never 'run out' of oil; it's just that oil's EPR will keep falling till eventually it's not thermodynamically feasible as an energy source.
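One way to see why the fall matters: the fraction of gross output left over for society is 1 - 1/EROEI, which barely changes between 150 and 50 but collapses as the ratio approaches 1 (the so-called net energy cliff). A quick sketch:

```python
def net_energy_fraction(eroei):
    """Fraction of gross output left for society after paying the
    energy cost of extraction: 1 - 1/EROEI."""
    return 1 - 1 / eroei

# The decline is nonlinear: falling from 150 to 50 barely matters,
# but falling below ~10 rapidly eats the surplus.
for r in (150, 50, 10, 5, 2):
    print(r, round(net_energy_fraction(r), 3))
```

At an EROEI of 150 society keeps 99.3% of the gross energy; at 2 it keeps only half.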
Solar electric... sigh... typical silicon panels take more energy to produce than they will return in 15 years of operation. Supposedly the panels last longer than that. And designs keep getting better, but then operating lifespans may get shorter too, with cheaper production.
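If one takes the 15-year energy payback figure above at face value, the implied EROEI is just lifetime divided by payback time. This is a rough rule of thumb, not a full lifecycle analysis, and the lifetimes below are assumptions for illustration:

```python
def panel_eroei(payback_years, lifetime_years):
    """If a panel repays its embodied energy in payback_years,
    its EROEI over lifetime_years is roughly lifetime / payback."""
    return lifetime_years / payback_years

short = panel_eroei(15, 20)  # a 20-year panel barely breaks even
long = panel_eroei(15, 30)   # a 30-year panel returns 2x its cost
```

Either way, both figures sit far below the 10-15 threshold discussed earlier.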
Nuclear (fission) is really terrible. For a long time no one tried to work out the total lifespan energy cost of nuclear plants, including all the infrastructure required to build and operate a plant, all the security apparatus for keeping the fuel safe, the cost of decommissioning plants once the pressure vessel is too embrittled to function safely, and worst of all, the costs of maintaining secure containment of the high level wastes for thousands of years after.
But it turns out that fission power systems' lifetime energy costs probably exceed their total energy production. That is, they are net losses, energy-wise. They only seem dollar-profitable in the short term, while an oil-based economy supports the required infrastructure and the long term waste storage costs are ignored.
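The claim amounts to a lifecycle ledger in which the invested terms sum past the delivered total once waste custody is counted. The numbers below are purely illustrative placeholders, not real plant data:

```python
# Hypothetical lifecycle energy ledger for one plant, in PJ.
# Every figure here is an invented placeholder for illustration.
invested = {
    "construction": 40,
    "mining_and_enrichment": 60,
    "operation_and_security": 30,
    "decommissioning": 50,
    "waste_custody_millennia": 120,  # the term usually left out
}
delivered = 280

net = delivered - sum(invested.values())  # negative => net energy loss
```

Drop the waste-custody line and the ledger looks comfortably positive, which is the post's point about how such plants seem profitable in the short term.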
This is why, for instance, no one wants to buy Britain's aging nuclear plants: the decommissioning costs are beginning to loom scarily.
All that's quite apart from the risks of nuclear power. Most people seem to have difficulty comprehending that nuclear risks are different in kind from all other industrial risks. An explosion, oil spill, bridge collapse, plane crash, etc. are all short term disasters, with no consequences beyond a few years, or maybe a few decades for major oil spills. But nuclear accidents risk planet-wide, permanent genetic disaster, and radiological contamination lasting for hundreds, thousands, and with some isotopes even MILLIONS of years. Life on Earth evolved once Earth's primordial radiation level had mostly decayed away. A cascade of nuclear accidents absolutely could reverse that state, and return Earth to a lifeless condition effectively forever.
Then ANOTHER different-in-kind aspect to fission plant risk, is that natural disasters do happen. Tsunamis, earthquakes, and the ones we forget because there have been none SO FAR during our mere 200 years of industrial civilization - major asteroid strikes. One major impact in an ocean, and every nuke plant on bordering coastlines is smashed. Or say, if Yellowstone supervolcano blows. What would have been a human/ecological disaster recoverable in maybe 100 years, gets turned by smashed nuke plants into a planet dead for thousands/millions of years. Especially since in that scenario ALL the other nuke plants and waste sites would also eventually be weathered till they leaked.
Fission power is something only lunatics and the extremely ignorant can seriously suggest. If Chernobyl wasn't enough of a lesson in practical reality, surely since Fukushima and that ongoing radiological disaster, 'ignorant' is no longer a viable excuse. The sooner existing plants worldwide are shut down the better - but that still leaves the waste to be secured. Considering it has to be secured for far longer than our industrial civilization has existed yet, it has to be secured 'effectively forever'. This is not going to be easy.
In the overall scale of humankind's urgent need for some new energy source, wind power is not very significant. Even so, the first question is, what is the EROEI? The figure most commonly given for wind power seems to be around 18.
But I wonder what failure rate was included in that calculation? Also, whether it took into account the power wind turbines *draw* from the grid when not generating - to maintain their heading, adjust blade pitch, de-ice, etc.?
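Those two corrections could be folded into the nominal figure like this. The failure and parasitic fractions here are pure guesses for illustration, not measured values:

```python
def adjusted_eroei(nominal_eroei, failure_fraction=0.0, parasitic_fraction=0.0):
    """Discount a nominal EROEI for turbines that fail early
    (failure_fraction of expected fleet output lost) and for grid
    power the turbines draw for yaw, pitch and de-icing
    (parasitic_fraction of gross output).  Illustrative model only."""
    surviving = nominal_eroei * (1 - failure_fraction)
    return surviving - nominal_eroei * parasitic_fraction

# Guessed corrections: 10% of output lost to failures, 2% parasitic draw.
adjusted = adjusted_eroei(18, failure_fraction=0.1, parasitic_fraction=0.02)
```

With those guesses the headline 18 shrinks to just under 16 - still above the threshold, but the point is that the published figure is only as good as the assumptions behind it.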
A few refs:
http://en.wikipedia.org/wiki/Energy_returned_on_energy_invested
http://energytransition.de/2014/09/renewables-ko-by-eroi/
http://www.energybulletin.net/node/52124 (What is the Minimum EROI that a Sustainable Society Must Have?)
http://www.dailymail.co.uk/news/article-2893708/New-wind-turbine-farce-power-National-Grid-NOT-generating-electricity.html