We’re not building clean energy fast enough to avoid catastrophic #ClimateChange #StopAdani #auspol

At this rate, it’s going to take nearly 400 years to transform the energy system

Here are the real reasons we’re not building clean energy anywhere near fast enough.

James Temple

Fifteen years ago, Ken Caldeira, a senior scientist at the Carnegie Institution, calculated that the world would need to add about a nuclear power plant’s worth of clean-energy capacity every day between 2000 and 2050 to avoid catastrophic climate change.

Recently, he did a quick calculation to see how we’re doing.

Not well.

Instead of the roughly 1,100 megawatts of carbon-free capacity per day likely needed to prevent temperatures from rising more than 2 ˚C, as the 2003 Science paper by Caldeira and his colleagues found, we are adding around 151 megawatts per day.

That’s only enough to power roughly 125,000 homes.

At that rate, substantially transforming the energy system would take, not the next three decades, but nearly the next four centuries.
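
The arithmetic behind that four-century figure is easy to check. Here is a minimal back-of-the-envelope sketch in Python, assuming the 2003 target of roughly 1,100 MW of new carbon-free capacity per day sustained over the paper's 50-year window; the variable names and the 365-day year are illustrative conveniences.

```python
# Back-of-the-envelope check of the "nearly four centuries" claim.
# Assumptions (from the article): ~1,100 MW of new carbon-free capacity
# per day needed over 2000-2050; ~151 MW per day actually being added.

TARGET_MW_PER_DAY = 1_100   # required build rate
ACTUAL_MW_PER_DAY = 151     # observed build rate
YEARS_IN_PLAN = 50          # the 2000-2050 window
DAYS_PER_YEAR = 365

# Total clean capacity the 50-year plan implies, in MW:
total_needed_mw = TARGET_MW_PER_DAY * DAYS_PER_YEAR * YEARS_IN_PLAN

# Years needed to build that much at the actual rate:
years_required = total_needed_mw / (ACTUAL_MW_PER_DAY * DAYS_PER_YEAR)

print(f"Total needed: {total_needed_mw / 1e6:.1f} TW")   # ~20.1 TW
print(f"Years at current rate: {years_required:.0f}")    # ~364 years
```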

In the meantime, temperatures would soar, melting ice caps, sinking cities, and unleashing devastating heat waves around the globe (see “The year climate change began to spin out of control”).

Caldeira stresses that other factors are likely to significantly shorten that time frame (in particular, electrifying heat production, which accounts for more than half of global energy consumption, will significantly alter demand). But he says it’s clear we’re overhauling the energy system about an order of magnitude too slowly, underscoring a point that few truly appreciate: It’s not that we aren’t building clean energy fast enough to address the challenge of climate change.

It’s that—even after decades of warnings, policy debates, and clean-energy campaigns—the world has barely even begun to confront the problem.

The UN’s climate change body asserts that the world needs to cut as much as 70 percent of greenhouse-gas emissions by midcentury to have any chance of avoiding 2 ˚C of warming. But carbon pollution has continued to rise, ticking up 2 percent last year.

So what’s the holdup?

Beyond the vexing combination of economic, political, and technical challenges is the basic problem of overwhelming scale. There is a massive amount that needs to be built, which will suck up an immense quantity of manpower, money, and materials.

For starters, global energy consumption is likely to soar by around 30 percent in the next few decades as developing economies expand. (China alone needs to add the equivalent of the entire US power sector by 2040, according to the International Energy Agency.) To cut emissions fast enough and keep up with growth, the world will need to develop 10 to 30 terawatts of clean-energy capacity by 2050.

On the high end that would mean constructing the equivalent of around 30,000 nuclear power plants—or producing and installing 120 billion 250-watt solar panels.
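
Both equivalences follow directly from the high-end figure. A minimal sketch, assuming a "nuclear power plant" here means roughly 1 GW of capacity (that plant size is an assumption; the 250-watt panel rating is from the article):

```python
# Rough equivalence check for the high-end 30 TW build-out.
clean_capacity_needed_w = 30e12   # 30 terawatts, high end of the 10-30 TW range

nuclear_plant_w = 1e9             # assumed ~1 GW per nuclear plant
solar_panel_w = 250               # 250 W per panel, from the article

print(f"Plants: {clean_capacity_needed_w / nuclear_plant_w:,.0f}")   # 30,000
print(f"Panels: {clean_capacity_needed_w / solar_panel_w:.1e}")      # 1.2e+11, i.e. 120 billion
```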

Energy overhaul

There’s simply little financial incentive for the energy industry to build at that scale and speed while it has tens of trillions of dollars of sunk costs in the existing system.

“If you pay a billion dollars for a gigawatt of coal, you’re not going to be happy if you have to retire it in 10 years,” says Steven Davis, an associate professor in the Department of Earth System Science at the University of California, Irvine.

It’s somewhere between difficult and impossible to see how any of that will change until there are strong enough government policies or big enough technology breakthroughs to override the economics.

A quantum leap

In late February, I sat in Daniel Schrag’s office at the Harvard University Center for the Environment. His big yellow Chinook, Mickey, lay down next to my feet.

Schrag was one of President Barack Obama’s top climate advisors. As a geologist who has closely studied climate variability and warming periods in the ancient past, he has a special appreciation for how dramatically things can change.

Sitting next to me with his laptop, he opened a report he had recently coauthored assessing the risks of climate change.

It highlights the many technical strides that will be required to overhaul the energy system, including better carbon capture, biofuels, and storage.

The study also notes that the United States adds roughly 10 gigawatts of new energy generation capacity per year.

That includes all types, natural gas as well as solar and wind. But even at that rate, it would take more than 100 years to rebuild the existing electricity grid, to say nothing of the far larger one required in the decades to come.
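
The century-plus estimate is simple to reproduce. A minimal sketch, assuming total US installed generating capacity of roughly 1,100 GW; the report cited above gives only the 10-gigawatt annual rate and the "more than 100 years" conclusion, so the capacity figure is an assumption for illustration.

```python
# Why ~10 GW of annual additions implies a century-plus rebuild.
us_installed_capacity_gw = 1_100   # assumed total US generating capacity
additions_gw_per_year = 10         # annual additions cited in the report

print(us_installed_capacity_gw / additions_gw_per_year)   # 110.0 years
```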

“Is it possible to accelerate by a factor of 20?” he asks. “Yeah, but I don’t think people understand what that is, in terms of steel and glass and cement.”

Climate observers and commentators have used various historical parallels to illustrate the scale of the task, including the Manhattan Project and the moon mission. But for Schrag, the analogy that really speaks to the dimensions and urgency of the problem is World War II, when the United States nationalized parts of the steel, coal, and railroad industries.

The government forced automakers to halt car production in order to churn out airplanes, tanks, and jeeps.

The good news here is that if you direct an entire economy at a task, big things can happen fast. But how do you inspire a war mentality in peacetime, when the enemy is invisible and moving in slow motion?

“It’s a quantum leap from where we are today,” Schrag says.

The time delay

The fact that the really devastating consequences of climate change won’t come for decades complicates the issue in important ways. Even for people who care about the problem in the abstract, it doesn’t rate high among their immediate concerns.

As a consequence, they aren’t inclined to pay much, or change their lifestyle, to actually address it. In recent years, Americans were willing to increase their electricity bill by a median amount of only $5 a month even if that “solved,” not merely eased, global warming—down from $10 fifteen years earlier, according to a series of surveys by MIT and Harvard.

It’s conceivable that climate change will someday alter that mind-set as the mounting toll of wildfires, hurricanes, droughts, extinctions, and sea-level rise finally forces the world to grapple with the problem.

But that will be too late.

Carbon dioxide works on a time delay.

It takes about 10 years to achieve its full warming effect, and it stays in the atmosphere for thousands of years.

After we’ve tipped into the danger zone, eliminating carbon dioxide emissions doesn’t decrease the effects; it can only prevent them from getting worse.

Whatever level of climate change we allow to unfold is locked in for millennia, unless we develop technologies to remove greenhouse gases from the atmosphere on a massive scale (or try our luck with geoengineering).

This also means there’s likely to be a huge trade-off between what we would have to pay to fix the energy system and what it would cost to deal with the resulting disasters if we don’t. Various estimates find that cutting emissions will shrink the global economy by a few percentage points a year, but unmitigated warming could slash worldwide GDP more than 20 percent by the end of the century, if not far more.

In the money

Arguably the most crucial step to accelerate energy development is enacting strong government policies.

Many economists believe the most powerful tool would be a price on carbon, imposed through either a direct tax or a cap-and-trade program. As the price of producing energy from fossil fuels grows, this would create bigger incentives to replace those plants with clean energy (see “Surge of carbon pricing proposals coming in the new year”).

“If we’re going to make any progress on greenhouse gases, we’ll have to either pay the implicit or explicit costs of carbon,” says Severin Borenstein, an energy economist at the University of California, Berkeley.

But it has to be a big price, far higher than the $15 per ton it cost to acquire allowances in California’s cap-and-trade program late last year. Borenstein says a carbon fee approaching $40 a ton “just blows coal out of the market entirely and starts to put wind and solar very much into the money,” at least when you average costs across the lifetime of the plants.
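
A rough sketch of why a price near $40 a ton is so decisive for coal: a typical coal plant emits on the order of one metric ton of CO2 per megawatt-hour, so the fee translates almost one-for-one into a dollars-per-megawatt-hour surcharge. The emission intensity and the cost comparison in the comments are illustrative assumptions, not figures from the article.

```python
# Illustrative effect of a $40/ton carbon price on coal generation costs.
carbon_price_per_ton = 40.0      # $/ton CO2, the figure Borenstein cites
coal_intensity_t_per_mwh = 1.0   # assumed ~1 ton CO2 emitted per MWh of coal power

surcharge_per_mwh = carbon_price_per_ton * coal_intensity_t_per_mwh
print(f"Coal surcharge: ${surcharge_per_mwh:.0f}/MWh")
# A ~$40/MWh surcharge roughly doubles the running cost of a typical coal
# plant, lifting it above recent utility-scale wind and solar prices.
```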

Others think the price should be higher still. But it’s very hard to see how any tax even approaching that figure could pass in the United States, or many other nations, anytime soon.

The other major policy option would be caps that force utilities and companies to keep greenhouse emissions below a certain level, ideally one that decreases over time. This regulations-based approach is not considered as economically efficient as a carbon price, but it has the benefit of being much more politically palatable. American voters hate taxes but are perfectly comfortable with air pollution rules, says Stephen Ansolabehere, a professor of government at Harvard University.

Fundamental technical limitations will also increase the cost and complexity of shifting to clean energy. Our fastest-growing carbon-free sources, solar and wind farms, don’t supply power when the sun isn’t shining or the wind isn’t blowing. So as they provide a larger portion of the grid’s electricity, we’ll also need long-range transmission lines that can balance out peaks and valleys across states, or massive amounts of very expensive energy storage, or both (see “Relying on renewables alone significantly inflates the cost of overhauling energy”).

The upshot is that we’re eventually going to need either to supplement wind and solar with many more nuclear reactors, fossil-fuel plants with carbon capture, and other low-emissions sources, or to pay far more to build out a much larger system of transmission, storage, and renewable generation, says Jesse Jenkins, a researcher with the MIT Energy Initiative. In either case, we’re still likely to need significant technical advances that drive down costs.

All of this, by the way, only addresses the challenge of overhauling the electricity sector, which currently represents less than 20 percent of total energy consumption. It will provide a far greater portion as we electrify things like vehicles and heating, which means we’ll eventually need to develop an electrical system several times larger than today’s.

But that still leaves the “really difficult parts of the global energy system” to deal with, says Davis of UC Irvine. That includes aviation, long-distance hauling, and the cement and steel industries, which produce carbon dioxide in the manufacturing process itself. To clean up these huge sectors of the economy, we’re going to need better carbon capture and storage tools, as well as cheaper biofuels or energy storage, he says.

These kinds of big technical achievements tend to require significant and sustained government support. But much like carbon taxes or emissions caps, a huge increase in federal research and development funding is highly unlikely in the current political climate.

Give up?

So should we just give up?

There is no magic bullet or obvious path here. All we can do is pull hard on the levers that seem to work best.

Environmental and clean-energy interest groups need to make climate change a higher priority, tying it to practical issues that citizens and politicians do care about, like clean air, security, and jobs. Investors or philanthropists need to be willing to make longer-term bets on early-stage energy technologies. Scientists and technologists need to focus their efforts on the most badly needed tools. And lawmakers need to push through policy changes to provide incentives, or mandates, for energy companies to change.

The hard reality, however, is that the world very likely won’t be able to accomplish what’s called for by midcentury. Schrag says that keeping temperature increases below 2 ˚C is already “a pipe dream,” adding that we’ll be lucky to prevent 4 ˚C of warming this century.

That means we’re likely to pay a very steep toll in lost lives, suffering, and environmental devastation (see “Hot and violent”).

But the imperative doesn’t end if warming tips past 2 ˚C. It only makes it more urgent to do everything we can to contain the looming threats, limit the damage, and shift to a sustainable system as fast as possible.

“If you miss 2050,” Schrag says, “you still have 2060, 2070, and 2080.”

Press link for more: Technology Review


Northam Solar Farm “A Game Changer” #auspol #qldpol #wapol #StopAdani

Northam Solar Farm set to be a ‘game changer’, says Carnegie Clean Energy boss

Written by Lynn Grierson, March 12th, 2018, 04:30PM

NORTHAM Solar Farm is scheduled to open mid-year in a ‘partnership first’ between Perth Noongar Foundation, Indigenous Business Australia and Carnegie Clean Energy.

Carnegie chief executive Michael Ottaviano says the model is a potential game changer for WA communities.

If all goes according to plan, the 10MW solar farm in Northam will be a template for local people and industry to utilise a renewable resource in a future where everyone is a winner.

Carnegie Clean Energy retains a 50 per cent stake in a deal with co-equity investors Indigenous Business Australia (IBA) and the Perth Noongar Foundation to deliver electricity to about 3000 households throughout the 25-year project.

Carnegie chief executive officer and managing director Michael Ottaviano is hopeful that where his company leads, others will follow.

“What we’re really doing is taking engagement a step further and rather than just engaging the community at our whim, it’s about getting Indigenous people around the table to own and co-own projects; that’s never been done before,” he said.

“I think Northam could be a template for other companies to adopt in the sense that this is a way of not just community engagement where it’s the company coming in and dictating all terms, this is about working directly with the local community and with indigenous capital and owners to collectively drive change in these communities.”

The renewable energy project is breaking new ground for Carnegie and its indigenous partners.

“Part of our partnership agreement with IBA and the Perth Noongar Foundation talks about a whole range of requirements and obligations and part of that is engaging and employing indigenous people, contractors and businesses,” he said.

Artist’s impression of the Northam Solar Power Station.

“It’s a potential game changer; if you can give indigenous people equity in these projects then you’re creating an income stream for these groups, in this case for at least 25 years.”

Dr Ottaviano said people in Northam embraced the idea of a solar farm in their neighbourhood.

About 30 people will be employed during the construction phase and for the most part, they will be electricians and mechanical fitters.

“Solar farms don’t need much in the way of maintenance and you don’t even have to clean the panels.”

“The design element is being done at our Belmont HQ; it’s this sort of project that keeps us here in WA where we’re the biggest renewable energy employer.”

Carnegie is also on track to build the first microgrid in WA for the naval base on Garden Island.

The clean energy provider specialises in standalone solar projects, wave energy and hybrid – a complex energy mix, which Dr Ottaviano said is where the world is going.

Until recently, a 10MW farm would have been considered large, but on the east coast of Australia solar farms up to 10 times that size are under way.

“Globally now we’re seeing projects approaching 500MW and 1000MW farms, which are really extraordinary and incredibly disruptive for the power sector,” he said.

“To put it into perspective, a typical coal power station might be between 200MW and 500MW and now we’re seeing solar plants at that order of magnitude.”

He said that unlike other states, WA and NSW do not have renewable energy targets at a time when more consumers are putting solar panels on their roof to generate their own power.

“Australia has gone from having no roof top solar ostensibly five years ago to having more roof top solar per capita than any country in the world,” Dr Ottaviano said.

“We’ve got the best combination of solar, wind and wave; really we should be leading the world.”

He listed Denmark as among the top European countries approaching 100 per cent renewable power.

“Australia tends to be a technology taker rather than a technology maker, which is a shame because we’ve got great engineering skills and the world’s best renewable resources, but we consistently fail to see it as an opportunity,” he said.

“We sort of revert back to what is safe and conservative and easy, which is dig up the coal and gas and burn it.”

Press link for more: Community News

China’s War on Pollution #StopAdani #auspol #qldpol

China’s War on Pollution Will Change the World

March 9, 2018

China is cracking down on pollution like never before, with new green policies so hard-hitting and extensive they can be felt across the world, transforming everything from electric vehicle demand to commodities markets.

Four decades of breakneck economic growth turned China into the world’s biggest carbon emitter. But now the government is trying to change that without damaging the economy—and perhaps even use its green policies to become a leader in technological innovation.

So, as lawmakers attend the annual National People’s Congress, here’s a look at the impact of the environmental focus, at home and abroad.

[Chart: PM 2.5 concentration estimate (µg/m3) as of January 31, 2018. Source: Berkeley Earth]

China’s air pollution is so extreme that in 2015, independent research group Berkeley Earth estimated it contributed to 1.6 million deaths per year in the country.

The smog is heaviest in northern industrial provinces such as Shanxi, the dominant coal mining region, and steel-producing Hebei. Emissions there contribute to the planet’s largest mass of PM 2.5 air pollution—the particles which pose the greatest health risks because they can become lodged in the lungs. It can stretch from Mongolia to the Yellow Sea and often as far as South Korea.

Leaders at the congress said they will raise spending to curb pollution by 19 percent over the previous year to 40.5 billion yuan ($6.4 billion) and aim to cut sulfur dioxide and nitrogen oxide emissions by 3 percent. They said heavy air pollution days in key cities are down 50 percent in five years.

[Chart: China’s carbon dioxide emissions, in tons, annotated at December 2001, when China joined the WTO. Source: BP Statistical Review of World Energy]

The country had become the world’s No.1 carbon dioxide emitter as it rose to dominate global exports, a process which began several decades ago but got its biggest lift with World Trade Organization entry in 2001. Emissions have started to fall again.

Bigger Than Tesla

The government’s war on air pollution fits neatly with another goal: domination of the global electric-vehicle industry.

Elon Musk’s Tesla Inc. might be the best-known name, but China has been the global leader in EV sales since 2015, and is aiming for 7 million annual sales by 2025.

To get there, it’s subsidizing manufacturers and tightening regulation around traditional fossil-fuel powered cars. Beneficiaries include BYD Co., a Warren Buffett-backed carmaker that soared 67 percent last year and sold more cars than Tesla. Goldman Sachs Group Inc. has a buy rating on shares of Geely Automobile Holdings Ltd.

Clean Energy Frontiers

Worldwide, solar panel prices are plunging—allowing a faster shift away from carbon—thanks to the sheer scale of China’s clean-energy investment. It’s spending more than twice as much as the U.S. Two-thirds of solar panels are produced in China, BNEF estimates, and it’s home to global leaders, including JinkoSolar Holding Co. and Yingli Green Energy Holding Co.

But China isn’t stopping there. As well as wind and solar, it’s exploring frontier clean energy technologies like hydrogen as an alternative to coal.

Follow the Money

The trend towards clean energy is poised to keep gathering steam worldwide. BNEF projects global investment in new power generation capacity will exceed $10 trillion between 2017 and 2040. Of this, about 72 percent is projected to go toward renewable energy, roughly evenly split between wind and solar.

The Third Industrial Revolution

China’s efforts to cut excess industrial capacity overlap with the imperative to clean up the environment. Combined, those forces have had a hefty impact on commodity prices. Coal, steel, and aluminum prices soared last year as factories shut and mines closed. Under the weight of new rules on pollutant discharge, paper prices did the same. Some markets have recovered somewhat since then, some haven’t.

[Charts: prices of thermal coal (per metric ton), steel rebar (per metric ton), aluminum (per metric ton) and paper products (producer price index). Source: data compiled by Bloomberg, China Coal Resource, National Bureau of Statistics]

Clearer Skies

Five years ago, Beijing’s “airpocalypse” unleashed criticism of the government so searing that even Chinese state media joined in. Last year, the capital’s average daily concentration of PM2.5 particles was almost a third lower than in 2015, compared with declines of about a tenth for some other major cities.

The turnaround isn’t just limited to improving air quality. China has stopped accepting shiploads of other countries’ plastic and paper trash, a response to public concern over pollution and a decreased need for scrap materials.

As Xi pushes a greener approach, officials at every level of government are working to put his words into action. The government has set up a special police force, and polluting factories have been closed. Officials obediently banned coal, sending natural gas sales surging, before backtracking after supply shortfalls left many areas in the cold.

[Charts: Beijing’s 30-day average air pollution levels (PM 2.5 concentration, µg/m3) and China’s LNG imports. Source: U.S. Department of State Air Quality Monitoring Program, China Customs]

While smog was long excused as the inevitable byproduct of rising wealth, there’s no sign so far that the cleanup is derailing the country’s economy. Growth last year accelerated to 6.9 percent—the first uptick in seven years—and remains a crucial prop for global expansion.

What’s more, China sees high-tech industries like electric cars and solar panels as its chance to lead the world, setting standards and cornering markets as they begin to build momentum. But turning around carbon emissions at home is one thing. Winning over the world’s consumers to become a tech superpower is a different goal entirely.

Press link for more:

Third Industrial Revolution Will Create a Green Economy Jeremy Rifkin #StopAdani #auspol

This essay is the first in a four-part series on the theme, “The Third Industrial Revolution.”

The global economy is slowing, productivity is waning in every region of the world and unemployment remains stubbornly high in every country.

At the same time, economic inequality between the rich and the poor is at the highest point in human history.

In 2010 the combined wealth of the 388 richest people in the world equaled the combined wealth of the poorest half of the human race.

By 2014 the wealth of the 80 richest individuals in the world equaled the combined wealth of the poorest half of the human race.

This dire economic reality is now compounded by the rapid acceleration of climate change brought on by the increasing emissions of industry-induced global warming gases.

Climate scientists report that the global atmospheric concentration of carbon dioxide, which ranged from about 180 to 300 parts per million for the past 650,000 years, has risen from 280 ppm just before the outset of the industrial era to 400 ppm in 2013.

The atmospheric concentrations of methane and nitrous oxide, the other two powerful global warming gases, are showing similar steep trajectories.

At the Copenhagen global climate summit in December 2009, the European Union proposed that the nations of the world limit the rise in Earth’s temperature to 3.6 degrees Fahrenheit (2 degrees Celsius).

Even a 3.6 degree rise, however, would take us back to the temperature on Earth several million years ago, in the Pliocene epoch, with devastating consequences to ecosystems and human life.

The EU proposal went ignored.

Now, six years later, the sharp rise in the use of carbon-based fuels has pushed up the atmospheric levels of carbon dioxide far more quickly than earlier models had projected, making it likely that the temperature on Earth will rush past the 3.6 degree target and could top off at 8.6 degrees Fahrenheit (4.8 degrees Celsius) by 2100 — temperatures not seen on Earth for millions of years. (Remember, anatomically modern human beings — the youngest species — have only inhabited the planet for 195,000 years or so.)

What makes these dramatic spikes in the Earth’s temperature so terrifying is that the increase in heat radically shifts the planet’s hydrological cycle.

Ours is a watery planet.

The Earth’s diverse ecosystems have evolved over geological time in direct relationship to precipitation patterns. Each rise in temperature of 1 degree Celsius results in a 7 percent increase in the moisture-holding capacity of the atmosphere. This causes a radical change in the way water is distributed, with more intense precipitation but a reduction in duration and frequency.
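
That 7 percent figure comes from the Clausius-Clapeyron relation, which sets how fast the saturation vapor pressure of water grows with temperature. A minimal statement, using standard physical constants that are not given in the essay (L, the latent heat of vaporization, and R_v, the gas constant for water vapor):

$$\frac{1}{e_s}\frac{de_s}{dT} = \frac{L}{R_v T^2} \approx \frac{2.5 \times 10^6\ \mathrm{J\,kg^{-1}}}{(461\ \mathrm{J\,kg^{-1}\,K^{-1}})\,(288\ \mathrm{K})^2} \approx 0.066\ \mathrm{K^{-1}},$$

or roughly 7 percent more moisture-holding capacity per degree Celsius near Earth's mean surface temperature.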

The consequences are already being felt in ecosystems around the world.

We are experiencing more bitter winter snows, more dramatic spring storms and floods, more prolonged summer droughts, more wildfires, more intense hurricanes (category 3, 4 and 5), a melting of the ice caps on the great mountain ranges and a rise in sea levels.

The Earth’s ecosystems cannot readjust to a disruptive change in the planet’s water cycle in such a brief moment in time and are under increasing stress, with some on the verge of collapse. The destabilization of ecosystem dynamics around the world has now pushed the biosphere into the sixth extinction event of the past 450 million years of life on Earth. In each of the five previous extinctions, Earth’s climate reached a critical tipping point, throwing the ecosystems into a positive feedback loop, leading to a quick wipeout of the planet’s biodiversity.

On average, it took upward of 10 million years to recover the lost biodiversity.

Biologists tell us that we could see the extinction of half the Earth’s species by the end of the current century, resulting in a barren new era that could last for millions of years. James Hansen, the former head of the NASA Goddard Institute for Space Studies, forecasts a rise in the Earth’s temperature of 4 degrees Celsius between now and the turn of the century — and with it, the end of human civilization as we’ve come to know it. The only hope, according to Hansen, is to reduce the current concentration of carbon in the atmosphere from 400 ppm to 350 ppm or less.

Typhoon Haiyan survivors make camp in the ruins of their neighborhood on the outskirts of Tacloban, central Philippines. (AP Photo/David Guttenfelder, File)

Now, a new economic paradigm is emerging that is going to dramatically change the way we organize economic life on the planet.

The European Union is embarking on a bold new course to create a high-tech 21st century smart green digital economy, making Europe potentially the most productive commercial space in the world and the most ecologically sustainable society on Earth.

The plan is called Digital Europe.

The EU vision of a green digital economy is now being embraced by China and other developing nations around the world.

The digitalization of Europe involves much more than providing universal broadband, free Wi-Fi and a flow of big data.

The digital economy will revolutionize every commercial sector, disrupt the workings of virtually every industry, bring with it unprecedented new economic opportunities, put millions of people back to work, democratize economic life and create a more sustainable low-carbon society to mitigate climate change.

Equally important, this new economic narrative is being accompanied by a new biosphere consciousness, as the human race begins to perceive the Earth as its indivisible community.

We are each beginning to take on our responsibilities as stewards of the planetary ecosystems that sustain all of life.

To grasp the enormity of the economic change taking place, we need to understand the technological forces that have given rise to new economic systems throughout history. Every great economic paradigm requires three elements, each of which interacts with the other to enable the system to operate as a whole: new communication technologies to more efficiently manage economic activity; new sources of energy to more efficiently power economic activity; and new modes of transportation to more efficiently move economic activity.

In the 19th century, steam-powered printing and the telegraph, abundant coal and locomotives on national rail systems gave rise to the First Industrial Revolution. In the 20th century, centralized electricity, the telephone, radio and television, cheap oil and internal combustion vehicles on national road systems converged to create an infrastructure for the Second Industrial Revolution.

The Third Industrial Revolution

Today, Europe is laying the ground work for the Third Industrial Revolution. The digitalized communication Internet is converging with a digitalized, renewable “Energy Internet” and a digitalized, automated “Transportation and Logistics Internet” to create a super “Internet of Things” infrastructure. In the Internet of Things era, sensors will be embedded into every device and appliance, allowing them to communicate with each other and Internet users, providing up-to-the-moment data on the managing, powering and moving of economic activity in a smart Digital Europe. Currently, billions of sensors are attached to resource flows, warehouses, road systems, factory production lines, the electricity transmission grid, offices, homes, stores and vehicles, continually monitoring their status and performance and feeding big data back to the Communication Internet, Energy Internet and Transportation and Logistics Internet. By 2030, it is estimated there will be more than 100 trillion sensors connecting the human and natural environment in a global distributed intelligent network. For the first time in history, the entire human race can collaborate directly with one another, democratizing economic life.

The EMC earth station at Raisting in Germany provides satellite-based communications for aid organizations, the United Nations and emerging markets. (Photo by Sean Gallup/Getty Images)

The digitalization of communication, energy and transportation also raises risks and challenges, not the least of which are guaranteeing network neutrality, preventing the creation of new corporate monopolies, protecting personal privacy, ensuring data security and thwarting cybercrime and cyber terrorism. The European Commission has already begun to address these issues by establishing the broad principle that “privacy, data protection, and information security are complementary requirements for Internet of Things services.”

In this expanded digital economy, private enterprises connected to the Internet of Things can use Big Data and analytics to develop algorithms that speed efficiency, increase productivity and dramatically lower the marginal cost of producing and distributing goods and services, making European businesses more competitive in an emerging post-carbon global marketplace. (Marginal cost is the cost of producing an additional unit of a good or service, after fixed costs have been absorbed.)

The marginal cost of some goods and services in a Digital Europe will even approach zero, allowing millions of prosumers connected to the Internet of Things to produce and exchange things with one another for nearly free in the growing Sharing Economy. Already, a digital generation is producing and sharing music, videos, news blogs, social media, free e-books, massive open online college courses and other virtual goods at near zero marginal cost. The near zero marginal cost phenomenon brought the music industry to its knees, shook the television industry, forced newspapers and magazines out of business and crippled the book publishing market.

While many traditional industries suffered, the zero marginal cost phenomenon also gave rise to a spate of new entrepreneurial enterprises including Google, Facebook, Twitter, YouTube and thousands of other Internet companies, which reaped profits by creating new applications and establishing the networks that allow the Sharing Economy to flourish.

Economists acknowledge the powerful impact the near zero marginal cost has had on the information goods industries. But, until recently, they have argued that the productivity advances of the digital economy would not pass across the firewall from the virtual world to the brick-and-mortar economy of energy, and physical goods and services. That firewall has now been breached. The evolving Internet of Things will allow conventional businesses enterprises, as well as millions of prosumers, to make and distribute their own renewable energy, use driverless electric and fuel-cell vehicles in automated car-sharing services and manufacture an increasing array of 3-D-printed physical products and other goods at very low marginal cost in the market exchange economy, or at near zero marginal cost in the Sharing Economy, just as they now do with information goods.

Jeremy Rifkin is the author of “The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism.” Rifkin is an advisor to the European Union and to heads of state around the world, and is the president of the Foundation on Economic Trends in Washington, D.C.

For more information, please visit The Zero Marginal Cost Society.

Press link for more: Huffington Post

The third industrial revolution. #StopAdani #auspol

The third industrial revolution

The digitisation of manufacturing will transform the way goods are made—and change the politics of jobs too

THE first industrial revolution began in Britain in the late 18th century, with the mechanisation of the textile industry. Tasks previously done laboriously by hand in hundreds of weavers’ cottages were brought together in a single cotton mill, and the factory was born.

The second industrial revolution came in the early 20th century, when Henry Ford mastered the moving assembly line and ushered in the age of mass production.

The first two industrial revolutions made people richer and more urban.

Now a third revolution is under way. Manufacturing is going digital.

As this week’s special report argues, this could change not just business, but much else besides.

A number of remarkable technologies are converging: clever software, novel materials, more dexterous robots, new processes (notably three-dimensional printing) and a whole range of web-based services.

The factory of the past was based on cranking out zillions of identical products: Ford famously said that car-buyers could have any colour they liked, as long as it was black.

But the cost of producing much smaller batches of a wider variety, with each product tailored precisely to each customer’s whims, is falling.

The factory of the future will focus on mass customisation—and may look more like those weavers’ cottages than Ford’s assembly line.

Towards a third dimension

The old way of making things involved taking lots of parts and screwing or welding them together.

Now a product can be designed on a computer and “printed” on a 3D printer, which creates a solid object by building up successive layers of material.

The digital design can be tweaked with a few mouseclicks.

The 3D printer can run unattended, and can make many things which are too complex for a traditional factory to handle.

In time, these amazing machines may be able to make almost anything, anywhere—from your garage to an African village.

The applications of 3D printing are especially mind-boggling.

Already, hearing aids and high-tech parts of military jets are being printed in customised shapes.

The geography of supply chains will change.

An engineer working in the middle of a desert who finds he lacks a certain tool no longer has to have it delivered from the nearest city.

He can simply download the design and print it.

The days when projects ground to a halt for want of a piece of kit, or when customers complained that they could no longer find spare parts for things they had bought, will one day seem quaint.

Other changes are nearly as momentous.

New materials are lighter, stronger and more durable than the old ones.

Carbon fibre is replacing steel and aluminium in products ranging from aeroplanes to mountain bikes.

New techniques let engineers shape objects at a tiny scale.

Nanotechnology is giving products enhanced features, such as bandages that help heal cuts, engines that run more efficiently and crockery that cleans more easily. Genetically engineered viruses are being developed to make items such as batteries. And with the internet allowing ever more designers to collaborate on new products, the barriers to entry are falling. Ford needed heaps of capital to build his colossal River Rouge factory; his modern equivalent can start with little besides a laptop and a hunger to invent.

Like all revolutions, this one will be disruptive.

Digital technology has already rocked the media and retailing industries, just as cotton mills crushed hand looms and the Model T put farriers out of work. Many people will look at the factories of the future and shudder.

They will not be full of grimy machines manned by men in oily overalls. Many will be squeaky clean—and almost deserted.

Some carmakers already produce twice as many vehicles per employee as they did only a decade or so ago.

Most jobs will not be on the factory floor but in the offices nearby, which will be full of designers, engineers, IT specialists, logistics experts, marketing staff and other professionals. The manufacturing jobs of the future will require more skills.

Many dull, repetitive tasks will become obsolete: you no longer need riveters when a product has no rivets.

The revolution will affect not only how things are made, but where. Factories used to move to low-wage countries to curb labour costs. But labour costs are growing less and less important: a $499 first-generation iPad included only about $33 of manufacturing labour, of which the final assembly in China accounted for just $8. Offshore production is increasingly moving back to rich countries not because Chinese wages are rising, but because companies now want to be closer to their customers so that they can respond more quickly to changes in demand. And some products are so sophisticated that it helps to have the people who design them and the people who make them in the same place.

The Boston Consulting Group reckons that in areas such as transport, computers, fabricated metals and machinery, 10-30% of the goods that America now imports from China could be made at home by 2020, boosting American output by $20 billion-55 billion a year.

The shock of the new

Consumers will have little difficulty adapting to the new age of better products, swiftly delivered.

Governments, however, may find it harder.

Their instinct is to protect industries and companies that already exist, not the upstarts that would destroy them.

They shower old factories with subsidies and bully bosses who want to move production abroad.

They spend billions backing the new technologies which they, in their wisdom, think will prevail. And they cling to a romantic belief that manufacturing is superior to services, let alone finance.

None of this makes sense.

The lines between manufacturing and services are blurring.

Rolls-Royce no longer sells jet engines; it sells the hours that each engine is actually thrusting an aeroplane through the sky.

Governments have always been lousy at picking winners, and they are likely to become more so, as legions of entrepreneurs and tinkerers swap designs online, turn them into products at home and market them globally from a garage.

As the revolution rages, governments should stick to the basics: better schools for a skilled workforce, clear rules and a level playing field for enterprises of all kinds.

Leave the rest to the revolutionaries.

Press link for more:

Time for #factcheck on the @CairnsPost support for coal. #auspol #qldpol #media #climatechange

This editorial written by Julian Tomlinson @DamTom79 editor of the Townsville Bulletin @tsv_bulletin appears in today’s Cairns Post @TheCairnsPost.

Julian completely ignores climate change.

On a recent 4Corners program, it was said that Cairns would become “unliveable” due to rising sea levels.

Recent flooding from sea level rise in the Torres Strait

Cairns is a city that relies on tourism. The tourism industry employs far more people than the fossil fuel industries.

We need a healthy reef, yet recently global warming has caused back-to-back bleaching of the Great Barrier Reef.

For a Cairns newspaper to advocate for more coal mines while ignoring climate change is beyond stupid; it borders on treason.

Today 20,000 scientists have signed a letter saying we’re heading towards a catastrophic future unless we move to a sustainable economy.

The world is moving away from coal as renewable energy becomes cheaper and more reliable.

In fact, India has no need for new coal (Indian Express).

It’s cruel for journalists who should know better to hold out hope for workers hoping to get work at the Adani mine. Even if it were built, there would be only a handful of jobs, as the mine will be fully automated (Independent Australia).

The media is an integral part of a good democracy. It is the failure of Australia’s mainstream media to report what is happening around the world on climate change and renewable energy that has allowed Australia to fall so far behind the rest of the world on climate action.

Our children and future generations will pay a huge price for our ignorance and lack of action.

The vast majority of Australians do not want new coal mines!

It’s time for mainstream media and our politicians to wake up.

Future generations of Australians will be horrified by your inability to accept change.

A recent survey: Energy Matters

More than 100 cities produce more than 70% of electricity from renewables. #auspol #StopAdani

100+ cities Produce More than 70% of Electricity from Renewables – CDP | UNFCCC

The transition to clean, renewable energy is a critical component of meeting Paris Climate Change Agreement goals, and cities around the world are increasingly taking up the challenge.

According to data published by the CDP, more cities than ever are reporting that they are powered by renewable electricity.

The global environmental impact non-profit CDP holds information from over 570 of the world’s cities and names over 100 now getting at least 70% of their electricity from renewable sources such as hydro, geothermal, solar and wind.

The list includes large cities such as Auckland (New Zealand); Nairobi (Kenya); Oslo (Norway); Seattle (USA) and Vancouver (Canada), and is more than double the 40 cities that reported being powered by at least 70% clean energy in 2015.

CDP’s analysis comes on the same day the UK100 network of local government leaders announce that over 80 UK towns and cities have committed to 100% clean energy by 2050, including Manchester, Birmingham, Newcastle, Glasgow and 16 London boroughs.

According to the World Economic Forum, unsubsidized renewables were the cheapest source of electricity in 30 countries in 2017, with renewables predicted to be consistently more cost effective than fossil fuels globally by 2020.

The new data has been released ahead of the Intergovernmental Panel on Climate Change (IPCC) conference in Edmonton, Canada on 5th March, when city government and science leaders will meet on the role of cities in tackling climate change.

Cities named by CDP as already powered by 100% renewable electricity include:

Burlington, Vermont’s largest city, now obtains 100% of its electricity from wind, solar, hydro, and biomass. The city has its own utility and citywide grid. In September 2014 the local community approved the city’s purchase of its ‘Winooski One’ Hydroelectric Facility.

“Burlington, Vermont is proud to have been the first city in the United States to source 100 percent of our power from renewable generation. Through our diverse mix of biomass, hydro, wind, and solar, we have seen first-hand that renewable energy boosts our local economy and creates a healthier place to work, live, and raise a family. We encourage other cities around the globe to follow our innovative path as we all work toward a more sustainable energy future,” added Mayor Miro Weinberger of Burlington.

Reykjavik, Iceland sources all electricity from hydropower and geothermal, and is now working to make all cars and public transit fossil-free by 2040. Iceland has almost entirely transitioned to clean energy for power and household heating.

Basel, Switzerland is 100% renewable powered by its own energy supply company. Most electricity comes from hydropower and 10% from wind. Advocating clear political vision and will, in May 2017 Switzerland voted to phase out nuclear power in favor of renewable energy.

CDP’s 2017 data highlights how cities are stepping up action on climate change with a sharp rise in environmental reporting, emissions reduction targets and climate action plans since 2015, following the ground-breaking Paris Agreement to limit global warming to below 2 degrees.

There is a growing momentum of the renewable energy cities movement beyond the UK, with cities around the world now aiming to switch from fossil fuels to 100% renewable energy by 2050.

In the United States, 58 cities and towns have now committed to transition to 100% clean, renewable energy, including big cities like Atlanta (Georgia) and San Diego (California). Earlier this month, U.S. municipalities Denton (Texas) and St. Louis Park (Minnesota), became the latest communities to establish 100% renewable energy targets. In addition to these recent pledges, CDP data shows a further 23 global cities targeting 100% renewable energy.

Much of the drive behind city climate action and reporting comes from the 7,000+ mayors signed up to The Global Covenant of Mayors for Climate and Energy who have pledged to act on climate change.

Kyra Appleby, Director of Cities, CDP said: “Cities are responsible for 70% of energy-related CO2 emissions and there is immense potential for them to lead on building a sustainable economy. Reassuringly, our data shows much commitment and ambition. Cities not only want to shift to renewable energy but, most importantly – they can. We urge all cities to disclose to us, work together to meet the goals of the Paris Agreement and prioritize the development of ambitious renewable energy procurement strategies. The time to act is now.”

Showing a diverse mix of energy sources, 275 cities are now reporting the use of hydropower, with 189 generating electricity from wind and 184 using solar photovoltaics. An additional 164 use biomass and 65 geothermal.

CDP reports that cities are currently instigating renewable energy developments valued at US$2.3 billion, across nearly 150 projects. This forms part of a wider shift by cities to develop 1,000 clean infrastructure projects, such as electric transport and energy efficiency, worth over US$52 billion.

Read the relevant CDP press release here

For a full view of cities generating electricity from renewables, visit the CDP’s list of world renewable energy cities

Press link for more: COP23.UNFCCC

The Two-Degree Delusion #ClimateChange #auspol #qldpol #StopAdani

The Two-Degree Delusion

The Dangers of an Unrealistic Climate Change Target

February 8, 2018

Global carbon emissions rose again in 2017, disappointing hopes that the previous three years of near zero growth marked an inflection point in the fight against climate change.

Advocates of renewable energy had attributed flat emissions to the falling cost of solar panels.

Energy efficiency devotees had seen in the pause proof that economic activity had been decoupled from energy consumption.

Advocates of fossil fuel divestment had posited that the carbon bubble had finally burst.

Analysts who had attributed the pause to slower economic growth in a number of parts of the world, especially China, were closer to the truth.

The underlying fundamentals of the energy economy, after all, remained mostly unchanged—there had been no step change in either the energy efficiency of the global economy or the share of energy production that clean energy accounted for. And sure enough, as growth picked up, emissions started to tick back up again as well.

Even during the pause, it was clear that the world wasn’t making much progress toward avoiding significant future climate change.

To significantly alter the trajectory of sea level changes or most other climate impacts in this century or the next, emissions would not just have to peak; they would have to fall precipitously.

Yet what progress the world has made to cut global emissions has been, under even the most generous assumptions, incremental.

But at the latest climate talks in Bonn last fall, diplomats once again ratified a long-standing international target of limiting warming to two degrees Celsius above preindustrial levels. This despite being unable to commit to much beyond what was already agreed at the Paris meeting two years ago, when negotiators reached a nominal agreement on nonbinding Intended Nationally Determined Contributions, which would result in temperatures surpassing three degrees above preindustrial levels before the end of this century.

Forty years after it was first proposed, the two-degree target continues to maintain a talismanic hold over global efforts to address climate change, despite the fact that virtually all sober analyses conclude that the target is now unobtainable.

Some advocates still insist that with sufficient political will, the target can be met.

Others recognize that although the goal is practically unachievable, it represents an aspiration that might motivate the world to reduce emissions further and faster than it would otherwise.

For still others, the target remains within reach if everyone gets serious about removing carbon from the atmosphere or hacking the atmosphere in order to buy more time.

But it is worth considering the consequences of continuing to pursue a goal that is no longer obtainable.

Some significant level of future climate impact is probably unavoidable.

Sustaining the fiction that the two-degree target remains viable risks leaving the world ill prepared to mitigate or manage the consequences.


My uncle, the Yale University economist William Nordhaus, is widely credited with being the first person to propose that climate policy should strive to limit anthropogenic global warming to two degrees above preindustrial temperatures.

He didn’t arrive at that conclusion through any sort of elaborate climate modeling or cost-benefit analysis.

Rather, he considered the very limited evidence of long-term climate variance available at that time and concluded that a two-degree increase would take global temperatures outside the range experienced by human societies for the previous several thousand years and probably much longer.

The standard was, by his own admission, arbitrary.

In the decades that followed, the international community formalized his target through a series of UN conferences, assessments, and negotiations.

Climate researchers, meanwhile, have backfilled the target with science, some of it compelling.

It does indeed appear that the earth is already hotter than it has been in the last several hundred thousand years, with temperatures likely to rise substantially more through this century and well beyond.

But limiting global temperatures below two degrees provides no guarantee that the world will avoid catastrophe, nor does exceeding that threshold assure it.

No one knows with much precision what the relationship will be between global temperature and the impact of climate change at local and regional levels.

Nor do we have a particularly good handle on the capability of human societies to adapt to those impacts.

Limiting global temperatures below two degrees provides no guarantee that the world will avoid catastrophe, nor does exceeding that threshold assure it.

In reality, most of the climate risks that we understand reasonably well are linear, meaning that lower emissions bring a lower global temperature increase, which in turn brings lower risk.

That is the case for impacts such as sea level rise, agricultural yields, rainfall, and drought.

Stabilizing atmospheric concentrations at 450 parts per million brings less risk than stabilizing at 500, 500 brings less risk than 550, and so on.

The world isn’t saved should we limit atmospheric concentrations to 450 parts per million, nor is it lost should concentrations surpass that threshold.

There are a range of potential nonlinear tipping points that could also bring catastrophic climate impacts.

Many climate scientists and advocates argue that the risks associated with triggering these impacts are so great that it is better to take a strict precautionary approach to dramatically cut emissions.

But there are enormous uncertainties about where those tipping points actually are.

The precautionary principle holds equally well at one degree of warming, a threshold that we have already surpassed; one and a half degrees, which we will soon surpass; or, for that matter, three degrees.

Such calculations are further complicated by the substantial lag between when we emit carbon and when we experience the climate impacts of doing so: because of the time lag, and because of the substantial amount of carbon already emitted (atmospheric concentrations of carbon today stand at 407 parts per million, versus 275 prior to the start of the Industrial Revolution), even an extreme precautionary approach that ended all greenhouse gas emissions immediately would not much affect the trajectory of global temperatures or climate impacts until late in this century at the earliest.

Projections of sea level rise, for instance, don’t really diverge in high-emissions versus low-emissions scenarios until late in this century, and even then not by very much.

It is not until modelers project into the twenty-second century that large differences begin to emerge.

The same is true of most other climate impacts, at least as far as we understand them.

Many advocates for climate action suggest that we are already experiencing the impacts of anthropogenic climate change in the form of more extreme weather and natural disasters.

Insofar as this is true—and the effect of climate change on present-day weather disasters is highly contested—there is not much we can do to mitigate it in the coming decades.


Over the last two decades, discussions of climate risk have been strongly influenced by concerns about moral hazard.

The suggestion that human societies might successfully adapt to climate change, the argument goes, risks undermining commitments to cut emissions sufficiently to avoid those risks.

But moral hazard runs the other way as well.

On a planet that is almost certainly going to be much hotter even if the world cuts emissions rapidly, the continuing insistence that human societies might cut emissions rapidly enough to avoid dangerous climate change risks undermining the urgency to adapt.

Adaptation brings difficult tradeoffs that many climate advocates would prefer to ignore.

Individual and societal wealth, infrastructure, mobility, and economic integration are the primary determinants of how vulnerable human societies are to climate disasters.

A natural disaster of the same magnitude will generally bring dramatically greater suffering in a poor country than in a rich one.

For this reason, poor nations will bear the brunt of climate impacts.

But by the same token, the faster those nations develop, the more resilient they will be to climate change.

Development in most parts of the world, however, still entails burning more fossil fuels—in most cases, a lot more.

Most climate advocates have accepted that some form of adaptation will be a necessity for human societies over the course of this century.

But many refuse to acknowledge that much of that adjustment will need to be powered by fossil fuels.

Hard infrastructure—modern housing, transportation networks, and the like—is what makes people resilient to climate and other natural disasters.

That sort of infrastructure requires steel and concrete.

And there are presently few economically viable ways to produce steel or concrete without fossil fuels.

The two-degree threshold, and the various carbon budgets and emissions reduction targets that accompany it, has provided the justification for prohibitions at the World Bank and other international development institutions on finance for fossil fuel development.

Given how much climate change is likely already built into our future owing to past emissions and how long it takes for emissions reductions to mitigate climate impacts, those sorts of policies will almost certainly increase exposure to climate hazards for many people in developing economies.

Photo: Patricia Espinosa, executive secretary of the United Nations Framework Convention on Climate Change; French President Emmanuel Macron; COP23 President Prime Minister Frank Bainimarama of Fiji; German Chancellor Angela Merkel; and UN Secretary-General Antonio Guterres pose for a photo during the COP23 UN Climate Change Conference in Bonn, Germany, November 2017. (Wolfgang Rattay / REUTERS)


Continued devotion to the two-degree target has also undermined carbon-cutting efforts.

In theory, cutting emissions deeply enough by midcentury to limit warming to two degrees would require deploying zero-carbon energy technologies today at a historically unprecedented scale.

That would seem to take important drivers of incremental decarbonization, such as the transition from coal to gas in the United States and many other parts of the world, off the table.

Burning natural gas produces about half as much carbon per unit of energy as burning coal. But gas alone can’t decarbonize the power sector fast enough to hit the two-degree target by 2050.

For this reason, most climate advocates are at best indifferent to natural gas and are more often opposed, even though the switch from coal to natural gas has been the largest source of emissions reductions in the United States for over a decade, as it was in the United Kingdom in the early 1990s.
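
The “half the carbon” figure above is easy to sanity-check. A rough sketch using IPCC default fuel emission factors and assumed plant efficiencies; none of these specific numbers appear in the article:

```python
GJ_PER_MWH = 3.6

# IPCC default fuel emission factors, kg CO2 per GJ of fuel energy
EF_COAL = 94.6
EF_GAS = 56.1

print(f"gas vs coal, per unit of heat: {EF_GAS / EF_COAL:.0%}")  # ~59% -- roughly half

# The gap widens per unit of electricity, because combined-cycle gas plants
# convert fuel to power more efficiently than typical coal plants.
COAL_EFFICIENCY = 0.38   # assumed typical coal plant efficiency
CCGT_EFFICIENCY = 0.55   # assumed combined-cycle gas plant efficiency

coal_kg_per_mwh = EF_COAL * GJ_PER_MWH / COAL_EFFICIENCY
gas_kg_per_mwh = EF_GAS * GJ_PER_MWH / CCGT_EFFICIENCY
print(f"coal: ~{coal_kg_per_mwh:.0f} kg CO2/MWh, gas: ~{gas_kg_per_mwh:.0f} kg CO2/MWh")
# roughly 900 vs 370 kg CO2/MWh of electricity
```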

The two-degree target has also hobbled support for developing better clean energy technologies.

Because next-generation technologies such as advanced nuclear reactors, advanced geothermal, and carbon capture capabilities won’t be ready for large-scale commercialization for at least another decade or two, they will arrive too late to contribute much to two-degree stabilization scenarios.

In turn, many prominent climate advocates have long argued that the only climate action worthy of the name entails deploying zero-carbon technologies that are commercially available today.

Yet there is little reason to think that existing zero-carbon technologies are up to the job.

To be sure, some models do claim that current renewable energy technologies are capable of powering the electrical grid and much beyond.

But strong renewables growth in various parts of the world appears to follow a classic S-curve: after a period of strong initial adoption, market share on electrical grids stalls at around 20 percent or less of total generation, because the value of intermittent sources of energy such as wind and solar declines precipitously as their share of electricity production rises.
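
The value-deflation mechanism behind that plateau can be shown with a deliberately stylized toy model: build-out continues only while the next increment of wind or solar earns more than it costs, and its earnings fall as its share rises. Every parameter below is invented for illustration:

```python
# Stylized illustration of why VRE growth can stall: the market value of an
# intermittent source falls as its share rises, because it increasingly
# produces at hours when prices are already depressed by its own output.
def value_factor(share: float, deflation: float = 2.0) -> float:
    """Relative market value of the next unit of VRE at a given grid share."""
    return max(0.0, 1.0 - deflation * share)

relative_cost = 0.6   # cost of new VRE relative to the average wholesale price

share_pct = 0
while value_factor(share_pct / 100) > relative_cost:
    share_pct += 1    # build-out continues while the next unit pays for itself

print(f"expansion stalls near {share_pct}% market share")
# With these toy numbers, growth stops around 20% -- the plateau the article
# describes -- even though the first increments were comfortably profitable.
```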

For a period of time, in the 1970s and 1980s, conventional nuclear reactors had a better track record. France decarbonized 80 percent of its electrical system with nuclear.

Sweden achieved 50 percent.

But conventional nuclear technology, which requires strong central governments and vertically integrated utilities that build, own, and operate plants, has been swimming against the current of economic liberalization and declining faith in technocratic institutions for decades.

Outside of China and a few other Asian economies, few nations have been able to build large nuclear plants cost-effectively in recent decades.

Such limitations continue to plague power sector decarbonization efforts around the world. But the power sector accounts for only about 20 percent of global primary energy use and turns out to be relatively easy to decarbonize compared with transportation, agriculture, industry, and construction.

There are currently few viable substitutes for fossil fuels in the production of steel, cement, or fertilizer or for powering aviation and heavy transportation.

Longer term, there may be better options, including advanced nuclear reactors that can provide heat for industrial processes, carbon capture technologies that can capture emissions from burning fossil fuels, and low-carbon synthetic fuels that might substitute for diesel and aviation fuels.

But all are decades away from viable application.

The technologies that are needed to cut emissions deeply enough to stabilize temperatures at two degrees, in short, will not be ready in time to do so.

As a result, continued devotion to the two-degree threshold has ended up undermining both important incremental pathways to lower emissions and long-term investment in the development and commercialization of technologies that would be necessary to deeply decarbonize the global economy.


Almost 30 years after the UN established the two-degree threshold, over 80 percent of the world’s energy still comes from fossil fuels, a share that has remained largely unchanged since the early 1990s.

Global emissions and atmospheric concentrations of carbon dioxide continue to rise.

Climate policy, at both international and national levels, has had little impact on their trajectory.

Climate advocates have persistently blamed the failures of climate policy on the corrupting political power of the fossil fuel industry.

Industry-funded “merchants of doubt,” as the historians Naomi Oreskes and Erik Conway originally dubbed them, together with heavy political spending, have stopped climate mitigation efforts in their tracks.

But those claims are U.S.-centric.

Climate skepticism and denial have not found anywhere close to the same level of political traction outside the United States and Australia.

Exxon and the Koch brothers have no political franchise in the German Bundestag, the Chinese Central Committee, or most other places outside Washington.

And yet those nations have had no more success cutting emissions than has the United States.

To the contrary, U.S. emissions have fallen faster than those of almost any other major economy over the last decade.

The alternate explanation is rather less dramatic.

Decarbonization is hard.

Fossil fuels continue to bring substantial benefit to most people around the world, despite the significant environmental consequences.

The alternatives have improved, but not sufficiently to displace fossil energy at scales that would be consistent with stabilizing temperatures at the two-degree threshold.

The consequences of failing to do so for human societies are too uncertain or too far off in the future to motivate either a World War II–style mobilization to deploy renewable energy or a global price on carbon high enough to rapidly cut emissions.

At some point over the next 20 years or so, atmospheric concentrations of carbon will almost certainly surpass 450 parts per million, the emissions proxy for avoiding long-term temperature increases of greater than two degrees.
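
The arithmetic behind “the next 20 years or so” is straightforward, taking the 407 ppm figure cited earlier and an assumed recent growth rate of roughly 2.3 ppm per year (my assumption, not the author’s):

```python
# Quick arithmetic behind "the next 20 years or so": current concentration
# divided into the remaining headroom. 407 ppm is cited earlier in the piece;
# the ~2.3 ppm/yr growth rate is an assumed recent average.
current_ppm = 407
threshold_ppm = 450
growth_ppm_per_year = 2.3

years = (threshold_ppm - current_ppm) / growth_ppm_per_year
print(f"~{years:.0f} years to 450 ppm at the current pace")  # ~19 years
```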

At that point, the only certain path to stay under the target will be either to pull carbon out of the atmosphere at almost unimaginable scales or to alter the chemistry of the atmosphere such that rising greenhouse gas concentrations do not lead to higher temperatures.

Functionally, that moment has already arrived.

Virtually all scenarios consistent with stabilizing global temperatures at plus two degrees, according to the Intergovernmental Panel on Climate Change, explicitly require so-called negative emissions in the latter half of this century.

In recent years, the moral hazard argument used against adaptation has also been used against geoengineering and carbon removal technologies.

The suggestion that it might be possible to pull sufficient carbon out of the atmosphere to lower global temperatures or, short of that, change the chemical composition of the atmosphere or the oceans such that large temperature increases might be forestalled, the logic goes, risks distracting us from the central task of rapidly decarbonizing the global economy.

Yet no one is seriously proposing embarking on large-scale carbon removal or geoengineering today.

We haven’t really figured out how to do the former, and the latter brings a range of potential risks that we don’t yet fully understand.

Still, such emergency measures may be necessary in the future even with a steep cut in emissions.

As in the case of adaptation, however, the twin fictions that the two-degree limit remains a plausible goal and that dangerous climate change can be avoided should we achieve it allow the moral hazard argument to be marshaled against even sensible calls for serious public research.


At this point, if there is a moral hazard argument to be made, it is against the two-degree threshold, not for it.

Humans are going to live on a significantly hotter planet for many centuries.

The notion that two degrees remains an achievable target risks diverting attention from steps we might take today to better weather the changes that are coming.

Once the world lets go of the unrealistic two-degree target, a range of practical policies comes much more clearly into focus.

We should do all that we can to speed up decarbonization.

Accelerating the coal-to-gas transition and continuing the deployment of today’s renewable energy technologies would incrementally reduce climate risk even if neither is capable of decarbonizing economies at rates consistent with achieving the two-degree target.

At the same time, it is important to support those efforts in ways that don’t lock out technologies that will be necessary to achieve deeper emissions cuts over the longer term.

Continuing subsidies for low-efficiency solar panels, for instance, have shut higher-efficiency solar technologies out of the renewables market.

Cheap gas has rendered many nuclear power plants, which don’t get the same privileged access to electrical grids or direct production subsidies as do wind and solar energy, uneconomical.

At relatively low overall shares of electricity generation, variable sources of power such as wind and solar risk crowding out other zero-carbon options that will be necessary to fully decarbonize power grids. And if deep decarbonization is the objective, much greater public investment will be needed to develop and commercialize clean energy technologies, even though those technologies are unlikely to contribute much to emissions-cutting efforts over the next several decades.

Press link for more: Foreign Affairs

Farmers look to wind farms for drought relief. #StopAdani #auspol #qldpol #climatechange

Queensland farmers looking to wind farms for drought relief

By Melanie Vujkovic

Photo: Jade and Blair Wenham say the turbines on their farm will give them a constant income stream. (ABC News: Jennifer Huxley)

A community of Queensland farmers hopes a wind farm, set to be the southern hemisphere’s largest, will be able to drought-proof their futures.

Ten years in the making, the Coopers Gap project, west of Brisbane, saw its first sod turned yesterday.

A total of 123 turbines will be built across a dozen properties, over the South Burnett and Western Downs council areas, with landowners benefiting from leasing arrangements.

The 450-megawatt wind farm will produce more than 1.5 million megawatt hours of renewable energy annually, enough to power more than 260,000 homes.
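
Those figures hang together. A quick sanity check on the article’s numbers (only the 450 MW, 1.5 million MWh, and 260,000-home figures come from the story; the interpretation is mine):

```python
# Back-of-the-envelope check on the Coopers Gap figures: implied capacity
# factor and implied per-home consumption.
capacity_mw = 450
annual_mwh = 1_500_000
homes = 260_000
hours_per_year = 8_760

capacity_factor = annual_mwh / (capacity_mw * hours_per_year)
mwh_per_home = annual_mwh / homes
print(f"implied capacity factor: {capacity_factor:.0%}")    # ~38%, plausible for a good wind site
print(f"implied use per home: {mwh_per_home:.1f} MWh/yr")   # ~5.8 MWh, a typical household
```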

Photo: When built the wind farm will generate enough power for 260,000 homes. (ABC Central West: Gavin Coote)

Russell Glode will have nine 180-metre tall turbines on his cattle property at Cooranga North.

“We’ve been waiting for so many years for the project to happen so I’m quite pleased it’s finally eventuating,” he said.

Jade Wenham and his wife Blair will also put turbines on their property.

“It will definitely give us a constant stream of income, where we don’t have to put a great deal of output into,” Mr Wenham said.

“It will be a big benefit to us … it’s not something we’ve banked on but it will be a great bonus that’s for sure.”

In a region battling with drought, Ms Wenham hopes it will provide financial security.

Photo: Cattle farmer Russell Glode says it’s been a long wait for work to start. (ABC News: Jennifer Huxley)

“The seasons have been difficult … we haven’t grown very good crops over the last couple of years so the idea of drought proofing is good,” she said.

“There’s also a lot of people picking up work that don’t have any towers on their place, but now have a job where they live.”

Dave Johnson, AGL’s general manager of power development, says the project will generate 200 jobs during construction, and 20 permanent positions.

“To the maximum extent possible we will be employing locals,” he said.

“There’s been a lot of work that’s been going into community engagement and making sure the local towns and businesses are aware of what work they can bid into and some of those work packages have already gone out to local contractors.

“For the farmers it’s diversification of income.”

South Burnett Mayor Keith Campbell hopes it will revive the community.

“The reality is it’s a new business in the area that each of us can achieve some benefit from and it puts our region on the map,” he said.

The wind farm is set to begin producing energy by mid 2019.

Press link for more: ABC.NET.AU

Is 100% #RenewableEnergy realistic? #auspol #qldpol #sapol #StopAdani

Is 100% renewable energy realistic? Here’s what we know.

Reasons for skepticism, reasons for optimism, and some tentative conclusions.

By David Roberts, Feb 7, 2018, 12:30pm EST

The world has agreed to a set of shared targets on climate change. Those targets require deep (80 to 100 percent) decarbonization, relatively quickly.

What’s the best way to get fully decarbonized? In my previous post, I summarized a raging debate on that subject. Let’s quickly review.

We know that deep decarbonization is going to involve an enormous amount of electrification. As we push carbon out of the electricity sector, we pull other energy services like transportation and heating into it. (My slogan for this: electrify everything.) This means lots more demand for electricity, even as electricity decarbonizes.

The sources of carbon-free electricity with the most potential, sun and wind, are variable. They come and go on their own schedule. They are not “dispatchable,” i.e., grid operators can’t turn them on and off as needed. To balance out variations in sun and wind (both short-term and long-term), grid operators need dispatchable carbon-free resources.

Deep decarbonization of the electricity sector, then, is a dual challenge: rapidly ramping up the amount of variable renewable energy (VRE) on the system, while also ramping up carbon-free dispatchable resources that can balance out that VRE and ensure reliability.
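
A toy example makes the dual challenge concrete: subtract variable output from demand hour by hour, and whatever remains is the residual load that dispatchable resources must cover. The profiles below are invented for illustration:

```python
# Toy residual-load calculation: demand minus variable renewable output.
# Positive residuals must be met by dispatchable resources; negative
# residuals are surplus to store or curtail. All numbers are hypothetical.
demand_mw = [900, 850, 800, 950, 1100, 1200, 1150, 1000]   # hypothetical hours
vre_mw    = [700, 900, 400, 100,  600, 1300,  200,  500]   # variable wind/solar output

for hour, (d, v) in enumerate(zip(demand_mw, vre_mw)):
    residual = d - v
    if residual > 0:
        print(f"hour {hour}: dispatchables must supply {residual} MW")
    else:
        print(f"hour {hour}: {-residual} MW surplus to store or curtail")

# Needs swing between a surplus and 950 MW of required dispatch within a few
# hours -- the balancing problem grid operators face as VRE share rises.
```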

Two potentially large sources of dispatchable carbon-free power are nuclear and fossil fuels with carbon capture and sequestration (CCS). Suffice it to say, a variety of people oppose one or both of those sources, for a variety of reasons.

So then the question becomes, can we balance out VRE in a deeply decarbonized grid without them? Do our other dispatchable balancing options add up to something sufficient?

That is the core of the dispute over 100 percent renewable energy: whether it is possible (or advisable) to decarbonize the grid without nuclear and CCS.

In this post I’m going to discuss three papers that examine the subject, try to draw a few tentative conclusions, and issue a plea for open minds and flexibility. It’ll be fun!


Two papers circulated widely among energy nerds in 2017 cast a skeptical eye on the goal of 100 percent renewables.

One was a literature review on the subject, self-published by the Energy Innovation Reform Project (EIRP), authored by Jesse Jenkins and Samuel Thernstrom. It looked at a range of studies on deep decarbonization in the electricity sector and tried to extract some lessons.

The other was a paper in the journal Renewable and Sustainable Energy Reviews that boasted “a comprehensive review of the feasibility of 100% renewable-electricity systems.” It was by B.P. Heard, B.W. Brook, T.M.L. Wigley, and C.J.A. Bradshaw, who, it should be noted, are advocates for nuclear power.

We’ll take them one at a time.

Most current models find that deep decarbonization is cheaper with dispatchable power plants

Jenkins and Thernstrom rounded up 30 studies on deep decarbonization, all published since 2014, when the most recent comprehensive report was released by the Intergovernmental Panel on Climate Change (IPCC). The studies focused on decarbonizing different areas of different sizes, from regional to global, and used different methods, so there is not an easy apples-to-apples comparison across them, but there were some common themes.

To cut to the chase: The models that optimize for the lowest-cost path to zero carbon electricity — and do not rule out nuclear and CCS a priori — generally find that it is cheaper to get there with than without them.

Today’s models, at least, appear to agree that “a diversified mix of low-CO2 generation resources” adds up to a more cost-effective path to deep decarbonization than 100 percent renewables. This is particularly true above 60 or 80 percent decarbonization, when the costs of the renewables-only option rise sharply.

Again, it’s all about balancing out VRE. The easiest way to do that is with fast, flexible natural gas plants, but you can’t get past around 60 percent decarbonization with a large fleet of gas plants running. Getting to 80 percent or beyond means closing or idling lots of those plants. So you need other balancing options.

One is to expand the grid with new transmission lines, which connects VRE over a larger geographical area and reduces its variability. (The wind is always blowing somewhere.) Several deep decarbonization studies assume a continental high-voltage super-grid in the US, with all regions linked up. (Needless to say, such a thing does not exist and would be quite expensive.)

One conceptual example of a US-wide supergrid. (AWEA, via Wikipedia)

The other way to balance VRE is to maximize carbon-free dispatchable resources, which include dispatchable supply (power plants), dispatchable demand (“demand management,” which can shift energy demand to particular parts of the day or week), and energy storage, which acts as both supply (a source of energy) and demand (a way to absorb it).

Energy storage and demand management are both getting better at balancing out short-term (minute-by-minute, hourly, or daily) variations in VRE.

But there are also monthly, seasonal, and even decadal variations in weather. The system needs to be prepared to deal with worst-case scenarios: long concurrent periods of heavy cloud cover and low wind. That adds up to a lot of backup.

We do not yet have energy storage at anything approaching that scale. Consider pumped hydro, currently the biggest and best-developed form of long-term energy storage. The EIRP paper notes that the top 10 pumped-hydro storage facilities in the US combined could “supply average US electricity needs for just 43 minutes.”
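
A rough reconstruction of that “43 minutes” claim, assuming annual US consumption of about 4,000 TWh (a round figure I’m supplying; the EIRP paper’s exact inputs may differ):

```python
# Back-calculating the storage energy behind "43 minutes of average US
# electricity needs". The 4,000 TWh/yr consumption figure is an assumption.
annual_twh = 4_000
avg_power_gw = annual_twh * 1_000 / 8_760        # ~457 GW average draw
storage_gwh = avg_power_gw * (43 / 60)           # energy implied by "43 minutes"

print(f"average US demand: ~{avg_power_gw:.0f} GW")
print(f"implied top-10 pumped-hydro energy: ~{storage_gwh:.0f} GWh")  # ~330 GWh
```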

Currently, the only low-carbon sources capable of supplying anything like that scale are hydro, nuclear, and (potentially) CCS.

So if you take nuclear and CCS off the table, you’re cutting out a big chunk of dispatchable capacity. That means other dispatchable resources have to dramatically scale up to compensate — we’d need a lot of new transmission, a lot of new storage, a lot of demand management, and a lot of new hydro, biogas, geothermal, and whatever else we can think of.

Even with tons of new transmission, we’ll still need a metric shit-ton of new storage. For comparison: the US currently has energy storage capacity for around an hour of average electricity consumption. Only 15 weeks, six days, and 23 hours to go!

Suffice to say, that would mean building a truly extraordinary amount of energy storage by mid-century.

It gets expensive, progressively more so as decarbonization reaches 80 percent and above. Trying to squeeze out that last bit of carbon without recourse to big dispatchable power plants is extremely challenging, at least for today’s models.

Thus, models that optimize for the lowest-cost pathway to deep decarbonization almost always include lots of dispatchable power plants, including nuclear and CCS.

“It is notable,” the review says, “that of the 30 papers surveyed here, the only deep decarbonization scenarios that do not include a significant contribution from nuclear, biomass, hydropower, and/or CCS exclude those resources from consideration a priori.”

To summarize: Most of today’s models place high value on large dispatchable power sources for deep decarbonization, and it’s difficult to muster enough large dispatchable power sources without nuclear and CCS.

100 percent renewables hasn’t been 100 percent proven feasible

The second review takes a somewhat narrower and more stringent approach. It examines 24 scenarios for 100 percent renewable energy with enough detail to be credible. It then judges them against four criteria for feasibility:

(1) consistency with mainstream energy-demand forecasts; (2) simulating supply to meet demand reliably at hourly, half-hourly, and five-minute timescales, with resilience to extreme climate events; (3) identifying necessary transmission and distribution requirements; and (4) maintaining the provision of essential ancillary services.

(“Ancillary services” are things like frequency regulation and voltage control, which keep the grid stable and have typically been supplied by fossil fuel power plants.)

Long story short, none of the studies passed these feasibility tests. The highest score was four points out of a possible seven.

The authors conclude that “in all individual cases and across the aggregated evidence, the case for feasibility [of 100 percent renewable energy] is inadequate for the formation of responsible policy directed at responding to climate change.”

That is the peer-reviewed version of a sick burn.

Note, though, that these are pretty tough criteria: researchers must model a full electricity system, responsive to both short-term and long-term weather variations, meeting demand that is not appreciably different from mainstream projections, providing all needed services reliably, using technologies already demonstrated at scale.

That’s not easy! It’s reasonable to ask whether we need that much confidence to begin planning for long-term decarbonization. If any new system must demonstrate in advance that it is fully prepared to substitute for today’s system, it’s going to be difficult to change the system at all.

(Renewables advocates might say that nuclear advocates have a vested interest in keeping feasibility criteria as strict and tied to current systems as possible.)

For more in this vein, see the 2014 paper “A critical review of global decarbonization scenarios: what do they tell us about feasibility?”

The question is how much our current decision-making should be constrained by what today’s models tell us is possible in the distant future.

Energy experts are more optimistic than their models

A third paper worth mentioning is 2017’s Renewables Global Futures Report (GFR) from the global renewable-energy group REN21, which interviewed “114 renowned energy experts from around the world, on the feasibility and challenges of achieving a 100% renewable energy future.”

There’s a ton of interesting stuff in the report, but one finding jumps out: 71 percent of the experts surveyed agreed that 100 percent renewables is “reasonable and realistic.” Yet the models seem to agree that 100 percent renewables is unrealistic. What gives?

Models are only models

It pays to be careful with literature reviews. They are generally more reliable than single studies, but they are exercises in interpretation, colored by the assumptions of their authors. And there’s always a danger that they are simply compiling common biases and limitations in current models — reifying conventional wisdom.

There are plenty of criticisms of current models of how climate change and human politics and economics interact. Let’s touch on a few briefly, and then I’ll get to a few takeaways.

1) Cost-benefit analysis is incomplete.

Models that “minimize cost” rarely minimize all costs. They leave out many environmental impacts, along with more intangible social benefits like community control, security, or independence.

UC Berkeley’s Mark Delucchi, occasional co-author with Stanford’s Mark Jacobson of work on 100 percent WWS (wind, water, and sun — see more about that at the Solutions Project), says that the ideal analysis of deep decarbonization would involve a full cost-benefit analysis, taking all effects, “the full range of climate impacts (not just CO2), air-quality benefits, water-quality benefits, habitat destruction, energy security — everything you can think of,” into account. No one, he said, has done that for getting above, say, 90 percent WWS.

“My own view,” he told me, “which is informed but not demonstrated by my work on 100% WWS, is that the very large environmental benefits of WWS probably make it worth paying for close to — but not quite — a 100% WWS system. The ‘not quite’ is important, because it does look to me that balancing supply and demand when you get above 90-95% WWS (for the whole system) starts to get pretty expensive.”

In other words, full cost-benefit analysis is likely to offset higher renewables costs more than most models show.

2) Most models are based on current markets, which will change.

“Our traditional energy models are pretty clearly biased against a 100% renewable outcome,” Noah Kaufman told me. He worked on the “US Midcentury Strategy for Deep Decarbonization,” which the US government submitted to the UNFCCC in November 2016 as a demonstration of its long-term commitment to the Paris climate process. “Models like to depict the system largely as it exists today, so of course they prefer baseload replacing baseload.”

(Kaufman cautions that while current models may underestimate renewables, he doesn’t believe we know that with enough certainty “to mandate those [100% renewable] scenarios.”)

Price analyses based on current wholesale energy markets will not tell us much about markets in 20 or 30 years. VRE is already screwing up wholesale markets, even at relatively low penetrations, because the incremental cost of another MWh of wind when the wind is blowing is $0, which undercuts all competitors.
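
A toy merit-order auction shows the mechanism: the clearing price is set by the most expensive unit actually needed, so zero-marginal-cost wind pushes costly plants out of the stack and drags the price down. All bids and demand figures below are invented:

```python
# Toy merit-order market: generators are dispatched cheapest-first, and the
# last unit needed sets the clearing price for everyone.
bids = [  # (source, marginal cost $/MWh, capacity MW) -- all hypothetical
    ("wind",     0, 400),
    ("nuclear", 10, 300),
    ("gas",     35, 400),
    ("peaker",  90, 200),
]

def clearing_price(demand_mw: float) -> float:
    served = 0.0
    for source, cost, capacity in sorted(bids, key=lambda b: b[1]):
        served += capacity
        if served >= demand_mw:
            return cost  # the costliest unit dispatched sets the price
    raise ValueError("demand exceeds total capacity")

print(clearing_price(600))   # wind + nuclear cover it -> $10/MWh
print(clearing_price(1000))  # gas needed -> $35/MWh
# Add another 400 MW of wind and the 1,000 MW case clears at $10 -- the
# price collapse that squeezes every plant with fuel costs.
```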

Wholesale power markets will not survive in their current form. Markets will evolve to more accurately value a wider range of grid services — power, capacity, frequency response, rapid ramping, etc. — allowing VRE and its complements to creep into more and more market niches.

Financing will evolve as well. As it gets cheaper, VRE and storage start looking more like infrastructure than typical power plant investments. Almost all the costs are upfront, in the financing, planning, and building. After that, “fuel” is free and maintenance costs are low. It pays off over time and then just keeps paying off. Financing mechanisms will adapt to reflect that.

3) Most models do not, and cannot, model emerging solutions or current costs.

Most energy models today do not account for the full complement of existing strategies to manage and expand VRE — all the different varieties of storage, the growing list of demand-management tools, new business models and regulations — so they neither are, nor claim to be, definitive.

“I don’t want to overstate or improperly extract conclusions from my work,” NREL’s Bethany Frew, who co-authored one of the key studies in the EIRP review, cautions, “I didn’t look at an exhaustive set of resources.”

Models today cannot capture the effects of technologies and techniques that have not yet been developed. But this stuff is the subject of intense research, experimentation, and innovation right now.

It is viewed as irresponsible to include speculative new developments in models, but at the same time, it’s a safe bet that the energy world will see dramatic changes in the next few decades. Far more balancing options will be available to future modelers.

In a similar vein, as energy modeler Christopher Clack (formerly of NOAA) told me, it can take two or three years to do a rigorous bit of modeling. And that begins with cost estimates taken from peer-reviewed literature, which themselves took years to publish.

The result is that models almost inevitably use outdated cost estimates, and when costs are changing rapidly, as they are today, that matters.

Speaking of which…

4) Models have always underestimated distributed energy technology.

As I described in detail in this post, energy models have consistently and woefully underestimated the falling costs and rapid growth of renewable energy.
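
One concrete way to see why forecasts keep missing: solar costs have historically tracked a learning curve (Wright’s law), falling a roughly constant fraction with each doubling of cumulative deployment. The 20 percent learning rate and starting cost below are assumed, illustrative values:

```python
# Learning-curve sketch: cost falls a fixed percentage with each doubling of
# cumulative deployment. Both parameters are illustrative assumptions.
start_cost = 4.0        # $/W, hypothetical starting module cost
learning_rate = 0.20    # assumed 20% cost decline per doubling

cost = start_cost
for doubling in range(7):
    print(f"after {doubling} doublings: ${cost:.2f}/W")
    cost *= (1 - learning_rate)

# Six doublings cut costs by ~74% ($4.00 -> ~$1.05/W). A model that assumes
# static or slowly declining costs will badly underestimate growth.
```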

The professional energy community used to be quite convinced that wind and solar could play no serious role in the power system because of their variability. Then, for a long time, conventional wisdom was that they could provide no more than 20 percent of power before the grid started falling apart.

That number has kept creeping up. Now conventional wisdom has it around 60 percent. Which direction do you suppose it will go in the next few decades?

It’s a similar story with batteries and EVs. They keep outpacing forecasts, getting cheaper and better, finding new applications. Is there any reason to think that won’t continue?

Which brings us to…

5) Pretending we can predict the far future is silly.

Predicting the near future is difficult. Predicting the distant future is impossible. Nothing about fancy modeling makes it any less impossible.

Modelers will be the first to tell you this. (Much more in this old post from 2014.) They are not in the business of prediction; they aren’t psychics. All they do is construct elaborate if-then statements. If natural gas prices do this, solar and wind prices do that, demand does this, storage does that, and everything else more or less stays the same … then this will happen. They are a way of examining the consequences of a set of assumptions.

Are the assumptions correct? Will all those variables actually unfold that way in the next 20, 30, 40 years? Ask any responsible modeler and they will tell you: “Eff if I know.”

Long-term energy modeling was more tractable when the energy world was mostly composed of very large technologies and projects, with a small set of accredited builders and slow innovation cycles. But as energy and its associated technologies and business models have gotten more and more distributed, innovation has become all the more difficult to even track, much less predict.

Because distributed energy technologies are smaller than big power plants, they iterate faster. They are more prone to complex interactions and emergent effects. Development is distributed as well, across hundreds of companies and research labs.

Energy is going to bend, twist, and accelerate in unpredictable ways even in the next few years, much less the next few decades. We really have no friggin’ idea what’s going to happen.

The lessons to take from all this

Okay, we’ve looked at some of the literature on 100 percent renewables, which is generally pretty skeptical. And we’ve covered some reasons to take the results of current modeling with a grain of salt. What should we take away from all this? Here are a few tentative conclusions.

1) Take variability seriously.

One reason everyone’s so giddy about renewable energy is that it’s been pretty easy to integrate it into grids so far — much easier than naysayers predicted.

But one thing models and modelers agree on is that variability is a serious challenge, especially at high VRE penetrations. As VRE increases, it will begin to run into technical and economic problems; California is already grappling with some of these issues.

Getting deep decarbonization right means thinking, planning, and innovating toward a rich ecosystem of dispatchable resources that can balance VRE at high penetrations. That needs to become as much a priority as VRE deployment itself.

2) Full steam ahead on renewable energy.

We have a solid understanding of how to push VRE up to around 60 percent of grid power. Right now, wind and solar combined generate just over 5 percent of US electricity. (Nuclear generates 20 percent.)

The fight to get 5 percent up to 60 is going to be epic. Political and social barriers will do more to slow that growth than any technical limitation, especially in the short- to mid-term.

This is likely why the energy experts interviewed by REN21, though they believe 100 percent renewables is “reasonable and realistic,” don’t actually expect it to happen by mid-century.

It will be an immense struggle just to deploy the amount of VRE we already know is possible. If we put our shoulder to that wheel for 10 years or so, then we can come up for air, reassess, and recalibrate. The landscape of costs and choices will look very different then. We’ll have a better sense of what’s possible and what’s lacking.

Until then, none of these potential future limitations are any reason to let up on the push for VRE. (Though there should also be a push for storage and other carbon-free balancing options.)

3) Beware natural gas lock-in.

The easy, default path for the next several years will be to continue to lean on natural gas to drive down emissions and balance VRE. And sure enough, there’s a ton of natural gas “in the queue.”

But leaning too hard on natural gas will leave us with a ton of fossil fuel capacity that we end up having to shut down (or leave mostly idle) before the end of its useful life. That will be an economically unfortunate and politically difficult situation.

We need to start thinking about alternatives to natural gas, today.

4) Keep nuclear power plants open as long as possible.

Clack told me something intriguing. He said that there is enough nuclear capacity in the US today to serve as the necessary dispatchable generation in an 80 percent decarbonized grid. We wouldn’t need any big new nuclear or CCS power plants.

It would just mean a) changing market and regulatory rules to make nuclear more flexible (it largely has the technical capacity), and b) keeping the plants open forever.

Obviously those plants are not going to stay open forever, and the ones that are genuinely unsafe should be shut down. And Clack’s models are only models too, not gospel.

But what’s clear is that, from a decarbonization perspective, allowing a nuclear power plant to close (before, say, literally any coal plant) is a self-inflicted wound. It makes the challenges described above all that much more difficult. Every MW of dispatchable, carbon-free power capacity that is operating safely should be zealously guarded.

5) Do relentless RD&D on carbon-free dispatchable resources, including nuclear.

We know we will need a lot of dispatchable carbon-free resources to balance out a large share of VRE.

Storage and demand management can play that role, and in any scenario, we will need lots of both, so they should be researched, developed, and deployed as quickly as possible.

But large-scale, carbon-free dispatchable generation will help as well. That can be hydro, wave, tidal, geothermal, gas from waste, renewable gas, or biomass. It can also be nuclear or CCS.

I personally think fossil fuel with CCS will never pass any reasonable cost-benefit analysis. It’s an environmental nightmare in every way other than carbon emissions, to say nothing of its wretched economics and dodgy politics.

But we’re going to need CCS regardless, so we might as well figure it out.

As climate scientist Glen Peters put it on Twitter: “Without large-scale CCS there needs to be an immediate & more rapid decline in fossil fuel use.”

New plants using current nuclear technology have proven uneconomic just about everywhere they’ve been attempted lately (except, oddly, South Korea), and there is no obvious reason to favor them in their market battle with renewables.

But it is certainly worth researching new nuclear generation technologies — the various smaller, more efficient, more meltdown-proof technologies that seem perpetually on the horizon. If they can make good on their promise, with reasonable economics, it would be a blessing. (See Brad Plumer’s piece on radical nuclear innovation.)

Basically, research everything. Test, experiment, deploy, refine.

6) Stay woke.

Above all, the haziness of the long-term view argues for humility on all sides. There’s much we do not yet know and cannot possibly anticipate, so it’s probably best for everyone to keep an open mind, support a range of bet-hedging experiments and initiatives, and maintain a healthy allergy to dogma.

We’ve barely begun this journey. We don’t know what the final few steps will look like, but we know what direction to travel, so we might as well keep moving.

Press link for more: Vox