Thursday, March 06, 2008

Future carbon

In 1897 Lord Kelvin, a former President of the Royal Society in London, declared that the atmosphere would exhaust its oxygen in some 300 years as a result of burning coal.  He was wrong about the atmosphere.  So what about the reliability of present forecasts of atmospheric evolution?

The processes at work in the atmosphere are complicated and coupled to external and surface influences, the ocean in particular.  Their interactions may be beyond computer simulation over the multi-decade time scales used in climate modelling.

Any approach to determining economic policy for climate change should allow for the possibility that the current understanding of the atmosphere cannot be translated into forecasts precise enough to support the design of an economic response.

Further, any economic forecasts used to construct models of future carbon use and carbon dioxide emissions will be unable to deal with technical innovations, whose success cannot be predicted.  This affects policy in two ways: first, the obvious uncertainty in estimating economic development; and second, more immediately, the desire of governments to stimulate technical solutions.  The need to be seen to be taking action frequently descends into picking winners and creating classes of rent seekers.

The problem faced by the Garnaut Enquiry may be reduced to the question "What is the magnitude of the risk of human-induced climate change?"  The first question should be "How do we assure ourselves that the risk assessments are reliable?"  After all, the assessments are based on climate modelling.  A quote from a well-known IPCC scientist, Dr Kevin Trenberth, should act as a caution:

None of the models used by the IPCC are initialised to the observed state, and none of the climate states in the models correspond even remotely to the current observed climate.

In the 20th century, policy makers based decisions on scientific knowledge and understanding and, where possible, experimental confirmation.  This was the case in the development of atomic weapons.  In those programs there were issues of risk.  Would a controlled chain reaction in an atomic pile at the University of Chicago run out of control and destroy the south side of the city?  Some hoped that it might.  Would a hydrogen bomb set off a chain reaction in the ocean that would consume the planet?  Could the yield of the weapons be accurately estimated?  There were answers to these questions, and there were experimental demonstrations to check the calculations.  In some cases there were substantial errors in calculation, which pointed to gaps in understanding.

For our atmosphere we have a much more complicated set of interactions than those in nuclear physics.  We need some way of checking climate predictions.  It is well known that weather forecasts are useful for only a matter of days.  Temperature is the best-forecast variable, with rain and wind speed being substantially worse.  There do not appear to be any statistical analyses covering the accuracy of yearly or five-yearly forecasts.  Until there is some measure of performance it seems unwise to take 50 or 100 year forecasts as providing any guidance.
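
As an illustration only, one way such a measure of performance could be constructed is a skill score of the kind used in forecast verification: compare the average error of the forecasts with the average error of simply forecasting the long-term climatology.  The sketch below is in Python with made-up numbers; it is not an analysis of any actual forecast record, only a picture of the arithmetic involved.

```python
# Illustrative only: a mean-squared-error skill score against a climatology baseline.
# All forecast and observation values below are made-up placeholders.

def skill_score(forecasts, observations, climatology):
    """1.0 = perfect, 0.0 = no better than forecasting climatology, negative = worse."""
    n = len(observations)
    mse_forecast = sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / n
    mse_baseline = sum((climatology - o) ** 2 for o in observations) / n
    return 1.0 - mse_forecast / mse_baseline

# Hypothetical annual-mean temperature anomalies (degrees C) over five years.
forecasts    = [0.30, 0.35, 0.25, 0.40, 0.45]
observations = [0.20, 0.40, 0.15, 0.35, 0.50]

print(round(skill_score(forecasts, observations, climatology=0.0), 2))
```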

The second problem, the inability to forecast innovation, is not surprising.  The Garnaut Enquiry is required to look forward 100 years.  Models of energy use are needed for both stationary and mobile energy users.

Consider the start of the 20th century and ask whether it would have been possible to predict, let alone model, the industrial development that occurred in that century.

We have examples within our own lifetimes of inventions that have changed the way we live.  In the second half of the 20th century, the invention of the transistor, followed by the integrated circuit, led to an extraordinary flowering of micro-electronics.  Most appliances, from pacemakers to telephones to motor cars, have integrated circuits embedded in them or central to their function.  The change in energy use in moving from thermionic valves to transistors can be illustrated by comparing a pentode vacuum valve, which uses some 10 watts of electrical power, mostly to heat a filament to produce electrons, with a Pentium processor of 10 to 15 million transistors using 20 watts.
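
On the figures quoted above (taking the lower end of the transistor count), the comparison works out to roughly two millionths of a watt per transistor, about five million times less power per active element than the valve.  A minimal sketch of the arithmetic, assuming those quoted figures:

```python
# Rough arithmetic on the power figures quoted above (approximate values only).
valve_power_watts = 10.0        # one pentode valve, mostly filament heating
chip_power_watts  = 20.0        # a Pentium-class processor
chip_transistors  = 10_000_000  # lower end of the quoted 10-15 million range

watts_per_transistor = chip_power_watts / chip_transistors
improvement = valve_power_watts / watts_per_transistor

print(f"{watts_per_transistor:.1e} W per transistor")       # ~2.0e-06 W
print(f"about {improvement:,.0f} times less power per active element")
```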

The growth of the Internet, like the creation of a nervous system, continues, with consequences for living and working that will take some time to appear.  Reduced energy demand in appliances, the use of wireless communications and other technologies will doubtless play a large part in less energy-intensive economic improvement in the developing world.

So policy should not be prescriptive about technology choices.  There are real and imagined problems with any choice of energy source.  Frequently the response of society to technical development shows a mismatch between publicly perceived risk and expert assessments of risk.  The choice and promotion of technical solutions is often pitched to public concerns.  These concerns are picked up by government and may help shape policy development.

As an example, the present subsidies for wind farms are a response to demands for action from Green groups and green politicians.  The result is a new rent-seeking group.  There is little cost-benefit analysis to guide policy development.  Rather, policy is set to subsidise non-competitive technologies that may produce unquantified benefits.  A simple comparison with the more conventional alternative of natural gas shows gas to be more cost-effective and more useful, since gas turbine generators produce electricity on demand.
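
By way of illustration only, the kind of "simple comparison" meant here is a rough cost-per-megawatt-hour calculation.  The sketch below uses hypothetical placeholder figures, not actual 2008 costs, and it ignores discounting and the dispatchability advantage of gas entirely; real data would be needed before drawing any conclusion.

```python
# Illustrative only: a crude cost-per-MWh comparison with placeholder inputs.
# These numbers are NOT real cost data; substitute actual figures to use this.

HOURS_PER_YEAR = 8760

def cost_per_mwh(capital_cost, lifetime_years, capacity_factor,
                 fixed_om_per_year, fuel_cost_per_mwh, capacity_mw):
    """Total lifetime cost divided by lifetime output (no discounting)."""
    annual_output_mwh = capacity_mw * capacity_factor * HOURS_PER_YEAR
    lifetime_output_mwh = annual_output_mwh * lifetime_years
    lifetime_cost = (capital_cost
                     + fixed_om_per_year * lifetime_years
                     + fuel_cost_per_mwh * lifetime_output_mwh)
    return lifetime_cost / lifetime_output_mwh

# Placeholder inputs per MW of capacity: wind has no fuel cost but a low
# capacity factor; gas burns fuel but can be run, and dispatched, on demand.
wind = cost_per_mwh(2_000_000, 20, 0.30, 30_000, 0, capacity_mw=1)
gas  = cost_per_mwh(900_000, 25, 0.60, 20_000, 40, capacity_mw=1)
print(f"wind: ${wind:.0f}/MWh, gas: ${gas:.0f}/MWh (placeholder figures only)")
```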

General encouragement of innovation should be the limit of government policy.  It is hard enough in business to develop innovations, and doing so is well beyond the reach of government in general.

Policy development should be mindful of these two problems.  The primary issue of long-term uncertainty requires a cautious approach, so that our economy is not disadvantaged compared with those of other countries.

