Sunday, January 30, 2000

Honorary Aboriginality for Us All

This is the year of Aboriginal Reconciliation, the culmination of a process the Hawke Labor Government began in 1991 in the hope of heading off the more radical push for a treaty between black and white Australians.  In his New Year's message, and again on Australia Day last Wednesday, the Governor-General called on the nation to make a renewed effort to bring about "true and lasting reconciliation".

Nice words, but what do they really mean?  How will we know if reconciliation has been successful?  Will the outcome be announced from on high, with a committee of righteous worthies telling us we have finally transcended our shameful past?  Or will teams of public opinion pollsters tramp across the country to get a take on whether the nation actually feels "reconciled"?

Certainly, the vision promoted by the Council for Aboriginal Reconciliation seems reasonable -- "a united Australia which respects this land of ours;  values the Aboriginal and Torres Strait Islander heritage;  and provides justice and equity for all".  But in fact it is so vague that, depending on your point of view, you could say it has been largely achieved already, or see it as an impossible goal which will always enable critics to complain that Australia has fallen far short of fulfilling its obligations.

In the hope of clarifying some of the woolly thinking surrounding the issue, I put out a paper called Reconciliation:  What Does it Mean? towards the end of last year.  I pointed out that there were contradictions between various parts of the program being promoted by CAR;  for instance, between calls for "a united Australia" on the one hand, and promoting moves towards indigenous separatism on the other.

Among the issues we considered was whether it would really be possible to raise the social and economic indicators for Aborigines to an acceptable level without radical changes to the strategies of autonomy and cultural revival so favoured by the intelligentsia and prominent indigenous activists.  Whether or not our answers have merit, the many questions and inconsistencies we raised are fundamental.

Nevertheless, in the national quest for reconciliation, most of the tough questions have been set aside.  Although CAR released a Learning Circle kit last year "to promote discussion of the issues surrounding reconciliation in the Australian community", the package really seems designed to close down certain kinds of discussion by making participants aware that some matters are effectively out of bounds.

Only a smattering of alternative perspectives is offered, and even then, they are carefully placed in contexts that make it clear that such views are not at all helpful.  The kit refers to a very large number of books, articles and websites for further information, but hardly anything that contains detailed arguments or embarrassing facts that might cause people to question the direction that CAR is taking.  The people who espouse the "correct positions" get the best lines, the most sympathetic treatment, and the greatest prominence.

For example, a list of the underlying causes of indigenous disadvantage makes no mention of the many ways in which contemporary Aborigines are encouraged to see themselves as victims and to avoid taking personal responsibility for their own actions.  True, the kit was prepared before Noel Pearson released his paper, "Our Right to Take Responsibility".  But Pearson's observations about the destructiveness of victimhood were hardly new, as he readily acknowledged.  Indeed, the Learning Circle kit makes its own distinctive contribution towards furthering the mentality of victimhood amongst Aborigines.

If there were a simple and obvious way to end Aboriginal disadvantage, perhaps attempts to keep the reconciliation discussion on a narrow track might be understandable.  But if past dispossession and injustice is to blame for all of today's woes, as CAR and its supporters pretend, why are major social indicators for Aborigines such as life expectancy, suicide rates and violence getting worse rather than better in many areas?  And why are some of the highest Aboriginal mortality rates to be found in regions which have suffered the least amount of dispossession and interference, such as East Arnhem Land?

As the American black critic Shelby Steele has pointed out, when people genuinely want to solve difficult social and economic problems, they take a flexible approach, encouraging anything that might work.  In his recent book, A Dream Deferred:  The Second Betrayal of Black Freedom in America, he contrasts the experimentalism of President Roosevelt's New Deal administration during the Great Depression of the 1930s with contemporary Americans' rigid "politically correct" approach towards racial issues.

Steele explains why liberal-minded white Americans now feel they must support ineffective or counter-productive policies which only benefit a black elite, while eroding the sense of individual responsibility amongst the very people they are supposedly trying to help.  He thinks that many white liberals are not quite as keen to enhance black freedom and prosperity as they claim.

Rather, these whites are preoccupied with their own personal redemption;  with demonstrating their moral virtue in the face of the shame they feel in being identified with America's unpleasant history of racial injustice.  And they know that those who challenge the race-relations industry's favoured notions will find their goodness denied -- they will be called "racists".  Although Steele confines his observations to the United States, they also seem highly relevant to the Australian situation.

Nevertheless, even those most committed to moral posturing on indigenous issues can occasionally offer a constructive way forward.  Take Germaine Greer, who told a British audience last week that she was "an honorary Australian Aborigine".  She didn't say who made her an Aborigine, but it is a wonderful idea.  If only all the rest of us could be offered this status, there would be no need for reconciliation.



Tuesday, January 25, 2000

Openness Needed on Aboriginal Issues

Last week Deputy Prime Minister John Anderson probably put himself offside with the majority of Aborigines.  He had a private meeting with Geoff Clark, the recently elected Chairperson of the Aboriginal and Torres Strait Islander Commission, an organisation regarded with such disdain by many indigenous Australians that less than 1 in 4 bothered to vote in the 1999 elections for ATSIC regional councillors.

The percentage of Aborigines voting in ATSIC elections has steadily declined since it was established by the Hawke Government in 1990, despite admonitions from indigenous leaders that a high turnout would send a "message" to the rest of Australia.  If most of its own supposed constituency is unwilling to take ATSIC seriously, why should anyone else?

However John Anderson's office says that the talks with Clark were constructive, focusing on common ground rather than on matters which divided them.  Given the displays of moral fervour which indigenous issues occasion amongst the nation's intelligentsia, it is understandable that even a politician with Anderson's refreshing candour would tread gingerly.

But if Australia is to make real progress in dealing with the seemingly intractable issues of Aboriginal disadvantage and social breakdown, a number of divisive and unpalatable issues will need to be openly discussed.  These include the continued existence of ATSIC, and the whole structure of government legislation and programs which are predicated on the notion of a distinct Aboriginal people with rights different from those of other Australians.

In an article criticising the American government's unwillingness to adopt colour-blind policies in the face of growing black integration (AFR, January 17), New York columnist Deroy Murdock cited the rapidly increasing number of interracial couples in his country.  In 1993 nearly 9 per cent of black men married white women, which seems impressive until it is compared with Australia.  The 1996 census revealed that 64 per cent of Aboriginal families involved a union between an Aboriginal and a non-Aboriginal partner, a figure that indigenous organisations such as ATSIC are most reluctant to publicise, given its implications for the separatist sentiments harboured by many of their leaders.

Nevertheless, since Noel Pearson's discussion paper "Our Right to Take Responsibility" was circulated last year, a number of non-Aboriginal commentators have felt less diffident about raising the need for indigenous people to develop a greater sense of personal responsibility over their own lives.  Pearson acknowledged that although previous generations of Aborigines suffered much greater discrimination and abuse than their descendants ever experience today, contemporary Aborigines are still encouraged "to see themselves as victims and to take on the mindset of victimhood".

Unfortunately however, Pearson and most of those who have taken up his concerns still seem to assume that a strong sense of individual responsibility is compatible with contemporary pieties such as special indigenous rights and a resistance to the culture of mainstream Australia.  They have fallen prey to the illusions of a post-modern age, where cultural characteristics and social institutions are treated as though they are supermarket consumables which can be blithely added to the shopping trolley irrespective of what is already there.

The problem is illustrated by remarks of Fred Chaney and Neil Westbury that one of the unexamined critical issues arising out of Pearson's paper is "how to assist communities to maintain strong Aboriginal identity in the face of mainstream values, beliefs and expectations" (AFR, January 14).  The possibility that communities which stress a "strong Aboriginal identity" under such circumstances might be undermining the personal responsibility of their members has obviously not occurred to them.

Similarly, Pearson himself falls into the trap of attempting to justify affirmative action programs on the supposed grounds that race is "all-pervading" for Aborigines, determining their "social, political and economic position" in contemporary Australian society.  But as the black American critic Shelby Steele has eloquently pointed out, giving such primacy to race as the explanation for disadvantage completely subverts the notion that it is individuals who control their own destiny.  If, as the indigenous establishment and its supporters constantly claim, racism really is the major cause of all the misery, then Aboriginal disadvantage is a white problem and there is not much point in urging Aborigines to take more personal responsibility.

Unfortunately, while Steele's writings are widely discussed amongst Americans concerned with race issues -- having been published in journals such as Harper's, New Republic and Commentary -- he is little known to Australia's intelligentsia.  The ABC ignores him, and few university libraries carry his books.  Perhaps he cuts too close to the bone;  although his focus is solely on America, his diagnosis of the destructiveness of affirmative action and other race-based compensatory programs is almost as applicable to the Australian situation.

Steele argues that the "black grievance elite" of academics, politicians and bureaucrats maintains its power by exploiting white America's justified feelings of shame at its past treatment of blacks, which compromised the moral principles on which America was based.  The liberal-minded whites of the post-Civil Rights era who were most sensitive to this "national legacy of unutterable shame" -- to use an Australian judicial phrase -- were hungry to recover their lost moral authority and show that they were not like their forebears.

Their quest for redemption beguiles these whites into seeing themselves as largely responsible for the freedom and advancement of blacks.  In the Australian context this translates into the refrain that Aboriginal domestic violence or alcohol abuse is "our shame", rather than the responsibility of the individuals who beat their spouses or drink to excess.  To suggest otherwise is "blaming the victim", a sure sign of the absence of moral virtue and an epithet which promotes structural explanations of disadvantage at the expense of an ethic of personal responsibility.

As Steele has observed, "it is precisely the rejection of the very idea of black responsibility that protects whites from the charge of racism".  And because moral redemption is their overriding concern, these white liberals are most reluctant to scrutinise policies and programs supported by the grievance elite, even when there are strong grounds for thinking that the programs are counter-productive for blacks suffering genuine social and economic disadvantage.

Steele's analysis helps us understand the moral indignation that invariably envelops those who stray from accepted positions on Aboriginal issues.  It also suggests that white Australians who respect the human dignity of the Aborigines they meet, but who are otherwise indifferent -- or even hostile -- to the urgings of the indigenous industry may prove better friends than those who smother Aborigines with their righteous concern.

ATSIC Chairperson Clark portrays himself as a radical.  This is no bad thing, considering Australia's lack of success in achieving genuine freedom and economic equality for Aborigines despite all the energy and resources that have been devoted to indigenous issues over the past few decades.  A true radical is prepared to go back to basics, and struggle against the power structures that perpetuate the problems.  It would be utopian to hope that Clark might push for the abolition of ATSIC itself.  But perhaps he may be willing to challenge the black establishment and its posturing white supporters.  Inviting Shelby Steele to Australia could be a first step.



All Eyes on Electricity Regulators' Price Reset

Over the past two years, Victoria's privatised electricity businesses have been on a merry-go-round of analysis and lobbying.  The focus has been on Victoria's Regulator-General, John Tamblyn.  The major issue has been the 2001 price re-set for line charges, which comprise 30-50 per cent of the final cost of electricity.

In the Kennett Government's privatisation, the five electricity distribution businesses sold for $8.5 billion -- twice the book value of the assets.  Since the 1995 reforms which preceded privatisation, they have all achieved considerable productivity gains.  The rural-based Eastern Energy, for example, has made operating cost savings of 22 per cent.  Even greater savings were made in the metropolitan-based businesses, large parts of which were previously owned by municipal authorities which made even the SEC appear efficient.  Thus, CitiPower has achieved operating cost savings of 38 per cent.

Most people consider electricity lines to be mainly (some say overwhelmingly) natural monopolies.  Hence the need for a regulated price, which must balance two key objectives:

  • efficiency incentives -- the greater the profit left in the hands of the owners, the greater is their incentive to find more productivity improvements;  and
  • controls on price gouging -- with governments determined to deliver lower prices to electricity customers.

One reality check on the 2001 price re-set will be subsequent share market or takeover activity.  In the UK in 1995, failure to provide sufficient reductions saw rapid share price increases, forcing an embarrassing revisit of the decision.  One problem with this reality check is asymmetry -- although leaving more profits with the businesses than investors expected will boost share prices, the adverse productivity incentives from under-pricing will only become apparent over time.

This direct focus on profits was not what the Kennett Government originally had in mind in its electricity reform and subsequent privatisation.  The requirement on the Regulator-General was to re-set electricity charges based on price, not profit, control.  However, the Regulator-General knows that the harshest and most politically powerful criticisms will be visited on him if he sets prices too high.  As prospective profits are the best gauge of this, in a classic use of Orwellian Newspeak he has interpreted the "explicit price capping" approach he is required to use as offering him licence to apply a form of profit control.

Price re-sets in utility industries overseas have usually incorporated two key elements, an initial cut to account for productivity gains in the preceding period and an annual reduction (the X-factor) to account for forecast productivity increases.  The Regulator-General has indicated that he prefers to use only the X-factor so that the businesses get to keep efficiency savings longer and have a stronger incentive to find new cost savings.  But he has also indicated that "windfall" gains that are not due to the management skills of the businesses will be handed back in lower prices from the outset.  And he wants to adopt a wide definition of these "windfall" gains.

UK developments offer a guide -- chilling for the electricity businesses -- on the outcome for the Victorian re-set.  There, the regulator eventually reduced line charges by about 25 per cent at the 1995 re-set and imposed an annual X-factor real price reduction of 3 per cent.  In the re-set of December 1999, the UK regulator has reduced line charges by a further 23 per cent and maintained the 3 per cent annual price reduction.

This has already brought pressure on some businesses' share prices, including GPU, the owner of Victoria's gas and electricity transmission lines.  The Wall Street reaction to the price reduction forced on GPU's UK subsidiary brought a change in the firm's strategy, including a proposed sell down of its Victorian holdings.

On first assuming office the Kennett Government raised, then progressively lowered, prices to end customers.  It also required the new distribution businesses to reduce line charges by 1-1.92 per cent per annum.  This was a form of X-factor impacting only on contestable customers.  Non-contestable customers, those not free to choose their own electricity supplier, now comprise only households and the smallest businesses.  The 2001 price re-set also coincides with freeing all customers' choice of supplier.

The scale of the X-factor and other price reductions so far required of the Victorian businesses would appear to be considerably less than those of the UK.  Even though the UK businesses sold at a steep discount to their book value and those in Victoria fetched twice that value, the productivity improvements demanded of the businesses at the time of corporatisation now seem modest.

But this does not reduce the difficulty of setting prices to achieve a profit goal.  Shortly before Christmas, the Regulator-General issued a report that tried to summarise and compare each of the five distributors' price proposals.  However, the myriad different price and service bases defied standardisation.

This points to the need for a simpler method, perhaps one that draws upon the principles used over a century ago when natural monopoly utilities first emerged.  The nineteenth century procedures put in place required a specific sharing of profits with customers, in the form of price reductions, for profits above a threshold.  A modern variation of this may be to set prices based on an anticipated level of profits and require a share of profits above this to be handed back in the form of lower tariffs.  The share could even be on a progressive rate scale.
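The profit-sharing arrangement described above is simple enough to sketch in a few lines.  The following is a hypothetical illustration only -- the function name, the dollar figures and the progressive scale are all invented, not drawn from any regulator's actual formula -- of how a share of profits above an anticipated threshold might be handed back as lower tariffs:

```python
# Hypothetical sketch of a profit-sharing tariff rule: prices are set
# against an anticipated profit level, and a share of any profit above
# that level is returned to customers.  All figures are invented.

def customer_rebate(actual_profit, anticipated_profit, bands):
    """Return the rebate owed to customers (same units as profit).

    `bands` is a progressive scale: (band_width, customer_share) pairs
    applied to successive slices of profit above the anticipated level;
    the final band's width may be float('inf').
    """
    excess = max(0.0, actual_profit - anticipated_profit)
    rebate = 0.0
    for width, share in bands:
        slice_ = min(excess, width)   # portion of excess in this band
        rebate += slice_ * share      # customers' share of that slice
        excess -= slice_
        if excess <= 0:
            break
    return rebate

# E.g. anticipated profit $100m, actual $130m, with customers taking
# 30% of the first $20m of excess and 50% of anything beyond that:
rebate = customer_rebate(130.0, 100.0, [(20.0, 0.3), (float("inf"), 0.5)])
# 20 * 0.3 + 10 * 0.5 = $11m handed back as lower tariffs
```

A progressive scale of this kind preserves some efficiency incentive -- the business always keeps part of any extra profit it finds -- while capping the political exposure from "price gouging".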

In the next week or so the Regulator-General is to put out an Issues Paper on the distribution businesses' proposals.  The sheer complexity of the price/quality mix in electricity and the likelihood of a substantial waste of resources in the analyses counsel in favour of greater simplicity.  If a profit-based approach to tariff setting is inevitable, it is best to make maximum use of an explicit profit-sharing formula rather than persisting with the attempt to establish a synthetic profit forecast as a base for prices.



Saturday, January 22, 2000

Battle of the Giants

Back in October of last year, BHP announced that it was moving from an industrial award to individual workplace agreements in its Pilbara operations.  It offered its 1,000 iron ore workers the option of moving over to staff conditions.

This marked a major shift in BHP's industrial relations philosophy.  For over thirty years BHP's approach was to work closely with unions in the management of its mines and factories.  This approach worked especially well in the Hawke/Button years of the 1980s.  With tripartite industry/management/government plans all the rage, it made special sense for BHP to cooperate with a trade union dominated government.  For the company, the steel industry plan that emerged from this process delivered real benefits in terms of investment concessions and import protection.

But Government industry plans for steel and other industry sectors gradually lost their allure as outcomes elsewhere, particularly the US, showed that planning did not bring home the bacon.  The plans themselves distracted management and often impeded the flexibility of firms to adapt to shifting circumstances.  By the Keating era, the Labor Government itself had quietly shelved or downgraded its industry plans in favour of more economy-wide measures.

The appointment of American Paul Anderson as Managing Director of BHP signalled radical changes at BHP across a range of areas.  Many previous ways of operating had run out of steam.  It had become clear that globalisation required a fundamental change for BHP if it was to remain competitive.  As an outsider, Anderson carried none of the baggage of the past which had, for example, led BHP to incur lavish termination payments as a means of exiting its Newcastle facility.  As an American, he had an indifference to unions rather than the Anglo-Australian tradition of deference.  And with Labor out of office in Canberra a key underpinning of union influence was removed.

BHP's move to introduce individual agreements to its Pilbara operations is also the final chapter in the long march by mining company managements to regain control of their operations from the unions.

The process started in 1986 at Robe River, where productivity had been falling and the owners found they were paying an increasing proportion of the workforce to be nothing more than full-time union agitators.  Initiating the reform took great courage on the part of the firm's management, especially its CEO Charles Copeman.  In addition to desperate and vicious union attempts to maintain themselves as the virtual management of the facility, the local supervisory staff had little stomach for a fight, and rival firms (including BHP) did not want the system threatened.  Copeman also confronted intense denigration from ALP Governments in both Western Australia and Canberra.

In addition Robe River faced the hostility of an Industrial Relations Commission guarding the myth that it was the arbitrator of workforce arrangements even where it simply endorsed lavish conditions the unions had extracted.  The Robe River dispute did much to undermine the old industrial relations framework and laid the groundwork for the vast improvements in industrial relations seen in Australia over the past decade.  From the dispute, which lasted over a year, Robe achieved a doubling of labour productivity.  CRA followed suit in its own Pilbara mines and achieved similar gains.  The Pilbara, once notorious for massive overmanning and union demarcation disputes, has, until the recent dispute, become an island of industrial peace and prosperity.

BHP's management had long displayed strong skills in playing the industrial relations game.  But working within the system could not provide the quantum leap in productivity achieved by Robe's staff-run Pilbara facility, and later at CRA's operation.  BHP found in the Pilbara -- as P&O is finding on the wharves -- that there are areas where unions are institutionally unable to accept the workplace gains that are delivered by individual staff agreements.

The advantage of workers being on staff is greater flexibility and mutual agreement of the firm's employees and its owners on the best outcomes for the business.  This brings win-win outcomes which mean benefits for shareholders, and better pay and more satisfying employment conditions for the workers.  For BHP it had become clear that the improvements from this approach that its competitors in the Pilbara were reaping were bringing richer dividends than the traditional industrial relations approach.

Compared with the bitterly fought pioneering struggle at Robe River, the unions' fight against BHP's decision has been foot-stamping and posturing.  This dispute is their last gasp.  And even if BHP concedes some on-going role to unions, they will cover few workers and be influential in even fewer workplace issues.  The individual employment agreement approach is now the dominant form of workplace arrangement in Western Australia.  It is generally recognised as bringing improved productivity and wages.

The changed approach of BHP in its iron ore operations may presage similar changes in its coal mining and steel divisions.  The unions' reactions to the Pilbara change show this is uppermost in their minds.  Even though, over recent years, the traditional union bargaining approaches have delivered good productivity gains in both steel and coal, the Western Australian developments are a direct shot across the union bows.

This is particularly so for the CFMEU.  At the very least BHP's determination to effect change in the Pilbara demonstrates the Big Australian will adopt an alternative approach if there is a return to the bad old days of confrontation in the coal mines.  The union leaders' nightmare is that it may even trigger a renewed acceleration of the decline in union importance and membership.



Wednesday, January 19, 2000

Give GM Foods A Go

The green/consumerist radicals had great sport last year in demonising genetically modified foods.  They catapulted GM foods into one of the celebrated causes at the World Trade Organisation meeting in Seattle.  And prior to Seattle they managed to create public alarm with none-too-subtle images of "Frankenfoods" and claims that genetically modified tomatoes were really pigs in disguise.

There was no substance behind the fear-mongering, as scientific advisers told their political masters.  Even so, politicians in Australia and many other countries were spooked into contemplating draconian regulatory measures to control GM products.

And the radicals claimed the scalp of Monsanto, the firm most associated with biotechnology and food.  The NGOs organised an internet attack to undermine the value of the Monsanto brand name, forcing a sale of its GM division.

Data now becoming available show productivity from GM products falling short of Green Revolution breakthroughs but still delivering valuable improvements.  The increased yield for the two main GM crops in the US, corn and cotton, averages 6% and 11% respectively.  Some nations can afford to forego such gains, but Australia, with agriculture providing 30 per cent of exports, must remain in the vanguard of competitiveness and cannot reject technology that improves agricultural productivity.

Issues of productivity aside, the built-in insecticides at the core of most GM products are leading to reduced use of agricultural chemicals.  US studies show chemical spraying for GM crops fell by 14% for corn and a massive 72% for cotton.  This is a major commercial saving -- pest control accounts for 12% of the costs of producing cotton.  And, of course, lower levels of spraying pay an environmental dividend.  When confronted with this evidence, the radicalised environmental groups have buried their heads in the sand rather than risk alienating some of their support base by praising business innovation.

In assaulting genetic engineering the radical lobby steered clear of attacking genetically modified products used in medicines.  The public would just not buy that.  Some GM medicines that are clear improvements on the "natural" products are already around.  These include the drug to combat growth hormone deficiency, which previously resulted in premature deaths when it was derived from a "natural" source, the pituitary glands of corpses.

Currently, the improved productivity from GM crops is difficult for consumers to translate into tangible benefits as the lower price is not apparent.  Consumer views will change as improved new GM products in the pipeline are brought to market.  These include:

  • Vitamin A enriched rice which will reduce the annual death and blindness toll in children with Vitamin A deficiency;
  • the elimination of a natural allergenic protein in milk that prevents many children from consuming the product;
  • eggs with high levels of peptides to allow healthier weight reduction.

The first round of GM improvements to agricultural products came without visible improvement to the consumer.  Scaremongers had a field day pushing the fear buttons.  With subsequent rounds of technology developments, quality improvements will add to the productivity gains.  Significant public opposition to GM products will disappear, just as it has for the products of selective breeding that form the bulk of our present diet.



Sunday, January 16, 2000

The Sorcerers' Apprehension

Do you want some comforting thoughts to help you face the tribulations that various experts have predicted for the years ahead?  Cast your mind over all those terrifying forecasts about the world-wide catastrophes that the millennium bug was going to cause.

Meltdowns at nuclear power stations;  devastating food shortages as essential services and transport systems collapsed;  planes and satellites falling from the sky;  global economic depression;  these and other expected calamities would lead to social mayhem.  Some on the loony right of American politics even convinced themselves that Bill Clinton would use this turmoil to declare martial law and make himself president for life.  (Surprisingly, they didn't seem to recognise the bright side of this scenario -- wacky Al Gore would never get to occupy the White House).

Faced with the embarrassing disparity between their forecasts of doom and the actual course of events, Y2K consultants resorted to what can be called the "witch-doctor gambit".  When things turn out well, sorcerers in tribal societies attempt to claim all the credit for everyone's good fortune.  But when disaster strikes it is because the sorcerers were not taken seriously enough, or because they were given too few pigs or shells, or for some other self-serving reason.

Some information technology specialists tried to portray themselves as heroes whose timely warnings and valiant efforts had saved our civilisation from a terrible fate.  Unfortunately for them however, there was just too little difference between the experiences of those who paid the sorcerers a great deal, and those who virtually ignored them.

As the year 2000 began, the only place that seemed to suffer serious computer problems was The Gambia, an almost make-believe nation which occupies a sliver of country 295 km long and 25 km wide in the middle of West Africa, and whose whole economy is based on peanuts.  But things are always breaking down in The Gambia.  So no-one can be sure whether the real culprit was the Y2K bug, or the normal chaos of everyday life.

Nevertheless, it would be quite wrong to argue that the hundreds of billions of dollars that were spent around the world in attempting to rectify the Y2K problem were completely wasted.  The failure of computer programmers in the early days of the industry to allow for years to be entered with four digits rather than just two could have caused some computers to treat the year 2000 as 1900, with potentially serious consequences for systems which depend on accurate dates.
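The mechanics of the bug are easy to demonstrate.  The following sketch (illustrative only -- no real legacy system's code, and the function names are invented) shows how date arithmetic on two-digit years goes wrong at the rollover:

```python
# Illustration of the classic two-digit year bug: a system that stores
# only the last two digits of the year cannot tell 2000 from 1900, so
# elapsed-time calculations across the rollover go negative.

def years_elapsed_two_digit(start_year, end_year):
    """Elapsed years as many legacy systems computed them,
    keeping only the last two digits of each year."""
    return (end_year % 100) - (start_year % 100)

def years_elapsed_four_digit(start_year, end_year):
    """The correct calculation with full four-digit years."""
    return end_year - start_year

# A record opened in 1995 and checked in 2000:
buggy = years_elapsed_two_digit(1995, 2000)     # (0 - 95) = -95 years
correct = years_elapsed_four_digit(1995, 2000)  # 5 years
```

A system relying on such arithmetic for interest accrual, licence expiry or maintenance scheduling would conclude that five years of elapsed time was minus ninety-five -- which is why dates in critical systems genuinely did need checking, even if the apocalyptic forecasts did not follow.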

It would have been highly irresponsible to ignore the risks to computer controlled operations in certain critical areas -- nuclear weapons systems, aviation, and essential services such as health, water and electricity.  In this sense, there was at least one significant difference between tribal sorcerers and Y2K consultants.  Despite what some people are now claiming, the latter were not involved in a complete con.  A genuine problem did exist, and it was necessary to take certain precautions.

But how much caution was justified?  The humbug which surrounded the Y2K bug offers some useful lessons for thinking about other areas of contemporary life which also attract self-interested doomsayers, such as the state of the natural environment.

Both greens and Y2K consultants would argue that they are forced to hype up the dangers, because otherwise they might be ignored.  As one prominent Cassandra once confessed, in order to obtain public support, scientists involved in environmental causes "have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts that we might have.  Each of us has to decide what the right balance is between being effective and being honest".

The trouble with this kind of approach is that it ignores one of the more valuable ideas that environmentalists have popularised -- the need to think "holistically".  Nothing exists in isolation, so we should always consider the effects that our actions might have on other areas of concern, no matter how unrelated these might initially seem.

This means recognising that precautions always come with costs.  Resources devoted to addressing one set of possible dangers inevitably require funds to be diverted from somewhere else.

Many governments, organisations and individuals spent vast sums of money to fix potential Y2K problems which at worst would have produced only minor irritations, or which could have been fixed by a simple manual adjustment of the date once the millennium began.  As spokespeople for various interest groups like public health and education observed, some of the $12 billion spent on the Y2K bug in Australia alone could have found far better use in dealing with problems in their own sectors.

When every potential risk, from greenhouse gases to the Y2K bug to genetically modified foods, is presented as threatening the end of the world as we know it, it becomes so much harder to apportion our limited resources on the basis of a rational assessment and comparison of the genuine threats that we may face.

Rather, the greater share of funds will go towards meeting the concerns of those who can develop and successfully market the most disturbing scenarios, whether or not their jeremiads are really justified.  And as each scary prediction proves unfounded, the voices of doom become more and more strident, crowding out the calm and rational public debate on which sensible policy depends.

These conditions are fine for talented and unconscionable doomsters, whether their focus is technology, the environment, or some other seemingly plausible calamity.  But their successes are invariably won at the expense of the broader community, which at best has its resources misallocated, and at worst finds that economic and technological developments that can bring new sources of prosperity are being prevented.

Faced with advocates who promise to deliver us from disaster, informed scepticism is the only cautionary approach that always repays its costs.



Tuesday, January 11, 2000

Apocalypse now?  Let's just hang on a moment ...

No wonder Bob Carr always looks morose.  The poor man has fallen for the apocalyptic scenarios promoted by the peddlers of environmental gloom.

In a remarkable cry from the heart, the NSW premier announced that there is little hope for humanity ("The Doomsday Millenium", SMH 6 January).  World population numbers have reached catastrophic levels and even in the unlikely event that they are reined in, it will be too late.  The global environment has been irreversibly damaged.  Perhaps the most we can wish for is to be distracted by the tinsel of mariachi bands and fireworks displays.

What I found particularly surprising was the premier's apparent certainty about the bleakness of the future.  Less erudite greenies might be forgiven if they are blind to the hazards of predicting outcomes in systems as complex as those involving human societies and the physical environment.  But as a well-read and thoughtful man, Bob Carr has no such excuse.  He must be aware of history's marvellous tendency to utterly confound the jeremiads of experts.

The past few decades have been particularly unkind to predictions made by those of Mr Carr's persuasion.  When he stated that "only 30 years ago ...  global warming was considered a possibility for late in the 21st century" he was telling only half the story.  Thirty years ago, many scientists, including a number who are now riding the global warming bandwagon, were confidently warning about the ominous signs of world-wide temperature declines caused by industrial pollution.  A new ice age, bringing devastating crop failures, would soon begin.

And thirty years ago Paul Ehrlich, the grand master of scary environmental stories induced by the "population explosion", was in full flight.  The world's supply of oxygen would be seriously depleted by the burning of fossil fuels and the clearing of tropical soils.  By the mid-1970s 200,000 people a year would be dying from "smog disasters" in American cities.  American life expectancy would drop to 42 years by 1980 as a result of pesticide-induced cancers.  Ehrlich also told British biologists that if he was a gambler, he "would take even money that England will not exist in the year 2000".

Questioned about his unenviable record as an environmental tipster a few years ago Ehrlich explained, "Everyone wants to know what's going to happen.  So the question is, do you say 'I don't know', in which case they all go back to bed -- or do you say, 'Hell, in ten years you're likely to be going without food and water' and [get] their attention".  Professor Ehrlich clearly got Bob Carr's undivided attention.

But those who foreclose hope by presenting the unknowable as awful and inexorable certainty are being wildly irresponsible, particularly when they are people whose words might be expected to carry some authority.  Why strive for a better society, or for the amelioration of environmental problems, when things are so bleak?  Young people foolish enough to take seriously Premier Carr's insinuation that they are unlucky to be alive at this "very scary moment" are more likely to take heavy solace in mind-altering substances than seek to become good citizens.

Mr Carr's argument about overpopulation and the devastation of nature depends on the idea that specific environments have a fixed carrying capacity, which can be "outstripped" by the birthrate.  Because carrying capacity appears to be a soundly-based and quantifiable scientific concept, it is a powerful weapon for those who want to claim that there are intractable limits to population growth.

But while carrying capacity is great for mobilising anxiety about the environment, it is a very dubious scientific concept, at least as far as humans are concerned.  Attempts to provide actual numbers for the earth's supposed carrying capacity produce extremely diverse results.  The biologist and demographer Joel Cohen recently noted that in 1994 alone, published scientific estimates varied from less than 3 billion people to 44 billion, or over 7 times the world's current population.

Similar problems arise when claims are made about environmental capacity at a regional or national level.  As Australian geographer Harold Brookfield has observed, despite repeated predictions about the human carrying capacity of country after country for over 50 years, "in almost every significant case these limits have been exceeded, while in most cases the present people are now better off than their less numerous predecessors".

Professor Brookfield could have added that at least in many developed countries, the environment is also better off.  Analyses of major environmental indicators in the United States and Canada by the Fraser Institute and the Pacific Research Institute point to dramatic declines in overall pollution levels and significant improvements in the condition of forests, waterways, and wildlife habitat over the past couple of decades.  Scientific and technological advances, flexible economic and political institutions, and a widespread commitment to sound environmental management all combine to allow a larger and wealthier population to create more, rather than "less nature".

Under the appropriate social, economic and legal conditions, such causal links can exist even in Africa, which Mr Carr seems to regard as beyond all hope.  In a highly acclaimed recent study, Mary Tiffen and her associates examined changes over a 60 year period in a densely populated region of Kenya.  They discovered that rapid population growth had been a driving force for greater prosperity and environmental improvement.  Researchers in some other parts of Africa are making similar findings.

The greatest threat to prosperity and effective conservation measures in much of sub-Saharan Africa comes less from increasing numbers of people than from the terrible ravages of AIDS on the most productive and talented sections of the population.  There is a danger that many Western environmentalists, deluded into believing that Africa already has too many people, will conclude that this tragedy is actually good for the planet.

This is not overwrought speculation.  Lynn White, the historian whose famous and highly influential 1967 article blaming environmental degradation on the Judeo-Christian tradition made him an early hero to greenies, later wrote an extraordinary essay called "The future of compassion".  This called for Christians to assist "a drastic global rollback in population".  White said that while he hesitated to pray for a new Black Death himself, "perhaps, whether we pray for it or not, a global atomic war will once more temporarily solve the population problem".

I am sure that Premier Carr would be as appalled as I am by such demented statements.  But they are little more than the logical outgrowth of the kind of foolishness he offered Herald readers last week.



Sunday, January 02, 2000

Moving Back to Full Employment

In August 1999, 653,000 Australians were officially unemployed, with an unemployment rate of seven per cent after eight years of sustained economic growth.  Almost 30 per cent, or 192,000 people, had been unemployed for a year or more:  of these, 113,000 had been unemployed for two years or more.

This represents a tremendous waste of human potential and lost production.  It also increases costs to taxpayers and imposes great strains on the welfare system and community services.

It fosters a sense of hopelessness and alienation, particularly in the long term or chronically unemployed.  Unemployed people have increased levels of sickness and mental illness, and there is also a high level of correlation between young male unemployment and young male suicide.

The average duration of unemployment has steadily increased in Australia since the early 1970s when it was as low as seven weeks.

Economic growth does increase employment, but has not been sufficient.  Since the early 1970s, Australia has been experiencing what might be called the "rising mountain" pattern of unemployment, where each business cycle has seen unemployment peak at a higher level than the previous one.


Finding the Factors

The deteriorating performance of the Australian labour market is the result of a range of factors:

  • governments have loaded more and more regulations on employment, raising the costs and risks of hiring;
  • youth wage rates have continued to rise towards adult rates, disadvantaging young people against those with greater experience, productivity and proven track records (almost 40 per cent of the unemployed are aged 15 to 24 years);
  • the award system's complex series of minimum wages makes illegal a whole series of potentially successful bids for employment by the unemployed.  This has a range of effects, including ensuring that the results of the wage "break-outs" of the Whitlam and late Fraser years remained embedded in the system, retarding employment recovery.  It also inhibits the ability of regions hit by economic change to adjust, resulting in higher structural unemployment.

Regaining full employment

Ultimately the best protection for workers is full employment, the ability to easily change jobs to a more satisfactory one.  It is precisely the most disadvantaged workers, such as people with a disability, who most benefit from full employment, and who are most hurt by entrenched unemployment, since they are the ones most likely to fall by the wayside as the threshold of employment is raised.

So the greatest step to improving the employment prospects for the disabled is to lower the threshold at which workers and employers can reach mutually satisfactory employment exchanges.  This sets in train a virtuous circle of higher employment leading to higher production and incomes, feeding back into more jobs being offered and so it goes on.

Legitimate concerns that some people may thereby fall below an acceptable benchmark of income can be met by devices such as an "earned income tax credit", where the government tops up employment income.  Such an arrangement could be extended to provide various forms of extra assistance to those, such as the disabled, with extra needs.

Even that may not be enough for some people who have become so divorced from the employment experience that their productivity is not sufficient to interest any likely employer without extra assistance even in a de-regulated labour market.  In such cases, wage subsidies are a rational policy response.  Work-for-the-dole can also have particular value in keeping people in touch with the employment experience.


Setting the policy direction

So the appropriate policy thrust is clear:

  • government should stop intervening in labour markets in ways which discourage employment;
  • an earned income tax credit or similar mechanism, rather than regulation of wages, should be used to "top up" labour incomes to an acceptable standard;
  • such "top-ups" should include adjustments for specific disadvantages;
  • reciprocal obligation should be extended via such programs as work-for-the-dole;  and
  • bridging wage subsidies should be used for those whose productivity is insufficient to interest any likely employer.

The income "top up" mechanism would raise issues of effective tax rates and it, along with the wage subsidies, would involve significant expenditures.

Such expenditures would be offset by reductions in unemployment benefits and increased tax receipts.


Consider the opposition

Most of the government interventions in the labour market ostentatiously give benefits to the employed majority, with the costs being hidden and largely carried by the unemployed minority.

They are also justified on the basis of grand social values, like equity, social justice, stopping exploitation and so forth.  Unemployment is also a risk confined to a relatively small percentage of the population:  if the young and the long-term unemployed are excluded, unemployment rates over the last 25 years have been in the range of two to four per cent.

This explains why persistent mass unemployment has been so tolerated and why it becomes salient as a political issue only when unemployment is rising rapidly, that is, when there is suddenly a much larger pool of people threatened by unemployment:  either themselves or a member of their close family.

There is also a range of special interests fostered by the current arrangements, particularly all those employed in or around the arbitration system, lawyers, union and employer organisers, industrial relations specialists.

For more than 25 years, this conjunction of factors has been sufficient to keep Australia's labour market institutions in place:  resulting in entrenched mass unemployment and consequent waste and human suffering.  Like other disadvantaged workers, the disabled suffer particularly badly from this failure.

Justified Contempt

Also published in the Canberra Times on 22.01.2000 as "Protests against WTO a case of never mind the consequences, feel the pretension"

It has long been an open question which group or institution in Australian public life engages in behaviour most worthy of contempt.

Is it the ABC, paid for by the taxes of all Australians but which deigns to flatter the opinions only of a narrow slice (and to sneer at the opinions of a much larger slice), whose staff are currently whining that the Party of Menzies will not give it more public money when sneering at all Menzies stood for is the posture of choice for so many ABC staff:  a spectacle at once unctuous and pathetic?

Is it humanities and social science academia, using the taxes of working-class families to fund middle-class advantage, processing educational certification of diminishing worth, rife with the betrayal of their pedagogical duties and intellectual heritage (either actively or in silent compliance) to purvey an obscurantist pomposity which sneers at human achievements past and present in order to polish an overweening moral vanity?

Is it the education unions, in their stringent avoidance of real accountability as they relentlessly defend teaching mediocrity, doing their best to ensure that public schools imitate Soviet production methods with Soviet-style results of increasing cost and decreasing quality, so that their most lively function is to propagandise (badly) at our children?

Is it the unions in general, reacting to the refusal of new workers to engage their dubious, overpriced services by strong-arming State ALP Governments into giving them ever more outrageous legislative privileges in the hope that state coercion can corral more members than their current mere fifth (and falling) of the private-sector workforce?

A difficult question to answer indeed.  Fortunately, the spectacle in Seattle at the WTO conference has clinched the title.  The groups in Australian public life whose behaviour is most worthy of contempt are those advocacy non-government organisations (NGOs) -- the ACFs, Greenpeaces, Community Aid Abroads and their ilk -- who have used their (massively overblown) reputation for altruism and their almost complete lack of accountability to do their bit to trash a major Australian national interest.

First, a bit of history and a smattering of economics.

The World Trade Organisation is the successor to the General Agreement on Tariffs and Trade (GATT).  The GATT was set up as part of the postwar settlement which sought to ensure there was not another World War.  The single thing which most made the Great Depression of the 1930s so severe was the destruction of international trade by "beggar-thy-neighbour" protection policies.  Countries competed in raising tariffs against each other, sending them all spiralling down into mutually-assured economic destruction.  Out of the Depression came Hitler, and the rest is history:  tens of millions of dead and devastated countries.  Something not to be repeated.

Hence the GATT, an organisation membership of which (like the WTO) has always been entirely voluntary.  You didn't have to play if you didn't want to.  But all countries who wish to trade internationally have an interest in having a common set of rules under which said trading takes place.  Which is what the GATT, and now the WTO, has provided.

As the failure of command economics has become ever more obvious, more and more countries have wanted to gain the benefits of trade, and so have joined the WTO.  And since, unlike the UN, the WTO is about something specific and real (trade), it has genuine effectiveness, again unlike the UN.  So, what the WTO is really about is common rules, so that trade continues and grows, the world economy does not get devastated, and countries do not experience profound crises leading to who knows where (and, after Hitler, who wants to take the risk?).  And since no one will suffer more than the poor from a major collapse in world trade, the poor are major beneficiaries of the trading regime.

So, if you are in favour of world peace and development, the WTO has to be seen as a good thing.

As to the economics, we need to remember what trade is.  Trade is commercial exchange for mutual gain.  If both sides did not benefit, it would not continue to happen.  It is precisely because commercial exchanges leave both sides better off that we all engage in them, and trade is just such exchanges across national boundaries.  Which is why countries have a common interest in trade and common rules for same.  It is all about mutual gain and mutual benefit.

Which is why the protectionist policies of the 1930s were not only disastrous in effect, they were criminally stupid.  One does not get rich by impoverishing one's customers:  it is much better to own a shop in a rich suburb than a poor one.  And you certainly don't get prosperous, nor recover from an economic down-turn, by raising the cost of products to your own citizens (which is what tariffs do).  Even now, rich countries mostly trade with each other -- because they are the countries whose consumers have the money to buy things.  More wealthy countries mean more people able to buy more of our products.

The WTO is a way for governments to mutually agree not to do stupid things.  Or, to be a bit more precise, to mutually agree to each defend their own general interest against their own special interests clamouring for special privileges.

With the current WTO agenda, Australia is in an enviable position.  There was no significant local interest which was threatened by anything on the agenda, but, as a major food exporter, we stood to gain a great deal from liberalisation of agricultural markets.  Something else which stood to gain a great deal from such liberalisation is the environment.  Agricultural subsidies in the US and Europe encourage land which should not be in agricultural production to remain so, along with the attendant use of chemicals.  They also depress the world prices for many agricultural products, undermining the incomes of developing world farmers.  More generally, the more prosperous people are, the more concerned they are with environmental amenity (since they are less worried about where their next meal, etc. is coming from) and the more able they are to do something about it.

So, peace, environmental concern, higher incomes for the developing world and Australia's national interest all argued for a good result in Seattle.  So, what were Greenpeace, the ACF, Community Aid Abroad, et al doing?  Campaigning strongly against it.  Why?

Their official position is that the WTO promotes globalisation and that globalisation is bad for democracy, the poor and the environment.  This is a line which does not stand up to even cursory examination and can be dismissed.  What really motivates them is straight institutional interest.

All these bodies are in the moral vanity game.  They sell "warm inner glows" to their supporters.  In Gary Johns' words

NGOs consist of mail-order memberships of the wealthy Left, content to buy their activism and get on with their consumer lifestyle.

Since over a century-and-a-half of polemical endeavour has established that opposing capitalism scores the highest moral vanity points -- and the WTO and globalisation represent capitalism-triumphant -- then being agin the WTO and globalisation is the best game in town for those in the moral vanity game.

What they are really trying to do is to beat up on the WTO so they can be "seen" to be "fighting the good fight" while also hoping to lever their nuisance value into being included in WTO decision-making.  It is a straight play for the danegeld of participation and influence.

And the incomes of Australian and third world farmers?  In the moral vanity game one has to remember the basic principle:  never mind the consequences, feel the pretension.  It was good enough for the apologists of Stalinism, and it is good enough for the anti-globalisation activists.

As for their arguments and antecedents:  who was that guy who used to talk about world-capitalist conspiracies, how malign international forces were secretly plotting to use their wealth to enslave unsuspecting decent people, who thought that nature was the true measure of things, and felt that the environment put strict limits on how many people could live in a certain area?  Was always going on about living space -- lebensraum he called it.  Had a funny moustache ...



And About Time ...

There can be no better time than the beginning of a new millennium to reflect on the nature of time.  While anyone with a correctly set watch can answer the question "what is the time?", take out the "the" and we have a fundamental conundrum;  one that scholars spend years pondering without coming to any satisfactory conclusions.

After quoting a few circular definitions from history's most outstanding thinkers, the respected Cambridge Dictionary of Philosophy tells us that "time might be too basic to admit of a definition".  Some intellectuals are even convinced that time does not really exist;  which is rather a pity given that we spend so much time talking about it and measuring its passage.

Nevertheless, the great scientist Albert Einstein, anointed this week as the Person of the Century by no less an authority than Time Magazine, did argue that time could not be considered independently of space.  According to his theory of relativity, as the speed of an object increases time slows down, until it comes to a complete halt at the speed of light.  A clock in a space craft travelling at 90 per cent of the speed of light, for instance, would take the equivalent of about 138 Earth minutes to record 1 hour, and a person on board the craft would likewise age at less than half the rate of someone who had remained on the Earth.
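
The figures quoted here follow directly from the Lorentz factor of special relativity.  A quick back-of-the-envelope check (illustrative Python, not part of the original column):

```python
import math

def dilation_factor(v_fraction_of_c: float) -> float:
    """Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

gamma = dilation_factor(0.9)   # craft travelling at 90% of light speed
earth_minutes = 60 * gamma     # Earth minutes elapsed per ship hour

print(round(gamma, 2))         # about 2.29
print(round(earth_minutes))    # about 138
```

That is, a clock on the craft runs slow by a factor of roughly 2.29, so one shipboard hour corresponds to about 138 minutes on Earth, and the traveller ages at well under half the Earthbound rate.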

As unlikely as the theory of relativity may seem, it has been confirmed by scientific experiment.  A few years ago, an ultra-precise atomic clock, accurate to one billionth of a second, was taken by jet from London to Washington and back.  The difference between the elapsed time it recorded and that measured on the ground was exactly as Einstein predicted.

Yet the most convincing answer I have seen to the question "what is time?" was graffiti at a university many years ago:  "Time is nature's way of preventing everything from happening at once".  Although Einstein's theory suggests that this insight would not apply at the speed of light, this graffiti has the distinct advantage of ringing true with our everyday perceptions, which many other definitions and explanations do not.

While all this suggests that we ordinary folk are well advised to trust in our common sense and leave the philosophising to the denizens of ivory towers, there are many questions about time that we can get our minds around without too much difficulty.

For some people the most pressing issue relating to time is whether the new millennium really begins today, or in January next year.  In an article in one of the southern newspapers this week, the nation's most-admired pedant, the Federal ALP president and former quiz king Barry Jones, was quoted as plumping for the latter.  This was despite the fact that all his party's premiers were busy hosting millennium celebrations last night.

The problem about the commencement of the millennium arises because our forebears did not have the benefits of globalisation.  The system of counting years from the supposed date of Christ's birth was devised by a sixth century monk known as Dionysius Exiguus, or Dennis the Small.  Although by this time Indian mathematicians had already developed the concept of zero as a number, it was many centuries before scholars in Western Europe learnt about it.  So Dennis made the year of Christ's birth AD 1 (Anno Domini, "the year of our Lord"), rather than Year Zero as an informed Indian would have done.  In any case, Dennis also got the year wrong, as Christ was born during the reign of the Judean king Herod the Great, who died in 4 BC.

So strictly speaking, Barry Jones is right.  But he and all the other "the millennium begins in 2001" dogmatists are wrong psychologically, and as good feelings usually take first priority these days, this is what matters.  A year with three noughts at the end of it is much more compelling and gratifying as the turning point for the millennium than a year that ends with two noughts and a one.

However, if it is really important to bring feelings into line with logic, the solution would be to change 1 BC to 0 AD, and adjust all the other BC dates accordingly.  As few influential people take much interest in antiquity anymore, inconvenience would only be suffered by the declining number of classical scholars.  This dating change could be presented as a nice multicultural gesture, finally giving Indian civilisation its due for the contributions it has made towards our own.  And it would be a wonderful gift to our descendants, who would obtain the benefits as early as the year 2100, when they would be spared all the earnest debates about whether or not a new century had really begun.

Surprisingly perhaps, given the piety of the early Middle Ages, it took a very long time before Dennis the Small's innovation of taking Christ's birth as the starting point of the Western calendar was generally adopted.  The first person to popularise his system was the English cleric known as the Venerable Bede, who used it two centuries later to date events in his widely read work, the Ecclesiastical History of the English People.  But even so, as late as the year 1000 the Anno Domini system was not particularly widespread in Christendom.

So even in Europe the great majority of people lived through the first millennium without actually being aware of it.  We are the first generation in human history to have experienced a millennium change that has been broadly acknowledged.  And that surely, has been something worth celebrating.

One of the other pressing questions exercising some minds is what to call the new decade that begins today.  Even the pedants have to recognise that the nineties are over.  Most of the possibilities suggested so far are pretty ordinary -- the ohs, the zeroes, the uh-ohs, the earlies.  The Australian Broadcasting Corporation, living up to its dreary auntie image, asked a professor of linguistics at Macquarie University to advise on the correct usage.  The good professor offered up "the two thousands", "the twenty-hundreds", and "the twenty-ohs".  Stolid and worthy perhaps, but most unlikely to take off on talkback radio.

In its current issue, the more adventurous British newsmagazine, The Economist, editorialises that "the naughties" captures "the right tone, rhythm and sense of fun", thus giving influential support to the grass-roots campaign launched on the Internet by Sydney artist David Wales.  (Churlishly, The Economist gives no credit to Wales;  continuing a long Pommie tradition of appropriating to itself the results of Australian creativity.)  "Noughties" would be the preferable spelling, however, if only to head off the inevitable complaints from wowsers that the other spelling sets an unfortunate example to impressionable young children.

To me, the most surprising aspect of the change to the new millennium is the lack of protest from the "West is wicked" crew.  After all, there is little that better represents the ascendancy of our civilisation than the near universal triumph of our Gregorian calendar -- named after the pope who introduced it in the sixteenth century.  At the very least, we could have expected demands that we apologise to the world for our calendrical imperialism, which has effectively overwhelmed the thousands of other systems of reckoning dates and times that have been developed by human cultures over the ages.  Why aren't the usual suspects telling us that we are all timeists, complicit in chronocide?

Even many Islamic countries, such as Egypt, Pakistan, Nigeria and Indonesia, have adopted the Gregorian calendar, although some of these countries use the Muslim calendar as well.  Unlike our calendar, which is based on the earth's orbit around the sun, this calendar is based on the lunar cycle, consisting of twelve months of either twenty-nine or thirty days each, giving a 354-day year (sometimes 355 days).  As a result, the months do not keep up with the seasons.  The starting date for the Muslim calendar is the Hegira, Mohammed's move from Mecca to Medina in 622 AD, making today the 24th day of Ramadan, 1420.
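
A little arithmetic shows why the months fail to keep up with the seasons.  The figures below are standard astronomical mean values rather than anything drawn from the column itself, so treat this as an illustrative sketch:

```python
# A lunar year of twelve synodic months falls short of the solar year
# by roughly eleven days, so the months slide backwards through the seasons.
SOLAR_YEAR = 365.2422    # mean tropical year, in days
LUNAR_MONTH = 29.5306    # mean synodic (new moon to new moon) month, in days

lunar_year = 12 * LUNAR_MONTH             # about 354.4 days
drift_per_year = SOLAR_YEAR - lunar_year  # about 10.9 days each solar year
full_cycle = SOLAR_YEAR / drift_per_year  # a month laps the seasons in ~33.6 years
```

On these figures a given lunar month, such as Ramadan, works its way right around the solar year in a little under thirty-four years.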

In fact, however, even in Europe the triumph of the Gregorian calendar was no lay down misère.  As David Ewing Duncan recounts in his engaging book, The Calendar, scholars had long known that the solar year was over eleven minutes shorter than the calendar year, whose duration had been established by astronomers in ancient Roman times during the rule of Julius Caesar.  Over the centuries the steady accumulation of error meant that the calendar was becoming increasingly out of phase with the seasons, and Christian holy days were being celebrated on the wrong dates.

In 1267 the brilliant English friar Roger Bacon wrote to Pope Clement IV warning that something had to be done about this scandalous situation, but it took another three centuries for the church to act.  In the 1570s Pope Gregory XIII established a commission to investigate the necessary reforms, and as a result of the commission's work, ten days were dropped from the calendar in October 1582.  In order to bring the calendar and solar years into closer alignment, it was also determined that only those century years exactly divisible by 400 would be leap years.  This is why this year is a leap year, whereas 1900 was not.

Unfortunately, a lot of people felt that they had lost ten days from their lives, and in some places there were violent demonstrations.  Even worse, Protestant countries such as England and Sweden, and those following the Orthodox faith, saw the whole thing as a Popish plot and refused to adopt the Gregorian calendar.  Britain relented in 1752, by which time it had to drop eleven days to bring itself into line with the rest of Western Europe.  Under the slogan "give us back our eleven days", mobs rioted in London and other centres, and some people were killed.  Most Balkan countries held out until the 20th century, and ironically, it took the Bolshevik Revolution to make Russia adopt Pope Gregory's calendar.

The Gregorian calendar is not totally accurate, however;  it still gains about one day every 3,300 years.  A proposed remedy is a further rule under which years exactly divisible by 4,000 would not be leap years, a refinement which would not kick in for another two thousand years.
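
The rules accumulated over the last two paragraphs can be sketched in a few lines.  Note that the 4,000-year refinement is included here as the proposal it is, not as settled law:

```python
def is_leap(year):
    """Gregorian leap-year test, with the proposed 4,000-year refinement."""
    if year % 4000 == 0:   # proposed refinement only -- not yet adopted anywhere
        return False
    if year % 400 == 0:    # century years divisible by 400 are leap years...
        return True
    if year % 100 == 0:    # ...other century years are not...
        return False
    return year % 4 == 0   # ...and all remaining years divisible by 4 are

# is_leap(2000) is True, while is_leap(1900) is False,
# which is exactly why this year has a 29th of February and 1900 did not.
```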

This should be enough to satisfy everyone, and allow the world to turn its attention to more urgent problems.  But the fires of time reform still burn strong in some people's bellies.  The Swatch watch company wants us to adopt a system that divides the day into a thousand units and abandons time zones so that it is always exactly the same time in any part of the world.  In the hope of making it appealing, they are calling it "Internet time".  But it seems like a none-too-subtle attempt to force us to throw away our existing watches and buy new ones, preferably Swatches.

And then there is the Long Now Foundation, the brainchild of that fervent visionary Stewart Brand, who initiated the Whole Earth Catalog in the late 1960s.  The foundation believes we are too focused on the present, and wants us to start thinking about the next ten thousand years.  To help facilitate this long-term perspective, Brand and his followers use five-digit numbers for dates -- so 01999 has just concluded and 02000 has now begun.  It does mean that there will be plenty of time to solve any Y10K problems.  But somehow I can't see it ever becoming a mass movement in my lifetime.

