Our oil war in Iraq is one of my favourite (albeit unpleasant) topics to grumble about - there has been plenty of commentary on the subject this week, so here we go once again.
I'll start with Michael Schwartz, writing for Tom Dispatch about "Why Did We Invade Iraq Anyway ?". I wonder. From Tom Engelhardt's introduction:
History… phooey!
Or, more mildly, Americans traditionally aren't much interested in it and the media largely don't have time for it either. For one thing, the past is often just so inconvenient. On Monday, for instance, there was a front-page piece in the New York Times by Elisabeth Bumiller on Robert Blackwill, one of the "Vulcans" who helped Condoleezza Rice advise George W. Bush on foreign policy during the 2000 election campaign, Iraq Director on the National Security Council during the reign in Baghdad of our viceroy L. Paul Bremer III, and the President's personal envoy to the faltering occupation (nicknamed "The Shadow"), among many other things.
He is now -- here's a giant shock -- a lobbyist. And, among those he's lobbying for (in this case to the tune of $300,000) is Ayad Allawi, former CIA asset and head -- back in Saddam's day -- of an exile group, the Iraq National Accord. Bumiller identifies Allawi as "the first prime minister of the newly sovereign nation -- America's man in Baghdad." She also refers to him as having had "close ties to the CIA" and points out that he was not just Bremer's, but Blackwill's "choice" to be prime minister back in 2004. Now, he's Blackwill's "choice" again. Allawi is, it seems, yet once more on deck, with his own K-Street lobbyist, ready to step in as prime minister if the present PM, Nouri al-Maliki, were to fall (or be shoved aside).
But there's another rather inconvenient truth about Allawi that goes unmentioned -- and it's right off the front page of the New York Times, no less -- a piece by Joel Brinkley, "Ex-C.I.A. Aides Say Iraq Leader Helped Agency in 90's Attacks," published in early June 2004, just at the moment when Allawi had been "designated" prime minister. In the early 1990s, Brinkley reported, Allawi's exile organization was, under the CIA's direction, planting car bombs and explosive devices in Baghdad (including in a movie theater) in a fruitless attempt to destabilize Saddam Hussein's regime. Of course, that was back when car bombs weren't considered the property of brutes like Sunni extremists, al-Qaeda in Iraq, and the Taliban. Just as, inconveniently enough, back in the 1980s the CIA bankrolled and encouraged the training of Afghan "freedom fighters" in mounting car-bomb and even camel-bomb attacks in a terror campaign against Soviet officers and soldiers in Russian-occupied Afghan cities (techniques personally "endorsed," according to Steve Coll in his superb book Ghost Wars, by then-CIA Director William Casey).
But that was back in the day -- just as, to randomly cite one more inconvenient piece of history also off the front page of the New York Times (Patrick Tyler, "Officers Say U.S. Aided Iraq in War Despite Use of Gas," August 18, 2002), years before we went into Iraq to take out Saddam's by then nonexistent weapons of mass destruction, we helped him use them. The Reagan Pentagon had a program in which 60 officers from the U.S. Defense Intelligence Agency "were secretly providing detailed information on Iranian deployments" to Saddam's forces, so that he could, among other things, wield his chemical weapons against them more effectively. ("The Pentagon 'wasn't so horrified by Iraq's use of gas,' said one veteran of the program. 'It was just another way of killing people -- whether with a bullet or phosgene, it didn't make any difference.'")
Of course, when it comes to America's oily history in Iraq, there is just about no backstory -- not on the front page of the New York Times, not basically in the mainstream. Even at this late date, with the price of crude threatening to head for the $100 a barrel mark, Iraqi oil is -- well, not exactly censored out -- just (let's face it) so darn embarrassing to write about. In fact, now that all those other explanations for invading Iraq -- WMD, freedom, you name it -- have long since flown the coop, there really is no explanation (except utter folly) for Bush's invasion. So, better to move on, and quickly at that. These last months, however, Tomdispatch has returned repeatedly to the subject as a reminder that history, even when not in sight, matters. And the deeper you go, as Michael Schwartz proves below, the more likely you are to find that gusher you're looking for.
And from the article itself (I love that Wolfowitz quote - it's a great one to use on those remaining deluded morons still babbling about weapons of mass destruction and the war on terror):
That these justifications for invading, or remaining, are unsatisfying is hardly surprising, given the reluctance of American politicians to mention the approximately $10-$30 trillion of oil lurking just beneath the surface of the Iraq "debate" -- and not much further beneath the surface of Iraqi soil. Obama, for example, did not mention oil at all in his speech, while Clinton mentioned it twice in passing. President Bush and his top officials and spokespeople have been just as reticent on the subject.
Why then did the U.S. invade Iraq? Why is occupying Iraq so "vital" to those "national security interests" of ours? None of this makes sense if you don't have the patience to drill a little beneath the surface – and into the past; if you don't take into account that, as former Deputy Secretary of Defense Paul Wolfowitz once put it, Iraq "floats on a sea of oil"; and if you don't consider the decades-long U.S. campaign to control, in some fashion, Middle East energy reservoirs. If not, then you can't understand the incredible tenaciousness with which George W. Bush and his top officials have pursued their Iraqi dreams or why -- now that those dreams are clearly so many nightmares -- even the Democrats can't give up the ghost.
The Rise of OPEC
The United States viewed Middle Eastern oil as a precious prize long before the Iraq war. During World War II, that interest had already sprung to life: When British officials declared Middle Eastern oil "a vital prize for any power interested in world influence or domination," American officials agreed, calling it "a stupendous source of strategic power and one of the greatest material prizes in world history."
This led to a scramble for access during which the United States established itself as the preeminent power of the future. Crucially, President Franklin Delano Roosevelt successfully negotiated an "oil for protection" agreement with King Abdul Aziz Ibn Saud of Saudi Arabia. That was 1945. From then on, the U.S. found itself actively (if often secretly) engaged in the region. American agents were deeply involved in the overthrow of a democratically elected Iranian government in 1953 (to reverse the nationalization of Iran's oil fields), as well as in the fateful establishment of a Baathist Party dictatorship in Iraq in the early 1960s (to prevent the ascendancy of leftists who, it was feared, would align the country with the Soviet Union, putting the country's oil in hock to the Soviet bloc).
U.S. influence in the Middle East began to wane in the 1970s, when the Organization of the Petroleum Exporting Countries (OPEC) was first formed to coordinate the production and pricing of oil on a worldwide basis. OPEC's power was consolidated as various countries created their own oil companies, nationalized their oil holdings, and wrested decision-making away from the "Seven Sisters," the Western oil giants -- among them Shell, Texaco, and Standard Oil of New Jersey -- that had previously dominated exploration, extraction, and sales of black gold.
With all the key oil exporters on board, OPEC began deciding just how much oil would be extracted and sold onto international markets. Once the group established that all members would follow collective decisions -- because even a single major dissenter might fatally undermine the ability to turn the energy "spigot" on or off -- it could use the threat of production restrictions, or the promise of expansion, to bargain with its most powerful trading partners. In effect, a new power bloc had emerged on the international scene that could -- in some circumstances -- exact tangible concessions even from the United States and the Soviet Union, the two superpowers of the time.
Though the United States was largely self-sufficient in oil when OPEC was first formed, the American economy was still dependent on trading partners, particularly Japan and Europe, which themselves were dependent on Middle Eastern oil. The oil crises of the early 1970s, including the sometimes endless gas lines in the U.S., demonstrated OPEC's potential.
It was in this context that the American alliance with the Saudi royal family first became so crucial. With the largest petroleum reserves on the planet and the largest production capacity among OPEC members, Saudi Arabia was usually able to shape the cartel's policies to conform to its wishes. In response to this simple but essential fact, successive American presidents strengthened the Rooseveltian alliance, deepening economic and military relationships between the two countries. The Saudis, in turn, could normally be depended upon to use their leverage within OPEC to fit the group's actions into the broader aims of U.S. policy. In other words, Washington gained favorable OPEC policies mainly by arming, and propping up a Saudi regime that was chronically fragile. ...
The second Bush administration ascended to the presidency just as American influence in the Middle East looked to be on the decline. Despite victory in the first Gulf War and the fall of the Soviet Union, American influence over OPEC and oil policies seemed under threat. That sucking sound everyone suddenly heard was a tremendous increase in the global demand for oil. With fears rising that, in the very near future, such demand could put a strain on OPEC's resources, member states began negotiating ever more vigorously for a range of concessions and expanded political power in exchange for expanded energy production. By this time, of course, the United States had joined the ranks of the energy deficient and dependent, as imported oil surged past the 50% mark.
In the meantime, key ally Saudi Arabia was further weakened by the rise of al-Qaeda, which took as its main goal the overthrow of the royal family, and its key target -- think of those unintended consequences -- the American troops triumphantly stationed at permanent bases in the country after Gulf War I. They seemed to confirm the accusations of Osama bin Laden and other Saudi dissidents that the royal family had indeed become little but a tool of American imperialism. This, in turn, made the Saudi royals increasingly reluctant hosts for those troops and ever more hesitant supporters of pro-American policies within OPEC.
The situation was complicated further by what was obvious to any observer: The potential future leverage that both Iraq and Iran might wield in OPEC. With the second and third largest oil reserves on the planet -- Iran also had the second largest reserves of natural gas -- their influence seemed bound to rise. Iraq's, in particular, would be amplified substantially as soon as Saddam Hussein's regime was freed from severe limitations imposed by post-war UN sanctions, which prevented it from either developing new oil fields or upgrading its deteriorating energy infrastructure. Though the leaders of the two countries were enemies, having fought a bitter war in the 1980s, they could agree, at least, on energy policies aimed at thwarting American desires or demands -- a position only strengthened in 1998 when the citizens of Venezuela, the most important OPEC member outside the Middle East, elected the decidedly anti-American Hugo Chavez as president. In other words, in January 2001, the new administration in Washington could look forward to negotiating oil policy not only with a reluctant Saudi royal family, but also a coterie of hostile powers in a strengthened OPEC.
It is hardly surprising, then, that the new administration, bent on unipolarity anyway and dreaming of a global Pax Americana, wasted no time implementing the aggressive policies advocated in the PNAC manifesto. According to then Secretary of the Treasury Paul O'Neill in his memoir The Price of Loyalty, Iraq was much on the mind of Defense Secretary Donald Rumsfeld at the first meeting of the National Security Council on January 30, 2001, seven months before the 9/11 attacks. At that meeting, Rumsfeld argued that the Clinton administration's Middle Eastern focus on Israel-Palestine should be unceremoniously dumped. "[W]hat we really want to think about," he reportedly said, "is going after Saddam." Regime change in Iraq, he argued, would allow the U.S. to enhance the situation of the pro-American Kurds, redirect Iraq toward a market economy, and guarantee a favorable oil policy.
The adjudication of Rumsfeld's recommendation was shuffled off to the mysterious National Energy Policy Development Group that Vice President Cheney convened as soon as Bush took occupancy of the Oval Office. This task force quickly decided that enhanced American influence over the production and sale of Middle East oil should be "a primary focus of U.S. international energy policy," relegating both the development of alternative energy sources and domestic energy conservation measures to secondary, or even tertiary, status. A central goal of the administration's Middle East focus would be to convince, or coerce, states in that region "to open up areas of their energy sectors to foreign investment"; that is, to replace government control of the oil spigot -- the linchpin of OPEC power -- with decision-making by multinational oil companies headquartered in the West and responsive to U.S. policy needs. If such a program could be extended even to a substantial minority of Middle Eastern oil fields, it would prevent coordinated decision-making and constrain, if not break, the power of OPEC. This was a theoretically enticing way to staunch the loss of American power in the region and truly turn the Bush years into a new unipolar moment in the Middle East.
Having determined its goals, the Task Force began laying out a more detailed strategy. According to Jane Mayer of the New Yorker, the most significant innovation was to be a close collaboration between Cheney's energy crew and the National Security Council (NSC). The NSC evidently agreed "to cooperate fully with the Energy Task Force as it considered the 'melding' of two seemingly unrelated areas of policy: 'the review of operational policies towards rogue states,' such as Iraq, and 'actions regarding the capture of new and existing oil and gas fields.'"
Though all these deliberations were secret, enough of what was going on has emerged in these last years to demonstrate that the "melding" process was successful. By March of 2001, according to O'Neill, who was a member of both the NSC and the task force:
"Actual plans.... were already being discussed to take over Iraq and occupy it -- complete with disposition of oil fields, peacekeeping forces, and war crimes tribunals -- carrying forward an unspoken doctrine of preemptive war."
O'Neill also reported that, by the time of the 9/11 attacks on the World Trade Center and the Pentagon, the plan for conquering Iraq had been developed and that Secretary of Defense Rumsfeld indeed urged just such an attack at the first National Security Council meeting convened to discuss how the U.S. should react to the disaster. After several days of discussion, an attack on Iraq was postponed until after al-Qaeda had been wiped out and the Taliban driven from power in Afghanistan. It took only until January 2002 -- three months of largely successful fighting in Afghanistan -- before the "administration focus was returning to Iraq." It wasn't until November 2002, though, that O'Neill heard the President himself endorse the invasion plans, which took place the following March 20th. ...
Revealingly enough, Greenspan saw the invasion of Iraq as a generically conservative action -- a return, if anything, to the status quo ante that would preserve unencumbered American access to sufficient Middle Eastern oil. With whole new energy-devouring economies coming on line in Asia, continued American access seemed to require stripping key Middle Eastern nations of the economic and political power that scarcity had already begun to confer. In other words, Greenspan's conservative urge implied exactly the revolutionary changes in the political and economic equation that the Bush administration would begin to test out so disastrously in Iraq in March 2003. It's also worth remembering that Iraq was only considered a first pit stop, an easy mark for invasion and occupation. PNAC-nurtured eyes were already turning to Iran by then as indicated by the classic prewar neocon quip, "Everyone wants to go to Baghdad. Real men want to go to Tehran."
And beyond this set of radical changes in the Middle East lay another set for the rest of the world. In the twenty-first century, expanding energy demand will, sooner or later (probably sooner), outdistance production. The goal of unfettered American access to sufficient Middle Eastern oil would, if achieved and sustained, deprive other countries of sufficient oil, or require them to satisfy U.S. demands in order to access it. In other words, Greenspan's conservative effort to preserve American access implied a dramatic increase in American leverage over all countries that depended on oil for their economic welfare; that is, a radical transformation of the global balance of power.
Notice that these ambitions, and the actions taken to implement them, rested on a vision of an imperial America that should, could, and would play a uniquely dominant, problem-solving role in world affairs. All other countries would, of course, continue to be "vulnerable to economic crises" over which they would have "little control." Only the United States had the essential right to threaten, or simply apply, overwhelming military power to the "problem" of energy; only it had the right to subdue any country that attempted to create -- or exploit -- an energy crisis, or that simply had the potential and animus to do so. ...
As worldwide demand for hydrocarbons soared, the United States was left with three policy choices: It could try to combine alternative energy sources with rigorous conservation to reduce or eliminate a significant portion of energy imports; it could accept the leverage conferred on OPEC by the energy crunch and attempt to negotiate for an adequate share of what might soon enough become an inadequate supply; or it could use its military power in an effort to coerce Middle East suppliers into satisfying American requirements at the expense of everyone else. Beginning with Jimmy Carter, five U.S. presidents chose the coercive strategy, with George W. Bush finally deciding that violent, preemptive regime change was needed to make it work. The other options remain unexplored.
Now wouldn't the first option be a lot better (and cheaper) ?
SO, HOW MUCH IS THE IRAQ WAR about the four freedoms and how much is it about oil? Former Federal Reserve Chairman Alan Greenspan wrote in his recently published memoir that, "the Iraq War is largely about oil." General John Abizaid (ret.) -- a Visiting Fellow at the Hoover Institution, and former CENTCOM big brass in Iraq -- said in a round table, "Courting Disaster: The Fight for Oil, Water and a Healthy Planet," at the Freeman Spogli Institute, Stanford University, on October 13, 2007, "Of course it's about oil, we can't really deny that." Jim Holt, a regular contributor to The New York Times Magazine and The New Yorker, submits a coherent explanation in "It's the Oil" (London Review of Books, October 18, 2007). Writes Holt:
Iraq has 115 billion barrels of known oil reserves. That is more than five times the total in the United States. And, because of its long isolation, it is the least explored of the world's oil-rich nations. A mere two thousand wells have been drilled across the entire country; in Texas alone there are a million. It has been estimated, by the Council on Foreign Relations, that Iraq may have a further 220 billion barrels of undiscovered oil; another study puts the figure at 300 billion. If these estimates are anywhere close to the mark, US forces are now sitting on one quarter of the world's oil resources. The value of Iraqi oil, largely light crude with low production costs, would be of the order of $30 trillion at today's prices. For purposes of comparison, the projected total cost of the US invasion/occupation is around $1 trillion.
THIS IS THE STRATEGIC PRIZE that has been heralded by Dick Cheney. Holt, who I suppose could not have had his piece published in the American media, asserts that from the perspective of this formidable wealth to be conquered, "The costs -- a few billion dollars a month plus a few dozen American fatalities (a figure which will probably diminish, and which is in any case comparable to the number of US motorcyclists killed because of repealed helmet laws) -- are negligible compared to $30 trillion in oil wealth, assured American geopolitical supremacy and cheap gas for voters. In terms of realpolitik, the invasion of Iraq is not a fiasco; it is a resounding success."
"NEGLIGIBLE COSTS" COMPARED to the potential wealth the war will create. Americans are known, and often envied, for their pragmatic optimism and for their ways to devise solutions based on careful cost analysis. The war has already been a "resounding success" for the military-industrial-congressional complex whose profits and stocks have soared in the past six years (e.g., GE's shares have more than doubled; Halliburton went from $5 to $40; Top executives are paid in the tens of millions -- our taxes at work, folks). Profits remain the central nervous system of the American experiment, and for the few who indeed profit immensely from the policies they put in place and fully control in total disregard of the well-being of the vast majority, the second Gilded Age has finally dawned upon them. Ben Stein, who writes the "Everybody's Business" column in the business section of The New York Times most Sundays, lamented recently that "socially responsible investors shun companies that do military contracting." (NYT, September 30, 2007: "Is It Responsible to Shun Military Contractors?"). Says Stein: "We are currently in a war that is about creating a better, more dignified planet and we are fighting enemies who openly say they want to kill everyone who is not their slavish follower." Apparently, Stein has been drinking the Kool-Aid that is so prevalent and liberally dispensed in the corridors of power. He goes on to flatly contradict the assessment of Alan Greenspan and John Abizaid. It's not about oil. We only import 20 percent of our needs from the Middle East. We should be thankful for the great prosperity oil has brought us thanks to the oil companies. His column stirred me enough to trigger a letter of mine:
Dear Mr. Stein,
I am an avid reader of your regular column in the business section of the Sunday NYT. I find them often pertinent and quite instructive. They seem to come from a conservative background, but one that has never forgotten its human soul, sense of fairness, and compassion.
In your last column, "Is it Responsible to Shun Military Contractors?" (NYT, 09/30/2007), you asked whether someone could explain why socially responsible investors refrain from investing in military contractors, and you went on to brush aside one potent reason behind the current conflict(s) in the Middle East -- namely oil -- by positing that only about 20% of the oil we import comes from that region. The vast majority of our imports, you asserted, is delivered to our shores thanks to the diligent work of the oil companies, without the intervention of the US military.
I've worked for many years in the oil industry (France -- my country of origin -- Bermuda, and the U.S.) and see no reason to bash it, except for the obscene compensation of its executives (but this latter point is not limited to that industry). Indeed, as you stated, "The staggering prosperity of this country, of the whole developed world, floats on oil."
However, I respectfully submit to you that your facts are slightly twisted and your premises incorrect. You ignore the "staggering" amount of waste associated with our material prosperity. You appear to disregard the projections of the DOE that show how much more dependent our economy will become on Mideastern hydrocarbon resources (oil and natural gas) in the next two or three decades -- the "strategic prize," according to Mr. Cheney. Furthermore, you appear to miss the significance of this mere 20% and its substantial growth in the future (again, please refer to the DOE's projections). Imagine that amount diverted to Japan, India, and China and you will quickly measure the negative consequences on our economy. Finally, you do not even broach the currency issue. What happens to the value of the dollar if Mideastern producers switch to the Euro or the Yen, as Saddam Hussein was planning, and Iran is implementing, is another reason for our military intervention in that region. Black gold and greenbacks!
Which brings me to your premise regarding the quagmire we have willfully created: We are not at war with people who want to destroy us and kill us all. That's an ideological battle that cannot be solved through military means. A few thousand crazies (if they are) won't defeat the West. Like all crazies (if they are) they will dissolve in time. Waging war on flimsy and mostly fabricated evidence can only bring more despair and devastation. Your analogy with WWII, which mirrors the rhetoric advanced by the media and government officials, confuses the actualities that rational and reasonable people must face. There was no Mesopotamian Hitler and there is no Iranian Hitler. Remember FDR's saying about fear?
Socially responsible investors refuse to support the purveyors of the industrial-military-congressional complex because we have long reached the conclusion that war was not the answer to the many challenges humanity confronts. You sold your tobacco shares long ago, and you note that they have done very well ever since. I never invested in them, as I have not invested in the shares of military contractors, which have done extremely well too in the past six years.
And I am a smoker!
Yours sincerely,
NOW, I CAN UNDERSTAND the profit motive and the alluring $30 trillion prize, but that someone of Ben Stein's caliber can posit that the Iraq War is about a "better, more dignified planet" truly boggles the mind. Perhaps Stein should go and read Bill Safire's "On Language" column in the NYT Magazine of October 7, 2007, to grasp the meaning of a "willing suspension of disbelief." What kind of a dignified planet has Mr. Stein in mind? One that has seen one million Iraqis, half children, killed during a dozen years of ignoble economic sanctions? One that has seen over one million Iraqis killed and over 4 million internally or externally displaced since March 2003? One that cannot even feed its own poor and the wretched masses -- according to the UN Food and Agricultural Organization, over 854 million people worldwide? Or is it a planet in which the dominant player cannot provide universal health care to its citizenry and whose president just vetoed a bill (S-CHIP) that would have insured an additional 5 out of 9 million uninsured American kids -- for whom insurance means a trip to the Emergency Room, in the thinking of the Decider?
There is a small rash of posts amongst the centre left - Atrios, Matt Yglesias, John Quiggin, Joseph Romm - wondering what the fuss about peak oil is. Don't hybrids solve the problem, they ask ? While I'm far from doomerish in my assessment of the peak oil challenge, I think these guys are oversimplifying things (which is a typical reaction when confronted with Kunstler's outlandish predictions).
There are 3 intertwining effects (4 if you consider global warming):

1. Oil and gas production will decrease basically continuously over a long period of time
2. A large percentage of the world's population is getting richer as it industrialises
3. The world's population will increase by around 50% over the next 40 years
These mean that per capita oil consumption rates in the West will drop dramatically - we won't get the lion's share of the world's oil as we become less economically dominant, and the reduced volume of oil will be shared between more and more people.
Thus a (maybe) 50% gain in fuel efficiency - assuming the entire vehicle fleet turns over and is converted into hybrids at a rapid rate (which is a pretty big assumption) - won't be enough of an adjustment. We need a huge investment in renewable energy sources, smart grids to distribute the electricity and an electric transport system (both cars and rail) to supersede the existing one that is nearing obsolescence.
I don't believe that industrial civilisation will collapse, but I don't believe everyone going out and buying a hybrid (assuming the manufacturers can scale up fast enough) will be enough to solve the problem either.
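To put some rough numbers on that argument, here's a back-of-the-envelope sketch. All the figures in it - the decline rate, the West's share of consumption, the fleet-wide efficiency gain - are illustrative assumptions of mine rather than hard data, but they show why efficiency alone doesn't close the gap:

```python
# Back-of-envelope sketch of the per-capita oil squeeze described above.
# Every number here is an illustrative assumption, not a forecast.

years = 40
decline_rate = 0.02          # assumed average annual decline in world oil production
western_share_now = 0.55     # assumed Western share of world oil consumption today
western_share_future = 0.40  # assumed share once the rest of the world industrialises
population_growth = 1.5      # ~50% more people in 40 years (as per the list above)

production_future = (1 - decline_rate) ** years                      # ~45% of today's output
western_supply = production_future * (western_share_future / western_share_now)
per_capita_supply = western_supply / population_growth               # crude: ignores regional growth differences

print(f"Western per-capita oil supply in 40 years: ~{per_capita_supply:.0%} of today's")

fuel_needed = 1 - 0.5        # a 50% fleet-wide efficiency gain halves fuel use per km
print(f"Fuel needed per person at today's travel levels: {fuel_needed:.0%} of today's")
```

Even with fairly generous assumptions the supply side ends up somewhere around a fifth of today's per-capita level, while a fully hybridised fleet still wants roughly half - and that remaining gap is what renewables, smart grids and electrified transport have to fill.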
After doing a review of a Herman Kahn book in my last post, I was pleasantly surprised to see his name crop up when I wandered over to Beyond the Beyond today, with Bruce looking at Jamais Cascio's post "The Politics of Geoengineering". Is Cascio the new Kahn ?
(((I wince when I drop by Cascio's blog these days, as he has an almost Herman Kahn-like willingness to publicly ponder the unthinkable. Pray that this business about the Earth's faltering ability to absorb carbon isn't true, because if it is, we've walked into a planetary death-trap that's like some monster La Brea tar pit.))) Link: Open the Future: The Politics of Geoengineering.
"Geoengineering -- or, as I sometimes call it, re-terraforming the Earth -- is back in the news, with a sobering editorial in today's New York Times by Carnegie's Dr. Ken Caldeira. Caldeira's commentary arrives in the wake of news that the geophysical mechanisms for cycling carbon dioxide out of the atmosphere are beginning to slow down, thereby increasing the degree to which CO2 accumulates as a greenhouse gas. (((AIEEEEEEE!!!)))
"This is exactly the kind of news that makes one suspect that we may not have the time to re-imagine our urban systems, transform our agricultural methods, and move to a carbon-free economy. Geoengineering seems to provide a solution (of varying appeal) for just this kind of situation, focusing not on resolving the causes of global climate disruption, but on ameliorating the symptoms.
"I've addressed the question of support for or opposition to geoengineering in the past, and given its increasing visibility, debates among scientists, environmentalists, and engineers are not hard to find. But these debates center on the scientific risks and merits of the re-terraforming proposals. Few people, regardless of position, have focused on a fundamental non-geophysical risk of the method: political control, costs, and stability...."
(((Now that I think about it, my new novel is all about this subject. I didn't think it was gonna be, but now that I've got it done, it's clear that it pretty much is.)))
Cool - a new Sterling novel on the way - it has been years since I last read one...
Paul Krugman had a fun article in the New York Times last week - Gore Derangement Syndrome - looking at the neoconservative meltdown over Al Gore's Nobel Peace Prize win (apparently the last time this happened was when Martin Luther King won the award back in 1964).
On the day after Al Gore shared the Nobel Peace Prize, The Wall Street Journal’s editors couldn’t even bring themselves to mention Mr. Gore’s name. Instead, they devoted their editorial to a long list of people they thought deserved the prize more.
While Krugman concentrates on the frothing far right, they aren't the only ones having conniptions - Alexander Cockburn at Counterpunch (which is nominally far left, though it does wander erratically about the political compass at times in search of suitable vantage points to achieve their mission of throwing mud at everyone) also emitted an angry anti-Gore outburst, accusing the rise of concern about global warming of being nothing but a stunt put on for the benefit of the nuclear industry.
The UN often has an inside track on the "Peace" prize. The UN Peace-Keeping Forces got it in 1988. In 1986 another enthusiast for attacking Iraq and Iran, Elie Wiesel, carried off the trophy. Aside from Kissinger, probably the biggest killer of all to have got the peace prize was Norman Borlaug, whose "green revolution" wheat strains led to the death of peasants by the million. ...
The specific reason why this man of blood shares the 2007 Nobel Peace Prize with the IPCC is for their joint agitprop on the supposed threat of anthropogenic global warming. Bogus science topped off with toxic alarmism. It's as ridiculous as if Goebbels got the Nobel Peace Prize in 1938, sharing it with the Kaiser Wilhelm Institute for his work in publicizing the threat to race purity posed by Jews, Slavs and gypsies. ...
Of course Al Gore has been a shill for nuclear power ever since he came of age as a political harlot for the Oak Ridge nuclear laboratory in his home state of Tennessee. The practical beneficiary of the baseless hysteria over "anthropogenic global warming" is the nuclear power industry. This very fall, as Peter Montague describes at length in our current CounterPunch newsletter, this industry is reaping the fruits of Al Gore's campaigning. Congress has finally knocked aside the regulatory licensing processes that have somewhat protected the public across recent decades. The starting gun has sounded, and just about the moment Gore and his co-conspirators at the IPCC collect their prizes, the bulldozers will be breaking ground for the new nuclear plants soon to spring like Amanita phalloides--just as deadly--across the American landscape.
While the whole piece is wildly contrarian, one piece of character assassination thrown in by Cockburn is the accusation that "green revolution" pioneer Norman Borlaug (who also won the Nobel Peace Prize) was responsible for "wheat strains that led to the death of peasants by the million". While I have no comprehension of what he is going on about here (as I understand it, the green revolution led to a massive fall in starvation and malnutrition and was responsible for a surge in global population levels, rather than enormous mounds of dead peasants), it does provide a nice lead in to one of those meandering rants I like to embark on from time to time (which once again will give Bart an opportunity to call me long-winded and in dire need of an editor - or as one redditor once put it "this is a colossal assemblage of other people's reviews and redundant analysis").
So - no news tonight, just an exploration of some more of the history around The Limits To Growth and related memes.
Now, the "fat man" referred to in the title isn't Al Gore (whose weight seems to be an issue amongst some parts of the blogosphere though he doesn't look all that large to me), but instead one of the more unusual icons of the cold war era (and one of those wisdom deficient products of the RAND Corporation I referred to in "The Shockwave Rider") - strategist and futurist Herman Kahn.
In the post "Silent Spring" era, environmental issues made their way to the forefront of popular thinking for a while, prompting books like "The Limits To Growth", which I've gone on about at length previously, and more alarmist works like Paul Ehrlich's "The Population Bomb". Herman Kahn was a conservative who set up, along with a group of other RANDians, an organisation called the Hudson Institute, which Wikipedia refers to as "the organization about which the phrase 'think tank' was originally coined".
The Hudson Institute seems to have drifted ever further to the right over the years, which is an achievement of sorts given its starting position (though to be fair, Kahn himself seemed a fairly moderate conservative in some ways). One of their original bugbears was the Club Of Rome and the environmental movement in general, and it has been a bastion of global warming denial in more recent times. If the SourceWatch profile is correct, they also seem to have a strong interest in promoting industrialised agriculture and opposing organic farming (their stance on apple pie and being nice to puppies is unknown but they may well oppose these too).
If you go to the Wikipedia link and the SourceWatch profile you'll find lists of the scary crew of characters, such as Richard "the prince of darkness" Perle, associated with the organisation today (with the honourable exception of William Odom, who has been strong on opposing the Iraq war and the expansion of the surveillance state).
The New Yorker has an interesting article on Herman Kahn himself, which should give you an idea of what he was all about and some of his stranger beliefs, which resulted in him being one of the inspirations for the "Dr Strangelove" character in Stanley Kubrick's movie.
Herman Kahn was the heavyweight of the Megadeath Intellectuals, the men who, in the early years of the Cold War, made it their business to think about the unthinkable, and to design the game plan for nuclear war—how to prevent it, or, if it could not be prevented, how to win it, or, if it could not be won, how to survive it. The collective combat experience of these men was close to nil; their diplomatic experience was smaller. Their training was in physics, engineering, political science, mathematics, and logic, and they worked with the latest in assessment technologies: operational research, computer science, systems analysis, and game theory. The type of war they contemplated was, of course, never waged, but whether this was because of their work or in spite of it has always been a matter of dispute. Exhibit A in the case against them is a book by Kahn, published in 1960, “On Thermonuclear War.”
Kahn was a creature of the RAND Corporation, and RAND was a creature of the Air Force. In 1945, when the United States dropped atomic bombs nicknamed Little Boy and Fat Man on Japan, the Air Force was still a branch of the Army. The bomb changed that. An independent Department of the Air Force was created in 1947; the nation’s nuclear arsenal was put under its command; and the Air Force displaced the Army as the prima donna of national defense. Whatever it wanted, it mostly got. One of the things it wanted was a research arm, and RAND was the result. (RAND stands for Research ANd Development.) RAND was a line item in the Air Force budget; its offices were on a beach in Santa Monica. Kahn joined in 1947.
In his day, Kahn was the subject of many magazine stories, and most of them found it important to mention his girth—he was built, one journalist recorded, “like a prize-winning pear”—and his volubility. He was a marathon spielmeister, whose preferred format was the twelve-hour lecture, split into three parts over two days, with no text but with plenty of charts and slides. He was a jocular, gregarious giant who chattered on about fallout shelters, megaton bombs, and the incineration of millions. Observers were charmed or repelled, sometimes charmed and repelled. Reporters referred to him as “a roly-poly, second-strike Santa Claus” and “a thermonuclear Zero Mostel.” He is supposed to have had the highest I.Q. on record.
Sharon Ghamari-Tabrizi’s “The Worlds of Herman Kahn” (Harvard; $26.95) is an attempt to look at Kahn as a cultural phenomenon. (Kahn is the subject of a full-length biography with a similar title, “Supergenius: The Mega-Worlds of Herman Kahn,” by a former colleague, Barry Bruce-Briggs, which, though partisan, is thorough and informed, and which Ghamari-Tabrizi, strangely, never mentions.) She is not the first to treat Kahn as more an artist than a scientist. In 1968, when Kahn was at the height of his celebrity, Richard Kostelanetz wrote a profile of him for the Times Magazine in which he suggested that Kahn had “a thoroughly avant-garde sensibility.” He meant that Kahn was uninhibited by conventional ways of thinking, alert to abandon positions that were starting to seem obsolete, continually trying to find new ways to see around the next corner.
The defense policy of the Eisenhower Administration, announced by Secretary of State John Foster Dulles in an address to the Council on Foreign Relations in 1954, was the doctrine of “massive retaliation.” Dulles explained that the United States could not afford to be prepared to meet Soviet aggression piecemeal—to have soldiers ready to fight in every place threatened by Communist expansion. The Soviets had a bigger army, and they threatened in too many places. The solution was to make it clear that the American response to Soviet aggression anywhere would be a nuclear attack, at a time and place of America’s choosing. It was a first-strike policy: if provoked, the United States would be the first to use the bomb. An overwhelming nuclear arsenal therefore acted as a deterrent on Soviet aggression. Eisenhower called the policy the New Look.
The New Look was good for the Air Force, because it made the nuclear arsenal, and its delivery system of bombers and, later on, missiles, the country’s principal strategic resource. But the analysts at RAND considered massive retaliation a pathetically crude idea, an atomic-age version of Roosevelt’s big stick. They thought that it was practically an invitation to the Soviets to precede any local aggression by a preëmptive first strike on American bomber bases, eliminating the nuclear threat on the ground and forcing the United States into the land war it was unprepared to fight. There was also a major credibility problem. How aggressive did the Soviets need to be to trigger a thermonuclear response? Was the United States willing to kill millions of Russians, and to put millions of Americans at risk of dying in a counterattack, in order to prevent, say, South Korea from going Communist? Or West Berlin? There had to be some options available between disapproval and annihilation. The doctrine of massive retaliation was a deterrent—a way to prevent war—but it was inherently destabilizing. National defense policy required something more nuanced, and figuring out what, since Eisenhower was uninterested, fell to the people at RAND.
Kahn began working on the problem not long after Dulles’s speech. In 1959, he spent a semester at the Center for International Studies, at Princeton, and then toured the country delivering lectures on deterrence theory. In 1960, Princeton University Press published a version of the lectures (with much added material) as “On Thermonuclear War.” Kahn was not really a writer, and his book—six hundred and fifty-one pages—is shaggy, overstuffed, almost free-associational, with a colorful use of capitalization and italics, long excurses on the strategic lessons of the First and Second World Wars, and the sorts of proto-PowerPoint charts and tables that Kahn used in his lectures.
“On Thermonuclear War” (Bruce-Briggs suggests that the title, an allusion to Clausewitz’s “On War,” was devised by the publisher) is based on two assertions. The first is that nuclear war is possible; the second is that it is winnable. Most of the book is a consideration, in the light of these assumptions, of possible nuclear-war scenarios. In some, hundreds of millions die, and portions of the planet are uninhabitable for millennia. In others, a few major cities are annihilated and only ten or twenty million people are killed. Just because both outcomes would be bad on a scale unknown in the history of warfare does not mean, Kahn insists, that one is not less bad than the other. “A thermonuclear war is quite likely to be an unprecedented catastrophe for the defender,” as he puts it. “But an ‘unprecedented’ catastrophe can be a far cry from an ‘unlimited’ one.” The opening chapter contains a table titled “Tragic but Distinguishable Postwar States.” It has two columns: one showing the number of dead, from two million up to a hundred and sixty million, the other showing the time required for economic recuperation, from one year up to a hundred years. At the bottom of the table, there is a question: “Will the survivors envy the dead?”
Kahn believed—and this belief is foundational for every argument in his book—that the answer is no. He explains that “despite a widespread belief to the contrary, objective studies indicate that even though the amount of human tragedy would be greatly increased in the postwar world, the increase would not preclude normal and happy lives for the majority of survivors and their descendants.” For many readers, this has seemed pathologically insensitive. But these readers are missing Kahn’s point. His point is that unless Americans really do believe that nuclear war is survivable, and survivable under conditions that, although hardly desirable, are acceptable and manageable, then deterrence has no meaning. You can’t advertise your readiness to initiate a nuclear exchange if you are unwilling to accept the consequences. ...
The most infamous pages in “On Thermonuclear War” concern survivability. What makes nuclear war different, Kahn points out, is not the number of dead; it’s a new element—the problem of the postwar environment. In Kahn’s view, the dangers of radioactivity are exaggerated. Fallout will make life less pleasant and cause inconvenience, but there is plenty of unpleasantness and inconvenience in the world already. “War is a terrible thing; but so is peace,” he says. More babies might have birth defects after a nuclear war, but four per cent of babies have birth defects anyway. Whether we can tolerate a slightly higher percentage of defective children is a question of trade-offs. “It might well turn out,” Kahn suggests, “that U.S. decision makers would be willing, among other things, to accept the high risk of an additional one percent of our children being born deformed if that meant not giving up Europe to Soviet Russia.”
The book proposes a system for labelling contaminated food so that older people will eat the food that is more radioactive, on the theory that “most of these people would die of other causes before they got cancer.” It advocates providing citizens with hand-held radium dosimeters, which will allow them to measure the radioactivity their own bodies have absorbed. One symptom of radioactive poisoning is nausea, Kahn explains, and, when one person vomits, people around him will start to vomit, convinced that they are dying. If the dosimeter indicates that no one has received more than an acceptable dose of radiation, everyone can stop throwing up and get back to work reconstructing the economy. Kahn dismisses the notion that a society that has just suffered the obliteration of its cities, the contamination of its soil and water, and the massacre of a large portion of its population might lack the civic virtue and moral fibre necessary to rebuild. “It is my belief that if the government has made at least moderate prewar preparations, so that most people whose lives have been saved will give some credit to the government’s foresight, then people will probably rally round,” he writes. “It would not surprise me if the overwhelming majority of the survivors devoted themselves with a somewhat fanatic intensity to the task of rebuilding what was destroyed.” The message of the book seemed to be that thermonuclear war will be terrible but we’ll get over it. ...
In its first three months, “On Thermonuclear War” sold more than fourteen thousand copies. The book received praise from a few prominent disarmament advocates and pacifists: A. J. Muste, Bertrand Russell, and the historian and senatorial candidate H. Stuart Hughes, who called it “one of the great works of our time.” They thought that, by making nuclear exchange seem not only possible but nearly unavoidable, Kahn had, intentionally or not, presented a case for disarmament. Not only pacifists believed this. “If I wanted to convince a skeptic that there is no security in the balance of terror which American policy is committed to maintaining, I would send him to the works of Herman Kahn far sooner than to the writings of the unilateralists and the nuclear pacifists,” Norman Podhoretz later wrote.
Other reactions were more predictable. The National Review thought that the book was not hard enough on Communism. New Statesman called it “pornography for officers.” The Daily Worker called it “useful.” In Scientific American, James R. Newman, the editor of the popular anthology “The World of Mathematics,” said that it was “a moral tract on mass murder: how to plan it, how to commit it, how to get away with it, how to justify it.” Though Kahn’s book is an assault on the overwhelming-force mentality of Dulles and the generals at the Strategic Air Command (who, Kahn once told them, dreamed of a “wargasm”), it is also an attack on the anti-nuclear movement and the belief that nuclear war means the end of life as we know it. Most anti-nuclear advocates thought that arguing that a nuclear war was winnable only made one more likely. An official of the American Friends Service Committee compared Kahn to Adolf Eichmann, and he became one of the movement’s favorite monsters. His house was picketed.
The best-known response to “On Thermonuclear War” was a movie. Stanley Kubrick began reading intensively on nuclear strategy soon after he finished directing “Lolita,” in 1962. His original plan was to make a realistic thriller. One of his working titles was taken from an article by Wohlstetter in Foreign Affairs, in 1959: “The Delicate Balance of Terror” (an article that anticipated many of Kahn’s arguments in “On Thermonuclear War”). But Kubrick could not invent a plausible story in which a nuclear war is started by accident, so he ended up making a comedy, adapted from a novel, by a former R.A.F. officer, called “Red Alert.”
“The movie could very easily have been written by Herman Kahn himself,” Midge Decter wrote in Commentary when “Dr. Strangelove” came out, in 1964. This was truer than she may have known. Kubrick was steeped in “On Thermonuclear War”; he made his producer read it when they were planning the movie. Kubrick and Kahn met several times to discuss nuclear strategy, and it was from “On Thermonuclear War” that Kubrick got the term “Doomsday Machine.” The Doomsday Machine—a device that automatically decimates the planet once a nuclear attack is made—was one of Kahn’s heuristic fictions. (The name was his own, but he got the idea from “Red Alert,” which he, too, had admired.) In Kahn’s book, the Doomsday Machine is an example of the sort of deterrent that appeals to the military mind but that is dangerously destabilizing. Since nations are not suicidal, its only use is to threaten. “The whole point of the Doomsday Machine is lost if you keep it a secret!” as Strangelove complains to the Soviet Ambassador.
There were a number of possible models for the character of Strangelove (who at one point tells the President about a report on Doomsday Machines prepared by the Bland Corporation): Wernher von Braun, Teller, even Henry Kissinger, who was an admirer of “On Thermonuclear War,” and whose book “Nuclear Weapons and Foreign Policy” (1957) pondered the possibility of tactical nuclear wars. Peter Sellers picked up the accent from the photographer Arthur Fellig, known as Weegee, when he was visiting the studio to advise Kubrick on cinematographic matters. But one source was Kahn. Strangelove’s rhapsodic monologue about preserving specimens of the race in deep mineshafts is an only slightly parodic version of Kahn. There were so many lines from “On Thermonuclear War” in the movie, in fact, that Kahn complained that he should get royalties. (“It doesn’t work that way,” Kubrick told him.) Kahn received something more lasting than money, of course. He got himself pinned in people’s minds to the figure of Dr. Strangelove, and he bore the mark of that association forever.
The subject of the Doomsday Machine made an odd reappearance a couple of months ago in Slate - The Return of the Doomsday Machine? - with the article suggesting it may have been real - and perhaps still is. More game theory being played as part of the "new cold war", perhaps ?
"The nuclear doomsday machine." It's a Cold War term that has long seemed obsolete.
And even back then, the "doomsday machine" was regarded as a scary conjectural fiction. Not impossible to create—the physics and mechanics of it were first spelled out by U.S. nuclear scientist Leo Szilard—but never actually created, having a real existence only in such apocalyptic nightmares as Stanley Kubrick's Dr. Strangelove.
In Strangelove, the doomsday machine was a Soviet system that automatically detonated some 50 cobalt-jacketed hydrogen bombs pre-positioned around the planet if the doomsday system's sensors detected a nuclear attack on Russian soil. Thus, even an accidental or (as in Strangelove) an unauthorized U.S. nuclear bomb could set off the doomsday machine bombs, releasing enough deadly cobalt fallout to make the Earth uninhabitable for the human species for 93 years. No human hand could stop the fully automated apocalypse.
An extreme fantasy, yes. But according to a new book called Doomsday Men and several papers on the subject by U.S. analysts, it may not have been merely a fantasy. According to these accounts, the Soviets built and activated a variation of a doomsday machine in the mid-'80s. And there is no evidence Putin's Russia has deactivated the system.
Instead, something was reactivated in Russia last week. I'm referring to the ominous announcement—given insufficient attention by most U.S. media (the Economist made it the opening of a lead editorial on Putin's Russia)—by Vladimir Putin that Russia has resumed regular "strategic flights" of nuclear bombers. (They may or may not be carrying nuclear bombs, but you can practically hear Putin's smirking tone as he says, "Our [nuclear bomber] pilots have been grounded for too long. They are happy to start a new life.") ...
Moving back to The Fat Man, Kahn himself continued writing after he left RAND, and published a number of books during his time at the Hudson Institute, one of which, "The Next 200 Years" (written in 1976), I read a couple of months back.
"200 Years" was a reaction against what Kahn called the "doomsday literature" (which was rather ironic, given his background) of the neo-Malthusians, as expressed in The Limits To Growth, The Population Bomb and other similar books, and the "current malaise" the world was experiencing at the time. The RAND Corporation still includes the book in their list of "50 Books for Thinking About the Future Human Condition".
This is a companion piece to Kahn’s book on the year 2000. In it he basically responds to the doom-and-gloom scenario that was laid out in the famous Limits to Growth study from the Club of Rome. Kahn argues that population is the primary driver of the longer-range future and that high world population growth rates around 1976 were an anomaly when looking at historic rates. He thought that the population growth rates would decline after 1976 (which they have) and that would make all the difference. He argues at length that if population levels off at about 15 billion people (probably a high estimate in today’s thinking) that the major problems addressed by the Limits to Growth study could be handled by technology. For those who think that population is the primary driver of the world’s problems, this is a well-argued screed for creating a sustainable earth if we can get population under control. In any event, this book is useful for its serious attempt at looking as far as 200 years into the future.
The book looks at the issue of growth and how it might be handled in 5 main areas - population, energy, raw materials (minerals in particular), food and the environment. In some ways it could be looked at as a deliberately optimistic thought experiment in dealing with the Limits To Growth, rather than a denial that such limits exist at all.
As I go on about the subject of energy every day I won't say too much about that aspect here. Kahn argues that energy will go from an exhaustible resource (i.e. a largely fossil fuel based one) to an inexhaustible one (based on renewable - or what he calls "long term" - sources). To his credit he bites the bullet (possibly against his own personal preferences) and considers a scenario where you simply accept that the days of fossil fuels are numbered (without quibbling over how much oil, coal and gas is actually available), ignore the nuclear power option because of its drawbacks, and instead try to work out whether you can transition to a state where everything is based on solar power (including derivatives such as wind, bioconversion and ocean power), geothermal power and increased energy efficiency. He concludes that there is abundant energy available from these sources and that we should be in a position to have transitioned to them by 2050 (which is basically the same conclusion I reached after a year or two of looking at the subject).
Of course, when modelling energy usage, you need to consider both expected population levels and per-capita consumption. Kahn was fervently against "Population Bomb" style doomerism and believed the population would plateau at around 15 billion people in 200 years' time. He noted that exponential growth could not and would not continue, that population growth was already slowing at the time the book was written, and that he expected it to slow further over time (while insisting that this was not happening because any physical limit to growth had been reached).
Now segments of the peak oil world tend to have a markedly different view on future population trends (some of the most unpleasant being those of the "internet age's ultimate neo-malthusian" Jay Hanson and his Die Off site - "a population crash resource page" - and its equally toxic successor, the national socialist "War Socialism" site).
As I've delivered my highly unfavourable verdict on these before, I'll look elsewhere for a good example of the "peak oil = population crash" line of thought - this article at TOD Canada on "World Energy and Population: Trends to 2100" fits the bill nicely.
Throughout history, the expansion of human population has been supported by a steady growth in our use of high-quality exosomatic energy. The operation of our present industrial civilization is wholly dependent on access to a very large amount of energy of various types. If the availability of this energy were to decline significantly it could have serious repercussions for civilization and the human population it supports.
This paper constructs production models for the various energy sources we use and projects their likely supply evolution out to the year 2100. The full energy picture that emerges is then translated into a population model based on an estimate of changing average per-capita energy consumption over the century. Finally, the impact of ecological damage is added to the model to arrive at a final population estimate.
This model, known as the "World Energy and Population" model, or WEAP, suggests that the world's population will decline significantly over the course of the century. ...
The population model is based mainly on the long-term aggregate effects of energy decline. The mechanisms of the population decline it projects are not specified. However, it is likely that they will include such things as major regional food shortages, a spread of diseases due to a loss of urban medical and sanitation services and an increase in deaths due to exposure to heat and cold.
The main interaction in the model is between the energy available at any point in time (shown in Figure 13) and an estimate of average global per capita consumption. Current global consumption is about 1.7 toe per person per year, and in the model that declines evenly to a consumption of 1.0 toe per person per year by 2100. To put that in perspective, the world average in 1965 was 1.2, so the model is not predicting a huge decline below that level of consumption. An increase in the disparity between rich and poor nations is also likely, but that effect is masked by this approach.
Under those assumptions, the world population would rise to about 7.5 billion in 2025 before starting an inexorable decline to 1.8 billion by 2100. ...
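To make the mechanics of that kind of projection concrete, here is a minimal sketch of the energy-to-population accounting described above - my own toy reconstruction, not the author's actual WEAP model. Only the per-capita assumptions (about 1.7 toe per person now, declining evenly to 1.0 toe per person by 2100) come from the article; the total energy supply figures are hypothetical placeholders chosen to roughly reproduce the quoted endpoints.

```python
# Toy version of the energy-to-population accounting described above.
# Per-capita assumptions (1.7 toe/person now, falling evenly to 1.0 by 2100)
# come from the article; the energy supply figures are hypothetical
# placeholders, NOT the paper's actual projections.

def per_capita_consumption(year, start=2005, end=2100, c0=1.7, c1=1.0):
    """Linear decline from c0 to c1 toe per person over the century."""
    frac = min(max((year - start) / (end - start), 0.0), 1.0)
    return c0 + (c1 - c0) * frac

# Assumed total energy supply in billions of toe per year (placeholders).
energy_supply = {2005: 11.0, 2025: 11.6, 2050: 8.0, 2075: 4.0, 2100: 1.8}

for year, supply in energy_supply.items():
    implied_population = supply / per_capita_consumption(year)  # billions
    print(f"{year}: about {implied_population:.1f} billion people implied")
```

The point of the sketch is just to show how tightly the model's population path is determined by its two energy assumptions - change either curve and the headline population numbers change with it.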
Now my personal view is that this is a gross underestimate of how much renewable energy we can and will exploit, which therefore invalidates the conclusions entirely. The link between population growth and available energy possibly isn't quite as clear as many peak oil observers seem to think either - check out this graph from the Buckminster Fuller Institute, which shows an inverse relationship between energy consumption and population growth, as one interesting counter-example.
Recent population models show population leveling off at around 8.9 billion in 2050 (assuming both that fertility rates continue to fall slowly in the developing world, and that efforts to stem the growth of the HIV/AIDS epidemic are successful), though the latest update is now predicting a population level of 9.2 billion, due to slower than expected declines of fertility in developing countries and increasing longevity in richer countries.
For a great visual demonstration of what is happening with family sizes and longevity throughout the world, watch Hans Rosling's TED Talk from a few years ago (the relevant part starts about 4 minutes in).
I'm fond of quoting Stewart Brand's "4 environmental heresies", one of which is that concern about population growth is unfounded now that it seems likely to stall at a sustainable level, with the global surge into cities driving the drop in growth rates (one of the reasons for my "cities are the future" slogan, even though some people don't seem to be in favour of the idea, or misinterpret what is going on).
For 50 years, the demographers in charge of human population projections for the United Nations released hard numbers that substantiated environmentalists' greatest fears about indefinite exponential population increase. For a while, those projections proved fairly accurate. However, in the 1990s, the U.N. started taking a closer look at fertility patterns, and in 2002, it adopted a new theory that shocked many demographers: human population is leveling off rapidly, even precipitously, in developed countries, with the rest of the world soon to follow. Most environmentalists still haven't got the word. Worldwide, birthrates are in free fall. Around one-third of countries now have birthrates below replacement level (2.1 children per woman) and sinking. Nowhere does the downward trend show signs of leveling off. Nations already in a birth dearth crisis include Japan, Italy, Spain, Germany, and Russia -- whose population is now in absolute decline and is expected to be 30 percent lower by 2050. On every part of every continent and in every culture (even Mormon), birthrates are headed down. They reach replacement level and keep on dropping. It turns out that population decrease accelerates downward just as fiercely as population increase accelerated upward, for the same reason. Any variation from the 2.1 rate compounds over time.
That's great news for environmentalists (or it will be when finally noticed), but they need to recognize what caused the turnaround. The world population growth rate actually peaked at 2 percent way back in 1968, the very year my old teacher Paul Ehrlich published The Population Bomb. The world's women didn't suddenly have fewer kids because of his book, though. They had fewer kids because they moved to town.
Cities are population sinks - always have been. Although more children are an asset in the countryside, they're a liability in the city. A global tipping point in urbanization is what stopped the population explosion. As of this year, 50 percent of the world's population lives in cities, with 61 percent expected by 2030. In 1800 it was 3 percent; in 1900 it was 14 percent.
The environmentalist aesthetic is to love villages and despise cities. My mind got changed on the subject a few years ago by an Indian acquaintance who told me that in Indian villages the women obeyed their husbands and family elders, pounded grain, and sang. But, the acquaintance explained, when Indian women immigrated to cities, they got jobs, started businesses, and demanded their children be educated. They became more independent, as they became less fundamentalist in their religious beliefs. Urbanization is the most massive and sudden shift of humanity in its history. Environmentalists will be rewarded if they welcome it and get out in front of it. In every single region in the world, including the U.S., small towns and rural areas are emptying out. The trees and wildlife are returning. Now is the time to put in place permanent protection for those rural environments. Meanwhile, the global population of illegal urban squatters -- which Robert Neuwirth's book Shadow Cities already estimates at a billion -- is growing fast. Environmentalists could help ensure that the new dominant human habitat is humane and has a reduced footprint of overall environmental impact.
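Brand's observation that "any variation from the 2.1 rate compounds over time" is easy to illustrate with a back-of-the-envelope, generation-by-generation calculation. The sketch below is my own deliberately crude illustration of the compounding effect, not Brand's - it ignores mortality, migration, age structure and changing generation lengths, and the starting cohort size and fertility values are just examples.

```python
# Crude illustration of how sub-replacement fertility compounds.
# Each generation is assumed to scale by (fertility / 2.1) relative to
# the previous one - ignoring mortality, migration and age structure.
# Starting size and fertility values are arbitrary examples.

REPLACEMENT = 2.1

def cohort_sizes(initial_millions, fertility, n_generations):
    sizes = [initial_millions]
    for _ in range(n_generations):
        sizes.append(sizes[-1] * fertility / REPLACEMENT)
    return sizes

# A 100-million cohort at Italian-style fertility (~1.3) vs replacement.
print([round(s) for s in cohort_sizes(100, 1.3, 4)])  # [100, 62, 38, 24, 15]
print([round(s) for s in cohort_sizes(100, 2.1, 4)])  # [100, 100, 100, 100, 100]
```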
WorldChanging recently commented on some factors that influence population growth, noting that "the number of children a woman bears and raises has an inverse relationship to her - and her children's - life expectancy and social mobility" and thus "improving the availability of family planning options is crucial" (as is encouraging social mobility). In a similar vein, Patrick Di Justo has pointed out that the number one factor influencing population growth is education levels for girls - the best way to keep family sizes small is to make sure that girls are able to get a good education. The next most important factors are making sure they have the freedom and the economic opportunities to choose what to do with their lives.
Of course, another approach for stabilising population levels instead of providing girls with education, contraception and economic opportunities is simply to legislate controls on the numbers of children people are allowed to have, as per the "one child" policy China introduced in 1979. This has been effective at slowing China's population growth, but due to the Chinese preference for male children, some observers are calling the resulting "surplus of sons" a "geopolitical time bomb".
The Olympics are around the corner. Just as qualifying athletes are training hard for the big event, China seeks to put its best foot forward in response to critics at home and abroad. Among the criticisms is a quiet but serious challenge: the artificially high number of Chinese men compared with Chinese women. China should act expeditiously to correct the social and legal pressures that have converged to create this problem.
"Son preference" is a deep-seated, widespread problem in many cultures. In many parts of the world, having a son is integral to one's future financial and social wellbeing. Recent articles have tried to shed light on the problem in India – putting much blame on the ultrasound machines women use to determine the sex of their unborn children in order to decide whether they should abort a female fetus.
In China, however, the problem takes on a frightfully larger scope when "son preference" meets the notorious One Child policy. When the government only allows one child, it puts immense pressure on Chinese parents to determine the sex of their child in the womb, and terminate the pregnancy if it is a girl.
The unintended consequences of this government policy are staggering. The proportion of male births to female births (the "sex ratio") is not merely unusual, but alarming. Worldwide, there are already 100 million girls "missing" due to sex-selective abortion and female infanticide, according to the English medical journal The Lancet. Fifty million of these girls are thought to be from China. In many provinces, the sex ratio at birth is between 120 and 130 boys for every 100 girls; the natural number is about 104. What will happen in future decades when these boys grow up and look for wives?
Of course, the one child policy will have other side effects as well, leading to a population dominated by the old in future years (which will presumably lead to a relaxation of the one child laws at some point).
China’s population won’t always be the engine of growth that it is now, because the country’s fast-rising proportion of elderly people eventually will drag down the economy, says historian Niall Ferguson in the Los Angeles Times. While China’s population today is relatively young, by the middle of the century it is set to become one of the world’s grayest societies. Today, less than 8% of China’s population is 65 or older. By 2050, that proportion will rise to 24%, compared with Europe’s 28% and 21% in the U.S. In sub-Saharan Africa, the proportion of elderly individuals will rise to under 6% from 3% now.
Moreover, at 1.3 billion, China’s population is impressive now, but will be less so in the future. According to U.N. projections, most of the world’s total population increase from 6.5 billion today to 9.2 billion in 2050 will come from sub-Saharan Africa and the Muslim world. India’s population is expected to overtake China’s in 2025. China, whose so-called one-child policy has led to a steep decline in the country’s birth rate in the past several decades, will contribute only 4% of the rise in the world’s population up to 2050.
A high proportion of elderly people generally is thought to crimp economic growth. Certainly, it creates a higher tax burden for workers, who must pay to keep retirees alive. The impact of an aging population on financial markets is more difficult to predict, partly because it depends to what extent retirees save their money or spend it.
Prof. Ferguson, who teaches history and economics at Harvard University, speculates that China might end up aligning itself geopolitically with other countries with aging populations, including the U.S. and European nations. The “Old World” will be made up of countries with high tax rates aimed at ever smaller workforces. Meanwhile, the “Young World” in Africa and India will be marked by youthful unrest and dynamism.
The Economist recently had a look at slowing global population growth, noting some countries are worrying about declining population levels.
Worries about a population explosion have been replaced by fears of decline
The population of bugs in a Petri dish typically increases in an S-shaped curve. To start with, the line is flat because the colony is barely growing. Then the slope rises ever more steeply as bacteria proliferate until it reaches an inflection point. After that, the curve flattens out as the colony stops growing.
Overcrowding and a shortage of resources constrain bug populations. The reasons for the growth of the human population may be different, but the pattern may be surprisingly similar. For thousands of years, the number of people in the world inched up. Then there was a sudden spurt during the industrial revolution which produced, between 1900 and 2000, a near-quadrupling of the world's population.
Numbers are still growing; but recently - it is impossible to know exactly when - an inflection point seems to have been reached. The rate of population increase began to slow. In more and more countries, women started having fewer children than the number required to keep populations stable. Four out of nine people already live in countries in which the fertility rate has dipped below the replacement rate. Last year the United Nations said it thought the world's average fertility would fall below replacement by 2025. Demographers expect the global population to peak at around 10 billion (it is now 6.5 billion) by mid-century.
As population predictions have changed in the past few years, so have attitudes. The panic about resource constraints that prevailed during the 1970s and 1980s, when the population was rising through the steep part of the S-curve, has given way to a new concern: that the number of people in the world is likely to start falling.
Some regard this as a cause for celebration, on the ground that there are obviously too many people on the planet. But too many for what? There doesn't seem to be much danger of a Malthusian catastrophe. Mankind appropriates about a quarter of what is known as the net primary production of the Earth (this is the plant tissue created by photosynthesis) - a lot, but hardly near the point of exhaustion. The price of raw materials reflects their scarcity and, despite recent rises, commodity prices have fallen sharply in real terms during the past century. By that measure, raw materials have become more abundant, not scarcer. Certainly, the impact that people have on the climate is a problem; but the solution lies in consuming less fossil fuel, not in manipulating population levels.
Nor does the opposite problem - that the population will fall so fast or so far that civilisation is threatened - seem a real danger. The projections suggest a flattening off and then a slight decline in the foreseeable future.
If the world's population does not look like rising or shrinking to unmanageable levels, surely governments can watch its progress with equanimity? Not quite. Adjusting to decline poses problems, which three areas of the world - central and eastern Europe, from Germany to Russia; the northern Mediterranean; and parts of East Asia, including Japan and South Korea - are already facing.
Think of twentysomethings as a single workforce, the best educated there is. In Japan, that workforce will shrink by a fifth in the next decade - a considerable loss of knowledge and skills. At the other end of the age spectrum, state pension systems face difficulties now, when there are four people of working age to each retired person. By 2030, Japan and Italy will have only two per retiree; by 2050, the ratio will be three to two. An ageing, shrinking population poses problems in other, surprising ways. The Russian army has had to tighten up conscription because there are not enough young men around. In Japan, rural areas have borne the brunt of population decline, which is so bad that one village wants to give up and turn itself into an industrial-waste dump.
States should not be in the business of pushing people to have babies. If women decide to spend their 20s clubbing rather than child-rearing, and their cash on handbags rather than nappies, that's up to them. But the transition to a lower population can be a difficult one, and it is up to governments to ease it. Fortunately, there are a number of ways of going about it - most of which involve social changes that are desirable in themselves.
The best way to ease the transition towards a smaller population would be to encourage people to work for longer, and remove the barriers that prevent them from doing so. State pension ages need raising. Mandatory retirement ages need to go. They're bad not just for society, which has to pay the pensions of perfectly capable people who have been put out to grass, but also for companies, which would do better to use performance, rather than age, as a criterion for employing people. Rigid salary structures in which pay rises with seniority (as in Japan) should also be replaced with more flexible ones. More immigration would ease labour shortages, though it would not stop the ageing of societies because the numbers required would be too vast. Policies to encourage women into the workplace, through better provisions for child care and parental leave, can also help redress the balance between workers and retirees.
Some of those measures might have an interesting side-effect. America and north-western Europe once also faced demographic decline, but are growing again, and not just because of immigration. All sorts of factors may be involved; but one obvious candidate is the efforts those countries have made to ease the business of being a working parent. Most of the changes had nothing to do with population policy: they were carried out to make labour markets efficient or advance sexual equality. But they had the effect of increasing fertility. As traditional societies modernise, fertility falls. In traditional societies with modern economies-Japan and Italy, for instance-fertility falls the most. And in societies which make breeding and working compatible, by contrast, women tend to do both.
The Economist also had a look at the aging and shrinking population of Japan - one of the more extreme examples of post-growth population dynamics, along with Russia, which is adopting a number of strange tactics to try and reverse its falling population.
A United Nations report this year called this global aging “a process without parallel in the history of humanity” and predicted that people older than 60 would outnumber those under 15 for the first time in 2047.
The twin forces of rising life expectancy and falling birthrates have accelerated the process. This is apparent from the United States, where policy makers fret over the baby boom generation beginning to retire, to Japan, which has the highest share of people older than 60 in the world. As in Japan, more than a quarter of the population in Italy and Germany is over 60, and the phenomenon extends to Poland and Russia.
Although the German government has begun to address the issue, it was particularly slow out of the blocks in dealing with its low birthrate and, since 2003, the contraction of its population - in that first year by just 5,000 people, but in 2006 by 130,000. The German population stands at 82.4 million people.
That was, in part, because almost no debate can proceed unencumbered by the country’s Nazi past. Hitler’s government gave medals to mothers of large families, gold for those with eight or more children. Uneasiness over the parallels kept the subject of encouraging reproduction on the back burner in Germany for years, unlike in France where promoting childbearing has long been government policy.
The UK Daily Telegraph recently had a neo-Malthusian take on the subject of population from historian Niall Ferguson, which looks at the relative rates of growth in population and food production, and the impact of biofuel production on rising food prices - "Worry about bread, not oil".
The great demographer and economist Thomas Malthus was 23 years old the last time a British summer was this rain-soaked, which was back in 1789. The consequences of excessive rainfall in the late 18th century were predictable. Crops would fail, the harvest would be dismal, food prices would rise and some people would starve. It was no coincidence that the French Revolution broke out the same year. The price of a loaf of bread rose by 88 per cent in 1789 as a consequence of similar lousy weather. Historians of the Left like Georges Lefebvre used to see this as a prime cause of Louis XVI's downfall.
Nine years after that rain-soaked summer, Malthus published his Essay on the Principle of Population. It is an essay we would do well to re-read today. Malthus's key insight was simple but devastating. "Population, when unchecked, increases in a geometrical ratio," he observed. But "subsistence increases only in an arithmetical ratio." In other words, humanity can increase like the number sequence 1, 2, 4, 8, 16, whereas our food supply can increase no faster than the number sequence 1, 2, 3, 4, 5. We are, quite simply, much better at reproducing ourselves than feeding ourselves.
Malthus concluded from this inexorable divergence between population and food supply that there must be "a strong and constantly operating check on population". This would take two forms: "misery" (famines and epidemics) and "vice", by which he meant not only alcohol abuse but also contraception and abortion (he was, after all, an ordained Anglican minister). "The vices of mankind are active and able ministers of depopulation," wrote Malthus in an especially doleful passage of the first edition of his Essay. "They are the precursors in the great army of destruction; and often finish the dreadful work themselves. "But should they fail in this war of extermination, sickly seasons, epidemics, pestilence, and plague advance in terrific array, and sweep off their thousands and tens of thousands. Should success be still incomplete, gigantic inevitable famine stalks in the rear, and with one mighty blow levels the population with the food of the world."
I wish I could have a free lunch for every time I've heard someone declare: "Malthus was wrong." Superficially, it is true, mankind seems to have broken free of the Malthusian trap. The world's population has increased by a factor of more than six since Malthus's time, passing the 6 billion mark not so long ago. Average life expectancy has risen worldwide from 28 to 67.
Yet the daily supply of calories for human consumption has also gone up on a per capita basis, exceeding 2,700 in the Nineties. In France, on the eve of the Revolution, it was just 1,848. Since Malthus's day, the average human being's income has increased by a factor of more than eight. Human beings have grown taller and bigger, too. The average British male stood 5ft 5in tall in the late 18th century. Today, his mean height is 5ft 9in. So abundant is food in the land of the free that more than a fifth of Americans are now classified as obese.
The conventional explanation for our seeming escape from Malthus is the succession of revolutions in global agriculture, culminating in the post-war "Green Revolution" and the current wave of genetically modified crops. Since the Fifties, the area of the world under cultivation has increased by roughly 11 per cent, while yields per hectare have increased by 120 per cent. In 2004, world cereal production passed the 2 billion metric ton mark.
Yet these statistics don't disprove Malthus. As he said, food production could increase only at an arithmetical rate, and a chart of world cereal yields since 1960 shows just such a linear progression, from below one and a half metric tons to around three. Meanwhile, vice and misery have been operating just as Malthus foresaw to prevent the human population from exploding geometrically.
On the one hand, contraception and abortion have been employed to reduce family sizes. On the other hand, wars, epidemics, disasters and famines have significantly increased mortality. Together, vice and misery have ensured that the global population has grown at an arithmetic rather than a geometric rate. Indeed, they've managed to reduce the rate of population growth from 2.2 per cent per annum in the early Sixties to around 1.1 per cent today.
The real question is whether we could now be approaching a new era of misery. Even at an arithmetic rate, the United Nations expects the world's population to pass the 9 billion mark by 2050. But can world food production keep pace? Plant physiologist Lloyd T Evans has estimated that "we must reach an average yield of four tons per hectare… to support a population of 8 billion". But yields right now are, as we have seen, just three tons per hectare. And a world of eight billion people may be less than 20 years away. Meanwhile, man-made forces are conspiring to put a ceiling on food production. Global warming and the resulting climate change may well be increasing the incidence of extreme weather events as well as inflicting permanent damage on some farming regions.
It is not just British crops that are suffering this year. At the same time, our effort to slow global warming by switching from fossil fuels to bio-fuels is taking large tracts of land out of food production. According to the OECD, American output of corn-based ethanol and European consumption of oilseeds for bio-fuels will double by 2016. Only the other day, the executive director of the World Food Programme expressed anxiety about the unintended consequences of this huge shift of resources.
Some people worry about peak oil. I worry more about peak grain. The fact is that world per capita cereal production has already passed its peak, which was back in the mid-Eighties, not least because of collapsing production in the former Soviet Union and sub-Saharan Africa. Simultaneously, however, rising incomes in Asia are causing a surge in worldwide food demand.
Already the symptoms of the coming food shortage are detectable. The International Monetary Fund recorded a 23 per cent rise in world food prices during the last 18 months. Maybe you've observed it yourself. I certainly have. Of course, we're not supposed to notice that prices are going up. In the United States, the monetary authorities insist that we should focus on the "core" Consumer Price Index, which excludes the cost of food. According to that measure, the annual inflation rate in the US is just 2.2 per cent. But food inflation is roughly double that.
It's a similar story in Britain. Officially, UK inflation was running at 2.4 per cent in June. But food accounts for just 10.3 per cent of the notional basket of goods on which the CPI is based. Food inflation is actually 4.8 per cent. And it gets worse. When I wanted a Philly cheese steak in the States last week, I had to pay through the nose. That's because cheese inflation is 4 per cent, steak inflation is 6 per cent and bread inflation is 10 per cent. (American steak is now 53 per cent dearer than it was 10 years ago.) It was even worse when I fancied fish and chips for lunch on my return to Britain. That's because the fish inflation rate is currently 11 per cent in the UK, closely followed by the potato inflation rate of 10 per cent.
"The great question now at issue," Malthus asked more than 200 years ago, "is whether man shall henceforth start forwards with accelerated velocity towards illimitable, and hitherto unconceived improvement, or be condemned to a perpetual oscillation between happiness and misery." For a long time we have deluded ourselves that "illimitable improvement" was attainable. As the world approaches a new era of dearth, expect misery - and its old companion vice - to make a mighty Malthusian comeback.
The "green revolution" mentioned by Ferguson (and, cryptically, by Alexander Cockburn at the beginning of this post) was the progeny of Norman Borlaug, who was awarded the Congressional Gold Medal of Honor earlier in the year, prompting a tribute in the Wall Street Journal on the topic of "Borlaug's Revolution".
In 1944, when Norman Borlaug arrived in Mexico, the nation was in the grip of crop failure. Cereals like wheat are dietary staples. But in Mexico, an airborne fungus was causing an epidemic of "stem rust," and acreage once flush with golden wheat and maize yielded little more than sunbaked sallow weeds. Coupled with a population surge, famine seemed in the offing.
Dr. Borlaug left Mexico in 1963 with a harvest six times what it was when he arrived. From acres of arable land sprung a hyperactive strain of wheat engineered by the scientist in his laboratory, fertilized and nurtured according to his methods, and irrigated by systems he helped to design. Mexico's peasantry was not only fed -- it was selling wheat on the international market.
The reversal of the Mexican crop disaster was an early tiding of the Green Revolution. Over the next 30 years, Dr. Borlaug devoted himself to the undeveloped world, undoing crop failure in India and Pakistan, and rescuing rice in the Philippines, Indonesia and China. He has arguably saved more lives than anyone in history. Maybe one billion.
Dr. Borlaug was awarded the Nobel Peace Prize in 1970, yet his name remains largely unknown. Today, at age 93, he receives the Congressional Gold Medal. Perhaps it will secure the fame he merits but never pursued. Then again, perhaps not. While Dr. Borlaug was expanding human possibility, his critics -- who held humanity to be profligate and the Earth's resources finite -- were receiving all the attention. They still are.
The most famous may be Paul Ehrlich, a biologist who declared in the 1960s that "the battle to feed all of humanity" was lost. "In the 1970s and 1980s," he claimed, "hundreds of millions of people will starve to death in spite of any crash programs embarked upon now." In 1973, Lester Brown, founder of the Earth Policy Institute and still widely quoted today, said the demand for food had "outrun the productive capacity of the world's farmers." The only solution? "We're going to have to restructure the global economy." Of course.
Greenpeace and other pessimists were scandalized at Dr. Borlaug's Green Revolution; it disproved their admonitions and, worst of all, led to industrial development. They even convinced the Rockefeller and Ford Foundations to stop funding Dr. Borlaug's efforts. We see these battle lines today in the energy wars. History has its share of tragedy, but Dr. Borlaug's life demonstrates that environmental doomsayers are almost always wrong because they overlook one variable: human ingenuity.
The late economist Julian Simon was in the habit of claiming that natural resources are basically infinite. His refrain: "A higher price represents an opportunity that leads inventors and businesspeople to seek new ways to satisfy the shortages. Some fail, at cost to themselves. A few succeed, and the final result is that we end up better off than if the original shortage problems had never arisen."
As anti-development environmentalists preach the gospel of limits and state coercion, here is a question worth asking: How many millions of people might have perished had Norman Borlaug heeded their teachings?
The WSJ also had an article by Ronald Bailey about Borlaug last year, noting that, like Al Gore this year, Borlaug won the Nobel Peace Prize - "The Man Who Fed the World".
Who won the Nobel Peace Prize in 1970? You may be forgiven for not remembering, given some of the prize's dubious recipients over the years (e.g., Yasser Arafat). Well, then: Who has saved perhaps more lives than anyone else in history? The answer to both questions is, of course, Norman Borlaug. ...
After graduating from the University of Minnesota in 1944, Mr. Borlaug accepted an invitation from the Rockefeller Foundation to work on a project to boost wheat production in Mexico. At the time, Mexico was importing a good share of its grain. Working at plant breeding stations near Mexico City in the south and near Obregon in the northwestern part of the country, Mr. Borlaug and his staff spent nearly 20 years breeding the high-yield dwarf wheat that sparked the Green Revolution. (Using two stations allowed them to plant two crops a year instead of one, doubling the speed of research.) The key to their success was painstakingly cross-breeding thousands of wheat varieties to find those resistant to highly destructive "rust" fungi. They also changed the architecture of the wheat, from tall gangly stems to shorter sturdier ones that produced more grain.
It was an achievement that made Mexico self-sufficient in wheat by the late 1950s and, when later deployed throughout much of the developing world, forestalled the mass starvation predicted by neo-Malthusians. In the late 1960s, lest we forget, most experts were speaking of imminent global famines in which billions of people would perish. "The battle to feed all of humanity is over," biologist Paul Ehrlich famously wrote in "The Population Bomb," his 1968 best seller. "In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now."
As Mr. Ehrlich was making his dark predictions, Mr. Borlaug was embarking on just such a crash program. Working with scientists and administrators in India and Pakistan, he succeeded in getting his highly productive dwarf wheat varieties to hundreds of thousands of South Asian peasant farmers. These varieties resisted a wide spectrum of plant pests and diseases and produced two to three times more grain than traditional varieties.
Mr. Borlaug's achievement was not confined to the laboratory. He insisted that governments pay poor farmers world prices for their grain. At the time, many developing nations--eager to supply cheap food to their urban citizens, who might otherwise rebel--required their farmers to sell into a government concession that paid them less than half of the world market price for their agricultural products. The result, predictably, was hoarding and underproduction. Using his hard-won prestige as a kind of platform, Mr. Borlaug persuaded the governments of Pakistan and India to drop such self-defeating policies.
Fair prices and high doses of fertilizer, combined with new grains, changed everything. By 1968 Pakistan was self-sufficient in wheat, and by 1974 India was self-sufficient in all cereals. And the revolution didn't stop there. Researchers at a research institute in the Philippines used Mr. Borlaug's insights to develop high-yield rice and spread the Green Revolution to most of Asia. As with wheat, so with rice: Short-stalked varieties proved more productive. They devoted relatively more energy to making grain and less to making leaves and stalks. And they were sturdier, remaining harvestable when traditional varieties--with heavy grain heads and long, slender stalks--had collapsed to the ground and begun to rot.
Hence the Nobel Prize. The chairman of the Nobel committee explained why it had chosen Mr. Borlaug in this way: "More than any other single person of this age, [he] has helped to provide bread for a hungry world. We have made this choice in the hope that providing bread will also give the world peace."
Whether bread induces peace is a question for another day. It certainly kills hunger and saves lives. Contrary to Mr. Ehrlich's bold pronouncement, hundreds of millions of people did not die for lack of food. Far from it. Despite occasional local famines caused by armed conflicts or political mischief, food is more abundant and cheaper today than ever before in history. It is an absurd travesty that Mr. Ehrlich is still much better known than Mr. Borlaug, but perhaps Mr. Hesser's biography can begin to right the balance.
Borlaug himself wrote an op-ed in the Wall Street Journal last year on the subject of "Continuing the Green Revolution", in which he discusses the need for a "Gene" revolution to continue to improve crop yields.
Persistent poverty and environmental degradation in developing countries, changing global climatic patterns, and the use of food crops to produce biofuels, all pose new and unprecedented risks and opportunities for global agriculture in the years ahead.
Agricultural science and technology, including the indispensable tools of biotechnology, will be critical to meeting the growing demands for food, feed, fiber and biofuels. Plant breeders will be challenged to produce seeds that are equipped to better handle saline conditions, resist disease and insects, droughts and waterlogging, and that can protect or increase yields, whether in distressed climates or the breadbaskets of the world. This flourishing new branch of science extends to food crops, fuels, fibers, livestock and even forest products.
Over the millennia, farmers have practiced bringing together the best characteristics of individual plants and animals to make more vigorous and productive offspring. The early domesticators of our food and animal species -- most likely Neolithic women -- were also the first biotechnologists, as they selected more adaptable, durable and resilient plants and animals to provide food, clothing and shelter.
In the late 19th century the foundations for science-based crop improvement were laid by Darwin, Mendel, Pasteur and others. Pioneering plant breeders applied systematic cross-breeding of plants and selection of offspring with desirable traits to develop hybrid corn, the first great practical science-based products of genetic engineering.
Early crossbreeding experiments to select desirable characteristics took years to reach the desired developmental state of a plant or animal. Today, with the tools of biotechnology, such as molecular and marker-assisted selection, the ends are reached in a more organized and accelerated way. The result has been the advent of a "Gene" Revolution that stands to equal, if not exceed, the Green Revolution of the 20th century.
Consider these examples:
* Since 1996, the planting of genetically modified crops developed through biotechnology has spread to about 250 million acres from about five million acres around the world, with half of that area in Latin America and Asia. This has increased global farm income by $27 billion annually.
* Ag biotechnology has reduced pesticide applications by nearly 500 million pounds since 1996. In each of the last six years, biotech cotton saved U.S. farmers from using 93 million gallons of water in water-scarce areas, 2.4 million gallons of fuel, and 41,000 person-days to apply the pesticides they formerly used.
* Herbicide-tolerant corn and soybeans have enabled greater adoption of minimum-tillage practices. No-till farming has increased 35% in the U.S. since 1996, saving millions of gallons of fuel, perhaps one billion tons of soil each year from running into waterways, and significantly improving moisture conservation as well.
* Improvements in crop yields and processing through biotechnology can accelerate the availability of biofuels. While the current emphasis is on using corn and soybeans to produce ethanol, the long-term solution will be cellulosic ethanol made from forest industry by-products and products.
However, science and technology should not be viewed as a panacea that can solve all of our resource problems. Biofuels can reduce dependence on fossil fuels, but are not a substitute for greater fuel efficiency and energy conservation. Whether we like it or not, gas-guzzling SUVs will have to go the way of the dinosaurs.
So far, most biotechnology research and development has been carried out by the private sector and on crops and traits of greatest interest to relatively wealthy farmers. More biotechnology research is needed on crops and traits most important to the world's poor -- crops such as beans, peanuts, tropical roots, bananas, and tubers like cassava and yams. Also, more biotech research is needed to enhance the nutritional content of food crops for essential minerals and vitamins, such as vitamin A, iron and zinc.
The debate about the suitability of biotech agricultural products goes beyond issues of food safety. Access to biotech seeds by poor farmers is a dilemma that will require interventions by governments and the private sector. Seed companies can help improve access by offering preferential pricing for small quantities of biotech seeds to smallholder farmers. Beyond that, public-private partnerships are needed to share research and development costs for "pro-poor" biotechnology.
Finally, I should point out that there is nothing magic in an improved variety alone. Unless that variety is nourished with fertilizers -- chemical or organic -- and grown with good crop management, it will not achieve much of its genetic yield potential.
In his book, Herman Kahn looks at the issue of food supply and predicts it will not be a problem (even with the 15-billion-odd population level he foresees), as growing demand will be met by:
* the spread of advanced agriculture techniques as practiced at the time - increasing "tillable acreage", multi-cropping and increasing yield by using more fertiliser and new grain varieties (i.e. the usual green revolution techniques)
* the introduction of new industrial agriculture techniques in the coming decades
* the use of promising "unconventional" technologies (advances in hydroponics, which he labels "nutrient film techniques")
* the adaptation of dietary tastes and habits to "inexpensive food produced by high technology factories" such as "single cell proteins" (which immediately brought "Soylent Green" to my mind)
This section of the book was by far the most unconvincing to my mind, ignoring the impact of both fossil fuel depletion and soil erosion and depletion on his recommended solutions, as well as alternatives like organic farming techniques and unanticipated future developments like the genetic engineering option. I suspect that if Kahn were still around he would be a gung-ho GMO enthusiast though.
Ultra doomer James Lovelock shares some of Kahn's views about food growing techniques, even if he is at the opposite end of the spectrum on the population issue (and sounds like a crazed prophet with his frequent "the end is nigh" declarations - though Lovelock supporters could probably make a strong case that he has made far more of a contribution to science and technology than someone like Kahn, who simply talked a lot).
Writers like Richard Manning (The Oil We Eat) and Dale Allen Pfeiffer (Eating Fossil Fuels) have made the case that the green revolution will prove unsustainable as fossil fuels become scarcer in a post peak world, impacting on the availability of fertilisers, pesticides and mechanical planting and harvesting machinery - a common view amongst peak oil observers.
Experiments tell us that lack of fertilizer will reduce crop yields and that is exactly what oil prices cause--reduction in fertilizer. Why the difference? Precision application of fertilizer rather than the spray-it-all-over-the-place techniques have begun to come into play, minimizing the effect of lessened fertilizer application--so far. Eventually, even that might not be enough to avoid a drop in crop yield.
With corn, one of the interesting realizations is that a 19th century farm grew about 30 bushels per acre, while today, with our machinery we can grow up to 160 bushels per acre. How this is done needs some explanation. The first thing is that on a modern farm, 30,000 corn plants grow per acre. This is about 1.5 square feet per plant. This simply can't be done without machinery. I am in the process of purchasing a 100-acre farm. Let's say I wanted to plant corn by hand and achieve those densities. At 5 seconds per seed, it would take 41 hours to do one acre. And 173 days to do the farm. Of course, by having lots of children I can put them to work. With 10 children, I could do it in 17 days. This shows that without machinery, the plant densities will drop. A modern wheat field has 1.3 million plants. Clearly, without machinery, this is a throw-the-seed-out-there-and-hope-the-birds-don't-eat-it-all exercise.
So, having shown the problem of planting without machinery, we can see that any reduction of oil is likely to cause a serious drop in crop yield, leading to famine. When we can only drive our tractors 80% as much as we do today, it will effectively mean only 80% of the land will be under cultivation. And like everything else, we are being squeezed from two sides. The population increase requires a higher rather than a lower yield per acre. A recent article in The Telegraph spoke of this problem, pointing out that while there has been an 11 per cent increase in cultivatable land since the 1950s, yields have gone up 120 per cent. As they say, 'they aren't making new land anymore'.
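The hand-planting arithmetic in that comment checks out, for what it's worth - here is a quick sketch reproducing the figures, using the comment's own assumptions (30,000 corn plants per acre, 5 seconds per seed, a 100-acre farm) and treating the quoted day counts as round-the-clock planting:

```python
# Reproducing the hand-planting arithmetic from the comment above.
# Assumptions are taken from the comment: 30,000 corn plants per acre,
# 5 seconds per seed, a 100-acre farm. Day counts assume non-stop
# planting, which is how the quoted figures work out.

SQ_FT_PER_ACRE = 43560
plants_per_acre = 30000
seconds_per_seed = 5
farm_acres = 100

print(f"space per plant: {SQ_FT_PER_ACRE / plants_per_acre:.2f} sq ft")   # ~1.45

hours_per_acre = plants_per_acre * seconds_per_seed / 3600
print(f"hours per acre: {hours_per_acre:.1f}")                            # ~41.7

days_whole_farm = hours_per_acre * farm_acres / 24
print(f"days for the whole farm, one person: {days_whole_farm:.0f}")      # ~174
print(f"days with 10 children helping: {days_whole_farm / 10:.0f}")       # ~17
```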
It's worth noting that if (as Kahn and I both believe) we can replace the energy we currently derive from fossil fuels with renewable sources, then much of this argument is invalidated. A shift to a renewable energy based economy means that we can divert remaining oil, gas and coal supplies to the production of fertiliser, pesticides and the like instead of fuel, which changes the scenario outlined above dramatically (without even considering alternative means of achieving the required agricultural productivity, such as the biotech option mentioned above by Borlaug, or the organic farming techniques that I'll mention later).
The excerpt below is from a BBC report on the legacy of the green revolution in India's Punjab.

On his farm just outside the Punjabi city of Ludhiana in northern India, Jagjit Singh Hara showed off his collection of old photos. One of the farmer's most prized snaps is of him with the Norwegian-American agronomist Norman Borlaug, the man popularly known as The Father of the Green Revolution. "Here we are when we were both young men," says Mr Singh Hara with a smile. "I said to him: 'Dr Borlaug, I want to put my hand in your pocket. But I don't want to take out the dollars, I want to take out the wheat seeds you have.'"
Punjab State, the breadbasket of India, is one of the places where the Green Revolution began. It more than doubled aggregate production here of wheat and rice.
India, a country that will probably soon overtake China as the most populous nation in the world, went from being a food-aid "basket case" to being largely self-sufficient in food. The benefits - and costs - of the Green Revolution in India are reflected in other parts of the developing world. In Punjab, I was looking for clues about whether output could be boosted further to cope with the rising demand that will be required to feed a world population set to rise from 6.6 billion today to more than nine billion people by 2050.
Food output across the world increased considerably in the last four decades of the 20th Century, largely as a result of the intensive farming techniques introduced by the Green Revolution. The new techniques involved distributing hybrid grain seeds - mainly wheat, rice and corn. The hybrids grow with a shorter stalk. This maximises the process of photosynthesis, which nourishes the grain because less energy goes into the stem. The hybrid seeds were combined with the intensive use of fertilisers and irrigation.
Population v Production
After successfully being introduced in India, the Green Revolution was rolled out in other parts of Asia, Latin America and the Middle East. It was so successful in terms of production increases that it defied the gloomy Malthusian predictions of the 1960s, which said hundreds of millions would starve as population outstripped farm output.
The Revolution was a technological success. "Before the 1960s, the population of India was multiplying like rats in a barn," said Jagjit Singh Hara, "but we didn't have the grain to feed them. After the Green Revolution, we doubled our yield and now we have proved that India can feed the world".
But the process has limits and they may have been reached. Population, on the other hand, has continued to rise in poor parts of the world. The graph, compiled for the BBC by the UN Food and Agriculture Organization, shows that while yield per hectare has increased, the amount of land used for the major staple grains has remained fairly constant; this is because the amount of good farmland is finite.
Given the shortage of land suitable for growing more food, the obvious answer would be a new Green Revolution, or another hike in yields. But this may not be possible. "The difficulty is that we are now pressing against the photosynthetic limits of plants," says the influential environmentalist Lester Brown of the Earth Policy Institute in the United States. ...
There are other limits to the Green Revolution. Some of the poorer villagers I spoke to in rural Punjab said they had fallen into debt as they were unable to keep up with the rising cost of the inputs - fertilisers, irrigation pumps and regular fresh supplies of seed - which intensive agriculture requires. One elderly man in the village of Lehragaga, Jasram Singh, sat on an old iron bedstead in his yard and recounted the unbearable pain he had suffered when two of his sons had committed suicide. They had fallen into debt, been forced to sell their land, and felt irreconcilable shame.
A local community activist, Jagdish Papra, said the case was typical of many families who had seen loved ones kill themselves because they could not keep up with the financial cost of inputs. "In the old days we practiced subsistence agriculture and we felt a sense of control," said Papra. "Now everything is more complicated and lots of people are desperately in debt."
Human cost
Amrita Chaudhry, an agriculture correspondent with The Indian Express newspaper, stood in a neat, almost manicured field of young green wheat. "The balance sheet of the Green Revolution is that, yes, we are feeding the mouths. India no longer has to ask for food aid from other nations. But the fact is we are paying a very heavy price for agriculture at this present moment. Punjab is one of the biggest users of pesticides in India," Chaudhry continued, "and they have leached into our subsoil water." "There are health costs. We have had babies born blue because they are not breathing. Some of them have mental health problems. In the south-western belt, we have entire villages where each family has at least one or two cancer cases. All this is because of the intensive agriculture that we have been doing."
A more recent article from the BBC recommended "food sovereignty" - giving farmers more control over how they farm - as a key to achieving sustainable agriculture.
Behind several kinds of environmental damage lurks the hand of the farmer. The key to better prospects for them and the environment, argues Michel Pimbert in the Green Room, is giving them more control over what they do.
Farmers and other citizens in various parts of the world are engaging in a major effort to change the nature of agriculture. The key phrase is "food sovereignty"; and this weekend, many of the interested parties are gathering for a conference in Mali, one of two countries (the other being Bolivia) which have adopted it as their overarching policy framework for food and farming.
Food sovereignty is all about ensuring that farmers, rather than transnational corporations, are in control of what they farm and how they farm it; ensuring too that communities have the right to define their own agricultural, pastoral, labour, fishing, food and land policies to suit their own ecological, social, economic and cultural circumstances.
Why is it needed? From the social point of view, because everyone has an unconditional human right to food, and it is simply unacceptable to allow over 850 million people to go to bed hungry in a world that produces more than enough food for all. On the environmental side, industrial farming damages our planet's life support systems in a number of ways:
* it is a major contributor to global warming through intensive use of fossil fuels for fertilisers, agrochemicals, production, transport, processing, refrigeration and retailing
* agrochemical nutrient pollution causes biological "dead zones" in areas as diverse as the Gulf of Mexico, the Baltic Sea and the coasts of India and China
* human activity now produces more nitrogen than all natural processes combined
* crop and livestock genetic diversity has been lost through the spread of industrial monocultures, reducing resilience in the face of climate and other changes
The progress of this growing food sovereignty movement could have profound implications for scientific research, politics, trade and the twin curses of poverty and environmental degradation.
Towards sustainable agriculture
Within the food sovereignty approach, the environmental ills outlined above are avoided by developing production systems that mimic the biodiversity levels and functioning of natural ecosystems. These systems seek to combine the modern science of ecology with the experiential knowledge of farmers and indigenous peoples. Combinations of indigenous and modern methods lead to more environmentally sustainable agriculture, as well as reducing dependence on expensive external inputs, reducing the cost-price squeeze and debt trap in which the world's farmers are increasingly caught.
Ecological agriculture has been shown to be productive, economic and sustainable for farmers, whether their external inputs are low or high. Scientists recently reported that a series of large-scale experimental projects around the world using agro-ecological methods such as crop rotation, intercropping, natural pest control, use of mulches and compost, terracing, nutrient concentration, water harvesting and management of micro-environments yielded spectacular results.
For example, in southern Brazil, the use of cover crops to increase soil fertility and water retention allowed 400,000 farmers to raise maize and soybean yields by more than 60%. Farmers earned more as beneficial soil biodiversity was regenerated.
Staying in control
Food sovereignty is not against trade and science. But it does argue for a fundamental shift away from "business as usual", emphasising the need to support domestic markets and small-scale agricultural production based on resilient farming systems rich in biological and cultural diversity. Networks of local food systems are favoured because they reduce the distance between producers and consumers, limiting food miles and enhancing citizen control and democratic decision-making.
Equitable access to land and other resources is vital, because a significant cause of hunger and environmental degradation is local people's loss of rights to access and control natural resources such as land, water, trees and seeds. This severely reduces their incentive to conserve the environment; the displacement of farming peoples from fertile lands to steep, rocky slopes, desert margins, and infertile rainforest soils leads to more environmental degradation.
Trade and markets must be made to work for people and the environment; current trade policies for agriculture are failing the environment and leading to the economic genocide of unprecedented numbers of farmers. New governance systems must ensure that negative impacts of international trade such as dumping are stopped, and local markets given priority; commodity agreements must restrict overproduction and guarantee small-scale producers equitable prices that cover the costs of producing food in socially and environmentally sustainable ways.
We also need a radical shift from the existing top-down and increasingly corporate-controlled research system to an approach which devolves more power to the local level. The process should lead to the democratisation of research, and more diverse forms of inquiry based on specialist and non-specialist knowledge.
Reclaiming diversity and citizenship
If unchecked, neo-liberal agricultural policies will aggravate the many worrying environmental trends identified by the recent Millennium Ecosystem Assessment. Grossly unfair market prices will continue to drive ever more farmers and owners of local food businesses to despair and bankruptcy. This will fuel human tragedies and conflicts associated with cross-border migrations everywhere.
The good news is that all this is not inevitable. The political choices made by governments and their corporate friends can still be decisively rejected and reversed. But this depends on creating inclusive alliances between farmers, fisher-folk, indigenous peoples, scholars and other citizens to exert countervailing power - which is perhaps the biggest challenge facing the food sovereignty movement.
Another series on world food production that always comes to mind when this topic comes up is Zaid Hassan's "Postcards From The Global Food System" at WorldChanging, which contained the immortal line "there are so many criticisms around the current global food system that for a while I started wondering if in fact it had already collapsed and I was studying a post-apocalyptic food system" and outlined the differences in opinion between the two main streams of thought.
The Road From Green Revolution to Fatal Harvest
The difficulty with data around the food system is a little like data around climate change, only much more fragmented and fast-moving. If a group of scientists make a claim, it's fairly easy to find a Bjorn Lomborg-type claiming it ain't so, you're just fear-mongering. Discerning the truth of what's going on with the global food system at the numbers and science level requires a lot of time and energy. There is contradictory information and all of it cannot be right. At the end of the day it boils down to epistemology and axiomatic truths, and a choice needs to be made as to what we are willing to accept as legitimate data.
In trying to discern patterns in the mass of data it seemed to me that there are two broad, dueling schools of thought, with a host of lesser and emerging schools emanating from them. The first is the modern Green Revolution. The second, simultaneously representing an older form of agrarian logic and a response to the Green Revolution, can be dubbed (perhaps unfairly) the Fatal Harvest School.
The Green Revolution took hold and changed the face of agriculture through the 1960s and 1970s, although its origins lie in the early twentieth century. Until then, food production grew by expanding cultivated land area. If you wanted to grow more food then you had no choice but to put more land under cultivation. A key technological advance -- synthetic ammonia -- changed this age-old truism.
The modern fertilizer industry came into being in 1909, with the synthesis of ammonia by Fritz Haber. This discovery had little agricultural impact at first; during the two world wars production of ammonia was diverted to munitions instead of farming. Following the end of the Second World War, however, the ammonia industry turned to producing ammonia for the rapidly growing fertilizer industry, contributing to dramatically increasing crop yields. Norman Borlaug, known as the “father of the Green Revolution”, in his survey, “The Green Revolution: Its Origins and Contributions to World Agriculture” (B. 2003), explains that change in hard, cold numbers,
“US maize cultivation led the modernization process. In 1940, US farmers produced 56 million tons of maize on roughly 31 million hectares, with an average yield of 1.8 t/ha. In 2000 US farmers produced 252 million tons of maize on roughly 29 million hectares, with an average yield of 8.6 t/ha.”
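Those figures are easy to sanity-check, since yield is just production divided by area. A throwaway sketch in Python, using the rounded numbers from the quote above (so the result differs slightly from Borlaug's more precise 8.6 t/ha), shows the same land base producing nearly five times as much grain:

```python
# Borlaug's US maize figures, as rounded in the quote above:
# production in millions of tonnes, area in millions of hectares.
production_1940, area_1940 = 56.0, 31.0
production_2000, area_2000 = 252.0, 29.0

yield_1940 = production_1940 / area_1940   # roughly 1.8 t/ha
yield_2000 = production_2000 / area_2000   # roughly 8.7 t/ha on these rounded inputs

print(f"1940 yield: {yield_1940:.1f} t/ha")
print(f"2000 yield: {yield_2000:.1f} t/ha")
print(f"Change: {yield_2000 / yield_1940:.1f}x the grain from slightly less land")
```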
The Green Revolution coupled developments in fertilizer synthesis with the breeding of more robust and faster-growing seed varieties. Borlaug won the Nobel Peace Prize in 1970 for his work on the development of rust-resistant (disease-resistant), semi-dwarf wheat and rice varieties with radically improved yields.
“The new short wheat varieties, which drew on the Japanese Norin wheat germplasm, were much more efficient than their tall predecessor varieties in converting sunlight and nutrients into grain production. Furthermore their superior plant architecture provided resistance against lodging (falling over) in heavy winds and under improved conditions of soil fertility and moisture.” (B. 2003).
Throughout the 60s and 70s these varieties of wheat (known as Mexican dwarf wheat) and rice spread far and wide, particularly in countries suffering from acute food shortages such as India and Pakistan, and later China. Radical (and controversial) changes were made in national agriculture policy in these countries in order to adapt to the regime specified by the scientists who had developed these new wheat varieties. “Within 10 years, wheat and rice production had increased by 50 percent.” (B. 2003)
To very crudely summarize, the Green Revolution was, and is, a revolution in generating more yield from the same patch of land using hybrid seeds, pesticides and fertilizers. It’s a Revolution because it has changed the face of agriculture and is squarely responsible for the current, dominant, food production regime. It’s a movement that yokes itself strongly to science and technology and claims that there is no way of feeding the growing world population other than through the further deployment of a science-based agriculture. The shift to GMOs can be seen as a new chapter in the story of the Green Revolution, an attempt to further increase yields.
Those who subscribe to the Fatal Harvest School cannot be neatly packaged. It consists of a rag-tag bunch of farmers and activists who claim to represent an older, more gentle and contextually sensitive agrarian logic. They believe that industrial agriculture (as the prime product of the Green Revolution) is inherently destructive: its farming practices, such as the use of fertilizers, pesticides and GMOs, are a serious threat to the environment and to people's health; its practices of monoculture, single-crop farming and a single-minded focus on yield-based agriculture are a threat to biodiversity and pay little attention to local context; and its business practices are monopolistic and a threat to all subsistence, small and medium-size farmers. In short, the Fatal Harvest school lays the blame for each and every problem in the global food system squarely at the feet of industrial agriculture. To summarise, the criticisms of the Fatal Harvest School, familiar to many of us, are as follows:
1. Health: The food industry is killing us. In the West there are diseases of over-nutrition, ranging from coronary heart disease through to diabetes. The Centers for Disease Control in Atlanta cites food-related illnesses as the second largest cause of death in the USA. Food corporations are bracing themselves for “obesity” suits much in the same way that tobacco companies were targeted. In the developing world there are the diseases of malnutrition, such as Vitamin A and iodine deficiencies, as well as the stark fact that 40 million people die of hunger a year.
2. Environmental: Industrialised agriculture is the key cause of environmental degradation today. Mono-cropping is leading to a massive loss of biodiversity (see George Monbiot’s excellent article “Fallen Fruit” for one case), it’s putting massive amounts of pesticides and herbicides into the air and water, it’s causing a “food bubble” by rapidly depleting non-renewable aquifers which, once they run dry, will cause a collapse in key grain commodities, and it’s pushing fish stocks to extinction (in Canada & Europe).
3. Cultural: Modern agriculture is destroying rural and indigenous farming cultures. We’re heading towards a "walmartisation" of food, where the death of small & medium farmers, rural culture and indigenous farming practices means that millions of peasants are left vulnerable to displacement, loss of livelihoods and famine, along with a monoculture of food and the loss of valuable agricultural practices.
4. Economic: Agribusiness is forming an oligopoly out to control the entire food chain. The food business, through rapid consolidation, is leading us towards a food monopoly where a handful of Western corporations will control every aspect of food.
From what I can tell, the Fatal Harvest School is winning the public battle for hearts and minds. In the UK it was largely responsible for shaping public attitudes to GMOs – which were clearly rejected by the public at large. It’s responsible for the mass mobilization of farmers from the South, through the anti-globalisation movement and organizations such as Via Campesina.
In turn, the Green Revolutionaries -- that is, the scientists, agronomists and multinationals who are the target of so much ire -- throw up their hands in exasperation at the “irrationality” of the Fatal Harvest School. Their rebuttal can be boiled down to a few key points. The first is that given population growth figures we cannot afford, at social, financial and environmental levels, to turn over enough land to feed everyone through less intensive forms of organic farming. Jason Clay, in his excellent and monumental work “World Agriculture and the Environment” puts it bluntly,
“...the Earth is currently home to over 6 billion people. Supporting them all by low-intensity cropping – depending solely on recycling organic matter and using crop rotation with legumes – would require doubling or tripling the area currently cultivated. This land would have to come from somewhere – and would most likely mean the elimination of most if not all tropical rainforests and the conversion of a large part of tropical and subtropical grasslands too.”
His rather dead-pan conclusion is that “these are hardly acceptable alternatives.” Non-organic methods of farming, in other words, provide more bang for the buck. Furthermore, because industrial agriculture uses comparatively less land, less of the environment is disturbed and cleared away to meet farming needs. This argument hinges on the claims that industrial agriculture yields more per acre than organic agriculture, that we have no choice but to feed the growing population and that the masses want a standard of living equal to those in the West. The Green Revolutionaries think of themselves as the pragmatists in this particular game: they are responding to undeniable trends. The trouble with this, of course, is that they have designed an agricultural logic that profits from destructive and undeniable trends. They leave themselves open to broadsides of criticism in that it's no longer possible to discern whether they are simply responding to destructive trends or are actively a cause of them.
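The arithmetic behind Clay's claim is simple inverse proportionality: if a lower-intensity system yields only a half or a third as much per hectare, then matching today's output takes two or three times the land. A rough sketch of that scaling (the cropland base and yield ratios below are illustrative assumptions, not figures from Clay's book):

```python
# Land needed to match current output scales inversely with relative yield.
# Both the cropland base and the yield ratios are illustrative assumptions.
current_cropland_mha = 1500  # order-of-magnitude figure for global cropland, in millions of hectares

for relative_yield in (1.0, 0.5, 0.33):
    land_needed = current_cropland_mha / relative_yield
    print(f"relative yield {relative_yield:.2f} -> ~{land_needed:.0f} Mha "
          f"({land_needed / current_cropland_mha:.1f}x current area)")
```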
The Fatal Harvest School, on the other hand, is arguing that some sort of fundamental change in human behaviour needs to be made. Ideally, population growth needs to be controlled, otherwise industrial agriculture will chew up the planet. Behaviour change, however, can take place in two places: the West and the developing world. In the West this behaviour change looks like a change in consumption patterns, reducing stress on the environment. In the developing world this change looks like, at best, a change in reproductive patterns (if not more). I find it disturbing that so few people have any faith that behaviour change can take place in the West. At the moment the burden is therefore placed squarely on the developing world. What's more, the way Green Revolutionary logic is currently playing out against Fatal Harvest logic means that the world will see an increasingly stratified global food regime, with the rich being able to afford organic food and the poor having to rely on GMOs.
Having said that, there are precedents for behaviour change in both the West and the developing world.
Smoking is currently declining in the West, largely due to years of campaigning and health education. Given the likelihood of obesity taking the place of smoking as the number one killer, it's also possible that the issue of over-eating could be addressed in the same way, through massive public health campaigns. (Of course, tobacco companies responded to the decline of business in the West by focussing their efforts on the developing world.)
As we've reported before, it looks like global population figures will level off at around the 9 billion mark, which is far from the dire predictions of 20 billion or so that were being made in the 70s. Planning around such levelling off should, at least in theory, mean that it's much easier to make a case for a particular, more environmentally friendly, food logic other than one designed for runaway population growth.
The conundrum posed by these dueling logics boils down to a single, highly complex question, the answer to which is far from clear. Given the vast surplus of food, at least in the West, does the world really need more food?
WorldChanging also had an interesting post on a Truly African Green Revolution, which looks at the "food sovereignty" concept and notes that the original Green Revolution failed in Africa because the techniques were not suited to local conditions (which also means it is the one continent not dependent on fossil fuel based fertilisers), and that they are trying not to go down the GMO path.
In a move that seems certain to spark controversy in agricultural and biotechnology circles, former United Nations Secretary General Kofi Annan of Ghana just announced that the Alliance for a Green Revolution in Africa (AGRA), which he now heads, will not use genetically modified (GMO) seeds to fight the war on hunger in Africa; instead, AGRA will focus on creating new seed varieties from familiar local seeds using conventional breeding methods. AGRA's commitment to local methods and materials is akin to the concept of food sovereignty, the idea that nations should be able to feed themselves, using native resources and techniques.
AGRA was established last year with a $150 million infusion from the Gates and Rockefeller Foundations. Its mission is to revitalize small-scale farming and improve the lives of African farmers (most of whom are women) while improving crop yields to alleviate the poverty that afflicts much of Africa. In addition to developing seeds, AGRA’s ambitious action plan includes fortifying soils depleted by poor agricultural practices; improving access to water and water efficiency; creating better agricultural markets; developing local agricultural education networks; utilizing African farming techniques; and encouraging government policies that help small-scale farmers.
According to its web site, although AGRA does not "advocate for or against genetic engineering," they "know that conventional methods of plant breeding can produce significant benefits in the near term at relatively low cost. Until now, however, conventional plant breeding has not received sufficient attention or investment in Africa, leaving untapped the inherent genetic potential available in African crops."
There is debate, even among those at Worldchanging about the risks and potential of genetically modified crops. Personally, I think current genetic engineering practices are insufficiently studied at best and potentially risky at worst, and that genetically modified crops are more beneficial to factory-farm agribusinesses than indigenous farmers. I'm also concerned about the potential health effects of GMO crops; although they haven't been demonstrated to be biochemically or nutritionally different than conventional, non-GMO crops, the truth is that we just don't know what the long-term health impacts of GMOs, if any, are. That's reason enough for concern.
Focusing on conventional breeding and agricultural practices may not work to increase farm productivity and reduce hunger in Africa. Then again, it very well might. The point, according to AGRA, is that it hasn't been tried. The Green Revolution, so successful at providing food to famine-stricken nations around the world, failed in Africa; as crop production around the world increased, production in Africa actually declined, in part because the tools and techniques of that revolution--high-yield crops, monoculture, pesticides, fertilizer, and mechanized farm machines--were poorly suited to the soil, climate, and economic conditions of Africa.
The problems facing Africa are tremendous: according to AGRA, the number of Africans living below the poverty line of $1 a day increased by 50 percent over the last 15 years, and an estimated 30 percent of the population routinely suffers from hunger. Combating these problems will take billions of dollars and decades. AGRA's approach--to launch, as Annan put it in a recent speech, "a uniquely African Green Revolution"--seems like a good start.
Of course, not wanting to plant or consume GMO crops is pretty common in Europe and the US too, but that hasn't stopped the stuff appearing all over the place. Fortune has an interesting report on the "Attack of the mutant rice".
America's rice farmers didn't want to grow a genetically engineered crop. Their customers in Europe did not want to buy it. So how did it end up in our food?
Back in the spring of 2001, a 64-year-old Texas rice farmer named Jacko Garrett watched a fleet of 18-wheelers haul away truckloads of rice that he had grown with great care. "It just bothers me so bad," Garrett said. "I'm sitting here trying to find food to feed people, and I've got to bury five million pounds of rice." No one likes to waste food, but for Garrett, who runs a charity that collects rice for the needy, the pain was especially acute.
Garrett's rice was genetically modified, part of an experiment that was brought to an abrupt halt by its sponsor, a North Carolina-based biotechnology company called Aventis Crop Science. The company had contracted with a handful of farmers to grow the rice, which was known as Liberty Link because its genes had been altered to resist a weed killer called Liberty, also made by Aventis.
But by 2001, Aventis Crop Science was living a biotech nightmare. Another one of its creations, a variety of genetically modified corn known as StarLink, had been discovered in taco shells made by Kraft. Because the StarLink corn had been approved as animal feed - and not for human consumption - all hell broke loose.
Hundreds of corn products were recalled. Consumers and farmers sued. Greenpeace dumped bags of corn in front of federal regulatory agencies, and an Environmental Protection Agency official accused Aventis Crop Science of breaking the law. So shell-shocked was Aventis SA, the French pharmaceutical giant that owned Aventis Crop Science, that it decided to sell the U.S. biotech unit and abandon the very emotional business of reengineering the foods we eat.
So dumping the Texas rice was a no-brainer. "We didn't want to take any chances," says a former Aventis executive. "We burned and buried enough rice to feed 20 million people."
Eventually Aventis paid about $120 million to settle the StarLink lawsuits. It sold its crop science unit to Bayer, the German drug giant that makes aspirin, Aleve and Alka-Seltzer. Bayer Crop Science dropped plans to bring Liberty Link rice to market, largely because rice grown in the U.S. is exported to Europe and other places that don't want genetically modified foods. And everyone forgot about Jacko Garrett's rice.
Can you guess where this is going? Yep. In January 2006, small amounts of genetically engineered rice turned up in a shipment that was tested - we don't know why - by a French customer of Riceland Foods, a big rice mill based in Stuttgart, Ark. Because no transgenic rice is grown commercially in the U.S., the people at Riceland were stunned. At first they figured that the test was a mistake or that tiny bits of genetically modified corn or soybeans had somehow gotten mixed up with rice during shipping. They said nothing.
Then came another shock. Testing revealed that the genetically modified rice contained a strain of Liberty Link that had not been approved for human consumption. What's more, trace amounts of the Liberty Link had mysteriously made their way into the commercial rice supply in all five of the Southern states where long-grain rice is grown: Arkansas, Texas, Louisiana, Mississippi and Missouri. Bayer and Riceland then informed the U.S. Department of Agriculture, which announced the contamination last August.
By then the tainted rice was everywhere. ...
Regardless of the pros and cons of GMO crops, another pointer to a new agricultural order that no longer relies on fossil fuel derived fertiliser and pesticides is organic farming. There have been a number of reports in recent years that organic farming yields can be as good as or better than those of industrialised farming.
Organic farming can yield up to three times as much food as conventional farming in developing countries, and holds its own against standard methods in rich countries, US researchers say. They said their findings contradict arguments that organic farming, which excludes the use of synthetic fertilisers and pesticides, is not as efficient as conventional techniques.
"My hope is that we can finally put a nail in the coffin of the idea that you can't produce enough food through organic agriculture," Ivette Perfecto, a professor at the University of Michigan's school of Natural Resources and Environment, said in a statement. She and colleagues analysed published studies on yields from organic farming. They looked at 293 different examples. "Model estimates indicate that organic methods could produce enough food on a global per capita basis to sustain the current human population, and potentially an even larger population, without increasing the agricultural land base," they wrote in their report, published in the journal Renewable Agriculture and Food Systems.
"We were struck by how much food the organic farmers would produce," Professor Perfecto said. "Corporate interest in agriculture and the way agriculture research has been conducted in land grant institutions, with a lot of influence by the chemical companies and pesticide companies as well as fertiliser companies, all have been playing an important role in convincing the public that you need to have these inputs to produce food," she added.
With the world population passing the 6 billion mark last October, the debate over our ability to sustain a fast growing population is heating up. Biotechnology advocates in particular are becoming very vocal in their claim that there is no alternative to using genetically modified crops in agriculture if "we want to feed the world". Actually, that quote might be true. It depends what they mean by "we." It's true if the "we can feed the world" refers to the agribusiness industry, which has brought the world to the brink of food disaster and is looking for a way out. Biotech just may be their desperation move. "We'll starve without biotech," is the title of an opinion piece by Martina McGloughlin, Director of the Biotechnology program at the University of California, Davis. Could be. Modern industrial agriculture, which forms the foundation for biotech, ranks as such a dismal failure that even Monsanto holds it up as the evil alternative.
"The commercial industrial technologies that are used in agriculture today to feed the world... are not inherently sustainable," Monsanto CEO Robert Shapiro told the Greenpeace Business Conference recently. "They have not worked well to promote either self-sufficiency or food security in developing countries." Feeding the world sustainably "is out of the question with current agricultural practice," Shapiro told the Society of Environmental Journalists in 1995. "Loss of topsoil, of salinity of soil as a result of irrigation, and ultimate reliance on petrochemicals ... are, obviously, not renewable. That clearly isn't sustainable."
Shapiro is referring to the 30-year-old "Green Revolution" which has featured an industrial farming system that biotech would build on: the breeding of new crop varieties that could effectively use massive inputs of chemical fertilizers, and the use of toxic pesticides. As Shapiro has hinted, it has led to some severe environmental consequences, including loss of topsoil, decrease in soil fertility, surface and ground water contamination, and loss of genetic diversity.
Do we really need to embark upon another risky technological fix to solve the mistakes of a previous one? Instead, we should be looking for solutions that are based on ecological and biological principles and have significantly fewer environmental costs. There is such an alternative that has been pioneered by organic farmers. In contrast to the industrial/monoculture approach advocated by the biotech industry, organic agriculture is described by the United Nations Food & Agriculture Organization (FAO) as "a holistic production management system which promotes and enhances agro-ecosystem health, including biodiversity, biological cycles, and soil biological activity."
Despite the lack of support from government and university extension services in the US, increasing consumer demand for organic products is driving the organic movement ahead at a 20% annual rate of market growth. The amount of certified organic agricultural land increased from 914,800 acres in 1995 to 1.5 million in 1997, a jump of more than 60% in just two years.
Not surprisingly, agribusiness conglomerates and their supporters dismiss organic farming, claiming it produces yields too low to feed a growing world population. Dennis Avery, an economist at the Hudson Institute (funded by Monsanto, Du Pont, Dow, and Novartis among others), had this to say in a recent ABC News' 20/20 broadcast: "If overnight all our food supply were suddenly organic, to feed today's population we'd have plowed down half of the world's land area not under ice to get organic food ... because organic farmers waste so much land. They have to because they lose so much of their crop to weeds and insects." In fact, as a number of studies attest, organic farming methods can produce higher yields than conventional methods. Moreover, a worldwide conversion to organic has the potential to increase food production levels -- not to mention reversing the degradation of agricultural soils and increasing soil fertility and health. ...
I love the way the Hudson Institute crops up there in the last quoted paragraph - I find myself wandering around in memetic circles in some of these explorations...
Energy Bulletin points to a Scientific American article on "future farming" which looks at another technique for making agriculture sustainable - planting and harvesting perennials (this is a key theme in Janine Benyus's book "Biomimicry" too).
Large-scale agriculture would become more sustainable if major crop plants lived for years and built deep root systems.
Many in the West take a blase approach toward thinking about their food supply. Yet our society is almost entirely dependent on a few very similar crops, making us extremely susceptible to a sudden, prolonged drought or epidemic. Scientists are trying to find ways around this, and one of the most appealing ideas is to strengthen plants' root systems and prolong their lives: Hello, perennials!
For many of us in affluent regions, our bathroom scales indicate that we get more than enough to eat, which may lead some to believe that it is easy, perhaps too easy, for farmers to grow our food. On the contrary, modern agriculture requires vast areas of land, along with regular infusions of water, energy and chemicals. Noting these resource demands, the 2005 United Nations-sponsored Millennium Ecosystem Assessment suggested that agriculture may be the "largest threat to biodiversity and ecosystem function of any single human activity."
Today most of humanity's food comes directly or indirectly (as animal feed) from cereal grains, legumes and oilseed crops. These staples are appealing to producers and consumers because they are easy to transport and store, relatively imperishable, and fairly high in protein and calories. As a result, such crops occupy about 80 percent of global agricultural land. But they are all annual plants, meaning that they must be grown anew from seeds every year, typically using resource-intensive cultivation methods. More troubling, the environmental degradation caused by agriculture will likely worsen as the hungry human population grows to eight billion or 10 billion in the coming decades.
In "The Next 200 Years", Herman Kahn also considered our impact on the environment - this section was quite refreshing in many ways - he wasn't a global warming skeptic (even back then) and was concerned about the overuse of nitrogen based fertiliser (even noting that if overuse of nitrogen was really a problem then it could make his extrapolations of food production using industrialised farming techniques invalid).
While looking into these topics in detail would make this post the length of a largish book, I'll just throw a couple of recent links regarding the impact of global warming on farming into the mix (Energy Bulletin has a little roundup of Nitrogen news a while back).
With the U.N.-affililated Food and Agriculture Organization (FAO) already warning of declining grain harvests due to extreme weather, a U.S. study released last week suggests that global warming could cause world agricultural systems to face possible collapse by 2080, with countries in the south being the hardest hit.
India, Pakistan, most of Africa and most of Latin America would be the areas most affected, according to the Washington-based Center for Global Development and the Peterson Institute for International Economics. India, which is fast becoming the world’s most populous nation, could see its agricultural yield fall 29 to 38 percent.
William Cline, the study’s author and a well-known economist, notes that growth in global yields for major crops has actually slowed down. "There's already a sign that there is fatigue in the Green Revolution," he said, noting that the average annual growth in yields during the 1960s and 1970s was 2.6 percent per year – yet by the 1980s and 1990s it had slowed to 1.8 percent. "The problem is that you need the technical change to keep up with demand for food," emphasizes Cline. "I estimate that the global demand for food after you take into account higher population, as well as higher incomes, would about triple from now to late in the century."
While some analysts believe that excess carbon dioxide in the atmosphere will in fact benefit crops, citing laboratory studies that show a yield increase of 30 percent, Cline counters that farm field studies have demonstrated that the benefit from so-called “carbon fertilization” is closer to 15 percent and eventually levels out.
Conversely, food production in northern countries, especially in industrialized nations, could increase due to the effects of global warming lengthening the growing season. Cline cautions, however, that it will not meet world demand for food. Already, there is increasing competition between humans and wild/domesticated animals for food supplies – worldwide meat production is increasing and most of it depends on grain – ultimately bringing into question the future sustainability of such a trend.
Using modelled projections on temperature and rainfall, the study’s results could also be further aggravated by unpredictable factors such as crop pests, severe droughts and water shortages. "Governments and millions of poor people in developing countries have limited ability to cope with such changes," said Nancy Birdsall, president of the Centre for Global Development. "At least a billion people live in the poorest countries that are likely to be worst hit by this slow-moving crisis. This will be a serious problem for us all."
As exceptionally heavy rains continued to cut a wide swath of ruin across northern India, a top United Nations official warned Tuesday that the vagaries of climate change could destroy vast areas of farmland in this country, ultimately affecting food production and adding to the woes of already desperate peasants who live off of the land.
Even a small increase in temperatures, said Jacques Diouf, director general of the United Nations Food and Agricultural Organization, could push down crop yields in southern regions of the world, even as agricultural productivity goes up in northern climes like Europe. A greater frequency of droughts and floods, the agency added, could be particularly bad for agriculture.
"Rain-fed agriculture in marginal areas in semi-arid and subhumid regions is mostly at risk," Diouf said on a visit to the southern Indian city of Chennai. "India could lose 125 million tons of its rain-fed cereal production, equivalent to 18 percent of its total production."
That is a signal of the steep human and economic impact of extreme weather in India, where a majority of peasants still rely on the rains to irrigate their fields and where a bad flood can be nearly as devastating as a bad drought. The latest floods have affected an estimated 20 million people in India alone, 8 million in neighboring Bangladesh and 300,000 in Nepal, according to the United Nations children's agency.
The World Meteorological Organization said in a statement on Tuesday that the region experienced double the normal number of monsoon depressions in the first half of the four-month rainy season which started in June, causing heavy rainfall and flooding across South Asia.
One final section from Kahn's book that I'll point out is one looking at the availability of raw materials, which he called "the end of the beginning". Kahn is extremely scathing about the projections in "Limits To Growth" regarding the available quantities of various minerals and the timeframes in which they are expected to be exhausted. By and large I tend to suspect he is mostly right - there are few minerals for which it looks difficult to find new sources, and many large existing mines have estimated lifespans that will extend for a century or more. The "reserves" data for a lot of minerals tends to be bogus because there is so much of the stuff that exploration only occurs when there are step changes in demand, as we've seen with the China boom.
Even once we do approach the limits to extraction of raw materials, metals, unlike oil and gas (which get burnt, and thus are destroyed), are simply fed into the industrial ecosystem and so will be available for recycling (preferably via an emerging "cradle to cradle" based manufacturing system) forevermore.
TOD Europe had a post on "Peak minerals" recently which didn't do much to convince me of any danger of this occurring any time soon (the iron ore graph being a classic demonstration of why curve fitting for these types of resources seems like a really bad approach). There are a few possible instances where there might be cause for concern though - phosphorus being the most obvious one.
Energy Bulletin has a posting by Patrick Dery and Bart Anderson on the potential for us reaching "peak phosphorus" along with an accompanying reading list which is worth checking out.
Peak oil has made us aware that many of the resources on which civilization depends are limited.
M. King Hubbert, a geophysicist for Shell Oil, found that oil production over time followed a curve that was roughly bell-shaped. He correctly predicted that oil production in the lower 48 states would peak in 1970. Other analysts following Hubbert's methods are predicting a peak in oil production early this century.
The depletion analysis pioneered by Hubbert can be applied to other non-renewable resources. Analysts have looked at peak production for resources such as natural gas, coal and uranium.
In the literature, estimates before we "run out" of phosphorus range from 50 to 130 years. This date is conveniently far enough in the future so that immediate action does not seem necessary. However, as we know from peak oil analysis, trouble begins not when we "run out" of a resource, but when production peaks. From that point onward, the resource becomes more difficult to extract and more expensive.
Dery applied this depletion analysis to phosphate production data for three cases:
* The small Pacific island nation of Nauru, a former phosphate exporter.
* The United States, a major phosphate producer.
* The world.
He tested Hubbert Linearization first on data from Nauru to see whether he could have predicted the year of its peak phosphate production in 1973. Satisfied with the results, he applied the method to the United States and the world. He estimates that U.S. peak phosphorus occurred in 1988 and for the world in 1989. ...
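For anyone curious what Hubbert Linearization actually involves, here is a minimal sketch of the method. The production series is invented purely for illustration (it is not Nauru's, the US's or the world's real data). The idea is that for a logistic depletion curve, annual production divided by cumulative production falls on a straight line when plotted against cumulative production; the line's intercept with the horizontal axis estimates the ultimately recoverable resource, and the peak arrives around the halfway point.

```python
# A minimal sketch of Hubbert linearization, the technique applied above to
# phosphate production. The series below is invented for illustration only
# (it is not Nauru's, the US's or the world's real data).
import numpy as np

# Hypothetical annual production (say, thousand tonnes per year).
production = np.array([5, 8, 12, 18, 25, 30, 33, 34, 32, 28, 22, 16], dtype=float)
cumulative = np.cumsum(production)  # Q: cumulative production to date

# For a logistic depletion curve, P/Q = a - (a/URR) * Q, i.e. a straight line in Q.
ratio = production / cumulative
fit = slice(3, None)  # drop the earliest, noisiest points, as analysts usually do
slope, intercept = np.polyfit(cumulative[fit], ratio[fit], 1)

urr = -intercept / slope   # x-intercept: ultimately recoverable resource
peak_q = urr / 2           # the logistic curve peaks at half the URR

print(f"estimated URR: {urr:.0f}")
print(f"cumulative production to date: {cumulative[-1]:.0f}")
print(f"production peaks when cumulative output reaches ~{peak_q:.0f}")
```

Dery's peak estimates quoted above come from applying this kind of straight-line fit to the real historical phosphate series.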
I'll close with the "Earth clock" from Poodwaddle.com (via Crikey). I hope you found this one interesting (assuming anyone actually reads all the way to the end, which may be a rash assumption).