The Low Cost Energy Revolution
Posted by Big Gav
The SMH has a front page report on the latest IPCC report, noting that the Energy revolution may come at only a small cost.
THE cost of saving the planet from catastrophic climate change will not be a major burden on the world economy, shaving only a small amount from global growth if governments act now, says a report by the United Nations expert panel on climate change.
Recommending that nations act swiftly to boost renewable energy, energy efficiency and halt deforestation, the report by the Intergovernmental Panel on Climate Change, released yesterday, said the world could be on the brink of an energy revolution that would ultimately change how power was used and generated. It also found many nations already had the technical know-how to reduce greenhouse gas emissions and arrest climate change but needed political action to make it happen.
"The consensus of the experts is that it actually doesn't cost the earth to save us from disastrous climate change," said the head of the World Wildlife Fund, Greg Bourne, in summing up the report yesterday.
A former CSIRO climate chief, Dr Graeme Pearman, of Monash University, said the impact on a healthy economy would be small. "The cost of letting climate change happen is a lot more than the cost of mitigation."
Stabilising greenhouse gas emissions at a level that can limit the temperature rise to 2 to 3 degrees would reduce annual gross domestic product growth rates by only 0.12 per cent, the report said. But the cuts would have to be deep. Global emissions would need to be slashed between 50 and 85 per cent by 2050 from levels in 2000.
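To get a feel for what a 0.12 percentage-point reduction in annual growth means in cumulative terms, here is a rough back-of-envelope sketch. The baseline growth rate is an assumption chosen purely for illustration, not a figure from the report:

```python
# Rough illustration only: compound a 0.12 percentage-point reduction
# in annual GDP growth out to 2050. The 3.0% baseline growth rate is
# an assumed figure for the sake of the example.
baseline_growth = 0.030                      # assumed annual global growth
mitigated_growth = baseline_growth - 0.0012  # 0.12 percentage points lower
years = 43                                   # 2007 to 2050

baseline = (1 + baseline_growth) ** years
mitigated = (1 + mitigated_growth) ** years
shortfall = 1 - mitigated / baseline
print(f"GDP in 2050 would be ~{shortfall:.1%} below the no-mitigation baseline")
```

On these assumptions the world economy of 2050 ends up around five per cent smaller than it otherwise would be, which is the sense in which the cost is "small" relative to decades of growth.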
But time was running out, and urgent action was required. Efforts over two to three decades would determine whether the world could avoid major impacts from climate change.
In its key findings, the UN panel's report says:
■ A cost on greenhouse gas pollution caused by fossil fuels of $US20-50 a tonne would have a big impact on cutting harmful emissions. "It could lead to a power generation sector with low greenhouse gas emission by 2050."
■ This would allow renewable energy to have a 30 to 35 per cent share of total electricity supply by 2030.
■ The relatively high cost of nuclear power means it would provide only an additional 2 per cent of the world's electricity supply by 2030, and "safety, weapons proliferation and waste remain as constraints".
■ Improving efficiency of energy supply and use would play a key role in reducing emissions by up to 30 billion tonnes a year by 2030.
The SMH notes there are no more excuses for lethargy on global warming for the federal government.
THERE is a simple message for the Federal Government from the final report of the United Nations' expert panel on climate change: get with the program. We have the technology and the means to arrest climate change. We can save the planet and the economy. We will not bankrupt the world or the nation.
Even Rupert Murdoch gets it. Next week the world's most powerful media baron is expected to announce that News Corp will set targets to cut its greenhouse gas emissions globally, using methods from energy-saving buildings to hybrid car fleets.
From the head of News Corp to the head of Greenpeace, there is a realisation the world is about to embark on an energy revolution, and the UN's Intergovernmental Panel on Climate Change is pointing the way forward.
The SMH also had a good article on green buildings today (not online unfortunately), along with one on wind powered ships (looking backward rather than forward at the new style design for sail power) plus signs of a fight brewing over who gets to benefit from "lower" carbon emissions due to halted land clearing.
This week's copy of The Land newspaper contains one of the more unusual items the state's august rural newspaper has published. It's a two-page ad presenting an invoice for $10.5 billion from some farmers to the governments of Australia. The farmers want to be paid for carbon dioxide their farms have absorbed because of state government laws banning land clearing. After all, the more trees left standing, the more carbon gets sucked out of the air.
The farmers argue that laws to protect native vegetation have enabled the Federal Government to claim it is meeting its Kyoto emissions target without penalising other industries. Therefore those industries, via the government, owe the farmers a lot of money. The price has been estimated according to the NSW Greenhouse Gas Abatement Scheme.
The farmers say they will launch court action if their invoice is not paid. Their chances of success are probably zero, but the ad does raise important questions about Australia's response to climate change and about emissions trading, the subject of an inquiry by a prime ministerial task group, due to report at the end of this month. It is also a useful reminder of how farmers have been shamelessly used by the Federal Government in relation to the Kyoto Protocol.
Not only have they suffered harshly from state native vegetation laws, which have effectively nationalised large parts of many farms without compensation, their suffering has been used as an excuse to allow other businesses to continue to belch out carbon and profit from it.
Australia managed to get a special clause inserted in the Kyoto deal allowing reductions in land clearing after 1990 to be part of the calculation of its net carbon emissions. Fortuitously, an unusually large amount of land had been cleared in 1990. As a result, the Federal Government has been able to claim that Australia was meeting its targets without any economic activity being penalised.
Apart, that is, from farming. Last October a think tank called The Climate Institute published a report claiming that, thanks to bans on land clearing, "Australia's farmers have been responsible for virtually the entire share of the nation's greenhouse gas emissions reductions … Over the same period, emissions from energy and transport have and continue to skyrocket. For example, total energy sector emissions are projected to be 45 per cent above 1990 levels by 2010."
The Climate Institute also claimed that, because Australia refused to ratify the Kyoto Protocol, farmers have been unable to access those international emissions trading schemes that do exist. Our farmers "are unable to convert the emissions reductions they have achieved into financial value and benefit from the growing global carbon market".
WorldChanging has a post on a new open source ecology wiki called OpenEcoSource.org.
Architect Zoka Zola is probably best known (at least in green circles) for her gorgeous and ground breaking zero (fossil fuel) energy house in Chicago. The concept became one of the early poster children of zero energy residential building, proving that self-sustaining homes can be beautiful and modern as well as light on the planet. More recently, she scaled up from the house to the whole city, with a green urban design proposal for Chicago.
Now the Croatian architect has a new green plan in the works -- this time it's virtual and she wants anyone and everyone to help build it. Zola has just launched OpenEcoSource.org, a wiki she calls "the first open source project dedicated to ecology." When I first visited OpenEcoSource, it seemed like there wasn't much there yet, but that's just the point -- the project is meant to be a community-constructed resource from the beginning, where tools can be aggregated and answers shared about numerous challenges related to climate change. From this platform, Zola says:

...we will learn what other web-based tools are needed, and we will incorporate them into the development of OpenEcoSource.org. We will launch a number of other projects on OpenEcoSource in the next few weeks, like news, updates on innovations, climatic information, information about products and building systems, production practices, best regional passive practices, renewable energy sources, reducing/reusing/recycling practices, water issues, transportation design and strategies, economic tools and opportunities, and other global warming related issues.
At this stage, most of the entries relate to domestic activities or building. There are also a number of official documents with detailed research and information about the science of climate change. It's clear that the creators of OpenEcoSource have eyes and ears open now that the project has gone live, learning as quickly as they can what works and what needs improvement, and making those changes as the site progresses and more participants join in.
Zola explained that they are in a continuing process of matching edit, rate, sort, and categorize functions with geographic regions and topic areas. Information that's already been added will be retroactively tagged according to subject and region. Users can upload images, graphs, documents, text and links.
Like all good wikis, improvement and growth relies on the user. The foundation has been laid and the tools are out there, so if you have information to share or questions to ask, it might be worth a visit to the site to see how your ideas can become another building block in what Zola hopes will develop into a central resource about ecology, climate change, and collaborating for a greener future.
TreeHugger reports that China is encouraging its media to act as a watchdog over efforts to reduce pollution and energy consumption.
Last year, China failed to meet ambitious national targets for reducing pollution and energy consumption. This year, the central government is calling on state media to serve as watchdogs on these issues, "assist[ing] the authorities' efforts to control pollution... arousing the public's awareness of energy-saving and exposing problems and irregularities." Reporters have been encouraged to report, in-depth, "on the issues that most concern the public and ones that receive the most complaints."
There are plenty of pollution stories - and complaints - out there in China, and though the booming economy's energy intensity is a major concern for the authorities, last year the country fell far short of its annual goal for reducing energy consumption. (By 2010 the national government intends to cut energy consumption per unit GDP by 20% from 2005 levels, sticking to the target that was in place before last year's setback.) It's encouraging to see government calling on media to play an active role, and so soon after the promulgation of new transparency regulations.
The more environmentally educated China's consumers are, the better. But it remains to be seen how helpful probing journalists can be in influencing China's energy consumption - or the industry and construction sectors, which are crucial. And we can't help thinking back just a couple of weeks to the government's call for more citizen activism, put out not long after the jailing of a prominent environmental activist. Also see ::China's Green Revolution: How Far Will It (Not) Go?
Bruce Sterling has a post in his "Spime watch" series, pointing to a paper on the "Transformation of Manufacturing in the 21st Century" - with the end result being locally manufactured products, made from recycled materials - hopefully that should be a pleasant thought for powerdown doomers as well as Viridian types like me...
Distributed digital production, a category of processes evolving from rapid prototyping, rapid manufacturing, free-form fabrication, and layered manufacturing, is a harbinger of twenty-first-century production, which is dramatically different from the kind of “manufacturing” we know today.
The fundamental nature of distributed-digital processes—the construction of functional metal work pieces by assembling elemental particles, layer by layer, with no instructions other than the computer design files widely used to define objects geometrically—is based on different assumptions than those that drove manufacturing and distribution strategies throughout the twentieth century.
The United States has an early lead in these emerging technologies, partly as a result of creative work at some of the nation’s best universities (e.g., MIT, University of Texas, Carnegie Mellon University, Stanford University, University of Southern California, University of Michigan, and Johns Hopkins University) and Sandia and Los Alamos National Laboratories. The U.S. lead is also the result of the visionary spirit of technology-focused entrepreneurs who head and back companies that are pioneering these new technologies. However, the biggest factor has been the impetus provided by the U.S. government, principally the U.S. Department of Defense, which has much to gain from the development of processes for building spare parts and new products flexibly and without cost sensitivity to production volumes. Whether or not the United States maintains and strengthens its leadership position and realizes the benefits of these processes may depend on the outcome of the current debate on the role of government in providing a national “manufacturing technology infrastructure.”
As the costs and wait times of tooling, programming, and “designing for manufacturing” are reduced and then eliminated, the perceived advantages of high-production volumes, concentrated manufacturing sites, and complex distribution logistics will yield to the advantages of distributed digital production—products designed to meet the specific preferences of individual customers that can be produced on or near the point of consumption at the time of consumption (e.g., automotive spare parts produced at a dealership).
The design freedom enabled by constructing objects in thin layers from particles with dimensions in microns will significantly reduce a product’s component-parts count. This, in turn, will reduce product weight by eliminating attachment features and fasteners and optimize functionality by eliminating excess material and wasted energy. The particles that are not needed for the part produced can be recycled to become the next—maybe very different—part. The metal in older, no longer useful products can be locally recycled to become metal powder feedstock for tomorrow’s production.
Thus, inventory carrying costs and risks and transportation costs can be dramatically reduced, increasing savings in energy, materials, and labor. Finally, because these processes are highly automated, the size of the workforce required to produce and deliver manufactured products to the customer will be greatly reduced. Consequently, low-cost, so-called touch labor will lose its competitive advantage in the production of physical objects.
The demand for innovative product designs will expand dramatically. And, because ideas will be delivered electronically, designers can be located anywhere. As design for manufacturing becomes less important, and because design superiority will be gained principally through understanding and responding to customers’ tastes, designers might want to be located near their customers.
Even if products are designed remotely, however, production will be done locally. Physical objects will be produced “at home” or “in the neighborhood” from locally recycled materials....
"Burko" at Sydney Peak Oil has a write up of the Agri-char / Terra Preta conference that has been on in Terrigal this week.
International Agrichar Initiative 2007 (IAI)
I was fortunate enough to attend the recent 3 day IAI in Terrigal, NSW. I'd describe the experience as illuminating and exciting.
In writing this summary, I'm taking time to convey an accurate impression. It would be very easy to become enthusiastic about the future of these integrated technologies. However, there is one overriding impression of this field to keep in mind – it is brand spanking new. So new that even the choice of the name Agrichar is being debated. There are no books; there are few years of experience even amongst the researchers; the debates about the benefits to AGW are only just beginning.
In short, being a part of the conference could be compared to hearing an orchestra tuning up. There are skillful cellists and masterful tuba players preparing next to each other. The idea holds the potential for beautiful music, rather than cacophony. We aren't really sure who the conductor is yet – plenty of skillful people are taking on parts of that role. There is cooperation and the desire to share experience at all points – but this is a new kind of orchestra.
What is Agrichar anyway?
The name Agrichar refers to the practice of making charcoal to enrich soil health and boost agricultural/horticultural yields. Many cultures have traditional use of charcoal in their horticulture, but it seems the renewed interest has grown from research into the Terra Preta soils of the Amazon basin. Recent research is indicating that these rich, dark soils were in part created by human influence. The BBC produced a documentary, The Secret of El Dorado describing more recent theories about the origins of these soils – a full transcript is available at http://www.bbc.co.uk/science/horizon/2002/eldoradotrans.shtml
If you haven't seen the documentary, read the transcript – it's well worth your while.
However, there's more to the story than just the soil research. Charcoal can be a by-product of various biomass energy processes, such as green waste gasification. The bioenergy field is booming in our postmodern world and every useful output assists in the development and commercialization of bioenergy.
At the core of the agrichar viability debate is the idea of sinking CO2, literally, into the soils. Charcoal contains significant quantities of carbon. The real question is this – when we bury charcoal in soils, how long will it be before the buried carbon finds its way back into the atmosphere as CO2?
I'm resisting the term carbon sequestration as best I can. Yes, this is part of the excitement behind agrichar – can we make energy that is not just carbon free but carbon negative, whilst enhancing soil health and crop yields? As governments and policy makers around the world wrestle with the idea of sequestering CO2 from coal-fired power stations, it might prove wise to avoid the term sequester, since there are numerous problems to be solved and political battles to be fought over CO2 emissions.
We know we can make energy by processing biomass from renewable resources. The biomass energy processes will yield some solid carbon, and larger amounts if the process is tuned for an agrichar yield. Research into the Terra Preta soils indicates that the contained organic carbon can indeed remain in the soil for hundreds or thousands of years!
Have we found a way of fighting global warming and improving soil health and making a reasonable energy yield from renewable resources? Sounds too good to be true....
Not all char is created equal
The first revelation for me at the conference was this – there are massive variations in the chemical and structural composition of char. This is the first key to understanding the need for large-scale trials and research. I had imagined that charcoal, being mostly carbon, would be a fairly uniform creature. Not so. The ingredients, or feedstock, largely drive the chemical breakdown, which can be surprisingly low in carbon. For example, garden greenwaste was sampled in one test batch at 36% carbon.
The duration and temperature of thermal processing has structural effects as well as chemical composition effects. Perhaps you've heard the term “activated charcoal” - I find that this generally indicates charcoals produced at higher temperatures with significantly higher surface areas per gram.
Read about activated carbon at wikipedia http://en.wikipedia.org/wiki/Activated_carbon
Activated carbon can have surface areas of approximately 500 m² per gram, if you can believe it. No wonder this stuff can have such an impact on a soil's ability to hold water.
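To put that 500 m² per gram figure in perspective, here is a quick back-of-envelope calculation. The 10-gram sample size is an assumption for illustration (roughly a tablespoon of powder), and actual surface areas vary widely with feedstock and processing temperature:

```python
# Back-of-envelope: total surface area of a small sample of activated
# carbon, using the ~500 m^2/g figure quoted above.
surface_area_per_gram_m2 = 500
sample_grams = 10  # assumed: roughly a tablespoon of powder

total_m2 = surface_area_per_gram_m2 * sample_grams
print(f"{sample_grams} g of activated carbon: ~{total_m2} m^2 of surface area")
```

That's roughly 5,000 m² from a tablespoon of material – on the order of a football field – which is why such small application rates can still affect soil water retention.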
There is a huge variety of feedstocks that could contribute to the agrichar world. Certainly crop waste material, chicken manures and forestry wastes are being considered and tested. For now, I think of the feedstock as "any woody biomass", within reason. Though chicken manure and cattle skeletons might not be that woody, they are potential feedstocks too, when you have other goals in mind, like high nitrogen content. In the case of the cattle bones, the goal is disposing of potential mad cow pathogens while recovering the bone minerals.
Gasification and pyrolysis
Chars can be produced in a variety of ways. Anyone who's lit a campfire can tell you that you'll end up with some when your campfire goes out. The point is, chars are produced by the partial combustion of biomass. A very efficient campfire might burn all the way to ash – nice for staying warm, but not so good for char production. We'd like a controlled burn, so we can decide the balance of energy and char made available by the process.
Again, wikipedia has an excellent summary of the relationship between gasification and pyrolysis – see http://en.wikipedia.org/wiki/Gasification
Essentially, pyrolysis is said to be a slightly endothermic reaction which is one of the three processes that make up gasification – pyrolysis, combustion and gasification.
The gas produced is referred to as syngas, and is sometimes called producer gas.
My formative understanding of the process goes something like this – if you want to produce non-activated chars, temperatures need to be constrained below the levels that gasification requires in order to make the reaction sufficiently exothermic to be self-sustaining.
Of course, there is more to it than that – I did find that combustion engineers found it difficult to provide a simpler explanation.
I did get one useful figure from Dr Robert Brown, of Iowa State University – if you're burning wood in an open fire, you're probably only getting a third of the heat energy that should be possible from gasification – a pretty compelling reason to try and understand this stuff. It's been said that up to a third of the world's deforestation happens in the name of inefficient cooking fires.
A word about scale
One of the things I find most compelling about the agrichar process is this – it seems you can start really small. Robert Flanagan demonstrated his gasifier stove, which is designed to cook for about two hours on an armload of wood. His aim is to sell these stoves very cheaply in the third world. As he explains, "this stuff does not need to be rocket science".
The conference concluded with a visit to the Best Energies site at Somersby in NSW. Here we observed their plant in operation, transforming about 340 kg of greenwaste per hour into approximately 160 kW of electricity, via an internal combustion engine and gen set. So, we are talking about roughly 2 kg of waste material yielding a kWh of electricity. That word was waste, by the way...
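The "roughly 2 kg per kWh" figure follows directly from the quoted throughput and output numbers, as a quick sanity check shows:

```python
# Sanity check on the Best Energies figures quoted above:
# ~340 kg of greenwaste per hour sustaining ~160 kW of electrical output.
feed_rate_kg_per_hr = 340
electrical_output_kw = 160

# One hour of operation produces 160 kWh from 340 kg of feedstock.
kg_per_kwh = feed_rate_kg_per_hr / electrical_output_kw
print(f"~{kg_per_kwh:.1f} kg of greenwaste per kWh of electricity")
```

That works out to about 2.1 kg per kWh, consistent with the rough figure above.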
There was plenty of talk about bigger agrichar plants – my feeling is, these ideas do scale, within certain parameters, of course.
Greener plants
So, what about the crop trials? Like so many things, the answer is “it depends”, but overall the impression seems positive. It's silly to try and quote percentage improvements in any plant, any climate, any combination of fertilizers. There are just way too many variables.
What I can say is this – quantities of tested char seem to float around 1-10 tonnes per hectare. Not that much, if you think about it. The number and diversity of microbes in soils tested with various chars seem to show dramatic improvements. It seems there are some mysterious interactions between char and fertilizers in some of the trials – together, they somehow become more than the sum of their parts. In any case, the existing trials indicate an enthusiastic need for much more research.
What's the downside?
We hope that anyone generating bioenergy will be responsible about their feedstocks – for example, there's the potential for dioxins being produced under certain conditions (presence of chlorine? I'm guessing). This is going to be a case of careful governance and standards, lest somebody sully the waters with some irresponsible behavior.
Also - it seems way too easy for people to confuse renewable with sustainable – let's hope that the agrichar idea doesn't tempt people to burn biomass in an unsustainable fashion. Watch this space!
Believe it or not, there is very little activity on the net regarding this research – I suspect that will change very rapidly. For now, check out the following lists:
http://terrapreta.bioenergylists.org/?q=about
and
The so-called Stoves list, at the head of the bioenergy site
http://bioenergylists.org/
Jason at Anthropik has a post on "The Allegheny’s Black Gold" and the new oil boom in Pennsylvania.
"Rock oil" was already known at the time, but George Bissell had the idea of using it for kerosene, and sent his business partner, "Colonel" Edwin Drake, to Titusville, Pennsylvania to drill for it. Drake's Well struck oil on 27 August 1859 to become the world's first commercial oil well. Rock oil—better known by the Greek petroleum, or simply "oil"—has become the primary fuel of transportation. We've become addicted to it, an addiction that began on the cusp of the Allegheny National Forest. Today, the forest remains in a productive oil region. It produces more oil and natural gas than all other national forests combined, and like any other place where oil's been struck, we're more than happy to kill every living thing in our path to get to it.

The greatest, swiftest, most efficient and most appalling wave of forest destruction in human history was swelling to its climax in the United States. Nobody knew how much timberland we had left and nobody cared. We were still a nation of pioneers. The world was all before us, and there would always be plenty of everything for everybody.
So wrote Gifford Pinchot, two-term governor of Pennsylvania, personal friend of Theodore Roosevelt, and the man most responsible for the creation of the U.S. Forest Service, in his memoirs about the deforestation of the Allegheny. Much of that came from the wood chemicals industries, tanning and others that logged the forest, but much of it also came from Pennsylvania's oil boom. Some of the small towns of northwestern Pennsylvania boomed with a sudden flux of prosperity, but meanwhile, the rush for oil that began with Drake's Well in 1859 wiped out the forest.
Before Texas or the Middle East, Pennsylvania was the site of the world's first oil boom, and it was the Allegheny that bore the brunt of the cost. "Quaker State" and "Pennzoil" harken back to the origins of the oil industry on the Allegheny's doorstep. Reduced to the "Allegheny Brush Heap" by oilmen and loggers, the watershed of the Allegheny River suffered, leading to floods in Pittsburgh. Pennsylvania peaked as an oil region in 1891, with a second, smaller peak in 1937. What oil still remains in Pennsylvania's oil wells is deeper, heavier, more sour, and not under pressure. It takes more money to extract and refine. But with the price of oil rising from our dependence on foreign, unstable supplies, many of the wells in the Allegheny National Forest have become worth revisiting, leading to a more recent "bump" in oil production.

What's going on atop Pennsylvania's northwestern plateau is an oil and gas boom that is among the biggest since 1859 when Edwin Drake drilled the well that launched the modern oil industry in Venango County, south of Titusville.
The DEP issued a record 3,775 oil and gas well drilling permits in 2006, a 24 percent increase over the 3,044 granted in 2005, which was previously the record year for issuance of oil and gas well permits.
"Over the last five years we've been in an upward spiral of oil and gas permits," said Freda Tarbell, a DEP spokeswoman, who noted that the regional office added two new enforcement officers in the past year to an overworked force that now totals 17 working in 27 northern counties.
More than 1,000 of the wells drilled in that region last year were sunk in the Allegheny National Forest. More than 8,000 active wells are operating in the forest, and federal forest officials expect 1,200 additional wells there this year.
I linked to Jamais Cascio's new "Last Hegemon" post ("The End Of Conventional War") yesterday - for those who didn't bother to click on the link, here are some quotes.
The reasons for this obsolescence are clear: conventional military forces appear to be unable to defeat a networked insurgency, which combines the information age's distributed communication and rapid learning with the traditional guerilla's invisibility (by being indistinguishable from the populace) and low support needs. It's not just the American experience in Iraq (and, not as widely discussed, Afghanistan) that tells us this; Israel's latest war in Lebanon leads us to the same conclusion, and even the Soviet Union's experience in Afghanistan and America's war in Vietnam underline this same point. Insurgencies have always been hard to defeat with conventional forces, but the "open source warfare" model, where tactics can be learned, tested and communicated both formally and informally across a distributed network of guerillas, poses an effectively impossible challenge for conventional militaries.
To be clear, this isn't a crude argument that networked insurgency forces are "stronger" than conventional militaries. In a stand-up fight against a modern army, whether on attack or defense, the guerillas will lose; in an insurgency, where stand-up fights are avoided, the modern army simply cannot win. But even talking about winning and losing in this context is simplistic. Networked insurgencies are best at forcing costly stalemates. When on the offense, networked insurgencies are less about compellence than about provocation (making the enemy more likely to engage in acts that horrify the populace and undermine the enemy's support); on the defense, they're less about protection than about disruption (making the enemy expend increasing amounts of force, money and attention on maintaining its own critical support systems). As a result, a networked insurgency can best be thought of as a deterrent force, promising (and able) to exact a high cost in retaliation for a perceived attack.
If deterrence as a way of making conventional militaries obsolete sounds familiar, it should. Such obsolescence actually began in 1945, with the beginning of the nuclear era. The risk of escalation made conventional conflict between nuclear-armed states functionally impossible, by making it something that must be avoided. ...
Nuclear weapons make conventional conflicts extremely unlikely between nuclear states. Historically, this meant that nuclear states could still mess around with conventional conflicts against non-nuclear states, with varying degrees of success. The growing empowerment of insurgent forces has now made conventional conflicts extremely costly and nearly impossible to win, as well. In time, this should come to make them extremely unlikely at the low end, too.
Because this empowerment looks set to accelerate both technologically (such as with the advent of inexpensive fabbers or the proliferation of ultra-cheap, ultra-smart embedded processors and programming know-how) and organizationally (as the increased participation of various globally-distributed guerilla movements increases the pool of tactics and ways to test them), fights against networked insurgencies will only become more and more dangerous. If the lessons of Iraq, Afghanistan and Lebanon don't sink in this time, the next attempt to use conventional military forces will lead to even costlier failure, and the next after that costlier still -- and, eventually, the fading hegemons, rising superpowers, regional badasses and so forth will finally realize that the Great Game they thought they'd been playing ended years ago.
But what's the new game? Networked insurgencies are just the latest in a long evolution of conflict. How, then, will the powerful again come to dominate the weak?
That remains to be seen, but it's almost certain to involve figuring out ways to achieve networked supremacy, rather than simple force supremacy. It will very likely be much more automated, in part due to the growing reluctance of post-industrial nations to give up the lives of soldiers, and in part due to the growing ability of semi-autonomous machines to carry out tasks beyond the capacity of the human body. Ideally, the proliferation of networked systems in the service of "politics by other means" might even allow for the development of tools that minimize casualties on all sides. (The stalled but brilliant web comic Spiders is one intriguing scenario of what that kind of world might look like.)
With conventional force no longer decisive, uncertainty about what the next wave of global compellence power will look like will inevitably lead to strategic mistakes. Looking ahead, it's clear that if another state -- say, China -- decides to take America's place as the leading hegemonic power on the planet by emulating the current American model of extreme emphasis on conventional force projection, that state has already become another Lost Hegemon. The system has changed, and the meaning of power has changed with it.
Conversely, the first group that cracks this problem has the potential to leapfrog the others in assuming the role of global powerhouse. Given the speed with which technology and organizational models are evolving, we can't assume it will be a state. Corporations seemed poised to take on that role in the 1990s; non-governmental groups are the lead candidates today. It's entirely possible that the kind of social organization that will become the next hegemonic force has yet to be invented. One thing is clear: the next superpower, whoever or whatever it is, will be the actor that finally figures out the new meaning of power.
Technology Review has an article on "Better Catalysts for Fuel Cells", which explains that "Nanoparticles with a completely new shape may lead to cheaper catalysts that could make many experimental-energy technologies more practical".
New nanoparticles with a totally original shape, made by researchers at Georgia Tech, in Atlanta, and Xiamen University, in China, and described in the current issue of Science, could lead to cheaper catalysts for making and using alternative fuels. The 24-sided platinum nanoparticles have surfaces that show up to four times greater catalytic activity compared with commercial catalysts.
If researchers can make even smaller nanoparticles with this same efficient shape, it could significantly reduce the amount of platinum used. Reducing the amount of this expensive metal--it currently sells for about $1,300 per ounce--would make applications such as fuel cells more affordable. Reducing the cost of platinum catalysts could also be critical in other applications, such as synthesizing alternative fuels and converting waste materials like carbon dioxide into useful products. (See "Making Gasoline from Carbon Dioxide.")
The new work is important, says Francesco Stellacci, professor of materials science and engineering at MIT, because it involves platinum, which he says is "by far the most interesting metal" for catalysis. ...
The multifaceted shape made by the researchers has many high-energy areas in which more atoms are unstable and reactive than in conventional platinum nanoparticles. The researchers showed that these surfaces, compared with the surfaces of commercial platinum nanoparticles, catalyzed reactions at a much higher rate.
The current work is only a step toward the goal of making cheaper catalysts. Alexis Bell, professor of chemical engineering at the University of California, Berkeley, says that while the work is interesting because it addresses one of the particular challenges of creating catalysts--controlling the surface structure--the new nanoparticles are simply not small enough. Existing commercial platinum catalysts can be less than five nanometers wide, whereas the Georgia Tech and Xiamen researchers made particles between 50 and 200 nanometers. Being larger, the new nanoparticles have a greater proportion of the expensive platinum locked beneath the surface, where it can't catalyze reactions. As a result, for now, they are actually worse catalysts than the commercial ones available today.
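The size argument above comes down to simple geometry: only atoms at the surface can catalyze, and the surface-to-volume ratio shrinks as particles grow. A rough back-of-the-envelope sketch (assuming spherical particles, a one-atom-thick surface shell, and a nominal ~0.28 nm platinum atomic diameter -- all illustrative assumptions, not figures from the article) shows why 50-200 nm particles waste so much platinum compared with 5 nm ones:

```python
# Rough estimate of the fraction of platinum atoms sitting at a particle's
# surface, modelling the particle as a sphere with a one-atom-thick shell.
# PT_ATOM_DIAMETER_NM is an assumed round figure for illustration only.

PT_ATOM_DIAMETER_NM = 0.28

def surface_fraction(particle_diameter_nm: float) -> float:
    """Approximate fraction of atoms in the outermost atomic shell."""
    core = max(particle_diameter_nm - 2 * PT_ATOM_DIAMETER_NM, 0.0)
    # Shell volume / total volume = 1 - (core diameter / total diameter)^3
    return 1.0 - (core / particle_diameter_nm) ** 3

for d in (5, 50, 200):
    print(f"{d:>3} nm particle: ~{surface_fraction(d):.1%} of atoms at the surface")
```

With these assumed numbers, a 5 nm particle puts roughly 30% of its atoms at the surface, while particles in the 50-200 nm range put only about 1-3% there -- the rest of the platinum is "locked beneath the surface" exactly as Bell describes.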
According to Wang, the goal is ultimately to use the new nanoparticles and the methods for making them to help find ways of transforming much cheaper materials into useful catalysts. If that can be done, some technologies limited to the lab bench today could be applied to meeting growing worldwide energy needs.
Chris Nelder at GetRealList has a Billmon-style set of paired quotes from Senator James Inhofe showing that his understanding of the history of the Iraq war is on a par with his understanding of global warming. This dude is the biggest tinfoil merchant in the US Senate, bar none. Chris also has a link to a debate between Steve Forbes and Boone Pickens about peak oil, from the April 24 Milken Institute Global Conference.
"The whole idea of weapons of mass destruction was never the issue, yet they keep trying to bring this up."
-- Sen. James Inhofe (R-OK), 4/27/07, criticizing Congress and the media for "mischaracterizing" the reasons for U.S. involvement in Iraq
"Our intelligence system has said that we know that Saddam Hussein has weapons of mass destruction -- I believe including nuclear."
-- Inhofe, 8/18/02