Tom Konrad has a look at some of his favoured clean energy companies for the upcoming year - Ten Clean Energy Stocks for 2010. It's interesting that none of these companies generate power (or build equipment for generating power).
In late 2008, when I was putting together my list for 2009, I had a relatively easy time. Fear was rampant, and there were many great companies selling for single-digit multiples of earnings. Today, complacency and greed have returned to the markets, and good values are very hard to come by. The following 10 are mostly the result of culling through our Alternative Energy Stock lists for companies in my favored sectors that look ready for the premature end of the recovery: Companies with strong balance sheets, good cash flow and profitability at not-too-expensive multiples.
Technology Review has a roundup of the year's big news on the energy front, highlighting "Liquid batteries, giant lasers, and vast new reserves of natural gas" (showing a touching amount of faith in the long term future for unconventional natural gas from both a supply point of view and an environmental point of view) - The Year in Energy.
One of the most dramatic developments ("Natural Gas Changes the Energy Map") was the rush to exploit a vast new resource; new drilling technologies have made it possible to economically recover natural gas from shale deposits scattered throughout the country, including in Texas and parts of New York, Pennsylvania, and Ohio. Advances in drilling technology have increased available natural gas by 39 percent, according to an estimate released in June. The relatively clean-burning fuel could cut greenhouse gas emissions by becoming a substitute for coal. Natural gas might even provide an alternative to petroleum in transportation, especially for buses and taxis--if only policymakers could take advantage of the new opportunity. ...
This year was also the year of the smart grid, as numerous test projects for improving the reliability of the grid and enabling the use of large amounts of renewable energy got underway ("Technology Overview: Intelligent Electricity"). The smart grid will be enabled by key advances, such as superconductors for high-energy transmission lines ("Superconductors to Wire a Smarter Grid") and smart networks being developed by companies such as GE ("Q&A: Mark Little, Head of GE Global Research").
Cellulosic ethanol--made from biomass such as grass rather than corn grain--moved closer to commercialization, with announcements of demonstration plant openings ("Commercializing Garbage to Ethanol") and scientific breakthroughs that could make the process cheaper ("Cellulosic Ethanol on the Cheap"). But at the same time, a number of companies are moving beyond cellulosic ethanol to the production of gasoline, diesel, and jet fuel from biomass--fuels that can be used much more readily in existing infrastructure and in existing vehicles. ExxonMobil announced substantial investments in algae-based fuels ("Big Oil Turns to Algae"). Remarkably, one startup declared its process--based on synthetic genomics and algae--could allow biofuels to replace all transportation fuels without overwhelming farmland ("A Biofuel Process to Replace All Fossil Fuels").
Still, most people think biofuels will only supply a fraction of our transportation needs ("Briefing: Transportation"). To eliminate carbon emissions and drastically curtail petroleum consumption will require plug-in hybrids ("Driving the Volt") and other electricity-powered vehicles ("Nissan's Leaf: Charged with Information"). Advances that could double (or more) the energy capacity of batteries and lower their costs could one day make such vehicles affordable to the masses. These include new formulations such as lithium-sulfur batteries ("Revisiting Lithium-Sulfur Batteries"), metal-air batteries ("High-Energy Batteries Coming to Market") such as lithium-air batteries ("IBM Invests in Battery Research"), and batteries that rely on nanowires and silicon ("More Energy in Batteries"). A novel concept for super-fast charge stations at bus stops could make electric buses practical ("Next Stop: Ultracapacitor Buses").
Getting the electricity to charge these vehicles--without releasing vast amounts of carbon dioxide--could be made easier by a number of advances this year. A new liquid battery could cheaply store energy from wind turbines and solar panels for use when the sun isn't shining and the wind isn't blowing ("TR10: Liquid Battery"), making it practical to rely on large amounts of renewable electricity. Vast arrays of mirrors ("Solar Thermal Heats Up") are being assembled in the desert to convert solar heat into electricity, and photovoltaic solar farms for converting light directly into electricity ("Chasing the Sun") are getting a boost from the federal stimulus money. And researchers are finding ways to increase the efficiency of solar cells ("More Efficient, and Cheaper, Solar Cells") and are discovering new photovoltaic materials to make solar power cheaper ("Mining Fool's Gold for Solar").
Once fossil fuels run dry, or their price becomes too high, the future of plastics is bioplastics.
But that alternative future is distant, measured in terms of decades, says Frederic Scheer, chairman, president and founder of Cereplast Inc., a Hawthorne, CA, company that designs and manufactures bio-based, sustainable plastics.
Which is not to say that bioplastics’ present is particularly shabby: Scheer says that U.S. demand for bioplastics could exceed $10 billion by 2020. That’s a conservative estimate, he contends, but it’s still a “drop in the bucket” compared to the traditional plastic market, which is about $2.5 trillion.
Scheer may be a visionary and a pioneer when it comes to bioplastics, but he’s also realistic about the challenge and the effort it will take to penetrate and begin to replace the traditional plastic market.
Referring to a recent 245-page study on the emerging bioplastics market commissioned by European Bioplastics and the European Polysaccharide Network of Excellence, Scheer says, “In 2007, only 0.3 percent of global plastic production was bio-based. By 2013 we expect that overall bioplastics manufacturing capacity will increase by approximately seven times current levels, which still barely taps the surface.”
But there is no escape, Scheer continues. Traditional plastics “will need to embrace bioplastic” because the price of oil is volatile and will surely increase over time, which increases the pressure to move to bioplastics. “There’s also increasing demand from consumers to use bioplastic.”
Cereplast’s technology produces bio-based resins, little pellets of material used as the building blocks of molded plastic products. The resins replace nearly all, or a significant portion, of the petroleum-based additives used in plastics with natural material derived from starches such as tapioca, corn, wheat and potatoes.
In addition to starch-based resins, Cereplast has developed a technology to transform algae into bioplastics and is planning to launch a new family of algae-based resins. Algae have the potential to become a major green feedstock for biofuels and bioplastics.
There will be a day when bioplastic replaces traditional plastic but that replacement will occur incrementally over a long period of time, say 20-30 years “because we are starting from a low point,” Scheer says. It is an emerging market with plenty of room for growth and new entrants – Cereplast is one of only three major players, the others being Natureworks and Metabolix.
Senator Dianne Feinstein introduced legislation in Congress on Monday to protect a million acres of the Mojave Desert in California by scuttling some 13 big solar plants and wind farms planned for the region.
But before the bill to create two new Mojave national monuments has even had its first hearing, the California Democrat has largely achieved her aim. Regardless of the legislation’s fate, her opposition means that few if any power plants are likely to be built in the monument area, a complication in California’s effort to achieve its aggressive goals for renewable energy.
Developers of the projects have already postponed several proposals or abandoned them entirely. The California agency charged with planning a renewable energy transmission grid has rerouted proposed power lines to avoid the monument.
“The very existence of the monument proposal has certainly chilled development within its boundaries,” said Karen Douglas, chairwoman of the California Energy Commission.
Mrs. Feinstein heads the Senate subcommittee that oversees the budget of the Interior Department, giving her substantial clout over that agency, which manages the government’s landholdings. Her intervention in the Mojave means it will be more difficult for California utilities to achieve a goal, set by the state, of obtaining a third of their electricity from renewable sources by 2020; projects in the monument area could have supplied a substantial portion of that power.
“This is arguably the best solar land in the world, and Senator Feinstein shouldn’t be allowed to take this land off the table without a proper and scientific environmental review,” said Robert F. Kennedy Jr., the environmentalist and a partner with a venture capital firm that invested in a solar developer called BrightSource Energy. In September, BrightSource canceled a large project in the monument area.
In Europe there are two nuclear plants under construction, one in Flamanville, France and one in Olkiluoto, Finland, both by France's state-owned Areva. Both have been subject to significant troubles, partly related to being the first builds of the most evolved advanced model in production, Areva's EPR, which was supposed to be simpler, more efficient, cheaper and faster to build. In Finland's Olkiluoto, a 50 per cent blowout in costs (to $US6.4 billion so far, lawsuits pending) and a doubling in construction time (from 3.5 years to at least seven years) is typical of nuclear projects over the decades. Today Areva concedes that construction of a similar reactor of 1.6 gigawatts would cost $US8 billion ($A9 billion).
The reason nuclear plants routinely run into such troubles is that they are hugely capital intensive, so delays greatly add to the cost of capital long before any revenue is generated. Construction is extremely complex, which is greatly compounded by safety regulation — this was another major cause of the slowdown at Olkiluoto. For these reasons the industry prefers to quote "overnight" costs, which are the costs as if a plant were constructed overnight at today's prices.
Dr Ziggy Switkowski, chairman of the Australian Nuclear Science and Technology Organisation (ANSTO), has said that Australia should build 50 reactors though this assumes a doubling of electricity consumption by 2050. Dr Ian Smith suggested, when chief executive of ANSTO in 2008, that Australia could realistically construct six to 14 plants but this would still provide only 10-20 per cent of total electricity requirements.
Australia's current electricity consumption is almost 40 gigawatts from installed capacity of about 50 gigawatts. So, to replace most of this would require about 25 reactors of the EPR design, each of 1.6 gigawatts (or 40 of the Westinghouse AP1000 1 gigawatt design). This could cost about $225 billion in today's money, or close to half a trillion dollars for 50 reactors. Using Smith's more modest suggestions the cost could be up to $126 billion but displace a lot less coal burning. Switkowski may be correct in the sense that why create all these contentious issues and still not substantially solve the problem? This points to another weakness: with nuclear it appears to be an all-or-nothing gamble with hundreds of billions of dollars.
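To make the arithmetic explicit, here is a back-of-envelope sketch using only the A$9 billion-per-EPR figure quoted above and the reactor counts discussed in the text (the labels and scenario groupings are mine, not the article's):

```python
# Back-of-envelope check of the reactor build-out costs quoted above.
# Assumes A$9 billion per 1.6 GW EPR (Areva's own quoted figure) and
# the reactor counts discussed in the text.
COST_PER_EPR_AUD_BN = 9.0
EPR_CAPACITY_GW = 1.6

scenarios = [
    ("Replace most current consumption", 25),
    ("Switkowski's 50-reactor scenario", 50),
    ("Smith's upper suggestion", 14),
]
for label, n in scenarios:
    print(f"{label}: {n} reactors, {n * EPR_CAPACITY_GW:.0f} GW, "
          f"~A${n * COST_PER_EPR_AUD_BN:.0f} billion")
```

Running the numbers reproduces the article's figures: 25 reactors come to about A$225 billion, 50 to A$450 billion (close to half a trillion), and Smith's 14 to A$126 billion.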
Nuclear advocates always cite "next-gen" designs and purportedly much swifter and cheaper construction, but the figures given above are the actual costs of the plants being constructed in Europe today, not even the much higher industry estimates reported by Grunwald for the proposed US plants. The timetable of this construction is anyone's guess, except that history warns us to be pessimistic. By comparison, China plans 50-60 of the simpler, smaller Westinghouse design by 2030, but nuclear will still account for only about 4 per cent of its energy needs.
Those are just the construction costs. As is well known, liability insurance needs to be covered by government. The other big cost is the decommissioning of reactors. Even with many of the world's 439 existing reactors approaching the end of their productive lives, so far none have been decommissioned. The world's first commercial nuclear power generator, Calder Hall at what is now called Sellafield (previously Windscale), was turned off in 2003. The UK industry has estimated that full decommissioning of Calder Hall, if ever done, will cost about $2 billion at today's prices. Meanwhile, old plants need continuous maintenance and high security against decay and incursion, including by potential terrorists.
But the biggest cost, especially for Australia, could be the opportunity cost of throwing these vast sums into an old technology dominated by other countries, rather than investing in new renewable technologies and industries of the future. From relatively modest funding Australia has already produced world-beating solar-photovoltaic and solar-thermal technologies, even if both have moved offshore due to lack of investment support. Geothermal power has just received government grants, which will allow full prototypes to be tested in a few years. Many scientists believe that it is inevitable that these technologies will be viable, provide so-called baseload power cost-competitively, and that their maturation would be faster than the typical construction schedules of nuclear power stations if comparable budgets and subsidies were deployed.
Is this any different to the claims by the nuclear dreamers such as Brook and Nicholson? Emphatically yes. The nuclear industry is not a new one but an old mature one. For more than 50 years it has consistently over-promised and under-delivered, yet its advocates continue to propose that governments should provide massive subsidies to nuclear construction, provide unlimited liability insurance, assume most of the decommissioning costs and — after 50 years — continue to search for the elusive "permanent" storage of high-level waste.
These are not minority views, and indeed they are not contested by the nuclear industry, or the Wall Street Journal, or Lazard, the merchant bank. Or many scientists. Here is commentary from the world's top science journal Nature (W. Patterson, Vol 449, 11/10/07): "As climate and fuel security dominate the energy agenda, the battle between traditional and innovative electricity intensifies around the world, notably in fast-growing economies such as China. After half a century, nuclear power is the ultimate in tradition. It needs climate more than climate needs it. To avert catastrophic global warming, why pick the slowest, most expensive, most limited, most inflexible and riskiest option? In 1957, despite the Windscale fire, nuclear power was worth trying. We tried it: its weakness proved to be economics, not safety. Now nuclear generation is just an impediment to sustainable electricity."
It is a clear enough choice. The economics and the long time to approve and build show nuclear is not the smart choice, arguably for the world but certainly not for Australia with its plentiful resources in renewables (solar, wind, wave, tidal, geothermal).
The real question for Australia is whether we have what it takes to grasp the opportunities.
Peak Energy has (somewhat surprisingly) made it past its 5th birthday.
While it's been an up-and-down year for me personally and a bit of a slow year here at PE in terms of original content, I'm glad to see there are still plenty of people reading, and I'd like to wish you all a happy new year.
Rigzone has an article about high hopes for a methane hydrates development offshore from New Zealand (in an area where a wind farm would probably achieve as high a utilisation rate as you'd find anywhere in the world, I'd note) - NZ Methane Hydrates May Soon Be Developed.
A gas industry using frozen gas hydrates below the seabed off the East Coast could be developed in the near future thanks to rapid global technical developments.
George Hooper, executive director of the Centre for Advanced Engineering, told a recent Oil and Gas conference in Wellington that exploitation of methane hydrates could transform New Zealand's energy market.
Hooper is lead author in a recent CAE report on an options analysis for commercial development of energy from offshore methane hydrates in New Zealand.
He said 'sweet spots' containing high concentrations (about 4-10%) of methane hydrate found in sheets under the seabed off the East Coast may contain about 8.5 - 21 trillion cubic feet (TCF) of recoverable gas.
He said New Zealand's methane hydrates endowment is very likely the largest in the world on a per capita basis and potentially one of the largest resources in the world.
Inferred resources of hydrates in New Zealand are 813 TCF with 40 TCF identified as potentially economically recoverable. Inferred world resources of hydrates are 20,000 TCF.
The ice-like crystals of water and methane molecules intermixed with sediments are found over 50,000 sq km from offshore Marlborough to offshore Gisborne, as well as off Fiordland.
A number of countries were now working on developing commercial gas production from hydrates including Japan, India, the US and South Korea. Japan was talking of a 2015 timeline for first production, though this might be optimistic, Hooper said.
He anticipates rapid progress in the engineering geology and production technologies required for hydrates extraction, both internationally and in New Zealand.
This demanded a considerable ramp-up of hydrate research and development effort here to ensure New Zealand has the earliest possible opportunity to develop its hydrate resources and associated skills.
A conceptual well development plan for a known Wairarapa hydrates 'sweet spot' site offshore Wairarapa, east of Wellington, was prepared for the study.
Costings for a small scale 10 petajoule a year 'proving' project indicated this option would require capital expenditure of $370 million.
To produce 150 PJ of gas, equal to the entire New Zealand gas market, the capital expenditure would be about $4 billion, about twice the $2 billion capital spending required to produce a similar volume of conventional natural gas.
The cost of building a 300 PJ project both for domestic gas use and for the export of LNG, would cost about $8 billion.
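A quick way to compare these options is capital cost per petajoule of annual capacity, using only the figures quoted above (a rough sketch; all amounts are the article's estimates in NZ$):

```python
# Capital cost per PJ of annual output, from the figures quoted above.
projects = {
    "10 PJ/yr proving project":       (370, 10),    # ($m capex, PJ/yr)
    "150 PJ/yr domestic market":      (4000, 150),
    "300 PJ/yr domestic + LNG":       (8000, 300),
    "150 PJ/yr conventional gas":     (2000, 150),  # comparison from the text
}
for name, (capex_m, pj_per_year) in projects.items():
    print(f"{name}: ~NZ${capex_m / pj_per_year:.0f}m per PJ of annual capacity")
```

On these figures the hydrate projects run at roughly NZ$27m per PJ of annual capacity at scale (NZ$37m for the small proving project), about twice the capital intensity of conventional gas, consistent with the comparison in the article.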
Energy keeps our economy running. Energy is also what we use to obtain more energy. The more energy we use to obtain more energy, the less we have for the rest of the economy.
The concept of Energy Return on Investment (EROI), alternatively called Energy Return on Energy Invested (EROEI), has been widely used to quantify this relationship. The following chart, from a SciAm paper, shows the EROI of various sources of energy, with the tan section of the bar representing the range of EROIs depending on the source and the technology used. I've seen many other estimates of EROI, and this one seems to be on the optimistic (high EROI) end for most renewable energy sources.
The general trend is clear: the energy of the future will have lower EROI than the energy of the past. Low carbon fuels such as natural gas, nuclear, photovoltaics, wind, and biofuels have low EROI compared to high-carbon fuels such as coal and (formerly) oil.
The graph also clearly shows the decline in the EROI over time for oil. Other fossil fuels, such as coal and natural gas, also will have declining EROI over time. This happens because we always exploit the easiest resources first. The biggest coal deposits that are nearest to the surface and nearest to customers will be the first ones we mine. When those are depleted, we move on to the less easy to exploit deposits. The decline will not be linear, and new technology can also bring temporary improvements in EROI, but new technology cannot change the fact that we've already exploited all the easiest to get deposits, and new sources and technologies for extracting fossil fuels often fail to live up to the hype.
While there is room for improvement in renewable energy technologies, the fact remains that fossil fuels allow us to exploit the energy of millions of years of stored sunlight at once. All renewable energy (solar, wind, biomass, geothermal) involves extracting a current energy flux (sunlight, wind, plant growth, or heat from the earth) as it arrives. In essence, fossil fuels are all biofuels, but biofuels from plants that grew and harvested sunlight over millions of years. I don't think that technological improvements can make up for the inherent EROI advantage that this many-millions-to-one time compression conveys to fossil fuels.
Hence, going forward, we are going to have to power our society with a combination of renewable energy and fossil fuels whose EROI is no better than the approximately 30:1 potentially available from firewood and wind. Neither of those two sources can come close to powering our entire society (firewood because of limited supply, and wind because of its inherent variability), and storable fuels such as natural gas, oil, and biofuels all have either declining EROI already below 20, or extremely low EROI to begin with (biofuels). Energy storage is needed to match electricity supply with variable demand, and to power transportation.
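The arithmetic behind this concern is worth making explicit: with an EROI of r, a fraction 1/r of gross energy output must be plowed back into obtaining energy, leaving 1 - 1/r for everything else. A minimal sketch (the EROI values below are illustrative, roughly matching the ranges discussed above):

```python
# Net energy as a function of EROI: a fraction 1/EROI of gross output
# is reinvested in energy gathering; the rest powers the economy.
# EROI values are illustrative, not measured figures.
for label, eroi in [("historical oil", 100.0),
                    ("firewood / wind", 30.0),
                    ("declining storable fuels", 20.0),
                    ("typical biofuels", 1.5)]:
    net = 1.0 - 1.0 / eroi
    print(f"EROI {eroi:>5.1f}:1 ({label}): {net:.1%} of gross output is net energy")
```

The net fraction barely moves between EROI 100 and 30 (99% versus 97%), but collapses as EROI approaches 1, which is why the very low EROI of biofuels matters so much.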
The Australian has an article on a forthcoming report by the Australian Academy of Science on renewable energy - Green power feasible.
The report, titled Australia's Renewable Energy Future, puts the scientific might of the academy up against sceptics claiming that renewables cannot meet baseload energy needs.
It challenges assumptions underlying an economic model of renewable energy take-up developed by the CSIRO and the Australian Bureau of Agricultural and Resource Economics on the grounds they are too conservative. In the virtual futures generated in the modelling, geothermal and solar thermal would remain as only minor components in Australia's energy mix until 2040.
The model could not capture recent technological advances and the stimulatory impact of government intervention, Professor Dopita said. In the real world, it risked becoming a self-fulfilling prophecy, helping to reinforce a focus on fossil fuel in policy formulation.
"We can change the way we do business entirely by stimulating those new industries, getting them past the economic thresholds that make them appear to be uncompetitive with coal," he said. "If you give the appropriate financial incentives early on, the whole thing snowballs. As the technology accrues the advantages of scale, it becomes self-sustaining and provides new employment and export opportunities."
The academy estimates Australia has enough accessible geothermal energy to meet 26,000 years of its power needs. More than 30 companies aim to deliver geothermal energy to the grid, the renewable energy report says.
However, the accessible geothermal resource is concentrated in granite formations in the outback. To cut energy losses in getting the hot rock power to the cities, the government would need to invest billions of dollars in a high-voltage direct current long-distance electricity transmission system.
Sandia lead investigator Greg Nielson said the research team has identified more than 20 benefits of scale for its microphotovoltaic cells. These include new applications, improved performance, potential for reduced costs and higher efficiencies.
"Eventually units could be mass-produced and wrapped around unusual shapes for building-integrated solar, tents and maybe even clothing," he said. This would make it possible for hunters, hikers or military personnel in the field to recharge batteries for phones, cameras and other electronic devices as they walk or rest.
Even better, such microengineered panels could have circuits imprinted that would help perform other functions customarily left to large-scale construction with its attendant need for field construction design and permits.
Said Sandia field engineer Vipin Gupta, "Photovoltaic modules made from these microsized cells for the rooftops of homes and warehouses could have intelligent controls, inverters and even storage built in at the chip level. Such an integrated module could greatly simplify the cumbersome design, bid, permit and grid integration process that our solar technical assistance teams see in the field all the time."
For large-scale power generation, said Sandia researcher Murat Okandan, "One of the biggest scale benefits is a significant reduction in manufacturing and installation costs compared with current PV techniques."
The growing attractiveness of the global solar energy market was underlined this week when South Korea's LG Electronics (LG) announced that it is to start commercial production of solar cells and modules next month.
The company said that it plans to manufacture approximately 520,000 solar modules a year using silicon wafers, at a plant 200 kilometres to the south east of Seoul with a total capacity of 120MW.
LG said that it planned to set up another production line for operation by 2011, increasing total output to 240MW.
Kwan-shik Cho, vice president of the solar business team at LG Electronics, explained that the goal is to become a global player in the world's solar industry.
"While we recognise this is a crowded playing field, LG has the necessary skills, know-how and business strategy to make this a profitable venture for the long-term," he said.
LG sees the solar business as a key area of growth, and claimed that it had been preparing to enter the market since 2004.
The firm will manufacture large-area thin-film solar cells, as well as the more widespread crystalline solar cells.
In July 2009, LG announced that the company had achieved the world's most energy efficient large-area thin-film solar cells in a trial.
The world's biggest supplier of solar-manufacturing equipment has opened a research and development center in China, and its chief technology officer will relocate from Silicon Valley to that country next month. Applied Materials, founded in 1967 as a semiconductor company, has manufactured in China for 25 years, but is expanding its presence to be closer to its customers and develop products suited to the country's urban population.
"We're doing R&D in China because they're becoming a big market whose needs are different from those in the U.S.," says Mark Pinto, Applied Materials's CTO. Going forward, he says, "energy will become the biggest business for the company," and China, not the U.S., "will be the biggest solar market in the world."
Indeed, the move by Applied Materials is just the latest sign that China is rapidly moving to the forefront in adopting renewable energy technologies. China is no model for addressing climate change--its greenhouse-gas emissions are expected to nearly double by 2030. The lion's share of demand for photovoltaics comes from Europe, which accounted for 82 percent of the photovoltaics sold in 2008, according to a report by Solarbuzz. China currently makes up less than 1 percent of the demand for photovoltaics, but its demand for photovoltaics is expected to grow; Beijing aims to produce 20,000 megawatts of solar energy by 2020.
Technology Review has an article on (at this stage theoretical) ultra-high performance "Digital quantum batteries" - A Quantum Leap in Battery Design.
A "digital quantum battery" concept proposed by a physicist at the University of Illinois at Urbana-Champaign could provide a dramatic boost in energy storage capacity--if it meets its theoretical potential once built.
The concept calls for billions of nanoscale capacitors and would rely on quantum effects--the weird phenomena that occur at atomic size scales--to boost energy storage. Conventional capacitors consist of one pair of macroscale conducting plates, or electrodes, separated by an insulating material. Applying a voltage creates an electric field in the insulating material, storing energy. But all such devices can only hold so much charge, beyond which arcing occurs between the electrodes, wasting the stored power.
If capacitors were instead built as nanoscale arrays--crucially, with electrodes spaced at about 10 nanometers (or 100 atoms) apart--quantum effects ought to suppress such arcing. For years researchers have recognized that nanoscale capacitors exhibit unusually large electric fields, suggesting that the tiny scale of the devices was responsible for preventing energy loss. But "people didn't realize that a large electric field means a large energy density, and could be used for energy storage that would far surpass anything we have today," says Alfred Hubler, the Illinois physicist and lead author of a paper outlining the concept, to be published in the journal Complexity.
Hubler claims the resulting power density (the speed at which energy can be stored or released) could be orders of magnitude greater, and the energy density (the amount of energy that can be stored) two to 10 times greater than possible with today's best lithium-ion and other battery technologies.
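Behind the claim is the standard capacitor energy-density relation for a vacuum gap, u = ½ε₀E²: stored energy grows with the square of the electric field, so if nanoscale electrode spacing really does suppress arcing and permit far larger fields, energy density climbs quickly. A minimal sketch; the field strengths are illustrative assumptions, not figures from Hubler's paper:

```python
# Energy density of an ideal vacuum-gap capacitor: u = 1/2 * eps0 * E^2.
# The quadratic scaling is the point: tolerate a 10x larger field and
# you store 100x more energy per unit volume. Field values illustrative.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

for field_v_per_nm in (0.1, 1.0, 10.0):
    e = field_v_per_nm * 1e9          # convert V/nm to V/m
    u = 0.5 * EPS0 * e * e            # J/m^3
    print(f"E = {field_v_per_nm:>4.1f} V/nm -> u = {u / 1e6:10.2f} MJ/m^3")
```

The claimed advantage over lithium-ion therefore hinges entirely on how large a field the nanoscale gaps can actually sustain before breakdown.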
San Francisco's Pacific Gas & Electric said today it filed a preliminary permit application with the Federal Energy Regulatory Commission (FERC) for a three-year study of a potential wave power site with a capacity of up to 100 megawatts off the coast of northern Santa Barbara County.
If the study is successful, the utility proposes the installation of wave energy conversion devices that would feed into the electrical grid at Vandenberg Air Force Base. PG&E said it already signed an agreement with the U.S. Air Force to allow it to conduct the study.
PG&E has received approval from FERC to conduct environmental studies of a potential wave power site off the coast of Humboldt County. The utility says it plans to give a maximum of four wave energy converter (WEC) manufacturers the chance to test their devices.
The five-year trial installation is expected in fall 2013.
Eos has just published the results of a survey of 3146 Earth Scientists conducted by Peter Doran and Maggie Kendall Zimmerman. The graph below shows the results for this question:
Do you think human activity is a significant contributing factor in changing mean global temperatures?
The 97% of active climatologists is 75 out of the 77 in the survey. Doran and Zimmerman say:
While respondents' names are kept private, the authors noted that the survey included participants with well-documented dissenting opinions on global warming theory.
I'm guessing that Lindzen and Spencer are the two that said "no".
The difference between the opinion of the general public and the scientists is striking. For comparison, despite the ongoing efforts of right-wing pundits here, 80% of Australians answered "yes" to a similar question.
What’s black and white and sunny all over? It’s the BUMPS housing complex in Beijing, created by SAKO Architects. Typical buildings in China face north and south, but SAKO’s design rotates the buildings 45 degrees from the north-south axis to provide maximum exposure to sunlight. The designers also staggered the two-floor units to create additional space between the units, which tenants can then use as terraces. The integrated project, which includes both residential and commercial buildings, is located in a developing area in southwest Beijing and was one of this month’s WAN Awards residential category entries.
The United Kingdom is splattered with fossil fuel based power plants and concrete cooling towers which are major carbon producers as well as eyesores. Luckily, plans for a new biomass power plant covered in native grasses in the UK have just been released, and it will complement the surrounding ecology as well as decrease carbon emissions by 80% compared to coal- or gas-fired power stations. Designed by Thomas Heatherwick's London-based studio, the 49.3 MW power plant located on the banks of the River Tees will be a man-made mountain covered in plants, and will certainly be a welcome replacement for the older, pollution-spewing plants around the country.
Powered by palm kernel shells, a byproduct of palm oil plantations that is considered a renewable fuel, the plant will have its fuel delivered directly by boat, eliminating the need to haul it by truck. The 49 MW plant will provide enough power for 50,000 homes, providing cleaner, lower-carbon baseload power for the region. Inside, the power plant will also contain offices, a visitors’ center and an education resource center for renewable energy.
It’s possible that human beings will simply never be able to figure out how to bring global warming under control — that having been warned about the greatest danger we ever faced, we simply won’t take significant action to prevent it. That’s the unavoidable conclusion of the conference that staggered to a close in the early hours of Saturday morning in Copenhagen. It was a train wreck, but a fascinating one, revealing an enormous amount about the structure of the globe.
Let’s concede first just how difficult the problem is to solve — far more difficult than any issue the United Nations has ever faced. Reaching agreement means overcoming the most entrenched and powerful economic interests on Earth — the fossil fuel industry — and changing some of the daily habits of that portion of humanity that uses substantial amounts of oil and coal, or hopes to someday soon. Compared to that, issues like the war in Iraq, or nuclear proliferation, or the Law of the Sea are simple. No one really liked Saddam Hussein, not to mention nuclear war, and the Law of the Sea meant nothing to anyone in their daily lives unless they were a tuna.
Faced with that challenge, the world’s governments could have had a powerful and honest conversation about what should be done. Civil society did its best to help instigate that conversation. In late October, for instance, 350.org — the organization of which I am a founder — held what CNN called the “most widespread day of political action in the planet’s history,” with 5,200 demonstrations in 181 countries all focused on an obscure scientific data point: 350 parts per million (ppm) of CO2, which NASA scientists have described as the maximum amount of carbon we can have in the atmosphere if we want a planet “similar to the one on which civilization developed, and to which life on Earth is adapted.”
In fact, that kind of scientific reality informed the negotiations in Copenhagen much more thoroughly than past conclaves — by midweek diplomats from much of the world were sporting neckties with a big 350 logo, and 116 nations had signed on to a resolution making that the dividing line. A radical position? In one sense, yes — it would take the quick transition away from fossil fuels to make it happen. But in another sense? The most conservative of ideas, that you might want to preserve a planet like the one you were born onto.
From the beginning, the most important nations chose not to go the route of truth-telling. The Obama administration decided not long after taking office that they would barely mention “global warming,” instead confining themselves to talking about “green jobs” and “energy security.” Perhaps they had no choice, and it was the only way to reach the U.S. Senate — we’ll never know, because they clung to their strategy tightly. On Oct. 24, when there were world leaders from around the globe joining demonstrations, they refused to send even minor officials to take part. Instead, they continued to insist on something that scientists kept saying was untrue: The safe level of carbon in the atmosphere was 450 ppm, and their plans would keep temperature from rising more than 2 degrees Celsius (3.6 F) and thus avoid “catastrophic consequences.” (Though since 0.8 degrees C had melted the Arctic, it wasn’t clear how they defined catastrophe).
In any event, even this unambitious claim was a sham. That’s strong language, so here’s what I mean. Thirty-six hours before the conference drew to a close, a leaked document from the UN Secretariat began circulating around the halls. It had my name scrawled across the front, not because I’d leaked it but apparently because it confirmed something I’d been writing for weeks here at Yale Environment 360 and elsewhere: Even if you bought into the idea that all we needed to do was keep warming to 2 degrees C and 450 ppm, the plans the UN was debating didn’t even come close. In fact, said the six-page report, the plans on offer from countries rich and poor, if you added them all up, would produce a world where the temperature rose at least 3 degrees C, and carbon soared to at least 550 ppm. (Hades, technically described). It ended with a classic piece of bureaucratic prose: Raising the temperature three degrees, said the anonymous authors, would “reduce the probability” of hitting the two degree target. You think?
The document helped make already-suspicious vulnerable nations even more suspicious. Remember: The reports from the Intergovernmental Panel on Climate Change have made it clear that a two-degree temperature rise globally might make Africa 3.5 degrees C hotter. Almost everyone thinks that even 450 ppm will raise sea level enough to drown small island nations. There wasn’t much solace in the money on offer: $10 billion in “fast start” money for poor nations (about $2.50 a head — I’d like to buy the world a Coke) and an eventual $100 billion in annual financial aid that U.S. Secretary of State Hillary Clinton promised when she arrived on Thursday morning. Even if that money ever materialized (Clinton couldn’t say where it would come from, except “special alternative financial means”) it wouldn’t do much good for countries that weren’t actually going to exist once sea levels rose. They were backed to the wall.
And so, they squawked. They didn’t knuckle under quite as easily as usual, despite the usual round of threats and bribes. (One island nation left a meeting with the U.S. fearing for its International Monetary Fund loans; one African nation left a meeting with the Chinese hoping for two new hospitals if only it would toe the line.)
This annoyed the powerful. When President Obama finally appeared on Friday, his speech to the plenary had none of the grace and sense of history that often mark his words — it was an exasperated and tight-lipped little dressing-down about the need for countries to take “responsibility.” (Which might have gone over better if he’d even acknowledged that the United States had some special historical responsibility for the fix we’re in, but the U.S. negotiation position all along has been that we owe nothing for our past. As always, Americans are eager for a fresh new morning). In any event, it didn’t suffice — other nations were still grumbling, and not just the cartoonish Hugo Chavez.
In fact, the biggest stumbling block to the kind of semi-dignified face-saving agreement most people envisioned was China. According to accounts I’ve heard from a number of sources, Obama met with 25 other world leaders after his press conference for a negotiating session. It was a disaster — China turned down one reasonable idea after another, unwilling to constrain its ability to burn coal in any meaningful way (and not needing to, since power, especially in any non-military negotiation, has swung definitively in its direction).
Copenhagen was a disaster. That much is agreed. But the truth about what actually happened is in danger of being lost amid the spin and inevitable mutual recriminations. The truth is this: China wrecked the talks, intentionally humiliated Barack Obama, and insisted on an awful "deal" so western leaders would walk away carrying the blame. How do I know this? Because I was in the room and saw it happen.
China's strategy was simple: block the open negotiations for two weeks, and then ensure that the closed-door deal made it look as if the west had failed the world's poor once again. And sure enough, the aid agencies, civil society movements and environmental groups all took the bait. The failure was "the inevitable result of rich countries refusing adequately and fairly to shoulder their overwhelming responsibility", said Christian Aid. "Rich countries have bullied developing nations," fumed Friends of the Earth International.
All very predictable, but the complete opposite of the truth. Even George Monbiot, writing in yesterday's Guardian, made the mistake of solely blaming Obama. But I saw Obama fighting desperately to salvage a deal, and the Chinese delegate saying "no", over and over again. Monbiot even approvingly quoted the Sudanese delegate Lumumba Di-Aping, who denounced the Copenhagen accord as "a suicide pact, an incineration pact, in order to maintain the economic dominance of a few countries".
Sudan behaves at the talks as a puppet of China, one of a number of countries that relieve the Chinese delegation of having to fight its battles in open sessions. It was a perfect stitch-up. China gutted the deal behind the scenes, and then left its proxies to savage it in public.
Here's what actually went on late last Friday night, as heads of state from two dozen countries met behind closed doors. Obama was at the table for several hours, sitting between Gordon Brown and the Ethiopian prime minister, Meles Zenawi. The Danish prime minister chaired, and on his right sat Ban Ki-moon, secretary-general of the UN. Probably only about 50 or 60 people, including the heads of state, were in the room. I was attached to one of the delegations, whose head of state was also present for most of the time.
What I saw was profoundly shocking. The Chinese premier, Wen Jiabao, did not deign to attend the meetings personally, instead sending a second-tier official in the country's foreign ministry to sit opposite Obama himself. The diplomatic snub was obvious and brutal, as was the practical implication: several times during the session, the world's most powerful heads of state were forced to wait around as the Chinese delegate went off to make telephone calls to his "superiors".
Shifting the blame
To those who would blame Obama and rich countries in general, know this: it was China's representative who insisted that industrialised country targets, previously agreed as an 80% cut by 2050, be taken out of the deal. "Why can't we even mention our own targets?" demanded a furious Angela Merkel. Australia's prime minister, Kevin Rudd, was annoyed enough to bang his microphone. Brazil's representative too pointed out the illogicality of China's position. Why should rich countries not announce even this unilateral cut? The Chinese delegate said no, and I watched, aghast, as Merkel threw up her hands in despair and conceded the point. Now we know why – because China bet, correctly, that Obama would get the blame for the Copenhagen accord's lack of ambition.
China, backed at times by India, then proceeded to take out all the numbers that mattered. A 2020 peaking year in global emissions, essential to restrain temperatures to 2C, was removed and replaced by woolly language suggesting that emissions should peak "as soon as possible". The long-term target, of global 50% cuts by 2050, was also excised. No one else, perhaps with the exceptions of India and Saudi Arabia, wanted this to happen. I am certain that had the Chinese not been in the room, we would have left Copenhagen with a deal that had environmentalists popping champagne corks in every corner of the world. ...
This does not mean China is not serious about global warming. It is strong in both the wind and solar industries. But China's growth, and growing global political and economic dominance, is based largely on cheap coal. China knows it is becoming an uncontested superpower; indeed its newfound muscular confidence was on striking display in Copenhagen. Its coal-based economy doubles every decade, and its power increases commensurately. Its leadership will not alter this magic formula unless they absolutely have to.
Copenhagen was much worse than just another bad deal, because it illustrated a profound shift in global geopolitics. This is fast becoming China's century, yet its leadership has shown that multilateral environmental governance is not only not a priority, but is viewed as a hindrance to the new superpower's freedom of action.
Plans to develop the UK's first commercial-scale geothermal power station in Cornwall have secured nearly £1.5m of government funding.
The power plant - using "hot rocks" technology - is to be based at Redruth and would provide electricity and heat for homes and businesses.
Geothermal Engineering Ltd (GEL) has gained the grant from the Department of Energy and Climate Change. The total cost of building the plant will be about £40m.
The power plant will work by pumping water deep underground to be warmed by the earth's natural heat and then returned to the surface. The heated water would power turbines, generating electricity and heat.
A vast fan-shaped compound in China has officially taken the title of “largest solar-powered office building in the world“. Located in Dezhou in northwestern Shandong Province, the 75,000-square-meter structure is a multi-use building and features exhibition centers, scientific research facilities, meeting and training facilities, and a hotel – all of which run on solar power.
The design of the new building is based on the sundial and “underlines the urgency of seeking renewable energy sources to replace fossil fuels.” Aside from the obvious sustainable nature of the solar-panel-clad exterior, other green features include advanced roof and wall insulation practices resulting in energy savings 30% beyond the national standard. In addition, the external structure of the building used a mere 1% of the amount of steel used to construct the Bird’s Nest.
For decades researchers have investigated a theoretical means to double the power output of solar cells--by making use of so-called "hot electrons." Now researchers at Boston College have provided new experimental evidence that the theory will work. They built solar cells that get a power boost from high-energy photons. This boost, the researchers say, is the result of extracting hot electrons.
The results are a step toward solar cells that break conventional efficiency limits. Because of the way ordinary solar cells work, they can, in theory, convert at most about 35 percent of the energy in sunlight into electricity, wasting the rest as heat. Making use of hot electrons could result in efficiencies as high as 67 percent, says Matthew Beard, a senior scientist at the National Renewable Energy Laboratory in Golden, CO, who was not involved in the current work. Doubling the efficiency of solar cells could cut the cost of solar power in half.
Carbon Commentary has an excellent article on how renewables can provide large amounts of electricity to an electrical grid - in this case Spain's - Spain’s variable wind and stable electricity networks. Yet if you listened to some people spouting PR for the nuclear power industry, you'd believe this is impossible.
One of the frequent criticisms of wind energy is that national distribution systems (‘the grid’) cannot cope with large numbers of turbines because of the variability and unpredictability of their output. Grids need to match supply and demand precisely, the critics say, and because wind varies so much it causes huge problems. Recent data from two meteorologically unusual days in Spain – the world leader in the management of renewable energy supplies – shows this assertion is almost certainly false.
* During part of 8 November, Spain saw over 50% of its electricity come from turbines as an Atlantic depression swept over the country’s wind parks. (They are so big that no one seems to call them ‘farms’.) Unlike similar times in November 2008, when Spanish turbines were disconnected because the grid had an excess of electricity, the system accepted and used all the wind power that was offered to it.

* A very different event in January of this year saw unexpectedly high winds shut down most of the country’s turbines with little warning. The grid coped with this untoward incident as well.

These two events show that a well run transmission system can cope with extreme and unexpected events even with a large fraction of power provided by wind.
Over the course of this year Spain will generate about 14% of its total electricity from wind and this number is likely to rise to the high twenties by 2020. Spain is showing the rest of the world that these figures are not incompatible with grid stability. Although wind is ‘variable’, ‘intermittent’ and ‘unpredictable’, a well functioning grid system can still use wind to help stabilise electricity costs, reduce carbon emissions and improve energy security.
At some periods on the night of 8/9 November, wind provided 53% of Spain’s need for electricity. This was a new record for the Spanish system. As the country continues to install thousands of new wind turbines a year, this record will not stand for long.
Spain is able to manage the integration of wind power into its grid primarily because it has reasonable amounts of hydro-electricity and pumped storage.[2] Hydro-electricity can be used when winds are less than expected and pumped storage can assist both when wind is unexpectedly high or unexpectedly low.
One of the main criticisms levelled at wind is that its power is so unpredictable that huge amounts of fossil fuel generating capacity needs to be kept ready to replace it at a moment’s notice. Those antagonistic to wind believe that the carbon cost of keeping power stations in a state of what the industry calls ‘spinning reserve’ is enormous. Power stations, they say, are burning fuel so that they can instantaneously start producing electricity if and when the wind drops.
But is wind so variable that power stations need to provide immediate backup? The utterly superb REE website provides easy-to-use data to test this theory. I’ve used this data to try to demonstrate that wind production was remarkably consistent during the peak day of 8 November.[3] Not only is wind speed largely predictable with good meteorology, but REE data shows that even in the windy days of early November, the amount of electricity generated only varied gradually.
During this 24-hour period the total generated varied from about 9.3 gigawatts (9,300 megawatts) at the start, to a peak of around 11.5 gigawatts at about 14.30 in the afternoon. For most of the day, the wind output was very stable around 10 gigawatts. (The wind output estimate is provided every ten minutes on the REE website.) The mean percentage variation from one reading to the next was 0.72%. On only three occasions out of 143 observations did the output vary more than 2% between two readings.
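The statistic is straightforward to reproduce from the ten-minute readings REE publishes (144 readings a day, hence 143 step-to-step comparisons). A minimal sketch; the sample values below are placeholders, not the real download:

```python
# Step-to-step variability of wind output from ten-minute readings,
# as described above. 'readings' is illustrative placeholder data;
# the real series (in MW) comes from the REE website.
readings = [9300, 9350, 9410, 9390, 9480, 9520]  # MW, placeholders

steps = [abs(b - a) / a * 100.0 for a, b in zip(readings, readings[1:])]
mean_step = sum(steps) / len(steps)
big_steps = sum(1 for s in steps if s > 2.0)

print(f"Mean step-to-step variation: {mean_step:.2f}%")
print(f"Steps larger than 2%: {big_steps} of {len(steps)}")
```

With the actual REE series in place of the placeholders, this is the calculation behind the 0.72% mean variation and the three out of 143 readings quoted above.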
When the wind is blowing strongly, any local variations in wind speeds tend to be balanced out by compensating changes elsewhere. A country like Spain, with over ten thousand turbines spread across a large landmass, will have low variability of electricity output from wind. As a country adds wind turbines, the degree of variability in electricity output will tend to fall. In Spain, the variations on 8/9 November represented no threat to the stability of the electricity system, even when wind was meeting half of total power demand.
Outside Egypt's capital, in the shadow of the Pyramids and tucked in the mountains of Mokattam, is an incredible city that literally survives on trash. Garbage City, as it's known, is home to 30,000 Zabaleens - Coptic Christians from southern Egypt - who, each day, enter Cairo and collect its waste. Some 60 percent of the trash produced in Cairo passes through Garbage City to be recycled. It is an amazing sight, awash in refuse.
Recently, photographers Bas Princen, Klavs Bo Christensen and Alexander Heilner visited Garbage City and returned with some captivating images that depict the close, day-to-day relationship between the Zabaleens and the garbage. Piles of the stuff are virtually everywhere, a fact that these recyclers-by-trade seem none too concerned with.
The garbage collecting process is so organized that Cairo has had no need to create a government-sponsored program, relying fully upon the residents of Garbage City to collect its trash. Just a single pair of Zabaleens, working with a horse-drawn carriage, can collect the trash from 350 of Cairo's homes and businesses. They are not paid for their labor either, as the profit from recycling is enough for many to live on.
Nostalgia aside, there's a lot to be learned from the rise and fall of appropriate tech in the 1970s, and one of its lessons bears directly on the theme of this series of posts. For many of the people involved in it back then, appropriate tech was the inevitable wave of the future; nearly everyone assumed that energy costs would continue to rise as the limits to growth clamped down with increasing force, making anything but Ecotopia tantamount to suicide. A formidable body of thought backed those conclusions, and the core of that body of thought was systems theory.
Nowadays, the only people who pay attention to systems theory are specialists in a handful of obscure fields, and it can be hard to remember that forty years ago systems theory had the same cachet that more recently gathered around fractals and chaos theory. Born of a fusion between ecology, cybernetics, and a current in contemporary philosophy best displayed in Jan Smuts' Holism and Evolution, systems theory argued that complex systems -- all complex systems -- shared certain distinctive traits and behaviors, so that insights gained in one field of study could be applied to phenomena in completely different fields that shared a common degree of complexity.
It had its weaknesses, to be sure, but on the whole, systems theory did exactly what theories are supposed to do – it provided a useful toolkit for making sense of part of the universe of human experience, posed plenty of fruitful questions for research, and proved useful in a sizable range of practical applications. As popular theories sometimes do, though, it became associated with a position in the cultural struggles of the time, and as some particularly unfortunate theories do, it got turned into a vehicle for a group of intellectuals who craved power. Once that happened, systems theory became another casualty of Weishaupt's Fallacy.
Those of my readers who don't pay attention to conspiracy theory may not recognize the name of Adam Weishaupt; those who do pay attention to conspiracy theory probably "know" a great deal about him that doesn't happen to be true. He was a professor of law at the University of Ingolstadt in Bavaria in the second half of the eighteenth century, and he found himself in an awkward position between the exciting new intellectual adventures coming out of Paris and the less than exciting intellectual climate in conservative, Catholic Bavaria. In 1776, he and four of his grad students founded a private society for enthusiasts of the new Enlightenment thought; they went through several different names for their club before finally settling on one that stuck: the Bavarian Illuminati.
Yes, those Bavarian Illuminati.
There were a fair number of people interested in avant-garde ideas in and around Bavaria just then, and, before too long, the Illuminati had several hundred members. This gave Weishaupt and his inner circle some grandiose notions about where all this might lead. Pretty soon, they hoped, all the movers and shakers in Bavaria – not to mention the other microkingdoms into which Germany was divided at that time – would join the Illuminati and stuff their heads full of Voltaire and Rousseau, and then the whole country would become, well, illuminated.
They were still telling themselves that when the Bavarian government launched a series of police raids that broke the back of the organization. Weishaupt got out of Bavaria in time, but many of his fellow Illuminati were not so lucky, and a great deal of secret paperwork got scooped up by the police and published in lavish tell-all books that quickly became bestsellers all over Europe. That was the end of the Illuminati, but not of their reputation; reactionaries found that blaming the Illuminati for everything made great copy, not least because they weren't around any more and so could be redefined with impunity – liberal, conservative, Marxist, capitalist, evil space lizards, you name it.
The problem with Professor Weishaupt's fantasy of an illuminated Bavaria was a bit of bad logic that has been faithfully repeated by intellectuals seeking power ever since: the belief, as sincere as it is silly, that if you have the right ideas, you are by definition smarter than the system you are trying to control. That's Weishaupt's Fallacy. Because Weishaupt and his fellow Illuminati were convinced that the conservative forces in Bavaria were a bunch of clueless boors, they were totally unprepared for the counterblow that followed once the Bavarian government figured out who the Illuminati were and what they were after.
For a more recent example, consider the rise and fall of the neoconservative movement, which stormed into power in the United States in 2000 boldly proclaiming the arrival of a "new American century," and proceeded to squander what remained of America's wealth and global reputation in a series of foreign and domestic policy blunders that have set impressive new standards for political fecklessness. The neoconservatives were convinced that they understood the world better than anybody else. That conviction was the single most potent factor behind their failure; when mainstream conservatives (not to mention everybody else!) tried to warn them where their fantasies of remaking the Middle East in America's image would inevitably end, the neoconservatives snorted in derision and marched straight on into the disaster they were making for themselves, and of course for the rest of us as well.
Systems theory was a victim of the same fallacy. The systems movement, to coin a label for the heterogeneous group of thinkers and policy wonks that made systems theory its banner, had ambitions no less audacious than the neoconservatives, though aimed in a completely different direction. Their dream was world systems management. Such leading figures in the movement as Jay Forrester of MIT and Aurelio Peccei of the Club of Rome agreed that humanity's impact on the planet had become so great that methods devised for engineering and corporate management – in which, not coincidentally, they were expert – had to be put to work to manage the entire world.
The study that led to the 1972 publication of The Limits to Growth was one product of this movement. Sponsored by Peccei's Club of Rome and carried out by a team led by one of Forrester's former Ph.D. students, it applied systems theory to the task of making sense of the future, and succeeded remarkably well. As Graham Turner's study "A Comparison of The Limits to Growth with Thirty Years of Reality" (CSIRO, 2008) points out, the original study's baseline "Standard Run" scenario matches the observed reality of the last three and a half decades far more closely than rival scenarios.
It's not often remembered, though, that the Club of Rome followed up The Limits to Growth with a series of further studies, all basically arguing that the problems outlined in the original study could be solved by planetary management on the part of a systems-savvy elite. The same notions can be found in dozens of similar books from the same era – indeed, it's hard to think of a systems thinker with any public presence in the 1970s who didn't publish at least one book proposing some kind of worldwide systems management as the only alternative to a very messy future.
It's only fair to stress the role that idealism and the best intentions played in all this. Still, the political dimensions shouldn't be ignored. Forrester, Peccei, and their many allies were, among other things, suggesting that a great deal of effective power be given to them, or to people who shared their values and goals. Since the systems movement was by no means politically neutral – quite the contrary, it aligned itself forcefully with specific ideological positions in the fractured politics of the decade – that suggestion was bound to evoke a forceful response from the entire range of opposing interests.
The Reagan revolution of 1980 saw the opposition seize the upper hand, and the systems movement was among the big losers. Hardball politics have always played a significant role in public funding of research in America, so it should have come as no surprise when Reagan's appointees all but shut off the flow of government grants into the entire range of initiatives that had gathered around the systems theory approach. From appropriate tech to alternative medicine to systems theory itself, entire disciplines found themselves squeezed out of the government feed trough, while scholars who pursued research that could be used against the systems agenda reaped the benefits of that stance. Clobbered in its most vulnerable spot – the pocketbook – the systems movement collapsed in short order.
What made this implosion all the more ironic is that a systems analysis of the systems movement itself, and its relationship to the wider society, might have provided a useful warning. Very few of the newborn institutions in the systems movement were self-funding; from prestigious think tanks to neighborhood energy-conservation schemes, most of them subsisted on government grants, and thus were in the awkward position of depending on the social structures they hoped to overturn. That those structures could respond homeostatically to oppose their efforts might, one would think, be obvious to people who were used to the strange loops and unintended consequences that pervade complex systems.
Still, Weishaupt's Fallacy placed a massive barrier in the way of such a realization. Read books by many of the would-be global managers of the 1970s and you can very nearly count on being bowled over by the scent of intellectual arrogance. The possibility that the system they hoped to manage might, in effect, have been more clever than they were probably crossed very few minds. Yet that's how things turned out; at the end of the day, the complex system that was American society had reacted, exactly as systems theory would predict, to neutralize a force that threatened to push it out of its preferred state.
Jamais at Open The Future has a post on the second biggest contributor to global warming - "black carbon" - None More Black.
An aerosol known as "black carbon," a primary component in soot, looks to be a key driver of anthropogenic global warming in tropical locations around the world -- most notably, in the Himalayan region.
...new research, by NASA’s William Lau and collaborators, reinforces with detailed numerical analysis what earlier studies suggest: that soot and dust contribute as much (or more) to atmospheric warming in the Himalayas as greenhouse gases. This warming fuels the melting of glaciers and could threaten fresh water resources in a region that is home to more than a billion people.
[...] Nicknamed the “Third Pole”, the region in fact holds the third largest amount of stored water on the planet beyond the North and South Poles. But since the early 1960s, the acreage covered by Himalayan glaciers has declined by over 20 percent. Some Himalayan glaciers are melting so rapidly, some scientists postulate, that they may vanish by mid-century if trends persist. Climatologists have generally blamed the build-up of greenhouse gases for the retreat, but Lau’s work suggests that may not be the complete story.
He has produced new evidence suggesting that an “elevated heat pump” process is fueling the loss of ice, driven by airborne dust and soot particles absorbing the sun’s heat and warming the local atmosphere and land surface.
Globally, black carbon looks to be the second most-important warming agent after CO2.
Here's the twist: much of the production of black carbon comes from the combustion of biofuels and diesel, the two leading "greener" fuel technologies.
Aerosols last for months in the atmosphere, as opposed to the decades that greenhouse gases can last. This is good, as it means that policies that reduce the production of black carbon can start showing positive results in a matter of weeks.
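A rough way to see why those short lifetimes matter so much for policy is to treat each pollutant as decaying exponentially once emissions are cut. The lifetimes in this sketch are illustrative round numbers, not measured values – it shows the shape of the effect, nothing more:

import math

def remaining_fraction(years, lifetime_years):
    """Fraction of a pulse still airborne after `years`, assuming
    simple exponential decay with the given e-folding lifetime."""
    return math.exp(-years / lifetime_years)

BLACK_CARBON_LIFETIME = 0.25   # ~3 months, in years (illustrative)
GHG_LIFETIME = 50.0            # "decades", as a round illustrative figure

for years in (0.5, 1.0, 5.0):
    bc = remaining_fraction(years, BLACK_CARBON_LIFETIME)
    ghg = remaining_fraction(years, GHG_LIFETIME)
    print(f"after {years} yr: black carbon {bc:6.1%} left, long-lived gas {ghg:6.1%} left")

Half a year after a cut, almost nothing of the black carbon pulse remains, while the long-lived gas has barely budged – which is why black carbon policies show results in weeks rather than generations.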
InfranetLab has an interesting post on the changing geography (and fish-ography) of the east African lake system - Wet Borders: Microslums and Meanders.
Migingo Island, home to some 300 residents, sits precariously within Lake Victoria along the watery border of Uganda and Kenya. Accounts of its origins compete: either a) two Kenyan fishermen settled there in 1991, or b) a Ugandan fisherman claims to have settled there in an abandoned house in 2004. Regardless, since that time the place has really taken off – becoming what one journalist called a microslum. Each successive year that the level of Lake Victoria decreased, the originally rocky tip exposed greater landmass to occupy. So, complicating matters is Lake Victoria’s rapidly receding waterline. But why here, why such a precious outpost?
It's all about the perch, Nile perch. Fishing in Lake Victoria, one of the largest bodies of fresh water, is essential to some of the 30 million Africans that live within its reach. Nile perch was introduced here in the 1950s and has risen to become an essential part of the economy of Lake Victoria’s fishery. (The perch was so successful in rejuvenating the fishing economy here that it decimated nearly 350 native fish species to rise to the top of the chain.) This success means that in recent years the Nile perch populations have dwindled and many native species are thought to be recovering. But really the whole Nile perch story, which in a Jared Diamond-esque way ultimately leads to weapons, is epic enough to be a film in its own right.
Fishing supports an export industry in East Africa whose value is estimated at US$250 million annually. And the convenience of Migingo Island in this tightening economy (and shrinking ecology) has placed extreme pressure on the island, with Ugandan police patrolling the waters and intercepting catch from Kenyan fishermen. Several locals involved in the dispute have even claimed that the fish are Kenyan because of which side of the border they breed on. Another strange claim is that the land belongs to Kenya but the water belongs to Uganda. And the dispute continues, as on the island itself Ugandans and Kenyans exist within different ‘neighborhoods’ on this tiny acre of rock.
Kiashu at GWAG has a simple and easy to understand explanation of the problem with nuclear power - Nukes are stupid.
“Hey, we’ve got a problem with our energy.”
“What’s that?”
“Well, we’re relying on fossil fuels. We’re burning a finite resource, it’s not like iron or something that we can recycle, once we burn it, it’s gone.”
“Hey, I’ve got the solution!”
“Yeah?”
“Yeah! We’ll change to burning another finite resource!”
“Brilliant!”
“Even better, this finite resource is really hard to burn well and safely, so we’ll need the best and most conscientious engineers and inspection regime ever just to make sure hundreds of thousands of people don’t get killed.”
“Awesome! But isn’t there a problem with waste, which no-one in the world has ever managed to permanently store safely?”
"There is that. But some people have plans for breeder reactors. These take ordinary depleted uranium and turn it into plutonium - they make new fuel! So we'd end up with at least one hundred times more fuel than we have now."
"But I thought that breeder reactors were unsafe and didn't produce very much fuel anyway?"
"That's all just greenie propaganda!"
"Not actual facts, then?"
"Okay it's facts, but the new designs will be safe and efficient, honest!"
"Won't there still be deadly waste, though?"
“Sure! But on the other hand, the waste from the burning process is also good to make weapons with, weapons which can kill millions in an instant. In an age of failing states, that’s just what we need!”
“Sounds great! When can we start?”
“Well, first we must get official approval, and push the official approval over the protests of the public, for some reason those idiots are against it.”
“Can’t think why.”
I believe in democracy. That’s why I propose that people should get to vote on what power source they want in their backyard. Because in the end, whether it’s objectively a good or bad idea, if that’s what the people want then they should get it. For some reason, those in favour of nuclear and fossil fuels never support my idea for a vote. I wonder why?
Austria, 1978 – in a referendum the Austrians voted 50.5% against nuclear power. They had one reactor under construction; plans to finish it were not set aside until after Chernobyl.
Sweden, 1980 – 12 reactors; they voted to “keep the 12 reactors in operation, but to shut them down at a later date by taking into consideration the welfare of the country and its economic development and the supply and demand of power in Sweden.” They have closed down 2. Basically they built up other sources and found they enjoyed all that extra electricity (they now have about 25,000 kWh per capita annually, twice the US and three times Germany or Denmark).
Italy, 1987 – in a referendum the people voted to abolish nuclear power in their country, closing the three power plants which had in any case been closed since Chernobyl. They’re still closed, but Italy does not scruple to buy nuclear-generated electricity from France.
Japan, Maki, 1996 – residents of the town voted 60% to refuse land for building a reactor. Plans to build it there were dropped.
Taiwan, Yenliao, 1994 – residents voted 96.2% against two reactors in their region. Plans to build them there were dropped.
Switzerland has had many referenda on nuclear energy, all of which resulted in votes to keep it or not to phase it out.
Thus, it can be fairly said that of all the countries and regions in the world where citizens have been given a choice about nuclear energy, only the Swiss have chosen to have it. All the others given a choice have rejected it. But most countries have never bothered to ask their citizens. I wonder why?
Confidential contracts detailing Monsanto Co.'s business practices reveal how the world's biggest seed developer is squeezing competitors, controlling smaller seed companies and protecting its dominance over the multibillion-dollar market for genetically altered crops, an Associated Press investigation has found.
With Monsanto's patented genes being inserted into roughly 95 percent of all soybeans and 80 percent of all corn grown in the U.S., the company also is using its wide reach to control the ability of new biotech firms to get wide distribution for their products, according to a review of several Monsanto licensing agreements and dozens of interviews with seed industry participants, agriculture and legal experts.
Declining competition in the seed business could lead to price hikes that ripple out to every family's dinner table. That's because the corn flakes you had for breakfast, soda you drank at lunch and beef stew you ate for dinner likely were produced from crops grown with Monsanto's patented genes.
Monsanto's methods are spelled out in a series of confidential commercial licensing agreements obtained by the AP. The contracts, as long as 30 pages, include basic terms for the selling of engineered crops resistant to Monsanto's Roundup herbicide, along with shorter supplementary agreements that address new Monsanto traits or other contract amendments.
The company has used the agreements to spread its technology — giving some 200 smaller companies the right to insert Monsanto's genes in their separate strains of corn and soybean plants. But, the AP found, access to Monsanto's genes comes at a cost, and with plenty of strings attached.
For example, one contract provision bans independent companies from breeding plants that contain both Monsanto's genes and the genes of any of its competitors, unless Monsanto gives prior written permission — giving Monsanto the ability to effectively lock out competitors from inserting their patented traits into the vast share of U.S. crops that already contain Monsanto's genes.
Monsanto's business strategies and licensing agreements are being investigated by the U.S. Department of Justice and at least two state attorneys general, who are trying to determine if the practices violate U.S. antitrust laws. The practices also are at the heart of civil antitrust suits filed against Monsanto by its competitors, including a 2004 suit filed by Syngenta AG that was settled with an agreement and ongoing litigation filed this summer by DuPont in response to a Monsanto lawsuit.
As a multi-year victim of the British transport system I find the idea of clean, new fast trains running on UK tracks almost inconceivable - but as it's not April 1st I'll assume this story from Inhabitat is true - High-Speed Javelin Trains Arrive in the UK.
This week the UK joined the ranks of many other European countries as it unveiled its first domestic high-speed rail service. The 140 mph Japanese-built Javelin trains will be slightly more expensive to ride than their slow-poke counterparts, but they will drastically cut down on journey times.
For example, a train ride from London to Ramsgate is reduced from 81 minutes to half an hour, and an Ashford to London trip is cut down from over an hour to 37 minutes. Such time-saving trips come at a price – a 7.3% increase in the case of some off-peak fares. But hopefully, the shorter trip times will still be attractive enough to dissuade commuters from driving.
Eventually, Britain’s high-speed rail network will expand even further. Prime Minister Gordon Brown recently announced a $32 billion investment in railway infrastructure, and a North-South high-speed rail network is planned in the coming years.
A Canadian startup has built a pilot desalination plant in Vancouver that uses a quarter of the energy of conventional plants to remove salt from seawater. The process relies on concentration gradients, and the natural tendency of sodium and chloride ions--the key components of salt--to flow from higher to lower salinity concentrations. If the system can be scaled up it could offer a cheaper way to bring drinking water to the planet's most parched regions while leaving behind a much lower carbon footprint than other desalination methods.
"We've taken it from a benchtop prototype to a fully functional seawater pilot plant," says inventor Ben Sparrow, a mechanical engineer who established Saltworks Technologies in 2008 to commercialize the process. "The plant is currently running on real seawater, and we're in the final stage of expanding it to a capacity of 1,000 liters a day."
Today most desalination plants are based on one of two approaches. One is distillation through an evaporation-condensation cycle, and the other is membrane filtration through reverse osmosis. But both options are energy-intensive and costly.
Saltworks takes a completely different approach based on the principles of ionic exchange. The process begins with the creation of a reservoir of seawater that is evaporated until its salt concentration rises from 3.5 percent to 18 percent or higher.
The evaporation is done in one of two ways: either the seawater is sprayed into a shallow pond exposed to sunlight and dry ambient air, or seawater is kept in a large tower that's exposed to waste heat from a neighboring industrial facility. The second approach is used in the commercial-scale plant. The concentrated water is then pumped at low pressure into the company's desalting unit along with three separated streams of regular seawater. At this point the most energy-intensive part of the process is already over.
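The arithmetic behind that evaporation step is just conservation of salt: if the salt stays behind while water leaves, the two concentration figures quoted above fix how much of the seawater has to go. A minimal sketch, using only the 3.5 and 18 percent figures from the article:

def fraction_evaporated(initial_salinity, final_salinity):
    """Fraction of the original seawater mass that must evaporate to raise
    the salt concentration from initial_salinity to final_salinity (both
    mass fractions), assuming all the salt stays behind."""
    return 1.0 - initial_salinity / final_salinity

print(f"{fraction_evaporated(0.035, 0.18):.1%} of the seawater mass must evaporate")
# -> roughly 80.6%, which is why the process leans on free sunlight or waste heat

Moving that much water by boiling it would dominate the energy budget, so doing it with ambient evaporation or industrial waste heat is where the claimed savings come from.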
Topic du jour is the Australian government's sneaky pre-Christmas release of their plans to censor the internet, after some mock trials "proving" the filter will be effective and won't have any adverse side effects - Crikey leads off with - Conroy’s internet filter: so what?.
“Our pilot, and the experience of ISPs in many Western democracies, shows that ISP-level filtering of a defined list of URLs can be delivered with 100% accuracy,” Senator Stephen Conroy said yesterday when announcing that mandatory internet censorship — sorry, “filtering” — is going ahead.
“It also demonstrated that it can be done with negligible impact on internet speed.”
Conroy is right on both counts, as it happens — provided you gloss over that reference to “many” unnamed democracies. I wouldn’t call a dozen countries with ISP-level filtering “many”, and in some of them filtering isn’t mandatory. And provided you restrict your aims precisely to those carefully worded factoids cherry-picked from Enex TestLab’s trial report.
And provided you never make a mistake.
Blocking a defined list of URLs [specific web addresses] such as the ACMA blacklist of Refused Classification material, even 100% of it, falls far short of “protecting” children from “inappropriate” material, to use the wording of Labor’s original cyber-safety policy.
Google’s index passed a trillion web pages a year and a half ago. ACMA’s manually compiled blacklist of a thousand-odd URLs reported by concerned citizens is a token drop in that ocean, a mere 0.0000001%.
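That proportion checks out with a one-line calculation (both figures as quoted above):

# A thousand-odd URLs against a trillion-page index; figures as quoted.
print(f"{1_000 / 1_000_000_000_000:.7%}")   # prints 0.0000001%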
ACMA told Senate Estimates that of the 1175 URLs on their blacklist on September 30, 54% were Refused Classification material, and only 33% of those related to child sexual abuse. The rest of the blacklist? 41% was X18+ material, and 5% was R18+ material without a “restricted access system” to prevent access by minors.
The same key problems with a filter-based approach, which Crikey has reported many times before, are confirmed by the Enex report.
If you go beyond the pre-defined ACMA blacklist to catch a wider range of content, the false positive rate — material blocked when it shouldn’t be — is still up to 3.4%. Enex’s examples include the incorrect blocking of “sperm whales” and “robin red breast”. In the industry, this is known as the Scunthorpe Problem.
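The Scunthorpe Problem is trivially easy to reproduce: any filter that does naive substring matching against a blocklist will flag harmless phrases. A minimal sketch, with an invented blocklist and test phrases standing in for whatever the trialled filters actually used:

# Naive substring blocking: the blocklist and phrases here are illustrative.
BLOCKLIST = {"sperm", "breast", "sex"}

def is_blocked(text):
    """Block if any listed string appears anywhere in the text,
    even inside a longer, perfectly innocent word or phrase."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)

for phrase in ("sperm whales", "robin red breast", "Essex tourism"):
    print(f"{phrase!r}: {'BLOCKED' if is_blocked(phrase) else 'ok'}")
# All three print BLOCKED -- every one of them a false positive.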
Australia’s biggest telco, Telstra, wasn’t part of the official trial, but it conducted its own tests and discussed the results with Enex.
“Telstra found its filtering solution was not effective in the case of non-web based protocols such as instant messaging, peer-to-peer [file sharing like BitTorrent] or chat rooms. Enex confirms that this is also the case for all filters presented in the pilot.”
For all filters.
Telstra also reported that its filtering system could be overloaded if pages on heavy traffic sites like YouTube ended up on the blacklist. Every request for anything on YouTube would have to be routed to the secret filter box to see whether it was listed.
“This is also the case for all filters presented in the pilot,” reports Enex.
For all filters.
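A back-of-envelope model shows why this bites every hybrid design: the filter box only inspects traffic to domains hosting a blacklisted URL, but listing one URL on a huge site diverts all of that site's traffic through the box. The request rates and capacity below are made-up numbers purely for illustration:

TRAFFIC = {                       # requests/sec reaching the ISP (invented)
    "small-example-site.com": 2,
    "youtube.com": 50_000,
}
FILTER_BOX_CAPACITY = 5_000       # requests/sec the box can inspect (invented)

def diverted_load(blacklisted_domains):
    """Requests/sec the filter box must inspect: every request to any
    domain hosting a blacklisted URL, not just requests for the bad URL."""
    return sum(rate for domain, rate in TRAFFIC.items()
               if domain in blacklisted_domains)

for listed in ({"small-example-site.com"},
               {"small-example-site.com", "youtube.com"}):
    load = diverted_load(listed)
    verdict = "OK" if load <= FILTER_BOX_CAPACITY else "OVERLOADED"
    print(f"{sorted(listed)}: {load} req/s -> {verdict}")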
In any event, as the Enex report reminds us, “A technically competent user could, if they wished, circumvent the filtering technology.” In its own tests, Telstra didn’t even bother testing circumvention because they take it as given.
Bernard Keane thinks it's just another bizarre example of Labor's urge to play wedge politics instead of governing responsibly - Net filtering won’t work, so what is Conroy up to? (I think he's underestimating their control freak impulses personally, but he may have a point).
It’s been quite some time since I’ve seen as breathtakingly mendacious a policy announcement as yesterday’s declaration by Stephen Conroy that the government would introduce internet censorship.
It’s one thing to hold off on an announcement (which Conroy admitted he’d been sitting on since October) until the week before Christmas, when half the serious journalists in the country are on the other side of the world. That had its reward, with minimal, and decidedly thin, coverage of the announcement in the mainstream media today.
It’s quite another, even in these days of spin and media management, for a government minister to stand up and blatantly declare that black is white, and the government will be proceeding on the basis of that fact.
The internet “filtering” trial — perhaps we should drop the “filter” term, and call it what it is, censorship — was carefully structured by the government so that the filtering technology tested would meet low benchmarks and limited performance requirements. But it looks an awful lot like one of the reasons the government sat on the trial outcome for so long was because most of the trial results failed to meet even the minimal hurdles set up by the government.
On the basis of the trial report, even advocates of censorship could not support what Conroy has proposed, on the basis that it just doesn’t work.
That’s why Conroy, in charging ahead yesterday, had to tell a series of patent untruths. That filtering could be done with “100% accuracy”, when the trial saw up to 3.4% of web content (which means tens of millions of web pages worldwide) wrongly blocked.
That the “wild claims” that censorship affects internet speed have been “put to bed” when the trial, despite trying to define the problem away by declaring “negligible” effect on usage speed as less than 10%, saw speed reductions of 30-40%.
Or the big lie, that filtering works, when several filters were bypassed more often than not (in one case, more than 90%), and the only filter that defeated nearly all efforts to circumvent it was the one with the 40%+ performance degradation. ...
The government’s real objective here is to shore up its family-friendly credentials. While the technologically literate may laugh at the trial outcome, and free speech advocates rail at censorship, Kevin Rudd and Stephen Conroy know they’re a tiny minority of voters. This is all about giving ill-informed and often lazy parents, most of whom think that you can “stumble upon” p-rnography on the internet, the illusion that their children are safe, even as their kids circumvent the mechanism and go looking for s-xual material, which is what kids have always done. That parents should be active monitors of what their kids consume in the media is apparently old-fashioned thinking.
It isn’t about changing votes, so much as solidifying the government’s branding in the minds of mainstream voters as morally middle-of-the-road and supportive of families.
The other target is the coalition. Hitherto, particularly under Nick Minchin, the coalition has been hostile to the filtering scheme. But in the end, the coalition — which in the face of Green opposition will be necessary for Conroy’s Bill to pass the Senate — may struggle to oppose it. Blocking the Bill will enable the government to portray the coalition as out-of-touch with families and “mainstream values”. The value of censorship as a wedge far exceeds any losses that will accrue from a few IT nerds.
And if the technically competent, as the report says, can bypass these filters easily, what’s the issue? Geeks can have an uncensored internet, while your average suburban mum and dad are happy their kids won’t be clicking onto child abuse while doing their homework.
This is where this political stunt has serious consequences, and where the issue stops being about the ineffectiveness of filtering technology and about freedom of speech. Conroy insists that the censorship will only be about RC-material. “So for people wanting to campaign on the basis that we’re going to maybe slip political content in — we will never support that. And if someone proposes that I will be on the floor of Parliament arguing against it.”
Good to hear, minister, and I actually believe you. But you’re in effect asking us to trust not just you but every politician in the future. We’ve all seen the confected moral panics that the tabloid media, and politicians, are happy to use. Maybe it’s an unsavoury incident on a reality TV show. Maybe it’s a particularly foul-mouthed chef. The results are the same — the demand for politicians to censor, to block, to ban and restrict.
And that’s before we get to the moralisers and the demonisers. Maybe it’s euthanasia, accepted and legal in other countries but banned from discussion in Australia. Maybe it’s junk-food advertising, or alcohol advertising, another alleged source of vexation to parents.
The government’s censorship proposal locks in a universal mechanism that can be extended at will by politicians. Those who want to circumvent it will be able to, yes, but the bulk of the population will be subject to it, barely aware that it’s there — like they are barely aware that politicians have already banned the online expression of certain ideas such as euthanasia.
Senator Conroy thinks he can sneak his plan to censor the internet in as Australia settles in for Christmas. As he considers the future of the scheme he needs to know that we'll be watching every step of the way.
At this crucial moment send Senator Conroy a quick message to let him know what you think of his plans to censor Australia's internet.
Crikey's Bernard Keane has an interesting essay on how to avoid getting a form letter response to your complaints to the government - and how to make them aware of the impact of clogged bandwidth - Bernard Keane’s guide to writing to Ministers.
If your first instinct upon hearing about the Rudd-Conroy plan to censor the internet is to email Stephen Conroy, your local member and Labor senators from your state to protest, wait up.
Or, in fact, do it anyway, then read this.
Let me explain some facts about writing to ministers, drawn from my sordid, blood-soaked and adventure-filled time as a public servant.
For a start, understand that few ministers if any read their correspondence. It’s not that they don’t care, it’s that it’s not humanly possible to read even a fraction of the amount of emails, faxes and letters they get. So the chances of you directly influencing a Minister with your particularly brilliant insight into the issue are zip. Deal with it. Things don’t work like that.
Their staff will read correspondence, but only when considering a reply prepared by their Department.
And that is only a small proportion of the actual volume of correspondence received. Some is answered directly by bureaucrats. But much of it is simply binned. Don’t waste your time sending off a letter pre-prepared by some enthusiastic online advocacy group, where you sign at the bottom, endorsing the nicely-phrased sentiments at the top. They’re called “campaign” ministerials and are binned without being read or replied to (but please don’t tell the Friends of the ABC, who rely heavily on that technique, and haven’t had a letter to Canberra read for two decades).
Most non-campaign letters and emails - some departments still won’t reply to emails but demand your snail mail address, perhaps out of residual loyalty to Australia Post - are answered using what’s called “standard words” - a reply that ostensibly covers the issue raised but which normally says as little as possible. They say as little as possible because the mindset of bureaucrats and ministerial advisers is to keep as many options open as possible, except when there is a particular message that the Government wants to hammer.
Standard words are worked up by bureaucrats and edited and signed off by the Minister’s staff when they’re happy the words are risk-free or convey the desired message. In most departments, they are then loaded into electronic ministerial correspondence systems. This means a bureaucrat doesn’t even need to cut-and-paste into a Word document, merely tell the system to use a particular set of standard words under the name, address, salutation and opening paragraph, which have all been electronically entered already.
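Stripped of the workflow around it, a "standard words" system amounts to little more than a lookup of pre-approved paragraphs by topic. A minimal sketch of what the bureaucrat is driving, with invented topic keys and invented text:

STANDARD_WORDS = {                # topic -> pre-approved paragraph (invented)
    "net_filtering": "The Government is committed to a safer online "
                     "environment for Australian families...",
    "nbn": "The National Broadband Network will deliver high-speed "
           "broadband to all Australians...",
}

def draft_reply(name, topics):
    """Assemble a reply from pre-approved paragraphs; the clerk supplies
    only the constituent's name and the relevant topic keys."""
    body = "\n\n".join(STANDARD_WORDS[t] for t in topics)
    return f"Dear {name},\n\nThank you for your letter.\n\n{body}\n\nYours sincerely,"

# One topic is fully automatic; several topics mean paragraphs from
# different areas, which is where the human friction described below starts.
print(draft_reply("Ms Citizen", ["net_filtering"]))
print(draft_reply("Ms Citizen", ["net_filtering", "nbn"]))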
So if you send off an angry email or letter about net filtering, all you’ll likely get is an automatically-generated reply giving you the standard words on the issue. There’ll be minimal human involvement in the writing of it until it is stuffed into an envelope and dispatched.
You may not think it’s very democratic or consultative, but it’s a damn sight more efficient than processing correspondence by hand.
But if you can’t have any impact on policy, you can have an impact on the level of resources used to answer your letter. And that resource is the time of bureaucrats - the same bureaucrats who advise Conroy on policy, and implement his decisions. In most Departments, ministerial replies have to be approved by SES Band 1 officers before being sent to the Minister’s office, which means many replies consume the precious time both of senior bureaucrats and ministerial advisers. Many Departments also have formal agreements with Ministers that a certain proportion of correspondence will be answered within a certain period of time. If they’re not, more people have to be put into answering correspondence.
So if you want to consume as much of the Department of Broadband’s time as possible, here’s what to do. There’s not much you can do to avoid receiving a standard reply. But you don’t have to confine your missive to net filtering. Throw in some other topics. That means someone will have to put together a reply using standard words from different areas, which is a lot more complicated and can’t be done automatically. Ask about the rollout of the National Broadband Network (NBN). That means someone in the NBN area has to provide some words. Ask about Telstra. That’s another area entirely that has to provide input. If there’s three or four topics in your letter, bureaucrats will start arguing to avoid having to be responsible for it. The NBN area will tell the net filtering area it’s their responsibility to collate the response. The net filtering area will try to off-load it to the Telstra area. A Band 1 in one area will make changes and the whole lot will have to be re-approved by a Band 1 in another area.
Throw in something on Australia Post. Ask about something obscure. They may not have standard words at all and someone will have to actually prepare a proper reply.
You see, once your letter stops being a standard rant about filtering and requires actual work, the amount of time taken to prepare a response can snowball dramatically.
You can also use the Government’s system for allocating correspondence. As a start, always write to your MP first, even if it’s a Coalition MP. They will send the letter to Conroy and ask for a response to provide to you. MPs - even Opposition MPs - must get a response no matter what, as part of the civilities of politics, and it normally has to come from the Minister himself. But write to other Ministers as well. Ask Kim Carr what the impact of filtering will be on Australia’s IT industry. Ask Jenny Macklin what impact she thinks it will have on families. Ask Robert McClelland what the penalties will be for breaches of the mandatory filtering requirements. And ask Kevin Rudd how a Government that understands the need to bring Australia’s online infrastructure into the 21st century wants to drag it back to the 19th when it comes to content regulation.
All of those letters will have to go from the recipient’s department to Conroy’s Department for a response, then back to the originating Department, where they might add some additional material of their own. If you come up with a particularly complicated issue, the bureaucrats might start disagreeing with each other. Innovation bureaucrats might think Broadband’s net filter standard words don’t quite answer your question and want something else.
And don’t ask the same questions in different letters, otherwise they’ll bin them and tell you they understand you’ve separately written to your MP/another Minister/Kevin Rudd and here’s your job lot reply. Ask different questions and raise different issues.
And be pleasant. Apart from anything else, if there’s too much abuse in a letter, it gets thrown out (quite rightly). But these are decent, hard-working bureaucrats and regardless of what you think of Stephen Conroy, they deserve civility and respect.
Most of all, get your friends, acquaintances, family members, work colleagues, passing strangers, all writing. The bureaucratic capacity to handle ministerial correspondence is a lot like the net filters trialled earlier this year. At low levels of traffic they work OK, but once the traffic picks up, things start to choke up. That’s when Stephen Conroy and his office might start to notice that things are slowing down.