Thursday, December 24, 2009

Another example of what continues to attract research funding and investment dollars. As others have pointed out, after seeing how well top-down planning has solved so many transportation issues, I will continue to choose walking and biking as a bottom-up alternative: intelligent vehicles of another kind.

The holistic perspective, however, undermines the legitimacy of the techno-fix approach with a simple fact: the most intelligent and healthy vehicles are self-powered! Steve


How Intelligent Vehicles Will Increase the Capacity of Our Roads

As the percentage of computer-controlled cars on the road increases, traffic should flow smoothly for longer, says a new study.

Adaptive cruise control systems work by monitoring the road ahead with a radar- or laser-based device, then using both the accelerator and brake to maintain a set distance from the vehicle ahead. (Other variations can bring the car to a halt in the event of a potential accident.)

These devices have been available on upmarket cars for ten years or more and are becoming increasingly common. If you drive regularly on freeways, the chances are you regularly come across vehicles being driven by these devices, especially in Europe and Japan (there, the density of traffic means that ordinary cruise control has never caught on the way it has in the U.S.).

So how does the presence of computer-controlled vehicles affect traffic dynamics? Today, Arne Kesting and pals at the Technical University of Dresden in Germany provide an answer of sorts using a model of traffic flow in which both human and computer-driven cars share the road.

They say that the presence of computer-driven cars increases the amount of traffic that can flow on a road before jamming occurs. And the more of these cars, the greater the capacity becomes. "1 percent more [computer-controlled] vehicles will lead to an increase of the capacities by about 0.3 percent," they say.
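As a rough illustration of the reported relationship (a linear extrapolation, not the paper's actual traffic model; the base capacity and penetration figures below are hypothetical):

```python
def road_capacity(base_capacity, acc_share_percent, gain_per_percent=0.003):
    """Linear extrapolation of the reported result: each additional 1% of
    ACC-equipped vehicles raises road capacity by about 0.3%."""
    return base_capacity * (1 + gain_per_percent * acc_share_percent)

# Hypothetical lane carrying 2,000 vehicles/hour, with 25% ACC vehicles
print(round(road_capacity(2000, 25)))  # 2150 vehicles/hour
```

Small per-vehicle gains compound at the fleet level, which is why the authors frame the result in terms of penetration rate rather than any single car's behavior.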

That's interesting, but other studies have suggested that computer-controlled cars can lead to greater congestion, and it's not at all clear why Kesting and company's analysis is superior.

Either way, the argument is probably moot. Computer-controlled cars are just the first step in what many expect to be a revolution in car travel. The big increases in traffic capacity are likely to come when cars are able to communicate with each other. This should allow entire platoons of vehicles to travel as one unit, with just a few centimetres gap between cars and the vehicle in the front communicating its intentions to all the others. Platooning should improve fuel efficiency, too.

Of course, that won't be possible until there is a critical mass of computer-controlled cars on the roads. Even then there is a bigger hurdle to overcome of creating the legal framework in which all this can happen: imagine the insurance claims if one of these platoons were to crash.

The biggest challenge for the makers of cars that drive themselves is no longer technical but legal.

Ref: arxiv.org/abs/0912.3613: Enhanced Intelligent Driver Model to Access the Impact of Driving Strategies on Traffic Capacity
A sign [article] of what's trending in California. Reusing landfill makes sense [you may even be able to capture the landfill's methane]; the conversion of ag land into power-generation land is already creating battle lines as 'we' struggle to maintain our "way of life". In the words of Stewart Brand, "The climate change problem is principally an energy problem". Steve

Stanislaus County Board of Supervisors Watch (12/23/09)

last updated: December 22, 2009 10:53:17 PM
By unanimous vote, Stanislaus County supervisors Tuesday:
• Awarded a one-year exclusive negotiating period to Sol Orchard, which wants to build a solar energy farm at the former Greer Road Landfill. Sol's response beat those submitted by two other companies, including JKB Energy, which recently won a one-year negotiating period for a much larger solar farm next to the county's Fink Road landfill near Crows Landing. Sol has a pending partnership with Solar Power Partners, the nation's third-largest solar energy developer. Preliminary plans include a mounting system that doesn't penetrate soil. Sol will have a year to do environmental studies, make a power-selling deal and negotiate a lease with the county.
-- Garth Stapley

Monday, November 16, 2009

Vulcan Logic and the Missing Sink

By: David Richardson

The daily carbon dioxide emissions report probably doesn't come up very often at America's dining room tables, but Kevin Gurney and researchers from the Vulcan Project hope to soon see that change.

Gurney leads Purdue University's Vulcan Project, which has produced the nation's first county-by-county, hour-by-hour snapshot of CO2 emissions. With Vulcan — named after the Roman god of fire and funded by the federal government through the North American Carbon Program — it is possible to peer into your own backyard (or your neighbor's) to see how your local area is contributing to the emissions that drive global warming.

A little goes a long way
The gases nitrogen and oxygen account for about 99 percent of the Earth's atmosphere. As one of the alphabet soup of trace gases naturally present in the air, CO2 accounts for a mere 0.038 percent of the atmosphere by volume; however, it has a disproportionate impact on warming.

As Gurney, the associate director of the Climate Change Research Center at Purdue University, explained, "It comes down to how we look at the atmosphere. If we look at it in terms of mass — just the weight of things — then CO2 is very small, but if we think about it in terms of the amount of infrared radiation it would trap, then CO2 would look huge."

Although the oceans remove one-third to one-half of the carbon dioxide produced each year, "the bad news is, their ability to do so appears to be diminishing," Gurney said. Meanwhile, the thousand-year atmospheric lifespan of CO2 means levels will continue to ratchet up even if emissions ended immediately.

Nevertheless, given what is known about carbon dioxide sources and sinks (locales or processes that remove CO2 from the atmosphere), scientists say CO2 has not accumulated as fast as one might expect, leading to the ironic question, "Why is the atmosphere not more polluted?"

Gurney and Daniel Mendoza, a graduate student with the Vulcan Project, suspect the answer may lie in an unrecognized carbon sink somewhere on the planet, and in light of the diminishing CO2 capacity of the seas, they believe this "missing sink" will play an important role in the carbon policy equation.

Though Gurney said CO2 concentration varies by no more than a few parts per million from pole to pole, he believes that mapping these barely perceptible peaks and valleys in CO2 levels could lead the way to the missing carbon sinks.

"In effect the weather map is a great analogy," he said. "Most people know that the air has temperature everywhere, but we know that in some places it's a bit higher and in some places it's a bit lower just like a weather map shows. CO2 is kind of like that. It's everywhere but it definitely has little bumps and valleys depending upon what's happening at the surface.

"The absence of the gas, where it would otherwise be expected, would indicate the possible location of the missing sinks."

To find the hidden sink, Gurney explained, research must first pin down where the observed carbon dioxide in the atmosphere originates. But that is not easy. CO2 mixes thoroughly with air as it travels over the planet. "You can find CO2 at the South Pole that was generated by industrial activity in the Northern Hemisphere."

Nonetheless, Gurney said, "We've known at the national level how much is coming from the U.S. as a whole," but prior to the Vulcan Project, the best estimates of emissions at the state level were based on fuel sales figures and shipping records.

Drilling down to the local level, the math gets even fuzzier. Gurney says "the gold standard" of CO2 emissions estimates merely apportioned total discharges among jurisdictions on the basis of population density. Noting the obvious flaw in that approach, he points out that major CO2 emitters such as interstate highways and power plants, which alone account for 40 percent of U.S. emissions, are often a considerable distance from the population centers they serve.

To get a more accurate view of emissions sources, Gurney turned to the U.S. Environmental Protection Agency and its network of local air pollution monitors. "Emission monitoring has gone on for 40 years — since the 1960s — which is an amazing legacy of information and infrastructure built and perfected over four decades," he explained. "In fact it is so good that we almost take it for granted now."

Reverse engineering
Although the EPA monitoring system was never intended to record CO2 levels, Gurney said its reports can be reverse-engineered to quantify the CO2 component of the exhaust that produced the pollution, "provided you know the type of device and fuel."

Mendoza is using data recycled from transportation studies picked up by "the thin wires that you can see that run across the road."

"They're the Federal Highway Administration's weight and motion sensors," he said. "They classify vehicles as either light duty or heavier duty" as they pass over the counter. Mendoza says from this and similar data, Vulcan can generate CO2 emissions figures along major roads.

Although Vulcan collects no original field data, Mendoza says an incredible amount of detail can be coaxed from archival sources. Looking back to the year 2002, he says, Vulcan provides a sector-by-sector breakdown of CO2 emissions covering the power plant, residential, commercial and cement sectors. "We can bring it down to a 10-kilometer-by-10-kilometer grid and provide a temporal pattern for most sectors," he said.

"We were a little bit floored by how much emissions actually come out of very unpopulated areas — a lot of that is due to electricity generation, which is such a great part of the economy here," Mendoza noted.

He said Vulcan delivered an additional surprise, revealing large cities to be "much more" CO2 efficient than smaller communities. Aside from the efficiency of urban mass transit, Mendoza believes lifestyle plays a role in the disparity. "Here in rural Indiana, we have large houses out in the middle of two acres, so the heating costs are much larger, but in the city you have the urban heat island effect keeping costs down."

The Vulcan Project Web page offers animated displays that show local CO2 emissions ramping up and down in response to heating and cooling needs, traffic patterns or other cycles of daily American life for the year 2002.

International Appeal
Gurney said U.S. government officials asked him if Vulcan can be used for "verification purposes" for an eventual climate treaty. It's an idea he finds appealing: "It would certainly be better to have an independent scientific body perform that function than a government."

He believes the average citizen can benefit from a dialogue with Vulcan as well. (You can take a look immediately using Google Earth.)

"I was pretty amazed at how interested people were when we released the maps and the movies," he said. "It's always been difficult to communicate the essence of this problem because it's very abstract in a certain way. One of the things that Vulcan has done is start to make this problem a bit more real and at least make it recognizable to people's lives.

"It brings the discussion down to the human scale, to the scale people live, in their state, their county, in some cases, their city. It brings it into their living rooms."

Mendoza said his next tasks will include adding data from Mexico and Canada so the Vulcan grid covers the entire continent.

Gurney, who predicts Vulcan eventually will produce emissions forecasts three months in advance, said he has received funding for a global version of the project and is currently exploring partnership opportunities with candidates in Europe, Asia and South America.

Despite the detour down the public awareness road, Gurney says he has not wavered from his initial quest to discover the missing sink. However, with a better grasp on where the CO2 originates, he says it will take direct observation, on a global scale, to determine where it might be going, and that job, he said, is best performed from space.


Monday, November 2, 2009

Intelligent Lighting Controls Deliver ROI in 3 Years

October 16, 2009

More building owners and managers are considering intelligent lighting systems to cut energy use because lighting, on average, accounts for approximately one-quarter of a building’s overall electricity use, rivaled only by HVAC and office equipment, according to Gary Meshberg, LEED AP, director of sales for Encelium Technologies.

State-of-the-art lighting systems reduce costs, demonstrate an overall commitment to being environmentally friendly, and contribute to higher building values, higher tenant retention rates and overall end-user satisfaction, said Meshberg.

As an example, Encelium Technologies’ Energy Control System (ECS) uses addressable networking technology in combination with advanced control hardware and software, which can be integrated with HVAC, security and irrigation systems.

ECS uses a universal I/O (input/output) module to connect to standard lighting components such as low-voltage non-dimming ballasts, and occupancy sensors or photo sensors for digital control capabilities. The system allows each person to control his or her own workspace light levels from their desktop computer, and provides facility managers with energy management capabilities.

Installation of an intelligent lighting system also provides a significant return on investment (ROI), said Meshberg. He cites the following example: ECS installations, which cost between $3.00 and $3.50 per square foot for existing space, are designed to reduce lighting-related energy costs by 50 to 75 percent, so with projected savings of 75 cents to $1.25 per square foot per year, the installation cost is amortized in less than three years.
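A back-of-envelope check of those per-square-foot figures (simple payback, ignoring financing, rebates and maintenance):

```python
def simple_payback_years(cost_per_sqft, annual_savings_per_sqft):
    """Simple payback period: installed cost divided by annual savings."""
    return cost_per_sqft / annual_savings_per_sqft

# Meshberg's quoted ranges: $3.00-$3.50/sq ft installed,
# $0.75-$1.25/sq ft saved per year
print(round(simple_payback_years(3.00, 1.25), 1))  # 2.4 years (best case)
print(round(simple_payback_years(3.50, 0.75), 1))  # 4.7 years (worst case)
```

Note that the sub-three-year amortization holds toward the favorable end of both ranges; at the unfavorable end, simple payback stretches closer to five years.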

As an example, the Rogers Centre sports and entertainment complex in Toronto, with approximately 7,000 light fixtures, cut its energy use by 77 percent, or 3,731,000 kWh annually, with an ECS installation, according to Meshberg. The complex also achieved a 39 percent reduction in energy demand and a 76 percent reduction in energy costs. This translates into a cost savings of about $300,000 per year for the complex.
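The Rogers Centre numbers can be cross-checked against each other; the implied electricity rate comes out in a plausible range for commercial service:

```python
# Figures as reported for the Rogers Centre installation
annual_kwh_saved = 3_731_000
annual_cost_savings = 300_000  # dollars per year

implied_rate = annual_cost_savings / annual_kwh_saved
print(round(implied_rate, 3))  # roughly $0.08 per kWh
```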

Meshberg also said ECS installations ease the way for buildings to earn the U.S. Green Building Council’s Leadership in Energy and Environmental Design (LEED) certification, contributing up to 18 of the points needed for certification, as well as facilitate a building’s compliance with ASHRAE 90.1, EPAct, Title 24 of the California Code of Regulations and various utility rebate programs.

Tuesday, October 6, 2009

Spinning flywheels said to make greener energy

Sep 21, 2009
Associated Press Online
By JAY LINDSAY

TYNGSBOROUGH, Mass., Sep. 21, 2009 (AP Online delivered by Newstex) -- Spinning flywheels have been used for centuries for jobs from making pottery to running steam engines. Now the ancient tool has been given a new job by a Massachusetts company: smooth out the electricity flow, and do it fast and clean.
Beacon Power's flywheels -- each weighing one ton, levitating in a sealed chamber and spinning at up to 16,000 revolutions per minute -- will make the electric grid more efficient and green, the company says. It's being given a chance to prove it: the U.S. Department of Energy has granted Beacon a $43 million conditional loan guarantee to construct a 20-megawatt flywheel plant in upstate New York.
"We are very excited about this technology and this company," said Matt Rogers, a senior adviser to the Secretary of Energy. "It's a lower (carbon dioxide) impact, much faster response for a growing market need, and so we get pretty excited about that."
Beacon's flywheel plant will act as a short-term energy storage system for New York's electrical distribution system, sucking excess energy off the grid when supply is high, storing it in the flywheels' spinning cores, then returning it when demand surges. The buffer protects against swings in electrical power frequency, which, in the worst cases, cause blackouts.
Such frequency regulation makes up just 1 percent of the total U.S. electricity market, but that's equal to more than $1 billion annually in revenues. The job is done now mainly by fossil-fuel powered generators that Rogers said are one-tenth the speed of flywheels and create double the carbon emissions.
Beacon said the carbon emissions saved over the 20-year life of a single 20-megawatt flywheel plant are equal to the carbon reduction achieved by planting 660,000 trees.
Flywheels also figure into the emerging renewable energy market, where intermittent energy sources such as wind and solar provide power at wildly varying intensities, depending on how long the breeze blows and sun shines. That increases the need for the faster frequency buffering, Rogers said.
Dan Rastler of the Electric Power Research Institute, an industry research group, added that if a carbon tax is passed by Congress, flywheels start looking a lot better than fossil-fuel powered alternatives.
Beacon's flywheels, massive carbon and fiberglass cylinders, have already been tested on a small scale in New York, California and the company's Tyngsborough offices. Chief executive officer Bill Capp hopes the Stephentown, N.Y., plant will be up and running by the end of 2010.
Flywheels are rotating discs or cylinders that store energy as motion, like the bicycle wheel that keeps rotating long after a pedal's been turned. That energy can be drawn off smoothly depending on the needs of the user, such as when the speed of a potter's wheel is adjusted to shape the clay as desired.
The basics of Beacon's flywheels seem simple enough as they spin silently in their chambers in a small facility outside Beacon's Tyngsborough plant. But the technological challenges to create them were immense and have cost Beacon $180 million, so far.
For instance, the one-ton flywheel had to be durable enough to spin smoothly at exceptionally high speeds. To avoid losing stored energy to friction, the flywheel levitates between magnets in a vacuum chamber.
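For a sense of scale, the energy stored in such a rotor follows the standard kinetic-energy formula E = 1/2 I w^2. The rotor radius below is an assumption (the article quotes only mass and speed), so treat the result as an order-of-magnitude sketch, not Beacon's specification:

```python
import math

def flywheel_energy_kwh(mass_kg, radius_m, rpm):
    """Kinetic energy of a spinning solid-cylinder flywheel, E = 1/2 * I * w^2."""
    inertia = 0.5 * mass_kg * radius_m ** 2   # I = 1/2 m r^2 for a solid cylinder
    omega = rpm * 2 * math.pi / 60            # convert rpm to rad/s
    joules = 0.5 * inertia * omega ** 2
    return joules / 3.6e6                     # joules -> kilowatt-hours

# One-ton rotor at 16,000 rpm, with an assumed 0.5 m radius
print(round(flywheel_energy_kwh(1000, 0.5, 16000), 1))  # ~48.7 kWh
```

Tens of kilowatt-hours per rotor is why a utility-scale plant needs many flywheels, and why the technology suits short, fast frequency regulation rather than long-term storage.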
"We've pretty much demonstrated that it works, it's just a question of scaling," Capp said. "The more we run, the more people get comfortable with us."
Beacon's flywheels are powered by the excess energy they take off the grid. When demand for electricity surges, the flywheels even things out and return the energy to the grid by slowing down.
Flywheels have some clear benefits in energy storage, including the durability to store and release power hundreds of thousands of times over a long, 20-year life, said Yuri Makarov, chief scientist in power systems at Pacific Northwest National Laboratory, which tested Beacon's system for the DOE. Chemical batteries being developed for the same job wear out after a couple thousand charge-and-discharge cycles.
Flywheels use less energy than fossil-fuel powered generators because they adjust more quickly to the ever-shifting demands of the electric grid by simply slowing down or spinning faster, Makarov said. Fossil-fuel generators are slower and less efficient as they constantly fire up and down.
The disadvantage of flywheels, Makarov said, is that they can only store a limited amount of energy for a limited amount of time. That can shut them out of numerous other services the grid demands -- and that other storage technologies can perform -- such as long-term power storage.
Regulations in many markets are also lagging. Beacon will bid against other power generators to provide frequency regulation, but in some markets, the bidding system doesn't even exist yet for energy storage.
Beacon's reward for taking on the technology is that it's the first flywheel company in the nation ready to provide utility-scale frequency regulation in the electric grid. Rogers said the New York project will help show whether the flywheels can do the job:
"If they're successful in New York, we'd expect this kind of technology to be picked up in many other markets around the country," he said.

Wednesday, September 16, 2009

Superefficient Solar from Nanotubes

Tuesday, September 15, 2009

Carbon nanotube photovoltaics can wring twice the charge from light.
By Katherine Bourzac

Today's solar cells lose much of the energy in light to heat. Now researchers at Cornell University have made a photovoltaic cell out of a single carbon nanotube that can take advantage of more of the energy in light than conventional photovoltaics. The tiny carbon tubes might eventually be used to make more-efficient next-generation solar cells.

"The main limiting factor in a solar cell is that when you absorb a high-energy photon, you lose energy to heat, and there's no way to recover it," says Matthew Beard, a senior scientist at the National Renewable Energy Laboratory in Golden, CO. Loss of energy to heat limits the efficiency of the best solar cells to about 33 percent. "The material that can convert at a much higher efficiency will be a game-changer," says Beard.

Researchers led by Paul McEuen, professor of physics at Cornell, began by putting a single nanotube in a circuit and giving it three electrical contacts called gates, one at each end and one underneath. They used the gates to apply a voltage across the nanotube, then illuminated it with light. When a photon hits the nanotube, it transfers some of its energy to an electron, which can then flow through the circuit off the nanotube. This one-photon, one-electron process is what normally happens in a solar cell. What's unusual about the nanotube cell, says McEuen, is what happens when you put in what he calls "a big photon" -- a photon whose energy is twice as big as the energy normally required to get an electron off the cell. In conventional cells, this is the energy that's lost as heat. In the nanotube device, it kicks a second electron into the circuit. The work was described last week in the journal Science.

There's evidence that another class of nanomaterials called quantum dots can also convert the energy of one photon into more than one electron. However, making operational quantum-dot cells that can do this has proved a major hurdle, says Beard, whose lab, led by Arthur Nozik, is working on the problem. One of the challenges with quantum-dot solar is that it's very difficult to get the freed electrons to leave the quantum dot and enter an external circuit. "The system is teasing you; you can't get those charge carriers out, so what's the point?" says Ji Ung Lee, professor of nanoscale engineering at the State University of New York in Albany. "McEuen's group has shown this in a system where you can get the extra carriers out."

McEuen cautions that his work on carbon nanotube photovoltaics is fundamental. "We've made the world's smallest solar cell, and that's not necessarily a good thing," he says. To take advantage of the nanotubes' superefficiency, researchers will first have to develop methods for making large arrays of the diodes. "We're not at a point where we can scale up carbon nanotubes, but that should be the ultimate goal," says Lee, who developed the first nanotube diodes while a researcher at General Electric.

It's not clear why the nanotube photovoltaic cell offers this two-for-one energy conversion. "It's mysterious to us," says McEuen. However, the most likely reason is that while conventional solar materials have only one energy level for electrons to move through, carbon nanotubes have several. And two of them just happen to be very well matched: one of the energy levels, or bandgaps, is twice as high as the other. "We may have gotten lucky, and it has very little to do with the fact that it's a carbon nanotube," says McEuen. This means, McEuen hopes, that even if it proves too challenging to make arrays of nanotube solar cells, materials scientists can look for pairs of materials that have these kinds of matched bandgaps, and layer them to make solar cells that do with two materials what the single nanotube cells can do. "Maybe the answer won't be in nanotubes, but in another pair of materials," McEuen says.
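The matched-bandgap picture can be stated as a toy model. The 1 eV lower bandgap used below is hypothetical, chosen only to illustrate the two-for-one conversion; it is not a measured value from the Cornell work:

```python
def electrons_per_photon(photon_ev, bandgap_ev):
    """Idealized carrier multiplication: each whole multiple of the
    bandgap energy carried by the photon can free one electron."""
    return int(photon_ev // bandgap_ev)

# With a hypothetical 1 eV lower bandgap, a "big photon" at twice that
# energy lines up with the second level and yields two electrons
print(electrons_per_photon(2.0, 1.0))  # 2
print(electrons_per_photon(1.5, 1.0))  # 1 -- the excess is lost as heat
```

In a conventional single-bandgap cell the function would always return at most one useful electron per photon; the matched second level is what makes the extra electron recoverable here.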

Copyright Technology Review 2009.

Tuesday, September 8, 2009

Panels of Light Fascinate Designers

September 7, 2009
By ERIC A. TAUB

LED light bulbs, with their minuscule energy consumption and 20-year life expectancy, have grabbed the consumer’s imagination.

But an even newer technology is intriguing the world’s lighting designers: OLEDs, or organic light-emitting diodes, create long-lasting, highly efficient illumination in a wide range of colors, just like their inorganic LED cousins. But unlike LEDs, which provide points of light like standard incandescent bulbs, OLEDs create uniform, diffuse light across ultrathin sheets of material that eventually can even be made to be flexible.

Ingo Maurer, who has designed chandeliers of shattered plates and light bulbs with bird wings, is using 10 OLED panels in a table lamp in the shape of a tree. The first of its kind, it sells for about $10,000.

He is thinking of other uses. “If you make a wall divider with OLED panels, it can be extremely decorative. I would combine it with point light sources,” he said.

Other designers have thought about putting them in ceiling tiles or in Venetian blinds, so that after dusk a room looks as if sunshine is still streaming in.

Today, OLEDs are used in a few cellphones, like the Impression from Samsung, and for small, expensive, ultrathin TVs from Sony and soon from LG. (Sony’s only OLED television, with an 11-inch screen, costs $2,500.) OLED displays produce a high-resolution picture with wider viewing angles than LCD screens.

In 2008, seven million of the one billion cellphones sold worldwide used OLED screens, according to Jennifer Colegrove, a DisplaySearch analyst. She predicts that next year, that number will jump more than sevenfold, to 50 million phones.

But OLED lighting may be the most promising market. Within a year, manufacturers expect to sell the first OLED sheets that one day will illuminate large residential and commercial spaces. Eventually they will be as energy efficient and long-lasting as LED bulbs, they say.

Because of the diffuse, even light that OLEDs emit, they will supplement, rather than replace, other energy-efficient technologies, like LED, compact fluorescent and advanced incandescent bulbs that create light from a single small point.

Its use may be limited at first, designers say, and not just because of its high price. “OLED lighting is even and monotonous,” said Mr. Maurer, a lighting designer with studios in Munich and New York. “It has no drama; it misses the spiritual side.”

“OLED lighting is almost unreal,” said Hannes Koch, a founder of rAndom International in London, a product design firm. “It will change the quality of light in public and private spaces.”

Mr. Koch’s firm was recently commissioned by Philips to create a prototype wall of OLED light, whose sections light up in response to movement.

Because OLED panels could be flexible, lighting companies are imagining sheets of lighting material wrapped around columns. (General Electric created an OLED-wrapped Christmas tree as an experiment.) OLED can also be incorporated into glass windows; nearly transparent when the light is off, the glass would become opaque when illuminated.

Because OLED panels are just 0.07 of an inch thick and give off virtually no heat when lighted, one day architects will no longer need to leave space in ceilings for deep lighting fixtures, just as homeowners do not need a deep armoire for their television now that flat-panel TVs are common.

The new technology is being developed by major lighting companies like G.E., Konica Minolta, Osram Sylvania, Philips and Universal Display.

“We’re putting significant financial resources into OLED development,” said Dieter Bertram, general manager for Philips’s OLED lighting group. Philips recently stepped up its investment in this area with the world’s first production line for OLED lighting, in Aachen, Germany.

Universal Display, a company started 15 years ago that develops and licenses OLED technologies, has received about $10 million in government grants over the last five years for OLED development, said Joel Chaddock, a technical project manager for solid state lighting in the Energy Department.

Armstrong World Industries and the Energy Department collaborated with Universal Display to develop thin ceiling tiles that are cool to the touch while producing pleasing white light that can be dimmed like standard incandescent bulbs. With a recently awarded $1.65 million government contract, Universal is now creating sheetlike undercabinet lights.

“The government’s role is to keep the focus on energy efficiency,” Mr. Chaddock said. “Without government input, people would settle for the neater aspects of the technology.”

G.E. is developing a roll-to-roll manufacturing process, similar to the way photo film and food packaging are created; it expects to offer OLED lighting sheets as early as the end of next year.

“We think that a flexible product is the way to go,” said Anil Duggal, head of G.E.’s 30-person OLED development team. OLED is one of G.E.’s top research priorities; the company is spending more than half its research and development budget for lighting on OLED.

Exploiting the flexible nature of OLED technology, Universal Display has developed prototype displays for the United States military, including a pen with a built-in screen that can roll in and out of the barrel.

The company has also supplied the Air Force with a flexible, wearable tablet that includes GPS technology and video conferencing capabilities.

As production increases and the price inevitably drops, OLED will eventually find wider use, its proponents believe, in cars, homes and businesses.



“I want to get the price down to $6 for an OLED device that gives off the same amount of light as a standard 60-watt bulb,” said Mr. Duggal of G.E. “Then, we’ll be competitive.”

Monday, August 31, 2009

The Next Evolution in Economics: Rethinking Growth

1:30 PM Wednesday August 19, 2009
by Stan Stalnaker

The credit crunch has forced people across many sectors to rethink their assumptions about how they do business, the roles of the individual in the larger system, and the very future of the system itself.

These reflections are beginning to bear fruit. We've begun to see a shift from the old, linear transaction-based approach to business toward a new, circular view, in which shared resources can better benefit all in a way that adds depth (and value) to this future economy.

Economists describe this new model in many ways. One way is to use human cellular structures as a metaphor for economic growth. Call it cellular economic theory.

What do cells tell us about business? Well, consider that cells that grow continually and exponentially (like we've been taught our economies should grow) are a form of cancer. We know intuitively and logically that continuous growth can't be sustained in living things. It's likewise unsustainable (and undesirable) in business.

But that's our current model--to just keep growing. And in this model there's no alternative to growth, only stagnation, which leads to death. The result is policy at every level (micro, macro, corporate and public) that champions growth at all costs.

Cellular economic theory suggests an alternative to linear growth: circular growth. In the body, cells grow. Cells die. New cells grow. New cells die. On and on. We sustain ourselves through regeneration. In business, a form of staged, regenerative growth could become the norm. The growth may not even change the size of the "economic body."

Here, growth is not seen as the ultimate byproduct of an economic life cycle, but just an important one. Growth becomes one of several life cycle stages that are primarily about replenishment. Instead of growing in size and scope, companies grow in capabilities, processes and offerings. New ones come along. Old ones die. Just like cells, growth becomes regenerative--only what needs replacing is replaced, reducing waste and improving society along the way.

For example, a brewery in India is using cellular economic thinking to grow its bottom line without producing and selling more beer. Instead it's using chaff and grain detritus to create fertilizer and biofuels--regenerating resources to lower their own production costs while widening the life cycles of their inputs.

KATIKA, a Swiss wood furniture maker, is reforesting at a rate greater than their production, using profits from their sales today to ensure the availability of resources later. In the meantime, their reforestation projects create local jobs and other sustainable benefits (habitat and food for wildlife, CO2 reduction) while increasing the value of formerly degraded land holdings.

In a cellular economy, key metrics change. GDP growth is less important than GDP regeneration. Successful growth takes into account the sustainability of that growth.

The most profound change in a cellular economy is the devaluation of the transaction. Today, economic value is determined primarily by the value of the transaction. To grow (even just to survive), we must keep trading, keep consuming--no matter how wasteful the process becomes--because success is creating more transactions. This keeps us locked into a linear, growth-oriented paradox.

Fortunately (if not painfully), the Internet is exposing the impossibility of sustaining a transaction-based economy. As the net drives the cost of certain goods and services toward zero, it strips profit from transactions.

In publishing, for example, the cost of information is falling while sources multiply. Same for music and other creative enterprises. Same for micro-lending versus traditional banking. Fashion and retail. Oil. Anywhere there's a middle man between the natural resource and the end consumer, the Internet is obviating the need for the middle man.

And, in place of transactions and supply chains (which are, essentially, series of middle men), communities are gaining leverage and power from these shared commodities like news and gas.

A low-level web of constant relationships is developing: circular, cellular systems in which shared, collaborative contributions are the norm. Here, the value resides with relationships, not transactions. Maybe, instead of buying and selling more and more in a mad race for grabbing the most growth, the future will be about a collaborative, community-oriented regenerative growth model.

This "economy of shares" relies on crowd-sourced contributions, a free market, and a fair dose of incentives for sustainability. When it becomes bad business to waste resources in pursuit of profit, then the regenerative model takes hold and we can kiss goodbye to the things we know we don't need but can't seem to give up. Wasteful packaging. Super-sized food portions. Environmentally damaging newspapers. Gas-guzzling SUVs.

Eventually, in a regenerative economy, we learn to focus on kaizen--constant improvement--as opposed to an ever expanding volume of low-quality transactions and markets. Call it the co-op economy. It's the kind of economic system we always say we want but can't bring ourselves to build.

If the experts are right and we do indeed need to find more sustainable ways of living, and the bankers are right in saying that we have to live within our means, and the technologists are right saying that collaborative systems are the future, then it stands to reason that the next evolution in economics is to a more natural, life-like system.

We are moving to a world where transactions will happen instantly, on demand, for free. We are moving to a time when transactions can't sustain an economy. We are realizing all systems are like biological systems--even economic ones. Growth-at-all-costs business is malignant.

It's time to apply that broad realization in new ways to the situation at hand.

http://blogs.harvardbusiness.org/hbr/hbr-now/2009/08/a-new-approach-to-economics.html

Stan Stalnaker is the Founder and Creative Director of Hub Culture Ltd , a social network that merges online and physical world environments.

Wednesday, August 12, 2009

Keeping it local

Jul 30th 2009 | AUSTIN
From The Economist print edition

A rising vogue for shopping near home

IN 2002 the city of Austin planned to extend about $2m in incentives to a developer who wanted to build a new Borders bookstore on a prominent downtown corner. This was an unpleasant prospect for the owners of two local independent businesses, BookPeople and Waterloo Records. If the deal had gone through they would have faced a big competitor located directly across the street. Steve Bercu, the owner of BookPeople, says that he always assumed that local businesses were better for Austin for sound economic reasons. But in the circumstances, he wanted to test the proposition.

So BookPeople and Waterloo called in Civic Economics, a consultancy. They went through the books and found that for every $100 spent at the two locals, $45 stayed in Austin in wages to local staff, payments to other local merchants, and so on. When that sum went to a typical Borders store, only $13 went back into circulation locally.


Although the study was part-funded by BookPeople and Waterloo it gave a boost to the growing “buy local” movement in America. For years business and community leaders have been full of reasons for people to do their shopping close to home. They say that local and independent businesses have more individual character, and that they are owned by your friends and neighbours. Some stores, particularly grocers, point out that it takes much less carbon to haul a truck from a few towns over than from halfway across the country.

At the moment, the economic argument has special traction. Dan Houston, a partner at Civic Economics, says that in recent studies he has found that locally-owned businesses put about twice as much money back into the community as the chains do, not three times, as the Austin study found. But that is still enough of a “local multiplier” to catch people’s attention. Stacy Mitchell of the Institute for Local Self-Reliance in Portland, Maine, reckons that some 30,000 local independents have joined about 130 independent business alliances around the nation.
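The "local multiplier" logic can be sketched as a geometric series: the retained share of each dollar is re-spent locally, retained again, and so on. The 45 percent and 13 percent retention rates below come from the Austin study; the assumption that the same rate applies on every re-spending round is an illustrative simplification, not Civic Economics' published methodology.

```python
# Sketch of a local multiplier: of every $100 spent, a share stays
# local and is assumed to be re-spent locally at the same rate each
# round (a simplifying assumption for illustration only).

def total_local_activity(spend, retention_rate, rounds=50):
    """Sum the follow-on local economic activity over repeated
    re-spending rounds (a truncated geometric series)."""
    total = 0.0
    circulating = spend
    for _ in range(rounds):
        circulating *= retention_rate
        total += circulating
    return total

local_store = total_local_activity(100, 0.45)  # $45 of $100 stays local
chain_store = total_local_activity(100, 0.13)  # $13 of $100 stays local

print(f"local independent: ${local_store:.2f} of follow-on local activity")
print(f"chain store:       ${chain_store:.2f}")
```

Under these assumptions the independent store generates roughly five times the follow-on local activity of the chain, which is why even a modest gap in first-round retention catches policymakers' attention.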

Big companies are taking note that customers are rooting for the home team. Ms Mitchell points to a telling development in Seattle, Washington, where Starbucks got its start. On July 24th the company opened a new coffee shop there. The newcomer is not called a Starbucks; it is called “15th Ave. Coffee & Tea”. It promises “a deep connection to the local community,” and its seats are recycled from a local theatre.

There is an insular element to the trend. “Is it pure local protectionism? Sure, to some extent it is,” says Mr Houston. But the advocates are not zealots. One national campaign is asking people to shift a mere 10% of their spending to local outfits.

The Borders project in Austin eventually fell through, and the proposed site is now occupied by the flagship of Whole Foods Market. The chain was founded in Austin and is local in a sense, although it is now publicly traded. Throughout the shop, produce advertises its credentials: local, organic, fairly traded, made in-house, vegan, and so on. This week its customers faced an ethical dilemma: is it better to buy the organic watermelon from California, or the conventionally-grown kind from Lexington, Texas?

Thursday, August 6, 2009

Water Scarcity Looms

Worldwatch Institute
by Gary Gardner/ August 6, 2009

Water scarcity grows in urgency in many regions as population growth, climate change, pollution, lack of investment, and management failures restrict the amount of water available relative to demand. The Stockholm International Water Institute calculated in 2008 that 1.4 billion people live in "closed basins"-regions where existing water cannot meet the agricultural, industrial, municipal, and environmental needs of all.1 Their estimate is consistent with a 2007 Food and Agriculture Organization (FAO) calculation that 1.2 billion people live in countries and regions that are water-scarce.2 And the situation is projected to worsen rapidly: FAO estimates that the number of people living in water-scarce countries and regions will rise to 1.8 billion by 2025, particularly as population growth pushes many countries and regions into the scarcity column.3

"Water scarcity" has several meanings. Physical water scarcity exists wherever available water is insufficient to meet demand: parts of the southwestern United States, northern Mexico, North Africa, the Middle East, Central Asia, northern China, and southeastern Australia are characterized by physical water scarcity.4 Economic water scarcity occurs when water is available but inaccessible because of a lack of investment in water provision or poor management and regulation of water resources. Much of the water scarcity of sub-Saharan Africa falls into this category.5

Signs of scarcity are plentiful. Several major rivers, including the Indus, Rio Grande, Colorado, Murray-Darling, and Yellow, no longer reach the sea year-round as a growing share of their waters are claimed for various uses.6 Water tables are falling as groundwater is overpumped in South Asia, northern China, the Middle East, North Africa, and the southwestern United States, often propping up food production unsustainably.7 The World Bank estimates that some 15 percent of India's food, for example, is produced using water from nonrenewable aquifers.8 Another sign of scarcity is that desalination, a limited and expensive water supply solution, is on the rise.

Water scarcity has many causes. Population growth is a major driver at the regional and global levels, but other factors play a large role locally. Pollution reduces the amount of usable water available to farmers, industry, and cities. The World Bank and the government of China have estimated, for instance, that 54 percent of the water in seven main rivers in China is unusable because of pollution.9 In addition, urbanization tends to increase domestic and industrial demand for water, as does rising incomes-two trends prominent in rapidly developing countries such as China, India, and Brazil.10

In some cases, water scarcity leads to greater dependence on foreign sources of water. A country's "water footprint"-the volume of water used to produce the goods and services, including imports, that people consume-can be used to demonstrate this.11 The ratio between the water footprint of a country's imports and its total water footprint yields its water import dependence. The higher the ratio, the more a country depends on outside water resources. In the Netherlands, for example, imported goods and services account for 82 percent of the country's total water footprint.12 (See Table 1.)

A looming new threat to water supplies is climate change, which is causing rainfall patterns to shift, ice stocks to melt, and soil moisture content and runoff to change.13 According to the Intergovernmental Panel on Climate Change, the area of our planet classified as "very dry" has more than doubled since the 1970s, and the volume of glaciers in many regions and snow pack in northern hemisphere mountains-two important freshwater sources-has decreased significantly.14

Climate change is expected to have a net negative impact on water scarcity globally this century. By the 2050s, the area subject to greater water stress due to climate change will be twice as large as the area experiencing decreased water stress.15 Less rainfall is expected in already arid areas, including the Mediterranean Basin, western United States, southern Africa, and northeastern Brazil, where various models all indicate that runoff will decrease by 10-30 percent in the coming decades.16 And loss of snowpack will remove a natural, off-season water reservoir in many regions: by the 2020s, for example, 41 percent of the water supply to the populous southern California region is likely to be vulnerable to warming as some of the Sierra Nevada and Colorado River basin snowpacks disappear.17

Policymakers look to a variety of solutions to address water scarcity. Desalination is increasingly feasible for small-scale water supply, as technological advances reduce costs. This involves removing most salt from salt water, typically by passing it through a series of membranes. Global desalination capacity doubled between 1995 and 2006, and according to some business forecasts it could double again by 2016.18 But production is tiny: global capacity in 2005 was some 55.4 million cubic meters, barely 0.003 percent of the world's municipal and industrial water consumption, largely because desalination remains an energy-intensive and expensive option.19 Not surprisingly, 47 percent of global capacity in 2006 was in the Middle East, where the need is great and energy is cheap.20 In addition, the technology is plagued by environmental concerns, especially disposal of salt concentrates.

Another limited solution to scarcity involves accounting for "virtual water"-the water used to produce a crop or product-when designing trade policy. Nations conserve their own water if they import products having a large virtual water component, such as foodstuffs, rather than producing them domestically. Imports to Jordan, for instance, including wheat and rice from the United States, have a virtual water content of some 5-7 billion cubic meters per year compared with domestic water use of some 1 billion cubic meters per year.21 Jordan's import policy yields enormous water savings for the country, although it also increases its food dependency. The bulk of North and South America, Australia, Asia, and Central Africa are net exporters of virtual water.22 Most of Europe, Japan, North and South Africa, the Middle East, Mexico, and Indonesia, in contrast, are net importers of virtual water.23

Other solutions focus on structural shifts in water use, including growing crops that are less water-intensive, changing dietary patterns to reduce meat consumption, and shifting to renewable sources of energy. Diets heavy in livestock products, for example, are water-intensive because of the huge quantities of water required for livestock production.24 (See Table 2.) Similarly, fossil fuel production requires many times more water than renewable energy sources do.25 (See Table 3.)

Complete trends will be available with full endnote referencing, Excel spreadsheets, and customizable presentation-ready charts as part of our new subscription service, Vital Signs Online, slated to launch this fall.

http://www.worldwatch.org/node/6213?emc=el&m=279787&l=4&v=7c1e33b23a

Wednesday, August 5, 2009

LEDs Are As Energy Efficient as Compact Fluorescents

NYT August 4, 2009, 12:01 am

By Eric A. Taub

While there’s no question that LED lamps use a fraction of the energy to produce the same amount of light compared with a standard incandescent bulb, several Bits readers have pointed out that that’s only half the story.

If the energy used to create and dispose of the LED lamp is more than that for a comparable standard bulb, then all of the proclaimed energy savings to produce light are for naught.

Until recently, no one knew if that was the case. In March, a preliminary study reported by Carnegie Mellon indicated that LED lamps were more energy efficient throughout their life, but the researchers pointed out that not every aspect of the production process was taken into account.

A new study released on Tuesday by Osram, the German lighting giant, claims to have confirmed the efficiency findings.

Conducted by the Siemens Corporate Technology Centre for Eco Innovations (Siemens is the parent of Osram and Sylvania), the report examines the energy needed to create and power an LED lamp. Even the energy needed to ship a lamp from the factory in China to an installation in Europe was taken into account.

The study used a 25,000-hour LED lamp life as a constant, comparing the energy needed throughout its life to that used for 25 1,000-hour incandescents and 2.5 10,000-hour compact fluorescents.
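The lamp-count arithmetic behind that comparison is straightforward: divide the 25,000-hour service period by each lamp's lifetime. The lifetimes below are the study's; the wattages are illustrative assumptions (the article does not give them) added only to show how use-phase energy would be tallied before manufacturing and disposal energy are layered on.

```python
SERVICE_HOURS = 25_000  # the LED lamp life the study held constant

# (lifetime in hours, assumed power in watts). Lifetimes are from the
# study; wattages are illustrative assumptions for similar light output.
lamps = {
    "incandescent": (1_000, 60),
    "CFL": (10_000, 13),
    "LED": (25_000, 9),
}

for name, (life_hours, watts) in lamps.items():
    units = SERVICE_HOURS / life_hours        # lamps consumed over 25,000 h
    use_kwh = SERVICE_HOURS * watts / 1000    # use-phase energy only
    print(f"{name}: {units:g} lamp(s), {use_kwh:g} kWh in use")
```

This reproduces the study's 25 incandescents and 2.5 CFLs per LED, and shows why use-phase energy dominates the comparison: even generous estimates of manufacturing and shipping energy are small next to hundreds of kilowatt-hours of operation.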

The findings, according to a summary of the study: today’s LED lamps are essentially as energy efficient as compact fluorescents, in the amount of energy needed to create, recycle and provide light. Osram said it expected those numbers to improve as LEDs become more energy efficient.

The company issued no in-depth information to support its claims. It said that confirming data will be released this fall, after review by three independent analysts.

But assuming the numbers hold, this total Life Cycle Assessment should put to rest any lingering doubts about the overall “greenness” of LEDs.

Copyright 2009 The New York Times Company

Monday, August 3, 2009

Solar Industry: No Breakthroughs Needed

Monday, August 03, 2009

The solar industry says incremental advances have made transformational technologies unnecessary.
By Kevin Bullis

The federal government is behind the times when it comes to making decisions about advancing the solar industry, according to several solar-industry experts. This has led, they argue, to a misplaced emphasis on research into futuristic new technologies, rather than support for scaling up existing ones. That was the prevailing opinion at a symposium last week put together by the National Academies in Washington, DC, on the topic of scaling up the solar industry.

The meeting was attended by numerous experts from the photovoltaic industry and academia. And many complained that the emphasis on finding new technologies is misplaced. "This is such a fast-moving field," said Ken Zweibel, director of the Solar Institute at George Washington University. "To some degree, we're fighting the last war. We're answering the questions from 5, 10, 15 years ago in a world where things have really changed."

In the past year, the federal government has announced new investments in research into "transformational" solar technologies that represent radical departures from existing crystalline-silicon or thin-film technologies that are already on the market. The investments include new energy-research centers sponsored by the Department of Energy and a new agency called ARPA-Energy, modeled after the Defense Advanced Research Projects Agency. Such investments are prompted by the fact that conventional solar technologies have historically produced electricity that's far more expensive than electricity from fossil fuels.

In fact, Energy Secretary Steven Chu has said that a breakthrough is needed for photovoltaic technology to make a significant contribution to reducing greenhouse gases. Researchers are exploring solar cells that use very cheap materials or even novel physics that could dramatically increase efficiency, which could bring down costs.

But industry experts at the Washington symposium argued that new technologies will take decades to come to market, judging from how long commercialization of other solar technologies has taken. Meanwhile, says Zweibel, conventional technologies "have made the kind of progress that we were hoping futuristic technologies could make." For example, researchers have sought to bring the cost of solar power to under $1 per watt, and as of the first quarter of this year one company, First Solar, has done this.

These cost reductions have made solar power cheaper than the natural-gas-powered plants used to produce extra electricity to meet demand on hot summer days. With subsidies, which Zweibel argues are justified because of the "externalities" of other power sources, such as the cost from pollution, solar can be competitive with conventional electricity even outside peak demand times, at least in California. And projected cost decreases will make solar competitive with current electricity prices in more areas, even without subsidies.

Representatives of the solar industry say the federal government should do more to remove obstacles that are slowing the industry's development. One issue is financing for new solar installations, which can be much more expensive if lending institutions deem them high risk. A recent extension of federal tax credits and grants for solar investments is a step in the right direction, many solar experts say. But more could be done. A price on carbon would help make solar more economically competitive and more attractive to lenders.

Friday, July 31, 2009

A New Approach to Fusion

Technology Review Friday, July 31, 2009

A startup snags funding to start early work on a low-budget test reactor.
By Tyler Hamilton

General Fusion, a startup in Vancouver, Canada, says it can build a prototype fusion power plant within the next decade and do it for less than a billion dollars. So far, it has raised $13.5 million from public and private investors to help kick-start its ambitious effort.

Unlike the $14 billion ITER project under way in France, General Fusion's approach doesn't rely on the expensive superconducting magnets of a tokamak to contain the superheated plasma necessary to achieve and sustain a fusion reaction. Nor does the company require powerful lasers, such as those within the National Ignition Facility at Lawrence Livermore National Laboratory, to confine a plasma target and compress it to extreme temperatures until fusion occurs.

Instead, General Fusion says it can achieve "net gain"--that is, create a fusion reaction that gives off more energy than is needed to trigger it--using relatively low-tech, mechanical brute force and advanced digital control technologies that scientists could only dream of 30 years ago.

It may seem implausible, but some top U.S. fusion experts say General Fusion's approach, which is a variation on what the industry calls magnetized target fusion, is scientifically sound and could actually work. It's a long shot, they say, but well worth a try.

"I'm rooting for them," says Ken Fowler, professor emeritus of nuclear engineering and plasma physics at the University of California, Berkeley, and a leading authority on fusion-reactor designs. He's analyzed the approach and found no technical showstoppers. "Maybe these guys can do it. It's really luck of the draw."

The prototype reactor will be composed of a metal sphere about three meters in diameter containing a liquid mixture of lithium and lead. The liquid is spun to create a vortex inside the sphere that forms a vertical cavity in the middle. At this point, two donut-shaped plasma rings held together by self-generated magnetic fields, called spheromaks, are injected into the cavity from the top and bottom of the sphere and come together to create a target in the center. "Think about it as blowing smoke rings at each other," says Doug Richardson, chief executive of General Fusion.

On the outside of the metal sphere are 220 pneumatically controlled pistons, each programmed to simultaneously ram the surface of the sphere at 100 meters a second. The force of the pistons sends an acoustic wave through the lead-lithium mixture, and that accelerates into a shock wave as it reaches the plasma, which is made of the hydrogen isotopes deuterium and tritium.

If everything works as planned, the plasma will compress instantly and the isotopes will fuse into helium, releasing a burst of energy-packed neutrons that are captured by the lead-lithium liquid. The rapid heat buildup in the liquid will be extracted through a heat exchanger, with half used to create steam that spins a turbine for power generation, and the rest used to recharge the pistons for the next "shot."

The ultimate goal is to inject a new plasma target and fire the pistons every second, creating pulses of fusion reactions as part of a self-sustaining process. This contrasts with ITER, which aims to create a single fusion reaction that can sustain itself. "One of the big risks to the project is nobody has compressed spheromaks to fusion-relevant conditions before," says Richardson. "There's no reason why it won't work, but nobody has ever proven it."

He says it took longer than expected to raise the money for the prototype project, but the company can now start the first phase of building the test reactor, including the development of 3-D simulations and the technical verification of components. General Fusion aims to complete the reactor and demonstrate net gain within five years, assuming it can raise another $37 million.

If successful, it believes it can build a grid-capable fusion reactor rated at 100 megawatts four years later for about $500 million, beating ITER by about 20 years and at a fraction of the cost.

"I usually pass up these quirky ideas that pass my way, but this one really fascinated me," says Fowler. He notes that there are immense challenges to overcome, but the culture of a private startup may be what it takes to tackle them with a sense of urgency. "In the big programs, especially the fusion ones, people have gotten beat up so much that they've become so risk averse."

General Fusion's basic approach isn't entirely new. It builds on work done during the 1980s by the U.S. Naval Research Laboratory, based on a concept called Linus. The problem was that scientists couldn't figure out a fast-enough way to compress the plasma before it lost its donut-shaped magnetic confinement, a window of opportunity measured in milliseconds. Just like smoke rings, the plasma rings maintain their shape only momentarily before dispersing.

Nuclear-research giant General Atomics later came up with the idea of rapidly compressing the plasma using a mechanical ramming process that creates acoustic waves. But the company never followed through--likely because the technology to precisely control the speed and simultaneous triggering of the compressed-air pistons simply didn't exist two decades ago.

Richardson says that high-speed digital processing is readily available today, and General Fusion's mission over the next two to four years is to prove it can do the job. Before building a fully functional reactor with 220 pistons on a metal sphere, the company will first verify that smaller rings of 24 pistons can be synchronized to strike an outer metal shell.

Glen Wurden, program manager of fusion energy sciences at Los Alamos National Laboratory and an expert on magnetized target fusion, says General Fusion has a challenging road ahead and many questions to answer definitively. Can they produce spheromaks with the right densities, temperature, and life span? Can they inject two spheromaks into opposite ends of the vortex cavity and make sure they collide and merge? Will the acoustic waves travel uniformly through the liquid metal?

"You can do a good amount of it through simulations, but not all of it," says Wurden. "This is all very complex, state-of-the-art work. The problem is you're dealing with different timescales and different effects on materials when they're exposed to shock waves."

Los Alamos and General Fusion are collaborating as part of a recently signed research agreement. But Richardson isn't planning on a smooth ride. "The project has many risks," he says, "and we expect most of it to not perform exactly as expected." However, if the company can pull off its test reactor, it hopes to attract enough attention to easily raise the $500 million for a demonstration power plant.

Says Fowler, "Miracles do happen."

Copyright Technology Review 2009.

Monday, July 20, 2009

White House to Push Forward on National Urban Policy Agenda

Administration to Host Daylong Talks Tomorrow; Tour of U.S. Cities Planned
By Robin Shulman
Washington Post Staff Writer
Sunday, July 12, 2009; 11:12 AM

After remaining out of the public eye since its creation in February, the White House Office of Urban Affairs plans on Monday to launch a public conversation to create a national urban policy agenda, said Adolfo Carrión Jr., its director.
The White House will host a daylong urban policy discussion including mayors, county executives, governors, urban policy experts, and heads of various agencies, Carrión said in a telephone interview yesterday.
President Obama is expected to address the conference and announce plans to send Carrión and other senior administration officials on a tour of American cities to discuss urban issues, Carrión said.
The conference is the first indication that the White House could back its urban policy office with the kind of muscle that Obama suggested during his campaign, before the economic collapse. He called for a new kind of urban policy to address cities and also their suburbs, and urban advocates hoped that this could be a focus of his administration's economic development approach.
"We have not had a national urban policy for decades," said Carrión. "Meanwhile, the economy has changed."
Those gathered Monday will consider local initiatives that could become best practices to emulate, with the goals of increasing the competitiveness, sustainable development and opportunity of metropolitan regions.
The conference is to present an interdisciplinary approach to urban issues and include the heads of the Departments of Labor, Transportation, and Housing and Urban Development, and of the Environmental Protection Agency, and the Small Business Administration.
Carrión said discussion will include initiatives like Choice Neighborhoods, a new HUD program that provides poor neighborhoods not only with housing, but also social and economic benefits, like day care and farmers' markets; and Promise Neighborhoods, a Department of Education program modeled after the Harlem Children's Zone, to improve academic achievement and life skills by offering after school and weekend sports, social and arts activities.
The conference will include several dozen policy experts, including Bruce Katz, the director of the Metropolitan Policy Program at the Brookings Institution, who developed some of the ideas that led to the creation of the Office of Urban Affairs. Bankers, planners, and advocates will also attend.
Meanwhile, the listening tour is to begin this month, said Carrión, and continue through the fall, reaching public school auditoriums and factory plants, and including developers, housing and environmental advocates, educators, health care providers, public safety officials, and local elected officials.
Analysts have said that the most important thing Carrión's office can do is outline a national agenda for metropolitan areas, after decades of federal inaction.
Carrión said the listening tour reflects Obama's philosophy that "the best solutions are in communities," where answers have "bubbled up" despite a lack of federal guidance and support.
Obama grew up in Honolulu and Jakarta, and forged his career in Chicago, as a community organizer, and he has been shaped by cities to a greater degree than any president in nearly a century.
"For too long government has operated from the top down," said Carrión. "We've always heard why does the national government send down these unfunded mandates, underfunded mandates, mandates that are not necessarily universally applicable. The bottom-up approach speaks to the need for this to be flexible."

Eco-friendly light bulbs flip switch on problems

Ann Geracimos – Washington Times
July 20, 2009

An energy efficiency measure is turning into a ticking time bomb.

The federal government plans to require consumers over the next several years to replace incandescent light bulbs with more expensive but more energy-efficient and longer-lasting compact fluorescent bulbs (CFLs).

But improper disposal of the mercury-containing bulbs poses an environmental hazard, and the federal government has given little guidance to consumers. The outlets for safe disposal are few and haphazard, and history suggests that compliance will be spotty.

"The problem to the environment comes when millions get disposed of and the cumulative effect becomes problematic. That is when the [Environmental Protection Agency] gets concerned," said Neal Langerman, a former chairman of the American Chemical Society Division of Chemical Health and Safety. "If you have a municipal urban landfill and have a population of 450,000 households disposing of one or two CFLs a year - you do the arithmetic. Put one-half milligram of mercury per bulb, it amounts to a significant loading, and mercury does migrate into groundwater."
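Langerman's "do the arithmetic" invitation can be worked through directly. A minimal sketch, using only the figures from his quote (450,000 households, one to two CFLs a year, one-half milligram of mercury per bulb; the 1.5 bulbs per year below is simply the midpoint of his range):

```python
# Rough worked version of Langerman's landfill-loading arithmetic.
# All figures come from the quote above; 1.5 bulbs/year is the midpoint
# of his "one or two CFLs a year" range.
households = 450_000
bulbs_per_household_per_year = 1.5
mercury_per_bulb_mg = 0.5

total_mg = households * bulbs_per_household_per_year * mercury_per_bulb_mg
total_g = total_mg / 1000

print(f"{total_mg:,.0f} mg (~{total_g:.0f} g) of mercury per year")
```

That is roughly a third of a kilogram of mercury entering a single municipal landfill each year, which is the "significant loading" he describes.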

Although California has banned CFLs from trash since 2006, local governments there estimate that less than 10 percent of CFLs receive proper disposal and recycling, according to San Francisco's KGO-TV.

Revised standards for home appliances and lighting under the December 2007 energy bill require incandescent light bulbs - the basic model that has been used for 130 years - to be phased out in order to achieve about 25 percent greater efficiency for bulbs by 2014 and about 200 percent greater efficiency by 2020.

Without organized programs to educate consumers on safe handling and disposal of used or broken bulbs, landfills are likely to become even more polluted, Mr. Langerman told The Washington Times.

"The appropriate thing for us as a nation is not to dispose but have an aggressive take-back program," said Mr. Langerman, who advocates a profit incentive for recycling, a system where "if you go out of your way [to safely dispose or recycle the bulbs] you get some money back. People will do this if made convenient."

The federal Web site Energy Star (www.energystar.gov) notes that each CFL bulb contains an average of 4 milligrams of mercury, compared with the 500 milligrams contained in old-style glass thermometers. None of the mercury is released in operation, and leakage is a risk only if the bulbs are broken.

The site notes that "electricity use is the main source of mercury emissions in the U.S.," which is why CFLs' lower electricity use matters. The EPA says that "a 13-watt, 8,000-rated-hour-life CFL (60-watt equivalent; a common light bulb type)" will save enough energy over its lifetime to more than offset its mercury content, even if the bulb ends up in a landfill.
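The EPA's lifetime-savings argument can be sketched with the numbers cited above. The 13 W, 60 W and 8,000-hour figures are from the article; the grid mercury emission factor below is an illustrative assumption for the sketch, not an EPA figure:

```python
# Sketch of the lifetime-savings argument for the CFL cited above.
# Wattages and rated life are from the article; the mercury emission
# factor per kWh of grid electricity is an assumed, illustrative value.
cfl_watts = 13
incandescent_watts = 60
lifetime_hours = 8_000

energy_saved_kwh = (incandescent_watts - cfl_watts) * lifetime_hours / 1000
print(f"Lifetime electricity saved: {energy_saved_kwh:.0f} kWh")

# Assumed average mercury emitted per kWh of grid electricity (mg/kWh).
assumed_hg_per_kwh_mg = 0.012
avoided_hg_mg = energy_saved_kwh * assumed_hg_per_kwh_mg
print(f"Avoided power-plant mercury: ~{avoided_hg_mg:.1f} mg "
      f"(vs ~4 mg in the bulb itself)")
```

Under that assumption, the avoided power-plant emissions are on the same order as the mercury the bulb contains, which is the comparison the EPA is making.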

EPA spokeswoman Tisha Petteway wrote in an e-mail: "Once a CFL or a fluorescent lamp is at the end of its life, EPA strongly encourages Americans to recycle it."

The CFLs sold at supermarkets and drugstores have small warnings that the bulbs contain mercury. A 13-watt model from General Electric Co. does not elaborate on the risks beyond telling consumers to "manage in accord with disposal laws." The packaging refers buyers to a recycling information Web site (www.lamprecycle.org) and provides a toll-free phone number.

Breaking a thermometer "can raise mercury levels in a tight bedroom to high enough levels to cause symptoms in a child in a short amount of time," Mr. Langerman said. "The biggest difference is the amount of ventilation."

No federal mandate requires households to recycle or safely dispose of such bulbs. The massive energy bill in Congress offers no guidance on the question of disposal, and the subject has generated little discussion during debate. That leaves this issue subject to a hodgepodge of state and local rules, some more serious than others about regulation.

The EPA gives consumers advice about finding safe disposal and recycling facilities across the country on its Web site www.epa.gov/waste/hazard/wastetypes/universal/lamps/live.htm.

D.C. residents, for example, are directed to trash transfer stations in Northeast. Contractors accept items for recycling from those with proof of local residency or from vehicles with D.C. license plates.

The two D.C. stations take recyclables only on Saturdays from 8 a.m. to 3 p.m. A total of "620 unit pounds" of CFL bulbs and mercury lamps were dropped off in an eight-month period since collection was made available last year, said Nancy Lyons, a spokeswoman for the D.C. Department of Public Works.

Home Depot, Ace Hardware, Ikea and other retailers collect the bulbs for recycling as a customer service.

Chris Jensen, the employee in charge of lighting supplies at Frager's Hardware on Capitol Hill, said the store considered offering a disposal center but found that it would cost "one grand a month."

The store advises customers to go to Prudential Carruthers Realtors sales office at 216 Seventh St. SE, where bulbs can be dropped off for recycling in a cardboard box with a heavy plastic liner.

Prudential manager Larry Kamins said the office periodically buys the package for $95 from a professional recycling company he found on the Internet. He said he decided to offer the community service because, when he started using the bulbs, he couldn't find any information on recycling locations.

"They certainly don't go out of their way" for this, Mr. Kamins said.

He said people often drop off bulbs on his doorstep overnight and that a Capitol Hill resident voluntarily collects used bulbs from neighbors and brings them to Prudential.

Mark Kohorst, senior manager for environment, health and safety at the Rosslyn-based National Electrical Manufacturers Association, said federal business regulations classify mercury-containing lamps as a subcategory of hazardous waste.

"A nationwide network of recyclers exists to serve that sector," he said.

The rule exempts households but gives states the right to adopt the federal law and apply it to households, thereby making it illegal for anyone to dispose of the bulbs in any way other than recycling.

Mr. Kohorst said Maine has enacted "the first law of its kind requiring manufacturers to fund recycling," forcing the development and implementation of state-approved programs by January. Manufacturers that do not comply will not be allowed to sell the lamps in Maine.

"We do need a national program because what good does it do for California to ban it" when neighboring states don't, said Leonard Robinson, chief deputy director of California's Department of Toxic Substances Control.

He plans to address the subject when he visits Washington with Energy Secretary Steven Chu, "a Californian," he said, who "knows the situation."

A pilot project in Humboldt County in Northern California allows households to mail used bulbs directly to a recycler, he said.

With the cooperation of the U.S. Postal Service and funding provided by Pacific Gas & Electric Co., about 58,000 residences have used direct shipment since February, he said. "That takes care of the rural people, because not everybody has a Wal-Mart or Ikea nearby."

New compact fluorescent light bulbs that carry the "Energy Star Qualified" label are supposed to last up to 10 times longer than incandescent bulbs and use one-quarter of the energy to produce the same amount of light. Although CFLs cost more up front, the bulbs save money over time because they last longer and use less energy.
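A rough payback sketch makes the up-front-cost claim concrete. The wattages and the 10x lifetime are from the article; the retail prices and electricity rate below are hypothetical round numbers chosen for illustration:

```python
# Rough payback sketch for "costs more up front but saves money".
# Wattages are from the article; prices and the electricity rate are
# hypothetical values used only for illustration.
cfl_price, incandescent_price = 3.00, 0.50   # assumed retail prices ($)
cfl_watts, incandescent_watts = 13, 60
electricity_rate = 0.12                      # assumed $/kWh

extra_cost = cfl_price - incandescent_price
savings_per_hour = (incandescent_watts - cfl_watts) / 1000 * electricity_rate
payback_hours = extra_cost / savings_per_hour
print(f"Extra cost recovered after ~{payback_hours:.0f} hours of use")
```

At a few hours of use per day, the assumed price premium is recovered within the first year, well inside the bulb's rated life.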

Mr. Jensen noted, however, that most CFLs on the market don't work with motion sensors or "dusk to dawn" fixtures, "and many models warn they are 'not for use with dimmers.' "

Lane Burt, an energy policy analyst with the Natural Resources Defense Council's Washington office, said CFL bulbs are just "a stopover" before an even more efficient type of light bulb arrives.

"Everyone should know where we want to go is LEDs [light-emitting diodes]. They are much more efficient and costs are coming down quickly. They have improved tenfold in the past decade."

Friday, July 17, 2009

Congressional Hearing on Transportation's Role in Climate Change and Greenhouse Gases

July 14, 2009

Excerpt:

Addressing VMT growth plays a key role in decreasing transportation-related GHG emissions and should be included in overall efforts to prevent climate change. One way to achieve significant reductions in VMT is to develop more livable communities. The effects of reduced VMT on greenhouse gas emissions have repeatedly been demonstrated. A report aired on National Public Radio evaluated the carbon footprint of two families living in Atlanta. One family moved from a walkable, transit-served community to a car-dependent one, and another family moved from a car-dependent area to a livable community. The greatest difference in CO2 emissions between the families was in transportation-related emissions. The carbon footprint for the family that moved to a car-dependent area was 40 percent higher, and transportation accounted for almost 85 percent of the difference. This report, among others, indicates the relevance of VMT to greenhouse gas emissions and suggests that we should accelerate our efforts to identify ways to reduce VMT growth in order to meet our climate goals.

There are several steps that can be taken to spur the development of more livable communities and reduce VMT:

First, we can provide more transportation choices in more communities across the country. Single occupancy vehicles should be only one of many transportation options available to Americans to reach their destinations. Walking, bicycling, light rail and buses can be made available in more places.

Second, we can promote development of housing in close proximity to transit. In addition to reducing VMT and greenhouse gas emissions from cars driven by commuters, such planning would have the added benefits of decreasing transportation costs for families and reducing traffic congestion.

Third, we can promote mixed-use development, which incorporates residential and commercial buildings, allowing individuals the choice to walk, drive a shorter distance or easily use public transportation to reach their destination. Residents should have the option to live in an area with services and goods that are easily accessible. In addition to reducing greenhouse gas emissions, this would also reduce travel times involved in driving to and from grocery and department stores, medical service providers or even entertainment centers such as movie theaters.

While many view community planning and multi-modal transportation as affecting only urban or larger suburban areas, there are many ways in which such provisions would benefit smaller towns and rural areas as well. A strong, well planned town center could provide smaller towns or rural communities with easy access to jobs and services in one centralized location and increase foot traffic around locally owned small businesses. These town centers will also protect open spaces and valuable farmland. Additionally, all people, whether in urban or rural areas, need access to job centers, medical services and schools. In urban settings this access might take the form of sidewalks and bike lanes. In rural areas, it might look more like intercity rail and bus service. But, especially as populations age, non-driving access to essential services is increasingly central to making towns more livable for 21st century populations. This poses a particular challenge for rural areas.

All of these factors will be critical elements of our livability initiative. Our work will not be easy, but it offers great promise for improving the lives of all Americans and reducing our use of energy and greenhouse gas emissions. The Department of Transportation and other agencies are already working closely to determine the best means to support sustainable, livable communities.

On June 16, Housing and Urban Development Secretary Shaun Donovan, Environmental Protection Agency Administrator Lisa Jackson, and I announced a new partnership to help American families in all communities - rural, suburban and urban - develop sustainable communities. Over the course of our collective work, we have defined six guiding principles. We are committed to

• providing more transportation choices,
• promoting equitable, affordable housing,
• enhancing economic competitiveness,
• supporting existing communities,
• coordinating policies and leveraging investment, and
• valuing the uniqueness of communities and neighborhoods.

These principles will guide the interagency working group as we continue our efforts.

As we consider surface transportation reauthorization -- both in the short and longer term -- the Department will prioritize creating a livability program that measurably reduces VMT and greenhouse gas emissions while providing added economic benefits to Americans in all geographic locations. Multi-modal transportation, combined with mixed-use development and smart community planning, is an important issue to address when we consider transportation's role in climate change. Combined with more efficient vehicles and cleaner-burning fuels, these strategies will be important to reaching our GHG reduction goals. They will also reduce our reliance on foreign oil.

The Senate now has the opportunity, for the first time, to create a system of clean energy incentives designed to jumpstart a clean energy economy and confront the threat of carbon pollution. As the President has said, it is important that we accomplish these goals while protecting consumers, and helping sensitive industries transition. I have outlined in my testimony today some of the ways in which the Department of Transportation can contribute to this effort. We would be particularly pleased if the final legislation gave the Department better tools to integrate climate change considerations into the transportation planning, financing, and implementation process and to facilitate system improvements. Failing to recognize the connection between transportation and climate change will likely jeopardize our ability to achieve our GHG reduction goals.

http://epw.senate.gov/public/index.cfm?FuseAction=Files.View&FileStore_id=2127b852-f4ce-437c-83a1-1eda266059ab

Friday, July 10, 2009

Feed-in tariff and spate of other climate-related bills move forward

Debra Kahn – ClimateWire

July 9, 2009

A bill that would boost the state's feed-in tariff to apply to renewable energy projects larger than 1.5 megawatts passed a key California Senate committee this week.

A.B. 1106, by state Rep. Felipe Fuentes (D), would establish two pricing tiers for renewable energy generators selling electricity to utilities. The first tier would include projects up to 5 megawatts, while the second would encompass those from 5 to 10 MW. Utilities would have to offer the guaranteed pricing for only 500 MW of generation in total.
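The tier structure described above can be sketched as a simple classifier. The 5 MW and 10 MW boundaries and the 500 MW utility-wide cap come from the article; the function name, return labels, and the choice to make the 5 MW boundary inclusive are illustrative assumptions:

```python
# Hypothetical sketch of the A.B. 1106 tier structure described above.
# Tier boundaries (5 MW, 10 MW) and the 500 MW utility-wide cap are from
# the article; names and return values are illustrative.
def feed_in_tariff_tier(project_mw: float) -> str:
    """Classify a renewable project under the bill's two pricing tiers."""
    if project_mw <= 5:
        return "tier 1"
    if project_mw <= 10:
        return "tier 2"
    # Larger projects fall outside the bill's guaranteed-pricing program.
    return "ineligible"

UTILITY_CAP_MW = 500  # cumulative cap per utility on guaranteed pricing
```

For example, under this sketch a 3 MW solar farm would receive tier 1 pricing and a 7.5 MW one tier 2, while a 25 MW project would have to sell under ordinary power-purchase terms.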

Both tiers would boost the guaranteed price by a few cents above the current level, which is the 20-year levelized cost of electricity from a combined-cycle gas turbine. The California Public Utilities Commission would be in charge of determining the costs of production plus a "reasonable profit" for each form of renewable energy: solar photovoltaic, solar thermal, wind, biogas, biomass, hydropower and geothermal.

Adam Browning, executive director of the Vote Solar Initiative, said the bill will likely be revised substantially before it passes the full Senate.

It differs from a proposal the state Public Utilities Commission made last month, which calls for a cap of 10 MW per project and a program-wide cap of 1,500 MW. The agency is planning to request comments within a few weeks on how to determine pricing.

The bill is moving ahead despite CPUC's request in May for legislators to wait until the agency finishes its proceedings. "It was only let through with the understanding that there would be a lot of changes to it," Browning said.

The Senate Energy, Utilities and Communications Committee also passed bills dealing with overall renewable energy targets, energy efficiency audits and spending proceeds from a potential cap-and-trade auction.

Passing the committee Tuesday were Assemblyman Paul Krekorian's (D) bill establishing a 33 percent renewable portfolio standard, A.B. 64, and A.B. 758, which would require the state Energy Commission to develop a plan to reduce energy use in existing residential and commercial buildings, as well as require utilities to perform a certain number of low-cost energy efficiency audits annually.

The committee also approved A.B. 1405, which would establish a fund for low-income communities directly affected by climate change. Thirty percent of the state's revenue from auctioning cap-and-trade CO2 allowances would go into the fund.

Wednesday, July 8, 2009

Solar for Dark Climates

Technology Review Wednesday, July 08, 2009

Solar technology that generates both heat and electricity could make solar energy practical in places that aren't sunny.
By Kevin Bullis

Cool Energy, a startup based in Boulder, CO, is developing a system that produces heat and electricity from the sun. It could help make solar energy competitive with conventional sources of energy in relatively dark and cold climates, such as the northern half of the United States and countries such as Canada and Germany.

The company's system combines a conventional solar water heater with a new Stirling-engine-based generator that it is developing. In cool months, the solar heater provides hot water and space heating. In warmer months, excess heat is used to drive the Stirling engine and generate electricity.

Samuel Weaver, the company's president and CEO, says that the system is more economical than solar water heaters alone because it makes use of heat that would otherwise be wasted during summer months. The system will also pay for itself about twice as quickly as conventional solar photovoltaics will, he says. That's in part because it can efficiently offset heating bills in the winter--something that photovoltaics can't do--and in part because the evacuated tubes used to collect heat from the sun make better use of diffuse light than conventional solar panels do.

The system is designed to provide almost all of a house's heating needs. But the generator, which will produce only 1.5 kilowatts of power, won't be enough to power a house on its own. The system is designed to supplement grid power, although its output is enough to run a refrigerator and a few lights in the event of a power failure.

The company's key innovation is the Stirling engine, which is designed to work at much lower temperatures than ordinary Stirling engines. In these engines, a piston is driven by heating one side of the engine while keeping the opposite side cool. Ordinarily, the engines require temperatures above 500 °C, but Cool Energy's engine is designed to run at the roughly 200 °C that solar water heaters provide.

The success of the technology, however, hinges on achieving the efficiency targets, says Dean Kamen, the inventor of the Segway, who is developing high-temperature Stirling engines for other applications, including transportation. "We need data," he says. The company's second prototype was only 10 percent efficient at converting heat into electricity. Its engineers hope to reach 20 percent with a new prototype.

A Stirling engine's efficiency is limited by the temperature difference between its hot and cool sides. Typically, reaching the necessary high temperatures using sunlight requires mirrors and lenses for concentrating the light and tracking systems for keeping the concentrators pointed at the sun. The concentrators require direct sunlight, so they don't work on overcast days, and they're too bulky to be mounted on the roof of a house.
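The temperature limit mentioned above is the Carnot bound, which caps any heat engine's efficiency at 1 minus the ratio of cold-side to hot-side absolute temperature. A quick check, assuming a roughly 25 °C ambient cold side (the article does not state one), shows why a 20 percent target at 200 °C is ambitious:

```python
# Carnot ceiling for a heat engine between the article's ~200 C hot side
# and an assumed ~25 C ambient cold side (temperatures in kelvin).
t_hot = 200 + 273.15
t_cold = 25 + 273.15   # assumed ambient; not stated in the article

carnot_efficiency = 1 - t_cold / t_hot
print(f"Theoretical maximum efficiency: {carnot_efficiency:.0%}")
```

Under these assumptions the theoretical ceiling is about 37 percent, so Cool Energy's 20 percent goal amounts to capturing a little over half of what thermodynamics allows at these temperatures.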

To make a practical Stirling engine that runs at low temperatures and doesn't require concentrators, the engineers at Cool Energy addressed a problem with conventional engines that leads to wasted energy: heat leaks from the hot side of the system to the cool side, lowering the temperature difference between them. This happens because the materials required for high temperatures and pressures--typically metals--conduct heat. Working at lower temperatures, the engineers concluded, allows them to use materials such as plastics and certain ceramics that don't conduct heat, reducing these losses. These materials also help lower costs: they're cheaper than some of the metals typically used, and they don't require lubrication, improving the reliability of the engines and reducing maintenance costs.

Cool Energy's engineers are currently assembling the company's third prototype, which they say will allow them to reach their efficiency targets by the end of this summer, after which they plan to test pilot systems outside the lab. Within two years, they plan to manufacture enough systems to drive costs down and achieve their payback targets.

Copyright Technology Review 2009.

Tuesday, June 23, 2009

'Milking' Microscopic Algae Could Yield Massive Amounts Of Oil

ScienceDaily (June 23, 2009) —

Scientists in Canada and India are proposing a surprising new solution to the global energy crisis —"milking" oil from the tiny, single-cell algae known as diatoms, renowned for their intricate, beautifully sculpted shells that resemble fine lacework.

Richard Gordon, T. V. Ramachandra, Durga Madhab Mahapatra, and Karthick Band note that some geologists believe that much of the world's crude oil originated in diatoms, which produce an oily substance in their bodies. Barely one-third the diameter of a strand of hair, diatoms flourish in enormous numbers in oceans and other water sources. They die, drift to the seafloor, and deposit their shells and oil into the sediments. Estimates suggest that live diatoms could make 10 to 200 times as much oil per acre of cultivated area as oil seeds, Gordon says.

"We propose ways of harvesting oil from diatoms, using biochemical engineering and also a new solar panel approach that utilizes genetically modifiable aspects of diatom biology, offering the prospect of 'milking' diatoms for sustainable energy by altering them to actively secrete oil products," the scientists say. "Secretion by and milking of diatoms may provide a way around the puzzle of how to make algae that both grow quickly and have a very high oil content."

Journal reference:

1. Ramachandra, T. V., et al. Milking Diatoms for Sustainable Energy: Biochemical Engineering versus Gasoline-Secreting Diatom Solar Panels. Industrial & Engineering Chemistry Research, 2009. DOI: 10.1021/ie900044j

Tuesday, June 16, 2009

New US climate report dire, but offers hope

By SETH BORENSTEIN

WASHINGTON (AP) — Rising sea levels, sweltering temperatures, deeper droughts, and heavier downpours — global warming's serious effects are already here and getting worse, the Obama administration warned on Tuesday in the grimmest, most urgent language on climate change ever to come out of any White House.

But amid the warnings, scientists and government officials seemed to go out of their way to soften the message. It is still not too late to prevent some of the worst consequences, they said, by acting aggressively to reduce world emissions of heat-trapping gases, primarily carbon dioxide from the burning of fossil fuels.

The new report differs from a similar draft issued with little fanfare or context by George W. Bush's administration last year. It is paradoxically more dire about what's happening and more optimistic about what can be done.

The Obama administration is backing a bill in Congress that would limit heat-trapping pollution from power plants, refineries and factories. A key player on a climate bill in the Senate, California Democrat Barbara Boxer, said the report adds "urgency to the growing momentum in Congress" for passing a law.

"It's not too late to act," said Jane Lubchenco, one of several agency officials at a White House briefing. "Decisions made now will determine whether we get big changes or small ones." But what has happened already is not good, she said: "It's happening in our own backyards and it affects the kind of things people care about."

Lubchenco, a marine biologist, heads the National Oceanic and Atmospheric Administration.

In one of its key findings, the report warned: "Thresholds will be crossed, leading to large changes in climate and ecosystems." The survival of some species could be affected, it said.

The document, a climate status report required periodically by Congress, was a collaboration by about three dozen academic, government and institute scientists. It contains no new research, but it paints a fuller and darker picture of global warming in the United States than previous studies.

Bush was ultimately forced by a lawsuit to issue a draft report last year, and that document was the basis for this one. Obama science adviser John Holdren called the report nonpartisan, started by a Republican administration and finished by a Democratic one.

"The observed climate changes that we report are not opinions to be debated. They are facts to be dealt with," said one of the report's chief authors, Jerry Melillo of Marine Biological Lab in Woods Hole, Mass. "We can act now to avoid the worst impacts."

Among the things Melillo said he would like to avoid are more flooding disasters in New Orleans and an upheaval of the world's food supply.

The scientists softened the report from an earlier draft that said "tipping points have already been reached and have led to large changes." Melillo said that is because some of the changes seen so far are still reversible.

Even so, Tom Karl of the National Climatic Data Center said that at least one tipping point — irreversible sea level rise — has been passed.

A point of emphasis of the report, which is just under 200 pages, is what has already happened in the United States. That includes rapidly retreating glaciers in the American West and Alaska, altered stream flows, trouble with the water supply, health problems, changes in agriculture, and energy and transportation worries.

"There are in some cases already serious consequences," report co-author Anthony Janetos of the University of Maryland told The Associated Press. "This is not a theoretical thing that will happen 50 years from now. Things are happening now."

For example, winters in parts of the Midwest have warmed by 7 degrees in just 30 years, and the frost-free period has grown by a week, the report said.

Shorter winters have some benefits, such as longer growing seasons, but those are changes that require adjustments just the same, the authors note.

The "major disruptions" already taking place will only increase as warming continues, the authors wrote. The world's average temperature may rise by as much as 11.5 degrees by the end of the century, the report said. And the U.S. average temperature could go even higher than that, Karl said.

Environmental groups praised the report as a call for action, with the Union of Concerned Scientists calling it what "America needs to effectively respond to climate change."

Scott Segal, a Washington lobbyist for the coal industry, was more cautious: "Fast action without sufficient planning is a route to potential economic catastrophe with little environmental gain."

Associated Press Writer Dina Cappiello in Washington contributed to this report.

Wednesday, June 10, 2009

Roll-Up Solar Panels

Technology Review Thursday, June 04, 2009

A startup is making thin-film solar cells on flexible steel sheets.
By Prachi Patel

Xunlight, a startup in Toledo, Ohio, has developed a way to make large, flexible solar panels, using a roll-to-roll manufacturing technique that forms thin-film amorphous silicon solar cells on thin sheets of stainless steel. Each solar module is about one meter wide and five and a half meters long.

As opposed to conventional silicon solar panels, which are bulky and rigid, these lightweight, flexible sheets could easily be integrated into roofs and building facades or on vehicles. Such systems could be more attractive than conventional solar panels and be incorporated more easily into irregular roof designs. They could also be rolled up and carried in a backpack, says the company's cofounder and president, Xunming Deng. "You could take it with you and charge your laptop battery," he says.

Amorphous silicon thin-film solar cells can be cheaper than conventional crystalline cells because they use a fraction of the material: the cells are 1 micrometer thick, as opposed to the 150-to-200-micrometer-thick silicon layers in crystalline solar cells. But they're also notoriously inefficient. To boost their efficiency, Xunlight made triple-junction cells, which use three different materials--amorphous silicon, amorphous silicon germanium, and nanocrystalline silicon--each of which is tuned to capture the energy in different parts of the solar spectrum. (Conventional solar cells use one primary material, which only captures one part of the spectrum efficiently.)

Still, Xunlight's flexible PV modules are only about 8 percent efficient, while some crystalline silicon modules on the market are more than 20 percent efficient. As a result, Xunlight's large modules produce only 330 watts, whereas an array of crystalline silicon solar panels covering the same area would produce about 740 watts.

United Solar Ovonic, based in Auburn Hills, MI, is already selling flexible PV modules. The company also uses triple-junction amorphous silicon cells, and its modules can be attached to roofing materials. But Xunlight's potential advantage is its high-volume roll-to-roll technique. "If their roll-to-roll process allows them to go to lower cost and larger area, that's the central advantage," says Johanna Schmidtke, an analyst with Lux Research, in Boston. "But they have to prove it with manufacturing."

Other companies, notably Heliovolt and Nanosolar, are in a race to make thin-film panels using copper indium gallium selenide (CIGS) cells. These have shown efficiencies on par with crystalline silicon and can be made on flexible substrates. In comparison with amorphous silicon, CIGS is a relatively difficult material to work with, and no one has been able to create low-cost products consistently in large quantities, says Ryan Boas, an analyst with Photon Consulting, in Boston.

Building integrated photovoltaics (BIPV), especially rooftop applications, would be the biggest market for flexible PV technology, Boas says. That's because flexible products are inherently very light, in addition to being quick and easy to install. "Imagine carrying a roll of flexible product on the roof and unrolling it," he says. "Workers are already used to unrolling roofing material."

But there are hidden risks and costs associated with BIPV, Schmidtke says. "BIPV is often touted as low cost," she says, "but in actuality, you've got greater risk in terms of a watertight system [for roofing materials] or fire risk, and that increases total installation cost." However, BIPV does have the advantage of being more aesthetically pleasing, which is important to consumers, she says.

So far, Xunlight has raised $40 million from investors. In December, the state of Ohio gave the company a $7 million loan to speed up the construction of a 25-megawatt production line for its flexible solar modules. The company expects to have commercial products available in 2010.

Copyright Technology Review 2009.