Tuesday, September 8, 2009

Panels of Light Fascinate Designers

September 7, 2009
By ERIC A. TAUB

LED light bulbs, with their minuscule energy consumption and 20-year life expectancy, have grabbed the consumer’s imagination.

But an even newer technology is intriguing the world’s lighting designers: OLEDs, or organic light-emitting diodes, create long-lasting, highly efficient illumination in a wide range of colors, just like their inorganic LED cousins. But unlike LEDs, which provide points of light like standard incandescent bulbs, OLEDs create uniform, diffuse light across ultrathin sheets of material that can eventually even be made flexible.

Ingo Maurer, who has designed chandeliers of shattered plates and light bulbs with bird wings, is using 10 OLED panels in a table lamp in the shape of a tree. The first of its kind, it sells for about $10,000.

He is thinking of other uses. “If you make a wall divider with OLED panels, it can be extremely decorative. I would combine it with point light sources,” he said.

Other designers have thought about putting them in ceiling tiles or in Venetian blinds, so that after dusk a room looks as if sunshine is still streaming in.

Today, OLEDs are used in a few cellphones, like the Impression from Samsung, and for small, expensive, ultrathin TVs from Sony and soon from LG. (Sony’s only OLED television, with an 11-inch screen, costs $2,500.) OLED displays produce a high-resolution picture with wider viewing angles than LCD screens.

In 2008, seven million of the one billion cellphones sold worldwide used OLED screens, according to Jennifer Colegrove, a DisplaySearch analyst. She predicts that next year, that number will jump more than sevenfold, to 50 million phones.

But OLED lighting may be the most promising market. Within a year, manufacturers expect to sell the first OLED sheets that one day will illuminate large residential and commercial spaces. Eventually they will be as energy efficient and long-lasting as LED bulbs, they say.

Because of the diffuse, even light that OLEDs emit, they will supplement, rather than replace, other energy-efficient technologies, like LED, compact fluorescent and advanced incandescent bulbs that create light from a single small point.

OLED lighting may see only limited use at first, designers say, and not just because of its high price. “OLED lighting is even and monotonous,” said Mr. Maurer, a lighting designer with studios in Munich and New York. “It has no drama; it misses the spiritual side.”

“OLED lighting is almost unreal,” said Hannes Koch, a founder of rAndom International in London, a product design firm. “It will change the quality of light in public and private spaces.”

Mr. Koch’s firm was recently commissioned by Philips to create a prototype wall of OLED light, whose sections light up in response to movement.

Because OLED panels could be flexible, lighting companies are imagining sheets of lighting material wrapped around columns. (General Electric created an OLED-wrapped Christmas tree as an experiment.) OLED can also be incorporated into glass windows; nearly transparent when the light is off, the glass would become opaque when illuminated.

Because OLED panels are just 0.07 of an inch thick and give off virtually no heat when lighted, one day architects will no longer need to leave space in ceilings for deep lighting fixtures, just as homeowners do not need a deep armoire for their television now that flat-panel TVs are common.

The new technology is being developed by major lighting companies like G.E., Konica Minolta, Osram Sylvania, Philips and Universal Display.

“We’re putting significant financial resources into OLED development,” said Dieter Bertram, general manager for Philips’s OLED lighting group. Philips recently stepped up its investment in this area with the world’s first production line for OLED lighting, in Aachen, Germany.

Universal Display, a company started 15 years ago that develops and licenses OLED technologies, has received about $10 million in government grants over the last five years for OLED development, said Joel Chaddock, a technical project manager for solid state lighting in the Energy Department.

Armstrong World Industries and the Energy Department collaborated with Universal Display to develop thin ceiling tiles that are cool to the touch while producing pleasing white light that can be dimmed like standard incandescent bulbs. With a recently awarded $1.65 million government contract, Universal is now creating sheetlike undercabinet lights.

“The government’s role is to keep the focus on energy efficiency,” Mr. Chaddock said. “Without government input, people would settle for the neater aspects of the technology.”

G.E. is developing a roll-to-roll manufacturing process, similar to the way photo film and food packaging are created; it expects to offer OLED lighting sheets as early as the end of next year.

“We think that a flexible product is the way to go,” said Anil Duggal, head of G.E.’s 30-person OLED development team. OLED is one of G.E.’s top research priorities; the company is spending more than half its research and development budget for lighting on OLED.

Exploiting the flexible nature of OLED technology, Universal Display has developed prototype displays for the United States military, including a pen with a built-in screen that can roll in and out of the barrel.

The company has also supplied the Air Force with a flexible, wearable tablet that includes GPS technology and video conferencing capabilities.

As production increases and the price inevitably drops, OLED will eventually find wider use, its proponents believe, in cars, homes and businesses.

“I want to get the price down to $6 for an OLED device that gives off the same amount of light as a standard 60-watt bulb,” said Mr. Duggal of G.E. “Then, we’ll be competitive.”

Monday, August 31, 2009

The Next Evolution in Economics: Rethinking Growth

1:30 PM Wednesday August 19, 2009
by Stan Stalnaker

The credit crunch has forced people across many sectors to rethink their assumptions about how they do business, the roles of the individual in the larger system, and the very future of the system itself.

These reflections are beginning to bear fruit. We've begun to see a shift from the old, linear transaction-based approach to business toward a new, circular view, in which shared resources can better benefit all in a way that adds depth (and value) to this future economy.

Economists describe this new model in many ways. One way is to use human cellular structures as a metaphor for economic growth. Call it cellular economic theory.

What do cells tell us about business? Well, consider that cells that grow continually and exponentially (like we've been taught our economies should grow) are a form of cancer. We know intuitively and logically that continuous growth can't be sustained in living things. It's likewise unsustainable (and undesirable) in business.

But that's our current model--to just keep growing. And in this model there's no alternative to growth, only stagnation, which leads to death. The result is policy at every level (micro, macro, corporate and public) that champions growth at all costs.

Cellular economic theory suggests an alternative to linear growth: circular growth. In the body, cells grow. Cells die. New cells grow. New cells die. On and on. We sustain ourselves through regeneration. In business, a form of staged, regenerative growth could become the norm. The growth may not even change the size of the "economic body."

Here, growth is not seen as the ultimate byproduct of an economic life cycle, but just an important one. Growth becomes one of several life cycle stages that are primarily about replenishment. Instead of growing in size and scope, companies grow in capabilities, processes and offerings. New ones come along. Old ones die. Just like cells, growth becomes regenerative--only what needs replacing is replaced, reducing waste and improving society along the way.
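
To make the contrast concrete, here is a toy numerical sketch comparing a compounding economy with a regenerative one whose overall size stays fixed while its stock is continually retired and rebuilt. The rates are arbitrary, made-up assumptions used only for illustration, not figures from any study.

```python
# Toy comparison: compounding growth vs. regenerative turnover.
# All rates are arbitrary assumptions chosen only for illustration.

YEARS = 30
GROWTH_RATE = 0.05     # compounding economy expands 5% per year
TURNOVER_RATE = 0.05   # regenerative economy retires and replaces 5% of its stock per year

compounding = 100.0        # starting size of the "keep growing" economy
regenerative_size = 100.0  # regenerative economy holds its size constant
renewed_capacity = 0.0     # cumulative capacity retired and rebuilt over the years

for _ in range(YEARS):
    compounding *= 1 + GROWTH_RATE
    renewed_capacity += regenerative_size * TURNOVER_RATE  # size unchanged, stock renewed

print(f"Compounding economy after {YEARS} years: {compounding:.0f} units")
print(f"Regenerative economy: still {regenerative_size:.0f} units, "
      f"with {renewed_capacity:.0f} units' worth of capacity renewed along the way")
```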

For example, a brewery in India is using cellular economic thinking to grow its bottom line without producing and selling more beer. Instead it's using chaff and grain detritus to create fertilizer and biofuels--regenerating resources to lower its own production costs while extending the life cycles of its inputs.

KATIKA, a Swiss wood furniture maker, is reforesting at a rate greater than its production, using profits from its sales today to ensure the availability of resources later. In the meantime, its reforestation projects create local jobs and other sustainable benefits (habitat and food for wildlife, CO2 reduction) while increasing the value of formerly degraded land holdings.

In a cellular economy, key metrics change. GDP growth is less important than GDP regeneration. Successful growth takes into account the sustainability of that growth.

The most profound change in a cellular economy is the devaluation of the transaction. Today, economic value is determined primarily by the value of the transaction. To grow (even just to survive), we must keep trading, keep consuming--no matter how wasteful the process becomes--because success means creating more transactions. This keeps us locked into a linear, growth-oriented paradox.

Fortunately (if not painfully), the Internet is exposing the impossibility of sustaining a transaction-based economy. As the net drives the cost of certain goods and services toward zero, it strips profit from transactions.

In publishing, for example, the cost of information is falling while sources multiply. Same for music and other creative enterprises. Same for micro-lending versus traditional banking. Fashion and retail. Oil. Anywhere there's a middle man between the natural resource and the end consumer, the Internet is obviating the need for the middle man.

And, in place of transactions and supply chains (which are, essentially, series of middle men), communities are gaining leverage and power from these shared commodities like news and gas.

A low-level web of constant relationships is developing: circular, cellular systems in which shared, collaborative contributions are the norm. Here, value resides in relationships, not transactions. Maybe, instead of buying and selling more and more in a mad race to grab the most growth, the future will be about a collaborative, community-oriented, regenerative growth model.

This "economy of shares" relies on crowd-sourced contributions, a free market, and a fair dose of incentives for sustainability. When it becomes bad business to waste resources in pursuit of profit, then the regenerative model takes hold and we can kiss goodbye to the things we know we don't need but can't seem to give up. Wasteful packaging. Super-sized food portions. Environmentally damaging newspapers. Gas-guzzling SUVs.

Eventually, in a regenerative economy, we learn to focus on kaizen--constant improvement--as opposed to an ever-expanding volume of low-quality transactions and markets. Call it the co-op economy. It's the kind of economic system we always say we want but can't bring ourselves to build.

If the experts are right and we do indeed need to find more sustainable ways of living, and the bankers are right in saying that we have to live within our means, and the technologists are right in saying that collaborative systems are the future, then it stands to reason that the next evolution in economics is toward a more natural, more lifelike system.

We are moving to a world where transactions will happen instantly, on demand, for free. We are moving to a time when transactions can't sustain an economy. We are realizing all systems are like biological systems--even economic ones. Growth-at-all-costs business is malignant.

It's time to apply that broad realization in new ways to the situation at hand.

http://blogs.harvardbusiness.org/hbr/hbr-now/2009/08/a-new-approach-to-economics.html

Stan Stalnaker is the Founder and Creative Director of Hub Culture Ltd, a social network that merges online and physical world environments.

Wednesday, August 12, 2009

Keeping it local

Jul 30th 2009 | AUSTIN
From The Economist print edition

A rising vogue for shopping near home

IN 2002 the city of Austin planned to extend about $2m in incentives to a developer who wanted to build a new Borders bookstore on a prominent downtown corner. This was an unpleasant prospect for the owners of two local independent businesses, BookPeople and Waterloo Records. If the deal had gone through they would have faced a big competitor located directly across the street. Steve Bercu, the owner of BookPeople, says that he always assumed that local businesses were better for Austin for sound economic reasons. But in the circumstances, he wanted to test the proposition.

So BookPeople and Waterloo called in Civic Economics, a consultancy. They went through the books and found that for every $100 spent at the two locals, $45 stayed in Austin in wages to local staff, payments to other local merchants, and so on. When that sum went to a typical Borders store, only $13 went back into circulation locally.
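
A rough sketch of the arithmetic behind that finding: the 45- and 13-percent first-round retention figures come from the study, and if one assumes (hypothetically) that the same retention rate applies to each subsequent round of local spending, the cumulative "local multiplier" follows a simple geometric series.

```python
# Rough illustration of the local-multiplier arithmetic described above.
# The 45% and 13% first-round retention rates are from the Civic Economics study;
# the assumption that the same rate applies to every later round is hypothetical.

def cumulative_local_activity(initial_spend: float, retention: float) -> float:
    """Geometric series: spend * (r + r**2 + r**3 + ...) = spend * r / (1 - r)."""
    return initial_spend * retention / (1.0 - retention)

for label, retention in [("local independents", 0.45), ("typical chain store", 0.13)]:
    first_round = 100 * retention
    total = cumulative_local_activity(100, retention)
    print(f"{label:19s}: ${first_round:5.2f} stays local in the first round, "
          f"roughly ${total:5.2f} in total if the rate repeats")
```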


Although the study was part-funded by BookPeople and Waterloo, it gave a boost to the growing “buy local” movement in America. For years business and community leaders have been full of reasons for people to do their shopping close to home. They say that local and independent businesses have more individual character, and that they are owned by your friends and neighbours. Some stores, particularly grocers, point out that it takes much less carbon to haul a truck from a few towns over than from halfway across the country.

At the moment, the economic argument has special traction. Dan Houston, a partner at Civic Economics, says that in recent studies he has found that locally-owned businesses put about twice as much money back into the community as the chains do, not three times, as the Austin study found. But that is still enough of a “local multiplier” to catch people’s attention. Stacy Mitchell of the Institute for Local Self-Reliance in Portland, Maine, reckons that some 30,000 local independents have joined about 130 independent business alliances around the nation.

Big companies are taking note that customers are rooting for the home team. Ms Mitchell points to a telling development in Seattle, Washington, where Starbucks got its start. On July 24th the company opened a new coffee shop there. The newcomer is not called a Starbucks; it is called “15th Ave. Coffee & Tea”. It promises “a deep connection to the local community,” and its seats are recycled from a local theatre.

There is an insular element to the trend. “Is it pure local protectionism? Sure, to some extent it is,” says Mr Houston. But the advocates are not zealots. One national campaign is asking people to shift a mere 10% of their spending to local outfits.

The Borders project in Austin eventually fell through, and the proposed site is now occupied by the flagship of Whole Foods Market. The chain was founded in Austin and is local in a sense, although it is now publicly traded. Throughout the shop, produce advertises its credentials: local, organic, fairly traded, made in-house, vegan, and so on. This week its customers faced an ethical dilemma: is it better to buy the organic watermelon from California, or the conventionally-grown kind from Lexington, Texas?

Thursday, August 6, 2009

Water Scarcity Looms

Worldwatch Institute
by Gary Gardner/ August 6, 2009

Water scarcity grows in urgency in many regions as population growth, climate change, pollution, lack of investment, and management failures restrict the amount of water available relative to demand. The Stockholm International Water Institute calculated in 2008 that 1.4 billion people live in "closed basins," regions where existing water cannot meet the agricultural, industrial, municipal, and environmental needs of all.1 Their estimate is consistent with a 2007 Food and Agriculture Organization (FAO) calculation that 1.2 billion people live in countries and regions that are water-scarce.2 And the situation is projected to worsen rapidly: FAO estimates that the number of water-scarce people will rise to 1.8 billion by 2025, particularly as population growth pushes many countries and regions into the scarcity column.3

"Water scarcity" has several meanings. Physical water scarcity exists wherever available water is insufficient to meet demand: parts of the southwestern United States, northern Mexico, North Africa, the Middle East, Central Asia, northern China, and southeastern Australia are characterized by physical water scarcity.4 Economic water scarcity occurs when water is available but inaccessible because of a lack of investment in water provision or poor management and regulation of water resources. Much of the water scarcity of sub-Saharan Africa falls into this category. 5

Signs of scarcity are plentiful. Several major rivers, including the Indus, Rio Grande, Colorado, Murray-Darling, and Yellow, no longer reach the sea year-round as a growing share of their water is claimed for various uses.6 Water tables are falling as groundwater is overpumped in South Asia, northern China, the Middle East, North Africa, and the southwestern United States, often propping up food production unsustainably.7 The World Bank estimates that some 15 percent of India's food, for example, is produced using water from nonrenewable aquifers.8 Another sign of scarcity is that desalination, a limited and expensive water supply solution, is on the rise.

Water scarcity has many causes. Population growth is a major driver at the regional and global levels, but other factors play a large role locally. Pollution reduces the amount of usable water available to farmers, industry, and cities. The World Bank and the government of China have estimated, for instance, that 54 percent of the water in seven main rivers in China is unusable because of pollution.9 In addition, urbanization tends to increase domestic and industrial demand for water, as do rising incomes, two trends prominent in rapidly developing countries such as China, India, and Brazil.10

In some cases, water scarcity leads to greater dependence on foreign sources of water. A country's "water footprint," the volume of water used to produce the goods and services, including imports, that people consume, can be used to demonstrate this.11 The ratio between the water footprint of a country's imports and its total water footprint yields its water import dependence. The higher the ratio, the more a country depends on outside water resources. In the Netherlands, for example, imported goods and services account for 82 percent of the country's total water footprint.12 (See Table 1.)
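
As a minimal sketch of that ratio, the snippet below reproduces the Netherlands figure quoted above; the absolute footprint volumes are invented placeholders chosen only to yield that result, not real footprint data.

```python
# Water import dependence = (water footprint of imports) / (total water footprint).
# The 82% result matches the Netherlands example in the text; the absolute
# volumes are made-up placeholders, not measured footprint data.

def water_import_dependence(import_footprint: float, total_footprint: float) -> float:
    """Share of a country's total water footprint embodied in imported goods and services."""
    return import_footprint / total_footprint

print(f"{water_import_dependence(82.0, 100.0):.0%}")  # -> 82%
```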

A looming new threat to water supplies is climate change, which is causing rainfall patterns to shift, ice stocks to melt, and soil moisture content and runoff to change.13 According to the Intergovernmental Panel on Climate Change, the area of our planet classified as "very dry" has more than doubled since the 1970s, and the volume of glaciers in many regions and of snowpack in northern hemisphere mountains, two important freshwater sources, has decreased significantly.14

Climate change is expected to have a net negative impact on water scarcity globally this century. By the 2050s, the area subject to greater water stress due to climate change will be twice as large as the area experiencing decreased water stress.15 Less rainfall is expected in already arid areas, including the Mediterranean Basin, western United States, southern Africa, and northeastern Brazil, where various models all indicate that runoff will decrease by 10-30 percent in the coming decades.16 And loss of snowpack will remove a natural, off-season water reservoir in many regions: by the 2020s, for example, 41 percent of the water supply to the populous southern California region is likely to be vulnerable to warming as some of the Sierra Nevada and Colorado River basin snowpacks disappear.17

Policymakers look to a variety of solutions to address water scarcity. Desalination is increasingly feasible for small-scale water supply, as technological advances reduce costs. This involves removing most salt from salt water, typically by passing it through a series of membranes. Global desalination capacity doubled between 1995 and 2006, and according to some business forecasts it could double again by 2016.18 But production is tiny: global capacity in 2005 was some 55.4 million cubic meters, barely 0.003 percent of the world's municipal and industrial water consumption, largely because desalination remains an energy-intensive and expensive option.19 Not surprisingly, 47 percent of global capacity in 2006 was in the Middle East, where the need is great and energy is cheap.20 In addition, the technology is plagued by environmental concerns, especially disposal of salt concentrates.

Another limited solution to scarcity involves accounting for "virtual water," the water used to produce a crop or product, when designing trade policy. Nations conserve their own water if they import products having a large virtual water component, such as foodstuffs, rather than producing them domestically. Imports to Jordan, for instance, including wheat and rice from the United States, have a virtual water content of some 5-7 billion cubic meters per year compared with domestic water use of some 1 billion cubic meters per year.21 Jordan's import policy yields enormous water savings for the country, although it also increases its food dependency. The bulk of North and South America, Australia, Asia, and Central Africa are net exporters of virtual water.22 Most of Europe, Japan, North and South Africa, the Middle East, Mexico, and Indonesia, in contrast, are net importers of virtual water.23
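
A quick back-of-the-envelope reading of the Jordan example, using the article's round numbers (the midpoint of the 5-7 range is my assumption), shows how large the savings are relative to domestic withdrawals:

```python
# Jordan's virtual-water imports vs. domestic water use, using the article's figures.
# The 6.0 figure is simply the midpoint of the quoted 5-7 range.

virtual_water_imports = 6.0   # billion cubic meters per year, embodied in imported food
domestic_water_use = 1.0      # billion cubic meters per year

multiple = virtual_water_imports / domestic_water_use
print(f"Growing those imports at home would take roughly {multiple:.0f} times "
      f"Jordan's current domestic water use.")
```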

Other solutions focus on structural shifts in water use, including growing crops that are less water-intensive, changing dietary patterns to reduce meat consumption, and shifting to renewable sources of energy. Diets heavy in livestock products, for example, are water-intensive because of the huge quantities of water required for livestock production.24 (See Table 2.) Similarly, fossil fuel production requires many times more water than renewable energy sources do. 25 (See Table 3.)

Complete trends will be available with full endnote referencing, Excel spreadsheets, and customizable presentation-ready charts as part of our new subscription service, Vital Signs Online, slated to launch this fall.

http://www.worldwatch.org/node/6213?emc=el&m=279787&l=4&v=7c1e33b23a

Wednesday, August 5, 2009

LEDs Are As Energy Efficient as Compact Fluorescents

NYT August 4, 2009, 12:01 am

By Eric A. Taub

While there’s no question that LED lamps use a fraction of the energy of a standard incandescent bulb to produce the same amount of light, several Bits readers have pointed out that that’s only half the story.

If the energy used to create and dispose of the LED lamp is more than that for a comparable standard bulb, then all of the proclaimed energy savings to produce light are for naught.

Until recently, no one knew if that was the case. In March, a preliminary study reported by Carnegie Mellon indicated that LED lamps were more energy efficient throughout their life, but the researchers pointed out that not every aspect of the production process was taken into account.

A new study released on Tuesday by Osram, the German lighting giant, claims to have confirmed the efficiency findings.

Conducted by the Siemens Corporate Technology Centre for Eco Innovations (Siemens is the parent of Osram and Sylvania), the report examines the energy needed to create and power an LED lamp. Even the energy needed to ship a lamp from the factory in China to an installation in Europe was taken into account.

The study used a 25,000-hour LED lamp life as a constant, comparing the energy the LED needs over that span with the energy used by twenty-five 1,000-hour incandescent bulbs and two and a half 10,000-hour compact fluorescents.

The findings, according to a summary of the study: today’s LED lamps are essentially as energy efficient as compact fluorescents, in the amount of energy needed to create, recycle and provide light. Osram said it expected those numbers to improve as LEDs become more energy efficient.
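
A simplified sketch of that kind of life-cycle comparison is below. The 25,000-hour service period and the 25/2.5/1 lamp counts follow the study's setup; the wattages and per-lamp manufacturing energies are rough assumptions for a roughly 60-watt-equivalent light output, not Osram's figures. The point it illustrates is that the use phase dominates, so even a much larger manufacturing energy barely changes the total.

```python
# Illustrative life-cycle energy comparison over a 25,000-hour service period.
# Lamp counts mirror the Osram study's setup; wattages and manufacturing
# energies are assumed placeholder values, not numbers from the study.

SERVICE_HOURS = 25_000

lamps = {
    "incandescent": dict(watts=60, rated_hours=1_000,  mfg_kwh_per_lamp=0.3),
    "cfl":          dict(watts=13, rated_hours=10_000, mfg_kwh_per_lamp=1.5),
    "led":          dict(watts=9,  rated_hours=25_000, mfg_kwh_per_lamp=10.0),
}

for name, lamp in lamps.items():
    lamps_needed = SERVICE_HOURS / lamp["rated_hours"]          # 25, 2.5, 1
    use_kwh = lamp["watts"] * SERVICE_HOURS / 1_000             # energy consumed while lit
    mfg_kwh = lamps_needed * lamp["mfg_kwh_per_lamp"]           # embodied energy of the lamps
    print(f"{name:12s}: {lamps_needed:4.1f} lamps, {use_kwh:6.0f} kWh in use, "
          f"{mfg_kwh:5.1f} kWh to make, {use_kwh + mfg_kwh:7.1f} kWh total")
```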

The company issued no in-depth information to support its claims. It said that confirming data will be released this fall, after review by three independent analysts.

But assuming the numbers hold, this total Life Cycle Assessment should put to rest any lingering doubts about the overall “greenness” of LEDs.

Copyright 2009 The New York Times Company

Monday, August 3, 2009

Solar Industry: No Breakthroughs Needed

Monday, August 03, 2009

The solar industry says incremental advances have made transformational technologies unnecessary.
By Kevin Bullis

The federal government is behind the times when it comes to making decisions about advancing the solar industry, according to several solar-industry experts. This has led, they argue, to a misplaced emphasis on research into futuristic new technologies, rather than support for scaling up existing ones. That was the prevailing opinion at a symposium last week put together by the National Academies in Washington, DC, on the topic of scaling up the solar industry.

The meeting was attended by numerous experts from the photovoltaic industry and academia. And many complained that the emphasis on finding new technologies is misplaced. "This is such a fast-moving field," said Ken Zweibel, director of the Solar Institute at George Washington University. "To some degree, we're fighting the last war. We're answering the questions from 5, 10, 15 years ago in a world where things have really changed."

In the past year, the federal government has announced new investments in research into "transformational" solar technologies that represent radical departures from existing crystalline-silicon or thin-film technologies that are already on the market. The investments include new energy-research centers sponsored by the Department of Energy and a new agency called ARPA-Energy, modeled after the Defense Advanced Research Projects Agency. Such investments are prompted by the fact that conventional solar technologies have historically produced electricity that's far more expensive than electricity from fossil fuels.

In fact, Energy Secretary Steven Chu has said that a breakthrough is needed for photovoltaic technology to make a significant contribution to reducing greenhouse gases. Researchers are exploring solar cells that use very cheap materials or even novel physics that could dramatically increase efficiency, which could bring down costs.

But industry experts at the Washington symposium argued that new technologies will take decades to come to market, judging from how long commercialization of other solar technologies has taken. Meanwhile, says Zweibel, conventional technologies "have made the kind of progress that we were hoping futuristic technologies could make." For example, researchers have sought to bring the cost of solar power to under $1 per watt, and as of the first quarter of this year one company, First Solar, has done this.

These cost reductions have made solar power cheaper than the natural-gas-powered plants used to produce extra electricity to meet demand on hot summer days. With subsidies, which Zweibel argues are justified because of the "externalities" of other power sources, such as the cost from pollution, solar can be competitive with conventional electricity even outside peak demand times, at least in California. And projected cost decreases will make solar competitive with current electricity prices in more areas, even without subsidies.
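
One way to see how a per-watt price feeds into competitiveness is a simple levelized-cost-of-electricity calculation. The sketch below uses generic, assumed inputs (installed system cost, capacity factor, discount rate, O&M), not numbers cited at the symposium.

```python
# Minimal levelized-cost-of-electricity (LCOE) sketch. All inputs are
# illustrative assumptions; they are not figures from the article.

def lcoe_dollars_per_kwh(installed_cost_per_watt: float, capacity_factor: float,
                         lifetime_years: int = 25, discount_rate: float = 0.07,
                         annual_om_fraction: float = 0.01) -> float:
    # Capital recovery factor spreads the up-front cost over the system's lifetime.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years /
           ((1 + discount_rate) ** lifetime_years - 1))
    annual_cost_per_watt = installed_cost_per_watt * (crf + annual_om_fraction)
    annual_kwh_per_watt = capacity_factor * 8760 / 1000   # hours in a year, watts -> kWh
    return annual_cost_per_watt / annual_kwh_per_watt

# e.g. a hypothetical $3-per-watt installed system in a sunny location (20% capacity factor)
print(f"${lcoe_dollars_per_kwh(3.0, 0.20):.3f} per kWh")  # roughly $0.16/kWh under these assumptions
```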

Representatives of the solar industry say the federal government should do more to remove obstacles that are slowing the industry's development. One issue is financing for new solar installations, which can be much more expensive if lending institutions deem them high risk. A recent extension of federal tax credits and grants for solar investments is a step in the right direction, many solar experts say. But more could be done. A price on carbon would help make solar more economically competitive and more attractive to lenders.

Friday, July 31, 2009

A New Approach to Fusion

Technology Review Friday, July 31, 2009

A startup snags funding to start early work on a low-budget test reactor.
By Tyler Hamilton

General Fusion, a startup in Vancouver, Canada, says it can build a prototype fusion power plant within the next decade and do it for less than a billion dollars. So far, it has raised $13.5 million from public and private investors to help kick-start its ambitious effort.

Unlike the $14 billion ITER project under way in France, General Fusion's approach doesn't rely on the expensive superconducting magnets that tokamak reactors use to contain the superheated plasma necessary to achieve and sustain a fusion reaction. Nor does the company require powerful lasers, such as those within the National Ignition Facility at Lawrence Livermore National Laboratory, to confine a plasma target and compress it to extreme temperatures until fusion occurs.

Instead, General Fusion says it can achieve "net gain"--that is, create a fusion reaction that gives off more energy than is needed to trigger it--using relatively low-tech, mechanical brute force and advanced digital control technologies that scientists could only dream of 30 years ago.

It may seem implausible, but some top U.S. fusion experts say General Fusion's approach, which is a variation on what the industry calls magnetized target fusion, is scientifically sound and could actually work. It's a long shot, they say, but well worth a try.

"I'm rooting for them," says Ken Fowler, professor emeritus of nuclear engineering and plasma physics at the University of California, Berkeley, and a leading authority on fusion-reactor designs. He's analyzed the approach and found no technical showstoppers. "Maybe these guys can do it. It's really luck of the draw."

The prototype reactor will be composed of a metal sphere about three meters in diameter containing a liquid mixture of lithium and lead. The liquid is spun to create a vortex inside the sphere that forms a vertical cavity in the middle. At this point, two donut-shaped plasma rings held together by self-generated magnetic fields, called spheromaks, are injected into the cavity from the top and bottom of the sphere and come together to create a target in the center. "Think about it as blowing smoke rings at each other," says Doug Richardson, chief executive of General Fusion.

On the outside of the metal sphere are 220 pneumatically controlled pistons, each programmed to simultaneously ram the surface of the sphere at 100 meters a second. The force of the pistons sends an acoustic wave through the lead-lithium mixture, and that accelerates into a shock wave as it reaches the plasma, which is made of the hydrogen isotopes deuterium and tritium.

If everything works as planned, the plasma will compress instantly and the isotopes will fuse into helium, releasing a burst of energy-packed neutrons that are captured by the lead-lithium liquid. The rapid heat buildup in the liquid will be extracted through a heat exchanger, with half used to create steam that spins a turbine for power generation, and the rest used to recharge the pistons for the next "shot."

The ultimate goal is to inject a new plasma target and fire the pistons every second, creating pulses of fusion reactions as part of a self-sustaining process. This contrasts with ITER, which aims to create a single fusion reaction that can sustain itself. "One of the big risks to the project is nobody has compressed spheromaks to fusion-relevant conditions before," says Richardson. "There's no reason why it won't work, but nobody has ever proven it."
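
For a sense of scale, here is a back-of-the-envelope sketch of the mechanical input side of the scheme. The piston count, impact speed, and one-shot-per-second goal come from the article; the piston mass is an invented assumption, so the resulting numbers are purely illustrative.

```python
# Back-of-the-envelope mechanical input for General Fusion's piston scheme.
# 220 pistons at 100 m/s and a 1 Hz shot rate are from the article; the
# piston mass is a made-up assumption for illustration only.

NUM_PISTONS = 220
IMPACT_SPEED_M_S = 100.0
ASSUMED_PISTON_MASS_KG = 100.0   # hypothetical
SHOTS_PER_SECOND = 1.0

energy_per_shot_j = NUM_PISTONS * 0.5 * ASSUMED_PISTON_MASS_KG * IMPACT_SPEED_M_S ** 2
average_input_power_mw = energy_per_shot_j * SHOTS_PER_SECOND / 1e6

print(f"Kinetic energy delivered per shot: {energy_per_shot_j / 1e6:.0f} MJ")
print(f"Average mechanical input power at 1 Hz: {average_input_power_mw:.0f} MW")
# "Net gain" requires each shot's fusion yield to exceed the energy spent driving it,
# with part of the recovered heat earmarked for recharging the pistons.
```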

He says it took longer than expected to raise the money for the prototype project, but the company can now start the first phase of building the test reactor, including the development of 3-D simulations and the technical verification of components. General Fusion aims to complete the reactor and demonstrate net gain within five years, assuming it can raise another $37 million.

If successful, the company believes it can build a grid-capable fusion reactor rated at 100 megawatts four years later for about $500 million, beating ITER by about 20 years and at a fraction of the cost.

"I usually pass up these quirky ideas that pass my way, but this one really fascinated me," says Fowler. He notes that there are immense challenges to overcome, but the culture of a private startup may be what it takes to tackle them with a sense of urgency. "In the big programs, especially the fusion ones, people have gotten beat up so much that they've become so risk averse."

General Fusion's basic approach isn't entirely new. It builds on work done during the 1980s by the U.S. Naval Research Laboratory, based on a concept called Linus. The problem was that scientists couldn't figure out a fast-enough way to compress the plasma before it lost its donut-shaped magnetic confinement, a window of opportunity measured in milliseconds. Just like smoke rings, the plasma rings maintain their shape only momentarily before dispersing.

Nuclear-research giant General Atomics later came up with the idea of rapidly compressing the plasma using a mechanical ramming process that creates acoustic waves. But the company never followed through--likely because the technology to precisely control the speed and simultaneous triggering of the compressed-air pistons simply didn't exist two decades ago.

Richardson says that high-speed digital processing is readily available today, and General Fusion's mission over the next two to four years is to prove it can do the job. Before building a fully functional reactor with 220 pistons on a metal sphere, the company will first verify that smaller rings of 24 pistons can be synchronized to strike an outer metal shell.

Glen Wurden, program manager of fusion energy sciences at Los Alamos National Laboratory and an expert on magnetized target fusion, says General Fusion has a challenging road ahead and many questions to answer definitively. Can they produce spheromaks with the right densities, temperature, and life span? Can they inject two spheromaks into opposite ends of the vortex cavity and make sure they collide and merge? Will the acoustic waves travel uniformly through the liquid metal?

"You can do a good amount of it through simulations, but not all of it," says Wurden. "This is all very complex, state-of-the-art work. The problem is you're dealing with different timescales and different effects on materials when they're exposed to shock waves."

Los Alamos and General Fusion are collaborating as part of a recently signed research agreement. But Richardson isn't planning on a smooth ride. "The project has many risks," he says, "and we expect most of it to not perform exactly as expected." However, if the company can pull off its test reactor, it hopes to attract enough attention to easily raise the $500 million for a demonstration power plant.

Says Fowler, "Miracles do happen."

Copyright Technology Review 2009.