
Monday, November 16, 2009

Vulcan Logic and the Missing Sink

By: David Richardson

The daily carbon dioxide emissions report probably doesn't come up very often at America's dining room tables, but Kevin Gurney and researchers from the Vulcan Project hope to soon see that change.

Gurney leads Purdue University's Vulcan Project, which has produced the nation's first county-by-county, hour-by-hour snapshot of CO2 emissions. With Vulcan — named after the Roman god of fire and funded by the federal government through the North American Carbon Program — it is possible to peer, literally, into your own backyard (or your neighbor's) to see how your local area is contributing to the buildup of CO2 that drives global warming.

A little goes a long way
The gases nitrogen and oxygen account for about 99 percent of the Earth's atmosphere. As one of the alphabet soup of trace gases naturally present in the air, CO2 accounts for a mere 0.038 percent of the atmosphere by volume; however, it has a disproportionate impact on warming.

As Gurney, the associate director of the Climate Change Research Center at Purdue University, explained, "It comes down to how we look at the atmosphere. If we look at it in terms of mass — just the weight of things — then CO2 is very small, but if we think about it in terms of the amount of infrared radiation it would trap, then CO2 would look huge."

Although the oceans remove one-third to one-half of the carbon dioxide produced each year, "the bad news is, their ability to do so appears to be diminishing," Gurney said. Meanwhile, because CO2 can linger in the atmosphere for on the order of a thousand years, the gas already emitted will keep levels elevated long after emissions stop.

Nevertheless, given what is known about carbon dioxide sources and sinks (locales or processes that remove CO2 from the atmosphere), scientists say CO2 has not accumulated as fast as one might expect, leading to the counterintuitive question, "Why is the atmosphere not more polluted?"

Gurney and Daniel Mendoza, a graduate student with the Vulcan Project, suspect the answer may lie in an unrecognized carbon sink somewhere on the planet, and in light of the diminishing CO2 capacity of the seas, they believe this "missing sink" will play an important role in the carbon policy equation.

Though Gurney said CO2 concentration varies by no more than a few parts per million from pole to pole, he believes that mapping these barely perceptible peaks and valleys in CO2 levels could lead the way to the missing carbon sinks.

"In effect the weather map is a great analogy," he said. "Most people know that the air has temperature everywhere, but we know that in some places it's a bit higher and in some places it's a bit lower just like a weather map shows. CO2 is kind of like that. It's everywhere but it definitely has little bumps and valleys depending upon what's happening at the surface.

"The absence of the gas, where it would otherwise be expected, would indicate the possible location of the missing sinks."

To find the hidden sink, Gurney explained, research must first pin down where the observed carbon dioxide in the atmosphere originates. But that is not easy. CO2 mixes thoroughly with air as it travels over the planet. "You can find CO2 at the South Pole that was generated by industrial activity in the Northern Hemisphere."

Nonetheless, Gurney said, "We've known at the national level how much is coming from the U.S. as a whole," but prior to the Vulcan Project, the best estimates of emissions at the state level were based on fuel sales figures and shipping records.

Drilling down to the local level, the math gets even fuzzier. Gurney says the former "gold standard" of CO2 emissions estimates merely apportioned total discharges among jurisdictions on the basis of population density. Noting the obvious flaw in that approach, he points out that major CO2 emitters such as interstate highways and power plants, which alone account for 40 percent of U.S. emissions, are often a considerable distance from the population centers they serve.
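
For illustration only, here is a minimal sketch of that population-based apportionment; the national total and state populations below are made-up placeholders, not Vulcan or government figures.

```python
# Hypothetical illustration of population-based downscaling of a national
# CO2 total. All numbers are placeholders, not Vulcan or EPA figures.
national_total_mt = 6000.0  # assumed national CO2 emissions, in megatons

population = {"State A": 37_000_000, "State B": 6_500_000, "State C": 600_000}
total_pop = sum(population.values())

for state, pop in population.items():
    share = pop / total_pop
    print(f"{state}: {share * national_total_mt:,.0f} Mt CO2 ({share:.1%} of population)")

# A sparsely populated state hosting a large coal plant gets almost nothing
# under this scheme, which is the distortion Vulcan avoids by working from
# facility-level data rather than population weights.
```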

To get a more accurate view of emissions sources, Gurney turned to the U.S. Environmental Protection Agency and its network of local air pollution monitors. "Emission monitoring has gone on for 40 years — since the 1960s — which is an amazing legacy of information and infrastructure built and perfected over four decades," he explained. "In fact it is so good that we almost take it for granted now."

Reverse engineering
Although the EPA monitoring system was never intended to record CO2 levels, Gurney said the monitors' reports can be reverse-engineered to quantify the CO2 component of the exhaust that produced the pollution, "provided you know the type of device and fuel."
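
As a rough sketch of what that reverse engineering involves, the toy function below assumes one emission factor tying the monitored pollutant to the amount of fuel burned in a known device, and a second factor tying that fuel to CO2; the factors and numbers are illustrative assumptions, not EPA values.

```python
# Hypothetical sketch of backing CO2 out of a criteria-pollutant report.
# Both emission factors are made-up placeholders for a known device/fuel pair.
def co2_from_pollutant_report(pollutant_kg, pollutant_kg_per_gj, co2_kg_per_gj):
    """Estimate CO2 implied by a reported pollutant mass.

    pollutant_kg        reported mass of the monitored pollutant (e.g. NOx)
    pollutant_kg_per_gj assumed kg of that pollutant per GJ of fuel burned
    co2_kg_per_gj       assumed kg of CO2 per GJ of the same fuel
    """
    fuel_energy_gj = pollutant_kg / pollutant_kg_per_gj  # back out fuel burned
    return fuel_energy_gj * co2_kg_per_gj                # scale up to CO2

# Example: a boiler reporting 500 kg of NOx, under assumed factors.
print(co2_from_pollutant_report(500, pollutant_kg_per_gj=0.2, co2_kg_per_gj=95.0))
# -> 237500.0 kg of CO2
```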

Mendoza is using data recycled from transportation studies, readings picked up by "the thin wires that you can see that run across the road."

"They're the Federal Highway Administration 's weight and motion sensors," he said. "They classify vehicles as either light duty or heavier duty" as they pass over the counter. Mendoza says from this and similar data, Vulcan can generate CO2 emissions figures along major roads.

Although Vulcan collects no original field data, Mendoza says an incredible amount of detail can be coaxed from archival sources. Looking back to the year 2002, he says, Vulcan provides a sector-by-sector breakdown of CO2 emissions from the power plant, residential, commercial and cement sectors. "We can bring it down to a 10-kilometer-by-10-kilometer grid and provide a temporal pattern for most sectors," he said.
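
The gridding step can be pictured as spreading an annual sector total across grid cells and then across the hours of the day. The toy example below uses an invented 3-by-3 grid of spatial weights and a flat hourly profile; Vulcan's real spatial surrogates and temporal profiles are far more detailed.

```python
import numpy as np

# Toy example of gridding one sector's annual CO2 total onto cells and hours.
# The spatial weights and hourly profile are invented for illustration.
annual_mt = 120.0                             # assumed annual total, megatons
spatial_weights = np.array([[0.05, 0.10, 0.05],
                            [0.10, 0.40, 0.10],
                            [0.05, 0.10, 0.05]])  # sums to 1 over a 3x3 grid
hourly_profile = np.ones(24) / 24.0           # flat; a real profile peaks with demand

# emissions[row, col, hour] for an average day
emissions = (annual_mt / 365.0) * spatial_weights[:, :, None] * hourly_profile[None, None, :]
print(emissions.shape, round(emissions.sum() * 365.0, 1))  # (3, 3, 24) 120.0
```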

"We were a little bit floored by how much emissions actually come out of very unpopulated areas — a lot of that is due to electricity generation, which is such a great part of the economy here," Mendoza noted.

He said Vulcan delivered an additional surprise, revealing large cities to be "much more" CO2 efficient than smaller communities. Aside from the efficiency of urban mass transit, Mendoza believes lifestyle plays a role in the disparity. "Here in rural Indiana, we have large houses out in the middle of two acres, so the heating costs are much larger, but in the city you have the urban heat island effect keeping costs down."

The Vulcan Project Web page offers animated displays that show local CO2 emissions for the year 2002 ramping up and down in response to heating and cooling needs, traffic patterns and other cycles of daily American life.

International appeal
Gurney said U.S. government officials asked him if Vulcan can be used for "verification purposes" for an eventual climate treaty. It's an idea he finds appealing: "It would certainly be better to have an independent scientific body perform that function than a government."

He believes the average citizen can benefit from a dialogue with Vulcan as well. (You can take a look immediately using Google Earth.)

"I was pretty amazed at how interested people were when we released the maps and the movies," he said. "It's always been difficult to communicate the essence of this problem because it's very abstract in a certain way. One of the things that Vulcan has done is start to make this problem a bit more real and at least make it recognizable to people's lives.

"It brings the discussion down to the human scale, to the scale people live, in their state, their county, in some cases, their city. It brings it into their living rooms."

Mendoza said his next tasks will include adding data from Mexico and Canada so the Vulcan grid covers the entire continent.

Gurney, who predicts Vulcan eventually will produce emissions forecasts three months in advance, said he has received funding for a global version of the project and is currently exploring partnership opportunities with candidates in Europe, Asia and South America.

Despite the detour down the public-awareness road, Gurney says he has not wavered from his initial quest to discover the missing sink. However, with a better grasp on where the CO2 originates, he says it will take direct observation, on a global scale, to determine where it might be going, and that job, he said, is best performed from space.


Friday, July 31, 2009

A New Approach to Fusion

Technology Review, Friday, July 31, 2009

A startup snags funding to start early work on a low-budget test reactor.
By Tyler Hamilton

General Fusion, a startup in Vancouver, Canada, says it can build a prototype fusion power plant within the next decade and do it for less than a billion dollars. So far, it has raised $13.5 million from public and private investors to help kick-start its ambitious effort.

Unlike the $14 billion ITER project under way in France, General Fusion's approach doesn't rely on the expensive superconducting magnets used in tokamak reactors to contain the superheated plasma necessary to achieve and sustain a fusion reaction. Nor does the company require powerful lasers, such as those within the National Ignition Facility at Lawrence Livermore National Laboratory, to confine a plasma target and compress it to extreme temperatures until fusion occurs.

Instead, General Fusion says it can achieve "net gain"--that is, create a fusion reaction that gives off more energy than is needed to trigger it--using relatively low-tech, mechanical brute force and advanced digital control technologies that scientists could only dream of 30 years ago.

It may seem implausible, but some top U.S. fusion experts say General Fusion's approach, which is a variation on what the industry calls magnetized target fusion, is scientifically sound and could actually work. It's a long shot, they say, but well worth a try.

"I'm rooting for them," says Ken Fowler, professor emeritus of nuclear engineering and plasma physics at the University of California, Berkeley, and a leading authority on fusion-reactor designs. He's analyzed the approach and found no technical showstoppers. "Maybe these guys can do it. It's really luck of the draw."

The prototype reactor will be composed of a metal sphere about three meters in diameter containing a liquid mixture of lithium and lead. The liquid is spun to create a vortex inside the sphere that forms a vertical cavity in the middle. At this point, two donut-shaped plasma rings held together by self-generated magnetic fields, called spheromaks, are injected into the cavity from the top and bottom of the sphere and come together to create a target in the center. "Think about it as blowing smoke rings at each other," says Doug Richardson, chief executive of General Fusion.

On the outside of the metal sphere are 220 pneumatically controlled pistons, each programmed to simultaneously ram the surface of the sphere at 100 meters a second. The force of the pistons sends an acoustic wave through the lead-lithium mixture, and that accelerates into a shock wave as it reaches the plasma, which is made of the hydrogen isotopes deuterium and tritium.
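
For a sense of the scale of that mechanical brute force, here is a back-of-envelope kinetic-energy estimate; the article gives only the piston count and impact speed, so the 100-kilogram piston mass is purely an assumption.

```python
# Rough, illustrative estimate of the kinetic energy delivered per shot.
# Only the 220 pistons and 100 m/s speed come from the article; the piston
# mass is an assumed placeholder.
n_pistons = 220
speed_m_s = 100.0
assumed_piston_mass_kg = 100.0

energy_per_piston_j = 0.5 * assumed_piston_mass_kg * speed_m_s ** 2  # 0.5 MJ
total_energy_mj = n_pistons * energy_per_piston_j / 1e6
print(f"~{total_energy_mj:.0f} MJ of kinetic energy per shot")       # ~110 MJ
```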

If everything works as planned, the plasma will compress instantly and the isotopes will fuse into helium, releasing a burst of energy-packed neutrons that are captured by the lead-lithium liquid. The rapid heat buildup in the liquid will be extracted through a heat exchanger, with half used to create steam that spins a turbine for power generation, and the rest used to recharge the pistons for the next "shot."

The ultimate goal is to inject a new plasma target and fire the pistons every second, creating pulses of fusion reactions as part of a self-sustaining process. This contrasts with ITER, which aims to create a single fusion reaction that can sustain itself. "One of the big risks to the project is nobody has compressed spheromaks to fusion-relevant conditions before," says Richardson. "There's no reason why it won't work, but nobody has ever proven it."

He says it took longer than expected to raise the money for the prototype project, but the company can now start the first phase of building the test reactor, including the development of 3-D simulations and the technical verification of components. General Fusion aims to complete the reactor and demonstrate net gain within five years, assuming it can raise another $37 million.

If successful, the company believes it can build a grid-capable fusion reactor rated at 100 megawatts four years later for about $500 million, beating ITER by about 20 years and at a fraction of the cost.
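
Combined with the one-shot-per-second goal described above, the 100-megawatt figure implies a rough per-shot energy budget; the sketch below assumes the rating refers to net electrical output, which the article does not spell out.

```python
# Rough per-shot electricity implied by the stated targets; assumes the
# 100 MW rating is net electrical output and the plant fires once per second.
rated_power_w = 100e6      # 100 megawatts
shots_per_second = 1.0

energy_per_shot_mj = rated_power_w / shots_per_second / 1e6
print(f"~{energy_per_shot_mj:.0f} MJ of electricity per shot")  # ~100 MJ
```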

"I usually pass up these quirky ideas that pass my way, but this one really fascinated me," says Fowler. He notes that there are immense challenges to overcome, but the culture of a private startup may be what it takes to tackle them with a sense of urgency. "In the big programs, especially the fusion ones, people have gotten beat up so much that they've become so risk averse."

General Fusion's basic approach isn't entirely new. It builds on work done during the 1980s by the U.S. Naval Research Laboratory, based on a concept called Linus. The problem was that scientists couldn't figure out a fast-enough way to compress the plasma before it lost its donut-shaped magnetic confinement, a window of opportunity measured in milliseconds. Just like smoke rings, the plasma rings maintain their shape only momentarily before dispersing.

Nuclear-research giant General Atomics later came up with the idea of rapidly compressing the plasma using a mechanical ramming process that creates acoustic waves. But the company never followed through--likely because the technology to precisely control the speed and simultaneous triggering of the compressed-air pistons simply didn't exist two decades ago.

Richardson says that high-speed digital processing is readily available today, and General Fusion's mission over the next two to four years is to prove it can do the job. Before building a fully functional reactor with 220 pistons on a metal sphere, the company will first verify that smaller rings of 24 pistons can be synchronized to strike an outer metal shell.

Glen Wurden, program manager of fusion energy sciences at Los Alamos National Laboratory and an expert on magnetized target fusion, says General Fusion has a challenging road ahead and many questions to answer definitively. Can they produce spheromaks with the right densities, temperature, and life span? Can they inject two spheromaks into opposite ends of the vortex cavity and make sure they collide and merge? Will the acoustic waves travel uniformly through the liquid metal?

"You can do a good amount of it through simulations, but not all of it," says Wurden. "This is all very complex, state-of-the-art work. The problem is you're dealing with different timescales and different effects on materials when they're exposed to shock waves."

Los Alamos and General Fusion are collaborating as part of a recently signed research agreement. But Richardson isn't planning on a smooth ride. "The project has many risks," he says, "and we expect most of it to not perform exactly as expected." However, if the company can pull off its test reactor, it hopes to attract enough attention to easily raise the $500 million for a demonstration power plant.

Says Fowler, "Miracles do happen."

Copyright Technology Review 2009.

Tuesday, June 16, 2009

New US climate report dire, but offers hope

By SETH BORENSTEIN

WASHINGTON (AP) — Rising sea levels, sweltering temperatures, deeper droughts, and heavier downpours — global warming's serious effects are already here and getting worse, the Obama administration warned on Tuesday in the grimmest, most urgent language on climate change ever to come out of any White House.

But amid the warnings, scientists and government officials seemed to go out of their way to soften the message. It is still not too late to prevent some of the worst consequences, they said, by acting aggressively to reduce world emissions of heat-trapping gases, primarily carbon dioxide from the burning of fossil fuels.

The new report differs from a similar draft issued with little fanfare or context by George W. Bush's administration last year. It is paradoxically more dire about what's happening and more optimistic about what can be done.

The Obama administration is backing a bill in Congress that would limit heat-trapping pollution from power plants, refineries and factories. A key player on a climate bill in the Senate, California Democrat Barbara Boxer, said the report adds "urgency to the growing momentum in Congress" for passing a law.

"It's not too late to act," said Jane Lubchenco, one of several agency officials at a White House briefing. "Decisions made now will determine whether we get big changes or small ones." But what has happened already is not good, she said: "It's happening in our own backyards and it affects the kind of things people care about."

Lubchenco, a marine biologist, heads the National Oceanic and Atmospheric Administration.

In one of its key findings, the report warned: "Thresholds will be crossed, leading to large changes in climate and ecosystems." The survival of some species could be affected, it said.

The document, a climate status report required periodically by Congress, was a collaboration by about three dozen academic, government and institute scientists. It contains no new research, but it paints a fuller and darker picture of global warming in the United States than previous studies.

Bush was ultimately forced by a lawsuit to issue a draft report last year, and that document was the basis for this one. Obama science adviser John Holdren called the report nonpartisan, started by a Republican administration and finished by a Democratic one.

"The observed climate changes that we report are not opinions to be debated. They are facts to be dealt with," said one of the report's chief authors, Jerry Melillo of Marine Biological Lab in Woods Hole, Mass. "We can act now to avoid the worst impacts."

Among the things Melillo said he would like to avoid are more flooding disasters in New Orleans and an upheaval of the world's food supply.

The scientists softened the report from an earlier draft that said "tipping points have already been reached and have led to large changes." Melillo said that is because some of the changes seen so far are still reversible.

Even so, Tom Karl of the National Climatic Data Center said that at least one tipping point — irreversible sea level rise — has been passed.

A point of emphasis of the report, which is just under 200 pages, is what has already happened in the United States. That includes rapidly retreating glaciers in the American West and Alaska, altered stream flows, trouble with the water supply, health problems, changes in agriculture, and energy and transportation worries.

"There are in some cases already serious consequences," report co-author Anthony Janetos of the University of Maryland told The Associated Press. "This is not a theoretical thing that will happen 50 years from now. Things are happening now."

For example, winters in parts of the Midwest have warmed by 7 degrees in just 30 years and the frost-free period has grown a week, the report said.

Shorter winters have some benefits, such as longer growing seasons, but those are changes that require adjustments just the same, the authors note.

The "major disruptions" already taking place will only increase as warming continues, the authors wrote. The world's average temperature may rise by as much as 11.5 degrees by the end of the century, the report said. And the U.S. average temperature could go even higher than that, Karl said.

Environmental groups praised the report as a call for action, with the Union of Concerned Scientists calling it what "America needs to effectively respond to climate change."

Scott Segal, a Washington lobbyist for the coal industry, was more cautious: "Fast action without sufficient planning is a route to potential economic catastrophe with little environmental gain."

Associated Press Writer Dina Cappiello in Washington contributed to this report.

Friday, April 10, 2009

Calif. offset firm seeks to nationalize its reach

Debra Kahn – E&E News

April 6, 2009

SAN DIEGO -- By playing their cards right, offset firms -- which have until now marketed to a small niche of environmentally conscious businesses and consumers -- have a chance to turn their voluntary programs into huge opportunities if the mandatory House cap-and-trade bill passes.
At the "Navigating the American Carbon World" conference here, the mood was jubilant, thanks to the bill introduced last week by Reps. Henry Waxman (D-Calif.) and Ed Markey (D-Mass.). The draft legislation would permit covered industries to use offsets -- reductions in emissions not covered by the cap -- for up to 2 billion tons of greenhouse gas emissions per year, as a cost-lowering mechanism.
For the California Climate Action Registry, a state-created nonprofit that monitors voluntary emissions reporting, the news was especially timely. Less than 48 hours after the new House cap-and-trade bill dropped, CCAR announced it was changing its name from the California Climate Action Registry to the Climate Action Reserve. The nonprofit has shifted its focus from reporting to verifying offset projects in recent years, and wanted its name to reflect that shift, group President Gary Gero said.
The group is also pushing to have its projects' emissions reductions recognized as legitimate reductions under California and the West Coast's budding cap-and-trade system, by setting aside some allowances specifically for reductions taken before 2012.
"We are now well beyond the borders of California, and our focus has shifted from being a registry for emission inventories to focus on offsets and project reductions," he said. "What we have been saying now is our members should be recognized by the state of California as they develop their cap-and-trade program by the allocation of set-aside allowances from the cap. That is a fundamental promise that we need to make sure is kept."
Offsets are one of several thorny issues in any cap-and-trade system. Originally conceived as a way for environmentally minded businesses and individuals to support emissions-reducing efforts outside of their own actions, they have given rise to a national market that reached $254 million in 2007, or 42 megatons of CO2 equivalent, according to Ecosystems Marketplace. Whether the emissions reductions would have happened anyway, without the project's sponsorship, is notoriously difficult to prove, and various standards have emerged to give buyers a sense of authenticity.
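
Dividing those two figures gives a rough implied average price for voluntary offsets in 2007; this is a back-of-envelope illustration, not a statistic reported by Ecosystems Marketplace.

```python
# Implied average voluntary offset price from the 2007 figures cited above.
market_value_usd = 254_000_000
volume_t_co2e = 42_000_000        # 42 megatons of CO2 equivalent

print(f"~${market_value_usd / volume_t_co2e:.2f} per ton of CO2e")  # ~$6.05
```
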
Now advocates are trying to ensure a niche for offsets in a mandatory emissions program, arguing that they can be a cheaper way for businesses to meet their obligations. An alliance of businesses and environmental groups including General Motors, the Natural Resources Defense Council, Shell, BP, Duke Energy and the Nature Conservancy has been particularly influential. The Waxman-Markey discussion draft includes several of the U.S. Climate Action Partnership's recommendations, such as unlimited banking of credits, multi-year compliance periods and incentives for research into carbon capture and storage.
Gero said the timing of CCAR's rebranding, two days after the release of a major cap-and-trade bill, was coincidental. "It's just reflective of how fast things are moving," he said. "We had developed a strategic plan to do this about 6 months ago, and were planning to do it over the course of 12 to 18 months," he said. "But just as the world has taken off, our board said it's time for us to do this sooner."
Leading the pack in a complex, potentially lucrative business
It appears the Climate Action Registry is well-placed to take a leading role in offset verification if Waxman-Markey's offset provisions remain intact. According to the World Resources Institute, of the 17 existing voluntary standards for offsets, only two meet the bill's requirements: the Regional Greenhouse Gas Initiative and CCAR.
"Offsets will unquestionably play a role, and likely a significant one," said WRI Senior Associate Alexia Kelly.
"We're awfully proud of the way the bill recognizes us for the quality work we do," said Gero, who testified to Waxman's Energy and Commerce Committee last month on the environmental integrity of offsets and has answered technical questions for committee staffers. While the bill does not recognize CCAR by name, he noted, "we're absolutely thrilled and excited and proud that WRI would think we're one of the only two that qualify."
Gero compared the potential relationship between government and nonprofit offset verifiers to that of the North American Electric Reliability Corp. and the Federal Energy Regulatory Commission, which recognized NERC as an enforcer of mandatory reliability standards in 2006.
If the bill passes and EPA decides to give the Climate Action Reserve an official role verifying offset projects, the group will have its work cut out for it in terms of meeting demand. It's aiming to have 1.5 million tons of projects registered by this fall and 4-5 million tons by 2010 -- compared to the 1 billion tons per year that the bill gives domestic offsets. "That's still clearly not a billion, but we're not the only game in town, and we don't cover all sectors yet," Gero said.
Agriculture and forestry are ripe for expansion of offset projects, he said, as are ozone-depleting substances and gases with high global warming potential. "I don't know whether we can get a billion, but there's an awful lot of domestic opportunity," he said. "We're one little ice cube on the tip of the iceberg."
On the state level, California is under pressure to figure out how to reward businesses that have already taken action to reduce their emissions below business-as-usual levels. It's working on a list of "additional" actions that would qualify, but might not finish the process until the regional cap-and-trade program takes effect in 2012 -- or is superseded by a national system.
"It's unfortunate that we do have to go through a set procedure to do these things, and it is going to take time," said Michael Gibbs, California EPA's assistant secretary for climate change. "We could end up not finishing the early action rules until it's too late, and that's clearly not acceptable. Then, we're always faced with the question of how much to invest in what we're doing locally as a national program comes into existence, as well."
"The pressure for quick action is of course hugely intense," he said. "An entity that's going to have a regulatory compliance obligation decides to take action in advance and wants that action recognized and reported. It's the single largest pressure point in terms of trying to move quickly."

Wednesday, March 25, 2009

EPA Raises Heat on Emissions Debate

Ian Talley – Wall Street Journal
March 24, 2009

WASHINGTON -- The Environmental Protection Agency has sent the White House a proposed finding that carbon dioxide is a danger to public health, a step that could trigger a clampdown on emissions of greenhouse gases across a wide swath of the economy.

If approved by the White House Office of Management and Budget, the endangerment finding could clear the way for the EPA to use the Clean Air Act to control emissions of carbon dioxide and other greenhouse gases believed to contribute to climate change. In effect, the government would treat carbon dioxide as a pollutant. The EPA submitted the proposed rule to the White House on Friday, according to federal records published Monday.

Such a finding would raise pressure on Congress to enact a system that caps greenhouse gases -- which trap the sun's heat in the earth's atmosphere -- and creates a market for businesses to buy and sell the right to emit them, as President Barack Obama has proposed.

A White House representative said Monday that Mr. Obama's "strong preference is for Congress to pass energy security legislation that includes a cap on greenhouse-gas emissions. The Supreme Court ruled that the EPA must review whether greenhouse-gas emissions pose a threat to public health or welfare, and this is simply the next step in what will be a long process that engages stakeholders and the public."

The administration has proposed a cap-and-trade system that could raise $646 billion by 2019 through government auctions of emission allowances. Environmentalists want the administration to act on climate change before December, ahead of talks aimed at forging a successor to the Kyoto Protocol, the 1997 agreement that commits many industrialized countries to reducing their greenhouse-gas emissions.

EPA spokeswoman Cathy Milbourn declined to comment on the details of the endangerment proposal, saying it is "still [an] internal and deliberative" document. But in a move that indicated the potential scope of regulation, the agency earlier this month proposed a national system for reporting carbon-dioxide and other greenhouse-gas emissions by major emitters. The EPA has said about 13,000 facilities, accounting for about 85% to 90% of greenhouse gases emitted in the U.S., would be covered under the proposal.

Industry officials say it will still take months, possibly even years, for the administration to finalize rules for regulating greenhouse-gas emissions.

According to an internal document presented by the EPA to White House officials earlier this month, the EPA believes the health effects of elevated greenhouse-gas levels could cause "severe heat waves...with likely increases in mortality and morbidity, especially among the elderly, young and frail." The agency also said climate change caused by higher greenhouse-gas levels could result in more severe storms and more suffering related to "floods, storms, droughts and fires."

Business groups such as the U.S. Chamber of Commerce and the National Association of Manufacturers warn that if the EPA moves forward on regulation of CO2 under the Clean Air Act -- instead of a measured legislative approach -- it could hobble the already weak economy.

Coal-fired power plants, oil refineries and domestic industries, such as energy-intensive paper, cement, fertilizer, steel, and glass manufacturers, worry that increased cost burdens imposed by climate-change laws will put them at a severe competitive disadvantage to their international peers that aren't bound by similar environmental rules.

Environmentalists have called for the endangerment finding, and say action by Congress or the Obama administration to curb greenhouse gases is necessary to halt the ill effects of climate change.