Geoengineering Goes Mainstream

The MIT Technology Review — one of my favorite magazines —
writes about geoengineering in the January/February 2010 issue. Much of what is said in the article will be familiar to people who have read SuperFreakonomics, but it also talks about carbon capture, which we didn’t discuss much.

The more I have thought about these issues, the more I have become convinced that carbon capture is going to end up being the centerpiece of long-term geoengineering solutions. There are good reasons to be optimistic that in 50 to 100 years we will be able to remove carbon dioxide from the air for one-thousandth or one-millionth of the current cost. While that may seem fanciful at first blush, think about the rate of increase in computing speeds over the last 30 years, for instance. If carbon capture gets cheap and scalable, then the current focus on reducing carbon dioxide emissions (as opposed to keeping the temperature of the earth stable via geoengineering in the short run until carbon capture becomes routine) looks misguided. I suspect that with the more-or-less failure of Copenhagen (surprise, surprise), we will be hearing more and more about geoengineering.
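As a rough check on what that claim implies (my own arithmetic, not from the article): a 1,000-fold or 1,000,000-fold cost drop over 50 to 100 years corresponds to a specific compound annual rate of decline, which we can compare against the Moore's Law benchmark of roughly 30-40% per year for cost per transistor.

```python
# Back-of-the-envelope: what annual rate of cost decline is implied by a
# 1,000x or 1,000,000x reduction over 50 or 100 years?
# Illustrative arithmetic only; the factors and horizons come from the post.

def annual_decline(factor, years):
    """Annual fractional cost decline needed to cut costs by `factor` in `years`."""
    return 1 - (1 / factor) ** (1 / years)

for factor in (1_000, 1_000_000):
    for years in (50, 100):
        rate = annual_decline(factor, years)
        print(f"{factor:>9,}x cheaper in {years:>3} years -> {rate:.1%} per year")
```

A 1,000x drop in 100 years needs only about a 7% annual decline; even the most aggressive case (1,000,000x in 50 years) needs about 24% per year, within the historical Moore's Law range. Whether carbon capture has any reason to follow such a curve is, of course, the question several commenters raise below.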


Quite true. Yet another example of "I told you so" for a project relying on geopolitics.

Thanks for nothing, politicians. Now get out of the way and let us engineers get to work.


Isn't this just another argument for a carbon-credits type of market, though? Let the burden of investing in capture technologies rest on those who want the ability to emit, and if they pay for carbon capture, measure their emissions in net carbon, not gross emissions.

Dan M

I agree that carbon capture is worth pursuing, but what characteristics of this technology lead you to believe a Moore's Law-like cost/performance curve is likely? All technologies tend to improve when they move out of the lab, but "one-thousandth or one-millionth [of] the current costs" seems rather dramatic.

Joel Upchurch

I think that carbon capture from air is the ultimate solution, but doing it by mechanical means is a highly unlikely method. The problem is that CO2 is very diffuse. Any system to capture CO2 would have to cover hundreds of thousands of square kilometers to make a significant dent in CO2. No improvement in technology will get around this. Even corn exhausts the local CO2 quickly, and it has to be replenished by air circulation. Schemes to produce biofuels from algae need supplemental sources of CO2, such as a power plant, to produce at full efficiency.
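The diffuseness point is easy to quantify (my own rough numbers, using the ~390 ppm atmospheric CO2 level of the time and standard molar masses):

```python
# How diffuse is atmospheric CO2? A rough sanity check of the point above.
# Concentration and molar masses are standard values; the rest is
# illustrative arithmetic.

CO2_PPMV = 390e-6      # ~2010 atmospheric CO2, by volume
M_CO2 = 44.01          # g/mol
M_AIR = 28.97          # g/mol, mean molar mass of dry air
AIR_DENSITY = 1.2      # kg/m^3 near sea level

mass_fraction = CO2_PPMV * M_CO2 / M_AIR               # ~0.06% by mass
air_per_tonne = 1 / mass_fraction                      # tonnes of air per tonne of CO2
volume_per_tonne = air_per_tonne * 1000 / AIR_DENSITY  # m^3 of air per tonne of CO2

print(f"CO2 mass fraction of air: {mass_fraction:.3%}")
print(f"Air to process per tonne of CO2 captured: {air_per_tonne:,.0f} t "
      f"(~{volume_per_tonne/1e6:.1f} million m^3)")
```

Roughly 1,700 tonnes (over a million cubic meters) of air must pass through a capture system for every tonne of CO2 removed, which is why air circulation, not capture chemistry, tends to be the bottleneck.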

On the other hand, using techniques to introduce algae blooms in the ocean to sequester CO2 could cover millions of square kilometers of ocean with CO2-eating plants for little cost. The ocean already has natural dead zones where the blooms would have little effect on animal life. There have also been experiments with genetically engineered bacteria that might be cheaper and more efficient than algae.


Kevin H

Everyone always brings up computing power as the example of growth in efficiency, but the unfortunate fact is that computing power and computer memory are vast outliers. It would be much more sensible to base future growth rates of a non-computing technology on other factors such as the price of energy production, the price of building structures, or the price of producing chemicals. While I'm sure all of these have gone down in the last 50 years, I'm almost certain the rate is less than the growth in computing power.


Some carbon capture may be done to reduce levels to some pre-defined optimum. But once that is achieved, I would have thought it would be easier all round if the capture occurs as close as possible to the places where the carbon is vented, i.e., the factories. It is easier (and therefore cheaper) to prevent it being thrown out than to collect it after the fact.
Of course this depends on the distribution and size of the production sites. Access to the atmosphere is available anywhere, but access to carbon emitters may not be as straightforward.

I have always thought that the best way to dramatically reduce carbon levels would be to find some highly profitable use for the CO2. If it is a valuable resource, then no one is going to want to throw it away. The basis of any emissions trading scheme is to artificially create a market. How much better it would be if a true market existed.


I know a way we can profit from CO2. Release it into the air to warm the planet and fuel photosynthesis.

Elemental LED staff

How could reducing carbon dioxide emissions possibly be "misguided"? Regardless of the success of carbon capturing, reducing carbon emissions and developing cleaner fuel and power sources is still a necessity, both for the environment, and to address future shortages of coal and petroleum. If you mean it's misguided to focus on just one strategy for reducing carbon, that's probably true.


There need to be incentives to bring carbon capture out of the lab and into the atmosphere. Right now the incentives look to be a long way off.


Moore's law works well for microelectronics because the devices have steadily decreased in size so the amount of production cost/energy per unit has also decreased steadily. Removing carbon dioxide from the atmosphere is energy intensive for basic thermodynamic reasons that are not going to change regardless of what technology is used.

Stimulating algae blooms or planting real or artificial trees to remove atmospheric carbon dioxide are techniques that use solar energy and photosynthesis, but they may not be efficient enough to compensate for deforestation and urbanization. Injecting sulfur dioxide into the upper atmosphere to produce long-lasting aerosols that block a tiny fraction of sunlight would seem to be a cost-effective alternative to carbon dioxide capture or emissions limitations. We need more research on alternatives before making a very costly commitment to carbon capture/limitation.



Computer technology differs from carbon capture because computer technology is a perfect feedback loop; the better computers designed by generation n are immediately used to design generation n+1. I don't see how you could get that effect with carbon capture. And I also don't see that geoengineering or massively more effective carbon capture justifies profligate carbon emissions. Carbon emissions come from consuming scarce materials - we can't have our cake and eat it, too. Reducing unnecessary energy means that our kind of energy will be available for our great-grandchildren; if it reduces global warming, so much the better.


Though carbon sequestering may be a viable solution in the medium term, how long would we be able to sequester CO2 while outputs continue to rise every year?


Is this an "all-in" bet?

Monte Davis

Kevin @5: "...computing power and computer memory are vast outliers."

True, important, and too often ignored. Computing manipulates *patterns* that are independent of material substrate. Move those patterns from abacus beads or gearwheel cogs to relays, to vacuum tubes, to transistors, to qubits or electron spins, and the manipulation not only still works, it works better: faster, less energy, etc.

Most other engineering -- most other human activities --*have to* manipulate macroscopic amounts of matter and energy. If Intel could make ten thousand tiny perfect bulldozers or oil tankers or gas turbines on a wafer, they'd be useless. (Cf. Galileo on scaling).

There are meaningful similarities between computing and other *information* technologies: language, printing, broadcasting. But most speculation of the form "if technology X improved a la Moore's Law..." is bogus handwaving.

Of course, that makes it a good fit for much of _Super F_, aka _We May Be Wildly Wrong, But We're More Feistily Counterintuitive Than Ever!_. So perhaps the effect is intended.


Eromsnid Flor

Conversion of carbon dioxide from a gas to a liquid or solid is very expensive energetically. Unlike computer speeds, which theoretically could operate at the speed of light, there is no viable way to shortcut the energy barrier. Computers got faster because engineers removed barriers between actual and ideal, and that process continues.

Current technology for scrubbing nitric and sulfuric acids from coal plants was developed in the 80's, and installed in the 90's. There is nothing better, newer, or cheaper than what is being used right now. That means that there has been zero progress for 20-30 years in the cost of scrubbing acid gases.

It costs coal plants millions of dollars a year in ammonia, lime, catalyst, and reduced plant efficiency. If there was a cheaper way, and if any progress had been made, it would have been installed.

Idaho just permitted a new power plant and is requiring that the plant purchase carbon offsets until they figure out how to compress and pump the CO2 into the ground. There is no way that this plant can compete profitably with other states or other countries such as China or India.

In my most humble opinion, we should be investing in truly novel technology such as fusion and solar, instead of wasting valuable time and money trying to retrofit our coal plants.


Dr. Manak

I want to reiterate two of the other posters' excellent points. First, what makes you believe the cost of carbon capture will fall so quickly? Is this just a diffuse belief in human ingenuity, or is there some scalable technology that has a good chance of working? While the computer cost/performance curve scaled nicely, many other technologies have not scaled so well: we still travel at the same speeds we did in the 60s, most machines are not 100x more power efficient, we do not live 100x longer, etc. Second, if you believe in innovation so much, why not support an incentive for people to innovate, like increasing the cost of carbon? This is an economics blog, right?

Andrew Maynard

Great title Steven - here's my piece under the same banner from last April :-)

On carbon capture, I think we have the potential to develop cheap and sophisticated carbon management technologies. The thing that bothers me is - what's the driver? Future need doesn't necessarily work - how many technologies do we have now that were developed to address a need, rather than adopted to address a need? And if the latter is more prevalent (as I think it is), how do we ensure the technologies being developed are those that will be most useful?

Kevin Lehmann

Steve needs to learn some thermodynamics! There is a minimum possible work, called the reversible work, required to capture carbon dioxide from the atmosphere and concentrate it to the ~10 atm pressure required to pump it down a well. This is about 2 gigajoules per metric ton of carbon. The second law of thermodynamics assures that any real process will always require even more work (and thus energy input). Current processes are far less efficient, but a decrease of 1,000 times in cost would require violating the second law of thermodynamics. Even at this reversible-work efficiency, removing the CO2 that human activity has added to the atmosphere would require energy equivalent to about 10 years of total US energy consumption. If we ever get fusion power, this may be practical to do, but it will never be inexpensive. Note that this does not apply to capture at the power plant, where essentially pure carbon dioxide and water are generated if oxygen is separated from the nitrogen in the air used for combustion.
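The comment's figures can be reproduced from the standard ideal-gas expression for reversible separation and compression work. The cumulative-emissions figure (~350 Gt of carbon by roughly 2010) and the US energy-use figure (~1e20 J/year) below are my own assumed round numbers for illustration:

```python
import math

# Sanity check of the reversible-work figures in the comment above.
# Physical constants are standard; emissions and US energy use are
# rough, assumed 2010-era round numbers.

R = 8.314          # gas constant, J/(mol K)
T = 298.0          # temperature, K
X_CO2 = 390e-6     # CO2 mole fraction in air
M_C = 12.01        # molar mass of carbon, g/mol

# Minimum (reversible) work per mole of CO2: separate it from air at
# mole fraction x, then compress isothermally from 1 atm to 10 atm.
w_per_mol = R * T * (math.log(1 / X_CO2) + math.log(10))   # J/mol

mol_per_tonne_C = 1e6 / M_C
w_per_tonne_C = w_per_mol * mol_per_tonne_C                 # J per tonne of carbon
print(f"Reversible work: ~{w_per_tonne_C/1e9:.1f} GJ per tonne of carbon")

# Scale up to all human-emitted carbon, in years of US energy consumption.
total_J = w_per_tonne_C * 350e9      # ~350 GtC cumulative emissions (assumed)
US_ENERGY_PER_YEAR = 1e20            # J/year (assumed round number)
print(f"Removing it all: ~{total_J / US_ENERGY_PER_YEAR:.0f} years of US energy use")
```

The per-tonne figure comes out at about 2 GJ and the total at several years to a decade of US energy consumption, matching the comment's order of magnitude. The exact multiple depends on whether you count only the airborne fraction or all cumulative emissions.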



"If carbon capture will get cheap and scalable, then the current focus on reducing carbon dioxide emissions...looks misguided"

That's about as big an if as you can get. You want to gamble future generations' security on the development of carbon capture? I'm quite astonished that you think this way.


And that would make it only one billion or one trillion times more expensive than conservation in the first place?