The Weekly Carboholic: Project Vulcan maps US CO2 emissions in detail

Posted on April 9, 2008


In our first Carboholic, I pointed people at a great new tool to monitor global carbon emissions, the Carma (Carbon Monitoring for Action) website. Today I’d like to point people to a fantastic new scientific tool for monitoring the carbon emissions of the continental United States: Project Vulcan. But first, a YouTube video on the project.

Project Vulcan is a joint project of Purdue University, Colorado State University, and Lawrence Berkeley National Laboratory, funded jointly by NASA and the Department of Energy. The goal was to create a map of carbon dioxide emissions that is highly detailed in both time and space. Using other pollutants already monitored by the EPA, the DOE, and other agencies as proxies for CO2, Vulcan was able to estimate CO2 emissions indirectly on a uniform 10 km grid spanning the entire United States, and on an hourly basis. Vulcan modeled utilities, industrial sources, and transportation across the country and combined them all into multiple datasets that can be downloaded from the Project Vulcan website.
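The structure of such a product — emissions resolved on a uniform spatial grid and by the hour — lends itself to simple array aggregation. Here is a minimal sketch in Python; the grid size, units, and values are invented for illustration and are not real Vulcan data:

```python
import numpy as np

# Hypothetical stand-in for a Vulcan-style dataset: hourly CO2 emissions
# on a tiny 4x5 uniform grid over 24 hours (arbitrary units).
rng = np.random.default_rng(0)
hourly = rng.uniform(0.0, 10.0, size=(24, 4, 5))  # (hour, row, col)

# Total emissions per grid cell over the day.
daily_per_cell = hourly.sum(axis=0)               # shape (4, 5)

# Domain-wide hourly time series, e.g. for examining the diurnal cycle.
domain_hourly = hourly.sum(axis=(1, 2))           # shape (24,)

print(daily_per_cell.shape, domain_hourly.shape)
```

The same kind of summing over one axis or another is how a gridded hourly inventory yields maps, time series, and national totals from a single dataset.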

Nothing like Vulcan has ever been done before, and once the data and methodology have been published, there's no reason why Vulcan's success in tracking U.S. CO2 emissions couldn't be duplicated in any nation that tracks pollutants. In addition, several organizations have shown interest in monitoring carbon dioxide directly, in a fashion similar to weather stations; if such a system is ultimately implemented, there may eventually be a way to correlate Vulcan's pollution-proxy data with direct CO2 measurements. Finally, NASA plans to launch the Orbiting Carbon Observatory (OCO) in December. This satellite will enable scientists to monitor CO2 concentrations in the atmosphere in concert with a group of other climate-monitoring satellites known as the A-Train (fact sheet). Over the OCO's two-year mission, it will provide yet another tool to compare with the Vulcan data, and as a result, our understanding of how carbon dioxide affects our local and global climate will be dramatically improved.

Read the Project Vulcan press release here
Thanks to James Bruggers and Tom Henry for putting me onto this great resource.

———-

Norfolk
Image source: Telegraph.co.uk

Sea level rise is expected to drive tens to hundreds of millions of people out of their homes over the next century, mostly from the highly populated river deltas of Asia. However, Europe and North America will not be immune from the effects of sea level rise. Nations will have to make difficult choices about what to save and what to abandon to the sea. According to the Telegraph, a number of villages in England may be abandoned so that others may be saved.

The sea would be allowed to breach 15 miles of the north Norfolk coast between Eccles-on-Sea and Winterton and would flood low-lying land to create a new bay.

Seawater would destroy the villages of Eccles, Sea Palling, Waxham, Horsey, Hickling and Potter Heigham along with five fresh water lakes.

These six Norfolk villages each have centuries of history, and if they're allowed to be flooded, that history will be lost beneath what will likely become salt marsh. But ultimately, as the spokeswoman for Natural England said to the Telegraph, "We have got to face up to the issue. We have got to have discussion. There are difficult decisions to be made and we have produced this report after lengthy research."

It's good to see that Natural England, at least, is willing to put the hardest options forward for real, honest discussion. Similar debates need to happen in every nation that will be affected by sea level rise, and, generally speaking, those discussions don't appear to be happening. Serious discussion about whether to restore all of New Orleans, or just parts of it, never occurred so far as I can tell; it was simply assumed that New Orleans, with elevations between -6.5 and +20 feet, should be restored at any expense. Similarly, there is very little discussion in the U.S. over whether flood insurance for homes and businesses along the coasts and in major flood plains is the best use of tax money, especially when some areas are flooded, and structures rebuilt at great cost, repeatedly.

With climate models anticipating rising sea levels for the foreseeable future and flooding that is both worse and more common, there will be places that are historically, economically, and culturally valuable that have to be abandoned. Simply put, it will be impossible to save everything. And I applaud Natural England for having the courage to suggest that this part of Norfolk might have to be abandoned to the sea, even if it never happens. Nearly every nation on the Earth will need similar courage eventually.

———-

One of the more common criticisms of global heating science is its reliance on advanced computer models. The argument goes something like this: "the models lack sufficient resolution to make accurate predictions, they rely on too many assumptions, and they have too many unknowns that had to be guessed at, for politicians to make any decisions based on the models' results." Each of those criticisms is fair to a certain degree. There are, after all, real debates among scientists as to which assumptions are reasonable, which guesses are accurate, and what resolution is necessary to make viable climate predictions. A new study by meteorologists at the University of Utah aims to determine, using clearly defined and explained variables and metrics, whether the climate models being used today are accurate. Their results indicate that the most recent models are actually quite accurate.

According to the University of Utah press release, Thomas Reichler and Junsu Kim of the Department of Meteorology analyzed about 50 different models, including the IPCC models used for the Fourth Assessment Report last year, and concluded that they were quite "credible." There is, however, a fundamental difficulty in judging the models' reliability: the models were created using real climate data, so feeding existing climate data back into them as a test is of questionable utility. Reichler and Kim attempted to bypass this problem by using statistical metrics different from those of prior model validations. In the process, they showed that averages of multiple runs of the third-generation models used most recently by the IPCC are rapidly approaching the accuracy of observational analyses of the real climate. Put simply, model predictions of climate are nearly as accurate as an analysis of the actual climate.
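The basic idea behind scoring a model against observations can be illustrated with a much simpler metric than the one Reichler and Kim used. A toy sketch in Python, with invented numbers that are not from the study, comparing two hypothetical model climatologies to an observed one via root-mean-square error:

```python
import numpy as np

# Invented climatologies (e.g. mean temperature, deg C) for illustration only.
observed = np.array([14.2, 15.1, 16.8, 18.0, 17.5, 15.9])
model_a  = np.array([14.0, 15.3, 16.5, 18.4, 17.2, 16.1])
model_b  = np.array([12.9, 14.0, 15.2, 17.0, 16.1, 14.8])  # cold-biased

def rmse(model, obs):
    """Root-mean-square error: lower means a closer match to observations."""
    return np.sqrt(np.mean((model - obs) ** 2))

for name, m in [("model A", model_a), ("model B", model_b)]:
    print(name, round(rmse(m, observed), 3))
```

The actual study used more sophisticated, normalized performance metrics across many variables, but the principle is the same: quantify the mismatch between model output and observations with a well-defined number rather than an eyeball judgment.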

The Carboholic has reported several times since we launched in December that recent observations of the oceans and ice melting are starting to confirm the predictions of third-generation models.

Note, however, that models are only as accurate as the data they're given, and that fourth-, fifth-, or Nth-generation models will certainly be developed specifically to answer the questions posed in the model debates. Still, this study again confirms what the IPCC said last year: it is highly likely that human fossil fuel consumption and agriculture are the drivers of recent global heating.

———-

The BBC last week reported on another scientific study, this one from a team of scientists at Lancaster University who found that the impact of cosmic rays on global heating is minimal and may be entirely negligible.

The basic hypothesis proposed by Danish scientist Henrik Svensmark goes like this: galactic cosmic rays (high-energy particles from outside the solar system) are largely responsible for cloud formation, and since the sun's output supposedly determines how many of those rays reach the Earth's atmosphere, a drop in solar output lets more rays through, creating more clouds and producing more cooling than the reduced solar output alone would. Conversely, when the sun's output is higher, fewer clouds are generated and so the Earth heats up more.

It's this hypothesis that the Lancaster team believes it has tested and disproved, by comparing cloud formation across periods of high and low solar output (when fewer or more galactic cosmic rays would be hitting the atmosphere). They found that, at most, galactic cosmic rays could account for only 25% of the observed cloud formation over the last solar cycle, and that there was no correlation between cloud formation and solar output in this solar cycle. As you can imagine, Dr. Svensmark disagrees, but several other UK scientists apparently concur with Terry Sloan and the other Lancaster scientists. Giles Harrison of Reading University and Mike Lockwood of the Rutherford Appleton Laboratory have independently analyzed cloud cover data over the UK and found only limited correlation between cloud cover and the intensity of galactic cosmic rays. In fact, Lockwood's own research on solar output appears to indicate that, if Dr. Svensmark were right, the Earth would have been cooling over the last 20 years instead of heating.
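At its core, a test like this asks whether two time series move together, which comes down to a correlation coefficient. A toy sketch with synthetic data — none of these numbers come from the Sloan, Harrison, or Lockwood analyses, and the two series here are deliberately generated with no link between them:

```python
import numpy as np

# Synthetic monthly series standing in for cosmic-ray intensity and
# cloud-cover fraction over ten years; generated independently, so any
# correlation that appears is pure chance.
rng = np.random.default_rng(1)
cosmic_rays = rng.normal(100.0, 5.0, size=120)        # arbitrary units
cloud_cover = 0.6 + rng.normal(0.0, 0.05, size=120)   # fraction of sky

# Pearson correlation coefficient between the two series.
r = np.corrcoef(cosmic_rays, cloud_cover)[0, 1]
print(round(r, 3))
```

A real analysis must also account for shared seasonal cycles, autocorrelation, and lags before reading anything into the correlation, but the absence of a strong correlation is exactly the kind of result the Lancaster team reported.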

This won't be the end of the galactic cosmic ray question, because cloud formation is one of the things with the greatest impact on climate: clouds both reflect incoming solar energy back out to space and trap heat radiating up from the surface. I look forward to reading more about how the various competing results can be combined, and a greater understanding of our climate derived in the process.
