Wednesday, October 19, 2011

Climate Change, Photosynthesis, and El Niño

The way carbon moves between the atmosphere, the biosphere, the oceans, and other parts of the Earth system plays an important role in the current scientific consensus on climate change. While the various processes involved are known well enough to predict that the planet will continue warming if CO2 concentrations keep increasing from fossil fuel emissions, a better understanding of any part of the so-called carbon cycle can improve predictions of exactly how much warming will occur. The image below shows a model of the carbon cycle.


In particular, the estimate of how much CO2 is converted to sugars by plants during photosynthesis (called primary production) could have its range of possible values narrowed. In a recent study in the journal Nature, the authors use measurements of the rare, stable oxygen isotope oxygen-18 in CO2 to estimate the value of primary production. Since CO2 can exchange oxygen atoms with water in leaves without undergoing photosynthesis, the ratio of oxygen-18 to oxygen-16 in CO2 is related to how much CO2 is converted to sugars in plants. Using these measurements, they actually found that primary production might be greater than previously thought! Their estimate has the advantage of not relying on assumptions about biology and provides further constraints on primary production. This increase in the estimate of primary production may turn out to be good news, because it could mean that CO2 concentrations will rise (slightly) more slowly. However, especially because primary production does not account for how much of the sugars produced are consumed by the plant itself (and thus converted back into CO2), it unfortunately does not mean that the biosphere can completely offset all changes in CO2 from fossil fuel emissions.

By observing the oxygen-18 to oxygen-16 ratio over 30 years at various locations around the world, the scientists found occasional small increases in the ratio from year to year. Oddly enough, these increases occurred at the same time as El Niño! They explained this increase in the oxygen-18 to oxygen-16 ratio by decreases in rainfall over rain forests (where lots of photosynthesis occurs) in Southeast Asia and northern South America during El Niño. Because water containing oxygen-18 is "heavier" than water containing oxygen-16, the water in clouds tends to have more oxygen-16, since the "lighter" oxygen-16-containing water evaporates first. During periods with less rainfall, water still evaporates over rainforests, reducing the amount of oxygen-16 in the soil water. Without enough rainfall to return oxygen-16 to the ground, the relative amount of oxygen-18 increases, so the water in plant leaves also has more oxygen-18. Although this effect is very small (a 0.05% change), it can be measured in CO2. As the tropical Pacific returns to a La Niña pattern and rainfall in these regions increases, the oxygen-18 to oxygen-16 ratio in CO2 eventually returns to its "normal" level. The scientists used the rate of change in oxygen-18 to calculate an estimate of primary productivity. Measurements of oxygen-17 to oxygen-16 ratios may provide additional constraints to help improve this estimate further. (For a more detailed explanation of isotopes in geology and chemistry, see this previous post)
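To put that 0.05% figure in the units isotope researchers actually report, here is a minimal Python sketch of the "delta" notation, which expresses a ratio relative to a reference standard in parts per thousand (per mil). The numbers are illustrative, not taken from the study.

```python
# A minimal sketch of "delta" notation for oxygen isotopes, assuming the
# standard VSMOW reference ratio. The numbers are illustrative, not the
# study's data.

VSMOW_18O_16O = 0.0020052  # 18O/16O ratio of the VSMOW reference water

def delta_18O(sample_ratio, standard_ratio=VSMOW_18O_16O):
    """Convert an 18O/16O ratio into a delta value in per mil (parts per thousand)."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

baseline = VSMOW_18O_16O          # pretend the "normal" air sits at the standard
perturbed = baseline * 1.0005     # a 0.05% enrichment in oxygen-18

# A 0.05% change in the ratio is a 0.5 per-mil shift in delta notation.
print(delta_18O(perturbed) - delta_18O(baseline))  # ~0.5
```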

(h/t Jeremiah J.)

Thursday, October 6, 2011

Super soggy air on Mars

Using observations from the SPICAM instrument on the European Space Agency's Mars probe Mars Express, French scientists discovered that the amount of water vapor in the upper Martian atmosphere sometimes far exceeds expectations. While some water vapor is formed by sublimation (ice turning directly into vapor) off ice on the Martian surface, the amount found in the upper atmosphere is greater than the temperature would predict. In terms of relative humidity, the air was observed to reach as much as 1000%! Scientists call such air "supersaturated" with water, since the air is holding far more water vapor than it could at equilibrium.
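For anyone unfamiliar with the term, relative humidity is just the water vapor actually present divided by the maximum amount the air could hold at equilibrium at that temperature. Here is a minimal sketch with made-up numbers; the actual SPICAM retrievals are far more involved than this.

```python
# A minimal sketch of what "supersaturated" means, with made-up numbers.
# The actual SPICAM retrievals of Martian water vapor are far more involved.

def relative_humidity(vapor_pressure, saturation_pressure):
    """Relative humidity in percent: the water vapor actually present divided
    by the equilibrium (saturation) amount at the same temperature."""
    return 100.0 * vapor_pressure / saturation_pressure

print(relative_humidity(1.0, 1.0))   # 100%: air holding exactly the equilibrium amount
print(relative_humidity(10.0, 1.0))  # 1000%: ten times the equilibrium amount ("supersaturated")
```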


Credit: NASA NSSDC

How could so much extra water end up in the atmosphere of Mars? The authors suggest that the low pressure and lack of dust particles in the upper atmosphere make condensation into ice very difficult, so the water simply stays in vapor form. Simultaneous measurements of the amount of dust show that this could indeed be the case. These results fundamentally change scientists' understanding of the water cycle on Mars, as water vapor exists in much higher concentrations at higher altitudes than previously thought. Greater amounts of water vapor in the upper Martian atmosphere imply that a larger amount of water is able to escape Mars's gravity than previously thought. Models of the chemistry of the Martian atmosphere are also affected by water: despite its relatively low abundance (even with this result), water acts as a catalyst in many chemical cycles in the Martian atmosphere.

Thursday, September 22, 2011

The Scientific Process and Ship Wakes

This article from the open-access journal Atmospheric Chemistry and Physics was actually rejected for publication by the journal in 2010. Because of the open-access model of the journal, the originally submitted article, along with reviewer comments, can be viewed despite the article's rejection. In the article, the authors measured from airplanes the amount of light reflected by the wakes of shipping barges in the northern Pacific Ocean. They then used their results to estimate the increase in reflected sunlight from increases in shipping across the Pacific, since this would have a (very) small cooling effect on the climate. Unfortunately, while this was a somewhat clever idea, the reviewers thought there were far too many uncertainties in both the measurements and the calculations, and that the reported number was an overestimate of an already small effect.


While this article was rejected, the open-access nature of the journal means it still illustrates how the scientific process of peer review works. Typically, when science and the scientific method are taught in schools, the curriculum focuses on the experimental process of science. While this is certainly very important, science also has important social ways of processing new information. If a group of scientists does a series of experiments that reach new and interesting conclusions, they will try to get their results published in a scientific journal. (And present this information at conferences, universities, research centers, etc.) Upon submission to a journal, two (or more) other scientists from the same research field will read the initial paper, either recommending that the results be published along with potential changes or that the paper be rejected outright. If the paper is rejected, the authors can appeal to have it accepted, but this is not usually granted. Otherwise, the authors edit the paper based on the suggestions of the reviewers, and it is resubmitted and published in an upcoming issue of the journal. Usually a reader only sees the final product of this process in a scientific journal, but in the case of an open-access journal such as Atmospheric Chemistry and Physics the entire process is visible to the reader (and the journal is free online). Even articles that are eventually rejected, such as this one, are put on the journal's website immediately after a quick initial review to ensure the article is relevant.

Peer review is an important process for ensuring quality, as scientists, being human after all, are prone to misjudgment, bias, and error. While science is constantly changing over time as new ideas or methods are realized, it's important to make sure these ideas are plausible and the methods actually work! While there are many cases of new ideas overturning the scientific consensus (I'm looking at you, quantum mechanics), it's usually only after these new ideas have been thoroughly scrutinized and verified through the process of peer review, discussions at scientific conferences, and independent testing of observations or experiments. The process is not without its faults, of course! Occasionally a scientist will unethically use peer review to block the publication of results that contradict their own. (Yes, it's kind of silly. There is the joke: "The debates in science are so fierce because the stakes are so low." I've seen people spend 15 minutes at a conference arguing about the proper term for something.) In those cases, an open-access journal such as Atmospheric Chemistry and Physics provides the kind of transparency that makes such behavior apparent and helps improve the scientific process. Hopefully, with the growing use of information technology (I know I rarely read the actual printed journal), more publishers will decide to switch to an open-access model, not only for transparency but also to improve the general public's access to scientific journals.

Wednesday, September 14, 2011

Variations in Volcanic Dust High Up

Last month, in Science magazine, Susan Solomon,* an atmospheric chemist at the National Oceanic and Atmospheric Administration (NOAA), and her colleagues presented satellite measurements of sulfate aerosols in volcanic dust in the stratosphere, the region of the upper atmosphere that contains the ozone layer. These measurements showed that the levels of volcanic dust in the stratosphere actually vary significantly even in the absence of major volcanic eruptions such as the Mount Pinatubo eruption in 1991.

Aerosols such as this volcanic dust scatter and reflect light from the sun, causing a net cooling of the climate. The volcanic dust in the stratosphere actually increased (mostly from natural volcanic events) enough from 2000 to 2010 to decrease the heat trapped by the atmosphere by 0.1 W/m2. For comparison, the increase in CO2 during the same period increased the heat trapped by the atmosphere by 0.28 W/m2, so this volcanic dust canceled out some of the warming that would have occurred from CO2 alone. While this is certainly a good thing, as it has slowed the pace of global warming, it is unclear whether this increase in volcanic dust will continue, given the unpredictable nature of volcanic activity. For example, if by 2020 volcanic dust were to return to the levels seen in the 1960s, the cooling effect would disappear and average global temperature would increase by 0.06 °C in addition to any changes from increased greenhouse gases. Despite the inherent unpredictability of volcanic activity, understanding that volcanic dust has variable effects on the climate over time can help better constrain the possible range of future changes in the climate.
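As a back-of-the-envelope check on those forcing numbers, here is a minimal Python sketch using the standard simplified logarithmic formula for CO2 forcing (Myhre et al., 1998). The CO2 concentrations are my own approximate global averages for 2000 and 2010, not values from the paper.

```python
# Back-of-the-envelope check on the forcing numbers above. Uses the standard
# simplified logarithmic expression for CO2 forcing (Myhre et al., 1998);
# the CO2 concentrations are approximate global means, not the paper's values.
import math

def co2_forcing(c_new_ppm, c_old_ppm):
    """Radiative forcing (W/m2) from a change in CO2 concentration."""
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

delta_f_co2 = co2_forcing(390.0, 369.0)  # 2000 -> 2010, roughly +0.3 W/m2
delta_f_dust = -0.1                      # stratospheric volcanic dust (paper's estimate)

print(delta_f_co2)                 # warming from CO2 alone
print(delta_f_co2 + delta_f_dust)  # net forcing once the dust offset is included
```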

Implications
In a broader sense, aerosols (i.e. dust and liquid droplets floating in air) are an important part of the climate change picture that is often under-discussed compared to poor, infamous CO2. Changes in aerosols caused by humans through land use, transportation (autos, trains, etc.), and industry (smokestacks) since the industrial revolution have increased the amount of light reflected by the Earth. Much like the volcanic dust in the stratosphere, this has reduced the heat trapped by the atmosphere, partially offsetting the increased heat trapped by greenhouse gases. In the period from 1940 to 1980, these two effects may have roughly canceled each other out for a time, leading some to speculate about the possibility of global cooling in the 1970s. Of course, that hypothesis has not been borne out by the data since then. Aerosols have negative effects on humans directly through inhalation and indirectly through smog and acid rain, so government regulation of these pollutants has reduced their concentration in the atmosphere. While this is obviously a good thing for human society, it has had the unfortunate side effect of reducing their cooling influence on the climate!

Studies like this one are especially important for improving climate predictions, as aerosols are the least well-understood part of climate change, especially because of their indirect effects on cloud formation. (See this chart from the IPCC and notice the very large black bars on aerosols compared to other factors.) They're certainly understood well enough to predict that the Earth is warming and will continue to warm without any changes in human activity, but predicting how much the climate might change in the future is limited largely by the uncertainty in the effects of aerosols. Better understanding the impact of aerosols, both human-made and natural, can reduce the uncertainty in future climate predictions. This, in turn, can provide better estimates of the costs and benefits of any potential emissions reductions or even geoengineering.

*Susan Solomon has won the US National Medal of Science for her work on understanding the cause of ozone depletion, and was one of the co-chairs of the physical science report for the Intergovernmental Panel on Climate Change.

Thursday, September 8, 2011

The Decade-long Mystery of Atmospheric Methane

In a recent paper in Atmospheric Chemistry and Physics, scientists from the Institute for Marine and Atmospheric Research Utrecht compared the results from an atmospheric model of methane to observations of methane and its carbon isotope ratios from various atmospheric monitoring stations around the world. (Full disclosure: I worked at IMAU for 3 months on a fellowship with the PI on an unrelated research project) From 1998 to 2006, the amount of methane (CH4) in the atmosphere stopped growing even though emissions from human civilization during this time increased. Based on a model that takes into account the flow of methane into the atmosphere from natural and human sources and the flow of methane out of the atmosphere from the natural sinks, methane should have increased during this time period.

Why would anyone care about methane? Because methane absorbs infrared radiation (heat) strongly, it is the second most important greenhouse gas in Earth's atmosphere after CO2, despite its relatively short lifetime in the atmosphere (~9 years). Its concentration has increased from about 700 parts per billion in 1750 to around 1800 parts per billion today. This increase in methane has led to more heat being trapped by the atmosphere, contributing to climate change.

Because the sources and sinks of methane are not completely understood, models that try to take all of them into account cannot reproduce the "leveling off" of methane observed in the atmosphere from 1998 to 2006. The authors use a mathematical model, similar to the one depicted below, that includes estimated values for the natural sources of methane (mostly bacteria in wetlands), the human sources of methane (fossil fuel mining and extraction, rice paddies, waste and water treatment, biomass (i.e. wood) burning, and livestock), and the natural sinks of methane (mostly hydroxyl free radicals (OH), produced when sunlight, ozone, and water vapor interact in the atmosphere). They then perform a "sensitivity analysis," in which each major source or sink is increased or decreased within the model to show how such changes affect the methane concentration the model calculates. By increasing the sink through higher hydroxyl radical concentrations, or by decreasing the source from wetlands, the model produces a trend similar to the atmospheric observations of methane. How can we tell which of these may have caused the change in the trend, though?

This diagram depicts the flow of methane from sources into the atmosphere as well as the sinks that consume methane.
A. Permafrost, Glaciers, and Ice Cores B. Wetlands C. Forest Fires D. Rice Paddies E. Animals F. Plants G. Landfills H. Waste Water Treatment Facilities I. Hydroxyl Radical J. Chlorine Radical
Image by Olivia Shoup, used under the Creative Commons Attribution Share-Alike 3.0 license
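Before getting to that, here is a much cruder one-box sketch, in Python, of the budget-and-sensitivity logic described above. The authors use a full atmospheric model; the numbers below are round, illustrative values with sources expressed directly in ppb per year, not theirs.

```python
# A much cruder one-box sketch of the budget-and-sensitivity logic described
# above. The authors use a full atmospheric model; the numbers here are round,
# illustrative values with sources expressed directly in ppb per year.

def run_box_model(years, source_ppb_per_yr, lifetime_yr, c0_ppb):
    """Integrate dC/dt = S - C/tau with a simple one-year time step."""
    c = c0_ppb
    history = []
    for _ in range(years):
        c += source_ppb_per_yr - c / lifetime_yr
        history.append(c)
    return history

# Baseline: ~200 ppb/yr of sources and a ~9-year lifetime head toward a
# steady state near 1800 ppb, roughly the modern methane level.
baseline = run_box_model(20, source_ppb_per_yr=200.0, lifetime_yr=9.0, c0_ppb=1750.0)

# "Sensitivity analysis": trim the wetland source, or strengthen the OH sink
# (a slightly shorter lifetime), and compare the resulting trends.
weaker_wetlands = run_box_model(20, 195.0, 9.0, 1750.0)
stronger_oh_sink = run_box_model(20, 200.0, 8.8, 1750.0)

print(baseline[-1], weaker_wetlands[-1], stronger_oh_sink[-1])
```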

Observations and modeling of the carbon-13 to carbon-12 ratio in methane can provide additional constraints on the sources and sinks of methane. (For more about isotopes and chemistry and how they relate to the atmosphere, see this part of a previous post on a related topic.) Since each source and sink has a characteristic carbon-13 to carbon-12 ratio, any changes to these sources and sinks in the model will also affect the modeled carbon-13 to carbon-12 ratio. These results can then be compared to the atmospheric record of carbon-13 to carbon-12 in methane to see whether any change in the estimated value of a source or sink is justified. Using the isotope ratio as a guide, a decrease in wetland methane emissions in the model, while it can bring the modeled methane concentration close to that observed in nature, results in a significant increase in the carbon-13 to carbon-12 ratio, which is not observed. In contrast, an increase in the hydroxyl radical sink brings the modeled methane close to that observed in nature without changing the modeled isotope ratio. Such an increase in hydroxyl radical concentration has been independently proposed, but this hypothesis is hard to confirm since hydroxyl radicals are difficult to observe directly due to their very short lifetime (less than one second).
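Here is a minimal sketch of the flux-weighted isotope bookkeeping behind that argument. The source strengths and carbon-13 signatures are rough, textbook-style values, not the paper's numbers, but they show why trimming the 13C-depleted wetland source shifts the overall source mixture toward heavier values, while simply scaling the OH sink leaves that signature untouched.

```python
# Flux-weighted carbon-13 bookkeeping for the methane sources. The fluxes and
# delta-13C signatures are rough, textbook-style values, not the paper's.

def mixed_source_delta(sources):
    """Flux-weighted mean delta-13C (per mil) of methane sources given as
    (flux in Tg per year, delta-13C in per mil) pairs."""
    total_flux = sum(flux for flux, _ in sources)
    return sum(flux * delta for flux, delta in sources) / total_flux

baseline_sources = [
    (200.0, -60.0),  # wetlands: large and strongly 13C-depleted
    (110.0, -40.0),  # fossil fuel mining and use
    (200.0, -60.0),  # agriculture (livestock, rice paddies)
    (40.0,  -25.0),  # biomass burning: 13C-enriched
]

# Cutting the 13C-depleted wetland source makes the mixture heavier (less
# negative), which would show up in the atmospheric record - and it doesn't.
weaker_wetlands = [(180.0, -60.0)] + baseline_sources[1:]

print(mixed_source_delta(baseline_sources))   # ~ -53 per mil
print(mixed_source_delta(weaker_wetlands))    # shifted toward heavier values

# Scaling the OH sink up or down changes how fast methane is removed, but it
# leaves this source-mixture signature untouched.
```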

Using both the atmospheric concentration and the carbon isotope ratios of methane, the scientists in this study were able to identify potential causes of the slowdown in the growth of methane in the atmosphere from 1998 to 2006. The sources that produce or release methane into the atmosphere may have weakened during this time period, but this is not necessarily consistent with either the carbon isotope ratios or other outside estimates of these sources. The primary way methane is destroyed in the atmosphere, reaction with hydroxyl radicals (OH), may instead have strengthened during this time period; this is consistent with the isotope evidence as well as other studies of hydroxyl radicals. A combination of these effects is also possible, but the combined effect would still need to be consistent with the isotope evidence. This kind of modeling study demonstrates how measuring the isotope ratios of an atmospheric gas can be useful for understanding its chemical and biological activity in the atmosphere.

Implications

Using a model such as this along with atmospheric observations, scientists can develop a better understanding of how methane and other atmospheric gases move in and out of the atmosphere. This is crucial to understanding the potential range of impacts of a given public policy toward environmental pollutants, whether related to smog or climate change. Because of methane's short lifetime and its significant impacts, not only on the climate but also as a precursor to photochemical smog (ozone pollution), it has been suggested that reductions in methane (and other pollutants) that are technologically feasible now could "buy time" on climate change while having significant public health benefits. While a certain degree of uncertainty exists about any form of public policy (who knows if an asteroid might hit the Earth tomorrow or nuclear war could break out?), developing as complete an understanding as possible of the physical basis behind any proposed policy is key to estimating the costs and benefits of any decision.

Wednesday, September 7, 2011

Extreme cold and global warming

"Hey! There was some crazy blizzards in the US this year and the last. How can you scientists still say that the Earth is warming when it was unusually cold in the winter of 2010 and 2011?" This is a common kind of question from laymen in regards to the feasibility of climate change. Luckily, some scientists actually looked at this in a recent article in Geophysical Research Letters. (gated beyond the abstract, unfortunately)

These scientists divided the Northern Hemisphere into eight regions: USA, Canada, Alaska/Yukon, Siberia, Far East, Central Asia, Northern Europe/Russia, and Mediterranean/Middle East. Then, using statistical techniques, they examined the temperature record for extreme cold and warm events during the winters of 2010 and 2011 and compared it to the historical record for 1950-2009. Indeed, the USA, Northern Europe, and Siberia did experience winters that were unusually cold in 2010 and 2011. However, the number and extent of unusually cold days in these regions was offset by an even greater number of unusually hot days in every other region of the Northern Hemisphere! Not only were there a larger number of extremely hot days, but these hot spells also lasted much longer and were more extreme than any of the cold spells in the United States, Europe, or Siberia.
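The core of that statistical bookkeeping is simpler than it sounds: define "extreme" relative to the historical distribution, then count how often a recent winter crosses those thresholds. Here is a minimal sketch with randomly generated temperatures standing in for the actual station data.

```python
# A minimal sketch of the threshold-based counting described above, using
# randomly generated temperatures in place of the actual station records.
import numpy as np

rng = np.random.default_rng(0)
baseline_winters = rng.normal(loc=0.0, scale=5.0, size=(60, 90))  # 60 winters (1950-2009), 90 days each
recent_winter = rng.normal(loc=1.0, scale=5.0, size=90)           # one hypothetical warm-shifted winter

# "Extreme" is defined relative to the historical distribution.
cold_threshold = np.percentile(baseline_winters, 5)
warm_threshold = np.percentile(baseline_winters, 95)

extreme_cold_days = int(np.sum(recent_winter < cold_threshold))
extreme_warm_days = int(np.sum(recent_winter > warm_threshold))

print(extreme_cold_days, extreme_warm_days)
```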

What accounted for the cold extremes experienced in parts of the Northern Hemisphere, then? Similar to the natural El Niño/La Niña phenomenon in the tropical Pacific Ocean, another natural phenomenon called the North Atlantic Oscillation causes variability in local climates in the Northern Hemisphere. During the winters of 2010 and 2011, this oscillation exhibited an unusual and persistent pattern that caused cold air from the Arctic to move down over the United States and Europe, while warm air traveled up to the Arctic. As a result, while much of North America and Europe was unusually cold, the Arctic was in fact much warmer than usual during those winters. Even now, the total area of Arctic sea ice is near record lows this year.

In contrast, the extremely hot weather in the rest of the Northern Hemisphere cannot be explained by natural variations in climate and requires an outside explanation. This paper demonstrates that while there may be locally extreme cold weather events that run counter to the overall trend of anthropogenic climate change, the overall trend for the entire planet is warming.

Thursday, March 17, 2011

Of Smog and Satellites

In an article (paywall) in Geophysical Research Letters, Lamsal and others measure the amount of NOx (NO + NO2) near the Earth's surface using an instrument on board a satellite. NOx is an important gas to track in the Earth's atmosphere, despite the relatively low amount present, because it ends up making smog and acid rain. Before this paper, scientists and regulators had to rely on crude estimates built up from indirect data on NOx formation: combustion in car engines, traffic and car usage, power plants and electricity use, and so on. Because of extremely rapid economic growth in China, though, these "bottom-up" estimates quickly become outdated and need to be refreshed with more real-time data. The authors of this paper were able to use satellite data along with computer models of the Earth's atmosphere to provide up-to-date estimates in the meantime. Their approach reproduces the 2003 estimate obtained with the bottom-up approach described above. They were also able to detect an increase in NOx over East Asia in 2009, along with a decrease in NOx over North America due to increased regulation. These results provide useful benchmarks for updating estimates of NOx emissions more frequently as conditions change.
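One common way to turn satellite observations into updated emission estimates (not necessarily the exact procedure in this paper) is to scale the old bottom-up inventory by the ratio of the NO2 column the satellite observes to the column a model predicts when driven by that inventory. A minimal sketch with hypothetical numbers:

```python
# A hedged sketch of one common "top-down" idea (not necessarily this paper's
# exact procedure): scale a prior bottom-up emission estimate by the ratio of
# the satellite-observed NO2 column to the column a model predicts from that
# prior inventory. All numbers below are hypothetical.

def scaled_emissions(prior_emission, observed_column, modeled_column):
    """Adjust a prior emission estimate by the observed-to-modeled column ratio."""
    return prior_emission * (observed_column / modeled_column)

# If the satellite sees a 20% higher NO2 column than the model driven by the
# old inventory, that suggests emissions have grown by roughly 20%.
print(scaled_emissions(prior_emission=10.0, observed_column=6.0, modeled_column=5.0))
```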


Photo David McNew, Getty Images
The color of the LA skyline shown here is actually caused by NO2, which is a brown gas

Basic Concepts
Photochemical smog involves some fairly complex chemistry that creates numerous compounds with negative health consequences for humans and other life, including important food crops. The most important of these compounds is ozone. Although ozone is a good thing in the ozone layer roughly 20 miles up, since it absorbs harmful radiation from the sun, at the surface it is toxic to humans. Furthermore, ozone can also react with other forms of pollution, such as hydrocarbons from incomplete combustion of gasoline or diesel, to form other harmful compounds. How can NOx compounds end up producing ozone, though? This is where the "photo" (light) in photochemical smog comes in. During the daytime, any NO2 produced, from, say, automobiles, is broken apart by light to form NO and a highly reactive loose oxygen atom. This oxygen atom then combines with an O2 molecule to form ozone (O3).
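These reactions settle into a daytime balance chemists call the photostationary state, in which the ozone level is set by the NO2/NO ratio. Here is a minimal sketch using typical order-of-magnitude daytime values for the photolysis rate and rate constant; these are illustrative numbers, not data from the paper.

```python
# Minimal sketch of the daytime "photostationary state" set up by the
# reactions above: NO2 + sunlight -> NO + O, O + O2 -> O3, NO + O3 -> NO2 + O2.
# Balancing ozone production against loss gives [O3] = J_NO2 [NO2] / (k [NO]).
# The rate values are typical order-of-magnitude daytime numbers, not data
# from the paper.

J_NO2 = 8.0e-3      # NO2 photolysis rate, per second (around midday)
K_NO_O3 = 1.9e-14   # NO + O3 rate constant, cm^3 molecule^-1 s^-1 (near 298 K)
PPB = 2.46e10       # molecules per cm^3 in 1 ppb of air near the surface

def steady_state_ozone_ppb(no2_ppb, no_ppb):
    """Ozone (ppb) at photostationary state for the given NO2 and NO levels."""
    o3 = J_NO2 * (no2_ppb * PPB) / (K_NO_O3 * (no_ppb * PPB))
    return o3 / PPB

# Shifting the NO2/NO balance toward NO2 (which unburned hydrocarbons do)
# raises the steady-state ozone level.
print(steady_state_ozone_ppb(no2_ppb=10.0, no_ppb=10.0))  # ~17 ppb
print(steady_state_ozone_ppb(no2_ppb=20.0, no_ppb=5.0))   # ~68 ppb
```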

Some more informed readers may ask, "If NOx is the problem in creating smog, then why is it still a problem when catalytic converters have been around for decades now?" It's true that catalytic converters have greatly helped the problem, as the smog in Los Angeles was much, much worse in the 70's and 80's than it is today. However, even small amounts of NOx can cause a lot of ozone production. After NO2 is converted to NO by light, the NO2 can be created again by NO reacting with the byproducts of unburned hydrocarbons in the air; crucially, this route regenerates NO2 without consuming an ozone molecule, so ozone builds up. If you've ever gotten your car smogged, they're likely to reject it if the engine is not running optimally, because a poorly running engine releases a lot of unburned hydrocarbons into the air. Catalytic converters and exhaust filters can help deal with this problem, and indeed have helped reduce the pollution that causes smog. Because of the complex chemistry of smog, though, it remains difficult to keep at manageable levels.

Other interesting and related topics:
Website for the SCIAMACHY instrument from the European Space Agency
Website for the equivalent NASA satellite
Any of the Wikipedia entries linked above

Friday, March 4, 2011

Can isotopes be used to track regional sources of greenhouse gas pollution?

[Note: This is an attempt to write about recent science articles for a lay audience. I don't think many people even follow this, but any feedback would be appreciated. My plan is to try to break down a current journal article related to my research every week or so. We'll see how that turns out in practice. :)]

Summary
In a recent article in the open-access (i.e. free) journal Atmospheric Chemistry and Physics, Tuzson and colleagues explore this idea in the mountains of Switzerland. They measure the stable isotope ratios of both carbon and oxygen in CO2 using a laser device that allows for rapid, real-time measurements. Previously, measuring the oxygen isotope ratios of CO2 (though not the carbon ratios) required a much more time-consuming technique involving hours of sample preparation and half an hour of measurement time. (I have to do it for my research; it's not fun.) Because of the high quality and sheer number of the measurements, the authors of this paper could estimate the sources of the elevated CO2 by comparing the isotope ratios to the concentration. Using the carbon-13 to carbon-12 ratio, they find that three of the events with high CO2 concentrations can be linked to the burning of gasoline or other petroleum products, while one of the events is the result of coal or wood burning. They use a computer model to estimate the potential source regions in the following colorful plot (Fig. 7 in the paper):

Essentially, the authors were able to detect not only the source of the high levels of CO2 but also the region the CO2 was emitted from. Such measurements may be important for verifying or enforcing any regulations limiting greenhouse gas emissions, since the isotope ratios of CO2 in polluted air could be used to estimate the potential source.
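The classic way to relate isotope ratios to concentration like this is a "Keeling plot": plot the measured carbon-13 delta value against 1/CO2, and the intercept of a straight-line fit gives the isotopic signature of whatever source is being added to the background air. The sketch below uses made-up numbers and only illustrates the general idea; the paper's own analysis may differ in its details.

```python
# A sketch of the classic "Keeling plot" way of comparing isotope ratios with
# concentration (the paper's own analysis may differ in detail). Plot delta-13C
# against 1/CO2; the intercept of a straight-line fit is the delta-13C
# signature of the source being added to the background air. Data are made up:
# background air near 390 ppm and -8 per mil, mixed with a combustion source
# near -28 per mil.
import numpy as np

co2_ppm = np.array([392.0, 400.0, 415.0, 440.0, 470.0])
delta_13c = np.array([-8.1, -8.5, -9.2, -10.3, -11.4])

slope, intercept = np.polyfit(1.0 / co2_ppm, delta_13c, 1)
print(intercept)  # estimated source signature in per mil, here close to -28
```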

Concepts
Although most people probably associate the word 'isotope' with radioactivity, the isotopes measured in this study do not undergo radioactive decay (thus 'stable isotope'). So, unfortunately, nobody in that laboratory in Switzerland is going to be bitten by a radioactive spider and turn into Spider-Man.

So just what is an isotope, then? Recall that an atom is made up of protons and neutrons in the central nucleus and electrons in outer orbitals. Because the positively charged protons determine the chemistry of the atom while the neutrons for the most part do not, each element is named for its number of protons. Thus, an atom with one proton, regardless of the number of neutrons, is always a hydrogen atom. An atom with two protons is always helium, an atom with 6 protons is always carbon, and so on. An isotope, then, is an atom of an element with a specific number of neutrons, and it is named for the total number of protons and neutrons in the nucleus. (e.g. carbon-13 has 6 protons and 7 neutrons) Only certain numbers of neutrons allow a nucleus to remain stable and not radioactively decay, however - these combinations are the 'stable isotopes' of an element. Most of the time there is one major stable isotope along with one or more rare stable isotopes. For the article here, the isotopes discussed are carbon-12 (major) and carbon-13 (rare, ~1% of carbon atoms) along with oxygen-16 (major) and oxygen-18 (rare, ~0.2% of oxygen atoms). (Aside: There's also an even rarer but stable oxygen-17 isotope that makes up 0.04% of all oxygen atoms, but for various reasons it is usually not studied. There are some highly unusual isotope effects related to oxygen-17 in the middle atmosphere that are beyond the scope of this article, but they are the focus of my research.)

Alright, so if the difference in the number of neutrons doesn't really affect the chemistry, how can the scientists in this paper tell what the sources of the greenhouse gas pollution are? Well, the main factor determining the chemistry of an element is the number of protons and electrons, but the neutrons can affect the chemistry slightly because they change the mass of the nucleus. The change in mass makes it harder (or, in some cases, easier) for a chemical containing a heavier isotope such as carbon-13 to react. Consider a chemical reaction as a "hill" that the reactants have to climb, as shown below. In a sense*, atoms and molecules that contain a heavy isotope are "harder" to push up the hill than lighter ones, so they react more slowly. Thus, unless all of the reactants are converted into products, the reactants will tend to contain more of the heavy isotope, and the products will tend to contain more of the light isotope. Because the "steepness" of the hill depends on the reactants, different chemicals will tend to have different ratios of heavy to light isotopes. Thus, by looking at the carbon-13 content of the CO2, the scientists here can tell what kind of process produced it, based on what they already know from the lab about the isotope distributions for those processes. In practice, the differences are small, around 3%. This has a number of other uses as well, including (my personal favorite) verifying the region a wine was grown in by comparing the known carbon-13 content of regional soils to that of the wine. I like to imagine scientists drinking some wine and then pouring some into an instrument. :) Pretty cool, huh? (Well, at least I think so... but I'm getting my Ph.D. in chemical physics...)

* The real picture is more complicated than this (the isotopes actually affect the "shape" of the hill, for example) but it's fine for the purposes of this post
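To make the "hill" picture above a little more concrete, here is a minimal sketch of the standard Rayleigh expression for how the leftover reactant becomes enriched in the heavy isotope as a reaction proceeds. The fractionation factor is purely illustrative and not tied to any particular reaction in the paper.

```python
# A minimal sketch of how a small rate difference between isotopes (the
# "steepness of the hill") enriches the leftover reactant in the heavy
# isotope, using the standard Rayleigh expression R/R0 = f**(alpha - 1).
# The fractionation factor is illustrative, not tied to any reaction in
# the paper.

def residual_ratio(initial_ratio, fraction_remaining, alpha):
    """Heavy-to-light isotope ratio of the unreacted material once only
    `fraction_remaining` of it is left, when the heavy isotope reacts
    alpha times as fast as the light one (alpha < 1)."""
    return initial_ratio * fraction_remaining ** (alpha - 1.0)

r0 = 0.0112    # roughly the natural 13C/12C ratio
alpha = 0.98   # heavy isotope reacts 2% more slowly (illustrative)

# The further the reaction goes, the "heavier" the remaining reactant becomes.
for f in (0.9, 0.5, 0.1):
    print(f, residual_ratio(r0, f, alpha))
```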