Wednesday, January 30, 2013

Groundwater depletion linked to climate change

Jan. 25, 2013 — Simon Fraser University earth scientist Diana Allen, a co-author of a new paper about climate change's impacts on the world's groundwater, says climate change may be exacerbating many countries' water stress.

"Increasing food requirements to feed our current world's growing population and prolonged droughts in many regions of the world are already increasing dependence on groundwater for agriculture," says Allen. "Climate-change-related stresses on fresh surface water, such as glacier-fed rivers, will likely exacerbate that situation.

"Add to that our mismanagement and inadequate monitoring of groundwater usage and we may see significant groundwater depletion and contamination that will seriously compromise much of the world's agriculturally-grown food supply."

In Ground Water and Climate Change, Allen and an international team of scientists explain how several human-driven factors, if not rectified, will combine with climate change to significantly reduce usable groundwater availability for agriculture globally.

The paper was published in late 2012 in the journal Nature Climate Change.

The authors note that inadequate groundwater-supply records, together with the limitations of mathematical models for predicting climate change and the associated sea-level rise, make it impossible to forecast groundwater's long-range fate globally.

"Over-pumping of groundwater for irrigation is mining dry the world's ancient Pleistocene-age, ice-sheet-fed aquifers and, ironically, at the same time increasing sea-level rise, which we haven't factored into current estimations of the rise," says Allen. "Groundwater pumping reduces the amount of stored water deep underground and redirects it to the more active hydrologic system at the land-surface. There, it evaporates into the atmosphere, and ultimately falls as precipitation into the ocean."

Current research estimates that the oceans will rise by about a metre globally by the end of the century due to climate change. But that estimate doesn't factor in an additional roughly half-a-millimetre-a-year rise that this study says is expected as depleted groundwater recycles back into the ocean.

Increasing climate-change-induced storm surges will also flood coastal areas, threatening the quality of groundwater supplies and compromising their usability.

This is the second study that Allen and her colleagues have produced to assist the Intergovernmental Panel on Climate Change (IPCC) in assessing the impact of climate change on the world's groundwater supply.

The IPCC, established by the United Nations Environmental Programme and the World Meteorological Organization in 1988, periodically reviews the latest research on climate change and assesses its potential environmental and socio-economic impacts.

This study is one of several guiding the IPCC's formulation of upcoming reports, the first of which covers the physical science behind climate change and is due in September 2013.

Story Source:
The above story is reprinted from materials provided by Simon Fraser University.

Tuesday, January 29, 2013

Cities affect temperatures for thousands of miles

Jan. 27, 2013 — Even if you live more than 1,000 miles from the nearest large city, it could be affecting your weather.

In a new study that shows the extent to which human activities are influencing the atmosphere, scientists have concluded that the heat generated by everyday activities in metropolitan areas alters the character of the jet stream and other major atmospheric systems. This affects temperatures across thousands of miles, significantly warming some areas and cooling others, according to the study this week in Nature Climate Change.

The extra "waste heat" generated from buildings, cars, and other sources in major Northern Hemisphere urban areas causes winter warming across large areas of northern North America and northern Asia. Temperatures in some remote areas increase by as much as 1 degree Celsius (1.8 degrees Fahrenheit), according to the research by scientists at the Scripps Institution of Oceanography; University of California, San Diego; Florida State University; and the National Center for Atmospheric Research.

At the same time, the changes to atmospheric circulation caused by the waste heat cool areas of Europe by as much as 1 degree C (1.8 degrees F), with much of the temperature decrease occurring in the fall.

The net effect on global mean temperatures is nearly negligible -- an average increase worldwide of just 0.01 degrees C (about 0.02 degrees F). This is because the total human-produced waste heat is only about 0.3 percent of the heat transported across higher latitudes by atmospheric and oceanic circulations.

However, the noticeable impact on regional temperatures may explain why some regions are experiencing more winter warming than projected by climate computer models, the researchers conclude. They suggest that models be adjusted to take the influence of waste heat into account.
"The burning of fossil fuel not only emits greenhouse gases but also directly affects temperatures because of heat that escapes from sources like buildings and cars," says NCAR scientist Aixue Hu, a co-author of the study. "Although much of this waste heat is concentrated in large cities, it can change atmospheric patterns in a way that raises or lowers temperatures across considerable distances."

Distinct from urban heat island effect
The researchers stressed that the effect of waste heat is distinct from the so-called urban heat island effect. Such islands are mainly a function of the heat collected and re-radiated by pavement, buildings, and other urban features, whereas the new study examines the heat produced directly through transportation, heating and cooling units, and other activities.

The study, "Energy consumption and the unexplained winter warming over northern Asia and North America," appeared online January 27. It was funded by the National Science Foundation, NCAR's sponsor, as well as the Department of Energy and the National Oceanic and Atmospheric Administration.

Hu, along with lead author Guang Zhang of Scripps and Ming Cai of Florida State University, analyzed the energy consumption -- from heating buildings to powering vehicles -- that generates waste heat release. The world's total energy consumption in 2006 was equivalent to a constant-use rate of 16 terawatts (1 terawatt, or TW, equals 1 trillion watts). Of that, an average rate of 6.7 TW was consumed in 86 metropolitan areas in the Northern Hemisphere.
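
As a quick back-of-the-envelope check (not part of the study itself), these figures can be related to the "about 0.3 percent" comparison quoted earlier; the roughly 5 PW of heat carried poleward by the atmosphere and oceans is an assumed textbook-scale estimate, not a number from the paper.

```python
# Rough arithmetic linking the figures quoted in this story. The ~5 PW
# (5,000 TW) value for heat transported across higher latitudes is an
# assumed textbook-scale estimate, not a number from the paper.
total_consumption_tw = 16.0     # world energy consumption in 2006 (from the story)
metro_consumption_tw = 6.7      # consumed in 86 Northern Hemisphere metro areas
n_metro_areas = 86
poleward_transport_tw = 5000.0  # assumed natural heat transport, ~5 PW

print(f"average per metro area: {metro_consumption_tw / n_metro_areas * 1000:.0f} GW")
print(f"waste heat / natural transport: {total_consumption_tw / poleward_transport_tw:.2%}")
```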

Using a computer model of the atmosphere, the authors found that the influence of this waste heat can widen the jet stream.

"What we found is that energy use from multiple urban areas collectively can warm the atmosphere remotely, thousands of miles away from the energy consumption regions," Zhang says. "This is accomplished through atmospheric circulation change."

The release of waste heat is different from energy that is naturally distributed in the atmosphere, the researchers noted. The largest source of heat, solar energy, warms Earth's surface and atmospheric circulations redistribute that energy from one region to another. Human energy consumption distributes energy that had lain dormant and sequestered for millions of years, mostly in the form of oil or coal.

Though the amount of human-generated energy is a small portion of that transported by nature, it is highly concentrated in urban areas. In the Northern Hemisphere, many of those urban areas lie directly under major atmospheric troughs and jet streams.

"The world's most populated and energy-intensive metropolitan areas are along the east and west coasts of the North American and Eurasian continents, underneath the most prominent atmospheric circulation troughs and ridges," Cai says. "The release of this concentrated waste energy causes the noticeable interruption to the normal atmospheric circulation systems above, leading to remote surface temperature changes far away from the regions where waste heat is generated."

Story Source:
The above story is reprinted from materials provided by National Center for Atmospheric Research/University Corporation for Atmospheric Research.

Monday, January 28, 2013

Jet fuel, plastics exposures cause disease in later generations; Reproductive diseases, obesity

Jan. 24, 2013 — Washington State University researchers have lengthened their list of environmental toxicants that can negatively affect as many as three generations of an exposed animal's offspring.

Writing in the online journal PLOS ONE, scientists led by molecular biologist Michael Skinner document reproductive disease and obesity in the descendants of rats exposed to the plasticizer bisphenol-A, or BPA, as well as to DEHP and DBP, plastic compounds known as phthalates.

In a separate article in the journal Reproductive Toxicology, they report the first observation of cross-generation disease from a widely used hydrocarbon mixture the military refers to as JP8.

Both studies are the first of their kind to see obesity stemming from the process of "epigenetic transgenerational inheritance." While the animals inherit traits conveyed by their parents' DNA sequences, they also inherit epigenetic changes that turn some genes on and off. Skinner's lab in the past year has documented these epigenetic effects from a host of environmental toxicants, including plastics, pesticides, fungicides, dioxin and hydrocarbons.

The recent PLOS ONE study found "significant increases" in disease and abnormalities in the first and third generations of both male and female descendants of animals exposed to plastics. The first generation, whose mother had been directly exposed during gestation, had increased kidney and prostate diseases. The third generation had pubertal abnormalities, testis disease, ovarian disease and obesity.

The study also identified nearly 200 epigenetic molecular markers for exposure and transgenerational disease. The markers could lead to the development of a diagnostic tool and new therapies.

The Reproductive Toxicology study exposed female rats to the hydrocarbon mixture as their fetuses' gonads were developing. The first generation of offspring had increased kidney and prostate abnormalities and ovarian disease. The third generation had increased losses of primordial follicles, the precursors to eggs, polycystic ovarian disease and obesity.

The study, said Skinner, "provides additional support for the possibility that environmental toxicants can promote the epigenetic transgenerational inheritance of disease."

"Your great-grandmothers exposures during pregnancy may cause disease in you, while you had no exposure," he said. "This is a non-genetic form of inheritance not involving DNA sequence, but environmental impacts on DNA chemical modifications. This is the first set of studies to show the epigenetic transgenerational inheritance of disease such as obesity, which suggests ancestral exposures may be a component of the disease development."

Story Source:
The above story is reprinted from materials provided by Washington State University. The original article was written by Eric Sorensen.

Global warming less extreme than feared? New estimates from a Norwegian project on climate calculations

Jan. 25, 2013 — Policymakers are attempting to contain global warming at less than 2°C. New estimates from a Norwegian project on climate calculations indicate this target may be more attainable than many experts have feared.

Climate researcher Caroline Leck of Stockholm University has evaluated the Norwegian project and is enthusiastic.

"These results are truly sensational," says Dr Leck. "If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate."

Temperature rise is levelling off
After Earth's mean surface temperature climbed sharply through the 1990s, the increase has levelled off nearly completely at its 2000 level. Ocean warming also appears to have stabilised somewhat, despite the fact that CO2 emissions and other anthropogenic factors thought to contribute to global warming are still on the rise.

It is the focus on this post-2000 trend that sets the Norwegian researchers' calculations on global warming apart.

Sensitive to greenhouse gases
Climate sensitivity is a measure of how much the global mean temperature is expected to rise if we continue increasing our emissions of greenhouse gases into the atmosphere.

CO2 is the primary greenhouse gas emitted by human activity. A simple way to express climate sensitivity is to calculate how much the mean air temperature would rise if the atmospheric CO2 concentration were doubled compared to the world's pre-industrial level around the year 1750.

If we continue to emit greenhouse gases at our current rate, we risk doubling that atmospheric CO2 level by roughly 2050.
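
For readers who want the arithmetic behind "climate sensitivity," a standard logarithmic approximation (an assumption added here, not a formula from the article) relates the expected warming to the concentration ratio:

```latex
% Equilibrium warming for a CO2 concentration C relative to the pre-industrial
% concentration C_0, given a sensitivity S per doubling (standard logarithmic
% approximation; an illustrative assumption, not from the article):
\Delta T \;\approx\; S \,\frac{\ln\!\left(C/C_0\right)}{\ln 2}
% Example: S = 3\,^{\circ}\mathrm{C} and C/C_0 = 1.5 give
% \Delta T \approx 3 \times 0.585 \approx 1.8\,^{\circ}\mathrm{C}.
```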

Mutual influences
A number of factors affect how the climate develops. The complexity of the climate system is further compounded by a phenomenon known as feedback mechanisms, i.e. how factors such as clouds, evaporation, snow and ice mutually affect one another.

Uncertainties about the overall results of feedback mechanisms make it very difficult to predict just how much of the rise in Earth's mean surface temperature is due to human-made emissions. According to the Intergovernmental Panel on Climate Change (IPCC), the climate sensitivity to doubled atmospheric CO2 levels is probably between 2°C and 4.5°C, with 3°C of warming the most probable.

In the Norwegian project, however, researchers have arrived at an estimate of 1.9°C as the most likely level of warming.

Human-made climate forcing
"In our project we have worked on finding out the overall effect of all known feedback mechanisms," says project manager Terje Berntsen, who is a professor at the University of Oslo's Department of Geosciences and a senior research fellow at the Center for International Climate and Environmental Research -- Oslo (CICERO). The project has received funding from the Research Council of Norway's Large-scale Programme on Climate Change and its Impacts in Norway (NORKLIMA).

"We used a method that enables us to view the entire earth as one giant 'laboratory' where humankind has been conducting a collective experiment through our emissions of greenhouse gases and particulates, deforestation, and other activities that affect climate."

For their analysis, Professor Berntsen and his colleagues entered all the factors contributing to human-induced climate forcings since 1750 into their model. In addition, they entered fluctuations in climate caused by natural factors such as volcanic eruptions and solar activity. They also entered measurements of temperatures taken in the air, on the ground, and in the oceans.

The researchers used a single climate model that repeated calculations millions of times in order to form a basis for statistical analysis. Highly advanced calculations based on Bayesian statistics were carried out by statisticians at the Norwegian Computing Center.
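
The Bayesian machinery can be illustrated with a toy sketch. This is emphatically not the CICERO/Norwegian Computing Center model; every number in it is invented for illustration. A grid of candidate sensitivities is updated against noisy pseudo-observations of warming, yielding a posterior mean and a 90% credible interval analogous to those reported below.

```python
import numpy as np

# Toy sketch of the Bayesian idea (NOT the CICERO / Norwegian Computing Center
# model): infer climate sensitivity S from noisy "observed" warming, assuming a
# simple linear forcing-response relation. All numbers here are illustrative.
rng = np.random.default_rng(0)

S_grid = np.linspace(0.5, 6.0, 1000)        # candidate sensitivities (deg C per doubling)
prior = np.ones_like(S_grid) / S_grid.size  # flat prior over the grid

true_S = 1.9             # pretend truth, for the toy example
forcing_fraction = 0.45  # fraction of a doubling realised so far (assumed)
obs_noise = 0.15         # observation noise, deg C (assumed)
obs = true_S * forcing_fraction + rng.normal(0.0, obs_noise, size=10)

# Gaussian likelihood of the observations for each candidate S, then normalise.
log_like = -0.5 * ((obs[None, :] - S_grid[:, None] * forcing_fraction) / obs_noise) ** 2
posterior = prior * np.exp(log_like.sum(axis=1))
posterior /= posterior.sum()

# Posterior mean and a 90% credible interval, analogous in spirit to the
# 90% probability interval reported by the project.
cdf = np.cumsum(posterior)
mean_S = float(np.sum(S_grid * posterior))
lo, hi = S_grid[np.searchsorted(cdf, 0.05)], S_grid[np.searchsorted(cdf, 0.95)]
print(f"posterior mean S = {mean_S:.2f} deg C, 90% interval = [{lo:.2f}, {hi:.2f}]")
```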

2000 figures make the difference
When the researchers at CICERO and the Norwegian Computing Center applied their model and statistics to analyse temperature readings from the air and ocean for the period ending in 2000, they found that the climate sensitivity to a doubling of atmospheric CO2 concentration would most likely be 3.7°C, which is somewhat higher than the IPCC prognosis.

But the researchers were surprised when they entered temperatures and other data from the decade 2000-2010 into the model; climate sensitivity was greatly reduced to a "mere" 1.9°C.

Professor Berntsen says this temperature increase will only be fully realised after we reach the doubled level of CO2 concentration (compared to 1750) and maintain that level for an extended time, because the oceans delay the effect by several decades.

Natural changes also a major factor
The figure of 1.9°C as a prediction of global warming from a doubling of atmospheric CO2 concentration is an average. When researchers instead calculate a probability interval of what will occur, including observations and data up to 2010, they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.

This maximum of 2.9°C global warming is substantially lower than many previous calculations have estimated. Thus, when the researchers factor in the observations of temperature trends from 2000 to 2010, they significantly reduce the probability of our experiencing the most dramatic climate change forecast up to now.

Professor Berntsen explains the changed predictions: "The Earth's mean temperature rose sharply during the 1990s. This may have caused us to overestimate climate sensitivity.

"We are most likely witnessing natural fluctuations in the climate system -- changes that can occur over several decades -- and which are coming on top of a long-term warming. The natural changes resulted in a rapid global temperature rise in the 1990s, whereas the natural variations between 2000 and 2010 may have resulted in the levelling off we are observing now."

Climate issues must be dealt with
Terje Berntsen emphasises that his project's findings must not be construed as an excuse for complacency in addressing human-induced global warming. The results do indicate, however, that it may be more within our reach to achieve global climate targets than previously thought.

Regardless, the fight cannot be won without implementing substantial climate measures within the next few years.

Sulphate particulates
The project's researchers may have shed new light on another factor: the effects of sulphur-containing atmospheric particulates.

Burning coal is the main way that humans continue to add to the vast amounts of tiny sulphate particulates in the atmosphere. These particulates can act as condensation nuclei for cloud formation, cooling the climate indirectly by causing more cloud cover, scientists believe. According to this reasoning, if Europe, the US and potentially China reduce their particulate emissions in the coming years as planned, it should actually contribute to more global warming.

But the findings of the Norwegian project indicate that particulate emissions probably have less of an impact on climate through indirect cooling effects than previously thought.

So the good news is that even if we do manage to cut emissions of sulphate particulates in the coming years, global warming will probably be less extreme than feared.

Story Source:
The above story is reprinted from materials provided by The Research Council of Norway. The original article was written by Bård Amundsen/Else Lie; translation: Darren McKellep/Carol B. Eckmann.


Thursday, January 24, 2013

Health and environment: A closer look at plastics

Jan. 23, 2013 — Plastics have transformed modern society, providing attractive benefits but also befouling waterways and aquifers, depleting petroleum supplies and disrupting human health.

Rolf Halden, a researcher at Arizona State University's Biodesign Institute, has been following the chemical trail of plastics, quantifying their impact on human health and the environment. In a new overview appearing in the journal Reviews on Environmental Health, Halden and his co-author, ASU student Emily North, detail the risks and societal rewards of plastics and describe strategies to mitigate their negative impacts through reconsideration of plastic composition, use and disposal.

"We are in need of a second plastic revolution. The first one brought us the age of plastics, changing human society and enabling the birth and explosive growth of many industries. But the materials used to make plastics weren't chosen judiciously and we see the adverse consequences in widespread environmental pollution and unnecessary human exposure to harmful substances. Smart plastics of the future will be equally versatile but also non-toxic, biodegradable and made from renewable energy sources," says Halden.

Plastics are made up of a network of molecular monomers linked to form macromolecules. These versatile chemical structures come in enormous varieties and today over 20 major forms of plastics exist. Plastics are typically lightweight and biocompatible. Along with their myriad uses in everyday life, plastics fulfill many needs in the public health arena, where they are found in items including absorbable sutures, prosthetics and engineered tissues.

Further, plastics may be manufactured at low cost using little energy and their adaptable composition allows them to be synthesized in soft, transparent or flexible forms suitable for a broad range of medical applications. Because they can be readily disposed of, items like latex gloves, dialysis tubes, intravenous bags and plastic syringes eliminate the need for repeated sterilization, which is often costly and inefficient. Such single-use items have had a marked effect on reducing blood-borne infections, including hepatitis B and HIV.

Many varieties of polymers are produced to meet the expanding needs of modern medicine. Polymer chemistry is used to produce sophisticated drug delivery systems for the pharmaceutical industry; material to cement bone for hip replacements is made with the polymer polymethylmethacrylate; and polymer scaffolds are revolutionizing the practice of tissue engineering.

Researchers like Halden have shown, however, that the benefits of global plastics use can come at a steep price in terms of both human and environmental health. Continuous contact with plastic products, from the beginning to the end of life, has caused chemical ingredients -- some with potentially harmful effects -- to reach steady-state concentrations in the human body.

In recent years, two plastic-associated compounds have been singled out for particular scrutiny, due to their endocrine-disrupting properties: bisphenol A (BPA) and di-(2-ethylhexyl)phthalate (DEHP). Studies of bioaccumulation have shown detectable levels of BPA in the urine of 95 percent of the adult U.S. population, and both BPA and DEHP have been associated, through epidemiological and animal studies, with adverse effects on health and reproduction. These include early sexual maturation, decreased male fertility, aggressive behavior and other effects. Concern over BPA exposure, particularly for highly vulnerable members of the population, has recently led the Food and Drug Administration to ban BPA use in infant bottles, spill-proof cups and other products intended for infants and toddlers.

Similar issues exist with DEHP, a plasticizer found in polyvinyl chloride (PVC). Because this additive is not tightly bound to the plastics in which it is used, the potential exists for DEHP to leach out and enter the body, causing unwanted exposure and affecting health. Both animal and human studies suggest DEHP may produce harmful effects, including insulin resistance, increased waist circumference and changes to male and female reproductive systems.

A variety of other plastic-related chemicals are currently under evaluation by Halden's group for their adverse effects on health and the environment. These include polyhalogenated flame retardants, polyfluorinated compounds and antimicrobials containing plastic additives such as triclosan and triclocarban.

While researchers are still at the early stages of assessing the risks to human health posed by plastics use, negative impacts on the environment have been a growing concern for many years. Over 300 million metric tons of plastics are produced worldwide each year. Roughly 50 percent of this volume is made up of products disposed of within one year of purchase.

Plastics today represent 15-25 percent of all hospital waste in the U.S. Some newer plastics are biodegradable, but the rest must be incinerated, disposed of in landfills, or recycled. All of these methods have drawbacks and carry environmental risk, as the new study explains.

Biodegradable plastics may break down in the environment into smaller polymer constituents, which may still pose a risk to the environment. Incineration liberates greenhouse gases associated with climate change. Landfilling of plastics, particularly in the enormous volumes now produced, may be an impractical use of land resources, and a danger exists of plastics constituents entering the groundwater. Finally, recycling of plastics requires careful sorting of plastic material, which is difficult. Recycled plastics tend to be of lower quality and may not be practical for health care and other applications.

As Halden explains, the problems posed by plastics need to be addressed on several fronts, and current research offers significant hope for improvements to human and environmental health. Better biodegradable plastics are now being developed using carbon dioxide and carbon monoxide compounds, with metal complexes applied as catalysts.

The technique provides a double benefit, binding unwanted greenhouse gases while avoiding competition with the human food supply. (Conventional bioplastics are made from plant sources like corn and molasses.) One application would be to replace the BPA-containing epoxy resins lining metal food cans, thereby dramatically reducing BPA exposure while also sequestering 180 million metric tons of carbon dioxide emissions.

The use of disposable items is also undergoing a reevaluation, in light of the potential environmental toll. In some cases, reusable plastic products are gaining ground, and estimates suggest the potential for a 50 percent reduction in medical equipment costs. Almost a quarter of all U.S. hospitals are now using reprocessing to decrease disposable waste.

Nevertheless, the largest source of plastics-related environmental damage stems from the overuse of items whose long-term harm outweighs their short-term benefit. Typically, these are consumer convenience items, often quickly discarded after a short use-life, including plastic water bottles, grocery bags, packaging, Styrofoam cups, Teflon-coated dental floss and other products. Halden recommends a thorough life-cycle assessment of plastics-based products, to identify safer, more sustainable replacement materials that reduce adverse effects to the environment and human health from plastic consumption.

"Many current types and consumption patterns of plastics are unsusustainable, as indicated by harmful plastic components circulating in our blood streams and multiple giant garbage patches of plastic debris swirling in the world's oceans. Continued use of plastics into the future will require us to redesign these indispensible materials of daily life to make them compatible with human health and the ecosystems we rely on," says Halden.

Story Source:
The above story is reprinted from materials provided by Arizona State University. The original article was written by Richard Harth.

Unprecedented glacier melting in the Andes blamed on climate change

Jan. 22, 2013 — Glaciers in the tropical Andes have been retreating at an increasing rate since the 1970s, scientists write in the most comprehensive review to date of Andean glacier observations. The researchers blame the melting on rising temperatures, the region having warmed about 0.7°C over the period 1950-1994. This unprecedented retreat could affect the water supply of Andean populations in the near future. These conclusions are published January 22 in The Cryosphere, an open access journal of the European Geosciences Union (EGU).

The international team of scientists -- uniting researchers from Europe, South America and the US -- shows in the new paper that, since the 1970s, glaciers in the tropical Andes have been melting at a rate unprecedented in the past 300 years. Globally, glaciers have been retreating at a moderate pace as the planet warmed after the peak of the Little Ice Age, a cold period lasting from the 16th to the mid-19th century. Over the past few decades, however, the rate of melting has increased steeply in the tropical Andes. Glaciers in the mountain range have shrunk by an average of 30-50% since the 1970s, according to Antoine Rabatel, researcher at the Laboratory for Glaciology and Environmental Geophysics in Grenoble, France, and lead author of the study.

Glaciers are retreating everywhere in the tropical Andes, but the melting is more pronounced for small glaciers at low altitudes, the authors report. Glaciers at altitudes below 5,400 metres have lost about 1.35 metres in ice thickness (an average of 1.2 metres of water equivalent [see note]) per year since the late 1970s, twice the rate of the larger, high-altitude glaciers.

"Because the maximum thickness of these small, low-altitude glaciers rarely exceeds 40 metres, with such an annual loss they will probably completely disappear within the coming decades," says Rabatel.

The researchers further report that the amount of rainfall in the region did not change much over the past few decades and, therefore, cannot account for changes in glacier retreat. Instead, climate change is to blame for the melting: regional temperatures increased an average of 0.15°C per decade over the 1950-1994 period.

"Our study is important in the run-up to the next IPCC report, coming out in 2013," says Rabatel. The Intergovernmental Panel on Climate Change (IPCC) has pointed out that tropical glaciers are key indicators of recent climate change as they are particularly sensitive to temperature changes. The tropical Andes host 99% of all tropical glaciers in the world, most of them in Peru.

The research is also important to anticipate the future behaviour of Andean glaciers and the impact of their accelerated melting on the region. "The ongoing recession of Andean glaciers will become increasingly problematic for regions depending on water resources supplied by glacierised mountain catchments, particularly in Peru," the scientists write. Without changes in precipitation, the region could face water shortages in the future.

The Santa River valley in Peru will be most affected, as its hundreds of thousands of inhabitants heavily rely on glacier water for agriculture, domestic consumption, and hydropower. Large cities, such as La Paz in Bolivia, could also face shortages. "Glaciers provide about 15% of the La Paz water supply throughout the year, increasing to about 27% during the dry season," says Alvaro Soruco, a Bolivian researcher who took part in the study.

In their comprehensive review of Andean glaciers, the scientists synthesised data collected over several decades, some dating as far back as the 1940s. "The methods we used to monitor glacier changes in this region include field observations of glacier mass balance, and remote-sensing measurements based on aerial photographs and satellite images for glacier surface and volume changes," explains Rabatel.

The study takes into account data collected for glaciers in Colombia, Ecuador, Peru and Bolivia, covering a total of almost a thousand square kilometres. This corresponds to about 50% of the total area covered by glaciers in the tropical Andes in the early 2000s.

The research was conducted to provide the scientific community with a comprehensive overview of the status of glaciers in the tropical Andes, to determine the rate of retreat, and to identify potential causes of the melting. But the authors hope the results can have a wider impact.

"This study has been conducted with scientific motivations, but if the insight it provides can motivate political decisions to mitigate anthropogenic impact on climate and glacier retreat, it will be an important step forward," Rabatel concludes.

Note
Glacier mass balance is the difference between ice accumulation and ablation (melting and sublimation) in a glacier. Scientists express the annual mass balance in metre water equivalent (m w.e.). A loss of 1.2 m w.e. corresponds to a reduction of about 1.35 metres in ice thickness.
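
The conversion in the note follows from the ratio of water density to glacier-ice density; assuming an ice density of roughly 900 kg/m³ (the density value is an assumption, not stated in the article):

```latex
% Ice-thickness change from a mass-balance loss expressed in metres water
% equivalent, assuming \rho_{ice} \approx 900 kg m^{-3} (assumed value):
\Delta h_{\mathrm{ice}} \;=\; \Delta h_{\mathrm{w.e.}}\,
    \frac{\rho_{\mathrm{water}}}{\rho_{\mathrm{ice}}}
    \;\approx\; 1.2\ \mathrm{m}\times\frac{1000}{900}
    \;\approx\; 1.33\ \mathrm{m}
% consistent with the "about 1.35 metres" quoted in the note above.
```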

Story Source:
The above story is reprinted from materials provided by European Geosciences Union.

2012 Antarctic ozone hole second smallest in 20 years

Oct. 24, 2012 — The average area covered by the Antarctic ozone hole this year was the second smallest in the last 20 years, according to data from NASA and National Oceanic and Atmospheric Administration (NOAA) satellites. Scientists attribute the change to warmer temperatures in the Antarctic lower stratosphere.

The ozone hole reached its maximum size Sept. 22, covering 8.2 million square miles (21.2 million square kilometers), or the area of the United States, Canada and Mexico combined. The average size of the 2012 ozone hole was 6.9 million square miles (17.9 million square kilometers). The Sept. 6, 2000 ozone hole was the largest on record at 11.5 million square miles (29.9 million square kilometers).
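
The paired square-mile and square-kilometre figures can be cross-checked with the standard conversion factor (1 mi² is about 2.59 km²); they agree to rounding.

```python
# Cross-check of the ozone-hole areas quoted above (1 mi^2 = 2.58999 km^2).
MI2_TO_KM2 = 2.58999

areas_mi2 = {
    "2012 maximum (Sept. 22)": 8.2e6,
    "2012 average": 6.9e6,
    "2000 record (Sept. 6)": 11.5e6,
}
for label, mi2 in areas_mi2.items():
    print(f"{label}: {mi2 / 1e6:.1f} million mi^2 = "
          f"{mi2 * MI2_TO_KM2 / 1e6:.1f} million km^2")
```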

"The ozone hole mainly is caused by chlorine from human-produced chemicals, and these chlorine levels are still sizable in the Antarctic stratosphere," said NASA atmospheric scientist Paul Newman of NASA's Goddard Space Flight Center in Greenbelt, Md. "Natural fluctuations in weather patterns resulted in warmer stratospheric temperatures this year. These temperatures led to a smaller ozone hole."

Observing Earth's Ozone Layer
The ozone layer acts as Earth's natural shield against ultraviolet radiation, which can cause skin cancer. The ozone hole phenomenon began making a yearly appearance in the early 1980s. The Antarctic ozone layer likely will not return to its early 1980s state until about 2065, Newman said. The lengthy recovery is because of the long lifetimes of ozone-depleting substances in the atmosphere. Overall atmospheric ozone is no longer declining as concentrations of ozone-depleting substances decrease. That decrease is the result of an international agreement regulating the production of certain chemicals.

This year also marked a change in the concentration of ozone over the Antarctic. The minimum value of total ozone in the ozone hole was the second highest level in two decades. Total ozone, measured in Dobson units (DU), reached 124 DU on Oct. 1. NOAA ground-based measurements at the South Pole recorded 136 DU on Oct. 5. When the ozone hole is not present, total ozone typically ranges from 240-500 DU.
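
For scale, one Dobson unit corresponds to a column density of about 2.69 × 10^16 ozone molecules per square centimetre (the standard definition of the unit); the quoted readings translate as follows.

```python
# Convert the total-ozone readings quoted above from Dobson units (DU) to
# column density, using the standard definition 1 DU ~ 2.69e16 molecules/cm^2.
DU_TO_MOLEC_CM2 = 2.69e16

readings_du = {
    "ozone-hole minimum, Oct. 1": 124,
    "South Pole ground reading, Oct. 5": 136,
    "typical lower bound without a hole": 240,
}
for label, du in readings_du.items():
    print(f"{label}: {du} DU = {du * DU_TO_MOLEC_CM2:.2e} molecules/cm^2")
```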

This is the first year that the growth of the ozone hole has been observed by an ozone-monitoring instrument on the Suomi National Polar-orbiting Partnership (NPP) satellite. The instrument, called the Ozone Mapping Profiler Suite (OMPS), is based on previous instruments, such as the Total Ozone Mapping Spectrometer (TOMS) and the Solar Backscatter Ultraviolet instrument (SBUV/2). OMPS continues a satellite record dating back to the early 1970s.

In addition to observing the annual formation and extent of the ozone hole, scientists hope OMPS will help them better understand ozone destruction in the middle and upper stratosphere with its Nadir Profiler. Ozone variations in the lower stratosphere will be measured with its Limb Profiler.

"OMPS Limb looks sideways, and it can measure ozone as a function of height," said Pawan K. Bhartia, a NASA atmospheric physicist and OMPS Limb instrument lead. "This OMPS instrument allows us to more closely see the vertical development of Antarctic ozone depletion in the lower stratosphere where the ozone hole occurs."

NASA and NOAA have been monitoring the ozone layer on the ground and with a variety of instruments on satellites and balloons since the 1970s. Long-term ozone monitoring instruments have included TOMS, SBUV/2, the Stratospheric Aerosol and Gas Experiment series of instruments, the Microwave Limb Sounder, the Ozone Monitoring Instrument, and the OMPS instrument on Suomi NPP. Suomi NPP is a bridging mission leading to the next-generation polar-orbiting environmental satellites, called the Joint Polar Satellite System, which will extend ozone monitoring into the 2030s.

NASA and NOAA have a mandate under the Clean Air Act to monitor ozone-depleting gases and stratospheric depletion of ozone. NOAA complies with this mandate by monitoring ozone via ground and satellite measurements. The NOAA Earth System Research Laboratory in Boulder, Colo., performs the ground-based monitoring. The Climate Prediction Center performs the satellite monitoring.

Story Source:
The above story is reprinted from materials provided by NASA/Goddard Space Flight Center.


Thursday, January 17, 2013

Salinization of rivers: A global environmental problem

Jan. 11, 2013 — The salinisation of rivers is a global problem that affects countries all over the world, carries high environmental and economic costs, and poses a risk to global health. Climate change and increasing water consumption could make the situation even worse in the future, according to an article published in the journal Environmental Pollution, based on research led by Narcís Prat and Miguel Cañedo-Argüelles of the Department of Ecology of the University of Barcelona, together with an international team.

Human activity increases the salinity of river ecosystems
River salinity can be natural, caused by the geology or climate of the area, or anthropogenic, in other words caused by domestic and industrial waste discharges, mining activity, agricultural and farming residues, and so on. In river ecosystems worldwide, excessive salt concentrations caused by human activity threaten the survival of organisms and communities, biodiversity and the ecosystem's biological balance, and they produce severe economic and public health problems.

According to Miguel Cañedo-Argüelles, the main author of the article, "this article aims to give an integrated view and emphasise the seriousness of the ecological, economic and global health effects of secondary salinisation." The expert stresses that it is a global process: "It happens in many regions all over the world, although awareness of the problem is very limited." The most extreme cases of salinisation occur in some Australian rivers. "However," Cañedo-Argüelles adds, "in this case local studies have been done to clearly diagnose the problem; as a result, all the agents who make use of the natural resources of these rivers (farmers, industrialists, etc.) have collaborated in the search for solutions."

In Europe, the process of river salinisation by human action is getting worse as the years go by. "It is also a problem in Spain," says Professor Narcís Prat, director of the Freshwater Ecology and Management (FEM) research group at the UB. "In the Ebro plain, due to the soil's characteristics and the kind of agricultural activity performed, rivers are saltier than in Australia," he explains, "but here river conservation is not among the priorities of water resources management," so these problems go unsolved. According to Prat, the situation is even worse in the region of Murcia: "It is a semi-arid area where irrigation is a common activity, and rivers are saline as a result of the excessive exploitation of water resources."

What is the degree of salinity of Catalan rivers?
In the Catalan river system there are also stretches with high levels of salinity. Specifically, the experts have studied the salinisation of the Llobregat River basin with the support of Mesocosmos Sostaqua, an infrastructure located at the water-treatment plant of Balsareny. The installation, which reproduces the natural conditions of the river ecosystem, was built by the FEM group of the UB and the company Aigües de Barcelona. "We are aware of the salinity of the Llobregat River," Narcís Prat affirms, "but apart from the salt there are also other factors that can damage the environmental quality of the water. Therefore, sometimes we cannot determine what matters more: the salt or the pollution produced by other factors. With the mesocosm, we can study the effect of each factor separately (for example, the salt concentration) and distinguish its influence from that of the other factors." Despite the qualitative improvement of the Llobregat's water thanks to the brine collector, which carries mining leachates directly to the sea, the UB experts warn that salinity remains an open question because the collector has not been able to solve all the problems. According to Narcís Prat, "the level of salinity in the lower course of the Llobregat River, from where the potash mining area begins, is so high that its use can only be agricultural, not human. It is not as alarming a situation as in Australia, but it is worrying. The situation is the same in the lower course of the Besòs River: its water is getting saltier and saltier, and in this case the reason is not mining but all the water-softening processes (as in dishwashers, where we add salt to avoid the stains that lime would cause)."

The experts explain that excessive salt also has a negative effect on the treatment of water for drinking. For example, it makes it necessary to install new technologies, such as reverse osmosis, which have raised the cost of producing drinking water at the plants of Abrera and Sant Joan Despí. In addition, the use of chlorine to make water drinkable produces many chemical compounds (borates, chlorates, trihalomethanes, etc.) which can be toxic to the environment and to health.

Looking for solutions
According to the article, current legislation is generally lax when it comes to establishing limits for salt concentrations in rivers. In Europe, salinisation is not considered an important problem and no legally prescribed environmental quality standards exist for salt. In many countries, business and industrial interests prevail over the need to set limiting regulations. Miguel Cañedo-Argüelles considers that "legislation is still lagging. People are not aware of the severity of the problem, and information about the effects of excessive salt on river ecosystems is lacking."

In the article, the authors also cite some successful management strategies, for example the Hunter River salinity trading scheme upstream of Singleton (Australia), with controlled salt discharges adapted to the flow of the river: when the flow is high, more salt may be discharged, whereas when it drops the quantity of salt is reduced. A minimal sketch of such a rule follows.
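
To make the idea concrete, here is a minimal sketch of a flow-dependent discharge rule in the spirit of the Hunter scheme; the thresholds and allowances below are invented for illustration and are not the real scheme's parameters.

```python
# Simplified sketch of a flow-dependent salt-discharge rule, loosely modelled
# on the idea behind the Hunter River scheme described above. The thresholds
# and allowances are invented for illustration; they are not the real rules.
def allowed_discharge_tonnes(river_flow_m3s: float) -> float:
    """Return the salt discharge allowed for a given river flow."""
    if river_flow_m3s < 10.0:     # low flow: no discharge permitted
        return 0.0
    elif river_flow_m3s < 100.0:  # moderate flow: limited discharge
        return 5.0
    else:                         # high flow: more salt can be diluted safely
        return 50.0

for flow in (5.0, 50.0, 500.0):
    print(f"flow {flow:6.1f} m^3/s -> up to {allowed_discharge_tonnes(flow):.0f} t of salt")
```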

In the future
The study states that the effects of global change could further increase the salinisation of rivers in many regions. Miguel Cañedo-Argüelles thinks that "it is difficult to predict the impact of climate change. Compared with other regions of the planet, the Mediterranean region is expected to see lower rainfall, worse droughts, greater water consumption and, therefore, more salinity in rivers." Finally, Narcís Prat concludes that "the most important thing is to stop fighting and begin to work together. It is necessary to react to the problem of excessive salinity in Catalan and worldwide rivers before it becomes an even more severe problem."

The article is also authored by Ben J. Kefford from the University of Technology, Sydney (Australia); Christophe Piscart from the University of Lyon (France); Ralf B. Schäfer from the University of Koblenz-Landau (Germany); and Claus-Jürgen Schulz from the Thuringian State Institute for Environment and Geology (TLUG, Germany).

Story Source:
The above story is reprinted from materials provided by Universidad de Barcelona.


Where there's smoke or smog, there's climate change

Jan. 15, 2013 — In addition to causing smoggy skies and chronic coughs, soot -- or black carbon -- turns out to be the number two contributor to global warming. It's second only to carbon dioxide, according to a four-year assessment by an international panel.

The new study concludes that black carbon, the soot particles in smoke and smog, contributes about twice as much to global warming as previously estimated, even by the 2007 Intergovernmental Panel on Climate Change.

"We were surprised at its potential contribution to climate," said Sarah Doherty, a University of Washington atmospheric scientist and one of four coordinating lead authors. The silver lining may be that controlling these emissions can deliver more immediate climate benefits than trying to control carbon dioxide, she said.

The paper was made freely available online Jan. 15 in the Journal of Geophysical Research-Atmospheres.

Some previous research had hinted that models were underestimating black-carbon emissions, Doherty said, from such things as open burning of forests, crops and grasslands, and from energy-related emissions in Southeast Asia and East Asia.

Black carbon's role in climate is complex. Dark particles in the air work to shade Earth's surface while warming the atmosphere. Black carbon that settles on the surface of snow and ice darkens the surface to absorb more sunlight and increase melting. Finally, soot particles influence cloud formation in ways that can have either a cooling or warming impact.

The report surveyed past studies and included new research to quantify the sources of black carbon and better understand its overall effect on the climate.

Doherty was executive director of the International Global Atmospheric Chemistry Project in 2009 when policy groups were seeking better information on the benefits of reducing black-carbon emissions. The scientific body undertook a comprehensive assessment, supported by IGAC and the U.S. National Oceanic and Atmospheric Administration.

"Because of a lack of action to reduce carbon dioxide emissions, the policy community is asking what else we can do, particularly to help places like the Arctic that are melting much more quickly than we had anticipated," Doherty said. "We hope reducing black-carbon emissions buys us some time. But it doesn't replace cutting back on CO2 emissions."

While carbon dioxide persists in the atmosphere for on the order of a century, black carbon stays in the atmosphere for only a few days.

The authors investigated various sources of black carbon to see which reductions might have the most short-term cooling impact. Regulating emissions from diesel engines, followed by replacing some wood- and coal-burning household stoves, would have the greatest immediate cooling impact, the authors find.

"If you're just thinking about impact on climate, you would want to be strategic about which sources you cut back on," Doherty said. "We looked at the overall impact because some of these sources also emit associated particles that can have counteracting effects."

Black carbon contributes to climate change in the mid to high latitudes, including the northern United States, Canada, northern Europe and northern Asia, and also affects rainfall patterns of the Asian monsoon.

The report incorporates data that Doherty and co-author Stephen Warren, a UW professor of atmospheric sciences, gathered between 2007 and 2009 to measure soot on Arctic snow. Calculating black carbon deposits in the Arctic is difficult, so data are essential for testing and correcting models.

First author Tami Bond, now at the University of Illinois, earned a doctoral degree at the UW in 2000 that combined engineering, chemistry and atmospheric science to measure emissions from burning that have atmospheric importance.

"Mitigating black carbon is good for curbing short-term climate change, but to really solve the long-term climate problem, carbon dioxide emissions must also be reduced," Bond said.

In related research, Doherty, Warren and UW graduate student Cheng Dang will travel next month to Colorado, Wyoming, the Dakotas, Saskatchewan, Manitoba and elsewhere to collect snow samples and investigate black carbon's effects on North America's Great Plains.

Story Source:
The above story is reprinted from materials provided by University of Washington. The original article was written by Hannah Hickey.

Wednesday, January 16, 2013

Gas that triggers ozone destruction revealed

Jan. 13, 2013 — Scientists at the Universities of York and Leeds have made a significant discovery about the cause of the destruction of ozone over oceans.

They have established that the majority of ozone-depleting iodine oxide observed over the remote ocean comes from a previously unknown marine source.

The research team found that the principal source of iodine oxide can be explained by emissions of hypoiodous acid (HOI) – a gas not previously considered to be released from the ocean – along with a contribution from molecular iodine (I2).

Since the 1970s when methyl iodide (CH3I) was discovered as ubiquitous in the ocean, the presence of iodine in the atmosphere has been understood to arise mainly from emissions of organic compounds from phytoplankton -- microscopic marine plants.

This new research, which is published in Nature Geoscience, builds on an earlier study which showed that reactive iodine, along with bromine, in the atmosphere is responsible for the destruction of vast amounts of ozone – around 50 per cent more than predicted by the world’s most advanced climate models – in the lower atmosphere over the tropical Atlantic Ocean.

The scientists quantified gaseous emissions of inorganic iodine following the reaction of iodide with ozone in a series of laboratory experiments. They showed that the reaction of iodide with ozone leads to the formation of both molecular iodine and hypoiodous acid. Using laboratory models, they show that the reaction of ozone with iodide on the sea surface could account for around 75 per cent of observed iodine oxide levels over the tropical Atlantic Ocean.
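
A simplified reaction scheme consistent with this description (a standard textbook formulation added here for clarity, not equations quoted from the paper) is:

```latex
% Sea-surface iodine chemistry, simplified (illustrative formulation, not
% quoted from the paper):
\mathrm{O_3 + I^- + H^+ \;\longrightarrow\; HOI + O_2}   % ozone oxidises iodide
\mathrm{HOI + I^- + H^+ \;\longrightarrow\; I_2 + H_2O}  % yields molecular iodine
```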

Professor Lucy Carpenter, of the Department of Chemistry at York, said: “Our laboratory and modelling studies show that these gases are produced from the reaction of atmospheric ozone with iodide on the sea surface interfacial layer, at a rate which is highly significant for the chemistry of the marine atmosphere.

“Our research reveals an important negative feedback for ozone – a sort of self-destruct mechanism. The more ozone there is, the more gaseous halogens are created which destroy it. The research also has implications for the way that radionucleides of iodine in seawater, released into the ocean mainly from nuclear reprocessing facilities, can be re-emitted into the atmosphere.”

Professor John Plane, from the University of Leeds’ School of Chemistry, said: “This mechanism of iodine release into the atmosphere appears to be particularly important over tropical oceans, where measurements show that there is more iodide in seawater available to react with ozone. The rate of the process also appears to be faster in warmer water. The negative feedback for ozone should therefore be particularly important for removing ozone in the outflows of pollution from major cities in the coastal tropics.”

The research was funded by the UK Natural Environment Research Council SOLAS (Surface Ocean Lower Atmosphere) programme.

Story Source:
The above story is reprinted from materials provided by University of York, via AlphaGalileo.

Global warming has increased monthly heat records worldwide by a factor of five, study finds

Jan. 14, 2013 — Monthly temperature extremes have become much more frequent, as measurements from around the world indicate. On average, there are now five times as many record-breaking hot months worldwide as could be expected without long-term global warming, shows a study now published in Climatic Change. In parts of Europe, Africa and southern Asia, the number of monthly records has increased by as much as a factor of ten. Eighty percent of observed monthly records would not have occurred without human influence on climate, conclude the authors from the Potsdam Institute for Climate Impact Research (PIK) and the Complutense University of Madrid.

"The last decade brought unprecedented heat waves; for instance in the US in 2012, in Russia in 2010, in Australia in 2009, and in Europe in 2003," lead-author Dim Coumou says. "Heat extremes are causing many deaths, major forest fires, and harvest losses -- societies and ecosystems are not adapted to ever new record-breaking temperatures." The new study relies on 131 years of monthly temperature data for more than 12,000 grid points around the world, provided by NASA. Comprehensive analysis reveals the increase in records.

The researchers developed a robust statistical model that explains the surge in the number of records to be a consequence of the long-term global warming trend. That surge has been particularly steep over the last 40 years, due to a steep global-warming trend over this period. Superimposed on this long-term rise, the data show the effect of natural variability, with especially high numbers of heat records during years with El Niño events. This natural variability, however, does not explain the overall development of record events, found the researchers.
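
The core statistical idea can be illustrated with a small Monte Carlo sketch (not the authors' model): in a stationary climate the chance that the latest of n values sets a record is 1/n, and an added warming trend inflates that rate. The trend value below is invented for illustration.

```python
import numpy as np

# Toy Monte Carlo of record-breaking months (a sketch of the statistical idea,
# not the published model). In a stationary climate the probability that the
# last of n values is a record is 1/n; a warming trend inflates that rate.
rng = np.random.default_rng(42)

n_years = 131          # length of the record, matching the data period
n_series = 12000       # independent series, like the study's grid points
trend_per_year = 0.01  # assumed warming trend, in units of interannual std dev

def record_rate(trend: float) -> float:
    """Fraction of series whose final year sets a new record."""
    t = np.arange(n_years)
    series = rng.normal(size=(n_series, n_years)) + trend * t
    return float(np.mean(series[:, -1] >= series.max(axis=1)))

stationary = record_rate(0.0)  # expect roughly 1/131, about 0.8 percent
with_trend = record_rate(trend_per_year)
print(f"stationary: {stationary:.2%}, with trend: {with_trend:.2%}, "
      f"ratio: {with_trend / stationary:.1f}x")
```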

Natural variability does not explain the overall development of record events
If global warming continues, the study projects that the number of new monthly records will be 12 times as high in 30 years as it would be without climate change. "Now this doesn't mean there will be 12 times more hot summers in Europe than today -- it actually is worse," Coumou points out. For the new records set in the 2040s will not just be hot by today's standards. "To count as new records, they actually have to beat heat records set in the 2020s and 2030s, which will already be hotter than anything we have experienced to date," explains Coumou. "And this is just the global average -- in some continental regions, the increase in new records will be even greater."

"Statistics alone cannot tell us what the cause of any single heat wave is, but they show a large and systematic increase in the number of heat records due to global warming," says Stefan Rahmstorf, a co-author of the study and co-chair of PIK's research domain Earth System Analysis. "Today, this increase is already so large that by far most monthly heat records are due to climate change. The science is clear that only a small fraction would have occurred naturally."

Story Source:
The above story is reprinted from materials provided by Potsdam Institute for Climate Impact Research (PIK).