Friday, March 29, 2013

New type of solar structure cools buildings in full sunlight

Mar. 27, 2013 — A Stanford team has designed an entirely new form of cooling panel that works even when the sun is shining. Such a panel could vastly improve the daytime cooling of buildings, cars and other structures by reflecting sunlight and radiating heat back into the chilly vacuum of space.

Homes and buildings chilled without air conditioners. Car interiors that don't heat up in the summer sun. Tapping the frigid expanses of outer space to cool the planet. Science fiction, you say? Well, maybe not any more.

A team of researchers at Stanford has designed an entirely new form of cooling structure that works even when the sun is shining. The structure reflects sunlight back into the chilly vacuum of space while radiating away heat, and could vastly improve the daytime cooling of buildings, cars and other structures. The paper describing the device was published March 5 in Nano Letters.

"People usually see space as a source of heat from the sun, but away from the sun outer space is really a cold, cold place," explained Shanhui Fan, professor of electrical engineering and the paper's senior author. "We've developed a new type of structure that reflects the vast majority of sunlight, while at the same time it sends heat into that coldness, which cools human-made structures even in the daytime."

The trick, from an engineering standpoint, is two-fold. First, the reflector has to reflect as much of the sunlight as possible. Poor reflectors absorb too much sunlight, heating up in the process and defeating the purpose of cooling.

The second challenge is that the structure must efficiently radiate heat back into space. Thus, it must emit thermal radiation very efficiently within a specific wavelength range in which the atmosphere is nearly transparent. Outside this range, Earth's atmosphere absorbs the radiation and sends much of it back down. Most people are familiar with this phenomenon as the greenhouse effect -- the mechanism behind global climate change.

Two goals in one
The new structure accomplishes both goals. It is an effective broadband mirror for solar light -- it reflects most of the sunlight. It also emits thermal radiation very efficiently within the crucial wavelength range needed to escape Earth's atmosphere.

Radiative cooling at nighttime has been studied extensively as a mitigation strategy for climate change, yet peak demand for cooling occurs in the daytime.

"No one had yet been able to surmount the challenges of daytime radiative cooling -- of cooling when the sun is shining," said Eden Rephaeli, a doctoral candidate in Fan's lab and a co-first-author of the paper. "It's a big hurdle."

The Stanford team has succeeded where others have come up short by turning to nanostructured photonic materials. These materials can be engineered to enhance or suppress light reflection in certain wavelengths.

"We've taken a very different approach compared to previous efforts in this field," said Aaswath Raman, a doctoral candidate in Fan's lab and a co-first-author of the paper. "We combine the thermal emitter and solar reflector into one device, making it both higher performance and much more robust and practically relevant. In particular, we're very excited because this design makes viable both industrial-scale and off-grid applications."

Using engineered nanophotonic materials, the team was able to strongly suppress how much heat-inducing sunlight the panel absorbs, while radiating heat very efficiently in the key frequency range necessary to escape Earth's atmosphere. The material is made of quartz and silicon carbide, both very weak absorbers of sunlight.

Net cooling power
The new device is capable of achieving a net cooling power in excess of 100 watts per square meter. By comparison, today's standard 10-percent-efficient solar panels generate about the same amount of power. That means Fan's radiative cooling panels could theoretically be substituted on rooftops where existing solar panels feed electricity to the air conditioning systems needed to cool the building.

To put it a different way, a typical one-story, single-family house with just 10 percent of its roof covered by radiative cooling panels could offset 35 percent of its entire air conditioning needs during the hottest hours of the summer.
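As a rough sanity check of these figures, the arithmetic can be sketched in a few lines. Note that the roof area here is an assumed value for a typical single-story house, not a number given in the article:

```python
# Back-of-envelope check of the rooftop figures.
# ROOF_AREA_M2 is an illustrative assumption; the article does not state it.
NET_COOLING_W_PER_M2 = 100      # net cooling power reported for the panel
ROOF_AREA_M2 = 150              # assumed single-story house roof
PANEL_FRACTION = 0.10           # 10 percent of the roof covered

panel_area = ROOF_AREA_M2 * PANEL_FRACTION          # area devoted to panels
cooling_watts = panel_area * NET_COOLING_W_PER_M2   # passive cooling delivered

print(f"{panel_area:.0f} m^2 of panels -> {cooling_watts:.0f} W of passive cooling")
```

Under these assumptions the panels deliver about 1.5 kilowatts of continuous passive cooling, which gives a sense of the scale behind the 35 percent claim.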

Radiative cooling has another profound advantage over other cooling strategies, such as air conditioning: it is a passive technology. It requires no energy, has no moving parts and is easy to maintain. You put it on the roof or the sides of buildings and it starts working immediately.

A changing vision of cooling
Beyond the commercial implications, Fan and his collaborators foresee a broad potential social impact. Much of the human population on Earth lives in sun-drenched regions huddled around the equator. Electrical demand to drive air conditioners is skyrocketing in these places, presenting an economic and an environmental challenge. These areas tend to be poor and the power necessary to drive cooling usually means fossil-fuel power plants that compound the greenhouse gas problem.

"In addition to these regions, we can foresee applications for radiative cooling in off-the-grid areas of the developing world where air conditioning is not even possible at this time. There are large numbers of people who could benefit from such systems," Fan said.

Story Source:
The above story is reprinted from materials provided by Stanford School of Engineering. The original article was written by Andrew Myers.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Wednesday, March 27, 2013

Protected areas successfully prevent deforestation in Amazon rainforest

Mar. 11, 2013 — Strictly protected areas such as national parks and biological reserves have been more effective at reducing deforestation in the Amazon rainforest than so-called sustainable-use areas that allow for controlled resource extraction, two University of Michigan researchers and their colleagues have found.

In addition, protected areas established primarily to safeguard the rights and livelihoods of indigenous people performed especially well in places where deforestation pressures are high. The U-M-led study, which found that all forms of protection successfully limit deforestation, is scheduled for online publication March 11 in the Proceedings of the National Academy of Sciences.

The lead author is Christoph Nolte, a doctoral candidate at the U-M School of Natural Resources and Environment. Co-authors include Arun Agrawal, a professor of natural resources at SNRE.

"Perhaps the biggest surprise is the finding that indigenous lands perform the best when it comes to lower deforestation in contexts of high deforestation pressure," Agrawal said. "Many observers have suggested that granting substantial autonomy and land rights to indigenous people over vast tracts of land in the Amazon will lead to high levels of deforestation because indigenous groups would want to take advantage of the resources at their disposal.

"This study shows that -- based on current evidence -- such fears are misplaced," he said.

Preventing deforestation of rainforests is a goal for conserving biodiversity and, more recently, for reducing carbon emissions in the Brazilian Amazon, which covers an area of nearly 2 million square miles.

After making international headlines for historically high Amazon deforestation rates between 2000 and 2005, Brazil achieved radical reductions in deforestation rates in the second half of the past decade. Although part of those reductions was attributed to price declines of agricultural commodities, recent analyses also show that regulatory government policies -- including a drastic increase in enforcement activities and the expansion and strengthening of protected-area networks -- contributed significantly to the observed reductions.

In their study, the U-M researchers and their colleagues used new remote-sensing-based datasets from 292 protected areas in the Brazilian Amazon, along with a sophisticated statistical analysis, to assess the effectiveness of different types of protected areas. They looked at three categories of protected areas: strictly protected areas, sustainable use areas and indigenous lands.

Strictly protected areas -- state and national biological stations, biological reserves, and national and state parks -- consistently avoided more deforestation than sustainable-use areas, regardless of the level of deforestation pressure. Sustainable-use areas allow for controlled resource extraction, land use change and, in many instances, human settlements.

"Earlier analyses suggested that strict protection, because it allows no resource use, is so controversial that it is less likely to be implemented where deforestation pressures are high -- close to cities or areas of high agricultural value, for example," Nolte said.

"But we observed that recent designations of the Brazilian government placed new strictly protected areas in very high-pressure areas, attenuating this earlier argument," he said.

Hundreds of millions of people in the tropics depend on forests for their subsistence. Forest products that households rely on include firewood, fodder for livestock and timber for housing.

Co-authors of the PNAS paper are Kirsten M. Silvius of the Gordon and Betty Moore Foundation and Britaldo S. Soares-Filho of the Universidade Federal de Minas Gerais in Brazil.

The work was supported by the Gordon and Betty Moore Foundation, the Rights and Resources Initiative, the U-M Graham Sustainability Institute, the National Science Foundation and the Brazilian National Council for Scientific and Technological Development.

Story Source:
The above story is reprinted from materials provided by University of Michigan.

Prenatal exposure to pesticide DDT linked to adult high blood pressure

Mar. 12, 2013 — Infant girls exposed to high levels of the pesticide DDT while still inside the womb are three times more likely to develop hypertension when they become adults, according to a new study led by the University of California, Davis.

Previous studies have shown that adults exposed to DDT (dichlorodiphenyltrichloroethane) are at an increased risk of high blood pressure. But this study, published online March 12 in Environmental Health Perspectives, is the first to link prenatal DDT exposure to hypertension in adults.

Hypertension, or high blood pressure, is a major risk factor for heart disease, which remains the leading cause of death in the United States and worldwide.

"The prenatal period is exquisitely sensitive to environmental disturbance because that's when the tissues are developing," said study lead author Michele La Merrill, an assistant professor in the UC Davis Department of Environmental Toxicology.

The U.S. Environmental Protection Agency banned DDT in this country in 1972 after nearly three decades of use. However, the pesticide is still used for malaria control in other parts of the world, such as India and South Africa. That means children born in those areas could have a higher risk of hypertension as adults.

La Merrill said that traces of DDT, a persistent organic pollutant, also remain in the food system, primarily in fatty animal products.

The study examined concentrations of DDT in blood samples collected from women who had participated in the Child Health and Development Studies, an ongoing project of the nonprofit Public Health Institute. The CHDS recruited women who sought obstetric care through Kaiser Permanente Foundation Health Plan in the San Francisco Bay Area between 1959 and 1967. They also surveyed the adult daughters of those women to learn if they had developed hypertension.

"Evidence from our study shows that women born in the U.S. before DDT was banned have an increased risk of hypertension that might be explained by increased DDT exposure," said La Merrill. "And the children of people in areas where DDT is still used may have an increased risk, as well."

Story Source:
The above story is reprinted from materials provided by University of California - Davis, via EurekAlert!, a service of AAAS.

Canadian Arctic glacier melt accelerating, irreversible, projections suggest

Mar. 12, 2013 — Ongoing glacier loss in the Canadian high Arctic is accelerating and probably irreversible, new model projections by Lenaerts et al. suggest. The Canadian high Arctic is home to the largest clustering of glacier ice outside of Greenland and Antarctica -- 146,000 square kilometers (about 60,000 square miles) of glacier ice spread across 36,000 islands.

In the past few years, the mass of the glaciers in the Canadian Arctic archipelago has begun to plummet. Observations from NASA's Gravity Recovery and Climate Experiment (GRACE) satellites suggest that from 2004 to 2011 the region's glaciers shed approximately 580 gigatons of ice. Aside from glacier calving, which plays only a small role in Canadian glacier mass loss, the drop is due largely to a shift in the surface-mass balance, with warming-induced meltwater runoff outpacing the accumulation of new snowfall.

Using a coupled atmosphere-snow climate model, the authors reproduced the observed changes in glacier mass and projected future changes under continued warming. Driving the model with a climate reanalysis dataset for the period 1960 to 2011 and with a potential future warming pathway, the authors find that their model accurately reproduces observed glacier mass losses, including a recent uptick in the rate of the ice's decline.

The authors calculate that by 2100, when the Arctic archipelago is 6.5 Kelvin (about 12 degrees Fahrenheit) warmer, the rate of glacier mass loss will be roughly 144 gigatons per year, up from the present rate of 92 gigatons per year. In total, the researchers expect Canadian Arctic archipelago glaciers to lose around 18 percent of their mass by the end of the century. Given current warming trends, they suggest that the ongoing glacier loss is effectively irreversible.
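The rates and the 18 percent figure can be cross-checked with simple arithmetic, assuming (purely for illustration; the study's model is far more detailed) that the loss rate ramps linearly from today's value to the 2100 value:

```python
# Rough consistency check of the article's glacier numbers, under the
# illustrative assumption of a linear ramp in the loss rate.
rate_now, rate_2100 = 92.0, 144.0    # gigatons of ice lost per year
years = 2100 - 2013                  # ~87 years remaining in the century

# Trapezoid rule: average rate times elapsed years.
cumulative_loss = (rate_now + rate_2100) / 2 * years

# If that cumulative loss is 18 percent of the ice, the implied total mass:
implied_total_mass = cumulative_loss / 0.18

print(f"~{cumulative_loss:.0f} Gt lost by 2100, "
      f"implying ~{implied_total_mass:.0f} Gt of glacier ice today")
```

This yields a cumulative loss on the order of 10,000 gigatons, consistent in magnitude with the 580 gigatons GRACE observed over just 2004 to 2011.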

Story Source:
The above story is reprinted from materials provided by American Geophysical Union, via EurekAlert!, a service of AAAS.

Chemical pollutants threaten health in the Arctic

Mar. 15, 2013 — People living in Arctic areas can be more sensitive to pollutants due to their genetics, says researcher Arja Rautio at the Centre for Arctic Medicine at the University of Oulu, Finland. This is unfortunate, since the northernmost areas of Europe are receiving more harmful chemicals. Scientists believe climate change may be a culprit, as air and water mass movements push some of these undesirable chemicals towards the Arctic. "In real life, people are exposed to lots of chemicals," says Rautio, who leads studies into the human health effects of contaminants and the influence of climate change in an EU-funded project called ArcRisk, "and I think the people of the north are exposed to higher levels than, for example, the general population in Europe."

Many new contaminants like fluorinated and brominated compounds and bisphenol A can act on hormones and so have impacts on human health. But seeing an effect on humans, at the population level, could take ten or even 20 years, especially in the case of cancer, she adds. This is why ArcRisk has established a database containing data on concentration levels and trends of contaminants in humans. The project team analysed frozen blood samples collected in Norway in 1978, 1986, 1995 and 2008 for polychlorinated biphenyls (PCBs), chlorinated pesticides and polybrominated diphenylethers (PBDEs).

The main challenge that project scientists struggle with is disentangling the effects of contaminant chemicals from what we do in our everyday lives. "We know that dioxins can lead to more diabetes and high blood pressure," says Rautio, "but there are many other confounding factors. We are changing our diet and many of us are less active, and those lifestyle choices can also increase the risk of diseases like diabetes." The results of the project are due to be presented at the Arctic Frontiers conference in Tromsø, Norway, in January 2014.

Previous studies have also struggled to disentangle contaminant effects when trying to understand their impact on health. It is hard to tie individual chemicals to direct health impacts because people are exposed to so many chemicals simultaneously, cautions biologist Thomas Zoeller at the University of Massachusetts Amherst, USA. Besides, the human population is genetically variable and may react differently to the chemicals, and we do not even know which of the chemicals affect us.

"Moreover, some of these chemicals reside in the environment -- and in the body -- for a long time, and this means that they may build up," says Zoeller. He recently edited a World Health Organization report which warned that chronic diseases are increasing worldwide and many are related to hormones. It noted that known hormone-disrupting chemicals are "only the tip of the iceberg" and better tests are needed to catch others.

Health problems induced by these chemicals could be worse than anticipated. Some of the pollutants found in the Arctic by the project scientists, like the fluorinated compounds, have higher affinities for hormone receptors than even the natural hormones. "We have documented several direct harmful effects of these and other chemicals, especially in seabirds, top predators such as the glaucous and ivory gulls," says Geir Wing Gabrielsen, an environmental scientist at the Norwegian Polar Institute, who is not part of ArcRisk.

These animal studies already show worrying trends that do not bode well for humans. "When we see these findings in Arctic animals I am very concerned about what we will find with regards to humans, though we ourselves don't do human studies," Gabrielsen says. He notes that long periods of warm air are being transported to the Arctic and that the sea currents around places like the Svalbard islands [located midway between Norway and the North Pole] now consist of warmer Atlantic water; they used to consist of polar waters. "Climate change is having an effect and it is resulting in higher levels of contaminants in the environment and [therefore] also in the animals," Gabrielsen warns.

Rautio concludes that there is a need to clarify the effects so that people -- not only those living in the remote northern areas -- can make decisions about their own lives: what to eat, how to avoid exposure to harm.


Story Source:
The above story is reprinted from materials provided by youris.com.



Losing wetlands to grow crops

Mar. 25, 2013 — Getting enough to eat is a basic human need -- but at what cost to the environment? Research published in BioMed Central's journal Agriculture & Food Security demonstrates that as their crops on higher ground fail due to unreliable rainfall, people in countries like Uganda are increasingly relocating to wetland areas. Unless the needs of these people are addressed in a more sustainable way, overuse of wetland resources through farming, fishing, and hunting will continue.

In 2009 it was estimated that about a third of Uganda's wetlands had been lost to growing crops and grazing. While the environmental significance of wetland loss is important, so are National Food Security targets and the Millennium Development Goal of halving the number of people who suffer from hunger by 2015. In order to evaluate how people are using the wetlands, researchers from Makerere University, Uganda, with financial support from IDRC, surveyed residents of the Lake Victoria crescent, the Kyoga plains and the South Western farmlands.

The survey revealed that more than 80% of people in these areas use wetland resources, including collecting water, catching fish, hunting bush meat (Sitatunga, a type of antelope, and wild rat) and harvesting wild fruits and vegetables. Some of these they consume, but others they sell in order to be able to buy food. Over half admitted to growing crops in the wetlands' nutrient-rich soil, with its ready water supply. The families who were most likely to use the wetlands in this way were the ones who had the least access to other sources of food.

The locals blame their bad harvests on global warming, and as global weather systems change this can only get worse. Dr Nelson Turyahabwe explained, "Food insecurity is a real problem across the world. In Uganda the families most at risk tended to have younger or female household heads, or were less educated. Large families were also at high risk of not having enough to eat. In these cases use of wetlands allows families to survive. In designing sustainable use policies for wetlands, the needs of humans also need to be considered."

Story Source:
The above story is reprinted from materials provided by BioMed Central Limited, via AlphaGalileo.

Properly planned roads could help rather than harm the environment, say experts

Mar. 20, 2013 — Two leading ecologists say a rapid proliferation of roads across the planet is causing irreparable damage to nature, but properly planned roads could actually help the environment.

"Loggers, miners and other road builders are putting roads almost everywhere, including places they simply shouldn't go, such as wilderness areas," said Professor Andrew Balmford of the University of Cambridge, UK. "Some of these roads are causing environmental disasters."

"The current situation is largely chaos," said Professor William Laurance of James Cook University in Cairns, Australia. "Roads are going almost everywhere and often open a Pandora's Box of environmental problems."

"Just look at the Amazon rainforest," said Laurance. "Over 95 percent of all forest destruction and wildfires occur within 10 kilometers of roads, and there's now 100,000 kilometers of roads crisscrossing the Amazon."

But the researchers say it doesn't have to be like this. "Roads are like real estate," said Laurance. "It's 'location, location, location'. In the right places, roads can actually help protect nature."

The secret, say the scientists, is to plan roads carefully, keeping them out of wilderness areas and concentrating them in areas that are best-suited for farming and development.

"In such areas," said Balmford, "roads can improve farming, making it much easier to move crops to market and import fertilizers. This can increase farm profits, improve the livelihoods of rural residents, enhance food security and draw migrants away from vulnerable wilderness areas."

This will be crucial in the future, say the scientists, given that global farming production will need to double in the coming decades to feed up to 10 billion people.

Writing in the journal Nature, the researchers say a global mapping program is needed, to advise on where to put roads, where to avoid new roads and where to close down existing roads that are causing severe environmental damage.

"It's all about being proactive," said Laurance. "Ultimately, local decision-makers will decide where to put roads. But by working together, development experts, agriculturalists and ecologists could provide badly needed guidelines on where to build good roads rather than bad roads."

Story Source:
The above story is reprinted from materials provided by University of Cambridge. The original story is licensed under a Creative Commons Licence.

Sustainable Development Goals must sustain people and planet, experts say

Mar. 20, 2013 — In the wake of last week's meetings at the UN on the definition of the Sustainable Development Goals (SDGs), a group of international scientists has published a call in the journal Nature today, arguing for a set of six SDGs that link poverty eradication to the protection of Earth's life support system. The researchers argue that, in the face of increasing pressure on the planet's ability to support life, adherence to outdated definitions of sustainable development threatens to reverse progress made in developing countries over past decades.

Ending poverty and safeguarding Earth's life support system must be the twin priorities for the Sustainable Development Goals, say the researchers. The team identified six goals that, if met, would contribute to global sustainability while helping to alleviate poverty.

"Climate change and other global environmental threats will increasingly become serious barriers to further human development," says lead author Professor David Griggs from Monash University in Australia. Humans are transforming Earth's life support system -- the atmosphere, oceans, waterways, forests, ice sheets and biodiversity that allow us to thrive and prosper -- in ways "likely to undermine development gains," he added.

Co-author Professor Johan Rockström, director of the Stockholm Resilience Centre said, "Mounting research shows we are now at the point that the stable functioning of Earth systems is a prerequisite for a thriving global society and future development."

The team asserts that the classic model of sustainable development, of three integrated pillars -- economic, social and environmental -- that has served nations and the UN for over a decade, is flawed because it does not reflect reality. "As the global population increases towards nine billion people, sustainable development should be seen as an economy serving society within Earth's life support system, not as three pillars," says co-author Dr. Priya Shyamsundar from the South Asian Network for Development and Environmental Economics, Nepal.

The researchers say that the Millennium Development Goals (MDGs), set to expire in 2015, have helped focus international efforts on eight poverty-related goals. However, despite successes in some areas -- the number of people living on less than one dollar a day has been more than halved -- many MDGs have not been met, and some remain in conflict with one another. Economic gains, for example, have come at the expense of environmental protection. Politicians are struggling to link global environmental concerns with addressing poverty.

The new set of goals -- thriving lives and livelihoods, food security, water security, clean energy, healthy and productive ecosystems, and governance for sustainable societies -- aims to resolve this conflict. The targets beneath each goal include updated and expanded targets from the MDGs, including ending poverty and hunger, combating HIV/AIDS, and improving maternal and child health. But they also define a set of planetary "must haves": climate stability, the reduction of biodiversity loss, protection of ecosystem services, a healthy water cycle and oceans, sustainable nitrogen and phosphorus use, clean air and sustainable material use.

Co-author Dr. Mark Stafford Smith, science director of CSIRO's climate adaptation research programme in Australia said: "The key point is that the SDGs must genuinely add up to sustainability. The SDGs have the potential to lock in the spectacular gains on human development that we have achieved in the past two decades and help the globe transition to a sustainable lifestyle. But the link between these two aims must be more coherent."

The new research is linked to Future Earth, a new international research programme designed to "develop the knowledge required for societies worldwide to face challenges posed by global environmental change and to identify opportunities for a transition to global sustainability." Several authors are closely involved in developing this new research programme.

"Ultimately, the choice of goals is a political decision. But science can inform what combination of goals can achieve a sustainable future. And science can identify measurable targets and indicators," said Dr Stafford Smith.

Story Source:
The above story is reprinted from materials provided by International Council for Science, via AlphaGalileo.

Natural climate swings contribute more to increased monsoon rainfall than global warming

Mar. 20, 2013 — Natural swings in the climate have significantly intensified Northern Hemisphere monsoon rainfall, showing that these swings must be taken into account for climate predictions in the coming decades, a new study finds.

The findings are published in the March 18 online publication of the Proceedings of the National Academy of Sciences.

Monsoon rainfall in the Northern Hemisphere impacts about 60% of the world population, in Southeast Asia, West Africa and North America. Given the possible impacts of global warming, solid predictions of monsoon rainfall for the coming decades are important for infrastructure planning and sustainable economic development. Such predictions, however, are very complex, because they require not only pinning down how human-made greenhouse gas emissions will impact the monsoons and monsoon rainfall, but also knowledge of natural long-term climate swings, about which little is known so far.

To tackle this problem, an international team of scientists led by Meteorology Professor Bin Wang at the International Pacific Research Center, University of Hawaii at Manoa, examined climate data to see what happened in the Northern Hemisphere during the last three decades, a time during which the global-mean surface-air temperature rose by about 0.4°C. Current theory predicts that the Northern Hemisphere summer monsoon circulation should weaken under anthropogenic global warming.

Wang and his colleagues, however, found that over the past 30 years the summer monsoon circulation, as well as the Hadley and Walker circulations, have all substantially intensified. This intensification has resulted in significantly greater global summer monsoon rainfall in the Northern Hemisphere than predicted from greenhouse-gas-induced warming alone: namely a 9.5% increase, compared with the predicted anthropogenic contribution of 2.6% per degree of global warming.
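One way to read these percentages, purely as an illustrative back-of-envelope split (not the paper's attribution method), is to compare the observed increase with what 2.6% per degree would predict for the roughly 0.4°C of warming over the period:

```python
# Illustrative split of the observed monsoon rainfall increase into the part
# expected from warming alone versus the remainder, using the quoted figures.
observed_increase_pct = 9.5     # observed NH monsoon rainfall increase
per_degree_pct = 2.6            # predicted increase per degree of warming
warming_degC = 0.4              # global-mean warming over the ~30-year period

anthropogenic_pct = per_degree_pct * warming_degC        # ~1 percentage point
natural_pct = observed_increase_pct - anthropogenic_pct  # the rest

print(f"warming alone explains ~{anthropogenic_pct:.1f} points; "
      f"natural swings account for ~{natural_pct:.1f} points of the 9.5%")
```

On this crude reading, warming alone accounts for only about a tenth of the observed intensification, which is why the authors emphasize natural swings such as the Interdecadal Pacific Oscillation.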

Most of the recent intensification is attributable to a cooling of the eastern Pacific that began in 1998. This cooling is the result of natural long-term swings in ocean surface temperatures, particularly swings in the Interdecadal Pacific Oscillation or mega-El Niño-Southern Oscillation, which has lately been in a mega-La Niña or cool phase. Another natural climate swing, called the Atlantic Multidecadal Oscillation, also contributes to the intensification of monsoon rainfall.

"These natural swings in the climate system must be understood in order to make realistic predictions of monsoon rainfall and of other climate features in the coming decades," says Wang. "We must be able to determine the relative contributions of greenhouse-gas emissions and of long-term natural swings to future climate change."

Story Source:
The above story is reprinted from materials provided by University of Hawaii - SOEST, via EurekAlert!, a service of AAAS.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Suggestions for a middle ground between unlogged forest and intensively managed lands


Mar. 18, 2013 — It is increasingly recognized that protected areas alone are not sufficient for successful biodiversity conservation, and that the management of production areas (e.g. forestry and agricultural land) plays a crucial role in this respect. Retention forestry and agroforestry are two land management systems that aim to reconcile the production of human goods with biodiversity conservation.

The retention forestry model is, as the name suggests, based on retaining some of the local forest structures when harvesting trees in an attempt to preserve local biodiversity. Agroforestry addresses the same need through the intentional management of shade trees alongside agricultural crops. Despite the technical differences, both systems provide an intermediary between unlogged forest and intensively managed land. A paper recently published in the open-access journal Nature Conservation draws an important parallel between the two systems.

From a conservation point of view, both retention forestry and agroforestry are expected to provide a variety of ecological benefits, such as the maintenance and restoration of ecosystem heterogeneity. They also provide habitat for tree-dependent species outside the forest as well as increased connectivity for forest species within landscapes. Moreover, both systems minimize some of the off-site impacts of management. In spite of some inherent differences between the two systems, the large number of similarities suggests that both would benefit from a bridging of scientific and practical experiences.

The author team, led by Dr. Roberge from the Department of Wildlife, Fish and Environmental Studies at the Swedish University of Agricultural Sciences (SLU), calls for studies addressing the cost-effectiveness of different retention and agroforestry systems in relation to biodiversity conservation, argues for a stronger focus on the two systems' effects on species of special conservation concern, and encourages increased collaboration between researchers and practitioners across the two fields.



Story Source:
The above story is reprinted from materials provided by Pensoft Publishers. The original story is licensed under a Creative Commons License.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Ten times more hurricane surges in future, new research predicts


Mar. 18, 2013 — By examining the frequency of extreme storm surges in the past, previous research has shown an increasing tendency toward extreme hurricane storm surges when the climate was warmer. But how much worse will it get as temperatures rise in the future? How many extreme storm surges like the one from Hurricane Katrina, which hit the U.S. coast in 2005, will there be as a result of global warming? New research from the Niels Bohr Institute shows that there will be a tenfold increase in frequency if the climate becomes two degrees Celsius warmer.

The results are published in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

Tropical cyclones arise over warm ocean surfaces with strong evaporation and warming of the air. They typically form in the Atlantic Ocean and move towards the U.S. East Coast and the Gulf of Mexico. To estimate the frequency of tropical cyclones in a warmer future climate, researchers have developed various models. One is based on regional sea temperatures, while another is based on differences between regional sea temperatures and the average temperatures in the tropical oceans. There is considerable disagreement among researchers about which is best.

New model for predicting cyclones

"Instead of choosing between the two methods, I have chosen to use temperatures from all around the world and combine them into a single model," explains climate scientist Aslak Grinsted, Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

He takes into account the individual statistical models and weights them according to how well they explain past storm surges. In this way, he verifies that the combined model reflects the known physical relationships -- for example, how the El Niño phenomenon affects the formation of cyclones. The research was performed in collaboration with colleagues from China and England.

The statistical models are used to predict the number of hurricane surges 100 years into the future. How much worse will it be per degree of global warming? How many 'Katrinas' will there be per decade?

Since 1923, there has been a 'Katrina'-magnitude storm surge on average every 20 years.

10 times as many 'Katrinas'

"We find that 0.4 degrees Celsius warming of the climate corresponds to a doubling of the frequency of extreme storm surges like the one following Hurricane Katrina. With the global warming we have had during the 20th century, we have already crossed the threshold where more than half of all 'Katrinas' are due to global warming," explains Aslak Grinsted.

"If the temperature rises an additional degree, the frequency will increase by a factor of 3-4, and if the global climate becomes two degrees warmer, there will be about 10 times as many extreme storm surges. This means that there will be a 'Katrina'-magnitude storm surge every other year," says Aslak Grinsted. He points out that in addition to there being more extreme storm surges, sea level will also rise due to global warming. As a result, the storm surges will become worse and potentially more destructive.
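The frequency multipliers quoted above translate directly into return periods, given the stated baseline of one 'Katrina'-magnitude surge per 20 years. The sketch below simply applies the article's own factors (the 3.5 midpoint for the "+1°C" case is our reading of "3-4 times"), without re-deriving the underlying statistics:

```python
# Convert the article's frequency multipliers into return periods,
# assuming the stated baseline of one 'Katrina'-magnitude surge per 20 years.
baseline_return_period_years = 20.0

# Multipliers quoted in the article; 3.5 is an assumed midpoint of "3-4 times"
multipliers = {"+0.4 C": 2.0, "+1 C": 3.5, "+2 C": 10.0}

for scenario, factor in multipliers.items():
    period = baseline_return_period_years / factor
    print(f"{scenario}: one surge every {period:.1f} years")
```

The "+2 C" case gives one surge every 2 years, matching the "every other year" figure in the quote.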


Story Source:
The above story is reprinted from materials provided by University of Copenhagen, via EurekAlert!, a service of AAAS.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Petroleum use, greenhouse gas emissions of automobiles could drop 80 percent by 2050: U.S. report


Mar. 18, 2013 — A new National Research Council report finds that by the year 2050, the U.S. may be able to reduce petroleum consumption and greenhouse gas emissions by 80 percent for light-duty vehicles -- cars and small trucks -- via a combination of more efficient vehicles; the use of alternative fuels like biofuels, electricity, and hydrogen; and strong government policies to overcome high costs and influence consumer choices. While achieving these goals will be difficult, improving technologies driven by strong and effective policies could make deep reductions possible.

"To reach the 2050 goals for reducing petroleum use and greenhouse gases, vehicles must become dramatically more efficient, regardless of how they are powered," said Douglas M. Chapin, principal of MPR Associates and chair of the committee that wrote the report. "In addition, alternative fuels to petroleum must be readily available, cost-effective and produced with low emissions of greenhouse gases. Such a transition will be costly and require several decades. The committee's model calculations, while exploratory and highly uncertain, indicate that the benefits of making the transition -- i.e., energy cost savings, improved vehicle technologies, and reductions in petroleum use and greenhouse gas emissions -- exceed the additional costs of the transition over and above what the market is willing to do voluntarily."

Improving the efficiency of conventional vehicles is, up to a point, the most economical and easiest-to-implement approach to saving fuel and lowering emissions, the report says. This approach includes reducing the work the engine must perform -- reducing vehicle weight, aerodynamic resistance, rolling resistance, and accessories -- plus improving the efficiency of the internal combustion engine powertrain.

Improved efficiency alone will not meet the 2050 goals, however. The average fuel economy of vehicles on the road would have to exceed 180 mpg, which, the report says, is extremely unlikely with current technologies. Therefore, the study committee also considered other alternatives for vehicles and fuels, including:

· hybrid electric vehicles, such as the Toyota Prius;

· plug-in hybrid electric vehicles, such as the Chevrolet Volt;

· battery electric vehicles, such as the Nissan Leaf;

· hydrogen fuel cell electric vehicles, such as the Mercedes F-Cell, scheduled to be introduced about 2014; and

· compressed natural gas vehicles, such as the Honda Civic Natural Gas.
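The 180 mpg figure above can be sanity-checked with simple arithmetic: an 80 percent cut in fuel use at constant miles driven requires a fivefold jump in fuel economy. The ~36 mpg baseline below is an assumption for illustration, not a number from the report:

```python
# Why efficiency alone struggles: an 80% cut in fuel per mile means 5x the mpg.
reduction_goal = 0.80     # target fraction of petroleum use eliminated
baseline_mpg = 36.0       # assumed on-road fleet average (illustrative only)

# Fuel used per mile scales as 1/mpg, so required mpg = baseline / (1 - reduction)
required_mpg = baseline_mpg / (1.0 - reduction_goal)
print(f"Required fleet average: {required_mpg:.0f} mpg")
```

With that assumed baseline, the required fleet average comes out at 180 mpg, consistent with the report's conclusion that efficiency gains must be combined with alternative fuels.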

Although driving costs per mile will be lower, especially for vehicles powered by natural gas or electricity, the high initial purchase cost is likely to be a significant barrier to widespread consumer acceptance, the report says. All the vehicles considered are and will continue to be several thousand dollars more expensive than today's conventional vehicles. Additionally, particularly in the early years, the report predicts that alternative vehicles will likely be limited to a few body styles and sizes; some will rely on fuels that are not readily available or have restricted travel range; and others may require bulky energy storage that will limit their cargo and passenger capacity. Wide consumer acceptance is essential, however, and large numbers of alternative vehicles must be purchased long before 2050 if the on-road fleet is to meet desired performance goals. Strong policies and technology advances are critical in overcoming this challenge.

The report identified several scenarios that could meet the more demanding 2050 greenhouse gas goal. Each combines highly efficient vehicles with at least one of three alternative power sources -- biofuel, electricity, or hydrogen. Natural gas vehicles were considered, but their greenhouse gas emissions are too high for the 2050 goal. However, if the costs of these vehicles can be reduced and appropriate refueling infrastructure created, they have great potential for reducing petroleum consumption.

While corn-grain ethanol and biodiesel are the only biofuels to have been produced in commercial quantities in the U.S. to date, the study committee found much greater potential in biofuels made from lignocellulosic biomass -- which includes crop residues like wheat straw, as well as switchgrass, whole trees, and wood waste. This "drop-in" fuel is designed to be a direct replacement for gasoline and could lead to large reductions in both petroleum use and greenhouse gas emissions; it can also be introduced without major changes in fuel delivery infrastructure or vehicles. The report finds that sufficient lignocellulosic biomass could be produced by 2050 to meet the goal of an 80 percent reduction in petroleum use when combined with highly efficient vehicles.

Vehicles powered by electricity emit no greenhouse gases while driving, but the production of that electricity and the additional load on the electric power grid are factors that must be considered. To the extent that fossil resources are used to generate electricity, the report says that the successful implementation of carbon capture and storage will be essential. These vehicles also rely on batteries, which are projected to drop steeply in price. However, the report says that limited range and long recharge times are likely to limit the use of all-electric vehicles mainly to local driving. Advanced battery technologies under development all face serious technical challenges.

When hydrogen is used as a fuel in fuel cell electric vehicles, the only vehicle emission is water. However, varying amounts of greenhouse gases are emitted during hydrogen production, and the low-greenhouse-gas methods of making hydrogen are more expensive and will need further development to become competitive. Hydrogen fuel cell vehicles could become less costly than the advanced internal combustion engine vehicles of 2050. Fuel cell vehicles are not subject to the limitations of battery vehicles, but developing a hydrogen infrastructure in concert with a growing number of fuel cell vehicles will be difficult and expensive, the report says.

The technology advances required to meet the 2050 goals are challenging and not assured. Nevertheless, the committee considers that dramatic cost reductions and overall performance enhancements are possible without unpredictable technology breakthroughs. Achieving these goals requires that improved technology focus on reducing fuel use rather than adding greater power or weight, the report says.

It is impossible to know which technologies will ultimately succeed, the report says, because all involve uncertainty. The best approach, therefore, is to promote a portfolio of vehicle and fuel research and development, supported by both government and industry, designed to solve the critical challenges in each major candidate technology. Such primary research efforts need continuing evaluation of progress against performance goals to determine which technologies, fuels, designs, and production methods are emerging as the most promising and cost-effective.

Overcoming the barriers to advanced vehicles and fuels will require a rigorous policy framework that is more stringent than the proposed fuel economy standards for 2025. This policy intervention could include high and increasing fuel economy standards, R&D support, subsidies, and public information programs aimed at improving consumers' familiarity with the new fuels and powertrains. Because of the high level of uncertainty in the pace and scale of technology advances, this framework should be modified as technologies develop and as conditions change.

It is essential that policies promoting particular technologies to the public not be introduced before these new fuels and vehicle technologies are close to market readiness and consumer behavior toward them is well understood. The report warns that forcing a technology into the market should be undertaken only when the benefits of the proposed support justify its costs.

Report: http://www.nap.edu/catalog.php?record_id=18264


Story Source:
The above story is reprinted from materials provided by National Academy of Sciences.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.