
Top weather conditions that amplify Lake Erie algal blooms revealed

Of the many weather-related factors that contribute to harmful algal blooms (HABs) in Lake Erie, a new study has identified one as most important: the wind.

Over a 10-year period in Lake Erie, wind speed contributed more consistently to HABs than sunshine or even precipitation, researchers at The Ohio State University and their colleagues found.

The ongoing study is unusual in that the researchers are building the first detailed analyses of how the various environmental factors influence one another, in the context of satellite studies of Lake Erie.

They presented their early results at the American Geophysical Union meeting on Dec. 17.

To C.K. Shum, Distinguished University Scholar and professor of geodetic science at Ohio State, the finding "underscores the need for environmental agencies to incorporate the threat of extreme weather events caused by climate change into future algae mitigation strategies."

Where other studies have linked weather phenomena to HABs, this study goes a step further to look at how environmental drivers impact each other, and "ranks" them by their relative importance in promoting HABs, said Song Liang, formerly of Ohio State and now an associate professor of environmental and global health at the University of Florida.

"What surprised us the most was how the impact of nonweather factors, such as nitrogen and phosphorus pollution, varied strongly by season, while weather factors remained consistently important throughout the year," he said.

Researchers have long known that high nitrogen and phosphorus levels are the actual causes of HABs, which choke freshwater ecosystems and render the water toxic. But when it comes to the various environmental factors that can amplify the amount of these nutrients in the water, or aid or hamper the spread of algae, the relationships are much more complex.

"One of the objectives of this project is investigating historical patterns of harmful algal blooms and their linkage to water quality and environmental factors," explained project leader Jiyoung Lee, associate professor of environmental health sciences at Ohio State. "By doing this, we can better understand and predict the future of HABs and water safety in the Lake Erie community with the impact of changing climate and environmental factors."

Liang and his group analyzed nine environmental factors, including solar radiation, wind speed, precipitation, nitrogen concentration, water temperature and water quality in Lake Erie from 2002 to 2012. Then the larger research team used data from the MEdium Resolution Imaging Spectrometer (MERIS) sensor onboard the European Space Agency's Envisat satellite to examine how the color of the lake water changed during those years -- an indication of the concentration of the toxic blue-green algae present in HABs.
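The article does not describe the team's statistical machinery, so the sketch below is only a generic stand-in for the kind of driver "ranking" it reports: a random-forest variable-importance ranking on synthetic monthly data. The driver names and the bloom index are hypothetical, not the study's actual variables.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 132  # e.g., monthly observations over 2002-2012
drivers = ["wind_speed", "solar_radiation", "precipitation",
           "nitrogen", "phosphorus", "water_temperature"]
X = pd.DataFrame(rng.normal(size=(n, len(drivers))), columns=drivers)

# Synthetic bloom index in which low wind matters most, mimicking the finding.
y = -1.5 * X["wind_speed"] + 0.5 * X["solar_radiation"] \
    + rng.normal(scale=0.5, size=n)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Rank drivers by how much each one improves the model's predictions.
for name, score in sorted(zip(drivers, model.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name:18s} {score:.3f}")
```

On data built this way, wind speed lands at the top of the ranking; with the real satellite-derived bloom record, the relative importances are an empirical result rather than a modeling choice.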

The researchers examined the environmental drivers by season, and found that wind speed affected the spread of algal blooms consistently throughout spring, summer and fall. Seasons of low winds led to larger blooms. That's because when wind speed is low, lake water is more still, and algae can more easily float to the top and form thick mats that spread along the lake surface.

Sunlight, meanwhile, was important in the spring and summer as a source of energy for the algae. Precipitation was very important in the summer and the winter, when rains and melting snow boosted runoff and delivered nitrogen and phosphorus, which algae use as food sources, to the lake.

As the project continues, the researchers hope to get a better understanding of how the variables relate to each other, and explore the notion of weather and climate as factors in a kind of "early warning system" for HABs.



Giant atmospheric rivers add mass to Antarctica's ice sheet

Extreme weather phenomena called atmospheric rivers were behind intense snowstorms recorded in 2009 and 2011 in East Antarctica. The resulting snow accumulation partly offset recent ice loss from the Antarctic ice sheet, report researchers from KU Leuven.

Atmospheric rivers are long, narrow water vapour plumes stretching thousands of kilometres across the sky over vast ocean areas. They are capable of rapidly transporting large amounts of moisture around the globe and can cause devastating precipitation when they hit coastal areas.

Although atmospheric rivers are notorious for their flood-inducing impact in Europe and the Americas, their importance for Earth's polar climate -- and for global sea levels -- is only now coming to light.

In this study, an international team of researchers led by Irina Gorodetskaya of KU Leuven's Regional Climate Studies research group used a combination of advanced modelling techniques and data collected at Belgium's Princess Elisabeth polar research station in East Antarctica's Dronning Maud Land to produce the first ever in-depth look at how atmospheric rivers affect precipitation in Antarctica.

The researchers studied two particular instances of heavy snowfall in the East Antarctic region in detail, one in May 2009 and another in February 2011, and found that both were caused by atmospheric rivers slamming into the East Antarctic coast.

The Princess Elisabeth polar research station recorded snow accumulation equivalent to up to 5 centimetres of water for each of these weather events, accounting for 22 per cent of the total annual snow accumulation in those years.

The findings point to atmospheric rivers' impressive snow-producing power. "When we looked at all the extreme weather events that took place during 2009 and 2011, we found that the nine atmospheric rivers that hit East Antarctica in those years accounted for 80 per cent of the exceptional snow accumulation at Princess Elisabeth station," says Irina Gorodetskaya.

And this can have important consequences for Antarctica's diminishing ice sheet. "There is a need to understand how the flow of ice within Antarctica's ice sheet responds to warming, and to gain insight into atmospheric processes, cloud formation and snowfall," adds Nicole Van Lipzig, co-author of the study and professor of geography at KU Leuven.

A separate study found that the Antarctic ice sheet has lost substantial mass in the last two decades -- at an average rate of about 68 gigatons per year during the period 1992-2011.

"The unusually high snow accumulation in Dronning Maud Land in 2009 that we attributed to atmospheric rivers added around 200 gigatons of mass to Antarctica, which alone offset 15 per cent of the recent 20-year ice sheet mass loss," says Irina Gorodetskaya.

"This study represents a significant advance in our understanding of how the global water cycle is affected by atmospheric rivers. It is the first to look at the effect of atmospheric rivers on Antarctica and to explore their role in cryospheric processes of importance to the global sea level in a changing climate," says Martin Ralph, contributor to the study and Director of the Center for Western Weather and Water Extremes at the University of California, San Diego.

"Moving forward, we aim to explore the impact of atmospheric rivers on precipitation in all Antarctic coastal areas using data records covering the longest possible time period. We want to determine exactly how this phenomenon fits into climate models," says Irina Gorodetskaya.

"Our results should not be misinterpreted as evidence that the impacts of global warming will be small or reversed due to compensating effects. On the contrary, they confirm the potential of Earth's warming climate to manifest itself in anomalous regional responses. Thus, our understanding of climate change and its worldwide impact will strongly depend on climate models' ability to capture extreme weather events, such as atmospheric rivers and the resulting anomalies in precipitation and temperature," she concludes.



Atmospheric rivers, cloud-creating aerosol particles, and California reservoirs

In the midst of the California rainy season, scientists are embarking on a field campaign designed to improve the understanding of the natural and human-caused phenomena that determine when and how the state gets its precipitation. They will do so by studying atmospheric rivers, meteorological events that include the famous rainmaker known as the Pineapple Express.

CalWater 2015 is an interagency, interdisciplinary field campaign starting January 14, 2015. CalWater 2015 will entail four research aircraft flying through major storms while a ship outfitted with additional instruments cruises below. The research team includes scientists from Scripps Institution of Oceanography at UC San Diego, the Department of Energy's Pacific Northwest National Laboratory, NOAA, and NASA and uses resources from the DOE's Atmospheric Radiation Measurement (ARM) Climate Research Facility -- a national scientific user facility.

The study will help provide a better understanding of how California gets its rain and snow, how human activities are influencing precipitation, and how the new science provides potential to inform water management decisions relating to drought and flood.

"After several years in the making by an interdisciplinary science team, and through support from multiple agencies, the CalWater 2015 field campaign is set to observe the key conditions offshore and over California like has never been possible before," said Scripps climate researcher Marty Ralph, a CalWater lead investigator. "These data will ultimately help develop better climate projections for water and will help test the potential of using existing reservoirs in new ways based on atmospheric river forecasts."

Like land-based rivers, atmospheric rivers carry massive amounts of moisture long distances -- in California's case, from the tropics to the U.S. West Coast. When an atmospheric river hits the coast, it releases its moisture as precipitation. How much and whether it falls as rain or snow depends on aerosols -- tiny particles made of dust, sea salt, volatile molecules, and pollution.

The researchers will examine the strength of atmospheric rivers, which produce up to 50 percent of California's precipitation and can transport 10-20 times the flow of the Mississippi River. They will also explore how to predict when and where atmospheric rivers will hit land, as well as the role of ocean evaporation and how the ocean changes after a river passes.

"Climate and weather models have a hard time getting precipitation right," said Ralph. "In fact, the big precipitation events that are so important for water supply and can cause flooding, mostly due to atmospheric rivers, are some of the most difficult to predict with useful accuracy. The severe California drought is essentially a result of a dearth of atmospheric rivers, while, conversely, the risk of Katrina-like damages for California due to severe ARs has also been quantified in previous research."

For the next month or more, instrument teams will gather data from the NOAA research vessel Ronald H. Brown and from four research aircraft -- two from NOAA, one from DOE and one from NASA -- deployed in a coordinated strategy whenever weather forecasters see atmospheric rivers developing in the Pacific Ocean off the coast of California. NASA will also provide remote sensing data for the project.

"Improving our understanding of atmospheric rivers will help us produce better forecasts of where they will hit and when, and how much rain and snow they will deliver," said Allen White, NOAA research meteorologist and CalWater 2015 mission scientist. "Better forecasts will give communities the environmental intelligence needed to respond to droughts and floods."

Most research flights will originate at McClellan Airfield in Sacramento. Ground-based instruments in Bodega Bay, Calif., and scattered throughout the state will also collect data on natural and human contributions to the atmosphere such as dust and pollution. This data-gathering campaign follows the 2009-2011 CalWater1 field campaign, which yielded new insights into how precipitation processes in the Sierra Nevada can be influenced by different sources of aerosols that seed the clouds.

"This will be an extremely important study in advancing our overall understanding of aerosol impacts on clouds and precipitation," said Kimberly Prather, a CalWater lead investigator and Distinguished Chair in Atmospheric Chemistry with appointments at Scripps Oceanography and the Department of Chemistry and Biochemistry at UC San Diego. "It will build upon findings from CalWater1, adding multiple aircraft to directly probe how aerosols from different sources, local, ocean, as well as those from other continents, are influencing clouds and precipitation processes over California."

"We are collecting this data to improve computer models of rain that represent many complex processes and their interactions with the environment," said PNNL's Leung. "Atmospheric rivers contribute most of the heavy rains along the coast and mountains in the West. We want to capture those events better in our climate models used to project changes in extreme events in the future."

Prather's group showed during CalWater1 that aerosols can have competing effects, depending on their source. Intercontinental mineral dust and biological particles possibly from the ocean corresponded to events with more precipitation, while aerosols produced by local air pollution correlated with less precipitation.

The CalWater 2015 campaign comprises two interdependent efforts. Major investments in facilities include aircraft, ship time, and sensors by NOAA. Marty Ralph, Kim Prather, and Dan Cayan from Scripps, and Chris Fairall, Ryan Spackman, and Allen White of NOAA lead CalWater-2. The DOE-funded ARM Cloud Aerosol Precipitation Experiment (ACAPEX) is led by Ruby Leung from PNNL. NSF and NASA have also provided major support for aspects of CalWater, leveraging the NOAA and DOE investments.



Sudden jump in a storm's lightning might warn a supercell is forming

A sudden jump in the number of lightning strikes inside a garden-variety thunderstorm might soon give forecasters a new tool for predicting severe weather and issuing timely warnings, according to research at The University of Alabama in Huntsville (UAH).

The sudden increase in lightning is one sign a normal storm is rapidly evolving into a supercell, with a large rotating updraft -- or mesocyclone -- at its heart.

"Supercells are more prone to produce severe weather events, including damaging straight line winds and large hail," said Sarah Stough, a UAH graduate student in atmospheric science. "Supercells also produce the strongest and most deadly tornadoes."

Early results from Stough's research were presented Jan. 7 in Phoenix at the American Meteorological Society's annual meeting.

"Roughly 90 percent of mesocyclones are related to severe weather of some kind, while only 25 percent are associated with tornadoes," Stough said.

Because the sudden increase in lightning strikes is either concurrent with -- or within minutes of -- a supercell forming, UAH researchers are developing algorithms that might be used by forecasters to issue timely severe weather warnings.

"Basically, we keep a 10-minute running average of the number of lightning flashes in a cell," Stough said. "Then, if the flash rate suddenly jumps to at least twice the standard deviation of that running average, there is a high probability the updraft in that cell has strengthened, a supercell is forming and severe weather is more likely with that storm."

"We can use the lightning jump as a nowcasting tool for supercells if the jump is set in the context of that storm's environmental data," said Dr. Larry Carey, a UAH associate professor in atmospheric science. "If the meteorology of the day suggests supercells are likely, the jump can tell us when and where that is happening. Early warning of supercells -- especially the first of a severe weather day -- is an important forecasting challenge."

The lightning jump has been tested as a forecast tool by National Weather Service forecasters in Huntsville, Ala., and at NWS testing facilities in Norman, Okla.

"I know a lot of forecasters are excited about having this information," Stough said.

While the ongoing research uses ground-based lightning detection networks, the UAH team is also working on being able to use lightning counts reported by the Geostationary Lightning Mapper aboard the GOES-R geostationary weather satellite, which is scheduled to launch in 2016.

"The lightning jump is getting in front of forecasters now so we can get feedback, and fit the lightning jump concept into their forecasting methods," said Chris Schultz, an atmospheric science graduate student at UAH and an intern at NASA's Marshall Space Flight Center. "This way, when the real-time data from GLM is available and the lightning jump is implemented, it will immediately fit into the forecasters' warning operations."



Hurricane Sandy increased incidence of heart attacks, stroke in hardest-hit New Jersey counties

Heart attacks and strokes are more likely to occur during extreme weather and natural disasters such as earthquakes and floods. Researchers at the Cardiovascular Institute of New Jersey at Rutgers Robert Wood Johnson Medical School have found evidence that Hurricane Sandy, commonly referred to as a superstorm, had a significant effect on cardiovascular events, including myocardial infarction (heart attack) and stroke, in the high-impact areas of New Jersey two weeks following the 2012 storm. The study, led by Joel N. Swerdel, MS, MPH, an epidemiologist at the Cardiovascular Institute and the Rutgers School of Public Health, was published in the Journal of the American Heart Association.

Utilizing the Myocardial Infarction Data Acquisition System (MIDAS), the researchers examined changes in the incidence of and mortality from myocardial infarctions and strokes from 2007 to 2012 for two weeks prior to and two weeks after October 29, the date of Hurricane Sandy. MIDAS is an administrative database containing hospital records of all patients discharged from non-federal hospitals in New Jersey with a cardiovascular disease diagnosis or invasive cardiovascular procedure.

In the two weeks following Hurricane Sandy, the researchers found that in the eight counties determined to be high-impact areas, there was a 22 percent increase in heart attacks as compared with the same time period in the previous five years. In the low-impact areas (the remaining 13 counties), the increase was less than one percent. Thirty-day mortality from heart attacks also increased, by 31 percent, in the high-impact areas.

"We estimate that there were 69 more deaths from myocardial infarction during the two weeks following Sandy than would have been expected. This is a significant increase over typical non-emergency periods," said Swerdel. "Our hope is that the research may be used by the medical community, particularly emergency medical services, to prepare for the change in volume and severity of health incidents during extreme weather events."

In regard to stroke, the investigators found a 7 percent increase in the hardest-hit areas of the state compared with the same time period in the prior five years. There was no change in the incidence of stroke in low-impact areas. There also was no change in the rate of 30-day mortality due to stroke in either the high- or low-impact areas.

"Hurricane Sandy had unprecedented environmental, financial and health consequences on New Jersey and its residents, all factors that can increase the risk of cardiovascular events," said John B. Kostis, MD, director of the Cardiovascular Institute of New Jersey and associate dean for cardiovascular research at Rutgers Robert Wood Johnson Medical School. "Increased stress and physical activity, dehydration and a decreased attention or ability to manage one's own medical needs probably caused cardiovascular events during natural disasters or extreme weather. Also, the disruption of communication services, power outages, gas shortages, and road closures, also were contributing factors to efficiently obtaining medical care."

Journal Reference:

J. N. Swerdel, T. M. Janevic, N. M. Cosgrove, J. B. Kostis. The Effect of Hurricane Sandy on Cardiovascular Events in New Jersey. Journal of the American Heart Association, 2014; 3 (6): e001354 DOI: 10.1161/JAHA.114.001354


Average temperature in Finland has risen by more than two degrees

Over the past 166 years, the average temperature in Finland has risen by more than two degrees. During the observation period, the average increase was 0.14 degrees per decade, which is nearly twice as much as the global average.

According to a recent University of Eastern Finland and Finnish Meteorological Institute study, the rise in the temperature has been especially fast over the past 40 years, with the temperature rising by more than 0.2 degrees per decade. "The biggest temperature rise has coincided with November, December and January. Temperatures have also risen faster than the annual average in the spring months, i.e., March, April and May. In the summer months, however, the temperature rise has not been as significant," says Professor Ari Laaksonen of the University of Eastern Finland and the Finnish Meteorological Institute. As a result of the rising temperature, lakes in Finland freeze over later than before, and their ice cover melts away earlier in the spring. And although the temperature rise during the actual growing season has been moderate, Finnish trees have been observed to blossom earlier than before.

Temperature has risen in leaps

The annual average temperature has risen in two phases, the first being from the beginning of the observation period to the late 1930s, and the second from the late 1960s to the present. Since the 1960s, the temperature has risen faster than ever before, with the rise varying between 0.2 and 0.4 degrees per decade. Between the late 1930s and late 1960s, the temperature remained nearly steady. "The pause in the temperature rise can be explained by several factors, including long-term changes in solar activity and the post-World War II growth of human-derived aerosols in the atmosphere. When looking at recent years' observations from Finland, it seems that the temperature rise is not slowing down," University of Eastern Finland researcher Santtu Mikkonen explains.

The temperature time series was created by averaging the data produced by all Finnish weather stations across the country. Furthermore, as the Finnish weather station network did not cover the whole country in the early years, data obtained from measurement stations in Finland's neighbouring countries was also used.

Finland is located between the Atlantic Ocean and continental Eurasia, which causes great variability in the country's weather. In the time series of the average temperature, this is visible in the form of strong noise, which makes it very challenging to detect statistically significant trends. The temperature time series for Finland was therefore analysed using a dynamic regression model. The method divides the time series into components representing mean changes, i.e. trends, periodic variation, inter-dependence between observations, and noise. It makes it possible to take into consideration the seasonal changes typical of Nordic conditions, as well as significant annual variation.
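The study's exact model specification is not given here, but the kind of decomposition it describes -- trend, periodic variation, inter-dependence between observations, and noise -- can be sketched with a structural time series model. The example below is a stand-in on synthetic monthly data using statsmodels; the component choices are assumptions, not the study's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = pd.date_range("1847-01", "2013-12", freq="MS")
t = np.arange(len(months))

# Synthetic stand-in for station-averaged Finnish temperatures:
# ~0.14 C/decade trend + seasonal cycle + weather noise.
series = pd.Series((0.14 / 120) * t
                   + 12 * np.sin(2 * np.pi * t / 12)
                   + rng.normal(scale=3.0, size=len(t)), index=months)

# Local linear trend + monthly seasonality + AR(1) serial dependence.
model = sm.tsa.UnobservedComponents(series, level="local linear trend",
                                    seasonal=12, autoregressive=1)
result = model.fit(disp=False)

trend = result.level["smoothed"]
decades = len(trend) / 120
print(f"recovered trend: {(trend[-1] - trend[0]) / decades:+.2f} C per decade")
```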

Journal Reference:

S. Mikkonen, M. Laine, H. M. Mäkelä, H. Gregow, H. Tuomenvirta, M. Lahtinen, A. Laaksonen. Trends in the average temperature in Finland, 1847–2013. Stochastic Environmental Research and Risk Assessment, 2014; DOI: 10.1007/s00477-014-0992-2


Even in restored forests, extreme weather strongly influences wildfire's impacts

The 2013 Rim Fire, the largest wildland fire ever recorded in the Sierra Nevada region, is still fresh in the minds of Californians, as is the urgent need to bring forests back to a more resilient condition. Land managers are using fire as a tool to mimic past fire conditions, restore fire-dependent forests, and reduce fuels in an effort to lessen the potential for large, high-intensity fires, like the Rim Fire. A study led by the U.S. Forest Service's Pacific Southwest Research Station (PSW) and recently published in the journal Forest Ecology and Management examined how the Rim Fire burned through forests with restored fire regimes in Yosemite National Park to determine whether they were as resistant to high-severity fire as many scientists and land managers expected.

Since the late 1960s, land managers in Yosemite National Park have used prescribed fire and let lower intensity wildland fires burn in an attempt to bring back historical fire regimes after decades of fire suppression. For this study, researchers seized a unique opportunity to study data on forest structure and fuels collected in 2009 and 2010 in Yosemite's old-growth, mixed-conifer forests that had previously burned at low to moderate severity. Using post-Rim Fire data and imagery, researchers found that areas burned on days the Rim Fire was dominated by a large pyro-convective plume -- a powerful column of smoke, gases, ash, and other debris -- burned at moderate to high severity regardless of the number of prior fires, topography, or forest conditions.

"The specific conditions leading to large plume formation are unknown, but what is clear from many observations is that these plumes are associated with extreme burning conditions," says Jamie Lydersen, PSW biological science technician and the study's lead author. "Plumes often form when atmospheric conditions are unstable, and result in erratic fire behavior driven by its own local effect on surface wind and temperatures that override the influence of more generalized climate factors measured at nearby weather stations."

When the extreme conditions caused by these plumes subsided during the Rim Fire, other factors influenced burn severity. "There was a strong influence of elapsed time since the last burn, where forests that experienced fire within the last 14 years burned mainly at low severity in the Rim Fire. Lower elevation areas and those with greater shrub cover tended to burn at higher severity," says Lydersen.

When driven by extreme weather, which often coincides with wildfires that escape initial containment efforts, fires can severely burn large swaths of forest regardless of ownership and fire history. These fires may only be controlled if more forests across the landscape have been managed for fuel reduction to allow early stage suppression before weather- and fuels-driven fire intensity makes containment impossible. Coordination of fire management activities by land management agencies across jurisdictions could favor burning under more moderate weather conditions when wildfires start and reduce the occurrences of harmful, high-intensity fires.



When it comes to variations in crop yield, climate has a big say

What impact will future climate change have on food supply? That depends in part on the extent to which variations in crop yield are attributable to variations in climate. A new report from researchers at the University of Minnesota Institute on the Environment has found that climate variability historically accounts for one-third of yield variability for maize, rice, wheat and soybeans worldwide -- the equivalent of 36 million metric tons of food each year. This provides valuable information planners and policy makers can use to target efforts to stabilize farmer income and food supply and so boost food security in a warming world.

The work was published in the journal Nature Communications by Deepak Ray, James Gerber, Graham MacDonald and Paul West of IonE's Global Landscapes Initiative. The researchers looked at newly available production statistics for maize, rice, wheat and soybean from 13,500 political units around the world between 1979 and 2008, along with precipitation and temperature data. The team used these data to calculate year-to-year fluctuations and estimate how much of the yield variability could be attributed to climate variability.

About 32 to 39 percent of year-to-year variability for the four crops could be explained by climate variability. This is substantial -- the equivalent of 22 million metric tons of maize, 3 million metric tons of rice, 9 million metric tons of wheat, and 2 million metric tons of soybeans per year.

The links between climate and yield variability differed among regions. Climate variability explained much of yield variability in some of the most productive regions, but far less in low-yielding regions. "This means that really productive areas contribute to food security by having a bumper crop when the weather is favorable but can be hit really hard when the weather is bad and contribute disproportionately to global food insecurity," says Ray. "At the other end of the spectrum, low-yielding regions seem to be more resilient to bad-weather years but don't see big gains when the weather is ideal." Some regions, such as in parts of Asia and Africa, showed little correlation between climate variability and yield variability.

More than 60 percent of the yield variability can be explained by climate variability in regions that are important producers of major crops, including the Midwestern U.S., the North China Plains, western Europe and Japan. Depicted as global maps, the results show where and how much climate variability explains yield variability.
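The attribution step can be illustrated schematically: detrend the series, regress year-to-year yield anomalies on climate anomalies, and read the R-squared as the share of yield variability explained by climate variability. The data below are synthetic and the two climate predictors are simplifications; the study worked with precipitation and temperature records for roughly 13,500 political units over 1979-2008.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
years = 30  # one value per year, 1979-2008

temp_anom = rng.normal(size=years)
precip_anom = rng.normal(size=years)
# Synthetic yield anomalies: part climate-driven, part everything else
# (management, pests, prices, ...).
yield_anom = (0.4 * precip_anom - 0.3 * temp_anom
              + rng.normal(scale=0.6, size=years))

X = np.column_stack([temp_anom, precip_anom])
r2 = LinearRegression().fit(X, yield_anom).score(X, yield_anom)
print(f"share of yield variability explained by climate: {r2:.0%}")
```

Repeating this unit by unit and mapping the R-squared values is what produces the global picture of where climate does, and does not, drive yield swings.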

The research team is now looking at historical records to see whether the variability attributable to climate has changed over time -- and if so, what aspects of climate are most pertinent.

"Yield variability can be a big problem from both economic and food supply standpoints," Ray said. "The results of this study and our follow-up work can be used to improve food system stability around the world by identifying hot spots of food insecurity today as well as those likely to be exacerbated by climate change in the future."



Electromagnetic waves linked to particle fallout in Earth's atmosphere, new study finds

In a new study that sheds light on space weather's impact on Earth, Dartmouth researchers and their colleagues show for the first time that plasma waves buffeting the planet's radiation belts are responsible for scattering charged particles into the atmosphere.

The study is the most detailed analysis so far of the link between these waves and the fallout of electrons from the planet's radiation belts. The belts are impacted by fluctuations in "space weather" caused by solar activity that can disrupt GPS satellites, communication systems, power grids and manned space exploration.

The results appear in the journal Geophysical Research Letters.

The Dartmouth space physicists are part of a NASA-sponsored team that studies the Van Allen radiation belts, which are donut-shaped belts of charged particles held in place by Earth's magnetosphere, the magnetic field surrounding our planet. In a quest to better predict space weather, the Dartmouth researchers study the radiation belts from above and below in complementary approaches -- through satellites (the twin NASA Van Allen Probes) high over Earth and through dozens of instrument-laden balloons (BARREL, or Balloon Array for Radiation belt Relativistic Electron Losses) at lower altitudes to assess the particles that rain down.

The Van Allen Probes measure particles and electric and magnetic fields -- basically everything in the radiation belt environment -- including the electrons, which descend following Earth's magnetic field lines that converge at the poles. This is why the balloons are launched from Antarctica, where some of the best observations can be made. As the falling electrons collide with the atmosphere, they produce X-rays, and that is what the balloon instruments are actually recording.

"We are measuring those atmospheric losses and trying to understand how the particles are getting kicked into the atmosphere," says co-author Robyn Millan, an associate professor in Dartmouth's Department of Physics and Astronomy and the principal investigator of BARREL. "Our main focus has been really on the processes that are occurring out in space. Particles in the Van Allen belts never reach the ground, so they don't constitute a health threat. Even the X-rays get absorbed, which is why we have to go to balloon altitudes to see them."

In their new study, the BARREL researchers' major objective was to obtain simultaneous measurements of the scattered particles and of the ionized gas called plasma out in space near Earth's equator. They were especially interested in a particular kind of plasma wave called electromagnetic ion cyclotron waves, and in whether these waves were responsible for scattering the particles, which has been an open question for years.

The researchers obtained measurements in Antarctica in 2013 when the balloons and both the Geostationary Operational Environmental Satellite (GOES) and Van Allen Probe satellites were near the same magnetic field line. They put the satellite data into their model that tests the wave-particle interaction theory, and the results suggest the wave scattering was the cause of the particle fallout. "This is the first real quantitative test of the theory," Millan says.



In the mood to trade? Weather may influence institutional investors' stock decisions

Weather changes may affect how institutional investors decide on stock plays, according to a new study by a team of finance researchers. Their findings suggest sunny skies put professional investors more in a mood to buy, while cloudy conditions tend to discourage stock purchases.

The researchers conclude that cloudier days increase the perception that individual stocks and the Dow Jones Industrials are overpriced, increasing the inclination for institutions to sell.

The research paper, "Weather-Induced Mood, Institutional Investors, and Stock Returns," has been published in the January 2015 issue of The Review of Financial Studies. The research was conducted by Case Western Reserve University's Dasol Kim in collaboration with three other finance professors: William Goetzmann of Yale University, Alok Kumar of the University of Miami and Qin Wang of the University of Michigan-Dearborn.

Institutional investors represent large organizations, such as banks, mutual funds, labor union funds and finance or insurance companies that make substantial investments in stocks. Kim said the results of the study are surprising, given that professional investors are well regarded for their financial sophistication.

"We focus on institutional investors because of the important role they have in how stock prices are formed in the markets," said Kim, assistant professor of banking and finance at Case Western Reserve's Weatherhead School of Management. "Other studies have already shown that ordinary retail investors are susceptible to psychological biases in their investment decisions. Trying to evaluate similar questions for institutional investors is challenging, because relevant data is hard to come by."

Building on previous findings from psychological studies about the effect of sunshine on mood, the researchers wanted to learn how mood affects professional investor opinions on their stock market investments.

By linking responses to a survey of investors from the Yale Investor Behavior Project of Nobel Prize-winning economist Robert Shiller, as well as institutional stock trade data, with historical weather data from the National Oceanic and Atmospheric Administration, the researchers concluded that the aggregated data show that sunnier-than-normal weather for the season leads to optimistic responses and a willingness to buy.

The research accounts for differences in weather across regions of the country and across seasons. The researchers show that these documented mood effects also influence stock prices, and that the observed impact does not persist for long periods of time.

A summary of the research was also recently featured at The Harvard Law School Forum on Corporate Governance and Financial Regulation.

Journal Reference:

W. N. Goetzmann, D. Kim, A. Kumar, Q. Wang. Weather-Induced Mood, Institutional Investors, and Stock Returns. Review of Financial Studies, 2014; 28 (1): 73 DOI: 10.1093/rfs/hhu063


NASA satellite set to get the dirt on Earth's soil moisture

A new NASA satellite that will peer into the topmost layer of Earth's soils to measure the hidden waters that influence our weather and climate is in final preparations for a Jan. 29 dawn launch from California.

The Soil Moisture Active Passive (SMAP) mission will take the pulse of a key measure of our water planet: how freshwater cycles over Earth's land surfaces in the form of soil moisture. The mission will produce the most accurate, highest-resolution global maps ever obtained from space of the moisture present in the top 2 inches (5 centimeters) of Earth's soils. It also will detect and map whether the ground is frozen or thawed. This data will be used to enhance scientists' understanding of the processes that link Earth's water, energy and carbon cycles.

"With data from SMAP, scientists and decision makers around the world will be better equipped to understand how Earth works as a system and how soil moisture impacts a myriad of human activities, from floods and drought to weather and crop yield forecasts," said Christine Bonniksen, SMAP program executive with the Science Mission Directorate's Earth Science Division at NASA Headquarters in Washington. "SMAP's global soil moisture measurements will provide a new capability to improve our understanding of Earth's climate."

Globally, the volume of soil moisture varies from three to five percent in desert and arid regions to between 40 and 50 percent in saturated soils. In general, the amount depends on such factors as precipitation patterns, topography, vegetation cover and soil composition. There are not enough sensors in the ground to map the variability in global soil moisture at the level of detail needed by scientists and decision makers. From space, SMAP will produce global maps with 6-mile (10-kilometer) resolution every two to three days.

Researchers want to measure soil moisture and its freeze/thaw state better for numerous reasons. Plants and crops draw water from the soil through their roots to grow. If soil moisture is inadequate, plants fail to grow, which over time can lead to reduced crop yields. Also, energy from the sun evaporates moisture in the soil, thereby cooling surface temperatures and also increasing moisture in the atmosphere, allowing clouds and precipitation to form more readily. In this way, soil moisture has a significant effect on both short-term regional weather and longer-term global climate.

In summer, plants in Earth's northern boreal regions -- the forests found in Earth's high northern latitudes -- take in carbon dioxide from the air and use it to grow, but lay dormant during the winter freeze period. All other factors being equal, the longer the growing season, the more carbon plants take in and the more effective forests are in removing carbon dioxide from the air. Since the start of the growing season is marked by the thawing and refreezing of water in soils, mapping the freeze/thaw state of soils with SMAP will help scientists more accurately account for how much carbon plants are removing from the atmosphere each year. This information will lead to better estimates of the carbon budget in the atmosphere and, hence, better assessments of future global warming.

SMAP data will enhance our confidence in projections of how Earth's water cycle will respond to climate change.

"Assessing future changes in regional water availability is perhaps one of the greatest environmental challenges facing the world today," said Dara Entekhabi, SMAP science team leader at the Massachusetts Institute of Technology in Cambridge. "Today's computer models disagree on how the water cycle -- precipitation, clouds, evaporation, runoff, soil water availability -- will increase or decrease over time and in different regions as our world warms. SMAP's higher-resolution soil moisture data will improve the models used to make daily weather and longer-term climate predictions."

SMAP also will advance our ability to monitor droughts, predict floods and mitigate the related impacts of these extreme events. It will allow the monitoring of regional deficits in soil moisture and provide critical inputs into drought monitoring and early warning systems used by resource managers. The mission's high-resolution observations of soil moisture will improve flood warnings by providing information on ground saturation conditions before rainstorms.

SMAP's two advanced instruments work together to produce soil moisture maps. Its active radar works much like a flash camera, but instead of transmitting visible light, it transmits microwave pulses that pass through clouds and moderate vegetation cover to the ground and measures how much of that signal is reflected back. Its passive radiometer operates like a natural-light camera, capturing emitted microwave radiation without transmitting a pulse. Unlike traditional cameras, however, SMAP's images are in the microwave range of the electromagnetic spectrum, which is invisible to the naked eye. Microwave radiation is sensitive to how much moisture is contained in the soil.

The two instruments share a large, lightweight reflector antenna that will be unfurled in orbit like a blooming flower and then spin at about 14 revolutions per minute. The antenna will allow the instruments to collect data across a 621-mile (1,000-kilometer) swath, enabling global coverage every two to three days.
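A back-of-envelope check (not mission math; the orbit count is an assumption typical of a ~685-kilometer near-polar orbit, and only ascending passes are counted) shows why a roughly 1,000-kilometer swath translates into global coverage every two to three days:

```python
equator_km = 40075        # Earth's equatorial circumference
swath_km = 1000           # SMAP swath width (621 miles)
orbits_per_day = 14.6     # assumed for a ~685-km near-polar orbit

daily_coverage_km = orbits_per_day * swath_km   # ascending passes only
print(f"~{equator_km / daily_coverage_km:.1f} days to cover the equator")
# -> ~2.7 days, consistent with "every two to three days"
```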

SMAP's radiometer measurements extend and expand on soil moisture measurements currently made by the European Space Agency's Soil Moisture Ocean Salinity (SMOS) mission, launched in 2009. With the addition of a radar instrument, SMAP's soil moisture measurements will be able to distinguish finer features on the ground.

SMAP will launch from Vandenberg Air Force Base on a United Launch Alliance Delta II rocket and maneuver into a 426-mile (685-kilometer) altitude, near-polar orbit that repeats exactly every eight days. The mission is designed to operate at least three years.

SMAP is managed for NASA's Science Mission Directorate in Washington by the agency's Jet Propulsion Laboratory in Pasadena, California, with instrument hardware and science contributions made by NASA's Goddard Space Flight Center in Greenbelt, Maryland. JPL is responsible for project management, system engineering, radar instrumentation, mission operations and the ground data system. Goddard is responsible for the radiometer instrument. Both centers collaborate on science data processing and delivery to the Alaska Satellite Facility, in Fairbanks, and the National Snow and Ice Data Center, at the University of Colorado in Boulder, for public distribution and archiving. NASA's Launch Services Program at the agency's Kennedy Space Center in Florida is responsible for launch management. JPL is managed for NASA by the California Institute of Technology in Pasadena.

For more information about the Soil Moisture Active Passive mission, visit:

http://www.nasa.gov/smap

and

http://smap.jpl.nasa.gov

SMAP will be the fifth NASA Earth science mission to launch within a 12-month period. NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing.

For more information about NASA's Earth science activities, visit:

http://www.nasa.gov/earthrightnow



Glacier beds can get slipperier at higher sliding speeds

As a glacier's sliding speed increases, the bed beneath the glacier can grow slipperier, according to laboratory experiments conducted by Iowa State University glaciologists.

They say including this effect in efforts to calculate future increases in glacier speeds could improve predictions of ice volume lost to the oceans and the rate of sea-level rise.

The glaciologists -- Lucas Zoet, a postdoctoral research associate, and Neal Iverson, a professor of geological and atmospheric sciences -- describe the results of their experiments in the Journal of Glaciology. The paper uses data collected from a newly constructed laboratory tool, the Iowa State University Sliding Simulator, to investigate glacier sliding. The device was used to explore the relationship between drag and sliding speed for comparison with the predictions of theoretical models.

"We really have a unique opportunity to study the base of glaciers with these experiments," said Zoet, the lead author of the paper. "The other tactic you might take is studying these relationships with field observations, but with field data so many different processes are mixed together that it becomes hard to untangle the relevant data from the noise."

Data collected by the researchers show that resistance to glacier sliding -- the drag that the bed exerts on the ice -- can decrease in response to increasing sliding speed. This decrease in drag with increasing speed, although predicted by some theoreticians as long as 45 years ago, is the opposite of what is usually assumed in mathematical models of the flow of ice sheets.

These are the first empirical results demonstrating that as ice slides at an increasing speed -- perhaps in response to changing weather or climate -- the bed can become slipperier, which could promote still faster glacier flow.
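To make the distinction concrete, here is a purely schematic rate-weakening drag curve. The functional form and numbers are invented for illustration; they are not the study's measured relation or any published sliding law.

```python
import numpy as np

u_peak = 20.0  # sliding speed at which drag peaks (arbitrary, m/yr)

def normalized_drag(u):
    # Rises with speed below u_peak, then declines beyond it.
    return (u / u_peak) * np.exp(1 - u / u_peak)

for speed in (5, 20, 80):
    print(f"u = {speed:3d} m/yr -> normalized drag {normalized_drag(speed):.2f}")
# u =   5 m/yr -> normalized drag 0.53
# u =  20 m/yr -> normalized drag 1.00
# u =  80 m/yr -> normalized drag 0.20
```

A flow model that assumes drag always grows with sliding speed can never produce the falling branch of such a curve, which is why incorporating rate-weakening behavior could change predictions of ice loss.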

The response of glaciers to changing climate is one of the largest potential contributors to sea-level rise. Predicting glacier response to climate change depends on properly characterizing the way a glacier slides over its bed. There has been a half-century debate among theoreticians as to how to do that.

The simulator features a ring of ice about 8 inches thick and about 3 feet across that is rotated over a model glacier bed. Below the ice is a hydraulic press that can simulate the weight of a glacier several hundred yards thick. Above are motors that can rotate the ice ring over the bed at either a constant speed or a constant stress. A circulating, temperature-regulated fluid keeps the ice at its melting temperature -- a necessary condition for significant sliding.

"About six years were required to design, construct, and work the bugs out of the new apparatus," Iverson said, "but it is performing well now and allowing hypothesis tests that were formerly not possible."



Improving forecasts for rain-on-snow flooding

Many of the worst West Coast winter floods pack a double punch. Heavy rains and melting snow wash down the mountains together to breach riverbanks, wash out roads and flood buildings.

These events are unpredictable and difficult to forecast. Yet they will become more common as the planet warms and more winter precipitation falls as rain rather than snow.

University of Washington mountain hydrology experts are using the physics behind these events to better predict the risks.

"One of the main misconceptions is that either the rain falls and washes the snow away, or that heat from the rain is melting the snow," said Nicholas Wayand, a UW doctoral student in civil and environmental engineering. He will present his research Dec. 18 at the annual meeting of the American Geophysical Union.

Most of the largest floods on record in the western U.S. are associated with rain falling on snow. But it's not that the rain is melting or washing away the snow.

Instead, it's the warm, humid air surrounding the drops that is most to blame for the melting, Wayand said. Moisture in the air condenses on the cold snow just like water droplets form on a cold drink can. The energy released when the humid air condenses is absorbed by the snow. The other main reason is that rainstorms bring warmer air, and this air blows across the snow to melt its surface. His work supports previous research showing that these processes provide 60 to 90 percent of the energy for melting.
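An order-of-magnitude sketch with standard bulk-flux formulas shows how the pieces compare. All the numbers below are illustrative assumptions (moderate wind, air 5 degrees C warmer than the snow, 3 mm/h of rain), not observations from the study:

```python
rho_air, cp = 1.2, 1005.0   # air density (kg/m3), specific heat (J/kg/K)
Lv = 2.5e6                  # latent heat released by condensation (J/kg)
Ce = Ch = 1.5e-3            # assumed bulk transfer coefficients
wind = 5.0                  # wind speed (m/s)
dT = 5.0                    # air minus snow-surface temperature (K)
dq = 2.4e-3                 # specific-humidity excess over the snow (kg/kg)

sensible = rho_air * cp * Ch * wind * dT   # warm air blowing over the snow
latent = rho_air * Lv * Ce * wind * dq     # humid air condensing on the snow

rain_rate = 3.0 / 3.6e6                    # 3 mm/h converted to m/s
rain_heat = 1000.0 * 4184.0 * rain_rate * dT  # heat carried by the rain itself

total = sensible + latent + rain_heat
print(f"sensible {sensible:.0f} W/m2, latent {latent:.0f} W/m2, "
      f"rain {rain_heat:.0f} W/m2")
print(f"turbulent share of melt energy: {(sensible + latent) / total:.0%}")
# -> roughly 85%, inside the 60-90 percent range quoted above
```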

Places that experience rain-on-snow flooding include cities on rivers that begin in the mountains, such as Sacramento, California, and Centralia, Washington. In the 1997 New Year's Day flood in Northern California, melting snow exacerbated flooding, which broke levees and caused millions of dollars in damage. The biggest recent rain-on-snow event in Washington was the 2009 flood in the Snoqualmie basin. And the Calgary flood in the summer of 2013 included snowmelt from the Canadian Rockies that helped push rivers over their banks.

The UW researchers developed a model by recreating the 10 worst rain-on-snow flooding events between 1980 and 2008 in three regions: the Snoqualmie basin in Washington state, the upper San Joaquin basin in central California and the East North Fork of the Feather River basin in northern California.

Their results allow them to gauge the risks for any basin and any incoming storm. The three factors that matter most, they found, are the shape of the basin, the elevation of the rain-to-snow transition before and during the storm, and the amount of tree cover. Basins most vulnerable to snowmelt are treeless basins with a lot of area within the rain-snow transition zone, where the precipitation can fall as snow and then rain.

Trees reduce the risk of flooding because they slow the storm's winds.

"If you've ever been in a forest on a windy day, it's a lot calmer," Wayand said. That slows the energy transferred from condensation and from contact with warm air to the snowpack.

Simulations also show that meltwater accounted for up to about a quarter of the total flooding. That supports earlier research showing that snowmelt is not the main contributor to rain-on-snow floods but cannot be neglected, since it adds water to an already heavy winter rainstorm.

The complexity of mountain weather also plays a role.

"The increase in precipitation with elevation is much greater than usual for some of these storms," said Jessica Lundquist, a UW associate professor of civil and environmental engineering. "Higher flows can result from heavier rainfall rates at higher elevations, rather than from snowmelt."

In related work, Lundquist's group has developed a tennis-ball snow sensor and is measuring growth and melt of the snowpack in the foothills east of Seattle. The scientists aim to better understand how changes in climate and forestry practices might affect municipal water supplies and flood risks.

Wayand and another student in the group have developed a high school curriculum for Seattle teachers to explain rain-on-snow events and the physics behind why they occur. They hope to begin teaching the curriculum sometime next year.

The other collaborator on the work being presented in San Francisco is Martyn Clark at the National Center for Atmospheric Research in Colorado.



Deep Space Climate Observatory to provide 'EPIC' views of Earth

NASA has contributed two Earth science instruments for NOAA's space weather observing satellite called the Deep Space Climate Observatory, or DSCOVR, set to launch in January 2015. One of the instruments, called EPIC, or Earth Polychromatic Imaging Camera, will image Earth in one picture, something that hasn't been done before from a satellite. EPIC will also provide valuable atmospheric data.

Currently, to get an entire Earth view, scientists have to piece together images from satellites in orbit. With the launch of the National Oceanic and Atmospheric Administration's (NOAA) DSCOVR and the EPIC instrument, scientists will get pictures of the entire sunlit side of Earth. To get that view, EPIC will orbit the first sun-Earth Lagrange point (L1), 1 million miles from Earth. At this location, four times farther away than the orbit of the Moon, the gravitational pulls of the sun and Earth balance in a way that provides a stable orbit for DSCOVR. Most other Earth-observing satellites circle the planet within 22,300 miles.
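That million-mile figure can be checked with a simplified model (point masses in a circular orbit, a textbook approximation rather than mission math): L1 is the distance from Earth toward the sun where solar gravity, Earth's gravity and the centrifugal term balance for an object co-rotating with Earth.

```python
from scipy.optimize import brentq

G = 6.674e-11                          # gravitational constant
M_sun, M_earth = 1.989e30, 5.972e24    # masses, kg
d = 1.496e11                           # Earth-sun distance, m
omega2 = G * (M_sun + M_earth) / d**3  # orbital angular rate, squared

def net_accel(r):
    # r = distance from Earth toward the sun; zero at the L1 point.
    return G * M_sun / (d - r)**2 - G * M_earth / r**2 - omega2 * (d - r)

r_L1 = brentq(net_accel, 1e8, 1e10)    # root-find between two brackets
print(f"L1 is ~{r_L1 / 1.609e9:.2f} million miles from Earth")  # ~0.93
```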

"Unlike personal cameras, EPIC will take images in 10 very narrow wavelength ranges," said Adam Szabo, DSCOVR project scientist at NASA's Goddard Space Flight Center, Greenbelt, Maryland. "Combining these different wavelength images allows the determination of physical quantities like ozone, aerosols, dust and volcanic ash, cloud height, or vegetation cover. These results will be distributed as different publicly available data products allowing their combination with results from other missions."

These data products are of interest to climate science, as well as hydrology, biogeochemistry, and ecology. Data will also provide insight into Earth's energy balance.

EPIC was built by Lockheed Martin's Advanced Technology Center in Palo Alto, California. It is a 30-centimeter (11.8-inch) telescope that measures in the ultraviolet and visible areas of the spectrum. EPIC images will have a resolution of between 25 and 35 kilometers (15.5 to 21.7 miles).



Temperature anomalies are warming faster than Earth's average, study finds

It's widely known that Earth's average temperature has been rising. But research by an Indiana University geographer and colleagues finds that spatial patterns of extreme temperature anomalies -- readings well above or below the mean -- are warming even faster than the overall average.

And trends in extreme heat and cold are important, said Scott M. Robeson, professor of geography in the College of Arts and Sciences at IU Bloomington. They have an outsized impact on water supplies, agricultural productivity and other factors related to human health and well-being.

"Average temperatures don't tell us everything we need to know about climate change," he said. "Arguably, these cold extremes and warm extremes are the most important factors for human society."

Robeson is the lead author of the article "Trends in hemispheric warm and cold anomalies," which will be published in the journal Geophysical Research Letters and is available online. Co-authors are Cort J. Willmott of the University of Delaware and Phil D. Jones of the University of East Anglia.

The researchers analyzed temperature records for the years 1881 to 2013 from HadCRUT4, a widely used data set for land and sea locations compiled by the University of East Anglia and the U.K. Met Office. Using monthly average temperatures at points across the globe, they sorted the readings into "spatial percentiles," which rank how unusual an anomaly is by the fraction of Earth's surface it covers.

Their findings include:

- Temperatures at the cold and warm "tails" of the spatial distribution -- the 5th and 95th percentiles -- increased more than the overall average Earth temperature.

- Over the 130-year record, cold anomalies increased more than warm anomalies, resulting in an overall narrowing of the range of Earth's temperatures.

- In the past 30 years, however, that pattern reversed, with warm anomalies increasing at a faster rate than cold anomalies. "Earth's temperature was becoming more homogenous with time," Robeson said, "but now it's not."
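The spatial-percentile bookkeeping behind findings like these is easy to sketch. The example below uses synthetic, unweighted grid data (the real analysis used HadCRUT4 and would area-weight the cells): for each month, take the 5th and 95th percentiles of the anomaly field across grid cells, then fit a trend to each percentile series.

```python
import numpy as np

rng = np.random.default_rng(0)
n_months, n_cells = 12 * 133, 500            # 1881-2013, hypothetical grid
warming = np.linspace(0.0, 1.0, n_months)    # background warming, C
field = warming[:, None] + rng.normal(scale=2.0, size=(n_months, n_cells))

cold_tail = np.percentile(field, 5, axis=1)   # coldest anomalies each month
warm_tail = np.percentile(field, 95, axis=1)  # warmest anomalies each month

t = np.arange(n_months) / 120.0               # time in decades
for name, series in (("5th percentile", cold_tail),
                     ("95th percentile", warm_tail)):
    slope = np.polyfit(t, series, 1)[0]
    print(f"{name}: {slope:+.3f} C per decade")
```

With symmetric synthetic noise, the two tails warm at the same rate; the study's finding is that in the real data the tails have warmed faster than the mean, and asymmetrically.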

The study records separate results for the Northern and Southern Hemispheres. Temperatures are considerably more volatile in the Northern Hemisphere, an expected result because there's considerably less land mass in the South to add complexity to weather systems.

The study also examined anomalies during the "pause" in global warming that scientists have observed since 1998. While a 16-year period is too short a time to draw conclusions about trends, the researchers found that warming continued at most locations on the planet and during much of the year, but that warming was offset by strong cooling during winter months in the Northern Hemisphere.

"There really hasn't been a pause in global warming," Robeson said. "There's been a pause in Northern Hemisphere winter warming."

Co-author Jones of the University of East Anglia said the study provides scientists with better knowledge about what's taking place with Earth's climate. "Improved understanding of the spatial patterns of change over the three periods studied are vital for understanding the causes of recent events," he said.

It may seem counterintuitive that global warming would be accompanied by colder winter weather at some locales. But Robeson said the observation aligns with theories about climate change, which hold that amplified warming in the Arctic region produces changes in the jet stream, which can result in extended periods of cold weather at some locations in the mid-northern latitudes.

And while the rate of planetary warming has slowed in the past 16 years, it hasn't stopped. The World Meteorological Organization announced this month that 2014 is on track to be one of the warmest, if not the warmest, years on record as measured by global average temperatures.

In the U.S., the East has been unusually cold and snowy in recent years, but much of the West has been unusually warm and has experienced drought. And what happens here doesn't necessarily reflect conditions on the rest of the planet. Robeson points out that the United States, including Alaska, makes up only 2 percent of Earth's surface.



New insights into predicting future droughts in California: Natural cycles, sea surface temperatures found to be main drivers in ongoing event

According to a new NOAA-sponsored study, natural oceanic and atmospheric patterns are the primary drivers behind California's ongoing drought. A high-pressure ridge off the West Coast (typical of historic droughts) prevailed for three winters, blocking important wet-season storms; ocean surface temperature patterns made such a ridge much more likely. Typically, the winter season in California provides the state with a majority of its annual snow and rainfall, which replenish water supplies for communities and ecosystems.

Further studies on these oceanic conditions and their effect on California's climate may lead to advances in drought early warning that can help water managers and major industries better prepare for lengthy dry spells in the future.

"It's important to note that California's drought, while extreme, is not an uncommon occurrence for the state. In fact, multi-year droughts appear regularly in the state's climate record, and it's a safe bet that a similar event will happen again. Thus, preparedness is key," said Richard Seager, report lead author and professor with Columbia University's Lamont Doherty Earth Observatory.

This report builds on earlier studies, published in September in the Bulletin of the American Meteorological Society, which found no conclusive evidence linking human-caused climate change and the California drought. The current study notes that the atmospheric ridge over the North Pacific, which has resulted in decreased rain and snowfall since 2011, is almost the opposite of what models project to result from human-induced climate change. In fact, the report shows that over most of the state, mid-winter precipitation is projected to increase under human-induced climate change, though warming temperatures may sap much of that benefit for water resources; only spring precipitation is projected to decrease.

The report makes clear that to provide improved drought forecasts for California, scientists will need to fully understand the links between sea surface temperature variations and winter precipitation over the state, discover how these ocean variations are generated, and better characterize their predictability.

This report contributes to a growing field of science known as climate attribution, in which teams of scientists aim to identify the sources of observed climate and weather patterns.

"There is immense value in examining the causes of this drought from multiple scientific viewpoints," said Marty Hoerling, report co-author and researcher with NOAA's Earth System Research Laboratory. "It's paramount that we use our collective ability to provide communities and businesses with the environmental intelligence they need to make decisions concerning water resources, which are becoming increasingly strained."

To view the report, visit: http://cpo.noaa.gov/MAPP/californiadroughtreport.



Hurricane-forecast satellites will keep close eyes on the tropics

A set of eight hurricane-forecast satellites being developed at the University of Michigan is expected to give deep insights into how and where storms suddenly intensify--a little-understood process that's becoming more crucial to figure out as the climate changes, U-M researchers say.

The Cyclone Global Navigation Satellite System is scheduled to launch in fall 2016. At the American Geophysical Union Meeting in San Francisco this week, U-M researchers released estimates of how significantly CYGNSS could improve wind speed and storm intensity forecasts.

CYGNSS -- pronounced like Cygnus, the swan constellation -- is a $173 million NASA mission that U-M is leading with Texas-based Southwest Research Institute. Each of its eight observatories is about the size of a microwave oven. That's much smaller than a typical weather satellite, which is about the size of a van.

The artificial CYGNSS "constellation," as researchers refer to it, will orbit at tropical, hurricane-belt latitudes. Its coverage will stretch from the 38th parallel north near Delaware's latitude to its counterpart in the south just below Buenos Aires.

Because of their arrangement and number, the observatories will be able to measure the same spot on the globe much more often than the weather satellites flying today can. CYGNSS's revisit time will average between four and six hours and can at times be as short as 12 minutes.
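The best-case figure follows from simple orbital arithmetic. As a rough sanity check, the sketch below assumes eight observatories spread evenly around one low-Earth orbit with a period of about 95 minutes; these are assumed values, not mission specifications:

```python
orbital_period_min = 95.0  # assumed period for a roughly 500 km orbit
n_observatories = 8

best_case_gap = orbital_period_min / n_observatories
print(f"best-case revisit: about {best_case_gap:.0f} minutes")  # ~12 minutes
# Most locations are not directly under the ground track on every pass,
# which is why the average revisit still works out to several hours.
```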

Conventional weather satellites only cross over the same point once or twice a day. Meteorologists can use ground-based Doppler radar to help them make predictions about storms near land, but hurricanes, which form over the open ocean, present a tougher problem.

"The rapid refresh CYGNSS will offer is a key element of how we'll be able to improve hurricane forecasts," said CYGNSS lead investigator Christopher Ruf, director of the U-M Space Physics Research Lab and professor of atmospheric, oceanic and space sciences.

"CYGNSS gets us the ability to measure things that change fast, like extreme weather. Those are the hardest systems to measure with today's satellites. And because the world is warmer and there's more energy to feed storm systems, there's more likelihood of extreme weather."

Through simulations, the researchers quantified the improvement CYGNSS could have on storm intensity predictions. They found that for a wind speed forecast that is off by 33 knots, or 38 miles per hour -- the average error with current capabilities -- CYGNSS could reduce the error by 9 knots, or about 10 mph.

Considering that the categories of hurricane strength ratchet up, on average, every 20 mph, the accuracy boost is "a very significant number," Ruf said.
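The unit arithmetic behind those figures is easy to verify:

```python
KT_TO_MPH = 1.15078  # one knot in miles per hour

current_error_kt = 33.0  # average intensity-forecast error today
improvement_kt = 9.0     # projected reduction with CYGNSS data

for label, err_kt in [("current error", current_error_kt),
                      ("with CYGNSS", current_error_kt - improvement_kt)]:
    print(f"{label}: {err_kt:.0f} kt = {err_kt * KT_TO_MPH:.0f} mph")
# Categories of hurricane strength span roughly 20 mph each, so trimming
# about 10 mph off the error recovers roughly half a category of precision.
```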

"I'd describe the feeling about it as guarded excitement," he said. "It's preliminary and it's all based on models. People will be really excited when we get up there and it works."

The numbers could also improve as scientists update weather prediction tools to better use the new kind of information that CYGNSS will provide.

For people who live in common hurricane or typhoon paths, closer wind speed predictions could translate into more accurate estimates of the storm surge at landfall, Ruf said. That's the main way these systems harm people and property.

"The whole ocean gets higher because the wind pushes the water. That's really hard to forecast now and it's an area we hope to make big improvements in," Ruf said.

Researchers expect the satellite system to give them new insights into storm processes. Hurricanes evolve slowly at first, but then they reach a tipping point, says Aaron Ridley, a professor of atmospheric, oceanic and space sciences.

"The hurricane could be meandering across the Atlantic Ocean and then something happens." Ridley said. "It kicks up a notch and people aren't exactly sure why. A lot of scientists would like to study this rapid intensification in more detail. With a normal mission, you might not be able to see it, but with CYGNSS, you have a better chance."

The satellites will operate in a fundamentally different way than their counterparts do. Rather than transmit a signal and read what reflects back, they'll measure how GPS signals from other satellites bounce off the ocean surface. Each of the eight CYGNSS nodes will measure signals from four of the 32 Global Positioning System satellites.

They'll also be able to take measurements through heavy rain--something other weather satellites are, surprisingly, not very good at.



Small volcanic eruptions partly explain 'warming hiatus'

The "warming hiatus" that has occurred over the last 15 years has been caused in part by small volcanic eruptions.

Scientists have long known that volcanoes cool the atmosphere because of the sulfur dioxide that is expelled during eruptions. Droplets of sulfuric acid that form when the gas combines with oxygen in the upper atmosphere can persist for many months, reflecting sunlight away from Earth and lowering temperatures at the surface and in the lower atmosphere.

Previous research suggested that early 21st-century eruptions might explain up to a third of the recent warming hiatus.

New research available online in the journal Geophysical Research Letters (GRL) further identifies observational climate signals caused by recent volcanic activity. This new research complements an earlier GRL paper published in November, which relied on a combination of ground, air and satellite measurements, indicating that a series of small 21st-century volcanic eruptions deflected substantially more solar radiation than previously estimated.

"This new work shows that the climate signals of late 20th- and early 21st-century volcanic activity can be detected in a variety of different observational data sets," said Benjamin Santer, a Lawrence Livermore National Laboratory scientist and lead author of the study.

After the record warmth of 1998, the steep climb in global surface temperatures observed over the 20th century appeared to level off. This "hiatus" received considerable attention, despite the fact that the full observational surface temperature record shows many instances of slowing and acceleration in warming rates. Scientists had previously suggested that factors such as weak solar activity and increased heat uptake by the oceans could be responsible for the recent lull in temperature increases. After publication of a 2011 paper in the journal Science by Susan Solomon of the Massachusetts Institute of Technology (MIT), it was recognized that an uptick in volcanic activity might also be implicated in the warming hiatus.

Prior to the 2011 Science paper, the prevailing scientific thinking was that only very large eruptions -- on the scale of the cataclysmic 1991 Mount Pinatubo eruption in the Philippines, which ejected an estimated 20 million metric tons (44 billion pounds) of sulfur dioxide -- were capable of impacting global climate. This conventional wisdom was largely based on climate model simulations. But according to David Ridley, an atmospheric scientist at MIT and lead author of the November GRL paper, these simulations were missing an important component of volcanic activity.

Ridley and colleagues found the missing piece of the puzzle at the intersection of two atmospheric layers, the stratosphere and the troposphere -- the lowest layer of the atmosphere, where all weather takes place. Those layers meet between 10 and 15 kilometers (six to nine miles) above Earth.

Satellite measurements of the sulfuric acid droplets and aerosols produced by erupting volcanoes are generally restricted to above 15 km. Below 15 km, cirrus clouds can interfere with satellite aerosol measurements. This means that toward the poles, where the lower stratosphere can reach down to 10 km, the satellite measurements miss a significant chunk of the total volcanic aerosol loading.

To get around this problem, the study by Ridley and colleagues combined observations from ground-, air- and space-based instruments to better observe aerosols in the lower portion of the stratosphere. They used these improved estimates of total volcanic aerosols in a simple climate model, and estimated that volcanoes may have caused cooling of 0.05 degrees to 0.12 degrees Celsius since 2000.
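For a sense of scale, a one-line energy-balance estimate lands in the same ballpark. Every number below is an assumed, illustrative value, not one taken from the study:

```python
# Rule-of-thumb scaling: stratospheric aerosol optical depth (AOD) maps to
# top-of-atmosphere radiative forcing at roughly -25 W/m^2 per unit AOD.
aod_increase = 0.004            # assumed post-2000 rise in stratospheric AOD
forcing = -25.0 * aod_increase  # W/m^2
sensitivity = 0.5               # K per (W/m^2), an assumed transient response

cooling = sensitivity * forcing
print(f"forcing ~ {forcing:.2f} W/m^2 -> cooling ~ {cooling:.2f} C")
# Plausible ranges for the AOD rise and the sensitivity span the reported
# 0.05 to 0.12 degree C cooling band.
```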

The second, Livermore-led study shows that the signals of these late 20th- and early 21st-century eruptions can be positively identified in atmospheric temperature, moisture and the solar radiation reflected at the top of the atmosphere. A vital step in detecting these volcanic signals is the removal of the "climate noise" caused by El Niños and La Niñas.
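One common way to strip out that noise is to regress the temperature series against an ENSO index such as Niño 3.4 and keep the residual. The sketch below, with assumed inputs, shows the idea; the study's actual procedure may differ:

```python
import numpy as np

def remove_enso(temps, nino34, lag=3):
    """Regress ENSO variability out of a monthly temperature series.

    temps  : global-mean temperature anomalies
    nino34 : Nino 3.4 sea surface temperature index, same length; the index
             leads by `lag` months because the atmosphere responds to ENSO
             with a short delay.
    Returns the ENSO-removed residuals (shortened by `lag` samples).
    """
    x = nino34[:-lag] if lag else nino34
    y = temps[lag:] if lag else temps
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)
```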

"The fact that these volcanic signatures are apparent in multiple independently measured climate variables really supports the idea that they are influencing climate in spite of their moderate size," said Mark Zelinka, another Livermore author. "If we wish to accurately simulate recent climate change in models, we cannot neglect the ability of these smaller eruptions to reflect sunlight away from Earth."



Muddy forests, shorter winters present challenges for loggers

Stable, frozen ground has long been recognized as a logger's friend, capable of supporting equipment and trucks in marshy or soggy forests. Now, a comprehensive look at weather from 1948 onward shows that the logger's friend is melting.

The study, published in the current issue of the Journal of Environmental Management, finds that the period of frozen ground has declined by an average of two to three weeks since 1948. Over that time, in years with more variability in freezing and thawing, wood harvests have shifted toward red pine and jack pine -- species that grow in sandy, well-drained soil that can support trucks and heavy equipment even when not frozen.

Jack pine, a characteristic north-woods Wisconsin species, is declining, and harvested areas are often regenerated with different species, changing the overall ecosystem.

The study was an effort to look at how long-term weather trends affect forestry, says author Adena Rissman, an assistant professor of forest and wildlife ecology at the University of Wisconsin-Madison. "When my co-author, Chad Rittenhouse, and I began this project, we wanted to know how weather affects our ability to support sustainable working forests. We found a significant decline in the duration of frozen ground over the past 65 years, and at the same time, a significant change in the species being harvested."

"This study identifies real challenges facing forest managers, loggers, landowners, and industry," says Rittenhouse, now an assistant research professor of natural resources and the environment at the University of Connecticut. "Once we understood the trends in frozen ground, we realized how pulling out that issue tugged on economics, livelihoods, forest ecology, wildlife habitat and policy."

Mud can make forests impassable in fall, and even more so after the snow melts in spring, making life difficult for companies that buy standing trees, Rittenhouse says. "Nobody wants to get stuck; you lose time and have to get hauled out or wait for the ground to firm up again."

Shorter winters and uncertainty complicate management for logging companies, Rissman adds. "They often need to plan out their jobs for the next six months or year." The same is true for managers of state and county forests, which typically allow two years for a cut to be completed. "In some cases," she says, "they are going to three-year contracts to give more time to get the timber out."

Even if equipment can traverse muddy roads, the ruts it leaves may ruin the road and cause unacceptable erosion. "There is increased attention to rutting on public land, and on private land that is in the state's managed forest program or in a form of sustainable forest certification," says Rissman. "Excessively wet and muddy ground during harvest is a lose-lose-lose for the logger, the landowner and the environment."

The study drew on three sources: airport weather records, used to model when the ground was frozen; Department of Natural Resources records of harvest levels for various tree species; and interviews with forest managers and loggers.
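As a rough illustration of how airport temperature records can be turned into a frozen-ground indicator, the sketch below counts a day as "frozen" after a sustained run of sub-freezing daily means. The threshold and run length are assumptions; the study's actual model was more careful:

```python
def frozen_ground_days(daily_mean_c, threshold=-0.5, run=3):
    """Crude count of frozen-ground days in one winter's temperature record.

    A day counts as frozen once the daily mean air temperature has stayed
    below `threshold` degrees C for at least `run` consecutive days.
    """
    frozen, streak = 0, 0
    for temp in daily_mean_c:
        streak = streak + 1 if temp < threshold else 0
        if streak >= run:
            frozen += 1
    return frozen

# A least-squares line through the annual counts (numpy.polyfit(years,
# counts, 1)) is then enough to expose a multi-decade decline like the
# one the study reports.
```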

"People in the forestry industry say this is a big deal; winter is normally the most profitable time," Rissman observes. "It's more and more difficult to make a profit in forestry (with) more loggers (taking) on a lot of debt -- they are heavily mechanized, have heavy labor and insurance expenses, and these costs don't end when they don't have work."

The uncertainty about when and where they can work emerged during an interview with a veteran logger, who is quoted as follows in the study: "When I started in the business ... the typical logger ... would shut down and not do anything for the month or two months that the spring break up would last for. Nowadays, with the cost of equipment, and just the cost of insurance on that equipment alone, you're looking for work almost 12 months out of the year."

The shorter winters seem linked to climate change, Rissman acknowledges. "For many people, climate change is something that happens, or not, in places that are far away, at scales that are difficult to see or understand through personal experience. Here's an example of something we can clearly document, of a trend that is having an impact on how forests are managed, right here at home."

