Extreme weather catastrophes could become as much as three times more likely as the global climate warms, researchers at Stanford University warned this week.
Using California as a case study, Noah Diffenbaugh and his colleagues, writing in the Bulletin of the American Meteorological Society, show how the risk of drought increases as temperatures rise.
The 2013 California drought was the driest since records began. California Governor Jerry Brown declared a state-level “drought emergency”, all 58 of the state’s counties were designated “natural disaster areas”, and the economic cost was estimated to be in the region of $2 billion.
The drought was caused by a large atmospheric ridge, a persistent region of high pressure in the atmosphere, that prevented precipitation from reaching California, as well as Oregon and Washington.
“If you think of a large boulder in a stream,” says Noah Diffenbaugh, one of the authors of the paper, “the water is going to be diverted around the boulder, and that’s essentially what has been happening [in the atmosphere].”
“As the air currents move across the Pacific, those that would normally make it to California have been blocked or steered northward.”
The atmospheric ridge responsible for the California drought was extraordinary in its size and longevity, which led to Daniel Swain, a graduate student in Diffenbaugh’s research group, naming it the Ridiculously Resilient Ridge.
To determine the role of climate change in the likelihood of the California drought, the research group teamed up with Bala Rajaratnam, assistant professor of Statistics and of Environmental Earth System Science. Rajaratnam and his team used statistical techniques to analyse a variety of climate model simulations, running two sets of experiments: one in which atmospheric greenhouse gas levels were set at today’s values, and another in which they were set at pre-industrial levels.
According to Diffenbaugh, the research shows with 95% confidence that “the global warming that has already happened has at least tripled the probability of this extreme atmospheric condition, relative to the probability without the human contribution.”
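To give a rough sense of the kind of calculation involved (the paper’s own statistical analysis is more sophisticated), the sketch below compares how often an “extreme ridge” threshold is exceeded in two hypothetical simulation ensembles and uses a bootstrap to put a confidence interval on the probability ratio. The data, threshold, and function names are illustrative assumptions, not the authors’ code or results.

```python
# Illustrative sketch only: how much more likely does an extreme atmospheric
# ridge become under present-day greenhouse gas levels than under
# pre-industrial levels? All numbers here are made-up stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ridge-strength indices (e.g. geopotential height anomalies)
# from two sets of climate model simulations.
preindustrial = rng.normal(loc=0.0, scale=1.0, size=5000)
present_day = rng.normal(loc=0.4, scale=1.0, size=5000)

# Define "extreme" as exceeding a high percentile of the pre-industrial runs.
threshold = np.percentile(preindustrial, 99)

def exceedance_prob(x, thr):
    """Fraction of simulated seasons in which the ridge index exceeds thr."""
    return np.mean(x > thr)

ratio = exceedance_prob(present_day, threshold) / exceedance_prob(preindustrial, threshold)

# Bootstrap resampling to attach a rough 95% confidence interval to the ratio.
boot = []
for _ in range(2000):
    pi = rng.choice(preindustrial, size=preindustrial.size, replace=True)
    pd = rng.choice(present_day, size=present_day.size, replace=True)
    p_pi = exceedance_prob(pi, threshold)
    if p_pi > 0:
        boot.append(exceedance_prob(pd, threshold) / p_pi)

low, high = np.percentile(boot, [2.5, 97.5])
print(f"risk ratio ~ {ratio:.1f} (95% bootstrap interval {low:.1f} to {high:.1f})")
```

A statement like “at least tripled the probability” corresponds to the lower end of such an interval sitting above 3; the real study’s confidence statement rests on its own models and methods rather than this toy resampling.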
No individual weather event can ever be fully attributed to climate change, but with extreme weather becoming more likely both in the US and around the globe, we are going to need to be better prepared.
As Diffenbaugh puts it, “there are a lot of opportunities to improve our resilience in the current climate and at the same time prepare for future events.”
Climate has always changed, and always will, because the atmosphere is inherently unstable. Only a fool would deny that. But there are some who make a living by pretending that it is a recent phenomenon, caused by capitalism (why else would China be exempt from the Kyoto protocol?) and likely to be a disaster if you don't unplug your telephone charger, throw away your lightbulbs, and buy an expensive electricity meter.
As far as I know, China is not “exempt” from the Kyoto Protocol; it just didn’t sign up to it. If I remember correctly, the U.S.A. had to be dragged into it after years of denial.
When climate scientists get a prediction right, based on actual rather than "adjusted" data and arbitrary proxies, it will be time to listen to them. The fashionable view in the 1960s was that we were headed for an imminent ice age.
To be fair, China is closing some coal-fired power stations and banning the use of the more polluting types of coal in large areas. The aim is to improve air quality.
Yeah, that's the usual bunkum put about by the CO2 lobby. It doesn't explain why the prehistoric CO2 curve follows rather than leads the temperature curve, unless you believe that the laws of physics changed in 1900. The simple explanation is that water is the driver and the CO2 balance between plants and animals is temperature dependent - as any insect will tell you.
I agree Alan... it's all about the Globalists' control of wealth redistribution. Ethos_, Wed, 22nd Oct 2014
It's not just "prehistoric" data that makes one doubt the importance of anthropogenic CO2. We now have evidence from a retreating glacier that the planet, or at least Canada, was warmer 500 years ago than it is now, well within recorded history but way before the industrial revolution.
And not just an ice age. The famous Mauna Loa data shows an annual cycle of CO2 concentration that (a) lags behind the temperature graph and (b) reaches a minimum in late winter when anthropogenic CO2 emissions are at a maximum. Indeed, all the evidence I have seen points towards CO2 being the thermometer, not the thermostat. alancalverd, Sun, 2nd Nov 2014
As in this: Mauna Loa Carbon Dioxide... ?
Forest growth should reduce the CO2 level in summer. The data shows the opposite. alancalverd, Tue, 4th Nov 2014
I think we live on a different planet, or at least we till different soils. Fastest growth occurs in the period February-July. Once the days start getting shorter, deciduous forests and grasslands think about seeding, not competing for sunlight, and the leaves start falling in September, with most trees and grasses virtually dormant by the end of October. Something to do with "fall" and "harvest".