
Comments


A-Team

A new 01 Jun 15 article on the wavier jet stream by Dr. Francis is quite readable; free full text at:
http://rsta.royalsocietypublishing.org/content/373/2045/20140170

Evidence linking rapid Arctic warming to mid-latitude weather patterns
Jennifer Francis, Natasa Skific
Philosophical Transactions of the Royal Society A.

The effects of rapid Arctic warming and ice loss on weather patterns in the Northern Hemisphere are a topic of active research, lively scientific debate and high societal impact. The emergence of Arctic amplification—the enhanced sensitivity of high-latitude temperature to global warming—in only the last 10–20 years presents a challenge to identifying statistically robust atmospheric responses using observations. Several recent studies have proposed and demonstrated new mechanisms by which the changing Arctic may be affecting weather patterns in mid-latitudes, and these linkages differ fundamentally from tropics/jet-stream interactions through the transfer of wave energy.

In this study, new metrics and evidence are presented that suggest disproportionate Arctic warming—and resulting weakening of the poleward temperature gradient—is causing the Northern Hemisphere circulation to assume a more meridional character (i.e. wavier), although not uniformly in space or by season, and that highly amplified jet-stream patterns are occurring more frequently. Further analysis based on self-organizing maps supports this finding. These changes in circulation are expected to lead to persistent weather patterns that are known to cause extreme weather events. As emissions of greenhouse gases continue unabated, therefore, the continued amplification of Arctic warming should favour an increased occurrence of extreme events caused by prolonged weather conditions.

A-Team

The article above was just one of nine in a quite interesting symposium published at

http://rsta.royalsocietypublishing.org/content/373/2045

Research article: Arctic sea ice trends, variability and implications for seasonal ice forecasting
Mark C. Serreze, Julienne Stroeve

September Arctic sea ice extent has a strong downward trend with a detrended 1 year lag autocorrelation of essentially zero. We argue that, from a stronger albedo feedback, a longer melt season and the lack of especially cold winters, the downward trend itself is steepening. The lack of autocorrelation reflects both the inherent large variability in summer atmospheric circulation patterns and the fact that oceanic heat loss in winter acts as a negative (stabilizing) feedback, albeit insufficient to counter the steepening trend. There remains an inherent limit to predictability owing to the largely chaotic nature of atmospheric variability.

Research article: Variability of Arctic sea ice thickness and volume from CryoSat-2
R. Kwok, G. F. Cunningham

We present our estimates of the thickness and volume of the Arctic Ocean ice cover from CryoSat-2 data acquired between October 2010 and May 2014. Average ice thickness and draft differences are within 0.16 m of measurements from other sources (moorings, submarine, electromagnetic sensors, IceBridge). The choice of parameters that affect the conversion of ice freeboard to thickness is discussed. Estimates between 2011 and 2013 suggest moderate decreases in volume followed by a notable increase of more than 2500 km3 (or 0.34 m of thickness over the basin) in 2014, which could be attributed not only to a cooler summer in 2013 but also to large-scale ice convergence just west of the Canadian Arctic Archipelago due to wind-driven onshore drift.

Review article: A seamless approach to understanding and predicting Arctic sea ice in Met Office modelling systems
Helene T. Hewitt, Jeff K. Ridley, Ann B. Keen, Alex E. West, K. Andrew Peterson, Jamie G. L. Rae, Sean F. Milton, Sheldon Bacon

Recent CMIP5 models predict large losses of summer Arctic sea ice, with only mitigation scenarios showing sustainable summer ice. Sea ice is inherently part of the climate system, and heat fluxes affecting sea ice can be small residuals of much larger air–sea fluxes. Analysis of energy budgets points to the importance of early summer processes such as clouds and melt ponds in determining both the seasonal cycle and the trend in ice decline. Forecasting on time scales from short range to decadal might help to unlock the drivers of high-latitude biases in climate models.

Research article: Regional variability in sea ice melt in a changing Arctic
Donald K. Perovich, Jacqueline A. Richter-Menge

The amount of surface melt and bottom melt that occurs during the summer melt season was measured at 41 sites over the time period 1957 to 2014. There are large regional and temporal variations. Combined surface and bottom melt ranged from 16 to 294 cm, with a mean of 101 cm. The mean ice equivalent surface melt was 48 cm and the mean bottom melt was 53 cm.… Under current conditions, summer melting in the central Arctic is not large enough to completely remove the sea ice cover.

Research article: Factors affecting projected Arctic surface shortwave heating and albedo change in coupled climate models
Marika M. Holland, Laura Landrum

Changes in the surface sea ice properties are associated with an earlier melt season onset, a longer snow-free season and enhanced surface ponding. Because many of these changes occur during peak solar insolation, they have a considerable influence on Arctic surface shortwave heating that is comparable to the influence of ice area loss in the early twenty-first century. As ice area loss continues through the twenty-first century, it overwhelms the influence of changes in the sea ice surface state, and is responsible for a majority of the net shortwave increases by the mid-twenty-first century.

Research article: Sea-ice thermodynamics and brine drainage
M. Grae Worster, David W. Rees Jones

As the summertime extent of sea ice diminishes, the Arctic is increasingly characterized by first-year rather than multi-year ice. It is during the early stages of ice growth that most brine is injected into the oceans, contributing to the buoyancy flux that mediates the thermo-haline circulation. Current operational sea-ice components of climate models often treat brine rejection between sea ice and the ocean similarly to a thermodynamic segregation process, assigning a fixed salinity to the sea ice, typical of multi-year ice. However, brine rejection is a dynamical, buoyancy-driven process and the salinity of sea ice varies significantly during the first growth season. As a result, current operational models may over-predict the early brine fluxes from newly formed sea ice.

Review article: Recent changes in Antarctic Sea Ice
John Turner, J. Scott Hosking, Thomas J. Bracegirdle, Gareth J. Marshall, Tony Phillips

In contrast to the Arctic, total sea ice extent across the Southern Ocean has increased since the late 1970s. However, this net increase masks regional variations, most notably an increase (decrease) over the Ross (Amundsen–Bellingshausen) Sea. Sea ice variability results from changes in atmospheric and oceanic conditions. The former is thought to be more significant, since there is a high correlation between anomalies in the ice concentration and the near-surface wind field. The Southern Ocean extent trend is dominated by the increase in the Ross Sea sector, where it is significantly correlated with the deepening of the Amundsen Sea Low.

Research article: Evidence linking rapid Arctic warming to mid-latitude weather patterns
Jennifer Francis, Natasa Skific
see above

Neven

Thanks, A-Team. Very interesting.

Jim Hunt

It's a pity about all the paywalls though, Jennifer excepted!

Susan Anderson

A-Team thank you for that.

MrVillabolo

Pardon this unprofessional non-scientist if I'm asking an idiot question. I've been following the Arctic meltdown for 5 years and I'm curious to know this. What year, approximately, will we see an ice-free summer in the Arctic Ocean?

Neven

Villabolo, I believe mainstream opinion among Arctic experts is somewhere between 2030 and 2040 for an ice-free day in September (I'm guessing they would say an ice-free summer would probably come decades after that). Some say it could happen earlier.

I personally think that it could happen any year, as soon as volume is low enough at the start of a melting season. Perhaps it already is. With just the right weather conditions persisting throughout the melting season (caused by a 'stuck' jet stream, for instance), it could happen. But it would take a perfect chronology of events, a freak year, for it to happen earlier than that.

Bill Fothergill

Villabolo,

Adding to Neven's comments, one obvious caveat is the actual meaning of "ice free summer."

There is no absolute consensus (where have I heard that before?), but "less than 1 million sq kms by September" is accepted by many as being a reasonable definition. What I'm less certain of is whether this "less than 1 million sq kms" would be for the daily minimum or for the September average.

Personally, I always give far greater credence to monthly figures.

Chris Reynolds

Bill,

I think it is generally for the monthly average. A lot of the work is based on GCMs, and because of internal variability within the model runs a daily figure isn't very helpful.

A-Team

"It's a pity about all the paywalls though, Jennifer Francis excepted"

I think we can agree $29 x 9 is over the top for the public to see what their public money has funded for public employees. I have been pinging them via ResearchGate for reprints, which I recommend. Ronald Kwok has helpfully sent a pdf (3rd link below) that I believe will work for everyone -- it is a substantial paper on a topic of great interest to us, "Variability of Arctic sea ice thickness and volume from CryoSat-2".

It would be great if Chris or others could review this over at one of the forums (not sure what the best spot for it is). So far we have 4 of the 9.


http://rsta.royalsocietypublishing.org/content/373/2045/20140163

http://rsta.royalsocietypublishing.org/content/roypta/373/2045/20140171.full.pdf+

https://www.researchgate.net/requests/attachment/10617841

http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-13-00177.1
Towards Quantifying The Increasing Role Of Oceanic Heat In Sea Ice Loss In The New Arctic

This paper summarizes our present understanding of how heat reaches the ice base from the original sources – inflows of Atlantic and Pacific Water, river discharge, and summer sensible heat and shortwave radiative fluxes at the ocean/ice surface – and speculates on how such processes may change in the New Arctic. It calls for: 1) improved mapping of the upper and mid-depth Arctic Ocean, 2) enhanced quantification of important processes, 3) expanded long-term monitoring at key heat-flux locations, and 4) development of numerical capabilities that focus on parameterization of heat flux mechanisms and their interactions.

Bill Fothergill

" ... over the top for the public to see what their public money has funded for public employees ... "

In the UK, there has been some move towards redressing this very issue. A plethora of research papers - including recent papers on climate change - can be viewed on-line for free at participating libraries.

http://www.accesstoresearch.org.uk/about

I've recently been churning through some of the material by Bill Ruddiman at my local library.

For those of you unfortunate enough not to live in this green and pleasant (i.e. cool and wet) land, it might be worth asking at your own libraries in case a similar scheme is in place.

A-Team

Now we have 5 of the 9. And this June issue was so perfectly timed for the coming melt season.

This journal imposes a $2960 supplemental charge for each article whose authors want it to be open access, even though that involves no additional cost or lost revenue for the journal. That is on top of the base cost and a ~$1000 per-article color printing supplement. All for an issue entirely organized by the submitters and reviewed by unpaid volunteers!

A seamless approach to understanding and predicting Arctic sea ice in Met Office modelling systems.
Helene T Hewitt et al
https://www.researchgate.net/requests/attachment/10617665

Kevin O'Neill

A-Team, the full article appears to be available here from The Royal Society: A seamless approach to understanding and predicting Arctic sea ice in Met Office modelling systems

Chris Reynolds

A Team,

I've now read Carmack et al. in full a few times (it's been on my phone for ages). That it is entitled 'towards' is apt; it doesn't give a clear answer on the role of ocean heat in sea ice loss. That is not to say it is of no use - I found it very useful.

Coming as it does from a conference involving the leading experts a few years ago, it is the MUST READ paper for everybody on the forum and for anyone reading Neven's blog or mine.

As for summarising it, I can only refer people to the abstract. Then onwards to read the paper itself.

Notably there is only one reference to PIOMAS. Notwithstanding the concentration on other ocean heat sources, I still consider that Lindsay & Zhang accurately summed up the reason for the 1995 to 2004 PIOMAS volume loss in their 2004 paper - this being ice/ocean albedo feedback - that this volume loss in the PIOMAS model correctly reflects the dominant processes at work in the 'real world', and that the initiation and maintenance of ice/ocean albedo feedback is driven by AGW.

Now, a decade on, I think it would be a valuable addition to see Lindsay & Zhang 2004 revisited and updated.

Kevin McKinney

Bill, "till we have built Jerusalem…" :-)

Neven

Bill can't log on temporarily and asked me to post the following:


Well Kevin, as it's the centenary year for the WI in this scepter'd isle, perhaps that should say "Jam & Jerusalem".

Blaine

My apologies for waiting to post on this subject until the thread is beginning to go stale. Despite this being a sea-ice-specific forum, I'm going to try actually addressing the subject of this post, the rains in Texas.

At around 8 feet above the previous record, there is no doubt that the Blanco River flood at Wimberley was an extremely unlikely event even in the current climate, let alone in the pre-Anthropocene climate. However, the US NWS maintains statistics on around 7,300 gauging stations, and the odds that one of them would show this type of event are around 7,300 times larger than the chance that any one particular station would. A similar but less extreme logic applies to the Houston neighborhood flooding.

Of course, the rains over any single small area during a short period of time will have a larger fractional variance than the rains over a larger area. Climate change will be much easier to see in statewide monthly rain totals than in 24-hour rain totals at single locations.

Jeff Masters is kind enough to give us the sorted record rainfall totals for the wettest months of the past 120 years of Texas and Oklahoma.

The expected distribution of values on the table is very much non-Gaussian, as it is the tail of a larger distribution which is itself expected to be fat-tailed. Nevertheless, calculating sigma values for May 2015 relative to an assumed normal distribution of the other 9 record months is useful as a crude back-of-the-envelope check of exactly how unusual May 2015 was. This yields sigma values of 5.6 for Oklahoma and 12.6 for Texas! Even with this crude technique it is obvious that May 2015 was an extremely unusual month.
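For anyone who wants to reproduce that back-of-the-envelope check, here is a minimal Python sketch of it. The nine record values below are hypothetical placeholders (the actual figures from Jeff Masters' table aren't reproduced in this comment), so the printed sigma will not match the 5.6 / 12.6 quoted above.

```python
import statistics

# Crude check described above: how many standard deviations is the new
# record above the mean of the previous nine record-wet months?
# NOTE: the nine values are hypothetical placeholders, not the actual
# statewide totals from Jeff Masters' table.
previous_records = [6.2, 6.3, 6.4, 6.5, 6.7, 6.9, 7.1, 7.4, 7.8]  # inches
new_record = 8.81  # May 2015 Texas statewide total, inches (quoted later in the thread)

mean = statistics.mean(previous_records)
sd = statistics.stdev(previous_records)   # sample standard deviation of the nine
sigma = (new_record - mean) / sd
print(f"May 2015 is about {sigma:.1f} sigma above the previous nine records")
```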

The question then arises, that, assuming that the previous table of values accurately represented the preexisting climate, exactly how unusual was May 2015? Estimation of this of course requires extrapolation far from the range of previously observed values, and of course the answer obtained will depend on the functional form which this probability distribution is assumed to take. Obviously rainfall occurrences are not close to Gaussian. The usual solution is to assume a power law relationship between the rainfall amount and its recurrence interval.

I took the ordinal values 1-9 for the previous record wettest months in Oklahoma and Texas, and did a simple linear correlation between their logarithms and the logarithms of the rainfall amounts, which results in a power-law curve fit. I then took the measured May 2015 rainfall in Oklahoma and Texas, and recovered the matching ordinal values from the curve fit: 0.22 and 0.0039 respectively. Invert and multiply by the observed time period of 120 years and we have recurrence intervals of 550 and 30,700 years, respectively.
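In case it's useful, here is a short Python sketch of that log-log (power-law) fit and the recurrence-interval calculation. The record values in the example call are hypothetical stand-ins, since the actual table values aren't given here, so the result will differ from the 550 / 30,700 years quoted above.

```python
import numpy as np

def recurrence_interval(record_months, new_value, record_years=120.0):
    """Power-law extrapolation of rank vs. rainfall, as described above.

    record_months: previous record-wet statewide monthly totals, wettest first
    new_value:     the new (May 2015) statewide monthly total
    Returns the implied recurrence interval in years.
    """
    ranks = np.arange(1, len(record_months) + 1)      # ordinals 1..9
    values = np.asarray(record_months, dtype=float)

    # Linear fit in log-log space: log(rank) = slope * log(rainfall) + intercept
    slope, intercept = np.polyfit(np.log(values), np.log(ranks), 1)

    # Fractional ordinal implied for the new value; invert and scale by record length
    fractional_rank = np.exp(slope * np.log(new_value) + intercept)
    return record_years / fractional_rank

# Example with HYPOTHETICAL record values (not the actual table):
texas_records = [6.66, 6.57, 6.49, 6.42, 6.36, 6.30, 6.25, 6.21, 6.18]
print(f"implied recurrence: {recurrence_interval(texas_records, 8.81):,.0f} years")
```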

The previous 120 years may have been unusually dry in Texas, and very careful consideration of this would likely decrease the recurrence interval somewhat, but it's not likely to make an extreme difference. A distribution which is extremely fat-tailed at long intervals would also decrease the recurrence interval for this kind of rainfall, but the observed distribution is less fat-tailed than a power-law distribution, not more. Presumably comparison with other Texas-sized areas could shed some light on this. I'm also not really absolutely certain about the quality of the supplied data.

Barring some unknown explanation for a fat tail only at extreme values with >120 year recurrence intervals, though, it really looks as though the May 2015 statewide rainfall totals in Texas are, in themselves, so extremely improbable that they could never reasonably happen without significant climate change, ever. I don't really understand why this hasn't gotten more attention than it has, rather than everyone focusing on some moderately improbable local rainfall events.

Bill Fothergill

Neven (et al),

Thanks for the email update regarding the work being done by Typepad in order to fix the digital certificate problem.

As you can doubtless see from the very existence of this posting, their fix/patch appears to have worked.

@ Blaine

Yep, you can't beat doing a bit of statistical analysis in order to see if any given factoid possesses genuine significance.

If you want a look at how NOT to do it, there was a hilariously incompetent bit of crap produced by Lubos Motl back in 2010. This was slavishly and sycophantically regurgitated by James Delingpole in a blog piece titled "AGW: I refute it thus."

However, credit where credit is due; Delingpole did at least attempt to use the word "refute" correctly, and, with delicious but presumably unintended irony, simultaneously managed to use a colon for the purpose for which it was intended.

cheers bill f

D_C_S

Blaine:

Regarding your statement "Nevertheless, calculation of sigma values of May 2015 relative to an assumed normal distribution of the other 9 record months is useful as a crude back-of-the-envelope check for exactly how unusual May 2015 was", did you use the mean and the standard deviation of just those next 9 top monthly rainfalls, assuming a normal distribution, to estimate the probability of the top monthly rainfall happening?

I think that it is entirely improper to estimate the probability of the top monthly rainfall happening based on a normal distribution using only the next 9 top monthly rainfalls. Those other top 9 monthly rainfalls would have a much smaller standard deviation than would all of the monthly rainfalls over the 120-year period. Of course, the top event would be closer to the mean of the top 9 events than to the mean of all of the monthly rainfalls, but I don't see why this should balance out with the narrower distribution of the next 9 top monthly rainfalls. I don't have an especially strong background in statistics though.

Blaine

@D_C_S: Yes, that particular calculation is relative to the previous top 9, with the basic idea that the most recent top-10 rainfall is in the same group as the other 9 and should not be wildly out of line with them, as there was no particular reason why the most recent top-10 value should be the highest. If you get >4 sigma on this crude check, the new value is probably exceptional even for a record, while if you get <3, it most likely isn't, although this does depend on exactly how fat the tail of the distribution is.

I did note why it isn't really proper mathematically in the original post, but it is the first thing I check, just as a general sanity check. If you find that it confuses more than illuminates, you can just ignore it and focus on the more mathematically proper treatment.

Taking the standard deviation of all months isn't relevant at all, because the rainfall distribution is so fat-tailed (in other words, has such an excess of extreme events relative to a normal Gaussian distribution) that it is meaningless for comparison to the distribution of the extreme months. Yes, that is what you would normally do if you had a reasonably normal distribution.

D_C_S

Blaine:

My crude way of approaching it would be different from yours, as the top 10 events probably wouldn't be anywhere near normally distributed.

A fat tail to the right should make extreme values there more likely than if the distribution were a normal one.

There are 1445 months from January 1895 to May 2015, so the top 10 months represent 10 / 1445 of all months. Assuming a normal distribution, that fraction of points corresponds to being at least about 2.46137 standard deviations greater than the mean. For Texas, this would correspond to about 6.18" (the least of the top 10 values).

Using the average rainfalls of 41 locations in Texas, I crudely estimate the average statewide monthly rainfall in Texas to be about 2.80".

We can now estimate the standard deviation, and then estimate the probability of a monthly rainfall in Texas that is greater than or equal to 8.81", assuming a normal distribution.

8.81" is about 4.38 standard deviations from the mean, for a probability of 1 in about 166000 for any given month, which corresponds to once in about 13,800 years. However, a fat tail to the right could make this event a lot more likely.

D_C_S

Clarification:

Even if all of the monthly rainfalls were normally distributed, the top 10 values probably wouldn't be anywhere near normally distributed among themselves. I assumed a normal distribution for all of the monthly rainfalls. Then the probability of the recent extreme event should be more likely with a fat tail to the right than with a normal distribution for all of the monthly rainfalls.

Kevin McKinney

Thanks Bill (and Neven). "Jam & Jerusalem" makes me think of Emerson, Lake & Palmer more than WI, but then I'm a Canuck baby boomer. ;-)

Blaine

@DCS: I find your numbers to be broadly correct, and certainly more mathematically sound than my first estimate, which I did tag as being rough. I have two relatively minor issues with it.

On average the values corresponding to ordinals 1-10 will be more than 2.46137 standard deviations away from the mean, since 1-9 are farther away. Also there is what I call the "flying cat" issue.

The questions "Based on previous experience, if I had a cat what is the probability that it could fly?," and "If I had a flying cat, what would have been be the probability of that happening?," are two very different questions. In the second case, it is known with certainty that at least one flying cat exists, namely yours, and therefore flying cats may be improbable, but it known with certainty that they are not impossible. The issue is whether the current value being tested is admissible as a valid member of the training data set. Both probabilities are valid questions to ask, but have different answers.

In this case, the "no flying cats" question is, "Under the climate as previously observed, what was the probability of this new event happening?", while the "flying cats" question is, "under the climate as observed to date, what was the probability of this new event happening".

I know that almost all statistics textbooks will tell us that you must always ask the "flying cat" question, but I still find myself unconvinced. Say, for example, that we visited Fukushima and made a large number of observations, all of flying cats. We might easily follow the prescriptions of the statistics textbooks and conclude that the new observations of flying cats are not really that unusual, because flying cats are common. If we are explicitly looking for a change from the status quo that we already believed to have taken place before checking the newest value, the "no flying cats" question seems to me to be the more natural one to ask.

To return to the rainfall data, I tried a version of the method you described. For each data point, dividing the difference from average by the expected number of standard deviations for a normal distribution gives a calculated standard deviation. Averaging these gives a standard deviation with which I calculate a recurrence interval for the May 2015 rainfall. For Texas, for your monthly average value of 2.8" I get 36,800 years in the "no flying cats" case and 9,200 years in the "flying cats" case. For the value I found of 2.41", I get 15,600 and 4,800.
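A rough Python sketch of that per-point procedure, under my reading of it, follows. The rank/months_total plotting position is an assumption on my part, and the record values in the example call are hypothetical stand-ins, so this won't reproduce the 36,800 / 9,200-year figures; it just shows the shape of the calculation and the "flying cats" vs. "no flying cats" switch.

```python
from statistics import NormalDist, fmean

def implied_recurrence(records, new_value, mean_rain, months_total,
                       include_new=False):
    """Rough sketch of the per-point estimate described above.

    Each record month is assigned the z-score a normal distribution would
    give its rank among all months (the rank/months_total plotting position
    is an ASSUMPTION), and an implied standard deviation is computed as
    (value - mean) / z.  The average of those is then used to turn the new
    value into a recurrence interval in years.  include_new=True is the
    'flying cats' case (the new record is part of the training set);
    False is the 'no flying cats' case.
    """
    values = sorted(records + ([new_value] if include_new else []), reverse=True)
    nd = NormalDist()
    sds = []
    for rank, v in enumerate(values, start=1):
        z = nd.inv_cdf(1 - rank / months_total)   # expected z for this rank
        sds.append((v - mean_rain) / z)
    sd = fmean(sds)
    p = 1 - nd.cdf((new_value - mean_rain) / sd)
    return 1 / (12 * p)   # recurrence interval in years

# Example with HYPOTHETICAL record values (not the actual table):
records = [6.66, 6.57, 6.49, 6.42, 6.36, 6.30, 6.25, 6.21, 6.18]
print(round(implied_recurrence(records, 8.81, 2.80, 1445, include_new=False)))
print(round(implied_recurrence(records, 8.81, 2.80, 1445, include_new=True)))
```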

Oddly, the previous nine values appear to represent a thin-tailed distribution, as the calculated standard deviation becomes lower with lower ordinal numbers.

For Oklahoma and 3.04", I get 101,000 and 19,400 years. This is, as expected, a distinctly fat-tailed distribution, so these intervals, calculated assuming a normal distribution, should be overestimates.

D_C_S

Blaine:

Yes, of course it would make a bit of a difference whether you use the 10 events or use the 9 events. One could do either, depending on purpose. Obviously, it would be better to use all months, either up to or including May 2015, to get a better idea of the mean and the standard deviation.
