It's half time in the Arctic, and with the Summer Solstice and the first half of the melting season behind us, it's time for an assessment. It's clear that PAC-2013, this year's persistent Arctic cyclone, kept the Arctic in a cold and cloudy grip, causing a very slow start to the melting season. In this post I want to explore how the conditions in the past two months of May and June compare to those of previous years, and get a feel for just how slow the start to this melting season has been.
When reading Chris Reynolds' second entry in the June Status series on his Dosbat blog, describing atmospheric patterns for the previous month in the 2004-2013 period, I remembered doing a similar thing last year in a blog post called Arctic atmosphere June-July 2012. In that post I divided the period from June 1st to July 15th up into three parts by using maps from the daily mean composites page from NOAA's Earth System Research Laboratory.
This time I've combined maps showing sea level pressure patterns from 2007, 2010, 2011, 2012 and this year, divided into four periods spanning two weeks in May and June. I've chosen these years because they have been the biggest contenders for the records and have the most interesting features that help us understand the period after the shift in 2007. And of course, I already had the images that I used for last year's post.
Below the maps I have added Cryosphere Today sea ice area graphs (CT SIA) which will allow me to discuss certain correlations. The image is pretty big, so I hope it fits on everyone's screen when they want to have a closer look by clicking on it:
Before discussing the two months one by one - and bear with me, because it's going to get interesting - here's a quick explanation of how to interpret these sea level pressure (SLP) maps:
The green, yellow and orange stand for high pressure, the blue and purple for low pressure. In high-pressure areas the winds blow in a clockwise fashion; in low-pressure areas it's the other way around: winds blow in an anti-clockwise fashion. The direction and intensity of winds are usually represented by isobars (dividing lines between different pressures), but in this case the different colours make it clear enough.
The important thing to remember is that the ice follows the wind, with a bit of a lag and not quite as fast. Winds blowing clockwise push the ice together; under anti-clockwise blowing winds ice floes get dispersed (see image on the right). This has to do with the Coriolis effect caused by the Earth's rotation.
Another important difference between high-pressure and low-pressure areas is that highs make for clear skies, and thus lots of insolation during summer (or radiation out to space during the dark winter months). Lows cause cloudiness, and thus less sunshine reaches the ice pack during summer, making the air generally colder. During winter clouds can block outgoing radiation and thus increase air temperatures, though still below freezing. Depending on their position, highs and lows can pull in warm or cold air from the continents.
What makes area/extent decrease fast in general, is when highs dominate the American side and centre of the Arctic (the right half of the maps) and lows dominate the Siberian side of the Arctic (left half of the maps), because together they act like cogwheels - one turning clockwise, the other anti-clockwise - that push all the ice floes together and towards the Atlantic where they melt out. This phenomenon is known as the Arctic dipole anomaly.
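For readers who like to see the balance behind the "winds circle highs clockwise" rule: it falls out of geostrophic balance, where the Coriolis force balances the pressure gradient. Below is a minimal Python sketch with an idealized high-pressure bump and rough polar values for the Coriolis parameter and air density (all numbers are illustrative, not from the maps above):

```python
import numpy as np

# Idealized 15 hPa high on a 50 km grid, centred in the domain.
n = 101
dx = 50e3                                   # grid spacing in metres
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)                    # X east-west, Y south-north
p = 101325 + 1500 * np.exp(-(X**2 + Y**2) / (600e3)**2)

# Geostrophic wind: u = -(1/(f*rho)) dp/dy, v = (1/(f*rho)) dp/dx.
f, rho = 1.4e-4, 1.25                       # Coriolis parameter near the pole, air density
dpdy, dpdx = np.gradient(p, dx)             # gradients along rows (y) and columns (x)
u = -dpdy / (f * rho)                       # eastward wind component
v = dpdx / (f * rho)                        # northward wind component

# East of the high's centre the wind blows south (v < 0);
# north of the centre it blows east (u > 0): clockwise circulation.
east = (n // 2, n // 2 + 10)
north = (n // 2 + 10, n // 2)
print(v[east] < 0, u[north] > 0)
```

Flipping the sign of the pressure anomaly (a low) reverses both components, giving the anti-cyclonic/cyclonic picture described above.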
Right, and now onto the analysis:
May
2011 (purple trend line) starts out low at the beginning of May and stays low because of lots of high pressure all around the Arctic. It is soon joined by 2010 (green trend line), which has a very quick start to the melting season due to a big high over the Chukchi and Beaufort regions that weakens slightly after the first half of the month. You can clearly see the slight lag on the CT SIA graph, because as I said it takes a while for the ice pack to respond when weather patterns stay put for a while, especially early in the season.

2007 and 2012 (blue and orange) aren't far from 2010 and 2011 at the end of May, also having enjoyed some high pressure during the first half of the month, turning into an Arctic Dipole (AD) during the second half. The reason that 2012 drops slightly faster than 2007 most probably has to do with the fact that the ice in 2012 was a lot thinner on average after the 2010 and 2011 melting seasons.
2013 (red trend line) clearly stands out on the May CT SIA graph. It starts out alright, but the ice decrease soon stalls under the influence of the persistent cyclone that will dominate the Arctic for a couple of weeks, bringing in clouds and cold, but at the same time dispersing the floes in the interior of the ice pack.
June
2013 remains at a fair distance from the other years because of PAC-2013. A large low-pressure zone even envelops Greenland, which makes for antithetical conditions compared to jaw-dropping 2012. This first half of the melting season is clearly very different from anything we've seen in recent years.
During all of June, 2007 (orange) and 2011 (purple) keep dropping steadily due to some pretty stable conditions, with highs continuing to dominate much of the area over the Arctic Ocean.
2010 has a bit of a lull during the first half of the month, but drops fast again when a relatively big high-pressure system takes over the Central Arctic. 2012 produces an even bigger high during the first half of the month, causing the trend line to plummet on the CT SIA graph.
And that brings us to the most interesting period of last year, a period that in some ways was more interesting to me than the period with the Great Arctic Cyclone that would cause havoc with a capital H a month later. In the second half of June the Arctic became dominated by a widespread low, and although the sea ice area decrease slowed down, it didn't stall completely like it did in previous years under similar circumstances (see last year's blog post on atmospheric conditions for further explanation).
To me this was a sign that the ice was so weak in large parts of the ice pack that it didn't really matter what the weather was, the ice melted out anyway. Especially, of course, after a warm winter in the Kara and Barentsz Seas where the ice melted out in record time during a very decent May and first half of June, combined with lots of insolation over the Beaufort Sea. Even though the lows persisted during July (image on the right), 2012 never gave away the lead.
We could be witnessing the same thing this year, even though 2013's start of the melting season doesn't come close to those of previous years, let alone 2012. We have to keep in mind, though, that this year started out with a total volume as low as that of 2011 and 2012, with the highest amount of first-year ice on record (due to last year's record minimum), and with PAC-2013 having impacted the interior of the ice pack. We still haven't seen the full effects of that event.
There are basically three set-ups that determine the rate of decrease:
- Highs dominating the Arctic (negative Arctic Oscillation) -> fast decrease, especially if the highs are on the American side of the Arctic and stay put for a while, causing a lot of compaction.
- Lows dominating the Arctic (positive Arctic Oscillation) -> average decrease, faster when the pressure is very low. Lows on the Siberian side of the Arctic, combined into an Arctic Dipole with highs on the other side, cause the fastest decrease. 2007 was so special because the AD was persistent and relatively stable for almost all of the melting season. Were that to happen again in coming years, the Arctic could very well go ice-free before the end of the melting season.
- Neither fish nor flesh, neutral Arctic Oscillation, unstable weather systems that don't stay in place -> slow decrease.
What has changed compared to the past is that the gaps between these three set-ups have become smaller, and this obviously has to do with the thinning of the Arctic sea ice pack. A less intense high can now do as much damage as an exceptional high did in the 80's and 90's, and a large cyclone tears the ice pack apart, whereas in the past the pack was much better able to withstand one. Thinner ice has made the rate of decrease less dependent on which weather set-up dominates.
And so we are entering that period of the new melting season, where area and extent keep decreasing steadily, even if the weather isn't all that conducive to transport, melting and compaction. With some favourable weather conditions in coming weeks 2013 might even join the top years, or possibly even have a shot at 2012's title (more on that in the next ASI update). It would be a remarkable feat after the unusually slow start to the melting season.
Rob, instructive forward calculation from current sea ice concentration.
Note the slightest melt of snow drastically changes its albedo, as does the slightest film of water (incipient melt pond) on ice. Blue ice implies elastic scattering of mainly short wavelength light, not a big component of northern bottom-of-atmosphere solar blackbody radiation.
Sunlight effectively penetrates melt ponds and blue ice deep into the water below, accounting for the prolific photosynthetic diatom community on the bottom of the ice.
However, we noted a paper some time back that described significant capture of sunlight well within the ice. Warming the internal temperature profile is a bit of a game-changer, complicating the almighty heat equation considerably.
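The complication mentioned above can be sketched numerically: a 1-D heat equation for an ice slab with an internal shortwave source term following Beer-Lambert absorption. All physical values below are rough assumptions chosen only to illustrate the mechanism, not measured properties of sea ice:

```python
import numpy as np

# 1-D heat equation with internal solar absorption (explicit finite differences).
nz, dz, dt = 50, 0.04, 60.0           # 2 m ice slab, 4 cm cells, 60 s time steps
kappa = 1.2e-6                        # thermal diffusivity of ice, m2/s (approx.)
rho_c = 1.9e6                         # volumetric heat capacity, J/(m3 K) (approx.)
F0, mu = 50.0, 1.5                    # W/m2 entering the ice, extinction 1/m (assumed)

z = (np.arange(nz) + 0.5) * dz
Q = F0 * mu * np.exp(-mu * z)         # W/m3 absorbed at depth z (Beer-Lambert)

T = np.full(nz, -5.0)                 # start at -5 C throughout
for _ in range(1000):                 # ~17 hours of simulated time
    lap = np.zeros(nz)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    T = T + dt * (kappa * lap + Q / rho_c)
    T[0] = -5.0                       # surface held at a fixed temperature

print(round(float(T[5]), 3))          # interior has warmed above -5 C
```

The point of the game-changer remark: the source term Q(z) warms the slab from within, so the temperature profile no longer relaxes toward a simple conductive gradient between surface and bottom.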
Posted by: A-Team | July 08, 2013 at 16:15
Here is a time series of 10 GHz horizontal polarization on the descending limb of the orbit. The cross section of the Arctic is tilted slightly back.
As the time series accumulates as a stack of images, each slanted column of pixels provides the microwave emission history at that pixel, here for 14 Jun 13 to 04 Jul 13.
Had these been the 28 possible channels on a single day, this would have represented the spectroscopic signature of each pixel.
It is challenging to display time-dependent emission spectroscopy for more than a few sites at a time.
Posted by: A-Team | July 09, 2013 at 01:10
I love the stacked time series, A-team. Here are all 36 MODIS channels in their glory, from a single exposure about a week ago in the much-looked-at area of broken ice in tile r04c04 about 400 km from the pole.
I have slightly reordered the channels so they run in wavelength order. Channels 1-2 have 250m resolution, channels 3-7 500m resolution, and 8-36 have 1km resolution.
I reduced the animation to 1km pixel resolution to match all channels. There are important decisions in reducing from the 16-bit data to an 8-bit image as the dynamic range of every channel easily exceeds 8 bits. Most of the raw image histograms have a long tail of outlying bright pixels, so a simple rescaling/rebinning to 8 bits will result in a very dark image. Moreover the mean for each channel varies dramatically.
I chose to linearly rescale, preserving all minimum data (subject to binning of course) but with a crude maximum channel-by-channel cutoff of mean + 2 standard deviations, which in practice clips to white significantly less than the 2.5% of pixels a gaussian would since the distribution is so skewed.
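That clipping scheme can be sketched in a few lines of Python. The synthetic gamma-distributed input below merely stands in for a raw 16-bit MODIS channel with the kind of long bright tail described above:

```python
import numpy as np

def to8bit(img16):
    """Linearly rescale a 16-bit channel to 8 bits, preserving the minimum
    and clipping the bright tail at mean + 2 standard deviations."""
    img = img16.astype(np.float64)
    lo = img.min()
    hi = img.mean() + 2 * img.std()
    scaled = (img - lo) / max(hi - lo, 1) * 255
    return np.clip(scaled, 0, 255).astype(np.uint8)

# Synthetic skewed histogram: mostly dim pixels plus a long bright tail,
# mimicking a raw channel's distribution.
rng = np.random.default_rng(0)
img = rng.gamma(shape=2.0, scale=3000, size=(256, 256)).astype(np.uint16)
out = to8bit(img)
print(out.dtype, out.min(), out.max())
```

Because the distribution is so skewed, far fewer pixels land above mean + 2σ than the 2.5% a Gaussian would predict, which is exactly the effect noted above.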
Recall the usual visible RGB product is channels 143, with the combinations 367 and 127 of great use in distinguishing cloud from ice. Channel 5 and channels 8-36 are not used in any ordinary imagery product I've seen online though they are inputs to specialized higher-level products (e.g. vegetation maps).
You can see that not every channel is equally usable. In fact a number of the visible channels (e.g. 10-15) are almost completely overexposed. This is not an artifact of my bit reduction as the original 16-bit images also have most pixels nearly 65535. Perhaps these channels are extremely sensitive and will be usable in lower light, or maybe there's something wrong with them. In any case for true night visible images Suomi's VIIRS instrument (already putting out uncalibrated data) will be superior.
Anyone should please feel free to extract the channels as frames from the animation and suggest useful combination methods. Now for fun, here are a few spectral combinations I've tried:
true color 143:
367 visible + IR: (note my channel 7 image has some hot pixel problems)

126: my equivalent to the 127 combination:

156: making use of an intermediate IR channel that isn't part of any of the usual image products

1-2-6-20-26-32: I took the 126 image and then added portions of the latter three channels in cyan, magenta, and blue. 26 is confusingly in near IR between 2 & 6, but 20 & 32 go much deeper into IR. The additional information helps distinguish cloud heights from each other (cirrus, etc.)
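For anyone wanting to reproduce a simple composite like 367 or 143, stacking three co-registered 8-bit channels into an RGB image is nearly a one-liner. The channel arrays below are placeholders, not real MODIS data:

```python
import numpy as np

def false_color(r, g, b):
    """Stack three single-channel 8-bit images into an RGB composite,
    e.g. MODIS bands 3/6/7 for cloud vs. ice discrimination."""
    assert r.shape == g.shape == b.shape
    return np.dstack([r, g, b])

# Placeholder arrays standing in for co-registered, equally-sized bands.
h, w = 4, 4
ch3 = np.full((h, w), 200, dtype=np.uint8)
ch6 = np.full((h, w), 120, dtype=np.uint8)
ch7 = np.full((h, w), 40, dtype=np.uint8)
rgb = false_color(ch3, ch6, ch7)
print(rgb.shape)
```

The real work, of course, is upstream: matching resolutions and doing the bit reduction sensibly, as discussed above.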

Posted by: Dan P. | July 09, 2013 at 03:06
@ A-Team | July 08, 2013 at 01:49
"If well-mixed, the heat content of the deeper waters would overwhelm surface ice..."
I've noticed what seem to be curvilinear early melt features in the ice over the East Siberian Shelf which roughly follow ocean depth contours. The pressure/temperature response of methane hydrate decomposition means that at a constant temperature, there is a pressure (depth) below which the hydrate decomposes, and above which it is stable; the upper edge of a hydrate layer would therefore follow a depth contour. As the temperature increases, the hydrate stability depth increases, so the hydrate between the original edge of the hydrate layer and the new stability contour would decompose, causing the "bubble fountains" (observed by Semiletov and others) to align approximately along depth contours. These rising bubbles of methane will entrain bottom water, which is saltier, denser, but warmer, which could contribute to bottom melt of the ice. Higher-resolution ice maps and on-site observations would confirm this theory (SWAG?)
According to http://www.hindawi.com/journals/ace/2010/789547/, "Experimental Investigation of a Rectangular Airlift Pump", the mass flow ratio of liquid to gas is on the order of 100:1. This is ducted, not free flow, and working against a head created by raising the top of the duct above the surface of the entrained liquid, so the mass of sea water lifted per unit of methane evolved from hydrate decomposition is likely larger.
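A back-of-envelope version of that argument, with loudly illustrative numbers: the methane flux, the temperature contrast, and the reading of the ~100:1 figure as water lifted per unit of gas are all assumptions, not measurements.

```python
# Back-of-envelope entrainment-melt estimate. Every input is an assumption.
methane_flux = 1.0          # kg CH4 per m2 per year (assumed seep strength)
entrainment_ratio = 100.0   # kg water lifted per kg gas (reading of the ~100:1 figure)
dT = 1.0                    # K by which bottom water exceeds the melt point (assumed)
c_water = 3985.0            # J/(kg K), specific heat of seawater
L_fusion = 334e3            # J/kg, latent heat of melting ice

water_lifted = methane_flux * entrainment_ratio   # kg of water per m2 per year
heat = water_lifted * c_water * dT                # J per m2 per year delivered upward
ice_melted = heat / L_fusion                      # kg of ice melted per m2 per year
print(round(water_lifted), round(heat), round(ice_melted, 2))
```

Even with these generous inputs the bottom melt is on the order of a kilogram of ice per square metre per year, which is consistent with jdallen's point below that the effect would be localized at best.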
Posted by: Bdwo | July 09, 2013 at 08:22
@bdwo
As a localized phenomenon, maybe. The mass of water and the volume of gas required to move it would be enormous. I would also expect the entrainment to simply get lost in the existing flow, as little as it might be. I expect the far greater effect would actually be from the prompt greenhouse effect caused by the release of that much CH4 into the high Arctic.
Posted by: jdallen_wa | July 09, 2013 at 09:43
Amazing work, Dan P. Animations are a good way to distribute a set of files as a single convenient download.
These data raise a great many questions -- why do they have so many channels with no effective public distribution, why do so many look defunct, why not better management of contrast, is staff aware/care, etc.
It troubles me that you are able to make all these new Arctic Ocean products -- time-consuming and way beyond the skill set of the average Arctic scientist -- yet the agency is not providing them despite a large paid staff, decades of experience with swath stitching algorithms, easy pipeline setup, and unlimited cheap server storage.
I have the impression overall with many earth observation satellites that once the instrument is built and launched and the data is streaming down to the archive, the job is considered done -- there is little interest in its content or anything beyond pro forma distribution.
Looking at frames 11-18, these appear way over-exposed for visible but do seem to pick out open water. However because any end-user could change the contrast later themselves, it seems like information has been discarded too early on.
At least the frames here come in a simple grayscale palette (though was 16 bit ever meaningfully utilized?), allowing you to combine them in various ways for false color. (Didn't quite understand the base color space here: "1-2-6-20-26-32: took 1-2-6 image and added cyan, magenta, and blue.")
On Jaxa, the grayscale is provided only for 3 of the 28 channels. The rest are this crazy temperature palette which does not drop down to a monotonic grayscale (which it could easily have done).
I don't see any way of fixing it short of replacing each of 100 colors one at a time with a macro x 28 images per day x 30 days a month x number of users.
Ironically, they are sitting on a palette-conversion script that got them out of grayscale to begin with. They then tossed the grayscale or declined to make that folder available.
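If one does have the legend colours and the data values they encode, undoing such a palette is mechanical: nearest-colour lookup per pixel. A sketch with a toy three-colour legend and hypothetical values (a real JAXA palette would have ~100 entries, but the method is the same):

```python
import numpy as np

def palette_to_gray(rgb_img, palette, values):
    """Map each pixel of a palette-coloured image back to its data value
    by nearest-colour lookup. `palette` is an (N, 3) array of legend
    colours; `values` holds the data value each colour encodes."""
    flat = rgb_img.reshape(-1, 3).astype(np.int32)
    pal = palette.astype(np.int32)
    # Squared colour distance from every pixel to every palette entry.
    d = ((flat[:, None, :] - pal[None, :, :]) ** 2).sum(axis=2)
    idx = d.argmin(axis=1)
    return values[idx].reshape(rgb_img.shape[:2])

# Toy legend: blue encodes 0, green 50, red 100 (hypothetical values).
palette = np.array([[0, 0, 255], [0, 255, 0], [255, 0, 0]])
values = np.array([0.0, 50.0, 100.0])
img = np.array([[[0, 0, 250], [250, 5, 5]]], dtype=np.uint8)  # 1x2 test image
gray = palette_to_gray(img, palette, values)
print(gray)
```

Nearest-colour matching also tolerates the slight off-palette colours that JPEG compression introduces, which an exact 100-colour replacement macro would not.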
Posted by: A-Team | July 09, 2013 at 16:26
Interesting train of thought, Bdwo/jdallen.
Not to forget the Siberian rivers emptying waters into the Laptev and Kara that, while barely over 0°C, are still well over the freezing temperature of sea water, -1.85°C or so.
The methane would also be quite warm from the earth's geothermal gradient, though that heat content might be small relative to waters that the bubbles entrain.
I'm recalling from the Laptev studies that the clathrates are primarily exposed to sea water at the break in the continental shelf. Otherwise, geological fractures, Lena delta, and unevenly distributed melt-through features that more or less existed at the time of Holocene flooding.
They spoke of bubble zones a km in width but I don't recall seeing their lat/long supplied. I've looked for them several times with image enhancement but not found anything to date. The resolution is at hand only for Modis visible.
Distinct steady-state excesses of local methane in the lower atmosphere seem like they would lead to a locally enhanced greenhouse effect, with the 103x potency over carbon dioxide being applicable.
So the potential is there for significant positive feedback of melt. Each contributing factor needs seasonal quantification however.
Posted by: A-Team | July 09, 2013 at 16:47
Regarding the images, if I was in charge of a US agency, I would distribute raw data and nothing else.
My rationale is:
1. Raw data is the purest we have for scientific work. I don't know how pure that is, actually, as I assume it requires some processing from satellite to file system. But still, there is value in providing it.
2. In this climate of GOP anti-science, putting out a product that retouches data and makes global warming and Arctic melt more visible is more than my job is worth. It doesn't matter how innocent or valuable the transformation is; it is a transformation and therefore 'tampering'.
3. There is a practical problem of which transformations to use, etc. If I am like us here on the site ['us' used very, very, very loosely], I still want to know what transformations were used so I can understand artifacts etc. So maybe the rationale is that it is better to do nothing.
As an amateur however, I really appreciate the work you all do (A-Team especially singled out).
Posted by: Fufufunknknk | July 09, 2013 at 17:36
A-team:
1. The original images were all 16-bit, and in general made fair use of that dynamic range. A typical histogram had a minimum of about 50, a median around 2000-10000, and a long tail of brighter pixels. My linear rescaling to 8 bits doesn't always do justice to the contrast of features we might be interested in, but after a fair bit of experimentation I realized it would take a lot of effort to come up with a better generic solution for display. Often it is both the low and the high brightness features that need better contrast (obscured icy water in IR channels and cloud tops), which requires splitting the range into different functional forms.
2. For the apparently useless channels 11-18, as I said the problem is already there in the 16-bit file, although it is true a carefully optimized palette could extract a bit more contrast. But essentially to the extent the problem was information being discarded too early (rather than detector faults), it was by having the gain set too high for that sensor pipeline.
3. My colorspaces for all but the last one are direct RGB combination. The 1-2-6-20-26-32 was just an experiment that I expect to simplify and improve on, so I can only give you an approximate scheme:
(ch_1)R + (ch_2)G + (ch_6)B + .37(ch_20)M + .44(ch_26)C + .44(ch_32)B
I don't have a lot of expertise on color spaces, so the way I'm thinking about it is that I have a fundamental 3-D basis with some alternate conventional basis vectors. I can get away with adding extra channels above 3 because the eye can use spatial information to separate out coherent channels even if their color information overlaps, but it still seems like it would be better to keep the # of channels to no more than 4 or 5.
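For concreteness, one way to read that scheme in code: magenta adds to the R and B channels, cyan to G and B. The weights are the ones quoted above; everything else (placeholder channel arrays, the clipping choice) is an assumption:

```python
import numpy as np

def six_channel_composite(c1, c2, c6, c20, c26, c32):
    """Render the 1-2-6-20-26-32 scheme quoted above: base RGB from
    channels 1/2/6, then channel 20 added in magenta (R+B), channel 26
    in cyan (G+B), channel 32 in blue."""
    chans = [np.asarray(c, dtype=np.float64) for c in (c1, c2, c6, c20, c26, c32)]
    c1, c2, c6, c20, c26, c32 = chans
    r = c1 + 0.37 * c20
    g = c2 + 0.44 * c26
    b = c6 + 0.37 * c20 + 0.44 * c26 + 0.44 * c32
    rgb = np.dstack([r, g, b])
    return np.clip(rgb, 0, 255).astype(np.uint8)   # clip overflow (a choice)

# Tiny placeholder channels; real inputs would be co-registered 8-bit bands.
z = np.zeros((2, 2))
full = np.full((2, 2), 100.0)
out = six_channel_composite(full, z, z, z, z, full)
print(out[0, 0])
```

Clipping after summation is one arbitrary way to handle the overflow that adding six channels into three invites; renormalizing by the total weight per channel would be the other obvious option.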
Fufufunknknk: All of your points have some reason to them but I still lean towards "it would be a lot better if someone professional were doing this work and making it available rather than me". The answer to #2 is basically "haters gonna' hate", so you shouldn't expect policy to respond to whatever idiocies they spout.
As for #1 and #3, they make a strong case for having the raw data available *in addition* to any higher level products, with careful descriptions of the methods used to get them. I believe NASA does well in this regard for many of the products they produce, but there is a disconnect that A-team has been stressing when it comes to easily-readable imagery.
In many cases this imagery seems treated as a display product only (despite the fact that its universality generally means it will be an important input for a broader range of science than any other data product). I have a milder point of view than A-team about the persons involved in the production, even if I'm dissatisfied with the results of the overall system. I've run into comments in documentation several times with statements like "we have schemes for alternate methods of presentation and are seeking funding to implement them", written years ago, like a little cry for help from the data processing factory.
Posted by: Dan P. | July 09, 2013 at 21:19
Reacting to entirely hypothetical intimidation on something this esoteric makes no sense -- it would have to be challenged on scientific grounds, since no interpretive component is being provided.
But that makes no sense either, as long as they documented exactly what was done to the image (good luck with that). Then the end user knows exactly what they have (I've yet to encounter this).
If someone wants to go back to square one and do it differently, they are free to do so. In fact, for Modis they are welcome to mount an antenna on the roof of their home and receive the data stream directly -- it is all described on the internet.
There is no such thing as raw data. Major product-affecting decisions were already baked into sensor electronics and later in setting the gain (here probably set for vegetation rather than white on white), making corrections for orbit and angle, on and on. Usually they have a longish series of processing stages called levels.
The same was true of film and what happened afterward in the darkroom. Did Ansel Adams take his film to Kinko's to be developed? No. A goodly portion of what goes on in Gimp/PS is just digital darkroom: invert, grain merge, dodge, burn, screen, etc.
I am very familiar with the norms across the physical sciences. For starters, 99.9% of academic scientists are not going to work with arcane data formats like HDF. The image providers are falling way short of acceptable practices here -- in documentation, server products and interfacing with end-users.
When you take the public's money for salary and project grants, that triggers an obligation. If you don't want the obligation, don't take the money. They're paid for a 40 hour week, 50 weeks a year. I'm talking a half hour of common sense to get beyond these nuisance display products.
We are well into a major crisis with climate change -- with the Arctic leading the pack. It is high time for these satellite data providers to get it in gear. If they do their bit, I can do my bit, and the next person can do their bit.
So don't ask why I am wanting to light a burner under these folks, ask instead why you are not.
Posted by: A-Team | July 09, 2013 at 22:55
A-Team, there are quite a few scientists dropping in here. How about writing a (short) guest blog about your qualms with satellite data providers? Maybe they're just not aware of what the public would like to see, or what 'we' would like to see, as the link between the public and them.
Posted by: Neven | July 10, 2013 at 00:27
Neven, let me think about that, whether it could move the needle. It's a little late for the Arctic but might benefit Greenland, tundra, Antarctic or whatever comes next.
The situation with practical availability of remote sensing imagery is very similar to what happened with the human genome project. Except there, the problem was fixed on day one.
A string of letters 3 billion long is already a very large file. Using just four letters, it is exceedingly uninformative to behold. Add another for each person with a genetic condition and many species of vertebrates for comparison, you're at many terabytes of intractable raw data.
To assemble a mammalian genome ab initio from trace reads and mark up features takes about six weeks of computing time using 2000 fast processors running in parallel. At that time, the end-user community consisted of computer-illiterate MDs and molecular biologists who could barely send an email or open a pdf.
So there was a tremendous disconnect between the geeks who could process these files but didn't know a nucleotide from a nucleolus, and a diverse biomedical side which needed to process patients using all available information.
One option, which the geeks favored, was for the biomedical community to take a five year leave of absence and damn it, learn computer programming. Meanwhile the biomedical community controlled the grant money; their preferred option was a lab geek who would follow orders, like their nurse or technician.
In the end though, they ended up collaborating on a graphical user interface called a genome browser. Two million lines of code, tens of thousands of back-and-forths about what worked and what didn't. Intuitive and self-explanatory in the end; nobody had to learn a thing.
What we have in Arctic imagery is exactly parallel to this: satellite specialists on one side, all-over-the-map scientists on the other. A dozen or so interfaces, mostly beta but also alpha, all built without a single back-and-forth communication.
The net effect is exclusionary and ineffectual. We just cannot afford this.
Posted by: A-Team | July 10, 2013 at 00:47
On 16-bit data, it is not going to show up on the monitor or in print; even if it did, your eye can hardly distinguish the 16 million colors in 3-channel 8-bit now.
On satellite data, it really has to do with round-off binning error upon subsequent manipulations -- pixellated images with degraded scientific content. Fine, the satellite center is well positioned to do initial rounds of manipulations, then hand off desktop 8-bit.
That seems to discard meaningful data on the scientific side, yet 16-bit has no significance for concentration/area/extent/thickness where the end result has substantial (several percent minimally) error and spotty validation of that even. And at the end of the day, to share or publish a product image, you're going to dumb it down to 8-bit.
What does it accomplish to see the same old obscuring cloud at ever greater bit depth? We need better cloud penetration and better digital removal.
Gimp is going to 32-bit resolution by year-end. That will retain scientific accuracy for extreme sequential non-linear manipulations but these aren't often motivated.
We can do an awful lot of rapid hypothesis exploration at 8-bit and always go back later to repeat on 16-bit as warranted. Retaining more significant digits than you actually need vis-a-vis final product error just adds to the overhead.
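The round-off concern is easy to demonstrate numerically: re-quantizing to 8 bits after every manipulation drifts away from the same pipeline run once in floating point. A small sketch (the contrast-stretch coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=10000).astype(np.float64)

# Ten mild contrast stretches, re-quantizing to 8 bits after each one --
# the "round-off binning error upon subsequent manipulations" above.
q = img.copy()
for _ in range(10):
    q = np.clip(np.round(q * 1.03 - 3.6), 0, 255)

# The same composed transform applied once, without intermediate rounding.
exact = np.clip(img * 1.03**10 - 3.6 * sum(1.03**k for k in range(10)), 0, 255)

drift = float(np.abs(q - exact).mean())
print(drift)   # nonzero: error accumulated purely from 8-bit rounding
```

The drift is real but sub-pixel-value in size, which is the point being made: for rapid hypothesis exploration it rarely matters, and one can always redo the chain at 16-bit when it does.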
What we'd really like is 2x better resolution on the ground, even if with 8-bit pixels. That means, for microwave, launching a larger antenna. Not going to happen in the time left for the Arctic ice.
For visible, Modis ground resolution is probably already excessive. The Arctic Ocean is large, with file size growing as the square of pixel size, an issue already at 8-bit 250m. None of this is getting us any closer to actual properties of the ice that matter, be it slush or tensile strength.
Posted by: A-Team | July 10, 2013 at 02:28
I was trying to discuss rationales, not justify the lack of product, although my statement can certainly be read like that in hindsight.
I actually do some work with computer vision, although not on satellite images, and I am familiar with the problems with datasets and different formats. I am not familiar with satellite formats; the only satellite data I have ever worked with are GIS topo maps and so on, but on a small enough scale that I could conveniently avoid problems with projections and, more importantly, data for which I had a ground truth, so that I could pretty much plug it into GRASS or whatever (e.g. QGIS) and do my import relatively automatically.
Maybe what is needed is a standard/specification, or three or four standards, to be pushed so that the agencies have a target that is independent of their internal situation. Maybe it is just a matter of convincing them that there is a need for this product.
However, I yield to your opinion because I have no idea what I am talking about, and I repeat my thanks for your and others' work.
Posted by: Fufufunknknk | July 10, 2013 at 09:37
In trying to figure out why the Arctic appears to be cooler this summer than last year, with more lows than highs, I ended up staring at these graphs from NOAA of the geopotential height (GPH) over the Northern Hemisphere.
This is 2013 so far :
http://www.cpc.ncep.noaa.gov/products/stratosphere/strat-trop/gif_files/time_pres_HGT_ANOM_ALL_NH_2013.gif
And this is 2012 :
http://www.cpc.ncep.noaa.gov/products/stratosphere/strat-trop/gif_files/time_pres_HGT_ANOM_ALL_NH_2012.gif
Now, it is apparent that in 2012 highs dominated the surface, while in 2013 low pressure dominates at the surface.
I checked back a couple of years on the main site
http://www.cpc.ncep.noaa.gov/products/stratosphere/strat-trop/
and it seems to me that 2013 is an exception to the rule.
Normally, high or low pressure from the mesosphere tends to propagate down through the stratosphere in a couple of months' time, but this year it seems that the high-pressure zone (around 10 mb) just sits there and does not come down to the troposphere. In fact, it seems that throughout the Northern Hemisphere, lows dominate the surface this year, while high pressure is stuck between 150-5 mb.
I am very much unskilled in stratosphere-troposphere interaction, so the question I have is: why?
Why does a lump of high pressure just sit there in the lower stratosphere and not disperse down into the lows of the troposphere, while it does in other years?
Posted by: Rob Dekker | July 10, 2013 at 11:21
In fact, what has been going on since autumn last year is a restructuring of the three NH cells. The gradient between them is fading. The behavior of the Polar Jet Stream is directly related to that.
The geopotential difference is lower, meaning that the height of the atmospheric pressure zones, especially over the Arctic and boreal zones, has gone up.
Mind, the difference is in the tens of meters, 100-150 at most. But still. And the main effect is noticeable at the 500-300 mb level.
It is therefore no surprise that a small but important pressure anomaly is measured in the higher atmosphere. In this light, it is interesting to read back what, for instance, Wayne Davidson has put on his Eh2r blog this spring. Check 'adiabatic' processes. He seems to link it to thinner ice. I tend to see more relations with 'intercellular' teleconnections (yes, my math is bad, but hear my words…).
On another track, I think Chris Reynolds is also busy getting to grips with the obvious pattern change we see this season.
I’m very eager to see whether we can fit it all together, which to me seems of more general importance than daily nitty-gritty over SIE/SIA numbers.
I still have the opinion that these changes may for the moment be more pronounced in the mid-latitudes than in the Arctic. There are some consequences that have led to different behavior of the spring melt. It doesn’t mean the whole season will end in ‘recovery’. It might, on the other hand, indicate the start of the ‘longer than expected’ tail.
BTW I still hold my position on this year, finally, appearing to be front-runner. When the scientific sensors ‘come to their senses…’.
Posted by: Werther | July 10, 2013 at 14:40
Werther,
re: NH cells. This is, in my opinion, the largest effect of climate change. The weakening of the polar jet indicates a gradual merging of the Ferrel and polar cells. It is this restructuring of global weather patterns that will produce the most immediate impacts on society and on humanity's food production capabilities, through increases in droughts in current food-producing regions and floods in current population centers.
What we are also seeing in this regime of climate change is a gradual strengthening of the Hadley cell as tropical evaporation increases, which is driving desertification. The northward expansion of the 30°N desert belt will also significantly impact grain and livestock production.
Studies of these effects and predictions of their impacts are in the peer-reviewed record produced over the last 20 years. The fact that we have done nothing to act on the impacts that have been observed indicates that only a "breakthrough" event will produce real motivation for change. The longer that we wait, the more painful climate change will be and the more difficult it will be to fix it. In fact, we are approaching a point where it is becoming very likely that modernity will not survive the next 65 years.
With the Arctic ice pack being the "canary in the coal mine" and the most effective messenger of these developing threats, it is important that we more fully understand and are able to explain what is happening in the Arctic today.
That is why I monitor these blogs, and sincerely appreciate the work that is being done here.
http://onlinelibrary.wiley.com/doi/10.1029/2010JD015197/abstract
"poleward shift of the subtropical dry zones (up to 2° decade⁻¹ in June-July-August (JJA)) in the Northern Hemisphere"
Posted by: Jai Mitchell | July 10, 2013 at 17:19
Werther,
Thanks, but I'm not sure I understand this logic.
Why would the height of the atmospheric pressure zones, especially over the Arctic, go up if the gradient between the NH cells fades?
What I understand is that in 2012 the gradient between the NH cells was low (and the polar vortex was weak), and that is why we had a warm Arctic.
So why would 2013 show the opposite?
Posted by: Rob Dekker | July 11, 2013 at 08:45
Posted by: Artful Dodger | July 11, 2013 at 10:22
Now that we've entered the second half of the melting season, I'm having more and more concerns about the ice in the Central Basin in the aftermath of PAC 2013.
Especially the area of r04c04 frame on Modis (a huge chunk of the Central Basin from the North Pole almost to the Siberian Coast) was affected by the persistent lows raging through much of May and June.
Although A-team as well as DanP have done some great work with different wavelength channels in that frame, individual images are still often obscured by clouds, and the situation as a whole remains somewhat unclear.
Our eyes have a great ability to recognize patterns in multiple images shown in quick succession, even when each individual image is partially obscured. So, with one image per day from Terra/MODIS, I thought it would be interesting to make an animation of frame r04c04 showing 3 days per second, running over the past 3 months.
This way, we'll see frame r04c04 before, during and after PAC2013.
Here is the result (day 120 - today). Click the image to download the animation:
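For anyone who wants to reproduce this kind of animation from daily tiles, here's a minimal sketch of the bookkeeping: listing one filename per satellite per day and computing the clip length from the frame rate. The `Arctic_r04c04.YYYYDDD.sat.jpg` filename pattern and both function names are illustrative assumptions, not NASA's actual naming scheme; the frames themselves would still have to be downloaded and stitched together with an animation tool.

```python
def frame_list(year, first_doy, last_doy, satellites=("terra",)):
    """Filenames for the r04c04 tile, one per satellite per day of year.
    The filename pattern here is illustrative only."""
    return [f"Arctic_r04c04.{year}{doy:03d}.{sat}.jpg"
            for doy in range(first_doy, last_doy + 1)
            for sat in satellites]

def clip_seconds(n_frames, fps):
    """Playback length of the animation at a given frame rate."""
    return n_frames / fps

# Day 120 through day 192 (roughly May 1 - July 11) at 3 frames
# per second, Terra only:
frames = frame_list(2013, 120, 192)
length = clip_seconds(len(frames), 3)
```

Adding Aqua via `satellites=("terra", "aqua")` doubles the frame count, which is why doubling the frame rate keeps the clip the same length while feeding the eye twice the data per day.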
I'm not sure what you guys think, but I don't see how any of that cottage cheese and low-albedo slush can possibly make it through July melting and even minor August storms, let alone survive until September...
Posted by: Rob Dekker | July 11, 2013 at 21:30
Thanks for presenting it like that, Rob.
Yet another way of getting to the quality ... which is exceptional.
Posted by: Werther | July 11, 2013 at 22:07
Thanks Werther. PAC 2013 was not easy on the Arctic Basin ice, no?
I updated the animation, combining images from BOTH Terra and Aqua, so we now have two images per day. I also doubled the frame rate (to 6 images per second) so our eyes have more data from which to extract the image of the ice cover through the clouds.
It's amazing how you can see the ice being obliterated by PAC 2013, and then attacked by late June and early July sunlight. It's melting in front of our eyes....
Posted by: Rob Dekker | July 11, 2013 at 23:56