



Rob, instructive forward calculation from current sea ice concentration.

Note the slightest melt of snow drastically changes its albedo, as does the slightest film of water (incipient melt pond) on ice. Blue ice implies elastic scattering of mainly short wavelength light, not a big component of northern bottom-of-atmosphere solar blackbody radiation.

Sunlight effectively penetrates melt ponds and blue ice deep into the water below, accounting for the prolific photosynthetic diatom community on the bottom of the ice.

However, we noted a paper some time back that described significant capture of sunlight well within the ice. Warming the internal temperature profile is a bit of a game-changer, complicating the almighty heat equation considerably.


Here is a time series of 10 GHz horizontal polarization on the descending limb of the orbit. The cross section of the Arctic is tilted slightly back.

As the time series accumulates as a stack of images, each slanted column of pixels provides the microwave emission history at that pixel, here for 14 Jun 13 to 04 Jul 13.
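Stacking daily images into a 3-D array makes this kind of per-pixel emission history trivial to extract. A minimal numpy sketch -- the array contents and shapes here are synthetic stand-ins, not real microwave data:

```python
import numpy as np

# Stack of daily brightness images: axis order (time, rows, cols).
# Synthetic stand-in for 21 days (14 Jun - 04 Jul) of 10 GHz frames.
days, rows, cols = 21, 50, 50
rng = np.random.default_rng(1)
stack = rng.integers(150, 270, size=(days, rows, cols))

# The emission history of a single pixel is just a column through time:
r, c = 10, 20
history = stack[:, r, c]        # shape (21,)

# ...and one image row through time gives a 2-D time-vs-longitude slice,
# the "slanted column of pixels" effect described above:
slant = stack[:, r, :]          # shape (21, 50)
```

The same indexing works whether the third axis is time (a history) or channel (a spectroscopic signature), which is why the two displays look so similar.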

Had these been the 28 possible channels on a single day, this would have represented the spectroscopic signature of each pixel.

It is challenging to display time-dependent emission spectroscopy for more than a few sites at a time.

[animated GIF: dliceDiceslant2_zpsdbff9b7d.gif]

Dan P.

I love the stacked time series, A-team. Here are all 36 MODIS channels in their glory, from a single exposure about a week ago in the much-looked-at area of broken ice in tile r04c04 about 400 km from the pole.

I have slightly reordered the channels by wavelength. Channels 1-2 have 250 m resolution, channels 3-7 have 500 m resolution, and channels 8-36 have 1 km resolution.

I reduced the animation to 1km pixel resolution to match all channels. There are important decisions in reducing from the 16-bit data to an 8-bit image as the dynamic range of every channel easily exceeds 8 bits. Most of the raw image histograms have a long tail of outlying bright pixels, so a simple rescaling/rebinning to 8 bits will result in a very dark image. Moreover the mean for each channel varies dramatically.

I chose to linearly rescale, preserving all minimum data (subject to binning of course) but with a crude maximum channel-by-channel cutoff of mean + 2 standard deviations, which in practice clips to white significantly less than the 2.5% of pixels a gaussian would since the distribution is so skewed.
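That rescaling scheme can be sketched in a few lines of numpy. This is illustrative only -- the function name and the toy skewed image are made up, not the actual processing code used:

```python
import numpy as np

def rescale_16_to_8(img16):
    """Linearly rescale a 16-bit channel to 8 bits, preserving the
    minimum but clipping bright outliers at mean + 2 std deviations,
    as described above."""
    lo = float(img16.min())
    hi = img16.mean() + 2.0 * img16.std()
    clipped = np.clip(img16, lo, hi)
    scaled = (clipped - lo) / max(hi - lo, 1.0)   # guard constant images
    return (scaled * 255).astype(np.uint8)

# toy example: a skewed distribution with a long bright tail,
# mimicking the raw-channel histograms described in the text
rng = np.random.default_rng(0)
img = rng.gamma(2.0, 2000.0, size=(64, 64)).astype(np.uint16)
img8 = rescale_16_to_8(img)
```

Because the distribution is so right-skewed, far fewer than 2.5% of pixels actually land above the cutoff and clip to white, just as noted above.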

Recall the usual visible RGB product is channels 143, with the combinations 367 and 127 of great use in distinguishing cloud from ice. Channel 5 and channels 8-36 are not used in any ordinary imagery product I've seen online though they are inputs to specialized higher-level products (e.g. vegetation maps).

You can see that not every channel is equally usable. In fact a number of the visible channels (e.g. 10-15) are almost completely overexposed. This is not an artifact of my bit reduction as the original 16-bit images also have most pixels nearly 65535. Perhaps these channels are extremely sensitive and will be usable in lower light, or maybe there's something wrong with them. In any case for true night visible images Suomi's VIIRS instrument (already putting out uncalibrated data) will be superior.

Anyone should please feel free to extract the channels as frames from the animation and suggest useful combination methods. Now for fun, here are a few spectral combinations I've tried:

true color 143:

367 visible + IR: (note my channel 7 image has some hot pixel problems)

126: my equivalent to the 127 combination:

156: making use of an intermediate IR channel that isn't part of any of the usual image products

1-2-6-20-26-32: I took the 126 image and then added portions of the latter three channels in cyan, magenta, and blue. 26 is confusingly in near IR between 2 & 6, but 20 & 32 go much deeper into IR. The additional information helps distinguish cloud heights from each other (cirrus, etc.)


@ A-Team | July 08, 2013 at 01:49
"If well-mixed, the heat content of the deeper waters would overwhelm surface ice..."

I've noticed what seem to be curvilinear early melt features in the ice over the East Siberian Shelf which roughly follow ocean depth contours. The pressure/temperature response of methane hydrate decomposition means that at a constant temperature, there is a pressure (depth) below which the hydrate decomposes, and above which it is stable; the upper edge of a hydrate layer would therefore follow a depth contour. As the temperature increases, the hydrate stability depth increases, so the hydrate between the original edge of the hydrate layer and the new stability contour would decompose, causing the "bubble fountains" (observed by Semiletov and others) to align approximately along depth contours. These rising bubbles of methane will entrain bottom water, which is saltier, denser, but warmer, and could thereby contribute to bottom melt of the ice. Higher-resolution ice maps and on-site observations would confirm this theory (SWAG?)

According to http://www.hindawi.com/journals/ace/2010/789547/, "Experimental Investigation of a Rectangular Airlift Pump", the mass flow ratio of entrained liquid to gas is on the order of 100:1. That setup is ducted, not free flow, and works against a head created by raising the top of the duct above the surface of the entrained liquid, so the mass of sea water lifted per unit of methane evolved from hydrate decomposition is likely even larger.



As a localized phenomenon, maybe. The mass of water, and the volume of gas required to move it, would be enormous. I would also expect the entrainment to simply get lost in the existing flow, as little as it might be. I expect the far greater effect would actually be the prompt greenhouse effect caused by the release of that much CH4 into the high Arctic.


Amazing work, Dan P. Animations are a good way to distribute a set of files as a single convenient download.

These data raise a great many questions -- why are there so many channels with no effective public distribution, why do so many look defunct, why is there no better management of contrast, is staff aware and do they care, etc.

It troubles me that you are able to make all these new Arctic Ocean products -- time-consuming and way beyond the skill set of the average Arctic scientist -- yet the agency is not providing them despite a large paid staff, decades of experience with swath stitching algorithms, easy pipeline setup, and unlimited cheap server storage.

I have the impression overall with many earth observation satellites that once the instrument is built and launched and the data is streaming down to the archive, the job is considered done -- there is little interest in its content or anything beyond pro forma distribution.

Looking at frames 11-18, these appear way over-exposed for visible but do seem to pick out open water. However because any end-user could change the contrast later themselves, it seems like information has been discarded too early on.

At least the frames here come in a simple grayscale palette (though was 16 bit ever meaningfully utilized?), allowing you to combine them in various ways for false color. (Didn't quite understand the base color space here: "1-2-6-20-26-32: took 1-2-6 image and added cyan, magenta, and blue.")

On Jaxa, the grayscale is provided only for 3 of the 28 channels. The rest are this crazy temperature palette which does not drop down to a monotonic grayscale (which it could easily have done).

I don't see any way of fixing it short of replacing each of 100 colors one at a time with a macro x 28 images per day x 30 days a month x number of users.

Ironically, they are sitting on a palette-conversion script that got them out of grayscale to begin with. They then tossed the grayscale or declined to make that folder available.
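For what it's worth, if the palette is known the conversion can be undone programmatically by inverting the palette lookup, pixel by pixel. A numpy sketch, assuming a known one-to-one palette -- the toy 8-color ramp and the value range here are made up:

```python
import numpy as np

# A toy "temperature" palette: index i -> RGB color.
palette = np.array([[0, 0, 128], [0, 0, 255], [0, 255, 255], [0, 255, 0],
                    [255, 255, 0], [255, 128, 0], [255, 0, 0], [128, 0, 0]],
                   dtype=np.uint8)

# The physical value each palette entry stands for (say, kelvin):
values = np.linspace(180, 270, len(palette))

def depalettize(rgb_img, palette, values):
    """Map each RGB pixel back to its nearest palette index, then to
    the underlying value -- recovering a monotonic grayscale."""
    # squared distance from every pixel to every palette color
    d = ((rgb_img[..., None, :].astype(int)
          - palette[None, None, :, :].astype(int)) ** 2).sum(-1)
    idx = d.argmin(-1)
    return values[idx]

# toy image painted from palette indices, then recovered
idx = np.array([[0, 3], [5, 7]])
rgb = palette[idx]
gray = depalettize(rgb, palette, values)
```

The catch, of course, is knowing the exact palette they used; nearest-color matching at least tolerates slight compression artifacts.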


Interesting train of thought, Bdwo/jdallen.

Not to forget the Siberian rivers emptying waters into the Laptev and Kara that, while barely over 0 °C, are still well over the freezing temperature of sea water, -1.85 °C or so.

The methane would also be quite warm from the earth's geothermal gradient, though that heat content might be small relative to waters that the bubbles entrain.

I'm recalling from the Laptev studies that the clathrates are primarily exposed to sea water at the break in the continental shelf. Otherwise, geological fractures, Lena delta, and unevenly distributed melt-through features that more or less existed at the time of Holocene flooding.

They spoke of bubble zones a km in width but I don't recall seeing their lat/long supplied. I've looked for them several times with image enhancement but not found anything to date. The resolution is at hand only for Modis visible.

Distinct steady-state excesses of local methane in the lower atmosphere seem like they would lead to a locally enhanced greenhouse effect, with the 103x over carbon dioxide being applicable.

So the potential is there for significant positive feedback of melt. Each contributing factor needs seasonal quantification however.


Regarding the images, if I was in charge of a US agency, I would distribute raw data and nothing else.

My rationale is:

1. Raw data is the purest we have for scientific work. I don't know how pure that is, actually, as I assume it requires some processing from satellite to file system. But still, there is value in providing it.

2. In this climate of GOP anti-science, putting out a product that retouches data and makes global warming and Arctic melt more visible is more than my job is worth. It doesn't matter how innocent or valuable the transformation is; it is a transformation and therefore 'tampering'.

3. There is a practical problem of which transformations to use, etc. If I am like us here on the site ['us' used very, very, very loosely], I still want to know what transformations were used so I can understand artifacts etc. So maybe the rationale is that it is better to do nothing.

As an amateur however, I really appreciate the work you all do (A-Team especially singled out).

Dan P.

1. The original images were all 16-bit, and in general made fair use of that dynamic range. A typical histogram had a minimum about 50, median around 2000-10000, and a long tail of brighter pixels. My linear rescaling to 8 bits doesn't always do justice to the contrast of features we might be interested in, but after a fair bit of experimentation I realized it would take a lot of effort to come up with a better generic solution for display. Often it is both the low and the high brightness features that need better contrast (obscured icy water in IR channels and cloud tops), which requires splitting the range into different functional forms.

2. For the apparently useless channels 11-18, as I said the problem is already there in the 16-bit file, although it is true a carefully optimized palette could extract a bit more contrast. But essentially to the extent the problem was information being discarded too early (rather than detector faults), it was by having the gain set too high for that sensor pipeline.

3. My colorspaces for all but the last one are direct RGB combination. The 1-2-6-20-26-32 was just an experiment that I expect to simplify and improve on, so I can only give you an approximate scheme:

(ch_1)R + (ch_2)G + (ch_6)B + .37(ch_20)M + .44(ch_26)C + .44(ch_32)B

I don't have a lot of expertise on color spaces, so the way I'm thinking about it is that I have a fundamental 3-D basis with some alternate conventional basis vectors. I can get away with adding extra channels above 3 because the eye can use spatial information to separate out coherent channels even if their color information overlaps, but it still seems like it would be better to keep the # of channels to no more than 4 or 5.
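The additive mixing described above can be sketched in numpy. The channel names, weights, and random stand-in bands below are illustrative, not the actual MODIS data or the exact scheme used:

```python
import numpy as np

def combine(channels, weights):
    """Additively mix single-band 8-bit images into one RGB image.
    `weights` maps channel name -> (w_r, w_g, w_b) contribution."""
    h, w = next(iter(channels.values())).shape
    out = np.zeros((h, w, 3))
    for name, (wr, wg, wb) in weights.items():
        band = channels[name].astype(float)
        out += band[..., None] * np.array([wr, wg, wb])
    return np.clip(out, 0, 255).astype(np.uint8)

# illustrative: channels 1, 2, 6 as straight R, G, B, plus channel 20
# in magenta (R+B) and channel 26 in cyan (G+B), echoing the scheme above
rng = np.random.default_rng(2)
ch = {k: rng.integers(0, 200, (32, 32), dtype=np.uint8)
      for k in ("c1", "c2", "c6", "c20", "c26")}
rgb = combine(ch, {"c1": (1, 0, 0), "c2": (0, 1, 0), "c6": (0, 0, 1),
                   "c20": (0.37, 0, 0.37), "c26": (0, 0.44, 0.44)})
```

Since everything collapses into three display primaries anyway, each extra channel beyond three overlaps in color space -- which is exactly why the eye's spatial coherence has to do the rest of the separating.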

Fufufunknknk: All of your points have some reason to them but I still lean towards "it would be a lot better if someone professional were doing this work and making it available rather than me". The answer to #2 is basically "haters gonna' hate", so you shouldn't expect policy to respond to whatever idiocies they spout.

As for #1 and #3, they make a strong case for having the raw data available *in addition* to any higher level products, with careful descriptions of the methods used to get them. I believe NASA does well in this regard for many of the products they produce, but there is a disconnect that A-team has been stressing when it comes to easily-readable imagery.

In many cases this imagery seems treated as a display product only (despite the fact that its universality generally means it will be an important input for a broader range of science than any other data product). I have a milder point of view than A-team about the persons involved in the production, even if I'm dissatisfied with the results of the overall system. I've run into comments in documentation several times with statements like "we have schemes for alternate methods of presentation and are seeking funding to implement them", written years ago, like a little cry for help from the data processing factory.


Reacting to entirely hypothetical intimidation on something this esoteric makes no sense -- it would have to be challenged on scientific grounds, since no interpretive component is being provided.

But that makes no sense either, as long as they documented exactly what was done to the image (good luck with that). Then the end user knows exactly what they have (I've yet to encounter this).

If someone wants to go back to square one and do it differently, they are free to do so. In fact, for Modis they are welcome to mount an antenna on the roof of their home and receive the data stream directly -- it is all described on the internet.

There is no such thing as raw data. Major product-affecting decisions were already baked into sensor electronics and later in setting the gain (here probably set for vegetation rather than white on white), making corrections for orbit and angle, on and on. Usually they have a longish series of processing stages called levels.

The same was true of film and what happened afterward in the darkroom. Did Ansel Adams take his film to Kinko's to be developed? No. A goodly portion of what goes on in Gimp/PS is just digital darkroom: invert, grain merge, dodge, burn, screen, etc.

I am very familiar with the norms across the physical sciences. For starters, 99.9% of academic scientists are not going to work with arcane data formats like HDF. The image providers are falling way short of acceptable practices here -- in documentation, server products and interfacing with end-users.

When you take the public's money for salary and project grants, that triggers an obligation. If you don't want the obligation, don't take the money. They're paid for a 40 hour week, 50 weeks a year. I'm talking a half hour of common sense to get beyond these nuisance display products.

We are well into a major crisis with climate change -- with the Arctic leading the pack. It is high time for these satellite data providers to get it in gear. If they do their bit, I can do my bit, and the next person can do their bit.

So don't ask why I am wanting to light a burner under these folks, ask instead why you are not.


A-Team, there are quite a few scientists dropping in here. How about writing a (short) guest blog about your qualms with satellite data providers? Maybe they're just not aware of what the public would like to see, or what 'we' would like to see, as the link between the public and them.


Neven, let me think about that, whether it could move the needle. It's a little late for the Arctic but might benefit Greenland, tundra, Antarctic or whatever comes next.

The situation with practical availability of remote sensing imagery is very similar to what happened with the human genome project. Except there, the problem was fixed on day one.

A string of letters 3 billion long is already a very large file; using just four letters, it is exceedingly uninformative to behold. Add another such string for each person with a genetic condition, plus many species of vertebrates for comparison, and you're at many terabytes of intractable raw data.

To assemble a mammalian genome ab initio from trace reads and mark up its features takes about six weeks of computing time using 2000 fast processors running in parallel. At that time, the end-user community consisted of computer-illiterate MDs and molecular biologists who could barely send an email or open a PDF.

So there was a tremendous disconnect between the geeks who could process these files but didn't know a nucleotide from a nucleolus, and a diverse biomedical side which needed to process patients using all available information.

One option, which the geeks favored, was for the biomedical community to take a five year leave of absence and damn it, learn computer programming. Meanwhile the biomedical community controlled the grant money; their preferred option was a lab geek who would follow orders, like their nurse or technician.

In the end though, they ended up collaborating on a graphical user interface called a genome browser. Two million lines of code, tens of thousands of back-and-forths about what worked and what didn't. Intuitive and self-explanatory in the end; nobody had to learn a thing.

What we have in Arctic imagery is exactly parallel to this: satellite specialists on one side, all-over-the-map scientists on the other. A dozen or so interfaces, mostly beta but also alpha, all built without a single back-and-forth communication.

The net effect is exclusionary and ineffectual. We just cannot afford this.


On 16-bit data: it is not going to show up on the monitor or in print; even if it did, your eye can hardly distinguish the 16 million colors of 3-channel 8-bit as it is.

On satellite data, it really has to do with round-off binning error upon subsequent manipulations -- pixellated images with degraded scientific content. Fine, the satellite center is well positioned to do initial rounds of manipulations, then hand off desktop 8-bit.

That seems to discard meaningful data on the scientific side, yet 16-bit has no significance for concentration/area/extent/thickness where the end result has substantial (several percent minimally) error and spotty validation of that even. And at the end of the day, to share or publish a product image, you're going to dumb it down to 8-bit.
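The size of that round-off binning error is easy to quantify; here is a quick numpy check of a single 16-to-8-to-16-bit round trip (a sketch, not a claim about any particular product pipeline):

```python
import numpy as np

# 16-bit data quantized to 8 bits and expanded back: the worst-case
# round-off error is about half a bin (~128 counts out of 65535),
# i.e. roughly 0.2% -- small next to the several-percent error of
# concentration/extent/thickness end products.
x16 = np.arange(0, 65536, dtype=np.uint16)
x8 = np.round(x16 / 257.0).astype(np.uint8)    # 16 -> 8 bit
back = x8.astype(np.float64) * 257.0           # 8 -> 16 bit again
max_err = np.abs(back - x16).max()
rel_err = max_err / 65535.0
```

Repeated manipulations at 8 bits can compound this, which is the real argument for doing the heavy processing upstream and handing off 8-bit only at the end.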

What does it accomplish to see the same old obscuring cloud at ever greater bit depth? We need better cloud penetration and better digital removal.
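One simple form of digital cloud removal is temporal median compositing over co-registered daily frames: transient bright clouds at a pixel get outvoted by the more stable surface. A toy numpy sketch -- the synthetic "surface" and "cloud" values are made up:

```python
import numpy as np

rng = np.random.default_rng(4)
days, h, w = 15, 40, 40

# stand-in ice scene, identical every day (real ice drifts, of course)
surface = rng.integers(40, 80, (h, w)).astype(float)
frames = np.repeat(surface[None], days, axis=0)

# paint random bright "clouds" over ~30% of pixels each day
cloud = rng.random((days, h, w)) < 0.3
frames[cloud] = 230.0

# per-pixel median over time approximates a cloud-free scene
composite = np.median(frames, axis=0)

# most pixels are cloudy on fewer than half of the 15 days,
# so the median recovers the surface value there
recovered = float(np.mean(composite == surface))
```

Real scenes are harder -- the ice moves between frames and clouds are not always brighter than ice -- so this only works over short windows or after motion compensation, but it shows the principle.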

Gimp is going to 32-bit resolution by year-end. That will retain scientific accuracy under extreme sequential non-linear manipulations, but those are rarely called for.

We can do an awful lot of rapid hypothesis exploration at 8-bit and always go back later to repeat on 16-bit as warranted. Retaining more significant digits than you actually need vis-a-vis final product error just adds to the overhead.

What we'd like really is 2x better resolution on the ground even if 8-bit pixels. That means for microwave, launching with a larger antenna. Not going to happen in time left for the Arctic ice.

For visible, Modis ground resolution is probably already excessive. The Arctic Ocean is large, with file size growing as the square of pixel size, an issue already at 8-bit 250m. None of this is getting us any closer to actual properties of the ice that matter, be it slush or tensile strength.


I was trying to discuss rationales, not justify the lack of product, although my statement can certainly be read like that in hindsight.

I actually do some work with computer vision, although not on satellite images, and I am familiar with the problems with datasets and different formats. I am not familiar with satellite formats; the only satellite data I have ever worked with are GIS topo maps and so on, but on a small enough scale that I could conveniently avoid problems with projections and, more importantly, data for which I had a ground truth, so that I could pretty much plug it into GRASS or whatever (e.g. QGIS) and do my import relatively automatically.

Maybe what is needed is a standard/specification, or three or four standards, to be pushed so that the agencies have a target that is independent of their internal situation. Maybe it is just a matter of convincing them that there is a need for this product.

However, I yield to your opinion because I have no idea what I am talking about, and I repeat my thanks for your and others' work.

Rob Dekker

In trying to figure out why the Arctic appears to be cooler this summer than last year, with more lows than highs, I ended up staring at these graphs from NOAA of the geopotential height (GPH) over the Northern Hemisphere.
This is 2013 so far :
And this is 2012 :

Now, it is apparent that in 2012, highs dominated the surface, while in 2013, low pressure at the surface dominates.
I checked back a couple of years on the main site
and it seems to me that 2013 is an exception to the rule.
Normally, high or low pressure from the mesosphere tends to propagate down through the stratosphere over a couple of months, but this year it seems that the high pressure zone (around 10 mb) just sits there and does not come down to the troposphere. In fact, it seems that throughout the Northern Hemisphere, lows dominate the surface this year, while high pressure is stuck between 150 and 5 mb.

I am very much unskilled in stratosphere-troposphere interaction, so the question I have is: why?

Why does a lump of high pressure just sit there in the lower stratosphere and not disperse down into the lows of the troposphere, while it does in other years?


In fact, what has been going on since autumn last year, is a restructuring of the three NH cells. The gradient between them is fading. The behavior of the Polar Jet Stream is directly related to that.

The geopotential difference is lower, meaning that the height of the atmospheric pressure zones, especially over the Arctic and boreal zones, has gone up.

Mind, the difference is in the tens of meters, at most 100-150. But still. And the main effect is noticeable at the 500-300 mb level.
It is therefore no surprise that a small but important pressure anomaly is measured in the higher atmosphere. In this light, it is interesting to read back what, for instance, Wayne Davidson has put on his Eh2r blog this spring. Check 'adiabatic' processes. He seems to link it to thinner ice. I tend to see more relations with "intercellular" teleconnections (yes, my math is bad, but hear my words…).

On another track, I think Chris Reynolds is also busy getting to grips with the obvious pattern change we see this season.
I’m very eager to see whether we can fit it all together, which to me seems of more general importance than daily nitty-gritty over SIE/SIA numbers.

I still have the opinion that these changes may for the moment be more pronounced in the mid-latitudes than in the Arctic. There are some consequences that have led to different behavior of the spring melt. It doesn’t mean the whole season will end in ‘recovery’. It might, on the other hand, indicate the start of the ‘longer than expected’ tail.

BTW I still hold my position on this year, finally, appearing to be front-runner. When the scientific sensors ‘come to their senses…’.

Jai Mitchell


Re: NH cells. This is, in my opinion, the largest effect of climate change. The weakening of the polar jet indicates a gradual merging of the Ferrel and polar cells. It is this restructuring of global weather patterns that will produce the most immediate impacts on society and humanity's food production capabilities. This will occur through increased droughts in current food-producing regions and floods in current population centers.

What we are also seeing in this regime of climate change is a gradual strengthening of the Hadley cell as tropical evaporation increases. This is increasing desertification. The northward expansion of the 30°N desert belt will also significantly impact grain and livestock production.

Studies of these effects and predictions of their impacts are in the peer-reviewed record produced over the last 20 years. The fact that we have done nothing to act on the impacts that have been observed indicates that only a "breakthrough" event will produce real motivation for change. The longer that we wait, the more painful climate change will be and the more difficult it will be to fix it. In fact, we are approaching a point where it is becoming very likely that modernity will not survive the next 65 years.

With the Arctic ice pack being the "canary in the coal mine" and the most effective messenger of these developing threats, it is important that we more fully understand and are able to explain what is happening in the Arctic today.

That is why I monitor these blogs, and sincerely appreciate the work that is being done here.


"poleward shift of the subtropical dry zones (up to 2° decade−1 in June-July-August (JJA)) in the Northern Hemisphere"

Rob Dekker


In fact, what has been going on since autumn last year, is a restructuring of the three NH cells. The gradient between them is fading. The behavior of the Polar Jet Stream is directly related to that.

The geopotential difference is lower, meaning that the height of the atmospheric pressure zones, especially over the Arctic and boreal zones, has gone up.

Thanks Werther, but I'm not sure I understand this logic.
Why would the height of the atmospheric pressure zones, especially over the Arctic, go up if the gradient between the NH cells fades ?

What I understand is that in 2012 the gradient between the NH cells was low (and the polar vortex was weak), and that is why we had a warm Arctic.

So why would 2013 show the opposite ?

Artful Dodger
Rob Dekker wrote: July 11, 2013 at 08:45

Why would the height of the atmospheric pressure zones, especially over the Arctic, go up if the gradient between the NH cells fades?

Hi Rob,

I think the increased height of the troposphere is a direct consequence of more water vapour in the atmosphere due to a warmer earth, rather than the gradient between circulation cells. Wikipee says:

As a rule, the "cells" of Earth's atmosphere shift polewards in warmer climates (e.g. interglacials compared to glacials).

So it seems the Hadley and Ferrel cells themselves are crowding the Polar cell, raising the height of the atmosphere in the Arctic. Or more exactly pushing the Polar cell further to the North.

However, as the jet stream weakens, one would also expect the loss of this natural barrier between cells to raise the average height of the polar cells.

Have you seen the polar jet stream this week? It's running from 70N to 83N in the CAA, and 75N to 85N over the Laptev sea right now (00z 11 Jul 2013).

Highly unusual, and very far North.


Rob Dekker

Now that we've entered the second half of the melting season, I'm having more and more concerns about the ice in the Central Basin in the aftermath of PAC2013.

Especially the area of r04c04 frame on Modis (a huge chunk of the Central Basin from the North Pole almost to the Siberian Coast) was affected by the persistent lows raging through much of May and June.

Although A-Team as well as Dan P. have done some great work with different wavelength channels in that frame, individual images are still often obscured by clouds, and the situation as a whole remains somewhat unclear.

Our eyes have a great ability to recognize patterns in multiple images shown in quick succession, even when each individual image is partially obscured. So, with one image per day from Terra/Modis, I thought it would be interesting to make an animation of frame r04c04 showing 3 days per second, running over the past 3 months.

This way, we'll see frame r04c04 before, during and after PAC2013.
Here is the result (day 120 - today). Click the image to download the animation :

I'm not sure what you guys think, but I don't see how any of that cottage cheese and low-albedo slush can possibly make it through July melting and even minor August storms, let alone survive until September...


Thanks for presenting it like that, Rob.
Yet another way of getting to the quality ... which is exceptional.

Rob Dekker

Thanks Werther. PAC 2013 was not easy on the Arctic Basin ice, no ?

I updated the animation, combining images from BOTH Terra and Aqua, so we now have two images per day. I also doubled the frame rate (to 6 images per second) so that our eyes have more data from which to extract the image of the ice cover through the clouds.

It's amazing how you can see the ice being obliterated by PAC 2013, and then attacked by late June and early July sunlight. It's melting in front of our eyes....
