The popular online publishing platform Medium has adopted an infuriating image loading mechanism
As much as I like the effect, I wish giant image headers would just die in a fire. They take up an enormous amount of page space and bandwidth and don’t contribute enough to the content of a page.
What, you don’t like the evolution of the Web into a big picture book with Duplo letters and banal content?
You probably don’t even look at cat pictures on it. >:(
Are progressive JPEGs not supported by mainstream browsers? I would’ve thought that would be a better option than something that results in me downloading more image content just so that it looks like it’s loading faster.
Progressive JPEGs look really ugly while progressing. (They look like extremely over-compressed JPEGs, obviously enough.) If you’re looking for an attractive blur-in effect, progressive JPEG is not the way to get that.
On the other hand, it seems like it should be possible to write a progressive decoder that produces nicer-looking intermediate states based on the knowledge that it’s only got part of the image and can safely blur out the compression blocks.
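You can get a feel for what that decoder might produce with a quick Python/Pillow sketch: decode only a truncated prefix of a progressive JPEG’s byte stream (standing in for "partial download"), then blur the coarse early-scan blocks into a soft preview. The synthetic test image and the one-third truncation point are my own arbitrary choices, not anything a real decoder would use.

```python
# Sketch: simulate a "nicer" partial render of a progressive JPEG by
# decoding a truncated byte stream and blurring away the block artifacts.
# Assumes Pillow is installed; the image content and cut point are arbitrary.
from io import BytesIO

from PIL import Image, ImageFile, ImageFilter

ImageFile.LOAD_TRUNCATED_IMAGES = True  # tolerate the cut-off stream

# Build a progressive JPEG to play with (any source photo would do).
src = Image.new("RGB", (320, 240), "white")
for x in range(0, 320, 16):
    for y in range(0, 240, 16):
        src.paste(((x * 7) % 256, (y * 11) % 256, 128), (x, y, x + 16, y + 16))
buf = BytesIO()
src.save(buf, "JPEG", progressive=True, quality=75)
data = buf.getvalue()

# Pretend only the first third of the file has arrived so far.
partial = Image.open(BytesIO(data[: len(data) // 3]))
partial.load()

# Smooth the blocky early scans into the kind of soft preview you'd want.
preview = partial.filter(ImageFilter.GaussianBlur(radius=8))
```

A real in-browser decoder would of course do this on the partially decoded DCT data rather than re-opening truncated files, but the visual idea is the same.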
As far as I know, support for them is nearly universal, though.
Ah, yes - hadn’t thought of rendering aesthetics. Interestingly, it seems that progressive JPEGs can, in some cases, optimise better than regular baseline JPEGs (see, eg, 1 and 2).
Joking aside, whenever I see these I think my browser’s image rendering pipeline has broken. It doesn’t help that I run canary/dev channel builds a lot, but the effect is still jarring.
I’m not sure why they create the blurred image at run time. The original image is effectively static content; just create a second blurry, or even just low-res, version that is loaded first. It can go on the distributed CDNs too.
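The pre-generation step could be as simple as this Pillow sketch, run once at publish time so the tiny placeholder is just another static file on the CDN (the `make_placeholder` name, the 32px width, and the quality setting are all my own invented choices):

```python
# Sketch: generate a tiny "blur-up" placeholder ahead of time, so it can be
# served as static content next to the full image instead of computed at
# run time. Assumes Pillow; sizes/quality are arbitrary illustrative picks.
from io import BytesIO

from PIL import Image, ImageFilter

def make_placeholder(image: Image.Image, width: int = 32) -> bytes:
    """Return a heavily downscaled, blurred JPEG a few hundred bytes long."""
    ratio = width / image.width
    tiny = image.resize((width, max(1, round(image.height * ratio))))
    tiny = tiny.filter(ImageFilter.GaussianBlur(radius=1))
    out = BytesIO()
    tiny.save(out, "JPEG", quality=40)
    return out.getvalue()

full = Image.new("RGB", (1600, 900), (40, 120, 200))  # stand-in for a photo
placeholder = make_placeholder(full)
```

The placeholder is small enough that it could even be inlined into the page HTML as a data URI, which avoids a second request entirely.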