I’ve always liked this idea of sending a low-res version first and, if the user is still interested, sending the higher-resolution versions next.

Jeff Percival implemented a very nice version of this technique in the late 1990s for the University of Wisconsin Astronomy Department to enable rapid viewing of telescope images: the telescope was in Arizona and the astronomer was in Wisconsin. Getting a quick low-res first glance was very valuable for spot-checking exposure, focus, alignment, etc. The internet was a lot slower back then!

Some of Sony’s professional cameras include a 4G modem to upload low-res “proxy” videos from reporters in the field so editing can begin before the hi-res footage is available.

Originally shared by Ilya Grigorik

Really interesting experiment… definitely more to explore here!

“This means that a VP9-based still image format (unlike VP8-based WebP) could encode multiple resolutions to be loaded and decoded progressively, at each step encoding only the differences from the previous resolution level… So to load an image at 2x “Retina” display density, we’d load up a series of smaller, lower density frames, decoding and updating the display until reaching the full size (say, 640×360). If the user scrolls away before we reach 2x, we can simply stop loading — and if they scroll back, we can pick up right where we left off.”

http://bit.ly/1UaKR8F
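
The linked experiment doesn’t include code, but to make the “stop loading and pick up where we left off” part concrete, here’s a minimal TypeScript sketch using plain HTTP Range requests. The names are my own: renderPartial stands in for a progressive decoder that can paint whatever bytes have arrived so far, and stillVisible for a visibility check (e.g. backed by an IntersectionObserver).

```typescript
// Sketch of "stop loading, resume where we left off" via HTTP Range requests.
// renderPartial() and stillVisible() are hypothetical callbacks supplied by
// the caller; this is not the decoder logic from the linked experiment.

interface ProgressiveState {
  bytes: Uint8Array; // everything downloaded so far
  offset: number;    // byte position to resume from
  done: boolean;     // true once the full image has arrived
}

async function loadProgressively(
  url: string,
  state: ProgressiveState,
  chunkSize: number,
  renderPartial: (bytesSoFar: Uint8Array) => void,
  stillVisible: () => boolean,
): Promise<ProgressiveState> {
  let { bytes, offset, done } = state;

  while (!done && stillVisible()) {
    const res = await fetch(url, {
      headers: { Range: `bytes=${offset}-${offset + chunkSize - 1}` },
    });

    if (res.status === 200) {
      // Server ignored the Range header and sent the whole file at once.
      bytes = new Uint8Array(await res.arrayBuffer());
      offset = bytes.length;
      done = true;
    } else if (res.status === 206) {
      const chunk = new Uint8Array(await res.arrayBuffer());
      const merged = new Uint8Array(bytes.length + chunk.length);
      merged.set(bytes);
      merged.set(chunk, bytes.length);
      bytes = merged;
      offset += chunk.length;

      // Content-Range looks like "bytes 0-65535/512000"; the number after
      // "/" is the total size, which tells us when we have everything.
      const total = Number(res.headers.get("Content-Range")?.split("/")[1]);
      if (!Number.isNaN(total) && offset >= total) done = true;
      if (chunk.length < chunkSize) done = true; // short read: end of file
    } else {
      throw new Error(`unexpected HTTP status ${res.status}`);
    }

    renderPartial(bytes); // decode and repaint with whatever we have so far
  }

  // If the viewer scrolled away we exit early; calling again with the
  // returned state picks up exactly where we left off.
  return { bytes, offset, done };
}
```

Keeping the download state outside the function is what makes resumption trivial: a scroll-back handler just calls loadProgressively again with the saved state.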