
Reconsidering gzip

When our server generates a webpage to send to your browser, among other tricks it compresses the text associated with each page – the HTML, the CSS, the JavaScript – using a piece of software called gzip, and beams that over along with the images. Your browser then decompresses the data and constructs the page. The result is a little extra math to perform on both our server and your browser, but the reward is a roughly 63% size reduction in that data for the network transfer. The resource overhead on both sides is generally nominal, phones included, and the use of gzip (and the DEFLATE format underneath it) is quite ubiquitous, in part because it struck a good compromise, versus other formats, between compression and decompression time and sheer reduction muscle.
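If you want to see that round trip for yourself, here is a minimal sketch using Python's built-in gzip module – the payload is just an illustrative stand-in for a page's markup, not anything from our server, and the exact ratio you get will vary with the content:

```python
import gzip

# Stand-in for the text a server would send: HTML, CSS, JavaScript.
# (Illustrative payload only; real pages compress differently.)
page_text = (
    "<html><head><title>Example</title></head><body>"
    + "<p>Hello, compression!</p>" * 200
    + "</body></html>"
).encode("utf-8")

compressed = gzip.compress(page_text)   # what the server beams over the wire
restored = gzip.decompress(compressed)  # what the browser reconstructs

assert restored == page_text
print(f"original:   {len(page_text)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"reduction:  {1 - len(compressed) / len(page_text):.0%}")
```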

But twenty years have elapsed and better math has been brewed. Websites are more often dynamic than before, media now gets its own dedicated compression methods, and much of the remaining text content can be cached – compressed just once (from time to time) and served many times. So when deciding what would make the web fastest, gzip's compression speed, and its performance on anything other than text, no longer need to count for much against other algorithms.
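That "compress once, serve many times" idea is easy to picture in code. Here is a rough sketch, assuming a hypothetical static file app.js, that gzips the asset a single time at startup and then hands every client the same pre-compressed bytes (a real server would also check the request's Accept-Encoding header before doing this):

```python
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

# Compress the static asset once, up front, not on every request.
# "app.js" is a hypothetical file name used for illustration.
with open("app.js", "rb") as f:
    PRECOMPRESSED_JS = gzip.compress(f.read(), compresslevel=9)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/app.js":
            # Every visitor gets the same bytes; the compression cost was paid once.
            self.send_response(200)
            self.send_header("Content-Type", "application/javascript")
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(PRECOMPRESSED_JS)))
            self.end_headers()
            self.wfile.write(PRECOMPRESSED_JS)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```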

There exist algorithms such as LZMA, LZHAM and PPM that, for text, are far superior to gzip's LZ77-based DEFLATE in how much they can squish a chunk of data down, and in some cases in decompression speed as well. So Google developer advocate Colt McAnlis concludes we should explore the alternatives and switch to one or more of them. Getting the world to switch to something, no matter how obviously better it is than what's currently in use, isn't easy; still, he seems determined and already has a Mozilla developer, Patrick McManus, on board to help search for the compression cure – good luck with that, fellas.
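LZHAM and PPM aren't something you can import from Python's standard library, but LZMA is, so here is a small sketch using it as a stand-in to show the kind of ratio gap being talked about – the sample text is made up, and the exact numbers will depend entirely on what you feed it:

```python
import gzip
import lzma

# Made-up, somewhat redundant text standing in for a page's markup.
sample = (
    "<div class='post'><h2>Reconsidering gzip</h2>"
    "<p>Better math has been brewed since 1993.</p></div>\n"
) * 500
data = sample.encode("utf-8")

gz = gzip.compress(data, compresslevel=9)  # gzip / DEFLATE
xz = lzma.compress(data, preset=9)         # LZMA, one of the newer contenders

print(f"original: {len(data):>7} bytes")
print(f"gzip -9 : {len(gz):>7} bytes ({len(gz) / len(data):.1%} of original)")
print(f"lzma -9 : {len(xz):>7} bytes ({len(xz) / len(data):.1%} of original)")
```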

Doug Simmons