But twenty years have elapsed, and better math has been brewed. Websites are more dynamic than they used to be, dedicated methods for compressing media have been developed, and much of the remaining content can be cached, compressed just once (or at least infrequently), and served many times. So when deciding what would make the web fastest, compression speed matters less than it once did, and gzip really only needs to be compared with other algorithms on text.
There exist algorithms, such as LZHAM and PPM, that are, for text, far superior to gzip's LZ77, both in speed and in how much they can squish a chunk of data down. So Google developer advocate Colt McAnlis concludes we should explore the alternatives and switch to one or more of them. Getting the world to switch to something, no matter how obviously better it is than what's currently in use, isn't easy; but he seems determined, and already has a Mozilla developer, Patrick McManus, on board to help search for the compression cure. Good luck with that, fellas.
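PPM and LZHAM aren't in most standard libraries, but Python ships both DEFLATE (the LZ77-based scheme gzip uses) and LZMA, one of the newer contenders, so a rough sketch of the kind of comparison McAnlis has in mind might look like this (the sample text is a made-up placeholder, not benchmark data):

```python
import zlib
import lzma

# Some moderately large, compressible text standing in for a web page.
text = (" ".join(str(i) for i in range(5000)) * 3).encode("utf-8")

deflated = zlib.compress(text, 9)  # DEFLATE: LZ77 + Huffman, what gzip uses
lzma_out = lzma.compress(text)     # LZMA: a more modern dictionary coder

print(f"original: {len(text):>6} bytes")
print(f"DEFLATE:  {len(deflated):>6} bytes")
print(f"LZMA:     {len(lzma_out):>6} bytes")

# Both are lossless: decompressing must give back the exact input.
assert zlib.decompress(deflated) == text
assert lzma.decompress(lzma_out) == text
```

On real web corpora the relative sizes (and the encode/decode times, which this toy snippet doesn't measure) are exactly the trade-offs the search McAnlis proposes would have to weigh.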