Major sites not conserving bandwidth with gzip content compression

At GreatSchools we do around 1M real page views per day and another 250k or so from crawlers. Before content compression we were running well in excess of 10 Mbit/s during peak hours and were getting hit with bursting charges in high-traffic months. When we switched our proxy servers to Apache with mod_deflate (gzip-based compression), we saw a 35% decrease in bandwidth utilization, and the 3 proxy servers that do the compression and sit in front of our 10 web servers barely register any load at all.
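
If you want to try this yourself, a minimal mod_deflate setup on Apache 2.x looks something like the sketch below. This is not our exact configuration; the module path and the MIME type list are assumptions that will vary by distribution and site:

    # Load the filter module (path varies by distribution)
    LoadModule deflate_module modules/mod_deflate.so

    # Compress the text formats that benefit most; images and other
    # binary formats are generally already compressed
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript

    # Standard workarounds for gzip bugs in old Netscape 4.x browsers
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

Doing the compression on the proxy tier, as we do, also keeps the CPU cost off the web servers entirely.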

Our average page size (on the first page view) clocks in around 80-90K. Compare that with your average Web 2.0 company like Digg or 37signals using Prototype, script.aculo.us, Lightbox, and so on, and you'll often see a total page size closer to 200K. Remarkably, these companies are not doing gzip compression! In fact, many major sites have yet to realize the benefits of content compression, though some, such as CNN, MySpace, and Slashdot, have already caught on!

If you’re running a site that does any real traffic, you owe it to yourself to look into content compression of HTML, CSS, and JavaScript with mod_deflate or an equivalent; the bandwidth savings can be tremendous! You can click here to check whether your site is already using gzip-based content compression.
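
If you'd rather check from the command line, a quick probe with curl works too. This is just a sketch; www.example.com below is a placeholder for your own host:

    # Send a GET with gzip in Accept-Encoding, dump the response
    # headers to stdout, and discard the body
    curl -s -D - -o /dev/null -H 'Accept-Encoding: gzip' http://www.example.com/ | grep -i 'content-encoding'

If this prints a Content-Encoding: gzip header, compression is on; no output means you're serving pages uncompressed.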
