Web page weight | Office of Information Technology


Last Updated: 01/07/2014

When you are looking at houses, it is usually easy to see which ones have structural damage. The porch roof is leaning unnaturally, or the keystone above the window is missing. We have trained our eyes to see structural damage in houses, but it is harder for us to see structural deficiencies in our web sites. What web diagnostics tools do we have to avoid the equivalent of sagging porches and leaky roofs?

A simple method available in most modern browsers is to analyze the network activity that a page load on your site generates. It shows two important metrics: the total page weight and the number of HTTP requests.

The screenshot below shows the network analysis performed in Firefox 25.0 for Mac. I emptied the browser cache before loading the page so that all the resources would load from the server. I then turned on the network analysis pane (Tools > Web Developer > Network) and loaded the page. The browser gives me a list of files that were loaded and calculates the total page weight:

The average page weight has grown 15x in the last decade and is now around 1.5 MB. If your site weighs more than 1 MB, there are probably things you can optimize (we will give you some pointers about that later on). The number of HTTP requests is the number of files your browser has to load to display your page. It is usually an assortment of HTML files, CSS style sheets, JavaScript files, images and icons, and media files like video and audio. A quick survey of sites linked from the colorado.edu home page in November 2013 showed that the average number of HTTP requests for those sites was 44. If your site loads more than 50 resources, you probably have some files that should be sent to the chopping block.
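If you want a rough request count without opening the network pane, you can scan a page's HTML for resource references. The sketch below uses Python's standard-library HTML parser; it undercounts real traffic (it cannot see CSS background images, web fonts, or resources added by JavaScript), so treat it as a lower bound:

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Collects external resources a page would request:
    scripts, stylesheets, images, media, and frames."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img", "audio", "video", "source", "iframe") \
                and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" \
                and attrs.get("href"):
            self.resources.append(attrs["href"])

# A tiny sample page with four external resources.
html = """
<html><head>
<link rel="stylesheet" href="style.css">
<script src="app.js"></script>
</head><body>
<img src="logo.png"><img src="banner.jpg">
</body></html>
"""

counter = ResourceCounter()
counter.feed(html)
print(len(counter.resources))  # 4: one stylesheet, one script, two images
```

Feed it the saved HTML of one of your own pages and compare the count against the 44-request average mentioned above.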

Let's look at three common culprits behind page weight bloat and HTTP request ballooning and suggest techniques for improving the situation.

Giant images
Any image greater than 200-300 KB in size is likely too big or isn't properly compressed. It is generally a very bad idea to load a big image (one that may have come straight from your high-resolution camera) when you only need to display a small thumbnail. I recently came across a site that loaded a 2-megapixel photo weighing 1.2 MB only to resize it in code and display it at 300 by 200 pixels. Loading a properly sized small image instead would have shaved over a megabyte off the page weight.
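The fix is to resize once, server-side, before the image ever reaches a page. Here is a minimal sketch using the Pillow imaging library (an assumption on our part; any image-processing tool does the same job, and the function name is illustrative):

```python
# Assumes the Pillow library is installed (pip install Pillow).
from PIL import Image

def make_thumbnail(src_path, dest_path, max_size=(300, 200)):
    """Shrink an image to fit within max_size, preserving aspect
    ratio, instead of shipping the full-resolution original and
    scaling it in the browser."""
    with Image.open(src_path) as img:
        img.thumbnail(max_size)  # resizes in place, keeps proportions
        img.save(dest_path, quality=80, optimize=True)
```

A 1.2 MB camera photo run through this typically comes out at a few tens of kilobytes, which is the difference the article's example describes.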

Orphaned scripts
A site I reviewed recently had two sets of hit counters on the front page: an old web statistics tool popular 5-7 years ago and a Google Analytics script. Together, the counters loaded about 200 KB of JavaScript. The site manager had recently taken over the site from someone else and wasn't aware the scripts existed, meaning they were simply dead weight. Removing them would improve both the page weight and the number of HTTP requests.
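A quick way to find orphans like these is to list the host of every external script a page references and ask whether you still recognize each one. A small sketch with the standard library (the URLs below are made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptAudit(HTMLParser):
    """Lists the host of every external <script src=...> so you can
    spot counters and trackers nobody remembers adding."""
    def __init__(self):
        super().__init__()
        self.hosts = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                # Relative paths have no host, so they are your own files.
                self.hosts.append(urlparse(src).netloc or "(same site)")

# Hypothetical page markup: two third-party counters and one local script.
html = """
<script src="https://www.google-analytics.com/ga.js"></script>
<script src="http://counter.example-stats.net/hit.js"></script>
<script src="/js/site.js"></script>
"""

audit = ScriptAudit()
audit.feed(html)
for host in audit.hosts:
    print(host)
```

Any host you cannot explain is a candidate for removal.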

Inefficient mashups
Let's say you want to display a button on your site that allows your users to follow you on Twitter. You use the code provided by Twitter, and you end up with your pretty button, but you also end up with a nearly 100 KB JavaScript file that loads on every page that has the button. All you really needed was a simple image linked to your Twitter account; you didn't need the script that came with it. Your shortcut solution has simplified your life at the expense of your visitors.
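The lightweight alternative is a few lines of plain HTML. In this sketch the account name, image path, and dimensions are placeholders you would replace with your own:

```html
<!-- A plain linked image instead of the widget script.
     One small image request instead of ~100 KB of JavaScript. -->
<a href="https://twitter.com/YourAccount">
  <img src="/images/twitter-follow.png" alt="Follow us on Twitter"
       width="120" height="28">
</a>
```

You lose the live follower count the widget shows, but most visitors will never miss it.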

Lean pages load faster, reduce the need for powerful servers, and are simpler and therefore easier to maintain. Watching the weight of your pages is a worthy exercise. If you would like to get more ideas about making your web site lean or if you would like OIT to help you with your analysis, you can always ask for a free web consultation.