Websites are now richer than ever. They have high-quality images, astonishing animations and responsive CSS, but this comes at a cost: your server has to serve bigger images and text files, and your visitors' browsers need more CPU cycles to process your site. In this post we will see how to improve this without affecting the quality of the beautiful site you devised.

The Peregrine Falcon is considered the fastest member of the animal kingdom. A really “optimized” bird.

GZIP compression

Enabling GZIP compression is probably the easiest way to give a boost to our website, as we can have it up and running in a matter of minutes.

GZIP provides lossless compression, that is, we can recover the original data when decompressing it. It is highly recommended for text files such as HTML, CSS, JavaScript or XML, and as it is really fast, there is almost no overhead on either the server or the browser.

Check out our post on how GZIP works to see how to enable GZIP compression in Apache or nginx.
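To get a feel for the savings before touching any server configuration, we can compress a file locally with the gzip command-line tool. A quick sketch (the file name and contents are illustrative):

```shell
# Generate a repetitive CSS-like sample file, then compress it.
# Text files typically shrink dramatically; your ratios will vary.
printf 'body { margin: 0; padding: 0; }\n%.0s' $(seq 1 500) > sample.css
gzip -9 -c sample.css > sample.css.gz   # -c writes to stdout, keeping the original
ls -l sample.css sample.css.gz
```

Comparing the two sizes gives a rough idea of what the `Content-Encoding: gzip` response would save on the wire.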

Image optimization

Optimizing images is crucial, as they represent around 60% on average of the total weight of modern websites, with JPEG being the most used format.


The first step in optimizing images, even before using any tool, is to determine the most appropriate format for each image.


JPEG

JPEG images are usually suitable for photographs, but give poor results for small images or images containing text or sharp lines.

There are a few tools to optimize JPEG images, but we will see only two: jpegoptim and jpegtran.

jpegoptim is a command-line tool available in most Linux repositories, as well as in MacPorts and Homebrew. It supports both lossless and lossy compression.

$ jpegoptim aircraft.jpg
aircraft.jpg 467x312 24bit Exif ICC JFIF  [OK] 31623 --> 30810 bytes (2.57%), optimized.

If we ask jpegoptim to discard ICC profiles, IPTC markers, EXIF markers and comment markers, we can achieve greater compression:

$ jpegoptim --strip-all aircraft.jpg
aircraft.jpg 467x312 24bit JFIF  [OK] 31623 --> 19261 bytes (39.09%), optimized.

By default, the lossless optimization mode is enabled, so the previous transformations had no cost in terms of image quality. We can enable the lossy mode to achieve even higher compression ratios, at the price of reduced quality. In the following example, a maximum quality of 80 is set. The ratio is quite impressive, but obviously the quality of the image is not the same:

$ jpegoptim --strip-all -m80 aircraft.jpg
aircraft.jpg 467x312 24bit JFIF  [OK] 31623 --> 14981 bytes (52.63%), optimized.

Determining the optimal quality value is a manual process, but we don't have to be afraid of it. Usually, a quality loss of 5% to 10% is not perceptible to the regular human eye. In other situations quality is not important, such as small thumbnails, so we can afford a bigger loss.
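One way to pick a value empirically is to generate one copy per quality level and compare them by eye. A rough sketch, assuming jpegoptim is installed and the aircraft.jpg sample from above exists (it exits quietly otherwise):

```shell
# Produce one file per quality level so they can be compared visually.
# Exits quietly if jpegoptim or the sample image is not available.
command -v jpegoptim >/dev/null 2>&1 || exit 0
[ -f aircraft.jpg ] || exit 0
for q in 95 90 85 80 75; do
  jpegoptim --stdout --strip-all -m"$q" aircraft.jpg > "aircraft_q${q}.jpg"
done
ls -l aircraft.jpg aircraft_q*.jpg
```

Opening the resulting files side by side quickly shows where the quality loss becomes noticeable for a given image.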

The other tool, jpegtran, performs only lossless transformations of JPEG files. It also lets us transform the image: rotate, scale, convert to grayscale, etc.

$ jpegtran aircraft.jpg > aircraft_comp.jpg
$ ls -lh
-rw-r--r-- 1 raul  raul  31K Apr 22 10:56 aircraft.jpg
-rw-r--r-- 1 raul  raul  20K Apr 22 11:01 aircraft_comp.jpg

Many times, grayscale images used on websites are not really grayscale, but color images using only gray tones. By changing the image mode to grayscale we can achieve better results.
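For example, a sketch of converting such an image to true grayscale with jpegtran, assuming it is installed and the aircraft.jpg sample exists (it exits quietly otherwise):

```shell
# Re-encode the image in grayscale mode; -optimize additionally optimizes
# the Huffman tables. Exits quietly if the tool or sample image is missing.
command -v jpegtran >/dev/null 2>&1 || exit 0
[ -f aircraft.jpg ] || exit 0
jpegtran -grayscale -optimize aircraft.jpg > aircraft_gray.jpg
ls -l aircraft.jpg aircraft_gray.jpg
```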


PNG

PNG is a raster graphics file format that supports lossless data compression. It was created as an improved, non-patented replacement for GIF. It is usually used for graphics, but it is capable of handling an impressive number of image types with good results.

OptiPNG is a PNG optimizer that reduces the size of PNG files without losing any information. It works in two steps: first it preprocesses the pixels to make them more “compressible”, then it compresses them using LZ77. It provides quite a few options and optimization modes.

Executing it with default settings we get a 39.18% decrease in this image:

$ optipng image.png
** Processing: image.png
450x90 pixels, 4x8 bits/pixel, RGB+alpha
Reducing image to 8 bits/pixel, 37 colors (34 transparent) in palette
Input IDAT size = 7467 bytes
Input file size = 8375 bytes

  zc = 9  zm = 8  zs = 0  f = 0		IDAT size = 4017
Selecting parameters:
  zc = 9  zm = 8  zs = 0  f = 0		IDAT size = 4017

Output IDAT size = 4017 bytes (3450 bytes decrease)
Output file size = 5094 bytes (3281 bytes = 39.18% decrease)

As it provides different optimization levels (-o0 … -o7), we can try to get better ratios at the cost of more time and memory.
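For instance, a sketch comparing the default level with the most aggressive one, assuming optipng is installed and the image.png sample from above exists (it exits quietly otherwise):

```shell
# -o7 tries many more compression strategies: slower, sometimes smaller.
# Exits quietly if optipng or the sample image is missing.
command -v optipng >/dev/null 2>&1 || exit 0
[ -f image.png ] || exit 0
optipng -o7 -out image_o7.png image.png
ls -l image.png image_o7.png
```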


GIF

GIF is a bitmap image format usually used for simple animations, but its usage is decreasing in favor of CSS3 animations. It is still a widely used format to display funny cats though :)

If you like computer science history, GIF’s is quite interesting: from the “Unisys/CompuServe GIF controversy” episode, which led to the “Burn All GIFs” campaign, to crazy recent projects such as gifsockets, a library that provides real-time communication using animated GIFs as a transport layer.

Going back to the purpose of this post: even though GIF images are usually really small, we can still reduce them a bit. Gifsicle is a command-line tool for creating and editing GIF images. Since animations consist of frames, and some parts of the image don’t change from one frame to the next, Gifsicle detects this and does not carry over the duplicated pixel information:

$ gifsicle -O2 image.gif > image_opt.gif
$ ls -lh
-rw-r--r-- 1 raul  raul  565K Apr 22 14:07 image.gif
-rw-r--r-- 1 raul  raul  548K Apr 22 14:07 image_opt.gif


WebP

WebP is an image format created by Google that provides both lossless and lossy compression. Its compression ratio is significantly better than PNG (26% smaller) and JPEG (25-34% smaller). Unfortunately, it is not yet supported in all browsers: Firefox, Internet Explorer and Safari still don’t support this format.
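For browsers that do support it, Google's cwebp tool (part of the libwebp package) converts existing images. A sketch assuming cwebp is installed and the aircraft.jpg sample exists (it exits quietly otherwise):

```shell
# Encode a JPEG as lossy WebP at quality 80.
# Exits quietly if cwebp or the sample image is missing.
command -v cwebp >/dev/null 2>&1 || exit 0
[ -f aircraft.jpg ] || exit 0
cwebp -q 80 aircraft.jpg -o aircraft.webp
ls -l aircraft.jpg aircraft.webp
```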


Favicons

A favicon is the little icon associated with a web page, displayed next to the URL. It is usually placed in the server’s root directory with the name favicon.ico. Forgetting to include this icon affects performance: web browsers ask for it explicitly, so if we don’t have a favicon, our web server will have to look for it on every request and return a 404 Not Found error. The recommendation is to have a small (less than 1K) and cacheable favicon.

ImageMagick can help us create small favicons by reducing their dimensions (16×16 should be enough) and number of colors. For example, to create a 16×16 favicon using only 4 colors from a PNG image:

$ convert -resize 16x16 -colors 4 favicon.png favicon.ico

We can increase the number of colors until our favicon looks right. For example, I was able to create a ServerGrove favicon of 318 bytes (the current one weighs 678 bytes) using 16 colors, with similar quality.

Text files size reduction

As we discussed earlier, text files must be compressed with GZIP as we can achieve great compression ratios. Let’s see what we can do if we want to go even further:

JavaScript and CSS

There are two common techniques to reduce the size of JavaScript and CSS files: minification and combination.

By minifying a JavaScript/CSS file, we remove unnecessary bytes, such as extra spaces, line breaks and indentation.

YUI Compressor is a JavaScript and CSS compressor tool created by Yahoo!. It is a JAR file so we only have to download it and execute it with the Java runtime. For example, to minify the jQuery library:

$ java -jar yuicompressor.jar jquery.js > jquery.min.js
$ ls -lh
-rw-r--r-- 1 raul  raul  239K Apr 22 13:25 jquery.js
-rw-r--r-- 1 raul  raul  127K Apr 22 13:26 jquery.min.js

Another common technique is to combine different JavaScript/CSS files into one (one of each type, of course). This has two benefits: fewer round-trips (requests) and possibly better GZIP compression ratios, as GZIP works even better on larger files.
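A sketch of the idea with two illustrative files, using only standard tools:

```shell
# Create two small "minified" files, combine them, and gzip each variant.
printf 'function a(i){return i+1;}\n%.0s' $(seq 1 200) > a.min.js
printf 'function b(i){return i-1;}\n%.0s' $(seq 1 200) > b.min.js
cat a.min.js b.min.js > combined.min.js        # one file, one request
gzip -9 -c a.min.js > a.min.js.gz
gzip -9 -c b.min.js > b.min.js.gz
gzip -9 -c combined.min.js > combined.min.js.gz
ls -l a.min.js.gz b.min.js.gz combined.min.js.gz
```

Besides saving one request, the combined archive is usually no bigger (and often smaller) than the two separate ones, since GZIP pays its header overhead only once and can share its dictionary across both files.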

Finally, there are techniques specific to JavaScript or CSS files. For example, in CSS we can convert “margin: 0px 0px 0px 0px” to “margin:0”, or “border-color: #ffeedd” to “border-color:#fed”. These optimizations are also done by YUI Compressor. For JavaScript, variable names can be automatically changed to shorter ones.


HTML

HTML files can be minified too, but it is not an easy task at all, as tags such as <pre>, <textarea>, <script> and <style> may be affected if we remove certain whitespace. It is also important to note that it can be a slow process, so doing it on the fly may not be the best idea; it is better suited to files that are minified once and served many times.

htmlcompressor is a Java-based command-line tool to minify HTML files. For example, the index page of the ServerGrove site can be reduced from 29K to 23K:

$ java -jar htmlcompressor.jar -o servergrove.min.html servergrove.html 
$ ls -lh
-rw-r--r-- 1 raul  raul  29K Apr 22 12:54 servergrove.html
-rw-r--r-- 1 raul  raul  23K Apr 22 12:55 servergrove.min.html

Remember to test the results as it is still an experimental technique.

Other files

Web fonts

Web font usage is growing steadily. According to the HTTP Archive, 43% of websites use them.


Web font size is determined by the number of glyphs, the metadata and the compression method used. To make it more complicated, there are four different formats (WOFF, TTF, EOT, SVG), and none of them enjoys universal adoption. Services like Google Fonts can help us with this, and as we will see, it also provides ways to optimize the font size.

For example, the Open Sans web font, since it supports 20+ languages, weighs in at over 217K. If we are using only Latin characters, we can request just that subset, dramatically reducing its size to 36K:

<link href=";subset=latin" rel="stylesheet" />

We can even download a web font containing only a few characters, which is useful for fonts that are only used in a title, for example:

<link href=";text=ServerGrove" rel="stylesheet" />


Flash

Unlike web fonts, the use of Flash in modern websites is declining, but it is still relevant: 29% of websites still include at least one Flash file.


Using vector graphics and avoiding embedded fonts can help reduce the size of Flash files. Also, when using raster images, the previous tips for compressing images apply too. That said, my personal recommendation is not to use Flash unless it is strictly necessary, something that is becoming less and less common with CSS3 and the new JavaScript APIs.


Caching

Caching is essential in today’s websites and applications. The basic idea behind caching is to avoid serving the same content to the same client twice. It is especially useful for resources that don’t change often, such as CSS/JavaScript files or images. With caching, we reduce the number of requests our web server has to handle, as well as the data transmitted. In HTTP, this is achieved by using HTTP headers.

HTTP/1.1 provides the following caching response headers:

  • Expires and Cache-Control: max-age: These headers specify how long the browser can consider the resource fresh and use it without requesting it again. Once this time has expired, the browser will request the resource from the server again.
  • Last-Modified and ETag: These headers follow a different approach: they are two ways to determine whether the file we are requesting is the same one we have in the cache, so it is downloaded again only if it differs.

For static assets, the recommended strategy is to set Expires to one year and the Last-Modified date to the last time the resource was changed.
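As a sketch, in Apache this could look like the following, assuming mod_expires is enabled (adjust the MIME types and lifetimes to your site):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Far-future expiration for static assets that rarely change
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/gif  "access plus 1 year"
    ExpiresByType text/css   "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```

With such long lifetimes, remember to rename (or version) a file when its content changes, so browsers pick up the new copy.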

Network optimizations

Reduce cookie size

Cookies should be as small as possible, because they are sent in the HTTP headers of every single request. Whether we are requesting an HTML page, a JavaScript file or even an image, they are always sent. Using a different domain/subdomain to serve static assets can help, as no cookies will be sent for those requests.

DNS lookups

Web browsers limit the number of concurrent connections per hostname, so it is common to parallelize downloads across hostnames (domains or subdomains). While this can be a good way to load your site faster, if the number of hostnames is too large, the time the client spends resolving the domains can have a negative effect.

Avoid bad requests

404 Not Found or 410 Gone HTTP responses come from unnecessary requests, which waste server time and resources. Looking for broken links and removing them is simple, so there is no excuse to leave them there.


Measuring tools

It is important to be able to measure our pages as we improve their performance. Automatic tools are of great help, as they report potential issues and give us hints on how to fix them. The most important ones are Google PageSpeed, YSlow and GTmetrix.

PageSpeed Module

mod_pagespeed and ngx_pagespeed are open-source modules created by Google for Apache and nginx, respectively. They speed up websites by automatically applying some of the recommendations discussed in this post, without requiring changes to our existing content or workflow.


Photo: Peregrine Stretching Wings, by Jerry Kirkhart