This week we spent some time setting up a few new features on our web server. The major one is a server stack that combines nginx, Varnish Cache, and Apache. The speed benefits were immediately obvious: load times on larger Drupal websites dropped from around 5-6 seconds to 2-4 seconds or less. Long-term load-time benefits for visitors can now be monitored with Google's new Site Speed reports in Analytics as well.

What does this stack do? Basically, when a user visits a website on our server, the PHP is handled by Apache, the response is cached by Varnish, and the output is gzipped by nginx. Many can (and do) argue that nginx can handle caching on its own, so why run Varnish with the extra overhead? The answer is fairly subjective depending on who you ask, but in our testing it has simply been faster, because Varnish is using malloc storage: the cache lives in RAM rather than on disk, which removes a disk lookup for every request. This may not be the ideal option for every server, but if you have the RAM to spare, why not use it?
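For a more concrete picture of the chain, here is a minimal sketch of how the three pieces can be wired together. The port numbers (Apache on 8080, Varnish on 6081, nginx on 80), the file path, and the 1G cache size are example values for illustration, not our exact production settings:

    # /etc/varnish/default.vcl (example path): Varnish fetches from Apache,
    # which is assumed to be listening on 127.0.0.1:8080
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    # varnishd start-up flags: -s malloc keeps the cache in RAM (1G is an example size)
    varnishd -f /etc/varnish/default.vcl -a 127.0.0.1:6081 -s malloc,1G

    # nginx server block: listen on the public port 80, gzip responses,
    # and proxy everything to Varnish on 6081
    server {
        listen 80;
        gzip on;
        gzip_types text/css text/javascript application/x-javascript application/json;
        location / {
            proxy_pass http://127.0.0.1:6081;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

With a setup along these lines, requests hit nginx first, nginx asks Varnish, and Varnish only bothers Apache (and PHP) on a cache miss; anything Varnish can answer from RAM never touches the disk or the PHP interpreter.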

Why do we care about all of this anyway? There are several reasons, really. Google prefers fast-loading pages and now actually uses site speed as a ranking factor. The other major reason is users: people like fast websites, and it's a pretty straightforward concept. In today's world of speed, people don't like waiting, so with lower load times you have a pretty good guarantee that your site won't be remembered as “that slow website that I hate going to”.