
Fast by default

Posted in Tech by Craig Walker

A couple of weeks ago I had the opportunity to go to the 2010 Velocity Web Performance and Operations Conference in Santa Clara, California. Velocity was co-founded three years ago by the godfather of web performance, Steve Souders (then at Yahoo!, now at Google), as a forum for openly discussing and promoting web performance and web operations techniques to the ever-expanding audience of software developers and engineers demanding this knowledge in the age of massively scalable web sites. Velocity has been high on my conference bucket list – with over 1000 attendees and close to a hundred speakers, it’s a who’s who of web performance.

Anyone who’s followed me or this blog for a while knows that performance is a favorite topic of mine – one I’ve presented on a few times, including at TechEd last year. The theme at this year’s conference was “fast by default”. I’ve always thought performance is the number one non-functional requirement for all software, and Velocity is about making web software much, much faster.

First the bad news: there are still web developers out there who don’t understand basic web performance! If you don’t know the 14 rules then learn them! Having said that, we at Xero often let ourselves slip into bad practices – it’s very easy to get complacent about performance. Everyone in your organization should be thinking about it, not just the performance team (if you even have one): everyone from development to operations to QA to marketing needs to understand the impact that performance can have. It needs to be baked into the culture. This often means aligning performance metrics with real-world business metrics: if I speed up my website by one second, will that increase conversions? Or to put it more simply: will one second make me more money? This is core stuff at the big players such as Google, Yahoo! and Facebook, but it’s relevant to any size of organization trying to make its way on the web.

Fortunately most of the talks were at a deeper level, getting into the nitty-gritty of performance. A few talks tackled the inner workings of the web: John Rauser from Amazon was brilliant on the usually dry subject of TCP and the problems inherent in a protocol that was invented almost 20 years before Google was, and Tom Hughes-Croucher from Yahoo! provided a detailed look at what happens in the lifetime of a single web request. There are still developers out there who don’t really understand what happens during an HTTP request and how much inherent latency there is in the Internet. We’re all at the mercy of the laws of physics (unfortunately, New Zealand, the speed of light is a contributing factor when connecting to web sites hosted in the US), but there is still a lot that can be done to improve the bits between your computer and the web server.

Probably the most interesting discovery was how simple things, such as the quality of a user’s router, can have such an adverse effect on web performance. In fact there’s a big push at the moment to get Internet Service Providers to worry less about increasing bandwidth and focus instead on providing better quality gear to their customers (it doesn’t matter if you have 1 Gbps to your home if your router can barely handle 5 Mbps). It’s also interesting to note the work that Google has been doing with protocols like SPDY to try to improve the bits of TCP that weren’t designed to cope with the size of the current Internet. Again, latency is the killer: Google has shown that improving latency has a much greater benefit on the speed of websites than adding raw bandwidth.
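To put numbers on the speed-of-light problem, here’s a back-of-the-envelope sketch. The distance and the speed of light in fibre (roughly two-thirds of c) are rough assumptions of mine, not figures from the talks:

```javascript
// Physics floor on latency between New Zealand and a US West Coast
// data centre: even a perfect network can't beat the speed of light.
var KM_AUCKLAND_TO_CALIFORNIA = 10500; // approx great-circle distance
var LIGHT_IN_FIBRE_KM_PER_MS = 200;    // ~200,000 km/s in glass

function minRoundTripMs(km) {
  // there and back again
  return (2 * km) / LIGHT_IN_FIBRE_KM_PER_MS;
}

console.log(minRoundTripMs(KM_AUCKLAND_TO_CALIFORNIA) + ' ms'); // 105 ms
```

And that is one round trip. A plain HTTP request costs at least two (TCP handshake, then request/response), so a Kiwi user pays 200 ms or more before the server has done any work at all – which is why cutting round trips matters more than adding bandwidth.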

Also worth noting is the rise of progressive enhancement as the current pattern du jour of the web performance world. Talks from the Google Docs team, the Google Maps team, Facebook and others highlighted how important this technique can be in improving performance when the basics aren’t good enough. Essentially, progressive enhancement is about rendering the simplest page possible, then progressively enhancing it through CSS and adding interactivity through JavaScript until the complete page is available. This is a user experience pattern more than a pure performance pattern – the idea is that content is king, and by displaying the content quickly you give the perception of speed. I found the illustration below on a blog post about progressive enhancement, and it’s a very apt way of explaining it.

[Illustration: progressive enhancement]

For the geeks, this technique has the benefit of requiring you to construct your user interface in a loosely coupled and very explicit manner: HTML needs to be semantic and clean, CSS needs to handle all the styling, and the JavaScript needs to be unobtrusive, adding interactivity as late as possible in the page load. With that done you can then play with some interesting tricks, like loading your JavaScript asynchronously or using on-demand loaders such as RequireJS. Even though a lot of sites use techniques that may feel like progressive enhancement, it’s sites like Facebook that really take the concept to the limit.
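To see why on-demand loading helps, here’s a toy sketch of the module-registry idea behind loaders like RequireJS. The `define`/`requireMod` functions and the registry are my own simplification, not the real RequireJS API: the point is that interactivity is *registered* up front but only *evaluated* when something actually asks for it.

```javascript
// Toy on-demand module registry: factories are cheap to register,
// and only run (once) when the module is first requested.
var registry = {};  // name -> { deps, factory }
var cache = {};     // name -> evaluated module

function define(name, deps, factory) {
  registry[name] = { deps: deps, factory: factory };
}

function requireMod(name) {
  if (!(name in cache)) {
    var mod = registry[name];
    var args = mod.deps.map(requireMod); // resolve dependencies first
    cache[name] = mod.factory.apply(null, args);
  }
  return cache[name];
}

// Registering this costs almost nothing at page load; the factory
// only runs later, e.g. in response to the user's first click.
define('greeter', [], function () {
  return { greet: function (who) { return 'Hello, ' + who; } };
});

console.log(requireMod('greeter').greet('Velocity')); // Hello, Velocity
```

The real loaders add script-fetching over the network and async callbacks on top of this, but the deferral of work is the performance win.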

Overall Velocity lived up to the hype and has definitely made me think about the things we could do better at Xero. While we’ve done some good work, there is still so much we could do to make Xero fast by default – which can only be a good thing for our customers.


Jim Morrison
July 7, 2010 at 6.43 pm

Great post Craig, thanks. That Yahoo ‘14 point list’ has grown a bit since I last looked, so it was good to have a re-read; some fascinating additions.

Xero’s always been pretty damn fast, but good to know you guys keep an eye on it!


Andrew Haynes
July 8, 2010 at 1.45 am

I’ve been spending some time trialing one of your competitors. Their page load time is between 5 and 8 seconds.


July 8, 2010 at 12.55 pm

What were some of the factors that led Xero to use Rackspace instead of something like Amazon Web Services?
