Posts Tagged ‘performance’

How does Google measure site speed?

So, Google is factoring web site speed into its ranking algorithms.  Just how are they measuring web site speed?  What does it mean for a site engineer?  We’ve been obsessing over page performance for some years now, so how does this really change anything?

Well, first off, how does it impact a site engineer?  According to the official post:

it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation

It’s not clear to me which 1% are affected or how that’s decided.  Is it just a test affecting queries for 1% of users, or is it applied uniformly to all users but only to 1% of queries?  Which queries, and why?

Anyway, given that it’s only one of “more than 200 signals”, clearly it’s not the major factor in determining relevancy.  But still, Google can’t throw out a challenge like this and not expect people to obsess over it.  Which is part of the point, I think.

When optimizing site performance, engineers typically consider a variety of KPIs:

  • Time to first byte
  • Base page download
  • Progressively rendered elements:  headers, above-the-fold content
  • Full page download, including all resources
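These KPIs can be approximated with a quick script.  As a rough sketch (this is not Google’s methodology, just an illustration using Python’s standard library), here is one way to time time-to-first-byte and full base-page download for a single URL; it ignores sub-resources, DNS lookup, and connection reuse:

```python
import time
import http.client
from urllib.parse import urlparse

def measure(url):
    """Return approximate time-to-first-byte and total download time, in seconds."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    start = time.perf_counter()
    conn.request("GET", parts.path or "/")
    resp = conn.getresponse()          # returns once status line + headers arrive
    ttfb = time.perf_counter() - start
    resp.read()                        # drain the full response body
    total = time.perf_counter() - start
    conn.close()
    return {"ttfb": ttfb, "total": total}
```

A real measurement would repeat this from representative user bandwidths and aggregate percentiles, which is what commercial monitoring tools do for you.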

So what is Google actually measuring as “web site speed”?

The official post displays a chart indicating

Labs > Site Performance shows the speed of your website as experienced by users around the world as in the chart below

Furthermore, they link to an earlier post describing site performance in webmaster tools which says

The performance overview shows a graph of the aggregated speed numbers for the website, based on the pages that were most frequently accessed by visitors who use the Google Toolbar with the PageRank feature activated

Matt Cutts’s post on site speed links to the blog post containing the above information, indicating

Google’s webmaster console provides information very close to the information that we’re actually using in our ranking

So there it is.  Google are measuring web site speed as Full Page Download, including all resources, across ALL pages on your site.  All pages.  They confirm this:

As the page load times are based on actual accesses made by your users, it’s possible that it includes pages which are disallowed from crawling. While Googlebot will not be able to crawl disallowed pages, they may be a significant part of your site’s user experience

So to recap:

  • They’re measuring full page load including all resources.  Your scripts, your images, third party display ads, third party scripts etc.
  • They’re measuring all pages visited by users on your site, not just crawlable pages.
  • They’re measuring from users’ actual web browsers.  No simulations.  From real bandwidths.

Clearly, most of the well-documented best practices for speeding up your website still apply.  So is there anything else to consider?

  • Post-loading content is looking pretty interesting to us, if it can be done in such a way that it is not factored into page load time.
  • Minimizing 3rd party content, such as display ads, could have a huge impact.  We have little control over 3rd party creative.  I’ve seen ads make up to 7 additional HTTP requests for XML, Flash, images etc.  Steve Souders has a complete initiative around Performance of 3rd Party Content.
  • Focus on pages that might be contributing to longer load times even if they aren’t your primary experience.
  • Beware of links on your page that are served from your domain but redirect to other sites.  E.g. http://your-site.com/redirector?target=<some_other_url> that 302s to some_other_url.  I believe Google is counting the foreign site load times as part of the linking domain’s performance.  I’ll have concrete numbers on that in a few weeks.
  • Continuously monitor and measure your web-site performance over typical user bandwidths, for example using Keynote KITE.  Optimizing for your office LAN and a low-end DSL connection are two different propositions.
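The redirector concern in particular is easy to observe.  As an illustrative sketch (the URLs are hypothetical, and this uses only Python’s standard library), the following walks a redirect chain hop by hop and reports the latency each hop adds before the final page even begins to load:

```python
import time
import http.client
from urllib.parse import urlparse, urljoin

def redirect_hops(url, max_hops=5):
    """Return a list of (url, status, seconds) for each hop in a redirect chain."""
    hops = []
    for _ in range(max_hops):
        parts = urlparse(url)
        conn_cls = (http.client.HTTPSConnection
                    if parts.scheme == "https" else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc)
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        start = time.perf_counter()
        conn.request("GET", path)
        resp = conn.getresponse()
        resp.read()                    # drain the body before timing the hop
        elapsed = time.perf_counter() - start
        hops.append((url, resp.status, elapsed))
        if resp.status in (301, 302, 303, 307, 308):
            url = urljoin(url, resp.getheader("Location"))
            conn.close()
            continue                   # follow the redirect to the next hop
        conn.close()
        return hops
    return hops
```

Summing the per-hop times makes it obvious how much a same-domain redirector inflates the perceived load time of the page it links to.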

So to conclude… If you’re already focused on site performance, you don’t really have much to worry about.  Keep optimizing pages for your real end users on real bandwidths and continuously monitor your site’s performance.

Velocity 2010 – The Measurable Value of Performance By Design

I’ll be speaking at Velocity 2010 on the topic of “The Measurable Value of Performance By Design”.  Last year Shopzilla talked about “You get what you measure”.  This will be a follow-up covering the additional work we’ve done to try and maintain performance while rapidly adding features and experimenting on our site.  In a nutshell, we took our eye off the ball, something we claimed was a bad idea.  And it was.  I’m going to talk about:

  • Why we took our eye off the ball
  • The real financial costs of slowing down
  • The specific techniques we applied to make our sites fast again
  • The process and technology framework we put in place to ensure this never happens again

There are a lot of interesting-looking talks this year.

If you work on building or operating web applications (or both!) you really should consider attending Velocity.


Performance By Design – TSSJS Edition

I’ll be at The Server Side Java Symposium next week – March 17th through 19th – in Las Vegas.  Shopzilla are sponsoring the event.  I’m going to present a more tech-oriented (and much-distilled) version of my Performance By Design presentation.  We have some new content too which has come out of some of the more recent performance work we’ve done.

Performance By Design

I was fortunate to get the opportunity to lead a team that re-engineered our bizrate.com and shopzilla.com websites from scratch on a new technology stack.  One of the driving forces was to improve the performance of our sites.  We knew that in order to make performance a first-order priority we had to design it into the architecture of the site.

We first began speaking about the bottom line benefits at Velocity 2009.  Since then, we wanted to share some of the technical details about how we built our new site infrastructure and some of the techniques we used to measure then improve performance.

I got the chance to deliver a presentation at a number of Southern California Java Users Groups.  We’ve chosen to share it here: Shopzilla – Performance By Design

Some other useful references for building high-performance sites are worth exploring.

Something else we’ve learned is that if you take your eye off the ball, performance will regress.  As we’ve spent the last year adding more and more features to our sites, we’ve given back some of our performance gains.  We recently embarked on a project to get some of that back and add more automated performance measurement.  Stay tuned to our Shopzilla Tech Blog for more info.