Performance Research, Part 1: What the 80/20 Rule Tells Us about Reducing HTTP Requests

By YUI Team | November 28th, 2006

This is the first in a series of articles describing experiments conducted to learn more about optimizing web page performance. You may be wondering why you’re reading a performance article on the YUI Blog. It turns out that most of a web page’s performance is affected by front-end engineering, that is, by the user interface design and development.

It’s no secret that users prefer faster web sites. I work on a dedicated team focused on quantifying and improving the performance of Yahoo! products worldwide. As part of our work, we conduct experiments related to web page performance. We are sharing our findings so that other front-end engineers can join us in accelerating the user experience on the web.

The 80/20 Performance Rule

Vilfredo Pareto, an economist in the early 1900s, made the famous observation that 80% of the nation’s wealth belonged to 20% of the population. This was later generalized into what’s commonly referred to as the Pareto principle (also known as the 80/20 rule), which states that for many phenomena, 80% of the consequences come from 20% of the causes. We see the same pattern in software engineering, where 80% of the time is spent in only 20% of the code, so when we optimize our applications, we know to focus on that 20%. The same technique should be applied when optimizing web pages. Most performance optimizations today are made on the parts that generate the HTML document (Apache, C++, databases, etc.), but those parts contribute only about 20% of the user’s response time. It’s better to focus on optimizing the parts that contribute to the other 80%.

Using a packet sniffer, we can discover what takes place in that other 80%. Figure 1 is a graphical view of where the time is spent loading http://www.yahoo.com with an empty cache. Each bar represents a specific component and is shown in the order started by the browser. The first bar is the time spent retrieving just the HTML document. Notice that only 10% of the total time is spent here, covering the browser’s request for the HTML page and Apache stitching together the HTML and returning the response. The other 90% of the time is spent fetching the other components in the page, including images, scripts, and stylesheets.

Figure 1. Loading http://www.yahoo.com
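To make that breakdown concrete, here is a minimal sketch of the same measurement in Python rather than a packet sniffer: time the fetch of the HTML document, then time fetching each component it references, and compare the two shares. It fetches components serially, so the absolute numbers will differ from a browser’s parallel behavior, and the page URL is only an example.

    # A rough approximation of the Figure 1 breakdown: how much of the total
    # time goes to the HTML document versus the components it references.
    import time
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    PAGE = "http://www.yahoo.com/"  # any page you want to profile

    class ComponentFinder(HTMLParser):
        """Collects the URLs of images, scripts, and stylesheets."""
        def __init__(self):
            super().__init__()
            self.urls = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "img" and "src" in attrs:
                self.urls.append(attrs["src"])
            elif tag == "script" and "src" in attrs:
                self.urls.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
                self.urls.append(attrs["href"])

    def timed_fetch(url):
        """Return the seconds spent downloading one URL (0 on failure)."""
        start = time.perf_counter()
        try:
            with urlopen(url, timeout=10) as resp:
                resp.read()
        except OSError:
            return 0.0
        return time.perf_counter() - start

    # 1. Time the HTML document itself.
    start = time.perf_counter()
    with urlopen(PAGE, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    html_time = time.perf_counter() - start

    # 2. Time every component, fetched serially for simplicity.
    finder = ComponentFinder()
    finder.feed(html)
    component_time = sum(timed_fetch(urljoin(PAGE, u)) for u in finder.urls)

    total = html_time + component_time
    print(f"HTML document: {html_time:.2f}s ({100 * html_time / total:.0f}%)")
    print(f"{len(finder.urls)} components: {component_time:.2f}s "
          f"({100 * component_time / total:.0f}%)")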

Table 1 shows that popular web sites spend between 5% and 38% of the time downloading the HTML document. The other 62% to 95% of the time is spent making HTTP requests to fetch all the components referenced in that HTML document (i.e., images, scripts, and stylesheets). The impact of having many components in the page is exacerbated by the fact that browsers download only two or four components in parallel per hostname, depending on the HTTP version of the response and the user’s browser. Our experience shows that reducing the number of HTTP requests has the biggest impact on reducing response time and is often the easiest performance improvement to make.
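A back-of-envelope model (an illustration, not a formula from the article) shows why the request count dominates: with a fixed number of parallel connections per hostname, a page needs roughly ceil(requests / connections) rounds of fetches, each costing at least a round trip. The timing constants below are assumed values.

    # Crude lower bound on load time under a per-hostname parallelism limit.
    import math

    def load_estimate(num_components, parallel_per_host, hosts=1,
                      rtt=0.1, html_time=0.2):
        """Each batch of parallel fetches costs roughly one round trip."""
        slots = parallel_per_host * hosts
        rounds = math.ceil(num_components / slots)
        return html_time + rounds * rtt

    # 40 components on one hostname over HTTP/1.1 (2 parallel connections):
    print(f"{load_estimate(40, parallel_per_host=2):.1f}s")  # 2.2s
    # The same content combined into 10 requests:
    print(f"{load_estimate(10, parallel_per_host=2):.1f}s")  # 0.7s

Halving or quartering the request count shrinks the dominant term directly, which is why combining components pays off faster than shaving time from the HTML generation step.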

Shouldn’t everything be saved in the browser’s cache anyway?

Even so, the conclusion is the same: reducing the number of HTTP requests has the biggest impact on reducing response time and is often the easiest performance improvement to make. In the next article, we’ll look at the impact of caching and some surprising real-world findings.

Disclaimer: Design imperatives that call for visual richness need to be weighed against this request-reduction goal. When you need visual richness, additional steps can be taken, such as aggregating JavaScript files and using CSS sprites, but visual richness does tend to run counter to a slender HTTP request pipeline.
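As one concrete example of the aggregation step just mentioned, the sketch below concatenates several JavaScript files into a single bundle so the page makes one script request instead of several. The file names are hypothetical.

    # Combine several script files into one bundle to save HTTP requests.
    from pathlib import Path

    SOURCES = ["event.js", "dom.js", "animation.js"]  # hypothetical file names
    BUNDLE = Path("combined.js")

    with BUNDLE.open("w", encoding="utf-8") as out:
        for name in SOURCES:
            out.write(f"/* --- {name} --- */\n")  # marker to ease debugging
            out.write(Path(name).read_text(encoding="utf-8"))
            out.write("\n;\n")  # defensive semicolon between files

    print(f"Wrote {BUNDLE} from {len(SOURCES)} source files")

The page then references combined.js once instead of three separate script tags; CSS sprites apply the same idea to images by packing many small images into one file.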

Comments


  1. I think the next release of Firebug (an extension for Firefox) will include a feature where you can see the requests made by the browser. I think you will have to pay for it, though; not such a bad thing, because it’s something you can’t develop without.

  2. I’m looking for an explanation of how to improve performance for a website that uses an external image server to host all its images.

    Has anyone done any detailed analysis of website performance with respect to an external server versus the same server?

  3. Tenni Theurer said:
    March 21, 2008 at 1:49 pm

    @SEO: It’s best to use a CDN to host your images rather than your own web server. This moves the static content closer to your users, thus improving performance for your users. Take a look at Rule 2: Use a CDN. Hosting assets on different domains has the added advantage of increasing the number of parallel downloads by the browser, as discussed in Performance Research Part 4. Stoyan Stefanov has written a great article that touches on these points as well. (See the sketch after the comments for an illustration of splitting assets across hostnames.)

  4. Hi,

    For lack of a better place to post this, I wonder if you would answer this for me…

    I have two websites (site1.com and site2.com), and both use the same main.css hosted at site3.com.

    A visitor browses through site1.com (the user agent downloads and caches main.css) and then visits site2.com. Does the CSS get downloaded a second time, or does the main.css already in the cache get used?

    I hope that makes sense.

    Thanks.

  5. …and how do I get to the next article?

  6. Sites can provide two versions of their most important web pages: one a vanilla text version, and the other with images, CSS, and scripts. Users who want quick download times can use the vanilla text version. It would be interesting to study download times for Apple, as they use a lot of graphics.

  7. Chris Merrill said:
    January 13, 2009 at 7:05 am

    I thought you might want to know that this blog content has been copied, without attribution, on this blog:

    http://web-performance-research.blogspot.com/2008/05/performance-research-part-1-what-8020.html

    They have copied some of our research as well, and we have requested that they take it down. I suggest you do the same.

    Chris

  8. Thanks for the nice article.
    We can use HttpWatch in IE to examine response times. Fiddler is another such tool, but Fiddler works as a proxy.

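A minimal sketch of the hostname-splitting idea from Tenni Theurer’s reply above: give each asset a stable hostname drawn from a small pool, so the browser’s per-hostname connection limit applies to each pool member separately, while every asset still caches under a single, consistent URL. The hostnames here are hypothetical.

    # Spread asset URLs across a few hostnames to raise parallel downloads.
    import hashlib

    ASSET_HOSTS = ["img1.example.com", "img2.example.com"]  # hypothetical

    def shard(path):
        """Pick a stable host for a path so it always caches under one URL."""
        digest = hashlib.md5(path.encode("utf-8")).hexdigest()
        host = ASSET_HOSTS[int(digest, 16) % len(ASSET_HOSTS)]
        return f"http://{host}{path}"

    print(shard("/images/logo.gif"))
    print(shard("/images/sprite.png"))

Hashing on the path, rather than picking hosts round-robin, matters here: a given image always maps to the same hostname, so it is never re-downloaded under a second URL.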