YUI Theater — Douglas Crockford: "Ajax Performance"

By YUI Team, December 23rd, 2008

Douglas Crockford returns to YUI Theater with another chapter in his evolving lecture series. This session, “Ajax Performance,” debunks common misconceptions about the relationship between JavaScript and performance and gives engineers a core focus for improving the performance of web apps: reduce the value of n. Because DOM interactions are generally slow, Douglas argues that leveraging Ajax to reduce the number of DOM operations is often the most important optimization you can make. In fact, it usually dwarfs other techniques in terms of its impact on the actual experience of using a website.

This talk joins an extensive library of Douglas’s lectures now available on YUI Theater, including his popular series on JavaScript.

Douglas Crockford: "Ajax Performance" @ Yahoo! Video

download (m4v)

In Case You Missed…

Some other recent videos from the YUI Theater series:

Subscribing to YUI Theater:


  1. The YUI blog RSS feed keeps attaching the wrong files for these YUI Theater posts. kloots-aria.m4v was attached to this instead of crockford-performance.m4v…
    Also, the link to crockford-performance.m4v seems broken.

  2. The video download link isn’t working. It’s returning 404.

  3. I think the download link is broken. Can you please fix?

  4. Sorry guys — the download link is being fixed. Check back in 10 minutes. -Eric

  5. Good presentation, informative stuff.

    I worked on the redesign of Yahoo! Photos, and had pinged Doug and some other JS folks at Yahoo! about “the wall” issue which was killing IE 6’s performance as the working dataset grew. It is probably the most unusual browser issue I have ever had to debug.

    The problem stems from having a large number of DOM nodes, JavaScript objects, and (to a lesser extent) JavaScript-to-DOM references (think circular references) and event handlers active and assigned; the browser appears to go into overtime trying to manage everything, and all operations (including JS loops through DOM node collections, from what I recall) slow to a crawl.

    Doug’s comments on how performance degraded with each page are spot on. I had implemented destructor methods on all of the main photo-related JavaScript objects (i.e., a photo item which has a thumbnail, selected state and so on) with resource management in mind from the beginning. So when moving from page 1 to page 2, all of the “old” event handlers (we were using event delegation for many things as well) were detached, the DOM nodes were removed, and the JavaScript objects themselves were destroyed, nulled out and deleted.

    Originally I had thought that simply removing the nodes from the DOM (via removeChild() and storing them in a JS object) and keeping the “old” photo objects active in memory would be fine; and it was, in all browsers except IE 6.

    By process of elimination I found that I had to remove DOM nodes, event handlers and destroy objects to prevent hitting “the wall.” Doing one or two of these things would help, but would not prevent the slowdown issue we were seeing.

    We had to be careful about circular references creating memory leaks in IE 6 in particular, so this is where the destructor methods came in handy. (YUI’s DOM/Event components did not exist when the photos redesign project began in 2005, though we did implement the connection object for our XHR stuff.)
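    For what it’s worth, the destructor pattern looked roughly like the sketch below. The `PhotoItem` name, its fields, and the handler wiring are all illustrative; the actual Yahoo! Photos code isn’t public, and the point is just the three cleanup steps: detach handlers, remove the node, and null out references to break the JS↔DOM cycle that leaked in IE 6.

    ```javascript
    // Illustrative sketch of the destructor pattern described above.
    function PhotoItem(node, data) {
      this.node = node;        // DOM node for the thumbnail
      this.data = data;        // lightweight record: { id, name, thumbUrl, ... }
      this.selected = false;
      // The handler closes over `this`, and the node holds the handler:
      // a JS -> DOM -> JS cycle, which is exactly what leaked in IE 6.
      this.onClick = function () { this.selected = !this.selected; }.bind(this);
      this.node.addEventListener('click', this.onClick);
    }

    PhotoItem.prototype.destroy = function () {
      // 1. Detach event handlers.
      this.node.removeEventListener('click', this.onClick);
      // 2. Remove the node from the document.
      if (this.node.parentNode) {
        this.node.parentNode.removeChild(this.node);
      }
      // 3. Null out references to break the JS <-> DOM cycle.
      this.node = null;
      this.onClick = null;
      this.data = null;
    };
    ```

    Doing all three steps on every page transition is what finally kept us from hitting “the wall”; any one of them alone only delayed it.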

    Ultimately we ended up throwing away all of the objects, DOM nodes and event handlers, and kept only the JSON data for the given page of photos (photo ID, name, thumbnail URL etc.) in memory. When the user returned to that page, we would reconstruct the photo objects from that JSON data, saving the API call. While this was theoretically slower than simply switching a pointer to the currently active page of objects and swapping in the related DOM nodes with a single appendChild() call, it was still quite fast.
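    The “keep the JSON, rebuild the objects” approach can be sketched as below. The cache shape, `fetchPage`, and `buildPhotoItem` are illustrative assumptions (and `fetchPage` is synchronous for brevity; the real thing was an XHR); the idea is that only lightweight records survive across pages, and the heavyweight objects and DOM nodes are reconstructed on each return visit.

    ```javascript
    // Sketch of caching per-page JSON and rebuilding photo objects on demand.
    var pageCache = {};  // page number -> array of lightweight photo records

    function showPage(pageNum, fetchPage, buildPhotoItem) {
      var records = pageCache[pageNum];
      if (!records) {
        // First visit: hit the API and remember only the lightweight JSON.
        records = fetchPage(pageNum);   // e.g. [{ id, name, thumbUrl }, ...]
        pageCache[pageNum] = records;
      }
      // Rebuild the heavyweight objects (and their DOM nodes) from JSON
      // every time, rather than keeping them alive across pages.
      return records.map(buildPhotoItem);
    }
    ```

    Returning to a cached page skips the API call but still pays the (fast) reconstruction cost, which is the trade-off described above.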