
High Performance Web Sites

Michael J. Ross writes "Every Internet user's impression of a Web site is greatly affected by how quickly that site's pages are presented, relative to the user's expectations — regardless of whether the connection is broadband or narrowband. Web developers often assume that most page-loading performance problems originate on the back-end, and thus that little can be gained by tuning the front-end, i.e., what happens directly in the visitor's browser. But Steve Souders, head of site performance at Yahoo, argues otherwise in his book, High Performance Web Sites: Essential Knowledge for Frontend Engineers." Read on for the rest of Michael's review.
High Performance Web Sites
author: Steve Souders
pages: 168
publisher: O'Reilly Media
rating: 9/10
reviewer: Michael J. Ross
ISBN: 0596529309
summary: 14 rules for faster Web pages
The typical Web developer — particularly one well-versed in database programming — might believe that the bulk of a Web page's response time is consumed in delivering the HTML document from the Web server, and in performing other back-end tasks, such as querying a database for the values presented in the page. But the author quantitatively demonstrates that — at least for what are arguably the top 10 sites — less than 20 percent of the total response time is spent downloading the HTML document. Consequently, more than 80 percent of the response time goes to front-end processing — specifically, downloading all of the components other than the HTML document itself. In turn, cutting that front-end load in half would improve the total response time by more than 40 percent. At first glance, such gains may seem insignificant, given how few seconds, or even deciseconds, it takes for the typical Web page to appear over broadband. But delays, even of a fraction of a second, accumulate and erode the user's satisfaction. Moreover, improved site performance benefits not only the site visitor, in the form of faster page loading, but also the site owner, in the form of reduced bandwidth costs and happier visitors.
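To make that arithmetic concrete, here is a back-of-the-envelope illustration in JavaScript (the five-second figure is hypothetical, not a measurement from the book):

    // Back-of-the-envelope illustration of the book's 80/20 observation,
    // using a hypothetical five-second page load.
    var total = 5.0;                  // seconds for the whole page
    var backEnd = 0.2 * total;        // 1.0 s: generating/delivering the HTML
    var frontEnd = 0.8 * total;       // 4.0 s: images, scripts, stylesheets

    // Halving the front-end work cuts the total by 40 percent.
    var improved = backEnd + frontEnd / 2;              // 3.0 s
    var savings = ((total - improved) / total) * 100;   // 40
    console.log('new total: ' + improved + 's (' + savings + '% faster)');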

Creators and maintainers of Web sites of all sizes should thus take a strong interest in the advice provided by "Chief Performance Yahoo!," in the 14 rules for improving Web site performance that he has learned in the trenches. High Performance Web Sites was published on 11 September 2007, by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307. As with all of their other titles, the publisher provides a page for the book, where visitors can purchase or register a copy of the book, or read online versions of its table of contents, index, and a sample chapter, "Rule 4: Gzip Components" (Chapter 4), as a PDF file. In addition, visitors can read or contribute reviews of the book, as well as errata — of which there are none, as of this writing. O'Reilly's site also hosts a video titled "High Performance Web Sites: 14 Rules for Faster Pages," in which the author talks about his site performance best practices.
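To give a feel for what the sample chapter's rule involves, here is a small sketch (my own Node.js illustration, not code from the book): compress a repetitive chunk of markup and compare byte counts. On the wire, this compression is negotiated via the Accept-Encoding and Content-Encoding HTTP headers.

    // A feel for Rule 4 (gzip components): text-heavy components such as
    // HTML, CSS, and JavaScript routinely shrink dramatically under gzip.
    var zlib = require('zlib');

    var html = '<ul>' +
      new Array(201).join('<li class="item">A repetitive list item</li>') +
      '</ul>';
    var gzipped = zlib.gzipSync(html);

    console.log('original: ' + Buffer.byteLength(html) + ' bytes');
    console.log('gzipped:  ' + gzipped.length + ' bytes');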

The bulk of the book's information is contained in 14 chapters, each corresponding to one of the performance rules. Preceding this material are two chapters on the importance of front-end performance and an overview of HTTP; together they form a well-chosen springboard for launching into the rules. In a final chapter, "Deconstructing 10 Top Sites," the author analyzes the performance of 10 major Web sites, including his employer's, Yahoo, to provide real-world examples of how implementing his rules could make a dramatic difference in those sites' response times. The test results and analysis are preceded by a discussion of page weight, response times, and YSlow grading, along with details on how the testing was performed. Naturally, by the time a reader visits those sites and checks their performance, the owners may have fixed most if not all of the problems pointed out by Steve Souders. If they have not, they have no excuse, if only because of this book's publication.

Each chapter begins with a brief introduction to the particular performance problem addressed by that chapter's rule. Subsequent sections provide more technical detail, including the extent of the problem found on the previously mentioned 10 top Web sites. The author then explains how the rule in question solves the problem, with test results to back up the claims. For some of the rules, alternative solutions are presented, along with the pros and cons of implementing his suggestions. For instance, in his coverage of JavaScript minification, he examines the potential downsides of the practice, including increased code maintenance costs. Every chapter ends with a restatement of the rule.
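For readers unfamiliar with minification, here is a tiny before-and-after example (my own, not the book's):

    // Before minification: readable, commented source.
    function calculateTotal(prices) {
      // Sum the price list, then apply a hypothetical 8% sales tax.
      var total = 0;
      for (var i = 0; i < prices.length; i++) {
        total += prices[i];
      }
      return total * 1.08;
    }

    // After minification: comments and whitespace are stripped (heavier
    // tools also shorten local names). Same behavior, fewer bytes over
    // the wire, but, as the author notes, harder to read and maintain.
    function calculateTotal(n){var t=0;for(var i=0;i<n.length;i++)t+=n[i];return t*1.08}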

The book is a quick read compared to most technical books, not just because of its relatively small size (168 pages) but also because of the writing style. Admittedly, this may be partly the work of O'Reilly's in-house, and perhaps outsourced, editors — oftentimes the unsung heroes of publishing enterprises. The book is also valuable in that it offers the candid perspective of a Web performance expert who never loses sight of the importance of the end-user experience. (My favorite phrase in the book, on page 38: "...the HTML page is the progress indicator.")

The ease of implementing the rules varies greatly, as the sketch below suggests. Most developers would have no difficulty putting into practice the admonition to make CSS and JavaScript files external, but would likely find it far more challenging, for instance, to use a content delivery network, particularly if their budget puts one out of reach. In fact, the differences in difficulty will be most apparent when the reader finishes Chapter 1 (on making fewer HTTP requests, which is straightforward) and begins Chapter 2 (on content delivery networks).
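At the easy end of that spectrum, combining scripts for Rule 1 can be a one-step build script. Here is an illustrative Node.js sketch (the file names are hypothetical):

    // Concatenate several scripts into one file, so the browser makes a
    // single HTTP request instead of four.
    var fs = require('fs');

    var sources = ['menu.js', 'forms.js', 'validation.js', 'tracking.js'];
    var combined = sources.map(function (file) {
      return fs.readFileSync(file, 'utf8');
    }).join('\n;\n'); // defensive semicolon between files

    fs.writeFileSync('site-combined.js', combined);
    // The page then loads one tag: <script src="site-combined.js"></script>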

In the book's final chapter, Steve Souders critiques the 10 top sites used as examples throughout the book, evaluating their performance and, specifically, how it could be improved through the implementation of his 14 rules. In critiquing his employer's site, he apparently pulls no punches — though few are needed, because Yahoo ranks high in performance relative to the others, as does Google. Such objectivity is appreciated.

For Web developers who would like to test the performance of the Web sites for which they are responsible, the author mentions in his final chapter the five primary tools that he used for evaluating the top 10 Web sites for the book, and, presumably, used for the work that he and his team do at Yahoo. These include YSlow, a tool that he created himself. Also, in Chapter 5, he briefly mentions another of his tools, sleep.cgi, a freely available Perl script that tests how delayed components affect Web pages.
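sleep.cgi works by delaying the server's response. To convey the flavor of that kind of testing, here is a browser-side sketch that times a single component (my illustration, not the author's tool; the URL is hypothetical):

    // Time how long one script component takes to arrive and execute.
    var start = new Date().getTime();
    var script = document.createElement('script');
    script.src = 'http://example.com/slow-component.js'; // hypothetical URL
    script.onload = function () {
      var elapsed = new Date().getTime() - start;
      console.log('component loaded in ' + elapsed + ' ms');
    };
    document.getElementsByTagName('head')[0].appendChild(script);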

As with any book, this one is not perfect. In Chapter 1, the author could make clearer the distinction between function and file modularization; otherwise his discussion could confuse inexperienced programmers. In Chapter 10, he explores the gains to be made from minifying JavaScript code, but does not do the same for HTML files, or even explain the omission — though he does briefly discuss minifying CSS. Lastly, the redundant restatement of the rules at the end of every chapter could be eliminated — if only in keeping with the spirit of improving performance and efficiency by reducing the reader's workload.

Yet these weaknesses are inconsequential and easily fixable. The author's core ideas are clearly explained; the performance improvements are demonstrated; the book's production is excellent. High Performance Web Sites is highly recommended to all Web developers seriously interested in improving their site visitors' experiences.

Michael J. Ross is a Web developer, freelance writer, and the editor of PristinePlanet.com's free newsletter.

You can purchase High Performance Web Sites from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
  • by perlwolf ( 903757 ) on Wednesday October 10, 2007 @03:29PM (#20930905) Homepage
    Your music page [geometricvisions.com] fails the XHTML W3C check, though it says it's XHTML compliant.
  • Re:Solution (Score:5, Informative)

    by chez69 ( 135760 ) on Wednesday October 10, 2007 @04:14PM (#20931605) Homepage Journal
    sounds good, except you may or may not know that a lot of JavaScript implementations are sloooow. not to mention you usually have to set the no-cache headers for everything in the page so your JavaScript works right.

    I find that sites built with the method you describe are the asshole sites that fuck with browser history, disable the back button, try to disable the context menu, and pull those dumb-ass tricks to get around the fact that they don't know how to write proper server-side code.

    There's no reason you can't make a fast server-side site (with Ajax too, one that works without the stupid tricks I described above). If you can't, I suggest you educate yourself, or don't use a Walmart PC for production use.

    I've personally written many J2EE webapps (no EJB BS; Spring, Struts, and JSP/Velocity) that were very fast. With proper coding you can let the browser cache stuff so it doesn't constantly have to refetch crap; when you do this, all you push down to the client is the HTML to render, which browsers are really good at doing quickly.
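    To make the caching point concrete, here is a minimal Node.js sketch (an editorial illustration, not the parent poster's J2EE code; the paths are hypothetical):

        // Serve static components with a far-future cache lifetime so
        // browsers don't refetch them on every page view; keep the HTML
        // itself fresh.
        var http = require('http');

        http.createServer(function (req, res) {
          if (req.url.indexOf('/static/') === 0) {
            res.writeHead(200, {
              'Content-Type': 'application/javascript',
              'Cache-Control': 'public, max-age=31536000' // cache for a year
            });
            res.end('/* long-lived script contents */');
          } else {
            res.writeHead(200, {
              'Content-Type': 'text/html',
              'Cache-Control': 'no-cache' // revalidate the page itself
            });
            res.end('<html><body>freshly rendered page</body></html>');
          }
        }).listen(8080);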
  • ISBN redundancy (Score:3, Informative)

    by merreborn ( 853723 ) on Wednesday October 10, 2007 @04:23PM (#20931741) Journal
    FTFA:

    High Performance Web Sites was published on 11 September 2007, by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307

    There's no need to list both the ISBN-10 and the ISBN-13; ISBN-13 is a superset of ISBN-10. Notice that both numbers contain the exact same nine data digits:
    0596529309
    9780596529307

    The only difference is that the "978" bookland prefix has been prepended, and the check digit has been recalculated (using the EAN/UPC algorithm instead of ISBN's old one). You can give just the ISBN-10 or just the ISBN-13; you can trivially calculate one from the other, and any software that deals with ISBNs should do that for you. E.g., if you search for either one on Amazon, you'll end up at the exact same page.
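    A runnable JavaScript sketch of exactly that conversion (illustrative, not part of the original comment):

        // Prepend "978" to the ISBN-10's nine data digits, then recompute
        // the check digit with the EAN alternating 1/3 weighting.
        function isbn10to13(isbn10) {
          var core = '978' + isbn10.slice(0, 9); // drop the old check digit
          var sum = 0;
          for (var i = 0; i < 12; i++) {
            sum += Number(core.charAt(i)) * (i % 2 === 0 ? 1 : 3);
          }
          return core + ((10 - (sum % 10)) % 10);
        }

        console.log(isbn10to13('0596529309')); // prints 9780596529307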
  • by Anonymous Coward on Wednesday October 10, 2007 @04:40PM (#20931977)
    Uh... most implementations of Ajax are used in conjunction with a server-side programming language of some sort. The only performance boost is that you don't have to reload the entire page... only the part that you need to update. The obvious drawback is that if users don't have JavaScript enabled, you have eliminated those users... or you write a second site to handle users without JavaScript. Ajax can be used to help, but be careful of the suggestions you carelessly throw out there.
  • precompile your HTML (Score:3, Informative)

    by victorvodka ( 597971 ) on Wednesday October 10, 2007 @06:01PM (#20933099) Homepage
    One solution that gives you a dynamic website with the advantages of a database and server-side scripting is to precompile your site to static HTML; you update it by recompiling more HTML. It can be done fairly transparently, with all the actual precompiling happening via automatic scripts. Obviously you can't have a user-login-based site work effectively this way, but for a site of modest dynamics (such as a blog, product catalog, or even some message boards), precompiling to HTML can be a real benefit. You can also precompile pieces of pages, although the benefits are smaller, because includes require a certain amount of back-end processing (unless they are slurped in from the front end using DHTML or whatever).
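    A minimal sketch of the approach just described (illustrative Node.js; the data stands in for a database query, and the file names are hypothetical):

        // Render content rows to flat HTML files at publish time, so the
        // web server only ever serves static pages. Re-run this script
        // whenever the content changes.
        var fs = require('fs');

        var posts = [ // stand-in for "SELECT slug, title, body FROM posts"
          { slug: 'hello-world', title: 'Hello World', body: 'First post.' },
          { slug: 'second-post', title: 'Second Post', body: 'More content.' }
        ];

        posts.forEach(function (post) {
          var html = '<html><head><title>' + post.title + '</title></head>' +
                     '<body><h1>' + post.title + '</h1>' +
                     '<p>' + post.body + '</p></body></html>';
          fs.writeFileSync(post.slug + '.html', html);
        });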

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...