
High Performance Web Sites

Michael J. Ross writes "Every Internet user's impression of a Web site is greatly affected by how quickly that site's pages are presented, relative to his or her expectations — regardless of whether the connection is broadband or narrowband. Web developers often assume that most page-loading performance problems originate on the back-end, and thus that there is little to be gained by optimizing the front-end, i.e., what happens directly in the visitor's browser. But Steve Souders, head of site performance at Yahoo, argues otherwise in his book, High Performance Web Sites: Essential Knowledge for Frontend Engineers." Read on for the rest of Michael's review.
High Performance Web Sites
author: Steve Souders
pages: 168
publisher: O'Reilly Media
rating: 9/10
reviewer: Michael J. Ross
ISBN: 0596529309
summary: 14 rules for faster Web pages
The typical Web developer — particularly one well-versed in database programming — might believe that the bulk of a Web page's response time is consumed in delivering the HTML document from the Web server, and in performing other back-end tasks, such as querying a database for the values presented in the page. But the author quantitatively demonstrates that — at least for what are arguably the top 10 sites — less than 20 percent of the total response time is spent downloading the HTML document. Consequently, more than 80 percent of the response time is spent on front-end processing — specifically, downloading all of the components other than the HTML document itself. In turn, cutting that front-end load in half would improve the total response time by more than 40 percent. At first glance, this may seem insignificant, given that the typical Web page appears within a few seconds, or even deciseconds, over broadband. But delays, even fractions of a second, accumulate and erode the user's satisfaction. Improved site performance thus benefits not only the visitor, through faster page loading, but also the site owner, through reduced bandwidth costs and happier visitors.
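
To make that 80/20 split concrete, one rough way to approximate the front-end portion of a page's load time is a pair of in-page timestamps. The snippet below is a do-it-yourself sketch rather than anything prescribed by the book, and the "/timing" beacon URL is hypothetical:

    // Record a timestamp as early in the <head> as possible.
    var pageStart = new Date().getTime();

    // Measure again once every component has downloaded and rendered.
    window.onload = function () {
        var frontEndMs = new Date().getTime() - pageStart;
        // Report the result with a simple image beacon
        // ("/timing" is a hypothetical logging URL on your own server).
        var beacon = new Image();
        beacon.src = '/timing?frontend=' + frontEndMs;
    };

Because the first timestamp is taken only after the HTML has begun to arrive, the difference largely excludes back-end time and approximates the front-end share the author describes.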

Creators and maintainers of Web sites of all sizes should thus take a strong interest in the advice provided by the "Chief Performance Yahoo!," in the 14 rules for improving Web site performance that he has learned in the trenches. High Performance Web Sites was published on 11 September 2007 by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307. As it does for all of its titles, the publisher provides a page for the book, where visitors can purchase or register a copy, or read online versions of its table of contents, index, and a sample chapter, "Rule 4: Gzip Components" (Chapter 4), as a PDF file. In addition, visitors can read or contribute reviews of the book, as well as errata — of which there are none as of this writing. O'Reilly's site also hosts a video titled "High Performance Web Sites: 14 Rules for Faster Pages," in which the author discusses his site performance best practices.

The bulk of the book's information is contained in 14 chapters, each corresponding to one of the performance rules. Preceding this material are two chapters on the importance of front-end performance and an overview of HTTP; together they form a well-chosen springboard for launching into the performance rules. In a final chapter, "Deconstructing 10 Top Sites," the author analyzes the performance of 10 major Web sites, including that of his own employer, Yahoo, to provide real-world examples of how implementing his performance rules could make a dramatic difference in those sites' response times. These test results and his analysis are preceded by a discussion of page weight, response times, and YSlow grading, and details on how he performed the testing. Naturally, by the time a reader visits those sites and checks their performance, the owners may have fixed most if not all of the problems pointed out by Steve Souders. If they have not, then they have no excuse, if only because of the publication of this book.

Each chapter begins with a brief introduction to whatever particular performance problem is addressed by that chapter's rule. Subsequent sections provide more technical detail, including the extent of the problem found on the previously mentioned 10 top Web sites. The author then explains how the rule in question solves the problem, with test results to back up the claims. For some of the rules, alternative solutions are presented, as well as the pros and cons of implementing his suggestions. For instance, in his coverage of JavaScript minification, he examines the potential downsides to this practice, including increased code maintenance costs. Every chapter ends with a restatement of the rule.
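
As a toy illustration of that trade-off (my own example, not one taken from the book), minification strips comments and whitespace and shortens local names, which shrinks the download but leaves served code that is harder to read and debug:

    // Readable source, kept in version control:
    function calculateTotal(prices, taxRate) {
        var total = 0;
        for (var i = 0; i < prices.length; i++) {
            total += prices[i];
        }
        return total * (1 + taxRate);
    }

    // What a minifier might ship to the browser instead:
    function calculateTotal(a,b){var c=0;for(var d=0;d<a.length;d++){c+=a[d]}return c*(1+b)}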

The book is a quick read compared to most technical books, and not just because of its relatively small size (168 pages) but also because of the writing style. Admittedly, this may be partly the result of O'Reilly's in-house (and perhaps outside) editors — oftentimes the unsung heroes of publishing. This book is also valuable in that it offers the candid perspective of a Web performance expert who never loses sight of the importance of the end-user experience. (My favorite phrase in the book, on page 38, is: "...the HTML page is the progress indicator.")

The ease of implementing the rules varies greatly. Most developers would have no difficulty putting into practice the admonition to make CSS and JavaScript files external, but would likely find it far more challenging, for instance, to use a content delivery network, if their budget puts it out of reach. In fact, differences in difficulty levels will be most apparent to the reader when he or she finishes Chapter 1 (on making fewer HTTP requests, which is straightforward) and begins reading Chapter 2 (content delivery networks).

In the book's final chapter, Steve Souders critiques the top 10 sites used as examples throughout the book, evaluating them for performance and specifically how they could improve that through the implementation of his 14 rules. In critiquing the Web site of his employer, he apparently pulls no punches — though few are needed, because the site ranks high in performance versus the others, as does Google. Such objectivity is appreciated.

For Web developers who would like to test the performance of the Web sites for which they are responsible, the author mentions in his final chapter the five primary tools that he used for evaluating the top 10 Web sites for the book, and, presumably, used for the work that he and his team do at Yahoo. These include YSlow, a tool that he created himself. Also, in Chapter 5, he briefly mentions another of his tools, sleep.cgi, a freely available Perl script that tests how delayed components affect Web pages.

As with any book, this one is not perfect. In Chapter 1, the author could make clearer the distinction between function and file modularization, as otherwise his discussion could confuse inexperienced programmers. In Chapter 10, the author explores the gains to be made from minifying JavaScript code, but fails to do the same for HTML files, or even to explain the absence of this coverage — though he does briefly discuss minifying CSS. Lastly, the redundant restatement of each rule at the end of its chapter could be eliminated — if only in keeping with the spirit of improving performance and efficiency by reducing reader workload.

Yet these weaknesses are inconsequential and easily fixable. The author's core ideas are clearly explained; the performance improvements are demonstrated; the book's production is excellent. High Performance Web Sites is highly recommended to all Web developers seriously interested in improving their site visitors' experiences.

Michael J. Ross is a Web developer, freelance writer, and the editor of PristinePlanet.com's free newsletter.

You can purchase High Performance Web Sites from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
This discussion has been archived. No new comments can be posted.

  • by MichaelCrawford ( 610140 ) on Wednesday October 10, 2007 @02:57PM (#20930421) Homepage Journal
    Why?

    All my pages are static HTML. Not a web application in sight, not even PHP. Yes, it's a drag when I need to do some kind of sitewide update, like adding a navigation item.

    I also have less to worry about with security: as long as my hosting service keeps their patches up to date, I know I haven't introduced any holes myself.

    Also, for the most part, my pages are very light on graphics, with most of the graphics present being repeated on every page such as my site's logo, which gets cached.

    Finally, all my pages are XHTML 1.0 Strict with CSS, with the CSS being provided by a single sitewide stylesheet. This means less HTML text to transfer compared to formatting with HTML tags.

  • Solution (Score:5, Interesting)

    by dsginter ( 104154 ) on Wednesday October 10, 2007 @03:09PM (#20930597)
    All my pages are static HTML. Not a web application in sight, not even PHP.

    This is a great point, but here is my anecdotal experience:

    Years ago, I tested static HTML vs. PHP by benchmarking a simple document (I used the text of the GPL license). On that particular box, I was able to serve over 400 pages per second with static HTML but only about 12 pages per second with PHP. I was blown away. I went one step further and used PHP to fetch the data from Oracle (OCI8, IIRC), and that went down to 3 requests/sec. You can see that caching does help, but not a whole lot.

    So, rather than whine about it, what is the solution?

    AJAX, done properly, will solve the problem. Basically, instead of serving dynamic pages with PHP, JSP, ASP or whatever... just serve an AJAX client (which is served in a speedy manner with no server side processing to bog things down). This client loads in the browser and fetches a static XML document from the server and then uses the viewer's browser to generate the page - so everything thrown down by the server is static and all processing is done on the client side.

    Now, to facilitate a dynamic website (e.g., a message board, journal, or whatever), you have to generate the XML file upon insert (inserts are generally a small fraction of the read load), using a trigger or code embedded in the application.

    Voilà! Static performance with dynamic content, using browser-side processing.
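
    A minimal sketch of that pattern in circa-2007 JavaScript follows; the XML path, element names, and element id are hypothetical, and error handling is omitted:

        // Serve only static files; the browser fetches a pre-generated XML
        // document and builds the page itself.
        function fetchStaticXml(url, callback) {
            var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                            : new ActiveXObject('Microsoft.XMLHTTP');
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4 && xhr.status === 200) {
                    callback(xhr.responseXML);
                }
            };
            xhr.open('GET', url, true);
            xhr.send(null);
        }

        // "/data/messages.xml" is the static document regenerated on the
        // server whenever a new message is inserted, as described above.
        fetchStaticXml('/data/messages.xml', function (doc) {
            var items = doc.getElementsByTagName('message');
            var html = '';
            for (var i = 0; i < items.length; i++) {
                html += '<p>' + items[i].firstChild.nodeValue + '</p>';
            }
            document.getElementById('content').innerHTML = html;
        });
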
  • by Josef Meixner ( 1020161 ) on Wednesday October 10, 2007 @03:16PM (#20930723) Homepage

    I guess I am not alone in noticing that the ads on a page often drag the load time way down. I find it interesting that there is no rule about minimizing content pulled in from other servers over which you have little or no control. A blind spot because of Yahoo's business, I guess.

  • by Cyberskin ( 1171659 ) on Wednesday October 10, 2007 @03:17PM (#20930739)
    Rule #34 is all about how slow loading the Java plugin is for any browser. It's always been slow; it was supposed to improve with version 6, and really it's still slow. The main problem is that NOTHING shows up until the plugin gets loaded. My solution was two-fold: write the object tag in JavaScript (which conveniently allows the rest of the HTML in the page to load and display, and also eliminates the IE problem of having to "click to activate" the applet), and create an animated-GIF loading-screen div, which I hide when the applet finishes loading. (I ended up loading the applet at the bottom of the page, below the watermark, because otherwise I couldn't catch the finished-loading event; I just made the loading screen match the page background, so the only way you could tell anything was going on was that the scrollbar on the right changed size.) Not exactly elegant, but it was better than the blank screen you get waiting for the plugin to load, and it provided a nice custom animated loading GIF instead of the default applet loading logo.
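
    A rough sketch of that work-around; the class name, element id, and the appletReady() hook are hypothetical and depend on how the applet signals that it has finished loading:

        // Writing the applet markup from script lets the surrounding HTML
        // render first and avoids IE's "click to activate" prompt.
        document.write(
            '<div id="loadingScreen"><img src="loading.gif" alt="Loading..."></div>' +
            '<applet id="viewer" code="Viewer.class" archive="viewer.jar"' +
            ' width="600" height="400" mayscript="true"></applet>'
        );

        // Called by the applet itself (e.g. via LiveConnect from its start()
        // method) once it has finished loading.
        function appletReady() {
            document.getElementById('loadingScreen').style.display = 'none';
        }
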
  • by zestyping ( 928433 ) on Wednesday October 10, 2007 @03:23PM (#20930793) Homepage
    The title of the book should be "High Speed Web Sites" or just "Fast Web Sites."

    "Performance" is not a general-purpose synonym for "speed." "Performance" is a much more general term; it can refer to memory utilization, fault tolerance, uptime, accuracy, low error rate, user productivity, user satisfaction, throughput, and many other things. A lot of people like to say "performance" just because it's a longer word and it makes them sound smart. But this habit just makes them sound fake -- and more importantly, it encourages people to ignore all the other factors that make up the bigger picture. This book is all about speed, and the title should reflect that.

    So, I beg you: resist the pull of unnecessary jargon. The next time you are about to call something "performance," stop and think; if there's a simpler or more precise word for what you really mean, use it.

    Thanks for listening!
  • by mr_mischief ( 456295 ) on Wednesday October 10, 2007 @03:41PM (#20931147) Journal
    The new interface is a joke for performance compared to the old server-generated HTML one. Sure, they might be saving some hardware resources, but it's slow, and the message bodies are the bulk of the data anyway. The main transfers they cut out using JavaScript and dynamic loading seem to be updates to the message list when you delete a bunch of spam. That would be better handled by putting it in the spam folder where it belongs. OTOH, I often delete non-spam messages without reading them, since I subscribe to a few legit mailing lists from my Yahoo address but don't want to read every message.
  • by PHPfanboy ( 841183 ) on Wednesday October 10, 2007 @03:42PM (#20931159)
    From my observations of web developers in the wild, this does seem to be scientifically true. Actually, most web developers are so overloaded with projects that even if they did give a shit, they simply don't have time to benchmark, test, and optimize properly.

    It's not an excuse; it's just that teams are so fluid, project work is so chaotic, and project management is so driven by marketing considerations (read: "get it out," not "enterprise stability") that site performance is seen as a server hardware issue.

    It's a shame, really; in these times of green awareness, I hate to think how much wattage is being wasted on processor cycles and disk I/O when a simple caching strategy and some site optimization could do so much through software and memory management.
  • by schoett ( 98398 ) on Thursday October 11, 2007 @06:13AM (#20937641)

    Firefox 2 and IE7 do indeed begin to display gzip-compressed pages while they are still loading over the net. The method used to verify this was to insert a local Squid cache that uses "delay pools" to limit transfer bandwidth and that records time and duration of all network transfers made. Using this method, I could see that a lengthy compressed HTML page was transferred in 14 seconds, and the content became visible in IE7 after 6 seconds and finished loading after 18 seconds.

    If you have a physically slow connection, you can probably do the same experiment with the Firebug plugin for Firefox (which shows you the loading time of the page elements) and a stop watch to time the browser display behaviour.

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...