
Even Faster Web Sites

samzenpus posted more than 4 years ago | from the read-all-about-it dept.


Michael J. Ross writes "Slow Web page loading can discourage visitors to a site more than any other problem, regardless of how attractive or feature-rich the given site might be. Consequently, many Web developers hope to achieve faster response times using AJAX (Asynchronous JavaScript and XML), since only portions of an AJAX page need to be reloaded. But for many rich Internet applications (RIAs), such potential performance gains can be lost as a result of non-optimized JavaScript, graphics, and CSS files. Steve Souders — a Web performance expert previously at Yahoo and now with Google — addresses these topics in his second book, Even Faster Web Sites: Performance Best Practices for Web Developers." Read on for the rest of Michael's review.

The book was published by O'Reilly Media on 18 June 2009, under the ISBN 978-0596522308. The publisher makes available a Web page, where visitors can purchase the print and electronic versions of the book (as well as a bundle of the two), read the book online as part of the Safari library service, and check the reported errata — comprising those confirmed by the author (of which there are currently two) and any unconfirmed errors (all six of which are valid, though the fifth one may be a coincidence). In a break with traditional practice among technical publishers nowadays, there is no sample chapter available, as of this writing.

In many ways, this second book is similar to Steve's previous one, High Performance Web Sites: It presents methods of enhancing the performance of websites, with a focus on client-side factors. It is fairly slender (this one is 254 pages), relative to most programming books nowadays, and the material is organized into 14 chapters. However, unlike its predecessor, Even Faster Web Sites emphasizes generally more advanced topics, such as script splitting, coupling, blocking, and chunking (which to non-developers may sound like a list of the more nefarious techniques in professional hockey). This second book also has employed a team approach to authorship, such that six of the chapters are written by contributing authors. In his preface, Steve notes that the 14 chapters are grouped into three broad areas: JavaScript performance (Chapters 1-7), network performance (Chapters 8-12), and browser performance (Chapters 13-14). The book concludes with an appendix in which he presents his favorite tools for performance analysis, organized into four types of applications: packet sniffers, Web development tools, performance analyzers, and some miscellaneous applications.

In the first chapter, "Understanding Ajax Performance," guest author Douglas Crockford briefly describes some of the key trade-offs and principles of optimizing applications, and how JavaScript now plays a pivotal role in that equation — as websites nowadays are designed to operate increasingly like desktop programs. On pages 2 and 3, he uses some figures to illustrate fixed versus variable overhead, and the dangers of attempting to optimize the wrong portions of one's code. By the way, the so-called "axes" are not axes, or even Cartesian grid lines, but simply levels. Aside from its choppy narrative style and a pointless religious reference in the first paragraph, the material serves as a thought-provoking springboard for what follows. Chapter 2, titled "Creating Responsive Web Applications," was written by Ben Galbraith and Dion Almaer, who discuss response times, user perception of them, techniques for measuring latency, browser threads, Web Workers, Google Gears, timers, and memory issues. The material is neatly explained, although Figure 2-2 is quite confusing; moreover, neither of the figures on that page should have been made Mac- and Firefox-specific.
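The latency-measurement techniques Galbraith and Almaer discuss begin with simple instrumentation. As a hedged illustration (the wrapper name and API here are mine, not the chapter's), a function can be wrapped so that every call records how long it blocked:

```javascript
// Hedged sketch of basic latency instrumentation (the wrapper name
// and API are illustrative, not from the book): wrap a function so
// every call records its elapsed time into a log array.
function timed(fn, log) {
  return function () {
    var start = Date.now();
    var result = fn.apply(this, arguments);
    log.push(Date.now() - start); // elapsed milliseconds for this call
    return result;
  };
}
```

Calls that regularly log tens or hundreds of milliseconds are the ones worth moving off the UI thread with Web Workers or breaking up with timers.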

In the subsequent four chapters, Steve dives into the critical topic of how to optimize the performance of JavaScript-heavy pages through better script content and organization — specifically, how and when to split up large scripts into smaller ones, how to load scripts without blocking one another or breaking dependencies within the code, and how to best in-line scripts, when called for. Each of the four chapters follows an effective methodology: First, the author delineates a particular performance mistake made by even some of the most popular websites, with the statistics to back it up. He presents one or more solutions, including any relevant tools, again with waterfall charts illustrating how well the solutions work. Lastly, he explains any browser-specific issues, oftentimes with a handy chart showing which possible method would likely be optimal for the reader's given situation, such as expected browser choices in the site's target audience. When there are potential pitfalls, Steve points them out, with helpful workarounds. He generally provides enough example source code to allow any experienced developer to implement the proposed solutions. Unfortunately, the example code does not appear to be available for download from O'Reilly's website.
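To make the flavor of these techniques concrete, here is a minimal sketch (my own, not code from the book) of one widely used approach to non-blocking script loading: creating a script element dynamically so the browser fetches it in parallel instead of halting the HTML parser. The `doc` parameter stands in for the browser's `document`, which also keeps the helper testable outside a browser:

```javascript
// Minimal sketch (the helper name is mine, not the book's) of
// non-blocking script loading via a dynamically created element.
// `doc` stands in for the browser's `document`.
function loadScriptAsync(doc, src, onload) {
  var script = doc.createElement('script');
  script.src = src;
  script.async = true;                 // hint: do not block HTML parsing
  if (onload) script.onload = onload;  // run dependent code only once loaded
  doc.head.appendChild(script);        // appending triggers the download
  return script;
}
```

In a real page one would call `loadScriptAsync(document, '/js/app.js', init)`, keeping any code that depends on the script inside the `onload` callback so dependencies are not broken.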

The discussion of JavaScript optimization is capped off by the seventh chapter, written by Nicholas C. Zakas, who explains variable scope within JavaScript code, the advantages of choosing local variables as much as possible, scope chain augmentation, the performance ramifications of the four major data types (literal values, variables, arrays, and objects), optimizing flow control statements, and string concatenation. He outlines what sorts of problems can cause the user's Web browser to freeze up, and the differing responses she would see depending upon her chosen browser. Nicholas concludes his chapter by explaining how to utilize timer code to force long-running scripts to yield, in order to avoid these problems. By the way, in Figures 7-2 and 7-3, the data point symbols need to be enlarged so as to be distinguishable; as it is, they are quite difficult to read. More importantly, on page 93, the sentence beginning "This makes array lookup ideal..." is either misworded or mistaken, since array lookup cannot be used for testing inclusion in ranges.
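The yield-with-timers pattern Zakas describes can be sketched roughly as follows (the helper and its API are illustrative, not the book's code): split the long-running work into chunks behind a step function, so that the UI thread is never blocked for long.

```javascript
// Rough sketch of the timer-yielding pattern (helper name and API are
// mine). Each call to step() processes one chunk of items and reports
// whether more work remains.
function makeChunkedRunner(items, process, chunkSize) {
  var index = 0; // local variable: cheap scope-chain lookup, per the chapter
  return function step() {
    var end = Math.min(index + chunkSize, items.length);
    while (index < end) {
      process(items[index]);
      index += 1;
    }
    return index < items.length; // true => schedule another step
  };
}
```

In a browser, each step would be rescheduled with something like `if (step()) setTimeout(step, 0);`, letting rendering and input handling run between chunks instead of freezing the page.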

With the eighth chapter, the book shifts gears to focus on network considerations — namely, how to improve the site visitor's experience by optimizing the number of bytes that must be pushed down the wire. In "Scaling with Comet," Dylan Schiemann introduces an emerging set of techniques that Steve Souders describes as "an architecture that goes beyond Ajax to provide high-volume, low-latency communication for real-time applications such as chat and document collaboration" — specifically, by reducing the server-side resources per connection. In Chapter 9, Tony Gentilcore discusses a rather involved problem with using gzip compression — one that negatively impacts at least 15% of Internet users. Even though videos, podcasts, and other audiovisual files consume a lot of the Internet's bandwidth, images are still far more common on websites, and this alone is reason enough for Chapter 10, in which Stoyan Stefanov and Nicole Sullivan explain how to reduce the size of image files without degrading visible quality. They compare the most popular image formats, and also explain alpha transparency and the use of sprites. The only clear improvement that could be made to their presentation is on page 157, where the phrase "named /favicon.ico that sits in the web root" should instead read something like "usually named favicon.ico," since a favicon can have any filename, and can be located anywhere in a site's directory structure.

The lead author returns in Chapter 11, in which he explains how to best divide resources among multiple domains (termed "sharding"). In the subsequent chapter, "Flushing the Document Early," Steve explores the approach of utilizing chunked encoding in order to begin rendering the Web page before its full contents have been downloaded to the browser. The third and final section of the book, devoted to Web browser performance, consists of two chapters, both of whose titles neatly summarize their contents: "Using Iframes Sparingly" and "Simplifying CSS Selectors." That last chapter contains some performance tips that even some of the most experienced CSS wizards may have never heard of before. As with most of the earlier chapters, the narrative tends to be stronger than the illustrations. For instance, Figure 14-5, a multiline chart, is quite misleading, because it appears to depict three values varying over time, when actually each of the ten x-axis coordinates represents a separate major website. A bar chart would obviously have been a much better choice.
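Sharding only helps if each resource maps to the same hostname on every page view; otherwise the browser re-downloads it under a different cache key. A hedged sketch of one way to keep that mapping stable (the function and hostnames are mine, not the chapter's):

```javascript
// Illustrative sketch (function and hostnames are mine, not the
// book's) of consistent domain sharding: hash each resource path to
// the same hostname every time, so the cache key never changes while
// downloads still spread across hosts.
function shardHost(path, hosts) {
  var hash = 0;
  for (var i = 0; i < path.length; i++) {
    hash = (hash * 31 + path.charCodeAt(i)) >>> 0; // simple rolling hash
  }
  return hosts[hash % hosts.length];
}
```

For example, `shardHost('/img/logo.png', ['static1.example.com', 'static2.example.com'])` returns the same host for that path on every call, while different paths spread across both hosts.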

Like any first edition of a technical book, this one contains a number of errata (aside from those mentioned earlier): In Figure 1-1, "iteration" is misspelled. On page 23, in the sentence beginning "Thus, if...," the term "was" should instead read "were." In Figures 7-1 and 7-4, the "Global object" box should not contain "num2." On page 95, in the phrase "the terminal condition evaluates to true," that instead should read "false." On page 147, in the sentence beginning "However, the same icon...," the "was" should instead read "were." On page 214, "Web-Pagetest. AOL" should instead read "Web-Pagetest, then AOL," because the first sentence is one long absolute phrase (i.e., lacking a subject and finite verb).

All of these defects can be easily corrected in future printings. What will probably need to wait for a second edition are improvements to the figures that are in need of replacement or clarification. What the publisher can rectify immediately — should the author and O'Reilly choose to do so — would be to make all of the example source code available for download.

Even though this book is decidedly longer than High Performance Web Sites, and has many more contributing authors, it does not appear to contain as much actionable information as its predecessor — at least for small- to medium-sized websites, which probably make up the majority of all sites on the Web. While such methodologies as Comet, Doloto, and Web Workers appear impressive, one has to wonder just how many real-world websites can justify the development and maintenance costs of implementing them, and whether their overhead could easily outweigh any possible benefits. Naturally, these are the sorts of questions that are best answered through equally hard-nosed experimentation — as exemplified by Steve Souders's admirable emphasis upon proving what techniques really work.

Fortunately, none of this detracts from the application development and optimization knowledge presented in the book. With its no-nonsense analysis of Internet performance hurdles, and balanced recommendations of the most promising solutions, Even Faster Web Sites truly delivers on its title's promise to help Web developers wring even more speed out of their websites.

Michael J. Ross is a freelance Web developer and writer.

You can purchase Even Faster Web Sites from amazon.com. Slashdot welcomes readers' book reviews — to see your own review here, read the book review guidelines, then visit the submission page.


171 comments


SOMEONE buy a copy for the /. coders! (5, Insightful)

Anonymous Coward | more than 4 years ago | (#28784583)

Slashdot is the SLOWEST web site on the net.

Re:SOMEONE buy a copy for the /. coders! (1, Flamebait)

pha7boy (1242512) | more than 4 years ago | (#28784633)

maybe it's just you.

the parent is about as useful a comment as saying "first."

Re:SOMEONE buy a copy for the /. coders! (1, Informative)

MyLongNickName (822545) | more than 4 years ago | (#28784731)

And maybe you really are new here. This site looks like crap. The response rates at times are incredibly slow. Please mod the original post up to a six or seven.

Re:SOMEONE buy a copy for the /. coders! (2, Informative)

countertrolling (1585477) | more than 4 years ago | (#28785357)

Your post is not flamebait. One of the biggest drags with Slashdot and many others is the response time of constantly linking and loading off site ad servers before the main page shows itself, which is probably the intention. Doubleclick is especially horrible, and since I blocked it Slashdot loading has sped up very nicely. c.fsdn.com which seems to contain a lot of the "pretty widgets" is another real monster. Though I can read the page, blocking it breaks too much stuff. The quest for monetizing every little thing is what will keep the internet in its miserable state, but who notices, with all the pretty pictures? Whatever new technological wizardry comes along will be completely consumed by Madison Ave.

Re:SOMEONE buy a copy for the /. coders! (2, Interesting)

hairyfeet (841228) | more than 4 years ago | (#28787305)

Not to mention on dialup all the extra offsite loading is uberslow! Not everybody in the USA can get broadband you know, and sadly I have members of my family trapped on the evil 28k dialup. It is getting to the point that even with ABP and Noscript page loading is God awful. And whatever they did to FF3 with the 'upgrade' to 3.5 makes it completely unusable for dialup. I am gonna have to bring out my family FF3.0.1.2 and hope that they fix the problems with the 3.5 branch before 3.0 is abandoned in Jan, otherwise it is Kmeleon and Opera for them.

You know, there really ought to be code added to detect dialup like they do IE. Something like "If speed = 64k then offsite linking = 0" or something. Because until/unless Obama rolls out nationwide broadband, many like my mom will never ever see it. Nearly 40 homes are on the 1 mile stretch she lives on, and she can see the end of the cable from her front door. It was a block away when she and dad built the house 29 years ago with a "we plan to be out to run line that way in about six months." Well, 29 years later it has moved exactly 0 inches from where it was. So until we actually get nationwide broadband like most western countries, it would be nice if the websites didn't bloat the shit out of everything with offsite linking. Sorry about the rant, but with Internet access becoming more and more of a requirement to get anything done, it really sucks how websites are bloating the living hell out of their code expecting everyone to have high speed.

Re:SOMEONE buy a copy for the /. coders! (0)

Anonymous Coward | more than 4 years ago | (#28784673)

Wrong. xbox.com makes Slashdot look like a speeding bullet by comparison.

Re:SOMEONE buy a copy for the /. coders! (1)

Ron_Fitzgerald (1101005) | more than 4 years ago | (#28786911)

Interesting, xbox.com loads fine for me but their forums are terribly slow.

Re:SOMEONE buy a copy for the /. coders! (2, Interesting)

Phroggy (441) | more than 4 years ago | (#28784715)

It's fast enough for me.

The CSS is horribly broken, but I have no complaints about the speed. Posting is certainly faster for me than it used to be before they switched to AJAX. It just looks like crap.

Re:SOMEONE buy a copy for the /. coders! (1)

dyefade (735994) | more than 4 years ago | (#28784989)

Posting is certainly faster for me than it used to be before they switched to AJAX. It just looks like crap.

Really? It takes AGES to preview a comment, I'm thinking 10-20 seconds, during which time you can't scroll down or do anything else (well you can, you'd just have to remember to scroll back up). At least with the vanilla html version you could open a new tab, and I don't remember having any issues with speed back then.

The rest of the site is great though - keyboard navigation for comments is something I really miss when on other sites now.

Re:SOMEONE buy a copy for the /. coders! (2, Insightful)

MogNuts (97512) | more than 4 years ago | (#28785017)

"Classic" /. was always fast posting that I recall. I just wish I could have the features of the new /. (read: "the +/-" to meta-mod stories), with the old that was lost (read: ability to only display stories from certain sections, rather than only by author it seems now).

And maybe it's just me, but where is the option to not timeout your login? I don't want auto-login or to stay logged in after I close the browser (for security), but I hate that /. logs me out after only like 2 minutes of inactivity. I can't even read half a discussion before I go to post and it says to login. Drives me mad, it's been over 10 years now lol!

Re:SOMEONE buy a copy for the /. coders! (1)

bzipitidoo (647217) | more than 4 years ago | (#28787129)

Have your browser delete all cookies on exit. Firefox can do that. That will log you out of Slashdot, and everything else too.

Re:SOMEONE buy a copy for the /. coders! (1)

godrik (1287354) | more than 4 years ago | (#28785097)

I was going to reply to say it is usually slow for me and just when I clicked the 'reply to this' button, I waited more than 15 seconds for the reply box to appear. Sometimes the preview button takes more than a minute to process the preview. There are probably a lot of parameters that impact the times. But it is usually slow for me.

Re:SOMEONE buy a copy for the /. coders! (4, Insightful)

eln (21727) | more than 4 years ago | (#28784815)

Slashdot isn't slow, it's just buggy as hell.

Re:SOMEONE buy a copy for the /. coders! (1)

sootman (158191) | more than 4 years ago | (#28785491)

Agreed. Also, it's slow.

Re:SOMEONE buy a copy for the /. coders! (1)

rrhal (88665) | more than 4 years ago | (#28786067)

If your site was slashdotted 24/7/365 it would be slow too.

See also "Anonymous Cowardon" (1)

PontifexPrimus (576159) | more than 4 years ago | (#28786447)

I mean, how can you overlook something as glaring as a missing space? "Anonymous Cowardon" indeed!
What I miss most is some kind of feedback corner - ok, sometimes the admins invite comments on new stuff, but those threads vanish too rapidly from the front page. Why not include a simple new slashbox named "Feedback" where people could notify the editors and administrators of things that bug them? This might also be an easy way to improve the site by pointing out common problems (spelling errors in submissions could be caught faster and more reliably than by posting in the respective threads, where notices would be modded as "off-topic" or even "troll" when the error was corrected, for instance).

Re:See also "Anonymous Cowardon" (1)

improfane (855034) | more than 4 years ago | (#28787789)

I've noticed this when I use classic mode and low bandwidth and small screen mode.

It doesn't happen in other modes. IMHO it's a spacing issue in my web browser. Try a different browser and see if you still see 'Anonymous cowardon'

"Preview Post" lag (4, Interesting)

electrosoccertux (874415) | more than 4 years ago | (#28784827)

The only part that bothers me is the "Preview Post" lag lasting for 20-35 seconds. I love everything else about the navigation on the site, though.

Re:"Preview Post" lag (3, Informative)

Ant P. (974313) | more than 4 years ago | (#28785519)

That's the part where it port scans your computer to see if you're running an open proxy. Apparently everyone's considered untrustworthy regardless of current karma or account age.

Re:"Preview Post" lag (0)

Anonymous Coward | more than 4 years ago | (#28787453)

You're kidding, right? A port scan for every comment? How much extra traffic would be generated across the net pipelines if every site (every blog, etc) did this?

Re:"Preview Post" lag (1)

improfane (855034) | more than 4 years ago | (#28787813)

Wow, is that why? That's an intelligent security system.

Re:SOMEONE buy a copy for the /. coders! (1)

Hatta (162192) | more than 4 years ago | (#28785125)

Turn off javascript, it's actually usable then. Nothing worthwhile on the site actually uses javascript.

Re:SOMEONE buy a copy for the /. coders! (1, Insightful)

Anonymous Coward | more than 4 years ago | (#28785363)

Slashdot is the SLOWEST web site on the net.

You obviously haven't had to suffer through the bloated shit that Gawker Media forces upon its victims [jalopnik.com] . Not only does their slower-than-Slashdot Javashit have to be enabled to view comments, but comments were also just changed to be forcibly presented in reverse chronological order. Always. No user-configurable option to sort 'em chronologically. Image galleries are forcefed as slideshows; no click-to-open-in-tab ability.

I used to visit Jalopnik multiple times a day, and am now down to once or twice a week, and now actively avoid Gawker's other properties. The only other "design" that ever annoyed me enough that I pre-emptively avoid reading anything they do was the Forbes fetish for the Top 10 Slide Show, but the new Gawker is right up there. Compared to that, the pile of bloated Javashit that is Slashdot is tame by comparison -- because unlike the other sites, at least we can turn all the crap off (classic view) and still read the comments.

Re:SOMEONE buy a copy for the /. coders! (1)

1u3hr (530656) | more than 4 years ago | (#28785417)

As I post, there are only 31 comments. Yet it still took THIRTY FUCKING SECONDS before the hourglass turned into a cursor and I could scroll the fucking page. During which time I notice the CPU usage going to 100%. WHAT THE FUCK IS SLASHDOT DOING????

A couple of years ago Slashdot was bearable, especially as I could turn it to the "low tech" preference, which has now disappeared, and that even let me see it in my default font and not the sans font I see now. Now I have a later browser, twice as much RAM, and the site is dropping lower and lower in my favorite list, not just because of the dumb stories, but because when there is a good story it takes SO FUCKING LONG to open a comment page. Sometimes I just skim the front page, check any interesting links, and avoid the tarpit of the comment pages.

Re:SOMEONE buy a copy for the /. coders! (1)

HTH NE1 (675604) | more than 4 years ago | (#28785715)

It actually gets slower when you have mod points on some platforms (I can't say all as I don't use all). It can take a long time just to open the combobox, choose how to mod the message, close the box, see it change, and get focus away from the box so that when you try to scroll using the keyboard it doesn't change how you're modding the post. I think it is a problem with form efficiency in the browser: it just doesn't expect that many comboboxes to exist in a single form.

Then again, it may be just as slow to have each message have its own form. I've hit limits of operating systems as to how many controls I could have in a screen.

Re:SOMEONE buy a copy for the /. coders! (1)

Wireless Joe (604314) | more than 4 years ago | (#28785759)

You're obviously not a Gawker Media fan, or haven't visited them since their last commenting system update. I still can't get comments to load in Firefox, and switching to an IE tab means the comments load in a minute or two. They JUST fixed comment viewing in IE 8.

Re:SOMEONE buy a copy for the /. coders! (1)

Baloo Uriza (1582831) | more than 4 years ago | (#28786031)

It's not quite so bad if you use a caching web proxy, particularly if most of what you need is already cached. Granted, that doesn't help with big comment threads, but then again, bitching in the comments kind of contributes to the problem, in a way...

No... (3, Informative)

Last_Available_Usern (756093) | more than 4 years ago | (#28784597)

The greatest barrier to website use is information overload, kinda like this review.

TLDR

WALL OF TEXT CRITS YOU FOR +5 TROLL

Re:No... (0)

Anonymous Coward | more than 4 years ago | (#28784819)

What does TLDR mean?

"Total Loss, Don't Read"?

Re:No... (0)

Anonymous Coward | more than 4 years ago | (#28784933)

You were very close.

Too Long; Didn't Read

Re:No... (2, Interesting)

mcgrew (92797) | more than 4 years ago | (#28785015)

I'd say the greatest barrier to website use is advertising, which isn't information; it's useless data. As to the book, AJAXing your site makes it slow. If you want a fast site, don't put anything on any page that doesn't absolutely need to be there, and have fast pipes and enough fast servers.

If I see a really fancy layout on a web site, I always wonder why its developer thought the content was so bad that they needed to disguise it with slickness.

bing comes to mind. I've tried it quite a few times, and after seeing the "information overload" ad I finally got it - it doesn't index as many pages as Google and its results aren't as relevant. If you have an inferior product, making your product pretty and marketing the hell out of it is your only chance to obtain any market share.

Google could indeed be improved on, and there will indeed be a better search engine eventually, but bing ain't it. Google is an excellent example of good web design.

BTW, shouldn't that be TLTR instead of TLDR? Oh, too long DIDN'T read. Gotcha. You wouldn't like most of my journals, or anybody's books.

Re:No... (1)

Chabo (880571) | more than 4 years ago | (#28785279)

BTW, shouldn't that be TLTR instead of TLDR? Oh, too long DIDN'T read. Gotcha. You wouldn't like most of my journals, or anybody's books.

"tl;dr" usually implies more than just the length of the text involved (although in some cases people are just that impatient). Usually it means that the writing either drags, or is just unnecessarily long for the message conveyed.

Re:No... (3, Insightful)

Bogtha (906264) | more than 4 years ago | (#28786611)

AJAXing your site makes it slow.

Nonsense. Using Ajax can slow down a site, or it can speed it up immensely. It depends on how you use it. When GMail was first launched, everybody raved about how fast it was. That was the Ajax.

If you want a fast site, don't put anything on any page that doesn't absolutely need to be there, and have fast pipes and fast and enough servers.

Assume you did that for Slashdot. You could then make further speed improvements by using Ajax in various places, for example so that it didn't reload a page full of hundreds of comments whenever you moderated.

Re:No... (1)

mcgrew (92797) | more than 4 years ago | (#28787043)

Yes, if your site is like slashdot ajax could speed it up if done properly, but unfortunately most aren't like that. Any tool can be misused, and the webmastering community is really bad about jumping on the bandwagon using tools that just aren't appropriate for the task at hand.

I'm especially annoyed with sites that use javascript for a simple link. There are a few cases where that would be appropriate, but very few of the ones I've seen that use it are appropriate. If you right click one of these to open in a new tab or window, you get a blank screen.

Actually I think the problem stems from people writing web sites who don't even know HTML and rely on automated tools.

Re:No... (1)

Orion Blastar (457579) | more than 4 years ago | (#28785289)

Yeah the more text in a web site or Slashdot story, the longer it takes to load.

Sometimes less is more. Learn to say more by using less text.

Turn off "Verbose" mode.

AJAX (5, Insightful)

metamechanical (545566) | more than 4 years ago | (#28784611)

Has anybody else noticed that of all the websites visited, some of the SLOWEST make heavy use of AJAX? Or is that just me?

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28784655)

Aye. Too much fancy work can slow down anything.

Replacing XML with JSON can speed things up a lot, but it's still a problem if your site is too complex.

Re:AJAX (1)

ByOhTek (1181381) | more than 4 years ago | (#28784671)

And no technology can fix a bad coder who makes more than the minimum number of request calls.

Re:AJAX (1)

Amouth (879122) | more than 4 years ago | (#28787073)

i'm glad i'm not the only person that looked at XML and sat there scratching my head.. wondering why we need a solution to a nonexistent problem

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28784833)

I wouldn't attribute the slowness to the use of AJAX directly but more to multiple frameworks layered on top of each other, often all in the head section, so their loading stops the page rendering in its tracks while they load. All in all just some poor design choices.

Re:AJAX (3, Insightful)

Archangel Michael (180766) | more than 4 years ago | (#28784907)

Conversely, some of the fast websites use basic TEXT and skimp on the graphics.

I do websites for non-profits, and one of my rules in doing a website is NO UNNECESSARY graphics. Graphics that add pretties but don't add anything else are not allowed. I also run all the graphics through a size filter to limit unneeded pixel count and quality. You don't need a 25MB photo on the front page; a scaled-down, reduced-quality version is a better choice.

My way of saying this is, "I don't need to see the pimples on the porn star's butt". Sometimes too much detail is too much.

Re:AJAX (1)

commodore64_love (1445365) | more than 4 years ago | (#28785095)

>>>Conversely, some of the fast websites use basic TEXT and skimp on the graphics.

I concur. On my 50kbit/s dialup connection the sites that load fastest are the ones that resemble old 90s-era sites - just plain text and images. The slow ones are those sites that have to preload some javascript, flash, or other executable before they can display anything. Staring at a blank screen for 2 minutes is annoying.

>>>NO UNNECESSARY graphics. Graphics that add pretties but don't add anything else are not allowed.

Well I disagree with you on this. First off, 25 megabytes??? I don't think I've ever seen a GIF or JPEG take up that much room. You exaggerate.

Second, images aren't really a big deal on fast connections because they zip right through, and on slow connections virtually all dialup ISPs provide compression. So a 100 kilobyte photo would be squeezed to about 20 kilobyte before being sent over the phoneline... a mere 3 seconds.

Re:AJAX (1)

Archangel Michael (180766) | more than 4 years ago | (#28785283)

Check this out ... http://web.forret.com/tools/megapixel.asp?width=3464&height=3464 [forret.com]

okay, so it isn't 25, it is 18MB (RAW). I would consider 7MB PNG to be overkill as well. Scale down, crop, and you can easily get it under 250k or even smaller.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28785635)

Before I click on that link... is that the proverbial image of "the pimple on the porn stars butt"?

Re:AJAX (1)

sjaskow (143707) | more than 4 years ago | (#28786141)

It's not an image at all; it's a calculator to tell you how big a picture you're going to get depending on type and size, etc.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28786519)

I have a handful of NEF and TIFF files between 15mb and 28mb (all at around 3800x2500 pixels) from a photoshoot on my desktop... some ~35mb JPEG at 2848x4256 pixels from another photoshoot.

I'm guessing you don't do any high-end photography?

In reality these get edited, scaled down to under 900 pixels wide and reduced to 150kb JPEG for the "high-res" versions on the website.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28786371)

on slow connections virtually all dialup ISPs provide compression. So a 100 kilobyte photo would be squeezed to about 20 kilobyte before being sent over the phoneline...

Please explain how to compress a JPEG (or other compressed image) to 20% of its compressed size ...

Re:AJAX (1)

Freetardo Jones (1574733) | more than 4 years ago | (#28786827)

You reencode the picture with lower quality settings.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28787191)

You reencode the picture with lower quality settings.

Well, that's a nice ISP re-encoding with lower quality settings ...

Well, no, they usually don't re-encode images, they just compress, which, as my following poster WillKemp pointed out, would usually increase the file size of compressed images.

Re:AJAX (1)

Freetardo Jones (1574733) | more than 4 years ago | (#28787465)

Well, no, they usually don't re-encode images, they just compress, which, as my following poster WillKemp pointed out, would usually increase the file size of compressed images.

Well, you and WillKemp are both wrong, because by "recompress" he wasn't talking about gzipping it or anything like that. He was talking about them recompressing the JPEG image with lower quality settings. Just so you know, re-encoding a JPEG file is "recompressing" it.

Re:AJAX (1)

WillKemp (1338605) | more than 4 years ago | (#28786645)

Second, images aren't really a big deal on fast connections because they zip right through,

That's not really true. Every image requires a separate HTTP request - which takes time to make - and it has to be rendered and flowed into the page, which takes time too (depending on browser and processor speed, etc.).

and on slow connections virtually all dialup ISPs provide compression. So a 100 kilobyte photo would be squeezed to about 20 kilobyte before being sent over the phoneline... a mere 3 seconds.

Not even close. Photos are usually jpegs and therefore they're already compressed. Generally if you compress a file that's already compressed, it gets bigger.
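WillKemp's point can be sketched quickly with Python's stdlib gzip; the byte strings below are just stand-ins (random bytes for a JPEG's entropy-coded data, repetitive markup for raw HTML):

```python
# Sketch: why gzipping an already-compressed file doesn't help.
# os.urandom stands in for a JPEG's entropy-coded bytes, which have
# no redundancy left for a generic compressor to remove.
import gzip
import os

jpeg_like = os.urandom(100_000)               # incompressible, like JPEG data
html_like = b"<p>hello world</p>\n" * 5_000   # highly redundant, like raw HTML

print(len(gzip.compress(jpeg_like)))  # slightly larger than 100,000
print(len(gzip.compress(html_like)))  # a small fraction of the original
```

The container overhead (gzip header and trailer, deflate block framing) is added either way, but only the redundant input gets anything back.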

Re:AJAX (1)

Freetardo Jones (1574733) | more than 4 years ago | (#28786845)

Generally if you compress a file that's already compressed, it gets bigger.

Then you're doing it wrong. To compress a JPEG to a lower size you would decrease the quality setting in whatever editing program you are using and reencode the picture. It's really not that hard.
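What's being described here is re-encoding, not general-purpose compression. A minimal sketch using the Pillow imaging library (assumed installed; the noisy synthetic image and quality values are illustrative):

```python
# Sketch: shrinking a JPEG by re-encoding it at a lower quality setting.
# Requires the Pillow library; the test image here is synthetic noise.
import io
import random

from PIL import Image

random.seed(0)
img = Image.new("RGB", (128, 128))
img.putdata([(random.randrange(256),) * 3 for _ in range(128 * 128)])

def jpeg_size(image, quality):
    buf = io.BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return buf.tell()

print(jpeg_size(img, 90))  # a high-quality encode
print(jpeg_size(img, 30))  # noticeably smaller lower-quality re-encode
```

Lowering the quality setting discards image detail, which is why this works where gzipping an existing JPEG does not.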

Re:AJAX (1)

WillKemp (1338605) | more than 4 years ago | (#28787065)

That's not what I'm talking about, nor what the GP was referring to. I'm talking about, for example, gzipping a JPEG file.

Re:AJAX (1)

Freetardo Jones (1574733) | more than 4 years ago | (#28787435)

But that wasn't what commodore64_love was talking about. By recompressing the image he does mean that they recompress the image to a lower quality before sending it to the user.

Re:AJAX (1)

WillKemp (1338605) | more than 4 years ago | (#28787537)

virtually all dialup ISPs provide compression. So a 100 kilobyte photo would be squeezed to about 20 kilobyte before being sent over the phoneline

Re:AJAX (2, Interesting)

Chabo (880571) | more than 4 years ago | (#28785337)

Conversely, some of the fast websites use basic TEXT and skimp on the graphics.

As I said in another thread the other day: Whether or not you like his writing, I think Maddox hit the peak of usable web design: dark background, with large-font bright text. It's the easiest webpage on the internet to read, and despite having some graphics, it loads very quickly because he uses the graphics as actual content, not just filler.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28786883)

Too bad he only updates once a year, otherwise it actually might be worth it to do more than an annual check for the single update that MIGHT be there.

Re:AJAX (1)

Chabo (880571) | more than 4 years ago | (#28787381)

He only really slowed down once he got a book deal, and started devoting his time to writing for pay. You can't really blame him; most college students who contribute to open-source projects find they have much less time and drive to do so after they get hired.

Example (0)

Anonymous Coward | more than 4 years ago | (#28787171)

> dark background, with large-font bright text. It's the easiest webpage on the internet to read, and despite having some graphics, it loads very quickly because he uses the graphics as actual content, not just filler.

Example: http://www.riverofinnocents.com/wp/ [riverofinnocents.com]

Re:AJAX (1)

techno-vampire (666512) | more than 4 years ago | (#28787657)

My way of saying this is, "I don't need to see the pimples on the porn star's butt".

You might not, and I certainly don't, but I have a friend whose interests are exceptionally "backal" who'd probably appreciate that type of detail. Ugol's Law implies that there are at least a few people out there looking for that in their pr0n. Have you considered an alternate page that caters to them?

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28784947)

The bottleneck in most cases is DOM access. A lot of pages these days walk the DOM tree (which isn't small for those complex sites) doing some wacky stuff.

Re:AJAX (1)

jminne (521597) | more than 4 years ago | (#28785145)

I would hope that the user experience is improved with the additional AJAX. Balancing speed with functionality is an art.

Re:AJAX (3, Insightful)

iknowcss (937215) | more than 4 years ago | (#28785149)

You're right, but it's not the AJAX that is the cause. It's the incompetent "web developers" who think AJAX is going to solve all of their problems by getting them a better job and making them worth more. These are the same people who firmly believe that Ruby on Rails is the only way to create an AJAX application and not-so-coincidentally couldn't code their way out of a paper bag. I should know. I went to high school with them :)

Re:AJAX (1)

wowbagger (69688) | more than 4 years ago | (#28785811)

You're right, but it's not the AJAX that is the cause. It's the incompetent "web developers" who think AJAX is going to solve all of their problems....

Almost, but I would also lay the blame on the "when all you have is a hammer, all the world's a nail" developers who insist upon putting EVERYTHING into a database, because "that's how you make websites DUH LOL KTHXBYE". So every image, every bit of text, every CSS file, all require accessing a database, rather than a simple pull of a file.

Sure, you can have a database table that maps images to URLs, so that the code that IS accessing the database to build the page can find what URL to embed, and you can change which image to use by altering the database table, but the ACTUAL IMAGE FETCH should be nothing more than pulling a file from the web server's file system. Nice, simple, and it allows the web server to correctly handle IF-MODIFIED-SINCE and Expires.

And I have to wonder if having the database execute a command on modification, to build simple static HTML which can then be served out ad infinitum to the world until the next time the database changes wouldn't also help speed things up.

After all, things like that work for Slashdot - why NOT look into them for other sites and use them where appropriate?

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28785197)

Be sensible, do not use Ajax... the majority of your users will not complain about that.

Another way to improve the performance of a web site is to use tiny URL (so users don't have to upload a big amount of bytes...)

p.s. to the hell with web 2.0.

Re:AJAX (1)

Ant P. (974313) | more than 4 years ago | (#28785757)

The way most sites implement it, all AJAX does is offload the server's page-rendering load onto the user's CPU, because the developers are too lazy to write efficient server-side code.

There was a comment on that story a while back about the MJ death news slowing the entire web down, pointing out that Twitter crashed because some abysmally low number of requests per second overloaded it.

Cars won't end the world through an energy crisis, Web 2.0 developers will.

Re:AJAX (0)

Anonymous Coward | more than 4 years ago | (#28786695)

The way most sites implement it, all AJAX does is offload the server's page-rendering load onto the user's CPU, because the developers are too lazy to write efficient server-side code.

Yes, that's the main reason for using AJAX in the first place. Apart from that, you don't necessarily gain anything, since all the bits and pieces loading separately into the page add to the HTTP overhead.

Troll Roll Call (-1, Troll)

Anonymous Coward | more than 4 years ago | (#28784613)

The spotlight is YOURS!!!

Congratulations Slashdot! (0)

Anonymous Coward | more than 4 years ago | (#28784615)

You fixed most of your bugs.

I think you might be ready to move from Beta to Release Candidate.

So what are they saying to Slashdot with this... (3, Insightful)

Anonymous Coward | more than 4 years ago | (#28784639)

See slashdot, slow loading pages == bad.

I hope this book helps you.

I'll Start The Collection... (-1, Troll)

damn_registrars (1103043) | more than 4 years ago | (#28784649)

... so we can buy a copy to send to the programmers at slashdot. You can send your money to me and I'll handle the rest.

Slow sites lose me (4, Interesting)

WiiVault (1039946) | more than 4 years ago | (#28784675)

I used to go to VGChartz all the time, until a few months ago they "updated" their site with seemingly dozens of constant Flash ads: talking, moving, popping up on the forums, etc. It got to the point where my Core2Duo desktop was literally pausing for 5 seconds every time I navigated to a new page. Pretty quickly I realized that the content on the site wasn't worth my time. I see this happening more and more. Sure, I could have used Adblock, but frankly they were asking for a price (getting attacked by ads) and I chose not to pay it (leaving the site). I was glad to see a few ads, but abuse is a sure sign you have little respect for your readers.

Re:Slow sites lose me (1)

Josejx (46837) | more than 4 years ago | (#28784783)

I'm in this boat too, except my machine is an old G4 laptop. I don't visit VGChartz anymore and I've cut way back on my slashdot reading because the site locks up Firefox (3.5.1 on PowerPC) for long enough that I get those "Script is taking too long to complete" errors. Noscript helps and is probably the only reason why I still come to slashdot. :(

Re:Slow sites lose me (1)

WiiVault (1039946) | more than 4 years ago | (#28785569)

Yeah, I've got a G4 iBook (1.2GHz) that required me to force-quit Safari about every 5-6 page loads. Painfully unusable.

The first thing to do... (5, Informative)

tcopeland (32225) | more than 4 years ago | (#28784765)

...when your site is slow is to fire up YSlow [yahoo.com] and see what it says. Sometimes all you have to do is enable mod_expires in Apache to get a huge performance increase and a much lighter server load.

If YSlow doesn't flag anything, then you've got to start digging. But at least you've eliminated some of the easy fixes.
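For reference, a minimal mod_expires setup of the kind the parent describes; the MIME types and lifetimes here are illustrative, not a recommendation from the book:

```apache
# Sketch: far-future Expires headers for static assets via Apache's
# mod_expires, so repeat visitors hit their browser cache instead of
# re-requesting unchanged files.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

The trade-off is that cached files need versioned filenames (or a similar scheme) so updates still reach users before the expiry.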

Running up the down escalator (4, Interesting)

intx13 (808988) | more than 4 years ago | (#28784777)

In my opinion, tuning a JavaScript-laden website for speed is an exercise in futility. The Web's great difference is user-extensibility; practically anyone can throw together some HTML and make a website. It may be a MySpace page, but hey, it's something, and it contributes to the culture of the Internet. Speed, however, is not a feature of the HTML/JavaScript/CSS world.

Were I to be developing a new AJAX-driven Web application I would focus first on simplicity. I feel that if you have so much AJAX stuff going on that you need to resort to crazy tricks you have already lost. Take the following, quoted from the review:

In the subsequent chapter, "Flushing the Document Early," Steve explores the approach of utilizing chunked encoding in order to begin rendering the Web page before its full contents have been downloaded to the browser.

While good advice, something like this should be implemented as a natural part of the specification, or not at all. This rings to me as an attempt to manhandle HTML/Javascript/CSS into a use case for which it is not intended.

I want to see a real protocol for webpages - something between Postscript (except less document-oriented; and yes I know about NeXT's work) and a windowing environment (except more constrained). Then, to preserve the ease of user-input, a simple HTML/XML-like layer. For 99% of the sites that are constructed directly by the user, original HTML with italics, colors, fonts, etc. is sufficient. For projects beyond the scope of Joe Facebook, a true system is needed that allows separation of design and content. But all attempts thus far to do both, frankly, suck.
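For the curious, the "flushing the document early" technique quoted above rides on HTTP/1.1 chunked transfer encoding, which is already part of the specification. A minimal sketch of the wire format (the HTML fragments are illustrative):

```python
# Sketch of HTTP/1.1 chunked transfer encoding, which lets a server
# flush the <head> of a page while still generating the rest.
# Each chunk is: hex length, CRLF, payload, CRLF; "0\r\n\r\n" ends the body.
def chunk(payload: bytes) -> bytes:
    return b"%X\r\n%s\r\n" % (len(payload), payload)

head = b"<html><head><title>demo</title></head>"   # flushed immediately
rest = b"<body>slow-to-generate content</body></html>"  # sent when ready

stream = chunk(head) + chunk(rest) + b"0\r\n\r\n"
print(stream)
```

The browser can start fetching stylesheets and scripts referenced in the first chunk while the server is still producing the rest of the page.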

Re:Running up the down escalator (1)

godrik (1287354) | more than 4 years ago | (#28785045)

I completely agree with you; those technologies were never designed to be used like that, so some effects are not directly possible and need serious hacking to achieve. HTML/JavaScript/CSS are definitely not the right tool for what we are doing with them. The problem is that there is no tool/technology/language to replace them, and I can't see one coming.

I am taking the risk here of being modded troll. I do not believe in so-called "web applications". If you need an application, just write a classical application that connects to remote servers. We do not lack portable tools today. Python is present on most (all?) operating systems, and you can easily do the GUI using portable libraries; Qt and GTK are available on most OSes as well. You can still even use Java or .NET (recall that Mono is standard in Debian now; I believe there is a similar thing for Mac OS X).

Re:Running up the down escalator (1)

Chrono11901 (901948) | more than 4 years ago | (#28786485)

Except with web applications there's no need to worry about patching clients or weird hardware issues.

Is there a minor bug in foo.js? It will take less than a day to fix, test, and push out the fix.
Got a bug in your local app? Well, in our next patch release in 3 to 5 weeks it will be fixed.

Hardware/OS issue causing your app to crash? Good thing we piss away money on lots of tech support to solve strange isolated issues or common dependency issues. For a web app, if it runs in IE, FF, and Safari we're set; at worst we'll get some misaligned element.

Re:Running up the down escalator (1)

godrik (1287354) | more than 4 years ago | (#28787085)

If you have weird hardware issues, you are using a low-level programming language, which is clearly not required for the type of apps that are web-based nowadays.

You can push a new version of the code at startup time. Since the software requires a server connection, it is easy to send new code/binaries/resources/whatever at that time.

Re:Running up the down escalator (4, Interesting)

commodore64_love (1445365) | more than 4 years ago | (#28785419)

>>>>>begin rendering the Web page before its full contents have been downloaded to the browser.

>>This rings to me as an attempt to manhandle HTML/Javascript/CSS into a use case for which it is not intended.

I disagree. Today's websites don't do it, but in the simpler 1990s era of pure HTML, the website DID render before completing download. The browser was expected to grab the HTML first, render the page with "X" placeholders, and then download the images last. That way the user could read the website even with the image only partially present.

So yes prerendering the webpage before download was completed *was* the original intent of the web. It is only lately that webpages have shifted away from that, and I for one would like to see them restore it.

Re:Running up the down escalator (1)

Civil_Disobedient (261825) | more than 4 years ago | (#28785467)

While good advice, something like this should be implemented as a natural part of the specification, or not at all. This rings to me as an attempt to manhandle HTML/Javascript/CSS into a use case for which it is not intended.

HTML already does chunked loading. This is one of the big reasons you shouldn't use tables for layout. Since most tables are set to use auto layout, the browser has to wait for the table content before it can start rendering.
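A hedged illustration of the parent's point: when a table must be used, opting it out of auto layout lets the browser render rows as they stream in (the class name here is made up):

```css
/* With table-layout: fixed, column widths are derived from the table
   width and the first row, so the browser need not buffer the whole
   table before painting anything. */
table.data {
    table-layout: fixed;
    width: 100%;
}
```

Under the default auto layout, any cell late in the table can change every column's width, which is why the renderer has to wait.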

Wait a second... (2, Insightful)

damn_registrars (1103043) | more than 4 years ago | (#28784925)

At first I thought this would help the slashdot coders, until I read more closely:

Slow Web page loading can discourage visitors to a site more than any other problem, regardless of how attractive or feature-rich the given site might be

Attractive and feature-rich are not adjectives that are appropriate to apply to Slashdot in its current form. Hence I don't think this book is the cure for what ails them.

great idea (0)

Anonymous Coward | more than 4 years ago | (#28784943)

How about people quit using Flash and other technologies that cause the page to take forever to load, and go back to plain HTML and images with a little CSS thrown in?

Or how about... (5, Insightful)

castironpigeon (1056188) | more than 4 years ago | (#28784953)

Most websites don't need to be ridiculously complicated to be effective. Go ahead and call me a luddite, but does everyone need comments on their website? Fancy sliding menus? Pop-up image galleries? Flash or other web programming up the wazoo?

When did simple, efficient web design that presents content well and doesn't get in the way of it get tossed out the window?

Re:Or how about... (1)

godrik (1287354) | more than 4 years ago | (#28785139)

I completely agree with you. I am often browsing websites on low-powered devices, sometimes with low bandwidth, and it is so slow...

And there are websites that cannot be browsed unless you display images, which you really don't want to do when you have low bandwidth.

Webmasters, just strip the shit out of your pages!

Re:Or how about... (3, Funny)

thedonger (1317951) | more than 4 years ago | (#28785339)

Most web sites do not need to be as ridiculously complicated to look so complicated. The web is relatively young, and AJAXification even younger. Give it time and common sense will prevail.

The semantic web, HTML5, and CSS3 will eventually usher in an era of peace and tranquility. Music will be free. Passwords will no longer be necessary. Web sites will design themselves to look different to everyone (like the alien in Contact looking like Jodie Foster's dad, or the talking taco pooping ice cream in South Park), appearing in a form with which the user feels most comfortable. Accessibility and affordance [wikipedia.org] will be implicit. Fluid design will exist in four dimensions, and tables will only be used for tabular data. JavaScript will be so unobtrusive it may never load, but that will be OK because all web sites will be semantic and thus 100% accessible and functional without it. And Noscript will no longer be needed on the browser because it will be installed at the ISP level...

Whoops. Sorry. I dozed off there but I kept typing while I was dreaming...

"Faster response times using AJAX" (5, Funny)

Anonymous Coward | more than 4 years ago | (#28785273)

I read this guy's other book, "Clearer thought using vodka". Great read. Can't remember much of it.

K.I.S.S. (4, Insightful)

Tablizer (95088) | more than 4 years ago | (#28785281)

The fastest websites I see generally use plain-jane HTML and pre-sized graphics (via size attributes on IMG tag). AJAX, CSS, and too much JavaScript seem to cause confusing UI behavior, unexpected pausing while something fancy is doing GC, or jerky scroll motion that can be perceived as sluggishness, especially on older hardware.
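The "pre-sized graphics" the parent mentions just means declaring explicit dimensions on the tag, for example (filename illustrative):

```html
<!-- Explicit width/height let the browser reserve the layout space
     before the image downloads, so nothing reflows when it arrives. -->
<img src="photo.jpg" width="640" height="480" alt="product photo">
```

Without the attributes, every arriving image can shift the page, which reads as jerkiness even on a fast connection.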

Re:K.I.S.S. (1)

Grishnakh (216268) | more than 4 years ago | (#28786035)

I also wonder how much sluggishness is caused by putting everything in a database. I tried implementing a simple shopping-cart system for my website a while ago, where I have a very small handful of products I sell using Paypal buttons currently, and tried using some typical open-source packages like OScommerce. All of them make very heavy use of MySQL for every little thing: configuration settings, the actual content on each page, even images! This really seemed like overkill to me, and wasn't exactly fast, so I started work on my own shopping-cart code, though I got waylaid with other projects and had to put it aside for a while.

My current site is just simple HTML with small images, and loads extremely quickly.

Re:K.I.S.S. (1)

Tablizer (95088) | more than 4 years ago | (#28786375)

Hopefully the database is providing something of value, such as quick configuration, content, or style changes. While speed is a good thing, it's not the only concern. It may also be that the servers are overloaded. It may be cheaper to split or beef-up the database servers than to rewrite all the code.

Ads, ads and more ads? (0)

Anonymous Coward | more than 4 years ago | (#28785327)

I'm on a slower computer at work - as such I have plenty of time to watch the status bar in the browser. I would claim that the best way to speed up a site is to reduce or remove the "fancy" (read that as "annoying as hell") ads.

Approximately 80% of my loading time is spent waiting for Google's servers to respond with ads, and Yahoo's, and whoever else does those terrible JavaScript/Flash ads. Yes yes, I could use Adblock, but sadly my browser cannot install addons.

I know people have already mentioned VGChartz, but another site that is very terrible is Tom's Hardware. I used to be an avid reader, but now it takes me almost a minute to load the entire page; I haven't been back in months.

Re:Ads, ads and more ads? (0)

Anonymous Coward | more than 4 years ago | (#28785545)

I agree. But also, just leave out the gewgaws and give us content instead. A site doesn't attract my attention more because it moves, sparkles, or makes unwanted sounds. I also give up on a site whose opening page is nothing more than the loading of some awesome Flash app when all I want is to find a fact. These sites are often content-free anyway.

It may not be a technical problem (3, Insightful)

Anonymous Coward | more than 4 years ago | (#28785387)

I've seen a lot of good sites "re-designed" into oblivion. I think it's mostly a social problem.

1. The "we're artistes" mentality. Unless your site is explicitly an art site (which it usually isn't), people will just want information from your site. You could do most of that with static HTML and a few simple images.

2. The "we're losing eyeballs, we need to do something" problem. The site design is an easy target; but it may not be your site design. Maybe your content is just boring. Competition moved in. You were participating in a fad. The honeymoon is over. The public tastes change. You can't control those things, but you can control your site design so that's what you do. It's like pushing the walk button at an intersection when somebody else has already pushed it.

3. Now, this is the one I really hate to say; but here goes... "You can't figure out how to let people go". The project is done. You need a few maintenance coders, the most experienced people. OK, maybe there's some justification for keeping staff around for new projects; but I bet often there isn't. If you're directly involved with the developers, you don't want to do this. The risk is that you'll spark brain-drain and cause the people you really need to leave. It's far less risky for you to keep everybody, even the grunts. Better yet, you can go up to the next level of management and tell them that a site redesign is needed. That makes you the hero instead of the villain. The next level up from you is probably not going to figure out that you are just trying to keep your department busy. Perhaps this problem could be solved by making it clear right from the start that there will be a permanent staff and some contract workers. Instead, I bet there are a lot of companies where the web developers figured out how to justify their existence -- to the detriment of the site, and the company.

4. Cool new technology. 'nuff said.

5. People just aren't that bright. They dynamically generate JavaScript via a scripting language with a framework on top of it, just to put "Hello World" on a page. Look at me! I'm a web developer.

Google's response (1)

peater (1422239) | more than 4 years ago | (#28785819)

I suspect he overshot his 20% time while writing this, so the guys at Google decided to screw him over by posting this:

Let's make the web faster [google.com]

Why is Google held in such high regard? (1)

pongo000 (97357) | more than 4 years ago | (#28786879)

Google Mail is, hands down, *the* slowest webapp that I deal with. I'm not talking about the copout "basic HTML" option. I run any number of browsers (FF, Opera, Safari), and *all* take tens of seconds to load and interact with Google Mail. I dream about sharp needles in the eye as I wait for Google Mail to do its...magic.

So the author of this book is a "web performance expert" who is now employed by Google. Instead of writing books, maybe said author should figure out what's wrong with the Google Mail interface?

This page takes six seconds to load. (1)

Animats (122034) | more than 4 years ago | (#28786979)

This page takes six seconds to load. "data.coremetrics.com" is slow today. "c.fsdn" is slow, too. "doubleclick.com" only wasted about a second.

Maybe someone at Slashdot should read the book.

As he says, most sites need more simple advice (2, Informative)

greenreaper (205818) | more than 4 years ago | (#28787079)

The average webmaster doesn't need this book; they need tools like Page Speed [google.com] that highlight very simple issues that webmasters can fix, usually without any code changes. And they need clear instructions on how to do it, and why it will benefit their users. I recently ran a project to cut the fluff [wikifur.com] on large furry websites. Most failed to even gzip their HTML, CSS, or JS, or to resize images into thumbnails. It's not hard; people just don't think to do it, or don't know how.
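As one sketch of the gzip fix, Apache's mod_deflate can compress text assets on the fly (adjust the MIME types to taste):

```apache
<IfModule mod_deflate.c>
    # Compress text formats only; images and other already-compressed
    # formats gain nothing from another pass.
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

HTML, CSS, and JS typically shrink to a fraction of their size, which matters most on the slow connections this thread keeps coming back to.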

Efficient code? (2, Insightful)

dontmakemethink (1186169) | more than 4 years ago | (#28787599)

Someone advocating more efficient coding of websites should surely take less than 12 paragraphs to do so!

It's not your own content (4, Insightful)

roc97007 (608802) | more than 4 years ago | (#28788055)

In my experience (having 20/5 Mbit fibre to the house), it's almost never the website's local content that loads slowly; it's the damned ads. Let me say that again so that we're clear -- I'm sitting there waiting not for the content I was seeking, but for the ads that I don't want. I know there are workarounds for this -- I use some of them and they do help. But Fred and Ethel Nongeek are not going to know about ad suppression. To them the 'net is "just slow" and they don't know why.

Making your local content lightning fast isn't going to help if your customers are waiting for unwanted content from some other website. And (grrrr....) this includes CAPTCHA! Sometimes it takes almost half a minute for the CAPTCHA image to come up, long after the rest of the login page has loaded.

The worst example, and a sign of things to come, was when Google Ads went down a while back and took a bunch of websites down with it, including (as I remember) Slashdot. If your page included Google ads, it just wouldn't load. I don't think AJAX is going to help with that. Feel free to disagree, but be specific.
