
Google Shares Insights On Accelerating Web Sites

samzenpus posted more than 4 years ago | from the hurry-it-up dept.


miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, and cited compressing headers as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."


First Post! (2, Funny)

Anonymous Coward | more than 4 years ago | (#32673614)

If only Slashdot loaded faster I could have had my first post!

Ajax Libraries (3, Interesting)

0100010001010011 (652467) | more than 4 years ago | (#32673626)

Now if only every website didn't include 300kb of Javascript libraries that I had to download.

Noscript (4, Informative)

iYk6 (1425255) | more than 4 years ago | (#32673640)

That's what noscript is for. With noscript, your browser doesn't even download the .js files.

Re:Noscript (5, Insightful)

Saeed al-Sahaf (665390) | more than 4 years ago | (#32674200)

That's what noscript is for. With noscript, your browser doesn't even download the .js files.

That's fine and dandy. IF.

If you don't care to see or experience the vast majority of web sites on the Intertubes today.

Honestly, when I see (yet another) pious elitist bleating about no-script or whatever, I wonder: Why don't you just surf in Lynx?

If you're surfing with no-script, you're missing 75% of the Internet. If it's not the 75% you want to see and/or experience, then good for you. But bleating about the creative uses of JavaScript on the World Wide Web is old news.

Re:Noscript (5, Informative)

Dylan16807 (1539195) | more than 4 years ago | (#32674238)

75%? When I turn off javascript it only seems to affect about a tenth of the sites I visit.

Re:Noscript (4, Insightful)

calmofthestorm (1344385) | more than 4 years ago | (#32674320)

Likewise. And if I see flash it's a damn good indication I just don't care what's on the site.

Re:Noscript (0)

Anonymous Coward | more than 4 years ago | (#32674906)

I don't get it. What point are you trying to make by avoiding Flash and Javascript? It seems so arbitrary and pointless...

Re:Noscript (1)

ArsenneLupin (766289) | more than 4 years ago | (#32675020)

And if I see flash it's a damn good indication I just don't care what's on the site.

Except for games...

Re:Noscript (2, Funny)

Anonymous Coward | more than 4 years ago | (#32675158)

Hey well looky here, we have 2 gramps in our midst.

For that, not only am I going to stand on your lawn, I'm going to rip out grass in the shape of a stencil of goatse.

Re:Noscript (2, Insightful)

EvanED (569694) | more than 4 years ago | (#32674514)

A tenth? What Internet are you using?

By my estimation, based on the scroll bar position after counting for a while, I've got about 250 websites listed with site-specific preferences; some of those will just have plugins enabled so I can look at PDFs, but most are to enable JavaScript.

They range from a couple dozen sites that I want to watch some Flash video on (enabling plugins but leaving JS off doesn't seem to work) to online stores (NewEgg and Best Buy both need JS for nearly essential features) to two of my banks (one of which doesn't actually need it I think, but it adds a few neat features; the other I think needs it) to some discussion-centric sites (even /. by default, but also things like Blogspot if you want to add comments) to the social networking sites to sites like this [realworldhaskell.org] .

I can't say for certain what percentage of the sites I visit I have whitelisted, but rarely does a day go by when I don't discover some new site to add.

Re:Noscript (1)

EvanED (569694) | more than 4 years ago | (#32674956)

Just to drive this point home, it's less than an hour and a half later, I've hardly been browsing the internet, and I've already found a new site that I needed to enable JavaScript for.

Re:Noscript (5, Insightful)

Jurily (900488) | more than 4 years ago | (#32674262)

If you're surfing with no-script, you're missing 75% of the Internet.

Actually, it's more like 95%. However, you did completely miss the point. Turning off Noscript for a site you choose to bless takes two mouse clicks and a reload.

You're not missing out on what you want to see. You're missing out on all the other random shit you couldn't care less about.

Re:Nostyle (1)

cryptoluddite (658517) | more than 4 years ago | (#32674710)

Also knock it up a notch with View -> Page Style -> No Style.

This works really well for sites that put stories into tiny columns or use unreadable fonts.

Re:Noscript (3, Insightful)

Anonymous Coward | more than 4 years ago | (#32674336)

Or if you actually want to be able to use your [credit card/bank/net provider] site. The problem isn't what we, the users, allow or deny--the problem is the hubris of the programmers and web designers who want to stuff in all that bloatware, just because they can.

Dear Web Site Designer:

If your page takes more than a few seconds to load, your customer has moved on.

If your search algorithm brings up even more pages when another term is added, your customer won't slog through the cruft.

If the page is flashing and singing and offering 50 different subjects in 20 colours, your customer is confused and will not select anything.

If your customer has to fight to find what they want and pay for it, they will go to a brick and mortar.

If your reader wants to share a link or open it in another tab, finding out it is a script that can only be opened where it was found is annoying and likely to gain you some hostility.

If it takes any effort to read and understand what you published, they don't care how pertinent your subject is or how true your opinion is.

Oooo . . . shiiiinnnny does not a good web page make.

Re:Noscript (1)

Pentium100 (1240090) | more than 4 years ago | (#32674416)

I have noscript on and yes, it affects a lot of sites, but if I want, I can always enable the scripts for a particular site, and it still helps not having them on by default.

That's not insightful (5, Informative)

Per Abrahamsen (1397) | more than 4 years ago | (#32674454)

Noscript doesn't turn off Javascript. Most browsers already have an option for that. What Noscript does is make the control of Javascript (and Flash) much more fine-grained and convenient.

Some typical cases:

1. Scripts on poor web sites just serve to detract from the content. Those you simply never turn on.

2. Scripts on good web sites improve access to content. Those sites you enable permanently the first time you visit (press the Noscript button in the lower right corner and select "enable permanently") and forget about it.

3. Some web sites contain a mix of the two. Here you can either explicitly enable a specific object (by clicking on a placeholder, like with flashblock), or temporarily enable scripts for that site.

Basically, Noscript makes more, not less, of the web accessible. The good web sites you use normally will not be affected (as they will all be allowed to run scripts), and following links from social web sites like /. becomes a much more pleasant experience.

Of course, most of the noise scripts distracting from content are ads, so AdBlock gives you much of the same benefit. But I don't want to hide ads, as that is how the sites pay their bills.

Re:Noscript (1)

jochem_m (1718280) | more than 4 years ago | (#32674576)

You're mostly missing a lot of slow-loading, badly designed pages. You can turn it off selectively when you need to, you know.

250 websites is not the vast majority (0)

Anonymous Coward | more than 4 years ago | (#32674608)

The vast majority of web sites on the Intertubes today don't need javascript

Re:Noscript (1)

rtfa-troll (1340807) | more than 4 years ago | (#32674754)

It's true that javascript is crucial to the web. However, good and important javascript is limited to a handful of sites. What you should do is take a liberal approach at the beginning and e.g. immediately turn on the whole of google.com and the other top ten whole domains that you use regularly. Next, if you find yourself allowing a site a second time, you do it long term. Finally; learn a few alternative sites that are less script dependent. The worst ones are ones which make you use javascript for menus or for content access. This normally means that they have some really nasty advertising (overlay frames with flash or something) which relies on javascript.

Once you have a whitelist of your regular sites, you will actually find that your browsing experience is better with noscript than without it. You will very occasionally temporarily allow a site, and that costs you a second or so, but most of the time you will just not see the various annoyances which take up so much time when web browsing normally.

Comparing browsing with noscript to browsing with lynx really shows you have never tried it properly. Maybe it's not for everybody, and people with no understanding of what a URL or a server name / DNS name is might have difficulties with the interface to start with; however, for anyone with even a little technical knowledge or under the age of 50, browsing with noscript is going to be better than browsing without it. The important thing to know is that you won't achieve this in your first five minutes of noscript.

Re:Noscript (0)

Anonymous Coward | more than 4 years ago | (#32674926)

You realize noscript allows you to enable javascript for certain domains in two clicks, right?

Re:Ajax Libraries (2, Informative)

Anonymous Coward | more than 4 years ago | (#32673684)

All modern browsers support caching, and chances are, you aren't actually downloading a brand new set of libraries each time.

Re:Ajax Libraries (0)

Anonymous Coward | more than 4 years ago | (#32674056)

Really? Does it support caching of TLS/SSL content? No?

Well, how about just the stylesheets or javascript? Still no?

Well...shit...can't have a secure and fast web.

Re:Ajax Libraries (1)

hey (83763) | more than 4 years ago | (#32673702)

You can load those libraries from Google.
I assume they don't penalize you for that!

Re:Ajax Libraries (1)

Korin43 (881732) | more than 4 years ago | (#32673780)

If you load them from Google, it's far less likely to impact loading times (since your browser will use the same cached script for every site that loads from the same place).
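As a rough illustration, that shared-cache effect comes from every site referencing the exact same versioned CDN URL (the jQuery version here is just an example); a browser that has already fetched that URL for one site can reuse the cached copy on any other:

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>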

Re:Ajax Libraries (1)

Jesus_666 (702802) | more than 4 years ago | (#32674166)

On the other hand, Google tends to use amazingly slow servers for some functions. When I decided to edit my hosts file to have www.google-analytics.com point to 0.0.0.0 it shaved a good ten to fifteen seconds off my page loading times because GA had such ridiculously long loading times.

Maybe they've since brought their infrastructure in order, and maybe they always hosted their libraries in a more reasonable manner than GA, but I'd still be wary of Google-hosted libraries - if the user doesn't have them cached, I'd expect them to increase the loading time to epic proportions, making for a horrible first impression.
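For reference, the hosts-file trick described above is a single line that points the analytics hostname at a non-routable address:

0.0.0.0 www.google-analytics.com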

Re:Ajax Libraries (0)

Anonymous Coward | more than 4 years ago | (#32673906)

I prefer not to, I don't trust google.

Re:Ajax Libraries (4, Insightful)

nappingcracker (700750) | more than 4 years ago | (#32673834)

I disagree. Libraries have greatly improved the usability of many websites. I also doubt that many people are pulling down 300kb of libraries every time, since most are minified and gzipped. Even with a ton of bells and whistles it's hard to hit 100kb of .js. The ever-popular jQuery + jQuery UI is only ~30kb (with reasonably useful plugins like tabs, dialog, etc, not all the crazy and expensive FX).

I'm OK with users having to pull even 100kb one time to have a nicer browsing experience all around.

I really wish I could get over my paranoia and link to the libraries on google's code CDN. Slim chance, but if they go down and my sites are still up, there will be problems!

Re:Ajax Libraries (0, Troll)

socsoc (1116769) | more than 4 years ago | (#32673880)

So Google's CDN is gonna go down and your site will remain up? Nice idea, stop bogarting whatever you are smoking and pass it along.

Re:Ajax Libraries (1)

nappingcracker (700750) | more than 4 years ago | (#32674042)

You're right. Big company services never go down...except those times they did and it was a huge problem. Remember the Amazon S3 outage? EC2 botnet attacks? Google GMail and document services going down? This month Google's jQuery libs on their CDN went down 2-3 times.

Stuff going down for a few hours is a lot of money lost.

Re:Ajax Libraries (1)

metaconcept (1315943) | more than 4 years ago | (#32674688)

Does this help?

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
  // If the CDN copy failed to load, jQuery won't be defined; fall back to a local copy.
  if (!("jQuery" in this)) {
    document.write('<'+'script type="text/javascript" src="query.min.js"></'+'script>');
  }
</script>

Re:Ajax Libraries (0)

Anonymous Coward | more than 4 years ago | (#32674892)

Isn't that invalid XML if you don't use &lt; and &gt; in the string literals? Oh, I forgot, the new trend is to not give a shit about easy parseability again.

Re:Ajax Libraries (1, Insightful)

gaspyy (514539) | more than 4 years ago | (#32673918)

Frameworks are great but they are also overused.
JQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load JQuery just to do an image fade or animation - stuff that you can easily code yourself. [richnetapps.com]

To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads JQuery, Mootools and Scriptaculous just to do some trivial effects that would be achieved just as well with document.getElementById(), setTimeout() and the element.style property.
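As a rough sketch of the kind of framework-free effect being described (a plain-JavaScript fade-out using only document.getElementById, setTimeout and element.style; the element id is made up for the example):

// Fade an element out over roughly a second, then hide it.
function fadeOut(id) {
  var el = document.getElementById(id);
  var opacity = 1.0;
  function step() {
    opacity -= 0.1;
    if (opacity <= 0) {
      el.style.display = 'none';    // fully faded out: hide it
      return;
    }
    el.style.opacity = opacity;     // standards-compliant browsers
    el.style.filter = 'alpha(opacity=' + Math.round(opacity * 100) + ')';  // old IE
    setTimeout(step, 50);           // next step in 50 ms (~20 frames total)
  }
  step();
}

fadeOut('banner');  // usage: fade the element with id="banner"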

Re:Ajax Libraries (2, Informative)

micheas (231635) | more than 4 years ago | (#32674096)

Frameworks are great but they are also overused.
JQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load JQuery just to do an image fade or animation - stuff that you can easily code yourself. [richnetapps.com]

To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads JQuery, Mootools and Scriptaculous just to do some trivial effects that would be achieved just as well with document.getElementById(), setTimeout() and the element.style property.

The problem with doing things yourself instead of using a framework for common things like fades is that you have to remember how to code for each of the common browsers. The libraries hide that nastiness from you.

Re:Ajax Libraries (1)

Yoozer (1055188) | more than 4 years ago | (#32674520)

stuff that you can easily code yourself.

Yes. The result, however, of this DIY script is one script of 5 kb that costs much more work to maintain and does just one thing - collapsing and expanding. 5 of those scripts for 5 operations (let's assume image galleries, a lightbox, some AJAX, form validation and a floating div for a form) would already add up to the size of Mootools*, and each would only be good for that one thing.

Writing this in Mootools would take a total of 6 lines or so; but more importantly - the Mootools framework is useful for other things, too. For one thing, it makes your JS far more readable.

Joomla or Drupal are already bloated by themselves because the components need to be wrapped in several layers for customizability; having 2 or 3 JS frameworks fight with each other is just icing on the cake.

* (replace Mootools with your framework of choice)

Flash ... (1)

Midnight Thunder (17205) | more than 4 years ago | (#32674074)

For Flash heavy sites, will the time it takes for the Flash to load be taken into account? Or how about sites slowed down by all the external ads?

Re:Ajax Libraries/Bulky Pages/User-Agent Dependant (0)

Anonymous Coward | more than 4 years ago | (#32674116)

How about sites that use javascript for menus and have so much stuff that the front page is 200K?
Example: Benton County, WA [benton.wa.us]
Or store their state in a hidden form element on the page?
(On the comment on user agent: 200K using "Mozilla/5.0 Firefox/3.5.4", 17K using "Mozilla/4.0", 31K using standard Wget user agent)
Good: Changing the page based on user agent (links2 actually can navigate a non-javascripted version, Firefox requires javascript to go anywhere). Bad: HUGE. If I was on dial-up, it would be about 90 seconds for that page. And that is the one page, not any extras like pictures or included scripts. AND they have Cache-Control set to "private", so a web proxy for the office doesn't give any benefit to anyone else.
And every other page is under "pview.aspx?id=...."
Another Bad: Depending on the user agent it changes. What do you suppose happens when the Google Robot comes by compared to a normal user?...
Did I ever mention that I hate the new web site for my county compared to the old one? (Old: Mostly static pages, much slimmer)
(And most of the current site content is static on a day-to-day basis it seems.)

I am struck by the concept (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#32673646)

I am struck by the concept that the longer a business continues the more liabilities it gathers. I am now of the opinion that a business needs a start date and an end date and that those dates should be very public from inception.
As a small example think of having good employees that stay with you and are aging. Health insurance as well as more serious consequences for a simple fall accrue. If you keep your people the financial costs may prevent competition. If you fire them you may well be sued. But if your plan from the

Hope they don't get too trigger happy (2, Interesting)

cameljockey91 (1455491) | more than 4 years ago | (#32673648)

How many times will their crawler check a slowly loading website before it penalizes it?

NONONONONO (3, Informative)

Magic5Ball (188725) | more than 4 years ago | (#32674036)

"He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place"

The core Internet protocols and infrastructure were and remain a conduit of innovation /because/ they are agnostic to HTTP and all other protocols. Optimizing for one small subset of those protocols and for a single kind of contemporary usage would discourage all kinds of innovation using protocols we've not conceived yet, and would be the single largest setback the modern Internet has seen.

I prefer low-tech solutions... (5, Funny)

Lord_of_the_nerf (895604) | more than 4 years ago | (#32673650)

I find my browsing goes faster if I just yell at my housemate to stop downloading torrents that are *ahem* 'Barely Legal'.

Re:I prefer low-tech solutions... (3, Informative)

dintech (998802) | more than 4 years ago | (#32675120)

If you haven't already, get a router with QoS. Next, when he's in some other room 'maximising his bandwidth', set his max connections to 30 and his upload to 1/4 of your upload speed. You might also consider Tomato or DD-WRT if you have a compatible router.

java sites screwed (0, Flamebait)

codepunk (167897) | more than 4 years ago | (#32673652)

"Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly"

If it is server response time the java sites are screwed.

What the hell I have karma to burn.

Re:java sites screwed (3, Insightful)

kainosnous (1753770) | more than 4 years ago | (#32673800)

This really raises the question of what it tries to load. If it simply loads the HTML, then JavaScript-laden sites and Flash sites will have the edge over simple information sites that serve dynamic content. However, if they load all referenced content, then the reverse may be true.

I would like it if the latter were true. What could be better than every Flash site being seen as a large bundle of data that simply displays "This site requires Flash"? When I surf the web, I surf for content, not pretty pictures. In my opinion, if a site can't simultaneously be surfed in Lynx, read in Braille, and parsed with a spider, then it really isn't a web site.

Re:java sites screwed (1)

bruno.fatia (989391) | more than 4 years ago | (#32674366)

I'm guessing you don't like Flickr much...

Re:java sites screwed (0)

Anonymous Coward | more than 4 years ago | (#32674420)

In my opinion, if a site can't simultaniously be surfed in Lynx, read in Braille, and parsed with a spider, then it really isn't a web site.

You sound fat

Re:java sites screwed (0)

Anonymous Coward | more than 4 years ago | (#32674974)

Wow, you're so "l33t."

I don't know what point you're making, but at least you're making it as hard as you can...

Re:java sites screwed (2, Informative)

grahamsz (150076) | more than 4 years ago | (#32673878)

There's no inherent reason that Java should be slow. I run a discussion site (linked in my sig) that's running off an all-Java codebase, and while it has occasional load issues, we can render the content for the front page of the site in 20 ms or less (it's at the bottom of the page if you are curious). Java has a proper application model, so with smart use of singletons you can effectively keep the entire working set of a forum site in memory. Our performance is much poorer if you start browsing through archives, but that makes up a tiny percentage of our page views.

Re:java sites screwed (4, Informative)

Nadaka (224565) | more than 4 years ago | (#32674050)

Java really only has problems with startup time (which a web spider will never see) and the delay when a servlet/JSP is hit for the first time. While doing web development, we see that startup and first load most of the time, giving an appearance of slowness, but it is much better on a production server with regular traffic.

Re:java sites screwed (2, Insightful)

glwtta (532858) | more than 4 years ago | (#32674788)

Right, because how could Java possibly hope to compete with the blazing speeds of PHP and Ruby?

Anyone Notice (0)

codepunk (167897) | more than 4 years ago | (#32673676)

Anyone notice how slow the "How We're Making the Web Faster" page is loading?

google-analytics.com ? (5, Insightful)

corsec67 (627446) | more than 4 years ago | (#32673680)

I saw my browser waiting on google-analytics.com quite often before I started using No-Script.

Why do sites put up with an ad server/analytics service that slows down a site by a large amount?

Re:google-analytics.com ? (0)

Anonymous Coward | more than 4 years ago | (#32673746)

it's a common misconception that it will improve google search ranking

Re:google-analytics.com ? (5, Insightful)

Anonymous Coward | more than 4 years ago | (#32673778)

Because it's valuable data, and google is the only game in town. You can see which keywords are converting, and for what dollar amount, and which keywords are money pits. Yes, it will on occasion hang but you should look at the data that it produces before saying it's not worth it.

Re:google-analytics.com ? (5, Insightful)

dintech (998802) | more than 4 years ago | (#32675184)

Yes, it will on occasion hang but you should look at the data that it produces before saying it's not worth it.

Not worth it to who? It's not worth it to me. Noscript please.

Re:google-analytics.com ? (1)

NoPantsJim (1149003) | more than 4 years ago | (#32673854)

Isn't this sort of mitigated when people put Google Analytics at the bottom of the page? I've never noticed any slowdowns waiting on GA on any of my sites, and I have GA scripts at the bottom of every page. The absolute bottom.

Re:google-analytics.com ? (0)

Anonymous Coward | more than 4 years ago | (#32673884)

Noooo! you are doing it all wrong, it must be the 1st thing to load, so that google gets their results 1st and user experience is enhanced by that.

Re:google-analytics.com ? (1)

NoPantsJim (1149003) | more than 4 years ago | (#32673996)

I suspect people who actually think like this are the reason GA gets a bad reputation.

Re:google-analytics.com ? (1)

ArsenneLupin (766289) | more than 4 years ago | (#32675052)

If you have any body onload scripts, then they will wait for the entire page to load... including the Google Analytics script at the very bottom.
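A minimal sketch of that point (the page and image name are made up): window.onload fires only after every resource on the page, the bottom-of-page tracker included, has finished loading.

<html>
  <body>
    <img src="huge-photo.jpg">
    <script type="text/javascript">
      window.onload = function () {
        // Runs only once huge-photo.jpg AND the tracker script below are done.
        document.title = 'everything loaded';
      };
    </script>
    <!-- "put it at the bottom" still delays the onload handler above -->
    <script type="text/javascript" src="http://www.google-analytics.com/ga.js"></script>
  </body>
</html>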

Re:google-analytics.com ? (0)

Anonymous Coward | more than 4 years ago | (#32673944)

I have google-analytics blocked by no-script and in my hosts file. Makes things so much faster and prevents the great Satan from knowing what I do!

Re:google-analytics.com ? (0)

Anonymous Coward | more than 4 years ago | (#32674018)

This reminds me: I added an adblock rule for google analytics the other day when /. temporarily broke the "more" part of the comment system.

I thought maybe the javascript for the more feature was getting blocked, so I opened the adblock tab. Nothing looked out of place, so I tested with adblock disabled just to make sure. However, the one good thing that came out of it: I noticed google analytics so I blocked it.

Upshot: My page loads feel faster on every site now. Good fucking riddance, google analytics.

Re:google-analytics.com ? (1, Interesting)

Anonymous Coward | more than 4 years ago | (#32674112)

I run adblock, noscript, flashblock, betterprivacy, yslow... and...I run all of that stuff at work too.

Despite that, the website I run--*runs* analytics. Even though it's slow. Oh yeah--that slowness... it's often DNS. No clue why--I'd think that server farm would stay cached everywhere on the planet. The deal is--if your browser hangs on that load--somebody wrote the page wrong. My analytics urchin--except on the blog (damned WordPress)--always runs at the end, after the content's done rendering.

Smart webmasters understand that their webpage is not a single document that just suddenly appears.

Unfortunately, that seems to be a rare breed--as these days I have trouble getting interns or programmers who understand even the basics like minification, modgzip, or who are even comfortable setting cache headers.

Re:google-analytics.com ? (3, Informative)

Spikeles (972972) | more than 4 years ago | (#32674306)

Google's own documents [google.com] recommend that you use asynchronous tracking, which should cause no page slowdowns, and even if you use the traditional code [google.com] it should be at the end, just before the closing body tag.

If a page is loading slowly because of google-analytics, blame the web site developer.
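For context, the asynchronous snippet Google's documentation recommended at the time looked roughly like this (UA-XXXXX-X is a placeholder account ID); because ga.js is injected with async set, it doesn't block the rest of the page from rendering:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);   // placeholder account ID
_gaq.push(['_trackPageview']);

(function() {
  // Inject ga.js asynchronously so it never blocks page rendering.
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();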

Re:google-analytics.com ? (0)

Anonymous Coward | more than 4 years ago | (#32674472)

For this reason, I just embed the analytics code inline at the bottom of each page. As my pages are served compressed, this increases page size only fractionally, so from the user's point of view is well worth it.

How fast? (4, Insightful)

tpstigers (1075021) | more than 4 years ago | (#32673814)

Google's ranking algorithm now includes a penalty for sites that load too slowly.

I'm not sure how I feel about this. My initial response was a happy one, but the more I think about it, the more it seems to be unnecessarily discriminating against those who are too far away from the bleeding edge. Do we really live in a world where 'Speed=Good' so completely that we need to penalize those who don't run fast enough? And where are we drawing the line between 'fast' and 'slow'?

Re:How fast? (0)

Anonymous Coward | more than 4 years ago | (#32673840)

> And where are we drawing the line between 'fast' and 'slow'?
4.9 seconds and / or 320KB - over or under ... clock / measure it!

Re:How fast? (1)

javaguy (67183) | more than 4 years ago | (#32673858)

Where are "we" drawing the line? We're not. Google is. It'd be nice if they told us where the line is, but I can't see that happening any time soon, since Google is pretty much a black box (though it is a pretty, shiny black box).

Re:How fast? (1)

mentil (1748130) | more than 4 years ago | (#32673980)

Furthermore, what if the server resides in some far corner of the planet, and Google is testing the speed from SoCal? The speed GOOG gets will not always resemble what someone else (potentially across the ocean) will get, unless they test at every major node of the internet.

Re:How fast? (1, Funny)

Anonymous Coward | more than 4 years ago | (#32674022)

I wonder if the people behind Google were smart enough to consider this?

Re:How fast? (0)

Anonymous Coward | more than 4 years ago | (#32674970)

But they're not checking it themselves; they're sending out a spider that crawls all over the web -- when it actually hits that site, it'll probably be coming from a nearby site!

And how is speed relevant to the content? (1)

RonVNX (55322) | more than 4 years ago | (#32674064)

This is just a pointless degradation of the search results. Google needs to stop screwing around with the accuracy of their results like this or they too will find themselves displaced by a competitor that focuses on relevancy of the content and not irrelevant outside factors.

For the unbearably slow, there's always Google Cache.

Re:And how is speed relevant to the content? (0)

Anonymous Coward | more than 4 years ago | (#32674126)

Chances are Google made the decision because slow loading sites caused a negative response among testers. I doubt Google is differentiating between 0.5 and 1 second, but I have no problem believing that users didn't like getting search results that took 8+ seconds to load. I may not agree with their priority but I can understand why Google would account for this.

Re:And how is speed relevant to the content? (3, Insightful)

Anonymous Coward | more than 4 years ago | (#32674258)

Speed is relevant because crap mirror sites should be ranked lower than the originating site. [Or even vice versa, faster mirrors should be preferred over the original source]

You seem to be under the delusion that Google is just going to delete slow sites, or return results purely on speed regardless of content. I have no idea what could lead you to think this way (well, I do "knee jerk reaction") because as far as I can tell, the most relevant site will be preferred but if there are multiple sites that are approximately all around the same relevance, the faster one is preferred.

Sounds like an excellent idea to me, lord knows that I've been pissed off waiting 45 seconds for a page to load when the next result loads instantly with similar information.

Re:How fast? (1)

noahbagels (177540) | more than 4 years ago | (#32674314)

While yours is a well-thought-out comment, from dealing extensively with web site latency for multiple sites I can say that "bleeding edge" is often slower than "simple" or "old". As others have pointed out, it's the 300kb of javascript from 10 different social widget and ad sites that slows down the page. Most research on this topic today emphasizes client-side latency, as in the code and structure of what your browser downloads and in what sequence. Client-side latency generally consumes > 90% of the user-visible latency. After all, light and packets can travel half-way around the world in just 60 milliseconds, yet this is talking about penalizing sites taking over 4 seconds to load.

TL;DR: You won't have to upgrade or buy lightning-fast hosting. Just don't bloat your page with dozens of third-party js files or tens/hundreds of poorly compressed images and you'll be fine!

Re:How fast? (1)

calmofthestorm (1344385) | more than 4 years ago | (#32674342)

Cue lack of net neutrality and this becomes a nasty can of censorship worms.

Re:How fast? (0)

Anonymous Coward | more than 4 years ago | (#32674350)

It's not fair because sites that are chock full of information are slower loading. We get penalized because we have large pages full of images and info, but spam sites that are almost completely without info and have a couple of stock photos have smaller pages.

This is bad for the web.

Re:How fast? (2, Insightful)

Wierdy1024 (902573) | more than 4 years ago | (#32674786)

Yes, we should penalize them.

Imagine there are two sites with the information I need. I would much prefer to see the faster site than the slower one, because by loading the faster site I get my information faster.

If I wanted my information slowly, I would walk to the library...

Re:How fast? (1)

XCondE (615309) | more than 4 years ago | (#32674914)

Dude, stop whinging. You don't even know what kind of weight that metric receives - actually none of us do. Given two pages with the same relevance (as far as the computer can tell) give me the faster one. Didn't work? no worries, I'll try the other next.

Moore's Law (1, Interesting)

Anonymous Coward | more than 4 years ago | (#32673926)

Not to discount how important it is to make your website as fast as possible but...

I doubt anyone with a decent internet connection is complaining about these 320k pages. Even on a cell phone it's not a big deal. As technology moves forward and speed improves even more these size related complaints will get less and less important.

Think about it - who complains about a 340k file on their hard drive anymore. I'm sure in the mid '80s lots of geeks rightfully griped about it.

Penalty for speed (3, Insightful)

csmanoj (1760150) | more than 4 years ago | (#32673968)

That would make Google search results bad, right? When I search, I want the site with the best information, not the one that loads fastest.

srsly (0)

Anonymous Coward | more than 4 years ago | (#32674150)

Bad might be overstating the problem, but absolutely less accurate than they could be. We can only hope they realize the error of their ways, but don't be surprised if this is merely the beginning of tweaks to their algorithm that have a negative effect on accuracy of results to advance some interest of Google's.

Re:Penalty for speed (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32674186)

And the most reputable/best sites will not be filled with crappy flash adverts and affiliate marketing crapola.

I think that's what this is aimed at. Remember it's not the only factor, so a slow site that's really popular/reputable will still be high up, but an unpopular site peppered with keywords and flash/javascript advertising shite will rank low.

I'm pretty sure the google search engineers have thought of this (and many other issues) when designing it. I'm fairly certain that they didn't just stick on a "if(size>200kB){rank-=10}"

Re:Penalty for speed (1)

sound+vision (884283) | more than 4 years ago | (#32674876)

I'd hope that their algorithm is heavily weighted towards accuracy versus speed. Given that Google seems to know what they're doing most of the time, the speed of a site is probably only important enough to move it up or down a few places within a page of search results. I can't remember the last time I had to click past the first page of a Google result, anyway. Of course, there's no way to tell just how much it matters, since they keep the inner workings of PageRank secret.

Re:Penalty for speed (0)

Anonymous Coward | more than 4 years ago | (#32674992)

I don't know their algorithm, but I'd imagine the speed penalty is only a small part of the overall ranking. If the site with the best info loads very slowly, its info-rate should help balance out the speed-rate.

Re:Penalty for speed (1)

cbhacking (979169) | more than 4 years ago | (#32675070)

The ironic thing from my perspective is that Google's own services (ads and analytics) are among the worst offenders for making web pages slow down, in my experience...

Thugs and Vandals (0)

Anonymous Coward | more than 4 years ago | (#32673974)

Google has jumped the shark.

This is only going to discriminate in favour of sites with money behind them. Like stock investment we're seeing Google cranking up the race to buy a big fuck off server and locate it as close to a hub as possible. In an age where information is power the likes of Murdoch et al are just going to get an edge and leave everyone else who may have better content at a disadvantage.

"Compassionate Conservatism." "We're all in this together." "Do no Evil."

This all has the familiar ring of the right wing thug who punches your face in while smiling at the same time.

Re:Thugs and Vandals (0)

Anonymous Coward | more than 4 years ago | (#32674100)

Probably what's coming is the suggestion everyone should just host at Google because it'll be fastest hosted there. Fastest for Google at least.

If avg page is 320 KB, a header is a drop n bucket (0)

Anonymous Coward | more than 4 years ago | (#32673992)

If the average page is 320 KB, a page header is a drop in the bucket. HTTP headers may be 100 or at most 200 bytes. Sounds like a blowhard to me, at least with the header nonsense.

Correction:If avg page is 320 KB, headers use 30% (1)

thijsh (910751) | more than 4 years ago | (#32674512)

Yeah, but that 320 KB is most likely divided over at least 30 HTML/CSS/JS/JPG/PNG/SWF files... And headers include lots of information, including cookies being sent back and forth, so the average headers are closer to 1000 bytes (around 500 each way) per request now. By my count this is around 30 KB, or an overhead of about 10%, so this does leave some room for improvement... But if you account for the fact that those 320 KB will most likely also be transmitted with gzip compression, the bytes over the wire are closer to 100 KB (roughly), which brings the header overhead on the wire to a whopping 30%. Those headers can be compressed with standard gzip to bring it back down to around 8 KB, but if you took advantage of compression with a predefined dictionary optimized for HTTP headers you could shave off a lot more, well under the 30% overhead we currently waste.
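A quick sketch of that back-of-the-envelope arithmetic (the inputs are the rough assumptions from the comment, not measurements):

// Header overhead for one average page load.
var requests    = 30;      // separate HTML/CSS/JS/image fetches per page
var headerBytes = 1000;    // request + response headers per fetch, cookies included
var pageKB      = 320;     // uncompressed page weight
var wireKB      = 100;     // roughly the same page after gzip

var headerKB = requests * headerBytes / 1024;   // ~29 KB of headers
console.log(Math.round(100 * headerKB / pageKB) + '% overhead vs. the uncompressed page');  // ~9%
console.log(Math.round(100 * headerKB / wireKB) + '% overhead vs. bytes on the wire');      // ~29%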

I hope Google will kickstart this initiative by adding it to Google Chrome and contributing code to HTTP servers; this way Chrome will be even faster than the competition and other browsers will have to keep up... I love it when competition works!

Most delay is ad-related. (4, Informative)

Animats (122034) | more than 4 years ago | (#32674052)

Most real-world page load delay today seems to be associated with advertising. Merely loading the initial content usually isn't too bad, although "content-management systems" can make it much worse, as overloaded databases struggle to "customize" the content. "Web 2.0" wasn't a win; pulling in all those big CSS and JavaScript libraries doesn't help load times.

We do some measurement in this area, as SiteTruth [sitetruth.com] reads through sites trying to find a street address on each site rated. We never read more than 21 pages from a site, and for most sites, we can find a street address within 45 seconds, following links likely to lead to contact information. Only a few percent of sites go over 45 seconds for all those pages. Excessively slow sites tried recently include "directserv.org" (a link farm full of ads), "www.w3.org" (embarrassing), and "religioustolerance.org" (an underfunded nonprofit). We're not loading images, ads, Javascript, or CSS; that's pure page load delay. It's not that much of a problem, and we're seeing less of it than we did two years ago.

time standard (0)

Anonymous Coward | more than 4 years ago | (#32674322)

I met the guy who invented pull down windows. In the '60s he had his students run simulations of what PC's should be like in the future. Any request that takes longer than 2 seconds will be uncomfortable to the user.

The opening comment here sez pages are closer to 5 seconds now, which means the web is a lose.

After an hour or more of click-wait-abort and getting no work done it's tempting to give up on computers entirely. Especially when I recall how very fast it was in the '80's using stuff like VUI

I always tell folks they should have one page of text and a handful of graphics totaling maybe 30k.

Sorry I'm not on here often, maybe I'll go post on some old "google groups" next.

Nils K. Hammer

what about the other browsers? (2, Insightful)

SuperBanana (662181) | more than 4 years ago | (#32674288)

Holzle said that competition from Chrome has made Internet Explorer and Firefox faster.

Bull. Back when IE and Firefox's last major releases came out, Chrome was a tiny drop in the bucket market-share-wise. January was the first time it passed Safari in marketshare. I think it's more accurate to say that competition in general has led to companies improving their browsers. I'd bet we could also attribute the performance improvements to better standards compliance by websites, since there are now so many mainstream browsers.

I'd say that Firefox vs IE competition (and Firefox vs Safari on the mac) have inspired the improvements...

Measuring speed from *where* exactly? (5, Interesting)

buro9 (633210) | more than 4 years ago | (#32674316)

Where are they measuring *from*?

I've moved a site from Linode New Jersey to Linode London, UK because the target audience are in London ( http://www.lfgss.com/ [lfgss.com] ).

However in Google Webmaster Tools the page load time increased, suggesting that the measurements are being calculated from US datacentres, even though for the target audience the speed increased and page load time decreased.

I would like to see Google use the geographic target preference and to have the nearest datacentre to the target be the one that performs the measurement... or better still to have both a local and remote datacentre perform every measurement and then find a weighted time between them that might reflect real-world usage.

Otherwise if I'm being sent the message that I am being penalised for not hosting close to a Google datacentre from where the measurements are calculated, then I will end up moving there in spite of the fact that this isn't the right thing for my users.

Re:Measuring speed from *where* exactly? (5, Informative)

Anonymous Coward | more than 4 years ago | (#32674510)

From the docs:

"Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature."

http://www.google.com/support/webmasters/bin/answer.py?answer=158541&hl=en

Re:Measuring speed from *where* exactly? (0)

Anonymous Coward | more than 4 years ago | (#32674742)

That's some advanced stuff you're saying you want to see from google. I doubt it occurred to them - they're new at this whole interwebs thing.

Net neutrality (0)

Anonymous Coward | more than 4 years ago | (#32674406)

Great. Once net neutrality is gone, sites that aren't run by media conglomerates controlling the fast pipes will also lose search rank for being slow.

Lane change. (0)

Anonymous Coward | more than 4 years ago | (#32674430)

"The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle.

So does that mean we don't need broadband?

Average 320KB per page? (1)

IBBoard (1128019) | more than 4 years ago | (#32674886)

I'll have to double-check my sites to be sure, but I think I'd be throwing a huge optimisation at any of my pages that got near 320KB, never mind averaging that large. That's just crazy-huge for a page given the amount of actual useful content that most pages have. If only people put in useful stuff instead of filling sites with pointless cruft.

accelerate? (1)

kkirk (1840930) | more than 4 years ago | (#32675026)

JavaScript, HTTP and TCP have been bleeding for years now...
...JavaScript and friends are slow by design; they often make my computers run like ten-year-old crap fighting ECC errors and I/O errors behind the scenes... this is not Web 2.0, this is Web 0.2
...HTTP was designed to download a web page, not to solve a cryptic JavaScript-initiated download puzzle. I don't really think anyone should save it - it should be replaced for good
so... to speed things up, the headers should be altered - and compressed? Outcome: a smaller required bandwidth footprint... but nowadays links are getting faster; if that was enough in the last few years, it will suffice till the burial of HTTP