
Google Incorporates Site Speed Into PageRank Calculation

Soulskill posted more than 4 years ago | from the thousands-of-websites-just-started-caring-about-optimization dept.


lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of the quality of the content is something I would rather not have. Not that Google asked me."


Sweet (1)

kyrio (1091003) | more than 4 years ago | (#31808600)

Nice

Re:Sweet (0)

Anonymous Coward | more than 4 years ago | (#31808692)

Mod parent up! Google is catching on to something us slashdotters have known for a long time. The person who posts fastest usually has the most insightful things to say!

Re:Sweet (1)

kyrio (1091003) | more than 4 years ago | (#31808710)

I just summed up what I felt about the article in two words!

Re:Sweet (3, Funny)

complacence (214847) | more than 4 years ago | (#31808768)

tl;dr

Re:Sweet (3, Insightful)

K. S. Kyosuke (729550) | more than 4 years ago | (#31808772)

Mod parent up! Google is catching on to something us slashdotters have known for a long time. The person who posts fastest usually has the most insightful things to say!

I have an idea: Slashdot could easily incorporate average commenting speed into its UserRank and serve pages more slowly to excessively fast first posters, giving a chance to other, more insightful readers, such as my humble self.

Re:Sweet (1, Funny)

Anonymous Coward | more than 4 years ago | (#31809090)

Now that is how you create the proper April Fools joke. Why the fuck didn't Slashdot hire you two weeks ago? Does everything have to be immediately apparent these days?

I have nothing to say! (-1, Offtopic)

ThePangolino (1756190) | more than 4 years ago | (#31808694)

I just wanted to let you know.

Asking (4, Funny)

Anonymous Coward | more than 4 years ago | (#31808614)

Not that Google asked me.

Well, now that they know you're an influential Slashdot contributor, I'm sure they'll sit up and take notice.

Re:Asking (0)

Gruff1002 (717818) | more than 4 years ago | (#31808878)

And rank ACs lower.

Slashdot (4, Insightful)

SimonTheSoundMan (1012395) | more than 4 years ago | (#31808616)

So when a site gets slashdotted and blown to oblivion, Google also ranks it lower. Awesome!

Re:Slashdot (1)

Oddscurity (1035974) | more than 4 years ago | (#31808656)

One would think that only matters if the Googlebot happens to be indexing your site at that exact moment; one would additionally think they'd revisit to see whether the slowness is structural or not.

Re:Slashdot (3, Informative)

DNS-and-BIND (461968) | more than 4 years ago | (#31808730)

Googlebot is always indexing your site. I push 8-10 GB of traffic a month (yeah, I know it's not a lot, thanks for informing me) and of that, 1 GB is Google. I don't know why Google constantly loads my pages even though they don't change that much, but it does.

Re:Slashdot (4, Insightful)

loufoque (1400831) | more than 4 years ago | (#31809076)

Maybe if you correctly used Last-Modified and ETag headers with a 304 Not Modified response, you could avoid a significant part of your bandwidth usage.
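
A minimal sketch of the conditional-GET handling described above (plain Python, no particular web framework; the function name and argument shapes are illustrative assumptions):

```python
from email.utils import formatdate, parsedate_to_datetime

def respond(request_headers: dict, body: bytes, last_modified: float, etag: str):
    """Return (status, headers, body), honouring If-None-Match / If-Modified-Since."""
    headers = {
        "Last-Modified": formatdate(last_modified, usegmt=True),
        "ETag": etag,
    }
    if request_headers.get("If-None-Match") == etag:
        return 304, headers, b""  # client's cached copy is still valid: no body sent
    ims = request_headers.get("If-Modified-Since")
    if ims and parsedate_to_datetime(ims).timestamp() >= int(last_modified):
        return 304, headers, b""
    return 200, headers, body     # send the full response only when needed
```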

Re:Slashdot (4, Informative)

amorsen (7485) | more than 4 years ago | (#31809192)

You can associate your site with a Google account and override their heuristic.

Re:Slashdot (5, Interesting)

Stan Vassilev (939229) | more than 4 years ago | (#31808748)

One would think that only matters if the Googlebot happens to be indexing your site at that exact moment; one would additionally think they'd revisit to see whether the slowness is structural or not.

If you use Google Webmaster Central you may notice that, while Google's algorithm is smart, it's also rather overestimated in some areas, and it needs plenty of manual tweaking by Google employees to work properly.

Site speed is not calculated solely from the times the Googlebot takes to crawl a page; it's also derived from Google Toolbars that have the PageRank feature enabled (that feature phones home with which sites you visit and how fast each page loaded).

Whether Google can detect clusters of frequent accesses such as a slashdotting is an open question, since most Slashdot users probably don't run the Google Toolbar with PageRank on; but for the *few* users that do, the site will just appear slow in general.

Additionally, if a site targets a demographic with worse latency (low-income people, areas on dial-up and so on), then again that site will appear slower, when really its visitors just have slower internet in general.

Further, the reason a site is slow often lies somewhere along the route, nowhere near either the visitor's ISP or the site's server, and it doesn't hit all users either. So if you have bad luck, or your content happens to attract users who are often routed through the bad path, you'll lose rank.

Re:Slashdot (1)

Oddscurity (1035974) | more than 4 years ago | (#31808800)

Hmmm, indeed. Maybe they need to rework that feature so that if it passes Y!Slow and similar, it's considered as quick as it's going to get. Otherwise you'll indeed see sites that have no recourse penalised. You make an excellent point.

Re:Slashdot (1)

commodore64_love (1445365) | more than 4 years ago | (#31808902)

It sounds like Google is already using that idea, since "less than 1% of sites are affected" by the speed rating. In other words, only the 1% of sites slower than a YSlow-style threshold would be downgraded in rank; sites faster than that see no effect.

Re:Slashdot (2, Informative)

Sechr Nibw (1278786) | more than 4 years ago | (#31809222)

Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.

Except they (according to the summary; didn't RTFA) aren't going by page load times, they're going by server response. That means pages that are poorly written and take forever to load (or are connected to slow ad servers) won't get downranked for that; only ones with slow server times will. The Slashdot effect could still hit them, but the speed of a user's internet connection has little impact on the speed of a ping.

Re:Slashdot (1)

Dumnezeu (1673634) | more than 4 years ago | (#31809224)

And what's wrong with that? If a server can't handle much load, it's probably not that important, and it's also less valuable to the user if there is a greater chance of getting a browser error page instead of what the user expects. Slashdotting, power failure, tsunami, cleaning lady tripping over the network cables, poor server-side scripting, badly configured web server... what's the difference anyway? The user's experience is degraded by all these factors, so the site should receive a lower ranking. I've always assumed search engines take page load times into consideration.

Re:Slashdot (5, Insightful)

FireFury03 (653718) | more than 4 years ago | (#31809554)

If a server can't handle much load, it's probably not that important

Or it's a very informative hobbyist site with lots of useful info on it, which is slow compared to a well-funded commercial site that has nothing but marketing-speak.

TFA says they are looking at "server response times", but I can't see this being at all useful unless they look at the total page load time (including all the ads that come off slow servers).

Slashdotting, power failure, tsunami, cleaning lady tripping over the network cables, poor server-side scripting, badly configured web server... what's the difference anyway?

The difference is that some of these problems are transitory and some are more permanent. You probably don't want transitory problems to affect the ranking (here's hoping they average it over several crawls).

Re:Slashdot (1)

TheRaven64 (641858) | more than 4 years ago | (#31809604)

The problem is that sites that are experiencing something like the /. effect are doing so because a lot of people are trying to access them, which is usually because they contain something that a lot of people want to see. Ranking them lower is the opposite of helpful.

The other problem with this is that it's very location dependent. A site in the USA takes noticeably longer to load for me than one in the UK, but in a lot of cases I'd rather see the one in the UK because it's locally relevant. If Google's spider is crawling from the USA, it will see an extra ~100ms of latency for the UK page, which can contribute 2-3 seconds to the loading time of a typical page, and so will rank it lower. The same is true even within the USA: I doubt people in New York want to see sites in California prioritised because they happen to be closer to the relevant Google data center.

How about bloat? (2, Interesting)

GrumblyStuff (870046) | more than 4 years ago | (#31808662)

If site A and site B have the same info, then how about weighing which one has the info spread over 10 pages with 3-4 different adservers spewing flash and gifs and all sorts of javascript trickery and which one doesn't (or has less at least)?

Re:How about bloat? (2, Insightful)

Captain Splendid (673276) | more than 4 years ago | (#31808712)

You want Google to discriminate against sites with more advertising? Good luck with that, buddy.

Re:How about bloat? (1)

thetoadwarrior (1268702) | more than 4 years ago | (#31809004)

Google says it's only affecting a tiny fraction of sites anyway, the quality of the content still means more, and it only affects those searching in English on Google.com.

It's probably still an experiment and may go away but with websites becoming more than plain text I'm glad to see performance taken into account.

Where is the 'speed' measured from? (1)

Cimexus (1355033) | more than 4 years ago | (#31808664)

I suppose an obvious question to ask then is: from where is Google measuring site speed? From a single particular server/location (presumably in the US)? From the 'nearest Google datacentre/server farm' to the site (and if so, how do they determine this)?

If they are measuring site speed from a single (US) location, that's gotta be hurting the page rankings for any sites hosted outside the US, as even if those pages are lightning fast locally, you're always going to have that ~100 ms latency to Europe / ~150 ms to Asia / ~200 ms to NZ & Australia etc, from the US.

Re:Where is the 'speed' measured from? (3, Funny)

caffeinemessiah (918089) | more than 4 years ago | (#31808752)

Geez, will you at least RTFS?

Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.

The main site serves visitors from the US. Thus, measuring speeds from multiple locations around the US is probably the best thing to do. They're presumably measuring speed from all their datacenters (their crawlers are likely to be distributed across the country (and world), so recording the average speed over multiple crawls would be a good approximation when you're dealing with the scale of Google and the Web).

Re:Where is the 'speed' measured from? (1)

Cimexus (1355033) | more than 4 years ago | (#31808872)

It didn't say they are only implementing this on the US site. They said searches in English from Google.com.

Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate). And depending on how the browser's been set up, those searches generally get pushed to google.com (the main site). The result page may be redirected to the country-specific one, but if you look at the string the browser sends (i.e. the actual search), it's often the plain old .com.

Re:Where is the 'speed' measured from? (1)

Dogtanian (588974) | more than 4 years ago | (#31809412)

Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate)

Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.

Re:Where is the 'speed' measured from? (1)

TheRaven64 (641858) | more than 4 years ago | (#31809610)

Safari searches on google.com irrespective of locale. I filed a bug report about this five years ago - it was marked as a duplicate but has still not been fixed.

Re:Where is the 'speed' measured from? (1)

FireFury03 (653718) | more than 4 years ago | (#31809632)

Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.

My FireFox sends me to google.com. Which is quite annoying because it means I get the US version of Froogle if I use the "Shopping" link.

But more annoying is the way that Google don't implement some features in every language. E.g. if I want to turn SafeSearch off, I have to switch to English because the Welsh version of the preferences page doesn't have the damned SafeSearch options...

Re:Where is the 'speed' measured from? (1)

M. Baranczak (726671) | more than 4 years ago | (#31808966)

They did say that only 1% of sites are affected. That leads me to believe they have a pretty generous threshold (like several seconds or more). At that point, 200 ms more or less wouldn't make much difference.

Re:Where is the 'speed' measured from? (1)

TheRaven64 (641858) | more than 4 years ago | (#31809648)

200ms is the difference in round-trip time, not the difference in loading time. An RTT difference of 200ms can easily add a couple of seconds to the total loading time, because an HTTP session involves the initial TCP connection setup, then the client making the request, then the server sending the reply, and that sequence repeats for every embedded resource. The RTT also affects the maximum transfer speed because of TCP's slow-start and congestion-control behaviour, so this penalises sites with a lot of content on one page, if Google waits until the entire page is loaded.
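
A rough back-of-envelope illustration of that point (the numbers here are assumptions, not measurements):

```python
# Each extra round trip costs one RTT, so a 200 ms RTT difference multiplies:
# TCP handshake, the HTML request/response, and follow-up requests for
# CSS/JS/images on a typical page all pay it again.
def extra_load_time(rtt_diff_s: float = 0.2, round_trips: int = 10) -> float:
    return rtt_diff_s * round_trips

print(extra_load_time())  # -> 2.0 seconds of extra wall-clock time
```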

Re:Where is the 'speed' measured from? (1)

ObsessiveMathsFreak (773371) | more than 4 years ago | (#31809036)

The speed ranking could be entirely location based.

It's been a talking point for a while for webhosts in Ireland that Google ranks sites more highly if they are based in the same locale/country as the user making requests. In other words(they claim), it's worth paying more to host your site with a local provider than getting a deal with a big overseas web-hosting company. Now they would say that; but having seen my share of generic search results return local companies again and again, I'm inclined to think their notion may have some merit. In any case, if Google are implementing this, they'll probably take location into account in a similar fashion.

Re:Where is the 'speed' measured from? (1)

Cimexus (1355033) | more than 4 years ago | (#31809070)

Yeah I've noticed this in Australia as well. Makes sense though ... it's more likely that someone in Australia would find more relevant information on a .au site, especially if the subject matter is something that varies between countries. E.g. if you looked up "tax law" or "drivers license" or something...

Here's the problem (1)

bogaboga (793279) | more than 4 years ago | (#31808672)

In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed -- that's why we've decided to take site speed into account in our search rankings.

Site speed is almost entirely subjective. Heck... are the tools Google is using to evaluate speed openly known to everyone who matters here?

I predict trouble ahead.

Re:Here's the problem (1)

yakatz (1176317) | more than 4 years ago | (#31808956)

I am not sure if this is how they measure, but Google makes some tools for site owners to check speed and it would be logical that they use something like this.
http://code.google.com/speed/page-speed/
(Similar to YSlow)

Also, Google Webmaster Tools has a Site Performance section (under labs) which may have something to do with this.

I care. (1)

unity100 (970058) | more than 4 years ago | (#31808682)

especially the corporate run sites have become loaded with shit. flash, javascript ads, javascript code that tries to get all kinds of info from me in order to deliver it to the advertisers, banner ads, includes from numerous other sites, their javascript, this that, a lot of loaded shit. some can even clog your browser if they happen to load at the wrong moment.

it will be good. now they will need to weigh speed factor. i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.

Re:I care. (1)

Low Ranked Craig (1327799) | more than 4 years ago | (#31808740)

i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.*

* unless that 3rd party is Google, of course

so, spammers just need servers... (5, Insightful)

FuckingNickName (1362625) | more than 4 years ago | (#31808690)

...close to and prioritising Google. Gotcha.

Really, am I the only one to find Google a fairly poor *find* engine? I mean, for anything which might remotely come close to sounding like it's a product, you've got Wikipedia right at the top, followed by 1000 review/comparison/pricing sites. For a tech question, you have expert-sexchange and 1000 crappy forums with responses from the downright wrong to the gratuitously abusive. I barely use Google (or any search engine much) for their generic WWW search - I'm more likely to be +site: searching a specific newsgroup/support forum/journal/enthusiast site I already know has intelligence. I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.

Re:so, spammers just need servers... (1)

Stan Vassilev (939229) | more than 4 years ago | (#31808806)

...close to and prioritising Google. Gotcha.

That'd do nothing. Speed isn't detected via the Google bots or measured from Google's servers.

Re:so, spammers just need servers... (1)

Pteraspidomorphi (1651293) | more than 4 years ago | (#31808814)

Well, you must know that PageRank, at its core, uses the number of links to a page to rank it, so Wikipedia is bound to show up often. Expert sex change also pisses me off, because you can only see the actual answer if you pay, making me waste a lot of time when I'm not paying attention.

But I think their decision to use website speed is good, provided they test it from several different points across the globe and not just from the US. All other things being similar, I want the webpage I try first to load as fast as possible. Very often (at least for me) google's first page results take forever to load, if they load at all. Sometimes I have to try as many as 5 or 6 results before I manage to load one in less than 10 seconds. Maybe they load faster in the US, or maybe they load slowly precisely BECAUSE they are well ranked by google, but I welcome this attempt to solve that issue.

Re:so, spammers just need servers... (2, Informative)

QuantumRiff (120817) | more than 4 years ago | (#31808842)

What? I always scroll down to the bottom, way past all the crap about paying, and find it waaaay down below. Try scrolling further next time, or just use google's Cached page.

Re:so, spammers just need servers... (4, Informative)

rockNme2349 (1414329) | more than 4 years ago | (#31808936)

Half the people I heard from said that if they scroll all the way to the bottom they can read the answers for free, and the other half say this doesn't work. This confused me for the longest time until I finally figured out the answer.

Expertsexchange allows you to scroll down to the bottom to get a free answer the first time you visit their page, then gives your browser a cookie saying that you have gotten your free answer, and won't show you any more. So if you want to ensure that you can always scroll to the bottom, you simply have to block cookies from them and you are good to go.
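
A sketch of the cookie gating being described, purely to illustrate the logic (the cookie name is made up, not the site's actual implementation):

```python
def should_show_answers(cookies: dict) -> bool:
    # First visit: no marker cookie yet, so the answers render at the bottom.
    return cookies.get("free_answer_used") != "1"

def cookies_to_set_after_view() -> dict:
    # Mark the browser so later visits hide the answers. Blocking or clearing
    # this cookie is why the scroll-down trick keeps working for some people.
    return {"free_answer_used": "1"}
```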

Re:so, spammers just need servers... (0)

Anonymous Coward | more than 4 years ago | (#31809440)

Hit google cache and scroll to bottom.

Re:so, spammers just need servers... (1)

wzzzzrd (886091) | more than 4 years ago | (#31809446)

Either that, or just disable CSS (Firebug, or Chromium's web dev tools [Ctrl+Shift+I]). The actual "hiding" of the answers happens on the client side.

Re:so, spammers just need servers... (1)

rockNme2349 (1414329) | more than 4 years ago | (#31809516)

It has always baffled me that a website that seems to have such a good supply of technical knowledge could be so incompetent at implementing a pay-wall.

Re:so, spammers just need servers... (3, Informative)

TheLink (130905) | more than 4 years ago | (#31809490)

I thought it checked the HTTP Referer header, so if you clicked through from Google, you'd get the answer at the bottom.

But if you copy the URL and paste it on a browser, you don't get to see the answer at the bottom.

Personally what annoys me more than expertsexchange are the journal sites. For those I don't get the answer at the bottom or anywhere, even though it shows up in the Google search results.

It used to be Google policy that a site is not allowed to show different content to Google than it shows to users; they smacked BMW Germany down for that. But now I see lots of sites getting away with it, and no, those journal sites aren't fooled by the user-agent trick.

Perhaps they pay Google to be allowed to do it.
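
For comparison with the cookie explanation earlier in the thread, a sketch of the Referer-based gating suggested here (again hypothetical logic, not the site's real code):

```python
def show_full_answer(request_headers: dict) -> bool:
    referer = request_headers.get("Referer", "")
    # Arriving from a Google results page: show the content Google indexed.
    # Pasting the URL directly: no Referer header, so the answer stays hidden.
    return "google." in referer
```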

Re:so, spammers just need servers... (1)

Ernesto Alvarez (750678) | more than 4 years ago | (#31808864)

In expert-sex-change, you can find the answers just by scrolling down.
It looks like there's just a big footer under the question, but if you keep scrolling down, you'll find the answers.

Re:so, spammers just need servers... (0)

Anonymous Coward | more than 4 years ago | (#31809266)

In expert-sex-change

I love this meme with all my adorable heart.

Re:so, spammers just need servers... (0)

Anonymous Coward | more than 4 years ago | (#31808866)

Expert sex change also pisses me off, because you can only see the actual answer if you pay, making me waste a lot of time when I'm not paying attention.

Or you could just scroll to the bottom of the page where you can view all of the replies for free.

Re:so, spammers just need servers... (0)

Anonymous Coward | more than 4 years ago | (#31808876)

Expert sex change also pisses me off, because you can only see the actual answer if you pay, making me waste a lot of time when I'm not paying attention.

Just ignore the link telling you to log in, scroll down to the bottom of the page, past a bunch of junk, and there are the answers.

Re:so, spammers just need servers... (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#31808880)

Just scroll down to the bottom of the Exchange page. I was fooled like you were for quite some time before someone clued me in. The answers are sometimes helpful.

Re:so, spammers just need servers... (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#31808912)

Expert sex change also pisses me off, because you can only see the actual answer if you pay, making me waste a lot of time when I'm not paying attention.

Scroll down idiot.

Re:so, spammers just need servers... (1)

somersault (912633) | more than 4 years ago | (#31809096)

A "+0.5, Tough Love" mod would have been handy here.

Re:so, spammers just need servers... (1)

thetoadwarrior (1268702) | more than 4 years ago | (#31809518)

They were punished by Google for that, and now expert sex change puts the answer at the bottom, after you scroll through three or so pages' worth of shit. IMO it's still dodgy and Google should still punish them.

Re:so, spammers just need servers... (1)

Kral_Blbec (1201285) | more than 4 years ago | (#31809620)

I think that if you can't figure out how to bypass the expertsexchange paywall, you aren't qualified to try any of the advice there anyway.
For what it's worth, I think I have only found relevant, usable info there about 3 times.

Re:so, spammers just need servers... (5, Insightful)

GIL_Dude (850471) | more than 4 years ago | (#31808868)

You hit the nail on the head with that one. I, too, find myself using queries containing "site:msdn.microsoft.com (rest of search)" (for say Windows API information) or using "-" in the searches to suppress certain results. Like you say, otherwise you get basically "a bunch of crap" - mainly from people who have no idea what they are doing. Just today I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder. I tried searching that on Google - hell half of the hits were stupid posts of every file on a system at malware check sites, or bleepingcomputer.com, or other "is this malware" posts. Did I say half? Shoot - I just checked again and I think I meant 85%. The results for most tech searches are indeed useless unless you already know what site you want and include that information in your search. The internet is just filled with crap sites that make it into the indexes and get high relevance.

Re:so, spammers just need servers... (1)

mindbrane (1548037) | more than 4 years ago | (#31809024)

I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder.

thanks for the tip, i've had similar problems but as i installed the virtual clone drive and a new logitech 9000 series webcam/microphone on the same day i've been contemplating my pc's navel trying to guesstimate which one caused the problem while being too lazy to run it down. the more so since i also installed anydvd and read it installs as a driver.

Re:so, spammers just need servers... (0)

Anonymous Coward | more than 4 years ago | (#31809524)

Yet, it works for so many people...

Personally, for technical problems, I usually copy-paste the error I get and add the programme name (like "Stepmania FFmpeg movie decoder not found"). I also have a rule in my hosts file to redirect "expert-exchange" directly to oblivion (I wish there was a way to remove that crap from Google results!!).

Now of course, if you want to Google right, practice does indeed help!
(Usually, you should try synonyms, and add words just for the context when needed -- "package not found" won't get you anywhere, "debian lenny ffmpeg-devel package not found" might.)

(Or "elbyvcdshell.dll slow folder opening" instead of "elbyvcdshell.dll"...)

Re:so, spammers just need servers... (1)

thetoadwarrior (1268702) | more than 4 years ago | (#31809606)

That is true. What Google needs to do is make it so you can block something from coming up in any search. I hate seeing Rose India results in my Java searches. I remove them when I find them but if I search for something else it can come up again.

That said I do believe the results are getting better. People find ways to trick Google and they get away with it for a bit but Google does catch on. I just checked for a search on Java servlet tutorial and Rose India ranks much lower than they used to and they're lower than on Bing.

Re:so, spammers just need servers... (1)

Yvanhoe (564877) | more than 4 years ago | (#31809268)

Google is still the best tool to do a search on the whole internet.
If you have another way to find an answer to a tech question than checking the first 40 Google entries, I am more than willing to check it out, but as crappy as one might call it, I have the feeling that it remains the best.

Re:so, spammers just need servers... (1)

Junior J. Junior III (192702) | more than 4 years ago | (#31809300)

I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.

I think that would be giving too much power and responsibility to a tiny number of people who have no accountability to anyone except google.

But a better idea might be to allow the user to interact with the search results page, moderating the results and flagging results that weren't helpful.

Re:so, spammers just need servers... (1)

FuckingNickName (1362625) | more than 4 years ago | (#31809560)

But a better idea might be to allow the user to interact with

Anonymous users, and teams of users operating under declared banners (acting independently of Google, but using an interface provided by Google). Such teams compete to provide the best filters, since there will inevitably be different segments of the population with different opinions on what's shit and what's not. Add the possibility to filter results by one or more teams, with meta-lists.

For example:
- Anti-Wikipedia-clone teams, which identify all clones of Wikipedia;
- Anti-paywall teams;
- Anti-comparison/review teams, which get rid of all the fucking "READ TWO LINE REVIEW OF PRODUCT XYZ" sites;
- Anti-porn teams, which slavishly discover/visit porn sites which nevertheless appear when supposed filtering is enabled... etc.

The web's size hasn't come near to correlating with availability of good information.

Of course, making Google any better than just-a-little-bit-better-than any other search engine reduces time spent using Google, so reducing ad revenue, so it's never really in Google's interest to improve more than it has to. It's learnt that from Microsoft's approach to the Internet, I guess.

Re:so, spammers just need servers... (1)

thetoadwarrior (1268702) | more than 4 years ago | (#31809508)

I usually find a relevant link right at the top for products, but if you don't, then click the Shopping link and it's nothing but products. You may not like the results, but my guess is they're spot on for most people, otherwise they wouldn't be number one. Keep in mind some people do want price comparisons or just plain info, like you may find on Wikipedia.

Not as bad as it looks (1)

fph il quozientatore (971015) | more than 4 years ago | (#31808704)

Obviously it needs to be refined, but in principle it's not as bad as it looks. There are a lot of queries where you'd rather have a big company's site on the first page of results than an obscure blog or scam site. Discovering how much they're willing to pay for bandwidth is one way to tell them apart.

Re:Not as bad as it looks (1)

dingen (958134) | more than 4 years ago | (#31808734)

The thing is though that most obscure blogs or scam sites tend to load a lot faster than sites of big corporations.

Re:Not as bad as it looks (1)

amorsen (7485) | more than 4 years ago | (#31809390)

There are a lot of queries where you'd rather have a big company's site on the first page of results than an obscure blog or scam site.

You assume that big companies can afford powerful web servers and fast lines.

I offer you HP and Cisco who seem to be hosted on the same Commodore 64 in Timbuktu on a GPRS line.

Slowbotted (5, Funny)

Naatach (574111) | more than 4 years ago | (#31808706)

Yay! I can DDoS my competitors and have Google endorse it!

Re:Slowbotted (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31809134)

Since Google is using the Google Toolbar to measure responsiveness, wouldn't it make more sense, and be more efficient, to reverse engineer the protocol Google is using and submit faked response times to the Google server?

That is, assuming you have control over a botnet that you are using to perform the DDoS attacks.

Net or Search Neutrality? (2, Funny)

JordanH (75307) | more than 4 years ago | (#31808746)

So, now well-connected sites run by media companies will have more relevance in Search results vs. minority opinions put out on a cheap web host?
'Do no evil' is meaningless if you don't actually examine what you are doing.

bias for large advertisers (1)

fermion (181285) | more than 4 years ago | (#31808756)

This is clearly an effort to give precedence to commercial enterprises and advertisers. Take a link farm: nothing really there, so it doesn't require much power to serve pages. Pages will load quickly and, coincidentally, generate revenue for Google. For legitimate businesses, those that can afford network optimizations are exactly those that will also pay for ads. OTOH, web sites that provide useful services but are slow are eventually going to be left in the dust. More link farms, fewer useful services, a lamer Google. Too bad MS can't put together a legitimate search engine.

Re:bias for large advertisers (1)

Todd Knarr (15451) | more than 4 years ago | (#31809016)

Possibly, but remember what Google also said: speed has only a very low weight compared to relevance. So if sites A and B have almost equal relevance, then speed might determine the order, but if A gets 98% for relevance and B gets 90%, then the fact that B is twice as fast as A will be pretty much ignored when ranking them.
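
A toy illustration of why a low-weight signal barely moves the order; the weighted sum below is an assumption for demonstration, not Google's published formula:

```python
def score(relevance: float, speed: float, speed_weight: float = 0.02) -> float:
    return (1 - speed_weight) * relevance + speed_weight * speed

a = score(relevance=0.98, speed=0.40)  # very relevant but slow
b = score(relevance=0.90, speed=0.80)  # less relevant, twice as fast
print(a > b)                           # True: relevance still dominates
```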

Re:bias for large advertisers (0)

Anonymous Coward | more than 4 years ago | (#31809312)

Seeing as Google will typically have things "right" (albeit indefinitely in beta), I'd buy into this sentiment. It doesn't make any sense to serve a faster site that has significantly lower relevance (i.e. 5% lower or whatever threshold they give it).

BUT if they now take two sites with 98% and 97% relevance, you might get the 97% relevant site first. Provided that 97 isn't expertsexchange, this is a big benefit to the user, since GOOGLE got them the answer faster than $OTHER_SEARCH. Rationally thinking people (OK, /.ers...) would then likely continue to use Google over $OTHER, because $OTHER might be serving the more relevant (but $Deity-awful slow) sites.

Re:bias for large advertisers (1)

thetoadwarrior (1268702) | more than 4 years ago | (#31809638)

The quality of the content still matters more so link farms don't win just by being faster.

That sounds reasonable.... so far (5, Insightful)

erroneus (253617) | more than 4 years ago | (#31808770)

Does this help do battle against spam/scam sites? Yes.

Does this help hosts of original content? Maybe... maybe not.

Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might.

After all, those seeking to act against Google's interests by lowering speed and throughput to and from Google would automatically get a lower rank. Think about some of the newspapers out there who can't get over their aging business model. Think about other sources of information that might also be competitors of Google in other markets. At the moment, Google is the primary source for lots of people.

I must admit, I am having some difficulty coming up with arguments against this idea but I can't help but get a slightly uneasy feeling about this just the same.

Re:That sounds reasonable.... so far (5, Interesting)

rockNme2349 (1414329) | more than 4 years ago | (#31808970)

Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.

It sounds to me like a push completely against net neutrality. The websites that are served up faster get a higher rank; the websites that are throttled get a lower rank. Net neutrality isn't about how website owners manage their bandwidth for their visitors; they've always been free to do what they want. Net neutrality is about ISPs and other backbone entities of the internet throttling traffic. If there were an ISP between Google and two webpages, it could directly influence their ranks by throttling the site it wants knocked down and prioritizing the site it wants to give a higher rank.

Re:That sounds reasonable.... so far (0)

Anonymous Coward | more than 4 years ago | (#31809468)

Ok, so let's say X site is the best, most noted site for its facts on ABC in the world, but is hosted out of someone's living room because it contains necessarily sensitive, secure data. The opposing viewpoint site is hosted by an idiot with a lot of political contribution money who hosts it in the Amazon cloud.

The site with the highest bandwidth is now ranked higher?

OR, let's say you're a chinese blogger and you host your blog with some relatively low powered servers in Uzbekiville because they are the most protected servers in the world in terms of privacy. YOUR site's opponent is the chinese government who can create 1 billion pages to kill your page rank, not with relevance, but with faster servers and more bandwidth.

The fastest site on a given subject is not necessarily the best. It's possible that if you have a lot of visitors (which slow the site down), you'll be punished FOR YOUR MERIT.

That doesn't make sense to me.

Net Neutrality, I Knew Thee Well (0)

Anonymous Coward | more than 4 years ago | (#31808780)

I suppose net neutrality only applies when you're serving the derivative work of newspapers, blogs and television companies - and not when you're scraping it.

Re:Net Neutrality, I Knew Thee Well (1)

blackraven14250 (902843) | more than 4 years ago | (#31808948)

What the fuck? Google is a search provider, not an internet provider. If they think the site's speed provides a good indicator of its usefulness to the user, let them go for it. They're not favoring one over the other because of a payoff, but because of their actual performance.

As a user of Google (1)

wisnoskij (1206448) | more than 4 years ago | (#31808828)

"our users place a lot of value in speed"
is not my opinion in the least; personally, I value quality over speed.

Re:As a user of Google (1)

arkenian (1560563) | more than 4 years ago | (#31809056)

"our users place a lot of value in speed" is not my opinion in the least, personally I like quality over speed.

Absolutely. But in many, many searches there are going to be a hundred sites of roughly the same quality. In that case, I want the fastest to win. Also, in searches that aren't narrow enough, high speed at the top will quickly tell me I have to refine my search parameters. Overall, I have to agree that optimizing for speed should optimize the whole search experience for all of us. As long as it's just one of several signals, that is.

Re:As a user of Google (0)

Anonymous Coward | more than 4 years ago | (#31809500)

Liar. You'll stop page load after 5 seconds, unless you are absolutely certain that the page contains what you need.

net neutrality (0)

Anonymous Coward | more than 4 years ago | (#31808938)

This seems like an indirect push AGAINST net neutrality, not for it.
Although the strict definition of net neutrality doesn't apply, the essence of it is the same: the one that can afford greater bandwidth has a louder voice.

Measured via the toolbar (2, Informative)

asquithea (630068) | more than 4 years ago | (#31808940)

From a slightly older article [blogspot.com] on the same blog:

The load time data is derived from aggregated information sent by users of your site who have installed the Google Toolbar and opted-in to its enhanced features.

So this isn't quite as susceptible to people playing games with Googlebot as it might appear.

Bad (2, Insightful)

dandart (1274360) | more than 4 years ago | (#31808942)

If another site pretends to be me, or tries to sell products that sound like my product, and has more money than me to spend on servers, and is closer to Google, then Google will direct people to it instead of me. Bad move.

Goodbye neutrality (0)

Anonymous Coward | more than 4 years ago | (#31808962)

Oh boy, all the more reason for ISPs to tier bandwidth! I can see the gleam in the marketing department's eyes now: pay us extra or your traffic will be a bit slower AND you won't show up in search rankings.

That's not PageRank (0)

Anonymous Coward | more than 4 years ago | (#31808986)

PageRank is a specific calculation that just looks at incoming links to a site. This change has to do with how a site is ranked by the search engine, and has nothing to do with the PageRank part of the algo.

Result of anti-Net Neutrality ruling (0)

Anonymous Coward | more than 4 years ago | (#31809132)

I think this is just a result of the net neutrality ruling. As ISPs start to choke the bandwidth of people that do not pay for premium access they will drop in the Google rankings.

Isn't Google missing the point? (4, Interesting)

fluffernutter (1411889) | more than 4 years ago | (#31809150)

I occasionally put websites together for small businesses, and it seems increasingly hard to get these kinds of websites known. Google seems to be favouring websites with lots of content more and more, and now those with speedier responses too, which will completely slant its rankings towards large companies with huge resources.

For example, I did a website for a lady that sells garden and landscaping lighting locally to where I'm from. Her business is not one that needs a large website; she basically just wants her catalog to display, but she does want people to find her with Google. I've done all the things like making sure the title is accurate and the headers are relevant, etc. However, it seems to me that much of it is futile. Unless hers is the type of business that focuses on inviting people to add content to her site (in other words, an internet/web business), the sad truth is that she will basically get ignored by Google.

Wow (1)

phantomfive (622387) | more than 4 years ago | (#31809176)

First time my website [byuh.edu] has ever moved up in pagerank!! Lazy HTML FTW!!

why telling the truth can be uncomfortable (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31809194)

it can happen when the truth bears difficulties to face.

more often, it's a pride/ego deal when one must reveal previous dishonesty.

either way, the results are an improvement over what we often do now.

never a better time to consult with/trust in your creators, sporting 100% uptime running at the speed of right from the highest ranking site in the universe. goo-goo that. see you there?

I might just install Google Toolbar (1)

bhagwad (1426855) | more than 4 years ago | (#31809232)

I'm not sure how many of my readers have Google toolbar installed. Guess I can install it myself and visit my site for Google to get the page load speed data...

Not pagerank? (1)

leenks (906881) | more than 4 years ago | (#31809306)

Where does TFA mention this is incorporated in PageRank (a very specific algorithm)?

Google use hundreds of algorithms to determine the ranking of pages in result lists and my understanding, from talks Google staff have given, is that PageRank is used in only a tiny fraction of queries.

Re:Not pagerank? (0)

Anonymous Coward | more than 4 years ago | (#31809550)

Correct, the submitter is clueless; "Google is now taking into account how fast a page loads in calculating its PageRank" is clearly wrong.

Speed == quality. (0)

Anonymous Coward | more than 4 years ago | (#31809382)

> anything that further reduces the influence of the quality of the content is something I would rather not have.

See title. If someone has something insightful to say, usually said someone can say it in simple terms (because that is a virtue of powerful minds).

Simple terms, simple web markup... you see where I'm going.

I can't stand slowness. So, for one, I think faster is great.

> Not that Google asked me.

Apparently, Google was seen drinking a coffee with Apple. Apparently, they asked Apple something. Apparently, or possibly, they thought someone was behind Adobe's increased speed in Widow$. Apparently, they think M$ is doing whatever it wants. Apparently, they decided to put a stop to that, since the DOJ was ineffective.

I applaud this move with a standing ovation (though I don't endorse Jobs' ways -- I liked Woz best).

Net neutrality? (1)

Dragoniz3r (992309) | more than 4 years ago | (#31809424)

How long until I have to pay my webhost not to sandbag these measurements? Yippee...

Copy and paste article searching? (1)

Midnight Thunder (17205) | more than 4 years ago | (#31809434)

Next we will see support for "copied and pasted" text, where the main content of one site is the same as another. I can imagine now in my results "likelihood of matching text as previous result: x%". This should help work out which pages are simply copied and pasted blogs, news or press releases.

Don't do evil (1)

gmuslera (3436) | more than 4 years ago | (#31809476)

Google must be stopped! It's taking advantage of its monopoly to... to... well, do good. At least from their point of view. Some sites are badly coded and don't even try to be optimized; speeding them up probably won't require a big investment, while it will improve the experience for visitors.

But on the other hand, some sites, by their goals, general idea, location or popularity, end up being slow from Google's point of view and get punished, despite potentially being the authority on some topic. This could be mitigated a bit if the "speed" they measure is the kind of metric and recommendations that Page Speed, YSlow and some of their other suggested tools produce, which for the most part aren't about how fast your server-side scripts run or how much bandwidth your server has, but about cheap-to-follow directives like compressing output, optimizing images, and where you include your JavaScript/CSS in the HTML.
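
As a small illustration of one of those cheap directives, here is what output compression alone can do to a repetitive HTML payload (the ratio is illustrative and depends entirely on the page):

```python
import gzip

html = ("<html><body>"
        + "<p>Lorem ipsum dolor sit amet.</p>" * 500
        + "</body></html>").encode()
compressed = gzip.compress(html)
print(len(html), "->", len(compressed), "bytes")  # repetitive markup shrinks a lot
```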

Optimize for Google spider to rank higher (0)

Anonymous Coward | more than 4 years ago | (#31809512)

In other words, they're compelling webmasters to optimize their sites to respond more quickly to Google's spider, in order to improve their rankings.

"Google Site Speed" is not the host provider speed (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31809658)

Google Site Speed is about how well you have kept to the protocol specs to make sure the size of your website is as small as possible, so that as it travels through the pipes it does so as efficiently as possible. It is NOT a rank of how fast your host provider delivers to the end user.

Badly implemented pages will get a lower rank (and so they should, IMHO).

Google is trying to make sure everyone makes clean websites. I am sure Google also benefits by saving power and processing costs if the number of kilobytes to parse and store per web page is smaller.
