
Webmasters Pounce On Wiki Sandboxes

simoniker posted more than 9 years ago | from the fold-spindle-mutilate dept.

The Internet 324

Yacoubean writes "Wiki sandboxes are normally used to learn the syntax of wiki posts. But webmasters may soon deluge these handy tools with links back to their site, not to get clicks, but to increase Google page rank. One such webmaster recently demonstrated this successfully. Isn't it time for Google finally to put some work into refining their results to exclude tricks like this? I know all the bloggers and wiki maintainers would sure appreciate it."

324 comments

The Myth of the Gipper (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#9357469)

Reagan Didn't End the Cold War

By WILLIAM BLUM

Ronald Reagan's biggest crimes were the bloody military actions to suppress social and political change in El Salvador, Nicaragua, Guatemala and Afghanistan, but I'd like to deal here with the media's gushing about Reagan's supposed role in ending the cold war. In actuality, he prolonged it. Here is something I wrote for my book Killing Hope.

It has become conventional wisdom that it was the relentlessly tough anti-communist policies of the Reagan Administration, with its heated-up arms race, that led to the collapse and reformation of the Soviet Union and its satellites. American history books may have already begun to chisel this thesis into marble. The Tories in Great Britain say that Margaret Thatcher and her unflinching policies contributed to the miracle as well. The East Germans were believers too.

When Ronald Reagan visited East Berlin, the people there cheered him and thanked him "for his role in liberating the East". Even many leftist analysts, particularly those of a conspiracy bent, are believers. But this view is not universally held; nor should it be. Long the leading Soviet expert on the United States, Georgi Arbatov, head of the Moscow-based Institute for the Study of the U.S.A. and Canada, wrote his memoirs in 1992. A Los Angeles Times book review by Robert Scheer summed up a portion of it:

Arbatov understood all too well the failings of Soviet totalitarianism in comparison to the economy and politics of the West. It is clear from this candid and nuanced memoir that the movement for change had been developing steadily inside the highest corridors of power ever since the death of Stalin. Arbatov not only provides considerable evidence for the controversial notion that this change would have come about without foreign pressure, he insists that the U.S. military buildup during the Reagan years actually impeded this development.

George F. Kennan agrees. The former US ambassador to the Soviet Union, and father of the theory of "containment" of the same country, asserts that "the suggestion that any United States administration had the power to influence decisively the course of a tremendous domestic political upheaval in another great country on another side of the globe is simply childish." He contends that the extreme militarization of American policy strengthened hard-liners in the Soviet Union. "Thus the general effect of Cold War extremism was to delay rather than hasten the great change that overtook the Soviet Union."

Though the arms-race spending undoubtedly damaged the fabric of the Soviet civilian economy and society even more than it did in the United States, this had been going on for 40 years by the time Mikhail Gorbachev came to power without the slightest hint of impending doom. Gorbachev's close adviser, Aleksandr Yakovlev, when asked whether the Reagan administration's higher military spending, combined with its "Evil Empire" rhetoric, forced the Soviet Union into a more conciliatory position, responded:

It played no role. None. I can tell you that with the fullest responsibility. Gorbachev and I were ready for changes in our policy regardless of whether the American president was Reagan, or Kennedy, or someone even more liberal. It was clear that our military spending was enormous and we had to reduce it.

Understandably, some Russians might be reluctant to admit that they were forced to make revolutionary changes by their arch enemy, to admit that they lost the Cold War. However, on this question we don't have to rely on the opinion of any individual, Russian or American. We merely have to look at the historical facts. From the late 1940s to around the mid-1960s, it was an American policy objective to instigate the downfall of the Soviet government as well as several Eastern European regimes. Many hundreds of Russian exiles were organized, trained and equipped by the CIA, then sneaked back into their homeland to set up espionage rings, to stir up armed political struggle, and to carry out acts of assassination and sabotage, such as derailing trains, wrecking bridges, damaging arms factories and power plants, and so on.

The Soviet government, which captured many of these men, was of course fully aware of who was behind all this. Compared to this policy, that of the Reagan administration could be categorized as one of virtual capitulation.

Yet what were the fruits of this ultra-tough anti-communist policy? Repeated serious confrontations between the United States and the Soviet Union in Berlin, Cuba and elsewhere, the Soviet interventions into Hungary and Czechoslovakia, creation of the Warsaw Pact (in direct reaction to NATO), no glasnost, no perestroika, only pervasive suspicion, cynicism and hostility on both sides.

It turned out that the Russians were human after all -- they responded to toughness with toughness. And the corollary: there was for many years a close correlation between the amicability of US-Soviet relations and the number of Jews allowed to emigrate from the Soviet Union. Softness produced softness. If there's anyone to attribute the changes in the Soviet Union and Eastern Europe to, both the beneficial ones and those questionable, it is of course Mikhail Gorbachev and the activists he inspired.

It should be remembered that Reagan was in office for over four years before Gorbachev came to power, and Thatcher for six years, but in that period of time nothing of any significance in the way of Soviet reform took place despite Reagan's and Thatcher's unremitting malice toward the communist state.

Why just wikis? (4, Insightful)

GillBates0 (664202) | more than 9 years ago | (#9357476)

Why not normal discussion boards and blogs? We, for one, saw how the SCO joke (litigious b'turds) managed to GoogleBomb SCO in first place without a problem.

Re:Why just wikis? (5, Funny)

caino59 (313096) | more than 9 years ago | (#9357622)

We, for one, saw how the SCO joke (litigious b'turds) managed to GoogleBomb SCO in first place without a problem.

You forgot the link: Litigious Bastards [sco.com]

Re:Why just wikis? (3, Interesting)

abscondment (672321) | more than 9 years ago | (#9357640)

posting on Wikis doesn't screw up your own blog.

posts on message boards will be deleted quickly, unless the board is expressly google bombing (as in the current Nigritude Ultramarine 1st placer [google.com]) / people are stupid

i think the idea is that wikis make it easier in general for your post to stay up and not affect your blog.

Re:Why just wikis? (5, Informative)

ichimunki (194887) | more than 9 years ago | (#9357735)

The real problem with wikis is that the link will remain there even after it has been removed from the current page, because most wikis have a revision-history feature. So what's needed is a careful setup in the robots.txt file, plus other HTML clues for the web crawlers, to exclude everything but the most current version of a page (and to skip over the other 'action' pages, like edits, etc.).

My wiki got hit by this stupid link, but not in the sandbox. Of course, recovering the previous version of the page is easy... it's wiping out any trace of the lameness that gets trickier. I suppose the easiest way to defeat this would be to require simple registration in order to edit Wiki pages.

What else can we do? Alter the names of the submit buttons and some of the other key strings involved in editing?
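A robots.txt along these lines could keep crawlers off the revision and action pages; the paths below are hypothetical, assuming a UseMod-style wiki.pl URL scheme, and older crawlers treat Disallow as a simple prefix match against the path plus query string:

```
User-agent: *
Disallow: /cgi-bin/wiki.pl?action=
Disallow: /cgi-bin/wiki.pl?SandBox
```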

Re:Why just wikis? (4, Interesting)

nautical9 (469723) | more than 9 years ago | (#9357641)

I host my own little phpBB board for friends and family, but it is open to the world. Recently I've noticed spammers registering users for the sole purpose of being included in the "member list", with a corresponding link back to whatever site they wish to promote. They'll never actually post anything, but they've obviously automated the sign-up procedure, as I get a new member every day or so, and Google will eventually find the member-list link.

And of course there are still sites that list EVERY referrer from their logs somewhere on the site, so spammers have been adding their URLs to their bots' user-agent strings. It's amazing the lengths these people will go to to spam Google.

Sure hope they can find a nice, elegant solution to this.
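One stopgap is a periodic cleanup pass that flags zero-post accounts registered with a homepage URL. A sketch of the idea; the field names here are hypothetical stand-ins, not the real phpBB schema:

```python
def suspicious_members(members):
    """Flag members who have never posted but registered with a homepage URL.

    `members` is a list of dicts standing in for rows of the forum's user table.
    """
    return [m["username"] for m in members
            if m["post_count"] == 0 and m.get("homepage")]

# Hypothetical sample data
members = [
    {"username": "alice",   "post_count": 12, "homepage": ""},
    {"username": "linkbot", "post_count": 0,  "homepage": "http://spam.example"},
]
```

Accounts it flags could be held for review rather than deleted outright, since some legitimate lurkers do fill in a homepage.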

Re:Why just wikis? (3, Insightful)

Andy Mitchell (780458) | more than 9 years ago | (#9357742)

I'm not sure this will make you feel better, but this strategy has a limited lifetime.

The contribution of your page to another page's PageRank depends on two factors: first, the PageRank of your page, and second, the number of links coming from your page.

As more people take up this tactic, the return everyone gets from it shrinks. E.g., when there are hundreds of links on that page, they cease to have any real value. Eventually people should give up on this one.
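In the simplified PageRank model this dilution is easy to see: each outbound link passes an equal share of the page's rank, so a hundredth link from a sandbox carries a hundredth of the value. A minimal sketch (the 0.85 damping factor is the commonly cited default and an assumption here):

```python
def link_value(page_rank: float, outlinks: int, damping: float = 0.85) -> float:
    """Simplified PageRank share each linked page receives from this page."""
    return damping * page_rank / outlinks

# A lone link from a page passes 100x what one of a hundred links does.
solo = link_value(1.0, 1)        # 0.85
crowded = link_value(1.0, 100)   # 0.0085
```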

visual security code for sign-up (4, Informative)

Saeed al-Sahaf (665390) | more than 9 years ago | (#9357765)

Most BB boards (including phpBB, upgrade!) and blogs (including Slashdot) now feature the visual security code for sign-up. But, of course, this does not prevent hand entry of spam...

Re:visual security code for sign-up (5, Insightful)

stevey (64018) | more than 9 years ago | (#9357821)

There was a story about defeating this system on /. a while back.

Rather than using OCR or anything, people would merely harvest a load of images from a signup site -- possible when there is only a finite number of images, or when there is a consistent naming policy.

Then, once the images were collected, they would merely set up an online porn site, asking people to join for free, proving they were human by decoding the very images they had downloaded.

Human lust for porn meant that they could decode a large number of these images in a very short space of time, then return and mount a dictionary attack...

Quite clever really, sidestepping all the tricky obfuscation/OCR problems by tricking humans into doing their work for them.

Re:Why just wikis? (5, Funny)

Anonymous Coward | more than 9 years ago | (#9357682)

Why not normal discussion boards and blogs?

As an employee of JBOSS [jboss.org], I'm shocked and appalled at your suggestion. Fortunately, JBOSS [jboss.org] is working on a new JBOSS [jboss.org] solution to overcome this problem using JBOSS [jboss.org]. We at JBOSS [jboss.org] are passionate that our JBOSS [jboss.org] technology will prevent even non- JBOSS [jboss.org] users from taking advantage of boards this way.

Frank Lee Awnist
JBOSS [jboss.org] Employee
JBOSS [jboss.org] Inc.

JBOSS [jboss.org] JBOSS [jboss.org] JBOSS [jboss.org]

Re:Why just wikis? (1)

Andy Mitchell (780458) | more than 9 years ago | (#9357785)

You missed out on: 1) The opinions expressed in this post are not necessarily those of.... 2) This message is (c) of....

Re:Why just wikis? (1)

athakur999 (44340) | more than 9 years ago | (#9357827)

Most wiki sandboxes will let you modify them without any sort of registration at all, so it's much more time effective than signing up for a bunch of discussion boards, waiting for the validation emails, etc. They also probably have a higher average page rank than most discussion boards and blogs would, so a little goes a long way.

Cyberneighborhood Not-Watch? (5, Interesting)

raehl (609729) | more than 9 years ago | (#9357478)

In the real world, there are neighborhood watch signs to "deter" criminals.

Perhaps there could be a command in the robots.txt file which says "Browse my site, but don't count any links here for page ranking"? That would make your site less of a target for spammers, but not prevent you from being ranked at all.

Re:Cyberneighborhood Not-Watch? (3, Insightful)

lunax (235701) | more than 9 years ago | (#9357553)

Why not put the sandbox in its own folder and add an entry to robots.txt telling crawlers not to browse that folder?
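For a wiki that serves the sandbox from its own path, that entry is a one-liner (the folder name here is a hypothetical example):

```
User-agent: *
Disallow: /sandbox/
```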

Re:Cyberneighborhood Not-Watch? (2, Informative)

Random Web Developer (776291) | more than 9 years ago | (#9357645)

The problem with wikis is that they use one template for all pages, including the sandbox; everything is wiki.pl?PageName or something like that. You would have to dive into the code instead of just "using" the wiki.

Re:Cyberneighborhood Not-Watch? (5, Informative)

Random Web Developer (776291) | more than 9 years ago | (#9357623)

There is a robots meta tag for this that you can put in the headers of a single page (robots.txt needs subdirectories), but unfortunately most webmasters are too ignorant to realize the power of these:

http://www.robotstxt.org/wc/meta-user.html
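For a sandbox page, the tag goes in the page's head and can drop it from the index entirely, for example:

```html
<!-- keep this page out of the index and pass no link value -->
<meta name="robots" content="noindex,nofollow">
```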

LOLO (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#9357480)

Firstus postus!!

I put link back to my ass, increase the rank!

Oh well (5, Informative)

SpaceCadetTrav (641261) | more than 9 years ago | (#9357485)

Google and others will just lower/diminish the value of links from Wiki pages, just like they did to those open "Guest Book" pages on personal sites.

Yes... PLEASE... (4, Insightful)

Paulrothrock (685079) | more than 9 years ago | (#9357491)

Google needs to do something about this. I had to turn off comments on my blog because all I was getting was spam -- two or three a day that I had to go in and delete. Now I have to find a system that will keep the bots out.

What happened to the nice internet we had in 1996?

Re:Yes... PLEASE... (1)

ack154 (591432) | more than 9 years ago | (#9357533)

I still haven't really seen a problem with this on my blog. I've had comments enabled for the past two years and have maybe gotten 3 or 4 total spam comments in that time (one today actually).

Mine has always been set to not allow anon comments, but I know most people have that set as well.

I have been using MovableType and just haven't really had any problems. Been lucky I guess.

Re:Yes... PLEASE... (1)

Paulrothrock (685079) | more than 9 years ago | (#9357656)

I'm using Wordpress, and before that b2. It's only started in the past month, too.

Unfortunately, my spam comments fill in the email fields, so I can't turn off anonymous comments. Is there any way for me to get the IP addresses of spam comments and forward them to the authorities?

Nice guys get outsourced. (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#9357558)

It was replaced.

Re:Yes... PLEASE... (2, Interesting)

lukewarmfusion (726141) | more than 9 years ago | (#9357587)

As my site grows, I'm thinking about adding a mechanism to address those issues: when the user requests a page for the first time, he'll get a session value that says he's a valid visitor to the site. When he submits a comment, he has to have that value, or comments aren't allowed. I don't know how you'd write a script to circumvent that. (If someone can tell me, I'd love to know so I can try to prevent it!)
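A minimal sketch of that idea, using an HMAC-signed timestamp as the session value (the secret is a hypothetical server-side constant). Note the limitation: a bot that fetches the page before posting would still receive a valid token, so this only raises the bar rather than blocking scripts outright:

```python
import hashlib
import hmac
import time

SECRET = b"change-me-server-side-secret"  # hypothetical

def issue_token() -> str:
    """Handed out when the visitor first requests the page."""
    ts = str(int(time.time()))
    sig = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    return f"{ts}:{sig}"

def token_valid(token: str, max_age: int = 3600) -> bool:
    """Reject comments whose token is missing, forged, or stale."""
    ts, _, sig = token.partition(":")
    if not ts.isdigit():
        return False
    expected = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() - int(ts) <= max_age
```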

The bot can keep the cookies it gets (0)

Anonymous Coward | more than 9 years ago | (#9357737)

Coding cookie preserving http connection takes about 20 lines of java code. You better think about something better.

Re:Yes... PLEASE... (1)

karmatic (776420) | more than 9 years ago | (#9357593)

For my blog (which uses WordPress), I added a redirect page. This page has noindex,nofollow on it, so no PageRank goes out.

Also, any comment containing more than X links or spammy terms (customizable) automatically requires moderator approval.

So far, no successful spam. Remove the incentive, and nobody will bother.
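The link-counting filter can be sketched in a few lines; the threshold and term list below are hypothetical stand-ins for the customizable settings described above:

```python
import re

MAX_LINKS = 3                         # hypothetical threshold
SPAM_TERMS = {"casino", "mortgage"}   # hypothetical term list

def needs_moderation(comment: str) -> bool:
    """Hold a comment for approval if it has too many links or spammy terms."""
    links = re.findall(r"https?://", comment, re.IGNORECASE)
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return len(links) > MAX_LINKS or bool(words & SPAM_TERMS)
```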

"What happened to the nice internet we had in 1996?
AOL. No really (although other "user-friendly ISPs hurt too"). Because of the influx of technically illeterate (or just incompetent) people, "Spammy" techniques work. PR manipulation, bulk mailings, etc. actually make money.

Where the suckers are, the people who exploit them go. A mandatory proficiency/IQ test to get on the 'net would go a long way towards helping alleviate these problems.

Re:Yes... PLEASE... (0)

Anonymous Coward | more than 9 years ago | (#9357651)

influx of technically illeterate (or just incompetent) people,

Nice irony there.

Re:Yes... PLEASE... (1)

Frizzle Fry (149026) | more than 9 years ago | (#9357792)

Based on your use of bold, you seem to be saying it's ironic that he couldn't spell illiterate, but equally ironic is that his screed against the "technically illeterate" is contained in an improperly closed italics tag.

Re:Yes... PLEASE... (4, Interesting)

n-baxley (103975) | more than 9 years ago | (#9357638)

The system was even easier to rig back then. Back in '96 or so, I created a web page with the title "Not Sexy Naked Women", repeated that phrase several times, and added a message telling people to click the link below for more Hot Sexy Naked Women, which took them to a page that admonished them for looking for such trash. I added a banner ad to the top of both of these pages, submitted them to a search engine, and made $500 in a month! Things are better today, but they're still not perfect.

Re:Yes... PLEASE... (2, Funny)

happyfrogcow (708359) | more than 9 years ago | (#9357646)

What happened to the nice internet we had in 1996?

i blame blogs

Re:Yes... PLEASE... (2, Insightful)

Paulrothrock (685079) | more than 9 years ago | (#9357725)

No, I blame opportunistic bastards who can't see that it's okay to not profit from something. *Thinks about his sledding hill that was destroyed by an upscale minimall.*

Re:Yes... PLEASE... (1)

Safety Cap (253500) | more than 9 years ago | (#9357741)

I had to turn off comments on my blog because all I was getting was spam.
The simple solution [godaddy.com] is to require the poster to read a distorted graphic of a random numeric value and enter the value into a field in order to submit his message.

Re:Yes... PLEASE... (0)

Anonymous Coward | more than 9 years ago | (#9357823)

YEA! Fuck the blind! They have no business on the Web anyway.

like porn (4, Interesting)

millahtime (710421) | more than 9 years ago | (#9357498)

This seems similar to the system all those porn sites used to get such a high rank in Google.

Kind of playing the system, with the content not being quite as desirable.

Naughty behaviour (1)

doodlelogic (773522) | more than 9 years ago | (#9357503)

Just a shame that Google is one of the few search engines that are any good.

I always use All the Web [alltheweb.com] when looking for any company or organisation I know the name of, but for more general queries I'm looking for a clean, fast, non-buggy alternative to the google giant. Preferably open source.

Any suggestions?

Re:Naughty behaviour (0)

Anonymous Coward | more than 9 years ago | (#9357617)

I always use All the Web when looking for any company or organisation I know the name of, but for more general queries I'm looking for a clean, fast, non-buggy alternative to the google giant. Preferably open source.

WTF does "preferably open source" mean for a search engine? Who cares if it's open source or closed source or something some guy hacked up in his basement out of an old shareware app he wrote? It's a search engine!

Re:Naughty behaviour (1)

Syzar (765581) | more than 9 years ago | (#9357672)

Nutch [nutch.org] aims to create an open source search engine, though they don't have anything yet.

Re:Naughty behaviour (0)

Anonymous Coward | more than 9 years ago | (#9357692)

I swore off Alltheweb after they started using Yahoo tracking links.

I have found favorable results with Wisenut [wisenut.com].

Re:Naughty behaviour (2, Informative)

Doesn't_Comment_Code (692510) | more than 9 years ago | (#9357727)

I'm looking for a clean, fast, non-buggy alternative to the google giant. Preferably open source.

Any suggestions?


The only big one I know of right now is Nutch. It is an open source search engine in the later stages of development, but it hasn't produced a large, usable site yet.

nutch.org [nutch.org]

Since it will be open source, you will be able to read the ranking algorithms and change/abuse them as you see fit.

This one http://search.mnogo.ru/ [mnogo.ru] is also available.

You know... (3, Insightful)

fizban (58094) | more than 9 years ago | (#9357505)

...what Google needs? A "Was this result helpful in your search?" button for each link returned, so that the search itself also influences page ranks. Maybe that will help get rid of this Google bombing mess.

Re:You know... (1, Insightful)

Anonymous Coward | more than 9 years ago | (#9357545)

Because obviously bots couldn't mess with that....

Re:You know... (1, Insightful)

Anonymous Coward | more than 9 years ago | (#9357552)

"...what Google needs? A "Was this result helpful in your search?" button for each link returned, so that the search itself also influences page ranks. Maybe that will help get rid of this Google bombing mess."

Except spambots can also work to make sure that the most helpful links are the ones linking to spam sites.

Re:You know... (4, Insightful)

Anonymous Coward | more than 9 years ago | (#9357565)

that button will also get spammed, as bots will click 'yes' for their sites and 'no' for the competitors sites

Re:You know... (3, Insightful)

goon america (536413) | more than 9 years ago | (#9357612)

Wouldn't that be equally abused?

Re:You know... (1)

gunnk (463227) | more than 9 years ago | (#9357712)

I'm guessing that you are asking:

"What's to keep Google-bombers from marking down the significance of real links in order to increase the rank of their links?"

One way to mitigate that is simply to let a given IP address mark a link as good or bad only once. The bomber would have to use a multitude of IP addresses to make any significant counter to the huge number of legitimate users who would be marking them down. It would be too labor-intensive, and therefore cost-prohibitive.
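A sketch of that once-per-IP rule, with an in-memory dict standing in for whatever persistent store a search engine would actually use:

```python
votes: dict = {}  # (ip, url) -> bool; hypothetical in-memory store

def record_vote(ip: str, url: str, helpful: bool) -> bool:
    """Accept only the first vote per (ip, url) pair; ignore repeats."""
    key = (ip, url)
    if key in votes:
        return False
    votes[key] = helpful
    return True
```

Shared proxies and NAT would make this unfair to some legitimate users, which is one reason a real deployment would need something subtler than a raw IP check.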

Re:You know... (-1)

Anonymous Coward | more than 9 years ago | (#9357798)

No, abuse reports are manually reviewed.

It just might work! (4, Funny)

mcmonkey (96054) | more than 9 years ago | (#9357749)

'You know what Google needs? A "Was this result helpful in your search?" button for each link returned'

Yes! Genius! That's it! Google needs some kind of system of rating results to modify future results returned--a system of 'mods' if you will.

Of course some people will 'mod' stuff down just because they don't like the viewpoint expressed, or they're in a perennial bad mood because their favorite operating system is dead, so we'll need to have a system of allowing people to rate the moderations--'meta-mod' if I may be so bold.

It sounds crazy, I know, but I think we could do this.

< jab jab > (2, Interesting)

jx100 (453615) | more than 9 years ago | (#9357512)

Well, it couldn't have been that successful, for he didn't win [searchguild.com].

Re: (0)

Anonymous Coward | more than 9 years ago | (#9357673)

yeah but the winner's wiki spamming is all over also:

http://www.google.com/search?hl=en&lr=&ie=UTF-8&c2coff=1&edition=us&q=merkey.net+wiki&btnG=Search

Some people ... (2, Insightful)

TheGavster (774657) | more than 9 years ago | (#9357514)

It still gets me how the people who are participating in the nigritude ultramarine thing don't see anything wrong with what they're doing. This line particularly got me:
"Without, as opposed to guestbook spamming, being evil it's a sandbox after all."

Yes, it's a sandbox; no, it's not your personal playground.

google works (3, Informative)

mwheeler01 (625017) | more than 9 years ago | (#9357518)

Google does tweak its ranking system on a regular basis. When a problem becomes evident (and it looks like this one just has), they do something about it... that's why they're Google.

Whose fault is that? (4, Insightful)

lukewarmfusion (726141) | more than 9 years ago | (#9357523)

Google's algorithm isn't the problem. The problem is the availability of easily abused areas such as these "sandboxes."

Some search engines accept any old site. Others accept sites based on human approval and categorization. Google is a nice combination of the two: by using outside references (counting how often a site is linked to), it assumes the site is more relevant because other people have put links to it on their sites. That's a human factor, without directly using human beings to review and categorize the sites and rankings.

Sure it can be abused, but it's not Google's fault; perhaps these areas of abuse (blogs, wikis, etc.) should address the problems from their end.

Re:Whose fault is that? (1)

Chanc_Gorkon (94133) | more than 9 years ago | (#9357662)

Yes, it is. When, with fewer than a million links, "miserable failure" searches on Google return President Bush's biography on the White House web site, that's a problem (leave your political views out of this). Same goes for "weapons of mass destruction" and other Google bombs. Google: fix it now, before it gets to be a real problem.

ROBOTS.TXT (4, Insightful)

gtrubetskoy (734033) | more than 9 years ago | (#9357524)

The burden is not on Google but on wiki sandbox admins, who should provide proper robots.txt files to inform Google that this content should not be indexed.

As a side note, I think that with the recent wiki abuse, the issue of open wikis will become similar to that of open proxies and mail relays.

Re:ROBOTS.TXT (1)

sylvester (98418) | more than 9 years ago | (#9357769)

wtf. That's not insightful.

First of all, while my wiki is mostly personal junk, there's no reason it shouldn't be indexed. And many open source projects use Wikis as a primary source of documentation.

Secondly, the cat is out of the bag; I doubt these spammers are checking whether the sandboxes are indexed by Google.

I'm mostly pissed off that the edits to my sandbox have been only from nigritude ultramarine [slashdot.org] people. Frankly, I think google should stomp on that contest by not allowing the words to be searched for together.

Same site, a few days later: Don't do it. (2, Insightful)

micha2305 (769447) | more than 9 years ago | (#9357527)

Ok, but the same webmaster says [outer-court.com]:

I decided to stop posting backlinks in Wiki sandboxes, the SEO strategy previously explained. [...] In the meantime I'm asking developers and those hosting Wikis of their own to please exclude sandboxes from search engine results (via the robots.txt file). Doing so would shield the sandbox from backlink-postings, and there is no need for it to turn up in search results in the first place.

This sure makes sense, and who knows, maybe future wiki distributions will do it by default. (If only

<meta name="robots" content="noindex">

worked universally...)

Complacency (5, Interesting)

faust2097 (137829) | more than 9 years ago | (#9357534)

Isn't it time for Google finally to put some work into refining their results to exclude tricks like this?

It was time to do that at least a year ago. It's pretty much impossible to find good information on any popular consumer product and this is a problem that's been around for a long time.

But they're too busy making an email application with 9 frames and 200k of Javascript to pay attention to the reason people use them in the first place. It's a little disappointing; I'm an AltaVista alumnus, and I got to watch them forget about search, do a bunch of useless crap instead, and then die. I was hoping Google would be different.

Re:Complacency (1)

koreth (409849) | more than 9 years ago | (#9357791)

But they're too busy making an email application with 9 frames and 200k of Javascript

Because, of course, if they weren't doing that, every last one of the engineers on that project would be tinkering with the search engine instead. It's not like they have separate engineering teams or people with different areas of expertise there or anything.

Well, it's about time this gets some attention (4, Insightful)

digitalgimpus (468277) | more than 9 years ago | (#9357536)

I've noticed that my blog's getting lots of spam from sites that don't seem like typical spam sites....

From what I can see, it looks like those "search ranking professionals" who "guarantee to raise your google rank in 30 days" are using blog spamming, and perhaps wiki spamming, as a way to increase their clients' ratings.

It's not about meta tags, or submitting anymore... it's spamming.

Perhaps it's time for people to finally be wary of these services. After all, can a third party really guarantee a position in another company's search index?

IMHO those services are pure evil. They either do nothing, or they do something to increase page rank... what is that "something"? How many options do they have?

If they are going to use my blog... why can't I get a cut in that business?

Re:Well, it's about time this gets some attention (4, Insightful)

Lurker McLurker (730170) | more than 9 years ago | (#9357610)

IMHO those services are pure evil.
No, 9/11 was pure evil; some unwanted comments on a blog are an annoyance. If you have a website that allows anyone to post comments, you will get some you don't like. That's life.

Re:Well, it's about time this gets some attention (1)

sabernet (751826) | more than 9 years ago | (#9357732)

Semantic nonsense for the sake of making yourself feel smarter is also annoying. Get a life.

9/11 was genius ! (0)

Anonymous Coward | more than 9 years ago | (#9357776)


Landmines that the USA sells to poor nations are evil

and 3000 people die a month on American roads, but I don't see people burning down GMC or Ford

Re:Well, it's about time this gets some attention (1)

sabernet (751826) | more than 9 years ago | (#9357661)

here's an idea. link to some of those services and let the slashdotting begin:)

Re:Well, it's about time this gets some attention (1)

jtwronski (465067) | more than 9 years ago | (#9357703)

I agree completely. I can't count how many times my customers have asked me, "What about those companies that guarantee first-page rankings? What are they doing that you aren't?" It's hard to compete (honestly, anyway) with folks who have sold their souls and annoyed countless thousands by taking unfair advantage of the features that have made Google the #1 search engine out there. Link trades, registration, and smart content and meta tags apply less and less to rankings nowadays. At least I can console myself (and hopefully my customers) with the fact that I can offer rankings without being annoying or stepping on others' toes.

Re:Well, it's about time this gets some attention (1)

87C751 (205250) | more than 9 years ago | (#9357761)

I've noticed that my blog's getting lots of spam from sites that don't seem like typical spam sites....
I had a spate of comment spamming too, about a month ago. In fact, that was what inspired me to move from blogware (WordPress) to a full-up CMS (PostNuke). The comment spammers' scripts don't seem to have found PostNuke yet. By the time they do, I'll have anti-bot measures in place (if I haven't simply closed comments to unregistered users).

Re:Well, it's about time this gets some attention (0)

Anonymous Coward | more than 9 years ago | (#9357829)

After all, can a third party really guarantee a position in another company's search index?

Here's what Google has to say on the subject: [google.com]

Beware of SEO's that claim to guarantee rankings, or that claim a "special relationship" with Google, or that claim to have a "priority submit" to Google.

This happened to me (4, Interesting)

JohnGrahamCumming (684871) | more than 9 years ago | (#9357539)

This happened on the POPFile Wiki [sourceforge.net]. Eventually I solved it by changing the code of the Wiki itself to use an allowed list of URLs (actually a set of regexps). If someone adds a page that uses a new URL that isn't covered, it won't show up when the page is displayed, and the user has to email me to get that specific URL added.

It's a bit of an administrative burden, but it stopped people messing up our Wiki with irrelevant links to some site in China.

John.
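A minimal sketch of this allowed-list approach, assuming a Python wiki renderer; the domains in the patterns below are placeholders, not the actual POPFile list:

```python
import re

# Hypothetical allowed-URL patterns; a real wiki would load these from config.
ALLOWED_URL_PATTERNS = [
    re.compile(r"^https?://([a-z0-9-]+\.)?sourceforge\.net/"),
    re.compile(r"^https?://getpopfile\.example/"),
]

URL_RE = re.compile(r"https?://\S+")

def filter_links(page_text):
    """Drop any URL not matching the allowed list when rendering a page."""
    def check(match):
        url = match.group(0)
        if any(p.match(url) for p in ALLOWED_URL_PATTERNS):
            return url           # allowed: keep the link as-is
        return "[link removed]"  # unknown: admin must add the URL to the list
    return URL_RE.sub(check, page_text)
```

Because unknown URLs are blanked at render time, spam links never gain a crawlable href, which removes the page-rank incentive entirely.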

Re:This happened to me (-1)

Anonymous Coward | more than 9 years ago | (#9357804)

off topic

Thanks for Popfile, my family can at last deal with their email a bit easier

keep up the good work

Steven.

E2 (1)

mirko (198274) | more than 9 years ago | (#9357541)

I get the impression that this could work with E2 [everything2.com], as well as with most BBCode-powered fora.

Re:E2 (1)

proj_2501 (78149) | more than 9 years ago | (#9357716)

E2 doesn't have external links except those posted by gods, and they also have a vicious team of editors just waiting to pounce on things like this.

There's already a solution (-1, Redundant)

Aim Here (765712) | more than 9 years ago | (#9357542)

Since it apparently annoys the Wiki owners, why don't they just stop the Google bot from indexing the sandbox pages using 'robots.txt'? Just a thought...
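For instance, assuming the sandbox lives at /wiki/SandBox (the path varies per wiki engine), a one-stanza robots.txt would keep compliant crawlers out:

```text
User-agent: *
Disallow: /wiki/SandBox
```

This stops the page being indexed, but not the edits themselves; spammers' scripts rarely check whether a page is crawlable, so the links may keep arriving even once they're worthless.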

I've seen this (3, Informative)

goon america (536413) | more than 9 years ago | (#9357559)

I just reverted some pages on my watch list on Wikipedia that had been edited by a Google spam bot to link all sorts of words back to its mother site. There were lots of mistakes; it looked like the script they were using hadn't been tested very well yet. (I would post an example, but Wikipedia is completely fuxx0red at the moment.)

This may become a big problem for sites like this. The only solution might be one of those annoying "write down the letters in this generated gif" humanity tests.
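A sketch of the server side of such a humanity test, in Python. Rendering the letters into an actual gif needs an imaging library and is omitted; the stateless signed-token design shown here is just one option, and a real deployment would add an expiry or nonce to prevent replaying a solved challenge.

```python
import hashlib
import hmac
import secrets
import string

SECRET_KEY = b"change-me"  # hypothetical server-side secret

def new_challenge(length=6):
    """Pick random letters to render into the gif (rendering not shown)."""
    text = "".join(secrets.choice(string.ascii_uppercase) for _ in range(length))
    # Sign the answer so the server doesn't have to store per-user state.
    token = hmac.new(SECRET_KEY, text.encode(), hashlib.sha256).hexdigest()
    return text, token

def check_answer(answer, token):
    """Verify the user's typed answer against the signed token."""
    expected = hmac.new(SECRET_KEY, answer.upper().encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```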

Webmasters Pounce On Wiki Sandboxes? (-1, Troll)

Pan T. Hose (707794) | more than 9 years ago | (#9357562)

Am I the only one who thinks that with such headlines it is not surprising that we have no lives?

"Hey, baby! Did you hear that webmasters pounce on wiki sandboxes?"
"OMG! WTF?"

Sad but true.

apache + search + p2p = distributed search engine (2, Insightful)

datrus (265707) | more than 9 years ago | (#9357563)

Something that would make a nice opensource project would be to include p2p search functionality in apache itself.
This way all the modified web servers would form a giant distributed search engine.
Some nice algorithms like koorde or kademlia could be used.
Anyone thought about starting something like this?

David
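A toy sketch of the core idea in Python: hash each keyword into the same ID space as the nodes, Chord/Kademlia style, so every server can independently agree on who indexes what. The node names are hypothetical, and real DHTs add routing tables, replication, and churn handling on top of this.

```python
import hashlib
from bisect import bisect_right

def key_id(s):
    """160-bit identifier, as in Chord/Kademlia-style DHTs."""
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

class ToyDHT:
    """Toy keyword index: each word maps to the node whose ID follows its hash
    on the ring. This is only the placement rule, not a full DHT."""

    def __init__(self, node_names):
        self.ring = sorted((key_id(n), n) for n in node_names)

    def node_for(self, word):
        ids = [i for i, _ in self.ring]
        pos = bisect_right(ids, key_id(word)) % len(self.ring)
        return self.ring[pos][1]
```

Every participating server hashing "apache" gets the same answer, so queries can be routed without any central index.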

Re:apache + search + p2p = distributed search engi (1)

Bert690 (540293) | more than 9 years ago | (#9357753)

Something that would make a nice opensource project would be to include p2p search functionality in apache itself. This way all the modified web servers would make a giant distributed search engine. Some nice algorithms like koorde or kademlia could be used. Anyone thought about starting something like this?

We looked into something a lot like what you suggest [ibm.com] (and actually have it up and running inside our intranet with 2k or so users). The problem with doing this on the internet is that p2p techniques are MUCH more susceptible to spamming than centralized techniques in general (because, for one, p2p reputation systems are very difficult to get right). Another problem is that most existing p2p search methods work great for finding popular content, but not very well for finding that very specific piece of information that maybe only you are looking for at the current moment. Kademlia/Chord are DHTs and do not solve the text search problem on their own. While some p2p networks have adapted DHTs for keyword searching, the results still leave a lot to be desired (IMO).

Google. (3, Interesting)

Rick and Roll (672077) | more than 9 years ago | (#9357576)

When I search on Google, half the time I am looking for one of the best sites in a category, like perhaps "OpenGL programming". Other times, however, I am looking for something very specific that may only be referenced about twenty times, if at all.

When I do search in the first category, especially for things such as wallpaper or Simpsons audio clips, the sites that usually turn up are the least coherent ones with dozens of ads. I usually have to dig through four or five pages to find a relevant one.

The people with these sites are playing hardball. Google wants them on their side, though, because they often display Google text ads.

Right now, my domain of choice is owned by a squatter that says "here are the results for your search" with a bunch of Google text ads. I was going to (and may still) put a very interesting site there, and the name was a key part of it.

I firmly believe that advertisements are the plague of the Internet. I would like to see sites selling their own products to fund themselves. Google doesn't really help in this regard. The text ads are less annoying than banner ads, but only slightly less annoying.

Don't get me wrong, I like Google. It's an invaluable tool when I'm doing research. I would just like to see them come out in full force against squatters.

Tomorrow today yesterday (4, Insightful)

boa13 (548222) | more than 9 years ago | (#9357579)

But webmasters may soon deluge these handy tools with links back to their site, not to get clicks, but to increase Google page rank.

The Arch Wiki [gnuarch.org] has suffered several times from such vandals in the past few months. I'm sure other wikis have, too. They create links over single spaces or dots, so that casual readers don't notice them. Attentively watching the RecentChanges page is the most effective way to find and fight them, but this is tiresome. I guess many wikis will soon require posters to be authenticated, which is a blow to the wiki ideal, but not such a major blow. Alternatively, maybe someone will develop heuristics to fight the most common abuses (e.g. an external link over a single space).

So, this is not new, but this is now news.
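A rough heuristic for the "link over a single space or dot" trick described above, in Python. The bracket syntax is modeled on common wiki markup and would need adapting per engine:

```python
import re

# Links whose visible text is empty, a single space, or a lone dot:
# in many wiki syntaxes these render as an almost-invisible clickable.
HIDDEN_LINK_RE = re.compile(
    r"\[(https?://\S+)\s+[. ]?\]"      # wiki-style [url text] with blank/dot text
    r"|<a\s[^>]*>\s*\.?\s*</a>",       # HTML anchor with blank/dot body
    re.IGNORECASE)

def looks_like_hidden_link_spam(text):
    """True if the posted text contains a link with no real visible label."""
    return bool(HIDDEN_LINK_RE.search(text))
```

A wiki could hold such edits for moderation rather than rejecting them outright, since a legitimate editor occasionally writes a terse link too.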

Re:Tomorrow today yesterday (1)

Neophytus (642863) | more than 9 years ago | (#9357818)

One to look out for is <div style="display:none;"> if HTML can be posted. It makes the content invisible to any human reader, but I doubt that any current search engine can identify the purpose of such a tag.
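A simple filter for that case, in Python. It only catches inline style attributes; a real deployment would whitelist allowed tags and attributes rather than blacklist styles:

```python
import re

# Inline styles that hide text from readers while leaving it for crawlers.
HIDDEN_STYLE_RE = re.compile(
    r"style\s*=\s*(['\"])[^'\"]*display\s*:\s*none[^'\"]*\1",
    re.IGNORECASE)

def has_hidden_style(html):
    """True if posted HTML hides content via an inline display:none."""
    return bool(HIDDEN_STYLE_RE.search(html))
```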

exclude the sandboxes (-1, Redundant)

orb_fan (677056) | more than 9 years ago | (#9357583)

Maybe webmasters of Wiki sites should exclude the sandbox from search engines in their robots.txt files; after all, the whole purpose of the sandbox is an unmoderated test area.

Alternatively, only count links in page rankings that have a reciprocal link back.

Not a big deal (4, Informative)

arvindn (542080) | more than 9 years ago | (#9357589)

Recently the Chinese Wikipedia suffered a spam attack, with a distributed network of bots editing articles to add links to some Chinese internet marketing site. In response, the latest version of MediaWiki (the software that runs the Wikipedias and sister projects) has a feature to block edits matching a regex (so you can prevent links to a specific domain). Wikis generally have more protection against spamming than weblogs, so I wouldn't worry.
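In spirit, that MediaWiki feature works like the Python sketch below (the real implementation is PHP inside MediaWiki, and the domain here is a placeholder):

```python
import re

# Hypothetical spam-domain patterns, in the spirit of MediaWiki's
# regex-based edit filter; admins would maintain this list over time.
BLOCKED_PATTERNS = [
    re.compile(r"https?://([a-z0-9-]+\.)*spam-marketing\.example", re.IGNORECASE),
]

def edit_allowed(new_text):
    """Reject any edit whose new text links to a blocked domain."""
    return not any(p.search(new_text) for p in BLOCKED_PATTERNS)
```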

any one else finding msn.com unresolvable? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#9357606)

any one else finding msn.com unresolvable?

Re:any one else finding msn.com unresolvable? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#9357627)

;; ANSWER SECTION:
msn.com. 78 IN A 207.68.172.246

Hmm (3, Interesting)

Julian Morrison (5575) | more than 9 years ago | (#9357628)

Leave the links, edit the text to read something like "worthless scumbag, scamming git, googlebomb, please die, low quality, boring" - and lock the page.

This is a concern for the Google Gorilla? (2, Interesting)

Mr.Fork (633378) | more than 9 years ago | (#9357635)

Wait a minute - a way to spoof Google to get your page ranked better through WiKi? OMFG! Call the internet police, call Dr. Eric E. Schmidt, call out the Google Gorilla goons! I'm sure the good Dr. has a fix like the ones he used at Novell...

The problem with the whole Google model is that it's biased to begin with. If I'm looking for Granny Smith apples, chances are some internet chimp has bought the top space from Google's goons with bananas. It becomes obvious when you see a chimp site near the top that has no business being there. To the experienced googler, it's just an annoying fly on the screen; you just move further down.

I'm hoping that Google doesn't get too bogged down in becoming a big Ape like Micro$oft, and is a little more proactive in protecting their business property. It's bad enough that they're selling top space to companies willing to pay, but here's hoping they don't slip on their own banana peels.

True (4, Funny)

Pan T. Hose (707794) | more than 9 years ago | (#9357644)

"Isn't it time for Google finally to put some work into refining their results to exclude tricks like this?"

I agree. I hope Google will finally put some work into refining their search results. I mean, they are probably the worst search engine ever! Now, Yahoo, MSN, Overture, Altavista... Those are much better. But Google?! Please...

Here's the solution... (-1)

Anonymous Coward | more than 9 years ago | (#9357704)

/. the sandboxes ;)

Who cares, web search is "done" (0)

Ars-Fartsica (166957) | more than 9 years ago | (#9357721)

If I were Google, the last place I would be putting more funding into is web search. Any algorithm will be spoofed, so the search nerds will never be satisfied long term. Average users, though, seem quite enamored with it, so the ROI for a new algorithm isn't clear.

Compare this to the ROI for music search, non-web search, etc., and it's pretty clear Google's R&D is better directed to new products. Get used to it, folks: when they go public there will be a huge expectation of new products on a regular basis. Web search will get tuned when ad keyword revenue dictates it.

server-supplied meta-info to reduce search weight? (1)

osmethnee (717516) | more than 9 years ago | (#9357722)

A possible solution I've been toying with...

1. Servers provide a meta-tag for certain pages which search engines interpret as reducing or eliminating that specific page's search weight.
2. Scripts which allow user-created content (wikis, guestbooks, weblog comment forms, forums, and so on) can be updated by the content provider to include this meta-tag.
3. To encourage spammers to check this tag and move on elsewhere if it's implemented, these same scripts should enforce a longish (5 second?) delay for all user-initiated content changes.

[and seeing this is slashdot]

4. ???
5. Profit!
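Part of step 1 already exists as the standard robots meta tag, which a wiki or guestbook script could emit on every page holding user-posted content:

```html
<!-- emitted by the script on pages that hold user-posted content -->
<meta name="robots" content="noindex, nofollow">
```

Unlike the proposed weight-reduction tag, noindex drops the page from the index entirely; a softer "count this page less" signal would indeed need new search-engine support, as the parent suggests.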

omg (0)

Anonymous Coward | more than 9 years ago | (#9357755)

This is a really sad day for news. Get over it and quit crying.

Google may well downrate this (1)

Animats (122034) | more than 9 years ago | (#9357763)

I expect that Google will in time give drastically lower weight to easily-modified pages like "blogs" and "wikis". They're not that hard to recognize.

Why are people so surprised? (0)

stubear (130454) | more than 9 years ago | (#9357780)

The web was designed around the concept of trust, and this simply does not work anymore. The only way to fix the internet is to eliminate all forms of anonymity and temper this with strong legal protection of private information. Until this happens you will always have to deal with spam, viruses, hackers and the like. Once people can be held accountable for their actions online, then and only then will the internet work as it was intended.

Sandbox persistence (2, Insightful)

gmuslera (3436) | more than 9 years ago | (#9357789)

If it's a test area, does it need to be stored at all? Wikis could just keep it live for the user's current session or test, and when the user logs out or finishes editing, simply delete it or restore it to a default introductory text. It doesn't need to be some kind of collaborative blackboard or graffiti wall; or at least, if it must be, that should be the webmaster's choice (TikiWiki [tikiwiki.org], for one, lets me disable the sandbox if I want).

But if the problem is having areas on websites where visitors (even unregistered ones) can post arbitrary text and links, then even Slashdot is potentially a target (maybe there should be a "Spam" mod score?), as is any site where unregistered visitors can store content one way or another, wiki or not.

"Finally"?? (4, Interesting)

jdavidb (449077) | more than 9 years ago | (#9357802)

Isn't it time for Google finally to put some work into refining their results to exclude tricks like this?

I take extreme issue with that statement, and I'm surprised no one else has challenged it. Google does in fact put quite a bit of work into making itself less vulnerable to these kinds of stunts. They even have a link on every results page where you can tell them if you got results you didn't expect, so they can hunt down the cause and refine their algorithm.

The system will never be perfect, and this is the latest issue that has not (yet) been dealt with. Quit your griping.

Why doesn't google (1)

hackstraw (262471) | more than 9 years ago | (#9357812)

simply make a distinction between "I am looking to buy something" searches and "I am looking for information about something" searches?

They are clearly different kinds of searches, and I do both of them, yet I get the same results for both. Froogle is the exception; it's definitely a step in the right direction, but not quite there.

The interface has gotten a little better on AltaVista (remember them??), but a search like used condoms [altavista.com] makes no sense for retail stores at all. I'm sorry guys, there isn't a market for used condoms, but if there were I'm sure someone would be more than willing to supply the demand.

The Google search for used condoms [google.com] is a little better, but the advertising links on the right-hand side do include:

Used Anything -Dirt Cheap
at Gov't & Police Auctions Near You
Seized, Surplus Property. Hot Deals
www.GovernmentAuctions.org

And please do not take a tangent on "used condoms"; it's just a sick but memorable example.