
Content Owners to Charge Royalties for Searching?

CowboyNeal posted more than 7 years ago | from the pieces-of-the-pie dept.


dwarfking writes in with a story that follows up on the impact of recent Google events: "OK, maybe I'm a little dense here, but isn't this plan more of an impact on the content provider than on the search engines? From the article: 'In one example of how ACAP would work, a newspaper publisher could grant search engines permission to index its site, but specify that only select ones display articles for a limited time after paying a royalty.' So, OK, a search engine company decides it doesn't want to pay royalties and therefore doesn't index the provider's site. Now won't the provider actually lose readers, since their articles won't be locatable by search anymore?"


203 comments


Dumb (5, Insightful)

daspriest (904701) | more than 7 years ago | (#16167543)

Sounds like one of the dumbest ideas I have heard; this goes right alongside the MPAA and RIAA shenanigans.

Re:Dumb (0)

Anonymous Coward | more than 7 years ago | (#16167587)

yep. if they don't want me to read them, i won't.

too many things to read anyway.

Is it, though? (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16167675)

But is it really "dumb"? I don't have any doubt that these media outfits have very talented economists, financialists, and lawyers working for them. These are people who can accurately predict what will happen if the media companies were to take this course of action. They know how consumers will respond, and they know how it will affect their company's bottom line.

So while it may seem "dumb" to you, I think that many man-years of analysis have gone into this situation, performed by very bright individuals with a variety of backgrounds. And collectively, they have realized that this may very well be the most effective and profitable course of action for their company to take.

Re:Is it, though? (-1, Redundant)

Anonymous Coward | more than 7 years ago | (#16167795)

You must be new here.

Re:Is it, though? (4, Insightful)

tomhudson (43916) | more than 7 years ago | (#16168109)

"these media outfits have very talented economists, financialists, and lawyers working for them. These are people who can accurately predict what will happen if the media companies were to take this course of action. They know how consumers will respond, and they know how it will affect their company's bottom line."

I hope you're just trolling. Most analysts in any field aren't worth shit when it comes to making predictions. That's why there are so many product failures every year, and why for every winner in the stock market, there's at least one loser.

Media people are among the most clueless. Take a look at how many movies bomb, at how many magazines die every year, how many TV shows don't go beyond the first season, how many newspapers are having to cope with declining readership:

http://www.naa.org/marketscope/pdfs/Sunday_National_Top50_1998-2005.pdf [naa.org] Sorry, it's one of those darned pdfs.

Sample stats:

1998 - 135,000,000 adult population, 92,000,000 readers
2005 - 150,000,000 adult population, 89,000,000 readers.

So, while the potential market has grown more than 10%, their readership has declined roughly 3%.
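A quick sanity check on those percentages, using the NAA figures quoted above (this is just the arithmetic, nothing from the PDF beyond the four numbers):

```python
# Readership vs. adult population, per the NAA figures quoted above.
adults_1998, readers_1998 = 135_000_000, 92_000_000
adults_2005, readers_2005 = 150_000_000, 89_000_000

market_growth = (adults_2005 - adults_1998) / adults_1998
readership_change = (readers_2005 - readers_1998) / readers_1998

print(f"{market_growth:+.1%}")      # +11.1%
print(f"{readership_change:+.1%}")  # -3.3%
```

So the market grew about 11% while readership slipped about 3%, which only strengthens the parent's point about a shrinking share.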

Huh?? (4, Insightful)

www.sorehands.com (142825) | more than 7 years ago | (#16168127)

Many companies with supposed "very talented economists, financialists, and lawyers working for them" have done things that failed. Look at Sony and their rootkits. In 1986, many "very talented economists, financialists, and lawyers" commented that buying a PC-software-only company, Microsoft, would be a very bad investment. Many people said the same thing about a company that sells overpriced coffee -- Starbucks. A very talented manager at HP ridiculed Steve Wozniak when he designed a personal computer.

Re:Is it, though? (1)

Trailwalker (648636) | more than 7 years ago | (#16168143)

These are people who can accurately predict what will happen if the media companies were to take this course of action

Same fellows who predicted the precipitous decline of the daily newspaper?

With advice like that, they can stop the presses now.

Re:Is it, though? (0)

Anonymous Coward | more than 7 years ago | (#16168275)

Holy shit, I've seen people pull conjectures from their asses before but this takes the cake. I work for a media conglomerate and they have even less of a clue than you. 1) we have no economists (economists?) 2) 'financialists' concentrate on cutting costs, though we call them accounting 3) lawyers.... wtf do lawyers have to do with this? Your post would be no less nonsensical if it claimed our team of media astronauts and neurosurgeons thought this a brilliant plan.

Re:Is it, though? (3, Interesting)

Mr. Underbridge (666784) | more than 7 years ago | (#16168277)

But is it really "dumb"? I don't have any doubt that these media outfits have very talented economists, financialists, and lawyers working for them. These are people who can accurately predict what will happen if the media companies were to take this course of action. They know how consumers will respond, and they know how it will affect their company's bottom line.

The question, though, is whether those smart people are actually allowed to make the final decision. This is the newspaper industry we're talking about. The same industry that is *still* making you create an account and sign in so it can track what you read. This pisses off a lot of people (like me), who end up not coming back to the site. Not only that, 90% of that information could be gathered by simply logging IPs. So they turn away advertising revenue because of a completely antiquated practice.

So no, it's not surprising that companies like that would fail to figure out that search engines are FREE FUCKING ADVERTISING.

Go check out some of the recent articles on Techdirt; this is one of that author's favorite pet peeves. The upshot is that the newspaper industry has its collective head up its ass and completely fails to understand this whole internet thing. The recent developments in Belgium and France, where newspapers have sued Google to avoid being cached, demonstrate this principle in action.

If I'm Google, I tell them: go ahead, it's your funeral.

Re:Is it, though? (1)

edumacator (910819) | more than 7 years ago | (#16168387)

I don't have any doubt that these media outfits have very talented economists, financialists, and lawyers working for them.

I remember when Coca-Cola changed their formula too. That worked out really well...

Re:Dumb (4, Insightful)

Alef (605149) | more than 7 years ago | (#16168145)

Sounds like one of the dumbest ideas I have heard; this goes right alongside the MPAA and RIAA shenanigans.

What makes it extra dumb is the fact that it basically is an inverse of Google's targeted ads, if I'm getting this straight. Site owners already pay Google to have their link shown when people search for related material. And now, apparently, some of them expect Google to instead pay them for the exact same thing? Really, really dumb...

Re:Dumb (4, Informative)

Deadstick (535032) | more than 7 years ago | (#16168163)

In fact, it's right up there with Radio Shack's policy back in the TRS-80 days.

They claimed the exclusive right to control mention of their computer in print. If you published a BASIC program to run on it, or an article about how to use it, their lawyer would show up demanding that you pay royalties or desist. Magazines resorted to talking about "S-80 Bus" computers, which was sufficiently generic.

They got their wish, of course: you can read all the computer magazines you want without seeing anything about Radio Shack computers.

rj

What's wrong with that? (1)

neonprimetime (528653) | more than 7 years ago | (#16167581)

Many publishers feel, however, that the search engines are becoming publishers themselves by aggregating, sometimes caching and occasionally creating their own content.

Re:What's wrong with that? (3, Interesting)

Ruff_ilb (769396) | more than 7 years ago | (#16167725)

The problem is that the search engines aren't TRYING to be publishers. The entire point of the search engines is to direct you to the content that you want. Aggregation, caching, and content creation are means to further this end. On the other hand, creation and caching of content is the whole point of the publishers. That IS their end.

Saying that search engines are becoming publishers because they create, aggregate, and cache content to help users FIND content from publishers seems to be just a little off the mark.

Re:What's wrong with that? (0)

Anonymous Coward | more than 7 years ago | (#16168287)

There are two sides to that coin. Searching, indexing and excerpting are fine, and I can't imagine any publisher would object to that. The problematic aspects of search engine behaviour are aggregation and caching, because these actions don't drive visitors to the publishers' sites. On the contrary, aggregation and caching are added value that the publisher cannot compete with. Even though the name of the publisher is made more visible, the users don't actually need to visit his site anymore.

If your business model is to sell a print publication and your online presence is merely advertising for your actual product, then you probably don't object to the increased visibility, even though it draws visitors away from your site. As long as people see your name, you're going to be fine. But if your business model relies on attracting visitors to your site, for example because you get paid for online ads on your site, then aggregation and caching diminish your revenue potential.

A good first step would be to strictly follow meta information which instructs robots not to cache pages. If the publisher doesn't want his content cached or aggregated, don't do it. IMHO "caching" websites and republishing them under a different URL is a straightforward copyright violation, and so is aggregation if you use more than a short excerpt from the original text. A cache is a temporary unmodified copy which is delivered in place of the original when the original is requested. A copy of a webpage under a different URL is a copyright violation, not a cache.

Re:What's wrong with that? (2, Interesting)

pacalis (970205) | more than 7 years ago | (#16168297)

Search engines are distributors of information content, just like publishers. The 'entire point of search engines' is not to help 'the customer' find the content, any more than the 'entire point of publishers' is to direct customers to the content they want. Really, the 'entire point' of companies is to profit for their shareholders. The profit models are slightly different, but by and large, given that they both profit primarily from ad content, their interests look more similar than different.

News publishers reduce consumer search costs by aggregating content, basically using evolutionary improvements on a hundred-year-old business model/technology. Search engines have a more targeted, more revolutionary, presently relevant model/technology. Isn't it obvious that these are competitors?

 

Re:What's wrong with that? (2, Insightful)

arminw (717974) | more than 7 years ago | (#16168597)

.....Isn't it obvious that these are competitors....

If they are competitors, then search engines such as Google should just de-list that publisher from ALL their searches until further notice from that publisher that they want to be listed after all. If said publisher notices a precipitous drop in their page views, they WILL come crawling back on their hands and knees to be reinstated.

Re:What's wrong with that? (2, Insightful)

Frosty Piss (770223) | more than 7 years ago | (#16168319)

The problem is that the search engines aren't TRYING to be publishers.

I think that search engines such as Google really straddle the line with how they present their search results, presenting content almost like RSS aggregators.

I think it's easy to make a good argument that they want to provide the same type of service but with the added value of more sources, so as to attract eyeballs. Google's not in it for humanity, you know.

Idiotic (0)

Anonymous Coward | more than 7 years ago | (#16167599)

This is on par with charging money for getting lyrics online. Greed never ceases to amaze.

Re:Idiotic (2, Insightful)

kfg (145172) | more than 7 years ago | (#16167941)

This is on par with charging money for getting lyrics online. Greed never ceases to amaze.

No more nor less stupid than charging money for any other form of recorded music/literature online. A recording artist makes money by selling his sound recordings, a lyricist makes money by selling his text recordings.

If you purchase a sound recording or sheet to learn the lyrics, the lyricist gets paid. If you download a free sound recording or sheet, the lyricist does not.

So the question, as always, devolves back to the root, the inherent validity of the copyright concept: whether or not you think that lyricists should get paid for the use of their works.

At the moment I solve the issue by forcing you to come to enough of my live performances to memorize my work (at least those that I haven't "given away" by posting on Slashdot), or you could book a lesson, but not everyone sees it that way.

KFG

Robots? (5, Insightful)

TechyImmigrant (175943) | more than 7 years ago | (#16167601)

So, using the courts, they have failed to get royalties and merely achieved what they could have achieved with some robots.txt files.
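For the record, those robots.txt files are a couple of lines of boilerplate each; a minimal example (paths and the site name are invented for illustration):

```text
# http://example.com/robots.txt
# Keep all compliant crawlers out of the paid archive:
User-agent: *
Disallow: /archive/

# Or shut one named crawler out of the whole site:
User-agent: Googlebot
Disallow: /
```

A compliant crawler obeys only the most specific group matching its user-agent, so with the file above Googlebot would skip the entire site while every other robot skips only /archive/.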

Re:Robots? (3, Insightful)

Anonymous Coward | more than 7 years ago | (#16167989)

They just want more money for providing the same service. It is just like the big telecom companies trying to charge twice for data going over their network (the anti-net-neutrality nonsense). What can "publishers" do to prevent indexing? robots.txt files, of course. But if they somehow feel these simple files telling others not to index their content do not do enough, they can always turn off public access to their content and go to a subscription-only model with terms of service prohibiting indexing.

Nobody is making the publishers make their content available publicly. But so long as publishers do so, others can remember their content, index it, and tell me about it.

Re:Robots? (1)

Tim C (15259) | more than 7 years ago | (#16168261)

What can "publishers" do to prevent indexing? robots.txt files, of course. But if they somehow feel these simple files telling others not to index their content do not do enough

To be fair, robots.txt is only a request; search engines can ignore it (and I believe that some do).
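Exactly: the protocol is purely advisory, and a crawler honors it only if its author bothers to check. A sketch with Python's standard-library robotparser (the rules and URLs here are made up):

```python
from urllib import robotparser

# A polite crawler parses the site's robots.txt and consults it
# before every fetch; nothing in HTTP enforces this.
rules = [
    "User-agent: *",
    "Disallow: /archive/",
]
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("MyBot", "http://example.com/archive/story.html"))  # False
print(rp.can_fetch("MyBot", "http://example.com/index.html"))          # True
```

An impolite crawler simply never calls can_fetch, and the server is none the wiser until it notices the access pattern.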

Re:Robots? (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16168361)

Search engines can get their asses banned if they ignore robots.txt. On top of that, they can get themselves into legal trouble when ignoring it causes them to buy 100 airplanes from the webshop that was off limits to robots. The web is a dangerous place for machines which don't understand what they're doing. They should take all the help they can get, especially the kind which was specifically created for them. Last but not least, obeying robots.txt is just good manners. If you cast common courtesy aside, don't expect me to treat you with respect either.

So's a court order. (1)

Kadin2048 (468275) | more than 7 years ago | (#16168471)

They certainly can ignore robots.txt, but I don't think that any major search engine does; Google obeys both no-spider and no-cache restrictions, as does the Internet Archive.

When you get right down to it, a court order is basically a request too, it just has some more weight behind it, if you happen to live in that court's jurisdiction.

I think that a robots.txt file would be something like a "No Trespassing" sign; if you had one, and then you were cached or spidered and went to court, it would give you a big advantage because it would show that the search engine / database willfully ignored the standard request not to be included in their system.

Anyway, you're right that any crawler can ignore robots.txt; you can test this yourself with wget, which honors robots.txt by default but has an option to ignore it, and which can also insert random pauses between page requests to make its robotic nature a little less obvious to the webserver. This in and of itself isn't illegal, but depending on how you used it, it might be.
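The no-cache restriction mentioned above is normally expressed per page with a robots meta tag rather than in robots.txt; two alternative forms (use one or the other in a page's head):

```html
<!-- Allow indexing, but tell engines not to offer a cached copy: -->
<meta name="robots" content="noarchive">

<!-- Or keep the page out of the index entirely: -->
<meta name="robots" content="noindex, nofollow">
```

Like robots.txt itself, these are requests that a well-behaved crawler chooses to honor, not technical enforcement.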

Re:Robots? (1)

cgenman (325138) | more than 7 years ago | (#16168441)

Google maintains an opt-out policy for both its Google News and Google Print services, saying any publisher can withdraw its content simply by asking.

So that's two routes the publishers could have taken to achieve the same thing.

It seems like the issue is that publishers want to remain in Google, they just want to find ways to get paid for it. They're looking for a third option between "Play by google's rules" and "take our content and go home."

On the other hand, I was under the strong impression that "indexing" a site is completely legal, as you're not violating any (1) copyright or (2) trademark laws, and that really the issue is whether the reproduced blurb about the page exceeds fair use.

Being a content creator myself, I do think that content creators should be compensated for their works. But there is fair compensation, and there is misunderstanding the technology and trying to pass restrictive laws to get all that you can grab. Thankfully, this seems to be somewhere in the middle.

greed... (3, Informative)

wulfbyte (722147) | more than 7 years ago | (#16167603)

is the world's most common and least forgivable form of stupidity.

NAA (1)

ignipotentis (461249) | more than 7 years ago | (#16167605)

What do you expect? The *AAs like to use their size to affect prices and generate revenue. This is their job. If the New York Times and Washington Post jumped on board, wouldn't Google look foolish for not being able to return stories which match nicely to the search request?

Although I do find it funny that the NAA (http://www.naa.org) is made searchable through a Google Mini appliance.

Re:NAA (1)

Alioth (221270) | more than 7 years ago | (#16168227)

No - not at all. That sort of thing would be a huge opportunity for, say, the BBC or USA Today (or any other competitor) who doesn't keep their information from being indexed and found.

Submitter has hit the nail on the head. (4, Insightful)

numbski (515011) | more than 7 years ago | (#16167617)

There's really not much more to say about this. Let 'em wallow in their own stupidity, and they'll come around. Sometimes, like children, you have to let someone learn the hard way, and they'll never do it again. :)

"You'll shoot your eye out! You'll shoot your eye out!"

Side note: anyone else lose their login cookie this morning, only to be forced to log back in and fill out a captcha? Weirdness. Worse, I saw no option for the visually impaired to log in either. Tsk tsk tsk... guys, come on. I'm not meaning to toss flames around, but you've got to provide some sort of opt-out link for those who can't see your captcha images. :(

Re:Submitter has hit the nail on the head. (4, Insightful)

mrmeval (662166) | more than 7 years ago | (#16167711)

The absolute second a pornographer sued Google they should have ripped anything by them off their server and made sure that would never appear on a Google search again.

Any 'content holder' that whines needs the same thing done to them, with no option for reindexing without either paying enough to bleed them white or, better, donating a large chunk of their content to the public domain.

Since when did the world work for Google? (0, Redundant)

mccalli (323026) | more than 7 years ago | (#16168007)

All these points of view are predicated on the idea that content providers should be grateful to have their content placed in a search engine. Well, these people have explicitly said that they are not grateful, and that they don't want it to happen.

Any 'content holder' that whines needs the same thing done to them with no option for reindexing without paying enough to bleed them white

What sort of attitude is that? You see, that's exactly why people get fed up. A search engine could not exist as a commercial entity if there were nothing to search. Original content needs to be generated somewhere, and these people are saying that content is generated for the benefit of their site, not for Google's (or MSN's or Yahoo's et al.). They are saying that the ad revenue for viewing headlines on the site should go to them, not to Google. That the terms of viewing the site should be set by themselves. That they own copyright where they say they do. And, since they originate the content, I agree with them.

Cheers,
Ian

Re:Since when did the world work for Google? (3, Insightful)

kimvette (919543) | more than 7 years ago | (#16168053)

If they disagree with how Google works, they should block googlebot, or at minimum, create a robots.txt

Re:Since when did the world work for Google? (0, Troll)

mccalli (323026) | more than 7 years ago | (#16168135)

If they disagree with how Google works, they should block googlebot, or at minimum, create a robots.txt

No, that's missing the point of what I said again. I said - since when did the world work for Google? I don't want to do extra work because someone is misusing my copyright, I want them to abide by the copyright in the first place.

For just appearing in google.com's search index I have little sympathy for the sites' case. But look at Google News and tell me it isn't becoming a publisher. At the very least it needs to decide which articles should feature more prominently, whether by machine or not. I'm sticking by my normal privacy principles over this - opt in, not opt out.

Cheers,
Ian

Re:Since when did the world work for Google? (2, Insightful)

sabernet (751826) | more than 7 years ago | (#16168245)

Your point is moot: by publishing to a public internet site, you have opted in. Otherwise it would be on a VPN, or at the least a privileged site protected by a password.

If you have a site on the www, it means it can be queried by ANYONE, regardless of whether you like it or not. To say otherwise would be to broadcast a message on FM radio and complain that someone heard it and talked about it.

There are ways around this: password protection and robots.txt.

The world does not work for google, rather google works on the internet. If you don't want to put the extra effort in to secure it, then don't put it on the web to begin with or you lose all rights to complain.

Would you sympathize with a bank that put all its customer data on an insecure website but blamed it on the people who visited the link when they "should have understood they shouldn't have"?

Re:Since when did the world work for Google? (1)

tsm_sf (545316) | more than 7 years ago | (#16168545)

Since you're not a professional sysadmin/webmaster/whathaveyou it's forgivable that you didn't know about the standard method for blocking search engines from indexing your site. It's not a hidden feature, it's not a hack, but it is a little obscure if you're outside the "art".

The news service's webmaster has no such excuse. This isn't a matter of intellectual property vs. the "electronic commons", it's a matter of incompetence vs. greed.

Usage and abusage (1)

matt me (850665) | more than 7 years ago | (#16168165)

You suggest that Google should deal with anyone who threatens their profits by removing them from their index, essentially a death sentence for any online business. (Google has a 54% market share [Wikipedia], probably the more web-savvy half that consumes the most.) This is an appalling suggestion. I very much hope they don't do anything like this, but how could we tell? Businesses go to any lengths to beat their competitors; I'd wager Google receives hundreds of emails offering ridiculous sums in return for bumping one site past another in PageRank. Thousands of sites will offer to do this for you, and needless to say, if they do work, it's by spamming.

You can't assume Google will always act ethically. Power corrupts; absolute power corrupts absolutely. Scandal is inevitable, be it privacy, corruption or censorship. The way we use Google has eroded the web's greatest quality, that of the hyperlink rendering all sites equal. The hierarchy of PageRank means sites beyond the first ten results are often ignored. This influence is dangerous.

There's so much talk on Slashdot of Microsoft abusing their stolen monopoly. Yet we've handed Google one. People blindly swear allegiance to them, defending their more questionable actions that, if another company perpetrated them, they'd certainly condemn. Honestly, when did you last use another search engine? When Google's broken, are you even able to find one?
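The PageRank hierarchy the parent invokes boils down to a simple iterative computation; here is a toy power-iteration sketch (the three-page link graph is invented, the 0.85 damping factor is the commonly cited value, and this is the textbook idea, not Google's actual implementation):

```python
# Minimal PageRank power iteration over a toy 3-page link graph.
links = {0: [1, 2], 1: [2], 2: [0]}  # page -> pages it links to
n = len(links)
d = 0.85                  # damping factor
rank = [1.0 / n] * n      # start with a uniform distribution

for _ in range(50):
    # Every page gets a baseline share, plus a damped share of the
    # rank of each page that links to it.
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        share = rank[page] / len(outs)
        for target in outs:
            new[target] += d * share
    rank = new

# Page 2 is linked by both 0 and 1, so it ends up ranked highest.
print(max(range(n), key=lambda i: rank[i]))  # 2
```

The point the parent makes follows directly: rank concentrates on heavily linked pages, so everything outside the top handful of results effectively vanishes.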

Re:Usage and abusage (0)

Anonymous Coward | more than 7 years ago | (#16168307)

The way we use Google has eroded the web's greatest quality, that of the hyperlink rendering all sites equal. The hierarchy of PageRank means sites beyond the first ten results are often ignored. This influence is dangerous.

This is a fallacy. Search engines have rendered the web a usable place by providing entry points for requests. From these entry points, you can still go everywhere. Go and find the web site of the Belgian newspaper that won the case against Google. Don't use a search engine, of course.

Re:Submitter has hit the nail on the head. (1)

westlake (615356) | more than 7 years ago | (#16167857)

Let 'em wallow in their own stupidity, and they'll come around

There are sites and services Google News must access to remain credible. The throw-away weekly shopping paper from Nowhere, Nebraska is not a substitute for the WSJ.

Microsoft would like nothing better than to become the news channel, the portal, for the decision-makers in this world.

Only if the search engines hang tough (2, Interesting)

jmorris42 (1458) | more than 7 years ago | (#16167627)

The premise of the submitter only holds if ALL of the search engines hang tough. If only Google tells 'em to go piss up a rope, they lose most of the news sources and readers start using someone else. One of the failing search sites will pay (because for them the cost will be minimal... at the time) and with luck become successful. Then they give all the profits to the news providers, become a .bomb, and we repeat the cycle until they are all dead except Google, who only derives a small income from banner ads on Google News. See online music P2P sites become DRMed music providers and then die, for a template.

Re:Only if the search engines hang tough (1)

kfg (145172) | more than 7 years ago | (#16168201)

The premise of the submitter only holds if ALL of the search engines hang tough. If only Google tells em to go piss up a rope, they lose most of the news sources and readers start using someone else.

See my post about Wal-Mart shipping back DVDs.

KFG

Re:Only if the search engines hang tough (2, Insightful)

paeanblack (191171) | more than 7 years ago | (#16168223)

I think the news publishers are in a worse predicament, given that 90% of non-local articles are verbatim reprints of AP reports. Unless they all hold firm, the search engines will see the content anyway. Google et al. also have the option of subscribing to AP directly and becoming true publishers themselves.

Lawsuit (5, Insightful)

ultranova (717540) | more than 7 years ago | (#16167633)

So, ok, a search engine company decides it doesn't want to pay royalties and therefore doesn't index the provider's site. Now won't the provider actually lose readers since their articles won't be locatable by search anymore?

Sounds like grounds for suing the search engine for lost revenue to me!

"Your honor, by refusing to pay our fee the search engine is not only depriving us of our fair due, but also giving an unfair competitive advantage to our competitors. We demand that they add us to their search database and pay our very reasonable fee for accessing our pages."

And if anyone mods me funny, well... that's one naive fellow, then.

Re:Lawsuit (1)

JanneM (7445) | more than 7 years ago | (#16167745)

And if anyone mods me funny, well...

Funny? Had there been a "-1 You're making me cry" moderation I would not have needed to write this.

Re:Lawsuit (1)

Morphine007 (207082) | more than 7 years ago | (#16167871)

I would hope that a US judge would throw that kind of stupid shite out the instant it crossed his/her desk... but given this kind of shenanigans [slashdot.org] your post deserves the "+1, sad but true" mod :S

Not as clear cut as it sounds (1, Insightful)

joe545 (871599) | more than 7 years ago | (#16167637)

These news sites are taking the point of view that not all publicity is good publicity. If they don't want their content to be aggregated into a section of another (and very popular) web site that they feel is encroaching on their business, then who are we to say that they shouldn't try to take action? If Google decides to put ads up on Google News, they'd be making money from others' content, so why shouldn't the publishers want, and get, a piece of that pie?

Re:Not as clear cut as it sounds (4, Insightful)

ignipotentis (461249) | more than 7 years ago | (#16167665)

I don't see this as making money on others' content. Google does link you back to the main story. It does not display a cached version.

Google is making money on their unique ability to gather, index, and make sense of all of the separate news articles being created. They can show you news stories about related items from different sources around the world. They are making money on the service which lets you actually see the news from different points of view.

This is very different, and the content creators had nothing to do with it. This is a service on top of their service, and they deserve nothing from it. Google does provide an opt-out option, either by contacting them or by simply using robots.txt.

Re:Not as clear cut as it sounds (0)

Anonymous Coward | more than 7 years ago | (#16167891)

I don't know... Google does display a few sentences of context. It may be that a few sentences are all those complaining organizations provided in the first place.

(It's amazing how little information these news organizations actually give out now. Probably because joe-public just doesn't have the attention span.)

You're right (0)

Anonymous Coward | more than 7 years ago | (#16168587)

If they want to shoot themselves in the foot, we have no place in stopping them.

Willfully stupid (4, Insightful)

hublan (197388) | more than 7 years ago | (#16167639)

From the article:
"Since search engine operators rely on robotic 'spiders' to manage their automated processes, publishers' Web sites need to start speaking a language which the operators can teach their robots to understand," according to a document seen by Reuters that outlines the publishers' plans.

"What is required is a standardized way of describing the permissions which apply to a Web site or Web page so that it can be decoded by a dumb machine without the help of an expensive lawyer."


You mean like robots.txt?

This sounds like willful ignorance. All the search engines mention it as the method for keeping particular content out of the index. They might not read RFCs, but a quick peek at the help pages of the search engines in question would've answered this (and squashed the lawsuit) in no time.

I agree (2, Interesting)

khallow (566160) | more than 7 years ago | (#16167649)

Instead, it'd be the content provider paying the search engine. I can't imagine a scenario where the webpage is so valuable that the economics would work this way. Further, I don't understand their concern about indexing content. It's not hard at all to block or steer search engines. It strikes me that these publishing companies are either ignorant of the value provided by external search engines and/or delusional about the value of content that isn't indexed by a popular search engine.

Re:I agree (1)

paeanblack (191171) | more than 7 years ago | (#16168417)

I can't imagine a scenario where the webpage is so valuable that the economics would work this way.

Anything sufficiently popular and time-sensitive

nyse
nasdaq
ebay
craigslist

Hey! I got a better idea (5, Funny)

iminplaya (723125) | more than 7 years ago | (#16167655)

Let's just shut down the net. Damn thing has been nothing but trouble since the beginning. We probably should outlaw all communications that don't provide more income for the content "owners". That means no more printing, writing, singing, painting, talking...anything. If we don't want to give all our money to these damn people, we should shut the hell up, right?

And put in your earplugs
put on your eyeshades
you know where to put the cork...

Oops, there goes another violation.

Re:Hey! I got a better idea (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16167925)

I agree. The Net was an interesting experiment, but it's just about over now. But, there were a few ideas to be learned.

I suggest someone make an Internet Version 2. Now that we have software patents and EULAs and interminably long copyrights, patent/copyright/trademark the *&$%^ out of it. Then, license it to anyone and everyone on the sole condition that they agree to not let a lawyer anywhere near it. In order to run a web server, you have to agree not to sue anyone over content you post on it. In order to run a web browser, the EULA requires that you forego the right to sue anyone over any content you see with it. If you're a telecom company that wants to buy a router, you have to agree not to block ports or artificially slow your competitor's traffic. If you want to make software that uses Internet Version 2 functions, you have to agree to forego software patents or license them to everybody. In short, let's build a new, better internet - without the lawyers.

Can it be done?

Re:Hey! I got a better idea (1)

kfg (145172) | more than 7 years ago | (#16168301)

And put in your earplugs

Brought to you by Bose(tm)

put on your eyeshades

Brought to you by Paramount Studios(tm)

you know where to put the cork...

Brought to you by Kaopectate(tm)

KFG

Robots.txt is a machine readable permissions model (4, Insightful)

mdfst13 (664665) | more than 7 years ago | (#16167669)

"What is required is a standardized way of describing the permissions which apply to a Web site or Web page so that it can be decoded by a dumb machine without the help of an expensive lawyer."

They already have this. It's called the robots.txt file. You can use it to tell search bots not to index you. This just seems to be a richer permissions model that includes things like caching and excerpting options.

In the longer term, I agree that this hurts content providers more than Google. It makes the search index somewhat less useful, but more importantly it makes the content unfindable. Content that uses this will simply be replaced by content that does not.

Why would Google pay to provide better search results for content? It would make more sense for them to pay for the content directly so that they could have an exclusive, or for content providers to pay to appear in the search results, as with Yahoo.
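As a concrete sketch of how the existing permissions model already works on the crawler side, Python's standard library even ships a robots.txt parser. The rules and URLs below are invented for illustration:

```python
# How a well-behaved crawler consults robots.txt before indexing.
# The rules below are hypothetical, for illustration only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /archive/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot may index the front-page news but not the archive;
# every other crawler is excluded from the whole site.
print(rp.can_fetch("Googlebot", "/news/today.html"))     # True
print(rp.can_fetch("Googlebot", "/archive/1982.html"))   # False
print(rp.can_fetch("SomeOtherBot", "/news/today.html"))  # False
```

Note that this is purely advisory: the crawler chooses to ask, and nothing stops a rude one from ignoring the answer.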

Re:Robots.txt is a machine readable permissions mod (0)

Anonymous Coward | more than 7 years ago | (#16168457)

"What is required is a standardized way of describing the permissions which apply to a Web site or Web page so that it can be decoded by a dumb machine without the help of an expensive lawyer."

They already have this. It's called the robots.txt file. You can use it to tell search bots not to index you. This just seems to be a richer permissions model that includes things like caching and excerpting options.


You don't expect expensive lawyers to understand dumb machines, do you? It's much more profitable to make obfuscated law than obfuscated code.

Sad news ... Bin Laden, dead at 54 (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#16167699)

I just heard some sad news on talk radio - Al-Qaeda leader Osama Bin Laden was found dead in his Pakistan home this morning. There weren't any more details. I'm sure everyone in the Slashdot community will miss him - even if you didn't enjoy his jihad, there's no denying his contributions to popular terror. Truly an anti-American icon.

Protective tariffs vs. revenue tariffs (1)

davidwr (791652) | more than 7 years ago | (#16167705)

If a newspaper wants to make money, they'll set the "tax" low enough so most search engines will just pay the fee. They'll do this because they know at least one of their major competitors will do the same, especially those competitors who are ad-supported.

If a newspaper wants to be delisted, they can charge a very high fee.

It's like the 19th-century import duties in the USA. Small taxes on imports were a large part of Washington's tax base. However, to protect American producers, some items had huge taxes. Washington didn't make much money off of those taxes.

The dying gasps of print publishing (1)

grapeape (137008) | more than 7 years ago | (#16167707)

This is just the latest attempt by traditional newspapers to survive. Back when the web first started, remember how many newspapers refused to be online at all, or put up registration hurdles to prevent "linking"? After enough bad PR and dropping subscription levels, most reluctantly accepted the web and started to regain credibility. I have a newspaper salesperson who comes to my door at least once a month. The last time he was here we argued about the value of the newspaper: he mentioned classifieds, I mentioned Craigslist; he mentioned local news, I mentioned kansascity.com. For every "exclusive" I could easily counter with two or three web alternatives that were not only more up to date but also allowed a certain level of interactivity. Print newspapers just can't compete.

There has been a huge content shift at many newspapers, mainly due to the availability of up-to-the-minute news online. Many papers now rely on their columnists and unique local features to survive. Some have managed to flourish, but most will never be as important to their respective cities as they once were, and for a publisher that hurts. They all still want to maintain their mini-empires, which frankly is impossible in the modern information age. Limiting their content will do nothing but limit the views of their content, further exacerbating the very problem they feel they are facing.

Re:The dying gasps of print publishing (1)

extropic (926837) | more than 7 years ago | (#16167927)

He should have mentioned: "Reading it in the bathroom, bed, etc." "Bird cages" "Fish and Chips" "Moving day in the kitchen" "Packaging items sold on eBay" "Training the dog" "Reading the sports while she reads Lifestyle and junior reads the funnies"

No, it's a great idea ! (1)

Hawthorne01 (575586) | more than 7 years ago | (#16167719)

We have nothing but high hopes that this will increase our perceived value to our readers and boost our credibility in the news market, as well as continue to return the same value to our investors as our recent Times $elect service.

- The Managing Staff Of The New York Titantic ^H^H^H^H^H^H mes

That'd be great for earnings... not really. (1)

HatchedEggs (1002127) | more than 7 years ago | (#16167727)

Think about it: search engines make the internet go round. Well, not really; we could in many ways survive without search engines, but it would be inefficient and problematic.

Content owners that try to make somebody pay to search for their content are going to find that the desire to find that content "miraculously" dries up.

Things might eventually change a bit in how pay sites are indexed so their content isn't made free to all, but as you'll find with the companies that got one leg up on Google this time, they will be the ones that end up suffering in the long run.

Search engines are your friends: they drive traffic to your site. The game is how to get traffic to your site and make money on it once it gets there, not to charge the people who help bring you that money.

This could cripple content owners who do this (4, Insightful)

ConfusedSelfHating (1000521) | more than 7 years ago | (#16167779)

There will always be smaller news outlets who want to get additional daily viewers. They want Google to direct people to their site. If the large news organizations want to opt out, there will always be someone to take their place.

When you look at Google News, you see a brief summary of the news article, and when you click on it you are directed to that website. The website will earn revenue from its advertising. If it is an attractive and useful website, people may go to the site directly. New unique users. Often I find that after I've read an article I found through a search, I will go to the homepage of the site (through the "hacking" known as modifying the URL) and look at their other articles. Most websites would pay Google to have links to them; now some sites want Google to pay them? Google will just ignore them and their competitors will prosper.

Doesn't Slashdot do something similar? Someone reads something interesting on the web and suddenly there's a link to it. I'm sure if some sites wanted to charge a fee to Slashdot, they would promptly be ignored.

The idea that comes to mind is revenue stream. Someone working for the news organizations came up with the thought "Google has lots of money, let's take it" and so it began.

Re:This could cripple content owners who do this (1)

westlake (615356) | more than 7 years ago | (#16168037)

There will always be smaller news outlets who want to get additional daily viewers. They want Google to direct people to their site. If the large news organizations want to opt out, there will always be someone to take their place.

Do you have the faintest idea of what the cost of entry is here?

Most cities count themselves lucky to have a single marginally competent daily newspaper. One TV station that rises above the "Eyewitness News" level. Where I live there is one regional upstate paper that is worth a damn.

Re:This could cripple content owners who do this (1)

ConfusedSelfHating (1000521) | more than 7 years ago | (#16168185)

I admit that I was thinking on an international and national scale. But let's say a city has two papers that also have websites. One has 70% of the circulation and the other has 30%. If the paper with 70% of the circulation demands royalties, it will be ignored and search results will bring up the website of the less popular paper. Even if it's of lower quality. If every news outlet of a region demands royalties, that region will be ignored. You will just need to bookmark the websites of your local news outlets when you find them.

The only way around this is for the news outlets to lobby for mandatory compensation within the law. I think the major news organizations may pursue this option.

Re:This could cripple content owners who do this (1)

westlake (615356) | more than 7 years ago | (#16168405)

Even if it's of lower quality. If every news outlet of a region demands royalties, that region will be ignored.

A Canadian search engine that ignored the Globe and Mail, Maclean's and the CBC would be next to useless. Searchers will simply move on to an alternative search engine. If you index financial news, you won't be taken seriously unless you include the WSJ.

Google = parasite! (1)

LCookie (685814) | more than 7 years ago | (#16167799)

I think this is a good idea.. Google is just a leech, making money with other people's content.
They should pay, plain and simple!

About robot.txt (1, Redundant)

HatchedEggs (1002127) | more than 7 years ago | (#16167831)

BTW, the Belgian newspapers, when asked why they didn't just use robots.txt, stated that it should not be on their shoulders to have to keep others from misusing their copyrighted work.

What this translates to is that not only are they too lazy to spend five minutes updating their site so that Google doesn't index it, but they also fail to understand the benefit they obtain from search engines. Which in reality is probably quite great.

Content owners are the losers in the internet (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#16167847)

n/t

Fair enough (3, Insightful)

Ajehals (947354) | more than 7 years ago | (#16167853)

This would work fine - *if every content provider did it*. (When I say "work fine" I mean how the content providers would like it to work.)

I.e., let's say the Guardian and the Independent charged a royalty for indexing certain articles and the Times didn't. Then, when a person searches for something that would under normal circumstances return articles from all three content providers - say the search is for "Falklands War newspaper headlines" or something - instead of getting a result from all three papers, you would get just the one from the Times.

Now, assuming not everyone knows that certain papers charge search engines for permission to index their content, it will simply look like the Guardian and the Independent didn't report the Falklands War - or whatever you searched for.

This may even turn customers against the traditional product, especially with more and more people both using multiple online papers and buying a paper copy. I mean, if you start reading the Times online every day because it is the only remaining fully indexed paper, are you more or less likely to buy it when you decide to get a real copy? I guess it would do wonders for international brand recognition too - if you are not indexed for common searches, who is going to know you well enough to trust you for the occasional piece where you have allowed yourself to be indexed?

Really this is all about the fact that search engines generate advertising revenue for themselves using others' content. Content providers are now looking at this and saying:

"Hey, Google makes X million dollars by directing people to my site and advertising for my competitors, and it indexes my content (Google Images, News, etc.) so people aren't coming directly to me. Maybe if I threaten the source of their content they will pay me, and I can finally make some money from this interweb thing without having to actually charge people myself!"

I guess this is an attempt to recover the revenue they assumed they would get from selling content to their visitors directly through online subscriptions, which didn't work (unless you were a specialist or exclusive provider, such as companies providing financial information, stock prices, adult material, etc.). It didn't work because others didn't charge: why pay for access to ITV News or CNN online (if they charged) when the BBC or some other organisation offered the same stories (with a different editorial slant) for free?

What they should be asking is how to get a search engine to send as many people as possible to their site, where they can then try to sell whatever services or exclusive content they want. After all, the more page hits, the more (theoretically at least) conversions.

Anyway, let them try to charge a royalty - or enforce their copyright and prevent the search engines from making money by including their content - it will only harm them.

The internet really is a level playing field: anyone with a good site can get listed on a search engine and get hits, hopefully achieving whatever it is they are trying to do. Why do some people want to change it so that it benefits them more? All that will happen is that it will break the way the internet works, or is perceived to work, and damage their own web activities. Plus, some content providers simply will never do this; the BBC in the UK certainly would find it difficult, and so (I assume) will other public service information providers, so I guess there will always be at least one or two news sites out there.

I know I have focused on newspapers here (and that does appear to be the gist of TFA), but providers of other content such as music, video, and software are in the same position. The problem is that the internet-using public likes getting stuff for free, and probably won't pay for something if they can't have access to it for free for at least a while first.

Ah well. (By the way, I'm absolutely shattered, so if any or all of the above makes no sense or is a bit jumbled, then please ignore...)

Oh and from the article: "Since search engine operators rely on robotic 'spiders' to manage their automated processes, publishers' Web sites need to start speaking a language which the operators can teach their robots to understand," according to a document seen by Reuters

I hope they don't spend too much before they figure out what a robots.txt file does.. :)

Forget knee-jerk reactions... (3, Interesting)

Bender0x7D1 (536254) | more than 7 years ago | (#16167887)

First, we must remember that not all web sites are found through Google. When was the last time you did a search for Amazon.com or BarnesAndNoble.com? Sure, it's nice to be listed but hardly deadly if they weren't - they already have the name recognition.

Now, there is a legitimate downside to being listed on Google News: all of your competitor news sources are also listed, right next to you. If the New York Times runs a piece from the Associated Press and I can see that the Des Moines Register runs the same story, why go to the big-name source? The NYT has spent decades and millions of dollars building its reputation, only to get listed next to other, less-known papers. It serves to dilute their name and reputation.

For those of you convinced that you can get plenty of news from other places and that these print publications can adjust to new business models or die: are you crazy?!? One nice thing about having a huge newspaper is that they generally try to verify their stories, or at least avoid making things up. (I said generally...) When your paper owns buildings and huge printing presses and is sold at every newsstand, your reputation means something. If you are a few people working out of a basement, then who cares? As long as you've got people reading, you are happy. I like the idea of responsible journalism. It may be less than it was, but if I see it in the NYT I am inclined to believe it. If it is in some tabloid, I am inclined not to believe it. In a strictly Internet world, how do you tell the difference?

I hope that a good arrangement is made between the press and the search engines, but I don't think the survival of the press is based on them being indexed by Google.

Re:Forget knee-jerk reactions... (2, Insightful)

topham (32406) | more than 7 years ago | (#16168119)


I use Google News a lot. It makes it very easy to find news articles I would not find otherwise. I get annoyed when Google News shows a site which, when I click on the article, doesn't let me view it because I need to be registered. My solution? Follow the next link, as it will likely not have the registration requirement.

News sites are going to have to understand something: except for those sites I choose to go to on a daily basis, the rest are secondary. I won't go to them unless there is a story of interest.

News sites which require some fee to be indexed will drop by the wayside as the smaller sites become more available. I stopped going to Forbes a while back because their page caused the Java runtime to load in the browser I was using; it slowed the system to a crawl, and it was generally possible to read the article elsewhere.

Except for a particular attachment someone may have to a specific site (be it Fox, CNN, Forbes, the Toronto Star, whatever), all other articles are read on a whim. On the other hand, it is those articles which create the opportunity for someone to see the site and realize it is worth a look on an ongoing basis.

Somehow I think the smarter sites will realize the trick is to get people to stay on their site (by choice) once they get there, rather than charge an indexing service.

Re:Forget knee-jerk reactions... (1)

Dorceon (928997) | more than 7 years ago | (#16168207)

Not everyone is a Slashdot-level techie. I've heard of people for whom the Google search box has completely replaced the address bar. It's easier to avoid cybersquatters that way too.

Re:Forget knee-jerk reactions... (1)

Klintus Fang (988910) | more than 7 years ago | (#16168499)

When a big-name news organization like the NYT gets indexed right next to some small-time news organization, it only degrades the big organization's reputation if the stories have the same content and/or the same quality. And if that is the case, then a reader would be quite reasonable to conclude that the bigger news organization's reputation is over-inflated. I can see how the bigger news organization might be threatened by that, because it makes it more difficult to hide behind its reputation. I do not see how that is a problem for the user or for the integrity of news in general. If anything, it is good for both, because it keeps the bigger news organizations honest. If the bigger news organization cannot provide content unique and compelling enough to stand out as superior when placed up against Joe Blow's article on the same topic, then all that means is that its reputation isn't deserved.

Now, if the bigger news organization wants to provide vanilla content to its dedicated users - content it knows is no different from the vanilla content on Joe Blow's site - without letting that content degrade the value of its stronger material, well, that's what things like the robots.txt file are for. It can selectively prevent that type of content from being indexed so as not to tarnish the image it has built for itself. Sounds like a solid and healthy system to me.

Re:Forget knee-jerk reactions... (2, Insightful)

saikou (211301) | more than 7 years ago | (#16168585)

I don't see how "verifying the story" is applicable to, say, AP or Reuters newsfeeds. Those are pretty much blasted out equally by small, medium and large newspapers as is (even though they need verification - and oh boy, if you ever saw how they castrate content to make a ten-sentence, detail-less snippet out of an interesting ten-paragraph story...). And for those stories it does not really matter where you go; you read identical content (hence the 50+ articles that start with exactly the same words). The newspapers smart enough to allow indexing and provide access will see a windfall of visitors (what they do with them is another matter, but at least they can try to recoup their bandwidth investments through ads). Heck, Reuters has its own web site, and I bet it'll be happy enough to allow indexing.
For local news you pretty much have to go to small sites. The Washington Post, for some strange reason, does not cover the news of, say, Hell, Michigan. So it goes back to the local news provider, which would like to have more visitors, not fewer.
So... big media entities will keep out of the way of small entities. Users will be able to find pretty much the same content. Where's the downside? If the whole "don't index me, I want to charge for my content" thing leads to growth in the small news provider niche, will you really object? Because even if you don't trust your local News 59 hometown TV news crew or Mournful Examiner newspaper, being able to read multiple reports on the same event beats one "trusted" source any day.

Please stop already! (1)

db32 (862117) | more than 7 years ago | (#16167889)

There have been a dozen stories and lawsuits over this already. Why the hell hasn't robots.txt come up in court yet? I have a hard time believing the lawyers defending against this are so incompetent that they wouldn't put someone with a clue on the stand to explain that the system is already there and these people aren't using it. In fact, a smart lawyer would countersue, because the system IS already there, and they failed to use it and instead wasted the court's time. Imagine (diddly do, diddly do - Slashdot analogy time) you sue the auto manufacturer for not including a safety harness in their vehicles because you got injured while not wearing your already-installed seat belt.

Or maybe there is something more to this than the typical Slashdot "duh, robots.txt" response... who knows... but I really would like a meaningful answer about whether there is something more. If not, someone needs to make this shit stop! Maybe these places are using robots.txt and the search engines are ignoring it, or someone is using it wrong, or maybe we should get together and write a nice pretty page on robots.txt and submit it to all the news stations as a "new, innovative" way to protect online content... maybe they will get it then... and then we can all laugh as all those sites fall off the net because no search engine can find 'em.

Re:Please stop already! (1)

Chapter80 (926879) | more than 7 years ago | (#16167991)

Imagine (diddly do diddly do slashdot analogy time)...
It took me a minute to figure out what the heck you meant by the above line. But then, after a chuckle, I read the rest of your message, awaiting the "Scooby-doo ending" where the mask gets pulled off, and *surprise*, it's the RIAA underneath, saying that they would have gotten away with it if not for you meddling slashdotters...

Re:Please stop already! (0)

Anonymous Coward | more than 7 years ago | (#16168047)

Because robots.txt is an opt-out mechanism, but copyright is not.

There are many more users/redistributors than there are content makers. It would be infeasible at best and impossible at worst if you, as a content maker, had to go to each redistributor and explicitly opt out of their service.

Not all bots honor robots.txt, which is a voluntary file for both the site owner and the robot owner. Plus, there are more ways to get content from a site, such as the RSS feed, which is intended for personalized newsfeeds, not redistributors or commercial aggregators.

This is not the same as the seatbelt argument, since you're looking at it from the side of the user, which would be the search engine in the original scenario, NOT the car maker, which would be the manufacturer. In fact, this argument actually supports the car maker/manufacturer: why should they be responsible for the user's choices? There are many more users than manufacturers, and it's impossible to meet the individual needs of all of them.

Re:Please stop already! (1)

db32 (862117) | more than 7 years ago | (#16168147)

I fail to see how this is anyone's problem but the content makers' or distributors'. Don't publish to a public area if you don't want the public to be able to get to it; pretty simple. It's not like the search engines are violating logins and whatnot to cache this stuff. It's out in the open.

Further, the analogy was about how they are demanding a way to opt out of search engines and how vital it is that they have one - when it already exists, and they only have problems because they aren't using it.

implicit assumptions (1)

grikdog (697841) | more than 7 years ago | (#16167917)

Isn't there an implicit assumption that indexing every conceivable shred of garbage on the net is a service? Isn't there a tacit understanding that anything useful, or interesting, or commercial will be herded behind tents and flogged by carnival barkers (thinking of Boing Boing or a score of others) for dimes at a time? Isn't there an immodest presumption that this activity shall be passed off as "scholarship" (such as requiring a disambiguation page at Wikipedia to disentangle Omar Khayyam, Persian poet, from Omar Khayyam, suicide bomber?) I have no objections to godless capitalism whatsoever, so long as it does not turn into Mordac, the Preventer of Information Technology. Let the shakedown ... er ... shakeout begin!

Public domain (1)

Antony-Kyre (807195) | more than 7 years ago | (#16167939)

Aren't websites not using the robots exclusion standard essentially in the public domain?

It would be totally unlike a music CD, which is not free to download; hence a site couldn't necessarily crawl it and put it up there for everyone to see.

Since the website was free for anyone to see in the first place, no harm done. Unless the site requires a subscription and Googling bypasses it, there really isn't anything that can be done, I think.

Re:Public domain (0)

Anonymous Coward | more than 7 years ago | (#16168087)

No. Western copyright laws grant copyright UNLESS (that is, if and only if) the copyright owner opts out and assigns ownership to the public domain before the copyright ends.

Re:Public domain (1)

Antony-Kyre (807195) | more than 7 years ago | (#16168159)

The site is essentially being tagged, not copied, in the search results, right? It's also being cached, but robots exclusion can stop search engines from doing that part.

Re:Public domain (1)

Tim C (15259) | more than 7 years ago | (#16168327)

As the original respondent said, just because content is on the web doesn't mean that it's in the public domain. It's publicly accessible, but that's a very different thing to being in the public domain.

Secondly, unless it's changed since I last bothered with it, robots.txt can't allow indexing but prevent caching, nor can it actually prevent anything. Robots.txt is a *request* to a user agent not to enter certain parts of a domain. User agents are perfectly at liberty to ignore the request, although doing so is obviously somewhat rude. Even if the user agent abides by the request, though, it'll mean that the excluded sections won't be indexed at all, not just that they won't be cached.
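For what it's worth, the per-page robots meta tag gets closer to "index but don't cache" than robots.txt does; the major engines document values along these lines, though honoring them is just as voluntary. A sketch:

```html
<!-- In a page's <head>: allow indexing, but ask engines not to
     keep a cached copy or show a text excerpt. -->
<meta name="robots" content="noarchive, nosnippet">
```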

The Answer Would Be... (1)

interval1066 (668936) | more than 7 years ago | (#16167951)

"Now won't the provider actually lose readers since their articles won't be locatable by search anymore?"

Yes.

Legislation-facilitated industry association (0)

Anonymous Coward | more than 7 years ago | (#16167957)

How about Congress authorizes creation of the CCCA (Content Creators of America) as an "opt out" platform under which there is a basic royalty scheme for search engine to pay content creators when their items arise to a certain level in their displayed results?

Allow content creators to "opt out".

It's all about transforming a classic technology monopolist (unavoidable given technology facilitated increasing returns to scale) holding court over a decentralized competitive supply market, into a classic duopoly.

As a matter of fact, the status quo is a significant disincentive to content creation. The duopoly model would be better.

Re:Legislation-facilitated industry association (1)

BlueStrat (756137) | more than 7 years ago | (#16168411)

"How about Congress authorizes creation of the CCCA (Content Creators of America)..."

How about Congress authorizes creation of the CCCP (Coalition of Content Creators and Publishers)...

There, fixed that for you. :P

Cheers!

Strat

Simple directions (0)

Anonymous Coward | more than 7 years ago | (#16168091)

Draw gun. Aim at foot. Apply pressure to trigger. Reload and repeat as needed.

lose readers? (0)

Anonymous Coward | more than 7 years ago | (#16168103)

No. Regular readers will visit the site regardless of indexing by Google News.
Casual readers will keep visiting whatever happens to be on the front page at the time they load it.

Dear reuters, (0)

Anonymous Coward | more than 7 years ago | (#16168129)

Is finding competent developers so hard?
javascript:ArticlePaging('/news/articlenews.aspx', 'internetNews','2006-09-22T132105Z_01_L22732625_RT RUKOC_0_US-MEDIA-PUBLISHERS-SEARCH.xml','1','','', 'NewsArt-C1-ArticlePage1');
Special links like this, which specifically target the JS-impaired, need to be built with appendChild and the DOM, with a normal link placed inside noscript tags, so that users with a clue don't have to copy and paste from your JavaScript. This simple approach is covered in HTML lesson 1 and greatly enhances accessibility and usability by letting users follow a link in a new browser tab, or with whichever method or assistive technology they prefer.

Regards from the web (something which you obviously don't understand).
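The fix that comment asks for looks roughly like this. The URL, link text, and function name are made up for illustration, and a tiny DOM stub is included so the sketch runs outside a browser; in a real page you'd use the global `document` and keep a plain `<a href=...>` inside `<noscript>` as the fallback.

```javascript
// Minimal DOM stub so this sketch runs under Node; in a browser,
// `document` already exists and the stub is skipped.
const doc = typeof document !== 'undefined' ? document : {
  createElement: (tag) => ({
    tag, attrs: {}, children: [],
    setAttribute(k, v) { this.attrs[k] = v; },
    appendChild(c) { this.children.push(c); },
  }),
  createTextNode: (text) => ({ text }),
};

// Build a real link with the DOM instead of a javascript: pseudo-URL,
// so "open in new tab", copy-link, and assistive tech all work.
function makeArticleLink(href, text) {
  const a = doc.createElement('a');
  a.setAttribute('href', href); // a plain URL, not javascript:ArticlePaging(...)
  a.appendChild(doc.createTextNode(text));
  return a;
}

const link = makeArticleLink('/news/articlenews.aspx?page=2', 'Next page');
```

In the browser you would then swap the scripted pseudo-link for `link` (e.g. via `parentNode.replaceChild`), optionally attaching a click handler to the real link, so users without script still get a working URL.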

Opt-in or Opt-out (1)

houghi (78078) | more than 7 years ago | (#16168153)

Whether this is stupid to do or not should be entirely up to the copyright holder.

I shouldn't have to tell them NOT to index my site; I should have to tell them to DO index my site.
Not opt-out, opt-in, just like anything else, if possible.

In every other business, opt-in is what everyone here wants, except when it concerns Google, because then opt-out is handy for us.

Nice double standard. :-(
Go on, moderate me into oblivion, I have karma to burn.

Re:Opt-in or Opt-out (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16168461)

That's frankly ridiculous.

By choosing to post something on a publicly accessible domain without a robots.txt file, you HAVE chosen to opt in. Much in the same way that walking down the street "opts you in" to being seen by other people on the street. The structure of the internet was designed so that access to most domains was freely available. If you don't like how the internet operates, you can choose not to use it. Or, if you still want to use it, you have numerous options to prevent your content from being indexed (robots.txt) or accessed at all (a restricted-access domain, login requirements, etc.).

It's just like when you go shopping: the business retains records of your purchase without asking your permission, and it's a non-issue. In some arrangements opt-in makes sense and in others opt-out does. Provided the reasonable one is chosen, and the other is always available (which is why spam doesn't qualify), it's perfectly reasonable. Just as it isn't a double standard to use a hammer to drive a nail instead of a wrench, using the proper tool (opt-in vs. opt-out) in the proper situation is perfectly reasonable.

Claiming that all Slashdotters are vigorous opt-in believers, and that any modding you get is because we don't like our hypocrisy being pointed out, is wrong. If you get modded down, it's because your argument is faulty, poorly thought out, inflammatory, and stupid.

Regards,
-Dan


AOL mistake (0)

Anonymous Coward | more than 7 years ago | (#16168481)

Don't make the same mistake as AOL did in the 90's.

Payola (2, Interesting)

Anne Thwacks (531696) | more than 7 years ago | (#16168517)

Well, the radio stations paid "payola" to play music, didn't they?...

on second thoughts...

Perhaps the record companies and musicians' union might ask the RIAA to "cease and desist"?

Someone has lost the plot here. Must be me!

If it wasn't so funny you'd be driven to strangle (1)

popsicle67 (929681) | more than 7 years ago | (#16168569)

Royalties to index a site for a search engine. This is the same chicken-shit thinking that causes cities to pass hotel room taxes. This era we live in, where everybody has to make a buck on everybody else's back, is the main reason life is so miserable these days. I would love it if search engines shut down for a week: nobody goes anywhere on the net that they don't have the IP address for. Let's run the net like the old BBS days and see just how loud the little bitches squawk. No hits, no ad revenues, no nuthin', just a bunch of geeks gathering in well-worn little niches in cyberspace, and no commerce. We rule the internet. It is the way it is because we needed much of what it has become. If we want to stop this crap from happening, we have to pile scorn upon dickheads and embarrass them, shame them into being useful and harmless, same as we need to do to our elected officials who do not listen: embarrass and shame the bastards into quitting or submitting.

Why stop there? (2, Insightful)

Zaphod2016 (971897) | more than 7 years ago | (#16168577)

Ok, fine, content owners are entitled to royalties from Google.

And when someone writes a story about me, or my company, I deserve royalties too. Sure, you might argue that publicity is valuable, but I say that without ME and my circumstances, these content owners would have nothing to write about.

And, obviously, my dear old mum deserves royalties too. After all, without her genetic contributions, I couldn't exist, couldn't do anything newsworthy, couldn't be the basis of a content owner's story.

And let's not forget about Grandma. And great-grandma.

And of course, I am writing this comment on a MacBook, so it's only fair that Apple gets a piece of any Slashdot ad revenues generated by people reading this.

And, obviously, those interested in clicking on the slashdot ads are using Amazon's patented "click" technology, so they deserve a cut too.

...as does Jeff Bezos' mum and grandma.

No Google news for french-speaking Belgium (3, Interesting)

tflash (605545) | more than 7 years ago | (#16168619)

An organisation representing French- and German-speaking newspapers in Belgium has won a court order forcing Google to stop indexing these journals. They also forced Google to post the order on its homepage in Belgium: http://www.google.be/ [google.be] Here's the address of these enlightened people: http://www.presscopyrights.be/ [presscopyrights.be] Luckily, I speak Flemish and NEVER read these newspapers anyway. Nothing is lost, I assure you. Only a bunch of already isolated people will get even more isolated, since any news of them will fall totally into oblivion.
