
Could a Reputation System Improve Wikipedia?

kdawson posted more than 7 years ago | from the must-be-true dept.


Acidus writes, "There is an excellent article in this month's First Monday about using reputation systems to limit the effects of vandalism on public wikis like Wikipedia. It discusses the benefits and weaknesses of various algorithms to judge how 'reliable' a given piece of text or an edit is. From the article: 'I propose that it would be better to provide Wikipedia users with a visual cue that enables them to see what assertions in an article have, in fact, survived the scrutiny of a large number of people, and what assertions are relatively fresh, and may not be as reliable. This would enable Wikipedia users to take more advantage of the power of the collaborative editing process taking place without forcing that process to change.'"




FP (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16091713)

First Post!

Sullied Reputations (0, Troll)

Anonymous Coward | more than 7 years ago | (#16091776)

There are sooo many sullied reputations, perhaps your credit rating would be more informative.

Easier said than done? (3, Insightful)

gasmonso (929871) | more than 7 years ago | (#16091759)

I agree that they need to do something, but that is a fantastic challenge. Look at the major encyclopedias: they have teams of several thousand doing fact checking on a paid basis. I'm not saying people wouldn't fact check, but it's a great challenge. How would you know that people aren't just saying it's legit (or not) just for fun?

Re:Easier said than done? (4, Insightful)

truthsearch (249536) | more than 7 years ago | (#16091998)

With the method recommended in the article, the system automatically assumes that a section of text which has not been modified over a series of edits is more likely to be accurate. It's not that someone denotes a section as fact checked. But if a page has been edited many times, yet one section of it has not been modified, it assumes that unmodified section is more likely correct and colors it appropriately.
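A rough sketch of that idea (hypothetical tokenization and helper names, not the article's actual algorithm) could track, for each word in the newest revision, how many consecutive prior revisions it survived unchanged:

```python
from difflib import SequenceMatcher

def survival_counts(revisions):
    """revisions: oldest-to-newest list of token lists.
    Returns, for each token in the newest revision, how many
    consecutive earlier revisions it survived unchanged."""
    latest = revisions[-1]
    ages = [0] * len(latest)
    # pos[k]: where latest token k sits in the revision currently
    # being compared; None once the token stops matching.
    pos = list(range(len(latest)))
    current = latest
    for older in reversed(revisions[:-1]):
        sm = SequenceMatcher(a=older, b=current, autojunk=False)
        mapping = {}  # position in `current` -> position in `older`
        for tag, i1, i2, j1, j2 in sm.get_opcodes():
            if tag == "equal":
                for off in range(i2 - i1):
                    mapping[j1 + off] = i1 + off
        for k, p in enumerate(pos):
            if p is not None and p in mapping:
                ages[k] += 1          # survived one more revision back
                pos[k] = mapping[p]
            else:
                pos[k] = None         # token is newer than this revision
        current = older
    return ages

revs = [
    "the cat sat".split(),
    "the cat sat down".split(),
    "a cat sat down".split(),
]
print(survival_counts(revs))  # -> [0, 2, 2, 1]: "cat"/"sat" oldest, "a" brand new
```

The ages would then be bucketed into whatever visual cue the wiki chooses; the point is that the score comes from survival across diffs, not from anyone explicitly marking text as checked.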

Re:Easier said than done? (4, Insightful)

miyako (632510) | more than 7 years ago | (#16092324)

There are two big problems I see with this:
The first is that there are a lot of articles and sections of articles in wikipedia that are heavily edited without the facts changing much. This is mostly a good thing, cleaning up grammar, etc. But if that is used as a basis for how reliable the information is, it could be misleading, because the software won't know if the facts have changed or just their wording.
The other problem that I see with this is that it makes it easy for people who "disagree" with facts to make edits to the sections to reduce their rating without just deleting them. It makes me think of those people who say "yeah, but evolution is only a theory" to undermine it; I can see them making minor changes to the wording of things to make the facts seem less settled.
Of course, if someone was doing that, it would be impossible to say if they were doing it because they wanted to suppress facts by making them look less reliable, or if they were simply trying to contribute to the quality of an article.

Re:Easier said than done? (3, Insightful)

Anonymous Coward | more than 7 years ago | (#16092406)

For instance, something can be wrong with an article from the very beginning -- like the horsepower of a submarine being listed at four times its actual value. Then it gets repeated on every single entry for the class, and then gets repeated as fact on thousands of wiki-replicas, to the point where you can only find the truth on a few obscure sites if you search for the HP of a Gato class submarine. It doesn't matter how long the text has stood, and especially not that other sites share the same misinformation; all that matters is whether someone knows what they are talking about, or whether someone's "source" got their information from wikipedia to begin with.

Re:Easier said than done? (3, Insightful)

beheaderaswp (549877) | more than 7 years ago | (#16092020)

Well, the biggest problem they are dealing with is standards. In academia there are criteria for what constitutes a "fact" and a formal process for fact checking. On the "internets", well, anyone can submit edits and claim a factual basis for them.

So a reputation system is pretty useless because special interest groups can mobilize to skew reputation.

Want to have Intelligent Design show more favorably? Ok, get a bunch of like minded people to raise your reputation.

Heck, we even see it on Slashdot when a conservative or liberal viewpoint gets buried in moderation because people of a certain political belief gang up on the opposition.

A lot of what Wikipedia is dealing with is a direct result of the deep divisions in our society. And the fact that, unlike World Book Encyclopedia, apparently *everyone* is allowed into the research offices. In a virtual sense, of course...

Re:Easier said than done? (1)

Tweekster (949766) | more than 7 years ago | (#16092244)

Do they have teams of several thousand?

Seriously, do they?

We do know one thing: avoid Slashdot mod system. (0)

Anonymous Coward | more than 7 years ago | (#16092297)

Wikipedia should never use the Slashdot mod system. It is little better than a popularity contest that promotes the most popular ideas.

Wikipedia articles should strive for the truth, not the most-popular urban legends.

I can give you the answer without even RTFA (4, Insightful)

RLiegh (247921) | more than 7 years ago | (#16091760)

That answer is "no". We've seen numerous ratings and karma systems set up on a variety of boards and time and time again they've been defeated by people willing to take the time to game them for whatever reason.

It's typical nerd hubris to believe that you can solve social problems through technological means.
It's been proven time and time again that you can't.

Re:I can give you the answer without even RTFA (2, Insightful)

Anonymous Coward | more than 7 years ago | (#16091808)

A rating system is a social solution to a social problem.

No, it isn't. (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16092205)

The current way Wikipedia does these things-- the use of talk pages, strict documentation of edit and contribution histories, and allowing some people to have more clout based on past contributions-- is a social solution to a social problem.

A reputation system is not a social solution. It is a number. It is a technical solution.

The crucial difference here is that social systems have the ability to be smart. They can understand things like context, or changes in circumstances. If for example someone makes 400 high-quality additions to sports articles on wikipedia, then abruptly shifts gears and starts randomly seeding pages about World War 2 with outright falsehoods, a social system is potentially smart enough to realize something changed there and reject the World War 2 changes quickly based on their content, despite his edit history. But a rep system, a dumb technical solution, would be obligated to help this person along in his changes to World War 2 articles based solely on the numbers gained from his sports edits.

Re:I can give you the answer without even RTFA (5, Insightful)

NewWorldDan (899800) | more than 7 years ago | (#16091891)

Firstly, the word was "improve", not "solve". I think Wikipedia would improve substantially if it added an editorial supervision system: for example, changes would not be posted until approved by a randomly assigned editor. The random part is important. Sure, it's still possible to trash the system, but that takes a lot more effort. And then you need a rating system for editors, and so on and so forth. But Wikipedia is run on a volunteer basis. There are limits to what it can accomplish without resorting to professional oversight, which would change the very nature of the beast. Ultimately, I think we just have to accept that it is what it is.


CmdrTaco (troll) (578383) | more than 7 years ago | (#16091939)


Re:I can give you the answer without even RTFA (3, Funny)

Somatic (888514) | more than 7 years ago | (#16091908)

I was tempted to mod that down because it would have been funny. But no one but me would have got it.

Re:I can give you the answer without even RTFA (2, Interesting)

cavintage (1002078) | more than 7 years ago | (#16091938)

There are successful reputation systems out there. For instance, Credence can filter Gnutella spam. They have a cool algorithm for detecting when a group of fake users all rate the same bogus files up. Wish this sort of thing were more widely deployed. Bitzi is the dumb version of the same idea, but never worked for me.

Re:I can give you the answer without even RTFA (5, Insightful)

catbutt (469582) | more than 7 years ago | (#16091976)

Good thing the Wright brothers didn't say that, because lots of failed attempts at flying had already been made.

Slashdot's karma system is far from perfect, but at the end of the day it works. Can you game it? I don't really think so, at least not without a LOT of effort, which generally means contributing a lot of good content/ratings so that you can sneak in a very small amount of biased content or ratings.

Whether "ungameable" is possible or not I don't know, but I am quite sure that wikipedia's system could be improved upon massively.

Re:I can give you the answer without even RTFA (3, Insightful)

Anonymous Coward | more than 7 years ago | (#16092412)

Slashdot's karma system works fine for Slashdot, but wouldn't work at all for Wikipedia. Slashdot, despite how much you may like it, is pretty far from neutral. It does the same thing this reputation scheme would: represent the opinions of those most active in using it. Wikipedia would be ruined.

Re:I can give you the answer without even RTFA (3, Insightful)

AndyG314 (760442) | more than 7 years ago | (#16092045)

I think that almost any reputation system would threaten the impartial nature of wikipedia.

Re:I can give you the answer without even RTFA (2, Insightful)

Yokaze (70883) | more than 7 years ago | (#16092175)

> It's typical nerd hubris to believe that you can solve social problems through technological means.

That is not what nerds are trying; it is what society is doing: trying to solve social problems through software for wetware (laws).

In the case of computer-based communities, those laws are codified in programming languages, whereas in real life they are codified in legalese.

> It's been proven time and time again that you can't.

Yes... like flying.

I admit, my first statement seems to be more an argument against the possibility of creating a well-functioning online community, as we cannot really claim to have solved our social problems in real life.

However, one has to remember that an online community only has to solve a small subset of the problems which one has to solve in real life.

IIRC, some time ago (one or two years ago) there was a post on Slashdot about a sociology study on reward and penalty systems in communities, which claimed to have isolated some simple rules for a thriving one. I cannot remember having seen it implemented in software.

technology is the answer (2, Insightful)

klenwell (960296) | more than 7 years ago | (#16092185)

One point with Wikipedia that seems to get overlooked -- or at least taken for granted -- is the power and ingenuity of the code that runs it. Technology is part of the solution here. If nothing else, Wikimedia deserves credit for putting together a state-of-the-art wiki machine -- an open source state-of-the-art wiki machine. Some of its features are dauntingly obscure and complex but it falls back quite gracefully to allow even the newest user to function with it effectively. I'd argue Wikipedia has succeeded in large part due to the technology.

That said, there seem to be two alternate proposals here in the summary: (1) a karma system and (2) a new color-coded visual feature. I agree that #1 would be vulnerable to all sorts of gaming schemes -- which isn't to say it wouldn't help, but it'd have to be smart. #2 sounds like it would be a more unequivocal benefit.

Both would be interesting innovations and consistent with the progressive user-friendly code behind Wikimedia.

Re:technology is the answer (1)

amRadioHed (463061) | more than 7 years ago | (#16092470)

#2 sounds like it would be a more unequivocal benefit.

Maybe, but even if you managed to get it to provide meaningful insight into the reliability of the content, could you do it without turning the page into a rainbow striped eyesore?

Re:I can give you the answer without even RTFA (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16092354)

A smaller but much less gameable step would just be annotation of recent edits. Yes, you can look them up in the history if you want to now. No, this won't help with people who post believable BS. But it'd be nice if when skimming an article I could see the information that's been there forever and is likely not in dispute vs the information that got posted 3 hours ago by some anonymous jackass.

Of course, while you couldn't game this to make information look more factual (a recent edit would always show up as such) you could discredit information if you wanted by editing it slightly. Also it might lead to people not wanting to correct style/grammar/etc errors for fear of ruining the article's reputation.

Congress & Christians (0, Flamebait)

EmbeddedJanitor (597831) | more than 7 years ago | (#16092482)

You just have to look at how Christian groups etc use their powers to influence what could be a democratic process. The truth etc soon dies in favour of what is politically correct or is acceptable to people with enough passion to screw things up.

Re:I can give you the answer without even RTFA (1)

speculatrix (678524) | more than 7 years ago | (#16092526)

We've seen numerous ratings and karma systems set up on a variety of boards and time and time again they've been defeated by people willing to take the time to game them for whatever reason

surely these people who can beat the system are smart enough to make a useful contribution?
if they put all the effort into getting a good reputation, then won't they want to keep it?
rep-based systems can work quite well - eBay for instance (although I can tell that a significant number of eBay buyers don't really think about it too hard).. and here on slashpedia, er, wikidot, er, slashdot - especially if people can tag friends & enemies.

Insularity is the key (1)

MythoBeast (54294) | more than 7 years ago | (#16092531)

Any sufficiently insular group can convince themselves of any idea they choose simply by weeding out those who don't agree with them. This is a given. What you have to do is identify the obvious biases of a group (i.e. Slashdotters hate Microsoft) and ignore any opinion in that direction. You'll still get plenty of actual facts (or at least well-supported truths), but those will require supplementing from an inversely biased truth source.

What you CAN do is identify those things for which there is no natural bias in the group. That's a little harder, but not impossible. For instance, I don't believe that Slashdotters have a particular reason for supporting Democrats over Republicans, so political statements don't need to be taken with as large a grain of salt. Comments about GWB are an exception to this because Slashdotters notably value intelligence, and he's a blatant idiot.

With Wikipedia, you have a group that is very stringently non-insular. There are people of all biases, and they are encouraged to intelligently consider each other's ideas. The way they rate each other is by how well they back up what they have to say with supporting fact. They're notorious for disregarding credentials as an ad-hominem attack - only the information is important. I think that this particular scale of superiority is especially resistant to the kind of flaws that other rating systems fall prey to.

The Slashdot moderation system proves.... (2, Insightful)

no_nicks_available (463299) | more than 7 years ago | (#16091761)

this wouldn't work.

Re:The Slashdot moderation system proves.... (1)

$RANDOMLUSER (804576) | more than 7 years ago | (#16091843)

That's exactly right. We would just see the emergence of a "Wikipedia groupthink"; not unlike what we see on Slashdot, or University faculties.

Re:The Slashdot moderation system proves.... (1)

catbutt (469582) | more than 7 years ago | (#16092005)

And what is wikipedia today if not "groupthink"? As much as they like to say there is "NPOV" and that it is purely objective, I call BS.

Wikipedia's current system is "edit till the arguing stops". Ultimately, the more people sharing an opinion, the more the articles will bias that way. A good reputation system would not change this, just make it more efficient.

Re:The Slashdot moderation system proves.... (4, Insightful)

joe 155 (937621) | more than 7 years ago | (#16091924)

I must disagree; the /. system is actually working pretty well. If you say something which is needlessly offensive you will be modded flamebait; the same goes if you're trying to start a flame war with comments like "GNOME smells of cheese and suX!!11!". If you make some "GNAA!!!!!" type posts, that'll be a troll. If you say something which is completely off topic, it gets modded as such. Both of these things mean that modding becomes pretty much a true/false kind of thing, which meta-modding can confirm. It also stops positive moderation being seen as a default and makes the best comments shine...

Positive modding is a little more shaky, with "informative/interesting/insightful" all meaning pretty much the same thing in most people's minds, but that's not too much of a problem.

Group think can cause issues, but in reality there is such a wide range of modders it is often avoided (you can see some pro-MS or anti-Apple comments come through)... although the system isn't perfect I guess group think at least only makes content that most would want to see if they come here.

It is also interesting to note that most people do care about karma and do like to get modded +5, maybe the wiki system would work in a similar way - where people will care.

Hardly (1)

Cybert4 (994278) | more than 7 years ago | (#16091960)

Some of the most brilliant stuff I've ever read on the internet had a -1 Overrated on it. I think that's the god-modding by the editors--who seem to have a huge conservative streak.

Re:Hardly (2, Interesting)

catbutt (469582) | more than 7 years ago | (#16092043)

Yeah but some of the most brilliant things on the internet might have a low "page rank" on Google. Still, Google's reputation system (which is exactly what it is), does a pretty good job, even with the fact that it must infer ratings by links. Obviously, they have to work pretty hard to make it hard to game.
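For the curious, the core of that link-based reputation idea is just power iteration; a toy version (grossly simplified, nothing like Google's production ranking) fits in a few lines:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy power-iteration PageRank. `links` maps each page to the
    list of pages it links to; returns a score per page summing to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # everyone gets the base (1 - d)/n, plus shares from inbound links
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages   # dangling page: spread evenly
            share = damping * rank[p] / len(targets)
            for q in targets:
                new[q] += share
        rank = new
    return rank

# "a" is linked by both "b" and "c", so it ends up with the highest score.
scores = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

The inference the parent mentions is exactly this: no one rates anything explicitly; a page's reputation is derived from who links to it, weighted by the linkers' own reputations.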

Re:Hardly (2, Informative)

Millenniumman (924859) | more than 7 years ago | (#16092173)

The editors have a conservative streak? That is amusing. Or are you forgetting the editors' comments (in the story summaries) about the free market being a failure, Fahrenheit 9/11 being very insightful and wishing it would sway voters, and numerous other such comments? Most of /. is very far from conservative.

Re:The Slashdot moderation system proves.... (1)

AndyG314 (760442) | more than 7 years ago | (#16092117)

If people really used it like that, then it would work, but people also mod down statements that they don't like or disagree with. Go ahead and post a well-thought-out pro-GWB statement and watch it get modded to -1, or post something negative about Apple computers, etc...

The moderation system does protect from crapfloods, trolls and flame wars, but it also promotes groupthink and suppresses comments opposed to the majority.

Re:The Slashdot moderation system proves.... (2, Insightful)

AlphaWolf_HK (692722) | more than 7 years ago | (#16092166)

If you say something which is needlessly offensive you will be modded flamebait; the same goes if you're trying to start a flame war with comments like "GNOME smells of cheese and suX!!11!". If you make some "GNAA!!!!!" type posts, that'll be a troll. If you say something which is completely off topic, it gets modded as such. Both of these things mean that modding becomes pretty much a true/false kind of thing, which meta-modding can confirm. It also stops positive moderation being seen as a default and makes the best comments shine...

This is not true. I have seen posts where somebody points out a mere fact about microsoft that gives them a positive light get moderated as a troll just because somebody doesn't like microsoft. It also occurs often that somebody will post comments that have mere facts which support a particular conservative viewpoint, yet just because somebody who may be a liberal doesn't like it they downmoderate it as overrated or as flamebait.

Then the karma system basically says "don't make these kinds of comments again, or else all future comments you make will be ignored." People realize this, thus they are reluctant to speak their mind when they already know that the group (due to its demographic) is going to disagree with them. By having a karma system, you essentially introduce the "fear" element common among e.g. dictatorships. That is, fear of speaking your mind lest you offend somebody who reduces your group reputation. Another fitting way of describing group think would be "group censorship."

The meta moderation system attempts to solve this, but it falls far short for various reasons. First of all the meta moderator could agree with the moderator, even though both of them are biased. Second of all, not everybody wants to spend enough time reading each subject and then each post in order to understand the context fully, but they still want the moderator points anyways so they may just pick answers at random.

This would be terrible for an encyclopedia, since if somebody doesn't like a fact (even though it may be true) they can go along with the groupthink and censor it anyways. An encyclopedia is not, nor should it ever be, a democracy. An encyclopedia should follow the facts as they are, not the facts how people want them to be.

If you still feel that this isn't the case, then explain why it often occurs that people feel the need to post as anonymous coward when posting a view that they already know the group won't like?

Re:The Slashdot moderation system proves.... (1)

Crabbyass (867531) | more than 7 years ago | (#16092255)

If you say something which is needlessly offensive you will be modded flaimbait, the same would go if you're trying to start a flame war with comments like "GNOME smells of cheese and suX!!11!".

True, but you're forgetting the fact that the exact same thing happens if you make a rational argument that says something POSITIVE about Microsoft. You get attacked immediately. THIS is the major problem with Slashdot's moderation system: the automatic bias FOR Linux/OSS/etc and AGAINST Microsoft.

The same thing will undoubtedly happen to this post, because I'm basically saying that it's VERY annoying.

Re:The Slashdot moderation system proves.... (1)

drinkypoo (153816) | more than 7 years ago | (#16092030)

[The Slashdot moderation system proves....] this wouldn't work.

So what you're saying is that because a bad, stupid, wrong moderation system doesn't work on slashdot, that some other moderation system wouldn't work on wikipedia?

I don't think you really completed that thought before you wrote your comment.

Re:The Slashdot moderation system proves.... (1, Interesting)

Arivia (783328) | more than 7 years ago | (#16092169)

Of course it was. It was perfectly formed for the best karma payoff with the least amount of effort. That's what posting to Slashdot is all about, right?

Re:The Slashdot moderation system proves.... (1)

catbutt (469582) | more than 7 years ago | (#16092125)

Please make your case that slashdot would be better if there was no karma system...I don't buy it. I think it is poorly designed, but still makes slashdot more useful than boards of comparable size that have no such system. (actually, I don't think boards of such size without a rep system exist....they became unusable long ago because of all the trolling and spam)

Also, I think it is odd to say "because there is one case of something not working, it will never work".

rep farming (2, Insightful)

Sebastopol (189276) | more than 7 years ago | (#16091766)

of course it won't help. people will just grind for rep and then vandalize.

what we need are national ids and biometric logins.

i kid... i kid...

Re:rep farming (1)

happyemoticon (543015) | more than 7 years ago | (#16092115)

Bah. I'm already wasting several hours a week farming Cenarion Circle rep for my new axe, and now I've gotta farm wikipedia rep?

solution in search of a problem (3, Insightful)

capoccia (312092) | more than 7 years ago | (#16091769)

this is a solution in search of a problem. wikipedia's problem is not the ordinary vandalism that could yield a reasonable measure of a user's reliability. wikipedia's biggest problem is unfounded but believable information, and in that case a measure of a user's reliability would be nearly useless, because the reliability of those edits is unknown.

Re:solution in search of a problem (2)

owlnation (858981) | more than 7 years ago | (#16092270)

Yes, absolutely, I couldn't agree more.

I am progressively more and more disturbed by the wikipedophile focus on finding a solution to "vandalism". Liken it, if you will, to a right-wing politician's campaign to rid the world of "terrorism".

I would concur that mindless destruction of someone's work is annoying and should be dealt with, however, I think it is important to understand that not all acts of destruction are in fact mindless - some are legitimate protests. Much as wikipedia likes to harp on about vandalism, I don't yet see a valid definition of what constitutes vandalism.

Also, what I feel is potentially the most dangerous aspect of Wikipedia is the ability of a moderator to deliberately manipulate information to promote agendas. Wikipedia's editors are self-appointed and not accountable to anyone. For as long as wikipedia remains in the form it is in, by small changes, small spins and offering seemingly plausible facts you could potentially alter the course of history. The ability to alter the perception of history and facts is right there on a wikipedia page. I am astounded that people are not already doing this - I'm sure they must be. It is incredible power.

While the mindless vandalism is annoying, removing moderation and removing protection is much safer in the long run. Since, in this way, people will know not to take Wikipedia seriously as the single source of any fact, and also because it will be much harder for extremists to push agendas. We as a community can contribute to keeping the site tidy, rather than have a secret police force doing it for us.

Quis custodiet ipsos custodes? We actually need an answer to this question before wikipedia does anything else.

How about one for /.? (4, Funny)

ackthpt (218170) | more than 7 years ago | (#16091778)

I mean, we could all moderate/evaluate the slashdot editors on their choice of stories and keep stats, like onna baseball card.

Dupes: 23
Veiled ads as news: 18
Old news: 17
Allowed Bad Grammar: 2,980
Allowed Bad Spelling: 9,874,376

Yes (3, Funny)

neonprimetime (528653) | more than 7 years ago | (#16091780)

Could a Reputation System Improve Wikipedia?

YES - It works on /.

Re:Yes (2, Funny)

treeves (963993) | more than 7 years ago | (#16091814)

Yeah, new improved: with Wiki-karma whores!

Re:Yes (1)

johnlittledotorg (858326) | more than 7 years ago | (#16091822)

Mod the parent Funny please.

Re:Yes (4, Insightful)

drinkypoo (153816) | more than 7 years ago | (#16092065)

While the slashdot moderation system is complete crap, can you imagine what this place would be like without anything at all? Me neither.

Re:Yes (1)

P3NIS_CLEAVER (860022) | more than 7 years ago | (#16092282)

My god it would be a fucking zoo.

Re:Yes (1)

MK_CSGuy (953563) | more than 7 years ago | (#16092252)

If you think the system works on Slashdot MOD ME UP.

Wiki needs a YTMND-esque rating system (0)

Anonymous Coward | more than 7 years ago | (#16091790)

With a rating system, the vandals could downvote good information and upvote lines of "goatse goatse goatse goatse goatse". That would be heaven.

Rep doesn't lead to reliability (1)

Neophytus (642863) | more than 7 years ago | (#16091815)

Even if somebody makes hundreds of edits in good faith, there will still be a good deal of inaccuracy in some of those edits. A rep system built on trusted edits does not mean the quality will be any better. What's more, determined vandals could start trying to access accounts through phishing.

Honestly... (1)

urbanradar (1001140) | more than 7 years ago | (#16091827)

Honestly, this seems to me like the sort of solution that only makes things more complicated than the original problem, and yet still doesn't really address a major issue: some articles contain misinformation which is believed to be true by most people, so the system would just flag it as correct.

As for the problem of vandalism, that can be fought more effectively by having "stable" versions of Wiki articles which have been verified as unvandalised.

no No No NO NO!!! (1)

shoma-san (739914) | more than 7 years ago | (#16091845)

didn't you see my sig? ...

something similar to yahoo answers (1)

Brigadier (12956) | more than 7 years ago | (#16091856)

I've only edited a wiki once, and that was info on my home country. I have however been addicted to Yahoo Answers; that's what I do all day (why, I don't know). One cool feature is that you have to gain a certain rank before you are allowed to thumbs-up or thumbs-down an answer or question. I guess it's kind of a prove-your-worth sort of deal.

What about... (5, Interesting)

MarcoAtWork (28889) | more than 7 years ago | (#16091864)

The site is /.-ed, but this got me thinking: what about having an additional page view that uses color to highlight text age? Oldest text would be black, newest would be something else (red? blue?), intermediate 'ages' in intermediate shades. This would make it quite obvious which parts of the article haven't been modified in a long time.

Re:What about... (1)

goofyheadedpunk (807517) | more than 7 years ago | (#16091986)

It's an interesting idea, to be certain, but how would this be helpful?

Re:What about... (1)

MarcoAtWork (28889) | more than 7 years ago | (#16092130)

It's an interesting idea, to be certain, but how would this be helpful?

when reading a controversial article, or in general any article one wants to be reasonably sure about, one could use this view and take into account mostly only the 'black/oldest' text. As much as it's easy to figure out obvious vandalism, it's not as easy, especially if one doesn't have domain knowledge, to identify subtle changes.

This would also be helpful for articles on my watchlist when I don't look at them for a while, since edits would stand out fairly easily without having to go through the diff page.

Re:What about... (0)

Anonymous Coward | more than 7 years ago | (#16092019)

I had the same thought, except I was thinking more of a yellow to represent "hot" material.

Re:What about... (1)

Decius6i5 (650884) | more than 7 years ago | (#16092221)

That is, actually, exactly what the article suggests. You should read it. BTW, I can load the page from here...

Re:What about... (1)

MarcoAtWork (28889) | more than 7 years ago | (#16092413)

it finally worked from here, although I really don't agree with his color scheme

In my implementation I chose four colors to represent text of varying degrees of maturity. Most high-tech cultures on earth employ automobiles and have fairly consistent standards for traffic light coloring, so I employed these colors to indicate the age of text. The newest text in an article is colored red, indicating that users should employ caution in relying on it. Slightly older text is colored yellow. Text that is nearing maturity is colored green. Mature text is colored black.

this will make pages really difficult to read, because a) yellow on white, and sometimes green on white, are hard to see, and b) quite a few people have trouble distinguishing red from green.

There is also another issue, which is vandalizing of wiki links, where it would be trivial to leave the original link name (highlighted by wikipedia in a different color) and change its destination to vandal-friendly sites.

I think I would like to amend my original thought by having the color change be in the =background= of the text: a single color at different saturation levels (from, say, pure medium slate blue, through light slate blue, to white). That should be distinguishable by everybody and keep the article reasonably easy to read, while still presenting the age information and without interfering with link/visited color choices.

Still, I wish I had been able to read the article before my original post, since I thought it discussed only reputation.
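The background-saturation idea could be sketched roughly like this (a hypothetical illustration: the hue, the hex values, and the 20-edit maturity window are all my assumptions, not anything from the thread):

```python
def age_to_background(age_in_edits, mature_after=20):
    """Map text age (edits survived) to a background colour.

    Newest text gets a medium slate blue background; as text survives
    more edits, the background fades linearly toward white.
    """
    # Clamp the maturity fraction to [0, 1].
    t = min(age_in_edits, mature_after) / mature_after
    base = (123, 104, 238)   # medium slate blue: newest text
    white = (255, 255, 255)  # fully mature text
    r, g, b = (round(c + t * (w - c)) for c, w in zip(base, white))
    return f"#{r:02x}{g:02x}{b:02x}"
```

Unlike red-versus-green coding, a single hue at varying lightness stays legible for colour-blind readers, which is the point of the amendment.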

Not a complete solution (4, Insightful)

br00tus (528477) | more than 7 years ago | (#16091897)

On Wikipedia, pages relating to quantum mechanics are sometimes vandalized, but 99% of people are on the same side in terms of keeping it accurate. So there are various ways this can be improved.

On the other hand, the pages regarding the fight between Hamas and IDF are as much a battleground as is the area around the Israeli/Lebanese border. I have been involved in Wikipedia for years and have just seen things deteriorate around these types of flame-wars. Wikipedia's leadership is not dealing with it well. Imagine Slashdot setting up a wiki where we had to determine which was better - Debian or Gentoo (or Ubuntu etc.), BSD or Linux, vi or emacs etc.

We are technical people, and there's the old thing about when you have a hammer everything looks like a nail. But I don't think a technical solution will help much in regards to this. I'm not even sure you really can have a neutral view about wars in the Middle East. And even if you could, Wikipedia's "cabal" is nowhere near able to deal with it, and I doubt they ever will be. Personally, I think most of the people in high positions at Wikipedia are jerks, all the flamewars and such seem to have driven most of the nice people off.

Things like this convince me that what will ultimately happen is that alternatives to Wikipedia will pop up. Wikipedia is a new phenomenon, and it made sense for everyone to edit the same wiki, but why should that remain so? Why should pro-Hamas and pro-Israel people edit and battle on the same wiki? It makes little sense, and I'm sure in time, just as IRC went from one network to EFnet and Anet, and then split even more, we'll see splits in Wikipedia. In the old days, the Encyclopaedia Britannica had one view of history and the Great Soviet Encyclopedia had another; why should the future be any different?

Re:Not a complete solution (5, Funny)

Kesch (943326) | more than 7 years ago | (#16092025)

On Wikipedia, pages relating to quantum mechanics are sometimes vandalized.

Actually, every article on quantum mechanics exists in a state between vandalized and not vandalized. By viewing it you collapse the waveform and change its value. Now, there is a good probability that it will turn out unvandalized, but as you have stated, it occasionally collapses into a vandalized article. After you leave the page, Wikipedia runs complex calculations in its improbability engine and sends the article back into a quantum state.

P.S. This is presented as per my understanding of quantum mechanics, which I learned entirely from Wikipedia. It may be wrong, however, as my viewing might have caused it to appear in a vandalized state.

Imagine Slashdot setting up a wiki where we had to determine which was better - Debian or Gentoo (or Ubuntu etc.), BSD or Linux, vi or emacs

Debian, Linux, emacs. That wasn't so hard. (Anyone who disagrees is a terrorist.)

Re:Not a complete solution (1)

FhnuZoag (875558) | more than 7 years ago | (#16092086)

Simple, really.

Because splitting from Wikipedia to create the Palestinapedia would be an admission of defeat.

Complete solution not needed (1)

dwheeler (321049) | more than 7 years ago | (#16092260)

A "perfect" solution isn't really needed. Indeed, given Wikipedia's stellar success, you could argue that the current situation is already good enough. People already use Wikipedia, and make improvements to it... and since it has many readers and writers, it's a success by any measure.

But I think that discussing ways to improve Wikipedia is very valuable; only by proposing ideas and trying them out can things get better.

This is not a user reputation scheme; it simply colors text based on how many edits the text has survived unchanged. If the text is part of an edit war, then it'll stay "recent" (e.g., red in his scheme). As it survives more and more edits, it will become different colors until finally it's black.

Actually, I like this idea. There are possible refinements too (maybe text that has survived many reads, by many different people, should SLIGHTLY increase in rank). Maybe it'll work, maybe it won't. But it seems worth trying out.

Re:Not a complete solution (2, Interesting)

Anonymous Coward | more than 7 years ago | (#16092512)

I'm not even sure you really can have a neutral view about wars in the Middle East. ... Why should pro-Hamas and pro-Israel people edit and battle on the same wiki?

Because the harder you try, the closer you come to success.

In contrast to most articles on the Israeli-Palestinian situation where the writer is trying to advance a particular viewpoint, I've actually found the Wikipedia articles to be quite good. You actually have a situation where people on both sides of the issue are coming together to try to figure out what the facts are. In my experience, it's the extremists committed to propagating factual inaccuracies that eventually get driven away from wikipedia because, unlike when they preach to their choirs, the audience at wikipedia actually cares whether the articles are factually accurate.

On the other hand, I have found some of the high-traffic articles to be overly neutralized. For example, I was interested in the criminal justice system in Singapore, but when I went to the main article on Singapore, all it contained was bland touristy propaganda. One of the main things Singapore is known for in the world is harsh punishment of minor crimes, but there was only one sentence about it, buried in an unrelated paragraph way down the page.

This is bullshit... (1)

GreatBunzinni (642500) | more than 7 years ago | (#16091913)

...and I'm not even speaking about the validity and effectiveness of a karma point system. I mean, a visual cue to tell people what content to believe or not? What happened to reasoning, critical thinking, and the scientific process? Do we need to think for ourselves, or rely on someone's visually appealing color code to know what to trust?

Re:This is bullshit... (2, Interesting)

KillerBob (217953) | more than 7 years ago | (#16092415)

...and I'm not even speaking about the validity and effectiveness of a karma point system. I mean, a visual cue to tell people what content to believe or not? What happened to reasoning, critical thinking, and the scientific process? Do we need to think for ourselves, or rely on someone's visually appealing color code to know what to trust?

Sometimes it's obvious when a factual error has been made. Say, for example, somebody changes the article on Monarch butterflies to claim that they feed on the blood of human babies. Anybody with an ounce of reason can see that it's a fallacious claim. But what if it's a minor factual error that can easily slip past your notice? Or worse, an outright lie that seems more reasonable than the truth? Say, for example, somebody claims that the volume of a mole of helium is 23.6 L. Outside of people with an actual background and education in chemistry, nobody's going to notice that error. Critical thought or no, that's something you just have to know in order to see as wrong. Those with a background in chemistry know that it's actually 22.4 L. Those without have no clue.

And the problem arises when people then use that number as fact, without bothering to do research outside of Wikipedia. A *lot* of research papers, particularly in High School and Undergraduate studies, skimp on research. I know people who have never been to a library to research, and do all of it online. These are the people getting burned. Arguments over the morality or wisdom in doing all of your research in that manner aside, Wiki is fast becoming the prime source of research data. And that's why the people at Wiki are trying to find a way to improve their credibility.

Digg + Wikipedia = ? (1)

JediLuke (57867) | more than 7 years ago | (#16091915)


Reputation a General Term (1)

BlackGriffen (521856) | more than 7 years ago | (#16091918)

Any reputation system would need to take into account topic area somehow. Otherwise you could get someone who is extremely competent in one area making an @ss of themselves in another article.

Consider Einstein's quote, "Marriage is nothing more than an attempt to make something lasting out of an incident." Obviously Einstein was a less than stellar social psychologist.

When does the bound version come out? (1)

antialias02 (997199) | more than 7 years ago | (#16091928)

Promoting a system of elitism turns Wikipedia into just another encyclopedia, albeit one where the kids can scribble entries of their own in the back. In Wikipedia, you will never eliminate vandalism; you will simply raise the amount of determination required to perform it. While obvious vandalism and blatant lies may be siphoned out, I can see this new system accepting a lot more "unnoticeable errors" that never get purged in the long run, because so many people just marked them as "okay." It might work in theory, but at the core of Wikipedia is the notion that privileges can and will be abused. To remove or substantially restrict these very privileges is to make Wikipedia less "wiki."

Wikipedia would benefit from tagging good versions (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16091929)

Rather than a rating system, Wikipedia would benefit most from random users being able to tag old revisions as "accurate" or "inaccurate".

That way, if end users want the latest breaking news on an article, they could see the latest rev at the risk that someone vandalized it; or if they wanted a less up-to-date but true version, they could look for the last "accurate" tag.

These accurate tags could play much the same role that Debian Stable does compared to Unstable.
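A hypothetical sketch of that Stable/Unstable tagging model (the data layout and function name are invented for illustration):

```python
def last_accurate(revisions):
    """Return the newest revision tagged accurate, or None.

    revisions is a list of (timestamp, text, tag) tuples, oldest first,
    where tag is "accurate", "inaccurate", or None (untagged): the wiki
    equivalent of preferring Debian Stable over Unstable.
    """
    # Walk backwards so the most recent "accurate" tag wins.
    for timestamp, text, tag in reversed(revisions):
        if tag == "accurate":
            return (timestamp, text)
    return None
```

A reader wanting breaking news would take `revisions[-1]` regardless of tag; a cautious reader would take `last_accurate(revisions)` and accept the staleness.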

Re:Wikipedia would benefit from tagging good versi (0)

Anonymous Coward | more than 7 years ago | (#16092233)

A more refined version of that is currently being tested on the German WP. The outcome of that testbed will dictate whether it should be trialled (and then possibly implemented) on the English language version.

Basically the system proposed is effectively the "stable"/"unstable" variant you describe. The user browsing by casually will see the stable version, a version which has been checked and is free from "I waz ere" junk. A "live" version will also exist, where folks are working on things, and will operate much as the entire current system does. The points needing to be ironed out are how the stable versions are agreed upon, and how "updates" are made to them. Votes? Just an admin? Consensus, or unilaterally? The fear is that if this system is effectively "controlled" by the admins, then they will effectively control all content that is (on first glance) publicly viewable; the argument runs that that is not what admins are there for. They are just normal editors with the odd extra ability to delete, block, and housekeep (stuff that every user couldn't have or there would be pandemonium); the current admin setup in no way has enough rigidity to suddenly make them the guardians of everything that is (immediately) seen.

TBH I think it will come; it has to. WP has outgrown its "all come into the shed and join us" model. That is not to say that good new contributors are not joining every day; they are. It is just that WP is now at a point where its own success has led to every half-wit, POV pusher, and vandal with an internet connection vandalising it, either by just scrawling swear words everywhere or by making sneaky changes that push their agendas.

I expect when it is implemented it will just be rolled out to the "big" articles, articles that ATM are pretty much permanently semi-protected (meaning anons and new accounts can't edit them), such as G. Bush, Islam, etc. Perhaps in time it will move out to all articles, but the danger there is that someone who wants to add a lot of good content to a two-line stub article on a town in Idaho will be forced to jump through 10 hoops of red tape before they can get their content moved from the live version to the stable one. Ultimately Jimbo will be the driving force, and I think the high publicity WP now receives (and the amount of it that is reflected back onto him) means that WP 1.0 (as it is called), with live/stable versions, will be the ultimate result.

Vandalism is not Wikipedia's main problem (5, Insightful)

gnetwerker (526997) | more than 7 years ago | (#16091932)

In my opinion, vandalism is not the primary Wikipedia problem. Yes, it is embarrassing, but it is ultimately only a secondary symptom of the central problem: when you have an "encyclopedia that anyone can edit", anyone does edit it. The clear observation (I can't remember who said this first) is that twenty teenage idiots do not collaboratively make an expert. The perhaps more important corollary is that twenty teenage idiots plus one expert are indistinguishable from twenty-one idiots.

Larry Sanger has acutely commented on Wikipedia's anti-elitism and the way it has run experts off the system. Experts don't have the time or energy to debate fundamental points of well-understood scholarship with game-playing trolls. Further, even when they aren't teenagers, Wikipedia has become the home of everyone who wants history and scholarship to read the way they like rather than representing some academic consensus. As a result we have politicians trying to rewrite their personal biographies (or those of their opponents), partisans on each side of the world's conflicts burnishing their allies and undermining their opponents (Israel/Palestine, Turkey/Armenia, US/everyone else), and devotees of everything from Microsoft Vista to Nintendo to PETA skillfully expunging objective truth from their deifications of the chosen object of worship.

So doling out karma to 100,000 teenage idiots is not going to solve Wikipedia's problem. In order to save Wikipedia, we need to destroy it -- it needs to be edited by more experts and fewer "normal people".

Re:Vandalism is not Wikipedia's main problem (1)

nlmille1 (940351) | more than 7 years ago | (#16092207)

Reminds me of my favorite de-motivational poster on [], entitled "Consensus":
"None of us is as dumb as all of us."

Re:Vandalism is not Wikipedia's main problem (0)

AnyThingButWindows (939158) | more than 7 years ago | (#16092211)

/ In my opinion, vandalism is not the primary Wikipedia problem. Yes, it is embarrassing, but ultimately only a secondary symptom of the central problem: when you have an "encyclopedia that anyone can edit", anyone does edit it. /

If everyone is allowed to edit it, then who is a vandal? How do you DEFINE a vandal by this standard? If you want to call someone a vandal, would it not be the one that created the wiki, or anyone that edits it? My first impression of Wikipedia was that nobody is allowed to edit or make changes, especially if those changes link to or involve opposing views backed by strong factual information from reputable sources... pretty much censorship was my first impression.

Re:Vandalism is not Wikipedia's main problem (1)

MK_CSGuy (953563) | more than 7 years ago | (#16092310)

If a karma system is implemented, perhaps Wikipedia (as a foundation) could give volunteer/contracted experts a very high starting karma (or even a ridiculously high karma, so that no one but the most dedicated non-expert could reach it) so their opinion would count (possibly much) more.

One idea from the article that was worthwhile (2, Informative)

AndyG314 (760442) | more than 7 years ago | (#16091936)

First off, most of this article was a bad idea. One thing that did seem like a good idea was to somehow (perhaps by marking in red or some other visual clue) distinguish the parts of an article that are new from the parts that have existed for a while. This would help in several ways:
1) People looking for reliable information would know that these parts of an article have not been exposed to long-term scrutiny, and therefore may be inaccurate.
2) The new, and therefore unverified, parts would be more obvious, which would help focus accuracy checking on new material.

It would seem logical for "new" text to remain new until it had been viewed a certain number of times, allowing enough sets of eyes to read it, rather than for a set time limit, since some articles are not viewed very often, which allows them to remain inaccurate for a long time.

Wikipedia worked fine! (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16091949)

All these proposals to enhance (read: limit) Wikipedia always miss the point. There is a huge amount of evidence, TWO MILLION PLUS ARTICLES, proving that the basic wiki model really works! Any change that tries to further limit the openness of Wikipedia needs to take that into account.

The reason proposals that limit Wikipedia seem so attractive is that only the negative side makes the headlines. It is similar to how most people believe the crime rate is going up, while the statistics show it is decreasing in most places. The media and its sensationalism are to blame. Instead of carefully measuring vandalism rates and the average time it takes for vandalism to be reverted, we have guys like John Seigenthaler publishing an editorial in the Washington Post whining about how Wikipedia contained libel about him for many months. Of course that is very bad for him, but decisions on how Wikipedia should work shouldn't be made solely because of such exceptional screwups. In general, Wikipedia articles are factual and do not contain libel.

It is very unfortunate that Jimmy Wales (founder of Wikipedia) has bought into the journalists' sensationalist thinking and is now in the process of implementing more and more protective measures which will make the Wikipedia process more like a normal, boring editorial system. Nupedia's fiasco seems to have been forgotten...

And for evidence of how worthless reputation systems are, and how much they raise the barrier to entry, check the modding score of this fine comment.

Re:Wikipedia worked fine! (0)

Anonymous Coward | more than 7 years ago | (#16092127)

All these proposals to enhance (read: limit) Wikipedia always miss the point.
I think you missed the point of the article. The article provides a way for users to get better information about the reliability of information in Wikipedia without limiting or changing the process through which it is edited.

Wikipedia hubris (1)

kindbud (90044) | more than 7 years ago | (#16091975)

They need to stop worrying about being authoritative or credible. Wikipedia is useful for discovering links and keywords to use in a subsequent search for authoritative material. It's a place to start, not a place to finish. The more people that give a shit about a topic on Wikipedia - whichever side of a controversy they are on - the more useful the content posted to that topic becomes for the purpose of getting more research leads.

The topics on Wiki that are least useful are the obscure, non-controversial topics dealing with facts, where Wiki contributors would just be copy/pasting stuff from somewhere else. That's boring, so few people do it, so topics like that are usually absent or just a stub on Wiki.

Re:Wikipedia hubris (1)

sanso999 (997008) | more than 7 years ago | (#16092278)

Exactly! "Reader beware" is all the caveat they need. Wikipedia is a good basic resource, and a great way for teachers to catch lazy kids. Online info tends to send the reader hither and yon with links (which is why I usually try to stay away from here unless I have lots of time). People need to recognize plain stupidity, but also see ideas from more than one side.

Re:Wikipedia hubris (1)

value_added (719364) | more than 7 years ago | (#16092391)

They need to stop worrying about being authoritative or credible. Wikipedia is useful for discovering links and keywords to use in a subsequent search for authoritative material.

Sounds fair, but there's the tricky part of obtaining the cooperation of everyone who cites a Wikipedia article as an authoritative source. Judging from posts on Slashdot, for example, I'd suggest the "it's on the internet, therefore it's true" approach is, if not alive and well, valid enough to garner nods of approval and mod points, which in turn lend more credibility to something that can often be characterised as somewhere between inaccurate and vaguely wrong, or lacking enough context to be misleading.

Human nature being what it is, I find myself in agreement with those who view this as a social problem rather than a technical one. That said, I'll continue reading Wikipedia as a primer on certain subjects, but defer to the pipe-smoking, cardigan-wearing old guys who publish their work elsewhere.

recognition for good edits (1)

j1m+5n0w (749199) | more than 7 years ago | (#16092003)

The link's web server does not seem to be responding, so I have no idea what they are proposing. I think a reputation system somewhat similar to slashdot's might be very useful if correctly implemented - users could establish a reputation for themselves over time, and edits by users with a bad reputation (or no reputation) might receive more scrutiny.

This might help with an unrelated problem: giving recognition to people who make good edits. I suspect a lot of people wouldn't post as often (or word their posts as carefully) here on Slashdot if they weren't trying to accrue good karma. Getting good karma is in one sense a game; it gives users a goal to achieve. But it's also a way of telling users they're appreciated by the community. I know I get a warm fuzzy feeling whenever my posts are modded up. I think Wikipedia could benefit greatly from a similar system of community recognition for good contributors.

The danger is that a reputation system might encourage groupthink, but I'm not really sure that happens much here on slashdot.

Here's my suggestion (1)

aplusjimages (939458) | more than 7 years ago | (#16092006)

When you create an account, you get the opportunity to create one article. Then the moderators get to vote on the article (because, let's face it, those editors read everything posted). If you get enough votes, you can post another article. If you don't, you have to fix up the article until it gets enough votes to let you write a new one. The votes also go into a running total, so after a month or so of submitting one article at a time, once you accumulate 100 votes, you level up to someone who can submit 2 more articles, and those articles need fewer votes to unlock 2 more article entries. So those who contribute a lot will have the opportunity to do so, and those who want to goof around can only do it on one page. Once you reach a certain level, you'll be able to go in and correct articles or dispute articles you find incorrect.

TFA (1)

master0ne (655374) | more than 7 years ago | (#16092024)

OK, for all of you who couldn't or haven't RTFA, basically the idea is this (I believe I saw someone wondering about it somewhat above me): determine the credibility of text based on its age, not on user status. As you don't need an account to edit a wiki, every user has the same trust status: untrusted. As such, every edit a user makes will appear in red. Over the course of time, say 20 edits past that edit, or ideally 200 pageviews or some such timekeeping measure, the original edit gains credibility as no one has corrected it, and will eventually fade to yellow, then green, and finally black for fully mature content. As such, vandals can keep vandalizing, but it's obvious the text has been recently edited and hasn't had much peer review...
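The age-based fading could be sketched as follows; the 20-edit and 200-pageview thresholds come from the comment above, while the intermediate cutoffs are invented for illustration:

```python
def maturity_color(edits_survived, pageviews_since):
    """Classify a text fragment by how long it has gone uncorrected.

    Follows TFA's traffic-light scheme: red for fresh (untrusted)
    text, fading through yellow and green to black for mature text.
    Cutoff values here are illustrative only.
    """
    # Use whichever measure of scrutiny has progressed further.
    score = max(edits_survived / 20, pageviews_since / 200)
    if score >= 1.0:
        return "black"   # mature: survived sustained scrutiny
    if score >= 0.66:
        return "green"   # nearing maturity
    if score >= 0.33:
        return "yellow"  # some review, still unproven
    return "red"         # fresh edit: treat with caution
```

Note that vandalizing an old passage resets its counters, so the vandalism itself shows up red, which is the whole point of the scheme.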

I dunno... (4, Informative)

earthbound kid (859282) | more than 7 years ago | (#16092028)

According to some preliminary [] research [] by Aaron Swartz about who writes Wikipedia, while it's true that most of the editing is done by regulars of the sort who would have karma, most of the original content is added by people with few other contributions to Wikipedia. The regulars just go back and put everything into wiki format, add tags, make things follow style guides, etc. Since the real work is done by anonymous people who may never come back to the site, it's important to keep the process as open as possible for people who are still new to Wikipedia.

views (1)

nostriluu (138310) | more than 7 years ago | (#16092034)

I don't think a single reputation system is a particularly good idea, but I think having site members' points of view, which are either edits or endorsements, and the ability for people to choose other views, would work well. Since Wikipedia content is all GFDL, other sites could present their own points of view, this content could be exchanged, and no content would need to be censored.

Browsing a site might consist of choosing which views you want to see by default, and accessing other versions/sites if you aren't happy with what you see.

Not sure about this-'un. (1)

Chas (5144) | more than 7 years ago | (#16092080)

Yes, a credibility system might improve the situation. Readers could know that certain entries would be from people with a history as a high-quality contributor.

Then again, it may not. Karma-whoring and alias-building could hurt badly. Also, exactly HOW are you going to indicate which chunks of text came from whom? And what kind of resources are going to be necessary to track this over multiple series of edits without a reader going into the version tracker and conducting a line-by-line comparison?

Visual Cues (1)

teslar (706653) | more than 7 years ago | (#16092087)

I propose that it would be better to provide Wikipedia users with a visual cue that enables them to see what assertions in an article have, in fact, survived the scrutiny of a large number of people, and what assertions are relatively fresh, and may not be as reliable.

And I know just the way to implement this! New text starts with font colour #FFFFFF. At every edit, if the text survives, the colour value is decremented by 1. When it hits #000000 it is forever barred from being edited ever again.

It's useful: new text is only really visible to the very enthusiastic editor, and easy for everyone else to ignore if they so choose.

It's healthy: all those pretty colours will soon make you forget your gloomy daily life, cheer you up, and will have you stare at them and consequently absorb new information for hours on end.

Plus, many years from now, a wise man will philosophise that no colour patterns of any two wiki pages have ever been alike. Isn't that deep?
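For what it's worth, decrementing the packed 24-bit value by one per surviving edit means only the blue channel moves at first, which is where the "pretty colours" would come from. A rough sketch of the joke, played straight:

```python
def decrement_colour(hex_colour):
    """Darken a colour by one 24-bit step, bottoming out at #000000."""
    value = int(hex_colour.lstrip("#"), 16)
    return f"#{max(value - 1, 0):06x}"
```

At one step per edit, text would need 16,777,215 surviving edits to reach immutable black, which is presumably part of the joke.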

Tyranny (0, Troll)

Archangel Michael (180766) | more than 7 years ago | (#16092107)

We are dealing with man's nature towards tyranny.

Should the "majority" hold the power, it becomes Tyranny of the majority.

When a "minority" can muck it up for the rest of us, it is Tyranny of the minority.

The US' founding fathers understood this, and created a system that in theory should have prevented both (but hasn't been realized because we no longer use that system in its original form).

Once you realize the problem, only then can one begin to work towards a solution. Since there is no real "solution" to tyranny, the only solution is to stop seeking a solution and make everyone responsible, equally, for themselves, and ONLY themselves.

I am not opposed to dealing with those that muck up the system, be they majority or minority stakeholders; it just needs to be done. A system that recognizes and emphasizes differences of opinion and the innate value of facts could solve the problem.

Think of Red, Blue, Green, Yellow, Orange, and Purple labels for those pieces of disputed information / opinion. This way we could have ALL the information, and it wouldn't be subject to political bias (since all sides would be represented).

Most of the flame wars on Wiki are not over facts, but the value of, and which facts ought to be included. One man's "LIE" is another man's "fact".

Re:Tyranny -- or is it (1)

Gnostic Ronin (980129) | more than 7 years ago | (#16092503)

I think you're missing a fundamental point here. The site purports to give information on various topics. The majority, in the case of Wikipedia, aren't experts on many topics. Yes, they know all the trivia on Geek(tm) TV and movies (Dr. Who, Babylon 5, DS9), but when it comes to giving information about other things, I'd rather have the minority. The minority may be editing the page on politics because they know what the hell they're talking about.

All the rainbow-colored text in the world can't change basic facts. Either Barack Obama was born in Kenya or he wasn't. Either Kerry earned three Purple Hearts or he didn't. I don't think we should turn Wikipedia into a system in which one can have conflicting "facts" side by side.

I can't imagine a system like that would do much good.

Barack Obama was born in [1971/1973/1966] in [Nigeria/Kenya/Zimbabwe]. He was elected to the US Senate in [2001/2003/2006].

We may be entitled to our opinions, but we aren't entitled to our own facts.

So now we rate people too... Bad move. (5, Insightful)

kinglink (195330) | more than 7 years ago | (#16092114)

I'm not going to get into the politicking and all. The simple fact is that the only response you'll get back from this is how many reverts have been done when you post, and those aren't always your fault.

The best part of Wikipedia is a fast and easy way to edit information: no hassles, no extra effort required. You get out what you put in, and that's it. If you want to put in the work to be a vandal, you're a vandal, but in the end you already know what you're doing. If you type in a good sentence and someone replaces it with a better paragraph, that's fine.

But instead of working on the core of the experience, now we are going to spend time rating each other's facts, rating each other. Basically just killing time. The simple fact is we don't need it; this system is in place in a lot of other places, and in effect it basically weeds out the bad apples at the inconvenience of all the good users. "You'll have to make 5 discussion posts before you can edit an article." "You have to edit three more articles before you can add an article." This stuff doesn't help or appeal to anyone but "karma whore" types.

If I write a well-written page about the new player on the Red Sox, I should be able to go to a page that links to it, create that page, set up my links, and go. I should be able to do this on my first day as easily as in my fifth year. Adding in safety blocks and guards will only hurt Wikipedia's overall goals, not help the ideas it promotes. The best thing to do is start handing out serious penalties for vandalism or obvious weasel words.

This doesn't even get into the ability to make fast edits without logging in, something that's helpful at times.

Haha.. (0)

nexarias (944986) | more than 7 years ago | (#16092342)

If it eventually ends up that people with more reputation gain an edit advantage over people with less or none, then Wikipedia really becomes a simulation of real life, with its "establishment" and its rebels.

Web of credibility (2, Interesting)

jemenake (595948) | more than 7 years ago | (#16092364)

About 10 years ago, when I learned of PGP's "Web of Trust" system (where I could choose to trust everybody that you trust), I turned to a colleague and said "What we need next is a 'Web of Credibility'..." where individuals in the community can bestow credibility points on others... and the points I can bestow on you would depend on how credible the community thinks *I* am, and so on. In other words, if Noam Chomsky or Lester Thurow vouches that you're highly credible, that would boost your credibility more than a few glowing scores from your cable guy and the kid at the local sip-n-go. Ultimately, it would be a measure of how likely (or unlikely) you were to spout off on something you had no clue about.

Having not yet RTFA, I'd just like to say that I agree wholeheartedly with the general notion... and I look forward to the day when our credibilities are incorporated into our digital signatures (which I hope we're also all using someday). - Joe
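For what it's worth, the recursive part ("the points I can bestow on you depend on how credible the community thinks *I* am") can be sketched as a fixed-point iteration, much like PageRank. Everything below (the names, the damping factor, the endorsement graph) is a made-up illustration, not a real protocol:

```python
# Sketch of a "web of credibility": each vouch is weighted by the
# endorser's own credibility, iterated until the scores settle.

def credibility(endorsements, iterations=20, damping=0.85):
    """endorsements: dict mapping endorser -> list of people they vouch for."""
    people = set(endorsements)
    for vouched in endorsements.values():
        people.update(vouched)
    score = {p: 1.0 for p in people}  # everyone starts equal
    for _ in range(iterations):
        new = {p: 1.0 - damping for p in people}  # small baseline for all
        for endorser, vouched in endorsements.items():
            if not vouched:
                continue
            # a credible endorser's vouch counts for more, split among vouchees
            share = damping * score[endorser] / len(vouched)
            for p in vouched:
                new[p] += share
        score = new
    return score

web = {
    "chomsky": ["alice"],   # a well-regarded figure vouches for alice
    "cable_guy": ["bob"],   # a lone low-weight vouch for bob
    "alice": ["chomsky"],
    "bob": [],
}
scores = credibility(web)
```

With this toy graph, alice ends up well above bob, because her endorser is himself endorsed while bob's is not.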

Slashdot has it down (1)

theg (123457) | more than 7 years ago | (#16092365)

Moderators and meta-moderators: add the concepts of eBay and Digg and you've got what you need for Wikipedia. Allow changes to be made immediately. If people agree with what you add, they say so; by doing so, you are given 'potential points' to moderate with in the future, and when a 'trusted' meta-moderator agrees with the moderation, the points are actually granted. These points let someone gain notoriety and use their points to flag someone else's additions as incorrect, unfair, etc. Without rating people positively, and then being meta-moderated to create a check and balance, you don't have points to tear people's posts down.

When enough 'trusted' people use their points to 'undo' a modification, it actually becomes undone and is only viewable in the history. If someone has several edits taken down by different, unrelated moderators (by IP/e-mail/etc.), then their changes (or changes from their IP) are no longer immediately added to the system; a moderator would review said changes first and approve them.
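The undo-threshold and pre-moderation rules described above can be sketched in a few lines. The thresholds and names here are invented for illustration; a real system would need the trust and meta-moderation machinery too:

```python
# Toy version of the scheme: edits go live immediately, enough 'undo'
# points from trusted users reverts them (to history, not deletion),
# and repeat offenders get routed to pre-moderation.

UNDO_THRESHOLD = 3   # trusted points needed to undo an edit
STRIKE_LIMIT = 2     # distinct undone edits before pre-moderation

class Edit:
    def __init__(self, author):
        self.author = author
        self.live = True
        self.undo_points = 0

class Wiki:
    def __init__(self):
        self.strikes = {}  # author -> number of their edits undone

    def submit(self, author):
        # authors past the strike limit are held for moderator review
        if self.strikes.get(author, 0) >= STRIKE_LIMIT:
            return None
        return Edit(author)  # goes live immediately

    def vote_undo(self, edit, voter_points):
        edit.undo_points += voter_points
        if edit.live and edit.undo_points >= UNDO_THRESHOLD:
            edit.live = False  # moved to history, not deleted
            self.strikes[edit.author] = self.strikes.get(edit.author, 0) + 1

wiki = Wiki()
edit = wiki.submit("vandal")
wiki.vote_undo(edit, 2)        # not enough points yet, edit stays live
wiki.vote_undo(edit, 1)        # crosses the threshold, edit is undone
second = wiki.submit("vandal") # one strike: still allowed to edit
wiki.vote_undo(second, 3)      # second strike
third = wiki.submit("vandal")  # now held for review (returns None)
```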

The community would police itself. :) I might have missed some details, but I have thought for a long time about how Wikipedia could implement this, because there are three types of Wikipedia users:

1) Content getters (read-only types)
2) Content posters (the occasional update/edit/addition)
3) Wikipedia nuts (massive amounts of work and time spent on Wikipedia)

The vast majority are #1s; I happen to be a #2, as many of you likely are, and #3s are just #2s with too much time. This really focuses on helping #1s (and the rest of us) get better information from Wikipedia by having the #2s and #3s get more involved (and encouraging more #1s to become #2s/#3s).

Digg! (1)

Temeriki (997169) | more than 7 years ago | (#16092390)

Considering what Digg is now going through, I think a rating system based on reputation has to be given some serious thought, especially on Wikipedia, where someone's personal opinions have a great effect on political articles (many other things could be affected too, but I'm having a serious brain fart). Piss one guy off and he could cause a whole mess of damage.

Mix this with other ideas... (1)

jouvart (915737) | more than 7 years ago | (#16092485)

Another idea for improving the quality of collaborative resources has been to have experts verify an article before publication. Why not take this idea and mix it with user reputation? Have experts in specific fields rank users' contributions to Wikipedia, and then have karma/reputation rise for those users. There could even be different reputation scales for different areas of expertise, so you might have users who are marked as making good contributions to physics articles but are untrustworthy on political articles. This might prevent some of the mob-think moderation of other "democratic" ranking systems (e.g., Digg).
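A minimal sketch of such per-field reputation, keyed by (user, field) so trust earned in physics doesn't transfer to politics. All field names, weights, and thresholds below are invented for illustration:

```python
# Per-field reputation: expert endorsements raise a user's score only
# in that field; a user is 'trusted' in a field once they pass a bar.

from collections import defaultdict

class TopicReputation:
    def __init__(self):
        self.rep = defaultdict(float)  # (user, field) -> score

    def endorse(self, expert_weight, user, field):
        # heavier-weight experts move the score more
        self.rep[(user, field)] += expert_weight

    def trusted(self, user, field, threshold=5.0):
        return self.rep[(user, field)] >= threshold

r = TopicReputation()
r.endorse(6.0, "alice", "physics")   # a physics expert vouches for alice
r.endorse(1.0, "alice", "politics")  # only a weak signal elsewhere
```

Here alice would clear the bar for physics edits but not political ones.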

PageRank, TrustRank and DMOZ (1)

mcguyver (589810) | more than 7 years ago | (#16092500)

Search engines use a variation of the idea that trusted sources lead to accurate results (PageRank & TrustRank). However, anything can be manipulated: search engine rankings are constantly abused. And look at DMOZ, where editors acting as gatekeepers for submissions demand money; corruption is rampant there. But maybe Wikipedia has too little commercial motivation to attract that level of corruption. The whole Colbert thing with African elephants seemed like a good test of the system, and Wikipedia did a good job of combating the vandalism. /rant off
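For reference, the TrustRank idea mentioned above (trust originates only from a small set of hand-vetted seed pages and propagates along links) can be sketched roughly like this; the graph, seeds, and constants are made up, and real implementations differ in detail:

```python
# TrustRank-style sketch: only seed pages receive baseline trust, so
# unknown pages earn trust only by being linked from trusted ones.

def trustrank(links, seeds, rounds=30, damping=0.85):
    """links: dict page -> list of pages it links to."""
    pages = set(links)
    for outs in links.values():
        pages.update(outs)
    seed_mass = 1.0 / len(seeds)
    trust = {p: (seed_mass if p in seeds else 0.0) for p in pages}
    for _ in range(rounds):
        # baseline trust goes only to the hand-picked seeds
        new = {p: (1 - damping) * seed_mass if p in seeds else 0.0
               for p in pages}
        for page, outs in links.items():
            if not outs:
                continue
            share = damping * trust[page] / len(outs)
            for target in outs:
                new[target] += share  # trust flows along outgoing links
        trust = new
    return trust

graph = {
    "seed_site": ["reputable"],       # vetted site links to a good page
    "reputable": [],
    "spammy_ring": ["spammy"],        # spam pages only link to each other
    "spammy": ["spammy_ring"],
}
t = trustrank(graph, seeds={"seed_site"})
```

The spam ring never accumulates trust no matter how densely it interlinks, because no seed trust ever reaches it.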