Wikipedia Reputation System
June 25th, 2008

Wikipedia has long needed a reputation system for its articles, since the site is user-generated and edited by millions of people. Enter the UCSC Wiki Lab, which has come up with a way to track the reputation of specific words and phrases added by users, displaying it as a color overlay on the text.

Using a two-step computing system, the UCSC Wiki Lab judges the information and its validity, then colors any text the system finds troubling in shades of orange, from light to very dark, depending on how trustworthy the information is. You can see an example of how the color-coded trust system works here.
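The color overlay described above can be sketched roughly like this. This is a minimal illustration, not WikiTrust's actual algorithm: it assumes a trust score already normalized to the range 0.0–1.0, and the specific orange endpoint colors are invented for the example.

```python
def trust_to_color(trust):
    """Map a trust score in [0.0, 1.0] to an RGB background color.

    Hypothetical sketch: low-trust text gets a dark orange highlight,
    and high-trust text fades toward white (no highlight). The real
    WikiTrust palette and scoring are not reproduced here.
    """
    if not 0.0 <= trust <= 1.0:
        raise ValueError("trust must be in [0.0, 1.0]")
    dark_orange = (204, 102, 0)   # assumed color for trust = 0.0
    white = (255, 255, 255)       # no highlight for trust = 1.0
    # Linearly interpolate each RGB channel between the two endpoints.
    return tuple(
        round(d + (w - d) * trust)
        for d, w in zip(dark_orange, white)
    )

# The least-trusted text gets the darkest orange:
print(trust_to_color(0.0))  # (204, 102, 0)
# Fully trusted text gets no highlight:
print(trust_to_color(1.0))  # (255, 255, 255)
```

A renderer would then emit each phrase with its computed background color, so readers can spot low-reputation passages at a glance.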

1 - David

Wikipedia is a great tool, but can one trust the information provided, since it is edited by millions?

2 - David

I have submitted to Wikipedia and the parameters seem to be very stringent, but one wonders how much you can trust the info.

3 - Terry Steichen

(cross-posted – with edits – from Read/Write Web)
Based on the WikiTrust logic (as I understand it), if a person contributes to a controversial subject, they are likely to get a lower reputation level (because others with different views will be more likely to change the post). (Indeed, it might create an incentive that, in order to preserve your reputation, you stick to non-controversial topics – which is a counter-productive outcome, IMHO.) Or, if you happen to have someone with a strong bias on the topic to which you contribute, you’ll also suffer. OTOH, if you contribute to topics that few people care about, your reputation will grow.

I don’t see how any of those outcomes will be positive.

I used to submit information and links back in the day but found as time went on, there were unelected “official trolls” who would delete information. The process necessary to have the info put back up was not worth the effort, so I stopped. This tool is fantastic!
