As we’ve all learned, anyone can edit a Wikipedia entry. Some edits improve the quality of the content, but others can shade the truth about public figures or even, shockingly, downright lie. Very soon you’ll be able to see the dubious statements in an entry in bright, shining orange. Software developed by computer science professor Luca de Alfaro and others at UC Santa Cruz will highlight questionable text in orange; the deeper the color, the more likely it is to be a lie. Since Wikipedia keeps an open history of edits, the software analyzes how well any contributor’s work survives edits by others. The better the survival rate of a contributor’s work, the more trustworthy it’s deemed to be. But if it’s been shot down often, that contributor’s statements will turn a deeper shade of orange.
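To make the idea concrete, here’s a toy sketch of survival-based scoring. This is not de Alfaro’s actual algorithm (the function, the revision format, and the scoring are all invented for illustration); it just shows the core intuition: a contributor whose statements keep surviving later revisions earns a higher trust score than one whose statements keep getting edited out.

```python
def contributor_trust(revisions):
    """Toy survival-based trust score -- an illustration, not the real
    WikiTrust algorithm. Each revision is (author, set of statement ids
    present in that revision). A statement 'survives' a revision if it
    is still present; its original author gets credit or blame."""
    owner = {}  # live statement id -> author who introduced it
    kept = {}   # author -> number of times their statements survived
    seen = {}   # author -> number of times their statements were at risk
    for author, statements in revisions:
        # Check every previously introduced, still-live statement.
        for sid, who in list(owner.items()):
            seen[who] = seen.get(who, 0) + 1
            if sid in statements:
                kept[who] = kept.get(who, 0) + 1
            else:
                del owner[sid]  # edited out; no longer at risk
        # Register statements introduced in this revision.
        for sid in statements:
            owner.setdefault(sid, author)
    # Survival rate in [0, 1]: low scores would map to deeper orange.
    return {a: kept.get(a, 0) / seen[a] for a in seen}
```

For example, if alice’s statement s2 is removed by the next editor while her s1 survives two later revisions, she scores 2/3, while an editor whose text is never removed scores 1.0.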
Obviously, that’s not foolproof, but it received an enthusiastic reception at the Wikimania convention in Taiwan last month. The software is currently in demo mode and works only with an older subset of Wikipedia entries. But de Alfaro hopes to work with the Wikimedia Foundation to bring it to the site as a real-time option.
“I don’t want to give the impression that Wikipedia is low-trust. It really works very well,” de Alfaro said. “What I wanted to make sure is that nobody can single-handedly modify information without some trace of that being available for some time afterwards.”