Wiki Vandalysis: Wikipedia Vandalism Analysis
The Graduate School, Stony Brook University, Stony Brook, NY.
Wikipedia describes itself as "the free encyclopedia that anyone can edit". Along with the helpful volunteers who contribute by improving articles, a great number of malicious users abuse the open nature of Wikipedia by vandalizing articles. Wikipedia editors fight vandalism both manually and with automated bots that use regular expressions and other simple rules to recognize malicious edits [Carter]. Researchers have also proposed machine learning algorithms for vandalism detection [Smets et al., 2008; Potthast et al., 2008a], but these algorithms are still in their infancy and have much room for improvement. This paper presents an approach to fighting vandalism using natural language processing and machine learning techniques. Along with basic features of the edit, such as edit distance, edit type, and counts of abnormal patterns and slang words, we use features derived from information about the editor, the past revision history of the article, the change in sentiment of the article, and a PCFG sentence parser score. We achieve an area under the ROC curve (AUC) of 0.94 and an F1 score of 0.53 using LogitBoost with 10-fold cross-validation on a training set [Potthast, 2010] of 32,444 human-annotated edits. We also analyze the performance of our features by building separate classifiers for inserts/changes, deletes, and template edits in both balanced and unbalanced corpus settings.
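As a rough illustration of the kind of edit-level features the abstract mentions (edit distance, slang-word count, edit type), the sketch below computes a few such features from a pair of revisions using only the Python standard library. The feature names, the tiny slang lexicon, and the edit-type heuristic are illustrative assumptions, not the paper's actual feature set.

```python
import difflib
import re

# Hypothetical slang lexicon; a real system would use a much larger curated list.
SLANG = {"lol", "wtf", "stupid", "dumb", "haha"}

def edit_features(old_text: str, new_text: str) -> dict:
    """Compute simple edit-level features of the kind used for vandalism
    detection (an illustrative sketch, not the paper's exact features)."""
    # Character-level similarity in [0, 1]; 1.0 means identical revisions.
    similarity = difflib.SequenceMatcher(None, old_text, new_text).ratio()

    # Count slang words appearing in the new revision.
    new_tokens = re.findall(r"[a-zA-Z']+", new_text.lower())
    slang_count = sum(1 for t in new_tokens if t in SLANG)

    # Abnormal-pattern proxy: fraction of letters that are uppercase.
    letters = [c for c in new_text if c.isalpha()]
    upper_ratio = sum(c.isupper() for c in letters) / len(letters) if letters else 0.0

    # Crude edit-type heuristic based on length change.
    if not new_text.strip():
        edit_type = "delete"
    elif len(new_text) > len(old_text):
        edit_type = "insert"
    else:
        edit_type = "change"

    return {
        "similarity": similarity,
        "slang_count": slang_count,
        "upper_ratio": upper_ratio,
        "edit_type": edit_type,
    }

old = "Stony Brook University is a public research university."
new = "Stony Brook University is STUPID lol."
print(edit_features(old, new))  # slang_count is 2, edit_type is "change"
```

Feature vectors like this one would then be fed, together with editor- and history-based features, to a classifier such as LogitBoost.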