Monday, May 26, 2008

Soft Peer Review: Social Software and Distributed Scientific Evaluation

Dario Taraborelli, Department of Psychology, University College London, Gower Street, London WC1E 6BT, United Kingdom. d.taraborelli@ucl.ac.uk

Abstract

The debate on the prospects of peer review in the Internet age and the increasing criticism leveled against the dominant role of impact factor indicators are calling for new, measurable criteria to assess scientific quality. Usage-based metrics offer a new avenue to scientific quality assessment, but they face the same risks as first-generation search engines that used unreliable metrics (such as raw traffic data) to estimate content quality. In this article I analyze the contribution that social bookmarking systems can make to the problem of usage-based metrics for scientific evaluation. I suggest that collaboratively aggregated metadata may help fill the gap between traditional citation-based criteria and raw usage factors. I submit that bottom-up, distributed evaluation models such as those afforded by social bookmarking will challenge more traditional quality assessment models in terms of coverage, efficiency, and scalability. Services aggregating user-related quality indicators for online scientific content will come to occupy a key function in the scholarly communication system.

D. Taraborelli (2008). Soft peer review: social software and distributed scientific evaluation. In Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP 08), Carry-le-Rouet, France, May 20-23, 2008.
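As a rough illustration of the kind of usage-based indicator the abstract argues for, the sketch below (in Python) combines raw download counts with social-bookmarking signals into a single "soft" score. The PaperUsage structure, the weights, and the log scaling are illustrative assumptions made for the purpose of this post, not a metric defined in the paper.

    import math
    from dataclasses import dataclass

    @dataclass
    class PaperUsage:
        downloads: int        # raw download count (noisy and easy to inflate)
        bookmarks: int        # number of users who bookmarked the paper
        distinct_groups: int  # distinct communities/institutions among those users

    def soft_usage_score(u: PaperUsage,
                         w_downloads: float = 0.2,
                         w_bookmarks: float = 0.5,
                         w_groups: float = 0.3) -> float:
        """Weighted combination of usage signals, log-scaled so that a few
        very popular items do not dominate the ranking."""
        return (w_downloads * math.log1p(u.downloads)
                + w_bookmarks * math.log1p(u.bookmarks)
                + w_groups * math.log1p(u.distinct_groups))

    # Example: a paper with modest traffic but broad bookmarking support
    print(round(soft_usage_score(PaperUsage(downloads=800, bookmarks=120, distinct_groups=35)), 2))

Weighting bookmarks, and their spread across distinct groups, more heavily than raw downloads reflects the abstract's point: collaboratively aggregated metadata is costlier to produce and harder to game than raw traffic, so it makes a better proxy for quality.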

References

[1] Revolutionizing peer review? Nat Neurosci, 8(4):397–397, April 2005. doi: 10.1038/nn0405397. URL http://dx.doi.org/10.1038/nn0405397

[2] Peer review and fraud. Nature, 444(7122):971–972, December 2006. doi: 10.1038/444971b. URL http://dx.doi.org/10.1038/444971b

[3] The impact factor game. PLoS Medicine, 3(6), June 2006. doi: 10.1371/journal.pmed.0030291. URL http://dx.doi.org/10.1371/journal.pmed.0030291

[4] S. Bao, G. Xue, X. Wu, Y. Yu, B. Fei, and Z. Su. Optimizing web search using social annotations. In WWW ’07: Proceedings of the 16th international conference on World Wide Web, pages 501–510, New York, NY, USA, 2007. ACM Press. ISBN 9781595936547. doi: 10.1145/1242572.1242640. URL http://dx.doi.org/10.1145/1242572.1242640

[5] J. Bollen, H. Van de Sompel, J. A. Smith, and R. Luce. Toward alternative metrics of journal impact: A comparison of download and citation data. Information Processing & Management, 41(6):1419–1440, December 2005. doi: 10.1016/j.ipm.2005.03.024. URL http://dx.doi.org/10.1016/j.ipm.2005.03.024

[6] T. Brody, S. Harnad, and L. Carr. Earlier Web usage statistics as predictors of later citation impact. J. Am. Soc. Inf. Sci. Technol., 57(8):1060–1072, June 2006. ISSN 1532-2882. doi: 10.1002/asi.v57:8. URL http://dx.doi.org/10.1002/asi.v57:8

[7] E. Garfield. The agony and the ecstasy—the history and meaning of the journal impact factor. In International Congress on Peer Review and Biomedical Publication, Chicago, September 2005. URL http://garfield.library.upenn.edu/papers/jifchicago2005.pdf

[8] P. Ginsparg. Can peer review be better focused? Science & Technology Libraries, 22(3-4):5–17, January 2004. doi: 10.1300/J122v22n03_02. URL http://people.ccmr.cornell.edu/~ginsparg/blurb/pg02pr.html

[9] W. Glänzel. Journal impact measures in bibliometric research. Scientometrics, 53(2):171–193, 2002. URL http://www.ingentaconnect.com/content/klu/scie/2002/00000053/00000002/00400216

[10] S. Greaves, J. Scott, M. Clarke, L. Miller, T. Hannay, A. Thomas, and P. Campbell. Nature’s trial of open peer review. Nature, December 2006. doi: 10.1038/nature05535. URL http://www.nature.com/nature/peerreview/debate/nature05535.html

[11] S. Harnad. Open access scientometrics and the UK Research Assessment Exercise. In D. Torres-Salinas and H. F. Moed, editors, 11th Annual Meeting of the International Society for Scientometrics and Informetrics, volume 11, pages 27–33, 2007. URL http://eprints.ecs.soton.ac.uk/13804/

[12] S. Harnad. Implementing Peer Review on the Net: Scientific Quality Control in Scholarly Electronic Journals, pages 103–118. MIT Press, 1996. URL http://eprints.ecs.soton.ac.uk/2900/

[13] C. Heintz. Web search engines and distributed assessment systems. Pragmatics & Cognition, 14(2):387–409, 2006.

[14] C. G. Jennings. Quality and value: The true purpose of peer review. Nature, 2006. doi: 10.1038/nature05032. URL http://www.nature.com/nature/peerreview/debate/nature05032.html

[15] M. Jensen. The new metrics of scholarly authority. The Chronicle, June 2007. URL http://chronicle.com/free/v53/i41/41b00601.htm

[16] G. McKiernan. Peer review in the internet age: Five (5) easy pieces. Against the Grain, 16(3):52–55, June 2004. URL http://www.public.iastate.edu/~gerrymck/DraftFive.htm (also available as a PDF: http://www.public.iastate.edu/~gerrymck/FiveEasyPieces.pdf)

[17] H. Roosendaal and P. Geurts. Forces and functions in scientific communication. In Cooperative Research Information Systems in Physics, Oldenburg, Germany, August 1997. URL http://www.physik.uni-oldenburg.de/conferences/crisp97/roosendaal.html

[18] P. T. Shepherd. Final report on the investigation into the feasibility of developing and implementing journal usage factors. Technical report, United Kingdom Serials Group, May 2007. URL http://www.uksg.org/sites/uksg.org/files/FinalReportUsageFactorProject.pdf

[19] Y. Yanbe, A. Jatowt, S. Nakamura, and K. Tanaka. Can social bookmarking enhance search in the web? In JCDL ’07: Proceedings of the 2007 conference on Digital libraries, pages 107–116, New York, NY, USA, 2007. ACM Press. ISBN 9781595936448. doi: 10.1145/1255175.1255198. URL http://dx.doi.org/10.1145/1255175.1255198.

Keywords

peer review; rating; impact factor; citation analysis; usage factors; scholarly publishing; social bookmarking; collaborative annotation; online reference managers; social software; web 2.0; tagging; folksonomy

* This paper is based on ideas previously published in a post on the Academic Productivity blog.

[http://www.academicproductivity.com/blog/2007/soft-peer-review-social-software-and-distributed-scientific-evaluation/]

PDF of Presentation Slides Available

[http://nitens.org/docs/coop08-slides.pdf]
