Monday, September 13, 2010

Peer Review Highly Sensitive To Poor Refereeing, Claim Researchers

Just a small number of bad referees can significantly undermine the ability of the peer-review system to select the best scientific papers. That is according to a pair of complex-systems researchers in Austria who have modelled an academic publishing system and shown that human foibles can have a dramatic effect on the quality of published science.

[snip]

While the concept of peer review is widely considered the most appropriate system for regulating scientific publications, it is not without its critics. Some feel that the system's reliance on impartiality and the lack of remuneration for referees mean that in practice the process is not as open as it should be. This may be particularly apparent when referees are asked to review more controversial ideas that could damage their own standing within the community if they give their approval.

Questioning referee competence

Stefan Thurner and Rudolf Hanel at the Medical University of Vienna set out to make an assessment of how the peer-review system might respond to incompetent refereeing. [snip]

The researchers created a model of a generic specialist field where referees, selected at random, can fall into one of five categories. There are the "correct" who accept the good papers and reject the bad. There are the "altruists" and the "misanthropists", who accept or reject all papers respectively. Then there are the "rational", who reject papers that might draw attention away from their own work. And finally, there are the "random" who are not qualified to judge the quality of a paper because of incompetence or lack of time.
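For illustration only, the five referee types could be written as simple decision rules along the lines of the Python sketch below. The function names, the use of a single numerical quality score, and the acceptance threshold are assumptions made for this sketch, not details taken from the paper.

import random

# Hypothetical decision rules for the five referee types described above.
# "quality" is the submitted paper's quality, "threshold" is the cut-off a
# competent referee applies, and "own_quality" is the referee's own ability.

def correct(quality, threshold, own_quality):
    # Accepts good papers and rejects bad ones.
    return quality >= threshold

def altruist(quality, threshold, own_quality):
    # Accepts every paper.
    return True

def misanthropist(quality, threshold, own_quality):
    # Rejects every paper.
    return False

def rational(quality, threshold, own_quality):
    # Rejects papers that could draw attention away from the referee's own work.
    return quality <= own_quality

def random_referee(quality, threshold, own_quality):
    # Cannot (or does not) judge quality: accepts or rejects at random.
    return random.random() < 0.5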

[snip]

Within this model community, the quality of scientists is assumed to follow a Gaussian distribution, with each scientist producing one new paper every two time-units and the quality of a paper reflecting its author's ability. At every step in the model, each new paper is passed to two referees chosen at random from the community (self-review excluded), and each referee can either accept or reject the paper. The paper is published if both referees approve it and rejected if both turn it down; if the referees are divided, the paper is accepted with a probability of 0.5.
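A minimal toy simulation of this review step, assuming a Gaussian ability distribution and, for brevity, a mix of only "correct" and "random" referees, might look like the following. The parameter names and default values are placeholders rather than the model's actual settings.

import random

def simulate(n_scientists=1000, n_steps=500, frac_bad=0.1, threshold=0.0):
    # Toy sketch: Gaussian author ability, one paper every two time-units,
    # two randomly chosen referees (no self-review), unanimous accept/reject,
    # and a coin flip on a split decision. All names/values are assumptions.
    ability = [random.gauss(0.0, 1.0) for _ in range(n_scientists)]
    accepted = []
    for step in range(n_steps):
        if step % 2:                                   # a new paper every second step
            continue
        for author in range(n_scientists):
            quality = random.gauss(ability[author], 0.1)      # quality tracks ability
            referees = random.sample(
                [i for i in range(n_scientists) if i != author], 2)
            votes = []
            for _ in referees:
                if random.random() < frac_bad:
                    votes.append(random.random() < 0.5)       # "random" referee
                else:
                    votes.append(quality >= threshold)        # "correct" referee
            if all(votes) or (any(votes) and random.random() < 0.5):
                accepted.append(quality)                      # paper is published
    return sum(accepted) / len(accepted) if accepted else float("nan")

# Comparing, say, simulate(frac_bad=0.0) with simulate(frac_bad=0.3) gives a
# rough sense of how the mean quality of accepted papers degrades.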

Big impact on quality

After running the model with 1000 scientists over 500 time-steps, Thurner and Hanel find that even a small presence of rational or random referees can significantly reduce the quality of published papers. When just 10% of referees do not behave "correctly", the quality of accepted papers drops by one standard deviation. If the fractions of rational, random and correct referees are about one third each, the quality-selection aspect of peer review practically vanishes altogether.

"Our message is clear: if it can not be guaranteed that the fraction of rational and random referees is confined to a very small number, the peer-review system will not perform much better than by accepting papers by throwing (an unbiased!) coin," explain the researchers.

[snip]

Don't forget the editors

But Tim Smith, senior publisher for New Journal of Physics at IOP Publishing, which also publishes physicsworld.com, feels that the study overlooks the role of journal editors. "[snip]. In relation to this study however, one shouldn't ignore the role played by journal editors and Boards in accounting for potential conflicts of interest, and preserving the integrity of the referee selection and decision-making processes," he says.

Michèle Lamont, a sociologist at Harvard University who analyses peer review in her 2009 book How Professors Think: Inside the Curious World of Academic Judgment, feels that we expect too much from peer review. [snip]

When asked by physicsworld.com to offer an alternative to the current peer-review system, Thurner argues that science would benefit from the creation of a "market for scientific work". He envisages a situation where journal editors and their "scouts" search preprint servers for the most innovative papers before approaching authors with an offer of publication. [snip]

[snip]

Related

"Peer Review And Journal Models" / Paolo Dall'Aglio

[http://arxiv.org/abs/physics/0608307]

About the author

James Dacey is a reporter for physicsworld.com

[http://physicsworld.com/cws/article/news/43691]

!!! Thanks To Antonella De Robbio For The HeadsUp !!!

Friday, September 10, 2010

The Idea of Order: Transforming Research Collections for 21st Century Scholarship

The Idea of Order explores the transition from an analog to a digital environment for knowledge access, preservation, and reconstitution, and the implications of this transition for managing research collections. The volume comprises three reports. The first, "Can a New Research Library be All-Digital?" by Lisa Spiro and Geneva Henry, explores the degree to which a new research library can eschew print. The second, "On the Cost of Keeping a Book," by Paul Courant and Matthew "Buzzy" Nielsen, argues that from the perspective of long-term storage, digital surrogates offer a considerable cost savings over print-based libraries. The final report, "Ghostlier Demarcations," examines how well large text databases being created by Google Books and other mass-digitization efforts meet the needs of scholars, and the larger implications of these projects for research, teaching, and publishing.

The reports are introduced by Charles Henry; the volume includes a conclusion by Roger Schonfeld and an epilogue by Charles Henry.

June 2010 / 123 pp. / $25 / ISBN 978-1-932326-35-2 / CLIR Reports 147

Source And Full Text Available At

[http://www.clir.org/pubs/abstract/pub147abst.html]

Monday, September 6, 2010

NISO, IU Receive Mellon Grant to Advance Tools for Quantifying Scholarly Impact From Large-scale Usage Data

BLOOMINGTON, Ind., and BALTIMORE, Md. -- A $349,000 grant from the Andrew W. Mellon Foundation to Indiana University Bloomington will fund research to develop a sustainable initiative to create metrics for assessing scholarly impact from large-scale usage data.

IU Bloomington School of Informatics and Computing associate professor Johan Bollen and the National Information Standards Organization (NISO) will share the Mellon Foundation grant designed to build upon the MEtrics from Scholarly Usage of Resources (MESUR) project that Bollen began in 2006 with earlier support from the foundation. Bollen is also a member of the IU School of Informatics and Computing's Center for Complex Networks and Systems Research (CNetS) and the IU Cognitive Science Program faculty.

The new funding for "Developing a Generalized and Sustainable Framework for a Public, Open, Scholarly Assessment Service Based on Aggregated Large-scale Usage Data" will support the evolution of the MESUR project to a community-supported, sustainable scholarly assessment framework. MESUR has already created a database of more than 1 billion usage events with related bibliographic, citation and usage data for scholarly content.

[snip]

The project will focus on four areas in developing the sustainability model -- financial sustainability, legal frameworks for protecting data privacy, technical infrastructure and data exchange, and scholarly impact -- and then integrate the four areas to provide the MESUR project with a framework upon which to build a sustainable structure for deriving valid metrics for assessing scholarly impact based on usage data. Simultaneously, MESUR's ongoing operations will be continued with the grant funding and expanded to ingest additional data and update its present set of scholarly impact indicators.

[snip]

Data from more than 110,000 journals, newspapers and magazines, along with publisher-provided usage reports covering more than 2,000 institutions, is being ingested and normalized in MESUR's databases, resulting in large-scale, longitudinal maps of the scholarly community and a survey of more than 40 different metrics of scholarly impact.

Sources
 
[http://www.librarytechnology.org/ltg-displaytext.pl?RC=15037]
 
[http://newsinfo.iu.edu/news/page/normal/15040.html]
