Interrater Reliability in Grading Abstracts for the Orthopaedic Trauma Association
- 1 June 2004
- Research article
- Published by Wolters Kluwer Health in Clinical Orthopaedics and Related Research
- Vol. 423, pp. 217–221
- https://doi.org/10.1097/01.blo.0000127584.02606.00
Abstract
Only a small proportion of the abstracts submitted to the annual meeting of the Orthopaedic Trauma Association can be accepted for podium presentation. Members of the annual program committee must ensure that the selection of abstracts is free from bias and transparent to investigators. The objectives of this study were to examine the consistency of reviewers in grading abstracts submitted for podium presentation at the 2001 and 2002 Annual Meetings of the Orthopaedic Trauma Association and to evaluate whether the grades of the actual podium presentations at the meeting were consistent with the grades based on the abstracts alone. Reviewers independently graded, in a blinded manner, all abstracts submitted to the Orthopaedic Trauma Association for presentation. Abstracts submitted by members of the review panel were adjudicated independently by six reviewers who were not members of the committee. Before the final decisions were made, all reviewers met to discuss the abstracts submitted for oral presentation. Among the 440 papers reviewed in 2001 and the 438 papers reviewed in 2002, the interreviewer reliability for abstract review was 0.23 and 0.27, respectively. Despite disagreements about the quality of the abstracts, reviewers reached consensus through discussion to determine the final program. Grading of the 67 and 73 podium presentations during the 2001 and 2002 meetings, respectively, by unblinded reviewers did not improve interreviewer agreement. Of the papers at the 2002 meeting that ultimately ranked in the top 20 after the full presentations, 15 had originally been ranked lower than 20th in the initial grading. Only one of the top three papers of the meeting had been ranked in the top three before the meeting.
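The abstract reports interreviewer reliability coefficients of 0.23 and 0.27 but does not, in this summary, name the specific statistic used. As an illustration only, the sketch below computes Fleiss' kappa, one common chance-corrected measure of agreement among multiple raters; the `fleiss_kappa` function and the toy grade counts are hypothetical and are not taken from the study.

```python
import numpy as np

def fleiss_kappa(ratings: np.ndarray) -> float:
    """Fleiss' kappa for an (n_abstracts x n_categories) count matrix,
    where ratings[i, j] is the number of reviewers who assigned
    abstract i to grade category j (equal reviewers per abstract assumed)."""
    n_subjects, _ = ratings.shape
    n_raters = ratings.sum(axis=1)[0]

    # Proportion of all assignments falling in each grade category
    p_j = ratings.sum(axis=0) / (n_subjects * n_raters)

    # Per-abstract agreement: fraction of reviewer pairs that agree
    P_i = (np.sum(ratings ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))

    P_bar = P_i.mean()        # observed agreement
    P_e = np.sum(p_j ** 2)    # agreement expected by chance

    return (P_bar - P_e) / (1 - P_e)

# Toy example: 5 abstracts, each graded by 4 reviewers on a 3-point scale
toy = np.array([
    [4, 0, 0],
    [2, 2, 0],
    [1, 2, 1],
    [0, 3, 1],
    [0, 1, 3],
])
print(f"Fleiss' kappa = {fleiss_kappa(toy):.2f}")  # ~0.24 for these counts
```

By the commonly cited Landis and Koch benchmarks, coefficients in the 0.21–0.40 range, like those reported here, are conventionally interpreted as only fair agreement beyond chance.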