Assessing transcription agreement: Methodological aspects
- 1 January 1996
- journal article
- research article
- Published by Taylor & Francis in Clinical Linguistics & Phonetics
- Vol. 10 (2), 131-155
- https://doi.org/10.3109/02699209608985167
Abstract
In recent years it has become common practice among speech researchers to report transcription agreement coefficients when describing findings based on phonetic transcription. Although such coefficients are intended to indicate the degree of transcription accuracy, it is in reality unclear to what extent high agreement coefficients do indeed guarantee high transcription accuracy. In this paper it is argued that the most commonly used agreement coefficient, percentage agreement, has three major disadvantages when applied to phonetic transcription: (a) it assumes that agreement between transcription symbols is all-or-none, (b) it is strongly influenced by chance agreement, and (c) it gives no account of the criteria used for transcription alignment. After a detailed discussion of these drawbacks, an alternative approach to calculating transcription (dis)agreement is proposed. In this approach, experimentally derived feature matrices are used as input to a program that aligns transcription pairs automatically and at the same time calculates a (dis)agreement measure for each pair, the average distance. It is argued that this metric expresses the degree of (dis)similarity between transcriptions more adequately than the usual percentage agreement. The results of an evaluation experiment corroborate this view.
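The contrast the abstract draws can be illustrated in code. The sketch below is not the authors' program: the phones, the binary feature vectors, and the insertion/deletion cost are illustrative assumptions standing in for the experimentally derived feature matrices the paper describes. It aligns two transcriptions with a feature-weighted dynamic-programming (Levenshtein-style) alignment and reports the average distance per aligned pair, alongside a naive all-or-none percentage agreement for comparison.

```python
# Illustrative sketch: feature-based average distance vs. naive percentage
# agreement. Phones, feature values, and the indel cost are assumptions,
# not the paper's experimentally derived matrices.

# Hypothetical binary feature vectors (voice, nasal, continuant).
FEATURES = {
    "p": (0, 0, 0),
    "b": (1, 0, 0),
    "m": (1, 1, 0),
    "f": (0, 0, 1),
    "v": (1, 0, 1),
}

INDEL = 1.0  # assumed cost of inserting or deleting one phone


def feature_distance(a, b):
    """Fraction of features on which two phones disagree (0 = identical)."""
    fa, fb = FEATURES[a], FEATURES[b]
    return sum(x != y for x, y in zip(fa, fb)) / len(fa)


def align_distance(t1, t2):
    """Minimum-cost alignment of two transcriptions.

    Returns (total substitution/indel cost, alignment length), computed by
    dynamic programming over all pairwise alignments.
    """
    n, m = len(t1), len(t2)
    # cost[i][j] = (total cost, alignment length) for prefixes t1[:i], t2[:j]
    cost = [[(0.0, 0)] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        cost[i][0] = (i * INDEL, i)
    for j in range(1, m + 1):
        cost[0][j] = (j * INDEL, j)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = (cost[i - 1][j - 1][0] + feature_distance(t1[i - 1], t2[j - 1]),
                   cost[i - 1][j - 1][1] + 1)
            dele = (cost[i - 1][j][0] + INDEL, cost[i - 1][j][1] + 1)
            ins = (cost[i][j - 1][0] + INDEL, cost[i][j - 1][1] + 1)
            cost[i][j] = min(sub, dele, ins)
    return cost[n][m]


def average_distance(t1, t2):
    """Total alignment cost divided by alignment length."""
    total, length = align_distance(t1, t2)
    return total / length


def percentage_agreement(t1, t2):
    """Naive all-or-none agreement over positionally paired symbols."""
    matches = sum(a == b for a, b in zip(t1, t2))
    return 100.0 * matches / max(len(t1), len(t2))


if __name__ == "__main__":
    # [b] vs [p] differs in one assumed feature, [b] vs [f] in two: percentage
    # agreement scores both pairs identically (0%), average distance does not.
    print(percentage_agreement("b", "p"), average_distance("b", "p"))
    print(percentage_agreement("b", "f"), average_distance("b", "f"))
```

The point of the toy example is drawback (a) from the abstract: under all-or-none scoring, a near-miss like [b]/[p] and a worse confusion like [b]/[f] both count simply as disagreements, while the feature-weighted average distance grades them by severity.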
This publication has 28 references indexed in Scilit:
- A longitudinal study of the babbling and phonological development of a child with hypoglossia. Clinical Linguistics & Phonetics, 1991
- The validity of phonetic transcription: Limitations of a sociolinguistic research tool. Language Variation and Change, 1990
- Interobserver reliability and perceptual ratings. Journal of Speech, Language, and Hearing Research, 1988
- Transcribing phonetic detail in the speech of unintelligible children: A comparison of procedures. International Journal of Language & Communication Disorders, 1985
- A review of the observational data-collection and reliability procedures reported in the Journal of Applied Behavior Analysis. Journal of Applied Behavior Analysis, 1977
- Evaluating interobserver reliability of interval data. Journal of Applied Behavior Analysis, 1977
- Considerations in the choice of interobserver reliability estimates. Journal of Applied Behavior Analysis, 1977
- Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit. Psychological Bulletin, 1968
- A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 1960
- Accuracy in testing the articulation of speech sounds. The Journal of Educational Research, 1938