Damaged Merchandise? A Review of Experiments That Compare Usability Evaluation Methods
- 1 September 1998
- journal article
- review article
- Published by Taylor & Francis in Human–Computer Interaction
- Vol. 13 (3), 203–261
- https://doi.org/10.1207/s15327051hci1303_2
Abstract
An interest in the design of interfaces has been a core topic for researchers and practitioners in the field of human-computer interaction (HCI); an interest in the design of experiments has not. To the extent that reliable and valid guidance for the former depends on the results of the latter, it is necessary that researchers and practitioners understand how small features of an experimental design can cast large shadows over the results and conclusions that can be drawn. In this review we examine the design of 5 experiments that compared usability evaluation methods (UEMs). Each has had an important influence on HCI thought and practice. Unfortunately, our examination shows that small problems in the way these experiments were designed and conducted call into serious question what we thought we knew regarding the efficacy of various UEMs. If the influence of these experiments were trivial, then such small problems could be safely ignored. Unfortunately, the outcomes of these experiments have been used t…
This publication has 22 references indexed in Scilit:
- A Guide to GOMS Model Usability Evaluation using NGOMSL. Published by Elsevier, 1997
- The GOMS family of user interface analysis techniques. ACM Transactions on Computer-Human Interaction, 1996
- Using GOMS for user interface design and evaluation. ACM Transactions on Computer-Human Interaction, 1996
- Sample Sizes for Usability Studies: Additional Considerations. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1994
- Understanding usability issues addressed by three user-system interface evaluation techniques. Interacting with Computers, 1994
- Toward a deeper comparison of methods. Published by Association for Computing Machinery (ACM), 1994
- Project Ernestine: Validating a GOMS Analysis for Predicting and Explaining Real-World Task Performance. Human–Computer Interaction, 1993
- Comparison of empirical testing and walkthrough methods in user interface evaluation. Published by Association for Computing Machinery (ACM), 1992
- The Acquisition and Performance of Text-Editing Skill: A Cognitive Complexity Analysis. Human–Computer Interaction, 1990
- Research Methods in Human-Computer Interaction. Published by Elsevier, 1988