Criteria For Evaluating Usability Evaluation Methods
- 1 December 2001
- journal article
- Published by Taylor & Francis in International Journal of Human–Computer Interaction
- Vol. 13 (4), 373–410
- https://doi.org/10.1207/s15327590ijhc1304_03
Abstract
The current variety of alternative approaches to usability evaluation methods (UEMs) designed to assess and improve usability in software systems is offset by a general lack of understanding of the capabilities and limitations of each. Practitioners need to know which methods are more effective and in what ways and for what purposes. However, UEMs cannot be evaluated and compared reliably because of the lack of standard criteria for comparison. In this article, we present a practical discussion of factors, comparison criteria, and UEM performance measures useful in studies comparing UEMs. In demonstrating the importance of developing appropriate UEM evaluation criteria, we offer operational definitions and possible measures of UEM performance. We highlight specific challenges that researchers and practitioners face in comparing UEMs and provide a point of departure for further discussion and refinement of the principles and techniques used to approach UEM evaluation and comparison.
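The abstract's mention of operational definitions and possible measures of UEM performance can be made concrete with a small sketch. The UEM-comparison literature, including this article, discusses ratio-style measures such as thoroughness (the fraction of real problems a method finds) and validity (the fraction of a method's reported problems that are real), sometimes combined into a single effectiveness score. The function names and example counts below are illustrative assumptions, not figures from the article, and the sketch presumes that a standard set of "real" usability problems is already known.

```python
# Illustrative sketch of ratio-style UEM performance measures
# (thoroughness, validity, effectiveness). Names and numbers are
# hypothetical; they are not reproduced from the article.

def thoroughness(real_found: int, real_existing: int) -> float:
    """Fraction of the real usability problems that the UEM detected."""
    return real_found / real_existing

def validity(real_found: int, total_reported: int) -> float:
    """Fraction of the UEM's reported problems that are real problems."""
    return real_found / total_reported

def effectiveness(real_found: int, real_existing: int,
                  total_reported: int) -> float:
    """Combined measure: thoroughness multiplied by validity."""
    return (thoroughness(real_found, real_existing)
            * validity(real_found, total_reported))

# Example: a method reports 20 problems, of which 12 are real,
# against a standard set of 30 known real problems.
print(thoroughness(12, 30))       # 0.4
print(validity(12, 20))           # 0.6
print(effectiveness(12, 30, 20))  # ~0.24 (0.4 * 0.6)
```

Note that computing such ratios at all presupposes an agreed-upon standard set of real problems, which is one of the methodological challenges in comparing UEMs that the article highlights.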