Using Algebra Word Problems to Assess Quantitative Ability: Attributes, Strategies, and Errors
- 1 September 1996
- journal article
- Published by Taylor & Francis in Cognition and Instruction
- Vol. 14 (3), 285-343
- https://doi.org/10.1207/s1532690xci1403_2
Abstract
Changing goals in mathematics education have encouraged more open-ended problem solving in assessment. However, the use of these less constrained approaches has been limited by a lack of demonstrated relations between the underlying cognitive models and measurement consequences. To begin characterizing the cognitive basis for this emerging approach to measurement in a small domain (algebra word problems), detailed analyses of solutions to 20 problems that had appeared on the Graduate Record Examination General Test were collected from 51 undergraduates. Problems were characterized in terms of their major attributes, and solutions were described by students' strategies and errors. Regression analyses indicated that models including attributes such as the need to apply algebraic concepts, problem complexity, and problem content could account for 37% to 62% of the variance in problem difficulty. Protocol analyses identified four major solution strategies (equation formulation, ratio setup, simulation, and other, unsystematic approaches) as well as a number of collateral strategies, including the use of pictures, formulas, and verbal descriptions. Higher-achieving students used more equation strategies, more collateral strategies, and fewer unsystematic approaches than lower-achieving students. Student errors tended to be idiosyncratic but could be classified into six principal categories that were used to identify sources of performance. Overall, the results support the notion that constructed responses capture strategy formulation and high-level planning, as do more traditional measures of quantitative reasoning. At the same time, constructed responses are more sensitive to individual problem characteristics and procedural errors, which may be helpful in instruction but are a potential source of bias in assessment.
A preliminary theoretical framework for describing performance on algebra word problems is proposed, and its usefulness for instruction and for more systematic design of tests is discussed.