A task‐appropriate hybrid architecture for explanation
- 1 November 1991
- journal article
- Published by Wiley in Computational Intelligence
- Vol. 7 (4), 315-333
- https://doi.org/10.1111/j.1467-8640.1991.tb00404.x
Abstract
This paper analyzes how a spectrum of architectural and structural ideas fit together to provide the required functionality for explanation generation. Several information processing tasks involved in the choice and organization of the content of an explanation are identified. These are best modeled by distinct mechanisms; hence a particular class of hybrid planning architectures most clearly reflects the nature of the explanation task. The architecture is exemplified by a description of an implemented explanation planner. Various implications of the architecture are discussed, including a classification of structuring relations based on their sources and roles in planning; the elimination of goal‐posting preconditions from goal refinement operators; and the level at which nondeterminism is handled.
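To make the notion of goal refinement without goal‐posting preconditions concrete, the following is a minimal, hypothetical sketch (not the paper's implemented planner) of hierarchical refinement for explanation content planning: each operator decomposes a communicative goal directly into subgoals rather than posting further goals as preconditions. The goal kinds, operator table, and function names are illustrative assumptions only.

```python
# Hypothetical sketch of goal refinement for explanation planning.
# Not the architecture described in the paper; names are illustrative.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Goal:
    """A communicative goal, e.g. ('explain', 'rule-42')."""
    kind: str
    topic: str


# Refinement operators map a goal kind directly to its subgoals,
# so no separate precondition-posting step is needed.
OPERATORS: Dict[str, Callable[[Goal], List[Goal]]] = {
    "explain": lambda g: [Goal("describe", g.topic), Goal("justify", g.topic)],
    "justify": lambda g: [Goal("state-evidence", g.topic)],
}


def refine(goal: Goal) -> List[Goal]:
    """Recursively refine a goal until only primitive goals remain."""
    op = OPERATORS.get(goal.kind)
    if op is None:
        # Primitive goal: left for a separate realization component.
        return [goal]
    plan: List[Goal] = []
    for subgoal in op(goal):
        plan.extend(refine(subgoal))
    return plan


if __name__ == "__main__":
    for leaf in refine(Goal("explain", "rule-42")):
        print(leaf.kind, leaf.topic)
```

In this toy version, nondeterminism (choosing among competing operators for the same goal kind) is not modeled; where that choice is handled is one of the architectural questions the paper discusses.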