Combination of word-based and category-based language models

Abstract
A language model combining word-based and category-based n-grams within a backoff framework is presented. Word n-grams conveniently capture sequential relations between particular words, while the category model, which is based on part-of-speech classifications and allows ambiguous category membership, is able to generalise to unseen word sequences and is therefore appropriate in backoff situations. Experiments on the LOB, Switchboard and WSJ0 corpora demonstrate that the technique greatly improves language model perplexities for sparse training sets, and offers significantly improved complexity-versus-performance trade-offs when compared with standard trigram models.
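
As a rough illustration of the idea described in the abstract, the sketch below backs off from a word bigram to a part-of-speech category bigram whenever a word pair is unseen, summing over all tag assignments to handle ambiguous category membership. The toy corpus, tag set, and absolute-discounting scheme are assumptions for illustration only, not the paper's actual models or smoothing method.

```python
# Minimal sketch, assuming a tiny hand-tagged corpus and absolute
# discounting; not the authors' implementation.
from collections import defaultdict

# Toy tagged training data: each sentence is a list of (word, tag) pairs.
corpus = [
    [("the", "DET"), ("dog", "NOUN"), ("runs", "VERB")],
    [("the", "DET"), ("cat", "NOUN"), ("sleeps", "VERB")],
    [("a", "DET"), ("dog", "NOUN"), ("barks", "VERB")],
]

word_bi = defaultdict(int)   # (w1, w2) -> count
word_uni = defaultdict(int)  # w -> count
cat_bi = defaultdict(int)    # (c1, c2) -> count
cat_uni = defaultdict(int)   # c -> count
emit = defaultdict(int)      # (c, w) -> count of word w tagged c
cats = defaultdict(set)      # w -> categories observed for w (ambiguity)

for sent in corpus:
    for (w1, c1), (w2, c2) in zip(sent, sent[1:]):
        word_bi[(w1, w2)] += 1
        cat_bi[(c1, c2)] += 1
    for w, c in sent:
        word_uni[w] += 1
        cat_uni[c] += 1
        emit[(c, w)] += 1
        cats[w].add(c)

D = 0.5  # absolute discount subtracted from each seen bigram count

def p_cat(w2, w1):
    """Category-model estimate, summing over ambiguous tag assignments."""
    return sum(
        (cat_bi[(c1, c2)] / cat_uni[c1]) * (emit[(c2, w2)] / cat_uni[c2])
        for c1 in cats[w1]
        for c2 in cats[w2]
        if cat_uni[c1] and cat_uni[c2]
    )

def p(w2, w1):
    """Word bigram with backoff to the category model for unseen pairs."""
    if word_bi.get((w1, w2), 0) > 0:
        return (word_bi[(w1, w2)] - D) / word_uni[w1]
    # Mass freed by discounting is redistributed via the category model,
    # renormalised over the words not seen after w1 (Katz-style backoff).
    seen = [w for (h, w), n in word_bi.items() if h == w1 and n > 0]
    alpha = D * len(seen) / word_uni[w1]
    residual = 1.0 - sum(p_cat(w, w1) for w in seen)
    return alpha * p_cat(w2, w1) / residual if residual > 0 else 0.0

print(p("dog", "a"))  # seen pair: discounted word-bigram estimate
print(p("cat", "a"))  # unseen pair: generalises via DET -> NOUN statistics
```

Because "a cat" never occurs in the toy data, the word model assigns it no probability on its own; the category model generalises from the observed DET-to-NOUN transitions, which is exactly the backoff behaviour the abstract motivates.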
