An HDL Simulation of the Effects of Single Event Upsets on Microprocessor Program Flow

Abstract
Simulation experiments for determining the effects of single event upsets (SEUs) on microprocessor program flow are described. A 16-bit microprocessor is modeled using a hardware description language. Using pseudorandom selection of event time and affected flip-flop, SEUs are injected into the microprocessor model. Upset detectors are modeled along with the microprocessor to determine the fault coverage of several candidate fault-detection techniques.
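
As a concrete illustration of the injection procedure the abstract describes, the sketch below (not from the paper) flips one pseudorandomly chosen flip-flop of a toy program-counter model at a pseudorandomly chosen cycle, then estimates the fault coverage of a simple address-bounds detector over many trials. The model, the detector, and all names here are illustrative assumptions, not the authors' HDL code.

```python
import random

N_FLOPS = 16     # flip-flop width of the toy program counter (16-bit machine)
N_CYCLES = 1000  # length of one simulated run

def run_trial(rng):
    """Run one simulation with a single injected SEU; return True if detected."""
    pc = 0
    upset_cycle = rng.randrange(N_CYCLES)  # pseudorandom event time
    upset_bit = rng.randrange(N_FLOPS)     # pseudorandom affected flip-flop
    detected = False
    for cycle in range(N_CYCLES):
        pc = (pc + 1) & 0xFFFF             # nominal program flow: PC increments
        if cycle == upset_cycle:
            pc ^= 1 << upset_bit           # SEU: flip one flip-flop of the PC
        # Candidate detector: flag any PC outside the program's address range.
        if not (0 < pc <= N_CYCLES):
            detected = True
    return detected

rng = random.Random(1)
results = [run_trial(rng) for _ in range(10_000)]
print(f"estimated fault coverage: {sum(results) / len(results):.2%}")
```

Upsets in high-order flip-flops drive the counter out of range and are caught, while flips of low-order bits can leave it inside the valid range, so the estimated coverage stays below 100%; comparing several such detector models over the same injection runs is the comparison the abstract outlines.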
