Abstract
We perform a simulation of one-dimensional quasi-random gratings for quantum well infrared photodetectors. The simulation reveals the trade-off between the grating-induced intersubband absorption efficiency and the resulting spectral response range for normally incident radiation. By controlling the degree of grating quasi-randomness, one can optimize the absorption for the desired spectral response range. The general features of the simulation results can serve as guidelines for designing two-dimensional quasi-random gratings for the fabrication of focal plane imaging arrays.