Abstract
We present a new study of the surface-plasmon contribution to surface-enhanced Raman scattering from molecules adsorbed onto metallic gratings. To this end, we use a rigorous electromagnetic theory of diffraction, i.e., a theory in which the groove depth is not treated as a perturbative parameter. This rigorous treatment leads to an important and previously unreported result: there is an optimal groove depth of the grating for which the surface-plasmon contribution to surface-enhanced Raman scattering is strongest. The enhancement of the electromagnetic field is calculated for different metals and wavelengths. An enhancement factor as high as 3.8×10⁴ is found when a pyridine film is deposited onto a suitably optimized silver grating.