Gibbs artifact in magnetic resonance imaging results when band-limited interpolation is used. This typically occurs when the image is reconstructed with more pixels in the phase encoding direction than there are corresponding phase encoding measurements. Such sampling is effectively an ideal (in a noise sense) low-pass filter, which provides a maximal improvement in contrast resolution at the expense of a decrease in spatial resolution. In this paper we demonstrate that an alternative low-pass filter can be used to improve contrast resolution, with a loss in spatial resolution, without producing Gibbs artifact. We show that the noise performance of this filter can be made to approach that of an ideal filter by properly specifying the number of samples averaged for each phase encoding index.
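The effect described above can be illustrated with a one-dimensional sketch: truncating k-space with an ideal (rect) low-pass filter produces the classic Gibbs overshoot at a sharp edge, while a smooth apodization window over the same acquired samples suppresses the ringing at the cost of spatial resolution. The Hann window below is an illustrative stand-in, not the specific filter proposed in the paper, and the sample counts are arbitrary.

```python
import numpy as np

N = 256   # reconstructed pixels
M = 64    # acquired (low-frequency) k-space samples

# Object: a sharp edge (step function), the worst case for Gibbs ringing.
x = np.zeros(N)
x[N // 4: 3 * N // 4] = 1.0

k = np.fft.fftshift(np.fft.fft(x))   # centered k-space of the object

# Ideal low-pass: keep only the central M samples, zero elsewhere
# (equivalent to zero-filled band-limited interpolation).
rect = np.zeros(N)
rect[N // 2 - M // 2: N // 2 + M // 2] = 1.0

# Smooth low-pass: Hann taper applied over the same M samples.
hann = np.zeros(N)
hann[N // 2 - M // 2: N // 2 + M // 2] = np.hanning(M)

recon_rect = np.real(np.fft.ifft(np.fft.ifftshift(k * rect)))
recon_hann = np.real(np.fft.ifft(np.fft.ifftshift(k * hann)))

# Overshoot just inside the edge: large for the rect filter
# (classic ~9% Gibbs overshoot), small for the smooth filter.
overshoot_rect = recon_rect.max() - 1.0
overshoot_hann = recon_hann.max() - 1.0
print(f"rect overshoot: {overshoot_rect:.3f}")
print(f"hann overshoot: {overshoot_hann:.3f}")
```

Both reconstructions have the same nominal field of view and a similar loss of spatial resolution; the difference is that the smooth filter's point-spread function has much smaller sidelobes, which is what eliminates the visible ringing.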