Abstract
It has been shown that two correlated photons incident upon two distant interferometers can give a coincidence counting rate that depends nonlocally on the sum of the phases of the two interferometers. It is shown here that the results of existing experiments violate a simple inequality that must be satisfied by any classical or semiclassical field theory. The inequality provides a graphic illustration of the lack of objective realism of the electric field.
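For orientation, a minimal sketch of the phase dependence described above, written in the standard two-photon-interference form; the symbols $\phi_1$, $\phi_2$ (the local interferometer phases) and the fringe visibility $V$ are introduced here only for illustration and are not taken from the abstract itself:

% Illustrative sketch only, not a derivation from this paper:
% the coincidence counting rate R_c between the two distant detectors
% is taken to vary with the *sum* of the local interferometer phases,
% with V the visibility of the resulting coincidence fringes.
\begin{equation}
  R_c(\phi_1,\phi_2) \;\propto\; 1 + V\cos(\phi_1 + \phi_2), \qquad 0 \le V \le 1 .
\end{equation}

In this picture, an inequality of the kind described above can be read as an upper bound on the visibility $V$ attainable by any classical or semiclassical field theory, a bound that the quantum-mechanical prediction for correlated photon pairs can exceed.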