Abstract
Several neural network models in continuous time are reconsidered in the framework of a general mean-field theory which is exact in the limit of a large and fully connected network. The theory assumes pointlike spikes generated by a renewal process. The effect of spikes on a receiving neuron is described by a linear response kernel, which is the dominant term in a weak-coupling expansion. It is shown that the resulting "spike response model" is the most general renewal model with linear inputs. The standard integrate-and-fire model forms a special case. In a network structure with several pools of identical spiking neurons, the global states and the dynamic evolution are determined by a nonlinear integral equation which describes the effective interaction within and between different pools. We derive explicit stability criteria for stationary (incoherent) and oscillatory (coherent) solutions. It is shown that the stationary state of noiseless systems is "almost always" unstable. Noise suppresses fast oscillations and stabilizes the system. Furthermore, collective oscillations are stable only if firing occurs while the synaptic potential is increasing. In particular, collective oscillations in a network with delayless excitatory interaction are at most semistable. Inhibitory interactions with short delays or excitatory interactions with long delays lead to stable oscillations. Our general results can be applied straightforwardly to different network models with spiking neurons. Moreover, the theory allows an estimate of the errors introduced by firing-rate or "graded-response" models.
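To make the model structure summarized above concrete, the following is a minimal, illustrative sketch of a spike-response-type neuron: the membrane potential is the sum of a refractory kernel eta (the after-effect of the neuron's own most recent spike, reflecting the renewal assumption) and linear response kernels eps triggered by presynaptic spikes, and the neuron fires when the potential crosses a threshold. The kernel shapes, parameter values, and helper names (eps, eta, membrane_potential) are assumptions chosen for illustration, not quantities taken from the paper; choosing eta as an exponentially decaying reset recovers the integrate-and-fire special case mentioned in the abstract.

```python
import numpy as np

# Minimal sketch of a spike-response-type neuron (illustration only).
# Kernel shapes, parameters, and the simple threshold rule are assumptions,
# not quantities taken from the paper.

dt = 0.1        # time step [ms]
tau_m = 10.0    # membrane time constant [ms]
tau_s = 2.0     # synaptic rise time constant [ms]
theta = 0.5     # firing threshold (arbitrary units)

def eps(s):
    """Linear response kernel: effect of a presynaptic spike fired s ms ago.
    A double-exponential shape is assumed for illustration."""
    s = np.asarray(s, dtype=float)
    return np.where(s > 0.0, np.exp(-s / tau_m) - np.exp(-s / tau_s), 0.0)

def eta(s):
    """Refractory kernel: after-effect of the neuron's own last spike.
    An exponentially decaying reset recovers the integrate-and-fire case."""
    s = np.asarray(s, dtype=float)
    return np.where(s > 0.0, -theta * np.exp(-s / tau_m), 0.0)

def membrane_potential(t, last_spike, presyn_spikes, weights):
    """u(t) = eta(t - t_last) + sum_j w_j sum_f eps(t - t_j^f)."""
    u = eta(t - last_spike)                 # renewal: only the neuron's last spike counts
    for w_j, spikes_j in zip(weights, presyn_spikes):
        u += w_j * eps(t - np.asarray(spikes_j)).sum()
    return float(u)

if __name__ == "__main__":
    weights = [0.5, 0.8]                    # synaptic efficacies (assumed)
    presyn = [[5.0, 12.0], [7.0]]           # presynaptic firing times [ms] (assumed)
    last_spike, out_spikes = -1e9, []
    for t in np.arange(0.0, 40.0, dt):
        if membrane_potential(t, last_spike, presyn, weights) >= theta:
            out_spikes.append(round(t, 1))
            last_spike = t                  # renewal process: remember only the last spike
    print("output spike times [ms]:", out_spikes)
```

In a population (pool) version of this sketch, many such neurons would share the same kernels and weights, and the quantity of interest would be the pool activity entering the nonlinear integral equation mentioned in the abstract.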