Abstract
The present paper is an attempt to formulate the mathematical properties of events occurring in time with a correlation between them. One effect of the correlation is a change in the mean square deviation of the number of events occurring in an interval of time t. By analogy with the shot effect in the presence of space charge, this change is described by a factor Γ² for large t, and Γ² is evaluated in terms of the distribution of intervals between events. For finite intervals t a corresponding factor γ²(t) is introduced, which usually tends to 1 as t → 0. A correlation function between events occurring at times separated by an interval τ is defined, and its relation to γ²(t) is discussed. A generalization of Campbell's theorem applying to correlated events is derived. Finally, the problem of the random partitioning of correlated events is discussed.
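The factor γ²(t) can be read as the ratio of the variance to the mean of the number of events counted in a window of length t; for uncorrelated (Poisson) events it equals 1 for all t, while interval correlations drive it toward a limiting value Γ² at large t. The following sketch (an illustration, not the paper's own method) estimates this ratio by Monte Carlo for a renewal process whose intervals are gamma-distributed; the shape and scale parameters and the function names are choices made here, and for this interval law the large-t limit Γ² is the squared coefficient of variation of the interval distribution (0.5 for shape 2).

```python
import random

def count_events(t, shape=2.0, scale=0.5):
    """Count renewal events in (0, t] when the intervals between
    successive events are gamma(shape, scale) distributed.
    With shape=2.0, scale=0.5 the mean interval is 1 and the
    interval variance is 0.5, so Γ² = 0.5/1² = 0.5."""
    s, n = 0.0, 0
    while True:
        s += random.gammavariate(shape, scale)
        if s > t:
            return n
        n += 1

def gamma2(t, trials=10000):
    """Monte Carlo estimate of γ²(t) = Var[N(t)] / E[N(t)]."""
    counts = [count_events(t) for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return var / mean

# γ²(t) is near 1 for small t and approaches Γ² = 0.5 for large t;
# with shape=1.0 (exponential intervals, a Poisson process) it stays 1.
```

The decline of γ²(t) from 1 toward Γ² with growing t is exactly the finite-interval behaviour the abstract describes: short windows rarely contain more than one event, so the count is nearly Bernoulli, while long windows feel the full effect of the interval correlations.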
