Abstract
The model we described earlier (Erlykin and Wolfendale 2001a J. Phys. G: Nucl. Part. Phys. 27 941), which involves Monte Carlo calculations for cosmic rays accelerated by supernova remnants in the interstellar medium, has been used to predict Galactic cosmic ray energy spectra as a function of space and time. Moderate variations of cosmic ray characteristics connected with the random space-time distribution of supernovae are found to be accompanied by much stronger changes caused by explosions of nearby and recent supernovae. The spatial variations have been compared with results from gamma-ray astronomy, which relate to possible small variations in spectral shape for the average cosmic ray proton intensity in the energy range 3-100 GeV out to distances of some hundreds of parsecs from the Earth (Fatoohi et al 1995 J. Phys. G: Nucl. Part. Phys. 21 679). Similarly, comparison has been made with results from radio astronomy, which relate to the electron component. There is found to be no inconsistency with the model predictions in either case. The predicted temporal changes in the cosmic ray intensity at Earth in the range 10-50 GeV, appropriate to cosmogenic nucleus measurements, are again not inconsistent with those observed (an upper limit of a few tens of per cent, the value depending on the cosmogenic nucleus under study). The amplitude of the anisotropy in the arrival directions of cosmic rays predicted by the model is of the order of that observed (typically 1% at 1 PeV) for the situation where there has been a local, recent supernova.
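To illustrate the kind of Monte Carlo calculation referred to above, the following is a minimal sketch, not the authors' code: it samples a random space-time distribution of nearby supernovae and sums their contributions to the local cosmic ray density using the standard Green's function for impulsive point-source diffusion, n(r,t) proportional to exp(-r^2/4Dt)/(4*pi*D*t)^{3/2}. All parameter values (supernova rate, diffusion coefficient, spatial and temporal cut-offs) are assumed, order-of-magnitude placeholders, not those of the model in the paper.

```python
# Illustrative Monte Carlo sketch (assumptions only): fluctuations in the local
# cosmic ray density caused by the random space-time distribution of supernovae,
# treated as impulsive point sources with diffusive propagation.

import numpy as np

RATE_PER_KPC2_MYR = 25.0   # assumed local SN surface rate (SNe kpc^-2 Myr^-1)
R_MAX_KPC = 1.0            # only SNe within this distance of the Sun are kept
T_MAX_MYR = 10.0           # look-back time over which SNe still contribute
D_KPC2_PER_MYR = 0.1       # assumed diffusion coefficient (illustrative)
N_REALISATIONS = 2000

rng = np.random.default_rng(0)

def one_realisation():
    """Sample one random history of nearby SNe and sum their contributions."""
    n_sn = rng.poisson(RATE_PER_KPC2_MYR * np.pi * R_MAX_KPC**2 * T_MAX_MYR)
    # Uniform in area -> r ~ sqrt(uniform); look-back time uniform in (0, T].
    r = R_MAX_KPC * np.sqrt(rng.random(n_sn))
    t = T_MAX_MYR * (1.0 - rng.random(n_sn))
    # Diffusion Green's function for each SN (arbitrary normalisation per SN).
    return np.sum(np.exp(-r**2 / (4.0 * D_KPC2_PER_MYR * t))
                  / (4.0 * np.pi * D_KPC2_PER_MYR * t) ** 1.5)

densities = np.array([one_realisation() for _ in range(N_REALISATIONS)])
print(f"relative rms fluctuation: {densities.std() / densities.mean():.2f}")
```

Realisations that happen to contain a very close, very recent supernova dominate the spread, which is the qualitative point of the abstract: modest fluctuations from the ensemble of distant, old sources, punctuated by much larger excursions from local, recent events.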