Long-Term Fluctuations versus Actual Field Loss in Glaucoma Patients

Abstract
When glaucomatous visual fields are followed with automated perimetry, it is difficult to differentiate between long-term fluctuations and actual change. We analyzed the visual fields of one randomly selected eye of each of 34 patients (mean age 61.8 ± 12.3 years) with proven glaucoma chronicum simplex and at least 4 (mean 5.6 ± 1.9) visual field examinations with programs 31 and/or 33 of the Octopus 201 over at least 21 months (mean 32.6 ± 6 months). When a group of earlier visual fields is compared with a group of later ones using the statistical program delta-change, the results of the t test agree very well with the results of regression analysis based on data from program delta-series. This investigation confirms that glaucomatous visual fields exhibit considerable long-term fluctuations; as long as field loss and gain observed over a longer period are evenly distributed, the symmetrical portions of the distribution curve may serve as the outer limits of long-term fluctuation. By this criterion, definite visual field deterioration was discerned in none of our medically treated patients over an average observation period of 32 months. If more selective criteria for differentiating between long-term fluctuations and actual field change are applied (the limits of long-term fluctuation measured in earlier investigations, the t test of program delta-change, regression analysis of the data produced by program delta-series, and the agreement between the t test and the correlation coefficient of the regression analysis), 3 eyes changed for the worse with respect to loss per test location and 3 with respect to sensitivity.
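
The two comparisons described above (a t test between a group of earlier and a group of later examinations, and a regression over the whole series) can be illustrated with a minimal sketch. The values below are hypothetical mean sensitivities in dB and do not reproduce the Octopus delta-change or delta-series programs; the sketch only shows the general form of the analysis, assuming SciPy is available.

```python
# Illustrative sketch only: hypothetical mean sensitivities (dB) from a series
# of visual field examinations of one eye, with time in months since baseline.
from scipy import stats

months      = [0, 6, 12, 18, 24, 30]
sensitivity = [24.1, 23.6, 24.3, 23.2, 23.8, 23.0]

# delta-change-style comparison: earlier group versus later group with a t test.
earlier, later = sensitivity[:3], sensitivity[3:]
t_stat, p_ttest = stats.ttest_ind(earlier, later)

# delta-series-style comparison: linear regression of sensitivity over time.
slope, intercept, r_value, p_reg, stderr = stats.linregress(months, sensitivity)

print(f"t test (earlier vs. later): t = {t_stat:.2f}, p = {p_ttest:.3f}")
print(f"regression: slope = {slope:.3f} dB/month, r = {r_value:.2f}, p = {p_reg:.3f}")
```

A field would be flagged as actually deteriorating only when both criteria point the same way, i.e. a significant t test together with a significant negative regression slope; fluctuations alone tend to produce agreement in neither.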
