Abstract
In situations where foraging sites vary both in food reward and predation risk, conventional optimal foraging models based on the criterion of maximizing net rate of energy intake commonly fail to predict the patch choice of foragers. Recently, an alternative model based on the simple rule "when foraging, minimize the ratio of mortality rate (u) to foraging rate (f)" was successful in predicting patch preference under such conditions (Gilliam and Fraser 1987). In the present study, I compare the predictive ability of these two models under conditions where available patches vary both in predation hazard and foraging returns. Juvenile bluegill sunfish (Lepomis macrochirus) were presented with a choice between two patches of artificial vegetation differing in stem density (i.e. 100, 250, and 500 stems/m²) in which to forage. Each combination (100:250, 250:500, or 100:500) was presented in the absence of, in the presence of, and shortly after exposure to a bass predator (Micropterus salmoides). The patch of vegetation in which bluegills chose to forage, and their foraging rate within each patch, were recorded. Independent measurements of bluegill foraging rate and risk of mortality at the three stem densities provided the data for predicting patch choice under the two models. With no predator, preference between patches was consistent with the maximize-energy-intake-per-unit-time rule of conventional optimality models. However, with a predator present, patch preference switched to match the minimize-u/f criterion. Finally, when tested shortly after exposure to a predator (i.e. 15 min), bluegill preference appeared to be in a transitional phase between these two rules. Results are discussed with respect to factors determining the distribution of organisms within beds of aquatic vegetation.
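The two decision rules contrasted above can be illustrated with a minimal sketch. The foraging rates (f) and mortality rates (u) below are invented for illustration only; they are not the study's measurements, though they follow the general pattern that denser vegetation lowers both foraging rate and predation risk.

```python
# Hypothetical patch data: stem density -> foraging rate f and mortality rate u.
# All numbers are illustrative assumptions, not values from the study.
patches = {
    100: {"f": 1.2, "u": 0.030},  # sparse: high food intake, high risk
    250: {"f": 0.9, "u": 0.012},  # intermediate
    500: {"f": 0.5, "u": 0.004},  # dense: low food intake, low risk
}

def best_patch_no_predator(patches):
    """Conventional rule: choose the patch maximizing foraging rate f."""
    return max(patches, key=lambda d: patches[d]["f"])

def best_patch_with_predator(patches):
    """Gilliam and Fraser (1987) rule: minimize the ratio u/f."""
    return min(patches, key=lambda d: patches[d]["u"] / patches[d]["f"])

print(best_patch_no_predator(patches))    # sparse patch wins on energy intake
print(best_patch_with_predator(patches))  # dense patch wins on risk per unit food
```

With these illustrative numbers the two rules pick opposite ends of the density gradient (100 vs. 500 stems/m²), mirroring the switch in bluegill preference reported when a predator was present.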