Abstract
Predictions generated from optimality models are based on a number of assumptions. The predictive value of these models is often determined by the degree to which the behavior of an organism fits the underlying assumptions of the model. Optimal diet choice was analyzed by relaxing 2 sets of assumptions made in previous optimality models. Foraging-bout length (the uninterrupted time devoted just to foraging), generally treated as infinitely long, affected optimal diet choice. For many foragers, foraging-bout length may be considerably shortened by the presence of predators, or by physical or social features of the forager's environment. A model was derived which incorporates a short bout length into the decision of diet choice. The model predicts that animals should become more catholic in their diet choice as the amount of uninterrupted foraging time decreases. This prediction appears to be supported by 3 studies from the literature. Jaeger et al. (1981) showed that salamanders incorporated more lower-ranked prey (small flies) when they were either on the territory of a conspecific or on no territory than when they were on their own territory. In this case, foraging time was uninterrupted when the salamanders were feeding selectively, but continuously interrupted by submissive and marking behavior when no diet choice was exhibited. Freed (1981) showed that wrens foraging for nestlings spent less time per foraging bout when a predator was in the nesting area than when no predator was in sight. The reduction in foraging-bout time correlated with a reduction in prey size fed to the young. Finally, the foraging time of some intertidal snails was confined by the length of the low-tide cycle (Menge 1974): as the end of the low tide drew near and the remaining time available for foraging decreased, the snails became less selective in their choice of prey.
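The finite-bout prediction can be illustrated with a small Monte Carlo sketch. This is our own construction, not the model derived in the paper: we assume prey encounters are Poisson during search, that handling a prey of type i costs time h_i and yields energy e_i, and that a handling bout cut short by the end of the foraging bout yields nothing (the interruption assumption and all parameter values are illustrative). Type 1 is the higher-ranked prey (e1/h1 > e2/h2), and parameters are chosen so that a specialist on type 1 has the higher long-run rate.

```python
import random

# Illustrative parameters (ours, not from the paper).
E1, H1, LAM1 = 10.0, 1.0, 0.1   # large, profitable, rare prey
E2, H2, LAM2 = 0.05, 0.1, 2.0   # small, cheap, common prey

def bout_gain(T, take_small, rng):
    """Energy gained in one uninterrupted foraging bout of length T."""
    t, gain = 0.0, 0.0
    lam = LAM1 + (LAM2 if take_small else 0.0)
    while True:
        t += rng.expovariate(lam)          # search time to next accepted encounter
        if t >= T:
            return gain
        # Which prey type was encountered?
        if rng.random() < LAM1 / lam:
            e, h = E1, H1
        else:
            e, h = E2, H2
        if t + h > T:                      # interrupted mid-handling: prey lost
            return gain
        t += h
        gain += e

def mean_gain(T, take_small, n=20000, seed=1):
    """Average per-bout energy over n simulated bouts."""
    rng = random.Random(seed)
    return sum(bout_gain(T, take_small, rng) for _ in range(n)) / n
```

Under these assumptions, long bouts favor specializing on the big prey (its long-run rate, roughly 0.91 energy units per time, exceeds the generalist's ~0.85), while in a bout too short to even finish handling a big prey the specialist gains nothing and only a generalist taking the small, quickly handled prey earns anything, reproducing the qualitative prediction that short bouts broaden the diet.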
Variance in prey encounter interval affected the utility of classical optimal diet models in predicting the optimal diet. Charnov's (1976) model overestimates the net rate of energy intake when the mean encounter rate varies about some fixed level. Predictions from Charnov's model are incorrect over some ranges of prey encounter rates because of this overestimation. As variance in prey encounter rate increases, the time over which the forager estimates prey encounter rate will have a strong effect on the ability of the forager to maximize the net rate of energy intake. Foragers that forage on patchily distributed prey should use a shorter amount of time to estimate prey density than foragers that prey on evenly dispersed prey. Animals that are capable of reducing the time required to estimate prey density (e.g., predators that hunt by sight in areas of high prey density) should alter their diet in response to local variation in prey density. For this type of forager, as variance in prey encounter rate increases, fluctuations in the number of prey types in the diet will increase. As a result, there should be an increase in the degree of partial prey preference exhibited by the forager with increasing variance in prey encounter rate.
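The direction of the overestimation follows from the shape of the rate equation. For a single prey type, the standard Holling/Charnov rate R(λ) = λe/(1 + λh) is concave in the encounter rate λ, so by Jensen's inequality the rate evaluated at the mean encounter rate exceeds the mean of the rates realized under a fluctuating encounter rate. A minimal sketch, with illustrative parameter values of our own choosing:

```python
def intake_rate(lam, e=10.0, h=2.0):
    """Net rate of energy intake R(lam) = lam*e / (1 + lam*h),
    for encounter rate lam, energy per prey e, handling time h.
    Concave in lam, which drives the Jensen's-inequality effect."""
    return lam * e / (1.0 + lam * h)

# Encounter rate fluctuates between two values with equal probability.
lam_low, lam_high = 0.1, 0.9
mean_lam = 0.5 * (lam_low + lam_high)

rate_at_mean = intake_rate(mean_lam)                               # model's prediction
realized_rate = 0.5 * (intake_rate(lam_low) + intake_rate(lam_high))  # average realized
# rate_at_mean exceeds realized_rate: the model overestimates intake
# whenever the encounter rate varies about its mean.
```

The same concavity argument carries over term by term to the multi-prey version of the rate equation, which is why the overestimation grows with the variance in encounter rate.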