Provisioners can alter their selection of prey for delivery, and this gives them another way to address the problems of increased demand. Scores of papers, including many good experimental studies, reveal that provisioners change the size or type of prey delivered when the demand at the delivery point changes (e.g., Siikamaki et al. 1998; see Moore 2002 for a review). In some cases, parents deliver larger prey to larger offspring; this could occur for the simple reason that small offspring cannot swallow large prey. A more sophisticated hypothesis holds that parents change prey selection to boost the energy delivery rate. For example, when experimenters increase brood size, European starling parents deliver poorer but more easily obtained prey, increasing the delivery of energy at the expense of other nutrients (Wright et al. 1998).
A third possible hypothesis involves variance sensitivity (Stephens and Krebs 1986). (The ecological literature usually calls this concept "risk sensitivity," a term borrowed from economics, where it refers to variable returns on invested capital. Unfortunately, ecologists also use the term "risk" in other ways, as in "risk of predation." Substituting the terms "variance" or "danger," as appropriate, eliminates any potential confusion.) From their beginnings, central place foraging models focused on how changes in provisioning tactics (e.g., selectivity in a single-prey loader) affect the mean delivery rate. If there is any stochasticity in components of the provisioning process (e.g., capture time, handling time, prey size), there will also be variance about the mean delivery rate. The variance itself will also be affected by the tactic chosen. So, in the most general case, provisioning tactics affect both the mean and variance of delivery rates.
Variance becomes important when deviations above and below the mean delivery rate have different effects on fitness. The original development of variance sensitivity theory focused on starvation avoidance (Stephens and Krebs 1986). In these "shortfall-avoidance" models, falling below a requirement results in a different fitness outcome (starvation) than exceeding it (survival). However, the basic idea applies whenever the cost of falling below the mean differs from the benefit of exceeding the mean. Thus, we expect that provisioners will show variance-sensitive behavior when the fitness increments resulting from delivery above and below the mean are unequal (Ydenberg 1994). This asymmetry could arise via falling short, as the early models imagined, but it could also arise if the growth (and hence fitness) of offspring shows diminishing returns with delivery.
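The shortfall-avoidance logic can be made concrete with a small numerical sketch. If daily delivery is (for illustration) assumed to be normally distributed, the probability of falling short of a requirement depends on the standardized distance between the requirement and the mean; the numbers below are hypothetical and are not drawn from any of the cited studies.

```python
from statistics import NormalDist

def shortfall_prob(mu, sigma, requirement):
    """P(daily delivery < requirement), assuming normally
    distributed daily delivery (an illustrative assumption)."""
    return NormalDist(mu, sigma).cdf(requirement)

# Two hypothetical tactics with the same mean delivery (100 units)
# but different variances:
low_var = shortfall_prob(mu=100, sigma=5, requirement=110)
high_var = shortfall_prob(mu=100, sigma=20, requirement=110)
# Requirement above the mean: the variance-prone tactic has the
# lower shortfall probability, so it is favored.

low_var2 = shortfall_prob(mu=100, sigma=5, requirement=90)
high_var2 = shortfall_prob(mu=100, sigma=20, requirement=90)
# Requirement below the mean: the variance-averse tactic is favored.
```

This reproduces the qualitative rule developed below: when demand exceeds the expected delivery, variance helps; when expected delivery exceeds demand, variance hurts.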
Figure 8.6 gives a worked example to show how variance sensitivity can shape provisioning tactics. A parent common tern (Sterna hirundo) flies from its breeding colony to a lake, where it searches for a fish. Fish vary in size, and the tern encounters them sequentially as it flies over the lake. When highly selective, a tern spends more time searching for a suitable fish, but delivers larger fish than when it is less selective. As the tern becomes increasingly selective, the total daily delivery initially rises because it delivers larger fish, but the delivery rate falls if the tern becomes too selective, because it then spends too much time searching for a suitable fish. Selectivity also affects the variance in daily delivery: at low selectivities, the tern spends little time catching prey and most of its time ferrying small prey to the nest. This tactic leads to a total daily delivery rate with little variation. For a selective tern, on the other hand, the time to capture an acceptable item varies greatly, and the variation in total daily delivery rises.
Moore (2002) calculated the total daily delivery resulting from each possible prey selection tactic using a computer simulation. Moore's simulation created stochasticity by randomly drawing prey items from a given size distribution. Each time the simulated provisioner encountered a prey item, the computer applied a minimum acceptable prey size rule. When the item exceeded the minimum size, the provisioner delivered it to the nest; otherwise, the provisioner continued searching. The program computed total daily delivery for 1,000 simulated days, then calculated the mean and standard deviation of total daily delivery from this distribution. Moore (2002) used the z-score method of Stephens and Charnov (1982) to show that when the demand at the nest rises above the expected delivery, provisioners should adopt more variance-prone tactics, which in this case means becoming more selective (i.e., delivering larger prey). In field experiments, he manipulated the brood size of common terns and found that, as predicted, the mean size of prey delivered increased with brood size (see fig. 8.3.1). Moreover, natural variation in brood size and interannual variation in prey availability led to changes in prey selection that Moore could explain in the same way. Moore's study suggests that common tern parental provisioning repertoires regularly include variance-sensitive responses. Box 8.3 explores this idea further.
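The structure of a simulation like Moore's can be sketched in a few lines. The sketch below is not Moore's program: the prey-size distribution (lognormal), the encounter, handling, and travel times, and the day length are all hypothetical placeholders chosen only to illustrate the minimum-acceptable-size rule and the mean/variance calculation.

```python
import random
import statistics

def simulate_daily_delivery(min_size, n_days=1000, day_length=600.0,
                            travel_time=5.0, handling_time=1.0,
                            encounter_time=0.5, seed=0):
    """Monte Carlo sketch of a single-prey loader applying a minimum
    acceptable prey size rule. All parameters are illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_days):
        t, delivered = 0.0, 0.0
        while t < day_length:
            t += encounter_time                  # search until next encounter
            size = rng.lognormvariate(0, 0.5)    # assumed prey-size distribution
            if size >= min_size:                 # the selectivity rule
                t += handling_time + travel_time # catch and ferry to nest
                delivered += size
        totals.append(delivered)
    return statistics.mean(totals), statistics.stdev(totals)
```

Running this with a low threshold (accept nearly everything) gives a nearly fixed round-trip schedule and hence low variance in total daily delivery; raising the threshold makes the waiting time for an acceptable item, and therefore the daily total, much more variable, mirroring the selective-tern case described above.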
So far, I have illustrated how any one of several basic tactics can influence the delivery that a provisioner can attain, but this one-at-a-time analysis probably does not reflect reality. In nature, real provisioners must simultaneously decide what kind of prey to deliver, how fast to fly, and where to search. The empirical studies by Moore (2002) and Wright et al. (1998) on the common tern and starling provide the most complete pictures to date. Both studies found that in response to experimentally manipulated brood sizes, parents changed the amount of time they spent delivering and altered their