Chapter 8 Summary

  1. All animals use their sense organs to accumulate information about previous and current conditions around them. Accumulated information can be stored in the nervous system in various ways; a common mechanism is a list of alternative conditions with a probability estimate for each alternative. Animals use their current state of information to make decisions about future actions. Optimal decision making adopts the action with the best expected payoff. The expected payoff of an action is the sum of the payoffs of its possible outcomes, each discounted (multiplied) by the currently estimated probability that the condition leading to that outcome exists (see the expected-payoff sketch following this list).
  2. Cues and signals are stimuli correlated with conditions of interest to receivers: cues are generated for reasons unrelated to the presence or responses of receivers, whereas senders emit signals because both they and receivers generally benefit from the provision of this information. Receivers combine the receipt of a cue or signal with knowledge of the correlations between such stimuli and the conditions of interest to update their probability estimates that alternative conditions are true. Optimal receivers can then decide on future actions by computing and comparing expected payoffs, or by comparing the updated probability estimates to threshold values based on the relative payoffs of the alternative actions (see the Bayesian-updating sketch following this list).
  3. There is increasing evidence that many animals, including humans, track the current probabilities of ambient conditions, update these probabilities upon receipt of cues and signals, and make optimal decisions much of the time. However, both humans and other animals sometimes deviate from optimal decision making. Causes include nonlinear functions relating payoffs to actual utilities (risk effects); nonlinear scalings of stimuli in sensory systems (Weber’s Law); the time and effort saved by shortcuts to Bayesian updating; exploratory decisions made to update condition lists and probabilities; and the need for fall-back strategies (heuristics) when alternative expected payoffs or probabilities have similar values (the risk-sensitivity sketch following this list gives a numerical example of the first effect).
  4. A species’ overall signal repertoire can usually be broken down into separate signal sets, each of which includes all of the alternative signals that might be given to provide information about the same question. The rules by which senders and receivers associate conditions and signals in a signal set constitute that set’s coding scheme. Animal coding schemes are highly diverse. At a gross level, they can differ in plurality (whether they consist of continuous or discrete alternatives and, if discrete, how many alternatives there are); persistence (how long an emitted signal continues to reflect the correlated condition); and the level of reliability guarantee. The latter ranges from conventional signals, which carry no intrinsic guarantee of reliability, through handicap signals, which impose additional costs or reduced benefits when given unreliably, to index signals, which are physically impossible or at least extremely difficult to produce unreliably.
  5. Multivariate signals encode information in more than one concurrent signal property, providing even greater diversity in coding schemes. The multiple properties may all provide the same information, as in redundant signals, or each property may provide entirely different information, as in multiple-message signals. The multivariate signals of most animals provide a mix of redundant and multiple-message information. The information provided by multivariate signals may also depend on interactions between properties. Interactive coding schemes include: enhancement (in which redundant properties given concurrently produce a stronger receiver response than if given singly); trade-offs (in which performance of some properties at extreme values inhibits performance of other concurrent properties at extreme values); amplifiers (properties that provide no information themselves but facilitate information provision when invoked concurrently with others); dominance (in which multiple properties provide independent information when given singly, but one property dominates receiver responses when given concurrently); modulation (a special case of dominance in which the presence of the dominated properties enhances receiver responses to the dominant property’s information); and emergence (in which combinations of properties provide different information to receivers than any one does when given singly).
  6. Serial signals consist of multiple elements emitted in succession. All of the interactive coding schemes listed for multivariate signals can also occur among the successive elements of serial signals. Because the number of potential permutations of elements in serial signals can be enormous, most species limit the allowable options using either phonological syntax rules (in which individual elements given singly provide no information) or lexical syntax rules (in which individual elements can provide information on their own). Human speech relies on phonological syntax to build words and on lexical syntax to build sentences; most animals use only phonological syntax.
  7. Coding schemes can be treated as mappings of outputs onto inputs: senders map emitted signals onto perceived conditions, propagation maps transmitted signals onto emitted signals, and receivers map expected categories onto transmitted signals. Few of these mappings are isomorphic; that is, the plurality of the input is usually not preserved in the output. Senders may map discrete signals onto continuously varying conditions. Propagation often converts initially discrete signals into noisy continuous ones. Receivers often divide continuous propagated signals into discrete categories, either early in sensory processing (categorical perception) or at a later stage. Changes in plurality at each stage invariably reduce the information transmitted (the plurality sketch following this list traces one such chain).
  8. Measures of signal effectiveness attempt to quantify the average amount of information provided by a signal set. Forward measures such as consistency, the index of association, and matrix determinants quantify the correlations between alternative outputs (emitted signals) and the relevant inputs (conditions). Perfect information is provided only when all dominant consistencies are 100% and the index of association and matrix determinant equal 1.0. Backward measures such as reliability quantify the fraction of time that a receiver relying on signals correctly infers the current condition. Mutual information rescales reliability into units based on equivalent binary questions (bits). Existing data indicate that most animal signal sets have intermediate consistencies of 30–80%; perfect information appears to be very rare in animal communication (the consistency and mutual-information sketch following this list shows both calculations on a toy matrix).
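The expected-payoff sketch referred to in point 1 is below: a minimal Python illustration of choosing the action with the best expected payoff. The two conditions, two actions, payoffs, and probabilities are invented for the example and do not come from the chapter.

```python
# Expected-payoff decision rule (point 1). All condition names, payoffs,
# and probabilities below are invented for illustration.

def expected_payoff(payoffs, probabilities):
    """Sum of each outcome's payoff, weighted by the estimated probability
    of the condition that produces it."""
    return sum(payoffs[c] * probabilities[c] for c in probabilities)

# Receiver's current probability estimates for two alternative conditions.
prob = {"predator_present": 0.2, "predator_absent": 0.8}

# Payoff of each candidate action under each condition (arbitrary units).
payoffs = {
    "flee":          {"predator_present": 10, "predator_absent": -2},
    "keep_foraging": {"predator_present": -50, "predator_absent": 5},
}

# Optimal decision: adopt the action with the highest expected payoff.
print({a: expected_payoff(payoffs[a], prob) for a in payoffs})
# flee ≈ 0.4, keep_foraging = -6.0
best_action = max(payoffs, key=lambda a: expected_payoff(payoffs[a], prob))
print(best_action)  # 'flee'
```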
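The Bayesian-updating sketch referred to in point 2 is below. With invented priors, likelihoods, and a threshold, it shows how a receiver might update its probability estimate after detecting a signal and then compare the posterior to a payoff-based threshold.

```python
# Bayesian updating on receipt of a signal, followed by a threshold
# decision (point 2). Priors, likelihoods, and threshold are invented.

def bayes_update(prior, likelihoods):
    """Posterior P(condition | signal) from prior P(condition) and the
    likelihoods P(signal | condition) for the signal actually received."""
    unnormalized = {c: prior[c] * likelihoods[c] for c in prior}
    total = sum(unnormalized.values())
    return {c: v / total for c, v in unnormalized.items()}

# Prior probability estimates before any signal arrives.
prior = {"predator_present": 0.2, "predator_absent": 0.8}

# Probability of hearing an alarm call under each condition: the signal is
# correlated with, but not a perfect indicator of, the condition.
likelihood_of_alarm = {"predator_present": 0.9, "predator_absent": 0.1}

posterior = bayes_update(prior, likelihood_of_alarm)
# posterior["predator_present"] = 0.18 / (0.18 + 0.08) ≈ 0.69

# Decision by threshold: flee whenever the updated probability of a predator
# exceeds the break-even value implied by the relative payoffs of the two
# actions (about 0.10 for the payoffs used in the expected-payoff sketch).
threshold = 0.10
decision = "flee" if posterior["predator_present"] > threshold else "keep_foraging"
print(round(posterior["predator_present"], 2), decision)  # 0.69 flee
```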
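The risk-sensitivity sketch referred to in point 3 is below. It uses an arbitrary concave (square-root) utility function to show how a nonlinear mapping from payoff to utility can make a certain reward preferable to a gamble with a higher expected payoff; all numbers are invented.

```python
# Risk effects from a nonlinear payoff-to-utility mapping (point 3).
# The square-root utility and all payoffs are arbitrary choices.
import math

def expected_value(outcomes):
    """outcomes: list of (payoff, probability) pairs."""
    return sum(x * p for x, p in outcomes)

def expected_utility(outcomes, utility=math.sqrt):
    return sum(utility(x) * p for x, p in outcomes)

safe   = [(4.0, 1.0)]                 # a certain payoff of 4
gamble = [(9.0, 0.5), (0.0, 0.5)]     # a risky payoff: 9 or nothing

print(expected_value(safe), expected_value(gamble))      # 4.0 4.5 -> gamble wins on raw payoff
print(expected_utility(safe), expected_utility(gamble))  # 2.0 1.5 -> certainty wins on utility
```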
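The plurality sketch referred to in point 7 is below. It traces one hypothetical sender-propagation-receiver chain in which a continuous condition is mapped onto three discrete signals, propagation adds continuous noise, and the receiver collapses the result into only two categories; every function and number here is invented.

```python
# Plurality changes along the coding chain (point 7). All mappings and
# numbers are invented for illustration.
import random

def sender(condition):
    """Map a continuous condition in [0, 1) onto one of three discrete signals."""
    return min(2, int(condition * 3))        # signal level 0, 1, or 2

def propagate(signal, noise_sd=0.3):
    """Propagation turns the discrete signal into a noisy continuous stimulus."""
    return signal + random.gauss(0.0, noise_sd)

def receiver(stimulus):
    """The receiver re-discretizes the stimulus into only two categories."""
    return "high" if stimulus > 1.0 else "low"

random.seed(1)
for condition in (0.1, 0.5, 0.9):            # three clearly different conditions
    signal = sender(condition)
    category = receiver(propagate(signal))
    print(condition, signal, category)
# Conditions 0.5 and 0.9 elicit different signals (1 vs. 2) but can land in
# the same receiver category, so information about the condition is lost.
```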
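The consistency and mutual-information sketch referred to in point 8 is below. It computes a forward measure (the dominant consistency for each condition) and mutual information in bits from an invented condition-by-signal count matrix; neither the signal names nor the counts come from the chapter.

```python
# Consistency and mutual information from a toy condition-by-signal count
# matrix (point 8). Rows are true conditions; columns are the signals
# recorded while that condition held. All counts are invented.
import math

counts = {
    "predator":    {"alarm": 40, "contact": 10},
    "no_predator": {"alarm": 15, "contact": 35},
}
total = sum(sum(row.values()) for row in counts.values())

# Forward measure: for each condition, the fraction of its signals that fall
# in the dominant (most frequently used) signal for that condition.
for condition, row in counts.items():
    consistency = max(row.values()) / sum(row.values())
    print(condition, round(consistency, 2))   # 0.8 and 0.7: intermediate, not perfect

# Mutual information (in bits) between condition and signal.
signals = {s for row in counts.values() for s in row}
mi = 0.0
for c in counts:
    p_c = sum(counts[c].values()) / total
    for s in signals:
        p_cs = counts[c][s] / total
        p_s = sum(counts[k][s] for k in counts) / total
        if p_cs > 0:
            mi += p_cs * math.log2(p_cs / (p_c * p_s))
print(round(mi, 3), "bits")   # ~0.191, well below the 1 bit of a perfectly reliable binary signal
```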