Chapter 3 Summary
- Sound signals typically degrade during propagation to the receiver. Some species have developed acoustic adaptations that minimize degradation and maximize the active space of their signals. Ranging is the assessment of degradation in a signal; it allows a receiver or eavesdropper to estimate the distance to the sender.
- The amplitudes of all frequencies in a sound signal decrease equally with distance from the sender due to spreading losses; for spherical spreading this amounts to about 6 dB per doubling of distance (see the spreading-loss sketch after this summary). Where the medium is stratified with respect to flow, temperature, or pressure, refraction may decrease sound amplitudes faster or slower than expected given spreading losses alone. Spreading losses are reduced by refraction when terrestrial senders call at dawn, are upwind, or call between the understory and the canopy in a dense forest; they are more severe on hot days, when the sender is downwind, or when the caller is above the forest canopy.
- The frequency spectrum of a sound often changes as it propagates between sender and receiver. A plot of the change in amplitude for each possible frequency component is called the frequency response of that medium and situation (a sketch after this summary shows how one can be estimated by comparing sent and received spectra). In most environments, amplitudes of propagating high-frequency sounds attenuate faster than those of low-frequency ones because of heat losses and scattering by objects in the sound path between sender and receiver. Heat losses and scatter are generally more severe in terrestrial environments than in aquatic ones.
- A boundary between media can also alter the frequency response along the path between sender and receiver. It does so by reflecting waves toward the receiver. These waves can then interfere (constructively or destructively) with the waves that have traveled directly between the two parties. In many cases, the reflected waves suffer a phase shift at the boundary, causing destructive interference and cancellation of low frequencies (a two-ray sketch after this summary illustrates this effect). For sound in air propagating over soft earth, a ground wave can restore some of the lowest frequencies, at least over moderate distances. As a result, the frequency response graph for sound propagation in air but near the ground often has a notch in which intermediate frequencies are severely attenuated, but very low and high frequencies continue to propagate well. Boundary interference also results in high attenuation of low-frequency sounds propagating just under the surface in water. There are no restorative effects in water to minimize this filtering.
- When sender and receiver are both located between the same two or more reflective boundaries, the medium for sound propagation becomes a waveguide. The complex reflections from the multiple surfaces result in certain frequencies (normal modes) being favored over others during propagation. Typically, there is a cutoff frequency below which propagation is negligible (see the cutoff-frequency sketch after this summary). Waveguides can seriously distort signal patterns over long distances.
- A third type of boundary effect occurs when the sound propagates in one medium but the receiver is located in another adjacent medium. The receiver must then rely on whatever version of the sound is detectable on its side of the boundary. This is the case for insects exchanging signals on the water’s surface, insects and spiders communicating through bending waves inside plants, and elephants and burrowing mammals communicating with seismic signals. All of these situations attenuate high frequencies faster than low frequencies, and the resulting sound waves, as in waveguides, may show complex resonances and favored modes.
- Temporal patterns in propagating sounds are most often altered by reverberations (echoes). Airborne sound signals propagating in forests acquire more reverberations than do similar sounds in open country. Echoes from fish swim bladders create reverberations under water. If the original signal has elaborate modulations, these will be degraded by added reverberations; if the signal consists of a long single frequency, it may be enhanced by becoming longer and louder to receivers. Most forest birds therefore avoid rapid modulations in their songs and calls. In open country, on the other hand, slow amplitude modulations may be added to signals as they pass through wind vortices or heat bubbles. Open-country birds thus favor rapid amplitude modulations for long-distance signals, or rely more heavily on frequency modulation for pattern.
- Sounds propagating on a water or solid substrate boundary, inside plants, or in air or water waveguides will suffer dispersion, in which different frequencies propagate at different speeds. This can create major distortions in the temporal and frequency patterns by the time a signal reaches a receiver (see the dispersion sketch after this summary).
- Noise is a ubiquitous problem for animals communicating with sound signals. In most habitats, noise is more likely to mask low frequencies (due to wind in air and waves in water) and high frequencies (due to insects in air and snapping shrimp in water) than intermediate ones. Bending waves inside plants can be masked by rustling of the leaves and branches. Human-caused (anthropogenic) noise is increasingly a problem for sound-communicating animals in the wild. When faced with significant noise, animals may focus more on intermediate frequencies, limit signaling to periods of relative quiet, or increase signal amplitude to ensure effective long-distance communication.
- All animal ears use differential motion between special mechanoreceptors and the rest of their body to detect sounds. As with sound radiation, coupling of ambient sounds into an ear is hindered if the wavelengths are significantly larger than the animal or if the acoustic impedances of the medium and of the ear differ markedly. Animals have evolved special adaptations to enhance the coupling, modification, and detection of captured sounds.
- Small hairs on the exoskeleton and plumose antennae of arthropods move more easily in a near field than does the rest of the body. This differential movement provides the necessary coupling for near-field sounds. In other species, the internal sensory hairs are covered by heavy objects (statoliths in crustaceans and otoliths in fish). These masses accelerate more slowly in near fields than do the hairs and other soft tissues, and this differential movement bends the hairs and stimulates the sensory cells.
- Terrestrial animals have thin membranes called tympana to couple far-field sounds in air into their ears. In mammals and moths, the tympana are stretched over closed cavities. The resulting pressure detectors compare external sound pressures to the reference pressure in the closed cavity. The tympana of grasshoppers, cicadas, crickets, katydids, frogs, lizards, and birds compare sounds sampled at two different locations in the far field and thus serve as pressure differential detectors. All use the exterior of a tympanum as one sample point. Crickets and katydids use tracheal tubes to convey a second sample taken outside their thorax to the other side of the tympanum. In grasshoppers, cicadas, and all terrestrial vertebrates except mammals, the insides of the two tympana are connected by an airspace or air-filled sacs. The second sample is thus taken outside the tympanum of the opposite ear.
- In water, a number of fish use swim bladders or accessory air sacs the way terrestrial animals use tympana—to capture far-field sounds and convert them into oscillations of the cavity wall. Swim bladder movements are conveyed to the ears by small bones or intervening tissues. Toothed whales use fat-filled jawbones to convey far-field sounds to a thin bone in the ear that converts the pressure variations into vibrational motion. Baleen whales and true seals appear to capture sounds in various parts of their skeletons and convey them to ears that respond to bone-conducted sounds.
- Boundary-propagated sounds are usually coupled into receiver bodies through their legs. Tension and posture can be varied to improve resonance of the body relative to frequencies of interest. Spiders use slits and insects use pits in the exoskeleton that are then compressed and expanded as a result of the coupled vibrations. These arthropods also use stretch receptors and blood movement detectors to monitor leg and body vibrations. Frogs absorb seismic signals through their bodies and convey them through a specialized muscle to their ears.
- Terrestrial animals use horns, articulated chains of bones, or successive membranes of decreasing size to modify captured sounds and reduce impedance mismatches between the ambient medium and their bodies (the impedance sketch after this summary shows how little energy crosses an unmatched boundary). While these devices can increase the sound energy delivered to inner ears, they usually have their own resonant properties that may constrain the animal’s auditory range and resolution.
- Arthropod and vertebrate auditory mechanoreceptors all have dendrites with a ciliary component. Vertebrate receptors accompany this kinocilium with multiple stereocilia. Bending, compressing, or stretching these mechanoreceptors constitutes the primary transduction step for hearing. Both types of sensors can be extremely sensitive: threshold movements as small as or smaller than the diameters of single atoms are common in both groups.
- Many insects use their ears largely to detect and avoid echolocating bats. They tend to have simple ears tuned to ultrasonic frequencies and few adaptations for frequency resolution. Taxa that use their ears for intraspecific communication usually have more sophisticated mechanisms for breaking complex sounds down into separate frequency bands and assessing the relative amplitude of each band. They thus perform some level of Fourier analysis (see the band-energy sketch after this summary). Broad frequency ranges and good frequency resolution can be achieved simultaneously by the presence of many sensory cells, each tuned to a different subset of the overall frequency range. In tonotopic ears, the sensory cells are arrayed in order of their resonant characteristic frequencies. Tonotopy reduces interference between stimulated cells with very different characteristic frequencies. In many tonotopic ears, the sensory cells are covered with a tectorial membrane that ensures that all cells in a given band are stimulated simultaneously. In both arthropods and vertebrates, some sensory cells physically vibrate when stimulated; this can stimulate adjacent cells with similar characteristic frequencies, leading to auditory amplification. Frequency resolution can be further refined for lower frequencies through phase locking, in which nerve impulses being sent to the brain by the sensory neurons are synchronized with peaks in the waveform of the relevant sound frequency.
- Dynamic range is largely set by the lowest-amplitude signals that can be detected. Although arthropod and vertebrate mechanoreceptors are so sensitive that the potential dynamic range for animals should be enormous, ambient noise is often sufficiently high that it sets the floor for the effective dynamic range. Amplitude resolution, like frequency resolution, is largely limited by the number of sensory cells that are tuned to the same frequency but at different amplitude thresholds. In insects, amplitudes must differ by 1–2 dB at favored frequencies to be discriminated; equivalent thresholds for birds and mammals are 0.8–4 dB (the decibel sketch after this summary converts these values into pressure ratios).
- There is a trade-off in most auditory systems between frequency and temporal resolutions. Sensory cells that are narrowly tuned (large Q) will have worse temporal resolution than more broadly tuned and rapidly damped cells (see the resonator sketch after this summary). In many cases, brain processes (e.g., phase-locking) can be invoked to improve temporal resolution. In practice, separate acoustic events will be discriminated by insects if they are separated by at least 4–8 msec; birds and mammals require intervening intervals of only 1–4 msec.
- Animals can use any of four different kinds of information to identify the azimuth and altitude of a sound source. In near fields, hearing organs can often use the direction of molecular motion in the medium to estimate both azimuth and altitude. Time delays in the arrival of a far-field sound at two or more hearing organs are often used either behaviorally (by rotating until the delay is zero) or computationally (by the brain) to identify the source azimuth. This cue becomes less useful as receiver size decreases (see the time-delay sketch after this summary). Amplitude differences provide a third type of far-field cue, and azimuth can again be identified using turning or brain computation. Animals larger than relevant wavelengths rely on diffraction to create amplitude differences at two or more hearing organs; animals smaller than relevant wavelengths use one or more pressure differential organs that are inherently directional. The fourth cue, changes in far-field frequency spectra at the ear as a function of source angle, is used by many mammals and by some birds to estimate the altitude of a sound source.
- Both sender constraints and receiver constraints contribute to the widely found inverse relationship between an animal’s body size and the sound frequencies it uses for communication.
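The following sketch illustrates the spreading-loss point above. It is a minimal example rather than material from the chapter: it simply evaluates the spherical-spreading relation, in which pressure amplitude falls as 1/r, so the loss relative to a reference distance r0 is 20·log10(r/r0), about 6 dB per doubling of distance and the same for every frequency component.

```python
# Minimal sketch (assumed distances): spherical spreading loss in decibels.
import math

def spreading_loss_db(r, r0=1.0):
    """Transmission loss (dB) from spherical spreading between r0 and r (same units)."""
    return 20.0 * math.log10(r / r0)

for r in (1, 2, 4, 8, 16, 32):  # hypothetical distances in meters
    print(f"{r:>3} m: {spreading_loss_db(r):5.1f} dB")
```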
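To make the notion of a frequency response concrete, this sketch compares the spectrum of a synthetic click at the sender with its spectrum after passing through a simple smoothing filter that stands in for the propagation path. The sample rate, the filter, and its strength are assumptions chosen only to mimic the faster attenuation of high frequencies described above.

```python
# Minimal sketch (synthetic example): estimate a path's frequency response by
# comparing the spectrum at the sender with the spectrum at the receiver.
import numpy as np

fs = 44100                       # sample rate (Hz), assumed
sent = np.zeros(fs)
sent[0] = 1.0                    # an idealized broadband click at the sender

# Crude stand-in for the medium: first-order smoothing attenuates high
# frequencies more strongly than low ones (mimicking heat loss and scatter).
alpha = 0.2
received = np.empty_like(sent)
acc = 0.0
for i, x in enumerate(sent):
    acc = alpha * x + (1.0 - alpha) * acc
    received[i] = acc

freqs = np.fft.rfftfreq(len(sent), d=1 / fs)
response_db = 20 * np.log10(np.abs(np.fft.rfft(received)) /
                            np.abs(np.fft.rfft(sent)))

for f in (100, 1000, 5000, 20000):
    i = np.argmin(np.abs(freqs - f))
    print(f"{freqs[i]:7.0f} Hz: {response_db[i]:6.1f} dB")
```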
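This two-ray sketch illustrates boundary interference near the ground. Source height, receiver height, range, and the idealized reflection coefficient of -1 (a 180-degree phase shift, roughly appropriate for grazing incidence over soft ground) are assumptions, and the ground wave that partially restores the lowest frequencies is deliberately omitted, so the model shows only the cancellation side of the effect.

```python
# Minimal sketch (assumed geometry): two-ray interference over a reflecting
# ground. The received field is direct + R * reflected, with R = -1.
import numpy as np

c = 343.0                        # speed of sound in air (m/s)
hs, hr, d = 1.0, 1.0, 50.0       # source height, receiver height, range (m), assumed
r_direct = np.hypot(d, hs - hr)
r_reflect = np.hypot(d, hs + hr)   # path via the image source below the ground
R = -1.0                           # idealized soft-ground reflection coefficient

freqs = np.array([50, 100, 200, 500, 1000, 2000, 4000], dtype=float)
k = 2 * np.pi * freqs / c
total = (np.exp(-1j * k * r_direct) / r_direct +
         R * np.exp(-1j * k * r_reflect) / r_reflect)
rel_db = 20 * np.log10(np.abs(total) * r_direct)   # relative to the direct path alone

for f, db in zip(freqs, rel_db):
    print(f"{f:6.0f} Hz: {db:6.1f} dB re direct path")
```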
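For an idealized waveguide with a pressure-release upper surface and a rigid bottom, the standard result for the lowest normal mode is a cutoff frequency of c/(4 × depth). The sketch below evaluates this formula for a few hypothetical water depths; real channels with soft or sloping bottoms behave less simply.

```python
# Minimal sketch (idealized waveguide): lowest-mode cutoff frequency for a
# shallow channel with a pressure-release top and a rigid bottom.
def cutoff_frequency_hz(depth_m, c=1500.0):
    """Cutoff (Hz) of the lowest normal mode, f = c / (4 * depth)."""
    return c / (4.0 * depth_m)

for depth in (0.5, 1, 5, 20, 100):   # hypothetical water depths in meters
    print(f"depth {depth:6.1f} m -> cutoff {cutoff_frequency_hz(depth):7.1f} Hz")
```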
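This dispersion sketch assumes the usual scaling for bending waves, in which propagation speed grows roughly with the square root of frequency. The reference speed and the sender-to-receiver distance are invented for illustration; the point is only that different frequency components of a single signal arrive at noticeably different times.

```python
# Minimal sketch (assumed parameters): dispersion of bending waves in a plant
# stem, where speed scales roughly with sqrt(frequency).
import math

def bending_wave_speed(freq_hz, c_ref=40.0, f_ref=100.0):
    """Phase speed (m/s), assuming c = c_ref * sqrt(f / f_ref); values are illustrative."""
    return c_ref * math.sqrt(freq_hz / f_ref)

distance = 0.5   # sender-to-receiver distance along the stem (m), assumed
for f in (100, 400, 1600):
    v = bending_wave_speed(f)
    print(f"{f:5d} Hz travels at {v:6.1f} m/s, arriving after {1e3 * distance / v:5.1f} ms")
```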
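The cost of an impedance mismatch can be illustrated with the standard normal-incidence transmission formula T = 4·Z1·Z2/(Z1 + Z2)². Using approximate impedances for air and water-like tissue, only about 0.1% of the incident sound energy (roughly a 30 dB loss) would cross the boundary without matching structures of the kind described above.

```python
# Minimal sketch (textbook formula): fraction of sound energy transmitted
# across a boundary between two media at normal incidence.
import math

def transmitted_fraction(z1, z2):
    """Intensity transmission coefficient T = 4*Z1*Z2 / (Z1 + Z2)**2."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

z_air = 415.0        # Pa*s/m, approximate acoustic impedance of air
z_tissue = 1.5e6     # Pa*s/m, roughly that of water / soft tissue

t = transmitted_fraction(z_air, z_tissue)
print(f"energy transmitted: {100 * t:.2f}%  ({10 * math.log10(t):.1f} dB)")
```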
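As a crude stand-in for tonotopic frequency analysis, this sketch splits a synthetic two-tone signal into a few fixed frequency bands and reports the energy in each, much as an array of differently tuned sensory cells would. The sample rate, tone frequencies, and band edges are arbitrary choices.

```python
# Minimal sketch (assumed bands): coarse "tonotopic" analysis in which each
# band-limited channel reports the energy of a complex sound in its own band.
import numpy as np

fs = 8000                               # sample rate (Hz), assumed
t = np.arange(fs) / fs                  # 1 s of signal
sound = (np.sin(2 * np.pi * 300 * t) +           # low-frequency component
         0.5 * np.sin(2 * np.pi * 2200 * t))     # weaker high-frequency component

spectrum = np.abs(np.fft.rfft(sound)) ** 2
freqs = np.fft.rfftfreq(len(sound), d=1 / fs)

bands = [(100, 500), (500, 1000), (1000, 2000), (2000, 3000)]  # hypothetical channels
for lo, hi in bands:
    energy = spectrum[(freqs >= lo) & (freqs < hi)].sum()
    print(f"{lo:4d}-{hi:4d} Hz channel: relative energy {energy:.2e}")
```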
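For orientation, the just-discriminable amplitude differences quoted above can be converted into pressure ratios with the standard relation ratio = 10^(dB/20):

```python
# Minimal sketch (standard decibel conversion): what small dB differences
# mean as changes in sound pressure.
for db in (0.8, 1.0, 2.0, 4.0):
    ratio = 10 ** (db / 20)
    print(f"{db:3.1f} dB difference = {100 * (ratio - 1):4.1f}% change in sound pressure")
```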
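The frequency-time trade-off can be made concrete with the standard resonator relations: a cell tuned to f0 with quality factor Q has a bandwidth of about f0/Q and a ring-down time constant of about Q/(π·f0). The characteristic frequency below is an arbitrary example.

```python
# Minimal sketch (standard resonator relations): sharper tuning (higher Q)
# means narrower bandwidth but longer ringing, i.e., poorer time resolution.
import math

def bandwidth_hz(f0, q):
    """Approximate -3 dB bandwidth of a resonator with quality factor q."""
    return f0 / q

def ringdown_ms(f0, q):
    """Approximate 1/e decay time of free oscillations, in milliseconds."""
    return 1e3 * q / (math.pi * f0)

f0 = 2000.0   # characteristic frequency (Hz), assumed
for q in (2, 10, 50):
    print(f"Q={q:3d}: bandwidth {bandwidth_hz(f0, q):7.1f} Hz, "
          f"ring-down ~{ringdown_ms(f0, q):5.2f} ms")
```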
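For the time-delay cue, a simple two-receiver model gives an arrival-time difference of roughly (d/c)·sin(θ) for a far-field source at azimuth θ, where d is the separation of the two ears. The ear separations below are rough, assumed values; the point is how quickly the delay shrinks with receiver size.

```python
# Minimal sketch (simple two-receiver model): interaural time difference
# delta_t ~ (d / c) * sin(theta) for a far-field source at azimuth theta.
import math

def itd_microseconds(ear_separation_m, azimuth_deg, c=343.0):
    """Arrival-time difference (microseconds) between two ears separated by d."""
    return 1e6 * (ear_separation_m / c) * math.sin(math.radians(azimuth_deg))

for d in (0.15, 0.02, 0.002):   # roughly human-sized, small bird, small insect (m), assumed
    print(f"ear separation {d * 100:5.1f} cm: "
          f"ITD at 45 deg = {itd_microseconds(d, 45):7.1f} microseconds")
```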