Chapter 16 Web Topics
16.1 Social Microbes
Introduction
Microbes of two completely unrelated taxa have evolved a convergent system of aggregating, migrating, and forming fruiting bodies: the myxobacteria and the social amoebae. Their similar soil environment is undoubtedly responsible for this convergence. In any seasonal environment, the soil periodically dries up and becomes inhospitable for many organisms. To survive, organisms must either move, or reproduce and send out airborne or otherwise mobile propagules. Reproduction via fruiting bodies invariably requires some individuals to altruistically sacrifice their reproduction to build the fruiting structure. Thus aggregations should be restricted to kin, and mechanisms should evolve for identifying and selectively joining kin groups. The web sites listed here are from various laboratories studying these organisms, and show the remarkable movement patterns of these very small entities.
Myxobacteria
The best studied social bacterium, Myxococcus xanthus, engages in cooperative swarming, predation, and multicellular development. The cells are normally solitary, but aggregate when they run low on amino acids, their primary food source. The aggregations have been likened to wolf packs, as they collectively cover prey organisms and secrete toxic and lytic compounds to degrade and digest the prey.
Suggested sites
- General information on Myxococcus xanthus
https://en.wikipedia.org/wiki/Myxococcus_xanthus
- Zusman Laboratory, University of California, Berkeley
http://mcb.berkeley.edu/labs/zusman/zusman%20lab%20web%20page/Myxococcus%20xanthus.html
- YouTube videos
http://www.youtube.com/watch?v=0ALM7X1_LqA
https://www.youtube.com/watch?v=ZHGEi2JzXso
https://www.youtube.com/watch?v=O1jPzhz1Qyc
https://www.youtube.com/watch?v=UF6FDuXHSt4
https://www.youtube.com/watch?v=tstc6doiNCU
Social amoebae
The social amoebae, or cellular slime molds (Dictyostelium spp.), are single-celled eukaryotes. They initiate aggregation in response to poor environmental conditions: a core group of cells sends out waves of an aggregating chemical signal, cyclic AMP (cAMP), and surrounding cells stream toward the source. The aggregation develops through several stages to become a slug, which can move to a better location. The slug can then form a fruiting body that raises the spore mass above the soil surface, where the spores can adhere to passing mobile animals and be dispersed.
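The aggregation step can be pictured with a toy chemotaxis model. The Python sketch below is our own illustration, not a model from the Dictyostelium literature: it ignores the pulsatile relay of the cAMP signal and simply lets cells climb a static cAMP gradient centered on a signaling group, which is enough to collapse a scattered population toward an aggregation center.

# Toy model: amoebae chemotax up a cAMP gradient toward a signaling center.
# Illustrative sketch only; real aggregation also involves relayed cAMP pulses.
import numpy as np

rng = np.random.default_rng(0)
center = np.array([0.0, 0.0])           # cells at the aggregation center emit cAMP
cells = rng.uniform(-50, 50, (200, 2))  # 200 amoebae scattered in a 100 x 100 field

def camp(pos):
    """cAMP concentration: highest at the center, decaying with distance."""
    return np.exp(-np.linalg.norm(pos - center, axis=-1) / 20.0)

for step in range(500):
    # Each cell samples the local gradient by finite differences.
    eps = 1.0
    gx = camp(cells + [eps, 0]) - camp(cells - [eps, 0])
    gy = camp(cells + [0, eps]) - camp(cells - [0, eps])
    grad = np.stack([gx, gy], axis=1)
    norm = np.linalg.norm(grad, axis=1, keepdims=True) + 1e-9
    # Move a small step up the gradient, plus random motility noise.
    cells += 0.5 * grad / norm + rng.normal(0, 0.2, cells.shape)

print("mean distance to center:", np.linalg.norm(cells - center, axis=1).mean())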
Suggested sites
- Firtel Laboratory, University of California San Diego
http://www-biology.ucsd.edu/~firtel/
- DictyBase website
http://dictybase.org/Multimedia/index.html
- DictyBase videos
http://dictybase.org/Multimedia/development/development.html
16.2 Plant Movement
Introduction
Plants are in constant motion, but they live on a different time scale from that of animals. They move as they grow, search for light and nutrients, avoid predators, exploit neighbors, and reproduce. Time-lapse photography speeds up this movement so we can view it on our time scale. These websites have some remarkable video clips of plant motion.
Suggested sites
- Plants-in-Motion, by Roger Hangarter
This is a well-organized website with many examples of plant movements, including germination, photomorphogenesis, tropisms, nastic movements such as trap closing by the Venus flytrap, circadian responses, general growth, and flower movement (be patient, it takes a minute to load).
http://plantsinmotion.bio.indiana.edu/plantmotion/starthere.html
- Dnatube site
http://www.dnatube.com/video/3359/Plant-Growth-and-Movement
- Mimosa on Wikipedia
http://en.wikipedia.org/wiki/Mimosa_pudica
- Vine searching for support
http://www.youtube.com/watch?v=sLHeFmLJoLg
Highly recommended reading:
Koller, D. 2011. The Restless Plant. Cambridge, MA: Harvard University Press.
16.3 Body Language
Introduction
Humans transmit a great deal of information to others through their body language—gestures, postures, and body movements. The sending and receiving of these signals are largely unconscious, although when pressed, people can verbalize their perceptions and evaluations of body language signals, and senders can sometimes voluntarily modulate their signals. It is very challenging to conduct objective studies of the perception and evaluation of body language in humans because perceivers are strongly influenced by other aspects of the sender’s appearance, such as skin color, facial features, gender, height, and clothing. Over the years, innovative research techniques have been devised to overcome this problem. This Web Topics unit focuses on studies of human nonverbal communication, especially body movements, that have employed these techniques. We first briefly describe the techniques, and then highlight some of the most interesting recent studies that used them to examine human gait patterns, dancing, dominance postures, and speech-associated gestures. The interested reader can find more detailed information on this topic in the recent reviews by Hugill et al. (2010) and Fink et al. (2014).
Overview of techniques
The earliest technique used to obtain objective evaluations of body movement patterns is called the point-light (PL) technique. This simple but elegant and effective technique was developed by Johansson (1973, 1976). Approximately 10–13 small lights are placed on the major joints of the body (ankles, knees, hips, shoulders, elbows, wrists, and head), and ambient lighting is dimmed so that only the point lights are visible while subjects perform prescribed movements. Human observers are remarkably good at extracting information about a walker’s gender, age, emotion, and activity from these sparse displays, as long as the subjects are moving (Barclay et al. 1978; Bertenthal and Pinto 1994).
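To make the nature of a point-light stimulus concrete, the Python sketch below synthesizes frame-by-frame coordinates for a 13-point walker from simple sinusoidal joint motion. This is an illustrative toy of ours, not Johansson’s procedure (he filmed lights attached to real walkers); an animation routine that draws only these dots produces the kind of display used in PL experiments.

# Minimal point-light walker: 13 markers whose 2D positions are generated from
# simple sinusoidal joint motion. Illustrative only, not from the cited studies.
import numpy as np

def walker_frame(t, freq=1.0, swing=0.4):
    """Return a (13, 2) array of point-light positions at time t (seconds)."""
    phase = 2 * np.pi * freq * t
    hip_y, shoulder_y, head_y = 1.0, 1.5, 1.7
    pts = {
        "head":       ( 0.00, head_y),
        "l_shoulder": (-0.20, shoulder_y), "r_shoulder": (0.20, shoulder_y),
        "l_elbow":    (-0.25 + 0.15 * np.sin(phase), 1.25),
        "r_elbow":    ( 0.25 - 0.15 * np.sin(phase), 1.25),
        "l_wrist":    (-0.25 + 0.30 * np.sin(phase), 1.05),
        "r_wrist":    ( 0.25 - 0.30 * np.sin(phase), 1.05),
        "l_hip":      (-0.12, hip_y), "r_hip": (0.12, hip_y),
        "l_knee":     (-0.12 + swing * np.sin(phase), 0.55),
        "r_knee":     ( 0.12 - swing * np.sin(phase), 0.55),
        "l_ankle":    (-0.12 + swing * np.sin(phase - 0.5), 0.10),
        "r_ankle":    ( 0.12 - swing * np.sin(phase - 0.5), 0.10),
    }
    return np.array(list(pts.values()))

# Three seconds of frames at 30 Hz; feed these to any plotting/animation routine.
frames = np.stack([walker_frame(t) for t in np.arange(0, 3, 1 / 30)])
print(frames.shape)   # (90, 13, 2)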
Once graphical computer software became widely available to researchers, more sophisticated techniques became possible. Video motion-capture technology was first developed for sports training applications to study detailed body movements (Josefsson et al. 1996). Subjects wear tight (or little) clothing, and up to 40 small infrared reflectors are attached to key joints and body regions. Eight to twelve infrared high-speed video cameras simultaneously record the performer. Software programs are then used to extract the trajectory of each reflector during the movement episode. Vicon™ (Oxford, UK) is a widely used commercial product that provides both the cameras and the analysis software. If only an analysis of movement patterns is required, these trajectories can be submitted directly to statistical analysis. However, for behavioral research on the signal value of movements, another software program is employed to transform the trajectories into a video clip of a moving three-dimensional human-like figure. The kinematics of the movement are derived from a model of the constrained movements of the human skeleton, with the joint trajectories as input. In some cases, the figure is a jointed stick-like structure or wooden manikin, whereas in more recent studies a smoothed human form is constructed, popularly called an avatar. The great advantage of this technique is that the movement patterns can be experimentally modified to test hypotheses about the critical components of specific body movements.
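In practice, submitting trajectories to statistical analysis means reducing each marker’s x, y, z time series to a few summary descriptors (speed, amplitude, variability, and so on). The sketch below assumes a hypothetical CSV export; the column names, frame rate, and feature definitions are ours, not Vicon’s or those of any cited study.

# Sketch: turning marker trajectories into summary kinematic features.
# Assumes a hypothetical CSV with columns: frame, marker, x, y, z.
import numpy as np
import pandas as pd

def movement_features(csv_path, fps=120):
    df = pd.read_csv(csv_path)
    features = {}
    for marker, traj in df.groupby("marker"):
        xyz = traj.sort_values("frame")[["x", "y", "z"]].to_numpy()
        vel = np.diff(xyz, axis=0) * fps                  # velocity per axis
        speed = np.linalg.norm(vel, axis=1)
        features[f"{marker}_mean_speed"] = speed.mean()
        features[f"{marker}_amplitude"] = np.ptp(xyz, axis=0).max()  # largest excursion
        features[f"{marker}_variability"] = speed.std()
    return pd.Series(features)

# Usage: one row of features per recorded walker or dancer, ready for statistics.
# feats = pd.DataFrame([movement_features(f) for f in trial_files])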
Some highly sophisticated video-capture and software systems are now able to extract the movement kinematics directly from the video recording without the need for the subject to wear any reflectors (Bente 1989; Bente et al. 2001; Bente et al. 2008). This is advantageous when studying conversational interactions, where wearing tight clothing and reflectors can be inhibiting for the actors. The avatar directly mimics the postures and movements of the video-recorded interactants. A final strategy is to link fine-scale gestures of speaking subjects to the spoken words, an application designed to make animated figures in computer games show more realistic movement patterns while speaking phrases of text (Levine et al. 2009; Levine et al. 2010).
Analysis of gait patterns
Point-light studies of walking humans have shown that perceivers can derive a great deal of socially relevant information from this highly simplified form of visual input. For example, observers can estimate a walker’s age from gait qualities, with younger individuals’ gaits perceived as happier and more powerful (Montepare and Zebrowitz-McArthur 1988). The gender of a walker is easily distinguished by the type of lateral sway: males tend to swing their shoulders from side to side more than their hips, whereas women swing their hips more than their shoulders (Barclay et al. 1978; Mather and Murdoch 1994; Sumi 2000). Sexual orientation is also distinguishable from simple point-light displays of walkers (Ambady et al. 1999). Individual identity can often be recognized from point-light depictions based on differences in stride length, bounce, rhythm, speed, and swagger, and observers can usually identify their own point-light image (Stevenage et al. 1999; Loula et al. 2005). Various types of action, such as different modes of locomotion, instrumental actions, and social activities such as playing, chasing, fighting, courting, and following, can also be distinguished from simplified visual input (Johansson 1976; Dittrich 1993; Norman et al. 2004; Barrett et al. 2005). Even two-day-old human infants can discriminate biological from non-biological point-light animations (Simion et al. 2008). These studies lead to the conclusion that humans possess intrinsic visual and cognitive adaptations that enable them to discriminate human and animal motion and to infer the identity and actions of other people from a distance using gait patterns.
It has long been known that emotional feelings are reflected in a person’s gait, along with facial expressions and other nonverbal body postures and gestures. In psychological studies with actor–walkers, evaluators have been able to identify sadness, anger, happiness, and pride from gait information at better than chance levels. Features such as the amount of arm swing, stride length, heavy-footedness, vertical head position, and walking speed were the cues found to differentiate these emotions (Montepare et al. 1987; de Meijer 1989; Sogon and Masutani 1989; Schouwstra and Hoogstraten 1995; Wallbott 1998; Montepare et al. 1999; Coulson 2004; Michalak et al. 2009). Studies based on computer-animated avatars have been able to zero in on the precise body movements that characterize different emotions. Roether et al. (2009) employed the Vicon motion-capture technique to extract the key joint trajectories that differentiate anger, happiness, sadness, and fear gaits. Observers were able to correctly classify these four gaits with 70 to 90% accuracy. Multivariate analyses found that relatively few postural and kinematic features of gait were required to characterize these emotions: head inclination distinguished sadness from the others, elbow flexion characterized both fear and anger, speed of movement was high for anger and happiness and low for fear and sadness, upper arm retraction and knee flexion were increased for fear, and hip and shoulder movement distinguished anger and happiness. Avatar stimuli with only these few diagnostic features were then generated and presented to observers, who were able to correctly identify the emotions nearly as well as with the stimuli obtained from natural walkers. Movies from this study can be viewed at the website given below. In a similar study, Karg et al. (2010) found that emotions can be very well differentiated both in terms of basic emotional categories and in terms of the three emotional axes—valence (pleasure), arousal (activation), and potency (power or degree of control), as illustrated in main text Figure 16.26. Kapur et al. (2005) also demonstrated excellent emotion-category classification by both human observers and machine learners. These studies highlight how important the signaling of emotion is in humans; gait-related components are visible at a greater distance than facial components and provide early information on the intentions of an approaching person. The full text of the Roether et al. (2009) article with supplemental material can be accessed here: http://jov.arvojournals.org/article.aspx?articleid=2204009#media-1.
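The multivariate classification step in studies such as Roether et al. (2009) and Karg et al. (2010) can be illustrated with a linear discriminant analysis on a small set of gait descriptors. The sketch below uses synthetic data loosely keyed to the diagnostic features named above; the feature values and the resulting accuracy are purely illustrative and do not come from either study.

# Sketch: classifying emotion from a few gait descriptors with LDA.
# Synthetic, illustrative data only; feature names follow the text above.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
emotions = ["anger", "happiness", "sadness", "fear"]
# Rough prototypes: [head_inclination, elbow_flexion, gait_speed, knee_flexion]
prototypes = {
    "anger":     [0.1, 0.8, 1.6, 0.3],
    "happiness": [0.0, 0.3, 1.5, 0.3],
    "sadness":   [0.8, 0.2, 0.7, 0.2],
    "fear":      [0.2, 0.7, 0.8, 0.7],
}
X = np.vstack([rng.normal(prototypes[e], 0.15, (30, 4)) for e in emotions])
y = np.repeat(emotions, 30)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)   # cross-validated classification accuracy
print(f"cross-validated accuracy: {scores.mean():.2f}")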
Although women do not give obvious signals of fertility, they may produce some subtle fertility cues that are detectable and highly attractive to men (Morris and Udry 1970; Doty et al. 1975; Roberts et al. 2004). Another relevant gait study asked whether women in the fertile versus non-fertile stage of their menstrual cycle differed in gait, and if so, whether men expressed any preferences (Provost et al. 2008). This study employed the Vicon motion-capture system to extract information on joint trajectories during normal walking. Women using birth control pills performed one walking session, while subjects not taking pills performed two walking sessions, one during the luteal phase (non-fertile) and another during the follicular phase (the optimal fertile period, 14–16 days before the start of the next menstrual period). A discriminant function analysis on the combined trajectory variables found a significant overall difference between fertile and non-fertile women, and 71% of women were correctly classified. Although hypothetical extreme walkers showed visible differences in the lateral distances between the knee and ankle joints and in hip movement, no significant difference was found between fertile and non-fertile women for these three body regions considered individually; either the differences were very subtle, or subjects differed in which body region varied as a function of cycle phase. In the male perception experiment, point-light displays of the women not taking birth control pills were rendered and presented as stimuli in random order, and men scored the attractiveness of the walkers on a six-point scale. Women in the non-fertile stage were rated higher than women in the fertile stage. Thus, although other studies have shown that men can detect, and find attractive, certain cues given by fertile women, such cues are probably detectable only by an intimate partner; women may subconsciously make a broadly advertised cue such as walking more attractive when they are not fertile, thereby avoiding unwanted suitors when they are fertile.
Dance performance
The performance of vigorous courtship displays is an important mate choice criterion for many animals (see Behavioral Display section of main text Chapter 12 and review by Byers et al. 2010). Even Darwin (1871, 1872) proposed that dancing was a sexually selected courtship signal in humans that could reveal an individual’s quality. As argued in the main text, movement displays have a greater potential to reveal underlying aspects of genetic and phenotypic quality of the sender because they require the integration of many morphological and physiological pathways and more developmental genes than is the case for most structural display ornaments. In humans, dancing is a set of intentional rhythmic body movements generally considered to play a role in sexual attraction for both sexes. Dancing likely evolved in parallel with bipedalism, which facilitates a greater range of movement possibilities than found in quadrupeds. The torso can twist and bend, the arms can throw and swing in many directions, the legs can assume a variety of positions, and the head can swivel and nod in many ways, permitting a wide range of dynamic creative movements (Sheets–Johnstone 2005). For men in particular, dancing ability could signal aspects of male quality such as physical strength, agility, flexibility, and neurocognitive coordination (Hugill et al. 2010; Bläsing et al. 2012). Several studies have employed computer-animated models to discover possible underlying determinants of dance performance while removing all other aspects of physical appearance.
One series of studies examined the hypothesis that personality traits are reflected in a dancer’s moves. Luck et al. (2009) had participants complete the Big Five personality inventory (Digman 1990), which measures the five primary personality dimensions (openness, conscientiousness, extraversion, agreeableness, and neuroticism). Participants were then recorded with a point-light motion-capture system while dancing to a 30-sec musical selection. A few correlations were found between certain personality traits and specific movement patterns. Neuroticism was positively correlated with acceleration of the feet and jerky movements of the central body. In a larger follow-up study, dancers were given six musical clips of different musical genres (Luck et al. 2010). Again, neuroticism was associated with jerky and accelerated movements, especially of the head, hands, and feet. In addition, extraverts exhibited higher movement speeds of the head, hands, and central body. Thus dancing moves can reflect at least some aspects of the performer’s personality traits. Fink et al. (2012) then asked whether female observers of male dancers could detect a dancer’s personality. These researchers generated humanoid avatar stimuli from male dancers wearing 38 reflectors and recorded with 12 tracking cameras; the reconstructed 3D animated avatars contained no cues of body mass, height, or facial features. The male dancers were previously scored for the Big Five traits. Women judges were asked to describe the dancer’s personality. Perceived male dancing ability was significantly positively correlated with the dancers’ conscientiousness and agreeableness. Moreover, extraversion was marginally positively correlated with dance quality perception, and neuroticism and openness were negatively correlated with dance quality perception. Since females do react differentially to certain types of male personalities, these studies tentatively concluded that male dance movements can play a role in female mate preferences. See also Saarikallio et al. (2013) and Luck et al. (2014).
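Analyses of this kind boil down to correlating per-dancer movement descriptors with questionnaire scores. A minimal sketch of such a correlation analysis is given below, using invented data and descriptor names rather than those of the Luck et al. studies.

# Sketch: correlating personality scores with movement descriptors.
# Data and column names are invented for illustration.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 60   # hypothetical number of dancers
df = pd.DataFrame({
    "neuroticism":  rng.normal(0, 1, n),
    "extraversion": rng.normal(0, 1, n),
})
# Fake movement descriptors loosely tied to the traits, plus noise.
df["head_jerkiness"] = 0.5 * df["neuroticism"] + rng.normal(0, 1, n)
df["hand_speed"]     = 0.5 * df["extraversion"] + rng.normal(0, 1, n)

for trait in ["neuroticism", "extraversion"]:
    for move in ["head_jerkiness", "hand_speed"]:
        r, p = pearsonr(df[trait], df[move])
        print(f"{trait} vs {move}: r = {r:+.2f}, p = {p:.3f}")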
Another study of dancing in humans by Neave et al. (2011) again focused on female evaluation of male dancers and attempted to discover the specific moves that females found attractive. This study employed similar recording technology to the humanoid avatar study described above. There were 19 male subjects, who danced to a core drumbeat to remove any impact of music likability, and 37 heterosexual female evaluators, who rated the dances on a seven-point scale. The biomechanical trajectory data from the reflectors were used to quantify different aspects of the movement, such as the amplitude, speed, duration, and variability of different limb movements. Several key movement measures were correlated with dance quality: variability and amplitude of neck and trunk movements, and speed of movements of the right knee. These results pointed to greater attractiveness of more vigorous dancing patterns, again supporting the conclusion that dancing reveals potentially useful information about dancer quality. Video clips of the stimuli can be viewed here:
Good dancing avatar: https://www.youtube.com/watch?v=xMQ9HYDnplU
Bad dancing avatar: https://www.youtube.com/watch?v=gECcPvQ-zVc
A final series of studies asked whether dance moves reflected the physical strength of the dancer. Hugill et al. (2009) recorded dance movements of 40 heterosexual male students (all non-professional dancers) using a digital video camera. The subjects also provided a measure of handgrip strength, as well as physical measurements of height and weight. Fifty female judges rated 10 sec of the video clips on the basis of perceived attractiveness and assertiveness. Handgrip strength of male dancers was significantly positively correlated with female assessments of both attractiveness and assertiveness. Several subsequent studies have confirmed the association between physical strength and dancing ability (McCarty et al. 2013; Hufschmidt et al. 2015; Weege et al. 2015). If strength is an honest indicator of general health, then female perception of and preference for strength-indicating cues and signals revealed in dancing is an evolutionarily adaptive behavior.
Analysis of dominance interactions
The analysis of dyadic interactions between a dominant and a subordinate individual presents a special challenge because of the two-way effects of each participant on the other. We reviewed the definitions of power and some of the differences in body postures and conversational management between subordinates and dominants in main text Chapter 16. Research on the nonverbal signals given in such interactions is further confounded by cultural differences in the value people assign to the vertical dimension of social relations. Some cultures accept that inequalities in power and status are natural, so those with power tend to emphasize it and distinguish themselves from subordinates, while other cultures view power and status as man-made and artificial and therefore de-emphasize and share power (Storti 1999). Despite these different perceptions of the value of status, individuals from different cultures generally agree on the nonverbal signals that express power (Triandis and Gelfand 1998; Kowner and Wiseman 2003; Guerrero and Floyd 2006; Burgoon et al. 2010). To resolve these differences in the perception and evaluation of power from nonverbal signals, Bente et al. (2010) conducted a study of dominance interactions in three cultures (Germans, Americans, and Arabs) using culture-free video clips converted into avatars. Subjects from the three countries enacted a supervisor–employee conflict-resolution scenario while being video recorded. Specialized software extracted the body movements of both participants and recreated a one-minute section of the interaction as a video clip of two wooden manikin figures (Bente 1989; Bente et al. 2001; Bente et al. 2008). These clips were shown to receivers from the same and different countries. Status roles were distinguishable above chance only in the German sample. There was no evidence for an in-group advantage. Nevertheless, there were significant correlations in the ratings of dominance across observers, with a slightly stronger in-group effect. Microanalysis of movement behaviors demonstrated that certain behaviors predicted the dominance rating: vertical head posture and an open position of the upper and lower limbs. In Germany only, less movement activity and stretching of the lower limbs were perceived as more dominant. Germany is known to be a high-power-distance society, and America is known to be a lower-power-distance society, so the results for these countries met expectations. Arabs are a high-power-distance society, but they are also a collectivist society, so expressions of power could be repressed for that reason.
Other techniques have been developed to study conversational interactions with computer-generated faces. Wilms et al. (2010) used eye-tracking technology to generate a gaze-dependent avatar face whose behavior becomes responsive to being looked at by the participant. This allows the participant to engage in “online” interaction with the avatar in real time. In some applications of this technology, brain activity of the subject is simultaneously monitored with fMRI. Čereković and Pandžić (2011) developed what they call Embodied Conversation Agents, which graphically embody virtual characters that can engage in meaningful conversation with human participants. Multimodal data are extracted from the speech and nonverbal behaviors of subjects performing short two-second expressions. These clips form a large library, from which facial movements and emotions are synthesized. The goal of this complex project is to create believable and expressive virtual characters in order to enhance the communication abilities of machines. Similarly, Levine et al. (2009, 2010) extract correlations between spoken words and speech gestures in real speakers, and then use these associations to produce text-specific movements for animated figures in games. A video clip explaining the methods and objectives of this research can be found at the website of Levine et al. (2009) supplemental material (click on the Source Materials tab): http://dl.acm.org/citation.cfm?id=1618518
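The logic of a gaze-contingent avatar can be summarized as a simple closed loop: sample the participant’s gaze, test whether it falls on the avatar, and update the avatar’s behavior accordingly. The sketch below is schematic only; the eye-tracker and stimulus-update functions are stand-ins that we define ourselves, not the API of any real tracker or of the system used by Wilms et al. (2010).

# Schematic gaze-contingent loop: the avatar's behavior depends on whether the
# participant is currently looking at it. All functions here are stand-ins.
import random
import time

AVATAR_REGION = (0.4, 0.6, 0.4, 0.6)    # normalized screen box (x0, x1, y0, y1)

def read_gaze():
    """Stand-in for an eye-tracker sample: a random normalized screen position."""
    return random.random(), random.random()

def update_avatar(being_looked_at):
    """Stand-in for a stimulus update, e.g. avatar returns gaze or looks away."""
    print("avatar returns gaze" if being_looked_at else "avatar looks away")

for _ in range(10):                      # in practice this loop runs at display rate
    x, y = read_gaze()
    x0, x1, y0, y1 = AVATAR_REGION
    update_avatar(x0 <= x <= x1 and y0 <= y <= y1)
    time.sleep(0.05)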
Further reading
Byers, J., E. Hebets and J. Podos. 2010. Female mate choice based upon male motor performance. Animal Behaviour 79: 771–778.
Fink, B., B. Weege, N. Neave, B. Ried, and O. Cardoso Do Lago. 2014. Female perceptions of male body movements. In Evolutionary Perspectives on Human Sexual Psychology and Behavior (Weekes–Shackelford, V. and T. K. Shackelford, eds.). Berlin: Springer, pp. 299–324.
Fink, B., B. Weege, N. Neave, M. N. Pham, and T. K. Shackelford. 2015. Integrating body movement into attractiveness research. Frontiers in Psychology 6: 220.
Hugill, N., B. Fink and N. Neave. 2010. The role of human body movements in mate selection. Evolutionary Psychology 8: 66–89.
Pentland, A. 2008. Honest Signals: How They Shape Our World. Cambridge, MA: MIT Press.
Sheets–Johnstone, M. 2005. Man has always danced: Forays into the origins of an art largely forgotten by philosophers. Contemporary Aesthetics http://www.contempaesthetics.org/newvolume/pages/article.php?articleID=273.
Literature cited
Ambady, N., M. Hallahan and B. Conner. 1999. Accuracy of judgments of sexual orientation from thin slices of behavior. Journal of Personality and Social Psychology 77: 538–547.
Barclay, C.D., J.E. Cutting and L.T. Kozlowski. 1978. Temporal and spatial factors in gait perception that influence gender recognition. Perception and Psychophysics 23: 145–152.
Barrett, H.C., P.M. Todd, G.F. Miller and P.W. Blythe. 2005. Accurate judgments of intention from motion cues alone: A cross–cultural study. Evolution and Human Behavior 26: 313–331.
Bente, G. 1989. Facilities for the graphical computer simulation of head and body movements. Behavior Research Methods Instruments and Computers 21: 455–462.
Bente, G., N.C. Kramer, A. Petersen and J.P. de Ruiter. 2001. Computer animated movement and person perception: Methodological advances in nonverbal behavior research. Journal of Nonverbal Behavior 25: 151–166.
Bente, G., H. Leuschner, A. Al Issa and J.J. Blascovich. 2010. The others: Universals and cultural specificities in the perception of status and dominance from nonverbal behavior. Consciousness and Cognition 19: 762–777.
Bente, G., M. Senokozlieva, S. Pennig, A. Al–Issa and O. Fischer. 2008. Deciphering the secret code: A new methodology for the cross–cultural analysis of nonverbal behavior. Behavior Research Methods 40: 269–277.
Bertenthal, B.I. and J. Pinto. 1994. Global processing of biological motions. Psychological Science 5: 221–225.
Bläsing, B., B. Calvo–Merino, E. S. Cross, C. Jola, J. Honisch, and C. J. Stevens. 2012. Neurocognitive control in dance perception and performance. Acta Psychologica 139: 300–308.
Burgoon, J.K., L.K. Guerrero and K. Floyd. 2010. Nonverbal Communication. Boston, MA: Allyn and Bacon.
Byers, J., E. Hebets and J. Podos. 2010. Female mate choice based upon male motor performance. Animal Behaviour 79: 771–778.
Čereković, A. and I.S. Pandžić. 2011. Multimodal behavior realization for embodied conversational agents. Multimedia Tools and Applications 54: 143–164.
Coulson, M. 2004. Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior 28: 117–139.
Darwin, C. 1871. The Descent of Man, and Selection in Relation to Sex. Princeton, NJ: Princeton University Press.
Darwin, C. 1872. The Expression of the Emotions in Man and Animals. London: Harper Collins.
de Meijer, M. 1989. The contribution of general features of body movement to the attribution of emotions. Journal of Nonverbal Behavior 13: 247–268.
Digman, J. M. 1990. Personality structure: Emergence of the five–factor model. Annual Review of Psychology 41: 417–440.
Dittrich, W.H. 1993. Action categories and the perception of biological motion. Perception 22: 15–22.
Doty, R.L., M. Ford, G. Preti and G.R. Huggins. 1975. Changes in intensity and pleasantness of human vaginal odors during the menstrual cycle. Science 190: 1316–1318.
Fink, B., B. Weege, J. Flügge, S. Röder, N. Neave, and K. McCarty. (2012). Men’s personality and women’s perception of their dance quality. Personality and Individual Differences 52: 232–235.
Fink, B., B. Weege, N. Neave, B. Ried, and O. Cardoso Do Lago. 2014. Female perceptions of male body movements. In Evolutionary Perspectives on Human Sexual Psychology and Behavior (Weekes–Shackelford, V. and T. K. Shackelford, eds.). Berlin: Springer, pp. 299–324.
Fink, B., B. Weege, N. Neave, M. N. Pham, and T. K. Shackelford. 2015. Integrating body movement into attractiveness research. Frontiers in Psychology 6: 220.
Guerrero, L.K. and K. Floyd. 2006. Nonverbal Communication in Close Relationships. Mahwah, NJ: Lawrence Erlbaum Associates.
Hufschmidt, C., B. Weege, S. Röder, K. Pisanski, N. Neave, and B. Fink. 2015. Physical strength and gender identification from dance movements. Personality and Individual Differences 76: 13–17.
Hugill, N., B. Fink and N. Neave. 2010. The role of human body movements in mate selection. Evolutionary Psychology 8: 66–89.
Johansson, G. 1973. Visual perception of biological motion and a model for its analysis. Perception and Psychophysics 14: 201–211.
Johansson, G. 1976. Spatio–temporal differentiation and integration in visual motion perception—experimental and theoretical analysis of calculus–like functions in visual data processing. Psychological Research–Psychologische Forschung 38: 379–393.
Josefsson, T., E. Nordh and P.O. Eriksson. 1996. A flexible high–precision video system for digital recording of motor acts through lightweight reflex markers. Computer Methods and Programs in Biomedicine 49: 119–129.
Kapur, A., N. Virji–Babul, G. Tzanetakis and P.F. Driessen. 2005. Gesture–based affective computing on motion capture data. In Affective Computing and Intelligent Interaction, Proceedings (Tao, J. and R.W. Picard, eds.). Berlin: Springer–Verlag Berlin. pp. 1–7.
Karg, M., K. Kuhnlenz and M. Buss. 2010. Recognition of affect based on gait patterns. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 40: 1050–1061.
Kowner, R. and R. Wiseman. 2003. Culture and status–related behavior: Japanese and American perceptions of interaction in asymmetric dyads. Cross–Cultural Research 37: 178–210.
Levine, S., P. Krähenbuhl, S. Thrun and V. Koltun. 2010. Gesture controllers. ACM Transactions on Graphics 29: 124.
Levine, S., C. Theobalt and V. Koltun. 2009. Real–time prosody–driven synthesis of body language. ACM Transactions on Graphics 28: 172.
Loula, F., S. Prasad, K. Harber and M. Shiffrar. 2005. Recognizing people from their movement. Journal of Experimental Psychology–Human Perception and Performance 31: 210–220.
Luck, G., S. Saarikallio, and P. Toiviainen. 2009. Personality traits correlate with characteristics of music-induced movement. In Proceedings of the 7th Triennial Conference of the European Society for the Cognitive Sciences of Music (J. Louhivuori, T. Eerola, S. Saarikallio, T. Himberg, and P.-S. Eerola, eds.). Jyväskylä, Finland: University of Jyväskylä, pp. 276–279.
Luck, G., S. Saarikallio, B. Burger, M. R. Thompson, and P. Toiviainen. 2010. Effects of the Big Five and musical genre on music-induced movement. Journal of Research in Personality 44: 714–720.
Luck, G., S. Saarikallio, B. Burger, M. Thompson, and P. Toiviainen. 2014. Emotion-driven encoding of music preference and personality in dance. Musicae Scientiae 18: 307–323.
Mather, G. and L. Murdoch. 1994. Gender discrimination in biological motion displays based on dynamic cues. Proceedings of the Royal Society of London Series B–Biological Sciences 258: 273–279.
McCarty, K., J. Honekopp, N. Neave, N. Caplan, and B. Fink. 2013. Male body movements as possible cues to physical strength: A biomechanical analysis. American Journal of Human Biology 25: 307–312.
Michalak, J., N.F. Troje, J. Fischer, P. Vollmar, T. Heidenreich and D. Schulte. 2009. Embodiment of sadness and depression–gait patterns associated with dysphoric mood. Psychosomatic Medicine 71: 580–587.
Montepare, J., E. Koff, D. Zaitchik and M. Albert. 1999. The use of body movements and gestures as cues to emotions in younger and older adults. Journal of Nonverbal Behavior 23: 133–152.
Montepare, J.M., S.B. Goldstein and A. Clausen. 1987. The identification of emotions from gait information. Journal of Nonverbal Behavior 11: 33–42.
Montepare, J.M. and L. Zebrowitz–McArthur. 1988. Impressions of people created by age–related qualities of their gaits. Journal of Personality and Social Psychology 55: 547–556.
Morris, N.M. and J.R. Udry. 1970. Variations in pedometer activity during the menstrual cycle. Obstetrics and Gynecology 35: 199–201.
Neave, N., K. McCarty, J. Freynik, N. Caplan, J. Honekopp and B. Fink. 2011. Male dance moves that catch a woman's eye. Biology Letters 7: 221–224.
Norman, J.F., S.M. Payton, J.R. Long and L.M. Hawkes. 2004. Aging and the perception of biological motion. Psychology and Aging 19: 219–225.
Provost, M.P., V.L. Quinsey and N.F. Troje. 2008. Differences in gait across the menstrual cycle and their attractiveness to men. Archives of Sexual Behavior 37: 598–604.
Roberts, S.C., J. Havlicek, J. Flegr, M. Hruskova, A.C. Little, B.C. Jones, D.I. Perrett and M. Petrie. 2004. Female facial attractiveness increases during the fertile phase of the menstrual cycle. Proceedings of the Royal Society of London Series B–Biological Sciences 271: S270–S272.
Roether, C.L., L. Omlor, A. Christensen and M.A. Giese. 2009. Critical features for the perception of emotion from gait. Journal of Vision 9: 32.
Saarikallio, S., G. Luck, B. Burger, M. Thompson, and P. Toiviainen. 2013. Dance moves reflect current affective state illustrative of approach–avoidance motivation. Psychology of Aesthetics, Creativity, and the Arts 7: 296–305.
Schouwstra, S.J. and J. Hoogstraten. 1995. Head position and spinal position as determinants of perceived emotional state. Perceptual and Motor Skills 81: 673–674.
Sheets–Johnstone, M. 2005. Man has always danced: Forays into the origins of an art largely forgotten by philosophers. Contemporary Aesthetics http://www.contempaesthetics.org/newvolume/pages/article.php?articleID=273.
Simion, F., L. Regolin and H. Bulf. 2008. A predisposition for biological motion in the newborn baby. Proceedings of the National Academy of Sciences of the United States of America 105: 809–813.
Sogon, S. and M. Masutani. 1989. Identification of emotion from body movements—a cross–cultural study of Americans and Japanese. Psychological Reports 65: 35–46.
Stevenage, S.V., M.S. Nixon and K. Vince. 1999. Visual analysis of gait as a cue to identity. Applied Cognitive Psychology 13: 513–526.
Storti, C. 1999. Figuring Foreigners Out: A Practical Guide. Yarmouth: Intercultural Press.
Sumi, S. 2000. Perception of point–light walker produced by eight lights attached to the back of the walker. Swiss Journal of Psychology 59: 126–132.
Triandis, H.C. and M.J. Gelfand. 1998. Converging measurement of horizontal and vertical individualism and collectivism. Journal of Personality and Social Psychology 74: 118–128.
Wallbott, H.G. 1998. Bodily expression of emotion. European Journal of Social Psychology 28: 879–896.
Weege, B., M. N. Pham, T. K. Shackelford, and B. Fink. 2015. Physical strength and dance attractiveness: Further evidence for an association in men, but not in women. American Journal of Human Biology 27: 728–730.
Wilms, M., L. Schilbach, U. Pfeiffer, G. Bente, G.R. Fink and K. Vogeley. 2010. It's in your eyes–using gaze–contingent stimuli to create truly interactive paradigms for social cognitive and affective neuroscience. Social Cognitive and Affective Neuroscience 5: 98–107.