An Ecological Theory of Face Perception
Abstract and Keywords
This chapter presents an ecological theory of face perception, which draws on a Gibsonian approach to perception. This theory is offered not as a hypothetico-deductive framework, but rather as a general conceptual model that can integrate existing research findings and guide future research questions. It adds to the dual process and face-space models by expanding the attributes perceived in faces, emphasizing the function of face perception to guide adaptive behavior, predicting face perceptions from the overgeneralization of adaptive responses, highlighting dynamic and multimodal cues to face perceptions, and considering the perceiver attunements that moderate face perceptions. The chapter reviews the tenets of the ecological theory of face perception and considers how they inform the perception of four significant facial qualities: familiarity, age, emotion, and attractiveness. It also considers how ostensibly separate domains of face perception may be integrated through the ecological concept of perceived affordances.
Humans’ fascination with faces dates to ancient times and spans diverse cultures (cf. Zebrowitz, 1997). Scientific attention to faces is more recent. Fifty years ago, an investigator interested in face perception could easily read all the extant research literature. However, the 590 citations to “face perception” found in PsycINFO in the quarter century from 1956 to 1980 swelled over 1,000 percent to 6,782 citations in the next 25 years. This burgeoning scientific interest has addressed a wide range of questions including the contents of face perception, influencing stimuli, individual differences, disorders, developmental trajectories, behavioral consequences, and neural mechanisms (for a review, see Zebrowitz, 2006). The rapid growth in the literature highlights the need for a theoretical framework that integrates what is known and directs future research.
A dominant model of face perception in the cognitive neuroscience literature is a dual process model that calls attention to dissociations between perceptions of identity and emotional expression in addition to changeable facial qualities such as eye gaze and head angle (Bruce & Young, 1998; Calder & Young, 2005; Haxby, Hoffman, & Gobbini, 2002). Although this model does not claim to be a comprehensive model of how face perception works, it has generated considerable support from both cognitive and neurological data demonstrating functional and neurological divisions in the processing of identity versus emotion information and other dynamic cues, such as speech movements, eye gaze direction, and head movements (but see Calder & Young, 2005 and the discussion of confluence below).
Another prominent face perception model holds that the information provided by faces is coded relative to an average face at the center of a mental face-space (Busey, Wenger, & Townsend, 2001; Lee, Byatt, & Rhodes, 2000; O’Toole, Wenger, & Townsend, 2001; Valentine, 1991). In this model, faces are represented as points, and the distance between faces represents the similarity in their appearance. The dimensions of the space are the features that differentiate faces. O’Toole et al. (2001) discuss different instantiations of face space, from the empirical (e.g., the results of principal component analyses of face images) to the hypothetical (e.g., the assumption that similar facial expressions are located near one another in the face space). Although face space is anchored by physical appearance, the particular layout of the space is a function of an individual’s perceptual experiences. Thus, the perceived similarity of two identical twins—their distance in the face space—depends on prior experiences with the twins (Stevenage, 1998). The face-space model has been applied primarily to the ease of recognizing different faces. However, it may also have relevance to other aspects of face perception. For example, faces with greater proximity in a face-space may be perceived to have similar psychological traits, as shown in early research demonstrating that people who looked alike were judged to be alike (Secord, 1958). (For an example of a face-space model, see Figure 1.1).
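The geometry of this model can be made concrete with a minimal sketch. The dimensions, coordinates, and face names below are invented for illustration and are not from any published face-space; the point is only that faces are points, the average face anchors the center, and smaller distances predict greater perceived similarity:

```python
import math

# Hypothetical 3-D face space: each face is a point whose coordinates
# are scores on feature dimensions (e.g., eye spacing, face roundness).
# All values here are made up for illustration.
face_space = {
    "average":  (0.0, 0.0, 0.0),        # average face anchors the center
    "twin_a":   (0.9, -0.4, 0.2),
    "twin_b":   (0.85, -0.35, 0.25),    # nearly identical to twin_a
    "stranger": (-1.2, 1.5, -0.8),
}

def distance(a, b):
    """Euclidean distance between two faces; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Twins lie close together, so they are predicted to be perceived as
# similar in appearance; per Secord (1958), people who look alike also
# tend to be judged as psychologically alike.
assert distance(face_space["twin_a"], face_space["twin_b"]) < \
       distance(face_space["twin_a"], face_space["stranger"])
```

In this toy version the layout is fixed, whereas the model holds that the layout reflects each perceiver's history of exposure to faces, so the same pair of faces can sit at different distances for different perceivers.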
In this chapter we present an ecological theory of face perception (McArthur & Baron, 1983; Zebrowitz & Montepare, 2006), which draws on a Gibsonian approach to perception (Gibson, 1969; Gibson, 1966; Gibson, 1979). This theory is offered not as a hypothetico-deductive framework, but rather as a general conceptual model that can integrate existing research findings and guide future research questions. It adds to the dual process and face-space models by expanding the attributes perceived in faces, emphasizing the function of face perception to guide adaptive behavior, predicting face perceptions from the overgeneralization of adaptive responses, highlighting dynamic and multimodal cues to face perceptions, and considering the perceiver attunements that moderate face perceptions. In the following sections, we review the tenets of the ecological theory of face perception and consider how they inform the perception of four significant facial qualities: familiarity, age, emotion, and attractiveness.
Perceived Affordances Guide Adaptive Behavior
Ecological theory expands the facial attributes considered by the dual-process model to include the perception of social affordances—the opportunities for acting or being acted upon that are provided by other people. A vivid illustration of what Gibson meant by affordance is provided by his quotation from Koffka: “Each thing says what it is…a fruit says ‘eat me’; water says ‘drink me’; thunder says ‘fear me’; and woman says ‘love me’” (Gibson, 1979, p. 138). From an ecological perspective, face perception is not simply about recognizing a particular identity or emotion. Rather, it is about the function of the perceptual system to guide actions that serve to solve specific adaptive problems or to facilitate other goal attainments of individuals. As such, the affordance concept focuses attention on the behavioral responses linked to face perception.
The facial qualities highlighted in the dual-process model—identity, emotion expression, and gaze direction—each convey significant social affordances. We can expect different social interactions with people who have different identities, show different facial expressions, or look toward or away from us. Other attributes revealed in the face include familiarity, age, and attractiveness, and each is also associated with the perception of behavioral affordances.
Although affordances can be denoted by numerous descriptive trait adjectives (Beauvois & Dubois, 2000; Mignon & Mollaret, 2002), these traits are fairly well captured by the dimensions of likeability and power, which correspond to the evaluative and potency dimensions on the semantic differential.1 These dimensions underlie human judgments about a host of stimuli, and decades of social psychology research reveal them to be the primary dimensions along which people evaluate others (Cuddy & Fiske, 2002; Rosenberg, Nelson, & Vivekananthan, 1968; Wiggins, 1996). Not surprisingly, likeability and power differentiate the perceived affordances of faces varying in familiarity, age, emotion, and attractiveness. Compared with unfamiliar faces, previously seen faces create impressions of high likeability (Bornstein, 1989, 1993; Zajonc, 1968), although there is no specific effect on perceived power. Compared with a neutral expression, angry expressions create impressions of low likeability and high power, fear expressions create impressions of moderate likeability and low power, and happy expressions create impressions of high likeability and high power (Hess, Blairy, & Kleck, 2000; Knutson, 1996; Montepare & Dobish, 2003). Compared with young adult faces, faces of babies create impressions of high likeability and low power (Alley, 1983; Berry & McArthur, 1986; Zebrowitz, Fellous, Mignault, & Andreoletti, 2003), while faces of elderly adults create impressions of low likeability and low power (Zebrowitz et al., 2003).2 Finally, attractive people are perceived as more likable as well as more powerful than those who are less attractive (Eagly, Ashmore, Makhijani, & Longo, 1991; Feingold, 1992; Langlois et al., 2000; Zebrowitz, Hall, Murphy, & Rhodes, 2002; Zebrowitz & Rhodes, 2004), effects that may be driven more by the perception that “ugly is bad” than by the perception that “beautiful is good” (Griffin & Langlois, 2006).
Consistent with the ecological theory tenet that perceiving is for doing, the affordances associated with different types of faces are linked to perceivers’ behavior. Some behaviors are so obvious that they have received little research attention, such as those directed toward people whose faces vary in age, emotion expression, or familiarity. Nevertheless, there is empirical evidence that babies elicit different behavioral responses than do adults, although little work has focused on faces per se (Berry & McArthur, 1986; Eibl-Eibesfeldt, 1989; Zebrowitz, 1997). Considerable research has also documented preferential treatment of attractive people in interpersonal, occupational, judicial, and other domains (Langlois et al., 2000; Zebrowitz, 1997).
Other research has examined more subtle variations in behavioral responses to faces, with greater control over the stimulus information eliciting the response. For example, using evoked potentials, reaction time, and visual search measures, investigators have demonstrated more attention to faces than to other stimuli (Hershler & Hochstein, 2006; Ito & Cacioppo, 2000), as well as to particular types of faces, including the four facial qualities considered in this chapter: familiarity (Ito & Urland, 2003), age (Quinn & Macrae, 2005), emotional expression (Lundqvist, Ohman, Barrett, Niedenthal, & Winkielman, 2005) and attractiveness (Maner, Gailliot, & DeWall, 2007). (For a pertinent review, see Palermo & Rhodes, 2007).
Faces also influence affective and approach/avoidance responses. For example, angry faces potentiate infant startle reactions to noise, a rudimentary defensive behavior (Balaban, 1995). Similarly, fearful or angry faces facilitate aversive classical conditioning and retard extinction as compared with happy faces (Lanzetta & Orr, 1981; Ohman & Dimberg, 1978; Orr & Lanzetta, 1984). Fearful or happy faces also speed hand movements that mimic approach responses, as do attractive faces, whereas angry or disfigured faces speed hand movements that mimic avoidant responses (Marsh, Ambady, & Kleck, 2005b; Zhang, 2007). Face race also influences hand movements that mimic approach/avoidance responses. Black faces, judged by white perceivers as more dangerous than own-race faces, speed avoidance responses in comparison with white faces; Asian faces, judged as less dangerous than own-race faces, speed approach (Zhang, 2007). Faces elicit motivational responses beyond approach or avoidance. Human perceivers exert more effort to see attractive faces (Aharon et al., 2001), and monkeys trade a favorite juice for a look at potential female mates or high-status conspecifics of either sex (Deaner, Khera, & Platt, 2005). Research on neural activation elicited by faces also suggests a link between face perception and behavior. Activity elicited when observing mouth and eye movements is differentiated within the superior temporal sulcus (STS), with the activation elicited by eye movements localized near areas involved in spatial attention and the activation elicited by mouth movements localized near areas involved in social communication (Pelphrey, Morris, Michelich, Allison, & McCarthy, 2005).
Not only does face perception influence behavior, but behavior also influences perception, as emphasized by Gibson (1966, 1979) and recent work on ‘embodied cognition’ (Niedenthal, 2007; Sommerville & Decety, 2006). For example, surreptitiously inducing people to smile or frown while viewing faces or other stimuli influences their evaluative responses (Ito, Chiao, Devine, Lorig, & Cacioppo, 2006; Laird, 1974). More research is clearly needed to conceptualize and specify the bidirectional relationships between behavior and face perception.
Accuracy and overgeneralization
Ecological theory assumes that face perception functions to guide adaptive behavior, which raises the twin issues of accuracy and bias. Attention to accurate and biased face perceptions within the dual-process model has focused primarily on dissociations in the recognition of identity and emotions as evidenced in disorders of face perception, such as prosopagnosia or autism (Adolphs, Sears, & Piven, 2001; Damasio, Tranel, Rizzo, & Mesulam, 2000). A more comprehensive model of face perception requires attention to normally functioning individuals’ accurate and biased perceptions of other socially relevant qualities that are revealed in the face.
In addition to recognizing obvious socially relevant attributes, such as familiarity, age, emotion, and sex, perceivers are also attuned to subtle qualities in faces such as ethnicity (Andrzejewski, Hall, & Salib, 2007) and sexual orientation (Rule, Ambady, Adams, & Macrae, 2007). Moreover, research using brief stimuli reveals how exquisitely sensitive perceivers are to functionally significant attributes. For instance, perceivers show rapid or automatic processing of facial emotion (Stenberg, Wiking, & Dahl, 1998), familiarity (Ellis, Young, & Koenken, 1993), and attractiveness (Olson & Marshuetz, 2005). Faces also activate spontaneous trait inferences (Mason, Cloutier, & Macrae, 2006) as well as rapid trait perceptions related to likeability and competence (Willis & Todorov, 2006), with judgments often showing above-chance accuracy (Zebrowitz & Collins, 1997).
The flipside of accuracy is biased face perception. A set of overgeneralization hypotheses derived from ecological theory codifies biases that result from the preparedness to respond to adaptively significant facial qualities. These hypotheses hold that the affordances that are accurately revealed by the facial information that conveys familiarity, age, emotion, or low fitness also are perceived in people whose faces resemble one of these facial qualities (cf. Zebrowitz, 1996, 1997; Zebrowitz & Collins, 1997).3 According to ecological theory, any errors yielded by overgeneralizations occur because they are less maladaptive than those that might result from failing to respond appropriately to people who vary in familiarity, age, emotion, or fitness.
Familiar face overgeneralization
In familiar face overgeneralization, faces that are similar in appearance to a known face will convey similar affordances (DeBruine, 2002; Lewicki, 1985). A broader familiar face overgeneralization effect is found in reactions to other-race faces. Faces of strangers from other racial groups appear less familiar than strangers from one’s own group (Zebrowitz, Bronstad, & Lee, 2007). Moreover, consistent with evidence that unfamiliar stimuli are generally less liked (Bornstein, 1989; Bornstein, 1993; Hamm, Baum, & Nikels, 1975; Rhodes, Halberstadt, & Brajkovich, 2001; Zajonc, 1968), faces of other-race strangers are perceived as less likable, and this effect is partially mediated by their lower familiarity, quite apart from other contributing social factors. The lesser familiarity of other-race faces also contributes to the strength of culturally based race stereotypes, partially mediating negative stereotypes of other-race faces and partially suppressing positive ones (Zebrowitz, Bronstad, & Lee, 2007).
Other evidence that unfamiliarity influences a readiness to perceive particular affordances and to behave accordingly is provided by white judges’ reactions to faces that are more prototypically black, regardless of their actual race. Such faces prime negative concepts more than do less prototypically black faces or white ones (Livingston & Brewer, 2002), and they are perceived to have more negative traits (Blair, Judd, Sadler, & Jenkins, 2002). Moreover, judges (mostly Caucasian) give convicted criminals with a more prototypically black appearance longer prison terms (Blair, Judd, & Chapleau, 2004) and more frequent death sentences (Eberhardt, Davies, Purdie-Vaughns, & Johnson, 2006). Similarly, Koreans perceive Korean or white faces with a more Asian appearance as more familiar, more likable, and less dangerous (Strom, Zhang, Zebrowitz, Bronstad, & Lee, 2008). Importantly, these effects were often found for faces that did not differ in race, indicating that they reflect a reaction to the faces’ deviation from the average face experienced by the judges rather than to face race per se.
Emotion face overgeneralization
In emotion face overgeneralization, responses to the actual affordances of people displaying emotion expressions generalize to impressions of people whose neutral expression resembles an emotion. For example, people whose neutral faces show some resemblance to an angry expression are perceived as less likable and more powerful (Montepare & Dobish, 2003; Said, Sebe, & Todorov, 2009; Zebrowitz, Kikuchi, & Fellous, in press). The lesser resemblance of female than male facial structure to anger expressions (Becker, Kenrick, Neuberg, Blackwell, & Smith, 2007; Hess, Adams, & Kleck, 2004; Zebrowitz et al., in press) may similarly contribute to impressions of greater likeability and lower power in women.
Baby-face overgeneralization

In baby-face overgeneralization, the actual affordances of babies are mirrored by impressions of baby-faced individuals as less powerful and more likable than their more mature-faced peers, an effect that holds true across the age, sex, and race of faces and perceivers (Montepare & Zebrowitz, 1998; Zebrowitz, 1997). Supporting the overgeneralization hypothesis, research investigating the accuracy of such impressions has demonstrated that they are often inaccurate or even opposite to reality (Bachmann & Nurmoja, 2006; Zebrowitz, Andreoletti, Collins, Lee, & Blumenthal, 1998; Zebrowitz, Collins, & Dutta, 1998; Zebrowitz & Lee, 1999). Nevertheless, they are reflected in behavioral responses, consistent with the ecological tenet that perception guides behavior (Montepare & Zebrowitz, 1998; Zebrowitz, 1997). Baby-face overgeneralization also contributes to differences in the perceived affordances of female and male faces. Adult female faces typically retain more neotenous, babyish qualities than do adult male faces (Enlow, 1982), contributing to the perception of women as more likable and less powerful than men (Friedman & Zebrowitz, 1992). Although age overgeneralization in face perception has not been studied at the older end of the lifespan, young adults with more elderly gait qualities are perceived as less happy and less powerful than their more youthful-walking peers (Montepare & Zebrowitz-McArthur, 1988), and the same may be true for young adults with more elderly looking faces.
Anomalous face overgeneralization
In anomalous face overgeneralization, responses to the actual affordances of people whose faces are marked by disease or genetic anomalies generalize to people whose faces resemble an anomalous one. This hypothesis can explain the well-documented tendency for attractive people to be perceived as more likable and powerful than unattractive people—the attractiveness halo effect. Evidence for anomalous face overgeneralization is provided by the finding that the judged attractiveness, likability, and competence of normal faces were predicted by the extent to which a neural network confused them with those of people with craniofacial anomalies that mark syndromes characterized by various intellectual, social, and physical deficits (Zebrowitz et al., 2003). Thus, accurate impressions of the deficiencies signaled by anomalous faces are overgeneralized to unattractive normal individuals. It is important to note that in this study, the people with more resemblance to anomalous faces were not less fit, as assessed by measures of their intelligence and health. Thus, the divergent impressions of them must be explained by overgeneralization, not accuracy.4
A pivotal feature of the ecological theory is its emphasis on identifying the external stimulus environment that informs perception. Gibson (1966, 1979) argued that multimodal, dynamic changes over space and time provide the most useful information to perceivers in nonsocial perception, and McArthur and Baron (1983) suggested that the same is true for social perception. Thus, the dynamic and multifaceted information provided by facial structure, pigmentation, texture, and movement should provide the most useful information about all of the qualities gleaned from faces. Yet, research has often ignored the question of what stimulus information informs face perception, or it has focused on facial structure revealed in two-dimensional black-and-white static facial images, ignoring dynamic and textural information. Research also has rarely considered face perception within the context of multimodal cues. However, the qualities that are conveyed by facial structure, be it familiarity, age, emotion, or attractiveness, may also be specified in voices, bodies, or gestures. Our perceptual systems have evolved to extract useful information from moving, talking faces attached to bodies, and we will learn more about how the face perception system works if we study it in more ecologically valid contexts.
Another significant shortcoming of efforts to identify the stimulus information that guides face perception is that much of the work has been data-driven or applied, although there are exceptions. Empirically driven investigations of cues to face perception tend to generate a laundry list that has no theoretical coherence, and such investigations have often failed to uncover the cues that perceivers actually utilize when perceiving various attributes in faces (Zebrowitz & Collins, 1997). The face overgeneralization hypotheses discussed earlier provide a theoretical rationale for examining the contribution of particular facial cues to perceived affordances.
Research examining the cues that facilitate the recognition of familiar faces indicates that face recognition relies on a large constellation of facial features (Gosselin & Schyns, 2001; Smith, Muir, Pascalis, & Slater, 2003) that are processed configurally (Myowa-Yamakoshi, Yamaguchi, Tomonaga, Tanaka, & Matsuzawa, 2005; Young, Hellawell, & Hay, 1987), with particular emphasis on the eye region (Prodan, Orbelo, Testa, & Ross, 2001; Sadr, Jarudi, & Sinha, 2003) and the left half of the face (Yovel, Levy, Grabowecky, & Paller, 2003). Research within the face-space conceptual framework that was discussed earlier also has revealed that accurate recognition of faces increases with their distance from the average face, located at the center of the face space (Rhodes, Brennan, & Carey, 1987), and, more importantly, with lower density of proximal faces (Caldara & Abdi, 2006; Furl, Phillips, & O’Toole, 2002).
The evidence that facial identity is coded relative to an average face and that face recognition depends on the spatial relations among facial features more than on isolated parts is consistent with the ecological emphasis on configural facial qualities. However, the ecological approach would also underscore the importance of studying three-dimensional, animated faces as well as multimodal presentations. Indeed, differentiation of unfamiliar faces is better for three-quarter views, which convey more three-dimensional information, than for frontal or profile views (Bruce, Valentine, & Baddeley, 1987), and moving faces are recognized better than static ones (O’Toole, Roark, & Abdi, 2002; Pike, Kemp, Towell, & Phillips, 1997). In addition, identity can be revealed in facial movements alone, as shown using point-light displays (Bruce, Valentine, Gruneberg, Morris, & Sykes, 1988) and motion-capture techniques (Hill & Johnston, 2001). The recognition of familiar faces can also be influenced by multimodal cues, as evidenced by the facilitatory effect of speech information on infants’ discrimination of mothers’ and strangers’ faces (Burnham, 1993). Moreover, the movement of the face and the sound of the voice provide equivalent information about identity, such that perceivers can match silent videotapes of talking faces with their corresponding voices at above-chance accuracy (Kamachi, Hill, Lander, & Vatikiotis-Bateson, 2003). It may be that movement cues beyond the face would also provide convergent information about identity. In sum, it seems likely that the similarity of faces within a face-space model and the deviations from an average face, with attendant variations in ease of recognition, may depend not only on similarities in their two-dimensional structure, but also on similarities in their three-dimensional, dynamic, and multimodal qualities.
Identifying the face space underlying familiar face recognition requires not only taking into account dynamic and multimodal cues when mapping similarities among faces, but also selecting an appropriately representative set of faces. Theoretically, the average face should depend on the set of faces to which a perceiver has been exposed most frequently. Since there may be nontrivial differences in the faces to which different perceivers are exposed, the average face and consequent ease of recognition of various faces is likely to differ across perceivers. Indeed, the well-documented other-race disadvantage in familiar face recognition (Meissner & Brigham, 2001) suggests that people attend less to the stimulus information that differentiates among other-race than own-race faces. This effect could reflect the fact that other-race faces are more tightly clustered in the face space (Caldara & Abdi, 2006; Furl et al., 2002; Lewis, 2004), and/or it may occur because perceivers initially emphasize information specifying race category at the expense of individuating information (Levin, 2000).
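The clustering account can be illustrated with a small simulation. This is an assumption-laden toy model, not an implementation from the cited studies: other-race faces are simply generated in a denser region of a two-dimensional space, and density is measured as the average gap to the nearest neighboring face:

```python
import math
import random

random.seed(0)  # deterministic toy example

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Hypothetical 2-D face space: own-race faces are widely dispersed,
# other-race faces are tightly clustered (modeling reduced attention to
# the dimensions that differentiate them). Spreads are arbitrary.
own_race   = [(random.gauss(0, 1.0), random.gauss(0, 1.0)) for _ in range(50)]
other_race = [(random.gauss(3, 0.2), random.gauss(3, 0.2)) for _ in range(50)]

def mean_neighbor_gap(faces):
    """Average distance from each face to its nearest neighbor; a small
    gap means dense clustering, hence more recognition confusions."""
    gaps = []
    for i, f in enumerate(faces):
        others = faces[:i] + faces[i + 1:]
        gaps.append(min(dist(f, o) for o in others))
    return sum(gaps) / len(gaps)

# Clustered faces sit closer to their neighbors, so a distance-based
# (nearest-neighbor) recognizer would confuse them more often.
assert mean_neighbor_gap(other_race) < mean_neighbor_gap(own_race)
```

The sketch captures only the density mechanism; the alternative mechanism in the text, categorization at the expense of individuation, would instead change which dimensions the perceiver encodes in the first place.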
The significant question remains as to what stimulus information actually differentiates less familiar other-race from own-race faces. Skin color is one salient difference. People report that it is the primary cue used to judge whether another person is African, Caucasian, or Hispanic, and that it is second only to eyes when judging whether another person is Asian (Brown, Dane, & Durham, 1998). Interestingly, however, three-dimensional shape was as important as surface color when participants were asked to categorize faces as European or Japanese on the basis of one cue or the other, and shape influenced racial categorization even in the context of contradictory color cues (Hill, Bruce, & Akamatsu, 1995). Dynamic facial movements and textural cues besides skin color, such as eyebrows and facial and scalp hair, may also contribute to race recognition, as may multimodal cues, such as vocal qualities (Walton & Orlikoff, 1994) and gesture (Johnson, 2006).
From an ecological theory perspective, the question is not only what stimulus information objectively or subjectively differentiates faces that vary in familiarity, but also what stimulus information influences their perceived affordances. Skin tone does not appear to be as univocal a cue as one might expect (Maddox, 2004). Consistent with familiar face overgeneralization, Black raters judged White faces that had been morphed to have darker skin as more likable than the original White faces, whose skin tone was less similar to own-race faces. However, White raters judged Black faces morphed to have lighter skin as less likable and less competent than the original Black faces, and the skin tone morphing had no effect on the perceived likeability or competence of own-race faces for either Black or White raters (Zebrowitz, Barglow, Bronstad, & Lee, 2008). Other research has investigated the racial prototypicality of static two-dimensional faces, as subjectively defined. As noted earlier, this work reveals that a more prototypical other-race appearance, which can include a range of cues, elicits negative reactions at least in part due to its unfamiliarity (Blair et al., 2002; Eberhardt, Goff, Purdie, & Davies, 2004; Livingston & Brewer, 2002; Maddox, 2004; Strom et al., 2008). Although the influence of unfamiliar dynamic and multimodal cues has been little studied, evidence that these cues also influence perceived affordances is provided by the finding that hand gestures displayed more by Black or Hispanic than by White individuals tend to be perceived as suspicious by White police officers (Johnson, 2006).
Anthropometric comparisons of the faces of babies and adults show that the facial structure of babies is characterized by a rounder face with bigger cheeks; a larger forehead with a more vertical slope; a larger cranium; a smaller, less protrusive chin; a small, wide, and concave nose with a sunken bridge; thinner eyebrows; and proportionately larger eyes and pupils (Enlow, 1990; Todd, Mark, Shaw, & Pittenger, 1980). Strong evidence that at least some of these differences are sufficient to convey age is provided by research that has changed the shape of heads or automobiles using a mathematical formula (the cardioidal strain transformation) that simulates the changes produced by actual maturation. People are able to identify the older of two faces or two Volkswagen Beetles when the resultant difference in shape is only slightly greater than the smallest difference that can be detected (Mark, Todd, & Shaw, 1981; Pittenger, Shaw, & Mark, 1979; Shaw & Pittenger, 1977). Reliable relative age judgments were also made of 3-D heads to which the strain transformation was applied (Bruce, Burton, Doyle, & Dench, 1989; Mark & Todd, 1983).
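The growth-simulating transformation can be sketched in a few lines. The commonly cited polar-coordinate form is R′ = R(1 + k(1 − cos θ)), θ′ = θ, with θ measured from the top of the head; the particular coordinate conventions and the value of k below are illustrative assumptions, not the parameters used in the cited studies:

```python
import math

def cardioidal_strain(x, y, k):
    """Apply a cardioidal strain transformation of the kind used by
    Shaw and Pittenger to simulate craniofacial growth. Coordinates are
    relative to an origin at the top of the head, with theta = 0
    pointing straight up; larger k simulates an older head, because the
    lower face expands relative to the cranium. The conventions here
    are an illustrative assumption.
    """
    r = math.hypot(x, y)
    theta = math.atan2(x, y)  # angle measured from the upward vertical
    r_new = r * (1 + k * (1 - math.cos(theta)))
    return (r_new * math.sin(theta), r_new * math.cos(theta))

# A point at the crown (theta = 0) does not move at all, while a point
# at the chin (theta = pi, i.e., straight down) is displaced the most:
crown = cardioidal_strain(0.0, 1.0, k=0.2)   # stays at (0, 1)
chin = cardioidal_strain(0.0, -1.0, k=0.2)   # pushed further down
```

Applied to every outline point of a profile, the transform leaves the cranium nearly fixed while lengthening and enlarging the lower face, which is why perceivers read the result as increased age even for non-face shapes such as car profiles.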
Dynamic facial cues also vary with age, and perceivers can discern relative age from the facial movements of children, middle-aged adults, and elderly adults depicted in point-light displays (Berry, 1990). There are also age differences in textural cues (Frost, 1988; Zebrowitz, 1997), and age can be identified from facial information not only during the formative years, when there are pronounced structural variations in the face, but also with a fair degree of accuracy in adulthood when the variations are mostly textural (Henss, 1991). Although research has not investigated whether multimodal cues yield more accurate age recognition, as ecological theory would predict, relative age is specified not only by facial qualities, but also by voice qualities (Helfrich, 1979) and gait qualities (Montepare & Zebrowitz-McArthur, 1988).
Most interesting from an ecological perspective is the influence of age-related face cues on perceived affordances. Consistent with baby-face overgeneralization, the structural facial qualities that mark babies also mark adults who are judged to be baby-faced (Montepare & Zebrowitz, 1998; Zebrowitz, Fellous, Mignault, & Andreoletti, 2003). Moreover, faces of any age with more babyish structural qualities are perceived as more likable and less powerful (Berry & McArthur, 1985; McArthur & Apatow, 1983; Montepare & Zebrowitz, 1998; Zebrowitz & Montepare, 1992). Age-related differences in movement patterns also reveal behavioral affordances, with children’s faces shown in point-light displays eliciting impressions of lower power than adult faces even with perceived age controlled (Berry, 1990). It would be interesting to determine whether there is a baby-face overgeneralization effect shown in impressions of adults whose faces have more childlike textural cues or more childlike facial dynamics, and whether multimodal babyish cues elicit the strongest overgeneralization effects. Similar questions remain regarding the perceived affordances of young adult faces that resemble elderly ones.
A prodigious body of research has identified specific two-dimensional static facial components that objectively differentiate the basic emotions of happiness, anger, sadness, fear, disgust, and surprise (Ekman, 1971; Izard, 1977). It has been further shown that these components differ from those that differentiate identity (Calder, Burton, Miller, Young, & Akamatsu, 2001). Research examining which two-dimensional static facial components effectively communicate an emotion reveals that perceivers use information in different face regions to recognize different emotions (Smith, Cottrell, Gosselin, & Schyns, 2005). Nevertheless, emotion recognition, like identity recognition, is holistic (Calder, Young, Keane, & Dean, 2000).
Research investigating dynamic cues to emotion has identified the specific facial muscle movements that differentiate several basic emotions (Cohn, Ekman, Harrigan, Rosenthal, & Scherer, 2006). Other research has provided evidence for the contribution of dynamic facial information to the communication of emotion (Bassili, 1978, 1979; Sato & Yoshikawa, 2004; Wagner, MacDonald, & Manstead, 1986; Wehrle, Kaiser, Schmidt, & Scherer, 2000). The significance of dynamic cues is underscored by the findings that recognition of subtle emotion expressions is better from moving than from static faces (Ambadar, Schooler, & Cohen, 2005), that people are highly sensitive to the temporal progression of emotion displays (Edwards, 1998), and that dynamic facial expressions of emotion elicit stronger arousal in perceivers (Sato & Yoshikawa, 2007). Consistent with the ecological emphasis on multimodal stimulus information, there are vocal cues to emotion (Juslin & Scherer, 2005) as well as bodily cues (de Gelder, 2006). Although not a focus of research, textural cues may also contribute to impressions of emotions that are marked by distinctive eyebrow and skin-tone changes, with faces reddening in anger or embarrassment and paling in fear. Moreover, some research has demonstrated that when multimodal information is consistent, it can yield more effective communication of emotion than the face alone (DePaulo & Rosenthal, 1978; Vaish & Striano, 2004).
Research investigating the specific stimulus information for emotion perception has been more empirically than theoretically driven. A notable exception is provided by research that raised the question of why particular emotion expressions look the way they do (Marsh, Adams, & Kleck, 2005a). These authors suggested that both the morphology of emotion expressions and the impressions they elicit may derive from the adaptive utility of their mimicking variations in facial maturity. More specifically, they argued that fear and anger expressions evolved to mimic babies’ faces and mature faces, respectively, because it is adaptive for those experiencing fear to elicit reactions paralleling those elicited by helpless babies, and for those experiencing anger to elicit reactions paralleling those elicited by powerful adults. Consistent with this hypothesis, both subjective ratings of emotions’ resemblance to babies and objective indices from connectionist models showed that fear expressions, or closely related surprise expressions, resemble babies more than do neutral expressions, while anger expressions resemble babies less (Marsh et al., 2005a; Zebrowitz, Kikuchi, & Fellous, 2007). As previously noted, anger expressions also resemble women less than they resemble men (Becker et al., 2007; Hess et al., 2004; Zebrowitz et al., in press), which is consistent with the fact that faces of adult women retain more neotenous qualities than those of men.
(p.12) From an ecological theory perspective, the question is not only what stimulus information objectively or subjectively differentiates emotion expressions, but also what stimulus information influences perceived affordances. The previously cited research demonstrating the differential resemblance of emotion expressions to babies also showed effects on impressions. More baby-faced eye-region metrics in surprise than neutral faces mediated perceptions of surprise faces as less powerful and more likable, whereas less baby-faced overall facial metrics in anger than neutral faces mediated perceptions of anger faces as more powerful and less likable. An ingenious program of research examining primitive masks used for different rituals in a variety of cultures revealed that structural diagonality and angularity are higher-order invariants that communicate not only anger but also threat, and other research suggests that angular structures, or angular and diagonal movement patterns, may communicate similar affordances, in contrast to those communicated by the rounded contours found in a baby’s face (Aronoff, Barclay, & Stevenson, 1988; Aronoff, Woike, & Hyman, 1992; Bar & Neta, 2006, 2007).
The research reviewed earlier indicates that unattractive faces resemble anomalous ones, but the question remains of exactly what stimulus information differentiates anomalous or normal unattractive faces from more attractive ones. Research motivated by evolutionary theory predictions concerning facial attributes that signal low fitness (Folstad & Karter, 1992; Scheib, Gangestad, & Thornhill, 1999; Thornhill & Gangestad, 1999) has identified several such qualities. These include low facial averageness (a facial configuration far from the population mean), low symmetry, poor skin quality, older age, and low sexual dimorphism (a face less prototypical for its sex) (for a review, see Zebrowitz & Rhodes, 2002). Negative facial expressions also diminish attractiveness (Reis et al., 1990). For women, less prototypically female facial movements decrease attractiveness (Morrison, Gralewski, Campbell, & Penton-Voak, 2007), and the dynamic quality of low expressiveness decreases attractiveness for both sexes (DePaulo, Blank, & Swaim, 1992; Riggio & Friedman, 1986). Although differences in the perceived affordances signaled by high versus low expressiveness provide a proximal explanation for this effect, it may also be influenced by anomalous face overgeneralization, since low facial expressiveness is a marker of physical and psychological disorders, such as Parkinson’s disease (Pentland, Pitcairn, Gray, & Riddle, 1987), depression, and schizophrenia (Schneider, Heimann, Himer, & Huss, 1990). The fact that vocal cues vary in attractiveness (Miyake & Zuckerman, 1993), as do body cues (Peters, Rhodes, & Simmons, 2007), suggests that multimodal, dynamic stimulus information may have a more profound influence on judged facial attractiveness than does the standard stimulus information of two-dimensional faces. For example, judgments of speakers’ facial attractiveness are influenced by their vocal attractiveness (Zuckerman, Miyake, & Hodgins, 1991).
Once again, from an ecological perspective, the influence of attractive or unattractive facial qualities on perceived affordances is of central interest. In addition to the enormous literature on the effects of global attractiveness (Eagly et al., 1991; Langlois et al., 2000; Zebrowitz, 1997), it has also been shown that the specific qualities of high facial symmetry and averageness each contribute to more positive impressions on traits related to likability and power as well as perceived health (Luevano, 2007; Zebrowitz et al., 2002; Zebrowitz & Rhodes, 2004; Zebrowitz, Voinescu, & Collins, 1996). Research investigating the influence of multimodal, dynamic stimulus information on perceived affordances is needed to better understand the contribution of facial attractiveness to real world social interactions.
Although Gibson’s ecological theory of perception did not focus on neural mechanisms per se (Gibson, 1966, 1979), he argued that (p.13) understanding the nature of the stimulus information to which organisms respond will elucidate how their perceptual systems operate. Thus, ecological theory implies that neural mechanisms may be elucidated by a deep understanding of the stimulus. Indeed, Gibson’s specifications of the kinds of higher-order stimulus information that informs adaptive action stimulated research that has identified neurons tuned to such information (Nakayama, 1994). Moreover, shortcomings of research on the neural mechanisms that guide face perception can be informed by ecological theory.
One shortcoming is that neural mechanisms have often been studied in a piecemeal fashion rather than as an integrated system. This certainly reflects our rudimentary understanding of neural systems and the difficult methodological and analytical task of assessing integration even in parts of the system that are fairly well understood. However, it may also reflect a mindset criticized early on by Gibson (1966) in his landmark book The Senses Considered as Perceptual Systems, where he wrote: “The question is not how the receptors work, or how the nerve cells work, or where the impulses go, but how the systems work as a whole” (p. 6). A systems analysis of face perception requires consideration of both the spatial and the temporal neural response to faces.
An excellent illustration of a systems approach is provided by research on taste perception in the rat (Katz, Nicolelis, & Simon, 2002). Whereas it was long believed that an animal identifies a taste on its tongue on the basis of which neurons are activated and how much they are activated, this research reveals that the story is much more complex. In the first 200 ms after a taste is applied to the tongue, neural responses in the cortex reflect the same somatosensory experience of fluid, regardless of taste. Between 250 and 800 ms, tastes with different identities, such as sweet, sour, salty, and bitter, elicit unique neural responses. Still later, tastes of similar palatability (e.g., likable tastes of sweet and salty or disliked tastes of sour and bitter) elicit similar neural responses. Moreover, manipulations of palatability, such as conditioned taste aversion, exclusively affect this late neural response to the taste’s affordance, which may be the one associated with behavioral responses. This research provides a cautionary message to face perception researchers who search for neural mechanisms for the perception of certain face qualities in a particular brain region or even in a particular pattern of brain regions. Considering the temporal as well as the spatial pattern of activation is likely to be necessary to account for responses to different faces. For example, all faces may activate the fusiform face area (FFA), regardless of their identity. With some time lag, activation in other brain regions may be differentiated according to the identity or social category of the face, with the temporal delay varying across different types of faces. Indeed, research has documented temporal differences in the neural response to different face categories (Ito & Urland, 2003). The behavioral affordances of faces may be registered in yet other brain regions and perhaps with a different time stamp.
Research investigating the neural mechanisms for face perception has implicated several brain regions (Haxby et al., 2002). These include the FFA, involved in face recognition (Kanwisher, McDermott, & Chun, 1997; McKone, Kanwisher, & Duchaine, 2007); the superior temporal sulcus (STS), involved in the perception of changeable facial qualities, such as eye gaze and emotion expression (Hooker et al., 2003; LaBar, Crupain, Voyvodic, & McCarthy, 2003); the amygdala (AMG), activated by emotionally salient stimuli, including faces (Fitzgerald, Angstadt, Jelsone, Nathan, & Phan, 2006; Winston, O’Doherty, Kilner, Perrett, & Dolan, 2007); and the orbital frontal cortex (ORB) and other parts of the reward system, which are activated by positively and negatively reinforcing stimuli, including faces (Harris, van den Bos, Fiske, McClure, & Cohen, 2007; Liang, Zebrowitz, & Zhang, in press; Mitchell, Neil Macrae, & Banaji, 2005).
Consistent with the range of brain regions that are responsive to faces as well as with the ecological theory emphasis on a systems approach, there is considerable evidence that (p.14) the neural mechanisms for the different facets of face perception are distributed rather than localized in a particular brain region. For example, FFA activation may be modulated by activation in other brain regions (Critchley et al., 2000; Lewis et al., 2003; Vuilleumier & Pourtois, 2007; Zebrowitz, Luevano, Bronstad, & Aharon, 2009; Zhang, Zebrowitz, & Aharon, 2007), and the pattern of neural activation to faces versus objects can be distinguished even when FFA activation is ignored (Haxby et al., 2001). Other evidence for a distributed processing of face perception is provided by the finding that the activation of brain reward regions by highly attractive faces (Aharon et al., 2001) is greater for faces that are looking directly at the observer (Kampe, Frith, Dolan, & Frith, 2001) or smiling (O’Doherty et al., 2003), thereby demonstrating that the brain regions that process both facial expression and eye gaze are also involved in the processing of facial attractiveness. Similarly, eye gaze moderates activation of the amygdala by emotion expressions (Adams, Gordon, Baird, Ambady, & Kleck, 2003). Consistent with this evidence that a single brain region may be involved in the perception of multiple facial qualities, investigations of face-sensitive cells in macaques revealed that in addition to cells that responded to identity and others that responded to expression, some cells were sensitive to both qualities (Hasselmo, Rolls, & Baylis, 1989). Even the neural response to emotion expressions alone is distributed (Liang, Zebrowitz, & Aharon, 2009). A meta-analysis of brain areas consistently associated with particular emotional states identified six emotion networks that depend on the emotion and the category of eliciting stimuli (Wager et al., 2007).
There is temporal as well as spatial distribution in the neural substrates for face perception, as revealed in the neural response to facial stimuli in event-related potentials (ERPs). ERPs are electrical activity measured at the scalp that reflects the neural activity of large populations of cortical neurons. ERP components are negative- or positive-going voltage deflections measured on a millisecond timescale. For example, after initial attention to less racially familiar faces, shown in the N170 (a negative-going wave that peaks at approximately 170 milliseconds after stimulus onset), later ERP components indicate greater processing of more familiar own-race faces, with racially ambiguous faces treated like faces of one’s own race. Slightly later, racially ambiguous faces are differentiated from both of the other groups, capturing the explicit categorizations provided by perceivers (Ito & Urland, 2003; Xiaohu, Yuejia, Xing, Guofeng, & Jinghan, 2003). ERP responses to emotion faces arise as early as 120 ms after presentation, and other, more sustained, ERP responses arise later, as do neural responses recorded intracranially (see Vuilleumier & Pourtois, 2007, for a review). A fuller understanding of the neural mechanisms for face perception will require considering both the temporal and spatial patterns of brain responses to faces, and the connectivity among brain regions across time.
A shortcoming in research on neural mechanisms for face perception is the frequent reliance on two-dimensional static facial images. Faces appear on a three-dimensional head, on a body, with a voice. A better understanding of neural mechanisms requires examining those involved in processing animated faces. Indeed, brain regions implicated in processing facial expressions of emotion showed stronger activation to dynamic than static expressions (LaBar et al., 2003; Yoshikawa & Sato, 2006). A related departure from ecological theory is the immobilization of perceivers in an MRI scanner. Gibson (1979) emphasized that we see things not with our eyes but with our “eyes-in-the-head-on-the-body-resting-on-the-ground” (p. 205), and that the process of picking up visual information involves, in part, head turning and walking around. Indeed, fixating the eye region while viewing faces facilitates emotion recognition (Adolphs, 2006), and conceptual representation of objects, as indexed by neural processing measures, depends on the sensory-motor interactions the perceiver has previously had with them (Kiefer, Sim, Liebich, Hauk, & Tanaka, 2007). Moreover, as noted earlier, the relationship between perception and action is bi-directional. Even immobilized perceivers (p.15) show evidence of behavioral responses to faces. For example, there is greater activation in the motor cortex when viewing more attractive faces (Kawabata & Zeki, 2004), and research on mirror neurons shows neural activation to observed actions that mirrors activation to executed actions, suggesting that we may understand the affordances conveyed by another person’s facial movement through its effect on our own incipient motor responses (Knoblich & Sebanz, 2006).
As in behavioral research on face perception, the dearth of work on multimodal stimulus information is another shortcoming of the research on neural mechanisms for face perception. Consistent with the suggestion that there will be intermodal neural mechanisms for face perception, neurons in the central nucleus of the monkey amygdala, which sends outputs to other emotion-related brain areas, were activated by facial or vocal emotion cues alone, as well as by both together (Kuraoka & Nakamura, 2007). In humans, bodily expressions of fear elicit patterns of neural activation similar to those elicited by facial expressions (Hadjikhani & de Gelder, 2003). Similarly, the posterior superior temporal sulcus (STSp), known to be activated by visual stimuli depicting biological motion, is also activated by the acoustic cues of a person walking (Bidet-Caulet, Voisin, Bertrand, & Fonlupt, 2005).
Not only are the same brain regions activated by information from multiple modalities, but also stimuli in other modalities can modulate the neural activation to faces. For example, late ERPs suggested enhanced attention to unpleasant faces that were preceded by an incongruous pleasant odor (Bensafi, Pierson, et al., 2002). Such multimodal influences on face perception can be adaptive. Animal research has demonstrated that in order for classical conditioning to occur, it is necessary that the conditioned stimulus and unconditioned stimulus activate the same neural regions (LeDoux, 2007). Extrapolating to humans, this suggests that learning functional associations to faces requires that the same neural regions be activated by the faces (CS) and the primary or secondary reinforcing stimuli with which they are paired, be they auditory, olfactory, or tactile.
Although current technology is not conducive to brain imaging of mobile human perceivers interacting with moving, talking people, a partial solution is to present dynamic facial images while scanning, as well as using virtual reality methods that allow perceivers to control their virtual position with respect to another person. Such methods, coupled with manipulations of perceivers’ goals, can also accommodate the ecological dictum that perceiving is for doing. In particular, they may reveal that neural mechanisms for face perception significantly involve motor cortex and brain areas concerned with goal directed behavior, and that the mechanisms subserving perceptions of the same face may vary with the perceiver’s goals and the concomitant affordances for that perceiver. In short, the neural mechanisms underlying face perception are likely to be more fully revealed when they can be studied in the context of dynamic social interactions.
A fundamental tenet of the ecological approach is that the detection of social affordances depends on perceivers’ attunements—their sensitivity to the stimulus information that reveals particular affordances. Although Gibson emphasized the objective reality of affordances (“Each thing says what it is…a fruit says ‘eat me’…and woman says ‘love me’”; Koffka, quoted in Gibson, 1979, p. 138), he also emphasized their emergence from the interaction of the environment and the perceiver. For example, a woman’s face does not communicate “love me” to heterosexual women or gay men. Indeed, neuroimaging data reflect this: the medial orbitofrontal cortex, which is activated by rewarding stimuli, responds only to faces of the sex appropriate to the perceiver’s sexual orientation (Kranz & Ishai, 2006). In short, the concept of attunements captures the fact that what a person perceives in faces depends on what information exists, what information the person is able to detect, and what information is useful to that perceiver.
(p.16) Attunements develop through a process of perceptual learning that spans a variety of mechanisms, including imprinting, differentiation, unitization, attentional weighting, adaptation, prototype extraction, and stimulus generalization (cf. Goldstone, 1998). These mechanisms, which may operate in concert, all reflect Gibson’s idea of the education of attention (Gibson, 1966), and they may be modulated by perceivers’ behavioral goals and capabilities (Zebrowitz & Montepare, 2006). Perceptual learning sensitizes neural mechanisms to particular object qualities (Quinn, Westerlund, & Nelson, 2006), a process that may create attunements that influence perceptions of face familiarity, emotion, age, attractiveness, and the associated affordances. Effects of the stimulus generalization mechanism have already been discussed in the context of the overgeneralization hypotheses. Illustrative examples of the other perceptual learning mechanisms are provided below.
In imprinting, specialized detectors rapidly become attuned after birth, and the stimuli subsequently encountered shape the detectors’ sensitivity (Goldstone, 1998). A notable example is that two-day-old infants prefer their mother’s face to a female stranger’s (Bushnell, Sai, & Mullin, 1989; Walton, Bower, & Bower, 1992). Imprinting on the mother may be more critical for animals that are mobile soon after birth, such as Lorenz’s geese, but it serves the function of forming a lasting emotional bond between infant and caregiver. A generalization of infants’ early preference for their parent’s face is also manifest at three to four months of age, when they prefer faces of the same sex as their primary caregiver, whether male or female (Quinn, Yahr, Kuhn, Slater, & Pascalis, 2002). Imprinting may even affect mate preferences much later; men tend to facially resemble their wives’ adoptive fathers (Bereczkei, Gyuris, & Weisfeld, 2004). That the wives prefer their adoptive fathers’ appearance suggests that neither innate representations nor facial similarity to the self is involved, lending some support to the imprinting hypothesis.
Differentiation, Unitization, and Attentional Weighting
Differentiation is the aspect of perceptual learning emphasized by Eleanor Gibson, who defined it as “extracting the information that specifies relevant events and narrowing down from a vast manifold of information to the minimal, optimal information that specifies the affordance…” (2003, p. 286). Given the vast amount of information in human faces, perceivers must extract that which is most useful for successful social interactions. Unitization and attentional weighting are two ways to reduce the manifold of stimulus information.
In unitization, perceptions that originally required detection of several parts are achieved by integrating the target features into a single unit. Face recognition is impaired when faces are inverted, suggesting that perceptual experience causes faces to be perceived as a configural unit that is disrupted by inversion (Bruce & Young, 1998). Interestingly, recognition of other-race faces is less impaired by inversion, indicating less unitization (Rhodes, Brake, Taylor, & Tan, 1989), which is already evident in preschoolers (Sangrigoli, Pallier, Argenti, Ventureyra, & de Schonen, 2005).
In attentional weighting, there is increased attention to relevant features, as well as a perceptual narrowing, whereby irrelevant features are not attended. This mechanism may contribute to unitization insofar as relational aspects of facial structure become weighted more than isolated features. Attentional weighting is illustrated by developmental changes in the ability to discriminate faces of monkeys and humans. Six-month-old human infants discriminate faces of different monkeys and different humans. However, nine-month-olds and adults ignore features that differentiate monkey faces and are only able to discriminate among human faces, which, of course, are the faces that need to be differentiated to successfully perceive social affordances (Dufour, Pascalis, & Petit, 2006; Quinn et al., 2002).
Attentional weighting also contributes to variations in differentiation among human faces. The other-race effect is a prime example. In adulthood, in childhood, and even (p.17) by three months of age, people more easily recognize faces from familiar races, whereas those of an unfamiliar race tend to all look alike (Pezdek, Blandon-Gitlin, & Moore, 2003; Sangrigoli et al., 2005; Shapiro, 1986). This effect can be accounted for by greater attentional weighting of features that differentiate own-race faces (Furl et al., 2002; Rumelhart, Smolensky, McClelland, & Hinton, 1986). More specifically, Furl et al. (2002) showed that the other-race effect can be simulated with a computational model that warps facial feature space as a perceiver with experience of only one race would. The greater weight given to features of familiar race A is reflected in face space by a local warping, or expansion, of the region that contains members of race A. Race B occupies relatively less volume in face space because the features that differentiate members of race B are either ignored or minimized. Such attentional weighting is flexible, as evidenced by a reversal of the other-race effect among Korean adults who were adopted into Caucasian families between the ages of three and nine (Sangrigoli et al., 2005). For these individuals, Korean faces are less differentiable than Caucasian ones. Moreover, reduced differentiation of other-race faces is somewhat penetrable; when attention is explicitly focused on variations in facial qualities, people are able to differentiate faces of outgroup members, although it is unknown whether they do so using the same face processing mechanisms as for own-race faces (Hugenberg & Sczesny, 2006; Zebrowitz, Montepare, & Lee, 1993).
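The face-space warping account lends itself to a simple simulation. The sketch below is a hypothetical toy illustration, not Furl et al.'s actual model: faces are points in a two-dimensional feature space, identity-diagnostic variation lies on different dimensions for the two races, and an own-race-experienced perceiver's attentional weights expand the diagnostic dimension for race A while largely ignoring the one for race B, so other-race faces become less differentiable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch of the face-space warping idea. Identity-diagnostic
# variation lies on different feature dimensions for the two races; a
# perceiver experienced with race A weights A's diagnostic dimension and
# down-weights B's.
own_race = np.column_stack([rng.normal(0, 1.0, 50),    # dim 0: diagnostic for A
                            rng.normal(0, 0.2, 50)])
other_race = np.column_stack([rng.normal(4, 0.2, 50),  # offset from race A
                              rng.normal(0, 1.0, 50)]) # dim 1: diagnostic for B

weights = np.array([1.0, 0.2])  # attentional weighting: dim 1 largely ignored

def mean_pairwise_distance(faces):
    """Mean distance between all face pairs: a proxy for differentiability."""
    diffs = faces[:, None, :] - faces[None, :, :]
    d = np.sqrt((diffs ** 2).sum(-1))
    n = len(faces)
    return d.sum() / (n * (n - 1))

# In the weighted (warped) space, own-race faces spread out while other-race
# faces compress -- they "all look alike".
own_spread = mean_pairwise_distance(own_race * weights)
other_spread = mean_pairwise_distance(other_race * weights)
print(own_spread > other_spread)  # own-race faces are more differentiable
```

The same weighting reversed (heavier on dimension 1) would simulate the flipped other-race effect observed in the Korean adoptees.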
Just as people are better at differentiating among faces of their own race, so are they better at differentiating faces of their own age, sex, and sexual orientation. College-aged, middle-aged, and elderly participants each showed better recognition of faces within their own age category (Anastasi & Rhodes, 2006), and men and women showed better recognition of own-sex faces (Cross, Cross, & Daly, 1971; Slone, Brigham, & Meissner, 2000). Men also better recall the faces of men who share their sexual orientation (Rule, Ambady, Adams, & Macrae, 2007). Like the own-race bias, these effects suggest that there is greater differentiation among faces of the types of people with whom more social interaction occurs. However, research has not explicitly demonstrated greater attentional weighting of features that differentiate faces of one’s own age, sex, or sexual orientation, like that suggested by the computational model of face space that predicts own- and other-race recognition (Furl et al., 2002).
The own-race and own-age advantages in face recognition are also shown in emotion recognition. People are more accurate in judging the emotions of people from their own age group (Malatesta, Izard, Culver, & Nicolich, 1987). People also are more accurate in judging the emotions of people from racial and cultural groups with whom they have more perceptual experience. However, attunements to emotional expression are more complex than a simple own-group advantage. Specifically, it appears that the own-group advantage is more marked when emotions are expressed differently in own- and other-group faces (Elfenbein, Beaupré, Lévesque, & Hess, 2007), and it may derive in part from greater weighting of facial areas that are difficult to control by perceivers from cultures where concealing emotions is normative (Yuki, Maddux, & Masuda, 2007).
Differential attunement to the emotion information in own-race and own-age faces is consistent with the ecological theory emphasis on attunements in the service of perceiving adaptively relevant behavioral affordances, since it is most useful to perceive the emotions of those with whom we frequently interact. A nice illustration of the short-term development of attunements that reveal behavioral affordances is provided by research that exposed people to a series of faces, some of which were identified as “fair professors” and others of which were identified as “unfair professors.” Subtle differences in face shape differentiated the fair and unfair professors. After viewing these faces, perceivers used facial shape to categorize new faces as fair or unfair, although they were unaware of the facial cues that had differentiated the faces that served to develop their attunement (Hill, Lewicki, Czyzewska, & Schuller, 1990). (p.18)
In cognitive theories, a prototype is the mental representation of the central tendency of a category, which is like an average of the category members (Rosch, 1973). This differs from attentional weighting, since the central tendency of the category is an unweighted average of all of the feature dimensions. For example, many psychology researchers have an idea of what a greeble looks like (i.e., they have a greeble prototype), but cannot readily differentiate among greebles; those who are able to do so may differ from others in their attentional weighting of greeble features. Prototypes are important theoretical constructs in face perception, because they influence facial preferences, among other things. In particular, faces closer to the population prototype are judged more attractive (Bronstad, Langlois, & Russell, 2008a; Langlois & Roggman, 1990; Rhodes, 2006; Winkielman, Halberstadt, Fazendeiro, & Catty, 2006), and the own-race facial preferences that were discussed earlier (Zebrowitz, Bronstad, & Lee, 2007) may be explained, in part, by the fact that facial prototypes tend to differ for people of different races. Although the influence of prototypicality on preferences has been well documented, it is not clear why prototypes are preferred. Some candidate mechanisms include increased perceptual fluency for prototypes (Reber, Winkielman, & Schwarz, 1998), reduced negative affect, such as apprehension (Zajonc, 2001), and mate selection mechanisms (Halberstadt, 2006).
Research has demonstrated that we rapidly extract prototypes from the faces we view, contributing to flexible perceiver attunements. For example, after adults viewed several faces, they liked a prototype of these faces more than a prototype of faces they had not seen (Rhodes et al., 2001). Even infants appear to rapidly construct prototypes of faces they experience, reacting to an average of recently seen faces as if it were familiar (Rubenstein, Kalakanis, & Langlois, 1999). Different experiences with faces have noticeable influences early in life; infants prefer own-race faces only if they are living in a racially segregated environment (Bar-Haim, Ziv, Lamy, & Hodes, 2006). Long-term effects of experience with particular faces also have been demonstrated. Specifically, affiliated individuals have more similar facial preferences than do strangers, presumably due in part to greater similarity in the prototypes extracted from the faces in their shared environment (Bronstad & Russell, 2007).
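The prototype-extraction idea can be sketched computationally. This is an illustrative toy model with invented feature vectors, not any cited study's method: the prototype is the unweighted mean of experienced exemplars, and a face's "familiarity" is modeled as its average closeness to those exemplars, so a never-seen average of viewed faces registers as more familiar than the prototype of an unseen population.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration (invented data): faces as feature vectors drawn from
# two different populations, one of which the perceiver has experienced.
seen_faces = rng.normal(0.0, 1.0, size=(20, 4))    # faces the perceiver viewed
unseen_faces = rng.normal(2.0, 1.0, size=(20, 4))  # a different population

def extract_prototype(faces):
    """Central tendency of the category: an unweighted average of exemplars."""
    return faces.mean(axis=0)

def familiarity(face, exemplars):
    """Higher (less negative) when a face lies closer to experienced faces."""
    return -np.linalg.norm(exemplars - face, axis=1).mean()

seen_proto = extract_prototype(seen_faces)
unseen_proto = extract_prototype(unseen_faces)

# The average of recently viewed faces registers as familiar even though
# it was never itself seen, as in the infant findings described above.
print(familiarity(seen_proto, seen_faces) > familiarity(unseen_proto, seen_faces))
```

Because prototypes depend on the exemplars experienced, perceivers with different face diets (e.g., different races or shared households) end up with different prototypes, and hence different preferences.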
An interesting perceptual learning mechanism related to prototype extraction is the face adaptation aftereffect, which demonstrates a very rapid development of perceptual attunements. After briefly adapting to one face, perception of subsequent faces is biased relative to the adapting face. Adaptation effects can bias perception of face identity and sex (Leopold, O’Toole, Vetter, & Blanz, 2001; Webster, Kaping, Mizokami, & Duhamel, 2004), as well as preferences for faces. For example, when adults were briefly exposed to consistent distortions of normal faces, they experienced an aftereffect in which judgments of which faces looked most normal, and which most attractive, shifted toward the distorted faces (Rhodes, Jeffery, Watson, Clifford, & Nakayama, 2003). These aftereffects suggest that the average face—that is, the prototypical face—is used as a referent for coding facial appearance and that the prototype changes dynamically in response to external stimulation (cf. Clifford & Rhodes, 2005; Rhodes & Jeffery, 2006). The contribution of prototypicality to facial attractiveness is consistent with the previously cited evidence that averageness is one of the stimulus qualities that makes a face attractive. Short-term adaptation effects on face prototypicality also may explain the finding that short-term exposure to other-race faces increases the likability of other faces of that race (Zebrowitz, White, & Wieneke, 2008). Although the prototypical face is malleable, the changes that short-term perceptual adaptation induces do not appear to be lasting.
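A minimal norm-based coding sketch can capture the aftereffect's logic. The update rule and adaptation rate below are assumptions for illustration, not the cited studies' model: appearance is coded as deviation from a prototype referent, and adaptation shifts that referent toward the adapting face, so the distorted face subsequently looks more normal.

```python
import numpy as np

# Assumed norm-based coding sketch (illustrative only): facial appearance is
# coded relative to a prototype referent that shifts toward recently viewed
# faces, producing an adaptation aftereffect.
prototype = np.zeros(3)                      # current average-face referent
adapting_face = np.array([1.0, -0.5, 2.0])   # a consistently distorted face

def adapt(prototype, face, rate=0.3):
    """Move the referent a fraction of the way toward the adapting face."""
    return prototype + rate * (face - prototype)

def perceived_distortion(face, prototype):
    """How distorted a face looks: its deviation from the current referent."""
    return np.linalg.norm(face - prototype)

before = perceived_distortion(adapting_face, prototype)
prototype = adapt(prototype, adapting_face)
after = perceived_distortion(adapting_face, prototype)
print(after < before)  # after adaptation, the distorted face looks more normal
```

The transience of the aftereffect could be captured by letting the referent decay back toward its long-term average once adaptation ends, consistent with the observation that these changes do not appear to be lasting.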
There are other antecedents and consequences of attunements in addition to the effects of perceptual learning mechanisms. In particular, goal-directed variations in attentional focus may attune perceivers to particular facial information, and these attunements may influence the perception of behavioral affordances. For example, dominant perceivers, with the goal of controlling others, may notice how assertive people are, whereas dependent perceivers, with the goal of eliciting support from others, may notice how affiliative the same individuals are (Battistich & Aronoff, 1985). People who live in cultures with a high incidence of parasites, who have a higher need to steer clear of unfit associates, show a stronger preference for physically attractive mates (Gangestad & Buss, 1993). Similarly, perceivers who feel more vulnerable to disease show more negative reactions to ethnic outgroups (Faulkner, Schaller, Park, & Duncan, 2004; Navarrete & Fessler, 2006). These studies have obvious implications for individual differences in the anomalous-face and familiar-face overgeneralization effects.
Confluence in the Face Perception System
The principal insight from ecological theory, that face perception is focused on the detection of social affordances, suggests that those faces that communicate similar affordances—be it due to their familiarity, emotion, or some other attribute—will elicit similar behavioral responses and be perceived via similar neural mechanisms. Interestingly, commonalities in responses to various types of faces are often tracked by commonalities in the stimulus information that defines them. As noted earlier, there are similarities in the facial configurations of faces varying in emotion and maturity, and these are paralleled by similarities in perceived affordances (Marsh et al., 2005a; Zebrowitz et al., 2007). There are also parallels in the facial structure and the behavioral affordances associated with emotion and sex (Becker et al., 2007; Zebrowitz et al., in press) and with sex and maturity (Friedman & Zebrowitz, 1992). Men’s, mature, and angry faces share similar facial structures and perceived affordances, as do women’s, baby-faced, and fearful or surprised faces. Similarities in perceived affordances may also track similarities in distance from the average face. Principal component analyses (PCA) of the facial metrics of various face categories have shown that both disfigured and elderly faces are farther from the average face than are attractive faces (O'Toole et al., 1999; Bronstad, Zebrowitz, & Aharon, 2007). Own-race and less familiar other-race faces also are differently represented in face space (Caldara & Abdi, 2006; Furl et al., 2002), and it has been suggested that more familiar faces are closer to the average (Valentine, 1991). Thus, lower likability is associated with faces whose facial metrics are farther from the average in a perceiver’s face space, be it due to attractiveness, familiarity, or age.
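The idea that distance from the average face in a PCA-derived face space tracks evaluative responses can be sketched as follows. This is a toy illustration with synthetic "facial metrics," not data or code from the cited PCA studies; the number of faces, the six measurement dimensions, and the three components retained are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "facial metrics" (e.g., distances between landmarks) for 200 faces.
# Purely illustrative data, not measurements from any cited study.
faces = rng.normal(loc=0.0, scale=1.0, size=(200, 6))

# PCA via singular value decomposition of the mean-centered data.
mean_face = faces.mean(axis=0)
centered = faces - mean_face
_, _, components = np.linalg.svd(centered, full_matrices=False)

def distance_from_average(face, n_components=3):
    """Distance from the average face in the space of the first few principal components."""
    projection = (face - mean_face) @ components[:n_components].T
    return np.linalg.norm(projection)

# Deviate from the average face by the same fixed direction, to different degrees.
deviation = np.ones(6)
typical = mean_face + 0.1 * deviation   # a near-average face
atypical = mean_face + 3.0 * deviation  # a face far from the norm

print(distance_from_average(typical) < distance_from_average(atypical))  # True
```

Faces near the center of the space code as close to the prototype, while atypical faces, like the disfigured and elderly faces in the studies above, sit farther out along the principal components.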
Commonalities in the perceived affordances and stimulus information that differentiate various facial attributes have potential implications for the neural mechanisms involved in the perception of face familiarity, emotion, age, and attractiveness. To the extent that neural mechanisms have evolved to mirror the stimulus information provided in the perceiver’s visual world, there may be mechanisms tuned to multiple facial attributes rather than a set of totally independent processes. Indeed, we have seen that not only are the neural mechanisms for face perception spatially and temporally distributed, but also that the mechanisms associated with perceiving some facial attributes have similar signatures. It would be interesting to determine whether perceiving the same behavioral affordance in faces marked by a particular emotion expression, age, familiarity, or attractiveness level gives rise to a similar pattern of brain activation. Since babies, like fear expressions, are perceived as low in power, they may elicit similar patterns of neural activation, as may happy and attractive faces, both of which are perceived as high in likability and power. Although some research suggests that emotion and personality trait judgments (of body movements) rely on at least partly distinct neural circuitry (Heberlein & Saxe, 2005), that research did not compare emotions and traits that specify the same affordances, such as fear and submissiveness or happiness and likability. If basic facial qualities that share similar affordances elicit similar neural activation, then faces that resemble these basic qualities may do so as well, in keeping with the face overgeneralization hypotheses. Consistent with this suggestion, baby-faced men and babies, who communicate similar affordances, elicit similar patterns of neural activation (Zebrowitz et al., 2009).
The ecological theory of face perception emphasizes the perception of behavioral affordances, and it views such perceptions as closely coupled to action. It predicts that reactions to faces that specify adaptively significant affordances will be overgeneralized to other faces that merely resemble them. It gives high priority to identifying the stimulus information to which perceivers respond, emphasizing the importance of dynamic and multimodal information. It also emphasizes the need to explain the development of perceivers’ attunements to facial qualities. Research that embraces these ecological tenets can contribute much to our understanding of face perception.
The emphasis on perceiving affordances in faces suggests that many aspects of face perception are intertwined rather than independent, as suggested by the dual-process model. Placing a high priority on identifying the dynamic, multimodal stimulus information for face perception can enhance the face space model, and facilitate better prediction of evaluative and behavioral responses to particular faces. Focusing on the behavioral concomitants of face perception will move research into more ecologically valid settings, where the full breadth of attributes gleaned from faces can be better understood. Studying face perception in meaningful social interaction contexts also promises to provide a fuller understanding of the neural mechanisms that subserve this adaptively significant process. Finally, attention to perceiver differences in face perception beyond those accounted for by brain abnormalities will elucidate developmental processes that tune the perceptual system to particular facial qualities.
Adams, R. B., Jr., Gordon, H. L., Baird, A. A., Ambady, N., & Kleck, R. E. (2003). Effects of gaze on amygdala sensitivity to anger and fear faces. Science, 300(5625), 1536.
Adolphs, R. (2006). Perception and emotion: How we recognize facial expressions. Current Directions in Psychological Science, 15(5), 222–226.
Adolphs, R., Sears, L., & Piven, J. (2001). Abnormal processing of social information from faces in autism. Journal of Cognitive Neuroscience, 13(2), 232–240.
Aharon, I., Etcoff, N., Ariely, D., Chabris, C. F., O’Connor, E., & Breiter, H. C. (2001). Beautiful faces have variable reward value: fMRI and behavioral evidence. Neuron, 32(3), 537–551.
Alley, T. R. (1983). Infant head shape as an elicitor of adult protection. Merrill–Palmer Quarterly, 29, 411–427.
Ambadar, Z., Schooler, J. W., & Cohn, J. F. (2005). Deciphering the enigmatic face: The importance of facial dynamics in interpreting subtle facial expressions. Psychological Science, 16, 403–410.
Anastasi, J. S., & Rhodes, M. G. (2006). Evidence for an own–age bias in face recognition. North American Journal of Psychology, 8, 237–252.
Andrzejewski, S. A., Hall, J. A., & Salib, E. R. (2007). Anti–Semitism and identification of Jewish group membership from photographs. Journal of Nonverbal Behavior, 33(1), 47–58.
Aronoff, J., Barclay, A. M., & Stevenson, L. A. (1988). The recognition of threatening facial stimuli. Journal of Personality and Social Psychology, 54(4), 647–655.
Aronoff, J., Woike, B. A., & Hyman, L. M. (1992). Which are the stimuli in facial displays of anger and happiness? Configurational bases of emotion recognition. Journal of Personality and Social Psychology, 62, 1050–1066.
Bachmann, T., & Nurmoja, M. (2006). Are there affordances of suggestibility in facial appearance? Journal of Nonverbal Behavior, 30(2), 87–92.
Balaban, M. T. (1995). Affective influences on startle in five–month–old infants: Reactions to facial expressions of emotion. Child Development, 58, 28–36.
Bar–Haim, Y., Ziv, T., Lamy, D., & Hodes, R. M. (2006). Nature and nurture in own–race face processing. Psychological Science, 17(2), 159–163.
Bar, M., & Neta, M. (2006). Humans prefer curved visual objects. Psychological Science, 17(8), 645–648.
Bar, M., & Neta, M. (2007). Visual elements of subjective preference modulate amygdala activation. Neuropsychologia, 45(10), 2191–2200.
Bassili, J. N. (1978). Facial motion in the perception of faces and of emotional expression. Journal of Experimental Psychology: Human Perception and Performance, 4(3), 373–379.
Bassili, J. N. (1979). Emotion recognition: The role of facial movement and the relative importance of upper and lower areas of the face. Journal of Personality and Social Psychology, 37(11), 2049–2058.
Battistich, V. A., & Aronoff, J. (1985). Perceiver, target, and situational influences on social cognition: An interactional analysis. Journal of Personality and Social Psychology, 49(3), 788–798.
Beauvois, J.–L., & Dubois, N. (2000). Affordances in social judgment: Experimental proof of why it is a mistake to ignore how others behave towards a target and look solely at how the target behaves. Swiss Journal of Psychology – Zeitschrift für Psychologie – Revue Suisse de Psychologie, 59(1), 16–33.
Becker, D. V., Kenrick, D. T., Neuberg, S. L., Blackwell, K. C., & Smith, D. M. (2007). The confounded nature of angry men and happy women. Journal of Personality and Social Psychology, 92(2), 179–190.
Bensafi, M., Pierson, A., Rouby, C., Bertrand, B., Vigouroux, M., Jouvent, R., & Holley, A. (2002). Modulation of visual event-related potentials by emotional olfactory stimuli. Neurophysiologie Clinique/Clinical Neurophysiology, 32(6), 335–342.
Bereczkei, T., Gyuris, P., & Weisfeld, G. E. (2004). Sexual imprinting in human mate choice. Proceedings of the Royal Society of London, Series B: Biological Sciences, 271(1544), 1129–1134.
Berry, D. S. (1990). What can a moving face tell us? Journal of Personality and Social Psychology, 58(6), 1004–1014.
Berry, D. S., & McArthur, L. A. (1985). Some components and consequences of a babyface. Journal of Personality and Social Psychology, 48, 312–323.
Berry, D. S., & McArthur, L. A. (1986). Perceiving character in faces: The impact of age–related craniofacial changes on social perception. Psychological Bulletin, 100, 3–18.
Bidet–Caulet, A., Voisin, J., Bertrand, O., & Fonlupt, P. (2005). Listening to a walking human activates the temporal biological motion area. NeuroImage, 28, 132–139.
Blair, I. V., Judd, C. M., & Chapleau, K. M. (2004). The influence of Afrocentric facial features in criminal sentencing. Psychological Science, 15(10), 674–679.
Blair, I. V., Judd, C. M., Sadler, M. S., & Jenkins, C. (2002). The role of Afrocentric features in person perception: Judging by features and categories. Journal of Personality and Social Psychology, 83(1), 5–25.
Bornstein, R. F. (1989). Exposure and affect: Overview and meta–analysis of research 1968–1987. Psychological Bulletin, 106, 265–289.
Bornstein, R. F. (1993). Mere exposure effects with outgroup stimuli. In D. M. Mackie & D. L. Hamilton (Eds.), Affect, cognition, and stereotyping: Interactive processes in group perception.(pp. 195–211). San Diego: Academic Press.
Bronstad, P. M., Langlois, J. H., & Russell, R. (2007). Classifying spatial patterns of brain activity associated with human face categories, poster presented at the Neural Systems of Social Behaviour Conference, Austin, Tx, May, 11–13.
Bronstad, P. M., & Russell, R. (2007). Beauty is in the “we” of the beholder: Greater agreement on facial attractiveness among close relations. Perception, 36, 1674–1681.
Bronstad, P. M., Zebrowitz, L. A., & Aharon, I. (2007). Classifying spatial patterns of brain activity associated with human face categories, poster presented at the Neural Systems of Social Behavior Conference, Austin, Tx, May, 11–13.
Brown, T. D., Jr., Dane, F. C., & Durham, M. D. (1998). Perception of race and ethnicity. Journal of Social Behavior and Personality, 13(2), 295–306.
Bruce, V., Burton, M., Doyle, T., & Dench, N. (1989). Further experiments on the perception of growth in three dimensions. Perception and Psychophysics, 46, 528–536.
Bruce, V., Valentine, T., & Baddeley, A. (1987). The basis of the 3/4 view advantage in face recognition. Applied Cognitive Psychology, 1(2), 109–120.
Bruce, V., & Valentine, T. (1988). When a nod’s as good as a wink: The role of dynamic information in facial recognition. In M. M. Gruneberg, P. E. Morris, & R. N. Sykes (Eds.), Practical aspects of memory: Current research and issues, Vol. 1: Memory in everyday life (pp. 169–174). New York: John Wiley & Sons.
Bruce, V., & Young, A. (1998). In the eye of the beholder: The science of face perception. New York: Oxford University Press.
Burnham, D. (1993). Visual recognition of mother by young infants: Facilitation by speech. Perception, 22(10), 1133–1153.
Busey, T. A., Wenger, M. J., & Townsend, J. T. (2001). Formal models of familiarity and memorability in face recognition. Mahwah, NJ: Erlbaum.
Bushnell, I. W., Sai, F., & Mullin, J. T. (1989). Neonatal recognition of the mother’s face. British Journal of Developmental Psychology, 7(1), 3–15.
Caldara, R., & Abdi, H. (2006). Simulating the ‘other–race’ effect with autoassociative neural networks: Further evidence in favor of the face–space model. Perception, 35(5), 659–670.
Calder, A. J., Burton, A. M., Miller, P., Young, A. W., & Akamatsu, S. (2001). A principal component analysis of facial expressions. Vision Research, 41(9), 1179–1208.
Calder, A. J., & Young, A. W. (2005). Understanding the recognition of facial identity and facial expression. Nature Reviews Neuroscience, 6(8), 641–651.
Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial perception. Journal of Experimental Psychology: Human Perception and Performance, 26, 527–551.
Clifford, C. W. G., & Rhodes, G. (2005). Fitting the mind to the world: Adaptation and after–effects in high–level vision. Oxford: Oxford University Press.
Cohn, J. F., & Ekman, P. (2006). Measuring facial action. In J. A. Harrigan, R. Rosenthal, & K. R. Scherer (Eds.), The new handbook of methods in nonverbal behavior research (pp. 9–64). New York: Oxford University Press.
Critchley, H., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., et al. (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study. Human Brain Mapping, 9(2), 93–105.
Cross, J. F., Cross, J., & Daly, J. (1971). Sex, race, age, and beauty as factors in recognition of faces. Perception and Psychophysics, 10, 393–396.
Cuddy, A. J. C., & Fiske, S. T. (2002). Doddering but dear: Process, content, and function in stereotyping of older persons. In T. D. Nelson (Ed.), Ageism: Stereotyping and prejudice against older people (pp. 3–26). Cambridge, MA: MIT Press.
Cuddy, A. J. C., Norton, M. I., & Fiske, S. T. (2005). This old stereotype: The pervasiveness and persistence of the elderly stereotype. Journal of Social Issues, 61(2), 267–285.
Damasio, A. R., Tranel, D., Rizzo, M., & Mesulam, M. M. (2000). Disorders of complex visual processing. In Principles of behavioral and cognitive neurology (2nd ed., pp. 332–372). New York: Oxford University Press.
Deaner, R. O., Khera, A. V., & Platt, M. L. (2005). Monkeys pay per view: Adaptive valuation of social images by rhesus macaques. Current Biology, 15(6), 543–548.
DeBruine, L. (2002). Facial resemblance enhances trust. Proceedings of the Royal Society of London, Series B: Biological Sciences, 269, 1307–1312.
deGelder, B. (2006). Towards the neurobiology of emotional body language. Nature Reviews Neuroscience, 7, 242–249.
DePaulo, B. M., Blank, A. L., & Swaim, G. W. (1992). Expressiveness and expressive control. Personality and Social Psychology Bulletin, 18, 276–285.
DePaulo, B. M., & Rosenthal, R. (1978). Age changes in nonverbal decoding as a function of increasing amounts of information. Journal of Experimental Child Psychology, 26(2), 280–287.
Dufour, V., Pascalis, O., & Petit, O. (2006). Face processing limitation to own species in primates: a comparative study in brown capuchins, Tonkean macaques and humans. Behavioral Processes, 73, 107–113.
Eagly, A. H., Ashmore, R. D., Makhijani, M. G., & Longo, L. C. (1991). What is beautiful is good, but: A meta–analytic review of research on the physical attractiveness stereotype. Psychological Bulletin, 110(1), 109–128.
Eberhardt, J. L., Davies, P. G., Purdie–Vaughns, V. J., & Johnson, S. L. (2006). Looking deathworthy: Perceived stereotypicality of black defendants predicts capital–sentencing outcomes. Psychological Science, 17(5), 383–386.
Eberhardt, J. L., Goff, P. A., Purdie, V. J., & Davies, P. G. (2004). Seeing Black: Race, Crime, and Visual Processing. Journal of Personality and Social Psychology, 87(6), 876–893. (p.23)
Edwards, K. (1998). Temporal cues in facial expressions of emotion. Psychological Science, 9, 270–276.
Eibl–Eibesfeldt, I. (1989). Human ethology. New York: Aldine de Gruyter.
Ekman, P. (1971). Universals and cultural differences in facial expressions of emotion. Nebraska Symposium on Motivation, 19, 207–283.
Elfenbein, H. A., Beaupré, M., Lévesque, M., & Hess, U. (2007). Toward a dialect theory: Cultural differences in the expression and recognition of posed facial expressions. Emotion, 7, 131–146.
Ellis, H. D., Young, A. W., & Koenken, G. (1993). Covert face recognition with prosopagnosia. Behavioural Neurology, 6(1), 27–32.
Enlow, D. H. (1990). Facial growth (3rd ed.). Philadelphia: Harcourt Brace.
Faulkner, J., Schaller, M., Park, J. H., & Duncan, L. A. (2004). Evolved disease–avoidance mechanisms and contemporary xenophobic attitudes. Group Processes & Intergroup Relations, 7(4), 333–353.
Feingold, A. (1992). Good–looking people are not what we think. Psychological Bulletin, 111(2).
Fitzgerald, D. A., Angstadt, M., Jelsone, L. M., Nathan, P. J., & Phan, K. L. (2006). Beyond threat: Amygdala reactivity across multiple expressions of facial affect. NeuroImage, 30(4), 1441–1448.
Folstad, I., & Karter, A. J. (1992). Parasites, bright males and the immunocompetence handicap. American Naturalist, 139, 603–622.
Friedman, H., & Zebrowitz, L. A. (1992). The contribution of typical sex differences in facial maturity to sex role stereotypes. Personality & Social Psychology Bulletin, 18(4), 430–438.
Frost, P. (1988). Human skin color: A possible relationship between its sexual dimorphism and its social perception. Perspectives in Biology & Medicine, 32(1), 38–58.
Furl, N., Phillips, P. J., & O’Toole, A. J. (2002). Face recognition algorithms and the other–race effect: Computational mechanisms for a developmental contact hypothesis. Cognitive Science, 26(6), 797–815.
Gangestad, S. W., & Buss, D. M. (1993). Pathogen prevalence and human mate preferences. Ethology and Sociobiology, 14, 89–96.
Gibson, E. J. (1969). Principles of perceptual learning and development. New York: Appleton–Century–Crofts.
Gibson, E. J. (2003). The world is so full of a number of things: On specification and perceptual learning. Ecological Psychology, 15, 283–287.
Gibson, J. J. (1966). The senses considered as perceptual systems. Oxford, England: Houghton Mifflin.
Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Goldstone, R. L. (1998). Perceptual learning. Annual Review of Psychology, 49, 585–612.
Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41, 2261–2271.
Griffin, A. M., & Langlois, J. H. (2006). Stereotype directionality and attractiveness stereotyping: Is beauty good or is ugly bad? Social Cognition, 24(2), 187–206.
Hadjikhani, N., & de Gelder, B. (2003). Seeing fearful body expressions activates the fusiform cortex and amygdala. Current Biology, 13(24), 2201–2205.
Halberstadt, J. (2006). The generality and ultimate origins of the attractiveness of prototypes. Personality and Social Psychology Review, 10(2), 166–183.
Hamm, N. H., Baum, M. R., & Nikels, K. W. (1975). Effects of race and exposure on judgments of interpersonal favorability. Journal of Experimental Social Psychology, 11(1), 14–24.
Harris, L., van den Bos, W., Fiske, S., McClure, S., & Cohen, J. (2007). Neural evidence for the person positivity bias. Paper presented at the Neural Systems of Social Behavior Conference, Austin, Texas.
Hasselmo, M. E., Rolls, E. T., & Baylis, G. C. (1989). The role of expression and identity in the face–selective responses of neurons in the temporal visual cortex of the monkey. Behavioural Brain Research, 32(3), 203–218.
Haxby, J. V., Gobbini, M. I., Furey, M. L., Ishai, A., Schouten, J. L., & Pietrini, P. (2001). Distributed and overlapping representations of faces and objects in ventral temporal cortex. Science, 293(5539), 2425–2430.
Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2002). Human neural systems for face recognition and social communication. Biological Psychiatry, 51(1), 59–67.
Heberlein, A. S., & Saxe, R. R. (2005). Dissociation between emotion and personality judgments: convergent evidence from functional neuroimaging. NeuroImage, 770–777.
Helfrich, H. (1979). Age markers in speech. In K. R. Scherer & H. Giles (Eds.), Social markers in speech (pp. 63–107). Cambridge: Cambridge University Press.
Henss, R. (1991). Perceiving age and attractiveness in facial photographs. Journal of Applied Social Psychology, 21(11), 933–946.
Hershler, O., & Hochstein, S. (2006). With a careful look: Still no low–level confound to face pop–out. Vision Research, 46(18), 3028–3035.
Hess, U., Adams, R. B., Jr., & Kleck, R. E. (2004). Facial Appearance, Gender, and Emotion Expression. Emotion, 4(4), 378–388.
Hess, U., Blairy, S., & Kleck, R. E. (2000). The influence of facial emotion displays, gender, and ethnicity on judgments of dominance and affiliation. Journal of Nonverbal Behavior, 24, 265–283.
Hill, H., Bruce, V., & Akamatsu, S. (1995). Perceiving the sex and race of faces: The role of shape and colour. Proceedings of the Royal Society of London, Series B: Biological Sciences, 261, 367–373.
Hill, H., & Johnston, A. (2001). Categorizing sex and identity from the biological motion of faces. Current Biology, 11(11), 880–885.
Hill, T., Lewicki, P., Czyzewska, M., & Schuller, G. (1990). The role of learned inferential encoding rules in the perception of faces: Effects of non–conscious self–perpetuation of a bias. Journal of Experimental Social Psychology, 26, 350–371.
Hooker, C. I., Paller, K. A., Gitelman, D. R., Parrish, T. B., Mesulam, M. M., & Reber, P. J. (2003). Brain networks for analyzing eye gaze. Cognitive Brain Research, 17(2), 406–418.
Hugenberg, K., & Sczesny, S. (2006). On wonderful women and seeing smiles: Social categorization moderates the happy face response latency advantage. Social Cognition, 24(5), 516–539.
Ito, T. A., & Cacioppo, J. T. (2000). Electrophysiological evidence of implicit and explicit categorization processes. Journal of Experimental Social Psychology, 36(6), 660–676.
Ito, T. A., Chiao, K. W., Devine, P. G., Lorig, T. S., & Cacioppo, J. T. (2006). The influence of facial feedback on race bias. Psychological Science, 17(3), 256–261.
Ito, T. A., & Urland, G. R. (2003). Race and gender on the brain: Electrocortical measures of attention to the race and gender of multiply categorizable individuals. Journal of Personality and Social Psychology, 85(4), 616–626.
Izard, C. E. (1977). Human emotions. New York: Plenum Press.
Johnson, R. R. (2006). Confounding influences on police detection of suspiciousness. Journal of Criminal Justice, 34(4), 435–442.
Juslin, P. N., & Scherer, K. R. (2005). Vocal expression of affect. In J. A. Harrigan, R. Rosenthal & K. R. Scherer (Eds.), The new handbook of methods in nonverbal behavior research.(pp. 65–135). New York: Oxford University Press.
Kamachi, M., Hill, H., Lander, K., & Vatikiotis–Bateson, E. (2003). ‘Putting the face to the voice’: Matching identity across modality. Current Biology, 13(19), 1709–1714.
Kampe, K. K. W., Frith, C. D., Dolan, R. J., & Frith, U. (2001). Reward value of attractiveness and gaze. Nature, 413(6856), 589–590.
Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17(11), 4302–4311.
Katz, D. B., Nicolelis, M. A., & Simon, S. A. (2002). Gustatory processing is dynamic and distributed. Current Opinion in Neurobiology, 12(4), 448–454.
Kawabata, H., & Zeki, S. (2004). Neural correlates of beauty. Journal of Neurophysiology, 91(4), 1699–1705.
Kiefer, M., Sim, E.–J., Liebich, S., Hauk, O., & Tanaka, J. (2007). Experience–dependent plasticity of conceptual representations in human sensory–motor areas. Journal of Cognitive Neuroscience, 19(3), 525–542.
Knoblich, G., & Sebanz, N. (2006). The social nature of perception and action. Current Directions in Psychological Science, 15, 99–104.
Knutson, B. (1996). Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior, 20, 165–182.
Kranz, F., & Ishai, A. (2006). Face perception is modulated by sexual preference. Current Biology, 16, 63–68.
Kuraoka, K. & Nakamura, K. (2007). Responses of single neurons in monkey amygdala to facial and vocal emotions. Journal of Neurophysiology, 97(2), 1379–1387.
LaBar, K. S., Crupain, M. J., Voyvodic, J. T., & McCarthy, G. (2003). Dynamic perception of facial affect and identity in the human brain. Cerebral Cortex, 13(10), 1023–1033.
Laird, J. D. (1974). Self–attribution of emotion: The effects of expressive behavior on the quality of emotional experience. Journal of Personality and Social Psychology, 29(4), 475–486.
Langlois, J. H., Kalakanis, L., Rubenstein, A. J., Larson, A., Hallam, M., & Smoot, M. (2000). Maxims or myths of beauty? A meta-analytic and theoretical review. Psychological Bulletin, 126, 390–423.
Langlois, J. H., & Roggman, L. A. (1990). Attractive faces are only average. Psychological Science, 1(2), 115–121.
Lanzetta, J. T., & Orr, S. P. (1981). Stimulus properties of facial expressions and their influence on the classical conditioning of fear. Motivation and Emotion, 5(3), 225–234.
LeDoux, J. (2007, May 17). Fearful brains in an anxious world. Paper presented at the Eleventh International Conference on Cognitive and Neural Systems, Boston, MA.
Lee, K. J., Byatt, G., & Rhodes, G. (2000). Caricature effects, distinctiveness, and identification: Testing the face–space framework. Psychological Science, 11, 379–385.
Leopold, D. A., O’Toole, A. J., Vetter, T., & Blanz, V. (2001). Prototype-referenced shape encoding revealed by high–level aftereffects. Nature Neuroscience, 4(1), 89–94.
Levin, D. T. (2000). Race as a visual feature: Using visual search and perceptual discrimination tasks to understand face categories and the cross-race recognition deficit. Journal of Experimental Psychology: General, 129(4), 559–574.
Lewicki, P. (1985). Nonconscious biasing effects of single instances on subsequent judgments. Journal of Personality and Social Psychology, 48, 563–574.
Lewis, M. B. (2004). Face–space–R: Towards a unified account of face recognition. Visual Cognition, 11(1), 29–69.
Lewis, S., Thoma, R. J., Lanoue, M. D., Miller, G. A., Heller, W., Edgar, C., et al. (2003). Visual processing of facial affect. Neuroreport, 14(14), 1841–1845.
Liang, X., Zebrowitz, L. A., & Aharon, I. (2009). Effective connectivity between amygdala and orbitofrontal cortex differentiates the perception of facial expressions. Social Neuroscience, 4, 185–196.
Liang, X., Zebrowitz, L. A., & Zhang, Y. (in press). Neural Activation in the ‘Reward Circuit’ Shows a Nonlinear Response to Facial Attractiveness. Social Neuroscience.
Livingston, R. W., & Brewer, M. B. (2002). What are we really priming? Cue–based versus category–based processing of facial stimuli. Journal of Personality and Social Psychology, 82(1), 5–18.
Luevano, V. X. (2007). Truth in Advertising: The Relationship of Facial Appearance to Apparent and Actual Health Across the Lifespan. Unpublished Ph.D., Brandeis University, Waltham.
Lundqvist, D., & Ohman, A. (2005). Caught by the evil eye: Nonconscious information processing, emotion, and attention to facial stimuli. In L. F. Barrett, P. M. Niedenthal, & P. Winkielman (Eds.), Emotion and consciousness. New York: Guilford Press.
Maddox, K. B. (2004). Perspectives on racial phenotypicality bias. Personality and Social Psychology Review, 8(4), 383–401.
Malatesta, C. Z., Izard, C. E., Culver, C., & Nicolich, M. (1987). Emotion communication skills in young, middle–aged, and older women. Psychology and Aging, 2, 193–203.
Maner, J. K., Gailliot, M. T., & DeWall, C. N. (2007). Adaptive attentional attunement: evidence for mating–related perceptual bias. Evolution and Human Behavior, 28(1), 28–36.
Mark, L. E., & Todd, J. T. (1983). The perception of growth in three dimensions. Perception and Psychophysics, 33, 193–196.
Mark, L. S., Todd, J. T., & Shaw, R. E. (1981). Perception of growth: A geometric analysis of how different styles of change are distinguished. Journal of Experimental Psychology: Human Perception and Performance, 7(4), 855–868.
Marsh, A. A., Adams, R. B., Jr., & Kleck, R. E. (2005a). Why Do Fear and Anger Look the Way They Do? Form and Social Function in Facial Expressions. Personality and Social Psychology Bulletin, 31(1), 73–86.
Marsh, A. A., Ambady, N., & Kleck, R. E. (2005b). The effects of fear and anger facial expressions on approach and avoidance related behaviors. Emotion, 5, 119–124.
Mason, M. F., Cloutier, J., & Macrae, C. N. (2006). On construing others: Category and stereotype activation from facial cues. Social Cognition, 24(5), 540–562.
McArthur, L. Z., & Apatow, K. (1983). Impressions of baby-faced adults. Social Cognition, 2, 315–342.
McArthur, L. Z., & Baron, R. M. (1983). Toward an ecological theory of social perception. Psychological Review, 90, 215–238.
McKone, E., Kanwisher, N., & Duchaine, B. C. (2007). Can generic expertise explain special processing for faces? Trends in Cognitive Sciences, 11(1), 8–15.
Meissner, C. A., & Brigham, J. C. (2001). Thirty years of investigating the own–race bias in memory for faces: A meta–analytic review. Psychology, Public Policy, and Law, 7(1), 3–35.
Mignon, A., & Mollaret, P. (2002). Applying the affordance conception of traits: A person perception study. Personality and Social Psychology Bulletin, 28(10), 1327–1334.
Mitchell, J. P., Neil Macrae, C., & Banaji, M. R. (2005). Forming impressions of people versus inanimate objects: Social–cognitive processing in the medial prefrontal cortex. NeuroImage, 26(1), 251–257.
Miyake, K., & Zuckerman, M. (1993). Beyond personality impressions: Effects of physical and vocal attractiveness on false consensus, social comparison, affiliation, and assumed and perceived similarity. Journal of Personality, 61(3), 411–437.
Montepare, J. M., & Dobish, H. (2003). The contribution of emotion perceptions and their overgeneralizations to trait impressions. Journal of Nonverbal Behavior, 27, 237–254.
Montepare, J. M., & Zebrowitz–McArthur, L. (1988). Impressions of people created by age–related qualities of their gaits. Journal of Personality and Social Psychology, 55(4), 547–556.
Montepare, J. M., & Zebrowitz, L. A. (1998). “Person perception comes of age”: The salience and significance of age in social judgments. In M. Zanna (Ed.), Advances in experimental social psychology (Vol. 30, pp. 93–163). San Diego: Academic Press.
Morrison, E. R., Gralewski, L., Campbell, N., & Penton–Voak, I. S. (2007). Facial movement varies by sex and is related to attractiveness. Evolution and Human Behavior, 28(3), 186–192.
Myowa–Yamakoshi, M., Yamaguchi, M. K., Tomonaga, M., Tanaka, M., & Matsuzawa, T. (2005). Development of face recognition in infant chimpanzees (Pan troglodytes). Cognitive Development, 20(1), 49–63.
Nakayama, K. (1994). James J. Gibson: An appreciation. Psychological Review, 101, 329–335.
Navarrete, C. D., & Fessler, D. M. T. (2006). Disease avoidance and ethnocentrism: the effects of disease vulnerability and disgust sensitivity on intergroup attitudes. Evolution and Human Behavior, 27(4), 270–282.
Niedenthal, P. M. (2007). Embodying emotion. Science, 316, 1002–1005.
O’Doherty, J., Winston, J., Critchley, H., Perrett, D., Burt, D. M., & Dolan, R. J. (2003). Beauty in a smile: The role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia, 41(2), 147–155.
O’Toole, A. J., Roark, D. A., & Abdi, H. (2002). Recognizing moving faces: A psychological and neural synthesis. Trends in Cognitive Sciences, 6(6), 261–266.
O’Toole, A. J., Wenger, M. J., & Townsend, J. T. (2001). Quantitative models of perceiving and remembering faces: Precedents and possibilities. In M. J. Wenger & J. T. Townsend (Eds.), Computational, geometric, and process perspectives on facial cognition: Contexts and challenges (pp. 1–38). Mahwah, NJ: Erlbaum.
O’Toole, A. J., Price, T., Vetter, T., Bartlett, J. C., & Blanz, V. (1999). 3D shape and 2D surface textures of human faces: The role of “averages” in attractiveness and age. Image and Vision Computing, 18(1), 9–19.
Ohman, A., & Dimberg, U. (1978). Facial expressions as conditioned stimuli for electrodermal responses: A case of ‘preparedness’? Journal of Personality and Social Psychology, 36(11), 1251–1258.
Olson, I. R., & Marshuetz, C. (2005). Facial attractiveness is appraised in a glance. Emotion, 5(4), 498–502.
Orr, S. P., & Lanzetta, J. T. (1984). Extinction of an emotional response in the presence of facial expressions of emotion. Motivation and Emotion, 8(1), 55–66.
Palermo, R., & Rhodes, G. (2007). Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia, 45(1), 75–92.
Pelphrey, K. A., Morris, J. P., Michelich, C. R., Allison, T., & McCarthy, G. (2005). Functional anatomy of biological motion perception in posterior temporal cortex: An fMRI study of eye, mouth and hand movements. Cerebral Cortex, 15(12), 1866–1876.
Pentland, B., Pitcairn, T. K., Gray, J. M., & Riddle, W. J. R. (1987). The effects of reduced expression in Parkinson’s disease on impression formation by health professionals. Clinical Rehabilitation, 1, 307–313.
Peters, M., Rhodes, G., & Simmons, L. W. (2007). Contributions of the face and body to overall attractiveness. Animal Behaviour, 73(6), 937–942.
Pezdek, K., Blandon–Gitlin, I., & Moore, C. (2003). Children’s face recognition memory: More evidence for the cross–race effect. Journal of Applied Psychology, 88(4), 760–763.
Pike, G. E., Kemp, R. I., Towell, N. A., & Phillips, K. C. (1997). Recognizing moving faces: The relative contribution of motion and perspective view information. Visual Cognition, 4(4), 409–437.
Pittenger, J. B., Shaw, R. E., & Mark, L. S. (1979). Perceptual information for the age–level of faces as a higher order invariant of growth. Journal of Experimental Psychology: Human Perception and Performance, 5, 478–493.
Prodan, C., Orbelo, D., Testa, J., & Ross, E. (2001). Hemispheric differences in recognizing upper and lower facial displays of emotion. Neuropsychiatry, Neuropsychology, & Behavioral Neurology, 14(4), 206–212.
Quinn, K. A., & Macrae, C. N. (2005). Categorizing others: The dynamics of person construal. Journal of Personality and Social Psychology, 88(3), 467–479.
Quinn, P. C., Westerlund, A., & Nelson, C. A. (2006). Neural markers of categorization in 6–month–old infants. Psychological Science, 17(1), 59–66.
Quinn, P. C., Yahr, J., Kuhn, A., Slater, A. M., & Pascalis, O. (2002). Representation of the gender of human faces by infants: A preference for female. Perception, 31(9), 1109–1121.
Reber, R., Winkielman, P., & Schwarz, N. (1998). Effects of perceptual fluency on affective judgments. Psychological Science, 9(1), 45–48.
Reis, H. T., Wilson, I. M., Monestere, C., Bernstein, S., Clark, K., Seidl, E., et al. (1990). What is smiling is beautiful and good. European Journal of Social Psychology, 20, 259–267.
Rhodes, G. (2006). The evolutionary psychology of facial beauty. Annual Review of Psychology, 57, 199–226.
Rhodes, G., Brake, S., Taylor, K., & Tan, S. (1989). Expertise and configural coding in face recognition. British Journal of Psychology, 80(3), 313–331.
Rhodes, G., Brennan, S., & Carey, S. (1987). Identification and ratings of caricatures: Implications for mental representations of faces. Cognitive Psychology, 19(4), 473–497.
Rhodes, G., Halberstadt, J., & Brajkovich, G. (2001). Generalization of mere exposure effects to averaged composite faces. Social Cognition, 19(1), 57–70.
Rhodes, G., & Jeffery, L. (2006). Adaptive norm-based coding of facial identity. Vision Research, 46(18), 2977–2987.
Rhodes, G., Jeffery, L., Watson, T. L., Clifford, C. W. G., & Nakayama, K. (2003). Fitting the mind to the world: Face adaptation and attractiveness aftereffects. Psychological Science, 14(6), 558–566.
Riggio, R. E., & Friedman, H. (1986). Impression formation: The role of expressive behavior. Journal of Personality and Social Psychology, 50, 421–427.
Rosch, E. H. (1973). Natural categories. Cognitive Psychology, 4, 328–350.
Rosenberg, S., Nelson, C., & Vivekananthan, P. S. (1968). A multidimensional approach to the structure of personality impressions. Journal of Personality and Social Psychology, 9(4), 283–294.
Rubenstein, A. J., Kalakanis, L., & Langlois, J. H. (1999). Infant preferences for attractive faces: A cognitive explanation. Developmental Psychology, 35, 848–855.
Rule, N. O., Ambady, N., Adams, R. B., & Macrae, C. N. (2007). Us and them: Memory advantages in perceptually ambiguous groups. Psychonomic Bulletin & Review, 14, 687–692.
Rumelhart, D. E., Smolensky, P., McClelland, J. L., & Hinton, G. E. (1986). Schemata and sequential thought processes in PDP models. In J. L. McClelland, D. E. Rumelhart, & the PDP Research Group (Eds.), Parallel distributed processing (Vol. 2). Cambridge, MA: MIT Press.
Sadr, J., Jarudi, I., & Sinha, P. (2003). The role of eyebrows in face recognition. Perception, 32(3), 285–293.
Said, C. P., Sebe, N., & Todorov, A. (2009). Structural resemblance to emotional expressions predicts evaluation of emotionally neutral faces. Emotion, 9(2), 260–264.
Sangrigoli, S., Pallier, C., Argenti, A. M., Ventureyra, V. A. G., & de Schonen, S. (2005). Reversibility of the other-race effect in face recognition during childhood. Psychological Science, 16(6), 440–444.
Sato, W., & Yoshikawa, S. (2004). The dynamic aspects of emotional facial expressions. Cognition & Emotion, 18(5), 701–710.
Sato, W., & Yoshikawa, S. (2007). Enhanced experience of emotional arousal in response to dynamic facial expressions. Journal of Nonverbal Behavior, 31, 119–135.
Scheib, J. E., Gangestad, S. W., & Thornhill, R. (1999). Facial attractiveness, symmetry, and cues to good genes. Proceedings of the Royal Society of London, Series B: Biological Sciences, 266, 1913–1917.
Schneider, F., Heimann, H., Himer, W., & Huss, D. (1990). Computer-based analysis of facial action in schizophrenic and depressed patients. European Archives of Psychiatry and Clinical Neuroscience, 240(2), 67–76.
Secord, P. (1958). Facial features and inference processes in interpersonal perception. In R. Tagiuri & L. Petrullo (Eds.), Person Perception and Interpersonal Behavior(pp. 300–315). Stanford, CA: Stanford University Press.
Shapiro, P. N., & Penrod, S. (1986). Meta-analysis of facial identification studies. Psychological Bulletin, 100(2), 139–156.
Shaw, R., & Pittenger, J. (Eds.). (1977). Perceiving the face of change in changing faces: Implications for a theory of object perception. Hillsdale, NJ: Erlbaum.
Slone, A., Brigham, J., & Meissner, C. (2000). Social and cognitive factors affecting the own–race bias in Whites. Basic and Applied Social Psychology, 22, 71–84.
Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.
Smith, L., Muir, D., Pascalis, O., & Slater, A. (2003). Infant perception of dynamic faces: Emotion, inversion and eye direction effects. In O. Pascalis & A. Slater (Eds.), The development of face processing in infancy and early childhood: Current perspectives (pp. 119–130). Nova Science Publishers.
Sommerville, J. A., & Decety, J. (2006). Weaving the fabric of social interaction: Articulating developmental psychology and cognitive neuroscience in the domain of motor cognition. Psychonomic Bulletin & Review, 13(2), 179–200.
Stenberg, G., Wiking, S., & Dahl, M. (1998). Judging words at face value: Interference in a word processing task reveals automatic processing of affective facial expressions. Cognition & Emotion, 12(6), 755–782.
Stevenage, S. V. (1998). Which twin are you? A demonstration of induced categorical perception of identical twin faces. British Journal of Psychology, 89, 39–57.
Strom, M., Zhang, S., Zebrowitz, L. A., Bronstad, P. M., & Lee, H. K. (2008). Race-related facial qualities contribute to stereotyping by White, Black, and Korean judges. Unpublished manuscript, Brandeis University, Waltham, MA.
Thornhill, R., & Gangestad, S. W. (1999). Facial attractiveness. Trends in Cognitive Sciences, 3(12), 452–460.
Todd, J. T., Mark, L. S., Shaw, R. E., & Pittenger, J. B. (1980). The perception of human growth. Scientific American, 24, 106–114.
Vaish, A., & Striano, T. (2004). Is visual reference necessary? Contributions of facial versus vocal cues in 12–month–olds’ social referencing behavior. Developmental Science, 7(3), 261–269.
Valentine, T. (1991). A unified account of the effects of distinctiveness, inversion, and race in face recognition. The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 43(2), 161–204.
Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45(1), 174–194.
Wager, T. D., Barrett, L. F., Bliss–Moreau, E., Lindquist, K., Duncan, S., Kober, H., et al. (2007). The neuroimaging of emotion. New York: Columbia University.
Wagner, H. L., MacDonald, C. J., & Manstead, A. S. (1986). Communication of individual emotions by spontaneous facial expressions. Journal of Personality and Social Psychology, 50(4), 737–743.
Walton, G. E., Bower, N. J., & Bower, T. G. (1992). Recognition of familiar faces by newborns. Infant Behavior & Development, 15, 265–269.
Walton, J. H., & Orlikoff, R. F. (1994). Speaker race identification from acoustic cues in the vocal signal. Journal of Speech & Hearing Research, 37(4), 738–745.
Webster, M. A., Kaping, D., Mizokami, Y., & Duhamel, P. (2004). Adaptation to natural facial categories. Nature, 428(6982), 557–561.
Wehrle, T., Kaiser, S., Schmidt, S., & Scherer, K. R. (2000). Studying the dynamics of emotional expression using synthesized facial muscle movements. Journal of Personality and Social Psychology, 78(1), 105–119.
Wiggins, J. S. (1996). An informal history of the interpersonal circumplex tradition. Journal of Personality Assessment, 66, 217–233.
Willis, J., & Todorov, A. (2006). First impressions: Making up your mind after a 100-ms exposure to a face. Psychological Science, 17(7), 592–598.
Winkielman, P., Halberstadt, J., Fazendeiro, T., & Catty, S. (2006). Prototypes are attractive because they are easy on the mind. Psychological Science, 17, 799–806.
Winston, J. S., O’Doherty, J., Kilner, J. M., Perrett, D. I., & Dolan, R. J. (2007). Brain systems for assessing facial attractiveness. Neuropsychologia, 45(1), 195–206.
Xiaohu, P., Yuejia, L., Xing, W., Guofeng, W., & Jinghan, W. (2003). The mechanism of other race effect between Eastern and Western faces revealed by electrophysiology study. Acta Psychologica Sinica, 35(1), 49–55.
Yoshikawa, S., & Sato, W. (2006). Enhanced perceptual, emotional, and motor processing in response to dynamic facial expressions of emotion. Japanese Psychological Research, 48(3), 213–222.
Young, A. W., Hellawell, D., & Hay, D. C. (1987). Configurational information in face perception. Perception, 16(6), 747–759.
Yovel, G., Levy, J., Grabowecky, M., & Paller, K. A. (2003). Neural correlates of the left-visual-field superiority in face perception appear at multiple stages of face processing. Journal of Cognitive Neuroscience, 15(3), 462–474.
Yuki, M., Maddux, W. W., & Masuda, T. (2007). Are the windows to the soul the same in the East and West? Cultural differences in using the eyes and mouth as cues to recognize emotions in Japan and the United States. Journal of Experimental Social Psychology, 43(2), 303–311.
Zajonc, R. B. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9(2, PT. 2), 1–27.
Zajonc, R. B. (2001). Mere exposure: A gateway to the subliminal. Current Directions in Psychological Science, 10(6), 224–228.
Zebrowitz, L. A. (1996). Physical appearance as a basis of stereotyping. In N. MacRae, M. Hewstone, & C. Stangor (Eds.), Foundations of stereotypes and stereotyping (pp. 79–120). New York: Guilford Press.
Zebrowitz, L. A. (1997). Reading faces: Window to the soul? Boulder, CO: Westview Press.
Zebrowitz, L. A. (2006). Finally, faces find favor. Social Cognition, 24(5), 657–701.
Zebrowitz, L. A., Andreoletti, C., Collins, M. A., Lee, S. Y., & Blumenthal, J. (1998). Bright, bad, babyfaced boys: Appearance stereotypes do not always yield self-fulfilling prophecy effects. Journal of Personality and Social Psychology, 75(5), 1300–1320.
Zebrowitz, L. A., Barglow, J., Bronstad, P. M., & Lee, H. K. (2008). Contribution of skin tone to White, Black, and Korean judges' impressions of own- and other-race faces. Unpublished manuscript, Brandeis University.
Zebrowitz, L. A., Bronstad, P. M., & Lee, H. K. (2007a). The contribution of face familiarity to ingroup favoritism and stereotyping. Social Cognition, 25, 306–338.
Zebrowitz, L. A., & Collins, M. A. (1997). Accurate social perception at zero acquaintance: The affordances of a Gibsonian approach. Personality and Social Psychology Review, 1, 204–223.
Zebrowitz, L. A., Collins, M. A., & Dutta, R. (1998). The relationship between appearance and personality across the life span. Personality & Social Psychology Bulletin, 24(7), 736–749.
Zebrowitz, L. A., Fellous, J. M., Mignault, A., & Andreoletti, C. (2003). Trait impressions as overgeneralized responses to adaptively significant facial qualities: Evidence from connectionist modeling. Personality and Social Psychology Review, 7(3), 194–215.
Zebrowitz, L. A., Hall, J. A., Murphy, N. A., & Rhodes, G. (2002). Looking smart and looking good: Facial cues to intelligence and their origins. Personality and Social Psychology Bulletin, 28(2), 238–249.
Zebrowitz, L. A., Kikuchi, M., & Fellous, J.-M. (2007). Are effects of emotion expression on trait impressions mediated by babyfaceness? Evidence from connectionist modeling. Personality and Social Psychology Bulletin, 33(5), 648–662.
Zebrowitz, L. A., Kikuchi, M., & Fellous, J. M. (in press). Facial resemblance to emotions: Group differences, impression effects, and race stereotypes. Journal of Personality and Social Psychology.
Zebrowitz, L. A., & Lee, S. Y. (1999). Appearance, stereotype incongruent behavior, and social relationships. Personality and Social Psychology Bulletin, 25, 569–584.
Zebrowitz, L. A., Luevano, V. X., Bronstad, P. M., & Aharon, I. (2009). Neural activation to babyfaced men matches activation to babies. Social Neuroscience, 4, 185–196.
Zebrowitz, L. A., & Montepare, J. M. (1992). Impressions of babyfaced individuals across the life span. Developmental Psychology, 28(6), 1143–1152.
Zebrowitz, L. A., & Montepare, J. M. (2006). The ecological approach to person perception: Evolutionary roots and contemporary offshoots. In M. Schaller, J. A. Simpson, & D. T. Kenrick (Eds.), Evolution and social psychology (pp. 81–113). New York: Psychology Press.
Zebrowitz, L. A., Montepare, J. M., & Lee, H. K. (1993). They don’t all look alike: Individual impressions of other racial groups. Journal of Personality and Social Psychology, 65(1), 85–101.
Zebrowitz, L. A., & Rhodes, G. (2002). Nature let a hundred flowers bloom: The multiple ways and wherefores of attractiveness. In G. Rhodes & L. A. Zebrowitz (Eds.), Facial attractiveness: Evolutionary, cognitive, and social perspectives. Advances in visual cognition (Vol. 1, pp. 261–293). Westport, CT: Ablex.
Zebrowitz, L. A., & Rhodes, G. (2004). Sensitivity to “bad genes” and the anomalous face overgeneralization effect: Cue validity, cue utilization, and accuracy in judging intelligence and health. Journal of Nonverbal Behavior, 28(3), 167–185.
Zebrowitz, L. A., Voinescu, L., & Collins, M. A. (1996). “Wide–eyed” and “crooked–faced”: Determinants of perceived and real honesty across the life span. Personality and Social Psychology Bulletin, 22(12), 1258–1269.
Zebrowitz, L. A., White, B., & Weineke, K. (2008). Mere exposure and racial prejudice: Exposure to other-race faces increases liking for strangers of that race. Social Cognition, 26, 259–275.
Zhang, Y. (2007). Perceiving is for doing: Exploring the approach–avoidance predispositions of face perception. Unpublished manuscript, Brandeis University.
Zhang, Y., Zebrowitz, L. A., & Aharon, I. (2007). Neural substrates of perceiving variations in facial attractiveness. Paper presented at the Organization for Human Brain Mapping, Chicago, IL, June 10–14, 2007.
Zuckerman, M., Miyake, K., & Hodgins, H. S. (1991). Cross-channel effects of vocal and physical attractiveness and their implications for interpersonal perception. Journal of Personality and Social Psychology, 60(4), 545–554.
1 These dimensions have sometimes been labeled Warmth and Competence or Dominance, but the contributing attributes are the same. Likability/Warmth includes attributes such as warmth, approachability, and sociability; Power/Competence/Dominance includes attributes such as dominance, shrewdness, intelligence, and physical strength.
2 In contrast, when people are making judgments in response to verbal probes rather than to faces, they rate elderly people as higher in likability than young adults (warmth), albeit still lower in power (competence) (Cuddy, Norton, & Fiske, 2005).
3 Another overgeneralization hypothesis is animal analogies, whereby people may be perceived to have traits that are associated with the animals that their features resemble (Zebrowitz, 1997, pp. 58–61).
4 Although impressions in this study were not accurate, other research using a sample of faces that was representative of the population revealed that lower-than-average facial attractiveness predicted lower health and intelligence, whereas higher-than-average attractiveness did not predict higher fitness. Nevertheless, consistent with the anomalous face overgeneralization hypothesis, intelligence and health were perceived to vary not only from low to moderate levels of attractiveness, but also from moderate to high levels (Zebrowitz & Rhodes, 2004).