Abstracts 2005 - The Psychonomic Society
Posters 5030–5036 Saturday Evening<br />
(5030)<br />
Evidence for Gender Differences in Memory for Names and Faces.<br />
KIMBERLY D. BATES & SHELIA M. KENNISON, Oklahoma State<br />
University (sponsored by Shelia M. Kennison)—The research investigated<br />
gender differences in memory for names and faces. Prior research<br />
has shown that women remember faces better than do men. It<br />
has further been shown that both men and women remember names<br />
better when they refer to individuals of the same gender than to those of<br />
the opposite gender. In the present research, participants were shown<br />
picture–name combinations. Names were unfamiliar and gender neutral<br />
(e.g., Sharg, Tilk). Counterbalancing was used to ensure that each<br />
name was paired with a male and a female picture. In two experiments,<br />
participants studied 40 picture–name combinations. In Experiment<br />
1, participants were later instructed to recall names when presented<br />
with the picture. In Experiment 2, participants were asked to<br />
match the name with the appropriate picture. The results of both experiments<br />
showed that women remembered names of men and women<br />
equally well but that men remembered names of men significantly<br />
more accurately than names of women.<br />
(5031)<br />
Task and Strategic Influences on the Eye-Movement–Based Memory<br />
Effect. TRAVIS L. SEYMOUR, CHRIS BAKER, & JOSH GAUNT,<br />
University of California, Santa Cruz—We measured eye movements<br />
to faces in an exclude–recognition task where one of two old lists was<br />
to be accepted as “old” and the other as “new” along with new filler<br />
faces. Previously, Althoff and Cohen (1999) showed that a host of eyebased<br />
measures differentiate familiar from nonfamiliar faces (e.g.,<br />
number of fixations, regions sampled, and different looking patterns).<br />
We compared eye movements during a familiarity task with those in<br />
the exclude–recognition task, where source recollection is required.<br />
Results show that whereas the familiarity task replicates previous eye<br />
movement patterns (longer, more constrained movements to new<br />
faces), exclude tasks lead to the opposite pattern (slower, more constrained<br />
looking for old items). Surprisingly, even the looking patterns<br />
for new items differ across tasks. We suggest that eye movements in<br />
recognition are dependent on task strategy. Also, despite previous reports,<br />
we show that unconscious behaviors, such as pupil dilation, but<br />
not blinking, can index prior exposure.<br />
(5032)<br />
Viewpoint-Dependent Performance for Faces Rotated About Pitch<br />
and Yaw Axes. SIMONE K. FAVELLE, STEPHEN A. PALMISANO,<br />
& RYAN MALONEY, University of Wollongong—When investigating<br />
the extent to which the processing of faces is dependent on viewpoint,<br />
studies typically consider rotation in depth (yaw; rotation around the<br />
head’s vertical axis) and compare performance between front-on and<br />
profile views. However, another common, although less studied,<br />
source of viewpoint variation in faces is rotation in pitch (rotation<br />
around the head’s horizontal axis). In the present study, we systematically<br />
examined and compared the effect of viewpoint change in both<br />
the pitch and the yaw axes on human face recognition. We used a sequential<br />
matching task in which stimuli were images of real female<br />
Caucasian faces shown at 15° increments around pitch (0°–75°) and<br />
yaw (0°–90°). Performance on both axes was dependent on viewpoint;<br />
however, participants responded more slowly and with less accuracy<br />
when matching faces rotated around pitch as opposed to around yaw.<br />
(5033)<br />
Short-Term Adaptations of Face Configural Representation.<br />
PAMELA M. PALLETT & KANG LEE, University of California, San<br />
Diego (sponsored by Victor S. Ferreira)—Configural information plays<br />
an important role in face processing. Lifelong experience with faces<br />
leads one to acquire common configural representations. Are these<br />
representations malleable with short-term experiences, despite an arduous<br />
acquisition process? To address this, our study capitalized on<br />
an illusion demonstrating that perception of an oval containing a face<br />
is influenced by face configuration. Moving the eyes and mouth closer<br />
to the nose makes the oval appear rounder. Subjects adapted to 64 similarly<br />
shortened faces and provided magnitude estimations for an oval<br />
surrounding an unseen shortened face. Subjects perceived a strong<br />
shape illusion. Exposure to shortened faces decreased this illusion,<br />
whereas normal face exposure failed to produce this shift. These findings<br />
suggest that our face configural representations are malleable with<br />
short-term experience. The present results add to converging evidence<br />
for high-level nonretinotopically organized face adaptation that distinctly<br />
differs from adaptation to basic psychophysical properties.<br />
(5034)<br />
The Influence of Familiarity on Sex Decisions to Faces and Names.<br />
ROBERT A. JOHNSTON & RUTH CLUTTERBUCK, University of<br />
Birmingham—According to the model of face recognition by Bruce<br />
and Young (1986), sex analysis occurs independently of identity<br />
analysis, and as a consequence, no influence of familiarity should be<br />
found on the time taken to perform sex decisions. Results of recent<br />
behavioral studies cast doubt upon this claim. Two experiments are reported<br />
that explore the influence of familiarity on sex decisions to<br />
faces (Experiment 1) and surnames (Experiment 2) of different levels<br />
of familiarity. In Experiment 1, participants were able to assign sex<br />
more quickly to highly familiar faces than to unfamiliar faces. Therefore,<br />
familiarity can influence the speed at which sex is analyzed from<br />
faces. Similarly, in Experiment 2, participants were able to assign sex<br />
and familiarity more quickly to highly familiar surnames than to moderately<br />
familiar surnames. These findings are discussed in relation to<br />
the influence of sex information from identity-specific semantics, and<br />
an explanation is offered on the basis of Burton et al.’s (1990) IAC<br />
model of face recognition.<br />
(5035)<br />
Amygdalae and Affective Facial Expressions. ROMINA PALERMO,<br />
Macquarie University, LAURIE MILLER, Royal Prince Alfred Hospital,<br />
& MAX COLTHEART, Macquarie University—The ability to recognize<br />
facial expressions of emotion is crucial for successful social<br />
interactions. One neural structure that appears to be involved in processing<br />
information from facial expressions is the amygdala. We have<br />
been investigating how people with temporal lobe epilepsy, who may<br />
have amygdala damage, recognize facial expressions of emotion. In<br />
one study, we examined whether people who have affective auras prior<br />
to their seizures recognize facial expressions differently than do those<br />
who do not have auras. Another study examined whether the amygdala<br />
is involved in affective priming. In the affective priming task, we manipulated<br />
the expression, duration, and spatial frequency information<br />
of the face primes. Affective priming was absent when the prime faces<br />
contained only low-pass or high-pass information and were shown very<br />
briefly. Implications for models of affective processing are discussed.<br />
(5036)<br />
The Role of the Eyes and Mouth in Facial Emotions. CHRISTOPHER<br />
KOCH, George Fox University—Facial processing of emotions was<br />
examined using an emotional face Stroop task in which a face was<br />
presented with an emotion word. The expressed emotions included<br />
anger, sadness, happiness, and fear. In addition, the importance of facial<br />
features was examined by removing the eyes or mouth on some<br />
trials. Thus, there were three face conditions (full face, eyes removed,<br />
mouth removed) and three word conditions (no word, congruent emotion,<br />
incongruent emotion). Twenty-seven graduate students participated<br />
in the study. The results revealed significant effects of face<br />
[F(2,52) = 4.63, p < .02] and word conditions [F(2,52) = 9.57, p <<br />
.001]. Full faces produced significantly shorter RTs than did faces<br />
with either the eyes or the mouth removed, but there was no difference<br />
between the eyes- and mouth-removed faces. Congruent emotion<br />
words did not produce facilitation, but incongruent words produced<br />
significant interference. These findings suggest that the eyes and the<br />
mouth are equally important in facial expressions of emotions.