Abstracts 2005 - The Psychonomic Society


Posters 4001–4007 Saturday Noon

POSTER SESSION IV
Sheraton Hall, Saturday Noon, 12:00–1:30

• 3-D AND MOTION PERCEPTION •

(4001)
Visual Characteristics of Biological Motion: Investigations With a New Stimulus Set. JOHN A. PYLES, EMILY D. GROSSMAN, & DONALD D. HOFFMAN, University of California, Irvine (sponsored by Myron L. Braunstein)—Biological motion research typically uses point-light animations depicting highly constrained and familiar human movement. To investigate the visual characteristics critical for invoking perception of biological motion, we have created a new stimulus set based on computer-generated creatures that possess novel, coherent motion and appear biological. Our results from a perceived animacy rating experiment confirm that the creatures are perceived as alive. In a second experiment, we measured noise tolerance thresholds for upright and inverted human, animal, and creature animations. Inversion of point-light animations reduces sensitivity to biological motion and is thought to reflect configural processing. We found a reduced inversion effect for creature and animal animations, relative to the human sequences. Our results provide evidence for the importance of motion in perceived animacy and of familiarity and body structure in biological motion perception.

(4002)
Is Action Observation Synonymous With Action Prediction? GERTRUDE RAPINETT, Max Planck Institute for Human Cognitive and Brain Sciences, GÜNTHER KNOBLICH, Rutgers University, Newark, MARGARET WILSON, University of California, Santa Cruz, & WOLFGANG PRINZ, Max Planck Institute for Human Cognitive and Brain Sciences—When we observe someone performing an action, we can predict, to some extent, the outcome of the observed action. Action prediction may be a corollary of the involvement of the observer’s action system during the perception of actions performed by conspecifics (Kilner, 2004).
The aim of these experiments was to identify which attributes of an action enable an observer to anticipate the future consequences of actions. More specifically, we investigated whether the kinetics of an action (dynamic/static), the congruency between the action grip and the target object (power/precision), and the functional relationship between the action and the target object (related/unrelated) affect the accuracy of prediction. Participants were presented with images from different points in the movement trajectories and were required to predict the outcome. Kinetic information modulated most strongly the accuracy of prediction. Possible mechanisms involved in action prediction based on the observer’s action system are discussed.

(4003)
Implicit Action Encoding Influences Personal Trait Attribution. PATRIC BACH, CHARLES E. LEEK, & STEVEN P. TIPPER, University of Wales, Bangor—To-be-executed actions and observed actions activate overlapping action representations. A consequence of this vision–action matching process is that producing actions one simultaneously observes will be easier than producing different actions. For example, when observing another person kick a ball, a foot response to identify a stimulus will be faster than a response with a finger. In contrast, observing a person press a key will facilitate a finger response relative to a foot response. We demonstrate that compatibility between perceived actions and executed responses can also influence the personality traits attributed to the viewed person. These vision–action–personality effects can be observed both with an explicit rating measure and when measured implicitly with a priming procedure.

(4004)
Improving Distance Estimation in Immersive Virtual Environments. ADAM R. RICHARDSON & DAVID WALLER, Miami University (sponsored by Yvonne Lippa)—It is well known that distances are underestimated in computer-simulated (virtual) environments, more so than in comparable real-world environments. Three experiments examined observers’ ability to use explicit (Experiments 1 and 2) or implicit (Experiment 3) feedback to improve the accuracy of their estimates of distance in virtual environments. Explicit feedback (e.g., “You estimated the distance to be 2.4 meters; the actual distance was 4.5 meters”) improved the accuracy of observers’ distance estimates, but only for the trained type of distance estimate (egocentric or exocentric) and the trained type of response (direct or indirect blindfolded walking). Interestingly, brief closed-loop interaction with the virtual environment (i.e., implicit feedback) also resulted in near-veridical distance estimation accuracy.

(4005)
Misperceived Heading and Steering Errors Occur When Driving in Blowing Snow. BRIAN P. DYRE & ROGER LEW, University of Idaho—Driving through blowing snow creates a transparent optical flow with two foci of expansion (FOEs). Previous research showed that such optical flow produces systematic errors in judgments of the direction of heading when the FOEs are misaligned (nonrigid). Here, we examined whether these errors generalize to control of heading in a more realistic simulation: driving across a textured ground plane through blowing snow. Participants were instructed to steer a simulated vehicle such that they maintained a straight path while crosswinds caused snow to move at varying angles (0º–64º) relative to the initial direction of translation. For nonzero angles, a pattern of systematic steering errors was found in which smaller angles biased steering away from the FOE defined by the snow, and larger angles biased steering toward the snow’s FOE.
These results show that misperception of heading from nonrigid transparent optical flow can cause systematic errors in the steering of simulated vehicles.

(4006)
Perceptual Learning and the Visual Control of Collision Avoidance. BRETT R. FAJEN, Rensselaer Polytechnic Institute—What distinguishes experts from novices performing the same perceptual–motor task? The superior performance of experts could be attributed, in part, to a form of perceptual learning known as perceptual attunement. In this study, perceptual attunement was investigated using an emergency braking task in which participants waited until the last possible moment to slam on the brakes. Biases resulting from the manipulation of sign radius and initial speed were used to identify the optical variables upon which participants relied at various stages of practice. I found that biases that were present early in practice diminished or were eliminated with experience and that the optical variables to which observers became attuned depended on the range of practice conditions and the availability of information. Perceptual attunement resulting from practice on emergency braking transferred to normal, regulated braking, suggesting that perceptual attunement plays an important role in learning to perform a visually guided action.

• MUSIC COGNITION •

(4007)
Repetition Priming in Music Performance. SEAN HUTCHINS & CAROLINE PALMER, McGill University—Four experiments addressed the role of implicit memory in a music production task. Singers heard short melodies of predetermined length and sang the final tone as quickly as possible. We manipulated whether the final tone (target) was a repetition of a previous melodic tone (prime) and the distance (intervening tones) between prime and target tones. Experiment 1 manipulated prime–target distance along with stimulus length, whereas Experiment 2 manipulated prime–target distance independently of stimulus length.
Experiments 3 and 4 also manipulated the stimulus rate (tempo). Experiment 1 showed a significant benefit of repetition priming on singers’ response latencies. Experiment 2 showed a benefit for repetition at shorter prime–target distances and a benefit for expected (tonic) endings over less expected (nontonic) endings. Response latencies in Experiments 3 and 4 showed entrainment to stimulus rate, and repetition priming was modulated by tonal expectedness. We discuss cognitive factors that can affect auditory repetition priming.

Posters 4008–4015 Saturday Noon

(4008)
Is Categorical Perception of Musical Intervals a Short-Term Memory Phenomenon? SINI E. MAURY, University of Helsinki, & ELISABET M. SERVICE, University of Helsinki and Dalhousie University (sponsored by Elisabet M. Service)—This study explores whether categorical perception of musical intervals can vary as a function of immediate memory load caused by interference from other sounds in a sequence. In a two-interval same–different discrimination task, musicians heard melodic intervals in isolation or embedded in four-note sequences. Half of the interval pairs were similar, and half were derived either from the same or from a different interval category. The results showed that discriminability measured by d′ was significantly higher for intervals straddling the category boundary. This effect was even more pronounced when intervals formed part of a melodic sequence. This could mean that categorical perception is a short-term memory phenomenon in which degraded auditory traces are repaired with top-down categorical information. The results also imply that the categorical information retrieved in the repair process takes the form of the prototype of the category and is not general knowledge about category membership.

(4009)
Musicians, Intermediate Musicians, and Nonmusicians’ Perception of Bitonality. MAYUMI HAMAMOTO, MARGARET P. MUNGER, & KYOTA KO, Davidson College—Bitonal music is characterized by a dissonant “crunch” sound that had been believed to be clearly audible to everyone (Wolpert, 2000). However, Wolpert found that nonmusicians did not identify bitonality in a free response task.
The present study replicated Wolpert’s findings but also had participants rate song clips for preference, correctness, and pleasantness. Monotonal music was rated higher on all dimensions, independently of the individual’s level of musical training. In addition, following a brief training session, nonmusicians (less than 1 year of musical training) identified the tonality of the final clips at rates equivalent to those of the intermediate (mean, 2.4 years) and expert (mean, 9.2 years) musician groups.

(4010)
Cross-Modal Perception of Contour: The Role of Surface Correlation and Fourier Analysis Similarity. JON B. PRINCE & MARK A. SCHMUCKLER, University of Toronto, Scarborough—The perceived similarity of cross-modally presented contours was investigated in two experiments. The combination of surface correlation and Fourier analysis techniques allows quantitative descriptions of both global and local contour information. Experiment 1 investigated auditory–visual similarity by presenting a tonal melody followed by a line drawing and asking participants to rate the similarity between the two. Both stimuli were coded as integer series representing pitch or vertical height, respectively. Ratings were predicted by the surface correlation between the melody and the drawing (the correlation of the two integer series). Experiment 2 reversed the order of presentation by presenting the drawing first, followed by the melody. Surface correlation again predicted similarity ratings, in addition to amplitude and phase components derived from a Fourier analysis model. These results validate the Fourier analysis model of contour cross-modally, particularly when participants must attend to the global character of visual and auditory contours.

(4011)
Effect of Encoding Processes on Remembering Melodies. ESRA MUNGAN & ZEHRA F.
PEYNIRCIOĞLU, American University—In this study, both musicians and nonmusicians were asked to study a list of highly familiar melodies, using four different orienting tasks. Two were conceptually driven (continuing the melody and judging the mood conveyed by the melody), and two were data driven (counting the number of long notes and tracing the melodic shape). The study phase was followed by an incidental free-choice recognition test. Findings showed that for nonmusicians, conceptually driven orienting tasks led to better memory performance than did data-driven orienting tasks, whereas for musicians the reverse was true. These findings are discussed within the transfer-appropriate-processing framework.

(4012)
The Relationship Between Emotions Expressed and Elicited by Music and the Effect of Familiarity. OMAR ALI & ZEHRA F. PEYNIRCIOĞLU, American University (sponsored by Zehra F. Peynircioğlu)—We examined the effects of melodies on participants’ ratings of emotionality. The intensity of the ratings was higher when participants were asked to judge the emotion that was expressed by a melody (i.e., how happy/sad/calm/angry is this music?) than when they were asked to judge the emotion elicited by the same melody (i.e., how happy/sad/calm/angry does this music make you feel?). This pattern held across all four of the emotions, and held even when the melodies were made familiar through repetition. In addition, positive emotions (i.e., happy and calm) were rated higher than negative emotions (i.e., sad and angry). Finally, for both types of ratings (i.e., conveying or eliciting the emotion), the ratings in response to the repeated melodies were higher, but only for the sad and calm emotions.

• EVENT COGNITION •

(4013)
Time Estimation and Fluency in Event Perception. MACKENZIE GLAHOLT, AVA ELAHIPANAH, ANTHONY R. MCINTOSH, & EYAL M.
REINGOLD, University of Toronto, Mississauga—Intervals in which familiar stimuli (e.g., words) are presented are judged as longer than equal-duration intervals in which unfamiliar stimuli (e.g., nonwords) are presented. This perceptual illusion may result from the misattribution of the enhanced perceptual fluency associated with processing familiar stimuli. We investigated whether a similar phenomenon occurs in the perception of events. To manipulate event familiarity, we used 2-sec video clips of collisions between hockey players, played forward or in reverse. Reversed clips were closely matched to forward clips in terms of low-level perceptual characteristics, but they depicted events that violated physical laws and, as such, were unfamiliar. Participants judged reverse clips as having shorter duration and faster motion, as compared with forward clips. These findings replicate and extend the findings with linguistic stimuli.

(4014)
From Seeing to Remembering Events in Time. SHULAN LU, Texas A&M University, Commerce—Everyday events have beginnings, ends, and intervals. These temporal parameters have different combinations, and events have dynamic temporal trajectories. Previous research tends to assume that events follow one another and that subevents occur sometime in between. Recently, studies have begun to suggest that people may make finer grained temporal links than we previously thought. What kind of temporal properties get preserved more robustly? Participants viewed animations of fish-swimming events, where test events were embedded in a schema. For example, a group of fish chased away other fish. In Experiment 1, participants made judgments about the temporal relation of two given events immediately after they had viewed each animation. In Experiment 2, participants made judgments after viewing each animation and then drawing a maze for 25 sec.
The results showed that people did not remember the time interval that occurred between two events but robustly preserved the overlap between events.

(4015)
Event Recognition in Free View and at an Eyeblink. REINHILD GLANEMANN, CHRISTIAN DOBEL, & PIENIE ZWITSERLOOD, Westfälische Wilhelms-Universität Münster—Recent studies demonstrated that brief visual presentation (around 20 msec) of photoreal-
