
2007

When sliding a finger across a bump on a surface, the finger follows the geometry of the bump (position signal). At the same time, forces related to the slope of the bump accelerate and decelerate the finger (force signal) [1]. Consistent with the Maximum Likelihood Estimate (MLE) model [2], haptically perceived shape can be described as the weighted average of the shapes signaled by the force and the position signal [3, 4]. Here we investigated, for the haptic perception of bump amplitude, the effects of the movement parameters pressure and velocity on the signal weighting, as well as on the discrimination threshold (Experiment 1). In Experiment 2 we examined whether the integration of force and position signals within the haptic modality is consistent with the MLE model under varying exploratory pressure. Kaim, L., & Drewing, K. (2007). Influence of parametric variation in exploratory movement on signal integration for haptic shape perception. In 10th Tübinger Perception Conference (p. 99). Knirsch.
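The reliability-weighted averaging that the MLE model prescribes can be sketched in a few lines. This is a minimal illustration of the general model, not the authors' analysis code; the signal names and numeric values are assumptions chosen for the example.

```python
# Sketch of MLE cue integration: each cue is weighted by its
# reliability (inverse variance), and the combined estimate has a
# variance no larger than that of the most reliable single cue.

def mle_combine(estimates, variances):
    """Reliability-weighted average of single-cue estimates."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined_estimate = sum(w * e for w, e in zip(weights, estimates))
    combined_variance = 1.0 / total  # always <= min(variances)
    return combined_estimate, combined_variance, weights

# Illustrative example: a bump amplitude signaled as 10 mm by the
# position signal (variance 1.0) and 12 mm by the force signal
# (variance 4.0). The more reliable position signal dominates.
est, var, w = mle_combine([10.0, 12.0], [1.0, 4.0])
# weights: 0.8 (position), 0.2 (force); combined estimate 10.4 mm
```

Under this scheme, changing exploratory pressure or velocity would alter signal weighting only insofar as it alters the variance (reliability) of each signal.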

When a finger is moved across a physical bump, it follows the geometry of the surface on the one hand (the finger moves up and back down = position signals); on the other hand, the finger is decelerated according to the slope of the bump and accelerated again on its far side (= force signals). In haptic shape perception, the overall percept is formed as a weighted average of the force and position signals acquired during exploration (Drewing & Ernst, 2006). Here we investigated the influence of the movement parameters pressure and velocity on the weighting of the force and position signals, as well as on discrimination thresholds in active haptic perception. Kaim, L., & Drewing, K. (2007). Signalintegration bei haptischer Formwahrnehmung unter Variation von Kraft und Geschwindigkeit der exploratorischen Bewegung. Beiträge zur, 49, 194.

The target article fails to disentangle the functional description from the structural description of the two somatosensory streams. Additional evidence and thorough reconsideration of the evidence cited argue for a functional distinction between the how processing and the what processing of somatosensory information, while questioning the validity and usefulness of the equation of these two types of processing with structural streams. We propose going one step further: to investigate how the distinct functional streams are coordinated via attention. Drewing, K., & Schneider, W. (2007). Disentangling functional from structural descriptions, and the coordinating role of attention. Behavioral and Brain Sciences, 30(2), 205-206. doi:10.1017/S0140525X07001446

We studied multi-sensory integration of directional information during the execution of goal-directed pointing movements. Subjects pointed at a visual target of 6 cm diameter, presented at 35 cm from the starting position of the arm movement. Subjects performed the pointing movement under open loop conditions, i.e. visual feedback about finger and target position was removed during the movement. Proprioceptive directional information was provided by applying a small force pulse (amplitude 1 N, pulse duration 50 ms) orthogonal to the movement direction early in the movement. In some trials, a noisy visual directional cue was presented. Time and spatial location of presentation of the visual cue were matched to the force pulse. The direction of the visual cue was either consistent with the force pulse direction or differed by 30°, either clockwise or counterclockwise. Subjects were instructed to hit the target within 1200 ms following target presentation. We measured perceived direction of the proprioceptive cue when both cues were provided, and perceived direction for each cue alone. In conditions in which both cues were presented simultaneously, we compared subjects' responses to the predictions of an ideal observer model. The model combines visual and proprioceptive direction estimates measured in single-cue conditions by weighted averaging; the weights depend on the reliability of each cue. In accordance with the predictions of the ideal observer model, we find that subjects' responses were less variable when both visual and proprioceptive cues were available. In conditions in which the mean directions of the proprioceptive cue and the visual cue differed, subjects' responses exhibited a bias towards the direction of the visual cue. This bias was larger for more reliable visual cues and smaller for more reliable force pulse directions. These results are consistent with the idea of a reliability-weighted combination of both cues. Serwe, S., Drewing, K., & Trommershäuser, J. (2007). Integration of multi-sensory directional information during goal-directed pointing [Abstract]. Journal of Vision, 7(9):307, 307a, http://journalofvision.org/7/9/307/, doi:10.1167/7.9.307
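The ideal observer's prediction for the cue-conflict conditions can be sketched as follows: when the visual and proprioceptive directions differ by some angle, the predicted bias toward the visual cue equals the visual weight times that angle. The function and the variance values below are illustrative assumptions, not parameters estimated in the experiment.

```python
# Sketch of the ideal-observer bias prediction for cue conflict:
# the combined direction estimate shifts toward the visual cue by
# w_visual * delta, where w_visual is the visual cue's normalized
# reliability (inverse variance).

def predicted_bias(delta_deg, var_visual, var_proprio):
    """Predicted bias (in degrees) toward the visual cue direction."""
    w_visual = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_proprio)
    return w_visual * delta_deg

# A more reliable visual cue (smaller variance) yields a larger bias,
# matching the pattern reported in the abstract (values are made up):
bias_reliable = predicted_bias(30.0, var_visual=4.0, var_proprio=16.0)   # 24.0 deg
bias_noisy = predicted_bias(30.0, var_visual=16.0, var_proprio=16.0)     # 15.0 deg
```

Note that the same formula predicts both reported effects: increasing visual reliability raises the bias toward the visual cue, while increasing proprioceptive reliability lowers it.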

When a participant’s unseen real hand and a seen artificial hand are stroked in synchrony, the participant reports a vivid illusion of feeling the tactile sensations as originating from the stroking of the artificial hand. Additionally, the felt position of the real hand is shifted towards the seen artificial hand (Botvinick & Cohen, 1998). In two experiments we investigated top-down influences on the illusion as indicated by the position shift and by the subjective onset time and duration of the illusion. Experiment 1 demonstrated for all three measures that the extent of the illusion systematically depends on the plausibility of the seen artificial limb (hand, distorted hand, cell phone). In Experiment 2, we projected changing patterns of dots onto the artificial hand. Participants had to count backwards (in steps of 1 or 3) whenever a dot was stroked, or they just looked at the dots. Drewing, K., Albus, P., & Kunkel, A. (2007). Top-down influences on the rubber hand illusion.