We perceive the world surrounding us via multiple sensory modalities, including touch, vision, and audition. The information derived from all these different modalities has to converge in order to form a coherent and robust percept of the world. Here, we review a model (the MLE model) that describes an integration mechanism that is optimal in the statistical sense. The benefit of integrating sensory information comes from a reduction in the variance of the final perceptual estimate. We illustrate this integration mechanism in the human brain with two examples: the first example demonstrates the integration of force and position cues to shape within haptic perception; the second example highlights multimodal perception and shows that tactile and auditory information for temporal perception interacts in a way predicted by the MLE integration model. Ernst, M. O., Bresciani, J. P., Drewing, K., & Bülthoff, H. H. (2004). Integration of sensory information within touch and across modalities. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004).
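As a minimal sketch of the MLE scheme just described, the following Python snippet shows how reliability-based weights are formed and why the variance of the combined estimate is reduced. The function name and all numerical values are illustrative assumptions, not taken from the paper.

def mle_integrate(estimate_a, var_a, estimate_b, var_b):
    """Combine two cue estimates, weighting each by its relative reliability (1/variance)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    combined_estimate = w_a * estimate_a + w_b * estimate_b
    # The combined variance is never larger than the smaller of the two single-cue variances.
    combined_variance = (var_a * var_b) / (var_a + var_b)
    return combined_estimate, combined_variance

# Example with made-up values: two noisy estimates of the same physical quantity.
print(mle_integrate(estimate_a=5.0, var_a=1.0, estimate_b=6.0, var_b=4.0))
# -> (5.2, 0.8): the combined variance (0.8) lies below both single-cue variances.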
Previous research indicated that sound can bias visual [1-4] as well as tactile perception [5,6]. The present experiment tested whether auditory stimuli can alter the tactile perception of sequences of taps (2 to 4 taps per sequence) delivered to the index fingertip. The taps were delivered using a PHANToM force-feedback device. The subjects did not have any visual or auditory feedback about the tactile stimulation, and their task was to report after each sequence how many taps they felt. In the first experiment, auditory sequences of beeps were presented on some trials concomitantly with the tactile sequences (through earphones). The number of beeps in the auditory sequence could be the same as, fewer than, or more than the number of taps in the simultaneously presented tactile sequence. Though irrelevant to the task (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically biased subjects' tactile perception, i.e. subjects' responses depended significantly on the number of presented beeps. The results also suggested that for such an auditory-tactile interaction to occur, a certain amount of structural congruency between the simultaneously presented stimuli is required. Indeed, presenting an auditory stimulus that was obviously incongruent with the tactile sequence failed to evoke any bias of tactile perception. In the second experiment, we tested whether the auditory-tactile interaction also requires temporal congruency or whether a bias can be evoked without temporal overlap between the presented auditory and tactile sequences. The tactile and auditory stimuli were the same as in the first experiment (the structurally incongruent auditory stimulus was not used here), but the auditory sequences were presented either simultaneously with, before the beginning of, or after the end of the tactile sequences. Audition strongly biased tactile perception when the stimuli were temporally concomitant (reproducing the results obtained in the first experiment). When the auditory and tactile stimuli were temporally asynchronous, the interaction gradually disappeared. We conclude that auditory and tactile sensory signals are integrated when they both provide redundant information with good temporal coherence. Bresciani, J. P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V., & Kheddar, A. (2004, February). Feeling What You Hear: An Auditory-Evoked Tactile Illusion. In 7th Tübingen Perception Conference (TWK 2004) (p. 73). Knirsch.
When sliding a finger across a bumpy surface, the finger follows the geometry of the bumps/holes, providing positional cues for the shape. At the same time, the finger is opposed by forces related to the steepness of the bumps/holes. With a specific device, Robles-de-la-Torre and Hayward [1] dissociated positional and force cues in the haptic perception of small-scale bumps and holes: participants in this experiment predominantly reported feeling the class of shapes (bumps or holes) indicated by the force cues. Drewing and Ernst [2] extended this research by disentangling force and position cues to the perception of curves more systematically and by also quantifying the perceived curvature. The result was that the perceived curvature could be predicted from a weighted average of the two cues. This is consistent with current models of cue integration [e.g., 3]. These integration models further predict that a cue's weight is proportional to its reliability. Here, we aimed at testing this prediction for the integration of force and position cues to haptic shape by manipulating the shapes' material properties: high softness can be assumed to decrease the reliability of the position cue as compared to low softness, and high friction to decrease the reliability of the force cue. Using the PHANToM force-feedback device we constructed haptic curve stimuli. We systematically intermixed force and position cues indicating curvatures of 14 and 24 /m. Using the method of double staircases, we measured the point of subjective equality (PSE) of the curvature of these stimuli as compared to 'natural' stimuli (i.e., with consistent position and force cues). From the PSE data we determined the cue weights. This was done under each combination of material properties (low vs. high softness × low vs. high friction). We found that material properties affected the cue weights in a manner consistent with our predictions. These results further confirm the validity of existing models of cue integration in haptic shape perception. Drewing, K., Wiecki, T., & Ernst, M. O. (2004, February). Cue Reliabilities Affect Cue Integration in Haptic Shape Perception. In 7th Tübingen Perception Conference (TWK 2004) (p. 123). Knirsch.
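As a hedged illustration of how cue weights can be recovered from PSE data, the sketch below assumes a simple linear weighted-average model of perceived curvature; the function name and the example PSE value are hypothetical, not results from the study.

def force_cue_weight(pse, curvature_force, curvature_position):
    """Weight of the force cue, assuming PSE = w_f * c_force + (1 - w_f) * c_position."""
    return (pse - curvature_position) / (curvature_force - curvature_position)

# Conflict stimulus: force cue signals 24 /m, position cue signals 14 /m.
# Suppose the matching 'natural' stimulus (the PSE) had a curvature of 20 /m (assumed value).
w_force = force_cue_weight(pse=20.0, curvature_force=24.0, curvature_position=14.0)
w_position = 1.0 - w_force
print(w_force, w_position)  # -> 0.6, 0.4: the force cue dominates in this invented example.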
The purpose of this study is to investigate multimodal visual-haptic texture perception, for which we used virtual reality techniques. Participants judged a broad range of textures according to their roughness and their spatial density under visual, haptic, and visual-haptic exploration conditions. Participants were well able to differentiate between the different textures using both the roughness and the spatial density judgments. When provided with visual-haptic textures, subjects' performance increased (for both judgments), indicating sensory combination of visual and haptic texture information. Most interestingly, performance for density and roughness judgments did not differ significantly, indicating that these estimates are highly correlated. This may be due to the fact that our textures were generated in virtual reality using a haptic point-force display (PHANToM). In conclusion, it seems that the roughness and spatial density estimates were based on the same physical parameters given the display technology used. Drewing, K., Ernst, M. O., Lederman, S. J., & Klatzky, R. (2004, June). Roughness and spatial density judgments on visual and haptic textures using virtual reality. In 4th International Conference EuroHaptics 2004 (pp. 203-206). Institute of Automatic Control Engineering.
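A hedged sketch of how such a bimodal performance gain is often quantified under optimal (MLE-style) combination: the predicted visual-haptic discrimination threshold computed from unimodal thresholds. The threshold values below are illustrative assumptions, not data from this study.

import math

def combined_threshold(threshold_visual, threshold_haptic):
    """Optimal-combination prediction: thresholds combine like standard deviations."""
    return math.sqrt((threshold_visual**2 * threshold_haptic**2)
                     / (threshold_visual**2 + threshold_haptic**2))

print(combined_threshold(threshold_visual=0.8, threshold_haptic=0.6))
# -> ~0.48, below both unimodal thresholds, i.e. improved bimodal performance.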
At present, tactile displays are constructed either as shape displays or as vibrotactile displays. While shape displays render the shape of objects to the skin, vibrotactile devices display high-frequency, small-amplitude force patterns. Existing tactile displays of both types are based on an array of small pins that move normal to the contact surface; that is, the pins create a pattern of indentation into the skin. Usually, the devices are applied to the human finger pad. However, in order to produce a realistic tactile impression of the environment, it is probably just as important to provide forces lateral to the skin, so-called shear forces. This is particularly relevant for perceptions evoked by movements of the skin relative to the environment, e.g. when stroking the finger across a surface. We aim at technically realizing a third type of tactile display that can provide shear forces. The poster presents the prototype of a shear force display for the fingertip and a first psychophysical evaluation. To explore whether the stimuli produced by the display are appropriate for human perception, we studied as a first step how well humans can discriminate between different directions of pin movement. This basic psychophysical knowledge, which did not exist so far because the technology was not available, will in turn be used to improve the design of the display. Fritschi, M., Drewing, K., Zopf, R., Ernst, M. O., & Buss, M. (2004, June). Construction and first evaluation of a newly developed tactile shear force display. In 4th International Conference EuroHaptics 2004 (pp. 508-511). Institute of Automatic Control Engineering.
We tested whether the tactile perception of sequences of taps delivered to the index fingertip can be modulated by sequences of auditory beeps. In the first experiment, the tactile and auditory sequences were always presented simultaneously and were structurally either similar or dissimilar. In the second experiment, the auditory and tactile sequences were always structurally similar but not always presented simultaneously. When structurally similar and presented simultaneously, the auditory sequences significantly modulated tactile tap perception. This automatic combination of “redundant-like” tactile and auditory signals likely constitutes an optimization process taking advantage of multimodal redundancy for perceptual estimates. Bresciani, J. P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V., & Kheddar, A. (2004, June). Auditory modulation of tactile taps perception. In 4th International Conference EuroHaptics 2004 (pp. 198-202). Institute of Automatic Control Engineering.
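As a hedged illustration only, the sketch below applies the reliability-weighted averaging idea to the tap-counting paradigm; the counts and reliability values are invented for illustration and do not represent the authors' model or data.

def predicted_reported_taps(n_taps, n_beeps, reliability_touch, reliability_audio):
    """Predicted report as a reliability-weighted average of the two modality counts (assumed model)."""
    w_touch = reliability_touch / (reliability_touch + reliability_audio)
    return w_touch * n_taps + (1.0 - w_touch) * n_beeps

# Three taps paired with four beeps: the prediction is biased toward the beep count.
print(predicted_reported_taps(n_taps=3, n_beeps=4,
                              reliability_touch=2.0, reliability_audio=1.0))  # -> ~3.33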
In a repetitive tapping task, the within-hand variability of intertap intervals is reduced when participants tap with both hands instead of single-handedly. This bimanual advantage has been attributed to timer as opposed to motor variance (according to the Wing-Kristofferson model; Helmuth and Ivry 1996) and related to the additional sensory consequences of the movement of the extra hand in the bimanual case (Drewing et al. 2002). In the present study the effect of sensory feedback of the movement on this advantage was investigated by comparing the results of a person (IW) deafferented below the neck with those of age-matched controls. IW showed an even more pronounced bimanual advantage than controls, suggesting that the bimanual advantage is not due to actual sensory feedback. These results support another hypothesis, namely that bimanual timing profits from the averaging of different central control signals that relate to each effector’s movements. Drewing, K., Stenneken, P., Cole, J., Prinz, W., & Aschersleben, G. (2004). Timing of bimanual movements and deafferentation: Implications for the role of sensory movement effects. Experimental Brain Research, 158(1), 50-57.
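A hedged simulation sketch of the Wing-Kristofferson two-level timing model and of timer averaging as an account of the bimanual advantage; all parameter values are assumptions chosen for illustration, not estimates from the study.

import numpy as np

rng = np.random.default_rng(0)
N = 100_000          # number of intertap intervals to simulate
TIMER_SD = 20.0      # ms, central timer variability (assumed)
MOTOR_SD = 5.0       # ms, motor-delay variability (assumed)
MEAN_INTERVAL = 400  # ms, target intertap interval

def intertap_intervals(timer_intervals):
    """Observed intervals: timer interval plus the difference of successive motor delays."""
    motor_delays = rng.normal(0.0, MOTOR_SD, size=len(timer_intervals) + 1)
    return timer_intervals + np.diff(motor_delays)

# Unimanual tapping: a single central timer drives the hand.
timer_uni = rng.normal(MEAN_INTERVAL, TIMER_SD, size=N)
var_uni = intertap_intervals(timer_uni).var()

# Bimanual tapping: the hand's effective timer signal is the average of two independent
# timers, which halves the timer variance and thus reduces within-hand variability.
timer_bi = (rng.normal(MEAN_INTERVAL, TIMER_SD, size=N)
            + rng.normal(MEAN_INTERVAL, TIMER_SD, size=N)) / 2.0
var_bi = intertap_intervals(timer_bi).var()

print(var_uni, var_bi)
# roughly 450 vs. 250 ms^2: Var(C) + 2*Var(M) unimanually vs. Var(C)/2 + 2*Var(M) bimanually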
Most models of object recognition assume that object recognition is based on the matching of the 2-D view of the object with representations of the object stored in memory. They propose that a time-consuming normalisation process compensates for any difference in viewpoint between the 2-D percept and the stored representation. Our experiment shows that this normalisation is less time-consuming when it has to compensate for disorientations around the vertical than around the horizontal axis of rotation. By decoupling the different possible reference frames, we demonstrate that this anisotropy of the recognition performance is not defined with respect to the retinal, but with respect to the gravitational or the visuo-contextual frame of reference. Our results suggest that the visual system may call upon both the gravitational vertical and the visual context to serve as the frame of reference with respect to which objects are gauged in 3-D object recognition. Waszak, F., Drewing, K., & Mausfeld, R. (2004). Viewer-external frames of reference in 3-D object recognition. Poster presented at 27th European Conference on Visual Perception (ECVP 2004), Budapest, Hungary.
Within haptics, tactile feedback is one of the more recent modalities for human-system interaction. Research on tactile feedback using pin-array actuators has been ongoing for the past several years. A survey of technological achievements, human sensing capabilities, and psychophysical evaluations in this area is presented. The focus then shifts to novel approaches in actuator technology and tactile feedback systems providing shear force (force tangential to the fingertip). Fritschi, M., Buss, M., Drewing, K., Zopf, R., & Ernst, M. O. (2004, September). Tactile feedback systems. In Workshop "Touch and Haptics", 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004) (pp. 1-21).
Investigating multisensory integration, Shams et al (2000 Nature 408 788) recently found that the number of perceived visual flashes could be altered by a sequence of beeps presented simultaneously. Here, we tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (2 to 4 taps per sequence). In experiment 1, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence was either the same as, fewer than, or more than the number of tactile taps. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli significantly modulated subjects' tactile perception. Such modulation occurred only when the auditory and tactile stimuli were structurally similar. In experiment 2, we tested whether auditory-tactile interaction depends on simultaneity or whether a bias can be evoked without temporal overlap between the auditory and tactile sequences. Audition significantly modulated tactile perception when the stimuli were presented simultaneously, but this effect gradually disappeared when a temporal asynchrony was introduced between auditory and tactile stimuli. These results show that when provided with auditory and tactile signals that are likely to be generated by the same stimulus, the brain tends to automatically combine these signals. Ernst, M. O., Bresciani, J. P., & Drewing, K. (2004, September). Feeling what you hear: Auditory signals can modulate the perception of tactile taps. In 27th European Conference on Visual Perception (ECVP 2004) (p. 143). Pion Ltd.
This work presents the prototype of a shear force display for the fingertip and a first psychophysical evaluation. In order to explore whether the stimuli produced by the display are appropriate for human perception, we studied human discrimination performance for distinguishing between different directions of pin movement. In a second step we explored the perceptual integration of multi-pin movements. This basic psychophysical knowledge, which did not exist so far because the technology was not yet available, will in turn be used to improve the design of the display. Fritschi, M., Drewing, K., Zopf, R., Ernst, M. O., & Buss, M. (2004, September). Construction and psychophysical evaluation of a novel tactile shear force display. In RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No. 04TH8759) (pp. 509-513). IEEE.