2021

The softness of objects can be perceived through several senses. For instance, to judge the softness of a cat's fur, we not only look at it; we often also run our fingers through its coat. Recently, we have shown that haptically perceived softness covaries with the compliance, viscosity, granularity, and furriness of materials (Dovencioglu, Üstün, Doerschner, & Drewing, 2020). However, it is unknown whether vision can provide similar information about the various aspects of perceived softness. Here, we investigated this question in an experiment with three conditions: in the haptic condition, blindfolded participants explored materials with their hands; in the static visual condition, participants were presented with close-up photographs of the same materials; and in the dynamic visual condition, participants watched videos of the hand-material interactions that were recorded in the haptic condition. After haptically or visually exploring the materials, participants rated them on various attributes. Our results show a high overall perceptual correspondence among the three experimental conditions. With a few exceptions, this correspondence tended to be strongest between the haptic and dynamic visual conditions. These results are discussed with respect to information potentially available through the senses, or through prior experience, when judging the softness of materials. Cavdan, M., Drewing, K., & Doerschner, K. (2021). The look and feel of soft are similar across different softness dimensions. Journal of Vision, 21(10), 20.

Adaptation to delays between actions and sensory feedback is important for efficiently interacting with our environment. Adaptation may rely on predictions of action-feedback pairing (motor-sensory component), or on predictions of tactile-proprioceptive sensation from the action and sensory feedback of the action (inter-sensory component). The reliability of temporal information might differ across sensory feedback modalities (e.g., auditory or visual), which in turn influences adaptation. Here, we investigated the role of motor-sensory and inter-sensory components in sensorimotor temporal recalibration for motor-auditory (button press-tone) and motor-visual (button press-Gabor patch) events. In the adaptation phase of the experiment, action-feedback pairs were presented with systematic temporal delays (0 ms or 150 ms). In the subsequent test phase, auditory or visual feedback of the action was presented with variable delays, and participants were asked whether they detected a delay. To disentangle the motor-sensory from the inter-sensory component, we varied movements (active button press or passive depression of the button) at adaptation and test. Our results suggest that motor-auditory recalibration is mainly driven by the motor-sensory component, whereas motor-visual recalibration is mainly driven by the inter-sensory component. Recalibration transferred from vision to audition, but not from audition to vision. These results indicate that motor-sensory and inter-sensory components contribute to recalibration in a modality-dependent manner. Arikan, B. E., van Kemenade, B. M., Fiehler, K., Kircher, T., Drewing, K., & Straube, B. (2021). Different contributions of efferent and reafferent feedback to sensorimotor temporal recalibration. Scientific Reports, 11(1), 1-15.

Haptic exploration of objects usually consists of repeated exploratory movements, and our perception of object properties results from the integration of information gained during each of these single movements. The serial nature of information integration in haptic perception requires that sensory estimates from single exploratory movements are retained in memory. Here we propose an optimal model for the serial integration of information in haptic exploration that takes memory limitations into account. We tested the model by predicting discrimination performance in free and restricted (fixed number of indentations and varied number of switches between the stimuli) explorations of softness. Overall, our model predicts performance well given different exploratory patterns in both free and restricted explorations. The model slightly overestimates performance in restricted explorations, whereas its predictions are accurate in free explorations. These results suggest that the integration of information can be well approximated by our model, in particular in free haptic exploration. We further tested whether participants prefer explorations that maximize performance. The model predicts that, with a constant number of indentations, switching between the stimuli increases performance. Our results show that participants increase the number of switches only up to three, suggesting a trade-off between muscular switching costs and performance. Metzger, A., & Drewing, K. (2021, July). A Kalman filter model for predicting discrimination performance in free and restricted haptic explorations. In 2021 IEEE World Haptics Conference (WHC) (pp. 439-444). IEEE.
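The core idea of a Kalman-filter account of serial integration can be sketched in a few lines: each exploratory movement (e.g., one indentation) yields a noisy softness estimate, retention in memory between movements adds further noise, and successive estimates are combined optimally. The following is a minimal illustrative sketch under an assumed scalar Gaussian noise model; the function name, variable names, and the specific noise parameters are hypothetical and not the authors' implementation.

```python
def kalman_integrate(estimates, measurement_var, memory_var):
    """Serially integrate noisy sensory estimates a la a scalar Kalman filter.

    estimates: softness estimate from each exploratory movement (hypothetical data)
    measurement_var: variance of a single movement's estimate (assumed constant)
    memory_var: noise added while retaining the belief between movements
    Returns the final belief (posterior mean) and its variance.
    """
    x = estimates[0]          # initial belief from the first movement
    p = measurement_var       # posterior variance after the first estimate
    for z in estimates[1:]:
        p += memory_var                   # memory retention degrades the belief
        k = p / (p + measurement_var)     # Kalman gain: weight of the new estimate
        x = x + k * (z - x)               # update belief toward the new estimate
        p = (1 - k) * p                   # posterior uncertainty shrinks
    return x, p
```

With memory noise set to zero this reduces to optimal averaging of independent estimates (posterior variance shrinks as 1/n with the number of movements); positive memory noise caps the achievable precision, which is the kind of limitation such a model can use to predict performance across exploration patterns.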

Humans typically interact with the environment using their bare hands. However, sometimes this is not possible or not preferred, e.g., when wearing protective gloves at work or sensor gloves in mixed/augmented reality (AR). Softness is particularly relevant here, since its perception relies on tactile and proprioceptive cues and might therefore be highly sensitive to such restrictions. Here we tested how corresponding haptic constraints affect perceived softness. Participants manually explored and rated 10 materials on 15 sensory adjectives under four constraint conditions: bare hand, open-fingered glove, open-fingered glove with rigid sensors, and full glove. The materials represented extreme values on different softness dimensions; the adjectives were chosen to assess these dimensions. Principal Component Analysis (PCA), Procrustes distances, and correlation analyses showed that softness perception is overall highly similar across constraint conditions. However, when we inspected responses at a more detailed level, per material-adjective combination, we observed that the full glove condition differed from the others, especially for judgments of surface softness. Overall, the results suggest that sensor gloves hardly change the perception of different dimensions of softness if the fingertips are left bare. Cavdan, M., Ennis, R., Drewing, K., & Doerschner, K. (2021, July). Constraining haptic exploration with sensors and gloves hardly changes the multidimensional structure of softness perception. In 2021 IEEE World Haptics Conference (WHC) (pp. 31-36). IEEE.
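The dimensionality analysis described above can be illustrated with a short sketch: a materials × adjectives rating matrix (one per constraint condition) is standardized and decomposed via SVD, yielding component scores and loadings whose structure can then be compared across conditions (e.g., via Procrustes distances). This is a generic PCA sketch; the function name and data layout are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def pca_softness(ratings):
    """PCA on a materials x adjectives rating matrix via SVD.

    ratings: 2-D array, rows = materials, columns = adjective ratings
             (hypothetical layout: e.g., 10 materials x 15 adjectives).
    Returns component scores per material, loadings per adjective,
    and the fraction of variance explained by each component.
    """
    X = ratings - ratings.mean(axis=0)        # center each adjective
    X = X / X.std(axis=0, ddof=1)             # standardize (correlation PCA)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U * S                            # material scores on each PC
    explained = S**2 / np.sum(S**2)           # variance explained per PC
    return scores, Vt, explained
```

Running this separately on each condition's rating matrix would give one component structure per condition; high similarity of the leading components across conditions is the kind of pattern summarized in the abstract above.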

Humans can optimize haptic perception by tuning their exploratory behavior. In softness exploration, humans use more force when expecting a pair of hard objects than when expecting soft ones, and this force control improves softness discrimination. Such force tuning seems to be based on implicit prior information about the upcoming compliance category. In previous studies, prior information was implicitly induced by presenting blocks of trials of the same category (hard or soft). Here, we studied force control when hard and soft stimulus pairs alternate according to a predictable pattern. Participants had to decide which of two silicone stimuli was softer. Soft and hard trials were presented in random order, in blocks, in alternating order (short pattern), or in an alternating pattern of two hard and two soft trials (long pattern). We confirmed the finding of force tuning to compliance for blocked as compared to random presentation. The predictable presentation patterns also influenced force control, but not in the expected direction. We conclude that implicit expectations derived from sequences can be used in force control, but that they are not sufficient for successful tuning. A further sequential analysis showed that forces are not only adapted by simple reactive trial-by-trial mechanisms. Drewing, K., & Zoeller, A. C. (2021, July). Influence of presentation order on force control in softness exploration. In 2021 IEEE World Haptics Conference (WHC) (pp. 19-24). IEEE.

Haptic texture perception is based on sensory information sequentially gathered during several lateral movements (“strokes”). In this process, sensory information from earlier strokes must be preserved in a memory system. We investigated whether this system may be a haptic sensory memory. In the first experiment, participants performed three strokes across each of two textures in a frequency discrimination task. Between the strokes over the first texture, participants explored an intermediate area, which presented either a mask (high-energy tactile pattern) or minimal stimulation (low-energy smooth surface). Perceptual precision was significantly lower with the mask compared with a three-strokes control condition without an intermediate area, approaching performance in a one-stroke control condition. In contrast, precision in the minimal stimulation condition was significantly better than in the one-stroke control condition and similar to the three-strokes control condition. In a second experiment, we varied the number of strokes across the first stimulus (one, three, five, or seven strokes) and either presented no masking or repeated masking after each stroke. Again, masking between the strokes decreased perceptual precision relative to the control conditions without masking. Precision effects of masking over different numbers of strokes were well fit by an established model of haptic serial integration, in which masking was modeled as repeated disturbances of the ongoing integration. Taken together, the results suggest that masking impedes the processes of haptic information preservation and integration. We conclude that a haptic sensory memory, comparable to iconic memory in vision, is used for integrating sequentially gathered sensory information. Drewing, K., & Lezkan, A. (2021). Masking interferes with haptic texture perception from sequential exploratory movements. Attention, Perception, & Psychophysics, 83(4), 1766-1776.

Haptic search is a common everyday task, usually consisting of two processes: target search and target analysis. During target search, we need to know where our fingers are in space, remember the already completed path, and keep track of the outline of the remaining space. During target analysis, we need to determine whether a detected potential target is the desired one. Here we characterized the dynamics of exploratory movements in these two processes. In our experiments, participants searched for a particular configuration of symbols on a rectangular tactile display. We observed that participants preferentially moved the hand parallel to the edges of the tactile display during target search, which possibly eased orientation within the search space. After a potential target was detected by any of the fingers, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, these fingers dramatically slowed down. While in contact with the potential target, the index and middle fingers moved within a smaller area than the other fingers, which instead seemed to move away to leave them space. These results suggest that the middle and index fingers are specialized for fine analysis in haptic search. Metzger, A., Toscani, M., Valsecchi, M., & Drewing, K. (2021). Target search and inspection strategies in haptic search. IEEE Transactions on Haptics.

Haptic exploration usually involves stereotypical systematic movements that are adapted to the task. Here we tested whether exploration movements are also driven by physical stimulus features. We designed haptic stimuli whose surface relief varied locally in spatial frequency, height, orientation, and anisotropy. In Experiment 1, participants successively explored two stimuli in order to decide whether they were the same or different. We trained a variational autoencoder to predict the spatial distribution of touch duration from the surface relief of the haptic stimuli. The model successfully predicted where participants touched the stimuli. It could also predict participants’ touch distribution from the stimulus’ surface relief when tested with two new groups of participants, who performed a different task (Exp. 2) or explored different stimuli (Exp. 3). We further generated a large number of virtual surface reliefs (each uniformly expressing a certain combination of features) and correlated the model’s responses with stimulus properties to understand the model’s preferences, in order to infer which stimulus features were preferentially touched by participants. Our results indicate that haptic exploratory behavior is to some extent driven by the physical features of the stimuli, with, e.g., edge-like structures, vertical and horizontal patterns, and rough regions being explored in more detail. Metzger, A., Toscani, M., Akbarinia, A., Valsecchi, M., & Drewing, K. (2021). Deep neural network model of haptic saliency. Scientific Reports, 11(1), 1-14.

The ability to sample sensory information with our hands is crucial for smooth and efficient interactions with the world. Despite this important role of touch, tactile sensations on a moving hand are perceived as weaker than when presented on the same but stationary hand. This phenomenon of tactile suppression has been explained by predictive mechanisms, such as forward models, that estimate future sensory states of the body on the basis of the motor command and suppress the associated predicted sensory feedback. The origins of tactile suppression have sparked a lot of debate, with contemporary accounts claiming that suppression is independent of predictive mechanisms and is instead akin to unspecific gating. Here, we target this debate and provide evidence for sensation-specific tactile suppression due to sensorimotor predictions. Participants stroked with their finger over textured surfaces that caused predictable vibrotactile feedback signals on that finger. Shortly before touching the texture, we applied external vibrotactile probes on the moving finger that either matched or mismatched the frequency generated by the stroking movement. We found stronger suppression of the probes that matched the predicted sensory feedback. These results show that tactile suppression is not limited to unspecific gating but is specifically tuned to the predicted sensory states of a movement. Führer, E., Voudouris, D., Lezkan, A., Drewing, K., & Fiehler, K. (2021). Tactile suppression stems from sensation-specific sensorimotor predictions. bioRxiv.

Adaptation to delays between actions and sensory feedback is important for efficiently interacting with our environment. Adaptation may rely on predictions of action-feedback pairing (motor-sensory component), or on predictions of tactile-proprioceptive sensation from the action and sensory feedback of the action (inter-sensory component). The reliability of temporal information might differ across sensory feedback modalities (e.g., auditory or visual), influencing adaptation. Here, we investigated the role of motor-sensory and inter-sensory components in sensorimotor temporal recalibration for motor-auditory events (button press-tone) and motor-visual events (button press-Gabor patch). In the adaptation phase of the experiment, the action-feedback event pairs were presented with systematic temporal delays (0 ms or 150 ms). In the subsequent test phase, sensory feedback of the action was presented with variable delays, and participants were asked whether they could detect this delay. To disentangle the motor-sensory from the inter-sensory component, we varied movements (active button press or passive depression of the button) at adaptation and test. Our results suggest that motor-auditory recalibration is mainly driven by the motor-sensory component, whereas motor-visual recalibration is mainly driven by the inter-sensory component. Recalibration transferred from vision to audition, but not from audition to vision. These results indicate that the motor-sensory and inter-sensory components of recalibration are weighted in a modality-dependent manner. Arikan, B. E., van Kemenade, B. M., Fiehler, K., Kircher, T., Drewing, K., & Straube, B. (2021). Sensorimotor temporal recalibration: the contribution of motor-sensory and inter-sensory components. bioRxiv.

The softness of objects can be perceived through several senses. For instance, to judge the softness of a cat's fur, we not only look at it; we also run our fingers in idiosyncratic ways through its coat. Recently, we have shown that haptically perceived softness covaries with the compliance, viscosity, granularity, and furriness of materials (Dovencioglu et al., 2020). However, it is unknown whether vision can provide similar information about the various aspects of perceived softness. Here, we investigated this question in an experiment with three conditions: in the haptic condition, blindfolded participants explored materials with their hands; in the visual-static condition, participants were presented with close-up photographs of the same materials; and in the visual-dynamic condition, participants watched videos of the hand-material interactions that were recorded in the haptic condition. After haptically or visually exploring the materials, participants rated them on various attributes. Our results show a high overall perceptual correspondence between the three experimental conditions. With a few exceptions, this correspondence tended to be strongest between the haptic and visual-dynamic conditions. These results are discussed with respect to information potentially available through the senses, or through prior experience, when judging the softness of materials. Cavdan, M., Drewing, K., & Doerschner, K. (2021). Materials in action: The look and feel of soft. bioRxiv.