
2024

In robotics, understanding human interaction with autonomous systems is crucial for enhancing collaborative technologies. We focus on human-swarm interaction (HSI), exploring how differently sized groups of active robots affect operators' cognitive and perceptual reactions over different durations. We analyze the impact of different numbers of active robots within a 15-robot swarm on operators' time perception, emotional state, flow experience, and task difficulty perception. Our findings indicate that managing multiple active robots, as compared to a single active robot, significantly alters time perception and flow experience, leading to a faster perceived passage of time and increased flow. More active robots and extended durations cause increased emotional arousal and perceived task difficulty, highlighting the interaction between the number of active robots and human cognitive processes. These insights inform the creation of intuitive human-swarm interfaces and aid in developing swarm robotic systems aligned with human cognitive structures, enhancing human-robot collaboration. Kaduk, J., Cavdan, M., Drewing, K., & Hamann, H. (2024). From One to Many: How Active Robot Swarm Sizes Influence Human Cognitive Processes. arXiv preprint arXiv:2403.13541. https://doi.org/10.48550/arXiv.2403.13541

When interacting with surfaces, humans perceive surface attributes which are often accompanied by affective responses. Notably, rough materials tend to evoke unpleasant feelings, whereas some soft materials are frequently associated with pleasantness. While the literature has predominantly focused on the relationship between solid objects and pleasantness, our daily haptic interactions also include fluids. Here, our main objective was to explore the relationship between unpleasantness and perceived qualities of touched fluids. We created a stimulus set by varying fluid properties of real-life materials (e.g., diluting honey with water). Participants actively explored the materials without time or movement constraints. In a first presentation block, they rated the unpleasantness of the materials, while in a second block they evaluated the materials based on seven sensory adjectives. Principal Component Analysis on adjective ratings revealed the dimensions characterizing differences in sensory qualities of our materials: viscosity and slipperiness. Importantly, we observed a significant positive correlation between unpleasantness and viscosity, while no correlation was found for slipperiness. Specifically, materials perceived as more viscous felt unpleasant, emphasizing the role of viscosity in affective responses during haptic exploration. Overall, the current study contributes to the broader understanding of unpleasantness by extending our knowledge beyond the traditionally studied solid materials. Cavdan, M., Drewing, K. (2025). To Touch or Not to Touch: The Linkage Between Viscosity and Unpleasantness. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14769. Springer, Cham. https://doi.org/10.1007/978-3-031-70061-3_6

The perception of material/object properties plays a fundamental role in our daily lives. Previous research has shown that individuals use distinct and consistent patterns of hand movements, known as exploratory procedures (EPs), to extract perceptual information relevant to specific material/object properties. Here, we investigated the variation in EP usage across different tasks involving objects that varied in task-relevant properties (shape or deformability) as well as in task-irrelevant properties (deformability or texture). Participants explored 1 reference object and 2 test objects with a single finger before selecting the test object that was most similar to the reference. We recorded their finger movements during explorations, and these movements were then categorised into different EPs. Our results show strong task-dependent usage of EPs, even when exploration was confined to a single finger. Furthermore, within a given task, EPs varied as a function of material/object properties unrelated to the primary task. These variations suggest that individuals flexibly adapt their exploration strategies to obtain consistent and relevant information. Lin, L.P.Y. et al. (2025). Task-Adapted Single-Finger Explorations of Complex Objects. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_11

Signals from different senses are integrated into multisensory events or segregated according to their temporal and spatial relations. If signals are integrated, we perceive synchrony between them even in the presence of slight stimulus onset asynchronies (SOA). The range of SOAs during which physically asynchronous signals are perceived to be synchronous is called the temporal binding window (TBW). The TBW depends on various factors. Here we investigated how spatial congruency affects the width of the visuotactile TBW in a naturalistic setting, given that spatial congruency of signals in the single senses should promote multisensory integration and thereby binding. In a virtual reality (VR) environment, we presented visual and vibrotactile stimuli in different locations. Vibrotactile stimuli were presented on the participants’ hands or forearms, and visual stimuli were rendered in real time on virtual counterparts of the tracked hands or forearms. We varied SOAs between vision and touch and asked if visual and tactile stimuli had occurred synchronously. Similar to what has been found in the audiovisual domain, the temporal binding window was wider when visual and tactile stimuli were spatially congruent—possibly due to enhanced multisensory integration. Thus, we extend the previous findings and conclusions on spatial congruency effects to visuotactile interactions in VR environments. Celebi, B., Cavdan, M., Drewing, K. (2025). The Visuotactile Temporal Binding Window Widens with Spatial Congruency. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14769. Springer, Cham. https://doi.org/10.1007/978-3-031-70061-3_12

Haptic exploration is an inherently active process by which humans gather sensory information through physical contact with objects. It has been proposed that humans generally optimize their exploration behavior to improve perception. We hypothesized that the duration of haptic explorations is the result of an optimal interplay of sensory and predictive processes, also taking costs such as motor effort into account. We assessed exploration duration and task performance in a two-alternative forced-choice spatial frequency discrimination task under varying conditions of task demand and motor effort. We manipulated task demands by varying the discriminability of virtual grating stimuli and manipulated motor effort by implementing forces counteracting the participants’ movements while switching between stimuli. Participants were instructed to switch between stimuli after each swipe movement. Results revealed that higher task demands lead to higher numbers of exploratory movements (i.e. longer exploration duration), likely reflecting a compensatory mechanism that enables participants to attain a certain level of task performance. However, this effect is reduced when motor effort is increased; while low and medium task demands yield similar numbers of movements regardless of related motor effort, higher demands are not associated with increased numbers of movements when the required motor effort is high. In conclusion, the extent to which increased task demands are compensated via the extension of an exploration seems to depend on the motor costs that the agent is confronted with. Jeschke, M., Metzger, A., Drewing, K. (2025). Humans Terminate Their Haptic Explorations According to an Interplay of Task Demands and Motor Effort. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_7

People regularly use active touch to perform daily life tasks. Imagine choosing a comfortable pillow and how you would explore its softness. It is known that people tune their exploratory behavior to get the most relevant information. During exploration, people also use prior information that is available before they touch an object. For softness perception, object indentation plays a crucial role; indentation forces were higher when people implicitly expected to explore harder as compared to softer objects. This force-tuning improved perception and was observed when trials of the same softness level (hard or soft) were presented in longer blocks. However, it was not reported for predictable patterns in which hard and soft stimuli alternate on every trial or every other trial. Here, we investigated when and how implicit prior information about the softness level becomes accessible for successful force-tuning in softness discrimination. Participants were presented with hard and soft stimulus pairs in sequences of 2, 4, or 6 trials. In predictable conditions, same-length sequences of hard and soft trials alternated constantly. In unpredictable conditions, we presented sequences of lengths 2, 4, and 6 randomly. We analyzed initial peak indentation forces. Participants applied higher forces to harder stimuli in the predictable condition in longer sequences (4 and 6 trials) as compared to the unpredictable condition and to shorter sequences of 2 trials. We interpret the findings in terms of an anticipatory and incremental mechanism of force-tuning, which needs to be triggered by an initial predictable stimulus. Katircilar, D., Drewing, K. (2025). The Role of Implicit Prior Information in Haptic Perception of Softness. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_13

Fingertip friction is a key component of tactile perception. In active tactile exploration, friction forces depend on the applied normal force and on the chosen sliding speed. We investigated whether humans perceive the speed dependence of friction for textured surfaces of materials which show either an increase or a decrease of the friction coefficient with speed. Participants perceived the decrease or increase when the relative difference in friction coefficient between fast and slow sliding speeds was more than 20%. The fraction of comparison judgments that agreed with the measured difference in friction coefficient did not depend on variations in the applied normal force. The results indicate a perceptual constancy for fingertip friction with respect to self-generated variations of sliding speed and applied normal force. Fehlberg, M., Monfort, E., Saikumar, S., Drewing, K., & Bennewitz, R. (2024). Perceptual Constancy in the Speed Dependence of Friction During Active Tactile Exploration. IEEE Transactions on Haptics.

This chapter provides an initial overview of the state of knowledge on the joint processing of information from different senses in humans. It deals with processes of multisensory integration of redundant information and multisensory combination, the problem of assigning related information from different senses, mechanisms of matching between the senses, the role of attention and the neurophysiological principles of multisensory processing. Examples from ergonomics and clinical practice are used to illustrate the applicability of the findings. Drewing, K. (2024). Multisensorische Informationsverarbeitung. In: Rieger, M., Müsseler, J. (eds) Allgemeine Psychologie. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-68476-4_4

Combining or integrating information from multiple senses often provides richer and more reliable estimates for the perception of objects and events. In daily life, sensory information from the same source is often in close spatiotemporal proximity. This can be an important determinant of whether and how multisensory signals are combined. The introduction of advanced technical display systems makes it possible to present multisensory information in virtual environments. However, technical displays can lack the spatiotemporal fidelity of the real world due to rendering delays. Thus, any spatiotemporal incongruency could alter how information is combined. In the current study we tested this by investigating if and how spatially and temporally discrepant tactile displacement cues can supplement imprecise visual displacement cues. Participants performed a visual displacement task with visual and tactile displacement cues under spatial and temporal incongruency conditions. We modelled how participants combined visual and tactile information in the visuotactile condition using their performance in the visual-only condition. We found that temporal incongruency led to an increase in tactile weights, although these weights correlated with those in the congruent condition. In contrast, spatial incongruency led to individual differences that altered cue combination strategies. Our results illustrate the importance of spatiotemporal congruency for combining tactile and visual cues when making visual displacement judgments. Given the altered cue combination strategies and individual differences, we recommend that developers adopt individual spatiotemporal calibration procedures to improve the efficiency of sensory augmentation. Goktepe, N., Drewing, K., & Schütz, A. C. (2024). Spatiotemporal congruency modulates weighting of visuotactile information in displacement judgments. IEEE Transactions on Haptics.

Humans can use prior information to optimize their haptic exploratory behavior. Here, we investigated the usage of visual priors, which mechanisms enable their usage, and how the usage is affected by information quality. Participants explored different grating textures and discriminated their spatial frequency. Visual priors on texture orientation were given each trial, with qualities randomly varying from high to no informational value. Adjustments of initial exploratory movement direction orthogonal to the textures’ orientation served as an indicator of prior usage. Participants indeed used visual priors; the more so the higher the priors’ quality (Experiment 1). Higher task demands did not increase the direct usage of visual priors (Experiment 2), but possibly fostered the establishment of adjustment behavior. In Experiment 3, we decreased the proportion of high-quality priors presented during the session, hereby reducing the contingency between high-quality priors and haptic information. In consequence, even priors of high quality ceased to evoke movement adjustments. We conclude that the establishment of adjustment behavior results from a rather implicit contingency learning. Overall, it became evident that humans can autonomously learn to use rather abstract visual priors to optimize haptic exploration, with the learning process and direct usage substantially depending on the priors’ quality. Jeschke, M., Zoeller, A.C. & Drewing, K. Humans flexibly use visual priors to optimize their haptic exploratory behavior. Sci Rep 14, 14906 (2024). https://doi.org/10.1038/s41598-024-65958-6

Göktepe, N., & Cavdan, M. (2024). In D. Dövencioglu (Ed.), Nörondan algoritmaya davranışı anlamak (1st ed., pp. 85-107). Nobel Yayincilik.

In everyday interaction we touch different materials, which we experience along a limited number of perceptual and emotional dimensions: For instance, a furry surface feels soft and pleasant, whereas sandpaper feels rough and unpleasant. In a previous study, younger adults manually explored a representative set of solid, fluid, and granular materials. Their ratings were made along six perceptual dimensions (roughness, fluidity, granularity, deformability, fibrousness, heaviness) and three emotional ones (valence, arousal, dominance). Perceptual and emotional dimensions were systematically correlated. Here, we wondered how this perceptuo-affective organization of touched materials depends on age, given that older adults show decline in haptic abilities, in particular detail perception. 30 younger participants (~22 years, half female) and 15 older participants (~66 years) explored 25 materials using 18 perceptual and 9 emotional adjectives. We extracted 6 perceptual and 2 emotional dimensions. Older and younger adults showed similar dimensions. However, in younger participants roughness and granularity judgments formed separate dimensions, while they collapsed into a single dimension in older people. Further, the age groups differed in the perception of roughness, granularity, and valence, and older people did not show a positive correlation between valence and granularity as younger people did. As expected, control analyses between young males and females did not reveal similar gender differences. Overall, the results demonstrate that older people organize and experience materials partly differently from younger people, which we attribute to sensory decline. However, other aspects of perceptual organization that also include fine perception are preserved into older age. Drewing, K. (2024). Perceptuo-affective organization of touched materials in younger and older adults. PLOS ONE, 19(1), e0296633.