Grasping Lab

In the grasping lab, we measure people's movements as they grasp and manipulate objects, using a range of experimental set-ups. For example, test subjects can grasp objects in a virtual environment: the subject looks into a mirror that reflects the image of a monitor in such a way that the virtual object appears to lie directly in front of them on the table. This allows us to manipulate very precisely the perceived material properties of the object to be grasped, such as colour, size and texture, because these can influence human movement planning. In another set-up, we present real objects for the participants to grasp, while an electronically controllable liquid-crystal screen lets us occlude their view before or during the movement; this helps us better understand the influence of visual feedback on movement planning. The grasping lab is also equipped with an eye tracker for measuring eye movements, with which we can record eye and hand movements simultaneously. The aim here is to understand how the brain manages the simultaneous coordination of eye and hand.


In our experiments, the movements are measured with an opto-electronic system. Small infrared light-emitting markers are attached to the participant's hands, and the light they emit is captured by several cameras. From the different camera images, the position of the markers, and thus the position of the hands and fingers, in space can be determined. The device we use for this, the Optotrak 3020 from the Canadian company Northern Digital, is among the world's fastest and most accurate measuring systems, with a spatial resolution of one tenth of a millimetre and a temporal resolution of up to 3200 measured positions per second.
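The core idea of recovering a marker's 3D position from several camera images can be illustrated with a small triangulation sketch. The example below is not the Optotrak's actual algorithm, just a minimal linear (DLT) triangulation under assumed pinhole-camera geometry; the camera matrices and the marker position are hypothetical values chosen for illustration.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its projections in two calibrated cameras.

    Uses the linear (DLT) method: each image observation x ~ P X yields two
    homogeneous equations; we stack them and take the least-squares solution
    via SVD (the right singular vector of the smallest singular value).
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point into a camera with 3x4 projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two hypothetical cameras: one at the origin, one shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 2.0])          # simulated marker position
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_est, X_true))  # True
```

With noise-free projections the linear solution is exact; with real camera data the same least-squares formulation simply returns the point that best explains all observations, which is how multiple views pin down each marker in space.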


Our various projects explore the interplay between visual perception and movement control. Our goal is to understand which visual information the brain uses to control our hand movements and how this information is processed beforehand.