Are we better at reaching for what we see, or what we feel?

In everyday life, people manipulate all kinds of physical objects. These objects differ in physical properties such as shape, texture and size, all of which a person must consider when reaching for and using an object. Many actions require two hands. For example, to zip up your jacket, you first need to reach the slider with one hand and place it correctly in the insertion pin held by the other hand before pulling the slider up along the chain. This kind of motor action can be guided by visual information or by touch information. In most cases, however, people use all the available and relevant sensory information to perform the action, combining vision and touch.

How are visual and touch information integrated and used to produce specific motor outputs? A team of researchers at Western University started addressing these questions by investigating how visual and tactile stimuli affect reaching actions.

In 2016, Dr. Andrew Pruszynski and his lab showed that touch information collected by one hand allowed the other hand to adjust its trajectory to reach a target object. This means that information collected by one hand is quickly shared with the other hand to properly perform the action¹. Recently, Dr. Sasha Reschechtko and Dr. Andrew Pruszynski asked whether reaching responses guided by touch information lead to the same observations as reaching responses guided by visual information². To investigate this question, they asked participants to reach toward a spherical target with one hand while they felt a stimulus (an edge or sandpaper sliding below the finger) on the other hand. This touch stimulus indicated the direction in which the target would move. In some trials, participants also had access to visual information about the position of the target.

Visual information often serves to guide our actions. Think of a scenario where you are driving and suddenly see an oncoming car in your lane. To stay safe, you must steer to the opposite side to avoid a collision. However, your brain's intuitive and automatic response is to first move toward the visual target (the car) before detecting this error and correcting the movement away from it. The same effect happens when reaching for things with your arm: your brain needs time to integrate and process the correct position in order to reach it.

The authors found that when participants were shown where a target object would move (visual-only condition) and instructed to reach toward the opposite side of the target, they made an initial incorrect movement in the direction of the target as it started moving, before correcting their trajectory toward the opposite side. In contrast, in the touch-only condition, where touch information (an edge or sandpaper) indicated which side the target would move toward, participants correctly moved opposite to the target without making the initial incorrect movement. Finally, when the two conditions were combined (both touch and visual information indicated where the target would move), participants still made the initial incorrect movement toward the target. These findings suggest that the reaching actions investigated were mainly guided by visual information.

In addition to measuring the direction of the movements, the researchers also measured how quickly corrections occurred by tracking muscle activity in the reaching arm. Interestingly, they found that even though the touch-only condition produced no initial incorrect movement, the correction was not performed faster than in the visual-only or visual+touch conditions, owing to antagonistic muscle action. This finding suggests that the brain uses different regions or 'circuits' to correct the reaching trajectory toward the opposite side of the target, depending on whether the information comes from vision or touch.

Interestingly, the kind of touch stimulation used by the authors did not change the observations. Whether participants felt the sandpaper or the edge indicating where the target was moving, they did not move in the wrong direction, but they also did not move faster than when visual information was available. In other words, the motor response does not depend on all the features of the touch information (texture, position, orientation); the feeling of something slipping under the finger is enough in this paradigm.

This study showed that even though humans heavily rely on vision and touch to guide reaching, these senses do not follow the same rules (e.g., temporal and spatial parameters, speed, flexibility). A follow-up question is whether responses to touch information can be further modified by training, since people eventually learn to use hand-held objects very effectively (e.g., opening a bottle's cap while holding the bottle, or putting a leash on your dog while holding it). This research is important because it tries to unravel how our brain controls action depending on the sensory information available around us. Another open question is whether the same observations hold in a familiar environment, where one has lots of experience. For example, we might expect a professional baseball player to be faster than an occasional player at correcting their trajectory to reach the base before the ball is caught by the opposing team. These are the kinds of questions that Dr. Sasha Reschechtko and Dr. Andrew Pruszynski will now investigate, so stay within reach if you want to learn more!


Original article: “Voluntary modification of rapid tactile-motor responses during reaching differs from its visuomotor counterpart” by Sasha Reschechtko and J. Andrew Pruszynski. https://doi.org/10.1152/jn.00232.2020

References

  1. Pruszynski JA, Johansson RS, Flanagan JR. A Rapid Tactile-Motor Reflex Automatically Guides Reaching toward Handheld Objects. Curr Biol 26: 788–792, 2016.

  2. Reschechtko S, Pruszynski JA. Voluntary modification of rapid tactile-motor responses during reaching differs from its visuomotor counterpart. J Neurophysiol 124: 284–294, 2020.
