Chapter I: The space around the body: Peripersonal space

1.4 Peripersonal space in action

Evidence from neuroimaging has demonstrated that several sensory and motor regions modulate their neural activity to visual or auditory stimuli based on their distance from the hand or the face. However, this approach does not enable us to determine the direct involvement of such representations in motor processing. As already mentioned, a series of findings in monkeys clearly indicates the strong link between PPS and action. To date, despite the advocated role of PPS in motor functions, in almost all investigations participants have paradoxically been static (sitting passively or, on occasion, standing still), and PPS representation has been assessed in completely static conditions. At most, the presentation of dynamic stimuli provides information on the updating of multisensory interactions when objects approach a static observer (e.g., Fogassi et al., 1996).

Moreover, when humans interact with the environment (e.g., during grasping) or actively move through it (e.g., while walking), the position of the body in space changes constantly relative to the stimuli in the environment. The representation(s) of PPS must therefore be updated in order to maintain effective interaction with external stimuli (Holmes and Spence, 2004; van der Stoep et al., 2016). A series of findings in humans supports the view that multisensory interactions in space vary not only as a consequence of objects moving around the body, but also as a function of the movements of the body through the environment. Finally, recent research has tried to experimentally assess the link between multisensory processing and motor actions, thus filling the gap between the multisensory perceptual investigation of PPS and its involvement in the execution of action.

Because of the relevance of moving objects to the PPS system, Canzoneri and others (2012) developed a paradigm to investigate the influence of dynamic auditory stimuli on tactile perception. In an audio-tactile task, the authors measured reaction times (RTs) to a tactile stimulus applied to the right index finger while dynamic sounds, which gave the impression of either approaching or receding from the subject's hand, were presented. Tactile stimulation was delivered at different temporal delays from the onset of the sound, such that it occurred when the sound source was perceived at varying distances from the body. Participants were simply asked to respond to touches as fast as possible, trying to ignore the sound. An auditory stimulus was found to speed up the processing of a tactile stimulus applied to the hand when the sound was administered within a limited distance from the hand, and tactile RTs progressively decreased as the acoustic stimulus was perceived to approach. The authors were thus able to mark "a boundary": the critical region within which approaching auditory stimuli facilitated the participants' detection of tactile stimuli. Moreover, the results additionally suggested that approaching sounds had a stronger spatially-dependent effect on tactile processing than receding sounds. In line with the finding that the selectivity of some VIP and F4 neurons appears optimally tuned for the detection of dynamic looming visual stimuli (Colby et al., 1993; Bremmer et al., 2002a, 2002b; Rizzolatti et al., 1981; Graziano et al., 1997), such cross-modal processing of approaching stimuli is also evident for looming visual information (Kandula et al., 2014). Tactile sensitivity is enhanced at the predicted location and predicted time of impact of a looming visual stimulus to the face, when compared to baseline tactile sensitivity or to conditions in which the looming stimulus is not temporally or spatially predictive. Of note, tactile perception is also enhanced, as compared to baseline tactile sensitivity, when the looming stimulus brushes past the face without predicting an impact to it, that is, when the looming stimulus is within face PPS (Clery et al., 2015b).
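The "boundary" in this kind of paradigm is commonly estimated by fitting a sigmoid to mean tactile RTs as a function of the perceived distance of the sound, and taking the function's central point as the PPS boundary. A minimal sketch in the spirit of this approach (all data points and parameter values below are purely hypothetical, for illustration only):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, ymin, ymax, d_c, slope):
    """Tactile RT as a sigmoid function of sound distance d (cm).
    d_c, the central point of the sigmoid, is taken as an estimate
    of the PPS boundary."""
    return ymin + (ymax - ymin) / (1.0 + np.exp(-(d - d_c) / slope))

# Hypothetical mean tactile RTs (ms) at five perceived sound distances (cm):
# responses are faster when the approaching sound is close to the hand
distances = np.array([20, 45, 70, 95, 120], dtype=float)
rts = np.array([355, 362, 390, 410, 414], dtype=float)

# Fit the four sigmoid parameters to the RT-by-distance data
params, _ = curve_fit(sigmoid, distances, rts,
                      p0=[350, 415, 70, 10], maxfev=10000)
ymin, ymax, boundary, slope = params
print(f"Estimated PPS boundary: {boundary:.1f} cm")
```

The same logic underlies the trunk-PPS estimates discussed later for standing versus walking observers: a shift of the sigmoid's central point indexes an expansion or contraction of PPS.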

Relevant for this review, in a series of investigations inspired by the macaque neurophysiological works, Makin and colleagues tested rapid motor responses to “real” three-dimensional objects approaching the hand (Makin et al., 2009; 2015). They asked participants to perform a simple button-press motor response with the right index finger, while a task-irrelevant three-dimensional ball suddenly fell just above the participants’ responding hand (near condition), or at a distance (far condition). To assess the effects of the rapidly approaching stimulus on the excitability of the motor system, single pulse transcranial magnetic stimulation (TMS) was applied to the contralateral primary motor cortex to elicit motor evoked potentials (MEPs) in the responding hand. The sudden appearance of this potentially threatening visual stimulus was associated with a reduction in corticospinal excitability at the very early and specific time window of 70-80 ms following its appearance. This inhibition is proposed to reflect the proactive suppression of an automatic avoidance-related response during the execution of the task-related response. Indeed, when the two motor behaviours (i.e., the avoidance- and the task-related responses) were uncoupled, the approaching ball had an opposite, facilitatory effect on corticospinal excitability.

Critically, both the rapid inhibition and the facilitation of corticospinal excitability were hand-centred. Regardless of the location of both overt gaze and covert spatial attention, this motor response was selective for approaching stimuli, depending mostly on the distance of the ball from the hand. These observations thus reveal a direct and fast connection between the visual processing of information in the space near the hand and on-going motor behaviour, suggesting a role for PPS representation in motor responses, for example to avoid rapidly approaching (and potentially threatening) objects (see also Serino et al., 2009; Avenanti et al., 2012 for related results with auditory stimuli). For such a sensorimotor system to be truly effective, not only general information about whether the hand is approached, but also the more specific information about which hand is approached, should be processed rapidly. Using the same approach, it was demonstrated that within a very short time-window of 70 ms from the appearance of the ball, the motor system is already capable of coding which hand the object potentially threatens. These findings therefore support the general claim that PPS coding may well serve to perform defensive actions. As stated above, such a possibility has already been tested in monkeys through direct cortical microstimulation by Graziano and colleagues (Graziano and Cooke, 2006; Cooke and Graziano, 2003, 2004; Graziano et al., 2002), who studied macaque motor activity during defensive movements evoked by aversive cutaneous stimulation. They identified a startle-related muscular activity occurring as early as ~20 to 30 ms after stimulus onset and a later muscle response starting ~70 ms after stimulus onset. Similar motor responses were also evoked by electrical microstimulation of the premotor cortex. Although comparisons between data from monkeys and humans should be made with caution, the hand-centred modulations of motor excitability are similar to the response properties of macaque bimodal neurons. Indeed, independently of the retinal position of the visual stimulus, modulations in motor excitability varied with the distance of the object from the hand and were specific for three-dimensional objects approaching the hand. As suggested by Makin and colleagues (2009, 2012, 2015), these hand-centred mechanisms may play a specific and prominent role in the rapid selection and control of manual actions.

Last, related to this issue is the observation that defensive reflex responses can be finely modulated by the position of the stimulus within the PPS and, in particular, in relation to the area of the body for which the reflex response provides protection (Sambo et al., 2012a, 2012b). For example, the blink reflex elicited by a strong stimulation of the median nerve at the wrist (the hand-blink reflex, HBR) is modulated by the distance between the hand and the eye, i.e., by whether the stimulus is mapped within PPS or not (Sambo and Iannetti, 2013). Although the HBR is an entirely subcortical response, its magnitude is dramatically increased when the stimulated hand is placed closer to the eye. The authors suggest that this effect arises because the brainstem circuits mediating the HBR undergo a tonic and selective top-down modulation from higher-order cortical areas responsible for encoding the location of somatosensory stimuli (Sambo et al., 2012b). This observation therefore indicates that the nervous system is able to adjust its output in a very specific and fine-grained manner, even at the level of seemingly stereotyped defensive reflex responses. In addition, the hand-blink reflex is highly dependent upon cognitive expectations and inferences: for instance, it is enhanced only when participants expect to receive stimuli on the hand placed close to the face (Sambo et al., 2012b). Finally, the HBR enhancement by hand-face proximity is suppressed when a thin wooden screen is placed between the participants' face and their hand. This seems to indicate that creating a barrier between the face and the hand can reshape and reduce the extension of PPS.

These multisensory and motor interactions might be adaptive not only for defensive, but also for appetitive actions, such as grasping (e.g., Gardner et al., 2002, 2007; Marzocchi et al., 2008). In this respect, the properties of the multisensory neurons underlying PPS representations may allow the brain to represent a target object in a coordinate system centred on the body (e.g., the hand, the head, the trunk) that, in addition, can be continuously updated during bodily movements. Since sensory stimuli coming from an external object are initially processed in sensory-dependent reference frames (e.g., visual stimuli in eye-centred, auditory stimuli in head-centred, tactile stimuli in body-centred frames), their coordinates need to be aligned before they can be integrated to control a moving body. To this aim, the same stimuli are coded with respect to a common body-centred reference frame (Colby, 1998; Andersen et al., 1997; Cohen and Andersen, 2002; Sereno and Huang, 2014; Bhattacharyya et al., 2009). The computations necessary for coding stimuli from different modalities in body-centred reference frames differ depending on the body part to which the external stimuli are referenced (Andersen and Buneo, 2002; Cohen and Andersen, 2002; Pouget et al., 2002; Bhattacharyya et al., 2009; Pesaran et al., 2010). For instance, given that our hands can move independently of our eyes, the brain needs to integrate information arising in an eye-centred reference frame with information about the current position of the hand relative to the body and to nearby potential target objects. To do so, eye-centred representations have to be transformed into effector-centred representations in order to command movements directed towards those targets (Makin et al., 2012). It is worth recalling here the motor properties of parietal and frontal visuo-tactile neurons: these multisensory cells have been documented to respond when the arm is voluntarily moved within the reaching space of the animal and have been proposed to code goal-directed actions (Gardner et al., 2007; Rizzolatti et al., 1981a, 1981b, 1997). Hence, one could theorize that the same anticipatory function featured by the PPS network in the case of defensive, avoidance reactions may also have evolved to guide voluntary object-oriented actions. The two hypotheses are not mutually exclusive, given that a more sophisticated grasping function could have developed from more primordial defensive machinery, using the same body part-centred coding of visual space (Brozzoli et al., 2014).
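The re-referencing step described above can be illustrated with a toy geometric sketch: once the eye and the hand are localized in a common trunk-centred frame, an eye-centred stimulus location can be re-expressed relative to the hand by simple vector arithmetic. All coordinates below are made up, and rotations and the population-coding schemes actually proposed in these models are deliberately ignored:

```python
import numpy as np

# All vectors expressed in a common trunk-centred frame (cm); values are arbitrary.
eye_pos  = np.array([0.0, 30.0, 50.0])   # eye position relative to the trunk
hand_pos = np.array([25.0, 40.0, 10.0])  # current hand position relative to the trunk

# A stimulus initially coded in eye-centred coordinates
stim_eye = np.array([10.0, 15.0, -20.0])

# Re-reference: eye-centred -> trunk-centred -> hand-centred
stim_trunk = eye_pos + stim_eye
stim_hand = stim_trunk - hand_pos

print(stim_hand)                   # stimulus location relative to the hand
print(np.linalg.norm(stim_hand))   # hand-stimulus distance, the quantity that
                                   # could gate near/far (PPS) responses
```

The point of the sketch is only that the same stimulus gets different coordinates in different body part-centred frames, and that hand movement changes `hand_pos`, hence the hand-centred coordinates, without any change in the retinal input.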

Only recently, though, has research started to investigate the link between PPS and voluntary motor behaviour in humans. The rationale behind this line of studies is that, if the PPS representation guides the execution of voluntary free-hand (i.e., without tool) actions, then the motor program should induce a rapid online remapping of visuo-tactile spatial interactions. To test this hypothesis, multisensory interactions have been assessed during the execution of a grasping action (Brozzoli et al., 2009). Specifically, Brozzoli and co-workers employed a visuo-tactile interaction (hereafter VTI) task, a modified version of the classic CCE task, to measure how much VTI varied in real time during the action, as a proxy for changes in PPS. In this task, healthy participants had to discriminate the location (up or down) of a tactile stimulus delivered to either of two digits (index or thumb) of one of the two hands. At the same time, they were asked to ignore a task-irrelevant visual distractor that was concurrently presented on the to-be-grasped target object (see Figure 1.10). There was no cue-target delay between tactile targets and visual distractors, thus enhancing the likelihood of eliciting multisensory integration rather than crossmodal spatial attention (McDonald et al., 2001; van der Stoep et al., 2015). When compared to a static condition prior to movement initiation, the start of the grasping action selectively increased the influence exerted by visual inputs originating from the (far) target object on tactile stimuli delivered to the grasping hand. In addition, a further increase in the magnitude of VTI was observed shortly after (200 ms) the onset of the hand movement.
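In tasks of this family, the interaction index is typically quantified as a congruency effect: the RT cost incurred when the visual distractor appears at an elevation incongruent with the tactile target. A minimal sketch of that computation, with purely hypothetical RTs:

```python
import numpy as np

# Hypothetical per-trial tactile discrimination RTs (ms) from a
# visuo-tactile interaction task, split by distractor congruency
congruent_rts   = np.array([412, 398, 405, 420, 409], dtype=float)
incongruent_rts = np.array([455, 448, 462, 441, 450], dtype=float)

# Congruency effect: mean RT cost of an incongruent visual distractor.
# A larger value indexes a stronger visuo-tactile interaction (VTI).
vti = incongruent_rts.mean() - congruent_rts.mean()
print(f"VTI (congruency effect): {vti:.1f} ms")
```

Computed separately for each action phase (static baseline, action onset, early execution), such an index is what allows the phase-by-phase comparisons described in this and the following study.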

Fig 1.10 View of the visuo-tactile interaction task used by Brozzoli and others (2009) to assess multisensory interaction during the execution of an action. (a) Bird's eye view of the participant facing the to-be-grasped object. Electro-cutaneous targets (green zap) were delivered to the index finger (up) or thumb (down), while a visual distractor (yellow flash) could be presented at either the same (congruent, not shown) or a different (incongruent) elevation. Visual distractors are embedded in the to-be-grasped object. (b) and (c) are examples of grasping actions. From Brozzoli et al., 2009.

Crucially, when the same action was performed with the non-stimulated hand, no multisensory modulation was observed, even though the two hands displayed comparable kinematic profiles (Brozzoli et al., 2009). This result constitutes evidence that the execution of a grasping movement triggers a motor-evoked remapping of the hand PPS, and additionally reveals that PPS remapping can occur independently of tool-use (see also Serino et al., 2015a). Possibly as in the monkey brain, the human brain links sources of visual and tactile information that are spatially separated at action onset, updating their interaction as a function of the phase of the action. It is interesting to note that the increase in the strength of the visuo-tactile interaction was present well before the hand came into contact with the object, being triggered by action execution and further increasing during the early execution phase. This further online modulation of visuo-tactile performance suggests that the multisensory representation of the space around the hand might guide the action as it unfolds in time and space (Brozzoli et al., 2010). Such a dynamic, action-dependent modulation of PPS was replicated in a second study in which two types of action were performed.

Brozzoli and colleagues assessed the effects of performing two different actions (i.e., grasping or pointing) towards the same object on the on-line modulations of PPS, as measured by VTI (Brozzoli et al., 2010). When compared to the static condition, the grasping and the pointing actions had similar effects, increasing VTI at action onset. More interestingly, VTI further increased during the execution phase of the grasping, but not of the pointing action, i.e., once the kinematics of the two movements started to diverge (see Figure 1.11). These findings therefore suggest that performing voluntary actions induces a continuous remapping of PPS as a function of the on-line contextual demands imposed by their kinematics. In other terms, as further proof of the deep relationship between PPS and the motor characteristics of the action, different multisensory interactions arise as a function of the required sensory-motor demands, being more pronounced for actions that require relatively more complex sensory-motor transformations. If (at least part of) the remapping of PPS is already effective at the onset of the motor program, as in pointing, the visuo-tactile modulation is kept unchanged during execution. In the case of relatively complex object-oriented interactions such as grasping, by contrast, the remapping of PPS is dynamically updated with respect to the motor command.

Figure 1.11 On-line modulations of PPS during actions. Upper panel: VTI changes during grasping and pointing movements as a function of action phase. Lower panel: kinematic changes of the transport component for both actions: peaks of acceleration (left), velocity (centre) and deceleration (right). Adapted from Brozzoli et al., 2010.

A similar remapping has been documented in the case of walking: PPS extends when participants walk as compared to when they stand still (Noel et al., 2015). The boundary of the trunk PPS was assessed by measuring the spatial distance at which an approaching sound significantly speeded up reactions to tactile stimuli on the participant's body (see Canzoneri et al., 2012; Serino et al., 2015b). The experiment was conducted while participants were either standing or walking on a treadmill, such that the relative distance between the participant's body and the sound source was equivalent in the two conditions. However, while in the static condition sounds occurring closer than ~80-90 cm from the participant decreased tactile RTs, in the walking condition the speeding-up of participants' responses occurred even for sounds farther than 2 m away. This result therefore suggests that, in the case of walking, potential interactions between external stimuli and the body are anticipated (Noel et al., 2015).

1.4.1 A space for body-object interactions? Interim summary

In light of such a wealth of demonstrations, an obvious yet critical question arises: why should the brain be endowed with a modular representation of space displaying multisensory and motor features?

A crucial point I have tried to stress so far is that the encoding of near space has not only a sensory, but also a motor-related nature that qualifies PPS as a multisensory-motor interface. Indeed, neurophysiological evidence in the monkey has demonstrated that neurons in premotor and parietal cortex and in the putamen have multisensory as well as motor functions (Rizzolatti et al., 1997).

In humans, single-pulse TMS experiments have shown that visual (Makin et al., 2009) or auditory (Serino et al., 2009) information presented within PPS transiently modulates the excitability of the hand representation in the primary motor cortex. For this reason, PPS representations are probably best described as multisensory-motor interface(s) serving to encode the location of nearby sensory stimuli in order to generate suitable motor acts. Arguably, body part-centred PPS representations may provide an effective mechanism to guide actions, performed with different effectors, both towards and away from nearby stimuli presented within reaching distance. Indeed, the encoding of the spatial position of external stimuli in a body-centred frame of reference has traditionally been suggested to facilitate the "possibility to act in space" in terms of both approaching and defensive responses. Even for very simple actions, such as avoiding a stimulus coming towards the face or the hand (see Graziano and Cooke, 2006), reaching to grasp an object, or bringing food to the mouth (see Rizzolatti et al., 1997), the motor system needs to compute the position of the visual stimulus relative to the relevant body parts.

From a theoretical perspective, De Vignemont and Iannetti (2014) have accordingly proposed a dual model of PPS representations based on a functional distinction between bodily protection and goal-directed action. Protecting the body against physical threats is one of the vital functions the system should guarantee. By acting as an anticipatory multisensory-motor interface, PPS may serve the early detection of potential threats approaching the body (Fogassi et al., 1996) in order to drive involuntary defensive movements (Cooke and Graziano, 2003; Graziano and Cooke, 2006). As already described, the most direct evidence in favour of this hypothesis comes from cortical electrical stimulation studies (although concerns have been raised in this respect; see Strick, 2002). It is worth acknowledging that, by employing a similar paradigm, it has been demonstrated that the stimulation of parietal visuo-tactile areas can induce not only movements compatible with defensive behaviour, but also movements compatible with "appetitive" behaviour.
