
Dr Luigi Tamè

Lecturer in Cognitive Neuroscience
Placement Year Degree Assistant Co-ordinator


Luigi Tamè is a Lecturer in Cognitive Neuroscience in the School of Psychology.

Research interests

Luigi's research is concerned with sensory perception and sensory-motor interaction. He uses a combination of psychophysics and neuroimaging techniques (e.g. fMRI, MEG, EEG, TMS) to define the behavioural and neural correlates of these phenomena. He is particularly interested in the sense of touch, its interactions with the other senses and the motor system, and the way in which touch is represented and processed by the brain.

His main areas of interest include:

  • Physiology of the sense of touch in healthy individuals 
  • Neural basis of body representation in healthy individuals  
  • Multisensory integration between vision and touch 
  • The development of body knowledge during the first months of life  
  • TMS, EEG and fMRI techniques

Luigi welcomes enquiries from prospective doctoral students interested in these or related topics in somatosensation, multisensory integration, or sensory-motor integration.

Key publications

  • Holmes, N. P., & Tamè, L. (2019). Locating primary somatosensory cortex in human brain stimulation studies: Systematic review and meta-analytic evidence. Journal of Neurophysiology, 121, 152-162.
  • Sadibolova, R., Tamè, L., & Longo, M. R. (2018). More than skin deep: Integration of skin-based and musculo-skeletal reference frames in localisation of touch. Journal of Experimental Psychology: Human Perception and Performance, 44, 1672-1682.
  • Tamè, L., Dransfield, E., Quettier, T., & Longo, M. R. (2017). Finger posture modulates structural body representations. Scientific Reports, 7, 43019.
  • Tamè, L., & Holmes, N. P. (2016). Involvement of human primary somatosensory cortex in vibrotactile detection depends on task demand. NeuroImage, 138, 184-196.
  • Tamè, L., Braun, C., Holmes, N. P., Farnè, A., & Pavani, F. (2016). Bilateral representations of touch in the primary somatosensory cortex. Cognitive Neuropsychology, 33, 48-66.


Grants and awards

2019-2020: EPS Small Grant, "Repetitive somatosensory stimulation and motor evoked potentials" (Principal Investigator)
2019-2020: University of Kent Faculty of Social Sciences Research Grant, "Sensory attenuation in preschool children" (PI, with Dr Elena Nava, University of Milano-Bicocca, CoI; Dr Nadia Bolognini, University of Milano-Bicocca, CoI; Dr Nicholas P Holmes, University of Nottingham, CoI)
2019-2020: Visiting Expert Research Grant (PAT), "Inter-hemispheric interactions serving sensory-motor coordination across body sides" (Principal Investigator)
2017: CIMeC Young Researcher Award, 10th anniversary of CIMeC, University of Trento (€500)
2016: Reviewer Lottery Winner, Cognitive Neuropsychology, Taylor & Francis (€250)
2015: Best Paper Award, Italian Association of Psychology
2011-2014: Marie Curie Individual Fellowship, European Commission/PAT (Principal Investigator)
2013: Young Researcher Award, Italian Association of Psychology (€250)
2010-2011: Postdoctoral Research Fellowship (Co-Principal Investigator)
2006-2009: PhD Research Fellowship, University of Trento (€40,000)
2001: Erasmus Exchange Student Programme Fellowship, University of Padova (€1,500)



Publications

  • Hidaka, S., Tamè, L., Zafarana, A., & Longo, M. (2020). Anisotropy in tactile time perception. Cortex, 128, 124-131. doi:10.1016/j.cortex.2020.03.011
    Spatial distortions in touch have been investigated since the 19th century. For example, two touches applied to the hand dorsum feel farther apart when aligned with the mediolateral axis (i.e., across the hand) than when aligned with the proximodistal axis (along the hand). Stimulations to our sensory receptors are usually dynamic, where spatial and temporal inputs closely interact to establish our percept. For example, physically bigger tactile stimuli are judged to last longer than smaller stimuli. Given such links between space and time in touch, we investigated whether there is a tactile anisotropy in temporal perception analogous to the anisotropy described above. In this case, the perceived duration between the onset of two touches should be larger when they are aligned with the mediolateral than with the proximodistal axis of the hand dorsum. To test this hypothesis, we asked participants to judge which of two tactile temporal sequences, having the same spatial separation along and across the dorsum, felt longer. A clear anisotropy of the temporal perception was observed: temporal intervals across the hand were perceived as longer than those along the hand. Consistent with the spatial anisotropy, the temporal anisotropy did not appear on the palm side of the hand, indicating that the temporal anisotropy was based on perceptual processes rather than top-down modulations such as attentional or decisional/response biases. Contrary to our predictions, however, we found no correlation between the magnitudes of the temporal and spatial anisotropies. Our results demonstrated a novel type of temporal illusion in touch, which is strikingly similar in nature to the previously reported spatial anisotropy. Thus, qualitatively similar distorted somatosensory representations appear to underlie both temporal and spatial processing of touch.
  • Dolgilevica, K., Longo, M., & Tamè, L. (2020). Structural representations of fingers rely on both anatomical and spatial reference frames. Journal of Experimental Psychology: Human Perception and Performance. doi:10.1037/xhp0000715
    Finger agnosia refers to a neurological condition in which patients with left posterior parietal lesions fail to identify their fingers, despite having relatively preserved abilities in sensation and skilled action. This dissociation suggests that structural body representations (BSRs) may be distinct from sensorimotor representations. However, recent research has reported that postural changes modulate the representation of hand structure, revealing dynamic interactions between structural and sensorimotor body representations. It remains unknown, though, how and to what extent anatomical and spatial proximity contribute to shaping the hand structural representation. We investigated this question using the "in-between" test, in which participants estimate how many unstimulated fingers lie between two touched fingers of the left hand placed palm down. The first phalange of the participants' fingers was touched on the left or right side. Judged finger numerosity was greater when fingers were stimulated on far sides (i.e., the opposite sides of the two fingers) compared to when they were stimulated on close sides (i.e., the sides facing each other) or at mid-distance (i.e., sides facing in the same direction). Therefore, finger identification was modulated both by anatomical proximity and by spatial proximity in external space between touches. This demonstrates that BSRs rely on both anatomical and external reference frames.
  • Manser-Smith, K., Tamè, L., & Longo, M. (2019). A common representation of fingers and toes. Acta Psychologica, 199, 102900. doi:10.1016/j.actpsy.2019.102900
    There are many similarities and differences between the human hands and feet. On a psychological level, there is some evidence from clinical disorders and studies of tactile localisation in healthy adults for deep functional connections between the hands and feet. One form these connections may take is in common high-level mental representations of the hands and feet. Previous studies have shown that there are systematic, but distinct patterns of confusion found between both the fingers and toes. Further, there are clear individual differences between people in the exact patterns of mislocalisations. Here, we investigated whether these idiosyncratic differences in tactile localisation are shared between the fingers and toes, which may indicate a shared high-level representation. We obtained confusion matrices showing the pattern of mislocalisation on the hairy skin surfaces of both the fingers and toes. Using a decoding approach, we show that idiosyncratic differences in individuals' pattern of confusions are shared across the fingers and toes, despite different overall patterns of confusions. These results suggest that there is a common representation of the fingers and toes.
  • Romano, D., Tamè, L., Amoruso, E., Azañón, E., Maravita, A., & Longo, M. (2019). The standard posture of the hand. Journal of Experimental Psychology: Human Perception & Performance. doi:10.1037/xhp0000662
    Perceived limb position is known to rely on sensory signals and motor commands. Another potential source of input is a standard representation of body posture, which may bias perceived limb position towards more stereotyped positions. Recent results show that tactile stimuli are processed more efficiently when delivered to a thumb in a relatively low position or an index finger in a relatively high position. This observation suggests that we may have a standard posture of the body that promotes a more efficient interaction with the environment. In this study, we mapped the standard posture of the entire hand by characterizing the spatial associations of all five digits. Moreover, we show that the effect is not an artefact of intermanual integration. Results showed that the thumb is associated with low positions, while the other fingers are associated with upper locations.
  • Tamè, L., Azañón, E., & Longo, M. (2019). A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Frontiers in Psychology, 10. doi:10.3389/fpsyg.2019.00291
    The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have been investigated for less time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these different body dimensions affect the integration of tactile information, as well as how they guide motor behaviour, by integrating them in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
  • Holmes, N., Tamè, L., Beeching, P., Medford, M., Rakova, M., Stuart, A., & Zeni, S. (2019). Locating primary somatosensory cortex in human brain stimulation studies: Experimental evidence. Journal of Neurophysiology, 121, 336-344. doi:10.1152/jn.00641.2018
    Transcranial magnetic stimulation (TMS) over human primary somatosensory cortex (S1) does not produce immediate outputs. Researchers must therefore rely on indirect methods for TMS coil positioning. The 'gold standard' is to use individual functional and structural magnetic resonance imaging (MRI) data, but the majority of studies do not do this. The most common method to locate the hand area of S1 (S1-hand) is to move the coil posteriorly from the hand area of primary motor cortex (M1-hand). Yet, S1-hand is not directly posterior to M1-hand. We localised the index finger area of S1-hand experimentally in four ways. First, we re-analysed functional MRI data from 20 participants who received vibrotactile stimulation to their 10 digits. Second, to assist the localisation of S1-hand without MRI data, we constructed a probabilistic atlas of the central sulcus from 100 healthy adult MRIs, and measured the likely scalp location of S1-index. Third, we conducted two experiments mapping the effects of TMS across the scalp on tactile discrimination performance. Fourth, we examined all available neuronavigation data from our laboratory on the scalp location of S1-index. Contrary to the prevailing method, and consistent with systematic review evidence, S1-index is close to the C3/C4 electroencephalography (EEG) electrode locations on the scalp, approximately 7-8 cm lateral to the vertex, and approximately 2 cm lateral and 0.5 cm posterior to the M1-FDI scalp location. These results suggest that an immediate revision to the most commonly-used heuristic to locate S1-hand is required. The results of many TMS studies of S1-hand need reassessment.
  • Holmes, N., & Tamè, L. (2019). Locating primary somatosensory cortex in human brain stimulation studies: systematic review and meta-analytic evidence. Journal of Neurophysiology, 121, 152-162. doi:10.1152/jn.00614.2018
    Transcranial magnetic stimulation (TMS) over human primary somatosensory cortex (S1), unlike over primary motor cortex (M1), does not produce an immediate, objective output. Researchers must therefore rely on one or more indirect methods to position the TMS coil over S1. The 'gold standard' method of TMS coil positioning is to use individual functional and structural magnetic resonance imaging (f/sMRI) alongside a stereotactic navigation system. In the absence of these facilities, however, one common method used to locate S1 is to find the scalp location which produces twitches in a hand muscle (e.g., the first dorsal interosseus, M1-FDI), then move the coil posteriorly to target S1. There has been no systematic assessment of whether this commonly-reported method of finding the hand area of S1 is optimal. To do this, we systematically reviewed 124 TMS studies targeting the S1 hand area, and 95 functional magnetic resonance imaging (fMRI) studies involving passive finger and hand stimulation. 96 TMS studies reported the scalp location assumed to correspond to S1-hand, which was on average 1.5 to 2 cm posterior to the functionally-defined M1-hand area. Using our own scalp measurements combined with similar data from MRI and TMS studies of M1-hand, we provide the estimated scalp locations targeted in these TMS studies of the S1-hand. We also provide a summary of reported S1 coordinates for passive finger and hand stimulation in fMRI studies. We conclude that S1-hand is more lateral to M1-hand than assumed by the majority of TMS studies.
  • Manser-Smith, K., Tamè, L., & Longo, M. (2018). Tactile confusions of the fingers and toes. Journal of Experimental Psychology: Human Perception and Performance, 44, 1727-1738. doi:10.1037/xhp0000566
    Recent research has shown systematic patterns of confusions between digits of the hands and feet. The present study addressed whether such confusions arise from early somatosensory maps or higher level body representations. As the glabrous and hairy skin of the hands and feet have distinct representations in somatosensory cortex, an effect arising from early somatotopic maps may show distinct patterns on each skin surface. In contrast, if the effect arises from higher level body representations which represent the digits as volumetric units, similar patterns should be apparent regardless of which side of the digit is touched. We obtained confusion matrices showing the pattern of mislocalization on the glabrous and hairy skin surfaces of the toes (Experiment 1) and fingers (Experiment 2). Our results replicated the characteristic pattern of mislocalizations found on the glabrous skin reported in previous studies. Critically, these effects were highly similar on the hairy skin surface of both the toes and fingers. Despite the pattern of mislocalizations being highly stereotyped across participants, there were consistent individual differences in the pattern of confusions across the two skin surfaces. These results suggest that mislocalizations occur at the level of individual digits, consistent with their resulting from higher level body representations.
  • Tamè, L., Linkenauger, S., & Longo, M. (2018). Dissociation of feeling and belief in the rubber hand illusion. PLOS ONE, 13. doi:10.1371/journal.pone.0206367
    The Rubber Hand Illusion (RHI) has been widely used to investigate the perception of the bodily self. Commonly used measures of the illusion are self-report questionnaires and proprioceptive drift of the participants’ hands towards the rubber hand. Recent studies have shown that these measures can be dissociated, suggesting they may arise from distinct mechanisms. In previous studies using questionnaires, participants were asked to base responses on their subjective feelings of body ownership, rather than their beliefs. This makes sense given the obvious fact that whereas participants may feel like the rubber hand is part of their body, they do not believe that it is. It is not clear, however, whether a similar dissociation between feelings and beliefs also exists for proprioceptive drift. Here, we investigated the presence of a dissociation between feeling and belief in the context of the RHI. When participants reported their feelings there was an increase both in the sense of body ownership over the fake hand as well as in the proprioceptive drift, compared to when they reported their beliefs. Strikingly, unlike the sense of ownership, proprioceptive drift was unaffected by the synchrony of stimulation. This may be an important way in which the two measures of the RHI differ.
  • Sadibolova, R., Tamè, L., & Longo, M. (2018). More than skin-deep: Integration of skin-based and musculoskeletal reference frames in localization of touch. Journal of Experimental Psychology: Human Perception and Performance, 44, 1672-1682. doi:10.1037/xhp0000562
    The skin of the forearm is, in one sense, a flat 2D sheet, but in another sense approximately cylindrical, mirroring the 3D volumetric shape of the arm. The role of frames of reference based on the skin as a 2D sheet versus the musculo-skeletal structure of the arm remains unclear. When we rotate the forearm from a pronated to a supinated posture, the skin on its surface is displaced. Thus, a marked location will slide with the skin across the underlying flesh, and a touch perceived at this location should follow this displacement if it is localised within a skin-based reference frame. We investigated, however, whether the perceived tactile locations were also affected by the rearrangement of the underlying musculo-skeletal structure, i.e., displaced medially and laterally on a pronated and supinated forearm, respectively. Participants pointed to perceived touches (Experiment 1), or marked them on a three-dimensional size-matched forearm on a computer screen (Experiment 2). The perceived locations were indeed displaced medially after forearm pronation in both response modalities. This misperception was reduced (Experiment 1), or absent altogether (Experiment 2), in the supinated posture when the actual stimulus grid moved laterally with the displaced skin. The grid was perceptually stretched along the medio-lateral axis and displaced distally, which suggests the influence of skin-based factors. Our study extends the tactile localisation literature, focused on the skin-based reference frame and on the effects of the spatial positions of body parts, by implicating musculo-skeletal factors in the localisation of touch on the body.
  • Tamè, L. (2018). L’integrazione somatosensoriale e sensomotoria tra i due emisferi cerebrali [Somatosensory and sensorimotor integration between the two cerebral hemispheres]. Riabilitazione Neurocognitiva, 1, 15-22.
  • Holmes, N., & Tamè, L. (2018). Multisensory Perception: Magnetic Disruption of Attention in Human Parietal Lobe. Current Biology, 28, 259-261. doi:10.1016/j.cub.2018.01.078
    Paying attention to sounds and touches at the same time is demanding. New research shows how the parietal lobe of the human brain mediates multisensory perception of stimulus frequency and intensity.
  • Sadibolova, R., Tamè, L., Walsh, E., & Longo, M. (2018). Mind the Gap: The Effects of Temporal and Spatial Separation in Localization of Dual Touches on the Hand. Frontiers in Human Neuroscience, 12. doi:10.3389/fnhum.2018.00055
    In this study, we aimed to relate the findings from two predominantly separate streams of literature: one reporting on the localization of single touches on the skin, and the other on the perceived distance between dual touches. Participants were touched with two points, delivered either simultaneously or separated by a short delay, at various locations on their left hand dorsum. They then indicated the perceived locations of the tactile stimuli on a size-matched hand silhouette. We quantified the deviations between the actual stimulus grid and the corresponding perceptual map constructed from the perceived tactile locations, and we calculated the precision of tactile localization (i.e., the variability across localization attempts). The evidence showed that dual touches, akin to single touches, were mislocalized distally and that their variable localization error was reduced near joints, particularly near the knuckles. However, contrary to the single-touch localization literature, we observed that dual touches were mislocalized towards the ulnar side of the hand, particularly when they were presented sequentially. Further, touches presented in a sequential order were slightly “repelled” from each other and their perceived distance increased, while simultaneous tactile pairs were localized closer to each other and their distance was compressed. Whereas the sequential touches may have been localized with reference to the body, the compression of tactile perceptual space for simultaneous touches has been related in the previous literature to signal summation and inhibition and to low-level factors, including the innervation density and the properties of the receptive fields (RFs) of somatosensory neurons.
  • Ambroziak, K., Tamè, L., & Longo, M. (2018). Conceptual distortions of hand structure are robust to changes in stimulus information. Consciousness and Cognition, 61, 107-116. doi:10.1016/j.concog.2018.01.002
    Previous studies showed stereotyped distortions in hand representations. People judge their knuckles as farther forward in the hand than they actually are. The cause of this bias remains unclear. We tested whether both visual and tactile information contribute to the bias. In Experiment 1, participants judged the location of their knuckles by pointing to the location on their palm with: (1) a metal baton (using vision and touch), (2) a metal baton while blindfolded (using touch), or (3) a laser pointer (using vision). Distal mislocalisations were found in all conditions. In Experiment 2, we investigated whether judgments are influenced by visual landmarks such as creases. Participants localized their knuckles on either a photograph of their palm or a silhouette. Distal mislocalisations were apparent in both conditions. These results show that distal biases are resistant to changes in stimulus information, suggesting that such mislocalisations reflect a conceptual mis-representation of hand structure.
  • Medina, S., Tamè, L., & Longo, M. (2018). Tactile localization biases are modulated by gaze direction. Experimental Brain Research, 236, 31-42. doi:10.1007/s00221-017-5105-2
    Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations with different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participant’s hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tips of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximodistal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of the perceived locations, elongating it along the radio-ulnar axis.
  • Tamè, L., Wühle, A., Petri, C., Pavani, F., & Braun, C. (2017). Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain and Cognition, 111, 25-33. doi:10.1016/j.bandc.2016.10.005
    Localizing tactile stimuli on our body requires sensory information to be represented in multiple frames of reference along the sensory pathways. These reference frames include the representation of sensory information in skin coordinates, in which the spatial relationship of skin regions is maintained. The organization of the primary somatosensory cortex matches such a somatotopic reference frame. In contrast, higher-order representations are based on external coordinates, in which body posture and gaze direction are taken into account in order to localize touch in other meaningful ways according to task demands. Dominance of one representation or the other, or the use of multiple representations with different weights, is thought to depend on contextual factors of cognitive and/or sensory origin. However, it is unclear under which situations one reference frame takes over from another, or when different reference frames are jointly used at the same time. The study of tactile mislocalizations at the fingers has shown a key role of the somatotopic frame of reference, both when touches are delivered unilaterally to a single hand and when they are delivered bilaterally to both hands. Here, we took advantage of a well-established tactile mislocalization paradigm to investigate whether the reference frame used to integrate bilateral tactile stimuli can change as a function of the spatial relationship between the two hands. Specifically, supra-threshold interference stimuli were applied to the index or little fingers of the left hand 200 ms prior to the application of a test stimulus on a finger of the right hand. Crucially, different hand postures were adopted (uncrossed or crossed). Results show that introducing a change in hand posture triggered the concurrent use of somatotopic and external reference frames when processing bilateral touch at the fingers. This demonstrates that both somatotopic and external reference frames can be concurrently used to localize tactile stimuli on the fingers.
  • Tamè, L., Carr, A., & Longo, M. (2017). Vision of the body improves inter-hemispheric integration of tactile-motor responses. Acta Psychologica, 175, 21-27. doi:10.1016/j.actpsy.2017.02.007
    Sensory input from and motor output to the two sides of the body needs to be continuously integrated between the two cerebral hemispheres. This integration can be measured through its cost in terms of processing speed. In simple detection tasks, reaction times (RTs) are faster when stimuli are presented to the side of the body ipsilateral to the body part used to respond. This advantage – the contralateral-ipsilateral difference (also known as the crossed-uncrossed difference: CUD) – is thought to reflect inter-hemispheric interactions needed for sensorimotor information to be integrated between the two hemispheres. Several studies have shown that non-informative vision of the body enhances performance in tactile tasks. However, it is unknown whether the CUD can be similarly affected by vision. Here, we investigated whether the CUD is modulated by vision of the body (i.e., the stimulated hand) by presenting tactile stimuli unpredictably on the middle fingers when one hand was visible (i.e., either the right or left hand). Participants detected the stimulus and responded as fast as possible using either their left or right foot. Consistent with previous results, a clear CUD (5.8 ms) was apparent on the unseen hand. Critically, however, no such effect was found on the hand that was visible (−2.2 ms). Thus, when touch is delivered to a seen hand, the usual cost in processing speed of responding with a contralateral effector is eliminated. This result suggests that vision of the body improves the interhemispheric integration of tactile-motor responses.
  • Tamè, L., Dransfield, E., Quettier, T., & Longo, M. (2017). Finger posture modulates structural body representations. Scientific Reports, 7. doi:10.1038/srep43019
    Patients with lesions of the left posterior parietal cortex commonly fail in identifying their fingers, a condition known as finger agnosia, yet are relatively unimpaired in sensation and skilled action. Such dissociations have traditionally been interpreted as evidence that structural body representations (BSRs), such as the body structural description, are distinct from sensorimotor representations, such as the body schema. We investigated whether performance on tasks commonly used to assess finger agnosia is modulated by changes in hand posture. We used the ‘in-between’ test, in which participants estimate the number of unstimulated fingers between two touched fingers, or a localization task, in which participants judge which two fingers were stimulated. Across blocks, the fingers were placed in three levels of splay. Judged finger numerosity was analysed, in Exp. 1 by direct report and in Exp. 2 as the actual number of fingers between the fingers named. In both experiments, judgments were greater when non-adjacent stimulated fingers were positioned far apart compared to when they were close together or touching, whereas judgements were unaltered when adjacent fingers were stimulated. This demonstrates that BSRs are not fixed, but are modulated by the real-time physical distances between body parts.
  • Tamè, L., Bumpus, N., Linkenauger, S., & Longo, M. (2017). Distorted body representations are robust to differences in experimental instructions. Attention, Perception, & Psychophysics, 79, 1204-1216. doi:10.3758/s13414-017-1301-1
    Several recent reports have shown that even healthy adults maintain highly distorted representations of the size and shape of their body. These distortions have been shown to be highly consistent across different study designs and dependent measures. However, previous studies have found that visual judgments of size can be modulated by the experimental instructions used, for example, by asking for judgments of the participant’s subjective experience of stimulus size (i.e., apparent instructions) versus judgments of actual stimulus properties (i.e., objective instructions). Previous studies investigating internal body representations have relied exclusively on ‘apparent’ instructions. Here, we investigated whether apparent versus objective instructions modulate findings of distorted body representations underlying position sense (Exp. 1), tactile distance perception (Exp. 2), as well as the conscious body image (Exp. 3). Our results replicate the characteristic distortions previously reported for each of these tasks and further show that these distortions are not affected by instruction type (i.e., apparent vs. objective). These results show that the distortions measured with these paradigms are robust to differences in instructions and do not reflect a dissociation between perception and belief.
  • Tamè, L., & Holmes, N. (2016). Involvement of human primary somatosensory cortex in vibrotactile detection depends on task demand. NeuroImage, 138, 184-196. doi:10.1016/j.neuroimage.2016.05.056
    Detecting and discriminating sensory stimuli are fundamental functions of the nervous system. Electrophysiological and lesion studies suggest that macaque primary somatosensory cortex (SI) is critically involved in discriminating between stimuli, but is not required simply for detecting stimuli. By contrast, transcranial magnetic stimulation (TMS) studies in humans have shown near-complete disruption of somatosensory detection when a single pulse of TMS is delivered over SI. To address this discrepancy, we measured the sensitivity and decision criteria of participants detecting vibrotactile stimuli with individually-tailored fMRI-guided TMS over SI, over a control site not activated by vibrotactile stimuli (inferior parietal lobule, IPL), or away from the head (a no-TMS condition). In a one-interval detection task, TMS increased participants' likelihood of reporting that no target was present, regardless of site, but TMS over SI also decreased detection sensitivity and prevented improvement in tactile sensitivity over time. We then measured tactile thresholds in a series of two-interval forced-choice (2IFC) detection and discrimination tasks with lower dependence on response criteria and short-term memory load. We found that thresholds for detecting stimuli were comparable with TMS over SI and IPL, but TMS over SI specifically and significantly impaired frequency discrimination. We conclude that, in accordance with macaque studies, human SI is required for discriminating between tactile stimuli and for maintaining stimulus representations over time, or under high task demand, but may not be required for simple tactile detection.
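    The one-interval analysis above separates detection sensitivity from response criterion, the two quantities standard signal detection theory derives from hit and false-alarm rates. A minimal sketch of that computation (the log-linear correction and the example trial counts are illustrative assumptions, not taken from the paper):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') and criterion (c) from trial counts.

    A log-linear correction keeps rates of 0 or 1 from producing
    infinite z-scores.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)          # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # positive c = bias to "no"
    return d_prime, criterion

# Hypothetical counts: a conservative observer (more "no" responses)
# yields a positive criterion; TMS over SI would show up as lower d'.
d, c = dprime_criterion(hits=70, misses=30, false_alarms=10, correct_rejections=90)
```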
  • Tamè, L., Braun, C., Holmes, N., Farnè, A., & Pavani, F. (2016). Bilateral representations of touch in the primary somatosensory cortex. Cognitive Neuropsychology, 33, 48-66. doi:10.1080/02643294.2016.1159547
    According to current textbook knowledge, primary somatosensory cortex (SI) supports unilateral tactile representations, whereas structures beyond SI, in particular the secondary somatosensory cortices (SII), support bilateral tactile representations. However, dexterous and well-coordinated bimanual motor tasks require early integration of bilateral tactile information. Sequential processing, first of unilateral and subsequently of bilateral sensory information, might not be sufficient to accomplish these tasks. This view of sequential processing in the somatosensory system might therefore be questioned, at least for demanding bimanual tasks. Evidence from the last fifteen years is forcing a revision of this textbook notion. Studies in animals and humans indicate that SI is more than a simple relay for unilateral sensory information and, together with SII, contributes to the integration of somatosensory inputs from both sides of the body. Here, we review a series of recent works from our own and other laboratories in favour of interactions between tactile stimuli on the two sides of the body at early stages of processing. We will focus on tactile processing, although a similar logic may also apply to other aspects of somatosensation. We begin by describing the basic anatomy and physiology of interhemispheric transfer, drawing on neurophysiological studies in animals and behavioural studies in humans that showed tactile interactions between body sides, both in healthy and brain-damaged individuals. Then we describe the neural substrates of bilateral interactions in somatosensation as revealed by neurophysiological work in animals and neuroimaging studies in humans (i.e., functional magnetic resonance imaging, magnetoencephalography, and transcranial magnetic stimulation).
Finally, we conclude by considering how the efficient integration of bilateral sensory information at early processing stages can coexist with more lateralised representations of somatosensory input in the context of motor control.
  • Azañón, E., Tamè, L., Maravita, A., Linkenauger, S., Ferrè, E., Tajadura-Jiménez, A., & Longo, M. (2016). Multimodal Contributions to Body Representation. Multisensory Research, 29, 635-661. doi:10.1163/22134808-00002531
    Our body is a unique entity by which we interact with the external world. Consequently, the way we represent our body has profound implications for the way we process and locate sensations and, in turn, perform appropriate actions. The body can be the subject, but also the object, of our experience, providing information from sensations on the body surface and viscera, but also knowledge of the body as a physical object. However, the extent to which different senses contribute to constructing the rich and unified body representations we all experience remains unclear. In this review, we aim to bring together recent research showing important roles for several different sensory modalities in constructing body representations. At the same time, we hope to generate new ideas about how, and at which level, the senses contribute to the different levels of body representation and how they interact. We will present an overview of some of the most recent neuropsychological evidence about multisensory control of pain, and the way that the visual, auditory, vestibular and tactile systems contribute to the creation of coherent representations of the body. We will focus particularly on some of the topics discussed in the symposium on Multimodal Contributions to Body Representation held at the 15th International Multisensory Research Forum (2015, Pisa, Italy).
  • Longo, M., Sadibolova, R., & Tamè, L. (2016). Embodying prostheses – how to let the body welcome assistive devices: Comment on “The embodiment of assistive devices—from wheelchair to exoskeleton” by M. Pazzaglia and M. Molinari. Physics of Life Reviews, 16, 184-185. doi:10.1016/j.plrev.2016.01.012
  • Tamè, L., & Longo, M. (2015). Inter-hemispheric integration of tactile-motor responses across body parts. Frontiers in Human Neuroscience, 9, 345. doi:10.3389/fnhum.2015.00345
    In simple detection tasks, reaction times (RTs) are faster when stimuli are presented to the visual field or side of the body ipsilateral to the body part used to respond. This advantage, the crossed-uncrossed difference (CUD), is thought to reflect interhemispheric interactions needed for sensorimotor information to be integrated between the two cerebral hemispheres. However, it is unknown whether the tactile CUD is invariant when different body parts are stimulated. The most likely structure mediating such processing is thought to be the corpus callosum (CC). Neurophysiological studies have shown that there are denser callosal connections between regions that represent proximal parts of the body near the body midline and sparser connections for regions representing distal extremities. Therefore, if the transfer of information between the two hemispheres is affected by the density of callosal connections, stimuli presented on more distal regions of the body should produce a greater CUD than stimuli presented on more proximal regions. This is because interhemispheric transfer of information from regions with sparse callosal connections will be less efficient, and hence slower. Here, we investigated whether the CUD is modulated as a function of the body part stimulated by presenting tactile stimuli unpredictably on body parts at different distances from the body midline (i.e., the middle finger, forearm, or forehead of each side of the body). Participants detected the stimulus and responded as fast as possible using either their left or right foot. Results showed that the magnitude of the CUD was larger on the finger (~2.6 ms) and forearm (~1.8 ms) than on the forehead (~0.9 ms). This result suggests that the interhemispheric transfer of tactile stimuli varies as a function of the strength of the callosal connections of the body parts.
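    The CUD itself is simply a contrast between mean reaction times in crossed and uncrossed configurations. A minimal sketch (the RT values are hypothetical, for illustration only):

```python
from statistics import mean

def crossed_uncrossed_difference(rts_crossed, rts_uncrossed):
    """CUD in ms: mean RT when stimulus and responding limb are on
    opposite body sides (crossed) minus mean RT when they are on the
    same side (uncrossed). Positive values index the cost of
    interhemispheric transfer."""
    return mean(rts_crossed) - mean(rts_uncrossed)

# Hypothetical RTs (ms); per the account above, a distal body part
# such as the finger should yield a larger CUD than the forehead.
cud_finger = crossed_uncrossed_difference([312, 305, 318], [309, 303, 314])
```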
  • Tamè, L., Pavani, F., Braun, C., Salemme, R., Farnè, A., & Reilly, K. (2015). Somatotopy and temporal dynamics of sensorimotor interactions: evidence from double afferent inhibition. European Journal of Neuroscience, 41, 1459-1465. doi:10.1111/ejn.12890
    Moving and interacting with the world requires that the sensory and motor systems share information, but while some information about tactile events is preserved during sensorimotor transfer, the spatial specificity of this information is unknown. Afferent inhibition studies, in which corticospinal excitability is inhibited when a single tactile stimulus is presented before a transcranial magnetic stimulation pulse over the motor cortex, offer contradictory results regarding the sensory-to-motor transfer of spatial information. Here, we combined the techniques of afferent inhibition and tactile repetition suppression (RS: the decreased neurophysiological response following double stimulation of the same vs. different fingers) to investigate whether topographic information is preserved in the sensory-to-motor transfer in humans. We developed a double afferent inhibition paradigm to examine both spatial (same vs. different finger) and temporal (short vs. long delay) aspects of sensorimotor interactions. Two consecutive electrocutaneous stimuli (separated by either 30 or 125 ms) were delivered to either the same or different fingers on the left hand (i.e., index finger stimulated twice, or middle finger stimulated before index finger). Information about which fingers were stimulated was reflected in the size of the motor responses in a time-constrained manner: corticospinal excitability was modulated differently by same- and different-finger stimulation only when the two stimuli were separated by the short delay (p = .004). We demonstrate that the well-known response of the somatosensory cortices following repetitive stimulation is mirrored in the motor cortex and that corticospinal excitability is modulated as a function of the temporal and spatial relationship between afferent stimuli.
  • Tamè, L., Pavani, F., Papadelis, C., Farnè, A., & Braun, C. (2015). Early integration of bilateral touch in the primary somatosensory cortex. Human Brain Mapping, 36, 1506-1523. doi:10.1002/hbm.22719
    Animal studies, as well as behavioural and neuroimaging studies in humans, have documented integration of bilateral tactile information at the level of primary somatosensory cortex (SI). However, it is still debated whether integration in SI occurs early or late during tactile processing, and whether it is somatotopically organized. To address both the spatial and temporal aspects of bilateral tactile processing, we used magnetoencephalography in a tactile repetition-suppression paradigm. We examined somatosensory evoked responses produced by probe stimuli preceded by an adaptor, as a function of the relative position of adaptor and probe (probe always at the left index finger; adaptor at the index or middle finger of the left or right hand) and as a function of the delay between adaptor and probe (0, 25, or 125 ms). The percentage of response-amplitude suppression was computed by comparing paired (adaptor + probe) with single stimulations of adaptor and probe. Results show that response suppression varies differentially in SI and SII as a function of both spatial and temporal features of the stimuli. Remarkably, repetition suppression of SI activity emerged early in time, regardless of whether the adaptor stimulus was presented on the same or the opposite body side with respect to the probe. These novel findings support the notion of an early and somatotopically organized interhemispheric integration of tactile information in SI.
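    One common way to express suppression of a paired response relative to the summed single-stimulus responses is as a percentage; the sketch below illustrates that idea only, and the exact formula and the amplitude values are assumptions, not taken from the paper:

```python
def repetition_suppression_pct(amp_paired, amp_adaptor_alone, amp_probe_alone):
    """Percent suppression of the paired (adaptor + probe) response
    relative to the sum of the two single-stimulus responses.
    100% = complete suppression; 0% = purely additive responses."""
    predicted = amp_adaptor_alone + amp_probe_alone
    return 100.0 * (1.0 - amp_paired / predicted)

# Hypothetical amplitudes: a paired response of 60 against single
# responses summing to 100 corresponds to 40% suppression.
rs = repetition_suppression_pct(60.0, 55.0, 45.0)
```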
  • Rusconi, E., Tamè, L., Furlan, M., Haggard, P., Demarchi, G., Adriani, M., Ferrari, P., Braun, C., & Schwarzbach, J. (2014). Neural Correlates of Finger Gnosis. Journal of Neuroscience, 34, 9012-9023. doi:10.1523/JNEUROSCI.3119-13.2014
    Neuropsychological studies have described patients with a selective impairment of finger identification in association with posterior parietal lesions. However, evidence of the role of these areas in finger gnosis from studies of the healthy human brain is still scarce. Here we used functional magnetic resonance imaging to identify the brain network engaged in a novel finger gnosis task, the intermanual in-between task (IIBT), in healthy participants. Several brain regions exhibited a stronger blood oxygenation level-dependent (BOLD) response in the IIBT than in a control task that did not explicitly rely on finger gnosis but used identical stimuli and motor responses as the IIBT. The IIBT involved stronger signal in the left inferior parietal lobule (IPL), bilateral precuneus (PCN), bilateral premotor cortex, and left inferior frontal gyrus. In all regions, stimulation of nonhomologous fingers of the two hands elicited higher BOLD signal than stimulation of homologous fingers. Only in the left anteromedial IPL (a-mIPL) and left PCN did signal strength decrease parametrically from nonhomology, through partial homology, to total homology with stimulation delivered synchronously to the two hands. With asynchronous stimulation, the signal was stronger in the left a-mIPL than in any other region, possibly indicating retention of task-relevant information. We suggest that the left PCN may contribute a supporting visuospatial representation via its functional connection to the right PCN. The a-mIPL may instead provide the core substrate of an explicit bilateral body structure representation for the fingers that, when disrupted, can produce the typical symptoms of finger agnosia.
  • Tamè, L., Moles, A., & Holmes, N. (2014). Within, but not between hands interactions in vibrotactile detection thresholds reflect somatosensory receptive field organization. Frontiers in Psychology. doi:10.3389/fpsyg.2014.00174
    Detection of a tactile stimulus on one finger is impaired when a concurrent stimulus (masker) is presented on an additional finger of the same or the opposite hand. This phenomenon is known to be finger-specific at the within-hand level. However, whether this specificity is also maintained at the between-hand level is not known. In four experiments, we addressed this issue by combining a Bayesian adaptive staircase procedure (QUEST) with a two-interval forced choice (2IFC) design in order to establish thresholds for detecting 200 ms, 100 Hz sinusoidal vibrations applied to the index or little fingertip of either hand (targets). We systematically varied the masker finger (index, middle, ring, or little finger of either hand), while controlling the spatial location of the target and masker stimuli. Detection thresholds varied consistently as a function of the masker finger when the latter was on the same hand (Experiments 1 and 2), but not when on different hands (Experiments 3 and 4). Within the hand, detection thresholds increased for masker fingers closest to the target finger (i.e., middle > ring when the target was the index). Between the hands, detection thresholds were higher when the masker was present on any finger as compared to when the target was presented in isolation. The within-hand effect of masker finger is consistent with the segregation of different fingers at the early stages of somatosensory processing, from the periphery to the primary somatosensory cortex (SI). We propose that detection is finger-specific and reflects the organisation of somatosensory receptive fields in SI within, but not between, the hands.
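    Adaptive staircases of this kind track a threshold by adjusting stimulus intensity from the observer's trial-to-trial responses. The sketch below uses a simple 2-down/1-up rule rather than the Bayesian QUEST procedure the study employed, purely to illustrate the idea; the simulated observer and all parameters are assumptions:

```python
import random

def two_down_one_up(trial_correct, start=1.0, step=0.1, n_trials=60):
    """Simple 2-down/1-up adaptive staircase (illustrative; not QUEST).
    `trial_correct` is a callable mapping intensity -> bool.
    Two consecutive correct responses lower the intensity; one error
    raises it, so the track hovers near the detection threshold."""
    intensity, correct_run, history = start, 0, []
    for _ in range(n_trials):
        history.append(intensity)
        if trial_correct(intensity):
            correct_run += 1
            if correct_run == 2:                   # two correct: make it harder
                intensity = max(intensity - step, 0.0)
                correct_run = 0
        else:                                      # one error: make it easier
            intensity += step
            correct_run = 0
    return history

# Simulated observer with a true threshold of 0.5: always correct above
# it, at chance below it. The staircase should settle near that value.
random.seed(0)
track = two_down_one_up(lambda i: i > 0.5 or random.random() < 0.5)
```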
  • Tamè, L., Farnè, A., & Pavani, F. (2013). Vision of the body and the differentiation of perceived body side in touch. Cortex, 49, 1340-1351. doi:10.1016/j.cortex.2012.03.016
    Although tactile representations of the two body sides are initially segregated into opposite hemispheres of the brain, behavioural interactions between body sides exist and can be revealed under conditions of tactile double simultaneous stimulation (DSS) at the hands. Here we examined to what extent vision can affect body side segregation in touch. To this aim, we changed hand-related visual input while participants performed a go/no-go task to detect a tactile stimulus delivered to one target finger (e.g., right index), stimulated alone or with a concurrent non-target finger either on the same hand (e.g., right middle finger) or on the other hand (e.g., left index finger = homologous; left middle finger = non-homologous). Across experiments, the two hands were visible or occluded from view (Exp.1), images of the two hands were either merged using a morphing technique (Exp.2), or were shown in a compatible vs. incompatible position with respect to the actual posture (Exp.3). Overall, the results showed reliable interference effects of DSS, as compared to target-only stimulation. This interference varied as a function of which non-target finger was stimulated, and emerged both within and between hands. These results imply that the competition between tactile events is not clearly segregated across body sides. Crucially, non-informative vision of the hand affected overall tactile performance only when a visual/proprioceptive conflict was present, while neither congruent nor morphed hand vision affected tactile DSS interference. This suggests that DSS operates at a tactile processing stage in which interactions between body sides can occur regardless of the available visual input from the body.
  • Tamè, L., Braun, C., Lingnau, A., Schwarzbach, J., Demarchi, G., Li Hegner, Y., Farnè, A., & Pavani, F. (2012). The Contribution of Primary and Secondary Somatosensory Cortices to the Representation of Body Parts and Body Sides: An fMRI Adaptation Study. Journal of Cognitive Neuroscience, 24, 2306-2320. doi:10.1162/jocn_a_00272
    While the somatosensory homunculus is a classically used description of the way somatosensory inputs are processed in the brain, the actual contribution of the primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remains poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the fingertips of the index and middle fingers. The second stimulus (test) was always administered to the left index finger, while the first stimulus (adaptor) was delivered either to the same or to a different (middle) finger of the right or left hand. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more when the stimulation was repeated over homologous than non-homologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in SI, as well as in SII. These findings suggest that SI is more than simply a relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating the somatosensory input from the two sides of the body (hands).
  • Kanayama, N., Tamè, L., Ohira, H., & Pavani, F. (2012). Top down influence on visuo-tactile interaction modulates neural oscillatory responses. NeuroImage, 59, 3406-3417. doi:10.1016/j.neuroimage.2011.11.076
    Multisensory integration involves bottom-up as well as top-down processes. We investigated the influences of top-down control on the neural responses to multisensory stimulation using EEG recording and time-frequency analyses. Participants were stimulated at the index finger or thumb of the left hand, using tactile vibrators mounted on a foam cube. Simultaneously, they received a visual distractor from a light-emitting diode adjacent to the active vibrator (spatially congruent trial) or adjacent to the inactive vibrator (spatially incongruent trial). The task was to respond to the elevation of the tactile stimulus (upper or lower), while ignoring the simultaneous visual distractor. To manipulate top-down control over this multisensory stimulation, the proportion of spatially congruent (vs. incongruent) trials was changed across blocks. Our results reveal that the behavioural cost of responding to incongruent rather than congruent trials (i.e., the crossmodal congruency effect) was modulated by the proportion of congruent trials. Most importantly, the EEG gamma band response and the gamma-theta coupling were also affected by this modulation of top-down control, whereas the late theta band response related to the congruency effect was not. These findings suggest that the gamma band response is more than a marker of multisensory binding, being also sensitive to the correspondence between expected and actual multisensory stimulation. By contrast, the theta band response was affected by congruency but appears to be largely immune to stimulation expectancy.
  • Tamè, L., Farnè, A., & Pavani, F. (2011). Spatial coding of touch at the fingers: Insights from double simultaneous stimulation within and between hands. Neuroscience Letters, 487, 78-82. doi:10.1016/j.neulet.2010.09.078
    We studied the effect of tactile double simultaneous stimulation (DSS) within and between hands to examine spatial coding of touch at the fingers. Participants performed a go/no-go task to detect a tactile stimulus delivered to one target finger (e.g., right index), stimulated alone or with a concurrent non-target finger, either on the same hand (e.g., right middle finger) or on the other hand (e.g., left index finger = homologous; left middle finger = non-homologous). Across blocks we also changed the posture of the unseen hands (both hands palm-down, or one hand rotated palm-up). When both hands were palm-down, DSS interference effects emerged both within and between hands, but only when the non-homologous finger served as the non-target. This suggests a clear segregation between the fingers of each hand, regardless of finger side. By contrast, when one hand was palm-up, interference effects emerged only within a hand, whereas between-hands DSS interference was considerably reduced or absent. Thus, between-hands interference was clearly affected by changes in hand posture. Taken together, these findings provide behavioural evidence in humans for multiple spatial coding of touch during tactile DSS at the fingers. In particular, they confirm the existence of early representations of touch that distinguish between body regions more than body sides. Moreover, they show that information about the side of tactile stimulation becomes prominent when postural updating is required.
  • Galfano, G., Mazza, V., Tamè, L., Umiltà, C., & Turatto, M. (2008). Change detection evokes a Simon-like effect. Acta Psychologica, 127, 186-196. doi:10.1016/j.actpsy.2007.04.004
    A change detection paradigm was used to estimate the role of explicit change detection in the generation of the irrelevant spatial stimulus coding underlying the Simon effect. In one condition, no blank was interposed between two successive displays, which produced efficient change detection. In another condition, the presence of a blank frame produced a robust change blindness effect, which is crucially assumed to occur as a consequence of impaired attentional orienting to the change location. The results showed a strong Simon-like effect under conditions of efficient change detection. By contrast, no Simon-like effect was observed under conditions of change blindness, namely when attention shifting towards the change location was hampered. Experiment 2 supported this pattern by showing that a Simon-like effect could be observed when the blank was present, but only when participants detected the change by means of a cue that was informative as to the change location. Overall, our findings show that a Simon-like effect can only be observed under conditions of explicit change detection, likely because a shift of attention towards the change location has occurred.
  • Turatto, M., Valsecchi, M., Tamè, L., & Betta, E. (2007). Microsaccades distinguish between global and local visual processing. NeuroReport, 18, 1015-1018. doi:10.1097/WNR.0b013e32815b615b
    Much is known about the functional mechanisms involved in visual search. Yet, the fundamental question of whether the visual system can perform different types of visual analysis at different spatial resolutions still remains unsettled. In the visual-attention literature, the distinction between different spatial scales of visual processing corresponds to the distinction between distributed and focused attention. Some authors have argued that singleton detection can be performed in distributed attention, whereas others suggest that even such a simple visual operation involves focused attention. Here we showed that microsaccades were spatially biased during singleton discrimination but not during singleton detection. The results provide support to the hypothesis that some coarse visual analysis can be performed in a distributed attention mode.