Dr Brianna Beck

Lecturer in Psychology
Placement Year Degree Assistant Co-ordinator
Member of the Ethics team


Dr Brianna Beck is a Lecturer in Psychology at the University of Kent. Her research interests include pain and somatosensory perception, body representation and the sense of agency.

Key publications

  • Amemiya, T., Beck, B., Walsh, V., Gomi, H., & Haggard, P. (2017). Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study. Scientific Reports, 7, 40937.
  • Beck, B., Di Costa, S., & Haggard, P. (2017). Having control over the external world increases the implicit sense of agency. Cognition, 162, 54-60.
  • Borhani, K., Beck, B., & Haggard, P. (2017). Choosing, doing, and controlling: implicit sense of agency over somatosensory events. Psychological Science, 28(7), 882-893.
  • Beck, B., Làdavas, E., & Haggard, P. (2016). Viewing the body modulates both pain sensations and pain responses. Experimental Brain Research, 234(7), 1795-1805.

Research interests

Brianna's primary research interest is somatosensory perception (i.e. touch, temperature and, in particular, pain perception). Specifically, she is interested in how those senses interact with cognitive and affective processes such as learning, motivation, and our sense of control over our actions and their outcomes. Brianna is also interested in how somatosensory processing contributes to bodily awareness and to our mental representations of our own bodies. She uses behavioural, psychophysical, electrophysiological (EEG) and brain stimulation (TMS) methods to investigate questions such as:

  • How do experiences of pain influence our sense of agency (i.e. our sense of having control over our actions and, through them, over the world around us)?
  • How does persistent pain affect our ability to learn about rewards and our motivation to seek them?
  • How are nociceptive signals (i.e. the sensory signals that normally give rise to pain) integrated within multisensory spatial representations of the body in the brain?

Prospective doctoral students who are interested in these questions, or in other related topics in pain and somatosensory perception, are welcome to contact Brianna.


Publications

  • Beck, B., Saramandi, A., Ferrè, E., & Haggard, P. (2020). Which way is down? Visual and tactile verticality perception in expert dancers and non-experts. Neuropsychologia, 146. doi:10.1016/j.neuropsychologia.2020.107546
    Gravity provides an absolute verticality reference for all spatial perception, allowing us to move within and interact effectively with our world. Bayesian inference models explain verticality perception as a combination of online sensory cues with a prior prediction that the head is usually upright. Until now, these Bayesian models have been formulated for judgements of the perceived orientation of visual stimuli. Here, we investigated whether judgements of the verticality of tactile stimuli follow a similar pattern of Bayesian perceptual inference. We also explored whether verticality perception is affected by the postural and balance expertise of dancers. We tested both the subjective visual vertical (SVV) and the subjective tactile vertical (STV) in ballet dancers and non-dancers. A robotic arm traced downward-moving visual or tactile stimuli in separate blocks while participants held their head either upright or tilted 30° to their right. Participants reported whether these stimuli deviated to the left (clockwise) or right (anti-clockwise) of the gravitational vertical. Tilting the head biased the SVV away from the longitudinal head axis (the classical E-effect), consistent with a failure to compensate for the vestibulo-ocular counter-roll reflex. On the contrary, tilting the head biased the STV toward the longitudinal head axis (the classical A-effect), consistent with a strong upright head prior. Critically, tilting the head reduced the precision of verticality perception, particularly for ballet dancers’ STV judgements. Head tilt is thought to increase vestibular noise, so ballet dancers seem to be surprisingly susceptible to degradation of vestibular inputs, giving them an inappropriately high weighting in verticality judgements.
  • Fardo, F., Beck, B., Allen, M., & Finnerup, N. (2019). Beyond labeled lines: A population coding account of the thermal grill illusion. Neuroscience & Biobehavioral Reviews, 108, 472-479. doi:10.1016/j.neubiorev.2019.11.017
    Heat and pain illusions (synthetic heat and the thermal grill illusion) can be generated by simultaneous cold and warm stimulation on the skin at temperatures that would normally be perceived as innocuous in isolation. Historically, two key questions have dominated the literature: which specific pathway conveys the illusory perceptions of heat and pain, and where, specifically, does the illusory pain originate in the central nervous system? Two major theories - the addition and disinhibition theories - have suggested distinct pathways, as well as specific spinal or supraspinal mechanisms. However, both theories fail to fully explain experimental findings on illusory heat and pain phenomena. We suggest that the disagreement between previous theories and experimental evidence can be solved by abandoning the assumption of one-to-one relations between pathways and perceived qualities. We argue that a population coding framework, based on distributed activity across non-nociceptive and nociceptive pathways, offers a more powerful explanation of illusory heat and pain. This framework offers new hypotheses regarding the neural mechanisms underlying temperature and pain perception.
  • Beck, B., Gnanasampanthan, S., Iannetti, G., & Haggard, P. (2019). No temporal contrast enhancement of simple decreases in noxious heat. Journal of Neurophysiology, 121, 1778-1786. doi:10.1152/jn.00335.2018
    Offset analgesia (OA) studies have found that small decreases in the intensity of a tonic noxious heat stimulus yield a disproportionately large amount of pain relief. In the classic OA paradigm, the decrease in stimulus intensity is preceded by an increase of equal size from an initial noxious level. While the majority of researchers believe this temporal sequence of two changes is important for eliciting OA, it has also been suggested that the temporal contrast mechanism underlying OA may enhance detection of simple, isolated decreases in noxious heat. To test whether decreases in noxious heat intensity, by themselves, are perceived better than increases of comparable sizes, we used an adaptive two-interval alternative forced choice task to find perceptual thresholds for increases and decreases in radiant and contact heat. Decreases in noxious heat were more difficult to perceive than increases of comparable sizes from the same initial temperature of 45°C. In contrast, decreases and increases were perceived equally well within a common range of noxious temperatures (i.e., when increases started from 45°C and decreases started from 47°C). In another task, participants rated the pain intensity of heat stimuli that randomly and unpredictably increased, decreased or remained constant. Ratings of unpredictable stimulus decreases also showed no evidence of perceptual enhancement. Our results demonstrate that there is no temporal contrast enhancement of simple, isolated decreases in noxious heat intensity. Combined with previous OA findings, they suggest that long-lasting noxious stimuli that follow an increase-decrease pattern may be important for eliciting the OA effect.
  • Christensen, J., Di Costa, S., Beck, B., & Haggard, P. (2019). I just lost it! Fear and anger reduce the sense of agency: a study using intentional binding. Experimental Brain Research, 237, 1205-1212. doi:10.1007/s00221-018-5461-6
    Two recent studies have demonstrated that increases in arousal states lead to an increase in people’s sense of agency, i.e., the subjective experience of controlling one’s own voluntary actions (Minohara et al. in Front Psychol 7:1165, 2016; Wen et al. in Conscious Cogn 36:87–95, 2015). Here we extend these findings by showing that arousal states with negative emotional valence, such as fear and anger, decrease sense of agency. Anger and fear are negative emotional states. Anecdotally, they are often invoked as reasons for losing control, and neuroscientific evidence confirms important effects on the brain’s action control systems. Surprisingly, the subjective experience of acting in anger or fear has scarcely been investigated experimentally. Thus, the legal notion that these intense emotions may undermine normal voluntary control over actions and outcomes (the ‘Loss of Control’, a partial defence for murder) lacks any clear evidence base. In three laboratory experiments, we measured sense of agency using an implicit measure based on time perception (the “intentional binding” paradigm). These actions occurred in either an emotionally neutral condition, or in a fearful (experiments 1 and 2) or angry state (experiment 3). In line with our hypotheses, fear or anger reduced the subjective sense of control over an action outcome, even though the objective causal link between action and outcome remained the same. This gap between the objective facts of agency, and a reduced subjective experience of agency under emotional conditions, has important implications for society and law.
  • Beck, B., Peña-Vivas, V., Fleming, S., & Haggard, P. (2019). Metacognition across sensory modalities: Vision, warmth, and nociceptive pain. Cognition, 186, 32-41. doi:10.1016/j.cognition.2019.01.018
    The distinctive experience of pain, beyond mere processing of nociceptive inputs, is much debated in psychology and neuroscience. One aspect of perceptual experience is captured by metacognition—the ability to monitor and evaluate one’s own mental processes. We investigated confidence in judgements about nociceptive pain (i.e. pain that arises from the activation of nociceptors by a noxious stimulus) to determine whether metacognitive processes contribute to the distinctiveness of the pain experience. Our participants made intensity judgements about noxious heat, innocuous warmth, and visual contrast (first-order, perceptual decisions) and rated their confidence in those judgements (second-order, metacognitive decisions). First-order task performance between modalities was balanced using adaptive staircase procedures. For each modality, we quantified metacognitive efficiency (meta-d’/d’)—the degree to which participants’ confidence reports were informed by the same evidence that contributed to their perceptual judgements—and metacognitive bias (mean confidence)—the participant’s tendency to report higher or lower confidence overall. We found no overall differences in metacognitive efficiency or mean confidence between modalities. Mean confidence ratings were highly correlated between all three tasks, reflecting stable inter-individual variability in metacognitive bias. However, metacognitive efficiency for pain varied independently of metacognitive efficiency for warmth and visual perception. That is, those participants who had higher metacognitive efficiency in the visual task also tended to have higher metacognitive efficiency in the warmth task, but not necessarily in the pain task. We thus suggest that some distinctive and idiosyncratic aspects of the pain experience may stem from additional variability at a metacognitive level. We further speculate that this additional variability may arise from the affective or arousal aspects of pain.
  • von Mohr, M., Krahé, C., Beck, B., & Fotopoulou, A. (2018). The social buffering of pain by affective touch: a laser-evoked potential study in romantic couples. Social Cognitive and Affective Neuroscience, 13, 1121-1130. doi:10.1093/scan/nsy085
    Pain is modulated by social context. Recent neuroimaging studies have shown that romantic partners can provide a potent form of social support during pain. However, such studies have only focused on passive support, finding a relatively late-onset modulation of pain-related neural processing. In this study, we examined for the first time dynamic touch by one’s romantic partner as an active form of social support. Specifically, 32 couples provided social, active, affective (vs active but neutral) touch according to the properties of a specific C-tactile afferent pathway to their romantic partners, who then received laser-induced pain. We measured subjective pain ratings and early N1 and later N2-P2 laser-evoked potentials (LEPs) to noxious stimulation, as well as individual differences in adult attachment style. We found that affective touch from one’s partner reduces subjective pain ratings and similarly attenuates LEPs both at earlier (N1) and later (N2-P2) stages of cortical processing. Adult attachment style did not affect LEPs, but attachment anxiety had a moderating role on pain ratings. This is the first study to show early neural modulation of pain by active, partner touch, and we discuss these findings in relation to the affective and social modulation of sensory salience.
  • Fardo, F., Beck, B., Cheng, T., & Haggard, P. (2018). A mechanism for spatial perception on human skin. Cognition, 178, 236-243. doi:10.1016/j.cognition.2018.05.024
    Our perception of where touch occurs on our skin shapes our interactions with the world. Most accounts of cutaneous localisation emphasise spatial transformations from a skin-based reference frame into body-centred and external egocentric coordinates. We investigated another possible method of tactile localisation based on an intrinsic perception of ‘skin space’. The arrangement of cutaneous receptive fields (RFs) could allow one to track a stimulus as it moves across the skin, similarly to the way animals navigate using path integration. We applied curved tactile motions to the hands of human volunteers. Participants identified the location midway between the start and end points of each motion path. Their bisection judgements were systematically biased towards the integrated motion path, consistent with the characteristic inward error that occurs in navigation by path integration. We thus showed that integration of continuous sensory inputs across several tactile RFs provides an intrinsic mechanism for spatial perception.
  • Borhani, K., Beck, B., & Haggard, P. (2017). Choosing, Doing, and Controlling: Implicit Sense of Agency Over Somatosensory Events. Psychological Science, 28, 882-893. doi:10.1177/0956797617697693
    Sense of agency—a feeling of control over one’s actions and their outcomes—might include at least two components: free choice over which outcome to pursue and motoric control over the action causing the outcome. We orthogonally manipulated locus of outcome choice (free or instructed choice) and motoric control (active or passive movement), while measuring the perceived temporal attraction between actions and outcomes (temporal binding) as an implicit marker of agency. Participants also rated stimulus intensity so that we could measure sensory attenuation, another possible implicit marker of agency. Actions caused higher or lower levels of either painful heat or mild electrotactile stimulation. We found that both motoric control and outcome choice contributed to outcome binding. Moreover, free choice, relative to instructed choice, attenuated the perceived magnitude of high-intensity outcomes, but only when participants made an active movement. Thus, choosing, not just doing, influences temporal binding and sensory attenuation, though in different ways. Our results show that these implicit measures of agency are sensitive to both voluntary motor commands and instrumental control over action outcomes.
  • Beck, B., Di Costa, S., & Haggard, P. (2017). Having control over the external world increases the implicit sense of agency. Cognition, 162, 54-60. doi:10.1016/j.cognition.2017.02.002
    The sense of agency refers to the feeling of control over one’s actions, and, through them, over external events. One proposed marker of implicit sense of agency is ‘intentional binding’—the tendency to perceive voluntary actions and their outcomes as close in time. Another is attenuation of the sensory consequences of a voluntary action. Here we show that the ability to choose an outcome through action selection contributes to implicit sense of agency. We measured intentional binding and stimulus intensity ratings using painful and non-painful somatosensory outcomes. In one condition, participants chose between two actions with different probabilities of producing high or low intensity outcomes, so action choices were meaningful. In another condition, action selection was meaningless with respect to the outcome. Having control over the outcome increased binding, especially when outcomes were painful. Greater sensory attenuation also tended to be associated with stronger binding of the outcome towards the action that produced it. Previous studies have emphasised the link between sense of agency and initiation of voluntary motor actions. Our study shows that the ability to control outcomes by discriminative action selection is another key element of implicit sense of agency. It also investigates, for the first time, the relation between binding and sensory attenuation for the same events.
  • Amemiya, T., Beck, B., Walsh, V., Gomi, H., & Haggard, P. (2017). Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study. Scientific Reports, 7. doi:10.1038/srep40937
    Human imaging studies have reported activations associated with tactile motion perception in visual motion area V5/hMT+, primary somatosensory cortex (SI) and posterior parietal cortex (PPC; Brodmann areas 7/40). However, such studies cannot establish whether these areas are causally involved in tactile motion perception. We delivered double-pulse transcranial magnetic stimulation (TMS) while moving a single tactile point across the fingertip, and used signal detection theory to quantify perceptual sensitivity to motion direction. TMS over both SI and V5/hMT+, but not the PPC site, significantly reduced tactile direction discrimination. Our results show that V5/hMT+ plays a causal role in tactile direction processing, and strengthen the case for V5/hMT+ serving multimodal motion perception. Further, our findings are consistent with a serial model of cortical tactile processing, in which higher-order perceptual processing depends upon information received from SI. By contrast, our results do not provide clear evidence that the PPC site we targeted (Brodmann areas 7/40) contributes to tactile direction perception.
  • Walsh, L., Critchlow, J., Beck, B., Cataldo, A., de Boer, L., & Haggard, P. (2016). Salience-driven overestimation of total somatosensory stimulation. Cognition, 154, 118-129. doi:10.1016/j.cognition.2016.05.006
    Psychological characterisation of sensory systems often focusses on minimal units of perception, such as thresholds, acuity, selectivity and precision. Research on how these units are aggregated to create integrated, synthetic experiences is rarer. We investigated mechanisms of somatosensory integration by asking volunteers to judge the total intensity of stimuli delivered to two fingers simultaneously. Across four experiments, covering physiological pathways for tactile, cold and warm stimuli, we found that judgements of total intensity were particularly poor when the two simultaneous stimuli had different intensities. Total intensity of discrepant stimuli was systematically overestimated. This bias was absent when the two stimulated digits were on different hands. Taken together, our results showed that the weaker stimulus of a discrepant pair was not extinguished, but contributed less to the perception of the total than the stronger stimulus. Thus, perception of somatosensory totals is biased towards the most salient element. ‘Peak’ biases in human judgements are well-known, particularly in affective experience. We show that a similar mechanism also influences sensory experience.
  • Beck, B., Làdavas, E., & Haggard, P. (2016). Viewing the body modulates both pain sensations and pain responses. Experimental Brain Research, 234, 1795-1805. doi:10.1007/s00221-016-4585-9
    Viewing the body can influence pain perception, even when vision is non-informative about the noxious stimulus. Prior studies used either continuous pain rating scales or pain detection thresholds, which cannot distinguish whether viewing the body changes the discriminability of noxious heat intensities or merely shifts reported pain levels. In Experiment 1, participants discriminated two intensities of heat-pain stimulation. Noxious stimuli were delivered to the hand in darkness immediately after participants viewed either their own hand or a non-body object appearing in the same location. The visual condition varied randomly between trials. Discriminability of the noxious heat intensities (d′) was lower after viewing the hand than after viewing the object, indicating that viewing the hand reduced the information about stimulus intensity available within the nociceptive system. In Experiment 2, the hand and the object were presented in separate blocks of trials. Viewing the hand shifted perceived pain levels irrespective of actual stimulus intensity, biasing responses toward ‘high pain’ judgments. In Experiment 3, participants saw the noxious stimulus as it approached and touched their hand or the object. Seeing the pain-inducing event counteracted the reduction in discriminability found when viewing the hand alone. These findings show that viewing the body can affect both perceptual processing of pain and responses to pain, depending on the visual context. Many factors modulate pain; our study highlights the importance of distinguishing modulations of perceptual processing from modulations of response bias.
  • Beck, B., Bertini, C., Haggard, P., & Làdavas, E. (2015). Dissociable routes for personal and interpersonal visual enhancement of touch. Cortex, 73, 289-297. doi:10.1016/j.cortex.2015.09.008
    Seeing a hand can enhance tactile acuity on the hand, even when tactile stimulation is not visible. This visual enhancement of touch (VET) occurs both when participants see their own hand (personal VET), and when they see another person's hand (interpersonal VET). Interpersonal VET occurs irrespective of where the viewed hand appears, while personal VET is eliminated when visual and proprioceptive signals about the location of one's own hand are incongruent. This suggests that the neural mechanisms for VET may differ according to ownership of the seen hand. We used continuous theta-burst transcranial magnetic stimulation (TMS) to disrupt either the human ventral intraparietal area (hVIP), which integrates tactile, proprioceptive, and visual information about one's own body, or the extrastriate body area (EBA), which processes visual body information irrespective of ownership. Participants then judged the orientation of tactile gratings applied to their hand while viewing images of their own hand, another person's hand, or a non-body object on a screen placed over their actual hand. Disrupting the hVIP attenuated personal VET but did not affect interpersonal VET, suggesting the hVIP is only involved in VET when one's own hand is seen. Disrupting the EBA reduced both personal and interpersonal VET, suggesting it is common to both routes.
  • Beck, B., Cardini, F., Làdavas, E., & Bertini, C. (2015). The Enfacement Illusion Is Not Affected by Negative Facial Expressions. PLOS ONE, 10, e0136273. doi:10.1371/journal.pone.0136273
    Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.
  • Beck, B., Bertini, C., Scarpazza, C., & Làdavas, E. (2013). Observed Touch on a Non-Human Face Is Not Remapped onto the Human Observer’s Own Face. PLoS ONE, 8, e73681. doi:10.1371/journal.pone.0073681
    Visual remapping of touch (VRT) is a phenomenon in which seeing a human face being touched enhances detection of tactile stimuli on the observer's own face, especially when the observed face expresses fear. This study tested whether VRT would occur when seeing touch on monkey faces and whether it would be similarly modulated by facial expressions. Human participants detected near-threshold tactile stimulation on their own cheeks while watching fearful, happy, and neutral human or monkey faces being concurrently touched or merely approached by fingers. We predicted minimal VRT for neutral and happy monkey faces but greater VRT for fearful monkey faces. The results with human faces replicated previous findings, demonstrating stronger VRT for fearful expressions than for happy or neutral expressions. However, there was no VRT (i.e. no difference between accuracy in touch and no-touch trials) for any of the monkey faces, regardless of facial expression, suggesting that touch on a non-human face is not remapped onto the somatosensory system of the human observer.
  • Ward, J., Moore, S., Thompson-Lake, D., Salih, S., & Beck, B. (2008). The Aesthetic Appeal of Auditory-Visual Synaesthetic Perceptions in People without Synaesthesia. Perception, 37, 1285-1296. doi:10.1068/p5815
    The term ‘visual music’ refers to works of art in which both hearing and vision are directly or indirectly stimulated. Our ability to create, perceive, and appreciate visual music is hypothesised to rely on the same multisensory processes that support auditory – visual (AV) integration in other contexts. Whilst these mechanisms have been extensively studied, there has been little research on how these processes affect aesthetic judgments (of liking or preference). Studies of synaesthesia in which sound evokes vision and studies of cross-modal biases in non-synaesthetes have revealed non-arbitrary mappings between visual and auditory properties (eg high-pitch sounds being smaller and brighter). In three experiments, we presented members of the general population with animated AV clips derived from synaesthetic experiences and contrasted them with a number of control conditions. The control conditions consisted of the same clips rotated or with the colour changed, random AV pairings, or animated clips generated by non-synaesthetes. Synaesthetic AV animations were generally preferred over the control conditions. The results suggest that non-arbitrary AV mappings, present in the experiences of synaesthetes, can be readily appreciated by others and may underpin our tendency to engage with certain forms of art.