Cognitive Psychology and Neuropsychology of Nociception and Pain



Fig. 1.1
Components of attention. Physical features of the information from the environment are represented, that is, encoded, into particular patterns of neural activation. The representations with the highest signal strength are selected for further processing and for access to working memory, which holds active the representations of information significant for ongoing cognitive processing. Selection is based on the salience of the sensory stimuli, that is, their ability to stand out relative to neighboring stimuli or relative to recent past events, or on their relevance, that is, their pertinence for current cognitive and behavioral aims or for motivation. At the first level, information flow is filtered by salience detectors. These detectors weight the neural representations of sensory inputs relative to the representations of inputs from neighboring stimuli (Itti and Koch 2001). The most distinctive stimuli thereby receive the strongest representation signals (spatial salience detection). Other detectors increase the strength of neural responses to salient stimuli by identifying stimuli that are novel or that represent a change relative to recent past sensory events (Näätänen 1992) (temporal salience detection). On the basis of these mechanisms, which translate physical salience into weighted neural representations (black arrow “D1”; “D” for distracter), the sensory inputs that receive the strongest neural response are able to capture attention, even if these inputs are not explicitly attended by the individual (bottom-up or stimulus-driven selection). At the second level, processing prioritization is based on ongoing cognitive aims and high-order motivations, and selection is then voluntarily directed towards the sensory inputs that allow these aims to be achieved and motivations to be satisfied (dark gray arrows “T”; “T” for target) (top-down or goal-directed selection). The balance between top-down and bottom-up selection depends on several variables. First, top-down selection is under the control of working memory, which keeps active the aims and the features of the to-be-attended information during performance of the task (Desimone and Duncan 1995). Second, the features of the targets are defined by the attentional set that helps attention to search for and identify the relevant information in the environment. A consequence of attentional setting is that distracter stimuli sharing one or more features with the attended targets (black arrow “D2”) will also enter the focus of attention (dotted gray square) (Folk et al. 1992). Third, attentional resources are more or less loaded during selection (Lavie 2010). Under high-load selection, attention is narrowed onto the processing of relevant information and distracters are rejected. By contrast, under low-load selection, information processing is less selective; distracters are also perceived, and their ability to gain control over cognitive activity then depends on the ability of executive functions to inhibit interference (Adapted from Legrain et al. (2009b))
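To make the weighting scheme described in the caption more concrete, the sketch below (Python) implements a deliberately simplified toy version of the two selection levels: a bottom-up term that weights each stimulus by how much it stands out from its neighbors, and a top-down term that weights it by its similarity to a target template held in working memory. All function names, parameters (e.g., goal_weight, sharpness), and numerical values are hypothetical illustrations and are not taken from the models cited above (e.g., Itti and Koch 2001).

import numpy as np

# Toy sketch of the two-level selection scheme described in the caption.
# Each stimulus is described by a physical intensity and a feature value.

def bottom_up_salience(intensities):
    """Spatial salience: deviation of each stimulus from the local context."""
    intensities = np.asarray(intensities, dtype=float)
    return np.abs(intensities - intensities.mean())

def top_down_relevance(features, target_feature, sharpness=2.0):
    """Relevance: closeness of each stimulus feature to the attended template."""
    features = np.asarray(features, dtype=float)
    return np.exp(-sharpness * (features - target_feature) ** 2)

def select_stimulus(intensities, features, target_feature, goal_weight=0.5):
    """Combine bottom-up and top-down weights; the winner enters working memory."""
    combined = ((1 - goal_weight) * bottom_up_salience(intensities)
                + goal_weight * top_down_relevance(features, target_feature))
    return int(np.argmax(combined)), combined

# Example: three stimuli; stimulus 2 is physically salient (a potential
# distracter "D1"), while stimulus 0 matches the attentional set (target "T").
winner, weights = select_stimulus(
    intensities=[1.0, 1.1, 3.0],   # stimulus 2 stands out physically
    features=[0.0, 0.5, 1.0],      # target template = 0.0
    target_feature=0.0,
    goal_weight=0.7,               # strong top-down (goal-directed) control
)
print(winner, np.round(weights, 2))  # with a low goal_weight, stimulus 2 would win instead

With a high goal_weight the target-matching stimulus wins; lowering it lets the physically salient distracter capture selection, mirroring the balance between goal-directed and stimulus-driven selection sketched in the figure.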





1.4 Spatial Perception


The role of the representation of space in the perception of nociceptive stimuli has recently been highlighted. For instance, a study of patients who showed hemispatial neglect syndrome after a stroke observed that the perception of a nociceptive stimulus depends on the ability to localize stimuli in space and on the integrity of cortical structures such as posterior parietal and prefrontal areas (Liu et al. 2011). Some of these patients were able to correctly report the occurrence of a nociceptive stimulus only when it was applied to the hand contralateral to the lesion site. The perception of the same stimulus was extinguished when it was delivered concomitantly with another nociceptive stimulus to the ipsilesional hand (nociceptive extinction). Other patients were able to correctly identify the occurrence of the nociceptive stimulus applied to the contralesional hand, but they localized it as if it had been applied to the ipsilesional hand (nociceptive allesthesia).

The ability to localize nociceptive stimuli is important because it allows the detection of which part of the body is potentially threatened. It is also of primary importance to identify in external space the position of the object that might be the cause of damage, in order to prompt and guide defensive motor responses towards the location of the threat. These considerations underline the importance of coordinating the representation of the body space and the representation of external space. The brain normally takes into account different frames of reference when coding the spatial position of sensory information (Fig. 1.2; see Vallar and Maravita 2009). One type relates to anatomical frames of reference, which are based on the spatial organization of sensory receptors into receptive fields that project to separate populations of neurons. The primary somatosensory and motor cortices are somatotopically organized and contain a spatially organized representation of the cutaneous surface of the body (Penfield and Boldrey 1937). However, this type of frame of reference alone cannot integrate the perception of which part of the body is stimulated with the perception of the position of external objects in contact with the body. In other words, defensive motor responses cannot be efficiently guided towards the threat if the position of nociceptive stimuli is not remapped according to both the position of the stimulated body part and the position of the threatening object in external space. The peripersonal frame of reference is of particular interest because it allows the integration of the body space and the space immediately surrounding it. Indeed, this frame allows the coding of the position of somatosensory stimuli on the body together with the position of external stimuli (e.g., visual or auditory) occurring close to the body part on which the somatosensory stimuli are applied (see Holmes and Spence 2004; Maravita et al. 2003). The peripersonal frame of reference is particularly relevant for guiding the direct manipulation of objects (Rizzolatti et al. 1997), unlike the extrapersonal frame of reference, which is more useful for exploring space with eye movements and for preparing reaching movements. Moreover, the peripersonal frame is believed to be crucial for the organization of defensive motor actions (Graziano and Cooke 2006).
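The remapping requirement described above can be illustrated with a minimal sketch: a stimulus coded somatotopically ("which hand was stimulated") is assigned an external, spatiotopic location by combining it with proprioceptive information about current limb posture. The coordinate values and function names below are hypothetical and serve only to show why crossing the hands dissociates the two frames of reference; they are not taken from any of the cited studies.

# Hypothetical hand positions in external, body-centred coordinates
# (x in cm; negative = left of the sagittal midline, positive = right of it).
posture_uncrossed = {"left_hand": -20.0, "right_hand": +20.0}
posture_crossed   = {"left_hand": +20.0, "right_hand": -20.0}

def remap_to_external_space(stimulated_hand, posture):
    """Turn a somatotopic code ('which hand') into a spatiotopic one ('which side of space')."""
    x = posture[stimulated_hand]
    side = "left space" if x < 0 else "right space"
    return x, side

# The same somatotopic event ("left hand stimulated") maps onto opposite sides
# of external space depending on posture -- the conflict exploited by
# crossed-hands procedures.
print(remap_to_external_space("left_hand", posture_uncrossed))  # (-20.0, 'left space')
print(remap_to_external_space("left_hand", posture_crossed))    # (+20.0, 'right space')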



Fig. 1.2
Different frames of reference to perceive body and extra-body spaces. Three main reference frames can be dissociated. The personal reference frame corresponds to the space of the body. This frame can itself be dissociated into a somatotopic personal frame, based on the anatomical projection of somatosensory receptive fields onto spatially ordered groups of neurons, and a spatiotopic personal frame, which uses external space as a coordinate system. According to this second reference frame, as illustrated in the figure, we are able to recognize, eyes closed, that the right hand, which crosses the midline of the body, is touched by a right-sided object, despite the fact that the somatosensory inputs are sent to the left hemisphere. Spatiotopic reference frames therefore integrate somatosensory and proprioceptive information. The peripersonal frame of reference corresponds to a coordinate system integrating body space and the external space close to the body. This reference frame allows the integration of somatosensory information with visual and auditory information when visual and auditory stimuli occur close to the body. The peripersonal reference frame can be centered on the body, in which case the sagittal midline of the body is used as the coordinate separating the left and right parts of space. It can also be centered on each limb, the limb itself then being used as the coordinate reference. Therefore, the peripersonal reference frame is considered as an interplay of body-part-centered coordinates that map stimuli from the different senses and that move in space with the body part onto which these maps are anchored. The extrapersonal frame of reference corresponds to a reference frame used to perceive far space, that is, to explore the environment through movements of the eyes and the limbs. Finally, these reference frames are defined according to an egocentric perspective, that is, relative to the observer’s own body. According to an allocentric, object-centered perspective, spatial coordinates are instead defined relative to the object itself (e.g., in the illustration, the white part of the rectangle is on the right side relative to the black part, while both parts are in the left space of the observer) (Adapted from Legrain (2011))

The existence of a peripersonal frame of reference to map the position of nociceptive stimuli presupposes, first, the existence of multimodal interactions between nociceptive inputs and sensory inputs from other modalities. For instance, it has been suggested that vision of the limb onto which nociceptive stimuli are applied can modify the cortical processing of these nociceptive stimuli and the elicited pain (Longo et al. 2009; Mancini et al. 2011; Romano and Maravita 2014). In addition, Sambo et al. (2013) showed that judgments about the occurrence of nociceptive stimuli can depend on the relative position of the limbs. They used a temporal order judgment task during which healthy blindfolded volunteers had to judge which of two nociceptive stimuli applied to either hand was perceived as being delivered first. The task was performed either with the hands in an uncrossed posture or with the hands crossed over the sagittal midline of the body. This crossing-hand procedure is often used to induce a competition between somatotopic and spatiotopic frames of reference (when crossed, the left hand is right sided and the right hand left sided) (e.g., Shore et al. 2002; Smania and Aglioti 1995; Spence et al. 2004). The authors showed that judgments were much more difficult when the hands were crossed, suggesting that the perception of nociceptive stimuli was affected by a space-based frame of reference. It was also shown that crossing the hands alters the processing of the intensity of the stimuli and modifies brain responses to those stimuli (Gallace et al. 2011; Torta et al. 2013). These data support the idea that nociceptive inputs are integrated in multimodal representations of the body (Legrain et al. 2011; Haggard et al. 2013) in a brain network extending far beyond the classic nociceptive cortical network (Moseley et al. 2012b). More striking evidence was recently reported by De Paepe et al. (2014b), who provided data supporting the existence of a peripersonal frame of reference to map nociceptive stimuli. They used a temporal order judgment task with nociceptive stimuli applied to either hand and showed that the judgments were systematically biased by the occurrence of a visual stimulus in one side of space. Indeed, this visual cue facilitated the perception of the nociceptive stimulus applied to the ipsilateral hand, at the expense of the stimulus applied to the opposite hand. Most importantly, this bias was significantly greater when the visual cue was presented close to the hand than when it was presented 70 cm in front of the hand. Using the crossing-hand procedure, additional experiments showed that this visuo-nociceptive spatial congruency effect was also influenced by the position of the limbs (De Paepe et al. 2014a). For instance, the perception of a nociceptive stimulus applied to the left hand was facilitated by a proximal left-sided visual stimulus when the hands were uncrossed, but by a proximal right-sided visual stimulus when they were crossed. One important question that remains to be addressed concerns the neuronal mechanisms supporting such multimodal integration of nociceptive inputs. Animal studies have largely supported the notion that the peripersonal processing of tactile stimuli relies on multimodal neurons in the monkey’s premotor and parietal cortices that fire in response to tactile stimuli and to visual stimuli presented close to the corresponding somatosensory receptive fields (Graziano et al. 2004; see Macaluso and Maravita 2010 for a discussion of similar mechanisms in humans). Regarding nociception, only one study has found similar multimodal neurons, in the monkey’s inferior parietal lobe (Dong et al. 1994).
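As a rough illustration of how such temporal order judgments are typically quantified (a common approach in this literature, though not necessarily the exact analysis used in the studies cited above), a cumulative Gaussian can be fitted to the proportion of "left first" responses as a function of the stimulus onset asynchrony (SOA); the 50% point, the point of subjective simultaneity (PSS), then indexes any spatial bias, for instance the prioritization induced by a visual cue near one hand. The data values in the sketch below are invented for illustration.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# SOA convention: soa = onset of right-hand stimulus minus onset of left-hand
# stimulus (ms), so positive values mean the left stimulus physically came first.
def cum_gauss(soa, pss, jnd):
    """Probability of responding 'left first' as a function of SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=jnd)

soas = np.array([-90, -60, -30, 0, 30, 60, 90], dtype=float)
p_left_first = np.array([0.05, 0.15, 0.35, 0.60, 0.80, 0.93, 0.98])  # hypothetical data

(pss, jnd), _ = curve_fit(cum_gauss, soas, p_left_first,
                          p0=[0.0, 30.0], bounds=([-200, 1], [200, 200]))
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
# Here the negative PSS means the right-hand stimulus must lead by about 11 ms
# for the two to feel simultaneous, i.e. perception is biased towards the left hand.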

The importance of the interaction between nociception, pain, and the representation of body space is also illustrated by the neuropsychological investigation of patients with chronic pain, and more specifically of patients with complex regional pain syndrome (CRPS) (Moseley et al. 2012b; Legrain et al. 2012a). In addition to their sensory, motor, and vegetative symptoms, CRPS patients also suffer from unilateral cognitive deficits leading to impaired perception and impaired utilization of the affected limb. For this reason, CRPS patients were suspected to present with a “neglect-like” symptomatology (e.g., Förderreuther et al. 2004; Galer and Jensen 1999; Moseley 2004). Although the comparison with the symptomatology observed in poststroke patients with hemispatial neglect syndrome is still a matter of debate (see Legrain et al. 2012a; Punt et al. 2013), cortical changes observed in CRPS affect not only areas involved in sensory and motor functions (Krause et al. 2006; Maihöfner et al. 2004) but also areas involved in more complex and multisensory processing (Maihöfner et al. 2007). Several neglect-like symptoms have been described, such as asomatognosia (loss of ownership of body limbs) (Galer and Jensen 1999), hypo- and bradykinesia (movements are difficult to initiate and slower) (Frettlöh et al. 2006; Galer and Jensen 1999), an impaired mental image (Moseley 2005), and an impaired schema (Schwoebel et al. 2001; Moseley 2004) of the CRPS limb (see Legrain et al. 2012a for a review). Classic neuropsychological testing of neglect did not reveal major deficits in extra-body space (Förderreuther et al. 2004; Kolbe et al. 2012). Conversely, evaluations of body space revealed phenomena of referred sensations such as allesthesia or synchiria in response to tactile stimuli applied to the CRPS limb (Acerra and Moseley 2005; Maihöfner et al. 2006; McCabe et al. 2003). Moseley et al. (2009) showed that temporal order judgments of tactile stimuli applied to either hand in a normal posture were biased at the expense of the stimulus applied to the CRPS hand, suggesting a deficit similar to tactile extinction. Surprisingly, however, the direction of the perceptual bias was influenced by the position of the hands: when the hands were crossed, the bias reversed, now favoring the stimulus applied to the CRPS hand at the expense of the stimulus applied to the healthy hand. It was hypothesized that CRPS patients do not specifically neglect the perception of the CRPS limb but rather the part of the body placed in the side of space where the CRPS limb normally resides. The authors also showed significant changes in limb temperature when the limbs were crossed over the body midline (Moseley et al. 2012a). Finally, using an experimental procedure aimed at misaligning vision and proprioception by means of prismatic goggles, they suggested that the influence of spatial representation on body perception and temperature was mostly driven by visual features rather than by the proprioceptive perception of the position of the CRPS limb (Moseley et al. 2013). For these authors, the neglect-like symptoms observed in CRPS might reveal an altered representation of the body space organized along the sagittal midline of the body (Moseley et al. 2012b). The studies reviewed above also show that CRPS-related symptoms can alter not only somatotopic representations but also spatiotopic representations of the body space (Moseley et al. 2009).
These misaligned spatial representations may have been caused by maladaptive changes in cortical plasticity following the initial musculoskeletal trauma (Moseley et al. 2012b) or by implicit behavioral strategies aimed at avoiding provocation of the limb (Marinus et al. 2011). Altered body representations might in turn impair sensory perception and autonomic regulation of the pathological hemibody.

However, these assumptions were challenged by studies showing that neglect-like symptoms cannot be locked to the side of space corresponding to the CRPS limb. Sumitani et al. (2007b) evaluated body representation in CRPS patients by means of visual estimates of the body midline. A visual stimulus was flashed and moved horizontally on a screen about 2 m in front of the participants. Patients were asked to verbally guide the visual stimulus until they judged it to be positioned on the sagittal plane of their body midline. When the task was performed in the dark, their estimates were shifted significantly towards the side of space ipsilateral to their CRPS hand, as if, in this case, they neglected the side of space corresponding to their healthy limb (for opposing results, see Kolbe et al. 2012; Reinersmann et al. 2012). After nerve block by lidocaine injection, these estimates of the body midline tended to shift to the other hemispace, that is, the side of space contralateral to the CRPS hand (Sumitani et al. 2007b). These data suggest that the unbalanced body representation, as evaluated by the visual body midline judgment task, is caused by attentional shifts due to excessive information coming from the affected limb, a hypothesis sharply in contrast with the assumption of a disownership of the CRPS limb (Moseley et al. 2012b). These discrepancies across studies emphasize that CRPS symptoms cannot be strictly paralleled with those observed in hemispatial neglect consequent to a stroke. Punt et al. (2013) argued that CRPS-related motor symptoms such as hypo- and bradykinesia can be interpreted as a consequence of learned nonuse resulting from conditioned reduction of attempts to move the pathological limb. Punt et al. (2013) added that the representational and perceptual deficits were too subtle to be clinically relevant. Legrain et al. (2012a) suggested instead that the neuropsychological testing performed until now has not been adequate to reveal perceptual deficits specific to the CRPS pathophysiology. These authors also recommended a systematic investigation of spatial perception abilities across the different sensory modalities and, consequently, across the different frames of reference, using similar experimentally controlled procedures (see also Rossetti et al. 2013). In any case, the data reviewed in this section suggest that chronic pain states such as CRPS can be useful to investigate the impact of pain on the ability to represent and perceive the body and the surrounding space (Legrain et al. 2011; Moseley et al. 2012b) and the integration of nociceptive inputs into such cognitive representations (Haggard et al. 2013).
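To give a concrete sense of how a visual body-midline judgment task of this kind can be quantified, the sketch below computes the mean signed deviation of subjective midline settings from the objective sagittal midline and tests whether it differs from zero. All values are hypothetical, and the analysis is a generic illustration rather than the procedure actually used by Sumitani et al.

import numpy as np
from scipy import stats

# Hypothetical settings: horizontal position (cm) at which the participant judged
# the moving light to lie on the body midline; 0 = objective sagittal midline,
# positive values = shifted towards the side of the CRPS limb.
settings_cm = np.array([1.8, 2.4, 0.9, 3.1, 2.0, 1.5, 2.7, 1.2])

mean_shift = settings_cm.mean()
t, p = stats.ttest_1samp(settings_cm, popmean=0.0)  # is the shift different from zero?
print(f"mean shift = {mean_shift:.2f} cm towards the CRPS side (t = {t:.2f}, p = {p:.3f})")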


1.5 Action Selection


Peripersonal space is the privileged space for grasping and manipulating objects, but also for preparing defensive actions towards proximal objects that appear threatening. However, motor control and action selection have rarely been investigated in pain research. Yet it is known that motor and premotor areas are activated by nociceptive stimuli (Gelnar et al. 1999; Frot et al. 2012). Using transcranial magnetic stimulation, Algoet et al. (2013) showed that nociceptive stimuli can modify the motor excitability of the muscles of the arm and hand onto which the stimulus is applied. It was also shown that the decision to move or not to move the hand onto which the noxious stimulus was applied altered the electrophysiological responses to this stimulus (Filevich and Haggard 2012). But the neurophysiological mechanisms underlying the selection and preparation of an action in response to nociceptive stimuli are still unknown. Recent studies suggest that reflex motor responses, such as the eye blink reflex triggered by electrocutaneous stimulation of the hand, can be controlled by high-order cognitive functions (Sambo et al. 2012a, b; Sambo and Iannetti 2013). These authors showed an increase in the magnitude of the eye blink reflex when the hand onto which the stimuli were applied approached the face. They concluded that this increase in the motor response could index the boundary of a defensive peripersonal representation of the face. However, because these studies did not use an external visual stimulus approaching the face as a control, the authors could neither confirm a primary role of vision nor exclude a causal role of personality traits such as anxiety. In this sense, any conclusion about a link between antinociceptive motor responses and spatial cognition is premature.
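One common way to formalize the idea of a defensive peripersonal boundary, offered here only as an illustrative sketch with invented data rather than as the authors' actual analysis, is to model reflex magnitude as a sigmoid function of hand-face distance and to take the inflection point of the fitted curve as a rough estimate of where the defensive zone ends.

import numpy as np
from scipy.optimize import curve_fit

def sigmoid(distance_cm, baseline, gain, boundary_cm, slope):
    """Reflex magnitude: high near the face, dropping to baseline farther away."""
    return baseline + gain / (1.0 + np.exp((distance_cm - boundary_cm) / slope))

# Hypothetical measurements: hand-face distance (cm) and blink reflex magnitude
# (arbitrary units); the values are invented for illustration.
distance_cm = np.array([4, 10, 20, 30, 40, 60], dtype=float)
hbr_magnitude = np.array([9.5, 8.8, 6.0, 3.2, 2.4, 2.1])

params, _ = curve_fit(sigmoid, distance_cm, hbr_magnitude, p0=[2.0, 8.0, 25.0, 5.0])
baseline, gain, boundary_cm, slope = params
print(f"estimated peripersonal boundary ~ {boundary_cm:.0f} cm from the face")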
