DSpace Collection: https://hdl.handle.net/1889/665 (harvested 2024-03-29T11:15:32Z)

Title: Corticostriatal connectivity in the macaque brain
Handle: https://hdl.handle.net/1889/5336
Date: 2023
Authors: Rizzo, Marianna
Abstract: In the macaque brain, projections from distant, interconnected cortical areas converge in specific zones of the striatum. For example, specific zones of the motor putamen are targets of projections from frontal motor, inferior parietal, and ventrolateral prefrontal hand-related areas, and are thus an integral part of the so-called “lateral grasping network.”
The present thesis reports two studies on distinct aspects of corticostriatal connectivity in the macaque brain, whose results extend current models of corticostriatal interactions.
In Study 1, we analyzed the laminar distribution of corticostriatal neurons projecting to different parts of the motor putamen and caudate. After injections of retrograde neural tracers into different parts of the striatum, the laminar distribution of the labeled corticostriatal neurons was analyzed quantitatively. In the frontal motor areas, frontal operculum, and prefrontal cortex, where most labeled cells were located, the proportion of labeled corticostriatal neurons in layers III and/or VI was almost everywhere comparable to, or even greater than, that in layer V. Furthermore, within these regions, the laminar distribution pattern of labeled corticostriatal neurons varied largely independently of their density and of the projecting area/sector, but likely according to the target striatal zone. Accordingly, the present data show that cortical areas may project in different ways to different striatal zones, which can thus be targets of specific combinations of signals originating from the various cortical layers of the areas of a given network. This finding suggests more complex modes of information processing in the basal ganglia for different motor and nonmotor functions and opens new questions on the architecture of the corticostriatal circuitry.
In Study 2, again based on neural tracer injections in different parts of the striatum, we analyzed and compared, qualitatively and quantitatively, the distribution of labeled corticostriatal (CSt) cells in the two hemispheres of the macaque brain. The results showed that crossed CSt projections to the caudate and the putamen can be relatively robust (up to 30% of total labeled cells). The origin of the direct and the crossed CSt projections was not symmetrical, as the crossed projections originated almost exclusively from motor, prefrontal, and cingulate areas, and not from parietal and temporal areas. Furthermore, in several cases the contribution of contralateral areas tended to equal that of the ipsilateral ones. This study is the first detailed description of this anatomical pathway in the macaque brain and identifies a substrate for the bilateral distribution of motor, motivational, and cognitive signals for reinforcement learning, for the selection of actions or action sequences, and for learning compensatory motor strategies after cortical stroke.

Title: The processing of emotional body expressions within architectural experience: electroencephalography and eye-tracking studies in virtual reality
Handle: https://hdl.handle.net/1889/5335
Date: 2023
Authors: Presti, Paolo
Abstract: The perception of emotional body expressions is crucial to human social behavior. Historically, researchers have investigated how we perceive emotional body expressions in isolation from their context, characterizing the underlying brain mechanisms without considering the effect of the natural background in which we typically interact. Therefore, the present dissertation aims to study how the processing of emotional body expressions is influenced by the surrounding architectural space in which human beings spend most of their lives.
To this aim, I conducted two initial studies to characterize both avatars’ body postures and virtual architectures in terms of their affective components. The obtained results laid the basis for a third study where avatars and architectural spaces were combined to recreate a controlled environment resembling a social scenario. Specifically, using Virtual Reality (VR) technology, participants dynamically experienced the surrounding architectural space and then faced a virtual avatar with different emotional body postures.
The analysis of electroencephalographic (EEG) signals and eye-gaze behavior revealed that the processing of emotional body expressions was influenced by the architectural experience, which modulated early evoked potentials and oscillatory activity related to attentional mechanisms, as well as the visual exploration of the avatar’s body. Moreover, source localization analysis revealed that the processing of both the architecture and the body expressions activated motor-related brain areas, indicating that the space/cognition interplay is rooted in common neural substrates.
Overall, these studies demonstrate that the architectural experience modulates the brain mechanisms underpinning the processing of others’ affective states, showing that the mere manipulation of the surrounding architecture is sufficient to influence human behavior in social interactions.

Title: The perception of audio spatialization during cinematic immersion: an HD-EEG study on the sense of Presence
Handle: https://hdl.handle.net/1889/5334
Date: 2023
Authors: Langiulli, Nunzio
Abstract: Although many studies have investigated spectators’ cinematic experience, only a few have explored the neurophysiological correlates of the sense of Presence evoked by the spatial characteristics of audio delivery devices. Nevertheless, both the industrial and the consumer markets are now saturated with spatial audio formats that enrich the audio-visual cinematic experience, narrowing the gap between the real and the digitally mediated world. This increase in immersive capability corresponds to the emergence of both the sense of Presence, the psychological sense of being in the virtual environment, and embodied simulation mechanisms. While it is well known that these mechanisms can be activated in the real world, they may also be elicited in virtual environments and could be modulated by the acoustic spatialization cues reproduced by sound systems. Hence, the present study aims to investigate the neural basis of the sense of Presence, together with emotional and physical involvement, evoked by different forms of mediation, by testing different sound delivery presentation modes (Monophonic, Stereophonic, and Surround). To these aims, a behavioral investigation and a high-density electroencephalographic (HD-EEG) study were carried out. A large set of ecological and heterogeneous stimuli extracted from feature films was used. Furthermore, 32 participants were selected following the Generalized listener selection procedure. We found a significant event-related desynchronization (ERD) in the Surround condition compared with the Monophonic condition, in both Alpha and Low Beta centro-parietal clusters.
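Event-related desynchronization of the kind reported here is conventionally quantified as the percent change in band power during the task relative to a pre-stimulus baseline, with negative values indicating desynchronization. The following sketch illustrates only the generic computation; the function names, epoch layout, and frequency bands are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np

def band_power(epochs, sfreq, fmin, fmax):
    """Mean spectral power in [fmin, fmax] Hz per epoch, via the periodogram.

    epochs: array of shape (n_epochs, n_samples); sfreq: sampling rate in Hz.
    """
    spectra = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(epochs.shape[-1], d=1.0 / sfreq)
    band = (freqs >= fmin) & (freqs <= fmax)
    return spectra[:, band].mean(axis=-1)

def erd_percent(baseline_epochs, task_epochs, sfreq, fmin=8.0, fmax=12.0):
    """ERD% = (task - baseline) / baseline * 100; negative = desynchronization."""
    p_base = band_power(baseline_epochs, sfreq, fmin, fmax).mean()
    p_task = band_power(task_epochs, sfreq, fmin, fmax).mean()
    return (p_task - p_base) / p_base * 100.0
```

In practice the same computation would be repeated per electrode cluster and frequency band (e.g. Alpha, Low Beta) before statistical comparison across conditions.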
We discuss these results as an index of embodied simulation mechanisms, which could be considered a possible neurophysiological correlate of the emergence of the sense of Presence.

Title: Ventrolateral Prefrontal Neurons of the Monkey Encode Visual Instructions and Motor Behaviors in the Same Pragmatic Format
Handle: https://hdl.handle.net/1889/5333
Date: 2023
Authors: Gravante, Alfonso
Abstract: The lateral prefrontal cortex (LPF), because of its extended anatomical connections with other cortical and subcortical areas, has access to a wide set of information regarding both the internal state of the subject and the external world, which determines its involvement in a broad spectrum of sensorimotor and cognitive processes. The resulting multidimensional representations enable this cortical sector to produce flexible strategies for navigating the complex, ever-changing social environment, exploiting contextual and motivational information to select and implement appropriate behaviors, as well as to inhibit unnecessary or inappropriate ones (Miller, 2000; Rozzi and Fogassi, 2017; Tanji and Hoshi, 2008).
Monkey electrophysiology and human fMRI studies suggest that information processing for action planning becomes more abstract when moving along a caudal-to-rostral gradient in the frontal cortex (Badre and D’Esposito, 2007, 2009; Koechlin et al., 2003; Koechlin and Summerfield, 2007). Despite theoretical differences among the authors, there is general agreement that the mid-portion of the LPF is involved not only in behavior selection but also in action planning. These functions require a strict relation between the middle sector of the LPF and the parieto-premotor circuits subserving sensorimotor transformations. Accordingly, anatomical studies in the monkey have indicated that this sector, in particular the part corresponding to areas 12r and 46v, is anatomically connected with the parieto-premotor circuits for grasping control (Barbas and Pandya, 1989; Borra et al., 2011; Cavada and Goldman-Rakic, 1989; Cipolloni and Pandya, 1999; Gerbella et al., 2013; Saleem et al., 2014), suggesting that this prefrontal region could be an additional node of the lateral grasping network, involved in the context-based control of motor goals (see Rizzolatti et al., 2014). In line with these anatomical data, previous work from our lab demonstrated that the ventral part of the lateral prefrontal cortex (VLPF) contains movement-related neurons, active during grasping execution both when the behavior is instructed by abstract rules and in naturalistic situations (Simone et al., 2015). Although these processes necessarily require the generation of goals based on the current context, it is largely unknown how VLPF neurons prospectively encode the instructing stimuli in relation to behavioral demands, and what the specific format of the underlying neural representations is.
To tackle this issue, in Study 1 we analyzed the temporal dynamics of the responses of VLPF neurons recorded during a Visuo-Motor task instructed by visual cues, in which the monkey had to observe real objects and, subsequently, to perform (Action condition) or withhold (Inaction condition) object-oriented grasping actions. Our data show that the recorded VLPF sector contains neurons responding in different task phases, and that the neuronal population discharge is stronger in the Inaction condition when the instructing cue is presented, and in the Action condition in the subsequent phases, from object presentation to action execution. Decoding analyses performed on neuronal populations showed that the activity recorded during the initial phases of the task shares the same format as that recorded during the final phases, suggesting the pragmatic nature of this format and that instructions and goals are encoded by prefrontal neurons as predictions of the action outcome.
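Decoding analyses of this kind typically train a classifier on population firing-rate vectors from one task phase and test whether it generalizes to another phase (cross-temporal generalization); above-chance transfer is taken as evidence of a shared format. The sketch below shows the generic logic with a minimal nearest-centroid decoder; the function names, data shapes, and toy data are assumptions for illustration, not the study's actual analysis.

```python
import numpy as np

def nearest_centroid_decode(train_rates, train_labels, test_rates):
    """Assign each test trial to the condition whose mean training
    population vector is closest in Euclidean distance.

    *_rates: arrays of shape (n_trials, n_neurons); labels: integer conditions.
    """
    classes = np.unique(train_labels)
    centroids = np.stack([train_rates[train_labels == c].mean(axis=0)
                          for c in classes])
    # Distance of every test trial to every class centroid.
    dists = np.linalg.norm(test_rates[:, None, :] - centroids[None], axis=-1)
    return classes[np.argmin(dists, axis=1)]

def cross_phase_accuracy(phase_a_rates, phase_b_rates, labels):
    """Train on phase A, test on phase B: cross-temporal generalization."""
    predictions = nearest_centroid_decode(phase_a_rates, labels, phase_b_rates)
    return (predictions == labels).mean()
```

If the representation changed format between phases, a decoder trained on the early phase would drop to chance on the late phase; sustained transfer accuracy is what motivates the shared-format interpretation.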
In Study 2, we aimed to assess whether prefrontal neurons’ visual responses depend exclusively on the visual properties of the observed stimuli or are modulated by their pragmatic features and by environmental contingencies. To this purpose, we recorded the activity of prefrontal neurons in a Visual task requiring the monkeys simply to observe images of objects on a monitor. The recording sessions were carried out on the same days as those of Study 1, allowing us to compare the visual responses of the same neurons in the two tasks.
Our results indicate that a subset of VLPF neurons responds specifically to one stimulus or to a small set of stimuli, but there is no indication of a “passive” categorical coding. The comparison of neural responses recorded in the Visual and the Visuo-Motor tasks indicates that the visual responses to objects are often modulated by task demands, with the strongest discharge when the object is the target of an action.
Altogether, the data from the two studies indicate that VLPF neurons encode sensory stimuli (e.g., instructing cues and real objects) in relation to the current individual intention, and we propose that VLPF sensory-related responses are encoded at the neural level in terms of their behavioral outcome (pragmatic hypothesis).