When a Rubber Hand Is Perceived As Our Own: Embodiment and Multisensory Integration

Miss Cecily Brasch (). When a Rubber Hand Is Perceived As Our Own: Embodiment and Multisensory Integration. Professional Doctorate, School of Psychology, The University of Queensland.

       
Author Miss Cecily Brasch
Thesis Title When a Rubber Hand Is Perceived As Our Own: Embodiment and Multisensory Integration
School, Centre or Institute School of Psychology
Institution The University of Queensland
Thesis type Professional Doctorate
Supervisor Dr. Ada Kritikos
Total pages 160
Language eng
Subjects 170101 Biological Psychology (Neuropsychology, Psychopharmacology, Physiological Psychology)
Abstract/Summary The representation of one’s own body is constructed through the integration of multisensory inputs, particularly visual, somatosensory and proprioceptive information (Rizzolatti, Fadiga, Fogassi & Gallese, 1997). An intact body representation is important for successful interactions with objects and people in our external environment. But is this representation dynamic? That is, can changing multisensory inputs alter our body representation and how we experience our body? The three experiments presented here investigated how information regarding the body in space is generated via the integration of information from different sensory systems, primarily the visual, somatosensory and proprioceptive systems. The current thesis examined how one’s bodily experience can be manipulated through the use of tools, and how a sense of body-ownership can be induced over non-corporeal objects. Two central questions were addressed: first, to what extent can external effectors (i.e., tools, artificial hands and embodied artificial hands) be integrated into and attributed to the body? Second, under what conditions does this integration occur?

This thesis utilised saccade co-ordinates previously collected by Ms. A.S. and Mr. W.H. as part of an honours thesis and a research internship at The University of Queensland, respectively. Examination of these co-ordinates identified inconsistencies in the information initially extracted from EyeLink (a video-based infrared eye-tracking system), as well as in the data generated from the original macro. The macro scripted to analyse these archival saccade co-ordinates contained a number of coding errors that incorrectly mapped saccades to the left and right sides, and to the congruent and incongruent conditions. This meant the archival saccade co-ordinates needed to be recalculated using new macros and updated analysis programs.
I undertook this process and generated novel data and parameters prior to the statistical analyses reported in this thesis. I wrote three new macros to filter these saccade co-ordinates from EyeLink and to generate new saccade reaction time (SRT), Error and End Point Accuracy (EPA: the proximity of final gaze position to target position) parameters, which I then analysed in SPSS. Thus, the new data generated from the archival saccade co-ordinates collected by Ms. A.S. and Mr. W.H. in Experiments 1 and 2 had not previously been analysed, nor presented in results sections elsewhere.

The crossmodal congruency paradigm used in Experiments 1 and 2 was employed as the basis for the new work conducted in Experiment 3. In the task, participants were instructed to detect visual targets (light-emitting diodes, LEDs) in the presence of somatosensory distractors (vibrotactile stimulation delivered to the tip of the index finger) presented at one of two spatial locations. The locations of the target and distractor could be either congruent (coincident locations) or incongruent (opposite locations). To investigate multisensory interactions with respect to peripersonal space, the distance between the visual and somatosensory stimuli was modulated by positioning participants’ hands either to the front or to the side.

Experiment 1, conducted by Ms. A.S., investigated whether one’s perception of peripersonal space and multisensory integration could be altered through tool use. If tools became integrated into one’s body, thereby extending multisensory coding of near peri-hand space into far space, somatosensory stimuli should be processed in a peripersonal space relocated to the tool tips placed next to the visual target. The results from the newly generated parameters presented in Chapter 4 indicated that saccades were significantly faster and more accurate to visual than to somatosensory targets, and in congruent versus incongruent conditions.
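The SRT and EPA parameters described above can be illustrated with a short sketch. This is a minimal illustration with hypothetical timestamps and coordinates; it is not the EyeLink macro code actually used in the thesis:

```python
import math

def saccade_parameters(target_onset_ms, saccade_onset_ms,
                       final_gaze_xy, target_xy):
    """Compute saccade reaction time (SRT) and End Point Accuracy (EPA).

    SRT: latency from target onset to saccade onset, in milliseconds.
    EPA: Euclidean distance between the final gaze position and the
         target position (in the same units as the coordinates,
         e.g. screen pixels); smaller values mean more accurate saccades.
    """
    srt = saccade_onset_ms - target_onset_ms
    epa = math.dist(final_gaze_xy, target_xy)
    return srt, epa

# Hypothetical trial: target appears at t = 1000 ms, the saccade is
# launched at t = 1180 ms, and gaze lands at (512, 390) while the
# target sits at (512, 384).
srt, epa = saccade_parameters(1000, 1180, (512, 390), (512, 384))
print(srt, round(epa, 1))  # prints: 180 6.0
```

A hypometric saccade, as reported in the results, would show up here as an end point that undershoots the target, i.e. a larger EPA on the near side of the target.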
SRTs were faster when there was a small spatial distance between the visual target and the somatosensory distractor. When participants’ hands were positioned at a location spatially remote from the target, saccades were more hypometric than when positioned next to the target. When responding to visual targets only, SRTs were significantly faster with tools than without tools. Interestingly, this difference was absent when participants responded to a somatosensory target. The current thesis took a conservative approach when interpreting these results. I proposed that, in order to demonstrate that tool use manipulated peripersonal space, one would need to find a difference between the Hands Front and/or Tools Side conditions and the Hands Side condition (with no tools). However, no such difference was identified. Using this as a guideline, I suggest that tools were not integrated into one’s body schema and were therefore unable to extend the area of multisensory integration by remapping peripersonal space onto extrapersonal space.

Experiment 2, conducted by Mr. W.H., examined whether vision of artificial hands would facilitate embodiment. If the hands were embodied, participants would disregard proprioceptive information and attribute somatosensory inputs delivered to their real hand to the peripersonal space of the seen artificial hands. The results from the newly generated parameters presented in Chapter 7 again indicated that saccades were significantly faster and more accurate to visual than to somatosensory targets, and in congruent versus incongruent conditions. Again, SRTs were faster when there was a small spatial distance between the visual target and the somatosensory distractor. Saccades were more hypometric when a visual target was preceded by a distractor than when both sensory inputs were presented simultaneously.
Contrary to hypotheses, positioning artificial hands at locations spatially discrepant from one’s real hands did not modulate how participants weighted visual versus proprioceptive information. Vision of the artificial hands alone was not sufficient for embodiment, nor did it elongate multimodal peri-hand space into extrapersonal space.

Experiment 3 explored the phenomenon of body-ownership by examining whether multisensory visuo-tactile information could be discriminated when it came from an embodied rubber hand, following induction of the rubber hand illusion (RHI). Consistent with the results presented in Chapters 4 and 7, saccades were significantly faster and more accurate to visual than to somatosensory targets, and in congruent versus incongruent conditions. Again, SRTs were faster when there was a small spatial distance between the visual target and the somatosensory distractor. Saccades were more hypometric when responding to visual than to somatosensory targets. Crucially, the results demonstrated that, in unimodal trials, saccades were faster in the IRH condition than in the HS (Hands Side) condition when responding to both visual and somatosensory targets. Furthermore, when responding to a somatosensory target there was no significant difference between the IRH and HF (Hands Front) conditions. In bimodal trials, SRTs were again significantly faster in the IRH condition than in the HF and HS conditions. These results suggest that experiencing a sense of ownership over rubber hands extended multisensory coding of near space into far space, which in turn significantly influenced the strength of visuo-tactile integration.

To answer the two central research questions, the results presented in this thesis demonstrate a gradation of incorporation: from embodied artificial hands for which one feels a sense of ownership, to those which are not embodied, and finally to those which are non-corporeal.
Furthermore, effectors were more likely to be integrated into one’s body when visuo-tactile stimuli were presented from the peripersonal space of that effector.
Keyword When a Rubber Hand is Perceived as Our Own

 
Created: Wed, 16 May 2012, 16:31:06 EST by Miss Cecily Brasch on behalf of Faculty of Social & Behavioural Sciences