Title: Mobile Brain/Body Imaging (MoBI) of Active Cognition
Authors: Klaus Gramann1*, Nima Bigdely-Shamlo1, Andrey Vankov1 and Scott Makeig1
1 University of California, Swartz Center for Computational Neuroscience, United States

Abstract: Human cognition is embodied in the sense that cognitive processes are based on and make use of our physical structure while being situated in a specific environment. Brain areas that originally evolved to organize the motor behavior of animals in three-dimensional environments also support human cognition (Rizzolatti et al., 2002), suggesting that joint imaging of human brain activity and motor behavior could be an invaluable resource for understanding the distributed brain dynamics of human cognition. Despite this insight, however, studies investigating the brain dynamics underlying motivated behaviors are lacking. This is due to technical constraints of brain imaging methods (e.g., fMRI, MEG) that require subjects to remain motionless because of their high sensitivity to movement artifacts. The result is a fundamental mismatch between the bandwidth of recorded brain dynamics (now up to 10^6 bits/second) and of recorded behavior (button presses at ~1/second). Only electroencephalography (EEG) involves sensors light enough to allow near-complete freedom of movement of the head and body. Furthermore, EEG provides sufficient time resolution to record brain activity on the time scale of natural motor behavior, making joint EEG and behavioral recording the clear choice for mobile brain imaging of humans. To better understand the embodied aspects of human cognition, we have developed a mobile brain/body imaging (MoBI) modality that allows synchronous recording of EEG and body movements as subjects actively perform natural movements (Makeig et al., 2009). MoBI recording allows analysis of brain activity during the preparation, execution, and evaluation of motivated actions in natural environments.
In a first experiment, we recorded high-density EEG with a portable active-electrode amplifier system mounted in a specially constructed backpack, while whole-body movements were assessed with an active motion capture system. The concurrently recorded time series were synchronized online across a distributed PC LAN. Standing subjects were asked to orient to (point, look, or walk towards) 3-D objects placed in a semi-circular array (Figure 1). Online routines tracked subject pointing and head directions to cue advances in the stimulus sequence. Independent component analysis (ICA; Makeig et al., 2004) applied to the EEG data identified independent components (ICs) accounting for eye-movement, muscle, and brain activities. Equivalent dipoles for IC processes were located throughout the cortex, the eyes, and identifiable neck and scalp muscles. Neck muscle activity exhibited task-dependent modulations across a broad frequency range, while spectral activities of brain ICs exhibited modulations time-locked to eye movements and to segments of body and head movements, including precisely timed high-gamma-band modulations in medial frontal cortex. Simultaneous recording of whole-body movements and brain dynamics during free and naturally motivated 3-D orienting actions, combined with data-driven analysis of brain dynamics, allows, for the first time, studies of distributed EEG dynamics, body movements, and eye, head, and neck muscle activities during active cognition in situ. The new mobile brain/body imaging approach thus allows analysis of the joint brain and body dynamics supporting and expressing natural cognition, including self-guided search for and processing of relevant information and motivated behavior in realistic environments.
Conference: Bernstein Conference on Computational Neuroscience, Frankfurt am Main, Germany, 30 Sep - 2 Oct, 2009.
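The ICA decomposition step described above can be sketched in a few lines. The following is a minimal illustration only, not the authors' analysis pipeline: it mixes three synthetic sources (an alpha-like rhythm, a slow eye-movement-like drift, and broadband muscle-like noise) into five simulated channels and unmixes them with scikit-learn's FastICA. All signal parameters and the mixing matrix are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical toy data: three latent sources sampled over 8 seconds.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[
    np.sin(2 * np.pi * 10 * t),            # 10 Hz alpha-like brain rhythm
    np.sign(np.sin(2 * np.pi * 0.5 * t)),  # slow square drift (eye-like)
    0.5 * rng.standard_normal(t.size),     # broadband noise (muscle-like)
]

# Invented linear mixing into five "scalp channels" (samples x channels).
mixing = rng.standard_normal((5, 3))
channels = sources @ mixing.T

# Recover statistically independent components from the channel mixtures.
ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(channels)

print(channels.shape, components.shape)
```

In a real MoBI analysis the rows would be tens of thousands of EEG samples across dozens of channels, and recovered ICs would then be classified as brain, eye, or muscle processes and localized with equivalent-dipole fitting, as described above.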
Presentation Type: Poster Presentation
Topic: Sensory processing
Citation: Gramann K, Bigdely-Shamlo N, Vankov A and Makeig S (2009). Mobile Brain/Body Imaging (MoBI) of Active Cognition. Front. Comput. Neurosci. Conference Abstract: Bernstein Conference on Computational Neuroscience. doi: 10.3389/conf.neuro.10.2009.14.140
Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 28 Aug 2009; Published Online: 28 Aug 2009.
* Correspondence: Klaus Gramann, University of California, Swartz Center for Computational Neuroscience, San Diego, CA, United States, [email protected]