Assessing Pediatric Cognitive Development via Multisensory Brain Imaging Analysis
Date
2024
Citation of Original Publication
Belyaeva, Irina, Yu-Ping Wang, Tony W Wilson, Vince D Calhoun, Julia M Stephen, and Tulay Adali. “Assessing Pediatric Cognitive Development via Multisensory Brain Imaging Analysis,” Proceedings of European Signal Processing Conference (EUSIPCO) (2024). https://eurasip.org/Proceedings/Eusipco/Eusipco2024/pdfs/0001362.pdf.
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
Adolescence, the period between childhood and adulthood, is a critical developmental stage during which the brain processes diverse stimuli to form a complete view of the world. This study highlights the critical role of multisensory integration, in which the brain processes multiple senses together rather than one sensory modality at a time. Brain imaging modalities such as magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) can provide insight into the non-additive effects of multisensory integration by fusing data across different sensory stimuli in both time and space. While MEG and fMRI are powerful tools, traditional approaches to combining data from these modalities often ignore their multisensory aspect, focusing instead on single tasks. To leverage their complementarity, we introduce a multitask learning multimodal data fusion framework that jointly learns multisensory brain developmental patterns from MEG and fMRI data through a novel application of coupled canonical polyadic (CP) tensor decomposition. The multitask learning paradigm performs multimodal fusion across multiple sensory stimuli using multitask coupled tensor-tensor factorization (MCTTF). We show that multitask multimodal fusion of MEG and fMRI data identifies unique brain components exhibiting a stronger group-level multisensory integration effect.
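To illustrate the core idea behind coupled CP tensor decomposition (the building block of the MCTTF framework above), the sketch below fits two three-way tensors, such as a subjects × time × sensors MEG tensor and a subjects × time × voxels fMRI tensor, with alternating least squares while constraining them to share the subject-mode factor. This is a minimal NumPy sketch under simplified assumptions (single task, no regularization, fixed iteration count), not the authors' MCTTF implementation; all function and variable names are illustrative.

```python
import numpy as np

def khatri_rao(P, Q):
    """Column-wise Khatri-Rao product; row (p, q) maps to p * Q.shape[0] + q."""
    return (P[:, None, :] * Q[None, :, :]).reshape(-1, P.shape[1])

def unfold(T, mode):
    """Mode-n unfolding in the Kolda-Bader convention (Fortran column order)."""
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order="F")

def coupled_cp(X, Y, rank, n_iter=200, seed=0):
    """Coupled CP via ALS: X ~ [[A, B, C]] and Y ~ [[A, D, E]] share factor A
    (e.g., the subject mode linking MEG and fMRI tensors)."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], rank))
    B = rng.standard_normal((X.shape[1], rank))
    C = rng.standard_normal((X.shape[2], rank))
    D = rng.standard_normal((Y.shape[1], rank))
    E = rng.standard_normal((Y.shape[2], rank))
    X0, X1, X2 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    Y0, Y1, Y2 = unfold(Y, 0), unfold(Y, 1), unfold(Y, 2)
    for _ in range(n_iter):
        # Shared factor A: one joint least-squares problem over both tensors.
        Z1, Z2 = khatri_rao(C, B), khatri_rao(E, D)
        G = Z1.T @ Z1 + Z2.T @ Z2
        A = np.linalg.solve(G, (X0 @ Z1 + Y0 @ Z2).T).T
        # Modality-specific factors: standard CP-ALS updates per tensor.
        B = np.linalg.solve((C.T @ C) * (A.T @ A), (X1 @ khatri_rao(C, A)).T).T
        C = np.linalg.solve((B.T @ B) * (A.T @ A), (X2 @ khatri_rao(B, A)).T).T
        D = np.linalg.solve((E.T @ E) * (A.T @ A), (Y1 @ khatri_rao(E, A)).T).T
        E = np.linalg.solve((D.T @ D) * (A.T @ A), (Y2 @ khatri_rao(D, A)).T).T
    return A, B, C, D, E
```

Coupling the subject mode is what lets components extracted from the two modalities be compared at the group level: each column of the shared factor A scores every subject on one joint component, while B, C and D, E capture the modality-specific temporal and spatial signatures.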