February 15, 2021 16:24
Computational Brain Dynamics Team (PI: Okito Yamashita)

Description

Computational Brain Dynamics Team (https://aip.riken.jp/labs/goalorient_tech/comput_brain_dyn/?lang=en) at RIKEN AIP

Speaker 1: Okito Yamashita
Title: Introduction of the Computational Brain Dynamics Team: AI technology for developing neuroimaging biomarkers of mental illness
Abstract: Mental disorders such as depression, schizophrenia, obsessive-compulsive disorder, and ASD are among the biggest social problems in the world, and COVID-19 is expected to make the situation even more severe. However, the current diagnostic system, DSM-5, which is based on symptoms, is not effective for treatment selection, and biomarkers of mental disorders, that is, objective ways to define mental disorders based on biological data, are therefore anticipated. Among the many types of biological data, such as genetic, blood, and brain data, neuroimaging-based biomarkers are among the most promising candidates because disruption of brain circuits could be a cause of mental disorders. Towards realizing neuroimaging biomarkers of mental illness, we are developing AI technology and conducting big-data analyses of neuroimaging data. In my presentation, I will review the background and recent progress of neuroimaging biomarkers based on machine learning techniques, and discuss open issues and the need to explore new features hidden in brain activity.
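As a rough, hypothetical illustration of the machine-learning pipeline behind such neuroimaging biomarkers (not the team's actual system), the sketch below trains a cross-validated classifier on functional-connectivity features computed from toy ROI time courses; the variable names, toy data, and model choice are all assumptions.

```python
# Minimal sketch of a connectivity-based neuroimaging biomarker (illustration only).
# `timeseries` is a hypothetical list of (T, R) arrays of ROI time courses per subject,
# and `labels` is a hypothetical diagnosis label (0 = control, 1 = patient).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def connectivity_features(ts):
    """Upper-triangular Pearson correlations between ROI time courses."""
    corr = np.corrcoef(ts.T)                      # (R, R) functional connectivity
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]                               # flatten to a feature vector

rng = np.random.default_rng(0)
timeseries = [rng.standard_normal((200, 30)) for _ in range(40)]   # toy data
labels = np.arange(40) % 2                                         # balanced toy labels

X = np.array([connectivity_features(ts) for ts in timeseries])
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
scores = cross_val_score(clf, X, labels, cv=5)    # cross-validated accuracy
print("mean CV accuracy:", scores.mean())
```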

Speaker 2: Shigeyuki Ikeda
Title: Predicting behavioral traits through dynamic modes in resting-state brain activity
Abstract: Dynamic functional connectivity in resting-state brain activity has gained much attention because of its relationship to individual differences in human behavior, e.g., intelligence and personality. The present study used dynamic mode decomposition (DMD) to extract dynamic modes, i.e., spatiotemporally coherent patterns, inherent in resting-state brain activity. To validate the effectiveness of DMD, we investigated whether individual differences in various traits could be predicted using dynamic modes and multivariate pattern analysis. Prediction was successful, and DMD outperformed independent component analysis, a conventional method that focuses on either space or time. In addition, most of the significant prediction results were observed for cognitive traits, indicating that individual differences in cognition are associated with dynamic modes in resting-state brain activity.
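For readers unfamiliar with DMD, the following is a minimal sketch of exact DMD applied to a toy multichannel time series. It illustrates the kind of spatial modes and associated temporal dynamics the abstract refers to; the toy signal, rank choice, and variable names are assumptions, not the study's pipeline.

```python
# Minimal sketch of exact dynamic mode decomposition (DMD) on a toy signal.
import numpy as np

def dmd(X, r=10):
    """Exact DMD: X is (channels, time); returns modes, eigenvalues, amplitudes."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh.conj().T[:, :r]     # rank-r truncation
    A_tilde = U.conj().T @ X2 @ V / s                 # low-rank linear operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = (X2 @ V / s) @ W                          # exact DMD modes
    amps = np.linalg.lstsq(modes, X[:, 0].astype(complex), rcond=None)[0]
    return modes, eigvals, amps

# Toy signal: two oscillatory spatial patterns plus noise.
t = np.linspace(0, 10, 500)
space = np.arange(64)
X = (np.outer(np.sin(space / 5.0), np.cos(2 * np.pi * 1.0 * t))
     + np.outer(np.cos(space / 3.0), np.sin(2 * np.pi * 0.3 * t))
     + 0.05 * np.random.randn(64, 500))

modes, eigvals, amps = dmd(X, r=6)
freqs = np.abs(np.angle(eigvals)) / (2 * np.pi * (t[1] - t[0]))   # mode frequencies (Hz)
print("dominant mode frequencies (Hz):", np.round(np.sort(freqs), 2))
```

Each column of `modes` is a spatial pattern, and the corresponding eigenvalue encodes its oscillation frequency and growth or decay, which is what makes DMD a joint space-time decomposition rather than a purely spatial or purely temporal one.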

Speaker 3: Yusuke Takeda
Title: Estimating repetitive spatiotemporal patterns from many subjects’ resting-state fMRIs
Abstract: Repetitive spatiotemporal patterns in resting-state brain activity have been widely observed at various scales, in various regions, and across species, for example in the rat visual cortex. They probably reflect past experiences embedded in neuronal circuits and play a role in memory consolidation. However, they have been difficult to extract from human brain activity data because their onsets are unknown. Recently, we overcame this difficulty with a method called SpatioTemporal Pattern estimation (STeP) (Takeda et al., 2016). From resting-state data, STeP can estimate several spatiotemporal patterns and their onsets even if the patterns overlap. More recently, we extended STeP to make it applicable to big databases (BigSTeP) (Takeda et al., 2019). From many subjects' resting-state data, BigSTeP estimates spatiotemporal patterns that are common across subjects (common spatiotemporal patterns) as well as the corresponding patterns in each subject (subject-specific spatiotemporal patterns). We applied BigSTeP to the resting-state fMRIs of over 1,000 autism spectrum disorder (ASD) and typically developed (TD) subjects. The results suggest context-dependent differences between ASD and TD; that is, differences in fMRI activity between ASD and TD do not occur uniformly throughout the resting state but tend to occur when the default mode network exhibits large positive activity. These results demonstrate the usefulness of BigSTeP in extracting inspiring hypotheses from big databases in a data-driven way.

References
– Takeda Y., Hiroe N., Yamashita O., Sato M., Estimating repetitive spatiotemporal patterns from resting-state brain activity data. NeuroImage (2016); 133:251-265.
– Takeda Y., Itahashi T., Sato M., Yamashita O., Estimating repetitive spatiotemporal patterns from many subjects’ resting-state fMRIs. NeuroImage (2019); 203:116182.
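As a very rough illustration of the general idea behind STeP/BigSTeP described above, the sketch below alternates between detecting onsets by correlating the data with a current pattern template and re-estimating the template by averaging the detected segments. It is an assumed, simplified stand-in, not the algorithm of Takeda et al. (2016, 2019), which handles overlapping patterns and many subjects far more carefully; all names and the toy data are hypothetical.

```python
# Simplified sketch: estimate one repetitive spatiotemporal pattern and its onsets
# by alternating onset detection and template averaging (illustration only).
import numpy as np

def estimate_pattern(data, length, n_onsets, n_iter=10):
    """data: (channels, time); returns (template, onsets). Requires time - length >= n_onsets."""
    C, T = data.shape
    onsets = np.sort(np.random.choice(T - length, size=n_onsets, replace=False))
    template = np.mean([data[:, o:o + length] for o in onsets], axis=0)
    for _ in range(n_iter):
        # (1) score each candidate onset by correlation with the current template
        scores = np.array([np.sum(template * data[:, t:t + length])
                           for t in range(T - length)])
        # greedily pick non-overlapping high-scoring onsets
        onsets, used = [], np.zeros(T, bool)
        for t in np.argsort(scores)[::-1]:
            if not used[t:t + length].any():
                onsets.append(t)
                used[t:t + length] = True
            if len(onsets) == n_onsets:
                break
        # (2) update the template as the average of the aligned segments
        template = np.mean([data[:, o:o + length] for o in onsets], axis=0)
    return template, np.sort(onsets)

rng = np.random.default_rng(0)
toy = rng.standard_normal((20, 2000))                 # toy (channels, time) data
template, onsets = estimate_pattern(toy, length=30, n_onsets=15)
```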

Speaker 4: Hiroshi Morioka
Title: Nonlinear ICA of fMRI reveals primitive temporal structures linked to rest, task, and behavioral traits
Abstract: Accumulating evidence from whole-brain functional magnetic resonance imaging (fMRI) suggests that the human brain at rest is functionally organized in a spatially and temporally constrained manner. However, because of their complexity, the fundamental mechanisms underlying time-varying functional networks are still not well understood. Here, we develop a novel nonlinear feature extraction framework called local space-contrastive learning (LSCL), which extracts distinctive nonlinear temporal structure hidden in time series by training a deep temporal convolutional neural network in an unsupervised, data-driven manner. We demonstrate that LSCL identifies certain distinctive local temporal structures, referred to as temporal primitives, which repeatedly appear at different time points and spatial locations, reflecting dynamic resting-state networks. We also show that these temporal primitives are present in task-evoked spatiotemporal responses, and that they capture unique aspects of behavioral traits such as fluid intelligence and working memory. These results highlight the importance of capturing transient spatiotemporal dynamics within fMRI data and suggest that such temporal primitives may capture fundamental information underlying both spontaneous and task-induced fMRI dynamics.
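As background for the unsupervised, contrastive training idea mentioned above, the following is a generic sketch in which a small 1D temporal convolutional network learns features by discriminating which temporal segment a short window was drawn from (a time-contrastive-learning-style auxiliary task). It is an assumed illustration of this family of methods, not the LSCL objective itself; the network sizes and toy data are hypothetical.

```python
# Generic sketch: a temporal CNN trained contrastively to discriminate temporal
# segments of an unlabeled multichannel time series (illustration only).
import torch
import torch.nn as nn

n_channels, n_segments, win = 16, 8, 32

encoder = nn.Sequential(                      # temporal CNN feature extractor
    nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),    # -> (batch, 32) features
)
head = nn.Linear(32, n_segments)              # auxiliary segment classifier
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy resting-state-like data: one long multichannel series split into segments.
data = torch.randn(n_channels, n_segments * 500)

for step in range(200):
    seg = torch.randint(0, n_segments, (64,))                  # segment labels
    start = seg * 500 + torch.randint(0, 500 - win, (64,))     # random window per segment
    x = torch.stack([data[:, int(s):int(s) + win] for s in start])   # (64, channels, win)
    logits = head(encoder(x))
    loss = loss_fn(logits, seg)                                # discriminate segments
    opt.zero_grad()
    loss.backward()
    opt.step()

features = encoder(data[:, :win].unsqueeze(0))                 # learned temporal features
```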
