Affiliations: [a] Centre for Intelligent Signal & Imaging Research (CISIR), Universiti Teknologi PETRONAS, 32610 Bandar Seri Iskandar, Perak, Malaysia | [b] Department of Electrical & Electronic Engineering, Universiti Teknologi PETRONAS, 32610 Bandar Seri Iskandar, Perak, Malaysia | [c] Department of Fundamental & Applied Sciences, Universiti Teknologi PETRONAS, 32610 Bandar Seri Iskandar, Perak, Malaysia | [d] Center for Neuroscience Services and Research, Universiti Sains Malaysia, 16150, Kubang Kerian, Kota Bharu, Kelantan, Malaysia | [e] Department of Neurosciences, School of Medical Sciences, Universiti Sains Malaysia, 16150, Kubang Kerian, Kota Bharu, Kelantan, Malaysia | [f] Jabatan Neurosains, Hospital Universiti Sains Malaysia, 16150, Kubang Kerian, Kota Bharu, Kelantan, Malaysia
Abstract: Decoding human brain activity has long been a primary goal in neuroscience, especially with functional magnetic resonance imaging (fMRI) data. In recent years, convolutional neural networks (CNNs) have become a popular method for feature extraction due to their high accuracy; however, they require substantial computation and training data. In this study, an algorithm is developed using multivariate pattern analysis (MVPA) and a modified CNN to decode brain responses to different images with a limited dataset. Selection of significant features is an important part of fMRI data analysis, since it reduces the computational burden and improves prediction performance; here, significant features are selected using a t-test. MVPA uses machine learning algorithms to classify different brain states and helps with prediction during the task. A general linear model (GLM) is used to estimate the unknown parameters of each individual voxel, and classification is performed using a multi-class support vector machine (SVM). The proposed MVPA-CNN algorithm is compared with a region-of-interest (ROI) based method and with MVPA-based estimated values. The proposed method achieved better overall accuracy (68.6%) than the ROI method (61.88%) and the estimated values (64.17%).
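The abstract's MVPA pipeline (t-test feature selection over voxel-wise GLM estimates, followed by multi-class SVM classification) can be sketched as below. This is a minimal illustration with synthetic data standing in for GLM beta maps, not the authors' implementation; all array names and thresholds are assumptions.

```python
# Hypothetical sketch of the MVPA steps described in the abstract:
# t-test based voxel (feature) selection followed by multi-class SVM
# classification. The synthetic matrix X stands in for per-voxel GLM
# beta estimates; all names and parameters here are illustrative.
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels, n_classes = 120, 500, 4
X = rng.standard_normal((n_trials, n_voxels))        # stand-in for beta maps
y = np.repeat(np.arange(n_classes), n_trials // n_classes)
X[:, :50] += 0.8 * y[:, None]                        # weak class-dependent signal

# Feature selection: keep voxels whose responses differ significantly
# between two example conditions (two-sample t-test, p < 0.05).
_, p = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
selected = p < 0.05
X_sel = X[:, selected]

# Multi-class SVM (scikit-learn uses one-vs-one for multi-class SVC).
clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X_sel, y, cv=5)
print(f"{selected.sum()} voxels selected, mean CV accuracy = {scores.mean():.2f}")
```

In practice the t-test would be applied to the estimated GLM parameters from the fMRI time series rather than to raw synthetic features, and the surviving voxels would feed the downstream CNN stage described in the paper.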