TY - JOUR
T1 - TSANet: multibranch attention deep neural network for classifying tactile selective attention in brain-computer interfaces
T2 - Biomedical Engineering Letters
AU - Jang, Hyeonjin
AU - Park, Jae Seong
AU - Jun, Sung Chan
AU - Ahn, Sangtae
N1 - Publisher Copyright:
© 2023, Korean Society of Medical and Biological Engineering.
PY - 2024/1
Y1 - 2024/1
AB - Brain-computer interfaces (BCIs) enable communication between the brain and a computer, and electroencephalography (EEG) has been widely used to implement BCIs because of its high temporal resolution and noninvasiveness. Recently, a tactile-based EEG task was introduced to overcome the current limitations of visual-based tasks, such as visual fatigue from sustained attention. However, the classification performance of tactile-based BCIs as control signals remains unsatisfactory, so a novel classification approach is required. Here, we propose TSANet, a deep neural network that uses multibranch convolutional neural networks and a feature-attention mechanism to classify tactile selective attention (TSA) in a tactile-based BCI system. We tested TSANet under three evaluation conditions: within-subject, leave-one-out, and cross-subject. TSANet achieved the highest classification performance compared with conventional deep neural network models under all evaluation conditions. Additionally, we show that TSANet extracts reasonable features for TSA by investigating the weights of its spatial filters. Our results demonstrate that TSANet has the potential to serve as an efficient end-to-end learning approach in tactile-based BCIs.
KW - Brain-computer interface
KW - Deep neural network
KW - Electroencephalography
KW - Feature attention
KW - Tactile selective attention
UR - http://www.scopus.com/inward/record.url?scp=85167346118&partnerID=8YFLogxK
DO - 10.1007/s13534-023-00309-4
M3 - Article
AN - SCOPUS:85167346118
SN - 2093-9868
VL - 14
SP - 45
EP - 55
JO - Biomedical Engineering Letters
JF - Biomedical Engineering Letters
IS - 1
ER -