How much features in brain-computer interface are discriminative? - Quantitative measure by relative entropy

Sangtae Ahn, Sungwook Kang, Sung Chan Jun

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A Brain-Computer Interface (BCI) enables a user to control a computer or a machine by imagining limb movement, which activates the somatosensory motor region in a discriminative manner. To date, however, it has not been well investigated how discriminative the given (extracted) features in BCI are in the sense of information theory. For this purpose, we cast the feature spaces corresponding to the given conditions into probability spaces by estimating their probability distributions. Then the relative entropy (a measure of the difference between two probability distributions) is introduced to quantify the distance between these distributions. Such a distance represents well how separable two feature spaces are. We compare this distance with BCI performance (classification success rate) to examine their correlation.
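The abstract does not spell out how the probability distributions are estimated, so the sketch below shows one plausible reading in Python: bin a one-dimensional feature into a normalized histogram per condition and compute the relative entropy (Kullback-Leibler divergence) between the two, here symmetrized so neither condition acts as the reference. The synthetic data and all variable names are hypothetical, for illustration only.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps  # smooth empty bins to avoid log(0)
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def feature_distribution(x, bins, value_range):
    """Cast a 1-D feature sample into a probability distribution (normalized histogram)."""
    counts, _ = np.histogram(x, bins=bins, range=value_range)
    return counts / counts.sum()

# Hypothetical synthetic features for two motor-imagery conditions.
rng = np.random.default_rng(0)
cond_a = rng.normal(0.0, 1.0, size=500)   # e.g., left-hand imagery band power
cond_b = rng.normal(1.0, 1.2, size=500)   # e.g., right-hand imagery band power

bins, value_range = 30, (-5.0, 7.0)       # shared binning so the supports match
p = feature_distribution(cond_a, bins, value_range)
q = feature_distribution(cond_b, bins, value_range)

# Symmetrized divergence as a distance-like separability score.
distance = 0.5 * (relative_entropy(p, q) + relative_entropy(q, p))
print(f"Relative-entropy distance between conditions: {distance:.3f}")
```

In the spirit of the paper, such distances would then be collected across subjects or sessions and correlated with classification success rate (e.g., with np.corrcoef) to see whether more separable feature distributions predict better BCI performance.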

Original language: English
Title of host publication: HCI International 2011 - Posters' Extended Abstracts - International Conference, HCI International 2011, Proceedings
Pages: 274-278
Number of pages: 5
Edition: PART 2
DOIs
State: Published - 2011
Event: 14th International Conference on Human-Computer Interaction, HCI International 2011 - Orlando, FL, United States
Duration: 9 Jul 2011 - 14 Jul 2011

Publication series

Name: Communications in Computer and Information Science
Number: PART 2
Volume: 174 CCIS
ISSN (Print): 1865-0929

Conference

Conference: 14th International Conference on Human-Computer Interaction, HCI International 2011
Country/Territory: United States
City: Orlando, FL
Period: 9/07/11 - 14/07/11

Keywords

  • Brain Computer Interface
  • Information Theory
  • Relative Entropy
