Using eye movement data to infer human behavioral intentions

Hyeonggyu Park, Sangil Lee, Minho Lee, Mun Seon Chang, Ho Wan Kwak

Research output: Contribution to journal › Article › peer-review

42 Scopus citations

Abstract

Behavior-directed intentions can be revealed by certain biological signals that precede the behavior itself. This study used eye movement data to infer human behavioral intentions. Participants viewed pictures while operating under different intentions that required cognitive search or affective appraisal. Intentions regarding the pictures were either non-specific or specific; specific intentions were cognitive or affective; and affective intentions were to evaluate either the positive or negative emotions expressed by the individuals depicted. The affective task group made more fixations and had a larger average pupil size than the cognitive task group, and the positive appraisal group made more and, on average, shorter fixations than the negative appraisal group. However, support vector machine (SVM) classifiers achieved low classification accuracy, owing to large inter-individual variance and the psychological factors underlying intentions. We demonstrated that classification accuracy improved when each participant's individual repeated-measures data were used, which helped infer participants' self-selected intentions.
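
The abstract describes classifying intention conditions from eye movement features with SVMs and contrasting a model pooled across participants with models trained on an individual's repeated measures. The sketch below illustrates that comparison with scikit-learn on synthetic data; it is not the authors' pipeline, and the feature set (fixation count, mean fixation duration, mean pupil size), the simulated effect sizes, and all parameters are hypothetical assumptions for illustration only.

```python
# Hedged sketch: pooled vs. per-participant SVM intention classification.
# All data are synthetic; features and effect sizes are assumptions, not
# values from the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulate_participant(n_trials=40):
    """Per-trial features: [fixation count, mean fixation duration (ms),
    mean pupil size (a.u.)]. Each participant gets a random baseline,
    mimicking the large inter-individual variance the abstract reports."""
    base = rng.normal([20.0, 250.0, 3.0], [5.0, 40.0, 0.5])
    # Balanced labels: 0 = cognitive intention, 1 = affective intention.
    y = rng.permutation(np.repeat([0, 1], n_trials // 2))
    # Small, consistent within-subject effect of intention on each feature.
    effect = np.outer(y, [3.0, -10.0, 0.2])
    X = base + effect + rng.normal(0.0, [4.0, 30.0, 0.4], (n_trials, 3))
    return X, y

participants = [simulate_participant() for _ in range(12)]

# Pooled model: one SVM trained across all participants' trials.
X_all = np.vstack([X for X, _ in participants])
y_all = np.concatenate([y for _, y in participants])
pooled = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
pooled_acc = cross_val_score(pooled, X_all, y_all, cv=5).mean()

# Per-participant models: exploit each individual's repeated measures.
indiv_accs = [
    cross_val_score(make_pipeline(StandardScaler(), SVC(kernel="rbf")),
                    X, y, cv=5).mean()
    for X, y in participants
]

print(f"pooled SVM accuracy:           {pooled_acc:.2f}")
print(f"mean per-participant accuracy: {np.mean(indiv_accs):.2f}")
```

Because each participant's idiosyncratic baseline dominates the pooled feature space, the per-participant models typically recover the within-subject effect better, which is the direction of improvement the abstract reports.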

Original language: English
Pages (from-to): 796-804
Number of pages: 9
Journal: Computers in Human Behavior
Volume: 63
DOIs:
State: Published - 1 Oct 2016

Keywords

  • Eye movement
  • Human behavioral intentions
  • Intention classification
  • Intention inference
  • Natural user interface
