Abstract
Behavior-directed intentions can be revealed by biological signals that precede the behaviors themselves. This study used eye-movement data to infer human behavioral intentions. Participants viewed pictures while operating under different intentions, which required either cognitive search or affective appraisal. Intentions regarding the pictures were either non-specific or specific; specific intentions were either cognitive or affective; and affective intentions were to evaluate either the positive or the negative emotions expressed by the individuals depicted. The affective task group made more fixations and had a larger average pupil size than the cognitive task group. The positive appraisal group made more, and on average shorter, fixations than the negative appraisal group. However, support vector machine classifiers achieved only low classification accuracy, owing to large inter-individual variance and the psychological factors underlying intentions. Classification accuracy improved when individual repeated-measures data were used, which helped infer participants' self-selected intentions.
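The contrast drawn in the abstract — pooled SVM classification degraded by inter-individual variance versus classification on each participant's repeated measures — can be sketched as follows. This is a minimal illustration on synthetic data, not the study's dataset or pipeline; the feature set (fixation count, mean fixation duration, mean pupil size), the participant offsets, and the effect sizes are all assumptions made for the example.

```python
# Hypothetical sketch: SVM classification of intention condition from
# eye-movement features, comparing one pooled model with per-participant
# models fit on each individual's repeated measures.
# All data below are synthetic; feature names are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_participants, n_trials = 10, 40

X, y, pid = [], [], []
for p in range(n_participants):
    # Large per-participant baseline offsets mimic the inter-individual
    # variance that the abstract reports as hurting pooled classification.
    offset = rng.normal(0.0, 5.0, size=3)
    for t in range(n_trials):
        label = t % 2  # 0 = cognitive intention, 1 = affective intention
        # Small condition effect on [fixations, fixation duration, pupil size]
        effect = np.array([0.5, -0.3, 0.4]) * label
        X.append(offset + effect + rng.normal(0.0, 1.0, size=3))
        y.append(label)
        pid.append(p)
X, y, pid = np.array(X), np.array(y), np.array(pid)

# Pooled model: a single SVM trained across all participants.
pooled_acc = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()

# Per-participant models: one SVM per individual's repeated measures.
indiv_acc = np.mean([
    cross_val_score(SVC(kernel="rbf"), X[pid == p], y[pid == p], cv=5).mean()
    for p in range(n_participants)
])

print(f"pooled accuracy: {pooled_acc:.2f}, per-individual: {indiv_acc:.2f}")
```

Fitting one model per participant sidesteps the baseline differences (e.g. habitual pupil size) that a pooled model must absorb, which is one plausible reading of why repeated-measures data improved accuracy in the study.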
Original language | English |
---|---|
Pages (from-to) | 796-804 |
Number of pages | 9 |
Journal | Computers in Human Behavior |
Volume | 63 |
DOIs | |
State | Published - 1 Oct 2016 |
Keywords
- Eye movement
- Human behavioral intentions
- Intention classification
- Intention inference
- Natural user interface