Generating Real-Time, Selective, and Multimodal Haptic Effects from Sound for Gaming Experience Enhancement

Gyeore Yun, Minjae Mun, Jungeun Lee, Dong Geun Kim, Hong Z. Tan, Seungmoon Choi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

18 Scopus citations

Abstract

We propose an algorithm that generates a vibration, an impact, or a combined vibration+impact haptic effect by processing a sound signal in real time. Our algorithm is selective in that it matches the most appropriate type of haptic effect to the sound using a machine-learning classifier (random forest) built on expert-labeled datasets. Our algorithm is tailored to enhance user experiences during video game play, and we present two examples for the RPG (role-playing game) and FPS (first-person shooter) genres. We demonstrate the effectiveness of our algorithm through a user study comparing it to other state-of-the-art (SOTA) methods for the same cross-modal conversion. Our system elicits better multisensory user experiences than the SOTA algorithms for both game genres.

Original language: English
Title of host publication: CHI 2023 - Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450394215
DOIs
State: Published - 19 Apr 2023
Event: 2023 CHI Conference on Human Factors in Computing Systems, CHI 2023 - Hamburg, Germany
Duration: 23 Apr 2023 - 28 Apr 2023

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Conference

Conference: 2023 CHI Conference on Human Factors in Computing Systems, CHI 2023
Country/Territory: Germany
City: Hamburg
Period: 23/04/23 - 28/04/23

Keywords

  • audio-haptic conversion
  • automatic generation
  • game
  • multimodal haptic effects
  • sound-haptic conversion

