Radar and Vision Sensor Fusion for Object Detection in Autonomous Vehicle Surroundings

Jihun Kim, Dong Seog Han, Benaoumeur Senouci

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

60 Scopus citations

Abstract

Multi-sensor data fusion for advanced driver assistance systems (ADAS) in the automotive industry has received much attention recently due to the emergence of self-driving vehicles and road traffic safety applications. Accurate recognition of the vehicle's surroundings through sensors is critical to achieving efficient ADAS. In this paper, we use radar and vision sensors for accurate object recognition. However, since the data from each sensor are expressed in different coordinate systems, coordinate calibration is essential. We introduce coordinate calibration algorithms between radar and vision images and perform sensor calibration using data obtained from actual sensors.
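The calibration described in the abstract amounts to projecting radar detections into the image plane of the camera. A minimal sketch of that projection is below; the extrinsic rotation `R`, translation `t`, and intrinsic matrix `K` are hypothetical placeholder values, not the paper's actual calibration results.

```python
import numpy as np

# Hypothetical calibration parameters (not from the paper):
# K is the camera intrinsic matrix; R and t map radar-frame points
# into the camera frame (extrinsic calibration).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                      # assume radar and camera axes aligned
t = np.array([0.0, 0.5, 0.0])      # assume radar mounted 0.5 m below camera

def radar_to_pixel(rng, azimuth_deg):
    """Project a radar detection (range in m, azimuth in deg) to pixel (u, v).

    The radar measures in a 2-D polar plane; elevation is assumed zero.
    Camera convention: x right, y down, z forward.
    """
    az = np.radians(azimuth_deg)
    p_radar = np.array([rng * np.sin(az), 0.0, rng * np.cos(az)])
    p_cam = R @ p_radar + t        # extrinsic transform into camera frame
    uvw = K @ p_cam                # perspective projection
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

u, v = radar_to_pixel(20.0, 0.0)   # a target 20 m straight ahead
# → (320.0, 260.0): image centre horizontally, offset vertically by the mount
```

Once radar points land in pixel coordinates, they can be associated with vision-based detections in the same frame, which is the basis of the fusion step.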

Original language: English
Title of host publication: ICUFN 2018 - 10th International Conference on Ubiquitous and Future Networks
Publisher: IEEE Computer Society
Pages: 76-78
Number of pages: 3
ISBN (Print): 9781538646465
DOIs
State: Published - 14 Aug 2018
Event: 10th International Conference on Ubiquitous and Future Networks, ICUFN 2018 - Prague, Czech Republic
Duration: 3 Jul 2018 – 6 Jul 2018

Publication series

Name: International Conference on Ubiquitous and Future Networks, ICUFN
Volume: 2018-July
ISSN (Print): 2165-8528
ISSN (Electronic): 2165-8536

Conference

Conference: 10th International Conference on Ubiquitous and Future Networks, ICUFN 2018
Country/Territory: Czech Republic
City: Prague
Period: 3/07/18 – 6/07/18

Keywords

  • Autonomous vehicle
  • Radar
  • Sensor calibration
  • Sensor fusion
  • Vision
