3D registration using inertial navigation system and Kinect for image-guided surgery

Joonyoung Bang, Shirazi Muhammad Ayaz, Khan Danish, Soo In Park, Hyunki Lee, Min Young Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

In 3D image registration from 2D images captured by a hand-held scanner, the most important tasks are pose estimation and measurement of the object surface, and the relationship between successive images is critical to accurate registration. This paper proposes a 3D registration method using a Kinect and an inertial navigation system (INS). The Kinect and INS are mounted on a rigid body to improve the accuracy of pose estimation; fusing the INS with the vision sensor compensates for the weaknesses of each individual sensor and yields more precise pose estimates. The proposed system can measure the three-dimensional shape of a variety of objects and can also acquire accurate images for medical imaging applications that demand high precision, such as image-guided surgery.
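
The abstract describes fusing INS pose information with Kinect depth data to obtain a more reliable pose estimate for registration. The sketch below is a hedged illustration of that idea, not the method from the paper: it assumes the relative INS orientation between two captures seeds the alignment of two Kinect point clouds, which a standard point-to-point ICP then refines. All function names and the zero translation guess are assumptions introduced here.

```python
# A minimal sketch (not the authors' implementation): the relative
# orientation reported by the INS seeds the registration of two Kinect
# point clouds, and a basic point-to-point ICP refines it.
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation


def ins_initial_transform(quat_prev, quat_curr, translation_guess=np.zeros(3)):
    """4x4 rigid transform from the relative INS orientation between two
    capture poses (quaternions in x, y, z, w order). The translation guess
    is a placeholder; it is not specified in this record."""
    r_rel = Rotation.from_quat(quat_curr) * Rotation.from_quat(quat_prev).inv()
    T = np.eye(4)
    T[:3, :3] = r_rel.as_matrix()
    T[:3, 3] = translation_guess
    return T


def icp_refine(source, target, T_init, iterations=30):
    """Refine a rigid transform aligning source to target (both Nx3 arrays)
    with point-to-point ICP using the closed-form Kabsch/SVD step."""
    src = (T_init[:3, :3] @ source.T).T + T_init[:3, 3]
    tree = cKDTree(target)
    T = T_init.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)              # nearest target point per source point
        matched = target[idx]
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # avoid a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        src = (R @ src.T).T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T
```

In a real pipeline the translation guess would come from the INS accelerometers or from the previously registered frame, and the refined registration could in turn be used to correct inertial drift, which is the complementary behaviour between the two sensors that the abstract refers to.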

Original language: English
Title of host publication: ICCAS 2015 - 2015 15th International Conference on Control, Automation and Systems, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1512-1515
Number of pages: 4
ISBN (Electronic): 9788993215090
DOIs
State: Published - 23 Dec 2015
Event: 15th International Conference on Control, Automation and Systems, ICCAS 2015 - Busan, Korea, Republic of
Duration: 13 Oct 2015 - 16 Oct 2015

Publication series

Name: ICCAS 2015 - 2015 15th International Conference on Control, Automation and Systems, Proceedings

Conference

Conference: 15th International Conference on Control, Automation and Systems, ICCAS 2015
Country/Territory: Korea, Republic of
City: Busan
Period: 13/10/15 - 16/10/15

Keywords

  • 3D Surface Measurement
  • Image Registration
  • Inertial navigation system
  • Sensor fusion
