Vision Based 3D Pose Estimation for Underwater Vehicle Docking Problem
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 한경민 | - |
dc.contributor.author | 최현택 | - |
dc.date.accessioned | 2021-12-08T18:41:09Z | - |
dc.date.available | 2021-12-08T18:41:09Z | - |
dc.date.issued | 2011-08-29 | - |
dc.identifier.uri | https://www.kriso.re.kr/sciwatch/handle/2021.sw.kriso/5488 | - |
dc.description.abstract | This paper presents an inexpensive but efficient and accurate vision-based pose estimation algorithm for the underwater docking problem. While previous docking methods for underwater robots required a complicated procedure involving a specially designed docking station, the proposed method needs only a simple pattern board as the target of the docking task. Our method is based on estimating the homography induced by the pattern board as projected into the camera coordinate frame. Each homography is used to estimate the pose of the robot in order to decide the optimal path or sequence for the docking task. The proposed method is validated on a real image data set, and the estimated orientation angles are compared against the corresponding IMU data. We have achieved promising results. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.title | Vision Based 3D Pose Estimation for Underwater Vehicle Docking Problem | - |
dc.title.alternative | Vision Based 3D Pose Estimation for Underwater Vehicle Docking Problem | - |
dc.type | Conference | - |
dc.citation.title | 17th International Unmanned Untethered Submersible Technology Conference | - |
dc.citation.volume | 0 | - |
dc.citation.number | 0 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 4 | - |
dc.citation.conferenceName | 17th International Unmanned Untethered Submersible Technology Conference | - |
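The pose-from-homography step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes a calibrated camera (known intrinsics `K`) and a planar pattern board at Z = 0, estimates the board-to-image homography with the standard DLT, and recovers rotation and translation from the decomposition H ~ K·[r1 r2 t]. All names and the synthetic data are illustrative.

```python
import numpy as np

def homography_from_points(src, dst):
    """DLT estimate of H mapping board-plane points (x, y) to image points (u, v)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (up to scale) is the right singular vector for the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)

def pose_from_homography(H, K):
    """Recover (R, t) for a planar target from H ~ K [r1 r2 t]."""
    M = np.linalg.inv(K) @ H
    M = M / np.linalg.norm(M[:, 0])   # fix the unknown scale (||r1|| = 1)
    if M[2, 2] < 0:                   # target in front of the camera => t_z > 0
        M = -M
    r1, r2, t = M[:, 0], M[:, 1], M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)       # project onto the nearest rotation matrix
    return U @ Vt, t

# Synthetic check: project a board with a known pose, then recover that pose.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
ax, ay = 0.05, 0.1
Rx = np.array([[1, 0, 0], [0, np.cos(ax), -np.sin(ax)], [0, np.sin(ax), np.cos(ax)]])
Ry = np.array([[np.cos(ay), 0, np.sin(ay)], [0, 1, 0], [-np.sin(ay), 0, np.cos(ay)]])
R_true, t_true = Ry @ Rx, np.array([0.1, -0.05, 2.0])
board = [(-0.2, -0.2), (0.2, -0.2), (0.2, 0.2), (-0.2, 0.2), (0.0, 0.1)]
src, dst = [], []
for X, Y in board:
    p = K @ (R_true @ np.array([X, Y, 0.0]) + t_true)
    src.append((X, Y))
    dst.append((p[0] / p[2], p[1] / p[2]))
H = homography_from_points(src, dst)
R_est, t_est = pose_from_homography(H, K)
```

In practice the docking sequence would run this per frame on detected pattern-board corners; in production one would typically use a library routine such as OpenCV's `findHomography` with RANSAC to handle noisy underwater detections.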
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
(34103) 32, Yuseong-daero 1312beon-gil, Yuseong-gu, Daejeon, Korea · Tel. 042-866-3114
COPYRIGHT 2021 BY KOREA RESEARCH INSTITUTE OF SHIPS & OCEAN ENGINEERING. ALL RIGHTS RESERVED.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.