AUV SLAM using forward/downward looking cameras and artificial landmarks
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jung, J. | - |
dc.contributor.author | Lee, Y. | - |
dc.contributor.author | Kim, D. | - |
dc.contributor.author | Lee, D. | - |
dc.contributor.author | Myung, H. | - |
dc.contributor.author | Choi, H.-T. | - |
dc.date.accessioned | 2023-12-22T08:30:37Z | - |
dc.date.available | 2023-12-22T08:30:37Z | - |
dc.date.issued | 2017 | - |
dc.identifier.issn | 0000-0000 | - |
dc.identifier.uri | https://www.kriso.re.kr/sciwatch/handle/2021.sw.kriso/8474 | - |
dc.description.abstract | Autonomous underwater vehicles (AUVs) are usually equipped with one or more optical cameras to obtain visual data of underwater environments. The cameras can also be used to estimate the AUV's pose, together with navigation sensors such as an inertial measurement unit (IMU), a Doppler velocity log (DVL), and a depth sensor. In this paper, we propose a vision-based simultaneous localization and mapping (SLAM) method for AUVs, in which underwater artificial landmarks aid the visual sensing of forward and downward looking cameras. Three types of landmarks are introduced, and their detection algorithms are organized within a conventional extended Kalman filter (EKF) SLAM framework to estimate both the robot and landmark states. The proposed method is validated by an experiment performed in an engineering basin. Since the DVL suffers from noise in a real ocean environment, we generated synthetic noisy data based on the real sensor data. With these data, we verify that the proposed SLAM approach can recover from an erroneous dead-reckoning position. © 2017 IEEE. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
dc.title | AUV SLAM using forward/downward looking cameras and artificial landmarks | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/UT.2017.7890307 | - |
dc.identifier.scopusid | 2-s2.0-85018189185 | - |
dc.identifier.bibliographicCitation | 2017 IEEE OES International Symposium on Underwater Technology, UT 2017 | - |
dc.citation.title | 2017 IEEE OES International Symposium on Underwater Technology, UT 2017 | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordPlus | Cameras | - |
dc.subject.keywordPlus | Extended Kalman filters | - |
dc.subject.keywordPlus | Kalman filters | - |
dc.subject.keywordPlus | Mapping | - |
dc.subject.keywordPlus | Robotics | - |
dc.subject.keywordPlus | Robots | - |
dc.subject.keywordPlus | Units of measurement | - |
dc.subject.keywordPlus | Vision | - |
dc.subject.keywordPlus | Artificial landmark | - |
dc.subject.keywordPlus | Autonomous underwater vehicles (AUVs) | - |
dc.subject.keywordPlus | Detection algorithm | - |
dc.subject.keywordPlus | Doppler velocity logs | - |
dc.subject.keywordPlus | Inertial measurement unit | - |
dc.subject.keywordPlus | Simultaneous localization and mapping | - |
dc.subject.keywordPlus | Underwater environments | - |
dc.subject.keywordPlus | Vision based simultaneous localization and mappings | - |
dc.subject.keywordPlus | Autonomous underwater vehicles | - |
dc.subject.keywordAuthor | AUV | - |
dc.subject.keywordAuthor | extended Kalman filter | - |
dc.subject.keywordAuthor | simultaneous localization and mapping | - |
dc.subject.keywordAuthor | vision | - |
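
The abstract above describes landmark detections fused with IMU/DVL dead reckoning in a conventional EKF SLAM framework. Below is a minimal, hypothetical sketch of that fusion idea, not the paper's actual implementation: it assumes a simplified planar [x, y, yaw] vehicle state, point landmarks appended to the state vector, body-frame DVL velocity and gyro yaw rate as control input, and range-bearing camera observations of already-initialized landmarks. The paper's real system would use a full 6-DOF state with a depth sensor and the three landmark types it introduces.

```python
# Hypothetical 2D EKF-SLAM sketch; all state and measurement models here are
# simplifying assumptions, not taken from the paper.
import numpy as np

def predict(x, P, v_body, yaw_rate, dt, Q):
    """Dead-reckoning prediction from DVL body-frame velocity and gyro yaw rate."""
    yaw = x[2]
    x = x.copy()
    # Integrate body-frame velocity in the world frame.
    x[0] += (v_body[0] * np.cos(yaw) - v_body[1] * np.sin(yaw)) * dt
    x[1] += (v_body[0] * np.sin(yaw) + v_body[1] * np.cos(yaw)) * dt
    x[2] += yaw_rate * dt
    # Jacobian of the motion model with respect to the full state.
    F = np.eye(len(x))
    F[0, 2] = (-v_body[0] * np.sin(yaw) - v_body[1] * np.cos(yaw)) * dt
    F[1, 2] = ( v_body[0] * np.cos(yaw) - v_body[1] * np.sin(yaw)) * dt
    P = F @ P @ F.T
    P[:3, :3] += Q  # process noise on the vehicle block only (simplification)
    return x, P

def update(x, P, z, lm_idx, R):
    """Range-bearing update for landmark lm_idx (already in the state vector)."""
    i = 3 + 2 * lm_idx
    dx, dy = x[i] - x[0], x[i + 1] - x[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    # Measurement Jacobian: nonzero only for the vehicle and this landmark.
    H = np.zeros((2, len(x)))
    H[:, :3] = [[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                [ dy / q,          -dx / q,         -1.0]]
    H[:, i:i + 2] = [[dx / np.sqrt(q), dy / np.sqrt(q)],
                     [-dy / q,          dx / q]]
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap the bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P
```

Under these assumptions, repeated `predict` calls alone reproduce drifting dead reckoning from noisy DVL data, while interleaved `update` calls on landmark detections pull the vehicle estimate back, which is the recovery behavior the abstract reports.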