Object detection and tracking for autonomous underwater robots using weighted template matching
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, D. | - |
dc.contributor.author | Lee, D. | - |
dc.contributor.author | Myung, H. | - |
dc.contributor.author | Choi, H.-T. | - |
dc.date.accessioned | 2023-12-22T09:00:49Z | - |
dc.date.available | 2023-12-22T09:00:49Z | - |
dc.date.issued | 2012 | - |
dc.identifier.issn | 0000-0000 | - |
dc.identifier.uri | https://www.kriso.re.kr/sciwatch/handle/2021.sw.kriso/8795 | - |
dc.description.abstract | The underwater environment has a noisy medium and limited light sources, so underwater vision suffers from a limited detection range and poor visibility. Nevertheless, it remains attractive for close-range detection, especially for navigation. In this paper, vision-based object detection (template matching) and tracking (mean-shift tracking) techniques for underwater robots using artificial objects are studied. We also propose a novel weighted correlation coefficient that combines feature-based and color-based approaches to enhance template-matching performance under various illumination conditions. Average color information is incorporated into template matching using original and texturized images to robustly compute correlation coefficients, and objects are recognized using a multiple-template-based selection approach. Finally, experiments in a test pool were conducted to demonstrate the performance of the proposed techniques on the underwater robot platform yShark, built by KORDI. © 2012 IEEE. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.title | Object detection and tracking for autonomous underwater robots using weighted template matching | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/OCEANS-Yeosu.2012.6263501 | - |
dc.identifier.scopusid | 2-s2.0-84866713708 | - |
dc.identifier.bibliographicCitation | Program Book - OCEANS 2012 MTS/IEEE Yeosu: The Living Ocean and Coast - Diversity of Resources and Sustainable Activities | - |
dc.citation.title | Program Book - OCEANS 2012 MTS/IEEE Yeosu: The Living Ocean and Coast - Diversity of Resources and Sustainable Activities | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordPlus | Artificial objects | - |
dc.subject.keywordPlus | Autonomous underwater robot | - |
dc.subject.keywordPlus | Close range | - |
dc.subject.keywordPlus | Color information | - |
dc.subject.keywordPlus | Correlation coefficient | - |
dc.subject.keywordPlus | Detection range | - |
dc.subject.keywordPlus | Feature-based | - |
dc.subject.keywordPlus | Mean shift tracking | - |
dc.subject.keywordPlus | Object Detection | - |
dc.subject.keywordPlus | Object Tracking | - |
dc.subject.keywordPlus | Poor visibility | - |
dc.subject.keywordPlus | Template-based | - |
dc.subject.keywordPlus | Underwater environments | - |
dc.subject.keywordPlus | Underwater robots | - |
dc.subject.keywordPlus | Underwater vision | - |
dc.subject.keywordPlus | Various illumination conditions | - |
dc.subject.keywordPlus | Vision based | - |
dc.subject.keywordPlus | Weighted correlation | - |
dc.subject.keywordPlus | Light sources | - |
dc.subject.keywordPlus | Object recognition | - |
dc.subject.keywordPlus | Oceanography | - |
dc.subject.keywordPlus | Robots | - |
dc.subject.keywordPlus | Template matching | - |
dc.subject.keywordPlus | Tracking (position) | - |
dc.subject.keywordPlus | Vision | - |
dc.subject.keywordPlus | Autonomous underwater vehicles | - |
dc.subject.keywordAuthor | Object detection | - |
dc.subject.keywordAuthor | Object tracking | - |
dc.subject.keywordAuthor | Underwater vision | - |
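The abstract describes a weighted correlation coefficient that blends a feature-based match score with average color information. As a rough illustrative sketch only (the paper's exact formulation, weight value, and color space are not given in this record; the function names and the fixed weight `w` below are assumptions), one way to combine normalized cross-correlation with an average-color similarity in NumPy might look like:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between a grayscale patch and template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def color_similarity(patch, template):
    """Similarity of average colors (8-bit channels), mapped to [0, 1]."""
    diff = np.abs(patch.mean(axis=(0, 1)) - template.mean(axis=(0, 1)))
    return float(1.0 - diff.mean() / 255.0)

def weighted_score(gray_patch, gray_tmpl, color_patch, color_tmpl, w=0.5):
    """Hypothetical weighted combination of feature- and color-based scores.

    w is an assumed mixing weight, not a value from the paper.
    """
    return w * ncc(gray_patch, gray_tmpl) + (1.0 - w) * color_similarity(color_patch, color_tmpl)
```

An identical patch and template yield a score of 1.0 (perfect correlation and identical average color); mismatches in either texture or color pull the score down, which is the intuition behind weighting the two cues for robustness under changing illumination.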