Study on Underwater Object Tracking Based on Real-Time Recurrent Regression Networks Using Multi-beam Sonar Images
- Authors
- 이언호; 이영준; 최진우; 이세진
- Issue Date
- 2020
- Publisher
- Korea Robotics Society
- Keywords
- Underwater Sonar Image; Object Tracking; Real-Time Recurrent Regression Networks; Heterogeneous Sonar Sensors
- Citation
- The Journal of Korea Robotics Society, v.15, no.1, pp. 8-15
- Pages
- 8
- Journal Title
- The Journal of Korea Robotics Society
- Volume
- 15
- Number
- 1
- Start Page
- 8
- End Page
- 15
- URI
- https://www.kriso.re.kr/sciwatch/handle/2021.sw.kriso/284
- DOI
- 10.7746/jkros.2020.15.1.008
- ISSN
- 1975-6291
- Abstract
- This research is a case study of underwater object tracking based on real-time recurrent regression networks (Re3). Re3 is built around the concept of generic object tracking, which makes it well suited to unclear underwater sonar images. Because it pursues a tracking approach rather than per-frame object detection, it also avoids the computational load that can limit detection-based models. The model is also highly intuitive, maintaining excellent tracking continuity even when the tracked object temporarily becomes partially occluded or faded. The multi-beam sonar image dataset consists of four types: (a) a dummy object floating in the testbed; (b) a dummy object settled on the sea bottom; (c) a tire settled on the bottom of the testbed; and (d) multiple objects settled on the bottom of the testbed. For this study, experiments were conducted to obtain underwater sonar images from the sea and from an underwater testbed, and the ability to track objects robustly in these noisy underwater sonar images was validated.
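As a rough illustration of how a recurrent regression tracker of this kind operates on sonar frames, the sketch below shows a single tracking step: crops centered on the last known bounding box from the previous and current sonar frames pass through a shared convolutional backbone, an LSTM cell carries appearance and motion state across frames, and a linear head regresses the new box. This is a minimal sketch under assumed layer sizes and module names, not the authors' Re3 implementation; all dimensions and identifiers are hypothetical.

```python
# Minimal sketch of a Re3-style recurrent regression tracker on sonar crops.
# Layer sizes and names are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class Re3StyleTracker(nn.Module):
    def __init__(self, feat_dim=256, hidden_dim=512):
        super().__init__()
        # Shared convolutional feature extractor for the previous-frame and
        # current-frame crops (both centered on the last known box).
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Recurrent cell carries memory across frames, which is what allows
        # continuity when the target is briefly occluded or faded.
        self.lstm = nn.LSTMCell(2 * feat_dim, hidden_dim)
        # Regress the box (x1, y1, x2, y2) in crop coordinates.
        self.box_head = nn.Linear(hidden_dim, 4)

    def forward(self, prev_crop, curr_crop, state=None):
        # prev_crop, curr_crop: (B, 1, H, W) single-channel sonar crops.
        f_prev = self.backbone(prev_crop).flatten(1)
        f_curr = self.backbone(curr_crop).flatten(1)
        h, c = self.lstm(torch.cat([f_prev, f_curr], dim=1), state)
        return self.box_head(h), (h, c)

# One tracking step on dummy sonar crops.
tracker = Re3StyleTracker()
prev = torch.rand(1, 1, 128, 128)
curr = torch.rand(1, 1, 128, 128)
box, state = tracker(prev, curr)          # first frame: state starts at zeros
box, state = tracker(curr, curr, state)   # later frames reuse the carried state
print(box.shape)                          # torch.Size([1, 4])
```

Because only a crop around the previous box is processed and the box is regressed directly, the per-frame cost stays low compared with running a full object detector on every sonar frame, which is the trade-off the abstract points to.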