LEE - Aeronautics and Astronautics Engineering Graduate Program
Deep learning-based keypoints driven visual inertial odometry for GNSS-denied flight (Graduate School, 2023-07-03)
Artykov, Arslan; Koyuncu, Emre; 511201110; Aeronautics and Astronautics Engineering

In this thesis, our primary objective was to investigate and enhance the visual-inertial navigation capabilities of autonomous 3D quadrotor flight, which presents numerous challenges in outdoor environments. Safe autonomous flight in such settings requires a navigation algorithm that is both highly precise and exceptionally robust, enabling the quadrotor to navigate safely through the variety of static and dynamic obstacles encountered along its flight path. A critical requirement for addressing this complex problem is accurate estimation of the agent's ego-motion. While global positioning tools such as GPS can offer valuable information for ego-motion estimation, their effectiveness is often limited in challenging scenarios where their signals may be distorted or unavailable. In light of this, we pursued an alternative approach, fusing visual and inertial sensors to achieve more reliable and accurate ego-motion estimation. Conventional visual-inertial odometry (VIO) techniques commonly suffer performance degradation, or fail outright, under the adverse conditions typical of outdoor environments: high dynamic range, textureless scenes, motion blur, and rapid motion all pose significant challenges to the accuracy and reliability of traditional VIO methods. To overcome these limitations and advance the state of the art in visual-inertial navigation systems (VINS), our research focused on techniques that go beyond traditional human-engineered salient point detectors.
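To make the baseline concrete, a classical hand-engineered salient point detector such as Harris scores each pixel by the eigenstructure of a local gradient structure tensor. The sketch below is a minimal illustrative NumPy implementation of that idea; the window size, sensitivity constant `k`, and response threshold are our own illustrative choices, not the configuration used in the thesis.

```python
import numpy as np

def harris_corners(img, k=0.04, window=3, thresh=1e-2):
    """Minimal Harris corner detector: a classical hand-engineered
    salient point detector of the kind the thesis moves beyond.
    `img` is a 2D float array with intensities in [0, 1].
    Returns a list of (row, col) pixels whose corner response
    exceeds `thresh`."""
    # Image gradients via central differences
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0

    # Structure-tensor entries, summed over a small box window
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    pad = window // 2

    def box(a):
        out = np.zeros_like(a)
        for dy in range(-pad, pad + 1):
            for dx in range(-pad, pad + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)

    # Harris response: det(M) - k * trace(M)^2; large and positive
    # only where the gradient varies in two directions (a corner)
    R = (Sxx * Syy - Sxy * Sxy) - k * (Sxx + Syy) ** 2
    ys, xs = np.nonzero(R > thresh)
    return list(zip(ys.tolist(), xs.tolist()))
```

Detectors like this respond only where image gradients vary in two directions, which is exactly why they struggle in the textureless or motion-blurred scenes the abstract lists: when no such structure is visible, no keypoints are produced.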
Instead, we leveraged deep learning-based feature extractors to create a novel hybrid visual-inertial odometry method that remains remarkably resilient under the harsh conditions described above. This approach not only significantly improved the speed of the conventional algorithm but also substantially enhanced its accuracy and robustness. The learned feature extractors identify and track relevant visual features more effectively, enabling the system to better handle challenging environmental conditions and dynamic flight maneuvers. Consequently, our proposed method offers superior ego-motion estimation and paves the way for safer navigation in GPS-denied environments where traditional techniques may falter or become unreliable. The outcomes of this research make a substantial contribution to the advancement of autonomous flight systems, providing a more reliable and effective solution for visual-inertial navigation. By addressing the limitations of conventional VIO techniques and improving the overall performance and resilience of visual-inertial navigation systems, we have opened up new possibilities for deploying autonomous quadrotors in real-world scenarios previously deemed too challenging or risky. The insights gained from our findings have the potential to shape the future development and implementation of autonomous flight technologies, enabling their broader adoption in industries and applications where precise and robust navigation is essential.
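The complementarity that any visual-inertial method exploits can be shown with a deliberately tiny sketch: an IMU dead-reckons smoothly but drifts (e.g., from accelerometer bias), while a visual position fix is drift-free but arrives at a lower rate. The toy 1-D alpha-beta filter below is purely illustrative of loose coupling; the function name, gains, and rates are our own assumptions and bear no relation to the tightly coupled optimization developed in the thesis.

```python
def fuse_visual_inertial(accels, visual_pos, dt=0.01, alpha=0.1, beta=0.05):
    """Toy 1-D loosely coupled visual-inertial fusion (alpha-beta filter).

    `accels` holds accelerometer samples at rate 1/dt; `visual_pos`
    holds a position fix where the camera produced one and None where
    it did not. The IMU prediction is corrected toward each visual fix,
    bounding the drift that pure inertial integration accumulates.
    Illustrative only; not the thesis's estimator.
    """
    pos, vel = 0.0, 0.0
    trace = []
    for a, z in zip(accels, visual_pos):
        # IMU prediction: integrate acceleration into velocity and position
        vel += a * dt
        pos += vel * dt
        # Visual correction whenever a (drift-free but low-rate) fix arrives
        if z is not None:
            innovation = z - pos
            pos += alpha * innovation
            vel += beta * innovation
        trace.append(pos)
    return trace
```

Running this with a biased accelerometer and occasional visual fixes shows the point: pure inertial integration drifts quadratically, while the fused estimate stays bounded near the true position.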