A New Visual Inertial Simultaneous Localization and Mapping (SLAM) Algorithm Based on Point and Line Features

  1. Zhang, Tong
  2. Liu, Chunjiang
  3. Li, Jiaqi
  4. Pang, Minghui
  5. Wang, Mingang
  6. Chow, Jacky C. K.
  7. Shahbazi, Mozhdeh
  8. Kurian, Ajeesh
  9. González-Aguilera, Diego 1
  1. 1 Universidad de Salamanca (Salamanca, Spain). ROR: https://ror.org/02f40zc51

Journal:
Drones

ISSN: 2504-446X

Year of publication: 2022

Volume: 6

Issue: 1

Pages: 23

Type: Article

DOI: 10.3390/DRONES6010023 (open access)


Abstract

Traditional point-line visual-inertial simultaneous localization and mapping (SLAM) systems suffer from poor accuracy and cannot run in real time in indoor scenes with weak texture and changing illumination. This paper proposes a point-line visual-inertial SLAM method for such conditions. First, after Bilateral Filtering, Speeded Up Robust Features (SURF) point extraction and the Fast Library for Approximate Nearest Neighbors (FLANN) matcher are applied to improve the robustness of point-feature extraction. Second, a minimum density threshold and a length-suppression parameter selection strategy are established for line features, and geometric constraints are incorporated into line-feature matching to improve processing efficiency. The visual-inertial parameters and biases are then initialized using a maximum a posteriori estimation method. Finally, simulation experiments compare the proposed method with the traditional tightly coupled monocular visual-inertial odometry using point and line features (PL-VIO). The results demonstrate that the proposed method operates effectively in real time, with positioning accuracy 22% higher on average and 40% higher in scenarios with illumination changes and blurred images.
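The point-matching step summarized in the abstract (SURF descriptors matched via FLANN, conventionally followed by a nearest-neighbour ratio test) and the line length-suppression step can be sketched on synthetic data. This is a minimal illustration, not the paper's implementation: the 0.7 ratio and 25-pixel threshold are assumed placeholder values, and an exact numpy distance matrix stands in for FLANN's approximate k-d tree search.

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.7):
    """Brute-force 2-NN descriptor matching with a ratio test.

    FLANN approximates this nearest-neighbour search; here an
    exact pairwise distance matrix stands in for the k-d tree.
    Returns (i, j) index pairs into desc_a / desc_b.
    """
    # pairwise squared Euclidean distances, shape (len(a), len(b))
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(-1)
    matches = []
    for i, row in enumerate(d2):
        j1, j2 = np.argsort(row)[:2]       # two nearest neighbours
        if row[j1] < (ratio ** 2) * row[j2]:  # ratio test (squared dists)
            matches.append((i, int(j1)))
    return matches

def suppress_short_lines(segments, min_len=25.0):
    """Drop line segments shorter than min_len pixels.

    `segments` is an (N, 4) array of (x1, y1, x2, y2) endpoints;
    min_len is a hypothetical stand-in for the paper's
    length-suppression parameter.
    """
    seg = np.asarray(segments, dtype=float)
    lengths = np.hypot(seg[:, 2] - seg[:, 0], seg[:, 3] - seg[:, 1])
    return seg[lengths >= min_len]
```

In practice the descriptors would come from OpenCV's SURF implementation (`cv2.xfeatures2d.SURF_create`, available in opencv-contrib builds) and be matched with `cv2.FlannBasedMatcher.knnMatch`, which replaces the exact search above with an approximate one.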
