Conference paper, Year: 2015

Range and vision sensors fusion for outdoor 3D reconstruction

Abstract

Awareness of the surrounding environment is an essential task for several applications such as mapping, autonomous navigation, and localization. In this paper we exploit the complementarity of a panoramic microwave radar and a monocular camera for 3D reconstruction of large-scale environments. The robustness to environmental conditions and the depth-measurement ability of the radar on the one hand, and the high spatial resolution of the vision sensor on the other, make these two sensors well suited to large-scale outdoor cartography. First, the system model of the two sensors is presented and a new 3D reconstruction method based on the sensors' geometry is introduced. Second, we address the global calibration problem, which consists in finding the rigid transformation between the radar and camera coordinate systems. The method is based on the optimization of a non-linear criterion obtained from a set of radar-to-image target correspondences. Both methods have been validated with synthetic and real data.
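To illustrate the kind of calibration the abstract describes, here is a minimal sketch (not the authors' implementation) of estimating the rigid radar-to-camera transform by non-linear least squares over radar-to-image target correspondences. It assumes a pinhole camera model, targets detected by the radar as (range, azimuth) pairs lying near the radar's scanning plane, and SciPy for the optimization; the intrinsic matrix, function names, and initial guess are all illustrative placeholders.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Assumed camera intrinsics (focal lengths, principal point) -- placeholder values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def radar_to_cartesian(r, theta, z=0.0):
    """Convert a radar detection (range r, azimuth theta) to a 3D point in the
    radar frame, assuming the target lies near the radar's scanning plane."""
    return np.array([r * np.cos(theta), r * np.sin(theta), z])

def residuals(params, radar_pts, image_pts):
    """Reprojection residuals for a candidate radar-to-camera transform.
    params = [rx, ry, rz, tx, ty, tz]: axis-angle rotation plus translation."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    t = params[3:]
    res = []
    for P_radar, uv in zip(radar_pts, image_pts):
        P_cam = R @ P_radar + t          # express the target in the camera frame
        p = K @ P_cam
        proj = p[:2] / p[2]              # pinhole projection to pixel coordinates
        res.extend(proj - uv)            # pixel reprojection error
    return np.asarray(res)

def calibrate(radar_detections, image_targets, x0):
    """Minimize the non-linear reprojection criterion over the 6-DoF transform.
    x0 should be a coarse initial estimate so that targets project in front of
    the camera (avoids degenerate projections at the start)."""
    radar_pts = [radar_to_cartesian(r, th) for r, th in radar_detections]
    sol = least_squares(residuals, x0, args=(radar_pts, np.asarray(image_targets)))
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```

In this sketch the criterion is the sum of squared pixel reprojection errors of the radar-detected targets; other residual choices (e.g., range/bearing errors in the radar frame) would fit the same optimization structure.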
No file deposited

Dates and versions

hal-02604592, version 1 (16-05-2020)

Identifiers

Cite

G. El Natour, O. Ait Aider, R. Rouveure, F. Berry, P. Faure. Range and vision sensors fusion for outdoor 3D reconstruction. International Conference on Computer Vision Theory and Applications, Mar 2015, Berlin, Germany. pp.1-6. ⟨hal-02604592⟩