R3LIVE: Real-time Robust Tightly Coupled System by MaRS Laboratory, HKU

Author: Livox Pioneer | Time: 2022-2-17 16:02:54 | Smart city
R3LIVE Introduction
R3LIVE (Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package) is a novel LiDAR-inertial-visual sensor fusion framework developed by the MaRS Laboratory of the University of Hong Kong, led by Prof. Fu Zhang. It takes advantage of the measurements from LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation.

R3LIVE is built upon the team's previous work R2LIVE and contains two subsystems: a LiDAR-inertial odometry (LIO) and a visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors to build the geometric structure of the global map (i.e. the positions of the 3D points). The VIO subsystem uses the visual-inertial data to render the map's texture (i.e. the colors of the 3D points), fusing the visual data directly and efficiently by minimizing a frame-to-map photometric error.
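As a minimal sketch of the idea (the notation here is illustrative, not the exact formulation from the R3LIVE paper), let p_i be a 3D map point with stored color c_i, T the camera pose being estimated, π(·) the camera projection, and I(·) the color observed at a pixel of the current image. The frame-to-map photometric error then takes a form such as

\[
r_i(\mathbf{T}) \;=\; \mathbf{c}_i - \mathbf{I}\big(\pi(\mathbf{T}\,\mathbf{p}_i)\big),
\qquad
\mathbf{T}^{\star} \;=\; \arg\min_{\mathbf{T}} \sum_i \big\lVert r_i(\mathbf{T}) \big\rVert^{2},
\]

and minimizing it over the tracked map points is what lets the VIO fuse the image data with the colored map directly.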


R3LIVE Hardware System
In this project, the author, Dr. Lin Jiarong, built a handheld 3D scanning system with a Livox Avia LiDAR, an industrial camera, and a RoboMaster Manifold 2C onboard computer; an RTK system was also added to collect ground-truth data for comparison.




R3LIVE system architecture diagram

Application of Livox Avia in R3LIVE system

In the R3LIVE experiments, the author used the Livox Avia as the depth sensor and collected data on the campuses of the University of Hong Kong and the Hong Kong University of Science and Technology to verify the robustness and accuracy of the algorithm. By making full use of the Livox Avia's non-repetitive scanning pattern, the reconstructed map obtains a denser point cloud, which significantly improves the robustness of map feature matching. In addition, the author used the Livox Avia's built-in IMU, relying on its synchronously output high-frequency (200 Hz) six-degree-of-freedom motion information to effectively improve the positioning accuracy of the tightly coupled algorithm.
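As a rough, hedged illustration of how these synchronized Avia streams are typically consumed in a ROS setup (R3LIVE itself is a C++/ROS package; this is not its code), the Python sketch below subscribes to the point-cloud and IMU topics and prints their message rates. The topic names /livox/lidar and /livox/imu are assumptions based on common Livox ROS driver defaults and may differ in a given configuration.

#!/usr/bin/env python
# Hedged sketch: check that the Avia's IMU stream arrives at roughly 200 Hz.
# Topic names below are assumptions (typical livox_ros_driver defaults), not taken from the post.
import rospy
from sensor_msgs.msg import Imu, PointCloud2

counts = {"imu": 0, "lidar": 0}

def on_imu(_msg):
    counts["imu"] += 1

def on_lidar(_msg):
    counts["lidar"] += 1

def report(_event):
    # Printed once per second; "imu" should be close to 200 if the stream is healthy.
    rospy.loginfo("imu: %d msgs/s, lidar: %d msgs/s", counts["imu"], counts["lidar"])
    counts["imu"] = counts["lidar"] = 0

if __name__ == "__main__":
    rospy.init_node("avia_rate_check")
    rospy.Subscriber("/livox/imu", Imu, on_imu)              # assumed topic name
    rospy.Subscriber("/livox/lidar", PointCloud2, on_lidar)  # assumed; the driver may also publish a custom msg type
    rospy.Timer(rospy.Duration(1.0), report)
    rospy.spin()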

R3LIVE public experimental dataset
The authors of R3LIVE have published a total of 9 datasets that they collected. Users can visit the following URL to download the datasets and reproduce and evaluate the experimental results of R3LIVE: https://github.com/ziv-lin/r3live_dataset
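The recorded sequences are ROS bag files. Purely as a hedged convenience sketch (the filename below is a placeholder, not an actual file from the dataset), the standard rosbag Python API can be used to list a downloaded bag's topics and duration before feeding it to R3LIVE:

# Hedged sketch: inspect a downloaded R3LIVE sequence (filename is a placeholder).
import rosbag

with rosbag.Bag("r3live_sequence.bag") as bag:  # placeholder path
    info = bag.get_type_and_topic_info()
    for topic, meta in info.topics.items():
        print(topic, meta.msg_type, meta.message_count)
    print("duration (s): %.1f" % (bag.get_end_time() - bag.get_start_time()))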


R3LIVE project summary
In this work, the author innovatively introduced R3LIVE and effectively addressed the following problems:
  • A high-precision, high-efficiency colored point cloud reconstruction system was introduced to reconstruct a dense colored point cloud of the surrounding environment in real time;
  • By fusing camera information, the problem of LiDAR failing to localize in degenerate scenes was effectively solved;
  • To promote research and applications in LiDAR-related industries with a cost-effective solution, the author open-sourced a complete set of software and hardware solutions based on the Livox Avia LiDAR.



Results of several of our experiments
R3LIVE is a highly scalable system. In addition to serving as a SLAM system for real-time robotic applications, it can also be used to reconstruct dense, accurate RGB-colored 3D maps for applications such as surveying and mapping. Moreover, the developers of R3LIVE provide a series of utilities for reconstructing and rendering polygon mesh maps, so that the maps reconstructed by R3LIVE can be imported into various 3D applications, including games and simulators, more conveniently and efficiently, further improving the scalability of R3LIVE.
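The mesh utilities mentioned above are the R3LIVE developers' own tools. Purely as an illustrative, hedged alternative (the filenames are placeholders, and this uses the third-party Open3D library rather than anything shipped with R3LIVE), a colored point-cloud export could also be meshed like this before importing it into a game engine or simulator:

# Hedged sketch: turn a colored point-cloud export into a triangle mesh with Open3D.
# Filenames are placeholders; this is not R3LIVE's own mesh/texturing pipeline.
import open3d as o3d

pcd = o3d.io.read_point_cloud("r3live_map.ply")  # placeholder path to a colored point cloud
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.2, max_nn=30))
mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=10)
o3d.io.write_triangle_mesh("r3live_mesh.ply", mesh)  # can then be imported into 3D tools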

Livox welcomes university laboratories and research teams to reach out for more in-depth discussions on Livox LiDAR applications. We believe that our cost-effective LiDAR solutions will empower the research, application, and progress of related fields!


MaRS Introduction
The Mechatronics and Robotic Systems (MaRS) Laboratory of the University of Hong Kong is led by Professor Fu Zhang and focuses on mechatronic systems and robotic applications. The team has achieved fruitful results in areas such as aerial robot design, planning and control, and SLAM applications.

Papers
  • R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
  • R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping
  • Fast-LIO: A computationally efficient and robust LiDAR-inertial odometry package.
  • ikd-tree: A state-of-the-art dynamic KD-Tree for 3D kNN search.
  • Loam-livox: A robust LiDAR Odometry and Mapping (LOAM) package for Livox LiDAR
  • livox_camera_calib: A robust, high-accuracy extrinsic calibration tool between high-resolution LiDAR (e.g. Livox) and camera in targetless environments.
  • mlcc: A fast and accurate extrinsic calibration tool for multiple LiDARs and cameras



Posted on 2022-4-7 10:58:41 | All floors
Can the R3LIVE system be used with only a LiDAR and an IMU?

Posted on 2022-6-9 15:12:26 | All floors
JunlongGuo posted at 2022-4-7 10:58:
Can the R3LIVE system be used with only a LiDAR and an IMU?

Hello, a camera is required within the R3LIVE system (see the system architecture diagram in the post); without a camera it degrades to LIO only.
