About Us

Welcome to the Mechatronics and Robotic Systems (MaRS) Laboratory. We are part of the Department of Mechanical Engineering at the University of Hong Kong (HKU). Our lab focuses on general mechatronic systems and robotics, with an emphasis on their practical use in everyday life and industry. Our current focuses are aerial robot design, planning, and control, and LiDAR-based simultaneous localization and mapping (SLAM).

We are hiring new MPhil and Ph.D. students to work on UAV design, planning, control, and LiDAR SLAM. Prospective students can contact Dr. Zhang at fuzhang@hku.hk about the positions.

Our website has moved to mars.hku.hk. Please visit the new website for updates.

Project Highlights

FAST-LIO2: Fast Direct LiDAR-inertial Odometry

FAST-LIO2 is computationally efficient (e.g., up to 100 Hz odometry and mapping in large outdoor environments), robust (e.g., reliable pose estimation in cluttered indoor environments with rotation rates up to 1000 deg/s), and versatile (i.e., applicable to both multi-line spinning and solid-state LiDARs, UAV and handheld platforms, and Intel- and ARM-based processors), while still achieving accuracy higher than or comparable to existing methods.

Authors: Wei Xu, Yixi Cai, Dongjiao He, Jiarong Lin, Fu Zhang
Videos: video 1, video 2
Code: https://github.com/hku-mars/FAST_LIO
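
FAST-LIO2 is "direct" in the sense that raw LiDAR points are registered against local planes in the map rather than against extracted features. As a rough illustration of that idea (a minimal NumPy sketch, not code from the repository; function names are our own), here is how a local plane can be fitted and a point-to-plane residual computed:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n.x + d = 0 to a set of 3D points via SVD.

    Returns the unit normal n and offset d.
    """
    centroid = points.mean(axis=0)
    # The normal is the right singular vector of the centered points
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

def point_to_plane_residual(p, n, d):
    """Signed distance from point p to the plane n.x + d = 0."""
    return n @ p + d

# Five coplanar points on the plane z = 1
pts = np.array([[0.0, 0.0, 1.0],
                [1.0, 0.0, 1.0],
                [0.0, 1.0, 1.0],
                [1.0, 1.0, 1.0],
                [0.5, 0.5, 1.0]])
n, d = fit_plane(pts)
r = point_to_plane_residual(np.array([0.2, 0.3, 1.1]), n, d)
print(abs(r))  # distance of the query point from the fitted plane: 0.1
```

In the actual method such residuals enter an iterated Kalman filter update, with the planes found by nearest-neighbor search in the incrementally maintained map.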

Fast and Accurate Extrinsic Calibration for Multiple LiDARs and Cameras

We propose a fast, accurate, and targetless extrinsic calibration method for multiple LiDARs and cameras based on adaptive voxelization. On the theoretical level, we incorporate LiDAR extrinsic calibration into the bundle adjustment framework. We derive the second-order derivatives of the cost function w.r.t. the extrinsic parameters to accelerate the optimization. On the implementation level, we apply adaptive voxelization to dynamically segment the LiDAR point cloud into voxels of non-identical sizes, reducing the computation time of feature correspondence matching.

Authors: Xiyuan Liu, Chongjian Yuan, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/mlcc
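
To give a flavor of the adaptive voxelization idea (a simplified sketch, not the repository's implementation; the planarity test and thresholds below are illustrative choices), a voxel can be split recursively until the points inside it are roughly coplanar:

```python
import numpy as np

def is_planar(points, thresh=1e-4):
    """Treat a voxel as planar when the smallest eigenvalue of the
    point covariance (variance normal to the best-fit plane) is tiny."""
    if len(points) < 5:
        return True
    cov = np.cov(points.T)
    return np.linalg.eigvalsh(cov)[0] < thresh

def adaptive_voxelize(points, origin, size, min_size=0.25):
    """Recursively split a cubic voxel into octants until the points
    inside form a plane (or the voxel reaches min_size).
    Returns a list of (origin, size, points) leaf voxels."""
    if len(points) == 0:
        return []
    if size <= min_size or is_planar(points):
        return [(origin, size, points)]
    leaves = []
    half = size / 2.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                o = origin + half * np.array([dx, dy, dz])
                mask = np.all((points >= o) & (points < o + half), axis=1)
                leaves += adaptive_voxelize(points[mask], o, half, min_size)
    return leaves

rng = np.random.default_rng(0)
# Points on the plane z = 0.3 with small noise, inside the unit cube
xy = rng.random((200, 2))
pts = np.column_stack([xy, 0.3 + 1e-4 * rng.standard_normal(200)])
leaves = adaptive_voxelize(pts, origin=np.zeros(3), size=1.0)
print(len(leaves))  # the near-planar cloud is kept as a single voxel
```

Because voxels stop splitting as soon as their contents are planar, flat regions stay coarse while cluttered regions refine, which is what keeps correspondence matching cheap.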

ikd-Tree: An Incremental K-D Tree for Robotic Applications

ikd-Tree is an incremental k-d tree designed for robotic applications. The ikd-Tree incrementally updates a k-d tree with newly arriving points only, leading to much lower computation time than existing static k-d trees. Besides point-wise operations, the ikd-Tree supports features such as box-wise operations and down-sampling that are practically useful in robotic applications.

Authors: Yixi Cai, Wei Xu, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/ikd-Tree
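
The core idea of incremental updates can be sketched with a toy k-d tree that inserts points one at a time (the real ikd-Tree additionally provides re-balancing, box-wise delete, and down-sampling; the class and method names below are illustrative, not the library's API):

```python
import numpy as np

class Node:
    __slots__ = ("point", "axis", "left", "right")
    def __init__(self, point, axis):
        self.point = point
        self.axis = axis
        self.left = None
        self.right = None

class IncrementalKDTree:
    """Toy 3-D k-d tree with point-wise incremental insertion and
    nearest-neighbor search (no re-balancing or box-wise operations)."""
    def __init__(self):
        self.root = None

    def insert(self, point):
        point = np.asarray(point, dtype=float)
        if self.root is None:
            self.root = Node(point, 0)
            return
        node = self.root
        while True:  # walk down and attach the new point as a leaf
            side = "left" if point[node.axis] < node.point[node.axis] else "right"
            child = getattr(node, side)
            if child is None:
                setattr(node, side, Node(point, (node.axis + 1) % 3))
                return
            node = child

    def nearest(self, query):
        query = np.asarray(query, dtype=float)
        best = [None, np.inf]  # [best point, best squared distance]
        def visit(node):
            if node is None:
                return
            d = np.sum((node.point - query) ** 2)
            if d < best[1]:
                best[0], best[1] = node.point, d
            diff = query[node.axis] - node.point[node.axis]
            near, far = (node.left, node.right) if diff < 0 else (node.right, node.left)
            visit(near)
            if diff ** 2 < best[1]:  # search ball crosses the splitting plane
                visit(far)
        visit(self.root)
        return best[0]

tree = IncrementalKDTree()
for p in [(0, 0, 0), (1, 1, 1), (2, 0, 1), (0.5, 0.5, 0.5)]:
    tree.insert(p)
nn = tree.nearest((0.6, 0.6, 0.6))
print(nn)  # closest stored point: [0.5 0.5 0.5]
```

A naive tree like this degrades as points stream in, which is precisely why the ikd-Tree's partial re-balancing matters for odometry running at sensor rate.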

R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual Tightly-coupled State Estimator and Mapping

R2LIVE is a robust, real-time, tightly-coupled multi-sensor fusion framework that fuses measurements from a LiDAR, an inertial sensor, and a visual camera to achieve robust and accurate state estimation. By taking advantage of measurements from all individual sensors, our algorithm is robust to various visual-failure and LiDAR-degenerated scenarios, and is able to run in real time on an onboard computation platform, as shown by extensive experiments conducted in indoor, outdoor, and mixed environments of different scales.

Authors: Jiarong Lin, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/r2live

R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual Tightly-coupled State Estimation and Mapping Package

R3LIVE is a novel LiDAR-inertial-visual sensor fusion framework that takes advantage of measurements from LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation. R3LIVE consists of two subsystems: LiDAR-inertial odometry (LIO) and visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors to build the geometric structure of the global map (i.e., the positions of 3D points). The VIO subsystem utilizes the data from the visual-inertial sensors to render the map's texture (i.e., the colors of 3D points).

Authors: Jiarong Lin, Fu Zhang
Videos: video 1, video 2
Code: https://github.com/hku-mars/r3live

Avoiding dynamic small obstacles with onboard sensing and computation on aerial robots

This repository provides dynamic small-obstacle avoidance for UAVs. It is a complete system for LiDAR-based UAVs, including FAST-LIO SLAM, time-accumulated k-d tree mapping, and kinodynamic A* search modules. Running at 50 Hz, it is able to avoid dynamic small obstacles (down to bars of 20 mm diameter).

Authors: Fanze Kong, Wei Xu, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/dyn_small_obs_avoidance
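
The search module's skeleton can be illustrated with a plain grid-based A* (a sketch of our own, not code from the repository; the kinodynamic A* in the system searches motion primitives in the UAV's state space instead, but the open-list and heuristic machinery is analogous):

```python
import heapq
from itertools import count

def a_star(grid, start, goal):
    """Plain A* on a 4-connected occupancy grid (1 = obstacle).
    Returns the shortest path as a list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    tie = count()  # tiebreaker so heap entries never compare cells
    open_set = [(h(start), 0, next(tie), start, None)]
    came_from = {}          # cell -> parent, doubles as the closed set
    g_best = {start: 0}     # best known cost-to-come per cell
    while open_set:
        _, g, _, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue  # already expanded with a cheaper cost
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:  # walk parents back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, next(tie), nxt, cur))
    return None  # goal unreachable

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
path = a_star(grid, (0, 0), (2, 3))
print(len(path) - 1)  # shortest path around the wall takes 5 moves
```

In the kinodynamic variant, grid neighbors are replaced by dynamically feasible short trajectories and the unit step cost by control effort plus time, which is what lets the planner respect the UAV's dynamics.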

Pixel-level Extrinsic Self Calibration of High Resolution LiDAR and Camera in Targetless Environments

livox_camera_calib is a robust, high-accuracy extrinsic calibration tool for a high-resolution LiDAR (e.g., Livox) and a camera in targetless environments. Our algorithm runs in both indoor and outdoor scenes and only requires edge information in the scene. If the scene is suitable, it can achieve pixel-level accuracy similar to or even beyond target-based methods.

Authors: Chongjian Yuan, Xiyuan Liu, Xiaoping Hong, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/livox_camera_calib