sensor-fusion
Here are 684 public repositories matching this topic...
[ICRA'23] BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation
- Updated Jul 31, 2024 - Python
FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry
- Updated Apr 30, 2025 - C++
A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly coupled state estimation and mapping package
- Updated May 28, 2024 - C++
A Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry (LIVO).
- Updated Nov 6, 2024 - C++
[PAMI'23] TransFuser: Imitation with Transformer-Based Sensor Fusion for Autonomous Driving; [CVPR'21] Multi-Modal Fusion Transformer for End-to-End Autonomous Driving
- Updated Jun 28, 2024 - Python
Arduino sketches for MPU9250 9DoF with AHRS sensor fusion
- Updated May 11, 2019 - C++
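For context on what AHRS-style fusion on a 9-DoF IMU involves, here is a minimal complementary-filter sketch in Python. It only blends gyro integration with accelerometer-derived attitude; the Arduino repository above implements full AHRS filters (e.g. Madgwick/Mahony), and the sample rate, axis conventions, and blend factor below are assumptions for illustration.

```python
import numpy as np

def complementary_filter(accel, gyro, dt=0.01, alpha=0.98):
    """Fuse accelerometer and gyroscope samples into roll/pitch estimates.

    accel: (N, 3) accelerometer samples (ax, ay, az) in g
    gyro:  (N, 3) gyro rates (gx, gy, gz) in rad/s
    dt:    sample period in seconds (assumed constant)
    alpha: blend factor; larger values trust the integrated gyro more
    """
    roll, pitch = 0.0, 0.0
    out = []
    for (ax, ay, az), (gx, gy, gz) in zip(accel, gyro):
        # Attitude from the gravity direction (valid when acceleration ~ gravity)
        roll_acc = np.arctan2(ay, az)
        pitch_acc = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
        # Blend gyro integration (smooth but drifting) with accel attitude (noisy but drift-free)
        roll = alpha * (roll + gx * dt) + (1 - alpha) * roll_acc
        pitch = alpha * (pitch + gy * dt) + (1 - alpha) * pitch_acc
        out.append((roll, pitch))
    return np.array(out)
```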
IMU + X (GNSS, 6-DoF odometry) loosely-coupled fusion localization based on ESKF, IEKF, UKF (UKF/SPKF, JUKF, SVD-UKF) and MAP
- Updated Mar 28, 2025 - C++
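To make the "loosely coupled" idea in the entry above concrete, here is a toy 1D Kalman-filter sketch: the IMU drives the prediction step and occasional GNSS position fixes drive the correction step. The repository implements full error-state and unscented filters in 3D; the state layout, noise values, and update rate below are assumptions.

```python
import numpy as np

def loosely_coupled_kf(accels, gnss_positions, dt=0.01, gnss_every=100):
    """Toy 1D loosely coupled fusion: predict with IMU acceleration,
    correct with occasional GNSS position fixes.

    State x = [position, velocity]; all noise parameters are illustrative.
    """
    x = np.zeros(2)                       # state estimate
    P = np.eye(2)                         # state covariance
    F = np.array([[1, dt], [0, 1]])       # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])       # acceleration input model
    Q = 1e-3 * np.eye(2)                  # process noise
    H = np.array([[1.0, 0.0]])            # GNSS observes position only
    R = np.array([[2.0]])                 # GNSS position noise (m^2)

    history = []
    for k, a in enumerate(accels):
        # Prediction step driven by the IMU measurement
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Correction step whenever a GNSS fix is available
        if k % gnss_every == 0 and k // gnss_every < len(gnss_positions):
            z = np.array([gnss_positions[k // gnss_every]])
            y = z - H @ x                          # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        history.append(x.copy())
    return np.array(history)
```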
Tightly coupled GNSS-Visual-Inertial system for locally smooth and globally consistent state estimation in complex environments.
- Updated Sep 11, 2021 - C++
Implementation of Tightly Coupled 3D Lidar Inertial Odometry and Mapping (LIO-mapping)
- Updated Feb 13, 2020 - C++
alfred-py: a deep-learning utility library built for humans; more detail on how to use the library at https://zhuanlan.zhihu.com/p/341446046
- Updated Apr 11, 2025 - Python
X Inertial-aided Visual Odometry
- Updated Feb 24, 2023 - C++
A general framework for map-based visual localization. It contains 1) map generation, which supports traditional or deep-learning features; 2) hierarchical localization in a visual (point or line) map; 3) a fusion framework with IMU, wheel odometry, and GPS sensors.
- Updated Oct 28, 2020
An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization! 🛰
- Updated Mar 15, 2019
LiLi-OM is a tightly coupled, keyframe-based LiDAR-inertial odometry and mapping system for both solid-state and conventional LiDARs.
- Updated Mar 18, 2023 - C++
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st place on KITTI) [MVA 2019]
- Updated May 1, 2022 - Python
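To illustrate the depth-completion task in the entry above, here is a minimal sparse-to-dense sketch that fills missing LiDAR depth with the nearest valid value. The repository itself uses a learned, RGB-guided network; this nearest-neighbor baseline and its function name are only illustrative assumptions about the input/output format.

```python
import numpy as np
from scipy import ndimage

def naive_depth_completion(sparse_depth):
    """Fill a sparse depth map (0 = no LiDAR return) with the nearest valid depth.

    This is only a nearest-neighbor baseline; learned methods additionally use
    the RGB image to produce sharper, more accurate dense depth.
    """
    invalid = sparse_depth <= 0
    # For every invalid pixel, find the indices of the nearest valid pixel
    _, (rows, cols) = ndimage.distance_transform_edt(invalid, return_indices=True)
    return sparse_depth[rows, cols]
```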
HybVIO visual-inertial odometry and SLAM system
- Updated May 5, 2022 - C++
Official code for "EagerMOT: 3D Multi-Object Tracking via Sensor Fusion" [ICRA 2021]
- Updated Nov 23, 2022 - Python
This is a package for extrinsic calibration between a 3D LiDAR and a camera, described in the paper Improvements to Target-Based 3D LiDAR to Camera Calibration. The package is used for Cassie Blue's 3D LiDAR semantic mapping and automation.
- Updated Oct 21, 2021 - MATLAB
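For context on what the calibration result in the entry above is used for, here is a short Python sketch that projects LiDAR points into the image plane given the estimated extrinsic transform and the camera intrinsics. It is not the repository's MATLAB pipeline, and the matrix names and conventions are assumptions.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project 3D LiDAR points into pixel coordinates.

    points_lidar: (N, 3) points in the LiDAR frame
    T_cam_lidar:  (4, 4) extrinsic transform LiDAR -> camera (the calibration result)
    K:            (3, 3) camera intrinsic matrix
    Returns (M, 2) pixel coordinates for the points in front of the camera.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # homogeneous
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]      # transform into the camera frame
    in_front = pts_cam[:, 2] > 0                    # keep points with positive depth
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T                          # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]                   # perspective divide -> pixels
```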
Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA 2024)
- Updated Apr 17, 2025 - C++
MaRS: A Modular and Robust Sensor-Fusion Framework
- Updated Apr 23, 2025 - C++