# sensor-fusion

Here are 766 public repositories matching this topic...

[ICRA'23] BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation

  • Updated Jul 31, 2024
  • Python

A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly-coupled state estimation and mapping package

  • Updated Dec 4, 2025
  • C++

A Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry (LIVO).

  • Updated Sep 12, 2025
  • C++

[PAMI'23] TransFuser: Imitation with Transformer-Based Sensor Fusion for Autonomous Driving; [CVPR'21] Multi-Modal Fusion Transformer for End-to-End Autonomous Driving

  • Updated Oct 19, 2025
  • Python

Arduino sketches for MPU9250 9DoF with AHRS sensor fusion

  • Updated May 11, 2019
  • C++
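
AHRS fusion of this kind typically blends fast-but-drifting gyro integration with slow-but-absolute accelerometer tilt. As an illustration only (not the Arduino sketches' actual code), here is a minimal complementary filter for pitch in Python; the function name and the 0.98 blend factor are assumptions:

```python
import math

def complementary_pitch(pitch_prev, gyro_y_dps, accel, dt, alpha=0.98):
    """Blend gyro integration (smooth but drifts) with the
    accelerometer's gravity-referenced pitch (noisy but drift-free)."""
    ax, ay, az = accel                      # accelerometer reading in g
    # Pitch implied by the gravity vector alone
    pitch_accel = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Pitch from integrating the gyro rate (deg/s)
    pitch_gyro = pitch_prev + gyro_y_dps * dt
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```

Run once per IMU sample; a full AHRS (e.g. Madgwick or Mahony, commonly used with MPU9250-class sensors) extends the same idea to all three axes plus the magnetometer.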

Tightly coupled GNSS-visual-inertial system for locally smooth and globally consistent state estimation in complex environments.

  • Updated Sep 11, 2021
  • C++

IMU + X(GNSS, 6DoF Odom) Loosely-Coupled Fusion Localization based on ESKF, IEKF, UKF(UKF/SPKF, JUKF, SVD-UKF) and MAP

  • Updated Mar 28, 2025
  • C++
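
Loosely-coupled IMU + GNSS fusion of the kind listed above feeds IMU data into the filter's prediction step and GNSS fixes into its update step. Below is a minimal 1-D (position/velocity) linear Kalman sketch; the function names and noise values are hypothetical, and the repository's ESKF/UKF variants refine this same predict/update loop:

```python
import numpy as np

def kf_predict(x, P, a, dt, q):
    """Propagate state [position, velocity] with an IMU acceleration input."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input model
    x = F @ x + B * a
    P = F @ P @ F.T + q * np.eye(2)         # inflate uncertainty over time
    return x, P

def kf_update_gnss(x, P, z_pos, r):
    """Correct with a GNSS position fix (loosely coupled: position only)."""
    H = np.array([[1.0, 0.0]])              # GNSS observes position only
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain
    x = x + (K * (z_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

An ESKF applies the same gain computation to the error state around an IMU-propagated nominal trajectory, which keeps the linearization accurate for attitude.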

Implementation of Tightly Coupled 3D Lidar Inertial Odometry and Mapping (LIO-mapping)

  • Updated Feb 13, 2020
  • C++

alfred-py: A deep learning utility library for **humans**; for more detail on library usage, see https://zhuanlan.zhihu.com/p/341446046

  • Updated Nov 24, 2025
  • Python

X Inertial-aided Visual Odometry

  • Updated Feb 24, 2023
  • C++

A general framework for map-based visual localization. It contains: 1) map generation, supporting traditional or deep-learning features; 2) hierarchical localization in a visual (point or line) map; 3) a fusion framework with IMU, wheel odometry, and GPS sensors.

  • Updated Oct 28, 2020

An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization! 🛰

  • Updated Mar 15, 2019

LiLi-OM is a tightly-coupled, keyframe-based LiDAR-inertial odometry and mapping system for both solid-state and conventional LiDARs.

  • Updated Mar 18, 2023
  • C++

A highly robust and accurate LiDAR-only and LiDAR-inertial odometry system

  • Updated Nov 8, 2025
  • C++

HybVIO visual-inertial odometry and SLAM system

  • Updated May 5, 2022
  • C++

Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st place on KITTI) [MVA 2019]

  • Updated May 1, 2022
  • Python
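
Depth completion as described above densifies sparse LiDAR returns, typically with learned RGB guidance. As a baseline sketch only (not the ranked method), a nearest-valid-pixel fill shows the task's input/output shape; `densify_nearest` is a hypothetical name and the brute-force search is O(H·W·N):

```python
import numpy as np

def densify_nearest(sparse_depth):
    """Fill invalid (<= 0) pixels with the spatially nearest valid depth."""
    h, w = sparse_depth.shape
    valid = np.argwhere(sparse_depth > 0)        # (row, col) of LiDAR returns
    vals = sparse_depth[sparse_depth > 0]
    out = sparse_depth.copy()
    for r in range(h):
        for c in range(w):
            if out[r, c] <= 0:
                d2 = (valid[:, 0] - r) ** 2 + (valid[:, 1] - c) ** 2
                out[r, c] = vals[np.argmin(d2)]  # copy nearest valid value
    return out
```

Learned methods replace the nearest-neighbor rule with a network conditioned on the RGB image, which is what "guided by RGB images" refers to.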

Official code for "EagerMOT: 3D Multi-Object Tracking via Sensor Fusion" [ICRA 2021]

  • Updated Nov 23, 2022
  • Python

Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA2024)

  • Updated Sep 23, 2025
  • C++

[T-RO 24] Swarm-LIO2: Decentralized, Efficient LiDAR-inertial Odometry for UAV Swarms

  • Updated May 6, 2025
  • C++


