# sensor-fusion

Here are 684 public repositories matching this topic...

[ICRA'23] BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation

  • Updated Jul 31, 2024
  • Python

A robust, real-time, RGB-colored, tightly-coupled LiDAR-inertial-visual state estimation and mapping package

  • Updated May 28, 2024
  • C++

A Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry (LIVO).

  • Updated Nov 6, 2024
  • C++

[PAMI'23] TransFuser: Imitation with Transformer-Based Sensor Fusion for Autonomous Driving; [CVPR'21] Multi-Modal Fusion Transformer for End-to-End Autonomous Driving

  • Updated Jun 28, 2024
  • Python

Arduino sketches for MPU9250 9DoF with AHRS sensor fusion

  • Updated May 11, 2019
  • C++
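AHRS-style sensor fusion of this kind typically blends gyro integration (smooth but drifting) with the accelerometer's gravity reference (noisy but drift-free). A minimal single-axis complementary filter, written here as an illustrative sketch and not code from the repository above, looks like:

```python
# Minimal complementary-filter sketch (illustrative; not the repo's AHRS code).
# Fuses a gyro rate with an accelerometer tilt angle for a single axis.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer angle.
    alpha close to 1 trusts the gyro short-term; the small (1 - alpha)
    accelerometer term slowly pulls out gyro drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: stationary sensor with a 0.5 deg/s gyro bias; accel reads 0 deg.
angle = 0.0
for _ in range(100):  # 1 s of samples at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=0.0, dt=0.01)
print(round(angle, 3))  # bounded near 0.2 deg instead of drifting to 0.5 deg
```

Pure gyro integration would accumulate the full 0.5 deg of bias over that second (and grow without bound after it); the accelerometer term keeps the estimate bounded near zero.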

IMU + X (GNSS, 6-DoF odometry) loosely-coupled fusion localization based on ESKF, IEKF, UKF (UKF/SPKF, JUKF, SVD-UKF), and MAP

  • Updated Mar 28, 2025
  • C++
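The common core of the Kalman-filter variants listed above is a Gaussian measurement update: a propagated state estimate is corrected by a measurement, weighted by their relative uncertainties. A minimal scalar version (an illustrative sketch only, not the repository's ESKF/IEKF/UKF implementations) is:

```python
# Minimal scalar Kalman-style fusion sketch (illustrative only; the repo
# implements full ESKF/IEKF/UKF filters over IMU + GNSS/odometry states).

def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two Gaussian estimates of the same quantity.
    The result is inverse-variance weighted, and its variance is
    never larger than the better input's."""
    k = var_a / (var_a + var_b)            # Kalman gain
    mean = mean_a + k * (mean_b - mean_a)  # weighted mean
    var = (1.0 - k) * var_a                # reduced variance
    return mean, var

# IMU-propagated position (uncertain) fused with a tighter GNSS fix:
mean, var = fuse(10.0, 4.0, 12.0, 1.0)
print(round(mean, 3), round(var, 3))  # 11.6 0.8
```

Note how the fused mean lands closer to the lower-variance GNSS measurement, and the fused variance (0.8) is below both inputs.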

Tightly coupled GNSS-visual-inertial system for locally smooth and globally consistent state estimation in complex environments.

  • Updated Sep 11, 2021
  • C++

Implementation of Tightly Coupled 3D Lidar Inertial Odometry and Mapping (LIO-mapping)

  • Updated Feb 13, 2020
  • C++

alfred-py: a deep learning utility library for **humans**; more details on usage: https://zhuanlan.zhihu.com/p/341446046

  • Updated Apr 11, 2025
  • Python

X Inertial-aided Visual Odometry

  • Updated Feb 24, 2023
  • C++

A general framework for map-based visual localization. It contains: 1) map generation, supporting traditional or deep-learning features; 2) hierarchical visual localization in a visual (point or line) map; 3) a fusion framework with IMU, wheel odometry, and GPS sensors.

  • Updated Oct 28, 2020

An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization! 🛰

  • Updated Mar 15, 2019

LiLi-OM is a tightly-coupled, keyframe-based LiDAR-inertial odometry and mapping system for both solid-state and conventional LiDARs.

  • Updated Mar 18, 2023
  • C++

Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st place on KITTI) [MVA 2019]

  • Updated May 1, 2022
  • Python
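Depth completion takes a sparse depth image (LiDAR returns projected into the camera frame, with most pixels empty) and fills in the missing values. The top-ranked method above is a learned, RGB-guided network; the toy row-wise fill below only illustrates the sparse-input/dense-output shape of the problem:

```python
# Toy sparse-to-dense depth fill (illustrative only; the repo's method is a
# learned, RGB-guided network). Missing pixels are marked with 0.0.

def fill_rows(depth, invalid=0.0):
    """Propagate the last valid depth left-to-right along each row."""
    out = [row[:] for row in depth]  # copy; leave the input untouched
    for row in out:
        last = invalid
        for c, d in enumerate(row):
            if d != invalid:
                last = d             # remember the latest valid return
            elif last != invalid:
                row[c] = last        # fill the gap with it
    return out

sparse = [[0.0, 2.0, 0.0, 0.0, 5.0],
          [1.0, 0.0, 0.0, 3.0, 0.0]]
print(fill_rows(sparse))
# [[0.0, 2.0, 2.0, 2.0, 5.0], [1.0, 1.0, 1.0, 3.0, 3.0]]
```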
HybVIO visual-inertial odometry and SLAM system

  • Updated May 5, 2022
  • C++

Official code for "EagerMOT: 3D Multi-Object Tracking via Sensor Fusion" [ICRA 2021]

  • Updated Nov 23, 2022
  • Python

This is a package for extrinsic calibration between a 3D LiDAR and a camera, described in the paper "Improvements to Target-Based 3D LiDAR to Camera Calibration". It is used for Cassie Blue's 3D LiDAR semantic mapping and automation.

  • Updated Oct 21, 2021
  • MATLAB
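Extrinsic calibration estimates the rigid transform [R|t] that, combined with the camera intrinsics, maps LiDAR points into image pixels. The toy pinhole projection below (all values made up, not from the repository) shows how estimated extrinsics are used once calibration is done:

```python
# Toy pinhole projection of a LiDAR point into an image (illustrative only;
# the repo estimates R and t from calibration targets, it does not use
# these made-up values).

def project(point, R, t, fx, fy, cx, cy):
    """Apply the rigid transform [R|t], then the pinhole intrinsics."""
    x = sum(R[0][i] * point[i] for i in range(3)) + t[0]
    y = sum(R[1][i] * point[i] for i in range(3)) + t[1]
    z = sum(R[2][i] * point[i] for i in range(3)) + t[2]
    return fx * x / z + cx, fy * y / z + cy  # pixel coordinates (u, v)

# Identity rotation, 10 cm forward offset, 700 px focal length, VGA-ish center:
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
u, v = project([0.5, -0.2, 4.0], I3, [0.0, 0.0, 0.1], 700, 700, 640, 360)
print(round(u, 1), round(v, 1))  # 725.4 325.9
```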

Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA2024)

  • Updated Apr 17, 2025
  • C++

MaRS: A Modular and Robust Sensor-Fusion Framework

  • Updated Apr 23, 2025
  • C++
