Visual-Inertial State Initialization
- Estimator initialization in vision-aided inertial navigation with unknown camera-IMU calibration - Link
- Closed-form Solutions for Vision-aided Inertial Navigation - Link
- Closed-form solution of visual-inertial structure from motion - Link
- Simultaneous state initialization and gyroscope bias calibration in visual inertial aided navigation - Link
- Robust initialization of monocular visual-inertial estimation on aerial robots - Link
- A convex formulation for motion estimation using visual and inertial sensors - Link
- VINS-Mono: A robust and versatile monocular visual-inertial state estimator - Link
- Visual-inertial monocular SLAM with map reuse - Link
- Fast and Robust Initialization for Visual-Inertial SLAM - Link
- Inertial-only optimization for visual-inertial initialization - Link
- Monocular visual–inertial state estimation with online initialization and camera–IMU extrinsic calibration - Link
- Revisiting visual-inertial structure from motion for odometry and SLAM initialization - Link
- An Analytical Solution to the IMU Initialization Problem for Visual-Inertial Systems - Link
- Mid-Air Range-Visual-Inertial Estimator Initialization for Micro Air Vehicles - Link
- A Rotation-Translation-Decoupled Solution for Robust and Efficient Visual-Inertial Initialization - Link
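Several of the initialization papers above (VINS-Mono-style and the closed-form solutions) reduce velocity/gravity alignment to a linear least-squares problem. A minimal sketch of that idea, assuming known keyframe rotations and metric positions; all variable names and toy values are illustrative, not taken from any of the papers:

```python
import numpy as np

rng = np.random.default_rng(0)
N, dt = 4, 0.1                      # keyframe intervals and spacing (toy values)
g_true = np.array([0.0, 0.0, -9.81])

# Simulate ground truth consistent with the discrete kinematic model
#   p_{k+1} = p_k + v_k*dt + 0.5*g*dt^2 + R_k @ alpha_k
#   v_{k+1} = v_k + g*dt + R_k @ beta_k
# where alpha_k, beta_k stand in for IMU preintegration terms.
R = [np.eye(3) for _ in range(N)]   # keyframe rotations (identity for the toy)
beta = rng.normal(size=(N, 3)) * 0.1
alpha = rng.normal(size=(N, 3)) * 0.01
v = [rng.normal(size=3)]
p = [np.zeros(3)]
for k in range(N):
    p.append(p[k] + v[k] * dt + 0.5 * g_true * dt**2 + R[k] @ alpha[k])
    v.append(v[k] + g_true * dt + R[k] @ beta[k])

# Stack the constraints into A x = b with x = [v_0, ..., v_N, g]
n = 3 * (N + 1) + 3
A, b = [], []
for k in range(N):
    # position constraint: v_k*dt + 0.5*dt^2*g = p_{k+1} - p_k - R_k@alpha_k
    row = np.zeros((3, n))
    row[:, 3*k:3*k+3] = np.eye(3) * dt
    row[:, -3:] = 0.5 * dt**2 * np.eye(3)
    A.append(row); b.append(p[k+1] - p[k] - R[k] @ alpha[k])
    # velocity constraint: v_{k+1} - v_k - dt*g = R_k@beta_k
    row = np.zeros((3, n))
    row[:, 3*(k+1):3*(k+1)+3] = np.eye(3)
    row[:, 3*k:3*k+3] = -np.eye(3)
    row[:, -3:] = -dt * np.eye(3)
    A.append(row); b.append(R[k] @ beta[k])

x, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
g_est = x[-3:]
print(np.round(g_est, 3))           # recovers g ~ [0, 0, -9.81] (noise-free toy)
```

The real methods additionally solve for metric scale (monocular case) and gyroscope bias, and refine gravity on its 2-DoF tangent space; this sketch omits all of that.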
Open-Sourced Visual-Inertial Codebases
- rpng / OpenVINS - Link
- rpng / ov_plane - Link
- rpng / ov_secondary - Link
- rpng / ov_maplab - Link
- rpng / R-VIO - Link
- ethz-asl / okvis - Link
- ethz-asl / maplab - Link
- ethz-asl / rovio - Link
- TUM / basalt - Link
- HKUST-Aerial-Robotics / VINS-Fusion - Link
- HKUST-Aerial-Robotics / VINS-Mono - Link
- MIT-SPARK / Kimera-VIO - Link
- ucla-vision / xivo - Link
- KumarRobotics / msckf_vio - Link
- jpl-x / x - Link
- aau-cns / MSCEqF - Link
- uzh-rpg / rpg_svo_pro_open - Link
Continuous-Time Trajectory Estimation
- Unified temporal and spatial calibration for multi-sensor systems - Link
- Spline Fusion: A continuous-time representation for visual-inertial fusion with application to rolling shutter cameras - Link
- Continuous-time visual-inertial odometry for event cameras - Link
- Efficient Visual-Inertial Navigation using a Rolling-Shutter Camera with Inaccurate Timestamps - Link
- Alternating-stereo VINS: Observability analysis and performance evaluation - Link
- Multi-camera visual-inertial navigation with online intrinsic and extrinsic calibration - Link
- Decoupled Representation of the Error and Trajectory Estimates for Efficient Pose Estimation - Link
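The continuous-time methods above typically represent the trajectory with uniform B-splines, which can be queried for pose and its time derivatives at any instant. A minimal position-only sketch (the papers use splines on SE(3) or SO(3)×R^3; names here are illustrative):

```python
import numpy as np

# Uniform cubic B-spline basis matrix (columns weight [1, u, u^2, u^3])
M = (1.0 / 6.0) * np.array([[1, 4, 1, 0],
                            [-3, 0, 3, 0],
                            [3, -6, 3, 0],
                            [-1, 3, -3, 1]])

def spline_eval(ctrl, t, dt):
    """Position and velocity of a uniform cubic B-spline at time t.
    ctrl: (K, 3) control points spaced dt apart, spline starting at t = 0."""
    i = int(t / dt)                        # active segment index
    u = t / dt - i                         # normalized time within the segment
    P = ctrl[i:i + 4]                      # the four control points affecting t
    U = np.array([1.0, u, u**2, u**3])
    dU = np.array([0.0, 1.0, 2*u, 3*u**2]) / dt   # chain rule du/dt
    return U @ M @ P, dU @ M @ P           # position, velocity

# Control points on a straight line: cubic B-splines reproduce linear motion,
# so pos = [1.75, 3.5, 0] and vel = [1, 2, 0] at t = 0.75 with dt = 1.
ctrl = np.outer(np.arange(6, dtype=float), np.array([1.0, 2.0, 0.0]))
pos, vel = spline_eval(ctrl, t=0.75, dt=1.0)
print(pos, vel)
```

Because the spline is C2-continuous, the same representation yields analytic accelerations for IMU cost terms and per-row poses for rolling-shutter cameras.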
Machine Learning Uncertainty
- Uncertainty in Deep Learning - Link
- Geometry and uncertainty in deep learning for computer vision - Link
- Modelling uncertainty in deep learning for camera relocalization - Link
- Dropout as a Bayesian approximation: Representing model uncertainty in deep learning - Link
- What uncertainties do we need in Bayesian deep learning for computer vision? - Link
- Multi-task learning using uncertainty to weigh losses for scene geometry and semantics - Link
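The Monte Carlo dropout idea from the papers above (keep dropout active at test time, treat the spread of stochastic forward passes as epistemic uncertainty) fits in a few lines. A toy numpy sketch with made-up weights standing in for a trained network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-hidden-layer regressor with fixed random weights (a stand-in for a
# network trained with dropout; nothing here is trained).
W1, b1 = rng.normal(size=(16, 1)), np.zeros(16)
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)
p_drop = 0.2

def forward(x, stochastic):
    h = np.maximum(0.0, W1 @ x + b1)           # ReLU hidden layer
    if stochastic:                             # dropout stays ON at test time
        mask = rng.random(h.shape) >= p_drop
        h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    return W2 @ h + b2

# Monte Carlo dropout: T stochastic passes approximate the posterior
# predictive; the sample spread is the model (epistemic) uncertainty.
x = np.array([0.5])
samples = np.array([forward(x, stochastic=True) for _ in range(200)])
mean, std = samples.mean(), samples.std()
print(f"prediction {mean:.3f} +/- {std:.3f}")
```

The aleatoric (data) uncertainty discussed in the same papers is handled differently, by having the network predict an observation variance and training with a heteroscedastic loss.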
Resource Constrained Extended Kalman Filtering
- A provably consistent method for imposing sparsity in feature-based SLAM information filters - Link
- Optimization-based estimator design for vision-aided inertial navigation - Link
- Vision-aided inertial navigation for resource-constrained systems - Link
- Power-SLAM: a linear-complexity, anytime algorithm for SLAM - Link
- A resource-aware vision-aided inertial navigation system for wearable and portable computers - Link
- An iterative Kalman smoother for robust 3D localization and mapping - Link
- Inverse Schmidt Estimators - Link
- Consistent map-based 3D localization on mobile devices - Link
- RISE-SLAM: A Resource-aware Inverse Schmidt Estimator for SLAM - Link
- Markov Parallel Tracking and Mapping for Probabilistic SLAM - Link
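Several entries above build on the Schmidt estimator, which saves computation and preserves consistency by never updating a set of "nuisance" states (e.g., a large map) while still tracking their cross-correlations. A minimal sketch of one Schmidt-Kalman update; the partitioning and toy numbers are illustrative:

```python
import numpy as np

def schmidt_update(x, P, z, H, R, n_active):
    """One Schmidt-Kalman update: only the first n_active states are
    corrected; nuisance states keep their mean, and the Joseph-form
    covariance update remains valid for the suboptimal (zeroed) gain."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # full optimal Kalman gain
    K[n_active:] = 0.0                   # Schmidt: freeze the nuisance block
    x = x + K @ (z - H @ x)
    I_KH = np.eye(len(x)) - K @ H
    P = I_KH @ P @ I_KH.T + K @ R @ K.T  # Joseph form: consistent for any K
    return x, P

# Two active states plus one nuisance state (e.g., a static map landmark)
x = np.zeros(3)
P = np.diag([1.0, 1.0, 0.5])
H = np.array([[1.0, 0.0, 1.0]])          # measurement touches a nuisance state
z = np.array([0.8])
R = np.array([[0.1]])
x_new, P_new = schmidt_update(x, P, z, H, R, n_active=2)
print(x_new)                             # nuisance mean stays at zero
```

The cost saving in the papers above comes from exploiting the structure of the zeroed gain so the nuisance-nuisance covariance block need not be touched; this dense sketch shows only the estimator semantics.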
Event-based Cameras
- Event-based Vision Resources (master list) - Link
- Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera - Link
- Continuous-Time Trajectory Estimation for Event-based Vision Sensors - Link
- Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras - Link
- EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real-time - Link
- Event-based Visual Inertial Odometry - Link
- EKLT: Asynchronous, Photometric Feature Tracking using Events and Frames - Link
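Event cameras output an asynchronous stream of per-pixel brightness changes rather than frames, so front-ends like the trackers above first aggregate events into an image-like representation. One common choice is a time surface, sketched here (field names are illustrative):

```python
import numpy as np

# Events are (x, y, t, polarity) tuples. A "time surface" keeps, per pixel,
# the timestamp (and polarity) of the most recent event, giving a dense
# representation that event-based tracking front-ends can operate on.
def time_surface(events, width, height):
    ts = np.full((height, width), -np.inf)
    pol = np.zeros((height, width), dtype=np.int8)
    for x, y, t, p in events:        # later events overwrite earlier ones
        ts[y, x] = t
        pol[y, x] = p
    return ts, pol

events = [(1, 0, 0.010, 1), (1, 0, 0.020, -1), (3, 2, 0.015, 1)]
ts, pol = time_surface(events, width=4, height=3)
print(ts[0, 1], pol[0, 1])           # -> 0.02 -1 (latest event at that pixel)
```

Continuous-time trajectory models (previous section) pair naturally with this data, since each event carries its own timestamp.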
Rolling Shutter Cameras
- MIMC-VINS: A Versatile and Resilient Multi-IMU Multi-Camera Visual-Inertial Navigation System - Link
- Rolling Shutter Camera Calibration - Link
- Continuous-time batch trajectory estimation using temporal basis functions - Link
- Vision-aided inertial navigation with rolling-shutter cameras - Link
- Efficient Visual-Inertial Navigation using a Rolling-Shutter Camera with Inaccurate Timestamps - Link
- Real-time Motion Tracking on a Cellphone using Inertial Sensing and a Rolling-Shutter Camera - Link
- 3-D Motion Estimation and Online Temporal Calibration for Camera-IMU Systems - Link
- High-fidelity Sensor Modeling and Self-Calibration in Vision-aided Inertial Navigation - Link
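The common thread in the papers above is that a rolling-shutter image is not captured at a single instant: each row is exposed at a slightly different time during readout, so every observed feature must be paired with a pose at its row's timestamp. A minimal sketch, with linear pose interpolation standing in for the spline or IMU-propagated models the papers actually use:

```python
import numpy as np

def row_time(t_frame, row, num_rows, t_readout, t_offset=0.0):
    # Each row is exposed later than the first by a fraction of the full
    # readout time; t_offset models the camera-IMU time offset, which
    # several of the papers above estimate online.
    return t_frame + t_offset + (row / num_rows) * t_readout

def pose_at(t, t0, t1, p0, p1):
    # Linear interpolation between two known poses (positions only here).
    a = (t - t0) / (t1 - t0)
    return (1.0 - a) * p0 + a * p1

# A feature observed at row 360 of a 480-row image with 30 ms readout
t = row_time(t_frame=10.0, row=360, num_rows=480, t_readout=0.030)
p = pose_at(t, 10.0, 10.1, np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
print(t, p)   # t ~ 10.0225; camera has moved ~0.225 m along x by that row
```

Ignoring this per-row delay biases the reprojection residuals whenever the camera moves during readout, which is why rolling shutter and online temporal calibration are usually treated together.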
Historical VINS Literature
- A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation - Link
- Determining the Time Delay Between Inertial and Visual Sensor Measurements - Link
- A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation - Link
- Vision-Aided Inertial Navigation for Spacecraft Entry, Descent, and Landing - Link
Original ORB-SLAM
- ORB: an efficient alternative to SIFT or SURF - Link
- Bags of Binary Words for Fast Place Recognition in Image Sequences - Link
- ORB-SLAM: Tracking and Mapping Recognizable Features - Link
- ORB-SLAM: a Versatile and Accurate Monocular SLAM System - Link
- Parallel Tracking and Mapping for Small AR Workspaces - Link
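ORB's binary descriptors are what make the matching and place recognition in the papers above fast: comparison is a Hamming distance (XOR plus popcount), and ambiguous matches are pruned with a ratio test. A pure-Python sketch on toy 8-bit descriptors (real ORB descriptors are 256-bit):

```python
# Binary descriptors are matched by Hamming distance; a Lowe-style ratio
# test rejects matches whose best distance is not clearly better than the
# second best. Descriptors are plain ints here for illustration.
def hamming(a, b):
    return bin(a ^ b).count("1")

def match(query, train, ratio=0.8):
    matches = []
    for qi, q in enumerate(query):
        d = sorted((hamming(q, t), ti) for ti, t in enumerate(train))
        best, second = d[0], d[1]
        if best[0] < ratio * second[0]:   # ratio test against the runner-up
            matches.append((qi, best[1]))
    return matches

query = [0b10110010, 0b01100101]
train = [0b10110011, 0b11111111, 0b01100101]
print(match(query, train))                # -> [(0, 0), (1, 2)]
```

The bag-of-binary-words approach in the DBoW paper above quantizes the same descriptors into a vocabulary tree so that place recognition avoids this brute-force comparison altogether.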