Empowering autonomous indoor navigation with informed machine learning techniques

G Uganya, I Bremnavas, KV Prashanth… - Computers and …, 2023 - Elsevier
This paper proposes the application of informed machine learning techniques to enhance the
performance of an autonomous indoor navigation system by leveraging prior knowledge …

MOLO-SLAM: A Semantic SLAM for Accurate Removal of Dynamic Objects in Agricultural Environments

J Lv, B Yao, H Guo, C Gao, W Wu, J Li, S Sun, Q Luo - Agriculture, 2024 - mdpi.com
Visual simultaneous localization and mapping (VSLAM) is a foundational technology that
enables robots to achieve fully autonomous locomotion, exploration, inspection, and more …

Evolution of SLAM: Toward the Robust-Perception of Autonomy

B Udugama - arXiv preprint arXiv:2302.06365, 2023 - arxiv.org
Simultaneous localisation and mapping (SLAM) is the problem of an autonomous robot
constructing or updating a map of an unknown, unstructured environment while …

An automatic driving trajectory planning approach in complex traffic scenarios based on integrated driver style inference and deep reinforcement learning

Y Liu, S Diao - PLoS one, 2024 - journals.plos.org
As autonomous driving technology continues to advance and gradually become a reality,
ensuring the safety of autonomous driving in complex traffic scenarios has become a key …

Single Frame Lidar-Camera Calibration Using Registration of 3D Planes

A Singandhupe, HM La, QP Ha - 2022 Sixth IEEE International …, 2022 - ieeexplore.ieee.org
This work focuses on finding the extrinsic parameters (rotation and translation) between
Lidar and an RGB camera sensor. We use a planar checkerboard and place it inside the …

Direct Monocular Visual Odometry Based on Lidar Vision Fusion

B Fang, Q Pan, H Wang - 2023 WRC Symposium on Advanced …, 2023 - ieeexplore.ieee.org
Lidar-assisted visual odometry is a widely used method for pose estimation. However,
existing lidar-visual odometry methods suffer from depth association errors, and single-point …

Ultra-high-accuracy visual marker for indoor precise positioning

H Tanaka - 2020 IEEE International Conference on Robotics …, 2020 - ieeexplore.ieee.org
Indoor positioning technology is essential for indoor mobile robots and drones. However,
there has never been a general-purpose technology or infrastructure that enables indoor …

A real-time fusion framework for long-term visual localization

Y Yang, X Zhang, S Gao, J Wan, Y Ping, Y Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Visual localization is a fundamental task that regresses the 6 Degrees of Freedom (6DoF)
pose from image features in order to serve high-precision localization requests in many …

Autonomous Navigation by Mobile Robot with Sensor Fusion Based on Deep Reinforcement Learning

Y Ou, Y Cai, Y Sun, T Qin - Sensors, 2024 - mdpi.com
In the domain of mobile robot navigation, conventional path-planning algorithms typically
rely on predefined rules and prior map information, which exhibit significant limitations when …

A LiDAR SLAM Method for Unmanned Vehicles in Tunnel Environments with Degenerate Point Clouds

李帅鑫, 李九人, 田滨, 陈龙, 王力, 李广云 - 2021 - xb.chinasmp.com
Based on LiDAR simultaneous localization and mapping (SLAM) technology, a vehicle can not
only localize itself in real time in an unknown environment but also efficiently acquire three-dimensional geospatial information about that environment …