This paper describes the implementation of an Autonomous Mobile Robot able to navigate its environment by combining range and odometry data from LiDAR and wheel-encoder sensors within the Robot Operating System (ROS) framework. The SLAM algorithm uses this sensory information to produce a static map of the environment, which the framework's navigation stack then relies upon: the sensor data is used to localize the robot in the map and to compute an optimal trajectory towards a set destination while avoiding static and dynamic obstacles. The system is tested in simulated and real scenarios, and the main challenges of mapping and navigation are surveyed. The different approaches are then discussed with a particular focus on their robustness, by studying their shortcomings and advantages. This paper ultimately aims to guide the reader through the steps needed to implement the described system and outlines the best practices that lead to a sound solution.
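To make the odometry side of the sensor fusion concrete, the following is a minimal sketch (not taken from the paper) of dead-reckoning pose integration for a differential-drive robot from wheel-encoder ticks. All parameter values (TICKS_PER_REV, WHEEL_RADIUS, WHEEL_BASE) are illustrative assumptions, not the robot's actual specifications.

```python
import math

# Assumed example parameters, not the paper's hardware values.
TICKS_PER_REV = 4096      # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.035      # wheel radius in metres
WHEEL_BASE = 0.23         # distance between the two wheels in metres

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one pair of encoder tick deltas into the pose (x, y, theta)."""
    # Convert tick deltas to arc length travelled by each wheel.
    dl = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    dc = (dl + dr) / 2                 # distance travelled by the robot centre
    dtheta = (dr - dl) / WHEEL_BASE    # change in heading
    # Midpoint integration of the unicycle motion model.
    x += dc * math.cos(theta + dtheta / 2)
    y += dc * math.sin(theta + dtheta / 2)
    theta = (theta + dtheta + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    return x, y, theta

# Driving straight: equal tick counts advance x and leave the heading unchanged.
x, y, theta = update_pose(0.0, 0.0, 0.0, 2048, 2048)
```

In a ROS system this integration is typically done by the base driver, which publishes the result as odometry for SLAM and the navigation stack to fuse with the LiDAR data.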
Video 1: Multi-Sensor Integration. The robot decides, based on multiple sensor data, whether the area underneath the table allows passage. Video 2: The robot finds its way through a given maze. It uses SLAM (simultaneous localization and mapping) to build a map of the maze and eventually find its way through it.
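The map-building step shown in the maze video can be illustrated with a toy occupancy-grid update: cells along a single LiDAR ray are marked free up to the hit point, which is marked occupied. This is a simplified sketch, not the paper's SLAM implementation; the grid size and resolution are assumptions chosen for the example.

```python
import math

# Assumed example parameters for the toy grid.
RESOLUTION = 0.1          # metres per cell
SIZE = 50                 # grid is SIZE x SIZE cells

def integrate_ray(grid, x, y, angle, rng):
    """Mark cells along one range reading: free space up to the hit cell."""
    steps = int(rng / RESOLUTION)
    for i in range(steps):
        cx = int(round((x + i * RESOLUTION * math.cos(angle)) / RESOLUTION))
        cy = int(round((y + i * RESOLUTION * math.sin(angle)) / RESOLUTION))
        if 0 <= cx < SIZE and 0 <= cy < SIZE:
            grid[cy][cx] = 0                       # observed free
    hx = int(round((x + rng * math.cos(angle)) / RESOLUTION))
    hy = int(round((y + rng * math.sin(angle)) / RESOLUTION))
    if 0 <= hx < SIZE and 0 <= hy < SIZE:
        grid[hy][hx] = 1                           # obstacle hit

# -1 = unknown, 0 = free, 1 = occupied, mirroring the convention of ROS maps.
grid = [[-1] * SIZE for _ in range(SIZE)]
integrate_ray(grid, 2.5, 2.5, 0.0, 1.0)            # one ray along +x from (2.5, 2.5)
```

Repeating this update for every beam of every scan, with poses corrected by SLAM, is what gradually turns an unknown grid into the static map the navigation stack plans against.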