Nav2 and Autonomous Navigation | June 09, 2025 | 7 min read

ROS 2 Sensor Fusion with EKF and AMCL

Master ROS 2 sensor fusion using EKF and AMCL. Improve robot localization, correct IMU drift, and achieve accurate mapping with this hands-on guide.


Who This Is For

  • ROS 2 Learner
  • Mobile Robotics Student
  • Robotics Career Shifter

What You Will Learn

  • What Nav2 means in practical robotics.
  • How this topic connects to real robot projects.
  • What to learn or build next after this article.

When Good Maps Go Wobbly: The Case for Sensor Fusion

So far, your Raspberry Pi mapping robot has drawn its first map with LiDAR using SLAM. But maybe the corners don't quite match. Maybe the walls jitter, or the robot misplaces itself slightly every few meters. That's not a failing of your LiDAR; it's a sign you need ROS 2 sensor fusion to strengthen your robot localization strategy. Welcome to sensor fusion, where data from encoders, IMUs, and other sensors are blended to provide stable, accurate state estimation. In this post, we explore how ROS 2's Extended Kalman Filter (EKF) and Adaptive Monte Carlo Localization (AMCL) work together to minimize drift and enable precise navigation.

1. Why Sensor Fusion Matters in ROS 2

Every robot must answer a simple but essential question: "Where am I?"

  • Encoders provide distance via wheel ticks.
  • IMUs track orientation and acceleration.
  • LiDARs see walls, shapes, and space.

Each sensor brings strengths but also weaknesses. Wheel slippage, gyro drift, and scan mismatches can throw off your robot's position, especially over time. By fusing sensor data with ROS 2's sensor fusion stack, we create a more reliable and consistent sense of the robot's location in its environment.
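To build intuition for why fusing helps, here is a minimal, self-contained sketch (not robot_localization code) of inverse-variance weighting, the idea at the heart of Kalman-style fusion: the noisier estimate gets less weight, and the fused estimate is more certain than either input alone. The numbers are made up for illustration.

```python
# Minimal sketch: fusing two noisy estimates of the same quantity,
# weighted by inverse variance. Not the robot_localization implementation.
def fuse(est_a, var_a, est_b, var_b):
    """Combine two scalar estimates; the less noisy one gets more weight."""
    w_a = var_b / (var_a + var_b)
    fused = w_a * est_a + (1.0 - w_a) * est_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# Encoders say yaw = 0.10 rad (noisy after wheel slip), IMU says 0.02 rad.
yaw, var = fuse(0.10, 0.04, 0.02, 0.01)
print(round(yaw, 3), round(var, 3))  # prints: 0.036 0.008
```

Note how the fused variance (0.008) is lower than either sensor's own variance; this shrinking uncertainty is exactly what the EKF exploits cycle after cycle.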

2. Meet the Players

Here's a quick breakdown of the main sensors involved:

Sensor   | Relatable Analogy  | Measures           | Common Weakness
Encoders | Pedometer          | Wheel distance     | Slips and false ticks
IMU      | Balance board      | Tilt, spin         | Drift over time
LiDAR    | Flashlight scanner | Distances to walls | Noisy, requires a clean environment

Individually, these sensors falter. Together, they compensate for one another.

3. Install the ROS 2 Sensor Fusion Package

To begin, ensure your system includes the robot_localization package, which provides the EKF node.

sudo apt install ros-humble-robot-localization

If you're working from source or want full access to configuration, clone it into the src directory of your colcon workspace (shown here as ~/ros2_ws) and build from the workspace root:

cd ~/ros2_ws/src
git clone https://github.com/cra-ros-pkg/robot_localization.git
cd ~/ros2_ws && colcon build --packages-select robot_localization

For comprehensive details on configuring and utilizing the robot_localization package, refer to the official ROS documentation.

4. Configure the EKF Node for ROS 2 Sensor Fusion

The ekf_node is the centerpiece of ROS 2 sensor fusion. It blends data from the IMU and wheel encoders to produce an accurate robot state estimate.

Basic ekf.yaml sample:

ekf_filter_node:
  ros__parameters:
    frequency: 50.0
    sensor_timeout: 0.1
    two_d_mode: true
    publish_tf: true
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom
    odom0: /wheel_odom
    odom0_config: [true,  true,  false,
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]
    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  false, false, false,
                  false, false, false]

Adjust the odom_frame, base_link_frame, and input topics to match your specific configuration.

Run the node with your parameters file:

ros2 run robot_localization ekf_node --ros-args --params-file ekf.yaml

You can now verify output on the /odometry/filtered topic:

ros2 topic echo /odometry/filtered

This fused odometry combines encoder and IMU inputs, minimizing IMU drift and slippage issues.
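Under the hood, the EKF alternates a predict step (driven by wheel odometry) with a correct step (driven by other measurements). The toy 1-D sketch below, with made-up numbers, illustrates one cycle; the real ekf_node does this over a full 3-D state vector with proper covariance matrices.

```python
# Toy 1-D illustration of one ekf_node cycle: predict from wheel odometry,
# then correct with another sensor. Numbers are illustrative only.
def predict(x, P, delta, Q):
    """Encoder motion moves the estimate and grows its uncertainty."""
    return x + delta, P + Q

def correct(x, P, z, R):
    """Blend in a measurement z with noise R via the Kalman gain."""
    K = P / (P + R)                  # high gain when we distrust the prediction
    return x + K * (z - x), (1 - K) * P

x, P = 0.0, 0.5
x, P = predict(x, P, delta=1.0, Q=0.1)   # robot "drove" 1 m by encoders
x, P = correct(x, P, z=0.9, R=0.2)       # another sensor observes 0.9 m
print(round(x, 3), round(P, 3))          # prints: 0.925 0.15
```

The corrected estimate lands between the encoder prediction and the measurement, weighted by their respective uncertainties, which is why fused odometry drifts far less than either input.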

5. Add AMCL for Accurate Localization in a Known Map

Where EKF handles sensor fusion, AMCL (Adaptive Monte Carlo Localization) aligns your robot with a pre-built map using particle filters and real-time laser scans. It continuously adjusts your robot's pose against a known occupancy grid. To use AMCL, make sure your map is ready (e.g., from SLAM Toolbox), then configure the following launch command:

ros2 launch nav2_bringup localization_launch.py map:=/path/to/map.yaml

Note: AMCL does not generate odometry; it refines your pose estimate within a map using LiDAR data. It requires /scan data and existing TF transforms between map -> odom -> base_link.
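To see what "particle filter" means in practice, here is a hedged toy sketch of AMCL's weight-and-resample loop. The likelihood function is a stand-in; real AMCL scores laser beams against the occupancy grid, and its particles carry full (x, y, yaw) poses rather than a single coordinate.

```python
import random

# Toy sketch of the particle filter loop behind AMCL. The likelihood below
# is a stand-in for the laser scan model; real AMCL compares beams to the map.
random.seed(42)

particles = [0.0, 1.0, 2.0, 3.0]   # candidate robot x-positions (hypotheses)
true_pos = 2.1                     # where the robot actually is

def likelihood(p):
    # stand-in scan model: hypotheses nearer the true pose score higher
    return 1.0 / (1.0 + (p - true_pos) ** 2)

weights = [likelihood(p) for p in particles]
total = sum(weights)
weights = [w / total for w in weights]

# resample: particles near the true pose survive more often
resampled = random.choices(particles, weights=weights, k=len(particles))
print(particles[weights.index(max(weights))])  # prints: 2.0
```

The best-weighted hypothesis (2.0) is the one closest to the true pose, which is why AMCL's estimate snaps toward the correct location once scans match the map.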

6. Visualizing the Fusion in RViz

To observe everything in real-time:

  • Launch RViz:

rviz2

  • Add displays for:
      • LaserScan (to view LiDAR)
      • Odometry (to see /odometry/filtered)
      • TF (to track frame alignment)
      • Pose (to see the AMCL pose)

Watch how the filtered odometry remains stable even as the IMU and encoders drift independently. With AMCL running, your robot adjusts its global location based on scan matches to the known map.

7. Tuning EKF and AMCL Parameters

To get the most from ROS 2 sensor fusion, tuning is essential. Here's what to watch:

For EKF:

  • sensor_timeout too low? Data drops will occur.
  • frequency too high? High CPU load.
  • Enable only the axes you trust for each sensor.

For AMCL:

  • Increase laser_max_beams for better accuracy (at the cost of performance).
  • Tweak min_particles and max_particles to find a performance balance.
  • Adjust update_min_d and update_min_a to prevent over-updating the pose.

A complete list of parameters is available in the AMCL configuration guide in the ROS Navigation documentation.
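As a sketch of what update_min_d and update_min_a do, the gating below only lets a filter update fire once the robot has moved or turned past the thresholds. The parameter names and default-like values match AMCL's; the helper function itself is illustrative, not AMCL source code.

```python
import math

# Illustrative gating in the style of AMCL's update_min_d / update_min_a:
# skip filter updates until the robot has moved or turned enough.
UPDATE_MIN_D = 0.25   # meters of translation before an update
UPDATE_MIN_A = 0.2    # radians of rotation before an update

def should_update(dx, dy, dyaw):
    """True once accumulated motion since the last update crosses a threshold."""
    moved = math.hypot(dx, dy) >= UPDATE_MIN_D
    turned = abs(dyaw) >= UPDATE_MIN_A
    return moved or turned

print(should_update(0.05, 0.0, 0.0))   # prints: False (tiny jitter, no update)
print(should_update(0.30, 0.0, 0.0))   # prints: True  (moved enough)
print(should_update(0.0, 0.0, 0.25))   # prints: True  (turned enough)
```

Raising these thresholds saves CPU and avoids jittery pose corrections while the robot is nearly stationary; lowering them makes the pose track more aggressively.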

8. Real-World Test: Watch the Drift Shrink

Try this validation sequence:

  1. Drive your robot manually using teleop_twist_keyboard.
  2. Observe /wheel_odom drifting slightly in open space.
  3. Enable IMU fusion; drift slows.
  4. Turn on AMCL; the position jumps back into place when a scan match hits.

This is the real power of ROS 2 sensor fusion. Each source informs and corrects the others.

9. Common Pitfalls & How to Fix Them

  • IMU drift persists? Double-check that your IMU's covariance settings are realistic. Excessive trust in a cheap sensor will backfire.
  • Robot teleports in RViz? Your TF tree might be broken. Confirm transforms from map -> odom -> base_link are broadcasting correctly.
  • Odometry freezes? Sensor topics might be throttling or missing timestamps.

These debugging steps will help ensure that your ROS 2 sensor fusion system behaves as expected.
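One quick way to diagnose frozen odometry is a staleness check like the hedged sketch below. Times are plain floats here; on a real robot you would compare the node clock against the message header stamp, similar to what the EKF's sensor_timeout parameter does internally.

```python
# Illustrative staleness check for a sensor topic. Not rclpy code: times are
# plain seconds, standing in for a ROS clock and a message header stamp.
def is_stale(msg_stamp_sec, now_sec, timeout_sec=0.5):
    """True if the last message is older than the allowed timeout."""
    return (now_sec - msg_stamp_sec) > timeout_sec

print(is_stale(100.0, 100.2))   # prints: False (fresh message)
print(is_stale(100.0, 101.0))   # prints: True  (stale; topic likely frozen)
```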

10. Wrapping It Up: Toward Reliable Robot Localization

By blending encoder and IMU data through EKF and refining global pose with AMCL, you're achieving what every autonomous robot needs: consistent, repeatable localization. This is the foundation for autonomous navigation, path planning, and real-world robotic tasks. Without fused localization, your robot is just guessing. With it, it knows exactly where it is, and where it's going next.

Next Steps: Planning Your First Autonomous Path

With accurate localization in place, the next step is full path planning and obstacle avoidance. You'll learn how to use Nav2 to compute safe routes across your map using your newly stable pose. Stay tuned for the next post in our series. In the meantime, explore more tutorials and hands-on builds at Robotisim, where we break down ROS 2 development for Raspberry Pi mapping robots and beyond.

Practical Example

A practical way to use this article is to connect the concept to a small robot workflow: identify the input, the processing step, and the output you expect from the robot. If the article involves ROS 2, test the idea in a small workspace or simulation before applying it to a larger robot project.

Common Mistakes

  • Trying to memorize the term without connecting it to a robot behavior.
  • Skipping the prerequisite concepts that make the workflow easier to debug.
  • Copying commands or code without checking what each node, topic, file, or parameter is responsible for.
  • Treating one tutorial as a complete roadmap instead of linking it to the next concept.

How This Connects to Other Topics

  • How to Add Custom Libraries to a ROS 2 C++ Package
  • Robot Localization Guide: Indoor Pose Estimation with Encoder Odometry
  • Robot Starter Kit Guide: Choose the Right Beginner Robot Parts
  • ROS 2 SLAM Beginner Guide: Help Your Robot Draw Its First Floorplan
  • From STL to Autonomy: Building an Indoor Self-Driving Robot

Learn Next

  • How to Add Custom Libraries to a ROS 2 C++ Package
  • Robot Localization Guide: Indoor Pose Estimation with Encoder Odometry
  • Robot Starter Kit Guide: Choose the Right Beginner Robot Parts
  • ROS 2 SLAM Beginner Guide: Help Your Robot Draw Its First Floorplan
  • From STL to Autonomy: Building an Indoor Self-Driving Robot
  • Mobile Robotics Engineer Path

FAQ

Is ROS 2 Sensor Fusion with EKF and AMCL suitable for beginners?

Yes. The article is written to make the concept easier to understand, while still connecting it to practical robotics work.

What should I learn before this topic?

Start with the prerequisite ideas listed in the article, then connect them to a small project or simulation so the concept becomes concrete.

How does this topic connect to real robots?

It helps you understand how software, sensors, control, simulation, or career decisions show up in practical robot development.

What should I do after reading this article?

Pick one related concept from the Learn Next section and build a small example that uses it.

Can I learn this through Robotisim?

Yes. Robotisim connects these concepts to structured learning paths and project-based robotics practice.

Final Summary

ROS 2 Sensor Fusion with EKF and AMCL is part of the broader Nav2 and Autonomous Navigation learning path. The key is to understand the concept, connect it to a real robot workflow, and then practice it through a focused project instead of learning it in isolation.

Connected learning path

This article supports Mobile Robotics Engineer Path, especially Nav2.

Learn with Robotisim

Build a complete Nav2 robot inside Robotisim.

Explore the academy
