Lab 5: Improving Odometry


The goal of this lab is to improve the odometry in order to achieve more precise indoor localization for the robot. In the previous labs, localization was achieved using data received from the IMU sensor. However, the IMU is prone to magnetic field distortions (e.g. from nearby metal objects and electronic devices). These distortions can confuse the robot and cause erratic behavior: the magnetometer appears to become “locked”, so the heading always drifts/converges back toward the same angle. An alternative way to compute the odometry is to use the Hokuyo sensor data together with the robot’s speed and direction. The advantage of this method is that it is not sensitive to magnetic field distortions.

We will make use of the following nodes:

  • Laser range finder (Hokuyo)
  • IMU module
  • Keyboard or Joystick teleoperation package for motion control
  • Gmapping for visualizing the map in rviz
  • Our main node, which we will implement in this lab. Its script is adapted from the localization script of lab 3.2. The node will subscribe to data from the IMU, Hokuyo, and Teleop nodes, analyze and transform the data, and finally publish the improved odometry.

Let’s begin by creating a new ROS package:
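For example, assuming a rosbuild-era workspace (the lab uses manifest.xml and rosmake) and the package name lab5 with dependencies implied by the nodes above (both assumptions), the package could be created with:

```shell
# Illustrative only -- package name and dependency list are assumptions.
roscreate-pkg lab5 roscpp sensor_msgs nav_msgs geometry_msgs tf
```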



Change directory to the newly created package, create src/lab5.cpp, paste the following code into it, and save your changes.
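The full lab5.cpp is not reproduced here; the following is only a structural sketch of its shape, under the assumption of the usual ROS topic names (/scan, /odom) and a 10 Hz loop rate. It requires a ROS environment to build.

```cpp
// Structural sketch only -- the actual lab5.cpp is adapted from the
// lab 3.2 localization script. Topic names, rates, and variable names
// other than those discussed below are assumptions.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <nav_msgs/Odometry.h>

int m_straightdirection = 0;  // 1 forward, -1 backward
int m_turndirection = 0;      // 0 straight, 1 clockwise, -1 counter-clockwise

void gotScanCallback(const sensor_msgs::LaserScan::ConstPtr &scan) {
    // Determine m_straightdirection and m_turndirection from the
    // incoming data, as described below.
}

int main(int argc, char **argv) {
    ros::init(argc, argv, "lab5");
    ros::NodeHandle n;
    ros::Subscriber scan_sub = n.subscribe("/scan", 1, gotScanCallback);
    ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("/odom", 1);
    ros::Rate rate(10);
    while (ros::ok()) {
        // Integrate the pose from the two direction flags, then fill in
        // and publish a nav_msgs::Odometry message.
        ros::spinOnce();
        rate.sleep();
    }
    return 0;
}
```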


An important addition in the above code is the Hokuyo callback function gotScanCallback(), where the robot’s travel direction and turn direction are computed as follows:

  • m_straightdirection = 1 (robot moving forward)
  • m_straightdirection = -1 (moving backward)
  • m_turndirection = 0 (moving in a straight line, i.e. not turning)
  • m_turndirection = 1 (turning clockwise)
  • m_turndirection = -1 (turning counter-clockwise)
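As an illustration of the conventions above (not the lab’s actual callback), the two flags could be derived from the current linear and angular velocity. Note that in ROS a positive angular velocity is counter-clockwise, so a clockwise turn (m_turndirection = 1) corresponds to a negative angular velocity; the function name, velocity source, and threshold are assumptions.

```cpp
#include <cmath>

struct MotionFlags {
    int m_straightdirection; // 1 forward, -1 backward, 0 stopped (0 is an addition for clarity)
    int m_turndirection;     // 0 straight, 1 clockwise, -1 counter-clockwise
};

// Hypothetical helper: maps signed velocities onto the lab's two flags.
MotionFlags computeMotionFlags(double linear, double angular, double eps = 1e-3) {
    MotionFlags f;
    if (linear > eps)        f.m_straightdirection = 1;   // forward
    else if (linear < -eps)  f.m_straightdirection = -1;  // backward
    else                     f.m_straightdirection = 0;   // not translating

    if (std::fabs(angular) <= eps) f.m_turndirection = 0;   // straight line
    else if (angular < 0)          f.m_turndirection = 1;   // clockwise
    else                           f.m_turndirection = -1;  // counter-clockwise
    return f;
}
```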

We use these two variables to keep track of the robot’s relative location using the following code (placed inside the while (ros::ok()) {...} loop).
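A minimal, self-contained sketch of the kind of per-iteration dead-reckoning update the text describes: the Pose struct, the speed parameters, and updatePose() are illustrative assumptions; the real node integrates using the elapsed time between loop iterations.

```cpp
#include <cmath>

struct Pose { double x, y, theta; }; // theta in radians, counter-clockwise positive

// Hypothetical helper: one integration step driven by the two flags.
void updatePose(Pose &p, int straightdir, int turndir,
                double linear_speed, double angular_speed, double dt) {
    // Update the heading first: turndir = 1 is clockwise, i.e. theta decreases.
    p.theta -= turndir * angular_speed * dt;
    // Then advance along the current heading.
    p.x += straightdir * linear_speed * dt * std::cos(p.theta);
    p.y += straightdir * linear_speed * dt * std::sin(p.theta);
}
```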



Open CMakeLists.txt and add the required libraries and executables:
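For a rosbuild package, the addition would look roughly like this (the target name is an assumption):

```cmake
# Illustrative rosbuild addition; exact targets may differ in your lab files.
rosbuild_add_executable(lab5 src/lab5.cpp)
```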


Also add the necessary dependencies to manifest.xml so that it looks like this:
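An illustrative manifest.xml for a rosbuild package is sketched below; the dependency list mirrors the nodes used in this lab and should be adjusted to match your actual lab files.

```xml
<!-- Illustrative only; adjust to match your lab files. -->
<package>
  <description brief="lab5">Improved odometry from laser scans</description>
  <author>Your name</author>
  <license>BSD</license>
  <depend package="roscpp"/>
  <depend package="sensor_msgs"/>
  <depend package="nav_msgs"/>
  <depend package="geometry_msgs"/>
  <depend package="tf"/>
</package>
```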


Compile the package and create the node in the bin folder:
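Assuming the rosbuild package is named lab5, compilation is a single command; rosbuild places the resulting executable in the package’s bin/ directory:

```shell
rosmake lab5
```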


Create a .launch file that will start all the necessary nodes. Create a launch directory in your package folder, then create lab5.launch inside it:

The launch file is now open for editing. Add the following lines:
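The handout’s launch file is not reproduced here; a sketch covering the nodes listed at the start of the lab might look like this (node names, types, and remappings are assumptions):

```xml
<!-- Illustrative lab5.launch; adjust names and remappings to your setup. -->
<launch>
  <node pkg="hokuyo_node" type="hokuyo_node" name="hokuyo_node"/>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping"/>
  <node pkg="lab5" type="lab5" name="lab5" output="screen"/>
</launch>
```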



Before launching, make sure to activate the Hokuyo range finder:
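On a typical setup this means granting read/write permission on the sensor’s serial device (the device path below is an assumption; check with ls /dev/ttyACM*):

```shell
sudo chmod a+rw /dev/ttyACM0
```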


Finally, run the .launch file to start the nodes:
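Assuming the package and launch file names used above:

```shell
roslaunch lab5 lab5.launch
```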


Check the list of topics with rostopic list. You should see the following topics:
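The exact list depends on your launch file; with the nodes above it would typically include entries such as:

```
/cmd_vel
/map
/odom
/rosout
/scan
/tf
```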


From your laptop, run the following command if you are using a keyboard to remotely control the robot. Make sure to send the teleoperation commands from the same terminal where you ran the command:
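One common option (the package name is an assumption; use whichever keyboard teleop package your robot provides):

```shell
rosrun teleop_twist_keyboard teleop_twist_keyboard.py
```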

If you are using a joystick, then execute the following command:
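For example, starting the joystick driver (your robot may also need a separate teleop node that converts /joy messages into /cmd_vel; both details are assumptions):

```shell
rosrun joy joy_node
```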


In a new terminal, launch rviz. First, add a TF visualization and set the Fixed Frame option to map in the global options. Second, add a LaserScan display and set its topic to /scan. Try moving the robot around and observe the visualized data in rviz. The robot should now be able to keep track of its location relative to the surroundings, as shown in the figure.


You can also monitor the robot’s distance from its initial location via the odometry topic.
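Assuming the node publishes on /odom, the messages can be inspected with:

```shell
rostopic echo /odom
```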