I am getting an incorrect point cloud when I start moving the robot forward. Initially, while only rotating the robot, the map is okay. I am using a Gazebo simulation and the following command to run rtabmap in RGBD mode:
The following is an image of the point cloud and the simulation environment:
What am I missing? Why are the same objects being repeated at different locations even though I am using the right arguments? The problem is not as bad when I use a different robot with the same command and environment.
This is how another robot (with the same sensor) mapped it:
The map above has imperfections but is better; I don't know why the first robot gives such a bad SLAM result. I am using quadrupeds, so there are some shaky camera movements on both robots.
Can you share the database of the first (or both) maps? Is the camera point of view the same between the robots? When you say the same sensor, do you mean exactly the same simulated camera? Bad depth registration can be a cause. Your simulated environment seems to have varied textures, though depending on where the camera is looking, poor texture or repetitive patterns can cause large odometry drift. If one robot makes the camera shake more, that could also be an issue.
For the parameters (see the example command sketched after this list):
- RGBD/OptimizeMaxError below 1 was maybe taken from an old example; it should now be > 1 (the default is 3).
- RGBD/LoopClosureReextractFeatures: if taken from an old example, this is not needed anymore.
- Grid/GlobalFullUpdate doesn't exist anymore; you may have seen a warning/error log about this.
- Vis/InlierDistance is ignored if Vis/EstimationType=1.
- There may be a typo: Viz/EstimationType should be Vis/EstimationType (the default is 1 anyway).
- I would not use RGBD/NeighborLinkRefining if you are not using a lidar.
- Optimizer/Iterations=200 is quite high; the default is 20.
- If you want to use Optimizer/Robust, you should set RGBD/OptimizeMaxError to 0.
- depth_camera_info_topic is ignored.
- To make sure everything is synchronized, set approx_sync:=false.
- approx_rgbd_sync is ignored unless you also use rgbd_sync:=true.
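For illustration only, a minimal RGB-D launch along the lines of these suggestions could look like the sketch below. The use of rtabmap.launch, the topic names, and the frame_id are assumptions (they depend on your Gazebo camera plugin and robot description), not your actual command; the parameters that are no longer needed (RGBD/LoopClosureReextractFeatures, Grid/GlobalFullUpdate, Vis/InlierDistance, RGBD/NeighborLinkRefining) are simply dropped instead of overridden:
```
# Sketch only: the topic names and frame_id are placeholders, adjust them to
# match your Gazebo camera plugin and robot description.
roslaunch rtabmap_ros rtabmap.launch \
    rgb_topic:=/camera/rgb/image_raw \
    depth_topic:=/camera/depth/image_raw \
    camera_info_topic:=/camera/rgb/camera_info \
    frame_id:=base_link \
    approx_sync:=false \
    rtabmap_args:="--delete_db_on_start --RGBD/OptimizeMaxError 3"
```
With approx_sync:=false, the RGB, depth and camera_info messages must have exactly the same timestamps; in a Gazebo simulation they normally come from the same camera plugin, so exact synchronization should work. The --delete_db_on_start flag just clears the previous database on startup and is optional.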