Navigation Tutorial
In this tutorial, we'll test out navigation in simulation. Run the following command:
roslaunch segbot_gazebo segbot_navigation.launch
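If you want to sanity-check that the navigation stack came up, you can look for its nodes from another terminal. As an assumption (not stated in this tutorial), the segbot navigation launch file starts the standard move_base node; the exact node name on your machine may differ:
rosnode list | grep move_base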
You should now be able to use autonomous navigation through the rviz GUI. To send the robot a navigation goal, click the 2D Nav Goal button at the top of the rviz window (not the 2D Pose Estimate button). Then click in the map, hold down your mouse button, and drag to select a target location and orientation; a plain click without dragging selects no orientation and will not work. The robot will now drive towards your specified navigation goal. Play around with various goals, and examine how the robot performs when going through doorways and past other objects.
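Under the hood, the 2D Nav Goal tool in rviz publishes a geometry_msgs/PoseStamped message, by default on the /move_base_simple/goal topic. As a minimal sketch, assuming the segbot launch files keep that default topic name and use a frame called map, you could send an equivalent goal from the command line:
rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}'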
In the original gazebo simulator window, you can also try adding objects into the robot’s path: click the cylinder, sphere, or cube icon at the top, then double-click a location on the map to place the object. Check out how the robot adjusts when navigating past or around unexpected obstacles.
You can also teleoperate the robot when it is not autonomously navigating. Use the following script and follow the instructions on the screen. Note that if the robot is getting instructions from both you and the autonomous navigator, it will be very confused:
rosrun segbot_bringup teleop_twist_keyboard
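If you are curious what the teleop script is doing, it simply publishes geometry_msgs/Twist velocity commands. Assuming the robot listens on a topic named cmd_vel (the segbot launch files may remap this; check rostopic list), the following drives the robot slowly forward at 10 Hz until you press Ctrl+C:
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'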
You can also change where the robot thinks it is in the rviz window. If you click the 2D Pose Estimate button and then select a location and orientation by clicking and dragging, the robot will believe it is at that location. Notice that if you do this, the robot will have a very difficult time recovering and navigating anywhere. This is the sort of situation in which a lost robot might want to ask a friendly human for help.
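Behind the scenes, the 2D Pose Estimate button publishes a geometry_msgs/PoseWithCovarianceStamped message that the localization system uses to reset its belief about the robot’s pose. Assuming the default rviz topic name is in use, you can watch these messages appear as you click and drag:
rostopic echo /initialpose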
This version of the simulated robot is using the Hokuyo laser scanner we discussed in class. The laser scanner sends out laser beams at a variety of angles and finds the distance to the nearest object at each angle. Take a look at the simulated sensor output in the rviz window. You’ll see dots where it estimates it is getting returns from its laser scanner. Some of these dots will be from walls, table legs, or other objects, and some will report the maximum distance the laser scanner can sense if there is no object in that direction. Note that the lasers cannot see the top of the table next to the robot, since the table-top sits above the scan plane. Consequently, the robot will try to navigate through the table but will be unable to do so.
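You can inspect the raw scan data yourself from the command line. We assume here that the scanner publishes sensor_msgs/LaserScan messages on a topic named /scan; run rostopic list to find the actual topic name on your setup. The ranges array in the printed message holds the per-angle distances described above:
rostopic echo -n 1 /scan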
Kill any open roslaunch windows by pressing Ctrl+C. Rerun the tutorial with the following command:
roslaunch segbot_gazebo segbot_navigation.launch robot_configuration:=`rospack find segbot_bringup`/launch/includes/auxiliary.segbot_kinect_scan.launch.xml
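Once this launch file is up, you can confirm that the simulated Kinect is publishing by listing its topics (the nav_kinect namespace matches the point cloud topic used in the next step):
rostopic list | grep nav_kinect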
Next, we will turn on visualization of the Kinect point cloud. This will show the pixels captured by the Kinect camera, rendered in their correct color at the sensed depth. In the rviz window:
1. Click the “Add” button on the bottom left.
2. Select PointCloud2 and click OK.
3. Scroll to the bottom of the left panel and expand PointCloud2.
4. Click the white box to the right of Topic under PointCloud2, and select “/nav_kinect/depth/points” as the topic, which should be your only option.
5. Click to the right of Style, where it says Squares, and select Points instead.
You should now see a display of the Kinect data in the rviz window. If the robot is still in its starting location, you’ll see it sensing the wall in front of it, which is not very exciting. Provide some navigation goals and look at the Kinect output as the robot drives around. Try panning and zooming in the rviz window to get a better look at the simulated Kinect data. See whether any issues come up when navigating with the Kinect sensor alone.
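You can also examine the point cloud stream outside of rviz. The topic name below is the same one you selected in the PointCloud2 display; rostopic hz reports how often new clouds arrive:
rostopic hz /nav_kinect/depth/points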
Note: As of 2013/8/21, there is a bug in the Kinect sensor simulator inside gazebo. If you see an error about unsynchronized messages from image_transport, you may have to run the above command 2-3 times until it actually works.