Switching between Mapping and Localization

These examples are based on what I did for AZIMUT3. The robot is equipped with a Kinect, a URG-04LX laser and odometry provided by the wheel encoders. I will present in the next sections some possible configurations depending on the robot, with example launch files.

Requirements:

- The robot must be equipped at least with a Kinect-like sensor: a calibrated Kinect-like sensor compatible with the openni_launch, openni2_launch or freenect_launch ROS packages. I highly recommend calibrating your Kinect-like sensor following this guide.
- Optional: a 2D laser which outputs sensor_msgs/LaserScan messages. If you want to use a 2D laser, the Kinect's point clouds must be aligned with the laser scans.
- Optional: odometry (e.g., from wheel encoders) which outputs nav_msgs/Odometry messages.

Setup:

1. Launch OpenNI with depth_registration:=true:
   $ roslaunch openni_launch openni.launch depth_registration:=true
2. If you have a laser: launch the laser node; it should publish sensor_msgs/LaserScan messages.
3. If your robot has odometry: the robot should publish its odometry in nav_msgs/Odometry format.
4. TF transforms must be provided (through the robot's URDF, or manually in a launch file using static transform publishers).
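The steps above can be sketched as a short bring-up sequence. This is a minimal sketch, not a definitive setup: the serial port, frame names (base_link, camera_link, laser) and transform offsets below are placeholder assumptions that you must adjust for your own robot, and I am assuming hokuyo_node as the driver for the URG-04LX.

```shell
# 1. Start the Kinect-like sensor driver with registered depth:
roslaunch openni_launch openni.launch depth_registration:=true

# 2. Start the laser node (hokuyo_node works with the URG-04LX);
#    the port below is an assumption, check yours with `ls /dev/ttyACM*`:
rosrun hokuyo_node hokuyo_node _port:=/dev/ttyACM0

# 3. If you have no URDF, publish the static TF transforms manually.
#    Arguments: x y z yaw pitch roll parent_frame child_frame period_in_ms.
#    The offsets and frame names here are placeholders:
rosrun tf static_transform_publisher 0 0 0.3 0 0 0 base_link camera_link 100
rosrun tf static_transform_publisher 0.2 0 0.1 0 0 0 base_link laser 100
```

You can verify the resulting TF tree with `rosrun tf view_frames` and check that scans and clouds are being published with `rostopic hz /scan` and `rostopic hz /camera/depth_registered/points`.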