
Publishing sensor information


Your robot may have many sensors to perceive the world, and you can write many nodes that consume this data, but the navigation stack is prepared to use only a planar laser sensor. So, your sensor must publish its data using one of these message types: sensor_msgs/LaserScan or sensor_msgs/PointCloud2.
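To make this concrete, here is a minimal sketch of a node that publishes sensor_msgs/LaserScan messages. It fills in dummy range readings; the topic name, frame name, and scan parameters are placeholders taken from this example setup, so replace them with the values your robot actually uses:

#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "fake_laser_publisher");
  ros::NodeHandle n;

  // Placeholder topic name; use the topic your navigation stack expects.
  ros::Publisher scan_pub =
      n.advertise<sensor_msgs::LaserScan>("/robot/laser/scan", 50);

  const int num_readings = 100;
  ros::Rate rate(10);

  while (ros::ok())
  {
    sensor_msgs::LaserScan scan;
    scan.header.stamp = ros::Time::now();
    scan.header.frame_id = "hokuyo_link";      // laser frame in the tf tree
    scan.angle_min = -1.57;                    // start angle of the scan [rad]
    scan.angle_max = 1.57;                     // end angle of the scan [rad]
    scan.angle_increment = 3.14 / num_readings;
    scan.time_increment = 0.0;
    scan.range_min = 0.10;                     // minimum valid range [m]
    scan.range_max = 10.0;                     // maximum valid range [m]

    scan.ranges.assign(num_readings, 5.0);     // dummy readings at 5 m
    scan.intensities.assign(num_readings, 100.0);

    scan_pub.publish(scan);
    rate.sleep();
  }
  return 0;
}

Once a node like this (or your real driver) is running, you can verify the messages with rostopic echo on the scan topic.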

We are going to use the laser located in front of the robot to navigate in Gazebo. Remember that this laser is simulated in Gazebo, and it publishes data in the hokuyo_link frame on the topic /robot/laser/scan.

In our case, we do not need to configure anything for our laser to use it with the navigation stack. This is because the tf tree is already configured in the .urdf file, and the laser publishes data with the correct message type.
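If the laser frame were not already defined in the .urdf file, you would have to publish the transform between the robot base and the laser yourself. The following is a minimal sketch of such a tf broadcaster; the frame names and the mounting offset (20 cm ahead and 10 cm above base_link) are only illustrative values for this example:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "laser_tf_broadcaster");
  ros::NodeHandle n;

  tf::TransformBroadcaster broadcaster;
  ros::Rate rate(100);

  while (ros::ok())
  {
    // Illustrative offset only; use your robot's real mounting position.
    broadcaster.sendTransform(
        tf::StampedTransform(
            tf::Transform(tf::Quaternion(0, 0, 0, 1),
                          tf::Vector3(0.2, 0.0, 0.1)),
            ros::Time::now(), "base_link", "hokuyo_link"));
    rate.sleep();
  }
  return 0;
}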

If you use a real laser, ROS might have a driver for it. Indeed, in Chapter 7, Using Sensors and Actuators with ROS, we will show you how to connect the Hokuyo laser to ROS. In any case, if you are using a laser that has no driver in ROS and...