Learning ROS for Robotics Programming

By: Aaron Martinez, Enrique Fernández

Overview of this book

Any amateur or professional roboticist who has ever tried their hand at robotics programming will have faced the cumbersome task of starting from scratch, usually reinventing the wheel. ROS comes with a great number of already working functionalities, and this book takes you from the first steps to the most elaborate designs possible within this software framework.

"Learning ROS for Robotics Programming" is full of practical examples that will help you understand the framework from the very beginning. Build your own robot applications in a simulated environment and share your knowledge with the large community supporting ROS.

"Learning ROS for Robotics Programming" starts with the basic concepts and usage of ROS in a very straightforward and practical manner. It is a painless introduction to the fascinating world of robotics, covering sensor integration, modeling, simulation, computer vision, and navigation algorithms, among other topics.

After the first two chapters, concepts like topics, messages, and nodes will become daily bread. Make your robot see with HD cameras, or navigate avoiding obstacles with range sensors. Furthermore, thanks to the contributions of the vast ROS community, your robot will be able to navigate autonomously, and even recognize and interact with you, in a matter of minutes.

"Learning ROS for Robotics Programming" will give you all the background you need to start in the fascinating world of robotics and program your own robot. You set the limits!

Publishing sensor information


Your robot can have many sensors to see the world, and you can program many nodes to take this data and do something with it, but the navigation stack only accepts data from a planar laser sensor. So, your sensor must publish its data using one of these message types: sensor_msgs/LaserScan or sensor_msgs/PointCloud.
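For reference, a sensor_msgs/LaserScan message contains the following fields; you can print this definition yourself with the rosmsg show sensor_msgs/LaserScan command:

    Header header            # timestamp and frame_id of the scan
    float32 angle_min        # start angle of the scan [rad]
    float32 angle_max        # end angle of the scan [rad]
    float32 angle_increment  # angular distance between measurements [rad]
    float32 time_increment   # time between measurements [s]
    float32 scan_time        # time between scans [s]
    float32 range_min        # minimum range value [m]
    float32 range_max        # maximum range value [m]
    float32[] ranges         # range data [m]
    float32[] intensities    # intensity data (device-specific units)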

We are going to use the laser located at the front of the robot to navigate in Gazebo. Remember that this laser is simulated in Gazebo, and it publishes its data on the base_scan/scan frame.
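Before plugging the laser into the navigation stack, it is a good idea to check that the data is actually arriving. Assuming the scan messages are published on a topic named base_scan/scan, as the name above suggests (your launch files determine the exact topic name), you can inspect it with rostopic:

    $ rostopic info /base_scan/scan
    $ rostopic echo /base_scan/scan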

In our case, we do not need to configure anything for our laser to use it with the navigation stack. This is because we already have tf configured in the .urdf file, and the laser publishes its data with the correct message type.
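As a rough illustration of what that configuration involves, the relevant part of a .urdf file is a link for the laser and a fixed joint that attaches it to the robot's body, so that robot_state_publisher can broadcast the corresponding tf transform. The link and joint names and the offset below are placeholders, not the values used for this book's robot:

    <link name="base_laser_link">
      <visual>
        <geometry>
          <box size="0.05 0.05 0.05"/>
        </geometry>
      </visual>
    </link>

    <joint name="base_laser_joint" type="fixed">
      <parent link="base_link"/>
      <child link="base_laser_link"/>
      <origin xyz="0.2 0 0.1" rpy="0 0 0"/>
    </joint>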

If you use a real laser, ROS might have a driver for it. Indeed, in Chapter 4, Using Sensors and Actuators with ROS, you learned how to connect the Hokuyo laser to ROS. In any case, if you are using a laser that has no driver in ROS and you want to write a node to publish its data...
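As a rough sketch of what such a node could look like, the following C++ program fills a sensor_msgs/LaserScan message with placeholder readings and publishes it on a scan topic. The node, topic, and frame names, as well as the scan parameters, are illustrative assumptions rather than values taken from this book; in a real driver you would replace the placeholder ranges with the readings obtained from your device.

// Minimal sketch of a node publishing sensor_msgs/LaserScan (ROS 1).
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "laser_publisher");
  ros::NodeHandle n;
  ros::Publisher scan_pub = n.advertise<sensor_msgs::LaserScan>("scan", 50);

  const unsigned int num_readings = 100;  // samples per scan (assumed)
  const double laser_frequency = 40.0;    // scans per second (assumed)

  ros::Rate rate(laser_frequency);
  while (n.ok())
  {
    sensor_msgs::LaserScan scan;
    scan.header.stamp = ros::Time::now();
    scan.header.frame_id = "base_laser_link";   // must match a frame in your tf tree
    scan.angle_min = -1.57;                     // start angle of the scan [rad]
    scan.angle_max = 1.57;                      // end angle of the scan [rad]
    scan.angle_increment = 3.14 / num_readings; // angle between measurements [rad]
    scan.time_increment = (1.0 / laser_frequency) / num_readings;
    scan.range_min = 0.0;
    scan.range_max = 100.0;

    scan.ranges.resize(num_readings);
    scan.intensities.resize(num_readings);
    for (unsigned int i = 0; i < num_readings; ++i)
    {
      scan.ranges[i] = 1.0;       // placeholder: copy your sensor's readings here
      scan.intensities[i] = 100;  // optional intensity data
    }

    scan_pub.publish(scan);
    rate.sleep();
  }
  return 0;
}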