Robot Operating System
Latest revision as of 15:42, 11 June 2011

==Overview==

This page serves as a short introduction to ROS for the new or potential user. Although ROS is a tremendously complex and multifaceted software package, this page endeavors to outline the basic uses and functionality provided by ROS's framework. This is done through example by discussing the high level design of a ROS system developed by Jake Ware and Jarvis Schultz in 2011 for the puppeteer robot system. There is also a short "highlights" section that directs new users towards some useful ROS features that might not be readily apparent.

==Introduction==

Above all else, ROS should be seen as a tool to create and manage complex electromechanical systems. Originally developed by the Stanford Artificial Intelligence Laboratory in 2007, the ROS project was adopted by Willow Garage in 2008 and remains in their care. The following is Willow Garage's description of ROS:

"ROS is an open-source, meta-operating system for your robot. It provides the services you would expect from an operating system, including hardware abstraction, low-level device control, implementation of commonly-used functionality, message-passing between processes, and package management. It also provides tools and libraries for obtaining, building, writing, and running code across multiple computers." [Source: ROS Intro]

All of this is true, but beneath the technical sophistication the underlying message is that ROS enables groups of people to work on complex projects by providing a common, well-organized framework while adding a minimal amount of overhead.

A presentation on ROS was given to the LIMS lab by Jake Ware and Jarvis Schultz in the Spring of 2011. Although it is not comprehensive, it covers the overall structure and use of ROS and some of the utilities it provides, discusses some applications, and goes over the pros and cons of using it in a project.

LIMS ROS Presentation: Part 1, Part 2, Part 3, Part 4

Willow Garage ROS Compilation: Three Years

==Getting Started==

===Installation===

Currently, ROS is only fully supported on Ubuntu Linux. The full list of supported systems can be found here. A detailed installation walkthrough can be found here. Installation can take anywhere from 45 minutes to several hours depending on the speed of your internet connection.

===Tutorials===

If you are planning on using ROS for a long term project, it is absolutely worth investing the time to work through the tutorials provided on the ROS website. Although there are many more tutorials focused on specific stacks and packages, the introductory tutorials are the best place to start and can be found here. If you need a quick refresher, or are having trouble remembering some of the command line tools, you can use this cheat sheet.

==Example==

In the interest of demonstrating some basic techniques and good practices in ROS, the following example system is presented and described. First, the overall layout and structure of the system will be explained and justified. This will be followed by a short description of each of the packages and a link to download the actual code. The purpose of this system is to perform open and closed-loop control of a mass hung from a winch on a puppeteer robot using the Microsoft Kinect for object tracking. The majority of this code was written by either Jarvis Schultz or Jake Ware, but credit will be given for the parts that were not original.

===System Overview===

As mentioned above, the purpose of this system is to perform open and closed-loop tracking of a mass hanging from a puppeteer robot's winch. A video of the open-loop version of the system can be found [http://www.youtube.com/watch?v=O0HLw3IkAOo here]. For the purposes of this article, we will not discuss the robot's code or the code in the Kinect stack. It is particularly important to treat the Kinect as a black box because that software is updated and maintained by Willow Garage. Assuming these two systems perform as they should, we can focus on the six nodes that were written specifically for this system. These six nodes are as follows:


'''Original Nodes:'''

Serial Node (C++): Interfaces with the robot through a serial port assigned to the FTDI cable that attaches to the XBee wireless chip. As a safety fail-safe, it also watches stdin for any keystroke and executes an emergency stop when it sees one.

Estimator Node (C++): Collects state information about the hanging mass from the object tracker and about the robot's current position from the robot's encoder-based odometry calculations. Although the current version of the Control Node handles these calculations, the Estimator Node will eventually be responsible for computing the string length and the robot and mass velocities.

Control Node (Python): Uses the current state, last state, and time to calculate the proper gains and control inputs for the next time step.

Object Tracker Node (C++): Finds the hanging mass's location from the point cloud data generated by the openni_camera node developed by Willow Garage.

Marker Node (C++): Generates 3D visuals for the robot and mass and displays them in the proper orientation and position in rviz, ROS's visualization software.

Keyboard Node (C++): This node watches stdin for keyboard input and modifies the operating condition of the system according to a defined command set.

Puppeteer Messages (NA): A collection of all the message definitions for the topics and services used in this system.

'''Borrowed Nodes:'''

Kinect Nodes: This is a black box for us and consists of openni_camera and several other ROS packages. This software must be started before the system can function.

rviz: ROS's visualization software that is extremely useful for debugging and working with 3D data.

The following block diagram illustrates the flow of information between these nodes. The timing of the system is driven by the 30Hz rate of the Kinect. That is, the Kinect drives the openni (Kinect) nodes, the openni nodes drive the object tracker, the object tracker drives the estimator, the estimator drives the control node, the control node sends a new command to the robot, and the process repeats. In the long run, the system won't be driven by the 30Hz rate of the Kinect, but by an independent timer that will get robot position updates more frequently.

[[Image:Puppeteer_Block_Diagram_2.png|600px|Puppeteer Block Diagram|center]]

Details and videos of the system can be found on the main research page for the [http://lims.mech.northwestern.edu/RESEARCH/current_projects/Puppets/Marionette_homepage.html puppeteer project].

===Kinect Overview===

It is helpful to have some background information on the Kinect to understand how this system operates. The Microsoft Kinect is a device developed by PrimeSense for the Xbox 360 gaming platform to allow the system to track the user's motion and gestures. Because of its low cost and relatively accurate sensor, the open source software community quickly rallied around it. Within three hours of its release, the Kinect's protocol had been hacked and drivers were released under an open source license. This resulted in the OpenKinect project and the implementation of these drivers in several different languages and frameworks. The most significant of these efforts was Willow Garage's decision to support the Kinect and create a stack for it in ROS. Shortly after, PrimeSense released their own open source drivers and NITE middleware that gave users similar versions of their skeleton tracking and edge detection algorithms. All of this software was grouped under the OpenNI project. Willow Garage quickly adopted these drivers and stopped supporting the OpenKinect version. In parallel with this, Willow Garage also began developing a new version of their point cloud library called PCL2. This made working with the raw Kinect data much easier. See below for a hardware summary and relevant links.


'''Kinect Hardware Summary:'''

1 RGB camera (640x480)

1 infrared camera (640x480 with 2048 depth levels)

1 infrared emitter

Structured Light approach to measuring depth

30 Hz update rate


'''Relevant Links:'''

[http://www.primesense.com/ PrimeSense Homepage]

PrimeSense NITE Middleware

OpenNI Homepage

Point Cloud Library Homepage

Microsoft Kinect Homepage

Kinect Teardown - iFixit

Structured Light - Wikipedia

Kinect Projects

===Installation===

All of the packages used in this example can be found on the following [https://github.com/jakeware/ github page]. Here are individual links to the packages: ([https://github.com/jakeware/serial_node Serial Node], [https://github.com/jakeware/estimator_node Estimator Node], [https://github.com/jakeware/puppeteer_control Control Node], [https://github.com/jakeware/object_tracker Object Tracker Node], [https://github.com/jakeware/puppeteer_markers Marker Node], [https://github.com/jakeware/keyboard_node Keyboard Node], [https://github.com/jakeware/puppeteer_msgs Puppeteer Messages])

Note that these repositories are constantly being updated and their overall structure may diverge from the descriptions listed below. You can find static versions of the source code [[Puppeteer_Code.zip|here]].

Once you are in a folder in the ROS package path, enter the following commands to download and build the package:
<pre>
roscreate-pkg PACKAGE_NAME
cd PACKAGE_NAME
git clone git@github.com:jakeware/PACKAGE_NAME.git
cd PACKAGE_NAME
mv * .git .gitignore ../
cd ..
rm -r PACKAGE_NAME
rosmake
</pre>

===Serial Node===

This package is the only node with access to the serial port. Therefore, it is the only node that can use the XBee to talk with the robot. Its primary function is to provide two services to the rest of the system: a speed command service and a position request service. When another node calls either of these services, the serial node takes the incoming message definition and compiles a string of custom floats to send to the robot. If the serial node received a speed command, it simply sends the string out to the XBee and reports back to the requesting node whether the operation was successful. If the serial node received a position request, it compiles a similar string and then waits for the reply from the robot. When it receives the reply, it sends this information back to the node that made the original request. If it does not get a reply from the robot, it will eventually time out and return a failure to the requesting node. This node also has an added safety feature: it watches stdin in its terminal for a keystroke and, if it sees one, shuts down both services and begins sending the stop string repeatedly.
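
The following sketch illustrates the "compile a string of custom floats" step. The robot's actual wire format is not documented here, so the header byte, float layout, and checksum below are all invented for illustration; only the idea of packing floats into a frame comes from the text.

```python
import struct

def pack_speed_command(left, right, winch):
    """Frame a speed command for the serial port (hypothetical layout:
    1 header byte + three little-endian floats + 1 checksum byte)."""
    header = b's'                                      # assumed speed-command marker
    payload = struct.pack('<fff', left, right, winch)  # three 32-bit floats
    checksum = bytes([sum(header + payload) % 256])    # simple additive checksum
    return header + payload + checksum

def unpack_speed_command(frame):
    """Validate the checksum and recover the three floats."""
    if frame[-1] != sum(frame[:-1]) % 256:
        raise ValueError("corrupted frame")
    return struct.unpack('<fff', frame[1:-1])
```

A real implementation would write the frame to the FTDI serial device and, for position requests, block on the reply with a timeout, as described above.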

===Estimator Node===

The estimator node is responsible for collecting the state of both the mass and the robot. Currently, it is driven by the Kinect frequency and will not function unless the object tracker publishes a new mass position on its outgoing topic. Once the estimator node gets a new mass position, it calls the position request service and waits for the serial node to pass back the robot's reported position. Once it has all of this information, it assembles the system state and publishes it on the system_state topic.
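
The assembly step can be sketched as follows. The field names and the SystemState class are hypothetical (the real message definition lives in the puppeteer_msgs package); the service call is stood in for by a plain callable.

```python
import time

class SystemState:
    """Hypothetical stand-in for the system_state message."""
    def __init__(self, mass_pos, robot_pos, stamp):
        self.mass_pos = mass_pos    # (x, y, z) of the mass from the object tracker
        self.robot_pos = robot_pos  # (x, y, theta) reported by the robot
        self.stamp = stamp          # time at which the state was assembled

def assemble_state(mass_pos, request_robot_position):
    """Called whenever the object tracker publishes a new mass position.
    request_robot_position stands in for the serial node's service call
    and blocks until the robot replies."""
    robot_pos = request_robot_position()
    return SystemState(mass_pos, robot_pos, time.time())
```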

===Object Tracker===

The object tracker is responsible for finding the mass position given an RGBD point cloud from the Kinect. It has two modes of operation. First, it looks at the point cloud from the entire area under the puppeteer stage and finds the centroid of all the points in this cloud. Once it has found the object, it only looks at a cube several inches wide around the last valid centroid position. This reduces the computation time and cuts out noise dispersed across the entire point cloud. If it ever finds a centroid from a cloud with very few data points, it assumes it has lost the mass and returns to looking at the entire puppeteer stage. Although the mass position is passed to the estimator node through the object1_position topic, the tracker also publishes a point cloud for the object, a point cloud for all of the Kinect data, and a frame for the mass position. All three of these can be viewed in rviz.
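
The two-mode search can be sketched in a few lines. This is not the actual object_tracker code (which works on PCL point clouds in C++); the cube half-width and minimum-point threshold are invented values.

```python
CUBE_HALF_WIDTH = 0.05  # assumed half-width of the search cube, in meters
MIN_POINTS = 20         # assumed point count below which the mass is "lost"

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def track(points, last=None):
    """Return (estimate, new_last). When last is set, only points inside a
    small cube around it are considered; too few surviving points means the
    mass is lost and the next call searches the whole cloud again."""
    if last is not None:
        points = [p for p in points
                  if all(abs(p[i] - last[i]) <= CUBE_HALF_WIDTH for i in range(3))]
    if len(points) < MIN_POINTS:
        return None, None
    c = centroid(points)
    return c, c
```

Restricting the search window this way is what filters out noise far from the mass while keeping the per-frame cost low at the Kinect's 30Hz rate.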

===Control Node===

The control node lies downstream of the estimator node and is driven by updates to the system_state topic. This is the most complex node in the system and is responsible for using the optimal control gains, desired state, and current state to generate the next set of controls to be sent to the robot.

===Keyboard Node===

The keyboard node is a dedicated node for user input. In any system with this many nodes, you end up running programs in several different terminals, so it is often hard to quickly start or stop certain nodes. This is a problem when you are trying to recover from an error or stop the robot from doing something unsafe. To fix this, you can use a dedicated node that changes variables stored on the global parameter server and modify the other nodes to watch these parameters. The keyboard node was designed to do just this, and responds to several different commands. Its primary function is to switch the system condition between idle, calibrate, run, and stop. Although there is a fifth condition, emergency stop, only other nodes can set the system to that condition. This node uses a timer and the kbhit function to check stdin at 200Hz. Any key that isn't explicitly defined will set the system condition to stop.
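
A minimal sketch of this dispatch logic follows. The actual key bindings are not documented here, so the mapping is hypothetical; only the "default to stop" behavior comes from the text.

```python
import select
import sys

# Hypothetical key bindings; only "unknown key -> stop" is from the text.
KEY_TO_CONDITION = {
    'i': 'idle',
    'c': 'calibrate',
    'r': 'run',
    's': 'stop',
}

def handle_key(key):
    # Any key that isn't explicitly defined sets the condition to stop.
    return KEY_TO_CONDITION.get(key, 'stop')

def kbhit():
    """Non-blocking check for pending input on stdin, similar in spirit to
    the kbhit function mentioned above (POSIX only)."""
    ready, _, _ = select.select([sys.stdin], [], [], 0)
    return bool(ready)
```

In the real node, the resulting condition would be written to the parameter server so every other node can react to it on its next cycle.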

===Marker Node===

This node uses ROS's visualization stack to generate markers that are visible in rviz. These markers are easy to implement and are tremendously helpful for debugging transformations and data with a spatial component. This node is subscribed to and driven by the system_state topic. Although the data is not in the same coordinate system as the optimization and the control node, it lines up properly with the Kinect point cloud and camera image.

==Highlights==

The following highlights are things that were learned over two quarters of coding in ROS.

===Timers===

ROS has made it easy to implement timers that run in different threads. Find details [http://www.ros.org/wiki/roscpp/Overview/Timers here].

===Parameter Server===

The parameter server is essentially a reserved space in ROS for global variables. Any node can check to see if they exist, get them, or set them. Check [http://www.ros.org/wiki/Parameter%20Server here] for details.

===Severity Levels===

Although it is still possible to use basic print statements in ROS, there is a built-in system for creating print statements with severity levels attached to them. These severity levels are DEBUG, INFO, WARN, and ERROR. The system defaults to only printing INFO, WARN, and ERROR. As such, it is a good idea to put debug statements all over your code and just leave them there. Although you will not see them normally, you can change this using the [http://www.ros.org/wiki/rxconsole rxconsole] command line tool. ROS also automatically color codes the different severity levels so you can easily distinguish between them. This is very helpful when you are printing out a ton of information.

===Launch Files===

Launch files are hugely helpful when you begin working with larger systems. They allow you to launch multiple nodes at the same time and pass configuration parameters to them. Find more information about them [http://www.ros.org/wiki/roslaunch here].
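
Launch files are plain XML. A hypothetical launch file for part of this system might look like the following; the package names come from the repositories linked above, but the executable names and the parameter are assumptions for illustration:

```xml
<launch>
  <!-- Hypothetical example: executable names and parameters are assumed. -->
  <node pkg="serial_node" type="serial_node" name="serial_node" output="screen" />
  <node pkg="estimator_node" type="estimator_node" name="estimator_node" />
  <node pkg="puppeteer_control" type="control.py" name="control_node">
    <!-- Parameters set here land on the parameter server under this node. -->
    <param name="run_type" value="open-loop" />
  </node>
</launch>
```

A single roslaunch invocation then starts every node in the file, instead of one terminal per node.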

===Data Logging===

ROS has a built-in system for collecting, compressing, saving, replaying, and viewing any message data being passed within the system. The compressed files are called bag files. You can use [http://www.ros.org/wiki/rosbag rosbag] to collect the data and [http://www.ros.org/wiki/rxbag rxbag] to view and replay the data.

===System Diagram===

rxgraph is a tool for visualizing the layout of your current ROS system. This can be very useful for debugging system structure problems. Find details [http://www.ros.org/wiki/rxgraph here].

===Visualization===

Although a little troublesome at times, rviz is a tremendously powerful tool for visualizing all sorts of spatial data. Any robotic system can benefit hugely from this tool. Find details [http://www.ros.org/wiki/rviz here].

==Resources==

ROS on Wikipedia

ROS Homepage

ROS Documentation

ROS Getting Started

ROS Tutorials

git Homepage

github Homepage