|
1 | | -# ROS Depth Based Robot Tracking Library (dbrt) |
2 | | - |
3 | | -This package provides three robot trackers:
4 | | -* Rotary tracker: A robot tracker based on joint angle measurements. This
5 | | -  tracker typically runs at 100Hz-1kHz.
6 | | -* Visual tracker: This tracker uses depth images from a Kinect or XTION to
7 | | -  estimate the joint state. The filter is based on the Rao-Blackwellized
8 | | -  coordinate descent particle filter implemented in the
9 | | -  [dbot](https://github.com/bayesian-object-tracking/dbot) package. The tracking
10 | | -  rate lies between 5Hz and 30Hz, depending on the degrees of freedom and model
11 | | -  configuration.
12 | | -* Fusion tracker: This tracker fuses the two filters above. The fusion
13 | | -  takes into account the camera delay and the computational time of the
14 | | -  vision filter.
15 | | - |
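The delay handling in the fusion tracker can be sketched roughly as follows. This is a minimal single-joint illustration under simplifying assumptions (a scalar state and a made-up correction gain), not the actual dbrt implementation: buffer the high-rate joint measurements, and when a delayed visual estimate arrives, roll the state back to the image capture time, correct it there, and replay the buffered measurements.

```python
from collections import deque


class DelayCompensatedFusion:
    """Toy sketch of delay-compensated fusion for a single joint.

    Illustration only: the real tracker uses probabilistic filters,
    not a fixed scalar gain.
    """

    def __init__(self):
        self.buffer = deque()  # (timestamp, encoder increment) pairs
        self.state = 0.0       # current joint angle estimate

    def on_joint_measurement(self, t, delta):
        # High-rate path: integrate encoder increments immediately.
        self.buffer.append((t, delta))
        self.state += delta

    def on_visual_estimate(self, t_capture, estimate, gain=0.5):
        # Low-rate path: the image was captured at t_capture, which lies
        # in the past by the time the vision filter result arrives.
        # 1. Roll the state back to the capture time,
        replay = [(t, d) for (t, d) in self.buffer if t > t_capture]
        past = self.state - sum(d for _, d in replay)
        # 2. correct it with the visual estimate,
        corrected = past + gain * (estimate - past)
        # 3. replay the joint increments received since the capture.
        self.state = corrected + sum(d for _, d in replay)
        # Measurements older than the capture time are no longer needed.
        while self.buffer and self.buffer[0][0] <= t_capture:
            self.buffer.popleft()
```

Replaying the buffered encoder increments is what lets the corrected estimate stay current even though the image it is based on is old.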
16 | | -## Running robot trackers with default configuration |
17 | | - |
18 | | -These trackers are robot-specific. You will need to create your own package
19 | | -and add your config files and URDF models there.
20 | | -Take the [dbrt_apollo](http://git-amd.tuebingen.mpg.de/amd-clmc/dbrt_apollo)
21 | | -Apollo robot configuration package as an example or template.
22 | | - |
23 | | -Make sure the robot is publishing the joint state to the `/joint_states`
24 | | -topic. If you are using the visual or the fusion tracker, also make sure that
25 | | -the depth camera is providing depth images on the topic specified in
26 | | -`dbrt_my_robot/config/camera.yaml`.
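For orientation, such a camera config could look like the fragment below. The key names here are illustrative assumptions, not the actual dbrt schema; consult the config file shipped with your package:

```yaml
# dbrt_my_robot/config/camera.yaml -- illustrative only; the actual
# key names are defined by dbrt and may differ.
depth_image_topic: /camera/depth/image
camera_info_topic: /camera/depth/camera_info
```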
27 | | - |
28 | | -Launching a tracker takes a single `roslaunch` call:
29 | | -```bash |
30 | | -roslaunch dbrt_apollo rotary_tracker.launch |
31 | | -``` |
32 | | -```bash |
33 | | -roslaunch dbrt_apollo visual_tracker.launch |
34 | | -``` |
35 | | -```bash |
36 | | -roslaunch dbrt_apollo fusion_tracker.launch |
37 | | -``` |
| 1 | +# ROS Depth-Based Robot Tracking Package (dbrt) |
| 2 | + |
| 3 | +This package extends the object tracking packages dbot and dbot_ros to track
| 4 | +articulated rigid bodies with many degrees of freedom. In addition to depth
| 5 | +images, the robot tracker incorporates joint angle measurements at a higher
| 6 | +rate, typically 100Hz-1kHz. Here are some of the core features:
| 7 | + |
 | 8 | + * Provides joint state estimates at the rate of the joint encoders
 | 9 | + * Compensates for inaccurate kinematics by estimating biases on the joint
 | 10 | +   angles
 | 11 | + * Estimates the transformation from the head camera to the robot base, if
 | 12 | +   needed; typically, the exact camera location is not known precisely
 | 13 | + * Handles occlusions
 | 14 | + * Copes with camera delays
 | 15 | + * Requires only the model, i.e. the URDF description including the link meshes
| 16 | + |
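The bias compensation feature above can be illustrated with a toy estimator. This is a deliberately simplified sketch (an exponential average of the encoder/vision discrepancy), not the Bayesian joint model dbrt actually uses:

```python
class JointBiasEstimator:
    """Toy sketch: model the encoder reading as true angle + bias and
    estimate the bias from the discrepancy between the encoder and an
    absolute (e.g. visual) angle estimate.

    Illustration only; `alpha` is an assumed smoothing factor.
    """

    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.bias = 0.0

    def update(self, encoder_angle, visual_angle):
        # Discrepancy between the biased encoder and the visual estimate.
        residual = encoder_angle - visual_angle
        # Exponentially smoothed bias estimate.
        self.bias += self.alpha * (residual - self.bias)
        # Return the bias-corrected joint angle.
        return encoder_angle - self.bias
```

With a persistent offset between encoder and vision, the bias estimate converges to that offset and the corrected angle to the visual one.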
| 17 | +## Getting Started Example |
| 18 | + |
 | 19 | +Check out the
 | 20 | +[dbrt_getting_started](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started.git)
 | 21 | +package for a full example of a robot setup with recorded data. You can find
 | 22 | +the setup steps in the [Getting Started](https://github.com/bayesian-object-tracking/getting_started#robot-tracking)
 | 23 | +documentation.
| 24 | + |
| 25 | +## Setting Up Your Own Robot |
 | 26 | +Provided a URDF, you only need to adapt the Fusion Tracker config. For that, take
| 27 | +the [dbrt_example](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started/tree/master/dbrt_example) |
| 28 | +as an example. |
| 29 | + |
 | 30 | +In the Fusion Tracker config file, you have to map all the joint names to
| 31 | +uncertainty standard deviations for the joint process model and joint |
| 32 | +observation models. The [dbrt_example](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started/tree/master/dbrt_example) |
| 33 | +package provides a good starting point. |
| 34 | + |
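Such a mapping could look like the fragment below. The structure and key names are assumptions for illustration; take the actual structure from the dbrt_example config:

```yaml
# Illustrative sketch only; use dbrt_example as the authoritative template.
joint_transition:
  joint_sigmas:
    joint_1: 0.02   # process noise std. dev. (rad)
    joint_2: 0.02
joint_observation:
  joint_sigmas:
    joint_1: 0.1    # observation noise std. dev. (rad)
    joint_2: 0.1
```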
| 35 | +### URDF Camera Frame |
| 36 | + |
| 37 | +In case your URDF model does not specify a camera link, you have to attach |
| 38 | +one to some part of the robot where the camera is mounted. This requires |
| 39 | +connecting a camera link through a joint to another link of the robot. Take a |
 | 40 | +look at [head.urdf.xacro](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started/blob/master/apollo_robot_model/models/head.urdf.xacro#L319).
 | 41 | +The XTION camera link *XTION_RGB* is connected to the link *B_HEAD* through the
| 42 | +joint *XTION_JOINT*. The transformation between the camera and the robot is not |
| 43 | +required to be very precise. However, it must be accurate enough to provide |
| 44 | +a rough initial pose. |
| 45 | + |
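Using the names from the example above, such an attachment could look roughly like this; the origin offset is a placeholder for your robot's approximate mounting pose:

```xml
<!-- Attach a camera link to the head via a fixed joint. -->
<link name="XTION_RGB"/>
<joint name="XTION_JOINT" type="fixed">
  <parent link="B_HEAD"/>
  <child link="XTION_RGB"/>
  <!-- Rough mounting pose; it only needs to be accurate enough
       to provide a coarse initial estimate. -->
  <origin xyz="0.0 0.1 0.1" rpy="0 0 0"/>
</joint>
```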
| 46 | +Finally, the camera link name (here XTION_RGB) must match the camera frame |
 | 47 | +provided by the point cloud topic. To determine the name of the depth camera
 | 48 | +frame (or the RGB frame, if registration is used), run
38 | 49 |
|
39 | | -## Running robot trackers with custom configuration |
40 | | -The predefined launch files allow you to pass a custom config file |
41 | | -```bash |
42 | | -roslaunch dbrt_apollo rotary_tracker.launch rotary_tracker_config:=my_custom_rotary_tracker_config.yaml |
43 | | -``` |
44 | | -```bash |
45 | | -roslaunch dbrt_apollo visual_tracker.launch visual_tracker_config:=my_custom_visual_tracker_config.yaml |
46 | | -``` |
47 | 50 | ```bash |
48 | | -roslaunch dbrt_apollo fusion_tracker.launch fusion_tracker_config:=my_custom_fusion_tracker_config.yaml |
| 51 | +rostopic echo /camera/depth/camera_info |
49 | 52 | ``` |
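The frame name is reported in the `frame_id` field of the message header; an illustrative excerpt (the value depends on your camera setup):

```yaml
header:
  frame_id: XTION_RGB
```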
50 | 53 |
|