@@ -63,7 +63,7 @@ If no CUDA enabled device is available, you can build without the GPU implementa
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```

- ### Install and run example
+ ### Install and run the example

The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
and mesh models of some objects. Additionally it contains launch files, which allow
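If you are unsure which options an earlier build used, `catkin_make` records them in the generated CMake cache (`build/CMakeCache.txt`), so a quick grep confirms whether the GPU implementation was disabled. A minimal sketch, demonstrated on a stand-in cache file under a hypothetical `/tmp/ws_demo` path rather than a real workspace:

```shell
# catkin_make stores the configured options in build/CMakeCache.txt;
# grepping for DBOT_BUILD_GPU shows the value used for the last build.
# A stand-in cache file is created here so the sketch is self-contained.
mkdir -p /tmp/ws_demo/build
echo "DBOT_BUILD_GPU:BOOL=Off" > /tmp/ws_demo/build/CMakeCache.txt
grep DBOT_BUILD_GPU /tmp/ws_demo/build/CMakeCache.txt
# prints: DBOT_BUILD_GPU:BOOL=Off
```

In a real workspace you would grep the cache file inside your catkin `build` directory instead of the stand-in path.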
@@ -90,6 +90,9 @@ If you did not install CUDA, you can run instead:
``` bash
roslaunch dbot_example launch_example_cpu.launch
```
+ Note that the tracking performance is significantly better with the GPU version.
+
+
As soon as you launch the example, an interactive marker should show up in
rviz. This is for initialization of the tracker, you can move it to align it
with the point cloud, but it should already be approximately aligned. Once you
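The choice between the GPU and CPU launch files can also be made automatically by checking for a working NVIDIA device; `nvidia-smi` ships with the NVIDIA driver. A minimal sketch (it only prints the command it would run, so it is safe to try on any machine):

```shell
# Pick the launch file based on whether an NVIDIA GPU (and driver) is present.
# nvidia-smi is installed with the NVIDIA driver; if it is missing or fails,
# fall back to the CPU version. This sketch only echoes the chosen command.
if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
    launch_file=launch_example_gpu.launch
else
    launch_file=launch_example_cpu.launch
fi
echo "roslaunch dbot_example $launch_file"
```

Replace the `echo` with the `roslaunch` call itself once you have verified the selection is correct for your machine.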
@@ -117,9 +120,10 @@ inproceedings{wuthrich-iros-2013,

## Robot Tracking

+ ### Workspace setup and compilation
The robot tracking setup builds on top of the object tracking, i.e. follow
- first the workspace setup and of the object tracking above. Then checkout
- the following package to the workspace
+ first the workspace setup of the object tracking above. Then continue
+ with the instructions below:

``` bash
cd $HOME
@@ -133,7 +137,7 @@ Again, if no CUDA enabled device is available, you can deactivate the GPU implem
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```

- ### Example Robot Tracking Project using MPI Apollo Robot
+ ### Install and run the example

Add the following example project to the workspace

@@ -144,17 +148,26 @@ cd ..
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
source devel/setup.bash
```
- Once compile you can run the robot tracker along with the
+ Now you can run the robot tracker along with the
recorded sensory data:

``` bash
roslaunch dbrt_example launch_example_gpu.launch
```

+ If CUDA is not being used, you can start the CPU based setup instead:
+ ``` bash
+ roslaunch dbrt_example launch_example_cpu.launch
+ ```
+ Note that the tracking performance is significantly better with the GPU version.
+
This will start the data playback, the visualization and the robot tracker.
+ You should see a point cloud in white, the robot model using only joint
+ encoders in red, and the corrected robot model in blue. It should be visible
+ that the blue robot model is significantly better aligned with the point cloud than
+ the red one.
+

- If CUDA is not being used, you can start the CPU based setup by launching
- ` launch_example_cpu.launch ` instead. Note that the CPU version will run slower.

### Additional documentation
