README.md (22 additions, 6 deletions)
For a closed-source version of ORB-SLAM3 for commercial purposes, please contact the authors.

If you use ORB-SLAM3 in an academic work, please cite:

```
@article{ORBSLAM3_TRO,
  title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial
         and Multi-Map {SLAM}},
  author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel,
```
```
chmod +x build.sh
```

This will create **libORB_SLAM3.so** in the *lib* folder and the executables in the *Examples* folder.

# 4. Running ORB-SLAM3 with your camera

Directory `Examples` contains several demo programs and calibration files to run ORB-SLAM3 in all sensor configurations with the Intel RealSense cameras T265 and D435i. The steps needed to use your own camera are:
1. Calibrate your camera following `Calibration_Tutorial.pdf` and write your calibration file `your_camera.yaml`.

2. Modify one of the provided demos to suit your specific camera model, and build it.

3. Connect the camera to your computer using USB3 or the appropriate interface.

4. Run ORB-SLAM3. For example, for our D435i camera, we would execute:
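The command itself is cut off in this capture; a plausible invocation, assuming a stereo-inertial RealSense demo binary and a calibration file shipped in `Examples/Stereo-Inertial` (both names are assumptions, check your build output), would be:

```shell
# Hypothetical example: binary and .yaml names depend on your build and camera
./Examples/Stereo-Inertial/stereo_inertial_realsense_D435i \
    ./Vocabulary/ORBvoc.txt \
    ./Examples/Stereo-Inertial/RealSense_D435i.yaml
```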
# 5. EuRoC Examples

[EuRoC dataset](http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets) was recorded with two pinhole cameras and an inertial sensor. We provide an example script to launch EuRoC sequences in all the sensor configurations.

1. Download a sequence (ASL format) from http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets
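To make the launch concrete, a single monocular run on a downloaded sequence might look like the following (the executable name, settings file, and paths are illustrative assumptions, not verified against the launch script):

```shell
# Illustrative paths: adjust to where you unpacked the sequence
./Examples/Monocular/mono_euroc \
    ./Vocabulary/ORBvoc.txt ./Examples/Monocular/EuRoC.yaml \
    ~/Datasets/EuRoC/MH01 \
    ./Examples/Monocular/EuRoC_TimeStamps/MH01.txt \
    dataset-MH01_mono
```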
Execute the following script to process sequences and compute the RMS ATE:

```
./euroc_eval_examples
```
# 6. TUM-VI Examples

[TUM-VI dataset](https://vision.in.tum.de/data/datasets/visual-inertial-dataset) was recorded with two fisheye cameras and an inertial sensor.
1. Download a sequence from https://vision.in.tum.de/data/datasets/visual-inertial-dataset and uncompress it.
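By analogy with the EuRoC examples, a single-sequence run might look like this (every name below is a placeholder; check the `Examples` folder for the actual binaries and settings files):

```shell
# Placeholder names throughout: verify against your checkout
./Examples/Monocular/mono_tum_vi \
    ./Vocabulary/ORBvoc.txt ./Examples/Monocular/TUM_512.yaml \
    ~/Datasets/TUM_VI/dataset-room1_512_16 \
    dataset-room1_512_mono
```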
Execute the following script to process sequences and compute the RMS ATE:

```
./tum_vi_eval_examples
```

# 7. ROS Examples

### Building the nodes for mono, mono-inertial, stereo, stereo-inertial and RGB-D

Tested with ROS Melodic and Ubuntu 18.04.
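Once built, a node is launched with the usual `rosrun` pattern; for instance, the monocular node takes a vocabulary and a settings file (the node name and placeholder arguments below follow ORB-SLAM conventions and are illustrative):

```shell
# Illustrative: run the monocular node
rosrun ORB_SLAM3 Mono PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE
```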
Once ORB-SLAM3 has loaded the vocabulary, press space in the rosbag tab.

A flag in `include/Config.h` activates time measurements. It is necessary to uncomment the line `#define REGISTER_TIMES` to obtain the time stats of one execution, which are shown in the terminal and stored in a text file (`ExecTimeMean.txt`).
# 9. Calibration

You can find a tutorial for visual-inertial calibration and a detailed description of the contents of valid configuration files in `Calibration_Tutorial.pdf`.