├── .nojekyll ├── README.md ├── _navbar.md ├── _sidebar.md ├── css └── vue.css ├── dev_guide ├── README.md ├── code_style.md ├── docs_preview.md └── pre_requisites.md ├── en ├── _navbar.md ├── _sidebar.md ├── dev_guide │ ├── README.md │ ├── code_style.md │ ├── docs_preview.md │ └── pre_requisites.md ├── hardware_specifications.md ├── quick_start │ ├── quick_test.md │ └── setup_on_manifold2.md ├── resources.md ├── roborts.md ├── sdk_docs │ ├── _sidebar.md │ ├── architecture.md │ ├── roborts_base.md │ ├── roborts_bringup.md │ ├── roborts_camera.md │ ├── roborts_decision.md │ ├── roborts_detection.md │ ├── roborts_localization.md │ ├── roborts_planning_global_planner.md │ └── roborts_planning_local_planner.md └── software_framework.md ├── hardware_specifications.md ├── images ├── AMCL_flowchart.png ├── AMCL_mindnode.png ├── airobot1.gif ├── airobot2.gif ├── architecture ├── architecture.png ├── armor_detection.png ├── armor_detection_server.png ├── camera.png ├── chassis.png ├── chassis_executor.png ├── gimbal.jpeg ├── gimbal_executor.png ├── global_planner.png ├── global_planner_actionlib.png ├── global_planner_plan.png ├── global_planner_rviz.png ├── local_planner.png ├── localization.png ├── localization_rviz.png ├── modules.png ├── navigation.png ├── referee_system.png ├── referee_system2.png ├── roborts_base_node.png ├── robot.png ├── robot45.jpg ├── run_example.gif ├── software.png ├── system.png └── system_diagram.png ├── index.html ├── js ├── docsify.min.js ├── prism-bash.js └── prism-cpp.js ├── pdf └── projectile_model.pdf ├── quick_start ├── quick_test.md └── setup_on_manifold2.md ├── resources.md ├── roborts.md ├── sdk_docs ├── _sidebar.md ├── architecture.md ├── roborts_base.md ├── roborts_bringup.md ├── roborts_camera.md ├── roborts_decision.md ├── roborts_detection.md ├── roborts_localization.md ├── roborts_planning_global_planner.md └── roborts_planning_local_planner.md └── software_framework.md /.nojekyll: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/.nojekyll -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |

RoboMaster 2019 AI Robot Platform

2 | 3 | ![modules](https://rm-static.djicdn.com/documents/20758/d4e8eabc0a8161547546323338435301.jpg) 4 | 5 | **RoboMaster 2019 AI Robot Platform** is a universal mobile robot platform for the [**ICRA 2019 RoboMaster AI Challenge**](https://www.icra2019.org/competitions/dji-robomaster-ai-challenge). It is an open-source platform covering not only the mechanical design but also the software: a real-time embedded framework based on the STM32 series, and the onboard RoboRTS framework for advanced robotics algorithms, fully supported in ROS with community-driven as well as platform-customized code and examples. 6 | 7 | 8 | 9 | -------------------------------------------------------------------------------- /_navbar.md: -------------------------------------------------------------------------------- 1 | * [中文](/roborts) 2 | * [English](/en/roborts) 3 | * [RoboMaster](https://www.robomaster.com/) 4 | -------------------------------------------------------------------------------- /_sidebar.md: -------------------------------------------------------------------------------- 1 | 2 | * 概述 3 | * [RoboMaster 2019](roborts.md) 4 | * [硬件规格](hardware_specifications.md) 5 | * [软件框架](software_framework.md) 6 | * [相关资料](resources.md) 7 | * 快速入门 8 | * [Manifold 2 配置指南](quick_start/setup_on_manifold2.md) 9 | * [快速测试](quick_start/quick_test.md) 10 | * 开发指南 11 | * [基础知识](dev_guide/pre_requisites.md) 12 | * [代码规范](dev_guide/code_style.md) 13 | * 说明文档 14 | * [SDK](sdk_docs/architecture.md) 15 | -------------------------------------------------------------------------------- /css/vue.css: -------------------------------------------------------------------------------- 1 | @import url("https://fonts.googleapis.com/css?family=Roboto+Mono|Source+Sans+Pro:300,400,600"); 2 | * { 3 | -webkit-font-smoothing: antialiased; 4 | -webkit-overflow-scrolling: touch; 5 | -webkit-tap-highlight-color: rgba(0,0,0,0); 6 | -webkit-text-size-adjust: none; 7 | -webkit-touch-callout: none; 8 | -webkit-box-sizing: border-box; 9 | 
box-sizing: border-box; 10 | } 11 | body:not(.ready) { 12 | overflow: hidden; 13 | } 14 | body:not(.ready) [data-cloak], 15 | body:not(.ready) .app-nav, 16 | body:not(.ready) > nav { 17 | display: none; 18 | } 19 | div#app { 20 | font-size: 30px; 21 | font-weight: lighter; 22 | margin: 40vh auto; 23 | text-align: center; 24 | } 25 | div#app:empty::before { 26 | content: 'Loading...'; 27 | } 28 | .emoji { 29 | height: 1.2rem; 30 | vertical-align: middle; 31 | } 32 | .progress { 33 | background-color: var(--theme-color, #fff); 34 | height: 2px; 35 | left: 0px; 36 | position: fixed; 37 | right: 0px; 38 | top: 0px; 39 | -webkit-transition: width 0.2s, opacity 0.4s; 40 | transition: width 0.2s, opacity 0.4s; 41 | width: 0%; 42 | z-index: 999999; 43 | } 44 | .search a:hover { 45 | color: var(--theme-color, #fff); 46 | } 47 | .search .search-keyword { 48 | color: var(--theme-color, #fff); 49 | font-style: normal; 50 | font-weight: bold; 51 | } 52 | html, 53 | body { 54 | height: 100%; 55 | } 56 | body { 57 | -moz-osx-font-smoothing: grayscale; 58 | -webkit-font-smoothing: antialiased; 59 | color: #707473; 60 | font-family: "Gotham", "Open Sans", "PingFangSC-Regular", "Helvetica Neue", "Hiragino Sans GB", "Microsoft YaHei", "WenQuanYi Micro Hei", Arial, sans-serif; 61 | font-size: 15px; 62 | letter-spacing: 0; 63 | margin: 0; 64 | overflow-x: hidden; 65 | } 66 | img { 67 | max-width: 100%; 68 | } 69 | a[disabled] { 70 | cursor: not-allowed; 71 | opacity: 0.6; 72 | } 73 | kbd { 74 | border: solid 1px #ccc; 75 | border-radius: 3px; 76 | display: inline-block; 77 | font-size: 12px !important; 78 | line-height: 12px; 79 | margin-bottom: 3px; 80 | padding: 3px 5px; 81 | vertical-align: middle; 82 | } 83 | .task-list-item { 84 | list-style-type: none; 85 | } 86 | li input[type='checkbox'] { 87 | margin: 0 0.2em 0.25em -1.6em; 88 | vertical-align: middle; 89 | } 90 | .app-nav { 91 | margin: 25px 60px 0 0; 92 | position: absolute; 93 | right: 0; 94 | text-align: right; 95 | 
z-index: 10; 96 | /* navbar dropdown */ 97 | } 98 | .app-nav.no-badge { 99 | margin-right: 25px; 100 | } 101 | .app-nav p { 102 | margin: 0; 103 | } 104 | .app-nav > a { 105 | margin: 0 1rem; 106 | padding: 5px 0; 107 | } 108 | .app-nav ul, 109 | .app-nav li { 110 | display: inline-block; 111 | list-style: none; 112 | margin: 0; 113 | } 114 | .app-nav a { 115 | color: inherit; 116 | font-size: 16px; 117 | text-decoration: none; 118 | -webkit-transition: color 0.3s; 119 | transition: color 0.3s; 120 | } 121 | .app-nav a:hover { 122 | color: var(--theme-color, #44a8f2); 123 | } 124 | .app-nav a.active { 125 | border-bottom: 2px solid var(--theme-color, #44a8f2); 126 | color: var(--theme-color, #44a8f2); 127 | } 128 | .app-nav li { 129 | display: inline-block; 130 | margin: 0 1rem; 131 | padding: 5px 0; 132 | position: relative; 133 | } 134 | .app-nav li ul { 135 | background-color: #fff; 136 | border: 1px solid #ddd; 137 | border-bottom-color: #ccc; 138 | border-radius: 4px; 139 | -webkit-box-sizing: border-box; 140 | box-sizing: border-box; 141 | display: none; 142 | max-height: calc(100vh - 61px); 143 | overflow-y: auto; 144 | padding: 10px 0; 145 | position: absolute; 146 | right: -15px; 147 | text-align: left; 148 | top: 100%; 149 | white-space: nowrap; 150 | } 151 | .app-nav li ul li { 152 | display: block; 153 | font-size: 14px; 154 | line-height: 1rem; 155 | margin: 0; 156 | margin: 8px 14px; 157 | white-space: nowrap; 158 | } 159 | .app-nav li ul a { 160 | display: block; 161 | font-size: inherit; 162 | margin: 0; 163 | padding: 0; 164 | } 165 | .app-nav li ul a.active { 166 | border-bottom: 0; 167 | } 168 | .app-nav li:hover ul { 169 | display: block; 170 | } 171 | .github-corner { 172 | border-bottom: 0; 173 | position: fixed; 174 | right: 0; 175 | text-decoration: none; 176 | top: 0; 177 | z-index: 1; 178 | } 179 | .github-corner:hover .octo-arm { 180 | -webkit-animation: octocat-wave 560ms ease-in-out; 181 | animation: octocat-wave 560ms ease-in-out; 182 
| } 183 | .github-corner svg { 184 | color: #fff; 185 | fill: var(--theme-color, #44a8f2); 186 | height: 80px; 187 | width: 80px; 188 | } 189 | main { 190 | display: block; 191 | position: relative; 192 | width: 100vw; 193 | height: 100%; 194 | z-index: 0; 195 | } 196 | main.hidden { 197 | display: none; 198 | } 199 | .anchor { 200 | display: inline-block; 201 | text-decoration: none; 202 | -webkit-transition: all 0.3s; 203 | transition: all 0.3s; 204 | } 205 | .anchor span { 206 | color: #555; 207 | } 208 | .anchor:hover { 209 | text-decoration: underline; 210 | } 211 | .sidebar { 212 | border-right: 1px solid rgba(0,0,0,0.07); 213 | overflow-y: auto; 214 | padding: 40px 0 0; 215 | position: absolute; 216 | top: 0; 217 | bottom: 0; 218 | left: 0; 219 | -webkit-transition: -webkit-transform 250ms ease-out; 220 | transition: -webkit-transform 250ms ease-out; 221 | transition: transform 250ms ease-out; 222 | transition: transform 250ms ease-out, -webkit-transform 250ms ease-out; 223 | width: 300px; 224 | z-index: 20; 225 | } 226 | .sidebar > h1 { 227 | margin: 0 auto 1rem; 228 | font-size: 1.5rem; 229 | font-weight: 300; 230 | text-align: center; 231 | } 232 | .sidebar > h1 a { 233 | color: inherit; 234 | text-decoration: none; 235 | } 236 | .sidebar > h1 .app-nav { 237 | display: block; 238 | position: static; 239 | } 240 | .sidebar .sidebar-nav { 241 | line-height: 2em; 242 | padding-bottom: 40px; 243 | } 244 | .sidebar li.collapse .app-sub-sidebar { 245 | display: none; 246 | } 247 | .sidebar ul { 248 | margin: 0; 249 | padding: 0; 250 | } 251 | .sidebar li > p { 252 | font-weight: 700; 253 | margin: 0; 254 | } 255 | .sidebar ul, 256 | .sidebar ul li { 257 | list-style: none; 258 | } 259 | .sidebar ul li a { 260 | border-bottom: none; 261 | display: block; 262 | } 263 | .sidebar ul li ul { 264 | padding-left: 20px; 265 | } 266 | .sidebar::-webkit-scrollbar { 267 | width: 4px; 268 | } 269 | .sidebar::-webkit-scrollbar-thumb { 270 | background: transparent; 271 | 
border-radius: 4px; 272 | } 273 | .sidebar:hover::-webkit-scrollbar-thumb { 274 | background: rgba(136,136,136,0.4); 275 | } 276 | .sidebar:hover::-webkit-scrollbar-track { 277 | background: rgba(136,136,136,0.1); 278 | } 279 | .sidebar-toggle { 280 | background-color: transparent; 281 | background-color: rgba(255,255,255,0.8); 282 | border: 0; 283 | outline: none; 284 | padding: 10px; 285 | position: absolute; 286 | bottom: 0; 287 | left: 0; 288 | text-align: center; 289 | -webkit-transition: opacity 0.3s; 290 | transition: opacity 0.3s; 291 | width: 284px; 292 | z-index: 30; 293 | } 294 | .sidebar-toggle .sidebar-toggle-button:hover { 295 | opacity: 0.4; 296 | } 297 | .sidebar-toggle span { 298 | background-color: var(--theme-color, #979797); 299 | display: block; 300 | margin-bottom: 4px; 301 | width: 16px; 302 | height: 2px; 303 | } 304 | body.sticky .sidebar, 305 | body.sticky .sidebar-toggle { 306 | position: fixed; 307 | } 308 | .content { 309 | padding-top: 60px; 310 | position: absolute; 311 | top: 0; 312 | right: 0; 313 | bottom: 0; 314 | left: 300px; 315 | -webkit-transition: left 250ms ease; 316 | transition: left 250ms ease; 317 | } 318 | .markdown-section { 319 | margin: 0 auto; 320 | max-width: 1000px; 321 | padding: 30px 15px 40px 15px; 322 | position: relative; 323 | } 324 | .markdown-section > * { 325 | -webkit-box-sizing: border-box; 326 | box-sizing: border-box; 327 | font-size: inherit; 328 | } 329 | .markdown-section > :first-child { 330 | margin-top: 0 !important; 331 | } 332 | .markdown-section hr { 333 | border: none; 334 | border-bottom: 1px solid #eee; 335 | margin: 2em 0; 336 | } 337 | .markdown-section iframe { 338 | border: 1px solid #eee; 339 | } 340 | .markdown-section table { 341 | border-collapse: collapse; 342 | border-spacing: 0; 343 | display: block; 344 | margin-bottom: 1rem; 345 | overflow: auto; 346 | width: 100%; 347 | } 348 | .markdown-section th { 349 | border: 1px solid #ddd; 350 | font-weight: bold; 351 | padding: 6px 
13px; 352 | } 353 | .markdown-section td { 354 | border: 1px solid #ddd; 355 | padding: 6px 13px; 356 | } 357 | .markdown-section tr { 358 | border-top: 1px solid #ccc; 359 | } 360 | .markdown-section tr:nth-child(2n) { 361 | background-color: #f8f8f8; 362 | } 363 | .markdown-section p.tip { 364 | background-color: #f8f8f8; 365 | border-bottom-right-radius: 2px; 366 | border-left: 4px solid #f66; 367 | border-top-right-radius: 2px; 368 | margin: 2em 0; 369 | padding: 12px 24px 12px 30px; 370 | position: relative; 371 | } 372 | .markdown-section p.tip:before { 373 | background-color: #f66; 374 | border-radius: 100%; 375 | color: #fff; 376 | content: '!'; 377 | font-family: 'Dosis', 'Source Sans Pro', 'Helvetica Neue', Arial, sans-serif; 378 | font-size: 14px; 379 | font-weight: bold; 380 | left: -12px; 381 | line-height: 20px; 382 | position: absolute; 383 | height: 20px; 384 | width: 20px; 385 | text-align: center; 386 | top: 14px; 387 | } 388 | .markdown-section p.tip code { 389 | background-color: #efefef; 390 | } 391 | .markdown-section p.tip em { 392 | color: #34495e; 393 | } 394 | .markdown-section p.warn { 395 | background: rgba(66,185,131,0.1); 396 | border-radius: 2px; 397 | padding: 1rem; 398 | } 399 | body.close .sidebar { 400 | -webkit-transform: translateX(-300px); 401 | transform: translateX(-300px); 402 | } 403 | body.close .sidebar-toggle { 404 | width: auto; 405 | } 406 | body.close .content { 407 | left: 0; 408 | } 409 | @media print { 410 | .github-corner, 411 | .sidebar-toggle, 412 | .sidebar, 413 | .app-nav { 414 | display: none; 415 | } 416 | } 417 | @media screen and (max-width: 768px) { 418 | .github-corner, 419 | .sidebar-toggle, 420 | .sidebar { 421 | position: fixed; 422 | } 423 | .app-nav { 424 | margin-top: 16px; 425 | } 426 | .app-nav li ul { 427 | top: 30px; 428 | } 429 | main { 430 | height: auto; 431 | overflow-x: hidden; 432 | } 433 | .sidebar { 434 | left: -300px; 435 | -webkit-transition: -webkit-transform 250ms ease-out; 436 | 
transition: -webkit-transform 250ms ease-out; 437 | transition: transform 250ms ease-out; 438 | transition: transform 250ms ease-out, -webkit-transform 250ms ease-out; 439 | } 440 | .content { 441 | left: 0; 442 | max-width: 100vw; 443 | position: static; 444 | padding-top: 20px; 445 | -webkit-transition: -webkit-transform 250ms ease; 446 | transition: -webkit-transform 250ms ease; 447 | transition: transform 250ms ease; 448 | transition: transform 250ms ease, -webkit-transform 250ms ease; 449 | } 450 | .app-nav, 451 | .github-corner { 452 | -webkit-transition: -webkit-transform 250ms ease-out; 453 | transition: -webkit-transform 250ms ease-out; 454 | transition: transform 250ms ease-out; 455 | transition: transform 250ms ease-out, -webkit-transform 250ms ease-out; 456 | } 457 | .sidebar-toggle { 458 | background-color: transparent; 459 | width: auto; 460 | padding: 30px 30px 10px 10px; 461 | } 462 | body.close .sidebar { 463 | -webkit-transform: translateX(300px); 464 | transform: translateX(300px); 465 | } 466 | body.close .sidebar-toggle { 467 | background-color: rgba(255,255,255,0.8); 468 | -webkit-transition: 1s background-color; 469 | transition: 1s background-color; 470 | width: 284px; 471 | padding: 10px; 472 | } 473 | body.close .content { 474 | -webkit-transform: translateX(300px); 475 | transform: translateX(300px); 476 | } 477 | body.close .app-nav, 478 | body.close .github-corner { 479 | display: none; 480 | } 481 | .github-corner:hover .octo-arm { 482 | -webkit-animation: none; 483 | animation: none; 484 | } 485 | .github-corner .octo-arm { 486 | -webkit-animation: octocat-wave 560ms ease-in-out; 487 | animation: octocat-wave 560ms ease-in-out; 488 | } 489 | } 490 | @-webkit-keyframes octocat-wave { 491 | 0%, 100% { 492 | -webkit-transform: rotate(0); 493 | transform: rotate(0); 494 | } 495 | 20%, 60% { 496 | -webkit-transform: rotate(-25deg); 497 | transform: rotate(-25deg); 498 | } 499 | 40%, 80% { 500 | -webkit-transform: rotate(10deg); 501 | 
transform: rotate(10deg); 502 | } 503 | } 504 | @keyframes octocat-wave { 505 | 0%, 100% { 506 | -webkit-transform: rotate(0); 507 | transform: rotate(0); 508 | } 509 | 20%, 60% { 510 | -webkit-transform: rotate(-25deg); 511 | transform: rotate(-25deg); 512 | } 513 | 40%, 80% { 514 | -webkit-transform: rotate(10deg); 515 | transform: rotate(10deg); 516 | } 517 | } 518 | section.cover { 519 | -webkit-box-align: center; 520 | -ms-flex-align: center; 521 | align-items: center; 522 | background-position: center center; 523 | background-repeat: no-repeat; 524 | background-size: cover; 525 | height: 100vh; 526 | display: none; 527 | } 528 | section.cover.show { 529 | display: -webkit-box; 530 | display: -ms-flexbox; 531 | display: flex; 532 | } 533 | section.cover.has-mask .mask { 534 | background-color: #fff; 535 | opacity: 0.8; 536 | position: absolute; 537 | top: 0; 538 | height: 100%; 539 | width: 100%; 540 | } 541 | section.cover .cover-main { 542 | -webkit-box-flex: 1; 543 | -ms-flex: 1; 544 | flex: 1; 545 | margin: -20px 16px 0; 546 | text-align: center; 547 | z-index: 1; 548 | } 549 | section.cover a { 550 | color: inherit; 551 | text-decoration: none; 552 | } 553 | section.cover a:hover { 554 | text-decoration: none; 555 | } 556 | section.cover p { 557 | line-height: 1.5rem; 558 | margin: 1em 0; 559 | } 560 | section.cover h1 { 561 | color: inherit; 562 | font-size: 2.5rem; 563 | font-weight: 300; 564 | margin: 0.625rem 0 2.5rem; 565 | position: relative; 566 | text-align: center; 567 | } 568 | section.cover h1 a { 569 | display: block; 570 | } 571 | section.cover h1 small { 572 | bottom: -0.4375rem; 573 | font-size: 1rem; 574 | position: absolute; 575 | } 576 | section.cover blockquote { 577 | font-size: 1.5rem; 578 | text-align: center; 579 | } 580 | section.cover ul { 581 | line-height: 1.8; 582 | list-style-type: none; 583 | margin: 1em auto; 584 | max-width: 500px; 585 | padding: 0; 586 | } 587 | section.cover .cover-main > p:last-child a { 588 | 
border-color: var(--theme-color, #42b983); 589 | border-radius: 2rem; 590 | border-style: solid; 591 | border-width: 1px; 592 | -webkit-box-sizing: border-box; 593 | box-sizing: border-box; 594 | color: var(--theme-color, #42b983); 595 | display: inline-block; 596 | font-size: 1.05rem; 597 | letter-spacing: 0.1rem; 598 | margin: 0.5rem 1rem; 599 | padding: 0.75em 2rem; 600 | text-decoration: none; 601 | -webkit-transition: all 0.15s ease; 602 | transition: all 0.15s ease; 603 | } 604 | section.cover .cover-main > p:last-child a:last-child { 605 | background-color: var(--theme-color, #42b983); 606 | color: #fff; 607 | } 608 | section.cover .cover-main > p:last-child a:last-child:hover { 609 | color: inherit; 610 | opacity: 0.8; 611 | } 612 | section.cover .cover-main > p:last-child a:hover { 613 | color: inherit; 614 | } 615 | section.cover blockquote > p > a { 616 | border-bottom: 2px solid var(--theme-color, #42b983); 617 | -webkit-transition: color 0.3s; 618 | transition: color 0.3s; 619 | } 620 | section.cover blockquote > p > a:hover { 621 | color: var(--theme-color, #42b983); 622 | } 623 | body { 624 | background-color: #fff; 625 | } 626 | /* sidebar */ 627 | .sidebar { 628 | background-color: #fff; 629 | color: #364149; 630 | } 631 | .sidebar li { 632 | margin: 6px 0 6px 15px; 633 | } 634 | .sidebar ul li a { 635 | color: #505d6b; 636 | font-size: 14px; 637 | font-weight: normal; 638 | overflow: hidden; 639 | text-decoration: none; 640 | text-overflow: ellipsis; 641 | white-space: nowrap; 642 | } 643 | .sidebar ul li a:hover { 644 | text-decoration: underline; 645 | } 646 | .sidebar ul li ul { 647 | padding: 0; 648 | } 649 | .sidebar ul li.active > a { 650 | border-right: 2px solid; 651 | color: var(--theme-color, #44a8f2); 652 | font-weight: 600; 653 | } 654 | .app-sub-sidebar li::before { 655 | content: '-'; 656 | padding-right: 4px; 657 | float: left; 658 | } 659 | /* markdown content found on pages */ 660 | .markdown-section h1, 661 | .markdown-section 
h2, 662 | .markdown-section h3, 663 | .markdown-section h4 { 664 | color: #555; 665 | font-weight: normal; 666 | } 667 | 668 | .markdown-section strong{ 669 | font-style: italic; 670 | /* color: #485360; */ 671 | font-weight: bold; 672 | } 673 | 674 | .markdown-section a { 675 | color: var(--theme-color, #44a8f2); 676 | font-weight: 200; 677 | } 678 | .markdown-section h1 { 679 | font-size: 2rem; 680 | margin: 0 0 1rem; 681 | } 682 | .markdown-section h2 { 683 | font-size: 1.75rem; 684 | margin: 45px 0 0.8rem; 685 | } 686 | .markdown-section h3 { 687 | font-size: 1.5rem; 688 | margin: 40px 0 0.6rem; 689 | } 690 | .markdown-section h4 { 691 | font-size: 1.25rem; 692 | } 693 | .markdown-section h5 { 694 | font-size: 1rem; 695 | } 696 | .markdown-section h6 { 697 | color: #777; 698 | font-size: 1rem; 699 | } 700 | .markdown-section figure, 701 | .markdown-section p { 702 | margin: 1.2em 0; 703 | } 704 | .markdown-section p, 705 | .markdown-section ul, 706 | .markdown-section ol { 707 | line-height: 1.6rem; 708 | word-spacing: 0.05rem; 709 | } 710 | .markdown-section ul, 711 | .markdown-section ol { 712 | padding-left: 1.5rem; 713 | } 714 | .markdown-section blockquote { 715 | border-left: 4px solid var(--theme-color, #555); 716 | color: #858585; 717 | margin: 2em 0; 718 | padding-left: 20px; 719 | } 720 | .markdown-section blockquote p { 721 | font-weight: 600; 722 | margin-left: 0; 723 | } 724 | .markdown-section iframe { 725 | margin: 1em 0; 726 | } 727 | .markdown-section em { 728 | color: #7f8c8d; 729 | } 730 | .markdown-section code { 731 | background-color: #fff; 732 | border-radius: 2px; 733 | color: #44a8f2; 734 | font-family: 'Roboto Mono', Monaco, courier, monospace; 735 | font-size: 0.8rem; 736 | margin: 0 2px; 737 | padding: 3px 5px; 738 | white-space: pre-wrap; 739 | } 740 | .markdown-section pre { 741 | -moz-osx-font-smoothing: initial; 742 | -webkit-font-smoothing: initial; 743 | background-color: #f8f8f8; 744 | font-family: 'Roboto Mono', Monaco, 
courier, monospace; 745 | line-height: 1.5rem; 746 | margin: 1.2em 0; 747 | overflow: auto; 748 | padding: 0 1.4rem; 749 | position: relative; 750 | word-wrap: normal; 751 | } 752 | /* code highlight */ 753 | .token.comment, 754 | .token.prolog, 755 | .token.doctype, 756 | .token.cdata { 757 | color: #8e908c; 758 | } 759 | .token.namespace { 760 | opacity: 0.7; 761 | } 762 | .token.boolean, 763 | .token.number { 764 | color: #44a8f2; 765 | } 766 | .token.punctuation { 767 | color: #525252; 768 | } 769 | .token.property { 770 | color: #c08b30; 771 | } 772 | .token.tag { 773 | color: #2973b7; 774 | } 775 | .token.string { 776 | color: var(--theme-color, #42b983); 777 | } 778 | .token.selector { 779 | color: #6679cc; 780 | } 781 | .token.attr-name { 782 | color: #2973b7; 783 | } 784 | .token.entity, 785 | .token.url, 786 | .language-css .token.string, 787 | .style .token.string { 788 | color: #22a2c9; 789 | } 790 | .token.attr-value, 791 | .token.control, 792 | .token.directive, 793 | .token.unit { 794 | color: var(--theme-color, #42b983); 795 | } 796 | .token.function{ 797 | color: #2f9c0ab5 798 | } 799 | 800 | .token.keyword { 801 | color: #44a8f2; 802 | } 803 | .token.statement, 804 | .token.regex, 805 | .token.atrule { 806 | color: #22a2c9; 807 | } 808 | .token.placeholder, 809 | .token.variable { 810 | color: #3d8fd1; 811 | } 812 | .token.deleted { 813 | text-decoration: line-through; 814 | } 815 | .token.inserted { 816 | border-bottom: 1px dotted #202746; 817 | text-decoration: none; 818 | } 819 | .token.italic { 820 | font-style: italic; 821 | } 822 | .token.important, 823 | .token.bold { 824 | font-weight: bold; 825 | } 826 | .token.important { 827 | color: #44a8f2; 828 | } 829 | .token.entity { 830 | cursor: help; 831 | } 832 | .markdown-section pre > code { 833 | -moz-osx-font-smoothing: initial; 834 | -webkit-font-smoothing: initial; 835 | background-color: #f8f8f8; 836 | border-radius: 2px; 837 | color: #525252; 838 | display: block; 839 | font-family: 
'Roboto Mono', Monaco, courier, monospace; 840 | font-size: 0.8rem; 841 | line-height: inherit; 842 | margin: 0 2px; 843 | max-width: inherit; 844 | overflow: inherit; 845 | padding: 2.2em 5px; 846 | white-space: inherit; 847 | } 848 | .markdown-section code::after, 849 | .markdown-section code::before { 850 | letter-spacing: 0.05rem; 851 | } 852 | code .token { 853 | -moz-osx-font-smoothing: initial; 854 | -webkit-font-smoothing: initial; 855 | min-height: 1.5rem; 856 | } 857 | pre::after { 858 | color: #ccc; 859 | content: attr(data-lang); 860 | font-size: 0.6rem; 861 | font-weight: 600; 862 | height: 15px; 863 | line-height: 15px; 864 | padding: 5px 10px 0; 865 | position: absolute; 866 | right: 0; 867 | text-align: right; 868 | top: 0; 869 | } 870 | -------------------------------------------------------------------------------- /dev_guide/README.md: -------------------------------------------------------------------------------- 1 | RoboRTS是一个基于实时策略机器人研究的开源软件平台,目标是为基于学习(reinforcement learning,imitation learning或neuroevolution)的多机器人决策研究提供一个通用性的硬件和软件解决方案。 2 | 3 | 本文会从以下几个方面来介绍`RoboRTS`平台 4 | 5 | 1. 快速上手(为了让人们快速跑demo) 6 | - 硬件组成与搭建 7 | - 机械结构,硬件选型和链路 8 | - 安装和连接 9 | - 软件组成与准备 10 | - 嵌入式层STM32烧写代码 11 | - 软件安装和准备: 12 | - Nvidia Jetson TX2相关配置(适配Jetson安装配置) 13 | - ROS和依赖安装 14 | - udev配置,网络配置,开机服务配置 15 | - 仿真平台配置 16 | - Stage 17 | - Gazebo 18 | - 简单测试与调试 19 | - Demo 20 | 2. 开发者 SDK (为了让人们更好的调用功能和简要拓展) 21 | - 协议 22 | - 整体系统框架及实现功能 23 | - RoboRTS接口通用格式,各接口调用和测试实例 24 | 3. 系统算法介绍 (面向科普教程或算法知识性总结) 25 | - 状态估计 26 | - 视觉感知 27 | - 路径规划 28 | - 反馈控制 29 | - 行为决策 30 | 4. 系统设计思路 Modularity(为了让人们更好搭建和完善平台) 31 | - 机械设计 32 | - 三个平台系统性设计模式 33 | - STM32 34 | - RoboRTS 35 | - 仿真平台 × 36 | 5. 功能需求与未来规划 37 | 38 | > **注**: 39 | >1. 第二部分只写接口模块的 40 | - 整体格式,运行思路 41 | - 依赖的节点和输入信息 42 | - 输出信息 43 | >2. 
第三部分写具体实现算法的部分,主要是每个算法参数是什么,以及如何调试参数,最好附理论和推导 44 | 45 | 46 | -------------------------------------------------------------------------------- /dev_guide/code_style.md: -------------------------------------------------------------------------------- 1 | # 代码规范 2 | 3 | ## RoboRTS 4 | RoboRTS软件框架代码以C++为主,以C++14作为C++标准,使用了部分C++14和C++11中的特性,以替代部分在Boost库中的相关功能。 5 | 6 | 代码风格主要以[Google Code Style](https://google.github.io/styleguide/cppguide.html)为准,并有部分修改。 7 | 8 | 如有相关疑问请联系开发者。 9 | 10 | 11 | -------------------------------------------------------------------------------- /dev_guide/docs_preview.md: -------------------------------------------------------------------------------- 1 | # 文档编写指南 2 | 3 | ## 安装依赖 4 | 5 | ### NVM 6 | 7 | 安装NVM用于管理node.js。 8 | 9 | ``` 10 | curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash 11 | ``` 12 | 13 | 安装node.js 14 | 15 | ``` 16 | nvm install node # "node" is an alias for the latest version 17 | ``` 18 | 19 | 或安装指定版本node.js 20 | 21 | ``` 22 | nvm install 6.14.4 # or 10.10.0, 8.9.1, etc 23 | ``` 24 | 25 | 可以通过以下命令查询node.js可用版本 26 | 27 | ``` 28 | nvm ls-remote 29 | ``` 30 | 31 | 32 | 33 | ### docsify-cli 34 | 35 | 推荐全局安装docsify-cli用于本地预览。 36 | 37 | ```bash 38 | npm i docsify-cli -g 39 | ``` 40 | 41 | 参考链接: [docsify.js](https://docsify.js.org/#/) 42 | 43 | ## 开启本地预览 44 | 45 | 安装好依赖后,可以通过`docsify serve`命令来在浏览器中本地预览。 46 | 47 | 切换到教程目录,运行如下命令。 48 | 49 | ```bash 50 | docsify serve ./ --port 8000 51 | ``` 52 | 53 | 打开浏览器,输入`http://localhost:8000` 或 `http://127.0.0.1:8000`即可实时本地预览。 54 | 55 | 56 | 57 | ## 功能支持 58 | 59 | ### KaTeX 60 | 61 | 本文档默认开启[docsify-katex](https://github.com/upupming/docsify-katex)插件,支持利用KaTeX编写数学符号和数学公式。 62 | 63 | ### Flexible Alerts 64 | 65 | 本文档默认开启[docsify-plugin-flexible-alerts](https://github.com/zanfab/docsify-plugin-flexible-alerts),详细使用可以参考其说明文档。 -------------------------------------------------------------------------------- /dev_guide/pre_requisites.md: 
-------------------------------------------------------------------------------- 1 | # 基础知识 2 | ## 技能需求 3 | 如果你希望基于RoboRTS开发自己的机器人应用,你至少需要具备以下技能: 4 | 5 | - 有C++编程经验并熟悉CMake工具的使用 6 | - 有基于ROS的开发经验 7 | 8 | ## 参考资料 9 | 10 | 可以参考知乎专栏中的: 11 | 12 | [《从RoboMaster AI挑战赛到移动机器人系统的入坑指南》](https://zhuanlan.zhihu.com/p/44117460) 13 | 14 | 下面为开发必备的知识: 15 | 16 | ### C++ 17 | 18 | - C++ Primer 19 | - 如果你想更深入的学习C++可以参考 20 | [A Study Path for Game Programmer](https://github.com/miloyip/game-programmer)的Programming Languages小节 21 | 22 | ### CMake 23 | 24 | - [An Introduction to Modern CMake](https://cliutils.gitlab.io/modern-cmake/) 25 | 26 | ### ROS 27 | 28 | - [A Gentle Introduction To ROS](https://www.cse.sc.edu/~jokane/agitr/)([中文版](https://www.cse.sc.edu/~jokane/agitr/%E6%9C%BA%E5%99%A8%E4%BA%BA%E6%93%8D%E4%BD%9C%E7%B3%BB%E7%BB%9F%EF%BC%88ROS%EF%BC%89%E6%B5%85%E6%9E%90.pdf)) 29 | - [ROS Wiki](http://wiki.ros.org/) 30 | 31 | ### Robotics 32 | 33 | - Probabilistic Robotics 34 | - [Artificial Intelligence For Robotics](https://www.udacity.com/course/artificial-intelligence-for-robotics--cs373) 35 | -------------------------------------------------------------------------------- /en/_navbar.md: -------------------------------------------------------------------------------- 1 | * [中文](/roborts) 2 | * [English](/en/roborts) 3 | * [RoboMaster](https://www.robomaster.com/) 4 | -------------------------------------------------------------------------------- /en/_sidebar.md: -------------------------------------------------------------------------------- 1 | 2 | * Overview 3 | * [RoboMaster 2019](en/roborts.md) 4 | * [Hardware Specifications](en/hardware_specifications.md) 5 | * [Software Framework](en/software_framework.md) 6 | * [Related Resources](en/resources.md) 7 | * Quick Start 8 | * [Manifold 2 Setup Guide](en/quick_start/setup_on_manifold2.md) 9 | * [Quick Test](en/quick_start/quick_test.md) 10 | * Developer Guide 11 | * [Prerequisites](en/dev_guide/pre_requisites.md) 12 | * 
[Code Style](en/dev_guide/code_style.md) 13 | * Introduction Doc 14 | * [SDK](en/sdk_docs/architecture.md) 15 | -------------------------------------------------------------------------------- /en/dev_guide/README.md: -------------------------------------------------------------------------------- 1 | RoboRTS is an open source software platform for real-time strategy robotics research, whose goal is to provide a versatile hardware and software solution for learning-based multi-robot decision research (reinforcement learning, imitation learning, or neuroevolution). 2 | 3 | This article will introduce the `RoboRTS` platform from the following aspects: 4 | 5 | 1. Quick start (so that people can run the demo quickly) 6 | - Hardware composition and assembly 7 | - Mechanical structure, hardware selection, and links 8 | - Installation and connection 9 | - Software composition and preparation 10 | - Flashing code to the embedded-layer STM32 11 | - Software installation and preparation: 12 | - Nvidia Jetson TX2 related configuration (adapted to the Jetson installation setup) 13 | - ROS and dependency installation 14 | - udev configuration, network configuration, boot service configuration 15 | 16 | - Configuration of the simulation platform 17 | - Stage 18 | - Gazebo 19 | - Simple testing and debugging 20 | 21 | - Demo 22 | 2. Developer SDK (so that people can better invoke features and make simple extensions) 23 | - Protocol 24 | - Overall system framework and implemented functions 25 | - Common format of the RoboRTS interfaces, with call and test examples for each interface 26 | 3. Introduction to System Algorithms (as popular-science tutorials or summaries of algorithmic knowledge) 27 | - State estimation 28 | - Visual perception 29 | - Path planning 30 | - Feedback control 31 | - Behavior decision-making 32 | 4. 
System design ideas: modularity (so that people can better build on and improve the platform) 33 | - Mechanical design 34 | - Systematic design patterns for the three platforms 35 | - STM32 36 | - RoboRTS 37 | - Simulation platform × 38 | 5. Functional requirements and future planning 39 | 40 | > **Tips**: 41 | > 42 | > 1. The second part covers only the interface modules: 43 | > - Overall format and operating logic 44 | > - Dependent nodes and input information 45 | > - Output information 46 | > 2. The third part describes the concrete implementation of each algorithm: mainly what each algorithm's parameters are and how to tune them, preferably with theory and derivations. 47 | 48 | 49 | -------------------------------------------------------------------------------- /en/dev_guide/code_style.md: -------------------------------------------------------------------------------- 1 | # Code Style 2 | 3 | ## RoboRTS 4 | The RoboRTS software framework is written mainly in C++ against the C++14 standard, and uses some C++14 and C++11 features to replace the corresponding functionality from the Boost library. 5 | 6 | The code style is based on [Google Code Style](https://google.github.io/styleguide/cppguide.html) with some modifications. 7 | 8 | If you have questions, please contact the developers. 
9 | 10 | 11 | -------------------------------------------------------------------------------- /en/dev_guide/docs_preview.md: -------------------------------------------------------------------------------- 1 | # Document Writing Guide 2 | 3 | ## Installing Dependencies 4 | 5 | ### NVM 6 | 7 | Install NVM to manage node.js: 8 | 9 | ``` 10 | curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash 11 | ``` 12 | 13 | Install node.js: 14 | 15 | ``` 16 | nvm install node # "node" is an alias for the latest version 17 | ``` 18 | 19 | Or install a specific version of node.js: 20 | 21 | ``` 22 | nvm install 6.14.4 # or 10.10.0, 8.9.1, etc 23 | ``` 24 | 25 | You can list the available versions of node.js with the following command. 26 | 27 | ``` 28 | nvm ls-remote 29 | ``` 30 | 31 | 32 | 33 | ### docsify-cli 34 | 35 | It is recommended to install docsify-cli globally for local preview. 36 | 37 | ```bash 38 | npm i docsify-cli -g 39 | ``` 40 | 41 | Reference link: [docsify.js](https://docsify.js.org/#/) 42 | 43 | ## Open a Local Preview 44 | 45 | Once the dependencies are installed, the docs can be previewed locally in the browser via the `docsify serve` command. 46 | Switch to the tutorial directory and run the following command. 47 | 48 | ```bash 49 | docsify serve ./ --port 8000 50 | ``` 51 | 52 | Open a browser and visit `http://localhost:8000` or `http://127.0.0.1:8000` to browse the docs locally. 53 | 54 | 55 | 56 | ## Function Support 57 | 58 | ### KaTeX 59 | 60 | This site enables the [docsify-katex](https://github.com/upupming/docsify-katex) plug-in by default, so KaTeX can be used to write mathematical symbols and formulas. 61 | 62 | ### Flexible Alerts 63 | 64 | This site enables [docsify-plugin-flexible-alerts](https://github.com/zanfab/docsify-plugin-flexible-alerts) by default; refer to that plug-in's instructions for more details on its use. 
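For reference, the alert syntax used throughout this tutorial looks like the following; the Flexible Alerts plug-in renders each such blockquote as a styled alert box (`[!Note]`, `[!Tip]` and `[!Warning]` are the variants these docs use):

```markdown
> [!Note]
>
> Rendered as a note-style alert box.

> [!Warning]
>
> Rendered as a warning-style alert box.
```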
-------------------------------------------------------------------------------- /en/dev_guide/pre_requisites.md: -------------------------------------------------------------------------------- 1 | # Basic Knowledge 2 | ## Skill Requirement 3 | If you want to develop your own robot application based on RoboRTS, you need at least the following skills: 4 | 5 | - You must have experience in C++ programming and be familiar with the use of CMake tools. 6 | - You must have ROS-based development experience. 7 | 8 | ## Reference Material 9 | 10 | You can refer to relevant materials on Reddit: 11 | 12 | [An Introduction from RoboMaster AI Challenge to Mobile Robotics System](https://www.reddit.com/r/ROBOMASTER/comments/9m31qb/an_introduction_from_robomaster_ai_challenge_to/) 13 | 14 | The following is the necessary knowledge for development: 15 | 16 | ### C++ 17 | 18 | - C++ Primer 19 | - If you want to learn C++ deeply, you can refer to: 20 | [A Study Path for Game Programmer](https://github.com/miloyip/game-programmer) 21 | 22 | ### CMake 23 | 24 | - [An Introduction to Modern CMake](https://cliutils.gitlab.io/modern-cmake/) 25 | 26 | ### ROS 27 | 28 | - [A Gentle Introduction To ROS](https://www.cse.sc.edu/~jokane/agitr/) 29 | - [ROS Wiki](http://wiki.ros.org/) 30 | 31 | ### Robotics 32 | 33 | - Probabilistic Robotics 34 | - [Artificial Intelligence For Robotics](https://www.udacity.com/course/artificial-intelligence-for-robotics--cs373) 35 | -------------------------------------------------------------------------------- /en/hardware_specifications.md: -------------------------------------------------------------------------------- 1 | ## Structural Composition 2 | The whole platform has a modular design with quick disassembly, and each part can be separately programmed and tested. 3 | 4 |
5 | 6 |
7 | 8 | The platform is mainly composed of three parts. 9 | 10 | 1. chassis module 11 | 12 |
13 | 14 |
15 | 16 | - Use Mecanum wheels for omnidirectional movement 17 | 18 | - Powered by the RoboMaster [M3508 P19 Brushless DC Gear Motor](https://store.dji.com/product/rm-m3508-p19-brushless-dc-gear-motor) and RoboMaster [C620 ESC](https://store.dji.com/product/rm-c620-brushless-dc-motor-speed-controller)) 19 | 20 | - Use RoboMaster [Development Board](https://store.dji.com/product/rm-development-board-type-a?from=menu_products)[ Type A](https://store.dji.com/product/rm-development-board-type-a?from=menu_products) (STM32F427) as MCU 21 | 22 | 23 | 2. Gimbal module 24 | 25 |
26 | 27 |
28 | 29 | 30 | - Use 2-axis gimbal for two-DOF rotation movement 31 | - Provide the mechanism for supplying and launching 17mm TPU projectiles 32 | - Powered by RoboMaster [GM6020 Brushless Motor](https://store.dji.com/cn/product/rm-gm6020-brushless-dc-motor) (with the ESC) for gimbal movement 33 | - Powered by RoboMaster [M2006 P36 Brushless DC Gear Motor](https://store.dji.com/cn/product/rm-m2006-p36-brushless-motor) for projectile supply 34 | - Powered by [DJI Snail 2305 Racing Motor](https://store.dji.com/product/snail-racing-propulsion-system?from=menu_products) for projectile launching 35 | - Use RoboMaster [Development Board Type A](https://store.dji.com/product/rm-development-board-type-a?from=menu_products) (STM32F427) as MCU 36 | 37 | 38 | 3. Referee system module 39 | 40 |
41 | 42 |
 43 | 44 | 45 | - An electronic penalty system that integrates computation, communication, and control features into different submodules and is used for robotic competitions only. Developers can acquire information about the progress of the competition and the status of the robots from a specific software interface 46 | - The Referee System includes the onboard terminal installed on the robot, as well as the server and client software installed on the PC 47 | - The submodules installed on the robot consist of the **Armor Module**, **Main Control Module**, **Speed Monitor Module**, **RFID Interaction Module** and **Power Management Module** 48 | - For more information about the referee system, please refer to "the Referee System Specification Manual" under [Related Resources](en/resources). 49 | 50 | In addition, the **DT7 Remote Controller** and a smart LiPo 6s battery ([Matrice 100 TB47D Battery](https://store.dji.com/product/matrice-100-tb47d-battery?from=autocomplete&position=0) or [TB48D](https://store.dji.com/product/matrice-100-tb48d-battery)) with the related charger are included in the accessories of the robot platform. 51 | 52 | The platform accommodates an extensive variety of sensors and computing devices, customized to meet research needs and designed for easy extended development. It provides sensor installation holders compatible with different types of sensors, including an industrial mono-camera, lidar, UWB locating kit, depth camera and so on. The platform officially supports the DJI Manifold 2 as the onboard computing device, but it is also compatible with the Intel NUC and the Nvidia Jetson TX1, TX2 or Xavier with a suitable carrier board. 
53 | 54 | ## Hardware Parameters 55 | | Structure | | 56 | | :-------------------- | :-------------------- | 57 | | Overall size | 600 x 450 x 460 mm | 58 | | Weight (including battery) | 16.6 kg | 59 | | **Performance** | | 60 | | Maximum forward speed | 3 m/s | 61 | | Maximum pan speed | 2 m/s | 62 | | Gimbal pitch axis rotating angle | -20° ~ 20° | 63 | | Gimbal yaw axis rotating angle | -90° ~ 90° | 64 | | Maximum launching frequency | 10 projectiles per second | 65 | | Maximum launching speed | 25 m/s | 66 | | Projectile capacity | 200 projectiles | 67 | | **Battery** | | 68 | | Model | DJI TB47D / DJI TB48D | 69 | | Type | LiPo 6s | 70 | | Voltage | 22.8 V | 71 | | Battery capacity | 4500 mAh / 5700 mAh | 72 | | **Remote control** | | 73 | | Model | DJI DT7 | 74 | | Operating frequency | 2.4 GHz | 75 | | Charging port | Micro USB | 76 | | **Communication port** | | 77 | | Port type | Micro USB | 78 | | Communication mode | STM32???? | 79 | | Baud rate | 921600 | 80 | 81 | 82 | -------------------------------------------------------------------------------- /en/quick_start/quick_test.md: -------------------------------------------------------------------------------- 1 | # Quick Test 2 | 3 | > [!Warning] 4 | > 5 | >Before testing, refer to [Software Dependency Configuration](en/quick_start/setup_on_manifold2?id=software-dependency-configuration) and [RoboRTS Download and Compile](en/quick_start/setup_on_manifold2?id=roborts-download-and-compile). Make sure the software dependencies are installed and the roborts packages compile successfully according to those guidelines before testing. 6 | 7 | ### Test in Simulator 8 | 9 | 1. Launch the script for the Stage simulator, localization and motion planning modules 10 | ```bash 11 | roslaunch roborts_bringup roborts_stage.launch 12 | ``` 13 | 14 | 2. 
Run the behavior test node for simple decisions 15 | ```bash 16 | rosrun roborts_decision behavior_test_node 17 | ``` 18 | Enter a numeric command such as 1/2/3/4/5/6 to switch between different behaviors 19 | 20 | ### Test in Real World 21 | 22 | #### Step 1: Test Robot Driver 23 | 24 | 1. Launch the script for the robot driver module 25 | ```bash 26 | roslaunch roborts_bringup base.launch 27 | ``` 28 | Verify that you can get information from the MCU and control the robot through the command-line tools 29 | 30 | > [!Tip] 31 | > 32 | >The robot driver module communicates with the MCU through a virtual serial port, so the peripheral port mapping must be configured correctly. Please refer to [Peripheral Port Mapping](en/quick_start/setup_on_manifold2?id=peripheral-port-mapping) for more information. 33 | 34 | #### Step 2: Test Simple Decisions 35 | 36 | 1. Launch the script for the robot driver, static TF broadcaster, lidar, localization and motion planning modules 37 | ```bash 38 | roslaunch roborts_bringup roborts.launch 39 | ``` 40 | 41 | 2. Run the behavior test node for simple decisions 42 | ```bash 43 | rosrun roborts_decision behavior_test_node 44 | ``` 45 | Enter a numeric command such as 1/2/3/4/5/6 to switch between different behaviors 46 | 47 | -------------------------------------------------------------------------------- /en/quick_start/setup_on_manifold2.md: -------------------------------------------------------------------------------- 1 | This chapter focuses on the deployment and application of RoboRTS on Manifold 2. 2 | 3 | > [!Tip] 4 | > 5 | > Manifold 2 is a microcomputer created by DJI for developers. The Manifold 2-G series is equipped with the NVIDIA Jetson TX2 module; in this document, "Manifold 2" refers to the Manifold 2-G series by default. This document also applies to the Nvidia Jetson TX2 Native Development Kit. 
6 | 7 | ## Performance Configuration 8 | 9 | ### Turn on maximum performance 10 | 11 | On Manifold 2, you can use NVPModel to configure the number of active CPU cores and the maximum CPU and GPU frequencies. The default Manifold 2 mode enables only 4 `CPU` cores; to get maximum performance, you need to use the `nvpmodel` command to change the configuration. 12 | 13 | #### View and change the mode 14 | 15 | ```bash 16 | sudo nvpmodel -q --verbose # View current mode 17 | 18 | sudo nvpmodel -m [mode] # For an introduction to each mode, see the table below. 19 | sudo nvpmodel -m0 # Recommended configuration 20 | ``` 21 | The mode corresponding to each number is as follows: 22 | 23 | 24 | 25 | | MODE | MODE NAME | DENVER 2 | FREQUENCY | ARM A57 | FREQUENCY | GPU FREQUENCY | 26 | | :--: | :------------: | :------: | :-------: | :-----: | :-------: | :-----------: | 27 | | 0 | Max-N | 2 | 2.0 GHz | 4 | 2.0 GHz | 1.30 GHz | 28 | | 1 | Max-Q | 0 | | 4 | 1.2 GHz | 0.85 GHz | 29 | | 2 | Max-P Core-All | 2 | 1.4 GHz | 4 | 1.4 GHz | 1.12 GHz | 30 | | 3 | Max-P ARM | 0 | | 4 | 2.0 GHz | 1.12 GHz | 31 | | 4 | Max-P Denver | 2 | 2.0 GHz | 0 | | 1.12 GHz | 32 | 33 | #### Turn on the maximum clock frequency 34 | 35 | Manifold 2 ships with this script in the home directory. Run it to raise the clock frequencies and maximize performance. 36 | 37 | ```bash 38 | sudo ./jetson_clocks.sh 39 | ``` 40 | 41 | ### Network speed test and connection 42 | 43 | - Ethernet bandwidth and speed test on Manifold 2 44 | 45 | ``` bash 46 | sudo apt-get install iperf 47 | # start server 48 | sudo iperf -s 49 | # start client 50 | sudo iperf -c 192.168.1.xxx -i 5 51 | ``` 52 | 53 | - WiFi test on Manifold 2 54 | 55 | - Turn off wlan0 energy-saving mode 56 | 57 | ```bash 58 | iw dev wlan0 set power_save off # disable power save and reduce ping latency 59 | iw dev wlan0 get power_save # returns the current state 
60 | 61 | ``` 62 | 63 | - View the RSSI of the WiFi link 64 | 65 | ```bash 66 | watch -n 0.1 iw dev wlan0 link 67 | ``` 68 | 69 | > [!Tip] 70 | > 71 | > Refer to the [Script](en/sdk_docs/roborts_bringup?id=script) section of the `roborts_bringup` module documentation to enable maximum performance and turn off wlan0 power-saving mode via a boot service after downloading and compiling the roborts packages. 72 | 73 | 74 | ## Peripheral Port Mapping 75 | 76 | According to the hardware interface (serial port, USB or ACM), configure a udev file in /etc/udev/rules.d to bind the STM32 device's virtual serial port and the lidar to fixed device names: 77 | 78 | First connect the virtual serial port of the STM32 device. Use lsusb to view the Vendor and Product IDs, then create and configure the /etc/udev/rules.d/roborts.rules file. 79 | 80 | ```bash 81 | KERNEL=="ttyACM*", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", MODE:="0777", SYMLINK+="serial_sdk" 82 | 83 | ``` 84 | The same applies to the laser scanner. Then reload and restart the udev service; the rules may only take effect after the device is re-plugged. 85 | 86 | ```bash 87 | sudo service udev reload 88 | sudo service udev restart 89 | ``` 90 | 91 | Configuring multiple cameras of the same model is a bit more troublesome. Since their Vendor and Product IDs are identical, you should check the specific attributes of each camera. 
92 | 93 | ```bash 94 | udevadm info --attribute-walk --name=/dev/video0 95 | ``` 96 | 97 | Generally, you can use the serial number as an attribute to distinguish each camera, for example: 98 | 99 | ```bash 100 | SUBSYSTEM=="usb", ATTR{serial}=="68974689267119892", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera0" 101 | SUBSYSTEM=="usb", ATTR{serial}=="12345698798725654", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera1" 102 | ``` 103 | 104 | Cheap cameras may share the same serial number; in that case you can bind by the physical port of the connected hub (KERNEL or KERNELS matching), for example: 105 | 106 | ```bash 107 | SUBSYSTEM=="usb", KERNEL=="2-3", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera0" 108 | SUBSYSTEM=="usb", KERNEL=="2-4", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera1" 109 | ``` 110 | 111 | >[!Tip] 112 | > 113 | >You can refer to the [Script](en/sdk_docs/roborts_bringup?id=script) section of the `roborts_bringup` module documentation to execute the udev port mapping script after downloading and compiling the roborts packages. 114 | 115 | 116 | 117 | ## Software Dependency Configuration 118 | 119 | ### ROS (ros-kinetic-ros-base) 120 | 121 | Manifold 2 installs ROS Kinetic by default. On other platforms, you can install ROS by referring to the [Installation Guide](http://wiki.ros.org/kinetic/Installation/Ubuntu) 122 | 123 | > [!Note] 124 | > 125 | > Make sure `source /opt/ros/kinetic/setup.bash` is written into `.bashrc` (or the `setup.zsh` equivalent into `.zshrc`) so that the ROS-related environment variables are loaded properly. 
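The check in the note can be automated; a minimal sketch (assuming a bash login shell and the default Kinetic install path) that appends the source line to `~/.bashrc` exactly once:

```shell
# Append the ROS environment setup to ~/.bashrc only if it is not already there.
RC_FILE="$HOME/.bashrc"                      # use ~/.zshrc and setup.zsh for zsh
LINE='source /opt/ros/kinetic/setup.bash'
touch "$RC_FILE"                             # make sure the rc file exists
grep -qxF "$LINE" "$RC_FILE" || echo "$LINE" >> "$RC_FILE"
```

Running the snippet repeatedly is safe: the `grep -qxF` guard makes the append idempotent.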
126 | 127 | Install the third-party packages required by ROS, as well as other dependencies such as `SuiteSparse`, `Glog` and `Protobuf`: 128 | 129 | ```bash 130 | sudo apt-get install -y ros-kinetic-opencv3 \ 131 | ros-kinetic-cv-bridge \ 132 | ros-kinetic-image-transport \ 133 | ros-kinetic-stage-ros \ 134 | ros-kinetic-map-server \ 135 | ros-kinetic-laser-geometry \ 136 | ros-kinetic-interactive-markers \ 137 | ros-kinetic-tf \ 138 | ros-kinetic-pcl-* \ 139 | ros-kinetic-libg2o \ 140 | ros-kinetic-rplidar-ros \ 141 | ros-kinetic-rviz \ 142 | protobuf-compiler \ 143 | libprotobuf-dev \ 144 | libsuitesparse-dev \ 145 | libgoogle-glog-dev 146 | ``` 147 | 148 | ### Other recommended software 149 | 150 | - git 151 | - cmake 152 | - vim 153 | - terminator 154 | - htop 155 | 156 | ## RoboRTS Download and Compile 157 | 158 | 159 | ```bash 160 | # Create a workspace folder 161 | mkdir -p roborts_ws/src 162 | # Switch to the src directory 163 | cd roborts_ws/src 164 | # Download RoboRTS source code 165 | git clone https://github.com/RoboMaster/RoboRTS 166 | # Compile the source code 167 | cd .. 168 | catkin_make 169 | # Load environment variables 170 | source devel/setup.bash 171 | ``` 172 | 173 | > [!Note] 174 | > 175 | > If you use `zsh`, be careful to source `setup.zsh` instead of `setup.bash`. 
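The bash/zsh distinction can also be resolved automatically; a small illustrative sketch that picks the setup file matching the current shell (`$ZSH_VERSION` is only set when running under zsh):

```shell
# Select the workspace setup file that matches the running shell.
if [ -n "$ZSH_VERSION" ]; then
  SETUP_FILE="devel/setup.zsh"
else
  SETUP_FILE="devel/setup.bash"
fi
echo "Would source: $SETUP_FILE"
```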
176 | -------------------------------------------------------------------------------- /en/resources.md: -------------------------------------------------------------------------------- 1 | # Documents 2 | 3 | ICRA 2019 RoboMaster AI Challenge documents 4 | 5 | ## Rules Manual V1.0 6 | 7 | [ICRA 2019 RoboMaster AI Challenge Rules Manual V1.0.pdf](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20RoboMaster%20AI%20Challenge%20Rules%20Manual%20V1.0.pdf) 8 | 9 | ## AI Robot User Manual V1.0 10 | 11 | [ICRA 2019 DJI RoboMaster AI Challenge AI Robot User Manual V1.0.pdf](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20DJI%20RoboMaster%20AI%20Challenge%20AI%20Robot%20User%20Manual%20V1.0.pdf) 12 | 13 | 14 | ## Referee System Specification Manual V1.0 15 | 16 | [ICRA 2019 RoboMaster AI Challenge Referee System Specification Manual V1.0.pdf](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20RoboMaster%20AI%20Challenge%20Referee%20System%20Specification%20Manual%20V1.0.pdf) 17 | 18 | ## RoboMaster AI Robot Mechanism drawings 19 | 20 | ### STEP 21 | 22 | [Mechanism drawings STEP](https://rm-static.djicdn.com/documents/19806/4df8649b3596f1548056917303346609.STEP) 23 | 24 | ### SolidWorks 25 | 26 | (SolidWorks 2016 or higher version) 27 | 28 | [Mechanism drawings Solidworks](https://rm-static.djicdn.com/documents/19806/232fed8050cfe1548739880652461892.SLDPRT) 29 | 30 | ### Creo 31 | (Creo 3 or higher version) 32 | 33 | [Mechanism drawings Creo](https://rm-static.djicdn.com/documents/19806/a96a1cc07664b1548738962638883052.1) 34 | 35 | ## RoboMaster AI Robot Open Source Code Repository 36 | [RoboMaster AI Robot Embedded MCU(STM32)](https://github.com/RoboMaster/RoboRTS-Firmware) 37 | 38 | [RoboMaster AI Robot Onboard PC(ROS)](https://github.com/RoboMaster/RoboRTS) -------------------------------------------------------------------------------- /en/roborts.md: -------------------------------------------------------------------------------- 1 |

RoboMaster 2019 AI Robot Platform

2 | 3 | ![modules](https://rm-static.djicdn.com/documents/20758/d4e8eabc0a8161547546323338435301.jpg) 4 | **RoboMaster 2019 AI Robot Platform** is a universal mobile robot platform for the [**ICRA 2019 RoboMaster AI Challenge**](https://www.icra2019.org/competitions/dji-robomaster-ai-challenge)**.** It is open source not only in its mechanical design, but also in its software: a real-time embedded framework based on the STM32 series and the onboard RoboRTS framework for advanced robotic algorithms, fully supported in ROS, with community-driven as well as platform-customized code and examples. 
5 | ![](https://rm-static.djicdn.com/documents/20758/f42d65d85d97c1547553106539783606.png) 6 | 7 | ### Sensors, Controllers and Executors 8 | 9 | - The central module integrates the sensor modules (lidar, camera, IMU, etc.), the embedded control platform (executing real-time tasks such as closed-loop control, data acquisition and pre-processing) and the actuators (motors, etc.), and is responsible for sensing and control. The ROS package for the embedded SDK is [roborts_base](en/sdk_docs/roborts_base); for the camera it is [roborts_camera](en/sdk_docs/roborts_camera). The system also includes the ROS packages for the related sensor drivers. 10 | 11 | ### Perception 12 | 13 | Perception includes robot localization, map maintenance and representation, target detection and tracking, etc. 14 | 15 | - The localization module is responsible for robot localization. See [roborts_localization](en/sdk_docs/roborts_localization) for details. 16 | 17 | - The map module is responsible for robot map maintenance, currently using the ROS open source package [map_server](http://wiki.ros.org/map_server) 18 | 19 | - The costmap module is responsible for the representation of the costmap. It integrates the static map layer, the obstacle layer and the inflation layer, and is mainly used in the motion planning part. Please see roborts_costmap for details. In subsequent plans, this module will be upgraded to a feature map module that serves more than planning. 20 | 21 | - The detection module is responsible for target detection and tracking. For details, see [roborts_detection](en/sdk_docs/roborts_detection). Currently, this module couples enemy detection with the projectile controller because of the high frame-rate requirement; they will be decoupled later by moving the gimbal planner for target tracking into the gimbal_executor. 
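The layer composition described for the costmap module can be illustrated with a short sketch (Python for brevity; the real roborts_costmap is C++ and its API differs). Each layer writes a cost per grid cell and the master grid keeps the maximum:

```python
# Illustrative sketch of a layered costmap: the master grid takes, cell by cell,
# the maximum cost over the static map, obstacle and inflation layers.
FREE, LETHAL = 0, 254

def combine_layers(*layers):
    """Merge equally sized grid layers cell-by-cell with max()."""
    return [max(cells) for cells in zip(*layers)]

static_layer    = [FREE, FREE, LETHAL, FREE]   # from the prior map
obstacle_layer  = [FREE, LETHAL, FREE, FREE]   # from live lidar hits
inflation_layer = [50, 100, 100, 50]           # decaying cost around obstacles

master = combine_layers(static_layer, obstacle_layer, inflation_layer)
print(master)  # [50, 254, 254, 50]
```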
22 | 23 | ### Task Scheduling and Decision Making 24 | 25 | The task scheduling and decision making part includes the interfaces of the perception scheduler as input and the plan execution scheduler as output, plus the core decision-making framework. 26 | - The decision module is a robotic decision-making framework, with a Behavior Tree officially provided. The blackboard module schedules perception tasks to gather information from the various modules as well as the game information from the referee system. The behavior module integrates the various actions or behaviors of the discrete action space. For details, see [roborts_decision](en/sdk_docs/roborts_decision) 27 | 28 | - The executor module is a dependency of the behavior module; it contains delegation interfaces for the chassis and the gimbal at different abstraction levels (e.g. scheduling the chassis to execute the result of the motion planning). See roborts_decision/executor for details. 29 | 30 | ### Motion Planning 31 | 32 | The motion planning part provides the motion planning function modules, which are scheduled and delegated by the chassis_executor module in the decision part and implemented in roborts_planning. See roborts_planning for details. 33 | 34 | - The global_planner module is responsible for the global path planning of the robot. See [roborts_planning/global_planner](en/sdk_docs/roborts_planning_global_planner) for details. It depends on roborts_costmap 35 | 36 | - The local_planner module is responsible for the local trajectory planning of the robot. See [roborts_planning/local_planner](en/sdk_docs/roborts_planning_local_planner) for details. 
It depends on roborts_costmap 37 | 38 | 39 | ## ROS Package Introduction 40 | 41 | | Package | function | internal dependency | 42 | | :--: | :------------: | :------: | 43 | | roborts | Meta-package | - | 44 | | roborts_base | Embedded communication interface | roborts_msgs | 45 | | roborts_camera | camera driver | roborts_common | 46 | | roborts_common | common dependency | - | 47 | | roborts_decision | robot decision package | roborts_common
roborts_msgs
roborts_costmap | 48 | | roborts_detection | computer vision detection algorithm package | roborts_msgs
roborts_common
roborts_camera | 49 | | roborts_localization | robot localization algorithm package | - | 50 | | roborts_costmap | costmap-related package | roborts_common | 51 | | roborts_msgs | custom message definition package | - | 52 | | roborts_planning | motion planning algorithm package | roborts_common
roborts_msgs
roborts_costmap | 53 | | roborts_bringup | launch package | roborts_base
roborts_common
roborts_localization
roborts_costmap
roborts_msgs
roborts_planning
| 54 | | roborts_tracking | computer vision tracking algorithm package | roborts_msgs | 55 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_base.md: -------------------------------------------------------------------------------- 1 | # Robot Driver Module 2 | 3 | ## Module introduction 4 | 5 | The robot Driver Module is a bridge between the underlying embedded control board to the upper onboard computer,through the virtual serial port, the data type of bidirectional transmission is defined based on a common set of protocols, thereby completing the task of data mutual transmission. 6 | 7 | In computing device,`roborts_base `has a ROS interface which receives data sent by the underlying embedded control board and sends data to complete mode switching and motion control of the robot. 8 | 9 | `roborts_base`depends on the relevant message types defined in`roborts_msgs`. 10 | 11 | The module file directory is as follows 12 | ```bash 13 | roborts_base 14 | ├── CMakeLists.txt 15 | ├── cmake_module 16 | │   └── FindGlog.cmake 17 | ├── chassis #Chassis module ROS interface package 18 | │   ├── chassis.cpp 19 | │   └── chassis.h 20 | ├── gimbal #Gimbal module ROS interface package 21 | │   ├── gimbal.cpp 22 | │   └── gimbal.h 23 | ├── config 24 | │   └── roborts_base_parameter.yaml #Parameter configuration file 25 | ├── roborts_base_config.h #Parameter reading class 26 | ├── roborts_base_node.cpp #Core node Main function 27 | ├── ros_dep.h #Includes header files for all protocols corresponding to ROS messages 28 | ├── roborts_sdk 29 | │   ├── ... 
30 | └── package.xml 31 | ``` 32 | The original protocol SDK lives in the roborts_sdk directory, without any ROS dependencies, as follows 33 | ```bash 34 | ├── roborts_sdk 35 | │   ├── hardware #Hardware layer for raw data transmission 36 | │   │   ├── hardware_interface.h #Hardware layer base class 37 | │   │   ├── serial_device.cpp #Serial device implementation 38 | │   │   └── serial_device.h 39 | │   ├── protocol #Protocol layer to unpack and pack the message 40 | │   │   ├── protocol.cpp #Protocol layer 41 | │   │   ├── protocol_define.h #Protocol message type definition header file 42 | │   │   └── protocol.h 43 | │   ├── dispatch #The dispatch layer to dispatch messages 44 | │   │   ├── dispatch.h #Dispatch layer 45 | │   │   ├── execution.cpp #Dispatch layer execution 46 | │   │   ├── execution.h 47 | │   │   ├── handle.cpp #External interface of the three roborts_sdk layers 48 | │   │   └── handle.h 49 | │   ├── sdk.h #roborts_sdk external header 50 | │   ├── test 51 | │   │   └── sdk_test.cpp #Protocol test file 52 | │   └── utilities 53 | │   ├── circular_buffer.h #Ring buffer pool 54 | │   ├── crc.h #crc check file 55 | │   ├── log.h #Log file 56 | │   └── memory_pool.h #Memory pool 57 | ``` 58 | 59 | >[!Note] 60 | >For details of the protocol, please refer to the `RoboMaster AI Robot communication protocol document (TODO)`. 61 | 62 | 63 | In the module's core node `roborts_base_node`, once the objects of the required modules (such as the chassis and the gimbal) are created and initialized, the communication task runs normally. 64 | 65 | Its ROS node is shown as follows: 66 | 67 | ![](https://rm-static.djicdn.com/documents/20758/002d528eb36ad1550474043463957284.png) 68 | 69 | ### Chassis Module 70 | 71 | #### Input 72 | 73 | * /cmd_vel ([geometry_msgs/Twist]()) 74 | 75 | Chassis speed control. During the next control cycle, the chassis moves at the given constant speed. 
76 | 77 | * /cmd_vel_acc ([roborts_msgs/TwistAccel]()) 78 | 79 | Chassis speed and acceleration control. During the next control cycle, the chassis performs uniformly accelerated motion starting from the given initial speed. 80 | 81 | 82 | #### Output 83 | 84 | * /odom ([nav_msgs/Odometry]()) 85 | 86 | Chassis odometry information 87 | 88 | * /uwb ([geometry_msgs/PoseStamped]()) 89 | 90 | Pose information of the chassis in the UWB coordinate system 91 | 92 | * /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html)) 93 | 94 | From base_link->odom 95 | 96 | 97 | ### Gimbal Module 98 | 99 | #### Input 100 | 101 | * /cmd_gimbal_angle ([roborts_msgs/GimbalAngle]()) 102 | 103 | Gimbal angle control. Whether the control is absolute or relative is determined by the relative-angle flag 104 | 105 | * /cmd_gimbal_rate ([roborts_msgs/GimbalRate]()) 106 | 107 | [**deprecated**] Gimbal rate control 108 | 109 | * /set_gimbal_mode ([roborts_msgs/GimbalMode]()) 110 | 111 | Set the mode of the gimbal 112 | 113 | * /cmd_fric_wheel ([roborts_msgs/FricWhl]()) 114 | 115 | Open and close the friction wheel (it must be open before launch speed control) 116 | 117 | * /cmd_shoot ([roborts_msgs/ShootCmd]()) 118 | 119 | Control instructions for projectile launch (including launch mode, frequency and number) 120 | 121 | #### Output 122 | 123 | * /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html)) 124 | 125 | From base_link->gimbal 126 | 127 | ### Related Parameters 128 | 129 | * serial_path (`string`, default: "/dev/serial_sdk") 130 | 131 | Serial port path name. The default value "/dev/serial_sdk" is set by the udev rules 132 | 133 | ## Compile and Run 134 | 135 | ### Compile 136 | 137 | Compile in the ROS workspace 138 | 139 | ```shell 140 | catkin_make -j4 roborts_base_node 141 | ``` 142 | 143 | ### Run 144 | 145 | Execute the roborts_base_node 146 | 147 | ```shell 148 | rosrun roborts_base roborts_base_node 149 | ``` 150 | 151 | Or start 
the relevant launch file 152 | 153 | ```shell 154 | roslaunch roborts_bringup base.launch 155 | ``` 156 | 157 | 158 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_bringup.md: -------------------------------------------------------------------------------- 1 | # Bringup Module 2 | 3 | ## Module Introduction 4 | 5 | The bringup module mainly includes basic configuration files and startup scripts in the roborts_bringup Package. The module file directory is as follows. 6 | ```bash 7 | roborts_bringup 8 | ├── CMakeLists.txt 9 | ├── launch #Launch startup script 10 | │   ├── base.launch #Robot driver script 11 | │   ├── roborts.launch #Functional script in the actual scene 12 | │   ├── roborts_stage.launch #Functional script in the stage simulation 13 | │   ├── mapping.launch #Mapping script in the actual scene 14 | │   ├── mapping_stage.launch #Mapping script in the stage simulation 15 | │   ├── slam_gmapping.xml #gmapping node script 16 | │   └── static_tf.launch #Static coordinate transformation script 17 | ├── maps #Map profile 18 | │   ├── icra2018.pgm 19 | │   ├── icra2018.yaml 20 | │   ├── icra2019.pgm 21 | │   └── icra2019.yaml 22 | ├── package.xml 23 | ├── rviz #rviz configuration file 24 | │   ├── mapping.rviz 25 | │   ├── roborts.rviz 26 | │   └── teb_test.rviz 27 | ├── scripts 28 | │   ├── udev #udev port mapping script 29 | │   │   ├── create_udev_rules.sh 30 | │   │   ├── delete_udev_rules.sh 31 | │   │   └── roborts.rules 32 | │   └── upstart #startup script 33 | │   ├── create_upstart_service.sh 34 | │   ├── delete_upstart_service.sh 35 | │   ├── jetson_clocks.sh 36 | │   ├── max-performance.service 37 | │   ├── max_performance.sh 38 | │   ├── roborts.service 39 | │   └── roborts-start.sh 40 | └── worlds #stage configuration file 41 | ├── icra2018.world 42 | └── icra2019.world 43 | 44 | ``` 45 | 46 | ## Script 47 | 48 | Execute the script to add udev mapping rules: 49 | 50 | ```shell 51 | 
./create_udev_rules.sh 52 | ``` 53 | 54 | Execute the script to delete udev mapping rules: 55 | ```shell 56 | ./delete_udev_rules.sh 57 | ``` 58 | 59 | The udev rules are described in the `roborts.rules` file. You can add to or modify it flexibly according to the peripherals you need. 60 | 61 | On Manifold2, execute the script to add a startup service: 62 | 63 | ```shell 64 | ./create_upstart_service.sh 65 | ``` 66 | Execute the script to delete the startup service: 67 | 68 | ```shell 69 | ./delete_upstart_service.sh 70 | ``` 71 | 72 | The startup service includes: 73 | 74 | - Executing the jetson_clocks.sh script and the nvpmodel command to maximize performance. 75 | 76 | - Starting the roborts.service service, which executes the roborts-start.sh script to run the ROS launch script. 77 | 78 | Users can modify the script files and service files to customize the startup service according to their needs. 79 | 80 | ## Test and Run 81 | 82 | In the actual scenario, run the robot driver script: 83 | 84 | ```shell 85 | roslaunch roborts_bringup base.launch 86 | ``` 87 | 88 | In the actual scenario, run the script that starts the functional nodes other than the decision and scheduling module: 89 | 90 | ```shell 91 | roslaunch roborts_bringup roborts.launch 92 | ``` 93 | 94 | In the stage simulation environment, run the script that starts the functional nodes other than the decision and scheduling module: 95 | 96 | ```shell 97 | roslaunch roborts_bringup roborts_stage.launch 98 | ``` 99 | 100 | In the actual scenario, execute the test script for gmapping: 101 | 102 | ```shell 103 | roslaunch roborts_bringup mapping.launch 104 | ``` 105 | 106 | In the stage simulation environment, execute the test script for gmapping: 107 | 108 | ```shell 109 | roslaunch roborts_bringup mapping_stage.launch 110 | ``` 111 | 112 | > [!Note] 113 | > 114 | > Note that the launch scripts applied in the actual scenario do not start the rviz node for visualization by default. 
115 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_camera.md: -------------------------------------------------------------------------------- 1 | # Camera Driver Module 2 | 3 | ## Module Introduction 4 | 5 | The camera driver module encapsulates common camera drivers and publishes image data and camera parameters via ROS `image_transport`. 6 | 7 | The camera module is located in the roborts_camera Package and depends on the abstract factory pattern and parameter reading in the roborts_common Package. The module file directory is as follows: 8 | 9 | ```bash 10 | roborts_camera 11 | ├── CMakeLists.txt 12 | ├── package.xml 13 | ├── cmake_module 14 | │   └── FindProtoBuf.cmake 15 | ├── camera_base.h #Camera abstract class 16 | ├── camera_node.cpp #Camera core running node and Main function 17 | ├── camera_node.h 18 | ├── camera_param.cpp #Camera parameter reading 19 | ├── camera_param.h 20 | ├── config 21 | │   └── camera_param.prototxt #Camera parameter profile 22 | ├── proto 23 | │   ├── camera_param.pb.cc 24 | │   ├── camera_param.pb.h 25 | │   └── camera_param.proto #Camera parameter definition file 26 | ├── test 27 | │   └── image_capture.cpp #Camera test node for image capture 28 | └── uvc 29 | ├── CMakeLists.txt 30 | ├── uvc_driver.cpp #uvc camera 31 | └── uvc_driver.h 32 | ``` 33 | 34 | The camera node `roborts_camera_node` automatically dispatches the cameras and publishes image data by reading the configuration parameters of one or more cameras. 35 | 36 | ### Related Parameters 37 | The parameters are defined in `proto/camera_param.proto`. For the configuration of the parameters, see `config/camera_param.prototxt`, which accepts multiple camera configurations. 
Single camera parameters include: 38 | 39 | * camera_name (`string`) 40 | 41 | Camera name, used as the namespace for publishing camera messages 42 | 43 | * camera_type (`string`) 44 | 45 | Camera type, the key used to register and instantiate camera objects in the abstract factory pattern, e.g. "uvc" for a UVC camera 46 | 47 | * camera_path (`string`) 48 | 49 | Camera device path, used to open the camera port 50 | 51 | * camera_matrix (`double[]`) 52 | 53 | Camera intrinsic matrix, generally 3x3 54 | 55 | * camera_distortion (`double[]`) 56 | 57 | Camera distortion matrix 58 | 59 | * fps (`uint32`) 60 | 61 | Camera frame rate (in frames per second) 62 | 63 | * resolution 64 | 65 | * width (`uint32`) 66 | 67 | Resolution width (in pixels) 68 | 69 | * height (`uint32`) 70 | 71 | Resolution height (in pixels) 72 | 73 | * width_offset (`uint32`) 74 | 75 | Resolution width offset (in pixels), the crop offset applied when the hardware captures the image frame 76 | 77 | * height_offset (`uint32`) 78 | 79 | Resolution height offset (in pixels), the crop offset applied when the hardware captures the image frame 80 | 81 | * auto_exposure (`bool`) 82 | 83 | Auto exposure 84 | 85 | * exposure_value (`uint32`) 86 | 87 | Exposure value 88 | 89 | * exposure_time (`uint32`) 90 | 91 | Exposure time (in microseconds) 92 | 93 | * auto_white_balance (`bool`) 94 | 95 | Automatic white balance 96 | 97 | * white_balance_value (`uint32`) 98 | 99 | White balance value 100 | 101 | * auto_gain (`bool`) 102 | 103 | Auto gain 104 | 105 | * gain_value (`uint32`) 106 | 107 | Gain value 108 | 109 | * contrast (`uint32`) 110 | 111 | Contrast 112 | 113 | ### Output 114 | 115 | /camera_name/camera_info ([sensor_msgs/CameraInfo]()) 116 | 117 | Camera parameter information 118 | 119 | /camera_name/image_raw ([sensor_msgs/Image]()) 120 | 121 | Camera raw image data 122 | 123 | 124 | ## Compile and Run 125 | 126 | ### Compile 127 | 128 | Compile in the ROS workspace 129 | 130 | ```shell 131 | catkin_make -j4 
roborts_camera_node 132 | ``` 133 | 134 | ### Run 135 | 136 | Start the camera node 137 | 138 | ```shell 139 | rosrun roborts_camera roborts_camera_node 140 | ``` 141 | 142 | Open the rqt image viewer 143 | 144 | ```shell 145 | rqt_image_view 146 | ``` 147 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_decision.md: -------------------------------------------------------------------------------- 1 | # Task Scheduling and Decision Module 2 | 3 | ## Module Introduction 4 | 5 | The task scheduling and decision module provides scheduling modules for perceptual input, interfaces to the scheduling modules that execute the planning output, and the core decision framework. 6 | 7 | The module is located in `roborts_decision` and depends on the parameter-reading module in `roborts_common`, the cost map objects in `roborts_costmap` (optional), and the related message types in `roborts_msgs`. 8 | 9 | The module file directory is as follows: 10 | 11 | ```bash 12 | roborts_decision/ 13 | ├── behavior_test.cpp #Test node of the behavior examples 14 | ├── behavior_tree #Example of the decision framework: behavior tree 15 | │   ├── behavior_node.h #Definition of the behavior tree's nodes 16 | │   ├── behavior_state.h #Definition of the behavior tree's state 17 | │   └── behavior_tree.h #Definition of the behavior tree's execution 18 | ├── blackboard 19 | │   └── blackboard.h #Definition of the blackboard (the input of the decision framework) 20 | ├── CMakeLists.txt 21 | ├── cmake_module 22 | │   ├── FindEigen3.cmake 23 | │   └── FindProtoBuf.cmake 24 | ├── config 25 | │   └── decision.prototxt #Parameter configuration file of the behavior examples 26 | ├── example_behavior #Behavior examples (the output of the decision framework) 27 | │   ├── back_boot_area_behavior.h #Definition of the behavior of returning to the startup area 28 | │   ├── chase_behavior.h #Definition of the behavior of chasing the enemy 29 | │   ├── escape_behavior.h #Definition of executing 
escape behavior when observing the enemy 30 | │   ├── goal_behavior.h #Definition of the behavior of navigating to an assigned target 31 | │   ├── line_iterator.h 32 | │   ├── patrol_behavior.h #Definition of the fixed-route patrol behavior 33 | │   └── search_behavior.h #Definition of the behavior of searching the area where the enemy disappeared 34 | ├── executor #Scheduling of task executors (task delegation for the different modules) 35 | │   ├── chassis_executor.cpp 36 | │   ├── chassis_executor.h #Definition of chassis task scheduling 37 | │   ├── gimbal_executor.cpp 38 | │   └── gimbal_executor.h #Definition of gimbal task scheduling 39 | ├── package.xml 40 | └── proto 41 | ├── decision.pb.cc 42 | ├── decision.pb.h 43 | └── decision.proto #Parameter definition file of the behavior examples 44 | 45 | ``` 46 | There are two core parts: decision-making and task scheduling. 47 | 48 | ### Decision Modules 49 | 50 | The decision modules include several parts: 51 | 52 | - Decision framework 53 | 54 | The decision-making framework takes observed information as input and outputs actions, helping the robot make decisions. The current official example framework is a behavior tree; for more information refer to `roborts_decision/behavior_tree`. 55 | 56 | - Blackboard 57 | 58 | The blackboard is similar to the Blackboard concept in game design. As the observation input of the current decision system, it is used to dispatch a series of perception tasks and collect perceptual information. For details refer to `roborts_decision/blackboard/blackboard.h`; users can modify and extend the blackboard class according to the types of information they need to acquire. 59 | 60 | - Behavior 61 | 62 | After different levels of abstraction, specific robot behaviors can serve as actions in the current decision system. The framework provides a series of concrete behavior examples. 
More details can be found in `roborts_decision/example_behavior`; users can customize behaviors based on these samples. 63 | 64 | ### Task Scheduling Modules 65 | 66 | The task scheduling module on which behaviors depend is mainly responsible for delegating tasks to each module; it schedules the functional execution modules to complete specific tasks. 67 | 68 | For each scheduling module, the core interfaces are: 69 | 70 | - Task execution (specific task input) 71 | 72 | - Task state update (real-time task feedback) 73 | 74 | - Task cancellation (state reset and resource recycling after interruption) 75 | 76 | These three call interfaces are sufficient to schedule the different tasks. 77 | 78 | Task states include: 79 | 80 | - IDLE 81 | 82 | - RUNNING 83 | 84 | - SUCCESS 85 | 86 | - FAILURE 87 | 88 | According to the main parts of the robot, the module can be divided into: 89 | 90 | - the chassis scheduling module 91 | 92 | - the gimbal scheduling module 93 | 94 | #### Chassis Scheduling Modules 95 | 96 | The chassis scheduling module includes task scheduling interfaces with different levels of abstraction for the chassis. The operation block diagram is as follows: 97 | 98 | ![](https://rm-static.djicdn.com/documents/20758/ae091269db41b1547553446982826082.png) 99 | 100 | It includes three task modes: 101 | 102 | - Motion planning control 103 | 104 | Input a target goal ([geometry_msgs/PoseStamped]()) to delegate the path planning and trajectory planning tasks for chassis motion planning control 105 | 106 | - Speed control 107 | 108 | Input a target speed twist ([geometry_msgs/Twist]()) to make the robot chassis move directly at a constant speed. 109 | 110 | - Speed and acceleration control 111 | 112 | Input a target speed and acceleration twist_accel ([roborts_msgs/TwistAccel]()) to make the robot chassis directly perform uniformly accelerated motion starting from the given initial speed. 
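The speed-and-acceleration mode above is plain kinematics: over one control cycle the chassis undergoes uniformly accelerated motion from the commanded initial speed. A minimal sketch (the names are illustrative, not the actual `roborts_msgs/TwistAccel` field layout):

```python
# Sketch of the "speed and acceleration" control mode: one control cycle of
# uniformly accelerated 1D motion, starting from the given initial speed.

def uniform_accel_step(v0, accel, dt):
    """Return (displacement, final_speed) after one control cycle.

    v0    -- initial linear speed (m/s)
    accel -- commanded acceleration (m/s^2)
    dt    -- control cycle length (s)
    """
    displacement = v0 * dt + 0.5 * accel * dt ** 2
    final_speed = v0 + accel * dt
    return displacement, final_speed

# e.g. 1.0 m/s initial speed, 0.5 m/s^2 acceleration, 100 ms cycle
d, v = uniform_accel_step(v0=1.0, accel=0.5, dt=0.1)
```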
113 | 114 | ##### Examples of Navigation Task 115 | 116 | The navigation task of motion planning control is actually a complex task in which nodes of multiple modules cooperate with each other. The chassis scheduling module delegates the planning task to the global planning module and the local planning module, and finally outputs the speed and acceleration control values to the underlying main control board. The planning modules also rely on real-time updated odometry information, localization information and cost maps. 117 | 118 | The block diagram of the navigation system in the actual scene and the virtual environment is as follows: 119 | 120 | ![](https://rm-static.djicdn.com/documents/20758/1a8702f75a2361547553400516274462.png) 121 | 122 | #### Gimbal Scheduling Modules 123 | 124 | The gimbal scheduling module includes task scheduling interfaces with different levels of abstraction for the gimbal. The operation block diagram is as follows: 125 | 126 | ![](https://rm-static.djicdn.com/documents/20758/5a50d15ba49371547553470292508751.png) 127 | 128 | It includes two task modes: 129 | 130 | - Angle control 131 | 132 | Input a target angle ([roborts_msgs/GimbalAngle]()) to control the angle of the robot's gimbal directly 133 | 134 | - Rate control 135 | 136 | [**deprecated**] Input a target rate ([roborts_msgs/GimbalRate]()) to control the rate of the robot's gimbal directly 137 | 138 | 139 | 140 | 141 | ## Compile and Run 142 | 143 | ### Compile 144 | 145 | Compile in the ROS workspace 146 | 147 | ```shell 148 | catkin_make -j4 behavior_test_node 149 | ``` 150 | 151 | ### Run 152 | 153 | 154 | Test in the simulation environment 155 | 156 | ```shell 157 | roslaunch roborts_bringup roborts_stage.launch 158 | ``` 159 | 160 | Start the behavior test node 161 | 162 | ```shell 163 | rosrun roborts_decision behavior_test_node 164 | ``` 165 | 166 | Enter a digit command such as 1/2/3/4/5/6 to switch between the different behaviors 167 | 168 | 169 | 
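The executor interface described under Task Scheduling Modules (task execution, state update, cancellation, plus the four task states) can be sketched as a small state machine. The class and method names below are illustrative stand-ins, not the actual `roborts_decision` C++ API:

```python
from enum import Enum

class BehaviorState(Enum):
    """The four task states listed in the documentation above."""
    IDLE = 0
    RUNNING = 1
    SUCCESS = 2
    FAILURE = 3

class FakeExecutor:
    """Minimal stand-in for a chassis/gimbal executor:
    execute() starts a task, update() reports its state in real time,
    cancel() resets the state and recycles the goal after interruption."""

    def __init__(self):
        self.state = BehaviorState.IDLE
        self.goal = None

    def execute(self, goal):
        self.goal = goal
        self.state = BehaviorState.RUNNING

    def update(self, finished=False, ok=True):
        # Real executors poll module feedback; here the caller injects it.
        if self.state is BehaviorState.RUNNING and finished:
            self.state = BehaviorState.SUCCESS if ok else BehaviorState.FAILURE
        return self.state

    def cancel(self):
        self.goal = None
        self.state = BehaviorState.IDLE

ex = FakeExecutor()
ex.execute(goal=(2.0, 1.0, 0.0))   # e.g. a navigation goal (x, y, yaw)
state_while_running = ex.update()
state_after_finish = ex.update(finished=True)
ex.cancel()
```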
170 | 171 | 172 | 173 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_detection.md: -------------------------------------------------------------------------------- 1 | # Detection Module 2 | 3 | ## Module Introduction 4 | 5 | The detection module provides tools to detect robot armor in the ICRA RoboMaster 2019 AI Challenge, and it also includes a simple analysis of the projectile model (for more information, see the PDF document [projectile model analysis](https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/master/pdf/projectile_model.pdf)). 6 | 7 | This module is located in `roborts_detection` and depends on the abstract factory pattern and parameter loading in the `roborts_common` module. The module file directory is shown below. 8 | 9 | ```bash 10 | roborts_detection 11 | ├── package.xml 12 | ├── CMakeLists.txt 13 | ├── armor_detection # Algorithms for armor detection 14 | │   ├── CMakeLists.txt 15 | │   ├── config 16 | │   │   └── armor_detection.prototxt # Config file for armor detection parameters 17 | │   ├── armor_detection_algorithms.h # Header file for armor detection algorithms (all algorithms' header files should be included here) 18 | │   ├── armor_detection_base.h # Base class of the armor detection class 19 | │   ├── armor_detection_client.cpp # Client of actionlib for armor detection, for development usage 20 | │   ├── armor_detection_node.cpp # ROS node for the internal logic of armor detection 21 | │   ├── armor_detection_node.h # Header/entry file for the armor detection node 22 | │   ├── gimbal_control.cpp # Calculates the gimbal's pitch and yaw according to the projectile model 23 | │   ├── gimbal_control.h 24 | │   └── proto 25 | │   ├── armor_detection.pb.cc 26 | │   ├── armor_detection.pb.h 27 | │   └── armor_detection.proto # Structure description file for parameters used by the armor detection node 28 | │   ├── constraint_set # Armor detection algorithm, identifies armor using armor 
characteristics 29 | │   │   ├── CMakeLists.txt 30 | │   │   ├── config 31 | │   │   │   └── constraint_set.prototxt # Adjustable armor detection parameters 32 | │   │   ├── constraint_set.cpp 33 | │   │   ├── constraint_set.h 34 | │   │   └── proto 35 | │   │   ├── constraint_set.pb.cc 36 | │   │   ├── constraint_set.pb.h 37 | │   │   └── constraint_set.proto # Structure description file for parameters used by the constraint set 38 | ├── cmake_module 39 | │   ├── FindEigen3.cmake 40 | │   └── FindProtoBuf.cmake 41 | └── util 42 | ├── CMakeLists.txt 43 | └── cv_toolbox.h # Image data subscriber used by the detection node. It acts as a toolbox for common image processing functions. 44 | ``` 45 | 46 | The ROS node graph of armor_detection_node is as follows: 47 | 48 | ![](https://rm-static.djicdn.com/documents/20758/01dfe2ff9684a1547553225209707914.png) 49 | 50 | Inputs and outputs of the node are as follows: 51 | 52 | ### Input 53 | 54 | - /camera_name/image_raw ([sensor_msgs/Image](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/Image.html)) 55 | 56 | (required) Obtained by subscribing to `roborts_camera`, used for armor detection. 57 | 58 | - /camera_name/camera_info ([sensor_msgs/CameraInfo](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/CameraInfo.html)) 59 | 60 | (required) Obtained by subscribing to `roborts_camera`, used to solve PnP and get the 3D coordinates of the target. 61 | 62 | ### Output 63 | 64 | - /armor_detection_node_action/feedback (roborts_msgs/action/ArmorDetection) 65 | 66 | `Actionlib Server` responds in real time with location data of the detected armor; the interface is wrapped by Actionlib. 67 | 68 | - /armor_detection_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalStatusArray.html)) 69 | 70 | `Actionlib Server` responds in real time with the state of the detection node; the interface is wrapped by Actionlib. 
71 | 72 | 73 | - /cmd_gimbal_angle ([roborts_msgs/GimbalAngle]()) 74 | 75 | Publish gimbal control messages. 76 | 77 | ### Related Parameters 78 | 79 | For parameter definitions see `proto/armor_detection.proto`; for parameter config data see `config/armor_detection.prototxt`. 80 | 81 | - name(`string`) 82 | 83 | Name of the armor detection algorithm 84 | 85 | - selected_algorithm(`string`) 86 | 87 | Selected detection algorithm name 88 | 89 | - undetected_armor_delay(`uint32`) 90 | 91 | Counter variable that keeps track of how many times no armor is detected and the data of the previous frame is published 92 | 93 | - camera_name(`string`) 94 | 95 | Camera name, should be identical to camera_name in the config file under roborts_camera 96 | 97 | - camera_gimbal_transform(`CameraGimbalTransform`) # composite data 98 | 99 | Transformation matrix for camera and gimbal, including the following parameters 100 | 101 | - offset_x(`float`) 102 | 103 | Offset of camera and gimbal in the x direction 104 | 105 | - offset_y 106 | 107 | Offset of camera and gimbal in the y direction 108 | 109 | - offset_z 110 | 111 | Offset of camera and gimbal in the z direction 112 | 113 | - offset_pitch 114 | 115 | Offset of the pitch angle of camera and gimbal 116 | 117 | - offset_yaw 118 | 119 | Offset of the yaw angle of camera and gimbal 120 | 121 | - projectile_model_info(`ProjectileModelInfo`) # composite data 122 | 123 | Projectile initial parameters, used for model analysis, including the following parameters 124 | 125 | - init_v(`float`) 126 | 127 | Initial velocity of the projectile 128 | 129 | - init_k(`float`) 130 | 131 | Air friction constant, for more details see projectile_model.pdf 132 | 133 | ## Compile and Run 134 | 135 | ### Compile 136 | 137 | Compile inside a ROS workspace 138 | 139 | ```shell 140 | catkin_make -j4 armor_detection_client armor_detection_node 141 | ``` 142 | 143 | ### Run 144 | 145 | Run the armor detection node 146 | 147 | ```shell 148 | rosrun roborts_detection 
armor_detection_node 149 | ``` 150 | 151 | In the actual debugging process, you may need to run the armor_detection_client node 152 | 153 | ```shell 154 | rosrun roborts_detection armor_detection_client 155 | ``` 156 | 157 | 158 | 159 | 160 | 161 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_localization.md: -------------------------------------------------------------------------------- 1 | # Localization 2 | 3 | ## Module Introduction 4 | 5 | The localization node is a required dependency in the localization system. By analysing sensor data with specific algorithms, the node acquires the transformation between the robot and map coordinate systems, which indicates the pose and position of the robot. The default localization algorithm in `roborts_localization` is the AMCL algorithm. For more information about AMCL and its related parameters, see [AMCL](en/sdk_docs/roborts_localization?id=amcl). 6 | 7 | 8 | The data flow diagram is as follows: 9 | 10 | ![localization](https://rm-static.djicdn.com/documents/20758/0ba7ea27fc531547553277041255026.png) 11 | 12 | 13 | The file directory is as shown below: 14 | 15 | ```bash 16 | ├── CMakeLists.txt 17 | ├── cmake_module 18 | │ ├── FindEigen3.cmake 19 | │ └── FindGlog.cmake 20 | ├── config 21 | │ └── localization.yaml # parameter config file for localization, loaded in the launch file 22 | ├── localization_config.h # parameter structure class for localization 23 | ├── localization_math.cpp 24 | ├── localization_math.h # common math related functions, for internal use only 25 | ├── localization_node.cpp # main node and main function 26 | ├── localization_node.h 27 | ├── log.h # Glog wrapper 28 | ├── package.xml 29 | ├── types.h # Type definitions 30 | ├── amcl # AMCL algorithm directory 31 | │ ├── amcl_config.h # AMCL parameter config 32 | │ ├── amcl.cpp # AMCL main logic 33 | │ ├── amcl.h 34 | │ ├── CMakeLists.txt 35 | │ ├── config 36 | │ │ └── amcl.yaml # AMCL 
parameter config file, loaded in the launch file 37 | │ ├── map 38 | │ │ ├── amcl_map.cpp 39 | │ │ └── amcl_map.h # AMCL map related calculation 40 | │ ├── particle_filter # particle filter directory 41 | │ │ ├── particle_filter.cpp 42 | │ │ ├── particle_filter_gaussian_pdf.cpp 43 | │ │ ├── particle_filter_gaussian_pdf.h 44 | │ │ ├── particle_filter.h 45 | │ │ ├── particle_filter_kdtree.cpp 46 | │ │ ├── particle_filter_kdtree.h 47 | │ │ └── particle_filter_sample.h 48 | │ └── sensors # Odometer sensor model and lidar sensor model 49 | │ ├── sensor_laser.cpp 50 | │ ├── sensor_laser.h 51 | │ ├── sensor_odom.cpp 52 | │ └── sensor_odom.h 53 | ``` 54 | 55 | 56 | 57 | 58 | The localization node can be started independently from the command line: 59 | 60 | ```bash 61 | # compile roborts_localization 62 | catkin_make -DCMAKE_BUILD_TYPE=Release localization_node 63 | 64 | rosrun roborts_localization localization_node 65 | ``` 66 | 67 | 68 | Or run the node from a launch file; a minimal example that loads localization.yaml and amcl.yaml (as noted in the directory comments above) is: 69 | 70 | ```xml 71 | <launch> 72 | <node pkg="roborts_localization" type="localization_node" name="localization_node" output="screen"> 73 | <rosparam command="load" file="$(find roborts_localization)/config/localization.yaml" /> 74 | <rosparam command="load" file="$(find roborts_localization)/amcl/config/amcl.yaml" /> 75 | </node> 76 | </launch> 77 | ``` 78 | 79 | Test result in RViz: 80 | 81 | ![localization_rviz](https://rm-static.djicdn.com/documents/20758/f172f1f29aa2f1547553303273821911.png) 82 | 83 | 84 | 85 | ## Input 86 | 87 | * /map ([nav_msgs/OccupancyGrid](http://docs.ros.org/api/nav_msgs/html/msg/OccupancyGrid.html)) 88 | 89 | (required) Subscribe to static map data. 90 | 91 | * /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html)) 92 | 93 | (required) Obtain transformation between coordinate systems. (odom->base_link) 94 | 95 | * /scan ([sensor_msgs/LaserScan](http://docs.ros.org/api/sensor_msgs/html/msg/LaserScan.html)) 96 | 97 | (required) Subscribe to laser scan data. 98 | 99 | * /initialpose ([geometry_msgs/PoseWithCovarianceStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseWithCovarianceStamped.html)) 100 | 101 | (optional) Estimate the pose and initialize the particle filter with this mean and variance. Corresponds to 2D Pose Estimate in RViz. 
102 | 103 | * /uwb ([geometry_msgs/PoseStamped](http://docs.ros.org/melodic/api/geometry_msgs/html/msg/PoseStamped.html)) 104 | 105 | (optional) UWB data used to correct the global localization data. 106 | 107 | ## Output 108 | 109 | * /amcl_pose ([geometry_msgs/PoseWithCovarianceStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseWithCovarianceStamped.html)) 110 | 111 | Pose estimated by the AMCL algorithm. 112 | 113 | * /particlecloud ([geometry_msgs/PoseArray](http://docs.ros.org/api/geometry_msgs/html/msg/PoseArray.html)) 114 | 115 | The poses of the particles in the particle filter. 116 | 117 | * /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html)) 118 | 119 | Publish the transformation between the `odom` and `map` coordinate systems. 120 | 121 | 122 | > [!NOTE] 123 | > Before running `roborts_localization`, make sure `roborts_base` and the laser scan driver are correctly installed and running. 124 | 125 | 126 | 127 | ## Related Parameters 128 | 129 | * odom_frame_id (`string`, default: "odom") 130 | 131 | coordinate system of the odometry 132 | 133 | * base_frame_id (`string`, default: "base_link") 134 | 135 | coordinate system of the robot body 136 | 137 | * global_frame_id (`string`, default: "map") 138 | 139 | coordinate system of the map 140 | 141 | * laser_topic_name (`string`, default: "scan") 142 | 143 | name of the topic published by the laser scanner 144 | 145 | * map_topic_name (`string`, default: "map") 146 | 147 | name of the topic published by the map 148 | 149 | * init_pose_topic_name (`string`, default: "initialpose") 150 | 151 | name of the topic to which the initial estimated pose is sent 152 | 153 | * transform_tolerance (`double`, default: 0.1) 154 | 155 | tf publish time interval 156 | 157 | 158 | * initial_pose_x (`double`, default: 1) 159 | 160 | x position of the initially estimated pose 161 | 162 | * initial_pose_y (`double`, default: 1) 163 | 164 | y position of the initially estimated pose 165 | 166 | * initial_pose_a (`double`, default: 0) 167 
| 168 | yaw angle of the initially estimated pose 169 | 170 | * initial_cov_xx (`double`, default: 0.1) 171 | 172 | the xx covariance of the initially estimated pose 173 | 174 | * initial_cov_yy (`double`, default: 0.1) 175 | 176 | the yy covariance of the initially estimated pose 177 | 178 | * initial_cov_aa (`double`, default: 0.1) 179 | 180 | the aa covariance of the initially estimated pose 181 | 182 | * enable_uwb (`bool`, default: true) 183 | 184 | whether to enable UWB correction 185 | 186 | * uwb_frame_id (`string`, default: "uwb") 187 | 188 | UWB coordinate system 189 | 190 | * uwb_topic_name (`string`, default: "uwb") 191 | 192 | name of the topic published by UWB 193 | 194 | * use_sim_uwb (`bool`, default: false) 195 | 196 | whether to use fake UWB data generated by the stage simulator 197 | 198 | * uwb_correction_frequency (`int`, default: 20) 199 | 200 | frequency of UWB correction 201 | 202 | 203 | ## AMCL 204 | 205 | ### Algorithm Introduction 206 | 207 | ![autonomous mobile robot](https://rm-static.djicdn.com/documents/20758/6b0ea21b6d57a1547553332830529202.png) 208 | 209 | Adaptive Monte Carlo Localization (AMCL) is a set of localization algorithms for 2D robot motion. The main algorithmic principle stems from the combination and implementation of the **MCL**, **Augmented_MCL**, and **KLD_Sampling_MCL** algorithms described in Probabilistic Robotics, Chapter 8.3. Meanwhile, the Motion Model described in Chapter 5 and the Sensor Model described in Chapter 6 are used for motion prediction and particle weight update in AMCL. 210 | 211 | The integrated AMCL in `roborts_localization` comes with additional functions for random angles, random positions and angle initialization, in order to facilitate rapid deployment in the competition. 212 | 213 | 214 | The basic logic of the algorithm is shown as follows. 
215 | ![AMCL_flowchart](https://rm-static.djicdn.com/documents/20758/e75d3533fc03c1547553357220164263.png) 216 | 217 | 218 | 219 | ### Related Parameters 220 | 221 | * min_particles (int, default: 50) 222 | 223 | the minimum number of particles in the particle filter 224 | 225 | * max_particles (int, default: 2000) 226 | 227 | the maximum number of particles in the particle filter 228 | 229 | * use_global_localization (bool, default: false) 230 | 231 | whether to randomly initialize global localization on start up 232 | 233 | * random_heading (bool, default: false) 234 | 235 | whether to initialize random angle on start up 236 | 237 | * update_min_d (double, default: 0.1) 238 | 239 | the displacement threshold for filter update 240 | 241 | 242 | * update_min_a (double, default: 0.5) 243 | 244 | the rotation threshold for filter update 245 | 246 | 247 | * odom_model (ENUM, ODOM_MODEL_OMNI) 248 | 249 | robot motion model, currently only supports omnidirectional wheel odometer models 250 | 251 | * odom_alpha1 (double, default: 0.005) 252 | 253 | error parameter 1 for odometer model 254 | 255 | * odom_alpha2 (double, default: 0.005) 256 | 257 | error parameter 2 for odometer model 258 | 259 | * odom_alpha3 (double, default: 0.01) 260 | 261 | error parameter 3 for odometer model 262 | 263 | * odom_alpha4 (double, default: 0.005) 264 | 265 | error parameter 4 for odometer model 266 | 267 | * odom_alpha5 (double, default: 0.003) 268 | 269 | error parameter 5 for odometer model 270 | 271 | * laser_min_range (double, default: 0.15) 272 | 273 | the minimum effective distance of the laser radar 274 | 275 | 276 | * laser_max_range (double, default: 8.0) 277 | 278 | the maximum effective distance of the laser radar 279 | 280 | * laser_max_beams (int, default: 30) 281 | 282 | the maximum number of laser beams 283 | 284 | * laser_model (ENUM, default: LASER_MODEL_LIKELIHOOD_FIELD_PROB) 285 | 286 | laser radar rangefinder model. 
Currently only the improved likelihood field model is supported. 287 | 288 | * z_hit (double, default: 0.5) 289 | 290 | the $z_{hit}$ parameter in the likelihood field model 291 | 292 | * z_rand (double, default: 0.5) 293 | 294 | the $z_{rand}$ parameter in the likelihood field model 295 | 296 | * sigma_hit (double, default: 0.2) 297 | 298 | the $\sigma_{hit}$ parameter in the likelihood field model 299 | 300 | * laser_likelihood_max_dist (double, default: 2.0) 301 | 302 | the maximum distance between the ranging point and the obstacle in the likelihood field model 303 | 304 | * do_beamskip (bool, default: true) 305 | 306 | whether to ignore part of the laser beams in the Position Tracking phase, in order to avoid unpredictable errors such as moving objects 307 | 308 | * beam_skip_distance (double, default: 0.5) 309 | 310 | the distance at which obstacles detected by the laser beam are ignored 311 | 312 | * beam_skip_threshold (double, default: 0.3) 313 | 314 | the threshold for ignoring obstacles detected by the laser beam 315 | 316 | * beam_skip_error_threshold (double, default: 0.9) 317 | 318 | the error threshold for a beam to ignore/skip obstacles 319 | 320 | * resample_interval (int, default: 2) 321 | 322 | interval for resampling 323 | 324 | * recovery_alpha_slow (double, default: 0.001) 325 | 326 | $\alpha_{slow}$ parameter in **Augmented_MCL** 327 | 328 | * recovery_alpha_fast (double, default: 0.1) 329 | 330 | $\alpha_{fast}$ parameter in **Augmented_MCL** 331 | 332 | * kld_err (double, default: 0.05) 333 | 334 | $\epsilon$ parameter in **KLD_Sampling_MCL** 335 | 336 | * kld_z (double, default: 0.99) 337 | 338 | $(1-\delta)$ in **KLD_Sampling_MCL** 339 | 340 | * laser_filter_weight (double, default: 0.4) 341 | 342 | weight parameter used to filter out measured laser data that has lower weight 343 | 344 | * max_uwb_particles (int, default: 10) 345 | 346 | the maximum number of particles resampled with UWB as the mean in the resampling phase 347 | 348 | * uwb_cov_x (double, default: 
0.06) 349 | 350 | the x covariance of the Gaussian distribution with UWB as the mean in the resampling phase 351 | 352 | * uwb_cov_y (double, default: 0.06) 353 | 354 | the y covariance of the Gaussian distribution with UWB as the mean in the resampling phase 355 | 356 | * resample_uwb_factor (double, default: 4.0) 357 | 358 | resampling factor used to determine symmetric localization 359 | 360 | 361 | 362 | 363 | 364 | -------------------------------------------------------------------------------- /en/sdk_docs/roborts_planning_global_planner.md: -------------------------------------------------------------------------------- 1 | # Global Path Planning 2 | 3 | ## Module Introduction 4 | 5 | Global path planning (referred to as global planning) is the first step of motion planning in the navigation system. After a target location is set, it searches for a collision-free shortest path (a series of discrete coordinate points) on the perceived global cost map, and then passes this path as input to the local trajectory planning module, which controls the specific motion of the robot. 6 | 7 | 8 | 9 | The global path planning module is located in `roborts_planning`; the relevant actions and msgs are defined in `roborts_msgs`, and the abstract factory pattern and parameter loading are in `roborts_common`. The module file directory is as follows: 11 | ```bash 12 | └── global_planner/ 13 | ├── CMakeLists.txt 14 | ├── global_planner_algorithms.h #Contains header files for the specific algorithms 15 | ├── global_planner_base.h #The abstract class of the global planning algorithm 16 | ├── global_planner_test.cpp #The test node for global planning 17 | ├── global_planner_node.cpp #The planner executive node with the Main function 18 | ├── global_planner_node.h 19 | ├── a_star_planner #A* global planning algorithm 20 | │   └── ... 
    ├── config
    │   └── global_planner_config.prototxt   # Parameter configuration file
    └── proto
        ├── global_planner_config.pb.cc
        ├── global_planner_config.pb.h
        └── global_planner_config.proto      # Global planning parameter definition file
```

For the algorithm used for global path planning, refer to [A Star Algorithm](en/sdk_docs/roborts_planning_global_planner?id=a)

global_planner_node is the core planning node; its ROS node graph is as follows:

![](https://rm-static.djicdn.com/documents/20758/63b3d8db24ce71547553505842780076.png)

The inputs and outputs of the node are as follows:

### Input

* /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  (Required) The base_link->map transform listened to by a `TransformListener`, provided by the `roborts_localization` package of the RoboRTS framework.

* global_costmap object (`roborts_costmap/CostmapInterface`)

  Represents the global cost map (global_costmap) with the static layer and, optionally, the obstacle layer; it depends on the topics /tf, /map (static layer) and /scan (obstacle layer), and is provided by the `roborts_costmap` package.

* /global_planner_node_action/goal ([roborts_msgs/GlobalPlannerGoal]())

  (Required) The global planning goal sent from the `Actionlib Client` to the `Actionlib Server`. The interface is encapsulated by Actionlib; the underlying message type is [geometry_msgs/PoseStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseStamped.html).

* /global_planner_node_action/cancel ([actionlib_msgs/GoalID](http://docs.ros.org/api/actionlib_msgs/html/msg/GoalID.html))

  Sent from the `Actionlib Client` to the `Actionlib Server` to cancel the currently running global planning task. The interface is encapsulated by Actionlib.
### Output

* /global_planner_node_action/feedback ([roborts_msgs/GlobalPlannerFeedback]())

  Real-time feedback of the planned path from the `Actionlib Server`. The interface is encapsulated by Actionlib; the underlying message type is [nav_msgs/Path](http://docs.ros.org/api/nav_msgs/html/msg/Path.html).

* /global_planner_node_action/result ([roborts_msgs/GlobalPlannerResult]())

  The planning result from the `Actionlib Server`, used to judge whether the goal was reached or no feasible plan could be found. The interface is encapsulated by Actionlib.

* /global_planner_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/api/actionlib_msgs/html/msg/GoalStatusArray.html))

  Real-time feedback of the planning status from the `Actionlib Server`. The interface is encapsulated by Actionlib.

* /global_planner_node/path ([nav_msgs/Path](http://docs.ros.org/api/nav_msgs/html/msg/Path.html))

  Used to visualize the planned path.

* /global_costmap/global_costmap/costmap ([nav_msgs/OccupancyGrid](http://docs.ros.org/api/nav_msgs/html/msg/OccupancyGrid.html))

  Used to visualize the global cost map.

### Related Parameters

The parameters are defined in `proto/global_planner_config.proto` and configured in `config/global_planner_config.prototxt`.

* selected_algorithm (`string`, default: "a_star_planner")

  Name of the selected global planning algorithm.

* frequency (`int`, default: 3)

  Execution frequency of global planning.

* max_retries (`int`, default: 5)

  Maximum number of retries allowed after a global planning failure.

* goal_distance_tolerance (`double`, default: 0.15)

  Euclidean distance tolerance on the goal point for the global planner.
* goal_angle_tolerance (`double`, default: 0.1)

  Angle tolerance (in radians) on the goal point for the global planner.

### Executing Process

In the core planning node `global_planner_node_action`, the executing process is as follows:

- Initialization:
  - Initialize the Actionlib Server and create the visualization publishers.
  - Read the parameters.
  - Instantiate the tf listener and the global_costmap object.
  - Instantiate the concrete algorithm's planner object.

- After initialization finishes, the planning thread starts; the ROS callback queue runs callbacks in the main thread, and the Actionlib Server callback runs at the same time.
- Flow diagram of the planning thread:
  ![](https://rm-static.djicdn.com/documents/20758/057c895762b7d1547553536324774678.png)
- Flow diagram of the Actionlib Server callback thread:
  ![](https://rm-static.djicdn.com/documents/20758/b93803f9be2aa1547553557409469215.png)


## Compile and Run

### Compile

Compile in the ROS workspace:

```shell
catkin_make -j4 global_planner_node global_planner_test
```

### Run

Test in the simulation environment:

```shell
roslaunch roborts_bringup roborts_stage.launch
```

Start the global path planning test node:

```shell
rosrun roborts_planning global_planner_test
```

Start Rviz to display the required inputs and outputs of the module, as shown in the following figure:

![](https://rm-static.djicdn.com/documents/20758/844f7b73e9a091547553578138533412.png)

- Pose: the red arrow is the planning destination.
- LaserScan: the yellow point set is the LiDAR scan data.
- TF: the TF coordinate frames.
- Map: the gray-black part is the static map.
- Map: the differently colored parts (purple, red, blue, etc.) show the global cost map at different cost values.
- Path: the green path is the planned path.

## A Star
### Algorithm Introduction

The A* (A-Star) algorithm is one of the most effective methods for solving the shortest path problem in a static road network. Its cost function is:

$$ f(n) = g(n) + h(n) $$

where $f(n)$ is the evaluation function of node $n$ from the initial point to the target point, $g(n)$ is the actual cost from the initial node to node $n$ in the state space, and $h(n)$ is the estimated cost of the best path from $n$ to the target node. In the actual global path planning, the static road network is provided by the global cost map.

### Related Parameters
* inaccessible_cost (`int`, default: 253)

  The cost of an impassable map grid cell (lethal obstacle).

* heuristic_factor (`double`, default: 1.0)

  Heuristic factor in the heuristic function `h(n) = heuristic_factor * manhattan_distance`.

* goal_search_tolerance (`double`, default: 0.45)

  Search tolerance distance (in meters) for a new destination: if the destination you send is impassable, a new passable destination is searched for within this allowed distance range.

-------------------------------------------------------------------------------- /en/sdk_docs/roborts_planning_local_planner.md: --------------------------------------------------------------------------------

# Local Path Planner

## Module Introduction

The Local Path Planner module calculates an optimal collision-free velocity for the robot, based on the odometry information, the LiDAR sensing information, and the optimal path output by the Global Path Planner.
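As a rough illustration of this role, the toy sketch below derives a velocity command from the current pose and the next waypoint of a global path. It is not the module's actual optimization-based algorithm; the function name and the 2.0 m/s limit are invented for this example.

```python
import math

def velocity_toward_waypoint(pose, waypoint, max_vel=2.0):
    """Toy stand-in for one local-planner step: head toward the next
    waypoint of the global path at a speed clipped to max_vel.
    pose and waypoint are (x, y) tuples in the map frame."""
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:              # already at the waypoint
        return 0.0, 0.0
    speed = min(max_vel, dist)   # slow down when close to the waypoint
    return speed * dx / dist, speed * dy / dist

vx, vy = velocity_toward_waypoint((0.0, 0.0), (3.0, 4.0))
```

The real module instead solves an optimization problem (see the Timed Elastic Band section below), which lets it also account for obstacles and kinematic limits rather than only the path direction.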
The Local Path Planner module is located in the `roborts_planning` package; it depends on the actions and msgs in the `roborts_msgs` package, and on the abstract factory pattern and parameter reader in `roborts_common`.


The Local Path Planner has the following hierarchy:

```bash
local_planner
├── CMakeLists.txt
├── config
│   └── local_planner.prototxt            # Local Path Planner configuration file
├── include
│   └── local_planner
│       ├── proto
│       │   ├── local_planner.pb.cc
│       │   ├── local_planner.pb.h
│       │   └── local_planner.proto       # Local Path Planner parameter definition file
│       ├── data_base.h                   # Local Path Planner data structures, used to build g2o edges and vertices
│       ├── data_converter.h              # Data converter, used to convert data into the data_base structures
│       ├── distance_calculation.h        # Calculates distances between 2D points, lines, and geometric shapes
│       ├── line_iterator.h               # Computes the discrete grid coordinates of continuous line segments
│       ├── local_planner_algorithms.h    # Local Path Planner algorithms header (all algorithm headers should be included in this file)
│       ├── local_planner_base.h          # Base class for Local Path Planner algorithms
│       ├── local_planner_node.h          # Local Path Planner entry node
│       ├── local_visualization.h         # Visualization of the Local Path Planner result
│       ├── obstacle.h                    # 2D obstacle information
│       ├── odom_info.h                   # Odometry information
│       ├── optimal_base.h                # Base class for optimization algorithms
│       ├── robot_footprint_model.h       # Robot footprint (dimension) description
│       ├── robot_position_cost.h         # Cost calculation for a robot position, used to determine the feasibility of the path
│       └── utility_tool.h                # Utility functions
├── src
│   ├── local_planner_node.cpp            # ROS node file, handles the logical scheduling inside the Local Path Planner
│   ├── vel_converter.cpp                 # ROS node file, converts the
velocity into "cmd_vel" and publishes it during simulation
│   ├── teb_test.cpp                      # Timed Elastic Band test file
│   └── ...
└── timed_elastic_band
    ├── CMakeLists.txt
    ├── config
    │   └── timed_elastic_band.prototxt   # Timed Elastic Band configuration file
    ├── include
    │   └── timed_elastic_band
    │       ├── proto
    │       │   ├── timed_elastic_band.pb.cc
    │       │   ├── timed_elastic_band.pb.h
    │       │   └── timed_elastic_band.proto
    │       ├── teb_local_planner.h       # Timed Elastic Band implementation, inherits from local_planner_base.h
    │       ├── teb_optimal.h             # g2o optimization logic, inherits from optimal_base.h
    │       └── ...                       # other g2o edge and vertex files
    └── src
        ├── teb_local_planner.cpp         # Timed Elastic Band implementation
        ├── teb_optimal.cpp               # g2o optimization logic
        └── ...
```

For the algorithm used by the Local Path Planner, refer to [Timed Elastic Band](en/sdk_docs/roborts_planning_local_planner?id=timed-elastic-band)


local_planner_node is the core planning node; its ROS node graph is shown below:

![](https://rm-static.djicdn.com/documents/20758/708f059fc647c1547553602751260545.png)

The inputs and outputs of the node are as follows:

### Input

- /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  (Required) The map->odom transform listened to by a `TransformListener`, provided by the `roborts_localization` package of the RoboRTS framework.

- local_costmap object (`roborts_costmap/CostmapInterface`)

  (Required) Represents the local grid cost map with the obstacle layer (local_costmap); it depends on the topics /tf and /scan (obstacle layer), and is provided by the `roborts_costmap` package of the RoboRTS framework.
- /local_planner_node_action/goal (roborts_msgs/LocalPlannerGoal)

  (Required) The path generated by the Global Path Planner, sent from the `Actionlib Client` to the `Actionlib Server`. The interface is encapsulated by Actionlib; the underlying message type is [nav_msgs/Path](http://docs.ros.org/melodic/api/nav_msgs/html/msg/Path.html).

- /local_planner_node_action/cancel ([actionlib_msgs/GoalID](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalID.html))

  Sent from the `Actionlib Client` to the `Actionlib Server` to request cancellation of the local planning task. The interface is encapsulated by Actionlib.

### Output

- /local_planner_node_action/feedback (roborts_msgs/LocalPlannerFeedback)

  Real-time feedback of the Local Path Planner's error code and error message from the `Actionlib Server`. The interface is encapsulated by Actionlib.

- /local_planner_node_action/result (roborts_msgs/LocalPlannerResult)

  Not used.

- /local_planner_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalStatusArray.html))

  Real-time feedback of the planning status from the `Actionlib Server`. The interface is encapsulated by Actionlib.

- /local_planner_node/trajectory ([nav_msgs/Path](http://docs.ros.org/melodic/api/nav_msgs/html/msg/Path.html))

  Used to visualize the Local Path Planner's trajectory.

- /local_costmap/local_costmap/costmap ([nav_msgs/OccupancyGrid](http://docs.ros.org/melodic/api/nav_msgs/html/msg/OccupancyGrid.html))

  Used to visualize the local grid cost map.



## Compile and Run

### Compile

Before compiling, make sure all the dependencies are installed.
In your ROS workspace, run the following command:

```shell
catkin_make local_planner_node vel_converter teb_test
```

### Run

```shell
# Launch the Local Planner node
rosrun roborts_planning local_planner_node
# Launch vel_converter
rosrun roborts_planning vel_converter
# Launch the Timed Elastic Band test node
rosrun roborts_planning teb_test
```

Or use a launch file:

```xml
<launch>
  <node pkg="roborts_planning" type="local_planner_node" name="local_planner_node" />
  <node pkg="roborts_planning" type="vel_converter" name="vel_converter" />
  <node pkg="roborts_planning" type="teb_test" name="teb_test" />
</launch>
```

## Timed Elastic Band
### Algorithm Introduction

The algorithm builds the path execution time into its objective function and optimizes the actual motion trajectory to compute an optimal velocity, while keeping a safe distance from obstacles and complying with the robot's kinematics. More theoretical detail can be found in the paper:

* C. Rösmann, F. Hoffmann and T. Bertram: Integrated online trajectory planning and optimization in distinctive topologies, Robotics and Autonomous Systems, Vol. 88, 2017, pp. 142–153.
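For intuition about the "elastic band" part of the method, the toy sketch below (pure Python; not the g2o-based implementation in `teb_optimal.cpp`, and the update rule and gains are invented for illustration) relaxes a band of 2D waypoints: each interior point is pulled toward the midpoint of its neighbors while being pushed away from a point obstacle that comes closer than a minimum distance.

```python
import math

def relax_band(path, obstacle, min_dist=0.5, iters=100, k_spring=0.5, k_obs=0.2):
    """Toy elastic-band smoothing on 2D waypoints. Interior points move
    toward their neighbors' midpoint (spring pull) and away from the
    obstacle when closer than min_dist; endpoints stay fixed."""
    pts = [list(p) for p in path]
    for _ in range(iters):
        for i in range(1, len(pts) - 1):
            # spring force toward the midpoint of the two neighbors
            mx = (pts[i-1][0] + pts[i+1][0]) / 2 - pts[i][0]
            my = (pts[i-1][1] + pts[i+1][1]) / 2 - pts[i][1]
            # repulsive force when too close to the obstacle
            ox, oy = pts[i][0] - obstacle[0], pts[i][1] - obstacle[1]
            d = math.hypot(ox, oy) or 1e-9
            rep = k_obs * max(0.0, min_dist - d) / d
            pts[i][0] += k_spring * mx + rep * ox
            pts[i][1] += k_spring * my + rep * oy
    return pts

band = relax_band([(0, 0), (1, 1), (2, -1), (3, 0)], obstacle=(1.5, 0.0))
```

What makes the real planner "timed" is that it additionally optimizes the time intervals between trajectory configurations and enforces the kinematic and dynamic constraints listed in the parameters below.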
### Related Parameters

#### Path Parameters

* teb_autosize

  Whether to automatically resize the trajectory (recompute the number of path points)

* dt_ref

  Reference (initial) time interval between trajectory configurations

* dt_hysteresis

  Hysteresis (compensation) applied to the time interval

* global_plan_overwrite_orientation

  Whether to overwrite the orientations of the global plan's poses

* allow_init_with_backwards_motion

  Whether to allow initializing with backward motion

* global_plan_viapoint_sep

  Separation used to select the required via-points from the global plan

* via_points_ordered

  Whether to visit the via-points in their given order

* max_global_plan_lookahead_dist

  Look-ahead distance on the global plan for local planning

* exact_arc_length

  Whether to use the exact arc length between two configurations

* force_reinit_new_goal_dist

  Distance beyond which a new goal forces reinitialization of the local plan

* feasibility_check_no_poses

  Number of poses along the path checked for feasibility

* publish_feedback

  Not used

* min_samples

  Minimum number of samples on the path

* max_samples

  Maximum number of samples on the path


#### Kinematics Parameters


* max_vel_x

  Maximum forward velocity

* max_vel_x_backwards

  Maximum backward velocity

* max_vel_y

  Maximum lateral (translation) velocity

* max_vel_theta

  Maximum angular velocity

* acc_lim_x

  Maximum forward/backward acceleration

* acc_lim_y

  Maximum lateral (translation) acceleration

* acc_lim_theta

  Maximum angular acceleration

* min_turning_radius

  Minimum turning radius

* wheelbase

  Not used

* cmd_angle_instead_rotvel

  Not used

#### Tolerance Parameters

* xy_goal_tolerance
  Goal distance tolerance

* yaw_goal_tolerance

  Goal angular tolerance

* free_goal_vel

  Whether to allow a free (unconstrained) goal velocity


#### Obstacle Parameters

* min_obstacle_dist

  Minimum distance to obstacles

* inflation_dist

  Obstacle inflation distance

* include_costmap_obstacles

  Not used; whether to include costmap obstacles

* costmap_obstacles_behind_robot_dist

  Distance threshold for considering costmap obstacles behind the robot

* obstacle_poses_affected

  Number of waypoints affected by an obstacle; only used when legacy_obstacle_association is true

* legacy_obstacle_association

  Selects whether the optimization associates, for each obstacle, the closest waypoint, or, for each waypoint, the closest obstacle

* obstacle_association_cutoff_factor

  Factor beyond which obstacles are not considered; only used when legacy_obstacle_association is false

* obstacle_association_force_inclusion_factor

  Factor within which obstacles are forcibly considered; only used when legacy_obstacle_association is false

#### Optimization Parameters

* no_inner_iterations

  Number of solver iterations in each path adjustment

* no_outer_iterations

  Number of automatic path adjustments

* optimization_activate

  Whether to activate optimization

* optimization_verbose

  Whether to print debug output for the optimization

* penalty_epsilon

  Compensation margin added to the constrained cost functions

* weight_max_vel_x

  Maximum x velocity weight

* weight_max_vel_y

  Maximum y velocity weight

* weight_max_vel_theta

  Maximum angular velocity weight

* weight_acc_lim_x

  Maximum x acceleration weight

* 
weight_acc_lim_y

  Maximum y acceleration weight

* weight_acc_lim_theta

  Maximum angular acceleration weight

* weight_kinematics_nh

  Non-holonomic kinematics constraint weight

* weight_kinematics_forward_drive

  Forward drive weight

* weight_kinematics_turning_radius

  Minimum turning radius weight (car-like model)

* weight_optimaltime

  Trajectory execution time weight

* weight_obstacle

  Minimum obstacle distance weight

* weight_inflation

  Inflation penalty weight

* weight_dynamic_obstacle

  Dynamic obstacle weight

* weight_viapoint

  Minimum via-point distance weight

* weight_adapt_factor

  Weight adaptation factor; currently only used for obstacle weights

* weight_prefer_rotdir

  Preferred rotation direction weight

-------------------------------------------------------------------------------- /en/software_framework.md: --------------------------------------------------------------------------------

# Software Architecture
The entire software system decouples the driver, function, and algorithm modules to enhance the portability of the platform and improve the efficiency of collaborative development by developers in different fields.

The entire software architecture is shown below:

![system_diagram](https://rm-static.djicdn.com/documents/20758/40aac2fa9e6b81547552996504129513.png)

Based on the platform, developers can develop in a bottom-up fashion to control the robot to complete different tasks.
The whole framework is divided into two parts:


As the low-level computing platform, the STM32 microcontroller runs an RTOS (Real-Time Operating System) with limited computing power to perform various real-time tasks:

- Closed-loop control tasks with feedback based on encoders or the IMU

- Acquisition, pre-processing and forwarding of sensor information for pose estimation

- Forwarding of competition-related information, such as the robot's HP value

On-board computing devices (such as the Manifold 2) act as the upper-level computing platform, responsible for executing computationally heavy algorithms:

- Environment perception, including real-time localization, mapping, target detection, identification and tracking

- Motion planning, including global planning (path planning) and local planning (trajectory planning)

- An observation-action-based decision model interface that adapts to various decision frameworks, such as FSM (Finite State Machine), Behavior Tree and other learning-based decision frameworks

The two computing platforms communicate over a serial port using a specified open-source protocol. The platform also provides a ROS-based API, which can be used for secondary development in ROS with C++ or Python. Based on this interface, developers can control the movement of the chassis and the gimbal, control the launching mechanism to fire projectiles by changing the firing frequency, projectile velocity and projectile quantity, and receive sensor information and game data in real time. In more advanced applications, developers can define their own protocols based on the related documentation in [roborts_base](en/sdk_docs/roborts_base) to extend the capabilities of the robot.
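As a purely hypothetical sketch of what a frame in such a serial protocol can look like, the snippet below packs and validates a frame with a start-of-frame byte, payload length, command id, payload, and a one-byte additive checksum. All field choices here (SOF value, layout, command id, checksum) are invented for illustration and are not the actual RoboRTS protocol; see the roborts_base documentation for the real frame format.

```python
import struct

SOF = 0xA5  # hypothetical start-of-frame byte

def pack_frame(cmd_id, payload):
    """Pack a hypothetical frame: SOF (1 B), payload length (2 B),
    command id (2 B), payload bytes, additive checksum (1 B)."""
    header = struct.pack('<BHH', SOF, len(payload), cmd_id)
    body = header + payload
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def unpack_frame(frame):
    """Validate the checksum and SOF, then recover (cmd_id, payload)."""
    body, checksum = frame[:-1], frame[-1]
    if sum(body) & 0xFF != checksum:
        raise ValueError('checksum mismatch')
    sof, length, cmd_id = struct.unpack_from('<BHH', body)
    if sof != SOF:
        raise ValueError('bad start of frame')
    return cmd_id, body[5:5 + length]

# hypothetical command carrying two float32 values (e.g. chassis vx, vy)
frame = pack_frame(0x0302, struct.pack('<ff', 1.0, 0.5))
```

A production protocol would typically use a proper CRC instead of an additive checksum, plus a sequence number so that dropped or reordered frames can be detected.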
# Applications

The original intention of RoboRTS and the RoboMaster mobile robot platform is to encourage more developers to participate in research on intelligent robot decision-making under different scenarios and rules. The first step is to implement robot perception and planning algorithms on this hardware platform, so that the observation space and action space can be abstracted and better decision models can be constructed.

Our primary scenario is decision-making in robot competitions, as shown below:

Pursuit and search:

![airobot1](https://rm-static.djicdn.com/documents/20758/2c66c923154ae1547553014851539095.gif)

Patrol:

![airobot2](https://rm-static.djicdn.com/documents/20758/299a5a10c187a1547553038856640888.gif)

-------------------------------------------------------------------------------- /hardware_specifications.md: --------------------------------------------------------------------------------

## Structural Components
The RoboMaster robot platform is based on a modular, quick-release design; each module can be programmed and debugged independently.
The whole platform consists of three main parts:

1. Chassis module
- Uses Mecanum wheels, supporting omnidirectional movement

- The propulsion system consists of the RoboMaster [M3508 P19 Brushless DC Gear Motor](https://store.dji.com/product/rm-m3508-p19-brushless-dc-gear-motor)
  and the RoboMaster [C620 Brushless DC Motor Speed Controller](https://store.dji.com/product/rm-c620-brushless-dc-motor-speed-controller)
- Uses the RoboMaster [Development Board Type A](https://store.dji.com/product/rm-development-board-type-a?from=menu_products) (STM32F427) as the main control board


2. Gimbal module
- Uses a two-axis gimbal supporting rotation with 2 degrees of freedom

- Provides a feeding and launching mechanism for 17 mm TPU projectiles

- The gimbal propulsion system consists of RoboMaster [GM6020 Brushless Motors](https://store.dji.com/cn/product/rm-gm6020-brushless-dc-motor) (ESC included)

- The feeding propulsion system consists of an [M2006 P36 Brushless DC Gear Motor](https://store.dji.com/cn/product/rm-m2006-p36-brushless-motor)

- The launching propulsion system consists of [DJI Snail 2305 Racing Motors](https://store.dji.com/product/snail-racing-propulsion-system?from=menu_products)
- Uses the RoboMaster [Development Board Type A](https://store.dji.com/product/rm-development-board-type-a?from=menu_products) (STM32F427) as the main control board

3. Referee system module
- The referee system is an electronic judging system for robot competitions that integrates computing, communication and control. Through the referee system, developers can obtain match progress and robot status information from dedicated software interfaces.

- The referee system consists of the on-board units mounted on the robots, and the server and client software running on PCs.

- The on-board units include the **main control module**, **power management module**, **armor modules**, **speed monitor module**, **field interaction module**, etc.

- For more information about the referee system, refer to the Referee System Specification Manual in [Resources](/resources).

In addition, the platform includes a **DT7 remote controller** and an intelligent LiPo battery ([Matrice 100 TB47D battery](https://store.dji.com/product/matrice-100-tb47d-battery?from=autocomplete&position=0) or [TB48D](https://store.dji.com/product/matrice-100-tb48d-battery)) with its charger. For more details about the robot platform, refer to the AI Robot User Manual in [Related Documents](/documents).

The platform can accommodate many types of sensors and computing devices, meeting researchers' needs for customized extension and development. It provides sensor mounting rack interfaces and is compatible with various sensors, including monocular cameras, LiDAR, UWB localization kits, depth cameras, etc. The platform officially supports the DJI Manifold 2 as the on-board computing device, and is also compatible with the Intel NUC and with the Nvidia Jetson TX1, TX2 or Xavier equipped with a carrier board of the appropriate size.

## Hardware Specifications

| Structure | |
| :-------------------- | :-------------------- |
| Overall dimensions | 600 x 450 x 460 mm |
| Weight (with battery) | 16.6 kg |
| **Performance** | |
| Maximum forward speed | 3 m/s |
| Maximum lateral speed | 2 m/s |
| Gimbal pitch rotation range | -20° ~ 20° |
| Gimbal yaw rotation range | -90° ~ 90° |
| Maximum projectile firing rate | 10 rounds/s |
| Maximum projectile launch speed | 25 m/s |
| Projectile capacity | 200 rounds |
| **Battery** | |
| Model | DJI TB47D / DJI TB48D |
| Type | LiPo 6S |
| Voltage | 22.8 V |
| Capacity | 4500 mAh / 5700 mAh |
| **Remote controller** | |
| Model | DJI DT7 |
| Operating frequency | 2.4 GHz |
| Charging port | Micro USB |
| **Communication interface** | |
| Interface type | Micro USB |
| Communication method | STM32 virtual serial port |
| Baud rate | 921600 |


-------------------------------------------------------------------------------- /images/AMCL_flowchart.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/AMCL_flowchart.png -------------------------------------------------------------------------------- /images/AMCL_mindnode.png: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/AMCL_mindnode.png -------------------------------------------------------------------------------- /images/airobot1.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/airobot1.gif -------------------------------------------------------------------------------- /images/airobot2.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/airobot2.gif -------------------------------------------------------------------------------- /images/architecture: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/architecture -------------------------------------------------------------------------------- /images/architecture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/architecture.png -------------------------------------------------------------------------------- /images/armor_detection.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/armor_detection.png -------------------------------------------------------------------------------- /images/armor_detection_server.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/armor_detection_server.png -------------------------------------------------------------------------------- /images/camera.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/camera.png -------------------------------------------------------------------------------- /images/chassis.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/chassis.png -------------------------------------------------------------------------------- /images/chassis_executor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/chassis_executor.png -------------------------------------------------------------------------------- /images/gimbal.jpeg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/gimbal.jpeg -------------------------------------------------------------------------------- /images/gimbal_executor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/gimbal_executor.png -------------------------------------------------------------------------------- /images/global_planner.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/global_planner.png -------------------------------------------------------------------------------- /images/global_planner_actionlib.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/global_planner_actionlib.png -------------------------------------------------------------------------------- /images/global_planner_plan.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/global_planner_plan.png -------------------------------------------------------------------------------- /images/global_planner_rviz.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/global_planner_rviz.png -------------------------------------------------------------------------------- /images/local_planner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/local_planner.png -------------------------------------------------------------------------------- /images/localization.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/localization.png -------------------------------------------------------------------------------- /images/localization_rviz.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/localization_rviz.png -------------------------------------------------------------------------------- /images/modules.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/modules.png -------------------------------------------------------------------------------- /images/navigation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/navigation.png -------------------------------------------------------------------------------- /images/referee_system.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/referee_system.png -------------------------------------------------------------------------------- /images/referee_system2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/referee_system2.png -------------------------------------------------------------------------------- /images/roborts_base_node.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/roborts_base_node.png -------------------------------------------------------------------------------- /images/robot.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/robot.png -------------------------------------------------------------------------------- /images/robot45.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/robot45.jpg -------------------------------------------------------------------------------- /images/run_example.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/run_example.gif -------------------------------------------------------------------------------- /images/software.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/software.png -------------------------------------------------------------------------------- /images/system.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/system.png -------------------------------------------------------------------------------- /images/system_diagram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/images/system_diagram.png -------------------------------------------------------------------------------- /index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | Document 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 |
14 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | 54 | 55 | -------------------------------------------------------------------------------- /js/prism-bash.js: -------------------------------------------------------------------------------- 1 | (function(Prism) { 2 | var insideString = { 3 | variable: [ 4 | // Arithmetic Environment 5 | { 6 | pattern: /\$?\(\([\s\S]+?\)\)/, 7 | inside: { 8 | // If there is a $ sign at the beginning highlight $(( and )) as variable 9 | variable: [{ 10 | pattern: /(^\$\(\([\s\S]+)\)\)/, 11 | lookbehind: true 12 | }, 13 | /^\$\(\(/ 14 | ], 15 | number: /\b0x[\dA-Fa-f]+\b|(?:\b\d+\.?\d*|\B\.\d+)(?:[Ee]-?\d+)?/, 16 | // Operators according to https://www.gnu.org/software/bash/manual/bashref.html#Shell-Arithmetic 17 | operator: /--?|-=|\+\+?|\+=|!=?|~|\*\*?|\*=|\/=?|%=?|<<=?|>>=?|<=?|>=?|==?|&&?|&=|\^=?|\|\|?|\|=|\?|:/, 18 | // If there is no $ sign at the beginning highlight (( and )) as punctuation 19 | punctuation: /\(\(?|\)\)?|,|;/ 20 | } 21 | }, 22 | // Command Substitution 23 | { 24 | pattern: /\$\([^)]+\)|`[^`]+`/, 25 | greedy: true, 26 | inside: { 27 | variable: /^\$\(|^`|\)$|`$/ 28 | } 29 | }, 30 | /\$(?:[\w#?*!@]+|\{[^}]+\})/i 31 | ] 32 | }; 33 | 34 | Prism.languages.bash = { 35 | 'shebang': { 36 | pattern: /^#!\s*\/bin\/bash|^#!\s*\/bin\/sh/, 37 | alias: 'important' 38 | }, 39 | 'comment': { 40 | pattern: /(^|[^"{\\])#.*/, 41 | lookbehind: true 42 | }, 43 | 'string': [ 44 | //Support for Here-Documents https://en.wikipedia.org/wiki/Here_document 45 | { 46 | pattern: /((?:^|[^<])<<\s*)["']?(\w+?)["']?\s*\r?\n(?:[\s\S])*?\r?\n\2/, 47 | lookbehind: true, 48 | greedy: true, 49 | inside: insideString 50 | }, 51 | { 52 | pattern: /(["'])(?:\\[\s\S]|\$\([^)]+\)|`[^`]+`|(?!\1)[^\\])*\1/, 53 | greedy: true, 54 | inside: insideString 55 | } 56 | ], 57 | 'variable': insideString.variable, 58 | // Originally based on http://ss64.com/bash/ 
59 | 'function': { 60 | pattern: /(^|[\s;|&])(?:alias|apropos|apt-get|aptitude|aspell|awk|basename|bash|bc|bg|builtin|bzip2|cal|cat|catkin_make|cd|cfdisk|chgrp|chmod|chown|chroot|chkconfig|cksum|clear|cmp|comm|command|cp|cron|crontab|csplit|curl|cut|date|dc|dd|ddrescue|df|diff|diff3|dig|dir|dircolors|dirname|dirs|dmesg|du|egrep|eject|enable|env|ethtool|eval|exec|expand|expect|export|expr|fdformat|fdisk|fg|fgrep|file|find|fmt|fold|format|free|fsck|ftp|fuser|gawk|getopts|git|grep|groupadd|groupdel|groupmod|groups|gzip|hash|head|help|hg|history|hostname|htop|iconv|id|ifconfig|ifdown|ifup|import|install|jobs|join|kill|killall|less|link|ln|locate|logname|logout|look|lpc|lpr|lprint|lprintd|lprintq|lprm|ls|lsof|make|man|mkdir|mkfifo|mkisofs|mknod|more|most|mount|mtools|mtr|mv|mmv|nano|netstat|nice|nl|nohup|notify-send|npm|nslookup|open|op|passwd|paste|pathchk|ping|pkill|popd|pr|printcap|printenv|printf|ps|pushd|pv|pwd|quota|quotacheck|quotactl|ram|rar|rcp|read|readarray|readonly|reboot|rename|renice|remsync|rev|rm|rmdir|rsync|roslaunch|rosrun|screen|scp|sdiff|sed|seq|service|sftp|shift|shopt|shutdown|sleep|slocate|sort|source|split|ssh|stat|strace|su|sudo|sum|suspend|sync|tail|tar|tee|time|timeout|times|touch|top|traceroute|trap|tr|tsort|tty|type|ulimit|umask|umount|unalias|uname|unexpand|uniq|units|unrar|unshar|uptime|useradd|userdel|usermod|users|uuencode|uudecode|v|vdir|vi|vmstat|wait|watch|wc|wget|whereis|which|who|whoami|write|xargs|xdg-open|yes|zip)(?=$|[\s;|&])/, 61 | lookbehind: true 62 | }, 63 | 'keyword': { 64 | pattern: /(^|[\s;|&])(?:let|:|\.|if|then|else|elif|fi|for|break|continue|while|in|case|function|select|do|done|until|echo|exit|return|declare)(?=$|[\s;|&])/, 65 | lookbehind: true 66 | }, 67 | 'boolean': { 68 | pattern: /(^|[\s;|&])(?:true|false)(?=$|[\s;|&])/, 69 | lookbehind: true 70 | }, 71 | 'operator': /&&?|\|\|?|==?|!=?|<<>|<=?|>=?|=~/, 72 | 'punctuation': /\$?\(\(?|\)\)?|\.\.|[{}[\];]/ 73 | }; 74 | 75 | var inside = 
insideString.variable[1].inside; 76 | inside.string = Prism.languages.bash.string; 77 | inside['function'] = Prism.languages.bash['function']; 78 | inside.keyword = Prism.languages.bash.keyword; 79 | inside['boolean'] = Prism.languages.bash['boolean']; 80 | inside.operator = Prism.languages.bash.operator; 81 | inside.punctuation = Prism.languages.bash.punctuation; 82 | 83 | Prism.languages.shell = Prism.languages.bash; 84 | })(Prism); 85 | -------------------------------------------------------------------------------- /js/prism-cpp.js: -------------------------------------------------------------------------------- 1 | Prism.languages.cpp = Prism.languages.extend('c', { 2 | 'keyword': /\b(?:alignas|alignof|asm|auto|bool|break|case|catch|char|char16_t|char32_t|class|compl|const|constexpr|const_cast|continue|decltype|default|delete|do|double|dynamic_cast|else|enum|explicit|export|extern|float|for|friend|goto|if|inline|int|int8_t|int16_t|int32_t|int64_t|uint8_t|uint16_t|uint32_t|uint64_t|long|mutable|namespace|new|noexcept|nullptr|operator|private|protected|public|register|reinterpret_cast|return|short|signed|sizeof|static|static_assert|static_cast|struct|switch|template|this|thread_local|throw|try|typedef|typeid|typename|union|unsigned|using|virtual|void|volatile|wchar_t|while)\b/, 3 | 'boolean': /\b(?:true|false)\b/, 4 | 'operator': /--?|\+\+?|!=?|<{1,2}=?|>{1,2}=?|->|:{1,2}|={1,2}|\^|~|%|&{1,2}|\|\|?|\?|\*|\/|\b(?:and|and_eq|bitand|bitor|not|not_eq|or|or_eq|xor|xor_eq)\b/ 5 | }); 6 | 7 | Prism.languages.insertBefore('cpp', 'keyword', { 8 | 'class-name': { 9 | pattern: /(class\s+)\w+/i, 10 | lookbehind: true 11 | } 12 | }); 13 | 14 | Prism.languages.insertBefore('cpp', 'string', { 15 | 'raw-string': { 16 | pattern: /R"([^()\\ ]{0,16})\([\s\S]*?\)\1"/, 17 | alias: 'string', 18 | greedy: true 19 | } 20 | }); 21 | -------------------------------------------------------------------------------- /pdf/projectile_model.pdf: 
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/RoboMaster/RoboRTS-Tutorial/e4a48c2070ac6d60bed1cb67e9384dfd3b08979b/pdf/projectile_model.pdf
--------------------------------------------------------------------------------
/quick_start/quick_test.md:
--------------------------------------------------------------------------------
# Quick Test

> [!Warning]
>
> Before testing, refer to [Software Dependency Configuration](quick_start/setup_on_manifold2?id=软件依赖配置) and [Downloading and Building RoboRTS](quick_start/setup_on_manifold2?id=roborts下载与编译) to make sure every software dependency is installed and all roborts packages compile successfully.

### Testing in simulation

1. Launch the Stage simulation environment together with the localization and planning modules
```bash
roslaunch roborts_bringup roborts_stage.launch
```

2. Run the decision test node
```bash
rosrun roborts_decision behavior_test_node
```
Enter 1/2/3/4/5/6 to switch between the different behaviors

### Testing on the real robot

#### Step 1: Test the robot driver node

1. Launch the robot driver SDK script
```
roslaunch roborts_bringup base.launch
```
Use the command line to check that data can be received and that basic control works

> [!Tip]
>
> The robot driver communicates over a virtual serial port, so the peripheral port mapping must be configured first; see [Peripheral Port Mapping Configuration](quick_start/setup_on_manifold2?id=外设端口映射配置).

#### Step 2: Run a simple decision task

1. Launch the robot driver SDK, the LiDAR, the static TF broadcaster, and the localization and planning modules
```
roslaunch roborts_bringup roborts.launch
```

2. Run the decision test node
```bash
rosrun roborts_decision behavior_test_node
```
Enter 1/2/3/4/5/6 to switch between the different behaviors

--------------------------------------------------------------------------------
/quick_start/setup_on_manifold2.md:
--------------------------------------------------------------------------------
This chapter covers deploying and running RoboRTS on the Manifold 2.

> [!Tip]
>
> The Manifold 2 is a compact computer DJI builds for developers; the Manifold 2-G series carries an NVIDIA Jetson TX2 module, and "Manifold 2" in this document always refers to the Manifold 2-G series. The document applies equally to the native NVIDIA Jetson TX2 developer kit.

## Manifold 2 Performance Configuration

### Enabling maximum performance

On the Manifold 2, NVPModel assigns the number of active CPU cores and the maximum CPU and GPU frequencies. The default mode enables only 4 CPU cores; to obtain maximum compute performance, change the configuration with the `nvpmodel` command.

#### Query and change the mode:

```bash
sudo nvpmodel -q --verbose # query the current mode

sudo nvpmodel -m [mode] # the modes are described in the table below
sudo nvpmodel -m0 # recommended configuration
```
The numbered modes are as follows:



| MODE | MODE NAME | DENVER 2 | FREQUENCY | ARM A57 | FREQUENCY | GPU FREQUENCY |
| :--: | :------------: | :------: | :-------: | :-----: | :-------: | :-----------: |
| 0 | Max-N | 2 | 2.0 GHz | 4 | 2.0 GHz | 1.30 GHz |
| 1 | Max-Q | 0 | | 4 | 1.2 GHz | 0.85 GHz |
| 2 | Max-P Core-All | 2 | 1.4 GHz | 4 | 1.4 GHz | 1.12 GHz |
| 3 | Max-P ARM | 0 | | 4 | 2.0 GHz | 1.12 GHz |
| 4 | Max-P Denver | 2 | 2.0 GHz | 0 | | 1.12 GHz |

#### Enabling maximum clock frequencies:

The Manifold 2 ships with this script in the home directory; running it raises the clock frequencies to unlock maximum performance.

```bash
sudo ./jetson_clocks.sh
```

### Network speed test and connection

- Ethernet bandwidth and speed test on the Manifold 2

``` bash
sudo apt-get install iperf
#start server
sudo iperf -s
#start client
sudo iperf -c 192.168.1.xxx -i 5
```

- WiFi test on the Manifold 2

  - Disable wlan0 power saving

```bash
iw dev wlan0 set power_save off #- to disable power save and reduce ping latency.
iw dev wlan0 get power_save # will return current state.
```

- Check the WiFi RSSI

```bash
watch -n 0.1 iw dev wlan0 link
```

> [!Tip]
>
> After downloading and building the roborts packages, see the [Scripts](sdk_docs/roborts_bringup?id=脚本) section of the `roborts_bringup` module documentation: a boot-time service can enable maximum performance and disable wlan0 power saving automatically.


## Peripheral Port Mapping Configuration

Depending on the hardware interface (serial, USB, or ACM), configure udev files under /etc/udev/rules.d to bind the STM32 virtual serial port and the LiDAR to fixed device names:

First connect the STM32 device's virtual serial port; `lsusb` shows its Vendor and Product IDs. Then create and edit /etc/udev/rules.d/roborts.rules

```bash
KERNEL=="ttyACM*", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", MODE:="0777", SYMLINK+="serial_sdk"

```
Configure the LiDAR in the same way, then reload and restart the udev service; the rules may only take effect after the devices are re-plugged.

```bash
sudo service udev reload
sudo service udev restart
```

Configuring several cameras of the same model is more involved: since their Vendor and Product IDs are identical, you have to inspect the specific attributes of each camera

```bash
udevadm info --attribute-walk --name=/dev/video0
```

Usually the serial number attribute (serial) distinguishes the cameras, for example:

```bash
SUBSYSTEM=="usb", ATTR{serial}=="68974689267119892", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera0"
SUBSYSTEM=="usb", ATTR{serial}=="12345698798725654", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera1"
```

Cheap cameras may share the serial number as well; in that case bind by the physical HUB port each camera is plugged into (KERNEL or KERNELS), for example:

```bash
SUBSYSTEM=="usb", KERNEL=="2-3", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera0"
SUBSYSTEM=="usb", KERNEL=="2-4", ATTR{idVendor}=="1871", ATTR{idProduct}=="0101", SYMLINK+="camera1"
```

>[!Tip]
>
>After downloading and building the roborts packages, see the [Scripts](sdk_docs/roborts_bringup?id=脚本) section of the `roborts_bringup` module documentation to run the udev port mapping script.



## Software Dependency Configuration

### ROS (ros-kinetic-ros-base)

The Manifold 2 ships with ROS Kinetic preinstalled. On other platforms, install ROS by following the [installation tutorial](http://wiki.ros.org/kinetic/Installation/Ubuntu) on the ROS Wiki.

> [!Note]
>
> Make sure `source /opt/ros/kinetic/setup.bash` has been added to `.bashrc` or `.zshrc` so that the ROS environment variables are loaded correctly.

Install the third-party packages ROS needs, along with `SuiteSparse`, `Glog`, `Protobuf`, and the other dependencies

```bash
sudo apt-get install -y ros-kinetic-opencv3 \
ros-kinetic-cv-bridge \
ros-kinetic-image-transport \
ros-kinetic-stage-ros \
ros-kinetic-map-server \
ros-kinetic-laser-geometry \
ros-kinetic-interactive-markers \
ros-kinetic-tf \
ros-kinetic-pcl-* \
ros-kinetic-libg2o \
ros-kinetic-rplidar-ros \
ros-kinetic-rviz \
protobuf-compiler \
libprotobuf-dev \
libsuitesparse-dev \
libgoogle-glog-dev
```

### Other common tools

- git
- cmake
- vim
- terminator
- htop

## Downloading and Building RoboRTS


```bash
# create the workspace directory
mkdir -p roborts_ws/src
# change into the src directory
cd roborts_ws/src
# clone the RoboRTS source
git clone https://github.com/RoboMaster/RoboRTS
# build the source
cd ..
catkin_make
# load the environment variables
source devel/setup.bash
```

> [!Note]
>
> If you use `zsh`, replace `setup.bash` with `setup.zsh`.
--------------------------------------------------------------------------------
/resources.md:
--------------------------------------------------------------------------------
# Related Documents

Documents for the ICRA 2019 RoboMaster AI Challenge.

## Rules Manual V1.0

[ICRA 2019 RoboMaster AI Challenge Rules Manual V1.0 (PDF)](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20RoboMaster%20%E4%BA%BA%E5%B7%A5%E6%99%BA%E8%83%BD%E6%8C%91%E6%88%98%E8%B5%9B%E8%A7%84%E5%88%99%E6%89%8B%E5%86%8C%20V1.0.pdf)

## AI Robot User Manual V1.0

[ICRA 2019 DJI RoboMaster AI Challenge AI Robot User Manual V1.0 (PDF)](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20DJI%20RoboMaster%20%E4%BA%BA%E5%B7%A5%E6%99%BA%E8%83%BD%E6%8C%91%E6%88%98%E8%B5%9BAI%E6%9C%BA%E5%99%A8%E4%BA%BA%E7%94%A8%E6%88%B7%E6%89%8B%E5%86%8CV1.0.pdf)


## Referee System Specification Manual V1.0

[ICRA 2019 RoboMaster AI Challenge Referee System Specification Manual V1.0 (PDF)](https://rm-static.djicdn.com/tem/19806/ICRA%202019%20RoboMaster%20%E4%BA%BA%E5%B7%A5%E6%99%BA%E8%83%BD%E6%8C%91%E6%88%98%E8%B5%9B%E8%A3%81%E5%88%A4%E7%B3%BB%E7%BB%9F%E8%A7%84%E8%8C%83%E6%89%8B%E5%86%8CV1.0.pdf)

## RoboMaster AI Robot mechanical drawings

### STEP

[RoboMaster AI Robot Mechanism STEP drawings](https://rm-static.djicdn.com/documents/19806/4df8649b3596f1548056917303346609.STEP)

### SolidWorks

These drawings require SolidWorks 2016 or later.

[RoboMaster AI Robot mechanical drawings (SolidWorks)](https://rm-static.djicdn.com/documents/19806/232fed8050cfe1548739880652461892.SLDPRT)

### Creo
These drawings require Creo 3 or later.

[RoboMaster AI Robot mechanical drawings (Creo)](https://rm-static.djicdn.com/documents/19806/a96a1cc07664b1548738962638883052.1)

## RoboMaster AI Robot open source repositories

[RoboMaster AI Robot embedded MCU (STM32)](https://github.com/RoboMaster/RoboRTS-Firmware)

[RoboMaster AI Robot onboard PC (ROS)](https://github.com/RoboMaster/RoboRTS)
--------------------------------------------------------------------------------
/roborts.md:
--------------------------------------------------------------------------------

RoboMaster 2019 AI Robot Platform

![modules](https://rm-static.djicdn.com/documents/20758/d4e8eabc0a8161547546323338435301.jpg)
The **RoboMaster 2019 AI Robot Platform** is a general-purpose mobile robot development platform designed for the [**ICRA 2019 RoboMaster AI Challenge**](https://www.icra2019.org/competitions/dji-robomaster-ai-challenge). The platform open-sources the hardware design, an embedded framework for STM32-series MCUs, and the RoboRTS onboard software framework built on the ROS robot operating system, covering a range of mobile-robot algorithm implementations and example programs.


--------------------------------------------------------------------------------
/sdk_docs/_sidebar.md:
--------------------------------------------------------------------------------

* [Back]()
* [Architecture](sdk_docs/architecture.md)
* [roborts_base](sdk_docs/roborts_base.md)
* [roborts_camera](sdk_docs/roborts_camera.md)
* [roborts_detection](sdk_docs/roborts_detection.md)
* [roborts_localization](sdk_docs/roborts_localization.md)
* [roborts_decision](sdk_docs/roborts_decision.md)
* roborts_planning
  * [Global Planner](sdk_docs/roborts_planning_global_planner.md)
  * [Local Planner](sdk_docs/roborts_planning_local_planner.md)
* [roborts_bringup](sdk_docs/roborts_bringup.md)



--------------------------------------------------------------------------------
/sdk_docs/architecture.md:
--------------------------------------------------------------------------------
# Overall Architecture

## Architecture and modules
The system is organized as the loop `robot sensors -> perception -> decision -> planning -> control -> actuators`; each module is maintained as a ROS package, and the modules and their data flow are shown below
![](https://rm-static.djicdn.com/documents/20758/f42d65d85d97c1547553106539783606.png)

### Sensors, controllers, and actuators

- The central module integrates the sensor modules (LiDAR, camera, IMU, etc.), the embedded control platform (running real-time tasks such as closed-loop control and data acquisition and preprocessing), and the actuators (motors, etc.), covering both sensing and control. The corresponding ROS packages are [roborts_base](sdk_docs/roborts_base), which contains the embedded SDK, [roborts_camera](sdk_docs/roborts_camera) for the camera, and the related sensor driver packages

### Perception

Perception covers robot localization, map maintenance and abstraction, and target detection and tracking

- The localization module handles robot localization; see [roborts_localization](sdk_docs/roborts_localization)

- The map module maintains the robot's map, currently the open-source ROS package [map_server](http://wiki.ros.org/map_server)

- 
The costmap module maintains the cost map, integrating a static map layer, an obstacle layer, and an inflation layer; it is used mainly by motion planning, see roborts_costmap. It is planned to evolve into a feature_map module that is no longer tied to planning alone.

- The detection module handles target detection and tracking; see [roborts_detection](sdk_docs/roborts_detection). It currently integrates enemy armor detection and the projectile-firing controller; because the required frame rate is high, detection and control are currently coupled, and they will later be decoupled by moving the control/tracking part into gimbal_executor

### Task scheduling and decision

Task scheduling and decision includes the interfaces for dispatching the perception input modules and the planning/execution output modules, plus the core decision framework.
- The decision module is the robot's decision framework; the official example is a behavior tree (BehaviorTree). The blackboard module dispatches the perception tasks of the various modules to gather their information together with the referee system's match information, and the behavior module integrates the actions or behaviors of a discrete action space; see [roborts_decision](sdk_docs/roborts_decision)

- The executor module is what the behavior module depends on; it contains robot task delegation interfaces at different levels of abstraction within the chassis and gimbal (for example dispatching chassis motion planning); see roborts_decision/executor

### Motion planning

Motion planning provides the navigation functionality and is dispatched by the chassis_executor module of the decision part; see roborts_planning

- The global_planner module performs global path planning for the robot; see [roborts_planning/global_planner](sdk_docs/roborts_planning_global_planner); it depends on roborts_costmap

- The local_planner module performs local trajectory planning for the robot; see [roborts_planning/local_planner](sdk_docs/roborts_planning_local_planner); it depends on roborts_costmap


## ROS packages

| Package | Function | Internal dependencies |
| :--: | :------------: | :------: |
| roborts | Meta-package | - |
| roborts_base | Embedded communication interface | roborts_msgs |
| roborts_camera | Camera driver package | roborts_common |
| roborts_common | Common dependency package | - |
| roborts_decision | Robot decision package | roborts_common<br>roborts_msgs<br>roborts_costmap |
| roborts_detection | Vision detection algorithm package | roborts_msgs<br>roborts_common<br>roborts_camera |
| roborts_localization | Robot localization algorithm package | - |
| roborts_costmap | Cost map support package | roborts_common |
| roborts_msgs | Custom message type package | - |
| roborts_planning | Motion planning algorithm package | roborts_common<br>roborts_msgs<br>roborts_costmap |
| roborts_bringup | Bringup package | roborts_base<br>roborts_common<br>roborts_localization<br>roborts_costmap<br>roborts_msgs<br>roborts_planning<br> |
| roborts_tracking | Vision tracking algorithm package | roborts_msgs |
--------------------------------------------------------------------------------
/sdk_docs/roborts_base.md:
--------------------------------------------------------------------------------
# Robot Driver Module

## Module overview

The robot driver module is the bridge between the low-level embedded control board and the onboard computer: over a virtual serial port, a common protocol defines the kinds of data transmitted in both directions so the two sides can exchange data.

On the onboard side, `roborts_base` provides the ROS interface that receives the data sent by the embedded control board and commands it to switch robot modes and control motion.

`roborts_base` depends on the message types defined in `roborts_msgs`.

The module layout is as follows
```bash
roborts_base
├── CMakeLists.txt
├── cmake_module
│   └── FindGlog.cmake
├── chassis #ROS interface wrapper for the chassis module
│   ├── chassis.cpp
│   └── chassis.h
├── gimbal #ROS interface wrapper for the gimbal module
│   ├── gimbal.cpp
│   └── gimbal.h
├── config
│   └── roborts_base_parameter.yaml #parameter configuration file
├── roborts_base_config.h #parameter loading class
├── roborts_base_node.cpp #Main function of the core node
├── ros_dep.h #includes the headers of all ROS messages that map to the protocol
├── roborts_sdk
│   ├── ...
└── package.xml
```
The roborts_sdk directory holds the native protocol SDK, free of any ROS dependency:
```bash
├── roborts_sdk
│   ├── hardware #hardware layer: transmits and receives protocol data
│   │   ├── hardware_interface.h #hardware layer base class
│   │   ├── serial_device.cpp #serial device implementation class
│   │   └── serial_device.h
│   ├── protocol #protocol layer: packs and unpacks protocol frames
│   │   ├── protocol.cpp #protocol layer class
│   │   ├── protocol_define.h #header defining the concrete protocol message types
│   │   └── protocol.h
│   ├── dispatch #dispatch layer: routes messages
│   │   ├── dispatch.h #dispatch layer class
│   │   ├── execution.cpp #dispatch layer execution class
│   │   ├── execution.h
│   │   ├── handle.cpp #external interface class of the three roborts_sdk layers
│   │   └── handle.h
│   ├── sdk.h #public roborts_sdk header
│   ├── test
│   │   └── sdk_test.cpp #protocol test file
│   └── utilities
│   ├── circular_buffer.h #ring buffer
│   ├── crc.h #CRC checksum
│   ├── log.h #logging
│   └── memory_pool.h #memory pool
```

>[!Note]
>For protocol details, see the `RoboMaster AI Robot communication protocol documentation (TODO)`.

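The roborts_sdk utilities listed above include a ring buffer (`circular_buffer.h`) between the hardware and dispatch layers. As a language-neutral sketch of the idea only — the real implementation is a C++ template whose exact semantics may differ — a fixed-capacity buffer that evicts the oldest item when full can look like:

```python
from collections import deque

class RingBuffer:
    """Fixed-capacity FIFO; when full, the oldest item is overwritten.

    Illustrative sketch only -- circular_buffer.h in roborts_sdk is a C++
    template and its exact behavior may differ."""

    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)

    def push(self, item):
        self._buf.append(item)  # silently evicts the oldest item when full

    def pop(self):
        return self._buf.popleft() if self._buf else None

    def __len__(self):
        return len(self._buf)

rb = RingBuffer(3)
for b in (1, 2, 3, 4):   # 1 is evicted when 4 arrives
    rb.push(b)
print([rb.pop() for _ in range(len(rb))])  # -> [2, 3, 4]
```

This shape suits a serial receive path: memory stays bounded, and under overload the newest bytes win rather than blocking the hardware layer.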
In the module's core node `roborts_base_node`, objects for the required modules (such as the chassis and gimbal) are created and initialized, after which the communication tasks run normally.

Its ROS node graph is:

![](https://rm-static.djicdn.com/documents/20758/002d528eb36ad1550474043463957284.png)

### Chassis module

#### Input

* /cmd_vel ([geometry_msgs/Twist]())

  Chassis velocity command: during the next control cycle the chassis moves at this constant velocity

* /cmd_vel_acc ([roborts_msgs/TwistAccel]())

  Chassis velocity and acceleration command: during the next control cycle the chassis accelerates uniformly from the given velocity


#### Output

* /odom ([nav_msgs/Odometry]())

  Chassis odometry

* /uwb ([geometry_msgs/PoseStamped]())

  Chassis pose in the UWB coordinate frame

* /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  The base_link->odom transform


### Gimbal module

#### Input

* /cmd_gimbal_angle ([roborts_msgs/GimbalAngle]())

  Gimbal angle command; a relative-angle flag selects absolute or relative angle control

* /cmd_gimbal_rate ([roborts_msgs/GimbalRate]())

  [**Disabled**] gimbal angular rate command

* /set_gimbal_mode ([roborts_msgs/GimbalMode]())

  Sets the gimbal mode

* /cmd_fric_wheel ([roborts_msgs/FricWhl]())

  Turns the friction wheels on and off (launch speed control to be opened up later)

* /cmd_shoot ([roborts_msgs/ShootCmd]())

  Projectile firing command (firing mode, rate, and count)

#### Output

* /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  The base_link->gimbal transform

### Parameters

* serial_path(`string`, default: "/dev/serial_sdk")

  Path of the serial port; the default "/dev/serial_sdk" is the name assigned by the udev rules

## Build and run

### Build

Build inside a ROS workspace

```shell
catkin_make -j4 roborts_base_node
```

### Run

Run the roborts_base_node node

```shell
rosrun roborts_base roborts_base_node
```

or start the related launch file

```shell
roslaunch roborts_bringup base.launch
```


--------------------------------------------------------------------------------
/sdk_docs/roborts_bringup.md:
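The chassis inputs of `roborts_base` distinguish `/cmd_vel` (constant velocity over the next control cycle) from `/cmd_vel_acc` (uniform acceleration over the next control cycle). A hedged Python sketch of what those command semantics mean kinematically — the helper and its numbers are illustrative assumptions, not part of the driver, which runs this loop on the embedded board:

```python
def integrate_cycle(v0, a, dt, steps=100):
    """Integrate displacement over one control cycle of length dt:
    the chassis starts at velocity v0 and applies constant acceleration a.
    Returns (displacement, final velocity)."""
    x, v = 0.0, v0
    h = dt / steps
    for _ in range(steps):
        x += v * h + 0.5 * a * h * h  # exact per-step update for constant a
        v += a * h
    return x, v

# A /cmd_vel-style command is the special case a = 0:
x_const, _ = integrate_cycle(v0=1.0, a=0.0, dt=0.05)
# A /cmd_vel_acc-style command additionally accelerates within the cycle:
x_acc, v_end = integrate_cycle(v0=1.0, a=2.0, dt=0.05)
print(x_const, x_acc, v_end)
```

The extra acceleration term lets the local planner hand over smoother velocity profiles than re-sending piecewise-constant velocities every cycle.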
--------------------------------------------------------------------------------
# Bringup Module

## Module overview

The bringup module holds the basic configuration files and launch scripts; it lives in the roborts_bringup package with the following layout
```bash
roborts_bringup
├── CMakeLists.txt
├── launch #launch scripts
│   ├── base.launch #robot driver SDK launch script
│   ├── roborts.launch #test script for all functional nodes except decision and scheduling
│   ├── roborts_stage.launch #the corresponding Stage simulation test script
│   ├── mapping.launch #mapping test script
│   ├── mapping_stage.launch #mapping test script in Stage simulation
│   ├── slam_gmapping.xml #gmapping node launch script
│   └── static_tf.launch #static transform script
├── maps #map configuration files
│   ├── icra2018.pgm
│   ├── icra2018.yaml
│   ├── icra2019.pgm
│   └── icra2019.yaml
├── package.xml
├── rviz #rviz configuration files
│   ├── mapping.rviz
│   ├── roborts.rviz
│   └── teb_test.rviz
├── scripts
│   ├── udev #udev port mapping scripts
│   │   ├── create_udev_rules.sh
│   │   ├── delete_udev_rules.sh
│   │   └── roborts.rules
│   └── upstart #boot-time startup scripts
│   ├── create_upstart_service.sh
│   ├── delete_upstart_service.sh
│   ├── jetson_clocks.sh
│   ├── max-performance.service
│   ├── max_performance.sh
│   ├── roborts.service
│   └── roborts-start.sh
└── worlds #Stage configuration files for the competition arenas
├── icra2018.world
└── icra2019.world

```

## Scripts

Run this script to add the udev mapping rules:

```shell
./create_udev_rules.sh
```

Run this script to delete the udev mapping rules:
```shell
./delete_udev_rules.sh
```

The udev rules themselves are in the `roborts.rules` file and can be freely extended and modified for the peripherals you need

On the Manifold 2, run this script to add the boot-time startup service:

```shell
./create_upstart_service.sh
```
Run this script to delete the boot-time startup service:

```shell
./delete_upstart_service.sh
```

The startup services specifically comprise

- the max-performance.service, which runs the jetson_clocks.sh script together with nvpmodel to maximize performance,

- the roborts.service, which runs the roborts-start.sh script to start the ros launch script

Users can customize the startup behavior by editing the relevant script and service files

## Test and run

On the real robot, start the robot driver SDK script

```shell
roslaunch roborts_bringup base.launch
```

On the real robot, run the test script that starts all functional nodes except decision and scheduling

```shell
roslaunch roborts_bringup roborts.launch
```

In the Stage simulation environment, run the same test script

```shell
roslaunch roborts_bringup roborts_stage.launch
```

On the real robot, run the gmapping mapping test script

```shell
roslaunch roborts_bringup mapping.launch
```

In the Stage simulation environment, run the gmapping mapping test script

```shell
roslaunch roborts_bringup mapping_stage.launch
```

> [!Note]
>
> Note that none of the launch scripts meant for the real robot start the rviz visualization by default.
--------------------------------------------------------------------------------
/sdk_docs/roborts_camera.md:
--------------------------------------------------------------------------------
# Camera Driver Module

## Module overview

The camera driver module wraps common camera drivers and publishes image data and camera parameters through ROS `image_transport`.

The camera module lives in the roborts_camera package and depends on the abstract factory pattern and parameter loading in the roborts_common package, with the following layout

```bash
roborts_camera
├── CMakeLists.txt
├── package.xml
├── cmake_module
│   └── FindProtoBuf.cmake
├── camera_base.h #camera abstract class
├── camera_node.cpp #camera core node and Main function
├── camera_node.h
├── camera_param.cpp #camera parameter loading class
├── camera_param.h
├── config
│   └── camera_param.prototxt #camera parameter configuration file
├── proto
│   ├── camera_param.pb.cc
│   ├── camera_param.pb.h
│   └── camera_param.proto #camera parameter definition file
├── test
│   └── image_capture.cpp #test node receiving camera data and parameters
└── uvc
├── CMakeLists.txt
├── uvc_driver.cpp #uvc camera class
└── uvc_driver.h
```

The camera node `roborts_camera_node` reads the configuration parameters of one or more cameras and automatically dispatches them to publish image data.

### Parameters
Parameters are defined in `proto/camera_param.proto` and configured in `config/camera_param.prototxt`; multiple camera configurations are accepted, and each camera's parameters include

* camera_name (`string`)

  Camera name, used as the namespace of the published camera messages

* camera_type (`string`)

  Camera type, the key used to register and instantiate the camera object in the abstract factory pattern, e.g. "uvc" for a UVC camera

* camera_path (`string`)

  Camera path, the port path used to open the camera

* camera_matrix (`double[]`)

  Camera intrinsic matrix, normally 3x3

* camera_distortion (`double[]`)

  Camera distortion matrix

* fps (`uint32`)

  Camera frame rate (in frames per second)

* resolution

  * width (`uint32`)

    Resolution width (in pixels)

  * height (`uint32`)

    Resolution height (in pixels)

  * width_offset (`uint32`)

    Resolution width offset (in pixels), for the crop applied when the hardware captures a frame

  * height_offset (`uint32`)

    Resolution height offset (in pixels), for the crop applied when the hardware captures a frame

* auto_exposure (`bool`)

  Auto exposure

* exposure_value (`uint32`)

  Exposure value

* exposure_time (`uint32`)

  Exposure time (in us)

* auto_white_balance (`bool`)

  Auto white balance

* white_balance_value (`uint32`)

  White balance value

* auto_gain (`bool`)

  Auto gain

* gain_value (`uint32`)

  Gain value

* contrast (`uint32`)

  Contrast

### Output

/camera_name/camera_info ([sensor_msgs/CameraInfo]())

  Camera parameter information

/camera_name/image_raw ([sensor_msgs/Image]())

  Raw camera image data


## Build and run

### Build

Build inside a ROS workspace

```shell
catkin_make -j4 roborts_camera_node
```

### Run

Start the camera node

```shell
rosrun roborts_camera roborts_camera_node
```

Open rqt to check the published images

```shell
rqt_image_view
```
--------------------------------------------------------------------------------
/sdk_docs/roborts_decision.md:
--------------------------------------------------------------------------------
# Task Scheduling and Decision Module

## Module overview

The task scheduling and decision module provides the interfaces that dispatch the perception input modules and the planning/execution output modules, plus the core decision framework.

The module lives in the `roborts_decision` package and depends on the parameter loading in `roborts_common`, the cost map object in `roborts_costmap` (optional), and the message types in `roborts_msgs`.

The module layout is as follows

```bash
roborts_decision/
├── behavior_test.cpp #test node for the example behaviors
├── behavior_tree
 #decision framework example: behavior tree
│   ├── behavior_node.h #behavior tree node class definitions
│   ├── behavior_state.h #behavior tree state definitions
│   └── behavior_tree.h #behavior tree runner class definition
├── blackboard
│   └── blackboard.h #blackboard definition (input of the decision framework)
├── CMakeLists.txt
├── cmake_module
│   ├── FindEigen3.cmake
│   └── FindProtoBuf.cmake
├── config
│   └── decision.prototxt #parameter configuration file for the example behaviors
├── example_behavior #example behaviors (output of the decision framework)
│   ├── back_boot_area_behavior.h #return to the boot area
│   ├── chase_behavior.h #chase the enemy
│   ├── escape_behavior.h #flee when the enemy is spotted
│   ├── goal_behavior.h #navigate to a specified goal
│   ├── line_iterator.h
│   ├── patrol_behavior.h #patrol fixed waypoints
│   └── search_behavior.h #search the area where the enemy disappeared
├── executor #task execution scheduling (task delegation for the different modules)
│   ├── chassis_executor.cpp
│   ├── chassis_executor.h #chassis task scheduling class
│   ├── gimbal_executor.cpp
│   └── gimbal_executor.h #gimbal task scheduling class
├── package.xml
└── proto
├── decision.pb.cc
├── decision.pb.h
└── decision.proto #parameter definition file for the example behaviors

```
It consists of two core parts, decision and task scheduling:

### Decision module

The decision module comprises several parts:

- Decision framework

  The decision framework takes observations as input and actions as output to help the robot make decisions. The official example framework is a behavior tree; see `roborts_decision/behavior_tree`

- Blackboard

  Similar to the blackboard concept in game design, the blackboard is the observation input of the decision system, dispatching a series of perception tasks and collecting their information. See `roborts_decision/blackboard/blackboard.h`; users can adapt and extend the blackboard class for the kinds of information they gather

- Behavior

  Concrete robot behaviors, abstracted to different degrees, serve as the actions of the decision system. A set of example behaviors is provided in `roborts_decision/example_behavior`; users can define their own behaviors by following the examples

### Task scheduling module

The task scheduling the behaviors rely on handles task delegation for each module, dispatching the function execution modules to carry out concrete tasks.

For each scheduling module, the core is the three call interfaces

- task execution (input of the concrete task)

- task status update (real-time task status feedback)

- task cancelation (state reset and cleanup after an interruption)

which are essentially sufficient to schedule the different tasks.

The task states are

- IDLE (initial)

- RUNNING

- SUCCESS

- FAILURE

By robot module, the schedulers are

- the chassis scheduling module

- the gimbal scheduling module

#### Chassis scheduling module

The chassis scheduling module offers chassis task scheduling interfaces at different levels of abstraction; its block diagram is

![](https://rm-static.djicdn.com/documents/20758/ae091269db41b1547553446982826082.png)

It has three task modes

- Motion planning control

  Input a goal ([geometry_msgs/PoseStamped]()); global path planning and local trajectory planning are dispatched to control the chassis

- Velocity control

  Input a twist ([geometry_msgs/Twist]()) for direct constant-velocity chassis control

- Velocity and acceleration control

  Input a velocity and acceleration twist_accel ([roborts_msgs/TwistAccel]()) for direct uniformly accelerated chassis control

##### Navigation task example

The navigation task performed under motion planning control is in fact a complex task in which several module nodes cooperate: the chassis scheduling module delegates planning to the global planner and local planner modules, which ultimately output velocity and acceleration commands to the low-level main control board, while the planners in turn depend on continuously updated odometry, localization, and the cost map

The navigation system block diagram for the real and simulated environments is

![](https://rm-static.djicdn.com/documents/20758/1a8702f75a2361547553400516274462.png)

#### Gimbal scheduling module

The gimbal scheduling module offers gimbal task scheduling interfaces at different levels of abstraction; its block diagram is

![](https://rm-static.djicdn.com/documents/20758/5a50d15ba49371547553470292508751.png)

It has two task modes

- Angle control

  Input a target angle ([roborts_msgs/GimbalAngle]()) for direct gimbal angle control

- Rate control

  Input a target angular rate ([roborts_msgs/GimbalRate]()) for direct gimbal angular-rate control




## Build and run

### Build

Build inside a ROS workspace

```shell
catkin_make -j4 behavior_test_node
```

### Run


Test in the simulation environment

```shell
roslaunch roborts_bringup roborts_stage.launch
```

Start the behavior test node

```shell
rosrun roborts_decision behavior_test_node
```

Enter 1/2/3/4/5/6 to switch between the different behaviors






--------------------------------------------------------------------------------
/sdk_docs/roborts_detection.md:
--------------------------------------------------------------------------------
# Detection Module

## Module overview

The detection module detects the armor modules of the ICRA RoboMaster 2019 AI Challenge robots, and also includes a simple analysis of the projectile model (see the PDF [projectile model analysis](https://raw.githubusercontent.com/RoboRTS-Tutorial/master/pdf/projectile_model.pdf)).
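The projectile analysis referenced above amounts to compensating the gimbal pitch for air drag and gravity. As a hedged sketch only — this closed-form linear-drag model is an assumption for illustration and need not match the PDF's exact derivation; `init_v` and `init_k` are the configuration parameters named later in this document:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def flight_time(dist, v0, k):
    """Time to cover horizontal distance `dist` when drag is modelled as
    linear in velocity, so x(t) = (v0/k) * (1 - exp(-k*t)).
    v0 plays the role of init_v and k of init_k (illustrative assumption)."""
    if k * dist >= v0:
        raise ValueError("target out of range for this model")
    return -math.log(1.0 - k * dist / v0) / k

def pitch_offset(dist, height, v0, k):
    """Pitch angle (rad) that aims above the target to cancel gravity drop."""
    t = flight_time(dist, v0, k)
    drop = 0.5 * G * t * t  # gravity drop accumulated during the flight
    return math.atan2(height + drop, dist)

# e.g. target 3 m away, armor 0.1 m above the muzzle, 18 m/s, k = 0.02
print(pitch_offset(3.0, 0.1, 18.0, 0.02))
```

Drag lengthens the flight time relative to the vacuum value `dist / v0`, so the required pitch grows slightly with `k`.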
模块位于`roborts_detection`包中,依赖`roborts_common`包中的抽象工厂模式和参数读取,模块文件目录如下所示.

```bash
roborts_detection
├── package.xml
├── CMakeLists.txt
├── armor_detection                       # 装甲识别算法
│   ├── CMakeLists.txt
│   ├── config
│   │   └── armor_detection.prototxt      # 装甲识别参数配置文件
│   ├── armor_detection_algorithms.h      # 装甲识别算法头文件(所有算法头文件都应在此文件中引入)
│   ├── armor_detection_base.h            # 装甲识别父类
│   ├── armor_detection_client.cpp        # 装甲识别actionlib中的client,在调试中使用
│   ├── armor_detection_node.cpp          # ROS node,负责装甲识别内部逻辑调度
│   ├── armor_detection_node.h            # 装甲识别入口文件
│   ├── gimbal_control.cpp                # 云台控制文件,基于弹丸模型解算云台控制的pitch与yaw
│   ├── gimbal_control.h
│   ├── proto
│   │   ├── armor_detection.pb.cc
│   │   ├── armor_detection.pb.h
│   │   └── armor_detection.proto         # 装甲识别参数定义文件
│   └── constraint_set                    # 装甲识别算法,使用装甲特征识别装甲板
│       ├── CMakeLists.txt
│       ├── config
│       │   └── constraint_set.prototxt   # 装甲识别可调参数
│       ├── constraint_set.cpp
│       ├── constraint_set.h
│       └── proto
│           ├── constraint_set.pb.cc
│           ├── constraint_set.pb.h
│           └── constraint_set.proto      # 装甲识别参数定义文件
├── cmake_module
│   ├── FindEigen3.cmake
│   └── FindProtoBuf.cmake
└── util
    ├── CMakeLists.txt
    └── cv_toolbox.h                      # 订阅图像供整个detection模块使用,同时包含通用图像处理函数
```

armor_detection_node 的ROS节点图示为

![](https://rm-static.djicdn.com/documents/20758/01dfe2ff9684a1547553225209707914.png)

节点的输入输出如下

### 输入

- /camera_name/image_raw ([sensor_msgs/Image](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/Image.html))

  (必须) 通过订阅roborts_camera得到, 用于装甲识别

- /camera_name/camera_info ([sensor_msgs/CameraInfo](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/CameraInfo.html))

  (必须) 通过订阅roborts_camera得到, 用于PnP解算, 得出目标三维坐标

### 输出

- /armor_detection_node_action/feedback (roborts_msgs/action/ArmorDetection)

  `Actionlib Server`实时反馈识别到装甲的位置信息,接口由Actionlib封装

- /armor_detection_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalStatusArray.html))

  `Actionlib Server`实时反馈装甲识别节点运行状态,接口由Actionlib封装

- /cmd_gimbal_angle (roborts_msgs/GimbalAngle)

  发布云台控制信息

### 相关参数

参数的定义见`proto/armor_detection.proto`,参数的配置见`config/armor_detection.prototxt`

- name(`string`)

  装甲板识别算法的名称

- selected_algorithm(`string`)

  所选装甲板识别算法的名称

- undetected_armor_delay(`uint32`)

  未检测到装甲板时, 继续发布上一帧数据的次数

- camera_name(`string`)

  相机名称, 应与roborts_camera配置文件中的camera_name相同

- camera_gimbal_transform(`CameraGimbalTransform`) # 复合数据

  相机与云台的变换, 包含以下参数

  - offset_x(`float`)

    相机与云台x方向偏移

  - offset_y(`float`)

    相机与云台y方向偏移

  - offset_z(`float`)

    相机与云台z方向偏移

  - offset_pitch(`float`)

    相机与云台pitch角偏移

  - offset_yaw(`float`)

    相机与云台yaw角偏移

- projectile_model_info(`ProjectileModelInfo`) # 复合数据

  弹丸初始参数, 用于弹丸模型分析, 包含以下参数

  - init_v(`float`)

    弹丸出射速度

  - init_k(`float`)

    空气阻力常数, 具体参考projectile_model.pdf

## 编译与运行

### 编译

在ROS的工作区内编译

```shell
catkin_make -j4 armor_detection_client armor_detection_node
```

### 运行

启动装甲板识别节点

```shell
rosrun roborts_detection armor_detection_node
```

实际调试过程中,需启动armor_detection_client节点

```shell
rosrun roborts_detection armor_detection_client
```

--------------------------------------------------------------------------------
/sdk_docs/roborts_localization.md:
--------------------------------------------------------------------------------
# 定位模块

## 模块介绍

定位模块是导航系统强依赖的节点,其通过获取一系列传感器信息,经过特定算法的处理,最终得到机器人坐标系到地图坐标系的变换关系,也就是机器人在地图中的位姿。在`roborts_localization`中,默认的定位算法是AMCL算法,关于AMCL算法的介绍和参数说明可以参考[AMCL](sdk_docs/roborts_localization?id=amcl算法)章节。

数据流示意图如下:

![localization](https://rm-static.djicdn.com/documents/20758/0ba7ea27fc531547553277041255026.png)

文件目录如下:

```bash
├── CMakeLists.txt
├── cmake_module
│   ├── FindEigen3.cmake
│   └── FindGlog.cmake
├── config
│   └── localization.yaml        #Localization参数配置文件,需要在launch file中load
├── localization_config.h        #Localization参数类
├── localization_math.cpp
├── localization_math.h          #模块内通用数学函数
├── localization_node.cpp        #主节点和Main函数
├── localization_node.h
├── log.h                        #Glog Wrapper
├── package.xml
├── types.h                      #Type define
└── amcl                         #AMCL算法代码目录
    ├── amcl_config.h            #AMCL参数类
    ├── amcl.cpp                 #AMCL主要逻辑代码
    ├── amcl.h
    ├── CMakeLists.txt
    ├── config
    │   └── amcl.yaml            #AMCL参数配置文件,需要在launch file中load
    ├── map
    │   ├── amcl_map.cpp
    │   └── amcl_map.h           #AMCL地图运算相关代码
    ├── particle_filter          #粒子滤波器相关代码
    │   ├── particle_filter.cpp
    │   ├── particle_filter_gaussian_pdf.cpp
    │   ├── particle_filter_gaussian_pdf.h
    │   ├── particle_filter.h
    │   ├── particle_filter_kdtree.cpp
    │   ├── particle_filter_kdtree.h
    │   └── particle_filter_sample.h
    └── sensors                  #里程计模型与激光雷达传感器模型
        ├── sensor_laser.cpp
        ├── sensor_laser.h
        ├── sensor_odom.cpp
        └── sensor_odom.h
```

定位节点可以从命令行独立启动

```bash
# 编译roborts_localization
catkin_make -DCMAKE_BUILD_TYPE=Release localization_node

rosrun roborts_localization localization_node
```

或从launch文件启动,例如加载两份参数配置并启动定位节点(示例写法,具体以仓库中的launch文件为准):

```xml
<launch>
  <node pkg="roborts_localization" type="localization_node" name="localization_node" output="screen">
    <rosparam command="load" file="$(find roborts_localization)/config/localization.yaml" />
    <rosparam command="load" file="$(find roborts_localization)/amcl/config/amcl.yaml" />
  </node>
</launch>
```

RViz中的测试效果

![localization_rviz](https://rm-static.djicdn.com/documents/20758/f172f1f29aa2f1547553303273821911.png)

## 模块输入

* /map ([nav_msgs/OccupancyGrid](http://docs.ros.org/api/nav_msgs/html/msg/OccupancyGrid.html))

  (必须) 订阅静态地图。

* /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  (必须) 获取坐标系间的转换关系。(odom->base_link)

* /scan ([sensor_msgs/LaserScan](http://docs.ros.org/api/sensor_msgs/html/msg/LaserScan.html))

  (必须) 订阅激光雷达数据。

* /initialpose ([geometry_msgs/PoseWithCovarianceStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseWithCovarianceStamped.html))

  (可选) 估计位姿,以此均值和方差初始化粒子滤波器。对应RViz中的2D Pose Estimate。

* /uwb ([geometry_msgs/PoseStamped](http://docs.ros.org/melodic/api/geometry_msgs/html/msg/PoseStamped.html))

  (可选) UWB数据,用于校正全局定位信息。

## 模块输出

* /amcl_pose ([geometry_msgs/PoseWithCovarianceStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseWithCovarianceStamped.html))

  通过AMCL算法估计的位姿。

* /particlecloud ([geometry_msgs/PoseArray](http://docs.ros.org/api/geometry_msgs/html/msg/PoseArray.html))

  粒子滤波器中粒子的位姿。

* /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  发布`odom`系到`map`系的转换关系。

> [!NOTE]
> 在运行`roborts_localization`模块前,需检查roborts_base和激光雷达相关驱动是否已正确运行。

## 相关参数

* odom_frame_id (`string`, 默认值: "odom")

  odom坐标系名称。

* base_frame_id (`string`, 默认值: "base_link")

  机体坐标系名称。

* global_frame_id (`string`, 默认值: "map")

  地图坐标系名称。

* laser_topic_name (`string`, 默认值: "scan")

  激光雷达数据的topic名。

* map_topic_name (`string`, 默认值: "map")

  静态地图的topic名。

* init_pose_topic_name (`string`, 默认值: "initialpose")

  指定估计位姿的topic名。

* transform_tolerance (`double`, 默认值: 0.1)

  发布的tf变换的容忍时间。

* initial_pose_x (`double`, 默认值: 1)

  初始估计位姿的x轴位置。

* initial_pose_y (`double`, 默认值: 1)

  初始估计位姿的y轴位置。

* initial_pose_a (`double`, 默认值: 0)

  初始估计位姿的yaw角度。

* initial_cov_xx (`double`, 默认值: 0.1)

  初始估计位姿的xx协方差。

* initial_cov_yy (`double`, 默认值: 0.1)

  初始估计位姿的yy协方差。

* initial_cov_aa (`double`, 默认值: 0.1)

  初始估计位姿的aa协方差。

* enable_uwb (`bool`, 默认值: true)

  是否启用UWB校正。

* uwb_frame_id (`string`, 默认值: "uwb")

  UWB坐标系名称。

* uwb_topic_name (`string`, 默认值: "uwb")

  UWB数据的topic名。

* use_sim_uwb (`bool`, 默认值: false)

  是否使用stage模拟器生成fake uwb信息。

* uwb_correction_frequency (`int`, 默认值: 20)

  UWB校正的频率。


## AMCL算法

### 算法介绍

![autonomous mobile robot](https://rm-static.djicdn.com/documents/20758/6b0ea21b6d57a1547553332830529202.png)

自适应蒙特卡洛定位(AMCL, Adaptive Monte Carlo Localization)是一套适用于二维机器人运动的定位算法。其主要的算法原理源于《Probabilistic Robotics》一书8.3节所述**MCL**、**Augmented_MCL**和**KLD_Sampling_MCL**算法的结合与实现;书中第5章描述的Motion Model和第6章描述的Sensor Model,则分别用于AMCL中运动的预测和粒子权重的更新。

在`roborts_localization`内部集成的AMCL,在功能上添加了随机角度、随机位置和角度的初始化支持,以便于竞技比赛中的快速部署。

算法基本流程如下图

![AMCL_flowchart](https://rm-static.djicdn.com/documents/20758/e75d3533fc03c1547553357220164263.png)

### 相关参数

* min_particles (int, 默认值: 50)

  粒子滤波器的最小粒子数。

* max_particles (int, 默认值: 2000)

  粒子滤波器的最大粒子数。

* use_global_localization (bool, 默认值: false)

  是否启用初始的全局随机定位。

* random_heading (bool, 默认值: false)

  是否以随机角度进行初始定位。

* update_min_d (double, 默认值: 0.1)

  滤波器更新的位移阈值。

* update_min_a (double, 默认值: 0.5)

  滤波器更新的旋转阈值。

* odom_model (ENUM, 默认值: ODOM_MODEL_OMNI)

  机器人运动模型。目前只支持全向轮里程计模型。

* odom_alpha1 (double, 默认值: 0.005)

  里程计模型的误差参数。

* odom_alpha2 (double, 默认值: 0.005)

  里程计模型的误差参数。

* odom_alpha3 (double, 默认值: 0.01)

  里程计模型的误差参数。

* odom_alpha4 (double, 默认值: 0.005)

  里程计模型的误差参数。

* odom_alpha5 (double, 默认值: 0.003)

  里程计模型的误差参数。

* laser_min_range (double, 默认值: 0.15)

  激光雷达的最小有效测距距离。

* laser_max_range (double, 默认值: 8.0)

  激光雷达的最大有效测距距离。

* laser_max_beams (int, 默认值: 30)

  激光雷达的波束数。

* laser_model (ENUM, 默认值: LASER_MODEL_LIKELIHOOD_FIELD_PROB)

  激光雷达测距仪模型。目前只支持似然域改进模型。

* z_hit (double, 默认值: 0.5)

  似然域模型中的$z_{hit}$参数。

* z_rand (double, 默认值: 0.5)

  似然域模型中的$z_{rand}$参数。

* sigma_hit (double, 默认值: 0.2)

  似然域模型中的$\sigma_{hit}$参数。

* laser_likelihood_max_dist (double, 默认值: 2.0)

  似然域模型中的测距点与障碍物间最大距离。

* do_beamskip (bool, 默认值: true)

  Position Tracking阶段忽略部分激光波束,以避免动态物体等带来的不可预料误差。

* beam_skip_distance (double, 默认值: 0.5)

  忽略波束的障碍物距离阈值。

* beam_skip_threshold (double, 默认值: 0.3)

  波束忽略的阈值。

* beam_skip_error_threshold (double, 默认值: 0.9)

  波束忽略的错误阈值。

* resample_interval (int, 默认值: 2)

  重采样周期。

* recovery_alpha_slow (double, 默认值: 0.001)

  **Augmented_MCL**中的$\alpha_{slow}$参数。

* recovery_alpha_fast (double, 默认值: 0.1)

  **Augmented_MCL**中的$\alpha_{fast}$参数。

* kld_err (double, 默认值: 0.05)

  **KLD_Sampling_MCL**中的$\epsilon$。

* kld_z (double, 默认值: 0.99)

  **KLD_Sampling_MCL**中的$(1-\delta)$。

* laser_filter_weight (double, 默认值: 0.4)

  权重阈值,用于筛选出权重较低的激光雷达测量值。

* max_uwb_particles (int, 默认值: 10)

  重采样阶段,以UWB位置为均值的最大重采样粒子数。

* uwb_cov_x (double, 默认值: 0.06)

  重采样阶段,以UWB位置为均值的高斯分布在x方向的方差。

* uwb_cov_y (double, 默认值: 0.06)

  重采样阶段,以UWB位置为均值的高斯分布在y方向的方差。

* resample_uwb_factor (double, 默认值: 4.0)

  重采样因子,用于判断对称定位。


--------------------------------------------------------------------------------
/sdk_docs/roborts_planning_global_planner.md:
--------------------------------------------------------------------------------
# 全局路径规划

## 模块介绍

全局路径规划(简称全局规划)是导航系统中运动规划的第一个步骤:在给定目标位置后,根据感知的全局代价地图搜索得到一条无碰撞的最短路径(即一系列离散的坐标点),再将其作为输入传递给局部轨迹规划模块,控制机器人的具体运动。

全局路径规划模块位于`roborts_planning`包中,依赖`roborts_msgs`包中的Action消息与`roborts_common`包中的抽象工厂模式和参数读取,模块文件目录如下所示

```bash
└── global_planner/
├── CMakeLists.txt
├── global_planner_algorithms.h  #包含实现具体算法类的头文件
├── global_planner_base.h        #全局规划算法的抽象类
├── global_planner_test.cpp      #全局规划的测试节点
├── global_planner_node.cpp      #全局规划核心功能执行的节点和Main函数
├── global_planner_node.h
├── a_star_planner               #A*全局规划算法实现
│   └── ...
19 | ├── config 20 | │   └── global_planner_config.prototxt # 全局规划参数配置文件 21 | └── proto 22 | ├── global_planner_config.pb.cc 23 | ├── global_planner_config.pb.h 24 | └── global_planner_config.proto # 全局规划参数定义文件 25 | ``` 26 | 27 | 全局路径规划的相关算法参考[A Star算法](sdk_docs/roborts_planning_global_planner?id=a) 28 | 29 | global_planner_node为核心规划节点,ROS节点图示为 30 | 31 | ![](https://rm-static.djicdn.com/documents/20758/63b3d8db24ce71547553505842780076.png) 32 | 33 | 节点的输入输出如下 34 | 35 | ### 输入 36 | 37 | * /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html)) 38 | 39 | (必须) 通过`TransformListener`监听到的base_link->map的变换,由`roborts_localization`包提供 40 | 41 | * global_costmap对象 (`roborts_costmap/CostmapInterface`) 42 | 43 | (必须) 可以表述静态层和障碍物层(可选)的全局栅格代价地图(global_costmap),依赖话题/tf, /map(静态层)和/scan(障碍物层可选),由`roborts_costmap`包提供 44 | 45 | * /global_planner_node_action/goal ([roborts_msgs/GlobalPlannerGoal]()) 46 | 47 | (必须) 由`Actionlib Client`向`Actionlib Server`输入全局规划的目标点,接口由Actionlib封装,具体消息类型为([geometry_msgs/PoseStamped](http://docs.ros.org/api/geometry_msgs/html/msg/PoseStamped.html)) 48 | 49 | * /global_planner_node_action/cancel ([actionlib_msgs/GoalID](http://docs.ros.org/api/actionlib_msgs/html/msg/GoalID.html)) 50 | 51 | 由`Actionlib Client`向`Actionlib Server`申请取消正在运行的全局规划任务,接口由Actionlib封装 52 | 53 | ### 输出 54 | 55 | * /global_planner_node_action/feedback ([roborts_msgs/GlobalPlannerFeedback]()) 56 | 57 | `Actionlib Server`实时反馈的规划路径,接口由Actionlib封装,具体消息类型为 [nav_msgs/Path](http://docs.ros.org/api/nav_msgs/html/msg/Path.html) 58 | 59 | * /global_planner_node_action/result ([roborts_msgs/GlobalPlannerResult]()) 60 | 61 | `Actionlib Server`反馈规划的结果,即是否判断到达目标点,接口由Actionlib封装 62 | 63 | * /global_planner_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/api/actionlib_msgs/html/msg/GoalStatusArray.html)) 64 | 65 | `Actionlib Server`实时反馈的规划状态,接口由Actionlib封装 66 | 67 | * /global_planner_node/path 
([nav_msgs/Path](http://docs.ros.org/api/nav_msgs/html/msg/Path.html)) 68 | 69 | 用于可视化显示的规划路径 70 | 71 | * /global_costmap/global_costmap/costmap ([nav_msgs/OccupancyGrid](http://docs.ros.org/api/nav_msgs/html/msg/OccupancyGrid.html)) 72 | 73 | 用于可视化显示的全局代价栅格地图 74 | 75 | ### 相关参数 76 | 77 | 参数的定义见`proto/global_planner_config.proto`,参数的配置见`config/global_planner_config.prototxt` 78 | 79 | * selected_algorithm(`string`, 默认值: "a_star_planner") 80 | 81 | 选择全局规划算法的名称 82 | 83 | * frequency(`int`, 默认值: 3) 84 | 85 | 全局规划算法的频率 86 | 87 | * max_retries(`int`, 默认值: 5) 88 | 89 | 允许全局规划失败重试的次数 90 | 91 | * goal_distance_tolerance(`double`, 默认值: 0.15) 92 | 93 | 全局规划到达目标的欧式距离误差 94 | 95 | * goal_angle_tolerance(`double`, 默认值: 0.1) 96 | 97 | 全局规划到达目标的角度(单位弧度制)误差 98 | 99 | ### 执行流程 100 | 101 | 核心规划节点`global_planner_node_action`的执行流程如下 102 | 103 | - 初始化: 104 | - 初始化Actionlib Server,构建可视化的publisher 105 | - 读取参数 106 | - 创建tf listener,global_costmap对象 107 | - 创建具体算法的planner对象 108 | 109 | - 初始化成功后,开启规划线程,ROS回调队列在主线程开始回调,同时线程Actionlib Server Callback也开始回调 110 | - 规划线程执行流程图 111 | ![](https://rm-static.djicdn.com/documents/20758/057c895762b7d1547553536324774678.png) 112 | - Actionlib Server Callback线程执行流程图 113 | ![](https://rm-static.djicdn.com/documents/20758/b93803f9be2aa1547553557409469215.png) 114 | 115 | 116 | ## 编译与运行 117 | 118 | ### 编译 119 | 120 | 在ROS的工作区内编译 121 | 122 | ```shell 123 | catkin_make -j4 global_planner_node global_planner_test 124 | ``` 125 | 126 | ### 运行 127 | 128 | 在仿真环境中测试 129 | 130 | ```shell 131 | roslaunch roborts_bringup roborts_stage.launch 132 | ``` 133 | 134 | 启动全局路径规划测试节点 135 | 136 | ```shell 137 | rosrun roborts_planning global_planner_test 138 | ``` 139 | 开启Rviz,显示了模块所需的输入输出,如下图所示 140 | 141 | ![](https://rm-static.djicdn.com/documents/20758/844f7b73e9a091547553578138533412.png) 142 | 143 | - Pose: 红色箭头为规划目标点 144 | - LaserScan: 黄色点集为激光雷达扫描数据 145 | - TF: 参见TF的坐标系 146 | - Map: 灰黑部分为静态地图 147 | - Map: 不同颜色(紫\红\青\蓝等)显示不同代价的全局代价地图 148 | - Path: 绿色路径为规划路径 149 | 150 | ## 
A*
### 算法介绍

A*(A-Star)算法是静态路网中求解最短路径的一种常用且高效的启发式搜索算法。公式表示为:

$$ f(n) = g(n) + h(n) $$

其中$f(n)$是节点n从初始点到目标点的估价函数,$g(n)$是在状态空间中从初始节点到节点n的实际代价,$h(n)$是从节点n到目标节点最佳路径的估计代价。在实际的全局路径规划中,静态路网由全局代价地图提供。

### 相关参数
* inaccessible_cost(`int`, 默认值: 253)

  不可通过(即视为致命障碍物)的地图栅格代价值

* heuristic_factor(`double`, 默认值: 1.0)

  启发因子,见启发函数`h(n) = heuristic_factor * manhattan_distance`

* goal_search_tolerance(`double`, 默认值: 0.45)

  寻找新目标点的容忍距离(单位米)。如果发送的目标点不可通过,则在该距离范围内寻找一个可通过的新目标点


--------------------------------------------------------------------------------
/sdk_docs/roborts_planning_local_planner.md:
--------------------------------------------------------------------------------
# 局部轨迹规划

## 模块介绍

局部轨迹规划模块根据机器人的里程计信息、激光雷达的感知信息,结合全局路径规划给出的最优路径,计算出机器人当前能够避免与障碍物碰撞的最优速度。

局部轨迹规划模块位于`roborts_planning`包中,依赖`roborts_msgs`包中的action与msg消息,以及`roborts_common`包中的抽象工厂模式和参数读取。

局部轨迹规划层级简略如下:

```bash
local_planner
├── CMakeLists.txt
├── config
│   └── local_planner.prototxt       #局部路径规划算法配置文件
├── include
│   └── local_planner
│       ├── proto
│       │   ├── local_planner.pb.cc
│       │   ├── local_planner.pb.h
│       │   └── local_planner.proto  # 局部路径规划参数定义文件
│       ├── data_base.h              # 局部路径规划数据结构,用于g2o构建边以及顶点
│       ├── data_converter.h         # 数据转换,将数据转换为data_base数据结构
│       ├── distance_calculation.h   # 计算二维点、线、几何图形之间的距离
│       ├── line_iterator.h          # 连续线段在离散栅格中的坐标计算
│       ├── local_planner_algorithms.h  # 局部路径规划算法头文件(所有算法头文件都应在此文件中引入)
│       ├── local_planner_base.h     # 局部路径规划算法父类
│       ├── local_planner_node.h     # 局部路径规划入口节点
│       ├── local_visualization.h    # 可视化局部路径规划结果
│       ├── obstacle.h               # 二维障碍物信息
│       ├── odom_info.h              # 里程计信息
│       ├── optimal_base.h           # 优化相关算法父类
│       ├── robot_footprint_model.h  # 机器人外形描述
│       ├── robot_position_cost.h    # 机器人所在位置代价计算,用于判断路径是否可行
│       └── utility_tool.h           # 通用函数
├── src
│   ├── local_planner_node.cpp       # ROS node文件,负责局部路径规划内的逻辑调度
│   ├── vel_converter.cpp            # ROS node文件,仿真时将局部路径规划速度转换为"cmd_vel"发布
│   ├── teb_test.cpp                 # timed elastic band算法测试文件
│   └── ...
└── timed_elastic_band
    ├── CMakeLists.txt
    ├── config
    │   └── timed_elastic_band.prototxt  # timed elastic band配置文件
    ├── include
    │   └── timed_elastic_band
    │       ├── proto
    │       │   ├── timed_elastic_band.pb.cc
    │       │   ├── timed_elastic_band.pb.h
    │       │   └── timed_elastic_band.proto
    │       ├── teb_local_planner.h  # timed elastic band算法实现,继承local_planner_base.h
    │       ├── teb_optimal.h        # g2o优化逻辑,继承optimal_base.h
    │       └── ...                  # 其他g2o的边以及顶点相关文件
    └── src
        ├── teb_local_planner.cpp    # timed elastic band算法实现
        ├── teb_optimal.cpp          # g2o优化逻辑
        └── ...
```

局部轨迹规划的相关算法参考 [Timed Elastic Band](/sdk_docs/roborts_planning_local_planner?id=timed-elastic-band)

local_planner_node为核心规划节点,ROS节点图示为:

![](https://rm-static.djicdn.com/documents/20758/708f059fc647c1547553602751260545.png)

节点的输入输出如下

### 输入

- /tf ([tf/tfMessage](http://docs.ros.org/api/tf/html/msg/tfMessage.html))

  (必须) 通过`TransformListener`监听到的map->odom的变换,在roborts框架内由roborts_localization包提供

- local_costmap对象 (`roborts_costmap/CostmapInterface`)

  (必须) 可以表述障碍物层的局部栅格代价地图(local_costmap),依赖话题/tf与/scan(障碍物层),在roborts框架内由roborts_costmap包提供

- /local_planner_node_action/goal (roborts_msgs/LocalPlannerGoal)

  (必须) 由`Actionlib Client`向`Actionlib Server`输入由全局路径规划得到的path,接口由Actionlib封装,具体消息类型为([nav_msgs/Path](http://docs.ros.org/melodic/api/nav_msgs/html/msg/Path.html))

- /local_planner_node_action/cancel ([actionlib_msgs/GoalID](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalID.html))

  由`Actionlib Client`向`Actionlib Server`申请取消正在运行的局部规划任务,接口由Actionlib封装

### 输出

- /local_planner_node_action/feedback (roborts_msgs/LocalPlannerFeedback)

  `Actionlib Server`实时反馈的局部路径规划错误码与错误信息,接口由Actionlib封装

- /local_planner_node_action/result (roborts_msgs/LocalPlannerResult)

  暂未使用

- /local_planner_node_action/status ([actionlib_msgs/GoalStatusArray](http://docs.ros.org/melodic/api/actionlib_msgs/html/msg/GoalStatusArray.html))

  `Actionlib Server`实时反馈的规划状态,接口由Actionlib封装

- /local_planner_node/trajectory ([nav_msgs/Path](http://docs.ros.org/melodic/api/nav_msgs/html/msg/Path.html))

  用于可视化显示的局部路径

- /local_costmap/local_costmap/costmap ([nav_msgs/OccupancyGrid](http://docs.ros.org/melodic/api/nav_msgs/html/msg/OccupancyGrid.html))

  用于可视化显示的局部代价栅格地图


## 编译与运行

### 编译

在编译之前,确保已经安装所有依赖。在ROS的工作区内运行以下指令

```shell
catkin_make local_planner_node vel_converter teb_test
```

### 运行

```shell
# 启动局部路径规划节点
rosrun roborts_planning local_planner_node
# 启动vel_converter
rosrun roborts_planning vel_converter
# 启动timed elastic band测试节点
rosrun roborts_planning teb_test
```

或从launch文件启动,例如启动局部规划节点与速度转换节点(示例写法,具体以仓库中的launch文件为准):

```xml
<launch>
  <node pkg="roborts_planning" type="local_planner_node" name="local_planner_node" />
  <node pkg="roborts_planning" type="vel_converter" name="vel_converter" />
</launch>
```

## Timed Elastic Band
### 算法介绍

算法建立了轨迹执行时间的目标方程,在满足与障碍物保持安全距离、运动时遵守动力学约束的前提下,优化机器人实际运行轨迹,最终求出机器人可执行的速度。具体理论可参考论文:

* C. Rösmann, F. Hoffmann and T. Bertram: Integrated online trajectory planning and optimization in distinctive topologies, Robotics and Autonomous Systems, Vol. 88, 2017, pp. 142–153.
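上述论文中TEB的核心思想,可以用如下简化的Python示意代码理解:以位姿序列与相邻位姿间的时间间隔为优化变量,目标函数由轨迹执行时间与各类约束的惩罚项组成(示意代码,权重与约束项均为简化假设,实际实现见teb_optimal.cpp中基于g2o的构建):

```python
import math

def teb_objective(poses, dts, obstacles, v_max=2.0,
                  w_time=1.0, w_vel=10.0, w_obs=50.0, min_dist=0.3):
    """简化的TEB目标函数: 总执行时间 + 超速惩罚 + 障碍物安全距离惩罚"""
    cost = w_time * sum(dts)                          # 最小化轨迹执行时间
    for (x0, y0), (x1, y1), dt in zip(poses, poses[1:], dts):
        v = math.hypot(x1 - x0, y1 - y0) / dt         # 相邻位姿间的平均速度
        cost += w_vel * max(0.0, v - v_max) ** 2      # 动力学(速度)约束惩罚
    for x, y in poses:
        for ox, oy in obstacles:
            d = math.hypot(x - ox, y - oy)
            cost += w_obs * max(0.0, min_dist - d) ** 2  # 安全距离惩罚
    return cost

# 示例: 一条直线轨迹, 途经一个障碍物时代价明显升高
free_cost = teb_objective([(0, 0), (1, 0), (2, 0)], [0.5, 0.5], [])
blocked_cost = teb_objective([(0, 0), (1, 0), (2, 0)], [0.5, 0.5], [(1, 0)])
```

实际实现中,每一个惩罚项对应g2o超图中的一条边,对位姿顶点与时间差顶点进行联合优化,从而在"走得快"与"走得安全、可执行"之间取得平衡。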
### 相关参数

#### 轨迹相关

* teb_autosize

  是否自动重新调整轨迹(重新计算路径点)

* dt_ref

  初始时间常数

* dt_hysteresis

  时间补偿常数

* global_plan_overwrite_orientation

  是否重写路径规划点的朝向

* allow_init_with_backwards_motion

  是否允许后退运动

* global_plan_viapoint_sep

  从全局路径中选取必经点的间隔距离

* via_points_ordered

  是否按顺序通过必经路径点

* max_global_plan_lookahead_dist

  局部路径规划截取全局路径规划的距离

* exact_arc_length

  两个位姿之间是否以弧线连接

* force_reinit_new_goal_dist

  目标点变化超过该距离时重新初始化局部路径

* feasibility_check_no_poses

  计算路径可行性时检查的位姿点个数

* publish_feedback

  是否发布反馈信息(当前未使用)

* min_samples

  路径最少点的个数

* max_samples

  路径最多点的个数


#### 运动学参数


* max_vel_x

  最大前进速度

* max_vel_x_backwards

  最大后退速度

* max_vel_y

  最大平移速度

* max_vel_theta

  最大角速度

* acc_lim_x

  最大纵向加速度

* acc_lim_y

  最大平移加速度

* acc_lim_theta

  最大角加速度

* min_turning_radius

  最小转弯半径

* wheelbase

  轴距(当前未使用)

* cmd_angle_instead_rotvel

  是否以转角代替角速度输出(当前未使用)

#### 容忍度参数

* xy_goal_tolerance

  允许的到达距离误差

* yaw_goal_tolerance

  允许的到达角度误差

* free_goal_vel

  是否放开终点速度约束(不强制终点零速)


#### 障碍物参数

* min_obstacle_dist

  与障碍物的最小距离

* inflation_dist

  障碍物膨胀距离

* include_costmap_obstacles

  是否考虑代价地图中的障碍物(当前未使用)

* costmap_obstacles_behind_robot_dist

  考虑车后障碍物的距离阈值

* obstacle_poses_affected

  障碍物影响路径点的数目,仅在legacy_obstacle_association为true时有效

* legacy_obstacle_association

  选择以离障碍物最近的路径点优化,还是以离路径点最近的障碍物优化

* obstacle_association_cutoff_factor

  超出该因子对应距离的障碍物不再考虑,legacy_obstacle_association为false时有效

* obstacle_association_force_inclusion_factor

  强制考虑障碍物的距离因子,工作条件同上

#### 优化参数

* no_inner_iterations

  每次调整轨迹后,求解器的迭代次数

* no_outer_iterations

  自动调整轨迹的(外层)迭代次数

* optimization_activate

  是否开启优化

* optimization_verbose

  是否输出调试信息

* penalty_epsilon

  约束惩罚函数的补偿项

* weight_max_vel_x

  最大x速度权重

* weight_max_vel_y

  最大y速度权重

* weight_max_vel_theta

  最大角速度权重

* weight_acc_lim_x

  最大x加速度权重

* weight_acc_lim_y

  最大y加速度权重

* weight_acc_lim_theta

  最大角加速度权重

* weight_kinematics_nh

  非完整运动学约束权重

* weight_kinematics_forward_drive

  正向运动权重

* weight_kinematics_turning_radius

  最小化转弯半径权重(car-like模型)

* weight_optimaltime

  优化轨迹时间的权重

* weight_obstacle

  满足与障碍物最小距离的权重

* weight_inflation

  障碍物膨胀区的代价权重

* weight_dynamic_obstacle

  动态障碍物权重

* weight_viapoint

  最小化与必经点距离的权重

* weight_adapt_factor

  权重自增因子,目前仅用于obstacle相关权重

* weight_prefer_rotdir

  偏好转向方向的权重


--------------------------------------------------------------------------------
/software_framework.md:
--------------------------------------------------------------------------------
# 软件框架
整个软件系统解耦了驱动模块、功能模块和算法模块,以增强平台的可移植性,提高不同领域开发者合作开发的效率。

整个软件系统架构如下图所示

![system_diagram](https://rm-static.djicdn.com/documents/20758/40aac2fa9e6b81547552996504129513.png)

开发者基于该平台可以自底向上进行编程开发,控制机器人完成不同的任务。整个框架分为两个部分:

作为底层计算平台,STM32微控制器以有限的计算能力运行RTOS(实时操作系统),完成各种实时任务:

- 以编码器或IMU为反馈的闭环控制任务
13 | 14 | - 传感器信息的采集、预处理与转发,用于姿态估计 15 | 16 | - 转发与比赛相关的信息,例如机器人血量等 17 | 18 | 机载运算设备(例如Manifold 2)扮演着上层计算平台的角色,负责较大运算量的各种算法执行: 19 | 20 | - 环境感知,包括实时定位、建图、目标检测、识别和追踪 21 | 22 | - 运动规划,包括全局规划(路径规划),局部规划(轨迹规划) 23 | 24 | - 基于观察-动作的决策模型接口,适配各种不同的决策框架,例如FSM(有限状态机),Behavior Tree(行为树)以及其他基于学习的决策框架。 25 | 26 | 两个运算平台之间通过串口以规定的开源协议进行通信,平台同时提供了基于ROS的API接口,可以在ROS中利用C++或Python编程,进行二次开发。基于这个接口,开发者可以控制机器人底盘和云台的运动,控制发射机构以输入的频率、射速与数量发射弹丸,同时实时接收传感器信息与比赛数据等功能。更高级的应用中,开发者可以根据[roborts_base](sdk_docs/roborts_base)相关文档定义自己的协议,以扩展机器人的功能。 27 | 28 | # 应用 29 | 30 | RoboRTS及RoboMaster移动机器人平台的初衷,是为了鼓励更多的开发者参与到不同场景与规则下的机器人智能决策研究中来,第一步是基于这个硬件平台实现一些机器人感知或规划算法,以便抽象观察空间与动作空间,更好的构建决策模型。 31 | 32 | 我们的首要场景是机器人竞技的决策,如下图所示: 33 | 34 | 追击与搜索: 35 | 36 | ![airobot1](https://rm-static.djicdn.com/documents/20758/2c66c923154ae1547553014851539095.gif) 37 | 38 | 巡逻: 39 | 40 | ![airobot2](https://rm-static.djicdn.com/documents/20758/299a5a10c187a1547553038856640888.gif) 41 | 42 | 43 | 44 | 45 | --------------------------------------------------------------------------------