├── {{cookiecutter.project_name}}
│   ├── .github
│   │   ├── lint
│   │   │   ├── __init__.py
│   │   │   ├── errors.py
│   │   │   ├── lint_shellcheck.py
│   │   │   ├── lint_ruff.py
│   │   │   ├── run_command.py
│   │   │   ├── lint_clang_format.py
│   │   │   ├── lint_darglint.py
│   │   │   ├── lint_mypy.py
│   │   │   ├── main.py
│   │   │   └── paths.py
│   │   └── workflows
│   │       ├── lint.yaml
│   │       └── test.yaml
│   ├── pkgs
│   │   ├── {{cookiecutter.example_package_name}}
│   │   │   ├── resource
│   │   │   │   └── {{cookiecutter.example_package_name}}
│   │   │   ├── {{cookiecutter.example_package_name}}_test
│   │   │   │   ├── unit
│   │   │   │   │   ├── __init__.py
│   │   │   │   │   └── nodes
│   │   │   │   │       └── test_{{cookiecutter.example_package_name}}_node.py
│   │   │   │   └── integration
│   │   │   │       ├── __init__.py
│   │   │   │       └── nodes
│   │   │   │           └── test_{{cookiecutter.example_package_name}}_node.py
│   │   │   ├── README.md
│   │   │   ├── {{cookiecutter.example_package_name}}
│   │   │   │   ├── nodes
│   │   │   │   │   ├── __init__.py
│   │   │   │   │   └── {{cookiecutter.example_package_name}}_node.py
│   │   │   │   ├── __init__.py
│   │   │   │   └── example_urdf.py
│   │   │   ├── poetry.lock
│   │   │   ├── package.xml
│   │   │   └── pyproject.toml
│   │   └── {{cookiecutter.__example_messages_package_name}}
│   │       ├── msg
│   │       │   └── ExampleMessage.msg
│   │       ├── srv
│   │       │   └── ExampleService.srv
│   │       ├── action
│   │       │   └── ExampleAction.action
│   │       ├── package.xml
│   │       └── CMakeLists.txt
│   ├── .dockerignore
│   ├── docker
│   │   ├── grafana
│   │   │   ├── provisioning
│   │   │   │   ├── datasources
│   │   │   │   │   └── default.yaml
│   │   │   │   └── dashboards
│   │   │   │       └── all.yaml
│   │   │   ├── grafana.ini
│   │   │   └── dashboard_data
│   │   │       └── logs_dashboard.json
│   │   ├── utils
│   │   │   ├── environment
│   │   │   │   ├── install-packages
│   │   │   │   ├── with-package-list
│   │   │   │   ├── save-build-cache
│   │   │   │   ├── restore-build-cache
│   │   │   │   ├── add-workspace
│   │   │   │   ├── make-pth-file-from-workspace
│   │   │   │   ├── enter-workspaces
│   │   │   │   ├── add-apt-repo
│   │   │   │   └── install-ros-package-from-git
│   │   │   └── README.md
│   │   ├── exec
│   │   ├── run
│   │   ├── reload-ros-nodes
│   │   ├── test
│   │   ├── launch
│   │   ├── _shared.sh
│   │   ├── promtail
│   │   │   └── config.yaml
│   │   └── Dockerfile
│   ├── .gitignore
│   ├── .env
│   ├── launch-profiles
│   │   └── {{cookiecutter.example_launch_profile}}
│   │       ├── parameters
│   │       │   └── parameters.yaml
│   │       ├── launcher.py
│   │       └── rviz-config.rviz
│   ├── .gitattributes
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── pyproject.toml
│   └── docs
│       └── about_template.md
├── imgs
│   └── img.png
├── LICENSE
├── cookiecutter.json
└── README.md
/{{cookiecutter.project_name}}/.github/lint/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/imgs/img.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UrbanMachine/create-ros-app/HEAD/imgs/img.png
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/resource/{{cookiecutter.example_package_name}}:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}_test/unit/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.dockerignore:
--------------------------------------------------------------------------------
1 | .git
2 | .github
3 | .docker_volumes
4 | .docker_cache
5 | .mypy_cache
6 | pkgs/*/*_test
7 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}_test/integration/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/README.md:
--------------------------------------------------------------------------------
1 | # {{cookiecutter.example_package_name}}
2 |
3 | {{cookiecutter.example_package_description}}
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}_test/unit/nodes/test_{{cookiecutter.example_package_name}}_node.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}_test/integration/nodes/test_{{cookiecutter.example_package_name}}_node.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}/nodes/__init__.py:
--------------------------------------------------------------------------------
1 | from .{{cookiecutter.example_package_name}}_node import {{cookiecutter.example_node_name}}
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.__example_messages_package_name}}/msg/ExampleMessage.msg:
--------------------------------------------------------------------------------
1 | # This is a placeholder message to show how to create a message file with the template
2 |
3 | string example_data
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}/__init__.py:
--------------------------------------------------------------------------------
1 | """**{{cookiecutter.example_package_name}}**
2 |
3 | {{cookiecutter.example_package_description}}
4 | """
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/errors.py:
--------------------------------------------------------------------------------
1 | class CommandError(Exception):
2 |     """An error with a message to provide to the user"""
3 | 
4 |     def __init__(self, message: str):
5 |         super().__init__(message)
6 |         self.message = message
7 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/grafana/provisioning/datasources/default.yaml:
--------------------------------------------------------------------------------
1 | apiVersion: 1
2 |
3 | datasources:
4 |   - name: Loki
5 |     type: loki
6 |     access: proxy
7 |     orgId: 1
8 |     url: http://loki:3100
9 |     basicAuth: false
10 |     version: 1
11 |     editable: false
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/grafana/provisioning/dashboards/all.yaml:
--------------------------------------------------------------------------------
1 | apiVersion: 1
2 |
3 | providers:
4 |   - name: main
5 |     orgId: 1
6 |     folder: ''
7 |     type: file
8 |     disableDeletion: true
9 |     editable: true
10 |     options:
11 |       path: /etc/grafana/dashboard_data
12 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.__example_messages_package_name}}/srv/ExampleService.srv:
--------------------------------------------------------------------------------
1 | # This is a placeholder service for demonstrating a service definition that uses the RobustRPC node_helpers library
2 |
3 |
4 |
5 | ---
6 | string example_data
7 |
8 | # RobustRPC required fields
9 | string error_name
10 | string error_description
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/poetry.lock:
--------------------------------------------------------------------------------
1 | # This file is automatically @generated by Poetry 1.8.4 and should not be changed by hand.
2 | package = []
3 |
4 | [metadata]
5 | lock-version = "2.0"
6 | python-versions = ">=3.12.0,<4.0"
7 | content-hash = "55077cf34bc451233d3044bf620c6a190f4462bc9d3fca046c9e4a6636be1781"
8 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.__example_messages_package_name}}/action/ExampleAction.action:
--------------------------------------------------------------------------------
1 | # This is a placeholder action for demonstrating an action definition that uses the RobustRPC node_helpers library
2 |
3 | ---
4 |
5 | string example_data
6 |
7 | # RobustRPC required fields
8 | string error_name
9 | string error_description
10 |
11 | ---
12 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/install-packages:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Installs APT packages with the recommended image-building boilerplate.
4 | #
5 | # Usage: install-packages {packages}
6 |
7 | set -o errexit
8 | set -o pipefail
9 | set -o nounset
10 |
11 | with-package-list apt-get install --yes --no-install-recommends "$@"
12 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/lint_shellcheck.py:
--------------------------------------------------------------------------------
1 | from .paths import bash_files
2 | from .run_command import run_command
3 |
4 |
5 | def lint_shellcheck(fix: bool) -> None:
6 |     if fix:
7 |         return  # Shellcheck does not have a fix mode
8 | 
9 |     for path in bash_files():
10 |         run_command(["shellcheck", "--external-sources", str(path)])
11 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/grafana/grafana.ini:
--------------------------------------------------------------------------------
1 | [security]
2 | # Allows Grafana to be embedded into other websites
3 | allow_embedding = true
4 |
5 | [auth.anonymous]
6 | # Allows anonymous read-only access
7 | enabled = true
8 | org_name = Main Org.
9 | org_role = Admin
10 |
11 | [users]
12 | default_theme = dark
13 | home_page = /d/logs_dashboard/logs
14 |
15 | [dashboards]
16 | min_refresh_interval = 1s
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.gitignore:
--------------------------------------------------------------------------------
1 | # PyCharm
2 | .idea/
3 |
4 | # VSCode
5 | .vscode/
6 |
7 | # Colcon
8 | build/
9 | install/
10 | log/
11 |
12 | # Python
13 | *.pyc
14 | __pycache__/
15 |
16 | # Override Files
17 | *.override.*
18 |
19 | # Docker Compose environment configuration
20 | .env
21 |
22 | # Persistent data
23 | .docker_volumes
24 |
25 | # URDF
26 | *.part
27 |
28 | # Dev specific overrides
29 | *.override.yaml
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/with-package-list:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Updates the package list, runs the provided command, and clears the package
4 | # list. This ensures that the list is always up-to-date, and the resulting
5 | # Docker image is small.
6 | #
7 | # Usage: with-package-list {command}
8 |
9 | set -o errexit
10 | set -o pipefail
11 | set -o nounset
12 |
13 | apt-get update
14 | "$@"
15 | rm -rf /var/lib/apt/lists/*
16 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/lint_ruff.py:
--------------------------------------------------------------------------------
1 | from pathlib import Path
2 |
3 | from .run_command import run_command
4 |
5 |
6 | def lint_ruff_check(fix: bool) -> None:
7 |     mode = "--fix" if fix else "--no-fix"
8 |     cmd: list[str | Path] = ["ruff", "check", mode, Path()]
9 |     run_command(cmd)
10 | 
11 | 
12 | def lint_ruff_format(fix: bool) -> None:
13 |     optional = [] if fix else ["--check"]
14 |     cmd: list[str | Path] = ["ruff", "format", *optional, Path()]
15 |     run_command(cmd)
16 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/run_command.py:
--------------------------------------------------------------------------------
1 | import logging
2 | import subprocess
3 | from collections.abc import Sequence
4 | from pathlib import Path
5 |
6 |
7 | def run_command(command: Sequence[str | Path], cwd: Path | None = None) -> None:
8 |     """Prints the given command and then runs it with ``subprocess.run``"""
9 |     # Convert any paths to str
10 |     command_str = [str(c) for c in command]
11 | 
12 |     logging.warning("Running " + " ".join(command_str))
13 |     subprocess.run(command_str, check=True, cwd=cwd)
14 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/lint_clang_format.py:
--------------------------------------------------------------------------------
1 | from shutil import which
2 |
3 | from .errors import CommandError
4 | from .paths import cpp_files
5 | from .run_command import run_command
6 |
7 |
8 | def lint_clang_format(fix: bool) -> None:
9 |     if which("clang-format") is None:
10 |         raise CommandError("clang-format is not installed")
11 | 
12 |     additional_args = ["-i"] if fix else ["--dry-run", "-Werror"]
13 | 
14 |     for source in cpp_files():
15 |         run_command(["clang-format", "-style=file", *additional_args, str(source)])
16 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/exec:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Enters into the ros2 container that is spun up when a deployment is run. This
4 | # is a convenient way to quickly enter the container and run a program.
5 | #
6 | # Usage:
7 | # docker/exec {executable}
8 | #
9 | # Examples:
10 | # docker/exec bash
11 | # docker/exec ros2 topic list
12 |
13 | set -o errexit
14 | set -o pipefail
15 | set -o nounset
16 |
17 | source docker/_shared.sh
18 |
19 | function main {
20 |   enable_display_passthrough
21 |   docker compose exec --interactive ros-nodes "${@:-bash}"
22 | }
23 |
24 | main "${@}"
25 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/README.md:
--------------------------------------------------------------------------------
1 | # docker/utils
2 |
3 | These scripts are intended to assist in creating and using the robot Docker
4 | container. They are installed in `/usr/local/bin` for easy availability.
5 |
6 | Scripts in `environment/` assist in installing and setting up dependencies
7 | in the image. These are introduced into the image early on and can be used
8 | anytime during the build process or afterward.
9 |
10 | Scripts in `runtime/` are for starting tasks in the image after it has been
11 | built. These are introduced into the image once everything else has been set
12 | up.
13 |
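As an illustration of the pattern the `environment/` helpers follow, here is a hedged sketch of how a wrapper like `with-package-list` keeps an image layer small: populate an index, run the wrapped command, then delete the index in the same layer so it never persists in the image. The `list_dir` directory below is a hypothetical stand-in for `/var/lib/apt/lists`; this is not the real script.

```shell
#!/usr/bin/env bash
# Sketch of the `with-package-list` pattern (illustrative only; the real
# script wraps apt-get and cleans /var/lib/apt/lists).
set -o errexit
set -o pipefail
set -o nounset

# Stand-in for the APT package index directory
list_dir="$(mktemp -d)"

# Like `apt-get update`: populate the index
echo "index" > "${list_dir}/index"

# Run the wrapped command, if one was given
if [[ "$#" -gt 0 ]]; then
  "$@"
fi

# Clear the index in the same layer so the image stays small
rm -rf "${list_dir:?}"/*

echo "entries left: $(find "${list_dir}" -mindepth 1 | wc -l)"
```

Because the update, install, and cleanup happen in one `RUN` step, no intermediate layer ever contains the package index.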
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/lint_darglint.py:
--------------------------------------------------------------------------------
1 | from pathlib import Path
2 |
3 | from .run_command import run_command
4 |
5 |
6 | def lint_darglint(fix: bool) -> None:
7 |     # Configure darglint in here because it doesn't support pyproject.toml, and we're
8 |     # hoping to get rid of it in favor of ruff once it supports it
9 |     cmd: list[str | Path] = [
10 |         "darglint",
11 |         "--strictness",
12 |         "short",
13 |         "--docstring-style",
14 |         "sphinx",
15 |         "--ignore-regex",
16 |         "^test.*",
17 |         Path(),
18 |     ]
19 |     if not fix:
20 |         run_command(cmd)
21 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/save-build-cache:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Copies the build cache from a build operation into `/cache`, where it will be
4 | # saved as a cache volume by Docker for the next build.
5 |
6 | set -o errexit
7 | set -o pipefail
8 | set -o nounset
9 |
10 | # Makes `*` include hidden files
11 | shopt -s dotglob
12 |
13 | # This directory is mounted by the Dockerfile and is unique to each project.
14 | BUILD_CACHE=/colcon-build-cache/{{cookiecutter.__project_name_slug}}/
15 |
16 | if [[ ! -d "${BUILD_CACHE}" ]]; then
17 |   echo "Build cache directory not mounted" >&2
18 |   exit 1
19 | fi
20 |
21 | rm -rf ${BUILD_CACHE}*
22 | cp -r /robot/build/. ${BUILD_CACHE}.
23 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/restore-build-cache:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Copies the build cache from a special dir into a place where the build system can
4 | # use it. Copying like this ensures that the build cache is present in the
5 | # final image, which is necessary for packages containing messages.
6 |
7 | set -o errexit
8 | set -o pipefail
9 | set -o nounset
10 |
11 | # This directory is mounted by the Dockerfile and is unique to each project.
12 | BUILD_CACHE=/colcon-build-cache/{{cookiecutter.__project_name_slug}}/
13 |
14 | if [[ ! -d "${BUILD_CACHE}" ]]; then
15 |   echo "Build cache directory not mounted" >&2
16 |   exit 1
17 | fi
18 |
19 | rm -rf /robot/build
20 | cp -r ${BUILD_CACHE} /robot/build
21 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.env:
--------------------------------------------------------------------------------
1 | # This file is used to store environment variables for the project.
2 | # Docker compose will automatically load these variables and apply them to the
3 | # docker-compose.yaml. They are also used by scripts in the `docker/` directory.
4 |
5 |
6 | # Where to mount directories for storing persistent data. Used by 'grafana' and 'loki'
7 | # by default.
8 | VOLUMES_DIRECTORY="./.docker_volumes"
9 |
10 | # The name of the 'ros-nodes' image after it's built
11 | BUILT_IMAGE_NAME="{{cookiecutter.dockerhub_username_or_org}}/{{cookiecutter.__project_name_slug}}:latest"
12 |
13 | # The name of the base image to use for the 'ros-nodes' image.
14 | BASE_IMAGE="ubuntu:24.04"
15 |
16 | # Used for logging
17 | PROJECT_NAME="{{cookiecutter.__project_name_slug}}"
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/add-workspace:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Adds a ROS workspace to a stack of workspaces that will be entered in order
4 | # by the enter-workspaces command. When introducing a new ROS package to a
5 | # container, you likely will need to run this command in order to activate that
6 | # package's workspace.
7 | #
8 | # Usage:
9 | # add-workspace {workspace activation script}
10 | #
11 | # Examples:
12 | # add-workspace /opt/ros/foxy/setup.bash
13 | # add-workspace install/setup.bash
14 |
15 | set -o errexit
16 | set -o pipefail
17 | set -o nounset
18 |
19 | config_path=/etc/{{cookiecutter.__project_name_slug}}
20 | mkdir -p "${config_path}"
21 | workspaces_file="${config_path}/workspaces.txt"
22 |
23 | echo "${1}" >> "${workspaces_file}"
24 |
25 | echo "Added ${1} to workspaces!" >&2
26 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/make-pth-file-from-workspace:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Creates a Python .pth file pointing to the dependency installation location
4 | # of all ROS packages installed to the provided directory. This can be used to
5 | # add these directories to the sys.path without having to rely on PYTHONPATH.
6 | #
7 | # Usage:
8 | # make-pth-file-from-workspace {workspace install dir} {.pth file destination}
9 | #
10 | # Example:
11 | # make-pth-file-from-workspace /robot/install /usr/local/lib/pythonX.XX/dist-packages/robot.pth
12 |
13 | set -o errexit
14 | set -o pipefail
15 | set -o nounset
16 |
17 | install_dir="${1}"
18 | pth_file="${2}"
19 |
20 | package_install_paths=("${install_dir}"/*/lib/python3.*/site-packages)
21 |
22 | for python_path in "${package_install_paths[@]}"
23 | do
24 |   echo "${python_path}" >> "${pth_file}"
25 | done
26 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/launch-profiles/{{cookiecutter.example_launch_profile}}/parameters/parameters.yaml:
--------------------------------------------------------------------------------
1 | meta_parameters:
2 |   urdf_modules_to_load:
3 |     # URDFs are attached to base_link in the 'urdf_arrangement' section below
4 |     - namespace: "example_node_namespace"
5 |       urdf_constant_name: "forklift"
6 | 
7 | 
8 | example_node_namespace:
9 |   {{cookiecutter.example_node_name}}:
10 |     ros__parameters:
11 |       root_config:
12 |         forklift_speed: 0.25
13 |         forklift_max_extent: 0.5
14 | 
15 | urdf_arrangement:
16 |   interactive_transform_publisher:
17 |     ros__parameters:
18 |       static_transforms_file: /robot/persistent/interactive_transforms.json
19 |       scale_factor: 1.0
20 |       tf_publish_frequency: 1.0
21 |       transforms:
22 |         # Mounts the forklift urdf to the base_link so it can be visualized in rviz
23 |         - "base_link:example_node_namespace.forklift_body"
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/run:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Runs a docker container and enters the container interactively with as similar as
4 | # possible a configuration as when running ROS2 nodes with Docker Swarm.
5 | # This command must be run in the root of the repository.
6 | # If no executable is passed, bash will be used as default.
7 | #
8 | # Usage:
9 | # docker/run {executable}
10 | #
11 | # Examples:
12 | # Run commands directly in the /robot/ directory
13 | # docker/run ls
14 | #
15 | # Or, enter into a bash shell for a more interactive experience
16 | # docker/run
17 |
18 | set -o errexit
19 | set -o pipefail
20 | set -o nounset
21 |
22 | source docker/_shared.sh
23 |
24 | function main {
25 |   enable_display_passthrough
26 |   build_images
27 | 
28 |   docker compose \
29 |     run \
30 |     --interactive \
31 |     --rm \
32 |     ros-nodes \
33 |     "${@:-bash}"
34 | }
35 |
36 | main "${@}"
37 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/package.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0"?>
2 | <?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
3 | <package format="3">
4 |   <name>{{cookiecutter.example_package_name}}</name>
5 |   <version>{{cookiecutter.example_package_version}}</version>
6 |   <description>{{cookiecutter.example_package_description}}</description>
7 |   <maintainer email="{{cookiecutter.email}}">{{cookiecutter.github_username_or_org}}</maintainer>
8 | 
9 |   <license>{{cookiecutter.license}}</license>
10 | 
11 |   <test_depend>python3-pytest</test_depend>
12 |   <test_depend>python3-pytest-cov</test_depend>
13 | 
14 | 
15 | 
16 |   <exec_depend>robot_state_publisher</exec_depend>
17 |   <exec_depend>rviz2</exec_depend>
18 |   <exec_depend>joint_state_publisher</exec_depend>
19 | 
20 |   <export>
21 |     <build_type>ament_python</build_type>
22 |   </export>
23 | </package>
24 | 
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.poetry]
2 | name = "{{cookiecutter.example_package_name}}"
3 | version = "{{cookiecutter.example_package_version}}"
4 | description = "{{cookiecutter.example_package_description}}"
5 | authors = ["{{cookiecutter.github_username_or_org}} <{{cookiecutter.email}}>"]
6 | license = "{{cookiecutter.license}}"
7 |
8 | [tool.poetry.dependencies]
9 | python = ">=3.12.0,<4.0"
10 |
11 | [tool.poetry.scripts]
12 | # Each entry here will create an executable which can be referenced in launchfiles
13 | run_{{cookiecutter.example_node_name}} = "{{cookiecutter.example_package_name}}.nodes.{{cookiecutter.example_package_name}}_node:main"
14 |
15 | [tool.colcon-poetry-ros.data-files]
16 | "share/ament_index/resource_index/packages" = ["resource/{{cookiecutter.example_package_name}}"]
17 | "share/{{cookiecutter.example_package_name}}" = ["package.xml"]
18 |
19 | [build-system]
20 | requires = ["poetry-core>=1.0.0"]
21 | build-backend = "poetry.core.masonry.api"
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.__example_messages_package_name}}/package.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0"?>
2 | <?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
3 | <package format="3">
4 |   <name>{{cookiecutter.__example_messages_package_name}}</name>
5 |   <version>{{cookiecutter.example_package_version}}</version>
6 |   <description>Defines messages used by the '{{cookiecutter.__project_name_slug}}' project.</description>
7 |   <maintainer email="{{cookiecutter.email}}">{{cookiecutter.github_username_or_org}}</maintainer>
8 | 
9 |   <license>{{cookiecutter.license}}</license>
10 | 
11 |   <buildtool_depend>ament_cmake</buildtool_depend>
12 | 
13 |   <buildtool_depend>rosidl_default_generators</buildtool_depend>
14 | 
15 |   <exec_depend>rosidl_default_runtime</exec_depend>
16 | 
17 |   <member_of_group>rosidl_interface_packages</member_of_group>
18 | 
19 |   <export>
20 |     <build_type>ament_cmake</build_type>
21 |   </export>
22 | </package>
23 | 
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}/example_urdf.py:
--------------------------------------------------------------------------------
1 | """This module demonstrates how to define a `node_helpers` URDFConstants object for a
2 | basic URDF, specify necessary joint and frame names, and register the URDF with the
3 | `node_helpers` package.
4 |
5 | By registering it, the URDF can be accessed _by name_ in configuration files.
6 | """
7 |
8 | from typing import NamedTuple
9 |
10 | from node_helpers.urdfs import URDFConstants
11 |
12 |
13 | class ForkliftJoints(NamedTuple):
14 |     FORKS: str = "forks"
15 |     FORKS_PARENT_DATUM: str = "forks_parent_datum"
16 | 
17 | 
18 | class ForkliftFrames(NamedTuple):
19 |     BASE_LINK: str = "forklift_body"
20 | 
21 |     # Joint tracking
22 |     FORKS_ORIGIN: str = "forks_origin"
23 |     FORKS: str = "forks"
24 | 
25 | 
26 | ForkliftURDF = URDFConstants[ForkliftJoints, ForkliftFrames](
27 |     from_package="node_helpers",
28 |     registration_name="forklift",
29 |     urdf_paths=[(None, "sample_urdfs/forklift/robot.urdf")],
30 |     joints=ForkliftJoints(),
31 |     frames=ForkliftFrames(),
32 | )
32 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/enter-workspaces:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Runs the provided command after entering the workspaces provided by calls to
4 | # add-workspace. This is intended to be used in Docker ENTRYPOINT definitions.
5 | #
6 | # Usage:
7 | # enter-workspaces {command(s)}
8 | #
9 | # Examples:
10 | # enter-workspaces ros2 run my_package my_node
11 |
12 | set -o errexit
13 | set -o pipefail
14 | # The nounset check is explicitly not enabled because it breaks workspace
15 | # scripts
16 |
17 | workspaces_file="/etc/{{cookiecutter.__project_name_slug}}/workspaces.txt"
18 | if [[ ! -f "${workspaces_file}" ]]
19 | then
20 |   echo "File ${workspaces_file} does not exist! Run add-workspace to add workspaces to this file" 1>&2
21 |   exit 1
22 | fi
23 | 
24 | # Enter into the workspaces
25 | while read -r workspace
26 | do
27 |   # shellcheck disable=SC1090
28 |   source "${workspace}"
29 | done < "${workspaces_file}"
30 | 
31 | if [[ "$#" -gt 0 ]]
32 | then
33 |   # Run the command provided
34 |   exec "$@"
35 | else
36 |   # Default to entering a shell
37 |   exec "${SHELL}"
38 | fi
39 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.gitattributes:
--------------------------------------------------------------------------------
1 | * text=auto
2 |
3 | *.pcd filter=lfs diff=lfs merge=lfs -binary
4 | *.png filter=lfs diff=lfs merge=lfs -binary
5 | *.jpg filter=lfs diff=lfs merge=lfs -binary
6 | *.jpeg filter=lfs diff=lfs merge=lfs -binary
7 | *.svg filter=lfs diff=lfs merge=lfs -binary
8 | *.mkv filter=lfs diff=lfs merge=lfs -binary
9 | *.npy filter=lfs diff=lfs merge=lfs -binary
10 | *.stl filter=lfs diff=lfs merge=lfs -binary
11 | *.STL filter=lfs diff=lfs merge=lfs -binary
12 | *.urdf filter=lfs diff=lfs merge=lfs -binary
13 | *.ply filter=lfs diff=lfs merge=lfs -binary
14 | *.pickle filter=lfs diff=lfs merge=lfs -binary
15 | *.msgbytes filter=lfs diff=lfs merge=lfs -binary
16 | *.pt filter=lfs diff=lfs merge=lfs -binary
17 | *.mp3 filter=lfs diff=lfs merge=lfs -binary
18 | *.mp4 filter=lfs diff=lfs merge=lfs -binary
19 | *.ogg filter=lfs diff=lfs merge=lfs -binary
20 | *.dae filter=lfs diff=lfs merge=lfs -binary
21 | *.usd filter=lfs diff=lfs merge=lfs -binary
22 |
23 | # ReadTheDocs does not support Git LFS
24 | docs/** !filter !diff !merge binary
25 |
26 | # Makes images buildable on Windows
27 | docker/** text eol=lf
28 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/lint_mypy.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | from .paths import LAUNCH_PATH, LINT_PATH, ROS_PATH, ros_packages
4 | from .run_command import run_command
5 |
6 |
7 | def lint_mypy(fix: bool) -> None:
8 |     if not fix:
9 |         # Create MYPYPATH, which allows mypy to cross-check type hints in monorepo
10 |         # packages
11 |         mypy_path = ""
12 |         for path in ROS_PATH.iterdir():
13 |             mypy_path += f":{path}"
14 |         os.environ["MYPYPATH"] = mypy_path
15 | 
16 |         # Run MyPy on each ROS package and other Python locations
17 |         target_paths = ros_packages() + [LAUNCH_PATH, LINT_PATH]
18 |         for path in target_paths:
19 |             run_command(
20 |                 [
21 |                     "mypy",
22 |                     # Turns off the cache to avoid "Should never get here in
23 |                     # normal mode, got NoneType: instead of TypeInfo" errors
24 |                     "--cache-dir",
25 |                     os.devnull,
26 |                     "--show-error-codes",
27 |                     path,
28 |                 ]
29 |             )
30 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2024 Urban Machine
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/reload-ros-nodes:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Builds and re-runs the ROS nodes container. This is useful if you want to restart
4 | # the container over and over, which is often useful when developing your ROS code.
5 | # The container is only re-run if the image has changed.
6 | #
7 | # Usage:
8 | #   docker/reload-ros-nodes {launch_profile}
9 | #
10 | # Examples:
11 | # docker/reload-ros-nodes {{cookiecutter.example_launch_profile}}
12 |
13 | set -o errexit
14 | set -o pipefail
15 | set -o nounset
16 |
17 | source docker/_shared.sh
18 |
19 | function main {
20 | local launch_profile
21 | launch_profile="${1:-}"
22 |
23 | if [[ -z "${launch_profile}" ]]; then
24 | echo "Missing argument, specify a directory under 'launch-profiles/'" >&2
25 | echo "Usage: docker/reload-ros-nodes " >&2
26 | launch_profiles_helper_msg
27 | fi
28 | validate_launch_profile "${launch_profile}"
29 |
30 | build_images # Build any images that need to be built
31 | enable_display_passthrough # Enable passthrough for the stack
32 |
33 | export LAUNCH_PROFILE="${launch_profile}"
34 | docker compose up -d --force-recreate ros-nodes
35 | }
36 |
37 | main "${@}"
38 |
--------------------------------------------------------------------------------
/cookiecutter.json:
--------------------------------------------------------------------------------
1 | {
2 | "email": "example@urbanmachine.build",
3 | "github_username_or_org": "urbanmachine",
4 | "dockerhub_username_or_org": "{{ cookiecutter.github_username_or_org }}",
5 | "project_name": "cool_ros_project",
6 | "__project_name_slug": "{{ cookiecutter.project_name | lower | replace('-', '_') }}",
7 | "project_description": "This is a cool ROS project, showing off best practices for ROS applications.",
8 | "version": "0.1.0",
9 | "license": "MIT",
10 | "example_package_name": "cool_package",
11 | "example_package_description": "This is the first package of this project, and shows how to integrate packages!",
12 | "example_node_name": "CoolNode",
13 | "example_launch_profile": "{{ cookiecutter.example_node_name | lower + '_launch_profile' }}",
14 | "example_package_version": "0.1.0",
15 | "__example_messages_package_name": "{{ cookiecutter.example_package_name | lower + '_msgs' }}",
16 | "__prompts__": {
17 | "example_node_name": "The following variable should be written in PascalCase, because it will be used for a class name in python.\nexample_node_name ",
18 | "example_launch_profile": "This template comes pre-filled with something called a 'launch-profile'. In this project, a launch profile is a launch.py & any configuration for running a specific ROS apps.)\nexample_launch_profile ",
19 | "project_name": "This is the repository name and project name. Don't include spaces or special characters. Underscores and dashes are okay.\nproject_name "
20 | }
21 | }
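The `__`-prefixed entries above are derived by cookiecutter at render time using Jinja filters. As a plain-Python sketch (using the default values from this file, no cookiecutter required), the same transformations look like:

```python
# Plain-Python sketch of the Jinja filter chains used for the derived
# cookiecutter variables above. Cookiecutter applies these via Jinja when
# the template is rendered; the input values are this file's defaults.
project_name = "cool_ros_project"
example_package_name = "cool_package"
example_node_name = "CoolNode"

# "{{ cookiecutter.project_name | lower | replace('-', '_') }}"
project_name_slug = project_name.lower().replace("-", "_")

# "{{ cookiecutter.example_node_name | lower + '_launch_profile' }}"
example_launch_profile = example_node_name.lower() + "_launch_profile"

# "{{ cookiecutter.example_package_name | lower + '_msgs' }}"
example_messages_package_name = example_package_name.lower() + "_msgs"
```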
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/add-apt-repo:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Adds an APT repository to the repository list, allowing packages to be
4 | # installed from that source.
5 | #
6 | # This script handles downloading and installing the repository's GPG key to
7 | # the proper location. That location is controlled by the provided source
8 | # entry's "signed-by" field. For example, if the script is executed like this:
9 | #
10 | # add-apt-repo 'deb [signed-by=/my/key.gpg] https://coolpackages.com focal main' https://coolpackages.com/key.gpg
11 | #
12 | # ... then the key will be downloaded from "https://coolpackages.com/key.gpg"
13 | # and saved to a file at "/my/key.gpg".
14 | #
15 | # Usage:
16 | # add-apt-repo {package list line} {URL to GPG key}
17 |
18 | set -o errexit
19 | set -o pipefail
20 | set -o nounset
21 |
22 | # Parse the "signed-by" field to figure out where the GPG key should be
23 | # placed
24 | signed_by_pattern='signed-by=([[:alnum:]./_-]+)'
25 | if [[ "${1}" =~ ${signed_by_pattern} ]]
26 | then
27 | key_file=${BASH_REMATCH[1]}
28 | else
29 | echo "The provided source is missing a 'signed-by' field" >&2
30 | exit 1
31 | fi
32 |
33 | dependencies=(ca-certificates curl gnupg)
34 | if ! dpkg --list "${dependencies[@]}" >/dev/null 2>&1
35 | then
36 | install-packages "${dependencies[@]}"
37 | fi
38 |
39 | echo "${1}" >> /etc/apt/sources.list
40 |
41 | update-ca-certificates
42 | curl --silent --show-error --location "${2}" | gpg --dearmor > "${key_file}"
43 |
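For readers less familiar with `BASH_REMATCH`, the `signed-by` extraction above can be mirrored in Python like so. The example source line is hypothetical, and the POSIX class `[[:alnum:]]` becomes `[A-Za-z0-9]` in Python's `re` syntax:

```python
import re

# Hypothetical APT source line, matching the script's usage example
source_line = "deb [signed-by=/my/key.gpg] https://coolpackages.com focal main"

# Same pattern as the bash script, with [[:alnum:]] rewritten for Python's re.
# Group 1 captures the key path, stopping at the closing ']'.
match = re.search(r"signed-by=([A-Za-z0-9./_-]+)", source_line)
key_file = match.group(1) if match else None
```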
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/workflows/lint.yaml:
--------------------------------------------------------------------------------
1 | {% raw %}
2 | name: Lint
3 |
4 | on: [ push ]
5 |
6 | jobs:
7 | check_lint:
8 | runs-on: ubuntu-24.04
9 |
10 | steps:
11 | - uses: actions/checkout@v4
12 | - name: pip cache
13 | uses: actions/cache@v4
14 | with:
15 | path: ~/.cache/pip
16 | key: lint-pip-${{ hashFiles('**/pyproject.toml') }}
17 | restore-keys: |
18 | lint-pip-
19 | - name: Set up Python 3.12
20 | uses: actions/setup-python@v5
21 | with:
22 | python-version: '3.12'
23 | - name: Install Python dependencies
24 | run: |
25 | python -m pip install --upgrade pip
26 | python -m pip install --upgrade poetry
27 | poetry install
28 | - name: Install Bash dependencies
29 | run: sudo apt-get install --yes shellcheck
30 | - name: Lint
31 | run: poetry run lint --languages all --mode check
32 |
33 | check_upstream_template:
34 | runs-on: ubuntu-24.04
35 |
36 | steps:
37 | - uses: actions/checkout@v4
38 | - name: pip cache
39 | uses: actions/cache@v4
40 | with:
41 | path: ~/.cache/pip
42 | key: lint-pip-${{ hashFiles('**/pyproject.toml') }}
43 | restore-keys: |
44 | lint-pip-
45 | - name: Set up Python 3.12
46 | uses: actions/setup-python@v5
47 | with:
48 | python-version: '3.12'
49 | - name: Install Python dependencies
50 | run: |
51 | python -m pip install --upgrade pip
52 | python -m pip install --upgrade poetry
53 | poetry install
54 | - name: Check Upstream Template
55 | run: poetry run cruft check
56 | {% endraw %}
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.__example_messages_package_name}}/CMakeLists.txt:
--------------------------------------------------------------------------------
1 | cmake_minimum_required(VERSION 3.5)
2 | project({{cookiecutter.example_package_name}}_msgs)
3 |
4 | # Default to C99
5 | if(NOT CMAKE_C_STANDARD)
6 | set(CMAKE_C_STANDARD 99)
7 | endif()
8 |
9 | # Default to C++14
10 | if(NOT CMAKE_CXX_STANDARD)
11 | set(CMAKE_CXX_STANDARD 14)
12 | endif()
13 |
14 | if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
15 | add_compile_options(-Wall -Wextra -Wpedantic)
16 | endif()
17 |
18 | find_package(ament_cmake REQUIRED)
19 | find_package(rosidl_default_generators REQUIRED)
20 |
21 | ######################################################################
22 | ########## NOTE TO TEMPLATE USERS: When adding messages that require other messages,
23 | ########## add the following line below to ensure that the other messages are found.
24 | ########## Also make sure new messages are added to the rosidl_generate_interfaces list.
25 | # find_package(my_required_msgs_package REQUIRED)
26 | ######################################################################
27 |
28 |
29 | rosidl_generate_interfaces(
30 | ${PROJECT_NAME}
31 |
32 | ######################################################################
33 | ########## NOTE TO TEMPLATE USERS: Add new messages here, like so:
34 | # msg/ANewMessageFile.msg
35 | ######################################################################
36 |
37 | "msg/ExampleMessage.msg"
38 |
39 | "action/ExampleAction.action"
40 |
41 | "srv/ExampleService.srv"
42 |
43 | ######################################################################
44 | ########## NOTE TO TEMPLATE USERS: Add your dependency packages here, like so:
45 | # DEPENDENCIES my_required_msgs_package
46 | ######################################################################
47 |
48 | )
49 |
50 | ament_package()
51 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/test:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Runs tests in all packages under the `pkgs/` directory.
4 | # All other arguments are passed to `colcon test`. Some examples below:
5 | #
6 | # Usage:
7 | # docker/test
8 | #
9 | # Examples:
10 | # Run all tests on all `pkgs/`
11 | # docker/test
12 | #
13 | # Run tests on a specific package (passing flags to `colcon test`)
14 | # docker/test --packages-select {{cookiecutter.example_package_name}}
15 | #
16 | # Run tests on a specific package with a specific test
17 | # docker/test --packages-select {{cookiecutter.example_package_name}} --pytest-arg {{cookiecutter.example_package_name}}_test/unit/nodes/test_{{cookiecutter.example_package_name}}_node.py
18 |
19 | set -o errexit
20 | set -o pipefail
21 | set -o nounset
22 |
23 | source docker/_shared.sh
24 |
25 | build_images
26 | enable_display_passthrough
27 |
28 | # We don't use `docker compose run ...` for tests, because we don't want network
29 | # mode to be 'host' during tests, and we also want to mount the 'pkgs' directory
30 | # wholesale, since the image usually contains everything _except_ the tests.
31 | docker run -it \
32 | `# Disable the GPU to make testing more similar to the GitHub Actions environment` \
33 | --runtime runc \
34 | `# For display passthrough, for tests that have visualization options during dev` \
35 | -e DISPLAY="${DISPLAY}" \
36 | -v "/tmp/.X11-unix:/tmp/.X11-unix:rw" \
37 | `# Tests are excluded from the image, so we need to reintroduce them. This` \
38 | `# saves us from having to rebuild ROS packages for test changes.` \
39 | -v "$(pwd)/pkgs:/robot/pkgs" \
40 | "${BUILT_IMAGE_NAME}" \
41 | `# Run tests in 'pkgs' with sane logging defaults` \
42 | colcon test \
43 | --base-paths pkgs \
44 | --event-handlers console_direct+ \
45 | --executor sequential \
46 | --return-code-on-test-failure \
47 | "${@}"
48 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/README.md:
--------------------------------------------------------------------------------
1 | # {{cookiecutter.project_name}}
2 | {{cookiecutter.project_description}}
3 |
4 | ---
5 | [](https://github.com/{{cookiecutter.github_username_or_org}}/{{cookiecutter.project_name}}/actions?query=workflow%3ATest)
6 | [](https://github.com/{{cookiecutter.github_username_or_org}}/{{cookiecutter.project_name}}/actions?query=workflow%3ALint)
7 | [](https://codecov.io/gh/{{cookiecutter.github_username_or_org}}/{{cookiecutter.project_name}})
8 | [](https://github.com/astral-sh/ruff)
9 | 
10 | 
11 |
12 | ---
13 |
14 | ## Running This Project
15 |
16 | To run the project, use the following command:
17 |
18 | ```shell
19 | docker/launch {{cookiecutter.example_launch_profile}}
20 | ```
21 | Then, open http://localhost/ in your browser to view the project logs.
22 |
23 | For in-depth documentation on the repository features, read the [About Template](docs/about_template.md) documentation.
24 |
25 | ### Dependencies
26 |
27 | - [Docker](https://docs.docker.com/get-docker/), and optionally [Nvidia Container Toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html) for hardware acceleration.
28 | - [Poetry](https://python-poetry.org/docs/), in order to use linting tooling.
29 |
30 | ---
31 | This repository was initialized by the [create-ros-app](https://github.com/UrbanMachine/create-ros-app) template. Contributions are welcome!
32 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pkgs/{{cookiecutter.example_package_name}}/{{cookiecutter.example_package_name}}/nodes/{{cookiecutter.example_package_name}}_node.py:
--------------------------------------------------------------------------------
1 | from typing import Any
2 |
3 | from sensor_msgs.msg import JointState
4 |
5 | from node_helpers.nodes import HelpfulNode
6 | from pydantic import BaseModel
7 | from node_helpers.spinning import create_spin_function
8 | from rclpy.qos import qos_profile_services_default
9 |
10 | from {{cookiecutter.example_package_name}}.example_urdf import ForkliftURDF
11 |
12 | class {{cookiecutter.example_node_name}}(HelpfulNode):
13 |
14 | class Parameters(BaseModel):
15 | # Define your ROS parameters here
16 | forklift_speed: float # m/s
17 | forklift_max_extent: float
18 |
19 | def __init__(self, **kwargs: Any):
20 | super().__init__("{{cookiecutter.example_node_name}}", **kwargs)
21 | # Load parameters from the ROS parameter server
22 | self.params = self.declare_from_pydantic_model(self.Parameters, "root_config")
23 | self.urdf = ForkliftURDF.with_namespace(self.get_namespace())
24 |
25 | # Track the forks position and direction, so we can move them up and down
26 | self.forklift_position = 0.0
27 | self.forklift_direction = 1
28 |
29 | # Create publishers
30 | self.joint_state_publisher = self.create_publisher(
31 | JointState,
32 | "desired_joint_states",
33 | qos_profile_services_default
34 | )
35 |
36 | # Create a timer to publish joint values
37 | self._publish_hz = 20
38 | self.create_timer(1 / self._publish_hz, self.on_publish_joints)
39 |
40 |
41 | def on_publish_joints(self) -> None:
42 | if self.forklift_position >= self.params.forklift_max_extent:
43 | self.forklift_direction = -1
44 | elif self.forklift_position <= 0:
45 | self.forklift_direction = 1
46 |
47 | self.forklift_position += self.forklift_direction * self.params.forklift_speed / self._publish_hz
48 |
49 | joint_positions = {
50 | self.urdf.joints.FORKS: self.forklift_position
51 | }
52 |
53 | self.joint_state_publisher.publish(
54 | JointState(
55 | name=list(joint_positions.keys()),
56 | position=list(joint_positions.values()),
57 | )
58 | )
59 |
60 | main = create_spin_function({{cookiecutter.example_node_name}})
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/launch:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Deploys a 'launch-profile' on the local machine.
4 | #
5 | # A launch-profile is any directory under the root `launch-profiles/` directory, which
6 | # contains a 'launch.py' file and related configuration for this profile.
7 | #
8 | # Normally, this command checks for updates on the base image before building
9 | # our images. This behavior can be disabled with the `--no-pull` flag, which is
10 | # useful in production: you may want to make changes without an upstream image
11 | # update forcing a rebuild of your Dockerfile. For example, if there's a slight
12 | # update to 'ubuntu:XXXX' on Docker Hub, without `--no-pull` you might be stuck
13 | # with the container rebuilding at an inconvenient time.
14 | #
15 | # Usage:
16 | #   docker/launch [--no-pull] {launch_profile}
17 | #
18 | # Examples:
19 | # docker/launch {{cookiecutter.example_launch_profile}}
20 | # docker/launch --no-pull {{cookiecutter.example_launch_profile}}
21 |
22 | {% raw %}
23 | set -o errexit
24 | set -o pipefail
25 | set -o nounset
26 |
27 | source docker/_shared.sh
28 |
29 | function usage {
30 | echo "Usage: docker/launch [--no-pull] " >&2
31 | launch_profiles_helper_msg
32 | }
33 |
34 | function main {
35 | local launch_profile
36 | launch_profile=""
37 | local pull_upstream_images
38 | pull_upstream_images=true
39 |
40 | while [[ "${#}" -gt 0 ]]; do
41 | case "${1}" in
42 | --no-pull)
43 | pull_upstream_images=false
44 | ;;
45 | *)
46 | if [[ "${1}" == -* ]]; then
47 | echo "Unknown flag ${1}" >&2
48 | usage
49 | elif [[ "${launch_profile}" == "" ]]; then
50 | launch_profile="${1}"
51 | else
52 | echo "Invalid number of positional arguments" >&2
53 | usage
54 | fi
55 | ;;
56 | esac
57 | shift
58 | done
59 |
60 | if [[ "${launch_profile}" == "" ]]; then
61 | echo "Missing argument, specify a directory under 'launch-profiles/'" >&2
62 | usage
63 | fi
64 | validate_launch_profile "${launch_profile}"
65 |
66 | # To reduce downtime, build the latest images before stopping any existing stacks.
67 | if [[ "${pull_upstream_images}" = true ]]; then
68 | pull_images
69 | fi
70 |
71 | build_images # Build any images that need to be built
72 | undeploy # Stop any existing stack
73 | enable_display_passthrough # Enable passthrough for the stack
74 |
75 | deploy_and_wait "${launch_profile}"
76 | }
77 |
78 | main "${@}"
79 | {% endraw %}
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/main.py:
--------------------------------------------------------------------------------
1 | import logging
2 | import sys
3 | from argparse import ArgumentParser
4 | from subprocess import SubprocessError
5 |
6 | from .errors import CommandError
7 | from .lint_clang_format import lint_clang_format
8 | from .lint_darglint import lint_darglint
9 | from .lint_mypy import lint_mypy
10 | from .lint_ruff import lint_ruff_check, lint_ruff_format
11 | from .lint_shellcheck import lint_shellcheck
12 |
13 | PYTHON_LANGUAGE = "python"
14 | CPP_LANGUAGE = "cpp"
15 | BASH_LANGUAGE = "bash"
16 | ALL_LANGUAGES = "all"
17 |
18 |
19 | LINTERS = {
20 | PYTHON_LANGUAGE: [
21 | # Run linters from fastest to slowest
22 | lint_ruff_format,
23 | lint_ruff_check,
24 | lint_darglint,
25 | lint_mypy,
26 | ],
27 | CPP_LANGUAGE: [
28 | lint_clang_format,
29 | ],
30 | BASH_LANGUAGE: [
31 | lint_shellcheck,
32 | ],
33 | }
34 |
35 |
36 | FIX_MODE = "fix"
37 | CHECK_MODE = "check"
38 | ALL_MODE = "all"
39 |
40 |
41 | def entrypoint() -> None:
42 | parser = ArgumentParser(description="Runs a suite of code linting tools")
43 |
44 | parser.add_argument(
45 | "--mode",
46 | choices=[FIX_MODE, CHECK_MODE, ALL_MODE],
47 | default=ALL_MODE,
48 | help=f"The '{FIX_MODE}' mode allows linters to auto-fix problems. Linters "
49 | f"without this feature will not be run. The '{CHECK_MODE}' mode runs all "
50 | f"linters without changing code, and exits with an error code if any fail."
51 | f"The '{ALL_MODE}' mode runs '{FIX_MODE}', then '{CHECK_MODE}'. This is the"
52 | f"default.",
53 | )
54 |
55 | parser.add_argument(
56 | "--languages",
57 | nargs="+",
58 | choices=[ALL_LANGUAGES, *LINTERS.keys()],
59 | default=[PYTHON_LANGUAGE],
60 | help=f"Specifies which languages should be linted. To run linting for all "
61 | f"available languages, specify '{ALL_MODE}'.",
62 | )
63 |
64 | args = parser.parse_args()
65 |
66 | linters_to_run = []
67 | if ALL_LANGUAGES in args.languages:
68 | for linters in LINTERS.values():
69 | linters_to_run += linters
70 | else:
71 | for language in args.languages:
72 | linters_to_run += LINTERS[language]
73 |
74 | for linter in linters_to_run:
75 | if args.mode in [FIX_MODE, ALL_MODE]:
76 | linter(True)
77 | if args.mode in [CHECK_MODE, ALL_MODE]:
78 | linter(False)
79 |
80 |
81 | def main() -> None:
82 | try:
83 | entrypoint()
84 | except SubprocessError:
85 | # The subprocess will log error information
86 | sys.exit(1)
87 | except CommandError as ex:
88 | logging.error(ex.message)
89 | sys.exit(1)
90 |
91 |
92 | if __name__ == "__main__":
93 | main()
94 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/launch-profiles/{{cookiecutter.example_launch_profile}}/launcher.py:
--------------------------------------------------------------------------------
1 | """Launch nodes for this launch profile."""
2 |
3 | from pathlib import Path
4 |
5 | from launch import LaunchDescription
6 | from launch_ros.actions import Node
7 | from pydantic import BaseModel
8 |
9 | from node_helpers import launching
10 | from node_helpers.parameters import ParameterLoader
11 |
12 | # Import the forklift URDF module so it registers itself with the URDFConstants
13 | from {{cookiecutter.example_package_name}} import example_urdf
14 |
15 | class MetaParameters(BaseModel):
16 | """This is a great place to put parameters that affect the generation of the launch
17 | file. Don't put node specific configuration in here, rather, put configuration for
18 | what nodes you want to be created in the first place.
19 |
20 | Read more about this functionality under docs/parameters.rst
21 | """
22 |
23 | urdf_modules_to_load: list[launching.URDFModuleNodeFactory.Parameters]
24 | """This is an example of dynamically loading an arbitrary number of URDFs.
25 |
26 | This is set in the `/robot/launch-profile/parameters.yaml` under `meta_parameters`.
27 | """
28 |
29 |
30 | def generate_launch_description() -> LaunchDescription:
31 | # Create a parameter loader to parse all yaml files in the launch-profile/parameters
32 | # directory, and then apply overrides from the override file, if one exists.
33 | param_loader: ParameterLoader[MetaParameters] = ParameterLoader(
34 | parameters_directory=Path("/robot/launch-profile/parameters/"),
35 | override_file=Path("/robot/launch-profile/parameters.override.yaml"),
36 | meta_parameters_schema=MetaParameters,
37 | )
38 |
39 | rviz_config = launching.required_file("/robot/launch-profile/rviz-config.rviz")
40 |
41 | urdf_node_factories = (
42 | launching.URDFModuleNodeFactory(
43 | parameters=node_factory_params
44 | )
45 | for node_factory_params in param_loader.meta_parameters.urdf_modules_to_load
46 | )
47 | urdf_nodes = []
48 | for urdf_node_factory in urdf_node_factories:
49 | urdf_nodes += urdf_node_factory.create_nodes()
50 |
51 | launch_description = [
52 | *urdf_nodes,
53 | Node(
54 | package="{{cookiecutter.example_package_name}}",
55 | executable="run_{{cookiecutter.example_node_name}}",
56 | name="{{cookiecutter.example_node_name}}",
57 | parameters=[param_loader.ros_parameters_file],
58 | namespace="example_node_namespace",
59 | ),
60 | Node(
61 | package="rviz2",
62 | executable="rviz2",
63 | arguments=["-d", [str(rviz_config)]],
64 | ),
65 | Node(
66 | namespace="urdf_arrangement",
67 | package="node_helpers",
68 | executable="interactive_transform_publisher",
69 | parameters=[param_loader.ros_parameters_file],
70 | ),
71 | ]
72 | return LaunchDescription(launch_description)
73 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/utils/environment/install-ros-package-from-git:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | # Installs ROS packages from a Git repository into the Docker image.
4 | #
5 | # This script:
6 | # - Requires specifying the repository URL, branch/tag, and the relative path to a pkgs directory.
7 | # - Clones the repository to a temporary directory.
8 | # - Checks out the specified branch or tag.
9 | # - Moves the contents of the given pkgs directory into a new directory (named after the repo) in the current directory.
10 | # - Installs dependencies and builds all packages found in that directory.
11 | #
12 | # Usage:
13 | # install-ros-package-from-git {repository_url} {branch_or_tag} {pkgs_subdir}
14 | #
15 | # Example:
16 | # install-ros-package-from-git https://github.com/UrbanMachine/example_repo.git main pkgs
17 | #
18 | # This might produce a structure like:
19 | # current_dir/
20 | # example_repo/
21 | # some_pkg_A/
22 | # some_pkg_B/
23 |
24 | set -o errexit
25 | set -o pipefail
26 | set -o nounset
27 |
28 | function usage {
29 | echo "Usage: install-ros-package-from-git {repository_url} {branch_or_tag} {pkgs_subdir}" >&2
30 | exit 1
31 | }
32 |
33 | function main {
34 | if [[ "$#" -ne 3 ]]; then
35 | usage
36 | fi
37 |
38 | local repo_url="$1"
39 | local branch_or_tag="$2"
40 | local pkgs_subdir="$3"
41 |
42 | local original_dir
43 | original_dir="$(pwd)"
44 |
45 | # Derive the package name from the repo URL
46 | local package_name
47 | package_name="$(basename "${repo_url}" .git)"
48 |
49 | local dest_dir="${original_dir}/${package_name}"
50 | if [[ -d "${dest_dir}" ]]; then
51 | echo "Directory ${dest_dir} already exists. Remove it or choose a different repository." >&2
52 | exit 1
53 | fi
54 |
55 | # Create a temporary directory for cloning
56 | local temp_dir
57 | temp_dir="$(mktemp -d)"
58 |
59 | # Clone the repository into the temp directory
60 | git clone "${repo_url}" "${temp_dir}"
61 | cd "${temp_dir}"
62 |
63 | # Checkout the specified branch or tag
64 | git checkout "${branch_or_tag}"
65 |
66 | # Verify that the specified pkgs_subdir exists
67 | if [[ ! -d "${pkgs_subdir}" ]]; then
68 | echo "The specified pkgs_subdir '${pkgs_subdir}' does not exist in the repository." >&2
69 | exit 1
70 | fi
71 |
72 | # Create the destination directory and move the packages
73 | mkdir -p "${dest_dir}"
74 | # Move everything from pkgs_subdir into the new directory
75 | mv "${pkgs_subdir}"/* "${dest_dir}/"
76 |
77 | # Return to the original directory to install dependencies and build
78 | cd "${original_dir}"
79 |
80 | # Install ROS dependencies with rosdep
81 | with-package-list rosdep install -i --from-path "${dest_dir}" --rosdistro "${ROS2_DISTRO:-foxy}" -y
82 |
83 | # Install Poetry dependencies for the packages
84 | python3 -m colcon_poetry_ros.dependencies.install --base-paths "${dest_dir}"
85 |
86 | # Build the packages using colcon
87 | # We use enter-workspaces to ensure all previously added workspaces are sourced.
88 | enter-workspaces colcon build --base-paths "${package_name}"
89 | add-workspace "${original_dir}/install/setup.bash"
90 | }
91 |
92 | main "$@"
93 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/lint/paths.py:
--------------------------------------------------------------------------------
1 | from itertools import chain
2 | from pathlib import Path
3 |
4 |
5 | def required_path(path_str: str) -> Path:
6 | path = Path(path_str)
7 | if not path.is_dir():
8 | raise NotADirectoryError(f"Path {path} is not a directory")
9 | return path
10 |
11 |
12 | ROS_PATH = required_path("pkgs")
13 | LINT_PATH = required_path(".github/lint")
14 | FIRMWARE_PATH = Path()
15 | LAUNCH_PATH = required_path("launch-profiles")
16 |
17 |
18 | def python_paths() -> list[Path]:
19 | """Get all top-level directories containing lintable Python code"""
20 | return [ROS_PATH, LINT_PATH, LAUNCH_PATH]
21 |
22 |
23 | def python_files() -> list[Path]:
24 | """Get all lintable Python files"""
25 | output = []
26 | for path in python_paths():
27 | output += list(path.rglob("*.py"))
28 | return output
29 |
30 |
31 | def cpp_paths() -> list[Path]:
32 | """Get all top-level directories containing lintable C++ code"""
33 |
34 | # Directories under firmware/ that are not written by us
35 | blacklist = [".vs", "libClearCore", "LwIP", "Tools"]
36 |
37 | # Discover project locations
38 | paths: list[Path] = [
39 | dir_
40 | for dir_ in FIRMWARE_PATH.iterdir()
41 | if dir_.is_dir() and dir_.name not in blacklist
42 | ]
43 |
44 | return paths
45 |
46 |
47 | # File extensions for C++ code
48 | _CPP_EXTENSIONS = ["*.h", "*.cpp", "*.cc"]
49 |
50 |
51 | def _is_external_source(source: Path) -> bool:
52 | """Checks if the source resides in a directory indicating that it's a vendored
53 | dependency and not written by us.
54 |
55 | :param source: The source to check
56 | :return: True if an external source
57 | """
58 | return any(parent.name in ["external", "submodules"] for parent in source.parents)
59 |
60 |
61 | def cpp_files() -> list[Path]:
62 | """Get all lintable C++ files"""
63 | output = []
64 |
65 | for path in cpp_paths():
66 | sources = chain(*[path.rglob(e) for e in _CPP_EXTENSIONS])
67 | output += [s for s in sources if not _is_external_source(s)]
68 |
69 | return output
70 |
71 |
72 | def ros_packages() -> list[Path]:
73 | """Get all directories that are ROS packages"""
74 | return [p for p in ROS_PATH.iterdir() if (p / "pyproject.toml").is_file()]
75 |
76 |
77 | # Directories that contain bash scripts somewhere in them
78 | _DIRS_WITH_BASH_SCRIPTS = [Path("docker"), Path("docs")]
79 |
80 |
81 | def bash_files() -> list[Path]:
82 | """
83 | :return: All bash scripts in the project
84 | """
85 | output: list[Path] = []
86 |
87 | for folder in _DIRS_WITH_BASH_SCRIPTS:
88 | output += [f for f in folder.rglob("*") if _is_bash_script(f)]
89 |
90 | return output
91 |
92 |
93 | def _is_bash_script(path: Path) -> bool:
94 | """Checks if the given file is a bash script by inspecting its shebang"""
95 | shebang = b"#!/usr/bin/env bash"
96 |
97 | if not path.is_file():
98 | return False
99 |
100 | with path.open("rb") as f:
101 | return f.read(len(shebang)) == shebang
102 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/_shared.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 | # Shared functions used in multiple deployment scripts
3 |
4 | source .env
5 |
6 | # Builds our Docker images
7 | function build_images {
8 | echo "Building project images. This may take a few minutes on the first run..." >&2
9 | docker compose build --build-arg BASE_IMAGE="${BASE_IMAGE}"
10 | }
11 |
12 | # Removes an existing Docker Compose deployment, if one exists
13 | function undeploy {
14 | if is_stack_running; then
15 | echo "" >&2
16 | echo 'Waiting for robot stack to come down' >&2
17 | docker compose down
18 | while is_stack_running; do
19 | sleep 0.1
20 | done
21 | echo " done!" >&2
22 | fi
23 | }
24 |
25 | # Pulls necessary images for offline building and deployment
26 | function pull_images {
27 | echo "Pulling necessary docker images. This may take a few minutes on the first run..." >&2
28 |
29 | # Pull all the non-locally-built images
30 | docker compose pull grafana loki promtail
31 |
32 | # Pull the base image (for image building purposes).
33 | # Doing this explicitly tags the base
34 | # image, preventing `docker build` from checking for updates itself. That
35 | # way, if the user chooses to skip this step, builds can be done offline.
36 | docker pull "${BASE_IMAGE}"
37 | }
38 |
39 | # Enables display passthrough of windows. Other passthrough (such as pulseaudio) can also
40 | # be configured in this function.
41 | function enable_display_passthrough {
42 | xhost +local:root
43 | }
44 |
45 | # This is automatically called when the user presses Ctrl+C. It will bring down the stack.
46 | function _catch_sigint {
47 | trap '' INT
48 | undeploy
49 | }
50 |
51 | # Checks if the stack is running
52 | function is_stack_running() {
53 | [ -n "$(docker compose ps --services)" ]
54 | }
55 |
56 | # Starts a Docker Compose deployment, and waits until the user presses Ctrl+C.
57 | # Then, the stack will be brought down.
58 | function deploy_and_wait {
59 | # Start the stack non-blockingly, because logs are best accessed via the web interface
60 | export LAUNCH_PROFILE="${1}"
61 | docker compose up --detach
62 |
63 | trap _catch_sigint INT
64 |
65 | echo "" >&2
66 | echo "Deployment of '${PROJECT_NAME}' has started! Press Ctrl+C to stop." >&2
67 | echo "Logs are viewable at http://localhost/" >&2
68 |
69 | while is_stack_running; do
70 | sleep 1
71 | done
72 | echo "Another process brought down the stack." >&2
73 | return 1
74 | }
75 |
76 | # Shows the user the available launch profiles
77 | function launch_profiles_helper_msg {
78 | echo "Available launch profiles are:" >&2
79 | # shellcheck disable=SC2012
80 | ls -1 launch-profiles/ | sed 's/^/ - /' >&2
81 | echo "" >&2
82 | echo "Read more about 'launch-profiles' under 'docs/about_template.md'" >&2
83 | exit 1
84 | }
85 |
86 | # Inform the user that the chosen launch profile is invalid if it is not a directory
87 | function validate_launch_profile {
88 | local chosen_profile
89 | chosen_profile="$1"
90 |
91 | # Check if the chosen profile is a directory under 'launch-profiles'
92 | if [[ ! -d "launch-profiles/${chosen_profile}" ]]; then
93 | echo "Error: '${chosen_profile}' is not a valid launch profile." >&2
94 | echo "It should be a directory under 'launch-profiles/'." >&2
95 | launch_profiles_helper_msg
96 | fi
97 | }
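The profile-validation flow above can be exercised without the rest of the stack. A minimal standalone sketch, where the temp directory and profile names are fabricated for illustration:

```shell
#!/usr/bin/env bash
# Standalone sketch of the validate_launch_profile logic above.
# The profiles directory and profile names are fabricated examples.
set -euo pipefail

profiles_dir="$(mktemp -d)"
mkdir -p "${profiles_dir}/example_profile"

validate_launch_profile() {
  local chosen_profile="$1"
  # A profile is valid only if it is a directory under the profiles dir
  if [[ ! -d "${profiles_dir}/${chosen_profile}" ]]; then
    echo "Error: '${chosen_profile}' is not a valid launch profile." >&2
    return 1
  fi
}

validate_launch_profile example_profile && echo "valid"
validate_launch_profile missing_profile || echo "invalid"
```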
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker-compose.yaml:
--------------------------------------------------------------------------------
1 | # Set the docker compose project name, which will apply to all services
2 | name: {{cookiecutter.project_name}}
3 |
4 | # Create a variable for holding common configuration for all containers
5 | # This syntax inspired by https://docs.geoserver.org/2.21.x/en/user/styling/ysld/reference/variables.html
6 | x-common-config: &common-config
7 | restart: "unless-stopped"
8 | labels:
9 | # Tells Promtail to get logs for this container
10 | - "build.{{cookiecutter.dockerhub_username_or_org}}.collect-logs=true"
11 |
12 | services:
13 | ros-nodes:
14 | command: ros2 launch "launch-profiles/${LAUNCH_PROFILE:- 'set in docker/_shared'}/launcher.py"
15 | image: ${BUILT_IMAGE_NAME:- 'set-in-env'}
16 | build:
17 | context: .
18 | dockerfile: docker/Dockerfile
19 | volumes:
20 | # Mount a directory you can write to from the container for storing persistent data
21 | # In your local machine this will be in '.docker_volumes/ros-nodes/'. In the robot
22 | # it will be in '/robot/persistent/'
23 | - ${VOLUMES_DIRECTORY:- 'set in .env'}/ros-nodes:/robot/persistent
24 | # Mounts the 'launch-profiles/LAUNCH_PROFILE' directory into '/robot/launch-profile/'
25 |       # This way you can save configuration files from GUIs like rviz right back to your source dir
26 | # LAUNCH_PROFILE is set in docker/_shared.sh
27 | - ./launch-profiles/${LAUNCH_PROFILE:- ''}:/robot/launch-profile
28 | # Necessary for display passthrough
29 | - "/tmp/.X11-unix:/tmp/.X11-unix:rw"
30 | # Necessary for PulseAudio passthrough
31 | - "/run/user/${USER_ID:-1000}/pulse/native:/pulse-socket"
32 | # Build cache, used by `save-build-cache` and `restore-build-cache` docker scripts
33 | - type: volume
34 | source: ros-nodes-build-cache
35 | target: /robot/build
36 | environment:
37 | # Necessary for display passthrough
38 | DISPLAY: $DISPLAY
39 | # Necessary for PulseAudio passthrough
40 | PULSE_SERVER: "unix:/pulse-socket"
41 | # Enable serial passthrough
42 |     privileged: true
43 |     network_mode: host
44 |     # Gives the container access to kernel capabilities, useful for most robots
45 |     cap_add:
46 |       - ALL
47 | # Sometimes nodes never fully stop on SIGTERM
48 | stop_signal: SIGKILL
49 | <<:
50 | - *common-config
51 |
52 | grafana:
53 | image: grafana/grafana-oss:11.3.0
54 | user: "0"
55 | volumes:
56 | - ${VOLUMES_DIRECTORY:- 'set in .env'}/grafana:/var/lib/grafana
57 | - ./docker/grafana:/etc/grafana
58 | ports:
59 | - "${GRAFANA_PORT:-80}:3000"
60 | <<: *common-config
61 |
62 | promtail:
63 | image: grafana/promtail:3.2.1
64 | command: -config.file=/config.yaml
65 | configs:
66 | - source: promtail
67 | target: /config.yaml
68 | volumes:
69 | # For reading container labels and logs
70 | - /var/run/docker.sock:/var/run/docker.sock
71 | - /var/lib/docker/containers:/var/lib/docker/containers
72 | # Promtail takes a long time to stop if for some reason Loki isn't running. See:
73 | # https://github.com/grafana/loki/issues/6533
74 | stop_signal: SIGKILL
75 | <<:
76 | - *common-config
77 |
78 | loki:
79 | image: grafana/loki:3.2.1
80 | user: "root"
81 | volumes:
82 | - ${VOLUMES_DIRECTORY:- 'set in .env'}/loki:/loki
83 | <<: *common-config
84 |
85 | volumes:
86 | ros-nodes-build-cache:
87 |
88 | configs:
89 | promtail:
90 | file: docker/promtail/config.yaml
91 |
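The `${VAR:-fallback}` placeholders above use Compose's interpolation syntax, which follows the same `:-` default semantics as shell parameter expansion: the fallback applies when the variable is unset or empty. A quick shell illustration:

```shell
# How the ${VAR:-fallback} defaults in the compose file resolve: the
# fallback is used when the variable is unset OR empty.
unset GRAFANA_PORT
echo "${GRAFANA_PORT:-80}"      # unset -> fallback, prints: 80

GRAFANA_PORT=3001
echo "${GRAFANA_PORT:-80}"      # set -> explicit value, prints: 3001

USER_ID=""
echo "${USER_ID:-1000}"         # empty -> fallback, prints: 1000
```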
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.poetry]
2 | name = "{{cookiecutter.__project_name_slug}}"
3 | version = "{{cookiecutter.version}}"
4 | description = "{{cookiecutter.project_description}}"
5 | authors = ["{{cookiecutter.github_username_or_org}} <{{cookiecutter.email}}>"]
6 | license = "{{cookiecutter.license}}"
7 | readme = "README.md"
8 | packages = [
9 | { include = "lint", from = ".github" }
10 | ]
11 |
12 | [tool.poetry.dependencies]
13 | python = ">=3.10.0,<4.0"
14 |
15 | [tool.poetry.dev-dependencies]
16 | mypy = "^1.9.0"
17 | ruff = "^0.3.3"
18 | cruft = "^2.15.0"
19 |
20 | # Remove darglint once ruff adds support for the rules.
21 | # Tracking: https://github.com/astral-sh/ruff/issues/458
22 | darglint = "^1.8.1"
23 |
24 | # The following libraries are used only to provide mypy with stubs
25 | pydantic = "^2.5.2"
26 | types-PyYAML = "^6.0.12.6"
27 | types-pytz = "^2024.1.0.20240203"
28 |
29 | [tool.poetry.scripts]
30 | lint = "lint.main:main"
31 |
32 | [build-system]
33 | requires = ["poetry-core>=1.0.0"]
34 | build-backend = "poetry.core.masonry.api"
35 |
36 | [tool.mypy]
37 | strict = true
38 | ignore_missing_imports = true
39 | disallow_subclassing_any = false
40 | implicit_reexport = true
41 | # We can't add annotations to decorators from other libraries, making this
42 | # check not very useful
43 | disallow_untyped_decorators = false
44 |
45 | [tool.ruff]
46 | extend-exclude = []
47 | target-version = "py310"
48 |
49 | [tool.ruff.lint]
50 | # Rules documentation: https://docs.astral.sh/ruff/rules/#flake8-bandit-s
51 | select = ["ALL"]
52 | ignore = [
53 |     # Rule groups that are partially disabled
54 | "N818", # pep8-naming: Requires the word 'error' in all exception types
55 | "S101", "S3", "S6", # flake8-bandit rules
56 | "PT007", "PT011", "PT012", "PT018", "PT019", # flake8-pytest-styles rules
57 | "RET503", "RET504", "RET505", "RET506", "RET507", # flake8-return
58 | "SIM105",
59 | "TD002", "TD003", # flake8-todos: disable strict rules
60 | "FIX", # flake8-fixme: this disallows todo statements entirely
61 | "PGH003", # pygrep-hooks: disallows '# type: ignore' statements
62 | "PLR0913", "PLR2004", # pylint: A few rules that are too aggressive
63 | "TRY003", "TRY004", "TRY300", "TRY301", "TRY400", # tryceratops
64 | "NPY002", # Deprecated random number generation (let's fix this over time)
65 |     "PERF203", # This perf improvement is fixed in python 3.11, so no reason to fret
66 | "RUF005", "RUF012", # ruff specific rules
67 |     # Rule groups that are fully disabled
68 | "D", # pydocstyle: disabled, as it's too aggressive
69 |     "ANN", # flake8-annotations: disabled, it's too aggressive and we have mypy
70 | "FBT", # flake8-boolean-trap: disabled, it doesn't allow bools as positional args
71 | "A", # flake8-builtins: disabled, would be nice to work towards reenabling
72 | "COM", # flake8-commas: disabled, adds commas where it's silly to do so
73 | "CPY", # flake8-copyright: disabled, we don't put copyright headers in every file
74 | "EM", # flake8-errmsg: disabled, doesn't allow fstrings when making error messages
75 | "G", # flake8-logging-format: disabled, disallows fstrings in logging
76 | "INP", # flake8-no-pep420: disabled, disallows namespace packages
77 | "TID", # flake8-tidy-imports: disabled, in the future we should work towards this
78 | "TCH", # flake8-type-checking: disabled, in the future we should work towards this
79 | "ARG", # flake8-unused-arguments: disabled, too many false positives
80 | "ISC001", # ruff throws warnings if we use its formatter but also have this enabled
81 | "B027", # Doesn't allow empty methods in an abstract class
82 | ]
83 |
84 | [tool.ruff.lint.per-file-ignores]
85 | "__init__.py" = ["F401"]
86 | "test*" = ["S101", "SLF001", "ERA001"]
87 | "conftest.*" = ["SLF001"]
88 | "*validation.*" = ["SLF001"] # Library test code is treated like test code
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/promtail/config.yaml:
--------------------------------------------------------------------------------
1 | # This configures Promtail to scrape logs from all containers with the label
2 | # "build.{{cookiecutter.dockerhub_username_or_org}}.collect-logs=true".
3 | #
4 | # It uses regexes to extract the node executable, the node name, and the log
5 | # level from each log line. The log level is assigned to a new "level" label
6 | # and removed from the log text itself.
7 | #
8 | # It also collapses multiline logs (such as tracebacks) into a single
9 | # multiline block.
10 | #
11 | # This file is wired up as the Promtail service's configuration in
12 | # docker-compose.yaml.
13 |
14 | # https://grafana.com/docs/loki/latest/clients/promtail/configuration/
15 | # https://docs.docker.com/engine/api/v1.41/#operation/ContainerList
16 | server:
17 | http_listen_port: 9080
18 | grpc_listen_port: 0
19 |
20 | clients:
21 | - url: http://loki:3100/loki/api/v1/push
22 |
23 | scrape_configs:
24 | - job_name: docker
25 | # use docker.sock to filter containers
26 | docker_sd_configs:
27 | - host: "unix:///var/run/docker.sock"
28 | refresh_interval: 15s
29 | filters:
30 | - name: label
31 | # Make sure that all relevant containers have this label
32 | values: [ "build.{{cookiecutter.dockerhub_username_or_org}}.collect-logs=true" ]
33 | # Use the container name to create a loki label
34 | relabel_configs:
35 | - source_labels: [ '__meta_docker_container_name' ]
36 | # Containers created by Swarm look like "/robot_dashboard.1.wp29jaucvp5pneq6zyw9mgwu7". This selects just the
37 | # "robot_dashboard" part.
38 | regex: '/([^\.]*).*'
39 | target_label: 'container'
40 |
41 |
42 | # Great, at this point, we have logs from all containers along with a "container" label
43 | pipeline_stages:
44 | # Now, extract the "node_executable" label only from the ros-nodes container
45 | - match:
46 | selector: '{container=~".*nodes.*"}'
47 | stages:
48 | # Find the node name by matching [some_node_name-##]
49 | - regex: &node_executable_regex
50 |           expression: '(?i)(?P<opening_bracket>^\[)(?P<node_executable>[^\]]+-\d+)(?P<closing_bracket>\] )'
51 | - labels:
52 | node_executable: ""
53 |
54 | # Remove the [node_name-#] from the log itself (we'll show the label in grafana)
55 | # Also removes the space after the bracket
56 | - replace:
57 | <<: *node_executable_regex
58 |
59 | - regex: &ros2_loglevel_and_timestamp
60 |           expression: '(?i)(?P<opening_bracket>\[)(?P<ros2_loglevel>info|warn(?:ing)?|error|debug|traceback|critical)(?P<closing_bracket>\] )(?P<timestamp>\[[+-]?([0-9]*[.])?[0-9]+\] ?)'
61 | - labels:
62 | ros2_loglevel: ""
63 |
64 | - regex: &node_name_regex
65 | # Match at least 1 alphabetical character,
66 |           # Then match any alphabetical+numerical+underscore characters
67 | # Then at least one period
68 | # Then match any alphabetical, numerical, underscore, or period characters
69 | # Finally, expect a closing bracket and maybe an opening colon and space.
70 |           expression: '(?P<opening_bracket>\[)(?P<node_name>[a-zA-Z].[a-zA-Z0-9_]+\.[a-zA-Z0-9_.]+)(?P<closing_bracket>\]:? )'
71 | - labels:
72 | node_name: ""
73 |
74 | # Collapse multiline logs into one multiline block, using regex to detect the first line
75 | - multiline:
76 |           # Matches lines containing one of the log-level keywords, as long as the
77 |           # keyword is preceded by an opening bracket, whitespace, or nothing at all.
78 | firstline: '(?i)(^|\s|\[)(info|warn(?:ing)?|error|debug|traceback|critical)(\]|:|( \(most recent call last)).*'
79 | max_wait_time: 3s
80 | max_lines: 500
81 |
82 | # Capture log level from logs that look like "LEVEL:" or "LEVEL: "
83 | # Case insensitive. This is for other libraries that don't use the default logger.
84 | - regex: &python_logging_loglevel_regex
85 |           expression: '^(?i)(?P<leading_whitespace>\s*)(?P<python_loglevel>info|warn(?:ing)?|error|debug|traceback|critical):\s?'
86 | - labels:
87 | python_loglevel:
88 |
89 | # Create a new 'level' label holding the value of either 'python_loglevel' or 'ros2_loglevel'
90 | # If no level was extracted, "UNKNOWN" will be used.
91 | - template:
92 | source: "level"
93 | {% raw %}
94 | template: "\
95 | {{ if .python_loglevel }}{{ .python_loglevel | ToUpper }}\
96 | {{ else if .ros2_loglevel }}{{ .ros2_loglevel | ToUpper }}\
97 | {{ else }}UNKNOWN\
98 | {{ end }}"
99 | {% endraw %}
100 | # Remove "ING" in the 'level', so WARNING -> WARN, DEBUGGING -> DEBUG, etc
101 | - replace:
102 | source: 'level'
103 |           expression: ".*(?P<ing>ING)"
104 | replace: ''
105 | - labels:
106 | level:
107 |
108 | # Delete the log level text. If you do this before the templating, templating will fail
109 | - replace:
110 | <<: *python_logging_loglevel_regex
111 | - replace:
112 | <<: *node_name_regex
113 | - replace:
114 | <<: *ros2_loglevel_and_timestamp
115 |
116 | # Drop the 'python_loglevel' and 'ros2_loglevel' labels now that 'level' has been assigned.
117 | - labeldrop:
118 | - python_loglevel
119 | - ros2_loglevel
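The "Remove ING" replace stage near the end can be reproduced outside Promtail. A sed-based sketch of the same normalization (`normalize_level` is a made-up helper for this example; Promtail does this with a `replace` stage instead):

```shell
# Reproduce the promtail stage that trims a trailing "ING" from log levels,
# so WARNING -> WARN. normalize_level is a fabricated helper name.
normalize_level() {
  printf '%s\n' "$1" | sed -E 's/ING$//'
}

normalize_level "WARNING"   # prints: WARN
normalize_level "ERROR"     # prints: ERROR (no trailing ING, unchanged)
```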
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/.github/workflows/test.yaml:
--------------------------------------------------------------------------------
1 | name: Test
2 |
3 | on: [ push ]
4 |
5 | jobs:
6 | # This job builds the Docker image that will be used in the following test job.
7 | # It also publishes the image to the GitHub Container Registry, so every PR can
8 | # have an up-to-date image by the name of ghcr.io/GITHUB_USERNAME/PROJECT_NAME:BRANCH_NAME
9 | build-image:
10 | runs-on: ubuntu-24.04
11 | env:
12 | BUILT_IMAGE_NAME: ghcr.io/{{cookiecutter.github_username_or_org}}/{{cookiecutter.project_name}}
13 | PROJECT_NAME: node_helpers
14 | {% raw %}
15 | defaults:
16 | run:
17 | shell: bash
18 | steps:
19 | - name: Create A Unique Tag For This Image Based On Branch Name
20 | id: prep
21 | run: |
22 | if [[ "${GITHUB_REF}" == "refs/heads/"* ]]
23 | then
24 | # This is a build on a branch
25 | TAG=${GITHUB_REF#refs/heads/}
26 | elif [[ "${GITHUB_REF}" == "refs/pull/"* ]]
27 | then
28 | # This is a PR build
29 | TAG=${GITHUB_REF#refs/pull/}
30 | elif [[ "${GITHUB_REF}" == "refs/tags/"* ]]
31 | then
32 | # This is a tagged build
33 | TAG=${GITHUB_REF#refs/tags/}
34 | else
35 | echo "Unexpected reference format ${GITHUB_REF}"
36 | exit 1
37 | fi
38 |
39 | # Remove slashes, since they're not allowed in Docker tags
40 | TAG=$(echo "${TAG}" | sed 's#/#-#g')
41 | echo "tagged_image=${BUILT_IMAGE_NAME}:${TAG}" >> $GITHUB_OUTPUT
42 | - name: Set up Docker Buildx
43 | uses: docker/setup-buildx-action@v2
44 | - name: Load Previous Docker Layers Cache
45 | uses: actions/cache@v4
46 | with:
47 | path: /tmp/.buildx-cache
48 | # Key is named differently to avoid collision
49 | key: v1-${{ env.PROJECT_NAME }}-${{ runner.os }}-multi-buildx-${{ github.sha }}
50 | restore-keys: v1-${{ env.PROJECT_NAME }}-${{ runner.os }}-multi-buildx
51 | - name: Log in to GitHub Container Registry
52 | run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
53 | - name: Checkout
54 | uses: actions/checkout@v4
55 | with:
56 | lfs: true
57 | - name: Get Base Image Name
58 | id: get_base_image
59 | run: |
60 | source .env
61 | echo "base_image=${BASE_IMAGE}" >> $GITHUB_OUTPUT
62 | - name: Build and push
63 | uses: docker/build-push-action@v5
64 | with:
65 | context: .
66 | file: docker/Dockerfile
67 | push: true
68 | tags: ${{ steps.prep.outputs.tagged_image }}
69 | build-args: |
70 | BASE_IMAGE=${{ steps.get_base_image.outputs.base_image }}
71 | cache-from: type=local,src=/tmp/.buildx-cache
72 | # Note the mode=max here
73 | # More: https://github.com/moby/buildkit#--export-cache-options
74 | # And: https://github.com/docker/buildx#--cache-tonametypetypekeyvalue
75 | cache-to: type=local,mode=max,dest=/tmp/.buildx-cache-new
76 | # This is done to avoid storing outdated build steps in the cache, since
77 | # BuildKit never clears stuff out of the cache itself
78 | - name: Move cache
79 | run: |
80 | rm -rf /tmp/.buildx-cache
81 | mv /tmp/.buildx-cache-new /tmp/.buildx-cache
82 | outputs:
83 | tagged_image: ${{ steps.prep.outputs.tagged_image }}
84 |
85 | # This job runs the colcon tests for the project
86 | ros-tests:
87 | needs: build-image
88 | runs-on: ubuntu-24.04
89 | defaults:
90 | run:
91 | shell: bash
92 | timeout-minutes: 15
93 | container:
94 | image: ${{needs.build-image.outputs.tagged_image}}
95 | credentials:
96 | username: ${{ github.actor }}
97 | password: ${{ secrets.GITHUB_TOKEN }}
98 | env:
99 | GITHUB_WORKSPACE: /repo/
100 | steps:
101 | - name: Install Git LFS
102 | run: sudo apt-get update && sudo apt-get install git-lfs
103 | - name: Checkout
104 | uses: actions/checkout@v4
105 | with:
106 | lfs: true
107 | - name: Fix git error in the following line
108 | # Caused by an update to git that dislikes when users of different ID's touch
109 | # the same git directory, which happens when using a docker runner in CI
110 | # This fixes test reporting in the `Generate Test Report` section
111 | # https://github.blog/2022-04-12-git-security-vulnerability-announced/
112 | # https://github.com/actions/checkout/issues/760
113 | run: git config --global --add safe.directory ${GITHUB_WORKSPACE}
114 | - name: Copy Python dependencies to github workspace
115 | run: cp -r /robot/install install
116 | - name: Copy Build output to github workspace
117 |         run: cp -r /robot/build build
118 | - name: Move launch-profile files to /robot/launch-profiles so hardcoded paths work
119 | run: cp -r ${GITHUB_WORKSPACE}/launch-profiles /robot/
120 | - name: Run tests
121 | run: |
122 | enter-workspaces colcon test \
123 | --base-paths pkgs \
124 | --pytest-with-coverage \
125 | --test-result-base test-results \
126 | --pytest-args \
127 | --durations=50
128 | - name: Generate Test Report
129 | uses: dorny/test-reporter@v1
130 | if: always()
131 | with:
132 | name: "Test report"
133 | path: test-results/**/*.xml
134 | reporter: java-junit
135 | - name: Report Codecov Coverage
136 | uses: codecov/codecov-action@v3.1.0
137 | with:
138 | token: ${{ secrets.CODECOV_TOKEN }}
139 | {% endraw %}
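The branch/PR/tag detection in the `prep` step can be exercised locally. A sketch using a hypothetical `ref_to_tag` helper (the workflow inlines this logic in its `run:` block rather than defining a function):

```shell
# Standalone version of the tag-derivation logic in the 'prep' step above.
# ref_to_tag is a hypothetical name; the workflow inlines this in `run:`.
ref_to_tag() {
  local ref="$1" tag
  case "${ref}" in
    refs/heads/*) tag="${ref#refs/heads/}" ;;  # branch build
    refs/pull/*)  tag="${ref#refs/pull/}" ;;   # PR build
    refs/tags/*)  tag="${ref#refs/tags/}" ;;   # tagged build
    *) echo "Unexpected reference format ${ref}" >&2; return 1 ;;
  esac
  # Slashes aren't allowed in Docker tags, so swap them for dashes
  printf '%s\n' "${tag}" | sed 's#/#-#g'
}

ref_to_tag "refs/heads/feature/cool-thing"   # prints: feature-cool-thing
ref_to_tag "refs/tags/v1.2.3"                # prints: v1.2.3
```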
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/Dockerfile:
--------------------------------------------------------------------------------
1 | ARG BASE_IMAGE="this_variable_is_set_in_the_.env_file"
2 | FROM ${BASE_IMAGE}
3 |
4 | # Set configurable environment variables
5 | ARG ROS2_DISTRO=jazzy
6 | ARG POETRY_VERSION=1.8.4
7 | ARG COLCON_POETRY_ROS_VERSION=0.9.0
8 | ARG NODE_HELPERS_VERSION=v0.5.1
9 |
10 | # Set variables for basic locale and config, to make building easier
11 | ENV LANG=C.UTF-8
12 | ENV LC_ALL=C.UTF-8
13 | ENV DEBIAN_FRONTEND=noninteractive
14 | ENV PYTHON_VERSION=python3.12
15 | ENV PYTHONDONTWRITEBYTECODE=1
16 |
17 | # Set variables to allow Docker to display windows on the host machine
18 | ENV NVIDIA_VISIBLE_DEVICES=all
19 | ENV NVIDIA_DRIVER_CAPABILITIES=all
20 |
21 | # Set variables for caches
22 | ENV APT_CACHE=/var/cache/apt
23 | ENV PIP_CACHE=/root/.cache/pip
24 | ENV BUILD_CACHE=/colcon-build-cache/{{cookiecutter.__project_name_slug}}/
25 |
26 | # TODO: was opengl necessary?
27 |
28 | # Copy in useful scripts for use during the build, such as `add-apt-repo` and `install-packages`
29 | COPY docker/utils/environment/* /usr/local/bin/
30 |
31 | # Add the ROS2 repo
32 | RUN --mount=type=cache,target="${APT_CACHE}" \
33 | add-apt-repo \
34 | "deb [signed-by=/usr/share/keyrings/ros2-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu noble main" \
35 | "https://raw.githubusercontent.com/ros/rosdistro/master/ros.key"
36 |
37 | ######################################################################
38 | ########## NOTE TO TEMPLATE USERS: This is a good place to add any additional apt packages you need
39 | # Install ROS2 and other necessary packages
40 | RUN --mount=type=cache,target="${APT_CACHE}" \
41 | install-packages \
42 | # ROS2
43 | ros-${ROS2_DISTRO}-ros-base \
44 | ros-${ROS2_DISTRO}-rosbridge-suite \
45 | ros-${ROS2_DISTRO}-rmw-cyclonedds-cpp \
46 | # Build tools
47 | build-essential \
48 | git \
49 | git-lfs \
50 | # Complementary ROS2 Python tools
51 | python3-colcon-common-extensions \
52 | python3-colcon-mixin \
53 | python3-rosdep \
54 | # Poetry dependencies \
55 | python3-pip \
56 | python-is-python3 \
57 | python3-venv
58 | ######################################################################
59 |
60 | # Install colcon-poetry-ros, a tool by Urban Machine for building ROS2 packages with Poetry
61 | RUN --mount=type=cache,target="${PIP_CACHE}" \
62 | pip3 install --break-system-packages "colcon-poetry-ros==${COLCON_POETRY_ROS_VERSION}"
63 | RUN curl -fsSL https://install.python-poetry.org --output /tmp/install-poetry.py \
64 | && POETRY_HOME=/usr/local python3 /tmp/install-poetry.py --version "${POETRY_VERSION}"
65 | RUN poetry self add poetry-plugin-bundle
66 |
67 | # Add the ROS core setup script to the 'workspaces' command
68 | RUN add-workspace /opt/ros/${ROS2_DISTRO}/setup.bash
69 |
70 | # Set up colcon so it can install mixins. This is required for `colcon-poetry-ros`
71 | RUN colcon mixin add default \
72 | https://raw.githubusercontent.com/colcon/colcon-mixin-repository/master/index.yaml && \
73 | colcon mixin update && \
74 | colcon metadata add default \
75 | https://raw.githubusercontent.com/colcon/colcon-metadata-repository/master/index.yaml && \
76 | colcon metadata update
77 |
78 | # Initialize Rosdep for automatically downloading dependencies from `package.xml` in pkgs
79 | WORKDIR /robot
80 | RUN rosdep init
81 | RUN rosdep update --rosdistro ${ROS2_DISTRO}
82 |
83 | ######################################################################
84 | ########## Add package.xml's of each package
85 | ########## NOTE TO TEMPLATE USERS: When adding a new package, add a new line here copying the package.xml
86 | COPY pkgs/{{cookiecutter.example_package_name}}/package.xml pkgs/{{cookiecutter.example_package_name}}/package.xml
87 | COPY pkgs/{{cookiecutter.__example_messages_package_name}}/package.xml pkgs/{{cookiecutter.__example_messages_package_name}}/package.xml
88 | ######################################################################
89 |
90 | # Install rosdep dependencies for each package
91 | RUN --mount=type=cache,target="${PIP_CACHE}" \
92 | --mount=type=cache,target="${APT_CACHE}" \
93 | with-package-list \
94 | rosdep install -i --from-path pkgs --rosdistro "${ROS2_DISTRO}" -y
95 |
96 | ######################################################################
97 | ########## Add poetry.lock and pyproject.toml files of each package
98 | ########## NOTE TO TEMPLATE USERS: When adding a new package, add a new line here copying the poetry.lock and pyproject.toml files
99 | COPY pkgs/{{cookiecutter.example_package_name}}/poetry.lock pkgs/{{cookiecutter.example_package_name}}/poetry.lock
100 | COPY pkgs/{{cookiecutter.example_package_name}}/pyproject.toml pkgs/{{cookiecutter.example_package_name}}/pyproject.toml
101 | ######################################################################
102 |
103 | ######################################################################
104 | ########## Add Git ROS2 Packages
105 | ########## NOTE TO TEMPLATE USERS: If you need to depend on a package that is not in the ROS2 distro, you can add it here
106 | WORKDIR /ros-git-deps/
107 | RUN --mount=type=cache,target="${PIP_CACHE}" \
108 | --mount=type=cache,target="${APT_CACHE}" \
109 | install-ros-package-from-git \
110 | https://github.com/UrbanMachine/node_helpers.git $NODE_HELPERS_VERSION pkgs && \
111 | ##################### Add your packages here!
112 | ########### install-ros-package-from-git {URL} {BRANCH} {PKGS PATH IN REPO}
113 | echo "Done installing ROS2 packages from git"
114 | ######################################################################
115 |
116 | # Install Poetry dependencies for each package in this repo
117 | WORKDIR /robot
118 | RUN --mount=type=cache,target="${PIP_CACHE}" \
119 | python3 -m colcon_poetry_ros.dependencies.install --base-paths pkgs
120 |
121 | # Add ROS2 libraries to Python's path ahead of enter-workspaces to help developer tools
122 | # like PyCharm or VS Code understand where packages are. It also breaks dependency isolation a bit,
123 | # but it's a necessary evil for now.
124 | RUN echo "/opt/ros/${ROS2_DISTRO}/lib/${PYTHON_VERSION}/site-packages" >> /usr/local/lib/${PYTHON_VERSION}/dist-packages/ros2.pth
125 | RUN echo "/opt/ros/${ROS2_DISTRO}/local/lib/${PYTHON_VERSION}/dist-packages" >> /usr/local/lib/${PYTHON_VERSION}/dist-packages/ros2.pth
126 | RUN make-pth-file-from-workspace "$(pwd)/install" /usr/local/lib/${PYTHON_VERSION}/dist-packages/robot.pth
127 |
128 | # Move the build cache from a Docker cache mount to a place where our build
129 | # system can see it. This helps make `colcon build` faster between runs.
130 | RUN --mount=type=cache,target="${BUILD_CACHE}" restore-build-cache
131 |
132 | # Build all packages, copying project files under a root `/robot/` directory in the container
133 | COPY launch-profiles /robot/launch-profiles
134 | COPY pkgs /robot/pkgs
135 | RUN enter-workspaces colcon build --packages-select $(ls pkgs)
136 | RUN add-workspace /robot/install/setup.bash
137 |
138 | # Move the build cache back into the Docker cache mount so it can be reused on
139 | # future runs
140 | RUN --mount=type=cache,target="${BUILD_CACHE}" save-build-cache
141 |
142 | # Make ROS tools available by default
143 | ENTRYPOINT [ "enter-workspaces" ]
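`save-build-cache` and `restore-build-cache` live in `docker/utils/environment/` and aren't shown here; a minimal sketch of the copy-through-a-cache-mount idea they implement, with fabricated paths:

```shell
# Sketch of the save/restore build-cache pattern used above. A Docker cache
# mount only exists during a RUN step, so the build tree is copied into it
# (save) and back out on the next build (restore). All paths are fabricated.
cache_mount="$(mktemp -d)"        # stands in for the --mount=type=cache target
first_build="$(mktemp -d)/build"
second_build="$(mktemp -d)/build"

# First build: produce an artifact, then "save" it into the cache mount
mkdir -p "${first_build}"
echo "compiled-object" > "${first_build}/lib.o"
cp -r "${first_build}/." "${cache_mount}/"

# Second build: "restore" the cached tree before running colcon build
mkdir -p "${second_build}"
cp -r "${cache_mount}/." "${second_build}/"
cat "${second_build}/lib.o"       # prints: compiled-object
```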
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/launch-profiles/{{cookiecutter.example_launch_profile}}/rviz-config.rviz:
--------------------------------------------------------------------------------
1 | Panels:
2 | - Class: rviz_common/Displays
3 | Help Height: 78
4 | Name: Displays
5 | Property Tree Widget:
6 | Expanded:
7 | - /InteractiveMarkers1
8 | - /InteractiveTransforms1
9 | Splitter Ratio: 0.5
10 | Tree Height: 1108
11 | - Class: rviz_common/Selection
12 | Name: Selection
13 | - Class: rviz_common/Tool Properties
14 | Expanded:
15 | - /2D Goal Pose1
16 | - /Publish Point1
17 | Name: Tool Properties
18 | Splitter Ratio: 0.5886790156364441
19 | - Class: rviz_common/Views
20 | Expanded:
21 | - /Current View1
22 | Name: Views
23 | Splitter Ratio: 0.5
24 | - Class: rviz_common/Time
25 | Experimental: false
26 | Name: Time
27 | SyncMode: 0
28 | SyncSource: ""
29 | Visualization Manager:
30 | Class: ""
31 | Displays:
32 | - Alpha: 0.5
33 | Cell Size: 1
34 | Class: rviz_default_plugins/Grid
35 | Color: 160; 160; 164
36 | Enabled: true
37 | Line Style:
38 | Line Width: 0.029999999329447746
39 | Value: Lines
40 | Name: Grid
41 | Normal Cell Count: 0
42 | Offset:
43 | X: 0
44 | Y: 0
45 | Z: 0
46 | Plane: XY
47 | Plane Cell Count: 10
48 | Reference Frame:
49 | Value: true
50 | - Alpha: 1
51 | Class: rviz_default_plugins/RobotModel
52 | Collision Enabled: false
53 | Description File: ""
54 | Description Source: Topic
55 | Description Topic:
56 | Depth: 5
57 | Durability Policy: Volatile
58 | History Policy: Keep Last
59 | Reliability Policy: Reliable
60 | Value: /example_node_namespace/urdf_0/robot_description
61 | Enabled: true
62 | Links:
63 | All Links Enabled: true
64 | Expand Joint Details: false
65 | Expand Link Details: false
66 | Expand Tree: false
67 | Link Tree Style: Links in Alphabetic Order
68 | example_node_namespace.forklift_body:
69 | Alpha: 1
70 | Show Axes: false
71 | Show Trail: false
72 | Value: true
73 | example_node_namespace.forks:
74 | Alpha: 1
75 | Show Axes: false
76 | Show Trail: false
77 | Value: true
78 | example_node_namespace.forks_origin:
79 | Alpha: 1
80 | Show Axes: false
81 | Show Trail: false
82 | Value: true
83 | Mass Properties:
84 | Inertia: false
85 | Mass: false
86 | Name: ExampleURDF
87 | TF Prefix: ""
88 | Update Interval: 0
89 | Value: true
90 | Visual Enabled: true
91 | - Class: rviz_default_plugins/InteractiveMarkers
92 | Enable Transparency: true
93 | Enabled: true
94 | Interactive Markers Namespace: ""
95 | Name: InteractiveMarkers
96 | Show Axes: false
97 | Show Descriptions: true
98 | Show Visual Aids: false
99 | Value: true
100 | - Class: rviz_default_plugins/InteractiveMarkers
101 | Enable Transparency: true
102 | Enabled: true
103 | Interactive Markers Namespace: /urdf_arrangement
104 | Name: InteractiveTransforms
105 | Show Axes: false
106 | Show Descriptions: true
107 | Show Visual Aids: false
108 | Value: true
109 | Enabled: true
110 | Global Options:
111 | Background Color: 48; 48; 48
112 | Fixed Frame: base_link
113 | Frame Rate: 30
114 | Name: root
115 | Tools:
116 | - Class: rviz_default_plugins/Interact
117 | Hide Inactive Objects: true
118 | - Class: rviz_default_plugins/MoveCamera
119 | - Class: rviz_default_plugins/Select
120 | - Class: rviz_default_plugins/FocusCamera
121 | - Class: rviz_default_plugins/Measure
122 | Line color: 128; 128; 0
123 | - Class: rviz_default_plugins/SetInitialPose
124 | Covariance x: 0.25
125 | Covariance y: 0.25
126 | Covariance yaw: 0.06853891909122467
127 | Topic:
128 | Depth: 5
129 | Durability Policy: Volatile
130 | History Policy: Keep Last
131 | Reliability Policy: Reliable
132 | Value: /initialpose
133 | - Class: rviz_default_plugins/SetGoal
134 | Topic:
135 | Depth: 5
136 | Durability Policy: Volatile
137 | History Policy: Keep Last
138 | Reliability Policy: Reliable
139 | Value: /goal_pose
140 | - Class: rviz_default_plugins/PublishPoint
141 | Single click: true
142 | Topic:
143 | Depth: 5
144 | Durability Policy: Volatile
145 | History Policy: Keep Last
146 | Reliability Policy: Reliable
147 | Value: /clicked_point
148 | Transformation:
149 | Current:
150 | Class: rviz_default_plugins/TF
151 | Value: true
152 | Views:
153 | Current:
154 | Class: rviz_default_plugins/Orbit
155 | Distance: 8.649819374084473
156 | Enable Stereo Rendering:
157 | Stereo Eye Separation: 0.05999999865889549
158 | Stereo Focal Distance: 1
159 | Swap Stereo Eyes: false
160 | Value: false
161 | Focal Point:
162 | X: -0.4261273145675659
163 | Y: -0.3684071898460388
164 | Z: 0.5727261900901794
165 | Focal Shape Fixed Size: true
166 | Focal Shape Size: 0.05000000074505806
167 | Invert Z Axis: false
168 | Name: Current View
169 | Near Clip Distance: 0.009999999776482582
170 | Pitch: 0.5403984189033508
171 | Target Frame:
172 | Value: Orbit (rviz)
173 | Yaw: 1.0753980875015259
174 | Saved: ~
175 | Window Geometry:
176 | Displays:
177 | collapsed: false
178 | Height: 1331
179 | Hide Left Dock: false
180 | Hide Right Dock: false
181 | QMainWindow State: 000000ff00000000fd0000000400000000000001c5000004ddfc0200000008fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003b000004dd000000c700fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261000000010000010f00000499fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003b00000499000000a000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000009f40000003efc0100000002fb0000000800540069006d00650000000000000009f40000025300fffffffb0000000800540069006d0065010000000000000450000000000000000000000829000004dd00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
182 | Selection:
183 | collapsed: false
184 | Time:
185 | collapsed: false
186 | Tool Properties:
187 | collapsed: false
188 | Views:
189 | collapsed: false
190 | Width: 2548
191 | X: 8
192 | Y: 64
193 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # create-ros-app
2 |
3 | A template for creating robust ROS2 (Robot Operating System) applications, designed for large-scale, end-to-end robotics projects. This template is optimized for maintaining consistency and quality across multiple ROS2 packages and project components.
4 |
5 | **Featured Projects Using This Template**
6 |
7 | - [Pi At Home](https://github.com/apockill/pi_at_home/): A WIP synthetic data generation project for training AI models for robotic trajectory generation
8 | - [Urban Machine in Isaac Sim](https://github.com/UrbanMachine/isaac-um-factory): A project for simulating Urban Machine's robots in Isaac Sim
9 | - [Node Helpers](https://github.com/UrbanMachine/node_helpers): A library (that this template includes by default) for simplifying many common ROS robotics needs
10 | - Your project: please create an issue to show off your project, and we'll include it here!
11 |
12 |
13 | ## What? Why?
14 |
15 | When you build a robot, you have to juggle a **ton** of opinionated decisions before you so much as tell an actuator to move.
16 |
17 | This template helps streamline the development of ROS2 applications by setting up a standardized project structure, tooling, and configuration for ROS2 packages, ensuring that every project starts with consistent settings and follows best practices. It also includes a centralized linting and testing setup to ensure code quality across packages.
18 |
19 | ## This feels like a lot. Where do I even start?
20 |
21 | Don't worry. All of these features will feel natural if you just stick to the scripts and
22 | commands in the Scripts section of [about_template.md](%7B%7Bcookiecutter.project_name%7D%7D/docs/about_template.md).
23 |
24 | Start by editing code in the example nodes generated under `pkgs`, then play around with
25 | the example `launcher.py` files under `launch-profiles/` to see how to launch your nodes.
26 |
27 | Eventually, you'll be comfortable with all the features of this template, and you'll be able to
28 | edit and add to them as you see fit.
29 |
30 | ## Features
31 |
32 | The full documentation for the project's features can be found in [about_template.md](%7B%7Bcookiecutter.project_name%7D%7D/docs/about_template.md).
33 |
34 | - **Containerized ROS2 Environment**: The project only needs `docker` to run. You only need to know a few simple commands to launch your project.
35 | - **Launch** specific ROS2 components or full stacks with `docker/launch`.
36 | - **Run** components in isolation using `docker/run`.
37 | - **Execute** commands within running containers using `docker/exec`.
38 | - **Test** all ROS2 packages with `docker/test`.
39 | - **Display and Sound Passthrough**: Projects come pre-configured with `x11` and `pulseaudio` passthrough, so you can run GUI applications and hear sound from within the container.
40 | - **Logging Made Easy**: Projects come pre-configured with `grafana`, `loki` and `promtail`, so you can search and browse logs easily under `http://localhost` after launching.
41 | - **Standardized ROS2 Project Structure**:
42 |   - Pre-organized packages under `pkgs/`.
43 |   - The `launch-profiles/` directory lets you create separate ROS2 "apps" where launch files, configuration, and model files can live in one place, separate from the package code.
44 | - **Dependencies are Organized and Automatically Installed**: ROS2 dependencies go in a `package.xml`, Python dependencies go in the `pyproject.toml`, and the `Dockerfile` has a spot for apt dependencies.
45 | - **Centralized Linting and Testing**: A preconfigured linter tool for Python, C++, and bash runnable via `poetry run lint`
46 | - **GitHub Actions CI/CD**: Pre-configured workflows for continuous integration, including linting, testing, and optional Codecov integration to monitor code coverage.
47 | - **Cruft Integration for Template Sync**: Ensures projects remain up-to-date with the latest template improvements, allowing the team to adopt new best practices with minimal effort.
48 |
49 |
50 | ## Quick Start Guide
51 |
52 | Here's how you create your first ROS2 app:
53 |
54 | 1. Install Cruft (if not already installed):
55 |
56 | This tool allows you to create projects from 'cookiecutter' repositories, such as this one.
57 | ```shell
58 | pip install cruft
59 | ```
60 | 2. Initialize the template. This will create a new directory where the project files will
61 | be dumped.
62 |
63 |    Fill in the form with the appropriate values for your project. Each prompt shows a
64 |    default value that will be used if you don't enter your own.
65 | ```shell
66 | cruft create https://github.com/UrbanMachine/create-ros-app.git
67 | ```
68 | 3. Check that everything is synced with the template repository
69 | ```shell
70 | cruft check
71 | ```
72 |
73 |
74 | ### Updating a Template on an Existing Project
75 | To pull in the latest template changes:
76 | 1. Run the following:
77 | ```shell
78 | cruft update --allow-untracked-files
79 | ```
80 | 2. Follow the prompts and resolve any merge conflicts that arise.
81 |
82 | ## Post Set-Up Guide
83 | ### Lock the Root pyproject.toml File
84 |
85 | This project requires Poetry for linting. You can get it [here](https://python-poetry.org/docs/).
86 |
87 | After adding the template to your project, you should lock the project's dependencies to ensure that all developers use the same versions. Poetry generates a `poetry.lock` file, which should be committed to the repository. To lock the dependencies, run:
88 | ```shell
89 | poetry lock
90 | ```
91 |
92 | Then commit the `poetry.lock` file to the repository.
93 |
94 | ### Commit the generated `.env` File
95 |
96 | A `.env` file is generated by the template to store important environment variables. These are
97 | automatically read by `docker-compose` when you run the project.
98 |
99 | However, it's added to the `.gitignore` file because `.env` files can contain sensitive
100 | information.
101 |
102 | Commit the first generated `.env` file to the repository.
103 |
104 | ```shell
105 | git add .env
106 | git commit -m "Add .env file"
107 | ```
108 |
109 | ### Fixing the `lint` GitHub Action
110 | This template automatically runs CI via GitHub Actions on every pull request.
111 |
112 | The CI uses cruft to check whether there have been upstream changes on the template repository.
113 | Depending on how you clone the repository, you might get the following error:
114 |
115 | ```shell
116 | ╭─ Error ──────────────────────────────────────────────────────────────────────╮
117 | │ Unable to initialize the cookiecutter using │
118 | │ git@github.com:UrbanMachine/create-ros-app.git! Failed to clone the repo. │
119 | │ stderr: 'Cloning into '/tmp/tmpavykj68r'... │
120 | │ git@github.com: Permission denied (publickey). │
121 | │ fatal: Could not read from remote repository. │
122 | │ │
123 | │ Please make sure you have the correct access rights │
124 | │ and the repository exists. │
125 | │ ' │
126 | ╰──────────────────────────────────────────────────────────────────────────────╯
127 | ```
128 |
129 | If you do, it's because GitHub Actions is trying to use SSH to clone the template repo,
130 | and failing. To fix this, edit the `template` key in your `.cruft.json` so it points to the
131 | repository using `https://...your-url...git`
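As a concrete sketch, assuming your `.cruft.json` currently points at the default SSH-form template URL, the rewrite can be done with GNU `sed` (back up the file first):

```shell
# Rewrite the SSH-style template URL in .cruft.json to the HTTPS form.
# Assumes the template key currently reads git@github.com:UrbanMachine/create-ros-app.git
sed -i 's#git@github\.com:UrbanMachine/create-ros-app\.git#https://github.com/UrbanMachine/create-ros-app.git#' .cruft.json
```

You can equally make the same one-line edit by hand in any text editor.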
132 |
133 | ### Fixing Test+Build Workflow
134 |
135 | This template will set up a CI pipeline that automatically builds images of the latest
136 | commit on each branch, and tags them `YOUR_GITHUB_ORG/PROJECT_NAME:BRANCH_NAME`.
137 |
138 | However, it will initially fail until you give your workflow permission to push to the
139 | GitHub Docker registry.
140 |
141 | You'll see something like:
142 |
143 | ```shell
144 | #36 exporting to image
145 | #36 pushing layers 0.2s done
146 | #36 ERROR: failed to push ghcr.io/YOUR_GITHUB_ORG/PROJECT_NAME:BRANCH_NAME: unexpected status from POST request to https://ghcr.io/v2/.../.../blobs/uploads/: 403 Forbidden
147 | ```
148 |
149 | To fix this, go to the following page and enable "Read and write permissions" as shown below.
150 | https://github.com/YOUR_GITHUB_ORG/PROJECT_NAME/settings/actions
151 |
152 | 
153 |
154 | ### Optional: Adding Codecov Support
155 | Codecov lets your project report test coverage on every pull request. This process requires being an admin on the GitHub org this project lives in.
156 |
157 | 1. Sign in to [Codecov](https://about.codecov.io/sign-up/) with your GitHub account.
158 | 2. Under your repository in Codecov, select "Configure".
159 | 3. Get a Codecov API token. Settings can be found under:
160 | https://app.codecov.io/gh/GITHUB_ORG/PROJECT_NAME/
161 | 4. Add a secret named `CODECOV_TOKEN` to your GitHub repository's secrets. Settings can be found under:
162 | https://github.com/GITHUB_ORG/PROJECT_NAME/settings/secrets/actions
163 | 5. You should now be able to see code coverage under Codecov!
164 |
165 |
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docker/grafana/dashboard_data/logs_dashboard.json:
--------------------------------------------------------------------------------
1 | {
2 | "annotations": {
3 | "list": [
4 | {
5 | "builtIn": 1,
6 | "datasource": {
7 | "type": "grafana",
8 | "uid": "-- Grafana --"
9 | },
10 | "enable": true,
11 | "hide": true,
12 | "iconColor": "rgba(0, 211, 255, 1)",
13 | "name": "Annotations & Alerts",
14 | "target": {
15 | "limit": 100,
16 | "matchAny": false,
17 | "tags": [],
18 | "type": "dashboard"
19 | },
20 | "type": "dashboard"
21 | }
22 | ]
23 | },
24 | "editable": true,
25 | "fiscalYearStartMonth": 0,
26 | "graphTooltip": 0,
27 | "id": 1,
28 | "links": [],
29 | "liveNow": true,
30 | "panels": [
31 | {
32 | "datasource": {
33 | "type": "loki",
34 | "uid": "P8E80F9AEF21F6940"
35 | },
36 | "fieldConfig": {
37 | "defaults": {
38 | "color": {
39 | "mode": "continuous-GrYlRd"
40 | },
41 | "custom": {
42 | "axisBorderShow": false,
43 | "axisCenteredZero": false,
44 | "axisColorMode": "text",
45 | "axisLabel": "",
46 | "axisPlacement": "auto",
47 | "axisSoftMin": 0,
48 | "barAlignment": 0,
49 | "barWidthFactor": 0.6,
50 | "drawStyle": "line",
51 | "fillOpacity": 43,
52 | "gradientMode": "scheme",
53 | "hideFrom": {
54 | "legend": false,
55 | "tooltip": false,
56 | "viz": false
57 | },
58 | "insertNulls": false,
59 | "lineInterpolation": "smooth",
60 | "lineStyle": {
61 | "fill": "solid"
62 | },
63 | "lineWidth": 1,
64 | "pointSize": 10,
65 | "scaleDistribution": {
66 | "type": "linear"
67 | },
68 | "showPoints": "always",
69 | "spanNulls": 10000,
70 | "stacking": {
71 | "group": "A",
72 | "mode": "none"
73 | },
74 | "thresholdsStyle": {
75 | "mode": "off"
76 | }
77 | },
78 | "decimals": 0,
79 | "mappings": [],
80 | "min": 0,
81 | "thresholds": {
82 | "mode": "absolute",
83 | "steps": [
84 | {
85 | "color": "green",
86 | "value": null
87 | },
88 | {
89 | "color": "red",
90 | "value": 80
91 | }
92 | ]
93 | }
94 | },
95 | "overrides": []
96 | },
97 | "gridPos": {
98 | "h": 8,
99 | "w": 24,
100 | "x": 0,
101 | "y": 0
102 | },
103 | "id": 1,
104 | "options": {
105 | "legend": {
106 | "calcs": [],
107 | "displayMode": "list",
108 | "placement": "bottom",
109 | "showLegend": false
110 | },
111 | "tooltip": {
112 | "mode": "single",
113 | "sort": "none"
114 | }
115 | },
116 | "pluginVersion": "11.3.0",
117 | "targets": [
118 | {
119 | "datasource": {
120 | "type": "loki",
121 | "uid": "P8E80F9AEF21F6940"
122 | },
123 | "editorMode": "code",
124 | "expr": "sum(count_over_time({container=~\"(.*$container_name.*)\"} |~ `(?i)$search_term` !~ `(?i)$search_exclude_term` | level =~ `$log_level` | node_executable =~ `.*$node_executable_search.*` | node_name =~ `.*$node_name_search.*` [$__interval]))",
125 | "queryType": "range",
126 | "refId": "A"
127 | }
128 | ],
129 | "title": "Log Frequency",
130 | "type": "timeseries"
131 | },
132 | {
133 | "datasource": {
134 | "type": "loki",
135 | "uid": "P8E80F9AEF21F6940"
136 | },
137 | "fieldConfig": {
138 | "defaults": {},
139 | "overrides": []
140 | },
141 | "gridPos": {
142 | "h": 20,
143 | "w": 24,
144 | "x": 0,
145 | "y": 8
146 | },
147 | "id": 2,
148 | "options": {
149 | "dedupStrategy": "exact",
150 | "enableLogDetails": true,
151 | "prettifyLogMessage": true,
152 | "showCommonLabels": true,
153 | "showLabels": true,
154 | "showTime": false,
155 | "sortOrder": "Descending",
156 | "wrapLogMessage": true
157 | },
158 | "pluginVersion": "11.3.0",
159 | "targets": [
160 | {
161 | "datasource": {
162 | "type": "loki",
163 | "uid": "P8E80F9AEF21F6940"
164 | },
165 | "editorMode": "builder",
166 | "expr": "{container=~\".*$container_name.*\"} |~ `(?i)$search_term` !~ `(?i)$search_exclude_term` | level =~ `$log_level` | node_executable =~ `.*$node_executable_search.*` | node_name =~ `.*$node_name_search.*`",
167 | "queryType": "range",
168 | "refId": "A"
169 | }
170 | ],
171 | "title": "Logs",
172 | "type": "logs"
173 | }
174 | ],
175 | "preload": false,
176 | "refresh": "auto",
177 | "schemaVersion": 40,
178 | "tags": [],
179 | "templating": {
180 | "list": [
181 | {
182 | "allValue": ".*",
183 | "current": {
184 | "text": [
185 | "All"
186 | ],
187 | "value": [
188 | "$__all"
189 | ]
190 | },
191 | "datasource": {
192 | "type": "loki",
193 | "uid": "P8E80F9AEF21F6940"
194 | },
195 | "definition": "",
196 | "description": "Filter out logs based on their printed log level",
197 | "includeAll": true,
198 | "label": "Log Level",
199 | "multi": true,
200 | "name": "log_level",
201 | "options": [],
202 | "query": {
203 | "label": "level",
204 | "refId": "LokiVariableQueryEditor-VariableQuery",
205 | "stream": "",
206 | "type": 1
207 | },
208 | "refresh": 2,
209 | "regex": "",
210 | "type": "query"
211 | },
212 | {
213 | "current": {
214 | "text": "",
215 | "value": ""
216 | },
217 |       "description": "Type plain text here to find logs that include this text.",
218 | "label": "Search",
219 | "name": "search_term",
220 | "options": [
221 | {
222 | "selected": true,
223 | "text": "",
224 | "value": ""
225 | }
226 | ],
227 | "query": "",
228 | "type": "textbox"
229 | },
230 | {
231 | "current": {
232 | "text": "",
233 | "value": ""
234 | },
235 | "description": "Filter out logs based on the name of the node that the log came from.",
236 | "label": "Node Name",
237 | "name": "node_name_search",
238 | "options": [
239 | {
240 | "selected": true,
241 | "text": "",
242 | "value": ""
243 | }
244 | ],
245 | "query": "",
246 | "type": "textbox"
247 | },
248 | {
249 | "current": {
250 | "text": "",
251 | "value": ""
252 | },
253 | "description": "Filter out logs based on the name of the executable that the log came from.",
254 | "label": "Process Name",
255 | "name": "node_executable_search",
256 | "options": [
257 | {
258 | "selected": true,
259 | "text": "",
260 | "value": ""
261 | }
262 | ],
263 | "query": "",
264 | "type": "textbox"
265 | },
266 | {
267 | "current": {
268 | "text": "",
269 | "value": ""
270 | },
271 | "description": "Text entered here will be used to exclude logs that have matching lines.",
272 | "label": "Exclude",
273 | "name": "search_exclude_term",
274 | "options": [
275 | {
276 | "selected": true,
277 | "text": "",
278 | "value": ""
279 | }
280 | ],
281 | "query": "",
282 | "type": "textbox"
283 | },
284 | {
285 | "current": {
286 | "text": "nodes",
287 | "value": "nodes"
288 | },
289 | "includeAll": false,
290 | "label": "Containers",
291 | "name": "container_name",
292 | "options": [
293 | {
294 | "selected": true,
295 | "text": "nodes",
296 | "value": "nodes"
297 | },
298 | {
299 | "selected": false,
300 | "text": "promtail",
301 | "value": "promtail"
302 | },
303 | {
304 | "selected": false,
305 | "text": "grafana",
306 | "value": "grafana"
307 | },
308 | {
309 | "selected": false,
310 | "text": "loki",
311 | "value": "loki"
312 | }
313 | ],
314 | "query": "nodes, promtail, grafana, loki",
315 | "type": "custom"
316 | }
317 | ]
318 | },
319 | "time": {
320 | "from": "now-5m",
321 | "to": "now"
322 | },
323 | "timepicker": {
324 | "nowDelay": "0m",
325 | "refresh_intervals": [
326 | "1s",
327 | "2s",
328 | "3s",
329 | "5s",
330 | "10s",
331 | "30s",
332 | "1m",
333 | "5m",
334 | "15m",
335 | "30m",
336 | "1h",
337 | "2h",
338 | "1d"
339 | ]
340 | },
341 | "timezone": "",
342 | "title": "Logs",
343 | "uid": "logs_dashboard",
344 | "version": 1,
345 | "weekStart": ""
346 | }
--------------------------------------------------------------------------------
/{{cookiecutter.project_name}}/docs/about_template.md:
--------------------------------------------------------------------------------
1 | # Using `create-ros-app`
2 |
3 | This repository was initialized by the [create-ros-app](https://github.com/UrbanMachine/create-ros-app) template.
4 | This template is an everything-but-the-robot-code starter for ROS projects. It includes a Dockerfile for building ROS packages, a GitHub Actions workflow for linting and autoformatting, and many other goodies.
5 |
6 | This documentation walks through the features of the template, and how to use them.
7 |
8 | **Make sure to follow the [Post-Set-Up Guide](https://github.com/UrbanMachine/create-ros-app/blob/main/README.md#post-set-up-guide) after setting up the template before diving into the features below!**
9 |
10 | ## Quick Start Guide
11 |
12 | Here's a quick guide to the features of this template:
13 |
14 | ### Scripts
15 | - Linting and autoformatting for Python (ruff), C++ (clang-format), and bash (shellcheck)
16 |
17 | **Relevant Scripts:**
18 | ```shell
19 | poetry run lint --help
20 | ```
21 | You may need to run `poetry lock` and `poetry install` before running the above command.
22 | - Easily build ROS apps in an out-of-the-box containerized environment, with helper scripts under the `docker/` directory.
23 |
24 | **Relevant Scripts:**
25 | ```shell
26 | # Run a profile specified under `launch-profiles/`
27 | docker/launch
28 |
29 | # Run and enter a new ROS container without executing anything
30 | docker/run
31 |
32 | # Enter a currently running ROS container to poke around
33 | docker/exec
34 |
35 | # Rebuild and restart the ROS nodes in the container, useful for fast development
36 | docker/reload-ros-nodes
37 | ```
38 |
39 | More usage examples for the above scripts are documented at the top of the script files.
40 |
41 | ### Things You Likely Want To Do
42 |
43 | - **Create a new package**: It's recommended to start by developing with the example package that the
44 | template generates. Once you're familiar with the template and want to create a second package,
45 | you can follow the steps under [Adding a New Package](#adding-a-new-package).
46 | - **Add new dependencies:**
47 | - **System dependencies:** can be added under `docker/Dockerfile`, under the `install-packages` section.
48 | - **Python dependencies:** can be added under `pkgs//pyproject.toml`. Run `poetry lock` after.
49 | - **ROS dependencies:** can be added under `pkgs//package.xml`.
50 | - **ROS Git dependencies:** If you need to add a ROS package that isn't available in the ROS package index, you can add it as a git dependency in the `docker/Dockerfile` under the `Add Git ROS2 Packages` section.
51 |
52 | After any changes, the container will automatically re-build on the next launch.
53 | - **Save persistent data:** The `/robot/persistent` directory is mounted to `.docker_volumes/ros-nodes/` on your local machine. Save data there to persist across container runs, and to view it in your file manager.
54 | - **Look at logs:** Logs are available at `http://localhost/` after running `docker/launch`.
55 | - **Create a new Launch Profile:**
56 |   - Add a directory under `launch-profiles/`
57 |   - Add a `launching.py` file inside your new profile directory
58 |   - Fill out the `launching.py` file with the nodes you want to launch.
59 | - The directory you created will be mounted under `/robot/launch-profile/` at launch. This is a great place to store configuration files, URDFs, and other specifics for your launch profile.
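From the repository root, the launch-profile steps above can be sketched as follows; the profile name `my_profile` is a placeholder:

```shell
# Create a new launch profile directory with an empty launching.py.
# "my_profile" is a hypothetical profile name; pick your own.
mkdir -p launch-profiles/my_profile
touch launch-profiles/my_profile/launching.py
```

The `launching.py` file then needs to be filled out with the nodes you want to launch, following the example profile generated by the template.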
60 |
61 |
62 | ## Container Structure
63 |
64 | ### `/robot` Directory
65 |
66 | The `/robot` directory is where your source code lives after building the container.
67 | You can find your:
68 | - `pkgs/` directory
69 | - `build/`, `install/` from the colcon build
70 |
71 | ### `/robot/launch-profile/` Directory
72 | A directory pointing to the launch profile chosen in `docker/launch`
73 | is mounted under `/robot/launch-profile/`.
74 |
75 | ### `/robot/persistent/` Directory
76 | For saving persistent data, `/robot/persistent` is mounted to `.docker_volumes/ros-nodes/` on your local machine.
77 | This is a great place to save any serialized data, databases, etc. For static configuration,
78 | it's recommended to use the `launch-profile` directory.
79 |
80 | ### `/ros-git-deps/` Directory
81 |
82 | In the `Dockerfile`, there's a section for adding ROS packages that aren't available in
83 | the ROS package index. These are added as git dependencies. The `/ros-git-deps/`
84 | directory is where these packages are cloned and built.
85 |
86 |
87 | ## Project Structure
88 |
89 | ### `pkgs/`
90 |
91 | The packages directory contains all the packages that are used in the project. Each package is added in the `Dockerfile`, and any new packages should be added there as well.
92 |
93 | #### Python Package structure
94 | Each python package is made up of:
95 | - A `resource` directory, which is a colcon requirement
96 | - A `package.xml` file, which is a colcon requirement
97 | - A `pyproject.toml`, because this project uses [colcon-poetry-ros](https://github.com/UrbanMachine/colcon-poetry-ros) to install project dependencies. Most ROS python packages use `setup.py`, but by using this plugin, we can use a modern python tool called [Poetry](https://python-poetry.org/) to manage dependencies.
98 | - A directory for code
99 | - A directory for tests
100 |
101 | ##### Test directories
102 |
103 | As an (arbitrary) best practice, the example package uses a test directory with the following structure:
104 |
105 | ```shell
106 |
107 | package_name/
108 | ├── package_name/
109 | │ ├── __init__.py
110 | │ ├── node.py
111 | ├── package_name_test/
112 | │ ├── unit/
113 | │ │ ├── test_node.py
114 | │ ├── integration/
115 | │ │ ├── test_node.py
116 |
117 | ```
118 |
119 | Essentially, tests exist in a parallel directory to the package, and are split into `unit` and `integration` tests. The directories within `unit` and `integration` mirror the structure of the package itself, except that module names are prefixed with `test_`.
120 |
121 | #### Message Package Structure
122 |
123 | The template will generate a message package for you with an `ExampleAction`, `ExampleService`, and `ExampleMessage`. You can add more messages by adding them to the `msg` directory and updating the `CMakeLists.txt` and `package.xml` files.
124 |
125 | This can be used as a place for you to store your messages used just for this project. It follows standard ROS2 message package structure.
126 |
127 | ### `.github/`
128 |
129 | This project uses poetry for linting, and has some code for running linting and autoformatting under `.github/lint`.
130 | Run `poetry install`, then enter the shell via `poetry shell` to get access to the linting tool.
131 |
132 | This project is based on a template; upstream changes can be found on the [template repository](https://github.com/UrbanMachine/create-ros-app).
133 | A tool called `cruft` will alert you when there are upstream changes, and help you merge those in.
134 |
135 | ### `docker/`
136 |
137 | This directory contains scripts for building and running the Docker container. Look at the [Quick Start Guide](#quick-start-guide) section for more information on how to use these scripts.
138 |
139 | As for the structure:
140 |
141 | ```shell
142 | docker/
143 | ├── grafana/ # Stores Grafana provisioning configuration
144 | ├── promtail/ # Stores Promtail provisioning configuration
145 | ├── utils/ # Contains utility scripts for building or running inside the Docker container
146 | │ ├── environment/ # Scripts for use during Dockerfile build
147 | │   ├── runtime/       # Helpers for use inside the running Docker container
148 | ```
149 |
150 | #### `docker/Dockerfile`
151 |
152 | Feel free to edit as you like, but it's recommended to stick to editing within the sections
153 | blocked off by `#######`.
154 |
155 | The Dockerfile will need to be edited in two places for each new ROS package you add:
156 |
157 |
158 | 1. Copying in the `package.xml`
159 | 2. Copying in the `pyproject.toml` and `poetry.lock`
160 |
161 | There's also a location for adding new `apt` dependencies. It's recommended that ROS package
162 | dependencies are added through `package.xml` if they are available.
163 |
164 | ### `.env`
165 |
166 | This file holds environment variables that are used by the docker build and launch scripts.
167 |
168 | Most variables can safely be edited.
169 |
170 | ### `.docker_volumes/`
171 |
172 | This directory is automatically created the first time you run `docker/launch`. It holds persistent state across runs for containers.
173 |
174 | For example, if you enter your container via `docker/run` and save a file under `/robot/persistent/`, that file will be available the next time you run `docker/run`. It will also (by default) exist in your local machine under `.docker_volumes/ros-nodes/`.
175 | The `/robot/persistent` directory is intended for you, the developer, to use. So have at it!
176 |
177 | ### `.gitattributes`
178 |
179 | The `.gitattributes` file configures common binary file formats used in robotics so that
180 | git stores them with Large File Storage (LFS). It also specifies certain line endings so
181 | that Docker support works on Windows.
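For illustration, entries in a `.gitattributes` file like this generally take the following shape; the exact patterns in the generated file may differ:

```
# Illustrative patterns: store large binary meshes with git LFS
*.stl filter=lfs diff=lfs merge=lfs -text
*.dae filter=lfs diff=lfs merge=lfs -text
# Force LF line endings so scripts still run inside the Linux container on Windows hosts
*.sh text eol=lf
```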
182 |
183 | # Adding a New Package
184 |
185 | It's recommended to start by developing with the example package that comes with the
186 | template after generation. Once you're familiar with the template and want to create a second package,
187 | you can follow the steps below:
188 |
189 | 1. Create a new package directory under `pkgs/`
190 | 2. Add a `package.xml` file to the package directory, follow the standard ROS2 package.xml format
191 | 3. Create a `pyproject.toml` file. Follow the example in the `example_package` directory.
192 | 4. Create a `resource` directory in the package directory, with an empty file in it called `{your package name}`
193 | 5. Create a directory for your code in the package directory. It's also recommended to create a `{your_package_name}_test` directory.
194 | 6. Add a few things to the Dockerfile:
195 | - Under the `Add package.xml's of each package` section, copy the `package.xml` file into the container
196 | - Under the `Add pyproject.toml's of each package` section, copy the `pyproject.toml` file into the container
197 |
198 | You should be good to go! Next time you run `docker/launch`, your new package will be built and available in the container.
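The filesystem side of the steps above can be sketched as follows; the package name `my_package` is a placeholder:

```shell
# From the repository root; "my_package" is a hypothetical package name.
PKG=my_package
mkdir -p "pkgs/$PKG/resource" "pkgs/$PKG/$PKG" "pkgs/$PKG/${PKG}_test"
touch "pkgs/$PKG/resource/$PKG"     # empty marker file required by colcon
touch "pkgs/$PKG/$PKG/__init__.py"
# package.xml and pyproject.toml still need to be written by hand,
# following the example package generated by the template.
```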
--------------------------------------------------------------------------------