├── .bumpversion.cfg ├── .github ├── CODE_OF_CONDUCT.md ├── FUNDING.yaml ├── ISSUE_TEMPLATE │ ├── 1-bug.md │ ├── 2-regression.md │ ├── 3-documentation.md │ ├── 4-enhancement.md │ ├── 5-refactor.md │ └── 6-other.md ├── PULL_REQUEST_TEMPLATE.md ├── SECURITY.md ├── codeql-config.yaml └── workflows │ ├── codeql.yaml │ ├── manual.yaml │ └── tests.yaml ├── .gitignore ├── .pre-commit-config.yaml ├── .readthedocs.yaml ├── CHANGELOG.rst ├── CITATION.cff ├── CONTRIBUTING.rst ├── DOCUMENTATION.rst ├── LICENSE.rst ├── README.rst ├── jsonargparse ├── __init__.py ├── _actions.py ├── _cli.py ├── _common.py ├── _completions.py ├── _core.py ├── _deprecated.py ├── _formatters.py ├── _jsonnet.py ├── _jsonschema.py ├── _link_arguments.py ├── _loaders_dumpers.py ├── _namespace.py ├── _optionals.py ├── _parameter_resolvers.py ├── _postponed_annotations.py ├── _signatures.py ├── _stubs_resolver.py ├── _type_checking.py ├── _typehints.py ├── _util.py ├── py.typed └── typing.py ├── jsonargparse_tests ├── __init__.py ├── __main__.py ├── conftest.py ├── test_actions.py ├── test_argcomplete.py ├── test_attrs.py ├── test_cli.py ├── test_core.py ├── test_dataclasses.py ├── test_deprecated.py ├── test_final_classes.py ├── test_formatters.py ├── test_jsonnet.py ├── test_jsonschema.py ├── test_link_arguments.py ├── test_loaders_dumpers.py ├── test_namespace.py ├── test_omegaconf.py ├── test_optionals.py ├── test_parameter_resolvers.py ├── test_parsing_settings.py ├── test_paths.py ├── test_postponed_annotations.py ├── test_pydantic.py ├── test_shtab.py ├── test_signatures.py ├── test_stubs_resolver.py ├── test_subclasses.py ├── test_subcommands.py ├── test_typehints.py ├── test_typing.py └── test_util.py ├── pyproject.toml └── sphinx ├── changelog.rst ├── conf.py ├── index.rst └── license.rst /.bumpversion.cfg: -------------------------------------------------------------------------------- 1 | [bumpversion] 2 | current_version = 4.42.0 3 | commit = True 4 | tag = True 5 | tag_name = 
v{new_version} 6 | parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?P<release>(\.dev|rc)[0-9]+)? 7 | serialize = 8 | {major}.{minor}.{patch}{release} 9 | {major}.{minor}.{patch} 10 | 11 | [bumpversion:file:jsonargparse/__init__.py] 12 | -------------------------------------------------------------------------------- /.github/CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | We as members, contributors, and leaders pledge to make participation in our 6 | community a harassment-free experience for everyone, regardless of age, body 7 | size, visible or invisible disability, ethnicity, sex characteristics, gender 8 | identity and expression, level of experience, education, socio-economic status, 9 | nationality, personal appearance, race, caste, color, religion, or sexual 10 | identity and orientation. 11 | 12 | We pledge to act and interact in ways that contribute to an open, welcoming, 13 | diverse, inclusive, and healthy community.
14 | 15 | ## Our Standards 16 | 17 | Examples of behavior that contributes to a positive environment for our 18 | community include: 19 | 20 | * Demonstrating empathy and kindness toward other people 21 | * Being respectful of differing opinions, viewpoints, and experiences 22 | * Giving and gracefully accepting constructive feedback 23 | * Accepting responsibility and apologizing to those affected by our mistakes, 24 | and learning from the experience 25 | * Focusing on what is best not just for us as individuals, but for the overall 26 | community 27 | 28 | Examples of unacceptable behavior include: 29 | 30 | * The use of sexualized language or imagery, and sexual attention or advances of 31 | any kind 32 | * Trolling, insulting or derogatory comments, and personal or political attacks 33 | * Public or private harassment 34 | * Publishing others' private information, such as a physical or email address, 35 | without their explicit permission 36 | * Other conduct which could reasonably be considered inappropriate in a 37 | professional setting 38 | 39 | ## Enforcement Responsibilities 40 | 41 | Community leaders are responsible for clarifying and enforcing our standards of 42 | acceptable behavior and will take appropriate and fair corrective action in 43 | response to any behavior that they deem inappropriate, threatening, offensive, 44 | or harmful. 45 | 46 | Community leaders have the right and responsibility to remove, edit, or reject 47 | comments, commits, code, wiki edits, issues, and other contributions that are 48 | not aligned to this Code of Conduct, and will communicate reasons for moderation 49 | decisions when appropriate. 50 | 51 | ## Scope 52 | 53 | This Code of Conduct applies within all community spaces, and also applies when 54 | an individual is officially representing the community in public spaces. 
55 | Examples of representing our community include using an official e-mail address, 56 | posting via an official social media account, or acting as an appointed 57 | representative at an online or offline event. 58 | 59 | ## Enforcement 60 | 61 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 62 | reported to the community leaders responsible for enforcement at 63 | mauricio@omnius.com. All complaints will be reviewed and investigated promptly 64 | and fairly. 65 | 66 | All community leaders are obligated to respect the privacy and security of the 67 | reporter of any incident. 68 | 69 | ## Enforcement Guidelines 70 | 71 | Community leaders will follow these Community Impact Guidelines in determining 72 | the consequences for any action they deem in violation of this Code of Conduct: 73 | 74 | ### 1. Correction 75 | 76 | **Community Impact**: Use of inappropriate language or other behavior deemed 77 | unprofessional or unwelcome in the community. 78 | 79 | **Consequence**: A private, written warning from community leaders, providing 80 | clarity around the nature of the violation and an explanation of why the 81 | behavior was inappropriate. A public apology may be requested. 82 | 83 | ### 2. Warning 84 | 85 | **Community Impact**: A violation through a single incident or series of 86 | actions. 87 | 88 | **Consequence**: A warning with consequences for continued behavior. No 89 | interaction with the people involved, including unsolicited interaction with 90 | those enforcing the Code of Conduct, for a specified period of time. This 91 | includes avoiding interactions in community spaces as well as external channels 92 | like social media. Violating these terms may lead to a temporary or permanent 93 | ban. 94 | 95 | ### 3. Temporary Ban 96 | 97 | **Community Impact**: A serious violation of community standards, including 98 | sustained inappropriate behavior. 
99 | 100 | **Consequence**: A temporary ban from any sort of interaction or public 101 | communication with the community for a specified period of time. No public or 102 | private interaction with the people involved, including unsolicited interaction 103 | with those enforcing the Code of Conduct, is allowed during this period. 104 | Violating these terms may lead to a permanent ban. 105 | 106 | ### 4. Permanent Ban 107 | 108 | **Community Impact**: Demonstrating a pattern of violation of community 109 | standards, including sustained inappropriate behavior, harassment of an 110 | individual, or aggression toward or disparagement of classes of individuals. 111 | 112 | **Consequence**: A permanent ban from any sort of public interaction within the 113 | community. 114 | 115 | ## Attribution 116 | 117 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], 118 | version 2.1, available at 119 | [https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1]. 120 | 121 | Community Impact Guidelines were inspired by 122 | [Mozilla's code of conduct enforcement ladder][Mozilla CoC]. 123 | 124 | For answers to common questions about this code of conduct, see the FAQ at 125 | [https://www.contributor-covenant.org/faq][FAQ]. Translations are available at 126 | [https://www.contributor-covenant.org/translations][translations]. 
127 | 128 | [homepage]: https://www.contributor-covenant.org 129 | [v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html 130 | [Mozilla CoC]: https://github.com/mozilla/diversity 131 | [FAQ]: https://www.contributor-covenant.org/faq 132 | [translations]: https://www.contributor-covenant.org/translations 133 | -------------------------------------------------------------------------------- /.github/FUNDING.yaml: -------------------------------------------------------------------------------- 1 | github: mauvilsa 2 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/1-bug.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Report for something that doesn't work as expected 4 | title: '' 5 | labels: bug 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | 22 | 23 | ## 🐛 Bug report 24 | 25 | 26 | 27 | ### To reproduce 28 | 29 | 79 | 80 | ### Expected behavior 81 | 82 | 83 | 84 | ### Environment 85 | 86 | 87 | 88 | - jsonargparse version: 89 | - Python version: 90 | - How jsonargparse was installed: 91 | - OS: 92 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/2-regression.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Regression report 3 | about: Report for something that used to work but doesn't anymore 4 | title: '' 5 | labels: bug 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | ## 🕰️ Regression report 18 | 19 | 20 | 21 | ### To reproduce 22 | 23 | 88 | 89 | ### Prior behavior 90 | 91 | 92 | 93 | ### Environment 94 | 95 | 96 | 97 | - jsonargparse version: 98 | - Python version: 99 | - How jsonargparse was installed: 100 | - OS: 101 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/3-documentation.md: 
-------------------------------------------------------------------------------- 1 | --- 2 | name: Documentation fixes and improvements 3 | about: Tell us how the documentation can be improved 4 | title: '' 5 | labels: documentation 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | 21 | 22 | ## 📚 Documentation improvement 23 | 24 | 25 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/4-enhancement.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: enhancement 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | 21 | 22 | ## 🚀 Feature request 23 | 24 | 25 | 26 | ### Motivation 27 | 28 | 33 | 34 | ### Pitch 35 | 36 | 37 | 38 | ### Alternatives 39 | 40 | 41 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/5-refactor.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Code refactor 3 | about: Suggest a code refactor or deprecation 4 | title: '' 5 | labels: refactor 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | ## 🔧 Code refactor 18 | 19 | 20 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/6-other.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Other issue type 3 | about: Any kind of issue that does not fit the other templates 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | --- 8 | 9 | 16 | 17 | 18 | -------------------------------------------------------------------------------- /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 8 | 9 | 14 | 15 | ## What does this PR do? 
16 | 17 | 23 | 24 | ## Before submitting 25 | 26 | 31 | 32 | - [ ] Did you read the [contributing guideline](https://github.com/omni-us/jsonargparse/blob/main/CONTRIBUTING.rst)? 33 | - [ ] Did you update **the documentation**? (readme and public docstrings) 34 | - [ ] Did you write **unit tests** such that there is 100% coverage on related code? (required for bug fixes and new features) 35 | - [ ] Did you verify that new and existing **tests pass locally**? 36 | - [ ] Did you make sure that all changes preserve **backward compatibility**? 37 | - [ ] Did you update **the CHANGELOG** including a pull request link? (not for typos, docs, test updates, or minor internal changes/refactors) 38 | -------------------------------------------------------------------------------- /.github/SECURITY.md: -------------------------------------------------------------------------------- 1 | # Security Policy 2 | 3 | To report a vulnerability, please use GitHub's [Security 4 | Advisories](https://github.com/omni-us/jsonargparse/security/advisories). In 5 | general, only the latest minor version is actively supported. However, depending 6 | on the severity of a vulnerability, a fix for older versions may be considered.
7 | -------------------------------------------------------------------------------- /.github/codeql-config.yaml: -------------------------------------------------------------------------------- 1 | query-filters: 2 | - exclude: 3 | id: 4 | - py/cyclic-import 5 | - py/unsafe-cyclic-import 6 | -------------------------------------------------------------------------------- /.github/workflows/codeql.yaml: -------------------------------------------------------------------------------- 1 | name: "CodeQL" 2 | 3 | on: 4 | push: 5 | branches: [main] 6 | pull_request: 7 | branches: [main] 8 | workflow_dispatch: 9 | 10 | concurrency: 11 | group: ${{ github.workflow }}-${{ github.ref }} 12 | cancel-in-progress: true 13 | 14 | jobs: 15 | analyze: 16 | name: Analyze 17 | runs-on: ubuntu-latest 18 | permissions: 19 | actions: read 20 | contents: read 21 | security-events: write 22 | 23 | steps: 24 | - name: Checkout 25 | uses: actions/checkout@v5 26 | 27 | - name: Initialize CodeQL 28 | uses: github/codeql-action/init@v3 29 | with: 30 | languages: python 31 | queries: +security-and-quality 32 | config-file: ./.github/codeql-config.yaml 33 | 34 | - name: Autobuild 35 | uses: github/codeql-action/autobuild@v3 36 | 37 | - name: Perform CodeQL Analysis 38 | uses: github/codeql-action/analyze@v3 39 | with: 40 | category: "/language:python" 41 | -------------------------------------------------------------------------------- /.github/workflows/manual.yaml: -------------------------------------------------------------------------------- 1 | name: manual 2 | 3 | on: 4 | workflow_dispatch: 5 | inputs: 6 | OS: 7 | description: Operating System 8 | required: true 9 | type: choice 10 | options: 11 | - windows-2019 12 | - macOS-15 13 | - ubuntu-22.04 14 | 15 | jobs: 16 | tox-coverage: 17 | runs-on: ${{ inputs.OS }} 18 | steps: 19 | - uses: actions/checkout@v5 20 | - uses: actions/setup-python@v5 21 | with: 22 | python-version: | 23 | 3.9 24 | 3.10 25 | 3.11 26 | 3.12 27 | - run: pip 
install -e ".[dev]" 28 | - run: tox -- --cov=../jsonargparse --cov-append 29 | - uses: actions/upload-artifact@v4 30 | with: 31 | name: coverage 32 | path: jsonargparse_tests/.coverage 33 | include-hidden-files: true 34 | if-no-files-found: error 35 | -------------------------------------------------------------------------------- /.github/workflows/tests.yaml: -------------------------------------------------------------------------------- 1 | name: tests 2 | 3 | on: 4 | push: 5 | branches: [main] 6 | tags: 7 | - 'v*' 8 | pull_request: 9 | branches: [main] 10 | workflow_dispatch: 11 | 12 | concurrency: 13 | group: ${{ github.workflow }}-${{ github.ref }} 14 | cancel-in-progress: true 15 | 16 | jobs: 17 | 18 | linux: 19 | runs-on: ubuntu-latest 20 | strategy: 21 | matrix: 22 | python: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"] 23 | steps: 24 | - uses: actions/checkout@v5 25 | - uses: actions/setup-python@v5 26 | with: 27 | python-version: ${{ matrix.python }} 28 | cache: pip 29 | - name: Test without optional dependencies and without pyyaml 30 | run: | 31 | pip install .[coverage] 32 | pip uninstall -y pyyaml types-PyYAML 33 | pytest --cov --cov-report=term --cov-report=xml --junit-xml=junit.xml 34 | mv coverage.xml coverage_py${{ matrix.python }}_bare.xml 35 | mv junit.xml junit_py${{ matrix.python }}_bare.xml 36 | - name: Test with all optional dependencies 37 | run: | 38 | pip install .[test,all] 39 | pytest --cov --cov-report=term --cov-report=xml --junit-xml=junit.xml 40 | mv coverage.xml coverage_py${{ matrix.python }}_all.xml 41 | mv junit.xml junit_py${{ matrix.python }}_all.xml 42 | - name: Test without future annotations 43 | run: | 44 | sed -i '/^from __future__ import annotations$/d' jsonargparse_tests/test_*.py 45 | pytest --cov --cov-report=term --cov-report=xml --junit-xml=junit.xml 46 | mv coverage.xml coverage_py${{ matrix.python }}_types.xml 47 | mv junit.xml junit_py${{ matrix.python }}_types.xml 48 | - uses: actions/upload-artifact@v4 49 | 
with: 50 | name: coverage_py${{ matrix.python }} 51 | path: ./coverage_py* 52 | - uses: actions/upload-artifact@v4 53 | with: 54 | name: junit_py${{ matrix.python }} 55 | path: ./junit_py* 56 | 57 | windows: 58 | runs-on: windows-2025 59 | strategy: 60 | fail-fast: false 61 | matrix: 62 | python: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"] 63 | steps: 64 | - uses: actions/checkout@v5 65 | - uses: actions/setup-python@v5 66 | with: 67 | python-version: ${{ matrix.python }} 68 | cache: pip 69 | - run: pip install tox 70 | - run: tox -e py-all-extras 71 | 72 | macos: 73 | runs-on: macOS-15 74 | strategy: 75 | fail-fast: false 76 | matrix: 77 | python: ["3.10", "3.12", "3.14"] 78 | steps: 79 | - uses: actions/checkout@v5 80 | - uses: actions/setup-python@v5 81 | with: 82 | python-version: ${{ matrix.python }} 83 | cache: pip 84 | - run: pip install tox 85 | - run: tox -e py-all-extras 86 | 87 | omegaconf: 88 | runs-on: ubuntu-latest 89 | steps: 90 | - uses: actions/checkout@v5 91 | - uses: actions/setup-python@v5 92 | with: 93 | python-version: "3.12" 94 | cache: pip 95 | - run: pip install tox 96 | - run: tox -e omegaconf 97 | 98 | pydantic-v1: 99 | runs-on: ubuntu-latest 100 | steps: 101 | - uses: actions/checkout@v5 102 | - uses: actions/setup-python@v5 103 | with: 104 | python-version: "3.12" 105 | cache: pip 106 | - name: With pydantic<2 107 | run: | 108 | pip install .[coverage] 109 | pip install "pydantic<2" 110 | pytest --cov --cov-report=term --cov-report=xml --junit-xml=junit.xml jsonargparse_tests/test_pydantic.py 111 | mv coverage.xml coverage_pydantic1.xml 112 | mv junit.xml junit_pydantic1.xml 113 | - name: with pydantic>=2 114 | run: | 115 | sed -i "s|import pydantic|import pydantic.v1 as pydantic|" jsonargparse_tests/test_pydantic.py 116 | sed -i "s|^annotated = .*|annotated = False|" jsonargparse_tests/test_pydantic.py 117 | pip install "pydantic>=2" 118 | pytest --cov --cov-report=term --cov-report=xml --junit-xml=junit.xml 
jsonargparse_tests/test_pydantic.py 119 | mv coverage.xml coverage_pydantic2.xml 120 | mv junit.xml junit_pydantic2.xml 121 | - uses: actions/upload-artifact@v4 122 | with: 123 | name: coverage_pydantic 124 | path: ./coverage_py* 125 | - uses: actions/upload-artifact@v4 126 | with: 127 | name: junit_pydantic 128 | path: ./junit_py* 129 | 130 | build-package: 131 | runs-on: ubuntu-latest 132 | steps: 133 | - uses: actions/checkout@v5 134 | - uses: actions/setup-python@v5 135 | with: 136 | python-version: "3.12" 137 | - name: Build package 138 | run: | 139 | pip install -U build 140 | python -m build 141 | - uses: actions/upload-artifact@v4 142 | with: 143 | name: package 144 | path: ./dist/* 145 | 146 | installed-package: 147 | runs-on: ubuntu-latest 148 | needs: [build-package] 149 | steps: 150 | - uses: actions/checkout@v5 151 | - uses: actions/setup-python@v5 152 | with: 153 | python-version: "3.12" 154 | cache: pip 155 | - uses: actions/download-artifact@v4 156 | with: 157 | name: package 158 | path: dist 159 | - name: Test without optional dependencies and without pyyaml 160 | run: | 161 | cd dist 162 | pip install $(ls *.whl)[test-no-urls] 163 | pip uninstall -y pyyaml 164 | python -m jsonargparse_tests 165 | - name: Test with all optional dependencies 166 | run: | 167 | cd dist 168 | pip install $(ls *.whl)[test,all] 169 | python -m jsonargparse_tests 170 | 171 | doctest: 172 | runs-on: ubuntu-latest 173 | steps: 174 | - uses: actions/checkout@v5 175 | - uses: actions/setup-python@v5 176 | with: 177 | python-version: "3.12" 178 | cache: pip 179 | - run: pip install -e .[all,doc] 180 | - name: Run doc tests 181 | run: sphinx-build -M doctest sphinx sphinx/_build sphinx/index.rst 182 | 183 | mypy: 184 | runs-on: ubuntu-latest 185 | steps: 186 | - uses: actions/checkout@v5 187 | - uses: actions/setup-python@v5 188 | with: 189 | python-version: "3.12" 190 | cache: pip 191 | - uses: actions/cache@v4 192 | with: 193 | key: pre-commit-cache 194 | path: 
~/.cache/pre-commit 195 | - run: pip install pre-commit 196 | - run: pre-commit run -a --hook-stage pre-push mypy 197 | 198 | codecov: 199 | runs-on: ubuntu-latest 200 | environment: codecov 201 | needs: [linux, pydantic-v1] 202 | steps: 203 | - uses: actions/checkout@v5 204 | - uses: actions/download-artifact@v4 205 | with: 206 | merge-multiple: true 207 | - uses: codecov/codecov-action@v5 208 | with: 209 | fail_ci_if_error: true 210 | files: ./coverage_*.xml 211 | token: ${{ secrets.CODECOV_TOKEN }} 212 | - uses: codecov/test-results-action@v1 213 | with: 214 | fail_ci_if_error: true 215 | files: ./junit_*.xml 216 | token: ${{ secrets.CODECOV_TOKEN }} 217 | 218 | sonarcloud: 219 | runs-on: ubuntu-latest 220 | environment: sonarcloud 221 | if: | 222 | (github.event_name == 'push') || 223 | (github.event_name == 'pull_request' && !github.event.pull_request.head.repo.fork) 224 | needs: [linux, pydantic-v1] 225 | steps: 226 | - uses: actions/checkout@v5 227 | with: 228 | fetch-depth: 0 # Shallow clone disabled for a better relevancy of analysis 229 | - uses: actions/download-artifact@v4 230 | with: 231 | merge-multiple: true 232 | - name: Get version 233 | run: | 234 | TAG=$(git describe --tags --exact-match 2>/dev/null | sed 's/^v//') 235 | if [ -n "$TAG" ]; then 236 | VERSION="$TAG" 237 | else 238 | VERSION="$(git describe --tags --abbrev=0 | sed 's/^v//')+$(git rev-parse --short HEAD)" 239 | fi 240 | echo "VERSION=$VERSION" >> $GITHUB_ENV 241 | - uses: SonarSource/sonarqube-scan-action@v6 242 | with: 243 | args: > 244 | -Dsonar.organization=omni-us 245 | -Dsonar.projectKey=omni-us_jsonargparse 246 | -Dsonar.projectVersion=${{ env.VERSION }} 247 | -Dsonar.sources=jsonargparse 248 | -Dsonar.exclusions=sphinx/** 249 | -Dsonar.tests=jsonargparse_tests 250 | -Dsonar.python.coverage.reportPaths=coverage_*.xml 251 | -Dsonar.python.version=3.9,3.10,3.11,3.12,3.13,3.14 252 | env: 253 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any 254 | 
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} 255 | 256 | pypi-publish: 257 | if: startsWith(github.ref, 'refs/tags/v') 258 | runs-on: ubuntu-latest 259 | needs: [linux, windows, macos, omegaconf, pydantic-v1, installed-package, doctest, mypy] 260 | environment: 261 | name: pypi 262 | url: https://pypi.org/p/jsonargparse 263 | permissions: 264 | id-token: write 265 | steps: 266 | - uses: actions/download-artifact@v4 267 | with: 268 | name: package 269 | path: dist 270 | - name: Publish to PyPI 271 | uses: pypa/gh-action-pypi-publish@release/v1 272 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__ 2 | *.egg-info 3 | .coverage 4 | coverage.xml 5 | .mypy_cache 6 | .pytest_cache 7 | .ruff_cache 8 | .tox 9 | venv* 10 | sphinx/_build 11 | build 12 | dist 13 | htmlcov 14 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | fail_fast: true 2 | default_install_hook_types: [pre-commit, pre-push] 3 | default_language_version: 4 | python: python3.12 5 | 6 | ci: 7 | skip: 8 | - twine-check 9 | - changelog-bump 10 | autofix_prs: true 11 | autoupdate_commit_msg: '[pre-commit.ci] pre-commit suggestions' 12 | autoupdate_schedule: quarterly 13 | 14 | repos: 15 | 16 | - repo: https://github.com/pre-commit/pre-commit-hooks 17 | rev: v6.0.0 18 | hooks: 19 | - id: check-added-large-files 20 | - id: check-ast 21 | - id: check-case-conflict 22 | - id: check-docstring-first 23 | - id: end-of-file-fixer 24 | - id: mixed-line-ending 25 | - id: trailing-whitespace 26 | exclude: .bumpversion.cfg 27 | 28 | - repo: https://github.com/psf/black 29 | rev: 25.9.0 30 | hooks: 31 | - id: black 32 | 33 | - repo: https://github.com/astral-sh/ruff-pre-commit 34 | rev: v0.14.0 35 | hooks: 36 | - id: ruff 37 | args: ["--fix"] 38 | 
39 | - repo: https://github.com/asottile/yesqa 40 | rev: v1.5.0 41 | hooks: 42 | - id: yesqa 43 | 44 | - repo: https://github.com/crate-ci/typos 45 | rev: v1.38.1 46 | hooks: 47 | - id: typos 48 | args: [] 49 | verbose: true 50 | 51 | - repo: https://github.com/pre-commit/mirrors-mypy 52 | rev: v1.18.2 53 | hooks: 54 | - id: mypy 55 | files: jsonargparse.*/.*.py 56 | stages: [pre-push] 57 | additional_dependencies: 58 | [ 59 | types-PyYAML, 60 | types-requests, 61 | types-toml, 62 | ] 63 | verbose: true 64 | 65 | - repo: local 66 | hooks: 67 | 68 | - id: twine-check 69 | name: twine check [on bumpversion] 70 | entry: bash -c ' 71 | set -e; 72 | if [ "${BUMPVERSION_NEW_VERSION+x}" = "" ]; then 73 | echo "$(tput setaf 6) Skipped, only runs when bumping version $(tput sgr0)"; 74 | else 75 | python3 -m build --wheel; 76 | twine check dist/*.whl; 77 | fi' 78 | language: system 79 | pass_filenames: false 80 | verbose: true 81 | 82 | - id: changelog-bump 83 | name: changelog bump [on bumpversion] 84 | entry: bash -c ' 85 | set -e; 86 | if [ "${BUMPVERSION_NEW_VERSION+x}" = "" ]; then 87 | echo "$(tput setaf 6) Skipped, only runs when bumping version $(tput sgr0)"; 88 | else 89 | CHANGELOG=$(grep -E "^v.+\..+\..+ \(.*\)" CHANGELOG.rst | head -n 1); 90 | EXPECTED="v$BUMPVERSION_NEW_VERSION ($(date -u +%Y-%m-%d))"; 91 | if [ "$CHANGELOG" != "$EXPECTED" ] && [ $(echo $BUMPVERSION_NEW_VERSION | grep -cE "[0-9.]+(\.dev|rc)[0-9]+") = 0 ]; then 92 | if [ $(grep -c "^v$BUMPVERSION_NEW_VERSION " CHANGELOG.rst) = 1 ]; then 93 | echo "Updating the date for v$BUMPVERSION_NEW_VERSION in CHANGELOG.rst"; 94 | sed -i -e "s|^v$BUMPVERSION_NEW_VERSION .*|$EXPECTED|" CHANGELOG.rst; 95 | git add CHANGELOG.rst; 96 | else 97 | echo "Expected release in CHANGELOG.rst to be "$EXPECTED" or not have a definitive date."; 98 | exit 1; 99 | fi 100 | fi 101 | fi' 102 | language: system 103 | pass_filenames: false 104 | verbose: true 105 | 106 | - id: tox 107 | name: tox --parallel 108 | entry: tox 
--parallel 109 | stages: [pre-push] 110 | language: system 111 | pass_filenames: false 112 | verbose: true 113 | 114 | - id: doctest 115 | name: sphinx-build -M doctest sphinx sphinx/_build sphinx/index.rst 116 | entry: bash -c ' 117 | set -e; 118 | if [ "$(which sphinx-build)" = "" ]; then 119 | echo "$(tput setaf 6) Skipped, sphinx-build command not found $(tput sgr0)"; 120 | else 121 | sphinx-build -M doctest sphinx sphinx/_build sphinx/index.rst; 122 | fi' 123 | stages: [pre-push] 124 | language: system 125 | pass_filenames: false 126 | verbose: true 127 | 128 | - id: coverage 129 | name: pytest -v -s --cov --cov-report=term --cov-report=html 130 | entry: pytest -v -s --cov --cov-report=term --cov-report=html 131 | stages: [pre-push] 132 | language: system 133 | pass_filenames: false 134 | verbose: true 135 | -------------------------------------------------------------------------------- /.readthedocs.yaml: -------------------------------------------------------------------------------- 1 | # Read the Docs configuration file for Sphinx projects 2 | # See https://docs.readthedocs.io/en/stable/config-file/v2.html for details 3 | 4 | version: 2 5 | 6 | build: 7 | os: ubuntu-24.04 8 | tools: 9 | python: "3.12" 10 | 11 | sphinx: 12 | configuration: sphinx/conf.py 13 | 14 | python: 15 | install: 16 | - method: pip 17 | path: . 18 | extra_requirements: 19 | - doc 20 | -------------------------------------------------------------------------------- /CITATION.cff: -------------------------------------------------------------------------------- 1 | cff-version: 1.2.0 2 | message: "If you want to cite this software, please do as follows." 
3 | title: jsonargparse 4 | license: MIT 5 | repository-code: https://github.com/omni-us/jsonargparse 6 | authors: 7 | - family-names: Villegas 8 | given-names: Mauricio 9 | orcid: https://orcid.org/0000-0001-7450-6707 10 | - name: contributors 11 | version: 4 12 | date-released: 2021-11-16 13 | -------------------------------------------------------------------------------- /CONTRIBUTING.rst: -------------------------------------------------------------------------------- 1 | Contributing 2 | ============ 3 | 4 | Contributions to jsonargparse are very welcome. There are multiple ways for 5 | people to help and contribute, among them: 6 | 7 | - Star ⭐ the GitHub project `<https://github.com/omni-us/jsonargparse>`__. 8 | - `Sponsor 🩷 <https://github.com/sponsors/mauvilsa>`__ its maintenance and 9 | development. 10 | - Spread the word in your community about the features you like from 11 | jsonargparse. 12 | - Help others learn how to use jsonargparse by creating tutorials, such as 13 | blog posts and videos. 14 | - Become active in existing GitHub issues and pull requests. 15 | - Create `issues <https://github.com/omni-us/jsonargparse/issues>`__ for 16 | reporting bugs and proposing improvements. 17 | - Create `pull requests <https://github.com/omni-us/jsonargparse/pulls>`__ with 18 | documentation improvements, bug fixes or new features. 19 | 20 | .. note:: 21 | 22 | While creating an issue before submitting a pull request is not mandatory, 23 | it might be helpful. Issues allow for discussion and feedback before 24 | significant development effort is invested. However, in some cases, code 25 | changes can better illustrate a proposal, making it more effective to submit 26 | a pull request directly. In such cases, please avoid opening a largely 27 | redundant issue. 28 | 29 | Development environment 30 | ----------------------- 31 | 32 | If you intend to work with the source code, note that this project does not 33 | include any ``requirements.txt`` file. This is intentional.
To make it clear 34 | what the requirements are for each use case, all the requirements of 35 | the project are stored in the file ``pyproject.toml``. The basic runtime 36 | requirements are defined in ``dependencies``. Requirements for optional features 37 | are stored in ``[project.optional-dependencies]``. The same section also defines 38 | the requirements for testing, development and documentation building: 39 | ``test``, ``dev`` and ``doc``. 40 | 41 | The recommended way to work with the source code is the following. First clone 42 | the repository, then create a virtual environment, activate it and finally 43 | install the development requirements. More precisely, the steps are: 44 | 45 | .. code-block:: bash 46 | 47 | git clone https://github.com/omni-us/jsonargparse.git 48 | cd jsonargparse 49 | python -m venv venv 50 | . venv/bin/activate 51 | 52 | The crucial step is installing the requirements, which is done by running: 53 | 54 | .. code-block:: bash 55 | 56 | pip install -e ".[dev,all]" 57 | 58 | pre-commit 59 | ---------- 60 | 61 | Please also install the `pre-commit <https://pre-commit.com>`__ git hooks so 62 | that unit tests and code checks are automatically run locally. This is done as 63 | follows: 64 | 65 | .. code-block:: bash 66 | 67 | pre-commit install 68 | 69 | .. note:: 70 | 71 | ``.pre-commit-config.yaml`` is configured to run the hooks using Python 72 | 3.12. Ensure you have Python 3.12 installed and available in your 73 | environment for ``pre-commit`` to function correctly. For development, other 74 | Python versions will work, but for convenience, Python 3.12 is recommended. 75 | 76 | The ``pre-push`` stage runs several hooks, including tests, doctests, mypy, and 77 | coverage. These hooks are designed to inform developers of issues that must be 78 | resolved before a pull request can be merged. Note that these hooks may take 79 | some time to complete.
If you wish to push without running these hooks, use the 80 | command ``git push --no-verify``. 81 | 82 | Formatting of the code is done automatically by pre-commit. If some pre-commit 83 | hooks fail and you decide to skip them, the formatting will be automatically 84 | applied by a github action in pull requests. 85 | 86 | Documentation 87 | ------------- 88 | 89 | To build the documentation run: 90 | 91 | .. code-block:: bash 92 | 93 | sphinx-build sphinx sphinx/_build sphinx/*.rst 94 | 95 | To view the built documentation, open the file ``sphinx/_build/index.html`` in a 96 | browser. 97 | 98 | Tests 99 | ----- 100 | 101 | Running the unit tests can be done either using `pytest 102 | <https://docs.pytest.org>`__ or `tox 103 | <https://tox.wiki>`__. The tests are also installed with 104 | the package, and thus can be run in a production system. pre-commit also runs some 105 | additional tests. 106 | 107 | .. code-block:: bash 108 | 109 | tox # Run tests using tox on available python versions 110 | pytest # Run tests using pytest on the python of the environment 111 | pytest --cov # Run tests and generate coverage report 112 | python -m jsonargparse_tests # Run tests on installed package (requires pytest and pytest-subtests) 113 | pre-commit run -a --hook-stage pre-push # Run pre-push git hooks (tests, doctests, mypy, coverage) 114 | 115 | Coverage 116 | -------- 117 | 118 | For a nice html test coverage report, run: 119 | 120 | .. code-block:: bash 121 | 122 | pytest --cov --cov-report=html 123 | 124 | Then open the file ``htmlcov/index.html`` in a browser. 125 | 126 | To get a full coverage report, you need to install all supported python 127 | versions, and then: 128 | 129 | .. code-block:: bash 130 | 131 | rm -fr jsonargparse_tests/.coverage jsonargparse_tests/htmlcov 132 | tox -- --cov=../jsonargparse --cov-append 133 | cd jsonargparse_tests 134 | coverage html 135 | 136 | Then open the file ``jsonargparse_tests/htmlcov/index.html`` in a browser.
137 | 138 | Pull requests 139 | ------------- 140 | 141 | When creating a pull request, it is recommended that you create, in your fork, a 142 | specific branch for the changes you want to contribute, instead of using the 143 | ``main`` branch. 144 | 145 | The tasks required for a pull request are listed in 146 | `PULL_REQUEST_TEMPLATE.md 147 | <https://github.com/omni-us/jsonargparse/blob/main/.github/PULL_REQUEST_TEMPLATE.md>`__. 148 | 149 | One of the tasks is adding a changelog entry. For this, note that this project 150 | uses semantic versioning. Depending on whether the contribution is a bug fix or 151 | a new feature, the changelog entry would go in a patch or minor release. The 152 | changelog section for the next release does not have a definite date, for 153 | example: 154 | 155 | .. code-block:: 156 | 157 | v4.28.0 (unreleased) 158 | -------------------- 159 | 160 | Added 161 | ^^^^^ 162 | - 163 | 164 | If no such section exists, just add it with "(unreleased)" instead of a date. 165 | Have a look at previous releases to decide under which subsection the new entry 166 | should go. If you are unsure, ask in the pull request. 167 | 168 | Please don't open pull requests with breaking changes unless this has been 169 | discussed and agreed upon in an issue.
170 | -------------------------------------------------------------------------------- /LICENSE.rst: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2019-present, Mauricio Villegas 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | .. image:: https://readthedocs.org/projects/jsonargparse/badge/?version=stable 2 | :target: https://readthedocs.org/projects/jsonargparse/ 3 | .. image:: https://github.com/omni-us/jsonargparse/actions/workflows/tests.yaml/badge.svg 4 | :target: https://github.com/omni-us/jsonargparse/actions/workflows/tests.yaml 5 | .. 
image:: https://codecov.io/gh/omni-us/jsonargparse/branch/main/graph/badge.svg 6 | :target: https://codecov.io/gh/omni-us/jsonargparse 7 | .. image:: https://sonarcloud.io/api/project_badges/measure?project=omni-us_jsonargparse&metric=alert_status 8 | :target: https://sonarcloud.io/dashboard?id=omni-us_jsonargparse 9 | .. image:: https://badge.fury.io/py/jsonargparse.svg 10 | :target: https://badge.fury.io/py/jsonargparse 11 | 12 | 13 | jsonargparse 14 | ============ 15 | 16 | Docs: https://jsonargparse.readthedocs.io/ | Source: https://github.com/omni-us/jsonargparse/ 17 | 18 | ``jsonargparse`` is a library for creating command-line interfaces (CLIs) and 19 | making Python apps easily configurable. It is a well-maintained project with 20 | frequent releases, adhering to high standards of development: semantic 21 | versioning, deprecation periods, changelog, automated testing, and full test 22 | coverage. 23 | 24 | Although ``jsonargparse`` might not be widely recognized yet, it already boasts 25 | a `substantial user base 26 | <https://github.com/omni-us/jsonargparse/network/dependents>`__. Most notably, 27 | it serves as the framework behind pytorch-lightning's `LightningCLI 28 | <https://lightning.ai/docs/pytorch/stable/cli/lightning_cli.html>`__. 29 | 30 | Teaser examples 31 | --------------- 32 | 33 | CLI with minimal boilerplate: 34 | 35 | .. code-block:: python 36 | 37 | from jsonargparse import auto_cli 38 | 39 | def main_function(...): # your main parameters and logic here 40 | ... 41 | 42 | if __name__ == "__main__": 43 | auto_cli(main_function) # parses arguments and runs main_function 44 | 45 | Minimal boilerplate but manually parsing: 46 | 47 | .. code-block:: python 48 | 49 | from jsonargparse import auto_parser 50 | 51 | parser = auto_parser(main_function) 52 | cfg = parser.parse_args() 53 | ... 54 | 55 | Powerful argparse-like low level parsers: 56 | 57 | ..
code-block:: python 58 | 59 | from typing import Literal, Union 60 | from jsonargparse import ArgumentParser 61 | 62 | parser = ArgumentParser() 63 | parser.add_argument("--config", action="config") # support config files 64 | parser.add_argument("--opt", type=Union[int, Literal["off"]]) # complex arguments via type hints 65 | parser.add_function_arguments(main_function, "function") # add function parameters 66 | parser.add_class_arguments(SomeClass, "class") # add class parameters 67 | ... 68 | cfg = parser.parse_args() 69 | init = parser.instantiate_classes(cfg) 70 | ... 71 | 72 | Features 73 | -------- 74 | 75 | ``jsonargparse`` is user-friendly and encourages the development of **clean, 76 | high-quality code**. It encompasses numerous powerful features, some unique to 77 | ``jsonargparse``, while also combining advantages found in similar packages: 78 | 79 | - **Automatic** creation of CLIs, like `Fire 80 | <https://github.com/google/python-fire>`__, `Typer 81 | <https://typer.tiangolo.com>`__, `Clize 82 | <https://github.com/epsy/clize>`__ and `Tyro 83 | <https://github.com/brentyi/tyro>`__. 84 | 85 | - Use **type hints** for argument validation, like `Typer 86 | <https://typer.tiangolo.com>`__, `Tap 87 | <https://github.com/swansonk14/typed-argument-parser>`__ and `Tyro 88 | <https://github.com/brentyi/tyro>`__. 89 | 90 | - Use of **docstrings** for automatic generation of help, like `Tap 91 | <https://github.com/swansonk14/typed-argument-parser>`__, `Tyro 92 | <https://github.com/brentyi/tyro>`__ and `SimpleParsing 93 | <https://github.com/lebrice/SimpleParsing>`__. 94 | 95 | - Parse from **configuration files** and **environment variables**, like 96 | `OmegaConf <https://github.com/omry/omegaconf>`__, `dynaconf 97 | <https://www.dynaconf.com>`__, `confuse 98 | <https://github.com/beetbox/confuse>`__ and `configargparse 99 | <https://github.com/bw2/ConfigArgParse>`__. 100 | 101 | - **Dataclasses** support, like `SimpleParsing 102 | <https://github.com/lebrice/SimpleParsing>`__ and `Tyro 103 | <https://github.com/brentyi/tyro>`__. 104 | 105 | 106 | Other notable features include: 107 | 108 | - **Extensive type hint support:** nested types (union, optional), containers 109 | (list, dict, etc.), protocols, user-defined generics, restricted types (regex, 110 | numbers), paths, URLs, types from stubs (``*.pyi``), future annotations (PEP 111 | `563 <https://peps.python.org/pep-0563/>`__), and backports (PEP `604 112 | <https://peps.python.org/pep-0604/>`__). 113 | 114 | - **Keyword arguments introspection:** resolving of parameters used via 115 | ``**kwargs``.
116 | 117 | - **Dependency injection:** support types that expect a class instance and 118 | callables that return a class instance. 119 | 120 | - **Structured configs:** parse config files with more understandable non-flat 121 | hierarchies. 122 | 123 | - **Config file formats:** `json <https://www.json.org>`__, `yaml 124 | <https://yaml.org>`__, `toml <https://toml.io>`__, `jsonnet 125 | <https://jsonnet.org>`__ and extensible to more formats. 126 | 127 | - **Relative paths:** within config files and parsing of config paths referenced 128 | inside other configs. 129 | 130 | - **Argument linking:** directing parsed values to multiple parameters, 131 | preventing unnecessary interpolation in configs. 132 | 133 | - **Variable interpolation:** powered by `OmegaConf 134 | <https://github.com/omry/omegaconf>`__. 135 | 136 | - **Tab completion:** powered by `shtab 137 | <https://github.com/iterative/shtab>`__ or `argcomplete 138 | <https://github.com/kislyuk/argcomplete>`__. 139 | 140 | 141 | Design principles 142 | ----------------- 143 | 144 | - **Non-intrusive/decoupled:** 145 | 146 | There is no requirement for unrelated modifications throughout a codebase, 147 | maintaining the `separation of concerns principle 148 | <https://en.wikipedia.org/wiki/Separation_of_concerns>`__. In simpler terms, 149 | changes should make sense even without the CLI. No need to inherit from a 150 | special class, add decorators, or use CLI-specific type hints. 151 | 152 | - **Minimal boilerplate:** 153 | 154 | A recommended practice is to write code with function/class parameters having 155 | meaningful names, accurate type hints, and descriptive docstrings. Reuse these 156 | wherever they appear to automatically generate the CLI, following the `don't 157 | repeat yourself principle 158 | <https://en.wikipedia.org/wiki/Don%27t_repeat_yourself>`__. A notable 159 | advantage is that when parameters are added or types changed, the CLI will 160 | remain synchronized, avoiding the need to update the CLI's implementation.
161 | 162 | - **Dependency injection:** 163 | 164 | Using as type hint a class or a callable that instantiates a class, a practice 165 | known as `dependency injection 166 | <https://en.wikipedia.org/wiki/Dependency_injection>`__, is a sound design 167 | pattern for developing loosely coupled and highly configurable software. Such 168 | type hints should be supported with minimal restrictions. 169 | 170 | 171 | .. _installation: 172 | 173 | Installation 174 | ============ 175 | 176 | You can install using `pip <https://pip.pypa.io>`__ as: 177 | 178 | .. code-block:: bash 179 | 180 | pip install jsonargparse 181 | 182 | By default, the only dependency installed with ``jsonargparse`` is `PyYAML 183 | <https://pypi.org/project/PyYAML/>`__. However, several optional features can be 184 | enabled by specifying one or more of the following extras (optional 185 | dependencies): ``signatures``, ``jsonschema``, ``jsonnet``, ``urls``, 186 | ``fsspec``, ``toml``, ``ruamel``, ``omegaconf``, ``shtab``, and ``argcomplete``. 187 | Additionally, the ``all`` extra can be used to enable all optional features 188 | (excluding tab completion ones). To install ``jsonargparse`` with extras, use 189 | the following syntax: 190 | 191 | .. code-block:: bash 192 | 193 | pip install "jsonargparse[signatures,urls]" # Enable signatures and URLs features 194 | pip install "jsonargparse[all]" # Enable all optional features 195 | 196 | To install the latest development version, use the following command: 197 | 198 | ..
code-block:: bash 199 | 200 | pip install "jsonargparse[signatures] @ git+https://github.com/omni-us/jsonargparse.git@main" 201 | -------------------------------------------------------------------------------- /jsonargparse/__init__.py: -------------------------------------------------------------------------------- 1 | from argparse import ( 2 | ONE_OR_MORE, 3 | OPTIONAL, 4 | PARSER, 5 | REMAINDER, 6 | SUPPRESS, 7 | ZERO_OR_MORE, 8 | ArgumentError, 9 | ) 10 | 11 | from ._actions import * # noqa: F403 12 | from ._cli import * # noqa: F403 13 | from ._common import * # noqa: F403 14 | from ._core import * # noqa: F403 15 | from ._deprecated import * # noqa: F403 16 | from ._formatters import * # noqa: F403 17 | from ._jsonnet import * # noqa: F403 18 | from ._jsonschema import * # noqa: F403 19 | from ._link_arguments import * # noqa: F403 20 | from ._loaders_dumpers import * # noqa: F403 21 | from ._namespace import * # noqa: F403 22 | from ._optionals import * # noqa: F403 23 | from ._signatures import * # noqa: F403 24 | from ._typehints import * # noqa: F403 25 | from ._util import * # noqa: F403 26 | 27 | __all__ = [ 28 | "ArgumentError", 29 | "OPTIONAL", 30 | "REMAINDER", 31 | "SUPPRESS", 32 | "PARSER", 33 | "ONE_OR_MORE", 34 | "ZERO_OR_MORE", 35 | ] 36 | 37 | 38 | from . 
import ( 39 | _actions, 40 | _cli, 41 | _common, 42 | _core, 43 | _deprecated, 44 | _formatters, 45 | _jsonnet, 46 | _jsonschema, 47 | _link_arguments, 48 | _loaders_dumpers, 49 | _namespace, 50 | _optionals, 51 | _signatures, 52 | _typehints, 53 | _util, 54 | ) 55 | 56 | __all__ += _cli.__all__ 57 | __all__ += _core.__all__ 58 | __all__ += _signatures.__all__ 59 | __all__ += _typehints.__all__ 60 | __all__ += _link_arguments.__all__ 61 | __all__ += _jsonschema.__all__ 62 | __all__ += _jsonnet.__all__ 63 | __all__ += _actions.__all__ 64 | __all__ += _namespace.__all__ 65 | __all__ += _formatters.__all__ 66 | __all__ += _optionals.__all__ 67 | __all__ += _common.__all__ 68 | __all__ += _loaders_dumpers.__all__ 69 | __all__ += _util.__all__ 70 | __all__ += _deprecated.__all__ 71 | 72 | 73 | __version__ = "4.42.0" 74 | -------------------------------------------------------------------------------- /jsonargparse/_cli.py: -------------------------------------------------------------------------------- 1 | """Simple creation of command line interfaces.""" 2 | 3 | import inspect 4 | from typing import Any, Callable, Dict, List, Optional, Type, Union 5 | 6 | from ._actions import ActionConfigFile, _ActionPrintConfig, remove_actions 7 | from ._core import ArgumentParser 8 | from ._deprecated import deprecation_warning_cli_return_parser 9 | from ._namespace import Namespace, dict_to_namespace 10 | from ._optionals import get_doc_short_description 11 | from ._util import capture_parser, default_config_option_help 12 | 13 | __all__ = [ 14 | "CLI", 15 | "auto_cli", 16 | "auto_parser", 17 | ] 18 | 19 | 20 | ComponentType = Union[Callable, Type] 21 | DictComponentsType = Dict[str, Union[ComponentType, "DictComponentsType"]] 22 | ComponentsType = Optional[Union[ComponentType, List[ComponentType], DictComponentsType]] 23 | 24 | 25 | def CLI(*args, **kwargs): 26 | """Alias of :func:`auto_cli`.""" 27 | return auto_cli(*args, _stacklevel=3, **kwargs) 28 | 29 | 30 | def auto_cli( 31 | 
components: ComponentsType = None, 32 | args: Optional[List[str]] = None, 33 | config_help: str = default_config_option_help, 34 | set_defaults: Optional[Dict[str, Any]] = None, 35 | as_positional: bool = True, 36 | fail_untyped: bool = True, 37 | parser_class: Type[ArgumentParser] = ArgumentParser, 38 | **kwargs, 39 | ): 40 | """Simple creation of command line interfaces. 41 | 42 | Previously CLI, renamed to follow the standard of functions in lowercase. 43 | 44 | Creates an argument parser from one or more functions/classes, parses 45 | arguments and runs one of the functions or class methods depending on what 46 | was parsed. If the 'components' parameter is not given, then the components 47 | will be all the locals in the context and defined in the same module as from 48 | where auto_cli is called. 49 | 50 | Args: 51 | components: One or more functions/classes to include in the command line interface. 52 | args: List of arguments to parse or None to use sys.argv. 53 | config_help: Help string for config file option in help. 54 | set_defaults: Dictionary of values to override components defaults. 55 | as_positional: Whether to add required parameters as positional arguments. 56 | fail_untyped: Whether to raise exception if a required parameter does not have a type. 57 | parser_class: The ArgumentParser class to use. 58 | **kwargs: Used to instantiate :class:`.ArgumentParser`. 59 | 60 | Returns: 61 | The value returned by the executed function or class method. 
62 | """ 63 | return_parser = kwargs.pop("return_parser", False) 64 | stacklevel = kwargs.pop("_stacklevel", 2) 65 | 66 | if components is None: 67 | caller = inspect.stack()[stacklevel - 1][0] 68 | module = inspect.getmodule(caller) 69 | components = [ 70 | v for v in vars(module).values() if ((inspect.isclass(v) or callable(v)) and inspect.getmodule(v) is module) 71 | ] 72 | if len(components) == 0: 73 | raise ValueError( 74 | "Either components parameter must be given or there must be at least one " 75 | "function or class among the locals in the context where CLI is called." 76 | ) 77 | 78 | if isinstance(components, list) and len(components) == 1: 79 | components = components[0] 80 | 81 | elif not components: 82 | raise ValueError("components parameter expected to be non-empty") 83 | 84 | if isinstance(components, list): 85 | unexpected = [c for c in components if not (inspect.isclass(c) or callable(c))] 86 | elif isinstance(components, dict): 87 | ns = dict_to_namespace(components) 88 | unexpected = [c for k, c in ns.items() if not k.endswith("._help") and not (inspect.isclass(c) or callable(c))] 89 | else: 90 | unexpected = [c for c in [components] if not (inspect.isclass(c) or callable(c))] 91 | if unexpected: 92 | raise ValueError(f"Unexpected components, not class or function: {unexpected}") 93 | 94 | parser = parser_class(default_meta=False, **kwargs) 95 | parser.add_argument("--config", action=ActionConfigFile, help=config_help) 96 | 97 | if not isinstance(components, (list, dict)): 98 | _add_component_to_parser(components, parser, as_positional, fail_untyped, config_help) 99 | if set_defaults is not None: 100 | parser.set_defaults(set_defaults) 101 | if return_parser: 102 | deprecation_warning_cli_return_parser(stacklevel) 103 | return parser 104 | cfg = parser.parse_args(args) 105 | init = parser.instantiate_classes(cfg) 106 | return _run_component(components, init) 107 | 108 | elif isinstance(components, list): 109 | components = {c.__name__: c for c 
in components} 110 | 111 | _add_subcommands(components, parser, config_help, as_positional, fail_untyped) 112 | 113 | if set_defaults is not None: 114 | parser.set_defaults(set_defaults) 115 | if return_parser: 116 | deprecation_warning_cli_return_parser(stacklevel) 117 | return parser 118 | cfg = parser.parse_args(args) 119 | init = parser.instantiate_classes(cfg) 120 | components_ns = dict_to_namespace(components) 121 | subcommand = init.get("subcommand") 122 | while isinstance(init.get(subcommand), Namespace) and isinstance(init[subcommand].get("subcommand"), str): 123 | subsubcommand = subcommand + "." + init[subcommand].get("subcommand") 124 | if subsubcommand in components_ns: 125 | subcommand = subsubcommand 126 | else: 127 | break 128 | component = components_ns[subcommand] 129 | return _run_component(component, init.get(subcommand)) 130 | 131 | 132 | def auto_parser(*args, **kwargs) -> ArgumentParser: 133 | """Same as auto_cli, but returns the parser, so doesn't parse arguments or run. 134 | 135 | This is a shorthand for ``capture_parser(lambda: auto_cli(*args, **kwargs))``. 
136 | """ 137 | return capture_parser(lambda: auto_cli(*args, **kwargs)) 138 | 139 | 140 | def get_help_str(component, logger): 141 | if isinstance(component, dict): 142 | return component.get("_help") 143 | help_str = get_doc_short_description(component, logger=logger) 144 | if not help_str: 145 | help_str = str(component) 146 | return help_str 147 | 148 | 149 | def _add_subcommands( 150 | components, 151 | parser: ArgumentParser, 152 | config_help: str, 153 | as_positional: bool, 154 | fail_untyped: bool, 155 | ) -> None: 156 | subcommands = parser.add_subcommands(required=True) 157 | for name, component in components.items(): 158 | if name == "_help": 159 | continue 160 | description = get_help_str(component, parser.logger) 161 | subparser = type(parser)(description=description) 162 | subparser.add_argument("--config", action=ActionConfigFile, help=config_help) 163 | subcommands.add_subcommand(name, subparser, help=description) 164 | if isinstance(component, dict): 165 | _add_subcommands(component, subparser, config_help, as_positional, fail_untyped) 166 | else: 167 | added_args = _add_component_to_parser(component, subparser, as_positional, fail_untyped, config_help) 168 | if not added_args: 169 | remove_actions(subparser, (ActionConfigFile, _ActionPrintConfig)) 170 | 171 | 172 | def has_parameter(component, name) -> bool: 173 | return name in inspect.signature(component).parameters 174 | 175 | 176 | def _add_component_to_parser( 177 | component, 178 | parser: ArgumentParser, 179 | as_positional: bool, 180 | fail_untyped: bool, 181 | config_help: str, 182 | ): 183 | kwargs: dict = {"as_positional": as_positional, "fail_untyped": fail_untyped, "sub_configs": True} 184 | if inspect.isclass(component): 185 | class_methods = [ 186 | k for k, v in inspect.getmembers(component) if (callable(v) or isinstance(v, property)) and k[0] != "_" 187 | ] 188 | if not class_methods: 189 | added_args = parser.add_class_arguments(component, as_group=False, **kwargs) 190 | if not 
parser.description: 191 | parser.description = get_help_str(component, parser.logger) 192 | return added_args 193 | added_args = parser.add_class_arguments(component, **kwargs) 194 | subcommands = parser.add_subcommands(required=True) 195 | for method in class_methods: 196 | method_object = getattr(component, method) 197 | description = get_help_str(method_object, parser.logger) 198 | subparser = type(parser)(description=description) 199 | if not isinstance(method_object, property): 200 | if not has_parameter(method_object, "config"): 201 | subparser.add_argument("--config", action=ActionConfigFile, help=config_help) 202 | added_subargs = subparser.add_method_arguments(component, method, as_group=False, **kwargs) 203 | added_args += [f"{method}.{a}" for a in added_subargs] 204 | if not added_subargs: 205 | remove_actions(subparser, (ActionConfigFile, _ActionPrintConfig)) 206 | subcommands.add_subcommand(method, subparser, help=get_help_str(method_object, parser.logger)) 207 | else: 208 | added_args = parser.add_function_arguments(component, as_group=False, **kwargs) 209 | if not parser.description: 210 | parser.description = get_help_str(component, parser.logger) 211 | return added_args 212 | 213 | 214 | def _run_component(component, cfg): 215 | cfg.pop("config", None) 216 | subcommand = cfg.pop("subcommand") 217 | if inspect.isclass(component) and subcommand: 218 | subcommand_cfg = cfg.pop(subcommand, {}) 219 | subcommand_cfg.pop("config", None) 220 | component_obj = component(**cfg) 221 | if isinstance(getattr(component, subcommand), property): 222 | return getattr(component_obj, subcommand) 223 | component = getattr(component_obj, subcommand) 224 | cfg = subcommand_cfg 225 | if inspect.iscoroutinefunction(component): 226 | return __import__("asyncio").run(component(**cfg)) 227 | return component(**cfg) 228 | -------------------------------------------------------------------------------- /jsonargparse/_completions.py: 
-------------------------------------------------------------------------------- 1 | import argparse 2 | import inspect 3 | import locale 4 | import os 5 | import re 6 | from collections import defaultdict 7 | from contextlib import contextmanager, suppress 8 | from contextvars import ContextVar 9 | from copy import copy 10 | from enum import Enum 11 | from importlib.util import find_spec 12 | from subprocess import PIPE, Popen 13 | from typing import List, Literal, Union 14 | 15 | from ._actions import ActionConfigFile, _ActionConfigLoad, _ActionHelpClassPath, remove_actions 16 | from ._common import get_optionals_as_positionals_actions, get_parsing_setting 17 | from ._parameter_resolvers import get_signature_parameters 18 | from ._typehints import ( 19 | ActionTypeHint, 20 | callable_origin_types, 21 | get_all_subclass_paths, 22 | get_callable_return_type, 23 | get_typehint_origin, 24 | is_subclass, 25 | type_to_str, 26 | ) 27 | from ._util import NoneType, Path, import_object, unique 28 | 29 | 30 | def handle_completions(parser): 31 | if find_spec("argcomplete") and "_ARGCOMPLETE" in os.environ: 32 | import argcomplete 33 | 34 | from ._common import parser_context 35 | 36 | with parser_context(load_value_mode=parser.parser_mode): 37 | argcomplete.autocomplete(parser) 38 | 39 | if find_spec("shtab") and not getattr(parser, "parent_parser", None): 40 | if not any(isinstance(action, ShtabAction) for action in parser._actions): 41 | parser.add_argument("--print_shtab", action=ShtabAction) 42 | 43 | 44 | # argcomplete 45 | 46 | 47 | def get_files_completer(): 48 | from argcomplete.completers import FilesCompleter 49 | 50 | return FilesCompleter() 51 | 52 | 53 | def argcomplete_namespace(caller, parser, namespace): 54 | if caller == "argcomplete": 55 | namespace.__class__ = __import__("jsonargparse").Namespace 56 | namespace = parser.merge_config(parser.get_defaults(skip_validation=True), namespace).as_flat() 57 | return namespace 58 | 59 | 60 | def 
argcomplete_warn_redraw_prompt(prefix, message): 61 | import argcomplete 62 | 63 | if prefix != "": 64 | argcomplete.warn(message) 65 | with suppress(Exception): 66 | proc = Popen(f"ps -p {os.getppid()} -oppid=".split(), stdout=PIPE, stderr=PIPE) 67 | stdout, _ = proc.communicate() 68 | shell_pid = int(stdout.decode().strip()) 69 | os.kill(shell_pid, 28) 70 | _ = "_" if locale.getlocale()[1] != "UTF-8" else "\xa0" 71 | return [_ + message.replace(" ", _), ""] 72 | 73 | 74 | # shtab 75 | 76 | shtab_shell: ContextVar = ContextVar("shtab_shell") 77 | shtab_prog: ContextVar = ContextVar("shtab_prog") 78 | shtab_preambles: ContextVar = ContextVar("shtab_preambles") 79 | 80 | 81 | class ShtabAction(argparse.Action): 82 | def __init__( 83 | self, 84 | option_strings, 85 | dest=argparse.SUPPRESS, 86 | default=argparse.SUPPRESS, 87 | ): 88 | import shtab 89 | 90 | super().__init__( 91 | option_strings=option_strings, 92 | dest=dest, 93 | default=default, 94 | choices=shtab.SUPPORTED_SHELLS, 95 | help="Print shtab shell completion script.", 96 | ) 97 | 98 | def __call__(self, parser, namespace, shell, option_string=None): 99 | import shtab 100 | 101 | prog = norm_name(parser.prog) 102 | assert prog 103 | preambles = [] 104 | if shell == "bash": 105 | preambles = [bash_compgen_typehint.strip().replace("%s", prog)] 106 | with prepare_actions_context(shell, prog, preambles): 107 | shtab_prepare_actions(parser) 108 | print(shtab.complete(parser, shell, preamble="\n".join(preambles))) 109 | parser.exit(0) 110 | 111 | 112 | @contextmanager 113 | def prepare_actions_context(shell, prog, preambles): 114 | token_shell = shtab_shell.set(shell) 115 | token_prog = shtab_prog.set(prog) 116 | token_preambles = shtab_preambles.set(preambles) 117 | try: 118 | yield 119 | finally: 120 | shtab_shell.reset(token_shell) 121 | shtab_prog.reset(token_prog) 122 | shtab_preambles.reset(token_preambles) 123 | 124 | 125 | def norm_name(name: str) -> str: 126 | return re.sub(r"\W+", "_", name) 127 | 
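For readers skimming this module: ``norm_name`` above collapses every run of non-word characters into a single underscore, so that program and argument names become valid shell identifiers for the generated completion functions. A standalone sketch of its behavior (the example names here are hypothetical, not taken from the codebase):

```python
import re

def norm_name(name: str) -> str:
    # Same regex as the helper above: replace each run of non-word
    # characters (anything outside [A-Za-z0-9_]) with a single "_".
    return re.sub(r"\W+", "_", name)

# Hypothetical program names, normalized for use in shell function names:
print(norm_name("my-prog 1.0"))  # my_prog_1_0
print(norm_name("tool.py"))      # tool_py
```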
128 | 129 | def shtab_prepare_actions(parser) -> None: 130 | remove_actions(parser, (ShtabAction,)) 131 | if parser._subcommands_action: 132 | for subparser in parser._subcommands_action._name_parser_map.values(): 133 | shtab_prepare_actions(subparser) 134 | if get_parsing_setting("parse_optionals_as_positionals"): 135 | for action in get_optionals_as_positionals_actions(parser): 136 | clone = copy(action) 137 | clone.option_strings = [] 138 | clone.nargs = "?" 139 | parser._actions.append(clone) 140 | for action in parser._actions: 141 | shtab_prepare_action(action, parser) 142 | 143 | 144 | def shtab_prepare_action(action, parser) -> None: 145 | import shtab 146 | 147 | if action.choices or hasattr(action, "complete"): 148 | return 149 | 150 | complete = None 151 | if isinstance(action, (ActionConfigFile, _ActionConfigLoad)): 152 | complete = shtab.FILE 153 | elif isinstance(action, ActionTypeHint): 154 | typehint = action._typehint 155 | if get_typehint_origin(typehint) == Union: 156 | subtypes = [s for s in typehint.__args__ if s not in {NoneType, str, dict, list, tuple, bytes}] 157 | if len(subtypes) == 1: 158 | typehint = subtypes[0] 159 | if is_subclass(typehint, Path): 160 | if "f" in typehint._mode: 161 | complete = shtab.FILE 162 | elif "d" in typehint._mode: 163 | complete = shtab.DIRECTORY 164 | elif is_subclass(typehint, os.PathLike): 165 | complete = shtab.FILE 166 | if complete: 167 | action.complete = complete 168 | return 169 | 170 | choices = None 171 | if isinstance(action, ActionTypeHint): 172 | skip = getattr(action, "sub_add_kwargs", {}).get("skip", set()) 173 | prefix = action.option_strings[0] if action.option_strings else None 174 | choices = get_typehint_choices(action._typehint, prefix, parser, skip) 175 | if shtab_shell.get() == "bash": 176 | message = f"Expected type: {type_to_str(action._typehint)}" 177 | if action.option_strings == []: 178 | message = f"Argument: {action.dest}; " + message 179 | add_bash_typehint_completion(parser, 
action, message, choices) 180 | choices = None 181 | elif isinstance(action, _ActionHelpClassPath): 182 | choices = get_help_class_choices(action._typehint) 183 | if choices: 184 | action.choices = choices 185 | 186 | 187 | bash_compgen_typehint_name = "_jsonargparse_%s_compgen_typehint" 188 | bash_compgen_typehint = """ 189 | _jsonargparse_%%s_matched_choices() { 190 | local TOTAL=$(echo "$1" | wc -w | tr -d " ") 191 | if [ "$TOTAL" != 0 ]; then 192 | local MATCH=$(echo "$2" | wc -w | tr -d " ") 193 | printf "; $MATCH/$TOTAL matched choices" 194 | fi 195 | } 196 | %(name)s() { 197 | local MATCH=( $(IFS=" " compgen -W "$1" "$2") ) 198 | if [ ${#MATCH[@]} = 0 ]; then 199 | if [ "$COMP_TYPE" = 63 ]; then 200 | MATCHED=$(_jsonargparse_%%s_matched_choices "$1" "${MATCH[*]}") 201 | printf "%(b)s\\n$3$MATCHED\\n%(n)s" >&2 202 | kill -WINCH $$ 203 | fi 204 | else 205 | IFS=" " compgen -W "$1" "$2" 206 | if [ "$COMP_TYPE" = 63 ]; then 207 | MATCHED=$(_jsonargparse_%%s_matched_choices "$1" "${MATCH[*]}") 208 | printf "%(b)s\\n$3$MATCHED%(n)s" >&2 209 | fi 210 | fi 211 | } 212 | """ % { 213 | "name": bash_compgen_typehint_name, 214 | "b": "$(tput setaf 5)", 215 | "n": "$(tput sgr0)", 216 | } 217 | 218 | 219 | def add_bash_typehint_completion(parser, action, message, choices) -> None: 220 | fn_typehint = norm_name(bash_compgen_typehint_name % shtab_prog.get()) 221 | fn_name = parser.prog.replace(" [options] ", "_") 222 | fn_name = norm_name(f"_jsonargparse_{fn_name}_{action.dest}_typehint") 223 | fn = '{fn_name}(){{ {fn_typehint} "{choices}" "$1" "{message}"; }}'.format( 224 | fn_name=fn_name, 225 | fn_typehint=fn_typehint, 226 | choices=" ".join(choices), 227 | message=message, 228 | ) 229 | shtab_preambles.get().append(fn) 230 | action.complete = {"bash": fn_name} 231 | 232 | 233 | def get_typehint_choices(typehint, prefix, parser, skip, choices=None, added_subclasses=None) -> List[str]: 234 | if choices is None: 235 | choices = [] 236 | if not added_subclasses: 237 | 
added_subclasses = set() 238 | if typehint is bool: 239 | choices.extend(["true", "false"]) 240 | elif typehint is NoneType: 241 | choices.append("null") 242 | elif is_subclass(typehint, Enum): 243 | choices.extend(list(typehint.__members__)) 244 | else: 245 | origin = get_typehint_origin(typehint) 246 | if origin == Literal: 247 | choices.extend([str(a) for a in typehint.__args__ if isinstance(a, (str, int, float))]) 248 | elif origin == Union: 249 | for subtype in typehint.__args__: 250 | if subtype in added_subclasses or subtype is object: 251 | continue 252 | get_typehint_choices(subtype, prefix, parser, skip, choices, added_subclasses) 253 | elif ActionTypeHint.is_subclass_typehint(typehint): 254 | added_subclasses.add(typehint) 255 | choices.extend(add_subactions_and_get_subclass_choices(typehint, prefix, parser, skip, added_subclasses)) 256 | elif origin in callable_origin_types: 257 | return_type = get_callable_return_type(typehint) 258 | if return_type and ActionTypeHint.is_subclass_typehint(return_type): 259 | num_args = len(typehint.__args__) - 1 260 | skip.add(num_args) 261 | choices.extend( 262 | add_subactions_and_get_subclass_choices(return_type, prefix, parser, skip, added_subclasses) 263 | ) 264 | 265 | return [] if choices == ["null"] else choices 266 | 267 | 268 | def add_subactions_and_get_subclass_choices(typehint, prefix, parser, skip, added_subclasses) -> List[str]: 269 | choices = [] 270 | paths = get_all_subclass_paths(typehint) 271 | init_args = defaultdict(list) 272 | subclasses = defaultdict(list) 273 | for path in paths: 274 | choices.append(path) 275 | try: 276 | cls = import_object(path) 277 | params = get_signature_parameters(cls, None, parser._logger) 278 | except Exception as ex: 279 | parser._logger.debug(f"Unable to get signature parameters for '{path}': {ex}") 280 | continue 281 | num_skip = next((s for s in skip if isinstance(s, int)), 0) 282 | if num_skip > 0: 283 | params = params[num_skip:] 284 | for param in params: 285 | 
if param.name not in skip: 286 | init_args[param.name].append(param.annotation) 287 | subclasses[param.name].append(path.rsplit(".", 1)[-1]) 288 | 289 | if prefix is not None: 290 | for name, subtypes in init_args.items(): 291 | option_string = f"{prefix}.{name}" 292 | if option_string not in parser._option_string_actions: 293 | action = parser.add_argument(option_string) 294 | for subtype in unique(subtypes): 295 | subchoices = get_typehint_choices(subtype, option_string, parser, skip, None, added_subclasses) 296 | if shtab_shell.get() == "bash": 297 | message = f"Expected type: {type_to_str(subtype)}; " 298 | message += f"Accepted by subclasses: {', '.join(subclasses[name])}" 299 | add_bash_typehint_completion(parser, action, message, subchoices) 300 | elif subchoices: 301 | action.choices = subchoices 302 | 303 | return choices 304 | 305 | 306 | def get_help_class_choices(typehint) -> List[str]: 307 | choices = [] 308 | if get_typehint_origin(typehint) == Union: 309 | for subtype in typehint.__args__: 310 | if inspect.isclass(subtype): 311 | choices.extend(get_help_class_choices(subtype)) 312 | else: 313 | choices = get_all_subclass_paths(typehint) 314 | return choices 315 | -------------------------------------------------------------------------------- /jsonargparse/_jsonnet.py: -------------------------------------------------------------------------------- 1 | """Actions to support jsonnet.""" 2 | 3 | from typing import Any, Dict, Optional, Tuple, Union 4 | 5 | from ._actions import _find_action, _is_action_value_list 6 | from ._common import Action, parser_context 7 | from ._jsonschema import ActionJsonSchema 8 | from ._loaders_dumpers import get_loader_exceptions, load_value 9 | from ._optionals import ( 10 | _get_config_read_mode, 11 | get_jsonschema_exceptions, 12 | import_jsonnet, 13 | import_jsonschema, 14 | pyyaml_available, 15 | ) 16 | from ._typehints import ActionTypeHint 17 | from ._util import NoneType, Path, argument_error 18 | 19 | __all__ = 
["ActionJsonnet"] 20 | 21 | 22 | class ActionJsonnet(Action): 23 | """Action to parse a jsonnet, optionally validating against a jsonschema.""" 24 | 25 | def __init__( 26 | self, 27 | ext_vars: Optional[str] = None, 28 | schema: Optional[Union[str, Dict]] = None, 29 | **kwargs, 30 | ): 31 | """Initializer for ActionJsonnet instance. 32 | 33 | Args: 34 | ext_vars: Key where to find the external variables required to parse the jsonnet. 35 | schema: Schema to validate values against. 36 | 37 | Raises: 38 | ValueError: If a parameter is invalid. 39 | jsonschema.exceptions.SchemaError: If the schema is invalid. 40 | """ 41 | if "_validator" not in kwargs: 42 | import_jsonnet("ActionJsonnet") 43 | if not isinstance(ext_vars, (str, NoneType)): 44 | raise ValueError("ext_vars has to be either None or a string.") 45 | self._ext_vars = ext_vars 46 | if schema is not None: 47 | jsonvalidator = import_jsonschema("ActionJsonnet")[1] 48 | if isinstance(schema, str): 49 | mode = "yaml" if pyyaml_available else "json" 50 | with parser_context(load_value_mode=mode): 51 | try: 52 | schema = load_value(schema) 53 | except get_loader_exceptions(mode) as ex: 54 | raise ValueError(f"Problems parsing schema: {ex}") from ex 55 | jsonvalidator.check_schema(schema) 56 | self._validator = ActionJsonSchema._extend_jsonvalidator_with_default(jsonvalidator)(schema) 57 | else: 58 | self._validator = None 59 | else: 60 | self._ext_vars = kwargs.pop("_ext_vars") 61 | self._validator = kwargs.pop("_validator") 62 | super().__init__(**kwargs) 63 | 64 | def __call__(self, *args, **kwargs): 65 | """Parses an argument as jsonnet using ext_vars if defined. 66 | 67 | Raises: 68 | TypeError: If the argument is not valid. 
69 | """ 70 | if len(args) == 0: 71 | kwargs["_ext_vars"] = self._ext_vars 72 | kwargs["_validator"] = self._validator 73 | if "help" in kwargs and "%s" in kwargs["help"] and self._validator is not None: 74 | import json 75 | 76 | kwargs["help"] = kwargs["help"] % json.dumps(self._validator.schema, sort_keys=True) 77 | return ActionJsonnet(**kwargs) 78 | setattr(args[1], self.dest, self._check_type_(args[2], cfg=args[1])) 79 | return None 80 | 81 | def _check_type(self, value, cfg): 82 | islist = _is_action_value_list(self) 83 | ext_vars = {} 84 | if cfg: 85 | ext_vars = cfg.get(self._ext_vars, {}) 86 | if not islist: 87 | value = [value] 88 | for num, val in enumerate(value): 89 | try: 90 | if isinstance(val, str): 91 | val = self.parse(val, ext_vars=ext_vars, with_meta=True) 92 | elif self._validator is not None: 93 | self._validator.validate(val) 94 | value[num] = val 95 | except (TypeError, RuntimeError) + get_jsonschema_exceptions() + get_loader_exceptions() as ex: 96 | elem = "" if not islist else " element " + str(num + 1) 97 | raise TypeError(f'Parser key "{self.dest}"{elem}: {ex}') from ex 98 | return value if islist else value[0] 99 | 100 | @staticmethod 101 | def _check_ext_vars_action(parser, action): 102 | if isinstance(action, ActionJsonnet) and action._ext_vars: 103 | ext_vars_action = _find_action(parser, action._ext_vars) 104 | if not ext_vars_action: 105 | raise ValueError(f"No argument found for ext_vars='{action._ext_vars}'") 106 | ext_vars_type = isinstance(ext_vars_action, ActionTypeHint) and ext_vars_action._typehint 107 | if ext_vars_type not in {dict, Dict}: 108 | raise ValueError( 109 | f"Type for ext_vars='{action._ext_vars}' argument must be dict, given: {ext_vars_type}" 110 | ) 111 | if ext_vars_action.default is None: 112 | ext_vars_action.default = {} 113 | if not isinstance(ext_vars_action.default, dict): 114 | raise ValueError( 115 | f"Default value for the ext_vars='{action._ext_vars}' argument " 116 | f"must be dict or None, 
given: {ext_vars_action.default}" 117 | ) 118 | ext_vars_action.jsonnet_ext_vars = True 119 | 120 | @staticmethod 121 | def split_ext_vars(ext_vars: Optional[Dict[str, Any]]) -> Tuple[Dict[str, Any], Dict[str, Any]]: 122 | """Splits an ext_vars dict into the ext_codes and ext_vars required by jsonnet. 123 | 124 | Args: 125 | ext_vars: External variables. Values can be strings or any other basic type. 126 | """ 127 | if ext_vars is None: 128 | ext_vars = {} 129 | import json 130 | 131 | ext_codes = {k: json.dumps(v) for k, v in ext_vars.items() if not isinstance(v, str)} 132 | ext_vars = {k: v for k, v in ext_vars.items() if isinstance(v, str)} 133 | return ext_vars, ext_codes 134 | 135 | def parse( 136 | self, 137 | jsonnet: Union[str, Path], 138 | ext_vars: Optional[Dict[str, Any]] = None, 139 | with_meta: bool = False, 140 | ) -> Dict: 141 | """Method that can be used to parse jsonnet independently of an ArgumentParser. 142 | 143 | Args: 144 | jsonnet: Either a path to a jsonnet file or the jsonnet content. 145 | ext_vars: External variables. Values can be strings or any other basic type. 146 | with_meta: Whether to include metadata in config object. 147 | 148 | Returns: 149 | The parsed jsonnet object. 150 | 151 | Raises: 152 | TypeError: If the input is neither a path to an existing file nor a jsonnet. 
153 | """ 154 | _jsonnet = import_jsonnet("ActionJsonnet") 155 | ext_vars, ext_codes = self.split_ext_vars(ext_vars) 156 | fpath = None 157 | fname = "snippet" 158 | snippet = jsonnet 159 | try: 160 | fpath = Path(jsonnet, mode=_get_config_read_mode()) 161 | except TypeError: 162 | pass 163 | else: 164 | fname = jsonnet(absolute=False) if isinstance(jsonnet, Path) else jsonnet 165 | snippet = fpath.get_content() 166 | try: 167 | with parser_context(load_value_mode="yaml" if pyyaml_available else "json"): 168 | values = load_value(_jsonnet.evaluate_snippet(fname, snippet, ext_vars=ext_vars, ext_codes=ext_codes)) 169 | except RuntimeError as ex: 170 | raise argument_error(f'Problems evaluating jsonnet "{fname}": {ex}') from ex 171 | if self._validator is not None: 172 | self._validator.validate(values) 173 | if with_meta: 174 | if fpath is not None: 175 | values["__path__"] = fpath 176 | values["__orig__"] = snippet 177 | return values 178 | -------------------------------------------------------------------------------- /jsonargparse/_jsonschema.py: -------------------------------------------------------------------------------- 1 | """Action to support jsonschemas.""" 2 | 3 | import os 4 | from typing import Dict, Optional, Union 5 | 6 | from ._actions import _is_action_value_list 7 | from ._common import Action, parser_context 8 | from ._loaders_dumpers import get_loader_exceptions, load_value 9 | from ._namespace import strip_meta 10 | from ._optionals import ( 11 | get_jsonschema_exceptions, 12 | import_jsonschema, 13 | pyyaml_available, 14 | ) 15 | from ._util import parse_value_or_config 16 | 17 | __all__ = ["ActionJsonSchema"] 18 | 19 | 20 | class ActionJsonSchema(Action): 21 | """Action to parse option as json validated by a jsonschema.""" 22 | 23 | def __init__( 24 | self, schema: Optional[Union[str, Dict]] = None, enable_path: bool = True, with_meta: bool = True, **kwargs 25 | ): 26 | """Initializer for ActionJsonSchema instance. 
27 | 28 | Args: 29 | schema: Schema to validate values against. 30 | enable_path: Whether to try to load json from path (def.=True). 31 | with_meta: Whether to include metadata (def.=True). 32 | 33 | Raises: 34 | ValueError: If a parameter is invalid. 35 | jsonschema.exceptions.SchemaError: If the schema is invalid. 36 | """ 37 | if schema is not None: 38 | if isinstance(schema, str): 39 | mode = "yaml" if pyyaml_available else "json" 40 | with parser_context(load_value_mode=mode): 41 | try: 42 | schema = load_value(schema) 43 | except get_loader_exceptions(mode) as ex: 44 | raise ValueError(f"Problems parsing schema: {ex}") from ex 45 | jsonvalidator = import_jsonschema("ActionJsonSchema")[1] 46 | jsonvalidator.check_schema(schema) 47 | self._validator = self._extend_jsonvalidator_with_default(jsonvalidator)(schema) 48 | self._enable_path = enable_path 49 | self._with_meta = with_meta 50 | elif "_validator" not in kwargs: 51 | raise ValueError("Expected schema keyword argument.") 52 | else: 53 | self._validator = kwargs.pop("_validator") 54 | self._enable_path = kwargs.pop("_enable_path") 55 | self._with_meta = kwargs.pop("_with_meta") 56 | super().__init__(**kwargs) 57 | 58 | def __call__(self, *args, **kwargs): 59 | """Parses an argument validating against the corresponding jsonschema. 60 | 61 | Raises: 62 | TypeError: If the argument is not valid. 
63 | """ 64 | if len(args) == 0: 65 | kwargs["_validator"] = self._validator 66 | kwargs["_enable_path"] = self._enable_path 67 | kwargs["_with_meta"] = self._with_meta 68 | if "help" in kwargs and isinstance(kwargs["help"], str) and "%s" in kwargs["help"]: 69 | import json 70 | 71 | kwargs["help"] = kwargs["help"] % json.dumps(self._validator.schema, sort_keys=True) 72 | class_type = kwargs.pop("_class_type", ActionJsonSchema) 73 | return class_type(**kwargs) 74 | val = self._check_type(args[2]) 75 | if not self._with_meta: 76 | val = strip_meta(val) 77 | setattr(args[1], self.dest, val) 78 | return None 79 | 80 | def _check_type(self, value): 81 | islist = _is_action_value_list(self) 82 | if not islist: 83 | value = [value] 84 | for num, val in enumerate(value): 85 | try: 86 | val, fpath = parse_value_or_config(val, enable_path=self._enable_path) 87 | path_meta = val.pop("__path__") if isinstance(val, dict) and "__path__" in val else None 88 | self._validator.validate(val) 89 | if path_meta is not None: 90 | val["__path__"] = path_meta 91 | if isinstance(val, dict) and fpath is not None: 92 | val["__path__"] = fpath 93 | value[num] = val 94 | except (TypeError, ValueError) + get_jsonschema_exceptions() + get_loader_exceptions() as ex: 95 | elem = "" if not islist else " element " + str(num + 1) 96 | raise TypeError(f'Parser key "{self.dest}"{elem}: {ex}') from ex 97 | return value if islist else value[0] 98 | 99 | @staticmethod 100 | def _extend_jsonvalidator_with_default(validator_class): 101 | """Extends a json schema validator so that it fills in default values.""" 102 | validate_properties = validator_class.VALIDATORS["properties"] 103 | 104 | def set_defaults(validator, properties, instance, schema): 105 | valid = True 106 | for validation in validate_properties(validator, properties, instance, schema): 107 | if isinstance(validation, jsonschema.exceptions.ValidationError): 108 | valid = False 109 | yield validation 110 | if valid: 111 | for prop, subschema 
in properties.items(): 112 | if "default" in subschema: 113 | instance.setdefault(prop, subschema["default"]) 114 | 115 | jsonschema = import_jsonschema("ActionJsonSchema")[0] 116 | return jsonschema.validators.extend(validator_class, {"properties": set_defaults}) 117 | 118 | def completer(self, prefix, **kwargs): 119 | """Used by argcomplete, validates value and shows expected type.""" 120 | if chr(int(os.environ["COMP_TYPE"])) == "?": 121 | from ._completions import argcomplete_warn_redraw_prompt 122 | 123 | try: 124 | if prefix.strip() == "": 125 | raise ValueError() 126 | self._validator.validate(load_value(prefix)) 127 | msg = "value already valid, " 128 | except (ValueError,) + get_jsonschema_exceptions() + get_loader_exceptions(): 129 | msg = "value not yet valid, " 130 | else: 131 | import json 132 | 133 | schema = json.dumps(self._validator.schema, indent=2, sort_keys=True).replace("\n", "\n ") 134 | msg += f"required to be valid according to schema:\n {schema}\n" 135 | return argcomplete_warn_redraw_prompt(prefix, msg) 136 | -------------------------------------------------------------------------------- /jsonargparse/_stubs_resolver.py: -------------------------------------------------------------------------------- 1 | import ast 2 | import inspect 3 | import sys 4 | from contextlib import suppress 5 | from copy import deepcopy 6 | from importlib import import_module 7 | from typing import TYPE_CHECKING, Any, Dict, Optional, Tuple 8 | 9 | from ._common import get_parsing_setting 10 | from ._optionals import import_typeshed_client, typeshed_client_support 11 | from ._postponed_annotations import NamesVisitor, get_arg_type 12 | 13 | if TYPE_CHECKING: # pragma: no cover 14 | import typeshed_client as tc 15 | else: 16 | tc = import_typeshed_client() 17 | 18 | 19 | kinds = inspect._ParameterKind 20 | 21 | 22 | def import_module_or_none(path: str): 23 | if path.endswith(".__init__"): 24 | path = path[:-9] 25 | try: 26 | return import_module(path) 27 | except 
ModuleNotFoundError: 28 | return None 29 | 30 | 31 | class ImportsVisitor(ast.NodeVisitor): 32 | def visit_ImportFrom(self, node: ast.ImportFrom) -> None: 33 | if node.level: 34 | module_path = self.module_path[: -node.level] 35 | if node.module: 36 | module_path.append(node.module) 37 | node = deepcopy(node) 38 | node.module = ".".join(module_path) 39 | node.level = 0 40 | for alias in node.names: 41 | self.imports_found[alias.asname or alias.name] = (node.module, alias.name) 42 | 43 | def find(self, node: ast.AST, module_path: str) -> Dict[str, Tuple[Optional[str], str]]: 44 | self.module_path = module_path.split(".") 45 | self.imports_found: Dict[str, Tuple[Optional[str], str]] = {} 46 | self.visit(node) 47 | return self.imports_found 48 | 49 | 50 | def ast_annassign_to_assign(node: ast.AnnAssign) -> ast.Assign: 51 | return ast.Assign( 52 | targets=[node.target], 53 | value=node.value, # type: ignore[arg-type] 54 | lineno=node.lineno, 55 | end_lineno=node.lineno, 56 | ) 57 | 58 | 59 | class AssignsVisitor(ast.NodeVisitor): 60 | def visit_Assign(self, node: ast.Assign) -> None: 61 | for target in node.targets: 62 | if hasattr(target, "id"): 63 | self.assigns_found[target.id] = node 64 | 65 | def visit_AnnAssign(self, node: ast.AnnAssign) -> None: 66 | if hasattr(node.target, "id"): 67 | self.assigns_found[node.target.id] = ast_annassign_to_assign(node) 68 | 69 | def find(self, node: ast.AST) -> Dict[str, ast.Assign]: 70 | self.assigns_found: Dict[str, ast.Assign] = {} 71 | self.visit(node) 72 | return self.assigns_found 73 | 74 | 75 | class MethodsVisitor(ast.NodeVisitor): 76 | method_found: Optional[ast.FunctionDef] 77 | 78 | def visit_FunctionDef(self, node: ast.FunctionDef) -> None: 79 | if not self.method_found and node.name == self.method_name: 80 | self.method_found = node 81 | 82 | def visit_If(self, node: ast.If) -> None: 83 | test_ast = ast.parse("___test___ = 0") 84 | test_ast.body[0].value = node.test # type: ignore[attr-defined] 85 | exec_vars = 
{"sys": sys} 86 | with suppress(Exception): 87 | exec(compile(test_ast, filename="", mode="exec"), exec_vars, exec_vars) 88 | if exec_vars["___test___"]: 89 | node.orelse = [] 90 | else: 91 | node.body = [] 92 | self.generic_visit(node) 93 | 94 | def find(self, node: ast.AST, method_name: str) -> Optional[ast.FunctionDef]: 95 | self.method_name = method_name 96 | self.method_found = None 97 | self.visit(node) 98 | return self.method_found 99 | 100 | 101 | stubs_resolver = None 102 | 103 | 104 | def get_stubs_resolver(): 105 | global stubs_resolver 106 | if not stubs_resolver: 107 | allow_py_files = get_parsing_setting("stubs_resolver_allow_py_files") 108 | search_context = tc.get_search_context(allow_py_files=allow_py_files) 109 | stubs_resolver = StubsResolver(search_context=search_context) 110 | return stubs_resolver 111 | 112 | 113 | def get_mro_method_parent(parent, method_name): 114 | while hasattr(parent, "__dict__") and method_name not in parent.__dict__: 115 | try: 116 | parent = inspect.getmro(parent)[1] 117 | except IndexError: 118 | parent = None 119 | return None if parent is object else parent 120 | 121 | 122 | def get_source_module(path: str, component) -> tc.ModulePath: 123 | if component is None: 124 | module_path, name = path.rsplit(".", 1) 125 | component = getattr(import_module_or_none(module_path), name, None) 126 | if component is not None: 127 | module = inspect.getmodule(component) 128 | assert module is not None 129 | module_path = module.__name__ 130 | if getattr(module, "__file__", "").endswith("__init__.py"): 131 | module_path += ".__init__" 132 | return tc.ModulePath(tuple(module_path.split("."))) 133 | 134 | 135 | class StubsResolver(tc.Resolver): 136 | def __init__(self, **kwargs) -> None: 137 | super().__init__(**kwargs) 138 | self._module_ast_cache: Dict[str, Optional[ast.AST]] = {} 139 | self._module_assigns_cache: Dict[str, Dict[str, ast.Assign]] = {} 140 | self._module_imports_cache: Dict[str, Dict[str, Tuple[Optional[str], str]]] 
= {} 141 | 142 | def get_imported_info(self, path: str, component=None) -> Optional[tc.ImportedInfo]: 143 | resolved = self.get_fully_qualified_name(path) 144 | imported_info = None 145 | if isinstance(resolved, tc.ImportedInfo): 146 | imported_info = resolved 147 | elif isinstance(resolved, tc.NameInfo): 148 | source_module = get_source_module(path, component) 149 | imported_info = tc.ImportedInfo(source_module=source_module, info=resolved) 150 | return imported_info 151 | 152 | def get_component_imported_info(self, component, parent) -> Optional[tc.ImportedInfo]: 153 | if not parent and inspect.ismethod(component): 154 | parent = type(component.__self__) 155 | component = getattr(parent, component.__name__) 156 | if not parent: 157 | return self.get_imported_info(f"{component.__module__}.{component.__name__}", component) 158 | parent = get_mro_method_parent(parent, component.__name__) 159 | stub_import = parent and self.get_imported_info(f"{parent.__module__}.{parent.__name__}", component) 160 | if stub_import and isinstance(stub_import.info.ast, ast.AST): 161 | method_ast = MethodsVisitor().find(stub_import.info.ast, component.__name__) 162 | assert method_ast 163 | name_info = tc.NameInfo(name=component.__qualname__, is_exported=False, ast=method_ast) 164 | stub_import = tc.ImportedInfo(source_module=stub_import.source_module, info=name_info) 165 | return stub_import 166 | 167 | def get_aliases(self, imported_info: tc.ImportedInfo): 168 | aliases: Dict[str, Tuple[str, Any]] = {} 169 | self.add_import_aliases(aliases, imported_info) 170 | return aliases 171 | 172 | def get_module_stub_ast(self, module_path: str): 173 | if module_path not in self._module_ast_cache: 174 | self._module_ast_cache[module_path] = tc.get_stub_ast(module_path, search_context=self.ctx) 175 | return self._module_ast_cache[module_path] 176 | 177 | def get_module_stub_assigns(self, module_path: str): 178 | if module_path not in self._module_assigns_cache: 179 | module_ast = 
self.get_module_stub_ast(module_path) 180 | self._module_assigns_cache[module_path] = AssignsVisitor().find(module_ast) 181 | return self._module_assigns_cache[module_path] 182 | 183 | def get_module_stub_imports(self, module_path: str): 184 | if module_path not in self._module_imports_cache: 185 | module_ast = self.get_module_stub_ast(module_path) 186 | self._module_imports_cache[module_path] = ImportsVisitor().find(module_ast, module_path) 187 | return self._module_imports_cache[module_path] 188 | 189 | def add_import_aliases(self, aliases, stub_import: tc.ImportedInfo): 190 | module_path = ".".join(stub_import.source_module) 191 | module = import_module_or_none(module_path) 192 | stub_ast: Optional[ast.AST] = None 193 | if isinstance(stub_import.info.ast, (ast.Assign, ast.AnnAssign)): 194 | stub_ast = stub_import.info.ast.value 195 | elif isinstance(stub_import.info.ast, ast.AST): 196 | stub_ast = stub_import.info.ast 197 | if stub_ast: 198 | self.add_module_aliases(aliases, module_path, module, stub_ast) 199 | return module_path, stub_import.info.ast 200 | 201 | def add_module_aliases(self, aliases, module_path, module, node, skip=set()): 202 | names = NamesVisitor().find(node) if node else [] 203 | for name in names: 204 | if alias_already_added(aliases, name, module_path) or name in skip: 205 | continue 206 | source = module_path 207 | if name in __builtins__: 208 | source = "__builtins__" 209 | value = __builtins__[name] 210 | elif hasattr(module, name): 211 | value = getattr(module, name) 212 | elif name in self.get_module_stub_assigns(module_path): 213 | value = self.get_module_stub_assigns(module_path)[name] 214 | self.add_module_aliases(aliases, module_path, module, value.value, skip={name}) 215 | elif name in self.get_module_stub_imports(module_path): 216 | imported_module_path, imported_name = self.get_module_stub_imports(module_path)[name] 217 | imported_module = import_module_or_none(imported_module_path) 218 | if hasattr(imported_module, 
imported_name): 219 | source = imported_module_path 220 | value = getattr(imported_module, imported_name) 221 | else: 222 | stub_import = self.get_imported_info(f"{imported_module_path}.{imported_name}") 223 | source, value = self.add_import_aliases(aliases, stub_import) 224 | else: 225 | value = NotImplementedError(f"{name!r} from {module_path!r} not in builtins, module or stub") 226 | if alias_already_added(aliases, name, source): 227 | continue 228 | if not alias_is_unique(aliases, name, source, value): 229 | value = NotImplementedError( 230 | f"non-unique alias {name!r}: {aliases[name][1]} ({aliases[name][0]}) vs {value} ({source})" 231 | ) 232 | aliases[name] = (source, value) 233 | 234 | 235 | def alias_already_added(aliases, name, source): 236 | return name in aliases and aliases[name][0] in {"__builtins__", source} 237 | 238 | 239 | def alias_is_unique(aliases, name, source, value): 240 | if name in aliases: 241 | src, val = aliases[name] 242 | if src != source: 243 | return val is value 244 | return True 245 | 246 | 247 | def get_stub_types(params, component, parent, logger) -> Optional[Dict[str, Any]]: 248 | if not typeshed_client_support: 249 | return None 250 | missing_types = { 251 | p.name: n 252 | for n, p in enumerate(params) 253 | if p.kind not in {kinds.VAR_POSITIONAL, kinds.VAR_KEYWORD} and p.annotation == inspect._empty 254 | } 255 | if not missing_types: 256 | return None 257 | resolver = get_stubs_resolver() 258 | stub_import = resolver.get_component_imported_info(component, parent) 259 | if not stub_import: 260 | return None 261 | known_params = {p.name for p in params} 262 | aliases = resolver.get_aliases(stub_import) 263 | arg_asts = stub_import.info.ast.args.args + stub_import.info.ast.args.kwonlyargs 264 | types = {} 265 | for arg_ast in arg_asts[1:] if parent else arg_asts: 266 | name = arg_ast.arg 267 | if arg_ast.annotation and (name in missing_types or name not in known_params): 268 | try: 269 | types[name] = 
get_arg_type(arg_ast.annotation, aliases) 270 | except Exception as ex: 271 | logger.debug( 272 | f"Failed to parse type stub for {component.__qualname__!r} parameter {name!r}", exc_info=ex 273 | ) 274 | if name not in known_params: 275 | types[name] = inspect._empty # pragma: no cover 276 | return types 277 | -------------------------------------------------------------------------------- /jsonargparse/_type_checking.py: -------------------------------------------------------------------------------- 1 | from typing import TYPE_CHECKING 2 | 3 | __all__ = [ 4 | "ArgumentGroup", 5 | "ActionsContainer", 6 | "ArgumentParser", 7 | "docstring_parser", 8 | "ruamelCommentedMap", 9 | ] 10 | 11 | if TYPE_CHECKING: # pragma: no cover 12 | import docstring_parser 13 | from ruamel.yaml.comments import CommentedMap as ruamelCommentedMap 14 | 15 | from ._core import ActionsContainer, ArgumentGroup, ArgumentParser 16 | else: 17 | globals().update({k: None for k in __all__}) 18 | -------------------------------------------------------------------------------- /jsonargparse/py.typed: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/omni-us/jsonargparse/1f901fea37e595d11642771f3ae18aff2b41df02/jsonargparse/py.typed -------------------------------------------------------------------------------- /jsonargparse_tests/__init__.py: -------------------------------------------------------------------------------- 1 | import os 2 | 3 | if "JSONARGPARSE_OMEGACONF_FULL_TEST" in os.environ: 4 | import warnings 5 | 6 | from jsonargparse._loaders_dumpers import loaders, set_omegaconf_loader 7 | 8 | set_omegaconf_loader() 9 | if "omegaconf" in loaders: 10 | loaders["yaml"] = loaders["omegaconf"] 11 | warnings.warn("Running all tests with omegaconf as the yaml loader.") 12 | -------------------------------------------------------------------------------- /jsonargparse_tests/__main__.py: 
-------------------------------------------------------------------------------- 1 | """Run all unit tests in package.""" 2 | 3 | import os 4 | import sys 5 | import warnings 6 | from pathlib import Path 7 | 8 | import pytest 9 | 10 | 11 | def run_tests(): 12 | filter_action = "default" 13 | warnings.simplefilter(filter_action) 14 | os.environ["PYTHONWARNINGS"] = filter_action 15 | testing_package = Path(__file__).parent 16 | exit_code = pytest.main(["-v", "-s", f"--rootdir={testing_package.parent}", "--pyargs", str(testing_package)]) 17 | if exit_code != 0: 18 | sys.exit(True) 19 | 20 | 21 | if __name__ == "__main__": 22 | run_tests() 23 | -------------------------------------------------------------------------------- /jsonargparse_tests/conftest.py: -------------------------------------------------------------------------------- 1 | import logging 2 | import os 3 | import platform 4 | import re 5 | import sys 6 | from contextlib import ExitStack, contextmanager, redirect_stderr, redirect_stdout 7 | from functools import wraps 8 | from importlib.util import find_spec 9 | from io import StringIO 10 | from pathlib import Path 11 | from typing import Iterator, List 12 | from unittest.mock import MagicMock, patch 13 | 14 | import pytest 15 | 16 | from jsonargparse import ArgumentParser, set_parsing_settings 17 | from jsonargparse._loaders_dumpers import json_compact_dump, json_load, yaml_dump, yaml_load 18 | from jsonargparse._optionals import ( 19 | docstring_parser_support, 20 | fsspec_support, 21 | jsonnet_support, 22 | jsonschema_support, 23 | omegaconf_support, 24 | pyyaml_available, 25 | toml_load_available, 26 | url_support, 27 | ) 28 | 29 | if docstring_parser_support: 30 | from docstring_parser import DocstringStyle 31 | 32 | set_parsing_settings(docstring_parse_style=DocstringStyle.GOOGLE) 33 | 34 | 35 | columns = "200" 36 | 37 | is_cpython = platform.python_implementation() == "CPython" 38 | is_posix = os.name == "posix" 39 | 40 | json_or_yaml_dump = 
yaml_dump if pyyaml_available else json_compact_dump 41 | json_or_yaml_load = yaml_load if pyyaml_available else json_load 42 | 43 | skip_if_no_pyyaml = pytest.mark.skipif( 44 | not pyyaml_available, 45 | reason="PyYAML package is required", 46 | ) 47 | 48 | skip_if_not_cpython = pytest.mark.skipif( 49 | not is_cpython, 50 | reason="only supported in CPython", 51 | ) 52 | 53 | skip_if_not_posix = pytest.mark.skipif( 54 | not is_posix, 55 | reason="only supported in posix systems", 56 | ) 57 | 58 | 59 | skip_if_jsonschema_unavailable = pytest.mark.skipif( 60 | not jsonschema_support, 61 | reason="jsonschema package is required", 62 | ) 63 | 64 | skip_if_fsspec_unavailable = pytest.mark.skipif( 65 | not fsspec_support, 66 | reason="fsspec package is required", 67 | ) 68 | 69 | skip_if_docstring_parser_unavailable = pytest.mark.skipif( 70 | not docstring_parser_support, 71 | reason="docstring-parser package is required", 72 | ) 73 | 74 | skip_if_requests_unavailable = pytest.mark.skipif( 75 | not url_support, 76 | reason="requests package is required", 77 | ) 78 | 79 | responses_available = bool(find_spec("responses")) 80 | 81 | skip_if_responses_unavailable = pytest.mark.skipif( 82 | not responses_available, 83 | reason="responses package is required", 84 | ) 85 | 86 | skip_if_running_as_root = pytest.mark.skipif( 87 | is_posix and os.geteuid() == 0, 88 | reason="User is root, permission tests will not work", 89 | ) 90 | 91 | if responses_available: 92 | import responses 93 | 94 | responses_activate = responses.activate 95 | else: 96 | 97 | def nothing_decorator(func): 98 | return func 99 | 100 | responses_activate = nothing_decorator 101 | 102 | 103 | def parser_modes(test_function): 104 | if "JSONARGPARSE_OMEGACONF_FULL_TEST" in os.environ: 105 | parser_modes = ["yaml"] 106 | else: 107 | parser_modes = ["json"] 108 | if toml_load_available: 109 | parser_modes += ["toml"] 110 | if pyyaml_available: 111 | parser_modes += ["yaml"] 112 | if jsonnet_support: 113 | 
parser_modes += ["jsonnet"] 114 | if omegaconf_support: 115 | parser_modes += ["omegaconf"] 116 | return pytest.mark.parametrize("parser", parser_modes, indirect=True)(test_function) 117 | 118 | 119 | @pytest.fixture 120 | def parser(request) -> ArgumentParser: 121 | kwargs = {} 122 | if getattr(request, "param", None): 123 | kwargs["parser_mode"] = request.param 124 | return ArgumentParser(exit_on_error=False, **kwargs) 125 | 126 | 127 | @pytest.fixture 128 | def subparser() -> ArgumentParser: 129 | return ArgumentParser(exit_on_error=False) 130 | 131 | 132 | @pytest.fixture 133 | def example_parser() -> ArgumentParser: 134 | parser = ArgumentParser(prog="app", exit_on_error=False) 135 | group_1 = parser.add_argument_group("Group 1", name="group1") 136 | group_1.add_argument("--bool", type=bool, default=True) 137 | group_2 = parser.add_argument_group("Group 2") 138 | group_2.add_argument("--nums.val1", type=int, default=1) 139 | group_2.add_argument("--nums.val2", type=float, default=2.0) 140 | return parser 141 | 142 | 143 | @pytest.fixture 144 | def tmp_cwd(tmpdir) -> Iterator[Path]: 145 | with tmpdir.as_cwd(): 146 | yield Path(tmpdir) 147 | 148 | 149 | @pytest.fixture 150 | def file_r(tmp_cwd) -> Iterator[str]: 151 | filename = "file_r" 152 | Path(filename).touch() 153 | yield filename 154 | 155 | 156 | @pytest.fixture 157 | def mock_stdin(): 158 | @contextmanager 159 | def _mock_stdin(data: str): 160 | mock = MagicMock() 161 | mock.read.side_effect = [data, ""] 162 | with patch("sys.stdin", mock): 163 | yield 164 | 165 | return _mock_stdin 166 | 167 | 168 | @pytest.fixture 169 | def logger() -> logging.Logger: 170 | logger = logging.getLogger(__name__) 171 | logger.level = logging.DEBUG 172 | logger.parent = None 173 | logger.handlers = [logging.StreamHandler()] 174 | return logger 175 | 176 | 177 | @contextmanager 178 | def capture_logs(logger: logging.Logger) -> Iterator[StringIO]: 179 | with ExitStack() as stack: 180 | captured = StringIO() 181 | for 
handler in logger.handlers: 182 | if isinstance(handler, logging.StreamHandler): 183 | stack.enter_context(patch.object(handler, "stream", captured)) 184 | yield captured 185 | 186 | 187 | @contextmanager 188 | def source_unavailable(obj=None): 189 | if obj and obj.__module__ in sys.modules: 190 | del sys.modules[obj.__module__] 191 | with patch("inspect.getsource", side_effect=OSError("mock source code not available")): 192 | yield 193 | 194 | 195 | @pytest.fixture(autouse=True) 196 | def no_color(): 197 | with patch.dict(os.environ, {"NO_COLOR": "true"}): 198 | yield 199 | 200 | 201 | def get_parser_help(parser: ArgumentParser, strip=False, columns=columns) -> str: 202 | out = StringIO() 203 | with patch.dict(os.environ, {"COLUMNS": columns}): 204 | parser.print_help(out) 205 | if strip: 206 | return re.sub("  *", " ", out.getvalue()) 207 | return out.getvalue() 208 | 209 | 210 | def get_parse_args_stdout(parser: ArgumentParser, args: List[str]) -> str: 211 | out = StringIO() 212 | with patch.dict(os.environ, {"COLUMNS": columns}), redirect_stdout(out), pytest.raises(SystemExit): 213 | parser.parse_args(args) 214 | return out.getvalue() 215 | 216 | 217 | def get_parse_args_stderr(parser: ArgumentParser, args: List[str]) -> str: 218 | err = StringIO() 219 | with patch.object(parser, "exit_on_error", return_value=True): 220 | with redirect_stderr(err), pytest.raises(SystemExit): 221 | parser.parse_args(args) 222 | return err.getvalue() 223 | 224 | 225 | class BaseClass: 226 | def __init__(self): 227 | pass 228 | 229 | 230 | def wrap_fn(fn): 231 | @wraps(fn) 232 | def wrapped_fn(*args, **kwargs): 233 | return fn(*args, **kwargs) 234 | 235 | return wrapped_fn 236 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_argcomplete.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import os 4 | import sys 5 | from contextlib import
ExitStack, contextmanager 6 | from enum import Enum 7 | from importlib.util import find_spec 8 | from io import StringIO 9 | from pathlib import Path 10 | from typing import List, Optional 11 | from unittest.mock import patch 12 | 13 | import pytest 14 | 15 | from jsonargparse import ActionJsonSchema, ActionYesNo 16 | from jsonargparse._common import parser_context 17 | from jsonargparse.typing import Email, Path_fr, PositiveFloat, PositiveInt 18 | from jsonargparse_tests.conftest import ( 19 | skip_if_jsonschema_unavailable, 20 | skip_if_not_cpython, 21 | skip_if_not_posix, 22 | ) 23 | 24 | 25 | @pytest.fixture(autouse=True) 26 | def skip_if_argcomplete_unavailable(): 27 | if not find_spec("argcomplete"): 28 | pytest.skip("argcomplete package is required") 29 | 30 | 31 | @contextmanager 32 | def mock_fdopen(): 33 | err = StringIO() 34 | with patch("os.fdopen", return_value=err): 35 | yield err 36 | 37 | 38 | def complete_line(parser, value): 39 | stack = ExitStack() 40 | stack.enter_context(parser_context(load_value_mode="yaml")) 41 | with patch.dict( 42 | os.environ, 43 | { 44 | "_ARGCOMPLETE": "1", 45 | "_ARGCOMPLETE_SUPPRESS_SPACE": "1", 46 | "_ARGCOMPLETE_COMP_WORDBREAKS": " \t\n\"'><=;|&(:", 47 | "COMP_TYPE": str(ord("?")), # ='63' str(ord('\t'))='9' 48 | "COMP_LINE": value, 49 | "COMP_POINT": str(len(value)), 50 | }, 51 | ): 52 | import argcomplete 53 | 54 | out = StringIO() 55 | with pytest.raises(SystemExit), mock_fdopen() as err: 56 | argcomplete.autocomplete(parser, exit_method=sys.exit, output_stream=out) 57 | stack.close() 58 | return out.getvalue(), err.getvalue() 59 | 60 | 61 | def test_handle_completions(parser): 62 | parser.add_argument("--option") 63 | with ( 64 | patch("argcomplete.autocomplete") as mock, 65 | patch.dict( 66 | os.environ, 67 | { 68 | "_ARGCOMPLETE": "1", 69 | }, 70 | ), 71 | ): 72 | parser.parse_args([]) 73 | assert mock.called 74 | assert mock.call_args[0][0] is parser 75 | 76 | 77 | def test_complete_nested_one_option(parser): 
78 | parser.add_argument("--group1.op") 79 | out, err = complete_line(parser, "tool.py --group1") 80 | assert out == "--group1.op" 81 | assert err == "" 82 | 83 | 84 | def test_complete_nested_two_options(parser): 85 | parser.add_argument("--group2.op1") 86 | parser.add_argument("--group2.op2") 87 | out, err = complete_line(parser, "tool.py --group2") 88 | assert out == "--group2.op1\x0b--group2.op2" 89 | assert err == "" 90 | 91 | 92 | @skip_if_not_cpython 93 | @pytest.mark.parametrize( 94 | ["value", "expected"], 95 | [ 96 | ("--int=a", "value not yet valid, expected type int"), 97 | ("--int=1", "value already valid, expected type int"), 98 | ("--float=a", "value not yet valid, expected type float"), 99 | ("--float=1", "value already valid, expected type float"), 100 | ("--pint=0", "value not yet valid, expected type PositiveInt"), 101 | ("--pint=1", "value already valid, expected type PositiveInt"), 102 | ("--pfloat=0", "value not yet valid, expected type PositiveFloat"), 103 | ("--pfloat=1", "value already valid, expected type PositiveFloat"), 104 | ("--email=a", "value not yet valid, expected type Email"), 105 | ("--email=a@b.c", "value already valid, expected type Email"), 106 | ], 107 | ) 108 | def test_stderr_instruction_simple_types(parser, value, expected): 109 | parser.add_argument("--int", type=int) 110 | parser.add_argument("--float", type=float) 111 | parser.add_argument("--pint", type=PositiveInt) 112 | parser.add_argument("--pfloat", type=PositiveFloat) 113 | parser.add_argument("--email", type=Email) 114 | out, err = complete_line(parser, "tool.py " + value) 115 | assert out == "" 116 | assert expected in err 117 | 118 | 119 | @skip_if_not_posix 120 | def test_action_config_file(parser, tmp_cwd): 121 | parser.add_argument("--cfg", action="config") 122 | Path("file1").touch() 123 | Path("config.yaml").touch() 124 | 125 | out, err = complete_line(parser, "tool.py --cfg=") 126 | assert out == "config.yaml\x0bfile1" 127 | assert err == "" 128 | out, 
err = complete_line(parser, "tool.py --cfg=c") 129 | assert out == "config.yaml" 130 | assert err == "" 131 | 132 | 133 | @pytest.mark.parametrize( 134 | ["value", "expected"], 135 | [ 136 | ("--op1", "--op1"), 137 | ("--no_op1", "--no_op1"), 138 | ("--op2", "--op2"), 139 | ("--no_op2", "--no_op2"), 140 | ("--op2=", "true\x0bfalse\x0byes\x0bno"), 141 | ("--with-op3", "--with-op3"), 142 | ("--without-op3", "--without-op3"), 143 | ], 144 | ) 145 | def test_action_yes_no(parser, value, expected): 146 | parser.add_argument("--op1", action=ActionYesNo) 147 | parser.add_argument("--op2", nargs="?", action=ActionYesNo) 148 | parser.add_argument("--with-op3", action=ActionYesNo(yes_prefix="with-", no_prefix="without-")) 149 | out, err = complete_line(parser, "tool.py " + value) 150 | assert out == expected 151 | assert err == "" 152 | 153 | 154 | def test_bool(parser): 155 | parser.add_argument("--bool", type=bool) 156 | out, err = complete_line(parser, "tool.py --bool=") 157 | assert out == "true\x0bfalse" 158 | assert err == "" 159 | out, err = complete_line(parser, "tool.py --bool=f") 160 | assert out == "false" 161 | assert err == "" 162 | 163 | 164 | def test_enum(parser): 165 | class EnumType(Enum): 166 | abc = 1 167 | xyz = 2 168 | abz = 3 169 | 170 | parser.add_argument("--enum", type=EnumType) 171 | out, err = complete_line(parser, "tool.py --enum=ab") 172 | assert out == "abc\x0babz" 173 | assert err == "" 174 | 175 | 176 | def test_optional_bool(parser): 177 | parser.add_argument("--bool", type=Optional[bool]) 178 | out, err = complete_line(parser, "tool.py --bool=") 179 | assert out == "true\x0bfalse\x0bnull" 180 | assert err == "" 181 | 182 | 183 | def test_optional_enum(parser): 184 | class EnumType(Enum): 185 | A = 1 186 | B = 2 187 | 188 | parser.add_argument("--enum", type=Optional[EnumType]) 189 | out, err = complete_line(parser, "tool.py --enum=") 190 | assert out == "A\x0bB\x0bnull" 191 | assert err == "" 192 | 193 | 194 | def 
test_str_with_choices(parser): 195 | parser.add_argument("--str", type=str, choices=["xyz", "abc"]) 196 | out, err = complete_line(parser, "tool.py --str=") 197 | assert out == "xyz\x0babc" 198 | assert err == "" 199 | 200 | 201 | @skip_if_not_cpython 202 | @skip_if_jsonschema_unavailable 203 | def test_action_jsonschema(parser): 204 | parser.add_argument("--json", action=ActionJsonSchema(schema={"type": "object"})) 205 | 206 | for value, expected in [ 207 | ("--json=1", "value not yet valid"), 208 | ("--json='{\"a\": 1}'", "value already valid"), 209 | ]: 210 | out, err = complete_line(parser, f"tool.py {value}") 211 | assert out == "" 212 | assert expected in err 213 | 214 | with patch("os.popen") as popen_mock: 215 | popen_mock.side_effect = ValueError 216 | out, err = complete_line(parser, f"tool.py {value}") 217 | assert out == "" 218 | assert expected in err 219 | 220 | out, err = complete_line(parser, "tool.py --json ") 221 | assert "value not yet valid" in out.replace("\xa0", " ").replace("_", " ") 222 | assert err == "" 223 | 224 | 225 | def test_list(parser): 226 | parser.add_argument("--list", type=List[int]) 227 | 228 | out, err = complete_line(parser, "tool.py --list='[1, 2, 3]'") 229 | assert out == "" 230 | assert "value already valid, expected type List[int]" in err 231 | 232 | out, err = complete_line(parser, "tool.py --list=") 233 | assert "value not yet valid" in out.replace("\xa0", " ").replace("_", " ") 234 | assert err == "" 235 | 236 | 237 | @skip_if_not_posix 238 | def test_optional_path(parser, tmp_cwd): 239 | parser.add_argument("--path", type=Optional[Path_fr]) 240 | Path("file1").touch() 241 | Path("file2").touch() 242 | 243 | for value, expected in [ 244 | ("--path=", "null\x0bfile1\x0bfile2"), 245 | ("--path=n", "null"), 246 | ("--path=f", "file1\x0bfile2"), 247 | ]: 248 | out, err = complete_line(parser, f"tool.py {value}") 249 | assert out == expected 250 | assert err == "" 251 | 
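The test helpers above (conftest's `get_parse_args_stdout`/`get_parser_help` and this file's `complete_line`) all rely on the same stdlib pattern: patch environment variables with `patch.dict`, redirect the output stream, and swallow the `SystemExit` that argparse (or argcomplete) raises after printing. A minimal self-contained sketch of that pattern, using only the standard library — the `capture_help` name and the example parser are illustrative, not part of jsonargparse:

```python
import argparse
import os
from contextlib import redirect_stdout
from io import StringIO
from unittest.mock import patch


def capture_help(parser: argparse.ArgumentParser, columns: str = "80") -> str:
    """Capture what argparse prints for --help, swallowing the SystemExit it raises."""
    out = StringIO()
    # argparse wraps help text to the terminal width, which honors the COLUMNS
    # environment variable; patch.dict restores the previous value on exit.
    with patch.dict(os.environ, {"COLUMNS": columns}), redirect_stdout(out):
        try:
            parser.parse_args(["--help"])  # prints help, then raises SystemExit
        except SystemExit:
            pass
    return out.getvalue()


parser = argparse.ArgumentParser(prog="tool")
parser.add_argument("--option", help="an example option")
help_text = capture_help(parser)
assert help_text.startswith("usage: tool")
assert "--option" in help_text
```

The conftest helpers use `pytest.raises(SystemExit)` instead of `try/except` for the same effect, and patch `COLUMNS` so that help-text assertions are not affected by the width of the terminal running the test suite.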
-------------------------------------------------------------------------------- /jsonargparse_tests/test_attrs.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | from typing import List 4 | 5 | import pytest 6 | 7 | from jsonargparse import Namespace 8 | from jsonargparse._optionals import attrs_support 9 | from jsonargparse_tests.conftest import get_parser_help 10 | 11 | if attrs_support: 12 | import attrs 13 | 14 | @attrs.define 15 | class AttrsData: 16 | p1: float 17 | p2: str = "-" 18 | 19 | @attrs.define 20 | class AttrsSubData(AttrsData): 21 | p3: int = 3 22 | 23 | @attrs.define 24 | class AttrsFieldFactory: 25 | p1: List[str] = attrs.field(factory=lambda: ["one", "two"]) 26 | 27 | @attrs.define 28 | class AttrsFieldInitFalse: 29 | p1: dict = attrs.field(init=False) 30 | 31 | def __attrs_post_init__(self): 32 | self.p1 = {} 33 | 34 | @attrs.define 35 | class AttrsSubField: 36 | p1: str = "-" 37 | p2: int = 0 38 | 39 | @attrs.define 40 | class AttrsWithNestedDefaultDataclass: 41 | p1: float 42 | subfield: AttrsSubField = attrs.field(factory=AttrsSubField) 43 | 44 | @attrs.define 45 | class AttrsWithNestedDataclassNoDefault: 46 | p1: float 47 | subfield: AttrsSubField 48 | 49 | 50 | @pytest.mark.skipif(not attrs_support, reason="attrs package is required") 51 | class TestAttrs: 52 | def test_define(self, parser): 53 | parser.add_argument("--data", type=AttrsData) 54 | defaults = parser.get_defaults() 55 | assert Namespace(p1=None, p2="-") == defaults.data 56 | cfg = parser.parse_args(["--data.p1=0.2", "--data.p2=x"]) 57 | assert Namespace(p1=0.2, p2="x") == cfg.data 58 | 59 | def test_subclass(self, parser): 60 | parser.add_argument("--data", type=AttrsSubData) 61 | defaults = parser.get_defaults() 62 | assert Namespace(p1=None, p2="-", p3=3) == defaults.data 63 | 64 | def test_field_factory(self, parser): 65 | parser.add_argument("--data", type=AttrsFieldFactory) 66 | cfg1 = 
parser.parse_args([]) 67 | cfg2 = parser.parse_args([]) 68 | assert cfg1.data.p1 == ["one", "two"] 69 | assert cfg1.data.p1 == cfg2.data.p1 70 | assert cfg1.data.p1 is not cfg2.data.p1 71 | 72 | def test_field_init_false(self, parser): 73 | parser.add_argument("--data", type=AttrsFieldInitFalse) 74 | cfg = parser.parse_args([]) 75 | help_str = get_parser_help(parser) 76 | assert "--data.p1" not in help_str 77 | assert cfg == Namespace() 78 | init = parser.instantiate_classes(cfg) 79 | assert init.data.p1 == {} 80 | 81 | def test_nested_with_default(self, parser): 82 | parser.add_argument("--data", type=AttrsWithNestedDefaultDataclass) 83 | cfg = parser.parse_args(["--data.p1=1.23"]) 84 | assert cfg.data == Namespace(p1=1.23, subfield=Namespace(p1="-", p2=0)) 85 | 86 | def test_nested_without_default(self, parser): 87 | parser.add_argument("--data", type=AttrsWithNestedDataclassNoDefault) 88 | cfg = parser.parse_args(["--data.p1=1.23"]) 89 | assert cfg.data == Namespace(p1=1.23, subfield=Namespace(p1="-", p2=0)) 90 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_final_classes.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import pytest 4 | 5 | from jsonargparse import ArgumentError, Namespace, lazy_instance 6 | from jsonargparse.typing import final 7 | 8 | 9 | @final 10 | class FinalClass: 11 | def __init__(self, a1: int = 1, a2: float = 2.3): 12 | self.a1 = a1 13 | self.a2 = a2 14 | 15 | 16 | class NotFinalClass: 17 | def __init__(self, b1: str = "4", b2: FinalClass = lazy_instance(FinalClass, a2=-3.2)): 18 | self.b1 = b1 19 | self.b2 = b2 20 | 21 | 22 | def test_add_class_final(parser): 23 | parser.add_class_arguments(NotFinalClass, "b") 24 | 25 | assert parser.get_defaults().b.b2 == Namespace(a1=1, a2=-3.2) 26 | cfg = parser.parse_args(['--b.b2={"a2": 6.7}']) 27 | assert cfg.b.b2 == Namespace(a1=1, a2=6.7) 28 | assert 
cfg == parser.parse_string(parser.dump(cfg)) 29 | cfg = parser.instantiate_classes(cfg) 30 | assert isinstance(cfg["b"], NotFinalClass) 31 | assert isinstance(cfg["b"].b2, FinalClass) 32 | 33 | pytest.raises(ArgumentError, lambda: parser.parse_args(['--b.b2={"bad": "value"}'])) 34 | pytest.raises(ArgumentError, lambda: parser.parse_args(['--b.b2="bad"'])) 35 | pytest.raises(ValueError, lambda: parser.add_class_arguments(FinalClass, "a", default=FinalClass())) 36 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_formatters.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import os 4 | from pathlib import Path 5 | from typing import Tuple 6 | from unittest.mock import patch 7 | 8 | import pytest 9 | 10 | from jsonargparse import ActionParser, ActionYesNo, ArgumentParser 11 | from jsonargparse_tests.conftest import get_parser_help, json_or_yaml_dump 12 | 13 | 14 | @pytest.fixture 15 | def parser() -> ArgumentParser: 16 | return ArgumentParser(prog="app", default_env=True) 17 | 18 | 19 | def test_help_basics(parser): 20 | help_str = get_parser_help(parser) 21 | assert "ARG: -h, --help" in help_str 22 | assert "APP_HELP" not in help_str 23 | 24 | 25 | def test_help_action_version(parser): 26 | parser.add_argument("--version", action="version", version="1.0.0") 27 | help_str = get_parser_help(parser) 28 | assert "ARG: --version" in help_str 29 | assert "APP_VERSION" not in help_str 30 | assert "show program's version number and exit" in help_str 31 | 32 | 33 | def test_help_action_config_file(parser): 34 | parser.add_argument("-c", "--cfg", help="Config in yaml/json.", action="config") 35 | help_str = get_parser_help(parser) 36 | assert "ARG: --print_config" in help_str 37 | assert "ARG: -c CFG, --cfg CFG" in help_str or "ARG: -c, --cfg CFG" in help_str 38 | assert "ENV: APP_CFG" in help_str 39 | assert "Config in yaml/json." 
in help_str 40 | assert "APP_PRINT_CONFIG" not in help_str 41 | 42 | 43 | def test_help_positional(parser): 44 | parser.add_argument("pos") 45 | help_str = get_parser_help(parser) 46 | assert "ARG: pos" in help_str 47 | assert "ENV: APP_POS" in help_str 48 | 49 | 50 | def test_help_required_and_default(parser): 51 | parser.add_argument("--v1", help="Option v1.", default="v1", required=True) 52 | help_str = get_parser_help(parser) 53 | assert "ARG: --v1 V1" in help_str 54 | assert "ENV: APP_V1" in help_str 55 | assert "Option v1. (required, default: v1)" in help_str 56 | 57 | 58 | def test_help_type_and_null_default(parser): 59 | parser.add_argument("--v2", type=int, help="Option v2.") 60 | help_str = get_parser_help(parser) 61 | assert "ARG: --v2 V2" in help_str 62 | assert "ENV: APP_V2" in help_str 63 | assert "Option v2. (type: int, default: null)" in help_str 64 | 65 | 66 | def test_help_no_type_and_default(parser): 67 | parser.add_argument("--g1.v3", help="Option v3.", default="v3") 68 | help_str = get_parser_help(parser) 69 | assert "ARG: --g1.v3 V3" in help_str 70 | assert "ENV: APP_G1__V3" in help_str 71 | assert "Option v3. (default: v3)" in help_str 72 | 73 | 74 | def test_help_choices_and_null_default(parser): 75 | parser.add_argument("--v4", choices=["A", "B"], help="Option v4.") 76 | help_str = get_parser_help(parser) 77 | assert "ARG: --v4 {A,B}" in help_str 78 | assert "ENV: APP_V4" in help_str 79 | assert "Option v4. 
(default: null)" in help_str 80 | 81 | 82 | def test_help_action_parser(parser): 83 | parser2 = ArgumentParser() 84 | parser2.add_argument("--v4") 85 | parser.add_argument("--g2", action=ActionParser(parser=parser2)) 86 | help_str = get_parser_help(parser) 87 | assert "ARG: --g2.v4 V4" in help_str 88 | assert "ENV: APP_G2__V4" in help_str 89 | 90 | 91 | def test_help_action_yes_no(parser): 92 | parser.add_argument("--v5", action=ActionYesNo, default=True, help="Option v5.") 93 | help_str = get_parser_help(parser) 94 | assert "ARG: --v5, --no_v5" in help_str 95 | assert "ENV: APP_V5" in help_str 96 | assert "Option v5. (type: bool, default: True)" in help_str 97 | 98 | 99 | @pytest.fixture 100 | def default_config_files(tmp_cwd) -> Tuple[ArgumentParser, str, Path]: 101 | not_exist = "does_not_exist.yaml" 102 | exists = Path("config.yaml") 103 | exists.write_text(json_or_yaml_dump({"v1": "from yaml v1", "n1.v2": "from yaml v2"})) 104 | 105 | parser = ArgumentParser(default_config_files=[not_exist, exists]) 106 | parser.add_argument("--v1", default="from default v1") 107 | parser.add_argument("--n1.v2", default="from default v2") 108 | return parser, not_exist, exists 109 | 110 | 111 | def test_help_default_config_files_overridden(default_config_files): 112 | parser, not_exist, exists = default_config_files 113 | help_str = get_parser_help(parser) 114 | assert "default config file locations" in help_str 115 | assert "from yaml v1" in help_str 116 | assert "from yaml v2" in help_str 117 | assert f"['{not_exist}', '{exists}']" in help_str 118 | assert f"overridden by the contents of: {exists}" in help_str 119 | 120 | 121 | def test_help_default_config_files_not_overridden(default_config_files): 122 | parser, not_exist, _ = default_config_files 123 | parser.default_config_files = [not_exist] 124 | help_str = get_parser_help(parser) 125 | assert "from default v1" in help_str 126 | assert "from default v2" in help_str 127 | assert str([not_exist]) in help_str 128 | assert 
"no existing default config file found" in help_str 129 | 130 | 131 | def test_help_default_config_files_none(default_config_files): 132 | parser, _, _ = default_config_files 133 | parser.default_config_files = None 134 | help_str = get_parser_help(parser) 135 | assert "default config file locations" not in help_str 136 | 137 | 138 | def test_help_default_config_files_with_required(tmp_path, parser): 139 | config_path = tmp_path / "config.yaml" 140 | config_path.write_text(json_or_yaml_dump({"v1": "from config"})) 141 | 142 | parser.default_config_files = [config_path] 143 | parser.add_argument("req", help="req description") 144 | parser.add_argument("--v1", default="from default") 145 | 146 | help_str = get_parser_help(parser) 147 | assert "req description" in help_str 148 | assert "from config" in help_str 149 | 150 | 151 | def test_help_subcommands_with_default_env(parser): 152 | subcommands = parser.add_subcommands() 153 | subparser1 = ArgumentParser() 154 | subparser2 = ArgumentParser() 155 | subcommands.add_subcommand("greet", subparser1, help="Greet someone") 156 | subcommands.add_subcommand("farewell", subparser2, help="Say goodbye") 157 | help_str = get_parser_help(parser) 158 | # Individual subcommands should NOT have ARG: prefix or ENV: lines 159 | assert "ARG: greet" not in help_str 160 | assert "ARG: farewell" not in help_str 161 | assert "ENV: APP_GREET" not in help_str 162 | assert "ENV: APP_FAREWELL" not in help_str 163 | # But the subcommand names and help should still be present 164 | assert "greet" in help_str 165 | assert "Greet someone" in help_str 166 | assert "farewell" in help_str 167 | assert "Say goodbye" in help_str 168 | 169 | 170 | def test_format_usage(parser): 171 | parser.add_argument("--v1") 172 | with patch.dict(os.environ, {"COLUMNS": "200"}): 173 | assert parser.format_usage() == "usage: app [-h] [--v1 V1]\n" 174 | -------------------------------------------------------------------------------- 
/jsonargparse_tests/test_jsonnet.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import json 4 | import re 5 | from pathlib import Path 6 | 7 | import pytest 8 | 9 | from jsonargparse import ( 10 | ActionJsonnet, 11 | ActionJsonSchema, 12 | ArgumentError, 13 | ArgumentParser, 14 | strip_meta, 15 | ) 16 | from jsonargparse._optionals import jsonnet_support, pyyaml_available 17 | from jsonargparse_tests.conftest import ( 18 | get_parser_help, 19 | json_or_yaml_load, 20 | skip_if_jsonschema_unavailable, 21 | ) 22 | 23 | 24 | @pytest.fixture(autouse=True) 25 | def skip_if_jsonnet_unavailable(): 26 | if not jsonnet_support: 27 | pytest.skip("jsonnet package is required") 28 | 29 | 30 | example_1_jsonnet = """ 31 | local make_record(num) = { 32 | 'ref': '#'+(num+1), 33 | 'val': 3*(num/2)+5, 34 | }; 35 | 36 | { 37 | 'param': 654, 38 | 'records': [make_record(n) for n in std.range(0, 8)], 39 | } 40 | """ 41 | 42 | example_2_jsonnet = """ 43 | local param = std.extVar('param'); 44 | 45 | local make_record(num) = { 46 | 'ref': '#'+(num+1), 47 | 'val': 3*(num/2)+5, 48 | }; 49 | 50 | { 51 | 'param': param, 52 | 'records': [make_record(n) for n in std.range(0, 8)], 53 | } 54 | """ 55 | 56 | records_schema = { 57 | "type": "array", 58 | "items": { 59 | "type": "object", 60 | "properties": { 61 | "ref": {"type": "string"}, 62 | "val": {"type": "number"}, 63 | }, 64 | }, 65 | } 66 | 67 | example_schema = { 68 | "type": "object", 69 | "properties": { 70 | "param": {"type": "integer"}, 71 | "records": records_schema, 72 | }, 73 | } 74 | 75 | 76 | # test parser mode jsonnet 77 | 78 | 79 | @skip_if_jsonschema_unavailable 80 | def test_parser_mode_jsonnet(tmp_path): 81 | parser = ArgumentParser(parser_mode="jsonnet", exit_on_error=False) 82 | parser.add_argument("--cfg", action="config") 83 | parser.add_argument("--param", type=int) 84 | parser.add_argument("--records", 
action=ActionJsonSchema(schema=records_schema)) 85 | 86 | jsonnet_file = tmp_path / "example.jsonnet" 87 | jsonnet_file.write_text(example_1_jsonnet) 88 | 89 | cfg = parser.parse_args([f"--cfg={jsonnet_file}"]) 90 | assert 654 == cfg.param 91 | assert 9 == len(cfg.records) 92 | assert "#8" == cfg.records[-2]["ref"] 93 | assert 15.5 == cfg.records[-2]["val"] 94 | 95 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--cfg", "{}}"])) 96 | 97 | 98 | def test_parser_mode_jsonnet_import_libsonnet(parser, tmp_cwd): 99 | parser.parser_mode = "jsonnet" 100 | parser.add_argument("--cfg", action="config") 101 | parser.add_argument("--name", type=str, default="Lucky") 102 | parser.add_argument("--prize", type=int, default=100) 103 | 104 | Path("conf").mkdir() 105 | Path("conf", "name.libsonnet").write_text('"Mike"') 106 | 107 | config_path = Path("conf", "test.jsonnet") 108 | config_path.write_text('local name = import "name.libsonnet"; {"name": name, "prize": 80}') 109 | 110 | cfg = parser.parse_args([f"--cfg={config_path}"]) 111 | assert cfg.name == "Mike" 112 | assert cfg.prize == 80 113 | assert str(cfg.cfg[0]) == str(config_path) 114 | 115 | 116 | def test_parser_mode_jsonnet_subconfigs(parser, tmp_cwd): 117 | class Class: 118 | def __init__(self, name: str = "Lucky", prize: int = 100): 119 | pass 120 | 121 | parser.parser_mode = "jsonnet" 122 | parser.add_class_arguments(Class, "group", sub_configs=True) 123 | 124 | Path("conf").mkdir() 125 | Path("conf", "name.libsonnet").write_text('"Mike"') 126 | config_path = Path("conf", "test.jsonnet") 127 | config_path.write_text('local name = import "name.libsonnet"; {"name": name, "prize": 80}') 128 | 129 | cfg = parser.parse_args([f"--group={config_path}"]) 130 | assert cfg.group.name == "Mike" 131 | assert cfg.group.prize == 80 132 | 133 | 134 | # test action jsonnet 135 | 136 | 137 | @skip_if_jsonschema_unavailable 138 | def test_action_jsonnet(parser): 139 | parser.add_argument("--input.ext_vars", type=dict) 140 | 
parser.add_argument( 141 | "--input.jsonnet", 142 | action=ActionJsonnet(ext_vars="input.ext_vars", schema=json.dumps(example_schema)), 143 | ) 144 | 145 | cfg2 = parser.parse_args(["--input.ext_vars", '{"param": 123}', "--input.jsonnet", example_2_jsonnet]) 146 | assert 123 == cfg2.input.jsonnet["param"] 147 | assert 9 == len(cfg2.input.jsonnet["records"]) 148 | assert "#8" == cfg2.input.jsonnet["records"][-2]["ref"] 149 | assert 15.5 == cfg2.input.jsonnet["records"][-2]["val"] 150 | 151 | cfg1 = parser.parse_args(["--input.jsonnet", example_1_jsonnet]) 152 | assert cfg1.input.jsonnet["records"] == cfg2.input.jsonnet["records"] 153 | 154 | with pytest.raises(ArgumentError): 155 | parser.parse_args(["--input.ext_vars", '{"param": "a"}', "--input.jsonnet", example_2_jsonnet]) 156 | with pytest.raises(ArgumentError): 157 | parser.parse_args(["--input.jsonnet", example_2_jsonnet]) 158 | 159 | 160 | def test_action_jsonnet_save_config_metadata(parser, tmp_path): 161 | parser.add_argument("--ext_vars", type=dict) 162 | parser.add_argument("--jsonnet", action=ActionJsonnet(ext_vars="ext_vars")) 163 | parser.add_argument("--cfg", action="config") 164 | 165 | jsonnet_file = tmp_path / "example.jsonnet" 166 | jsonnet_file.write_text(example_2_jsonnet) 167 | output_config = tmp_path / "output" / "main.yaml" 168 | output_jsonnet = tmp_path / "output" / "example.jsonnet" 169 | (tmp_path / "output").mkdir() 170 | 171 | # save the config with metadata and verify it is saved as two files 172 | cfg = parser.parse_args(["--ext_vars", '{"param": 123}', f"--jsonnet={jsonnet_file}"]) 173 | assert str(cfg.jsonnet["__path__"]) == str(jsonnet_file) 174 | parser.save(cfg, output_config) 175 | assert output_config.is_file() 176 | assert output_jsonnet.is_file() 177 | 178 | # rewrite the config to make sure that ext_vars is after jsonnet 179 | main_cfg = json_or_yaml_load(output_config.read_text()) 180 | main_cfg = {k: main_cfg[k] for k in ["jsonnet", "ext_vars"]} 181 | if pyyaml_available: 
182 | import yaml 183 | 184 | output_config.write_text(yaml.safe_dump(main_cfg, sort_keys=False)) 185 | else: 186 | output_config.write_text(json.dumps(main_cfg)) 187 | 188 | # parse using saved config and verify result is the same 189 | cfg2 = parser.parse_args([f"--cfg={output_config}"]) 190 | cfg2.cfg = None 191 | assert strip_meta(cfg) == strip_meta(cfg2) 192 | 193 | # save the config without metadata and verify it is saved as a single file 194 | output_config.unlink() 195 | output_jsonnet.unlink() 196 | parser.save(strip_meta(cfg), output_config) 197 | assert output_config.is_file() 198 | assert not output_jsonnet.is_file() 199 | 200 | # parse using saved config and verify result is the same 201 | cfg3 = parser.parse_args([f"--cfg={output_config}"]) 202 | cfg3.cfg = None 203 | assert strip_meta(cfg) == strip_meta(cfg3) 204 | 205 | 206 | @skip_if_jsonschema_unavailable 207 | def test_action_jsonnet_in_help(parser): 208 | parser.add_argument( 209 | "--jsonnet", 210 | action=ActionJsonnet(schema=example_schema), 211 | help="schema: %s", 212 | ) 213 | help_str = get_parser_help(parser) 214 | schema = re.sub( 215 | "^.*schema:([^()]+)[^{}]*$", 216 | r"\1", 217 | help_str.replace("\n", " "), 218 | ) 219 | assert example_schema == json.loads(schema) 220 | 221 | 222 | def test_action_jsonnet_parse_method(): 223 | parsed = ActionJsonnet().parse(example_2_jsonnet, ext_vars={"param": 123}) 224 | assert 123 == parsed["param"] 225 | assert 9 == len(parsed["records"]) 226 | assert "#8" == parsed["records"][-2]["ref"] 227 | assert 15.5 == parsed["records"][-2]["val"] 228 | 229 | 230 | def test_action_jsonnet_ext_vars_default(parser): 231 | parser.add_argument("--ext_vars", type=dict, default={"param": 432}) 232 | parser.add_argument("--jsonnet", action=ActionJsonnet(ext_vars="ext_vars")) 233 | cfg = parser.parse_args(["--jsonnet", example_2_jsonnet]) 234 | assert 432 == cfg.jsonnet["param"] 235 | 236 | 237 | def test_action_jsonnet_ext_vars_not_defined(parser): 238 | with 
pytest.raises(ValueError) as ctx: 239 | parser.add_argument("--jsonnet", action=ActionJsonnet(ext_vars="ext_vars")) 240 | ctx.match("No argument found for ext_vars") 241 | 242 | 243 | def test_action_jsonnet_ext_vars_invalid_type(parser): 244 | parser.add_argument("--ext_vars", type=list) 245 | with pytest.raises(ValueError) as ctx: 246 | parser.add_argument("--jsonnet", action=ActionJsonnet(ext_vars="ext_vars")) 247 | ctx.match("Type for ext_vars='ext_vars' argument must be dict") 248 | 249 | 250 | def test_action_jsonnet_ext_vars_invalid_default(parser): 251 | parser.add_argument("--ext_vars", type=dict, default="none") 252 | with pytest.raises(ValueError) as ctx: 253 | parser.add_argument("--jsonnet", action=ActionJsonnet(ext_vars="ext_vars")) 254 | ctx.match("Default value for the ext_vars='ext_vars' argument must be dict or None") 255 | 256 | 257 | # other tests 258 | 259 | 260 | @skip_if_jsonschema_unavailable 261 | def test_action_jsonnet_schema_dict_or_str(): 262 | action1 = ActionJsonnet(schema=example_schema) 263 | action2 = ActionJsonnet(schema=json.dumps(example_schema)) 264 | assert action1._validator.schema == action2._validator.schema 265 | 266 | 267 | @skip_if_jsonschema_unavailable 268 | def test_action_jsonnet_init_failures(): 269 | pytest.raises(ValueError, lambda: ActionJsonnet(ext_vars=2)) 270 | from jsonschema.exceptions import SchemaError 271 | 272 | pytest.raises((ValueError, SchemaError), lambda: ActionJsonnet(schema="." 
+ json.dumps(example_schema))) 273 | pytest.raises((ValueError, SchemaError), lambda: ActionJsonnet(schema=".")) 274 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_jsonschema.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import json 4 | import re 5 | from importlib.util import find_spec 6 | 7 | import pytest 8 | 9 | from jsonargparse import ActionJsonSchema, ArgumentError 10 | from jsonargparse_tests.conftest import get_parser_help, json_or_yaml_dump 11 | 12 | 13 | @pytest.fixture(autouse=True) 14 | def skip_if_jsonschema_unavailable(): 15 | if not find_spec("jsonschema"): 16 | pytest.skip("jsonschema package is required") 17 | 18 | 19 | # test schema array 20 | 21 | schema_array = { 22 | "type": "array", 23 | "items": {"type": "integer"}, 24 | } 25 | 26 | 27 | @pytest.fixture 28 | def parser_schema_array(parser): 29 | parser.add_argument( 30 | "--op1", 31 | action=ActionJsonSchema(schema=schema_array), 32 | help="schema: %s", 33 | ) 34 | return parser 35 | 36 | 37 | @pytest.mark.usefixtures("parser_schema_array") 38 | def test_schema_in_help(parser): 39 | help_str = get_parser_help(parser) 40 | help_schema = re.sub( 41 | "^.*schema:([^()]+)[^{}]*$", 42 | r"\1", 43 | help_str.replace("\n", " "), 44 | ) 45 | assert schema_array == json.loads(help_schema) 46 | 47 | 48 | @pytest.mark.usefixtures("parser_schema_array") 49 | def test_schema_array_parse_args(parser): 50 | assert [0, 1, 2] == parser.parse_args(["--op1", "[0, 1, 2]"]).op1 51 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--op1", '[1, "two"]'])) 52 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--op1", "[1.5, 2]"])) 53 | 54 | 55 | @pytest.mark.usefixtures("parser_schema_array") 56 | def test_schema_array_parse_string(parser): 57 | cfg = parser.parse_string(json_or_yaml_dump({"op1": [3, 7]})) 58 | assert [3, 7] == cfg["op1"] 59 | 60 | 61 | 
@pytest.mark.usefixtures("parser_schema_array") 62 | def test_schema_array_parse_path(parser, tmp_path): 63 | path = tmp_path / "op1.json" 64 | path.write_text('{"op1": [-1, 1, 0]}') 65 | cfg = parser.parse_path(path) 66 | assert [-1, 1, 0] == cfg["op1"] 67 | 68 | 69 | # test schema object 70 | 71 | 72 | @pytest.fixture 73 | def parser_schema_object(parser): 74 | schema_object = { 75 | "type": "object", 76 | "properties": { 77 | "k1": {"type": "string"}, 78 | "k2": {"type": "integer"}, 79 | "k3": { 80 | "type": "number", 81 | "default": 17, 82 | }, 83 | }, 84 | "additionalProperties": False, 85 | } 86 | parser.add_argument("--op2", action=ActionJsonSchema(schema=schema_object, with_meta=False)) 87 | parser.add_argument("--cfg", action="config") 88 | return parser 89 | 90 | 91 | @pytest.mark.usefixtures("parser_schema_object") 92 | def test_schema_object_parse_args(parser): 93 | op2_val = {"k1": "one", "k2": 2, "k3": 3.3} 94 | assert op2_val == parser.parse_args(["--op2", json_or_yaml_dump(op2_val)]).op2 95 | assert 17 == parser.parse_args(["--op2", '{"k2": 2}']).op2["k3"] 96 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--op2", '{"k1": 1}'])) 97 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--op2", '{"k2": "2"}'])) 98 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--op2", '{"k4": 4}'])) 99 | 100 | 101 | @pytest.mark.usefixtures("parser_schema_object") 102 | def test_schema_object_parse_string(parser): 103 | op2_val = {"k1": "two", "k2": 7, "k3": 2.4} 104 | cfg = parser.parse_string(json_or_yaml_dump({"op2": op2_val})) 105 | assert op2_val == cfg["op2"] 106 | 107 | 108 | @pytest.mark.usefixtures("parser_schema_object") 109 | def test_schema_object_parse_config(parser, tmp_path): 110 | op2_val = {"k1": "three", "k2": -3, "k3": 0.4} 111 | path = tmp_path / "op2.json" 112 | path.write_text(json_or_yaml_dump({"op2": op2_val})) 113 | cfg = parser.parse_args([f"--cfg={path}"]) 114 | assert op2_val == cfg["op2"] 115 | 116 | 117 | def 
test_schema_oneof_add_defaults(parser): 118 | schema = { 119 | "oneOf": [ 120 | { 121 | "type": "object", 122 | "properties": { 123 | "kind": {"type": "string", "enum": ["x"]}, 124 | "pc": {"type": "integer", "default": 1}, 125 | "px": {"type": "string", "default": "dx"}, 126 | }, 127 | }, 128 | { 129 | "type": "object", 130 | "properties": { 131 | "kind": {"type": "string", "enum": ["y"]}, 132 | "pc": {"type": "integer", "default": 2}, 133 | "py": {"type": "string", "default": "dy"}, 134 | }, 135 | }, 136 | ] 137 | } 138 | parser.add_argument("--data", action=ActionJsonSchema(schema=schema)) 139 | parser.add_argument("--cfg", action="config") 140 | 141 | cfg = parser.parse_args(['--data={"kind": "x"}']) 142 | assert cfg.data == {"kind": "x", "pc": 1, "px": "dx"} 143 | 144 | cfg = parser.parse_args(['--data={"kind": "y", "pc": 3}']) 145 | assert cfg.data == {"kind": "y", "pc": 3, "py": "dy"} 146 | 147 | 148 | # other tests 149 | 150 | 151 | def test_action_jsonschema_schema_dict_or_str(): 152 | action1 = ActionJsonSchema(schema=schema_array) 153 | action2 = ActionJsonSchema(schema=json.dumps(schema_array)) 154 | assert action1._validator.schema == action2._validator.schema 155 | 156 | 157 | def test_action_jsonschema_init_failures(): 158 | pytest.raises(ValueError, ActionJsonSchema) 159 | from jsonschema.exceptions import SchemaError 160 | 161 | pytest.raises((ValueError, SchemaError), lambda: ActionJsonSchema(schema=":")) 162 | pytest.raises((ValueError, SchemaError), lambda: ActionJsonSchema(schema=".")) 163 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_loaders_dumpers.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import json 4 | from dataclasses import dataclass 5 | from pathlib import Path 6 | from typing import List 7 | from unittest.mock import patch 8 | 9 | import pytest 10 | 11 | from jsonargparse import 
ArgumentParser, get_loader, set_dumper, set_loader 12 | from jsonargparse._common import parser_context 13 | from jsonargparse._loaders_dumpers import load_value 14 | from jsonargparse._optionals import pyyaml_available, toml_dump_available, toml_load_available 15 | from jsonargparse_tests.conftest import get_parse_args_stdout, json_or_yaml_dump, json_or_yaml_load, skip_if_no_pyyaml 16 | 17 | if pyyaml_available: 18 | import yaml 19 | 20 | 21 | @skip_if_no_pyyaml 22 | def test_set_dumper_custom_yaml(parser): 23 | parser.add_argument("--list", type=List[int]) 24 | 25 | def custom_yaml_dump(data) -> str: 26 | return yaml.safe_dump(data, default_flow_style=True) 27 | 28 | with patch.dict("jsonargparse._loaders_dumpers.dumpers"): 29 | set_dumper("yaml_custom", custom_yaml_dump) 30 | cfg = parser.parse_args(["--list=[1,2,3]"]) 31 | dump = parser.dump(cfg, format="yaml_custom") 32 | assert dump == "{list: [1, 2, 3]}\n" 33 | 34 | 35 | def test_yaml_implicit_mapping_values_disabled(parser): 36 | parser.add_argument("--val", type=str) 37 | assert "{one}" == parser.parse_args(["--val={one}"]).val 38 | assert "{one,two,three}" == parser.parse_args(["--val={one,two,three}"]).val 39 | 40 | 41 | class Bar: 42 | def __init__(self, x: str): 43 | pass 44 | 45 | 46 | def test_yaml_implicit_null_disabled(parser): 47 | parser.add_argument("--bar", type=Bar) 48 | cfg = parser.parse_args(["--bar=Bar", "--bar.x=Foo:"]) 49 | assert "Foo:" == cfg.bar.init_args.x 50 | 51 | 52 | def test_invalid_parser_mode(): 53 | pytest.raises(ValueError, lambda: ArgumentParser(parser_mode="invalid")) 54 | 55 | 56 | def test_get_loader(): 57 | from jsonargparse._loaders_dumpers import jsonnet_load 58 | 59 | assert jsonnet_load is get_loader("jsonnet") 60 | 61 | 62 | def test_set_loader_parser_mode_subparsers(parser, subparser): 63 | subcommands = parser.add_subcommands() 64 | subcommands.add_subcommand("sub", subparser) 65 | 66 | with patch.dict("jsonargparse._loaders_dumpers.loaders"): 67 | 
set_loader("custom", yaml.safe_load if pyyaml_available else json.loads) 68 | parser.parser_mode = "custom" 69 | assert "custom" == parser.parser_mode 70 | assert "custom" == subparser.parser_mode 71 | 72 | 73 | @skip_if_no_pyyaml 74 | def test_dump_header_yaml(parser): 75 | parser.add_argument("--int", type=int, default=1) 76 | parser.dump_header = ["line 1", "line 2"] 77 | dump = parser.dump(parser.get_defaults()) 78 | assert dump == "# line 1\n# line 2\nint: 1\n" 79 | 80 | 81 | def test_dump_header_json(parser): 82 | parser.add_argument("--int", type=int, default=1) 83 | parser.dump_header = ["line 1", "line 2"] 84 | dump = parser.dump(parser.get_defaults(), format="json") 85 | assert dump == '{"int":1}' 86 | 87 | 88 | def test_dump_header_invalid(parser): 89 | with pytest.raises(ValueError): 90 | parser.dump_header = True 91 | 92 | 93 | @skip_if_no_pyyaml 94 | def test_load_value_dash(): 95 | with parser_context(load_value_mode="yaml"): 96 | assert "-" == load_value("-") 97 | assert " - " == load_value(" - ") 98 | 99 | 100 | @dataclass 101 | class CustomData: 102 | fn: dict 103 | 104 | 105 | class CustomContainer: 106 | def __init__(self, data: CustomData): 107 | self.data = data 108 | 109 | 110 | def custom_loader(data): 111 | if pyyaml_available: 112 | data = yaml.safe_load(data) 113 | else: 114 | data = json.loads(data) 115 | if isinstance(data, dict) and "fn" in data: 116 | data["fn"] = {k: f"custom loaded {v}" for k, v in data["fn"].items()} 117 | return data 118 | 119 | 120 | def custom_dumper(data): 121 | if "data" in data and "fn" in data["data"]: 122 | data["data"]["fn"] = {k: "dumped" for k in data["data"]["fn"]} 123 | return json_or_yaml_dump(data) 124 | 125 | 126 | def test_nested_parser_mode(parser): 127 | set_loader("custom", custom_loader) 128 | set_dumper("custom", custom_dumper) 129 | parser.parser_mode = "custom" 130 | parser.add_argument("--custom", type=CustomContainer) 131 | cfg = parser.parse_args(['--custom.data={"fn": {"key": 
"value"}}']) 132 | assert cfg.custom.init_args.data["fn"]["key"] == "custom loaded value" 133 | dump = json_or_yaml_load(parser.dump(cfg)) 134 | assert dump["custom"]["init_args"]["data"] == {"fn": {"key": "dumped"}} 135 | 136 | 137 | # toml tests 138 | 139 | 140 | toml_config = """ 141 | root = "-" 142 | 143 | [group] 144 | child1 = 1.2 145 | child2 = [ 3.0, 4.5,] 146 | """ 147 | 148 | 149 | @pytest.mark.skipif(not toml_load_available, reason="tomllib or toml package is required") 150 | def test_toml_parse_args_config(parser, tmp_cwd): 151 | parser.parser_mode = "toml" 152 | config_path = Path("config.toml") 153 | config_path.write_text(toml_config) 154 | parser.add_argument("--cfg", action="config") 155 | parser.add_argument("--root", type=str) 156 | parser.add_argument("--group.child1", type=float) 157 | parser.add_argument("--group.child2", type=List[float]) 158 | cfg = parser.parse_args([f"--cfg={config_path}"]) 159 | assert cfg.root == "-" 160 | assert cfg.group.as_dict() == {"child1": 1.2, "child2": [3.0, 4.5]} 161 | 162 | 163 | @pytest.mark.skipif(not toml_dump_available, reason="toml package is required") 164 | def test_toml_print_config(parser): 165 | parser.parser_mode = "toml" 166 | parser.add_argument("--config", action="config") 167 | parser.add_argument("--root", type=str, default="-") 168 | parser.add_argument("--group.child1", type=float, default=1.2) 169 | parser.add_argument("--group.child2", type=List[float], default=[3.0, 4.5]) 170 | out = get_parse_args_stdout(parser, ["--print_config"]) 171 | assert out.strip() == toml_config.strip() 172 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_namespace.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import argparse 4 | import platform 5 | 6 | import pytest 7 | 8 | from jsonargparse import Namespace, dict_to_namespace 9 | from jsonargparse._namespace import 
NSKeyError, meta_keys 10 | 11 | skip_if_no_setattr_insertion_order = pytest.mark.skipif( 12 | platform.python_implementation() != "CPython", 13 | reason="requires __setattr__ insertion order", 14 | ) 15 | 16 | 17 | def test_shallow_dot_set_get(): 18 | ns = Namespace() 19 | ns.a = 1 20 | assert 1 == ns.a 21 | assert ns == Namespace(a=1) 22 | 23 | 24 | def test_shallow_attr_set_get_del(): 25 | ns = Namespace() 26 | setattr(ns, "a", 1) 27 | assert 1 == getattr(ns, "a") 28 | assert ns == Namespace(a=1) 29 | delattr(ns, "a") 30 | with pytest.raises(AttributeError): 31 | getattr(ns, "a") 32 | 33 | 34 | def test_shallow_item_set_get_del(): 35 | ns = Namespace() 36 | ns["a"] = 1 37 | assert 1 == ns["a"] 38 | assert ns == Namespace(a=1) 39 | del ns["a"] 40 | with pytest.raises(KeyError): 41 | ns["a"] 42 | 43 | 44 | def test_nested_item_set_get(): 45 | ns = Namespace() 46 | ns["x.y.z"] = 1 47 | assert Namespace(x=Namespace(y=Namespace(z=1))) == ns 48 | assert 1 == ns["x.y.z"] 49 | assert 1 == ns["x"]["y"]["z"] 50 | assert Namespace(z=1) == ns["x.y"] 51 | assert Namespace(z=1) == ns["x"]["y"] 52 | ns["x.y"] = 2 53 | assert 2 == ns["x.y"] 54 | 55 | 56 | def test_nested_item_set_del(): 57 | ns = Namespace() 58 | ns["x.y"] = 1 59 | assert Namespace(x=Namespace(y=1)) == ns 60 | del ns["x.y"] 61 | assert Namespace(x=Namespace()) == ns 62 | 63 | 64 | def test_get(): 65 | ns = Namespace() 66 | ns["x.y"] = 1 67 | assert 1 == ns.get("x.y") 68 | assert Namespace(y=1) == ns.get("x") 69 | assert 2 == ns.get("z", 2) 70 | assert ns.get("z") is None 71 | 72 | 73 | @pytest.mark.parametrize("key", [None, True, False, 1, 2.3]) 74 | def test_get_non_str_key(key): 75 | ns = Namespace() 76 | assert ns.get(key) is None 77 | assert ns.get(key, "abc") == "abc" 78 | 79 | 80 | def test_set_item_nested_dict(): 81 | ns = Namespace(d={"a": 1}) 82 | ns["d.b"] = 2 83 | assert 2 == ns["d"]["b"] 84 | 85 | 86 | @pytest.mark.parametrize("key", [None, True, False, 1, 2.3]) 87 | def 
test_contains_non_str_key(key): 88 | ns = Namespace() 89 | assert key not in ns 90 | 91 | 92 | def test_pop(): 93 | ns = Namespace() 94 | ns["x.y.z"] = 1 95 | assert 1 == ns.pop("x.y.z") 96 | assert ns == Namespace(x=Namespace(y=Namespace())) 97 | 98 | 99 | def test_nested_item_invalid_set(): 100 | ns = Namespace() 101 | with pytest.raises(KeyError): 102 | ns["x."] = 1 103 | with pytest.raises(KeyError): 104 | ns["x .y"] = 2 105 | 106 | 107 | def test_nested_key_in(): 108 | ns = Namespace() 109 | ns["x.y.z"] = 1 110 | assert "x" in ns 111 | assert "x.y" in ns 112 | assert "x.y.z" in ns 113 | assert "a" not in ns 114 | assert "x.a" not in ns 115 | assert "x.y.a" not in ns 116 | assert "x.y.z.a" not in ns 117 | assert "x..y" not in ns 118 | assert 123 not in ns 119 | 120 | 121 | @skip_if_no_setattr_insertion_order 122 | def test_items_generator(): 123 | ns = Namespace() 124 | ns["a"] = 1 125 | ns["b.c"] = 2 126 | ns["b.d"] = 3 127 | ns["p.q.r"] = {"x": 4, "y": 5} 128 | items = list(ns.items()) 129 | assert items == [("a", 1), ("b.c", 2), ("b.d", 3), ("p.q.r", {"x": 4, "y": 5})] 130 | 131 | 132 | @skip_if_no_setattr_insertion_order 133 | def test_keys_generator(): 134 | ns = Namespace() 135 | ns["a"] = 1 136 | ns["b.c"] = 2 137 | ns["b.d"] = 3 138 | ns["p.q.r"] = {"x": 4, "y": 5} 139 | keys = list(ns.keys()) 140 | assert keys == ["a", "b.c", "b.d", "p.q.r"] 141 | 142 | 143 | @skip_if_no_setattr_insertion_order 144 | def test_values_generator(): 145 | ns = Namespace() 146 | ns["a"] = 1 147 | ns["b.c"] = 2 148 | ns["b.d"] = 3 149 | ns["p.q.r"] = {"x": 4, "y": 5} 150 | values = list(ns.values()) 151 | assert values == [1, 2, 3, {"x": 4, "y": 5}] 152 | 153 | 154 | def test_non_str_keys(): 155 | ns = Namespace(a=Namespace(b=Namespace(c=1))) 156 | with pytest.raises(NSKeyError, match="Key must be a string, got: 0"): 157 | [x for x in ns.a.b] 158 | 159 | 160 | def test_namespace_from_dict(): 161 | dic = {"a": 1, "b": {"c": 2}} 162 | ns = Namespace(dic) 163 | assert ns == 
Namespace(a=1, b={"c": 2}) 164 | 165 | 166 | def test_as_dict(): 167 | ns = Namespace() 168 | ns["w"] = 1 169 | ns["x.y"] = 2 170 | ns["x.z"] = 3 171 | ns["p"] = {"q": Namespace(r=4)} 172 | assert ns.as_dict() == {"w": 1, "x": {"y": 2, "z": 3}, "p": {"q": {"r": 4}}} 173 | assert Namespace().as_dict() == {} 174 | 175 | 176 | def test_as_flat(): 177 | ns = Namespace() 178 | ns["w"] = 1 179 | ns["x.y.z"] = 2 180 | flat = ns.as_flat() 181 | assert isinstance(flat, argparse.Namespace) 182 | assert vars(flat) == {"w": 1, "x.y.z": 2} 183 | 184 | 185 | def test_clone(): 186 | ns = Namespace() 187 | pqr = {"x": 4, "y": 5} 188 | ns["a"] = 1 189 | ns["p.q.r"] = pqr 190 | assert ns["p.q.r"] is pqr 191 | assert ns.clone() == ns 192 | assert ns.clone()["p.q.r"] is not pqr 193 | assert ns.clone()["p.q"] is not ns["p.q"] 194 | 195 | 196 | def test_update_shallow(): 197 | ns_from = Namespace(a=1, b=None) 198 | ns_to = Namespace(a=None, b=2, c=3) 199 | ns_to.update(ns_from) 200 | assert ns_to == Namespace(a=1, b=None, c=3) 201 | 202 | 203 | def test_update_invalid(): 204 | ns = Namespace() 205 | with pytest.raises(KeyError): 206 | ns.update(123) 207 | 208 | 209 | def test_init_from_argparse_flat_namespace(): 210 | argparse_ns = argparse.Namespace() 211 | setattr(argparse_ns, "w", 0) 212 | setattr(argparse_ns, "x.y.a", 1) 213 | setattr(argparse_ns, "x.y.b", 2) 214 | setattr(argparse_ns, "z.c", 3) 215 | ns = Namespace(argparse_ns) 216 | assert ns == Namespace(w=0, x=Namespace(y=Namespace(a=1, b=2)), z=Namespace(c=3)) 217 | 218 | 219 | def test_init_invalid(): 220 | with pytest.raises(ValueError): 221 | Namespace(1) 222 | with pytest.raises(ValueError): 223 | Namespace(argparse.Namespace(), x=1) 224 | 225 | 226 | def test_dict_to_namespace(): 227 | ns1 = Namespace(a=1, b=Namespace(c=2), d=[Namespace(e=3)]) 228 | dic = {"a": 1, "b": {"c": 2}, "d": [{"e": 3}]} 229 | ns2 = dict_to_namespace(dic) 230 | assert ns1 == ns2 231 | 232 | 233 | def test_use_for_kwargs(): 234 | def func(a=1, b=2, 
c=3): 235 | return a, b, c 236 | 237 | kwargs = Namespace(a=4, c=5) 238 | val = func(**kwargs) 239 | assert val == (4, 2, 5) 240 | 241 | 242 | def test_shallow_clashing_keys(): 243 | ns = Namespace() 244 | assert "get" not in ns 245 | exec("ns.get = 1") 246 | assert "get" in ns 247 | assert ns.get("get") == 1 248 | assert dict(ns.items()) == {"get": 1} 249 | ns["pop"] = 2 250 | assert ns["pop"] == 2 251 | assert ns.as_dict() == {"get": 1, "pop": 2} 252 | assert ns.pop("get") == 1 253 | assert dict(**ns) == {"pop": 2} 254 | assert ns.as_flat() == argparse.Namespace(pop=2) 255 | del ns["pop"] 256 | assert ns == Namespace() 257 | assert Namespace(update=3).as_dict() == {"update": 3} 258 | 259 | 260 | def test_leaf_clashing_keys(): 261 | ns = Namespace() 262 | ns["x.get"] = 1 263 | assert "x.get" in ns 264 | assert ns.get("x.get") == 1 265 | assert ns["x.get"] == 1 266 | assert ns["x"]["get"] == 1 267 | assert ns.as_dict() == {"x": {"get": 1}} 268 | assert dict(ns.items()) == {"x.get": 1} 269 | assert str(ns.as_flat()) == "Namespace(**{'x.get': 1})" 270 | assert ns.pop("x.get") == 1 271 | assert ns.get("x.get") is None 272 | 273 | 274 | def test_shallow_branch_clashing_keys(): 275 | ns = Namespace(get=Namespace(x=2)) 276 | assert "get.x" in ns 277 | assert ns.get("get.x") == 2 278 | assert ns["get.x"] == 2 279 | assert ns["get"] == Namespace(x=2) 280 | assert ns.as_dict() == {"get": {"x": 2}} 281 | assert dict(ns.items()) == {"get.x": 2} 282 | assert ns.pop("get.x") == 2 283 | 284 | 285 | def test_nested_branch_clashing_keys(): 286 | ns = Namespace() 287 | ns["x.get.y"] = 3 288 | assert "x.get.y" in ns 289 | assert ns.get("x.get.y") == 3 290 | assert ns.as_dict() == {"x": {"get": {"y": 3}}} 291 | assert ns.pop("x.get.y") == 3 292 | 293 | 294 | @pytest.mark.parametrize("meta_key", meta_keys) 295 | def test_add_argument_meta_key_error(meta_key, parser): 296 | with pytest.raises(ValueError) as ctx: 297 | parser.add_argument(meta_key) 298 | ctx.match(f'"{meta_key}" not 
allowed') 299 | 300 | 301 | def test_items_branches_nested(): 302 | ns = Namespace() 303 | ns["a.b"] = 1 304 | ns["a.c"] = 2 305 | ns["d"] = 3 306 | 307 | items = list(ns.items(branches=True)) 308 | assert items == [("a", Namespace(b=1, c=2)), ("a.b", 1), ("a.c", 2), ("d", 3)] 309 | 310 | items = list(ns.items(branches=True, nested=False)) 311 | assert items == [("a", Namespace(b=1, c=2)), ("d", 3)] 312 | 313 | items = list(ns.items(nested=False)) 314 | assert items == [("d", 3)] 315 | 316 | 317 | def test_keys_branches_nested(): 318 | ns = Namespace() 319 | ns["a.b"] = 1 320 | ns["a.c"] = 2 321 | ns["d"] = 3 322 | 323 | keys = list(ns.keys(branches=True)) 324 | assert keys == ["a", "a.b", "a.c", "d"] 325 | 326 | keys = list(ns.keys(branches=True, nested=False)) 327 | assert keys == ["a", "d"] 328 | 329 | keys = list(ns.keys(nested=False)) 330 | assert keys == ["d"] 331 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_omegaconf.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import json 4 | import math 5 | import multiprocessing 6 | import os 7 | from dataclasses import dataclass 8 | from pathlib import Path 9 | from unittest.mock import patch 10 | 11 | import pytest 12 | 13 | from jsonargparse import ArgumentParser, Namespace 14 | from jsonargparse._common import parser_context, set_parsing_settings 15 | from jsonargparse._loaders_dumpers import loaders, yaml_dump 16 | from jsonargparse._optionals import omegaconf_absolute_to_relative_paths, omegaconf_support 17 | from jsonargparse.typing import Path_fr 18 | from jsonargparse_tests.conftest import get_parser_help 19 | 20 | if omegaconf_support: 21 | from omegaconf import OmegaConf 22 | 23 | skip_if_omegaconf_unavailable = pytest.mark.skipif( 24 | not omegaconf_support, 25 | reason="omegaconf package is required", 26 | ) 27 | 28 | 29 | @pytest.fixture(autouse=True) 30 | def 
patch_loaders(): 31 | with patch.dict("jsonargparse._loaders_dumpers.loaders"): 32 | yield 33 | 34 | 35 | @pytest.mark.skipif( 36 | not (omegaconf_support and "JSONARGPARSE_OMEGACONF_FULL_TEST" in os.environ), 37 | reason="only for omegaconf as the yaml loader", 38 | ) 39 | def test_omegaconf_as_yaml_loader(): 40 | assert loaders["yaml"] is loaders["omegaconf"] 41 | 42 | 43 | @skip_if_omegaconf_unavailable 44 | @pytest.mark.parametrize("mode", ["omegaconf", "omegaconf+"]) 45 | def test_omegaconf_interpolation(mode): 46 | parser = ArgumentParser(parser_mode=mode) 47 | parser.add_argument("--server.host", type=str) 48 | parser.add_argument("--server.port", type=int) 49 | parser.add_argument("--client.url", type=str) 50 | parser.add_argument("--config", action="config") 51 | 52 | config = { 53 | "server": { 54 | "host": "localhost", 55 | "port": 80, 56 | }, 57 | "client": { 58 | "url": "http://${server.host}:${server.port}/", 59 | }, 60 | } 61 | cfg = parser.parse_args([f"--config={yaml_dump(config)}"]) 62 | assert cfg.client.url == "http://localhost:80/" 63 | assert "url: http://localhost:80/" in parser.dump(cfg) 64 | 65 | 66 | @skip_if_omegaconf_unavailable 67 | @pytest.mark.parametrize("mode", ["omegaconf", "omegaconf+", "omegaconf+absolute"]) 68 | @patch.dict("jsonargparse._common.parsing_settings") 69 | def test_omegaconf_interpolation_in_subcommands(mode, parser, subparser): 70 | subparser.add_argument("--config", action="config") 71 | subparser.add_argument("--source", type=str) 72 | subparser.add_argument("--target", type=str) 73 | 74 | if mode == "omegaconf+absolute": 75 | set_parsing_settings(omegaconf_absolute_to_relative_paths=True) 76 | 77 | parser.parser_mode = mode.replace("absolute", "") 78 | subcommands = parser.add_subcommands() 79 | subcommands.add_subcommand("sub", subparser) 80 | 81 | config = { 82 | "source": "hello", 83 | "target": "${.source}" if mode == "omegaconf+" else "${source}", 84 | } 85 | cfg = parser.parse_args(["sub", 
f"--config={yaml_dump(config)}"]) 86 | assert cfg.sub.target == "hello" 87 | 88 | 89 | @dataclass 90 | class Server: 91 | host: str = "localhost" 92 | port: int = 80 93 | 94 | 95 | @dataclass 96 | class Client: 97 | url: str = "http://example.com:8080" 98 | 99 | 100 | @skip_if_omegaconf_unavailable 101 | def test_omegaconf_global_interpolation(parser): 102 | parser.parser_mode = "omegaconf+" 103 | parser.add_class_arguments(Server, "server") 104 | parser.add_class_arguments(Client, "client") 105 | 106 | config = {"url": "http://${server.host}:${..server.port}/"} 107 | cfg = parser.parse_args([f"--client={yaml_dump(config)}"]) 108 | assert cfg.client == Namespace(url="http://localhost:80/") 109 | 110 | cfg = parser.parse_args([f"--client={yaml_dump(config)}", "--server.port=9000"]) 111 | assert cfg.client == Namespace(url="http://localhost:9000/") 112 | 113 | 114 | @skip_if_omegaconf_unavailable 115 | def test_omegaconf_global_resolver_config(parser): 116 | OmegaConf.register_new_resolver("increment", lambda x: x + 1) 117 | 118 | parser.parser_mode = "omegaconf+" 119 | parser.add_argument("--config", action="config") 120 | parser.add_argument("--value", type=int, default=0) 121 | parser.add_argument("--incremented", type=int, default=0) 122 | 123 | assert parser.parse_args([]) == Namespace(config=None, value=0, incremented=0) 124 | 125 | config = {"value": 1, "incremented": "${increment:${value}}"} 126 | cfg = parser.parse_args([f"--config={yaml_dump(config)}", "--value=5"]) 127 | assert cfg == Namespace(value=5, incremented=6) # currently config is lost 128 | 129 | OmegaConf.clear_resolver("increment") 130 | 131 | 132 | @skip_if_omegaconf_unavailable 133 | def test_omegaconf_global_resolver_argument(parser): 134 | def const(expr: str): 135 | allowed = {"pi": math.pi} 136 | return eval(expr, {"__builtins__": None}, allowed) 137 | 138 | OmegaConf.register_new_resolver("const", const) 139 | 140 | parser.parser_mode = "omegaconf+" 141 | parser.add_argument("--value", 
type=float) 142 | cfg = parser.parse_args(["--value=${const:3*pi/4}"]) 143 | assert cfg.value == 3 * math.pi / 4 144 | 145 | OmegaConf.clear_resolver("const") 146 | 147 | 148 | @skip_if_omegaconf_unavailable 149 | @patch.dict(os.environ, {"X": "true"}) 150 | def test_omegaconf_global_resolver_default(parser): 151 | parser.parser_mode = "omegaconf+" 152 | action = parser.add_argument("--env", type=bool, default="${oc.env:X}") 153 | assert action.default == "${oc.env:X}" 154 | 155 | help_str = get_parser_help(parser) 156 | assert "default: ${oc.env:X}" in help_str 157 | 158 | cfg = parser.parse_args([]) 159 | assert cfg.env is True 160 | 161 | 162 | @dataclass 163 | class Nested: 164 | path: Path_fr 165 | 166 | 167 | @skip_if_omegaconf_unavailable 168 | @patch.dict(os.environ, {"X": "Y"}) 169 | def test_omegaconf_global_path_preserve_relative(parser, tmp_cwd): 170 | import yaml 171 | 172 | parser.parser_mode = "omegaconf+" 173 | parser.add_class_arguments(Nested, "nested") 174 | parser.add_argument("--env") 175 | 176 | subdir = Path("sub") 177 | subdir.mkdir() 178 | (subdir / "file").touch() 179 | nested = subdir / "nested.json" 180 | nested.write_text(json.dumps({"path": "file"})) 181 | 182 | cfg = parser.parse_args([f"--nested={nested}", "--env=${oc.env:X}"]) 183 | assert cfg.env == "Y" 184 | assert cfg.nested.path.relative == "file" 185 | assert cfg.nested.path.cwd == str(tmp_cwd / subdir) 186 | 187 | with parser_context(path_dump_preserve_relative=True): 188 | dump = yaml.safe_load(parser.dump(cfg))["nested"]["path"] 189 | assert dump == {"relative": "file", "cwd": str(tmp_cwd / subdir)} 190 | 191 | 192 | @skip_if_omegaconf_unavailable 193 | def test_omegaconf_inf_nan(parser): 194 | parser.parser_mode = "omegaconf+" 195 | parser.add_argument("--a", type=float, default=0.0) 196 | parser.add_argument("--b", type=float, default=1.0) 197 | parser.add_argument("--c", type=float, default=float("nan")) 198 | parser.add_argument("--d", type=float, default=float("inf")) 
199 | parser.add_argument("--e", type=float, default=float("-inf")) 200 | 201 | cfg = parser.parse_args(["--a=2.5", "--b=${a}"]) 202 | assert cfg.a == 2.5 203 | assert cfg.b == 2.5 204 | assert math.isnan(cfg.c) 205 | assert cfg.d == float("inf") 206 | assert cfg.e == float("-inf") 207 | 208 | 209 | @skip_if_omegaconf_unavailable 210 | def test_omegaconf_absolute_to_relative_paths(): 211 | data = { 212 | "a": "x", 213 | "b": "prefix ${a} suffix", 214 | "c": {"d": "${b}", "e": "${c.d}"}, 215 | "f": [10, "${c.e}", "${..b}"], 216 | "g": "${env:USER}", 217 | "h": "${f[0]}", 218 | } 219 | expected = { 220 | "a": "x", 221 | "b": "prefix ${.a} suffix", 222 | "c": {"d": "${..b}", "e": "${.d}"}, 223 | "f": [10, "${..c.e}", "${..b}"], 224 | "g": "${env:USER}", 225 | "h": "${.f[0]}", 226 | } 227 | assert omegaconf_absolute_to_relative_paths(data) == expected 228 | 229 | 230 | def parse_in_spawned_process(queue, parser, args): 231 | try: 232 | cfg = parser.parse_args(args) 233 | queue.put(cfg) 234 | except Exception as ex: 235 | queue.put(ex) 236 | 237 | 238 | @skip_if_omegaconf_unavailable 239 | def test_omegaconf_in_spawned_process(parser): 240 | parser.parser_mode = "omegaconf" 241 | parser.add_argument("--dict", type=dict) 242 | assert parser.parse_args(['--dict={"x":1}']).dict == {"x": 1} 243 | 244 | ctx = multiprocessing.get_context("spawn") 245 | queue = ctx.Queue() 246 | process = ctx.Process(target=parse_in_spawned_process, args=(queue, parser, ['--dict={"x":2}'])) 247 | process.start() 248 | process.join() 249 | 250 | cfg = queue.get() 251 | assert cfg.dict == {"x": 2} 252 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_optionals.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import sys 4 | 5 | import pytest 6 | 7 | from jsonargparse import set_parsing_settings 8 | from jsonargparse._optionals import ( 9 | 
_get_config_read_mode, 10 | docstring_parser_support, 11 | fallback_final, 12 | fsspec_support, 13 | get_docstring_parse_options, 14 | import_docstring_parser, 15 | import_fsspec, 16 | import_jsonnet, 17 | import_jsonschema, 18 | import_requests, 19 | import_ruamel, 20 | jsonnet_support, 21 | jsonschema_support, 22 | ruamel_support, 23 | url_support, 24 | ) 25 | from jsonargparse.typing import is_final_class 26 | from jsonargparse_tests.conftest import ( 27 | skip_if_docstring_parser_unavailable, 28 | skip_if_fsspec_unavailable, 29 | skip_if_requests_unavailable, 30 | ) 31 | 32 | # jsonschema support 33 | 34 | 35 | @pytest.mark.skipif(not jsonschema_support, reason="jsonschema package is required") 36 | def test_jsonschema_support_true(): 37 | import_jsonschema("test_jsonschema_support_true") 38 | 39 | 40 | @pytest.mark.skipif(jsonschema_support, reason="jsonschema package should not be installed") 41 | def test_jsonschema_support_false(): 42 | with pytest.raises(ImportError) as ctx: 43 | import_jsonschema("test_jsonschema_support_false") 44 | ctx.match("test_jsonschema_support_false") 45 | 46 | 47 | # jsonnet support 48 | 49 | 50 | @pytest.mark.skipif(not jsonnet_support, reason="jsonnet package is required") 51 | def test_jsonnet_support_true(): 52 | import_jsonnet("test_jsonnet_support_true") 53 | 54 | 55 | @pytest.mark.skipif(jsonnet_support, reason="jsonnet package should not be installed") 56 | def test_jsonnet_support_false(): 57 | with pytest.raises(ImportError) as ctx: 58 | import_jsonnet("test_jsonnet_support_false") 59 | ctx.match("test_jsonnet_support_false") 60 | 61 | 62 | # requests support 63 | 64 | 65 | @skip_if_requests_unavailable 66 | def test_url_support_true(): 67 | import_requests("test_url_support_true") 68 | 69 | 70 | @pytest.mark.skipif(url_support, reason="requests package should not be installed") 71 | def test_url_support_false(): 72 | with pytest.raises(ImportError) as ctx: 73 | import_requests("test_url_support_false") 74 | 
ctx.match("test_url_support_false") 75 | 76 | 77 | # docstring-parser support 78 | 79 | 80 | @skip_if_docstring_parser_unavailable 81 | def test_docstring_parser_support_true(): 82 | import_docstring_parser("test_docstring_parser_support_true") 83 | 84 | 85 | @pytest.mark.skipif(docstring_parser_support, reason="docstring-parser package should not be installed") 86 | def test_docstring_parser_support_false(): 87 | with pytest.raises(ImportError) as ctx: 88 | import_docstring_parser("test_docstring_parser_support_false") 89 | ctx.match("test_docstring_parser_support_false") 90 | 91 | 92 | @skip_if_docstring_parser_unavailable 93 | def test_docstring_parse_options(): 94 | from docstring_parser import DocstringStyle 95 | 96 | options = get_docstring_parse_options() 97 | options["style"] = None 98 | options = get_docstring_parse_options() 99 | 100 | for style in [DocstringStyle.NUMPYDOC, DocstringStyle.GOOGLE]: 101 | set_parsing_settings(docstring_parse_style=style) 102 | assert options["style"] == style 103 | with pytest.raises(ValueError): 104 | set_parsing_settings(docstring_parse_style="invalid") 105 | 106 | assert options["attribute_docstrings"] is False 107 | for attribute_docstrings in [True, False]: 108 | set_parsing_settings(docstring_parse_attribute_docstrings=attribute_docstrings) 109 | assert options["attribute_docstrings"] is attribute_docstrings 110 | with pytest.raises(ValueError): 111 | set_parsing_settings(docstring_parse_attribute_docstrings="invalid") 112 | 113 | 114 | # fsspec support 115 | 116 | 117 | @skip_if_fsspec_unavailable 118 | def test_fsspec_support_true(): 119 | import_fsspec("test_fsspec_support_true") 120 | 121 | 122 | @pytest.mark.skipif(fsspec_support, reason="fsspec package should not be installed") 123 | def test_fsspec_support_false(): 124 | with pytest.raises(ImportError) as ctx: 125 | import_fsspec("test_fsspec_support_false") 126 | ctx.match("test_fsspec_support_false") 127 | 128 | 129 | # ruamel.yaml support 130 | 131 | 132 | 
@pytest.mark.skipif(not ruamel_support, reason="ruamel.yaml package is required") 133 | def test_ruamel_support_true(): 134 | import_ruamel("test_ruamel_support_true") 135 | 136 | 137 | @pytest.mark.skipif(ruamel_support, reason="ruamel.yaml package should not be installed") 138 | def test_ruamel_support_false(): 139 | with pytest.raises(ImportError) as ctx: 140 | import_ruamel("test_ruamel_support_false") 141 | ctx.match("test_ruamel_support_false") 142 | 143 | 144 | # config read mode tests 145 | 146 | 147 | @skip_if_requests_unavailable 148 | def test_config_read_mode_url_support_true(): 149 | assert "fr" == _get_config_read_mode() 150 | set_parsing_settings(config_read_mode_urls_enabled=True) 151 | assert "fur" == _get_config_read_mode() 152 | set_parsing_settings(config_read_mode_urls_enabled=False) 153 | assert "fr" == _get_config_read_mode() 154 | 155 | 156 | @pytest.mark.skipif(url_support, reason="requests package should not be installed") 157 | def test_config_read_mode_url_support_false(): 158 | assert "fr" == _get_config_read_mode() 159 | with pytest.raises(ImportError): 160 | set_parsing_settings(config_read_mode_urls_enabled=True) 161 | assert "fr" == _get_config_read_mode() 162 | set_parsing_settings(config_read_mode_urls_enabled=False) 163 | assert "fr" == _get_config_read_mode() 164 | 165 | 166 | @skip_if_fsspec_unavailable 167 | def test_config_read_mode_fsspec_support_true(): 168 | assert "fr" == _get_config_read_mode() 169 | set_parsing_settings(config_read_mode_fsspec_enabled=True) 170 | assert "fsr" == _get_config_read_mode() 171 | set_parsing_settings(config_read_mode_fsspec_enabled=False) 172 | assert "fr" == _get_config_read_mode() 173 | 174 | 175 | @pytest.mark.skipif(fsspec_support, reason="fsspec package should not be installed") 176 | def test_config_read_mode_fsspec_support_false(): 177 | assert "fr" == _get_config_read_mode() 178 | with pytest.raises(ImportError): 179 | set_parsing_settings(config_read_mode_fsspec_enabled=True) 180 | 
assert "fr" == _get_config_read_mode() 181 | set_parsing_settings(config_read_mode_fsspec_enabled=False) 182 | assert "fr" == _get_config_read_mode() 183 | 184 | 185 | # final decorator tests 186 | 187 | 188 | @fallback_final 189 | class FinalClass: 190 | pass 191 | 192 | 193 | @pytest.mark.skipif(sys.version_info < (3, 11), reason="final decorator __final__ introduced in python 3.11") 194 | def test_final_decorator(): 195 | assert is_final_class(FinalClass) is True 196 | assert is_final_class(test_final_decorator) is False 197 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_parsing_settings.py: -------------------------------------------------------------------------------- 1 | import re 2 | from dataclasses import dataclass 3 | from typing import Literal, Optional 4 | from unittest.mock import patch 5 | 6 | import pytest 7 | 8 | from jsonargparse import ActionYesNo, ArgumentError, Namespace, set_parsing_settings 9 | from jsonargparse._common import get_parsing_setting 10 | from jsonargparse_tests.conftest import capture_logs, get_parse_args_stdout, get_parser_help 11 | from jsonargparse_tests.test_typehints import Optimizer 12 | 13 | 14 | @pytest.fixture(autouse=True) 15 | def patch_parsing_settings(): 16 | with patch.dict("jsonargparse._common.parsing_settings"): 17 | yield 18 | 19 | 20 | def test_get_parsing_setting_failure(): 21 | with pytest.raises(ValueError, match="Unknown parsing setting"): 22 | get_parsing_setting("unknown_setting") 23 | 24 | 25 | # validate_defaults 26 | 27 | 28 | def test_set_validate_defaults_failure(): 29 | with pytest.raises(ValueError, match="validate_defaults must be a boolean"): 30 | set_parsing_settings(validate_defaults="invalid") 31 | 32 | 33 | def test_validate_defaults_success(parser): 34 | set_parsing_settings(validate_defaults=True) 35 | 36 | parser.add_argument("--config", action="config") 37 | parser.add_argument("--num", type=int, default=1) 38 | 
parser.add_argument("--untyped", default=2) 39 | 40 | 41 | def test_validate_defaults_failure(parser): 42 | set_parsing_settings(validate_defaults=True) 43 | 44 | with pytest.raises(ValueError, match="Default value is not valid:"): 45 | parser.add_argument("--num", type=int, default="x") 46 | 47 | 48 | @dataclass 49 | class DataWithDefault: 50 | param: str = "foo" 51 | 52 | 53 | def test_validate_defaults_dataclass(parser): 54 | set_parsing_settings(validate_defaults=True) 55 | 56 | added_args = parser.add_class_arguments(DataWithDefault) 57 | assert added_args == ["param"] 58 | 59 | 60 | # parse_optionals_as_positionals 61 | 62 | 63 | def test_set_parse_optionals_as_positionals_failure(): 64 | with pytest.raises(ValueError, match="parse_optionals_as_positionals must be a boolean"): 65 | set_parsing_settings(parse_optionals_as_positionals="invalid") 66 | 67 | 68 | def test_parse_optionals_as_positionals_simple(parser, logger, subtests): 69 | set_parsing_settings(parse_optionals_as_positionals=True) 70 | 71 | parser.add_argument("p1", type=Optional[Literal["p1"]]) 72 | parser.add_argument("--o1", type=Optional[int]) 73 | parser.add_argument("--flag", default=False, nargs=0, action=ActionYesNo) 74 | parser.add_argument("--o2", type=Optional[Literal["o2"]]) 75 | parser.add_argument("--o3") 76 | 77 | with subtests.test("help"): 78 | help_str = get_parser_help(parser) 79 | assert " p1 [o1 [o2 [o3]]]" in help_str 80 | assert "extra positionals are parsed as optionals in the order shown above" in help_str 81 | 82 | with subtests.test("no extra positionals"): 83 | cfg = parser.parse_args(["--o2=o2", "--o1=1", "p1"]) 84 | assert cfg == Namespace(p1="p1", o1=1, o2="o2", o3=None, flag=False) 85 | 86 | with subtests.test("one extra positional"): 87 | cfg = parser.parse_args(["--o2=o2", "p1", "2"]) 88 | assert cfg == Namespace(p1="p1", o1=2, o2="o2", o3=None, flag=False) 89 | 90 | with subtests.test("two extra positionals"): 91 | cfg = parser.parse_args(["p1", "3", "o2"]) 92 | 
assert cfg == Namespace(p1="p1", o1=3, o2="o2", o3=None, flag=False) 93 | 94 | with subtests.test("three extra positionals"): 95 | cfg = parser.parse_args(["p1", "3", "o2", "v3"]) 96 | assert cfg == Namespace(p1="p1", o1=3, o2="o2", o3="v3", flag=False) 97 | 98 | with subtests.test("extra positional has precedence"): 99 | cfg = parser.parse_args(["p1", "3", "o2", "--o1=4"]) 100 | assert cfg == Namespace(p1="p1", o1=3, o2="o2", o3=None, flag=False) 101 | 102 | with subtests.test("extra positionals invalid values"): 103 | with pytest.raises(ArgumentError) as ex: 104 | parser.parse_args(["p1", "o2", "5"]) 105 | assert re.match('Parser key "o1".*Given value: o2', ex.value.message, re.DOTALL) 106 | 107 | with pytest.raises(ArgumentError) as ex: 108 | parser.parse_args(["p1", "6", "invalid"]) 109 | assert re.match('Parser key "o2".*Given value: invalid', ex.value.message, re.DOTALL) 110 | 111 | parser.logger = logger 112 | with subtests.test("unrecognized arguments"): 113 | with capture_logs(logger) as logs: 114 | with pytest.raises(ArgumentError, match="Unrecognized arguments: --unk=x"): 115 | parser.parse_args(["--unk=x"]) 116 | assert "Positional argument p1 missing, aborting _positional_optionals" in logs.getvalue() 117 | 118 | 119 | def test_parse_optionals_as_positionals_subcommands(parser, subparser, subtests): 120 | set_parsing_settings(parse_optionals_as_positionals=True) 121 | 122 | subparser.add_argument("p1", type=Optional[Literal["p1"]]) 123 | subparser.add_argument("--o1", type=Optional[int]) 124 | subparser.add_argument("--o2", type=Optional[Literal["o2"]]) 125 | parser.add_argument("--g1") 126 | subcommands = parser.add_subcommands() 127 | subcommands.add_subcommand("subcmd", subparser) 128 | 129 | with subtests.test("help global"): 130 | help_str = get_parser_help(parser) 131 | assert " [g1]" not in help_str 132 | assert "extra positionals are parsed as optionals in the order shown above" not in help_str 133 | 134 | with subtests.test("help subcommand"): 
135 | help_str = get_parse_args_stdout(parser, ["subcmd", "-h"]) 136 | assert " p1 [o1 [o2]]" in help_str 137 | assert "extra positionals are parsed as optionals in the order shown above" in help_str 138 | 139 | with subtests.test("no extra positionals"): 140 | cfg = parser.parse_args(["subcmd", "--o2=o2", "--o1=1", "p1"]) 141 | assert cfg.subcmd == Namespace(p1="p1", o1=1, o2="o2") 142 | 143 | with subtests.test("one extra positional"): 144 | cfg = parser.parse_args(["subcmd", "--o2=o2", "p1", "2"]) 145 | assert cfg.subcmd == Namespace(p1="p1", o1=2, o2="o2") 146 | 147 | with subtests.test("two extra positionals"): 148 | cfg = parser.parse_args(["subcmd", "p1", "3", "o2"]) 149 | assert cfg.subcmd == Namespace(p1="p1", o1=3, o2="o2") 150 | 151 | with subtests.test("extra positionals invalid values"): 152 | with pytest.raises(ArgumentError) as ex: 153 | parser.parse_args(["subcmd", "p1", "o2", "5"]) 154 | assert re.match('Parser key "o1".*Given value: o2', ex.value.message, re.DOTALL) 155 | 156 | 157 | def test_optionals_as_positionals_usage_wrap(parser): 158 | set_parsing_settings(parse_optionals_as_positionals=True) 159 | 160 | parser.prog = "long_prog_name" 161 | parser.add_argument("relatively_long_positional") 162 | parser.add_argument("--first_long_optional") 163 | parser.add_argument("--second_long_optional") 164 | 165 | help_str = get_parser_help(parser, columns="80") 166 | assert "usage: long_prog_name " in help_str 167 | assert " relatively_long_positional" in help_str 168 | assert " [first_long_optional [second_long_optional]]" in help_str 169 | 170 | 171 | @dataclass 172 | class DataOptions: 173 | d1: int = 1 174 | 175 | 176 | def test_optionals_as_positionals_unsupported_arguments(parser): 177 | set_parsing_settings(parse_optionals_as_positionals=True) 178 | 179 | parser.add_argument("p1", type=Optional[Literal["p1"]]) 180 | parser.add_argument("--o1", type=Optimizer) 181 | parser.add_argument("--o2", type=Optional[int]) 182 | 
parser.add_argument("--o3", type=DataOptions) 183 | parser.add_argument("--o4.n1", type=float) 184 | 185 | help_str = get_parser_help(parser) 186 | assert " p1 [o2 [o3.d1 [o4.n1]]]" in help_str 187 | 188 | help_str = get_parse_args_stdout(parser, ["--o1.help=Adam"]) 189 | assert "extra positionals are parsed as optionals in the order shown above" not in help_str 190 | 191 | 192 | # stubs_resolver_allow_py_files 193 | 194 | 195 | def test_set_stubs_resolver_allow_py_files_failure(): 196 | with pytest.raises(ValueError, match="stubs_resolver_allow_py_files must be a boolean"): 197 | set_parsing_settings(stubs_resolver_allow_py_files="invalid") 198 | 199 | 200 | # omegaconf_absolute_to_relative_paths 201 | 202 | 203 | def test_set_omegaconf_absolute_to_relative_paths_failure(): 204 | with pytest.raises(ValueError, match="omegaconf_absolute_to_relative_paths must be a boolean"): 205 | set_parsing_settings(omegaconf_absolute_to_relative_paths="invalid") 206 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_paths.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations 2 | 3 | import json 4 | import os 5 | from calendar import Calendar 6 | from pathlib import Path 7 | from typing import Any, Dict, List, Optional, Union 8 | 9 | import pytest 10 | 11 | from jsonargparse import ArgumentError, Namespace 12 | from jsonargparse.typing import Path_drw, Path_fc, Path_fr, path_type 13 | from jsonargparse_tests.conftest import get_parser_help, json_or_yaml_dump, json_or_yaml_load 14 | 15 | # stdlib path types tests 16 | 17 | 18 | def test_pathlib_path(parser, file_r): 19 | parser.add_argument("--path", type=Path) 20 | cfg = parser.parse_args([f"--path={file_r}"]) 21 | assert isinstance(cfg.path, Path) 22 | assert str(cfg.path) == file_r 23 | assert json_or_yaml_load(parser.dump(cfg)) == {"path": "file_r"} 24 | 25 | 26 | def test_os_pathlike(parser, file_r): 
27 | parser.add_argument("--path", type=os.PathLike) 28 | assert file_r == parser.parse_args([f"--path={file_r}"]).path 29 | 30 | 31 | # jsonargparse path types tests 32 | 33 | 34 | def test_path_fr(file_r): 35 | path = Path_fr(file_r) 36 | assert path == file_r 37 | assert path() == os.path.realpath(file_r) 38 | pytest.raises(TypeError, lambda: Path_fr("does_not_exist")) 39 | 40 | 41 | def test_path_fc_with_kwargs(tmpdir): 42 | path = Path_fc("some-file.txt", cwd=tmpdir) 43 | assert path() == os.path.join(tmpdir, "some-file.txt") 44 | 45 | 46 | def test_path_fr_already_registered(): 47 | assert Path_fr is path_type("fr") 48 | 49 | 50 | def test_paths_config_relative_absolute(parser, tmp_cwd): 51 | parser.add_argument("--cfg", action="config") 52 | parser.add_argument("--file", type=Path_fr) 53 | parser.add_argument("--dir", type=Path_drw) 54 | 55 | (tmp_cwd / "example").mkdir() 56 | rel_yaml_file = Path("..", "example", "example.yaml") 57 | abs_yaml_file = (tmp_cwd / "example" / rel_yaml_file).resolve() 58 | abs_yaml_file.write_text(json_or_yaml_dump({"file": str(rel_yaml_file), "dir": str(tmp_cwd)})) 59 | 60 | cfg = parser.parse_args([f"--cfg={abs_yaml_file}"]) 61 | assert os.path.realpath(tmp_cwd) == os.path.realpath(cfg.dir) 62 | assert str(rel_yaml_file) == str(cfg.file) 63 | assert str(abs_yaml_file) == os.path.realpath(cfg.file) 64 | 65 | cfg = parser.parse_args([f"--file={abs_yaml_file}", f"--dir={tmp_cwd}"]) 66 | assert str(abs_yaml_file) == os.path.realpath(cfg.file) 67 | assert str(tmp_cwd) == os.path.realpath(cfg.dir) 68 | 69 | pytest.raises(ArgumentError, lambda: parser.parse_args([f"--dir={abs_yaml_file}"])) 70 | pytest.raises(ArgumentError, lambda: parser.parse_args([f"--file={tmp_cwd}"])) 71 | 72 | 73 | def test_path_fc_nargs_plus(parser, tmp_cwd): 74 | parser.add_argument("--files", nargs="+", type=Path_fc) 75 | (tmp_cwd / "subdir").mkdir() 76 | cfg = parser.parse_args(["--files", "file1", "subdir/file2"]) 77 | assert isinstance(cfg.files, list) 78 
| assert 2 == len(cfg.files) 79 | assert str(tmp_cwd / "subdir" / "file2") == os.path.realpath(cfg.files[1]) 80 | 81 | 82 | def test_list_path_fc(parser, tmp_cwd): 83 | parser.add_argument("--paths", type=List[Path_fc]) 84 | cfg = parser.parse_args(['--paths=["file1", "file2"]']) 85 | assert ["file1", "file2"] == cfg.paths 86 | assert isinstance(cfg.paths[0], Path_fc) 87 | assert isinstance(cfg.paths[1], Path_fc) 88 | 89 | 90 | def test_optional_path_fr(parser, file_r): 91 | parser.add_argument("--path", type=Optional[Path_fr]) 92 | assert None is parser.parse_args(["--path=null"]).path 93 | cfg = parser.parse_args([f"--path={file_r}"]) 94 | assert file_r == cfg.path 95 | assert isinstance(cfg.path, Path_fr) 96 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--path=not_exist"])) 97 | 98 | 99 | def test_register_path_dcc_default_path(parser, tmp_cwd): 100 | path_dcc = path_type("dcc") 101 | parser.add_argument("--path", type=path_dcc, default=path_dcc("test")) 102 | cfg = parser.parse_args([]) 103 | assert {"path": "test"} == json_or_yaml_load(parser.dump(cfg)) 104 | help_str = get_parser_help(parser) 105 | assert "(type: Path_dcc, default: test)" in help_str 106 | 107 | 108 | def test_path_dump(parser, tmp_cwd): 109 | parser.add_argument("--path", type=Path_fc) 110 | cfg = parser.parse_string(json_or_yaml_dump({"path": "path"})) 111 | assert json_or_yaml_load(parser.dump(cfg)) == {"path": "path"} 112 | 113 | 114 | def test_paths_dump(parser, tmp_cwd): 115 | parser.add_argument("--paths", nargs="+", type=Path_fc) 116 | cfg = parser.parse_args(["--paths", "path1", "path2"]) 117 | assert json_or_yaml_load(parser.dump(cfg)) == {"paths": ["path1", "path2"]} 118 | 119 | 120 | # enable_path tests 121 | 122 | 123 | def test_enable_path_dict(parser, tmp_cwd): 124 | data = {"a": 1, "b": 2, "c": [3, 4]} 125 | Path("data.yaml").write_text(json.dumps(data)) 126 | 127 | parser.add_argument("--data", type=Dict[str, Any], enable_path=True) 128 | cfg = 
parser.parse_args(["--data=data.yaml"]) 129 | assert "data.yaml" == str(cfg["data"].pop("__path__")) 130 | assert data == cfg["data"] 131 | with pytest.raises(ArgumentError) as ctx: 132 | parser.parse_args(["--data=does-not-exist.yaml"]) 133 | ctx.match("does-not-exist.yaml either not accessible or invalid") 134 | 135 | 136 | def test_enable_path_subclass(parser, tmp_cwd): 137 | cal = {"class_path": "calendar.Calendar"} 138 | Path("cal.yaml").write_text(json.dumps(cal)) 139 | 140 | parser.add_argument("--cal", type=Calendar, enable_path=True) 141 | cfg = parser.parse_args(["--cal=cal.yaml"]) 142 | init = parser.instantiate_classes(cfg) 143 | assert isinstance(init["cal"], Calendar) 144 | 145 | 146 | def test_enable_path_list_path_fr(parser, tmp_cwd, mock_stdin, subtests): 147 | tmpdir = tmp_cwd / "subdir" 148 | tmpdir.mkdir() 149 | (tmpdir / "file1").touch() 150 | (tmpdir / "file2").touch() 151 | (tmpdir / "file3").touch() 152 | (tmpdir / "file4").touch() 153 | (tmpdir / "file5").touch() 154 | list_file1 = tmpdir / "files1.lst" 155 | list_file2 = tmpdir / "files2.lst" 156 | list_file3 = tmpdir / "files3.lst" 157 | list_file4 = tmpdir / "files4.lst" 158 | list_file1.write_text("file1\nfile2\nfile3\nfile4\n") 159 | list_file2.write_text("file5\n") 160 | list_file3.touch() 161 | list_file4.write_text("file1\nfile2\nfile6\n") 162 | 163 | parser.add_argument( 164 | "--list", 165 | type=List[Path_fr], 166 | enable_path=True, 167 | ) 168 | parser.add_argument( 169 | "--lists", 170 | nargs="+", 171 | type=List[Path_fr], 172 | enable_path=True, 173 | ) 174 | 175 | with subtests.test("paths list from file"): 176 | cfg = parser.parse_args([f"--list={list_file1}"]) 177 | assert all(isinstance(x, Path_fr) for x in cfg.list) 178 | assert ["file1", "file2", "file3", "file4"] == [str(x) for x in cfg.list] 179 | 180 | with subtests.test("paths list from stdin"): 181 | with mock_stdin("file1\nfile2\n"): 182 | with Path_drw("subdir").relative_path_context(): 183 | cfg = 
parser.parse_args(["--list", "-"]) 184 | assert all(isinstance(x, Path_fr) for x in cfg.list) 185 | assert ["file1", "file2"] == [str(x) for x in cfg.list] 186 | 187 | with subtests.test("paths list from stdin path not exist"): 188 | with mock_stdin("file1\nfile2\n"): 189 | with pytest.raises(ArgumentError) as ctx: 190 | parser.parse_args(["--list", "-"]) 191 | ctx.match("File does not exist") 192 | 193 | with subtests.test("paths list nargs='+' single"): 194 | cfg = parser.parse_args(["--lists", str(list_file1)]) 195 | assert 1 == len(cfg.lists) 196 | assert ["file1", "file2", "file3", "file4"] == [str(x) for x in cfg.lists[0]] 197 | assert all(isinstance(x, Path_fr) for x in cfg.lists[0]) 198 | 199 | with subtests.test("paths list nargs='+' multiple"): 200 | cfg = parser.parse_args(["--lists", str(list_file1), str(list_file2)]) 201 | assert 2 == len(cfg.lists) 202 | assert ["file1", "file2", "file3", "file4"] == [str(x) for x in cfg.lists[0]] 203 | assert ["file5"] == [str(x) for x in cfg.lists[1]] 204 | 205 | with subtests.test("paths list nargs='+' empty"): 206 | cfg = parser.parse_args(["--lists", str(list_file3)]) 207 | assert [[]] == cfg.lists 208 | 209 | with subtests.test("paths list nargs='+' path not exist"): 210 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--lists", str(list_file4)])) 211 | 212 | with subtests.test("paths list nargs='+' list not exist"): 213 | pytest.raises(ArgumentError, lambda: parser.parse_args(["--lists", "no-such-file"])) 214 | 215 | 216 | def test_enable_path_list_path_fr_default_stdin(parser, tmp_cwd, mock_stdin, subtests): 217 | (tmp_cwd / "file1").touch() 218 | (tmp_cwd / "file2").touch() 219 | 220 | parser.add_argument( 221 | "--list", 222 | type=List[Path_fr], 223 | enable_path=True, 224 | default="-", 225 | ) 226 | 227 | with subtests.test("without args"): 228 | with mock_stdin("file1\nfile2\n"): 229 | cfg = parser.parse_args([]) 230 | assert all(isinstance(x, Path_fr) for x in cfg.list) 231 | assert ["file1", 
"file2"] == [str(x) for x in cfg.list] 232 | 233 | with subtests.test("stdin arg"): 234 | with mock_stdin("file1\nfile2\n"): 235 | cfg = parser.parse_args(["--list=-"]) 236 | assert all(isinstance(x, Path_fr) for x in cfg.list) 237 | assert ["file1", "file2"] == [str(x) for x in cfg.list] 238 | 239 | 240 | class DataOptionalPath: 241 | def __init__(self, path: Optional[os.PathLike] = None): 242 | pass 243 | 244 | 245 | def test_enable_path_optional_pathlike_subclass_parameter(parser, tmp_cwd): 246 | data_path = Path("data.json") 247 | data_path.write_text('{"a": 1}') 248 | 249 | parser.add_argument("--data", type=DataOptionalPath, enable_path=True) 250 | 251 | cfg = parser.parse_args([f"--data={__name__}.DataOptionalPath", f"--data.path={data_path}"]) 252 | assert cfg.data.class_path == f"{__name__}.DataOptionalPath" 253 | assert cfg.data.init_args == Namespace(path=str(data_path)) 254 | 255 | 256 | class Base: 257 | pass 258 | 259 | 260 | class DataUnionPath: 261 | def __init__(self, path: Union[Base, os.PathLike, str] = ""): 262 | pass 263 | 264 | 265 | def test_sub_configs_union_subclass_and_pathlike(parser, tmp_cwd): 266 | data_path = Path("data.csv") 267 | data_path.write_text("x\ny\n") 268 | config = { 269 | "data": { 270 | "path": "data.csv", 271 | } 272 | } 273 | config_path = Path("config.json") 274 | config_path.write_text(json.dumps(config)) 275 | 276 | parser.add_class_arguments(DataUnionPath, "data", sub_configs=True) 277 | parser.add_argument("--cfg", action="config") 278 | 279 | cfg = parser.parse_args([f"--cfg={config_path}"]) 280 | assert cfg.data.path == str(data_path) 281 | -------------------------------------------------------------------------------- /jsonargparse_tests/test_postponed_annotations.py: -------------------------------------------------------------------------------- 1 | from __future__ import annotations # keep 2 | 3 | import dataclasses 4 | import os 5 | import sys 6 | from typing import TYPE_CHECKING, Dict, List, Optional, 
Tuple, Type, Union 7 | 8 | import pytest 9 | 10 | from jsonargparse import Namespace 11 | from jsonargparse._parameter_resolvers import get_signature_parameters as get_params 12 | from jsonargparse._postponed_annotations import ( 13 | TypeCheckingVisitor, 14 | evaluate_postponed_annotations, 15 | get_types, 16 | ) 17 | from jsonargparse.typing import Path_drw 18 | from jsonargparse_tests.conftest import capture_logs, source_unavailable 19 | 20 | 21 | def function_pep604(p1: str | None, p2: int | float | bool = 1): 22 | return p1 23 | 24 | 25 | def test_get_types_pep604(): 26 | types = get_types(function_pep604) 27 | assert types == {"p1": Union[str, None], "p2": Union[int, float, bool]} 28 | 29 | 30 | class NeedsBackport: 31 | def __init__(self, p1: list | set): 32 | self.p1 = p1 33 | 34 | @staticmethod 35 | def static_method(p1: str | int): 36 | return p1 37 | 38 | @classmethod 39 | def class_method(cls, p1: float | None): 40 | return p1 41 | 42 | 43 | @pytest.mark.parametrize( 44 | ["method", "expected"], 45 | [ 46 | (NeedsBackport.__init__, {"p1": Union[list, set]}), 47 | (NeedsBackport.static_method, {"p1": Union[str, int]}), 48 | (NeedsBackport.class_method, {"p1": Union[float, None]}), 49 | ], 50 | ) 51 | def test_get_types_methods(method, expected): 52 | types = get_types(method) 53 | assert types == expected 54 | 55 | 56 | def function_forward_ref(cls: "NeedsBackport", p1: "int"): 57 | return cls 58 | 59 | 60 | def test_get_types_forward_ref(): 61 | types = get_types(function_forward_ref) 62 | assert types == {"cls": NeedsBackport, "p1": int} 63 | 64 | 65 | def function_undefined_type(p1: not_defined | None, p2: int): # type: ignore # noqa: F821 66 | return p1 67 | 68 | 69 | def test_get_types_undefined_type(): 70 | types = get_types(function_undefined_type) 71 | assert types["p2"] is int 72 | assert isinstance(types["p1"], KeyError) 73 | assert "not_defined" in str(types["p1"]) 74 | 75 | params = get_params(function_undefined_type) 76 | assert 
params[0].annotation == "not_defined | None" 77 | 78 | 79 | def function_all_types_fail(p1: not_defined | None, p2: not_defined): # type: ignore # noqa: F821 80 | return p1 81 | 82 | 83 | def test_get_types_all_types_fail(): 84 | with pytest.raises(NameError) as ctx: 85 | get_types(function_all_types_fail) 86 | ctx.match("not_defined") 87 | 88 | 89 | def test_evaluate_postponed_annotations_all_types_fail(logger): 90 | params = get_params(function_all_types_fail) 91 | with capture_logs(logger) as logs: 92 | evaluate_postponed_annotations(params, function_all_types_fail, None, logger) 93 | assert "Unable to evaluate types for " in logs.getvalue() 94 | 95 | 96 | def function_missing_type(p1, p2: str | int): 97 | return p1 98 | 99 | 100 | def test_get_types_missing_type(): 101 | types = get_types(function_missing_type) 102 | assert types == {"p2": Union[str, int]} 103 | 104 | 105 | type_checking_template = """ 106 | %(typing_import)s 107 | 108 | if %(condition)s: 109 | SUCCESS = True 110 | """ 111 | 112 | 113 | @pytest.mark.parametrize( 114 | ["typing_import", "condition"], 115 | [ 116 | ("from typing import TYPE_CHECKING", "TYPE_CHECKING and COND2 and COND3"), 117 | ("from typing import TYPE_CHECKING", "COND1 or COND2 or TYPE_CHECKING"), 118 | ("from typing import TYPE_CHECKING as TC", "TC"), 119 | ("import typing", "typing.TYPE_CHECKING"), 120 | ("import typing as t", "t.TYPE_CHECKING"), 121 | ], 122 | ) 123 | def test_type_checking_visitor(typing_import, condition): 124 | source = type_checking_template % {"typing_import": typing_import, "condition": condition} 125 | visitor = TypeCheckingVisitor() 126 | aliases = {} 127 | visitor.update_aliases(source, __name__, aliases) 128 | assert aliases.get("SUCCESS") is True 129 | 130 | 131 | type_checking_failure = """ 132 | from typing import TYPE_CHECKING 133 | 134 | if TYPE_CHECKING: 135 | INVALID += 1 136 | """ 137 | 138 | 139 | def test_type_checking_visitor_failure(logger): 140 | visitor = TypeCheckingVisitor() 141 | 
with capture_logs(logger) as logs: 142 | visitor.update_aliases(type_checking_failure, __name__, {}, logger) 143 | assert "Failed to execute 'TYPE_CHECKING' block" in logs.getvalue() 144 | 145 | 146 | if TYPE_CHECKING: 147 | import xml.dom 148 | 149 | class TypeCheckingClass1: 150 | pass 151 | 152 | class TypeCheckingClass2: 153 | pass 154 | 155 | type_checking_alias = Union[int, TypeCheckingClass2, List[str]] 156 | 157 | 158 | def function_type_checking_nested_attr(p1: str, p2: Optional["xml.dom.Node"]): 159 | return p1 160 | 161 | 162 | def test_get_types_type_checking_nested_attr(): 163 | types = get_types(function_type_checking_nested_attr) 164 | from xml.dom import Node 165 | 166 | assert types == {"p1": str, "p2": Optional[Node]} 167 | 168 | 169 | def function_type_checking_union(p1: Union[bool, TypeCheckingClass1, int], p2: Union[float, "TypeCheckingClass2"]): 170 | return p1 171 | 172 | 173 | def test_get_types_type_checking_union(): 174 | types = get_types(function_type_checking_union) 175 | assert list(types) == ["p1", "p2"] 176 | if sys.version_info < (3, 14): 177 | assert str(types["p1"]) == f"typing.Union[bool, {__name__}.TypeCheckingClass1, int]" 178 | assert str(types["p2"]) == f"typing.Union[float, {__name__}.TypeCheckingClass2]" 179 | else: 180 | assert str(types["p1"]) == f"bool | {__name__}.TypeCheckingClass1 | int" 181 | assert str(types["p2"]) == f"float | {__name__}.TypeCheckingClass2" 182 | 183 | 184 | def function_type_checking_alias(p1: type_checking_alias, p2: "type_checking_alias"): 185 | return p1 186 | 187 | 188 | def test_get_types_type_checking_alias(): 189 | types = get_types(function_type_checking_alias) 190 | assert list(types) == ["p1", "p2"] 191 | if sys.version_info < (3, 14): 192 | assert str(types["p1"]) == f"typing.Union[int, {__name__}.TypeCheckingClass2, typing.List[str]]" 193 | assert str(types["p2"]) == f"typing.Union[int, {__name__}.TypeCheckingClass2, typing.List[str]]" 194 | else: 195 | assert str(types["p1"]) == f"int 
| {__name__}.TypeCheckingClass2 | typing.List[str]" 196 | assert str(types["p2"]) == f"int | {__name__}.TypeCheckingClass2 | typing.List[str]" 197 | 198 | 199 | def function_type_checking_optional_alias(p1: type_checking_alias | None, p2: Optional["type_checking_alias"]): 200 | return p1 201 | 202 | 203 | def test_get_types_type_checking_optional_alias(): 204 | types = get_types(function_type_checking_optional_alias) 205 | assert list(types) == ["p1", "p2"] 206 | if sys.version_info < (3, 14): 207 | assert str(types["p1"]) == f"typing.Union[int, {__name__}.TypeCheckingClass2, typing.List[str], NoneType]" 208 | assert str(types["p2"]) == f"typing.Union[int, {__name__}.TypeCheckingClass2, typing.List[str], NoneType]" 209 | else: 210 | assert str(types["p1"]) == f"int | {__name__}.TypeCheckingClass2 | typing.List[str] | None" 211 | assert str(types["p2"]) == f"int | {__name__}.TypeCheckingClass2 | typing.List[str] | None" 212 | 213 | 214 | def function_type_checking_list(p1: List[Union["TypeCheckingClass1", TypeCheckingClass2]]): 215 | return p1 216 | 217 | 218 | def test_get_types_type_checking_list(): 219 | types = get_types(function_type_checking_list) 220 | assert list(types) == ["p1"] 221 | lst = "typing.List" 222 | if sys.version_info < (3, 14): 223 | assert str(types["p1"]) == f"{lst}[typing.Union[{__name__}.TypeCheckingClass1, {__name__}.TypeCheckingClass2]]" 224 | else: 225 | assert str(types["p1"]) == f"{lst}[{__name__}.TypeCheckingClass1 | {__name__}.TypeCheckingClass2]" 226 | 227 | 228 | def function_type_checking_tuple(p1: Tuple[TypeCheckingClass1, "TypeCheckingClass2"]): 229 | return p1 230 | 231 | 232 | def test_get_types_type_checking_tuple(): 233 | types = get_types(function_type_checking_tuple) 234 | assert list(types) == ["p1"] 235 | tpl = "typing.Tuple" 236 | assert str(types["p1"]) == f"{tpl}[{__name__}.TypeCheckingClass1, {__name__}.TypeCheckingClass2]" 237 | 238 | 239 | def function_type_checking_type(p1: Type["TypeCheckingClass2"]): 240 | 
return p1 241 | 242 | 243 | def test_get_types_type_checking_type(): 244 | types = get_types(function_type_checking_type) 245 | assert list(types) == ["p1"] 246 | tpl = "typing.Type" 247 | assert str(types["p1"]) == f"{tpl}[{__name__}.TypeCheckingClass2]" 248 | 249 | 250 | def function_type_checking_dict(p1: Dict[str, Union[TypeCheckingClass1, "TypeCheckingClass2"]]): 251 | return p1 252 | 253 | 254 | def test_get_types_type_checking_dict(): 255 | types = get_types(function_type_checking_dict) 256 | assert list(types) == ["p1"] 257 | dct = "typing.Dict" 258 | if sys.version_info < (3, 14): 259 | assert ( 260 | str(types["p1"]) 261 | == f"{dct}[str, typing.Union[{__name__}.TypeCheckingClass1, {__name__}.TypeCheckingClass2]]" 262 | ) 263 | else: 264 | assert str(types["p1"]) == f"{dct}[str, {__name__}.TypeCheckingClass1 | {__name__}.TypeCheckingClass2]" 265 | 266 | 267 | def function_type_checking_undefined_forward_ref(p1: List["Undefined"], p2: bool): # type: ignore # noqa: F821 268 | return p1 269 | 270 | 271 | def test_get_types_type_checking_undefined_forward_ref(logger): 272 | with capture_logs(logger) as logs: 273 | types = get_types(function_type_checking_undefined_forward_ref, logger) 274 | assert types == {"p1": List["Undefined"], "p2": bool} # noqa: F821 275 | assert "Failed to resolve forward refs in " in logs.getvalue() 276 | assert "NameError: Name 'Undefined' is not defined" in logs.getvalue() 277 | 278 | 279 | @dataclasses.dataclass 280 | class DataclassForwardRef: 281 | p1: "int" 282 | p2: Optional["xml.dom.Node"] = None 283 | 284 | 285 | def test_get_types_type_checking_dataclass_init_forward_ref(): 286 | import xml.dom 287 | 288 | types = get_types(DataclassForwardRef.__init__) 289 | assert types == {"p1": int, "p2": Optional[xml.dom.Node], "return": type(None)} 290 | 291 | 292 | def function_source_unavailable(p1: List["TypeCheckingClass1"]): 293 | return p1 294 | 295 | 296 | def test_get_types_source_unavailable(logger): 297 | with 
source_unavailable(function_source_unavailable), pytest.raises(NameError) as ctx, capture_logs(logger) as logs: 298 | get_types(function_source_unavailable, logger) 299 | ctx.match("'TypeCheckingClass1' is not defined") 300 | assert "source code not available" in logs.getvalue() 301 | 302 | 303 | @dataclasses.dataclass 304 | class Data585: 305 | a: list[int] 306 | b: str = "x" 307 | 308 | 309 | def test_get_types_dataclass_pep585(parser): 310 | types = get_types(Data585) 311 | assert types == {"a": list[int], "b": str} 312 | parser.add_class_arguments(Data585, "data") 313 | cfg = parser.parse_args(["--data.a=[1, 2]"]) 314 | assert cfg.data == Namespace(a=[1, 2], b="x") 315 | 316 | 317 | @dataclasses.dataclass 318 | class DataWithInit585(Data585): 319 | def __init__(self, b: Path_drw, **kwargs): 320 | super().__init__(b=os.fspath(b), **kwargs) 321 | 322 | 323 | def test_add_dataclass_with_init_pep585(parser, tmp_cwd): 324 | parser.add_class_arguments(DataWithInit585, "data") 325 | cfg = parser.parse_args(["--data.a=[1, 2]", "--data.b=."]) 326 | assert cfg.data == Namespace(a=[1, 2], b=Path_drw(".")) 327 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [build-system] 2 | requires = ["setuptools"] 3 | build-backend = "setuptools.build_meta" 4 | 5 | 6 | [project] 7 | name = "jsonargparse" 8 | dynamic = ["version"] 9 | description = "Implement minimal boilerplate CLIs derived from type hints and parse from command line, config files and environment variables." 
10 | authors = [ 11 | {name = "Mauricio Villegas", email = "mauricio@omnius.com"}, 12 | ] 13 | readme = "README.rst" 14 | license = "MIT" 15 | license-files = ["LICENSE.rst"] 16 | requires-python = ">=3.9" 17 | 18 | classifiers = [ 19 | "Development Status :: 5 - Production/Stable", 20 | "Programming Language :: Python", 21 | "Programming Language :: Python :: 3", 22 | "Programming Language :: Python :: 3 :: Only", 23 | "Programming Language :: Python :: 3.9", 24 | "Programming Language :: Python :: 3.10", 25 | "Programming Language :: Python :: 3.11", 26 | "Programming Language :: Python :: 3.12", 27 | "Programming Language :: Python :: 3.13", 28 | "Programming Language :: Python :: 3.14", 29 | "Intended Audience :: Developers", 30 | "Operating System :: POSIX :: Linux", 31 | "Operating System :: MacOS", 32 | "Operating System :: Microsoft :: Windows", 33 | ] 34 | 35 | dependencies = [ 36 | "PyYAML>=3.13", 37 | ] 38 | 39 | [project.optional-dependencies] 40 | all = [ 41 | "jsonargparse[signatures]", 42 | "jsonargparse[jsonschema]", 43 | "jsonargparse[jsonnet]", 44 | "jsonargparse[toml]", 45 | "jsonargparse[urls]", 46 | "jsonargparse[fsspec]", 47 | "jsonargparse[ruamel]", 48 | "jsonargparse[omegaconf]", 49 | "jsonargparse[typing-extensions]", 50 | "jsonargparse[reconplogger]", 51 | ] 52 | signatures = [ 53 | "jsonargparse[typing-extensions]", 54 | "docstring-parser>=0.17", 55 | "typeshed-client>=2.8.2", 56 | ] 57 | jsonschema = [ 58 | "jsonschema>=3.2.0", 59 | ] 60 | jsonnet = [ 61 | "jsonnet>=0.21.0", 62 | ] 63 | toml = [ 64 | "toml>=0.10.2", 65 | ] 66 | urls = [ 67 | "requests>=2.18.4", 68 | ] 69 | fsspec = [ 70 | "fsspec>=0.8.4", 71 | ] 72 | shtab = [ 73 | "shtab>=1.7.1", 74 | ] 75 | argcomplete = [ 76 | "argcomplete>=3.5.1", 77 | ] 78 | ruamel = [ 79 | "ruamel.yaml>=0.18.15", 80 | ] 81 | ruyaml = [ 82 | "jsonargparse[ruamel]", 83 | ] 84 | omegaconf = [ 85 | "omegaconf>=2.1.1", 86 | ] 87 | typing-extensions = [ 88 | "typing-extensions>=3.10.0.0; python_version < 
'3.10'", 89 | ] 90 | reconplogger = [ 91 | "reconplogger>=4.4.0", 92 | ] 93 | test = [ 94 | "jsonargparse[test-no-urls]", 95 | "jsonargparse[shtab]", 96 | "jsonargparse[argcomplete]", 97 | "types-PyYAML>=6.0.11", 98 | "types-requests>=2.28.9", 99 | "responses>=0.12.0", 100 | "pydantic>=2.3.0; python_version < '3.14'", # restricted until pydantic installs correctly in 3.14 101 | "attrs>=22.2.0", 102 | ] 103 | test-no-urls = [ 104 | "pytest>=6.2.5", 105 | "pytest-subtests>=0.8.0", 106 | ] 107 | coverage = [ 108 | "jsonargparse[test-no-urls]", 109 | "pytest-cov>=4.0.0", 110 | ] 111 | dev = [ 112 | "jsonargparse[test]", 113 | "jsonargparse[coverage]", 114 | "jsonargparse[doc]", 115 | "pre-commit>=2.19.0", 116 | "tox>=3.25.0", 117 | "build>=0.10.0", 118 | ] 119 | doc = [ 120 | "Sphinx>=1.7.9", 121 | "sphinx-rtd-theme>=1.2.2", 122 | "autodocsumm>=0.1.10", 123 | "sphinx-autodoc-typehints>=1.19.5", 124 | ] 125 | maintainer = [ 126 | "bump2version>=0.5.11", 127 | "twine>=4.0.2", 128 | ] 129 | 130 | [project.urls] 131 | Documentation-stable = "https://jsonargparse.readthedocs.io/en/stable/" 132 | Documentation-latest = "https://jsonargparse.readthedocs.io/en/latest/" 133 | Changes = "https://jsonargparse.readthedocs.io/en/stable/changelog.html" 134 | GitHub = "https://github.com/omni-us/jsonargparse" 135 | PyPI = "https://pypi.org/project/jsonargparse" 136 | SonarCloud = "https://sonarcloud.io/dashboard?id=omni-us_jsonargparse" 137 | Codecov = "https://codecov.io/gh/omni-us/jsonargparse" 138 | 139 | 140 | [tool.setuptools] 141 | platforms = ["Any"] 142 | packages = ["jsonargparse", "jsonargparse_tests"] 143 | 144 | [tool.setuptools.dynamic] 145 | version = {attr = "jsonargparse.__version__"} 146 | 147 | [tool.setuptools.package-data] 148 | jsonargparse = ["py.typed"] 149 | 150 | 151 | [tool.pytest.ini_options] 152 | addopts = "-s" 153 | testpaths = ["jsonargparse_tests"] 154 | 155 | 156 | [tool.coverage.run] 157 | relative_files = true 158 | source = ["jsonargparse"] 159 | 
160 | 161 | [tool.mypy] 162 | allow_redefinition = true 163 | warn_unused_ignores = true 164 | disable_error_code = "annotation-unchecked" 165 | 166 | 167 | [tool.ruff] 168 | line-length = 120 169 | 170 | [tool.ruff.lint] 171 | select = [ 172 | "E", "W", # https://pypi.org/project/pycodestyle 173 | "F", # https://pypi.org/project/pyflakes 174 | "I", # https://pypi.org/project/isort 175 | ] 176 | ignore = [ 177 | "E731", # Do not convert lambda assigns to a def 178 | "E721", # Allow comparing types with type() 179 | ] 180 | 181 | [tool.ruff.lint.pydocstyle] 182 | convention = "google" 183 | 184 | 185 | [tool.black] 186 | line-length = 120 187 | 188 | 189 | [tool.typos.default.extend-identifiers] 190 | Villegas = "Villegas" 191 | 192 | 193 | [tool.tox] 194 | legacy_tox_ini = """ 195 | [tox] 196 | envlist = py{39,310,311,312,313,314}-{all,no}-extras,omegaconf,pydantic-v1,without-pyyaml,without-future-annotations 197 | skip_missing_interpreters = true 198 | 199 | [testenv] 200 | extras = 201 | all-extras: test,coverage,all 202 | no-extras: coverage 203 | changedir = jsonargparse_tests 204 | commands = python -m pytest {posargs} 205 | usedevelop = true 206 | 207 | [testenv:omegaconf] 208 | extras = test,coverage,all 209 | setenv = 210 | JSONARGPARSE_OMEGACONF_FULL_TEST = true 211 | 212 | [testenv:pydantic-v1] 213 | extras = coverage 214 | commands = 215 | # Test with pydantic<2 216 | python -c "\ 217 | from pathlib import Path; \ 218 | from shutil import copy; \ 219 | copy('conftest.py', Path(r'{envtmpdir}', 'conftest.py')); \ 220 | copy('test_pydantic.py', Path(r'{envtmpdir}', 'test_pydantic.py'))\ 221 | " 222 | pip install "pydantic<2" 223 | python -m pytest {posargs} {envtmpdir}/test_pydantic.py 224 | 225 | # Test with pydantic>=2 importing from pydantic.v1 226 | python -c "\ 227 | import re, pathlib; \ 228 | path = pathlib.Path(r'{envtmpdir}', 'test_pydantic.py'); \ 229 | content = path.read_text(); \ 230 | content = re.sub(r'import pydantic', 'import pydantic.v1 as 
pydantic', content); \ 231 | content = re.sub(r'^annotated = [^\\n]*', 'annotated = False', content, flags=re.MULTILINE); \ 232 | path.write_text(content)\ 233 | " 234 | pip install "pydantic>=2" 235 | python -m pytest {posargs} {envtmpdir}/test_pydantic.py 236 | 237 | [testenv:without-pyyaml] 238 | extras = test,coverage,all 239 | commands = 240 | pip uninstall -y argcomplete omegaconf pyyaml reconplogger responses ruamel.yaml ruamel.yaml.clib types-PyYAML 241 | python -m pytest {posargs} 242 | 243 | [testenv:without-future-annotations] 244 | extras = test,coverage,all 245 | allowlist_externals = sh 246 | commands = 247 | sh -c "\ 248 | rm -rf /tmp/_without_future_annotations; \ 249 | mkdir /tmp/_without_future_annotations; \ 250 | cp *.py /tmp/_without_future_annotations; \ 251 | sed -i -e '/^from __future__ import annotations$/d' /tmp/_without_future_annotations/*.py; \ 252 | " 253 | python -m pytest /tmp/_without_future_annotations {posargs} 254 | commands_post = 255 | sh -c "rm -rf /tmp/_without_future_annotations" 256 | """ 257 | -------------------------------------------------------------------------------- /sphinx/changelog.rst: -------------------------------------------------------------------------------- 1 | :orphan: 2 | 3 | .. _changelog: 4 | 5 | .. include:: ../CHANGELOG.rst 6 | -------------------------------------------------------------------------------- /sphinx/conf.py: -------------------------------------------------------------------------------- 1 | # Configuration file for the Sphinx documentation builder. 
2 | # 3 | # For the full list of built-in configuration values, see the documentation: 4 | # https://www.sphinx-doc.org/en/master/usage/configuration.html 5 | 6 | import os 7 | import sys 8 | 9 | os.environ["SPHINX_BUILD"] = "" 10 | 11 | 12 | # -- Project information ----------------------------------------------------- 13 | # https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information 14 | 15 | project = "jsonargparse" 16 | copyright = "2019-present, Mauricio Villegas" 17 | author = "Mauricio Villegas" 18 | 19 | 20 | # -- General configuration --------------------------------------------------- 21 | # https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration 22 | 23 | extensions = [ 24 | "sphinx.ext.autodoc", 25 | "sphinx.ext.doctest", 26 | "sphinx.ext.intersphinx", 27 | "sphinx.ext.napoleon", 28 | "autodocsumm", 29 | "sphinx_autodoc_typehints", 30 | ] 31 | 32 | templates_path = [] 33 | exclude_patterns = ["_build"] 34 | 35 | 36 | # -- Options for HTML output ------------------------------------------------- 37 | # https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output 38 | 39 | html_theme = "sphinx_rtd_theme" 40 | html_static_path = [] 41 | 42 | 43 | # -- autodoc 44 | 45 | sys.path.insert(0, os.path.abspath("../")) 46 | 47 | autodoc_default_options = { 48 | "members": True, 49 | "exclude-members": "groups", 50 | "member-order": "bysource", 51 | "show-inheritance": True, 52 | "autosummary": True, 53 | "autosummary-imported-members": False, 54 | "special-members": "__init__,__call__", 55 | } 56 | 57 | 58 | # -- doctest 59 | 60 | import doctest # noqa: E402 61 | 62 | IGNORE_RESULT = doctest.register_optionflag("IGNORE_RESULT") 63 | 64 | OutputChecker = doctest.OutputChecker 65 | 66 | 67 | class CustomOutputChecker(OutputChecker): 68 | def check_output(self, want, got, optionflags): 69 | if IGNORE_RESULT & optionflags: 70 | return True 71 | return OutputChecker.check_output(self, want, 
got, optionflags) 72 | 73 | 74 | doctest.OutputChecker = CustomOutputChecker 75 | 76 | doctest_global_setup = """ 77 | import os 78 | import pathlib 79 | import shutil 80 | import sys 81 | import tempfile 82 | from calendar import Calendar 83 | from dataclasses import dataclass 84 | from io import StringIO 85 | from typing import Callable, Iterable, List, Protocol 86 | import jsonargparse_tests 87 | from jsonargparse import * 88 | from jsonargparse.typing import * 89 | from jsonargparse._util import unresolvable_import_paths 90 | 91 | def doctest_mock_class_in_main(cls): 92 | cls.__module__ = None 93 | setattr(sys.modules["__main__"], cls.__name__, cls) 94 | unresolvable_import_paths[cls] = f"__main__.{cls.__name__}" 95 | """ 96 | 97 | 98 | # -- intersphinx 99 | 100 | intersphinx_mapping = {"python": ("https://docs.python.org/3", None)} 101 | -------------------------------------------------------------------------------- /sphinx/index.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../README.rst 2 | 3 | .. include:: ../DOCUMENTATION.rst 4 | 5 | .. include:: ../CONTRIBUTING.rst 6 | 7 | .. _api-ref: 8 | 9 | API Reference 10 | ============= 11 | 12 | Even though jsonargparse has several internal modules, users are expected to 13 | only import from ``jsonargparse`` or ``jsonargparse.typing``. This allows doing 14 | internal refactoring without affecting dependents. Only objects explicitly 15 | exposed in ``jsonargparse.__init__.__all__`` and in 16 | ``jsonargparse.typing.__all__`` are included in this API reference, and these 17 | are what can be considered public. 18 | 19 | 20 | jsonargparse 21 | ------------ 22 | .. automodule:: jsonargparse 23 | 24 | jsonargparse.typing 25 | ------------------- 26 | ..
automodule:: jsonargparse.typing 27 | :exclude-members: get_import_path, import_object, Path 28 | 29 | 30 | Index 31 | ===== 32 | 33 | * :ref:`changelog` 34 | * :ref:`license` 35 | * :ref:`genindex` 36 | -------------------------------------------------------------------------------- /sphinx/license.rst: -------------------------------------------------------------------------------- 1 | :orphan: 2 | 3 | .. _license: 4 | 5 | License 6 | ======= 7 | 8 | .. include:: ../LICENSE.rst 9 | --------------------------------------------------------------------------------
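The custom doctest machinery in ``sphinx/conf.py`` above (the ``IGNORE_RESULT`` option flag and ``CustomOutputChecker``) can be exercised on its own. A minimal standalone sketch of the same mechanism, using only the standard-library ``doctest`` module:

```python
import doctest

# Register a custom doctest option flag, as sphinx/conf.py does.
# register_optionflag is idempotent: re-registering an existing name
# returns the same flag bit.
IGNORE_RESULT = doctest.register_optionflag("IGNORE_RESULT")


class CustomOutputChecker(doctest.OutputChecker):
    def check_output(self, want, got, optionflags):
        # When an example is marked with IGNORE_RESULT, accept any output.
        if IGNORE_RESULT & optionflags:
            return True
        return super().check_output(want, got, optionflags)


checker = CustomOutputChecker()
# Without the flag, mismatching output is rejected as usual.
print(checker.check_output("1\n", "2\n", 0))              # False
# With the flag set, the comparison always passes.
print(checker.check_output("1\n", "2\n", IGNORE_RESULT))  # True
```

In the Sphinx configuration this checker is installed globally via ``doctest.OutputChecker = CustomOutputChecker``, so doctest examples in the documentation can opt out of output comparison with a ``# doctest: +IGNORE_RESULT`` directive.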