[(scope)]: Subject
64 |
65 | [Body]
66 | ```
67 |
68 | **Subject and body must be valid Markdown.** The subject must use proper casing (capitalize the first letter when it makes sense), with no dot at the end, and no punctuation in general.
69 |
70 | Scope and body are optional. Type can be:
71 |
72 | - `build`: About packaging, building wheels, etc.
73 | - `chore`: About packaging or repo/files management.
74 | - `ci`: About Continuous Integration.
75 | - `deps`: Dependency updates.
76 | - `docs`: About documentation.
77 | - `feat`: New feature.
78 | - `fix`: Bug fix.
79 | - `perf`: About performance.
80 | - `refactor`: Changes that are not features or bug fixes.
81 | - `style`: A change in code style/format.
82 | - `tests`: About tests.
83 |
84 | If you write a body, please add trailers at the end (for example issue and PR references, or co-authors), without relying on GitHub Flavored Markdown:
85 |
86 | ```
87 | Body.
88 |
89 | Issue #10: https://github.com/namespace/project/issues/10
90 | Related to PR namespace/other-project#15: https://github.com/namespace/other-project/pull/15
91 | ```
92 |
93 | These "trailers" must appear at the end of the body, without any blank lines between them. The trailer title can contain any character except colons `:`. We expect a full URI for each trailer, not just GitHub autolinks (for example, full GitHub URLs for commits and issues, not the hash or the #issue-number).
94 |
95 | We do not enforce a line length on the commit message's summary and body, but please avoid very long summaries, and very long lines in the body, unless they are part of code blocks that must not be wrapped.
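Putting these rules together, a complete commit message (with a hypothetical scope and subject, reusing the example trailer URL above) could look like:

```
fix(schema): Handle aliased fields in schema output

Aliased fields were previously rendered under their Python name
instead of their alias.

Issue #10: https://github.com/namespace/project/issues/10
```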
96 |
97 | ## Pull request guidelines
98 |
99 | Link to any related issue in the Pull Request message.
100 |
101 | During the review, we recommend using fixups:
102 |
103 | ```bash
104 | # SHA is the SHA of the commit you want to fix
105 | git commit --fixup=SHA
106 | ```
107 |
108 | Once all the changes are approved, you can squash your commits:
109 |
110 | ```bash
111 | git rebase -i --autosquash main
112 | ```
113 |
114 | And force-push:
115 |
116 | ```bash
117 | git push -f
118 | ```
119 |
120 | If this seems all too complicated, you can push or force-push each new commit, and we will squash them ourselves if needed, before merging.
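To see how the pieces fit together: `git commit --fixup=SHA` creates a commit whose subject is `fixup!` followed by the subject of the targeted commit, and `git rebase -i --autosquash main` then automatically reorders that commit next to its target and marks it to be squashed into it. For example (hashes and subjects are purely illustrative):

```
$ git log --oneline
f3a91c2 fixup! feat: Add awesome feature
8d27b10 docs: Mention awesome feature
41c0e7d feat: Add awesome feature
```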
121 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | ISC License
2 |
3 | Copyright (c) 2023, Timothée Mazzucotelli
4 |
5 | Permission to use, copy, modify, and/or distribute this software for any
6 | purpose with or without fee is hereby granted, provided that the above
7 | copyright notice and this permission notice appear in all copies.
8 |
9 | THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
10 | WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
11 | MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
12 | ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
13 | WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
14 | ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
15 | OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
16 |
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | # If you have `direnv` loaded in your shell, and allow it in the repository,
2 | # the `make` command will point at the `scripts/make` shell script.
3 | # This Makefile is just here to allow auto-completion in the terminal.
4 |
5 | actions = \
6 | allrun \
7 | changelog \
8 | check \
9 | check-api \
10 | check-docs \
11 | check-quality \
12 | check-types \
13 | clean \
14 | coverage \
15 | docs \
16 | docs-deploy \
17 | format \
18 | help \
19 | multirun \
20 | release \
21 | run \
22 | setup \
23 | test \
24 | vscode
25 |
26 | .PHONY: $(actions)
27 | $(actions):
28 | @python scripts/make "$@"
29 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # griffe-pydantic
2 |
3 | [](https://github.com/mkdocstrings/griffe-pydantic/actions?query=workflow%3Aci)
4 | [](https://mkdocstrings.github.io/griffe-pydantic/)
5 | [](https://pypi.org/project/griffe-pydantic/)
6 | [](https://app.gitter.im/#/room/#griffe-pydantic:gitter.im)
7 |
8 | [Griffe](https://mkdocstrings.github.io/griffe/) extension for [Pydantic](https://github.com/pydantic/pydantic).
9 |
10 | ## Installation
11 |
12 | ```bash
13 | pip install griffe-pydantic
14 | ```
15 |
16 | ## Usage
17 |
18 | ### Command-line
19 |
20 | ```bash
21 | griffe dump mypackage -e griffe_pydantic
22 | ```
23 |
24 | See [command-line usage in Griffe's documentation](https://mkdocstrings.github.io/griffe/extensions/#on-the-command-line).
25 |
26 | ### Python
27 |
28 | ```python
29 | import griffe
30 |
31 | griffe.load(
32 | "mypackage",
33 | extensions=griffe.load_extensions(
34 | [{"griffe_pydantic": {"schema": True}}]
35 | )
36 | )
37 | ```
38 |
39 | See [programmatic usage in Griffe's documentation](https://mkdocstrings.github.io/griffe/extensions/#programmatically).
40 |
41 | ### MkDocs
42 |
43 | ```yaml title="mkdocs.yml"
44 | plugins:
45 | - mkdocstrings:
46 | handlers:
47 | python:
48 | options:
49 | extensions:
50 | - griffe_pydantic:
51 | schema: true
52 | ```
53 |
54 |
55 | See [MkDocs usage in Griffe's documentation](https://mkdocstrings.github.io/griffe/extensions/#in-mkdocs).
56 |
--------------------------------------------------------------------------------
/config/coverage.ini:
--------------------------------------------------------------------------------
1 | [coverage:run]
2 | branch = true
3 | parallel = true
4 | source =
5 | src/
6 | tests/
7 |
8 | [coverage:paths]
9 | equivalent =
10 | src/
11 | .venv/lib/*/site-packages/
12 | .venvs/*/lib/*/site-packages/
13 |
14 | [coverage:report]
15 | precision = 2
16 | omit =
17 | src/*/__init__.py
18 | src/*/__main__.py
19 | tests/__init__.py
20 | exclude_lines =
21 | pragma: no cover
22 | if TYPE_CHECKING
23 |
24 | [coverage:json]
25 | output = htmlcov/coverage.json
26 |
--------------------------------------------------------------------------------
/config/git-changelog.toml:
--------------------------------------------------------------------------------
1 | bump = "auto"
2 | convention = "angular"
3 | in-place = true
4 | output = "CHANGELOG.md"
5 | parse-refs = false
6 | parse-trailers = true
7 | sections = ["build", "deps", "feat", "fix", "refactor"]
8 | template = "keepachangelog"
9 | versioning = "pep440"
10 |
--------------------------------------------------------------------------------
/config/mypy.ini:
--------------------------------------------------------------------------------
1 | [mypy]
2 | ignore_missing_imports = true
3 | exclude = tests/fixtures/
4 | warn_unused_ignores = true
5 | show_error_codes = true
6 |
--------------------------------------------------------------------------------
/config/pytest.ini:
--------------------------------------------------------------------------------
1 | [pytest]
2 | python_files =
3 | test_*.py
4 | addopts =
5 | --cov
6 | --cov-config config/coverage.ini
7 | testpaths =
8 | tests
9 |
10 | # action:message_regex:warning_class:module_regex:line
11 | filterwarnings =
12 | error
13 | # TODO: remove once pytest-xdist 4 is released
14 | ignore:.*rsyncdir:DeprecationWarning:xdist
15 |
--------------------------------------------------------------------------------
/config/ruff.toml:
--------------------------------------------------------------------------------
1 | target-version = "py39"
2 | line-length = 120
3 |
4 | [lint]
5 | exclude = [
6 | "tests/fixtures/*.py",
7 | ]
8 | select = [
9 | "A", "ANN", "ARG",
10 | "B", "BLE",
11 | "C", "C4",
12 | "COM",
13 | "D", "DTZ",
14 | "E", "ERA", "EXE",
15 | "F", "FBT",
16 | "G",
17 | "I", "ICN", "INP", "ISC",
18 | "N",
19 | "PGH", "PIE", "PL", "PLC", "PLE", "PLR", "PLW", "PT", "PYI",
20 | "Q",
21 | "RUF", "RSE", "RET",
22 | "S", "SIM", "SLF",
23 | "T", "T10", "T20", "TCH", "TID", "TRY",
24 | "UP",
25 | "W",
26 | "YTT",
27 | ]
28 | ignore = [
29 | "A001", # Variable is shadowing a Python builtin
30 | "ANN101", # Missing type annotation for self
31 | "ANN102", # Missing type annotation for cls
32 | "ANN204", # Missing return type annotation for special method __str__
33 | "ANN401", # Dynamically typed expressions (typing.Any) are disallowed
34 | "ARG005", # Unused lambda argument
35 | "C901", # Too complex
36 | "D105", # Missing docstring in magic method
37 | "D417", # Missing argument description in the docstring
38 | "E501", # Line too long
39 | "ERA001", # Commented out code
40 | "G004", # Logging statement uses f-string
41 | "PLR0911", # Too many return statements
42 | "PLR0912", # Too many branches
43 | "PLR0913", # Too many arguments to function call
44 | "PLR0915", # Too many statements
45 | "SLF001", # Private member accessed
46 | "TRY003", # Avoid specifying long messages outside the exception class
47 | ]
48 |
49 | [lint.per-file-ignores]
50 | "src/**/cli.py" = [
51 | "T201", # Print statement
52 | ]
53 | "src/*/debug.py" = [
54 | "T201", # Print statement
55 | ]
56 | "!src/*/*.py" = [
57 | "D100", # Missing docstring in public module
58 | ]
59 | "!src/**.py" = [
60 | "D101", # Missing docstring in public class
61 | "D103", # Missing docstring in public function
62 | ]
63 | "scripts/*.py" = [
64 | "INP001", # File is part of an implicit namespace package
65 | "T201", # Print statement
66 | ]
67 | "tests/**.py" = [
68 | "ARG005", # Unused lambda argument
69 | "FBT001", # Boolean positional arg in function definition
70 | "PLR2004", # Magic value used in comparison
71 | "S101", # Use of assert detected
72 | ]
73 |
74 | [lint.flake8-quotes]
75 | docstring-quotes = "double"
76 |
77 | [lint.flake8-tidy-imports]
78 | ban-relative-imports = "all"
79 |
80 | [lint.isort]
81 | known-first-party = ["griffe_pydantic"]
82 |
83 | [lint.pydocstyle]
84 | convention = "google"
85 |
86 | [format]
87 | exclude = [
88 | "tests/fixtures/*.py",
89 | ]
90 | docstring-code-format = true
91 | docstring-code-line-length = 80
92 |
--------------------------------------------------------------------------------
/config/vscode/launch.json:
--------------------------------------------------------------------------------
1 | {
2 | "version": "0.2.0",
3 | "configurations": [
4 | {
5 | "name": "python (current file)",
6 | "type": "debugpy",
7 | "request": "launch",
8 | "program": "${file}",
9 | "console": "integratedTerminal",
10 | "justMyCode": false,
11 | "args": "${command:pickArgs}"
12 | },
13 | {
14 | "name": "run",
15 | "type": "debugpy",
16 | "request": "launch",
17 | "module": "griffe_pydantic",
18 | "console": "integratedTerminal",
19 | "justMyCode": false,
20 | "args": "${command:pickArgs}"
21 | },
22 | {
23 | "name": "docs",
24 | "type": "debugpy",
25 | "request": "launch",
26 | "module": "mkdocs",
27 | "justMyCode": false,
28 | "args": [
29 | "serve",
30 | "-v"
31 | ]
32 | },
33 | {
34 | "name": "test",
35 | "type": "debugpy",
36 | "request": "launch",
37 | "module": "pytest",
38 | "justMyCode": false,
39 | "args": [
40 | "-c=config/pytest.ini",
41 | "-vvv",
42 | "--no-cov",
43 | "--dist=no",
44 | "tests",
45 | "-k=${input:tests_selection}"
46 | ]
47 | }
48 | ],
49 | "inputs": [
50 | {
51 | "id": "tests_selection",
52 | "type": "promptString",
53 | "description": "Tests selection",
54 | "default": ""
55 | }
56 | ]
57 | }
--------------------------------------------------------------------------------
/config/vscode/settings.json:
--------------------------------------------------------------------------------
1 | {
2 | "files.watcherExclude": {
3 | "**/.venv*/**": true,
4 | "**/.venvs*/**": true,
5 | "**/venv*/**": true
6 | },
7 | "mypy-type-checker.args": [
8 | "--config-file=config/mypy.ini"
9 | ],
10 | "python.testing.unittestEnabled": false,
11 | "python.testing.pytestEnabled": true,
12 | "python.testing.pytestArgs": [
13 | "--config-file=config/pytest.ini"
14 | ],
15 | "ruff.enable": true,
16 | "ruff.format.args": [
17 | "--config=config/ruff.toml"
18 | ],
19 | "ruff.lint.args": [
20 | "--config=config/ruff.toml"
21 | ],
22 | "yaml.schemas": {
23 | "https://squidfunk.github.io/mkdocs-material/schema.json": "mkdocs.yml"
24 | },
25 | "yaml.customTags": [
26 | "!ENV scalar",
27 | "!ENV sequence",
28 | "!relative scalar",
29 | "tag:yaml.org,2002:python/name:materialx.emoji.to_svg",
30 | "tag:yaml.org,2002:python/name:materialx.emoji.twemoji",
31 | "tag:yaml.org,2002:python/name:pymdownx.superfences.fence_code_format"
32 | ]
33 | }
--------------------------------------------------------------------------------
/config/vscode/tasks.json:
--------------------------------------------------------------------------------
1 | {
2 | "version": "2.0.0",
3 | "tasks": [
4 | {
5 | "label": "changelog",
6 | "type": "process",
7 | "command": "scripts/make",
8 | "args": ["changelog"]
9 | },
10 | {
11 | "label": "check",
12 | "type": "process",
13 | "command": "scripts/make",
14 | "args": ["check"]
15 | },
16 | {
17 | "label": "check-quality",
18 | "type": "process",
19 | "command": "scripts/make",
20 | "args": ["check-quality"]
21 | },
22 | {
23 | "label": "check-types",
24 | "type": "process",
25 | "command": "scripts/make",
26 | "args": ["check-types"]
27 | },
28 | {
29 | "label": "check-docs",
30 | "type": "process",
31 | "command": "scripts/make",
32 | "args": ["check-docs"]
33 | },
34 | {
35 | "label": "check-api",
36 | "type": "process",
37 | "command": "scripts/make",
38 | "args": ["check-api"]
39 | },
40 | {
41 | "label": "clean",
42 | "type": "process",
43 | "command": "scripts/make",
44 | "args": ["clean"]
45 | },
46 | {
47 | "label": "docs",
48 | "type": "process",
49 | "command": "scripts/make",
50 | "args": ["docs"]
51 | },
52 | {
53 | "label": "docs-deploy",
54 | "type": "process",
55 | "command": "scripts/make",
56 | "args": ["docs-deploy"]
57 | },
58 | {
59 | "label": "format",
60 | "type": "process",
61 | "command": "scripts/make",
62 | "args": ["format"]
63 | },
64 | {
65 | "label": "release",
66 | "type": "process",
67 | "command": "scripts/make",
68 | "args": ["release", "${input:version}"]
69 | },
70 | {
71 | "label": "setup",
72 | "type": "process",
73 | "command": "scripts/make",
74 | "args": ["setup"]
75 | },
76 | {
77 | "label": "test",
78 | "type": "process",
79 | "command": "scripts/make",
80 | "args": ["test", "coverage"],
81 | "group": "test"
82 | },
83 | {
84 | "label": "vscode",
85 | "type": "process",
86 | "command": "scripts/make",
87 | "args": ["vscode"]
88 | }
89 | ],
90 | "inputs": [
91 | {
92 | "id": "version",
93 | "type": "promptString",
94 | "description": "Version"
95 | }
96 | ]
97 | }
--------------------------------------------------------------------------------
/docs/.overrides/main.html:
--------------------------------------------------------------------------------
1 | {% extends "base.html" %}
2 |
3 | {% block announce %}
4 |
5 | Fund this project through
6 | sponsorship
7 |
8 | {% include ".icons/octicons/heart-fill-16.svg" %}
9 | —
10 |
11 | Follow
12 | @pawamoy on
13 |
14 |
15 | {% include ".icons/fontawesome/brands/mastodon.svg" %}
16 |
17 | Fosstodon
18 |
19 | for updates
20 | {% endblock %}
21 |
--------------------------------------------------------------------------------
/docs/.overrides/partials/comments.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
21 |
22 |
23 |
57 |
--------------------------------------------------------------------------------
/docs/.overrides/partials/path-item.html:
--------------------------------------------------------------------------------
1 | {# Fix breadcrumbs for when mkdocs-section-index is used. #}
2 | {# See https://github.com/squidfunk/mkdocs-material/issues/7614. #}
3 |
4 |
5 | {% macro render_content(nav_item) %}
6 |
7 | {{ nav_item.title }}
8 |
9 | {% endmacro %}
10 |
11 |
12 | {% macro render(nav_item, ref=nav_item) %}
13 | {% if nav_item.is_page %}
14 |
15 |
16 | {{ render_content(ref) }}
17 |
18 |
19 | {% elif nav_item.children %}
20 | {{ render(nav_item.children | first, ref) }}
21 | {% endif %}
22 | {% endmacro %}
23 |
--------------------------------------------------------------------------------
/docs/changelog.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Changelog
3 | ---
4 |
5 | --8<-- "CHANGELOG.md"
6 |
--------------------------------------------------------------------------------
/docs/code_of_conduct.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Code of Conduct
3 | ---
4 |
5 | --8<-- "CODE_OF_CONDUCT.md"
6 |
--------------------------------------------------------------------------------
/docs/contributing.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Contributing
3 | ---
4 |
5 | --8<-- "CONTRIBUTING.md"
6 |
--------------------------------------------------------------------------------
/docs/credits.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Credits
3 | hide:
4 | - toc
5 | ---
6 |
7 | ```python exec="yes"
8 | --8<-- "scripts/gen_credits.py"
9 | ```
10 |
--------------------------------------------------------------------------------
/docs/css/insiders.css:
--------------------------------------------------------------------------------
1 | @keyframes heart {
2 |
3 | 0%,
4 | 40%,
5 | 80%,
6 | 100% {
7 | transform: scale(1);
8 | }
9 |
10 | 20%,
11 | 60% {
12 | transform: scale(1.15);
13 | }
14 | }
15 |
16 | @keyframes vibrate {
17 | 0%, 2%, 4%, 6%, 8%, 10%, 12%, 14%, 16%, 18% {
18 | -webkit-transform: translate3d(-2px, 0, 0);
19 | transform: translate3d(-2px, 0, 0);
20 | }
21 | 1%, 3%, 5%, 7%, 9%, 11%, 13%, 15%, 17%, 19% {
22 | -webkit-transform: translate3d(2px, 0, 0);
23 | transform: translate3d(2px, 0, 0);
24 | }
25 | 20%, 100% {
26 | -webkit-transform: translate3d(0, 0, 0);
27 | transform: translate3d(0, 0, 0);
28 | }
29 | }
30 |
31 | .heart {
32 | color: #e91e63;
33 | }
34 |
35 | .pulse {
36 | animation: heart 1000ms infinite;
37 | }
38 |
39 | .vibrate {
40 | animation: vibrate 2000ms infinite;
41 | }
42 |
43 | .new-feature svg {
44 | fill: var(--md-accent-fg-color) !important;
45 | }
46 |
47 | a.insiders {
48 | color: #e91e63;
49 | }
50 |
51 | .sponsorship-list {
52 | width: 100%;
53 | }
54 |
55 | .sponsorship-item {
56 | border-radius: 100%;
57 | display: inline-block;
58 | height: 1.6rem;
59 | margin: 0.1rem;
60 | overflow: hidden;
61 | width: 1.6rem;
62 | }
63 |
64 | .sponsorship-item:focus, .sponsorship-item:hover {
65 | transform: scale(1.1);
66 | }
67 |
68 | .sponsorship-item img {
69 | filter: grayscale(100%) opacity(75%);
70 | height: auto;
71 | width: 100%;
72 | }
73 |
74 | .sponsorship-item:focus img, .sponsorship-item:hover img {
75 | filter: grayscale(0);
76 | }
77 |
78 | .sponsorship-item.private {
79 | background: var(--md-default-fg-color--lightest);
80 | color: var(--md-default-fg-color);
81 | font-size: .6rem;
82 | font-weight: 700;
83 | line-height: 1.6rem;
84 | text-align: center;
85 | }
86 |
87 | .mastodon {
88 | color: #897ff8;
89 | border-radius: 100%;
90 | box-shadow: inset 0 0 0 .05rem currentcolor;
91 | display: inline-block;
92 | height: 1.2rem !important;
93 | padding: .25rem;
94 | transition: all .25s;
95 | vertical-align: bottom !important;
96 | width: 1.2rem;
97 | }
98 |
99 | .premium-sponsors {
100 | text-align: center;
101 | }
102 |
103 | #silver-sponsors img {
104 | height: 140px;
105 | }
106 |
107 | #bronze-sponsors img {
108 | height: 140px;
109 | }
110 |
111 | #bronze-sponsors p {
112 | display: flex;
113 | flex-wrap: wrap;
114 | justify-content: center;
115 | }
116 |
117 | #bronze-sponsors a {
118 | display: block;
119 | flex-shrink: 0;
120 | }
121 |
122 | .sponsors-total {
123 | font-weight: bold;
124 | }
--------------------------------------------------------------------------------
/docs/css/material.css:
--------------------------------------------------------------------------------
1 | /* More space at the bottom of the page. */
2 | .md-main__inner {
3 | margin-bottom: 1.5rem;
4 | }
5 |
--------------------------------------------------------------------------------
/docs/css/mkdocstrings.css:
--------------------------------------------------------------------------------
1 | /* Indentation. */
2 | div.doc-contents:not(.first) {
3 | padding-left: 25px;
4 | border-left: .05rem solid var(--md-typeset-table-color);
5 | }
6 |
7 | /* Mark external links as such. */
8 | a.external::after,
9 | a.autorefs-external::after {
10 | /* https://primer.style/octicons/arrow-up-right-24 */
11 | mask-image: url('data:image/svg+xml, ');
12 | -webkit-mask-image: url('data:image/svg+xml, ');
13 | content: ' ';
14 |
15 | display: inline-block;
16 | vertical-align: middle;
17 | position: relative;
18 |
19 | height: 1em;
20 | width: 1em;
21 | background-color: currentColor;
22 | }
23 |
24 | a.external:hover::after,
25 | a.autorefs-external:hover::after {
26 | background-color: var(--md-accent-fg-color);
27 | }
--------------------------------------------------------------------------------
/docs/examples/model_ext.py:
--------------------------------------------------------------------------------
1 | from typing import Any
2 | from pydantic import field_validator, model_validator, ConfigDict, BaseModel, Field
3 |
4 |
5 | class ExampleModel(BaseModel):
6 | """An example model."""
7 |
8 | model_config = ConfigDict(frozen=False)
9 |
10 | field_without_default: str
11 | """Shows the *[Required]* marker in the signature."""
12 |
13 | field_plain_with_validator: int = 100
14 | """Show standard field with type annotation."""
15 |
16 | field_with_validator_and_alias: str = Field("FooBar", alias="BarFoo", validation_alias="BarFoo")
17 | """Shows corresponding validator with link/anchor."""
18 |
19 | field_with_constraints_and_description: int = Field(
20 | default=5, ge=0, le=100, description="Shows constraints within doc string."
21 | )
22 |
23 | @field_validator("field_with_validator_and_alias", "field_without_default", mode="before")
24 | @classmethod
25 | def check_max_length_ten(cls, v: str) -> str:
26 | """Show corresponding field with link/anchor."""
27 | if len(v) > 10:
28 | raise ValueError("No more than 10 characters allowed")
29 | return v
30 |
31 | @model_validator(mode="before")
32 | @classmethod
33 | def lowercase_only(cls, data: dict[str, Any]) -> dict[str, Any]:
34 | """Ensure that the field without a default is lowercase."""
35 | if isinstance(data.get("field_without_default"), str):
36 | data["field_without_default"] = data["field_without_default"].lower()
37 | return data
38 |
--------------------------------------------------------------------------------
/docs/examples/model_noext.py:
--------------------------------------------------------------------------------
1 | model_ext.py
--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Overview
3 | hide:
4 | - feedback
5 | ---
6 |
7 | --8<-- "README.md"
8 |
9 |
18 |
19 | ## Examples
20 |
21 | /// tab | Pydantic model
22 |
23 | ```python exec="1" result="python"
24 | print('--8<-- "docs/examples/model_ext.py"')
25 | ```
26 |
27 | ///
28 |
29 | /// tab | Without extension
30 |
31 | ::: model_noext.ExampleModel
32 | options:
33 | heading_level: 3
34 |
35 | ///
36 |
37 |
38 | /// tab | With extension
39 |
40 | ::: model_ext.ExampleModel
41 | options:
42 | heading_level: 3
43 | extensions:
44 | - griffe_pydantic
45 |
46 | ///
47 |
48 |
--------------------------------------------------------------------------------
/docs/insiders/changelog.md:
--------------------------------------------------------------------------------
1 | # Changelog
2 |
3 | ## griffe-pydantic Insiders
4 |
5 | ### 1.0.1 May 27, 2024 { id="1.0.1" }
6 |
7 | - Depend on Griffe 0.38 minimum
8 | - Detect inheritance when `BaseModel` is imported from `pydantic.main`
9 | - Don't crash on keyword arguments in `@field_validator` decorators
10 |
11 | ### 1.0.0 March 20, 2023 { id="1.0.0" }
12 |
13 | - Support Pydantic v2
14 | - Support both static and dynamic analysis
15 | - Detect when classes inherit from Pydantic models
16 |
17 | ### 1.0.0a0 July 13, 2023 { id="1.0.0a0" }
18 |
19 | - Release first Insiders version (alpha)
20 |
--------------------------------------------------------------------------------
/docs/insiders/goals.yml:
--------------------------------------------------------------------------------
1 | goals:
2 | 500:
3 | name: PlasmaVac User Guide
4 | features: []
5 | 1000:
6 | name: GraviFridge Fluid Renewal
7 | features:
8 | - name: "[Project] Griffe extension for Pydantic"
9 | ref: /
10 | since: 2023/07/13
11 | 1500:
12 | name: HyperLamp Navigation Tips
13 | features: []
14 | 2000:
15 | name: FusionDrive Ejection Configuration
16 | features: []
17 |
--------------------------------------------------------------------------------
/docs/insiders/index.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Insiders
3 | ---
4 |
5 | # Insiders
6 |
7 | *griffe-pydantic* follows the **sponsorware** release strategy, which means that new features are first exclusively released to sponsors as part of [Insiders][]. Read on to learn [what sponsorships achieve][sponsorship], [how to become a sponsor][sponsors] to get access to Insiders, and [what's in it for you][features]!
8 |
9 | ## What is Insiders?
10 |
11 | *griffe-pydantic Insiders* is a private fork of *griffe-pydantic*, hosted as a private GitHub repository. Almost[^1] [all new features][features] are developed as part of this fork, which means that they are immediately available to all eligible sponsors, as they are granted access to this private repository.
12 |
13 | [^1]: In general, every new feature is first exclusively released to sponsors, but sometimes upstream dependencies enhance existing features that must be supported by *griffe-pydantic*.
14 |
15 | Every feature is tied to a [funding goal][funding] in monthly subscriptions. When a funding goal is hit, the features that are tied to it are merged back into *griffe-pydantic* and released for general availability, making them available to all users. Bugfixes are always released in tandem.
16 |
17 | Sponsorships start as low as [**$10 a month**][sponsors].[^2]
18 |
19 | [^2]: Note that $10 a month is the minimum amount to become eligible for Insiders. While GitHub Sponsors also allows to sponsor lower amounts or one-time amounts, those can't be granted access to Insiders due to technical reasons. Such contributions are still very much welcome as they help ensure the project's sustainability.
20 |
21 | ## What sponsorships achieve
22 |
23 | Sponsorships make this project sustainable, as they buy the maintainers of this project time – a very scarce resource – which is spent on the development of new features, bug fixing, stability improvement, issue triage and general support. The biggest bottleneck in Open Source is time.[^3]
24 |
25 | [^3]: Making an Open Source project sustainable is exceptionally hard: maintainers burn out, projects are abandoned. That's not great and very unpredictable. The sponsorware model ensures that if you decide to use *griffe-pydantic*, you can be sure that bugs are fixed quickly and new features are added regularly.
26 |
27 | If you're unsure if you should sponsor this project, check out the list of [completed funding goals][goals completed] to learn whether you're already using features that were developed with the help of sponsorships. You're most likely using at least a handful of them, [thanks to our awesome sponsors][sponsors]!
28 |
29 | ## What's in it for me?
30 |
31 | ```python exec="1" session="insiders"
32 | data_source = "docs/insiders/goals.yml"
33 | ```
34 |
35 |
36 | ```python exec="1" session="insiders" idprefix=""
37 | --8<-- "scripts/insiders.py"
38 |
39 | if unreleased_features:
40 | print(
41 | "The moment you [become a sponsor](#how-to-become-a-sponsor), you'll get **immediate "
42 | f"access to {len(unreleased_features)} additional features** that you can start using right away, and "
43 | "which are currently exclusively available to sponsors:\n"
44 | )
45 |
46 | for feature in unreleased_features:
47 | feature.render(badge=True)
48 |
49 | print(
50 | "\n\nThese are just the features related to this project. "
51 | "[See the complete feature list on the author's main Insiders page](https://pawamoy.github.io/insiders/#whats-in-it-for-me)."
52 | )
53 | else:
54 | print(
55 | "The moment you [become a sponsor](#how-to-become-a-sponsor), you'll get immediate "
56 | "access to all released features that you can start using right away, and "
57 | "which are exclusively available to sponsors. At this moment, there are no "
58 | "Insiders features for this project, but checkout the [next funding goals](#goals) "
59 | "to see what's coming, as well as **[the feature list for all Insiders projects](https://pawamoy.github.io/insiders/#whats-in-it-for-me).**"
60 | )
61 | ```
62 |
63 |
64 | Additionally, your sponsorship will give more weight to your upvotes on issues, helping us prioritize work items in our backlog. For more information on how we prioritize work, see this page: [Backlog management][backlog].
65 |
66 | ## How to become a sponsor
67 |
68 | Thanks for your interest in sponsoring! In order to become an eligible sponsor with your GitHub account, visit [pawamoy's sponsor profile][github sponsor profile], and complete a sponsorship of **$10 a month or more**. You can use your individual or organization GitHub account for sponsoring.
69 |
70 | Sponsorships lower than $10 a month are also very much appreciated, and useful. They won't grant you access to Insiders, but they will be counted towards reaching sponsorship goals. Every sponsorship helps us implement new features and release them to the public.
71 |
72 | **Important:** By default, when you're sponsoring **[@pawamoy][github sponsor profile]** through a GitHub organization, all the publicly visible members of the organization will be invited to join our private repositories. If you wish to only grant access to a subset of users, please send a short email to insiders@pawamoy.fr with the name of your organization and the GitHub accounts of the users that should be granted access.
73 |
74 | **Tip:** To ensure that access is not tied to a particular individual GitHub account, you can create a bot account (i.e. a GitHub account that is not tied to a specific individual), and use this account for the sponsoring. After being granted access to our private repositories, the bot account can create private forks of our private repositories into your own organization, which all members of your organization will have access to.
75 |
76 | You can cancel your sponsorship anytime.[^5]
77 |
78 | [^5]: If you cancel your sponsorship, GitHub schedules a cancellation request which will become effective at the end of the billing cycle. This means that even though you cancel your sponsorship, you will keep your access to Insiders as long as your cancellation isn't effective. All charges are processed by GitHub through Stripe. As we don't receive any information regarding your payment, and GitHub doesn't offer refunds, sponsorships are non-refundable.
79 |
80 |
81 | [:octicons-heart-fill-24:{ .pulse } Join our awesome sponsors][github sponsor profile]{ .md-button .md-button--primary }
82 |
83 |
84 |
89 |
90 |
91 |
92 |
93 |
94 | If you sponsor publicly, you're automatically added here with a link to your profile and avatar to show your support for *griffe-pydantic*. Alternatively, if you wish to keep your sponsorship private, you'll be a silent +1. You can select visibility during checkout and change it afterwards.
95 |
96 |
97 | ## Funding
98 |
99 | ### Goals
100 |
101 | The following section lists all funding goals. Each goal contains a list of features prefixed with a checkmark symbol, denoting whether a feature is :octicons-check-circle-fill-24:{ style="color: #00e676" } already available or :octicons-check-circle-fill-24:{ style="color: var(--md-default-fg-color--lightest)" } planned, but not yet implemented. When the funding goal is hit, the features are released for general availability.
102 |
103 | ```python exec="1" session="insiders" idprefix=""
104 | for goal in goals.values():
105 | if not goal.complete:
106 | goal.render()
107 | ```
108 |
109 | ### Goals completed
110 |
111 | This section lists all funding goals that were previously completed, which means that those features were part of Insiders, but are now generally available and can be used by all users.
112 |
113 | ```python exec="1" session="insiders" idprefix=""
114 | for goal in goals.values():
115 | if goal.complete:
116 | goal.render()
117 | ```
118 |
119 | ## Frequently asked questions
120 |
121 | ### Compatibility
122 |
123 | > We're building an open source project and want to allow outside collaborators to use *griffe-pydantic* locally without having access to Insiders. Is this still possible?
124 |
125 | Yes. Insiders is compatible with *griffe-pydantic*. Almost all new features and configuration options are either backward-compatible or implemented behind feature flags. Most Insiders features enhance the overall experience: while they add value for the users of your project, they aren't necessary for outside collaborators making changes to the content.
126 |
127 | ### Payment
128 |
129 | > We don't want to pay for sponsorship every month. Are there any other options?
130 |
131 | Yes. You can sponsor on a yearly basis by [switching your GitHub account to a yearly billing cycle][billing cycle]. If for some reason you cannot do that, you could also create a dedicated GitHub account with a yearly billing cycle, which you only use for sponsoring (some sponsors already do that).
132 |
133 | If you have any problems or further questions, please reach out to insiders@pawamoy.fr.
134 |
135 | ### Terms
136 |
137 | > Are we allowed to use Insiders under the same terms and conditions as *griffe-pydantic*?
138 |
139 | Yes. Whether you're an individual or a company, you may use *griffe-pydantic Insiders* precisely under the same terms as *griffe-pydantic*, which are given by the [ISC license][license]. However, we kindly ask you to respect our **fair use policy**:
140 |
141 | - Please **don't distribute the source code** of Insiders. You may freely use it for public, private or commercial projects, privately fork or mirror it, but please don't make the source code public, as it would counteract the sponsorware strategy.
142 | - If you cancel your subscription, your access to the private repository is revoked, and you will miss out on all future updates of Insiders. However, you may **use the latest version** that's available to you **as long as you like**. Just remember that [GitHub deletes private forks][private forks].
143 |
144 | [backlog]: https://pawamoy.github.io/backlog/
145 | [insiders]: #what-is-insiders
146 | [sponsorship]: #what-sponsorships-achieve
147 | [sponsors]: #how-to-become-a-sponsor
148 | [features]: #whats-in-it-for-me
149 | [funding]: #funding
150 | [goals completed]: #goals-completed
151 | [github sponsor profile]: https://github.com/sponsors/pawamoy
152 | [billing cycle]: https://docs.github.com/en/github/setting-up-and-managing-billing-and-payments-on-github/changing-the-duration-of-your-billing-cycle
153 | [license]: ../license.md
154 | [private forks]: https://docs.github.com/en/github/setting-up-and-managing-your-github-user-account/removing-a-collaborator-from-a-personal-repository
155 |
156 |
157 |
158 |
--------------------------------------------------------------------------------
/docs/insiders/installation.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: Getting started with Insiders
3 | ---
4 |
5 | # Getting started with Insiders
6 |
7 | *griffe-pydantic Insiders* is a compatible drop-in replacement for *griffe-pydantic*, and can be installed similarly using `pip` or `git`. Note that in order to access the Insiders repository, you need to [become an eligible sponsor][] of @pawamoy on GitHub.
8 |
9 | ## Installation
10 |
11 | ### with the `insiders` tool
12 |
13 | [`insiders`][insiders-tool] is a tool that helps you keep up-to-date versions of Insiders projects in the PyPI index of your choice (self-hosted, Google registry, Artifactory, etc.).
14 |
15 | **We kindly ask that you do not upload the distributions to public registries, as it is against our [Terms of use][].**
16 |
17 | ### with pip (ssh/https)
18 |
19 | *griffe-pydantic Insiders* can be installed with `pip` [using SSH][install-pip-ssh]:
20 |
21 | ```bash
22 | pip install git+ssh://git@github.com/pawamoy-insiders/griffe-pydantic.git
23 | ```
24 |
25 | Or using HTTPS:
26 |
27 | ```bash
28 | pip install git+https://${GH_TOKEN}@github.com/pawamoy-insiders/griffe-pydantic.git
29 | ```
30 |
31 | >? NOTE: **How to get a GitHub personal access token?** The `GH_TOKEN` environment variable must hold a [personal access token][github-pat] for your GitHub account. It gives you programmatic access to the Insiders repository, from the command line or from GitHub Actions workflows:
32 | >
33 | > 1. Go to https://github.com/settings/tokens
34 | > 2. Click on [Generate a new token][github-pat-new]
35 | > 3. Enter a name and select the [`repo`][scopes] scope
36 | > 4. Generate the token and store it in a safe place
37 | >
> Note that the personal access token must be kept secret at all times, as it allows its holder to access your private repositories.
39 |
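For instance, in a GitHub Actions workflow the token can be supplied through a repository secret. This is only a sketch: the secret name `GH_TOKEN` and the step layout are assumptions, not a prescribed setup.

```yaml
# Hypothetical workflow fragment installing the Insiders version.
steps:
  - name: Install griffe-pydantic Insiders
    env:
      GH_TOKEN: ${{ secrets.GH_TOKEN }}  # personal access token stored as a repository secret
    run: pip install "git+https://${GH_TOKEN}@github.com/pawamoy-insiders/griffe-pydantic.git"
```
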
40 | ### with Git
41 |
42 | Of course, you can also use *griffe-pydantic Insiders* directly through Git:
43 |
44 | ```bash
45 | git clone git@github.com:pawamoy-insiders/griffe-pydantic
46 | ```
47 |
48 | When cloning with Git, the package must be installed:
49 |
50 | ```bash
51 | pip install -e griffe-pydantic
52 | ```
53 |
54 | ## Upgrading
55 |
56 | When upgrading Insiders, you should always check the version of *griffe-pydantic* which makes up the first part of the version qualifier. For example, a version like `8.x.x.4.x.x` means that Insiders `4.x.x` is currently based on `8.x.x`.
57 |
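As a sketch, splitting such a combined version string into its two parts could look like this (illustrative helper, not part of the package):

```python
def split_insiders_version(version: str) -> tuple[str, str]:
    """Split a combined version like '8.1.0.4.2.1' into (upstream, insiders) parts."""
    parts = version.split(".")
    # First three components: upstream griffe-pydantic; the rest: the Insiders version.
    return ".".join(parts[:3]), ".".join(parts[3:])

print(split_insiders_version("8.1.0.4.2.1"))  # ('8.1.0', '4.2.1')
```

Comparing the first part against the *griffe-pydantic* version you currently use tells you whether the upstream base changed.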
58 | If the major version increased, it's a good idea to consult the [changelog][] and go through the steps to ensure your configuration is up to date and all necessary changes have been made.
59 |
60 | [become an eligible sponsor]: ./index.md#how-to-become-a-sponsor
61 | [changelog]: ./changelog.md
62 | [github-pat]: https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token
63 | [github-pat-new]: https://github.com/settings/tokens/new
64 | [insiders-tool]: https://pawamoy.github.io/insiders-project/
65 | [install-pip-ssh]: https://docs.github.com/en/authentication/connecting-to-github-with-ssh
66 | [scopes]: https://docs.github.com/en/developers/apps/scopes-for-oauth-apps#available-scopes
67 | [terms of use]: ./index.md#terms
68 |
--------------------------------------------------------------------------------
/docs/js/feedback.js:
--------------------------------------------------------------------------------
1 | const feedback = document.forms.feedback;
2 | feedback.hidden = false;
3 |
4 | feedback.addEventListener("submit", function(ev) {
5 | ev.preventDefault();
6 | const commentElement = document.getElementById("feedback");
7 | commentElement.style.display = "block";
8 | feedback.firstElementChild.disabled = true;
9 | const data = ev.submitter.getAttribute("data-md-value");
10 | const note = feedback.querySelector(".md-feedback__note [data-md-value='" + data + "']");
11 | if (note) {
12 | note.hidden = false;
13 | }
14 | })
15 |
--------------------------------------------------------------------------------
/docs/js/insiders.js:
--------------------------------------------------------------------------------
1 | function humanReadableAmount(amount) {
2 |   // Insert a thousands separator every three digits (e.g. 1234567 -> "1,234,567").
3 |   return String(amount).replace(/\B(?=(\d{3})+(?!\d))/g, ",");
4 | }
8 |
9 | function getJSON(url, callback) {
10 | var xhr = new XMLHttpRequest();
11 | xhr.open('GET', url, true);
12 | xhr.responseType = 'json';
13 | xhr.onload = function () {
14 | var status = xhr.status;
15 | if (status === 200) {
16 | callback(null, xhr.response);
17 | } else {
18 | callback(status, xhr.response);
19 | }
20 | };
21 | xhr.send();
22 | }
23 |
24 | function updatePremiumSponsors(dataURL, rank) {
25 | let capRank = rank.charAt(0).toUpperCase() + rank.slice(1);
26 | getJSON(dataURL + `/sponsors${capRank}.json`, function (err, sponsors) {
27 | const sponsorsDiv = document.getElementById(`${rank}-sponsors`);
28 | if (sponsors.length > 0) {
29 | let html = '';
30 |       // NOTE: reconstructed markup; the original HTML template was stripped from this copy,
31 |       // and the sponsor field names (url, image, name) are assumptions.
32 |       html += `<p>${capRank} sponsors:</p><div class="premium-sponsors">`;
33 |       sponsors.forEach(function (sponsor) {
34 |         html += `<a href="${sponsor.url}" title="${sponsor.name}"><img src="${sponsor.image}" alt="${sponsor.name}"></a>`;
35 |       });
36 |       html += '</div>';
39 | sponsorsDiv.innerHTML = html;
40 | }
41 | });
42 | }
43 |
44 | function updateInsidersPage(author_username) {
45 | const sponsorURL = `https://github.com/sponsors/${author_username}`
46 | const dataURL = `https://raw.githubusercontent.com/${author_username}/sponsors/main`;
47 | getJSON(dataURL + '/numbers.json', function (err, numbers) {
48 | document.getElementById('sponsors-count').innerHTML = numbers.count;
49 | Array.from(document.getElementsByClassName('sponsors-total')).forEach(function (element) {
50 | element.innerHTML = '$ ' + humanReadableAmount(numbers.total);
51 | });
52 | getJSON(dataURL + '/sponsors.json', function (err, sponsors) {
53 | const sponsorsElem = document.getElementById('sponsors');
54 | const privateSponsors = numbers.count - sponsors.length;
55 | sponsors.forEach(function (sponsor) {
56 |       // NOTE: reconstructed markup; the original HTML template was stripped from this copy,
57 |       // and the sponsor field names (url, image, name) are assumptions.
58 |       sponsorsElem.innerHTML += `<a href="${sponsor.url}" title="${sponsor.name}"><img src="${sponsor.image}" alt="${sponsor.name}"></a>`;
59 |     });
60 |     if (privateSponsors > 0) {
61 |       // Private sponsors appear as an anonymous "+N" link to the sponsoring page.
62 |       sponsorsElem.innerHTML += `<a href="${sponsorURL}">+${privateSponsors}</a>`;
63 |     }
69 | });
70 | });
71 | updatePremiumSponsors(dataURL, "gold");
72 | updatePremiumSponsors(dataURL, "silver");
73 | updatePremiumSponsors(dataURL, "bronze");
74 | }
75 |
--------------------------------------------------------------------------------
/docs/license.md:
--------------------------------------------------------------------------------
1 | ---
2 | title: License
3 | hide:
4 | - feedback
5 | ---
6 |
7 | # License
8 |
9 | ```
10 | --8<-- "LICENSE"
11 | ```
12 |
--------------------------------------------------------------------------------
/docs/reference/griffe_pydantic.md:
--------------------------------------------------------------------------------
1 | # ::: griffe_pydantic
2 | options:
3 | show_submodules: true
4 |
--------------------------------------------------------------------------------
/duties.py:
--------------------------------------------------------------------------------
1 | """Development tasks."""
2 |
3 | from __future__ import annotations
4 |
5 | import os
6 | import re
7 | import sys
8 | from contextlib import contextmanager
9 | from functools import wraps
10 | from importlib.metadata import version as pkgversion
11 | from pathlib import Path
12 | from typing import TYPE_CHECKING, Any, Callable
13 |
14 | from duty import duty, tools
15 |
16 | if TYPE_CHECKING:
17 | from collections.abc import Iterator
18 |
19 | from duty.context import Context
20 |
21 |
22 | PY_SRC_PATHS = (Path(_) for _ in ("src", "tests", "duties.py", "scripts"))
23 | PY_SRC_LIST = tuple(str(_) for _ in PY_SRC_PATHS)
24 | PY_SRC = " ".join(PY_SRC_LIST)
25 | CI = os.environ.get("CI", "0") in {"1", "true", "yes", ""}
26 | WINDOWS = os.name == "nt"
27 | PTY = not WINDOWS and not CI
28 | MULTIRUN = os.environ.get("MULTIRUN", "0") == "1"
29 |
30 |
31 | def pyprefix(title: str) -> str:
32 | if MULTIRUN:
33 | prefix = f"(python{sys.version_info.major}.{sys.version_info.minor})"
34 | return f"{prefix:14}{title}"
35 | return title
36 |
37 |
38 | def not_from_insiders(func: Callable) -> Callable:
39 | @wraps(func)
40 | def wrapper(ctx: Context, *args: Any, **kwargs: Any) -> None:
41 | origin = ctx.run("git config --get remote.origin.url", silent=True)
42 |         if "pawamoy-insiders/griffe-pydantic" in origin:
43 | ctx.run(
44 | lambda: False,
45 | title="Not running this task from insiders repository (do that from public repo instead!)",
46 | )
47 | return
48 | func(ctx, *args, **kwargs)
49 |
50 | return wrapper
51 |
52 |
53 | @contextmanager
54 | def material_insiders() -> Iterator[bool]:
55 | if "+insiders" in pkgversion("mkdocs-material"):
56 | os.environ["MATERIAL_INSIDERS"] = "true"
57 | try:
58 | yield True
59 | finally:
60 | os.environ.pop("MATERIAL_INSIDERS")
61 | else:
62 | yield False
63 |
64 |
65 | def _get_changelog_version() -> str:
66 | changelog_version_re = re.compile(r"^## \[(\d+\.\d+\.\d+)\].*$")
67 | with Path(__file__).parent.joinpath("CHANGELOG.md").open("r", encoding="utf8") as file:
68 | return next(filter(bool, map(changelog_version_re.match, file))).group(1) # type: ignore[union-attr]
69 |
70 |
71 | @duty
72 | def changelog(ctx: Context, bump: str = "") -> None:
73 | """Update the changelog in-place with latest commits.
74 |
75 | Parameters:
76 | bump: Bump option passed to git-changelog.
77 | """
78 | ctx.run(tools.git_changelog(bump=bump or None), title="Updating changelog")
79 | ctx.run(tools.yore.check(bump=bump or _get_changelog_version()), title="Checking legacy code")
80 |
81 |
82 | @duty(pre=["check-quality", "check-types", "check-docs", "check-api"])
83 | def check(ctx: Context) -> None:
84 | """Check it all!"""
85 |
86 |
87 | @duty
88 | def check_quality(ctx: Context) -> None:
89 | """Check the code quality."""
90 | ctx.run(
91 | tools.ruff.check(*PY_SRC_LIST, config="config/ruff.toml"),
92 | title=pyprefix("Checking code quality"),
93 | )
94 |
95 |
96 | @duty
97 | def check_docs(ctx: Context) -> None:
98 | """Check if the documentation builds correctly."""
99 | Path("htmlcov").mkdir(parents=True, exist_ok=True)
100 | Path("htmlcov/index.html").touch(exist_ok=True)
101 | with material_insiders():
102 | ctx.run(
103 | tools.mkdocs.build(strict=True, verbose=True),
104 | title=pyprefix("Building documentation"),
105 | )
106 |
107 |
108 | @duty
109 | def check_types(ctx: Context) -> None:
110 | """Check that the code is correctly typed."""
111 | os.environ["FORCE_COLOR"] = "1"
112 | ctx.run(
113 | tools.mypy(*PY_SRC_LIST, config_file="config/mypy.ini"),
114 | title=pyprefix("Type-checking"),
115 | )
116 |
117 |
118 | @duty
119 | def check_api(ctx: Context, *cli_args: str) -> None:
120 | """Check for API breaking changes."""
121 | ctx.run(
122 | tools.griffe.check("griffe_pydantic", search=["src"], color=True).add_args(*cli_args),
123 | title="Checking for API breaking changes",
124 | nofail=True,
125 | )
126 |
127 |
128 | @duty
129 | def docs(ctx: Context, *cli_args: str, host: str = "127.0.0.1", port: int = 8000) -> None:
130 | """Serve the documentation (localhost:8000).
131 |
132 | Parameters:
133 | host: The host to serve the docs from.
134 | port: The port to serve the docs on.
135 | """
136 | with material_insiders():
137 | ctx.run(
138 | tools.mkdocs.serve(dev_addr=f"{host}:{port}").add_args(*cli_args),
139 | title="Serving documentation",
140 | capture=False,
141 | )
142 |
143 |
144 | @duty
145 | def docs_deploy(ctx: Context, *, force: bool = False) -> None:
146 | """Deploy the documentation to GitHub pages.
147 |
148 | Parameters:
149 | force: Whether to force deployment, even from non-Insiders version.
150 | """
151 | os.environ["DEPLOY"] = "true"
152 | with material_insiders() as insiders:
153 | if not insiders:
154 | ctx.run(lambda: False, title="Not deploying docs without Material for MkDocs Insiders!")
155 | origin = ctx.run("git config --get remote.origin.url", silent=True, allow_overrides=False)
156 | if "pawamoy-insiders/griffe-pydantic" in origin:
157 | ctx.run(
158 | "git remote add upstream git@github.com:mkdocstrings/griffe-pydantic",
159 | silent=True,
160 | nofail=True,
161 | allow_overrides=False,
162 | )
163 | ctx.run(
164 | tools.mkdocs.gh_deploy(remote_name="upstream", force=True),
165 | title="Deploying documentation",
166 | )
167 | elif force:
168 | ctx.run(
169 | tools.mkdocs.gh_deploy(force=True),
170 | title="Deploying documentation",
171 | )
172 | else:
173 | ctx.run(
174 | lambda: False,
175 | title="Not deploying docs from public repository (do that from insiders instead!)",
176 | nofail=True,
177 | )
178 |
179 |
180 | @duty
181 | def format(ctx: Context) -> None:
182 | """Run formatting tools on the code."""
183 | ctx.run(
184 | tools.ruff.check(*PY_SRC_LIST, config="config/ruff.toml", fix_only=True, exit_zero=True),
185 | title="Auto-fixing code",
186 | )
187 | ctx.run(tools.ruff.format(*PY_SRC_LIST, config="config/ruff.toml"), title="Formatting code")
188 |
189 |
190 | @duty
191 | def build(ctx: Context) -> None:
192 | """Build source and wheel distributions."""
193 | ctx.run(
194 | tools.build(),
195 | title="Building source and wheel distributions",
196 | pty=PTY,
197 | )
198 |
199 |
200 | @duty
201 | @not_from_insiders
202 | def publish(ctx: Context) -> None:
203 | """Publish source and wheel distributions to PyPI."""
204 | if not Path("dist").exists():
205 | ctx.run("false", title="No distribution files found")
206 | dists = [str(dist) for dist in Path("dist").iterdir()]
207 | ctx.run(
208 | tools.twine.upload(*dists, skip_existing=True),
209 | title="Publishing source and wheel distributions to PyPI",
210 | pty=PTY,
211 | )
212 |
213 |
214 | @duty(post=["build", "publish", "docs-deploy"])
215 | @not_from_insiders
216 | def release(ctx: Context, version: str = "") -> None:
217 | """Release a new Python package.
218 |
219 | Parameters:
220 | version: The new version number to use.
221 | """
222 | if not (version := (version or input("> Version to release: ")).strip()):
223 | ctx.run("false", title="A version must be provided")
224 | ctx.run("git add pyproject.toml CHANGELOG.md", title="Staging files", pty=PTY)
225 | ctx.run(["git", "commit", "-m", f"chore: Prepare release {version}"], title="Committing changes", pty=PTY)
226 | ctx.run(f"git tag {version}", title="Tagging commit", pty=PTY)
227 | ctx.run("git push", title="Pushing commits", pty=False)
228 | ctx.run("git push --tags", title="Pushing tags", pty=False)
229 |
230 |
231 | @duty(silent=True, aliases=["cov"])
232 | def coverage(ctx: Context) -> None:
233 | """Report coverage as text and HTML."""
234 | ctx.run(tools.coverage.combine(), nofail=True)
235 | ctx.run(tools.coverage.report(rcfile="config/coverage.ini"), capture=False)
236 | ctx.run(tools.coverage.html(rcfile="config/coverage.ini"))
237 |
238 |
239 | @duty
240 | def test(ctx: Context, *cli_args: str, match: str = "") -> None:
241 | """Run the test suite.
242 |
243 | Parameters:
244 | match: A pytest expression to filter selected tests.
245 | """
246 | py_version = f"{sys.version_info.major}{sys.version_info.minor}"
247 | os.environ["COVERAGE_FILE"] = f".coverage.{py_version}"
248 | ctx.run(
249 | tools.pytest(
250 | "tests",
251 | config_file="config/pytest.ini",
252 | select=match,
253 | color="yes",
254 | ).add_args("-n", "auto", *cli_args),
255 | title=pyprefix("Running tests"),
256 | )
257 |
--------------------------------------------------------------------------------
/mkdocs.yml:
--------------------------------------------------------------------------------
1 | site_name: "griffe-pydantic"
2 | site_description: "Griffe extension for Pydantic."
3 | site_url: "https://mkdocstrings.github.io/griffe-pydantic"
4 | repo_url: "https://github.com/mkdocstrings/griffe-pydantic"
5 | repo_name: "mkdocstrings/griffe-pydantic"
6 | site_dir: "site"
7 | watch: [mkdocs.yml, README.md, CONTRIBUTING.md, CHANGELOG.md, src/griffe_pydantic]
8 | copyright: Copyright © 2023 Timothée Mazzucotelli
9 | edit_uri: edit/main/docs/
10 |
11 | validation:
12 | omitted_files: warn
13 | absolute_links: warn
14 | unrecognized_links: warn
15 |
16 | nav:
17 | - Home:
18 | - Overview: index.md
19 | - Changelog: changelog.md
20 | - Credits: credits.md
21 | - License: license.md
22 | - API reference: reference/griffe_pydantic.md
23 | - Development:
24 | - Contributing: contributing.md
25 | - Code of Conduct: code_of_conduct.md
26 | - Coverage report: coverage.md
27 | - Insiders:
28 | - insiders/index.md
29 | - Getting started:
30 | - Installation: insiders/installation.md
31 | - Changelog: insiders/changelog.md
32 | - Author's website: https://pawamoy.github.io/
33 |
34 | theme:
35 | name: material
36 | custom_dir: docs/.overrides
37 | icon:
38 | logo: material/currency-sign
39 | features:
40 | - announce.dismiss
41 | - content.action.edit
42 | - content.action.view
43 | - content.code.annotate
44 | - content.code.copy
45 | - content.tooltips
46 | - navigation.footer
47 | - navigation.instant.preview
48 | - navigation.path
49 | - navigation.sections
50 | - navigation.tabs
51 | - navigation.tabs.sticky
52 | - navigation.top
53 | - search.highlight
54 | - search.suggest
55 | - toc.follow
56 | palette:
57 | - media: "(prefers-color-scheme)"
58 | toggle:
59 | icon: material/brightness-auto
60 | name: Switch to light mode
61 | - media: "(prefers-color-scheme: light)"
62 | scheme: default
63 | primary: teal
64 | accent: purple
65 | toggle:
66 | icon: material/weather-sunny
67 | name: Switch to dark mode
68 | - media: "(prefers-color-scheme: dark)"
69 | scheme: slate
70 | primary: black
71 | accent: lime
72 | toggle:
73 | icon: material/weather-night
74 | name: Switch to system preference
75 |
76 | extra_css:
77 | - css/material.css
78 | - css/mkdocstrings.css
79 | - css/insiders.css
80 |
81 | extra_javascript:
82 | - js/feedback.js
83 |
84 | markdown_extensions:
85 | - attr_list
86 | - admonition
87 | - callouts
88 | - footnotes
89 | - pymdownx.blocks.tab:
90 | alternate_style: true
91 | slugify: !!python/object/apply:pymdownx.slugs.slugify
92 | kwds:
93 | case: lower
94 | - pymdownx.emoji:
95 | emoji_index: !!python/name:material.extensions.emoji.twemoji
96 | emoji_generator: !!python/name:material.extensions.emoji.to_svg
97 | - pymdownx.magiclink
98 | - pymdownx.snippets:
99 | base_path: [!relative $config_dir]
100 | check_paths: true
101 | - pymdownx.superfences
102 | - pymdownx.tasklist:
103 | custom_checkbox: true
104 | - toc:
105 | permalink: "¤"
106 |
107 | plugins:
108 | - search
109 | - markdown-exec
110 | - section-index
111 | - coverage
112 | - mkdocstrings:
113 | handlers:
114 | python:
115 | paths: [src, docs/examples]
116 | inventories:
117 | - https://docs.python.org/3/objects.inv
118 | - https://mkdocstrings.github.io/griffe/objects.inv
119 | - https://docs.pydantic.dev/latest/objects.inv
120 | options:
121 | docstring_options:
122 | ignore_init_summary: true
123 | docstring_section_style: list
124 | filters: ["!^_"]
125 | heading_level: 1
126 | inherited_members: true
127 | merge_init_into_class: true
128 | separate_signature: true
129 | show_root_heading: true
130 | show_root_full_path: false
131 | show_signature_annotations: true
132 | show_source: false
133 | show_symbol_type_heading: true
134 | show_symbol_type_toc: true
135 | signature_crossrefs: true
136 | summary: true
137 | - llmstxt:
138 | files:
139 | - output: llms-full.txt
140 | inputs:
141 | - index.md
142 | - reference/**.md
143 | - git-revision-date-localized:
144 | enabled: !ENV [DEPLOY, false]
145 | enable_creation_date: true
146 | type: timeago
147 | - minify:
148 | minify_html: !ENV [DEPLOY, false]
149 | - group:
150 | enabled: !ENV [MATERIAL_INSIDERS, false]
151 | plugins:
152 | - typeset
153 |
154 | extra:
155 | social:
156 | - icon: fontawesome/brands/github
157 | link: https://github.com/pawamoy
158 | - icon: fontawesome/brands/mastodon
159 | link: https://fosstodon.org/@pawamoy
160 | - icon: fontawesome/brands/twitter
161 | link: https://twitter.com/pawamoy
162 | - icon: fontawesome/brands/gitter
163 | link: https://gitter.im/griffe-pydantic/community
164 | - icon: fontawesome/brands/python
165 | link: https://pypi.org/project/griffe-pydantic/
166 | analytics:
167 | feedback:
168 | title: Was this page helpful?
169 | ratings:
170 | - icon: material/emoticon-happy-outline
171 | name: This page was helpful
172 | data: 1
173 | note: Thanks for your feedback!
174 | - icon: material/emoticon-sad-outline
175 | name: This page could be improved
176 | data: 0
177 | note: Let us know how we can improve this page.
178 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [build-system]
2 | requires = ["pdm-backend"]
3 | build-backend = "pdm.backend"
4 |
5 | [project]
6 | name = "griffe-pydantic"
7 | description = "Griffe extension for Pydantic."
8 | authors = [{name = "Timothée Mazzucotelli", email = "dev@pawamoy.fr"}]
9 | license = "ISC"
10 | license-files = ["LICENSE"]
11 | readme = "README.md"
12 | requires-python = ">=3.9"
13 | keywords = []
14 | dynamic = ["version"]
15 | classifiers = [
16 | "Development Status :: 4 - Beta",
17 | "Intended Audience :: Developers",
18 | "Programming Language :: Python",
19 | "Programming Language :: Python :: 3",
20 | "Programming Language :: Python :: 3 :: Only",
21 | "Programming Language :: Python :: 3.9",
22 | "Programming Language :: Python :: 3.10",
23 | "Programming Language :: Python :: 3.11",
24 | "Programming Language :: Python :: 3.12",
25 | "Programming Language :: Python :: 3.13",
26 | "Programming Language :: Python :: 3.14",
27 | "Topic :: Documentation",
28 | "Topic :: Software Development",
29 | "Topic :: Utilities",
30 | "Typing :: Typed",
31 | ]
32 | dependencies = [
33 | "griffe>=1.6.3",
34 | ]
35 |
36 | [project.urls]
37 | Homepage = "https://mkdocstrings.github.io/griffe-pydantic"
38 | Documentation = "https://mkdocstrings.github.io/griffe-pydantic"
39 | Changelog = "https://mkdocstrings.github.io/griffe-pydantic/changelog"
40 | Repository = "https://github.com/mkdocstrings/griffe-pydantic"
41 | Issues = "https://github.com/mkdocstrings/griffe-pydantic/issues"
42 | Discussions = "https://github.com/mkdocstrings/griffe-pydantic/discussions"
43 | Gitter = "https://gitter.im/mkdocstrings/griffe-pydantic"
44 | Funding = "https://github.com/sponsors/pawamoy"
45 |
46 | [project.entry-points."mkdocstrings.python.templates"]
47 | griffe-pydantic = "griffe_pydantic:get_templates_path"
48 |
49 | [tool.pdm.version]
50 | source = "call"
51 | getter = "scripts.get_version:get_version"
52 |
53 | [tool.pdm.build]
54 | # Include as much as possible in the source distribution, to help redistributors.
55 | excludes = ["**/.pytest_cache"]
56 | source-includes = [
57 | "config",
58 | "docs",
59 | "scripts",
60 | "share",
61 | "tests",
62 | "duties.py",
63 | "mkdocs.yml",
64 | "*.md",
65 | "LICENSE",
66 | ]
67 |
68 | [tool.pdm.build.wheel-data]
69 | # Manual pages can be included in the wheel.
70 | # Depending on the installation tool, they will be accessible to users.
71 | # pipx supports it, uv does not yet, see https://github.com/astral-sh/uv/issues/4731.
72 | data = [
73 | {path = "share/**/*", relative-to = "."},
74 | ]
75 |
76 | [dependency-groups]
77 | maintain = [
78 | "build>=1.2",
79 | "git-changelog>=2.5",
80 | "twine>=5.1",
81 | "yore>=0.3.3",
82 | ]
83 | ci = [
84 | "duty>=1.6",
85 | "ruff>=0.4",
86 | "pytest>=8.2",
87 | "pytest-cov>=5.0",
88 | "pytest-randomly>=3.15",
89 | "pytest-xdist>=3.6",
90 | "mypy>=1.10",
91 | "pydantic>=2.10",
92 | "types-markdown>=3.6",
93 | "types-pyyaml>=6.0",
94 | ]
95 | docs = [
96 | "markdown-callouts>=0.4",
97 | "markdown-exec>=1.8",
98 | "mkdocs>=1.6",
99 | "mkdocs-coverage>=1.0",
100 | "mkdocs-git-revision-date-localized-plugin>=1.2",
101 | "mkdocs-llmstxt>=0.1",
102 | "mkdocs-material>=9.5",
103 | "mkdocs-minify-plugin>=0.8",
104 | "mkdocstrings[python]>=0.28",
105 | "mkdocs-section-index>=0.3",
106 | # YORE: EOL 3.10: Remove line.
107 | "tomli>=2.0; python_version < '3.11'",
108 | ]
109 |
110 | [tool.uv]
111 | default-groups = ["maintain", "ci", "docs"]
112 |
--------------------------------------------------------------------------------
/scripts/gen_api_ref.py:
--------------------------------------------------------------------------------
1 | # Generate the API reference pages and navigation.
2 |
3 | from pathlib import Path
4 |
5 | import mkdocs_gen_files
6 |
7 | nav = mkdocs_gen_files.Nav()
8 | # Reconstructed: the inline markup was stripped from this copy of the file.
9 | mod_symbol = '<code class="doc-symbol doc-symbol-nav doc-symbol-module"></code>'
9 |
10 | root = Path(__file__).parent.parent
11 | src = root / "src"
12 |
13 | for path in sorted(src.rglob("*.py")):
14 | module_path = path.relative_to(src).with_suffix("")
15 | doc_path = path.relative_to(src).with_suffix(".md")
16 | full_doc_path = Path("reference", doc_path)
17 |
18 | parts = tuple(module_path.parts)
19 |
20 | if parts[-1] == "__init__":
21 | parts = parts[:-1]
22 | doc_path = doc_path.with_name("index.md")
23 | full_doc_path = full_doc_path.with_name("index.md")
24 |
25 | if any(part.startswith("_") for part in parts):
26 | continue
27 |
28 | nav_parts = [f"{mod_symbol} {part}" for part in parts]
29 | nav[tuple(nav_parts)] = doc_path.as_posix()
30 |
31 | with mkdocs_gen_files.open(full_doc_path, "w") as fd:
32 | ident = ".".join(parts)
33 | fd.write(f"---\ntitle: {ident}\n---\n\n::: {ident}")
34 |
35 | mkdocs_gen_files.set_edit_path(full_doc_path, ".." / path.relative_to(root))
36 |
37 | with mkdocs_gen_files.open("reference/SUMMARY.txt", "w") as nav_file:
38 | nav_file.writelines(nav.build_literate_nav())
39 |
--------------------------------------------------------------------------------
/scripts/gen_credits.py:
--------------------------------------------------------------------------------
1 | # Script to generate the project's credits.
2 |
3 | from __future__ import annotations
4 |
5 | import os
6 | import sys
7 | from collections import defaultdict
8 | from collections.abc import Iterable
9 | from importlib.metadata import distributions
10 | from itertools import chain
11 | from pathlib import Path
12 | from textwrap import dedent
13 | from typing import Union
14 |
15 | from jinja2 import StrictUndefined
16 | from jinja2.sandbox import SandboxedEnvironment
17 | from packaging.requirements import Requirement
18 |
19 | # YORE: EOL 3.10: Replace block with line 2.
20 | if sys.version_info >= (3, 11):
21 | import tomllib
22 | else:
23 | import tomli as tomllib
24 |
25 | project_dir = Path(os.getenv("MKDOCS_CONFIG_DIR", "."))
26 | with project_dir.joinpath("pyproject.toml").open("rb") as pyproject_file:
27 | pyproject = tomllib.load(pyproject_file)
28 | project = pyproject["project"]
29 | project_name = project["name"]
30 | devdeps = [dep for group in pyproject["dependency-groups"].values() for dep in group if not dep.startswith("-e")]
31 |
32 | PackageMetadata = dict[str, Union[str, Iterable[str]]]
33 | Metadata = dict[str, PackageMetadata]
34 |
35 |
36 | def _merge_fields(metadata: dict) -> PackageMetadata:
37 | fields = defaultdict(list)
38 | for header, value in metadata.items():
39 | fields[header.lower()].append(value.strip())
40 | return {
41 | field: value if len(value) > 1 or field in ("classifier", "requires-dist") else value[0]
42 | for field, value in fields.items()
43 | }
44 |
45 |
46 | def _norm_name(name: str) -> str:
47 | return name.replace("_", "-").replace(".", "-").lower()
48 |
49 |
50 | def _requirements(deps: list[str]) -> dict[str, Requirement]:
51 | return {_norm_name((req := Requirement(dep)).name): req for dep in deps}
52 |
53 |
54 | def _extra_marker(req: Requirement) -> str | None:
55 | if not req.marker:
56 | return None
57 | try:
58 | return next(marker[2].value for marker in req.marker._markers if getattr(marker[0], "value", None) == "extra")
59 | except StopIteration:
60 | return None
61 |
62 |
63 | def _get_metadata() -> Metadata:
64 | metadata = {}
65 | for pkg in distributions():
66 | name = _norm_name(pkg.name) # type: ignore[attr-defined,unused-ignore]
67 | metadata[name] = _merge_fields(pkg.metadata) # type: ignore[arg-type]
68 | metadata[name]["spec"] = set()
69 | metadata[name]["extras"] = set()
70 | metadata[name].setdefault("summary", "")
71 | _set_license(metadata[name])
72 | return metadata
73 |
74 |
75 | def _set_license(metadata: PackageMetadata) -> None:
76 | license_field = metadata.get("license-expression", metadata.get("license", ""))
77 | license_name = license_field if isinstance(license_field, str) else " + ".join(license_field)
78 | check_classifiers = license_name in ("UNKNOWN", "Dual License", "") or license_name.count("\n")
79 | if check_classifiers:
80 | license_names = []
81 | for classifier in metadata["classifier"]:
82 | if classifier.startswith("License ::"):
83 | license_names.append(classifier.rsplit("::", 1)[1].strip())
84 | license_name = " + ".join(license_names)
85 | metadata["license"] = license_name or "?"
86 |
87 |
88 | def _get_deps(base_deps: dict[str, Requirement], metadata: Metadata) -> Metadata:
89 | deps = {}
90 | for dep_name, dep_req in base_deps.items():
91 | if dep_name not in metadata or dep_name == "griffe-pydantic":
92 | continue
93 | metadata[dep_name]["spec"] |= {str(spec) for spec in dep_req.specifier} # type: ignore[operator]
94 | metadata[dep_name]["extras"] |= dep_req.extras # type: ignore[operator]
95 | deps[dep_name] = metadata[dep_name]
96 |
97 | again = True
98 | while again:
99 | again = False
100 | for pkg_name in metadata:
101 | if pkg_name in deps:
102 | for pkg_dependency in metadata[pkg_name].get("requires-dist", []):
103 | requirement = Requirement(pkg_dependency)
104 | dep_name = _norm_name(requirement.name)
105 | extra_marker = _extra_marker(requirement)
106 | if (
107 | dep_name in metadata
108 | and dep_name not in deps
109 | and dep_name != project["name"]
110 | and (not extra_marker or extra_marker in deps[pkg_name]["extras"])
111 | ):
112 | metadata[dep_name]["spec"] |= {str(spec) for spec in requirement.specifier} # type: ignore[operator]
113 | deps[dep_name] = metadata[dep_name]
114 | again = True
115 |
116 | return deps
117 |
118 |
119 | def _render_credits() -> str:
120 | metadata = _get_metadata()
121 | dev_dependencies = _get_deps(_requirements(devdeps), metadata)
122 | prod_dependencies = _get_deps(
123 | _requirements(
124 | chain( # type: ignore[arg-type]
125 | project.get("dependencies", []),
126 | chain(*project.get("optional-dependencies", {}).values()),
127 | ),
128 | ),
129 | metadata,
130 | )
131 |
132 | template_data = {
133 | "project_name": project_name,
134 | "prod_dependencies": sorted(prod_dependencies.values(), key=lambda dep: str(dep["name"]).lower()),
135 | "dev_dependencies": sorted(dev_dependencies.values(), key=lambda dep: str(dep["name"]).lower()),
136 |         "more_credits": "https://pawamoy.github.io/credits/",
137 | }
138 | template_text = dedent(
139 | """
140 | # Credits
141 |
142 | These projects were used to build *{{ project_name }}*. **Thank you!**
143 |
144 | [Python](https://www.python.org/) |
145 | [uv](https://github.com/astral-sh/uv) |
146 | [copier-uv](https://github.com/pawamoy/copier-uv)
147 |
148 | {% macro dep_line(dep) -%}
149 | [{{ dep.name }}](https://pypi.org/project/{{ dep.name }}/) | {{ dep.summary }} | {{ ("`" ~ dep.spec|sort(reverse=True)|join(", ") ~ "`") if dep.spec else "" }} | `{{ dep.version }}` | {{ dep.license }}
150 | {%- endmacro %}
151 |
152 | {% if prod_dependencies -%}
153 | ### Runtime dependencies
154 |
155 | Project | Summary | Version (accepted) | Version (last resolved) | License
156 | ------- | ------- | ------------------ | ----------------------- | -------
157 | {% for dep in prod_dependencies -%}
158 | {{ dep_line(dep) }}
159 | {% endfor %}
160 |
161 | {% endif -%}
162 | {% if dev_dependencies -%}
163 | ### Development dependencies
164 |
165 | Project | Summary | Version (accepted) | Version (last resolved) | License
166 | ------- | ------- | ------------------ | ----------------------- | -------
167 | {% for dep in dev_dependencies -%}
168 | {{ dep_line(dep) }}
169 | {% endfor %}
170 |
171 | {% endif -%}
172 | {% if more_credits %}**[More credits from the author]({{ more_credits }})**{% endif %}
173 | """,
174 | )
175 | jinja_env = SandboxedEnvironment(undefined=StrictUndefined)
176 | return jinja_env.from_string(template_text).render(**template_data)
177 |
178 |
179 | print(_render_credits())
180 |
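The header-merging step in this script can be exercised on its own. A minimal standalone sketch of the same logic (function and sample input renamed/made up here; a plain `dict` input cannot carry duplicate headers the way an `email.Message` metadata object can, so only the single-value path is shown):

```python
from collections import defaultdict

# Standalone copy of the _merge_fields logic above: lowercase headers,
# strip values, and keep lists only for known multi-valued fields.
def merge_fields(metadata: dict) -> dict:
    fields = defaultdict(list)
    for header, value in metadata.items():
        fields[header.lower()].append(value.strip())
    return {
        field: value if len(value) > 1 or field in ("classifier", "requires-dist") else value[0]
        for field, value in fields.items()
    }

merged = merge_fields({"Name": " example-pkg ", "Summary": "An example.", "Classifier": "License :: OSI Approved"})
print(merged["name"])        # example-pkg
print(merged["classifier"])  # ['License :: OSI Approved']
```

Note how `classifier` stays a list even with a single value, so templates can always iterate over it.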
--------------------------------------------------------------------------------
/scripts/get_version.py:
--------------------------------------------------------------------------------
1 | # Get current project version from Git tags or changelog.
2 |
3 | import re
4 | from contextlib import suppress
5 | from pathlib import Path
6 |
7 | from pdm.backend.hooks.version import SCMVersion, Version, default_version_formatter, get_version_from_scm
8 |
9 | _root = Path(__file__).parent.parent
10 | _changelog = _root / "CHANGELOG.md"
11 | _changelog_version_re = re.compile(r"^## \[(\d+\.\d+\.\d+)\].*$")
12 | _default_scm_version = SCMVersion(Version("0.0.0"), None, False, None, None) # noqa: FBT003
13 |
14 |
15 | def get_version() -> str:
16 | scm_version = get_version_from_scm(_root) or _default_scm_version
17 | if scm_version.version <= Version("0.1"): # Missing Git tags?
18 | with suppress(OSError, StopIteration): # noqa: SIM117
19 | with _changelog.open("r", encoding="utf8") as file:
20 | match = next(filter(None, map(_changelog_version_re.match, file)))
21 | scm_version = scm_version._replace(version=Version(match.group(1)))
22 | return default_version_formatter(scm_version)
23 |
24 |
25 | if __name__ == "__main__":
26 | print(get_version())
27 |
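The changelog fallback above hinges on the version-heading regex. A self-contained sketch of that matching step (the sample changelog text is made up for illustration):

```python
import re

# Same pattern as the script above: match "## [X.Y.Z]" release headings.
changelog_version_re = re.compile(r"^## \[(\d+\.\d+\.\d+)\].*$")

changelog = """\
# Changelog

## [1.2.3](https://example.com/compare/1.2.2...1.2.3) - 2024-01-01

## [1.2.2](https://example.com/compare/1.2.1...1.2.2) - 2023-12-01
"""

# Like the script, take the first matching heading: the most recent release.
match = next(filter(None, map(changelog_version_re.match, changelog.splitlines())))
print(match.group(1))  # 1.2.3
```

Because `next(filter(...))` stops at the first match, the file is only read up to the latest release heading.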
--------------------------------------------------------------------------------
/scripts/insiders.py:
--------------------------------------------------------------------------------
1 | # Functions related to Insiders funding goals.
2 |
3 | from __future__ import annotations
4 |
5 | import json
6 | import logging
7 | import os
8 | import posixpath
9 | from dataclasses import dataclass
10 | from datetime import date, datetime, timedelta
11 | from itertools import chain
12 | from pathlib import Path
13 | from typing import TYPE_CHECKING, cast
14 | from urllib.error import HTTPError
15 | from urllib.parse import urljoin
16 | from urllib.request import urlopen
17 |
18 | import yaml
19 |
20 | if TYPE_CHECKING:
21 | from collections.abc import Iterable
22 |
23 | logger = logging.getLogger(f"mkdocs.logs.{__name__}")
24 |
25 |
26 | def human_readable_amount(amount: int) -> str:
27 | str_amount = str(amount)
28 | if len(str_amount) >= 4: # noqa: PLR2004
29 | return f"{str_amount[: len(str_amount) - 3]},{str_amount[-3:]}"
30 | return str_amount
31 |
32 |
33 | @dataclass
34 | class Project:
35 | name: str
36 | url: str
37 |
38 |
39 | @dataclass
40 | class Feature:
41 | name: str
42 | ref: str | None
43 | since: date | None
44 | project: Project | None
45 |
46 | def url(self, rel_base: str = "..") -> str | None: # noqa: D102
47 | if not self.ref:
48 | return None
49 | if self.project:
50 | rel_base = self.project.url
51 | return posixpath.join(rel_base, self.ref.lstrip("/"))
52 |
53 | def render(self, rel_base: str = "..", *, badge: bool = False) -> None: # noqa: D102
54 | new = ""
55 | if badge:
56 | recent = self.since and date.today() - self.since <= timedelta(days=60) # noqa: DTZ011
57 | if recent:
58 | ft_date = self.since.strftime("%B %d, %Y") # type: ignore[union-attr]
59 | new = f' :material-alert-decagram:{{ .new-feature .vibrate title="Added on {ft_date}" }}'
60 | project = f"[{self.project.name}]({self.project.url}) — " if self.project else ""
61 | feature = f"[{self.name}]({self.url(rel_base)})" if self.ref else self.name
62 | print(f"- [{'x' if self.since else ' '}] {project}{feature}{new}")
63 |
64 |
65 | @dataclass
66 | class Goal:
67 | name: str
68 | amount: int
69 | features: list[Feature]
70 | complete: bool = False
71 |
72 | @property
73 | def human_readable_amount(self) -> str: # noqa: D102
74 | return human_readable_amount(self.amount)
75 |
76 | def render(self, rel_base: str = "..") -> None: # noqa: D102
77 | print(f"#### $ {self.human_readable_amount} — {self.name}\n")
78 | if self.features:
79 | for feature in self.features:
80 | feature.render(rel_base)
81 | print("")
82 | else:
83 | print("There are no features in this goal for this project. ")
84 | print(
85 | "[See the features in this goal **for all Insiders projects.**]"
86 | f"(https://pawamoy.github.io/insiders/#{self.amount}-{self.name.lower().replace(' ', '-')})",
87 | )
88 |
89 |
90 | def load_goals(data: str, funding: int = 0, project: Project | None = None) -> dict[int, Goal]:
91 | goals_data = yaml.safe_load(data)["goals"]
92 | return {
93 | amount: Goal(
94 | name=goal_data["name"],
95 | amount=amount,
96 | complete=funding >= amount,
97 | features=[
98 | Feature(
99 | name=feature_data["name"],
100 | ref=feature_data.get("ref"),
101 | since=feature_data.get("since") and datetime.strptime(feature_data["since"], "%Y/%m/%d").date(), # noqa: DTZ007
102 | project=project,
103 | )
104 | for feature_data in goal_data["features"]
105 | ],
106 | )
107 | for amount, goal_data in goals_data.items()
108 | }
109 |
110 |
111 | def _load_goals_from_disk(path: str, funding: int = 0) -> dict[int, Goal]:
112 | project_dir = os.getenv("MKDOCS_CONFIG_DIR", ".")
113 | try:
114 | data = Path(project_dir, path).read_text()
115 | except OSError as error:
116 | raise RuntimeError(f"Could not load data from disk: {path}") from error
117 | return load_goals(data, funding)
118 |
119 |
120 | def _load_goals_from_url(source_data: tuple[str, str, str], funding: int = 0) -> dict[int, Goal]:
121 | project_name, project_url, data_fragment = source_data
122 | data_url = urljoin(project_url, data_fragment)
123 | try:
124 | with urlopen(data_url) as response: # noqa: S310
125 | data = response.read()
126 | except HTTPError as error:
127 | raise RuntimeError(f"Could not load data from network: {data_url}") from error
128 | return load_goals(data, funding, project=Project(name=project_name, url=project_url))
129 |
130 |
131 | def _load_goals(source: str | tuple[str, str, str], funding: int = 0) -> dict[int, Goal]:
132 | if isinstance(source, str):
133 | return _load_goals_from_disk(source, funding)
134 | return _load_goals_from_url(source, funding)
135 |
136 |
137 | def funding_goals(source: str | list[str | tuple[str, str, str]], funding: int = 0) -> dict[int, Goal]:
138 | if isinstance(source, str):
139 | return _load_goals_from_disk(source, funding)
140 | goals = {}
141 | for src in source:
142 | source_goals = _load_goals(src, funding)
143 | for amount, goal in source_goals.items():
144 | if amount not in goals:
145 | goals[amount] = goal
146 | else:
147 | goals[amount].features.extend(goal.features)
148 | return {amount: goals[amount] for amount in sorted(goals)}
149 |
150 |
151 | def feature_list(goals: Iterable[Goal]) -> list[Feature]:
152 | return list(chain.from_iterable(goal.features for goal in goals))
153 |
154 |
155 | def load_json(url: str) -> str | list | dict:
156 | with urlopen(url) as response: # noqa: S310
157 | return json.loads(response.read().decode())
158 |
159 |
160 | data_source = globals()["data_source"]
161 | sponsor_url = "https://github.com/sponsors/pawamoy"
162 | data_url = "https://raw.githubusercontent.com/pawamoy/sponsors/main"
163 | numbers: dict[str, int] = load_json(f"{data_url}/numbers.json") # type: ignore[assignment]
164 | sponsors: list[dict] = load_json(f"{data_url}/sponsors.json") # type: ignore[assignment]
165 | current_funding = numbers["total"]
166 | sponsors_count = numbers["count"]
167 | goals = funding_goals(data_source, funding=current_funding)
168 | ongoing_goals = [goal for goal in goals.values() if not goal.complete]
169 | unreleased_features = sorted(
170 | (ft for ft in feature_list(ongoing_goals) if ft.since),
171 | key=lambda ft: cast(date, ft.since),
172 | reverse=True,
173 | )
174 |
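The amount formatting used by the goals above inserts a single thousands separator. A standalone sketch of that helper (note it only places one comma, which is sufficient for amounts below one million):

```python
def human_readable_amount(amount: int) -> str:
    # Same logic as the script above: one comma before the last three digits.
    str_amount = str(amount)
    if len(str_amount) >= 4:
        return f"{str_amount[: len(str_amount) - 3]},{str_amount[-3:]}"
    return str_amount

print(human_readable_amount(500))    # 500
print(human_readable_amount(12345))  # 12,345
```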
--------------------------------------------------------------------------------
/scripts/make:
--------------------------------------------------------------------------------
1 | make.py
--------------------------------------------------------------------------------
/scripts/make.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | from __future__ import annotations
3 |
4 | import os
5 | import shutil
6 | import subprocess
7 | import sys
8 | from contextlib import contextmanager
9 | from pathlib import Path
10 | from textwrap import dedent
11 | from typing import TYPE_CHECKING, Any
12 |
13 | if TYPE_CHECKING:
14 | from collections.abc import Iterator
15 |
16 |
17 | PYTHON_VERSIONS = os.getenv("PYTHON_VERSIONS", "3.9 3.10 3.11 3.12 3.13").split()
18 |
19 |
20 | def shell(cmd: str, *, capture_output: bool = False, **kwargs: Any) -> str | None:
21 | """Run a shell command."""
22 | if capture_output:
23 | return subprocess.check_output(cmd, shell=True, text=True, **kwargs) # noqa: S602
24 | subprocess.run(cmd, shell=True, check=True, stderr=subprocess.STDOUT, **kwargs) # noqa: S602
25 | return None
26 |
27 |
28 | @contextmanager
29 | def environ(**kwargs: str) -> Iterator[None]:
30 | """Temporarily set environment variables."""
31 | original = dict(os.environ)
32 | os.environ.update(kwargs)
33 | try:
34 | yield
35 | finally:
36 | os.environ.clear()
37 | os.environ.update(original)
38 |
39 |
40 | def uv_install(venv: Path) -> None:
41 | """Install dependencies using uv."""
42 | with environ(UV_PROJECT_ENVIRONMENT=str(venv), PYO3_USE_ABI3_FORWARD_COMPATIBILITY="1"):
43 | if "CI" in os.environ:
44 | shell("uv sync --no-editable")
45 | else:
46 | shell("uv sync")
47 |
48 |
49 | def setup() -> None:
50 |     """Set up the project."""
51 | if not shutil.which("uv"):
52 | raise ValueError("make: setup: uv must be installed, see https://github.com/astral-sh/uv")
53 |
54 | print("Installing dependencies (default environment)")
55 | default_venv = Path(".venv")
56 | if not default_venv.exists():
57 | shell("uv venv")
58 | uv_install(default_venv)
59 |
60 | if PYTHON_VERSIONS:
61 | for version in PYTHON_VERSIONS:
62 | print(f"\nInstalling dependencies (python{version})")
63 | venv_path = Path(f".venvs/{version}")
64 | if not venv_path.exists():
65 | shell(f"uv venv --python {version} {venv_path}")
66 | with environ(UV_PROJECT_ENVIRONMENT=str(venv_path.resolve())):
67 | uv_install(venv_path)
68 |
69 |
70 | def run(version: str, cmd: str, *args: str, **kwargs: Any) -> None:
71 | """Run a command in a virtual environment."""
72 | kwargs = {"check": True, **kwargs}
73 | uv_run = ["uv", "run", "--no-sync"]
74 | if version == "default":
75 | with environ(UV_PROJECT_ENVIRONMENT=".venv"):
76 | subprocess.run([*uv_run, cmd, *args], **kwargs) # noqa: S603, PLW1510
77 | else:
78 | with environ(UV_PROJECT_ENVIRONMENT=f".venvs/{version}", MULTIRUN="1"):
79 | subprocess.run([*uv_run, cmd, *args], **kwargs) # noqa: S603, PLW1510
80 |
81 |
82 | def multirun(cmd: str, *args: str, **kwargs: Any) -> None:
83 | """Run a command for all configured Python versions."""
84 | if PYTHON_VERSIONS:
85 | for version in PYTHON_VERSIONS:
86 | run(version, cmd, *args, **kwargs)
87 | else:
88 | run("default", cmd, *args, **kwargs)
89 |
90 |
91 | def allrun(cmd: str, *args: str, **kwargs: Any) -> None:
92 | """Run a command in all virtual environments."""
93 | run("default", cmd, *args, **kwargs)
94 | if PYTHON_VERSIONS:
95 | multirun(cmd, *args, **kwargs)
96 |
97 |
98 | def clean() -> None:
99 | """Delete build artifacts and cache files."""
100 |     paths_to_clean = ["build", "dist", "htmlcov", "site", ".coverage*", ".pdm-build"]
101 |     for path in (match for pattern in paths_to_clean for match in Path(".").glob(pattern)):  # Expand globs like ".coverage*".
102 |         (shutil.rmtree if path.is_dir() else os.remove)(path)
103 |
104 | cache_dirs = {".cache", ".pytest_cache", ".mypy_cache", ".ruff_cache", "__pycache__"}
105 | for dirpath in Path(".").rglob("*/"):
106 | if dirpath.parts[0] not in (".venv", ".venvs") and dirpath.name in cache_dirs:
107 | shutil.rmtree(dirpath, ignore_errors=True)
108 |
109 |
110 | def vscode() -> None:
111 | """Configure VSCode to work on this project."""
112 | shutil.copytree("config/vscode", ".vscode", dirs_exist_ok=True)
113 |
114 |
115 | def main() -> int:
116 | """Main entry point."""
117 | args = list(sys.argv[1:])
118 | if not args or args[0] == "help":
119 | if len(args) > 1:
120 | run("default", "duty", "--help", args[1])
121 | else:
122 | print(
123 | dedent(
124 | """
125 | Available commands
126 | help Print this help. Add task name to print help.
127 | setup Setup all virtual environments (install dependencies).
128 | run Run a command in the default virtual environment.
129 | multirun Run a command for all configured Python versions.
130 | allrun Run a command in all virtual environments.
131 | 3.x Run a command in the virtual environment for Python 3.x.
132 | clean Delete build artifacts and cache files.
133 | vscode Configure VSCode to work on this project.
134 | """,
135 | ),
136 | flush=True,
137 | )
138 | if os.path.exists(".venv"):
139 | print("\nAvailable tasks", flush=True)
140 | run("default", "duty", "--list")
141 | return 0
142 |
143 | while args:
144 | cmd = args.pop(0)
145 |
146 | if cmd == "run":
147 | run("default", *args)
148 | return 0
149 |
150 | if cmd == "multirun":
151 | multirun(*args)
152 | return 0
153 |
154 | if cmd == "allrun":
155 | allrun(*args)
156 | return 0
157 |
158 | if cmd.startswith("3."):
159 | run(cmd, *args)
160 | return 0
161 |
162 | opts = []
163 | while args and (args[0].startswith("-") or "=" in args[0]):
164 | opts.append(args.pop(0))
165 |
166 | if cmd == "clean":
167 | clean()
168 | elif cmd == "setup":
169 | setup()
170 | elif cmd == "vscode":
171 | vscode()
172 | elif cmd == "check":
173 | multirun("duty", "check-quality", "check-types", "check-docs")
174 | run("default", "duty", "check-api")
175 | elif cmd in {"check-quality", "check-docs", "check-types", "test"}:
176 | multirun("duty", cmd, *opts)
177 | else:
178 | run("default", "duty", cmd, *opts)
179 |
180 | return 0
181 |
182 |
183 | if __name__ == "__main__":
184 | try:
185 | sys.exit(main())
186 | except subprocess.CalledProcessError as process:
187 | if process.output:
188 | print(process.output, file=sys.stderr)
189 | sys.exit(process.returncode)
190 |
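The `environ()` helper above is the pattern that makes the per-version `uv` runs safe. A standalone copy, runnable outside the script (the `MAKE_PY_EXAMPLE_VAR` name is made up and assumed not to be set beforehand):

```python
import os
from collections.abc import Iterator
from contextlib import contextmanager

# Standalone copy of make.py's environ() helper: snapshot os.environ,
# apply the overrides, and restore the snapshot on exit even if the body raises.
@contextmanager
def environ(**kwargs: str) -> Iterator[None]:
    original = dict(os.environ)
    os.environ.update(kwargs)
    try:
        yield
    finally:
        os.environ.clear()
        os.environ.update(original)

with environ(MAKE_PY_EXAMPLE_VAR="1"):
    print(os.environ["MAKE_PY_EXAMPLE_VAR"])  # 1
print("MAKE_PY_EXAMPLE_VAR" in os.environ)  # False (assuming it was unset before)
```

Restoring a full snapshot (rather than deleting only the keys that were set) also undoes any variables the wrapped code itself exported.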
--------------------------------------------------------------------------------
/src/griffe_pydantic/__init__.py:
--------------------------------------------------------------------------------
1 | """griffe-pydantic package.
2 |
3 | Griffe extension for Pydantic.
4 | """
5 |
6 | from __future__ import annotations
7 |
8 | from pathlib import Path
9 |
10 | from griffe_pydantic._internal.extension import PydanticExtension
11 |
12 |
13 | def get_templates_path() -> Path:
14 | """Return the templates directory path."""
15 | return Path(__file__).parent / "templates"
16 |
17 |
18 | __all__: list[str] = ["PydanticExtension", "get_templates_path"]
19 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mkdocstrings/griffe-pydantic/187b4b8d052325b4dcf7a562a78b629f9e70c9b2/src/griffe_pydantic/_internal/__init__.py
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/common.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 |
3 | import json
4 | from functools import partial
5 | from typing import TYPE_CHECKING
6 |
7 | if TYPE_CHECKING:
8 | from collections.abc import Sequence
9 |
10 | from griffe import Attribute, Class, Function
11 | from pydantic import BaseModel
12 |
13 | _self_namespace = "griffe_pydantic"
14 | _mkdocstrings_namespace = "mkdocstrings"
15 |
16 | _field_constraints = {
17 | "gt",
18 | "ge",
19 | "lt",
20 | "le",
21 | "multiple_of",
22 | "min_length",
23 | "max_length",
24 | "pattern",
25 | "allow_inf_nan",
26 | "max_digits",
27 |     "decimal_places",
28 | }
29 |
30 |
31 | def _model_fields(cls: Class) -> dict[str, Attribute]:
32 | return {name: attr for name, attr in cls.all_members.items() if "pydantic-field" in attr.labels} # type: ignore[misc]
33 |
34 |
35 | def _model_validators(cls: Class) -> dict[str, Function]:
36 | return {name: func for name, func in cls.all_members.items() if "pydantic-validator" in func.labels} # type: ignore[misc]
37 |
38 |
39 | def _json_schema(model: type[BaseModel]) -> str:
40 | """Produce a model schema as JSON.
41 |
42 | Parameters:
43 | model: A Pydantic model.
44 |
45 | Returns:
46 | A schema as JSON.
47 | """
48 | return json.dumps(model.model_json_schema(), indent=2)
49 |
50 |
51 | def _process_class(cls: Class) -> None:
52 | """Set metadata on a Pydantic model.
53 |
54 | Parameters:
55 | cls: The Griffe class representing the Pydantic model.
56 | """
57 | cls.labels.add("pydantic-model")
58 | cls.extra[_self_namespace]["fields"] = partial(_model_fields, cls)
59 | cls.extra[_self_namespace]["validators"] = partial(_model_validators, cls)
60 | cls.extra[_mkdocstrings_namespace]["template"] = "pydantic_model.html.jinja"
61 |
62 |
63 | def _process_function(func: Function, cls: Class, fields: Sequence[str]) -> None:
64 | """Set metadata on a Pydantic validator.
65 |
66 | Parameters:
67 |         func: A Griffe function representing the Pydantic validator.
68 | """
69 | func.labels = {"pydantic-validator"}
70 | if fields and fields[0] == "*":
71 | targets = [member for member in cls.all_members.values() if "pydantic-field" in member.labels]
72 | else:
73 | targets = [cls.all_members[field] for field in fields]
74 |
75 | func.extra[_self_namespace].setdefault("targets", [])
76 | func.extra[_self_namespace]["targets"].extend(targets)
77 | for target in targets:
78 | target.extra[_self_namespace].setdefault("validators", [])
79 | target.extra[_self_namespace]["validators"].append(func)
80 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/debug.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 |
3 | import os
4 | import platform
5 | import sys
6 | from dataclasses import dataclass
7 | from importlib import metadata
8 |
9 |
10 | @dataclass
11 | class _Variable:
12 | """Dataclass describing an environment variable."""
13 |
14 | name: str
15 | """Variable name."""
16 | value: str
17 | """Variable value."""
18 |
19 |
20 | @dataclass
21 | class _Package:
22 | """Dataclass describing a Python package."""
23 |
24 | name: str
25 | """Package name."""
26 | version: str
27 | """Package version."""
28 |
29 |
30 | @dataclass
31 | class _Environment:
32 | """Dataclass to store environment information."""
33 |
34 | interpreter_name: str
35 | """Python interpreter name."""
36 | interpreter_version: str
37 | """Python interpreter version."""
38 | interpreter_path: str
39 | """Path to Python executable."""
40 | platform: str
41 | """Operating System."""
42 | packages: list[_Package]
43 | """Installed packages."""
44 | variables: list[_Variable]
45 | """Environment variables."""
46 |
47 |
48 | def _interpreter_name_version() -> tuple[str, str]:
49 | if hasattr(sys, "implementation"):
50 | impl = sys.implementation.version
51 | version = f"{impl.major}.{impl.minor}.{impl.micro}"
52 | kind = impl.releaselevel
53 | if kind != "final":
54 | version += kind[0] + str(impl.serial)
55 | return sys.implementation.name, version
56 | return "", "0.0.0"
57 |
58 |
59 | def _get_version(dist: str = "griffe-pydantic") -> str:
60 | """Get version of the given distribution.
61 |
62 | Parameters:
63 | dist: A distribution name.
64 |
65 | Returns:
66 | A version number.
67 | """
68 | try:
69 | return metadata.version(dist)
70 | except metadata.PackageNotFoundError:
71 | return "0.0.0"
72 |
73 |
74 | def _get_debug_info() -> _Environment:
75 | """Get debug/environment information.
76 |
77 | Returns:
78 | Environment information.
79 | """
80 | py_name, py_version = _interpreter_name_version()
81 | packages = ["griffe-pydantic"]
82 | variables = ["PYTHONPATH", *[var for var in os.environ if var.startswith("GRIFFE_PYDANTIC")]]
83 | return _Environment(
84 | interpreter_name=py_name,
85 | interpreter_version=py_version,
86 | interpreter_path=sys.executable,
87 | platform=platform.platform(),
88 | variables=[_Variable(var, val) for var in variables if (val := os.getenv(var))],
89 | packages=[_Package(pkg, _get_version(pkg)) for pkg in packages],
90 | )
91 |
92 |
93 | def _print_debug_info() -> None:
94 | """Print debug/environment information."""
95 | info = _get_debug_info()
96 | print(f"- __System__: {info.platform}")
97 | print(f"- __Python__: {info.interpreter_name} {info.interpreter_version} ({info.interpreter_path})")
98 | print("- __Environment variables__:")
99 | for var in info.variables:
100 | print(f" - `{var.name}`: `{var.value}`")
101 | print("- __Installed packages__:")
102 | for pkg in info.packages:
103 | print(f" - `{pkg.name}` v{pkg.version}")
104 |
105 |
106 | if __name__ == "__main__":
107 | _print_debug_info()
108 |
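The interpreter-detection step in `_interpreter_name_version` can be tried on its own. A standalone sketch (no expected output shown, since the result depends on the interpreter running it):

```python
import sys

# Standalone sketch of debug.py's interpreter detection: read name and
# version from sys.implementation, appending e.g. "b1" for pre-releases.
def interpreter_name_version() -> tuple[str, str]:
    impl = sys.implementation.version
    version = f"{impl.major}.{impl.minor}.{impl.micro}"
    if impl.releaselevel != "final":
        version += impl.releaselevel[0] + str(impl.serial)
    return sys.implementation.name, version

name, version = interpreter_name_version()
print(name, version)
```

On a CPython release build this prints something like `cpython 3.12.1`; on an alpha build the version gains a suffix such as `a4`.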
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/dynamic.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 |
3 | from typing import Any, Callable
4 |
5 | from griffe import (
6 | Attribute,
7 | Class,
8 | Docstring,
9 | Function,
10 | Kind,
11 | get_logger,
12 | )
13 | from pydantic.fields import FieldInfo
14 |
15 | from griffe_pydantic._internal import common
16 |
17 | _logger = get_logger(__name__)
18 |
19 |
20 | def _process_attribute(obj: Any, attr: Attribute, cls: Class, *, processed: set[str]) -> None:
21 | """Handle Pydantic fields."""
22 | if attr.canonical_path in processed:
23 | return
24 | processed.add(attr.canonical_path)
25 | if attr.name == "model_config":
26 | cls.extra[common._self_namespace]["config"] = obj
27 | return
28 |
29 | if not isinstance(obj, FieldInfo):
30 | return
31 |
32 | attr.labels = {"pydantic-field"}
33 | attr.value = obj.default
34 | constraints = {}
35 | for constraint in common._field_constraints:
36 | if (value := getattr(obj, constraint, None)) is not None:
37 | constraints[constraint] = value
38 | attr.extra[common._self_namespace]["constraints"] = constraints
39 |
40 | # Populate docstring from the field's `description` argument.
41 | if not attr.docstring and (docstring := obj.description):
42 | attr.docstring = Docstring(docstring, parent=attr)
43 |
44 |
45 | def _process_function(obj: Callable, func: Function, cls: Class, *, processed: set[str]) -> None:
46 | """Handle Pydantic field validators."""
47 | if func.canonical_path in processed:
48 | return
49 | processed.add(func.canonical_path)
50 | if dec_info := getattr(obj, "decorator_info", None):
51 | common._process_function(func, cls, dec_info.fields)
52 |
53 |
54 | def _process_class(obj: type, cls: Class, *, processed: set[str], schema: bool = False) -> None:
55 | """Detect and prepare Pydantic models."""
56 | common._process_class(cls)
57 | if schema:
58 | cls.extra[common._self_namespace]["schema"] = common._json_schema(obj)
59 | for member in cls.all_members.values():
60 | kind = member.kind
61 | if kind is Kind.ATTRIBUTE:
62 | _process_attribute(getattr(obj, member.name), member, cls, processed=processed) # type: ignore[arg-type]
63 | elif kind is Kind.FUNCTION:
64 | _process_function(getattr(obj, member.name), member, cls, processed=processed) # type: ignore[arg-type]
65 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/extension.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 |
3 | import ast
4 | from typing import TYPE_CHECKING, Any
5 |
6 | from griffe import (
7 | Class,
8 | Extension,
9 | Module,
10 | get_logger,
11 | )
12 |
13 | from griffe_pydantic._internal import dynamic, static
14 |
15 | if TYPE_CHECKING:
16 | from griffe import ObjectNode
17 |
18 |
19 | _logger = get_logger(__name__)
20 |
21 |
22 | class PydanticExtension(Extension):
23 | """Griffe extension for Pydantic."""
24 |
25 | def __init__(self, *, schema: bool = False) -> None:
26 | """Initialize the extension.
27 |
28 | Parameters:
29 | schema: Whether to compute and store the JSON schema of models.
30 | """
31 | super().__init__()
32 | self._schema = schema
33 | self._processed: set[str] = set()
34 | self._recorded: list[tuple[ObjectNode, Class]] = []
35 |
36 | def on_package_loaded(self, *, pkg: Module, **kwargs: Any) -> None: # noqa: ARG002
37 | """Detect models once the whole package is loaded."""
38 | for node, cls in self._recorded:
39 | self._processed.add(cls.canonical_path)
40 | dynamic._process_class(node.obj, cls, processed=self._processed, schema=self._schema)
41 | static._process_module(pkg, processed=self._processed, schema=self._schema)
42 |
43 | def on_class_instance(self, *, node: ast.AST | ObjectNode, cls: Class, **kwargs: Any) -> None: # noqa: ARG002
44 | """Detect and prepare Pydantic models."""
45 | # Prevent running during static analysis.
46 | if isinstance(node, ast.AST):
47 | return
48 |
49 | try:
50 | import pydantic
51 | except ImportError:
52 | _logger.warning("could not import pydantic - models will not be detected")
53 | return
54 |
55 | if issubclass(node.obj, pydantic.BaseModel):
56 | self._recorded.append((node, cls))
57 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/_internal/static.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 |
3 | import ast
4 | import sys
5 | from typing import TYPE_CHECKING
6 |
7 | from griffe import (
8 | Alias,
9 | Attribute,
10 | Class,
11 | Docstring,
12 | Expr,
13 | ExprCall,
14 | ExprKeyword,
15 | ExprName,
16 | Function,
17 | Kind,
18 | Module,
19 | dynamic_import,
20 | get_logger,
21 | )
22 |
23 | from griffe_pydantic._internal import common
24 |
25 | if TYPE_CHECKING:
26 | from pathlib import Path
27 |
28 |
29 | _logger = get_logger(__name__)
30 |
31 |
32 | def _inherits_pydantic(cls: Class) -> bool:
33 | """Tell whether a class inherits from a Pydantic model.
34 |
35 | Parameters:
36 | cls: A Griffe class.
37 |
38 | Returns:
39 | True/False.
40 | """
41 | for base in cls.bases:
42 | if isinstance(base, (ExprName, Expr)):
43 | base = base.canonical_path # noqa: PLW2901
44 | if base in {"pydantic.BaseModel", "pydantic.main.BaseModel"}:
45 | return True
46 |
47 | return any(_inherits_pydantic(parent_class) for parent_class in cls.mro())
48 |
49 |
50 | def _pydantic_validator(func: Function) -> ExprCall | None:
51 |     """Return a function's `pydantic.field_validator` or `pydantic.model_validator` decorator if it exists.
52 |
53 | Parameters:
54 | func: A Griffe function.
55 |
56 | Returns:
57 |         The decorator value (a Griffe expression), or None if the function has no such decorator.
58 | """
59 | for decorator in func.decorators:
60 | if isinstance(decorator.value, ExprCall) and decorator.callable_path in {
61 | "pydantic.field_validator",
62 | "pydantic.model_validator",
63 | }:
64 | return decorator.value
65 | return None
66 |
67 |
68 | def _process_attribute(attr: Attribute, cls: Class, *, processed: set[str]) -> None:
69 | """Handle Pydantic fields."""
70 | if attr.canonical_path in processed:
71 | return
72 | processed.add(attr.canonical_path)
73 |
74 | # Properties are not fields.
75 | if "property" in attr.labels:
76 | return
77 |
78 | # Presence of `class-attribute` label and absence of `instance-attribute` label
79 | # indicates that the attribute is annotated with `ClassVar` and should be ignored.
80 | if "class-attribute" in attr.labels and "instance-attribute" not in attr.labels:
81 | return
82 |
83 | kwargs = {}
84 | if isinstance(attr.value, ExprCall):
85 | kwargs = {
86 | argument.name: argument.value for argument in attr.value.arguments if isinstance(argument, ExprKeyword)
87 | }
88 |
89 | if (
90 | attr.value.function.canonical_path == "pydantic.Field"
91 | and len(attr.value.arguments) >= 1
92 | and not isinstance(attr.value.arguments[0], ExprKeyword)
93 | and attr.value.arguments[0] != "..." # handle Field(...), i.e. no default
94 | ):
95 | kwargs["default"] = attr.value.arguments[0]
96 |
97 | elif attr.value is not None:
98 | kwargs["default"] = attr.value
99 |
100 | if attr.name == "model_config":
101 | config = {}
102 | for key, value in kwargs.items():
103 | if isinstance(value, str):
104 | try:
105 | config[key] = ast.literal_eval(value)
106 | except ValueError:
107 | config[key] = value
108 | else:
109 | config[key] = value
110 | cls.extra[common._self_namespace]["config"] = config
111 | return
112 |
113 | attr.labels.add("pydantic-field")
114 | attr.labels.discard("class-attribute")
115 | attr.labels.discard("instance-attribute")
116 |
117 | attr.value = kwargs.get("default", None)
118 | constraints = {kwarg: value for kwarg, value in kwargs.items() if kwarg not in {"default", "description"}}
119 | attr.extra[common._self_namespace]["constraints"] = constraints
120 |
121 | # Populate docstring from the field's `description` argument.
122 | if not attr.docstring and (docstring := kwargs.get("description", None)):
123 | try:
124 | attr.docstring = Docstring(ast.literal_eval(docstring), parent=attr) # type: ignore[arg-type]
125 | except ValueError:
126 | _logger.debug(f"Could not parse description of field '{attr.path}' as literal, skipping")
127 |
128 |
129 | def _process_function(func: Function, cls: Class, *, processed: set[str]) -> None:
130 | """Handle Pydantic field validators."""
131 | if func.canonical_path in processed:
132 | return
133 | processed.add(func.canonical_path)
134 |
135 | if isinstance(func, Alias):
136 | _logger.warning(f"cannot yet process {func}")
137 | return
138 |
139 | if decorator := _pydantic_validator(func):
140 | fields = [ast.literal_eval(field) for field in decorator.arguments if isinstance(field, str)]
141 | common._process_function(func, cls, fields)
142 |
143 |
144 | def _process_class(cls: Class, *, processed: set[str], schema: bool = False) -> None:
145 | """Finalize the Pydantic model data."""
146 | if cls.canonical_path in processed:
147 | return
148 |
149 | if not _inherits_pydantic(cls):
150 | return
151 |
152 | processed.add(cls.canonical_path)
153 |
154 | common._process_class(cls)
155 |
156 | if schema:
157 | import_path: Path | list[Path] = cls.package.filepath
158 | if isinstance(import_path, list):
159 | import_path = import_path[-1]
160 | if import_path.name == "__init__.py":
161 | import_path = import_path.parent
162 | import_path = import_path.parent
163 | try:
164 | true_class = dynamic_import(cls.path, import_paths=[import_path, *sys.path])
165 | except ImportError:
166 | _logger.debug(f"Could not import class {cls.path} for JSON schema")
167 | return
168 | cls.extra[common._self_namespace]["schema"] = common._json_schema(true_class)
169 |
170 | for member in cls.all_members.values():
171 | kind = member.kind
172 | if kind is Kind.ATTRIBUTE:
173 | _process_attribute(member, cls, processed=processed) # type: ignore[arg-type]
174 | elif kind is Kind.FUNCTION:
175 | _process_function(member, cls, processed=processed) # type: ignore[arg-type]
176 | elif kind is Kind.CLASS:
177 | _process_class(member, processed=processed, schema=schema) # type: ignore[arg-type]
178 |
179 |
180 | def _process_module(
181 | mod: Module,
182 | *,
183 | processed: set[str],
184 | schema: bool = False,
185 | ) -> None:
186 | """Handle Pydantic models in a module."""
187 | if mod.canonical_path in processed:
188 | return
189 | processed.add(mod.canonical_path)
190 |
191 | for cls in mod.classes.values():
192 |         # Don't process aliases; real classes will be processed at some point anyway.
193 | if not cls.is_alias:
194 | _process_class(cls, processed=processed, schema=schema)
195 |
196 | for submodule in mod.modules.values():
197 | _process_module(submodule, processed=processed, schema=schema)
198 |
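The `model_config` handling in `_process_attribute` above parses each config value with `ast.literal_eval`, keeping the raw source text when the value is not a Python literal. A minimal standalone sketch of that fallback (the function name here is illustrative, not part of the package):

```python
import ast


def parse_config_value(value: object) -> object:
    """Mimic the literal_eval fallback used for `model_config` values."""
    if isinstance(value, str):
        try:
            return ast.literal_eval(value)  # "False" -> False, "'abc'" -> "abc"
        except ValueError:
            return value  # not a literal (e.g. a name): keep the source text
    return value


assert parse_config_value("False") is False
assert parse_config_value("[1, 2]") == [1, 2]
assert parse_config_value("frozen") == "frozen"  # not a literal: kept verbatim
assert parse_config_value(3) == 3  # non-strings pass through unchanged
```

This is why dynamic expressions such as `description=desc` are kept as-is (or skipped for docstrings) rather than evaluated.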
--------------------------------------------------------------------------------
/src/griffe_pydantic/common.py:
--------------------------------------------------------------------------------
1 | """Deprecated. Import from `griffe_pydantic` directly instead."""
2 |
3 | import warnings
4 | from typing import Any
5 |
6 | from griffe_pydantic._internal import common
7 |
8 | # YORE: Bump 2: Remove file.
9 |
10 |
11 | def __getattr__(name: str) -> Any:
12 | warnings.warn(
13 | "Importing from `griffe_pydantic.common` is deprecated. Import from `griffe_pydantic` directly instead.",
14 | DeprecationWarning,
15 | stacklevel=2,
16 | )
17 | try:
18 | return getattr(common, name)
19 | except AttributeError:
20 | return getattr(common, name.removeprefix("_"))
21 |
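This deprecation shim (repeated below for `dynamic`, `extension`, and `static`) relies on the module-level `__getattr__` hook from PEP 562. A self-contained sketch of the pattern, using `types.ModuleType` objects in place of real modules (all names here are illustrative):

```python
import types
import warnings
from typing import Any

# Stand-in for the internal module that actually holds the objects.
internal = types.ModuleType("pkg._internal.common")
internal.process = lambda: "processed"

# Stand-in for the deprecated public module.
shim = types.ModuleType("pkg.common")


def _deprecated_getattr(name: str) -> Any:
    """Emit a DeprecationWarning, then proxy to the internal module."""
    warnings.warn(
        "Importing from `pkg.common` is deprecated.",
        DeprecationWarning,
        stacklevel=2,
    )
    try:
        return getattr(internal, name)
    except AttributeError:
        # Old underscored names may map to unprefixed internal names.
        return getattr(internal, name.removeprefix("_"))


shim.__getattr__ = _deprecated_getattr  # honored by module attribute lookup

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert shim._process() == "processed"  # resolved via the removeprefix fallback
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

The hook only fires when normal attribute lookup fails, so it adds no overhead for attributes the module defines directly.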
--------------------------------------------------------------------------------
/src/griffe_pydantic/dynamic.py:
--------------------------------------------------------------------------------
1 | """Deprecated. Import from `griffe_pydantic` directly instead."""
2 |
3 | import warnings
4 | from typing import Any
5 |
6 | from griffe_pydantic._internal import dynamic
7 |
8 | # YORE: Bump 2: Remove file.
9 |
10 |
11 | def __getattr__(name: str) -> Any:
12 | warnings.warn(
13 |         "Importing from `griffe_pydantic.dynamic` is deprecated. Import from `griffe_pydantic` directly instead.",
14 | DeprecationWarning,
15 | stacklevel=2,
16 | )
17 | try:
18 | return getattr(dynamic, name)
19 | except AttributeError:
20 | return getattr(dynamic, name.removeprefix("_"))
21 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/extension.py:
--------------------------------------------------------------------------------
1 | """Deprecated. Import from `griffe_pydantic` directly instead."""
2 |
3 | import warnings
4 | from typing import Any
5 |
6 | from griffe_pydantic._internal import extension
7 |
8 | # YORE: Bump 2: Remove file.
9 |
10 |
11 | def __getattr__(name: str) -> Any:
12 | warnings.warn(
13 |         "Importing from `griffe_pydantic.extension` is deprecated. Import from `griffe_pydantic` directly instead.",
14 | DeprecationWarning,
15 | stacklevel=2,
16 | )
17 | try:
18 | return getattr(extension, name)
19 | except AttributeError:
20 | return getattr(extension, name.removeprefix("_"))
21 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/py.typed:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mkdocstrings/griffe-pydantic/187b4b8d052325b4dcf7a562a78b629f9e70c9b2/src/griffe_pydantic/py.typed
--------------------------------------------------------------------------------
/src/griffe_pydantic/static.py:
--------------------------------------------------------------------------------
1 | """Deprecated. Import from `griffe_pydantic` directly instead."""
2 |
3 | import warnings
4 | from typing import Any
5 |
6 | from griffe_pydantic._internal import static
7 |
8 | # YORE: Bump 2: Remove file.
9 |
10 |
11 | def __getattr__(name: str) -> Any:
12 | warnings.warn(
13 |         "Importing from `griffe_pydantic.static` is deprecated. Import from `griffe_pydantic` directly instead.",
14 | DeprecationWarning,
15 | stacklevel=2,
16 | )
17 | try:
18 | return getattr(static, name)
19 | except AttributeError:
20 | return getattr(static, name.removeprefix("_"))
21 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/templates/material/_base/pydantic_model.html.jinja:
--------------------------------------------------------------------------------
1 | {% extends "_base/class.html.jinja" %}
2 |
3 | {% block contents %}
4 | {% block bases %}{{ super() }}{% endblock %}
5 | {% block docstring %}{{ super() }}{% endblock %}
6 |
7 | {% block schema scoped %}
8 | {% if class.extra.griffe_pydantic.schema %}
9 | Show JSON schema:
10 | {{ class.extra.griffe_pydantic.schema | highlight(language="json") }}
11 |
12 | {% endif %}
13 | {% endblock schema %}
14 |
15 | {% block config scoped %}
16 | {% if class.extra.griffe_pydantic.config %}
17 | Config:
18 |
19 | {% for name, value in class.extra.griffe_pydantic.config.items() %}
20 | {{ name }}
: {{ value|string|highlight(language="python", inline=True) }}
21 | {% endfor %}
22 |
23 | {% endif %}
24 | {% endblock config %}
25 |
26 | {% block fields scoped %}
27 | {% with fields = class.extra.griffe_pydantic.fields() %}
28 | {% if fields %}
29 | Fields:
30 |
31 | {% for name, field in fields.items() %}
32 |
33 | {{ name }}
34 | {% with expression = field.annotation %}
35 | ({% include "expression.html.jinja" with context %}
)
36 | {% endwith %}
37 |
38 | {% endfor %}
39 |
40 | {% endif %}
41 | {% endwith %}
42 | {% endblock fields %}
43 |
44 | {% block validators scoped %}
45 | {% with validators = class.extra.griffe_pydantic.validators() %}
46 | {% if validators %}
47 | Validators:
48 |
49 | {% for name, validator in validators.items() %}
50 |
51 | {{ name }}
52 | {% if validator.extra.griffe_pydantic.targets %}
53 | →
54 | {% for target in validator.extra.griffe_pydantic.targets %}
55 | {{ target.name }}
56 | {%- if not loop.last %}, {% endif %}
57 | {% endfor %}
58 | {% endif %}
59 |
60 | {% endfor %}
61 |
62 | {% endif %}
63 | {% endwith %}
64 | {% endblock validators %}
65 |
66 | {% block source %}{{ super() }}{% endblock %}
67 | {% block children %}{{ super() }}{% endblock %}
68 | {% endblock contents %}
69 |
--------------------------------------------------------------------------------
/src/griffe_pydantic/templates/material/pydantic_model.html.jinja:
--------------------------------------------------------------------------------
1 | {% extends "_base/pydantic_model.html.jinja" %}
2 |
--------------------------------------------------------------------------------
/tests/__init__.py:
--------------------------------------------------------------------------------
1 | """Tests suite for `griffe_pydantic`."""
2 |
3 | from pathlib import Path
4 |
5 | TESTS_DIR = Path(__file__).parent
6 | TMP_DIR = TESTS_DIR / "tmp"
7 | FIXTURES_DIR = TESTS_DIR / "fixtures"
8 |
--------------------------------------------------------------------------------
/tests/conftest.py:
--------------------------------------------------------------------------------
1 | """Configuration for the pytest test suite."""
2 |
3 | from __future__ import annotations
4 |
5 | from collections import ChainMap
6 | from typing import TYPE_CHECKING, Any
7 |
8 | import pytest
9 | from markdown.core import Markdown
10 | from mkdocs.config.defaults import MkDocsConfig
11 |
12 | if TYPE_CHECKING:
13 | from collections.abc import Iterator
14 | from pathlib import Path
15 |
16 | from mkdocs import config
17 | from mkdocstrings_handlers.python.handler import PythonHandler
18 |
19 |
20 | @pytest.fixture(name="mkdocs_conf")
21 | def fixture_mkdocs_conf(request: pytest.FixtureRequest, tmp_path: Path) -> Iterator[config.Config]:
22 | """Yield a MkDocs configuration object."""
23 | conf = MkDocsConfig()
24 | while hasattr(request, "_parent_request") and hasattr(request._parent_request, "_parent_request"):
25 | request = request._parent_request
26 |
27 | conf_dict = {
28 | "site_name": "foo",
29 | "site_url": "https://example.org/",
30 | "site_dir": str(tmp_path),
31 | "plugins": [{"mkdocstrings": {"default_handler": "python"}}],
32 | **getattr(request, "param", {}),
33 | }
34 | # Re-create it manually as a workaround for https://github.com/mkdocs/mkdocs/issues/2289
35 | mdx_configs: dict[str, Any] = dict(ChainMap(*conf_dict.get("markdown_extensions", [])))
36 |
37 | conf.load_dict(conf_dict)
38 | assert conf.validate() == ([], [])
39 |
40 | conf["mdx_configs"] = mdx_configs
41 | conf["markdown_extensions"].insert(0, "toc") # Guaranteed to be added by MkDocs.
42 |
43 | conf = conf["plugins"]["mkdocstrings"].on_config(conf)
44 | conf = conf["plugins"]["autorefs"].on_config(conf)
45 | yield conf
46 | conf["plugins"]["mkdocstrings"].on_post_build(conf)
47 |
48 |
49 | @pytest.fixture(name="python_handler")
50 | def fixture_python_handler(mkdocs_conf: MkDocsConfig) -> PythonHandler:
51 | """Return a PythonHandler instance."""
52 | handlers = mkdocs_conf.plugins["mkdocstrings"].handlers # type: ignore[attr-defined]
53 | handler = handlers.get_handler("python")
54 | handler._update_env(md=Markdown(extensions=["toc"]))
55 | handler.env.filters["convert_markdown"] = lambda *args, **kwargs: str(args) + str(kwargs)
56 | return handler
57 |
--------------------------------------------------------------------------------
/tests/test_api.py:
--------------------------------------------------------------------------------
1 | """Tests for our own API exposition."""
2 |
3 | from __future__ import annotations
4 |
5 | from collections import defaultdict
6 | from pathlib import Path
7 | from typing import TYPE_CHECKING
8 |
9 | import griffe
10 | import pytest
11 | from mkdocstrings import Inventory
12 |
13 | import griffe_pydantic
14 |
15 | if TYPE_CHECKING:
16 | from collections.abc import Iterator
17 |
18 |
19 | @pytest.fixture(name="loader", scope="module")
20 | def _fixture_loader() -> griffe.GriffeLoader:
21 | loader = griffe.GriffeLoader()
22 | loader.load("griffe_pydantic")
23 | loader.resolve_aliases()
24 | return loader
25 |
26 |
27 | @pytest.fixture(name="internal_api", scope="module")
28 | def _fixture_internal_api(loader: griffe.GriffeLoader) -> griffe.Module:
29 | return loader.modules_collection["griffe_pydantic._internal"]
30 |
31 |
32 | @pytest.fixture(name="public_api", scope="module")
33 | def _fixture_public_api(loader: griffe.GriffeLoader) -> griffe.Module:
34 | return loader.modules_collection["griffe_pydantic"]
35 |
36 |
37 | def _yield_public_objects(
38 | obj: griffe.Module | griffe.Class,
39 | *,
40 | modules: bool = False,
41 | modulelevel: bool = True,
42 | inherited: bool = False,
43 | special: bool = False,
44 | ) -> Iterator[griffe.Object | griffe.Alias]:
45 | for member in obj.all_members.values() if inherited else obj.members.values():
46 | try:
47 | if member.is_module:
48 | if member.is_alias or not member.is_public:
49 | continue
50 | if modules:
51 | yield member
52 | yield from _yield_public_objects(
53 | member, # type: ignore[arg-type]
54 | modules=modules,
55 | modulelevel=modulelevel,
56 | inherited=inherited,
57 | special=special,
58 | )
59 | elif member.is_public and (special or not member.is_special):
60 | yield member
61 | else:
62 | continue
63 | if member.is_class and not modulelevel:
64 | yield from _yield_public_objects(
65 | member, # type: ignore[arg-type]
66 | modules=modules,
67 | modulelevel=False,
68 | inherited=inherited,
69 | special=special,
70 | )
71 | except (griffe.AliasResolutionError, griffe.CyclicAliasError):
72 | continue
73 |
74 |
75 | @pytest.fixture(name="modulelevel_internal_objects", scope="module")
76 | def _fixture_modulelevel_internal_objects(internal_api: griffe.Module) -> list[griffe.Object | griffe.Alias]:
77 | return list(_yield_public_objects(internal_api, modulelevel=True))
78 |
79 |
80 | @pytest.fixture(name="internal_objects", scope="module")
81 | def _fixture_internal_objects(internal_api: griffe.Module) -> list[griffe.Object | griffe.Alias]:
82 | return list(_yield_public_objects(internal_api, modulelevel=False, special=True))
83 |
84 |
85 | @pytest.fixture(name="public_objects", scope="module")
86 | def _fixture_public_objects(public_api: griffe.Module) -> list[griffe.Object | griffe.Alias]:
87 | return list(_yield_public_objects(public_api, modulelevel=False, inherited=True, special=True))
88 |
89 |
90 | @pytest.fixture(name="inventory", scope="module")
91 | def _fixture_inventory() -> Inventory:
92 | inventory_file = Path(__file__).parent.parent / "site" / "objects.inv"
93 | if not inventory_file.exists():
94 | raise pytest.skip("The objects inventory is not available.")
95 | with inventory_file.open("rb") as file:
96 | return Inventory.parse_sphinx(file)
97 |
98 |
99 | def test_exposed_objects(modulelevel_internal_objects: list[griffe.Object | griffe.Alias]) -> None:
100 | """All public objects in the internal API are exposed under `griffe_pydantic`."""
101 | not_exposed = [
102 | obj.path
103 | for obj in modulelevel_internal_objects
104 | if obj.name not in griffe_pydantic.__all__ or not hasattr(griffe_pydantic, obj.name)
105 | ]
106 | assert not not_exposed, "Objects not exposed:\n" + "\n".join(sorted(not_exposed))
107 |
108 |
109 | def test_unique_names(modulelevel_internal_objects: list[griffe.Object | griffe.Alias]) -> None:
110 | """All internal objects have unique names."""
111 | names_to_paths = defaultdict(list)
112 | for obj in modulelevel_internal_objects:
113 | names_to_paths[obj.name].append(obj.path)
114 | non_unique = [paths for paths in names_to_paths.values() if len(paths) > 1]
115 | assert not non_unique, "Non-unique names:\n" + "\n".join(str(paths) for paths in non_unique)
116 |
117 |
118 | def test_single_locations(public_api: griffe.Module) -> None:
119 | """All objects have a single public location."""
120 |
121 | def _public_path(obj: griffe.Object | griffe.Alias) -> bool:
122 | return obj.is_public and (obj.parent is None or _public_path(obj.parent))
123 |
124 | multiple_locations = {}
125 | for obj_name in griffe_pydantic.__all__:
126 | obj = public_api[obj_name]
127 | if obj.aliases and (
128 | public_aliases := [path for path, alias in obj.aliases.items() if path != obj.path and _public_path(alias)]
129 | ):
130 | multiple_locations[obj.path] = public_aliases
131 | assert not multiple_locations, "Multiple public locations:\n" + "\n".join(
132 | f"{path}: {aliases}" for path, aliases in multiple_locations.items()
133 | )
134 |
135 |
136 | def test_api_matches_inventory(inventory: Inventory, public_objects: list[griffe.Object | griffe.Alias]) -> None:
137 | """All public objects are added to the inventory."""
138 | ignore_names = {"__getattr__", "__init__", "__repr__", "__str__", "__post_init__"}
139 | not_in_inventory = [
140 | obj.path for obj in public_objects if obj.name not in ignore_names and obj.path not in inventory
141 | ]
142 | msg = "Objects not in the inventory (try running `make run mkdocs build`):\n{paths}"
143 | assert not not_in_inventory, msg.format(paths="\n".join(sorted(not_in_inventory)))
144 |
145 |
146 | def test_inventory_matches_api(
147 | inventory: Inventory,
148 | public_objects: list[griffe.Object | griffe.Alias],
149 | loader: griffe.GriffeLoader,
150 | ) -> None:
151 | """The inventory doesn't contain any additional Python object."""
152 | # YORE: Bump 2: Remove block.
153 | ignore = ("model_ext", "model_noext")
154 | ignore_paths = {
155 | "griffe_pydantic.common",
156 | "griffe_pydantic.extension",
157 | "griffe_pydantic.dynamic",
158 | "griffe_pydantic.static",
159 | }
160 |
161 | not_in_api = []
162 | public_api_paths = {obj.path for obj in public_objects}
163 | public_api_paths.add("griffe_pydantic")
164 | for item in inventory.values():
165 | # YORE: Bump 2: Remove block.
166 | if item.name.startswith(ignore) or item.name in ignore_paths:
167 | continue
168 |
169 | if item.domain == "py" and "(" not in item.name:
170 | obj = loader.modules_collection[item.name]
171 | if obj.path not in public_api_paths and not any(path in public_api_paths for path in obj.aliases):
172 | not_in_api.append(item.name)
173 | msg = "Inventory objects not in public API (try running `make run mkdocs build`):\n{paths}"
174 | assert not not_in_api, msg.format(paths="\n".join(sorted(not_in_api)))
175 |
176 |
177 | def test_no_module_docstrings_in_internal_api(internal_api: griffe.Module) -> None:
178 | """No module docstrings should be written in our internal API.
179 |
180 | The reasoning is that docstrings are addressed to users of the public API,
181 | but internal modules are not exposed to users, so they should not have docstrings.
182 | """
183 |
184 | def _modules(obj: griffe.Module) -> Iterator[griffe.Module]:
185 | for member in obj.modules.values():
186 | yield member
187 | yield from _modules(member)
188 |
189 | for obj in _modules(internal_api):
190 | assert not obj.docstring
191 |
--------------------------------------------------------------------------------
/tests/test_extension.py:
--------------------------------------------------------------------------------
1 | """Tests for the `extension` module."""
2 |
3 | from __future__ import annotations
4 |
5 | import logging
6 | from typing import TYPE_CHECKING
7 |
8 | import pytest
9 | from griffe import Extensions, temporary_inspected_package, temporary_visited_package
10 |
11 | from griffe_pydantic._internal.extension import PydanticExtension
12 |
13 | if TYPE_CHECKING:
14 | from mkdocstrings_handlers.python.handler import PythonHandler
15 |
16 |
17 | code = """
18 | from pydantic import field_validator, ConfigDict, BaseModel, Field
19 |
20 |
21 | class ExampleParentModel(BaseModel):
22 | '''An example parent model.'''
23 | parent_field: str = Field(..., description="Parent field.")
24 |
25 |
26 | class ExampleModel(ExampleParentModel):
27 | '''An example child model.'''
28 |
29 | model_config = ConfigDict(frozen=False)
30 |
31 | field_without_default: str
32 | '''Shows the *[Required]* marker in the signature.'''
33 |
34 | field_plain_with_validator: int = 100
35 | '''Show standard field with type annotation.'''
36 |
37 | field_with_validator_and_alias: str = Field("FooBar", alias="BarFoo", validation_alias="BarFoo")
38 | '''Shows corresponding validator with link/anchor.'''
39 |
40 | field_with_constraints_and_description: int = Field(
41 | default=5, ge=0, le=100, description="Shows constraints within doc string."
42 | )
43 |
44 | @field_validator("field_with_validator_and_alias", "field_plain_with_validator", mode="before")
45 | @classmethod
46 | def check_max_length_ten(cls, v):
47 | '''Show corresponding field with link/anchor.'''
48 | if len(v) >= 10:
49 | raise ValueError("No more than 10 characters allowed")
50 | return v
51 |
52 | def regular_method(self):
53 | pass
54 |
55 |
56 | class RegularClass(object):
57 | regular_attr = 1
58 | """
59 |
60 |
61 | @pytest.mark.parametrize("analysis", ["static", "dynamic"])
62 | def test_extension(analysis: str) -> None:
63 | """Test the extension."""
64 | loader = {"static": temporary_visited_package, "dynamic": temporary_inspected_package}[analysis]
65 | with loader(
66 | "package",
67 | modules={"__init__.py": code},
68 | extensions=Extensions(PydanticExtension(schema=True)),
69 | ) as package:
70 | assert package
71 |
72 | assert "ExampleParentModel" in package.classes
73 | assert package.classes["ExampleParentModel"].labels == {"pydantic-model"}
74 |
75 | assert "ExampleModel" in package.classes
76 | assert package.classes["ExampleModel"].labels == {"pydantic-model"}
77 |
78 | config = package.classes["ExampleModel"].extra["griffe_pydantic"]["config"]
79 | assert config == {"frozen": False}
80 |
81 | schema = package.classes["ExampleModel"].extra["griffe_pydantic"]["schema"]
82 | assert schema.startswith('{\n "description"')
83 |
84 |
85 | def test_imported_models() -> None:
86 | """Test the extension with imported models."""
87 | with temporary_visited_package(
88 | "package",
89 | modules={
90 | "__init__.py": "from ._private import MyModel\n\n__all__ = ['MyModel']",
91 | "_private.py": "from pydantic import BaseModel\n\nclass MyModel(BaseModel):\n field1: str\n '''Some field.'''\n",
92 | },
93 | extensions=Extensions(PydanticExtension(schema=False)),
94 | ) as package:
95 | assert package["MyModel"].labels == {"pydantic-model"}
96 | assert package["MyModel.field1"].labels == {"pydantic-field"}
97 |
98 |
99 | def test_rendering_model_config_using_configdict(python_handler: PythonHandler) -> None:
100 | """Test the extension with model config using ConfigDict."""
101 | code = """
102 | from pydantic import BaseModel, ConfigDict, Field
103 |
104 | class Model(BaseModel):
105 | usage: str | None = Field(
106 | None,
107 | description="Some description.",
108 | example="Some example.",
109 | )
110 | model_config = ConfigDict(
111 | json_schema_extra={
112 | "example": {
113 | "usage": "Some usage.",
114 | "limitations": "Some limitations.",
115 | "billing": "Some value.",
116 | "notice_period": "Some value.",
117 | }
118 | }
119 | )
120 | """
121 | with temporary_visited_package(
122 | "package",
123 | modules={"__init__.py": code},
124 | extensions=Extensions(PydanticExtension(schema=False)),
125 | ) as package:
126 | python_handler.render(package["Model"], python_handler.get_options({})) # Assert no errors.
127 |
128 |
129 | def test_not_crashing_on_dynamic_field_description(caplog: pytest.LogCaptureFixture) -> None:
130 | """Test the extension with dynamic field description."""
131 | code = """
132 | import pydantic
133 |
134 | desc = "xyz"
135 |
136 | class TestModel(pydantic.BaseModel):
137 | abc: str = pydantic.Field(description=desc)
138 | """
139 | with (
140 | caplog.at_level(logging.DEBUG),
141 | temporary_visited_package(
142 | "package",
143 | modules={"__init__.py": code},
144 | extensions=Extensions(PydanticExtension(schema=False)),
145 | ),
146 | ):
147 | assert any(
148 | record.levelname == "DEBUG" and "field 'package.TestModel.abc' as literal" in record.message
149 | for record in caplog.records
150 | )
151 |
152 |
153 | def test_ignore_classvars() -> None:
154 | """Test the extension ignores class variables."""
155 | code = """
156 | from pydantic import BaseModel
157 | from typing import ClassVar
158 |
159 | class Model(BaseModel):
160 | field: str
161 | class_var: ClassVar[int] = 1
162 | """
163 | with temporary_visited_package(
164 | "package",
165 | modules={"__init__.py": code},
166 | extensions=Extensions(PydanticExtension(schema=False)),
167 | ) as package:
168 | assert "pydantic-field" not in package["Model.class_var"].labels
169 | assert "class-attribute" in package["Model.class_var"].labels
170 |
171 |
172 | def test_wildcard_field_validator() -> None:
173 | """Test field validator that works on all fields."""
174 | code = """
175 | from pydantic import BaseModel, field_validator
176 |
177 | class Schema(BaseModel):
178 | a: int
179 | b: int
180 |
181 | @field_validator('*', mode='before')
182 | @classmethod
183 |     def set_if_none(cls, v, info):
184 | ...
185 | """
186 | with temporary_visited_package(
187 | "package",
188 | modules={"__init__.py": code},
189 | extensions=Extensions(PydanticExtension(schema=False)),
190 | ) as package:
191 | validator = package["Schema.set_if_none"]
192 | assert validator.labels == {"pydantic-validator"}
193 | assert validator in package["Schema.a"].extra["griffe_pydantic"]["validators"]
194 | assert validator in package["Schema.b"].extra["griffe_pydantic"]["validators"]
195 |
196 |
197 | def test_ignoring_properties() -> None:
198 | """Properties are not fields and must be ignored."""
199 | code = """
200 | from pydantic import BaseModel, Field
201 |
202 | class Base(BaseModel):
203 |     @property
204 |     def a(self) -> int:
205 |         return 0
206 |
207 | class Model(Base):
208 |     b: int = Field(default=1)
209 | """
210 | with temporary_visited_package(
211 | "package",
212 | modules={"__init__.py": code},
213 | extensions=Extensions(PydanticExtension(schema=False)),
214 | ) as package:
215 | assert "pydantic-field" not in package["Model.a"].labels
216 |
217 |
218 | def test_process_non_model_base_class_fields() -> None:
219 | """Fields in a non-model base class must be processed."""
220 | code = """
221 | from pydantic import BaseModel
222 |
223 | class A:
224 | a: int = 0
225 |
226 | class B(BaseModel, A):
227 | b: int = 1
228 | """
229 | with temporary_visited_package(
230 | "package",
231 | modules={"__init__.py": code},
232 | extensions=Extensions(PydanticExtension(schema=False)),
233 | ) as package:
234 | assert "pydantic-field" in package["B.a"].labels
235 |
--------------------------------------------------------------------------------