├── .gitattributes ├── .github └── workflows │ └── deploy.yml ├── .gitignore ├── .python-version ├── Dockerfile ├── LICENSE ├── Makefile ├── Pipfile ├── Pipfile.lock ├── README.md ├── docker-compose.yml ├── entrypoint.sh ├── source ├── _static │ ├── fail.png │ ├── fail.svg │ ├── jsonschema.css │ ├── jsonschema.js │ ├── logo.ico │ ├── logo.pdf │ ├── logo.png │ ├── octopus.png │ ├── octopus.svg │ ├── pass.png │ ├── pass.svg │ └── schema.png ├── _templates │ └── layout.html ├── about.rst ├── basics.rst ├── conf.py ├── conventions.rst ├── credits.rst ├── index.rst ├── reference │ ├── array.rst │ ├── boolean.rst │ ├── combining.rst │ ├── conditionals.rst │ ├── generic.rst │ ├── index.rst │ ├── non_json_data.rst │ ├── null.rst │ ├── numeric.rst │ ├── object.rst │ ├── regular_expressions.rst │ ├── schema.rst │ ├── string.rst │ └── type.rst ├── sphinxext │ ├── __init__.py │ ├── jsonschemaext.py │ └── tab.py └── structuring.rst └── texlive.profile /.gitattributes: -------------------------------------------------------------------------------- 1 | * text=auto 2 | 3 | *.sh text eol=lf 4 | -------------------------------------------------------------------------------- /.github/workflows/deploy.yml: -------------------------------------------------------------------------------- 1 | on: 2 | push: 3 | branches: 4 | - master 5 | 6 | jobs: 7 | deploy: 8 | runs-on: ubuntu-latest 9 | name: Deploy 10 | steps: 11 | - name: Checkout repo 12 | uses: actions/checkout@v2 13 | - run: docker-compose pull 14 | - uses: satackey/action-docker-layer-caching@v0.0.11 15 | continue-on-error: true 16 | - run: docker-compose build 17 | - name: Build site 18 | run: docker-compose run web ls -l build/html 19 | - name: Deploy to Github Pages 20 | uses: JamesIves/github-pages-deploy-action@4.1.4 21 | with: 22 | branch: gh-pages 23 | folder: build/html 24 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | build 2 | *.pyc 3 | texlive 4 | install-tl* 5 | install-tl-unx.tar.gz -------------------------------------------------------------------------------- /.python-version: -------------------------------------------------------------------------------- 1 | 3.8 2 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM fedora:37 2 | 3 | # Install system dependencies 4 | RUN dnf update -y 5 | RUN dnf install -y make latexmk texlive pipenv python3.8 6 | RUN dnf install -y 'tex(fncychap.sty)' \ 7 | 'tex(tabulary.sty)' \ 8 | 'tex(framed.sty)' \ 9 | 'tex(wrapfig.sty)' \ 10 | 'tex(upquote.sty)' \ 11 | 'tex(capt-of.sty)' \ 12 | 'tex(needspace.sty)' \ 13 | 'tex(overlock.sty)' \ 14 | 'tex(inconsolata.sty)' \ 15 | 'tex(bbding.sty)' \ 16 | 'tex(mdframed.sty)' \ 17 | 'tex(bbding10.pfb)' 18 | RUN texhash 19 | RUN alternatives --install /usr/bin/python python /usr/bin/python3.8 1 20 | 21 | COPY . /code 22 | WORKDIR /code 23 | 24 | # Install Python dependencies 25 | RUN pipenv install --system 26 | 27 | # Build website 28 | ENTRYPOINT ["./entrypoint.sh"] 29 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright (c) 2013, Space Telescope Science Institute 2 | All rights reserved. 
3 | 4 | Redistribution and use in source and binary forms, with or without modification, 5 | are permitted provided that the following conditions are met: 6 | 7 | Redistributions of source code must retain the above copyright notice, this 8 | list of conditions and the following disclaimer. 9 | 10 | Redistributions in binary form must reproduce the above copyright notice, this 11 | list of conditions and the following disclaimer in the documentation and/or 12 | other materials provided with the distribution. 13 | 14 | Neither the name of the Space Telescope Science Institute nor the 15 | names of its contributors may be used to endorse or promote products 16 | derived from this software without specific prior written 17 | permission. 18 | 19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND 20 | ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 21 | WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 22 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR 23 | ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES 24 | (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 25 | LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON 26 | ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT 27 | (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS 28 | SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 29 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | # Makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line. 5 | SPHINXOPTS = 6 | SPHINXBUILD = sphinx-build 7 | PAPER = 8 | BUILDDIR = build 9 | 10 | # User-friendly check for sphinx-build 11 | ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) 12 | $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) 13 | endif 14 | 15 | # Internal variables. 
16 | PAPEROPT_a4 = -D latex_paper_size=a4
17 | PAPEROPT_letter = -D latex_paper_size=letter
18 | ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
19 | # the i18n builder cannot share the environment and doctrees with the others
20 | I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
21 |
22 | .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
23 |
24 | help:
25 | 	@echo "Please use \`make <target>' where <target> is one of"
26 | 	@echo " html to make standalone HTML files"
27 | 	@echo " dirhtml to make HTML files named index.html in directories"
28 | 	@echo " singlehtml to make a single large HTML file"
29 | 	@echo " pickle to make pickle files"
30 | 	@echo " json to make JSON files"
31 | 	@echo " htmlhelp to make HTML files and a HTML help project"
32 | 	@echo " qthelp to make HTML files and a qthelp project"
33 | 	@echo " devhelp to make HTML files and a Devhelp project"
34 | 	@echo " epub to make an epub"
35 | 	@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
36 | 	@echo " latexpdf to make LaTeX files and run them through pdflatex"
37 | 	@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
38 | 	@echo " text to make text files"
39 | 	@echo " man to make manual pages"
40 | 	@echo " texinfo to make Texinfo files"
41 | 	@echo " info to make Texinfo files and run them through makeinfo"
42 | 	@echo " gettext to make PO message catalogs"
43 | 	@echo " changes to make an overview of all changed/added/deprecated items"
44 | 	@echo " xml to make Docutils-native XML files"
45 | 	@echo " pseudoxml to make pseudoxml-XML files for display purposes"
46 | 	@echo " linkcheck to check all external links for integrity"
47 | 	@echo " doctest to run all doctests embedded in the documentation (if enabled)"
48 |
49 | clean:
50 | 	rm -rf $(BUILDDIR)/*
51 |
52 | html:
53 | 	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
54 | 	@echo
55 | 	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
56 |
57 | dirhtml:
58 | 	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
59 | 	@echo
60 | 	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
61 |
62 | singlehtml:
63 | 	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
64 | 	@echo
65 | 	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
66 |
67 | pickle:
68 | 	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
69 | 	@echo
70 | 	@echo "Build finished; now you can process the pickle files."
71 |
72 | json:
73 | 	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
74 | 	@echo
75 | 	@echo "Build finished; now you can process the JSON files."
76 |
77 | htmlhelp:
78 | 	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
79 | 	@echo
80 | 	@echo "Build finished; now you can run HTML Help Workshop with the" \
81 | 	".hhp project file in $(BUILDDIR)/htmlhelp."
82 |
83 | qthelp:
84 | 	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
85 | 	@echo
86 | 	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
87 | 	".qhcp project file in $(BUILDDIR)/qthelp, like this:"
88 | 	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/UnderstandingJSONSchema.qhcp"
89 | 	@echo "To view the help file:"
90 | 	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/UnderstandingJSONSchema.qhc"
91 |
92 | devhelp:
93 | 	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
94 | 	@echo
95 | 	@echo "Build finished."
96 | @echo "To view the help file:" 97 | @echo "# mkdir -p $$HOME/.local/share/devhelp/UnderstandingJSONSchema" 98 | @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/UnderstandingJSONSchema" 99 | @echo "# devhelp" 100 | 101 | epub: 102 | $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub 103 | @echo 104 | @echo "Build finished. The epub file is in $(BUILDDIR)/epub." 105 | 106 | latex: 107 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 108 | @echo 109 | @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." 110 | @echo "Run \`make' in that directory to run these through (pdf)latex" \ 111 | "(use \`make latexpdf' here to do that automatically)." 112 | 113 | latexpdf: 114 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 115 | @echo "Running LaTeX files through pdflatex..." 116 | $(MAKE) -C $(BUILDDIR)/latex all-pdf 117 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 118 | 119 | latexpdfja: 120 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 121 | @echo "Running LaTeX files through platex and dvipdfmx..." 122 | $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja 123 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 124 | 125 | text: 126 | $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text 127 | @echo 128 | @echo "Build finished. The text files are in $(BUILDDIR)/text." 129 | 130 | man: 131 | $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man 132 | @echo 133 | @echo "Build finished. The manual pages are in $(BUILDDIR)/man." 134 | 135 | texinfo: 136 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 137 | @echo 138 | @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." 139 | @echo "Run \`make' in that directory to run these through makeinfo" \ 140 | "(use \`make info' here to do that automatically)." 141 | 142 | info: 143 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 144 | @echo "Running Texinfo files through makeinfo..." 145 | make -C $(BUILDDIR)/texinfo info 146 | @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." 147 | 148 | gettext: 149 | $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale 150 | @echo 151 | @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." 152 | 153 | changes: 154 | $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes 155 | @echo 156 | @echo "The overview file is in $(BUILDDIR)/changes." 157 | 158 | linkcheck: 159 | $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck 160 | @echo 161 | @echo "Link check complete; look for any errors in the above output " \ 162 | "or in $(BUILDDIR)/linkcheck/output.txt." 163 | 164 | doctest: 165 | $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest 166 | @echo "Testing of doctests in the sources finished, look at the " \ 167 | "results in $(BUILDDIR)/doctest/output.txt." 168 | 169 | xml: 170 | $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml 171 | @echo 172 | @echo "Build finished. The XML files are in $(BUILDDIR)/xml." 173 | 174 | pseudoxml: 175 | $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml 176 | @echo 177 | @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." 
178 | -------------------------------------------------------------------------------- /Pipfile: -------------------------------------------------------------------------------- 1 | [[source]] 2 | url = "https://pypi.org/simple" 3 | verify_ssl = true 4 | name = "pypi" 5 | 6 | [packages] 7 | sphinx = "<2.0.0" 8 | sphinx-bootstrap-theme = "*" 9 | jsonschema = "*" 10 | jschon = "0.7.3" 11 | 12 | [dev-packages] 13 | 14 | [requires] 15 | python_version = "3.8" 16 | -------------------------------------------------------------------------------- /Pipfile.lock: -------------------------------------------------------------------------------- 1 | { 2 | "_meta": { 3 | "hash": { 4 | "sha256": "f52f5d469be4f12821db40fdf85dde48e45850d38e45c4d8473142410b640dd5" 5 | }, 6 | "pipfile-spec": 6, 7 | "requires": { 8 | "python_version": "3.8" 9 | }, 10 | "sources": [ 11 | { 12 | "name": "pypi", 13 | "url": "https://pypi.org/simple", 14 | "verify_ssl": true 15 | } 16 | ] 17 | }, 18 | "default": { 19 | "alabaster": { 20 | "hashes": [ 21 | "sha256:446438bdcca0e05bd45ea2de1668c1d9b032e1a9154c2c259092d77031ddd359", 22 | "sha256:a661d72d58e6ea8a57f7a86e37d86716863ee5e92788398526d58b26a4e4dc02" 23 | ], 24 | "version": "==0.7.12" 25 | }, 26 | "attrs": { 27 | "hashes": [ 28 | "sha256:149e90d6d8ac20db7a955ad60cf0e6881a3f20d37096140088356da6c716b0b1", 29 | "sha256:ef6aaac3ca6cd92904cdd0d83f629a15f18053ec84e6432106f7a4d04ae4f5fb" 30 | ], 31 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 32 | "version": "==21.2.0" 33 | }, 34 | "babel": { 35 | "hashes": [ 36 | "sha256:ab49e12b91d937cd11f0b67cb259a57ab4ad2b59ac7a3b41d6c06c0ac5b0def9", 37 | "sha256:bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0" 38 | ], 39 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 40 | "version": "==2.9.1" 41 | }, 42 | "certifi": { 43 | "hashes": [ 44 | "sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872", 45 | "sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569" 46 | ], 47 | "version": "==2021.10.8" 48 | }, 49 | "charset-normalizer": { 50 | "hashes": [ 51 | "sha256:1eecaa09422db5be9e29d7fc65664e6c33bd06f9ced7838578ba40d58bdf3721", 52 | "sha256:b0b883e8e874edfdece9c28f314e3dd5badf067342e42fb162203335ae61aa2c" 53 | ], 54 | "markers": "python_version >= '3'", 55 | "version": "==2.0.9" 56 | }, 57 | "docutils": { 58 | "hashes": [ 59 | "sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125", 60 | "sha256:cf316c8370a737a022b72b56874f6602acf974a37a9fba42ec2876387549fc61" 61 | ], 62 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 63 | "version": "==0.17.1" 64 | }, 65 | "idna": { 66 | "hashes": [ 67 | "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff", 68 | "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d" 69 | ], 70 | "markers": "python_version >= '3'", 71 | "version": "==3.3" 72 | }, 73 | "imagesize": { 74 | "hashes": [ 75 | "sha256:1db2f82529e53c3e929e8926a1fa9235aa82d0bd0c580359c67ec31b2fddaa8c", 76 | "sha256:cd1750d452385ca327479d45b64d9c7729ecf0b3969a58148298c77092261f9d" 77 | ], 78 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 79 | "version": "==1.3.0" 80 | }, 81 | "importlib-resources": { 82 | "hashes": [ 83 | "sha256:33a95faed5fc19b4bc16b29a6eeae248a3fe69dd55d4d229d2b480e23eeaad45", 84 | "sha256:d756e2f85dd4de2ba89be0b21dba2a3bbec2e871a42a3a16719258a11f87506b" 85 | 
], 86 | "markers": "python_version < '3.9'", 87 | "version": "==5.4.0" 88 | }, 89 | "jinja2": { 90 | "hashes": [ 91 | "sha256:077ce6014f7b40d03b47d1f1ca4b0fc8328a692bd284016f806ed0eaca390ad8", 92 | "sha256:611bb273cd68f3b993fabdc4064fc858c5b47a973cb5aa7999ec1ba405c87cd7" 93 | ], 94 | "markers": "python_version >= '3.6'", 95 | "version": "==3.0.3" 96 | }, 97 | "jschon": { 98 | "hashes": [ 99 | "sha256:44444038c34684bdfa66ce4ece6ac4b3b49f9fcd6a43d952557f445c89790147", 100 | "sha256:8307ef60fefe3e586d45bc3683a9edf5465d07c79df1ee8b9e2fcfcd2b86a7ad" 101 | ], 102 | "index": "pypi", 103 | "version": "==0.7.3" 104 | }, 105 | "jsonschema": { 106 | "hashes": [ 107 | "sha256:0070ca8dd5bf47941d1e9d8bc115a3654b1138cfb8aff44f3e3527276107314f", 108 | "sha256:91ffbad994d766041c6003d5f8f475cceb890c30084bd0e64847ccb1c10e48bb" 109 | ], 110 | "index": "pypi", 111 | "version": "==4.3.1" 112 | }, 113 | "markupsafe": { 114 | "hashes": [ 115 | "sha256:01a9b8ea66f1658938f65b93a85ebe8bc016e6769611be228d797c9d998dd298", 116 | "sha256:023cb26ec21ece8dc3907c0e8320058b2e0cb3c55cf9564da612bc325bed5e64", 117 | "sha256:0446679737af14f45767963a1a9ef7620189912317d095f2d9ffa183a4d25d2b", 118 | "sha256:04635854b943835a6ea959e948d19dcd311762c5c0c6e1f0e16ee57022669194", 119 | "sha256:0717a7390a68be14b8c793ba258e075c6f4ca819f15edfc2a3a027c823718567", 120 | "sha256:0955295dd5eec6cb6cc2fe1698f4c6d84af2e92de33fbcac4111913cd100a6ff", 121 | "sha256:0d4b31cc67ab36e3392bbf3862cfbadac3db12bdd8b02a2731f509ed5b829724", 122 | "sha256:10f82115e21dc0dfec9ab5c0223652f7197feb168c940f3ef61563fc2d6beb74", 123 | "sha256:168cd0a3642de83558a5153c8bd34f175a9a6e7f6dc6384b9655d2697312a646", 124 | "sha256:1d609f577dc6e1aa17d746f8bd3c31aa4d258f4070d61b2aa5c4166c1539de35", 125 | "sha256:1f2ade76b9903f39aa442b4aadd2177decb66525062db244b35d71d0ee8599b6", 126 | "sha256:20dca64a3ef2d6e4d5d615a3fd418ad3bde77a47ec8a23d984a12b5b4c74491a", 127 | "sha256:2a7d351cbd8cfeb19ca00de495e224dea7e7d919659c2841bbb7f420ad03e2d6", 128 | "sha256:2d7d807855b419fc2ed3e631034685db6079889a1f01d5d9dac950f764da3dad", 129 | "sha256:2ef54abee730b502252bcdf31b10dacb0a416229b72c18b19e24a4509f273d26", 130 | "sha256:36bc903cbb393720fad60fc28c10de6acf10dc6cc883f3e24ee4012371399a38", 131 | "sha256:37205cac2a79194e3750b0af2a5720d95f786a55ce7df90c3af697bfa100eaac", 132 | "sha256:3c112550557578c26af18a1ccc9e090bfe03832ae994343cfdacd287db6a6ae7", 133 | "sha256:3dd007d54ee88b46be476e293f48c85048603f5f516008bee124ddd891398ed6", 134 | "sha256:4296f2b1ce8c86a6aea78613c34bb1a672ea0e3de9c6ba08a960efe0b0a09047", 135 | "sha256:47ab1e7b91c098ab893b828deafa1203de86d0bc6ab587b160f78fe6c4011f75", 136 | "sha256:49e3ceeabbfb9d66c3aef5af3a60cc43b85c33df25ce03d0031a608b0a8b2e3f", 137 | "sha256:4dc8f9fb58f7364b63fd9f85013b780ef83c11857ae79f2feda41e270468dd9b", 138 | "sha256:4efca8f86c54b22348a5467704e3fec767b2db12fc39c6d963168ab1d3fc9135", 139 | "sha256:53edb4da6925ad13c07b6d26c2a852bd81e364f95301c66e930ab2aef5b5ddd8", 140 | "sha256:5855f8438a7d1d458206a2466bf82b0f104a3724bf96a1c781ab731e4201731a", 141 | "sha256:594c67807fb16238b30c44bdf74f36c02cdf22d1c8cda91ef8a0ed8dabf5620a", 142 | "sha256:5b6d930f030f8ed98e3e6c98ffa0652bdb82601e7a016ec2ab5d7ff23baa78d1", 143 | "sha256:5bb28c636d87e840583ee3adeb78172efc47c8b26127267f54a9c0ec251d41a9", 144 | "sha256:60bf42e36abfaf9aff1f50f52644b336d4f0a3fd6d8a60ca0d054ac9f713a864", 145 | "sha256:611d1ad9a4288cf3e3c16014564df047fe08410e628f89805e475368bd304914", 146 | "sha256:6300b8454aa6930a24b9618fbb54b5a68135092bc666f7b06901f897fa5c2fee", 147 | 
"sha256:63f3268ba69ace99cab4e3e3b5840b03340efed0948ab8f78d2fd87ee5442a4f", 148 | "sha256:6557b31b5e2c9ddf0de32a691f2312a32f77cd7681d8af66c2692efdbef84c18", 149 | "sha256:693ce3f9e70a6cf7d2fb9e6c9d8b204b6b39897a2c4a1aa65728d5ac97dcc1d8", 150 | "sha256:6a7fae0dd14cf60ad5ff42baa2e95727c3d81ded453457771d02b7d2b3f9c0c2", 151 | "sha256:6c4ca60fa24e85fe25b912b01e62cb969d69a23a5d5867682dd3e80b5b02581d", 152 | "sha256:6fcf051089389abe060c9cd7caa212c707e58153afa2c649f00346ce6d260f1b", 153 | "sha256:7d91275b0245b1da4d4cfa07e0faedd5b0812efc15b702576d103293e252af1b", 154 | "sha256:89c687013cb1cd489a0f0ac24febe8c7a666e6e221b783e53ac50ebf68e45d86", 155 | "sha256:8d206346619592c6200148b01a2142798c989edcb9c896f9ac9722a99d4e77e6", 156 | "sha256:905fec760bd2fa1388bb5b489ee8ee5f7291d692638ea5f67982d968366bef9f", 157 | "sha256:97383d78eb34da7e1fa37dd273c20ad4320929af65d156e35a5e2d89566d9dfb", 158 | "sha256:984d76483eb32f1bcb536dc27e4ad56bba4baa70be32fa87152832cdd9db0833", 159 | "sha256:99df47edb6bda1249d3e80fdabb1dab8c08ef3975f69aed437cb69d0a5de1e28", 160 | "sha256:9f02365d4e99430a12647f09b6cc8bab61a6564363f313126f775eb4f6ef798e", 161 | "sha256:a30e67a65b53ea0a5e62fe23682cfe22712e01f453b95233b25502f7c61cb415", 162 | "sha256:ab3ef638ace319fa26553db0624c4699e31a28bb2a835c5faca8f8acf6a5a902", 163 | "sha256:aca6377c0cb8a8253e493c6b451565ac77e98c2951c45f913e0b52facdcff83f", 164 | "sha256:add36cb2dbb8b736611303cd3bfcee00afd96471b09cda130da3581cbdc56a6d", 165 | "sha256:b2f4bf27480f5e5e8ce285a8c8fd176c0b03e93dcc6646477d4630e83440c6a9", 166 | "sha256:b7f2d075102dc8c794cbde1947378051c4e5180d52d276987b8d28a3bd58c17d", 167 | "sha256:baa1a4e8f868845af802979fcdbf0bb11f94f1cb7ced4c4b8a351bb60d108145", 168 | "sha256:be98f628055368795d818ebf93da628541e10b75b41c559fdf36d104c5787066", 169 | "sha256:bf5d821ffabf0ef3533c39c518f3357b171a1651c1ff6827325e4489b0e46c3c", 170 | "sha256:c47adbc92fc1bb2b3274c4b3a43ae0e4573d9fbff4f54cd484555edbf030baf1", 171 | "sha256:cdfba22ea2f0029c9261a4bd07e830a8da012291fbe44dc794e488b6c9bb353a", 172 | "sha256:d6c7ebd4e944c85e2c3421e612a7057a2f48d478d79e61800d81468a8d842207", 173 | "sha256:d7f9850398e85aba693bb640262d3611788b1f29a79f0c93c565694658f4071f", 174 | "sha256:d8446c54dc28c01e5a2dbac5a25f071f6653e6e40f3a8818e8b45d790fe6ef53", 175 | "sha256:deb993cacb280823246a026e3b2d81c493c53de6acfd5e6bfe31ab3402bb37dd", 176 | "sha256:e0f138900af21926a02425cf736db95be9f4af72ba1bb21453432a07f6082134", 177 | "sha256:e9936f0b261d4df76ad22f8fee3ae83b60d7c3e871292cd42f40b81b70afae85", 178 | "sha256:f0567c4dc99f264f49fe27da5f735f414c4e7e7dd850cfd8e69f0862d7c74ea9", 179 | "sha256:f5653a225f31e113b152e56f154ccbe59eeb1c7487b39b9d9f9cdb58e6c79dc5", 180 | "sha256:f826e31d18b516f653fe296d967d700fddad5901ae07c622bb3705955e1faa94", 181 | "sha256:f8ba0e8349a38d3001fae7eadded3f6606f0da5d748ee53cc1dab1d6527b9509", 182 | "sha256:f9081981fe268bd86831e5c75f7de206ef275defcb82bc70740ae6dc507aee51", 183 | "sha256:fa130dd50c57d53368c9d59395cb5526eda596d3ffe36666cd81a44d56e48872" 184 | ], 185 | "markers": "python_version >= '3.6'", 186 | "version": "==2.0.1" 187 | }, 188 | "packaging": { 189 | "hashes": [ 190 | "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb", 191 | "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522" 192 | ], 193 | "markers": "python_version >= '3.6'", 194 | "version": "==21.3" 195 | }, 196 | "pygments": { 197 | "hashes": [ 198 | "sha256:b8e67fe6af78f492b3c4b3e2970c0624cbf08beb1e493b2c99b9fa1b67a20380", 199 | 
"sha256:f398865f7eb6874156579fdf36bc840a03cab64d1cde9e93d68f46a425ec52c6" 200 | ], 201 | "markers": "python_version >= '3.5'", 202 | "version": "==2.10.0" 203 | }, 204 | "pyparsing": { 205 | "hashes": [ 206 | "sha256:04ff808a5b90911829c55c4e26f75fa5ca8a2f5f36aa3a51f68e27033341d3e4", 207 | "sha256:d9bdec0013ef1eb5a84ab39a3b3868911598afa494f5faa038647101504e2b81" 208 | ], 209 | "markers": "python_version >= '3.6'", 210 | "version": "==3.0.6" 211 | }, 212 | "pyrsistent": { 213 | "hashes": [ 214 | "sha256:097b96f129dd36a8c9e33594e7ebb151b1515eb52cceb08474c10a5479e799f2", 215 | "sha256:2aaf19dc8ce517a8653746d98e962ef480ff34b6bc563fc067be6401ffb457c7", 216 | "sha256:404e1f1d254d314d55adb8d87f4f465c8693d6f902f67eb6ef5b4526dc58e6ea", 217 | "sha256:48578680353f41dca1ca3dc48629fb77dfc745128b56fc01096b2530c13fd426", 218 | "sha256:4916c10896721e472ee12c95cdc2891ce5890898d2f9907b1b4ae0f53588b710", 219 | "sha256:527be2bfa8dc80f6f8ddd65242ba476a6c4fb4e3aedbf281dfbac1b1ed4165b1", 220 | "sha256:58a70d93fb79dc585b21f9d72487b929a6fe58da0754fa4cb9f279bb92369396", 221 | "sha256:5e4395bbf841693eaebaa5bb5c8f5cdbb1d139e07c975c682ec4e4f8126e03d2", 222 | "sha256:6b5eed00e597b5b5773b4ca30bd48a5774ef1e96f2a45d105db5b4ebb4bca680", 223 | "sha256:73ff61b1411e3fb0ba144b8f08d6749749775fe89688093e1efef9839d2dcc35", 224 | "sha256:772e94c2c6864f2cd2ffbe58bb3bdefbe2a32afa0acb1a77e472aac831f83427", 225 | "sha256:773c781216f8c2900b42a7b638d5b517bb134ae1acbebe4d1e8f1f41ea60eb4b", 226 | "sha256:a0c772d791c38bbc77be659af29bb14c38ced151433592e326361610250c605b", 227 | "sha256:b29b869cf58412ca5738d23691e96d8aff535e17390128a1a52717c9a109da4f", 228 | "sha256:c1a9ff320fa699337e05edcaae79ef8c2880b52720bc031b219e5b5008ebbdef", 229 | "sha256:cd3caef37a415fd0dae6148a1b6957a8c5f275a62cca02e18474608cb263640c", 230 | "sha256:d5ec194c9c573aafaceebf05fc400656722793dac57f254cd4741f3c27ae57b4", 231 | "sha256:da6e5e818d18459fa46fac0a4a4e543507fe1110e808101277c5a2b5bab0cd2d", 232 | "sha256:e79d94ca58fcafef6395f6352383fa1a76922268fa02caa2272fff501c2fdc78", 233 | "sha256:f3ef98d7b76da5eb19c37fda834d50262ff9167c65658d1d8f974d2e4d90676b", 234 | "sha256:f4c8cabb46ff8e5d61f56a037974228e978f26bfefce4f61a4b1ac0ba7a2ab72" 235 | ], 236 | "markers": "python_version >= '3.6'", 237 | "version": "==0.18.0" 238 | }, 239 | "pytz": { 240 | "hashes": [ 241 | "sha256:3672058bc3453457b622aab7a1c3bfd5ab0bdae451512f6cf25f64ed37f5b87c", 242 | "sha256:acad2d8b20a1af07d4e4c9d2e9285c5ed9104354062f275f3fcd88dcef4f1326" 243 | ], 244 | "version": "==2021.3" 245 | }, 246 | "requests": { 247 | "hashes": [ 248 | "sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24", 249 | "sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7" 250 | ], 251 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'", 252 | "version": "==2.26.0" 253 | }, 254 | "rfc3986": { 255 | "hashes": [ 256 | "sha256:270aaf10d87d0d4e095063c65bf3ddbc6ee3d0b226328ce21e036f946e421835", 257 | "sha256:a86d6e1f5b1dc238b218b012df0aa79409667bb209e58da56d0b94704e712a97" 258 | ], 259 | "version": "==1.5.0" 260 | }, 261 | "six": { 262 | "hashes": [ 263 | "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926", 264 | "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254" 265 | ], 266 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", 267 | "version": "==1.16.0" 268 | }, 269 | "snowballstemmer": { 270 | "hashes": [ 271 | 
"sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1", 272 | "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a" 273 | ], 274 | "version": "==2.2.0" 275 | }, 276 | "sphinx": { 277 | "hashes": [ 278 | "sha256:5973adbb19a5de30e15ab394ec8bc05700317fa83f122c349dd01804d983720f", 279 | "sha256:e096b1b369dbb0fcb95a31ba8c9e1ae98c588e601f08eada032248e1696de4b1" 280 | ], 281 | "index": "pypi", 282 | "version": "==1.8.6" 283 | }, 284 | "sphinx-bootstrap-theme": { 285 | "hashes": [ 286 | "sha256:038ee7e89478e064b5dd7e614de6f3f4cec81d9f9efbebb06e105693d6a50924", 287 | "sha256:8b648023a0587f1695460670554ca3fb493e344313189b74a87b0ba27168ca47" 288 | ], 289 | "index": "pypi", 290 | "version": "==0.8.0" 291 | }, 292 | "sphinxcontrib-serializinghtml": { 293 | "hashes": [ 294 | "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd", 295 | "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952" 296 | ], 297 | "markers": "python_version >= '3.5'", 298 | "version": "==1.1.5" 299 | }, 300 | "sphinxcontrib-websupport": { 301 | "hashes": [ 302 | "sha256:4edf0223a0685a7c485ae5a156b6f529ba1ee481a1417817935b20bde1956232", 303 | "sha256:6fc9287dfc823fe9aa432463edd6cea47fa9ebbf488d7f289b322ffcfca075c7" 304 | ], 305 | "markers": "python_version >= '3.5'", 306 | "version": "==1.2.4" 307 | }, 308 | "urllib3": { 309 | "hashes": [ 310 | "sha256:4987c65554f7a2dbf30c18fd48778ef124af6fab771a377103da0585e2336ece", 311 | "sha256:c4fdf4019605b6e5423637e01bc9fe4daef873709a7973e195ceba0a62bbc844" 312 | ], 313 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", 314 | "version": "==1.26.7" 315 | }, 316 | "zipp": { 317 | "hashes": [ 318 | "sha256:71c644c5369f4a6e07636f0aa966270449561fcea2e3d6747b8d23efaa9d7832", 319 | "sha256:9fe5ea21568a0a70e50f273397638d39b03353731e6cbbb3fd8502a33fec40bc" 320 | ], 321 | "markers": "python_version < '3.10'", 322 | "version": "==3.6.0" 323 | } 324 | }, 325 | "develop": {} 326 | } 327 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | --- 2 | 📌 **Deprecation Notice** 📌 3 | 4 | This repository is now deprecated. To contribute to the JSON Schema docs please use the new repository ➡️ [https://github.com/json-schema-org/website](https://github.com/json-schema-org/website). 5 | 6 | --- 7 | 8 | understanding-json-schema 9 | ========================= 10 | [![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](https://github.com/json-schema-org/.github/blob/main/CODE_OF_CONDUCT.md) [![Project Status: Active – The project has reached a stable, usable state and is being actively developed.](https://www.repostatus.org/badges/latest/active.svg)](https://www.repostatus.org/#active) [![Financial Contributors on Open Collective](https://opencollective.com/json-schema/all/badge.svg?label=financial+contributors)](https://opencollective.com/json-schema) 11 | 12 | [![Build Status](https://travis-ci.org/json-schema-org/understanding-json-schema.png)](https://travis-ci.org/json-schema-org/understanding-json-schema) 13 | 14 | 15 | A website aiming to provide more accessible documentation for JSON schema. 16 | 17 | http://json-schema.org/understanding-json-schema/index.html 18 | 19 | ## Build locally 20 | 21 | You can build and serve the website locally using docker. 
Running 22 | `docker-compose up` will start the server on http://localhost:8000 23 | -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3.9' 2 | 3 | services: 4 | web: 5 | build: . 6 | volumes: 7 | - .:/code:Z 8 | ports: 9 | - '8000:8000' 10 | command: '/usr/bin/python3 -m http.server 8000 --directory build/html' 11 | -------------------------------------------------------------------------------- /entrypoint.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | make html latexpdf 4 | cp build/latex/*.pdf build/html 5 | exec "$@" 6 | -------------------------------------------------------------------------------- /source/_static/fail.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/fail.png -------------------------------------------------------------------------------- /source/_static/fail.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 7 | 8 | 15 | 16 | 17 | -------------------------------------------------------------------------------- /source/_static/jsonschema.css: -------------------------------------------------------------------------------- 1 | @import url(bootstrap-sphinx.css); 2 | 3 | @import url(https://fonts.googleapis.com/css?family=Overlock:900); 4 | @import url(https://fonts.googleapis.com/css?family=Inconsolata); 5 | 6 | div.admonition { 7 | margin-top: 0; 8 | } 9 | 10 | div.highlight pre { 11 | background-color: transparent; 12 | } 13 | 14 | div.jsonschema-pass { 15 | margin-left: 48px; 16 | } 17 | 18 | div.jsonschema-fail { 19 | margin-left: 48px; 20 | } 21 | 22 | div.jsonschema .highlight { 23 | background-color: #f5f5f5; 24 | background-repeat: no-repeat; 25 | background-position: right center; 26 | } 27 | 28 | div.jsonschema-pass .highlight { 29 | background-color: #eeffcc; 30 | background-image: url(pass.svg); 31 | background-repeat: no-repeat; 32 | background-position: right center; 33 | } 34 | 35 | div.jsonschema-fail .highlight { 36 | background-color: #ffcccc; 37 | background-image: url(fail.svg); 38 | background-repeat: no-repeat; 39 | background-position: right center; 40 | } 41 | 42 | p.jsonschema-comment { 43 | margin-left: 48px; 44 | } 45 | 46 | .tabbable { 47 | margin-bottom: 12px; 48 | } 49 | 50 | .nav { 51 | margin-bottom: 0px; 52 | margin-left: 0; 53 | list-style: none; 54 | } 55 | 56 | .tab-content { 57 | padding: 9.5px; 58 | border-left: 1px solid #ddd; 59 | border-bottom: 1px solid #ddd; 60 | border-right: 1px solid #ddd; 61 | } 62 | 63 | code, pre { 64 | font-family: Inconsolata,Monaco,Menlo,Consolas,"Courier New",monospace; 65 | } 66 | 67 | .navbar, h1, h2, h3, h4, h5, h6, .h1, .h2, .h3, .h4, .h5, .h6, .new { 68 | font-family: "Overlock","Helvetica Neue",Helvetica,Arial,sans-serif; 69 | font-weight: 900; 70 | } 71 | 72 | .new { 73 | color: #3333ff; 74 | } 75 | 76 | .new:before { 77 | content: "★ "; 78 | } 79 | -------------------------------------------------------------------------------- /source/_static/jsonschema.js: -------------------------------------------------------------------------------- 1 | function supports_html5_storage() { 2 | try { 3 | return 'localStorage' in window && window['localStorage'] !== null; 4 | } catch (e) 
{ 5 | return false; 6 | } 7 | } 8 | 9 | 10 | if (supports_html5_storage()) { 11 | $(function(){ 12 | /* Upon loading, set all language-specific tabs to the 13 | preferred language. */ 14 | var language = localStorage.getItem("preferred-language"); 15 | if (language) { 16 | var tabs = $jqTheme('a[data-toggle="tab"]'); 17 | for (var i = 0; i < tabs.size(); ++i) { 18 | var href = tabs[i].href; 19 | if (href.split("_")[1] == language) { 20 | $jqTheme('a[href="#' + href.split('#')[1] + '"]').tab("show"); 21 | } 22 | } 23 | } 24 | 25 | $jqTheme('a[data-toggle="tab"]').on('shown', function (e) { 26 | var language = e.target.href.split("_")[1]; 27 | localStorage.setItem("preferred-language", language); 28 | }) 29 | }) 30 | } 31 | -------------------------------------------------------------------------------- /source/_static/logo.ico: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/logo.ico -------------------------------------------------------------------------------- /source/_static/logo.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/logo.pdf -------------------------------------------------------------------------------- /source/_static/logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/logo.png -------------------------------------------------------------------------------- /source/_static/octopus.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/octopus.png -------------------------------------------------------------------------------- /source/_static/pass.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/pass.png -------------------------------------------------------------------------------- /source/_static/pass.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 6 | 7 | 12 | 13 | 14 | -------------------------------------------------------------------------------- /source/_static/schema.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/json-schema-org/understanding-json-schema/39a0a46c069b916064b98bba4b70adc6abd23bd9/source/_static/schema.png -------------------------------------------------------------------------------- /source/_templates/layout.html: -------------------------------------------------------------------------------- 1 | {% extends "!layout.html" %} 2 | 3 | {# Include our new CSS file into existing ones. 
#} 4 | {% set bootswatch_css_custom = ['_static/jsonschema.css']%} 5 | 6 | {%- block extrahead %} 7 | {{ super() }} 8 | 9 | 10 | 17 | {% endblock %} 18 | 19 | 20 | {# Include some custom js #} 21 | {% set script_files = script_files + ['_static/jsonschema.js']%} 22 | 23 | 24 | 25 | {% block header %} 26 | {{ super() }} 27 | Fork me on GitHub 32 | {% endblock %} 33 | -------------------------------------------------------------------------------- /source/about.rst: -------------------------------------------------------------------------------- 1 | .. _about: 2 | 3 | What is a schema? 4 | ================= 5 | 6 | If you've ever used XML Schema, RelaxNG or ASN.1 you probably already 7 | know what a schema is and you can happily skip along to the next 8 | section. If all that sounds like gobbledygook to you, you've come to 9 | the right place. To define what JSON Schema is, we should probably 10 | first define what JSON is. 11 | 12 | JSON stands for "JavaScript Object Notation", a simple data 13 | interchange format. It began as a notation for the world wide web. 14 | Since JavaScript exists in most web browsers, and JSON is based on 15 | JavaScript, it's very easy to support there. However, it has proven 16 | useful enough and simple enough that it is now used in many other 17 | contexts that don't involve web surfing. 18 | 19 | At its heart, JSON is built on the following data structures: 20 | 21 | - object:: 22 | 23 | { "key1": "value1", "key2": "value2" } 24 | 25 | - array:: 26 | 27 | [ "first", "second", "third" ] 28 | 29 | - number: 30 | 31 | .. code-block:: text 32 | 33 | 42 34 | 3.1415926 35 | 36 | - string: 37 | 38 | .. code-block:: text 39 | 40 | "This is a string" 41 | 42 | - boolean: 43 | 44 | .. code-block:: text 45 | 46 | true 47 | false 48 | 49 | - null: 50 | 51 | .. code-block:: text 52 | 53 | null 54 | 55 | These types have analogs in most programming languages, though they 56 | may go by different names. 57 | 58 | .. language_specific:: 59 | 60 | --Python 61 | The following table maps from the names of JSON types to their 62 | analogous types in Python: 63 | 64 | +----------+-----------+ 65 | |JSON |Python | 66 | +----------+-----------+ 67 | |string |string | 68 | | |[#1]_ | 69 | +----------+-----------+ 70 | |number |int/float | 71 | | |[#2]_ | 72 | +----------+-----------+ 73 | |object |dict | 74 | +----------+-----------+ 75 | |array |list | 76 | +----------+-----------+ 77 | |boolean |bool | 78 | +----------+-----------+ 79 | |null |None | 80 | +----------+-----------+ 81 | 82 | .. rubric:: Footnotes 83 | 84 | .. [#1] Since JSON strings always support unicode, they are 85 | analogous to ``unicode`` on Python 2.x and ``str`` on 86 | Python 3.x. 87 | 88 | .. [#2] JSON does not have separate types for integer and 89 | floating-point. 90 | 91 | --Ruby 92 | The following table maps from the names of JSON types to their 93 | analogous types in Ruby: 94 | 95 | +----------+----------------------+ 96 | |JSON |Ruby | 97 | +----------+----------------------+ 98 | |string |String | 99 | +----------+----------------------+ 100 | |number |Integer/Float | 101 | | |[#3]_ | 102 | +----------+----------------------+ 103 | |object |Hash | 104 | +----------+----------------------+ 105 | |array |Array | 106 | +----------+----------------------+ 107 | |boolean |TrueClass/FalseClass | 108 | +----------+----------------------+ 109 | |null |NilClass | 110 | +----------+----------------------+ 111 | 112 | .. rubric:: Footnotes 113 | 114 | .. 
[#3] JSON does not have separate types for integer and 115 | floating-point. 116 | 117 | With these simple data types, all kinds of structured data can be 118 | represented. With that great flexibility comes great responsibility, 119 | however, as the same concept could be represented in myriad ways. For 120 | example, you could imagine representing information about a person in 121 | JSON in different ways:: 122 | 123 | { 124 | "name": "George Washington", 125 | "birthday": "February 22, 1732", 126 | "address": "Mount Vernon, Virginia, United States" 127 | } 128 | 129 | { 130 | "first_name": "George", 131 | "last_name": "Washington", 132 | "birthday": "1732-02-22", 133 | "address": { 134 | "street_address": "3200 Mount Vernon Memorial Highway", 135 | "city": "Mount Vernon", 136 | "state": "Virginia", 137 | "country": "United States" 138 | } 139 | } 140 | 141 | Both representations are equally valid, though one is clearly more 142 | formal than the other. The design of a record will largely depend on 143 | its intended use within the application, so there's no right or wrong 144 | answer here. However, when an application says "give me a JSON record 145 | for a person", it's important to know exactly how that record should 146 | be organized. For example, we need to know what fields are expected, 147 | and how the values are represented. That's where JSON Schema comes 148 | in. The following JSON Schema fragment describes how the second 149 | example above is structured. Don't worry too much about the details 150 | for now. They are explained in subsequent chapters. 151 | 152 | .. schema_example:: 153 | 154 | { 155 | "type": "object", 156 | "properties": { 157 | "first_name": { "type": "string" }, 158 | "last_name": { "type": "string" }, 159 | "birthday": { "type": "string", "format": "date" }, 160 | "address": { 161 | "type": "object", 162 | "properties": { 163 | "street_address": { "type": "string" }, 164 | "city": { "type": "string" }, 165 | "state": { "type": "string" }, 166 | "country": { "type" : "string" } 167 | } 168 | } 169 | } 170 | } 171 | --X 172 | // By "validating" the first example against this schema, you can 173 | // see that it fails: 174 | { 175 | "name": "George Washington", 176 | "birthday": "February 22, 1732", 177 | "address": "Mount Vernon, Virginia, United States" 178 | } 179 | -- 180 | // However, the second example passes: 181 | { 182 | "first_name": "George", 183 | "last_name": "Washington", 184 | "birthday": "1732-02-22", 185 | "address": { 186 | "street_address": "3200 Mount Vernon Memorial Highway", 187 | "city": "Mount Vernon", 188 | "state": "Virginia", 189 | "country": "United States" 190 | } 191 | } 192 | 193 | You may have noticed that the JSON Schema itself is written in JSON. 194 | It is data itself, not a computer program. It's just a declarative 195 | format for "describing the structure of other data". This is both its 196 | strength and its weakness (which it shares with other similar schema 197 | languages). It is easy to concisely describe the surface structure of 198 | data, and automate validating data against it. However, since a JSON 199 | Schema can't contain arbitrary code, there are certain constraints on 200 | the relationships between data elements that can't be expressed. Any 201 | "validation tool" for a sufficiently complex data format, therefore, 202 | will likely have two phases of validation: one at the schema (or 203 | structural) level, and one at the semantic level. 
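To make that first, structural phase concrete, here is a minimal sketch using the Python ``jsonschema`` package (the same library listed in this repository's Pipfile). The schema and the two records are copied from the example above; the variable names are only for illustration.

.. code-block:: python

    from jsonschema import ValidationError, validate

    schema = {
        "type": "object",
        "properties": {
            "first_name": {"type": "string"},
            "last_name": {"type": "string"},
            "birthday": {"type": "string", "format": "date"},
            "address": {
                "type": "object",
                "properties": {
                    "street_address": {"type": "string"},
                    "city": {"type": "string"},
                    "state": {"type": "string"},
                    "country": {"type": "string"}
                }
            }
        }
    }

    informal_record = {
        "name": "George Washington",
        "birthday": "February 22, 1732",
        "address": "Mount Vernon, Virginia, United States"
    }

    formal_record = {
        "first_name": "George",
        "last_name": "Washington",
        "birthday": "1732-02-22",
        "address": {
            "street_address": "3200 Mount Vernon Memorial Highway",
            "city": "Mount Vernon",
            "state": "Virginia",
            "country": "United States"
        }
    }

    try:
        validate(instance=informal_record, schema=schema)
    except ValidationError as error:
        # Rejected at the structural level: "address" is a string,
        # but the schema expects an object.
        print(error.message)

    # No exception is raised: the second record matches the schema.
    validate(instance=formal_record, schema=schema)

The first record is rejected because its ``address`` is a plain string rather than an object, while the second one validates cleanly.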
The latter check 204 | will likely need to be implemented using a more general-purpose 205 | programming language. 206 | -------------------------------------------------------------------------------- /source/basics.rst: -------------------------------------------------------------------------------- 1 | .. _basics: 2 | 3 | The basics 4 | ========== 5 | 6 | .. contents:: :local: 7 | 8 | In :ref:`about`, we described what a schema is, and hopefully 9 | justified the need for schema languages. Here, we proceed to 10 | write a simple JSON Schema. 11 | 12 | Hello, World! 13 | ------------- 14 | 15 | When learning any new language, it's often helpful to start with the 16 | simplest thing possible. In JSON Schema, an empty object is a 17 | completely valid schema that will accept any valid JSON. 18 | 19 | .. schema_example:: 20 | 21 | { } 22 | -- 23 | // This accepts anything, as long as it's valid JSON 24 | 42 25 | -- 26 | "I'm a string" 27 | -- 28 | { "an": [ "arbitrarily", "nested" ], "data": "structure" } 29 | 30 | |draft6| 31 | 32 | You can also use ``true`` in place of the empty object to represent a schema 33 | that matches anything, or ``false`` for a schema that matches nothing. 34 | 35 | .. schema_example:: 36 | 37 | true 38 | -- 39 | // This accepts anything, as long as it's valid JSON 40 | 42 41 | -- 42 | "I'm a string" 43 | -- 44 | { "an": [ "arbitrarily", "nested" ], "data": "structure" } 45 | 46 | .. schema_example:: 47 | 48 | false 49 | --X 50 | "Resistance is futile... This will always fail!!!" 51 | 52 | The type keyword 53 | ---------------- 54 | 55 | Of course, we wouldn't be using JSON Schema if we wanted to just 56 | accept any JSON document. The most common thing to do in a JSON 57 | Schema is to restrict to a specific type. The ``type`` keyword is 58 | used for that. 59 | 60 | .. note:: 61 | 62 | When this book refers to JSON Schema "keywords", it means the 63 | "key" part of the key/value pair in an object. Most of the work 64 | of writing a JSON Schema involves mapping a special "keyword" to a 65 | value within an object. 66 | 67 | For example, in the following, only strings are 68 | accepted: 69 | 70 | .. schema_example:: 71 | 72 | { "type": "string" } 73 | -- 74 | "I'm a string" 75 | --X 76 | 42 77 | 78 | The ``type`` keyword is described in more detail in `type`. 79 | 80 | Declaring a JSON Schema 81 | ----------------------- 82 | 83 | It's not always easy to tell which draft a JSON Schema is using. You 84 | can use the ``$schema`` keyword to declare which version of the JSON 85 | Schema specification the schema is written to. See `schema` for more 86 | information. It's generally good practice to include it, though it is 87 | not required. 88 | 89 | .. note:: 90 | For brevity, the ``$schema`` keyword isn't included in most of the 91 | examples in this book, but it should always be used in the real 92 | world. 93 | 94 | .. schema_example:: 95 | 96 | { "$schema": "https://json-schema.org/draft/2020-12/schema" } 97 | 98 | .. draft_specific:: 99 | 100 | --Draft 4 101 | In Draft 4, a ``$schema`` value of 102 | ``http://json-schema.org/schema#`` referred to the latest version 103 | of JSON Schema. This usage has since been deprecated and the use 104 | of specific version URIs is required. 105 | 106 | Declaring a unique identifier 107 | ----------------------------- 108 | 109 | It is also best practice to include an ``$id`` property as a unique 110 | identifier for each schema. 
For now, just set it to a URL at a domain 111 | you control, for example:: 112 | 113 | { "$id": "http://yourdomain.com/schemas/myschema.json" } 114 | 115 | The details of `id` become more apparent when you start `structuring`. 116 | 117 | |draft6| 118 | 119 | .. draft_specific:: 120 | 121 | --Draft 4 122 | In Draft 4, ``$id`` is just ``id`` (without the dollar-sign). 123 | -------------------------------------------------------------------------------- /source/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # Understanding JSON Schema documentation build configuration file, created by 4 | # sphinx-quickstart on Thu Sep 5 10:09:57 2013. 5 | # 6 | # This file is execfile()d with the current directory set to its containing dir. 7 | # 8 | # Note that not all possible configuration values are present in this 9 | # autogenerated file. 10 | # 11 | # All configuration values have a default; values that are commented out 12 | # serve to show the default. 13 | 14 | import sys, os 15 | import datetime 16 | 17 | # If extensions (or modules to document with autodoc) are in another directory, 18 | # add these directories to sys.path here. If the directory is relative to the 19 | # documentation root, use os.path.abspath to make it absolute, like shown here. 20 | sys.path.insert(0, os.path.abspath(os.path.dirname('__file__'))) 21 | 22 | # The default JSON Schema dialect to test the examples against 23 | jsonschema_standard = 'https://json-schema.org/draft/2020-12/schema' 24 | 25 | rst_prolog = """ 26 | .. role:: new 27 | 28 | .. |draft2020-12| replace:: :new:`New in draft 2020-12` 29 | .. |draft2019-09| replace:: :new:`New in draft 2019-09` 30 | .. |draft7| replace:: :new:`New in draft 7` 31 | .. |draft6| replace:: :new:`New in draft 6` 32 | """ 33 | 34 | # -- General configuration ----------------------------------------------------- 35 | 36 | import sphinx_bootstrap_theme 37 | 38 | # If your documentation needs a minimal Sphinx version, state it here. 39 | needs_sphinx = '1.7' 40 | 41 | # Add any Sphinx extension module names here, as strings. They can be extensions 42 | # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 43 | extensions = ['sphinx.ext.mathjax', 'sphinx.ext.ifconfig', 44 | 'sphinxext.jsonschemaext', 'sphinxext.tab'] 45 | 46 | # Add any paths that contain templates here, relative to this directory. 47 | templates_path = ['_templates'] 48 | 49 | # The suffix of source filenames. 50 | source_suffix = '.rst' 51 | 52 | # The encoding of source files. 53 | source_encoding = 'utf-8' 54 | 55 | # The master toctree document. 56 | master_doc = 'index' 57 | 58 | # General information about the project. 59 | project = u'Understanding JSON Schema' 60 | copyright = u'2013-2016 Michael Droettboom, Space Telescope Science Institute; © 2016-{0} Michael Droettboom'.format( 61 | datetime.datetime.now().year) 62 | 63 | # The version info for the project you're documenting, acts as replacement for 64 | # |version| and |release|, also used in various other places throughout the 65 | # built documents. 66 | # 67 | # The short X.Y version. 68 | version = '2020-12' 69 | # The full version, including alpha/beta/rc tags. 70 | release = '2020-12' 71 | 72 | # The language for content autogenerated by Sphinx. Refer to documentation 73 | # for a list of supported languages. 
74 | #language = None 75 | 76 | # There are two options for replacing |today|: either, you set today to some 77 | # non-false value, then it is used: 78 | #today = '' 79 | # Else, today_fmt is used as the format for a strftime call. 80 | #today_fmt = '%B %d, %Y' 81 | 82 | # List of patterns, relative to source directory, that match files and 83 | # directories to ignore when looking for source files. 84 | exclude_patterns = [] 85 | 86 | # The reST default role (used for this markup: `text`) to use for all documents. 87 | default_role = "ref" 88 | 89 | # If true, '()' will be appended to :func: etc. cross-reference text. 90 | #add_function_parentheses = True 91 | 92 | # If true, the current module name will be prepended to all description 93 | # unit titles (such as .. function::). 94 | #add_module_names = True 95 | 96 | # If true, sectionauthor and moduleauthor directives will be shown in the 97 | # output. They are ignored by default. 98 | #show_authors = False 99 | 100 | # The name of the Pygments (syntax highlighting) style to use. 101 | pygments_style = 'sphinx' 102 | 103 | # The default language for highlighting 104 | highlight_language = 'javascript' 105 | 106 | # A list of ignored prefixes for module index sorting. 107 | #modindex_common_prefix = [] 108 | 109 | # If true, keep warnings as "system message" paragraphs in the built documents. 110 | #keep_warnings = False 111 | 112 | 113 | # -- Options for HTML output --------------------------------------------------- 114 | 115 | # The theme to use for HTML and HTML Help pages. See the documentation for 116 | # a list of builtin themes. 117 | html_theme = 'bootstrap' 118 | 119 | # Theme options are theme-specific and customize the look and feel of a theme 120 | # further. For a list of options available for each theme, see the 121 | # documentation. 122 | 123 | # See sphinx-bootstrap-theme for documentation of these options 124 | # https://github.com/ryan-roemer/sphinx-bootstrap-theme 125 | 126 | html_theme_options = { 127 | 'navbar_site_name': 'Document', 128 | 'navbar_pagenav': False 129 | } 130 | 131 | # Add any paths that contain custom themes here, relative to this directory. 132 | html_theme_path = sphinx_bootstrap_theme.get_html_theme_path() 133 | 134 | # The name for this set of Sphinx documents. If None, it defaults to 135 | # " v documentation". 136 | #html_title = None 137 | 138 | # A shorter title for the navigation bar. Default is the same as html_title. 139 | #html_short_title = None 140 | 141 | # The name of an image file (relative to this directory) to place at the top 142 | # of the sidebar. 143 | # html_logo = '_static/logo.png' 144 | 145 | # The name of an image file (within the static path) to use as favicon of the 146 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 147 | # pixels large. 148 | html_favicon = '_static/logo.ico' 149 | 150 | # Add any paths that contain custom static files (such as style sheets) here, 151 | # relative to this directory. They are copied after the builtin static files, 152 | # so a file named "default.css" will overwrite the builtin "default.css". 153 | html_static_path = ['_static'] 154 | 155 | html_style = 'jsonschema.css' 156 | 157 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, 158 | # using the given strftime format. 159 | html_last_updated_fmt = '%b %d, %Y' 160 | 161 | # If true, SmartyPants will be used to convert quotes and dashes to 162 | # typographically correct entities. 
163 | html_use_smartypants = True 164 | 165 | # Custom sidebar templates, maps document names to template names. 166 | #html_sidebars = {} 167 | 168 | # Additional templates that should be rendered to pages, maps page names to 169 | # template names. 170 | #html_additional_pages = {} 171 | 172 | # If false, no module index is generated. 173 | html_domain_indices = False 174 | 175 | # If false, no index is generated. 176 | html_use_index = True 177 | 178 | # If true, the index is split into individual pages for each letter. 179 | #html_split_index = False 180 | 181 | # If true, links to the reST sources are added to the pages. 182 | html_show_sourcelink = False 183 | 184 | # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. 185 | html_show_sphinx = True 186 | 187 | # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. 188 | html_show_copyright = True 189 | 190 | # If true, an OpenSearch description file will be output, and all pages will 191 | # contain a tag referring to it. The value of this option must be the 192 | # base URL from which the finished HTML is served. 193 | #html_use_opensearch = '' 194 | 195 | # This is the file name suffix for HTML files (e.g. ".xhtml"). 196 | #html_file_suffix = None 197 | 198 | # Output file base name for HTML help builder. 199 | htmlhelp_basename = 'UnderstandingJSONSchemadoc' 200 | 201 | 202 | # -- Options for LaTeX output -------------------------------------------------- 203 | 204 | from sphinxext import jsonschemaext 205 | from sphinxext import tab 206 | 207 | latex_elements = { 208 | # The paper size ('letterpaper' or 'a4paper'). 209 | 'papersize': 'letterpaper', 210 | 211 | 'passoptionstopackages': jsonschemaext.passoptionstopackages, 212 | 213 | # The font size ('10pt', '11pt' or '12pt'). 214 | 'pointsize': '10pt', 215 | 216 | 'inputenc': r'\usepackage[utf8x]{inputenc}', 217 | 218 | # Additional stuff for the LaTeX preamble. 219 | 'preamble': r''' 220 | % Use a more modern-looking monospace font 221 | \usepackage{overlock} 222 | \usepackage{inconsolata} 223 | \usepackage{bbding} 224 | \usepackage{fixltx2e} 225 | \usepackage{microtype} 226 | \MakeRobust\marginpar 227 | 228 | \makeatletter 229 | \def\marginparright{\@mparswitchfalse} 230 | \def\marginparoutside{\@mparswitchtrue} 231 | \makeatother 232 | \definecolor{VerbatimBorderColor}{rgb}{1,1,1} 233 | ''' + jsonschemaext.latex_preamble + tab.latex_preamble, 234 | 235 | 'fncychap': '\\usepackage[Sonny]{fncychap}' 236 | } 237 | 238 | # Grouping the document tree into LaTeX files. List of tuples 239 | # (source start file, target name, title, author, documentclass [howto/manual]). 240 | latex_documents = [ 241 | ('index', 'UnderstandingJSONSchema.tex', u'Understanding JSON Schema', 242 | u'Michael Droettboom, et al\\\\Space Telescope Science Institute', 'manual'), 243 | ] 244 | 245 | # The name of an image file (relative to this directory) to place at the top of 246 | # the title page. 247 | latex_logo = '_static/logo.pdf' 248 | 249 | # For "manual" documents, if this is true, then toplevel headings are parts, 250 | # not chapters. 251 | latex_use_parts = False 252 | 253 | # If true, show page references after internal links. 254 | latex_show_pagerefs = True 255 | 256 | # If true, show URL addresses after external links. 257 | #latex_show_urls = False 258 | 259 | # Documents to append as an appendix to all manuals. 260 | #latex_appendices = [] 261 | 262 | # If false, no module index is generated. 
263 | latex_domain_indices = False 264 | 265 | 266 | # -- Options for manual page output -------------------------------------------- 267 | 268 | # One entry per manual page. List of tuples 269 | # (source start file, name, description, authors, manual section). 270 | man_pages = [ 271 | ('index', 'understandingjsonschema', u'Understanding JSON Schema', 272 | [u'Michael Droettboom, et al'], 1) 273 | ] 274 | 275 | # If true, show URL addresses after external links. 276 | #man_show_urls = False 277 | 278 | 279 | # -- Options for Texinfo output ------------------------------------------------ 280 | 281 | # Grouping the document tree into Texinfo files. List of tuples 282 | # (source start file, target name, title, author, 283 | # dir menu entry, description, category) 284 | texinfo_documents = [ 285 | ('index', 'UnderstandingJSONSchema', u'Understanding JSON Schema', 286 | u'Michael Droettboom, et al', 'UnderstandingJSONSchema', 287 | u'JSON Schema documentation for mere mortals.', 288 | 'Miscellaneous'), 289 | ] 290 | 291 | # Documents to append as an appendix to all manuals. 292 | #texinfo_appendices = [] 293 | 294 | # If false, no module index is generated. 295 | #texinfo_domain_indices = True 296 | 297 | # How to display URL addresses: 'footnote', 'no', or 'inline'. 298 | #texinfo_show_urls = 'footnote' 299 | 300 | # If true, do not generate a @detailmenu in the "Top" node's menu. 301 | #texinfo_no_detailmenu = False 302 | 303 | 304 | # -- Options for Epub output --------------------------------------------------- 305 | 306 | # Bibliographic Dublin Core info. 307 | epub_title = u'Understanding JSON Schema' 308 | epub_author = u'Michael Droettboom' 309 | epub_publisher = u'Space Telescope Science Institute' 310 | epub_copyright = u'2013-{0}, Space Telescope Science Institute'.format( 311 | datetime.datetime.now().year) 312 | 313 | # The language of the text. It defaults to the language option 314 | # or en if the language is not set. 315 | #epub_language = '' 316 | 317 | # The scheme of the identifier. Typical schemes are ISBN or URL. 318 | #epub_scheme = '' 319 | 320 | # The unique identifier of the text. This can be a ISBN number 321 | # or the project homepage. 322 | #epub_identifier = '' 323 | 324 | # A unique identification for the text. 325 | #epub_uid = '' 326 | 327 | # A tuple containing the cover image and cover page html template filenames. 328 | #epub_cover = () 329 | 330 | # A sequence of (type, uri, title) tuples for the guide element of content.opf. 331 | #epub_guide = () 332 | 333 | # HTML files that should be inserted before the pages created by sphinx. 334 | # The format is a list of tuples containing the path and title. 335 | #epub_pre_files = [] 336 | 337 | # HTML files shat should be inserted after the pages created by sphinx. 338 | # The format is a list of tuples containing the path and title. 339 | #epub_post_files = [] 340 | 341 | # A list of files that should not be packed into the epub file. 342 | #epub_exclude_files = [] 343 | 344 | # The depth of the table of contents in toc.ncx. 345 | #epub_tocdepth = 3 346 | 347 | # Allow duplicate toc entries. 348 | #epub_tocdup = True 349 | 350 | # Fix unsupported image types using the PIL. 351 | #epub_fix_images = False 352 | 353 | # Scale large images. 354 | #epub_max_image_width = 0 355 | 356 | # If 'no', URL addresses will not be shown. 357 | #epub_show_urls = 'inline' 358 | 359 | # If false, no index is generated. 
360 | #epub_use_index = True 361 | -------------------------------------------------------------------------------- /source/conventions.rst: -------------------------------------------------------------------------------- 1 | .. _conventions: 2 | 3 | Conventions used in this book 4 | ============================= 5 | 6 | .. contents:: :local: 7 | 8 | Language-specific notes 9 | ----------------------- 10 | 11 | The names of the basic types in JavaScript and JSON can be confusing 12 | when coming from another dynamic language. I'm a Python programmer by 13 | day, so I've notated here when the names for things are different from 14 | what they are in Python, and any other Python-specific advice for 15 | using JSON and JSON Schema. I'm by no means trying to create a Python 16 | bias to this book, but it is what I know, so I've started there. 17 | In the long run, I hope this book will be useful to programmers of 18 | all stripes, so if you're interested in translating the Python 19 | references into Algol-68 or any other language you may know, pull 20 | requests are welcome! 21 | 22 | The language-specific sections are shown with tabs for each language. 23 | Once you choose a language, that choice will be remembered as you read 24 | on from page to page. 25 | 26 | For example, here's a language-specific section with advice on using 27 | JSON in a few different languages: 28 | 29 | .. language_specific:: 30 | 31 | --Python 32 | In Python, JSON can be read using the json module in the standard 33 | library. 34 | --Ruby 35 | In Ruby, JSON can be read using the json gem. 36 | --C 37 | For C, you may want to consider using `Jansson 38 | `_ to read and write JSON. 39 | 40 | Draft-specific notes 41 | -------------------- 42 | 43 | The JSON Schema standard has been through a number of revisions or 44 | "drafts". The current version is Draft 2020-12, but some older drafts 45 | are still widely used as well. 46 | 47 | The text is written to encourage the use of Draft 2020-12 and gives 48 | priority to the latest conventions and features, but where it differs 49 | from earlier drafts, those differences are highlighted in special 50 | call-outs. If you only wish to target Draft 2020-12, you can safely 51 | ignore those sections. 52 | 53 | |draft2020-12| 54 | 55 | .. draft_specific:: 56 | 57 | --Draft 2019-09 58 | This is where anything pertaining to an old draft would be mentioned. 59 | 60 | 61 | Examples 62 | -------- 63 | 64 | There are many examples throughout this book, and they all follow 65 | the same format. At the beginning of each example is a short JSON 66 | schema, illustrating a particular principle, followed by short JSON 67 | snippets that are either valid or invalid against that schema. Valid 68 | examples are in green, with a checkmark. Invalid examples are in red, 69 | with a cross. Often there are comments in between to explain why 70 | something is or isn't valid. 71 | 72 | .. note:: 73 | These examples are tested automatically whenever the book is 74 | built, so hopefully they are not just helpful, but also correct! 75 | 76 | For example, here's a snippet illustrating how to use the ``number`` 77 | type: 78 | 79 | .. 
schema_example:: 80 | 81 | { "type": "number" } 82 | -- 83 | 42 84 | -- 85 | -1 86 | -- 87 | // Simple floating point number: 88 | 5.0 89 | -- 90 | // Exponential notation also works: 91 | 2.99792458e8 92 | --X 93 | // Numbers as strings are rejected: 94 | "42" 95 | -------------------------------------------------------------------------------- /source/credits.rst: -------------------------------------------------------------------------------- 1 | Acknowledgments 2 | =============== 3 | 4 | Michael Droettboom wishes to thank the following contributors: 5 | 6 | - Alexander Kjeldaas 7 | - Alexander Lang 8 | - Anders D. Johnson 9 | - Armand Abric 10 | - Ben Hutton 11 | - Brandon Wright 12 | - Brent Tubbs 13 | - Chris Carpenter 14 | - Christopher Mark Gore 15 | - David Branner 16 | - David Michael Karr 17 | - David Worth 18 | - E\. M\. Bray 19 | - Fenhl 20 | - forevermatt 21 | - goldaxe 22 | - Henry Andrews 23 | - Hervé 24 | - Hongwei 25 | - Jesse Claven 26 | - Koen Rouwhorst 27 | - Mike Kobit 28 | - Oliver Kurmis 29 | - Sam Blackman 30 | - Vincent Jacques 31 | -------------------------------------------------------------------------------- /source/index.rst: -------------------------------------------------------------------------------- 1 | .. Understanding JSON Schema documentation master file, created by 2 | sphinx-quickstart on Thu Sep 5 10:09:57 2013. 3 | You can adapt this file completely to your liking, but it should at least 4 | contain the root `toctree` directive. 5 | 6 | Understanding JSON Schema 7 | ========================= 8 | 9 | JSON Schema is a powerful tool for validating the structure of JSON 10 | data. However, learning to use it by reading its specification is 11 | like learning to drive a car by looking at its blueprints. You don't 12 | need to know how an electric motor fits together if all 13 | you want to do is pick up the groceries. This book, therefore, aims 14 | to be the friendly driving instructor for JSON Schema. It's for those 15 | that want to write it and understand it, but maybe aren't interested 16 | in building their own car---er, writing their own JSON Schema 17 | validator---just yet. 18 | 19 | .. only:: html 20 | 21 | .. image:: _static/octopus.svg 22 | :alt: octopus 23 | :align: right 24 | 25 | .. note:: 26 | 27 | This book describes JSON Schema draft 2020-12. Earlier versions of 28 | JSON Schema are not completely compatible with the format 29 | described here, but for the most part, those differences are noted 30 | in the text. 31 | 32 | **Where to begin?** 33 | 34 | - This book uses some novel `conventions ` for showing 35 | schema examples and relating JSON Schema to your programming 36 | language of choice. 37 | 38 | - If you're not sure what a schema is, check out `about`. 39 | 40 | - `basics` chapter should be enough to get you started with 41 | understanding the core `reference`. 42 | 43 | - When you start developing large schemas with many nested and 44 | repeated sections, check out `structuring`. 45 | 46 | - `json-schema.org `__ has a number of 47 | resources, including the official specification and tools for 48 | working with JSON Schema from various programming languages. 49 | 50 | - There are a number of `online JSON Schema tools `__ 51 | that allow you to run your own JSON schemas against example 52 | documents. These can be very handy if you want to try things out 53 | without installing any software. 54 | 55 | .. only:: html 56 | 57 | Contents: 58 | 59 | .. 
toctree:: 60 | :maxdepth: 3 61 | 62 | conventions.rst 63 | about.rst 64 | basics.rst 65 | reference/index.rst 66 | structuring.rst 67 | 68 | .. only:: html 69 | 70 | There is also a `print version of this document 71 | `__. 72 | 73 | .. toctree:: 74 | :maxdepth: 1 75 | 76 | credits.rst 77 | 78 | .. only:: html 79 | 80 | * :ref:`genindex` 81 | * :ref:`search` 82 | -------------------------------------------------------------------------------- /source/reference/array.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: array 3 | 4 | .. _array: 5 | 6 | array 7 | ----- 8 | 9 | .. contents:: :local: 10 | 11 | Arrays are used for ordered elements. In JSON, each element in an 12 | array may be of a different type. 13 | 14 | .. language_specific:: 15 | 16 | --Python 17 | In Python, "array" is analogous to a ``list`` or ``tuple`` type, 18 | depending on usage. However, the ``json`` module in the Python 19 | standard library will always use Python lists to represent JSON 20 | arrays. 21 | --Ruby 22 | In Ruby, "array" is analogous to a ``Array`` type. 23 | 24 | .. schema_example:: 25 | 26 | { "type": "array" } 27 | -- 28 | [1, 2, 3, 4, 5] 29 | -- 30 | [3, "different", { "types" : "of values" }] 31 | --X 32 | {"Not": "an array"} 33 | 34 | There are two ways in which arrays are generally used in JSON: 35 | 36 | - **List validation:** a sequence of arbitrary length where each 37 | item matches the same schema. 38 | 39 | - **Tuple validation:** a sequence of fixed length where each item may 40 | have a different schema. In this usage, the index (or location) of 41 | each item is meaningful as to how the value is interpreted. (This 42 | usage is often given a whole separate type in some programming 43 | languages, such as Python's ``tuple``). 44 | 45 | .. index:: 46 | single: array; items 47 | single: items 48 | 49 | .. _items: 50 | 51 | Items 52 | ''''' 53 | 54 | List validation is useful for arrays of arbitrary length where each 55 | item matches the same schema. For this kind of array, set the 56 | ``items`` keyword to a single schema that will be used to validate all 57 | of the items in the array. 58 | 59 | In the following example, we define that each item in an array is a 60 | number: 61 | 62 | .. schema_example:: 63 | 64 | { 65 | "type": "array", 66 | "items": { 67 | "type": "number" 68 | } 69 | } 70 | -- 71 | [1, 2, 3, 4, 5] 72 | --X 73 | // A single "non-number" causes the whole array to be invalid: 74 | [1, 2, "3", 4, 5] 75 | -- 76 | // The empty array is always valid: 77 | [] 78 | 79 | .. index:: 80 | single: array; tuple validation 81 | 82 | .. _tuple-validation: 83 | 84 | Tuple validation 85 | '''''''''''''''' 86 | 87 | Tuple validation is useful when the array is a collection of items 88 | where each has a different schema and the ordinal index of each item 89 | is meaningful. 90 | 91 | For example, you may represent a street address such as:: 92 | 93 | 1600 Pennsylvania Avenue NW 94 | 95 | as a 4-tuple of the form: 96 | 97 | [number, street_name, street_type, direction] 98 | 99 | Each of these fields will have a different schema: 100 | 101 | - ``number``: The address number. Must be a number. 102 | 103 | - ``street_name``: The name of the street. Must be a string. 104 | 105 | - ``street_type``: The type of street. Should be a string from a 106 | fixed set of values. 107 | 108 | - ``direction``: The city quadrant of the address. Should be a string 109 | from a different set of values. 
110 | 111 | To do this, we use the ``prefixItems`` keyword. ``prefixItems`` is an 112 | array, where each item is a schema that corresponds to each index of 113 | the document's array. That is, an array where the first element 114 | validates the first element of the input array, the second element 115 | validates the second element of the input array, etc. 116 | 117 | .. draft_specific:: 118 | --Draft 4 - 2019-09 119 | In Draft 4 - 2019-09, tuple validation was handled by an alternate 120 | form of the ``items`` keyword. When ``items`` was an array of 121 | schemas instead of a single schema, it behaved the way 122 | ``prefixItems`` behaves. 123 | 124 | Here's the example schema: 125 | 126 | .. schema_example:: 127 | 128 | { 129 | "type": "array", 130 | "prefixItems": [ 131 | { "type": "number" }, 132 | { "type": "string" }, 133 | { "enum": ["Street", "Avenue", "Boulevard"] }, 134 | { "enum": ["NW", "NE", "SW", "SE"] } 135 | ] 136 | } 137 | -- 138 | [1600, "Pennsylvania", "Avenue", "NW"] 139 | --X 140 | // "Drive" is not one of the acceptable street types: 141 | [24, "Sussex", "Drive"] 142 | --X 143 | // This address is missing a street number: 144 | ["Palais de l'Élysée"] 145 | -- 146 | // It's okay to not provide all of the items: 147 | [10, "Downing", "Street"] 148 | -- 149 | // And, by default, it's also okay to add additional items to the end: 150 | [1600, "Pennsylvania", "Avenue", "NW", "Washington"] 151 | 152 | .. index:: 153 | single: array; tuple validation; items 154 | single: items 155 | 156 | .. _additionalitems: 157 | 158 | Additional Items 159 | ~~~~~~~~~~~~~~~~ 160 | 161 | The ``items`` keyword can be used to control whether it's valid to 162 | have additional items in a tuple beyond what is defined in 163 | ``prefixItems``. The value of the ``items`` keyword is a schema that 164 | all additional items must pass in order for the keyword to validate. 165 | 166 | .. draft_specific:: 167 | 168 | --Draft 4 - 2019-09 169 | Before Draft 2020-12, you would use the ``additionalItems`` 170 | keyword to constrain additional items on a tuple. It works the same 171 | as ``items``, only the name has changed. 172 | 173 | --Draft 6 - 2019-09 174 | In Draft 6 - 2019-09, the ``additionalItems`` keyword is ignored if 175 | there is not a "tuple validation" ``items`` keyword present in the 176 | same schema. 177 | 178 | Here, we'll reuse the example schema above, but set 179 | ``items`` to ``false``, which has the effect of disallowing 180 | extra items in the tuple. 181 | 182 | .. schema_example:: 183 | 184 | { 185 | "type": "array", 186 | "prefixItems": [ 187 | { "type": "number" }, 188 | { "type": "string" }, 189 | { "enum": ["Street", "Avenue", "Boulevard"] }, 190 | { "enum": ["NW", "NE", "SW", "SE"] } 191 | ], 192 | "items": false 193 | } 194 | -- 195 | [1600, "Pennsylvania", "Avenue", "NW"] 196 | -- 197 | // It's okay to not provide all of the items: 198 | [1600, "Pennsylvania", "Avenue"] 199 | --X 200 | // But, since ``items`` is ``false``, we can't provide 201 | // extra items: 202 | [1600, "Pennsylvania", "Avenue", "NW", "Washington"] 203 | 204 | You can express more complex constraints by using a non-boolean schema 205 | to constrain what value additional items can have. In that case, we 206 | could say that additional items are allowed, as long as they are all 207 | strings: 208 | 209 | .. 
schema_example:: 210 | 211 | { 212 | "type": "array", 213 | "prefixItems": [ 214 | { "type": "number" }, 215 | { "type": "string" }, 216 | { "enum": ["Street", "Avenue", "Boulevard"] }, 217 | { "enum": ["NW", "NE", "SW", "SE"] } 218 | ], 219 | "items": { "type": "string" } 220 | } 221 | -- 222 | // Extra string items are ok ... 223 | [1600, "Pennsylvania", "Avenue", "NW", "Washington"] 224 | --X 225 | // ... but not anything else 226 | [1600, "Pennsylvania", "Avenue", "NW", 20500] 227 | 228 | .. index:: 229 | single: array; tuple validation; unevaluatedItems 230 | single: unevaluatedItems 231 | 232 | .. _unevaluateditems: 233 | 234 | Unevaluated Items 235 | ''''''''''''''''' 236 | 237 | |draft2019-09| 238 | 239 | The ``unevaluatedItems`` keyword is useful mainly when you want to add 240 | or disallow extra items to an array. 241 | 242 | ``unevaluatedItems`` applies to any values not evaluated by an 243 | ``items``, ``prefixItems``, or ``contains`` keyword. Just as 244 | ``unevaluatedProperties`` affects only **properties** in an object, 245 | ``unevaluatedItems`` affects only **items** in an array. 246 | 247 | .. note:: 248 | Watch out! The word "unevaluated" *does not mean* "not evaluated by 249 | ``items``, ``prefixItems``, or ``contains``." "Unevaluated" means 250 | "not successfully evaluated", or "does not evaluate to true". 251 | 252 | Like with ``items``, if you set ``unevaluatedItems`` to ``false``, you 253 | can disallow extra items in the array. 254 | 255 | .. schema_example:: 256 | 257 | { 258 | "prefixItems": [ 259 | { "type": "string" }, { "type": "number" } 260 | ], 261 | "unevaluatedItems": false 262 | } 263 | -- 264 | ["foo", 42] 265 | // All the values are evaluated. The schema passes validation. 266 | --X 267 | ["foo", 42, null] 268 | // The schema fails validation because ``"unevaluatedItems": false"`` 269 | // specifies no extra values should exist. 270 | 271 | Note that ``items`` doesn't "see inside" any instances of ``allOf``, 272 | ``anyOf``, or ``oneOf`` in the same subschema. So in this next example, 273 | ``items`` ignores ``allOf`` and thus fails to validate. 274 | 275 | .. schema_example:: 276 | 277 | { 278 | "allOf": [{ "prefixItems": [{ "type": "boolean" }, { "type": "string" }] }], 279 | "items": { "const": 2 } 280 | } 281 | --X 282 | [true, "a", 2] 283 | 284 | But if you replace ``items`` with ``unevaluatedItems``, then the same 285 | array validates. 286 | 287 | .. schema_example:: 288 | 289 | { 290 | "allOf": [{ "prefixItems": [{ "type": "boolean" }, { "type": "string" }] }], 291 | "unevaluatedItems": { "const": 2 } 292 | } 293 | -- 294 | [true, "a", 2] 295 | 296 | You can also make a "half-closed" schema: something useful when you 297 | want to keep the first two arguments, but also add more in certain 298 | situations. ("Closed" to two arguments in some places, "open" to 299 | more arguments when you need it to be.) 300 | 301 | .. schema_example:: 302 | 303 | { 304 | "$id": "https://example.com/my-tuple", 305 | "type": "array", 306 | "prefixItems": [ 307 | { "type": "boolean" }, 308 | { "type": "string" } 309 | ], 310 | 311 | "$defs": { 312 | "closed": { 313 | "$anchor": "closed", 314 | "$ref": "#", 315 | "unevaluatedItems": false 316 | } 317 | } 318 | } 319 | 320 | Here the schema is "closed" to two array items. You can then later 321 | use ``$ref`` and add another item like this: 322 | 323 | .. 
schema_example:: 324 | 325 | { 326 | "$id": "https://example.com/my-extended-tuple", 327 | "$ref": "https://example.com/my-tuple", 328 | "prefixItems": [ 329 | { "type": "boolean" }, 330 | { "type": "string" }, 331 | { "type": "number" } 332 | ], 333 | 334 | "$defs": { 335 | "closed": { 336 | "$anchor": "closed", 337 | "$ref": "#", 338 | "unevaluatedItems": false 339 | } 340 | } 341 | } 342 | 343 | Thus, you would reference ``my-tuple#closed`` when you need only 344 | two items and reference ``my-extended-tuple#closed`` when you need 345 | three items. 346 | 347 | .. index:: 348 | single: array; contains 349 | single: contains 350 | 351 | .. _contains: 352 | 353 | Contains 354 | '''''''' 355 | 356 | |draft6| 357 | 358 | While the ``items`` schema must be valid for every item in the array, 359 | the ``contains`` schema only needs to validate against one or more 360 | items in the array. 361 | 362 | .. schema_example:: 363 | 364 | { 365 | "type": "array", 366 | "contains": { 367 | "type": "number" 368 | } 369 | } 370 | -- 371 | // A single "number" is enough to make this pass: 372 | ["life", "universe", "everything", 42] 373 | --X 374 | // But if we have no number, it fails: 375 | ["life", "universe", "everything", "forty-two"] 376 | -- 377 | // All numbers is, of course, also okay: 378 | [1, 2, 3, 4, 5] 379 | 380 | minContains / maxContains 381 | ~~~~~~~~~~~~~~~~~~~~~~~~~ 382 | 383 | |draft2019-09| 384 | 385 | ``minContains`` and ``maxContains`` can be used with ``contains`` to 386 | further specify how many times a schema matches a ``contains`` 387 | constraint. These keywords can be any non-negative number including 388 | zero. 389 | 390 | .. schema_example:: 391 | 392 | { 393 | "type": "array", 394 | "contains": { 395 | "type": "number" 396 | }, 397 | "minContains": 2, 398 | "maxContains": 3 399 | } 400 | --X 401 | // Fails ``minContains`` 402 | ["apple", "orange", 2] 403 | -- 404 | ["apple", "orange", 2, 4] 405 | -- 406 | ["apple", "orange", 2, 4, 8] 407 | --X 408 | // Fails ``maxContains`` 409 | ["apple", "orange", 2, 4, 8, 16] 410 | 411 | .. index:: 412 | single: array; length 413 | single: minItems 414 | single: maxItems 415 | 416 | .. _length: 417 | 418 | Length 419 | '''''' 420 | 421 | The length of the array can be specified using the ``minItems`` and 422 | ``maxItems`` keywords. The value of each keyword must be a 423 | non-negative number. These keywords work whether doing 424 | `list validation ` or `tuple-validation`. 425 | 426 | .. schema_example:: 427 | 428 | { 429 | "type": "array", 430 | "minItems": 2, 431 | "maxItems": 3 432 | } 433 | --X 434 | [] 435 | --X 436 | [1] 437 | -- 438 | [1, 2] 439 | -- 440 | [1, 2, 3] 441 | --X 442 | [1, 2, 3, 4] 443 | 444 | 445 | .. index:: 446 | single: array; uniqueness 447 | single: uniqueItems 448 | 449 | .. _uniqueItems: 450 | 451 | Uniqueness 452 | '''''''''' 453 | 454 | A schema can ensure that each of the items in an array is unique. 455 | Simply set the ``uniqueItems`` keyword to ``true``. 456 | 457 | .. schema_example:: 458 | 459 | { 460 | "type": "array", 461 | "uniqueItems": true 462 | } 463 | -- 464 | [1, 2, 3, 4, 5] 465 | --X 466 | [1, 2, 3, 3, 4] 467 | -- 468 | // The empty array always passes: 469 | [] 470 | -------------------------------------------------------------------------------- /source/reference/boolean.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: boolean 3 | 4 | .. 
_boolean: 5 | 6 | boolean 7 | ------- 8 | 9 | The boolean type matches only two special values: ``true`` and 10 | ``false``. Note that values that *evaluate* to ``true`` or ``false``, 11 | such as 1 and 0, are not accepted by the schema. 12 | 13 | .. language_specific:: 14 | 15 | --Python 16 | In Python, "boolean" is analogous to ``bool``. Note that in JSON, 17 | ``true`` and ``false`` are lower case, whereas in Python they are 18 | capitalized (``True`` and ``False``). 19 | --Ruby 20 | In Ruby, "boolean" is analogous to ``TrueClass`` and ``FalseClass``. Note 21 | that in Ruby there is no ``Boolean`` class. 22 | 23 | .. schema_example:: 24 | 25 | { "type": "boolean" } 26 | -- 27 | true 28 | -- 29 | false 30 | --X 31 | "true" 32 | --X 33 | // Values that evaluate to ``true`` or ``false`` are still not 34 | // accepted by the schema: 35 | 0 36 | -------------------------------------------------------------------------------- /source/reference/combining.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: schema composition 3 | 4 | .. _combining: 5 | 6 | Schema Composition 7 | ================== 8 | 9 | .. contents:: :local: 10 | 11 | JSON Schema includes a few keywords for combining schemas together. 12 | Note that this doesn't necessarily mean combining schemas from 13 | multiple files or JSON trees, though these facilities help to enable 14 | that and are described in `structuring`. Combining schemas may be as 15 | simple as allowing a value to be validated against multiple criteria 16 | at the same time. 17 | 18 | These keywords correspond to well known boolean algebra concepts like 19 | AND, OR, XOR, and NOT. You can often use these keywords to express 20 | complex constraints that can't otherwise be expressed with standard 21 | JSON Schema keywords. 22 | 23 | The keywords used to combine schemas are: 24 | 25 | - `allOf`: (AND) Must be valid against *all* of the subschemas 26 | - `anyOf`: (OR) Must be valid against *any* of the subschemas 27 | - `oneOf`: (XOR) Must be valid against *exactly one* of the subschemas 28 | 29 | All of these keywords must be set to an array, where each item is a 30 | schema. 31 | 32 | In addition, there is: 33 | 34 | - `not`: (NOT) Must *not* be valid against the given schema 35 | 36 | .. index:: 37 | single: allOf 38 | single: schema composition; allOf 39 | 40 | .. _allOf: 41 | 42 | allOf 43 | ----- 44 | 45 | To validate against ``allOf``, the given data must be valid against all 46 | of the given subschemas. 47 | 48 | .. schema_example:: 49 | 50 | { 51 | "allOf": [ 52 | { "type": "string" }, 53 | { "maxLength": 5 } 54 | ] 55 | } 56 | -- 57 | "short" 58 | --X 59 | "too long" 60 | 61 | .. note:: 62 | `allOf` can not be used to "extend" a schema to add more details to 63 | it in the sense of object-oriented inheritance. Instances must 64 | independently be valid against "all of" the schemas in the 65 | ``allOf``. See the section on `extending` for more 66 | information. 67 | 68 | .. index:: 69 | single: anyOf 70 | single: schema composition; anyOf 71 | 72 | .. _anyOf: 73 | 74 | anyOf 75 | ----- 76 | 77 | To validate against ``anyOf``, the given data must be valid against any 78 | (one or more) of the given subschemas. 79 | 80 | .. schema_example:: 81 | 82 | { 83 | "anyOf": [ 84 | { "type": "string", "maxLength": 5 }, 85 | { "type": "number", "minimum": 0 } 86 | ] 87 | } 88 | -- 89 | "short" 90 | --X 91 | "too long" 92 | -- 93 | 12 94 | --X 95 | -5 96 | 97 | .. 
index:: 98 | single: oneOf 99 | single: schema composition; oneOf 100 | 101 | .. _oneOf: 102 | 103 | oneOf 104 | ----- 105 | 106 | To validate against ``oneOf``, the given data must be valid against 107 | exactly one of the given subschemas. 108 | 109 | .. schema_example:: 110 | 111 | { 112 | "oneOf": [ 113 | { "type": "number", "multipleOf": 5 }, 114 | { "type": "number", "multipleOf": 3 } 115 | ] 116 | } 117 | -- 118 | 10 119 | -- 120 | 9 121 | --X 122 | // Not a multiple of either 5 or 3. 123 | 2 124 | --X 125 | // Multiple of *both* 5 and 3 is rejected. 126 | 15 127 | 128 | .. index:: 129 | single: not 130 | single: schema composition; not 131 | 132 | .. _not: 133 | 134 | not 135 | --- 136 | 137 | The ``not`` keyword declares that an instance validates if it doesn't 138 | validate against the given subschema. 139 | 140 | For example, the following schema validates against anything that is 141 | not a string: 142 | 143 | .. schema_example:: 144 | 145 | { "not": { "type": "string" } } 146 | -- 147 | 42 148 | -- 149 | { "key": "value" } 150 | --X 151 | "I am a string" 152 | 153 | .. index:: 154 | single: not 155 | single: schema composition; subschema independence 156 | 157 | .. _composition: 158 | 159 | Properties of Schema Composition 160 | -------------------------------- 161 | 162 | .. _illogicalschemas: 163 | 164 | Illogical Schemas 165 | ''''''''''''''''' 166 | 167 | Note that it's quite easy to create schemas that are logical 168 | impossibilities with these keywords. The following example creates a 169 | schema that won't validate against anything (since something may not 170 | be both a string and a number at the same time): 171 | 172 | .. schema_example:: 173 | 174 | { 175 | "allOf": [ 176 | { "type": "string" }, 177 | { "type": "number" } 178 | ] 179 | } 180 | --X 181 | "No way" 182 | --X 183 | -1 184 | 185 | .. _factoringschemas: 186 | 187 | Factoring Schemas 188 | ''''''''''''''''' 189 | 190 | Note that it's possible to "factor" out the common parts of the 191 | subschemas. The following two schemas are equivalent. 192 | 193 | .. schema_example:: 194 | 195 | { 196 | "oneOf": [ 197 | { "type": "number", "multipleOf": 5 }, 198 | { "type": "number", "multipleOf": 3 } 199 | ] 200 | } 201 | 202 | .. schema_example:: 203 | 204 | { 205 | "type": "number", 206 | "oneOf": [ 207 | { "multipleOf": 5 }, 208 | { "multipleOf": 3 } 209 | ] 210 | } 211 | -------------------------------------------------------------------------------- /source/reference/conditionals.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: conditionals 3 | 4 | .. _conditionals: 5 | 6 | Applying Subschemas Conditionally 7 | ================================= 8 | 9 | .. contents:: :local: 10 | 11 | .. index:: 12 | single: conditionals; dependentRequired 13 | single: property dependentRequired 14 | 15 | .. _dependentRequired: 16 | 17 | dependentRequired 18 | ''''''''''''''''' 19 | 20 | The ``dependentRequired`` keyword conditionally requires that certain 21 | properties must be present if a given property is present in an 22 | object. For example, suppose we have a schema representing a customer. 23 | If you have their credit card number, you also want to ensure you have 24 | a billing address. If you don't have their credit card number, a 25 | billing address would not be required. We represent this dependency 26 | of one property on another using the ``dependentRequired`` keyword. 27 | The value of the ``dependentRequired`` keyword is an object. 
Each 28 | entry in the object maps from the name of a property, *p*, to an array 29 | of strings listing properties that are required if *p* is present. 30 | 31 | In the following example, whenever a ``credit_card`` property is 32 | provided, a ``billing_address`` property must also be present: 33 | 34 | .. schema_example:: 35 | 36 | { 37 | "type": "object", 38 | 39 | "properties": { 40 | "name": { "type": "string" }, 41 | "credit_card": { "type": "number" }, 42 | "billing_address": { "type": "string" } 43 | }, 44 | 45 | "required": ["name"], 46 | 47 | "dependentRequired": { 48 | "credit_card": ["billing_address"] 49 | } 50 | } 51 | -- 52 | { 53 | "name": "John Doe", 54 | "credit_card": 5555555555555555, 55 | "billing_address": "555 Debtor's Lane" 56 | } 57 | --X 58 | // This instance has a ``credit_card``, but it's missing a ``billing_address``. 59 | { 60 | "name": "John Doe", 61 | "credit_card": 5555555555555555 62 | } 63 | -- 64 | // This is okay, since we have neither a ``credit_card`` nor a ``billing_address``. 65 | { 66 | "name": "John Doe" 67 | } 68 | -- 69 | // Note that dependencies are not bidirectional. It's okay to have 70 | // a billing address without a credit card number. 71 | { 72 | "name": "John Doe", 73 | "billing_address": "555 Debtor's Lane" 74 | } 75 | 76 | To fix the last issue above (that dependencies are not bidirectional), 77 | you can, of course, define the bidirectional dependencies explicitly: 78 | 79 | .. schema_example:: 80 | 81 | { 82 | "type": "object", 83 | 84 | "properties": { 85 | "name": { "type": "string" }, 86 | "credit_card": { "type": "number" }, 87 | "billing_address": { "type": "string" } 88 | }, 89 | 90 | "required": ["name"], 91 | 92 | "dependentRequired": { 93 | "credit_card": ["billing_address"], 94 | "billing_address": ["credit_card"] 95 | } 96 | } 97 | --X 98 | // This instance has a ``credit_card``, but it's missing a ``billing_address``. 99 | { 100 | "name": "John Doe", 101 | "credit_card": 5555555555555555 102 | } 103 | --X 104 | // This has a ``billing_address``, but is missing a ``credit_card``. 105 | { 106 | "name": "John Doe", 107 | "billing_address": "555 Debtor's Lane" 108 | } 109 | 110 | .. draft_specific:: 111 | --Draft 4-7 112 | Previously to Draft 2019-09, ``dependentRequired`` and 113 | ``dependentSchemas`` were one keyword called ``dependencies``. If 114 | the dependency value was an array, it would behave like 115 | ``dependentRequired`` and if the dependency value was a schema, it 116 | would behave like ``dependentSchemas``. 117 | 118 | .. index:: 119 | single: conditionals; dependentSchemas 120 | single: dependentSchemas 121 | 122 | .. _dependentSchemas: 123 | 124 | dependentSchemas 125 | '''''''''''''''' 126 | 127 | The ``dependentSchemas`` keyword conditionally applies a subschema 128 | when a given property is present. This schema is applied in the same 129 | way `allOf` applies schemas. Nothing is merged or extended. Both 130 | schemas apply independently. 131 | 132 | For example, here is another way to write the above: 133 | 134 | .. 
schema_example:: 135 | 136 | { 137 | "type": "object", 138 | 139 | "properties": { 140 | "name": { "type": "string" }, 141 | "credit_card": { "type": "number" } 142 | }, 143 | 144 | "required": ["name"], 145 | 146 | "dependentSchemas": { 147 | "credit_card": { 148 | "properties": { 149 | "billing_address": { "type": "string" } 150 | }, 151 | "required": ["billing_address"] 152 | } 153 | } 154 | } 155 | -- 156 | { 157 | "name": "John Doe", 158 | "credit_card": 5555555555555555, 159 | "billing_address": "555 Debtor's Lane" 160 | } 161 | --X 162 | // This instance has a ``credit_card``, but it's missing a 163 | // ``billing_address``: 164 | { 165 | "name": "John Doe", 166 | "credit_card": 5555555555555555 167 | } 168 | -- 169 | // This has a ``billing_address``, but is missing a 170 | // ``credit_card``. This passes, because here ``billing_address`` 171 | // just looks like an additional property: 172 | { 173 | "name": "John Doe", 174 | "billing_address": "555 Debtor's Lane" 175 | } 176 | 177 | .. draft_specific:: 178 | --Draft 4-7 179 | Previously to Draft 2019-09, ``dependentRequired`` and 180 | ``dependentSchemas`` were one keyword called ``dependencies``. If 181 | the dependency value was an array, it would behave like 182 | ``dependentRequired`` and if the dependency value was a schema, it 183 | would behave like ``dependentSchemas``. 184 | 185 | .. index:: 186 | single: conditionals 187 | single: conditionals; if 188 | single: conditionals; then 189 | single: conditionals; else 190 | single: if 191 | single: then 192 | single: else 193 | 194 | .. _ifthenelse: 195 | 196 | If-Then-Else 197 | '''''''''''' 198 | 199 | |draft7| The ``if``, ``then`` and ``else`` keywords allow the 200 | application of a subschema based on the outcome of another schema, 201 | much like the ``if``/``then``/``else`` constructs you've probably seen 202 | in traditional programming languages. 203 | 204 | If ``if`` is valid, ``then`` must also be valid (and ``else`` is ignored.) If 205 | ``if`` is invalid, ``else`` must also be valid (and ``then`` is ignored). 206 | 207 | If ``then`` or ``else`` is not defined, ``if`` behaves as if they have a value 208 | of ``true``. 209 | 210 | If ``then`` and/or ``else`` appear in a schema without ``if``, ``then`` and 211 | ``else`` are ignored. 212 | 213 | We can put this in the form of a truth table, showing the combinations of when 214 | ``if``, ``then``, and ``else`` are valid and the resulting validity of the 215 | entire schema: 216 | 217 | ==== ==== ==== ============ 218 | if then else whole schema 219 | ==== ==== ==== ============ 220 | T T n/a T 221 | T F n/a F 222 | F n/a T T 223 | F n/a F F 224 | n/a n/a n/a T 225 | ==== ==== ==== ============ 226 | 227 | For example, let's say you wanted to write a schema to handle addresses in the 228 | United States and Canada. These countries have different postal code formats, 229 | and we want to select which format to validate against based on the country. If 230 | the address is in the United States, the ``postal_code`` field is a "zipcode": 231 | five numeric digits followed by an optional four digit suffix. If the address is 232 | in Canada, the ``postal_code`` field is a six digit alphanumeric string where 233 | letters and numbers alternate. 234 | 235 | .. 
schema_example:: 236 | 237 | { 238 | "type": "object", 239 | "properties": { 240 | "street_address": { 241 | "type": "string" 242 | }, 243 | "country": { 244 | "default": "United States of America", 245 | "enum": ["United States of America", "Canada"] 246 | } 247 | }, 248 | "if": { 249 | "properties": { "country": { "const": "United States of America" } } 250 | }, 251 | "then": { 252 | "properties": { "postal_code": { "pattern": "[0-9]{5}(-[0-9]{4})?" } } 253 | }, 254 | "else": { 255 | "properties": { "postal_code": { "pattern": "[A-Z][0-9][A-Z] [0-9][A-Z][0-9]" } } 256 | } 257 | } 258 | -- 259 | { 260 | "street_address": "1600 Pennsylvania Avenue NW", 261 | "country": "United States of America", 262 | "postal_code": "20500" 263 | } 264 | -- 265 | { 266 | "street_address": "1600 Pennsylvania Avenue NW", 267 | "postal_code": "20500" 268 | } 269 | -- 270 | { 271 | "street_address": "24 Sussex Drive", 272 | "country": "Canada", 273 | "postal_code": "K1M 1M4" 274 | } 275 | --X 276 | { 277 | "street_address": "24 Sussex Drive", 278 | "country": "Canada", 279 | "postal_code": "10000" 280 | } 281 | --X 282 | { 283 | "street_address": "1600 Pennsylvania Avenue NW", 284 | "postal_code": "K1M 1M4" 285 | } 286 | 287 | .. note:: 288 | 289 | In this example, "country" is not a required property. Because the "if" 290 | schema also doesn't require the "country" property, it will pass and the 291 | "then" schema will apply. Therefore, if the "country" property is not 292 | defined, the default behavior is to validate "postal_code" as a USA postal 293 | code. The "default" keyword doesn't have an effect, but is nice to include 294 | for readers of the schema to more easily recognize the default behavior. 295 | 296 | Unfortunately, this approach above doesn't scale to more than two countries. You 297 | can, however, wrap pairs of ``if`` and ``then`` inside an ``allOf`` to create 298 | something that would scale. In this example, we'll use United States and 299 | Canadian postal codes, but also add Netherlands postal codes, which are 4 digits 300 | followed by two letters. It's left as an exercise to the reader to expand this 301 | to the remaining postal codes of the world. 302 | 303 | .. schema_example:: 304 | 305 | { 306 | "type": "object", 307 | "properties": { 308 | "street_address": { 309 | "type": "string" 310 | }, 311 | "country": { 312 | "default": "United States of America", 313 | "enum": ["United States of America", "Canada", "Netherlands"] 314 | } 315 | }, 316 | "allOf": [ 317 | { 318 | "if": { 319 | "properties": { "country": { "const": "United States of America" } } 320 | }, 321 | "then": { 322 | "properties": { "postal_code": { "pattern": "[0-9]{5}(-[0-9]{4})?" 
} } 323 | } 324 | }, 325 | { 326 | "if": { 327 | "properties": { "country": { "const": "Canada" } }, 328 | "required": ["country"] 329 | }, 330 | "then": { 331 | "properties": { "postal_code": { "pattern": "[A-Z][0-9][A-Z] [0-9][A-Z][0-9]" } } 332 | } 333 | }, 334 | { 335 | "if": { 336 | "properties": { "country": { "const": "Netherlands" } }, 337 | "required": ["country"] 338 | }, 339 | "then": { 340 | "properties": { "postal_code": { "pattern": "[0-9]{4} [A-Z]{2}" } } 341 | } 342 | } 343 | ] 344 | } 345 | -- 346 | { 347 | "street_address": "1600 Pennsylvania Avenue NW", 348 | "country": "United States of America", 349 | "postal_code": "20500" 350 | } 351 | -- 352 | { 353 | "street_address": "1600 Pennsylvania Avenue NW", 354 | "postal_code": "20500" 355 | } 356 | -- 357 | { 358 | "street_address": "24 Sussex Drive", 359 | "country": "Canada", 360 | "postal_code": "K1M 1M4" 361 | } 362 | -- 363 | { 364 | "street_address": "Adriaan Goekooplaan", 365 | "country": "Netherlands", 366 | "postal_code": "2517 JX" 367 | } 368 | --X 369 | { 370 | "street_address": "24 Sussex Drive", 371 | "country": "Canada", 372 | "postal_code": "10000" 373 | } 374 | --X 375 | { 376 | "street_address": "1600 Pennsylvania Avenue NW", 377 | "postal_code": "K1M 1M4" 378 | } 379 | 380 | .. note:: 381 | 382 | The "required" keyword is necessary in the "if" schemas or they would all 383 | apply if the "country" is not defined. Leaving "required" off of the 384 | "United States of America" "if" schema makes it effectively the default if 385 | no "country" is defined. 386 | 387 | .. note:: 388 | 389 | Even if "country" was a required field, it's still recommended to have the 390 | "required" keyword in each "if" schema. The validation result will be the 391 | same because "required" will fail, but not including it will add noise to 392 | error results because it will validate the "postal_code" against all three 393 | of the "then" schemas leading to irrelevant errors. 394 | 395 | .. index:: 396 | single: conditionals; implication 397 | single: implication 398 | 399 | .. _implication: 400 | 401 | Implication 402 | ''''''''''' 403 | 404 | Before Draft 7, you can express an "if-then" conditional using the 405 | `combining` keywords and a boolean algebra concept called 406 | "implication". ``A -> B`` (pronounced, A implies B) means that if A is 407 | true, then B must also be true. It can be expressed as ``!A || B`` 408 | which can be expressed as a JSON Schema. 409 | 410 | .. schema_example:: 411 | 412 | { 413 | "type": "object", 414 | "properties": { 415 | "restaurantType": { "enum": ["fast-food", "sit-down"] }, 416 | "total": { "type": "number" }, 417 | "tip": { "type": "number" } 418 | }, 419 | "anyOf": [ 420 | { 421 | "not": { 422 | "properties": { "restaurantType": { "const": "sit-down" } }, 423 | "required": ["restaurantType"] 424 | } 425 | }, 426 | { "required": ["tip"] } 427 | ] 428 | } 429 | -- 430 | { 431 | "restaurantType": "sit-down", 432 | "total": 16.99, 433 | "tip": 3.4 434 | } 435 | --X 436 | { 437 | "restaurantType": "sit-down", 438 | "total": 16.99 439 | } 440 | -- 441 | { 442 | "restaurantType": "fast-food", 443 | "total": 6.99 444 | } 445 | -- 446 | { "total": 5.25 } 447 | 448 | Variations of implication can be used to express the same things you 449 | can express with the ``if``/``then``/``else`` keywords. 450 | ``if``/``then`` can be expressed as ``A -> B``, ``if``/``else`` can be 451 | expressed as ``!A -> B``, and ``if``/``then``/``else`` can be 452 | expressed as ``A -> B AND !A -> C``. 
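For instance, the United States/Canada postal code schema from the ``if``/``then``/``else`` section above could be sketched as two implications combined with ``allOf``. (This is only one way to arrange it; the property names and patterns are simply reused from that earlier example.)

.. schema_example::

    {
      "type": "object",
      "allOf": [
        {
          "anyOf": [
            { "not": { "properties": { "country": { "const": "Canada" } }, "required": ["country"] } },
            { "properties": { "postal_code": { "pattern": "[A-Z][0-9][A-Z] [0-9][A-Z][0-9]" } } }
          ]
        },
        {
          "anyOf": [
            { "properties": { "country": { "const": "Canada" } }, "required": ["country"] },
            { "properties": { "postal_code": { "pattern": "[0-9]{5}(-[0-9]{4})?" } } }
          ]
        }
      ]
    }
    --
    { "country": "Canada", "postal_code": "K1M 1M4" }
    --X
    // A Canadian address with a United States style postal code fails
    // the first implication:
    { "country": "Canada", "postal_code": "10000" }
    --
    // With no ``country``, only the second implication applies, so a
    // United States style postal code is expected:
    { "postal_code": "20500" }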
453 | 454 | .. note:: 455 | Since this pattern is not very intuitive, it's recommended to 456 | put your conditionals in ``$defs`` with a descriptive name and 457 | ``$ref`` it into your schema with ``"allOf": [{ "$ref": 458 | "#/$defs/sit-down-restaurant-implies-tip-is-required" }]``. 459 | -------------------------------------------------------------------------------- /source/reference/generic.rst: -------------------------------------------------------------------------------- 1 | Generic keywords 2 | ================ 3 | 4 | .. contents:: :local: 5 | 6 | This chapter lists some miscellaneous properties that are available 7 | for all JSON types. 8 | 9 | .. index:: 10 | single: annotation 11 | single: title 12 | single: description 13 | single: default 14 | single: examples 15 | single: readOnly 16 | single: writeOnly 17 | single: deprecated 18 | 19 | .. _annotation: 20 | 21 | Annotations 22 | ----------- 23 | 24 | JSON Schema includes a few keywords, that aren't strictly used for 25 | validation, but are used to describe parts of a schema. None of these 26 | "annotation" keywords are required, but they are encouraged for good 27 | practice, and can make your schema "self-documenting". 28 | 29 | The ``title`` and ``description`` keywords must be strings. A "title" 30 | will preferably be short, whereas a "description" will provide a more 31 | lengthy explanation about the purpose of the data described by the 32 | schema. 33 | 34 | The ``default`` keyword specifies a default value. This value is not 35 | used to fill in missing values during the validation process. 36 | Non-validation tools such as documentation generators or form 37 | generators may use this value to give hints to users about how to use 38 | a value. However, ``default`` is typically used to express that if a 39 | value is missing, then the value is semantically the same as if the 40 | value was present with the default value. The value of ``default`` 41 | should validate against the schema in which it resides, but that isn't 42 | required. 43 | 44 | |draft6| The ``examples`` keyword is a place to provide an array of 45 | examples that validate against the schema. This isn't used for 46 | validation, but may help with explaining the effect and purpose of the 47 | schema to a reader. Each entry should validate against the schema in 48 | which it resides, but that isn't strictly required. There is no need 49 | to duplicate the ``default`` value in the ``examples`` array, since 50 | ``default`` will be treated as another example. 51 | 52 | |draft7| The boolean keywords ``readOnly`` and ``writeOnly`` are 53 | typically used in an API context. ``readOnly`` indicates that a value 54 | should not be modified. It could be used to indicate that a ``PUT`` 55 | request that changes a value would result in a ``400 Bad Request`` 56 | response. ``writeOnly`` indicates that a value may be set, but will 57 | remain hidden. It could be used to indicate you can set a value with a 58 | ``PUT`` request, but it would not be included when retrieving that 59 | record with a ``GET`` request. 60 | 61 | |draft2019-09| The ``deprecated`` keyword is a boolean that indicates 62 | that the instance value the keyword applies to should not be used and 63 | may be removed in the future. 64 | 65 | .. 
schema_example:: 66 | 67 | { 68 | "title": "Match anything", 69 | "description": "This is a schema that matches anything.", 70 | "default": "Default value", 71 | "examples": [ 72 | "Anything", 73 | 4035 74 | ], 75 | "deprecated": true, 76 | "readOnly": true, 77 | "writeOnly": false 78 | } 79 | 80 | .. index:: 81 | single: comment 82 | single: $comment 83 | 84 | .. _comments: 85 | 86 | Comments 87 | -------- 88 | 89 | |draft7| ``$comment`` 90 | 91 | The ``$comment`` keyword is strictly intended for adding comments to 92 | a schema. Its value must always be a string. Unlike the annotations 93 | ``title``, ``description``, and ``examples``, JSON schema 94 | implementations aren't allowed to attach any meaning or behavior to it 95 | whatsoever, and may even strip them at any time. Therefore, they are 96 | useful for leaving notes to future editors of a JSON schema, but 97 | should not be used to communicate to users of the schema. 98 | 99 | .. index:: 100 | single: enum 101 | single: enumerated values 102 | 103 | .. _enum: 104 | 105 | Enumerated values 106 | ----------------- 107 | 108 | The ``enum`` keyword is used to restrict a value to a fixed set of 109 | values. It must be an array with at least one element, where each 110 | element is unique. 111 | 112 | The following is an example for validating street light colors: 113 | 114 | .. schema_example:: 115 | 116 | { 117 | "enum": ["red", "amber", "green"] 118 | } 119 | -- 120 | "red" 121 | --X 122 | "blue" 123 | 124 | You can use ``enum`` even without a type, to accept values of 125 | different types. Let's extend the example to use ``null`` to indicate 126 | "off", and also add 42, just for fun. 127 | 128 | .. schema_example:: 129 | 130 | { 131 | "enum": ["red", "amber", "green", null, 42] 132 | } 133 | -- 134 | "red" 135 | -- 136 | null 137 | -- 138 | 42 139 | --X 140 | 0 141 | 142 | .. index:: 143 | single: const 144 | single: constant values 145 | 146 | .. _const: 147 | 148 | Constant values 149 | --------------- 150 | 151 | |draft6| 152 | 153 | The ``const`` keyword is used to restrict a value to a single value. 154 | 155 | For example, if you only support shipping to the United States for 156 | export reasons: 157 | 158 | .. schema_example:: 159 | 160 | { 161 | "properties": { 162 | "country": { 163 | "const": "United States of America" 164 | } 165 | } 166 | } 167 | -- 168 | { "country": "United States of America" } 169 | --X 170 | { "country": "Canada" } 171 | -------------------------------------------------------------------------------- /source/reference/index.rst: -------------------------------------------------------------------------------- 1 | .. _reference: 2 | 3 | JSON Schema Reference 4 | ===================== 5 | 6 | .. toctree:: 7 | :maxdepth: 1 8 | 9 | type.rst 10 | string.rst 11 | regular_expressions.rst 12 | numeric.rst 13 | object.rst 14 | array.rst 15 | boolean.rst 16 | null.rst 17 | generic.rst 18 | non_json_data.rst 19 | combining.rst 20 | conditionals.rst 21 | schema.rst 22 | -------------------------------------------------------------------------------- /source/reference/non_json_data.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: non-JSON data 3 | single: media 4 | 5 | .. _media: 6 | 7 | Media: string-encoding non-JSON data 8 | ------------------------------------ 9 | 10 | .. contents:: :local: 11 | 12 | |draft7| 13 | 14 | JSON schema has a set of keywords to describe and optionally validate 15 | non-JSON data stored inside JSON strings. 
Since it would be difficult 16 | to write validators for many media types, JSON schema validators are 17 | not required to validate the contents of JSON strings based on these 18 | keywords. However, these keywords are still useful for an application 19 | that consumes validated JSON. 20 | 21 | .. index:: 22 | single: contentMediaType 23 | single: media; contentMediaType 24 | 25 | contentMediaType 26 | ```````````````` 27 | 28 | The ``contentMediaType`` keyword specifies the MIME type of the contents of a 29 | string, as described in `RFC 2046 `_. 30 | There is a list of `MIME types officially registered by the IANA 31 | `_, but the set 32 | of types supported will be application and operating system dependent. Mozilla 33 | Developer Network also maintains a `shorter list of MIME types that are 34 | important for the web 35 | `_ 36 | 37 | .. index:: 38 | single: contentEncoding 39 | single: media; contentEncoding 40 | 41 | contentEncoding 42 | ``````````````` 43 | 44 | The ``contentEncoding`` keyword specifies the encoding used to store the 45 | contents, as specified in `RFC 2054, part 6.1 46 | `_ and `RFC 4648 47 | `_. 48 | 49 | The acceptable values are ``7bit``, ``8bit``, ``binary``, 50 | ``quoted-printable``, ``base16``, ``base32``, and ``base64``. If not 51 | specified, the encoding is the same as the containing JSON document. 52 | 53 | Without getting into the low-level details of each of these encodings, there are 54 | really only two options useful for modern usage: 55 | 56 | - If the content is encoded in the same encoding as the enclosing JSON document 57 | (which for practical purposes, is almost always UTF-8), leave 58 | ``contentEncoding`` unspecified, and include the content in a string as-is. 59 | This includes text-based content types, such as ``text/html`` or 60 | ``application/xml``. 61 | 62 | - If the content is binary data, set ``contentEncoding`` to ``base64`` and 63 | encode the contents using `Base64 `_. 64 | This would include many image types, such as ``image/png`` or audio types, 65 | such as ``audio/mpeg``. 66 | 67 | .. index:: 68 | single: contentSchema 69 | single: media; contentSchema 70 | 71 | contentSchema 72 | ````````````` 73 | 74 | |draft2019-09| 75 | 76 | Documentation Coming soon 77 | 78 | Examples 79 | ```````` 80 | 81 | The following schema indicates the string contains an HTML document, encoded 82 | using the same encoding as the surrounding document: 83 | 84 | .. schema_example:: 85 | 86 | { 87 | "type": "string", 88 | "contentMediaType": "text/html" 89 | } 90 | -- 91 | "" 92 | 93 | The following schema indicates that a string contains a `PNG 94 | `_ image, encoded using Base64: 95 | 96 | .. schema_example:: 97 | 98 | { 99 | "type": "string", 100 | "contentEncoding": "base64", 101 | "contentMediaType": "image/png" 102 | } 103 | -- 104 | "iVBORw0KGgoAAAANSUhEUgAAABgAAAAYCAYAAADgdz34AAAABmJLR0QA/wD/AP+gvaeTAAAA..." 105 | 106 | -------------------------------------------------------------------------------- /source/reference/null.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: null 3 | 4 | .. _null: 5 | 6 | null 7 | ---- 8 | 9 | When a schema specifies a ``type`` of ``null``, it has only one 10 | acceptable value: ``null``. 11 | 12 | .. note:: 13 | 14 | It's important to remember that in JSON, ``null`` isn't equivalent 15 | to something being absent. See `required` for an example. 16 | 17 | .. language_specific:: 18 | 19 | --Python 20 | In Python, ``null`` is analogous to ``None``. 
21 | --Ruby 22 | In Ruby, ``null`` is analogous to ``nil``. 23 | 24 | .. schema_example:: 25 | 26 | { "type": "null" } 27 | -- 28 | null 29 | --X 30 | false 31 | --X 32 | 0 33 | --X 34 | "" 35 | --X 36 | 37 | -------------------------------------------------------------------------------- /source/reference/numeric.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: integer 3 | single: number 4 | single: types; numeric 5 | 6 | .. _numeric: 7 | 8 | Numeric types 9 | ------------- 10 | 11 | .. contents:: :local: 12 | 13 | There are two numeric types in JSON Schema: `integer` and `number`. They 14 | share the same validation keywords. 15 | 16 | .. note:: 17 | 18 | JSON has no standard way to represent complex numbers, so there is 19 | no way to test for them in JSON Schema. 20 | 21 | .. _integer: 22 | 23 | 24 | integer 25 | ''''''' 26 | 27 | The ``integer`` type is used for integral numbers. JSON does not have 28 | distinct types for integers and floating-point values. Therefore, the 29 | presence or absence of a decimal point is not enough to distinguish 30 | between integers and non-integers. For example, ``1`` and ``1.0`` are 31 | two ways to represent the same value in JSON. JSON Schema considers 32 | that value an integer no matter which representation was used. 33 | 34 | .. language_specific:: 35 | 36 | --Python 37 | In Python, "integer" is analogous to the ``int`` type. 38 | --Ruby 39 | In Ruby, "integer" is analogous to the ``Integer`` type. 40 | 41 | .. schema_example:: 42 | 43 | { "type": "integer" } 44 | -- 45 | 42 46 | -- 47 | -1 48 | -- 49 | // Numbers with a zero fractional part are considered integers 50 | 1.0 51 | --X 52 | // Floating point numbers are rejected: 53 | 3.1415926 54 | --X 55 | // Numbers as strings are rejected: 56 | "42" 57 | 58 | .. _number: 59 | 60 | number 61 | '''''' 62 | 63 | The ``number`` type is used for any numeric type, either integers or 64 | floating point numbers. 65 | 66 | .. language_specific:: 67 | 68 | --Python 69 | In Python, "number" is analogous to the ``float`` type. 70 | --Ruby 71 | In Ruby, "number" is analogous to the ``Float`` type. 72 | 73 | .. schema_example:: 74 | 75 | { "type": "number" } 76 | -- 77 | 42 78 | -- 79 | -1 80 | -- 81 | // Simple floating point number: 82 | 5.0 83 | -- 84 | // Exponential notation also works: 85 | 2.99792458e8 86 | --X 87 | // Numbers as strings are rejected: 88 | "42" 89 | 90 | .. index:: 91 | single: multipleOf 92 | single: number; multiple of 93 | 94 | .. _multiples: 95 | 96 | Multiples 97 | ''''''''' 98 | 99 | Numbers can be restricted to a multiple of a given number, using the 100 | ``multipleOf`` keyword. It may be set to any positive number. 101 | 102 | .. schema_example:: 103 | 104 | { 105 | "type": "number", 106 | "multipleOf" : 10 107 | } 108 | -- 109 | 0 110 | -- 111 | 10 112 | -- 113 | 20 114 | --X 115 | // Not a multiple of 10: 116 | 23 117 | 118 | .. index:: 119 | single: number; range 120 | single: maximum 121 | single: exclusiveMaximum 122 | single: minimum 123 | single: exclusiveMinimum 124 | 125 | Range 126 | ''''' 127 | 128 | Ranges of numbers are specified using a combination of the 129 | ``minimum`` and ``maximum`` keywords, (or ``exclusiveMinimum`` and 130 | ``exclusiveMaximum`` for expressing exclusive range). 
131 | 132 | If *x* is the value being validated, the following must hold true: 133 | 134 | - *x* ≥ ``minimum`` 135 | - *x* > ``exclusiveMinimum`` 136 | - *x* ≤ ``maximum`` 137 | - *x* < ``exclusiveMaximum`` 138 | 139 | While you can specify both of ``minimum`` and ``exclusiveMinimum`` or both of 140 | ``maximum`` and ``exclusiveMaximum``, it doesn't really make sense to do so. 141 | 142 | .. schema_example:: 143 | 144 | { 145 | "type": "number", 146 | "minimum": 0, 147 | "exclusiveMaximum": 100 148 | } 149 | --X 150 | // Less than ``minimum``: 151 | -1 152 | -- 153 | // ``minimum`` is inclusive, so 0 is valid: 154 | 0 155 | -- 156 | 10 157 | -- 158 | 99 159 | --X 160 | // ``exclusiveMaximum`` is exclusive, so 100 is not valid: 161 | 100 162 | --X 163 | // Greater than ``maximum``: 164 | 101 165 | 166 | .. draft_specific:: 167 | 168 | --Draft 4 169 | In JSON Schema Draft 4, ``exclusiveMinimum`` and ``exclusiveMaximum`` work 170 | differently. There they are boolean values, that indicate whether 171 | ``minimum`` and ``maximum`` are exclusive of the value. For example: 172 | 173 | - if ``exclusiveMinimum`` is ``false``, *x* ≥ ``minimum``. 174 | - if ``exclusiveMinimum`` is ``true``, *x* > ``minimum``. 175 | 176 | This was changed to have better keyword independence. 177 | 178 | Here is an example using the older Draft 4 convention: 179 | 180 | .. schema_example:: http://json-schema.org/draft-04/schema# 181 | 182 | { 183 | "type": "number", 184 | "minimum": 0, 185 | "maximum": 100, 186 | "exclusiveMaximum": true 187 | } 188 | --X 189 | // Less than ``minimum``: 190 | -1 191 | -- 192 | // ``exclusiveMinimum`` was not specified, so 0 is included: 193 | 0 194 | -- 195 | 10 196 | -- 197 | 99 198 | --X 199 | // ``exclusiveMaximum`` is ``true``, so 100 is not included: 200 | 100 201 | --X 202 | // Greater than ``maximum``: 203 | 101 204 | -------------------------------------------------------------------------------- /source/reference/object.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: object 3 | 4 | .. _object: 5 | 6 | object 7 | ====== 8 | 9 | .. contents:: :local: 10 | 11 | Objects are the mapping type in JSON. They map "keys" to "values". 12 | In JSON, the "keys" must always be strings. Each of these pairs is 13 | conventionally referred to as a "property". 14 | 15 | .. language_specific:: 16 | --Python 17 | In Python, "objects" are analogous to the ``dict`` type. An 18 | important difference, however, is that while Python dictionaries 19 | may use anything hashable as a key, in JSON all the keys 20 | must be strings. 21 | 22 | Try not to be confused by the two uses of the word "object" here: 23 | Python uses the word ``object`` to mean the generic base class for 24 | everything, whereas in JSON it is used only to mean a mapping from 25 | string keys to values. 26 | 27 | --Ruby 28 | In Ruby, "objects" are analogous to the ``Hash`` type. An important 29 | difference, however, is that all keys in JSON must be strings, and therefore 30 | any non-string keys are converted over to their string representation. 31 | 32 | Try not to be confused by the two uses of the word "object" here: 33 | Ruby uses the word ``Object`` to mean the generic base class for 34 | everything, whereas in JSON it is used only to mean a mapping from 35 | string keys to values. 36 | 37 | 38 | .. 
schema_example:: 39 | 40 | { "type": "object" } 41 | -- 42 | { 43 | "key": "value", 44 | "another_key": "another_value" 45 | } 46 | -- 47 | { 48 | "Sun": 1.9891e30, 49 | "Jupiter": 1.8986e27, 50 | "Saturn": 5.6846e26, 51 | "Neptune": 10.243e25, 52 | "Uranus": 8.6810e25, 53 | "Earth": 5.9736e24, 54 | "Venus": 4.8685e24, 55 | "Mars": 6.4185e23, 56 | "Mercury": 3.3022e23, 57 | "Moon": 7.349e22, 58 | "Pluto": 1.25e22 59 | } 60 | --X 61 | // Using non-strings as keys is invalid JSON: 62 | { 63 | 0.01: "cm", 64 | 1: "m", 65 | 1000: "km" 66 | } 67 | --X 68 | "Not an object" 69 | --X 70 | ["An", "array", "not", "an", "object"] 71 | 72 | 73 | .. index:: 74 | single: object; properties 75 | single: properties 76 | 77 | .. _properties: 78 | 79 | Properties 80 | ---------- 81 | 82 | The properties (key-value pairs) on an object are defined using the 83 | ``properties`` keyword. The value of ``properties`` is an object, 84 | where each key is the name of a property and each value is a schema 85 | used to validate that property. Any property that doesn't match any of 86 | the property names in the ``properties`` keyword is ignored by this 87 | keyword. 88 | 89 | .. note:: 90 | See `additionalproperties` and `unevaluatedproperties` for how to 91 | disallow properties that don't match any of the property names in 92 | ``properties``. 93 | 94 | For example, let's say we want to define a simple schema for an 95 | address made up of a number, street name and street type: 96 | 97 | .. schema_example:: 98 | 99 | { 100 | "type": "object", 101 | "properties": { 102 | "number": { "type": "number" }, 103 | "street_name": { "type": "string" }, 104 | "street_type": { "enum": ["Street", "Avenue", "Boulevard"] } 105 | } 106 | } 107 | -- 108 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue" } 109 | --X 110 | // If we provide the number in the wrong type, it is invalid: 111 | { "number": "1600", "street_name": "Pennsylvania", "street_type": "Avenue" } 112 | -- 113 | // By default, leaving out properties is valid. See 114 | // `required`. 115 | { "number": 1600, "street_name": "Pennsylvania" } 116 | -- 117 | // By extension, even an empty object is valid: 118 | { } 119 | -- 120 | // By default, providing additional properties is valid: 121 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue", "direction": "NW" } 122 | 123 | .. index:: 124 | single: object; properties; regular expression 125 | single: patternProperties 126 | 127 | .. _patternProperties: 128 | 129 | Pattern Properties 130 | ------------------ 131 | 132 | Sometimes you want to say that, given a particular kind of property 133 | name, the value should match a particular schema. That's where 134 | ``patternProperties`` comes in: it maps regular expressions to 135 | schemas. If a property name matches the given regular expression, the 136 | property value must validate against the corresponding schema. 137 | 138 | .. note:: 139 | Regular expressions are not anchored. This means that when defining 140 | the regular expressions for ``patternProperties``, it's important 141 | to note that the expression may match anywhere within the property 142 | name. For example, the regular expression ``"p"`` will match any 143 | property name with a ``p`` in it, such as ``"apple"``, not just a 144 | property whose name is simply ``"p"``. It's therefore usually less 145 | confusing to surround the regular expression in ``^...$``, for 146 | example, ``"^p$"``. 
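To make the difference concrete, here is a minimal sketch using Python's standard ``re`` module as a stand-in (JSON Schema regular expressions use ECMA 262 syntax, but the unanchored-search behavior is the same for a simple pattern like this; the names below are made up for illustration):

.. code-block:: python

    import re

    # Unanchored: "p" matches anywhere in the property name.
    assert re.search("p", "apple") is not None

    # Anchored: "^p$" matches only a property whose whole name is "p".
    assert re.search("^p$", "apple") is None
    assert re.search("^p$", "p") is not None

The example below relies on this behavior, anchoring its patterns to the start of the property name with ``^``.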
147 | 148 | In this example, any properties whose names start with the prefix 149 | ``S_`` must be strings, and any with the prefix ``I_`` must be 150 | integers. Any properties that do not match either regular expression 151 | are ignored. 152 | 153 | .. schema_example:: 154 | 155 | { 156 | "type": "object", 157 | "patternProperties": { 158 | "^S_": { "type": "string" }, 159 | "^I_": { "type": "integer" } 160 | } 161 | } 162 | -- 163 | { "S_25": "This is a string" } 164 | -- 165 | { "I_0": 42 } 166 | --X 167 | // If the name starts with ``S_``, it must be a string 168 | { "S_0": 42 } 169 | --X 170 | // If the name starts with ``I_``, it must be an integer 171 | { "I_42": "This is a string" } 172 | -- 173 | // This is a key that doesn't match any of the regular expressions: 174 | { "keyword": "value" } 175 | 176 | 177 | .. index:: 178 | single: object; properties 179 | single: additionalProperties 180 | 181 | .. _additionalproperties: 182 | 183 | Additional Properties 184 | --------------------- 185 | 186 | The ``additionalProperties`` keyword is used to control the handling 187 | of extra stuff, that is, properties whose names are not listed in the 188 | ``properties`` keyword or match any of the regular expressions in the 189 | ``patternProperties`` keyword. By default any additional properties 190 | are allowed. 191 | 192 | The value of the ``additionalProperties`` keyword is a schema that 193 | will be used to validate any properties in the instance that are not 194 | matched by ``properties`` or ``patternProperties``. Setting the 195 | ``additionalProperties`` schema to ``false`` means no additional 196 | properties will be allowed. 197 | 198 | Reusing the example from `properties`, but this time setting 199 | ``additionalProperties`` to ``false``. 200 | 201 | .. schema_example:: 202 | 203 | { 204 | "type": "object", 205 | "properties": { 206 | "number": { "type": "number" }, 207 | "street_name": { "type": "string" }, 208 | "street_type": { "enum": ["Street", "Avenue", "Boulevard"] } 209 | }, 210 | "additionalProperties": false 211 | } 212 | -- 213 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue" } 214 | --X 215 | // Since ``additionalProperties`` is ``false``, this extra 216 | // property "direction" makes the object invalid: 217 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue", "direction": "NW" } 218 | 219 | You can use non-boolean schemas to put more complex constraints on the 220 | additional properties of an instance. For example, one can allow 221 | additional properties, but only if their values are each a string: 222 | 223 | .. schema_example:: 224 | 225 | { 226 | "type": "object", 227 | "properties": { 228 | "number": { "type": "number" }, 229 | "street_name": { "type": "string" }, 230 | "street_type": { "enum": ["Street", "Avenue", "Boulevard"] } 231 | }, 232 | "additionalProperties": { "type": "string" } 233 | } 234 | -- 235 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue" } 236 | -- 237 | // This is valid, since the additional property's value is a string: 238 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue", "direction": "NW" } 239 | --X 240 | // This is invalid, since the additional property's value is not a string: 241 | { "number": 1600, "street_name": "Pennsylvania", "street_type": "Avenue", "office_number": 201 } 242 | 243 | You can use ``additionalProperties`` with a combination of 244 | ``properties`` and ``patternProperties``. 
In the following example, 245 | based on the example from `patternProperties`, we add a ``"builtin"`` 246 | property, which must be a number, and declare that all additional 247 | properties (that are neither defined by ``properties`` nor matched by 248 | ``patternProperties``) must be strings: 249 | 250 | .. schema_example:: 251 | 252 | { 253 | "type": "object", 254 | "properties": { 255 | "builtin": { "type": "number" } 256 | }, 257 | "patternProperties": { 258 | "^S_": { "type": "string" }, 259 | "^I_": { "type": "integer" } 260 | }, 261 | "additionalProperties": { "type": "string" } 262 | } 263 | -- 264 | { "builtin": 42 } 265 | -- 266 | // This is a key that doesn't match any of the regular expressions: 267 | { "keyword": "value" } 268 | --X 269 | // It must be a string: 270 | { "keyword": 42 } 271 | 272 | .. index:: 273 | single: object; properties; additionalProperties 274 | single: extending 275 | 276 | .. _extending: 277 | 278 | Extending Closed Schemas 279 | '''''''''''''''''''''''' 280 | 281 | It's important to note that ``additionalProperties`` only recognizes 282 | properties declared in the same subschema as itself. So, 283 | ``additionalProperties`` can restrict you from "extending" a schema 284 | using `combining` keywords such as `allOf`. In the following example, 285 | we can see how the ``additionalProperties`` can cause attempts to 286 | extend the address schema example to fail. 287 | 288 | .. schema_example:: 289 | 290 | { 291 | "allOf": [ 292 | { 293 | "type": "object", 294 | "properties": { 295 | "street_address": { "type": "string" }, 296 | "city": { "type": "string" }, 297 | "state": { "type": "string" } 298 | }, 299 | "required": ["street_address", "city", "state"], 300 | "additionalProperties": false 301 | } 302 | ], 303 | 304 | "properties": { 305 | "type": { "enum": [ "residential", "business" ] } 306 | }, 307 | "required": ["type"] 308 | } 309 | --X 310 | // Fails ``additionalProperties``. "type" is considered additional. 311 | { 312 | "street_address": "1600 Pennsylvania Avenue NW", 313 | "city": "Washington", 314 | "state": "DC", 315 | "type": "business" 316 | } 317 | --X 318 | // Fails ``required``. "type" is required. 319 | { 320 | "street_address": "1600 Pennsylvania Avenue NW", 321 | "city": "Washington", 322 | "state": "DC" 323 | } 324 | 325 | Because ``additionalProperties`` only recognizes properties declared 326 | in the same subschema, it considers anything other than 327 | "street_address", "city", and "state" to be additional. Combining the 328 | schemas with `allOf` doesn't change that. A workaround you can use is 329 | to move ``additionalProperties`` to the extending schema and redeclare 330 | the properties from the extended schema. 331 | 332 | .. 
schema_example:: 333 | 334 | { 335 | "allOf": [ 336 | { 337 | "type": "object", 338 | "properties": { 339 | "street_address": { "type": "string" }, 340 | "city": { "type": "string" }, 341 | "state": { "type": "string" } 342 | }, 343 | "required": ["street_address", "city", "state"] 344 | } 345 | ], 346 | 347 | "properties": { 348 | "street_address": true, 349 | "city": true, 350 | "state": true, 351 | "type": { "enum": [ "residential", "business" ] } 352 | }, 353 | "required": ["type"], 354 | "additionalProperties": false 355 | } 356 | -- 357 | { 358 | "street_address": "1600 Pennsylvania Avenue NW", 359 | "city": "Washington", 360 | "state": "DC", 361 | "type": "business" 362 | } 363 | --X 364 | { 365 | "street_address": "1600 Pennsylvania Avenue NW", 366 | "city": "Washington", 367 | "state": "DC", 368 | "type": "business", 369 | "something that doesn't belong": "hi!" 370 | } 371 | 372 | Now the ``additionalProperties`` keyword is able to recognize all the 373 | necessary properties and the schema works as expected. Keep reading to 374 | see how the ``unevaluatedProperties`` keyword solves this problem 375 | without needing to redeclare properties. 376 | 377 | .. index:: 378 | single: object; properties; extending 379 | single: unevaluatedProperties 380 | 381 | .. _unevaluatedproperties: 382 | 383 | Unevaluated Properties 384 | ---------------------- 385 | 386 | |draft2019-09| 387 | 388 | In the previous section we saw the challenges with using 389 | ``additionalProperties`` when "extending" a schema using 390 | `combining`. The ``unevaluatedProperties`` keyword is similar to 391 | ``additionalProperties`` except that it can recognize properties 392 | declared in subschemas. So, the example from the previous section can 393 | be rewritten without the need to redeclare properties. 394 | 395 | .. schema_example:: 396 | 397 | { 398 | "allOf": [ 399 | { 400 | "type": "object", 401 | "properties": { 402 | "street_address": { "type": "string" }, 403 | "city": { "type": "string" }, 404 | "state": { "type": "string" } 405 | }, 406 | "required": ["street_address", "city", "state"] 407 | } 408 | ], 409 | 410 | "properties": { 411 | "type": { "enum": ["residential", "business"] } 412 | }, 413 | "required": ["type"], 414 | "unevaluatedProperties": false 415 | } 416 | -- 417 | { 418 | "street_address": "1600 Pennsylvania Avenue NW", 419 | "city": "Washington", 420 | "state": "DC", 421 | "type": "business" 422 | } 423 | --X 424 | { 425 | "street_address": "1600 Pennsylvania Avenue NW", 426 | "city": "Washington", 427 | "state": "DC", 428 | "type": "business", 429 | "something that doesn't belong": "hi!" 430 | } 431 | 432 | ``unevaluatedProperties`` works by collecting any properties that are 433 | successfully validated when processing the schemas and using those as 434 | the allowed list of properties. This allows you to do more complex 435 | things like conditionally adding properties. The following example 436 | allows the "department" property only if the "type" of address is 437 | "business". 438 | 439 | .. 
schema_example:: 440 | 441 | { 442 | "type": "object", 443 | "properties": { 444 | "street_address": { "type": "string" }, 445 | "city": { "type": "string" }, 446 | "state": { "type": "string" }, 447 | "type": { "enum": ["residential", "business"] } 448 | }, 449 | "required": ["street_address", "city", "state", "type"], 450 | 451 | "if": { 452 | "type": "object", 453 | "properties": { 454 | "type": { "const": "business" } 455 | }, 456 | "required": ["type"] 457 | }, 458 | "then": { 459 | "properties": { 460 | "department": { "type": "string" } 461 | } 462 | }, 463 | 464 | "unevaluatedProperties": false 465 | } 466 | -- 467 | { 468 | "street_address": "1600 Pennsylvania Avenue NW", 469 | "city": "Washington", 470 | "state": "DC", 471 | "type": "business", 472 | "department": "HR" 473 | } 474 | --X 475 | { 476 | "street_address": "1600 Pennsylvania Avenue NW", 477 | "city": "Washington", 478 | "state": "DC", 479 | "type": "residential", 480 | "department": "HR" 481 | } 482 | 483 | In this schema, the properties declared in the ``then`` schema only 484 | count as "evaluated" properties if the "type" of the address is 485 | "business". 486 | 487 | .. index:: 488 | single: object; required properties 489 | single: required 490 | 491 | .. _required: 492 | 493 | Required Properties 494 | ------------------- 495 | 496 | By default, the properties defined by the ``properties`` keyword are 497 | not required. However, one can provide a list of required properties 498 | using the ``required`` keyword. 499 | 500 | The ``required`` keyword takes an array of zero or more strings. Each 501 | of these strings must be unique. 502 | 503 | .. draft_specific:: 504 | 505 | --Draft 4 506 | In Draft 4, ``required`` must contain at least one string. 507 | 508 | In the following example schema defining a user record, we require 509 | that each user has a name and e-mail address, but we don't mind if 510 | they don't provide their address or telephone number: 511 | 512 | .. schema_example:: 513 | 514 | { 515 | "type": "object", 516 | "properties": { 517 | "name": { "type": "string" }, 518 | "email": { "type": "string" }, 519 | "address": { "type": "string" }, 520 | "telephone": { "type": "string" } 521 | }, 522 | "required": ["name", "email"] 523 | } 524 | -- 525 | { 526 | "name": "William Shakespeare", 527 | "email": "bill@stratford-upon-avon.co.uk" 528 | } 529 | -- 530 | // Providing extra properties is fine, even properties not defined 531 | // in the schema: 532 | { 533 | "name": "William Shakespeare", 534 | "email": "bill@stratford-upon-avon.co.uk", 535 | "address": "Henley Street, Stratford-upon-Avon, Warwickshire, England", 536 | "authorship": "in question" 537 | } 538 | --X 539 | // Missing the required "email" property makes the JSON document invalid: 540 | { 541 | "name": "William Shakespeare", 542 | "address": "Henley Street, Stratford-upon-Avon, Warwickshire, England", 543 | } 544 | --X 545 | // In JSON a property with value ``null`` is not equivalent to the property 546 | // not being present. This fails because ``null`` is not of type "string", 547 | // it's of type "null" 548 | { 549 | "name": "William Shakespeare", 550 | "address": "Henley Street, Stratford-upon-Avon, Warwickshire, England", 551 | "email": null 552 | } 553 | 554 | .. index:: 555 | single: object; property names 556 | single: propertyNames 557 | 558 | .. 
_propertyNames: 559 | 560 | Property names 561 | -------------- 562 | 563 | |draft6| 564 | 565 | The names of properties can be validated against a schema, irrespective of their 566 | values. This can be useful if you don't want to enforce specific properties, 567 | but you want to make sure that the names of those properties follow a specific 568 | convention. You might, for example, want to enforce that all names are valid 569 | ASCII tokens so they can be used as attributes in a particular programming 570 | language. 571 | 572 | .. schema_example:: 573 | 574 | { 575 | "type": "object", 576 | "propertyNames": { 577 | "pattern": "^[A-Za-z_][A-Za-z0-9_]*$" 578 | } 579 | } 580 | -- 581 | { 582 | "_a_proper_token_001": "value" 583 | } 584 | --X 585 | { 586 | "001 invalid": "value" 587 | } 588 | 589 | Since object keys must always be strings anyway, it is implied that the 590 | schema given to ``propertyNames`` is always at least:: 591 | 592 | { "type": "string" } 593 | 594 | .. index:: 595 | single: object; size 596 | single: minProperties 597 | single: maxProperties 598 | 599 | Size 600 | ---- 601 | 602 | The number of properties on an object can be restricted using the 603 | ``minProperties`` and ``maxProperties`` keywords. Each of these 604 | must be a non-negative integer. 605 | 606 | .. schema_example:: 607 | 608 | { 609 | "type": "object", 610 | "minProperties": 2, 611 | "maxProperties": 3 612 | } 613 | --X 614 | {} 615 | --X 616 | { "a": 0 } 617 | -- 618 | { "a": 0, "b": 1 } 619 | -- 620 | { "a": 0, "b": 1, "c": 2 } 621 | --X 622 | { "a": 0, "b": 1, "c": 2, "d": 3 } 623 | -------------------------------------------------------------------------------- /source/reference/regular_expressions.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: regular expressions 3 | 4 | .. _regular-expressions: 5 | 6 | Regular Expressions 7 | =================== 8 | 9 | .. contents:: :local: 10 | 11 | The :ref:`pattern ` and `patternProperties` keywords use 12 | regular expressions to express constraints. The regular expression 13 | syntax used is from JavaScript (`ECMA 262 14 | `__, 15 | specifically). However, that complete syntax is not widely supported, 16 | therefore it is recommended that you stick to the subset of that 17 | syntax described below. 18 | 19 | - A single unicode character (other than the special characters 20 | below) matches itself. 21 | 22 | - ``.``: Matches any character except line break characters. (Be aware that what 23 | constitutes a line break character is somewhat dependent on your platform and 24 | language environment, but in practice this rarely matters). 25 | 26 | - ``^``: Matches only at the beginning of the string. 27 | 28 | - ``$``: Matches only at the end of the string. 29 | 30 | - ``(...)``: Group a series of regular expressions into a single 31 | regular expression. 32 | 33 | - ``|``: Matches either the regular expression preceding or following 34 | the ``|`` symbol. 35 | 36 | - ``[abc]``: Matches any of the characters inside the square brackets. 37 | 38 | - ``[a-z]``: Matches the range of characters. 39 | 40 | - ``[^abc]``: Matches any character *not* listed. 41 | 42 | - ``[^a-z]``: Matches any character outside of the range. 43 | 44 | - ``+``: Matches one or more repetitions of the preceding regular 45 | expression. 46 | 47 | - ``*``: Matches zero or more repetitions of the preceding regular 48 | expression. 49 | 50 | - ``?``: Matches zero or one repetitions of the preceding regular 51 | expression. 
52 | 53 | - ``+?``, ``*?``, ``??``: The ``*``, ``+``, and ``?`` qualifiers are 54 | all greedy; they match as much text as possible. Sometimes this 55 | behavior isn't desired and you want to match as few characters as 56 | possible. 57 | 58 | - ``(?!x)``, ``(?=x)``: Negative and positive lookahead. 59 | 60 | - ``{x}``: Match exactly ``x`` occurrences of the preceding regular 61 | expression. 62 | 63 | - ``{x,y}``: Match at least ``x`` and at most ``y`` occurrences of 64 | the preceding regular expression. 65 | 66 | - ``{x,}``: Match ``x`` occurrences or more of the preceding regular 67 | expression. 68 | 69 | - ``{x}?``, ``{x,y}?``, ``{x,}?``: Lazy versions of the above 70 | expressions. 71 | 72 | Example 73 | ''''''' 74 | 75 | The following example matches a simple North American telephone number 76 | with an optional area code: 77 | 78 | .. schema_example:: 79 | 80 | { 81 | "type": "string", 82 | "pattern": "^(\\([0-9]{3}\\))?[0-9]{3}-[0-9]{4}$" 83 | } 84 | -- 85 | "555-1212" 86 | -- 87 | "(888)555-1212" 88 | --X 89 | "(888)555-1212 ext. 532" 90 | --X 91 | "(800)FLOWERS" 92 | -------------------------------------------------------------------------------- /source/reference/schema.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: $schema 3 | 4 | Declaring a Dialect 5 | =================== 6 | 7 | .. contents:: :local: 8 | 9 | A version of JSON Schema is called a dialect. A dialect represents the 10 | set of keywords and semantics that can be used to evaluate a schema. 11 | Each JSON Schema release is a new dialect of JSON Schema. JSON Schema 12 | provides a way for you to declare which dialect a schema conforms to 13 | and provides ways to describe your own custom dialects. 14 | 15 | .. index:: 16 | single: $schema 17 | single: schema; keyword 18 | 19 | .. _schema: 20 | 21 | $schema 22 | ------- 23 | 24 | The ``$schema`` keyword is used to declare which dialect of JSON 25 | Schema the schema was written for. The value of the ``$schema`` 26 | keyword is also the identifier for a schema that can be used to verify 27 | that the schema is valid according to the dialect ``$schema`` 28 | identifies. A schema that describes another schema is called a 29 | "meta-schema". 30 | 31 | ``$schema`` applies to the entire document and must be at the root 32 | level. It does not apply to externally referenced (``$ref``, 33 | ``$dynamicRef``) documents. Those schemas need to declare their own 34 | ``$schema``. 35 | 36 | If ``$schema`` is not used, an implementation might allow you to 37 | specify a value externally or it might make assumptions about which 38 | specification version should be used to evaluate the schema. It's 39 | recommended that all JSON Schemas have a ``$schema`` keyword to 40 | communicate to readers and tooling which specification version is 41 | intended. Therefore most of the time, you'll want this at the root of 42 | your schema:: 43 | 44 | "$schema": "https://json-schema.org/draft/2020-12/schema" 45 | 46 | .. draft_specific:: 47 | 48 | --Draft 4 49 | The identifier for Draft 4 is ``http://json-schema.org/draft-04/schema#``. 50 | 51 | Draft 4 defined a value for ``$schema`` without a specific dialect 52 | (``http://json-schema.org/schema#``) which meant, use the latest 53 | dialect. This has since been deprecated and should no longer be 54 | used. 55 | 56 | You might come across references to Draft 5. There is no Draft 5 57 | release of JSON Schema. Draft 5 refers to a no-change revision of 58 | the Draft 4 release. 
It does not add, remove, or change any 59 | functionality. It only updates references, makes clarifications, 60 | and fixes bugs. Draft 5 describes the Draft 4 release. If you came 61 | here looking for information about Draft 5, you'll find it under 62 | Draft 4. We no longer use the "draft" terminology to refer to 63 | patch releases to avoid this confusion. 64 | 65 | --Draft 6 66 | The identifier for Draft 6 is ``http://json-schema.org/draft-06/schema#``. 67 | 68 | --Draft 7 69 | The identifier for Draft 7 is ``http://json-schema.org/draft-07/schema#``. 70 | 71 | --Draft 2019-09 72 | The identifier for Draft 2019-09 is ``https://json-schema.org/draft/2019-09/schema``. 73 | 74 | .. index:: 75 | single: $vocabularies 76 | single: schema; $vocabularies 77 | 78 | .. _vocabularies: 79 | 80 | Vocabularies 81 | ------------ 82 | 83 | |draft2019-09| 84 | 85 | Documentation Coming Soon 86 | 87 | .. draft_specific:: 88 | --Draft 4-7 89 | 90 | Before the introduction of Vocabularies, you could still extend 91 | JSON Schema with your custom keywords but the process was much less 92 | formalized. The first thing you'll need is a meta-schema that 93 | includes your custom keywords. The best way to do this is to make a 94 | copy of the meta-schema for the version you want to extend and make 95 | your changes to your copy. You will need to choose a custom URI to 96 | identify your custom version. This URI must not be one of the URIs 97 | used to identify official JSON Schema specification drafts and 98 | should probably include a domain name you own. You can use this URI 99 | with the ``$schema`` keyword to declare that your schemas use your 100 | custom version. 101 | 102 | .. note:: 103 | Not all implementations support custom meta-schemas and custom 104 | keyword implementations. 105 | 106 | .. index:: 107 | single: $vocabularies 108 | single: schema; $vocabularies; guidelines 109 | 110 | .. _guidelines: 111 | 112 | Guidelines 113 | '''''''''' 114 | 115 | One of the strengths of JSON Schema is that it can be written in JSON 116 | and used in a variety of environments. For example, it can be used for 117 | both front-end and back-end HTML Form validation. The problem with 118 | using custom vocabularies is that every environment where you want to 119 | use your schemas needs to understand how to evaluate your vocabulary's 120 | keywords. Meta-schemas can be used to ensure that schemas are written 121 | correctly, but each implementation will need custom code to understand 122 | how to evaluate the vocabulary's keywords. 123 | 124 | Meta-data keywords are the most interoperable because they don't 125 | affect validation. For example, you could add a ``units`` keyword. 126 | This will always work as expected with a compliant validator. 127 | 128 | .. schema_example:: 129 | 130 | { 131 | "type": "number", 132 | "units": "kg" 133 | } 134 | -- 135 | 42 136 | --X 137 | "42" 138 | 139 | The next best candidates for custom keywords are keywords that don't 140 | apply other schemas and don't modify the behavior of existing 141 | keywords. An ``isEven`` keyword is an example. In contexts where some 142 | validation is better than no validation, such as validating an HTML 143 | Form in the browser, this schema will perform as well as can be 144 | expected. Full validation would still be required and should use a 145 | validator that understands the custom keyword. 146 | 147 | .. 
schema_example:: 148 | 149 | { 150 | "type": "integer", 151 | "isEven": true 152 | } 153 | -- 154 | 2 155 | -- 156 | // This passes because the validator doesn't understand ``isEven`` 157 | 3 158 | --X 159 | // The schema isn't completely impaired because it doesn't understand ``isEven`` 160 | "3" 161 | 162 | The least interoperable type of custom keyword is one that applies 163 | other schemas or modifies the behavior of existing keywords. An 164 | example would be something like ``requiredProperties`` that declares 165 | properties and makes them required. This example shows how the schema 166 | becomes almost completely useless when evaluated with a validator that 167 | doesn't understand the custom keyword. That doesn't necessarily mean 168 | that ``requiredProperties`` is a bad idea for a keyword, it's just not 169 | the right choice if the schema might need to be used in a context that 170 | doesn't understand custom keywords. 171 | 172 | .. schema_example:: 173 | 174 | { 175 | "type": "object", 176 | "requiredProperties": { 177 | "foo": { "type": "string" } 178 | } 179 | } 180 | -- 181 | { "foo": "bar" } 182 | -- 183 | // This passes because ``requiredProperties`` is not understood 184 | {} 185 | -- 186 | // This passes because ``requiredProperties`` is not understood 187 | { "foo": 42 } 188 | -------------------------------------------------------------------------------- /source/reference/string.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: string 3 | 4 | .. _string: 5 | 6 | string 7 | ------ 8 | 9 | .. contents:: :local: 10 | 11 | The ``string`` type is used for strings of text. It may contain 12 | Unicode characters. 13 | 14 | .. language_specific:: 15 | 16 | --Python 17 | In Python, "string" is analogous to the ``unicode`` type on Python 18 | 2.x, and the ``str`` type on Python 3.x. 19 | --Ruby 20 | In Ruby, "string" is analogous to the ``String`` type. 21 | 22 | .. schema_example:: 23 | 24 | { "type": "string" } 25 | -- 26 | "This is a string" 27 | -- 28 | // Unicode characters: 29 | "Déjà vu" 30 | -- 31 | "" 32 | -- 33 | "42" 34 | --X 35 | 42 36 | 37 | .. index:: 38 | single: string; length 39 | single: maxLength 40 | single: minLength 41 | 42 | Length 43 | '''''' 44 | 45 | The length of a string can be constrained using the ``minLength`` and 46 | ``maxLength`` keywords. For both keywords, the value must be a 47 | non-negative number. 48 | 49 | .. schema_example:: 50 | 51 | { 52 | "type": "string", 53 | "minLength": 2, 54 | "maxLength": 3 55 | } 56 | --X 57 | "A" 58 | -- 59 | "AB" 60 | -- 61 | "ABC" 62 | --X 63 | "ABCD" 64 | 65 | .. index:: 66 | single: string; regular expression 67 | single: pattern 68 | 69 | Regular Expressions 70 | ''''''''''''''''''' 71 | 72 | .. _pattern: 73 | 74 | The ``pattern`` keyword is used to restrict a string to a particular 75 | regular expression. The regular expression syntax is the one defined 76 | in JavaScript (`ECMA 262 77 | `__ 78 | specifically) with Unicode support. See `regular-expressions` for 79 | more information. 80 | 81 | .. note:: 82 | When defining the regular expressions, it's important to note that 83 | the string is considered valid if the expression matches anywhere 84 | within the string. For example, the regular expression ``"p"`` 85 | will match any string with a ``p`` in it, such as ``"apple"`` not 86 | just a string that is simply ``"p"``. 
Therefore, it is usually 87 | less confusing, as a matter of course, to surround the regular 88 | expression in ``^...$``, for example, ``"^p$"``, unless there is a 89 | good reason not to do so. 90 | 91 | The following example matches a simple North American telephone number 92 | with an optional area code: 93 | 94 | .. schema_example:: 95 | 96 | { 97 | "type": "string", 98 | "pattern": "^(\\([0-9]{3}\\))?[0-9]{3}-[0-9]{4}$" 99 | } 100 | -- 101 | "555-1212" 102 | -- 103 | "(888)555-1212" 104 | --X 105 | "(888)555-1212 ext. 532" 106 | --X 107 | "(800)FLOWERS" 108 | 109 | .. index:: 110 | single: string; format 111 | single: format 112 | 113 | .. _format: 114 | 115 | Format 116 | '''''' 117 | 118 | The ``format`` keyword allows for basic semantic identification of 119 | certain kinds of string values that are commonly used. For example, 120 | because JSON doesn't have a "DateTime" type, dates need to be encoded 121 | as strings. ``format`` allows the schema author to indicate that the 122 | string value should be interpreted as a date. By default, ``format`` 123 | is just an annotation and does not affect validation. 124 | 125 | Optionally, validator implementations can provide a configuration 126 | option to enable ``format`` to function as an assertion rather than 127 | just an annotation. That means that validation will fail if, for 128 | example, a value with a ``date`` format isn't in a form that can be 129 | parsed as a date. This can allow values to be constrained beyond what 130 | the other tools in JSON Schema, including `regular-expressions` can 131 | do. 132 | 133 | .. note:: 134 | Implementations may provide validation for only a subset of the 135 | built-in formats or do partial validation for a given format. For 136 | example, some implementations may consider a string an email if it 137 | contains a ``@``, while others might do additional checks 138 | for other aspects of a well formed email address. 139 | 140 | .. draft_specific:: 141 | --Draft 4-7 142 | In Draft 4-7, there is no guarantee that you get annotation-only 143 | behavior by default. 144 | 145 | There is a bias toward networking-related formats in the JSON Schema 146 | specification, most likely due to its heritage in web technologies. 147 | However, custom formats may also be used, as long as the parties 148 | exchanging the JSON documents also exchange information about the 149 | custom format types. A JSON Schema validator will ignore any format 150 | type that it does not understand. 151 | 152 | .. index:: 153 | single: format 154 | 155 | Built-in formats 156 | ^^^^^^^^^^^^^^^^ 157 | 158 | The following is the list of formats specified in the JSON Schema 159 | specification. 160 | 161 | .. index:: 162 | single: date-time 163 | single: time 164 | single: date 165 | single: format; date-time 166 | single: format; time 167 | single: format; date 168 | 169 | Dates and times 170 | *************** 171 | 172 | Dates and times are represented in `RFC 3339, section 5.6 173 | `_. This is 174 | a subset of the date format also commonly known as `ISO8601 format 175 | `_. 176 | 177 | - ``"date-time"``: Date and time together, for example, 178 | ``2018-11-13T20:20:39+00:00``. 179 | 180 | - ``"time"``: |draft7| Time, for example, ``20:20:39+00:00`` 181 | 182 | - ``"date"``: |draft7| Date, for example, ``2018-11-13``. 183 | 184 | - ``"duration"``: |draft2019-09| A duration as defined by the `ISO 185 | 8601 ABNF for "duration" 186 | `_. For 187 | example, ``P3D`` expresses a duration of 3 days. 188 | 189 | .. 
index:: 190 | single: email 191 | single: idn-email 192 | single: format; email 193 | single: format; idn-email 194 | 195 | Email addresses 196 | *************** 197 | 198 | - ``"email"``: Internet email address, see `RFC 5321, 199 | section 4.1.2 `_. 200 | 201 | - ``"idn-email"``: |draft7| The internationalized form of an Internet email address, see 202 | `RFC 6531 `_. 203 | 204 | .. index:: 205 | single: hostname 206 | single: idn-hostname 207 | single: format; hostname 208 | single: format; idn-hostname 209 | 210 | Hostnames 211 | ********* 212 | 213 | - ``"hostname"``: Internet host name, see `RFC 1123, section 2.1 214 | `_. 215 | 216 | - ``"idn-hostname"``: |draft7| An internationalized Internet host 217 | name, see `RFC5890, section 2.3.2.3 218 | `_. 219 | 220 | .. index:: 221 | single: ipv4 222 | single: ipv6 223 | single: format; ipv4 224 | single: format; ipv6 225 | 226 | IP Addresses 227 | ************ 228 | 229 | - ``"ipv4"``: IPv4 address, according to dotted-quad ABNF syntax as 230 | defined in `RFC 2673, section 3.2 231 | `_. 232 | 233 | - ``"ipv6"``: IPv6 address, as defined in `RFC 2373, section 2.2 234 | `_. 235 | 236 | .. index:: 237 | single: uuid 238 | single: uri 239 | single: uri-reference 240 | single: iri 241 | single: iri-reference 242 | single: format; uuid 243 | single: format; uri 244 | single: format; uri-reference 245 | single: format; iri 246 | single: format; iri-reference 247 | 248 | Resource identifiers 249 | ******************** 250 | 251 | - ``"uuid"``: |draft2019-09| A Universally Unique Identifier as 252 | defined by `RFC 4122 253 | `_. Example: 254 | ``3e4666bf-d5e5-4aa7-b8ce-cefe41c7568a`` 255 | 256 | - ``"uri"``: A universal resource identifier (URI), according to 257 | `RFC3986 `_. 258 | 259 | - ``"uri-reference"``: |draft6| A URI Reference (either a URI or a 260 | relative-reference), according to `RFC3986, section 4.1 261 | `_. 262 | 263 | - ``"iri"``: |draft7| The internationalized equivalent of a "uri", 264 | according to `RFC3987 `_. 265 | 266 | - ``"iri-reference"``: |draft7| The internationalized equivalent of a 267 | "uri-reference", according to `RFC3987 `_ 268 | 269 | If the values in the schema have the ability to be relative to a particular source 270 | path (such as a link from a webpage), it is generally better practice to use 271 | ``"uri-reference"`` (or ``"iri-reference"``) rather than ``"uri"`` (or 272 | ``"iri"``). ``"uri"`` should only be used when the path must be absolute. 273 | 274 | .. draft_specific:: 275 | 276 | --Draft 4 277 | Draft 4 only includes ``"uri"``, not ``"uri-reference"``. Therefore, there is 278 | some ambiguity around whether ``"uri"`` should accept relative paths. 279 | 280 | .. index:: 281 | single: uri-template 282 | single: format; uri-template 283 | 284 | URI template 285 | ************ 286 | 287 | - ``"uri-template"``: |draft6| A URI Template (of any level) according to 288 | `RFC6570 `_. If you don't already know 289 | what a URI Template is, you probably don't need this value. 290 | 291 | .. index:: 292 | single: json-pointer 293 | single: relative-json-pointer 294 | single: format; json-pointer 295 | single: format; relative-json-pointer 296 | 297 | JSON Pointer 298 | ************ 299 | 300 | - ``"json-pointer"``: |draft6| A JSON Pointer, according to `RFC6901 301 | `_. There is more discussion on the use 302 | of JSON Pointer within JSON Schema in `structuring`. Note that this should be 303 | used only when the entire string contains only JSON Pointer content, e.g. 304 | ``/foo/bar``. 
JSON Pointer URI fragments, e.g. ``#/foo/bar/`` should use 305 | ``"uri-reference"``. 306 | 307 | - ``"relative-json-pointer"``: |draft7| A `relative JSON pointer 308 | `_. 309 | 310 | .. index:: 311 | single: regex 312 | single: format; regex 313 | 314 | Regular Expressions 315 | ******************* 316 | 317 | - ``"regex"``: |draft7| A regular expression, which should be valid according to 318 | the `ECMA 262 319 | `_ 320 | dialect. 321 | 322 | Be careful, in practice, JSON schema validators are only required to accept the 323 | safe subset of `regular-expressions` described elsewhere in this document. 324 | 325 | .. TODO: Add some examples for ``format`` here 326 | -------------------------------------------------------------------------------- /source/reference/type.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: type 3 | single: types; basic 4 | 5 | .. _type: 6 | 7 | Type-specific keywords 8 | ====================== 9 | 10 | The ``type`` keyword is fundamental to JSON Schema. It specifies the 11 | data type for a schema. 12 | 13 | At its core, JSON Schema defines the following basic types: 14 | 15 | - `string` 16 | - `number ` 17 | - `integer ` 18 | - `object` 19 | - `array` 20 | - `boolean` 21 | - `null` 22 | 23 | These types have analogs in most programming languages, though they 24 | may go by different names. 25 | 26 | .. language_specific:: 27 | 28 | --Python 29 | The following table maps from the names of JSON types to their 30 | analogous types in Python: 31 | 32 | +----------+-----------+ 33 | |JSON |Python | 34 | +----------+-----------+ 35 | |string |str | 36 | | |[#1]_ | 37 | +----------+-----------+ 38 | |number |int/float | 39 | | |[#2]_ | 40 | +----------+-----------+ 41 | |object |dict | 42 | +----------+-----------+ 43 | |array |list | 44 | +----------+-----------+ 45 | |boolean |bool | 46 | +----------+-----------+ 47 | |null |None | 48 | +----------+-----------+ 49 | 50 | .. rubric:: Footnotes 51 | 52 | .. [#1] Since JSON strings always support unicode, they are 53 | analogous to ``unicode`` on Python 2.x and ``str`` on 54 | Python 3.x. 55 | 56 | .. [#2] JSON does not have separate types for integer and 57 | floating-point. 58 | 59 | 60 | --Ruby 61 | The following table maps from the names of JSON types to their 62 | analogous types in Ruby: 63 | 64 | +----------+----------------------+ 65 | |JSON |Ruby | 66 | +----------+----------------------+ 67 | |string |String | 68 | +----------+----------------------+ 69 | |number |Integer/Float | 70 | | |[#3]_ | 71 | +----------+----------------------+ 72 | |object |Hash | 73 | +----------+----------------------+ 74 | |array |Array | 75 | +----------+----------------------+ 76 | |boolean |TrueClass/FalseClass | 77 | +----------+----------------------+ 78 | |null |NilClass | 79 | +----------+----------------------+ 80 | 81 | .. rubric:: Footnotes 82 | 83 | .. [#3] JSON does not have separate types for integer and 84 | floating-point. 85 | 86 | The ``type`` keyword may either be a string or an array: 87 | 88 | - If it's a string, it is the name of one of the basic types above. 89 | 90 | - If it is an array, it must be an array of strings, where each string 91 | is the name of one of the basic types, and each element is unique. 92 | In this case, the JSON snippet is valid if it matches *any* of the 93 | given types. 94 | 95 | Here is a simple example of using the ``type`` keyword: 96 | 97 | .. 
schema_example:: 98 | 99 | { "type": "number" } 100 | -- 101 | 42 102 | -- 103 | 42.0 104 | --X 105 | // This is not a number, it is a string containing a number. 106 | "42" 107 | 108 | In the following example, we accept strings and numbers, but not 109 | structured data types: 110 | 111 | .. schema_example:: 112 | 113 | { "type": ["number", "string"] } 114 | -- 115 | 42 116 | -- 117 | "Life, the universe, and everything" 118 | --X 119 | ["Life", "the universe", "and everything"] 120 | 121 | For each of these types, there are keywords that only apply to those 122 | types. For example, numeric types have a way of specifying a numeric 123 | range, that would not be applicable to other types. In this 124 | reference, these validation keywords are described along with each of 125 | their corresponding types in the following chapters. 126 | -------------------------------------------------------------------------------- /source/sphinxext/__init__.py: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /source/sphinxext/jsonschemaext.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | from docutils import nodes 4 | from docutils import statemachine 5 | from docutils.parsers.rst import Directive 6 | from sphinx.util.nodes import set_source_info 7 | 8 | import jsonschema 9 | from jschon import create_catalog, JSON, JSONSchema, URI 10 | 11 | 12 | legacy = { 13 | 'http://json-schema.org/draft-03/schema#': jsonschema.validators.Draft3Validator, 14 | 'http://json-schema.org/draft-04/schema#': jsonschema.validators.Draft4Validator, 15 | 'http://json-schema.org/draft-06/schema#': jsonschema.validators.Draft6Validator, 16 | 'http://json-schema.org/draft-07/schema#': jsonschema.validators.Draft7Validator 17 | } 18 | 19 | def validate(schema, part, standard): 20 | if standard in legacy: 21 | cls = legacy[standard] 22 | 23 | try: 24 | jsonschema.validate(part.json, schema.json, cls=cls) 25 | return (True, '') 26 | except jsonschema.ValidationError as e: 27 | return (False, str(e)) 28 | except jsonschema.SchemaError as e: 29 | raise ValueError("Schema is invalid:\n{0}\n\n{1}".format( 30 | str(e), schema.content)) 31 | 32 | return (is_valid, message) 33 | else: 34 | catalogue = create_catalog('2019-09', '2020-12') 35 | 36 | compiled_schema = JSONSchema(schema.json, metaschema_uri=URI(standard)) 37 | if not compiled_schema.validate().valid: 38 | raise ValueError("Schema is invalid:\n{0}\n\n{1}".format( 39 | "INVALID SCHEMA", schema.content)) 40 | elif part.json == (1+1j): 41 | return (False, 'INVALID JSON') 42 | else: 43 | jsonValue = JSON.loads(part.content) 44 | validation_result = compiled_schema.evaluate(jsonValue) 45 | 46 | if validation_result.valid: 47 | return (True, ''); 48 | else: 49 | return (False, 'VALIDATION ERROR'); 50 | 51 | 52 | class jsonschema_node(nodes.Element): 53 | pass 54 | 55 | 56 | class AttrDict(dict): 57 | def __init__(self, *args, **kwargs): 58 | super(AttrDict, self).__init__(*args, **kwargs) 59 | self.__dict__ = self 60 | 61 | 62 | def split_content(l): 63 | parts = [] 64 | should_pass = True 65 | part = [] 66 | comment = [] 67 | 68 | def add_part(): 69 | hl_lines = [] 70 | for i, line in enumerate(part): 71 | if line.lstrip().startswith('*'): 72 | line = line.replace('*', '', 1) 73 | hl_lines.append(i + 1) 74 | part[i] = line 75 | 76 | content = '\n'.join(part) 77 | try: 78 | json_content = 
json.loads(content) 79 | except ValueError: 80 | if should_pass: 81 | raise ValueError("Invalid json: {0}".format(content)) 82 | else: 83 | # A complex number will never validate 84 | json_content = 1+1j 85 | parts.append(AttrDict({ 86 | 'should_pass': should_pass, 87 | 'content': content, 88 | 'json': json_content, 89 | 'comment': comment, 90 | 'hl_lines': hl_lines} 91 | )) 92 | 93 | for line in l: 94 | if line.startswith('//'): 95 | comment.append(line[2:].lstrip()) 96 | elif line == '--': 97 | add_part() 98 | should_pass = True 99 | part = [] 100 | comment = [] 101 | elif line == '--X': 102 | add_part() 103 | should_pass = False 104 | part = [] 105 | comment = [] 106 | else: 107 | part.append(line) 108 | 109 | add_part() 110 | 111 | return parts[0], parts[1:] 112 | 113 | 114 | class SchemaExampleDirective(Directive): 115 | has_content = True 116 | validate = True 117 | optional_arguments = 1 118 | 119 | def run(self): 120 | env = self.state.document.settings.env 121 | if len(self.arguments) == 1: 122 | standard = self.arguments[0] 123 | else: 124 | standard = env.config.jsonschema_standard 125 | 126 | result = [] 127 | 128 | schema, parts = split_content(self.content) 129 | 130 | container = jsonschema_node() 131 | set_source_info(self, container) 132 | 133 | literal = nodes.literal_block( 134 | schema.content, schema.content) 135 | literal['language'] = 'javascript' 136 | literal['classes'] = container['classes'] = ['jsonschema'] 137 | if schema.hl_lines: 138 | literal['highlight_args'] = {'hl_lines': schema.hl_lines} 139 | set_source_info(self, literal) 140 | container.append(literal) 141 | result.append(container) 142 | 143 | for part in parts: 144 | if self.validate: 145 | is_valid, message = validate(schema, part, standard) 146 | 147 | if is_valid != part.should_pass: 148 | if part.should_pass: 149 | raise ValueError( 150 | "Doc says fragment should pass, " 151 | "but it does not validate:\n" + 152 | part.content + "\n" + 153 | message) 154 | else: 155 | raise ValueError( 156 | "Doc says fragment should not pass, " 157 | "but it validates:\n" + 158 | part.content) 159 | else: 160 | is_valid = part.should_pass 161 | 162 | if len(part.comment): 163 | paragraph = nodes.paragraph('', '') 164 | comment = statemachine.StringList(part.comment) 165 | comment.parent = self.content.parent 166 | self.state.nested_parse(comment, 0, paragraph) 167 | paragraph['classes'] = ['jsonschema-comment'] 168 | set_source_info(self, paragraph) 169 | result.append(paragraph) 170 | 171 | container = jsonschema_node() 172 | set_source_info(self, container) 173 | literal = nodes.literal_block( 174 | part.content, part.content) 175 | literal['language'] = 'javascript' 176 | if is_valid: 177 | literal['classes'] = container['classes'] = ['jsonschema-pass'] 178 | else: 179 | literal['classes'] = container['classes'] = ['jsonschema-fail'] 180 | if part.hl_lines: 181 | literal['highlight_args'] = {'hl_lines': part.hl_lines} 182 | set_source_info(self, literal) 183 | container.append(literal) 184 | result.append(container) 185 | 186 | return result 187 | 188 | 189 | class SchemaExampleNoValidationDirective(SchemaExampleDirective): 190 | validate = False 191 | 192 | 193 | def visit_jsonschema_node_html(self, node): 194 | pass 195 | 196 | 197 | def depart_jsonschema_node_html(self, node): 198 | pass 199 | 200 | 201 | def visit_jsonschema_node_latex(self, node): 202 | adjust = False 203 | color = "gray" 204 | char = "" 205 | if 'jsonschema-pass' in node['classes']: 206 | char = r"\Checkmark" 207 | color = 
"ForestGreen" 208 | adjust = True 209 | elif 'jsonschema-fail' in node['classes']: 210 | char = r"\XSolidBrush" 211 | color = "BrickRed" 212 | adjust = True 213 | elif 'jsonschema' in node['classes']: 214 | char = r"\{ json schema \}" 215 | 216 | if adjust: 217 | self.body.append(r"\begin{adjustwidth}{2.5em}{0pt}") 218 | self.body.append(r"\vspace{4pt}") 219 | self.body.append(r"\begin{jsonframe}{%s}{%s}" % (char, color)) 220 | 221 | 222 | def depart_jsonschema_node_latex(self, node): 223 | adjust = False 224 | if 'jsonschema-pass' in node['classes']: 225 | adjust = True 226 | elif 'jsonschema-fail' in node['classes']: 227 | adjust = True 228 | 229 | self.body.append(r"\end{jsonframe}") 230 | if adjust: 231 | self.body.append(r"\end{adjustwidth}") 232 | 233 | 234 | def setup(app): 235 | app.add_config_value('jsonschema_standard', 'http://json-schema.org/draft-04/schema#', 'env') 236 | 237 | app.add_directive('schema_example', 238 | SchemaExampleDirective) 239 | app.add_directive('schema_example_novalid', 240 | SchemaExampleNoValidationDirective) 241 | 242 | app.add_node( 243 | jsonschema_node, 244 | html=(visit_jsonschema_node_html, depart_jsonschema_node_html), 245 | latex=(visit_jsonschema_node_latex, depart_jsonschema_node_latex)) 246 | 247 | 248 | passoptionstopackages = r'\PassOptionsToPackage{dvipsnames}{xcolor}' 249 | 250 | 251 | latex_preamble = r""" 252 | \usepackage{changepage} 253 | \usepackage{xcolor} 254 | """ 255 | -------------------------------------------------------------------------------- /source/sphinxext/tab.py: -------------------------------------------------------------------------------- 1 | from docutils import nodes 2 | from docutils import statemachine 3 | from docutils.parsers.rst import Directive 4 | import re 5 | 6 | 7 | class AttrDict(dict): 8 | def __init__(self, *args, **kwargs): 9 | super(AttrDict, self).__init__(*args, **kwargs) 10 | self.__dict__ = self 11 | 12 | 13 | def split_content(l): 14 | parts = [] 15 | part = [] 16 | label = None 17 | 18 | def add_part(): 19 | if label is None: 20 | raise ValueError("No label specified") 21 | parts.append(AttrDict({ 22 | 'label': label, 23 | 'content': part})) 24 | 25 | for line in l: 26 | if line.startswith('--'): 27 | if len(part): 28 | add_part() 29 | part = [] 30 | label = line[2:].strip() 31 | else: 32 | part.append(line) 33 | 34 | add_part() 35 | 36 | return parts 37 | 38 | 39 | class pages(nodes.Element): 40 | local_attributes = ['parts'] 41 | 42 | def __init__(self, *args, **kwargs): 43 | self.parts = kwargs['parts'] 44 | nodes.Element.__init__(self, *args, **kwargs) 45 | 46 | 47 | class language_specific_pages(pages): 48 | header = 'Language-specific info:' 49 | 50 | 51 | class draft_pages(pages): 52 | header = 'Draft-specific info:' 53 | 54 | 55 | class section(nodes.Element): 56 | pass 57 | 58 | 59 | def visit_pages_node_html(self, node): 60 | node['classes'] = ['tabbable'] 61 | 62 | ul = nodes.bullet_list() 63 | ul['classes'] = ['nav', 'nav-tabs'] 64 | # set_source_info(self, ul) 65 | 66 | href = tab('', node.header) 67 | href['classes'] = ['disabled'] 68 | paragraph = nodes.paragraph('', '') 69 | li = nodes.list_item('') 70 | li['classes'] = ['disabled'] 71 | 72 | paragraph.append(href) 73 | li.append(paragraph) 74 | ul.append(li) 75 | 76 | first = True 77 | for part in node.parts: 78 | href = tab(part.label, part.label) 79 | href['refuri'] = '#' + make_id(node, part.label) 80 | paragraph = nodes.paragraph('') 81 | li = nodes.list_item('') 82 | if first: 83 | li['classes'].append('active') 84 | 85 
| paragraph.append(href) 86 | li.append(paragraph) 87 | ul.append(li) 88 | 89 | first = False 90 | 91 | node.append(ul) 92 | 93 | pages = section() 94 | pages['classes'] = ['tab-content'] 95 | 96 | first = True 97 | for part in node.parts: 98 | page = section() 99 | page['classes'] = ['tab-pane'] 100 | if first: 101 | page['classes'].append('active') 102 | page['ids'] = [make_id(node, part.label)] 103 | 104 | page.append(part.paragraph) 105 | pages.append(page) 106 | 107 | first = False 108 | 109 | node.append(pages) 110 | 111 | self.body.append(self.starttag(node, 'div')) 112 | 113 | 114 | def depart_pages_node_html(self, node): 115 | self.body.append('') 116 | 117 | 118 | def visit_pages_node_latex(self, node): 119 | for part in node.parts: 120 | t = tab('', '') 121 | t.label = part.label 122 | t.append(part.paragraph) 123 | node.append(t) 124 | 125 | 126 | def depart_pages_node_latex(self, node): 127 | pass 128 | 129 | 130 | class tab(nodes.General, nodes.Inline, nodes.Referential, nodes.TextElement): 131 | pass 132 | 133 | 134 | def visit_tab_node_html(self, node): 135 | atts = {} 136 | if 'refuri' in node: 137 | atts['href'] = node['refuri'] 138 | atts['data-toggle'] = 'tab' 139 | self.body.append(self.starttag(node, 'a', '', **atts)) 140 | 141 | 142 | def depart_tab_node_html(self, node): 143 | self.body.append('') 144 | 145 | 146 | def visit_tab_node_latex(self, node): 147 | self.body.append(r'\begin{jsonframe}{%s}{black}' % node.label) 148 | 149 | 150 | def depart_tab_node_latex(self, node): 151 | self.body.append(r'\end{jsonframe}') 152 | 153 | 154 | def make_id(self, label): 155 | return '{0}_{1}'.format(hex(id(self))[2:], re.sub(r"\W", "_", label)) 156 | 157 | 158 | class TabDirective(Directive): 159 | has_content = True 160 | 161 | def run(self): 162 | parts = split_content(self.content) 163 | container = self.make_container(parts) 164 | 165 | for part in parts: 166 | paragraph = nodes.paragraph('', '') 167 | content = statemachine.StringList(part.content) 168 | content.parent = self.content.parent 169 | self.state.nested_parse(content, 0, paragraph) 170 | part.paragraph = paragraph 171 | 172 | return [container] 173 | 174 | 175 | class LanguageSpecificDirective(TabDirective): 176 | def make_container(self, parts): 177 | return language_specific_pages(parts=parts) 178 | 179 | 180 | class DraftDirective(TabDirective): 181 | def make_container(self, parts): 182 | return draft_pages(parts=parts) 183 | 184 | 185 | def setup(app): 186 | app.add_node(tab, 187 | html=(visit_tab_node_html, depart_tab_node_html), 188 | latex=(visit_tab_node_latex, depart_tab_node_latex)) 189 | app.add_node(language_specific_pages, 190 | html=(visit_pages_node_html, 191 | depart_pages_node_html), 192 | latex=(visit_pages_node_latex, 193 | depart_pages_node_latex)) 194 | app.add_node(draft_pages, 195 | html=(visit_pages_node_html, 196 | depart_pages_node_html), 197 | latex=(visit_pages_node_latex, 198 | depart_pages_node_latex)) 199 | 200 | app.add_directive('language_specific', LanguageSpecificDirective) 201 | app.add_directive('draft_specific', DraftDirective) 202 | 203 | 204 | latex_preamble = r""" 205 | \usepackage{mdframed} 206 | \usepackage{tikz} 207 | 208 | \newenvironment{jsonframe}[2]{% 209 | \ifstrempty{#1}% 210 | {}% 211 | {\mdfsetup{% 212 | skipabove=10pt, 213 | frametitle={% 214 | \tikz[baseline=(current bounding box.east),outer sep=0pt,text=white] 215 | \node[anchor=east,rectangle,fill=#2] 216 | {\strut \textsf{ #1 }};}}% 217 | }% 218 | \mdfsetup{innertopmargin=10pt,linecolor=#2,% 219 | 
skipabove=10pt, 220 | linewidth=1pt,topline=true,nobreak=true, 221 | frametitleaboveskip=\dimexpr-\ht\strutbox\relax,} 222 | \begin{mdframed}[]\relax% 223 | }{\end{mdframed}} 224 | """ 225 | -------------------------------------------------------------------------------- /source/structuring.rst: -------------------------------------------------------------------------------- 1 | .. index:: 2 | single: structure 3 | 4 | .. _structuring: 5 | 6 | Structuring a complex schema 7 | ============================ 8 | 9 | .. contents:: :local: 10 | 11 | When writing computer programs of even moderate complexity, it's 12 | commonly accepted that "structuring" the program into reusable 13 | functions is better than copying-and-pasting duplicate bits of code 14 | everywhere they are used. Likewise in JSON Schema, for anything but 15 | the most trivial schema, it's really useful to structure the schema 16 | into parts that can be reused in a number of places. This chapter 17 | will present the tools available for reusing and structuring schemas 18 | as well as some practical examples that use those tools. 19 | 20 | .. index:: 21 | single: schema identification 22 | single: structuring; schema identification 23 | 24 | .. _schema-identification: 25 | 26 | Schema Identification 27 | --------------------- 28 | 29 | Like any other code, schemas are easier to maintain if they can be 30 | broken down into logical units that reference each other as necessary. 31 | In order to reference a schema, we need a way to identify a schema. 32 | Schema documents are identified by non-relative URIs. 33 | 34 | Schema documents are not required to have an identifier, but 35 | you will need one if you want to reference one schema from 36 | another. In this documentation, we will refer to schemas with no 37 | identifier as "anonymous schemas". 38 | 39 | In the following sections we will see how the "identifier" for a 40 | schema is determined. 41 | 42 | .. note:: 43 | URI terminology can sometimes be unintuitive. In this document, the 44 | following definitions are used. 45 | 46 | - **URI** `[1] 47 | <https://datatracker.ietf.org/doc/html/rfc3986#section-3>`__ or 48 | **non-relative URI**: A full URI containing a scheme (``https``). 49 | It may contain a URI fragment (``#foo``). Sometimes this document 50 | will use "non-relative URI" to make it extra clear that relative 51 | URIs are not allowed. 52 | - **relative reference** `[2] 53 | <https://datatracker.ietf.org/doc/html/rfc3986#section-4.2>`__: A 54 | partial URI that does not contain a scheme (``https``). It may 55 | contain a fragment (``#foo``). 56 | - **URI-reference** `[3] 57 | <https://datatracker.ietf.org/doc/html/rfc3986#section-4.1>`__: A 58 | relative reference or non-relative URI. It may contain a URI 59 | fragment (``#foo``). 60 | - **absolute URI** `[4] 61 | <https://datatracker.ietf.org/doc/html/rfc3986#section-4.3>`__: A 62 | full URI containing a scheme (``https``) but not a URI fragment 63 | (``#foo``). 64 | 65 | .. note:: 66 | Even though schemas are identified by URIs, those identifiers are 67 | not necessarily network-addressable. They are just identifiers. 68 | Generally, implementations don't make HTTP requests (``https://``) 69 | or read from the file system (``file://``) to fetch schemas. 70 | Instead, they provide a way to load schemas into an internal schema 71 | database. When a schema is referenced by its URI identifier, the 72 | schema is retrieved from the internal schema database. 73 | 74 | .. index:: 75 | single: base URI 76 | single: structuring; base URI 77 | 78 | ..
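As a rough illustration of the note above about internal schema databases, here is a minimal sketch of pre-registering a schema under its URI identifier using the Python ``jsonschema`` library's legacy ``RefResolver`` interface (newer releases use a ``referencing.Registry`` instead). The URIs and schemas below are purely illustrative, not part of any specification.

.. code-block:: python

    from jsonschema import Draft7Validator, RefResolver

    # The "internal schema database": schemas registered ahead of time
    # under their URI identifiers, so nothing is fetched over the network.
    address_schema = {
        "type": "object",
        "properties": {"street_address": {"type": "string"}},
        "required": ["street_address"],
    }

    customer_schema = {
        "type": "object",
        "properties": {
            "shipping_address": {"$ref": "https://example.com/schemas/address"}
        },
    }

    resolver = RefResolver(
        base_uri="https://example.com/schemas/customer",
        referrer=customer_schema,
        store={"https://example.com/schemas/address": address_schema},
    )

    # The reference is resolved from the store, not retrieved from example.com.
    Draft7Validator(customer_schema, resolver=resolver).validate(
        {"shipping_address": {"street_address": "123 Main St"}}
    )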
_base-uri: 79 | 80 | Base URI 81 | -------- 82 | 83 | Using non-relative URIs can be cumbersome, so any URIs used in 84 | JSON Schema can be URI-references that resolve against the schema's 85 | base URI, resulting in a non-relative URI. This section describes how a 86 | schema's base URI is determined. 87 | 88 | .. note:: 89 | Base URI determination and relative reference resolution are defined 90 | by `RFC-3986 91 | <https://datatracker.ietf.org/doc/html/rfc3986>`__. If 92 | you are familiar with how this works in HTML, this section should 93 | feel very familiar. 94 | 95 | .. index:: 96 | single: retrieval URI 97 | single: structuring; base URI; retrieval URI 98 | 99 | .. _retrieval-uri: 100 | 101 | Retrieval URI 102 | ~~~~~~~~~~~~~ 103 | 104 | The URI used to fetch a schema is known as the "retrieval URI". It's 105 | often possible to pass an anonymous schema to an implementation, in 106 | which case that schema would have no retrieval URI. 107 | 108 | Let's assume a schema is referenced using the URI 109 | ``https://example.com/schemas/address`` and the following schema is 110 | retrieved. 111 | 112 | .. schema_example:: 113 | 114 | { 115 | "type": "object", 116 | "properties": { 117 | "street_address": { "type": "string" }, 118 | "city": { "type": "string" }, 119 | "state": { "type": "string" } 120 | }, 121 | "required": ["street_address", "city", "state"] 122 | } 123 | 124 | The base URI for this schema is the same as the retrieval URI, 125 | ``https://example.com/schemas/address``. 126 | 127 | .. index:: 128 | single: $id 129 | single: structuring; base URI; $id 130 | 131 | .. _id: 132 | 133 | $id 134 | ~~~ 135 | 136 | You can set the base URI by using the ``$id`` keyword at the root of 137 | the schema. The value of ``$id`` is a URI-reference without a fragment 138 | that resolves against the `retrieval-uri`. The resulting URI is the 139 | base URI for the schema. 140 | 141 | .. draft_specific:: 142 | 143 | --Draft 4 144 | In Draft 4, ``$id`` is just ``id`` (without the dollar sign). 145 | 146 | --Draft 4-7 147 | In Draft 4-7, it was allowed to have fragments in an ``$id`` (or 148 | ``id`` in Draft 4). However, the behavior when setting a base URI 149 | that contains a URI fragment is undefined, so this should be avoided 150 | because implementations may treat such fragments differently. 151 | 152 | .. note:: 153 | This is analogous to the ``<base>`` `tag in HTML 154 | <https://html.spec.whatwg.org/multipage/semantics.html#the-base-element>`__. 155 | 156 | .. note:: 157 | When the ``$id`` keyword appears in a subschema, it means something 158 | slightly different. See the `bundling` section for more. 159 | 160 | Let's assume the URIs ``https://example.com/schema/address`` and 161 | ``https://example.com/schema/billing-address`` both identify the 162 | following schema. 163 | 164 | .. schema_example:: 165 | 166 | { 167 | "$id": "/schemas/address", 168 | 169 | "type": "object", 170 | "properties": { 171 | "street_address": { "type": "string" }, 172 | "city": { "type": "string" }, 173 | "state": { "type": "string" } 174 | }, 175 | "required": ["street_address", "city", "state"] 176 | } 177 | 178 | No matter which of the two URIs is used to retrieve this schema, the 179 | base URI will be ``https://example.com/schemas/address``, which is the 180 | result of the ``$id`` URI-reference resolving against the 181 | `retrieval-uri`. 182 | 183 | However, using a relative reference when setting a base URI can be 184 | problematic. For example, we couldn't use this schema as an 185 | anonymous schema because there would be no `retrieval-uri` and you 186 | can't resolve a relative reference against nothing.
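As a minimal sketch of that resolution behavior, using only Python's standard library rather than a JSON Schema implementation, notice that a relative reference only turns into a usable identifier when there is a base URI to resolve it against:

.. code-block:: python

    from urllib.parse import urljoin

    # With a retrieval URI acting as the base, the relative reference
    # resolves to a full, non-relative URI.
    urljoin("https://example.com/schemas/address", "/schemas/address")
    # -> 'https://example.com/schemas/address'

    # With no base URI at all (the anonymous-schema case), there is
    # nothing to resolve against and the result is still relative.
    urljoin("", "/schemas/address")
    # -> '/schemas/address'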
For this and other 187 | reasons, it's recommended that you always use an absolute URI when 188 | declaring a base URI with ``$id``. 189 | 190 | The base URI of the following schema will always be 191 | ``https://example.com/schemas/address`` no matter what the 192 | `retrieval-uri` was or whether it's used as an anonymous schema. 193 | 194 | .. schema_example:: 195 | 196 | { 197 | "$id": "https://example.com/schemas/address", 198 | 199 | "type": "object", 200 | "properties": { 201 | "street_address": { "type": "string" }, 202 | "city": { "type": "string" }, 203 | "state": { "type": "string" } 204 | }, 205 | "required": ["street_address", "city", "state"] 206 | } 207 | 208 | .. index:: 209 | single: JSON Pointer 210 | single: structuring; subschema identification; JSON Pointer 211 | 212 | .. _json-pointer: 213 | 214 | JSON Pointer 215 | ~~~~~~~~~~~~ 216 | 217 | In addition to identifying a schema document, you can also identify 218 | subschemas. The most common way to do that is to use a `JSON Pointer 219 | <https://datatracker.ietf.org/doc/html/rfc6901>`__ in the URI fragment that 220 | points to the subschema. 221 | 222 | A JSON Pointer describes a slash-separated path to traverse the keys 223 | in the objects in the document. Therefore, 224 | ``/properties/street_address`` means: 225 | 226 | 1) find the value of the key ``properties`` 227 | 2) within that object, find the value of the key ``street_address`` 228 | 229 | The URI 230 | ``https://example.com/schemas/address#/properties/street_address`` 231 | identifies the highlighted subschema in the following schema. 232 | 233 | .. schema_example:: 234 | 235 | { 236 | "$id": "https://example.com/schemas/address", 237 | 238 | "type": "object", 239 | "properties": { 240 | "street_address": 241 | * { "type": "string" }, 242 | "city": { "type": "string" }, 243 | "state": { "type": "string" } 244 | }, 245 | "required": ["street_address", "city", "state"] 246 | } 247 | 248 | .. index:: 249 | single: $anchor 250 | single: structuring; subschema identification; $anchor 251 | 252 | .. _anchor: 253 | 254 | $anchor 255 | ~~~~~~~ 256 | 257 | A less common way to identify a subschema is to create a named anchor 258 | in the schema using the ``$anchor`` keyword and using that name in the 259 | URI fragment. Anchors must start with a letter followed by any number 260 | of letters, digits, ``-``, ``_``, ``:``, or ``.``. 261 | 262 | .. draft_specific:: 263 | 264 | --Draft 4 265 | In Draft 4, you declare an anchor the same way you do in Draft 6-7 266 | except that ``$id`` is just ``id`` (without the dollar sign). 267 | 268 | --Draft 6-7 269 | In Draft 6-7, a named anchor is defined using an ``$id`` that 270 | contains only a URI fragment. The value of the URI fragment is the 271 | name of the anchor. 272 | 273 | JSON Schema doesn't define how ``$id`` should be interpreted when 274 | it contains both fragment and non-fragment URI parts. Therefore, 275 | when setting a named anchor, you should not use non-fragment URI 276 | parts in the URI-reference. 277 | 278 | .. note:: 279 | If a named anchor is defined that doesn't follow these naming 280 | rules, then behavior is undefined. Your anchors might work in some 281 | implementations, but not others. 282 | 283 | The URI ``https://example.com/schemas/address#street_address`` 284 | identifies the subschema on the highlighted part of the following 285 | schema. 286 | 287 | ..
schema_example:: 288 | 289 | { 290 | "$id": "https://example.com/schemas/address", 291 | 292 | "type": "object", 293 | "properties": { 294 | "street_address": 295 | * { 296 | * "$anchor": "street_address", 297 | * "type": "string" 298 | * }, 299 | "city": { "type": "string" }, 300 | "state": { "type": "string" } 301 | }, 302 | "required": ["street_address", "city", "state"] 303 | } 304 | 305 | .. index:: 306 | single: $ref 307 | single: structuring; $ref 308 | 309 | .. _ref: 310 | 311 | $ref 312 | ---- 313 | 314 | A schema can reference another schema using the ``$ref`` keyword. The 315 | value of ``$ref`` is a URI-reference that is resolved against the 316 | schema's `base-uri`. When evaluating a ``$ref``, an implementation 317 | uses the resolved identifier to retrieve the referenced schema and 318 | applies that schema to the instance. 319 | 320 | .. draft_specific:: 321 | 322 | -- Draft 4-7 323 | In Draft 4-7, ``$ref`` behaves a little differently. When an 324 | object contains a ``$ref`` property, the object is considered a 325 | reference, not a schema. Therefore, any other properties you put 326 | in that object will not be treated as JSON Schema keywords and will 327 | be ignored by the validator. ``$ref`` can only be used where a 328 | schema is expected. 329 | 330 | For this example, let's say we want to define a customer record, where 331 | each customer may have both a shipping and a billing address. 332 | Addresses are always the same---they have a street address, city and 333 | state---so we don't want to duplicate that part of the schema 334 | everywhere we want to store an address. Not only would that make the 335 | schema more verbose, but it makes updating it in the future more 336 | difficult. If our imaginary company were to start doing international 337 | business in the future and we wanted to add a country field to all the 338 | addresses, it would be better to do this in a single place rather than 339 | everywhere that addresses are used. 340 | 341 | .. schema_example:: 342 | 343 | { 344 | "$id": "https://example.com/schemas/customer", 345 | 346 | "type": "object", 347 | "properties": { 348 | "first_name": { "type": "string" }, 349 | "last_name": { "type": "string" }, 350 | "shipping_address": { "$ref": "/schemas/address" }, 351 | "billing_address": { "$ref": "/schemas/address" } 352 | }, 353 | "required": ["first_name", "last_name", "shipping_address", "billing_address"] 354 | } 355 | 356 | The URI-references in ``$ref`` resolve against the schema's `base-uri` 357 | (``https://example.com/schemas/customer``) which results in 358 | ``https://example.com/schemas/address``. The implementation retrieves 359 | that schema and uses it to evaluate the "shipping_address" and 360 | "billing_address" properties. 361 | 362 | .. note:: 363 | When using ``$ref`` in an anonymous schema, relative references may 364 | not be resolvable. Let's assume this example is used as an 365 | anonymous schema. 366 | 367 | .. 
schema_example:: 368 | 369 | { 370 | "type": "object", 371 | "properties": { 372 | "first_name": { "type": "string" }, 373 | "last_name": { "type": "string" }, 374 | "shipping_address": { "$ref": "https://example.com/schemas/address" }, 375 | "billing_address": { "$ref": "/schemas/address" } 376 | }, 377 | "required": ["first_name", "last_name", "shipping_address", "billing_address"] 378 | } 379 | 380 | The ``$ref`` at ``/properties/shipping_address`` can resolve just 381 | fine without a non-relative base URI to resolve against, but the 382 | ``$ref`` at ``/properties/billing_address`` can't resolve to a 383 | non-relative URI and therefore can't be used to retrieve the 384 | address schema. 385 | 386 | .. index:: 387 | single: $defs 388 | single: structuring; $defs 389 | 390 | .. _defs: 391 | 392 | $defs 393 | ----- 394 | 395 | Sometimes we have small subschemas that are only intended for use in 396 | the current schema and it doesn't make sense to define them as 397 | separate schemas. Although we can identify any subschema using JSON 398 | Pointers or named anchors, the ``$defs`` keyword gives us a 399 | standardized place to keep subschemas intended for reuse in the 400 | current schema document. 401 | 402 | Let's extend the previous customer schema example to use a common 403 | schema for the name properties. It doesn't make sense to define a new 404 | schema for this and it will only be used in this schema, so it's a 405 | good candidate for using ``$defs``. 406 | 407 | .. schema_example:: 408 | 409 | { 410 | "$id": "https://example.com/schemas/customer", 411 | 412 | "type": "object", 413 | "properties": { 414 | "first_name": { "$ref": "#/$defs/name" }, 415 | "last_name": { "$ref": "#/$defs/name" }, 416 | "shipping_address": { "$ref": "/schemas/address" }, 417 | "billing_address": { "$ref": "/schemas/address" } 418 | }, 419 | "required": ["first_name", "last_name", "shipping_address", "billing_address"], 420 | 421 | "$defs": { 422 | "name": { "type": "string" } 423 | } 424 | } 425 | 426 | ``$defs`` isn't just good for avoiding duplication. It can also be 427 | useful for writing schemas that are easier to read and maintain. 428 | Complex parts of the schema can be defined in ``$defs`` with 429 | descriptive names and referenced where they're needed. This allows 430 | readers of the schema to more quickly and easily understand the schema 431 | at a high level before diving into the more complex parts. 432 | 433 | .. note:: 434 | It's possible to reference an external subschema, but generally you 435 | want to limit a ``$ref`` to referencing either an external schema 436 | or an internal subschema defined in ``$defs``. 437 | 438 | .. index:: 439 | single: recursion 440 | single: $ref 441 | single: structuring; recursion; $ref 442 | 443 | .. _recursion: 444 | 445 | Recursion 446 | --------- 447 | 448 | The ``$ref`` keyword may be used to create recursive schemas that 449 | refer to themselves. For example, you might have a ``person`` schema 450 | that has an array of ``children``, each of which is also a ``person`` 451 | instance. 452 | 453 | ..
schema_example:: 454 | 455 | { 456 | "type": "object", 457 | "properties": { 458 | "name": { "type": "string" }, 459 | "children": { 460 | "type": "array", 461 | * "items": { "$ref": "#" } 462 | } 463 | } 464 | } 465 | -- 466 | // A snippet of the British royal family tree 467 | { 468 | "name": "Elizabeth", 469 | "children": [ 470 | { 471 | "name": "Charles", 472 | "children": [ 473 | { 474 | "name": "William", 475 | "children": [ 476 | { "name": "George" }, 477 | { "name": "Charlotte" } 478 | ] 479 | }, 480 | { 481 | "name": "Harry" 482 | } 483 | ] 484 | } 485 | ] 486 | } 487 | 488 | Above, we created a schema that refers to itself, effectively creating 489 | a "loop" in the validator, which is both allowed and useful. Note, 490 | however, that a ``$ref`` referring to another ``$ref`` could cause 491 | an infinite loop in the resolver, and is explicitly disallowed. 492 | 493 | .. schema_example:: 494 | 495 | { 496 | "$defs": { 497 | "alice": { "$ref": "#/$defs/bob" }, 498 | "bob": { "$ref": "#/$defs/alice" } 499 | } 500 | } 501 | 502 | .. index:: 503 | single: Extending Recursive Schemas 504 | single: $recursiveRef 505 | single: $recursiveAnchor 506 | single: structuring; Extending Recursive Schemas 507 | 508 | .. _extending-recursive-schemas: 509 | 510 | Extending Recursive Schemas 511 | --------------------------- 512 | 513 | |Draft2019-09| 514 | 515 | Documentation Coming Soon 516 | 517 | .. index:: 518 | single: bundling 519 | single: $id 520 | single: structuring; bundling; $id 521 | 522 | .. _bundling: 523 | 524 | Bundling 525 | -------- 526 | 527 | Working with multiple schema documents is convenient for development, 528 | but it's often more convenient for distribution to bundle all of your 529 | schemas into a single schema document. This can be done using the 530 | ``$id`` keyword in a subschema. When ``$id`` is used in a subschema, 531 | it indicates an embedded schema. The identifier for the embedded 532 | schema is the value of ``$id`` resolved against the `base-uri` of the 533 | schema it appears in. A schema document that includes embedded schemas 534 | is called a Compound Schema Document. Each schema with an ``$id`` in a 535 | Compound Schema Document is called a Schema Resource. 536 | 537 | .. draft_specific:: 538 | 539 | --Draft 4 540 | In Draft 4, ``$id`` is just ``id`` (without the dollar sign). 541 | 542 | --Draft 4-7 543 | In Draft 4-7, an ``$id`` in a subschema did not indicate an 544 | embedded schema. Instead it was simply a base URI change in a 545 | single schema document. 546 | 547 | .. note:: 548 | This is analogous to the ``