57 | {% endblock %}
58 |
59 | {# The blocks below, defined in the basic layout, are effectively removed from
60 | the template as we replace their contents with an empty string. #}
61 | {% block relbar1 %}{% endblock %}
62 | {% block relbar2 %}{% endblock %}
63 | {% block footer %}{% endblock %}
64 |
--------------------------------------------------------------------------------
/src/exceptions-and-errors.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | .. default-domain:: spec
5 |
6 | .. _fls_dzq9cdz4ibsz:
7 |
8 | Exceptions and Errors
9 | =====================
10 |
11 | .. rubric:: Legality Rules
12 |
13 | :dp:`fls_vsk4vhnuiyyz`
14 | The Rust programming language lacks exceptions and exception handlers. Instead,
15 | the language uses the following tiered error handling scheme:
16 |
17 | * :dp:`fls_ebangxc36t74`
18 | A possibly absent :t:`value` is usually represented using :t:`enum`
19 | :std:`core::option::Option`.
20 |
21 | * :dp:`fls_ckeitwiv326r`
22 | The result of a possibly erroneous computation is usually represented using
23 | :t:`enum` :std:`core::result::Result`.
24 |
25 | * :dp:`fls_eg0orgibg98m`
26 | Erroneous behavior is signaled using :t:`macro` :std:`core::panic`.
27 |
28 | :dp:`fls_ko1x0gp9e7y3`
29 | :t:`Enum` :std:`core::option::Option` indicates whether a :t:`value` is
30 | either present using :std:`core::option::Option::Some` or absent using
31 | :std:`core::option::Option::None`.
32 |
33 | :dp:`fls_gwu4cn4ziabe`
34 | :t:`Enum` :std:`core::result::Result` indicates whether a computation completed
35 | successfully, producing a :t:`value` using :std:`core::result::Result::Ok`, or
36 | failed with an error using :std:`core::result::Result::Err`.
37 |
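.. rubric:: Examples

The following sketch illustrates the tiered error handling scheme described
above; the function names are illustrative.

.. code-block:: rust

   fn first_char(text: &str) -> Option<char> {
       text.chars().next()
   }

   fn parse_number(text: &str) -> Result<i32, core::num::ParseIntError> {
       text.parse()
   }
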
38 | .. _fls_k02nt1m5fq1z:
39 |
40 | Panic
41 | -----
42 |
43 | .. rubric:: Legality Rules
44 |
45 | :dp:`fls_a554v4n0khye`
46 | A :t:`panic` is an abnormal program state caused by invoking :t:`macro`
47 | :std:`core::panic`.
48 |
49 | .. rubric:: Dynamic Semantics
50 |
51 | :dp:`fls_i9njhpte5l0t`
52 | Invoking :t:`macro` :std:`core::panic` has the following runtime effects:
53 |
54 | #. :dp:`fls_n6q7bksyn1m`
55 | Control flow halts the execution of the current thread.
56 |
57 | #. :dp:`fls_xmtt04lw517w`
58 | Control flow of the current thread resumes execution by invoking the
59 | :t:`function` subject to :t:`attribute` :c:`panic_handler`.
60 |
61 | .. rubric:: Examples
62 |
63 | .. code-block:: rust
64 |
65 | panic!("This was a terrible mistake!");
66 |
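The following is a sketch of a :t:`function` subject to :t:`attribute`
:c:`panic_handler`, as it might appear in a ``no_std`` program; the function
name and body are illustrative.

.. code-block:: rust

   #![no_std]

   use core::panic::PanicInfo;

   #[panic_handler]
   fn handle_panic(_info: &PanicInfo) -> ! {
       loop {}
   }
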
67 | .. _fls_hi1iz0gbnksi:
68 |
69 | Abort
70 | -----
71 |
72 | .. rubric:: Legality Rules
73 |
74 | :dp:`fls_9a1izu3omkbn`
75 | :t:`Abort` is the immediate termination of a program.
76 |
77 | .. rubric:: Dynamic Semantics
78 |
79 | :dp:`fls_iq6olct3rw4u`
80 | :t:`Abort` has the following runtime effects:
81 |
82 | #. :dp:`fls_wd2q6ft9yzrg`
83 | Control flow halts the execution of all threads.
84 |
85 | #. :dp:`fls_7bnrbjb0pq5n`
86 | The program terminates.
87 |
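.. rubric:: Examples

The following sketch shows one way a program may abort; ``std::process::abort``
is provided by the standard library.

.. code-block:: rust

   std::process::abort();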
88 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec_lints/require_paragraph_ids.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from docutils import nodes
5 | from ferrocene_spec.definitions import DefIdNode
6 |
7 |
8 | def check(app, raise_error):
9 | for document in app.env.found_docs:
10 | doctree = app.env.get_doctree(document)
11 | if document in app.config.lint_no_paragraph_ids:
12 | check_does_not_have_ids(doctree, raise_error)
13 | else:
14 | check_has_ids(doctree, raise_error)
15 |
16 |
17 | def check_has_ids(node, raise_error):
18 | is_paragraph = nodes.paragraph is type(node)
19 |
20 | if is_paragraph and nodes.section is type(node.parent):
21 | should_have_id(node, "paragraph", raise_error)
22 | elif is_paragraph and nodes.list_item is type(node.parent):
23 | should_have_id(node, "list item", raise_error)
24 | elif is_paragraph and nodes.entry is type(node.parent):
25 | if node.parent.parent.index(node.parent) == 0:
26 | should_have_id(node, "first cell of a table row", raise_error)
27 | else:
28 | should_not_have_id(
29 | node,
30 | "second or later cell of a table row",
31 | raise_error,
32 | )
33 | elif nodes.section is type(node):
34 | if not any(name.startswith("fls_") for name in node["names"]):
35 | raise_error("section should have an id", location=node)
36 | else:
37 | should_not_have_id(node, type(node).__name__, raise_error)
38 |
39 | for child in node.children:
40 | check_has_ids(child, raise_error)
41 |
42 |
43 | def check_does_not_have_ids(node, raise_error):
44 | if nodes.section is type(node):
45 | if any(name.startswith("fls_") for name in node["names"]):
46 | raise_error("section should not have an id", location=node)
47 | else:
48 | should_not_have_id(node, type(node).__name__, raise_error)
49 |
50 | for child in node.children:
51 | check_does_not_have_ids(child, raise_error)
52 |
53 |
54 | def should_have_id(node, what, raise_error):
55 | if any(is_definition(child) for child in node.children[1:]):
56 | raise_error(f"id in {what} is not the first element", location=node)
57 | elif not len(node.children) or not is_definition(node.children[0]):
58 | raise_error(f"{what} should have an id", location=node)
59 |
60 |
61 | def should_not_have_id(node, what, raise_error):
62 | if any(is_definition(child) for child in node.children):
63 | raise_error(f"{what} should not have an id", location=node)
64 |
65 |
66 | def is_definition(node):
67 | return (
68 | DefIdNode is type(node)
69 | and node["def_kind"] == "paragraph"
70 | and node["def_id"].startswith("fls_")
71 | )
72 |
--------------------------------------------------------------------------------
/src/background.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Rust Project Contributors
3 |
4 | .. default-domain:: spec
5 |
6 | .. informational-page::
7 |
8 | .. _fls_mLo6B3EWF50J:
9 |
10 | FLS Background
11 | ==============
12 |
13 | :dp:`fls_p7Jiyppmg0FU` The FLS is a document describing aspects of the Rust
14 | language for Rust toolchain qualification purposes.
15 |
16 | :dp:`fls_Uvf5tHsKSV19` It was created by Ferrous Systems, in an original joint
17 | effort with AdaCore, as one of the prerequisites for qualifying `Ferrocene`_, a
18 | Rust toolchain qualified for safety-critical environments. The FLS is compiled
19 | of existing Rust documentation, but presented with a rigorous structure in order
20 | to meet the requirements of qualification.
21 |
22 | :dp:`fls_J7ZI7mBXffzZ` The FLS is not intended to be used as *the* normative
23 | specification of the Rust language (see the `Rust Reference`_), nor is it meant
24 | to replace the decision-making processes of the Rust project. Any difference
25 | between the FLS and the behavior of the Rust compiler is considered an error on
26 | our part and the FLS will be updated accordingly.
27 |
28 | :dp:`fls_ffv8XSbBOMkR` The FLS text is licensed under either the ``MIT`` or
29 | ``Apache-2.0`` licenses, at your option. Individual files might have different
30 | licensing. Licensing metadata is present in each file, and the full license
31 | texts are present in the ``LICENSES/`` directory.
32 |
33 | .. _Ferrocene: https://ferrocene.dev
34 | .. _Rust Reference: https://doc.rust-lang.org/reference/
35 |
36 | .. _fls_UMsvnuLqzd99:
37 |
38 | Acknowledging Ferrous Systems
39 | -----------------------------
40 |
41 | :dp:`fls_lmlLUAdtfCo5` The Rust Project would like to thank `Ferrous Systems`_
42 | for `donating`_ the FLS (formerly the Ferrocene Language Specification) to the
43 | Rust Project for its continued maintenance and development.
44 |
45 | :dp:`fls_FZRrMT5AYsAQ` As a brief history, the FLS is a description of the Rust
46 | programming language, developed by Ferrous Systems and `AdaCore`_ in July 2022
47 | as part of Ferrocene, a Rust compiler and toolchain designed for safety-critical
48 | and regulated industries. The FLS provides a structured and detailed reference
49 | for Rust's syntax, semantics, and behavior, serving as a foundation for
50 | verification, compliance, and standardization efforts. The FLS represented a
51 | major step toward describing Rust in a way that aligns with industry
52 | requirements, particularly in high-assurance domains. Until its transfer in
53 | April 2025, Ferrous Systems had been the sole steward of the FLS since July
54 | 2023.
55 |
56 | .. _Ferrous Systems: https://ferrous-systems.com
57 | .. _donating: https://rustfoundation.org/media/ferrous-systems-donates-ferrocene-language-specification-to-rust-project/
58 | .. _AdaCore: https://www.adacore.com
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/change.yml:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Rust Project Developers
3 |
4 | name: Change
5 | description: File a change for the FLS
6 | title: "[Change]: "
7 | labels: ["change intake"]
8 | assignees:
9 | - PLeVasseur
10 | body:
11 | - type: markdown
12 | attributes:
13 | value: |
14 | Thank you for contributing a change!
15 | - type: input
16 | id: title
17 | attributes:
18 | label: Change description
19 | description: Description of what is intended to be changed
20 | placeholder: Fix description of lexing due to xx
21 | validations:
22 | required: true
23 | - type: dropdown
24 | id: area
25 | attributes:
26 | label: Area
27 | description: Which part of the language does this fall under?
28 | options:
29 | - ABI
30 | - async / await
31 | - attributes
32 | - closures
33 | - conditional compilation
34 | - const evaluation
35 | - conventions
36 | - destructors
37 | - diagnostics
38 | - "edition: 2015"
39 | - "edition: 2018"
40 | - "edition: 2021"
41 | - "edition: 2024"
42 | - editions
43 | - generics
44 | - glossary
45 | - grammar
46 | - inline assembly
47 | - linkage
48 | - macros
49 | - method call
50 | - panic
51 | - patterns
52 | - proc macros
53 | - resolution (names, paths, namespaces, preludes, ...)
54 | - types
55 | - type inference
56 | - type layout
57 | - undefined behavior
58 | - N/A
59 | validations:
60 | required: true
61 | - type: dropdown
62 | id: category
63 | attributes:
64 | label: Category
65 | description: What kind of change is this?
66 | options:
67 | - Bug
68 | - Formatting
69 | - Improvement to non-prose section
70 | - Language cleanup
71 | - New content
72 | - Other
73 | validations:
74 | required: true
75 | - type: input
76 | id: fls-id
77 | attributes:
78 | label: FLS Paragraph ID
79 | description: |
80 | The Paragraph ID from the FLS corresponding to what this change covers.
81 | The easiest way to obtain this is to navigate to the
82 | [FLS](https://rust-lang.github.io/fls), right-click a section ID
83 | (e.g. `4.2:2`), inspect, and then find it in the pane that opens in
84 | your browser.
85 |
86 | **Note**: This is not needed for new content.
87 | placeholder: fls_deaddeadbeef
88 | validations:
89 | required: false
90 | - type: textarea
91 | id: change-description
92 | attributes:
93 | label: Change description (extended)
94 | description: |
95 | A longer description of the intended change, including background and
96 | details on in what way the FLS is improved by the change.
97 | placeholder: Fix description of lexing due to xx, since under conditions yy it's possible for zz to occur.
98 | validations:
99 | required: true
100 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/definitions/paragraphs.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from .. import utils
5 | from collections import defaultdict
6 | from docutils import nodes
7 | import hashlib
8 | import sphinx
9 |
10 |
11 | ROLE = "p"
12 | NAME = "paragraph"
13 | PRETTY_NAME = "paragraph"
14 |
15 |
16 | class Paragraph:
17 | def __init__(self, id, document, section_anchor, section_id, plaintext, sequential):
18 | self.id = id
19 | self.document = document
20 | self.section_anchor = section_anchor
21 | self.section_id = section_id
22 | self.plaintext = plaintext
23 | self.sequential = sequential
24 |
25 | def number(self, env):
26 | section = ".".join(
27 | str(n) for n in env.toc_secnumbers[self.document][self.section_anchor]
28 | )
29 | return f"{section}:{self.sequential}"
30 |
31 | def anchor(self):
32 | return self.id
33 |
34 | def include_in_search(self):
35 | return False
36 |
37 | def display_name(self, env):
38 | return f"{self.number(env)} {self.content_checksum()}"
39 |
40 | def content_checksum(self):
41 | sha256 = hashlib.sha256()
42 | sha256.update(self.plaintext.encode("utf-8"))
43 | return sha256.hexdigest()
44 |
45 |
46 | def collect_items_in_document(app, nodes_to_collect):
47 | ids = defaultdict(lambda: 1)
48 | for node in nodes_to_collect:
49 | section_node = find_parent_of_type(node, nodes.section)
50 | if section_node is None:
51 | raise RuntimeError(f"could not find section for {node!r}")
52 |
53 | try:
54 | section_id, section_anchor = utils.section_id_and_anchor(section_node)
55 | except utils.NoSectionIdError:
56 | logger = sphinx.util.logging.getLogger(__name__)
57 | logger.warning(
58 | "paragraph inside a section that doesn't have an id starting with fls_",
59 | location=node,
60 | )
61 | return
62 |
63 | yield Paragraph(
64 | id=node["def_id"],
65 | document=app.env.docname,
66 | section_anchor=section_anchor,
67 | section_id=section_id,
68 | plaintext=plaintext_paragraph(node),
69 | sequential=ids[section_id],
70 | )
71 | ids[section_id] += 1
72 |
73 |
74 | def replace_id_node(app, node, paragraph):
75 | new = nodes.inline()
76 | new["ids"].append(paragraph.id)
77 | new["classes"].append("spec-paragraph-id")
78 | new += nodes.Text(paragraph.number(app.env))
79 | node.replace_self(new)
80 |
81 |
82 | def create_ref_node(env, text, item):
83 | if item is not None:
84 | return nodes.emphasis("", item.number(env))
85 | else:
86 | return nodes.emphasis("", "Paragraph " + text)
87 |
88 |
89 | def plaintext_paragraph(node):
90 | paragraph = find_parent_of_type(node, nodes.paragraph)
91 | if paragraph is None:
92 | paragraph = node
93 | return paragraph.astext().replace("\n", " ")
94 |
95 |
96 | def find_parent_of_type(node, ty):
97 | cursor = node
98 | while cursor is not None:
99 | if isinstance(cursor, ty):
100 | return cursor
101 | cursor = cursor.parent
102 | return None
103 |
--------------------------------------------------------------------------------
/CONTRIBUTING.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 | SPDX-FileCopyrightText: The Rust Project Contributors
4 |
5 | ====================================================
6 | Contributing to the FLS
7 | ====================================================
8 |
9 | The FLS is released publicly under open source licenses, specifically MIT and
10 | Apache 2.0.
11 |
12 | The specification is still a work in progress, and while we're open to
13 | contributions, we want to manage your expectations on how much we'll be able to
14 | interact with the community in this repository:
15 |
16 | * Any change made to the specification text causes extra work for us behind the
17 | scenes, as we have other tooling required for qualification that consumes the
18 | specification text.
19 |
20 | * Our resources are limited when it comes to reviewing PRs, especially large
21 | ones.
22 |
23 | We'll try our best to review changes proposed by the community, but we might not
24 | be able to review all of them (or they might be out of date once we get to
25 | them). If there are changes you'd like to make, we recommend opening an issue
26 | beforehand, so that we can provide feedback on whether we'll be able to merge
27 | the changes.
28 |
29 | We've all dealt with those open source projects that feel open in name only, and
30 | have big patches and history-free source drops appearing from behind the walls
31 | of some large organization. We don't like that, and we're not going to do that.
32 | But please bear with us until we have the capacity to accept all external
33 | contributions.
34 |
35 | This introduction was inspired by Oxide Computer Company's `Hubris
36 | contribution guidelines
37 | `_.
38 |
39 | Contribution Process
40 | ====================
41 |
42 | Before contributing, it would be helpful to familiarize yourself with the
43 | grammar and structure of the FLS. You'll find everything you need in `Chapter
44 | 1: General `_ of the specification.
45 |
46 | There are three kinds of contribution that can be made:
47 |
48 | * **Flags:** Here the contributor makes the Ferrocene team aware of changes that
49 | were made upstream, but leaves the changes up to the Ferrocene team. This
50 | helps Ferrocene with our mission to "keep the moving parts safe" and ensures
51 | that we can respond to upstream changes in a timely manner. Please `open an
52 | issue `_ for this.
53 |
54 | * **Fixes:** These are defined as small errors or suggestions for improvements
55 | within one or two sentences. The structure of the content still makes sense,
56 | but the contribution makes the concept more precise or better defined. To
57 | contribute a fix to the FLS, please open a pull request!
58 |
59 | * **Rewrites:** These are bigger changes in which a contributor rewrites an
60 | entire chapter, section or subsection. Please open an issue to discuss the
61 | rewrite you want to make, and if we have the capacity to accept and review it
62 | we'll coordinate on how to best do it.
63 |
64 | PRs and approval
65 | ================
66 |
67 | Whether the changes appear in the FLS will remain the responsibility and
68 | discretion of the Ferrocene team. Changes may be approved as is or may be edited
69 | to match the tone, style and format of the FLS.
70 |
71 | Again, we thank you for your patience as we address the suggested changes and,
72 | last but not least, thank you for your interest and contributions to the FLS!
73 |
--------------------------------------------------------------------------------
/themes/fls/static/rust-lang.svg:
--------------------------------------------------------------------------------
1 |
2 |
3 |
64 |
--------------------------------------------------------------------------------
/themes/fls/static/favicon.svg:
--------------------------------------------------------------------------------
1 |
2 |
3 |
66 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/informational.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from . import utils
5 | from collections import defaultdict
6 | from docutils import nodes
7 | from sphinx import addnodes
8 | from sphinx.directives import SphinxDirective
9 | from sphinx.environment.collectors import EnvironmentCollector
10 | from sphinx.transforms import SphinxTransform
11 | import sphinx
12 |
13 |
14 | # Marker to use inside the informational storage to signal that the whole page,
15 | # and not just a section, is informational
16 | WHOLE_PAGE = "{{{whole-page}}}"
17 |
18 |
19 | def build_directive(kind):
20 | class InformationalDirective(SphinxDirective):
21 | has_content = False
22 |
23 | def run(self):
24 | paragraph = nodes.paragraph()
25 | paragraph += nodes.Text(f"The contents of this {kind} are informational.")
26 |
27 | note = nodes.note()
28 | note += paragraph
29 |
30 | return [note, InformationalMarkerNode(kind)]
31 |
32 | return InformationalDirective
33 |
34 |
35 | class InformationalMarkerNode(nodes.Element):
36 | def __init__(self, kind):
37 | super().__init__()
38 | self["kind"] = kind
39 |
40 |
41 | class InformationalPagesCollector(EnvironmentCollector):
42 | def clear_doc(self, app, env, docname):
43 | storage = get_storage(env)
44 | if docname in storage:
45 | del storage[docname]
46 |
47 | def merge_other(self, app, env, docnames, other):
48 | storage = get_storage(env)
49 | for document, contents in get_storage(other).items():
50 | storage[document] = storage[document].union(contents)
51 |
52 | def process_doc(self, app, document):
53 | storage = get_storage(app.env)
54 | for node in document.findall(InformationalMarkerNode):
55 | if node["kind"] == "page":
56 | self.process_page(app, storage, node)
57 | elif node["kind"] == "section":
58 | self.process_section(app, storage, node)
59 | else:
60 | raise RuntimeError("unknown directive kind: " + node["kind"])
61 |
62 | def process_page(self, app, storage, node):
63 | if nodes.document is not type(node.parent):
64 | warn("informational-page must be at the top of the document", node)
65 | return
66 |
67 | storage[app.env.docname].add(WHOLE_PAGE)
68 |
69 | def process_section(self, app, storage, node):
70 | if nodes.section is not type(node.parent):
71 | warn("informational-section must be inside a section", node)
72 | return
73 |
74 | try:
75 | _id, anchor = utils.section_id_and_anchor(node.parent)
76 | except utils.NoSectionIdError:
77 | warn(
78 | "informational-section must be inside a section with an ID starting with fls_",
79 | node,
80 | )
81 | return
82 | storage[app.env.docname].add(anchor)
83 |
84 |
85 | class RemoveInformationalMarkerNodesTransform(SphinxTransform):
86 | default_priority = 500
87 |
88 | def apply(self):
89 | for node in self.document.findall(InformationalMarkerNode):
90 | node.parent.remove(node)
91 |
92 |
93 | def is_document_informational(env, docname):
94 | return WHOLE_PAGE in get_storage(env)[docname]
95 |
96 |
97 | def is_section_informational(env, docname, anchor):
98 | return anchor in get_storage(env)[docname]
99 |
100 |
101 | def get_storage(env):
102 | key = "spec_informational"
103 | if not hasattr(env, key):
104 | setattr(env, key, defaultdict(set))
105 | return getattr(env, key)
106 |
107 |
108 | def warn(message, location):
109 | logger = sphinx.util.logging.getLogger(__name__)
110 | logger.warning(message, location=location)
111 |
112 |
113 | def setup(app):
114 | app.add_node(InformationalMarkerNode)
115 | app.add_env_collector(InformationalPagesCollector)
116 | app.add_post_transform(RemoveInformationalMarkerNodesTransform)
117 |
--------------------------------------------------------------------------------
/LICENSES/OFL-1.1.txt:
--------------------------------------------------------------------------------
1 | Copyright (c) 2025, The Raleway Project Authors (impallari@gmail.com),
2 | with Reserved Font Name Raleway.
3 |
4 | This Font Software is licensed under the SIL Open Font License, Version 1.1.
5 |
6 | This license is copied below, and is also available with a FAQ at: http://scripts.sil.org/OFL
7 |
8 | SIL OPEN FONT LICENSE
9 |
10 | Version 1.1 - 26 February 2007
11 |
12 | PREAMBLE
13 |
14 | The goals of the Open Font License (OFL) are to stimulate worldwide development of collaborative font projects, to support the font creation efforts of academic and linguistic communities, and to provide a free and open framework in which fonts may be shared and improved in partnership with others.
15 |
16 | The OFL allows the licensed fonts to be used, studied, modified and redistributed freely as long as they are not sold by themselves. The fonts, including any derivative works, can be bundled, embedded, redistributed and/or sold with any software provided that any reserved names are not used by derivative works. The fonts and derivatives, however, cannot be released under any other type of license. The requirement for fonts to remain under this license does not apply to any document created using the fonts or their derivatives.
17 |
18 | DEFINITIONS
19 |
20 | "Font Software" refers to the set of files released by the Copyright Holder(s) under this license and clearly marked as such. This may include source files, build scripts and documentation.
21 |
22 | "Reserved Font Name" refers to any names specified as such after the copyright statement(s).
23 |
24 | "Original Version" refers to the collection of Font Software components as distributed by the Copyright Holder(s).
25 |
26 | "Modified Version" refers to any derivative made by adding to, deleting, or substituting — in part or in whole — any of the components of the Original Version, by changing formats or by porting the Font Software to a new environment.
27 |
28 | "Author" refers to any designer, engineer, programmer, technical writer or other person who contributed to the Font Software.
29 |
30 | PERMISSION & CONDITIONS
31 |
32 | Permission is hereby granted, free of charge, to any person obtaining a copy of the Font Software, to use, study, copy, merge, embed, modify, redistribute, and sell modified and unmodified copies of the Font Software, subject to the following conditions:
33 |
34 | 1) Neither the Font Software nor any of its individual components, in Original or Modified Versions, may be sold by itself.
35 | 2) Original or Modified Versions of the Font Software may be bundled, redistributed and/or sold with any software, provided that each copy contains the above copyright notice and this license. These can be included either as stand-alone text files, human-readable headers or in the appropriate machine-readable metadata fields within text or binary files as long as those fields can be easily viewed by the user.
36 | 3) No Modified Version of the Font Software may use the Reserved Font Name(s) unless explicit written permission is granted by the corresponding Copyright Holder. This restriction only applies to the primary font name as presented to the users.
37 | 4) The name(s) of the Copyright Holder(s) or the Author(s) of the Font Software shall not be used to promote, endorse or advertise any Modified Version, except to acknowledge the contribution(s) of the Copyright Holder(s) and the Author(s) or with their explicit written permission.
38 | 5) The Font Software, modified or unmodified, in part or in whole, must be distributed entirely under this license, and must not be distributed under any other license. The requirement for fonts to remain under this license does not apply to any document created using the Font Software.
39 |
40 | TERMINATION
40 |
41 | This license becomes null and void if any of the above conditions are not met.
42 |
43 | DISCLAIMER
44 |
45 | THE FONT SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF COPYRIGHT, PATENT, TRADEMARK, OR OTHER RIGHT. IN NO EVENT SHALL THE COPYRIGHT HOLDER BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, INCLUDING ANY GENERAL, SPECIAL, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF THE USE OR INABILITY TO USE THE FONT SOFTWARE OR FROM OTHER DEALINGS IN THE FONT SOFTWARE.
--------------------------------------------------------------------------------
/README.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 | SPDX-FileCopyrightText: The Rust Project Contributors
4 |
5 | ===
6 | FLS
7 | ===
8 |
14 | The FLS is a document describing aspects of the Rust language for Rust toolchain
15 | qualification purposes.
16 |
17 | It was created by Ferrous Systems, in an original joint effort with AdaCore, as
18 | one of the prerequisites for qualifying `Ferrocene`_, a Rust toolchain qualified
19 | for safety-critical environments. The FLS is compiled from existing Rust
20 | documentation, but presented with a rigorous structure in order to meet the
21 | requirements of qualification.
22 |
23 | The FLS is not intended to be used as *the* normative specification of the Rust
24 | language (see the `Rust Reference`_), nor is it meant to replace the
25 | decision-making processes of the Rust project. Any difference between the FLS
26 | and the behavior of the Rust compiler is considered an error on our part and the
27 | FLS will be updated accordingly.
28 |
29 | The FLS text is licensed under either the ``MIT`` or ``Apache-2.0`` licenses, at
30 | your option. Individual files might have different licensing. Licensing metadata
31 | is present in each file, and the full license texts are present in the
32 | ``LICENSES/`` directory.
33 |
34 | .. _Ferrocene: https://ferrocene.dev
35 | .. _Rust Reference: https://doc.rust-lang.org/reference/
36 |
37 | Acknowledging Ferrous Systems
38 | =============================
39 |
40 | The Rust Project would like to thank `Ferrous Systems`_ for `donating`_ the FLS
41 | (formerly the Ferrocene Language Specification) to the Rust Project for its
42 | continued maintenance and development.
43 |
44 | As a brief history, the FLS is a description of the Rust programming language,
45 | developed by Ferrous Systems and `AdaCore`_ in July 2022 as part of Ferrocene, a
46 | Rust compiler and toolchain designed for safety-critical and regulated
47 | industries. The FLS provides a structured and detailed reference for Rust's
48 | syntax, semantics, and behavior, serving as a foundation for verification,
49 | compliance, and standardization efforts. The FLS represented a major step toward
50 | describing Rust in a way that aligns with industry requirements, particularly in
51 | high-assurance domains. Until its transfer in April 2025, Ferrous Systems had
52 | been the sole steward of the FLS since July 2023.
53 |
54 | .. _Ferrous Systems: https://ferrous-systems.com
55 | .. _donating: https://rustfoundation.org/media/ferrous-systems-donates-ferrocene-language-specification-to-rust-project/
56 | .. _AdaCore: https://www.adacore.com
57 |
58 | Building the specification
59 | ==========================
60 |
61 | The FLS uses `Sphinx`_ to build a rendered version of the specification, and `uv`_
62 | to install and manage Python dependencies (including Sphinx itself). To simplify
63 | building the rendered version, we created a script called ``make.py`` that takes
64 | care of invoking Sphinx with the right flags.
65 |
66 | You can build the rendered version by running::
67 |
68 | ./make.py
69 |
70 | By default, Sphinx uses incremental rebuilds to regenerate only the content that
71 | changed since the last invocation. If you notice a problem with incremental
72 | rebuilds, you can pass the ``-c`` flag to clear the existing artifacts before
73 | building::
74 |
75 | ./make.py -c
76 |
77 | The rendered version will be available in ``build/html/``.
78 |
79 | You can also start a local server on port 8000 with automatic rebuild and reload
80 | whenever you change a file by passing the ``-s`` flag::
81 |
82 | ./make.py -s
83 |
85 | Checking link consistency
85 | ==========================
86 |
87 | It's possible to run Rust's linkchecker tool on the rendered documentation to
88 | check for broken links. To do so, pass the ``--check-links`` flag::
89 |
90 | ./make.py --check-links
91 |
92 | This will clone the source code of the tool, build it, and execute it on the
93 | rendered documentation.
94 |
95 | .. _Sphinx: https://www.sphinx-doc.org
96 | .. _uv: https://docs.astral.sh/uv/
97 |
98 | Updating build dependencies
99 | ===========================
100 |
101 | The FLS uses ``uv`` to manage the Python dependencies used for builds. If you
102 | change the list of dependencies in ``pyproject.toml`` they will automatically be
103 | installed the next time you run ``make.py``. If you want to update the packages
104 | in the lockfile, run::
105 |
106 | uv lock --upgrade
107 |
--------------------------------------------------------------------------------
/src/concurrency.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | .. default-domain:: spec
5 |
6 | .. _fls_3v733mnewssy:
7 |
8 | Concurrency
9 | ===========
10 |
11 | :dp:`fls_opt7v0mopxc8`
12 | The Rust programming language provides features for concurrent programming
13 | without :t:`[data race]s`, whose rules are presented in this chapter.
14 |
15 | .. rubric:: Legality Rules
16 |
17 | :dp:`fls_tx4b8r6i93n4`
18 | A :t:`data race` is a scenario where two or more threads access a shared memory
19 | location concurrently without any synchronization, where one of the accesses is
20 | a modification.
21 |
22 | .. rubric:: Undefined Behavior
23 |
24 | :dp:`fls_isypweqewe78`
25 | It is undefined behavior if two or more threads engage in a :t:`data race`.
26 |
27 | .. _fls_eiw4by8z75di:
28 |
29 | Send and Sync
30 | -------------
31 |
32 | .. rubric:: Legality Rules
33 |
34 | :dp:`fls_n5l17mlglq11`
35 | The Rust programming language provides the :std:`core::marker::Send` and
36 | :std:`core::marker::Sync` :t:`[trait]s` for preventing data races at the
37 | :t:`type` level.
38 |
39 | :dp:`fls_2jujsujpjp3w`
40 | A :t:`send type` is a :t:`type` that implements the :std:`core::marker::Send`
41 | :t:`trait`.
42 |
43 | :dp:`fls_cax6fe4em23k`
44 | An :t:`abstract data type` automatically implements the
45 | :std:`core::marker::Send` :t:`trait` if the :t:`[type]s` of all its
46 | :t:`[field]s` are :t:`[send type]s`.
47 |
48 | :dp:`fls_4ypqdehn7b0v`
49 | A :t:`send type` shall have :t:`[value]s` that are safe to transfer across
50 | thread boundaries.
51 |
52 | :dp:`fls_dekskhk4g895`
53 | A :t:`sync type` is a :t:`type` that implements the :std:`core::marker::Sync`
54 | :t:`trait`.
55 |
56 | :dp:`fls_y0iqr5ibnbfe`
57 | An :t:`abstract data type` automatically implements the
58 | :std:`core::marker::Sync` :t:`trait` if the :t:`[type]s` of all its
59 | :t:`[field]s` are :t:`[sync type]s`.
60 |
61 | :dp:`fls_zgemofbs5q2x`
62 | A :t:`sync type` shall have :t:`[value]s` that are allowed to be shared across
63 | multiple threads at any given time without incurring data races.
64 |
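.. rubric:: Examples

The following sketch illustrates the automatic implementations described above;
the type and its fields are illustrative.

.. code-block:: rust

   // Both fields are of send and sync types, so `Coordinates` automatically
   // implements `core::marker::Send` and `core::marker::Sync`.
   struct Coordinates {
       latitude: f64,
       longitude: f64,
   }
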
65 | .. _fls_vyc9vcuamlph:
66 |
67 | Atomics
68 | -------
69 |
70 | .. rubric:: Legality Rules
71 |
72 | :dp:`fls_3pjla9s93mhd`
73 | An :t:`atomic type` is a :t:`type` defined in :t:`module`
74 | :std:`core::sync::atomic`. :t:`[Atomic type]s` provide primitive shared-memory
75 | communication between threads.
76 |
77 | :dp:`fls_wn4ynaio8u47`
78 | :t:`[Atomic type]s` are related to :t:`[type]s` as follows:
79 |
80 | .. list-table::
81 |
82 | * - :dp:`fls_q7mn6pdd8bix`
83 | - **Type**
84 | - **Atomic Type**
85 | * - :dp:`fls_jx0784jzxy00`
86 | - :c:`bool`
87 | - :std:`core::sync::atomic::AtomicBool`
88 | * - :dp:`fls_vzuwnpx7mt08`
89 | - :c:`i8`
90 | - :std:`core::sync::atomic::AtomicI8`
91 | * - :dp:`fls_cpcd0vexfbhj`
92 | - :c:`i16`
93 | - :std:`core::sync::atomic::AtomicI16`
94 | * - :dp:`fls_jt7rfq9atbiv`
95 | - :c:`i32`
96 | - :std:`core::sync::atomic::AtomicI32`
97 | * - :dp:`fls_2hqmfwswc6k`
98 | - :c:`i64`
99 | - :std:`core::sync::atomic::AtomicI64`
100 | * - :dp:`fls_5ab2sw3gwmt3`
101 | - :c:`isize`
102 | - :std:`core::sync::atomic::AtomicIsize`
103 | * - :dp:`fls_w2mw833g28eb`
104 | - ``*mut T``
105 | - :std:`core::sync::atomic::AtomicPtr`
106 | * - :dp:`fls_mjq1l1y0vmz4`
107 | - :c:`u8`
108 | - :std:`core::sync::atomic::AtomicU8`
109 | * - :dp:`fls_906978wtss6n`
110 | - :c:`u16`
111 | - :std:`core::sync::atomic::AtomicU16`
112 | * - :dp:`fls_4urmnh4mfehl`
113 | - :c:`u32`
114 | - :std:`core::sync::atomic::AtomicU32`
115 | * - :dp:`fls_2qkrcd5eovpe`
116 | - :c:`u64`
117 | - :std:`core::sync::atomic::AtomicU64`
118 | * - :dp:`fls_cry1e78gp19q`
119 | - :c:`usize`
120 | - :std:`core::sync::atomic::AtomicUsize`
121 |
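.. rubric:: Examples

The following sketch illustrates shared-memory communication through an atomic
type; the names are illustrative.

.. code-block:: rust

   use core::sync::atomic::{AtomicUsize, Ordering};

   static VISITS: AtomicUsize = AtomicUsize::new(0);

   fn record_visit() -> usize {
       // Atomically increments the counter and returns the previous value.
       VISITS.fetch_add(1, Ordering::SeqCst)
   }
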
122 | .. _fls_mtuwzinpfvkl:
123 |
124 | Asynchronous Computation
125 | ------------------------
126 |
127 | .. rubric:: Legality Rules
128 |
129 | :dp:`fls_g40xp4andj5g`
130 | The Rust programming language provides asynchronous computation through
131 | :t:`module` :std:`core::task` and the :std:`core::future::Future` :t:`trait`.
132 |
133 | :dp:`fls_fte085hi1yqj`
134 | A :t:`future` represents a :t:`value` of a :t:`type` that implements the
135 | :std:`core::future::Future` :t:`trait` which may not have finished computing
136 | yet.
137 |
138 | :dp:`fls_7muubin2wn1v`
139 | The computed :t:`value` of a :t:`future` is obtained by using an
140 | :t:`await expression` or by invoking :std:`core::future::Future::poll`.
141 |
142 | :dp:`fls_ftzey2156ha`
143 | :std:`core::future::Future::poll` shall not be invoked on a :t:`future` that has
144 | already returned :std:`core::task::Poll::Ready`.
145 |
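.. rubric:: Examples

The following sketch illustrates obtaining the computed value of a future with
an await expression; the function names are illustrative.

.. code-block:: rust

   async fn add_one(value: u32) -> u32 {
       value + 1
   }

   async fn caller() -> u32 {
       // `add_one(41)` produces a future; `.await` obtains its computed value.
       add_one(41).await
   }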
146 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/paragraph_ids.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from . import definitions, informational, utils
5 | from collections import defaultdict
6 | from docutils import nodes
7 | from sphinx.environment.collectors import EnvironmentCollector
8 | import json
9 | import os
10 | import sphinx
11 |
12 |
13 | def write_paragraph_ids(app):
14 | env = app.env
15 | informational_storage = informational.get_storage(env)
16 |
17 | paragraphs_by_section = defaultdict(list)
18 | for paragraph in definitions.get_storage(env, definitions.paragraphs).values():
19 | paragraphs_by_section[paragraph.section_id].append(
20 | {
21 | "id": paragraph.id,
22 | "number": paragraph.number(app.env),
23 | "link": app.builder.get_target_uri(paragraph.document)
24 | + "#"
25 | + paragraph.id,
26 | "checksum": paragraph.content_checksum(),
27 | }
28 | )
29 |
30 | sections_by_document = defaultdict(list)
31 | for section in env.spec_sections:
32 | sections_by_document[section.document].append(
33 | {
34 | "id": section.id,
35 | "number": ".".join(
36 | str(n) for n in env.toc_secnumbers[section.document][section.anchor]
37 | ),
38 | "title": section.title,
39 | "link": app.builder.get_target_uri(section.document) + section.anchor,
40 | "paragraphs": paragraphs_by_section[section.id],
41 | "informational": (
42 | section.anchor in informational_storage[section.document]
43 | ),
44 | }
45 | )
46 |
47 | documents = []
48 | for docname, title in env.titles.items():
49 | documents.append(
50 | {
51 | "title": title.astext(),
52 | "link": app.builder.get_target_uri(docname),
53 | "sections": sections_by_document[docname],
54 | "informational": (
55 | informational.WHOLE_PAGE in informational_storage[docname]
56 | ),
57 | }
58 | )
59 |
60 | with open(os.path.join(app.outdir, "paragraph-ids.json"), "w") as f:
61 | json.dump({"documents": documents}, f)
62 | f.write("\n")
63 |
64 |
65 | def build_finished(app, exception):
66 | # The build finished hook also runs when an exception is raised.
67 | if exception is not None:
68 | return
69 |
70 | with sphinx.util.display.progress_message("dumping paragraph ids"):
71 | write_paragraph_ids(app)
72 |
73 |
74 | def setup(app):
75 | app.connect("build-finished", build_finished)
76 | app.add_env_collector(SectionsCollector)
77 |
78 |
79 | class SectionsCollector(EnvironmentCollector):
80 | def clear_doc(self, app, env, docname):
81 | """
82 | This is called by Sphinx during incremental builds, either when a
83 | document was removed or when the document has been changed. In the
84 | latter case, process_doc is called after this method.
85 | """
86 | if not hasattr(env, "spec_sections"):
87 | env.spec_sections = []
88 | env.spec_sections = [s for s in env.spec_sections if s.document != docname]
89 |
90 | def merge_other(self, app, env, docnames, other):
91 | """
92 | Sphinx supports parallel builds, with each process having its own
93 | environment instance, but once each document is processed those
94 | parallel environments need to be merged together. This method does it.
95 | """
96 | if not hasattr(env, "spec_sections"):
97 | env.spec_sections = []
98 | if not hasattr(other, "spec_sections"):
99 | return
100 |
101 | for section in other.spec_sections:
102 | if section.document not in docnames:
103 | continue
104 | env.spec_sections.append(section)
105 |
106 | def process_doc(self, app, doctree):
107 | """
108 | This method can expect no existing information about the same document
109 | being stored in the environment, as during incremental rebuilds the
110 | clear_doc method is called ahead of this one.
111 | """
112 | env = app.env
113 | if not hasattr(env, "spec_sections"):
114 | env.spec_sections = []
115 |
116 | for section in doctree.findall(nodes.section):
117 | try:
118 | id, anchor = utils.section_id_and_anchor(section)
119 | except utils.NoSectionIdError:
120 | continue
121 |
122 | title = None
123 | for child in section.children:
124 | if isinstance(child, nodes.title):
125 | title = child.astext()
126 | if title is None:
127 | raise RuntimeError(f"section without title: {section}")
128 |
129 | env.spec_sections.append(
130 | Section(id=id, title=title, anchor=anchor, document=env.docname)
131 | )
132 |
133 |
134 | class Section:
135 | def __init__(self, id, title, anchor, document):
136 | self.id = id
137 | self.title = title
138 | self.anchor = anchor
139 | self.document = document
140 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/items_with_rubric.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from collections import defaultdict
5 | from dataclasses import dataclass
6 | from docutils import nodes
7 | from sphinx.directives import SphinxDirective
8 | from sphinx.environment.collectors import EnvironmentCollector
9 | from .informational import is_document_informational, is_section_informational
10 | from sphinx.transforms import SphinxTransform
11 | import sphinx
12 |
13 |
14 | class ItemsWithRubricMarkerNode(nodes.Element):
15 | def __init__(self, rubric, location):
16 | super().__init__()
17 | self["rubric"] = rubric
18 |
19 | # Make the node's source location be the same as the directive, so that
20 | # later transforms can retrieve the location for error reporting.
21 | self.source = location[0]
22 | self.line = location[1]
23 |
24 |
25 | class ItemsWithRubricDirective(SphinxDirective):
26 | has_content = True
27 |
28 | def run(self):
29 | if len(self.content) != 1:
30 | warn(
31 | "items-with-rubric accepts only the name of the rubric",
32 | self.get_location(),
33 | )
34 | return []
35 |
36 | return [ItemsWithRubricMarkerNode(self.content[0], self.get_source_info())]
37 |
38 |
39 | class RubricCollector(EnvironmentCollector):
40 | def clear_doc(self, app, env, docname):
41 | storage = get_storage(env)
42 | for rubric, items in storage.items():
43 | items[:] = (item for item in items if item.document != docname)
44 |
45 | def merge_other(self, app, env, docnames, other):
46 | current = get_storage(env)
47 | other = get_storage(other)
48 | for rubric, items in other.items():
49 | for item in items:
50 | if item.document in docnames:
51 | current[rubric].append(item)
52 |
53 | def process_doc(self, app, document):
54 | for rubric in document.findall(nodes.rubric):
55 | self.process_rubric(app, document, rubric)
56 |
57 | def process_rubric(self, app, document, rubric):
58 | rubric_name = rubric.astext()
59 |
60 | if not isinstance(rubric.parent, nodes.section):
61 | warn("rubric is not directly inside a section", rubric)
62 | return
63 |
64 | title_node = next(rubric.parent.findall(nodes.title), None)
65 | if title_node is None:
66 | warn(
67 | "section containing this rubric doesn't have a title",
68 | rubric,
69 | )
70 | return
71 | section_title = title_node.astext()
72 |
73 | section_id = rubric.parent["ids"][-1]
74 | get_storage(app.env)[rubric_name].append(
75 | StoredRubric(
76 | document=app.env.docname,
77 | section_id=section_id,
78 | section_title=section_title,
79 | )
80 | )
81 |
82 |
83 | class InjectContentTransform(SphinxTransform):
84 | default_priority = 500
85 |
86 | def apply(self):
87 | for node in self.document.findall(ItemsWithRubricMarkerNode):
88 | self.replace_node(node)
89 |
90 | def replace_node(self, node):
91 | items = get_storage(self.env)[node["rubric"]]
92 | if not items:
93 | warn(f"no items with rubric {node['rubric']}", node)
94 | node.parent.remove(node)
95 | return
96 |
97 | # Ensure the items are in a sorted order to guarantee reproducibility.
98 | items.sort(key=lambda item: (item.section_title, item.section_id))
99 |
100 | bullet_list = nodes.bullet_list()
101 | for item in items:
102 | if is_document_informational(self.env, item.document):
103 | continue
104 | if is_section_informational(self.env, item.document, f"#{item.section_id}"):
105 | continue
106 |
107 | paragraph = nodes.paragraph()
108 | paragraph += sphinx.util.nodes.make_refnode(
109 | self.app.builder,
110 | self.env.docname,
111 | item.document,
112 | item.section_id,
113 | nodes.Text(item.section_title),
114 | )
115 | paragraph += nodes.Text(" (in ")
116 | paragraph += sphinx.util.nodes.make_refnode(
117 | self.app.builder,
118 | self.env.docname,
119 | item.document,
120 | "",
121 | nodes.Text(self.env.titles[item.document].astext()),
122 | )
123 | paragraph += nodes.Text(")")
124 |
125 | list_item = nodes.list_item()
126 | list_item += paragraph
127 |
128 | bullet_list += list_item
129 |
130 | node.replace_self(bullet_list)
131 |
132 |
133 | def get_storage(env):
134 | key = "spec_rubrics"
135 | if not hasattr(env, key):
136 | setattr(env, key, defaultdict(list))
137 | return getattr(env, key)
138 |
139 |
140 | @dataclass
141 | class StoredRubric:
142 | document: str
143 | section_id: str
144 | section_title: str
145 |
146 |
147 | def warn(message, location):
148 | logger = sphinx.util.logging.getLogger(__name__)
149 | logger.warning(message, location=location)
150 |
151 |
152 | def setup(app):
153 | app.add_env_collector(RubricCollector)
154 | app.add_node(ItemsWithRubricMarkerNode)
155 | app.add_post_transform(InjectContentTransform)
156 |
--------------------------------------------------------------------------------
/exts/ferrocene_toctrees.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from docutils import nodes
5 | from sphinx import addnodes as sphinxnodes
6 | from sphinx.directives.other import TocTree
7 | from sphinx.environment.collectors.toctree import TocTreeCollector
8 | import gc
9 |
10 |
11 | class AppendicesDirective(TocTree):
12 | def run(self):
13 | result = super().run()
14 |
15 | def compat_check(condition):
16 | if not condition:
17 | raise RuntimeError(
18 | "bug: the toctree Sphinx directive used by appendix emitted "
19 | "unexpected nodes, please update the extension to handle that"
20 | )
21 |
22 | # We're modifying the result of Sphinx's builtin TocTree directive, so
23 | # ensure it contains what we expect.
24 | compat_check(isinstance(result, list))
25 | compat_check(len(result) == 1)
26 | compat_check(isinstance(result[0], nodes.compound))
27 | compat_check(len(result[0].children) == 1)
28 | compat_check(isinstance(result[0].children[0], sphinxnodes.toctree))
29 |
30 | # Mark this toctree as containing appendices, so that the environment
31 | # collector can distinguish it from normal toctrees.
32 | result[0].children[0]["are_appendices"] = True
33 |
34 | return result
35 |
36 |
37 | # To ensure the minimum disruption possible, to update section numbers we're
38 | # replacing the EnvironmentCollector responsible for assigning section numbers.
39 | #
40 | # We let the builtin Sphinx logic assign section numbers, and as soon as it
41 | # finishes we go over the sections and replace the first number of appendices
42 | # with a letter, and we offset everything based on the previous toctrees (to
43 | # support multiple toctrees in a site).
44 | #
45 | # Doing this as part of TocTreeCollector ensures the rest of the build always
46 | # sees the correct ID for sections, as no Sphinx code runs between the upstream
47 | # number assigning and us modifying them.
48 | class TocTreeCollectorWithAppendices(TocTreeCollector):
49 | def __init__(self, *args, **kwargs):
50 | super().__init__(*args, **kwargs)
51 |
52 | self.__within_appendices = False
53 | self.__chapter_offset = 0
54 | self.__appendix_offset = 0
55 | self.__chapter_max = 0
56 | self.__appendix_max = 0
57 |
58 | def assign_section_numbers(self, env):
59 | result = super().assign_section_numbers(env)
60 |
61 | for docname in env.numbered_toctrees:
62 | doctree = env.get_doctree(docname)
63 | for toctree in doctree.findall(sphinxnodes.toctree):
64 | self.__replace_toctree(env, toctree)
65 |
66 | self.__chapter_offset = self.__chapter_max
67 | self.__appendix_offset = self.__appendix_max
68 |
69 | return result
70 |
71 | def __replace_toctree(self, env, toctree):
72 | self.__within_appendices = "are_appendices" in toctree
73 | for _, ref in toctree["entries"]:
74 | env.titles[ref]["secnumber"] = self.__renumber(env.titles[ref]["secnumber"])
75 | if ref in env.tocs:
76 | self.__replace_toc(env, ref, env.tocs[ref])
77 |
78 | def __replace_toc(self, env, ref, node):
79 | if isinstance(node, nodes.reference):
80 | fixed_number = self.__renumber(node["secnumber"])
81 | node["secnumber"] = fixed_number
82 | env.toc_secnumbers[ref][node["anchorname"]] = fixed_number
83 |
84 | elif isinstance(node, sphinxnodes.toctree):
85 | raise RuntimeError("nested toctrees are not supported")
86 |
87 | else:
88 | for child in node.children:
89 | self.__replace_toc(env, ref, child)
90 |
91 | def __renumber(self, number):
92 | if not number:
93 | return number
94 |
95 | if self.__within_appendices:
96 | with_offset = self.__appendix_offset + number[0]
97 | if with_offset > 26:
98 | raise RuntimeError("more than 26 appendices are not supported")
99 |
100 | fixed = chr(ord("A") - 1 + with_offset)
101 | self.__appendix_max = max(self.__appendix_max, with_offset)
102 | else:
103 | fixed = self.__chapter_offset + number[0]
104 | self.__chapter_max = max(self.__chapter_max, fixed)
105 |
106 | return (fixed, *number[1:])
107 |
108 |
109 | # This extension needs to replace the builtin TocTreeCollector, so it's safer
110 | # to disable it. That'll avoid two TocTreeCollectors running in the build.
111 | def disable_builtin_toctree_collector(app):
112 | for obj in gc.get_objects():
113 | if not isinstance(obj, TocTreeCollector):
114 | continue
115 | # When running sphinx-autobuild, this function might be called multiple
116 | # times. When the collector is already disabled `listener_ids` will be
117 | # `None`, and thus we don't need to disable it again.
118 | #
119 | # Note that disabling an already disabled collector will fail.
120 | if obj.listener_ids is None:
121 | continue
122 | obj.disable(app)
123 |
124 |
125 | def setup(app):
126 | app.add_directive("appendices", AppendicesDirective)
127 |
128 | disable_builtin_toctree_collector(app)
129 | app.add_env_collector(TocTreeCollectorWithAppendices)
130 |
131 | return {
132 | "version": "0",
133 | "parallel_read_safe": True,
134 | "parallel_write_safe": True,
135 | }
136 |
--------------------------------------------------------------------------------
/make.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env -S uv run
2 | # SPDX-License-Identifier: MIT OR Apache-2.0
3 | # SPDX-FileCopyrightText: The Ferrocene Developers
4 | # SPDX-FileCopyrightText: The Rust Project Contributors
5 |
6 | import os
7 | from pathlib import Path
8 | import argparse
9 | import subprocess
10 | import shutil
11 |
12 | # Automatically watch the following extra directories when --serve is used.
13 | EXTRA_WATCH_DIRS = ["exts", "themes"]
14 |
15 |
16 | def build_docs(root, builder, clear, serve, debug):
17 | dest = root / "build"
18 | output_dir = dest / builder
19 |
20 | args = ["-b", builder, "-d", dest / "doctrees"]
21 | if debug:
22 | # Disable parallel builds and show exceptions in debug mode.
23 | #
24 | # We can't show exceptions in parallel mode because in parallel mode
25 | # all exceptions will be swallowed up by Python's multiprocessing.
26 | # That's also why we don't show exceptions outside of debug mode.
27 | args += ["-j", "1", "-T"]
28 | else:
29 | # Enable parallel builds:
30 | args += ["-j", "auto"]
31 | if clear:
32 | if output_dir.exists():
33 | shutil.rmtree(output_dir)
34 | # Using a fresh environment
35 | args.append("-E")
36 | if serve:
37 | for extra_watch_dir in EXTRA_WATCH_DIRS:
38 | extra_watch_dir = root / extra_watch_dir
39 | if extra_watch_dir.exists():
40 | args += ["--watch", extra_watch_dir]
41 | else:
42 | # Error out at the *end* of the build if there are warnings:
43 | args += ["-W", "--keep-going"]
44 |
45 | commit = current_git_commit(root)
46 | if commit is not None:
47 | args += ["-D", f"html_theme_options.commit={commit}"]
48 |
49 | try:
50 | subprocess.run(
51 | [
52 | "sphinx-autobuild" if serve else "sphinx-build",
53 | *args,
54 | root / "src",
55 | output_dir,
56 | ],
57 | check=True,
58 | )
59 | except KeyboardInterrupt:
60 | exit(1)
61 | except subprocess.CalledProcessError:
62 | print("\nhint: if you see an exception, pass --debug to see the full traceback")
63 | exit(1)
64 |
65 | return dest / builder
66 |
67 |
68 | def build_linkchecker(root):
69 | repo = root / ".linkchecker"
70 | src = repo / "src" / "tools" / "linkchecker"
71 | bin = src / "target" / "release" / "linkchecker"
72 |
73 | if not src.is_dir():
74 | subprocess.run(["git", "init", repo], check=True)
75 |
76 | def git(args):
77 | subprocess.run(["git", *args], cwd=repo, check=True)
78 |
79 | # Avoid fetching blobs unless needed by the sparse checkout
80 | git(["remote", "add", "origin", "https://github.com/rust-lang/rust"])
81 | git(["config", "remote.origin.promisor", "true"])
82 | git(["config", "remote.origin.partialCloneFilter", "blob:none"])
83 |
84 | # Checkout only the linkchecker tool rather than the whole repo
85 | git(["config", "core.sparsecheckout", "true"])
86 | with open(repo / ".git" / "info" / "sparse-checkout", "w") as f:
87 | f.write("/src/tools/linkchecker/")
88 |
89 | # Avoid fetching the whole history
90 | git(["fetch", "--depth=1", "origin", "main"])
91 | git(["checkout", "main"])
92 |
93 | if not bin.is_file():
94 | subprocess.run(["cargo", "build", "--release"], cwd=src, check=True)
95 |
96 | return bin
97 |
98 |
99 | def current_git_commit(root):
100 | try:
101 | return (
102 | subprocess.run(
103 | ["git", "rev-parse", "HEAD"],
104 | check=True,
105 | stdout=subprocess.PIPE,
106 | cwd=root,
106 | )
107 | .stdout.decode("utf-8")
108 | .strip()
109 | )
110 | # `git` executable missing from the system
111 | except FileNotFoundError:
112 | print("warning: failed to detect git commit: missing executable git")
113 | return
114 | # `git` returned an error (git will print the actual error to stderr)
115 | except subprocess.CalledProcessError:
116 | print("warning: failed to detect git commit: git returned an error")
117 | return
118 |
119 |
120 | def main(root):
121 | root = Path(root)
122 |
123 | parser = argparse.ArgumentParser()
124 | parser.add_argument(
125 | "-c", "--clear", help="disable incremental builds", action="store_true"
126 | )
127 | group = parser.add_mutually_exclusive_group()
128 | group.add_argument(
129 | "-s",
130 | "--serve",
131 | help="start a local server with live reload",
132 | action="store_true",
133 | )
134 | group.add_argument(
135 | "--check-links", help="Check whether all links are valid", action="store_true"
136 | )
137 | group.add_argument(
138 | "--xml", help="Generate Sphinx XML rather than HTML", action="store_true"
139 | )
140 | group.add_argument(
141 | "--debug",
142 | help="Debug mode for the extensions, showing exceptions",
143 | action="store_true",
144 | )
145 | args = parser.parse_args()
146 |
147 | rendered = build_docs(
148 | root, "xml" if args.xml else "html", args.clear, args.serve, args.debug
149 | )
150 |
151 | if args.check_links:
152 | linkchecker = build_linkchecker(root)
153 | if subprocess.run([linkchecker, rendered]).returncode != 0:
154 | print("error: linkchecker failed")
155 | exit(1)
156 |
157 |
158 | main(os.path.abspath(os.path.dirname(__file__)))
159 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/README.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | ====================
5 | FLS Sphinx Extension
6 | ====================
7 |
8 | To enhance the FLS, and to make authoring, updating, and testing it easier, we
9 | developed and use a custom Sphinx extension that adds new roles and
10 | directives. The source code of the extension is in the ``exts/ferrocene_spec``
11 | directory.
12 |
13 | .. contents:: In this document:
14 |
15 | Definitions
16 | ===========
17 |
18 | To aid users reading the FLS, we make extensive use of internal links. Since
19 | Sphinx's native support for creating and linking to definitions did not suit
20 | our needs we implemented custom roles for defining and linking to definitions.
21 |
22 | Definition namespaces
23 | ---------------------
24 |
25 | The extension includes support for multiple definition namespaces. In addition
26 | to allowing the same name to be used as a definition in multiple namespaces,
27 | each namespace has a different styling when rendered.
28 |
29 | .. list-table::
30 | :header-rows: 1
31 |
32 | * - Namespace
33 | - Link role
34 | - Definition role
35 | - Styling
36 |
37 | * - Terms
38 | - ``:t:`foo```
39 | - ``:dt:`foo```
40 | - Normal text
41 |
42 | * - Programmatic constructs
43 | - ``:c:`foo```
44 | - ``:dc:`foo```
45 | - Monospace text
46 |
47 | * - Syntactic categories
48 | - ``:s:`Foo```
49 | - ``:ds:`Foo```
50 | - Monospace text
51 |
52 | Both link roles and definition roles are standard reStructuredText roles, and
53 | can be used inline inside a sentence. For example:
54 |
55 | .. code-block:: rst
56 |
57 | An :dt:`associated item` is an :t:`item` that appears within an :dt:`implementation` or a :dt:`trait`.
58 |
59 | Definitions are case insensitive: you can define and link to them with
60 | whichever case suits the surrounding text the most.
61 |
62 | Linking to definitions
63 | ----------------------
64 |
65 | The standard syntax for link roles is to just include the name of the
66 | definition inside the role, which will generate a link to the place the
67 | definition is defined in. In addition to that, more advanced syntaxes are
68 | available to handle more complex use cases.
69 |
70 | Prefixes and suffixes
71 | ~~~~~~~~~~~~~~~~~~~~~
72 |
73 | Sometimes you might need to add prefixes or suffixes to a word you're linking,
74 | for example adding an ``s`` at the end to pluralize it. The native RST syntax
75 | for that is quite clunky, as you need to put an "escaped space" between the
76 | role and the rest of the word:
77 |
78 | .. code-block:: rst
79 |
80 | Multiple :t:`value`\ s can be used here.
81 |
82 | The snippet above will render as you'd expect ("Multiple values can be used
83 | here" with a link on "value"), but that syntax is hard to read and annoying to
84 | write. To ease adding prefixes and suffixes to links, the extension allows you
85 | to include the prefix and suffix inside the role itself, by wrapping the actual
86 | definition inside square brackets:
87 |
88 | .. code-block:: rst
89 |
90 | Multiple :t:`[value]s` can be used here.
91 |
92 | Arbitrary labels
93 | ~~~~~~~~~~~~~~~~
94 |
95 | If you need to link to a definition but the link label doesn't fully contain
96 | the definition name (for example if you're using a sentence as the link label,
97 | or you need to pluralize a word with irregular plurals) you can put the
98 | arbitrary label inside the role and put the definition name at the end of the
99 | role within angle brackets:
100 |
101 | .. code-block:: rst
102 |
103 | The :t:`criteria <criterion>` used for the change are:
104 |
105 | Links to the Rust standard library
106 | ==================================
107 |
108 | You can link to the documentation of items defined in the Rust standard library
109 | (``core``, ``alloc``, ``std``, ``test`` and ``proc_macro``) by using the
110 | ``:std:`type``` role (even for types defined in other standard library crates)
111 | with the fully qualified item path:
112 |
113 | .. code-block:: rst
114 |
115 | The type needs to implement :std:`core::marker::Copy`.
116 |
117 | Syntax blocks
118 | =============
119 |
120 | To ease the process of defining blocks of syntax definitions, the extension
121 | implements the custom ``syntax`` directive, which parses the syntax contained
122 | within it and automatically inserts definitions and links to the referenced
123 | syntactic categories:
124 |
125 | .. code-block:: rst
126 |
127 | .. syntax::
128 |
129 | ExpressionStatement ::=
130 | ExpressionWithBlock $$;$$?
131 | | ExpressionWithoutBlock $$;$$
132 |
133 | In the directive above, the extension will automatically insert a syntactic
134 | category definition for ``ExpressionStatement`` (since it's followed by ``::=``), and it
135 | will insert syntactic category links for both ``ExpressionWithBlock`` and
136 | ``ExpressionWithoutBlock``.
137 |
138 | Words and characters wrapped within ``$$`` are considered "literals": they will
139 | be rendered differently than syntactic categories, and they won't be considered
140 | by the extension when looking for syntactic categories.
141 |
142 | Paragraph IDs
143 | =============
144 |
145 | Ferrocene's test suite needs each paragraph in the FLS to have a unique ID
146 | attached to it. To ensure that this happens, the extension provides a way to
147 | easily define the ID for each paragraph, and to use that ID to link to the
148 | paragraph from other parts of the FLS.
149 |
150 | Paragraph IDs can be added to a paragraph with the ``:dp:`id``` role. The role must
151 | contain a unique random ID prefixed with ``fls_``, and the role must appear at
152 | the start of a paragraph:
153 |
154 | .. code-block:: rst
155 |
156 | :dp:`fls_qTgd9xuAY3n3`
157 | This is a paragraph with an ID.
158 |
159 | You can generate a list of random IDs by running the following command from the
160 | root of the specification repository::
161 |
162 | ./generate-random-ids.py
163 |
164 | You can also link to an existing paragraph with the ``:p:`id``` role:
165 |
166 | .. code-block:: rst
167 |
168 | See :p:`fls_qTgd9xuAY3n3` for a sample paragraph using IDs.
169 |
170 | Note that paragraph IDs are also used to generate the human-readable paragraph
171 | numbers rendered by Sphinx: while IDs are supposed to be stable across FLS
172 | revisions, the human-readable paragraph numbers can change between renderings.
173 |
--------------------------------------------------------------------------------
/src/statements.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | .. default-domain:: spec
5 |
6 | .. _fls_wdicg3sqa98e:
7 |
8 | Statements
9 | ==========
10 |
11 | .. rubric:: Syntax
12 |
13 | .. syntax::
14 |
15 | Statement ::=
16 | ExpressionStatement
17 | | Item
18 | | LetStatement
19 | | TerminatedMacroInvocation
20 | | $$;$$
21 |
22 | .. rubric:: Legality Rules
23 |
24 | :dp:`fls_7zh6ziglo5iy`
25 | An :t:`expression statement` is an :t:`expression` whose result is ignored.
26 |
27 | :dp:`fls_kdxe1ukmgl1`
28 | An :t:`item statement` is a :t:`statement` that is expressed as an :t:`item`.
29 |
30 | :dp:`fls_fftdnwe22xrb`
31 | An :t:`empty statement` is a :t:`statement` expressed as character 0x3B
32 | (semicolon).
33 |
34 | :dp:`fls_or125cqtxg9j`
35 | A :t:`macro statement` is a :t:`statement` expressed as a
36 | :t:`terminated macro invocation`.
37 |
38 | .. rubric:: Dynamic Semantics
39 |
40 | :dp:`fls_estqu395zxgk`
41 | :t:`Execution` is the process by which a :t:`statement` achieves its runtime
42 | effects.
43 |
44 | :dp:`fls_dl763ssb54q1`
45 | The :t:`execution` of an :t:`empty statement` has no effect.
46 |
47 | .. _fls_yivm43r5wnp1:
48 |
49 | Let Statements
50 | --------------
51 |
52 | .. rubric:: Syntax
53 |
54 | .. syntax::
55 |
56 | LetStatement ::=
57 | OuterAttributeOrDoc* $$let$$ PatternWithoutAlternation TypeAscription? LetInitializer? $$;$$
58 |
59 | LetInitializer ::=
60 | $$=$$ Expression ($$else$$ BlockExpression)?
61 |
62 | .. rubric:: Legality Rules
63 |
64 | :dp:`fls_ct7pp7jnfr86`
65 | A :t:`let statement` is a :t:`statement` that introduces new :t:`[binding]s`
66 | produced by its :t:`pattern-without-alternation` that are optionally
67 | initialized to a :t:`value`.
68 |
69 | :dp:`fls_SR3dIgR5K0Kq`
70 | A :t:`let initializer` is a :t:`construct` that provides the :t:`value` of
71 | the :t:`[binding]s` of the :t:`let statement` using an :t:`expression`, or
72 | alternatively executes a :t:`block expression`.
73 |
74 | :dp:`fls_iqar7vvtw22c`
75 | If a :t:`let statement` lacks a :t:`block expression`, then the :t:`pattern` of
76 | the :t:`let statement` shall be an :t:`irrefutable pattern`.
77 |
78 | :dp:`fls_1s1UikGU5YQb`
79 | If a :t:`let statement` has a :t:`block expression`, then the :s:`Expression` of
80 | the :s:`LetInitializer` shall not be a :s:`LazyBooleanExpression` or end with
81 | token ``}``.
82 |
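As a non-normative illustration (with hypothetical names), a minimal sketch of a
``let`` statement whose :t:`let initializer` satisfies this rule: the initializer is a
method call expression, so it is neither a :s:`LazyBooleanExpression` nor an
:t:`expression` ending with token ``}``.

.. code-block:: rust

   fn first(values: &mut Vec<u32>) -> u32 {
       let Some(value) = values.pop() else {
           return 0;
       };
       value
   }
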
83 | :dp:`fls_iB25BeFys0j8`
84 | The :t:`expected type` of the :t:`pattern` of the :t:`let statement` is determined as follows:
85 |
86 | * :dp:`fls_zObyLdya4DYc`
87 | If the :t:`let statement` lacks a :t:`type ascription` and a :t:`let initializer`, then the :t:`expected type` is the :t:`inferred type`.
88 |
89 | * :dp:`fls_r38TXWKQPjxv`
90 | If the :t:`let statement` lacks a :t:`type ascription`, then the :t:`expected type` is the :t:`type` of the :t:`let initializer`.
91 |
92 | * :dp:`fls_6QSdwF4pzjoe`
93 | Otherwise the :t:`expected type` is the :t:`type` specified by the :t:`type ascription`.
94 |
95 | :dp:`fls_1prqh1trybwz`
96 | The :t:`type` of a :t:`binding` introduced by a :t:`let statement` is
97 | determined as follows:
98 |
99 | * :dp:`fls_djkm8r2iuu6u`
100 | If the :t:`let statement` appears with a :t:`type ascription`, then the
101 | :t:`type` is the :t:`type` specified by the :t:`type ascription`.
102 |
103 | * :dp:`fls_ppj9gvhp8wcj`
104 | If the :t:`let statement` lacks a :t:`type ascription`, then the :t:`type` is
105 | inferred using :t:`type inference`.
106 |
107 | :dp:`fls_1eBQDZdBuDsN`
108 | The :t:`type` of the :t:`block expression` of a :t:`let statement` shall be the
109 | :t:`never type`.
110 |
111 | :dp:`fls_m8a7gesa4oim`
112 | The :t:`value` of a :t:`binding` introduced by a :t:`let statement` is
113 | determined as follows:
114 |
115 | * :dp:`fls_oaxnre7m9s10`
116 | If the :t:`let statement` appears with a :t:`let initializer`, then the
117 | :t:`value` is the :t:`value` of the :t:`expression` of the
118 | :t:`let initializer`.
119 |
120 | * :dp:`fls_t5bjwluyv8za`
121 | Otherwise the :t:`binding` is uninitialized.
122 |
123 | .. rubric:: Dynamic Semantics
124 |
125 | :dp:`fls_4j9riqyf4p9`
126 | The :t:`execution` of a :t:`let statement` with a :t:`let initializer` proceeds
127 | as follows:
128 |
129 | #. :dp:`fls_t53g5hlabqw1`
130 | The :t:`expression` of the :t:`let initializer` is evaluated.
131 |
132 | #. :dp:`fls_7j4qlwg72ege`
133 | If the :t:`value` of the :t:`expression` is matched successfully against the
134 | :t:`pattern` of the :t:`let statement`, then the :t:`value` is assigned to
135 | each :t:`binding` introduced by the :t:`let statement`.
136 |
137 | #. :dp:`fls_ea9bRFZjH8Im`
138 | Otherwise the :t:`block expression` of the :t:`let initializer` is evaluated.
139 |
140 | .. rubric:: Examples
141 |
142 | .. code-block:: rust
143 |
144 | let local = 0;
145 | let local: u32;
146 | let (a, b) = (0, 0);
147 | let Some(value) = vector.pop() else {
148 | panic!();
149 | };
150 |
151 | .. _fls_1pg5ig740tg1:
152 |
153 | Expression Statements
154 | ---------------------
155 |
156 | .. rubric:: Syntax
157 |
158 | .. syntax::
159 |
160 | ExpressionStatement ::=
161 | ExpressionWithBlock $$;$$?
162 | | ExpressionWithoutBlock $$;$$
163 |
164 | .. rubric:: Legality Rules
165 |
166 | :dp:`fls_xmdj8uj7ixoe`
167 | An :t:`expression statement` is an :t:`expression` whose result is ignored.
168 |
169 | :dp:`fls_gzzmudc1hl6s`
170 | The :t:`expected type` of an :t:`expression statement` without character 0x3B
171 | (semicolon) is the :t:`unit type`.
172 |
173 | .. rubric:: Dynamic Semantics
174 |
175 | :dp:`fls_kc99n8qrszxh`
176 | The :t:`execution` of an :t:`expression statement` proceeds as follows:
177 |
178 | #. :dp:`fls_r8poocwqaglf`
179 | The :t:`operand` is evaluated.
180 |
181 | #. :dp:`fls_88e6s3erk8tj`
182 | The :t:`value` of the :t:`operand` is :t:`dropped`.
183 |
184 | .. rubric:: Examples
185 |
186 | .. code-block:: rust
187 |
188 | let mut values = vec![1, 2, 3];
189 |
190 | :dp:`fls_4q90jb39apwr`
191 | The following expression statement ignores the result from ``pop()``.
192 |
193 | .. code-block:: rust
194 |
195 | values.pop();
196 |
197 | :dp:`fls_xqtztcu8ibwq`
198 | The following expression statement does not require a semicolon.
199 |
200 | .. code-block:: rust
201 |
202 | if values.is_empty() {
203 | values.push(42);
204 | }
205 | else {
206 | values.remove(0);
207 | }
208 |
209 | :dp:`fls_2p9xnt519nbw`
210 | The following expression statement is not an index expression.
211 |
212 | .. code-block:: rust
213 |
214 | [42];
215 |
--------------------------------------------------------------------------------
/exts/ferrocene_spec/syntax_directive.py:
--------------------------------------------------------------------------------
1 | # SPDX-License-Identifier: MIT OR Apache-2.0
2 | # SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | from .definitions import DefIdNode, DefRefNode
5 | from docutils import nodes
6 | from sphinx.directives import SphinxDirective
7 |
8 |
9 | class SyntaxDirective(SphinxDirective):
10 | has_content = True
11 |
12 | def run(self):
13 | # The first argument when creating any docutils node is the "raw source",
14 | # which in theory is completely optional for our needs. That's why all
15 | # the nodes we create pass an empty string (the default) to it.
16 | #
17 | # Still, this block is fairly special: to decide whether to perform
18 | # syntax highlighting on a literal block, Sphinx compares the raw
19 | # source with the output of the astext() method, and it highlights only
20 | # if the two are equal.
21 | #
22 | # We had problems in the past with the heuristic randomly causing the
23 | # syntax blocks to be highlighted by Sphinx (thus losing all our custom
24 | # formatting). Thus, to avoid problems we set the raw source to
25 | # something that will never appear in the astext() output (byte zero).
26 | node = nodes.literal_block("\0")
27 |
28 | node["classes"].append("spec-syntax")
29 |
30 | for child in Parser("\n".join(self.content), self.env.docname).parse():
31 | node += child
32 |
33 | return [node]
34 |
35 |
36 | class Parser:
37 | def __init__(self, content, document_name):
38 | self.document_name = document_name
39 | self.lexer = Lexer(content).lex()
40 | self.peek_buffer = []
41 |
42 | def parse(self):
43 | while True:
44 | token = self.next()
45 | if token is None:
46 | return
47 |
48 | if token.kind == "literal":
49 | node = nodes.strong("", token.content)
50 | node["classes"].append("spec-syntax-literal")
51 | yield node
52 |
53 | elif token.kind == "identifier" and is_syntax_identifier(token.content):
54 |
55 | def peek_kind(kind, nth=0):
56 | peeked = self.peek(nth)
57 | return peeked is not None and peeked.kind == kind
58 |
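# An identifier counts as a definition when the next token is "::=",
# optionally separated by a single whitespace token: peek_kind("whitespace")
# evaluates to 0 or 1 and selects which lookahead slot to inspect.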
59 | if peek_kind("definition", int(peek_kind("whitespace"))):
60 | yield DefIdNode("syntaxes", token.content)
61 | else:
62 | yield DefRefNode("syntaxes", self.document_name, token.content)
63 |
64 | else:
65 | yield nodes.Text(token.content)
66 |
67 | def next(self):
68 | if self.peek_buffer:
69 | return self.peek_buffer.pop(0)
70 | else:
71 | return next(self.lexer, None)
72 |
73 | def peek(self, nth=0):
74 | while len(self.peek_buffer) <= nth:
75 | token = next(self.lexer, None)
76 | if token is None:
77 | return None
78 | self.peek_buffer.append(token)
79 | return self.peek_buffer[nth]
80 |
81 |
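# For example, lexing the input `Foo ::= $$;$$?` yields the token stream:
# identifier("Foo"), whitespace(" "), definition("::="), whitespace(" "),
# literal(";"), other("?").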
82 | class Lexer:
83 | def __init__(self, content):
84 | self.content = content
85 | self.pos = 0
86 |
87 | def lex(self):
88 | while True:
89 | char = self.next()
90 | if char is None:
91 | return
92 |
93 | if char == "$" and self.peek() == "$":
94 | self.next() # Consume the second "$"
95 |
96 | # Collect all the chars until the next `$$`
97 | buffer = ""
98 | while True:
99 | peeked = self.peek()
100 | if peeked is None:
101 | break
102 | # We check that the third peek is not a "$" to perform a
103 | # greedy parsing. This way, $$$$$ is parsed as the literal "$"
104 | # rather than as an empty literal followed by a stray "$".
105 | elif peeked == "$" and self.peek(1) == "$" and self.peek(2) != "$":
106 | self.next() # Consume the first "$"
107 | self.next() # Consume the second "$"
108 | break
109 | else:
110 | buffer += self.next()
111 | yield Token("literal", buffer)
112 |
113 | elif char == ":" and self.peek() == ":" and self.peek(1) == "=":
114 | self.next() # Consume the second ":"
115 | self.next() # Consume the "="
116 | yield Token("definition", "::=")
117 |
118 | elif char.isalpha():
119 | buffer = char
120 | while True:
121 | peeked = self.peek()
122 | if peeked is None or not (peeked.isalpha() or peeked == "_"):
123 | break
124 | buffer += self.next()
125 | yield Token("identifier", buffer)
126 |
127 | elif char.isspace():
128 | buffer = char
129 | while True:
130 | peeked = self.peek()
131 | if peeked is None or not peeked.isspace():
132 | break
133 | buffer += self.next()
134 | yield Token("whitespace", buffer)
135 |
136 | else:
137 | yield Token("other", char)
138 |
139 | def peek(self, nth=0):
140 | try:
141 | return self.content[self.pos + nth]
142 | except IndexError:
143 | return None
144 |
145 | def next(self):
146 | try:
147 | char = self.content[self.pos]
148 | except IndexError:
149 | return None
150 | self.pos += 1
151 | return char
152 |
153 |
154 | def is_syntax_identifier(identifier):
155 | EXPECT_ANY = 0
156 | EXPECT_UPPER = 1
157 | EXPECT_LOWER = 2
158 |
159 | # Some of the identifiers referring to Unicode categories are called
160 | # XID_Category. The problem is that the XID_ portion is not a valid
161 | # identifier based on our rules. This code special-cases that by ignoring
162 | # the problematic part of the identifier.
163 | if identifier.startswith("XID_"):
164 | identifier = identifier[len("XID_") :]
165 | # An unfortunate case where we end up with a single-letter prefix violating
166 | # our rules. Special-case this like above to avoid breaking the rule.
167 | if identifier == "CStringLiteral":
168 | identifier = "CstringLiteral"
169 |
170 | expected = EXPECT_UPPER
171 | for char in identifier:
172 | if expected == EXPECT_UPPER:
173 | if not char.isupper():
174 | return False
175 | expected = EXPECT_LOWER
176 | elif expected == EXPECT_LOWER:
177 | if char.isupper():
178 | return False
179 | expected = EXPECT_ANY
180 |
181 | return True
182 |
183 |
184 | class Token:
185 | def __init__(self, kind, content=None):
186 | self.kind = kind
187 | self.content = content
188 |
--------------------------------------------------------------------------------
/src/associated-items.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | .. default-domain:: spec
5 |
6 | .. _fls_l21tjqjkkaa0:
7 |
8 | Associated Items
9 | ================
10 |
11 | .. rubric:: Syntax
12 |
13 | .. syntax::
14 |
15 | AssociatedItem ::=
16 | OuterAttributeOrDoc* (AssociatedItemWithVisibility | TerminatedMacroInvocation)
17 |
18 | AssociatedItemWithVisibility ::=
19 | VisibilityModifier? (
20 | ConstantDeclaration
21 | | FunctionDeclaration
22 | | TypeAliasDeclaration
23 | )
24 |
25 | .. rubric:: Legality Rules
26 |
27 | :dp:`fls_ckzd25qd213t`
28 | An :t:`associated item` is an :t:`item` that appears within an
29 | :t:`implementation` or a :t:`trait`.
30 |
31 | :dp:`fls_5y6ae0xqux57`
32 | An :t:`associated constant` is a :t:`constant` that appears as an
33 | :t:`associated item`.
34 |
35 | :dp:`fls_lj7492aq7fzo`
36 | An :t:`associated function` is a :t:`function` that appears as an
37 | :t:`associated item`.
38 |
39 | :dp:`fls_8cz4rdrklaj4`
40 | An :t:`associated type` is a :t:`type alias` that appears as an
41 | :t:`associated item`.
42 |
43 | :dp:`fls_w8nu8suy7t5`
44 | An :t:`associated type` shall not be used in the :t:`path expression` of a
45 | :t:`struct expression`.
46 |
47 | :dp:`fls_wasocqdnuzd1`
48 | An :t:`associated type` with a :s:`TypeBoundList` shall appear only as an
49 | :t:`associated trait type`.
50 |
51 | :dp:`fls_PeD0DzjK57be`
52 | A :t:`generic associated type` is an :t:`associated type` with
53 | :t:`[generic parameter]s`.
54 |
55 | :dp:`fls_3foYUch29ZtF`
56 | A :t:`lifetime parameter` of a :t:`generic associated type` requires a
57 | :t:`bound` of the form ``T: 'lifetime``, where ``T`` is a :t:`type parameter`
58 | or :c:`Self` and ``'lifetime`` is the :t:`lifetime parameter`, when
59 |
60 | * :dp:`fls_SnQc0zZS57Cz`
61 | The :t:`generic associated type` is used in an :t:`associated function` of
62 | the same :t:`trait`, and
63 |
64 | * :dp:`fls_6Z05BK2JSzpP`
65 | The corresponding :t:`lifetime argument` in the use is not the ``'static``
66 | :t:`lifetime` and has either an explicit :t:`bound` or an :t:`implied bound`
67 | that constrains the :t:`type parameter`, and
68 |
69 | * :dp:`fls_AtItgS1UvwiX`
70 | The intersection of all such uses is not empty.
71 |
72 | :dp:`fls_l3iwn56n1uz8`
73 | An :t:`associated implementation constant` is an :t:`associated constant` that
74 | appears within an :t:`implementation`.
75 |
76 | :dp:`fls_4ftfefcotb4g`
77 | An :t:`associated implementation constant` shall have a :t:`constant
78 | initializer`.
79 |
80 | :dp:`fls_qb5qpfe0uwk`
81 | An :t:`associated implementation function` is an :t:`associated function` that
82 | appears within an :t:`implementation`.
83 |
84 | :dp:`fls_1zlkeb6fz10j`
85 | An :t:`associated implementation function` shall have a :t:`function body`.
86 |
87 | :dp:`fls_tw8u0cc5867l`
88 | An :t:`associated implementation type` is an :t:`associated type` that appears
89 | within an :t:`implementation`.
90 |
91 | :dp:`fls_bx7931x4155h`
92 | An :t:`associated implementation type` shall have an :t:`initialization type`.
93 |
94 | :dp:`fls_bnTcCbDvdp94`
95 | An :t:`associated trait item` is an :t:`associated item` that appears
96 | within a :t:`trait`.
97 |
98 | :dp:`fls_N3cdn4lCZ2Bf`
99 | An :t:`associated trait implementation item` is an :t:`associated item` that
100 | appears within a :t:`trait implementation`.
101 |
102 | :dp:`fls_x564isbhobym`
103 | An :t:`associated trait constant` is an :t:`associated constant` that appears
104 | within a :t:`trait`.
105 |
106 | :dp:`fls_b6nns7oqvdpm`
107 | An :t:`associated trait function` is an :t:`associated function` that appears
108 | within a :t:`trait`.
109 |
110 | :dp:`fls_2TRwCz38kuRz`
111 | An :t:`associated trait function` shall not be subject to :t:`keyword` ``const``.
112 |
113 | :dp:`fls_WnsVATJvUdza`
114 | Every occurrence of an :t:`impl trait type` in the :t:`return type` of an
115 | :t:`associated trait function` is equivalent to referring to a new
116 | anonymous :t:`associated trait type` of the :t:`implemented trait`.
117 |
118 | :dp:`fls_yyhebj4qyk34`
119 | An :t:`associated trait type` is an :t:`associated type` that appears within
120 | a :t:`trait`.
121 |
122 | :dp:`fls_kl9p3ycl5mzf`
123 | An :t:`associated trait type` shall not have an :t:`initialization type`.
124 |
125 | :dp:`fls_a5prbmuruma4`
126 | An :t:`associated trait type` has an implicit :std:`core::marker::Sized`
127 | :t:`bound`.
128 |
129 | :dp:`fls_vp2ov6ykueue`
130 | An :t:`associated trait type` of the form
131 |
132 | .. code-block:: rust
133 |
134 | trait T {
135 | type X: Bound;
136 | }
137 |
138 | :dp:`fls_5uf74nvdm64o`
139 | is equivalent to a :t:`where clause` of the following form:
140 |
141 | .. code-block:: rust
142 |
143 | trait T where Self::X: Bound {
144 | type X;
145 | }
146 |
147 | :dp:`fls_amWtS80fPtza`
148 | An :t:`associated trait implementation function` is an :t:`associated function`
149 | that appears within a :t:`trait implementation`.
150 |
151 | :dp:`fls_Cu8FWrisrqz1`
152 | Every occurrence of an :t:`impl trait type` in the :t:`return type` of an
153 | :t:`associated trait implementation function` is equivalent to referring to the
154 | corresponding :t:`associated trait type` of the corresponding :t:`associated
155 | trait function`.
156 |
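As a non-normative sketch of the rules on ``impl Trait`` in the :t:`return type` of
:t:`[associated trait function]s` and their implementations (hypothetical trait and
names, assuming a toolchain that supports this feature):

.. code-block:: rust

   trait Container {
       fn items(&self) -> impl Iterator<Item = u32>;
   }

   impl Container for Vec<u32> {
       fn items(&self) -> impl Iterator<Item = u32> {
           self.iter().copied()
       }
   }
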
157 | :dp:`fls_oy92gzxgc273`
158 | A :t:`method` is an :t:`associated function` with a :t:`self parameter`.
159 |
160 | :dp:`fls_WXnCWfJGoQx3`
161 | The type of a :t:`self parameter` shall be one of the following:
162 |
163 | * :dp:`fls_OaszUw4IFobz`
164 | A :t:`type specification` resolving to the :t:`implementing type` of the
165 | related :t:`implementation`, or
166 |
167 | * :dp:`fls_Wd2FZRomB5yn`
168 | ``&T`` where ``T`` is one of the :t:`[type]s` listed in this enumeration,
169 | or
170 |
171 | * :dp:`fls_lcEyToYIlcmf`
172 | ``&mut T`` where ``T`` is one of the :t:`[type]s` listed in this
173 | enumeration, or
174 |
175 | * :dp:`fls_IKSPR7ZQMErU`
176 | :std:`core::pin::Pin` ``<T>`` where ``T`` is one of the :t:`[type]s` listed in this
177 | enumeration.
178 |
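As a non-normative illustration (hypothetical type and :t:`[method]s`),
:t:`[self parameter]s` covering several of the :t:`[type]s` listed above:

.. code-block:: rust

   use core::pin::Pin;

   struct Counter { count: u32 }

   impl Counter {
       fn get(&self) -> u32 { self.count }                    // type &Self
       fn increment(&mut self) { self.count += 1; }           // type &mut Self
       fn into_inner(self) -> u32 { self.count }              // type Self
       fn pinned_get(self: Pin<&Self>) -> u32 { self.count }  // type Pin<&Self>
   }
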
179 | :dp:`fls_oHxzyaiT7Qm6`
180 | The :t:`visibility modifier` of an :t:`associated trait item` or :t:`associated
181 | trait implementation item` is rejected, but may still be consumed by
182 | :t:`[macro]s`.
183 |
184 | .. rubric:: Examples
185 |
186 | .. code-block:: rust
187 |
188 | trait Greeter {
189 | const MAX_GREETINGS: i32;
190 |
191 | fn greet(self, message: &str);
192 | }
193 |
194 | struct Implementor {
195 | delivered_greetings: i32
196 | }
197 |
198 | impl Greeter for Implementor {
199 | const MAX_GREETINGS: i32 = 42;
200 |
201 | fn greet(mut self, message: &str) {
202 | if self.delivered_greetings < Self::MAX_GREETINGS {
203 | self.delivered_greetings += 1;
204 | println!("{}", message);
205 | }
206 | }
207 | }
208 |
209 | :dp:`fls_znfADVeOvXHD`
210 | Generic associated type with lifetime bound.
211 |
212 | .. code-block:: rust
213 |
214 | trait LendingIterator {
215 | type Item<'x> where Self: 'x;
216 |
217 | fn next<'a>(&'a mut self) -> Self::Item<'a>;
218 | }
219 |
--------------------------------------------------------------------------------
/src/functions.rst:
--------------------------------------------------------------------------------
1 | .. SPDX-License-Identifier: MIT OR Apache-2.0
2 | SPDX-FileCopyrightText: The Ferrocene Developers
3 |
4 | .. default-domain:: spec
5 |
6 | .. _fls_qcb1n9c0e5hz:
7 |
8 | Functions
9 | =========
10 |
11 | .. rubric:: Syntax
12 |
13 | .. syntax::
14 |
15 | FunctionDeclaration ::=
16 | FunctionQualifierList $$fn$$ Name GenericParameterList? $$($$ FunctionParameterList? $$)$$ ReturnType? WhereClause? (FunctionBody | $$;$$)
17 |
18 | FunctionQualifierList ::=
19 | $$const$$? $$async$$? ItemSafety? AbiSpecification?
20 |
21 | FunctionParameterList ::=
22 | (FunctionParameter ($$,$$ FunctionParameter)* $$,$$?)
23 | | (SelfParameter ($$,$$ FunctionParameter)* $$,$$?)
24 |
25 | FunctionParameter ::=
26 | OuterAttributeOrDoc* (FunctionParameterPattern | FunctionParameterVariadicPart | TypeSpecification)
27 |
28 | FunctionParameterPattern ::=
29 | PatternWithoutAlternation (TypeAscription | ($$:$$ FunctionParameterVariadicPart))
30 |
31 | FunctionParameterVariadicPart ::=
32 | $$...$$
33 |
34 | ReturnType ::=
35 | $$->$$ TypeSpecification
36 |
37 | FunctionBody ::=
38 | BlockExpression
39 |
40 | SelfParameter ::=
41 | OuterAttributeOrDoc* (ShorthandSelf | TypedSelf)
42 |
43 | ShorthandSelf ::=
44 | ($$&$$ LifetimeIndication?)? $$mut$$? $$self$$
45 |
46 | TypedSelf ::=
47 | $$mut$$? $$self$$ TypeAscription
48 |
49 | .. rubric:: Legality Rules
50 |
51 | :dp:`fls_gn1ngtx2tp2s`
52 | A :t:`function` is a :t:`value` of a :t:`function type` that models a behavior.
53 |
54 | :dp:`fls_bdx9gnnjxru3`
55 | A :t:`function` declares a unique :t:`function item type` for itself.
56 |
57 | :dp:`fls_87jnkimc15gi`
58 | A :t:`function qualifier` is a :t:`construct` that determines the role of
59 | a :t:`function`.
60 |
61 | :dp:`fls_nwywh1vjt6rr`
62 | A :t:`function` shall not be subject to both :t:`keyword` ``async`` and
63 | :t:`keyword` ``const``.
64 |
65 | :dp:`fls_uwuthzfgslif`
66 | A :t:`function parameter` is a :t:`construct` that yields a set of
67 | :t:`[binding]s` that bind matched input :t:`[value]s` to :t:`[name]s` at the
68 | site of a :t:`call expression` or a :t:`method call expression`.
69 |
70 | :dp:`fls_ymeo93t4mz4`
71 | A :t:`self parameter` is a :t:`function parameter` expressed by :t:`keyword`
72 | ``self``.
73 |
74 | :dp:`fls_ijbt4tgnl95n`
75 | A :t:`function` shall not specify a :t:`self parameter` unless it is an
76 | :t:`associated function`.
77 |
78 | :dp:`fls_AAYJDCNMJgTq`
79 | The :t:`type` of a :t:`function parameter` is determined as follows:
80 |
81 | * :dp:`fls_PGtp39f6gJwU`
82 | If the :t:`function parameter` is a :t:`self parameter` without a :s:`TypeSpecification`:
83 |
84 | * :dp:`fls_yZ2yIXxmy2ri`
85 | And the :t:`self parameter` has token ``&`` and :t:`keyword` ``mut``, then the :t:`type` is ``&mut Self``.
86 |
87 | * :dp:`fls_35aSvBxBnIzm`
88 | And the :t:`self parameter` has token ``&`` and lacks :t:`keyword` ``mut``, then the :t:`type` is ``&Self``.
89 |
90 | * :dp:`fls_Ogziu8S01qPQ`
91 | And the :t:`self parameter` lacks token ``&`` and :t:`keyword` ``mut``, then the :t:`type` is ``Self``.
92 |
93 | * :dp:`fls_xCSsxYUZUFed`
94 | Otherwise the :t:`type` is the specified :t:`type`.
95 |
96 | :dp:`fls_lxzinvqveuqh`
97 | The :t:`pattern` of a :t:`function parameter` shall be an :t:`irrefutable
98 | pattern`.
99 |
100 | :dp:`fls_kcAbTPZXQ5Y8`
101 | The :t:`expected type` of the :t:`pattern` of a :t:`function parameter` is the :t:`type` of the :t:`function parameter`.
102 |
103 | :dp:`fls_PGDKWK7nPvgw`
104 | The :t:`[binding]s` of all :t:`[pattern]s` of all :t:`[function parameter]s` of a :t:`function` shall not shadow another.
105 |
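As a non-normative sketch, a :t:`function parameter` whose :t:`irrefutable pattern`
introduces two distinct :t:`[binding]s`:

.. code-block:: rust

   fn swap((first, second): (i32, i32)) -> (i32, i32) {
       (second, first)
   }
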
106 | :dp:`fls_o4uSLPo00KUg`
107 | A :dt:`variadic function` is an :t:`external function` that specifies
108 | :s:`FunctionParameterVariadicPart` as the last :t:`function parameter`.
109 |
110 | :dp:`fls_icdzs1mjh0n4`
111 | A :t:`variadic function` shall specify one of the following :t:`[ABI]s`:
112 |
113 | * :dp:`fls_OR85NVifPwjr`
114 | ``extern "C"``
115 | * :dp:`fls_4s2IdfYDzPrX`
116 | ``extern "C-unwind"``
117 | * :dp:`fls_ZJJppPfiJRou`
118 | ``extern "aapcs"``
119 | * :dp:`fls_jOyZh9ujWWHQ`
120 | ``extern "aapcs-unwind"``
121 | * :dp:`fls_Xdr0bFwxhWiB`
122 | ``extern "cdecl"``
123 | * :dp:`fls_DpTFEHZAABdD`
124 | ``extern "cdecl-unwind"``
125 | * :dp:`fls_b7FTlWfnX2OI`
126 | ``extern "efiapi"``
127 | * :dp:`fls_6urL6fZ5cpaA`
128 | ``extern "system"``
129 | * :dp:`fls_TMOzb6cYIOlH`
130 | ``extern "system-unwind"``
131 | * :dp:`fls_eHPWHrvs7ETl`
132 | ``extern "sysv64"``
133 | * :dp:`fls_mjCrvmikm58M`
134 | ``extern "sysv64-unwind"``
135 | * :dp:`fls_4EUb9zFatZ97`
136 | ``extern "win64"``
137 | * :dp:`fls_4B4B5FIqAes9`
138 | ``extern "win64-unwind"``
139 |
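As a non-normative sketch, a :t:`variadic function` declared in an external block using
one of the permitted :t:`[ABI]s` (here the C standard library's ``printf``):

.. code-block:: rust

   use core::ffi::{c_char, c_int};

   extern "C" {
       fn printf(format: *const c_char, ...) -> c_int;
   }
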
140 | :dp:`fls_vljy4mm0zca2`
141 | A :t:`return type` is the :t:`type` of the result a :t:`function`, :t:`closure type` or :t:`function pointer type` returns.
142 |
143 | :dp:`fls_EqJb3Jl3vK8K`
144 | The :t:`return type` of a :t:`function` is determined as follows:
145 |
146 | * :dp:`fls_C7dvzcXcpQCy`
147 | If the :s:`FunctionDeclaration` specifies a :s:`ReturnType`, then the :t:`return type` is the specified :s:`ReturnType`.
148 |
149 | * :dp:`fls_J8X8ahnJLrMo`
150 | Otherwise the :t:`return type` is the :t:`unit type`.
151 |
152 | :dp:`fls_927nfm5mkbsp`
153 | A :t:`function body` is the :t:`block expression` of a :t:`function`.
154 |
155 | :dp:`fls_yfm0jh62oaxr`
156 | A :t:`function` shall have a :t:`function body` unless it is an
157 | :t:`associated trait function` or an :t:`external function`.
158 |
159 | :dp:`fls_bHwy8FLzEUi3`
160 | A :t:`function body` denotes a :t:`control flow boundary`.
161 |
162 | :dp:`fls_5Q861wb08DU3`
163 | A :t:`function body` of an :t:`async function` denotes an
164 | :t:`async control flow boundary`.
165 |
166 | :dp:`fls_owdlsaaygtho`
167 | A :t:`function signature` is a unique identification of a :t:`function`
168 | that encompasses its :t:`[function qualifier]s`, :t:`name`,
169 | :t:`[generic parameter]s`, :t:`[function parameter]s`, :t:`return type`, and
170 | :t:`where clause`.
171 |
172 | :dp:`fls_2049qu3ji5x7`
173 | A :t:`constant function` is a :t:`function` subject to :t:`keyword` ``const``.
174 |
175 | :dp:`fls_7mlanuh5mvpn`
176 | The :t:`function body` of a :t:`constant function` shall be a
177 | :t:`constant expression`.
178 |
179 | :dp:`fls_otr3hgp8lj1q`
180 | A :t:`constant function` shall be callable from a :t:`constant context`.
181 |
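As a non-normative illustration of these rules, a minimal :t:`constant function`
together with a call to it from a :t:`constant context`:

.. code-block:: rust

   const fn square(value: u32) -> u32 {
       value * value
   }

   const AREA: u32 = square(4);
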
182 | :dp:`fls_m3jiunibqj81`
183 | An :t:`async function` is a :t:`function` subject to :t:`keyword` ``async``. An
184 | :t:`async function` of the form
185 |
186 | .. code-block:: rust
187 |
188 | async fn async_fn(param: &param_type) -> return_type {
189 | /* tail expression */
190 | }
191 |
192 | :dp:`fls_7vogmqyd87ey`
193 | is equivalent to :t:`function`
194 |
195 | .. code-block:: rust
196 |
197 | fn async_fn<'a>(param: &'a param_type) -> impl Future