├── proposals ├── text │ ├── LTR.jpg │ ├── RTL.jpg │ ├── MTEXT.jpg │ ├── Size.jpg │ ├── bold.jpg │ ├── color.jpg │ ├── width.jpg │ ├── Column.jpg │ ├── Justify.jpg │ ├── bgcolor.jpg │ ├── italic.jpg │ ├── margins.png │ ├── oblique.jpg │ ├── overline.jpg │ ├── tabstops.png │ ├── typeface.jpg │ ├── weight.jpg │ ├── LeftAlign.jpg │ ├── RightAlign.jpg │ ├── direction.jpg │ ├── distribute.jpg │ ├── leftindent.png │ ├── linespace.jpg │ ├── undelrine.jpg │ ├── CenterAlign.jpg │ ├── rightindent.png │ ├── strikethrough.jpg │ ├── FirstLineIndent.jpg │ ├── ParagraphSpace.png │ ├── characterSpace.jpg │ ├── columnAlignment.png │ └── marginsAndIndents.png ├── LineStyle │ ├── roundcap.png │ ├── roundJoint.png │ ├── linePatterns.jpg │ ├── rectanglecap.png │ ├── triangleoutcap.png │ ├── screenSpacePattern.png │ └── README.md ├── android │ ├── buildPass.png │ ├── android-model.png │ └── README.md ├── usdotio │ ├── Workflow.png │ ├── Analysis menu.png │ ├── Object Analysis.png │ ├── OpenTimelineIO & OpenUSD.pdf │ ├── Speaker and Scene Analysis.png │ └── LICENSE.md ├── spline-animation │ ├── C0.png │ ├── C1.png │ ├── G1.png │ ├── bezier.png │ ├── boldS.png │ ├── held.png │ ├── linear.png │ ├── repeat.png │ ├── reset.png │ ├── endVert.png │ ├── hermite.png │ ├── tanLens.png │ ├── tangents.png │ ├── extrapLoops.png │ ├── innerLoops.png │ ├── keepRatio.png │ ├── keepStart.png │ ├── loopRegions.png │ ├── oscillate.png │ ├── regressiveS.png │ ├── startVert.png │ ├── discontinuous.png │ ├── nearVertical.png │ ├── centerVertical.png │ ├── fourThirdOneThird.png │ ├── oneThirdFourThird.png │ ├── regressiveSStandard.png │ └── regressive.md ├── image-planes │ ├── pinhole-camera-diagram.png │ └── README.md ├── tf_utf8_identifiers │ ├── codepoint_collation.png │ └── stage_builder_utf8.py ├── boost_python_removal │ ├── module_compile_times.png │ ├── compile_times │ │ ├── pybind11_vs_boost_python1.png │ │ ├── pybind11_vs_boost_python_rerun_adj.png │ │ └── README.md │ └── README.md ├── hydra2 │ ├── ModularSceneDelegationAndManipulationInHydra.pdf │ └── README.md ├── multiple-animations │ ├── README.md │ ├── general.md │ └── usdskel.md ├── semantic_schema │ ├── schema.usda │ └── README.md ├── selfAssemblingModelHierarchy │ └── example.patch ├── variant_set_metadata │ ├── example.usda │ └── README.md ├── Readme.md ├── materialx-versioning │ └── README.md ├── pattern-based-collections │ └── README.md ├── language │ └── README.md ├── profiles │ └── README.md ├── accessibility │ └── README.md └── PickAPI │ └── README.md ├── .github └── pull_request_template.md └── README.md /proposals/text/LTR.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/LTR.jpg -------------------------------------------------------------------------------- /proposals/text/RTL.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/RTL.jpg -------------------------------------------------------------------------------- /proposals/text/MTEXT.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/MTEXT.jpg -------------------------------------------------------------------------------- /proposals/text/Size.jpg: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/Size.jpg -------------------------------------------------------------------------------- /proposals/text/bold.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/bold.jpg -------------------------------------------------------------------------------- /proposals/text/color.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/color.jpg -------------------------------------------------------------------------------- /proposals/text/width.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/width.jpg -------------------------------------------------------------------------------- /proposals/text/Column.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/Column.jpg -------------------------------------------------------------------------------- /proposals/text/Justify.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/Justify.jpg -------------------------------------------------------------------------------- /proposals/text/bgcolor.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/bgcolor.jpg -------------------------------------------------------------------------------- /proposals/text/italic.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/italic.jpg -------------------------------------------------------------------------------- /proposals/text/margins.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/margins.png -------------------------------------------------------------------------------- /proposals/text/oblique.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/oblique.jpg -------------------------------------------------------------------------------- /proposals/text/overline.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/overline.jpg -------------------------------------------------------------------------------- /proposals/text/tabstops.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/tabstops.png -------------------------------------------------------------------------------- /proposals/text/typeface.jpg: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/typeface.jpg -------------------------------------------------------------------------------- /proposals/text/weight.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/weight.jpg -------------------------------------------------------------------------------- /proposals/text/LeftAlign.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/LeftAlign.jpg -------------------------------------------------------------------------------- /proposals/text/RightAlign.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/RightAlign.jpg -------------------------------------------------------------------------------- /proposals/text/direction.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/direction.jpg -------------------------------------------------------------------------------- /proposals/text/distribute.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/distribute.jpg -------------------------------------------------------------------------------- /proposals/text/leftindent.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/leftindent.png -------------------------------------------------------------------------------- /proposals/text/linespace.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/linespace.jpg -------------------------------------------------------------------------------- /proposals/text/undelrine.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/undelrine.jpg -------------------------------------------------------------------------------- /proposals/LineStyle/roundcap.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/roundcap.png -------------------------------------------------------------------------------- /proposals/android/buildPass.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/android/buildPass.png -------------------------------------------------------------------------------- /proposals/text/CenterAlign.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/CenterAlign.jpg -------------------------------------------------------------------------------- /proposals/text/rightindent.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/rightindent.png -------------------------------------------------------------------------------- /proposals/text/strikethrough.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/strikethrough.jpg -------------------------------------------------------------------------------- /proposals/usdotio/Workflow.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/usdotio/Workflow.png -------------------------------------------------------------------------------- /proposals/LineStyle/roundJoint.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/roundJoint.png -------------------------------------------------------------------------------- /proposals/spline-animation/C0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/C0.png -------------------------------------------------------------------------------- /proposals/spline-animation/C1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/C1.png -------------------------------------------------------------------------------- /proposals/spline-animation/G1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/G1.png -------------------------------------------------------------------------------- /proposals/text/FirstLineIndent.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/FirstLineIndent.jpg -------------------------------------------------------------------------------- /proposals/text/ParagraphSpace.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/ParagraphSpace.png -------------------------------------------------------------------------------- /proposals/text/characterSpace.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/characterSpace.jpg -------------------------------------------------------------------------------- /proposals/text/columnAlignment.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/columnAlignment.png -------------------------------------------------------------------------------- /proposals/LineStyle/linePatterns.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/linePatterns.jpg 
-------------------------------------------------------------------------------- /proposals/LineStyle/rectanglecap.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/rectanglecap.png -------------------------------------------------------------------------------- /proposals/android/android-model.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/android/android-model.png -------------------------------------------------------------------------------- /proposals/spline-animation/bezier.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/bezier.png -------------------------------------------------------------------------------- /proposals/spline-animation/boldS.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/boldS.png -------------------------------------------------------------------------------- /proposals/spline-animation/held.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/held.png -------------------------------------------------------------------------------- /proposals/spline-animation/linear.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/linear.png -------------------------------------------------------------------------------- /proposals/spline-animation/repeat.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/repeat.png -------------------------------------------------------------------------------- /proposals/spline-animation/reset.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/reset.png -------------------------------------------------------------------------------- /proposals/text/marginsAndIndents.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/text/marginsAndIndents.png -------------------------------------------------------------------------------- /proposals/usdotio/Analysis menu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/usdotio/Analysis menu.png -------------------------------------------------------------------------------- /proposals/usdotio/Object Analysis.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/usdotio/Object Analysis.png -------------------------------------------------------------------------------- /proposals/LineStyle/triangleoutcap.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/triangleoutcap.png -------------------------------------------------------------------------------- /proposals/spline-animation/endVert.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/endVert.png -------------------------------------------------------------------------------- /proposals/spline-animation/hermite.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/hermite.png -------------------------------------------------------------------------------- /proposals/spline-animation/tanLens.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/tanLens.png -------------------------------------------------------------------------------- /proposals/spline-animation/tangents.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/tangents.png -------------------------------------------------------------------------------- /proposals/LineStyle/screenSpacePattern.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/LineStyle/screenSpacePattern.png -------------------------------------------------------------------------------- /proposals/spline-animation/extrapLoops.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/extrapLoops.png -------------------------------------------------------------------------------- /proposals/spline-animation/innerLoops.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/innerLoops.png -------------------------------------------------------------------------------- /proposals/spline-animation/keepRatio.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/keepRatio.png -------------------------------------------------------------------------------- /proposals/spline-animation/keepStart.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/keepStart.png -------------------------------------------------------------------------------- /proposals/spline-animation/loopRegions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/loopRegions.png -------------------------------------------------------------------------------- /proposals/spline-animation/oscillate.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/oscillate.png -------------------------------------------------------------------------------- /proposals/spline-animation/regressiveS.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/regressiveS.png -------------------------------------------------------------------------------- /proposals/spline-animation/startVert.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/startVert.png -------------------------------------------------------------------------------- /proposals/spline-animation/discontinuous.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/discontinuous.png -------------------------------------------------------------------------------- /proposals/spline-animation/nearVertical.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/nearVertical.png -------------------------------------------------------------------------------- /proposals/spline-animation/centerVertical.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/centerVertical.png -------------------------------------------------------------------------------- /proposals/usdotio/OpenTimelineIO & OpenUSD.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/usdotio/OpenTimelineIO & OpenUSD.pdf -------------------------------------------------------------------------------- /proposals/image-planes/pinhole-camera-diagram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/image-planes/pinhole-camera-diagram.png -------------------------------------------------------------------------------- /proposals/spline-animation/fourThirdOneThird.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/fourThirdOneThird.png -------------------------------------------------------------------------------- /proposals/spline-animation/oneThirdFourThird.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/oneThirdFourThird.png -------------------------------------------------------------------------------- /proposals/usdotio/Speaker and Scene Analysis.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/usdotio/Speaker and Scene Analysis.png 
-------------------------------------------------------------------------------- /proposals/spline-animation/regressiveSStandard.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/spline-animation/regressiveSStandard.png -------------------------------------------------------------------------------- /proposals/tf_utf8_identifiers/codepoint_collation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/tf_utf8_identifiers/codepoint_collation.png -------------------------------------------------------------------------------- /proposals/boost_python_removal/module_compile_times.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/boost_python_removal/module_compile_times.png -------------------------------------------------------------------------------- /proposals/hydra2/ModularSceneDelegationAndManipulationInHydra.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/hydra2/ModularSceneDelegationAndManipulationInHydra.pdf -------------------------------------------------------------------------------- /proposals/boost_python_removal/compile_times/pybind11_vs_boost_python1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/boost_python_removal/compile_times/pybind11_vs_boost_python1.png -------------------------------------------------------------------------------- /proposals/boost_python_removal/compile_times/pybind11_vs_boost_python_rerun_adj.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anderslanglands/OpenUSD-proposals/main/proposals/boost_python_removal/compile_times/pybind11_vs_boost_python_rerun_adj.png -------------------------------------------------------------------------------- /proposals/multiple-animations/README.md: -------------------------------------------------------------------------------- 1 | # Multiple Animation Support 2 | 3 | This proposal describes how to add support for multiple animations per asset in USD that can be accessed simultaneously by a runtime. 4 | 5 | For example, a game character may have multiple cycles like walk, run, idle. An environment prop may also have multiple animations like a door opening or blowing in the wind. 6 | 7 | The proposal is split into two parts. 8 | 9 | 1. [USDSkel support for multiple bound animations](usdskel.md) 10 | 2. 
[General support for all prim types](general.md) -------------------------------------------------------------------------------- /proposals/semantic_schema/schema.usda: -------------------------------------------------------------------------------- 1 | #usda 1.0 2 | ( 3 | subLayers = [ 4 | @usd/schema.usda@ 5 | ] 6 | ) 7 | 8 | over "GLOBAL" ( 9 | customData = { 10 | string libraryName = "usdSemantics" 11 | string libraryPath = "./" 12 | } 13 | ) { 14 | } 15 | 16 | class "SemanticsAPI" ( 17 | inherits = 18 | 19 | customData = { 20 | token apiSchemaType = "multipleApply" 21 | token propertyNamespacePrefix = "semantics" 22 | } 23 | ) 24 | { 25 | token[] labels ( 26 | doc = "List of semantic labels" 27 | ) 28 | } 29 | -------------------------------------------------------------------------------- /.github/pull_request_template.md: -------------------------------------------------------------------------------- 1 | ### Description of Proposal 2 | 3 | 8 | 9 | 10 | [Link to Rendered Proposal](https://github.com/PixarAnimationStudios/OpenUSD-proposals/blob/main/README.md) 11 | 12 | ### Supporting Materials 13 | 14 | 17 | 18 | ### Contributing 19 | 20 | 25 | - [ ] I agree to and accept the [Supplemental Terms](https://graphics.pixar.com/usd/release/contributing_supplemental.html). 26 | -------------------------------------------------------------------------------- /proposals/hydra2/README.md: -------------------------------------------------------------------------------- 1 | ![Status:Implemented, 21.11](https://img.shields.io/badge/Implemented,%2021.11-blue) 2 | # Modular Scene Delegation and Manipulation in Hydra 3 | Copyright © Pixar Animation Studios, 2021, version 1.0 4 | 5 | ## Proposal 6 | 7 | We propose a new mechanism that represents the scene to the renderer backend as 8 | a full scene hierarchy, each node of which is capable of indirecting its 9 | requests to the delegated source scene. Our proposed system is also flexible 10 | and allows custom data to be transported easily. 11 | 12 | We introduce two new concepts: data sources and scene indices. From a high level 13 | a data source provides the data indirection, and a scene index provides the 14 | ability to manipulate scene data. We also discuss how schemas can be used to 15 | give more meaning to data sources, how dependencies can be used across data 16 | sources, and how to locate data sources. 17 | 18 | The full proposal may be viewed here: [Proposal PDF](ModularSceneDelegationAndManipulationInHydra.pdf) 19 | -------------------------------------------------------------------------------- /proposals/selfAssemblingModelHierarchy/example.patch: -------------------------------------------------------------------------------- 1 | diff --git a/pxr/usd/usd/primData.cpp b/pxr/usd/usd/primData.cpp 2 | index ee2250176..1fd362a22 100644 3 | --- a/pxr/usd/usd/primData.cpp 4 | +++ b/pxr/usd/usd/primData.cpp 5 | @@ -146,6 +146,21 @@ Usd_PrimData::_ComposeAndCacheFlags(Usd_PrimDataConstPtr parent, 6 | isComponent = KindRegistry::IsComponent(kind); 7 | isModel = isGroup || isComponent || KindRegistry::IsModel(kind); 8 | } 9 | + else { 10 | + // Enables propagation of model hierarchy potential for 11 | + // non-root primitives. This requires model hierarchy to be 12 | + // opted into at the root level. The pseudo root is trivially 13 | + // a group. Without this check, stages not using model 14 | + // hierarchy would pay for the extra reads of `kind` metadata. 
15 | + // 16 | + // A model hierarchy "opt-out" approach would be generally 17 | + // simpler and not require the IsPseudoRoot check 18 | + // in favor of `isGroup` always being true when the parent 19 | + // is a group, but would result in extra `kind` reads in stages 20 | + // not participating in model hierarchy. 21 | + isGroup = !parent->IsPseudoRoot(); 22 | + isModel = isGroup; 23 | + } 24 | } 25 | _flags[Usd_PrimGroupFlag] = isGroup; 26 | _flags[Usd_PrimModelFlag] = isModel; -------------------------------------------------------------------------------- /proposals/variant_set_metadata/example.usda: -------------------------------------------------------------------------------- 1 | #usda 1.0 2 | ( 3 | defaultPrim = "MilkCartonA" 4 | ) 5 | 6 | def Xform "MilkCartonA" ( 7 | kind = "prop" 8 | variants = { 9 | string modelingVariant = "Carton_Opened" 10 | string shadingComplexity = "full" 11 | string shadingTest = "milkBrandA" 12 | } 13 | prepend variantSets = ["modelingVariant", "shadingComplexity", "shadingTest"] 14 | ) 15 | { 16 | variantSet "modelingVariant" ( 17 | doc = "Modeling variations for the asset" 18 | ) = { 19 | "ALL_VARIANTS" { 20 | float3[] extentsHint = [(-6.27056, -6.53532, 0), (6.14027, 6.10374, 29.8274)] 21 | 22 | } 23 | "Carton_Opened" { 24 | float3[] extentsHint = [(-6.27056, -6.53532, 0), (6.14027, 6.10374, 29.8274)] 25 | 26 | } 27 | "Carton_Sealed" { 28 | float3[] extentsHint = [(-6.27056, -6.44992, 0), (6.14027, 6.10374, 29.2792)] 29 | 30 | } 31 | } 32 | variantSet "shadingComplexity" = { 33 | reorder variants = ["none", "display", "modeling", "full"] 34 | "display" { 35 | 36 | } 37 | "full" { 38 | 39 | } 40 | "modeling" { 41 | 42 | } 43 | "none" { 44 | 45 | } 46 | } 47 | 48 | variantSet "shadingTest" ( 49 | """Added UPC# for the display names""" 50 | displayName = "Shading Test Variations" 51 | doc = "Shading variations that are currently being explored for product placement" 52 | variantDisplayNames = { "milkBrandA" : "Milk Brand A (UPC#: 123-ABC)", "milkBrandB" : "Milk Brand B (UPC#: 456-DEF)" } 53 | hidden = True 54 | ) = { 55 | "milkBrandA" { 56 | 57 | } 58 | "milkBrandB" { 59 | 60 | } 61 | } 62 | } 63 | -------------------------------------------------------------------------------- /proposals/tf_utf8_identifiers/stage_builder_utf8.py: -------------------------------------------------------------------------------- 1 | # Copyright 2023 NVIDIA CORPORATION 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the "License"); 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # http://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an "AS IS" BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 
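# What this script builds (summarized from the code below): it first creates a small
# reference layer, "reference.usd", containing a single Scope prim with one attribute,
# then builds "complicated.usd" with 100 root Scopes, each with 100 children,
# 10 grandchildren per child, and 10 great-grandchildren per grandchild.
# Attributes, relationships, primvars, and references to the reference layer are
# added along the way, presumably to produce a large, identifier-heavy stage for
# performance testing.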
14 | 15 | from pxr import Usd, UsdGeom, Sdf 16 | ​ 17 | stage = Usd.Stage.CreateNew("./complicated.usd") 18 | reference = Usd.Stage.CreateNew("./reference.usd") 19 | reference_scope = UsdGeom.Scope.Define(reference, "/reference") 20 | reference.SetDefaultPrim(reference_scope.GetPrim()) 21 | reference_scope.GetPrim().CreateAttribute("test", Sdf.ValueTypeNames.Float) 22 | reference.GetRootLayer().Save() 23 | ​ 24 | for i in range(100): 25 | root_path = Sdf.Path(f"/root_{i}") 26 | root = UsdGeom.Scope.Define(stage, root_path) 27 | root.GetPrim().CreateAttribute("attr1", Sdf.ValueTypeNames.Int) 28 | root.GetPrim().CreateAttribute("attr2", Sdf.ValueTypeNames.Int) 29 | root.GetPrim().CreateAttribute("attr3", Sdf.ValueTypeNames.IntArray) 30 | root.GetPrim().CreateAttribute("attr4", Sdf.ValueTypeNames.IntArray) 31 | for j in range(100): 32 | child_path = root_path.AppendChild(f"child_{j}") 33 | child = UsdGeom.Scope.Define(stage, child_path) 34 | rel = child.GetPrim().CreateRelationship("rel1") 35 | rel.SetTargets([root_path]) 36 | for k in range(10): 37 | grandchild_path = child_path.AppendChild(f"grandchild_{k}") 38 | grandchild = UsdGeom.Scope.Define(stage, grandchild_path) 39 | rel = grandchild.GetPrim().CreateRelationship("rel2") 40 | rel.SetTargets([child_path]) 41 | for m in range(10): 42 | greatgrandchild_path = grandchild_path.AppendChild(f"greatgrandchild_{m}") 43 | greatgrandchild = UsdGeom.Scope.Define(stage, greatgrandchild_path) 44 | rel = greatgrandchild.GetPrim().CreateRelationship("rel3") 45 | rel.SetTargets([grandchild_path]) 46 | for n in range(12): 47 | greatgrandchild.GetPrim().CreateAttribute(f"primvars:primvar{n}", Sdf.ValueTypeNames.IntArray) 48 | references = greatgrandchild.GetPrim().GetReferences() 49 | references.AddReference("./reference.usd") 50 | ​ 51 | stage.GetRootLayer().Save() -------------------------------------------------------------------------------- /proposals/boost_python_removal/compile_times/README.md: -------------------------------------------------------------------------------- 1 | # Compile Time Benchmarks 2 | 3 | Copyright © 2024, Pixar Animation Studios, version 1.0 4 | 5 | The pybind11 documentation contains a benchmark showing the compile 6 | times for a module wrapping N C++ class, each with 4 member functions, 7 | using both pybind11 and boost::python. A graph showing the results 8 | is published with the documentation and reproduced here: 9 | 10 | ![Compile time benchmark from pybind11 docs](pybind11_vs_boost_python1.png) 11 | 12 | Note that this graph (and the ones below) are log-log plots. 13 | 14 | These results were last updated in 2016 and generated using Apple LLVM 7.0.2 15 | (clang-700). Re-running this benchmark using a modern compiler and machine 16 | yielded significantly different results. 17 | 18 | In addition, the module code used by the benchmark for boost::python was 19 | not as minimal as it could be. It includes the header `boost/python.hpp`, 20 | which includes all of boost::python's headers and functionality, instead 21 | of just the headers necessary for the module. The pybind11 code is required 22 | to include the `pybind11/pybind11.hpp` header because pybind11 intentionally 23 | put the majority of its functionality in a single header. boost::python 24 | is much more modular, allowing client code to pick and choose which headers 25 | are needed. This means the benchmark isn't comparing apples-to-apples; it 26 | is biased towards pybind11. 
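To make the include-granularity point above concrete, here is a minimal sketch of the kind of module the benchmark generates (one class with four member functions), written once against granular boost::python headers and once against pybind11's single main header. The class and function names are illustrative placeholders rather than the benchmark's actual generated code, and the exact set of granular headers a real module needs may differ:

```cpp
// module_bp.cpp -- boost::python, including only the headers this module needs
// instead of the catch-all <boost/python.hpp>.
#include <boost/python/module.hpp>
#include <boost/python/class.hpp>

struct Example {
    int fn0(int x) { return x; }
    int fn1(int x) { return x; }
    int fn2(int x) { return x; }
    int fn3(int x) { return x; }
};

BOOST_PYTHON_MODULE(example_bp)
{
    boost::python::class_<Example>("Example")
        .def("fn0", &Example::fn0)
        .def("fn1", &Example::fn1)
        .def("fn2", &Example::fn2)
        .def("fn3", &Example::fn3);
}
```

```cpp
// module_pb.cpp -- pybind11, which deliberately packs most of its
// functionality into this one header.
#include <pybind11/pybind11.h>

struct Example {
    int fn0(int x) { return x; }
    int fn1(int x) { return x; }
    int fn2(int x) { return x; }
    int fn3(int x) { return x; }
};

PYBIND11_MODULE(example_pb, m)
{
    pybind11::class_<Example>(m, "Example")
        .def("fn0", &Example::fn0)
        .def("fn1", &Example::fn1)
        .def("fn2", &Example::fn2)
        .def("fn3", &Example::fn3);
}
```

Restricting the boost::python module to the granular includes is roughly what the "boost::python (updated)" column in the table below reflects.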
27 | 28 | The results below show the benchmark results using the original boost::python 29 | module code and an updated version correcting the above issue, and the 30 | original pybind11 code. 31 | 32 | These tests were run on a Mac Mini M1 with 8 cores and 16 GB RAM, using 33 | pybind11 2.13.1 and boost v1.82.0. 34 | 35 | ![Benchmark rerun graph](pybind11_vs_boost_python_rerun_adj.png) 36 | 37 | | Number of Functions | boost::python | boost::python (updated) | pybind11 | 38 | |---------------------|---------------|--------------------------|----------| 39 | | 4 | 1.108 | 0.9089 | 1.779 | 40 | | 8 | 1.213 | 1.047 | 1.918 | 41 | | 16 | 1.426 | 1.23 | 2.143 | 42 | | 32 | 1.828 | 1.642 | 2.504 | 43 | | 64 | 2.635 | 2.474 | 3.329 | 44 | | 128 | 4.317 | 4.117 | 4.929 | 45 | | 256 | 7.685 | 7.471 | 8.205 | 46 | | 512 | 14.731 | 14.572 | 15.118 | 47 | | 1024 | 29.583 | 29.457 | 29.853 | 48 | | 2048 | 61.615 | 62.15 | 62.049 | 49 | -------------------------------------------------------------------------------- /proposals/android/README.md: -------------------------------------------------------------------------------- 1 | # Introduction 2 | 3 | The importance of mobile device graphics keeps increasing in today's digital world. The mobile device market places a wide range of demands on graphics capabilities for presentation, interaction, visualization, and other related functions. 4 | We propose a new rendering pipeline that extends USD to support the Android platform with a Vulkan backend. The new solution lets Android users benefit from the USD ecosystem. 5 | 6 | # Technical challenges: 7 | These are the limitations in USD that we have addressed in our Android work. 8 | - USD does not support Android, including the build system, rendering pipeline, plugin libraries, file loader system, and so on 9 | - The HgiVulkan backend does not support the Android Vulkan extensions (e.g. VK_USE_PLATFORM_ANDROID_KHR, VK_KHR_ANDROID_SURFACE_EXTENSION_NAME), so the Vulkan context cannot be initialized on Android. 10 | - In the prototype mentioned in the proposal, the minimum Vulkan version supported by HgiVulkan is 1.2, which limits the number of devices that can be supported 11 | - Currently there is no Android demo app 12 | 13 | 14 | 15 | # Implementation 16 | 17 | ## Recommended development environment 18 | - IDE: Android Studio Flamingo | 2022.2.1 Patch 2 19 | - Development Kit: The minimum supported NDK version is 22.0.7026061. In this prototype, we use 25.2.9519653, which is recommended. 20 | 21 | 22 | ## Changes/Development: 23 | - Compiled all core USD components on Android. Disabled projects that can't run on Android, including usdviewq and all projects in USD/pxr/usd/bin. 24 | 25 | - Created a USD plugin management approach that meets Android's security policy requirements. Originally, the USD plugin management mechanism loaded plugin libraries via relative paths read from a JSON file. When we use NDK development on Android, we need to put the .so files in the jniLibs.srcDirs directory for the compiler to link them. After that, the .so files will be packaged into the APK file. If we put the JSON file in the jniLibs.srcDirs directory, it will not be packaged into the APK. To solve this problem, we put all non-.so files in the Android app-specific location and specify this location through the environment variable 'PXR_PLUGINPATH_NAME'. After parsing the JSON file, USD is forced to use jniLibs.srcDirs as the location of the .so files. 26 | 27 | - Demangle function names properly for the Android platform.
The real namespace of the standard library on Android is '__ndk1' rather than 'std', so the original implementation can't demangle the symbols properly. 28 | 29 | - Created the first Android Studio project to render a final frame on Android with the HgiVulkan backend 30 | 31 | 32 | ## HgiVulkan Android adaptation 33 | The HgiVulkan module for Android depends on another piece of work, whose proposal is https://github.com/PixarAnimationStudios/OpenUSD-proposals/pull/15. It elaborates on how to support Vulkan 1.0 in HgiVulkan. The current HgiVulkan supports Vulkan 1.2 or higher, but there are very few Android devices that support Vulkan 1.2. Therefore, we degrade HgiVulkan to support Vulkan 1.0, which is supported by most Android devices. Here we list the work to adapt HgiVulkan to Android. 34 | - Add the Android-specific Vulkan extensions to HgiVulkan to enable correct initialization of Vulkan on Android. 35 | - VK_USE_PLATFORM_ANDROID_KHR 36 | - VK_KHR_ANDROID_SURFACE_EXTENSION_NAME 37 | 38 | # Showcase 39 | - Device: Galaxy Tab S8+ (model: SM-X800) 40 | - Android version: 13 41 | 42 | ![Screenshot](android-model.png) 43 | 44 | 45 | # Screenshot of build process 46 | ![buildpass](buildPass.png) 47 | 48 | 49 | # Future Work 50 | - Validation on more Android devices 51 | - Improved stability on Android devices 52 | - Improved performance 53 | -------------------------------------------------------------------------------- /proposals/multiple-animations/general.md: -------------------------------------------------------------------------------- 1 | # Multiple Animation for General Prims 2 | 3 | Ideally, we would support multiple animations generically across any USD prim type. 4 | 5 | My concern is that the necessary changes could take a significant amount of time and work. That said, I recognize the strong benefits of having a singular solution that systems can build around. 6 | 7 | ## Proposals 8 | 9 | There are several forms that this could take. I'll do my best to summarize previous discussions in their various forms. 10 | 11 | ### Non-exclusive variants 12 | 13 | A new runtime variant form could be added that allows each prim to specify variants that the runtime can apply to the prim hierarchy. 14 | 15 | They would ideally have limitations on what changes they can provide, such as disallowing changes to references, to avoid having to re-evaluate composition for each possibility. 16 | 17 | For the purpose of this proposal, I'll call this `multiSets` and `multi`. 18 | 19 | It would have the same syntax as variants: 20 | 21 | ``` 22 | def Xform "Foo" ( 23 | prepend multiSets = "animation" 24 | ) { 25 | multiSet "animation" { 26 | "none" {...} 27 | "idle" {...} 28 | "walk" {...} 29 | } 30 | } 31 | ``` 32 | 33 | While this would be a significant change to USD, it has a few advantages: 34 | 35 | - It could also solve the case for other non-exclusive variant situations like levels of detail for game runtimes. 36 | - It could include an entire hierarchy within it, so animation clips can be defined in one place. 37 | 38 | ### Namespaced variants 39 | 40 | Another form that this could take is adding namespaces to individual fields. 41 | 42 | For example, the following defines a default, an idle, and a run animation. 43 | 44 | ``` 45 | double3 xformOp:translate.timeSamples = {...} 46 | double3 xformOp:translate:idle.timeSamples = {...} 47 | double3 xformOp:translate:run.timeSamples = {...} 48 | ``` 49 | 50 | This has the advantage of needing a very small change to USD.
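For illustration, a runtime consuming this encoding would have to discover the available animations by string-matching attribute namespaces, since nothing in the encoding marks them as animation data. A hypothetical C++ sketch (the layer name, prim path, and attribute prefix here are placeholders, not part of the proposal):

```cpp
// Hypothetical discovery of animation channels encoded as namespaced
// attributes such as xformOp:translate:idle and xformOp:translate:run.
#include <pxr/usd/usd/stage.h>
#include <pxr/usd/usd/prim.h>
#include <pxr/usd/usd/attribute.h>
#include <pxr/usd/sdf/path.h>

#include <map>
#include <string>
#include <vector>

PXR_NAMESPACE_USING_DIRECTIVE

int main()
{
    UsdStageRefPtr stage = UsdStage::Open("character.usda");
    UsdPrim prim = stage->GetPrimAtPath(SdfPath("/Foo"));

    // Map from animation name ("idle", "run", ...) to the sample times of its
    // translate channel. We can only guess which namespaces mean "animation".
    std::map<std::string, std::vector<double>> animations;
    const std::string prefix = "xformOp:translate:";
    for (const UsdAttribute& attr : prim.GetAttributes()) {
        const std::string name = attr.GetName().GetString();
        if (name.compare(0, prefix.size(), prefix) == 0) {
            std::vector<double> times;
            attr.GetTimeSamples(&times);
            animations[name.substr(prefix.size())] = times;
        }
    }
    return 0;
}
```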
51 | 52 | However, it has a couple of disadvantages: 53 | 54 | - It is hard to discern that the namespace is intended for animation and not another use. 55 | - It would be hard to gather up a clip across the entirety of a hierarchy. 56 | 57 | ### Value Clips 58 | 59 | Value clips and value clip sets are a great choice because they already implement the concept of multiple animations per USD hierarchy. 60 | 61 | 62 | However, I think the following changes would need to happen in USD for this to truly meet the goal for realtime use cases, or any case where the animation is being extracted into another runtime. 63 | 64 | 1. Value Clips require multiple layer files to function in their current state, or at least that appears to be the only documented setup. Ideally we'd allow pulling in prim hierarchies from within a single layer to allow for simplified pipelines where needed, since many users of USD might not have a studio pipeline and convention. 65 | 66 | 2. If we enable putting the value clip hierarchy within a single file, we need a way to tell renderers to ignore that hierarchy. That might be as simple as putting the clips under a Scope convention like "Animations", and then giving them a non-imageable type, or no type at all. Value Clips might therefore need to explicitly allow for type mismatches (which they might, but the documentation isn't clear). 67 | 68 | 3. USD doesn't seem to have any API to pull out just the properties and their values that vary. Ideally we should have some kind of API that returns a read-only view into what each individual clip would resolve into. 69 | 70 | ## Preference 71 | 72 | Based on the above, and further discussion with folks, I think modifying value clips would be the best general solution for the following reasons: 73 | 74 | 1. It doesn't require any new mental-model changes for USD itself, which is always nice when introducing people to all the ways something can work in USD. 75 | 76 | 2. The changes needed should hopefully be minimal if people agree that they're wanted. 77 | 78 | 3. The intent of a value clip is fairly clear, so there's no ambiguity. 79 | 80 | 81 | ## Skeletons 82 | 83 | Regardless of which approach is chosen, the changes needed for a general-purpose solution are quite a bit larger than those for a UsdSkel-only solution. 84 | 85 | I think it would be necessary to evaluate the work needed for the proposals above to decide whether it makes sense to do: 86 | 87 | 1. A separate solution for skeletons and general prims 88 | 2. A general solution only 89 | 90 | While a general-purpose solution would of course be nice for simplifying things down, it does present two problems: 91 | 92 | 1. Possibly a longer turnaround time 93 | 2. It is very hard to bound what can be animated, whereas SkelAnimation is very tidily bounded. 94 | 95 | 96 | ---- 97 | 98 | Hopefully this document is a good recap of the discussions thus far. -------------------------------------------------------------------------------- /proposals/Readme.md: -------------------------------------------------------------------------------- 1 | ## Structure 2 | 3 | Create new proposals here, in their own directories. 4 | 5 | A Readme.md file should exist in each directory, and may contain 6 | the entire proposal, or it may serve as an index to the rest of the materials 7 | within the proposal. 8 | 9 | ## All Proposals and PRs 10 | 11 | See the [OpenUSD Proposals Status](https://github.com/orgs/PixarAnimationStudios/projects/1/views/1) project page for a list of all proposals, proposal PRs, and status.
12 | 13 | ## Committed Proposals 14 | 15 | - [Self Assembling Model Hierarchy](https://github.com/PixarAnimationStudios/USD-proposals/tree/main/proposals/selfAssemblingModelHierarchy) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Aself-assembling-model-hieararchy+)) 16 | - [Pattern-based Collections](https://github.com/PixarAnimationStudios/USD-proposals/tree/main/proposals/pattern-based-collections) ![Status:Implemented, 23.11](https://img.shields.io/badge/Implemented,%2023.11-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr+label%3Ausd-pattern-based-collections+)) 17 | - [Spline Animation](https://github.com/PixarAnimationStudios/OpenUSD-proposals/tree/main/proposals/spline-animation) ![Status:Published](https://img.shields.io/badge/Published-green) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr+label%3Ausd-spline-animation+)) 18 | - [Stage Variable Expressions](https://github.com/PixarAnimationStudios/USD-proposals/tree/main/proposals/stage_variable_expressions) ![Status:Implemented, 23.11](https://img.shields.io/badge/Implemented,%2023.11-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr+label%3Ausd-stage-variables)) 19 | - [Unicode Identifiers in USD](https://github.com/PixarAnimationStudios/USD-proposals/tree/main/proposals/tf_utf8_identifiers) ![Status:Implemented, 24.03](https://img.shields.io/badge/Implemented,%2024.03-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Ausd-utf8-identifiers+)) 20 | - [Namespace Editing](https://github.com/PixarAnimationStudios/OpenUSD-proposals/blob/main/proposals/namespace_editing/README.md) ![Status:Published](https://img.shields.io/badge/Published-green) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Ausd-namespace-editing+)) 21 | - [Multiple Animation support](multiple-animations/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Ausd-multiple-animation-support+)) 22 | - [Image Plane Support](image-planes/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Ausd-image-planes-support+)) 23 | - [Line Style](LineStyle/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/USD-proposals/pulls?q=is%3Apr++label%3Ausd-line-style+)) 24 | - [Modular Scene Delegation and Manipulation in Hydra](https://github.com/PixarAnimationStudios/USD-proposals/tree/main/proposals/hydra2) ![Status:Implemented, 21.11](https://img.shields.io/badge/Implemented,%2021.11-blue) 25 | - [Validation Framework](usd-validation-framework/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Avalidation-framework)) 26 | - [Semantic Schema](semantic_schema/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Asemantic-schema)) 27 | - [Android 
Support](android/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Aandroid-support)) 28 | - [Text Support](text/README.md) ![Status:Published](https://img.shields.io/badge/Published-green) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Atext)) 29 | - [Pick API](PickAPI/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Ausd-pickapi)) 30 | - [Accessibility Schema](accessibility/README.md) ![Status:Published](https://img.shields.io/badge/Published-green) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Aaccessibility-schema )) 31 | - [OpenTimelineIO Time Series Data](usdotio/README.md) ![Status:Draft](https://img.shields.io/badge/Draft-blue) ([current PRs and issues](https://github.com/PixarAnimationStudios/OpenUSD-proposals/issues?q=label%3Ausdotio)) 32 | -------------------------------------------------------------------------------- /proposals/LineStyle/README.md: -------------------------------------------------------------------------------- 1 | # Introduction 2 | USD already has primitives to render 3D curve-like geometries, such as hair or grass. In practice, we also need curves in sketches or CAD documents. Such a curve has a uniform width, and its width will not change when we rotate or zoom the sketch. The curve may also have dash-dot patterns. We will provide a schema which can be applied to the curves primitive, so that the primitive becomes a uniform screen-width curve and can have dash-dot patterns or other types of patterns (such as wavelines). 3 | 4 | In this design, we don't consider any 3D-like curve styles, such as Blender's Grease Pencil or Tiltbrush. 5 | 6 | Here is a picture of common curve patterns. 7 | 8 | ![image of curve patterns](linePatterns.jpg) 9 | 10 | # Requirements 11 | 12 | ### Curve Width 13 | The curve width is a screen-space width. It will not change when we zoom in or zoom out. The curve width is uniform across the whole curve. 14 | 15 | ### Curve Caps 16 | The curve cap is the shape at the start or end of a curve or dash. There are different types of curve cap. The value can be different for the start and the end, but all the start caps in a curve should be the same, and all the end caps should be the same. 17 | The curve cap also determines the shape of the dot in a curve pattern. The start cap is the shape of the left half of the dot, and the end cap is the shape of the right half of the dot. 18 | 19 | | cap type | round | square | triangle | 20 | |:--------:|:---------:|:-----------:|:----------:| 21 | | figure |![round](roundcap.png)|![square](rectanglecap.png)|![triangles](triangleoutcap.png)| 22 | 23 | ### Curve Joint 24 | The curve joint is the shape at the joint of two curves, or at the joint of a polyline. The value is constant for the whole primitive. 25 | 26 | ![image of curve joint](roundJoint.png) 27 | 28 | ### Curve Pattern 29 | A dash-dot pattern is a composite of dashes and dots, and the composition is periodic. 30 | 31 | You can also define other types of patterns. 32 | 33 | # The implementation of DashDot curve style 34 | Our implementation only applies to a BasisCurves prim with "linear" type. That is, the curve is a list of line segments or a polyline.
The width of the line will be uniform, and it will not change when the camera changes. The curve can have no pattern, which is called the "sketch" style, or it can have a dash-dot pattern, which is called the "dashDot" style. 35 | 36 | To implement the curve style, we will add a style property to BasisCurves, and add special vertex and fragment shaders. We will also provide two different materials: one for the "sketch" style, and another for the "dashDot" style. 37 | 38 | ### Modification to the BasisCurves 39 | A new property is added to the BasisCurves schema: 40 | 41 | - style. A string uniform, which determines the type of the curve style. Currently its value can be "none", "sketch", "dashDot" or "screenSpaceDashDot". By default the value is "none", which means the curve doesn't have a style. The values "sketch", "dashDot" and "screenSpaceDashDot" are only valid when the type is "linear". If the value is "sketch", the width of the curve will not change when the camera changes, and there is no pattern. If the value is "dashDot", the curve has a dash-dot pattern, and the pattern is based on world units. If we zoom in, the pattern on the curve will not change in world space; the dash size and the dash gap size in screen space will become larger. If the value is "screenSpaceDashDot", the curve has a dash-dot pattern, and the pattern is based on screen units. If we zoom in, the pattern on the curve will change in world space, so that the dash size and the dash gap size in screen space will not change. 42 | 43 | ![image of screenSpacePattern](screenSpacePattern.png) 44 | 45 | If the curve style is "sketch", "dashDot" or "screenSpaceDashDot", the curve must be bound to a specific material. The properties of the style, such as the cap shape or the scale of the dash-dot pattern, are set in the material via the surface inputs. 46 | 47 | In the implementation, we also create special geometry when the curve style is "sketch", "dashDot" or "screenSpaceDashDot". Each line segment is converted to a rectangle composed of two triangles. 48 | The shader of the BasisCurves is also modified. We add two new sections of shader code: "Curves.Vertex.DashDot" and "Curves.Fragment.DashDot". If the curve style is "sketch", "dashDot" or "screenSpaceDashDot", the vertex shader must be "Curves.Vertex.DashDot" and the fragment shader must be "Curves.Fragment.DashDot". 49 | 50 | ### Material to support the dash dot style 51 | If the curve style is "sketch", the material must contain a LineSketchSurface shader. If the curve style is "dashDot" or "screenSpaceDashDot", the material must contain a DashDotSurface shader and a DashDotTexture shader. 52 | 53 | ### LineSketchSurface 54 | The shader decides the opacity of pixels around caps and joints. The materialTag for this material is translucent. 55 | 56 | This surface also has these special inputs: 57 | - startCapType. An int input. It can be 0, 1, or 2. 0 means the cap type is round. 1 means the cap type is square. 2 means the cap type is triangle. The default value is 0. 58 | - endCapType. An int input. It can be 0, 1, or 2. 0 means the cap type is round. 1 means the cap type is square. 2 means the cap type is triangle. The default value is 0. 59 | - jointType. An int input. Currently it can only be 0, which means the joint type is round. 60 | 61 | ### DashDotSurface 62 | The shader decides whether a pixel is within a dash or a gap, so that we can decide its opacity. It also handles the caps and joints.
61 | ### DashDotSurface 62 | This shader decides whether a pixel is within a dash or a gap, so that we can decide its opacity. It also handles the caps and joints. The materialTag for this material is translucent. 63 | 64 | The DashDotSurface must have a color input, which connects to another shader whose identifier is DashDotTexture. The DashDotTexture shader links to a texture that stores the information of the dash-dot pattern. 65 | 66 | This surface also has these special inputs: 67 | - startCapType. An int input. It can be 0, 1, or 2: 0 means the cap type is round, 1 means square, and 2 means triangle. The default value is 0. 68 | - endCapType. An int input. It can be 0, 1, or 2: 0 means the cap type is round, 1 means square, and 2 means triangle. The default value is 0. 69 | - jointType. An int input. Currently it can only be 0, which means the joint type is round. 70 | - patternScale. A float input. The default value is 1. You can lengthen or compress the curve pattern by setting this property. For example, if patternScale is set to 2, the length of each dash and each gap is doubled. This value does not affect the curve width. 71 | 72 | ### DashDotTexture 73 | The DashDotTexture shader is very similar to the UVTexture shader; the difference is that it outputs an RGBA value. The default value for wrap is clamp, and the default value for the min/max filter is "nearest". The shader must have a dash-dot texture input. 74 | 75 | ### The dash-dot texture 76 | The dash-dot texture is a texture that encodes a dash-dot pattern. Its four channels record whether a pixel is within a dash or a gap, and the start and end positions of the current dash. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # OpenUSD-proposals 2 | 3 | Welcome to OpenUSD-proposals, a forum for sharing and collaborating on proposals for the advancement of USD. 4 | 5 | Before getting started, please familiarize yourself with the contents of the [Supplemental Terms](https://openusd.org/release/contributing_supplemental.html) page. 6 | 7 | If you are interested in browsing existing proposals, please proceed right to [the current list of proposals, with status information](https://github.com/orgs/PixarAnimationStudios/projects/1/views/1). 8 | 9 | ## What is a proposal? 10 | 11 | - a new schema, such as "Level of Detail for Games" 12 | - an outline for a technical action, such as "Removing the usage of boost preprocessor macros" 13 | - a new development, such as "Evaluation of Hermite Patches" 14 | - a discussion of scope tightening, such as "Standardizing Alembic without HDF5" 15 | 16 | For inspiration, here are several proposals that have previously been worked through: https://openusd.org/release/wp.html 17 | 18 | ## Relationship to other forums 19 | 20 | Initial discussion and coordination of effort may occur in face-to-face meetings, the [AOUSD forum](https://forum.aousd.org/), the Academy Software Foundation's wg-usd Slack channels, and other venues. 21 | 22 | When a proposal has taken enough shape that it warrants detailed feedback and iteration, this repository exists as a place to work together on artifacts such as white papers, sample schema definitions, and so on. 23 | 24 | ## Process for a new Proposal 25 | 26 | ### Create a Pull Request for the proposal 27 | 28 | 1. Fork this repo. 29 | 2. Create a directory within `proposals/` for the proposal and its materials, and a README.md. 30 | 1.
The README.md document may contain the proposal, or at a minimum, it should announce the contents of the proposal and how to understand the materials within the proposal. 31 | 2. The README.md should also contain notes the author considers important for anyone looking at the proposal, which could include notes that a proposal has been superseded by another, that the proposal resulted in a change to another project, and so on. 32 | 3. Submit a PR and fill out the provided sections in the pull request body. 33 | 1. The PR may include links to supporting materials that could not be included with the proposal files, such as white papers. Add links to the "Supporting Materials" section of the PR body. 34 | 2. Please mention and link any issues and PRs in [OpenUSD](https://github.com/PixarAnimationStudios/OpenUSD) or [OpenUSD-proposals](https://github.com/PixarAnimationStudios/OpenUSD-proposals) that are related to the new proposal. A label will be created to link relevant discussions together. 35 | 36 | ### Guidance for Writing Proposals 37 | 38 | When writing and submitting a proposal, we make the following suggestions for receiving the best feedback. 39 | 40 | 1. Include a link to the rendered proposal in your PR description. This can be as simple as linking to the markdown file in your source branch. 41 | 42 | This makes it significantly easier for someone to read the document in a well formatted way. 43 | 2. Your PR and proposal should include, and ideally start with, a short summary of what your change would achieve. 44 | 45 | A possible structure could be: 46 | 1. **Summary** of what you're hoping to achieve 47 | 2. A **problem statement** explaining why what you are proposing is not currently possible 48 | 3. A **glossary of terms** that readers may not be familiar with. 49 | 4. If applicable, any links to **existing implementations or reference documents** that may be useful 50 | 5. **Details** about your proposal, such as why you are making certain choices 51 | 6. **Risks** that you anticipate (if any) 52 | 7. **Alternate solutions** that you have considered (if any), and why you didn't go with them 53 | 8. Any **excluded topics**, that you have left out and why. These may be things you want to handle separately in the 54 | future for example. 55 | 56 | 3. It is highly recommended that any submitted proposal include text examples of what your proposal would look like 57 | in `.usda` syntax. 58 | 4. It's recommended that long sentences be split over multiple lines in your Markdown file. 59 | This allows more granular feedback, and easier viewing of the files. 60 | 61 | #### Large Proposals 62 | 63 | Some proposals are inherently of significant size. In those case it is recommended to do one or both of the following: 64 | 65 | 1. Split your proposal into multiple smaller proposals. 66 | 1. Create a sub-proposal PR per major feature. 67 | 2. Create an umbrella proposal that links to each sub-proposal, ordered by their priority and dependency on each other. 68 | 2. Divide your proposal into smaller sections. These can be sections within the same document or separate documents. 69 | 70 | This helps make each section easier to digest and provide feedback on. 71 | 72 | ### Discuss the proposal 73 | 74 | A typical workflow for the proposal PR will have some initial discussion on the PR itself. Once some level of consensus has been reached on the proposal details, the PR may be landed. 
Iteration of the proposal may proceed via subsequent PRs, discussions in the corresponding issue, or using other tools available in github's interface. 75 | 76 | New issues and PRs related to the proposal should be linked to the initial proposal by [autolinking](https://docs.github.com/en/get-started/writing-on-github/working-with-advanced-formatting/autolinked-references-and-urls#issues-and-pull-requests) the original proposal PR (eg, #1234). 77 | 78 | At any point, proposal text may be used in other contexts. For example, a proposal may be referenced when writing new schemas or code for USD. Referencing a proposal in this way does not guarantee that the proposal will advance beyond a discussion stage. 79 | 80 | When a proposal has been approved as a starting point for implementation, that should be noted here with an updated proposal status (see section below). Subsequent development should occur in the appropriate forums. For example, if a proposal has been developed into code and concrete schemas, that might become a pull request against the main OpenUSD repo. Such a development should be noted in the proposal's README.md file and linked to the pull request in the main OpenUSD repo. 81 | 82 | ### Proposal Status 83 | 84 | There are five proposal statuses: 85 | 86 | - **To do** - This item hasn't been started 87 | - **Draft** - Proposal is work-in-progress, but open for feedback and reviews 88 | - **Published** - Proposal is approved to use as a starting point for implementation 89 | - **Implemented** - Proposal has been implemented 90 | - **Hold** - Proposal is not being worked on 91 | 92 | You can monitor your proposal status using the [OpenUSD Proposals Status page](https://github.com/orgs/PixarAnimationStudios/projects/1/views/2) 93 | 94 | New PRs are automatically given the **To do** status. This indicates the proposal is still in an early draft that is not yet ready for discussion/comments. 95 | 96 | When the proposal is changed to the **Draft** status, this indicates the proposal is ready to be reviewed and discussed; however it is still work-in-progress and may continue to be updated. 97 | 98 | Once a proposal is complete and fully reviewed, it will be merged and can be moved to **Published** status. This indicates the proposal can be used as a starting point for implementation. Any changes to the proposal at this point would need to be filed as a new PR. 99 | 100 | Once implementation work has been completed, the proposal will be moved to **Implemented** status, with indication of which version of USD the proposal was implemented in. 101 | 102 | ## Code of Conduct 103 | 104 | The success of the forum is predicated on involvement and communication, so consider this an appeal to everyone's creativity and thoughtful consideration. 105 | 106 | Civility, inclusiveness, friendship, and constructive collaboration shall be the hallmarks of this forum. 107 | 108 | Thank you for your participation! 109 | -------------------------------------------------------------------------------- /proposals/multiple-animations/usdskel.md: -------------------------------------------------------------------------------- 1 | # Multiple Animation in USDSkel 2 | 3 | This proposal outlines a way to add the ability for a Usd Skeleton to have multiple animation sources associated with it. 4 | 5 | A follow up proposal will describe a possible way to add multiple animations to regular prims. I've chosen to split them out so as to focus discussion. 
6 | 7 | This would be useful for runtimes like game engines or crowd simulation software where the ability to have access to multiple animations at once are useful so that the runtime can activate and blend between them as needed based on application specific context. 8 | 9 | It would also help bring better parity to the takes system in FBX, which has facilities for specifying multiple animations per object. 10 | 11 | Additionally, there is a desire to have no default animation as well, to speed up time to first load, since animation processing can be elided if the user isn’t using animations till a later time. e.g in Unreal, you may place a static actor for layout purposes, but only apply animation during runtime 12 | 13 | 14 | ## Why not Variants? 15 | 16 | Variants have a runtime cost associated with switching, and only expose a single variant at a time on the composed stage. 17 | Meanwhile, runtimes like game engines require access to the full range of options available and also switch often, such that recomposition is too heavy a cost to pay. 18 | 19 | ## UsdSkel Schema Change Proposal 20 | 21 | USD Skeletons are unique in that they store animation externally to the prim itself. They are also the most likely candidate in a real time pipeline to have multiple animations, and game studios often will treat even basic transforms as a single joint skinned object. 22 | 23 | While the skeleton proposal wouldn’t allow generalization over non-skeletal animation such as transforming objects, it would at least cover the most common use cases requested in a way that should be fairly easy to provide quickly. 24 | This would also allow easier sharing of animations between multiple characters that share a skeleton. 25 | 26 | A future proposal would describe how we may add multiple animations to generic prims, and how that could play well with this Skeleton proposal. 27 | 28 | ### Proposal 1 29 | 30 | In this version, we allow `skel:animationSource` to be a list, much like `blendShapeTargets` is. Since rels are allowed to be lists, this doesn’t change the schema. 31 | 32 | The API’s for the schema would need to be modified, with an overload for a Vec, and modify the getter/setter to take the first item in the list as the default active animation. 33 | 34 | Since this is not a schema change, the API wouldn’t need to be versioned and existing files would continue to work as is. 35 | However files authored with this API may not be compatible with DCCs that do not assume a list. 36 | 37 | With this proposal, you’d need a secondary attribute to control whether animations get applied by default. This would default to True to maintain current behaviour, but could be flipped to signal the intent that animations shouldn’t be applied by default 38 | 39 | ``` 40 | bool skel:applyAnimation ( 41 | customData = { 42 | string apiName = “applyAnimation” 43 | } 44 | doc = “”“Controls whether animations are meant to be applied by default or not. If set to False, the static pose should be used”“” 45 | ) 46 | ``` 47 | 48 | ### Proposal 2 49 | 50 | In this second proposal, we'd add a new property. A new `skel:animationLibrary` type would need to be added to `SkelBindingAPI` like below. It would be a list of sources while leaving the current animationSource as the active selection 51 | 52 | ``` 53 | rel skel:animationLibrary ( 54 | customData = { 55 | string apiName = "animationLibrary" 56 | } 57 | doc = """An ordered list of skeletal animations that can be used for this skeleton. 
58 | """ 59 | ) 60 | ``` 61 | 62 | In the interest of backwards compatibility, I propose that this live alongside the existing `skel:animationSource`. The singular one would be used as the default in places like usdview or runtimes that don’t support multiple clips. 63 | 64 | If the singular `animationSource` is not specified, then no animation should be applied by default, falling back to the rest pose. This would maintain the current behaviour, and give the option to elide application of animations when undesired. 65 | 66 | This should avoid the need to version the schema, and the Setter/Getter functions could handle this abstraction for developers. 67 | 68 | 69 | ### Proposal 3 70 | 71 | In the third Proposal, we simply define conventions based on [the general proposal](general.md). 72 | 73 | Each of the general proposals can simply vary the existing `skel:animationSource`. 74 | 75 | See the other file for concerns about bounding and storage. 76 | 77 | 78 | ## Naming 79 | 80 | One other nicety would be naming of the animations. 81 | It is unclear what the best way to do this would be. 82 | 83 | 84 | ### Names as Attributes on SkelBindingAPI 85 | 86 | The names could be a matched index array on the `SkelBindingAPI` that correspond to each rel. 87 | This would allow for the names to not require a stage traversal, and allow each prim to have unique names if necessary. 88 | It would also allow using names elsewhere in the SkelBindingAPI if needed. 89 | 90 | On the downside, this would duplicate the names 91 | 92 | 93 | ### Names on the SkelAnimation 94 | 95 | Since the SkelAnimation is independent already, the name could simply be derived from the Prim name or displayName of the SkelAnimation. 96 | 97 | Alternatively SkelAnimation could have a new TfToken/String attribute for name. 98 | This way every use of the SkelAnimation gets the same name, and no data duplication occurs. 99 | 100 | However fetching the names would require a stage traversal. 101 | 102 | 103 | ## Extents 104 | 105 | Currently there is no way to specify how the `SkelAnimation` would affect the bounding box of an animated character. This causes issues where clipping may occur based on the authored extents of the static mesh. 106 | 107 | Many authoring systems either skip authoring extents for Skeletal animations, or author it based on the single animation source. This varies by runtime, where game engines may ignore the authored extents, while crowd simulation software may make use of it when not using dynamic animation. 108 | 109 | As I do not work in an environment where extents for skeletons are key for their use, this section is a strawman argument but I believe it’s important to bring up since others may have such a need. 110 | 111 | If we were to add multiple animation sources, it might also be beneficial to allow multiple bounding boxes to be associated with the Skeleton. 112 | 113 | This could not exist on the SkelAnimation since the Animation may be shared by multiple characters, each with different extents. Take for example, crowd characters that share the skeleton and animations but have different costume attachments. 
114 | 115 | As such it could be valuable to introduce a new schema type like so that could be calculated ahead of time for each bound animationSource: 116 | 117 | ``` 118 | class SkelExtents “SkelAnimationExtents” ( 119 | inherits = 120 | doc = “”“Describes the extents of a given skeleton character when used with a given animation.” 121 | customData = { 122 | string className = “AnimationExtents” 123 | } 124 | ) 125 | ``` 126 | 127 | 128 | The SkelBindingAPI could then have another rel attribute like so 129 | 130 | ``` 131 | rel skel:animationExtents ( 132 | customData = { 133 | string apiName = “animationExtents” 134 | } 135 | doc = “”“An ordered list of extents that maps to the order of the animationSources”“” 136 | ) 137 | ``` 138 | 139 | The ordering of the extents would map to the ordering of the animationSources. 140 | 141 | 142 | ## Support for multiple animations on generic prims 143 | 144 | I think there's also a lot to be gained by having multiple animation support for generic prims, especially Xform based prims. 145 | However in the interest of keeping this proposal scoped smaller, I'll put up a separate proposal for that. 146 | 147 | I expect a general purpose solution to be a much larger undertaking, and I think we'd have maximal, immediate ROI by adding support for skeletal use cases. 148 | 149 | Once that's up, I'll update this proposal with a link and more details on how they could work in concert. -------------------------------------------------------------------------------- /proposals/materialx-versioning/README.md: -------------------------------------------------------------------------------- 1 | # MaterialX Versioning in OpenUSD 2 | 3 | ## Contents 4 | - [TL;DR](#tldr) 5 | - [Introduction](#introduction) 6 | - [Problem](#problem) 7 | - [Proposal to Version MaterialX data in OpenUSD](#proposal-to-version-materialx-data-in-openusd) 8 | - [Overview](#overview) 9 | - [UsdMtlx Implementation](#usdmtlx-implementation-changes) 10 | - [HdMtlx Implementation](#hdmtlx-implementation-changes) 11 | - [Risks](#risks) 12 | - [Risk 1](#risk_1) 13 | - [Out of Scope](#out-of-scope) 14 | - [Questions](#questions) 15 | 16 | ## TL;DR 17 | We propose adding attributes to UsdShade `Material` primitives to record the MaterialX library version they were authored 18 | with. This will allow for MaterialX to provide an automatic upgrade functionality to the data stored in OpenUSD. 19 | 20 | The following is an example of a `Material` primitive with the applied schema applied. 21 | ``` 22 | def "Material" MyMaterial ( 23 | prepend apiSchemas = ["MaterialXInfoAPI"] 24 | ) 25 | { 26 | uniform string mtlxinfo:version = "1.39" 27 | } 28 | ``` 29 | 30 | ## Introduction 31 | From its inception, the MaterialX project has considered MaterialX documents to be an archival format. What goes along 32 | with this is an implied promise that opening a MaterialX file from 10+ years ago should generate a material that will 33 | render a qualitatively similar image. This promise is honored by the project in the form of an automatic system that 34 | upgrades data as documents are loaded to be compatible with the current MaterialX library. 35 | 36 | The MaterialX library is a collection of XML files that describe a standard set of node definitions, type definitions and 37 | other components provided by MaterialX. The Material library version is a `major.minor` version number (currently 1.38) 38 | and allows that set of definitions to be versioned. 
It is not the version of the MaterialX project itself, which also 39 | includes a `patch` version number, but the MaterialX library version does track in sync with the MaterialX project version. 40 | 41 | The MaterialX integration within OpenUSD relies upon two libraries, `UsdMtlx` and `HdMtlx`. `UsdMtlx` transcodes the 42 | MaterialX document into `UsdShade` primitives. Inside of Hydra, `HdMtlx` is used to reconstruct the 43 | MaterialX document using the `UsdShade` primitives that have been transported through Hydra (more information 44 | available [here](https://openusd.org/release/api/_page__material_x__in__hydra__u_s_d.html)). The reconstructed MaterialX 45 | document is then passed to the MaterialX Shader Generation library to generate shader code in the desired destination 46 | shading languages. 47 | 48 | ## Problem 49 | For the entire lifetime of MaterialX support within OpenUSD, there has never been a MaterialX library version change. The 50 | MaterialX library version has always been v1.38. The MaterialX project would like to be able to update the MaterialX 51 | library version without invalidating MaterialX data currently stored in OpenUSD files. Without access to the MaterialX 52 | library version, the upgrade code path inside the MaterialX project cannot be used. 53 | 54 | Without some sort of remedy, either MaterialX will be constrained to stay at version 1.38, or OpenUSD will become a 55 | container for ambiguous MaterialX data. 56 | 57 | ## Proposal to Version MaterialX data in OpenUSD 58 | ### Overview 59 | We propose adding an applied API schema `MaterialXInfoAPI` (in the spirit of the 60 | ["Revise use of Layer Metadata in USD"](https://github.com/PixarAnimationStudios/OpenUSD-proposals/blob/main/proposals/revise_use_of_layer_metadata/README.md)) that can record the MaterialX library version. We keep the name of the 61 | applied API schema a little more generalized to leave the door open for additional MaterialX document information to be 62 | added to the schema at a later date. We deliberately propose adding only the MaterialX library version to the 63 | schema, to keep the proposal as uncontroversial as possible and to aid rapid adoption. 64 | 65 | The `MaterialXInfoAPI` will be applied to the UsdShade `Material` primitive. This allows for referencing the material 66 | directly from a `.mtlx` file while still retaining the MaterialX library version when the scene is composed, e.g.: 67 | ``` 68 | def Material "MyMaterial" ( 69 | references = @myMaterials.mtlx@ 70 | ) 71 | {} 72 | ``` 73 | 74 | Any UsdShade `Material` primitives that do not have the applied API would be assumed to be version 1.38, thus providing a 75 | path forward to support existing OpenUSD files that are not using this applied API schema. 76 | 77 | With the MaterialX library version recorded in the OpenUSD stage, `HdMtlx` would now have access to an appropriate 78 | version number to use when reconstructing the MaterialX document. This will then allow the existing MaterialX upgrade 79 | functionality to be used. 80 | 81 | ### UsdMtlx Implementation changes 82 | 83 | The only change required in `UsdMtlx` would be to apply the applied API schema to the `Material` primitive, and set the 84 | MaterialX library version. This version number would come directly from the `.mtlx` file that was being transcoded to 85 | OpenUSD. 86 | 87 | ### HdMtlx Implementation changes
88 | 89 | The `HdMtlx` library will also need to be changed to read the MaterialX library version from the `Material` primitive, 90 | and author this version number in the reconstructed MaterialX document. 91 | 92 | If the version is not present it will be assumed to be v1.38. 93 | 94 | ## Out of Scope 95 | We do not propose any sort of validation regarding the version other than its presence, and then falling back 96 | to the default if missing. 97 | 98 | Specifically we do not propose attempting to validate that all opinions that make up a composed 99 | `Material` primitive were authored with the same MaterialX version, even though this composition could lead to an 100 | invalid MaterialX material, for a number of reasons. 101 | * The expense of the validation, in some circumstances, could be too high to consider. 102 | * It is possible that a `Material` primitive may still contain a valid MaterialX material even if composed of differing 103 | MaterialX library versions. 104 | * Reporting the error during the evaluation of Hydra may be later in a pipeline than desired. 105 | * Some implementations might not even use the `HdMtlx` library to extract the MaterialX material. 106 | 107 | Instead, we propose to leverage another OpenUSD proposal, the [USD Validation Framework](https://github.com/PixarAnimationStudios/OpenUSD-proposals/tree/main/proposals/usd-validation-framework). Adding a `UsdValidator` to 108 | inspect all layers that contribute opinions to a UsdShade `Material` primitive, and reporting an error if opinions are 109 | found that were authored with conflicting MaterialX library versions. 110 | 111 | This proposal also does not attempt to update in-place any OpenUSD data, or provide functionality to do so. The complex 112 | nature of scene composition makes this a difficult problem to tackle, and while this problem could possibly be tackled 113 | in the future, we think it prudent to keep the scope of this problem tractable to expedite forward progress. 114 | 115 | We are only proposing to add version information to OpenUSD regarding the MaterialX library, to facilitate using the upgrade 116 | mechanisms in MaterialX that already exist. This places the responsibility of the actual upgrade on the MaterialX project. 117 | MaterialX currently does not provide any promise of forward compatibility. This proposal does not seek to add this, or 118 | any sort of new compatibility or upgrade functionality to MaterialX. 119 | 120 | ## Other solutions considered 121 | 122 | Initial conversations around this topic started with discussions of adding layer metadata to the layers containing 123 | MaterialX information, but in light of the recent proposal to move away from layer metadata the direction of the 124 | conversation was changed. 125 | 126 | We also considered authoring the MaterialX library version on each UsdShade `Shader` primitive, in the hope that it 127 | would allow more concrete support, or at least discovery, of mismatched MaterialX library versions created during USD 128 | composition. This idea was discarded on the grounds that it would be both noisy in the USD data, and also potentially 129 | error prone if being authored by hand. A single applied API schema for the entire `Material` primitive seemed like a 130 | better compromise. 
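As a concrete illustration of the `UsdMtlx` change described above, here is a minimal Python sketch, assuming the proposed `MaterialXInfoAPI` schema and `mtlxinfo:version` attribute. Neither exists in OpenUSD today, so the sketch applies the schema by name rather than through a generated schema class, and the names may change as the proposal evolves.

```python
from pxr import Sdf, Usd

# A minimal sketch: author the proposed (hypothetical) MaterialXInfoAPI schema
# and mtlxinfo:version attribute on a Material prim.
stage = Usd.Stage.CreateInMemory()
material = stage.DefinePrim("/MyMaterial", "Material")

# Apply the proposed API schema by name, since no generated schema class exists yet.
material.AddAppliedSchema("MaterialXInfoAPI")

# Record the MaterialX library version the source .mtlx document was authored with.
versionAttr = material.CreateAttribute(
    "mtlxinfo:version", Sdf.ValueTypeNames.String, False, Sdf.VariabilityUniform)
versionAttr.Set("1.39")

print(stage.GetRootLayer().ExportToString())
```

On the Hydra side, `HdMtlx` would read this attribute when reconstructing the MaterialX document and fall back to v1.38 when it is absent, as described above.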
131 | -------------------------------------------------------------------------------- /proposals/image-planes/README.md: -------------------------------------------------------------------------------- 1 | ![Status:Draft](https://img.shields.io/badge/Draft-blue) 2 | ## Image Plane Support for USD 3 | 4 | ## Contents 5 | - [Introduction](#introduction) 6 | - [High Level Anatomy of an Image Plane](#high-level-anatomy-of-an-image-plane) 7 | - [Pinhole Camera Diagram](#pinhole-camera-diagram) 8 | - [UsdGeomImagePlaneAPI Schema](#usdgeomimageplaneapi-schema) 9 | - [Properties](#properties) 10 | - [API Methods](#api-methods) 11 | - [Hydra Implementation](#hydra-implementation) 12 | - [Alternative Implementation Strategies](#alternative-implementation-strategies) 13 | - [Use Cases](#use-cases) 14 | - [Open Questions](#open-questions) 15 | 16 | ## Introduction 17 | This proposal aims to add native support for Image Planes to USD. An Image Plane is a camera constrained image that 18 | can be seen in "screen space" when looking through that camera. 19 | 20 | The majority of work in VFX is centered around augmenting existing footage, so it is important to view animation in 21 | the context of that footage (it doesn't matter if the animation looks good if it doesn't look integrated). Without 22 | this ability, the utility of tools like usdview is quite restricted and requires compositing before work can be 23 | evaluated in context. 24 | 25 | ## High Level Anatomy of an Image Plane 26 | Conceptually, an Image plane must provide the following information: 27 | - associated camera 28 | - image (can be animated per frame) 29 | - fit to filmback (how image is positioned in relation to cameras filmback) 30 | - visibility (whether image plane is visible or not, because it should be hide-able for different workflows) 31 | 32 | And could potentially also provide: 33 | - depth (if an image plane is in space, this is how far away from pinhole camera this plane should exist in space) 34 | - "image attributes" like alpha gain, preview lut, etc. 35 | 36 | ## Pinhole Camera Diagram 37 | This proposal will follow the same nomenclature from the Cooke site. Here is a reproduced CG Camera diagram. 38 | (from https://cookeoptics.com/i-technology/ > "Cooke Camera Lens Definitions for VFX" > CG Camera diagram 2.1.) 39 | ![](pinhole-camera-diagram.png) 40 | 41 | ## UsdGeomImagePlaneAPI Schema 42 | The UsdGeomImagePlaneAPI schema defines basic properties for rendering an image plane. The UsdGeomImagePlaneAPI 43 | schema derives from UsdSchemaBase and will be a Multi-Apply Schema that would be restricted to UsdGeomCamera prim 44 | types. 45 | 46 | ### Properties 47 | - asset image = @@ : *Path to image file* 48 | - can be time sampled per frame 49 | - This will have to be per frame, as USD/Hydra do not at this time support a $F substitution for image/texture asset paths. [See below](#out-of-scope-f-frame-token) 50 | - double depth = 0 : *Distance, in scene units, from the pinhole to the point on the optical axis where the image plane sits l0 in above* 51 | pinhole camera diagram 52 | - alternative name: "distance" (Autodesk Maya used "depth") 53 | - A nice default could be -1 or "infinite" so that it would always be behind CG. That might be hard to coexist with a physically placed in scene camera depth that exists in front of and behind cg elements. 
54 | - bool enabled = True : *Control image plane visibility.* 55 | - regardless of this value, if the (camera) prim itself is invisible, the image plane will not be visible in the viewport/render 56 | - token visibility ["all views", "camera only"] : *Whether this viewport is visible when the scene is viewed through the same camera prim* 57 | - the "none" case could be handled by "enabled" property 58 | - array of float[2] placement = [(0,0), (1,0), (1,1), (0,1)] : *Coordinates relative to the camera film back for each corner of the image plane's image* 59 | - float rotation = 0.0 : *Refine the placement of an image plane by rotating along the optical axis.* 60 | - Applied after placement coordinates 61 | 62 | ### Suggestions for Metadata 63 | The following won't be part of the schema, but would be informal properties. 64 | - Application specific data for round tripping such as ("maya:fit" = "horizontal") 65 | 66 | ### Why a Multi-Apply API Schema? 67 | If we implement a Multi-Apply API Schema for Image Planes we can: 68 | - author attributes directly to camera 69 | - author multiple image planes to camera, since multi-apply schemas can be applied several times to the same prim 70 | multiple image planes could be used to view CG elements in context with rotoscoped foreground elements as well as background plates 71 | 72 | #### API Methods 73 | - CreateImagePlane(imagePlaneName) 74 | - SetImagePlane*Attr(imagePlaneName, value) 75 | - GetImagePlane*Attr(imagePlaneName, attribute) 76 | - SetImagePlanes([]) 77 | - GetImagePlanes() 78 | 79 | #### A Minimum Usage Example: 80 | ``` 81 | camera = stage.GetPrimAtPath('/world/cam') 82 | imagePlaneApi = UsdGeomImagePlaneAPI(camera) 83 | imagePlaneApi.CreateImagePlane("imagePlane1") 84 | framesToImages = [(f, fileSequence.frameAt(i) for f in fileSequence.frames()] 85 | imagePlaneApi.SetImagePlaneImageAttr(framesToImages, "imagePlane1") 86 | ``` 87 | ##### Which would generate this .usda: 88 | ``` 89 | def Xform "world" { 90 | def Camera "cam" ( 91 | apiSchemas = ["UsdGeomImagePlaneAPI:imagePlane1"] 92 | ){ 93 | ... 94 | string[] imagePlanes = ["imagePlane1"] 95 | asset imagePlane1:image = { 96 | 1001: @/path/to/image.1001.exr@, 97 | 1002:... 98 | } 99 | float imagePlane1:depth = 1000.0 100 | } 101 | } 102 | ``` 103 | 104 | ## Hydra Implementation 105 | TODO: Flesh out and discuss with Hydra team. 106 | 107 | Ideas: 108 | - Use UsdPreviewSurface textures in a way similar to GeomModel texture cards. 109 | - Do the compositing in an Hdx post task 110 | 111 | ## Use Cases 112 | ### Hydra / Usdview 113 | It would be helpful to be able to view CG elements against a backdrop of the production plate when checking exported usd caches. 114 | 115 | ### DCC Interchange 116 | Having a "UsdLux" style standard for Image Planes would be very useful when implementing importers and exporters for DCCs like Maya, Blender, Nuke, etc. 117 | 118 | ### Photogrammetry 119 | Reasonably specific example: 120 | Load a mesh and array of cameras - provided by a company like Lidar Lounge - into Maya. 121 | Clean up / simplify the mesh and export as USD. 122 | Import into Mari, so as the cameras become projectors; with each having the path to an image plane's image, which can be projected onto the mesh. 123 | 124 | ## Alternative Implementation Strategies 125 | ### 1.) As a concrete prim type 126 | Here is a working implementation that was done before Multi-Apply schemas were added to USD. 
127 | 128 | It's a fully featured solution that is based around Autodesk Maya's "underworld" image plane nodes where many planes 129 | can live under a camera. This specific implementation is just a quick attempt to move maya image planes across packages 130 | and the hydra implementation is just a workaround that generates an HdMesh rprim (instead of creating a new image plane 131 | prim type). 132 | 133 | Reference internal PR from Luma: https://github.com/LumaPictures/USD/pull/1 134 | 135 | #### Pros: 136 | - Prim attributes like [visibility, purpose] can be leveraged to provide intuitive functionality 137 | - Easy to see Image Planes in a hierarchy 138 | #### Shortfalls: 139 | - Requires relationships between camera + plane prims 140 | - Requires an extra API schema to encode the relationship on the Camera. 141 | - There's the possibility that someone targets something other than an ImagePlane with the relationship, so there are more ways to "get stuff wrong". 142 | 143 | ### 2.) Using existing USD types (Plane + material + camera + expressions) 144 | - It is possible to create a UsdPlane with a material and accomplish the same thing as an image plane. 145 | #### Pros: 146 | - No need for this proposal :) 147 | #### Shortfalls: 148 | - Requires users to implement complex authoring code. 149 | - No explicit identity as an "Image Plane". This will make it hard to successfully roundtrip from DCCs to USD. 150 | 151 | ## Appendix: "Fit" Strategy Discussion 152 | One of the key decisions for this schema is to come up with a universally acceptable way to describe how the 153 | image plane will be positioned in relation to the camera film back. We want something that will cover all use cases 154 | without being to burdensome to describe. 155 | 156 | In our discussions, we determined fit should be relative to the film back as opposed to in scene space or render space. 157 | The exact form this takes is still up for discussion... 158 | 159 | #### 1. A "fit" token 160 | Applications like Autodesk Maya use a fit heuristic to make it easy for artists to describe how to position the image 161 | on the film back. 162 | "horizontal" - fit image width to filmback and keep image aspect ratio 163 | "vertical"- fit image height to filmback and keep image aspect ratio 164 | "fill" - stretched to filmback 165 | "to size" - constant size, centered on filmback, and requiring more data to define "image size" and then further 166 | describe positioning 167 | 168 | #### 2. Corner coordinates 169 | An array of image corner coordinates that will allow any manipulations of scale and shearing to be done 170 | on the image. We add a rotate parameter, but we could also split out scale and keystone if we want to handle 171 | those specifically. 172 | 173 | #### 3. Position + Rotation + Scale 174 | Fit could also be described with: 175 | - float[2] position = (0,0) *Lower left coordinate of film back* 176 | - float[3] rotation = (0, 0, 0) *Rotate the image plane in any xyz direction* 177 | - float[2] scale = (1.0, 1.0) *Scale image in x and y, these would probably be relative to film back size* 178 | 179 | ## Out of Scope: $F Frame token proposal 180 | Image planes in USD beg the question of movies and image planes in USD to avoid repetitively described file paths for 181 | frame samples. 
Look for that in a separate proposal 182 | -------------------------------------------------------------------------------- /proposals/semantic_schema/README.md: -------------------------------------------------------------------------------- 1 | # Semantic data in USD 2 | 3 | Copyright © 2022-2024, NVIDIA Corporation, version 1.0 4 | 5 | 6 | Dennis Lynch 7 | Eric Cameracci 8 | 9 | 10 | ## Overview 11 | 12 | USD is quickly becoming the standard for describing 3D objects to be used for synthetic data generation, but it currently does not have a standard for describing and storing object semantics. 13 | 14 | Semantic data is vitally important to synthetic data generation and AI-driven systems. These labels provide meaningful information about data, making it easier for machines and algorithms to understand and process the content. Semantic labels are essential in supervised machine learning tasks, where a model is trained on labeled data to make predictions on new, unseen data. Having accurate semantic labels on 3D objects at the pixel-level gives rendered synthetic data an advantage over traditional human-labeled real-world captured data. 15 | 16 | Semantically labelled data also enables new ways to search-for and find assets based on semantic attributes, without needing to know about asset paths, names, or directory structures. Semantic data can also provide additional information about an asset's "state", makeup, or other meaningful information. 17 | 18 | While it is possible to add custom schema and metadata to USD prims, it would be beneficial for the community using semantic data to agree upon a basic standard for the handling of semantic data within the official USD specification to promote interoperability and discoverability as more applications and content are created for the purpose of generating synthetic data or AI tools utilizing semantic information. 19 | 20 | ## Schema proposal 21 | 22 | The schema can be found [here](schema.usda) 23 | 24 | ``` 25 | #usda 1.0 26 | ( 27 | subLayers = [ 28 | @usd/schema.usda@ 29 | ] 30 | ) 31 | 32 | over "GLOBAL" ( 33 | customData = { 34 | string libraryName = "usdSemantics" 35 | string libraryPath = "./" 36 | } 37 | ) { 38 | } 39 | 40 | class "SemanticsAPI" ( 41 | inherits = 42 | 43 | customData = { 44 | token apiSchemaType = "multipleApply" 45 | token propertyNamespacePrefix = "semantics" 46 | } 47 | ) 48 | { 49 | token[] labels ( 50 | doc = "List of semantic labels" 51 | ) 52 | } 53 | ``` 54 | 55 | ## Key Considerations 56 | 57 | With semantic labels, we want to move away from naming conventions or parsing prim paths/names to extract information. 58 | An object could be `/kitchen/accessories/cups/wine_glass_D` or 59 | `/props/glasses/french/long_stemmed_wine_glass_83` but both would have the same or similar _semantic_ labels. 60 | 61 | * Needs to store an arbitrary number of labels, as "meaning" grows among users 62 | * Especially true for labels to be applied in the user's native language 63 | * Needs to be multiple-apply, as semantic meaning greatly depends on context. Examples: 64 | * What _is_ an object? A wine glass. `class:wine_glass` 65 | * What is it _made of_? Glass. `material:glass` 66 | * What is it _for_? Drinking. `purpose:drinking` 67 | * Where can it be located? Cabinets, tables, hands, etc. `location:[interior, kitchen, held, table]` 68 | * Is the object, or its style, region specific? `locale:[Europe, France]` 69 | * Needs to be relatively lightweight for performance. 
70 | * A single asset may require multiple taxonomies for multiple training paradigms 71 | * Semantic labels must be interpretable on `render products` like images for computer vision pipelines that may not have access to the OpenUSD API. 72 | * Needs to be able to vary over time to encode information like "state": 73 | * A door is always a door, but it could be `open` or `closed` 74 | * Electronics could be `on`/`off` 75 | 76 | A Multiple-Apply API Schema works well for defining semantics about a prim: it allows us to query the data through `HasAPI` instead of having to parse informal ways of storing information on a prim such as `metadata` or `userProperties`, and it will differentiate assets that do and do not have semantic labels. 77 | 78 | 79 | - `assetInfo` - More suited to information about the digital asset, not its semantic meaning(s) 80 | 81 | - `userProperties` - Too informal; we do not want a dumping-ground for lots of data 82 | 83 | - `Metadata` - Too informal; we do not want a dumping-ground for lots of data. Cannot vary over time, yet the semantic "meaning" of something could change over time. 84 | 85 | _Example_: At first an object is a wine glass for drinking, but after dropping it from a physics simulation it is now shards of glass - not for drinking. 86 | 87 | - `Kind` - Must be pre-defined. Cannot vary over time. Better suited for filtering or interacting with objects on the Stage, not for rendering or label data. 88 | 89 | ### Why a list? 90 | 91 | What an object "is" becomes very tricky given the nature of human language. Is the object an `automobile`, a `vehicle`, or the more informal `car`? 92 | This is especially true when considering multiple languages: `car`, `Auto`, `voiture`, `bil`, `coche`, etc. 93 | 94 | Objects can also have more detailed semantics depending on their context. 95 | 96 | Examples: 97 | - `['vehicle', 'emergency vehicle', 'ambulance']` 98 | - `['chair', 'office furniture', 'executive chair']` 99 | 100 | By having a list, semantics on a prim can be additive, increasing the understanding of what the object is for more people and use cases, while allowing filtering for the terms that are applicable to the user. 101 | 102 | 103 | ### Encoding semantic "type" 104 | 105 | Within the machine learning community, 'class' is a very common moniker for defining a category for semantic labels, but there could be more "types" of semantic labels. An object's `class` could be `vehicle`, its `sub-class` `ambulance`, and its material semantics `metal` or `steel` for certain parts, etc. 106 | 107 | For this we took inspiration from existing USD Schemas such as `CoordSysAPI` and light linking that encode meaning in the instance name of the schema object.
108 | 109 | Preferring: 110 | 111 | ``` 112 | token[] semantics:class:labels = ["animal", "bird", "penguin"] 113 | ``` 114 | 115 | instead of multiple instances on the same prim just to contain all of the information for all possible `class` labels: 116 | 117 | ``` 118 | token semantics:Semantics_aDa0:semanticType = "class" 119 | token semantics:Semantics_aDa0:semanticLabel = "animal" 120 | 121 | token semantics:Semantics_d282:semanticType = "class" 122 | token semantics:Semantics_d282:semanticLabel = "bird" 123 | 124 | token semantics:Semantics_c0nr:semanticType = "class" 125 | token semantics:Semantics_c0nr:semanticLabel = "penguin" 126 | ``` 127 | 128 | or requiring string parsing operations: 129 | 130 | ``` 131 | string[] semantics:instance_name:labels = ["class:animal", "class:bird", "class:penguin"] 132 | ``` 133 | 134 | The proposed method has better performance as the number of labels of the same type on a prim grows to dozens, hundreds, or thousands of labels. 135 | 136 | ### Multiple Taxonomies 137 | 138 | The combination of a MultipleApply schema and using prim instance names as "special" identifiers allows semantic information to be stored in USD from multiple sources of taxonomy, while still retaining additional existing taxonomy. 139 | 140 | Example of a "police car" prim's semantics from different taxonomy sources: 141 | ``` 142 | omniverse-simready: ['emergency_vehicle'] 143 | 144 | # COCO has no special taxonomy for police vehicles so "car" might be most appropriate 145 | ms-coco-stuff: ['car'] 146 | 147 | kitti: ['vehicle-other'] 148 | 149 | ImageNet_classID: ['734'] 150 | 151 | ImageNet_className: ['police van', 'police wagon', 'paddy wagon', 'patrol wagon', 'wagon', 'black Maria'] 152 | ``` 153 | Here, a user would "query" for semantic prim data of only the specific taxonomy that is relevant to their needs (e.g. `ImageNet_classID`). 154 | 155 | Users can also later add additional semantics for new or missing taxonomies to the prim data. 156 | 157 | ### Semantic Aggregation 158 | 159 | "Aggregating" semantic data is essential to workflows. A prim could have the semantic label of `door`, but that is missing wider context. Is the prim a door inside of a room, on the exterior of a building, or part of a vehicle? 160 | 161 | A method should be provided to "aggregate" parent semantics in a consistent way that allows users to understand the semantic hierarchy. 162 | 163 | Example prim setup: 164 | ``` 165 | └── xform - car 166 | ├── wheel 167 | ├── door 168 | │ ├── window(glass) 169 | │ └── handle 170 | └── seat 171 | ``` 172 | Querying the semantic hierarchy should return a uniquified list (set) of labels, where ordering is determined by ancestral distance from the prim being queried, and the first time a label is encountered in the ancestor walk determines its position in the list. 173 | 174 | Example results: 175 | ``` 176 | Aggregation query on handle: ['handle', 'door', 'car'] 177 | Aggregation query on seat: ['seat', 'car'] 178 | ``` 179 | 180 | ### Proposed Methods 181 | 182 | The `primvarsAPI` has provided some inspiration for possible methods that would be used for semantics: 183 | 184 | ``` 185 | GetSemanticLabels() - Returns the list of `labels` on a prim 186 | GetSemanticInheritedLabels() - Returns an aggregated ordered set of `labels` from the prim and its parent prims 187 | ``` 188 | 189 | ### Tokens vs Strings 190 | 191 | Consideration was taken for the datatype being tokens or strings.
Considering that semantic labels should be read-intensive and rarely (if ever) written or updated, tokens are thought to be the better representation. 192 | 193 | Some performance testing was done; while in some cases StringArray performed better than TokenArray in Python, the difference was negligible and could be attributed to the overhead of converting tokens to native Python strings. 194 | 195 | Tokens might also facilitate future use of `allowedTokens` for constraining semantic labels, but that will not be a part of this proposal at this time. 196 | 197 | ## Out-of-Scope 198 | 199 | The purpose of this schema is to define _how_ semantic data is stored on prims within a USD file. 200 | 201 | What this is _not_ doing is defining specifics on semantics, taxonomy, or ontologies. 202 | 203 | There are already multiple large public datasets with their own label taxonomies for classification ([MS COCO](https://cocodataset.org/#stuff-eval), [ImageNet](https://deeplearning.cms.waikato.ac.nz/user-guide/class-maps/IMAGENET/), [KITTI](https://www.cvlibs.net/datasets/kitti/), [SUN RGB-D](https://rgbd.cs.princeton.edu/), [NYU](https://cs.nyu.edu/~silberman/datasets/nyu_depth_v2.html), etc). 204 | 205 | Trying to define a specific agreed-upon taxonomy for classification across the industry is beyond the scope of this proposal, so this is not trying to dictate `car` vs `vehicle` vs `automobile` for a specific label. 206 | 207 | Discussions about semantics often devolve into debating specific labels, and consensus can take a long time. 208 | -------------------------------------------------------------------------------- /proposals/pattern-based-collections/README.md: -------------------------------------------------------------------------------- 1 | ![Status:Implemented, 23.11](https://img.shields.io/badge/Implemented,%2023.11-blue) 2 | # Pattern-Based Collections for OpenUSD 3 | The ability to identify collections of objects in scenes is crucial to many digital content workflows. OpenUSD has an existing applied API schema, [`UsdCollectionAPI`](https://openusd.org/dev/api/class_usd_collection_a_p_i.html), to do this, but it currently operates in terms of hierarchical include/exclude rules. This is inconvenient or insufficient for some tasks, so DCCs often provide richer ways to identify collections, allowing wildcard object name matching and predicate testing. Just a couple of current examples are [Katana's CEL](https://learn.foundry.com/katana/dev-guide/CEL.html) and [Houdini Solaris' Prim Matching Patterns](https://www.sidefx.com/docs/houdini/solaris/pattern.html). Here we propose to add extensible pattern-based collection support to OpenUSD, for use in `UsdCollectionAPI` and other domains, drawing inspiration from CEL and Solaris. 4 | 5 | ## Requirements and Guiding Principles 6 | - Membership testing must not require possibly unbounded search. To achieve this, patterns may only consider the object itself (and its properties in the case of a `UsdPrim`) and its ancestors. 7 | - The [`UsdCollectionAPI`](https://openusd.org/dev/api/class_usd_collection_a_p_i.html) is an important use-case for this technology. But we expect it to be used in other domains to identify non-`UsdObject`s that are also identified by `SdfPath`s. For example, in Hydra Scene Indexes, or in user-facing GUI components. To support this we will build an extensible library that may be adapted to different domains, of which `UsdCollectionAPI` is just one.
8 | 9 | ## Basic Syntax 10 | ### Path Matching Patterns 11 | The syntax for `SdfPath` matching is similar to that for `SdfPath` itself, with the following changes: 12 | - Path elements may contain `/Gl*b/[Ss]tyle/patt?rns` 13 | - Double-slash `//` indicates arbitrary levels of hierarchy, e.g. `/World//cup` matches any prim named `cup` descendant to `/World`. 14 | 15 | ### Predicate Expressions 16 | Each path element in a Path Matching Pattern may optionally include a Predicate Expression that can test object qualities. Predicate Expressions are introduced by braces: `{}`. 17 | - `//Robot*{kind:component}` select all prims whose name starts with `Robot` and have `kind=component`. 18 | - `//Robot*{kind:component}//{isa:Imageable purpose:guide}` select all imageable guides descendant to `Robot` components. 19 | 20 | These predicate functions take zero or more arguments, including arguments with default values, and may be invoked in three different ways: 21 | - `predicate` a bare invocation with no arguments. 22 | - `predicate:arg1,...,argN` invocation with unnamed positional arguments separated by commas with no spaces. 23 | - `predicate(arg1, keyword=value, ...)` invocation with multiple positional and keyword arguments, spaces allowed. 24 | 25 | Boolean operators `not`, `and`, `or` can combine predicate functions, and `(` `)` can group and establish evaluation order. In addition, whitespace between two predicate functions implies the `and` operator. 26 | 27 | The specific set of available predicate functions (like `isa`, `purpose`, and `kind` above) is domain-specific. That is, each domain that implements a pattern-based collection can register its own set of predicate functions appropriate to the objects the collection identifies. The specific set of predicate functions we propose for `UsdCollectionAPI` (and thus to be authored in scene description) are listed below. 28 | 29 | As a convenience, a predicate expression alone (without a Path Matching Pattern), like `{isa:Imageable}` is shorthand for `//*{isa::Imageable}`. That is, all prim paths match. Similarly, a Path Matching Pattern element that is empty except for a predicate expression, like `/World//{isa:Camera}` is shorthand for `/World//*{isa:Camera}`. That is, all prim names at that location match. 30 | 31 | #### Built-in Prim Predicate Functions for UsdCollectionAPI 32 | - `abstract(bool=true)` match prims that are or are not abstract (`UsdPrim::IsAbstract`) 33 | - `defined(bool=true)` match prims that are or are not defined (`UsdPrim::IsDefined`) 34 | - `model(bool=true)` match prims that are or are not considered models (`UsdPrim::IsModel`) 35 | - `group(bool=true)` match prims that are or are not considered groups (`UsdPrim::IsGroup`) 36 | - `kind(kind1, ... kindN, strict=false)` match prims of any of given kinds. If `strict=true` matching subkinds is not allowed, only exact matches pass. 37 | - `specifier(specifier1, ... specifierN)` match prims with any of the given specifiers. 38 | - `isa(typeName1, ... typeNameN, strict=false)` match prims that are any typed schema typeName1..N or subtypes. Disallow subtypes if `strict=true`. 39 | - `hasAPI(typeName1, ... typeNameN, instanceName='')` match prims that have any of the applied API schemas 1..N. Limit matches by `instanceName` if supplied. 40 | - Note that we will include support for queries that reason about schema versioning as well, the exact form of that to be determined. 41 | - `variant(setName1 = selGlob1 .. selGlobN, ... 
setNameN = ...)` match prims that have matching selections for variant setNames 1..N. 42 | 43 | #### Matching Prims by Testing Properties, and Matching Properties 44 | We can expand the above syntax to support matching prims by testing their properties, and to match properties themselves, if desired. However, we may not support this in the first implementation. 45 | - `//Robot*//.*color` select all properties whose names end in "color" descendant to prims whose names start with "Robot". 46 | - `//Robot*//{isa:Sphere .radius{value:default:closeTo:0}}` select all the `Sphere` prims beneath "Robot" prims whose `radius` attributes' `default` values are close to 0. 47 | - `//Robot*//{isa:Sphere}.radius{value:default:closeTo:0}` select all the `radius` attributes on `Sphere` prims beneath "Robot" prims whose `default` values are close to 0. 48 | 49 | ### Pattern-Based Collection Expressions 50 | A single Path Matching Pattern (with optional Predicate Expressions) is a Collection Expression, but several may also be combined using set-algebra operators. 51 | - Whitespace or the `+` operator forms the set union of two Collections. 52 | - `&` forms the set intersection of two Collections. 53 | - `-` forms the set difference, the left hand Collection minus the right hand. 54 | - `~` complements a Collection. 55 | - `(` `)` may be used to group and enforce evaluation order. 56 | 57 | #### Collection References 58 | In addition, collection references (starting with `%`) may be combined with Path Matching Patterns as above. 59 | - `%_` refers to to the next "weaker" collection expression in scene description. This way `UsdCollectionAPI` expression opinions can make incremental modifications to existing collections if desired. 60 | - For example, `%_ /Added/Prim` would union `/Added/Prim` with whatever the weaker-composed collection would match. `%_ - /main_cam` would match everything the weaker collection would match, except `/main_cam`. 61 | - In the `UsdCollectionAPI` schema domain, `%/path/to:collectionName` refers to a specific collection on another prim. For example, `%/House/Lights:KeyLights` refers to the collection `/House/Lights.collections:KeyLights`, and `%:collectionName` refers to a sibling collection on this prim. 62 | - Note that wildcards/patterns are not allowed in the names of collection references. 63 | 64 | ### Future Possibilities 65 | As mentioned earlier, initially we may not support querying attribute values. One concern is that we do not want collections (at least UsdCollectionAPI collections) to be time-varying. We may consider supporting testing some kinds of attribute values at specific, nonvarying times. 66 | 67 | ## Software Structure 68 | ### New Sdf Attribute Value Type: `SdfPathExpression` 69 | - Contains the expression string as its fundamental data. 70 | - Provides API to: 71 | - Compose over weaker `SdfPathExpressions` 72 | - Support path translation across composition arcs: `SdfPathExpression::ReplacePrefix` (as employed by `PcpMapFunction`). 73 | - In USD value resolution, we will path-translate any "non-speculative" prefixes (i.e. pattern-free elements) of Path Matching Patterns in the same way that relationship and connection target paths are translated today. 74 | - For example, in `/CharacterGroup/Character//*_sbdv` we would path-translate the `/CharacterGroup/Character` prefix across composition arcs. 75 | - Collection reference paths will be path-translated similarly. 76 | - Parse, validate, and introspect. 
77 | - Provide syntax error feedback with source locations. 78 | - Build expressions by set operations 79 | 80 | We will add a built-in attribute, `expression` to `UsdCollectionAPI` of this type. 81 | 82 | Note that `SdfPathExpression` can only syntactically validate the predicate expressions that appear within `{` `}`. The language itself (e.g. the valid predicate function names and their signatures) must be provided externally, and can vary from domain to domain. For example, in UsdCollectionAPI, the set of functions are those proposed above. However, predicates in a Hydra Scene Index domain, or in a DCC GUI component may augment or modify these. 83 | 84 | ### Evaluation Engine: `SdfPathExpressionEvaluator` 85 | An `SdfPathExpression` object is in general incapable of evaluation. Generally the paths it matches against are only a property of the actual objects of interest. For example, an expression that wants to test a UsdPrim's "kind" needs the `UsdPrim`, not only its path. For this it needs to be told what its domain objects are, and how to obtain the SdfPath from a given domain object. For example, a common case will have `UsdObject` as the domain and `UsdObject::GetPath()` as the means to obtain `SdfPaths`. It also needs to be given the names, signatures, and implementations of all the predicate expression functions. 86 | - Contains an `SdfPathExpression` 87 | - Contains knowledge of a functional domain (such as `UsdObject`, `SdfSpecHandle`, or `UI_Element`) and how to obtain `SdfPath`s from domain objects. 88 | - Also contains knowledge of how to obtain child objects (e.g. prim & property children) to facilitate searching for matches. 89 | - Contains the set of named predicate expression functions, their function signatures & implementations. 90 | - Contains functions that can answer whether or not a given predicate is "closed" over an interval in the domain. For example, an `isModel` predicate is always false for descendants of `UsdPrim`s that are `component`s. This will serve as the basis for important performance optimizations. 91 | - Provides API to: 92 | - Validate and report errors. `SdfPathExpression` can validate _syntax_, but it cannot say whether the name of a predicate expression function is valid, for example, or if it has been passed the required number of arguments, etc. 93 | - Test individual domain elements for matches 94 | - Search a domain interval for matches 95 | - Batch-compute all matches 96 | - Generate matches incrementally 97 | 98 | ### USD Support 99 | - Add custom USD value resolution support for `SdfPathExpression`-valued attributes. Primarily this means consuming opinions, applying the relevant path translations, and composing stronger over weaker until we have a complete expression. That is, one that does not contain a reference to the next weaker expression: `%_` 100 | - Add custom API to `UsdCollectionAPI` for expressions to create `SdfPathExpressionEvaluator` objects for matching on the `UsdStage` 101 | - Modify `UsdCollectionMembershipQuery` to be able to serve queries from both the includes/excludes relationships and expressions. 
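To make the proposed workflow concrete, the following hedged C++ sketch authors a pattern-based expression on a `UsdCollectionAPI` collection. Only the expression syntax and the `%_` composition behavior come from this proposal; the attribute spelling (`collection:renderVisible:expression`), the use of a plain string value type, and the evaluator calls in the trailing comment are illustrative assumptions, not a committed API.

```
// Illustrative sketch only: the dedicated SdfPathExpression value type and the
// SdfPathExpressionEvaluator entry points proposed above are referenced in comments,
// since their final signatures are not specified by this proposal.
#include <string>
#include <pxr/base/tf/token.h>
#include <pxr/usd/sdf/path.h>
#include <pxr/usd/sdf/types.h>
#include <pxr/usd/usd/stage.h>
#include <pxr/usd/usd/collectionAPI.h>

PXR_NAMESPACE_USING_DIRECTIVE

void AuthorProposedExpression(const UsdStageRefPtr& stage)
{
    UsdPrim world = stage->GetPrimAtPath(SdfPath("/World"));
    UsdCollectionAPI renderVisible =
        UsdCollectionAPI::Apply(world, TfToken("renderVisible"));

    // Union the weaker-composed collection (%_) with all imageable guides under
    // /World, then subtract the main camera.
    const std::string expr =
        "%_ + /World//{isa:Imageable purpose:guide} - /World/main_cam";

    // Authored as a plain string attribute purely for illustration; the proposal
    // adds an SdfPathExpression-valued 'expression' attribute to UsdCollectionAPI.
    renderVisible.GetPrim()
        .CreateAttribute(TfToken("collection:renderVisible:expression"),
                         SdfValueTypeNames->String)
        .Set(expr);

    // Evaluation would go through the proposed SdfPathExpressionEvaluator, built over
    // the UsdObject domain with the UsdCollectionAPI predicate functions, e.g.
    // (placeholder API):
    //   evaluator.Matches(stage->GetPrimAtPath(SdfPath("/World/Robot_01/Guides/locator")));
}
```

The same expression string could equally be authored directly in a `.usda` layer; the evaluator object is what supplies the predicate library and the domain traversal described above.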
102 | -------------------------------------------------------------------------------- /proposals/language/README.md: -------------------------------------------------------------------------------- 1 | # Multiple Language Support 2 | 3 | [Discussion](https://github.com/PixarAnimationStudios/OpenUSD-proposals/pull/55) 4 | 5 | ## Summary 6 | 7 | We propose additions to USD to allow specifying the human language locale used so that content may be 8 | localized to provide language and locale context for rendered text, speech synthesis, assistive technologies, or other applications. 9 | 10 | We propose use of [BCP-47](https://www.w3.org/International/core/langtags/rfc3066bis.html) specifiers according 11 | to the [Unicode CLDR](https://cldr.unicode.org) specification, using underscores as the delimiter. 12 | 13 | We propose specifying the language as [metadata](https://openusd.org/release/glossary.html#usdglossary-metadata), 14 | or as an [attribute](https://openusd.org/release/glossary.html#attribute) 15 | on [prims](https://openusd.org/release/glossary.html#usdglossary-prim) as well as a purpose on attributes. 16 | 17 | ``` 18 | def Foo( 19 | prepend apiSchemas = ["LocaleAPI"] 20 | ) { 21 | uniform string locale:langue = "en_US" 22 | string text = "There's a snake in my boot" 23 | string text:fr_CA = "Il y a un serpent dans ma botte" 24 | string text:hi = "मेरे जूते में एक सांप है" 25 | } 26 | ``` 27 | 28 | ## Problem Statement 29 | 30 | Today, most 3D formats assume a single unspecified language across the represented content. 31 | 32 | A few changes and upcoming changes to USD increase the need to specify language: 33 | 34 | 1. With Unicode support in USD, it is more attractive to people in a wider range of locales. 35 | 2. Upcoming text support feels like a natural area for representing content in different locales 36 | 3. USD is now used as part of interactive content (games, spatial computing), where 37 | localization for user playback and assistive technologies may be useful. 38 | 39 | Since there is no language specification, it is unclear for tooling and users how content should be interpreted 40 | when used by language-aware technologies. 41 | 42 | ## Glossary of Terms 43 | 44 | - **[BCP-47](https://www.w3.org/International/core/langtags/rfc3066bis.html)** : An IETF specification 45 | for language representation that are commonly used by web standards and assistive technologies. 46 | You may be familiar with these when you visit websites that have sections marked `en-CA` or `fr` in the URL 47 | - **[Unicode CLDR](https://cldr.unicode.org)** : The Unicode expression of the BCP-47 identifiers 48 | - **Language** : The primary language. Can be subdivided further. Lowercase is recommended. 49 | e.g `en` for English and `fr` for French. 50 | - **Scripts** : An optional subdivision of Language for representation of a language in different written form. 51 | Title case is recommended. For example, `az-Cyrl` for Azerbaijani in the Cyrillic script 52 | - **Region/Territories** : An optional subdivision of Language for different regions that may share the same core 53 | language. Uppercase is recommended. 
For example, `en_CA` for Canadian English 54 | 55 | ## Relevant Links 56 | 57 | * [W3C: Choosing Language Tags](https://www.w3.org/International/questions/qa-choosing-language-tags) 58 | * [Unicode CLDR: Picking the Right Language Identifier](https://cldr.unicode.org/index/cldr-spec/picking-the-right-language-code) 59 | * [W3C: Language Tags and Local Identifiers for the World Wide Web](https://www.w3.org/TR/ltli/) 60 | * [Unicode: Language Tag Equivalences](https://cldr.unicode.org/index/cldr-spec/language-tag-equivalences) 61 | * [Common list of Locales](https://gist.github.com/typpo/b2b828a35e683b9bf8db91b5404f1bd1) 62 | * [Apple: Choosing localization regions and Scripts](https://developer.apple.com/documentation/xcode/choosing-localization-regions-and-scripts) 63 | 64 | ## Details 65 | 66 | ### What would use it? 67 | 68 | This addition to USD is designed to be generic over several other schema types that might benefit from it. 69 | 70 | The primary use case is the 71 | current [Text proposal from Autodesk](https://github.com/PixarAnimationStudios/OpenUSD-proposals/tree/main/proposals/text) 72 | , where text is a really good pairing for language specification. 73 | 74 | Hypothetically, in the future, we could see it also being useful for other use cases like: 75 | 76 | - User facing assistive metadata 77 | - Texture 78 | - Geometry 79 | 80 | We do not expect these to support languages right away, but we believe this is a good future looking feature that 81 | would allow for the use of USD in specific multi-language pipelines. 82 | 83 | For this proposal, we do not require that other schemas explicitly adopt language support. We suggest that this is 84 | something that can be adopted in runtimes over time to support in conjunction with other schemas. 85 | 86 | ### Language Encoding 87 | 88 | To maximize compatibility with other systems, we recommend using `BCP-47` derived locales. For use as a `purpose`, 89 | this would require the use of `_` as a delimiter , as opposed to the standard `-`. 90 | 91 | e.g. Instead of `en-CA`, we use `en_CA` 92 | 93 | This brings it closer to the derived `Unicode Common Locale Data Repository (CLDR)` standard. This is commonly used 94 | by many operating systems, programming languages and corporations. If you are on a POSIX system, this also has 95 | significant overlap with the POSIX locale standards ([ISO/IEC 15897](https://www.iso.org/standard/50707.html)). 96 | 97 | An example list of languages is provided in the relevant links section above. 98 | 99 | ### Unspecified Language Fallback 100 | 101 | In the event that a language is not specified, it is recommended to specify a fallback behaviour. 102 | 103 | Our recommendation is: 104 | 105 | 1. If your attribute or prim is missing a language, check the parent hierarchy for an inherited value 106 | 2. If no language is specified, and if your runtime can infer a language, it is free to do so but does not have to. 107 | 3. If you cannot or chose not to infer a language, assume the user's current locale. 108 | 109 | This matches the behaviour of common assistive technologies like screen readers. 110 | 111 | ### Default Metadata 112 | 113 | Most content will prescribe to one primary language, which tends to be the region of the content creator. 114 | To facilitate this, we encourage but do not require content authors to specify a language. 115 | 116 | As is the current convention, layer metadata is used for stage level hints. 
However the 117 | [Revise Use of Layer Metadata proposal](https://github.com/PixarAnimationStudios/OpenUSD-proposals/pull/45) 118 | suggests moving this to an applied API Schema. 119 | 120 | If we assume current conventions of a layer metadata, we recommend the following field. 121 | 122 | ``` 123 | #usda 1.0 124 | ( 125 | language = "en_CA" 126 | ) 127 | ``` 128 | 129 | However, per the new proposal this should move to an API schema, and we'd propose the following 130 | 131 | ``` 132 | def Foo( 133 | prepend apiSchemas = ["LocaleAPI"] 134 | ) { 135 | uniform string locale:langue = "en_US" 136 | } 137 | ``` 138 | 139 | In both scenarios, the language is inherited as the default value for every prim and attribute below it. 140 | 141 | ### Attribute Purposes 142 | 143 | We take inspiration from web and application development conventions, where it is common to provide resources 144 | for multiple languages in a single context. 145 | 146 | For this we recommend that languages specification be a purpose on the attribute rather than having a single 147 | attribute language. 148 | 149 | Our recommendation is for this to be the last token in the attribute namespaces to work towards the most specific. 150 | 151 | ``` 152 | def foo { 153 | string text = "Colours are awesome" 154 | string text:en_us = "Colors are awesome, but the letter U is not" 155 | string text:fr = "La couleur est géniale" 156 | } 157 | ``` 158 | 159 | One advantage of this system is that you can have your translations in different layer files and referenced it in. 160 | 161 | It would be recommended that at least one version of the attribute exclude the language token, so that it can 162 | fallback to the inherited language and also be used as the fallback if a user asks for a language that has no matching 163 | languages available. 164 | 165 | #### Token Delimiter 166 | 167 | The proposal currently implicitly uses the last token as the language. 168 | It might however be preferable to be explicit about this by also prefixing `lang_` or `lang:` to the last token. 169 | 170 | This could look like one of the below 171 | 172 | ``` 173 | string text:lang_en_us 174 | string text:lang:en_us 175 | ``` 176 | 177 | This perhaps makes things longer, but does make it easier to discern that a token represents a language in a wider 178 | range of use cases. 179 | 180 | #### Why not Variants? 181 | 182 | Variants are also a possible solution, however we believe that this becomes difficult for systems to work with as 183 | variants are effectively unbounded. 184 | 185 | We also find in some of our use cases, that we'd want variants for the core data itself, and doing language 186 | variants per each of these variants would quickly become exponential in variant count and complexity. 187 | 188 | Purposes on attributes feel like the best match to existing paradigms in the web and app development, and easiest for 189 | systems to work with. 190 | 191 | ### API Suggestions 192 | 193 | We do not recommend that OpenUSD itself include all possible language tags. However, it would be beneficial for USD 194 | to provide API to lookup languages specified in the file. 195 | 196 | This could look like the following behaviour where it returns a map of Language and attribute. 
197 | 198 | ``` 199 | std::map UsdLocaleAPI::GetLanguagePurposes(const UsdPrim& prim, TfToken attributeName) {...} 200 | ``` 201 | 202 | Using the example above, a call to `GetLanguagePurposes(foo, "text")` would give 203 | 204 | - (``, foo:text) 205 | - (en_US, foo:text:en_US) 206 | - (fr, foo:fr) 207 | 208 | In this case, the `` represents that it should follow the logic 209 | in the `Unspecified Language Fallback` section. 210 | 211 | I would suggest another function like 212 | 213 | ``` 214 | TfToken UsdLocaleAPI::ComputeFallbackLanguage(const UsdPrim& prim) 215 | ``` 216 | 217 | That would return either the inherited value or a sentinel `Unknown` value when no language is specified. 218 | Perhaps USD could have some convenience function to do user locale lookup, but I do not think that needs to be a 219 | requirement. 220 | 221 | #### Language Selection Recommendations 222 | 223 | When the requested or desired language is not represented in the set of languages within the file, there are some recommendations 224 | on how a runtime can pick a fallback option. 225 | 226 | Following the recommendations of CLDR and BCP-47, we suggest: 227 | 228 | 1. If a language is available within the set of languages, pick that attribute. e.g. `en_US` matches `text:en_US` 229 | 2. If a language isn't available, check for a more specific version of that language. 230 | e.g. `de_DE` matches `text:de_DE_u_co_phonebk` 231 | 3. If a more specific language isn't available, then pick a less specific purpose. 232 | e.g. `en_US` matches `text:en` 233 | 4. If a less specific version isn't available, take the version without any language specified. 234 | e.g. `en_US` matches `text` 235 | 236 | ## Risks 237 | 238 | I do not see significant risk with this proposal. There is a potential for significantly more attributes, 239 | but the number of attributes this would apply to is fairly limited. 240 | 241 | One potential issue is that you may want to swap out geometry or assigned textures by locale too. 242 | e.g An English texture vs a French texture. This proposal would allow for that, but the risk is 243 | that support may be very renderer dependent. 244 | 245 | ## Excluded Topics 246 | 247 | This API specifically does not approach other locale based data like currencies, units and Timezones. 248 | At this time, we are not sure if those other locale based metadata have a strong use case within USD. 249 | 250 | However, we suggest naming it something like `UsdLocaleAPI` such that it allows for future additions to those 251 | types of metadata. 252 | 253 | 254 | 255 | -------------------------------------------------------------------------------- /proposals/usdotio/LICENSE.md: -------------------------------------------------------------------------------- 1 | 2 | Apache License 3 | Version 2.0, January 2004 4 | http://www.apache.org/licenses/ 5 | 6 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 7 | 8 | 1. Definitions. 9 | 10 | "License" shall mean the terms and conditions for use, reproduction, 11 | and distribution as defined by Sections 1 through 9 of this document. 12 | 13 | "Licensor" shall mean the copyright owner or entity authorized by 14 | the copyright owner that is granting the License. 15 | 16 | "Legal Entity" shall mean the union of the acting entity and all 17 | other entities that control, are controlled by, or are under common 18 | control with that entity. 
For the purposes of this definition, 19 | "control" means (i) the power, direct or indirect, to cause the 20 | direction or management of such entity, whether by contract or 21 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 22 | outstanding shares, or (iii) beneficial ownership of such entity. 23 | 24 | "You" (or "Your") shall mean an individual or Legal Entity 25 | exercising permissions granted by this License. 26 | 27 | "Source" form shall mean the preferred form for making modifications, 28 | including but not limited to software source code, documentation 29 | source, and configuration files. 30 | 31 | "Object" form shall mean any form resulting from mechanical 32 | transformation or translation of a Source form, including but 33 | not limited to compiled object code, generated documentation, 34 | and conversions to other media types. 35 | 36 | "Work" shall mean the work of authorship, whether in Source or 37 | Object form, made available under the License, as indicated by a 38 | copyright notice that is included in or attached to the work 39 | (an example is provided in the Appendix below). 40 | 41 | "Derivative Works" shall mean any work, whether in Source or Object 42 | form, that is based on (or derived from) the Work and for which the 43 | editorial revisions, annotations, elaborations, or other modifications 44 | represent, as a whole, an original work of authorship. For the purposes 45 | of this License, Derivative Works shall not include works that remain 46 | separable from, or merely link (or bind by name) to the interfaces of, 47 | the Work and Derivative Works thereof. 48 | 49 | "Contribution" shall mean any work of authorship, including 50 | the original version of the Work and any modifications or additions 51 | to that Work or Derivative Works thereof, that is intentionally 52 | submitted to Licensor for inclusion in the Work by the copyright owner 53 | or by an individual or Legal Entity authorized to submit on behalf of 54 | the copyright owner. For the purposes of this definition, "submitted" 55 | means any form of electronic, verbal, or written communication sent 56 | to the Licensor or its representatives, including but not limited to 57 | communication on electronic mailing lists, source code control systems, 58 | and issue tracking systems that are managed by, or on behalf of, the 59 | Licensor for the purpose of discussing and improving the Work, but 60 | excluding communication that is conspicuously marked or otherwise 61 | designated in writing by the copyright owner as "Not a Contribution." 62 | 63 | "Contributor" shall mean Licensor and any individual or Legal Entity 64 | on behalf of whom a Contribution has been received by Licensor and 65 | subsequently incorporated within the Work. 66 | 67 | 2. Grant of Copyright License. Subject to the terms and conditions of 68 | this License, each Contributor hereby grants to You a perpetual, 69 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 70 | copyright license to reproduce, prepare Derivative Works of, 71 | publicly display, publicly perform, sublicense, and distribute the 72 | Work and such Derivative Works in Source or Object form. 73 | 74 | 3. Grant of Patent License. 
Subject to the terms and conditions of 75 | this License, each Contributor hereby grants to You a perpetual, 76 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 77 | (except as stated in this section) patent license to make, have made, 78 | use, offer to sell, sell, import, and otherwise transfer the Work, 79 | where such license applies only to those patent claims licensable 80 | by such Contributor that are necessarily infringed by their 81 | Contribution(s) alone or by combination of their Contribution(s) 82 | with the Work to which such Contribution(s) was submitted. If You 83 | institute patent litigation against any entity (including a 84 | cross-claim or counterclaim in a lawsuit) alleging that the Work 85 | or a Contribution incorporated within the Work constitutes direct 86 | or contributory patent infringement, then any patent licenses 87 | granted to You under this License for that Work shall terminate 88 | as of the date such litigation is filed. 89 | 90 | 4. Redistribution. You may reproduce and distribute copies of the 91 | Work or Derivative Works thereof in any medium, with or without 92 | modifications, and in Source or Object form, provided that You 93 | meet the following conditions: 94 | 95 | (a) You must give any other recipients of the Work or 96 | Derivative Works a copy of this License; and 97 | 98 | (b) You must cause any modified files to carry prominent notices 99 | stating that You changed the files; and 100 | 101 | (c) You must retain, in the Source form of any Derivative Works 102 | that You distribute, all copyright, patent, trademark, and 103 | attribution notices from the Source form of the Work, 104 | excluding those notices that do not pertain to any part of 105 | the Derivative Works; and 106 | 107 | (d) If the Work includes a "NOTICE" text file as part of its 108 | distribution, then any Derivative Works that You distribute must 109 | include a readable copy of the attribution notices contained 110 | within such NOTICE file, excluding those notices that do not 111 | pertain to any part of the Derivative Works, in at least one 112 | of the following places: within a NOTICE text file distributed 113 | as part of the Derivative Works; within the Source form or 114 | documentation, if provided along with the Derivative Works; or, 115 | within a display generated by the Derivative Works, if and 116 | wherever such third-party notices normally appear. The contents 117 | of the NOTICE file are for informational purposes only and 118 | do not modify the License. You may add Your own attribution 119 | notices within Derivative Works that You distribute, alongside 120 | or as an addendum to the NOTICE text from the Work, provided 121 | that such additional attribution notices cannot be construed 122 | as modifying the License. 123 | 124 | You may add Your own copyright statement to Your modifications and 125 | may provide additional or different license terms and conditions 126 | for use, reproduction, or distribution of Your modifications, or 127 | for any such Derivative Works as a whole, provided Your use, 128 | reproduction, and distribution of the Work otherwise complies with 129 | the conditions stated in this License. 130 | 131 | 5. Submission of Contributions. Unless You explicitly state otherwise, 132 | any Contribution intentionally submitted for inclusion in the Work 133 | by You to the Licensor shall be under the terms and conditions of 134 | this License, without any additional terms or conditions. 
135 | Notwithstanding the above, nothing herein shall supersede or modify 136 | the terms of any separate license agreement you may have executed 137 | with Licensor regarding such Contributions. 138 | 139 | 6. Trademarks. This License does not grant permission to use the trade 140 | names, trademarks, service marks, or product names of the Licensor, 141 | except as required for reasonable and customary use in describing the 142 | origin of the Work and reproducing the content of the NOTICE file. 143 | 144 | 7. Disclaimer of Warranty. Unless required by applicable law or 145 | agreed to in writing, Licensor provides the Work (and each 146 | Contributor provides its Contributions) on an "AS IS" BASIS, 147 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 148 | implied, including, without limitation, any warranties or conditions 149 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 150 | PARTICULAR PURPOSE. You are solely responsible for determining the 151 | appropriateness of using or redistributing the Work and assume any 152 | risks associated with Your exercise of permissions under this License. 153 | 154 | 8. Limitation of Liability. In no event and under no legal theory, 155 | whether in tort (including negligence), contract, or otherwise, 156 | unless required by applicable law (such as deliberate and grossly 157 | negligent acts) or agreed to in writing, shall any Contributor be 158 | liable to You for damages, including any direct, indirect, special, 159 | incidental, or consequential damages of any character arising as a 160 | result of this License or out of the use or inability to use the 161 | Work (including but not limited to damages for loss of goodwill, 162 | work stoppage, computer failure or malfunction, or any and all 163 | other commercial damages or losses), even if such Contributor 164 | has been advised of the possibility of such damages. 165 | 166 | 9. Accepting Warranty or Additional Liability. While redistributing 167 | the Work or Derivative Works thereof, You may choose to offer, 168 | and charge a fee for, acceptance of support, warranty, indemnity, 169 | or other liability obligations and/or rights consistent with this 170 | License. However, in accepting such obligations, You may act only 171 | on Your own behalf and on Your sole responsibility, not on behalf 172 | of any other Contributor, and only if You agree to indemnify, 173 | defend, and hold each Contributor harmless for any liability 174 | incurred by, or claims asserted against, such Contributor by reason 175 | of your accepting any such warranty or additional liability. 176 | 177 | END OF TERMS AND CONDITIONS 178 | 179 | APPENDIX: How to apply the Apache License to your work. 180 | 181 | To apply the Apache License to your work, attach the following 182 | boilerplate notice, with the fields enclosed by brackets "[]" 183 | replaced with your own identifying information. (Don't include 184 | the brackets!) The text should be enclosed in the appropriate 185 | comment syntax for the file format. We also recommend that a 186 | file or class name and description of purpose be included on the 187 | same "printed page" as the copyright notice for easier 188 | identification within third-party archives. 
189 | 190 | Copyright MICHAEL DAVEY CONSULTING LTD LIMITED 191 | Copyright SIGNLY LTD 192 | 193 | https://github.com/NVIDIA-Omniverse/kit-extension-template 194 | 195 | Licensed under the Apache License, Version 2.0 (the "License"); 196 | you may not use this file except in compliance with the License. 197 | You may obtain a copy of the License at 198 | 199 | http://www.apache.org/licenses/LICENSE-2.0 200 | 201 | Unless required by applicable law or agreed to in writing, software 202 | distributed under the License is distributed on an "AS IS" BASIS, 203 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 204 | See the License for the specific language governing permissions and 205 | limitations under the License. 206 | -------------------------------------------------------------------------------- /proposals/boost_python_removal/README.md: -------------------------------------------------------------------------------- 1 | # Removing boost::python 2 | 3 | Copyright © 2024, Pixar Animation Studios, version 1.0 4 | 5 | ## Introduction 6 | 7 | The community has been asking for the removal of all boost library 8 | dependencies from OpenUSD for some time now. Boost is seen as an extremely 9 | heavy-weight dependency by most users and is a frequent source of build issues 10 | and support tickets on OpenUSD's GitHub repo. Within some subcommunities like 11 | the games industry, anything that brings in a boost dependency is an immediate 12 | non-starter, which hinders OpenUSD's adoption by those users. 13 | 14 | Thanks to the recent "de-boostification" work, boost::python is the last 15 | remaining use of boost in OpenUSD. Unfortunately, this library is used heavily 16 | throughout the source tree, as it's responsible for all of OpenUSD's Python 17 | bindings. A rough search of the codebase shows 573 files that include a 18 | boost::python header. 19 | 20 | We propose to replace OpenUSD's use of boost::python with a new Python binding 21 | library, based on boost::python, that will be included with OpenUSD itself. This 22 | achieves the goal of removing the last boost dependency with relatively low cost 23 | and risk, requiring purely mechanical and automatable updates to the majority of 24 | the source tree and centralizing the most intensive changes to a few locations. 25 | 26 | ## Details 27 | 28 | ### pxr_boost::python 29 | 30 | A copy of boost::python would be embedded in OpenUSD under 31 | `pxr/external/boost/python`. Tentatively named `pxr_boost::python`, this copy 32 | would be modified heavily to create what would essentially be a modernized, 33 | isolated version of boost::python: 34 | 35 | - Its namespace would be changed from boost to pxr_boost to distinguish it from 36 | its upstream source. It would also be placed into the same namespace used 37 | throughout OpenUSD. 38 | 39 | - Uses of other parts of boost like the MPL and preprocessor libraries would 40 | be removed. Initial experiments as well as the recent de-boostification work 41 | have demonstrated these can be replaced using only built-in language features 42 | from modern C++ standards. 43 | 44 | - Various workarounds to support older (ancient, in some cases) compilers and 45 | C++ standards would be removed as needed. 46 | 47 | The name `pxr_boost::python` was chosen to provide a breadcrumb for developers 48 | (primarily those who work on OpenUSD itself) that indicates its upstream 49 | parent and where to look for documentation or other support. 
However, it's 50 | important to note that this library will not have any boost dependencies 51 | once this work is completed. 52 | 53 | Critically, pxr_boost::python would maintain exactly the same behavior as 54 | boost::python. It would also maintain almost exactly the same public API. The 55 | few exceptions to this are places where other boost types are part of the API, 56 | such as boost::noncopyable or boost::mpl::vector. 57 | 58 | Because of this, replacing boost::python with pxr_boost::python throughout 59 | the tree should be easily scriptable and incur minimal risk of breaking 60 | anything or introducing accidental behavior changes. 61 | 62 | As an added benefit, using pxr_boost::python could potentially decrease compile 63 | times for Python bindings throughout the tree. Profiling has shown that in many 64 | cases, the expensive part of compiling Python bindings today is processing the 65 | headers for the other parts of boost that boost::python uses, like the MPL and 66 | preprocessor libraries mentioned above. Replacing these with C++ language 67 | features can reduce those costs significantly. Using an early and incomplete 68 | prototype, the time to compile OpenUSD's Python bindings on an M1 Mac dropped by 69 | ~7% (from 7:14 to 6:46). 70 | 71 | ### Bring Your Own Boost 72 | 73 | We *do not* intend to add the "Bring Your Own Boost" (BYOB) feature. 74 | 75 | The BYOB feature would have allowed clients to choose to build OpenUSD against 76 | either the embedded pxr_boost::python or an external boost::python. This was 77 | aimed at giving clients an easy way to maintain current behavior if they relied 78 | on OpenUSD's use of boost::python, e.g. for compatibility with other Python 79 | bindings. 80 | 81 | One of the drawbacks of this feature was that it would force pxr_boost::python 82 | to maintain compatibility with boost::python. Although we do not have any 83 | plans for changes that would break that compatibility, this feature would 84 | have hindered us from doing so if we wanted to in the future. 85 | 86 | Another drawback was that it would introduce fragmentation into the OpenUSD 87 | ecosystem regarding Python bindings. This could make distributing extensions for 88 | OpenUSD (e.g. additional schemas) via PyPI or other platforms more difficult and 89 | confusing. Extension authors would most likely provide distributions that relied 90 | on pxr_boost::python, leaving any clients using the BYOB feature unable to use 91 | these distributions. This isn't necessarily a deal-breaker: one could argue 92 | that BYOB is an advanced feature for customizing builds, and that "regular" 93 | distributions should target standard configurations and not be on the hook 94 | to support every possibility. 95 | 96 | This feature was explicitly called out during earlier discussions and 97 | presentation but there was no feedback from the community that the backwards 98 | compatibility it was meant to provide was important or useful. We are open to 99 | revisiting this based on feedback to this proposal. 100 | 101 | ## Risks 102 | 103 | ### Maintenance 104 | 105 | Embedding a modified version of boost::python means OpenUSD would be taking on 106 | the burden of keeping it up-to-date, e.g. when new version of Python are 107 | released with breaking API changes. 
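For reference, the tree-wide substitution described earlier under *pxr_boost::python* is expected to be roughly this mechanical in a typical wrap file. The wrapped class `FooBar`, the header path spelling, and the namespace qualification are illustrative assumptions (the proposal also nests the library inside the OpenUSD namespace):

```
// Illustrative before/after for a single hypothetical wrap file.
#include <string>

struct FooBar {
    std::string GetName() const { return "FooBar"; }
};

#if 0   // Before: classic boost::python binding.
#include <boost/python/class.hpp>
using namespace boost::python;
#else   // After: the embedded copy proposed under pxr/external/boost/python.
#include "pxr/external/boost/python/class.hpp"   // header layout is an assumption
using namespace pxr_boost::python;               // namespace spelling is an assumption
#endif

void wrapFooBar()
{
    // The binding body itself does not change -- only the include and namespace do,
    // which is what makes the update scriptable across the ~573 affected files.
    class_<FooBar>("FooBar")
        .def("GetName", &FooBar::GetName);
}
```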
108 | 109 | There have been concerns expressed that Pixar would be unable or unwilling to 110 | update pxr_boost::python to support versions of Python beyond what's specified 111 | in the VFX Reference Platform, or on platforms beyond what's currently 112 | supported. There have also been concerns expressed about patching security 113 | issues should they arise. 114 | 115 | Commit history on GitHub shows 5 changes to accommodate Python API changes over 116 | the last 4 years, with the most recent change addressing an issue in Python 117 | 3.11. This suggests the maintenance burden is not significant, at least for this 118 | category of changes. 119 | 120 | In theory, pxr_boost::python could piggy-back off maintenance and support from 121 | the upstream boost::python project. Unfortunately, the boost::python project has 122 | had very, very little activity over the last few years. However, it recently 123 | accepted a patch to support a newer version of numpy. This is a faint sign of 124 | life, but gives some (minimal) hope that boost::python itself may be updated in 125 | the future, which would allow us to take those patches and apply them to 126 | pxr_boost. 127 | 128 | Another example: Python is pushing forward with the removal of the GIL. An 129 | initial implementation of this is available with Python 3.13, and I believe 130 | "no-GIL" mode is intended to be the default in the long term. If we ever needed 131 | to update pxr_boost::python to accommodate that, doing so could be difficult 132 | since we currently do not have broad expertise in the library's internal 133 | implementation. 134 | 135 | However, note that updating pxr_boost::python *would not* be required for 136 | OpenUSD to build against and work with a "no-GIL" Python. Python C API 137 | extensions must opt in to "no-GIL" mode, otherwise Python will enable the 138 | GIL at runtime. pxr_boost::python would need to be updated to take full 139 | advantage of the increased parallelism that "no-GIL" mode provides, but 140 | in the interim this would not block OpenUSD from being used with future 141 | versions of Python. 142 | 143 | ### Support 144 | 145 | OpenUSD would also be on the hook for answering support queries for build 146 | issues related to pxr_boost::python. 147 | 148 | In practice, the USD team is already shouldering this burden. Clients that 149 | build OpenUSD via the shipped build script (build_usd.py) already file issues 150 | on the OpenUSD GitHub repo when the build fails due to an issue with boost, 151 | which is typically because of boost::python. 152 | 153 | ### Client Impact 154 | 155 | Because different Python binding libraries cannot interoperate, changing 156 | OpenUSD's default binding library may impact third-party code that tries to 157 | interoperate with OpenUSD via Python. 158 | 159 | For example, suppose a client used boost::python to wrap C++ functions that 160 | accepted OpenUSD types like UsdStage or UsdPrim. That currently "just works" 161 | as long as their OpenUSD libraries are built against the same boost::python 162 | shared library. Changing OpenUSD to use the embedded pxr_boost::python would 163 | break this. 164 | 165 | Several users have reported having software built on top of OpenUSD that use 166 | pybind11 to wrap functions that interoperate with boost::python-wrapped 167 | OpenUSD types. This is done by using adapters that bridge boost::python and 168 | pybind11, e.g. take a boost::python-wrapped Usd.Stage and pass it to a 169 | pybind11-wrapped C++ function. 
One popular example of this is the 170 | [`pyboost11`](https://yyc.solvcon.net/en/latest/writing/2021/pyboost11/pyboost11.html) 171 | adapter published on the web. Those adapters would have to be updated to work 172 | against pxr_boost::python instead. This update ought to be straightforward 173 | because of its similarity to boost::python. 174 | 175 | Note that OpenUSD's bindings and bindings from other libraries can coexist 176 | without problems as long as they're not expected to interoperate. 177 | 178 | ## Alternatives 179 | 180 | ### pybind11 181 | 182 | We had previously proposed replacing boost::python with pybind11. pybind11 183 | is a modern Python binding library that looks similar to boost::python, 184 | but is *not* a drop-in replacement. It has significant differences in public 185 | API and approach to more complex features like custom type conversions. 186 | 187 | Using pybind11 would require rewriting all of the Python bindings and the 188 | foundational utilities and infrastructure throughout the OpenUSD codebase. 189 | Since Pixar's own codebase is built on top of OpenUSD, using pybind11 190 | would also require Pixar to update all of that code at the same time, 191 | which is an additional 3700+ files on top of the 570+ files in OpenUSD. 192 | 193 | The scale of such an update along with the differences mentioned above 194 | makes this approach extremely risky, for both OpenUSD and Pixar. It's 195 | difficult to anticipate performance and implementation issues, and 196 | the ground-up nature of the rewrite means that such issues could show 197 | up late in the project. It also means this approach would take a 198 | significantly longer time to implement and deliver to the community. 199 | 200 | According to benchmarks, pybind11 has slower runtime performance than 201 | boost::python. The benchmark in the nanobind documentation 202 | [here](https://nanobind.readthedocs.io/en/latest/benchmark.html) shows 203 | over 2x more overhead when calling into C++ when using pybind11. 204 | 205 | Benchmarks also show modules using pybind11 are slower to compile than 206 | those using boost::python. Although the numbers published with 207 | the pybind11 documentation [here](https://pybind11.readthedocs.io/en/stable/benchmark.html) 208 | show otherwise, re-running this benchmark shows boost::python-based 209 | modules compiling faster than pybind11, and a prototype version of 210 | pxr_boost::python even faster than that. 211 | 212 | ![Chart showing compile times](module_compile_times.png) 213 | 214 | See [this page](compile_times/README.md) for more information about this 215 | benchmark. 216 | 217 | The primary benefit of using pybind11 is that OpenUSD would not take on the 218 | maintenance and support cost for its own Python binding library. pybind11 is a 219 | widely-used library and, unlike boost::python, has an active community and 220 | maintainers. 221 | 222 | Ultimately, the pxr_boost::python approach achieves the project goals 223 | with significantly less risk and time, and with additional benefits than 224 | converting to pybind11. This comes at the cost of potential maintenance 225 | and support burden, which we do not believe to be significant. Note that 226 | the pxr_boost::python approach does not preclude converting to pybind11 227 | should the situation change in the future. 228 | 229 | ### nanobind 230 | 231 | nanobind is another Python binding library written by the same author as 232 | pybind11. 
It is not a feasible alternative because it does not contain 233 | features present in both boost::python and pybind11 that are required by 234 | OpenUSD. 235 | -------------------------------------------------------------------------------- /proposals/variant_set_metadata/README.md: -------------------------------------------------------------------------------- 1 | # Variant Set Metadata 2 | 3 | Copyright © 2024, NVIDIA Corporation, version 1.0 4 | 5 | Tyler Hubbard 6 | Joshua Miller 7 | Matthew Kuruc 8 | 9 | ## Contents 10 | 11 | - [Introduction](#introduction) 12 | - [Metadata in OpenUSD](#metadata-in-openusd) 13 | - [Metadata Fields](#core-metadata-fields) 14 | - [Custom Metadata Fields](#custom-metadata-fields) 15 | - [Metadata Value Resolution](#metadata-value-resolution) 16 | - [Example](#example) 17 | - [Variant Display Name](#variant-display-names) 18 | - [Variant Ordering](#variant-ordering) 19 | - [Crate File Format Support](#crate-file-format-support) 20 | - [UsdVariantSet is not a UsdObject](#UsdVariantSet-is-not-a-usdobject) 21 | - [Stage Flattening](#stage-flattening) 22 | 23 | ## Introduction 24 | 25 | Variant sets would benefit from a means of storing and retrieving additional information about them 26 | and their variant children. Users of editors have a need to create and view context 27 | and informative descriptors about variant sets. A common use case for this feature is 28 | showing a user friendly display name instead of just an identifier. For example, 29 | being able to show UTF-8 transcoded display names in editors, 30 | providing a much better user experience across languages. 31 | 32 | There is currently no support for this functionality on variant sets, 33 | so we propose adding the ability to author metadata on `UsdVariantSet` to support similar 34 | functionality available to `UsdPrim`, `UsdAttribute` and `UsdRelationship`. 35 | This metadata is intended to be used in editors to display information such as a 36 | nicely formatted name for the variant set or whether the variant set should be hidden in certain views. 37 | `UsdVariantSet` metadata will not affect the composition behavior for a `UsdPrim`, 38 | `UsdAttribute` or `UsdRelationship` on a `UsdStage`. 39 | 40 | ## Metadata in OpenUSD 41 | 42 | Metadata is a core feature of OpenUSD that is available to `SdfLayer`, `UsdStage`, `UsdPrim` 43 | and both subclasses of `UsdProperty`. `UsdPrim` and `UsdProperty` both derive from `UsdObject` 44 | which provides API for authoring and accessing metadata. 45 | Common examples of metadata available for these types are documentation, display name, 46 | display group, and hidden to name a few. This metadata is extremely helpful for editors, 47 | such as UsdView, so users can get a better understanding, or visualization, of the stage. 48 | For instance, display name is a very common form of metadata on a `UsdPrim` to create a nice, 49 | human-readable name that will be presented in an editor. 50 | 51 | A `UsdVariantSet`, which does not derive from a `UsdObject`, has no such API to create metadata. 52 | If an author would like to produce a `UsdVariantSet` that should be hidden from editors, 53 | there isn't a uniform way to do so. Yes, the author could prefix the name of the `UsdVariantSet` 54 | with something like "test" or "__", but every editor would need to respect this prefix. 55 | A better way to support such a feature on `UsdVariantSet` would be to expose a hidden metadata field. 
56 | This would be consistent with the `UsdObject` API and lends itself to other metadata fields 57 | that would be applicable for a `UsdVariantSet` like display name. 58 | 59 | ## Metadata Fields 60 | 61 | The following are metadata fields that will be commonly used on a `UsdVariantSet` 62 | and will have an explicit API. The fields proposed here will provide the normal accessors 63 | and mutators that will be available on a `UsdVariantSet`: 64 | 65 | - `comment` - User notes about a variant set 66 | - `documentation` - Information that can describe the role of the `UsdVariantSet` and its intended uses 67 | - `displayGroup` - Name to assist with grouping similar `UsdVariantSet`s in editors. 68 | - `displayName` - Name for the `UsdVariantSet` that can be used in editors 69 | - `hidden` - Boolean field that can inform an editor if the `UsdVariantSet` should be hidden. 70 | This field will have a fallback value of `false` 71 | - `variantDisplayNames` - Mapping of variant name to a variant's display name that can be used in editors. 72 | See [Variant Display Names](#variant-display-names) for more implementation details. 73 | - `variantOrder` - User defined ordering of variants, similar to primOrder and propertyOrder. 74 | See [Variant Ordering](#variant-ordering) for more implementation details. 75 | - `customData` - User defined dictionary of additional metadata 76 | 77 | ## Custom Metadata Fields 78 | 79 | Similar to other types that allow authoring of metadata fields, `UsdVariantSet` will also 80 | allow custom metadata fields that are defined in plugins. These custom fields can be used to 81 | author metadata on a `UsdVariantSet` that might be site-specific but should always be applied 82 | to the `UsdVariantSet` type. 83 | 84 | ## Metadata Value Resolution 85 | 86 | As described below, a [UsdVariantSet should not be a UsdObject](#usdvariantset-is-not-a-usdobject). 87 | But the metadata value resolution process of a `UsdVariantSet` should behave 88 | similar to that of `UsdObject`. For most types the resolution process will be 89 | "strongest opinion wins" but other, more complicated types like VtDictionary will be 90 | correctly resolved. This will slightly complicate the implementation since 91 | `UsdVariantSet` will need to provide this value resolution step, 92 | but should provide expected results for authored metadata. 
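As a sketch of how an editor might consume this metadata once value resolution is in place: the accessor names below (`IsHidden`, `GetVariantDisplayNames`) are hypothetical, chosen to mirror the existing `UsdObject` conventions, and are not part of this proposal's committed API; only `UsdPrim::GetVariantSet` and `UsdVariantSet::IsValid` are existing API.

```
// Hypothetical editor-side lookup of a variant's display name.
#include <string>
#include <pxr/usd/usd/prim.h>
#include <pxr/usd/usd/variantSets.h>

PXR_NAMESPACE_USING_DIRECTIVE

std::string DisplayNameForVariant(const UsdPrim& prim,
                                  const std::string& setName,
                                  const std::string& variantName)
{
    UsdVariantSet vset = prim.GetVariantSet(setName);  // existing API
    if (!vset.IsValid()) {
        return variantName;
    }

    // Proposed (hypothetical) accessors, mirroring UsdObject:
    //   if (vset.IsHidden()) { /* skip this set in the UI */ }
    //   VtDictionary displayNames = vset.GetVariantDisplayNames();
    //   auto it = displayNames.find(variantName);
    //   if (it != displayNames.end()) return it->second.Get<std::string>();

    // Fall back to the raw variant name when no display name is authored.
    return variantName;
}
```

The Example section below shows the corresponding authored `variantDisplayNames` and `hidden` opinions in scene description.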
93 | 94 | ## Example 95 | 96 | An example showing metadata on variant sets can be found [here](example.usda) 97 | 98 | ``` 99 | #usda 1.0 100 | ( 101 | defaultPrim = "MilkCartonA" 102 | ) 103 | 104 | def Xform "MilkCartonA" ( 105 | kind = "prop" 106 | variants = { 107 | string modelingVariant = "Carton_Opened" 108 | string shadingComplexity = "full" 109 | string shadingTest = "milkBrandA" 110 | } 111 | prepend variantSets = ["modelingVariant", "shadingComplexity", "shadingTest"] 112 | ) 113 | { 114 | variantSet "modelingVariant" ( 115 | doc = "Modeling variations for the asset" 116 | ) = { 117 | "ALL_VARIANTS" { 118 | float3[] extentsHint = [(-6.27056, -6.53532, 0), (6.14027, 6.10374, 29.8274)] 119 | 120 | } 121 | "Carton_Opened" { 122 | float3[] extentsHint = [(-6.27056, -6.53532, 0), (6.14027, 6.10374, 29.8274)] 123 | 124 | } 125 | "Carton_Sealed" { 126 | float3[] extentsHint = [(-6.27056, -6.44992, 0), (6.14027, 6.10374, 29.2792)] 127 | 128 | } 129 | } 130 | variantSet "shadingComplexity" = { 131 | reorder variants = ["none", "display", "modeling", "full"] 132 | "display" { 133 | 134 | } 135 | "full" { 136 | 137 | } 138 | "modeling" { 139 | 140 | } 141 | "none" { 142 | 143 | } 144 | } 145 | 146 | variantSet "shadingTest" ( 147 | """Added UPC# for the display names""" 148 | displayName = "Shading Test Variations" 149 | doc = "Shading variations that are currently being explored for product placement" 150 | variantDisplayNames = { "milkBrandA" : "Milk Brand A (UPC#: 123-ABC)", "milkBrandB" : "Milk Brand B (UPC#: 456-DEF)" } 151 | hidden = True 152 | ) = { 153 | "milkBrandA" { 154 | 155 | } 156 | "milkBrandB" { 157 | 158 | } 159 | } 160 | } 161 | ``` 162 | 163 | ## Variant Display Names 164 | 165 | Adding display names to variants is another metadata field that this proposal would like to support. 166 | This field will need to be handled differently than other displayName metadata fields 167 | found on types like `UsdPrim` and `UsdProperty`. Specifically, a mapping of variant name 168 | to display name will be required as display name can not be added to a variant. 169 | First, there is no `UsdVariant` type to add display name API to. Secondly, and most importantly, 170 | all fields within a variant are applied directly to a `UsdPrim` when selected. 171 | If a displayName metadata field was added to a variant, that field would be applied to the 172 | `UsdPrim`'s displayName and not the variant's. It's for this reason that a mapping of 173 | variant name to display name will be added to a `UsdVariantSet`. 174 | As described in [Metadata Value Resolution](#metadata-value-resolution), 175 | the mapping of variant name to display name will need to be correctly resolved and merged. 176 | 177 | ## Variant Ordering 178 | 179 | One of the metadata fields presented in this proposal is the 180 | addition of [variantOrder](#metadata-fields). The goal of this field is simple; 181 | to specify an explicit order for how an editor should display variants in a `UsdVariantSet`. 182 | When considering the implementation of this metadata field for the `SdfTextFileFormat` 183 | it seems consistent with other `SdfSpec`s to use a `reorder variants` statement. 184 | This can be seen in the [proposed example](example.usda) but worth 185 | calling out specifically as it brings up two questions: 186 | 187 | 1. Is a `reorder variants` statement the correct approach for implementing `variantOrder` 188 | metadata field? 189 | 2. 
If so, should the addition of a new `reorder` statement in the `SdfTextFileFormat` 190 | require a version bump as it requires changes to the parser? 191 | 192 | ## Crate File Format Support 193 | 194 | The general consensus is that the USD Crate File Format should "just work" when 195 | metadata support is added to `UsdVariantSet`. The proposal does not mention 196 | any implementation details regarding the USD Crate File Format as the expectation 197 | is that no changes should be necessary. Unfortunately, it is not easy to test out 198 | if this assumption is correct due to Crate being a binary file format. 199 | This will require further experimentation once the changes proposed here move 200 | to the implementation step. 201 | 202 | ## UsdVariantSet is not a UsdObject 203 | 204 | `UsdObject` is the base class for `UsdPrim` and `UsdProperty` that provides the 205 | common API for accessing metadata. Investigating metadata support for `UsdVariantSet` 206 | immediately raised the question of: *“Should UsdVariantSet be a UsdObject?”*. 207 | The answer is no, a `UsdVariantSet` should not be a UsdObject. The reason is more 208 | philosophical than technical as a `UsdObject` is the common base shared amongst types 209 | in OpenUSD’s scenegraph. The idea is that a `UsdObject` is a tangible entity 210 | on OpenUSD's scenegraph. These objects can be accessed directly from API such as 211 | UsdStage::GetObjectAtPath(), relationships can be established between multiple 212 | `UsdObject`s, they can be included in collections, they will not be discarded during 213 | the flattening process, and all `UsdObject`s share a common set of metadata. 214 | This last point is the main reason for asking *"Should UsdVariantSet be a UsdObject?"*; 215 | specifically to prevent code duplication. But when thinking about these other points, 216 | it becomes clear that deriving `UsdVariantSet` from a `UsdObject` would require 217 | quite a few exemptions in OpenUSD. 218 | 219 | From a technical perspective though, deriving `UsdVariantSet` from a `UsdObject` seems logical. 220 | It already provides a lot of the metadata API that is presented in this proposal. 221 | But `UsdVariantSet` would also inherit API for metadata such as AssetInfo which 222 | is not something that applies to a `UsdVariantSet`. It would also create ambiguity 223 | with `UsdCollectionAPI` and `UsdRelationship` as those expect `SdfPath`s 224 | to `UsdObject`s like `UsdPrim`, `UsdAttribute` or `UsdRelationship`. 225 | What benefit would a collection containing a `UsdVariantSet` provide? 226 | Or what does a relationship from a `UsdAttribute` to a `UsdVariantSet` mean? 227 | The convenience of avoiding code duplication does not warrant changing `UsdVariantSet` 228 | to be a `UsdObject`. There is also precedent in OpenUSD for API that provides 229 | metadata support without deriving from `UsdObject`, `UsdStage` is one such example. 230 | 231 | But answering this question leads to discussion on how to best implement metadata support 232 | for `UsdVariantSet`. Since `UsdVariantSet` should not be a `UsdObject` a potential 233 | implementation for metadata support could lead to unnecessary code duplication. 234 | A better approach might be to refactor common metadata API found in `UsdObject` to 235 | private internal utilities that can be used for types that do not derive from 236 | `UsdObject`. 
Types such as `UsdStage` and `UsdVariantSet` could use these utilities 237 | to implement their own API for metadata, avoiding code duplication and 238 | not deriving from `UsdObject`. 239 | 240 | ## Stage Flattening 241 | 242 | The stage flattening process will remove all `UsdVariantSet`s when the stage is collapsed 243 | into a single merged layer. With all `UsdVariantSet`s being removed, there is no reason 244 | to maintain authored metadata as an editor will not have any `UsdVariantSet`s to display. 245 | As such, metadata authored for a `UsdVariantSet` will not be preserved when flattened. 246 | This could be viewed as another argument for [UsdVariantSet should not be a UsdObject](#usdvariantset-is-not-a-usdobject) 247 | since they are discarded during the flattening process. -------------------------------------------------------------------------------- /proposals/profiles/README.md: -------------------------------------------------------------------------------- 1 | # Profiles 2 | 3 | This document proposes the addition of `profiles` as metadata to USD documents. 4 | Profiles are short, structured descriptions of potentially non-standard features 5 | used within the stage such that a runtime or human can know of their existence ahead of time. 6 | 7 | Profiles are not meant to change runtime parsing behaviour, and are essentially just 8 | informational hints. 9 | 10 | A proposed example would be: 11 | 12 | ``` 13 | ( 14 | profile = { 15 | string name = "org.openusd.core" 16 | string version = "1.2.3" 17 | dictionary compatibility = { 18 | string com.apple.realitykit = "4.3.2" 19 | } 20 | dictionary required = { 21 | dictionary fileFormats = { 22 | string org.aom.vvm = "1.0.0" 23 | } 24 | } 25 | dictionary optional = { 26 | dictionary schemas = { 27 | string com.apple.realitykit.components = "1.0.0" 28 | } 29 | } 30 | } 31 | ) 32 | ``` 33 | 34 | ## Problem Statement 35 | 36 | USD documents can be pretty wide in terms of the features supported such as custom schemas, textures, audio or geometry 37 | formats. Some of these are important to be 38 | represented, while others are not meant to be portable. 39 | 40 | This makes it difficult for an application runtime to know what it should let a user 41 | know about, when it can't represent something. It also means that the application 42 | needs to analyze every element before telling the user something may be amiss. 43 | 44 | Conversely, USDZ packages focus on portability and are much stricter about the resource types 45 | they may include (though it omits the same rigidity for schemas). 46 | However, this strictness can also prevent the use of new features until they've 47 | been standardized. While this is great for inter-ecosystem sharing, it can be an 48 | issue for use within more limited scopes. 49 | 50 | Profiles aim to solve this by providing metadata so that USD documents and packages 51 | may express what non-standard features they use. This would allow runtimes to flag 52 | concerns to a user faster for unsupported features, and allow USDZ documents to use 53 | features prior to standardization. 54 | 55 | ## Glossary 56 | 57 | N/A 58 | 59 | ## Details 60 | 61 | We propose the addition of a dictionary metadata that stores a structured mapping 62 | of profile identifiers to their corresponding versions. 63 | 64 | ### Profile Identifiers 65 | 66 | We suggest that profiles identify themselves with 67 | a [Reverse Domain Name Notation](https://en.wikipedia.org/wiki/Reverse_domain_name_notation)**. 
68 | 69 | This is a very standard registration type across multiple systems, and has the 70 | advantage of allowing namespacing, while reducing the risk of name squatting. 71 | 72 | E.g. `com.apple.text.preliminary` would point to the use of a preliminary 73 | text schema that is attributed to Apple. This would allow disambiguation with 74 | something like `com.autodesk.text.preliminary` if Autodesk wanted to release 75 | a preliminary version of their schema too. 76 | 77 | While there should be other forms of delineation within the schema, and potentially its name, this allows the 78 | application runtime to alert the user before traversing 79 | the scene and running heuristics. 80 | 81 | It also allows the application to direct the user to the relevant developer's site 82 | where they can ask for more information. 83 | 84 | Lastly, it also prevents name collisions. 85 | For example, in the future, [aswf.com](https://www.aswf.com) may want to make schemas for window 86 | film parameters. This would then conflict with [aswf.io](https://www.aswf.io)'s schemas. 87 | Treating the domain identifier as ownership has proven to be quite resilient. 88 | 89 | #### Why not a URL? 90 | 91 | It may be preferable to some to use a URL like `https://www.openusd.org/profiles/core/v1.2.3`. 92 | 93 | However, this has been problematic in past systems: URLs bit-rot, and they require 94 | that everyone align on one website structure, or that runtimes have many parsers. 95 | 96 | A standardized reverse domain name notation is therefore considered a good middle ground. 97 | Users may still have to search for the specific feature, but they'll 98 | at least know who to ask. 99 | 100 | ### Profile Versioning 101 | 102 | We propose that versions of profiles use [semantic versioning](https://semver.org)-compatible 103 | strings. We recognize that many projects, including OpenUSD, do not use 104 | semver. However, there is benefit in using a string that can be parsed by 105 | semver parsers even if the semantic meanings of the tokens aren't the same. 106 | 107 | For example, one application may support a specific version of an extension but not older or newer versions of it. 108 | 109 | ### Profile Augmentation 110 | 111 | Our proposal for profiles includes the concept of a base profile and augmentations 112 | beyond that. 113 | 114 | The base profile should provide a well-known, baseline understanding of what features are 115 | standardized under it. 116 | 117 | For example, `org.openusd.core = 0.25.8` would represent a core profile of USD that 118 | aligns with OpenUSD 25.8. 119 | 120 | Features beyond this base profile may be specified as well in a similar way, 121 | augmenting it. Therefore, profiles are always additive. 122 | 123 | `com.apple.realitykit.components = 1.2.3` could augment the base profile, as one 124 | example. 125 | 126 | ### Dictionary Overview 127 | 128 | We propose that the dictionary have the following top-level keys: 129 | 130 | - **name** : The identifier of the base profile 131 | - **version** : The version of the base profile 132 | 133 | Beyond that, there are sub-dictionaries of augmentations. Each of these two is 134 | further subdivided into categories as described in the next section: 135 | 136 | - **required** : A set of profile augmentations that are 137 | required to present this USD stage. For example, if using compressed-geometry 138 | file format plugins, the stage would not be representable in a usable form without 139 | their availability.
140 | 
141 | - **optional** : A set of profile augmentations that are
142 | not required to portably represent this stage. For example, RealityKit or other
143 | runtimes may include many runtime-specific schemas for behaviour etc... which
144 | are not expected to be used by a DCC.
145 | 
146 | Finally, it may also be beneficial to share what versions of runtimes the document
147 | has been intended for or tested with. This is a mapping of identifiers to
148 | their versions.
149 | 
150 | - **compatibility** : A map of which versions of a DCC or runtime the document has
151 | been tested for or is intended to be used with. e.g `com.sidefx.houdini = "12.4"`.
152 | These should not be used to prevent loading of the file in mismatched versions, but rather to
153 | provide a standardized way to warn a user if compatibility might be an issue.
154 | 
155 | ### Augmentation Dictionary Categories
156 | 
157 | Both the required and optional augmentation dictionaries are further subdivided
158 | into categories.
159 | 
160 | Categories help organize the augmentations into their respective domains.
161 | This further allows a runtime to decide what to present to a user.
162 | 
163 | For example, a runtime that is not rendering need not bother with augmentations
164 | that are meant for visualization.
165 | 
166 | Additionally, this allows for better, standardized reporting structures to users
167 | in whatever manner the runtime or app chooses. e.g Maya and Blender would inherently
168 | have different UIs, but wouldn't necessarily have to provide their own categorization.
169 | 
170 | We propose the following categories:
171 | 
172 | - **imaging** : features that a renderer would need to support this document.
173 | For example, Gaussian Splat rendering `com.adobe.splats`.
174 | - **fileFormats** : file format plugins that USD needs in order to read data and recreate a hierarchy. e.g `org.aswf.materialx`
175 | - **assetFormats** : Asset formats for textures or audio that may be required.
176 | e.g `org.aswf.vdb`
177 | - **schemas** : Schemas that may be required for correct parsing of this scene.
178 | e.g `com.apple.realitykit.components`
179 | - **features** : A list of USD features that may not be supported by a given runtime.
180 | e.g a USD file may use relocates, but an older runtime won’t understand them even
181 | if it can parse them. e.g `org.openusd.relocates`
182 | - **general** : Extensions that don’t fit in a predetermined category
183 | 
184 | ## Profiles vs Extensions
185 | 
186 | Other formats like glTF have extension facilities as described
187 | in [glTF 2.0 Extension Registry](https://github.com/KhronosGroup/glTF/blob/main/extensions/README.md).
188 | 
189 | Unlike extensions, profiles (as described here) do not add new functionality.
190 | Instead, profiles are a complement to OpenUSD's existing extension system, allowing
191 | up-front declaration of which extensions are in use.
192 | 
193 | Profiles are intended to have no runtime side effects beyond their declaration.
194 | 
195 | ## Runtime Analysis
196 | 
197 | A concern about profiles is that they are just hints, and are taken at face value.
198 | 
199 | The real truth is of course always in the actual data stored, whether that's the schemas or the asset formats used.
200 | 
201 | However, reading that truth requires runtimes to analyze everything in the scene, and to understand
202 | ahead of time what they may not support.
203 | This is difficult in several scenarios, especially when composition makes the full
204 | scene expensive to analyze ahead of time.
Additionally, profiles can help reduce the need to parse
205 | multiple file formats ahead of time just to know whether they're supported.
206 | 
207 | As such, profiles are simply hints that should be truthful from the content creation
208 | pipeline to the consuming application/runtime. They are not meant to be taken
209 | as absolute truth.
210 | 
211 | ## Metadata Location
212 | 
213 | One question is where the metadata is best stored, especially when multiple
214 | layers compose a stage.
215 | Does the root layer need to describe the sum of all profile information of the rest
216 | of the stage?
217 | 
218 | To avoid that burden, it may be preferable to store the metadata on the individual top level prims.
219 | 
220 | This would allow the metadata to compose together, at the expense of a little more
221 | complexity in where to look for the metadata.
222 | 
223 | ## Suggested Core Profile
224 | 
225 | We propose the addition of an OpenUSD base profile that corresponds to the USD version.
226 | 
227 | This would be a well recognized base profile for systems to use, in the form of
228 | `org.openusd.core` where the version would be `0.24.5` etc...
229 | 
230 | Future base profiles could be managed by well known bodies in the industry like the
231 | AOUSD. For example, if there were a more limited set of USD for the web without
232 | features like volumetrics, it could be `org.aousd.web` with a requisite version.
233 | 
234 | ## Suggested Augmentation Profiles
235 | 
236 | The following profiles are examples of hypothetical augmentation profiles.
237 | 
238 | - **org.aom.vvm** : Use of
239 | the [Volumetric Visual Media](https://aomedia.org/press%20releases/call-for-proposals-on-static-polygonal-mesh-coding-technology/)
240 | compression
241 | - **org.aom.avif** : Use of the AVIF media encoder, since USD files may need to be
242 | used in older runtime versions that do not include the AV1 decoder.
243 | - **com.apple.realitykit.components** : Components that describe RealityKit specific runtime behaviour.
244 | - **org.aswf.materialx** : Requires usdMtlx be built for a specific version of MaterialX to load the data
245 | - **org.aswf.openvdb** : Requires VDB to be available to load volumetric data
246 | 
247 | ## Validation
248 | 
249 | The addition of profiles could open up the opportunity for more granular validation.
250 | e.g a file that doesn't claim to use RealityKit components could surface a warning
251 | if those components are encountered.
252 | 
253 | Specific additions to validation are considered out of scope for this proposal, but
254 | the idea, in the abstract, is one that could be useful.
255 | 
256 | There are a few ways that profiles can help with validation of USD files:
257 | 
258 | 1. An application may present warnings on load when it loads a USD file that uses an unsupported extension. This would be
259 | similar to when Maya loads a .ma file using unknown plugins, or when Keynote loads a file that makes use of unknown
260 | fonts.
261 | 2. Asset deliveries could have very quick validation to make sure that the assets aren't using undesirable extensions,
262 | prior to parsing the files themselves. The compatibility information could be provided to asset vendors to
263 | check against in a normative way.
264 | 3. An asset library could choose to only show USD files that have extension versions compatible with the current
265 | application.
266 | 4.
Command line tools (like the macOS fork of usdchecker) could validate whether a given set of extensions would work on 267 | the current versions of RealityKit etc... 268 | 269 | ## Alternate Solutions 270 | 271 | When proposing this addition, no other alternate approaches were suggested 272 | beforehand by contributors and evaluators. 273 | 274 | 275 | -------------------------------------------------------------------------------- /proposals/accessibility/README.md: -------------------------------------------------------------------------------- 1 | # Accessibility Schema 2 | 3 | [Link to Discussion](https://github.com/PixarAnimationStudios/OpenUSD-proposals/pull/69) 4 | 5 | ## Summary 6 | 7 | As USD enters more widespread use, with a range of interactive experiences for spatial computing and the web, it becomes necessary to make sure we do not exclude anyone. It is important for 3D content to be as accessible as other media types for people with a range of needs that may require virtual assistive services. 8 | 9 | To facilitate this, we propose the addition of accessibility metadata, using industry standard nomenclature. 10 | 11 | ```python 12 | def Mesh "Cube" ( 13 | prepend apiSchemas = ["AccessibilityAPI"] 14 | ) { 15 | string accessibility:label = "Luxo, Jr" 16 | string accessibility:extendedDescription = "The lamp has round base with two sections above it that may be adjusted. It has a conical head with a lightbulb inside. It likes to chase inflatable balls" 17 | token accessibility:important = "standard" 18 | } 19 | ``` 20 | 21 | ## References and Notes 22 | 23 | **Disclaimer :** This proposal should not be taken as an indication of any upcoming feature in our products. It is being provided to garner community feedback and help guide the ecosystem. 24 | 25 | **Authors**: J Lobo Ferreira da Silva , Dhruv Govil 26 | 27 | ### Glossary of Terms 28 | 29 | * **Accessibility affordances** : Refers to features that may provide information for assistive services like audio description tools to provide assistance to users who may have limited sight or mobility. 30 | * **Audio Description** : Several tools and platforms provide the ability to describe what is on screen to the user. 31 | These are usually based on metadata on user interface items, such as image alt text or button descriptions. 32 | * **ARIA (Accessible Rich Internet Application) Roles** : A set of roles and attributes to make the web more accessible. 33 | 34 | ### Related Reading 35 | 36 | Here are some related materials to read on about the use of accessibility in various systems. 37 | None of these are required reading, but may help provide context. 38 | 39 | * [Apple Accessibility](https://www.apple.com/ca/accessibility/) 40 | * [Catch up on accessibility in SwiftUI](https://developer.apple.com/wwdc24/10073) 41 | * [Improving the Accessibility of RealityKit Apps](https://developer.apple.com/documentation/realitykit/improving-the-accessibility-of-realitykit-apps) 42 | * [Microsoft Accessibility](https://www.microsoft.com/en-us/accessibility) 43 | * [Web Accessibility Guidelines](https://www.w3.org/TR/WCAG21/) 44 | * [ARIA Roles](https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA) 45 | * [Unity 3D Accessibility](https://docs.unity3d.com/2023.2/Documentation/ScriptReference/UnityEngine.AccessibilityModule.html) 46 | 47 | ## Details 48 | 49 | ### API Schema 50 | 51 | We propose a new Applied API schema that will allow any Prim to provide descriptions for Audio Description tools. 
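Since no `AccessibilityAPI` schema ships with OpenUSD today, the snippet below is only a sketch of how a pipeline might prototype the proposed data with generic USD authoring calls; the schema name and attribute names simply follow this proposal and are not a registered API:

```python
from pxr import Usd, Sdf

stage = Usd.Stage.CreateInMemory()
prim = stage.DefinePrim("/Cube", "Mesh")

# Record the (hypothetical, not yet registered) applied schema name, then
# author the proposed attributes directly in the accessibility namespace.
prim.AddAppliedSchema("AccessibilityAPI")
prim.CreateAttribute("accessibility:label", Sdf.ValueTypeNames.String).Set("Luxo, Jr")
prim.CreateAttribute("accessibility:extendedDescription", Sdf.ValueTypeNames.String).Set(
    "A lamp with an adjustable body and a conical head")

print(stage.GetRootLayer().ExportToString())
```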
52 | 53 | We include the following attribute groupings, with the `accessibility` namespace prefix. 54 | 55 | * `label`: The primary description string that is presented to the user. 56 | * `extendedDescription` : An extended description string that may provide more detail than the primary label. 57 | * `importance` : Sets the importance of this group so tools may prioritize what to surface to a user. Options are `high`, `standard`, `low` with a default of `standard`. 58 | 59 | All three attributes are based on existing standard names in accessibility frameworks, and in consultation with multiple accessibility experts. 60 | 61 | Given the example in the summary, repeated here 62 | 63 | ```python 64 | def Mesh "Cube" ( 65 | prepend apiSchemas = ["AccessibilityAPI"] 66 | ) { 67 | string accessibility:label = "Luxo, Jr" 68 | string accessibility:extendedDescription = "The lamp has round base with two sections above it that may be adjusted. It has a conical head with a lightbulb inside. It likes to chase inflatable balls" 69 | token accessibility:important = "standard" 70 | } 71 | ``` 72 | 73 | Only the label is required, and so the shortest form of this may be as simple as 74 | 75 | ```python 76 | def Mesh "Cube" ( 77 | prepend apiSchemas = ["AccessibilityAPI"] 78 | ) { 79 | string accessibility:label = "A sentient lamp with an adjustable body and cone head" 80 | } 81 | ``` 82 | 83 | ### Attribute Purposes 84 | 85 | It can be useful to have multiple types of descriptions, represented as purposes. 86 | For example, a primary description may choose to describe a general appearance, but a secondary one may choose to describe a specific detail. 87 | 88 | For example, a user may want to have a short summary of the object, but then ask details about its size. 89 | 90 | We propose that these be represented as purposes on the attribute. 91 | 92 | ```Python 93 | def Mesh "Cube" ( 94 | prepend apiSchemas = ["AccessibilityAPI"] 95 | ) 96 | { 97 | string accessibility:label = "A Cube" 98 | string accessibility:extendedDescription:default = "This cube is a wonderful looking cube" 99 | token accessibility:importance:default = "standard" 100 | 101 | string accessibility:extendedDescription:size = "As big as a house" 102 | token accessibility:importance:size = "low" 103 | } 104 | ``` 105 | 106 | This allows for multiple levels of description, and also allows for aspects of the description to be changed by variants. 107 | 108 | For example a user may ask what color an object is, and different material variants may provide their own `color` purpose here. 109 | 110 | While we do not limit the names of these purposes, a few suggestions may be: 111 | 112 | * `default`: The generic, anonymous purpose. No purpose needs to be specified here. 113 | * `size` : Describe the size of the object in ways a human without inherent unit visualizations might understand. 114 | For example, a user with limited visibility may not know how large a `metre` is, but may understand sizes in relationship to a common object. 115 | * `color` : The color of the object, which can describe more details such as textural properties. 116 | 117 | ### Language Support 118 | 119 | We also allow expect accessibility to, optionally, be provided in multiple languages. For example, in Canada it is often required to provide equal affordances to English and French users. 
120 | 121 | We suggest using the [Multiple Language Proposal](https://github.com/PixarAnimationStudios/OpenUSD-proposals/blob/main/proposals/language/README.md) to allow specification of the language. 122 | 123 | In keeping with the language proposal, the language purpose would appear after all other purpose tokens. 124 | 125 | For example, 126 | 127 | ```Python 128 | def Mesh "Cube" ( 129 | prepend apiSchemas = ["AccessibilityAPI"] 130 | ) 131 | { 132 | string accessibility:label = "A Cube" 133 | string accessibility:extendedDescription = "This cube is a wonderful looking cube" 134 | token accessibility:importance = "Standard" 135 | 136 | string accessibility:label:lang:fr = "Un cube" 137 | string accessibility:extendedDescription:lang:fr = "Ce cube est un cube magnifique" 138 | string accessibility:extendedDescription:lang:fr_ca = "Ce cube est un cube magnifique canadien" 139 | } 140 | ``` 141 | 142 | #### Options for Language Tokens 143 | 144 | In the language proposal, I offer two options for language purposes. 145 | 146 | 1. A prefix-less version : `:` 147 | 2. A prefixed version : `:lang:` 148 | 149 | It would likely be preferable to choose the second version here. 150 | 151 | However, if we did want to choose the first one, we propose options for the default accessibility attribute to distinguish between the language purpose and an accessibility purpose token. 152 | 153 | 1. Implicit Default Purpose: 154 | - If no accessibility purpose is given, and no language purpose is given, it is implicitly the `default` purpose. 155 | - If the `default` purpose is explicitly authored, it is preferred over the anonymous version. 156 | - If a language purpose is provided, the `default` accessibility purpose must also be explicitly authored. 157 | 2. Explicit Default Purpose: 158 | - The default purpose must always be provided as part of the attribute name regardless of the language being specified. 159 | 160 | My preference is that we require the language purpose always have the `:lang:` prefix to make it clear in all cases. 161 | 162 | 163 | ### Prim Restrictions 164 | 165 | Initially we believed it might be valuable to restrict accessibility attributes to just `model` kinds, but we've since come to realize that users may like it over any object in the hierarchy. 166 | 167 | This would allow users with accessibility needs to perhaps inspect assets within a scene if needed, but also learn about non-gprim types such as Materials or Lights. 168 | 169 | #### Default Prim Recommendation 170 | 171 | Often, USD files are presented to users as a single object even if they consist of many assets. 172 | 173 | We strongly recommend that the scene description be hosted on the default prim in the scene. 174 | 175 | Utilities that want to go further with exposing the USD hierarchy may still choose to expose accessibility on the rest of the prims, but by having the default prim host the primary scene description, we believe many applications can more easily report the information in the majority of contexts. 176 | 177 | This would be analogous to alt-text for an image which describes an entire image, rather than elements within it, though some tools do support more granular descriptions. 178 | 179 | 180 | ### Temporal Changes 181 | 182 | In most accessibility systems, the information is static. 183 | However, in recent times, several attempts have been made at temporal accessibility in mediums such as games or our very own proposals to HLS video streaming for photosensitivity. 
184 | 
185 | As such, we think it is valuable to optionally allow the accessibility attributes to have time samples that vary over time, for example an animation of Luxo, Jr. However, this must never be the primary source of accessibility information, for reasons laid out below.
186 | 
187 | ```Python
188 | def Xform "LogoIntro" (
189 |     prepend apiSchemas = ["AccessibilityAPI"]
190 | )
191 | {
192 |     string accessibility:extendedDescription.timeSamples = {
193 |         54: "The lamp jumps on the ball",
194 |         56: "The lamp rolls back and forth on the ball",
195 |         60: "The ball deflates and the lamp sinks down",
196 |         66: "The lamp moves itself off the squashed ball",
197 |         70: "The lamp tries to revive the ball",
198 |         80: "The lamp looks down, sad, and leaves",
199 |         88: "The lamp returns with a larger ball"
200 |     }
201 | }
202 | ```
203 | 
204 | As with the prim restrictions above, we do not suggest that this be commonly used, but leave the option for people to do so.
205 | 
206 | As most accessibility systems are based around static content, and too much changing information may inundate a user, we strongly recommend that a static default value be provided for accessibility information even when time samples are provided.
207 | 
208 | Assistive tools should not be expected to support temporally changing data, though they may choose to at their discretion.
209 | 
210 | ### Prim Relationships
211 | 
212 | For the initial implementation, we do not allow for rel connections as part of the schema. This is a common request, so that strings may be formatted into other prims' descriptions, or to show semantic relationships between objects. We feel this is too large a problem to keep in scope for the first version.
213 | 
214 | However, we recognize that a prim's hierarchical and binding relationships may be useful to describe to a user.
215 | 
216 | As such, we think that it may be useful in the future to provide utilities to gather accessibility data across the following axes:
217 | 
218 | 1. Prim ancestors : A prim hierarchy may be getting more specific descriptions as it descends the tree. Being able to collect the information across its parents would allow a utility to combine the descriptions downwards.
219 | 2. Prim children : Similarly, it might be valuable to go in the reverse order when asking about information on a prim where the children provide the details to be combined.
220 | 3. Common bindings, like Material and Skeleton, to fetch information on what the object may look like or what the animation might be doing.
221 | 
222 | We do not suggest or require that the traversal be in the initial implementation, but recognize that it may be a good middle ground between explicit semantic relationships specific to accessibility and no relationship support at all.
223 | 
224 | My personal suggestion is to defer these API systems until a later point.
225 | 
226 | ### Other Non-Goals
227 | 
228 | Accessibility is a far-reaching system, and as such we have some explicit non-goals.
229 | 
230 | * We do not prescribe the way the accessibility data should be presented by accessibility tools. There is a wide range of utilities that may be suited for different needs.
231 | * We do not limit the default number of purpose keys, and only provide a default set as suggestions.
232 | 
233 | 
234 | ## Alternate Uses
235 | 
236 | Accessibility information is often very useful for other aspects of a scene. We often find that even users without accessibility needs may benefit from accessibility affordances in systems.
237 | 238 | * This may be useful to natural language searches across a scene 239 | * Machine Learning and AI systems may use the accessibility information to gain understanding of the scene. 240 | 241 | 242 | ## Risks 243 | 244 | Given the scope of this proposal is simply metadata, we do not believe there are significant risks. At worst, systems will just ignore them 245 | 246 | ## Alternates Considered 247 | 248 | We've considered several alternatives but discarded them for the following reasons: 249 | 250 | * Metadata: Metadata like assetInfo etc may have been useful here but we feel there's too many axes of information here to represent in metadata. 251 | * SemanticsLabelsAPI is also a valid accessibility information source, but the target goals are too different. Labels are geared towards data management systems versus human interaction. 252 | 253 | ## Closing Notes 254 | 255 | We believe accessibility is a very important facility to provide, and can help open USD up to a wider audience who may otherwise be unable to experience or use the same things many of us take for granted. 256 | 257 | To the best of my knowledge, there is no existing accessibility standards for 3D content that is authored independent of a target engine. If there is, I would love to see it though. If there isn't , I think this can let us lead by example. 258 | 259 | 260 | Again, I'd leave a disclaimer that this proposal is not an indication of any future work on our part. However, I believe that by having this in the USD ecosystem, it's something that we can take on together. -------------------------------------------------------------------------------- /proposals/PickAPI/README.md: -------------------------------------------------------------------------------- 1 | # PickAPI 2 | Copyright © 2023, NVIDIA Corporation, version 1.0 3 | 4 | ## Overview 5 | This document defines picking as mouse / controller driven interaction with one or more prims through interactive rendering of the scene. Picking encompasses both point-and-click and sweep interactions. Picking is often accelerated through the use of the rendering toolkit, such as a pick buffer or a bounding volume hierarchy. 6 | 7 | In most contexts, picked objects are converted into the application's selection state. 8 | 9 | This document proposes a specification for customized prim picking behavior to benefit the interchange of a variety of scenes and assets in USD. 10 | 11 | ## Motivation 12 | ### Interactive Scenes 13 | A designer of an interactive scene might want to limit picking inside of a room to a set of specific scopes. While objects may not be pickable, the designer may still choose to have them occlude picking of other objects (ie. Walls shouldn't be pickable, but picking shouldn't resolve objects behind walls). 14 | 15 | ### Performance 16 | Asset builders may want to improve selection speed for consumers by removing expensive to render prims from pick buffers. Downstream consumers understand and expect that these objects will both not be pickable and not occlude picking (ie. Rendered fur on a character may be invisible to picking, allowing the underlying surface to be picked). 17 | 18 | ### Retargeting 19 | Asset builders may be aware of important scopes made up of multiple gprims, say a group of prims representing a door. They understand that downstream users picking the door knob are generally trying to rotate the entire door and would benefit from retargeting the pick to the rotatable root prim. 
20 | 21 | ## Proposal 22 | To describe high level, customized primitive pick behavior, provide a `PickAPI` schema in `UsdUI`. This schema describes two key functions [visibility](#visibility) and [retargeting](#retargeting). 23 | 24 | This proposal distinguishes between pick specialization and [application selection modes](#relationship-to-application-selection-modes). Selection modes are user interface options that apply to the entire scene and cannot be practically encoded on every prim in the scene. 25 | 26 | The `PickAPI` should only be applied to `Imageable` prims (ie. `UsdGeom` and `UsdVol` but not `UsdShade`) and the `GeomSubset`s of `Gprim`s. 27 | 28 | ### Visibility 29 | ``` 30 | uniform token pick:visibility = "inherited" (allowedTokens = ["inherited", "invisible"]) 31 | ``` 32 | `pick:visibility` controls whether or not the object should be considered visible with respect to any picking operation. Objects `invisible` to picking should be excluded from any picking calculations (and acceleration structures like BVHs or pick buffers). Prims should only be included in the pick buffer if they are imaged, taking into account `visibility`, `purpose`, and any other viewport render settings. 33 | 34 | Because picking is often accelerated through the rendering toolkit, it's important to describe picking on its terms. 35 | 36 | If the `PickAPI` is applied to multiple sites in a hierarchy, `pick:visibility` has pruning semantics similar to the `visibility` attribute. If a prim is invisible to picking, so are its descendants. 37 | 38 | To make an object _unpickable_ but still participate in calculations as a matte occluder, users must use [retargeting](#retargeting-for-pick-occluders). 39 | 40 | `pick:visibility` may not be animated. 41 | 42 | In this current proposal, sending geometry to a pick buffer that is not actively imaged is currently out of scope. One hypothetical use case would be interactive picking of volumetrics. A user might provide a mesh to be used as the pick target for a dense volumetric cloud. This proposal doesn't provide a path for including non-imageable geometry in picking because the semantics quickly become confusing. It also complicates implementations leveraging the rendering pipeline. 43 | 44 | ### Retargeting 45 | 46 | ``` 47 | uniform token pick:retargeting = "none" (allowedTokens = ["none", "replace"]) 48 | rel pick:targets 49 | ``` 50 | These attributes control whether or not to forward picking to alternative targets. The default behavior (`none`) is to not forward picking. However, if `replace` is selected, all descendant prim selections will be replaced with `pick:targets`. 51 | 52 | If the `PickAPI` is applied to multiple sites in a hierarchy, the retargeting behavior will be defined by the nearest applied ancestor. This is notably different than the pruning semantics of `pick:visibility`. As specified, retargeting could be set to `none` in a descendant, overriding any ancestral retargeting opinions. 53 | 54 | Targets will usually include one or more prims, but may include one or more properties. If a relationship is targeted, standard relationship forwarding should be applied. If an attribute is targeted, that attribute should be selected if meaningful, otherwise the attribute's parent prim. There is no way to use the `PickAPI` to retarget selection to a relationship. 55 | 56 | #### Retargeting for Pick Occluders 57 | If targets is empty, no prims will be selected, and the hierarchy will act as a matte occluder. 
Utilities (say `UsdUIPickAPI(prim).OccludeOnly()`) can be used to explicitly set `pick:visibility` to `inherited`, `pick:retargeting` to `replace`, and `pick:targets` to an empty list.
58 | 
59 | #### Retargeting Descendants
60 | If targets is set to `<.>`, the `PickAPI` will naturally retarget to the prim on which the schema was applied. Utilities (say `UsdUIPickAPI(prim).TargetSelf()`) can be used to explicitly set `pick:visibility` to `inherited`, `pick:retargeting` to `replace`, and `pick:targets` to `<.>`.
61 | 
62 | 
63 | ### Relationship to Application Selection Modes
64 | The proposed schema is aimed at customizing viewport picking, not at implementing workflow-based selection modes.
65 | 
66 | For example, a user may enter various application selection modes:
67 | * Filter picking by schema (ie. make lights not pickable)
68 | * Select the bound materials of the picked hierarchy
69 | * Select the points of the picked object
70 | * Select ancestral component model
71 | 
72 | Implementing all these features as variants with different `PickAPI` settings would result in an explosion of variants. Interest in cross-platform standardization of selection modes should be handled by utilities and specifications external to this schema.
73 | 
74 | Selection modes should in general operate on top of the `PickAPI`. Since pick visibility is pruning and respects all other visibility settings, the order of application doesn't strictly matter. Selection modes that engage in retargeting behavior should clearly document their relationship to the `PickAPI`. This proposal recommends that schema pick retargeting be applied first, followed by the application selection mode (ie. select bound materials of `/picked/prim.pick:targets`, not `/picked/prim`). If there are practical reasons why the schema-defined pick mode needs to be ignored (ie. explicit point sculpting of meshes), the interface or documentation should aim to communicate that the `PickAPI` will intentionally be ignored.
75 | 
76 | Applications that do not support the `PickAPI` and its features should try to issue warnings when they observe its application.
77 | 
78 | Implementers should consider the implications of this API being applied to a wide variety of prims in a hierarchy: assembly or component models, point instancers, gprims, and geom subsets.
79 | 
80 | The `PickAPI` strictly customizes viewport picking behavior. It should not be used to customize selection behavior in scene graph tree views and other widgets. However, widgets and interfaces may choose to reference the `PickAPI` in filtering and display of pickable scopes.
81 | 
82 | ### Subsets
83 | > **NOTE** `GeomSubset` picking is not currently supported in Hydra. This proposed usage is somewhat speculative and subject to revision.
84 | 
85 | Subset support is challenging because there may be multiple subset families, which may or may not be partitioning. This proposal presupposes that the element index is easy to access when picking via ray hit or via a secondary element id pick buffer.
86 | 
87 | The `PickAPI` only makes sense when applied to subsets whose elements have area. Faces have area. Edges do not. Points in the context of the `Mesh` and `BasisCurves` schemas do not. Points in the context of the `Points` schema have width and do have area.
88 | 
89 | The proposal allows the `PickAPI` to be applied to any area based subset and does not require a specific picking partition (though users are free to set one up).
The expectation is that users and applications will likely want to reuse subsets reserved for other purposes (like material binding) for picking.
90 | 
91 | Just as applications may have special selection modes to restrict selection by prim type, it's reasonable for applications to have special selection modes to enable or disable picking of certain subset families separate from the `PickAPI`.
92 | 
93 | #### Visibility
94 | Since we've chosen simple pruning visibility as our idiom, it is easy to reason about behavior. All subsets which have the `PickAPI` applied should be consulted. If an element is invisible in any subset, it's invisible in all.
95 | 
96 | #### Retargeting
97 | Retargeting is more complicated to reason about because of the non-partitioning nature of subsets. An element like a face may be a member of multiple subsets or none. As such, a single point-and-click pick event can actually result in multiple `GeomSubset`s being selected.
98 | 
99 | How retargeting applies to `GeomSubset` picking must be approached from the point of view of the picked element. Map the picked element to all of its subsets that have the `PickAPI` applied with `pick:retargeting` enabled. If no such subset exists, its containing `Gprim` is picked. Otherwise, picking should union all of the `pick:targets` of the subsets with `pick:retargeting` enabled.
100 | 
101 | To make the subset itself pickable, its targets may simply be set to `<.>`. Subsets may retarget to any other prim.
102 | 
103 | Since faces can be elements of multiple subsets, care must be taken to properly set up matte occluders. All subsets that a face is an element of and that have `pick:retargeting` enabled must have their `pick:targets` set to `[]`. One advantage of framing matte occlusion as a retargeting behavior is that it gives the implementation clarity in the unusual scenario where an element resides in multiple subsets with different opinions about matting.
104 | 
105 | Such edge cases are specified only for completeness and to avoid ambiguity. It's expected that users will generally only have a single family of non-overlapping faces using the `PickAPI`.
106 | 
107 | ## Applied Examples
108 | ### Fur (Visibility)
109 | Consider the case where fur curves affect picking of the underlying mesh.
110 | ```
111 | def Xform "Cat" {
112 |     def Mesh "Body" { ... }
113 |     def BasisCurves "Fur" (
114 |         append apiSchemas = "PickAPI"
115 |     ) {
116 |         uniform token pick:visibility = "invisible"
117 |     }
118 | }
119 | ```
120 | 
121 | ### Forest of Trees (Retargeting)
122 | It may be hard to generalize point instancer picking behavior, especially
123 | when considering departmental usage variation and nesting of point instancers.
124 | To ensure that the 'Leaves' point instancer is always picked, this example uses
125 | retargeting.
126 | ```
127 | def Xform "VeryLargeForest" {
128 |     def PointInstancer "OakTrees" {
129 |         rel prototypes = <./OakTree_1>
130 |         def Xform "OakTree_1" {
131 |             def Mesh "Trunk" { ... }
132 |             def PointInstancer "Leaves" (
133 |                 append apiSchemas = "PickAPI"
134 |             ) {
135 |                 token pick:visibility = "inherited"
136 |                 token pick:retargeting = "replace"
137 |                 rel pick:targets = <.>
138 | 
139 |                 rel prototypes = <./Leaf_1>
140 |                 def "Leaf_1" { ... }
141 |             }
142 |         }
143 |     }
144 | }
145 | ```
146 | 
147 | ### Room (Retargeting, Inheritance, and Invisibility)
148 | This example demonstrates custom picking behavior for several scopes.
149 | - Use retargeting to suppress picking of prims 150 | - Overriding retargeting when nesting `PickAPI` application 151 | - Using guide geometry as a target for unpickable prims 152 | 153 | ``` 154 | # The room specifies that by default, prims are visible to picking (if imaged) 155 | # but that they should retarget to an empty list. By default, objects in the 156 | # room act as pick occluders. 157 | def Xform "Room" ( 158 | append apiSchemas = "PickAPI" 159 | ) { 160 | uniform token pick:visibility = "inherited" 161 | uniform token pick:retargeting = "replace" 162 | rel pick:targets = [] 163 | 164 | def Xform "Walls" { 165 | def Mesh "NorthWall" { ... } 166 | def Mesh "SouthWall" { ... } 167 | def Mesh "EastWall" { ... } 168 | def Mesh "WestWall" { ... } 169 | } 170 | 171 | # The door overrides the retargeting behavior so that it and all its 172 | # descendants are pickable and retarget to the rotatable 'DoorGroup' 173 | def Xform "DoorGroup" ( 174 | append apiSchemas = "PickAPI" 175 | ) { 176 | uniform token[] xformOpOrder = ["xformOp:rotateZ"] 177 | double xformOp:rotateZ = 0.0 178 | 179 | uniform token pick:visibility = "inherited" 180 | uniform token pick:retargeting = "replace" 181 | rel pick:targets = <.> 182 | 183 | def Mesh "Door" { ... } 184 | def Mesh "DoorKnob" { ... } 185 | def Mesh "DoorKnocker" { ... } 186 | } 187 | 188 | # The window makes its glass not pickable by default but provides a 189 | # tiny sphere collocated with Glass to be used as a pick target by 190 | # users when guides are enabled. 191 | def Xform "Window" { 192 | def Mesh "Frame" { ... } 193 | def Mesh "Glass" ( 194 | append apiSchemas = "PickAPI" 195 | ) { 196 | uniform token pick:visibility = "invisible" 197 | } 198 | def Sphere "GlassPickTarget" ( 199 | append apiSchemas = "PickAPI" 200 | ) { 201 | uniform token purpose = "guide" 202 | uniform token pick:visibility = "inherited" 203 | uniform token pick:retargeting = "replace" 204 | rel pick:targets = <../Glass> 205 | } 206 | } 207 | } 208 | ``` 209 | ### Terrain Trail (Matting with Subsets) 210 | In this example, to ensure picking follows the trail, the terrain is structured to be a matte occluder (its targets are empty). However, for elements in the `Trail` subset, this behavior is overridden to be self targeting. 211 | ``` 212 | def Mesh "HikingTerrain" (append apiSchemas = "PickAPI") { 213 | uniform pick:retargeting = "replace" 214 | rel pick:targets = [] 215 | 216 | ... 217 | 218 | def GeomSubset "Trail" (append apiSchemas = "PickAPI"){ 219 | uniform pick:retargeting = "replace" 220 | rel pick:targets = <.> 221 | 222 | ... 223 | } 224 | } 225 | ``` 226 | 227 | ### Subset Puzzle 228 | Support for `GeomSubset` prims can result in some interesting scenarios. In this example, the odd numbered faces do not have the `PickAPI` applied. As such, they inherit the "replace with ``" behavior. The even number faces have the `PickAPI` applied but undo the retargeting behavior because the default value is `none`. As such, `` will be picked for these faces. 229 | 230 | This puzzle exists not to demonstrate a user workflow but to demonstrate that the specification provides unambiguous picking behavior even in unusual scenarios. 
231 | ``` 232 | def Xform "Root" (append apiSchemas = "PickAPI"){ 233 | uniform token pick:retargeting = "replace" 234 | rel pick:targets = <.> 235 | def Mesh "Mesh" { 236 | def GeomSubset "even_numbered_faces" ( 237 | append apiSchemas = "PickAPI" 238 | ) {} 239 | def GeomSubset "odd_numbered_faces" () {} 240 | } 241 | } 242 | ``` 243 | 244 | ## Questions 245 | * Should pick visibility and and pick retargeting be two separate APIs? One could imagine tools providing facility for controlling pick visibility but not supporting retargeting. One challenge with this formulation is identifying whether the visibility or retargeting schemas are responsible for describing pick occluders without creating ambiguity. 246 | * Can the PickAPI subsume or clarify the responsibilities of the proxyPrim relationship? Does it conflict at all with that relationship? 247 | * Are the rules for hierarchies with multiple applied PickAPIs consistent and expressive? 248 | * This proposal considered an `append` retargeting mode to accumulate ancestral targets but didn't find a practical example. Relationship forwarding could be used to express this more explicitly in the current formulation. Are there other "retargeting" modes that would be useful? 249 | * This proposal considered explicit `self` and `matte` retargeting modes. They were dropped as they were redundant with certain values for relationships. Are they common enough that they warrant special enumeration? 250 | 251 | * This schema views picking through the lens of authoring static scenes and not on picking as a potential event trigger (say, playback of an animation when a scope is picked). Does this schema complicate event triggers? Does it complement it? Or would this schema simply not be used in those contexts? -------------------------------------------------------------------------------- /proposals/spline-animation/regressive.md: -------------------------------------------------------------------------------- 1 | 2 | # Regressive Splines in USD 3 | 4 | Bezier splines are great for artists, but they have some interesting technical 5 | challenges. One is that they are parametric: rather than y = f(x) as we would 6 | like, they are { x = f(t), y = f(t) }. Among other things, this means they can 7 | easily go backwards in time: 8 | 9 | ![Regressive Example](./regressiveS.png) 10 | 11 | Such curves are non-functions: given a time, there is more than one possible 12 | value. This is intolerable for a system that governs values over time. We call 13 | these _regressive segments_. They are mathematically _non-monotonic_ in the 14 | time dimension. 15 | 16 | Artists probably don't create regressive curves on purpose. But they can 17 | certainly create them by accident. Splines can also be transformed, or 18 | generated programmatically, and this can lead to unusual cases. The bottom line 19 | is that regressive curves are an unavoidable consequence of using Beziers, and 20 | every animation system has to pick a way to deal with them. 21 | 22 | Unfortunately, there has never been a standard for preventing regression. There 23 | are several different popular strategies. This means that, given a regressive 24 | curve, different clients will do wildly different things. This document aims to 25 | illustrate how regression arises, describe some known strategies, and propose a 26 | spline authoring system that can accommodate many of these strategies, resulting 27 | in non-regressive splines that will behave identically for all clients. 
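To make "backwards in time" concrete, here is a small illustrative sketch (plain Python, independent of any USD or Ts API) that evaluates only the time component x(t) of a cubic segment whose knots sit at times 0 and 1. When both tangent endpoints overshoot the interval, the sampled times stop increasing monotonically:

```python
def bezier_time(a, b, t):
    """x(t) for a segment with time control points (0, a, 1 - b, 1).

    a and b are the tangent lengths, in the time dimension, measured from the
    start and end knots of a normalized interval."""
    return 3 * a * t * (1 - t) ** 2 + 3 * (1 - b) * t ** 2 * (1 - t) + t ** 3

# Tangents twice as long as the interval (a = b = 2) produce a regressive segment.
times = [bezier_time(2.0, 2.0, i / 20) for i in range(21)]
print(any(later < earlier for earlier, later in zip(times, times[1:])))  # True
```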
28 | 29 | # When Regression Arises 30 | 31 | When addressing regression, we are only concerned with x(t), not with y(t). 32 | This means we only care about the placement of the control points in the time 33 | dimension. Anything can change in the value dimension, and what happens in the 34 | time dimension won't change at all. This is because x(t) and y(t) are separate 35 | functions. Changing values will squash, stretch, and skew the curve in the 36 | value dimension, but that doesn't matter for the question of regression. 37 | 38 | We can also work in a _normalized interval_, scaled and translated so that the 39 | start knot is at time 0 and the end knot at time 1. This also has no effect on 40 | regression. The end result is that we care only about two numbers: the time 41 | coordinates of the knot tangent endpoints, expressed relative to the normalized 42 | interval. 43 | 44 | If knot tangents don't leave the segment interval, we are guaranteed there is no 45 | regression. This is also intuitive for artists. It's a conservative rule you 46 | can learn: don't let a tangent cross the opposite knot, and you'll never get 47 | regression. 48 | 49 | If both tangents alight exactly at the opposite knot, we get a _single vertical_, 50 | the limit of non-regression. (Throughout this document, we use "vertical" in 51 | the sense of "when graphed with time on the horizontal axis".) 52 | 53 | ![Near vertical](./nearVertical.png) 54 | ![Vertical](./centerVertical.png) 55 | ![Regressive](./regressiveSStandard.png) 56 | 57 | All regressive cases have at least one tangent outside the interval. Is the 58 | converse also true: that all cases with a tangent outside the interval are 59 | regressive? No: there are non-regressive curve shapes that can only be achieved 60 | by placing one of the tangent endpoints outside the interval. This can be 61 | non-regressive if the out-of-bounds tangent isn't too long, and the segment's 62 | other tangent is short enough. These cases are fairly atypical, involving 63 | regions of rapid change, but they are valid functions: 64 | 65 | ![Bold tangent](./boldS.png) 66 | 67 | There's a graph that describes when regression arises: 68 | 69 | ![Regression graph](./tanLens.png) 70 | 71 | - In the green square, we have _contained tangents_, and there is no regression. 72 | 73 | - In the orange ellipse sections, we have _bold tangents_, and there is no 74 | regression. 75 | 76 | - On the red ellipse edge, we have a _single vertical_, and there is no 77 | regression. 78 | 79 | - Everywhere else, we have regression. 80 | 81 | - The ellipse also continues inside the green square, but it isn't meaningful 82 | there. 83 | 84 | We've seen the (1, 1) point on the ellipse above; it's the symmetrical case with 85 | the vertical in the center of the interval. That one is also a corner of the 86 | green box, one of the limits of contained tangents. 87 | 88 | Here are two additional important cases, the (1/3, 4/3) and (4/3, 1/3) ellipse 89 | limits. These are the longest possible non-regressive tangents. They put the 90 | vertical at times 1/9 and 8/9 respectively. 91 | 92 | ![One third, four thirds](./oneThirdFourThird.png) 93 | ![Four thirds, one third](./fourThirdOneThird.png) 94 | 95 | As we move between the above two limits, in the _center_ of the ellipse edge, we 96 | make one tangent longer and the other shorter. As we move between the (4/3, 97 | 1/3) limits and the (1, 0) limits, at the _fringes_ of the ellipse edge, we make 98 | both tangents longer or shorter. 
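The numbers above are easy to check numerically. For time control points (0, a, 1 - b, 1), the time derivative is x'(t) = 3[a(1 - t)^2 + 2(1 - a - b)t(1 - t) + b t^2], and a segment is regressive exactly when that quadratic dips below zero inside the interval. A small sketch (plain Python, independent of the proposed Ts API) confirms that the (1, 1), (1/3, 4/3), and (4/3, 1/3) cases sit right at the non-regressive limit, with verticals at times 1/2, 1/9, and 8/9:

```python
def x_of_t(a, b, t):
    # Time component of the segment with time control points (0, a, 1 - b, 1).
    return 3 * a * t * (1 - t) ** 2 + 3 * (1 - b) * t ** 2 * (1 - t) + t ** 3

def x_speed(a, b, t):
    # x'(t) / 3: negative anywhere in [0, 1] means the segment is regressive.
    return a * (1 - t) ** 2 + 2 * (1 - a - b) * t * (1 - t) + b * t ** 2

for a, b in [(1.0, 1.0), (1 / 3, 4 / 3), (4 / 3, 1 / 3)]:
    ts = [i / 100000 for i in range(100001)]
    t_min = min(ts, key=lambda t: x_speed(a, b, t))
    print(round(min(x_speed(a, b, t) for t in ts), 6),  # ~0: right at the limit
          round(x_of_t(a, b, t_min), 3))                # 0.5, 0.111, 0.889
```

The minimum of the quadratic being (numerically) zero in each case is what it means for these tangent lengths to lie on the edge of the ellipse.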
99 | 100 | At (0, 1) and (1, 0), we get a vertical at either endpoint. These are also at 101 | corners of the green square, and limits of contained tangents. 102 | 103 | ![Vertical at start](./startVert.png) 104 | ![Vertical at end](./endVert.png) 105 | 106 | # Anti-Regression Strategies 107 | 108 | First, most (all?) spline animation systems observe one important restriction, 109 | which Ts will enforce as well: tangents may never face backwards, out of their 110 | segment's interval. Such a tangent will be ignored at runtime, as though it 111 | were zero-length. This prevents some of the worst cases. But, even when forced 112 | to face into their segments, tangents can cause regression by being too long. 113 | 114 | To deal with long tangents, here are some strategies that we know of. We 115 | propose to offer all of them in USD Anim. 116 | 117 | **Single-Tangent Strategies:** shorten each tangent in isolation. 118 | 119 | - **Contain:** Forbid tangents from crossing neighboring knots in the time 120 | dimension. This is overly conservative: it forbids bold tangents, which is 121 | both a slight creative limitation and a possible point of incompatibility. 122 | But the system is simple and intuitive, and it is used in some popular 123 | packages, including Houdini. 124 | 125 | **Dual-Tangent Strategies:** for each spline segment, consider both tangents, 126 | and shorten one or both. 127 | 128 | - **Keep Start:** When tangents are long enough to cause regression, keep the 129 | start tangent the same, and shorten the end tangent until the non-regressive 130 | limit is reached, with a single vertical. This is what Maya does, at both 131 | authoring time and evaluation time. It's asymmetrical, always favoring the 132 | start tangent, a bias that often pushes the adjusted curve far to the right of 133 | the original regressive Bezier. On the ellipse graph above, Maya finds the 134 | vertical line through the original point in tangent-length space, and takes 135 | the nearest intersection with the ellipse. If the start tangent length is 136 | greater than 4/3, Maya alters both tangent lengths, always to (4/3, 1/3). The 137 | latter case is illustrated by this particular segment: 138 | 139 | ![Keep Start](./keepStart.png) 140 | 141 | - **Keep Ratio:** When tangents are long enough to cause regression, shorten 142 | both of them, until the non-regressive limit is reached, with a single 143 | vertical. In so doing, preserve the ratio of the original tangent lengths. 144 | We believe this is a novel strategy. It is similar to Maya's strategy, but 145 | tends to produce curves that more closely match the original regressive 146 | Bezier. On the ellipse graph above, this strategy finds the line from the 147 | origin to the original point, and takes the intersection of that line with the 148 | outer edge of the ellipse. To avoid sharp bends at or near the endpoints, our 149 | prototype implementation clamps the ratio to [1/4, 4], thus always using the 150 | center portion of the ellipse edge, and keeping the vertical between 1/9 and 151 | 8/9. 152 | 153 | ![Keep Ratio](./keepRatio.png) 154 | 155 | **Interactive Strategies:** as knots are interactively edited, clamp tangent 156 | lengths. Operate on both segments adjacent to the knot being edited. Handle 157 | edits to knot time, and edits to tangent length. 
These strategies only work 158 | during interactive edits, because they require differentiating between the knot 159 | being edited (the "active" knot) and the other knot in the segment (the 160 | "opposite" knot). 161 | 162 | - **Limit Active:** Clamp the active tangent to the non-regressive limit, given 163 | the existing length of the opposite tangent. 164 | 165 | - **Limit Opposite:** As the non-regressive limit is exceeded, shorten the 166 | opposite tangent to barely maintain non-regression. When the (4/3, 1/3) limit 167 | is reached, clamp both tangents there. 168 | 169 | # Authoring-Time Anti-Regression 170 | 171 | Regression can be prevented at authoring time (regressive curves are never 172 | created), or at evaluation time (regressive curves are adjusted just before 173 | evaluation). 174 | 175 | We propose that USD focus on **authoring-time prevention**, shortening 176 | regressive tangents before they are stored in a spline. This is because: 177 | 178 | - What users see at authoring time should match what they get at runtime. 179 | 180 | - There are some very different strategies in use in various DCCs. 181 | 182 | - At authoring time, we can offer lots of flexibility, supporting any popular 183 | anti-regression strategy. 184 | 185 | - If regressive splines are never written to USD files, imports to all DCCs will 186 | look the same. 187 | 188 | - (There is a slight asterisk regarding "bold tangents", explained below, but 189 | still mostly true.) 190 | 191 | - Authoring-time flexibility means we can have runtime simplicity, with only a 192 | single fixed strategy. 193 | 194 | Note that "authoring" here may mean interactive editing, or programmatic 195 | editing. 196 | 197 | # Proposed Default: Keep Ratio 198 | 199 | We propose that anti-regression authoring behavior be enabled by default, and 200 | that the default strategy be **Keep Ratio**, because: 201 | 202 | - It permits all non-regressive cases, including bold tangents. This is an 203 | important point of compatibility. 204 | 205 | - It acts symmetrically, approximating the original regressive Bezier fairly 206 | closely given the circumstances. 207 | 208 | This default can of course be changed. 209 | 210 | # Evaluation-Time Anti-Regression 211 | 212 | It is always possible that USD content will be generated without anti-regression 213 | enabled, or even by directly writing `usda` content without the API. Thus, we 214 | must still have an anti-regression strategy at runtime. 215 | 216 | We propose that runtime evaluation use a fixed **Keep Ratio** strategy. This is 217 | for the same reasons that we propose that Keep Ratio be the authoring default, 218 | and also so that the authoring default matches the runtime behavior. 219 | 220 | # Proposed API 221 | 222 | ## Authoring Mode Setting 223 | 224 | Allow clients to control the mode used for authoring-time anti-regression 225 | behavior. Support all non-interactive strategies: Contain, Keep Start, and Keep 226 | Ratio. Also allow authoring-time anti-regression to be disabled with a mode 227 | called None. Allow the mode to be specified in the following ways: 228 | 229 | - By default, globally enable Keep Ratio. 230 | 231 | - Allow clients to change the global default. This will be useful for clients 232 | that use only one strategy, or offer users the option to choose a strategy. 233 | 234 | - Provide an RAII object that locally overrides the global default for all 235 | anti-regression authoring behaviors. 
236 | 237 | ```c++ 238 | enum TsAntiRegressionMode 239 | { 240 | TsAntiRegressionNone, 241 | TsAntiRegressionContain, 242 | TsAntiRegressionKeepStart, 243 | TsAntiRegressionKeepRatio 244 | } 245 | 246 | class TsSpline 247 | { 248 | // Returns the current effective anti-regression authoring mode. 249 | // This may come from the hard-coded default, 250 | // from SetDefaultAntiRegressionAuthoringMode, 251 | // or from an AntiRegressionAuthoringSelector. 252 | // 253 | static TsAntiRegressionMode GetAntiRegressionAuthoringMode(); 254 | 255 | // Set the global default anti-regression authoring mode. Thread-safe. 256 | // 257 | static void SetDefaultAntiRegressionAuthoringMode( 258 | TsAntiRegressionMode mode); 259 | 260 | // RAII helper class that locally sets the anti-regression authoring mode. 261 | // The effect lasts as long as the object exists. 262 | // The effect is limited to the calling thread. 263 | // Multiple instances on the same thread will stack. 264 | // 265 | class AntiRegressionAuthoringSelector 266 | { 267 | AntiRegressionAuthoringSelector( 268 | TsAntiRegressionMode mode); 269 | ~AntiRegressionSelector(); 270 | } 271 | } 272 | ``` 273 | 274 | ## Edit Limiting 275 | 276 | The only two methods of `TsSpline` that can introduce regression are `SetKnot` 277 | and `SwapKnots.` Both of these will call `GetAntiRegressionAuthoringMode`, then 278 | apply that mode to the newly set knots. 279 | 280 | ## Interactive Limiting 281 | 282 | Support DCCs that want to illustrate limiting in real time as users manipulate 283 | knots. Support all strategies. 284 | 285 | ```c++ 286 | // Construct an instance of this class each time a knot is beginning a round of 287 | // interactive edits. Call Set for each change. 288 | // 289 | // Limits lengths of tangents to prevent regression. Operates on the segments 290 | // preceding and following the active knot, if they exist. 291 | // 292 | class TsRegressionPreventer 293 | { 294 | // Anti-regression modes that can only be used with this class. 295 | // 296 | enum InteractiveMode 297 | { 298 | ModeLimitActive, 299 | ModeLimitOpposite 300 | } 301 | 302 | // Construct a Preventer to edit a knot in the given spline. The mode will 303 | // be as returned by TsSpline::GetAntiRegressionAuthoringMode. If 'limit' 304 | // is true, adjustments will be enforced before knots are written to the 305 | // spline. Otherwise, knots will be written without adjustment, but the 306 | // SetResult will describe the adjustments that would be made. The spline 307 | // must remain valid for the lifetime of this object. 308 | // 309 | TsRegressionPreventer( 310 | TsSpline *spline, 311 | TsTime activeKnotTime, 312 | bool limit = true); 313 | 314 | // Construct a Preventer as above, but using one of the interactive modes 315 | // that are only available to this class. 316 | // 317 | TsRegressionPreventer( 318 | TsSpline *spline, 319 | TsTime activeKnotTime, 320 | InteractiveMode mode, 321 | bool limit = true); 322 | 323 | // Set an edited version of the active knot into the spline, adjusting 324 | // tangent widths if needed, based on the mode. Callers may change any 325 | // aspect of the active knot; the aspects that affect regression are knot 326 | // time and tangent widths. SetResult is a struct detailing what 327 | // adjustments were made. 328 | // 329 | bool Set( 330 | const TsKnot &proposedActiveKnot, 331 | SetResult *resultOut = nullptr); 332 | } 333 | ``` 334 | 335 | Interactive and edit limiting can be combined. 
This allows real-time limiting, 336 | but still provides a backstop for operations that edit splines in ways other 337 | than single-knot dragging. One sensible choice would be to use the same 338 | strategy for both. Another would be to use Limit Active or Limit Opposite 339 | interactively, and Keep Start or Keep Ratio for edit limiting. 340 | 341 | ## Querying 342 | 343 | Provide functions that reveal whether any regression is present in an individual 344 | spline, an entire layer, or an entire stage. 345 | 346 | ```c++ 347 | class TsSpline 348 | { 349 | bool HasRegressiveTangents() const; 350 | } 351 | 352 | bool UsdUtilsDoesLayerHaveRegressiveSplines( 353 | const SdfLayerHandle &layer); 354 | 355 | bool UsdUtilsDoesStageHaveRegressiveSplines( 356 | const UsdStagePtr &stage); 357 | ``` 358 | 359 | ## Bulk Limiting 360 | 361 | Allow individual splines, entire layers, and entire stages to be de-regressed. 362 | This supports cases where clients are importing content that may have been 363 | generated without API guardrails. It may also be useful for applications that 364 | use the Contain strategy, and are importing content that may include bold 365 | tangents, which Contain does not allow. 366 | 367 | ```c++ 368 | class TsSpline 369 | { 370 | // Shorten any regressive tangents. 371 | // The mode will be as returned by GetAntiRegressionAuthoringMode. 372 | // Return whether anything was changed. 373 | // 374 | bool AdjustRegressiveTangents(); 375 | } 376 | 377 | // Shorten any regressive tangents. 378 | // The mode will be as returned by TsSpline::GetAntiRegressionAuthoringMode. 379 | // Return whether anything was changed. 380 | // 381 | bool UsdUtilsAdjustRegressiveSplinesInLayer( 382 | const SdfLayerHandle &layer); 383 | 384 | // Shorten any regressive tangents. 385 | // The mode will be as returned by TsSpline::GetAntiRegressionAuthoringMode. 386 | // Return whether anything was changed. 387 | // 388 | bool UsdUtilsAdjustRegressiveSplinesOnStage( 389 | const UsdStagePtr &stage, 390 | const UsdEditTarget &editTarget = UsdEditTarget()); 391 | ``` 392 | --------------------------------------------------------------------------------