├── .reuse └── dep5 ├── AddingMaterialExtensions ├── AddingMaterialExtensions_001_WhyAddExtensions.md ├── AddingMaterialExtensions_002_UsingVisualStudioCode.md ├── AddingMaterialExtensions_003_TransmissionAndVolume.md ├── AddingMaterialExtensions_004_UsingARaytracer.md ├── AddingMaterialExtensions_005_TransmissionLimitations.md ├── README.md ├── images │ ├── image1.jpg │ ├── image10.png │ ├── image11.png │ ├── image12.png │ ├── image13.png │ ├── image14.png │ ├── image15.png │ ├── image16.png │ ├── image17.png │ ├── image18.png │ ├── image19.jpg │ ├── image2.jpg │ ├── image20.jpg │ ├── image21.jpg │ ├── image22.jpg │ ├── image23.png │ ├── image24.png │ ├── image25.png │ ├── image26.png │ ├── image27.png │ ├── image29.jpg │ ├── image3.jpg │ ├── image30.jpg │ ├── image35.png │ ├── image4.jpg │ ├── image5.png │ ├── image6.png │ ├── image7.png │ ├── image8.jpg │ └── image9.png └── samples │ ├── GlassHurricaneCandleHolder.bin │ ├── GlassHurricaneCandleHolder.gltf │ ├── GlassHurricaneCandleHolder_Green.glb │ ├── GlassHurricaneCandleHolder_Green.gltf │ ├── GlassHurricaneCandleHolder_Green_basecolor.png │ ├── GlassHurricaneCandleHolder_Green_orm.png │ ├── GlassHurricaneCandleHolder_basecolor.png │ ├── GlassHurricaneCandleHolder_orm.png │ ├── GlassHurricaneCandleHolder_thickness.png │ └── SciFiHelmet_SpecularOcclusion.glb ├── BlenderGltfConverter ├── README.md └── blender_gltf_converter.py ├── CODE_OF_CONDUCT.md ├── LICENSE ├── LICENSES └── CC-BY-4.0.txt ├── PBR ├── README.md └── figures │ ├── BRDFs.png │ ├── BTDFs.png │ ├── Fresnel_Conductor.JPG │ ├── Fresnel_Dielectric.JPG │ ├── Interreflection.jpg │ ├── Masking.png │ ├── Shadowing.png │ └── Snells_Law.JPG ├── README.md └── gltfTutorial ├── README.md ├── gltfTutorial_001_Introduction.md ├── gltfTutorial_002_BasicGltfStructure.md ├── gltfTutorial_003_MinimalGltfFile.md ├── gltfTutorial_004_ScenesNodes.md ├── gltfTutorial_005_BuffersBufferViewsAccessors.md ├── gltfTutorial_006_SimpleAnimation.md ├── gltfTutorial_007_Animations.md ├── gltfTutorial_008_SimpleMeshes.md ├── gltfTutorial_009_Meshes.md ├── gltfTutorial_010_Materials.md ├── gltfTutorial_011_SimpleMaterial.md ├── gltfTutorial_012_TexturesImagesSamplers.md ├── gltfTutorial_013_SimpleTexture.md ├── gltfTutorial_014_AdvancedMaterial.md ├── gltfTutorial_015_SimpleCameras.md ├── gltfTutorial_016_Cameras.md ├── gltfTutorial_017_SimpleMorphTarget.md ├── gltfTutorial_018_MorphTargets.md ├── gltfTutorial_019_SimpleSkin.md ├── gltfTutorial_020_Skins.md └── images ├── advancedMaterial_emissive.png ├── advancedMaterial_metallic.png ├── advancedMaterial_normal.png ├── advancedMaterial_roughness.png ├── animatedTriangle.gif ├── animationChannels.png ├── animationChannels.svg ├── animationSamplers.png ├── animationSamplers.svg ├── aos.png ├── aos.svg ├── applications.png ├── applications.svg ├── buffer.png ├── buffer.svg ├── bufferBufferView.png ├── bufferBufferView.svg ├── bufferBufferViewAccessor.png ├── bufferBufferViewAccessor.svg ├── cameras.png ├── contentPipeline.png ├── contentPipeline.svg ├── contentPipelineWithGltf.png ├── contentPipelineWithGltf.svg ├── createPng.bat ├── createPngs.bat ├── gltfJsonStructure.png ├── gltfJsonStructure.svg ├── gltfStructure.png ├── gltfStructure.svg ├── materials.png ├── matrix.png ├── matrix.tex ├── meshPrimitiveAttributes.png ├── meshPrimitiveAttributes.svg ├── metallicRoughnessSpheres.png ├── productMatrix.png ├── productMatrix.tex ├── rotationMatrix.png ├── rotationMatrix.tex ├── scaleMatrix.png ├── scaleMatrix.tex ├── sceneGraph.png ├── sceneGraph.svg ├── 
simpleMaterial.png ├── simpleMeshes.png ├── simpleMorph.png ├── simpleMorphInitial.png ├── simpleMorphInitial.svg ├── simpleMorphIntermediate.png ├── simpleMorphIntermediate.svg ├── simpleSkin.gif ├── simpleSkinOutline01.png ├── simpleSkinOutline02.png ├── simpleSparseAccessor.png ├── simpleSparseAccessorDescription.png ├── simpleTexture.png ├── skinInverseBindMatrix.png ├── skinInverseBindMatrix.svg ├── skinJointMatrices.png ├── skinJointMatrices.svg ├── skinSkinMatrix.png ├── skinSkinMatrix.svg ├── testTexture.png ├── translationMatrix.png ├── translationMatrix.tex ├── triangle.png └── triangleWithSimpleMaterial.png /.reuse/dep5: -------------------------------------------------------------------------------- 1 | Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ 2 | Upstream-Name: glTF-Tutorials 3 | Source: https://github.com/KhronosGroup/glTF-Tutorials 4 | 5 | Files: * 6 | Copyright: Copyright 2020 The Khronos Group Inc. 7 | License: CC-BY-4.0 8 | -------------------------------------------------------------------------------- /AddingMaterialExtensions/AddingMaterialExtensions_001_WhyAddExtensions.md: -------------------------------------------------------------------------------- 1 | [Table of Contents](README.md) | Next: [Using Visual Studio Code](AddingMaterialExtensions_002_UsingVisualStudioCode.md) 2 | 3 | # Why add Extensions? 4 | 5 | Glass and other transparent surfaces have historically been difficult to reproduce in glTF. Lately however the new extensions for transmission and volume have become available, adding significant rendering improvements for refraction and absorption. 6 | 7 | Until now, Alpha Coverage was the method for rendering transparent surfaces, with some significant drawbacks. It is used to control the overall visibility of a surface, so it can’t recreate clear glass or liquids. The surface reflectivity is reduced as the surface becomes more transparent. Black pixels will completely hide the surface, grays will cause partial visibility, and white will keep the surface fully visible. Alpha Coverage is useful for materials like leaves mapped onto cards, burlap with gaps between the fibers, wicker with gaps, etc. but it’s not recommended for glass or liquids. 8 | 9 | ![Image of glass vase using Alpha Coverage](images/image1.jpg "Image of glass vase using Alpha Coverage") ![Image of glass vase using Transmission](images/image2.jpg "Image of glass vase using Transmission") 10 | 11 | _Alpha Coverage for glass (left) vs. Transmission & Volume (right). On the left, notice the glass looks foggy instead of clear, reflections are nearly invisible, and there are no refractions. Transmission on the right fixes these issues._ 12 | 13 | ![Image of colored glass vase using Alpha Coverage](images/image3.jpg "Image of colored glass vase using Alpha Coverage") ![Image of colored glass vase using Transmission and Volume](images/image4.jpg "Image of colored glass vase using Transmission and Volume") 14 | 15 | _Alpha Coverage for colored glass (left) vs. Transmission & Volume (right). On the left notice the inside is rendering on top of the outside, the glass color is weaker, and there are no refractions. 
Transmission on the right fixes these issues, and Volume shows absorption in the thicker glass at the bottom._ 16 | 17 | 18 | [Table of Contents](README.md) | Next: [Using Visual Studio Code](AddingMaterialExtensions_002_UsingVisualStudioCode.md) -------------------------------------------------------------------------------- /AddingMaterialExtensions/AddingMaterialExtensions_002_UsingVisualStudioCode.md: -------------------------------------------------------------------------------- 1 | Previous: [Why add Extensions?](AddingMaterialExtensions_001_WhyAddExtensions.md) | [Table of Contents](README.md) | Next: [KHR_materials_transmission and KHR_materials_volume](AddingMaterialExtensions_003_TransmissionAndVolume.md) 2 | 3 | # Using Visual Studio Code 4 | 5 | While any text editor can be used to modify glTF files, Visual Studio Code has a great toolkit for visual editing, proof-reading, and converting glTF formats. 6 | 7 | Go to the releases section [https://code.visualstudio.com/download](https://code.visualstudio.com/download) to download the appropriate installer, then install it. 8 | 9 | Add the extension glTF Tools. This includes many useful functions for editing glTF models, including a live debugger, a 3D preview window, conversion between glTF and GLB, etc. To add the extension, open Visual Studio Code then go to the Extensions tab and search for _gltf_ 10 | 11 | ![Screenshot of glTF Tools extension](images/image5.png "Screenshot of glTF Tools extension") 12 | 13 | 14 | The extension has several great tools for working with glTF models. It dynamically checks for syntax errors or badly-formatted textures, with helpful popups suggesting potential solutions. 15 | 16 | More information [about glTF Tools here](https://github.com/AnalyticalGraphicsInc/gltf-vscode). 17 | 18 | ## glTF Formats 19 | 20 | When adding extensions, it’s easier to use the text-based .glTF format, as this will open readily in a text editor. 21 | 22 | The binary .GLB format can also be used. It is typically smaller and self-contained with textures and mesh data all together inside one file. This makes the model more portable, but is not as easy to edit. 23 | 24 | 25 | ## Opening glTF in Visual Studio Code 26 | 27 | A binary .GLB will need to be converted into a text-format .glTF file for editing. The glTF Tools extension has a tool for this. The conversion is usually lossless, and once editing is complete the model can be converted back into a self-contained GLB. 28 | 29 | Right-click on the tab, and choose _glTF: Import from GLB_: 30 | 31 | ![screenshot of glTF: Import from GLB](images/image6.png "screenshot of glTF: Import from GLB") 32 | 33 | If the model is already in text-based .glTF format, it can be opened without conversion. 34 | 35 | If the glTF is all on one line it can be converted into a more readable format. Right-click on the first line and choose _Format Document_: 36 | 37 | ![screenshot of Format Document](images/image7.png "screenshot of Format Document") 38 | 39 | ## glTF 3D Preview 40 | 41 | Use the glTF Preview (Alt+G) to see the model. 42 | 43 | ![screenshot of glTF Preview](images/image8.jpg "screenshot of glTF Preview") 44 | 45 | To update the view, save the .glTF (Ctrl+S). Similarly, if the textures need to be edited the glTF Preview will automatically update as soon as they are saved. 46 | 47 | Several renderers are supported. 
Babylon.js has a built-in editor which can be opened using the button _glTF: Enable Debug Mode_ at top right: 48 | 49 | ![screenshot of Enable Debug Mode](images/image9.png "screenshot of Enable Debug Mode") 50 | 51 | The Babylon.js debug editor does not save back into the glTF file from within Visual Studio Code, but it’s great for quickly testing various glTF features. 52 | 53 | ## Custom IBLs in Visual Studio Code 54 | 55 | Image-based lighting (IBL) environments can be customized for the 3D viewers. 56 | 57 | Go to the Extensions section at left and click on glTF Tools. A new tab will open for glTF Tools. Click on the _Manage_ (gear) icon, and choose _Extension Settings._ 58 | 59 | ![screenshot of Settings button for glTF Tools](images/image10.png "screenshot of Settings button for glTF Tools") 60 | 61 | The path here can be edited to point to a custom IBL: 62 | 63 | ![screenshot of Babylon Environment setting](images/image11.png "screenshot of Babylon Environment setting") 64 | 65 | The IBL for Babylon.js must use the .env format, which can be created in the Babylon.js Sandbox. 66 | 67 | 1. Find a suitable HDR panorama. HDRs can be downloaded from various sources, for example [https://polyhaven.com/a/artist_workshop](https://polyhaven.com/a/artist_workshop). 68 | 2. Go to [https://sandbox.babylonjs.com/](https://sandbox.babylonjs.com/) 69 | 3. Load any glTF model. 70 | 4. Drag-and-drop the HDR. 71 | 5. Click on the Display Inspector button at bottom right: 72 | 73 | ![screenshot of Display Inspector button in Babylon.js](images/image12.png "screenshot of Display Inspector button in Babylon.js") 74 | 75 | 6. Go to the Tools tab, open the SCENE EXPORT section, and click on _Generate .env texture_: 76 | 77 | ![screenshot of Generate .env Texture button in Babylon.js](images/image13.png "screenshot of Generate .env Texture button in Babylon.js") 78 | 79 | It is a good idea to put the .env in a custom location, because the default folder may be overwritten whenever glTF Tools is updated. For example: 80 | 81 | ![screenshot of Babylon Environment setting with custom file](images/image14.png "screenshot of Babylon Environment setting with custom file") 82 | 83 | The viewers three.js and Filament each use their own IBL file formats; the details can be found in the glTF Tools settings. 84 | 85 | To see the new custom IBL, close and restart the 3D viewer. 86 | 87 | ## Export to GLB 88 | 89 | Once editing is complete, Visual Studio Code can be used to pack the model back into a single binary GLB file. This is not strictly necessary, but can make it easier to distribute the model. 90 | 91 | Make sure to close the 3D preview window first. 
Then right-click on the tab and choose _glTF: Export to GLB (Binary file)_: 92 | 93 | ![screenshot of Export to GLB](images/image35.png "screenshot of Export to GLB") 94 | 95 | 96 | 97 | Previous: [Why add Extensions?](AddingMaterialExtensions_001_WhyAddExtensions.md) | [Table of Contents](README.md) | Next: [KHR_materials_transmission and KHR_materials_volume](AddingMaterialExtensions_003_TransmissionAndVolume.md) -------------------------------------------------------------------------------- /AddingMaterialExtensions/AddingMaterialExtensions_003_TransmissionAndVolume.md: -------------------------------------------------------------------------------- 1 | Previous: [Using Visual Studio Code](AddingMaterialExtensions_002_UsingVisualStudioCode.md) | [Table of Contents](README.md) | Next: [Using a Raytracer](AddingMaterialExtensions_004_UsingARaytracer.md) 2 | 3 | # KHR_materials_transmission and KHR_materials_volume 4 | 5 | Visual Studio Code can be used to add extensions to the glTF model. 6 | 7 | Add a new _“extensionsUsed”_ section after the _“asset”_ section. If it exists already, don’t add another one; just add the extensions inside the existing one. 8 | 9 | ``` 10 | "extensionsUsed": [ 11 | "KHR_materials_transmission", 12 | "KHR_materials_volume" 13 | ], 14 | ``` 15 | 16 | In this example, two extensions are being added at the same time: Transmission and Volume. Transmission on its own produces thin-walled refraction, as if the surface is infinitely thin and does not bend light. The Volume extension defines a depth for the surface, so light can be bent via refraction, and/or colored via absorption. 17 | 18 | Before: 19 | 20 | ![screenshot of code snippet before editing](images/image15.png "screenshot of code snippet before editing") 21 | 22 | After: 23 | 24 | ![screenshot of code snippet with extensionsUsed addition highlighted](images/image16.png "screenshot of code snippet with extensionsUsed addition highlighted") 25 | 26 | This glTF file was generated using the Max2Babylon glTF exporter for 3ds Max. However, any other glTF generator can be used. 27 | 28 | Go to the section for "materials" to find the material for the model, and add a new section for _“extensions”_: 29 | 30 | ``` 31 | "extensions": { 32 | "KHR_materials_transmission": { 33 | "transmissionFactor": 1 34 | }, 35 | "KHR_materials_volume": { 36 | "thicknessFactor": 0.01 37 | } 38 | }, 39 | ``` 40 | 41 | Before: 42 | 43 | ![screenshot of code snippet before editing](images/image17.png "screenshot of code snippet before editing") 44 | 45 | After: 46 | 47 | ![screenshot of code snippet with extensions addition highlighted](images/image18.png "screenshot of code snippet with extensions addition highlighted") 48 | 49 | ## thicknessFactor and thicknessTexture 50 | 51 | The thicknessFactor controls how thick the glass/plastic/liquid is, in meters. Refraction bending and light absorption will not occur if this is not provided. 52 | 53 | Thickness is in meters, but since refraction may not be rendered with complete accuracy, it is fine to adjust the thickness until it looks right. A word of warning though: going beyond physical values may cause unwanted side effects, so it’s usually best to stick as close to accurate values as possible. 54 | 55 | For “thin wall” transmission, set thicknessFactor to zero. This is useful when there’s a very thin transmission surface, for example a lightbulb made of clear glass. If the glass were set to the thickness of the whole bulb, it would refract like a solid glass paperweight. 
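For example, a thin-wall version of the extensions block shown above could keep full transmission but drop the thickness to zero (a sketch with illustrative values, not taken from the sample files):

```
"extensions": {
    "KHR_materials_transmission": {
        "transmissionFactor": 1
    },
    "KHR_materials_volume": {
        "thicknessFactor": 0
    }
},
```

With thicknessFactor at zero, the surface refracts as an infinitely thin shell; raising it back to a physically plausible value restores the solid-glass behavior described above.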
56 | 57 | A thicknessTexture is optional. This is a grayscale texture that gives a hint to rasterizers about how the thickness varies across the model; for example, the stem under a wine glass is thicker than the sides, so it will refract and absorb more. This texture can be “baked” from the model. 58 | 59 | Several tools can be used to bake a thickness texture: 60 | 61 | * 3ds Max - [Render Surface Map: SubSurface Map](https://help.autodesk.com/view/3DSMAX/2022/ENU/?guid=GUID-08738349-9267-4D9C-986F-C6198E9AA900) 62 | * Blender - [Render Bake](https://www.blendernation.com/2018/09/12/baking-thickness-maps-in-blender-2-8/) 63 | * Gestaltor - [Bake Thickness Map](https://docs.gestaltor.io/#create-a-volume-effect-using-a-thickness-map) 64 | * Marmoset Toolbag - [Baker](https://marmoset.co/posts/toolbag-baking-tutorial/#maptypes) 65 | * Substance 3D Painter - [Thickness Baker](https://substance3d.adobe.com/documentation/spdoc/thickness-142213479.html) 66 | * etc. 67 | 68 | ![screenshot of thicknessTexture for the glass vase](images/image19.jpg "screenshot of thicknessTexture for the glass vase") ![screenshot of glass vase rendered](images/image20.jpg "screenshot of glass vase rendered") 69 | 70 | _Thickness texture (left) vs. rendering with Transmission & Volume (right). Absorption can be seen in the thicker glass at the bottom, which is informed by the thickness texture._ 71 | 72 | The thicknessTexture may need to be edited after baking to improve the visual result. White in the thicknessTexture will use the thicknessFactor value; this is the thickest part of the surface. Black applies zero thickness. If there are black areas in the texture, they may need to be raised to a dark gray to produce suitable refraction. 73 | 74 | The thicknessTexture may also need to be blurred to simulate a smoother mesh surface. If the thicknessTexture is baked from a low-resolution model, it may have facets or hard edges. Blurring the image can reduce these artifacts, even if it is not strictly accurate. 75 | 76 | ![image of raw thicknessTexture with artifacts](images/image21.jpg "image of raw thicknessTexture with artifacts") ![image of fixed thicknessTexture](images/image22.jpg "image of fixed thicknessTexture") 77 | 78 | _A raw thickness bake (left) shows some artifacts: the gradient is reversed; instead, the bottom should be brighter since it’s thicker. The gradient also shows hard edges, which can be fixed by blurring the bake. The thicknessTexture after being fixed (right)._ 79 | 80 | ## Adding Textures for Transmission and Volume 81 | 82 | If a texture is needed for transmission or thickness, these can be added in Visual Studio Code. 83 | 84 | Go to the sections for “textures” and “images” to add the new textures. Copy the code for one of the existing textures, and edit the “name” and increment the “source”: 85 | 86 | Before: 87 | 88 | ![screenshot of code snippet before editing](images/image23.png "screenshot of code snippet before editing") 89 | 90 | After: 91 | 92 | ![screenshot of code snippet with new texture highlighted](images/image24.png "screenshot of code snippet with new texture highlighted") 93 | 94 | Go to the section for “materials”, find the suitable material, and add the new texture. Set the “index” to match the element in the “textures” section. Note that elements are zero-based, so in this case the third texture element is actually index 2. 
95 | 96 | ``` 97 | , 98 | "thicknessTexture": { 99 | "index": 2 100 | } 101 | ``` 102 | 103 | Before: 104 | 105 | ![screenshot of code snippet of material](images/image25.png "screenshot of code snippet of material") 106 | 107 | After: 108 | 109 | ![screenshot of code snippet with thicknessTexture highlighted](images/image26.png "screenshot of code snippet with thicknessTexture highlighted") 110 | 111 | Use the 3D preview (Alt+G) to test the material appearance, and adjust as needed. 112 | 113 | ## attenuationColor and attenuationDistance 114 | 115 | The attenuationColor and attenuationDistance help the 3D viewer to simulate how a thick tinted glass or liquid can absorb light the further it goes into the volume. For example a cup made of colored glass is often darker in the base where the glass is thicker. 116 | 117 | ![screenshot of code snippet of finished material](images/image27.png "screenshot of code snippet") ![screenshot of finished glass vase](images/image20.jpg "screenshot of finished glass vase") 118 | 119 | When using attenuation, it may be beneficial to change the baseColor to white, and put all the color into attenuationColor. This helps improve the contrast for the absorption because it’s not fighting with the baseColor. 120 | 121 | If the glass is multicolored or patterned then a baseColorTexture will probably be needed instead. The attenuationColor does not use a texture; the color would be uniform throughout the material. 122 | 123 | 124 | 125 | Previous: [Using Visual Studio Code](AddingMaterialExtensions_002_UsingVisualStudioCode.md) | [Table of Contents](README.md) | Next: [Using a Raytracer](AddingMaterialExtensions_004_UsingARaytracer.md) 126 | -------------------------------------------------------------------------------- /AddingMaterialExtensions/AddingMaterialExtensions_004_UsingARaytracer.md: -------------------------------------------------------------------------------- 1 | Previous: [KHR_materials_transmission and KHR_materials_volume](AddingMaterialExtensions_003_TransmissionAndVolume.md) | [Table of Contents](README.md) | Next: [Transmission Limitations](AddingMaterialExtensions_005_TransmissionLimitations.md) 2 | 3 | # Using a Raytracer 4 | 5 | When adjusting material parameters, it’s a good idea to check the “ground truth” appearance with a raytracer. 6 | 7 | glTF models can be rendered on a variety of renderers, but some need to take shortcuts to update at an interactive frame rate. With augmented reality or virtual reality, rendering speed is of the utmost importance; rasterizers are typically used for these applications because they don’t require high-end raytracing GPUs. 8 | 9 | The thicknessFactor and thicknessTexture are used by rasterizers to simulate the depth properties of volumetric surfaces. These are “hints” for the rasterizer to improve speed, but ignored by raytracers since they can use the actual geometry depth instead. 10 | 11 | ![Image of glass vase rasterized](images/image29.jpg "Glass vase rasterized") ![Image of glass vase raytraced](images/image30.jpg "Glass vase raytraced") 12 | 13 | _A glTF model rendered in the [Microsoft Babylon.js](https://sandbox.babylonjs.com/) rasterizer (above left) and the [Dassault Enterprise PBR](https://dassaultsystemes-technology.github.io/dspbr-pt/) pathtracer (above right). 
The thicknessFactor and thicknessTexture are inaccurate here; they should be adjusted to better match the raytraced ground truth._ 14 | 15 | ![Image of glass vase rasterized](images/image20.jpg "Glass vase rasterized") ![Image of glass vase raytraced](images/image30.jpg "Glass vase raytraced") 16 | 17 | _After adjustments, the rasterizer (above left) is a better match to the raytracer (above right)._ 18 | 19 | 20 | 21 | Previous: [KHR_materials_transmission and KHR_materials_volume](AddingMaterialExtensions_003_TransmissionAndVolume.md) | [Table of Contents](README.md) | Next: [Transmission Limitations](AddingMaterialExtensions_005_TransmissionLimitations.md) -------------------------------------------------------------------------------- /AddingMaterialExtensions/AddingMaterialExtensions_005_TransmissionLimitations.md: -------------------------------------------------------------------------------- 1 | Previous: [Using a Raytracer](AddingMaterialExtensions_004_UsingARaytracer.md) | [Table of Contents](README.md) 2 | 3 | # Transmission Limitations 4 | 5 | When rendered in a rasterizer, transmission can cause some significant visual artifacts. 6 | 7 | Rasterizers usually rely on a fast but somewhat inaccurate rendering method. First, all transmission surfaces are hidden and the scene is rendered into a temporary texture. Next, this texture is distorted based on the thicknessFactor, thicknessTexture, and normalTexture (a normal bump map). Finally, the texture is projected back onto the transmission surface from the camera’s viewing angle. This is all done in real time, many times per second. 8 | 9 | This method is fast and often works well, but it causes only one layer of transmission to be rendered. If surfaces overlap one another, only the nearest will be rendered; any surfaces with transmission behind the first one will be invisible. 10 | 11 | ![screenshot of glass vase glTF raytraced](images/image30.jpg "screenshot of glass vase glTF raytraced") ![screenshot of glass vase glTF rasterized](images/image20.jpg "screenshot of glass vase glTF rasterized") 12 | 13 | _In the raytraced render (left) the inside surface of the glass shows reflections from the windows. In the rasterizer render (right) only one layer of transmission is visible; the inside surface of the glass is not rendered behind the outside surface._ 14 | 15 | This is acceptable in many cases, but can be problematic when rendering a model with multiple transmission surfaces. If this is unacceptable, there are a couple of potential fixes: 16 | 17 | One solution would be to apply transmission only to the parts of the model that need it. This is a good setup, as it will improve render performance. However, it will cause the rest of the model to be rendered as opaque. 18 | 19 | Another solution could be to use alpha coverage for some surfaces and transmission for others, because alpha coverage will still be refracted behind transmission. Alpha coverage has its own rendering problems, as mentioned before. But seeing something behind transmission is often better than seeing nothing. 
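As a rough sketch of that mixed approach (material names and values here are hypothetical, not taken from the sample model), the materials array might pair a transmissive outer surface with an alpha-blended inner surface:

```
"materials": [
    {
        "name": "outer-glass",
        "extensions": {
            "KHR_materials_transmission": {
                "transmissionFactor": 1
            }
        }
    },
    {
        "name": "inner-liquid",
        "alphaMode": "BLEND",
        "pbrMetallicRoughness": {
            "baseColorFactor": [0.8, 0.9, 1.0, 0.3]
        }
    }
]
```

The alpha-blended material will still be visible through the transmissive surface in front of it, whereas a second transmissive surface would not be.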
20 | 21 | 22 | 23 | Previous: [Using a Raytracer](AddingMaterialExtensions_004_UsingARaytracer.md) | [Table of Contents](README.md) -------------------------------------------------------------------------------- /AddingMaterialExtensions/README.md: -------------------------------------------------------------------------------- 1 | # Adding Material Extensions to glTF Models # 2 | 3 | By Eric Chadwick, Senior 3D Technical Artist, DGG, [@echadwick-artist](https://github.com/echadwick-artist) 4 | 5 | This tutorial explains how to edit glTF files using open source software to add material extensions [KHR_materials_transmission](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_materials_transmission/README.md) and [KHR_materials_volume](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_materials_volume/README.md) to create glass with reflection, refraction, and absorption. 6 | 7 | These methods can be repurposed for [other material extensions](https://github.com/KhronosGroup/glTF/tree/main/extensions#gltf-extension-registry) too. 8 | 9 | 10 | ## Sample Model ## 11 | 12 | The glTF model used in this tutorial is available in the [samples](samples/) folder. 13 | 14 | ![screenshot of GlassHurricaneCandleHolder.gltf with transmission and volume](images/image20.jpg "screenshot of GlassHurricaneCandleHolder.gltf with transmission and volume") 15 | 16 | _(Above) GlassHurricaneCandleHolder.gltf with transmission and volume_ 17 | 18 | 19 | ## Table of Contents ## 20 | 21 | * [Why add Extensions?](AddingMaterialExtensions_001_WhyAddExtensions.md) 22 | * [Using Visual Studio Code](AddingMaterialExtensions_002_UsingVisualStudioCode.md) 23 | * [KHR_materials_transmission and KHR_materials_volume](AddingMaterialExtensions_003_TransmissionAndVolume.md) 24 | * [Using a Raytracer](AddingMaterialExtensions_004_UsingARaytracer.md) 25 | * [Transmission Limitations](AddingMaterialExtensions_005_TransmissionLimitations.md) 26 | 27 | 28 | ## Acknowledgements ## 29 | 30 | - Alexey Knyazev, [@lexaknyazev](https://github.com/lexaknyazev) 31 | - Emmett Lalish, [@elalish](https://github.com/elalish) 32 | -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image1.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image10.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image10.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image11.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image11.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image12.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image12.png 
-------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image13.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image13.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image14.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image14.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image15.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image15.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image16.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image16.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image17.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image17.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image18.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image18.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image19.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image19.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image2.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image2.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image20.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image20.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image21.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image21.jpg 
-------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image22.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image22.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image23.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image23.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image24.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image24.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image25.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image25.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image26.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image26.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image27.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image27.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image29.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image29.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image3.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image3.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image30.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image30.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image35.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image35.png 
-------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image4.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image4.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image5.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image6.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image7.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image8.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image8.jpg -------------------------------------------------------------------------------- /AddingMaterialExtensions/images/image9.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/images/image9.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder.bin: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder.bin -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder.gltf: -------------------------------------------------------------------------------- 1 | { 2 | "asset": { 3 | "version": "2.0", 4 | "copyright": "(C) 2021, Wayfair LLC. 
License: CC BY 4.0 International", 5 | "generator": "Generated with 3ds Max, exported with babylon.js glTF exporter, hand-edited in VSCode with glTF Tools" 6 | }, 7 | "extensionsUsed": [ 8 | "KHR_materials_transmission", 9 | "KHR_materials_volume" 10 | ], 11 | "scene": 0, 12 | "scenes": [ 13 | { 14 | "nodes": [ 15 | 0, 16 | 1 17 | ] 18 | } 19 | ], 20 | "nodes": [ 21 | { 22 | "mesh": 0, 23 | "name": "GlassHurricaneCandleHolder-opaque" 24 | }, 25 | { 26 | "mesh": 1, 27 | "rotation": [ 28 | 0.7071068, 29 | 0.0, 30 | 0.0, 31 | 0.7071067 32 | ], 33 | "name": "GlassHurricaneCandleHolder-glass" 34 | } 35 | ], 36 | "meshes": [ 37 | { 38 | "primitives": [ 39 | { 40 | "attributes": { 41 | "POSITION": 1, 42 | "NORMAL": 2, 43 | "TEXCOORD_0": 3 44 | }, 45 | "indices": 0, 46 | "material": 0 47 | } 48 | ], 49 | "name": "GlassHurricaneCandleHolder-opaque" 50 | }, 51 | { 52 | "primitives": [ 53 | { 54 | "attributes": { 55 | "POSITION": 5, 56 | "NORMAL": 6, 57 | "TEXCOORD_0": 7 58 | }, 59 | "indices": 4, 60 | "material": 1 61 | } 62 | ], 63 | "name": "GlassHurricaneCandleHolder-glass" 64 | } 65 | ], 66 | "accessors": [ 67 | { 68 | "bufferView": 0, 69 | "componentType": 5123, 70 | "count": 5082, 71 | "type": "SCALAR", 72 | "name": "accessorIndices" 73 | }, 74 | { 75 | "bufferView": 1, 76 | "componentType": 5126, 77 | "count": 1006, 78 | "max": [ 79 | 0.0707887039, 80 | 0.27212593, 81 | 0.0707886741 82 | ], 83 | "min": [ 84 | -0.0707887039, 85 | 0.0, 86 | -0.0707887262 87 | ], 88 | "type": "VEC3", 89 | "name": "accessorPositions" 90 | }, 91 | { 92 | "bufferView": 1, 93 | "byteOffset": 12072, 94 | "componentType": 5126, 95 | "count": 1006, 96 | "type": "VEC3", 97 | "name": "accessorNormals" 98 | }, 99 | { 100 | "bufferView": 2, 101 | "componentType": 5126, 102 | "count": 1006, 103 | "type": "VEC2", 104 | "name": "accessorUVs" 105 | }, 106 | { 107 | "bufferView": 0, 108 | "byteOffset": 10164, 109 | "componentType": 5123, 110 | "count": 7296, 111 | "type": "SCALAR", 112 | "name": "accessorIndices" 113 | }, 114 | { 115 | "bufferView": 1, 116 | "byteOffset": 24144, 117 | "componentType": 5126, 118 | "count": 1380, 119 | "max": [ 120 | 0.09516754, 121 | 0.09516747, 122 | -0.09550226 123 | ], 124 | "min": [ 125 | -0.09516754, 126 | -0.0951675251, 127 | -0.3065758 128 | ], 129 | "type": "VEC3", 130 | "name": "accessorPositions" 131 | }, 132 | { 133 | "bufferView": 1, 134 | "byteOffset": 40704, 135 | "componentType": 5126, 136 | "count": 1380, 137 | "type": "VEC3", 138 | "name": "accessorNormals" 139 | }, 140 | { 141 | "bufferView": 2, 142 | "byteOffset": 8048, 143 | "componentType": 5126, 144 | "count": 1380, 145 | "type": "VEC2", 146 | "name": "accessorUVs" 147 | } 148 | ], 149 | "bufferViews": [ 150 | { 151 | "buffer": 0, 152 | "byteLength": 24756, 153 | "name": "bufferViewScalar" 154 | }, 155 | { 156 | "buffer": 0, 157 | "byteOffset": 24756, 158 | "byteLength": 57264, 159 | "byteStride": 12, 160 | "name": "bufferViewFloatVec3" 161 | }, 162 | { 163 | "buffer": 0, 164 | "byteOffset": 82020, 165 | "byteLength": 19088, 166 | "byteStride": 8, 167 | "name": "bufferViewFloatVec2" 168 | } 169 | ], 170 | "buffers": [ 171 | { 172 | "uri": "GlassHurricaneCandleHolder.bin", 173 | "byteLength": 101108 174 | } 175 | ], 176 | "materials": [ 177 | { 178 | "pbrMetallicRoughness": { 179 | "baseColorTexture": { 180 | "index": 1 181 | }, 182 | "metallicRoughnessTexture": { 183 | "index": 0 184 | } 185 | }, 186 | "occlusionTexture": { 187 | "index": 0 188 | }, 189 | "name": "GlassHurricaneCandleHolder-opaque" 190 | }, 191 | { 192 | 
"pbrMetallicRoughness": { 193 | "baseColorFactor": [ 194 | 1, 195 | 1, 196 | 1, 197 | 1 198 | ], 199 | "metallicRoughnessTexture": { 200 | "index": 0 201 | } 202 | }, 203 | "extensions": { 204 | "KHR_materials_transmission": { 205 | "transmissionFactor": 1 206 | }, 207 | "KHR_materials_volume": { 208 | "thicknessFactor": 0.04, 209 | "thicknessTexture": { 210 | "index": 2 211 | }, 212 | "attenuationColor": [ 213 | 0.792, 214 | 0.910, 215 | 1 216 | ], 217 | "attenuationDistance": 0.002 218 | } 219 | }, 220 | "occlusionTexture": { 221 | "index": 0 222 | }, 223 | "name": "GlassHurricaneCandleHolder-glass" 224 | } 225 | ], 226 | "textures": [ 227 | { 228 | "sampler": 0, 229 | "source": 0, 230 | "name": "GlassHurricaneCandleHolder_orm.png" 231 | }, 232 | { 233 | "sampler": 0, 234 | "source": 1, 235 | "name": "GlassHurricaneCandleHolder_basecolor.png" 236 | }, 237 | { 238 | "sampler": 0, 239 | "source": 2, 240 | "name": "GlassHurricaneCandleHolder_thickness.png" 241 | } 242 | ], 243 | "images": [ 244 | { 245 | "uri": "GlassHurricaneCandleHolder_orm.png" 246 | }, 247 | { 248 | "uri": "GlassHurricaneCandleHolder_basecolor.png" 249 | }, 250 | { 251 | "uri": "GlassHurricaneCandleHolder_thickness.png" 252 | } 253 | ], 254 | "samplers": [ 255 | { 256 | "magFilter": 9729, 257 | "minFilter": 9987 258 | } 259 | ] 260 | } -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green.glb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green.glb -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green.gltf: -------------------------------------------------------------------------------- 1 | { 2 | "asset": { 3 | "version": "2.0", 4 | "copyright": "(C) 2023, Wayfair LLC. 
License: CC BY 4.0 International", 5 | "generator": "Generated with 3ds Max, exported with babylon.js glTF exporter, hand-edited in VSCode with glTF Tools" 6 | }, 7 | "extensionsUsed": [ 8 | "KHR_materials_transmission", 9 | "KHR_materials_volume" 10 | ], 11 | "scene": 0, 12 | "scenes": [ 13 | { 14 | "nodes": [ 15 | 0, 16 | 1 17 | ] 18 | } 19 | ], 20 | "nodes": [ 21 | { 22 | "mesh": 0, 23 | "name": "GlassHurricaneCandleHolder-opaque" 24 | }, 25 | { 26 | "mesh": 1, 27 | "rotation": [ 28 | 0.7071068, 29 | 0.0, 30 | 0.0, 31 | 0.7071067 32 | ], 33 | "name": "GlassHurricaneCandleHolder-glass" 34 | } 35 | ], 36 | "meshes": [ 37 | { 38 | "primitives": [ 39 | { 40 | "attributes": { 41 | "POSITION": 1, 42 | "NORMAL": 2, 43 | "TEXCOORD_0": 3 44 | }, 45 | "indices": 0, 46 | "material": 0 47 | } 48 | ], 49 | "name": "GlassHurricaneCandleHolder-opaque" 50 | }, 51 | { 52 | "primitives": [ 53 | { 54 | "attributes": { 55 | "POSITION": 5, 56 | "NORMAL": 6, 57 | "TEXCOORD_0": 7 58 | }, 59 | "indices": 4, 60 | "material": 1 61 | } 62 | ], 63 | "name": "GlassHurricaneCandleHolder-glass" 64 | } 65 | ], 66 | "accessors": [ 67 | { 68 | "bufferView": 0, 69 | "componentType": 5123, 70 | "count": 5082, 71 | "type": "SCALAR", 72 | "name": "accessorIndices" 73 | }, 74 | { 75 | "bufferView": 1, 76 | "componentType": 5126, 77 | "count": 1006, 78 | "max": [ 79 | 0.0707887039, 80 | 0.27212593, 81 | 0.0707886741 82 | ], 83 | "min": [ 84 | -0.0707887039, 85 | 0.0, 86 | -0.0707887262 87 | ], 88 | "type": "VEC3", 89 | "name": "accessorPositions" 90 | }, 91 | { 92 | "bufferView": 1, 93 | "byteOffset": 12072, 94 | "componentType": 5126, 95 | "count": 1006, 96 | "type": "VEC3", 97 | "name": "accessorNormals" 98 | }, 99 | { 100 | "bufferView": 2, 101 | "componentType": 5126, 102 | "count": 1006, 103 | "type": "VEC2", 104 | "name": "accessorUVs" 105 | }, 106 | { 107 | "bufferView": 0, 108 | "byteOffset": 10164, 109 | "componentType": 5123, 110 | "count": 7296, 111 | "type": "SCALAR", 112 | "name": "accessorIndices" 113 | }, 114 | { 115 | "bufferView": 1, 116 | "byteOffset": 24144, 117 | "componentType": 5126, 118 | "count": 1380, 119 | "max": [ 120 | 0.09516754, 121 | 0.09516747, 122 | -0.09550226 123 | ], 124 | "min": [ 125 | -0.09516754, 126 | -0.0951675251, 127 | -0.3065758 128 | ], 129 | "type": "VEC3", 130 | "name": "accessorPositions" 131 | }, 132 | { 133 | "bufferView": 1, 134 | "byteOffset": 40704, 135 | "componentType": 5126, 136 | "count": 1380, 137 | "type": "VEC3", 138 | "name": "accessorNormals" 139 | }, 140 | { 141 | "bufferView": 2, 142 | "byteOffset": 8048, 143 | "componentType": 5126, 144 | "count": 1380, 145 | "type": "VEC2", 146 | "name": "accessorUVs" 147 | } 148 | ], 149 | "bufferViews": [ 150 | { 151 | "buffer": 0, 152 | "byteLength": 24756, 153 | "name": "bufferViewScalar", 154 | "target": 34963 155 | }, 156 | { 157 | "buffer": 0, 158 | "byteOffset": 24756, 159 | "byteLength": 57264, 160 | "byteStride": 12, 161 | "name": "bufferViewFloatVec3", 162 | "target": 34962 163 | }, 164 | { 165 | "buffer": 0, 166 | "byteOffset": 82020, 167 | "byteLength": 19088, 168 | "byteStride": 8, 169 | "name": "bufferViewFloatVec2", 170 | "target": 34962 171 | } 172 | ], 173 | "buffers": [ 174 | { 175 | "uri": "GlassHurricaneCandleHolder.bin", 176 | "byteLength": 101108 177 | } 178 | ], 179 | "materials": [ 180 | { 181 | "pbrMetallicRoughness": { 182 | "baseColorTexture": { 183 | "index": 1 184 | }, 185 | "metallicRoughnessTexture": { 186 | "index": 0 187 | } 188 | }, 189 | "occlusionTexture": { 190 | "index": 0 191 | }, 192 | 
"name": "GlassHurricaneCandleHolder-opaque" 193 | }, 194 | { 195 | "pbrMetallicRoughness": { 196 | "baseColorTexture": { 197 | "index": 1 198 | }, 199 | "metallicRoughnessTexture": { 200 | "index": 0 201 | } 202 | }, 203 | "extensions": { 204 | "KHR_materials_transmission": { 205 | "transmissionFactor": 1 206 | }, 207 | "KHR_materials_volume": { 208 | "thicknessFactor": 0.04, 209 | "thicknessTexture": { 210 | "index": 2 211 | }, 212 | "attenuationColor": [ 213 | 0.83, 214 | 0.910, 215 | 0.75 216 | ], 217 | "attenuationDistance": 0.002 218 | } 219 | }, 220 | "occlusionTexture": { 221 | "index": 0 222 | }, 223 | "name": "GlassHurricaneCandleHolder-glass" 224 | } 225 | ], 226 | "textures": [ 227 | { 228 | "sampler": 0, 229 | "source": 0, 230 | "name": "GlassHurricaneCandleHolder_Green_orm.png" 231 | }, 232 | { 233 | "sampler": 0, 234 | "source": 1, 235 | "name": "GlassHurricaneCandleHolder_Green_basecolor.png" 236 | }, 237 | { 238 | "sampler": 0, 239 | "source": 2, 240 | "name": "GlassHurricaneCandleHolder_thickness.png" 241 | } 242 | ], 243 | "images": [ 244 | { 245 | "uri": "GlassHurricaneCandleHolder_Green_orm.png" 246 | }, 247 | { 248 | "uri": "GlassHurricaneCandleHolder_Green_basecolor.png" 249 | }, 250 | { 251 | "uri": "GlassHurricaneCandleHolder_thickness.png" 252 | } 253 | ], 254 | "samplers": [ 255 | { 256 | "magFilter": 9729, 257 | "minFilter": 9987 258 | } 259 | ] 260 | } -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green_basecolor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green_basecolor.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green_orm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_Green_orm.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_basecolor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_basecolor.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_orm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_orm.png -------------------------------------------------------------------------------- /AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_thickness.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/GlassHurricaneCandleHolder_thickness.png -------------------------------------------------------------------------------- 
/AddingMaterialExtensions/samples/SciFiHelmet_SpecularOcclusion.glb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/AddingMaterialExtensions/samples/SciFiHelmet_SpecularOcclusion.glb -------------------------------------------------------------------------------- /BlenderGltfConverter/README.md: -------------------------------------------------------------------------------- 1 | # blender-gltf-converter 2 | 3 | [Blender](https://www.blender.org/) can be used as a command line tool for converting many different input file formats into glTF assets. This repository contains an example script that shows how to use Blender as a glTF conversion tool. 4 | 5 | The purpose of this repository is to show how to create and extend Blender scripts for task automation. It does not explain the details of the conversion options, but is intended as a template that can be used as a basis for custom conversion tasks. 6 | 7 | # General Blender info 8 | 9 | The conversion script is implemented in Python, which can be executed directly by Blender. The library that has to be imported in the Python script in order to access the Blender functionality is called _bpy_. The official documentation can be found at https://docs.blender.org/api/current/index.html . 10 | 11 | In order to find the Python functions that correspond to certain Blender functionalities, you can also enable Python tooltips in Blender: _Edit -> Preferences -> Interface -> Display -> Python Tooltips_. When you hover over an element in the Blender UI, in addition to the name and description, it will show you the Python equivalent of the element. 12 | 13 | You can also use the Info window to see what API calls are made on each action performed in Blender. 14 | 15 | # Script usage 16 | 17 | The actual conversion script is given in [`blender_gltf_converter.py`](blender_gltf_converter.py). It can be executed with Blender from the command line by calling 18 | 19 | `blender -b -P blender_gltf_converter.py -- -mp `*`"path-to-input-file"`* 20 | 21 | - The `-b` flag (alternatively: `--background`) runs Blender in headless mode. If you wish to see the UI after the script is done processing, omit it. 22 | - The `-P` flag (uppercase! Alternatively: `--python`) allows you to pass the Python script to Blender. If you are not in the same directory as the `blender_gltf_converter.py` file, provide the full path to it. 23 | 24 | The full list of Blender command line options can be found at https://docs.blender.org/manual/en/latest/advanced/command_line/arguments.html . 25 | 26 | The `--` is used as a separator between the command line arguments that are passed to Blender itself, and the command line arguments that are passed to the script. The script receives only one argument here: 27 | 28 | - The `-mp` (model path) argument, which is followed by the full path of the file that should be converted into a glTF asset. 29 | 30 | # Script customization 31 | 32 | The example script in this repository is intended as a template for custom scripts. The example only accepts Wavefront OBJ files as the input. The files are imported into Blender with `bpy.ops.import_scene.obj`. In order to import other formats (like FBX or X3D files), the corresponding functions from the [import scene operators](https://docs.blender.org/api/current/bpy.ops.import_scene.html) can be used. 33 | 34 | The example script imports the input file into a Blender scene. 
Customization steps may be inserted after the import, for example, to modify the imported model. 35 | 36 | The script then exports the scene as glTF file, using `bpy.ops.export_scene.gltf`. The list of all parameters for the 37 | glTF export can be found at https://docs.blender.org/api/current/bpy.ops.export_scene.html#bpy.ops.export_scene.gltf. 38 | Parameters that will not be set explicitly, will use `your` Blender settings. -------------------------------------------------------------------------------- /BlenderGltfConverter/blender_gltf_converter.py: -------------------------------------------------------------------------------- 1 | """ 2 | Run from the terminal: 3 | blender -b -P blender_gltf_converter.py -- -mp "path-to-file" 4 | 5 | If you're not in the same directory as the blender_gltf_converter.py file, provide full path to it. 6 | The "-b" flag runs Blender in headless mode. 7 | If you wish to see the UI after the script is done processing, skip it. 8 | The "-P" flag allows to pass script to Blender. 9 | """ 10 | 11 | 12 | import sys 13 | # bpy is the library you need to import to access Blender API 14 | # Link to the official documentation: https://docs.blender.org/api/current/index.html 15 | # You can also enable tooltips in Blender: Edit -> Preferences -> Interface -> Python Tooltips 16 | # You can also use Info window to see what API calls are made on each action performed in Blender 17 | import bpy 18 | 19 | # Custom logic may go here 20 | 21 | def runner(): 22 | import argparse 23 | parser = argparse.ArgumentParser(description='Convert supported file type to gltf format.') 24 | # Arguments go here, e.g. asset path, output path, other custom properties and flags 25 | parser.add_argument('-mp', '--model_path', help='Path to a model you want to convert.') 26 | 27 | # The -- is a separator between arguments passed to Blender directly and to your script 28 | argv = sys.argv 29 | if "--" not in argv: 30 | argv = [] 31 | else: 32 | # Get just the arguments passed to your script 33 | argv = argv[argv.index("--") + 1:] 34 | 35 | # Access passed arguments 36 | args = parser.parse_args(argv) 37 | 38 | # Assignment to a variable is just a convenience, it's not mandatory 39 | model_path = args.model_path 40 | 41 | # Clear default scene 42 | bpy.ops.object.select_all(action='SELECT') 43 | bpy.ops.object.delete() 44 | 45 | # Custom logic may go here 46 | # You might want to e.g. support various file formats, this example works just with .obj files 47 | 48 | bpy.ops.import_scene.obj(filepath=model_path) 49 | output_path = model_path.replace(".obj", "") 50 | 51 | # The list of all parameters for gltf export can be found here: 52 | # https://docs.blender.org/api/current/bpy.ops.export_scene.html#bpy.ops.export_scene.gltf 53 | # Parameters that you won't set will use your Blender settings 54 | bpy.ops.export_scene.gltf(filepath=output_path) 55 | 56 | 57 | if __name__ == "__main__": 58 | runner() 59 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | A reminder that this issue tracker is managed by the Khronos Group. Interactions here should follow the Khronos Code of Conduct (https://www.khronos.org/developers/code-of-conduct), which prohibits aggressive or derogatory language. Please keep the discussion friendly and civil. 
2 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/LICENSE -------------------------------------------------------------------------------- /PBR/figures/BRDFs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/BRDFs.png -------------------------------------------------------------------------------- /PBR/figures/BTDFs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/BTDFs.png -------------------------------------------------------------------------------- /PBR/figures/Fresnel_Conductor.JPG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Fresnel_Conductor.JPG -------------------------------------------------------------------------------- /PBR/figures/Fresnel_Dielectric.JPG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Fresnel_Dielectric.JPG -------------------------------------------------------------------------------- /PBR/figures/Interreflection.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Interreflection.jpg -------------------------------------------------------------------------------- /PBR/figures/Masking.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Masking.png -------------------------------------------------------------------------------- /PBR/figures/Shadowing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Shadowing.png -------------------------------------------------------------------------------- /PBR/figures/Snells_Law.JPG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/PBR/figures/Snells_Law.JPG -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | - The main [glTF Tutorial](gltfTutorial/README.md). 3 | - A tutorial about [Physically Based Rendering (PBR) in glTF](PBR/README.md). 4 | - A tutorial for [Adding Material Extensions to glTF Models](AddingMaterialExtensions/README.md). 5 | - A template for [Building a glTF conversion tool based on Blender](BlenderGltfConverter/README.md). 6 | 7 | An online HTML version of these tutorials can be found at https://github.khronos.org/glTF-Tutorials/ . 
8 | 9 | ## Feedback 10 | 11 | If there are any tutorials that you would like to see here, you can open a new issue to share your ideas, or add your suggestions to the [Contributions to the glTF-Tutorials issue](https://github.com/KhronosGroup/glTF-Tutorials/issues/8). 12 | 13 | ## Contributing 14 | 15 | If you want to contribute your own tutorial, you can do this in different ways: 16 | 17 | You can open an issue to propose the new tutorial that you want to create. In this issue, you can discuss the intended topic, scope, and structure of the tutorial. This will allow you to gather early feedback, and maybe even find collaborators who would like to support you. 18 | 19 | If you have already created a tutorial and want to make it available here, you can just open a pull request. The new tutorial should be in a subdirectory with a short, distinctive name that indicates the overall topic of the tutorial, in `CamelCase` or `lowerCamelCase`. Inside this directory, there should be a `README.md` file that serves as the entry point. Beyond that, you can structure your tutorial as you see fit: If it is a short tutorial, it could be fully contained in the `README.md` file. If you want to split it up into multiple sections, then you can create one markdown file for each section, and only put a Table Of Contents into the `README.md` file. You can also create further subdirectories, for example, a dedicated `images` directory for all the images that you want to inline. 20 | 21 | The tutorials will be published here under the [CC-BY 4.0 license](https://github.com/KhronosGroup/glTF-Tutorials/blob/master/LICENSE). 22 | -------------------------------------------------------------------------------- /gltfTutorial/README.md: -------------------------------------------------------------------------------- 1 | # glTF Tutorial 2 | 3 | By Marco Hutter, [@javagl](https://github.com/javagl) 4 | 5 | This tutorial gives an introduction to [glTF](https://www.khronos.org/gltf), the GL transmission format. It summarizes the most important features and application cases of glTF, and describes the structure of the files that are related to glTF. It explains how glTF assets may be read, processed, and used to display 3D graphics efficiently. 6 | 7 | Some basic knowledge about [JSON](https://json.org/), the JavaScript Object Notation, is assumed. Additionally, a basic understanding of common graphics APIs, like OpenGL or WebGL, is required. This tutorial focuses on [glTF version 2.0](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html), where support for *Physically Based Rendering* was introduced, but the other concepts that are explained here are similar to how they had been implemented in [glTF version 1.0](https://github.com/KhronosGroup/glTF/tree/main/specification/1.0). 
8 | 9 | 10 | - [Introduction](gltfTutorial_001_Introduction.md) 11 | - [Basic glTF Structure](gltfTutorial_002_BasicGltfStructure.md) 12 | - [Example: A Minimal glTF File](gltfTutorial_003_MinimalGltfFile.md) 13 | - [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) 14 | - [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) 15 | - [Example: A Simple Animation](gltfTutorial_006_SimpleAnimation.md) 16 | - [Animations](gltfTutorial_007_Animations.md) 17 | - [Example: Simple Meshes](gltfTutorial_008_SimpleMeshes.md) 18 | - [Meshes](gltfTutorial_009_Meshes.md) 19 | - [Materials](gltfTutorial_010_Materials.md) 20 | - [Example: A Simple Material](gltfTutorial_011_SimpleMaterial.md) 21 | - [Textures, Images, and Samplers](gltfTutorial_012_TexturesImagesSamplers.md) 22 | - [Example: A Simple Texture](gltfTutorial_013_SimpleTexture.md) 23 | - [Example: An Advanced Material](gltfTutorial_014_AdvancedMaterial.md) 24 | - [Example: Simple Cameras](gltfTutorial_015_SimpleCameras.md) 25 | - [Cameras](gltfTutorial_016_Cameras.md) 26 | - [Example: A Simple Morph Target](gltfTutorial_017_SimpleMorphTarget.md) 27 | - [Morph Targets](gltfTutorial_018_MorphTargets.md) 28 | - [Example: Simple Skin](gltfTutorial_019_SimpleSkin.md) 29 | - [Skins](gltfTutorial_020_Skins.md) 30 | 31 | 32 | **Acknowledgements:** 33 | 34 | - Patrick Cozzi, Cesium, [@pjcozzi](https://twitter.com/pjcozzi) 35 | - Alexey Knyazev, [@lexaknyazev](https://github.com/lexaknyazev) 36 | - Sarah Chow, [@slchow](https://github.com/slchow) 37 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_001_Introduction.md: -------------------------------------------------------------------------------- 1 | [Table of Contents](README.md) | Next: [Basic glTF Structure](gltfTutorial_002_BasicGltfStructure.md) 2 | 3 | 4 | 5 | 6 | 7 | # Introduction to glTF using WebGL 8 | 9 | An increasing number of applications and services are based on 3D content. Online shops offer product configurators with a 3D preview. Museums digitize their artifacts with 3D scans and allow visitors to explore their collections in virtual galleries. City planners use 3D city models for planning and information visualization. Educators create interactive, animated 3D models of the human body. Many of these applications run directly in the web browser, which is possible because all modern browsers support efficient rendering with WebGL. 10 | 11 |

12 |
13 | Image 1a: Screenshots of various websites and applications showing 3D models. 14 |

15 | 16 | Demand for 3D content in various applications is constantly increasing. In many cases, the 3D content has to be transferred over the web, and it has to be rendered efficiently on the client side. But until now, there has been a gap between the 3D content creation and efficient rendering of that 3D content in the runtime applications. 17 | 18 | 19 | ## 3D content pipelines 20 | 21 | 3D content that is rendered in client applications comes from different sources and is stored in different file formats. The [list of 3D graphics file formats on Wikipedia](https://en.wikipedia.org/wiki/List_of_file_formats#3D_graphics) shows an overwhelming number, with more than 70 different file formats for 3D data, serving different purposes and application cases. 22 | 23 | For example, raw 3D data may be obtained with a 3D scanner. These scanners usually provide the geometry data of a single object, which is stored in [OBJ](https://en.wikipedia.org/wiki/Wavefront_.obj_file), [PLY](https://en.wikipedia.org/wiki/PLY_(file_format)), or [STL](https://en.wikipedia.org/wiki/STL_(file_format)) files. These file formats do not contain information about the scene structure or how the objects should be rendered. 24 | 25 | More sophisticated 3D scenes can be created with authoring tools. These tools allow one to edit the structure of the scene, the light setup, cameras, animations, and, of course, the 3D geometry of the objects that appear in the scene. Applications store this information in their own, custom file formats. For example, [Blender](https://www.blender.org/) stores the scenes in `.blend` files, [LightWave3D](https://www.lightwave3d.com/) uses the `.lws` file format, [3ds Max](https://www.autodesk.com/3dsmax) uses the `.max` file format, and [Maya](https://www.autodesk.com/maya) uses `.ma` files. 26 | 27 | In order to render such 3D content, the runtime application must be able to read different input file formats. The scene structure has to be parsed, and the 3D geometry data has to be converted into the format required by the graphics API. The 3D data has to be transferred to the graphics card memory, and then the rendering process can be described with sequences of graphics API calls. Thus, each runtime application has to create importers, loaders, or converters for all file formats that it will support, as shown in [Image 1b](#contentPipeline-png). 28 | 29 |

30 |
31 | Image 1b: The 3D content pipeline today. 32 |

33 | 34 | 35 | ## glTF: A transmission format for 3D scenes 36 | 37 | The goal of glTF is to define a standard for representing 3D content, in a form that is suitable for use in runtime applications. The existing file formats are not appropriate for this use case: some of them do not contain any scene information, but only geometry data; others have been designed for exchanging data between authoring applications, and their main goal is to retain as much information about the 3D scene as possible, resulting in files that are usually large, complex, and hard to parse. Additionally, the geometry data may have to be preprocessed so that it can be rendered with the client application. 38 | 39 | None of the existing file formats were designed for the use case of efficiently transferring 3D scenes over the web and rendering them as efficiently as possible. But glTF is not "yet another file format." It is the definition of a *transmission* format for 3D scenes: 40 | 41 | - The scene structure is described with JSON, which is very compact and can easily be parsed. 42 | - The 3D data of the objects are stored in a form that can be directly used by the common graphics APIs, so there is no overhead for decoding or pre-processing the 3D data. 43 | 44 | Different content creation tools may now provide 3D content in the glTF format. And an increasing number of client applications are able to consume and render glTF. Some of these applications are shown in [Image 1a](#applications-png). So glTF may help to bridge the gap between content creation and rendering, as shown in [Image 1c](#contentPipelineWithGltf-png). 45 | 46 |

47 |
48 | Image 1c: The 3D content pipeline with glTF. 49 |

50 | 51 | An increasing number of content creation tools provide glTF import and export directly. For example, the Blender Manual documents [how to import and export PBR materials](https://docs.blender.org/manual/en/latest/addons/import_export/scene_gltf2.html) using glTF. Alternatively, other file formats can be used to create glTF assets, using one of the open-source conversion utilities listed in the [glTF Project Explorer](https://github.khronos.org/glTF-Project-Explorer/). The output of converters and exporters can be validated using the [Khronos glTF Validator](https://github.khronos.org/glTF-Validator/). 52 | 53 | 54 | [Table of Contents](README.md) | Next: [Basic glTF Structure](gltfTutorial_002_BasicGltfStructure.md) 55 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_002_BasicGltfStructure.md: -------------------------------------------------------------------------------- 1 | Previous: [Introduction](gltfTutorial_001_Introduction.md) | [Table of Contents](README.md) | Next: [A Minimal glTF File](gltfTutorial_003_MinimalGltfFile.md) 2 | 3 | 4 | # The Basic Structure of glTF 5 | 6 | The core of glTF is a JSON file. This file describes the whole contents of the 3D scene. It consists of a description of the scene structure itself, which is given by a hierarchy of nodes that define a scene graph. The 3D objects that appear in the scene are defined using meshes that are attached to the nodes. Materials define the appearance of the objects. Animations describe how the 3D objects are transformed (e.g., rotated or translated) over time, and skins define how the geometry of the objects is deformed based on a skeleton pose. Cameras describe the view configuration for the renderer. 7 | 8 | ## The JSON structure 9 | 10 | The scene objects are stored in arrays in the JSON file. They can be accessed using the index of the respective object in the array: 11 | 12 | ```javascript 13 | "meshes" : 14 | [ 15 | { ... } 16 | { ... } 17 | ... 18 | ], 19 | ``` 20 | 21 | These indices are also used to define the *relationships* between the objects. The example above defines multiple meshes, and a node may refer to one of these meshes, using the mesh index, to indicate that the mesh should be attached to this node: 22 | 23 | ```javascript 24 | "nodes": 25 | [ 26 | { "mesh": 0, ... }, 27 | { "mesh": 5, ... }, 28 | ... 29 | ] 30 | ``` 31 | 32 | The following image (adapted from the [glTF concepts section](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#concepts)) gives an overview of the top-level elements of the JSON part of a glTF asset: 33 | 34 |

35 |
36 | Image 2a: The glTF JSON structure 37 |

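How these index-based references are resolved can be illustrated with a few lines of code. The following is a minimal sketch (in Python, with a hypothetical file name) that loads the JSON part of a glTF asset and looks up the mesh that each node refers to; it only uses plain JSON handling and array indexing:

```python
import json

# Load the JSON part of a glTF asset (the file name is only an example)
with open("example.gltf") as f:
    gltf = json.load(f)

# The top-level properties are arrays, and objects refer to each other via indices:
# a node that contains "mesh": 0 refers to the first element of the "meshes" array
for node in gltf.get("nodes", []):
    if "mesh" in node:
        mesh = gltf["meshes"][node["mesh"]]
        print("node refers to mesh", node["mesh"],
              "which has", len(mesh["primitives"]), "primitive(s)")
```
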
38 | 39 | 40 | These elements are summarized here quickly, to give an overview, with links to the respective sections of the glTF specification. More detailed explanations of the relationships between these elements will be given in the following sections. 41 | 42 | - The [`scene`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-scene) is the entry point for the description of the scene that is stored in the glTF. It refers to the `node`s that define the scene graph. 43 | - The [`node`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-node) is one node in the scene graph hierarchy. It can contain a transformation (e.g., rotation or translation), and it may refer to further (child) nodes. Additionally, it may refer to `mesh` or `camera` instances that are "attached" to the node, or to a `skin` that describes a mesh deformation. 44 | - The [`camera`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-camera) defines the view configuration for rendering the scene. 45 | - A [`mesh`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh) describes a geometric object that appears in the scene. It refers to `accessor` objects that are used for accessing the actual geometry data, and to `material`s that define the appearance of the object when it is rendered. 46 | - The [`skin`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-skin) defines parameters that are required for vertex skinning, which allows the deformation of a mesh based on the pose of a virtual character. The values of these parameters are obtained from an `accessor`. 47 | - An [`animation`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation) describes how transformations of certain nodes (e.g., rotation or translation) change over time. 48 | - The [`accessor`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-accessor) is used as an abstract source of arbitrary data. It is used by the `mesh`, `skin`, and `animation`, and provides the geometry data, the skinning parameters and the time-dependent animation values. It refers to a [`bufferView`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-bufferview), which is a part of a [`buffer`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-buffer) that contains the actual raw binary data. 49 | - The [`material`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material) contains the parameters that define the appearance of an object. It usually refers to `texture` objects that will be applied to the rendered geometry. 50 | - The [`texture`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-texture) is defined by a [`sampler`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-sampler) and an [`image`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-image). The `sampler` defines how the texture `image` should be placed on the object. 51 | 52 | 53 | 54 | 55 | ## References to external data 56 | 57 | The binary data, like geometry and textures of the 3D objects, are usually not contained in the JSON file. Instead, they are stored in dedicated files, and the JSON part only contains links to these files. This allows the binary data to be stored in a form that is very compact and can efficiently be transferred over the web. 
Additionally, the data can be stored in a format that can be used directly in the renderer, without having to parse, decode, or preprocess the data. 58 | 59 |

60 |
61 | Image 2b: The glTF structure 62 |

63 | 64 | As shown in the image above, there are two types of objects that may contain such links to external resources, namely `buffers` and `images`. These objects will later be explained in more detail. 65 | 66 | 67 | 68 | ## Reading and managing external data 69 | 70 | Reading and processing a glTF asset starts with parsing the JSON structure. After the structure has been parsed, the [`buffer`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-buffer) and [`image`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-image) objects are available in the top-level `buffers` and `images` arrays, respectively. Each of these objects may refer to blocks of binary data. For further processing, this data is read into memory. Usually, the data will be stored in an array so that they may be looked up using the same index that is used for referring to the `buffer` or `image` object that they belong to. 71 | 72 | 73 | ## Binary data in `buffers` 74 | 75 | A [`buffer`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-buffer) contains a URI that points to a file containing the raw, binary buffer data: 76 | 77 | ```javascript 78 | "buffer01": { 79 | "byteLength": 12352, 80 | "type": "arraybuffer", 81 | "uri": "buffer01.bin" 82 | } 83 | ``` 84 | 85 | This binary data is just a raw block of memory that is read from the URI of the `buffer`, with no inherent meaning or structure. The [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) section will show how this raw data is extended with information about data types and the data layout. With this information, one part of the data may, for example, be interpreted as animation data, and another part may be interpreted as geometry data. Storing the data in a binary form allows it to be transferred over the web much more efficiently than in the JSON format, and the binary data can be passed directly to the renderer without having to decode or pre-process it. 86 | 87 | 88 | 89 | ## Image data in `images` 90 | 91 | An [`image`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-image) may refer to an external image file that can be used as the texture of a rendered object: 92 | 93 | ```javascript 94 | "image01": { 95 | "uri": "image01.png" 96 | } 97 | ``` 98 | 99 | The reference is given as a URI that usually points to a PNG or JPG file. These formats significantly reduce the size of the files so that they may efficiently be transferred over the web. In some cases, the `image` objects may not refer to an external file, but to data that is stored in a `buffer`. The details of this indirection will be explained in the [Textures, Images, and Samplers](gltfTutorial_012_TexturesImagesSamplers.md) section. 100 | 101 | 102 | 103 | 104 | ## Binary data in data URIs 105 | 106 | Usually, the URIs that are contained in the `buffer` and `image` objects will point to a file that contains the actual data. As an alternative, the data may be *embedded* into the JSON, in binary format, by using a [data URI](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs). 
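The following sketch shows how such a reference could be resolved in practice (in Python; the helper name is an assumption and not part of glTF). A `uri` that starts with `data:` is decoded from its base64 payload, and any other `uri` is treated as a path relative to the glTF file:

```python
import base64
import os

def read_uri_data(uri, gltf_directory="."):
    """Return the raw bytes that a buffer (or image) uri refers to."""
    if uri.startswith("data:"):
        # Data URI: "data:<mime type>;base64,<payload>" - decode the payload directly
        payload = uri.split(",", 1)[1]
        return base64.b64decode(payload)
    # Otherwise, the URI usually points to a file next to the glTF asset
    with open(os.path.join(gltf_directory, uri), "rb") as f:
        return f.read()

# Example with an embedded buffer (the 44-byte buffer of the minimal
# triangle asset that is shown in the next section):
buffer = {
    "uri": "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAA=",
    "byteLength": 44
}
data = read_uri_data(buffer["uri"])
assert len(data) == buffer["byteLength"]
```
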
107 | 108 | 109 | Previous: [Introduction](gltfTutorial_001_Introduction.md) | [Table of Contents](README.md) | Next: [A Minimal glTF File](gltfTutorial_003_MinimalGltfFile.md) 110 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_003_MinimalGltfFile.md: -------------------------------------------------------------------------------- 1 | Previous: [Basic glTF Structure](gltfTutorial_002_BasicGltfStructure.md) | [Table of Contents](README.md) | Next: [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) 2 | 3 | 4 | # A Minimal glTF File 5 | 6 | The following is a minimal but complete glTF asset, containing a single, indexed triangle. You can copy and paste it into a `gltf` file, and every glTF-based application should be able to load and render it. This section will explain the basic concepts of glTF based on this example. 7 | 8 | ```javascript 9 | { 10 | "scene": 0, 11 | "scenes" : [ 12 | { 13 | "nodes" : [ 0 ] 14 | } 15 | ], 16 | 17 | "nodes" : [ 18 | { 19 | "mesh" : 0 20 | } 21 | ], 22 | 23 | "meshes" : [ 24 | { 25 | "primitives" : [ { 26 | "attributes" : { 27 | "POSITION" : 1 28 | }, 29 | "indices" : 0 30 | } ] 31 | } 32 | ], 33 | 34 | "buffers" : [ 35 | { 36 | "uri" : "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAA=", 37 | "byteLength" : 44 38 | } 39 | ], 40 | "bufferViews" : [ 41 | { 42 | "buffer" : 0, 43 | "byteOffset" : 0, 44 | "byteLength" : 6, 45 | "target" : 34963 46 | }, 47 | { 48 | "buffer" : 0, 49 | "byteOffset" : 8, 50 | "byteLength" : 36, 51 | "target" : 34962 52 | } 53 | ], 54 | "accessors" : [ 55 | { 56 | "bufferView" : 0, 57 | "byteOffset" : 0, 58 | "componentType" : 5123, 59 | "count" : 3, 60 | "type" : "SCALAR", 61 | "max" : [ 2 ], 62 | "min" : [ 0 ] 63 | }, 64 | { 65 | "bufferView" : 1, 66 | "byteOffset" : 0, 67 | "componentType" : 5126, 68 | "count" : 3, 69 | "type" : "VEC3", 70 | "max" : [ 1.0, 1.0, 0.0 ], 71 | "min" : [ 0.0, 0.0, 0.0 ] 72 | } 73 | ], 74 | 75 | "asset" : { 76 | "version" : "2.0" 77 | } 78 | } 79 | ``` 80 | 81 |

82 |
83 | Image 3a: A single triangle. 84 |

85 | 86 | 87 | ## The `scene` and `nodes` structure 88 | 89 | The [`scenes`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-scene) array is the entry point for the description of the scenes that are stored in the glTF. When parsing a glTF JSON file, the traversal of the scene structure will start here. Each scene contains an array called `nodes`, which contains the indices of [`node`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-node) objects. These nodes are the root nodes of a scene graph hierarchy. 90 | 91 | The example here consists of a single scene. The `scene` property indicates that this scene is the default scene that should be displayed when the asset is loaded. The scene refers to the only node in this example, which is the node with the index 0. This node, in turn, refers to the only mesh, which has the index 0: 92 | 93 | 94 | ```javascript 95 | "scene": 0, 96 | "scenes" : [ 97 | { 98 | "nodes" : [ 0 ] 99 | } 100 | ], 101 | 102 | "nodes" : [ 103 | { 104 | "mesh" : 0 105 | } 106 | ], 107 | ``` 108 | 109 | More details about scenes and nodes and their properties will be given in the [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) section. 110 | 111 | 112 | ## The `meshes` 113 | 114 | A [`mesh`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh) represents an actual geometric object that appears in the scene. The mesh itself usually does not have any properties, but only contains an array of [`mesh.primitive`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh-primitive) objects, which serve as building blocks for larger models. Each mesh primitive contains a description of the geometry data that the mesh consists of. 115 | 116 | The example consists of a single mesh, and has a single `mesh.primitive` object. The mesh primitive has an array of `attributes`. These are the attributes of the vertices of the mesh geometry, and in this case, this is only the `POSITION` attribute, describing the positions of the vertices. The mesh primitive describes an *indexed* geometry, which is indicated by the `indices` property. By default, it is assumed to describe a set of triangles, so that three consecutive indices are the indices of the vertices of one triangle. 117 | 118 | The actual geometry data of the mesh primitive is given by the `attributes` and the `indices`. These both refer to `accessor` objects, which will be explained below. 119 | 120 | ```javascript 121 | "meshes" : [ 122 | { 123 | "primitives" : [ { 124 | "attributes" : { 125 | "POSITION" : 1 126 | }, 127 | "indices" : 0 128 | } ] 129 | } 130 | ], 131 | ``` 132 | 133 | A more detailed description of meshes and mesh primitives can be found in the [meshes](gltfTutorial_009_Meshes.md) section. 134 | 135 | 136 | ## The `buffer`, `bufferView`, and `accessor` concepts 137 | 138 | The `buffer`, `bufferView`, and `accessor` objects provide information about the geometry data that the mesh primitives consist of. They are introduced here quickly, based on the specific example. A more detailed description of these concepts will be given in the [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) section. 139 | 140 | ### Buffers 141 | 142 | A [`buffer`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-buffer) defines a block of raw, unstructured data with no inherent meaning. 
It contains an `uri`, which can either point to an external file that contains the data, or it can be a [data URI](gltfTutorial_002_BasicGltfStructure.md#binary-data-in-data-uris) that encodes the binary data directly in the JSON file. 143 | 144 | In the example file, the second approach is used: there is a single buffer, containing 44 bytes, and the data of this buffer is encoded as a data URI: 145 | 146 | ```javascript 147 | "buffers" : [ 148 | { 149 | "uri" : "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAA=", 150 | "byteLength" : 44 151 | } 152 | ], 153 | ``` 154 | 155 | This data contains the indices of the triangle, and the vertex positions of the triangle. But in order to actually use this data as the geometry data of a mesh primitive, additional information about the *structure* of this data is required. This information about the structure is encoded in the `bufferView` and `accessor` objects. 156 | 157 | ### Buffer views 158 | 159 | A [`bufferView`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-bufferview) describes a "chunk" or a "slice" of the whole, raw buffer data. In the given example, there are two buffer views. They both refer to the same buffer. The first buffer view refers to the part of the buffer that contains the data of the indices: it has a `byteOffset` of 0 referring to the whole buffer data, and a `byteLength` of 6. The second buffer view refers to the part of the buffer that contains the vertex positions. It starts at a `byteOffset` of 8, and has a `byteLength` of 36; that is, it extends to the end of the whole buffer. 160 | 161 | ```javascript 162 | "bufferViews" : [ 163 | { 164 | "buffer" : 0, 165 | "byteOffset" : 0, 166 | "byteLength" : 6, 167 | "target" : 34963 168 | }, 169 | { 170 | "buffer" : 0, 171 | "byteOffset" : 8, 172 | "byteLength" : 36, 173 | "target" : 34962 174 | } 175 | ], 176 | ``` 177 | 178 | 179 | ### Accessors 180 | 181 | The second step of structuring the data is accomplished with [`accessor`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-accessor) objects. They define how the data of a `bufferView` has to be interpreted by providing information about the data types and the layout. 182 | 183 | In the example, there are two accessor objects. 184 | 185 | The first accessor describes the indices of the geometry data. It refers to the `bufferView` with index 0, which is the part of the `buffer` that contains the raw data for the indices. Additionally, it specifies the `count` and `type` of the elements and their `componentType`. In this case, there are 3 scalar elements, and their component type is given by a constant that stands for the `unsigned short` type. 186 | 187 | The second accessor describes the vertex positions. It contains a reference to the relevant part of the buffer data, via the `bufferView` with index 1, and its `count`, `type`, and `componentType` properties say that there are three elements of 3D vectors, each having `float` components. 
188 | 189 | 190 | ```javascript 191 | "accessors" : [ 192 | { 193 | "bufferView" : 0, 194 | "byteOffset" : 0, 195 | "componentType" : 5123, 196 | "count" : 3, 197 | "type" : "SCALAR", 198 | "max" : [ 2 ], 199 | "min" : [ 0 ] 200 | }, 201 | { 202 | "bufferView" : 1, 203 | "byteOffset" : 0, 204 | "componentType" : 5126, 205 | "count" : 3, 206 | "type" : "VEC3", 207 | "max" : [ 1.0, 1.0, 0.0 ], 208 | "min" : [ 0.0, 0.0, 0.0 ] 209 | } 210 | ], 211 | ``` 212 | 213 | As described above, a `mesh.primitive` may now refer to these accessors, using their indices: 214 | 215 | ```javascript 216 | "meshes" : [ 217 | { 218 | "primitives" : [ { 219 | "attributes" : { 220 | "POSITION" : 1 221 | }, 222 | "indices" : 0 223 | } ] 224 | } 225 | ], 226 | ``` 227 | 228 | When this `mesh.primitive` has to be rendered, the renderer can resolve the underlying buffer views and buffers and will send the required parts of the buffer to the renderer, together with the information about the data types and layout. A more detailed description of how the accessor data is obtained and processed by the renderer is given in the [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) section. 229 | 230 | 231 | 232 | 233 | ## The `asset` description 234 | 235 | In glTF 1.0, this property is still optional, but in subsequent glTF versions, the JSON file is required to contain an `asset` property that contains the `version` number. The example here says that the asset complies to glTF version 2.0: 236 | 237 | ```javascript 238 | "asset" : { 239 | "version" : "2.0" 240 | } 241 | ``` 242 | 243 | The `asset` property may contain additional metadata that is described in the [`asset` specification](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-asset). 244 | 245 | 246 | 247 | 248 | Previous: [Basic glTF Structure](gltfTutorial_002_BasicGltfStructure.md) | [Table of Contents](README.md) | Next: [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) 249 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_004_ScenesNodes.md: -------------------------------------------------------------------------------- 1 | Previous: [A Minimal glTF File](gltfTutorial_003_MinimalGltfFile.md) | [Table of Contents](README.md) | Next: [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) 2 | 3 | # Scenes and Nodes 4 | 5 | ## Scenes 6 | 7 | There may be multiple scenes stored in one glTF file. The `scene` property indicated which of these scenes should be the default scene that is displayed when the asset is loaded. Each scene contains an array of `nodes`, which are the indices of the root nodes of the scene graphs. Again, there may be multiple root nodes, forming different hierarchies, but in many cases, the scene will have a single root node. The most simple possible scene description has already been shown in the previous section, consisting of a single scene with a single node: 8 | 9 | ```javascript 10 | "scene": 0, 11 | "scenes" : [ 12 | { 13 | "nodes" : [ 0 ] 14 | } 15 | ], 16 | 17 | "nodes" : [ 18 | { 19 | "mesh" : 0 20 | } 21 | ], 22 | ``` 23 | 24 | 25 | ## Nodes forming the scene graph 26 | 27 | Each [`node`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-node) can contain an array called `children` that contains the indices of its child nodes. So each node is one element of a hierarchy of nodes, and together they define the structure of the scene as a scene graph. 28 | 29 |

30 |
31 | Image 4a: The scene graph representation stored in the glTF JSON. 32 |

33 | 34 | Each of the nodes that are given in the `scene` can be traversed, recursively visiting all their children, to process all elements that are attached to the nodes. The simplified pseudocode for this traversal may look like the following: 35 | 36 | ``` 37 | traverse(node) { 38 | // Process the meshes, cameras, etc., that are 39 | // attached to this node - discussed later 40 | processElements(node); 41 | 42 | // Recursively process all children 43 | for each (child in node.children) { 44 | traverse(child); 45 | } 46 | } 47 | ``` 48 | 49 | In practice, some additional information will be required for the traversal: the processing of some elements that are attached to nodes will require information about *which* node they are attached to. Additionally, the information about the transforms of the nodes has to be accumulated during the traversal. 50 | 51 | 52 | ### Local and global transforms 53 | 54 | Each node can have a transform. Such a transform will define a translation, rotation, and/or scale. This transform will be applied to all elements attached to the node itself and to all its child nodes. The hierarchy of nodes thus allows one to structure the translations, rotations, and scalings that are applied to the scene elements. 55 | 56 | 57 | #### Local transforms of nodes 58 | 59 | There are different possible representations for the local transform of a node. The transform can be given directly by the `matrix` property of the node. This is an array of 16 floating point numbers that describe the matrix in column-major order. For example, the following matrix describes a scaling about (2,1,0.5), a rotation about 30 degrees around the x-axis, and a translation about (10,20,30): 60 | 61 | ```javascript 62 | "node0": { 63 | "matrix": [ 64 | 2.0, 0.0, 0.0, 0.0, 65 | 0.0, 0.866, 0.5, 0.0, 66 | 0.0, -0.25, 0.433, 0.0, 67 | 10.0, 20.0, 30.0, 1.0 68 | ] 69 | } 70 | ``` 71 | 72 | The matrix defined here is as shown in Image 4b. 73 | 74 |

75 |
76 | Image 4b: An example matrix. 77 |

78 | 79 | 80 | The transform of a node can also be given using the `translation`, `rotation`, and `scale` properties of a node, which is sometimes abbreviated as *TRS*: 81 | 82 | ```javascript 83 | "node0": { 84 | "translation": [ 10.0, 20.0, 30.0 ], 85 | "rotation": [ 0.259, 0.0, 0.0, 0.966 ], 86 | "scale": [ 2.0, 1.0, 0.5 ] 87 | } 88 | ``` 89 | 90 | Each of these properties can be used to create a matrix, and the product of these matrices then is the local transform of the node: 91 | 92 | - The `translation` just contains the translation in x-, y-, and z-direction. For example, from a translation of `[ 10.0, 20.0, 30.0 ]`, one can create a translation matrix that contains this translation as its last column, as shown in Image 4c. 93 | 94 |

95 |
96 | Image 4c: A translation matrix. 97 |

98 | 99 | 100 | - The `rotation` is given as a [quaternion](https://en.wikipedia.org/wiki/Quaternion). The mathematical background of quaternions is beyond the scope of this tutorial. For now, the most important information is that a quaternion is a compact representation of a rotation about an arbitrary angle and around an arbitrary axis. It is stored as a tuple `(x,y,z,w)`, where the `w`-component is the cosine of half of the rotation angle. For example, the quaternion `[ 0.259, 0.0, 0.0, 0.966 ]` describes a rotation about 30 degrees, around the x-axis. So this quaternion can be converted into a rotation matrix, as shown in Image 4d. 101 | 102 |

103 |
104 | Image 4d: A rotation matrix. 105 |

106 | 107 | 108 | - The `scale` contains the scaling factors along the x-, y-, and z-axes. The corresponding matrix can be created by using these scaling factors as the entries on the diagonal of the matrix. For example, the scale matrix for the scaling factors `[ 2.0, 1.0, 0.5 ]` is shown in Image 4e. 109 | 110 |

111 |
112 | Image 4e: A scale matrix. 113 |

114 | 115 | When computing the final, local transform matrix of the node, these matrices are multiplied together. It is important to perform the multiplication of these matrices in the right order. The local transform matrix always has to be computed as `M = T * R * S`, where `T` is the matrix for the `translation` part, `R` is the matrix for the `rotation` part, and `S` is the matrix for the `scale` part. So the pseudocode for the computation is 116 | 117 | ``` 118 | translationMatrix = createTranslationMatrix(node.translation); 119 | rotationMatrix = createRotationMatrix(node.rotation); 120 | scaleMatrix = createScaleMatrix(node.scale); 121 | localTransform = translationMatrix * rotationMatrix * scaleMatrix; 122 | ``` 123 | 124 | For the example matrices given above, the final, local transform matrix of the node will be as shown in Image 4f. 125 | 126 |

127 |
128 | Image 4f: The final local transform matrix computed from the TRS properties. 129 |

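The computation described above can also be written down compactly in code. The following is a small sketch (using Python and numpy, which are not part of the tutorial) that composes the local transform from the TRS properties of the example node; the result matches the `matrix` shown earlier, up to rounding, because 0.259 and 0.966 are rounded values of the sine and cosine of 15 degrees (half of the 30 degree rotation angle):

```python
import numpy as np

def local_transform(translation, rotation, scale):
    """Compute M = T * R * S from the TRS properties of a node."""
    t = np.identity(4)
    t[:3, 3] = translation

    x, y, z, w = rotation  # glTF stores quaternions as (x, y, z, w)
    r = np.identity(4)
    r[:3, :3] = [
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w), 2 * (x * z + y * w)],
        [2 * (x * y + z * w), 1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w), 2 * (y * z + x * w), 1 - 2 * (x * x + y * y)],
    ]

    s = np.diag([scale[0], scale[1], scale[2], 1.0])
    return t @ r @ s

m = local_transform([10.0, 20.0, 30.0], [0.259, 0.0, 0.0, 0.966], [2.0, 1.0, 0.5])

# A glTF "matrix" property lists these 16 values in column-major order:
gltf_matrix = m.flatten(order="F").tolist()
```
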
130 | 131 | This matrix will cause the vertices of the meshes to be scaled, then rotated, and then translated according to the `scale`, `rotation`, and `translation` properties that have been given in the node. 132 | 133 | When any of the three properties is not given, the identity matrix will be used. Similarly, when a node contains neither a `matrix` property nor TRS-properties, then its local transform will be the identity matrix. 134 | 135 | 136 | 137 | #### Global transforms of nodes 138 | 139 | Regardless of the representation in the JSON file, the local transform of a node can be stored as a 4×4 matrix. The *global* transform of a node is given by the product of all local transforms on the path from the root to the respective node: 140 | 141 | Structure: local transform global transform 142 | root R R 143 | +- nodeA A R*A 144 | +- nodeB B R*A*B 145 | +- nodeC C R*A*C 146 | 147 | It is important to point out that after the file was loaded these global transforms can *not* be computed only once. Later, it will be shown how *animations* may modify the local transforms of individual nodes. And these modifications will affect the global transforms of all descendant nodes. Therefore, when the global transform of a node is required, it has to be computed directly from the current local transforms of all nodes. Alternatively, and as a potential performance improvement, an implementation could cache the global transforms, detect changes in the local transforms of ancestor nodes, and update the global transforms only when necessary. The different implementation options for this will depend on the programming language and the requirements for the client application, and thus are beyond the scope of this tutorial. 148 | 149 | 150 | 151 | Previous: [A Minimal glTF File](gltfTutorial_003_MinimalGltfFile.md) | [Table of Contents](README.md) | Next: [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) 152 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_006_SimpleAnimation.md: -------------------------------------------------------------------------------- 1 | Previous: [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) | [Table of Contents](README.md) | Next: [Animations](gltfTutorial_007_Animations.md) 2 | 3 | 4 | # A Simple Animation 5 | 6 | As shown in the [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) section, each node can have a local transform. This transform can be given either by the `matrix` property of the node or by using the `translation`, `rotation`, and `scale` (TRS) properties. 7 | 8 | When the transform is given by the TRS properties, an [`animation`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation) can be used to describe how the `translation`, `rotation`, or `scale` of a node changes over time. 9 | 10 | The following is the [minimal glTF file](gltfTutorial_003_MinimalGltfFile.md) that was shown previously, but extended with an animation. This section will explain the changes and extensions that have been made to add this animation. 
11 | 12 | 13 | ```javascript 14 | { 15 | "scene": 0, 16 | "scenes" : [ 17 | { 18 | "nodes" : [ 0 ] 19 | } 20 | ], 21 | 22 | "nodes" : [ 23 | { 24 | "mesh" : 0, 25 | "rotation" : [ 0.0, 0.0, 0.0, 1.0 ] 26 | } 27 | ], 28 | 29 | "meshes" : [ 30 | { 31 | "primitives" : [ { 32 | "attributes" : { 33 | "POSITION" : 1 34 | }, 35 | "indices" : 0 36 | } ] 37 | } 38 | ], 39 | 40 | "animations": [ 41 | { 42 | "samplers" : [ 43 | { 44 | "input" : 2, 45 | "interpolation" : "LINEAR", 46 | "output" : 3 47 | } 48 | ], 49 | "channels" : [ { 50 | "sampler" : 0, 51 | "target" : { 52 | "node" : 0, 53 | "path" : "rotation" 54 | } 55 | } ] 56 | } 57 | ], 58 | 59 | "buffers" : [ 60 | { 61 | "uri" : "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAA=", 62 | "byteLength" : 44 63 | }, 64 | { 65 | "uri" : "data:application/octet-stream;base64,AAAAAAAAgD4AAAA/AABAPwAAgD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAD0/TQ/9P00PwAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAPT9ND/0/TS/AAAAAAAAAAAAAAAAAACAPw==", 66 | "byteLength" : 100 67 | } 68 | ], 69 | "bufferViews" : [ 70 | { 71 | "buffer" : 0, 72 | "byteOffset" : 0, 73 | "byteLength" : 6, 74 | "target" : 34963 75 | }, 76 | { 77 | "buffer" : 0, 78 | "byteOffset" : 8, 79 | "byteLength" : 36, 80 | "target" : 34962 81 | }, 82 | { 83 | "buffer" : 1, 84 | "byteOffset" : 0, 85 | "byteLength" : 100 86 | } 87 | ], 88 | "accessors" : [ 89 | { 90 | "bufferView" : 0, 91 | "byteOffset" : 0, 92 | "componentType" : 5123, 93 | "count" : 3, 94 | "type" : "SCALAR", 95 | "max" : [ 2 ], 96 | "min" : [ 0 ] 97 | }, 98 | { 99 | "bufferView" : 1, 100 | "byteOffset" : 0, 101 | "componentType" : 5126, 102 | "count" : 3, 103 | "type" : "VEC3", 104 | "max" : [ 1.0, 1.0, 0.0 ], 105 | "min" : [ 0.0, 0.0, 0.0 ] 106 | }, 107 | { 108 | "bufferView" : 2, 109 | "byteOffset" : 0, 110 | "componentType" : 5126, 111 | "count" : 5, 112 | "type" : "SCALAR", 113 | "max" : [ 1.0 ], 114 | "min" : [ 0.0 ] 115 | }, 116 | { 117 | "bufferView" : 2, 118 | "byteOffset" : 20, 119 | "componentType" : 5126, 120 | "count" : 5, 121 | "type" : "VEC4", 122 | "max" : [ 0.0, 0.0, 1.0, 1.0 ], 123 | "min" : [ 0.0, 0.0, 0.0, -0.707 ] 124 | } 125 | ], 126 | 127 | "asset" : { 128 | "version" : "2.0" 129 | } 130 | 131 | } 132 | ``` 133 | 134 |

135 |
136 | Image 6a: A single, animated triangle. 137 |

138 | 139 | 140 | ## The `rotation` property of the `node` 141 | 142 | The only node in the example now has a `rotation` property. This is an array containing the four floating point values of the [quaternion](https://en.wikipedia.org/wiki/Quaternion) that describes the rotation: 143 | 144 | ```javascript 145 | "nodes" : [ 146 | { 147 | "mesh" : 0, 148 | "rotation" : [ 0.0, 0.0, 0.0, 1.0 ] 149 | } 150 | ], 151 | ``` 152 | 153 | The given value is the quaternion describing a "rotation about 0 degrees," so the triangle will be shown in its initial orientation. 154 | 155 | 156 | ## The animation data 157 | 158 | Three elements have been added to the top-level arrays of the glTF JSON to encode the animation data: 159 | 160 | - A new `buffer` containing the raw animation data; 161 | - A new `bufferView` that refers to the buffer; 162 | - Two new `accessor` objects that add structural information to the animation data. 163 | 164 | ### The `buffer` and the `bufferView` for the raw animation data 165 | 166 | A new `buffer` has been added, which contains the raw animation data. This buffer also uses a [data URI](gltfTutorial_002_BasicGltfStructure.md#binary-data-in-data-uris) to encode the 100 bytes that the animation data consists of: 167 | 168 | ```javascript 169 | "buffers" : [ 170 | ... 171 | { 172 | "uri" : "data:application/octet-stream;base64,AAAAAAAAgD4AAAA/AABAPwAAgD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAD0/TQ/9P00PwAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAPT9ND/0/TS/AAAAAAAAAAAAAAAAAACAPw==", 173 | "byteLength" : 100 174 | } 175 | ], 176 | 177 | "bufferViews" : [ 178 | ... 179 | { 180 | "buffer" : 1, 181 | "byteOffset" : 0, 182 | "byteLength" : 100 183 | } 184 | ], 185 | ``` 186 | 187 | There is also a new `bufferView`, which here simply refers to the new `buffer` with index 1, which contains the whole animation buffer data. Further structural information is added with the `accessor` objects described below. 188 | 189 | Note that one could also have appended the animation data to the existing buffer that already contained the geometry data of the triangle. In this case, the new buffer view would have referred to the `buffer` with index 0, and used an appropriate `byteOffset` to refer to the part of the buffer that then contained the animation data. 190 | 191 | In the example that is shown here, the animation data is added as a new buffer to keep the geometry data and the animation data separated. 192 | 193 | 194 | ### The `accessor` objects for the animation data 195 | 196 | Two new `accessor` objects have been added, which describe how to interpret the animation data. The first accessor describes the *times* of the animation key frames. There are five elements (as indicated by the `count` of 5), and each one is a scalar `float` value (which is 20 bytes in total). The second accessor says that after the first 20 bytes, there are five elements, each being a 4D vector with `float` components. These are the *rotations* that correspond to the five key frames of the animation, given as quaternions. 197 | 198 | ```javascript 199 | "accessors" : [ 200 | ... 
201 | { 202 | "bufferView" : 2, 203 | "byteOffset" : 0, 204 | "componentType" : 5126, 205 | "count" : 5, 206 | "type" : "SCALAR", 207 | "max" : [ 1.0 ], 208 | "min" : [ 0.0 ] 209 | }, 210 | { 211 | "bufferView" : 2, 212 | "byteOffset" : 20, 213 | "componentType" : 5126, 214 | "count" : 5, 215 | "type" : "VEC4", 216 | "max" : [ 0.0, 0.0, 1.0, 1.0 ], 217 | "min" : [ 0.0, 0.0, 0.0, -0.707 ] 218 | } 219 | ], 220 | 221 | ``` 222 | 223 | The actual data that is provided by the *times* accessor and the *rotations* accessor, using the data from the buffer in the example, is shown in this table: 224 | 225 | |*times* accessor |*rotations* accessor|Meaning| 226 | |---|---|---| 227 | |0.0| (0.0, 0.0, 0.0, 1.0 )| At 0.0 seconds, the triangle has a rotation of 0 degrees | 228 | |0.25| (0.0, 0.0, 0.707, 0.707)| At 0.25 seconds, it has a rotation of 90 degrees around the z-axis 229 | |0.5| (0.0, 0.0, 1.0, 0.0)| At 0.5 seconds, it has a rotation of 180 degrees around the z-axis | 230 | |0.75| (0.0, 0.0, 0.707, -0.707)| At 0.75 seconds, it has a rotation of 270 (= -90) degrees around the z-axis | 231 | |1.0| (0.0, 0.0, 0.0, 1.0)| At 1.0 seconds, it has a rotation of 360 (= 0) degrees around the z-axis | 232 | 233 | So this animation describes a rotation of 360 degrees around the z-axis that lasts 1 second. 234 | 235 | 236 | ## The `animation` 237 | 238 | Finally, this is the part where the actual animation is added. The top-level `animations` array contains a single `animation` object. It consists of two elements: 239 | 240 | - The `samplers`, which describe the sources of animation data; 241 | - The `channels`, which can be imagined as connecting a "source" of the animation data to a "target." 242 | 243 | In the given example, there is one sampler. Each sampler defines an `input` and an `output` property. They both refer to accessor objects. Here, these are the *times* accessor (with index 2) and the *rotations* accessor (with index 3) that have been described above. Additionally, the sampler defines an `interpolation` type, which is `"LINEAR"` in this example. 244 | 245 | There is also one `channel` in the example. This channel refers to the only sampler (with index 0) as the source of the animation data. The target of the animation is encoded in the `channel.target` object: it contains an `id` that refers to the node whose property should be animated. The actual node property is named in the `path`. So the channel target in the given example says that the `"rotation"` property of the node with index 0 should be animated. 246 | 247 | 248 | ```javascript 249 | "animations": [ 250 | { 251 | "samplers" : [ 252 | { 253 | "input" : 2, 254 | "interpolation" : "LINEAR", 255 | "output" : 3 256 | } 257 | ], 258 | "channels" : [ { 259 | "sampler" : 0, 260 | "target" : { 261 | "node" : 0, 262 | "path" : "rotation" 263 | } 264 | } ] 265 | } 266 | ], 267 | ``` 268 | 269 | Combining all this information, the given animation object says the following: 270 | 271 | > During the animation, the animated values are obtained from the *rotations* accessor. They are interpolated linearly, based on the current simulation time and the key frame times that are provided by the *times* accessor. The interpolated values are then written into the `"rotation"` property of the node with index 0. 272 | 273 | A more detailed description and actual examples for the interpolation and the computations that are involved here can be found in the [Animations](gltfTutorial_007_Animations.md) section. 
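As a cross-check, the keyframe data can be decoded directly from the data URI of the animation buffer, using the layout that the two accessors describe: five scalar `float` times at byte offset 0, followed by five 4D rotation quaternions at byte offset 20. The following sketch (in Python; it is not part of the asset itself) reproduces the table above:

```python
import base64
import struct

# The data URI of the second buffer (the 100-byte animation data)
uri = "data:application/octet-stream;base64,AAAAAAAAgD4AAAA/AABAPwAAgD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAD0/TQ/9P00PwAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAPT9ND/0/TS/AAAAAAAAAAAAAAAAAACAPw=="
data = base64.b64decode(uri.split(",", 1)[1])

# Accessor 2: five SCALAR floats at byteOffset 0 (the key frame times)
times = struct.unpack_from("<5f", data, 0)

# Accessor 3: five VEC4 floats at byteOffset 20 (the rotation quaternions)
rotations = [struct.unpack_from("<4f", data, 20 + 16 * i) for i in range(5)]

for time, rotation in zip(times, rotations):
    print(time, rotation)
```
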
274 | 275 | 276 | 277 | Previous: [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) | [Table of Contents](README.md) | Next: [Animations](gltfTutorial_007_Animations.md) 278 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_007_Animations.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Animation](gltfTutorial_006_SimpleAnimation.md) | [Table of Contents](README.md) | Next: [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) 2 | 3 | # Animations 4 | 5 | As shown in the [Simple Animation](gltfTutorial_006_SimpleAnimation.md) example, an [`animation`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation) can be used to describe how the `translation`, `rotation`, or `scale` properties of nodes change over time. 6 | 7 | The following is another example of an `animation`. This time, the animation contains two channels. One animates the translation, and the other animates the rotation of a node: 8 | 9 | ```javascript 10 | "animations": [ 11 | { 12 | "samplers" : [ 13 | { 14 | "input" : 2, 15 | "interpolation" : "LINEAR", 16 | "output" : 3 17 | }, 18 | { 19 | "input" : 2, 20 | "interpolation" : "LINEAR", 21 | "output" : 4 22 | } 23 | ], 24 | "channels" : [ 25 | { 26 | "sampler" : 0, 27 | "target" : { 28 | "node" : 0, 29 | "path" : "rotation" 30 | } 31 | }, 32 | { 33 | "sampler" : 1, 34 | "target" : { 35 | "node" : 0, 36 | "path" : "translation" 37 | } 38 | } 39 | ] 40 | } 41 | ], 42 | ``` 43 | 44 | 45 | ## Animation samplers 46 | 47 | The `samplers` array contains [`animation.sampler`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation-sampler) objects that define how the values that are provided by the accessors have to be interpolated between the key frames, as shown in Image 7a. 48 | 49 |

50 |
51 | Image 7a: Animation samplers. 52 |

53 | 54 | In order to compute the value of the translation for the current animation time, the following algorithm can be used: 55 | 56 | * Let the current animation time be given as `currentTime`. 57 | * Compute the next smaller and the next larger element of the *times* accessor: 58 | 59 | `previousTime` = The largest element from the *times* accessor that is smaller than the `currentTime` 60 | 61 | `nextTime` = The smallest element from the *times* accessor that is larger than the `currentTime` 62 | 63 | * Obtain the elements from the *translations* accessor that correspond to these times: 64 | 65 | `previousTranslation` = The element from the *translations* accessor that corresponds to the `previousTime` 66 | 67 | `nextTranslation` = The element from the *translations* accessor that corresponds to the `nextTime` 68 | 69 | * Compute the interpolation value. This is a value between 0.0 and 1.0 that describes the *relative* position of the `currentTime`, between the `previousTime` and the `nextTime`: 70 | 71 | `interpolationValue = (currentTime - previousTime) / (nextTime - previousTime)` 72 | 73 | * Use the interpolation value to compute the translation for the current time: 74 | 75 | `currentTranslation = previousTranslation + interpolationValue * (nextTranslation - previousTranslation)` 76 | 77 | 78 | ### Example: 79 | 80 | Imagine the `currentTime` is **1.2**. The next smaller element from the *times* accessor is **0.8**. The next larger element is **1.6**. So 81 | 82 | previousTime = 0.8 83 | nextTime = 1.6 84 | 85 | The corresponding values from the *translations* accessor can be looked up: 86 | 87 | previousTranslation = (14.0, 3.0, -2.0) 88 | nextTranslation = (18.0, 1.0, 1.0) 89 | 90 | The interpolation value can be computed: 91 | 92 | interpolationValue = (currentTime - previousTime) / (nextTime - previousTime) 93 | = (1.2 - 0.8) / (1.6 - 0.8) 94 | = 0.4 / 0.8 95 | = 0.5 96 | 97 | From the interpolation value, the current translation can be computed: 98 | 99 | currentTranslation = previousTranslation + interpolationValue * (nextTranslation - previousTranslation) 100 | = (14.0, 3.0, -2.0) + 0.5 * ( (18.0, 1.0, 1.0) - (14.0, 3.0, -2.0) ) 101 | = (14.0, 3.0, -2.0) + 0.5 * (4.0, -2.0, 3.0) 102 | = (16.0, 2.0, -0.5) 103 | 104 | So when the current time is **1.2**, then the `translation` of the node is **(16.0, 2.0, -0.5)**. 105 | 106 | 107 | 108 | ## Animation channels 109 | 110 | The animations contain an array of [`animation.channel`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation-channel) objects. The channels establish the connection between the input, which is the value that is computed from the sampler, and the output, which is the animated node property. Therefore, each channel refers to one sampler, using the index of the sampler, and contains an [`animation.channel.target`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-animation-channel-target). The `target` refers to a node, using the index of the node, and contains a `path` that defines the property of the node that should be animated. The value from the sampler will be written into this property. 111 | 112 | In the example above, there are two channels for the animation. Both refer to the same node. The path of the first channel refers to the `translation` of the node, and the path of the second channel refers to the `rotation` of the node. 
So all objects (meshes) that are attached to the node will be translated and rotated by the animation, as shown in Image 7b. 113 | 114 |

*Image 7b: Animation channels.*
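The sampler and channel logic described above can be summarized in a few lines of code. The following is a minimal JavaScript sketch, not part of the glTF asset itself: it assumes that the keyframe times and the corresponding translations have already been read from the sampler's input and output accessors into plain arrays, and it reproduces the worked example from above.

```javascript
// Minimal sketch: evaluate a LINEAR translation sampler at a given time.
// `times` holds the ascending keyframe times, `translations` one [x, y, z]
// value per keyframe (both assumed to be decoded from the accessors already).
function sampleTranslation(times, translations, currentTime) {
  // Clamp to the first/last keyframe outside of the animated time range.
  if (currentTime <= times[0]) return translations[0];
  if (currentTime >= times[times.length - 1]) return translations[translations.length - 1];

  // Find previousTime/nextTime so that previousTime <= currentTime < nextTime.
  let next = 1;
  while (times[next] <= currentTime) next++;
  const previous = next - 1;

  const interpolationValue =
    (currentTime - times[previous]) / (times[next] - times[previous]);

  // currentTranslation = previousTranslation + interpolationValue * (nextTranslation - previousTranslation)
  return translations[previous].map(
    (p, i) => p + interpolationValue * (translations[next][i] - p)
  );
}

// The two keyframes from the worked example above:
const times = [0.8, 1.6];
const translations = [[14.0, 3.0, -2.0], [18.0, 1.0, 1.0]];
console.log(sampleTranslation(times, translations, 1.2)); // [16, 2, -0.5]
```

A real implementation would write the resulting value into the node property named by the channel's `target.path`, and would also handle the other interpolation modes described below.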

118 | 119 | ## Interpolation 120 | 121 | The above example only covers `LINEAR` interpolation. Animations in a glTF asset can use three interpolation modes: 122 | 123 | - `STEP` 124 | - `LINEAR` 125 | - `CUBICSPLINE` 126 | 127 | ### Step 128 | 129 | The `STEP` mode is not really an interpolation mode: it makes objects jump from keyframe to keyframe *without any sort of interpolation*. When a sampler defines step interpolation, simply apply the transformation of the keyframe corresponding to `previousTime`. 130 | 131 | ### Linear 132 | 133 | Linear interpolation corresponds exactly to the above example. The general case is as follows. 134 | 135 | Calculate the `interpolationValue`: 136 | 137 | ``` 138 | interpolationValue = (currentTime - previousTime) / (nextTime - previousTime) 139 | ``` 140 | 141 | For scalar and vector types, use a linear interpolation (generally called `lerp` in mathematics libraries). Here's a pseudocode implementation for reference: 142 | 143 | ``` 144 | Point lerp(previousPoint, nextPoint, interpolationValue) 145 | return previousPoint + interpolationValue * (nextPoint - previousPoint) 146 | ``` 147 | 148 | In the case of rotations expressed as quaternions, you need to perform a spherical linear interpolation (`slerp`) between the previous and next values: 149 | 150 | ``` 151 | Quat slerp(previousQuat, nextQuat, interpolationValue) 152 | var dotProduct = dot(previousQuat, nextQuat) 153 | 154 | // make sure we take the shortest path in case the dot product is negative 155 | if(dotProduct < 0.0) 156 | nextQuat = -nextQuat 157 | dotProduct = -dotProduct 158 | 159 | // if the two quaternions are too close to each other, just linearly interpolate between the 4D vectors 160 | if(dotProduct > 0.9995) 161 | return normalize(previousQuat + interpolationValue * (nextQuat - previousQuat)) 162 | 163 | // perform the spherical linear interpolation 164 | var theta_0 = acos(dotProduct) 165 | var theta = interpolationValue * theta_0 166 | var sin_theta = sin(theta) 167 | var sin_theta_0 = sin(theta_0) 168 | 169 | var scalePreviousQuat = cos(theta) - dotProduct * sin_theta / sin_theta_0 170 | var scaleNextQuat = sin_theta / sin_theta_0 171 | return scalePreviousQuat * previousQuat + scaleNextQuat * nextQuat 172 | ``` 173 | 174 | This example implementation is inspired by this [Wikipedia article](https://en.wikipedia.org/wiki/Slerp). 175 | 176 | ### Cubic Spline interpolation 177 | 178 | Cubic spline interpolation needs more data than just the previous and next keyframe times and values; it also needs, for each keyframe, a pair of tangent vectors that smooth out the curve around the keyframe points. 179 | 180 | These tangents are stored in the output of the animation sampler.
For each keyframe described by the animation sampler, the output contains three elements: 181 | 182 | - The input tangent of the keyframe 183 | - The keyframe value 184 | - The output tangent of the keyframe 185 | 186 | The input and output tangents are given per unit of time, so they need to be scaled by the duration of the keyframe interval, which we will call `deltaTime`: 187 | 188 | ``` 189 | deltaTime = nextTime - previousTime 190 | ``` 191 | 192 | To calculate the value for `currentTime`, you will need to fetch from the sampler output: 193 | 194 | - The output tangent direction of the `previousTime` keyframe 195 | - The value of the `previousTime` keyframe 196 | - The value of the `nextTime` keyframe 197 | - The input tangent direction of the `nextTime` keyframe 198 | 199 | *Note: the input tangent of the first keyframe and the output tangent of the last keyframe are ignored.* 200 | 201 | To calculate the actual tangents of the keyframe, multiply the direction vectors by `deltaTime`: 202 | 203 | ``` 204 | previousTangent = deltaTime * previousOutputTangent 205 | nextTangent = deltaTime * nextInputTangent 206 | ``` 207 | 208 | The mathematical function is described in [Appendix C](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#appendix-c-interpolation) of the glTF 2.0 specification. 209 | 210 | Here's a corresponding pseudocode snippet: 211 | 212 | ``` 213 | Point cubicSpline(previousPoint, previousTangent, nextPoint, nextTangent, interpolationValue) 214 | t = interpolationValue 215 | t2 = t * t 216 | t3 = t2 * t 217 | 218 | return (2 * t3 - 3 * t2 + 1) * previousPoint + (t3 - 2 * t2 + t) * previousTangent + (-2 * t3 + 3 * t2) * nextPoint + (t3 - t2) * nextTangent; 219 | ``` 220 | 221 | 222 | Previous: [Simple Animation](gltfTutorial_006_SimpleAnimation.md) | [Table of Contents](README.md) | Next: [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) 223 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_008_SimpleMeshes.md: -------------------------------------------------------------------------------- 1 | Previous: [Animations](gltfTutorial_007_Animations.md) | [Table of Contents](README.md) | Next: [Meshes](gltfTutorial_009_Meshes.md) 2 | 3 | # Simple Meshes 4 | 5 | A [`mesh`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh) represents a geometric object that appears in a scene. An example of a `mesh` has already been shown in the [minimal glTF file](gltfTutorial_003_MinimalGltfFile.md). This example had a single `mesh` attached to a single `node`, and the mesh consisted of a single [`mesh.primitive`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh-primitive) that contained only a single attribute—namely, the attribute for the vertex positions. But usually, the mesh primitives will contain more attributes. These attributes may, for example, be the vertex normals or texture coordinates.
6 | 7 | The following is a glTF asset that contains a simple mesh with multiple attributes, which will serve as the basis for explaining the related concepts: 8 | 9 | ```javascript 10 | { 11 | "scene": 0, 12 | "scenes" : [ 13 | { 14 | "nodes" : [ 0, 1] 15 | } 16 | ], 17 | "nodes" : [ 18 | { 19 | "mesh" : 0 20 | }, 21 | { 22 | "mesh" : 0, 23 | "translation" : [ 1.0, 0.0, 0.0 ] 24 | } 25 | ], 26 | 27 | "meshes" : [ 28 | { 29 | "primitives" : [ { 30 | "attributes" : { 31 | "POSITION" : 1, 32 | "NORMAL" : 2 33 | }, 34 | "indices" : 0 35 | } ] 36 | } 37 | ], 38 | 39 | "buffers" : [ 40 | { 41 | "uri" : "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAAAAgD8AAAAAAAAAAAAAgD8AAAAAAAAAAAAAgD8=", 42 | "byteLength" : 80 43 | } 44 | ], 45 | "bufferViews" : [ 46 | { 47 | "buffer" : 0, 48 | "byteOffset" : 0, 49 | "byteLength" : 6, 50 | "target" : 34963 51 | }, 52 | { 53 | "buffer" : 0, 54 | "byteOffset" : 8, 55 | "byteLength" : 72, 56 | "target" : 34962 57 | } 58 | ], 59 | "accessors" : [ 60 | { 61 | "bufferView" : 0, 62 | "byteOffset" : 0, 63 | "componentType" : 5123, 64 | "count" : 3, 65 | "type" : "SCALAR", 66 | "max" : [ 2 ], 67 | "min" : [ 0 ] 68 | }, 69 | { 70 | "bufferView" : 1, 71 | "byteOffset" : 0, 72 | "componentType" : 5126, 73 | "count" : 3, 74 | "type" : "VEC3", 75 | "max" : [ 1.0, 1.0, 0.0 ], 76 | "min" : [ 0.0, 0.0, 0.0 ] 77 | }, 78 | { 79 | "bufferView" : 1, 80 | "byteOffset" : 36, 81 | "componentType" : 5126, 82 | "count" : 3, 83 | "type" : "VEC3", 84 | "max" : [ 0.0, 0.0, 1.0 ], 85 | "min" : [ 0.0, 0.0, 1.0 ] 86 | } 87 | ], 88 | 89 | "asset" : { 90 | "version" : "2.0" 91 | } 92 | } 93 | ``` 94 | 95 | Image 8a shows the rendered glTF asset. 96 | 97 |

*Image 8a: A simple mesh, attached to two nodes.*

101 | 102 | 103 | ## The mesh definition 104 | 105 | The given example still contains a single mesh that has a single mesh primitive. But this mesh primitive contains multiple attributes: 106 | 107 | ```javascript 108 | "meshes" : [ 109 | { 110 | "primitives" : [ { 111 | "attributes" : { 112 | "POSITION" : 1, 113 | "NORMAL" : 2 114 | }, 115 | "indices" : 0 116 | } ] 117 | } 118 | ], 119 | ``` 120 | 121 | In addition to the `"POSITION"` attribute, it has a `"NORMAL"` attribute. This refers to the `accessor` object that provides the vertex normals, as described in the [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) section. 122 | 123 | 124 | ## The rendered mesh instances 125 | 126 | As can be seen in Image 8a, the mesh is rendered *twice*. This is accomplished by attaching the mesh to two different nodes: 127 | 128 | ```javascript 129 | "nodes" : [ 130 | { 131 | "mesh" : 0 132 | }, 133 | { 134 | "mesh" : 0, 135 | "translation" : [ 1.0, 0.0, 0.0 ] 136 | } 137 | ], 138 | ``` 139 | 140 | The `mesh` property of each node refers to the mesh that is attached to the node, using the index of the mesh. One of the nodes has a `translation` that causes the attached mesh to be rendered at a different position. 141 | 142 | The [next section](gltfTutorial_009_Meshes.md) will explain meshes and mesh primitives in more detail. 143 | 144 | 145 | 146 | Previous: [Animations](gltfTutorial_007_Animations.md) | [Table of Contents](README.md) | Next: [Meshes](gltfTutorial_009_Meshes.md) 147 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_009_Meshes.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) | [Table of Contents](README.md) | Next: [Materials](gltfTutorial_010_Materials.md) 2 | 3 | # Meshes 4 | 5 | The [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) example from the previous section showed a basic example of a [`mesh`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh) with a [`mesh.primitive`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-mesh-primitive) object that contained several attributes. This section will explain the meaning and usage of mesh primitives, how meshes may be attached to nodes of the scene graph, and how they can be rendered with different materials. 6 | 7 | 8 | ## Mesh primitives 9 | 10 | Each `mesh` contains an array of `mesh.primitive` objects. These mesh primitive objects are smaller parts or building blocks of a larger object. A mesh primitive summarizes all information about how the respective part of the object will be rendered. 11 | 12 | 13 | ### Mesh primitive attributes 14 | 15 | A mesh primitive defines the geometry data of the object using its `attributes` dictionary. This geometry data is given by references to `accessor` objects that contain the data of vertex attributes. The details of the `accessor` concept are explained in the [Buffers, BufferViews, and Accessors](gltfTutorial_005_BuffersBufferViewsAccessors.md) section. 16 | 17 | In the given example, there are two entries in the `attributes` dictionary. 
The entries refer to the `positionsAccessor` and the `normalsAccessor`: 18 | 19 | ```javascript 20 | "meshes" : [ 21 | { 22 | "primitives" : [ { 23 | "attributes" : { 24 | "POSITION" : 1, 25 | "NORMAL" : 2 26 | }, 27 | "indices" : 0 28 | } ] 29 | } 30 | ], 31 | ``` 32 | 33 | Together, the elements of these accessors define the attributes that belong to the individual vertices, as shown in Image 9a. 34 | 35 |

*Image 9a: Mesh primitive accessors containing the data of vertices.*
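As an aside, the way these accessors slice the embedded buffer can be made explicit with a few typed-array views. This is only an illustrative sketch, shown here for Node.js (where `Buffer` is available for base64 decoding); the byte offsets are the ones given by the `bufferViews` and `accessors` of the example asset.

```javascript
// Illustrative sketch: typed-array views for the three accessors of the
// simple mesh example. `uri` is the data URI string of the buffer.
function decodeSimpleMesh(uri) {
  const base64 = uri.split(",")[1];
  // Copy into a fresh, aligned ArrayBuffer (Node.js Buffer is used for decoding).
  const bytes = Uint8Array.from(Buffer.from(base64, "base64"));

  // Accessor 0: indices, componentType 5123 (UNSIGNED_SHORT), bufferView 0 (byteOffset 0).
  const indices = new Uint16Array(bytes.buffer, 0, 3);         // [0, 1, 2]

  // Accessors 1 and 2: POSITION and NORMAL, componentType 5126 (FLOAT), both in
  // bufferView 1 (buffer byteOffset 8), with accessor byteOffsets 0 and 36.
  const positions = new Float32Array(bytes.buffer, 8 + 0, 9);  // 3 x VEC3
  const normals   = new Float32Array(bytes.buffer, 8 + 36, 9); // 3 x VEC3

  return { indices, positions, normals };
}
```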

39 | 40 | 41 | ### Indexed and non-indexed geometry 42 | 43 | The geometry data of a `mesh.primitive` may be either *indexed* geometry or geometry without indices. In the given example, the `mesh.primitive` contains *indexed* geometry. This is indicated by the `indices` property, which refers to the accessor with index 0, defining the data for the indices. For non-indexed geometry, this property is omitted. 44 | 45 | 46 | ### Mesh primitive mode 47 | 48 | By default, the geometry data is assumed to describe a triangle mesh. For the case of *indexed* geometry, this means that three consecutive elements of the `indices` accessor are assumed to contain the indices of a single triangle. For non-indexed geometry, three elements of the vertex attribute accessors are assumed to contain the attributes of the three vertices of a triangle. 49 | 50 | Other rendering modes are possible: the geometry data may also describe individual points, lines, or triangle strips. This is indicated by the `mode` that may be stored in the mesh primitive. Its value is a constant that indicates how the geometry data has to be interpreted. The mode may, for example, be `0` when the geometry consists of points, or `4` when it consists of triangles. These constants correspond to the GL constants `POINTS` or `TRIANGLES`, respectively. See the [`primitive.mode` specification](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#_mesh_primitive_mode) for a list of available modes. 51 | 52 | ### Mesh primitive material 53 | 54 | The mesh primitive may also refer to the `material` that should be used for rendering, using the index of this material. In the given example, no `material` is defined, causing the objects to be rendered with a default material that just defines the objects to have a uniform 50% gray color. A detailed explanation of materials and the related concepts will be given in the [Materials](gltfTutorial_010_Materials.md) section. 55 | 56 | 57 | ## Meshes attached to nodes 58 | 59 | In the example from the [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) section, there is a single `scene`, which contains two nodes, and both nodes refer to the same `mesh` instance, which has the index 0: 60 | 61 | ```javascript 62 | "scenes" : [ 63 | { 64 | "nodes" : [ 0, 1] 65 | } 66 | ], 67 | "nodes" : [ 68 | { 69 | "mesh" : 0 70 | }, 71 | { 72 | "mesh" : 0, 73 | "translation" : [ 1.0, 0.0, 0.0 ] 74 | } 75 | ], 76 | 77 | "meshes" : [ 78 | { ... } 79 | ], 80 | ``` 81 | 82 | The second node has a `translation` property. As shown in the [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) section, this will be used to compute the local transform matrix of this node. In this case, the matrix will cause a translation of 1.0 along the x-axis. The product of all local transforms of the nodes will yield the [global transform](gltfTutorial_004_ScenesNodes.md#global-transforms-of-nodes). And all elements that are attached to the nodes will be rendered with this global transform. 83 | 84 | So in this example, the mesh will be rendered twice because it is attached to two nodes: once with the global transform of the first node, which is the identity transform, and once with the global transform of the second node, which is a translation of 1.0 along the x-axis. 
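To make this instancing behavior explicit, here is a rough JavaScript sketch of the traversal that a renderer might perform. The helpers `multiply`, `localMatrixOf` (building a 4×4 matrix from a node's `matrix` or translation/rotation/scale), `drawMesh`, and `IDENTITY` are hypothetical placeholders for whatever math library and rendering backend are actually used.

```javascript
// Rough sketch: draw every mesh once per node that refers to it, using the
// node's global transform. `multiply`, `localMatrixOf`, `drawMesh`, and
// `IDENTITY` are hypothetical helpers of the surrounding renderer.
function renderScene(gltf, sceneIndex) {
  for (const rootNodeIndex of gltf.scenes[sceneIndex].nodes) {
    renderNode(gltf, rootNodeIndex, IDENTITY);
  }
}

function renderNode(gltf, nodeIndex, parentTransform) {
  const node = gltf.nodes[nodeIndex];
  // Global transform = parent's global transform * local transform.
  const globalTransform = multiply(parentTransform, localMatrixOf(node));

  if (node.mesh !== undefined) {
    // In the example, both nodes refer to mesh 0, so the mesh is drawn twice.
    drawMesh(gltf.meshes[node.mesh], globalTransform);
  }
  for (const childIndex of node.children || []) {
    renderNode(gltf, childIndex, globalTransform);
  }
}
```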
85 | 86 | 87 | 88 | Previous: [Simple Meshes](gltfTutorial_008_SimpleMeshes.md) | [Table of Contents](README.md) | Next: [Materials](gltfTutorial_010_Materials.md) 89 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_010_Materials.md: -------------------------------------------------------------------------------- 1 | Previous: [Meshes](gltfTutorial_009_Meshes.md) | [Table of Contents](README.md) | Next: [Simple Material](gltfTutorial_011_SimpleMaterial.md) 2 | 3 | # Materials 4 | 5 | ## Introduction 6 | 7 | The purpose of glTF is to define a transmission format for 3D assets. As shown in the previous sections, this includes information about the scene structure and the geometric objects that appear in the scene. But a glTF asset can also contain information about the *appearance* of the objects; that is, how these objects should be rendered on the screen. 8 | 9 | There are different possible representations for the properties of a material, and the *shading model* describes how these properties are processed. Simple shading models, like the [Phong](https://en.wikipedia.org/wiki/Phong_reflection_model) or [Blinn-Phong](https://en.wikipedia.org/wiki/Blinn%E2%80%93Phong_shading_model), are directly supported by common graphics APIs like OpenGL or WebGL. These shading models are built on a set of basic material properties. For example, the material properties involve information about the color of diffusely reflected light (often in the form of a texture), the color of specularly reflected light, and a shininess parameter. Many file formats contain exactly these parameters. For example, [Wavefront OBJ](https://en.wikipedia.org/wiki/Wavefront_.obj_file) files are combined with `MTL` files that contain this texture and color information. Renderers can read this information and render the objects accordingly. But in order to describe more realistic materials, more sophisticated shading and material models are required. 10 | 11 | ## Physically-Based Rendering (PBR) 12 | 13 | To allow renderers to display objects with a realistic appearance under different lighting conditions, the shading model has to take the *physical* properties of the object surface into account. There are different representations of these physical material properties. One that is frequently used is the *metallic-roughness-model*. Here, the information about the object surface is encoded with three main parameters: 14 | 15 | - The *base color*, which is the "main" color of the object surface. 16 | - The *metallic* value. This is a parameter that describes how much the reflective behavior of the material resembles that of a metal. 17 | - The *roughness* value, indicating how rough the surface is, affecting the light scattering. 18 | 19 | The metallic-roughness model is the representation that is used in glTF. Other material representations, like the *specular-glossiness-model*, are supported via extensions. 20 | 21 | The effects of different metallic- and roughness values are illustrated in this image: 22 | 23 |

*Image 10a: Spheres with different metallic- and roughness values.*

27 | 28 | The base color, metallic, and roughness properties may be given as single values and are then applied to the whole object. In order to assign different material properties to different parts of the object surface, these properties may also be given in the form of textures. This makes it possible to model a wide range of real-world materials with a realistic appearance. 29 | 30 | Depending on the shading model, additional effects can be applied to the object surface. These are usually given as a combination of a texture and a scaling factor: 31 | 32 | - An *emissive* texture describes the parts of the object surface that emit light with a certain color. 33 | - The *occlusion* texture can be used to simulate the effect of objects self-shadowing each other. 34 | - The *normal map* is a texture applied to modulate the surface normal in a way that makes it possible to simulate finer geometric details without the cost of a higher mesh resolution. 35 | 36 | glTF supports all of these additional properties, and defines sensible default values for the cases that these properties are omitted. 37 | 38 | The following sections will show how these material properties are encoded in a glTF asset, including various examples of materials: 39 | 40 | - [A Simple Material](gltfTutorial_011_SimpleMaterial.md) 41 | - [Textures, Images, and Samplers](gltfTutorial_012_TexturesImagesSamplers.md) that serve as a basis for defining material properties 42 | - [A Simple Texture](gltfTutorial_013_SimpleTexture.md) showing an example of how to use a texture for a material 43 | - [An Advanced Material](gltfTutorial_014_AdvancedMaterial.md) combining multiple textures to achieve a sophisticated surface appearance for the objects 44 | 45 | 46 | Previous: [Meshes](gltfTutorial_009_Meshes.md) | [Table of Contents](README.md) | Next: [Simple Material](gltfTutorial_011_SimpleMaterial.md) 47 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_011_SimpleMaterial.md: -------------------------------------------------------------------------------- 1 | Previous: [Materials](gltfTutorial_010_Materials.md) | [Table of Contents](README.md) | Next: [Textures, Images, Samplers](gltfTutorial_012_TexturesImagesSamplers.md) 2 | 3 | # A Simple Material 4 | 5 | The examples of glTF assets that have been given in the previous sections contained a basic scene structure and simple geometric objects. But they did not contain information about the appearance of the objects. When no such information is given, viewers are encouraged to render the objects with a "default" material. And as shown in the screenshot of the [minimal glTF file](gltfTutorial_003_MinimalGltfFile.md), depending on the light conditions in the scene, this default material causes the object to be rendered with a uniformly white or light gray color. 6 | 7 | This section will start with an example of a very simple material and explain the effect of the different material properties. 
8 | 9 | This is a minimal glTF asset with a simple material: 10 | 11 | ```javascript 12 | { 13 | "scene": 0, 14 | "scenes" : [ 15 | { 16 | "nodes" : [ 0 ] 17 | } 18 | ], 19 | 20 | "nodes" : [ 21 | { 22 | "mesh" : 0 23 | } 24 | ], 25 | 26 | "meshes" : [ 27 | { 28 | "primitives" : [ { 29 | "attributes" : { 30 | "POSITION" : 1 31 | }, 32 | "indices" : 0, 33 | "material" : 0 34 | } ] 35 | } 36 | ], 37 | 38 | "buffers" : [ 39 | { 40 | "uri" : "data:application/octet-stream;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAA=", 41 | "byteLength" : 44 42 | } 43 | ], 44 | "bufferViews" : [ 45 | { 46 | "buffer" : 0, 47 | "byteOffset" : 0, 48 | "byteLength" : 6, 49 | "target" : 34963 50 | }, 51 | { 52 | "buffer" : 0, 53 | "byteOffset" : 8, 54 | "byteLength" : 36, 55 | "target" : 34962 56 | } 57 | ], 58 | "accessors" : [ 59 | { 60 | "bufferView" : 0, 61 | "byteOffset" : 0, 62 | "componentType" : 5123, 63 | "count" : 3, 64 | "type" : "SCALAR", 65 | "max" : [ 2 ], 66 | "min" : [ 0 ] 67 | }, 68 | { 69 | "bufferView" : 1, 70 | "byteOffset" : 0, 71 | "componentType" : 5126, 72 | "count" : 3, 73 | "type" : "VEC3", 74 | "max" : [ 1.0, 1.0, 0.0 ], 75 | "min" : [ 0.0, 0.0, 0.0 ] 76 | } 77 | ], 78 | 79 | "materials" : [ 80 | { 81 | "pbrMetallicRoughness": { 82 | "baseColorFactor": [ 1.000, 0.766, 0.336, 1.0 ], 83 | "metallicFactor": 0.5, 84 | "roughnessFactor": 0.1 85 | } 86 | } 87 | ], 88 | "asset" : { 89 | "version" : "2.0" 90 | } 91 | } 92 | ``` 93 | 94 | When rendered, this asset will show the triangle with a new material, as shown in Image 11a. 95 | 96 |

*Image 11a: A triangle with a simple material.*

100 | 101 | 102 | ## Material definition 103 | 104 | 105 | A new top-level array has been added to the glTF JSON to define this material: The `materials` array contains a single element that defines the material and its properties: 106 | 107 | ```javascript 108 | "materials" : [ 109 | { 110 | "pbrMetallicRoughness": { 111 | "baseColorFactor": [ 1.000, 0.766, 0.336, 1.0 ], 112 | "metallicFactor": 0.5, 113 | "roughnessFactor": 0.1 114 | } 115 | } 116 | ], 117 | ``` 118 | 119 | The actual definition of the [`material`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material) here only consists of the [`pbrMetallicRoughness`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material-pbrmetallicroughness) object, which defines the basic properties of a material in the *metallic-roughness-model*. (All other material properties will therefore have default values, which will be explained later.) The `baseColorFactor` contains the red, green, blue, and alpha components of the main color of the material - here, a bright orange color. The `metallicFactor` of 0.5 indicates that the material should have reflection characteristics between that of a metal and a non-metal material. The `roughnessFactor` causes the material to not be perfectly mirror-like, but instead scatter the reflected light a bit. 120 | 121 | ## Assigning the material to objects 122 | 123 | The material is assigned to the triangle, namely to the `mesh.primitive`, by referring to the material using its index: 124 | 125 | ```javascript 126 | "meshes" : [ 127 | { 128 | "primitives" : [ { 129 | "attributes" : { 130 | "POSITION" : 1 131 | }, 132 | "indices" : 0, 133 | "material" : 0 134 | } ] 135 | } 136 | ``` 137 | 138 | The next section will give a short introduction to how textures are defined in a glTF asset. The use of textures will then allow the definition of more complex and realistic materials. 139 | 140 | Previous: [Materials](gltfTutorial_010_Materials.md) | [Table of Contents](README.md) | Next: [Textures, Images, Samplers](gltfTutorial_012_TexturesImagesSamplers.md) 141 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_012_TexturesImagesSamplers.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Material](gltfTutorial_011_SimpleMaterial.md) | [Table of Contents](README.md) | Next: [Simple Texture](gltfTutorial_013_SimpleTexture.md) 2 | 3 | # Textures, Images, and Samplers 4 | 5 | Textures are an important aspect of giving objects a realistic appearance. They make it possible to define the main color of the objects, as well as other characteristics that are used in the material definition in order to precisely describe what the rendered object should look like. 6 | 7 | A glTF asset may define multiple [`texture`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-texture) objects, which can be used as the textures of geometric objects during rendering, and which can be used to encode different material properties. Depending on the graphics API, there may be many features and settings that influence the process of texture mapping. Many of these details are beyond the scope of this tutorial. 
There are dedicated tutorials that explain the exact meaning of all the texture mapping parameters and settings; for example, on [webglfundamentals.org](https://webglfundamentals.org/webgl/lessons/webgl-3d-textures.html), [open.gl](https://open.gl/textures), and others. This section will only summarize how the information about textures is encoded in a glTF asset. 8 | 9 | There are three top-level arrays for the definition of textures in the glTF JSON. The `textures`, `samplers`, and `images` dictionaries contain [`texture`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-texture), [`sampler`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#_texture_sampler), and [`image`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-image) objects, respectively. The following is an excerpt from the [Simple Texture](gltfTutorial_013_SimpleTexture.md) example, which will be presented in the next section: 10 | 11 | ```javascript 12 | "textures": [ 13 | { 14 | "source": 0, 15 | "sampler": 0 16 | } 17 | ], 18 | "images": [ 19 | { 20 | "uri": "testTexture.png" 21 | } 22 | ], 23 | "samplers": [ 24 | { 25 | "magFilter": 9729, 26 | "minFilter": 9987, 27 | "wrapS": 33648, 28 | "wrapT": 33648 29 | } 30 | ], 31 | ``` 32 | 33 | The `texture` itself uses indices to refer to one `sampler` and one `image`. The most important element here is the reference to the `image`. It contains a URI that links to the actual image file that will be used for the texture. Information about how to read this image data can be found in the section about [image data in `images`](gltfTutorial_002_BasicGltfStructure.md#image-data-in-images). 34 | 35 | The next section will show how such a texture definition may be used inside a material. 36 | 37 | Previous: [Simple Material](gltfTutorial_011_SimpleMaterial.md) | [Table of Contents](README.md) | Next: [Simple Texture](gltfTutorial_013_SimpleTexture.md) 38 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_013_SimpleTexture.md: -------------------------------------------------------------------------------- 1 | Previous: [Textures, Images, and Samplers](gltfTutorial_012_TexturesImagesSamplers.md) | [Table of Contents](README.md) | Next: [Advanced Material](gltfTutorial_014_AdvancedMaterial.md) 2 | 3 | # A Simple Texture 4 | 5 | As shown in the previous sections, the material definition in a glTF asset contains different parameters for the color of the material or the overall appearance of the material under the influence of light. These properties may be given via single values, for example, defining the color or the roughness of the object as a whole. Alternatively, these values may be provided via textures that are mapped on the object surface. 
The following is a glTF asset that defines a material with a simple, single texture: 6 | 7 | ```javascript 8 | { 9 | "scene": 0, 10 | "scenes" : [ { 11 | "nodes" : [ 0 ] 12 | } ], 13 | "nodes" : [ { 14 | "mesh" : 0 15 | } ], 16 | "meshes" : [ { 17 | "primitives" : [ { 18 | "attributes" : { 19 | "POSITION" : 1, 20 | "TEXCOORD_0" : 2 21 | }, 22 | "indices" : 0, 23 | "material" : 0 24 | } ] 25 | } ], 26 | 27 | "materials" : [ { 28 | "pbrMetallicRoughness" : { 29 | "baseColorTexture" : { 30 | "index" : 0 31 | }, 32 | "metallicFactor" : 0.0, 33 | "roughnessFactor" : 1.0 34 | } 35 | } ], 36 | 37 | "textures" : [ { 38 | "sampler" : 0, 39 | "source" : 0 40 | } ], 41 | "images" : [ { 42 | "uri" : "testTexture.png" 43 | } ], 44 | "samplers" : [ { 45 | "magFilter" : 9729, 46 | "minFilter" : 9987, 47 | "wrapS" : 33648, 48 | "wrapT" : 33648 49 | } ], 50 | 51 | "buffers" : [ { 52 | "uri" : "data:application/gltf-buffer;base64,AAABAAIAAQADAAIAAAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAAAAgD8AAAAAAACAPwAAgD8AAAAAAAAAAAAAgD8AAAAAAACAPwAAgD8AAAAAAAAAAAAAAAAAAAAAAACAPwAAAAAAAAAA", 53 | "byteLength" : 108 54 | } ], 55 | "bufferViews" : [ { 56 | "buffer" : 0, 57 | "byteOffset" : 0, 58 | "byteLength" : 12, 59 | "target" : 34963 60 | }, { 61 | "buffer" : 0, 62 | "byteOffset" : 12, 63 | "byteLength" : 96, 64 | "byteStride" : 12, 65 | "target" : 34962 66 | } ], 67 | "accessors" : [ { 68 | "bufferView" : 0, 69 | "byteOffset" : 0, 70 | "componentType" : 5123, 71 | "count" : 6, 72 | "type" : "SCALAR", 73 | "max" : [ 3 ], 74 | "min" : [ 0 ] 75 | }, { 76 | "bufferView" : 1, 77 | "byteOffset" : 0, 78 | "componentType" : 5126, 79 | "count" : 4, 80 | "type" : "VEC3", 81 | "max" : [ 1.0, 1.0, 0.0 ], 82 | "min" : [ 0.0, 0.0, 0.0 ] 83 | }, { 84 | "bufferView" : 1, 85 | "byteOffset" : 48, 86 | "componentType" : 5126, 87 | "count" : 4, 88 | "type" : "VEC2", 89 | "max" : [ 1.0, 1.0 ], 90 | "min" : [ 0.0, 0.0 ] 91 | } ], 92 | 93 | "asset" : { 94 | "version" : "2.0" 95 | } 96 | } 97 | ``` 98 | 99 | The actual image that the texture consists of is stored as a PNG file called `"testTexture.png"` (see Image 13a). 100 | 101 |

*Image 13a: The image for the simple texture example.*

105 | 106 | Bringing this all together in a renderer will result in the scene rendered in Image 13b. 107 | 108 |

*Image 13b: A simple texture on a unit square.*
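The numeric values in the `samplers` entry are OpenGL/WebGL enum values: 9729 is `LINEAR`, 9987 is `LINEAR_MIPMAP_LINEAR`, and 33648 is `MIRRORED_REPEAT`. As an illustration only, a WebGL-based viewer might create a GPU texture from this image and sampler roughly as follows, assuming `gl` is a WebGL rendering context and `image` is the already loaded `testTexture.png`:

```javascript
// Illustrative sketch: create a WebGL texture from the image and sampler above.
// `gl` is a WebGL rendering context, `image` an HTMLImageElement that has
// finished loading "testTexture.png".
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

// Sampler settings taken from the glTF JSON:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);               // magFilter 9729
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR); // minFilter 9987
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.MIRRORED_REPEAT);          // wrapS 33648
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.MIRRORED_REPEAT);          // wrapT 33648

// The mipmap-based minification filter requires mipmaps to exist.
gl.generateMipmap(gl.TEXTURE_2D);
```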

112 | 113 | 114 | ## The Textured Material Definition 115 | 116 | The material definition in this example differs from the [Simple Material](gltfTutorial_011_SimpleMaterial.md) that was shown earlier. While the simple material only defined a single color for the whole object, the material definition now refers to the newly added texture: 117 | 118 | ```javascript 119 | "materials" : [ { 120 | "pbrMetallicRoughness" : { 121 | "baseColorTexture" : { 122 | "index" : 0 123 | }, 124 | "metallicFactor" : 0.0, 125 | "roughnessFactor" : 1.0 126 | } 127 | } ], 128 | ``` 129 | 130 | The `baseColorTexture` is the index of the texture that will be applied to the object surface. The `metallicFactor` and `roughnessFactor` are still single values. A more complex material where these properties are also given via textures will be shown in the next section. 131 | 132 | In order to apply a texture to a mesh primitive, there must be information about the texture coordinates that should be used for each vertex. The texture coordinates are only another attribute for the vertices defined in the `mesh.primitive`. By default, a texture will use the texture coordinates that have the attribute name `TEXCOORD_0`. If there are multiple sets of texture coordinates, the one that should be used for one particular texture may be selected by adding a `texCoord` property to the texture reference: 133 | 134 | ```javascript 135 | "baseColorTexture" : { 136 | "index" : 0, 137 | "texCoord": 2 138 | }, 139 | ``` 140 | In this case, the texture would use the texture coordinates that are contained in the attribute called `TEXCOORD_2`. 141 | 142 | 143 | Previous: [Textures, Images, and Samplers](gltfTutorial_012_TexturesImagesSamplers.md) | [Table of Contents](README.md) | Next: [Advanced Material](gltfTutorial_014_AdvancedMaterial.md) 144 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_014_AdvancedMaterial.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Texture](gltfTutorial_013_SimpleTexture.md) | [Table of Contents](README.md) | Next: [Simple Cameras](gltfTutorial_015_SimpleCameras.md) 2 | 3 | # An Advanced Material 4 | 5 | The [Simple Texture](gltfTutorial_013_SimpleTexture.md) example in the previous section showed a material for which the "base color" was defined using a texture. But in addition to the base color, there are other properties of a material that may be defined via textures. These properties have already been summarized in the [Materials](gltfTutorial_010_Materials.md) section: 6 | 7 | - The *base color*, 8 | - The *metallic* value, 9 | - The *roughness* of the surface, 10 | - The *emissive* properties, 11 | - An *occlusion* texture, and 12 | - A *normal map*. 13 | 14 | 15 | The effects of these properties cannot properly be demonstrated with trivial textures. Therefore, they will be shown here using one of the official Khronos PBR sample models, namely, the [WaterBottle](https://github.com/KhronosGroup/glTF-Sample-Assets/tree/main/Models/WaterBottle) model. Image 14a shows an overview of the textures that are involved in this model, and the final rendered object: 16 | 17 |

*Image 14a: An example of a material where the surface properties are defined via textures.*

21 | 22 | Explaining the implementation of physically based rendering is beyond the scope of this tutorial. The official Khronos [glTF Sample Viewer](https://github.com/KhronosGroup/glTF-Sample-Viewer) contains a reference implementation of a PBR renderer based on WebGL, and provides implementation hints and background information. The following images mainly aim at demonstrating the effects of the different material property textures, under different lighting conditions. 23 | 24 | Image 14b shows the effect of the roughness texture: the main part of the bottle has a low roughness, causing it to appear shiny, compared to the cap, which has a rough surface structure. 25 | 26 |

*Image 14b: The influence of the roughness texture.*

30 | 31 | Image 14c highlights the effect of the metallic texture: the bottle reflects the light from the surrounding environment map. 32 | 33 |

*Image 14c: The influence of the metallic texture.*

37 | 38 | Image 14d shows the emissive part of the texture: regardless of the dark environment setting, the text, which is contained in the emissive texture, is clearly visible. 39 | 40 |

*Image 14d: The emissive part of the texture.*

44 | 45 | Image 14e shows the part of the bottle cap for which a normal map is defined: the text appears to be embossed into the cap. This makes it possible to model finer geometric details on the surface, even though the model itself only has a very coarse geometric resolution. 46 | 47 |

*Image 14e: The effect of a normal map.*

51 | 52 | Together, these textures and maps allow modeling a wide range of real-world materials. Thanks to the common underlying PBR model - namely, the metallic-roughness model - the objects can be rendered consistently by different renderer implementations. 53 | 54 | 55 | 56 | Previous: [Simple Texture](gltfTutorial_013_SimpleTexture.md) | [Table of Contents](README.md) | Next: [Simple Cameras](gltfTutorial_015_SimpleCameras.md) 57 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_015_SimpleCameras.md: -------------------------------------------------------------------------------- 1 | Previous: [Advanced Material](gltfTutorial_014_AdvancedMaterial.md) | [Table of Contents](README.md) | Next: [Cameras](gltfTutorial_016_Cameras.md) 2 | 3 | # Simple Cameras 4 | 5 | The previous sections showed how a basic scene structure with geometric objects is represented in a glTF asset, and how different materials can be applied to these objects. This did not yet include information about the view configuration that should be used for rendering the scene. This view configuration is usually described as a virtual *camera* that is contained in the scene, at a certain position, and pointing in a certain direction. 6 | 7 | The following is a simple, complete glTF asset. It is similar to the assets that have already been shown: it defines a simple `scene` containing `node` objects and a single geometric object that is given as a `mesh`, attached to one of the nodes. But this asset additionally contains two [`camera`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-camera) objects: 8 | 9 | 10 | ```javascript 11 | { 12 | "scene": 0, 13 | "scenes" : [ 14 | { 15 | "nodes" : [ 0, 1, 2 ] 16 | } 17 | ], 18 | "nodes" : [ 19 | { 20 | "rotation" : [ -0.383, 0.0, 0.0, 0.924 ], 21 | "mesh" : 0 22 | }, 23 | { 24 | "translation" : [ 0.5, 0.5, 3.0 ], 25 | "camera" : 0 26 | }, 27 | { 28 | "translation" : [ 0.5, 0.5, 3.0 ], 29 | "camera" : 1 30 | } 31 | ], 32 | 33 | "cameras" : [ 34 | { 35 | "type": "perspective", 36 | "perspective": { 37 | "aspectRatio": 1.0, 38 | "yfov": 0.7, 39 | "zfar": 100, 40 | "znear": 0.01 41 | } 42 | }, 43 | { 44 | "type": "orthographic", 45 | "orthographic": { 46 | "xmag": 1.0, 47 | "ymag": 1.0, 48 | "zfar": 100, 49 | "znear": 0.01 50 | } 51 | } 52 | ], 53 | 54 | "meshes" : [ 55 | { 56 | "primitives" : [ { 57 | "attributes" : { 58 | "POSITION" : 1 59 | }, 60 | "indices" : 0 61 | } ] 62 | } 63 | ], 64 | 65 | "buffers" : [ 66 | { 67 | "uri" : "data:application/octet-stream;base64,AAABAAIAAQADAAIAAAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAAAAgD8AAAAAAACAPwAAgD8AAAAA", 68 | "byteLength" : 60 69 | } 70 | ], 71 | "bufferViews" : [ 72 | { 73 | "buffer" : 0, 74 | "byteOffset" : 0, 75 | "byteLength" : 12, 76 | "target" : 34963 77 | }, 78 | { 79 | "buffer" : 0, 80 | "byteOffset" : 12, 81 | "byteLength" : 48, 82 | "target" : 34962 83 | } 84 | ], 85 | "accessors" : [ 86 | { 87 | "bufferView" : 0, 88 | "byteOffset" : 0, 89 | "componentType" : 5123, 90 | "count" : 6, 91 | "type" : "SCALAR", 92 | "max" : [ 3 ], 93 | "min" : [ 0 ] 94 | }, 95 | { 96 | "bufferView" : 1, 97 | "byteOffset" : 0, 98 | "componentType" : 5126, 99 | "count" : 4, 100 | "type" : "VEC3", 101 | "max" : [ 1.0, 1.0, 0.0 ], 102 | "min" : [ 0.0, 0.0, 0.0 ] 103 | } 104 | ], 105 | 106 | "asset" : { 107 | "version" : "2.0" 108 | } 109 | } 110 | ``` 111 | 112 | The geometry in this asset is a simple unit square. 
It is rotated by -45 degrees around the x-axis to emphasize the effect of the different cameras. Image 15a shows three options for rendering this asset. The first two examples use the cameras that are defined in the asset. The last example shows how the scene looks from an external, user-defined viewpoint. 113 | 114 |

*Image 15a: The effect of rendering the scene with different cameras.*
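Before looking at the camera definitions in more detail, the following sketch shows how the parameters of the first (perspective) camera of this asset could be turned into the matrices that a renderer typically needs. The column-major array layout and the translation-only view matrix are assumptions made for this example; they are not prescribed by glTF.

```javascript
// Sketch: build a column-major perspective projection matrix from the
// parameters of camera 0 (yfov 0.7, aspectRatio 1.0, znear 0.01, zfar 100).
function perspectiveMatrix(yfov, aspectRatio, znear, zfar) {
  const f = 1.0 / Math.tan(yfov / 2.0);
  return [
    f / aspectRatio, 0, 0,                                   0,
    0,               f, 0,                                   0,
    0,               0, (zfar + znear) / (znear - zfar),    -1,
    0,               0, (2 * zfar * znear) / (znear - zfar), 0,
  ];
}

// The camera node only has a translation of (0.5, 0.5, 3.0), so its global
// transform is a pure translation, and the view matrix is simply its inverse:
const viewMatrix = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  -0.5, -0.5, -3.0, 1,
];

const projectionMatrix = perspectiveMatrix(0.7, 1.0, 0.01, 100.0);
```

How these parameters are interpreted in general is discussed in the [Cameras](gltfTutorial_016_Cameras.md) section.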

118 | 119 | 120 | ## Camera definitions 121 | 122 | The new top-level element of this glTF asset is the `cameras` array, which contains the [`camera`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-camera) objects: 123 | 124 | ```javascript 125 | "cameras" : [ 126 | { 127 | "type": "perspective", 128 | "perspective": { 129 | "aspectRatio": 1.0, 130 | "yfov": 0.7, 131 | "zfar": 100, 132 | "znear": 0.01 133 | } 134 | }, 135 | { 136 | "type": "orthographic", 137 | "orthographic": { 138 | "xmag": 1.0, 139 | "ymag": 1.0, 140 | "zfar": 100, 141 | "znear": 0.01 142 | } 143 | } 144 | ], 145 | ``` 146 | 147 | When a camera object has been defined, it may be attached to a `node`. This is accomplished by assigning the index of the camera to the `camera` property of a node. In the given example, two new nodes have been added to the scene graph, one for each camera: 148 | 149 | ```javascript 150 | "nodes" : { 151 | ... 152 | { 153 | "translation" : [ 0.5, 0.5, 3.0 ], 154 | "camera" : 0 155 | }, 156 | { 157 | "translation" : [ 0.5, 0.5, 3.0 ], 158 | "camera" : 1 159 | } 160 | }, 161 | ``` 162 | 163 | The differences between perspective and orthographic cameras and their properties, the effect of attaching the cameras to the nodes, and the management of multiple cameras will be explained in detail in the [Cameras](gltfTutorial_016_Cameras.md) section. 164 | 165 | 166 | 167 | 168 | Previous: [Advanced Material](gltfTutorial_014_AdvancedMaterial.md) | [Table of Contents](README.md) | Next: [Cameras](gltfTutorial_016_Cameras.md) 169 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_016_Cameras.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Cameras](gltfTutorial_015_SimpleCameras.md) | [Table of Contents](README.md) | Next: [Simple Morph Target](gltfTutorial_017_SimpleMorphTarget.md) 2 | 3 | # Cameras 4 | 5 | The example in the [Simple Cameras](gltfTutorial_017_SimpleCameras.md) section showed how to define perspective and orthographic cameras, and how they can be integrated into a scene by attaching them to nodes. This section will explain the differences between both types of cameras, and the handling of cameras in general. 6 | 7 | 8 | ## Perspective and orthographic cameras 9 | 10 | There are two kinds of cameras: *Perspective* cameras, where the viewing volume is a truncated pyramid (often referred to as "viewing frustum"), and *orthographic* cameras, where the viewing volume is a rectangular box. The main difference is that rendering with a *perspective* camera causes a proper perspective distortion, whereas rendering with an *orthographic* camera causes a preservation of lengths and angles. 11 | 12 | The example in the [Simple Cameras](gltfTutorial_015_SimpleCameras.md) section contains one camera of each type, a perspective camera at index 0 and an orthographic camera at index 1: 13 | 14 | ```javascript 15 | "cameras" : [ 16 | { 17 | "type": "perspective", 18 | "perspective": { 19 | "aspectRatio": 1.0, 20 | "yfov": 0.7, 21 | "zfar": 100, 22 | "znear": 0.01 23 | } 24 | }, 25 | { 26 | "type": "orthographic", 27 | "orthographic": { 28 | "xmag": 1.0, 29 | "ymag": 1.0, 30 | "zfar": 100, 31 | "znear": 0.01 32 | } 33 | } 34 | ], 35 | ``` 36 | 37 | 38 | The `type` of the camera is given as a string, which can be `"perspective"` or `"orthographic"`. 
Depending on this type, the `camera` object contains a [`camera.perspective`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-camera-perspective) object or a [`camera.orthographic`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-camera-orthographic) object. These objects contain additional parameters that define the actual viewing volume. 39 | 40 | The `camera.perspective` object contains an `aspectRatio` property that defines the aspect ratio of the viewport. Additionally, it contains a property called `yfov`, which stands for *Field Of View in Y-direction*. It defines the "opening angle" of the camera and is given in radians. 41 | 42 | The `camera.orthographic` object contains `xmag` and `ymag` properties. These define the magnification of the camera in x- and y-direction, and basically describe the width and height of the viewing volume. 43 | 44 | Both camera types additionally contain `znear` and `zfar` properties, which are the coordinates of the near and far clipping plane. For perspective cameras, the `zfar` value is optional. When it is missing, a special "infinite projection matrix" will be used. 45 | 46 | Explaining the details of cameras, viewing, and projections is beyond the scope of this tutorial. The important point is that most graphics APIs offer methods for defining the viewing configuration that are directly based on these parameters. In general, these parameters can be used to compute a *camera matrix*. The camera matrix can be inverted to obtain the *view matrix*, which will later be post-multiplied with the *model matrix* to obtain the *model-view matrix*, which is required by the renderer. 47 | 48 | 49 | # Camera orientation 50 | 51 | A `camera` can be transformed to have a certain orientation and viewing direction in the scene. This is accomplished by attaching the camera to a `node`. Each [`node`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-node) may contain the index of a `camera` that is attached to it. In the simple camera example, there are two nodes for the cameras. The first node refers to the perspective camera with index 0, and the second one refers to the orthographic camera with index 1: 52 | 53 | ```javascript 54 | "nodes" : { 55 | ... 56 | { 57 | "translation" : [ 0.5, 0.5, 3.0 ], 58 | "camera" : 0 59 | }, 60 | { 61 | "translation" : [ 0.5, 0.5, 3.0 ], 62 | "camera" : 1 63 | } 64 | }, 65 | ``` 66 | 67 | As shown in the [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) section, these nodes may have properties that define the transform matrix of the node. The [global transform](gltfTutorial_004_ScenesNodes.md#global-transforms-of-nodes) of a node then defines the actual orientation of the camera in the scene. With the option to apply arbitrary [animations](gltfTutorial_007_Animations.md) to the nodes, it is even possible to define camera flights. 68 | 69 | When the global transform of the camera node is the identity matrix, then the eye point of the camera is at the origin, and the viewing direction is along the negative z-axis. In the given example, the nodes both have a `translation` about `(0.5, 0.5, 3.0)`, which causes the camera to be transformed accordingly: it is translated about 0.5 in the x- and y- direction, to look at the center of the unit square, and about 3.0 along the z-axis, to move it a bit away from the object. 70 | 71 | 72 | ## Camera instancing and management 73 | 74 | There may be multiple cameras defined in the JSON part of a glTF. 
Each camera may be referred to by multiple nodes. Therefore, the cameras as they appear in the glTF asset are really "templates" for actual camera *instances*: Whenever a node refers to one camera, a new instance of this camera is created. 75 | 76 | There is no "default" camera for a glTF asset. Instead, the client application has to keep track of the currently active camera. The client application may, for example, offer a dropdown-menu that allows one to select the active camera and thus to quickly switch between predefined view configurations. With a bit more implementation effort, the client application can also define its own camera and interaction patterns for the camera control (e.g., zooming with the mouse wheel). However, the logic for the navigation and interaction has to be implemented solely by the client application in this case. [Image 15a](gltfTutorial_015_SimpleCameras.md#cameras-png) shows the result of such an implementation, where the user may select either the active camera from the ones that are defined in the glTF asset, or an "external camera" that may be controlled with the mouse. 77 | 78 | 79 | 80 | Previous: [Simple Cameras](gltfTutorial_015_SimpleCameras.md) | [Table of Contents](README.md) | Next: [Simple Morph Target](gltfTutorial_017_SimpleMorphTarget.md) 81 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_017_SimpleMorphTarget.md: -------------------------------------------------------------------------------- 1 | Previous: [Cameras](gltfTutorial_016_Cameras.md) | [Table of Contents](README.md) | Next: [Morph Targets](gltfTutorial_018_MorphTargets.md) 2 | 3 | # A Simple Morph Target 4 | 5 | Starting with version 2.0, glTF supports the definition of *morph targets* for meshes. A morph target stores displacements or differences for certain mesh attributes. At runtime, these differences may be added to the original mesh, with different weights, in order to animate parts of the mesh. This is often used in character animations, for example, to encode different facial expressions of a virtual character. 6 | 7 | The following is a minimal example that shows a mesh with two morph targets. The new elements will be summarized here, and the broader concept of morph targets and how they are applied at runtime will be explained in the next section. 
8 | 9 | 10 | ```javascript 11 | { 12 | "scene": 0, 13 | "scenes":[ 14 | { 15 | "nodes":[ 16 | 0 17 | ] 18 | } 19 | ], 20 | "nodes":[ 21 | { 22 | "mesh":0 23 | } 24 | ], 25 | "meshes":[ 26 | { 27 | "primitives":[ 28 | { 29 | "attributes":{ 30 | "POSITION":1 31 | }, 32 | "targets":[ 33 | { 34 | "POSITION":2 35 | }, 36 | { 37 | "POSITION":3 38 | } 39 | ], 40 | "indices":0 41 | } 42 | ], 43 | "weights":[ 44 | 1.0, 45 | 0.5 46 | ] 47 | } 48 | ], 49 | 50 | "animations":[ 51 | { 52 | "samplers":[ 53 | { 54 | "input":4, 55 | "interpolation":"LINEAR", 56 | "output":5 57 | } 58 | ], 59 | "channels":[ 60 | { 61 | "sampler":0, 62 | "target":{ 63 | "node":0, 64 | "path":"weights" 65 | } 66 | } 67 | ] 68 | } 69 | ], 70 | 71 | "buffers":[ 72 | { 73 | "uri":"data:application/gltf-buffer;base64,AAABAAIAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAA/AAAAPwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAIC/AACAPwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAIA/AACAPwAAAAA=", 74 | "byteLength":116 75 | }, 76 | { 77 | "uri":"data:application/gltf-buffer;base64,AAAAAAAAgD8AAABAAABAQAAAgEAAAAAAAAAAAAAAAAAAAIA/AACAPwAAgD8AAIA/AAAAAAAAAAAAAAAA", 78 | "byteLength":60 79 | } 80 | ], 81 | "bufferViews":[ 82 | { 83 | "buffer":0, 84 | "byteOffset":0, 85 | "byteLength":6, 86 | "target":34963 87 | }, 88 | { 89 | "buffer":0, 90 | "byteOffset":8, 91 | "byteLength":108, 92 | "byteStride":12, 93 | "target":34962 94 | }, 95 | { 96 | "buffer":1, 97 | "byteOffset":0, 98 | "byteLength":20 99 | }, 100 | { 101 | "buffer":1, 102 | "byteOffset":20, 103 | "byteLength":40 104 | } 105 | ], 106 | "accessors":[ 107 | { 108 | "bufferView":0, 109 | "byteOffset":0, 110 | "componentType":5123, 111 | "count":3, 112 | "type":"SCALAR", 113 | "max":[ 114 | 2 115 | ], 116 | "min":[ 117 | 0 118 | ] 119 | }, 120 | { 121 | "bufferView":1, 122 | "byteOffset":0, 123 | "componentType":5126, 124 | "count":3, 125 | "type":"VEC3", 126 | "max":[ 127 | 1.0, 128 | 0.5, 129 | 0.0 130 | ], 131 | "min":[ 132 | 0.0, 133 | 0.0, 134 | 0.0 135 | ] 136 | }, 137 | { 138 | "bufferView":1, 139 | "byteOffset":36, 140 | "componentType":5126, 141 | "count":3, 142 | "type":"VEC3", 143 | "max":[ 144 | 0.0, 145 | 1.0, 146 | 0.0 147 | ], 148 | "min":[ 149 | -1.0, 150 | 0.0, 151 | 0.0 152 | ] 153 | }, 154 | { 155 | "bufferView":1, 156 | "byteOffset":72, 157 | "componentType":5126, 158 | "count":3, 159 | "type":"VEC3", 160 | "max":[ 161 | 1.0, 162 | 1.0, 163 | 0.0 164 | ], 165 | "min":[ 166 | 0.0, 167 | 0.0, 168 | 0.0 169 | ] 170 | }, 171 | { 172 | "bufferView":2, 173 | "byteOffset":0, 174 | "componentType":5126, 175 | "count":5, 176 | "type":"SCALAR", 177 | "max":[ 178 | 4.0 179 | ], 180 | "min":[ 181 | 0.0 182 | ] 183 | }, 184 | { 185 | "bufferView":3, 186 | "byteOffset":0, 187 | "componentType":5126, 188 | "count":10, 189 | "type":"SCALAR", 190 | "max":[ 191 | 1.0 192 | ], 193 | "min":[ 194 | 0.0 195 | ] 196 | } 197 | ], 198 | 199 | "asset":{ 200 | "version":"2.0" 201 | } 202 | } 203 | 204 | ``` 205 | 206 | The asset contains an animation that interpolates between the different morph targets for a single triangle. A screenshot of this asset is shown in Image 17a. 207 | 208 |

*Image 17a: A triangle with two morph targets.*

212 | 213 | 214 | Most of the elements of this asset have already been explained in the previous sections: It contains a `scene` with a single `node` and a single `mesh`. There are two `buffer` objects, one storing the geometry data and one storing the data for the `animation`, and several `bufferView` and `accessor` objects that provide access to this data. 215 | 216 | The new elements that have been added in order to define the morph targets are contained in the `mesh` and the `animation`: 217 | 218 | 219 | ```javascript 220 | "meshes":[ 221 | { 222 | "primitives":[ 223 | { 224 | "attributes":{ 225 | "POSITION":1 226 | }, 227 | "targets":[ 228 | { 229 | "POSITION":2 230 | }, 231 | { 232 | "POSITION":3 233 | } 234 | ], 235 | "indices":0 236 | } 237 | ], 238 | "weights":[ 239 | 0.5, 240 | 0.5 241 | ] 242 | } 243 | ], 244 | 245 | ``` 246 | 247 | The `mesh.primitive` contains an array of [morph `targets`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#_mesh_primitive_targets). Each morph target is a dictionary that maps attribute names to `accessor` objects. In the example, there are two morph targets, both mapping the `"POSITION"` attribute to accessors that contain the morphed vertex positions. The mesh also contains an array of `weights` that defines the contribution of each morph target to the final, rendered mesh. These weights are also the `channel.target` of the `animation` that is contained in the asset: 248 | 249 | ```javascript 250 | "animations":[ 251 | { 252 | "samplers":[ 253 | { 254 | "input":4, 255 | "interpolation":"LINEAR", 256 | "output":5 257 | } 258 | ], 259 | "channels":[ 260 | { 261 | "sampler":0, 262 | "target":{ 263 | "node":0, 264 | "path":"weights" 265 | } 266 | } 267 | ] 268 | } 269 | ], 270 | 271 | ``` 272 | 273 | This means that the animation will modify the `weights` of the mesh that is referred to by the `target.node`. The result of applying the animation to these weights, and the computation of the final, rendered mesh will be explained in more detail in the next section about [Morph Targets](gltfTutorial_018_MorphTargets.md). 274 | 275 | 276 | 277 | Previous: [Cameras](gltfTutorial_016_Cameras.md) | [Table of Contents](README.md) | Next: [Morph Targets](gltfTutorial_018_MorphTargets.md) 278 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_018_MorphTargets.md: -------------------------------------------------------------------------------- 1 | Previous: [Simple Morph Target](gltfTutorial_017_SimpleMorphTarget.md) | [Table of Contents](README.md) | Next: [SimpleSkin](gltfTutorial_019_SimpleSkin.md) 2 | 3 | # Morph Targets 4 | 5 | The example in the previous section contains a mesh that consists of a single triangle with two morph targets: 6 | 7 | ```javascript 8 | { 9 | "meshes":[ 10 | { 11 | "primitives":[ 12 | { 13 | "attributes":{ 14 | "POSITION":1 15 | }, 16 | "targets":[ 17 | { 18 | "POSITION":2 19 | }, 20 | { 21 | "POSITION":3 22 | } 23 | ], 24 | "indices":0 25 | } 26 | ], 27 | "weights":[ 28 | 1.0, 29 | 0.5 30 | ] 31 | } 32 | ], 33 | ``` 34 | 35 | 36 | The actual base geometry of the mesh, namely the triangle geometry, is defined by the `mesh.primitive` attribute called `"POSITION"`. The morph targets of the `mesh.primitive` are dictionaries that map the attribute name `"POSITION"` to `accessor` objects that contain the *displacements* for each vertex. 
Image 18a shows the initial triangle geometry in black, the displacement for the first morph target in red, and the displacement for the second morph target in green. 37 | 38 |

*Image 18a: The initial triangle and morph target displacements.*

42 | 43 | The `weights` of the mesh determine how these morph target displacements are added to the initial geometry in order to obtain the current state of the geometry. The pseudocode for computing the rendered vertex positions for a mesh `primitive` is as follows: 44 | ``` 45 | renderedPrimitive.POSITION = primitive.POSITION + 46 | weights[0] * primitive.targets[0].POSITION + 47 | weights[1] * primitive.targets[1].POSITION; 48 | ``` 49 | 50 | This means that the current state of the mesh primitive is computed by taking the initial mesh primitive geometry and adding a linear combination of the morph target displacements, where the `weights` are the factors for the linear combination. 51 | 52 | The asset additionally contains an `animation` that affects the weights for the morph targets. The following table shows the key frames of the animated weights: 53 | 54 | | Time | Weights | 55 | |:----:|:---------:| 56 | | 0.0 | 0.0, 0.0 | 57 | | 1.0 | 0.0, 1.0 | 58 | | 2.0 | 1.0, 1.0 | 59 | | 3.0 | 1.0, 0.0 | 60 | | 4.0 | 0.0, 0.0 | 61 | 62 | 63 | Throughout the animation, the weights are interpolated linearly, and applied to the morph target displacements. At each point, the rendered state of the mesh primitive is updated accordingly. The following is an example of the state that is computed at 1.25 seconds. The weights that are provided by the animation sampler for this animation time are (0.25, 1.0), and they are used for computing the linear combination of the morph target displacements. 64 | 65 |

Image 18b: An intermediate state of the morph target animation.
69 | 70 | 71 | 72 | 73 | Previous: [Simple Morph Target](gltfTutorial_017_SimpleMorphTarget.md) | [Table of Contents](README.md) | Next: [SimpleSkin](gltfTutorial_019_SimpleSkin.md) 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | -------------------------------------------------------------------------------- /gltfTutorial/gltfTutorial_019_SimpleSkin.md: -------------------------------------------------------------------------------- 1 | Previous: [Morph Targets](gltfTutorial_018_MorphTargets.md) | [Table of Contents](README.md) | Next: [Skins](gltfTutorial_020_Skins.md) 2 | 3 | # A Simple Skin 4 | 5 | glTF supports *vertex skinning*, which allows the geometry (vertices) of a mesh to be deformed based on the pose of a skeleton. This is essential in order to give animated geometry, for example of virtual characters, a realistic appearance. The core for the definition of vertex skinning in a glTF asset is the [`skin`](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-skin), but vertex skinning in general implies several interdependencies between the elements of a glTF asset that have been presented so far. 6 | 7 | The following is a glTF asset that shows basic vertex skinning for a simple geometry. The elements of this asset will be summarized quickly in this section, referring to the previous sections where appropriate, and pointing out the new elements that have been added for the vertex skinning functionality. The details and background information for vertex skinning will be given in the next section. 8 | 9 | ```javascript 10 | { 11 | "scene" : 0, 12 | "scenes" : [ { 13 | "nodes" : [ 0, 1 ] 14 | } ], 15 | 16 | "nodes" : [ { 17 | "skin" : 0, 18 | "mesh" : 0 19 | }, { 20 | "children" : [ 2 ] 21 | }, { 22 | "translation" : [ 0.0, 1.0, 0.0 ], 23 | "rotation" : [ 0.0, 0.0, 0.0, 1.0 ] 24 | } ], 25 | 26 | "meshes" : [ { 27 | "primitives" : [ { 28 | "attributes" : { 29 | "POSITION" : 1, 30 | "JOINTS_0" : 2, 31 | "WEIGHTS_0" : 3 32 | }, 33 | "indices" : 0 34 | } ] 35 | } ], 36 | 37 | "skins" : [ { 38 | "inverseBindMatrices" : 4, 39 | "joints" : [ 1, 2 ] 40 | } ], 41 | 42 | "animations" : [ { 43 | "channels" : [ { 44 | "sampler" : 0, 45 | "target" : { 46 | "node" : 2, 47 | "path" : "rotation" 48 | } 49 | } ], 50 | "samplers" : [ { 51 | "input" : 5, 52 | "interpolation" : "LINEAR", 53 | "output" : 6 54 | } ] 55 | } ], 56 | 57 | "buffers" : [ { 58 | "uri" : "data:application/gltf-buffer;base64,AAABAAMAAAADAAIAAgADAAUAAgAFAAQABAAFAAcABAAHAAYABgAHAAkABgAJAAgAAAAAvwAAAAAAAAAAAAAAPwAAAAAAAAAAAAAAvwAAAD8AAAAAAAAAPwAAAD8AAAAAAAAAvwAAgD8AAAAAAAAAPwAAgD8AAAAAAAAAvwAAwD8AAAAAAAAAPwAAwD8AAAAAAAAAvwAAAEAAAAAAAAAAPwAAAEAAAAAA", 59 | "byteLength" : 168 60 | }, { 61 | "uri" : "data:application/gltf-buffer;base64,AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAgD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAABAPwAAgD4AAAAAAAAAAAAAQD8AAIA+AAAAAAAAAAAAAAA/AAAAPwAAAAAAAAAAAAAAPwAAAD8AAAAAAAAAAAAAgD4AAEA/AAAAAAAAAAAAAIA+AABAPwAAAAAAAAAAAAAAAAAAgD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAA=", 62 | "byteLength" : 320 63 | }, { 64 | "uri" : "data:application/gltf-buffer;base64,AACAPwAAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAAAAAAAAgD8AAAAAAAAAAAAAAAAAAAAAAACAPwAAgD8AAAAAAAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAIC/AAAAAAAAgD8=", 65 | "byteLength" : 128 66 | }, { 67 | "uri" : 
"data:application/gltf-buffer;base64,AAAAAAAAAD8AAIA/AADAPwAAAEAAACBAAABAQAAAYEAAAIBAAACQQAAAoEAAALBAAAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAkxjEPkSLbD8AAAAAAAAAAPT9ND/0/TQ/AAAAAAAAAAD0/TQ/9P00PwAAAAAAAAAAkxjEPkSLbD8AAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAAAAAAAkxjEvkSLbD8AAAAAAAAAAPT9NL/0/TQ/AAAAAAAAAAD0/TS/9P00PwAAAAAAAAAAkxjEvkSLbD8AAAAAAAAAAAAAAAAAAIA/", 68 | "byteLength" : 240 69 | } ], 70 | 71 | "bufferViews" : [ { 72 | "buffer" : 0, 73 | "byteLength" : 48, 74 | "target" : 34963 75 | }, { 76 | "buffer" : 0, 77 | "byteOffset" : 48, 78 | "byteLength" : 120, 79 | "target" : 34962 80 | }, { 81 | "buffer" : 1, 82 | "byteLength" : 320, 83 | "byteStride" : 16 84 | }, { 85 | "buffer" : 2, 86 | "byteLength" : 128 87 | }, { 88 | "buffer" : 3, 89 | "byteLength" : 240 90 | } ], 91 | 92 | "accessors" : [ { 93 | "bufferView" : 0, 94 | "componentType" : 5123, 95 | "count" : 24, 96 | "type" : "SCALAR" 97 | }, { 98 | "bufferView" : 1, 99 | "componentType" : 5126, 100 | "count" : 10, 101 | "type" : "VEC3", 102 | "max" : [ 0.5, 2.0, 0.0 ], 103 | "min" : [ -0.5, 0.0, 0.0 ] 104 | }, { 105 | "bufferView" : 2, 106 | "componentType" : 5123, 107 | "count" : 10, 108 | "type" : "VEC4" 109 | }, { 110 | "bufferView" : 2, 111 | "byteOffset" : 160, 112 | "componentType" : 5126, 113 | "count" : 10, 114 | "type" : "VEC4" 115 | }, { 116 | "bufferView" : 3, 117 | "componentType" : 5126, 118 | "count" : 2, 119 | "type" : "MAT4" 120 | }, { 121 | "bufferView" : 4, 122 | "componentType" : 5126, 123 | "count" : 12, 124 | "type" : "SCALAR", 125 | "max" : [ 5.5 ], 126 | "min" : [ 0.0 ] 127 | }, { 128 | "bufferView" : 4, 129 | "byteOffset" : 48, 130 | "componentType" : 5126, 131 | "count" : 12, 132 | "type" : "VEC4", 133 | "max" : [ 0.0, 0.0, 0.707, 1.0 ], 134 | "min" : [ 0.0, 0.0, -0.707, 0.707 ] 135 | } ], 136 | 137 | "asset" : { 138 | "version" : "2.0" 139 | } 140 | } 141 | ``` 142 | 143 | 144 | 145 | The result of rendering this asset is shown in Image 19a. 146 | 147 |

Image 19a: A scene with simple vertex skinning.
151 | 152 | 153 | ## Elements of the simple skin example 154 | 155 | The elements of the given example are briefly summarized here: 156 | 157 | - The `scenes` and `nodes` elements have been explained in the [Scenes and Nodes](gltfTutorial_004_ScenesNodes.md) section. For the vertex skinning, new nodes have been added: the nodes at index 1 and 2 define a new node hierarchy for the *skeleton*. These nodes can be considered the joints between the "bones" that will eventually cause the deformation of the mesh. 158 | - The new top-level dictionary `skins` contains a single skin in the given example. The properties of this skin object will be explained later. 159 | - The concepts of `animations` has been explained in the [Animations](gltfTutorial_007_Animations.md) section. In the given example, the animation refers to the *skeleton* nodes so that the effect of the vertex skinning is actually visible during the animation. 160 | - The [Meshes](gltfTutorial_009_Meshes.md) section already explained the contents of the `meshes` and `mesh.primitive` objects. In this example, new mesh primitive attributes have been added, which are required for vertex skinning, namely the `"JOINTS_0"` and `"WEIGHTS_0"` attributes. 161 | - There are several new `buffers`, `bufferViews`, and `accessors`. Their basic properties have been described in the [Buffers, BufferViews, and Accessors](gltfTutorial_005_BufferBufferViewsAccessors.md) section. In the given example, they contain the additional data required for vertex skinning. 162 | 163 | Details about how these elements are interconnected to achieve the vertex skinning will be explained in the [Skins](gltfTutorial_020_Skins.md) section. 164 | 165 | 166 | Previous: [Morph Targets](gltfTutorial_018_MorphTargets.md) | [Table of Contents](README.md) | Next: [Skins](gltfTutorial_020_Skins.md) 167 | -------------------------------------------------------------------------------- /gltfTutorial/images/advancedMaterial_emissive.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/advancedMaterial_emissive.png -------------------------------------------------------------------------------- /gltfTutorial/images/advancedMaterial_metallic.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/advancedMaterial_metallic.png -------------------------------------------------------------------------------- /gltfTutorial/images/advancedMaterial_normal.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/advancedMaterial_normal.png -------------------------------------------------------------------------------- /gltfTutorial/images/advancedMaterial_roughness.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/advancedMaterial_roughness.png -------------------------------------------------------------------------------- /gltfTutorial/images/animatedTriangle.gif: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/animatedTriangle.gif -------------------------------------------------------------------------------- /gltfTutorial/images/animationChannels.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/animationChannels.png -------------------------------------------------------------------------------- /gltfTutorial/images/animationSamplers.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/animationSamplers.png -------------------------------------------------------------------------------- /gltfTutorial/images/aos.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/aos.png -------------------------------------------------------------------------------- /gltfTutorial/images/applications.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/applications.png -------------------------------------------------------------------------------- /gltfTutorial/images/buffer.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/buffer.png -------------------------------------------------------------------------------- /gltfTutorial/images/bufferBufferView.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/bufferBufferView.png -------------------------------------------------------------------------------- /gltfTutorial/images/bufferBufferViewAccessor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/bufferBufferViewAccessor.png -------------------------------------------------------------------------------- /gltfTutorial/images/cameras.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/cameras.png -------------------------------------------------------------------------------- /gltfTutorial/images/contentPipeline.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/contentPipeline.png -------------------------------------------------------------------------------- /gltfTutorial/images/contentPipelineWithGltf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/contentPipelineWithGltf.png 
-------------------------------------------------------------------------------- /gltfTutorial/images/createPng.bat: -------------------------------------------------------------------------------- 1 | 2 | echo Create for %1 3 | 4 | "C:\Program Files\Inkscape\inkscape.exe" -z -f "%1" -D -e "%~n1.png" -------------------------------------------------------------------------------- /gltfTutorial/images/createPngs.bat: -------------------------------------------------------------------------------- 1 | 2 | for %%i in (*.svg) do call createPng.bat %%i 3 | -------------------------------------------------------------------------------- /gltfTutorial/images/gltfJsonStructure.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/gltfJsonStructure.png -------------------------------------------------------------------------------- /gltfTutorial/images/gltfStructure.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/gltfStructure.png -------------------------------------------------------------------------------- /gltfTutorial/images/materials.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/materials.png -------------------------------------------------------------------------------- /gltfTutorial/images/matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/matrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/matrix.tex: -------------------------------------------------------------------------------- 1 | \begin{align*} 2 | M = 3 | \[ \left( \begin{array}{cccc} 4 | 2.0 & 0.0 & 0.0 & 10.0 \\ 5 | 0.0 & 0.866 & -0.25 & 20.0 \\ 6 | 0.0 & 0.5 & 0.433 & 30.0 \\ 7 | 0.0 & 0.0 & 0.0, & 1.0 \end{array} 8 | \right)\] 9 | \end{align*} 10 | 11 | 12 | -------------------------------------------------------------------------------- /gltfTutorial/images/meshPrimitiveAttributes.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/meshPrimitiveAttributes.png -------------------------------------------------------------------------------- /gltfTutorial/images/metallicRoughnessSpheres.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/metallicRoughnessSpheres.png -------------------------------------------------------------------------------- /gltfTutorial/images/productMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/productMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/productMatrix.tex: 
-------------------------------------------------------------------------------- 1 | \begin{align*} 2 | M = T * R * S = 3 | \[ \left( \begin{array}{cccc} 4 | 2.0 & 0.0 & 0.0 & 10.0 \\ 5 | 0.0 & 0.866 & -0.25 & 20.0 \\ 6 | 0.0 & 0.5 & 0.433 & 30.0 \\ 7 | 0.0 & 0.0 & 0.0, & 1.0 \end{array} 8 | \right)\] 9 | \end{align*} 10 | 11 | 12 | -------------------------------------------------------------------------------- /gltfTutorial/images/rotationMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/rotationMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/rotationMatrix.tex: -------------------------------------------------------------------------------- 1 | \begin{align*} 2 | R = 3 | \[ \left( \begin{array}{cccc} 4 | 1.0 & 0.0 & 0.0 & 0.0 \\ 5 | 0.0 & 0.866 & -0.5 & 0.0 \\ 6 | 0.0 & 0.5 & 0.866 & 0.0 \\ 7 | 0.0 & 0.0 & 0.0 & 1.0 \end{array} 8 | \right)\] 9 | \end{align*} -------------------------------------------------------------------------------- /gltfTutorial/images/scaleMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/scaleMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/scaleMatrix.tex: -------------------------------------------------------------------------------- 1 | \begin{align*} 2 | S = 3 | \[ \left( \begin{array}{cccc} 4 | 2.0 & 0.0 & 0.0 & 0.0 \\ 5 | 0.0 & 1.0 & 0.0 & 0.0 \\ 6 | 0.0 & 0.0 & 0.5 & 0.0 \\ 7 | 0.0 & 0.0 & 0.0 & 1.0 \end{array} 8 | \right)\] 9 | \end{align*} -------------------------------------------------------------------------------- /gltfTutorial/images/sceneGraph.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/sceneGraph.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMaterial.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleMaterial.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMeshes.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleMeshes.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMorph.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleMorph.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMorphInitial.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleMorphInitial.png 
-------------------------------------------------------------------------------- /gltfTutorial/images/simpleMorphInitial.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 22 | 24 | 32 | 37 | 38 | 46 | 52 | 53 | 54 | 76 | 79 | 80 | 82 | 83 | 85 | image/svg+xml 86 | 88 | 89 | 90 | 91 | 92 | 96 | 102 | 108 | 114 | 120 | 126 | 132 | 138 | 144 | (0.0, 0.0, 0.0) 156 | (1.0, 0.0, 0.0) 168 | (0.5, 0.5, 0.0) 180 | (-1.0, 1.0, 0.0) 192 | ( 1.0, 1.0, 0.0) 204 | 205 | 206 | -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMorphIntermediate.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleMorphIntermediate.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleMorphIntermediate.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 22 | 24 | 32 | 37 | 38 | 46 | 52 | 53 | 54 | 77 | 80 | 81 | 83 | 84 | 86 | image/svg+xml 87 | 89 | 90 | 91 | 92 | 96 | 102 | 108 | 114 | 120 | 126 | 132 | 138 | 144 | (0.0, 0.0, 0.0) 155 | (1.0, 0.0, 0.0) 166 | ( 0.5, 0.5, 0.0)+ 0.25 * (-1.0, 1.0, 0.0)+ 1.0 * ( 1.0, 1.0, 0.0)= ( 1.25, 1.75, 0.0) 196 | (-1.0, 1.0, 0.0) 207 | ( 1.0, 1.0, 0.0) 218 | 219 | 220 | -------------------------------------------------------------------------------- /gltfTutorial/images/simpleSkin.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleSkin.gif -------------------------------------------------------------------------------- /gltfTutorial/images/simpleSkinOutline01.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleSkinOutline01.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleSkinOutline02.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleSkinOutline02.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleSparseAccessor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleSparseAccessor.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleSparseAccessorDescription.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleSparseAccessorDescription.png -------------------------------------------------------------------------------- /gltfTutorial/images/simpleTexture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/simpleTexture.png 
-------------------------------------------------------------------------------- /gltfTutorial/images/skinInverseBindMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/skinInverseBindMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/skinJointMatrices.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/skinJointMatrices.png -------------------------------------------------------------------------------- /gltfTutorial/images/skinSkinMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/skinSkinMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/testTexture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/testTexture.png -------------------------------------------------------------------------------- /gltfTutorial/images/translationMatrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/translationMatrix.png -------------------------------------------------------------------------------- /gltfTutorial/images/translationMatrix.tex: -------------------------------------------------------------------------------- 1 | \begin{align*} 2 | T = 3 | \[ \left( \begin{array}{cccc} 4 | 1.0 & 0.0 & 0.0 & 10.0 \\ 5 | 0.0 & 1.0 & 0.0 & 20.0 \\ 6 | 0.0 & 0.0 & 1.0 & 30.0 \\ 7 | 0.0 & 0.0 & 0.0 & 1.0 \end{array} 8 | \right)\] 9 | \end{align*} -------------------------------------------------------------------------------- /gltfTutorial/images/triangle.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/triangle.png -------------------------------------------------------------------------------- /gltfTutorial/images/triangleWithSimpleMaterial.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/javagl/glTF-Tutorials/f4e7fdfb8e6b6e7b82751e9fd0e8794020030889/gltfTutorial/images/triangleWithSimpleMaterial.png --------------------------------------------------------------------------------