├── CNAME ├── docs ├── Meta │ ├── Meta.md │ ├── VR Locomotion Design Guide.md │ ├── Designing for virtual reality 3 tips for content strategists.md │ ├── Mixed Reality Design Guidelines.md │ ├── User Input.md │ ├── Designing for Hands.md │ ├── Locomotion Best Practices.md │ ├── A blueprint for designing inclusive ARVR experiences.md │ └── Designing Accessible VR.md ├── Apple │ ├── Apple.md │ ├── Principles of spatial design.md │ ├── Design for spatial input.md │ ├── Explore immersive sound design.md │ ├── Augmented Reality.md │ ├── Create accessible spatial experiences.md │ ├── Design for spatial user interfaces.md │ ├── Designing for visionOS.md │ └── Design considerations for vision and motion.md ├── Google │ ├── Google.md │ ├── Augmented reality design guidelines.md │ ├── VR Performance best practices.md │ └── Using type in AR & VR.md ├── Others │ ├── Others.md │ └── XR Accessibility User Requirements.md ├── Qualcomm │ ├── Qualcomm.md │ └── Hand Tracking Best practices.md ├── Microsoft │ ├── Microsoft.md │ ├── Spatial sound.md │ ├── Comfort.md │ ├── Eye-gaze-based interaction.md │ ├── Voice input.md │ └── Shared experiences.md ├── Ultraleap │ ├── Ultraleap.md │ └── Design principles.md ├── Magic Leap │ ├── Magic Leap.md │ ├── Privacy, Security & Safety Best Practices.md │ ├── Improve Visual Stability.md │ └── Comfort and Content Placement.md ├── Resources │ ├── Resources.md │ ├── Prompt I use.md │ ├── Communities.md │ ├── Good reads.md │ └── Useful tools.md └── Road to VR │ ├── Road to VR.md │ ├── 7 Lessons ‘Sea of Thieves’ Can Teach Us About Great VR Game Design.md │ ├── Inside VR Design - The Interface of ‘Electronauts’.md │ ├── The Design Behind ‘Cubism’s’ Hand-tracking.md │ ├── Exploring Methods for Conveying Object Weight in Virtual Reality.md │ ├── Inside VR Design - The Clever Weapons, Locomotion, & Open-world of ‘Stormland’.md │ ├── The Design & Implementation of Hand-tracking in ‘Myst’.md │ └── A Concise Beginner’s Guide to Apple Vision Pro Design & Development.md ├── _config.yml ├── .gitignore ├── Gemfile ├── .github └── workflows │ ├── ci.yml │ ├── pages.yml │ └── jekyll.yml ├── LICENSE ├── README.md ├── index.md └── Gemfile.lock /CNAME: -------------------------------------------------------------------------------- 1 | xrdesignhandbook.com 2 | -------------------------------------------------------------------------------- /docs/Meta/Meta.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Meta 4 | has_children: true 5 | permalink: docs/Meta 6 | nav_order: 1 7 | --- 8 | 9 | # Meta 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Apple/Apple.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Apple 4 | has_children: true 5 | permalink: docs/Apple 6 | nav_order: 1 7 | --- 8 | 9 | # Apple 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Google/Google.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Google 4 | has_children: true 5 | permalink: docs/Google 6 | nav_order: 1 7 | --- 8 | 9 | # Google 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Others/Others.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Others 4 | 
has_children: true 5 | permalink: docs/Others 6 | nav_order: 100 7 | --- 8 | 9 | # Others 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Qualcomm/Qualcomm.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Qualcomm 4 | has_children: true 5 | permalink: docs/Qualcomm 6 | nav_order: 1 7 | --- 8 | 9 | # Qualcomm 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Microsoft/Microsoft.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Microsoft 4 | has_children: true 5 | permalink: docs/Microsoft 6 | nav_order: 1 7 | --- 8 | 9 | # Microsoft 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Ultraleap/Ultraleap.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Ultraleap 4 | has_children: true 5 | permalink: docs/Ultraleap 6 | nav_order: 1 7 | --- 8 | 9 | # Ultraleap 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Magic Leap/Magic Leap.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Magic Leap 4 | has_children: true 5 | permalink: docs/Magic Leap 6 | nav_order: 1 7 | --- 8 | 9 | # Magic Leap 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Resources/Resources.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Resources 4 | has_children: true 5 | permalink: docs/Resources 6 | nav_order: 100 7 | --- 8 | 9 | # Resources 10 | {: .no_toc } -------------------------------------------------------------------------------- /docs/Road to VR/Road to VR.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Road to VR 4 | has_children: true 5 | permalink: docs/Road to VR 6 | nav_order: 1 7 | --- 8 | 9 | # Road to VR 10 | {: .no_toc } -------------------------------------------------------------------------------- /_config.yml: -------------------------------------------------------------------------------- 1 | title: XR Design Handbook 2 | description: a Compilation of XR Design Guidelines 🚀 3 | theme: just-the-docs 4 | 5 | url: https://www.xrdesignhandbook.com/ 6 | 7 | aux_links: 8 | Check out our repository!: https://github.com/jackyangzzh/XR-Design-Handbook 9 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Copied from https://github.com/github/gitignore/blob/main/Jekyll.gitignore 2 | # Ignore metadata generated by Jekyll 3 | _site/ 4 | .sass-cache/ 5 | .jekyll-cache/ 6 | .jekyll-metadata 7 | 8 | # Ignore folders generated by Bundler 9 | .bundle/ 10 | vendor/ 11 | -------------------------------------------------------------------------------- /Gemfile: -------------------------------------------------------------------------------- 1 | source 'https://rubygems.org' 2 | 3 | gem "jekyll", "~> 4.3.2" # installed by `gem jekyll` 4 | # gem "webrick" # required when using Ruby >= 3 and Jekyll <= 4.2.2 5 | 6 | gem "just-the-docs", "0.5.3" # pinned to the current release 7 | # 
gem "just-the-docs" # always download the latest release 8 | -------------------------------------------------------------------------------- /.github/workflows/ci.yml: -------------------------------------------------------------------------------- 1 | name: CI 2 | 3 | on: 4 | push: 5 | branches: ["main"] 6 | pull_request: 7 | 8 | jobs: 9 | # Build job 10 | build: 11 | runs-on: ubuntu-latest 12 | steps: 13 | - name: Checkout 14 | uses: actions/checkout@v3 15 | 16 | - name: Setup Ruby 17 | uses: ruby/setup-ruby@v1 18 | with: 19 | ruby-version: '3.2.2' # Updated Ruby version 20 | bundler-cache: true # Automatically caches installed gems 21 | cache-version: 1 # Incremented cache version to ensure cache updates 22 | 23 | - name: Build with Jekyll 24 | run: bundle exec jekyll build 25 | -------------------------------------------------------------------------------- /docs/Resources/Prompt I use.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Prompt I use 4 | parent: Resources 5 | --- 6 | 7 | # Prompt I use 8 | Inspired by [The best youtube summary prompt I've come up with](https://www.reddit.com/r/ChatGPT/comments/11pd4sl/the_best_youtube_summary_prompt_ive_come_up_with/) 9 | 10 | ## Starting prompt 11 | - Creating a concise, one paragraph summary using to supply college student notes to use himself. You are to act like an expert in the subject the document is written about. 12 | 13 | - Create 10 numbered bullet points (each with an appropriate emoji) that summarize the key point from the document, each bullet point should prioritize on being clear and comprehensive instead of succinct. Elaborate on each bullet point wherever possible. You are to act like an expert in the subject the document is written about. 14 | 15 | - Extract the most important keywords and any complex words not known to the average reader as well as any acronyms mentioned as bullet points. For each keyword and complex word, provide an explanation and definition based on its occurrence in the document. -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Jack Yang 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /docs/Resources/Communities.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Communities 4 | parent: Resources 5 | --- 6 | 7 | ## Subreddits 8 | - [r/Vive](https://www.reddit.com/r/Vive/): A community for Vive enthusiasts. 9 | - [r/Oculus](https://www.reddit.com/r/oculus/): Dedicated to Oculus fans and discussions about VR. 10 | - [r/OculusQuest](https://www.reddit.com/r/OculusQuest/): A subreddit focusing more on the Quest lineup. 11 | - [r/PSVR](https://www.reddit.com/r/PSVR/): Focused on PlayStation VR. 12 | - [r/virtualreality](https://www.reddit.com/r/virtualreality/): A general subreddit for all things related to virtual reality. 13 | - [r/augmentedreality](https://www.reddit.com/r/augmentedreality/): A general subreddit for all things related to augmented reality. 14 | - [r/VisionPro](https://www.reddit.com/r/VisionPro/): For Apple Vision Pro headset users. 15 | - [r/windowsMR](https://www.reddit.com/r/WindowsMR/): Discussions about Windows Mixed Reality. 16 | - [r/SteamVR](https://www.reddit.com/r/SteamVR/): For all SteamVR users. 17 | - [r/VRGaming](https://www.reddit.com/r/VRGaming/): For all VR gaming enthusiasts. 18 | - [r/metaverse](https://www.reddit.com/r/metaverse/): Not exactly an AR/VR subreddit, but it often discusses XR. 19 | -------------------------------------------------------------------------------- /docs/Resources/Good reads.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Good reads 4 | parent: Resources 5 | --- 6 | 7 | ## AR / VR Design 8 | 9 | - [The Design of Virtual and Augmented Reality](https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/) (thanks [@rubychen](https://github.com/rubychen)!) 10 | - [Microsoft Mixed Reality Design](https://learn.microsoft.com/en-us/windows/mixed-reality/design/design) 11 | - [Apple Human Interface Guidelines](https://developer.apple.com/design/human-interface-guidelines) 12 | - [Design at Meta](https://design.facebook.com/) 13 | - [Unproject Spatial Design Library](https://www.unproject.ai/blog-pages/spatial-design-library) (thanks [@Viba](https://medium.com/@viba)!) 14 | 15 | ## AR / VR News 16 | - [Road to VR](https://www.roadtovr.com/): A leading independent VR news publication covering PC VR, Quest, PSVR, Apple Vision Pro, and more. 17 | - [VRScout](https://vrscout.com/): A platform for virtual reality news, videos, and immersive content, inspiring a community of VR explorers and creators. 18 | - [XR Today](https://www.xrtoday.com/): A source for extended reality (XR) industry news, focusing on AR, VR, and MR technologies in business. 19 | - [UploadVR](https://www.uploadvr.com/): A website covering the latest developments in virtual reality and augmented reality, including news, reviews, and guides. 20 | - [MIXED Reality News](https://mixed-news.com/en/): A platform reporting on virtual reality, augmented reality, and artificial intelligence trends and updates. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # 📖 XR Design Handbook 2 | 3 | This repository contains a compilation of XR design guidelines from numerous leading companies in the field.
This is a goldmine for both seasoned designers and newcomers to XR, ensuring we can create user-friendly and effective experiences. 4 | 5 | ## 🎯 What's this for? 6 | 7 | Designed for designers, developers, and all XR enthusiasts, the XR Design Handbook serves as a centralized resource. Its aim? To fuse the design philosophies from various organizations into one comprehensive guide. This fusion provides key insights and best practices, empowering creators to craft immersive and captivating XR experiences. 8 | 9 | ## 🔍 How to use it? 10 | 11 | To dive deep into the XR Design Handbook, either check out the readme files or visit our [user-friendly website](https://www.xrdesignhandbook.com/). This site, generated from the repository, lets you easily search and read the design summaries. Harness these guidelines to shape standout XR experiences. 12 | 13 | ## 🤝 How to contribute? 14 | Here's the catch: I can't do it alone. The world of XR is vast and always evolving. To truly make this handbook valuable, I need your help. 15 | 16 | Every contribution counts, whether it's: 17 | 18 | - Submitting a new guideline 19 | - Correcting or updating existing ones 20 | - Sharing your own experiences and insights 21 | 22 | ## 📄 License 23 | 24 | Every piece of information in the XR Design Handbook is released under the MIT License. To delve into specifics, kindly consult the LICENSE file. 25 | 26 | **Cheers to building better virtual worlds! 🌐** 27 | -------------------------------------------------------------------------------- /index.md: -------------------------------------------------------------------------------- 1 | --- 2 | title: XR Design Handbook 3 | layout: home 4 | nav_order: 0 5 | --- 6 | 7 | # 📖 XR Design Handbook 8 | 9 | This repository contains a compilation of XR design guidelines from numerous leading companies in the field. This is a goldmine for both seasoned designers and newcomers to XR, ensuring we can create user-friendly and effective experiences. 10 | 11 | ## 🎯 What's this for? 12 | 13 | Designed for designers, developers, and all XR enthusiasts, the XR Design Handbook serves as a centralized resource. Its aim? To fuse the design philosophies from various organizations into one comprehensive guide. This fusion provides key insights and best practices, empowering creators to craft immersive and captivating XR experiences. 14 | 15 | ## 🔍 How to use it? 16 | 17 | To dive deep into the XR Design Handbook, either browse through the website or visit our [Github repository](https://github.com/jackyangzzh/XR-Design-Handbook). This site, generated from the repository, lets you easily search and read the design summaries. Harness these guidelines to shape standout XR experiences. 18 | 19 | ## 🤝 How to contribute? 20 | Here's the catch: I can't do it alone. The world of XR is vast and always evolving. To truly make this handbook valuable, I need your help. 21 | 22 | Every contribution counts, whether it's: 23 | 24 | - Submitting a new guideline 25 | - Correcting or updating existing ones 26 | - Sharing your own experiences and insights 27 | 28 | ## 📄 License 29 | 30 | Every piece of information in the XR Design Handbook is released under the MIT License. To delve into specifics, kindly consult the LICENSE file. 31 | 32 | **Cheers to building better virtual worlds!
🌐** 33 | -------------------------------------------------------------------------------- /docs/Resources/Useful tools.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Useful tools 4 | parent: Resources 5 | --- 6 | 7 | ## Prototyping 8 | 9 | - [ShapesXR](https://www.shapesxr.com/): Spatial design and prototyping in VR/AR. 10 | - [Figma](https://www.figma.com/): Collaborative 2D interface design tool. 11 | - [Arkio](https://www.arkio.is/): Collaborative spatial design in VR/AR. 12 | - [Gravity Sketch](https://www.gravitysketch.com/): Intuitive 3D design platform. 13 | 14 | ## Game Engines 15 | - [Unity](https://unity.com/): A powerful game engine for creating and deploying games, films, and immersive experiences across various platforms. 16 | - [Unreal](https://www.unrealengine.com/en-US): A versatile tool for creating photorealistic and immersive 3D content for games, film, TV, architecture, and more. 17 | - [RealityKit](https://developer.apple.com/documentation/realitykit): An AR-first 3D framework that seamlessly integrates virtual objects into the real world, leveraging ARKit. 18 | - [WebXR](https://immersiveweb.dev/): A standard/API for accessing virtual reality and augmented reality devices in compatible web browsers, enabling immersive experiences (see the minimal session sketch at the end of this page). 19 | - [8th Wall](https://www.8thwall.com/): A platform for web-based augmented reality experiences without the need for an app, spanning games, avatars, portals, and virtual try-outs. 20 | 21 | ## Modeling 22 | - [Blender](https://www.blender.org/): A powerful and versatile 3D software for modeling, animation, rendering, and VFX, available as free and open-source software. 23 | - [Maya](https://www.autodesk.com/products/maya/overview): Professional 3D modeling, animation, and simulation software used for creating realistic characters and effects. 24 | - [Cinema 4D](https://www.maxon.net/en/cinema-4d): Intuitive 3D modeling and animation software for motion graphics, simulations, and immersive environments.
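Of the engines above, WebXR is the only one that is just a browser API, so it is easy to show how little code an immersive session needs. Below is a minimal TypeScript sketch of the usual session handshake; it assumes a WebXR-capable browser (with the `@types/webxr` typings) and leaves out real rendering and error handling:

```typescript
// Minimal WebXR session bootstrap (a sketch, not production code).
// Assumes a WebXR-capable browser; types come from @types/webxr.
async function enterVR(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr) throw new Error("WebXR is not available in this browser");
  if (!(await navigator.xr.isSessionSupported("immersive-vr"))) {
    throw new Error("immersive-vr sessions are not supported on this device");
  }
  // requestSession must be called from a user gesture, e.g. a button click.
  const session = await navigator.xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor", "hand-tracking"],
  });
  const gl = canvas.getContext("webgl2", { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local-floor");
  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace); // head pose for this frame
    // ...draw one view per entry in pose?.views here...
    session.requestAnimationFrame(onFrame);
  });
}
```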
25 | -------------------------------------------------------------------------------- /.github/workflows/pages.yml: -------------------------------------------------------------------------------- 1 | name: Deploy Jekyll site to Pages 2 | 3 | on: 4 | push: 5 | branches: ["main"] 6 | 7 | # Allows you to run this workflow manually from the Actions tab 8 | workflow_dispatch: 9 | 10 | # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages 11 | permissions: 12 | contents: read 13 | pages: write 14 | id-token: write 15 | 16 | # Allow one concurrent deployment 17 | concurrency: 18 | group: "pages" 19 | cancel-in-progress: false # Changed to false to allow in-progress runs to complete 20 | 21 | jobs: 22 | # Build job 23 | build: 24 | runs-on: ubuntu-latest 25 | steps: 26 | - name: Checkout 27 | uses: actions/checkout@v3 28 | 29 | - name: Setup Ruby 30 | uses: ruby/setup-ruby@v1 31 | with: 32 | ruby-version: '3.2.2' # Updated Ruby version to 3.2.2 or higher 33 | bundler-cache: true # Automatically caches installed gems 34 | cache-version: 1 # Incremented cache version to refresh cache 35 | 36 | - name: Setup Pages 37 | id: pages 38 | uses: actions/configure-pages@v3 39 | 40 | - name: Install Dependencies 41 | run: bundle install 42 | 43 | - name: Build with Jekyll 44 | # Outputs to the './_site' directory by default 45 | run: bundle exec jekyll build --baseurl "${{ steps.pages.outputs.base_path }}" 46 | env: 47 | JEKYLL_ENV: production 48 | 49 | - name: Upload artifact 50 | # Automatically uploads an artifact from the './_site' directory by default 51 | uses: actions/upload-pages-artifact@v2 52 | 53 | # Deployment job 54 | deploy: 55 | environment: 56 | name: github-pages 57 | url: ${{ steps.deployment.outputs.page_url }} 58 | runs-on: ubuntu-latest 59 | needs: build 60 | steps: 61 | - name: Deploy to GitHub Pages 62 | id: deployment 63 | uses: actions/deploy-pages@v2 64 | -------------------------------------------------------------------------------- /.github/workflows/jekyll.yml: -------------------------------------------------------------------------------- 1 | # This workflow uses actions that are not certified by GitHub. 2 | # They are provided by a third-party and are governed by 3 | # separate terms of service, privacy policy, and support 4 | # documentation. 5 | 6 | # Workflow for building and deploying a Jekyll site to GitHub Pages 7 | name: Deploy Jekyll site to Pages 8 | 9 | on: 10 | # Runs on pushes targeting the main branch 11 | push: 12 | branches: ["main"] 13 | 14 | # Allows you to run this workflow manually from the Actions tab 15 | workflow_dispatch: 16 | 17 | # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages 18 | permissions: 19 | contents: read 20 | pages: write 21 | id-token: write 22 | 23 | # Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued. 24 | # Do not cancel in-progress runs to allow these production deployments to complete. 
25 | concurrency: 26 | group: "pages" 27 | cancel-in-progress: false 28 | 29 | jobs: 30 | # Build job 31 | build: 32 | runs-on: ubuntu-latest 33 | steps: 34 | - name: Checkout 35 | uses: actions/checkout@v3 36 | 37 | - name: Setup Ruby 38 | uses: ruby/setup-ruby@v1 39 | with: 40 | ruby-version: '3.2.2' # Updated Ruby version to 3.2.2 41 | bundler-cache: true # Automatically caches installed gems 42 | cache-version: 1 # Incremented cache version to refresh cache 43 | 44 | - name: Setup Pages 45 | id: pages 46 | uses: actions/configure-pages@v3 47 | 48 | - name: Install Dependencies 49 | run: bundle install 50 | 51 | - name: Build with Jekyll 52 | # Outputs to the './_site' directory by default 53 | run: bundle exec jekyll build --baseurl "${{ steps.pages.outputs.base_path }}" 54 | env: 55 | JEKYLL_ENV: production 56 | 57 | - name: Upload artifact 58 | # Automatically uploads an artifact from the './_site' directory by default 59 | uses: actions/upload-pages-artifact@v2 60 | 61 | # Deployment job 62 | deploy: 63 | environment: 64 | name: github-pages 65 | url: ${{ steps.deployment.outputs.page_url }} 66 | runs-on: ubuntu-latest 67 | needs: build 68 | steps: 69 | - name: Deploy to GitHub Pages 70 | id: deployment 71 | uses: actions/deploy-pages@v2 72 | -------------------------------------------------------------------------------- /docs/Magic Leap/Privacy, Security & Safety Best Practices.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Privacy, Security & Safety Best Practices 4 | parent: Magic Leap 5 | --- 6 | 7 | # Privacy, Security & Safety Best Practices 8 | Original article: [Privacy, Security & Safety Best Practices](https://developer-docs.magicleap.cloud/docs/guides/best-practices/privacy-security-safety) 9 | 10 | ## TL;DR 11 | Developers for Magic Leap 2 must prioritize user privacy, security, and safety. Transparent data practices, minimal data collection, and robust user support are key. While Magic Leap offers essential tools, developers are responsible for proper implementation and maintaining user trust. 12 | 13 | ## Bullet points 14 | 1. **📜 Compliance & Standards**: Adhere to all applicable laws and industry standards, avoiding the use of unlicensed intellectual property. 15 | 16 | 2. **🛡️ Data Privacy Disclosure**: Offer a clear privacy policy detailing data usage, retention policies, and sharing mechanisms, updating it for any new practices or features. 17 | 18 | 3. **🖐️ User Permission**: Obtain permission prior to accessing any user data, ensuring default settings prioritize privacy, and provide contextual clarity about the need for permissions. 19 | 20 | 4. **📉 Data Minimization**: Limit data collection to only what's necessary, anonymizing and aggregating where feasible, and restricting unrelated data combinations. 21 | 22 | 5. **🔒 Security-centric Approach**: Prioritize data security at each stage of application development, employing encryption and only using documented APIs. 23 | 24 | 6. **📲 Safe Storage**: Store non-confidential data internally and use secure storage APIs for sensitive information, refraining from logging data unnecessarily. 25 | 26 | 7. **🌐 Third-party Code Due Diligence**: Exercise caution when incorporating third-party code, ensuring it doesn’t compromise security. 27 | 28 | 8. **💡 Safety Protocols**: Offer safety indicators and warnings, prevent possible harm scenarios, and conduct risk assessments for potential application hazards. 29 | 30 | 9. 
**🔧 Device Intended Use**: Ensure the application’s usage aligns with the device’s primary purpose, notifying users of any system setting changes. 31 | 32 | 10. **📞 Support Availability**: Offer communication avenues for user queries and feedback without redirecting them to Magic Leap for app-specific concerns. 33 | 34 | ## Keywords 35 | - **Magic Leap 2**: Magic Leap's mixed-reality headset; in this context, a device with capabilities like eye tracking and spatial mapping. 36 | 37 | - **Eye Tracking Data**: Information about where a user is looking on a device screen or within a virtual environment. 38 | 39 | - **Spatial Maps**: Digital representations of physical environments. In this context, it refers to potentially sensitive layouts of workspaces or areas. 40 | 41 | - **Anonymized**: Data that is stripped of personal identifiers, making it impossible to trace back to the original source. 42 | 43 | - **Aggregated**: Data that's combined from several measurements. In privacy, it's often used to generalize datasets so individuals cannot be easily identified. 44 | 45 | - **PII (Personally Identifiable Information)**: Information that can be used to identify an individual either by itself or with other data. 46 | 47 | - **APIs (Application Programming Interfaces)**: Sets of rules and protocols for building and interacting with software applications. 48 | 49 | - **TLS (Transport Layer Security)**: A cryptographic protocol designed to provide communications security over a computer network. 50 | 51 | - **Penetration Testing**: An authorized cyber attack on a computer system, performed to evaluate its security. 52 | -------------------------------------------------------------------------------- /docs/Apple/Principles of spatial design.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Principles of Spatial Design 4 | parent: Apple 5 | --- 6 | 7 | # Principles of Spatial Design 8 | Original Video: [Principles of Spatial Design](https://developer.apple.com/videos/play/wwdc2023/10072/) 9 | 10 | 11 | ## TL;DR 12 | This video transcript focuses on the principles of spatial design for a spatial operating system on the Apple platform. It covers topics such as window design, size flexibility, multiple windows, human-centered design, dimensionality, immersion, and user comfort. The aim is to create familiar, authentic, and immersive experiences for users within the spatial environment. 13 | 14 | 15 | ## Bullet Points 16 | 1. 🧭 Design spatial apps that balance familiarity and new possibilities. Use elements like sidebars and tabs to help users navigate and find what they're looking for. 17 | 2. 🪟 Windows in spatial design have a new visual language, with glass material for contrast and adaptability to different lighting conditions. 18 | 3. 📏 Size and shape flexibility in windows allows for comfortable viewing and accommodation of different types of content. Use tab bars and toolbars to extend beyond the window. 19 | 4. 📐 A points system is used to ensure scalable and legible interface elements at different distances. Design interfaces with points to adapt to various screen sizes. 20 | 5. 👤 Human-centered design is essential in spatial apps. Consider the user's field of view, design with ergonomics in mind, and place content relative to the user's head and position. 21 | 6. 🌌 Dimension and immersion are key in spatial design.
Utilize the space around users, leverage depth and scale to create hierarchy and focus, and design immersive experiences beyond windows. 22 | 7. 🔄 Immersive experiences can exist on a spectrum. Start in a window in the shared space, and provide clear ways in and out of immersive experiences. 23 | 8. 🌍 Blend app content with the user's physical surroundings for a more seamless experience. Soft edges help integrate virtual content smoothly. Add animation and sound to enhance realism. 24 | 9. 🛋️ Ensure user comfort and control in immersive experiences. Avoid abrupt or fast movements, provide clear labels and symbols for guidance, and allow users to view their physical surroundings during motion. 25 | 10. 🍎 Create authentic spatial apps that utilize the unique capabilities of the platform. Make the experience engaging, distinct, and memorable by using depth, scale, and audio effectively. 26 | 27 | ## Keywords 28 | - **Spatial operating system**: An operating system designed for spatial computing that provides an immersive, three-dimensional user experience. 29 | - **Dimensionality**: The quality of having depth and spatial characteristics in a virtual environment. 30 | - **Human-centered design**: Designing with a focus on the needs, capabilities, and comfort of the user. 31 | - **Immersion spectrum**: A range of experiences varying in their level of immersion and integration with the physical world. 32 | - **Ergonomics**: The study of designing and arranging objects and interfaces for optimal user comfort and efficiency. 33 | - **Spatialized audio**: Audio that is positioned in a virtual environment to create a sense of space and realism. 34 | - **Field of view**: The extent of the visible world or scene that a person can see at any given moment. 35 | - **Hierarchy**: Organizing elements based on their relative importance and visual prominence. 36 | - **Points**: A unit of measurement used to specify the size and scale of interface elements. 37 | - **Authenticity**: Creating experiences that are genuine, true to the spatial platform's capabilities, and unique to the user's context. 38 | 39 | -------------------------------------------------------------------------------- /docs/Magic Leap/Improve Visual Stability.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Improve Visual Stability 4 | parent: Magic Leap 5 | --- 6 | 7 | # Improve Visual Stability 8 | Original article: [Improve Visual Stability](https://developer-docs.magicleap.cloud/docs/guides/best-practices/improve-visual-stability) 9 | 10 | ## TL;DR 11 | In augmented and mixed reality, Visual Stability anchors virtual objects in the real world. Optimal realism relies on adjusting Focus Distance and Near/Far Clipping Planes based on user interactions and scene content. However, developer oversights, especially in Unity applications, can disrupt this stability. 12 | 13 | ## Bullet points 14 | 15 | 1. **🔍 Visual Stability**: This sensation ensures that virtual objects feel grounded and consistent in the real world, especially as you move your device around. High visual stability means that virtual objects remain steadfast and anchored, enhancing the augmented or mixed reality experience's realism. 16 | 17 | 2. **📏 Focus Distance**: This is a pivotal measurement indicating the distance between the headset and a chosen virtual plane. When this distance closely matches the primary object the user is looking at, the object appears more stable and realistic. 
In settings with multiple objects, aligning this distance with the furthest object is usually best unless user interactions indicate otherwise. 18 | 19 | 3. **🔲 Near/Far Clipping Planes**: Invisible boundaries determining the range at which virtual content appears. Keeping these boundaries tight and adaptive to the content in view ensures clarity and grounded visuals. Dynamic adjustments, based on scene content, offer the user a clear and undistracted view of the virtual elements. 20 | 21 | 4. **🖼️ World Lock Experience (Pixel Stick Quality)**: A critical aspect of AR and MR applications, this factor dictates how embedded and fixed virtual content feels in the real world. Its effectiveness is closely linked to the correct setting of the focus distance and clipping planes, emphasizing the developer's role in adjusting these parameters accurately. 22 | 23 | 5. **⚠️ Common Pitfalls**: Some typical mistakes developers make include neglecting to set a focus distance, particularly in Unity-driven applications; defaulting to a large, static focus distance value, which disrupts the stability of near objects; and routinely setting the focus on the closest object in a scene with multiple items, causing distant objects to feel unstable. 24 | 25 | 6. **🛠️ Dynamic Adjustments**: Rather than sticking to static values, developers are advised to dynamically alter the near/far clipping planes and focus distance based on the scene's content. This approach ensures optimal visual stability as the environment or user's focus changes, leading to a more immersive experience. 26 | 27 | 7. **👁️ User Interactions as Cues**: When multiple virtual objects are present, user cues such as eye-tracking, controller interactions, or hand gestures can provide invaluable insights. Developers can leverage this information to adjust the focus distance, prioritizing objects that the user is most likely to interact with or focus on. 28 | 29 | ## Keywords 30 | - **Visual Stability**: The degree to which virtual content remains fixed in position and orientation relative to the real world, especially as the user moves or shifts their gaze. 31 | 32 | - **Focus Distance**: A parameter that determines the ideal virtual distance for rendering content to ensure that it appears stable and anchored to the real world. 33 | 34 | - **MLGraphicsFrameParamsEx**: Structure that defines the frame parameters for rendering the next frame in ML2. 35 | 36 | - **World Lock Experience/Pixel Stick Quality**: The extent to which virtual content feels integrated and stable in the real world when viewed through mixed-reality devices. 37 | 38 | - **Near/Far Clipping Planes**: Virtual boundaries that determine the closest and farthest points at which content will be rendered in a mixed-reality scene. -------------------------------------------------------------------------------- /docs/Apple/Design for spatial input.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Design for spatial input 4 | parent: Apple 5 | --- 6 | 7 | # Design for spatial input 8 | Original video: [Design for spatial input](https://developer.apple.com/videos/play/wwdc2023/10073/) 9 | 10 | ## TL;DR 11 | The video introduces an interactive system that uses hand-eye coordination to deliver an immersive user experience. Standard and custom gestures, precise navigation through eye tracking, and direct touch interaction with ergonomic consideration are key features. 
The system compensates for lack of physical touch with audio cues, and ensures accessibility via compatibility with assistive technologies, fully utilizing the spatial medium's potential. 12 | 13 | ## Bullet Points 14 | 1. 👋 **Hand-Eye Interactions**: The system is primarily interacted with through hand gestures, supported by eye targeting, making for a more intuitive and immersive experience. 15 | 2. 👌 **Standard Gestures**: Pinching, dragging, zooming, and rotating are standard gestures modeled after familiar multi-touch interactions on touch screen devices, providing a seamless transition for users. 16 | 3. ✋ **Custom Gestures**: Developers have the ability to define custom gestures for unique app behaviors, provided they are distinctly different from standard gestures and don't strain the user or have a high rate of false activations. 17 | 4. 👁️ **Eye Tracking**: The system utilizes eye direction as a signal of intent, allowing for precise and satisfying interactions that feel natural and intuitive. 18 | 5. 🖥️ **Interaction Origin**: The start point of certain interactions, like zooming or pointer movement, is determined by the user's eye focus, enhancing the precision of interaction. 19 | 6. 💻 **Direct Interaction**: The system supports direct touch, letting users interact with the virtual environment using their fingertips, such as scrolling through a page or typing on a virtual keyboard. 20 | 7. 💪 **Ergonomics Consideration**: When designing for direct interaction, it's essential to consider the ergonomics to avoid user fatigue, especially for apps that require more direct touch. 21 | 8. 🔊 **Audio Feedback**: Audio cues are used to supplement visual cues, providing more comprehensive feedback in the absence of physical touch, making interactions feel more satisfying and reliable. 22 | 9. 🧑‍🦯 **Assistive Technology Compatibility**: The design and development of interactions should consider those using assistive technologies, ensuring accessibility for all users. 23 | 10. 🌐 **Exploiting Spatial Medium**: The combination of hand gestures and eye input allows for the creation of novel and delightful interactions, fully exploiting the potential of the spatial medium. 24 | 25 | 26 | ## Keywords 27 | - **Hand-Eye Coordination**: The ability to track the movements of the hands and adjust the direction and power of these movements based on the visual information perceived. 28 | - **Multi-touch Interactions**: Multiple points of contact with the interface, such as pressing two fingers on a screen to zoom in or out, or twisting them to rotate an image. 29 | - **Custom Gestures**: Specific, unique movements designed and defined by developers for their applications, not part of the standard system gestures. 30 | - **Eye Tracking**: The process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head, used in the system to allow precise and intuitive interactions. 31 | - **Direct Touch Interaction**: Interaction that involves physically reaching out and touching the screen or interface to control the system. 32 | - **Ergonomics**: The practice of designing products, systems, or processes to take proper account of the interaction between them and the people who use them, taken into account here to avoid user fatigue. 33 | - **Audio Cues**: Sound signals used to provide feedback and guide the user's interaction with the system. 
- **Assistive Technologies**: Array of devices that allow people with disabilities to interact with technology and data, considered in the system to ensure accessibility for all users. 35 | - **Spatial Medium**: The use of physical space for interaction in a virtual environment, exploited fully in this system to provide immersive experiences. 36 | 37 | -------------------------------------------------------------------------------- /docs/Magic Leap/Comfort and Content Placement.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Comfort and Content Placement 4 | parent: Magic Leap 5 | --- 6 | 7 | # Comfort and Content Placement 8 | Original article: [Comfort and Content Placement](https://developer-docs.magicleap.cloud/docs/guides/best-practices/comfort-content-placement) 9 | 10 | ## TL;DR 11 | When creating immersive experiences for headsets, ensure custom fit, place important content within a 30°x30° area, use visual effects sparingly, and adapt the interface to user movements to avoid discomfort and disorientation. 12 | 13 | ## Bullet points 14 | 1. 🖥️ **Device Fit Quality**: Users should use the Custom Fit application for optimal performance and comfort in the headset. This improves the visual comfort experience by tailoring the device to the individual's physical needs. 15 | 16 | 2. 🎯 **Content Placement**: The location and proximity of content to the user is crucial for maintaining comfort. If content is too close (less than 0.37m), it may lead to discomfort. However, with a Custom Fit, content further than 0.37m will be comfortable for most people. 17 | 18 | 3. 🚅 **Moving Content**: Content in motion, especially in the Z-axis (i.e., forward/backward), can cause visual discomfort. Designing content to limit the need for extended periods of viewing movement can help minimize discomfort. 19 | 20 | 4. 🕺 **User Movement**: Consider the natural tendencies of users to move their head or body while using the device. Creating flexible UI behaviors that can adapt to these movements can help improve user comfort and usability. 21 | 22 | 5. ✨ **Visual Effects**: Certain visual effects such as lens distortion, chromatic aberration, and depth of field may disorient users. They should be used judiciously to avoid inducing a sense of distorted vision. 23 | 24 | 6. 📏 **Clipping Plane**: The default near clipping plane is set at 0.37m to prevent discomfort caused by content appearing too close to the user. Viewing content any closer than this is not advisable as it can cause nausea. 25 | 26 | 7. 📐 **Optimal Content Placement Area (OCPA)**: The OCPA is a 30°x30° boundary within which content is visible to users at all times, without moving their head. Essential content should be placed within the OCPA to ensure that it's always visible to the user (see the sketch after this list). 27 | 28 | 8. 👀 **Field of View (FOV)**: Due to facial size and shape variance, every individual will not have a similar virtual FOV visible. Designers should account for this variation when designing content placement. 29 | 30 | 9. 🤲 **Interaction Ergonomics**: The placement of content relative to the user should be considered in the context of the expected user experience (controller or gesture). For direct inputs, content should be within arm's reach (0.4-0.6m). For indirect inputs, content should be beyond 0.6m. 31 | 32 | 10. 🕹️ **UI Behavior**: UI behavior should be designed taking into account how and where the user may move. Various UI behaviors include head-relative, body-relative, world-relative, and input-relative. Each behavior has its own use cases and limitations and should be chosen accordingly.
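To make the OCPA guidance concrete, here is a small geometry sketch in TypeScript. It is illustrative math only, not a Magic Leap API, and it assumes the head pose is available as an orthonormal forward/right/up basis:

```typescript
// Sketch: is a piece of content inside the 30°x30° Optimal Content
// Placement Area (OCPA)? Illustrative math only, not a Magic Leap API.
type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v.x, v.y, v.z);
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

function isInsideOCPA(
  headPos: Vec3,
  forward: Vec3, // head basis vectors, assumed orthonormal
  right: Vec3,
  up: Vec3,
  contentPos: Vec3,
  halfAngleDeg = 15, // half of the 30° window on each axis
): boolean {
  const to = normalize({
    x: contentPos.x - headPos.x,
    y: contentPos.y - headPos.y,
    z: contentPos.z - headPos.z,
  });
  const fwd = dot(to, forward);
  if (fwd <= 0) return false; // content is behind the user
  const yaw = Math.atan2(dot(to, right), fwd); // horizontal angular offset
  const pitch = Math.atan2(dot(to, up), fwd); // vertical angular offset
  const limit = (halfAngleDeg * Math.PI) / 180;
  return Math.abs(yaw) <= limit && Math.abs(pitch) <= limit;
}
```

A comfort check would combine this with the distance rules above: anything closer than 0.37m fails regardless of angle, and direct-input targets would additionally be constrained to roughly 0.4-0.6m.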
33 | 34 | ## Keywords 35 | - **Custom Fit**: An application designed to optimize the fit and performance of a device. 36 | - **Focal Plane**: The distance at which the eye is comfortably focused. 37 | - **Clipping Plane**: Invisible boundary in 3D rendering which limits the rendering of objects, set at 0.37m (near) and 10m (far). 38 | - **Z Axis**: Depth axis in 3D space, indicating forward/backward motion. 39 | - **Lens Distortion, Chromatic Aberration, Depth of Field**: Visual effects which may cause discomfort, disorientation, or a sense of distorted vision. 40 | - **Optimal Content Placement Area (OCPA)**: A 30°x30° boundary for optimal visibility of content without moving the head. 41 | - **Direct/Indirect Inputs**: Types of user interaction, with content placed within arm's reach (direct) or beyond (indirect). 42 | - **Head-Relative, Body-Relative, World-Relative, Input-Relative**: Strategies for designing UI behavior, each attached to different user positions or movements. 43 | - **Micro-Movements, Macro-Movements**: Smaller and larger user movements to be considered in UI design, such as swaying or walking across a room. 44 | - **System Overlays**: Elements like system notifications, voice command UI, and privacy indicator UI that may appear within an application. -------------------------------------------------------------------------------- /docs/Microsoft/Spatial sound.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Spatial sound 4 | parent: Microsoft 5 | --- 6 | 7 | # Spatial sound 8 | Original article: [Spatial sound](https://learn.microsoft.com/en-us/windows/mixed-reality/design/spatial-sound) 9 | 10 | ## TL;DR 11 | Spatial sound in mixed-reality environments, using head-related transfer function (HRTF), augments user experiences by providing auditory cues that inform about application states, reinforce interactions, and simulate real-world sound propagation. While enriching the immersive quality, developers are advised to exercise restraint in sound use to maintain usability and avoid cognitive overload. 12 | 13 | ## Bullet points 14 | 1. 🔈 **In low-light situations, our sense of hearing keeps us safe**: This capability evolved as a survival mechanism, allowing humans to react to unseen threats and obstacles, even in the dark. In technology, this concept is used to create more immersive and intuitive experiences. 15 | 16 | 2. 🎧 **HoloLens utilizes spatial audio to enhance mixed reality experiences**: Spatial audio creates a more realistic soundscape by simulating sounds from various directions and distances, increasing immersion and aiding user orientation within the virtual environment. 17 | 18 | 3. 👓 **The field of view in HoloLens devices is directly in front of the viewer**: Due to this limitation, only objects directly in front of the user are visible, making peripheral vision less informative in these mixed reality environments. 19 | 20 | 4. 🔊 **Speakers on HoloLens use HRTF audio to simulate sound from various distances and directions**: By employing Head-Related Transfer Function audio, HoloLens is able to create an audio environment that realistically represents how we would perceive sound in the real world, augmenting the immersion in mixed reality. 21 | 22 | 5.
👂 **Our ears help us determine the distance and direction of a sound**: We naturally process small differences in the time and intensity of sounds reaching each ear, allowing us to locate their sources in the space around us. 23 | 24 | 6. 🎼 **HRTF audio simulates this experience through timing differences and spectral changes**: HRTF audio uses mathematical models to recreate these auditory cues, simulating the intricacies of how sound waves interact with our heads and ears. 25 | 26 | 7. 🔄 **HoloLens speakers mimic this experience by simulating sound arrival at different times**: By carefully controlling when sounds reach the user's ears, the system can create the impression of sound sources located at different positions in the space around the user. 27 | 28 | 8. 💻 **HoloLens spatial audio offers extensive functionality for developers**: It allows the placement and movement of sound sources in 3D space, enabling developers to create highly immersive and interactive audio environments. 29 | 30 | 9. 🎛️ **Developers can manipulate the source of the audio on a frame-by-frame basis**: This feature allows developers to precisely control where a sound appears to originate from, creating a highly dynamic and interactive audio experience. 31 | 32 | 10. 🎚️ **Play, stop, pause, resume, loop, and fire-and-forget sound assets can be controlled by developers**: These functions provide developers with granular control over audio playback, allowing them to finely tune the audio experience to match visual or interactive elements in the mixed reality environment. 33 | 34 | ## Keywords 35 | - **Spatial audio**: A technique for creating a three-dimensional sound field, giving the illusion of sound coming from various directions and distances. 36 | - **HoloLens**: A pair of mixed reality smart glasses developed and manufactured by Microsoft. 37 | - **Field of view**: The observable area that can be seen at a particular moment. 38 | - **HRTF (Head-Related Transfer Function) audio**: A response that characterizes how an ear receives a sound from a point in space, used in sound synthesis and spatial audio to simulate directionality. 39 | - **Spectral changes**: Modifications in the frequency content of a signal. 40 | - **Pinna**: The visible part of the ear that resides outside the head. 41 | - **Frame-by-frame basis**: In this context, it refers to the ability to change audio sources for each frame in a sequence, allowing for very detailed control over sound localization. 42 | - **Fire-and-forget sound assets**: This refers to playing a sound asset and not needing to control it further once it's started. 43 | -------------------------------------------------------------------------------- /docs/Qualcomm/Hand Tracking Best practices.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Hand Tracking Best practices 4 | parent: Qualcomm 5 | --- 6 | 7 | # Hand Tracking Best practices 8 | Original article: [Hand Tracking Best practices](https://docs.spaces.qualcomm.com/unity/handtracking/design/BestPractices.html) 9 | 10 | ## TL;DR 11 | Extended Reality (XR) design emphasizes intuitive hand gestures and interactions. Usability and accessibility are key to comfortable virtual experiences. Technological constraints must be acknowledged for robust and engaging XR applications. 12 | 13 | ## Bullet points 14 | 1. **🤲 Virtual Interactions**: Interactions with virtual content are made possible through Hand Tracking technology. 
This includes hand gestures, specific poses to trigger actions, and physical interactions with virtual objects. Distal interactions allow actions at a distance, while proximal interactions require closer physical engagement. 15 | 16 | 2. **✋ Hand Gestures**: Hand gestures need to be taught gradually in a tutorial, rather than all at once. It's essential to limit the number of gestures and ensure they are distinct, consistent throughout the application, and always accessible via tutorials. This approach ensures intuitive and user-friendly experiences. 17 | 18 | 3. **🖥️ Usability**: To create an engaging and immersive user experience, the design must prioritize spatial environments, interactions, object placement, and user journeys. The ergonomic considerations need to be in sync with user interactions and the overall journey to make the virtual environment feel natural. 19 | 20 | 4. **🚶 User Posture**: Promoting neutral body positions and natural arm placements helps avoid fatigue. Designing interactions that avoid large, strenuous, and repetitive movements aids in maintaining user comfort and enhancing the overall experience. 21 | 22 | 5. **🎗️ Accessibility**: Accessibility in XR is about creating a user interface that everyone can interact with. This includes designing cues that guide users through a full 360° and 3D space, enabling actions at a natural arm's reach, and considering font, color, and interaction methods that are inclusive. 23 | 24 | 6. **🗺️ User Journey**: Guiding users with visual or audio cues, and providing clear feedback helps in building an intuitive experience. Ensuring actions are reversible and offering clear exits from unwanted states allow users to feel in control, enhancing their engagement. 25 | 26 | 7. **🧠 Context**: Contextual design helps in streamlining content delivery and relevant interactions. By tracking user focus and adapting available elements, the visual clutter is minimized. Context makes the experience more intuitive, as seen in the example of a drawing demo, where the menu changes based on user selections. 27 | 28 | 8. **📦 Object Placement**: Following the Gestalt Principles, similar objects are grouped for easier discovery. Objects are placed within reach and sight, appropriately sized, and positioned not to obstruct the user's vision. These considerations improve navigation and interaction within the virtual space. 29 | 30 | 9. **📐 Field of View (FOV)**: FOV is vital in AR/VR. Understanding how Camera FOV (captured by the device's cameras) and Display FOV (what the user actually sees) work ensures that the design aligns with the device's limitations, providing an optimal viewing experience. 31 | 32 | 10. **🚫 Limitations**: Limitations in XR design include occlusion, where one hand may block the other, impacting Hand Tracking. This can be minimized by avoiding overlapping hand gestures and using one-handed patterns. Outdoor environment limitations due to hardware restrictions must also be considered to ensure the application behaves as expected. 33 | 34 | ## Keywords 35 | - **Distal Interaction**: Interaction with objects from a distance. 36 | - **Proximal Interaction**: Physical interaction with virtual objects at close range. 37 | - **Ergonomics**: The study of designing equipment and devices that fit the human body and its movements. 38 | - **Gestalt Principles**: Psychological principles that describe how humans naturally perceive objects as organized patterns and objects. 
39 | - **Field of View (FOV)**: The extent of the observable environment seen at any given moment. In AR/VR, it includes Camera FOV (captured by cameras) and Display FOV (what the user sees). 40 | - **Occlusion**: In 3D space, when one object blocks another object from view, often impacting hand tracking in XR experiences. -------------------------------------------------------------------------------- /docs/Apple/Explore immersive sound design.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Explore immersive sound design 4 | parent: Apple 5 | --- 6 | 7 | # Explore immersive sound design 8 | Original video: [Explore immersive sound design](https://developer.apple.com/videos/play/wwdc2023/10271/) 9 | 10 | ## TL;DR 11 | The video emphasizes the significant role of spatial audio in UI design, which mimics real-world sound perceptions to enhance user experiences. It explores the use of spatial audio sources, UI interaction sounds, and techniques like repetition and randomization in sound design. It also discusses creating environmental soundscapes, sound recording and placement, and the use of immersive sound in onboarding experiences, highlighting the need for audio to synchronize with visual aesthetics and interaction dynamics. 12 | 13 | ## Bullet Points 14 | - 🧭 **Spatial Audio Understanding**: We all have an inherent understanding of spatial audio, such as locating a sound source based on its direction and volume. This knowledge is fundamental to our interaction with technologies utilizing spatial audio. 15 | - 🔈 **Spatial Audio System**: Spatial audio systems mimic the behavior of sound in real-world environments, such as the reverberation and absorption of sound by surfaces, creating a realistic auditory experience. 16 | - 🎛️ **UI Sound Design**: UI sounds are crucial for providing user feedback. Subtle interaction sounds help create familiarity and confidence in the user experience. 17 | - 🔂 **Sound Randomization**: To avoid monotonous repetition, sound elements like pitch and amplitude can be randomized. This strategy can help make frequent interaction sounds feel more natural. 18 | - 🌐 **Creating Immersive Experiences**: Immersive audio experiences can be crafted by carefully curating and positioning sounds in the virtual environment. Ambient sounds, spatial audio sources, and randomized sounds are used to form a complex, realistic soundscape. 19 | - 🐸 **Spatial Placement & Randomization**: Sounds are strategically placed in the virtual environment and are randomized to prevent predictability and enhance realism. 20 | - 🎙️ **Recording Process**: For authentic and high-quality soundscapes, professional microphones and sound libraries can be utilized. These resources provide a range of sounds that can be used to build the soundscape. 21 | - 🗺️ **Designing for Different Experiences**: The same principles of spatial audio design can be applied to different experiences within an app, whether it's a full immersive environment or a specific interaction moment. 22 | - 📸 **App-Specific Sound Design**: The soundscape of an app should complement its visuals and enhance the overall user experience. This includes using sounds that are relevant and fitting to the app's theme or aesthetic. 23 | - 👂 **Role of Sound in UX**: Sound plays a significant role in the user experience. It makes the UI feel responsive, confirms user actions, and helps create an engaging and captivating immersive experience. 
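The spatial-placement and randomization points above map directly onto most audio engines. As a neutral illustration, here is a TypeScript sketch using the Web Audio API (a stand-in for whatever engine an app actually uses) that plays a one-shot at a 3D position with slight pitch and level variation so that repeats don't sound canned:

```typescript
// Sketch: a spatialized one-shot with light randomization, via Web Audio.
// A stand-in for any spatial audio engine; positions are in listener space.
function playSpatialOneShot(
  ctx: AudioContext,
  buffer: AudioBuffer,
  pos: { x: number; y: number; z: number },
): void {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  // Randomize pitch by up to +/- 2 semitones so repetition feels natural.
  source.playbackRate.value = 2 ** ((Math.random() * 4 - 2) / 12);
  // Randomize level slightly as well.
  const gain = new GainNode(ctx, { gain: 0.8 + Math.random() * 0.2 });
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF", // binaural rendering, approximating real hearing
    distanceModel: "inverse", // natural level falloff with distance
    positionX: pos.x,
    positionY: pos.y,
    positionZ: pos.z,
  });
  source.connect(gain).connect(panner).connect(ctx.destination);
  source.start();
}
```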
24 | 25 | ## Keywords 26 | - **Spatial Audio**: A technology that makes sounds appear as if they are coming from a specific location in a 3D space, creating a more immersive listening experience. 27 | - **UI (User Interface)**: The means through which a user interacts with software or hardware. In the context of this transcript, it refers to the audible and visual elements of a system or app that users interact with. 28 | - **Reverberation**: The persistence of sound in a particular space after the original sound is removed. It is an element of spatial audio that is replicated by the system to provide a realistic sound experience. 29 | - **Sound Randomization**: The process of making slight alterations to repeated sounds to make them feel more natural. This can include changes to pitch and amplitude. 30 | - **Soundscapes**: The characteristic sounds of a particular area or environment. In this context, it refers to the overall sound environment created in an app. 31 | - **Ambient Background Audio**: General background noise or sounds that fill the auditory space, providing a sense of realism to the environment. 32 | - **Spatial Audio Sources**: Individual sounds that exist at specific locations within the 3D space of a spatial audio environment. 33 | - **Microphones**: Devices used to capture sound. The context here refers to professional microphones used for high-quality sound recording. 34 | - **UX (User Experience)**: The overall experience a user has while using a product or service, especially in terms of how easy or pleasing it is to use. 35 | -------------------------------------------------------------------------------- /docs/Apple/Augmented Reality.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Augmented reality 4 | parent: Apple 5 | --- 6 | 7 | # Augmented reality 8 | Original article: [Augmented reality](https://developer.apple.com/design/human-interface-guidelines/augmented-reality) 9 | 10 | ## TL;DR 11 | In Augmented Reality (AR) experiences, prioritize clear, non-technical instructions and favor 3D hints over 2D text overlays. Handle interruptions, like app switches or calls, with user guidance to maintain AR tracking. Consistently use unaltered AR icons and badges to indicate AR functionalities. 12 | 13 | ## Bullet points 14 | 15 | 1. 📱 **Approachable Terminology**: 16 | - Instead of technical jargon like "ARKit" or "tracking," use friendlier alternatives. For example, say "Unable to find a surface" instead of "Unable to find a plane." This ensures users who might be unfamiliar with AR concepts still understand instructions and prompts. 17 | 18 | 2. 🌍 **World Mapping**: 19 | - ARKit utilizes world mapping to overlay virtual objects accurately. It understands the user's environment in detail, allowing for a more immersive AR experience. This is why suggestions like "moving your phone more slowly" or "turning on more lights" might improve the AR experience by assisting with world detection. 20 | 21 | 3. 🔄 **Relocalization**: 22 | - When interruptions occur, like phone calls, AR might misplace virtual objects. Relocalization is the process where ARKit tries to reposition these objects correctly. Providing a reset option or guidance on returning the device to its original position can help in such scenarios (a minimal code sketch of the system coaching view appears at the end of this page). 23 | 24 | 4. 🌀 **3D Hints vs. 2D**: 25 | - Visual feedback is essential in AR.
When hinting at possible actions, such as rotating an object, 3D visual hints (like a rotation indicator) are more intuitive than 2D text prompts. It aligns better with the immersive nature of AR. 26 | 27 | 5. 🔡 **Text Readability**: 28 | - When text is indispensable in AR, ensure it is large enough to read and always faces the viewer, even in 3D space. It's about guaranteeing that the user gets the message, no matter their orientation or the text's position. 29 | 30 | 6. 🔍 **More Information**: 31 | - Sometimes, users might need additional information about AR objects. Using visual indicators, like labels with a ">" sign, shows users that they can tap to learn more, enhancing the interactive nature of AR. 32 | 33 | 7. 🛑 **Handling Interruptions**: 34 | - If the user switches apps or accepts a call, ARKit loses tracking. On returning, it's crucial to guide the user to either relocalize or consider hiding misplaced virtual objects to avoid confusion or visual glitches. 35 | 36 | 8. 🔄 **Minimizing Interruptions**: 37 | - For an uninterrupted AR experience, blend non-AR elements within AR. For instance, changing object properties, like furniture upholstery, shouldn't require exiting the AR mode. It promotes sustained engagement. 38 | 39 | 9. 🔖 **Icons and Badges**: 40 | - ARKit provides specific icons for AR experiences. They are meant for specific uses, such as initiating an AR experience. Use these icons uniformly and don't alter their appearance. These consistent visuals help users identify AR-specific interactions quickly. 41 | 42 | 10. 💡 **Problem Solving**: 43 | - AR can sometimes face challenges like insufficient lighting or problems in detecting surfaces. Address these proactively by suggesting fixes. This not only educates the user about potential issues but also empowers them to rectify and enjoy a seamless AR experience. 44 | 45 | ## Keywords 46 | - **ARKit**: Apple's framework for creating augmented reality experiences on iOS devices. 47 | 48 | - **World Detection**: The ability of AR systems to identify and track the user's environment in real-time, allowing virtual objects to be placed with accuracy. 49 | 50 | - **Tracking**: The process by which AR systems follow the movement and orientation of the user's device to align virtual objects with the real world. 51 | 52 | - **Relocalization**: An ARKit feature that attempts to realign virtual objects to their original real-world positions after an interruption. 53 | 54 | - **3D Hints**: Visual indicators in three-dimensional space used to guide users in AR experiences. 55 | 56 | - **System-provided coaching view**: A built-in view provided by ARKit to help users with tasks such as relocalization. 57 | 58 | - **Glyph**: A specific icon or symbol used to represent a function or feature, such as the AR icon in ARKit. 59 | 60 | - **Badging**: Using specific icons or markers to indicate certain capabilities or features, like viewing an object in AR. 
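As a concrete illustration of the relocalization guidance and the system-provided coaching view mentioned above, here is a minimal Swift sketch using ARKit's ARCoachingOverlayView. The helper function is hypothetical scaffolding; a real app would also pick an appropriate goal and respond to the overlay's delegate callbacks.

```swift
import ARKit

// Minimal sketch: attach the system coaching overlay so ARKit itself guides
// people through surface finding and through relocalization after an
// interruption, using friendly, non-technical prompts.
func addCoachingOverlay(to arView: ARSCNView) {
    let overlay = ARCoachingOverlayView()
    overlay.session = arView.session        // drive the overlay from AR session state
    overlay.goal = .horizontalPlane         // i.e., "find a surface"
    overlay.activatesAutomatically = true   // reappear on tracking loss or relocalization
    overlay.frame = arView.bounds
    overlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(overlay)
}
```

Because the overlay activates automatically, it doubles as the guidance called for under Handling Interruptions: when tracking degrades, the system view appears, and it dismisses itself once relocalization succeeds.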
61 | -------------------------------------------------------------------------------- /docs/Road to VR/7 Lessons ‘Sea of Thieves’ Can Teach Us About Great VR Game Design.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: 7 Lessons ‘Sea of Thieves’ Can Teach Us About Great VR Game Design 4 | parent: Road to VR 5 | --- 6 | 7 | # 7 Lessons ‘Sea of Thieves’ Can Teach Us About Great VR Game Design 8 | Original article: [7 Lessons ‘Sea of Thieves’ Can Teach Us About Great VR Game Design](https://www.roadtovr.com/7-lessons-sea-of-thieves-can-teach-us-about-great-vr-game-design/) 9 | 10 | ## TL;DR 11 | Sea of Thieves offers an immersive experience by using minimal UI, natural interactions, and a shared game world, allowing for engaging cooperation and emergent fun. The game's design promotes flexibility in pacing, catering to different play styles and moods. Its innovative principles provide valuable insights for VR developers in crafting engaging virtual environments. 12 | 13 | ## Bullet points 14 | 1. 🎮 **Minimal UI & Natural Interactions**: Sea of Thieves has a minimalistic UI, relying on natural interactions and in-game tools like compasses, clocks, and maps instead of abstract menus. This offers immersive gameplay by providing essential information through spatial organization and real-world objects. It makes the world feel more real and engages the player's natural spatial understanding. 15 | 16 | 2. 🌍 **Shared Game World**: The game's shared world model enhances immersion by seamlessly blending single player and multiplayer experiences. Players inhabit the same world, leading to random encounters with real players, creating a world that feels more like a place rather than a simulation. This also accommodates a wide range of concurrent players and encourages unique interactions, whether friendly or hostile. 17 | 18 | 3. 🤲 **'Physical' Objects Make for Emergent Fun**: Many objects in Sea of Thieves are treated as 'real' or 'physical', allowing for shared and dynamic interactions. This creates emergent fun and cooperative play, as objects like treasure chests are not owned by any particular player and can be carried, stolen, or lost. This real, physics-based interaction with items adds depth to gameplay and opportunities for unique experiences. 19 | 20 | 4. ⛵ **Strong Moment to Moment Gameplay, Even When 'Nothing' is Happening**: The game’s interactive sailing mechanics make even basic movement engaging. With actions like raising the anchor, adjusting the sails, or navigating, there's an inherent challenge and skill involved in simply moving around. This engagement, even in quiet moments, adds to the overall enjoyment and complexity of the game. 21 | 22 | 5. 🤝 **Cooperation that Goes Beyond 'Doing the Same Thing at the Same Time'**: Cooperation in Sea of Thieves goes beyond simple team play and requires players to perform different, intermingling roles to achieve common objectives. Whether it’s steering, managing sails, or manning cannons, players must work together in diverse ways, enhancing immersion and creating a rich cooperative experience. 23 | 24 | 6. ⏳ **Wide Ranging (Optional) Pacing**: The game offers players the ability to choose their own pace and style of play. Whether seeking action, relaxation, adventure, or just goofing around, Sea of Thieves operates like a sandbox, letting players pursue their desired gameplay. 
This flexibility caters to different moods and preferences, making the game more accommodating and enjoyable. 25 | 26 | 7. 📈 **Lessons for VR Developers**: Though not specifically built for VR, Sea of Thieves offers valuable insights for VR developers. Its design decisions, such as minimal UI, shared world model, physical objects, and diverse cooperation, can be applied to create immersive VR experiences. By studying these principles, VR developers can learn how to craft engaging and innovative virtual environments. 27 | 28 | ## Keywords 29 | - **Minimal UI (User Interface)**: A user interface with only essential elements for functionality, making it simple and uncluttered. 30 | 31 | - **Natural Interactions**: Interactions within a digital environment that feel intuitive, mirroring real-world behavior. 32 | 33 | - **Shared Game World**: A game environment where all participants inhabit the same space for shared experiences. 34 | 35 | - **Emergent**: Complex gameplay patterns arising from interactions of relatively simple rules and mechanics. 36 | 37 | - **Moment to Moment Gameplay**: Ongoing experience where every action in the game is engaging or meaningful to the player. 38 | 39 | - **Cooperation**: Working together towards common goals in a game, often requiring communication and coordination. 40 | 41 | - **Pacing**: Refers to how fast or slow events happen in a game, influencing the tension and excitement. 42 | -------------------------------------------------------------------------------- /docs/Microsoft/Comfort.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Comfort 4 | parent: Microsoft 5 | --- 6 | 7 | # Comfort 8 | Original article: [Comfort](https://learn.microsoft.com/en-us/windows/mixed-reality/design/comfort) 9 | 10 | ## TL;DR 11 | 12 | In mixed reality, monocular and binocular cues are vital for interpreting 3D space and depth. Devices like HoloLens require careful content placement and user movement considerations to avoid discomfort. Key terms, such as IPD and FPS, are essential for device calibration and optimal performance. 13 | 14 | ## Bullet points 15 | 1. 👁️ **Monocular Cues**: These are visual cues that only require one eye to interpret 3D shapes and relative positions of objects. Examples include linear perspective, which is the way parallel lines seem to converge as they recede into the distance, and occlusion, where closer objects block those that are farther away. These cues give our brains information about depth and distance even if we're only using one eye. 16 | 17 | 2. 👀 **Binocular Cues**: These cues require both eyes to understand depth and relative positions of objects. For instance, binocular disparity refers to the slight difference between the two visual images (one from each eye) that allows for depth perception. Vergence is the simultaneous movement of both eyes to maintain binocular vision. 18 | 19 | 3. 🎮 **Vergence-Accommodation Conflict**: In natural viewing, our eyes' focus (accommodation) and their rotation (vergence) are linked. However, in many head-mounted displays, this link is broken, leading to visual discomfort or fatigue. This conflict arises when our eyes try to focus on a virtual object at one distance while converging at another. 20 | 21 | 4. 🕶️ **HoloLens Displays**: The HoloLens is a mixed reality device with its display fixed at an optical distance of 2.0m. This means users must always focus at this distance for a clear image. 
To reduce visual discomfort, it's recommended to place most content near this distance. 22 | 23 | 5. 📏 **Optimal Hologram Distance**: For the best user experience on HoloLens, holograms should be placed between 1.25m and 5m from the user. Placing holograms too close (under 40cm) can cause discomfort due to the vergence-accommodation conflict. 24 | 25 | 6. 🚶 **User Locomotion**: This refers to how users move within a virtual environment. If the virtual movement doesn't match the user's physical movement, it can lead to motion sickness. It's crucial to design experiences that align virtual and physical movements as closely as possible. 26 | 27 | 7. 📊 **HUDs (Heads-Up Displays)**: In video games, HUDs provide vital information directly on the screen. In mixed reality, a HUD that's fixed to the user's head orientation can cause discomfort. Instead, it's better to have HUDs that move with the body, reorienting only after significant head rotation. 28 | 29 | 8. 📜 **Text Legibility**: Text in mixed reality should be easily readable. This involves considering the display's properties, such as pixel density and brightness, as well as the text's properties, like font size and weight. Proper legibility ensures users don't strain their eyes. 30 | 31 | 9. 🖼️ **Holographic Frame**: This refers to the user's field of view in a mixed reality experience. Designers should consider how much users have to move their heads to interact with content. Excessive or unnatural movements can lead to discomfort. 32 | 33 | 10. 💪 **Arm Positions**: Continuous hand gestures or repeated air taps can lead to muscle fatigue. Mixed reality experiences should be designed to minimize these repetitive actions. For instance, voice commands can be integrated to reduce the need for hand gestures. 34 | 35 | ## Keywords 36 | 37 | - **Monocular Cues**: Visual cues that only require one eye to interpret 3D shapes and relative positions of objects. 38 | 39 | - **Binocular Cues**: Visual cues that require both eyes to understand depth and relative positions of objects. 40 | 41 | - **Vergence**: The relative rotations of the eyes required to focus on an object. 42 | 43 | - **Accommodation**: The adjustment of the eyes' focus to the distance of an object. 44 | 45 | - **Vergence-Accommodation Conflict**: A phenomenon where the eyes' focus and rotation don't match, leading to visual discomfort. 46 | 47 | - **HUD (Heads-Up Display)**: A transparent display that presents data without requiring the user to look away from their usual viewpoint. 48 | 49 | - **Locomotion**: The ability to move from one place to another. 50 | 51 | - **IPD (Interpupillary Distance)**: The distance between the pupils of an individual's eyes. 52 | 53 | - **VO (Vertical Offset)**: The potential vertical offset of digital content shown to each eye relative to the horizontal axis of the viewer's eyes. 54 | 55 | - **FPS (Frames Per Second)**: A measure of how many individual frames are displayed in one second of video or animation. 56 | -------------------------------------------------------------------------------- /docs/Meta/VR Locomotion Design Guide.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: VR Locomotion Design Guide 4 | parent: Meta 5 | --- 6 | 7 | # VR Locomotion Design Guide 8 | Original article: [VR Locomotion Design Guide](https://developer.oculus.com/resources/bp-locomotion/) 9 | 10 | ## TL;DR 11 | In VR, user comfort is paramount. 
Different platforms offer varying mobility and fidelity. Achieving motion in VR can be physical or artificial, with potential for disorientation. Vection can cause discomfort when our eyes disagree with our balance sensors. Developers need to account for space limits, fatigue, accessibility, and predictability in movement to optimize user experience. 12 | 13 | ## Bullet points 14 | 1. 🖥 **VR Types** 15 | VR devices can be grouped mainly into: 16 | - **PC-Tethered VR**: These are connected to computers and offer high-quality, graphic-intensive experiences, providing the most immersive experience. 17 | - **Standalone VR**: These are wireless devices offering flexibility and mobility, though they may trade off some graphical quality. 18 | 19 | 2. 🌍 **World-Scale Tracking** 20 | Using cutting-edge technology, modern VR devices capture a user's movement across a room. This real-time tracking of the user's full body movement accentuates the realism and immersiveness of the VR experience. 21 | 22 | 3. 🔄 **Locomotion in VR** 23 | Movement in VR can be approached in multiple ways: 24 | - **Physical Locomotion**: Users move in real space, translated to VR. 25 | - **Artificial Locomotion**: Users remain stationary, but use controllers to move. 26 | - **Teleportation**: Users instantly move from one point to another. 27 | - **World Pulling**: Users "pull" or "drag" the virtual world around them. 28 | 29 | 4. ⚖ **Comfort & Usability** 30 | To keep the VR experience engaging, it's vital to minimize sensory mismatches. The more the VR experience aligns with our real-world expectations, the more comfortable and intuitive it becomes. 31 | 32 | 5. 🧠 **Vection & Disorientation** 33 | - **Vection**: The illusion of self-movement based on visual cues. When visuals suggest movement, but the body remains still, this can lead to discomfort. 34 | - **Disorientation**: This occurs when there's a sudden change in camera perspective or location, making users lose their spatial bearings in the virtual world. 35 | 36 | 6. 🚶 **Physical Limitations & Fatigue** 37 | Continuous physical activity in VR, like moving or turning, can lead to exhaustion. Designers must balance physical interactivity with user comfort, taking into account varying user stamina and space limitations. 38 | 39 | 7. ♿ **Accessibility** 40 | A well-designed VR experience is inclusive. It caters to everyone, regardless of their physical capabilities. This means ensuring that VR is usable for those seated, those with limited mobility, and those with other physical constraints. 41 | 42 | 8. 🧘 **Proprioception & Visual-Vestibular Mismatches** 43 | - **Proprioception**: The innate awareness of body position and movement. In VR, when our virtual representation doesn't align with our actual physical body, it can feel off. 44 | - **Visual-Vestibular Mismatch**: A prime cause of motion sickness in VR. When visuals indicate movement but the body doesn't feel it (or vice versa), it can result in discomfort. 45 | 46 | 9. 📖 **Consistency & Predictability** 47 | In VR, the unexpected can be jarring. Ensuring that movement patterns, controls, and reactions are consistent helps users predict actions in VR. This makes the environment feel more controlled and thus more comfortable. 48 | 49 | ## Keywords 50 | - **PC-Tethered VR**: VR devices connected to computers, generally offering high-quality, graphic-intensive experiences.
51 | 52 | - **Standalone VR**: Wireless VR devices offering flexibility and mobility without the need for external computers or wires. 53 | 54 | - **World-Scale Tracking**: Technology in VR devices that captures a user's movement across a room in real-time. 55 | 56 | - **Physical Locomotion**: Movement in VR where users physically move in real space, and this movement gets translated to the virtual environment. 57 | 58 | - **Artificial Locomotion**: Movement in VR where users remain stationary but use controllers or other input methods to simulate movement. 59 | 60 | - **Teleportation**: A type of movement in VR where users instantly move from one point to another within the virtual environment. 61 | 62 | - **Vection**: The illusory perception of self-motion driven by visual input that suggests movement, leading to a feeling of moving even when stationary. 63 | 64 | - **Visual-Vestibular Mismatch**: Discrepancy between what the eyes perceive (vision) and what the inner ear senses (vestibular sense) regarding motion. 65 | 66 | - **Proprioception**: The perception or awareness of body position and movement based on sensory information from muscles, joints, and skin. 67 | -------------------------------------------------------------------------------- /docs/Road to VR/Inside VR Design - The Interface of ‘Electronauts’.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Inside VR Design - The Interface of ‘Electronauts’ 4 | parent: Road to VR 5 | --- 6 | 7 | # Inside VR Design: The Interface of ‘Electronauts’ 8 | Original article: [Inside VR Design: The Interface of ‘Electronauts’](https://www.roadtovr.com/inside-vr-design-interface-of-electronauts/) 9 | 10 | ## TL;DR 11 | Electronauts, a music-making VR game, utilizes an innovative interface design emphasizing ease-of-use through drumstick controls, smart hierarchy, and user flexibility. The game's design allows for precise control and interaction, replicating physical feedback, and accommodating both left and right-handed users. These principles, though crafted for a music game, showcase insights that could be applied broadly in other VR experiences. 12 | 13 | ## Bullet points 14 | 1. **🥁 Ease-of-use & Precision with Physical Feedback:** 15 | - **Intuitive Interaction:** By using drumsticks as both musical instruments and interface tools, Electronauts leverages human adeptness at using tools, adding an intuitive layer to control. 16 | - **Physicality in VR:** The process of inserting the drumstick into a button and pulling a trigger mimics real-world interaction, enhancing the precision and physical feedback, a unique solution in VR interface design. 17 | 18 | 2. **📦 Hierarchy & Organization:** 19 | - **Logical Structure:** Functions are organized within cubes, acting like mini-apps. This hierarchy ensures an intuitive and logical organization without overwhelming depth. 20 | - **Ease of Access:** The cube-and-pedestal system makes functions easily accessible and memorable, ensuring users don’t spend unnecessary time navigating through complex menus. 21 | 22 | 3. **🔄 Flexibility & Workspace Adaptability:** 23 | - **Customization:** The Electronauts interface allows users to rearrange tools according to their needs and preferences, offering a personalized experience. 24 | - **Ambidextrous Design:** Accommodating both left and right-handed players, the design enhances usability for a broader audience.
25 | - **Task-Specific Adaptation:** Like rearranging windows on a computer, users can adapt the workspace quickly to suit their current musical task, enhancing efficiency. 26 | 27 | 4. **🎵 Smooth Control in a Musical Context:** 28 | - **Seamless Interaction:** Despite the intricacies of controlling various musical elements, the design ensures smooth and responsive control. 29 | - **Rhythm Alignment:** Players are able to interact effortlessly even when pressed by the beat of a song, demonstrating the interface's alignment with the rhythmic context of the game. 30 | 31 | 5. **🌐 Universal Applicability of Design Principles:** 32 | - **Beyond Music:** The core principles found in Electronauts are not limited to music games. The ease-of-use, hierarchy, and flexibility can inspire other VR experiences. 33 | - **Broad Relevance:** These principles show that good interface design in one domain can translate into insights that have broader relevance across different applications and contexts. 34 | 35 | 6. **✋ Hand-held Implement as an Interface Tool:** 36 | - **Physical Extension:** The use of drumsticks or other hand-held implements emphasizes physical extension and control, enhancing the user's connection with the virtual environment. 37 | - **Potential Application:** This concept opens doors for exploration in various VR applications, reinforcing the importance of tangible and precise interaction. 38 | 39 | ## Keywords 40 | - **Electronauts:** A music-making game designed to allow users to feel like DJs. It's designed with an intuitive VR interface. 41 | - **Hierarchy:** Refers to the organization of functions and tools within the interface, arranged in a logical and structured manner. 42 | - **Proprioceptively:** A complex word referring to the sense of the relative position of one's own parts of the body and the strength of effort being employed in movement. In the context, it refers to how tools (like drumsticks in Electronauts) can feel like an extension of oneself with enough practice. 43 | - **Flexibility:** Refers to the ability of the Electronauts interface to be adapted and customized according to the user's needs. 44 | - **Ambidextrous Design:** A design that can be used by both left-handed and right-handed individuals, referring to the customizable aspect of the Electronauts interface. 45 | - **Physical Feedback:** A term referring to the tangible responses or sensations a user feels when interacting with an object, replicated in VR in Electronauts through the use of drumsticks and buttons. 46 | - **Precision:** In this context, it refers to the accurate control and interaction within the Electronauts interface, particularly in interacting with buttons. 47 | - **Rhythm Alignment:** Refers to how the interface in Electronauts allows users to interact smoothly in alignment with the beat or tempo of the music. 48 | -------------------------------------------------------------------------------- /docs/Microsoft/Eye-gaze-based interaction.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Eye-gaze-based interaction 4 | parent: Microsoft 5 | --- 6 | 7 | # Eye-gaze-based interaction 8 | Original article: [Eye-gaze-based interaction](https://learn.microsoft.com/en-us/windows/mixed-reality/design/eye-gaze-interaction) 9 | 10 | ## TL;DR 11 | 12 | HoloLens 2 integrates eye tracking as a new user input, enhancing the immersive experience. 
While eye-gaze offers rapid interactions, it comes with challenges like unintentional actions. Design recommendations include differentiating eye-gaze from head-gaze and combining it with other inputs for a seamless experience. 13 | 14 | ## Bullet points 15 | 1. 🕶️ **HoloLens 2 Eye Tracking**: HoloLens 2 introduces a revolutionary feature: eye tracking. This technology allows the device to understand where a user is looking, creating a more intuitive and immersive user experience. Before users can fully utilize this feature, they need to undergo a calibration process, ensuring the device accurately captures their unique eye movements. 16 | 17 | 2. 🎮 **Holographic Shell Experience**: The Holographic Shell is the main user interface users encounter when starting up their HoloLens 2. In this interface, eye-gaze input is subtly integrated, enhancing the overall experience by adding a layer of intuitiveness without being the primary mode of interaction. 18 | 19 | 3. 🌟 **Benefits of Eye-gaze Input**: The human eye's speed makes it an excellent tool for high-speed pointing. Combined with the minimal physical effort required to move one's eyes, this offers a seamless interaction method. The system can predict user intentions due to the implicit nature of eye movements. Furthermore, eye-gaze complements other input methods, capitalizing on years of hand-eye coordination experience. Lastly, the system can determine user attention, which has vast implications for UI design and remote communication. 20 | 21 | 4. 🚫 **Challenges of Eye-gaze Input**: Eye movements come with challenges when used as an input method: the risk of unintentional actions (looking at something is not the same as wanting to act on it), the tension between observing and controlling, misalignment with slower manual inputs, discomfort when targeting small objects, and movements so rapid that they are difficult to track accurately. Changing light conditions can also affect tracking reliability. 22 | 23 | 5. 📌 **Differentiating Eye-gaze and Head-gaze**: Designers need to understand the distinction between where a user is looking (eye-gaze) and where they are pointing their head (head-gaze). Tasks requiring smooth trajectories, like drawing, are better suited for hand or head pointing. 24 | 25 | 6. 🚀 **Eye-gaze's Power**: Eye-gaze stands out as a powerful input method because of its speed and the minimal effort it requires. When combined with other inputs like voice or hand gestures, it provides a rich contextual signal that can confirm a user's intent. 26 | 27 | 7. 🎯 **Target Size and Comfort**: For optimal user experience, designers should ensure that targets or interactive elements are of a comfortable size. A reference point is that these targets should be at least 2° in visual angle, which is roughly the size of your thumbnail when you stretch out your arm. 28 | 29 | 8. 🤝 **Combining Eye-gaze with Other Inputs**: It's beneficial to combine eye-gaze with other inputs like voice or hand gestures to prevent unintentional actions and to provide a richer interaction context. This combination allows for a more immersive and intuitive user experience. 30 | 31 | 9. 🎨 **Subtle Feedback for Eye Tracking**: Feedback for eye-gaze should be subtle to avoid overwhelming users. Slowly blending in and out, visual highlights, or other subtle target behaviors can indicate that the system correctly detected the user's gaze without interrupting their workflow. 32 | 33 | 10.
🚫 **Avoiding Specific Eye Movements**: Enforcing specific eye movements as inputs can be counterintuitive and should be avoided. Instead, designers should focus on natural and intuitive interactions that align with the user's expectations and behaviors. 34 | 35 | ## Keywords 36 | - **HoloLens 2**: Microsoft's mixed reality smart glasses that overlay holograms on the real world. 37 | 38 | - **Eye Tracking**: A technology that determines where a user is looking, used as an input method in HoloLens 2. 39 | 40 | - **Calibration**: The process of adjusting the eye tracking system to work accurately for an individual user. 41 | 42 | - **Holographic Shell**: The main user interface of the HoloLens 2. 43 | 44 | - **Implicitness**: The quality of being implied or understood without being directly expressed. 45 | 46 | - **Visual Attention**: The process by which a user focuses on specific visual information. 47 | 48 | - **Eye-gaze**: The direction in which a person's eyes are looking. 49 | 50 | - **Head-gaze**: The direction in which a person's head is pointing. 51 | 52 | - **DirectX**: A collection of APIs for handling tasks related to multimedia, especially game programming and video, on Microsoft platforms. 53 | -------------------------------------------------------------------------------- /docs/Ultraleap/Design principles.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Design principles 4 | parent: Ultraleap 5 | --- 6 | 7 | # Design principles 8 | Original article: [Design principles](https://docs.ultraleap.com/xr-guidelines/Getting%20started/design-principles.html) 9 | 10 | ## TL;DR 11 | Hand tracking in VR, championed by Ultraleap, prioritizes intuitive physical interactions. Poses (specific hand positions that trigger actions) require learning, so they should be used selectively. Feedback, comfort, and understanding device tracking limits are crucial, while avoiding hand occlusion ensures seamless interaction. 12 | 13 | ## Bullet points 14 | 1. 🖐 **Physical Interactions**: Design hand-tracking interactions that utilize our innate understanding and experience with manipulating real-world objects. This approach reduces the learning curve for new users since they can naturally engage with virtual environments as they would in reality. 15 | 16 | 2. 🌐 **Three-dimensional UI**: User interface elements should not be flat. Instead, they should be designed with depth and physicality to enhance interaction. Three-dimensional interfaces enable users to manipulate and understand virtual objects better, providing a more immersive experience. 17 | 18 | 3. 🤞 **Use of Poses**: A 'pose' refers to a specific hand position or gesture. While poses can trigger particular actions in VR, it's essential to use them judiciously. They should be reserved for significant or valuable functions to ensure they are worth the user's effort in learning. 19 | 20 | 4. 🎓 **Learned Interaction**: Not all hand interactions are intuitive. Some, like specific poses, need to be learned. It's crucial to provide resources, like tutorials, to guide users through these non-intuitive gestures, ensuring they understand and remember them. 21 | 22 | 5. 📣 **Feedback**: In the absence of tactile feedback in VR, visual and auditory cues become vital. By incorporating clear visual indicators (like color changes) and sounds, users gain more confidence in their interactions.
For an even more enhanced experience, integrating haptic feedback through Ultraleap's mid-air haptic devices can recreate a sense of touch. 23 | 24 | 6. 😌 **Comfort**: Users' comfort should always be paramount. Design with an emphasis on minimizing large or strenuous movements, preventing users from holding hand poses for too long, and ensuring they don't have to overextend or raise their arms frequently. Such considerations can prevent fatigue and potential discomfort. 25 | 26 | 7. 📏 **Interaction Zone**: Each hand-tracking device, such as the Leap Motion Controller or Stereo IR 170, has a specific field of view and interaction range. Being aware of these limits ensures interactions are designed within the effective tracking range, ensuring a seamless user experience. 27 | 28 | 8. 🚫 **Occlusion**: This refers to moments when the camera can't track a hand due to obstructions. While the Ultraleap system can handle partial occlusions, it's vital to design interactions that prevent the hands from being completely covered, ensuring continuous tracking. 29 | 30 | 9. 🖖 **Tracking Loss**: If a hand nears the limits of the interaction zone or the camera's field of view, tracking might be lost. Signal this potential loss by adjusting the visual representation of the hands (like fading them out) to inform the user; a minimal sketch of such a fade appears at the end of this page. 31 | 32 | 10. 🎥 **Device Awareness**: Different devices have varying specifications. Being familiar with the capabilities and limitations of devices like Ultraleap hand tracking tools ensures optimized application design, catering to the device's strengths. 33 | 34 | ## Keywords 35 | - **Hand Tracking**: Technology that captures the movements and gestures of the human hand, allowing for interactive experiences in virtual spaces. 36 | 37 | - **Ultraleap**: A company specializing in hand-tracking technology, providing tools and recommendations for effective hand interactions in VR. 38 | 39 | - **Physical Interactions**: Design principle emphasizing the use of tangible, real-world-like interactions within VR to minimize the learning curve. 40 | 41 | - **Pose**: A specific hand position or gesture in VR that can trigger particular actions. Considered a "learned interaction" as it might not be immediately intuitive. 42 | 43 | - **Learned Interaction**: Interactions that aren't immediately recognizable or intuitive and require some instruction or tutorial for users to understand. 44 | 45 | - **Haptic Feedback**: The use of touch sensations in devices, typically vibrations, to provide feedback to the user about their interactions. 46 | 47 | - **Interaction Zone**: The effective area within which the hand-tracking device can capture hand movements. 48 | 49 | - **Occlusion**: An instance where the hand-tracking camera can't see a part of the hand due to obstruction, causing potential tracking issues. 50 | 51 | - **Leap Motion Controller & Stereo IR 170**: Devices by Ultraleap for hand tracking, each having different fields of view and tracking ranges. 52 | 53 | - **Field of View (FOV)**: The observable area that a camera or viewer can see at any given moment.
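As referenced under point 9 above, here is a rough Swift sketch of a tracking-loss fade: hand opacity falls off as the tracked palm approaches the edge of the interaction zone, warning the user before tracking actually drops. The box-shaped zone and the 8 cm fade margin are simplifying assumptions for illustration; real Ultraleap interaction zones are device-specific and not box-shaped.

```swift
import simd

// Rough sketch: compute per-frame hand opacity from the palm's distance to
// the edge of an approximated, box-shaped tracked volume.
struct InteractionZone {
    var center: SIMD3<Float>        // meters, in tracking-device space
    var halfExtents: SIMD3<Float>   // half-size of the tracked volume
    var fadeMargin: Float = 0.08    // begin fading 8 cm inside the boundary (assumed)

    /// 1.0 = hand fully visible, 0.0 = fully faded at the zone boundary.
    func handOpacity(at palmPosition: SIMD3<Float>) -> Float {
        let offset = simd_abs(palmPosition - center)
        let slackPerAxis = halfExtents - offset          // remaining room on each axis
        let slack = min(slackPerAxis.x, min(slackPerAxis.y, slackPerAxis.z))
        return max(0, min(1, slack / fadeMargin))        // clamp to [0, 1]
    }
}
```

Each frame, the returned value would be multiplied into the rendered hand's alpha, so the hands fade out smoothly near the boundary instead of vanishing abruptly.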
-------------------------------------------------------------------------------- /docs/Apple/Create accessible spatial experiences.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Create accessible spatial experiences 4 | parent: Apple 5 | --- 6 | 7 | # Create accessible spatial experiences 8 | Original video: [Create accessible spatial experiences](https://developer.apple.com/videos/play/wwdc2023/10034/) 9 | 10 | 11 | ## TL;DR 12 | This video discusses accessibility in spatial computing, emphasizing inclusive design for people with disabilities. It covers VoiceOver support, gesture controls, RealityKit's accessibility component, vision accessibility considerations, camera anchors, alternate inputs like Dwell Control and Pointer Control, cognitive accessibility, hearing accessibility with captions, and the importance of creating inclusive spatial experiences. 13 | 14 | 15 | ## Bullet points 16 | 1. 💡 Accessibility is crucial in spatial computing to ensure that individuals with disabilities can fully engage with immersive experiences. It's important to consider the diverse needs of users, including those with visual, motor, cognitive, and hearing impairments. 17 | 18 | 2. 🗣️ VoiceOver support is available on this platform, allowing people who are blind or have low vision to interact with apps using finger pinches on their hands. VoiceOver provides spoken descriptions of on-screen elements, enabling users to navigate and interact with the app. 19 | 20 | 3. 🤝 VoiceOver's Direct Gesture Mode enables individuals to directly process hand input, providing a choice between default interaction mode and direct gestures. This allows users to interact with apps using hand gestures without VoiceOver interfering. 21 | 22 | 4. 👁️ Vision accessibility can be enhanced by adopting SwiftUI's accessibility modifiers and using the accessibility component in RealityKit. These tools enable developers to provide accurate accessibility information, such as labels and traits, to ensure that visually impaired users can understand and interact with app elements. 23 | 24 | 5. 📷 Camera anchors should be used sparingly, as individuals with low vision may need to get closer to content. Providing alternatives, such as world anchors instead of head anchors, allows users to view and interact with objects more effectively. 25 | 26 | 6. ⌨️ Dwell Control and Pointer Control offer alternate input methods for individuals with physical disabilities affecting their use of eyes and hands. Dwell Control allows users to select and interact with UI elements without using their hands, while Pointer Control changes the system focus based on head or finger movements. 27 | 28 | 7. 🧠 Cognitive accessibility can be improved by using Guided Access to minimize distractions and provide a focused experience. Guided Access restricts the system to a single app, preventing users from switching between apps and maintaining focus on the current task. 29 | 30 | 8. 🔊 Captions play a vital role in hearing accessibility, and they should be comprehensive, customizable, and easy to read. Providing accurate captions for audio content ensures that individuals with hearing impairments can access and understand the information being conveyed. 31 | 32 | 9. 🌍 Inclusive spatial experiences require setting accessibility properties on entities and providing options for interaction to accommodate different needs. 
Developers can use features like the accessibility component in RealityKit to specify accessibility labels, values, and traits, as well as custom actions and content. 33 | 34 | 10. 📚 Developers should familiarize themselves with accessibility features, such as Dynamic Type (adjustable text size), high contrast ratios, and API frameworks like AVKit and AVFoundation, to create accessible apps. Understanding and implementing these features will help ensure that individuals with disabilities can fully enjoy and benefit from spatial computing experiences. 35 | 36 | 37 | 38 | ## Keywords: 39 | - **Spatial computing**: The use of digital technologies to create immersive experiences that integrate the physical and virtual worlds. 40 | - **Accessibility**: The design of products, devices, services, or environments that can be used by people with disabilities. 41 | - **VoiceOver**: A built-in screen reader available on Apple platforms that provides spoken feedback to assist users who are blind or have low vision. 42 | - **RealityKit**: A framework by Apple for creating augmented reality (AR) and virtual reality (VR) experiences. 43 | - **Dynamic Type**: A feature that allows users to adjust the text size in apps to make it more readable. 44 | - **SwiftUI**: A user interface toolkit by Apple for building apps across all Apple platforms. 45 | - **Dwell Control**: An accessibility feature that allows users to interact with UI elements without using their hands. 46 | - **Cognitive accessibility**: The design of products and experiences to accommodate individuals with cognitive disabilities or impairments. 47 | - **AVKit and AVFoundation**: Frameworks by Apple for working with audio and video content in apps. 48 | - **Captioning**: The process of displaying text on screen that corresponds to the audio in a video or other media. -------------------------------------------------------------------------------- /docs/Meta/Designing for virtual reality 3 tips for content strategists.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Designing for virtual reality - 3 tips for content strategists 4 | parent: Meta 5 | --- 6 | 7 | # Designing for virtual reality: 3 tips for content strategists 8 | Original article: [Designing for virtual reality: 3 tips for content strategists](https://design.facebook.com/stories/designing-for-virtual-reality-3-tips-for-content-strategists/) 9 | 10 | ## TL;DR 11 | The article discusses the unique aspects of VR storytelling and suggests a new approach for content strategy, inspired by immersive theater, location-based tours, and exploration video games. It presents three paradigms: giving the audience a defined role in the story, utilizing the setting to establish mood, and gently introducing people to the experience. The content strategist's role in VR is identified as guiding the audience through immersive, relatable, and clear narratives. 12 | 13 | ## Bullet points 14 | 1. **🎭 Emphasize on Participant Roles:** In VR experiences, the audience isn't merely a spectator. Instead, they are active participants in the narrative, contributing to the story progression. This mirrors the interactive nature of immersive theater, where the audience often has a defined role within the performance. 15 | 16 | 2. **🏞️ Utilize the Environment to Set the Mood:** In a VR environment, the setting isn't just a backdrop; it is a character in itself. 
The use of various visual and audio cues can evoke specific moods, similar to the way a story's setting can influence its tone in literature. 17 | 18 | 3. **🕹️ Ease the Audience into the Experience:** VR presents a new paradigm with spatial interactions. To ensure a smooth user experience, it is essential to guide the audience into learning how to interact in this new environment. Drawing inspiration from video games, VR experiences should introduce new interactions gradually, allowing users to adapt comfortably. 19 | 20 | 4. **🤹 Incorporate Interaction:** Audience participation isn't limited to the narrative progression. Users can interact with the environment, influencing the story as they engage with objects in the VR world. This approach aligns with the mechanics of story exploration video games. 21 | 22 | 5. **💭 Think About Narrative Structure:** While VR is a unique medium, it can borrow storytelling techniques from other immersive experiences. Live theater, location-based visitor tours, and first-person video games all serve as sources of inspiration for crafting narratives in VR. 23 | 24 | 6. **👥 Engage with Non-verbal Communication:** In VR experiences, communication isn't limited to verbal dialogue. Audiences use avatars to express themselves, relying on gestures and emotes to interact. Content strategists should consider this aspect when developing narratives. 25 | 26 | 7. **📚 Use Existing Design Patterns as a Guide:** VR is still an evolving medium with room for exploration. However, existing design patterns from other interactive mediums can offer valuable insights and act as guides during the VR content development process. 27 | 28 | 8. **👓 Remember the Immersion:** The essence of VR lies in its immersive nature. The audience's ability to enter the story allows for deeper, emotional connections between the narrative and the user. Content strategists should harness this immersion to enhance the overall storytelling experience. 29 | 30 | ## Keywords 31 | - **Immersive Experience:** This refers to a simulated reality where a user can interact with their surroundings and perform a variety of actions. In the context of the blog post, it describes the way VR allows users to be part of the story. 32 | 33 | - **Content Strategy:** This is an aspect of user experience design that involves planning and managing content in an effective and structured manner. In the context of the blog post, it relates to the task of structuring and presenting narratives in VR experiences. 34 | 35 | - **Avatar:** This refers to a graphical representation of a user in a virtual reality environment or on the internet. In the blog post, it represents the means by which users express themselves non-verbally in VR experiences. 36 | 37 | - **Non-verbal Communication:** This is a type of communication that doesn't use words but instead uses body language, facial expressions, and other visual cues. In the blog post, it describes how users communicate in VR experiences. 38 | 39 | - **Story Exploration Video Games:** These are video games where the story unfolds through the player's interactions with the game environment. The blog post uses this as an example of interactive storytelling. 40 | 41 | - **Location-based Visitor Tours:** This refers to experiences where a story or information is presented to visitors based on their specific location. In the blog post, it is used as a reference for how VR can use the environment to influence the narrative. 
42 | 43 | - **Live Theater:** This is a type of performance art that involves actors performing on stage in front of an audience. In the blog post, it serves as an example of immersive, interactive storytelling. 44 | -------------------------------------------------------------------------------- /docs/Microsoft/Voice input.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Voice input 4 | parent: Microsoft 5 | --- 6 | 7 | # Voice input 8 | Original article: [Voice input](https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input) 9 | 10 | ## TL;DR 11 | 12 | HoloLens integrates voice input, allowing users to command holograms without gestures, using gaze direction for targeting. The system offers a variety of voice commands, from basic selections to advanced hologram manipulations, enhanced by features like Cortana. However, users may face challenges like fine-grained control and voice command reliability. 13 | 14 | ## Bullet points 15 | 16 | 1. 🎙️ **Voice Input**: Voice input is a primary interaction method for HoloLens. It provides users with the ability to interact with holograms without the need for physical gestures. This makes navigating the HoloLens interface more intuitive and efficient, especially when navigating complex interfaces. 17 | 18 | 2. 🎯 **Voice and Gaze**: In the HoloLens environment, voice commands are often paired with the user's gaze direction. This combination allows the device to understand which hologram or application the user is referring to. For instance, if a user looks at a specific app and says "open", the device knows which app to launch. 19 | 20 | 3. 🎧 **Device Support**: Voice input is not exclusive to HoloLens. Both the first and second generation of HoloLens, as well as immersive headsets equipped with microphones, support voice commands. This uniformity ensures a consistent user experience across devices. 21 | 22 | 4. 🗣️ **"Select" Command**: The "select" voice command is a universal interaction across HoloLens devices. It acts as a verbal alternative to physical gestures, allowing users to activate or choose holograms by simply saying "select". This is especially useful when the user's hands are occupied or when they prefer a hands-free interaction. 23 | 24 | 5. 🤖 **Hey Cortana**: Cortana is Microsoft's voice-activated digital assistant. By saying "Hey Cortana", users can prompt her to answer questions, open apps, or perform specific tasks. This feature enhances the user experience by providing instant access to information and functionalities. 25 | 26 | 6. 📋 **HoloLens-specific Commands**: These are a set of voice commands tailored specifically for the HoloLens environment. They range from basic commands like "Take a picture" to more advanced ones like "Increase the brightness". These commands provide users with quick access to device functionalities. 27 | 28 | 7. 🏷️ **"See It, Say It"**: This is a user-friendly feature where voice commands are labeled directly on the interface. If a user sees a button labeled "Adjust", they can simply say "Adjust" to activate it. This intuitive model reduces the learning curve and makes voice interactions more straightforward. 29 | 30 | 8. 🎨 **Hologram Manipulation**: Beyond basic interactions, HoloLens allows users to adjust holograms using voice commands. For instance, a user can command a hologram to face them or adjust its size by saying "Bigger" or "Smaller". 
This provides a more immersive and interactive holographic experience. 31 | 32 | 9. 📝 **Dictation**: Typing in a mixed reality environment can be cumbersome. Dictation offers a solution by allowing users to speak the text they want to input. When the holographic keyboard is active, users can switch to dictation mode, making text input faster and more efficient. 33 | 34 | 10. 🚫 **Challenges**: Like all technologies, voice input in HoloLens is not without its challenges. Fine-grained control, such as adjusting volume by specific increments, can be tricky. Reliability can sometimes be an issue, with the system misinterpreting commands. Social acceptability is another concern, as users might feel awkward speaking commands in public. Lastly, there's a learning curve associated with memorizing specific voice commands. 35 | 36 | ## Keywords 37 | - **HoloLens**: Mixed reality smart glasses developed by Microsoft that superimpose 3D holograms onto the real world. 38 | - **Cortana**: Microsoft's virtual assistant, similar to Siri or Alexa. 39 | - **Gaze Cursor**: A targeting mechanism in HoloLens where the device detects where the user is looking. 40 | - **Air Tap**: A gesture used in HoloLens to select or interact with holograms. 41 | - **Hologram**: A three-dimensional image formed by the interference of light beams. 42 | - **Voice Dwell Tooltip**: A tooltip that appears when gazing at a voice-enabled button in HoloLens. 43 | - **Dictation**: The action of saying words aloud to be typed, written down, or recorded on tape. 44 | - **Ambient Environment Audio Capture**: Capturing the surrounding audio, often for recording or analysis. 45 | - **AudioCategory_Communications**: A stream category in HoloLens optimized for call quality and narration. 46 | - **AudioCategory_Speech**: A stream category in HoloLens optimized for the device's speech engine. 47 | - **AudioCategory_Other**: A stream category in HoloLens optimized for ambient audio recording. 48 | - **WMR OOBE**: Windows Mixed Reality Out Of Box Experience, the initial setup process for Windows Mixed Reality devices. 49 | -------------------------------------------------------------------------------- /docs/Road to VR/The Design Behind ‘Cubism’s’ Hand-tracking.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: The Design Behind ‘Cubism’s’ Hand-tracking 4 | parent: Road to VR 5 | --- 6 | 7 | # Case Study: The Design Behind ‘Cubism’s’ Hand-tracking 8 | Original article: [The Design Behind ‘Cubism’s’ Hand-tracking](https://www.roadtovr.com/cubism-hand-tracking-case-study/) 9 | 10 | ## TL;DR 11 | Cubism employs Ghost Hands for natural interactions and Contact Grabbing for precise object manipulation. Dynamic Hand Smoothing reduces jitter, while Raycast mechanisms detect objects and redundancies enhance usability. Debugging Visualizations aid developers, and careful design minimizes false positives, creating a seamless and immersive user experience. 12 | 13 | ## Bullet points 14 | 1. **🎯 Optimizing for Precise Interactions**: Cubism focuses on precision when it comes to hand tracking input. The interactions are centered around placing small irregular puzzle pieces precisely in a grid. This design principle has guided most of the decision-making around hand input, including grabbing and placing puzzle pieces with exactitude. 15 | 16 | 2.
**👻 Ghost Hands Approach**: The developers decided against using physics-based hands, allowing them to pass through pieces until actively grabbed. This design choice avoids pushing away floating puzzle pieces when trying to grab them and facilitates plucking pieces from a full puzzle. 17 | 18 | 3. **🤏 Contact Grabbing Method**: The most effective approach for Cubism's intricate interactions was contact-based grabbing. This involves thumb and index finger intersection over a small distance rather than a full pinch. The fingers are locked in place when a grab starts, providing stability, and multiple safeguards are implemented to avoid unintentional releases. 19 | 20 | 4. **🎮 Midpoint Check for Grabbing**: To further fine-tune grabbing, a midpoint check is performed between the thumb and index fingertip. This helps prioritize the correct piece when multiple options are present, aiding in precise grabbing in densely filled areas of the puzzle grid. 21 | 22 | 5. **🧩 Grabbing the Puzzle & Dynamic Hand Smoothing**: Grabbing the puzzle employs a full pinch within a special zone around the puzzle. Dynamic hand smoothing is also introduced to mitigate the jitter from the Oculus Quest's hand tracking data, providing a more stable grip without a “laggy” feeling. 23 | 24 | 6. **🔘 Pressing Buttons with Interaction Cues**: Buttons in Cubism can be pressed and interacted with. The system utilizes raycasts, colliders, and various cues (such as highlight and drop shadow) to guide the user in interacting with buttons accurately. 25 | 26 | 7. **🔀 Interaction Redundancy and Cues**: Since different players may approach interactions in unintended ways, redundancy is implemented, such as allowing full-finger grabbing or pinching buttons. Multiple cues, like color highlighting and audio feedback, guide users in the intended direction. 27 | 28 | 8. **🏫 Teaching Limitations**: The article acknowledges the limitations of Quest’s hand tracking and stresses the importance of making players aware of these limitations. Guidance includes a modal explaining best practices and in-game indicators when tracking is lost. 29 | 30 | 9. **🖐️ Customizable Tracking Feedback**: Cubism allows players to customize the feedback they receive when tracking is lost. They can choose between the default red hands with warning messages or a more subtle fading of hands, depending on personal preference. 31 | 32 | 10. **🔧 Debugging and Tuning Options**: Throughout the article, the emphasis on fine-tuning and optimizing is evident. There are debug visualizations and settings available within the game for players and developers to understand and customize interactions, ensuring that the user experience is as intuitive and responsive as possible. 33 | 34 | ## Keywords 35 | - **Ghost Hands**: Virtual hands in the application that can pass through objects unless actively grabbing them. 36 | - **Contact Grabbing**: Interaction where an object is grabbed when specific parts of the virtual hand intersect with the object. 37 | - **Dynamic Hand Smoothing**: Technique to counteract jitter in hand tracking data, providing natural and stable hand movement. 38 | - **Raycast**: Method to detect objects along a path or in line of sight, used to detect hovering or pressing of buttons. 39 | - **Redundancies**: Provision of multiple means to achieve the same interaction goal, enhancing usability and flexibility. 40 | - **Debugging Visualizations**: Visual tools to help developers identify and fix problems in the software. 
41 | - **Grab Zone**: Designated area where specific grabbing actions can be performed to interact with the puzzle grid. 42 | - **Jitter**: Erratic or shaky movement in hand tracking, countered by dynamic hand smoothing. 43 | - **Pressable**: Buttons or interactive elements that can be pressed or activated by the user. 44 | - **Raycast Mechanisms**: Techniques involving raycasting to detect position and state of objects in 3D space. 45 | - **False Positives**: Unintentional activations or triggers of certain actions or responses in interaction. 46 | -------------------------------------------------------------------------------- /docs/Meta/Mixed Reality Design Guidelines.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Mixed Reality Design Guidelines 4 | parent: Meta 5 | --- 6 | 7 | # Mixed Reality Design Guidelines 8 | Original article: [Mixed Reality Design Guidelines](https://developer.oculus.com/resources/mr-design-guideline/) 9 | 10 | ## TL;DR 11 | This article discusses how Passthrough technology in VR/AR applications can improve user experiences by increasing real-world awareness. It covers the use of Passthrough as a background, reducing onboarding friction, creating selective windows, and providing tips on stylization, occlusions, and shadows. The Insight SDK is mentioned as a tool for developers to control blending and lighting effects, enhancing user immersion and engagement in virtual and augmented reality environments. 12 | 13 | ## Bullet points 14 | 1. 👀 **Passthrough as a Background:** When used as a background, Passthrough technology allows users to see and interact with the real world while also experiencing virtual elements. This feature not only boosts real-world awareness but can also enhance user engagement by providing AR-like experiences in VR applications. 15 | 2. 🌐 **Reducing VR Onboarding Friction:** To ease new users into the VR experience, Passthrough can be utilized to guide users from their actual environment to a fully immersive virtual realm and back. This gradual transition could potentially foster higher adoption rates of VR applications. 16 | 3. 🪟 **Selective Passthrough:** Developers have the ability to create "windows" into the real world within the VR environment. These "windows" can provide partial views of the user's physical surroundings and facilitate transitions between real and virtual realities. 17 | 4. 🔄 **Transitioning between Realities:** When presenting users with multiple realities, it's essential to clearly define the transitions. Unless there's a direct correspondence or 1:1 match between the virtual and physical spaces, mixing realities can confuse users. 18 | 5. 🎨 **Stylizing Passthrough:** The camera sensors on Quest and Quest 2 are black and white. Developers can choose to show Passthrough as-is or stylize it to achieve an artistic look. Use of color can draw attention towards virtual elements and heighten user experience. 19 | 6. ⚡ **Highlighting Moments with Stylization:** Animation and stylization effects can be employed to emphasize certain moments for the user, such as transitioning from the real world to a virtual one or accomplishing an achievement. 20 | 7. 🕳️ **Realistic Virtual Content with Occlusions and Shadows:** Using occlusion and shadow effects can make virtual content appear more realistic within the physical world, thus enhancing Mixed Reality experiences. 
Without proper occlusion, VR objects can seem like they're floating in the Passthrough view. 21 | 8. 👓 **Blending Passthrough over Virtual Content:** Developers can blend Passthrough over virtual content, employing stylistic effects like outlines for more appealing and less distracting visuals. Outlines are particularly effective as they can contrast with the virtual background. 22 | 9. 💡 **Naive Lighting:** By adjusting the transparency of a black background, applications can create an illusion of lighting and dimming in the environment. This technique can enhance the user's immersion by suggesting that the environment is dynamically responding to their actions. 23 | 10. 🎮 **Insight SDK for Blending Control:** The Insight SDK provides developers with the tools to control the blending of Passthrough and their applications. This allows for creative utilization of lighting and transitions, enhancing user immersion and experience. 24 | 25 | 26 | ## Keywords 27 | - **Passthrough:** In the context of virtual or augmented reality, Passthrough refers to a feature that allows users to see the real world around them while wearing a VR/AR headset. 28 | - **Spatial Anchor:** A spatial anchor is a point of reference used in augmented and mixed reality experiences to keep virtual objects correctly positioned in the real world as viewed through a device. 29 | - **Scene:** In the context of VR/AR, a scene is the virtual environment or space that the user can interact with. 30 | - **Onboarding:** The process of integrating a new user into a system, service, or product, teaching them how to use it. 31 | - **Selective Passthrough:** A feature where only certain parts of the real-world view are shown in a VR environment. 32 | - **1:1 Match:** A perfect correspondence between two sets of things; in this context, it refers to the matching between virtual and physical spaces. 33 | - **Stylization:** The process of designing or presenting in accordance with a style or stylistic pattern. 34 | - **Occlusions:** In the context of VR/AR, occlusion refers to the effect where an object appears to be blocked by another, adding to the realism of the scene. 35 | - **Insight SDK:** A Software Development Kit provided by Oculus that provides features to help developers create VR experiences. 36 | - **Naive Lighting:** A method of manipulating lighting in VR experiences by adjusting the transparency of elements, creating an illusion of the real environment being lit or dimmed. 37 | -------------------------------------------------------------------------------- /docs/Road to VR/Exploring Methods for Conveying Object Weight in Virtual Reality.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Exploring Methods for Conveying Object Weight in Virtual Reality 4 | parent: Road to VR 5 | --- 6 | 7 | # Exploring Methods for Conveying Object Weight in Virtual Reality 8 | Original article: [Exploring Methods for Conveying Object Weight in Virtual Reality](https://www.roadtovr.com/b-reel-simulating-object-weight-mass-virtual-reality-motion-controllers/) 9 | 10 | ## TL;DR 11 | XR development combines visual and physical interactions with a focus on realism and immersive experiences. Essential components include teamwork, iterative testing, user engagement, and leveraging platforms like Unity. The process emphasizes reusable codebases and continuous learning to streamline development and foster innovation. 12 | 13 | ## Bullet points 14 | 1. 🎮 **Focus on Visual vs. 
Physical Interaction Models**: In the realm of mixed reality, different interaction models are employed. Visual interaction relies on simpler 2D planes, reminiscent of early VR models, where physical interaction is not necessarily mimicked. On the other hand, physical interaction models use 3D modeling to simulate real-world physical interactions. This makes VR feel more realistic by imitating how objects would behave in the actual world. 15 | 16 | 2. ⚖️ **Artificial Weight Simulation in VR**: This concept involves simulating how objects of different weights would feel in a virtual environment. Heavy objects require slower, more deliberate interaction to reflect their real-world weight, while light objects can be picked up or manipulated more quickly. This adds an extra layer of realism and immersion to the VR experience. 17 | 18 | 3. 🚀 **Development Goals and Collaborative Workflow**: A well-structured development plan includes enhancing collaboration through tools like Git, allowing for a smooth workflow among team members. Creating a boilerplate environment ensures a reusable codebase that can expedite development in future VR projects. This focuses on teamwork and efficiency. 19 | 20 | 4. 🔗 **Exploration of Direct and Loose Links**: In VR, there's a choice between direct and loose links between the controller and virtual objects. Direct links provide a one-to-one interaction, where objects follow the controller precisely, while loose links simulate real-world physics, where weight and forces affect the objects' response. This provides varying levels of realism and control. 21 | 22 | 5. 🔉 **Sensory Cues and Feedback**: The incorporation of sensory cues such as visual indicators (color-coded meters showing strain or effort) and haptic feedback (vibration or touch feedback) greatly enhances immersion. These cues provide immediate feedback to the user, mimicking real-world sensations and responses. 23 | 24 | 6. 🧪 **Testing and Tweaking Interaction Models**: Iterative testing is essential in refining and improving interaction in a VR environment. This involves multiple rounds of testing and continuous refinement of visual and haptic feedback elements. The goal is to create a more authentic and satisfying user experience. 25 | 26 | 7. 🧑‍💻 **User Testing and Implementation Variations**: Different methods of user testing, ranging from simple to complex, are vital in gauging effectiveness. Scenario-based testing, where users are tested in various simulated scenarios, allows developers to understand how different users might interact with the system, providing insights for further refinement. 27 | 28 | 8. 🧠 **Insights and Takeaways**: This involves extracting key insights into what worked and what didn’t within the VR interactions. Recognizing that even basic interactions can be complex in a VR environment is essential. These insights contribute to a more comprehensive understanding of the dynamics at play. 29 | 30 | 9. 💡 **Final Recommendations**: The recommendations include finding the right balance between simplicity and complexity based on the need for weight simulation in the VR experience. It also emphasizes the potential for further exploration and refinement in other areas, pointing toward ongoing development opportunities. 31 | 32 | 10. 📚 **The Value of Detailed Exploration**: Breaking down complex interactions into understandable patterns is vital in mixed reality development. It allows for a deeper understanding of recurring patterns and prepares developers for specific challenges. 
This detailed exploration can lead to achieving side goals such as familiarity with platforms like Unity and the creation of reusable environments, paving the way for more efficient future projects. 33 | 34 | ## Keywords 35 | - **Unity**: A cross-platform game engine used for developing video games, simulations, and other interactive content. 36 | - **Boilerplate**: Standardized, reusable pieces of code or project setup that can be carried over across different projects. 37 | - **Direct Link**: In VR, a method where objects match the controller’s motion exactly, providing a one-to-one interaction experience. 38 | - **Loose Link**: In VR, a method where objects are attracted to the controller based on weight, providing a more flexible and realistic interaction experience. 39 | - **Haptic Feedback**: Feedback provided through touch or vibration to simulate physical sensations. 40 | - **Git**: A distributed version control system used to track changes in source code during software development. 41 | -------------------------------------------------------------------------------- /docs/Meta/User Input.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: User Input 4 | parent: Meta 5 | --- 6 | 7 | # User Input 8 | Original Article: [User Input](https://developer.oculus.com/resources/bp-userinput/) 9 | 10 | ## TL;DR 11 | The article discusses best practices for designing user interfaces for VR devices, particularly with 3DOF and 6DOF controllers. Recommendations include maintaining a 1:1 movement ratio, standard button mapping, and accommodating left- and right-handed users. It also covers the complexities of hand registration, interaction with the virtual world, and using haptics to improve user experience. The challenge of replicating real-world physics in VR is noted, with a strong emphasis on designing believable and comfortable object interactions. 12 | 13 | ## Bullet Points 14 | 1. 🕹️ **VR Controller Types:** The two main types of VR controllers are 3DOF and 6DOF. 3DOF controllers, common in mobile VR devices, allow for orientation tracking but not spatial tracking. 6DOF controllers, on the other hand, can track both orientation and position in space, making them more suitable for advanced VR systems like the Rift. 15 | 16 | 2. 👍 **User Input Recommendations:** For a successful VR experience, it's crucial to maintain a 1:1 movement ratio between the real-world controller and its virtual representation. Using standard button mappings and touch/controller-based menus can also make your VR application feel more intuitive and familiar to users. 17 | 18 | 3. 👥 **Accommodating User Preferences:** VR systems should be designed to accommodate users of all kinds, including left-handed and right-handed individuals. If two controllers are used, allow interactions with both hands; if only one is available, respect the user's default hand setting. 19 | 20 | 4. 📱 **3DOF Best Practices:** 3DOF controllers work well as pointing devices for interacting with UI elements. They shouldn't be represented as hands in-app due to their limited capabilities. Use the controller touchpad for navigating through menus and aim at objects in VR with the ray-cast pointer, which should be on at all times for clarity. 21 | 22 | 5. 👐 **6DOF Best Practices:** For 6DOF controllers, the representation of hands in VR is crucial for immersion.
Poorly implemented hand registration can lead to discomfort, so it's essential to ensure that the virtual hand's position and orientation match the user's actual hand. Allowing customization of hand appearance can also improve user comfort. 23 | 24 | 6. 📐 **Implementing Hand Registration:** Testing for hand registration can be done using the "back of the hand" test, and to get the best results, the controller model should be placed in the scene accurately. Getting the registration right can avoid the "uncanny hand valley" effect. 25 | 26 | 7. 💫 **Interactions with Virtual Objects:** Virtual objects should be designed with consideration of their perceived weight to maintain believability, as controllers can't simulate resistance or torque. Using physical and ethereal hand models simultaneously can help manage hand-object intersections. 27 | 28 | 8. 💡 **Lightweight Interactions:** Since VR can't simulate the physical resistance of real-world interactions, stick to lightweight interactions for believability. Flipping a light switch, for example, is more believable than lifting a heavy rock in VR. 29 | 30 | 9. 🤲 **Two-Handed Interactions:** Designing interactions requiring two hands can be challenging in VR due to the lack of physical constraints. Be cautious when implementing these types of interactions. 31 | 32 | 10. 🔊 **Use of Haptics:** Using haptics (tactile feedback) can significantly improve the VR experience. They can be used to simulate the feeling of throwing and catching virtual objects, which increases user immersion; a minimal haptics sketch follows the keyword list below. 33 | 34 | ## Keywords 35 | - **3DOF (3 Degrees of Freedom)**: This refers to a controller that allows for orientation tracking but does not track the controller's position in space. 36 | - **6DOF (6 Degrees of Freedom)**: This term describes a controller that supports both orientation and positional tracking. These controllers are usually used in pairs, allowing users to have "virtual hands" in the VR environment. 37 | - **Proprioceptive Mismatch**: This is a mismatch between the perceived location or movement of a virtual limb and the actual location or movement of the real limb. Such a mismatch can cause discomfort in the VR user. 38 | - **Hand Registration**: This refers to the process of aligning the virtual hand model with the user's actual hand, making it feel like the virtual hands are their own. Good hand registration is critical for a comfortable and immersive VR experience. 39 | - **Haptics**: This involves the use of tactile sensations to provide feedback to the user in a virtual environment. This could be in the form of vibrations or resistances. 40 | - **Occlusion**: In the context of VR, occlusion refers to the obstruction of a sensor's view of the controller or headset, which can affect tracking accuracy. This term is mentioned in the context of avoiding the need for users to pick up objects off the floor, where sensors may not track accurately. 41 | - **IK Arm**: IK, or Inverse Kinematics, is a technique used in 3D animation to make movements appear more realistic. An "IK Arm" is a digital arm animated using inverse kinematics. In this case, it's suggested not to use this technique, as it can lead to proprioceptive mismatch.
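As a rough illustration of the haptics guidance above: a minimal sketch of catch/throw vibration feedback, assuming Unity (the engine referenced elsewhere in this handbook) and the Oculus Integration's `OVRInput` vibration call. The class name and tuning values are illustrative, not taken from the original article.

```csharp
using UnityEngine;

// Minimal sketch: a short haptic pulse when a virtual object is caught.
// Assumes the Oculus Integration's OVRInput API; values are illustrative.
public class GrabHaptics : MonoBehaviour
{
    [SerializeField] OVRInput.Controller hand = OVRInput.Controller.RTouch;
    [SerializeField] float pulseSeconds = 0.08f;

    float pulseEndTime;

    // Call from your grab/catch logic with the hand-object relative speed (m/s).
    public void OnObjectCaught(float impactSpeed)
    {
        // Faster catches feel stronger; clamp to the valid 0..1 amplitude range.
        float amplitude = Mathf.Clamp01(impactSpeed / 5f);
        OVRInput.SetControllerVibration(0.5f, amplitude, hand);
        pulseEndTime = Time.time + pulseSeconds;
    }

    void Update()
    {
        // OVRInput vibration runs until overwritten, so stop it after a short
        // pulse so the catch reads as a discrete tactile "tap", not a steady buzz.
        if (pulseEndTime > 0f && Time.time >= pulseEndTime)
        {
            OVRInput.SetControllerVibration(0f, 0f, hand);
            pulseEndTime = 0f;
        }
    }
}
```

A brief pulse whose strength scales with impact speed tends to sell the catch; long, constant vibration reads as controller noise rather than contact.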
-------------------------------------------------------------------------------- /docs/Microsoft/Shared experiences.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Shared experiences 4 | parent: Microsoft 5 | --- 6 | 7 | # Shared experiences 8 | Original article: [Shared experiences](https://learn.microsoft.com/en-us/windows/mixed-reality/design/shared-experiences-in-mixed-reality) 9 | 10 | ## TL;DR 11 | 12 | In mixed reality, designing shared experiences hinges on understanding factors like sharing nature, group size, participant location, interaction timing, environment similarity, and device type. Technologies like Azure Spatial Anchors facilitate these shared holographic interactions. As this technology progresses, it's set to revolutionize collaboration across numerous sectors. 13 | 14 | ## Bullet points 15 | 16 | 1. 🤝 **Shared Scenarios**: Before diving into the world of mixed reality, it's pivotal to outline the target scenarios. This means understanding the specific situations or use-cases you're designing for. By doing this, designers and developers can pinpoint the core challenges and opportunities, ensuring a more streamlined and effective design process. 17 | 18 | 2. 🔄 **How are they sharing?** 19 | - **Presentation**: A single entity (like a professor) showcases content to many. While the main content is visible to all, the presenter might have additional private notes. 20 | - **Collaboration**: Multiple users work together, like students collaborating on a holographic heart model to learn about surgeries. 21 | - **Guidance**: A more personalized approach where one individual guides another, akin to a professor guiding a student through a surgical procedure in a virtual environment. 22 | 23 | 3. 👥 **Group Size** 24 | - **1:1**: Personalized and direct, like a tutor-student setup. 25 | - **Small (<7)**: Suitable for team projects or group discussions, where each member can contribute without overwhelming others. 26 | - **Large (>=7)**: More complex, suitable for lectures or seminars. The larger the group, the more challenges arise, both technically and socially. 27 | 28 | 4. 🌍 **Location of Participants** 29 | - **Colocated**: Everyone's in the same room or area, fostering direct interaction. 30 | - **Remote**: Participants are scattered, perhaps in different parts of the world, relying heavily on the tech to interact. 31 | - **Both**: A hybrid model, like a team in a conference room collaborating with remote members. 32 | 33 | 5. ⏰ **Timing of Sharing** 34 | - **Synchronous**: Everyone's online and active together, like a live group video call. 35 | - **Asynchronous**: Participants interact at different times, akin to forum posts or email chains. 36 | - **Both**: A blend, like a professor leaving virtual notes on a student's project to be viewed later. 37 | 38 | 6. 🏠 **Physical Environments** 39 | - **Similar**: Environments with comparable features, like two different university lecture halls. 40 | - **Dissimilar**: Vastly different settings, like comparing a quiet study room to a bustling auditorium. 41 | 42 | 7. 📱 **Device Types** 43 | - **Immersive Devices**: Offer a deep, all-encompassing digital experience. 44 | - **Holographic Devices**: Primarily display 3D digital images, enhancing the depth of the experience. 45 | - **2D Devices**: Traditional screens like smartphones or desktops, which might offer a more limited experience compared to the other two. 46 | 47 | 8. 
🚀 **Exploring Shared Experiences**: Innovations like Microsoft's HoloLens showcase the potential of mixed reality. For instance, Skype on HoloLens allows a user to get guidance on fixing a light switch from a remote expert, blending the real and virtual seamlessly. 48 | 49 | 9. 🛠 **Building Shared Experiences**: To create a cohesive mixed-reality experience, several technical aspects need to be addressed. This includes creating and joining sessions, aligning holograms in a shared space, synchronizing data in real-time, and ensuring data persistence across sessions. 50 | 51 | 10. 📡 **Tech Options**: Tools like Azure Spatial Anchors and Photon SDK are at the forefront of mixed reality development. They allow for the creation of shared holographic experiences, from static holograms to fully interactive multi-user environments. 52 | 53 | ## Keywords 54 | - **Synchronous**: Occurring at the same time. 55 | 56 | - **Asynchronous**: Not occurring at the same time. 57 | 58 | - **Immersive Devices**: Devices that provide a deeply engaging, multi-sensory, digital experience. 59 | 60 | - **Holographic Devices**: Devices that can display holograms, three-dimensional (3D) digital images that can be seen without special 3D glasses or other equipment. 61 | 62 | - **HoloLens**: A pair of mixed reality smart glasses developed by Microsoft, providing holographic and mixed reality experiences. 63 | 64 | - **Colocated**: Participants or users are in the same physical space. 65 | 66 | - **Azure Spatial Anchors**: A cloud service by Microsoft Azure that allows developers to create mixed-reality apps that map, persist, and share 3D content. 67 | 68 | - **Miracast**: A standard for wireless connections from devices (like laptops, tablets, or smartphones) to displays (such as TVs, monitors, or projectors). 69 | 70 | - **Photon SDK**: A software development kit used for creating multiplayer games and real-time applications. 71 | 72 | - **OnSight**: A collaboration tool developed by NASA’s Jet Propulsion Laboratory that lets scientists collaborate in real time within data from the Martian landscape. 73 | -------------------------------------------------------------------------------- /docs/Meta/Designing for Hands.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Designing for Hands 4 | parent: Meta 5 | --- 6 | 7 | # Designing for Hands 8 | Original article: [Designing for Hands](https://developer.oculus.com/resources/hands-design-intro/) 9 | 10 | ## TL;DR 11 | In AR/VR, traditional input methods are giving way to intuitive hand-based interfaces. The focus is on creating user-friendly UI components with continuous feedback, such as specifically designed buttons and innovative pinch-and-pull components. Other tools, like pointers and cursors, enhance user experience, while the virtual hand's design ensures immersion and precise gestures. 12 | 13 | ## Bullet points 14 | 15 | 1. ✋ **Hand Interactions**: With the evolution of AR and VR interfaces, there's been a trend of ditching traditional input devices. Instead, the use of natural hand gestures and movements is encouraged, making the experience more immersive and intuitive for users without the barriers of buttons or controllers. 16 | 17 | 2. 🖲️ **User Interface Components**: In the absence of the physical buttons and switches commonly found on traditional devices, there's a pressing need for innovative UI components.
These new components prioritize continuous feedback throughout a user's interaction, ensuring users always know their actions are being registered. 18 | 19 | 3. 🔘 **Buttons**: Even in a VR or AR setting, buttons remain a crucial component of user interfaces. But without tactile feedback, there's been a shift towards visual feedback mechanisms. As a user interacts with a virtual button, it transitions through multiple states—like "default," "focus," and "hover"—to communicate its current status to the user. 20 | 21 | 4. 📏 **Design Specifications**: Crafting the perfect button for AR and VR is an art and science combined. Parameters like button size, layout density, collider shapes, and optimal interaction distances have been carefully researched to ensure optimal user experience. For instance, near field buttons need to be large enough for accurate interactions, while far field buttons require consideration of angular size for distant interactions. 22 | 23 | 5. 👌 **Pinch-and-Pull**: The pinch-and-pull mechanism has emerged as a groundbreaking VR interaction method. More than just a novelty, it offers a tactile sense of interaction, even in a virtual space. This mechanism has paved the way for a variety of picker components, allowing users to interact in one, two, or even three dimensions, enabling complex selections such as color shades in a volumetric space. 24 | 25 | 6. 👉 **Pointers & Cursors**: A user's guide in the vast VR or AR space, pointers and cursors offer vital feedback. Their design is dynamic, with elements like shape, color, and opacity changing based on the user's interaction state. This adaptability ensures users always understand their current mode of interaction, whether they're just pointing or actively pinching to make a selection. 26 | 27 | 7. 🖐️ **Hand Design**: To make the virtual representation of hands feel authentic, designers combine static visual elements with dynamic feedback mechanisms. These virtual hands undergo subtle color shifts and pose changes based on the user's gestures and interactions, enhancing the immersive feel and ensuring users always understand their hand's current state and functionality. 28 | 29 | 8. 🔄 **Continuous Feedback**: In a realm where physical touch is absent, the role of visual and audio feedback is paramount. By ensuring that every user action, no matter how minor, elicits a response from the system, designers can forge a stronger bond between the user and the virtual environment, making interactions feel more grounded and authentic. 30 | 31 | 9. 📚 **Best Practices**: Crafting an effective VR and AR user experience isn't just about innovation; it's also about adhering to proven design principles. Leveraging concepts such as affordances, signifiers, and feedback, designers can create interfaces that are not only innovative but also intuitive, ensuring users can navigate and interact with ease. 32 | 33 | 34 | ## Keywords 35 | 36 | - **Affordances**: Characteristics of an object that suggest how it should be used, without explicit instructions. 37 | 38 | - **Signifiers**: Indicators or symbols that provide clues about how an interaction should take place. 39 | 40 | - **Button States**: Different visual and interaction stages a button goes through, from being idle to being interacted with. 41 | 42 | - **Near Field & Far Field Buttons**: Classification of buttons based on their proximity to the user within a virtual environment. 
43 | 44 | - **Collider Shape**: The physical shape and boundary of an object that detects interaction or collision in a digital space. 45 | 46 | - **Angular Size**: A measure of the proportion of one's field of view taken up by an object, regardless of its actual size or distance. 47 | 48 | - **Pinch-and-Pull Components**: A type of interaction where users can pinch virtual objects and pull them for various functions. 49 | 50 | - **1D, 2D, 3D Picker**: Different dimensional selection tools in virtual spaces that allow users to make choices in one, two, or three axes respectively. 51 | 52 | - **Raycasting**: A technique used in computing to find where lines or rays intersect objects, often used in VR and AR to determine where a user is pointing. 53 | 54 | - **System Pose & Pointing Pose**: Different stances or positions of the hand to perform specific actions or gestures in a virtual environment. 55 | -------------------------------------------------------------------------------- /docs/Apple/Design for spatial user interfaces.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Design for spatial user interfaces 4 | parent: Apple 5 | --- 6 | 7 | # Design for spatial user interfaces 8 | Original video: [Design for spatial user interfaces](https://developer.apple.com/videos/play/wwdc2023/10076/) 9 | 10 | ## TL;DR 11 | The video discusses designing iOS applications, outlining core structures like a window for content, a tab bar for main navigation, and sidebars for sub-navigation. It introduces a new feature, "ornaments", that can expand or disappear based on user interaction. It also covers additional interactive elements like menus, popovers, and sheets (modals) that can extend interactions beyond the window. The video emphasizes the use of the platform's spatial capabilities to create immersive experiences. 12 | 13 | ## Bullet Points 14 | 1. 🪟 **Window:** The iOS app structure is built around a core element known as a window. It's not just a display frame, but a canvas that holds all interactive elements. It's made of an opaque material, providing a stage for all app components. The design ensures a fluid movement in the user's space. 15 | 16 | 2. 🎚 **Tab Bar Controller:** On iPhone, it traditionally sits at the bottom, enabling primary navigation within the app. In this new platform, it's vertically oriented and is positioned on the left, floating in a fixed position. 17 | 18 | 3. 📌 **Sidebar:** A sidebar co-exists with the tab bar providing an additional layer of navigation. It's particularly useful where users need to navigate between multiple sections or subsections within a tab. 19 | 20 | 4. 🎈 **Ornaments:** A new presentation style called "ornaments" are floating accessories placed in various locations on the interface. They serve as persistent control tools, enabling quick actions related to the content. They leverage depth to create a hierarchy within the interface. 21 | 22 | 5. 🎧 **Music App Ornaments:** The Now Playing controls are presented as a floating ornament that persists no matter where a user navigates within the app. It allows users to control their music playback while exploring other parts of the app. 23 | 24 | 6. 🍥 **Menus and Popovers:** They have been designed to expand outside the app window. They appear centered by default, and the button that invokes them changes to its selected state, featuring black labels on a white background. 25 | 26 | 7. 
📑 **Sheets:** Sheets present modal views and appear at the center of the app. A sheet takes over the parent window's original Z position, while the parent is pushed back and dimmed; a generic sketch of this push-back-and-dim pattern follows the keyword list below. 27 | 28 | 8. 🎀 **Secondary Modals:** If another sheet needs to be presented while one is already open, a secondary modal appears with an additional layer of dimming, pushing everything else back. 29 | 30 | 9. 🗺 **Push Navigation:** It's recommended for nested views. Instead of a close button, a back button is presented in the top-left corner, enabling users to navigate back to the previous view. 31 | 32 | 10. 📸 **Leverage Spatial Capabilities:** Developers are urged to leverage the unique spatial capabilities of the platform, creating immersive experiences using the depth and spatiality that the platform provides. 33 | 34 | ## Keywords 35 | - **iOS**: iOS is the operating system for Apple devices like the iPhone and iPad. It's the software that enables users to interact with their devices. 36 | - **Window**: In the context of iOS app development, a window is a fundamental part of the user interface that provides the canvas for all other elements to sit on. 37 | - **Tab bar controller**: This is a special type of view controller in iOS that lets users switch between different views (or tabs) at the same level of hierarchy in an app. 38 | - **Sidebar**: In the context of app interfaces, a sidebar is a vertical panel that provides additional navigation or control options. It can be used for sub-navigation within a particular tab. 39 | - **Ornaments**: A term used in the video to refer to floating accessory elements or controls in an app interface that provide additional functionality and are persistent, meaning they are always available. 40 | - **Popover**: A popover is a transient view that shows up on the screen when a user taps a button or performs a certain action. It appears on top of the content and usually provides additional information or includes a list of actions. 41 | - **Modal**: In the context of user interface design, a modal is a type of window that demands user interaction before they can return to the parent interface. It is used to focus the user's attention and is often used for alerts, data entry, and other tasks that require user input before proceeding. 42 | - **Sheet**: In the context of iOS development, a sheet refers to a specific style of modal window that presents content in a distinct layer over the parent view. 43 | - **Spatial Captures**: Photos and videos captured with depth information, which the Photos app can present in an immersive way on this platform. 44 | - **Z Position**: In 3D design or interface design involving depth, the Z position refers to the depth of an element, or how far it appears to be from the user. 45 | - **Push Navigation**: In the context of user interface design, push navigation is a navigation style where new content is "pushed" onto the screen, typically from the right, replacing the current content.
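To make the sheet behavior above concrete, here is a rough sketch of the push-back-and-dim pattern. This is deliberately not Apple's API (on visionOS the system sheet presentation handles this for you); it is a generic Unity-style C# illustration, consistent with the engine referenced elsewhere in this handbook, and every name and distance in it is an assumption.

```csharp
using UnityEngine;

// Generic sketch of the "sheet" pattern: the modal appears at the parent
// window's original Z position while the parent is pushed back and dimmed.
// Hypothetical class and fields -- not Apple's visionOS implementation.
public class SheetPresenter : MonoBehaviour
{
    [SerializeField] Transform parentWindow;
    [SerializeField] Transform sheet;
    [SerializeField] Renderer parentRenderer;   // assumes a material with a color tint
    [SerializeField] float pushBackMeters = 0.15f;
    [SerializeField] float dimFactor = 0.6f;

    Vector3 originalPos;
    Color originalColor;

    public void Present()
    {
        originalPos = parentWindow.position;
        originalColor = parentRenderer.material.color;

        // The sheet takes over the parent's original Z position...
        sheet.position = originalPos;
        sheet.gameObject.SetActive(true);

        // ...while the parent moves away from the viewer and dims, so depth
        // (not just dimming) communicates the modal hierarchy.
        Vector3 away = (originalPos - Camera.main.transform.position).normalized;
        parentWindow.position = originalPos + away * pushBackMeters;
        parentRenderer.material.color = originalColor * dimFactor;
    }

    public void Dismiss()
    {
        sheet.gameObject.SetActive(false);
        parentWindow.position = originalPos;
        parentRenderer.material.color = originalColor;
    }
}
```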
46 | -------------------------------------------------------------------------------- /docs/Apple/Designing for visionOS.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Designing for visionOS 4 | parent: Apple 5 | --- 6 | 7 | # Designing for visionOS 8 | Original article: [Designing for visionOS](https://developer.apple.com/design/human-interface-guidelines/designing-for-visionos) 9 | 10 | ## TL;DR 11 | Apple's Vision Pro, powered by visionOS, offers a limitless virtual space for immersive and engaging experiences, smoothly transitioning between shared and full spaces for multi-tasking or focused app usage. The device blends digital and real-world surroundings through features like Passthrough and Spatial Audio. Ergonomics and Accessibility are central design considerations, ensuring comfortable, personalized user experiences that transcend physical limitations. 12 | 13 | ## Bullet points 14 | 1. 🔵 **Space**: Apple Vision Pro provides a limitless canvas for virtual experiences, accommodating different forms of content like windows, volumes, and 3D objects. It creates immersive experiences, transporting users to different virtual environments. Designing for this space requires an understanding of how to navigate and place content in a limitless environment. 15 | 16 | 2. 🔄 **Immersion**: VisionOS apps enable users to transition fluidly between different levels of immersion. Initially, an app launches in a Shared Space where multiple apps can operate simultaneously, allowing users to manage various tasks. However, users can choose to transition an app to a Full Space, meaning it becomes the only running app, providing a more focused and immersive experience. 17 | 18 | 3. 📹 **Passthrough**: Passthrough is a vital feature in visionOS. It leverages the device's external cameras to provide a live video feed of the user's actual surroundings, creating a blend of real and virtual environments. Users can control the level of passthrough using the Digital Crown, depending on how much of their real surroundings they wish to see. 19 | 20 | 4. 🎧 **Spatial Audio**: Vision Pro's Spatial Audio offers a multi-dimensional audio experience. It uses acoustic and visual-sensing technologies to model the sonic characteristics of a person’s surroundings, making audio sound natural and harmonious with their environment. With user permission, apps can access this information to create custom audio experiences. 21 | 22 | 5. 👁️ **Focus and Gestures**: Vision Pro is designed to facilitate interaction primarily through eyes and hands. Users bring focus to a virtual object by looking at it and then activate it through an indirect gesture like a tap. Direct gestures, like physically touching a virtual object, are also supported. 23 | 24 | 6. 💺 **Ergonomics**: VisionOS prioritizes user comfort. The content placement is automatically adjusted relative to the wearer's head position, ensuring optimal visual comfort regardless of user posture. This ergonomic design encourages passive interaction, letting users engage with apps and games without unnecessary physical strain. 25 | 26 | 7. 🔄 **Accessibility**: Apple Vision Pro supports numerous accessibility technologies to ensure all users can interact with the device effectively. Technologies like VoiceOver, Switch Control, Dwell Control, Guided Access, and Head Pointer enhance usability, especially for individuals with disabilities. 27 | 28 | 8. 
🎯 **Best Practices**: Designing for visionOS involves embracing its unique features and capabilities. App design should prioritize user comfort, support varying degrees of immersion, integrate seamless transitions between real and virtual spaces, and encourage shared activities. Every app's design should focus on providing a seamless, engaging, and immersive experience. 29 | 30 | ## Keywords 31 | - **visionOS**: Apple's operating system for its Vision Pro device, which enables immersive augmented reality experiences. 32 | 33 | - **Apple Vision Pro**: Apple's augmented reality headset that uses advanced technologies to create immersive experiences blending the physical and digital world. 34 | 35 | - **Space**: In the context of Vision Pro, this refers to the virtual canvas where users can view and interact with digital content, such as windows, volumes, and 3D objects. 36 | 37 | - **Immersion**: A key aspect of the Vision Pro experience, allowing users to fluidly transition between varying degrees of engagement with the virtual environment. 38 | 39 | - **Shared Space**: A feature of visionOS where multiple apps can run side-by-side, allowing users to multitask effectively. 40 | 41 | - **Full Space**: A mode in visionOS where one app dominates the user experience, offering a fully immersive interaction. 42 | 43 | - **Passthrough**: A feature of Vision Pro that uses the device's external cameras to provide a live video of the user's actual surroundings, helping to blend the virtual and real world. 44 | 45 | - **Spatial Audio**: A technology used in Vision Pro to model the sonic characteristics of the user's surroundings, providing a natural sounding audio experience that complements the visual immersion. 46 | 47 | - **Digital Crown**: A control element on Apple's Vision Pro, used for adjustments such as the level of passthrough. 48 | 49 | - **Ergonomics**: The study of people's efficiency in their working environment. In visionOS, it refers to the design considerations to make the usage of the device comfortable and efficient. 50 | 51 | - **Accessibility**: The design considerations and features that ensure the device and its software can be used by as many people as possible, including those with disabilities. 52 | -------------------------------------------------------------------------------- /docs/Road to VR/Inside VR Design - The Clever Weapons, Locomotion, & Open-world of ‘Stormland’.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Inside VR Design - The Clever Weapons, Locomotion, & Open-world of ‘Stormland’ 4 | parent: Road to VR 5 | --- 6 | 7 | # Inside VR Design: The Clever Weapons, Locomotion, & Open-world of ‘Stormland’ 8 | Original article: [Inside VR Design: The Clever Weapons, Locomotion, & Open-world of ‘Stormland’](https://www.roadtovr.com/inside-vr-design-stormland-weapons-locomotion-open-world-analysis/) 9 | 10 | ## TL;DR 11 | "Stormland" is a VR action-adventure game that introduces unique features like ripping guns for reloading and crafting, as well as employing a multi-modal locomotion system for fluid movement. The game's open-world design, utilizing islands and cloud gliding, ensures engaging exploration without tedious walking. Clever comfort tricks like minimizing vection make high-speed movements comfortable, demonstrating a thoughtful balance between immersion and player ease. 12 | 13 | ## Bullet points 14 | 1. 
**🔫 Unique Weapon Reloading and Handling:** Stormland's weapon system is ingeniously designed to prioritize fun and fluidity over realism. Reloading involves ripping the gun apart, which is both entertaining and easy to perform, even under pressure. The mechanic serves a dual purpose by also providing crafting materials. Additionally, the game adds convenience features like allowing dropped guns to float temporarily, avoiding any break in gameplay due to accidental drops, making the game feel more engaging and accessible. 15 | 2. **🛠️ Streamlined Inventory Management:** Stormland implements unique solutions to ensure smooth and efficient inventory management. Guns and other items can float momentarily, which acts as a sort of third hand for the player while managing inventory. This design choice maintains the immersive experience while also making it easy for players to shuffle and interact with their inventory without unnecessary hindrance. 16 | 3. **🏃 Multi-modal Locomotion System:** The game combines different movement modes (thumbstick movement, climbing, and gliding) to enhance the player's sense of freedom and control. The design is meant to provide a seamless experience, such as the innovative wall grabbing system that requires less precision, thus making climbing more fluid and fun. These choices support Stormland's aim of providing a varied and enjoyable locomotion experience. 17 | 4. **✈️ Comfortable High-Speed Gliding:** Stormland’s gliding system offers an exhilarating way to travel across the game world without causing dizziness. Key design considerations, like minimal visual motion cues (vection) and controlling speed and direction through physical arm movement, make it comfortable. The game's visual field design and incorporation of the player's body movements in controlling gliding create a thrilling yet comfortable traveling experience. 18 | 5. **🌍 Innovative Open-World Design:** Stormland crafts an open-world experience that feels expansive without being tedious. By using islands as points of interest and providing fast and exciting ways to traverse between them, the game creates a sense of large-scale exploration without demanding unnecessary time investment. Upgrades like the ability to create cloud ramps add to the fun, letting the player explore and enjoy the open world seamlessly. 19 | 6. **🌳 Clever Techniques for VR Comfort:** In addition to the gliding mechanics, Stormland employs other techniques to ensure comfort during gameplay. One method is controlling vection by limiting fast-moving imagery, such as gliding only over clouds, a low-contrast texture, to reduce the sense of motion. The developers' intentional choices in designing locomotion, from speed to physical involvement, showcase a deep understanding of VR mechanics and human perception to create a comfortable and immersive experience. 20 | 21 | ## Keywords 22 | - **Stormland:** The title of an open-world action-adventure VR game by Insomniac Games. It features innovative mechanics related to weapons, locomotion, and world design. 23 | - **Revive:** Software that allows Oculus-exclusive games to be played on other VR headsets, such as those compatible with SteamVR. 24 | - **Locomotion:** Refers to the methods of movement within the VR environment. In Stormland, locomotion includes thumbstick movement, climbing, and gliding, allowing for horizontal and vertical travel. 25 | - **Vection:** The sense of motion a person experiences based purely on what is seen.
In VR, this can lead to dizziness if what is seen doesn't align with what the body senses. Stormland minimizes vection to make fast movement comfortable. 26 | - **Multi-modal:** Referring to the use of multiple methods or modes. In the context of Stormland, it refers to the combination of different types of locomotion (thumbstick movement, climbing, gliding) to create a fluid movement experience. 27 | - **Holster:** A place where a virtual weapon is kept in the game. Players pull guns out of their holsters to use them, and in Stormland, dropped guns return to the holster automatically after a few seconds. 28 | - **Traversal:** Refers to the act of moving or traveling across an area. In Stormland, this includes methods like gliding on clouds and creating ramps, facilitating quick and engaging movement through the game world. 29 | - **Crafting materials:** Items collected within a game that can be used to create or upgrade weapons and abilities. In Stormland, ripping guns apart provides both ammo and these materials. 30 | -------------------------------------------------------------------------------- /docs/Apple/Design considerations for vision and motion.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Design considerations for vision and motion 4 | parent: Apple 5 | --- 6 | # Design considerations for vision and motion 7 | Original video: [Design considerations for vision and motion](https://developer.apple.com/videos/play/wwdc2023/10078/) 8 | 9 | ## TL;DR 10 | The video discusses designing immersive experiences with focus on visual and motion comfort. It advises using depth cues and optimizing visual parameters to reduce eye strain. For motion, it recommends ensuring stability, limiting head-locked content, and wisely managing camera movements to prevent discomfort. 11 | 12 | ## Bullet Points 13 | - 🧠 **Understanding Depth Perception:** Design immersive experiences that align with the brain's interpretation of depth. This requires the use of correct visual depth cues like size, blur, motion, background, light, shadow, occlusion, and texture density. These cues enable the viewer's brain to converge the line of sight correctly, preventing discomforts such as double vision or eye fatigue. 14 | - 🚫 **Avoiding Visual Cue Conflicts:** Ensure there are no conflicting depth cues. Misalignments, like an object appearing larger but positioned behind a smaller one, can cause discomfort due to conflicting information about depth. All depth cues should consistently give the brain correct information, aligning the viewer's eyes to the depth intended. 15 | - 🔄 **Addressing Misleading Visual Cues:** Patterns can mislead depth perception. If each eye locks onto a different object in a repeating pattern, the resulting image may be perceived at a different depth than intended, causing discomfort. To mitigate this, use smaller portions of patterns or break them up with different designs. 16 | - 👓 **Optimizing Content Parameters:** Design visually comfortable content with parameters suited for each specific visual experience. Considerations include content size, contrast, and brightness. High contrast is beneficial for reading text, while lower contrast, transparency, or blur can redirect visual attention. 17 | - 👁️ **Reducing Eye Effort:** Minimize eye effort by placing content in comfortable viewing areas, primarily the center and slightly below the line of sight. 
Consider natural break points in the experience to allow for rest, and adjust content parameters based on the visual effort demanded from the viewer. 18 | - 🔄 **Maintaining Motion Comfort:** The brain needs to perceive the world as stationary for motion comfort. Conflicts between visual motion information and the motion perceived by the vestibular system can cause discomfort. Ensuring stable and well-designed content helps to avoid this. 19 | - 💠 **Managing Virtual Object Motion:** For large virtual objects that move, making them semi-transparent can avoid giving the viewer a sense of self-motion, ensuring motion comfort. 20 | - 🚫 **Avoiding Head-Locked Content:** Head-locked content, anchored to the user's head, can cause discomfort. Instead, use world-locked views or lazy-follow animations that move content slowly towards a destination over time; a minimal lazy-follow sketch appears after the keyword list below. 21 | - 🖼️ **Handling Window Motion:** To provide optimal motion comfort, align the virtual window's horizon with the real horizon and keep camera motion and the focus of expansion slow and predictable. 22 | - 🌊 **Avoiding Sustained Oscillations:** To ensure a comfortable experience, avoid oscillating motions, especially those with frequencies around 0.2 Hz. If such motions are necessary, keep the amplitude low, make the content semi-transparent, and provide an oscillation-free alternative. 23 | 24 | ## Keywords 25 | - **Visual Depth Cues:** Signals given to the brain about the spatial relationships of objects, aiding in perceiving depth. They include size, blur, motion, background, light, shadow, occlusion, and texture density. 26 | - **Converge:** The inward turning of the eyes to fixate on a near object. In this context, it refers to the viewer's eyes aligning to perceive the intended depth of an object in a 3D space. 27 | - **Double Vision:** A condition where an individual sees two images of a single object. It can occur if the viewer's eyes aren't aligning correctly due to inconsistent or misleading visual depth cues. 28 | - **Disparity:** The difference in image location of an object seen by the left and right eyes. Correct disparity is essential for comfortable stereo viewing. 29 | - **Eye Effort:** The amount of effort or strain the eyes undergo to view certain content. Reducing eye effort, by adjusting content location and parameters, enhances comfort. 30 | - **Head-Locked Content:** Content that maintains a fixed position relative to the user's head movement. In immersive experiences, world-locked or lazy-follow content is preferred to enhance comfort. 31 | - **Vestibular System:** The sensory system responsible for motion perception, head position, and spatial orientation. Conflicting visual and vestibular motion information can cause discomfort like dizziness or nausea. 32 | - **Oscillations:** Regular back-and-forth movements. Sustained oscillations, especially at around 0.2 Hz, can be visually uncomfortable and should be minimized or made less perceptible. 33 | - **Focus of Expansion:** The point in a visual scene from which motion vectors appear to radiate as one moves forward. Slow and predictable motion of this point aids in motion comfort. 34 | - **Reduce Motion Accessibility Setting:** A feature that minimizes certain interface movement effects to provide a simpler visual experience. Designers should offer oscillation-free alternatives when this setting is activated.
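As a concrete sketch of the lazy-follow alternative to head-locked content described above, here is a minimal Unity-style C# behaviour (the engine this handbook's case studies reference); the field names and tuning values are illustrative assumptions, not Apple's implementation.

```csharp
using UnityEngine;

// Minimal lazy-follow sketch: content eases toward a point in front of the
// viewer instead of being rigidly locked to the head. Values are illustrative.
public class LazyFollow : MonoBehaviour
{
    [SerializeField] Transform head;              // e.g., the XR camera transform
    [SerializeField] float distanceMeters = 1.2f; // resting distance ahead of the viewer
    [SerializeField] float smoothing = 2f;        // higher = snappier follow

    void LateUpdate()
    {
        // Rest slightly below the line of sight, which the guidance above
        // identifies as the most comfortable viewing zone.
        Vector3 target = head.position
                       + head.forward * distanceMeters
                       + Vector3.down * 0.1f;

        // Frame-rate-independent exponential ease-out: quick when far from
        // the target, gentle as it settles, avoiding the head-locked feel.
        float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);
        transform.position = Vector3.Lerp(transform.position, target, t);

        // Face the viewer without rolling.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
    }
}
```

Because the element only drifts toward its target, quick glances leave it roughly in place; it settles back into the comfortable zone only as the viewer's orientation changes deliberately.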
35 | -------------------------------------------------------------------------------- /Gemfile.lock: -------------------------------------------------------------------------------- 1 | GEM 2 | remote: https://rubygems.org/ 3 | specs: 4 | addressable (2.8.7) 5 | public_suffix (>= 2.0.2, < 7.0) 6 | bigdecimal (3.1.8) 7 | colorator (1.1.0) 8 | concurrent-ruby (1.3.4) 9 | em-websocket (0.5.3) 10 | eventmachine (>= 0.12.9) 11 | http_parser.rb (~> 0) 12 | eventmachine (1.2.7) 13 | ffi (1.17.0) 14 | ffi (1.17.0-aarch64-linux-gnu) 15 | ffi (1.17.0-aarch64-linux-musl) 16 | ffi (1.17.0-arm-linux-gnu) 17 | ffi (1.17.0-arm-linux-musl) 18 | ffi (1.17.0-arm64-darwin) 19 | ffi (1.17.0-x86-linux-gnu) 20 | ffi (1.17.0-x86-linux-musl) 21 | ffi (1.17.0-x86_64-darwin) 22 | ffi (1.17.0-x86_64-linux-gnu) 23 | ffi (1.17.0-x86_64-linux-musl) 24 | forwardable-extended (2.6.0) 25 | google-protobuf (4.29.0) 26 | bigdecimal 27 | rake (>= 13) 28 | google-protobuf (4.29.0-aarch64-linux) 29 | bigdecimal 30 | rake (>= 13) 31 | google-protobuf (4.29.0-arm64-darwin) 32 | bigdecimal 33 | rake (>= 13) 34 | google-protobuf (4.29.0-x86-linux) 35 | bigdecimal 36 | rake (>= 13) 37 | google-protobuf (4.29.0-x86_64-darwin) 38 | bigdecimal 39 | rake (>= 13) 40 | google-protobuf (4.29.0-x86_64-linux) 41 | bigdecimal 42 | rake (>= 13) 43 | http_parser.rb (0.8.0) 44 | i18n (1.14.6) 45 | concurrent-ruby (~> 1.0) 46 | jekyll (4.3.4) 47 | addressable (~> 2.4) 48 | colorator (~> 1.0) 49 | em-websocket (~> 0.5) 50 | i18n (~> 1.0) 51 | jekyll-sass-converter (>= 2.0, < 4.0) 52 | jekyll-watch (~> 2.0) 53 | kramdown (~> 2.3, >= 2.3.1) 54 | kramdown-parser-gfm (~> 1.0) 55 | liquid (~> 4.0) 56 | mercenary (>= 0.3.6, < 0.5) 57 | pathutil (~> 0.9) 58 | rouge (>= 3.0, < 5.0) 59 | safe_yaml (~> 1.0) 60 | terminal-table (>= 1.8, < 4.0) 61 | webrick (~> 1.7) 62 | jekyll-sass-converter (3.0.0) 63 | sass-embedded (~> 1.54) 64 | jekyll-seo-tag (2.8.0) 65 | jekyll (>= 3.8, < 5.0) 66 | jekyll-watch (2.2.1) 67 | listen (~> 3.0) 68 | just-the-docs (0.5.3) 69 | jekyll (>= 3.8.5) 70 | jekyll-seo-tag (>= 2.0) 71 | rake (>= 12.3.1) 72 | kramdown (2.5.1) 73 | rexml (>= 3.3.9) 74 | kramdown-parser-gfm (1.1.0) 75 | kramdown (~> 2.0) 76 | liquid (4.0.4) 77 | listen (3.9.0) 78 | rb-fsevent (~> 0.10, >= 0.10.3) 79 | rb-inotify (~> 0.9, >= 0.9.10) 80 | mercenary (0.4.0) 81 | pathutil (0.16.2) 82 | forwardable-extended (~> 2.6) 83 | public_suffix (6.0.1) 84 | rake (13.2.1) 85 | rb-fsevent (0.11.2) 86 | rb-inotify (0.11.1) 87 | ffi (~> 1.0) 88 | rexml (3.3.9) 89 | rouge (4.5.1) 90 | safe_yaml (1.0.5) 91 | sass-embedded (1.81.0) 92 | google-protobuf (~> 4.28) 93 | rake (>= 13) 94 | sass-embedded (1.81.0-aarch64-linux-android) 95 | google-protobuf (~> 4.28) 96 | sass-embedded (1.81.0-aarch64-linux-gnu) 97 | google-protobuf (~> 4.28) 98 | sass-embedded (1.81.0-aarch64-linux-musl) 99 | google-protobuf (~> 4.28) 100 | sass-embedded (1.81.0-aarch64-mingw-ucrt) 101 | google-protobuf (~> 4.28) 102 | sass-embedded (1.81.0-arm-linux-androideabi) 103 | google-protobuf (~> 4.28) 104 | sass-embedded (1.81.0-arm-linux-gnueabihf) 105 | google-protobuf (~> 4.28) 106 | sass-embedded (1.81.0-arm-linux-musleabihf) 107 | google-protobuf (~> 4.28) 108 | sass-embedded (1.81.0-arm64-darwin) 109 | google-protobuf (~> 4.28) 110 | sass-embedded (1.81.0-riscv64-linux-android) 111 | google-protobuf (~> 4.28) 112 | sass-embedded (1.81.0-riscv64-linux-gnu) 113 | google-protobuf (~> 4.28) 114 | sass-embedded (1.81.0-riscv64-linux-musl) 115 | google-protobuf (~> 4.28) 116 | sass-embedded 
(1.81.0-x86-cygwin) 117 | google-protobuf (~> 4.28) 118 | sass-embedded (1.81.0-x86-linux-android) 119 | google-protobuf (~> 4.28) 120 | sass-embedded (1.81.0-x86-linux-gnu) 121 | google-protobuf (~> 4.28) 122 | sass-embedded (1.81.0-x86-linux-musl) 123 | google-protobuf (~> 4.28) 124 | sass-embedded (1.81.0-x86-mingw-ucrt) 125 | google-protobuf (~> 4.28) 126 | sass-embedded (1.81.0-x86_64-cygwin) 127 | google-protobuf (~> 4.28) 128 | sass-embedded (1.81.0-x86_64-darwin) 129 | google-protobuf (~> 4.28) 130 | sass-embedded (1.81.0-x86_64-linux-android) 131 | google-protobuf (~> 4.28) 132 | sass-embedded (1.81.0-x86_64-linux-gnu) 133 | google-protobuf (~> 4.28) 134 | sass-embedded (1.81.0-x86_64-linux-musl) 135 | google-protobuf (~> 4.28) 136 | terminal-table (3.0.2) 137 | unicode-display_width (>= 1.1.1, < 3) 138 | unicode-display_width (2.6.0) 139 | webrick (1.9.0) 140 | 141 | PLATFORMS 142 | aarch64-linux 143 | aarch64-linux-android 144 | aarch64-linux-gnu 145 | aarch64-linux-musl 146 | aarch64-mingw-ucrt 147 | arm-linux-androideabi 148 | arm-linux-gnu 149 | arm-linux-gnueabihf 150 | arm-linux-musl 151 | arm-linux-musleabihf 152 | arm64-darwin 153 | riscv64-linux-android 154 | riscv64-linux-gnu 155 | riscv64-linux-musl 156 | ruby 157 | x86-cygwin 158 | x86-linux 159 | x86-linux-android 160 | x86-linux-gnu 161 | x86-linux-musl 162 | x86-mingw-ucrt 163 | x86_64-cygwin 164 | x86_64-darwin 165 | x86_64-linux 166 | x86_64-linux-android 167 | x86_64-linux-gnu 168 | x86_64-linux-musl 169 | 170 | DEPENDENCIES 171 | jekyll (~> 4.3.2) 172 | just-the-docs (= 0.5.3) 173 | 174 | BUNDLED WITH 175 | 2.5.16 176 | -------------------------------------------------------------------------------- /docs/Meta/Locomotion Best Practices.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Locomotion Best Practices 4 | parent: Meta 5 | --- 6 | 7 | # Locomotion Best Practices 8 | Original article: [Locomotion Best Practices](https://developer.oculus.com/resources/locomotion-design-techniques-best-practices/) 9 | 10 | ## TL;DR 11 | The article explores various methods to improve comfort and decrease motion sickness in VR experiences. These include maintaining a steady frame rate and precise head tracking, using Asynchronous Time Warp to reduce judder, and leveraging Independent Visual Backgrounds to lower discomfort, exemplified by a skybox reacting only to head movements. The benefits and implementation challenges of IVBs in VR applications are discussed. Simulated activities, like walking in place or climbing a ladder, are suggested as additional comfort-boosting techniques. Spatial sound effects are recognized for their role in reducing disorientation, especially during teleportation movements. 12 | 13 | ## Bullet Points 14 | 1. **🌄 Minimize Movement on Slopes:** Slopes and stairs increase discomfort in virtual environments due to the vertical accelerations and the induced vection. By limiting movement to flat terrains, users can have a more comfortable experience. 15 | 16 | 2. **🚶‍♂️ Design for Forward Movement:** People find forward movement most comfortable in VR environments. Strafing, back-stepping, and spinning should be minimized as they increase the likelihood of discomfort. 17 | 18 | 3. **🏰 Keep Walls at a Distance:** Walls and large structures that are close to the user can increase optic flow, which leads to discomfort. To mitigate this, design environments with open spaces or with barriers that distance users from walls. 
19 | 20 | 4. **🏢 Elevator and Stair Design:** Stairs and elevators present dense visual detail that can fill the user's field of view, potentially triggering discomfort. When necessary, design them to be less detailed and simple to minimize optic flow. 21 | 22 | 5. **⛰️ Stair & Slope Teleports:** In certain instances, it might be helpful to provide teleport nodes at the top and bottom of stairways. This offers an option for users who prefer to navigate without dealing with the discomfort that stairs might induce. 23 | 24 | 6. **🖼️ Consistent Frame Rate and Head Tracking:** Maintaining a consistent frame rate is crucial for VR comfort. Any inconsistencies can lead to "judder", a visible stutter caused by a mismatch between the virtual and physical camera position. 25 | 26 | 7. **🌌 Independent Visual Backgrounds (IVB):** Using IVBs can help to reduce discomfort by allowing the user's brain to reinterpret the visual motion as the world moving around them, rather than them moving through the world. 27 | 28 | 8. **🎮 Simulated Activities:** Controlling artificial locomotion through the re-enactment of physical activities (such as walking in place or climbing a ladder) can improve comfort by aligning proprioceptive and vestibular input with visual motion. 29 | 30 | 9. **🔊 Spatial Sound Effects:** Sound effects can help reduce disorientation during blink effects or when the environment is occluded, and they help position users within the environment. 31 | 32 | 10. **⚙️ Asynchronous Time Warp (ATW):** ATW is a re-projection feature that reduces the effect of judder when the application doesn't submit frames fast enough to keep up with the display refresh rate. 33 | 34 | ## Keywords 35 | - **Vection:** The sensation of movement of the body in space produced purely by visual inputs. In the context of this transcript, it refers to the visual perception of motion that causes discomfort in VR. 36 | 37 | - **Optic Flow:** The pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene. High optic flow can cause discomfort in VR due to the perceived motion. 38 | 39 | - **Teleportation (VR):** A common form of locomotion in VR that instantly moves the player from one place to another, thus reducing motion sickness. 40 | 41 | - **Judder:** A visual artifact that occurs in VR when the frame rate is inconsistent, producing an uncomfortable mismatch between the virtual and physical camera position. 42 | 43 | - **Asynchronous Time Warp (ATW):** A reprojection technique used in VR to maintain a high frame rate and prevent judder. It helps in reducing discomfort by making sure the image on the display keeps up with the user's head movement. 44 | 45 | - **Independent Visual Background (IVB):** A technique used in VR to reduce discomfort. It uses consistent imagery or geometry in the environment that aligns with the user's vestibular senses to trick the brain into thinking the user is stationary, and it's the world that's moving. 46 | 47 | - **Vestibular Senses:** The sensory system that contributes to the sense of balance and spatial orientation for coordination of movement. Discrepancies between vestibular senses and visual inputs in VR can cause discomfort. 48 | 49 | - **Proprioceptive Input:** The sense of the relative position of one's own body parts and the strength of effort being employed in movement. In VR, alignment of proprioceptive and vestibular input with visual motion can help improve comfort.
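Two of the entries above, IVB and vestibular senses, suggest a technique that is simple to prototype: keep a background layer such as a skybox anchored to the player's locomotion rig, so artificial movement never shifts it relative to the viewer, while real head motion still pans across it naturally. Below is a minimal, engine-agnostic sketch in Swift; `Node`, `rig`, and `skybox` are hypothetical stand-ins for whatever scene-graph types an engine provides, not anything from the original article.

```swift
// Minimal Independent Visual Background (IVB) sketch.
// Assumption: `rig` is the playspace node moved by artificial locomotion,
// and the head-tracked camera is a child of `rig` driven by the HMD.

struct Vector3 { var x = 0.0, y = 0.0, z = 0.0 }

final class Node {
    var position = Vector3()
    var yawDegrees = 0.0   // artificial snap/smooth turn applied to the rig
}

let rig = Node()     // moved by joystick locomotion and teleports
let skybox = Node()  // distant background used as the IVB

// Call once per rendered frame, before drawing.
func updateIVB() {
    // Pin the background to the rig: when the joystick translates or
    // rotates the rig, the background follows, so the viewer sees no
    // relative motion from artificial locomotion (no vection from it).
    // Real head rotation still pans across the skybox, because the
    // head-tracked camera moves independently inside the rig.
    skybox.position = rig.position
    skybox.yawDegrees = rig.yawDegrees
}
```

Because the background always agrees with the vestibular report that the body is stationary, the brain can attribute any remaining scene motion to the world, which is exactly the reinterpretation bullet 7 describes.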
50 | 51 | - **Binocular Disparity:** The difference in image location of an object seen by the left and right eyes, resulting from the eyes’ horizontal separation. The brain uses binocular disparity to extract depth information from the two-dimensional retinal images in stereopsis. 52 | 53 | - **Occlusion Depth Cues:** In the context of VR and 3D graphics, occlusion refers to the effect of one object in 3D space blocking another object from view. In terms of depth perception, if an object occludes another, it is perceived as being closer. 54 | -------------------------------------------------------------------------------- /docs/Meta/A blueprint for designing inclusive ARVR experiences.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: A blueprint for designing inclusive AR/VR experiences 4 | parent: Meta 5 | --- 6 | 7 | # A blueprint for designing inclusive AR/VR experiences 8 | Original article: [A blueprint for designing inclusive ARVR experiences](https://design.facebook.com/stories/a-blueprint-for-designing-inclusive-arvr-experiences/) 9 | 10 | ## TL;DR 11 | The blog post explores issues around safety, consent, and inclusivity in VR environments, discussing the impact of virtual harassment and the importance of body sovereignty. It emphasizes the need for inclusive interactions, user-friendly nonverbal cues, sign language, and an effective code of conduct. It also introduces the concept of 'safe zones' for users to escape uncomfortable situations in VR. The key points are the need for inclusive, safe, and respectful VR experiences, the impact of virtual embodiment, and the importance of ensuring consent-centric interactions in VR. 12 | 13 | ## Bullet points 14 | 1. 🏡 **The potential for VR as a new social space**: VR technology is pushing the boundaries of how people can interact with each other. Not just confined to gaming, but VR spaces provide a platform where people can share experiences, engage in social activities, or work together, all in a highly immersive and interactive environment. This signifies a whole new level of human interaction in the virtual realm. 15 | 16 | 2. 👩‍💻 **Challenges and issues**: While VR technology holds immense potential, it's not without its challenges. A key issue highlighted is that of virtual harassment, which can be particularly severe for women. This arises due to the feeling of physicality and presence that VR technology can instill, thus making the virtual experience feel very real and often uncomfortable. 17 | 18 | 3. 📚 **Blueprint for inclusive interactions**: To combat these challenges, the authors propose a design approach for creating inclusive social interactions within the VR environment. The principles of body sovereignty and consent are the main focal points, with the aim being to provide every user the right to control their virtual bodies and have a say in how others interact with them. 19 | 20 | 4. 💡 **VR and identity**: A key concept here is "virtual embodiment", which refers to the feeling of presence and identity in a virtual body. It's this concept that can make VR interactions feel very real and, in turn, have profound impacts on our perception of safety within VR spaces. 21 | 22 | 5. 👥 **Body ownership and consent**: Drawing parallels with real-world concepts, the authors propose the need to apply principles of consent and body sovereignty within the VR environment. 
This includes the right to personal space, to control one's virtual body, and to be free from unwanted interaction. 23 | 24 | 6. 🌐 **Adapting real-world norms**: There is an urgent need for developing etiquette and norms within VR that closely resemble those from the real world. By doing so, the environment becomes more understandable and safer, as people can easily identify and respond appropriately to the expectations of behavior. 25 | 26 | 7. 🖐️ **Nonverbal cues**: Incorporating nonverbal communication cues can enhance user experience and safety. The authors propose developing a kind of 'sign language' or gestures that can be quickly and universally recognized within VR. For instance, a user could make a specific gesture to report a problematic interaction instantly. 27 | 28 | 8. 📜 **Setting behavioral expectations**: Much like in colleges, offices, or public places, explicit codes of conduct can be put in place to ensure user safety. These can dictate what constitutes acceptable behavior and ensure any breach of these rules is swiftly acted upon. 29 | 30 | 9. 🛑 **Quick-action remediation**: Implementing features like "Safe Zone" can be of paramount importance to enhance user safety. This could act as an escape button that allows users to swiftly leave a situation or interaction that makes them uncomfortable. 31 | 32 | 10. 🚀 **Future prospects**: Looking forward, the authors stress the responsibility and commitment required of VR creators and developers. It's their duty to continuously develop and refine the principles and mechanisms of VR safety. The goal is to make VR a more inclusive, secure, and empowering technology that can be harnessed for good. 33 | 34 | ## Keywords 35 | - **Virtual Harassment**: Unwanted or hostile behavior executed through digital mediums, specifically within VR environments. It can have the same psychological effects as real-world harassment due to the immersive nature of VR technology. 36 | 37 | - **Inclusive Interactions**: Social interactions that are designed to be accessible and safe for all individuals, irrespective of their backgrounds or identities. These interactions aim to foster an environment of respect, acceptance, and equality. 38 | 39 | - **Virtual Embodiment**: The sense of presence and identification with a virtual body within a virtual reality environment. This feeling can contribute to the realism and impact of experiences within VR. 40 | 41 | - **Body Sovereignty**: The principle that an individual has the right to control their own body. Within the context of VR, this refers to users' rights to control their virtual bodies. 42 | 43 | - **Consent**: Agreement or permission for something to happen. Within VR, this pertains to agreeing to certain types of interactions or exchanges within the virtual environment. 44 | 45 | - **Nonverbal cues**: Nonverbal communication methods, like facial expressions, body language, or gestures. In VR, these could be programmed responses or actions that communicate specific messages or intentions. 46 | 47 | - **Codes of Conduct**: A set of rules outlining the social norms, responsibilities, and proper practices for an individual, party or organization. In the VR context, this refers to the behavioral norms and rules that users must abide by in VR spaces. 48 | 49 | - **Safe Zone**: Within VR, a 'Safe Zone' could refer to a specific virtual area or feature where users can retreat to avoid uncomfortable or distressing situations. 
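The "Safe Zone" entry above is the most directly implementable of these ideas: one always-available action that removes the user from an interaction and suppresses input from others until they choose to return. Here is a minimal sketch of that behavior in Swift; the article describes the feature only at the design level, so every type and callback name here (`teleport`, `muteIncomingAvatars`) is invented for illustration.

```swift
// Minimal "Safe Zone" escape-hatch sketch for a social VR app.
// All names are illustrative; the article specifies behavior, not an API.

struct Location { var x = 0.0, y = 0.0, z = 0.0 }

final class SafeZoneController {
    private var returnPoint: Location?
    private(set) var isInSafeZone = false
    private let safeZone = Location(x: 0, y: 0, z: 0)  // private space

    // Bind this to a dedicated, always-available gesture or button.
    func toggle(currentLocation: Location,
                teleport: (Location) -> Void,
                muteIncomingAvatars: (Bool) -> Void) {
        if isInSafeZone {
            // Return to where the user left off, restoring interaction.
            if let back = returnPoint { teleport(back) }
            muteIncomingAvatars(false)
        } else {
            // Record where we were, then escape immediately: no dialogs
            // and no confirmation, because speed is the point of the feature.
            returnPoint = currentLocation
            teleport(safeZone)
            muteIncomingAvatars(true)
        }
        isInSafeZone.toggle()
    }
}
```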
50 | 51 | - **Sign Language**: A language which, instead of acoustically conveyed sound patterns, uses visually transmitted sign patterns (manual communication, body language, and lip patterns) to convey meaning. In the context of VR, this refers to specific gestures users can make to communicate specific intentions or needs. 52 | -------------------------------------------------------------------------------- /docs/Google/Augmented reality design guidelines.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Augmented reality design guidelines 4 | parent: Google 5 | --- 6 | 7 | # Augmented reality design guidelines 8 | Original article: [Augmented reality design guidelines](https://developers.google.com/ar/design) 9 | 10 | ## TL;DR 11 | Google's AR design guidelines recommend creating immersive, user-friendly AR experiences with intuitive object interactions, smooth transitions, and well-defined object movement boundaries. To boost user engagement, visual and audio cues should encourage exploration, and clear error messages should assist users in rectifying errors. The interface should be visually clean, with a simple onboarding process. For multiplayer AR experiences, a seamless and intuitive shared virtual environment is crucial. The guidelines also advise on permissions and privacy, highlighting the importance of transparently communicating the need for specific app permissions. 12 | 13 | ## Bullet Points 14 | 1. 🎯 **Selection of virtual objects**: Enable users to select and interact with virtual objects seamlessly. This can be achieved through the use of highlighting techniques such as color differences, glowing outlines, or other distinct visual cues. The key is to ensure that these objects, despite their interactive potential, continue to behave like standard elements within the Augmented Reality (AR) environment. This reinforces the blended nature of the virtual and real world in AR. 15 | 16 | 2. 📦 **Translation of objects**: This refers to moving a virtual object along a surface or between different surfaces in the AR space. It is vital that this process is intuitive, either involving a simple on-screen dragging motion of the object or the physical movement of the device in the desired direction. The smoother the translation process, the more immersive the AR experience becomes. 17 | 18 | 3. 🌐 **Multiple surfaces**: Actively promote and facilitate the transition of virtual objects between various surfaces in the user's environment. This encourages the user to engage more dynamically with the AR world, thus enhancing the blend of virtual and real-world elements. However, be careful to avoid abrupt transitions or sudden changes in object scale as these can be jarring and disrupt the immersive experience. 19 | 20 | 4. 🛑 **Translation limits**: There should be clear and discernable boundaries that indicate the limits of an object's movement. This prevents situations where an object is moved beyond the field of view or to a distance that makes further interaction challenging. These boundaries can be indicated through visual cues such as grids, shadows, or changes in object color or opacity. 21 | 22 | 5. 🔄 **Rotation**: The AR design should allow users to rotate virtual objects freely in any direction. This can be manually controlled through touch gestures or can occur automatically through programmed animations. 
The design should be capable of supporting both single and multi-finger rotation gestures, offering a more interactive and flexible user experience. 23 | 24 | 6. 🔍 **Scaling**: Users should have the ability to alter the size of an object within the AR environment through gestures such as pinching or spreading fingers. However, to prevent an object from becoming either too small to see or too large for the AR space, designers should set minimum and maximum scaling limits. 25 | 26 | 7. ✋ **Gestures & Proximity**: When virtual objects are close together or overlap, it can be challenging for the user to select or interact with an individual object. To alleviate this, design should accommodate various two-finger gestures, such as pinching or a two-finger tap, to aid in distinguishing between close-proximity objects. 27 | 28 | 8. 🌈 **AR Initialization**: The transition from a 2D interface to the AR experience should be clear and smooth. Using visual cues such as a dimming display or a blurring effect can signal the impending transition to the user. Giving the user control over when the transition happens, such as through an 'Enter AR' button, can make the change less jarring. 29 | 30 | 9. 👥 **Multiplayer experience**: Aim to create a shared AR space where different users can interact with the same virtual objects. This can involve synchronizing surface detection across devices and providing visual or audio cues to guide users towards a shared space. Remember that multiplayer experiences can be complex, so aim to make the process of joining, connecting, and interacting as straightforward and user-friendly as possible. 31 | 32 | 10. 🎨 **UI Elements**: The User Interface (UI) should enhance, not detract from, the immersive AR experience. Avoid sudden pop-ups, full-screen takeovers, and persistent 2D overlays that constantly remind the user of the artificial nature of the AR world. The UI should be easy to navigate, employing familiar interaction models and supporting both landscape and portrait modes. This reduces the learning curve and allows users to focus more on the AR experience itself. 33 | 34 | ## Keywords 35 | - **AR (Augmented Reality)**: This is a technology that superimposes a computer-generated image on a user's view of the real world, providing a composite view. 36 | 37 | - **Translation**: In the context of AR, translation refers to moving a virtual object along a surface or from one surface to another. 38 | 39 | - **Rotation**: This is the process of turning an object around in any direction in the AR environment. 40 | 41 | - **Scaling**: In AR, scaling refers to the process of increasing or decreasing the size of a virtual object. 42 | 43 | - **Gestures**: These are specific physical actions, like pinching or dragging, that a user performs to interact with their device in the AR environment. 44 | 45 | - **Proximity**: In AR, proximity can refer to the relative distance or orientation of a user or object to other entities in the digital space. 46 | 47 | - **Multiplayer Experience**: In AR, this refers to an experience where different users can interact with the same virtual environment. 48 | 49 | - **UI (User Interface)**: This term refers to the graphical layout of an application. In AR, it includes the buttons users click on, the text they read, the images, sliders, and all the rest of the items the user interacts with. 50 | 51 | - **Onboarding**: This is a term for the mechanism through which new users are introduced to an application. 
In AR, it can involve explaining how to interact with the virtual environment. 52 | 53 | - **Error State**: This refers to a condition where the AR program or application encounters an unexpected situation or doesn't function as intended. 54 | -------------------------------------------------------------------------------- /docs/Google/VR Performance best practices.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: VR Performance best practices 4 | parent: Google 5 | --- 6 | 7 | # VR Performance best practices 8 | Original article: [VR Performance best practices](https://developers.google.com/vr/develop/best-practices/perf-best-practices) 9 | 10 | ## TL;DR 11 | The article guides on optimizing mobile VR applications, addressing key bottlenecks like fill rate, draw calls, and thermal issues. It advises on managing CPU and GPU workloads, handling audio, UI, and garbage collection, and using techniques like multithreaded rendering and vertex processing. Recognizing and mitigating thermal loads on mobile devices are also vital to maintain performance and enhance user experience. 12 | 13 | ## Bullet points 14 | 1. 🎮 **Identifying bottlenecks** is crucial for optimizing your application performance. Both the CPU and GPU can contribute to these bottlenecks. Performance testing on both ends is vital to identify areas of your application that may be causing slower than desired operation. 15 | 16 | 2. 🖥️ **GPU optimization** often revolves around managing fill rate problems. These problems are typically related to the speed at which the GPU can fill the frame buffer, a storage area containing pixel data. The main factors include fragment shader processing, which computes the pixel's final color, and texture bandwidth, the rate at which textures can be mapped onto surfaces. 17 | 18 | 3. 📉 **Diagnosing fill rate problems** in mobile VR can be as straightforward as adjusting the resolution of your application. If the frame rate of your application increases when the resolution is lowered, it is a good indication that your application is experiencing fill rate issues. 19 | 20 | 4. 🛠️ **Solutions to fix fill rate issues** vary, but they often include adjusting post-process effects that can be GPU-intensive, managing render target bandwidth to reduce the amount of data sent to the GPU, and optimizing texture bandwidth use by compressing textures or using mipmaps, for example. 21 | 22 | 5. 💡 **CPU optimization** often involves the efficient management of tasks such as draw calls, which are commands to the GPU to render objects, audio processing, physics simulations, custom scripts, UI updates, and garbage collection (automatic memory management) in Unity. These tasks can strain the CPU if not properly optimized. 23 | 24 | 6. 🔊 **Audio processing** can be optimized by reducing the sampling rate (the number of times per second audio is sampled), limiting the number of audio clips being played simultaneously, and using hardware decompression for audio sources, which uses the device's hardware to decompress audio data, reducing CPU usage. 25 | 26 | 7. 🎨 **Draw calls** can cause CPU bottlenecks if too many objects are being rendered in a single frame. Techniques to reduce this cost can include using multithreaded rendering, which allows draw calls to be processed on multiple CPU cores, or reducing the number of draw calls by merging similar objects into a single mesh, for example. 27 | 28 | 8. 
👾 **GPU vertex processing optimization** can improve frame rates. The vertex processing stage of the rendering pipeline involves transforming the vertices of a 3D model and calculating their lighting. By adding geometric detail to a 3D model, it's possible to reduce the need for a normal map (a texture that adds detail to a surface) or additional texture lookups, thereby potentially improving frame rate. 29 | 30 | 9. 🗑️ **Effective garbage collection** in Unity can contribute to maintaining a smooth frame rate in VR applications. Garbage collection is the automatic reclamation of memory that is no longer in use by the program. By avoiding per-frame memory allocations and creating all necessary objects at the beginning of your game, you can minimize the impact of garbage collection on your frame rate. 31 | 32 | 10. 🌡️ **Thermal issues** can cause significant performance drops after extended use of a mobile VR application. It is important to monitor Android device logs for signs of thermal throttling, a phenomenon where the device reduces its performance to cool down. Keeping track of the device temperature and managing the thermal loads effectively is essential to prevent overheating and ensure consistent performance. 33 | 34 | ## Keywords 35 | - **Bottlenecks**: These refer to the specific parts of a system that limit the overall performance. In the context of the transcription, it applies to elements in the application that reduce its operational speed. 36 | 37 | - **CPU (Central Processing Unit)**: The primary component of a computer that performs most of the processing. In gaming applications, it's responsible for tasks like draw calls, physics simulations, audio processing, UI updates, and memory management. 38 | 39 | - **GPU (Graphics Processing Unit)**: A specialized processor designed to accelerate graphics rendering. In gaming applications, the GPU handles tasks like fragment shader processing and texture mapping. 40 | 41 | - **Fill Rate**: The speed at which the GPU can fill the frame buffer, or in other words, render and display frames on the screen. Fill rate issues can reduce the frame rate of an application. 42 | 43 | - **Fragment Shader**: A type of shader in graphics used to compute the final color of a pixel. They can contribute to fill rate problems if not properly optimized. 44 | 45 | - **Render Target Bandwidth**: The rate at which data is sent to the GPU for rendering. Optimizing the render target bandwidth can help improve GPU performance. 46 | 47 | - **Draw Calls**: Commands sent to the GPU to render objects. If not efficiently managed, draw calls can strain the CPU and reduce the performance of an application. 48 | 49 | - **Audio Processing**: The task of generating and managing sounds within an application. Efficient audio processing involves reducing CPU usage while maintaining good sound quality. 50 | 51 | - **Multithreaded Rendering**: A technique that allows draw calls to be processed on multiple CPU cores, thereby improving the performance of an application. 52 | 53 | - **GPU Vertex Processing**: A stage in the GPU rendering pipeline that involves transforming the vertices of a 3D model and calculating their lighting. Optimizing vertex processing can improve frame rates. 54 | 55 | - **Garbage Collection**: An automatic memory management system that reclaims memory no longer in use by the program. Effective garbage collection in VR applications can contribute to maintaining a smooth frame rate. 
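The garbage collection entry above, and bullet 9's advice to create all necessary objects at the beginning, usually translate into an object pool: allocate once at load time, then only recycle references on the hot path. The article targets Unity and C#, but the pattern is language-agnostic; below is an illustrative Swift sketch with invented names.

```swift
// Generic object pool: preallocate at startup so the per-frame hot path
// performs no allocations that would later trigger a garbage-collection
// (or allocator) pause. Mirrors the article's advice for Unity.

final class Pool<T: AnyObject> {
    private var free: [T] = []

    init(count: Int, make: () -> T) {
        free.reserveCapacity(count)
        for _ in 0..<count { free.append(make()) }  // one-time cost at load
    }

    func acquire() -> T? { free.popLast() }          // O(1), no allocation
    func release(_ object: T) { free.append(object) }
}

final class Projectile { var active = false }

let projectiles = Pool(count: 128) { Projectile() }  // created at startup

func fire() {
    // Gameplay path: reuse an instance instead of calling `Projectile()`.
    guard let p = projectiles.acquire() else { return }  // pool exhausted
    p.active = true
    // ... on impact or timeout: p.active = false; projectiles.release(p)
}
```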
56 | 57 | - **Thermal Throttling**: A phenomenon where a device reduces its performance to cool down. It's essential to manage thermal loads effectively to prevent overheating and ensure consistent performance in mobile VR applications. 58 | -------------------------------------------------------------------------------- /docs/Road to VR/The Design & Implementation of Hand-tracking in ‘Myst’.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: The Design & Implementation of Hand-tracking in ‘Myst’ 4 | parent: Road to VR 5 | --- 6 | 7 | # The Design & Implementation of Hand-tracking in ‘Myst’ 8 | Original article: [The Design & Implementation of Hand-tracking in ‘Myst’](https://www.roadtovr.com/design-implementation-of-hand-tracking-in-myst-oculus-quest/) 9 | 10 | ## TL;DR 11 | The article details hand tracking development for the VR game Myst using Unreal Engine. Custom solutions were implemented for gesture recognition, movement control, and multi-platform compatibility. The process highlights the complexity of creating a seamless hand tracking experience in VR. 12 | 13 | ## Bullet points 14 | 1. **👉 Pointing Mechanism for Movement**: The game adopts a natural pointing gesture for movement, allowing the player to teleport or move smoothly by pointing with their free-movement-dominant hand. To deal with inconsistencies in finger tracking when occluded, some adjustments were made to the code to ensure stable movement execution. This design leverages intuitive human gestures to enable seamless navigation within the game, incorporating extensive playtesting to find the perfect balance. 15 | 2. **👍 Thumbs-up Gesture for Turning**: Turning introduced complications with the pointing method, so a thumbs-up gesture was utilized instead. By giving a thumbs-up and rotating the wrist left or right, the player can turn smoothly or with snap turns. This solution, while not initially intuitive, was found to be the most comfortable and consistent way to enable turning within the game, showcasing the willingness to explore unconventional but effective design solutions. 16 | 3. **🚫 Handling Conflicts with Object Interaction**: Pointing is also a common interaction gesture, so the team had to distinguish between navigation pointing and interaction pointing. A range of 25 cm from the fingertip to the interactable object was established as the sweet spot to prevent unintended movement during interaction. This demonstrates the meticulous attention to real-world scenarios and user behaviors, enabling a more engaging and less frustrating player experience. 17 | 4. **✋ Designing Object Interactions with Hand Tracking**: Grabbable interactions were built to work with hand tracking by assessing when the fingers were curled enough to grab something, mimicking the behavior with Touch controllers. For buttons, capsule colliders between finger joints were used, allowing for diverse interaction methods. This multi-layered approach to hand interactions in Myst reflects a comprehensive understanding of both the technical possibilities and the intuitive user behaviors within a virtual environment. 18 | 5. **🖐️ Menu/UI Interactions**: To ensure consistency with existing user expectations, the two-finger pinch interaction paradigm from Meta's Quest Platform was used for menu interactions. 
This wise reuse of established interaction patterns demonstrates a user-centric approach to design, minimizing learning curves and allowing players to dive into the experience without the need to relearn basic interactions. 19 | 6. **📢 Communicating the Hand Tracking Experience**: Recognizing that hand tracking may be unfamiliar to some players, the team included specialized notifications and reminders for how to optimize hand tracking, such as proper lighting and keeping hands within the field of view. This emphasizes the team's commitment to user education and support, ensuring that players are set up for success and enjoyment within the game. 20 | 7. **🛠️ System Gesture Spam Fix & Multi-Platform Build Stability**: The Oculus left-hand system gesture was triggering prematurely during the pinch motion. The team resolved this by modifying the Oculus Input library to confirm the pinch completion before initiating the notify event. This ensures that the system gesture's confirmation circle is filled in before activating the associated event. A challenge was encountered with the Oculus Hand Component breaking builds for non-Windows and non-Android platforms like Xbox during nativization. The solution was to manage the Oculus Hand Component in C++, spawning and destroying it depending on hand tracking detection, and restricting this functionality to Android builds specifically for Quest. 21 | 8. **🎮 Custom Whole-Hand Gesture Recognition**: Unreal Engine lacked built-in functionality for recognizing specific whole-hand gestures like thumbs-up or finger-pointing. The team designed their own bone rotation detection within the Oculus Hand Component to recognize various hand gestures such as finger pointing, grabbing, and thumbs-up. These were then transformed into input events accessible in C++, enabling more intuitive and versatile player interactions. 22 | 9. **🤏 Gesture & Tracking Stability Adjustments**: The document outlines quirks in tracking stability for particular hand gestures, such as grabbing with fingers facing away or pointing the index finger directly away. Tracking inaccuracies were noted in these scenarios, sometimes misinterpreting the hand's pose. The solution was to implement individual finger thresholds to control how relaxed a finger could be before it was considered 'not grabbing' or 'not pointing.' This enabled more precise and consistent gesture interpretation. 23 | 10. **🧰 Other Handy Utilities for Oculus Hand Component**: The team introduced additional modifications and utility functions to the Oculus Hand Component. One example is a utility to find the closest point on the hand's collision from another point in world space, returning the name of the closest bone. This allowed for enhanced input verification across interactions. A consistent tracking point, such as the wrist bone, was frequently used for reliability across different hand depths. 24 | 25 | ## Keywords 26 | - **Gesture Detection**: Technology recognizing human gestures through algorithms. 27 | - **System Gesture Spam Fix**: Specific solution to a problem with early triggering of gestures. 28 | - **Multi-Platform Build Stability**: Development of software for multiple platforms. 29 | - **Nativization**: Converting Blueprint scripts into C++ in Unreal Engine. 30 | - **Blueprint**: Visual scripting system in Unreal Engine. 31 | - **Bone Rotation Detection**: Detecting positions and movements of virtual hands. 32 | - **Pointing Mechanism**: Method using a pointing gesture for game control. 
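Bullets 8 and 9 above describe the core of the Myst approach: per-finger curl measurements compared against individual thresholds, with enough slack that a slightly relaxed finger does not flicker between states. The following Swift sketch illustrates that idea with hysteresis; the curl representation, threshold values, and gesture set are invented for illustration and are not the game's actual constants.

```swift
// Sketch of per-finger gesture classification with hysteresis, in the
// spirit of Myst's hand-tracking fixes. Curl is normalized: 0 = straight,
// 1 = fully curled. All threshold values are made up for illustration.

struct HandPose {
    var thumb = 0.0, index = 0.0, middle = 0.0, ring = 0.0, pinky = 0.0
}

enum Gesture { case point, grab, thumbsUp, none }

final class GestureClassifier {
    private var current: Gesture = .none

    // Hysteresis: a finger must curl past `curlEnter` to count as curled,
    // but must relax below `curlExit` before it stops counting. This keeps
    // a slightly relaxed finger from flickering between states.
    private let curlEnter = 0.7
    private let curlExit = 0.5

    private func isCurled(_ value: Double, wasCurled: Bool) -> Bool {
        wasCurled ? value > curlExit : value > curlEnter
    }

    func classify(_ pose: HandPose) -> Gesture {
        let fingersCurled = [pose.index, pose.middle, pose.ring, pose.pinky]
            .allSatisfy { isCurled($0, wasCurled: current == .grab) }

        if fingersCurled && pose.thumb < curlExit {
            current = .thumbsUp      // fist with thumb extended
        } else if fingersCurled {
            current = .grab          // all four fingers curled
        } else if pose.index < curlExit && pose.middle > curlEnter {
            current = .point         // index extended, middle curled
        } else {
            current = .none
        }
        return current
    }
}
```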
33 | - **Teleport Mode**: Instantly moving to a pointed destination in the game. 34 | - **Smooth Movement Mode**: Continuous movement in a directed path by hand pointing. 35 | - **Occluded**: An object hidden from view, interfering with tracking. 36 | - **Fudge Factor**: Tweak or adjustment made to code for stable movement. 37 | - **Turning**: Action of rotating player's view in the game. 38 | - **Wrist Rotation**: Turning of the wrist, related to game gesture. 39 | - **Snap Turning**: Instant turning to a new direction in the game. 40 | - **Interactable Object**: Object that player can engage with in the game. 41 | - **World Space Location**: Coordinate system defining 3D object locations. 42 | - **Colliders**: Used to detect collisions between objects in games. 43 | - **OS-level Menus**: Operating system menus for configuring settings. 44 | -------------------------------------------------------------------------------- /docs/Others/XR Accessibility User Requirements.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: XR Accessibility User Requirements 4 | parent: Others 5 | --- 6 | 7 | # XR Accessibility User Requirements 8 | Original article: [XR Accessibility User Requirements](https://www.w3.org/TR/xaur/) 9 | 10 | ## TL;DR 11 | This document outlines accessibility needs for XR. The focus is on inclusivity for users with disabilities through features like motion-agnostic interactions, voice commands, and customization. XR's broad applications require addressing challenges to ensure accessibility for diverse disabilities and integrating guidelines like WCAG and WebXR API for a universally inclusive experience. 12 | 13 | ## Bullet points 14 | 1. **🌐 Extended Reality (XR)**: XR encompasses immersive technologies from virtual environments to real-world augmentations. It represents the merging of augmented reality (AR), virtual reality (VR), and other immersive tech. The emphasis is on its potential breadth and the need to make such experiences universally accessible to everyone. 15 | 16 | 2. **🎮 XR Spectrum**: XR is versatile. Whether it's the immersive nature of VR using head-mounted displays that transport users to completely digital worlds or AR's mobile device overlays that enhance our real-world surroundings, the breadth of experiences in XR is continually expanding. 17 | 18 | 3. **⚠️ Accessibility Challenges**: As XR technologies advance, there's an increasing responsibility to ensure that these experiences are accessible to all, considering users with disabilities and diverse needs. Accessibility shouldn't be an afterthought but integral to XR design. 19 | 20 | 4. **🎲 Disability in Gaming**: While XR gaming offers unprecedented immersion, it also presents unique challenges. Motion-based controls, while innovative, may exclude some users. Additionally, hardware-specific games can limit who can participate, creating barriers to a fully inclusive gaming environment. 21 | 22 | 5. **🚹 Disability Categories**: Designing XR experiences requires a comprehensive understanding of disabilities, from auditory impairments to cognitive challenges and physical limitations. Addressing the diverse needs of each category ensures a more inclusive XR space. 23 | 24 | 6. **🖐️ Multimodality**: XR stands out by engaging multiple senses simultaneously. Whether it's seeing a lifelike simulation, feeling haptic feedback, or hearing 3D audio, XR aims to create a rich, multi-sensory environment that mirrors real-world interactions. 
25 | 26 | 7. **⌨️ Input/Output in XR**: The ways users can interact with XR are diverse. Traditional inputs like keyboards and mice are joined by gesture recognition, eye tracking, and more. Similarly, outputs aren't just visual; they can range from tactile feedback to spatial audio cues. 27 | 28 | 8. **🔍 Affordances**: In XR, clarity in interaction is paramount. Users should easily discern possible actions or interactions in the virtual space, making experiences both intuitive and accessible to everyone, regardless of prior XR exposure. 29 | 30 | 9. **🌍 WCAG & WebXR**: As the web evolves to include XR experiences, adherence to the Web Content Accessibility Guidelines (WCAG) remains crucial. Integrating these guidelines with the WebXR API is a significant step toward a more inclusive digital realm. 31 | 32 | 10. **🔖 Immersive Semantics**: Within XR, elements should be self-explanatory. Users, especially those using assistive technologies, should find navigation seamless, objects identifiable, and interactions straightforward. This clarity enhances the overall user experience. 33 | 34 | 11. **🕺 Motion Agnostic Interactions**: Not all users can (or want to) rely on extensive bodily movements in XR. Designing for motion-agnostic interactions ensures that physical disabilities don't prevent anyone from enjoying immersive experiences. 35 | 36 | 12. **🎨 Immersive Personalization**: Personalizing XR is especially crucial for users with cognitive and learning disabilities. Customizable overlays, the ability to mute potentially distracting content, and accommodating personal preferences can make XR more accessible and enjoyable. 37 | 38 | 13. **🎯 Interaction Customization**: Users with visual or mobility challenges might need tailored interactions. XR designs should be adaptable, offering larger interaction targets and simplifying gestures, ensuring everyone can navigate with ease. 39 | 40 | 14. **🎤 Voice & Gestural Interactions**: Voice commands and gestures offer alternative ways to interact within XR, especially beneficial for those with mobility restrictions. By recognizing and responding to voice or movement, XR becomes more universally accessible. 41 | 42 | 15. **✋ Signing & Descriptions**: For the deaf or hard of hearing, visual communication methods, such as sign language, can be more effective. Incorporating signing avatars or providing detailed visual cues ensures these users receive the full experience in XR. 43 | 44 | ## Keywords 45 | - **Accessibility**: The design of products, devices, services, or environments to be usable by people with disabilities. 46 | - **Web Content Accessibility Guidelines (WCAG)**: A set of guidelines that ensure web content is accessible to all, including people with disabilities. 47 | - **WebXR API**: A web-based application programming interface (API) used for creating XR experiences that are viewable in a web browser. 48 | - **Multimodality**: Refers to the use of multiple modes of communication or sensory experiences simultaneously. 49 | - **Auditory Disabilities**: Hearing impairments or conditions that impact one's ability to hear. 50 | - **Cognitive Disabilities**: Impairments or conditions that affect one's cognitive functions, such as memory, attention, or problem-solving. 51 | - **Neurological Disabilities**: Disorders that affect the nervous system. 52 | - **Physical Disabilities**: Physical conditions that can limit mobility or limb functions. 53 | - **Speech Disabilities**: Conditions that affect one's ability to communicate verbally. 
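Bullets 11, 13, and 14 above converge on a single implementation idea: decouple abstract actions from the modality that triggers them, so no one physical capability is ever required. The sketch below shows one way that routing might look in Swift; all identifiers are invented for illustration.

```swift
// Motion-agnostic input abstraction: any modality can raise the same
// abstract action, so the app never depends on one physical capability.

enum Action: Hashable { case select, back, teleport }
enum Modality { case handGesture, voiceCommand, dwellGaze, gamepad }

final class ActionRouter {
    private var handlers: [Action: [() -> Void]] = [:]

    func on(_ action: Action, run handler: @escaping () -> Void) {
        handlers[action, default: []].append(handler)
    }

    // Every input backend funnels through here; the app logic never
    // knows (or cares) whether the user pinched, spoke, or dwelled.
    func trigger(_ action: Action, via modality: Modality) {
        handlers[action]?.forEach { $0() }
    }
}

let router = ActionRouter()
router.on(.select) { print("object selected") }

// Equivalent triggers from different user abilities:
router.trigger(.select, via: .handGesture)   // pinch
router.trigger(.select, via: .voiceCommand)  // "select"
router.trigger(.select, via: .dwellGaze)     // eyes-only dwell
```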
54 | - **Visual Disabilities**: Impairments or conditions that affect one's ability to see. 55 | - **Robust Affordances**: In the context of accessibility, it refers to clear and distinct cues or features that indicate how an object or element can be interacted with, making it easily perceivable by users. 56 | - **Modality**: In the context of XR, it refers to the specific form or method of interaction, such as voice commands or touch gestures. 57 | - **Sticky Keys**: A feature that assists users who may have difficulty pressing multiple keys simultaneously, allowing them to press one key at a time for keyboard shortcuts. 58 | - **Spatialized Augmented Reality**: AR experiences that are mapped and positioned in physical space, enhancing the sense of immersion and realism. 59 | - **Mono Audio**: Audio that is heard through a single speaker or earpiece, beneficial for users with hearing loss or spatial orientation impairments. 60 | - **RTC Application**: A Real-Time Communication application that enables live communication between users in XR environments. 61 | - **Vestibular Disorders**: Conditions affecting the inner ear and balance system, which may cause issues in XR experiences. 62 | - **Immersive Captions Community Group**: A community group within W3C dedicated to contributing to accessibility standards for immersive experiences, focusing on captioning and subtitling. 63 | - **W3C (World Wide Web Consortium)**: An organization developing web standards to ensure accessibility and compatibility. 64 | - **Motion Agnostic**: Not dependent on specific physical movements; applicable to users with disabilities. 65 | - **Context Sensitive**: Adaptation based on the current situation or environment. 66 | -------------------------------------------------------------------------------- /docs/Road to VR/A Concise Beginner’s Guide to Apple Vision Pro Design & Development.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: A Concise Beginner’s Guide to Apple Vision Pro Design & Development 4 | parent: Road to VR 5 | --- 6 | 7 | # A Concise Beginner’s Guide to Apple Vision Pro Design & Development 8 | Original article: [A Concise Beginner’s Guide to Apple Vision Pro Design & Development](https://www.roadtovr.com/apple-vision-pro-design-development-beginner-guide-sterling-crispin/) 9 | 10 | ## TL;DR 11 | The document introduces visionOS, the operating system that powers Apple's XR headset, along with features like spatial audio and tool compatibility. It provides guidelines for spatial computing, including VR and AR, focusing on user sensory considerations and design principles. The text also highlights spatial computing weaknesses and emphasizes rapid development strategies, offering insights for developers in immersive technologies. 12 | 13 | ## Bullet points 14 | 1. 🖼️ **Scenes in visionOS (Windows, Volumes, and Spaces)**: visionOS organizes applications into three types of spatial contexts: Windows, Volumes, and Spaces. Windows are rectangular displays akin to traditional computer screens, while Volumes represent 3D or interactive objects, and Spaces are immersive environments that provide full control. Immersion styles within Spaces may vary from mixed to full, defining the extent of the real world to be seen. 15 | 2. 🎮 **User Input and Interaction Methods**: Users can interact with visionOS using various methods, from traditional gestures like pinching and tapping on floating windows to using Bluetooth trackpads or game controllers.
Voice input and Dwell Control are also available. Developers need not worry about the origin of these events, focusing on handling them through elements like TapGesture. 16 | 3. 🎧 **Spatial Audio in Vision Pro**: Vision Pro's advanced spatial audio system gives a realistic sound experience by considering the room's physical attributes. Developers must prioritize using sound design for UI interaction and creating immersive experiences, acknowledging the importance of spatial audio in enhancing the user experience. 17 | 4. 🛠️ **Development Tools and Platforms**: Depending on the target platforms, developers can choose between Apple's development ecosystem (using tools like Xcode, SwiftUI, RealityKit, ARKit, and Reality Composer Pro) or Unity for building fully immersive VR experiences that are compatible with various headsets. Apple's tools enable quick development with built-in assets, while Unity offers more extensive control and compatibility with non-Apple VR systems. 18 | 5. 📲 **Transitioning iOS Apps and Product Design Considerations**: Existing iOS apps can transition to visionOS as floating Windows, with the Ornament API allowing customization. Understanding spatial apps requires experiencing them firsthand (e.g., through Quest 2 or 3), and keeping a learning diary can be invaluable. Thoughtful design, understanding user needs, and avoiding overly complex or physically demanding interactions are paramount. 19 | 6. 📐 **Prototyping and Spatial Formats**: The design process for spatial apps may involve physical prototyping with paper or cardboard, or digital tools like ShapesXR. The app's purpose should be reflected in its spatial arrangements, with consideration for user comfort and familiarity. High-end tools like the Varjo XR-3 might be considered for those with substantial budgets. The prototyping phase emphasizes user-centric design, ensuring the application's spatial components align with user needs and expectations while exploring various interaction models. 20 | 7. 🧠 **Visual and Perceptual Comfort:** The design of spatial computing interfaces should prioritize the sensory experience of the user. Ergonomics and cognitive impacts must be considered to minimize discomfort and motion sickness. Users may attribute discomfort to hardware, but it's the app's responsibility to ensure comfort. 21 | 8. 🎨 **UI and Web Design:** A blend of standard practices and innovation is key for intuitive interaction. Using tools like WebXR, designers can turn websites into immersive VR experiences. A balance between subtlety in space and motion and avoiding excessive 3D effects is essential. 22 | 9. 🎮 **Games and Media Development:** Choices of tools like Unity or Apple's ecosystem can significantly impact the development process. Varied contexts for gameplay mechanics, mixed inputs, and asymmetry should be explored. Existing tools like Apple's Game Porting Toolkit could facilitate the integration of high-end PC games. 23 | 10. 🏢 **Strengths, Weaknesses, and Strategies for Spatial Computing in Business and Products:** Spatial computing's strength lies in teaching spatial tasks, viewing things at scale, expressive applications, and immersive media. Limitations exist in activities involving quick movements and long-term use. A lean approach to development that encourages fast failure, pivots, and focuses on solving real problems can lead to success. For existing products, the focus should be on identifying specific spatial moments that enhance the experience while avoiding unnecessary complexity.
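To make the scene model from bullet 1 concrete, the sketch below shows roughly how the three containers are declared in SwiftUI, using Apple's documented visionOS scene APIs (`WindowGroup`, `.windowStyle(.volumetric)`, `ImmersiveSpace`, and `.immersionStyle`). The three content views are hypothetical placeholders, and details such as opening the space at runtime (via the documented `openImmersiveSpace` environment action) are omitted.

```swift
import SwiftUI

// Hypothetical placeholder views; any SwiftUI/RealityKit content works here.
struct PanelView: View { var body: some View { Text("Hello, spatial world") } }
struct ModelView: View { var body: some View { Text("3D content here") } }
struct EnvironmentView: View { var body: some View { Text("Immersive scene") } }

@main
struct SketchApp: App {
    // Spaces can be mixed, progressive, or fully immersive (bullet 1).
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        // Window: a rectangular panel, like a traditional screen.
        WindowGroup(id: "panel") {
            PanelView()
                .onTapGesture { /* fires on gaze-and-pinch, touch, or trackpad */ }
        }

        // Volume: a bounded region for 3D content the user can walk around.
        WindowGroup(id: "model") {
            ModelView()
        }
        .windowStyle(.volumetric)

        // Space: a fully immersive environment under the app's control.
        ImmersiveSpace(id: "environment") {
            EnvironmentView()
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```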
24 | 25 | 26 | ## Keywords 27 | - **Volumes**: These are 3D objects or small interactive scenes in visionOS. They can be things like a 3D map or a small game that floats in front of the user. 28 | - **Spaces**: In visionOS, these are fully immersive experiences where only one app is visible. Spaces can be mixed, progressive, or fully immersive, defining how much of the real world the user can see. 29 | - **TapGesture**: A user input method where actions are triggered by tapping. 30 | - **Dwell Control**: An accessibility feature in visionOS that allows eyes-only input. 31 | - **SwiftUI**: A user interface toolkit that allows developers to design apps in a declarative way for Apple devices. 32 | - **RealityKit**: A framework used for rendering 3D objects, materials, and light simulations in Apple's ecosystem. 33 | - **ARKit**: Apple's framework for augmented reality (AR) development, allowing for advanced scene understanding and interaction with real-world objects. 34 | - **Reality Composer Pro**: A 3D content editor specific to Apple's development stack, allowing developers to drag objects around a 3D scene. 35 | - **PlayStation VR**: Sony's virtual reality system used with PlayStation gaming consoles. 36 | - **PolySpatial tool**: A tool in Unity for converting materials, shaders, and other features. 37 | - **Ornament API**: An API used to create floating islands of UI for spatial effect in existing iOS apps. 38 | - **IDEO Design Thinking**: A methodology used for creative problem solving, emphasizing empathy with users, defining problems, ideation, prototyping, and testing. 39 | - **ShapesXR**: An app used for sketching ideas in space and creating storyboards for spatial apps. 40 | - **Passthrough AR**: A feature that allows users to see the real world through cameras on a VR headset, creating augmented reality experiences. 41 | - **Varjo XR-3**: A high-end virtual and augmented reality headset known for its quality and high price. 42 | - **Cognitive Impacts:** Relating to the mental processes of perception, memory, judgment, and reasoning. In this context, it refers to how design choices can affect these mental processes. 43 | - **Fitts' Law:** A principle that predicts the time required to rapidly move to a target area, as a function of the ratio of the distance to the target and the width of the target. It's used in human-computer interaction and ergonomics. 44 | - **WebXR:** A web framework that enables the creation of Virtual Reality (VR) and Augmented Reality (AR) experiences on web browsers. 45 | - **2.5D:** A term describing the visual depth and perspective in a project that is otherwise two-dimensional. It represents a structure that has a 3D look but doesn’t fully exist in three-dimensional space. 46 | - **Game Porting Toolkit:** A tool for adapting or porting games from one platform to another, mentioned in the context of bringing PC games to Mac. 47 | - **Lean Startup:** A methodology for developing businesses and products that aim to shorten product development cycles and rapidly discover if a proposed business model is viable. 
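One entry above is worth spelling out numerically: Fitts' Law. In its standard HCI formulations, predicted movement time MT grows with the ratio of target distance D to target width W, with constants a and b fitted empirically per input device and task:

```latex
% Fitts' Law: predicted time MT to acquire a target of width W at distance D.
% a and b are empirical constants fitted per input device and task.
MT = a + b \log_2\!\left(\frac{2D}{W}\right)    % Fitts' original formulation
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right) % Shannon formulation
```

This is why spatial UI guidance throughout this handbook favors large, well-spaced targets: shrinking W or increasing D raises both selection time and error rates, which matters even more when the pointer is eye gaze.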
48 | 49 | -------------------------------------------------------------------------------- /docs/Meta/Designing Accessible VR.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Designing Accessible VR 4 | parent: Meta 5 | --- 6 | 7 | # Designing Accessible VR 8 | Original article: [Designing Accessible VR](https://developer.oculus.com/resources/design-accessible-vr/) 9 | 10 | ## TL;DR 11 | The transcript emphasizes the need for designing accessible VR experiences by focusing on user variability, spatialized audio, Voice User Interfaces (VUIs), and low vision simulations. It recommends strategies to manage cognitive overload and motion sickness, such as simplifying user interfaces and optimizing locomotion methods. It further underscores the importance of customizing captions and subtitles, including font, location, speed, and color, to enhance reading in VR. 12 | 13 | ## Bullet points 14 | 1. 🌐 **User Variability in VR**: Every individual has a unique set of abilities, comfort levels, and preferences. This variability, encompassing factors like physical abilities, cognitive abilities, sensory abilities, and prior experiences, needs to be considered when designing VR experiences. Accommodating this variability can lead to more inclusive and comfortable VR experiences. 15 | 16 | 2. 👥 **User Research in VR Design**: User research is a process of understanding users' behaviors, needs, and motivations through observation techniques, task analysis, and other feedback methodologies. By gathering user data, designers can make evidence-based decisions, creating VR experiences that meet user needs and expectations, thereby enhancing user engagement. 17 | 18 | 3. 🎮 **UI Design in VR**: User Interface (UI) in VR should be designed such that it's easy to understand and navigate. This includes making text easily readable, ensuring the buttons are of an appropriate size and adequately spaced, and providing customization options. This helps in accommodating the varying preferences and needs of individual users, enhancing their VR experience. 19 | 20 | 4. 💡 **Cognitive Load in VR**: Cognitive load refers to the total amount of mental effort being used in a user's working memory. If a VR experience is too complex or unintuitive, it can exceed the user's cognitive load, leading to feelings of discomfort, confusion, and loss of immersion. Designing experiences that are within a user's cognitive abilities can greatly enhance their VR experience. 21 | 22 | 5. 🎧 **Spatialized Audio in VR**: Spatial audio is sound that exists in three dimensions. It is used in VR to enhance immersion, help users orient themselves in the virtual space, and understand their environment. It can help users locate virtual objects based on the direction and distance from which the sound is coming, providing a more realistic and engaging experience. 23 | 24 | 6. 🖌️ **Visual Accessibility in VR**: Ensuring that visuals are clear and contrastive is crucial for making VR experiences accessible. For people with color blindness, designers should consider offering alternate color schemes or color identification tools. This can help ensure that all users, regardless of their visual abilities, can fully experience and interact with the VR environment. 25 | 26 | 7. 👁️ **Low Vision Simulations**: Simulating low vision conditions allows designers to experience what visually impaired users might see in a VR environment. 
This experience can drive empathy and understanding, helping designers to make necessary enhancements to their designs and create more inclusive and accessible VR experiences. 27 | 28 | 8. 🎙️ **Voice User Interface (VUI)**: Incorporating voice commands in VR can greatly improve accessibility. For users with physical or cognitive limitations that make traditional controls challenging, voice commands provide an alternative means of navigation and interaction. This makes VR experiences more inclusive and accessible to a wider audience. 29 | 30 | 9. 📘 **Reading in VR**: Text in VR should be legible and comfortable to read. This includes considering factors such as the choice of font, the size of the font, and the placement of the text. By making text easier to read, designers can improve user comfort and engagement, making VR experiences more enjoyable and inclusive. 31 | 32 | 10. 🏃‍♂️ **Motion Sickness in VR**: VR experiences can sometimes cause motion sickness in users, due to the disconnect between the perceived movement in the virtual world and the physical stillness of the real world. Strategies to mitigate this include incorporating gradual acceleration and deceleration, providing a static frame of reference, or allowing teleportation for movement in the VR environment. 33 | 34 | 11. 📖 **Captions and Subtitles in VR**: Captions (text appearing in the same language as the spoken dialogue) and subtitles (text translated into a different language) can greatly enhance the accessibility of VR experiences. They cater to a wide range of users, from those with hearing loss or cognitive disabilities, to those who speak different languages, to even those who simply prefer to read dialogue. 35 | 36 | 12. 🎨 **Font Customization in VR**: Allowing users to select from a variety of fonts can make the text more readable for a wider range of users. Fonts such as OpenDyslexic, Arial, and Comic Sans are considered easier to read for people with dyslexia. Giving users the ability to choose their preferred font enhances accessibility and user comfort. 37 | 38 | 13. 📏 **Caption and Subtitle Placement in VR**: The placement of captions and subtitles in the VR space can significantly affect user comfort and understanding. Allowing users to customize the placement of captions and subtitles can enhance their VR experience, making it more comfortable and engaging. 39 | 40 | 14. ⏱️ **Caption and Subtitle Speed in VR**: Ensuring that captions and subtitles are displayed at a pace that matches the spoken dialogue can help ensure that users who rely on captions have a similar experience to those who don't. Presenting complete sentences as they're spoken can make the experience more immersive and inclusive for users who rely on captions or subtitles. 41 | 42 | 15. 🚦 **Visual Indicators in VR**: In addition to captions and subtitles, using visual cues such as arrows or onomatopoeia can help users understand the narrative and navigate the VR space. These cues can direct users towards the source of a sound, a speaking character, or a next game event, making the VR experience more immersive and navigable. 43 | 44 | 45 | ## Keywords 46 | - **User Variability**: The diversity of user abilities, comfort levels, and preferences. This should be taken into account when designing VR experiences for inclusivity and comfort. 47 | 48 | - **User Research**: A process used to understand user behaviors, needs, and motivations through various observation techniques and feedback methodologies. 
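Bullets 12 through 14 above translate naturally into a small data model: captions as complete sentences, each timed to roughly match speech, with font and placement exposed as user settings. The Swift sketch below illustrates that; the 160 words-per-minute default and all names are invented for illustration, not taken from the article.

```swift
import Foundation

// Caption pacing sketch: present complete sentences, timed to roughly
// match speech, with user-customizable font and placement.

enum Anchor { case bottomCenter, topCenter, followSpeaker }

struct CaptionSettings {
    var fontName = "OpenDyslexic"     // user-selectable font (bullet 12)
    var anchor = Anchor.bottomCenter  // user-selectable placement (bullet 13)
    var wordsPerMinute = 160.0        // pacing control (bullet 14)
}

struct TimedCaption { let text: String; let seconds: Double }

// Split dialogue into complete sentences and give each a display time
// proportional to its word count, so captions keep pace with speech.
func captions(for dialogue: String,
              settings: CaptionSettings) -> [TimedCaption] {
    let sentences = dialogue
        .split(whereSeparator: { ".!?".contains($0) })
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .filter { !$0.isEmpty }

    return sentences.map { sentence in
        let words = Double(sentence.split(separator: " ").count)
        let seconds = max(1.0, words / settings.wordsPerMinute * 60.0)
        return TimedCaption(text: sentence, seconds: seconds)
    }
}
```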
49 | 50 | - **Cognitive Load**: The total amount of mental effort being used in a user's working memory. Exceeding the cognitive load can lead to discomfort and confusion. 51 | 52 | - **Spatialized Audio**: Sound that exists in three dimensions, used in VR to enhance immersion and help users orient themselves in the virtual space. 53 | 54 | - **Visual Accessibility**: The degree to which a product allows users with varying visual abilities to consume and interact with its content. 55 | 56 | - **Low Vision Simulations**: Tools used by designers to simulate what visually impaired users might experience, driving empathy and leading to improved design decisions. 57 | 58 | - **Voice User Interface (VUI)**: A system that enables users to interact with a digital product or service using voice commands. 59 | 60 | - **Reading in VR**: Refers to the legibility and comfort level of reading text in VR, influenced by factors such as font choice, size, and placement. 61 | 62 | - **Motion Sickness**: A common issue in VR where users feel nauseous due to a disconnect between perceived movement in the virtual world and the physical stillness of the real world. 63 | 64 | - **Captions and Subtitles**: Text that appears onscreen during a video to provide written versions of spoken dialogue or translations in different languages. 65 | 66 | - **Font Customization**: Allowing users to choose from a range of fonts for on-screen text to improve readability and accessibility. 67 | 68 | - **Caption/Subtitle Placement**: The spatial location of captions and subtitles in VR, which can be adjusted to enhance user comfort and understanding. 69 | 70 | - **Caption/Subtitle Speed**: The pace at which captions or subtitles are displayed, ideally matching the speed of spoken dialogue. 71 | 72 | - **Visual Indicators**: Visual cues or signals used to guide users in VR, enhancing their understanding and navigation of the virtual environment. 73 | -------------------------------------------------------------------------------- /docs/Google/Using type in AR & VR.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | title: Using type in AR & VR 4 | parent: Google 5 | --- 6 | 7 | # Using type in AR & VR 8 | 9 | Original article: [Using type in AR & VR](https://fonts.google.com/knowledge/using_type_in_ar_and_vr) 10 | 11 | ## TL;DR 12 | 13 | In AR/VR environments, traditional typography rules are insufficient, requiring new approaches to ensure text readability and legibility in 3D spaces. Factors like text rendering methods, dimensionality, and intricate font details like counters and apertures are crucial but can be affected by optical distortions like chromatic aberration. Both hardware and software need to innovate to tackle these unique typographic challenges as AR/VR technology continues to evolve. 14 | 15 | ## Bullet points 16 | 17 | 1. 📏 **Spatial Classification of Typography**: Text in AR/VR is not just a flat element but exists in a three-dimensional space. Decisions have to be made on whether text will remain stationary in the 3D environment or will move according to the user's point of view. These decisions have immediate implications on legibility and usability. This also affects how text should behave when the user is in motion. 18 | 19 | 2. 🔠 **Challenges in Text Rendering**: The 3D medium poses challenges like perspective distortion, distance reading, and distortion of letter shapes. 
2. 🔠 **Challenges in Text Rendering**: The 3D medium poses challenges like perspective distortion, distance reading, and distortion of letter shapes. These are not entirely new issues, but they are intensified and nuanced in a 3D context. For example, readability can suffer if guidelines from traditional mediums, such as highway signage, are translated directly to AR/VR applications.

3. 🚫 **Limitations of 3D Text**: While 3D text might seem like a natural fit for a 3D medium, it hinders readability and legibility. Extruded, three-dimensional letterforms add visual complexity to the letter shapes and muddy the spaces between them, making them less recognizable and requiring extra effort to read. Flat, 2D text is generally recommended for sentences and paragraphs.

4. 🔡 **Role of Variable Fonts**: Variable fonts offer dynamic customization across axes like weight, width, and spacing. This adaptability is crucial for optimizing text based on the user's viewpoint, distance, and background contrast, since it allows real-time adjustments that improve legibility and readability in varying conditions.

5. 🔄 **Dynamic Rendering**: Text rendering in AR/VR is dynamic because the point of view changes constantly. Traditional methods, like bitmap glyphs rasterized with FreeType, can result in blurred text when resized. More advanced methods, like Signed-Distance-Field-based (SDF) text rendering, scale better but can still suffer from rounding at corners when text is displayed larger than the generated texture supports.

6. 🔜 **The Future of AR/VR Typography**: Despite advances in AR/VR technology, typography in this medium is still in its nascent stages. Processing-power limitations have pushed designers toward less CPU-intensive text rendering methods, affecting text quality. As the technology evolves, typefaces and applications need to be designed for these unique challenges.

7. 🎯 **Counters and Apertures**: The design of counters (fully enclosed spaces) and apertures (partially enclosed spaces) in letters like "c," "e," and "o" is crucial, especially in AR/VR contexts. Factors like irradiation and chromatic aberration can fill these spaces, causing misrecognition; a "c" may appear as an "o." Opt for typefaces with larger counters and open apertures for better legibility.

8. 🏋️‍♂️ **Font Weight Matters**: In environments with varying resolutions, font weight plays a critical role in legibility. Lighter weights can flicker and blend into the background, making them hard to read, while going too bold closes up counters and apertures. Medium to semi-bold weights usually offer a good balance and should be tested rigorously.

9. 🖊️ **Width and X-Height**: The width of the typeface and its x-height (the height of lowercase letters like "x") are significant for legibility, especially in AR where text can be viewed from different angles and distances. Wider typefaces generally perform better but require more space. A large x-height enhances legibility but must be balanced to prevent letter misrecognition.

10. ✒️ **Visual Nuances**: Fine details such as stroke contrast, the thinning at joints, and stroke endings can be distorted in AR/VR by factors like halation (glow around letters). Opt for low-to-moderate-contrast typefaces with well-defined stroke endings, and use generous letter spacing to prevent letters from merging, particularly when viewed at an angle (the sketch below shows one way to drive these adjustments from viewing conditions).
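As a rough illustration of points 4 and 8-10, the sketch below picks variable-font axis values from viewing conditions. The `wght` and `wdth` tags are standard variable-font axes, but every formula and threshold here is an invented placeholder, not a tested recommendation:

```typescript
// Hypothetical heuristic: heavier, wider, more loosely spaced type for
// distant or bright-background text.
interface ViewingConditions {
  distanceMeters: number;      // distance from viewer to text
  backgroundLuminance: number; // 0 (black) .. 1 (white)
}

function pickFontAxes(cond: ViewingConditions): { wght: number; wdth: number; tracking: number } {
  // Farther text gets a somewhat heavier weight and wider width so thin
  // strokes don't flicker or dissolve; clamp before bold closes counters.
  const brightBoost = cond.backgroundLuminance > 0.7 ? 40 : 0;
  const wght = Math.min(600, 400 + cond.distanceMeters * 40 + brightBoost);
  const wdth = Math.min(125, 100 + cond.distanceMeters * 5);
  // Extra tracking (letter spacing, in 1/1000 em) at distance to stop merging.
  const tracking = Math.min(60, 20 + cond.distanceMeters * 10);
  return { wght: Math.round(wght), wdth: Math.round(wdth), tracking };
}

// Example: text three meters away against a bright background.
console.log(pickFontAxes({ distanceMeters: 3, backgroundLuminance: 0.8 }));
// -> { wght: 560, wdth: 115, tracking: 50 }
```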
11. 📑 **Text Classification in AR/VR**: The article outlines six categories for classifying text in augmented and virtual reality environments: "Text in HUD," "Text for Long Reading," "Sticky Info Text," "Signage Text," "Responsive Text," and "Ticker Text." Each serves a different purpose and requires specific rendering considerations. This classification framework aids decision-making when designing and setting text within AR/VR applications.

12. 👁️ **Field of View (FoV) Constraints**: FoV is a critical factor in AR/VR text placement and readability. Human FoV is around 120° when looking straight ahead and extends up to 200°-270° with eye movement. In contrast, most AR/VR devices offer a much more limited FoV, typically between 40° and 100°. This discrepancy constrains how much text can be comfortably displayed and underscores the need for wider FoV in future AR/VR devices for a more immersive experience.

13. ☀️ **Brightness and Contrast Challenges**: Addressing brightness and contrast levels in AR/VR environments, especially for optical see-through headsets (OSTs), is essential. High brightness is vital for legible text in high-ambient-light conditions, such as outdoors, and contrast ratios govern how easily text can be read, making both key factors when designing textual elements for mixed reality.

14. 🖥️ **Resolution and Pixel Density**: Resolution and pixels per degree (PPD) drive AR/VR text readability: a higher PPD yields clearer, sharper text. Resolution isn't just the number of pixels, though; it also depends on how those pixels are magnified by the device's optics, so the PPD calculation takes into account both the pixel count and the device's field of view (a worked example follows this list). With current technology, achieving a PPD close to the human eye's capability (approximately 60 PPD) remains a challenge.

15. 🎨 **Optical and Graphical Distortions**: Various optical and graphical distortions, such as halation, chromatic aberration, and the screen-door effect, can hurt text readability and the overall user experience in AR/VR. Halation causes a foggy glow around text, compromising its clarity. Chromatic aberration produces colored fringes around text and objects, disrupting the immersive experience and possibly causing user discomfort. The screen-door effect arises from the visible grid lines between pixels, reducing the perceived quality of the display. Addressing these distortions is crucial for improving the visual quality and readability of text in AR/VR environments.
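Here is a small worked example of the PPD calculation described in point 14; the headset numbers are invented for illustration, not the specs of any real device:

```typescript
// PPD = horizontal pixels per eye / horizontal field of view in degrees.
function pixelsPerDegree(pixelsPerEye: number, fovDegrees: number): number {
  return pixelsPerEye / fovDegrees;
}

// A hypothetical headset: 2000 px per eye across a 100° horizontal FoV.
const ppd = pixelsPerDegree(2000, 100); // 20 PPD

// Human visual acuity is roughly 60 PPD, so this display resolves only about
// a third of that; glyphs must stay large and robust at this pixel density.
console.log(`~${ppd} PPD vs ~60 PPD for the human eye`);
```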
## Keywords

- **Typography**: The art and technique of arranging type, critical for readability and legibility in AR/VR environments.

- **Legibility**: The clarity of text, influenced by factors such as dimensionality and rendering methods in AR/VR.

- **Readability**: The overall ease with which text can be read, impacted by layout and design.

- **Dimensionality**: Whether text appears flat (2D) or has depth (3D) in AR/VR environments.

- **Spatial Classification**: The organization and behavior of text in 3D space, relevant for typography in AR/VR.

- **Variable Fonts**: Customizable fonts that offer control over multiple design axes like weight, width, and spacing.

- **Extrapolation**: The estimation of text appearance at sizes other than the original, which can cause issues like rounding errors.

- **Atlas (Text Atlas)**: Pre-generated text glyphs used for rendering, optimized for specific conditions in AR/VR.

- **Texture Maps**: Pre-rendered images that add detail or depth to 3D models, including text.

- **SDF (Signed-Distance-Field)**: A text rendering method that achieves smooth scaling by storing each pixel's distance from the glyph boundary.

- **OTF/TTF (OpenType Font/TrueType Font)**: Font file formats that are often not rendered directly in AR/VR due to performance concerns.

- **Counters**: The enclosed or partially enclosed circular or curved negative spaces in characters like "c," "d," and "o." Important for text legibility in AR/VR.

- **Apertures**: The partially enclosed, somewhat rounded negative spaces in characters such as "n," "C," and "S." Important for text legibility.

- **Chromatic Aberration**: A distortion in which color channels are rendered misaligned, which can make counters and apertures appear closed, hurting legibility.

- **Irradiation**: A phenomenon where light or color spills over the boundaries of an object, affecting its perceived shape. In fonts, this can make small counters and apertures appear filled.

- **Font Weight**: The thickness of the character outlines in a font, which affects legibility.

- **Stroke Contrast**: The variation in stroke thickness within a single character, which can be a problem on low-resolution devices like AR/VR headsets.

- **X-Height**: The height of lowercase letters, disregarding ascenders and descenders. Important for determining the legibility of a typeface.

- **Ascenders and Descenders**: Strokes that extend above the x-height (ascenders) or below the baseline (descenders). Important for letter recognition.

- **Joints/Intersections**: The points where strokes in a character meet; the glow of pixels (halation) at these points can affect legibility.

- **Halation**: The scattering of light beyond its proper boundaries, often causing a foggy appearance or glow around text and impacting legibility.

- **Terminals**: The end of a stroke in a letter or symbol, which can be distorted in AR/VR environments, affecting legibility.

- **Letter Spacing**: The amount of space between characters in a word or sentence; generous spacing is often needed in AR/VR to maintain legibility.
--------------------------------------------------------------------------------