├── .github
│   └── workflows
│       └── auto-publish.yml
├── .pr-preview.json
├── CONTRIBUTING.md
├── LICENSE.md
├── Makefile
├── README.md
├── favicon-32x32.png
├── favicon-96x96.png
├── favicon.ico
├── index.bs
├── lighting-estimation-explainer.md
├── package.json
├── security-privacy-questionnaire.md
└── w3c.json
/.github/workflows/auto-publish.yml:
--------------------------------------------------------------------------------
1 | name: Build and publish spec to GitHub Pages and /TR/
2 |
3 | on:
4 | pull_request: {}
5 | push:
6 | branches: [main]
7 | paths:
8 | - 'images/**'
9 | - 'index.bs'
10 |
11 | jobs:
12 | main:
13 | name: Build, Validate and Deploy
14 | runs-on: ubuntu-20.04
15 | steps:
16 | - uses: actions/checkout@v2
17 | - uses: w3c/spec-prod@v2
18 | with:
19 | TOOLCHAIN: bikeshed
20 | SOURCE: index.bs
21 | DESTINATION: index.html
22 | GH_PAGES_BRANCH: gh-pages
23 | W3C_ECHIDNA_TOKEN: ${{ secrets.W3C_TR_TOKEN }}
24 | W3C_WG_DECISION_URL: https://lists.w3.org/Archives/Public/public-immersive-web-wg/2021Sep/0004.html
25 | W3C_BUILD_OVERRIDE: |
26 | status: WD
27 |
28 | # BUILD_FAIL_ON is intentionally not set to 'warning', so that Bikeshed warnings do not fail the build.
29 |
30 |
--------------------------------------------------------------------------------
/.pr-preview.json:
--------------------------------------------------------------------------------
1 | {
2 | "src_file": "index.bs",
3 | "type": "bikeshed",
4 | "params": {
5 | "force": 1
6 | }
7 | }
8 |
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Web Platform Incubator Community Group
2 |
3 | This repository is being used for work in the W3C Web Platform Incubator Community Group, governed by the [W3C Community License
4 | Agreement (CLA)](http://www.w3.org/community/about/agreements/cla/). To make substantive contributions,
5 | you must join the CG.
6 |
7 | If you are not the sole contributor to a contribution (pull request), please identify all
8 | contributors in the pull request comment.
9 |
10 | To add a contributor (other than yourself, that's automatic), mark them one per line as follows:
11 |
12 | ```
13 | +@github_username
14 | ```
15 |
16 | If you added a contributor by mistake, you can remove them in a comment with:
17 |
18 | ```
19 | -@github_username
20 | ```
21 |
22 | If you are making a pull request on behalf of someone else but you had no part in designing the
23 | feature, you can remove yourself with the above syntax.
24 |
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 |
2 | All documents in this Repository are licensed by contributors under the [W3C Software and Document License](https://www.w3.org/Consortium/Legal/copyright-software).
3 |
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | .PHONY: all index.html
2 |
3 | all: index.html
4 |
5 | index.html: index.bs
6 | curl https://api.csswg.org/bikeshed/ -F file=@index.bs -F output=err
7 | curl https://api.csswg.org/bikeshed/ -F file=@index.bs -F force=1 > index.html
8 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Lighting Estimation
2 | ===================
3 |
4 | Explainer: https://github.com/immersive-web/lighting-estimation/blob/master/lighting-estimation-explainer.md
5 |
6 | Spec: https://immersive-web.github.io/lighting-estimation/
7 |
8 | This is the repository for the WebXR lighting estimation API, which provides XR sessions (primarily AR) with information about the current environment lighting so that virtual objects can be rendered more realistically.
9 |
10 | The feature was initially proposed in [this issue](https://github.com/immersive-web/lighting-estimation/issues/1), though that proposal no longer matches the current API shape.
11 |
--------------------------------------------------------------------------------
/favicon-32x32.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/immersive-web/lighting-estimation/8da3e628e9f8263f5285b7b258716529046a7a0a/favicon-32x32.png
--------------------------------------------------------------------------------
/favicon-96x96.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/immersive-web/lighting-estimation/8da3e628e9f8263f5285b7b258716529046a7a0a/favicon-96x96.png
--------------------------------------------------------------------------------
/favicon.ico:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/immersive-web/lighting-estimation/8da3e628e9f8263f5285b7b258716529046a7a0a/favicon.ico
--------------------------------------------------------------------------------
/index.bs:
--------------------------------------------------------------------------------
1 |
2 | Shortname: webxr-lighting-estimation
3 | Title: WebXR Lighting Estimation API Level 1
4 | Group: immersivewebwg
5 | Status: ED
6 | TR: https://www.w3.org/TR/webxr-lighting-estimation-1/
7 | ED: https://immersive-web.github.io/lighting-estimation/
8 | Repository: immersive-web/lighting-estimation
9 | Level: 1
10 | Mailing List Archives: https://lists.w3.org/Archives/Public/public-immersive-web-wg/
11 |
12 | !Participate: File an issue (open issues)
13 | !Participate: Mailing list archive
14 | !Participate: W3C's #immersive-web IRC
15 |
16 | Editor: Brandon Jones 87824, Google https://google.com/, bajones@google.com
17 | Former Editor: Kearwood Gilbert 87854, [Mozilla until 2020], kearwood@kearwood.com
18 |
19 | Abstract: This specification describes support for exposing estimates of environmental lighting conditions to WebXR sessions.
20 |
58 |
59 |
60 |
61 |
62 |
116 |
117 | Introduction {#intro}
118 | ============
119 |
120 | The WebXR Lighting Estimation module expands the WebXR Device API, the WebXR Augmented Reality Module, and the WebXR Layers module with the ability to expose estimates of the lighting conditions of the user's environment.
121 |
122 | Light Primitives {#light-primitives}
123 | ================
124 |
125 | XRLightProbe {#xrlightprobe-interface}
126 | ------------
127 |
128 | An {{XRLightProbe}} collects estimated lighting information at a given point in the user's environment.
129 |
130 |
137 |
138 | The probeSpace attribute is an {{XRSpace}} that has a [=XRSpace/native origin=] tracking the position and orientation that the {{XRLightProbe}}'s lighting estimations are being generated relative to.
139 |
140 | The onreflectionchange attribute is an [=Event handler IDL attribute=] for the {{reflectionchange}} event type.
141 |
142 | XRReflectionFormat {#xrreflectionformat-interface}
143 | ------------
144 |
150 |
151 | Reflection cube maps have an internal reflection format that indicates how the texture data is represented, and may change how applications choose to use the texture. Cube maps MAY be requested with the {{XRReflectionFormat/"srgba8"}} format or the {{XRSession/preferredReflectionFormat}} of the light probe.
152 |
153 |
154 | <table class="data">
155 |   <thead>
156 |     <tr><th>{{XRReflectionFormat}}</th><th>WebGL Format</th><th>WebGL Internal Format</th><th>WebGPU Format</th><th>HDR</th></tr>
157 |   </thead>
158 |   <tbody>
159 |     <tr><td>{{XRReflectionFormat/"srgba8"}}</td><td>RGBA</td><td>SRGB8_ALPHA8</td><td>"rgba8unorm-srgb"</td><td></td></tr>
160 |     <tr><td>{{XRReflectionFormat/"rgba16f"}}</td><td>RGBA</td><td>RGBA16F</td><td>"rgba16float"</td><td>✓</td></tr>
161 |   </tbody>
162 | </table>
175 |
176 | XRLightEstimate {#xrlightestimate-interface}
177 | ------------
178 |
179 | An {{XRLightEstimate}} provides the estimated lighting values for an {{XRLightProbe}} at the time represented by an {{XRFrame}}. {{XRLightEstimate}}s are queried by passing an {{XRLightProbe}} to the {{XRFrame/getLightEstimate()}} method of an {{XRFrame}}.
180 |
181 |
189 |
190 | The sphericalHarmonicsCoefficients attribute returns a {{Float32Array}} containing 9 spherical harmonics coefficients. The array MUST be 27 elements in length, with every 3 elements defining the red, green, and blue components respectively of a single coefficient. The first term of the {{XRLightEstimate/sphericalHarmonicsCoefficients}}, meaning the first 3 elements of the array, MUST be representative of a valid lighting estimate. All other terms are optional, and MAY be 0 if a corresponding lighting estimate is not available due to either user privacy settings or the capabilities of the platform.
191 |
192 | The order of coefficients in {{XRLightEstimate/sphericalHarmonicsCoefficients}}, is [C00, C1-1, C10, C11, C2-2, C2-1, C20, C21, C22], where Clm is the coefficient of spherical harmonic Ylm.
193 |
194 | The primaryLightDirection represents the direction to the primary light source from the [=XRSpace/native origin=] of the {{XRLightProbe/probeSpace}} of the {{XRLightProbe}} that produced the {{XRLightEstimate}}. The value MUST be a unit length 3D vector and the {{DOMPointReadOnly/w}} value MUST be 0.0. If estimated values from the user's environment are not available the {{XRLightEstimate/primaryLightDirection}} MUST be { x: 0.0, y: 1.0, z: 0.0, w: 0.0 }, representing a light shining straight down from above.
195 |
196 | The primaryLightIntensity represents the color of the primary light source. The value MUST represent an RGB value mapped to the {{DOMPointReadOnly/x}}, {{DOMPointReadOnly/y}}, and {{DOMPointReadOnly/z}} values respectively where each component is greater than or equal to 0.0 and the {{DOMPointReadOnly/w}} value MUST be 1.0. If estimated values from the user's environment are not available the {{XRLightEstimate/primaryLightIntensity}} MUST be {x: 0.0, y: 0.0, z: 0.0, w: 1.0}, representing no illumination.
197 |
198 | WebXR Device API Integration {#webxr-device-api-integration}
199 | ============================
200 |
201 | Both the {{XRSession}} and {{XRFrame}} interfaces from the WebXR Device API are expanded by this module.
202 |
203 | Session Initialization {#session-initialization}
204 | ----------------------
205 |
206 | The string "light-estimation" is introduced by this module as a new valid [=feature descriptor=]. Applications that wish to use light estimation features MUST request them with the "[=feature descriptor/light-estimation=]" [=feature descriptor=].
207 |
208 | XRSession {#xrsession-interface}
209 | ---------
210 |
211 | The {{XRSession}} interface is extended with the ability to create new {{XRLightProbe}} instances. {{XRLightProbe}} instances have a session object, which is the {{XRSession}} that created the {{XRLightProbe}}, and a reflection format, which is the {{XRReflectionFormat}} that the light probe may retrieve.
212 |
213 | The {{XRSession}} interface is further extended with an attribute {{XRSession/preferredReflectionFormat}}, indicating the {{XRReflectionFormat}} most closely supported by the underlying [=XRSession/XR device=].
214 |
215 |
227 | When the requestLightProbe(|options|) method is invoked on {{XRSession}} |session|, the user agent MUST run the following steps:
228 | 1. Let |promise| be [=a new Promise=].
229 | 1. If the [=light-estimation=] feature descriptor is not [=list/contain|contained=] in the |session|'s [=XRSession/list of enabled features=], [=/reject=] |promise| with {{NotSupportedError}} and abort these steps.
230 | 1. If |session|’s [=XRSession/ended=] value is true, throw an {{InvalidStateError}} and abort these steps.
231 |
232 |
233 |
If |options|'s {{XRLightProbeInit/reflectionFormat}} is {{XRReflectionFormat/"srgba8"}} or matches |session|'s {{XRSession/preferredReflectionFormat}}:
234 |
235 | 1. Let |probe| be a new {{XRLightProbe}}.
236 | 1. Set |probe|'s [=XRLightProbe/session=] to |session|.
237 | 1. Set |probe|'s [=XRLightProbe/reflection format=] to |options|'s {{XRLightProbeInit/reflectionFormat}}
238 | 1. [=Resolve=] |promise| with |probe|.
239 |
240 |
else
241 |
242 | 1. [=Reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}
243 |
244 |
245 |
246 |
247 |
248 | XRFrame {#xrframe-interface}
249 | -------
250 |
251 | The {{XRFrame}} interface is extended with the ability to query the {{XRLightEstimate}} for a given {{XRLightProbe}}.
252 |
253 |
260 | When the getLightEstimate(|lightProbe|) method is invoked on {{XRFrame}} |frame|, the user agent MUST run the following steps:
261 |
262 | 1. If |frame|'s [=XRFrame/active=] boolean is `false`, throw an {{InvalidStateError}} and abort these steps.
263 | 1. Let |session| be |frame|'s {{XRFrame/session}} object.
264 | 1. If |lightProbe|'s [=XRLightProbe/session=] does not equal |session|, throw an {{InvalidStateError}} and abort these steps.
265 | 1. Let |device| be |session|'s [=XRSession/XR device=].
266 | 1. If |device| cannot estimate the lighting for this frame, return null.
267 | 1. Let |estimate| be a new {{XRLightEstimate}}.
268 | 1. Populate |estimate|'s {{XRLightEstimate/sphericalHarmonicsCoefficients}}, with the coefficients provided by |device|.
269 |
270 |
271 |
If |device| has an estimated direction for the light source
272 |
273 | 1. Set |estimate|'s {{XRLightEstimate/primaryLightDirection}} to the estimated direction of the light source.
274 |
275 |
else
276 |
277 | 1. Set |estimate|'s {{XRLightEstimate/primaryLightDirection}} to { x: 0.0, y: 1.0, z: 0.0, w: 0.0 }
278 |
279 |
280 |
281 |
If |device| has an estimated intensity for the light source
282 |
283 | 1. Set |estimate|'s {{XRLightEstimate/primaryLightIntensity}} to the estimated intensity of the light source.
284 |
285 |
else
286 |
287 | 1. Set |estimate|'s {{XRLightEstimate/primaryLightIntensity}} to {x: 0.0, y: 0.0, z: 0.0, w: 1.0}
288 |
289 |
290 |
291 | 1. Return |estimate|.
292 |
293 |
294 |
295 | WebXR Layers Integration {#webxr-layers-integration}
296 | ========================
297 |
298 | The {{XRWebGLBinding}} interface from the WebXR Layers module is expanded by this module.
299 |
300 | XRWebGLBinding {#xrwebglbinding-interface}
301 | --------------
302 |
303 | The {{XRWebGLBinding}} interface is extended with the ability to query a reflection cube map for a given {{XRLightProbe}}.
304 |
305 |
312 | When the getReflectionCubeMap(|lightProbe|) method is invoked on {{XRWebGLBinding}} |binding|, the user agent MUST run the following steps:
313 |
314 | 1. If |binding|'s [=XRWebGLBinding/context=] is lost, throw an {{InvalidStateError}} and abort these steps.
315 | 1. Let |session| be |binding|'s [=XRWebGLBinding/session=].
316 | 1. If |session| is ended, throw an {{InvalidStateError}} and abort these steps.
317 | 1. If |session| does not match |lightProbe|'s [=XRLightProbe/session=], throw an {{InvalidStateError}} and abort these steps.
318 | 1. Let |device| be |session|'s [=XRSession/XR Device=].
319 | 1. If no reflection cube map is available from |device|, return null.
320 | 1. Return a new {{WebGLTexture}} cubemap in the format specified by |lightProbe|'s [=XRLightProbe/reflection format=] and populated with the data from |device|.
321 |
322 |
323 |
324 | Events {#events}
325 | ======
326 |
327 | The [=task source=] for all [=queue a task|tasks queued=] in this specification is the [=XR task source=], unless otherwise specified.
328 |
329 | Event Types {#event-types}
330 | -----------
331 |
332 | The user agent MUST [=fire an event=] named reflectionchange on an {{XRLightProbe}} object each time the contents of the cube map returned by calling {{XRWebGLBinding/getReflectionCubeMap()}} have changed.
333 |
334 | Privacy & Security Considerations {#privacy-security}
335 | =================================
336 |
337 |
338 |
339 | The lighting estimation API shares many potential [=privacy and security risks=] with the Ambient Light Sensor API [[!AMBIENT-LIGHT]], including:
340 | * Profiling: Lighting Estimation can leak information about a user's use patterns and surroundings. This information can be used to enhance user profiling and behavioral analysis.
341 | * Cross-device Linking: Two devices can access web sites that include the same third-party script that correlates lighting levels over time.
342 | * Cross Device Communication: A simple broadcast communication method can use device screen or camera LED flashes to broadcast messages read out with lighting estimation on a nearby device.
343 |
344 | In addition to these, there are a few vectors unique to lighting estimation to
345 | consider.
346 | * The lighting estimation returned by the WebXR API explicitly describes the real world environment in close proximity to the user.
347 | * Reflection cube maps of a high enough resolution approach the same level as camera access.
348 |
349 | Lighting estimation must be declared when creating an XR Session as a [=feature descriptor=], which will allow the user agent to notify the user of the potential privacy implications of allowing the lighting estimation API to be used by the website. The user agent is encouraged to NOT provide real-time updates to any portion of the lighting estimation API, especially the reflection cube map. By default, only low spatial frequency and low temporal frequency information should be returned by the WebXR API. Reflection cube maps should be kept low resolution, unless the user has also consented to camera permissions for a particular origin. As further mitigation, the Spherical Harmonics and primary light direction MAY be quantized.
350 |
351 |
352 |
353 |
354 |
359 |
--------------------------------------------------------------------------------
/lighting-estimation-explainer.md:
--------------------------------------------------------------------------------
1 | # WebXR Device API - Lighting Estimation
2 | This document explains the portion of the WebXR APIs that enable developers to render augmented reality content that reacts to real world lighting.
3 |
4 | ## Introduction
5 |
6 | "Lighting Estimation" is implemented by AR platforms using a combination of sensors, cameras, algorithms, and machine learning. Lighting estimation provides input to rendering algorithms and shaders to ensure that the shading, shadows, and reflections of objects appear natural when presented in a diverse range of settings.
7 |
8 | The `XRLightProbe` interface exposes the values that the platform offers to WebXR rendering engines. The corresponding retrieval method, `XRSession.requestLightProbe()`, returns a promise and is only accessible once an AR session has started. The promise may be resolved on the same frame or multiple frames later, depending on the platform capabilities. In some cases, the promise may fail, indicating that the lighting values are not available for that session.
9 |
10 | Once an `XRLightProbe` has been created it can be used to query an `XRLightEstimate` each frame with the `XRFrame.getLightEstimate()` method. The light estimate provides both an ambient illumination estimate in the form of spherical harmonics and an estimate of the direction and intensity of the primary light source in the user's environment. The light probe can also be used to query an estimated cube map representing the user's environment from the `XRWebGLBinding.getReflectionCubeMap()` method.
11 |
12 | Although modern render engines support multiple light and reflection probes in a scene, the WebXR API returns only a single `XRLightProbe`, representing the global approximated lighting values to be used in the area in close proximity to the viewer. When future platforms become capable of reporting multiple probes with precise locations away from the viewer, such support could be implemented additively without breaking changes.
13 |
14 | The orientation of the lighting information is reported relative to the `XRLightProbe.probeSpace`, which can be queried each `XRFrame` like any other space, and may be the same as an existing `XRReferenceSpace`. As it may be computationally expensive to rotate spherical harmonics values and texture cubes, the `probeSpace` enables the same values to be used in multiple orientations.
15 |
16 | It is possible to treat a synthetic VR scene as the environment that AR content will be mixed in to. In this case, the platform will be able to report the lighting estimation using the geometry of the VR scene. As the WebXR API does not specifically express if the world is synthetic or real, AR content is to be written the same, without such knowledge. Such "AR in VR" techniques do not affect the WebXR specification directly and are beyond the scope of this text.
17 |
18 | ## Physically Based Units
19 |
20 | The lighting estimation values represent luminance and colors that may be outside the gamut of the output device. Direct sunlight can project 5000 nits at full power, while a typical display may emit only 250-500 nits. The objects in a scene will attenuate the power of the sun and reflect a smaller portion towards the viewer. Even if the display can only represent such a limited gamut (such as SRGB, P3, or Rec 2020), intermediate lighting calculations used by shaders involve scaling up small values and attenuating large values outside of the displayed gamut. When the lighting calculation results in a color that can not be displayed, the resulting value will be altered by a variety of post processing effects to match the rendering intent and aesthetic chosen by the content authors.
21 |
22 | Luminance values are expressed in nits (cd/m^2). Nits are used by some native platform lighting estimation APIs and the media-capabilities API. User agents will translate the values returned by native platforms to nits for consistency.
23 |
24 | As lighting is scene-relative as opposed to display-relative, the luminance values are encoded linearly with no gamma curve. Most modern render engines perform intermediate calculations in linear space and can accept such values directly. If an engine performs intermediate calculations in a color space encoded with gamma, such as sRGB, care must be taken when converting the values. After scaling the values, the result may include components above 1.0 or below 0.0. Naive implementations that clamp RGB components independently will result in erroneous hue and saturation for out-of-gamut colors.
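
For engines doing their lighting math in linear space, the hue-preserving alternative to naive per-channel clamping described above can be sketched as follows (the helper name is ours, not part of any API):

```js
// Hypothetical helper (not part of the WebXR API): bring an out-of-gamut
// linear RGB triple into the displayable [0, 1] range by scaling all three
// channels by the largest component, which preserves the hue and saturation
// that independent per-channel clamping would destroy.
function mapToDisplayGamut(rgb) {
  const floored = rgb.map((c) => Math.max(c, 0.0)); // negative light is not displayable
  const peak = Math.max(...floored);
  if (peak <= 1.0) return floored;
  return floored.map((c) => c / peak); // uniform scale keeps channel ratios intact
}

// A bright orange that per-channel clamping would flatten toward yellow-white:
mapToDisplayGamut([2.0, 1.0, 0.5]); // → [1.0, 0.5, 0.25]
```

A real renderer would apply its tone-mapping curve here instead; the point is only that the channels must be treated together.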
25 |
26 | ## Cube Map Textures
27 |
28 | HDR Cube Map textures, as returned by `XRWebGLBinding.getReflectionCubeMap()`, provide all the information about light sources and indirect bounces needed to accurately render PBR materials that are diffuse, glossy, and visibly reflective. Image based lighting effects utilizing such textures are simple to implement and perform well for VR and AR rendering. Unfortunately, such cube map textures require a lot of video memory and can often represent the environment from a limited range of locations where such a map was captured.
29 |
30 | HDR Cube Map textures are commonly used to implement "Reflection Probes" in modern rendering engines.
31 |
32 | ## Spherical Harmonics
33 |
34 | SH (Spherical Harmonics) are used as a more compact alternative to HDR cube maps by storing a small number of coefficient values describing a fourier series over the surface of a sphere. SH can effectively compress cube maps, while retaining multiple lights and directionality. Due to their lightweight nature, many spherical harmonics probes can be used within a scene, be interpolated, or be calculated for locations nearer to the lit objects.
35 |
36 | WebXR's Lighting Estimation module supports up to 9 spherical harmonics coefficients per RGB color component, for a total of 27 floating point scalar values. This enables the level 2 (3rd order) of detail. If a platform can not supply all 9 coefficients, it can pass 0 for the higher order coefficients, resulting in an effectively lower frequency reproduction. This may be used to communicate a simple global "ambient" term when more detailed lighting information is either not available or not allowed by the user.
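
As a concrete illustration of the layout described above, a sketch (helper names are ours) that unpacks the 27 scalars into 9 RGB coefficients and reports which SH band is effectively populated:

```js
// Unpack the 27-element array into 9 [r, g, b] coefficients, in the
// [C00, C1-1, C10, C11, C2-2, C2-1, C20, C21, C22] order used by WebXR.
function unpackShCoefficients(coefficients) {
  if (coefficients.length !== 27) throw new Error('expected 27 scalars');
  const rgb = [];
  for (let i = 0; i < 9; ++i) {
    rgb.push([coefficients[i * 3], coefficients[i * 3 + 1], coefficients[i * 3 + 2]]);
  }
  return rgb;
}

// Coefficient 0 is band L0, coefficients 1-3 are L1, and 4-8 are L2. A
// platform that zero-fills the higher bands is effectively reporting a
// lower-order (lower frequency) estimate.
function effectiveShOrder(coefficients) {
  const rgb = unpackShCoefficients(coefficients);
  const nonZero = (c) => c.some((v) => v !== 0);
  if (rgb.slice(4).some(nonZero)) return 2;
  if (rgb.slice(1, 4).some(nonZero)) return 1;
  return 0;
}
```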
37 |
38 | This "spherical harmonics probe" format is used by most modern rendering engines, including Unity, Unreal, and Three.js.
39 |
40 | ## Shadows
41 |
42 | When an HDR Cube Map texture is available, shadows only have to consider occlusion of other rendered objects in the scene.
43 |
44 | When a HDR Cube Map texture is not available, or the typical soft shadow effects of image based lighting are too costly to implement, the `XRLightEstimate.primaryLightDirection` and `XRLightEstimate.primaryLightIntensity` can be used to render shadows cast by the most prominent light source.
45 |
46 | ## Security Implications
47 | The lighting estimation API shares many potential privacy risks with the [ambient light sensor API](https://www.w3.org/TR/ambient-light/#security-and-privacy), including:
48 |
49 | - profiling: Lighting estimation can leak information about a user's use patterns and surroundings. This information can be used to enhance user profiling and behavioral analysis.
50 | - cross device linking: Two devices can access web sites that include the same third-party script that correlates lighting levels over time.
51 | - cross device communication
52 |
53 | Lighting estimation also provides additional opportunities for side channel attacks and fingerprinting risks, discussed in this section.
54 |
55 | ### Feature Descriptor
56 |
57 | In order for applications to signal their interest in accessing lighting estimation during a session, the session must be requested with the appropriate feature descriptor. The string `light-estimation` is introduced by this module as a new valid feature descriptor. `light-estimation` enables the light estimation feature, and is required for `XRSession.requestLightProbe()` to resolve.
58 |
59 | The inline XR device MUST NOT support the `light-estimation` feature.
60 |
61 | ### XRLightProbe
62 |
63 | The `XRLightProbe` itself contains no lighting values, but is used to retrieve the current lighting state with each `XRFrame`.
64 |
65 | ```js
66 | let lightProbe = await xrSession.requestLightProbe();
67 | ```
68 |
69 | The position and orientation in space that the lighting is estimated relative to is communicated with the `probeSpace` attribute, which is an `XRSpace`. The `probeSpace` may update its pose over time as the user moves around their environment.
70 |
71 | ```js
72 | let probePose = xrFrame.getPose(lightProbe.probeSpace, xrReferenceSpace);
73 | ```
74 |
75 | ### XRLightEstimate
76 |
77 | `XRLightEstimate` provides sufficient information to render objects that appear to fit into their environment, with highly diffuse surfaces or high frequency normal maps which would result in a wide NDF (normal distribution function). Highly polished objects may be represented with a non-physically based illusion of glossiness with a specular highlight effect sensitive only to the primary light direction. Reflections will be unable to reproduce detailed images of the environment without a cube map.
78 |
79 | The lighting estimation returned by the WebXR API explicitly describes the real world environment in proximity to the user. By default, only low spatial frequency and low temporal frequency information should be returned by the WebXR API. Even when a platform can directly produce higher spatial and temporal frequency information, the browser must apply a low pass filter with an aim to mitigate the risk of untrusted content identifying the geolocation of the user or of profiling their environment.
80 |
81 | ```js
82 | // Using Three.js to demonstrate
83 | let threeDirectionalLight = new THREE.DirectionalLight();
84 | // THREE.LightProbe is Three.js' spherical harmonics-based light type.
85 | let threeLightProbe = new THREE.LightProbe();
86 |
87 | let lightProbe = await xrSession.requestLightProbe();
88 |
89 | function onXRFrame(t, xrFrame) {
90 | let lightEstimate = xrFrame.getLightEstimate(lightProbe);
91 |
92 | let intensity = Math.max(1.0,
93 | Math.max(lightEstimate.primaryLightIntensity.x,
94 | Math.max(lightEstimate.primaryLightIntensity.y,
95 | lightEstimate.primaryLightIntensity.z)));
96 |
97 | threeDirectionalLight.position.set(lightEstimate.primaryLightDirection.x,
98 | lightEstimate.primaryLightDirection.y,
99 | lightEstimate.primaryLightDirection.z);
100 | threeDirectionalLight.color.setRGB(lightEstimate.primaryLightIntensity.x / intensity,
101 | lightEstimate.primaryLightIntensity.y / intensity,
102 | lightEstimate.primaryLightIntensity.z / intensity);
103 | threeDirectionalLight.intensity = intensity;
104 |
105 | threeLightProbe.sh.fromArray(lightEstimate.sphericalHarmonicsCoefficients);
106 |
107 | // ... other typical frame loop stuff.
108 | }
109 | ```
110 |
111 | Only the first term of the `XRLightEstimate.sphericalHarmonicsCoefficients` is guaranteed to be available; the remaining terms may be absent due to either user privacy settings or the capabilities of the platform. When values estimated from the user's environment are not available the `primaryLightDirection` will report `(0.0, 1.0, 0.0, 0.0)`, representing a light shining straight down from above, and the `primaryLightIntensity` will report `(0.0, 0.0, 0.0, 1.0)`, representing no illumination.
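
A page that wants to substitute its own default lighting can compare an estimate against those fallback constants. A minimal sketch (the helper name is ours, and note that a real estimate could in principle coincide with these values):

```js
// Heuristic check: the specified fallback values are a straight-down light
// direction of (0, 1, 0, 0) together with a black intensity of (0, 0, 0, 1).
// Matching both suggests no environmental estimate was available this frame.
function looksLikeFallbackEstimate(estimate) {
  const d = estimate.primaryLightDirection;
  const i = estimate.primaryLightIntensity;
  return d.x === 0 && d.y === 1 && d.z === 0 && d.w === 0 &&
         i.x === 0 && i.y === 0 && i.z === 0 && i.w === 1;
}
```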
112 |
113 | Combined with other factors, such as the user's IP address, even the low frequency information returned with XRLightProbe increases the fingerprinting risk. The XRLightProbe should only be accessible during an active WebXR session.
114 |
115 | ### Reflection Cube Map
116 |
117 | The cube map returned by passing an `XRLightProbe` to `XRWebGLBinding.getReflectionCubeMap()` enables efficient and simple to implement image based lighting. PBR shaders can index the mip map chain of the environment cube to reduce the memory bandwidth required while integrating multiple samples to match wider NDFs.
118 |
119 | While the estimated cube map is expected to update over time to better reflect the user's environment as they move around, those changes are unlikely to happen with every `XRFrame`. Since creating and processing the cube map is potentially expensive, especially if mip maps are needed, pages can listen to the `reflectionchange` event on the `XRLightProbe` to determine when an updated cube map needs to be retrieved.
120 |
121 | ```js
122 | let glBinding = new XRWebGLBinding(xrSession, gl);
123 |
124 | let lightProbe = await xrSession.requestLightProbe();
125 | let glCubeMap = glBinding.getReflectionCubeMap(lightProbe);
126 |
127 | lightProbe.addEventListener('reflectionchange', () => {
128 | glCubeMap = glBinding.getReflectionCubeMap(lightProbe);
129 | });
130 | ```
131 |
132 | By default the cube map will be returned as an 8BPP sRGB texture. Some underlying runtimes may deliver the texture data in a different "native" format, however, such as high dynamic range formats. The session's preferred internal format for reflection maps is reported by `XRSession.preferredReflectionFormat`, which may alternately be specified when requesting the light probe. Querying cube maps using the preferred format ensures the minimal amount of conversion needs to happen, which in turn may be faster and experience less data loss. Passing any value other than `"srgba8"` or the session's `preferredReflectionFormat` to the `reflectionFormat` option of `requestLightProbe()` will cause the promise to be rejected.
133 |
134 | ```js
135 | let lightProbe = await xrSession.requestLightProbe({ reflectionFormat: xrSession.preferredReflectionFormat });
136 | ```
137 |
138 | If a reflection cube map is queried for a light probe using the `rgba16f` format from an `XRWebGLBinding` that uses a WebGL 1.0 context without the `OES_texture_half_float` extension enabled, the `getReflectionCubeMap()` call will fail. The `OES_texture_half_float` extension or a WebGL 2.0 context must be used in order for `rgba16f` cube maps to be returned.
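
A defensive capability check along those lines might look like this (a sketch; the helper name is ours):

```js
// Decide whether an "rgba16f" reflection cube map can be requested for the
// given WebGL context: WebGL 2.0 supports half-float textures natively,
// while WebGL 1.0 needs the OES_texture_half_float extension.
function canRequestRgba16f(gl) {
  const isWebGL2 = typeof WebGL2RenderingContext !== 'undefined' &&
                   gl instanceof WebGL2RenderingContext;
  return isWebGL2 || gl.getExtension('OES_texture_half_float') !== null;
}
```

The result can inform whether to pass `reflectionFormat: "rgba16f"` to `requestLightProbe()` or to stay with the default `"srgba8"`.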
139 |
140 | UAs may provide real-time reflection cube maps, captured by cameras or other sensors reporting high frequency spatial information. To access such real-time cube maps, the `camera` feature policy must be enabled for the origin. `XRWebGLBinding.getReflectionCubeMap()` should only return real-time cube maps following user consent equivalent to requesting access to the camera.
141 |
142 | UAs may provide a reflection cube map that was pre-created by the end user, which may differ from the environment while the `XRSession` is active. In particular, the user may choose to manually capture a reflection cube map at an earlier time when sensitive information or people are not present in the environment.
143 |
144 | #### Cube Map Open Questions:
145 | - Is there a size tolerance for the cube maps re: user consent? ARCore's cube maps are limited to 16px per side, which seems difficult to get sensitive information out of.
146 | - Should we handle mipmapping for the user? My gut and ARCore say no, but I'm not sure what ARKit does here either.
147 | - Should we allow for a single texture to be returned multiple times? Seems potentially fragile but may incur unnecessary readback on iOS.
148 |
149 | ### Temporal and Spatial Filtering
150 |
151 | Rapid changes to incoming light can provide information about a user's surroundings that can lead to fingerprinting and side-channel attacks on user privacy.
152 |
153 | As an example, a light switch can be flipped in a room, causing the lighting estimation of two users in the same room to simultaneously change. If the precise time of the change can be observed, it can be inferred that the two users are co-located in the same physical space.
154 |
155 | Another example occurs when a nearby display is playing a video, such as an advertisement. The light from the display reflects off many surfaces in the room, contributing to the observable ambient light estimate. A timeline of light intensity changes can uniquely identify the video that is playing, even if the monitor is not in direct line-of-sight to the XR device sensors.
156 |
157 | Temporal and spatial filtering of the light estimation can be used to mitigate such attacks. A low-pass filter effect can be achieved by averaging the values over the last several seconds. For scalar values representing light intensity or color, such as the components of `XRLightEstimate.primaryLightIntensity`, this can be applied directly with a box kernel. SHs have the convenient property that they can be summed and interpolated by simply interpolating their coefficients, assuming their orientation is not changing; the `XRLightEstimate.sphericalHarmonicsCoefficients` can therefore also be filtered as scalar values with a box kernel.
158 |
159 | When applying quantization, the values should be quantized before the box kernel is applied. Any vectors, such as `XRLightEstimate.primaryLightDirection`, should be quantized in 3D space and always return a unit vector.
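A minimal sketch of this mitigation pipeline (hypothetical helper names, not part of the proposed API): quantize each incoming sample, average the recent history with a box kernel, and re-normalize quantized direction vectors back onto the unit sphere:

```js
// Quantize a scalar to a fixed number of steps over [-maxValue, maxValue].
function quantize(value, steps, maxValue = 1.0) {
  const step = maxValue / steps;
  return Math.round(value / step) * step;
}

// Box kernel: a moving average over the last `windowSize` samples,
// producing the low-pass filter effect described above.
class BoxFilter {
  constructor(windowSize) {
    this.windowSize = windowSize;
    this.samples = [];
  }
  push(value) {
    this.samples.push(value);
    if (this.samples.length > this.windowSize) this.samples.shift();
    return this.samples.reduce((sum, v) => sum + v, 0) / this.samples.length;
  }
}

// Quantize a 3D direction in space and return a unit vector, as required
// for values like primaryLightDirection.
function quantizeDirection(dir, steps) {
  const q = {
    x: quantize(dir.x, steps),
    y: quantize(dir.y, steps),
    z: quantize(dir.z, steps),
  };
  const len = Math.hypot(q.x, q.y, q.z) || 1;
  return { x: q.x / len, y: q.y / len, z: q.z / len };
}
```

A UA might feed each raw intensity or SH coefficient sample through `quantize()` and then a per-component `BoxFilter` before exposing it to the page.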
160 |
161 | ## Appendix A: Proposed partial IDL
162 | This is a partial IDL and is considered additive to the core IDL found in the main [explainer](explainer.md).
163 |
164 | ```webidl
165 | enum XRReflectionFormat {
166 | "srgba8",
167 | "rgba16f",
168 | };
169 |
170 | dictionary XRLightProbeInit {
171 | XRReflectionFormat reflectionFormat = "srgba8";
172 | };
173 |
174 | partial interface XRSession {
175 |   Promise<XRLightProbe> requestLightProbe(optional XRLightProbeInit options = {});
176 | readonly attribute XRReflectionFormat preferredReflectionFormat;
177 | };
178 |
179 | partial interface XRFrame {
180 | XRLightEstimate? getLightEstimate(XRLightProbe lightProbe);
181 | };
182 |
183 | [SecureContext, Exposed=Window]
184 | interface XRLightProbe : EventTarget {
185 | readonly attribute XRSpace probeSpace;
186 | attribute EventHandler onreflectionchange;
187 | };
188 |
189 | [SecureContext, Exposed=Window]
190 | interface XRLightEstimate {
191 | readonly attribute Float32Array sphericalHarmonicsCoefficients;
192 | readonly attribute DOMPointReadOnly primaryLightDirection;
193 | readonly attribute DOMPointReadOnly primaryLightIntensity;
194 | };
195 |
196 | // See https://github.com/immersive-web/layers for definition.
197 | partial interface XRWebGLBinding {
198 | WebGLTexture? getReflectionCubeMap(XRLightProbe lightProbe);
199 | };
200 | ```
201 |
--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------
1 | {
2 | "name": "webxr-lighting-estimation-module",
3 | "description": "WebXR Lighting Estimation Module - Level 1",
4 | "version": "0.0.1",
5 | "scripts": {
6 | "build": "make"
7 | }
8 | }
9 |
--------------------------------------------------------------------------------
/security-privacy-questionnaire.md:
--------------------------------------------------------------------------------
1 | # Security and Privacy Questionnaire
2 |
3 | This document answers the [W3C Security and Privacy
4 | Questionnaire](https://www.w3.org/TR/security-privacy-questionnaire/) for the
5 | WebXR Lighting Estimation specification. Note that the Lighting Estimation feature is only exposed during an active WebXR Immersive AR session, and that many risks are shared with the [ambient light sensor API](https://www.w3.org/TR/ambient-light/#security-and-privacy).
6 |
7 | **What information might this feature expose to Web sites or other parties, and for what purposes is that exposure necessary?**
8 |
9 | This feature builds an understanding of the light sources in the user’s environment, as well as generating a low-resolution texture representing the various shapes that might be reflected in a shiny object placed at a specific spot in the scene.
10 |
11 | Sites would use this information in order to make a rendered object appear to fit more naturally into the user’s environment.
12 |
13 | **Is this specification exposing the minimum amount of information necessary to power the feature?**
14 |
15 | The specification allows exposing a texture representing the reflection map, and a set of spherical harmonics for the ambient light, as well as the color and direction of the dominant light. A series of mitigations, including temporal and spatial filtering and quantization, are [recommended](https://github.com/immersive-web/lighting-estimation/blob/main/lighting-estimation-explainer.md#temporal-and-spatial-filtering).
16 |
17 | **How does this specification deal with personal information or personally-identifiable information or information derived thereof?**
18 |
19 | There is no direct PII exposed by this specification. The mapping of the user's environment and the knowledge that two users may be in the same space are the only information that could be derived via this API. Other WebXR features (notably hit-test) already potentially expose information about the user’s environment, and the potential to identify two users as being in the same space can be mitigated with the temporal filtering mentioned above.
20 |
21 | **How does this specification deal with sensitive information?**
22 |
23 | The specification allows a user agent to restrict the quality of the lighting information provided to a page, as well as the resolution of the texture for the reflection map. In addition, the specification encourages UAs not to update these values in real time; this helps to mitigate the potential to determine that two users are in the same physical location.
24 |
25 | The lighting estimation feature must be requested before starting a WebXR session, which allows the user agent to show a prompt specifically requesting user approval.
26 |
27 | It’s worth noting that the WebXR Hit Test and Depth Modules already expose more environment information than lighting estimation does.
28 |
29 | **Does this specification introduce new state for an origin that persists across browsing sessions?**
30 |
31 | No.
32 |
33 | **What information from the underlying platform, e.g. configuration data, is exposed by this specification to an origin?**
34 |
35 | None.
36 |
37 | **Does this specification allow an origin access to sensors on a user’s device?**
38 |
39 | No. However, in order to return lighting estimation data, the platform may use various sensors. The origin never has direct access to these sensors as part of the specification.
40 |
41 | **What data does this specification expose to an origin? Please also document what data is identical to data exposed by other features, in the same or different contexts.**
42 |
43 | This specification isn't directly exposing any data to the origin but can be used to get information about the user's physical environment.
44 |
45 | **Does this specification enable new script execution/loading mechanisms?**
46 |
47 | No.
48 |
49 | **Does this specification allow an origin to access other devices?**
50 |
51 | No.
52 |
53 | **Does this specification allow an origin some measure of control over a user agent’s native UI?**
54 |
55 | No.
56 |
57 | **What temporary identifiers might this specification create or expose to the web?**
58 |
59 | None.
60 |
61 | **How does this specification distinguish between behavior in first-party and third-party contexts?**
62 |
63 | It is an extension to WebXR, which is by default blocked for third-party contexts and can be controlled via a Feature Policy flag.
64 |
65 | **How does this specification work in the context of a user agent’s Private Browsing or "incognito" mode?**
66 |
67 | The specification does not mandate a different behavior.
68 |
69 | **Does this specification have a "Security Considerations" and "Privacy Considerations" section?**
70 |
71 | Yes, there is a section in the spec. Additionally, the [explainer](https://github.com/immersive-web/lighting-estimation/blob/main/lighting-estimation-explainer.md#security-implications) goes into further detail.
72 |
73 | **Does this specification allow downgrading default security characteristics?**
74 |
75 | No.
76 |
77 | **What should this questionnaire have asked?**
78 |
79 | N/A.
80 |
--------------------------------------------------------------------------------
/w3c.json:
--------------------------------------------------------------------------------
1 | {
2 | "group": 109735
3 | , "contacts": [ "himorin" ]
4 | , "repo-type": "rec-track"
5 | }
6 |
--------------------------------------------------------------------------------