├── .github └── workflows │ └── auto-publish.yml ├── .gitignore ├── .pr-preview.json ├── CODE_OF_CONDUCT.md ├── CONTRIBUTING.md ├── LICENSE.md ├── README.md ├── images ├── absolute_orientation_sensor_coordinate_system.png ├── absolute_orientation_sensor_coordinate_system.svg └── quaternion_to_rotation_matrix.png ├── index.bs ├── releases ├── FPWD.bs ├── FPWD.html └── images │ ├── absolute_orientation_sensor_coordinate_system.png │ ├── absolute_orientation_sensor_coordinate_system.svg │ └── quaternion_to_rotation_matrix.png ├── security-questionnaire.md └── w3c.json /.github/workflows/auto-publish.yml: -------------------------------------------------------------------------------- 1 | # Workflow based on the w3c/spec-prod action example to deploy to W3C using Echidna: 2 | # https://github.com/w3c/spec-prod/blob/main/docs/examples.md#deploy-to-w3c-using-echidna 3 | 4 | name: CI 5 | 6 | on: 7 | pull_request: {} 8 | push: 9 | branches: [main] 10 | 11 | jobs: 12 | main: 13 | name: Build, Validate and Deploy 14 | runs-on: ubuntu-20.04 15 | steps: 16 | - uses: actions/checkout@v2 17 | - uses: w3c/spec-prod@v2 18 | with: 19 | GH_PAGES_BRANCH: gh-pages 20 | W3C_ECHIDNA_TOKEN: ${{ secrets.ECHIDNA_TOKEN }} 21 | # Replace following with appropriate value. See options.md for details. 
22 | W3C_WG_DECISION_URL: https://lists.w3.org/Archives/Public/public-device-apis/2021May/0008.html 23 | W3C_BUILD_OVERRIDE: | 24 | status: WD 25 | shortname: orientation-sensor 26 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | deploy_key -------------------------------------------------------------------------------- /.pr-preview.json: -------------------------------------------------------------------------------- 1 | { 2 | "src_file": "index.bs", 3 | "type": "bikeshed", 4 | "params": { 5 | "force": 1 6 | } 7 | } 8 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Code of Conduct 2 | 3 | All documentation, code and communication under this repository are covered by the [W3C Code of Ethics and Professional Conduct](https://www.w3.org/Consortium/cepc/). 4 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | Contributions to this repository are intended to become part of Recommendation-track documents governed by the 2 | [W3C Patent Policy](https://www.w3.org/Consortium/Patent-Policy/) and 3 | [Software and Document License](https://www.w3.org/Consortium/Legal/copyright-software). 4 | To make substantive contributions to specifications, you must either participate 5 | in the relevant W3C Working Group or make a non-member patent licensing commitment. 6 | 7 | If you are not the sole contributor to a contribution (pull request), please identify all 8 | contributors in the pull request comment.
9 | 10 | To add a contributor (other than yourself, that's automatic), mark them one per line as follows: 11 | 12 | ``` 13 | +@github_username 14 | ``` 15 | 16 | If you added a contributor by mistake, you can remove them in a comment with: 17 | 18 | ``` 19 | -@github_username 20 | ``` 21 | 22 | If you are making a pull request on behalf of someone else but you had no part in designing the 23 | feature, you can remove yourself with the above syntax. 24 | 25 | # Tests 26 | 27 | For normative changes, a corresponding 28 | [web-platform-tests](https://github.com/web-platform-tests/wpt) PR is highly appreciated. Typically, 29 | both PRs will be merged at the same time. Note that a test change that contradicts the spec should 30 | not be merged before the corresponding spec change. If testing is not practical, please explain why 31 | and if appropriate [file a web-platform-tests issue](https://github.com/web-platform-tests/wpt/issues/new) 32 | to follow up later. Add the `type:untestable` or `type:missing-coverage` label as appropriate. 33 | -------------------------------------------------------------------------------- /LICENSE.md: -------------------------------------------------------------------------------- 1 | All documents in this Repository are licensed by contributors 2 | under the 3 | [W3C Software and Document License](https://www.w3.org/Consortium/Legal/copyright-software). 4 | 5 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## Orientation Sensor 2 | 3 | This repository contains the 4 | [Orientation Sensor](https://w3c.github.io/orientation-sensor/) 5 | specification that is being worked on in the 6 | [W3C Device and Sensors Working Group](http://www.w3.org/2009/dap/).
7 | -------------------------------------------------------------------------------- /images/absolute_orientation_sensor_coordinate_system.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/w3c/orientation-sensor/7cb6cae81cd90bfc590e3c96be120598461f419c/images/absolute_orientation_sensor_coordinate_system.png -------------------------------------------------------------------------------- /images/quaternion_to_rotation_matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/w3c/orientation-sensor/7cb6cae81cd90bfc590e3c96be120598461f419c/images/quaternion_to_rotation_matrix.png -------------------------------------------------------------------------------- /index.bs: -------------------------------------------------------------------------------- 1 |
  2 | Title: Orientation Sensor
  3 | Status: ED
  4 | Level: none
  5 | ED: https://w3c.github.io/orientation-sensor/
  6 | Shortname: orientation-sensor
  7 | TR: https://www.w3.org/TR/orientation-sensor/
  8 | Editor: Kenneth Rohde Christiansen 57705, Intel Corporation, https://intel.com/
  9 | Editor: Anssi Kostiainen 41974, Intel Corporation, https://intel.com/
 10 | Former Editor: Mikhail Pozdnyakov 78325, Intel Corporation, https://intel.com/
 11 | Former Editor: Alexander Shalamov 78335, Intel Corporation, https://intel.com/
 12 | Group: dap
 13 | Abstract:
 14 |   This specification defines a base orientation sensor interface and concrete sensor subclasses to monitor the device's
 15 |   physical orientation in relation to a stationary three dimensional Cartesian coordinate system.
 16 | Status Text:
 17 |   The Devices and Sensors Working Group is pursuing modern security and privacy
 18 |   reviews for this specification in consideration of the amount of change in both
 19 |   this specification and in privacy and security review practices since the
 20 |   horizontal reviews took place
 22 |   on 14 October 2019. Similarly, the group is pursuing an update to the Technical
 23 |   Architecture Group review for this specification to account for the latest
 24 |   architectural review practices.
 25 | Version History: https://github.com/w3c/orientation-sensor/commits/main/index.bs
 26 | Issue Tracking: Orientation Sensor Issues Repository https://github.com/w3c/orientation-sensor/issues
 27 | Indent: 2
 28 | Repository: w3c/orientation-sensor
 29 | Markup Shorthands: markdown on
 30 | Inline Github Issues: true
 31 | !Test Suite: web-platform-tests on GitHub
 32 | Boilerplate: omit issues-index, omit conformance, repository-issue-tracking no
 33 | Include MDN Panels: if possible
 34 | Ignored Vars: sensor_instance, targetMatrix, x, y, z, w
 35 | 
36 |
 37 | urlPrefix: https://w3c.github.io/sensors/; spec: GENERIC-SENSOR
 38 |   type: dfn
 39 |     text: activated
 40 |     text: construct a sensor object; url: construct-sensor-object
 41 |     text: initialize a sensor object; url: initialize-a-sensor-object
 42 |     text: default sensor
 43 |     text: high-level
 44 |     text: low-level
 45 |     text: latest reading
 46 |     text: sensor-fusion
 47 |     text: sensor type
 48 |     text: local coordinate system
 49 |     text: check sensor policy-controlled features; url: check-sensor-policy-controlled-features
 50 |     text: supported sensor options
 51 | urlPrefix: https://w3c.github.io/accelerometer/; spec: ACCELEROMETER
 52 |   type: dfn
 53 |     text: acceleration
 54 |     text: device coordinate system
 55 |     text: screen coordinate system
 56 |   type: interface
 57 |     text: Accelerometer; url: accelerometer
 58 | 
59 | 60 |
 61 | urlPrefix: https://w3c.github.io/gyroscope/; spec: GYROSCOPE
 62 |   type: dfn
 63 |     text: angular velocity
 64 |   type: interface
 65 |     text: Gyroscope; url: gyroscope
 66 | 
67 | 68 |
 69 | urlPrefix: https://w3c.github.io/magnetometer/; spec: MAGNETOMETER
 70 |   type: dfn
 71 |     text: magnetic field
 72 |   type: interface
 73 |     text: Magnetometer; url: magnetometer
 74 | 
75 | 76 |
 77 | urlPrefix: https://www.w3.org/TR/2016/CR-orientation-event-20160818/; spec: DEVICEORIENTATION
 78 |   type: interface
 79 |     text: DeviceOrientationEvent; url: deviceorientation_event
 80 | 
81 | 82 | 93 | 94 |
 95 | {
 96 |     "QUATERNIONS": {
 97 |         "id": "QUATERNIONS",
 98 |         "href": "https://en.wikipedia.org/wiki/Quaternion",
 99 |         "title": "Quaternion",
100 |         "publisher": "Wikipedia"
101 |     },
102 |     "QUATCONV": {
103 |         "authors": [
104 |             "Watt, Alan H., and Mark Watt."
105 |         ],
106 |         "id": "QUATCONV",
107 |         "href": "https://www.cs.cmu.edu/afs/cs/academic/class/15462-s14/www/lec_slides/3DRotationNotes.pdf",
108 |         "title": "Advanced animation and rendering techniques., page 362",
109 |         "date": "1992",
110 |         "status": "Informational",
111 |         "publisher": "New York, NY, USA:: ACM press"
112 |     }
113 | }
114 | 
115 | 116 | Introduction {#intro} 117 | ============ 118 | 119 | The Orientation Sensor API extends the Generic Sensor API [[GENERIC-SENSOR]] 120 | to provide generic information describing the device's physical orientation 121 | in relation to a three dimensional Cartesian coordinate system. 122 | 123 | The {{AbsoluteOrientationSensor}} class inherits from the {{OrientationSensor}} interface and 124 | describes the device's physical orientation in relation to the Earth's reference coordinate system. 125 | 126 | Other subclasses describe the orientation in relation to other stationary 127 | directions, such as true north, or non-stationary directions, such as in 128 | relation to a device's own z-position, drifting towards its latest most stable 129 | z-position. 130 | 131 | The data provided by the {{OrientationSensor}} subclasses are similar to data from 132 | {{DeviceOrientationEvent}}, but the Orientation Sensor API has the following significant differences: 133 | 1. The Orientation Sensor API represents orientation data in WebGL-compatible formats (quaternion, rotation matrix). 134 | 1. The Orientation Sensor API satisfies stricter latency requirements. 135 | 1. Unlike {{DeviceOrientationEvent}}, the {{OrientationSensor}} subclasses explicitly define which [=low-level=] 136 | motion sensors are used to obtain the orientation data, thus obviating possible interoperability issues. 137 | 1. Instances of {{OrientationSensor}} subclasses are configurable via the {{SensorOptions}} constructor parameter. 138 | 139 | Use Cases and Requirements {#usecases-requirements} 140 | ============================== 141 | 142 | The use cases and requirements are discussed in the 143 | Motion Sensors Explainer document. 144 | 145 | Examples {#examples} 146 | ======== 147 | 148 |
149 |
150 |     const sensor = new AbsoluteOrientationSensor();
151 |     const mat4 = new Float32Array(16);
152 |     sensor.start();
153 |     sensor.onerror = event => console.log(event.error.name, event.error.message);
154 | 
155 |     sensor.onreading = () => {
156 |       sensor.populateMatrix(mat4);
157 |     };
158 |     
159 |
160 | 161 |
162 |
163 |     const sensor = new AbsoluteOrientationSensor({ frequency: 60 });
164 |     const mat4 = new Float32Array(16);
165 |     sensor.start();
166 |     sensor.onerror = event => console.log(event.error.name, event.error.message);
167 | 
168 |     function draw(timestamp) {
169 |       window.requestAnimationFrame(draw);
170 |       try {
171 |         sensor.populateMatrix(mat4);
172 |       } catch(e) {
173 |         // mat4 has not been updated.
174 |       }
175 |       // Drawing...
176 |     }
177 | 
178 |     window.requestAnimationFrame(draw);
179 |     
180 |
181 | 182 | Security and Privacy Considerations {#security-and-privacy} 183 | =================================== 184 | 185 | There are no specific security and privacy considerations 186 | beyond those described in the Generic Sensor API [[!GENERIC-SENSOR]]. 187 | 188 | Model {#model} 189 | ===== 190 | 191 | The {{OrientationSensor}} class extends the {{Sensor}} class and provides a generic interface 192 | representing device orientation data. 193 | 194 | To access the Orientation Sensor sensor type's [=latest reading=], the user agent must invoke the [=request sensor access=] abstract operation for each of the [=low-level=] sensors used by the concrete orientation sensor. The table below 195 | describes the mapping between concrete orientation sensors and the permission tokens defined by the [=low-level=] sensors. 196 | 197 | 198 | 199 | 200 | 201 | 202 | 203 | 204 | 205 | 206 | 207 | 208 | 209 | 210 | 211 | 212 | 213 | 214 |
OrientationSensor subclassPermission tokens
{{AbsoluteOrientationSensor}}"accelerometer", "gyroscope", "magnetometer"
{{RelativeOrientationSensor}}"accelerometer", "gyroscope"
215 | 216 | The {{AbsoluteOrientationSensor}} is a [=policy-controlled feature=] identified by the strings "accelerometer", "gyroscope" and "magnetometer". Its [=default allowlist=] is `'self'`. 217 | 218 | The {{RelativeOrientationSensor}} is a [=policy-controlled feature=] identified by the strings "accelerometer" and "gyroscope". Its [=default allowlist=] is `'self'`. 219 | 220 | A [=latest reading=] for a {{Sensor}} of Orientation Sensor sensor type includes an [=map/entry=] 221 | whose [=map/key=] is "quaternion" and whose [=map/value=] contains a four-element [=list=]. 222 | The elements of the [=list=] are equal to the components of a unit quaternion [[QUATERNIONS]] 223 | [Vx * sin(θ/2), Vy * sin(θ/2), Vz * sin(θ/2), cos(θ/2)], where V is 224 | the unit vector (whose elements are Vx, Vy, and Vz) representing the axis of rotation, and θ is the rotation angle about the axis defined by the unit vector V. 225 | 226 | Note: The quaternion components are arranged in the [=list=] as [q1, q2, q3, q0] 227 | [[QUATERNIONS]], i.e. the components representing the vector part of the quaternion go first and the scalar part component, which 228 | is equal to cos(θ/2), goes last. This order is used for better compatibility with most of the existing WebGL frameworks; 229 | however, other libraries could use a different order when exposing a quaternion as an array, e.g. [q0, q1, 230 | q2, q3]. 231 | 232 | The concrete {{OrientationSensor}} subclasses that are created through [=sensor-fusion=] of the 233 | [=low-level=] motion sensors are presented in the table below: 234 | 235 | 236 | 237 | 238 | 239 | 240 | 241 | 242 | 243 | 244 | 245 | 246 | 247 | 248 | 249 | 250 | 251 | 252 |
OrientationSensor subclass[=low-level|Low-level=] motion sensors
{{AbsoluteOrientationSensor}}{{Accelerometer}}, {{Gyroscope}}, {{Magnetometer}}
{{RelativeOrientationSensor}}{{Accelerometer}}, {{Gyroscope}}
253 | 254 | Note: The {{Accelerometer}}, {{Gyroscope}} and {{Magnetometer}} [=low-level=] sensors are defined in 255 | the [[ACCELEROMETER]], [[GYROSCOPE]], and [[MAGNETOMETER]] specifications, respectively. The 256 | [=sensor-fusion|sensor fusion=] is platform-specific and can happen in software or in hardware, e.g. 257 | on a sensor hub. 258 | 259 |
260 | This example code explicitly queries permissions for {{AbsoluteOrientationSensor}} before 261 | calling {{Sensor/start()}}. 262 |
263 |     const sensor = new AbsoluteOrientationSensor();
264 |     Promise.all([navigator.permissions.query({ name: "accelerometer" }),
265 |                  navigator.permissions.query({ name: "magnetometer" }),
266 |                  navigator.permissions.query({ name: "gyroscope" })])
267 |            .then(results => {
268 |                  if (results.every(result => result.state === "granted")) {
269 |                    sensor.start();
270 |                    ...
271 |                  } else {
272 |                    console.log("No permissions to use AbsoluteOrientationSensor.");
273 |                  }
274 |            });
275 |     
276 | 277 | Another approach is to simply call {{Sensor/start()}} and subscribe to the 278 | {{Sensor/onerror}} [=event handler=]. 279 | 280 |
281 |     const sensor = new AbsoluteOrientationSensor();
282 |     sensor.onerror = event => {
283 |       if (event.error.name === 'NotAllowedError')
284 |         console.log("No permissions to use AbsoluteOrientationSensor.");
285 |     };
286 |     sensor.start();
287 |     
288 |
289 | 290 | The AbsoluteOrientationSensor Model {#absoluteorientationsensor-model} 291 | ---------------------------------------------------------------------- 292 | 293 | The Absolute Orientation Sensor [=sensor type=] represents the sensor described in [[MOTION-SENSORS#absolute-orientation]]. Its associated [=extension sensor interface=] is {{AbsoluteOrientationSensor}}, a subclass of {{OrientationSensor}}. Its associated [=virtual sensor type=] is 294 | "absolute-orientation". 295 | 296 | For the absolute orientation sensor, the value of [=latest reading=]["quaternion"] represents 297 | the rotation of a device's [=local coordinate system=] in relation to the Earth's reference 298 | coordinate system defined as a three dimensional Cartesian coordinate system (x, y, z), where: 299 | 300 | - the x-axis is the vector product y × z; it is tangential to the ground and points east, 301 | - the y-axis is tangential to the ground and points towards magnetic north, and 302 | - the z-axis points towards the sky and is perpendicular to the plane made up of the x and y axes. 303 | 304 | The device's [=local coordinate system=] is the same as defined for the [=low-level=] 305 | motion sensors. It can be either the [=device coordinate system=] or the 306 | [=screen coordinate system=]. 307 | 308 | Note: The figure below represents the case where the device's [=local coordinate system=] and the Earth's reference coordinate system are aligned; therefore, the 309 | orientation sensor's [=latest reading=] would represent a rotation of 0 (rad) [[SI]] about each axis. 310 | 311 | AbsoluteOrientationSensor coordinate system. 312 | 313 | The RelativeOrientationSensor Model {#relativeorientationsensor-model} 314 | ---------------------------------------------------------------------- 315 | 316 | The Relative Orientation Sensor [=sensor type=] represents the sensor described in [[MOTION-SENSORS#relative-orientation]].
Its associated [=extension sensor interface=] is {{RelativeOrientationSensor}}, a subclass of {{OrientationSensor}}. Its associated [=virtual sensor type=] is 317 | "relative-orientation". 318 | 319 | For the relative orientation sensor, the value of [=latest reading=]["quaternion"] represents the 320 | rotation of a device's [=local coordinate system=] in relation to a [=stationary reference coordinate 321 | system=]. The [=stationary reference coordinate system=] may drift due to the bias introduced by 322 | the gyroscope sensor; thus, the rotation value provided by the sensor may drift over time. 323 | 324 | The stationary reference coordinate system is defined as an inertial three dimensional 325 | Cartesian coordinate system that remains stationary as the device hosting the sensor moves through 326 | the environment. 327 | 328 | The device's [=local coordinate system=] is the same as defined for the [=low-level=] 329 | motion sensors. It can be either the [=device coordinate system=] or the 330 | [=screen coordinate system=]. 331 | 332 | Note: The relative orientation sensor data could be more accurate than that provided by the absolute 333 | orientation sensor, as the sensor is not affected by magnetic fields. 334 | 335 | API {#api} 336 | === 337 | 338 | The OrientationSensor Interface {#orientationsensor-interface} 339 | ------------------------------------- 340 | 341 |
342 |   typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;
343 | 
344 |   [SecureContext, Exposed=Window]
345 |   interface OrientationSensor : Sensor {
346 |     readonly attribute FrozenArray<double>? quaternion;
347 |     undefined populateMatrix(RotationMatrixType targetMatrix);
348 |   };
349 | 
350 |   enum OrientationSensorLocalCoordinateSystem { "device", "screen" };
351 | 
352 |   dictionary OrientationSensorOptions : SensorOptions {
353 |     OrientationSensorLocalCoordinateSystem referenceFrame = "device";
354 |   };
355 | 
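Given the IDL above, the quaternion attribute exposes the [q1, q2, q3, q0] list described in the Model section, i.e. the vector part first and the scalar part (cos(θ/2)) last. As a non-normative sketch, the rotation angle and axis can be recovered from such a value as follows (the helper name `quaternionToAxisAngle` is hypothetical and not part of this API):

```javascript
// Recovers the rotation angle (in radians) and the unit axis of
// rotation from a quaternion given in the [x, y, z, w] order used
// by this API, where w equals cos(θ/2).
function quaternionToAxisAngle([x, y, z, w]) {
  const angle = 2 * Math.acos(w);
  const s = Math.sqrt(1 - w * w); // sin(θ/2)
  // For θ ≈ 0 the axis is arbitrary; pick z to avoid dividing by zero.
  const axis = s < 1e-9 ? [0, 0, 1] : [x / s, y / s, z / s];
  return { angle, axis };
}
```

For example, a 90-degree rotation about the z-axis corresponds to the quaternion [0, 0, sin(π/4), cos(π/4)].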
356 | 357 | ### OrientationSensor.quaternion ### {#orientationsensor-quaternion} 358 | 359 | Returns a four-element {{FrozenArray}} whose elements contain the components 360 | of the unit quaternion representing the device orientation. 361 | In other words, this attribute returns the result of invoking 362 | [=get value from latest reading=] with this 363 | and "quaternion" as arguments. 364 | 365 | ### OrientationSensor.populateMatrix() ### {#orientationsensor-populatematrix} 366 | 367 |
368 | The {{OrientationSensor/populateMatrix(targetMatrix)}} method steps are: 369 | 1. If |targetMatrix| is of type {{Float32Array}} or {{Float64Array}} with a size less than sixteen, [=throw=] a 370 | "{{TypeError!!exception}}" exception and abort these steps. 371 | 1. Let |quaternion| be the result of invoking [=get value from latest reading=] with [=this=] 372 | and "quaternion" as arguments. 373 | 1. If |quaternion| is `null`, [=throw=] a "{{NotReadableError!!exception}}" {{DOMException}} and abort these steps. 374 | 1. Let |rotationMatrix| be the result of [=converting a quaternion to rotation matrix=] with |quaternion|[0], |quaternion|[1], |quaternion|[2], and |quaternion|[3]. 375 | 1. If |targetMatrix| is of {{Float32Array}} or {{Float64Array}} type, run these sub-steps: 376 | 1. Set |targetMatrix|[0] = |rotationMatrix|[0] 377 | 1. Set |targetMatrix|[1] = |rotationMatrix|[1] 378 | 1. Set |targetMatrix|[2] = |rotationMatrix|[2] 379 | 1. Set |targetMatrix|[3] = |rotationMatrix|[3] 380 | 1. Set |targetMatrix|[4] = |rotationMatrix|[4] 381 | 1. Set |targetMatrix|[5] = |rotationMatrix|[5] 382 | 1. Set |targetMatrix|[6] = |rotationMatrix|[6] 383 | 1. Set |targetMatrix|[7] = |rotationMatrix|[7] 384 | 1. Set |targetMatrix|[8] = |rotationMatrix|[8] 385 | 1. Set |targetMatrix|[9] = |rotationMatrix|[9] 386 | 1. Set |targetMatrix|[10] = |rotationMatrix|[10] 387 | 1. Set |targetMatrix|[11] = |rotationMatrix|[11] 388 | 1. Set |targetMatrix|[12] = |rotationMatrix|[12] 389 | 1. Set |targetMatrix|[13] = |rotationMatrix|[13] 390 | 1. Set |targetMatrix|[14] = |rotationMatrix|[14] 391 | 1. Set |targetMatrix|[15] = |rotationMatrix|[15] 392 | 1. If |targetMatrix| is of {{DOMMatrix}} type, run these sub-steps: 393 | 1. Set |targetMatrix|.m11 = |rotationMatrix|[0] 394 | 1. Set |targetMatrix|.m12 = |rotationMatrix|[1] 395 | 1. Set |targetMatrix|.m13 = |rotationMatrix|[2] 396 | 1. Set |targetMatrix|.m14 = |rotationMatrix|[3] 397 | 1. 
Set |targetMatrix|.m21 = |rotationMatrix|[4] 398 | 1. Set |targetMatrix|.m22 = |rotationMatrix|[5] 399 | 1. Set |targetMatrix|.m23 = |rotationMatrix|[6] 400 | 1. Set |targetMatrix|.m24 = |rotationMatrix|[7] 401 | 1. Set |targetMatrix|.m31 = |rotationMatrix|[8] 402 | 1. Set |targetMatrix|.m32 = |rotationMatrix|[9] 403 | 1. Set |targetMatrix|.m33 = |rotationMatrix|[10] 404 | 1. Set |targetMatrix|.m34 = |rotationMatrix|[11] 405 | 1. Set |targetMatrix|.m41 = |rotationMatrix|[12] 406 | 1. Set |targetMatrix|.m42 = |rotationMatrix|[13] 407 | 1. Set |targetMatrix|.m43 = |rotationMatrix|[14] 408 | 1. Set |targetMatrix|.m44 = |rotationMatrix|[15] 409 |
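The steps above can be sketched as a stand-alone JavaScript function (a simplified, hypothetical illustration; in the API the rotation matrix is derived from the [=latest reading=] rather than passed in as an argument):

```javascript
// Simplified sketch of the populateMatrix() steps: validates a typed
// array target and copies the 16-element rotation matrix list into it,
// or assigns the m11…m44 members of a DOMMatrix-like object.
function populateMatrix(targetMatrix, rotationMatrix) {
  if (targetMatrix instanceof Float32Array ||
      targetMatrix instanceof Float64Array) {
    if (targetMatrix.length < 16) {
      throw new TypeError("targetMatrix must have at least 16 elements");
    }
    for (let i = 0; i < 16; i++) {
      targetMatrix[i] = rotationMatrix[i];
    }
  } else {
    // DOMMatrix-like target: assign the members in the same order.
    const members = ["m11", "m12", "m13", "m14", "m21", "m22", "m23", "m24",
                     "m31", "m32", "m33", "m34", "m41", "m42", "m43", "m44"];
    members.forEach((member, i) => { targetMatrix[member] = rotationMatrix[i]; });
  }
  return targetMatrix;
}
```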
410 | 411 | 412 | The AbsoluteOrientationSensor Interface {#absoluteorientationsensor-interface} 413 | ------------------------------------- 414 | 415 |
416 |   [SecureContext, Exposed=Window]
417 |   interface AbsoluteOrientationSensor : OrientationSensor {
418 |     constructor(optional OrientationSensorOptions sensorOptions = {});
419 |   };
420 | 
421 | 422 | To construct an {{AbsoluteOrientationSensor}} object the user agent must invoke the 423 | [=construct an orientation sensor object=] abstract operation for the {{AbsoluteOrientationSensor}} 424 | interface. 425 | 426 | [=Supported sensor options=] for {{AbsoluteOrientationSensor}} are 427 | "frequency" and "referenceFrame". 428 | 429 | The RelativeOrientationSensor Interface {#relativeorientationsensor-interface} 430 | ------------------------------------- 431 | 432 |
433 |   [SecureContext, Exposed=Window]
434 |   interface RelativeOrientationSensor : OrientationSensor {
435 |     constructor(optional OrientationSensorOptions sensorOptions = {});
436 |   };
437 | 
438 | 439 | To construct a {{RelativeOrientationSensor}} object the user agent must invoke the 440 | [=construct an orientation sensor object=] abstract operation for the {{RelativeOrientationSensor}} 441 | interface. 442 | 443 | [=Supported sensor options=] for {{RelativeOrientationSensor}} are 444 | "frequency" and "referenceFrame". 445 | 446 | Abstract Operations {#abstract-operations} 447 | =================== 448 | 449 |

Construct an Orientation Sensor object

450 | 451 |
452 | 453 | : input 454 | :: |orientation_interface|, an [=interface=] [=identifier=] whose [=inherited interfaces=] contains {{OrientationSensor}}. 455 | :: |options|, a {{OrientationSensorOptions}} object. 456 | : output 457 | :: An {{OrientationSensor}} object. 458 | 459 | 1. Let |allowed| be the result of invoking [=check sensor policy-controlled features=] 460 | with the [=interface=] identified by |orientation_interface|. 461 | 1. If |allowed| is false, then: 462 | 1. [=Throw=] a {{SecurityError}} {{DOMException}}. 463 | 1. Let |orientation| be a new instance of the [=interface=] identified by |orientation_interface|. 464 | 1. Invoke [=initialize a sensor object=] with |orientation| and |options|. 465 | 1. If |options|.{{referenceFrame!!dict-member}} is "screen", then: 466 | 1. Define [=local coordinate system=] for |orientation| 467 | as the [=screen coordinate system=]. 468 | 1. Otherwise, define [=local coordinate system=] for |orientation| 469 | as the [=device coordinate system=]. 470 | 1. Return |orientation|. 471 |
472 | 473 |

Convert quaternion to rotation matrix

474 | 475 | The [=convert a quaternion to rotation matrix=] algorithm creates a [=list=] representation of a rotation matrix in column-major order converted from a quaternion [[QUATCONV]], as shown below: 476 | 477 | Converting quaternion to rotation matrix. 478 | 479 | where: 480 | 481 | - W = cos(θ/2) 482 | - X = Vx * sin(θ/2) 483 | - Y = Vy * sin(θ/2) 484 | - Z = Vz * sin(θ/2) 485 | 486 |
487 | To convert a quaternion to rotation matrix given a number |x|, a number |y|, a number |z|, and a number |w|: 488 | 489 | 1. Let |m11| be 1 - 2 * y * y - 2 * z * z 490 | 1. Let |m12| be 2 * x * y - 2 * z * w 491 | 1. Let |m13| be 2 * x * z + 2 * y * w 492 | 1. Let |m14| be 0 493 | 1. Let |m21| be 2 * x * y + 2 * z * w 494 | 1. Let |m22| be 1 - 2 * x * x - 2 * z * z 495 | 1. Let |m23| be 2 * y * z - 2 * x * w 496 | 1. Let |m24| be 0 497 | 1. Let |m31| be 2 * x * z - 2 * y * w 498 | 1. Let |m32| be 2 * y * z + 2 * x * w 499 | 1. Let |m33| be 1 - 2 * x * x - 2 * y * y 500 | 1. Let |m34| be 0 501 | 1. Let |m41| be 0 502 | 1. Let |m42| be 0 503 | 1. Let |m43| be 0 504 | 1. Let |m44| be 1 505 | 1. Return « |m11|, |m12|, |m13|, |m14|, |m21|, |m22|, |m23|, |m24|, |m31|, |m32|, |m33|, |m34|, |m41|, |m42|, |m43|, |m44| ». 506 | 507 |
508 | 509 |
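A direct, non-normative JavaScript transcription of the algorithm above (the function name is hypothetical):

```javascript
// Returns the 16-element rotation matrix list « m11, m12, …, m44 »
// for a unit quaternion (x, y, z, w), following the algorithm steps
// above term by term.
function quaternionToRotationMatrix(x, y, z, w) {
  return [
    1 - 2 * y * y - 2 * z * z, 2 * x * y - 2 * z * w, 2 * x * z + 2 * y * w, 0,
    2 * x * y + 2 * z * w, 1 - 2 * x * x - 2 * z * z, 2 * y * z - 2 * x * w, 0,
    2 * x * z - 2 * y * w, 2 * y * z + 2 * x * w, 1 - 2 * x * x - 2 * y * y, 0,
    0, 0, 0, 1
  ];
}
```

The identity quaternion (0, 0, 0, 1) yields the identity matrix.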

Create a quaternion from Euler angles

510 | 511 |
512 | 516 | 517 | To create a quaternion from Euler angles given a number |alpha|, a number |beta| and a number |gamma|: 518 | 519 | 1. Let |alphaInRadians| be |alpha| converted from degrees to radians. 520 | 1. Let |betaInRadians| be |beta| converted from degrees to radians. 521 | 1. Let |gammaInRadians| be |gamma| converted from degrees to radians. 522 | 1. Let |cosZ| be the cosine of (0.5 * |alphaInRadians|). 523 | 1. Let |sinZ| be the sine of (0.5 * |alphaInRadians|). 524 | 1. Let |cosX| be the cosine of (0.5 * |betaInRadians|). 525 | 1. Let |sinX| be the sine of (0.5 * |betaInRadians|). 526 | 1. Let |cosY| be the cosine of (0.5 * |gammaInRadians|). 527 | 1. Let |sinY| be the sine of (0.5 * |gammaInRadians|). 528 | 1. Let |quaternionX| be (|sinX| * |cosY| * |cosZ| - |cosX| * |sinY| * |sinZ|). 529 | 1. Let |quaternionY| be (|cosX| * |sinY| * |cosZ| + |sinX| * |cosY| * |sinZ|). 530 | 1. Let |quaternionZ| be (|cosX| * |cosY| * |sinZ| + |sinX| * |sinY| * |cosZ|). 531 | 1. Let |quaternionW| be (|cosX| * |cosY| * |cosZ| - |sinX| * |sinY| * |sinZ|). 532 | 1. Return « |quaternionX|, |quaternionY|, |quaternionZ|, |quaternionW| ». 533 | 534 |
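The steps above correspond to the following non-normative JavaScript sketch (the function name is hypothetical; |alpha|, |beta| and |gamma| are in degrees, as in {{DeviceOrientationEvent}}):

```javascript
// Implements the "create a quaternion from Euler angles" steps:
// converts the angles to radians and combines the half-angle sines
// and cosines into a quaternion in [x, y, z, w] order.
function createQuaternionFromEulerAngles(alpha, beta, gamma) {
  const toRadians = Math.PI / 180;
  const cosZ = Math.cos(0.5 * alpha * toRadians);
  const sinZ = Math.sin(0.5 * alpha * toRadians);
  const cosX = Math.cos(0.5 * beta * toRadians);
  const sinX = Math.sin(0.5 * beta * toRadians);
  const cosY = Math.cos(0.5 * gamma * toRadians);
  const sinY = Math.sin(0.5 * gamma * toRadians);
  return [
    sinX * cosY * cosZ - cosX * sinY * sinZ, // x
    cosX * sinY * cosZ + sinX * cosY * sinZ, // y
    cosX * cosY * sinZ + sinX * sinY * cosZ, // z
    cosX * cosY * cosZ - sinX * sinY * sinZ  // w
  ];
}
```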
535 | 536 | Automation {#automation} 537 | ========== 538 | 539 | This section extends [[GENERIC-SENSOR#automation]] by providing [=Orientation Sensor=]-specific virtual sensor metadata. 540 | 541 | Modifications to other specifications {#modifications-to-other-specifications} 542 | ------------------------------------- 543 | 544 | This specification integrates with [[DEVICE-ORIENTATION#automation]] as follows. 545 | 546 |
547 | The [=parse orientation data reading=] algorithm is modified as follows: 548 | 549 | * Add the following steps after setting |reading|'s "`alpha`", "`beta`", and "`gamma`" keys and before returning |reading|: 550 | 1. [=map/Set=] |reading|["`quaternion`"] to the result of invoking [=create a quaternion from Euler angles=] with |reading|["`alpha`"], |reading|["`beta`"], and |reading|["`gamma`"]. 551 | 552 |
553 | 554 | Note: This specification does not currently provide a way to specify quaternions in WebDriver (and consequently derive Euler angles from the quaternion) directly. This decision was made for simplicity and under the assumption that automation users are much more likely to work with Euler angles as inputs (or pick specific quaternion values and provide the corresponding Euler angle values on their own). Feedback from users with different use cases who are interested in being able to provide quaternion values directly is welcome via this specification's issue tracker. 555 | 556 | Absolute Orientation Sensor automation {#absolute-orientation-sensor-automation} 557 | -------------------------------------- 558 | 559 | The [=absolute-orientation virtual sensor type=] and its corresponding entry in the [=per-type virtual sensor metadata=] [=map=] are defined in [[DEVICE-ORIENTATION#automation]]. 560 | 561 | Relative Orientation Sensor automation {#relative-orientation-sensor-automation} 562 | -------------------------------------- 563 | 564 | The [=relative-orientation virtual sensor type=] and its corresponding entry in the [=per-type virtual sensor metadata=] [=map=] are defined in [[DEVICE-ORIENTATION#automation]]. 565 | 566 | Acknowledgements {#acknowledgements} 567 | ================ 568 | 569 | Thanks to Tobie Langel for the work on the Generic Sensor API. 570 | 571 | Conformance {#conformance} 572 | =========== 573 | 574 | Conformance requirements are expressed with a combination of 575 | descriptive assertions and RFC 2119 terminology. The key words "MUST", 576 | "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", 577 | "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this 578 | document are to be interpreted as described in RFC 2119. 579 | However, for readability, these words do not appear in all uppercase 580 | letters in this specification.
581 | 582 | All of the text of this specification is normative except sections 583 | explicitly marked as non-normative, examples, and notes. [[!RFC2119]] 584 | 585 | A conformant user agent must implement all the requirements 586 | listed in this specification that are applicable to user agents. 587 | 588 | The IDL fragments in this specification must be interpreted as required for 589 | conforming IDL fragments, as described in the Web IDL specification. [[!WEBIDL]] 590 | -------------------------------------------------------------------------------- /releases/FPWD.bs: -------------------------------------------------------------------------------- 1 |
  2 | Title: Orientation Sensor
  3 | Status: FPWD
  4 | Date: 2017-05-11
  5 | #Level: 1
  6 | ED: https://w3c.github.io/orientation-sensor/
  7 | Shortname: orientation-sensor
  8 | TR: https://www.w3.org/TR/orientation-sensor/
  9 | Editor: Mikhail Pozdnyakov 78325, Intel Corporation, http://intel.com/
 10 | Editor: Alexander Shalamov 78335, Intel Corporation, http://intel.com/
 11 | Editor: Kenneth Rohde Christiansen 57705, Intel Corporation, http://intel.com/
 12 | Editor: Anssi Kostiainen 41974, Intel Corporation, http://intel.com/
 13 | Group: dap
 14 | Abstract:
 15 |   This specification defines a base orientation sensor interface and concrete sensor subclasses to monitor the device's
 16 |   physical orientation in relation to a stationary three dimensional Cartesian coordinate system.
 17 | Version History: https://github.com/w3c/orientation-sensor/commits/gh-pages/index.bs
 18 | !Bug Reports: via the w3c/orientation-sensor repository on GitHub
 19 | Indent: 2
 20 | Repository: w3c/orientation-sensor
 21 | Markup Shorthands: markdown on
 22 | Inline Github Issues: true
 23 | !Test Suite: web-platform-tests on GitHub
 24 | Boilerplate: omit issues-index, omit conformance
 25 | Ignored Vars: sensor_instance, targetMatrix, x, y, z, w
 26 | 
27 |
 28 | urlPrefix: https://w3c.github.io/sensors; spec: GENERIC-SENSOR
 29 |   type: dfn
 30 |     text: activated
 31 |     text: construct a sensor object; url: construct-sensor-object
 32 |     text: default sensor
 33 |     text: equivalent
 34 |     text: high-level
 35 |     text: low-level
 36 |     text: latest reading
 37 |     text: sensor
 38 |     text: sensor-fusion
 39 | urlPrefix: https://w3c.github.io/accelerometer; spec: ACCELEROMETER
 40 |   type: dfn
 41 |     text: acceleration
 42 |     text: local coordinate system
 43 |   type: interface
 44 |     text: Accelerometer; url: accelerometer
 45 | 
46 | 47 |
 48 | urlPrefix: https://w3c.github.io/gyroscope; spec: GYROSCOPE
 49 |   type: dfn
 50 |     text: angular velocity
 51 |   type: interface
 52 |     text: Gyroscope; url: gyroscope
 53 | 
54 | 55 |
 56 | urlPrefix: https://w3c.github.io/magnetometer; spec: MAGNETOMETER
 57 |   type: dfn
 58 |     text: magnetic field
 59 |   type: interface
 60 |     text: Magnetometer; url: magnetometer
 61 | 
62 | 63 |
 64 | urlPrefix: https://www.w3.org/TR/orientation-event; spec: DEVICEORIENTATION
 65 |   type: interface
 66 |     text: DeviceOrientationEvent; url: deviceorientation_event
 67 | 
68 | 69 |
 70 | urlPrefix: https://w3c.github.io/motion-sensors; spec: MOTIONSENSORS
 71 |   type: dfn
 72 |     text: Absolute Orientation Sensor; url: absolute-orientation
 73 | 
74 | 75 | 83 | 84 |
 85 | {
 86 |     "QUATERNIONS": {
 87 |         "authors": [
 88 |             "Kuipers, Jack B."
 89 |         ],
 90 |         "id": "QUATERNIONS",
 91 |         "href": "http://www.emis.ams.org/proceedings/Varna/vol1/GEOM09.pdf",
 92 |         "title": "Quaternions and rotation sequences. Vol. 66.",
 93 |         "date": "1999",
 94 |         "status": "Informational",
 95 |         "publisher": "Princeton university press"
 96 |     },
 97 |     "QUATCONV": {
 98 |         "authors": [
 99 |             "Watt, Alan H., and Mark Watt"
100 |         ],
101 |         "id": "QUATCONV",
102 |         "href": "http://www.cs.cmu.edu/afs/cs/academic/class/15462-s14/www/lec_slides/3DRotationNotes.pdf",
103 |         "title": "Advanced animation and rendering techniques, page 362",
104 |         "date": "1992",
105 |         "status": "Informational",
106 |         "publisher": "New York, NY, USA: ACM Press"
107 |     }
108 | }
109 | 
110 | 111 | Introduction {#intro} 112 | ============ 113 | 114 | The Orientation Sensor API extends the Generic Sensor API [[GENERIC-SENSOR]] 115 | to provide generic information describing the device's physical orientation 116 | in relation to a three dimensional Cartesian coordinate system. 117 | 118 | The {{AbsoluteOrientationSensor}} class inherits from the {{OrientationSensor}} interface and 119 | describes the device's physical orientation in relation to the Earth's reference coordinate system. 120 | 121 | Other subclasses describe the orientation in relation to other stationary 122 | directions, such as true north, or non-stationary directions, such as in 123 | relation to a device's own z-position, drifting towards its latest most stable 124 | z-position. 125 | 126 | The data provided by the {{OrientationSensor}} subclasses are similar to data from 127 | {{DeviceOrientationEvent}}, but the Orientation Sensor API has the following significant differences: 128 | 1. The Orientation Sensor API represents orientation data in WebGL-compatible formats (quaternion, rotation matrix). 129 | 1. The Orientation Sensor API satisfies stricter latency requirements. 130 | 1. Unlike {{DeviceOrientationEvent}}, the {{OrientationSensor}} subclasses explicitly define which [=low-level=] 131 | motion sensors are used to obtain the orientation data, thus obviating possible interoperability issues. 132 | 1. Instances of {{OrientationSensor}} subclasses are configurable via the {{SensorOptions}} constructor parameter. 133 | 134 | Use Cases and Requirements {#usecases-requirements} 135 | ============================== 136 | 137 | The use cases and requirements are discussed in the 138 | Motion Sensors Explainer document. 139 | 140 | Examples {#examples} 141 | ======== 142 | 143 |
144 |
145 |     const sensor = new AbsoluteOrientationSensor();
146 |     const mat4 = new Float32Array(16);
147 |     sensor.start();
148 |     sensor.onerror = event => console.log(event.error.name, event.error.message);
149 | 
150 |     sensor.onchange = () => {
151 |       sensor.populateMatrix(mat4);
152 |     };
153 |     
154 |
155 | 156 |
157 |
158 |     const sensor = new AbsoluteOrientationSensor({ frequency: 60 });
159 |     const mat4 = new Float32Array(16);
160 |     sensor.start();
161 |     sensor.onerror = event => console.log(event.error.name, event.error.message);
162 | 
163 |     function draw(timestamp) {
164 |       window.requestAnimationFrame(draw);
165 |       try {
166 |         sensor.populateMatrix(mat4);
167 |       } catch(e) {
168 |         // mat4 has not been updated.
169 |       }
170 |       // Drawing...
171 |     }
172 | 
173 |     window.requestAnimationFrame(draw);
174 |     
175 |
176 | 177 | Security and Privacy Considerations {#security-and-privacy} 178 | =================================== 179 | 180 | There are no specific security and privacy considerations 181 | beyond those described in the Generic Sensor API [[!GENERIC-SENSOR]]. 182 | 183 | Model {#model} 184 | ===== 185 | 186 | The {{OrientationSensor}} class extends the {{Sensor}} class and provides a generic interface 187 | representing device orientation data. 188 | 189 | To access the orientation sensor's [=latest reading=], the user agent must invoke the [=request sensor access=] abstract operation for each of the [=low-level=] sensors used by the concrete orientation sensor. The table below 190 | describes the mapping between concrete orientation sensors and permission tokens defined by [=low-level=] sensors. 191 | 192 | 193 | 194 | 195 | 196 | 197 | 198 | 199 | 200 | 201 | 202 | 203 |
OrientationSensor subclassPermission tokens
{{AbsoluteOrientationSensor}}"accelerometer", "gyroscope", "magnetometer"
204 | 205 | A [=latest reading=] per [=sensor=] of orientation type includes an [=map/entry=] 206 | whose [=map/key=] is "quaternion" and whose [=map/value=] contains a four-element [=list=]. 207 | The elements of the [=list=] are equal to the components of a unit quaternion [[QUATERNIONS]] 208 | [Vx * sin(θ/2), Vy * sin(θ/2), Vz * sin(θ/2), cos(θ/2)] where V is 209 | the unit vector (whose elements are Vx, Vy, and Vz) representing the axis of rotation, and θ is the rotation angle about the axis defined by the unit vector V. 210 | 211 | Note: The quaternion components are arranged in the [=list=] as [q1, q2, q3, q0] 212 | [[QUATERNIONS]], i.e. the components representing the vector part of the quaternion go first and the scalar part component which 213 | is equal to cos(θ/2) goes last. This order is used for better compatibility with most of the existing WebGL frameworks; 214 | however, other libraries could use a different order when exposing a quaternion as an array, e.g. [q0, q1, 215 | q2, q3]. 216 | 217 | The {{AbsoluteOrientationSensor}} class is a subclass of {{OrientationSensor}} which represents the [=Absolute 218 | Orientation Sensor=]. 219 | 220 | The absolute orientation sensor is a [=high-level=] sensor which is created through [=sensor-fusion=] 221 | of the [=low-level=] motion sensors: 222 | 223 | - accelerometer that measures [=acceleration=], 224 | - gyroscope that measures [=angular velocity=], and 225 | - magnetometer that measures [=magnetic field=]. 226 | 227 | Note: Corresponding [=low-level=] sensors are defined in [[ACCELEROMETER]], [[GYROSCOPE]], and 228 | [[MAGNETOMETER]]. Regardless, the fusion is platform specific and can happen in software or 229 | hardware, e.g. on a sensor hub.
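Note: As a non-normative illustration of the representation described above, an axis-angle rotation maps to the quaternion [=list=] as follows (the function name is illustrative only):

```js
// Non-normative: build the [q1, q2, q3, q0] (vector-first) quaternion list
// for a rotation of theta radians about the unit axis (vx, vy, vz).
function quaternionFromAxisAngle(vx, vy, vz, theta) {
  const s = Math.sin(theta / 2);
  return [vx * s, vy * s, vz * s, Math.cos(theta / 2)];
}
```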
230 | 231 | For the absolute orientation sensor the value of [=latest reading=]["quaternion"] represents the rotation of a 232 | device hosting motion sensors in relation to the Earth's reference coordinate system defined as a 233 | three dimensional Cartesian coordinate system (x, y, z), where: 234 | 235 | - x-axis is a vector product of y.z that is tangential to the ground and points east, 236 | - y-axis is tangential to the ground and points towards magnetic north, and 237 | - z-axis points towards the sky and is perpendicular to the plane made up of x and y axes. 238 | 239 | The device's local coordinate system is the same as defined by [=low-level=] motion sensors. 240 | 241 | Note: Figure below represents the case where device's local coordinate system and the Earth's reference coordinate system are aligned, therefore, 242 | orientation sensor's [=latest reading=] would represent 0 (rad) [[SI]] rotation about each axis. 243 | 244 | AbsoluteOrientationSensor coordinate system. 245 | 246 | API {#api} 247 | === 248 | 249 | The OrientationSensor Interface {#orientationsensor-interface} 250 | ------------------------------------- 251 | 252 |
253 |   typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;
254 |   interface OrientationSensor : Sensor {
255 |     readonly attribute FrozenArray<double>? quaternion;
256 |     void populateMatrix(RotationMatrixType targetMatrix);
257 |   };
258 | 
259 | 260 | ### OrientationSensor.quaternion ### {#orientationsensor-quaternion} 261 | 262 | Returns a four-element {{FrozenArray}} whose elements contain the components of the unit quaternion representing the device orientation. 263 | In other words, this attribute returns [=latest reading=]["quaternion"]. 264 | 265 | ### OrientationSensor.populateMatrix() ### {#orientationsensor-populatematrix} 266 | 267 | The {{OrientationSensor/populateMatrix()}} method populates the given object with a rotation matrix 268 | converted from the value of [=latest reading=]["quaternion"] [[QUATCONV]], as shown below: 269 | 270 | Converting quaternion to rotation matrix. 271 | 272 | where: 273 | 274 | - W = cos(θ/2) 275 | - X = Vx * sin(θ/2) 276 | - Y = Vy * sin(θ/2) 277 | - Z = Vz * sin(θ/2) 278 | 279 | The rotation matrix is flattened into the |targetMatrix| object in column-major order, as described in 280 | the [=populate rotation matrix=] algorithm. 281 | 282 |
283 | To populate rotation matrix, the {{OrientationSensor/populateMatrix()}} method must 284 | run these steps or their [=equivalent=]: 285 | 1. If |targetMatrix| is not of type defined by {{RotationMatrixType}} union, [=throw=] a 286 | "{{TypeError!!exception}}" {{DOMException}} and abort these steps. 287 | 1. If |targetMatrix| is of type {{Float32Array}} or {{Float64Array}} with a size less than sixteen, [=throw=] a 288 | "{{TypeError!!exception}}" {{DOMException}} and abort these steps. 289 | 1. Let |quaternion| be the value of [=latest reading=]["quaternion"] 290 | 1. If |quaternion| is `null`, [=throw=] a "{{NotReadableError!!exception}}" {{DOMException}} and abort these steps. 291 | 1. Let |x| be the value of |quaternion|[0] 292 | 1. Let |y| be the value of |quaternion|[1] 293 | 1. Let |z| be the value of |quaternion|[2] 294 | 1. Let |w| be the value of |quaternion|[3] 295 | 1. If |targetMatrix| is of {{Float32Array}} or {{Float64Array}} type, run these sub-steps: 296 | 1. Set |targetMatrix|[0] = 1 - 2 * y * y - 2 * z * z 297 | 1. Set |targetMatrix|[1] = 2 * x * y - 2 * z * w 298 | 1. Set |targetMatrix|[2] = 2 * x * z + 2 * y * w 299 | 1. Set |targetMatrix|[3] = 0 300 | 1. Set |targetMatrix|[4] = 2 * x * y + 2 * z * w 301 | 1. Set |targetMatrix|[5] = 1 - 2 * x * x - 2 * z * z 302 | 1. Set |targetMatrix|[6] = 2 * y * z - 2 * x * w 303 | 1. Set |targetMatrix|[7] = 0 304 | 1. Set |targetMatrix|[8] = 2 * x * z - 2 * y * w 305 | 1. Set |targetMatrix|[9] = 2 * y * z + 2 * x * w 306 | 1. Set |targetMatrix|[10] = 1 - 2 * x * x - 2 * y * y 307 | 1. Set |targetMatrix|[11] = 0 308 | 1. Set |targetMatrix|[12] = 0 309 | 1. Set |targetMatrix|[13] = 0 310 | 1. Set |targetMatrix|[14] = 0 311 | 1. Set |targetMatrix|[15] = 1 312 | 1. If |targetMatrix| is of {{DOMMatrix}} type, run these sub-steps: 313 | 1. Set |targetMatrix|.m11 = 1 - 2 * y * y - 2 * z * z 314 | 1. Set |targetMatrix|.m12 = 2 * x * y - 2 * z * w 315 | 1. Set |targetMatrix|.m13 = 2 * x * z + 2 * y * w 316 | 1. 
Set |targetMatrix|.m14 = 0 317 | 1. Set |targetMatrix|.m21 = 2 * x * y + 2 * z * w 318 | 1. Set |targetMatrix|.m22 = 1 - 2 * x * x - 2 * z * z 319 | 1. Set |targetMatrix|.m23 = 2 * y * z - 2 * x * w 320 | 1. Set |targetMatrix|.m24 = 0 321 | 1. Set |targetMatrix|.m31 = 2 * x * z - 2 * y * w 322 | 1. Set |targetMatrix|.m32 = 2 * y * z + 2 * x * w 323 | 1. Set |targetMatrix|.m33 = 1 - 2 * x * x - 2 * y * y 324 | 1. Set |targetMatrix|.m34 = 0 325 | 1. Set |targetMatrix|.m41 = 0 326 | 1. Set |targetMatrix|.m42 = 0 327 | 1. Set |targetMatrix|.m43 = 0 328 | 1. Set |targetMatrix|.m44 = 1 329 |
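Note: The typed-array branch of the algorithm above can be sketched as follows (non-normative; assumes the quaternion components are given in [x, y, z, w] order and the function name is illustrative only):

```js
// Non-normative sketch of "populate rotation matrix" for a Float32Array or
// Float64Array target; writes the sixteen elements set by the algorithm.
function populateRotationMatrix(target, [x, y, z, w]) {
  if (target.length < 16) throw new TypeError("target needs at least 16 elements");
  target[0]  = 1 - 2 * y * y - 2 * z * z;
  target[1]  = 2 * x * y - 2 * z * w;
  target[2]  = 2 * x * z + 2 * y * w;
  target[3]  = 0;
  target[4]  = 2 * x * y + 2 * z * w;
  target[5]  = 1 - 2 * x * x - 2 * z * z;
  target[6]  = 2 * y * z - 2 * x * w;
  target[7]  = 0;
  target[8]  = 2 * x * z - 2 * y * w;
  target[9]  = 2 * y * z + 2 * x * w;
  target[10] = 1 - 2 * x * x - 2 * y * y;
  target[11] = 0;
  target[12] = 0;
  target[13] = 0;
  target[14] = 0;
  target[15] = 1;
  return target;
}
```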
330 | 331 | 332 | The AbsoluteOrientationSensor Interface {#absoluteorientationsensor-interface} 333 | ------------------------------------- 334 | 335 |
336 |   [Constructor(optional SensorOptions sensorOptions)]
337 |   interface AbsoluteOrientationSensor : OrientationSensor {
338 |   };
339 | 
340 | 341 | To Construct an AbsoluteOrientationSensor Object the user agent must invoke the 342 | construct a Sensor object abstract operation. 343 | 344 | Acknowledgements {#acknowledgements} 345 | ================ 346 | 347 | Tobie Langel for the work on Generic Sensor API. 348 | 349 | Conformance {#conformance} 350 | =========== 351 | 352 | Conformance requirements are expressed with a combination of 353 | descriptive assertions and RFC 2119 terminology. The key words "MUST", 354 | "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", 355 | "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this 356 | document are to be interpreted as described in RFC 2119. 357 | However, for readability, these words do not appear in all uppercase 358 | letters in this specification. 359 | 360 | All of the text of this specification is normative except sections 361 | explicitly marked as non-normative, examples, and notes. [[!RFC2119]] 362 | 363 | A conformant user agent must implement all the requirements 364 | listed in this specification that are applicable to user agents. 365 | 366 | The IDL fragments in this specification must be interpreted as required for 367 | conforming IDL fragments, as described in the Web IDL specification. [[!WEBIDL]] 368 | -------------------------------------------------------------------------------- /releases/FPWD.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | Orientation Sensor 6 | 7 | 8 | 9 | 10 | 20 | 66 | 95 | 157 | 193 | 251 | 252 | 253 |
254 |

255 |

Orientation Sensor

256 |

W3C First Public Working Draft,

257 |
258 |
259 |
This version: 260 |
https://www.w3.org/TR/2017/WD-orientation-sensor-20170511/ 261 |
Latest published version: 262 |
https://www.w3.org/TR/orientation-sensor/ 263 |
Editor's Draft: 264 |
https://w3c.github.io/orientation-sensor/ 265 |
Version History: 266 |
https://github.com/w3c/orientation-sensor/commits/gh-pages/index.bs 267 |
Feedback: 268 |
public-device-apis@w3.org with subject line “[orientation-sensor] … message topic …” (archives) 269 |
Issue Tracking: 270 |
GitHub 271 |
Editors: 272 |
Mikhail Pozdnyakov (Intel Corporation) 273 |
Alexander Shalamov (Intel Corporation) 274 |
Kenneth Rohde Christiansen (Intel Corporation) 275 |
Anssi Kostiainen (Intel Corporation) 276 |
Bug Reports: 277 |
via the w3c/orientation-sensor repository on GitHub 278 |
Test Suite: 279 |
web-platform-tests on GitHub 280 |
281 |
282 |
283 | 284 |
285 |
286 |
287 |

Abstract

288 |

This specification defines a base orientation sensor interface and concrete sensor subclasses to monitor the device’s 289 | 290 | physical orientation in relation to a stationary three dimensional Cartesian coordinate system.

291 |
292 |

Status of this document

293 |
294 |

This section describes the status of this document at the time of 295 | its publication. Other documents may supersede this document. A list of 296 | current W3C publications and the latest revision of this technical report 297 | can be found in the W3C technical reports 298 | index at https://www.w3.org/TR/.

299 |

This document was published by the Device and Sensors Working Group as a Working Draft. This document is intended to become a W3C Recommendation.

300 |

If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives). 301 | When sending e-mail, 302 | please put the text “orientation-sensor” in the subject, 303 | preferably like this: 304 | “[orientation-sensor] …summary of comment…”. 305 | All comments are welcome.

306 |

This document is a First Public Working Draft.

307 |

Publication as a First Public Working Draft does not imply endorsement by the W3C 308 | Membership. This is a draft document and may be updated, replaced or 309 | obsoleted by other documents at any time. It is inappropriate to cite this 310 | document as other than work in progress.

311 |

This document was produced by a group operating under 312 | the 5 February 2004 W3C Patent Policy. 313 | W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; 314 | that page also includes instructions for disclosing a patent. 315 | An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

316 |

This document is governed by the 1 March 2017 W3C Process Document.

317 |

318 |
319 |
320 | 356 |
357 |

1. Introduction

358 |

The Orientation Sensor API extends the Generic Sensor API [GENERIC-SENSOR] to provide generic information describing the device’s physical orientation 359 | in relation to a three dimensional Cartesian coordinate system.

360 |

The AbsoluteOrientationSensor class inherits from the OrientationSensor interface and 361 | describes the device’s physical orientation in relation to the Earth’s reference coordinate system.

362 |

Other subclasses describe the orientation in relation to other stationary 363 | directions, such as true north, or non stationary directions, like in 364 | relation to a devices own z-position, drifting towards its latest most stable 365 | z-position.

366 |

The data provided by the OrientationSensor subclasses are similar to data from DeviceOrientationEvent, but the Orientation Sensor API has the following significant differences:

367 |
    368 |
  1. 369 |

    The Orientation Sensor API represents orientation data in WebGL-compatible formats (quaternion, rotation matrix).

    370 |
  2. 371 |

    The Orientation Sensor API satisfies stricter latency requirements.

    372 |
  3. 373 |

    Unlike DeviceOrientationEvent, the OrientationSensor subclasses explicitly define which low-level motion sensors are used to obtain the orientation data, thus obviating possible interoperability issues.

    374 |
  4. 375 |

    Instances of OrientationSensor subclasses are configurable via the SensorOptions constructor parameter.

    376 |
377 |

2. Use Cases and Requirements

378 |

The use cases and requirements are discussed in the Motion Sensors Explainer document.

379 |

3. Examples

380 |
381 | 382 |
const sensor = new AbsoluteOrientationSensor();
383 | const mat4 = new Float32Array(16);
384 | sensor.start();
385 | sensor.onerror = event => console.log(event.error.name, event.error.message);
386 | 
387 | sensor.onchange = () => {
388 |   sensor.populateMatrix(mat4);
389 | };
390 | 
391 |
392 |
393 | 394 |
const sensor = new AbsoluteOrientationSensor({ frequency: 60 });
395 | const mat4 = new Float32Array(16);
396 | sensor.start();
397 | sensor.onerror = event => console.log(event.error.name, event.error.message);
398 | 
399 | function draw(timestamp) {
400 |   window.requestAnimationFrame(draw);
401 |   try {
402 |     sensor.populateMatrix(mat4);
403 |   } catch(e) {
404 |     // mat4 has not been updated.
405 |   }
406 |   // Drawing...
407 | }
408 | 
409 | window.requestAnimationFrame(draw);
410 | 
411 |
412 |

4. Security and Privacy Considerations

413 |

There are no specific security and privacy considerations 414 | beyond those described in the Generic Sensor API [GENERIC-SENSOR].

415 |

5. Model

416 |

The OrientationSensor class extends the Sensor class and provides a generic interface 417 | representing device orientation data.

418 |

To access the orientation sensor’s latest reading, the user agent must invoke the request sensor access abstract operation for each of the low-level sensors used by the concrete orientation sensor. The table below 419 | describes the mapping between concrete orientation sensors and permission tokens defined by low-level sensors.

420 | 421 | 422 | 423 | 426 | 427 |
OrientationSensor subclass 424 | Permission tokens 425 |
AbsoluteOrientationSensor 428 | "accelerometer", "gyroscope", "magnetometer" 429 |
430 |

A latest reading per sensor of orientation type includes an entry whose key is "quaternion" and whose value contains a four-element list. 431 | The elements of the list are equal to the components of a unit quaternion [QUATERNIONS] [Vx * sin(θ/2), Vy * sin(θ/2), Vz * sin(θ/2), cos(θ/2)] where V is 432 | the unit vector (whose elements are Vx, Vy, and Vz) representing the axis of rotation, and θ is the rotation angle about the axis defined by the unit vector V.

433 |

Note: The quaternion components are arranged in the list as [q1, q2, q3, q0] [QUATERNIONS], i.e. the components representing the vector part of the quaternion go first and the scalar part component which 434 | is equal to cos(θ/2) goes last. This order is used for better compatibility with most of the existing WebGL frameworks; 435 | however, other libraries could use a different order when exposing a quaternion as an array, e.g. [q0, q1, 436 | q2, q3].

437 |

The AbsoluteOrientationSensor class is a subclass of OrientationSensor which represents the Absolute 438 | Orientation Sensor.

439 |

The absolute orientation sensor is a high-level sensor which is created through sensor-fusion of the low-level motion sensors:

440 | 448 |

Note: Corresponding low-level sensors are defined in [ACCELEROMETER], [GYROSCOPE], and [MAGNETOMETER]. Regardless, the fusion is platform specific and can happen in software or 449 | hardware, e.g. on a sensor hub.

450 |

For the absolute orientation sensor the value of latest reading["quaternion"] represents the rotation of a 451 | device hosting motion sensors in relation to the Earth’s reference coordinate system defined as a 452 | three dimensional Cartesian coordinate system (x, y, z), where:

453 | 461 |

The device’s local coordinate system is the same as defined by low-level motion sensors.

462 |

Note: Figure below represents the case where device’s local coordinate system and the Earth’s reference coordinate system are aligned, therefore, 463 | orientation sensor’s latest reading would represent 0 (rad) [SI] rotation about each axis.

464 |

AbsoluteOrientationSensor coordinate system.

465 |

6. API

466 |

6.1. The OrientationSensor Interface

467 |
typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;
468 | interface OrientationSensor : Sensor {
469 |   readonly attribute FrozenArray<double>? quaternion;
470 |   void populateMatrix(RotationMatrixType targetMatrix);
471 | };
472 | 
473 |

6.1.1. OrientationSensor.quaternion

474 |

Returns a four-element FrozenArray whose elements contain the components of the unit quaternion representing the device orientation. 475 | In other words, this attribute returns latest reading["quaternion"].

476 |

6.1.2. OrientationSensor.populateMatrix()

477 |

The populateMatrix() method populates the given object with a rotation matrix 478 | converted from the value of latest reading["quaternion"] [QUATCONV], as shown below:

479 |

Converting quaternion to rotation matrix.

480 |

where:

481 | 491 |

The rotation matrix is flattened into the targetMatrix object in column-major order, as described in the populate rotation matrix algorithm.

492 |
493 | To populate rotation matrix, the populateMatrix() method must 494 | run these steps or their equivalent: 495 |
    496 |
  1. 497 |

    If targetMatrix is not of type defined by RotationMatrixType union, throw a 498 | "TypeError" DOMException and abort these steps.

    499 |
  2. 500 |

    If targetMatrix is of type Float32Array or Float64Array with a size less than sixteen, throw a 501 | "TypeError" DOMException and abort these steps.

    502 |
  3. 503 |

    Let quaternion be the value of latest reading["quaternion"]

    504 |
  4. 505 |

    If quaternion is null, throw a "NotReadableError" DOMException and abort these steps.

    506 |
  5. 507 |

    Let x be the value of quaternion[0]

    508 |
  6. 509 |

    Let y be the value of quaternion[1]

    510 |
  7. 511 |

    Let z be the value of quaternion[2]

    512 |
  8. 513 |

    Let w be the value of quaternion[3]

    514 |
  9. 515 |

    If targetMatrix is of Float32Array or Float64Array type, run these sub-steps:

    516 |
      517 |
    1. 518 |

      Set targetMatrix[0] = 1 - 2 * y * y - 2 * z * z

      519 |
    2. 520 |

      Set targetMatrix[1] = 2 * x * y - 2 * z * w

      521 |
    3. 522 |

      Set targetMatrix[2] = 2 * x * z + 2 * y * w

      523 |
    4. 524 |

      Set targetMatrix[3] = 0

      525 |
    5. 526 |

      Set targetMatrix[4] = 2 * x * y + 2 * z * w

      527 |
    6. 528 |

      Set targetMatrix[5] = 1 - 2 * x * x - 2 * z * z

      529 |
    7. 530 |

      Set targetMatrix[6] = 2 * y * z - 2 * x * w

      531 |
    8. 532 |

      Set targetMatrix[7] = 0

      533 |
    9. 534 |

      Set targetMatrix[8] = 2 * x * z - 2 * y * w

      535 |
    10. 536 |

      Set targetMatrix[9] = 2 * y * z + 2 * x * w

      537 |
    11. 538 |

      Set targetMatrix[10] = 1 - 2 * x * x - 2 * y * y

      539 |
    12. 540 |

      Set targetMatrix[11] = 0

      541 |
    13. 542 |

      Set targetMatrix[12] = 0

      543 |
    14. 544 |

      Set targetMatrix[13] = 0

      545 |
    15. 546 |

      Set targetMatrix[14] = 0

      547 |
    16. 548 |

      Set targetMatrix[15] = 1

      549 |
    550 |
  10. 551 |

    If targetMatrix is of DOMMatrix type, run these sub-steps:

    552 |
      553 |
    1. 554 |

      Set targetMatrix.m11 = 1 - 2 * y * y - 2 * z * z

      555 |
    2. 556 |

      Set targetMatrix.m12 = 2 * x * y - 2 * z * w

      557 |
    3. 558 |

      Set targetMatrix.m13 = 2 * x * z + 2 * y * w

      559 |
    4. 560 |

      Set targetMatrix.m14 = 0

      561 |
    5. 562 |

      Set targetMatrix.m21 = 2 * x * y + 2 * z * w

      563 |
    6. 564 |

      Set targetMatrix.m22 = 1 - 2 * x * x - 2 * z * z

      565 |
    7. 566 |

      Set targetMatrix.m23 = 2 * y * z - 2 * x * w

      567 |
    8. 568 |

      Set targetMatrix.m24 = 0

      569 |
    9. 570 |

      Set targetMatrix.m31 = 2 * x * z - 2 * y * w

      571 |
    10. 572 |

      Set targetMatrix.m32 = 2 * y * z + 2 * x * w

      573 |
    11. 574 |

      Set targetMatrix.m33 = 1 - 2 * x * x - 2 * y * y

      575 |
    12. 576 |

      Set targetMatrix.m34 = 0

      577 |
    13. 578 |

      Set targetMatrix.m41 = 0

      579 |
    14. 580 |

      Set targetMatrix.m42 = 0

      581 |
    15. 582 |

      Set targetMatrix.m43 = 0

      583 |
    16. 584 |

      Set targetMatrix.m44 = 1

      585 |
    586 |
587 |
588 |

6.2. The AbsoluteOrientationSensor Interface

589 |
[Constructor(optional SensorOptions sensorOptions)]
590 | interface AbsoluteOrientationSensor : OrientationSensor {
591 | };
592 | 
593 |

To Construct an AbsoluteOrientationSensor Object the user agent must invoke the construct a Sensor object abstract operation.

594 |

7. Acknowledgements

595 |

Tobie Langel for the work on Generic Sensor API.

596 |

8. Conformance

597 |

Conformance requirements are expressed with a combination of 598 | descriptive assertions and RFC 2119 terminology. The key words "MUST", 599 | "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", 600 | "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this 601 | document are to be interpreted as described in RFC 2119. 602 | However, for readability, these words do not appear in all uppercase 603 | letters in this specification.

604 |

All of the text of this specification is normative except sections 605 | explicitly marked as non-normative, examples, and notes. [RFC2119]

606 |

A conformant user agent must implement all the requirements 607 | listed in this specification that are applicable to user agents.

608 |

The IDL fragments in this specification must be interpreted as required for 609 | conforming IDL fragments, as described in the Web IDL specification. [WEBIDL]

610 |
611 | 612 |

Index

613 |

Terms defined by this specification

614 | 627 |

Terms defined by reference

628 | 695 |

References

696 |

Normative References

697 |
698 |
[ACCELEROMETER]
Anssi Kostiainen; Alexander Shalamov. Accelerometer. URL: https://www.w3.org/TR/accelerometer/
[GENERIC-SENSOR]
Tobie Langel; Rick Waldron. Generic Sensor API. URL: https://www.w3.org/TR/generic-sensor/
[GEOMETRY-1]
Simon Pieters; Dirk Schulze; Rik Cabanier. Geometry Interfaces Module Level 1. URL: https://www.w3.org/TR/geometry-1/
[GYROSCOPE]
Anssi Kostiainen; Mikhail Pozdnyakov. Gyroscope. URL: https://www.w3.org/TR/gyroscope/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[MAGNETOMETER]
Anssi Kostiainen; Rijubrata Bhaumik. Magnetometer. URL: https://www.w3.org/TR/magnetometer/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://tools.ietf.org/html/rfc2119
[WEBIDL]
Cameron McCormack; Boris Zbarsky; Tobie Langel. Web IDL. URL: https://heycam.github.io/webidl/

Informative References

[QUATCONV]
Alan H. Watt; Mark Watt. Advanced Animation and Rendering Techniques, page 362. 1992. Informational. URL: http://www.cs.cmu.edu/afs/cs/academic/class/15462-s14/www/lec_slides/3DRotationNotes.pdf
[QUATERNIONS]
Jack B. Kuipers. Quaternions and Rotation Sequences. Vol. 66. 1999. Informational. URL: http://www.emis.ams.org/proceedings/Varna/vol1/GEOM09.pdf
[SI]
SI Brochure: The International System of Units (SI), 8th edition. 2014. URL: http://www.bipm.org/en/publications/si-brochure/

IDL Index

typedef (Float32Array or Float64Array or DOMMatrix) RotationMatrixType;

interface OrientationSensor : Sensor {
  readonly attribute FrozenArray<double>? quaternion;
  void populateMatrix(RotationMatrixType targetMatrix);
};

[Constructor(optional SensorOptions sensorOptions)]
interface AbsoluteOrientationSensor : OrientationSensor {
};
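The populateMatrix() method fills the target with the rotation matrix converted from the latest quaternion reading. As a rough illustration of that conversion (not the spec's normative algorithm — the `quatToMatrix` helper name and the flat row-major 3×3 layout are assumptions here), the standard unit-quaternion formula can be sketched as:

```javascript
// Sketch: convert a unit quaternion [x, y, z, w] (the component order used by
// OrientationSensor.quaternion) into a flat, row-major 3x3 rotation matrix.
// Helper name and array layout are illustrative, not taken from the spec.
function quatToMatrix([x, y, z, w]) {
  return [
    1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y),
    2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x),
    2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y),
  ];
}

// The identity quaternion maps to the identity matrix:
quatToMatrix([0, 0, 0, 1]); // → [1, 0, 0, 0, 1, 0, 0, 0, 1]
```

A quaternion of [0, 0, sin(θ/2), cos(θ/2)] then yields the familiar rotation by θ about the z axis, which is an easy sanity check for the sign conventions.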
--------------------------------------------------------------------------------
/releases/images/absolute_orientation_sensor_coordinate_system.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/w3c/orientation-sensor/7cb6cae81cd90bfc590e3c96be120598461f419c/releases/images/absolute_orientation_sensor_coordinate_system.png
--------------------------------------------------------------------------------
/releases/images/quaternion_to_rotation_matrix.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/w3c/orientation-sensor/7cb6cae81cd90bfc590e3c96be120598461f419c/releases/images/quaternion_to_rotation_matrix.png
--------------------------------------------------------------------------------
/security-questionnaire.md:
--------------------------------------------------------------------------------
# [Security and Privacy Self-Review Questionnaire] for [Orientation Sensor]

Answers to the [questionnaire][Security and Privacy Self-Review Questionnaire] for the
[Generic Sensor API] can be found [here](https://github.com/w3c/sensors/blob/main/security-questionnaire.md).

## [3.1 Does this specification deal with personally-identifiable information?]

Yes, but not directly. The [Orientation Sensor] specification requires user [permission] and implementation
of applicable [mitigation strategies] to address [potential risks][user-identification].
For more information, please see the [Security and Privacy section][security-and-privacy].

## [3.2 Does this specification deal with high-value data?]

Yes, but not directly.

> Sensor readings are explicitly flagged by the Secure Contexts specification
> [[POWERFUL-FEATURES]] as a high-value target for network attackers.
> Thus all interfaces defined by this specification or extension specifications are only available within a secure context.

Indirectly, orientation sensor readings can be used to [infer user input].

## [3.3 Does this specification introduce new state for an origin that persists across browsing sessions?]

No.

## [3.4 Does this specification expose persistent, cross-origin state to the web?]

No.

## [3.5 Does this specification expose any other data to an origin that it doesn’t currently have access to?]

No.

## [3.6 Does this specification enable new script execution/loading mechanisms?]

No.

## [3.7 Does this specification allow an origin access to a user’s location?]

Not directly. However, orientation sensor data can be used to calculate the direction or
orientation of the user in space. [Orientation Sensor] requires [user permission][permission]
and implementation of applicable [mitigation strategies] to avoid [potential risks][location-tracking].

## [3.8 Does this specification allow an origin access to sensors on a user’s device?]

Yes.

## [3.9 Does this specification allow an origin access to aspects of a user’s local computing environment?]

Yes. If the user agent has [permission] to access a concrete orientation sensor, the API provides the means
to check whether a particular set of sensors is available within the user’s local computing environment.

## [3.10 Does this specification allow an origin access to other devices?]

No.

## [3.11 Does this specification allow an origin some measure of control over a user agent’s native UI?]

No.

## [3.12 Does this specification expose temporary identifiers to the web?]

No.

## [3.13 Does this specification distinguish between behavior in first-party and third-party contexts?]

No.
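As the answers to 3.2 and 3.9 note, the interfaces are exposed only in secure contexts, so a page can combine `isSecureContext` with plain feature detection before constructing a sensor. A minimal sketch (the `canUseOrientationSensor` helper is illustrative, not from the spec; it takes the global object as a parameter only so it can be exercised outside a browser, where you would simply pass `window`):

```javascript
// Sketch: feature-detect the AbsoluteOrientationSensor interface.
// Taking the global object as a parameter is an illustrative choice so the
// helper can be tested with mock globals; in a page you would pass `window`.
function canUseOrientationSensor(globalObj) {
  // Interfaces from this specification are only exposed in secure contexts,
  // so both checks matter before calling `new AbsoluteOrientationSensor()`.
  return Boolean(globalObj.isSecureContext &&
                 "AbsoluteOrientationSensor" in globalObj);
}

// Secure context with the interface exposed:
canUseOrientationSensor({ isSecureContext: true,
                          AbsoluteOrientationSensor: class {} }); // → true
// Over plain http, or where the sensor is not implemented:
canUseOrientationSensor({ isSecureContext: false }); // → false
```

Note that a successful construction can still fail at `start()` time if the underlying sensor hardware is absent, which is reported through the sensor's error event.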
## [3.14 How should this specification work in the context of a user agent’s "incognito" mode?]

The specification does not restrict access in a particular mode, nor does it work differently. However, this
can be revisited once [privacy mode] is formally specified.

## [3.15 Does this specification persist data to a user’s local device?]

No.

## [3.16 Does this specification have a "Security Considerations" and "Privacy Considerations" section?]

Yes.

See the [Security & Privacy][security-and-privacy] section.

## [3.17 Does this specification allow downgrading default security characteristics?]

No.


[Generic Sensor API]: https://w3c.github.io/sensors
[Orientation Sensor]: https://w3c.github.io/orientation-sensor

[mitigation strategies]: https://w3c.github.io/sensors/#mitigation-strategies
[user-identification]: https://w3c.github.io/sensors/#user-identifying
[security-and-privacy]: https://w3c.github.io/orientation-sensor/#security-and-privacy
[permission]: https://w3c.github.io/orientation-sensor/#model
[POWERFUL-FEATURES]: https://w3c.github.io/webappsec-secure-contexts/
[infer user input]: https://w3c.github.io/sensors/#keystroke-monitoring
[location-tracking]: https://w3c.github.io/sensors/#location-tracking
[privacy mode]: https://gist.github.com/mnot/96440a5ca74fcf328d23#privacy-mode
[Security and Privacy Self-Review Questionnaire]: https://w3ctag.github.io/security-questionnaire/

[3.1 Does this specification deal with personally-identifiable information?]: https://w3ctag.github.io/security-questionnaire/#pii
[3.2 Does this specification deal with high-value data?]: https://w3ctag.github.io/security-questionnaire/#credentials
[3.3 Does this specification introduce new state for an origin that persists across browsing sessions?]:
https://w3ctag.github.io/security-questionnaire/#persistent-origin-specific-state
[3.4 Does this specification expose persistent, cross-origin state to the web?]: https://w3ctag.github.io/security-questionnaire/#persistent-identifiers
[3.5 Does this specification expose any other data to an origin that it doesn’t currently have access to?]: https://w3ctag.github.io/security-questionnaire/#other-data
[3.6 Does this specification enable new script execution/loading mechanisms?]: https://w3ctag.github.io/security-questionnaire/#string-to-script
[3.7 Does this specification allow an origin access to a user’s location?]: https://w3ctag.github.io/security-questionnaire/#location
[3.8 Does this specification allow an origin access to sensors on a user’s device?]: https://w3ctag.github.io/security-questionnaire/#sensors
[3.9 Does this specification allow an origin access to aspects of a user’s local computing environment?]: https://w3ctag.github.io/security-questionnaire/#local-device
[3.10 Does this specification allow an origin access to other devices?]: https://w3ctag.github.io/security-questionnaire/#remote-device
[3.11 Does this specification allow an origin some measure of control over a user agent’s native UI?]: https://w3ctag.github.io/security-questionnaire/#native-ui
[3.12 Does this specification expose temporary identifiers to the web?]: https://w3ctag.github.io/security-questionnaire/#temporary-id
[3.13 Does this specification distinguish between behavior in first-party and third-party contexts?]: https://w3ctag.github.io/security-questionnaire/#first-third-party
[3.14 How should this specification work in the context of a user agent’s "incognito" mode?]: https://w3ctag.github.io/security-questionnaire/#incognito
[3.15 Does this specification persist data to a user’s local device?]: https://w3ctag.github.io/security-questionnaire/#storage
[3.16 Does this specification have a "Security Considerations" and "Privacy Considerations" section?]: https://w3ctag.github.io/security-questionnaire/#considerations
[3.17 Does this specification allow downgrading default security characteristics?]: https://w3ctag.github.io/security-questionnaire/#relaxed-sop
--------------------------------------------------------------------------------
/w3c.json:
--------------------------------------------------------------------------------
{
  "group": 43696,
  "contacts": [
    "himorin"
  ],
  "shortName": "orientation-sensor",
  "repo-type": "rec-track",
  "policy": "open"
}
--------------------------------------------------------------------------------