├── .pr-preview.json ├── tidyconfig.txt ├── LICENSE.md ├── w3c.json ├── CODE_OF_CONDUCT.md ├── .github └── workflows │ └── auto-publish.yml ├── CONTRIBUTING.md ├── README.md └── index.html /.pr-preview.json: -------------------------------------------------------------------------------- 1 | { 2 | "src_file": "index.html", 3 | "type": "respec" 4 | } 5 | -------------------------------------------------------------------------------- /tidyconfig.txt: -------------------------------------------------------------------------------- 1 | char-encoding: utf8 2 | indent: yes 3 | wrap: 80 4 | tidy-mark: no 5 | -------------------------------------------------------------------------------- /LICENSE.md: -------------------------------------------------------------------------------- 1 | All documents in this Repository are licensed by contributors 2 | under the 3 | [W3C Software and Document License](http://www.w3.org/Consortium/Legal/copyright-software). 4 | 5 | -------------------------------------------------------------------------------- /w3c.json: -------------------------------------------------------------------------------- 1 | { 2 | "group": 45211, 3 | "contacts": [ 4 | "plehegar", 5 | "siusin" 6 | ], 7 | "shortName": "user-timing-3", 8 | "policy": "open", 9 | "repo-type": "rec-track" 10 | } -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Code of Conduct 2 | 3 | All documentation, code and communication under this repository are covered by the [W3C Code of Ethics and Professional Conduct](https://www.w3.org/Consortium/cepc/). 4 | -------------------------------------------------------------------------------- /.github/workflows/auto-publish.yml: -------------------------------------------------------------------------------- 1 | name: CI 2 | on: 3 | pull_request: {} 4 | push: 5 | branches: [gh-pages] 6 | jobs: 7 | main: 8 | name: Build, Validate and Deploy 9 | runs-on: ubuntu-latest 10 | steps: 11 | - uses: actions/checkout@v2 12 | - uses: w3c/spec-prod@v2 13 | with: 14 | W3C_ECHIDNA_TOKEN: ${{ secrets.W3C_TR_TOKEN }} 15 | W3C_WG_DECISION_URL: https://lists.w3.org/Archives/Public/public-web-perf/2021Apr/0005.html 16 | W3C_BUILD_OVERRIDE: | 17 | shortName: user-timing 18 | specStatus: CRD 19 | historyURI: "https://www.w3.org/standards/history/user-timing" 20 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | Contributions to this repository are intended to become part of Recommendation-track documents governed by the 2 | [W3C Patent Policy](https://www.w3.org/Consortium/Patent-Policy-20040205/) and 3 | [Software and Document License](https://www.w3.org/Consortium/Legal/copyright-software). To make substantive contributions to specifications, you must either participate 4 | in the relevant W3C Working Group or make a non-member patent licensing commitment. 5 | 6 | For our editing test-driven process, see [CONTRIBUTING.md](https://github.com/w3c/web-performance/blob/gh-pages/CONTRIBUTING.md). 7 | 8 | If you are not the sole contributor to a contribution (pull request), please identify all 9 | contributors in the pull request comment. 
10 | 11 | To add a contributor (other than yourself, that's automatic), mark them one per line as follows: 12 | 13 | ``` 14 | +@github_username 15 | ``` 16 | 17 | If you added a contributor by mistake, you can remove them in a comment with: 18 | 19 | ``` 20 | -@github_username 21 | ``` 22 | 23 | If you are making a pull request on behalf of someone else but you had no part in designing the 24 | feature, you can remove yourself with the above syntax. 25 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | User Timing 2 | =========== 3 | 4 | This specification defines an interface to help web developers measure the 5 | performance of their applications by giving them access to high precision 6 | timestamps and enabling them to compare those timestamps. 7 | 8 | * Read the latest draft: https://w3c.github.io/user-timing/ 9 | * Discuss on [public-webperf](http://www.w3.org/Search/Mail/Public/search?keywords=%5BUserTiming%5D&hdr-1-name=subject&hdr-1-query=&index-grp=Public_FULL&index-type=t&type-index=public-web-perf) 10 | 11 | See also the [Web performance README](https://github.com/w3c/web-performance/blob/gh-pages/README.md). 12 | 13 | ## PerformanceMark 14 | 15 | User Timing enables developers to create a `PerformanceMark`, which contains a 16 | provided string in `name`, a high resolution timestamp (see 17 | [hr-time](https://w3c.github.io/hr-time/)) in its `startTime`, and optionally 18 | some additional metadata in its `detail`. 19 | A developer can create such an object via `performance.mark()` and can query 20 | existing entries via `performance.getEntriesByType('mark')` (or other getters; 21 | see [performance-timeline](https://w3c.github.io/performance-timeline/)) or via 22 | a `PerformanceObserver`. 23 | 24 | The following examples illustrate usage of `performance.mark()` with various 25 | parameters: 26 | 27 | * `performance.mark('mark1')`: Creates a `PerformanceMark` whose name is 'mark1' 28 | and whose `startTime` is the current high resolution time. 29 | * `performance.mark('mark2', {startTime: 5.4, detail: det})`: Creates a 30 | `PerformanceMark` whose name is 'mark2', whose `startTime` is 5.4, and whose 31 | `detail` is the object `det`. 32 | 33 | ### clearMarks 34 | 35 | Every time `performance.mark()` is invoked, a new `PerformanceMark` entry needs 36 | to be stored by the browser so that it can later be queried by the developer. 37 | The `performance.clearMarks()` method enables developers to free some of the 38 | memory used by such calls. In particular: 39 | 40 | * `performance.clearMarks()`: Clears all marks from the `Window` or `Worker` from 41 | which the method is invoked. 42 | * `performance.clearMarks('mark1')`: Clears all marks whose name is 'mark1' from 43 | the `Window` or `Worker` from which the method is invoked. 44 | 45 | ## PerformanceMeasure 46 | 47 | User Timing also enables developers to create a `PerformanceMeasure`, which is an 48 | object that represents some time measurement between two points in time, where each 49 | point in time may be represented by a `PerformanceMark`. A `PerformanceMeasure` 50 | has an associated `name`, a high resolution timestamp corresponding to the initial 51 | point in time in `startTime`, the delta between the end-time and the `startTime` in 52 | `duration`, and optionally some additional metadata in its `detail`.
53 | A developer can create a `PerformanceMeasure` entry via `performance.measure()` and 54 | can query existing entries via `performance.getEntriesByType('measure')` or via 55 | `PerformanceObserver` (i.e. in the same way as for `PerformanceMark`). 56 | 57 | The following examples illustrate usage of `performance.measure()` with various 58 | parameters: 59 | 60 | * `performance.measure('measure1')`: Creates a `PerformanceMeasure` whose `name` is 61 | 'measure1', whose `startTime` is 0, and whose `duration` is the current high 62 | resolution time. 63 | * `performance.measure('measure2', 'requestStart')`: Creates a 64 | `PerformanceMeasure` whose name is 'measure2', whose `startTime` is equal to 65 | `performance.timing.requestStart - performance.timing.navigationStart` (see the 66 | [PerformanceTiming](https://w3c.github.io/navigation-timing/#the-performancetiming-interface) 67 | interface for all the strings that are treated as special values in 68 | `performance.measure()` calls), and whose end-time is the current high resolution 69 | timestamp; `duration` will be the delta between the end-time and `startTime`. 70 | * `performance.measure('measure2', 'myMark')`: Creates a `PerformanceMeasure` whose 71 | name is 'measure2', whose `startTime` is the timestamp from the `PerformanceMark` 72 | whose `name` is `'myMark'`, and whose end-time is the current high resolution timestamp. 73 | If there are no marks with such a `name`, an error is thrown. If there are multiple marks 74 | with such a `name`, the latest one is used. 75 | * `performance.measure('measure3', 'startMark', 'endMark')`: Creates a `PerformanceMeasure` 76 | whose `name` is 'measure3', whose `startTime` is the `startTime` of the latest 77 | `PerformanceMark` with `name` 'startMark', and whose end-time is the `startTime` of the 78 | latest `PerformanceMark` with `name` 'endMark'. 79 | * `performance.measure('measure4', 'startMark', 'domInteractive')`: Creates a 80 | `PerformanceMeasure` whose `name` is 'measure4', whose `startTime` is as in the above example, 81 | and whose end-time is `performance.timing.domInteractive - performance.timing.navigationStart`. 82 | * `performance.measure('measure5', {start: 6.0, detail: det})`: Creates a 83 | `PerformanceMeasure` whose `name` is 'measure5', whose `startTime` is 6.0, whose end-time is the 84 | current high resolution timestamp, and whose `detail` is the object `det`. 85 | * `performance.measure('measure6', {start: 'mark1', end: 'mark2'})`: Creates a 86 | `PerformanceMeasure` whose `name` is 'measure6', whose `startTime` is the `startTime` of 87 | the `PerformanceMark` with `name` equal to 'mark1', and whose end-time is the `startTime` 88 | of the `PerformanceMark` with `name` equal to 'mark2'. 89 | * `performance.measure('measure7', {end: 10.5, duration: 'mark1'})`: Creates a 90 | `PerformanceMeasure` whose `name` is 'measure7', whose `duration` is the `startTime` of the 91 | `PerformanceMark` with `name` equal to 'mark1', and whose `startTime` is equal to 92 | `10.5 - duration` (since the end-time is 10.5). 93 | * `performance.measure('measure8', {start: 20.2, duration: 2, detail: det})`: Creates a 94 | `PerformanceMeasure` with `name` set to 'measure8', `startTime` set to 20.2, `duration` 95 | set to 2, and `detail` set to the object `det`. 96 | 97 | The following examples would throw errors and hence illustrate incorrect usage of 98 | `performance.measure()`: 99 | 100 | * `performance.measure('m', {start: 'mark1'}, 'mark2')`: If the second parameter is a 101 | dictionary, then the third parameter must not be provided.
102 | * `performance.measure('m', {duration: 2.0})`: In the dictionary, one of `start` or `end` 103 | must be provided. 104 | * `performance.measure('m', {start: 1, end: 4, duration: 3})`: In the dictionary, not all 105 | of `start`, `end`, and `duration` should be provided. 106 | 107 | ### clearMeasures 108 | 109 | This is the analogue of `performance.clearMarks()` for `PerformanceMeasure` cleanup: 110 | 111 | * `performance.clearMeasures()` clears all `PerformanceMeasure` objects. 112 | * `performance.clearMeasures('measure1')` clears all `PerformanceMeasure` objects whose `name` 113 | is 'measure1'. 114 | 115 | ## Relationship with hr-time 116 | 117 | A developer can obtain a high resolution timestamp directly via `performance.now()`, as 118 | defined in [hr-time](https://w3c.github.io/hr-time/#now-method). However, User Timing enables 119 | tracking timestamps that may be produced in very different parts of the page by enabling the 120 | developer to use names to identify these timestamps. Using User Timing instead of variables 121 | containing `performance.now()` enables the data to be surfaced automatically by analytics 122 | providers that have User Timing support as well as in the developer tooling of browsers that 123 | support exposing these timings. This kind of automatic surfacing is not possible directly via 124 | hr-time. 125 | 126 | For instance, the following could be used to track the time from when a user first creates a 127 | cart to when the user completes their order, assuming a Single-Page-App architecture 128 | (`computeInitialDetail()` and `computeFinalDetail()` are placeholders for site-specific logic): 129 | 130 | ```js 131 | // Called when the user first clicks the "Add to cart" button. 132 | function onBeginCart() { 133 |   const initialDetail = computeInitialDetail(); // Some initial metadata about the user. 134 |   performance.mark('beginCart', {detail: initialDetail}); 135 | } 136 | 137 | // Called after the user clicks the "Complete transaction" button in checkout. 138 | function onCompleteTransaction() { 139 |   const finalDetail = computeFinalDetail(); // Some final metadata about the user and the transaction. 140 |   performance.measure('transaction', {start: 'beginCart', detail: finalDetail}); 141 | } 142 | ``` 143 | 144 | While developers could calculate those time measurements using `performance.now()`, using User 145 | Timing enables both better ergonomics and standardized collection of the results. The latter 146 | enables analytics providers and developer tools to collect and report site-specific measurements, 147 | without requiring any knowledge of the site or its conventions. 148 | -------------------------------------------------------------------------------- /index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | User Timing 5 | 6 | 7 | 9 | 71 | 72 | 73 |
74 |

This specification defines an interface to help web 75 | developers measure the performance of their applications by giving them access 76 | to high precision timestamps.

77 |
78 |
79 |

This User Timing specification is intended to supersede [[USER-TIMING-2]] and includes:

80 | 84 |
85 |
86 |

Introduction

87 |

Web developers need the ability to assess and understand the performance characteristics of their applications. While JavaScript [[ECMA-262]] provides a mechanism to measure application latency (retrieving the current timestamp from the Date.now() method), the precision of this timestamp varies between user agents.

88 |

This document defines the PerformanceMark and PerformanceMeasure interfaces, and extensions to the Performance interface, which expose a high precision, monotonically increasing timestamp so that developers can better measure the performance characteristics of their applications.

89 |
90 |

The following script shows how a developer can use the interfaces defined in this document to obtain timing data related to developer scripts.

91 |
 92 |         async function run() {
 93 |           performance.mark("startTask1");
 94 |           await doTask1(); // Some developer code
 95 |           performance.mark("endTask1");
 96 | 
 97 |           performance.mark("startTask2");
 98 |           await doTask2(); // Some developer code
 99 |           performance.mark("endTask2");
100 | 
101 |           // Log them out
102 |           const entries = performance.getEntriesByType("mark");
103 |           for (const entry of entries) {
104 |             console.table(entry.toJSON());
105 |           }
106 |         }
107 |         run();
108 |       
109 |

[[PERFORMANCE-TIMELINE-2]] defines two mechanisms that 110 | can be used to retrieve recorded metrics: getEntries() 111 | and getEntriesByType() methods, and the 112 | PerformanceObserver interface. The former is best suited 113 | for cases where you want to retrieve a particular metric by name at a 114 | single point in time, and the latter is optimized for cases where you 115 | may want to receive notifications of new metrics as they become 116 | available.

117 |
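For example, the observer-based approach can be used as follows (a non-normative sketch; the logging shown is illustrative):

```js
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // Each entry is a PerformanceMark or PerformanceMeasure.
    console.log(`${entry.entryType} "${entry.name}" at ${entry.startTime}`);
  }
});
// buffered: true also delivers entries recorded before observe() was called.
observer.observe({type: "mark", buffered: true});
observer.observe({type: "measure", buffered: true});
```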

As another example, suppose that there is an element which, when 118 | clicked, fetches some new content and indicates that it has been fetched. 119 | We'd like to report the time from when the user clicked to when the fetch 120 | was complete. We can't mark the time at which the click handler executes, since 121 | that would miss the latency of dispatching the event, so instead we use the 122 | event's hardware timestamp. We also want to know the name of the component, to 123 | enable more detailed analytics.

124 |
125 |         element.addEventListener("click", e => {
126 |           const component = getComponent(element);
127 |           fetch(component.url).then(() => {
128 |             element.textContent = "Updated";
129 |             const updateMark = performance.mark("update_component", {
130 |               detail: {component: component.name},
131 |             });
132 |             performance.measure("click_to_update_component", {
133 |               detail: {component: component.name},
134 |               start: e.timeStamp,
135 |               end: updateMark.startTime,
136 |             });
137 |           });
138 |         });
139 |       
140 |
141 |
142 |
143 |

Some conformance requirements are phrased as requirements on attributes, 144 | methods or objects. Such requirements are to be interpreted as requirements 145 | on user agents.

146 | 147 |

The IDL fragments in this specification MUST be interpreted as 148 | required for conforming IDL fragments, as described in the Web IDL 149 | specification. [[WEBIDL]]

150 |
151 |
152 |

User Timing

153 |
154 |

Extensions to the Performance interface

155 |

The Performance interface and DOMHighResTimeStamp are defined in [[HR-TIME-2]]. 156 | The PerformanceEntry interface is defined in [[PERFORMANCE-TIMELINE-2]]. 157 |

158 |
159 |         dictionary PerformanceMarkOptions {
160 |             any detail;
161 |             DOMHighResTimeStamp startTime;
162 |         };
163 | 
164 |         dictionary PerformanceMeasureOptions {
165 |             any detail;
166 |             (DOMString or DOMHighResTimeStamp) start;
167 |             DOMHighResTimeStamp duration;
168 |             (DOMString or DOMHighResTimeStamp) end;
169 |         };
170 | 
171 |         partial interface Performance {
172 |             PerformanceMark mark(DOMString markName, optional PerformanceMarkOptions markOptions = {});
173 |             undefined clearMarks(optional DOMString markName);
174 |             PerformanceMeasure measure(DOMString measureName, optional (DOMString or PerformanceMeasureOptions) startOrMeasureOptions = {}, optional DOMString endMark);
175 |             undefined clearMeasures(optional DOMString measureName);
176 |         };
177 |       
178 |
179 |

mark() method

180 |

Stores a timestamp with the associated name (a "mark"). It MUST run these steps:

181 |
1. Run the PerformanceMark constructor and let entry be the newly created object.
2. Queue entry.
3. Add entry to the performance entry buffer.
4. Return entry.
187 |
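As an informal illustration of these steps (non-normative; the names used are arbitrary):

```js
// Stores a "parsed" mark at the current high resolution time,
// queues it for observers, adds it to the entry buffer, and returns it.
const entry = performance.mark("parsed", {detail: {items: 12}});
console.log(entry.startTime, entry.detail.items);
```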
188 |

PerformanceMarkOptions dictionary

189 |
190 |
detail
191 |
Metadata to be included in the mark.
192 |
startTime
193 |
Timestamp to be used as the mark time.
194 |
195 |
196 |
197 |
198 |
199 |

clearMarks() method

200 |

Removes the stored timestamp with the associated name. It MUST run these steps:

201 |
1. If markName is omitted, remove all PerformanceMark objects from the performance entry buffer.
2. Otherwise, remove all PerformanceMark objects listed in the performance entry buffer whose name [=string/is=] markName.
3. Return undefined.
206 |
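A non-normative sketch of the resulting behavior (the mark names are illustrative):

```js
performance.mark("a");
performance.mark("a");
performance.mark("b");
performance.clearMarks("a"); // removes both "a" marks
console.log(performance.getEntriesByType("mark").length); // 1 ("b" remains)
```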
207 |
208 |

measure() method

209 |

Stores the {{DOMHighResTimeStamp}} duration between two marks along with the associated name (a "measure"). It MUST run these steps:

210 |
1. If startOrMeasureOptions is a PerformanceMeasureOptions object and at least one of start, end, duration, and detail [=map/exist=], run the following checks:
   1. If endMark is given, throw a TypeError.
   2. If startOrMeasureOptions's start and end members are both omitted, throw a TypeError.
   3. If startOrMeasureOptions's start, duration, and end members all [=map/exist=], throw a TypeError.
2. Compute end time as follows:
   1. If endMark is given, let end time be the value returned by running the convert a mark to a timestamp algorithm passing in endMark.
   2. Otherwise, if startOrMeasureOptions is a PerformanceMeasureOptions object, and if its end member [=map/exists=], let end time be the value returned by running the convert a mark to a timestamp algorithm passing in startOrMeasureOptions's end.
   3. Otherwise, if startOrMeasureOptions is a PerformanceMeasureOptions object, and if its start and duration members both [=map/exist=]:
      1. Let start be the value returned by running the convert a mark to a timestamp algorithm passing in start.
      2. Let duration be the value returned by running the convert a mark to a timestamp algorithm passing in duration.
      3. Let end time be start plus duration.
   4. Otherwise, let end time be the value that would be returned by the Performance object's now() method.
3. Compute start time as follows:
   1. If startOrMeasureOptions is a PerformanceMeasureOptions object, and if its start member [=map/exists=], let start time be the value returned by running the convert a mark to a timestamp algorithm passing in startOrMeasureOptions's start.
   2. Otherwise, if startOrMeasureOptions is a PerformanceMeasureOptions object, and if its duration and end members both [=map/exist=]:
      1. Let duration be the value returned by running the convert a mark to a timestamp algorithm passing in duration.
      2. Let end be the value returned by running the convert a mark to a timestamp algorithm passing in end.
      3. Let start time be end minus duration.
   3. Otherwise, if startOrMeasureOptions is a DOMString, let start time be the value returned by running the convert a mark to a timestamp algorithm passing in startOrMeasureOptions.
   4. Otherwise, let start time be 0.
4. Create a new PerformanceMeasure object (entry) with this's relevant realm.
5. Set entry's name attribute to measureName.
6. Set entry's entryType attribute to DOMString "measure".
7. Set entry's startTime attribute to start time.
8. Set entry's duration attribute to the duration from start time to end time. The resulting duration value MAY be negative.
9. Set entry's detail attribute as follows:
   1. If startOrMeasureOptions is a PerformanceMeasureOptions object and startOrMeasureOptions's detail member [=map/exists=]:
      1. Let record be the result of calling the StructuredSerialize algorithm on startOrMeasureOptions's detail.
      2. Set entry's detail to the result of calling the StructuredDeserialize algorithm on record and the current realm.
   2. Otherwise, set it to null.
10. Queue entry.
11. Add entry to the performance entry buffer.
12. Return entry.
271 |
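A non-normative sketch of how the end-time and start-time computations play out (the mark names and numbers are illustrative):

```js
performance.mark("fetchStart");
// startOrMeasureOptions is a DOMString: start time comes from the latest
// "fetchStart" mark, and end time defaults to performance.now().
performance.measure("fetch", "fetchStart");

// start and duration both exist: end time = 5 + 10 = 15.
performance.measure("fixed", {start: 5, duration: 10});

// duration and end both exist: start time = 15 - 10 = 5.
performance.measure("backwards", {end: 15, duration: 10});
```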
272 |

PerformanceMeasureOptions dictionary

273 |
274 |
detail
275 |
Metadata to be included in the measure.
276 |
start
277 |
Timestamp to be used as the start time or string to be used as start mark.
278 |
duration
279 |
Duration between the start and end times.
280 |
end
281 |
Timestamp to be used as the end time or string to be used as end mark.
282 |
283 |
284 |
285 |
286 |
287 |

clearMeasures() method

288 |

Removes the stored timestamp with the associated name. It MUST run these steps:

289 |
1. If measureName is omitted, remove all PerformanceMeasure objects in the performance entry buffer.
2. Otherwise remove all PerformanceMeasure objects listed in the performance entry buffer whose name [=string/is=] measureName.
3. Return undefined.
294 |
295 |
296 |
297 |

The PerformanceMark Interface

298 |

The PerformanceMark interface also exposes marks created via the {{Performance}} interface's {{Performance/mark()}} method to the Performance Timeline.

299 |
300 |         [Exposed=(Window,Worker)]
301 |         interface PerformanceMark : PerformanceEntry {
302 |           constructor(DOMString markName, optional PerformanceMarkOptions markOptions = {});
303 |           readonly attribute any detail;
304 |         };
305 |       
306 |

The PerformanceMark interface extends the following attributes of the {{PerformanceEntry}} 307 | interface:

308 |

The name attribute must return the mark's name.

309 |

The entryType attribute must return the DOMString "mark".

310 |

The startTime attribute must return a {{DOMHighResTimeStamp}} with the mark's time value.

311 |

The duration attribute must return a {{DOMHighResTimeStamp}} of value 0.

312 |

The PerformanceMark interface contains the following additional attribute:

313 |

The detail attribute must return the value it is set to (it's copied from the PerformanceMarkOptions dictionary).

314 |
315 |
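A brief non-normative illustration of these attribute values (the mark name is arbitrary):

```js
const m = performance.mark("checkpoint");
console.log(m.name);      // "checkpoint"
console.log(m.entryType); // "mark"
console.log(m.duration);  // 0 for marks
```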

The PerformanceMark Constructor

316 |

The PerformanceMark constructor must run the following steps:

317 |
1. If the current global object is a Window object and markName uses the same name as a read only attribute in the PerformanceTiming interface, throw a SyntaxError.
2. Create a new PerformanceMark object (entry) with the current global object's realm.
3. Set entry's name attribute to markName.
4. Set entry's entryType attribute to DOMString "mark".
5. Set entry's startTime attribute as follows:
   1. If markOptions's startTime member [=map/exists=], then:
      1. If markOptions's startTime is negative, throw a TypeError.
      2. Otherwise, set entry's startTime to the value of markOptions's startTime.
   2. Otherwise, set it to the value that would be returned by the Performance object's now() method.
6. Set entry's duration attribute to 0.
7. If markOptions's detail is null, set entry's detail to null.
8. Otherwise:
   1. Let record be the result of calling the StructuredSerialize algorithm on markOptions's detail.
   2. Set entry's detail to the result of calling the StructuredDeserialize algorithm on record and the current realm.
344 |
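Two non-normative consequences of these steps worth noting (the names used are illustrative):

```js
// detail is structured-cloned, so later mutations of the original
// object are not reflected in the entry.
const det = {step: 1};
performance.mark("m", {detail: det});
det.step = 2;
console.log(performance.getEntriesByName("m")[0].detail.step); // 1

// The constructor only creates the object; unlike performance.mark(),
// it does not queue it or add it to the performance entry buffer.
const standalone = new PerformanceMark("standalone", {startTime: 1.5});
console.log(performance.getEntriesByName("standalone").length); // 0
```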
345 |
346 |
347 |

The PerformanceMeasure Interface

348 |

The PerformanceMeasure interface also exposes measures created via the {{Performance}} interface's {{Performance/measure()}} method to the Performance Timeline.

349 |
350 |         [Exposed=(Window,Worker)]
351 |         interface PerformanceMeasure : PerformanceEntry {
352 |           readonly attribute any detail;
353 |         };
354 |       
355 |

The PerformanceMeasure interface extends the following attributes of the {{PerformanceEntry}} interface:

356 |

The name attribute must return the measure's name.

357 |

The entryType attribute must return the DOMString "measure".

358 |

The startTime attribute must return a {{DOMHighResTimeStamp}} with the measure's start time.

359 |

The duration attribute must return a {{DOMHighResTimeStamp}} with the duration of the measure.

360 |

The PerformanceMeasure interface contains the following additional attribute:

361 |

The detail attribute must return the value it is set to (it's copied from the PerformanceMeasureOptions dictionary).

362 |
363 |
364 |
365 |

Processing

366 |

A user agent implementing the User Timing API would need to include "mark" and 367 | "measure" in 368 | supportedEntryTypes. This allows developers to detect support for User Timing.

369 |
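For example, such feature detection could look like this (non-normative):

```js
if (PerformanceObserver.supportedEntryTypes.includes("mark")) {
  // User Timing marks are supported in this browser.
}
```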
370 |

Convert a mark to a timestamp

371 |

To convert a mark to a timestamp given a mark that is a DOMString or {{DOMHighResTimeStamp}}, run these steps: 372 |

1. If mark is a DOMString and it has the same name as a read only attribute in the PerformanceTiming interface, let end time be the value returned by running the convert a name to a timestamp algorithm with name set to the value of mark.
2. Otherwise, if mark is a DOMString, let end time be the value of the startTime attribute from the most recent occurrence of a PerformanceMark object in the performance entry buffer whose name [=string/is=] mark. If no matching entry is found, throw a SyntaxError.
3. Otherwise, if mark is a {{DOMHighResTimeStamp}}:
   1. If mark is negative, throw a TypeError.
   2. Otherwise, let end time be mark.
383 |
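A non-normative sketch of the three branches (the mark names are illustrative):

```js
performance.mark("m1");
performance.measure("fromMark", "m1");             // DOMString: latest "m1" startTime
performance.measure("fromNumber", {start: 250.0}); // DOMHighResTimeStamp: used as-is
try {
  performance.measure("bad", "noSuchMark");        // no matching mark
} catch (e) {
  console.log(e.name); // "SyntaxError"
}
```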
384 |
385 |

Convert a name to a timestamp

386 |

To convert a name to a timestamp given a name that is a read only attribute in the PerformanceTiming interface, run these steps:

387 |

    388 |
  1. If the global object is not a Window object, throw a TypeError.
  2. 389 |
  3. If name is navigationStart, return 0.
  4. 390 |
  5. Let startTime be the value of navigationStart in the PerformanceTiming interface.
  6. 391 |
  7. Let endTime be the value of name in the PerformanceTiming interface.
  8. 392 |
  9. If endTime is 0, throw an InvalidAccessError.
  10. 393 |
  11. Return result of subtracting startTime from endTime.
  12. 394 |
395 |

396 | The PerformanceTiming interface was defined in [[NAVIGATION-TIMING]] and is now considered obsolete. The use of names from the PerformanceTiming interface is supported to remain backwards compatible, but there are no plans to extend this functionality to names in the PerformanceNavigationTiming interface defined in [[NAVIGATION-TIMING-2]] (or other interfaces) in the future. 397 |

398 |
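For example (non-normative), since responseStart is a read only attribute of the PerformanceTiming interface, the following measures time to first byte relative to navigationStart:

```js
// "navigationStart" converts to 0; "responseStart" converts to
// performance.timing.responseStart - performance.timing.navigationStart.
performance.measure("ttfb", "navigationStart", "responseStart");
```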
399 |
400 | 468 |
469 |

Privacy and Security

470 |

The interfaces defined in this specification expose potentially 471 | sensitive timing information on specific JavaScript activity of a page. 472 | Please refer to [[HR-TIME-2]] for privacy and security considerations of 473 | exposing high-resolution timing information.

474 |

Because the web platform has been designed with the invariant that any 475 | script included on a page has the same access as any other script included 476 | on the same page, regardless of the origin of either script, the 477 | interfaces defined by this specification do not place any restrictions on 478 | recording or retrieval of recorded timing information; i.e., a user timing 479 | mark or measure recorded by any script included on the page can be read by 480 | any other script running on the same page, regardless of origin.

481 |
482 |
483 |

Acknowledgments

484 |

Thanks to 485 | James Simonsen, 486 | Jason Weber, 487 | Nic Jansma, 488 | Philippe Le Hegaret, 489 | Karen Anderson, 490 | Steve Souders, 491 | Sigbjorn Vik, 492 | Todd Reifsteck, and 493 | Tony Gentilcore 494 | for their contributions to this work.

495 |
496 | 497 | 498 | --------------------------------------------------------------------------------