├── .pr-preview.json ├── tidyconfig.txt ├── LICENSE.md ├── w3c.json ├── CODE_OF_CONDUCT.md ├── .github └── workflows │ └── auto-publish.yml ├── CONTRIBUTING.md ├── README.md └── index.html /.pr-preview.json: -------------------------------------------------------------------------------- 1 | { 2 | "src_file": "index.html", 3 | "type": "respec" 4 | } 5 | -------------------------------------------------------------------------------- /tidyconfig.txt: -------------------------------------------------------------------------------- 1 | char-encoding: utf8 2 | indent: yes 3 | wrap: 80 4 | tidy-mark: no 5 | -------------------------------------------------------------------------------- /LICENSE.md: -------------------------------------------------------------------------------- 1 | All documents in this Repository are licensed by contributors 2 | under the 3 | [W3C Software and Document License](http://www.w3.org/Consortium/Legal/copyright-software). 4 | 5 | -------------------------------------------------------------------------------- /w3c.json: -------------------------------------------------------------------------------- 1 | { 2 | "group": 45211, 3 | "contacts": [ 4 | "plehegar", 5 | "siusin" 6 | ], 7 | "shortName": "user-timing-3", 8 | "policy": "open", 9 | "repo-type": "rec-track" 10 | } -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Code of Conduct 2 | 3 | All documentation, code and communication under this repository are covered by the [W3C Code of Ethics and Professional Conduct](https://www.w3.org/Consortium/cepc/). 
4 | -------------------------------------------------------------------------------- /.github/workflows/auto-publish.yml: -------------------------------------------------------------------------------- 1 | name: CI 2 | on: 3 | pull_request: {} 4 | push: 5 | branches: [gh-pages] 6 | jobs: 7 | main: 8 | name: Build, Validate and Deploy 9 | runs-on: ubuntu-latest 10 | steps: 11 | - uses: actions/checkout@v2 12 | - uses: w3c/spec-prod@v2 13 | with: 14 | W3C_ECHIDNA_TOKEN: ${{ secrets.W3C_TR_TOKEN }} 15 | W3C_WG_DECISION_URL: https://lists.w3.org/Archives/Public/public-web-perf/2021Apr/0005.html 16 | W3C_BUILD_OVERRIDE: | 17 | shortName: user-timing 18 | specStatus: CRD 19 | historyURI: "https://www.w3.org/standards/history/user-timing" 20 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | Contributions to this repository are intended to become part of Recommendation-track documents governed by the 2 | [W3C Patent Policy](https://www.w3.org/Consortium/Patent-Policy-20040205/) and 3 | [Software and Document License](https://www.w3.org/Consortium/Legal/copyright-software). To make substantive contributions to specifications, you must either participate 4 | in the relevant W3C Working Group or make a non-member patent licensing commitment. 5 | 6 | For our editing test-driven process, see [CONTRIBUTING.md](https://github.com/w3c/web-performance/blob/gh-pages/CONTRIBUTING.md). 7 | 8 | If you are not the sole contributor to a contribution (pull request), please identify all 9 | contributors in the pull request comment. 
10 | 11 | To add a contributor (other than yourself, that's automatic), mark them one per line as follows: 12 | 13 | ``` 14 | +@github_username 15 | ``` 16 | 17 | If you added a contributor by mistake, you can remove them in a comment with: 18 | 19 | ``` 20 | -@github_username 21 | ``` 22 | 23 | If you are making a pull request on behalf of someone else but you had no part in designing the 24 | feature, you can remove yourself with the above syntax. 25 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | User Timing 2 | =========== 3 | 4 | This specification defines an interface to help web developers measure the 5 | performance of their applications by giving them access to high precision 6 | timestamps and enabling them to make comparisons between them. 7 | 8 | * Read the latest draft: https://w3c.github.io/user-timing/ 9 | * Discuss on [public-webperf](http://www.w3.org/Search/Mail/Public/search?keywords=%5BUserTiming%5D&hdr-1-name=subject&hdr-1-query=&index-grp=Public_FULL&index-type=t&type-index=public-web-perf) 10 | 11 | See also [Web performance README](https://github.com/w3c/web-performance/blob/gh-pages/README.md) 12 | 13 | ## PerformanceMark 14 | 15 | User Timing enables developers to create a `PerformanceMark`, which contains a 16 | provided string in `name`, with a high resolution timestamp (see 17 | [hr-time](https://w3c.github.io/hr-time/)) in its `startTime`, and optionally 18 | some additional metadata in its `detail`. 19 | A developer can create such an object via `performance.mark()` and can query 20 | existing entries via `performance.getEntriesByType('mark')` (or other getters; 21 | see [performance-timeline](https://w3c.github.io/performance-timeline/)) or via 22 | the `PerformanceObserver`.
23 | 24 | The following examples illustrate usage of `performance.mark()` with various 25 | parameters: 26 | 27 | * `performance.mark('mark1')`: Creates a `PerformanceMark` whose name is 'mark1' 28 | and whose `startTime` is the current high resolution time. 29 | * `performance.mark('mark2', {startTime: 5.4, detail: det})`: Creates a 30 | `PerformanceMark` whose name is 'mark2', whose `startTime` is 5.4, and whose 31 | `detail` is the object `det`. 32 | 33 | ### clearMarks 34 | 35 | Every time `performance.mark()` is invoked, a new `PerformanceMark` entry needs 36 | to be stored by the browser so that it can be later queried by the developer. 37 | The `performance.clearMarks()` method enables developers to clear some of the 38 | memory used by such calls. In particular: 39 | 40 | * `performance.clearMarks()`: Clears all marks from the `Window` or `Worker` from 41 | which the method is invoked. 42 | * `performance.clearMarks('mark1')`: Clears all marks whose name is 'mark1' from the `Window` or `Worker` 43 | from which the method is invoked. 44 | 45 | ## PerformanceMeasure 46 | 47 | User Timing also enables developers to create a `PerformanceMeasure`, which is an 48 | object that represents some time measurement between two points in time, and each 49 | point in time may be represented by a `PerformanceMark`. A `PerformanceMeasure` 50 | has an associated `name`, a high resolution timestamp corresponding to the initial 51 | point in time in `startTime`, the delta between the end-time and the `startTime` in 52 | `duration`, and optionally some additional metadata in its `detail`. 53 | A developer can create a `PerformanceMeasure` entry via `performance.measure()` and 54 | can query existing entries via `performance.getEntriesByType('measure')` or via 55 | `PerformanceObserver` (i.e. in the same way as for `PerformanceMark`). 
56 | 57 | The following examples illustrate usage of `performance.measure()` with various 58 | parameters: 59 | 60 | * `performance.measure('measure1')`: Creates a `PerformanceMeasure` whose `name` is 61 | 'measure1', whose `startTime` is 0, and whose `duration` is the current high 62 | resolution time. 63 | * `performance.measure('measure2', 'requestStart')`: Creates a 64 | `PerformanceMeasure` whose name is 'measure2', whose `startTime` is equal to 65 | `performance.timing.requestStart - performance.timing.navigationStart` (see the 66 | [PerformanceTiming](https://w3c.github.io/navigation-timing/#the-performancetiming-interface) 67 | interface for all the strings that are treated as special values in 68 | `performance.measure()` calls), and whose end-time is the current high resolution 69 | timestamp --- `duration` will be the delta between the end-time and `startTime`. 70 | * `performance.measure('measure2', 'myMark')`: Creates a `PerformanceMeasure` where 71 | name is 'measure2', where `startTime` is the timestamp from the `PerformanceMark` 72 | whose `name` is `'myMark'`, and where end-time is the current high resolution timestamp. 73 | If there are no marks with such a `name`, an error is thrown. If there are multiple marks 74 | with such a `name`, the latest one is used. 75 | * `performance.measure('measure3', 'startMark', 'endMark')`: Creates a `PerformanceMeasure` 76 | whose `name` is 'measure3', whose `startTime` is the `startTime` of the latest 77 | `PerformanceMark` with `name` 'startMark', and whose end-time is the `startTime` of the 78 | latest `PerformanceMark` with `name` 'endMark'. 79 | * `performance.measure('measure4', 'startMark', 'domInteractive')`: Creates a 80 | `PerformanceMeasure` whose `name` is 'measure4', `startTime` is as in the above example, 81 | and end-time is `performance.timing.domInteractive - performance.timing.navigationStart`. 
82 | * `performance.measure('measure5', {start: 6.0, detail: det})`: Creates a 83 | `PerformanceMeasure` whose `name` is 'measure5', `startTime` is 6.0, end-time is the 84 | current high resolution timestamp, and `detail` is the object `det`. 85 | * `performance.measure('measure6', {start: 'mark1', end: 'mark2'})`: Creates a 86 | `PerformanceMeasure` whose `name` is 'measure6', `startTime` is the `startTime` of 87 | the `PerformanceMark` with `name` equal to 'mark1', and end-time is the `startTime` 88 | of the `PerformanceMark` with `name` equal to 'mark2'. 89 | * `performance.measure('measure7', {end: 10.5, duration: 'mark1'})`: Creates a 90 | `PerformanceMeasure` where `name` is 'measure7', `duration` is the `startTime` of the 91 | `PerformanceMark` with `name` equal to 'mark1', and `startTime` is equal to 92 | `10.5 - duration` (since end-time is 10.5). 93 | * `performance.measure('measure8', {start: 20.2, duration: 2, detail: det})`: Creates a 94 | `PerformanceMeasure` with `name` set to 'measure8', `startTime` set to 20.2, `duration` 95 | set to 2, and `detail` set to the object `det`. 96 | 97 | The following examples would throw errors and hence illustrate incorrect usage of 98 | `performance.measure()`: 99 | 100 | * `performance.measure('m', {start: 'mark1'}, 'mark2')`: If the second parameter is a 101 | dictionary, then the third parameter must not be provided. 102 | * `performance.measure('m', {duration: 2.0})`: In the dictionary, one of `start` or `end` 103 | must be provided. 104 | * `performance.measure('m', {start: 1, end: 4, duration: 3})`: In the dictionary, at most two 105 | of `start`, `end`, and `duration` may be provided. 106 | 107 | ### clearMeasures 108 | 109 | This is the analogue to `performance.clearMarks()` for `PerformanceMeasure` cleanup: 110 | 111 | * `performance.clearMeasures()` clears all `PerformanceMeasure` objects. 112 | * `performance.clearMeasures('measure1')` clears `PerformanceMeasure` objects whose `name` 113 | is 'measure1'. 
114 | 115 | ## Relationship with hr-time 116 | 117 | A developer can obtain a high resolution timestamp directly via `performance.now()`, as 118 | defined in [hr-time](https://w3c.github.io/hr-time/#now-method). However, User Timing enables 119 | tracking timestamps that may happen in very different parts of the page by enabling the 120 | developer to use names to identify these timestamps. Using User Timing instead of variables 121 | containing `performance.now()` enables the data to be surfaced automatically by analytics 122 | providers that have User Timing support as well as in the developer tooling of browsers that 123 | support exposing these timings. This kind of automatic surfacing is not possible directly via 124 | HR-Time. 125 | 126 | For instance, the following could be used to track the time from when a user's 127 | cart is created to when the user completes their order, assuming a single-page-app 128 | architecture: 129 | 130 | ```js 131 | // Called when the user first clicks the "Add to cart" button. 132 | function onBeginCart() { 133 | const initialDetail = {}; // Compute some initial metadata about the user. 134 | performance.mark('beginCart', {detail: initialDetail}); 135 | } 136 | 137 | // Called after the user clicks the "Complete transaction" button in checkout. 138 | function onCompleteTransaction() { 139 | const finalDetail = {}; // Compute some final metadata about the user and the transaction. 140 | performance.measure('transaction', {start: 'beginCart', detail: finalDetail}); 141 | } 142 | ``` 143 | 144 | While developers could calculate those time measurements using `performance.now()`, using User 145 | Timing enables both better ergonomics and standardized collection of the results. The latter 146 | enables analytics providers and developer tools to collect and report site-specific measurements, 147 | without requiring any knowledge of the site or its conventions. 
148 | -------------------------------------------------------------------------------- /index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 |
4 |This specification defines an interface to help web 75 | developers measure the performance of their applications by giving them access 76 | to high precision timestamps.
77 |This User Timing specification is intended to supersede [[USER-TIMING-2]] and includes:
80 |Web developers need the ability to assess and understand the performance characteristics of their applications. While JavaScript [[ECMA-262]] provides a mechanism to measure application latency (retrieving the current timestamp from the Date.now() method), the precision of this timestamp varies between user agents.
This document defines the PerformanceMark and PerformanceMeasure interfaces, and extensions to the Performance interface, which expose a high precision, monotonically increasing timestamp so that developers can better measure the performance characteristics of their applications.
The following script shows how a developer can use the interfaces defined in this document to obtain timing data related to developer scripts.
91 |
92 | async function run() {
93 | performance.mark("startTask1");
94 | await doTask1(); // Some developer code
95 | performance.mark("endTask1");
96 |
97 | performance.mark("startTask2");
98 | await doTask2(); // Some developer code
99 | performance.mark("endTask2");
100 |
101 | // Log them out
102 | const entries = performance.getEntriesByType("mark");
103 | for (const entry of entries) {
104 | console.table(entry.toJSON());
105 | }
106 | }
107 | run();
108 |
109 | [[PERFORMANCE-TIMELINE-2]] defines two mechanisms that
110 | can be used to retrieve recorded metrics: getEntries()
111 | and getEntriesByType() methods, and the
112 | PerformanceObserver interface. The former is best suited
113 | for cases where you want to retrieve a particular metric by name at a
114 | single point in time, and the latter is optimized for cases where you
115 | may want to receive notifications of new metrics as they become
116 | available.
As another example, suppose that there is an element which, when 118 | clicked, fetches some new content and indicates that it has been fetched. 119 | We'd like to report the time from when the user clicked to when the fetch 120 | was complete. We can't mark the time at which the click handler executes, since that 121 | would miss the latency of processing the event, so instead we use the event's 122 | hardware timestamp. We also want to know the name of the component to have 123 | more detailed analytics.
124 |
125 | element.addEventListener("click", e => {
126 | const component = getComponent(element);
127 | fetch(component.url).then(() => {
128 | element.textContent = "Updated";
129 | const updateMark = performance.mark("update_component", {
130 | detail: {component: component.name},
131 | });
132 | performance.measure("click_to_update_component", {
133 | detail: {component: component.name},
134 | start: e.timeStamp,
135 | end: updateMark.startTime,
136 | });
137 | });
138 | });
139 |
140 | Some conformance requirements are phrased as requirements on attributes, 144 | methods or objects. Such requirements are to be interpreted as requirements 145 | on user agents.
146 | 147 |The IDL fragments in this specification MUST be interpreted as 148 | required for conforming IDL fragments, as described in the Web IDL 149 | specification. [[WEBIDL]]
150 |Performance interfaceThe Performance interface and DOMHighResTimeStamp are defined in [[HR-TIME-2]]. 156 | The PerformanceEntry interface is defined in [[PERFORMANCE-TIMELINE-2]]. 157 |
158 |
159 | dictionary PerformanceMarkOptions {
160 | any detail;
161 | DOMHighResTimeStamp startTime;
162 | };
163 |
164 | dictionary PerformanceMeasureOptions {
165 | any detail;
166 | (DOMString or DOMHighResTimeStamp) start;
167 | DOMHighResTimeStamp duration;
168 | (DOMString or DOMHighResTimeStamp) end;
169 | };
170 |
171 | partial interface Performance {
172 | PerformanceMark mark(DOMString markName, optional PerformanceMarkOptions markOptions = {});
173 | undefined clearMarks(optional DOMString markName);
174 | PerformanceMeasure measure(DOMString measureName, optional (DOMString or PerformanceMeasureOptions) startOrMeasureOptions = {}, optional DOMString endMark);
175 | undefined clearMeasures(optional DOMString measureName);
176 | };
177 |
178 | Stores a timestamp with the associated name (a "mark"). It MUST run these steps:
181 |Removes the stored timestamp with the associated name. It MUST run these steps:
201 |Stores the {{DOMHighResTimeStamp}} duration between two marks along with the associated name (a "measure"). It MUST run these steps:
210 |Performance object's now() method.DOMString, let start time be the value returned by running the convert a mark to a timestamp algorithm passing in startOrMeasureOptions.0.name attribute to measureName.entryType attribute to DOMString "measure".startTime attribute to start time.duration attribute to the duration from start time to end time. The resulting duration value MAY be negative.detail attribute as follows:
257 | null.Removes stored timestamp with the associated name. It MUST run these steps:
289 |name [=string/is=] measureName.The PerformanceMark interface also exposes marks created via the {{Performance}} interface's {{Performance/mark()}} method to the Performance Timeline.
299 |
300 | [Exposed=(Window,Worker)]
301 | interface PerformanceMark : PerformanceEntry {
302 | constructor(DOMString markName, optional PerformanceMarkOptions markOptions = {});
303 | readonly attribute any detail;
304 | };
305 |
306 | The PerformanceMark interface extends the following attributes of the {{PerformanceEntry}} 307 | interface:
308 |The name attribute must return the mark's name.
The entryType attribute must return the DOMString "mark".
The startTime attribute must return a {{DOMHighResTimeStamp}} with the mark's time value.
The duration attribute must return a {{DOMHighResTimeStamp}} of value 0.
The PerformanceMark interface contains the following additional attribute:
313 |The detail attribute must return the value it is set to (it's copied from the PerformanceMarkOptions dictionary).
314 |The PerformanceMark constructor must run the following steps:
317 |Window object and markName uses the same name as a read only attribute in the PerformanceTiming interface, throw a SyntaxError.name attribute to markName.entryType attribute to DOMString "mark".startTime attribute as follows:
324 |
334 | duration attribute to 0.The PerformanceMeasure interface also exposes measures created via the {{Performance}} interface's {{Performance/measure()}} method to the Performance Timeline.
349 |
350 | [Exposed=(Window,Worker)]
351 | interface PerformanceMeasure : PerformanceEntry {
352 | readonly attribute any detail;
353 | };
354 |
355 | The PerformanceMeasure interface extends the following attributes of the {{PerformanceEntry}} interface:
356 |The name attribute must return the measure's name.
The entryType attribute must return the DOMString "measure".
The startTime attribute must return a {{DOMHighResTimeStamp}} with the measure's start mark.
The duration attribute must return a {{DOMHighResTimeStamp}} with the duration of the measure.
The PerformanceMeasure interface contains the following additional attribute:
361 |The detail attribute must return the value it is set to (it's copied from the PerformanceMeasureOptions dictionary).
362 |A user agent implementing the User Timing API would need to include "mark" and
367 | "measure" in
368 | supportedEntryTypes. This allows developers to detect support for User Timing.
To convert a mark to a timestamp, given a mark that is a DOMString or {{DOMHighResTimeStamp}} run these steps:
372 |
DOMString and it has the same name as a read only attribute in the PerformanceTiming interface, let end time be the value returned by running the convert a name to a timestamp algorithm with name set to the value of mark.DOMString, let end time be the value of the startTime attribute from the most recent occurrence of a PerformanceMark object in the performance entry buffer whose name [=string/is=] mark. If no matching entry is found, throw a SyntaxError.TypeError.To convert a name to a timestamp given a name that is a read only attribute in the PerformanceTiming interface, run these steps:
387 |
Window object, throw a TypeError.navigationStart, return 0.navigationStart in the PerformanceTiming interface.PerformanceTiming interface.0, throw an InvalidAccessError.396 | The PerformanceTiming interface was defined in [[NAVIGATION-TIMING]] and is now considered obsolete. The use of names from the PerformanceTiming interface is supported to remain backwards compatible, but there are no plans to extend this functionality to names in the PerformanceNavigationTiming interface defined in [[NAVIGATION-TIMING-2]] (or other interfaces) in the future. 397 |
398 |Developers are encouraged to use the following recommended mark names to 403 | mark common timings. The user agent does not validate that the usage of 404 | these names is appropriate or consistent with its description.
405 |Adding such recommended mark names can help performance 406 | tools tailor guidance to a site. These mark names can also help real user 407 | monitoring providers and user agents collect web developer signals regarding 408 | their application's performance at scale, and surface this information to 409 | developers without requiring any site-specific work.
410 |In this example, the page asynchonously initializes a chat widget, a 416 | searchbox, and a newsfeed upon loading. When finished, the 417 | "mark_fully_loaded" mark name enables lab tools and analytics 418 | providers to automatically show the timing. 419 |
420 |
421 | window.addEventListener("load", (event) => {
422 | Promise.all([
423 | loadChatWidget(),
424 | initializeSearchAutocomplete(),
425 | initializeNewsfeed()]).then(() => {
426 | performance.mark('mark_fully_loaded');
427 | });
428 | });
429 |
430 | In this example, the ImageOptimizationComponent for FancyJavaScriptFramework 452 | is used to size images for optimal performance. The code notes 453 | this feature's usage so that lab tools and analytics can measure 454 | whether it helped improve performance. 455 |
456 |
457 | performance.mark('mark_feature_usage', {
458 | 'detail': {
459 | 'feature': 'ImageOptimizationComponent',
460 | 'framework': 'FancyJavaScriptFramework'
461 | }
462 | })
463 |
464 | The interfaces defined in this specification expose potentially 471 | sensitive timing information on specific JavaScript activity of a page. 472 | Please refer to [[HR-TIME-2]] for privacy and security considerations of 473 | exposing high-resolution timing information.
Because the web platform has been designed with the invariant that any 475 | script included on a page has the same access as any other script included 476 | on the same page, regardless of the origin of either script, the 477 | interfaces defined by this specification do not place any restrictions on 478 | recording or retrieval of recorded timing information - i.e. a user timing 479 | mark or measure recorded by any script included on the page can be read by 480 | any other script running on the same page, regardless of origin.
481 |Thanks to 485 | James Simonsen, 486 | Jason Weber, 487 | Nic Jansma, 488 | Philippe Le Hegaret, 489 | Karen Anderson, 490 | Steve Souders, 491 | Sigbjorn Vik, 492 | Todd Reifsteck, and 493 | Tony Gentilcore 494 | for their contributions to this work.
495 |