├── _config.yml ├── primary-use-case.png ├── tone-mapping-scenarios.png ├── color-space-conversions.png ├── w3c.json ├── README.md ├── hdr-in-png-requirements.md ├── charter.html ├── canvas_float.md ├── hdr_html_canvas_element.md └── index.html /_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-slate -------------------------------------------------------------------------------- /primary-use-case.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/w3c/ColorWeb-CG/HEAD/primary-use-case.png -------------------------------------------------------------------------------- /tone-mapping-scenarios.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/w3c/ColorWeb-CG/HEAD/tone-mapping-scenarios.png -------------------------------------------------------------------------------- /color-space-conversions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/w3c/ColorWeb-CG/HEAD/color-space-conversions.png -------------------------------------------------------------------------------- /w3c.json: -------------------------------------------------------------------------------- 1 | { 2 | "contacts": "svgeesus" 3 | , "repo-type": "cg-report" 4 | , "group": 96574 5 | } 6 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ColorWeb-CG 2 | 3 | Repository for the Color on the Web Community Group https://www.w3.org/community/colorweb/ 4 | 5 | # Current deliverables 6 | 7 | * [draft CG report](./index.html) 8 | * [requirements for adding HDR support in PNG](./hdr-in-png-requirements.md) 9 | * [strawman API for adding HDR support to Canvas/WebGPU/WebGL](./hdr_html_canvas_element.md) 10 | * [explainer for adding floating point support to HTML canvas](./canvas_float.md) 11 | 12 | # Issue tracking 13 | 14 | Please file issues directly on [GitHub](https://github.com/w3c/ColorWeb-CG/issues). 15 | -------------------------------------------------------------------------------- /hdr-in-png-requirements.md: -------------------------------------------------------------------------------- 1 | # Adding support for HDR imagery to the PNG format 2 | Editors: Chris Blume, Pierre-Anthony Lemieux, Chris Seeger, Leonard Rosenthol 3 | 4 | Status: Draft 5 | 6 | ## Problem to be solved 7 | The gAMA chunk of the Portable Network Graphics (PNG) format (specified in [PNG]) parameterized the transfer function of the image as a power law. As such, it cannot model the Reference PQ or HLG OOTFs specified in [BT2100], which are commonly used for HDR images. 8 | 9 | An existing W3C group note, [BT2100-in-PNG] specifies an approach which is limited: it supports signaling of only the ITU BT.2100 PQ EOTF and uses magic values in the iCCP chunk to signal color spaces. 10 | 11 | ## Basic requirements 12 | * does not break current implementations 13 | * extensible signaling of color space based on H.273 14 | * does not require the presence of iCCP chunk and embedded ICC profiles 15 | 16 | ## Non Requirements 17 | * bit-identical serialization of the H.273 color space parameters as used by other raster image formats (eg. JPEG, AVIF) 18 | 19 | ## Strawman approach 20 | 21 | Two new PNG chunks are proposed, the `cICP` chunk and the `iCCN` chunk. 
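For concreteness, here is a minimal sketch (JavaScript, not part of the strawman itself) of the four-byte payload an encoder might place in the proposed `cICP` chunk for a BT.2100 PQ image, assuming the field layout described in the next section. The H.273 code points shown (9, 16, 0, 1) are the usual values for BT.2100 primaries, the PQ transfer function, identity (RGB) matrix coefficients and full-range encoding.

```javascript
// Hypothetical helper: serialize the proposed cICP payload (4 bytes).
function makeCicpPayload({colourPrimaries, transferCharacteristics,
                          matrixCoefficients, videoFullRangeFlag}) {
  return Uint8Array.of(colourPrimaries, transferCharacteristics,
                       matrixCoefficients, videoFullRangeFlag);
}

const bt2100PqPayload = makeCicpPayload({
  colourPrimaries: 9,          // BT.2020/BT.2100 primaries
  transferCharacteristics: 16, // BT.2100 PQ
  matrixCoefficients: 0,       // identity (RGB, as stored in PNG)
  videoFullRangeFlag: 1,       // full range
});
```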
22 | 23 | The `cICP` chunk acts as a color space label (much like the existing `sRGB` chunk), specifying the color space to which the pixels within the PNG file conforms. The `iCCN` chunk (much like the existing `iCCP`) contains an embedded color profile. 24 | 25 | A PNG may contain multiple chunks with color space information. A PNG viewer should use the highest priority color space chunk that it can honor, ignoring the others. The priorities are from highest to lowest: 26 | 27 | * `cICP` 28 | * `iCCN` 29 | * `sRGB` 30 | * `iCCP` 31 | * `gAMA` & `cHRM` 32 | 33 | If the PNG decoder knows the display / surface cannot use the cICP code point, it should ignore the cICP chunk and continue down the priority list. 34 | 35 | ### cICP chunk 36 | 37 | This chunk SHALL come before the `IDAT` chunk. 38 | 39 | [H.273](https://www.itu.int/rec/T-REC-H.273/en) specifies a controlled vocabulary for the parameterization of color space information. 40 | 41 | Define a `cICP` chunk that contains the 4 bytes necessary to carry the H.273 color space parameters: 42 | 43 | * **COLPRIMS**, 1 byte, One of the ColourPrimaries enumerated values specified in Rec. ITU-T H.273 | [ISO/IEC 23091-2] 44 | * **TRANSFC**, 1 byte, One of the TransferCharacteristics enumerated values specified in Rec. ITU-T H.273 | [ISO/IEC 23091-2] 45 | * **MATCOEFFS**, 1 byte, One of the MatrixCoefficients enumerated values specified in Rec. ITU-T H.273 | [ISO/IEC 23091-2] 46 | * **VIDFRNG**, 1 byte, Value of the VideoFullRangeFlag specified in Rec. ITU-T H.273 | [ISO/IEC 23091-2] 47 | 48 | NOTE: While these are inspired from recent JPEG standards (eg. JPEG-XL) that incorporate these color space parameters, this specification follows H.273. 49 | 50 | NOTE: [ITU-T Series H Supplement 19](https://www.itu.int/rec/T-REC-H.Sup19-201910-I) summarize combinations of H.273 parameters corresponding to common baseband linear broadcasts and file-based Video-on-Demand(VOD) services. 51 | 52 | ### iCCN chunk Embedded ICC profile (updated) 53 | 54 | This chunk SHALL come before the `IDAT` chunk. 55 | 56 | #### Structure 57 | 58 | The `iCCN` chunk contains: 59 | 60 | |Name |Size | 61 | |-------------------|-----------------------------| 62 | |Profile name |1-79 bytes (character string)| 63 | |Null separator |1 byte (null character) | 64 | |Compression method |1 byte | 65 | |Compressed profile |n bytes | 66 | 67 | The profile name may be any convenient name for referring to the profile. It is case-sensitive. Profile names shall be encoded as UTF-8. Leading, trailing, and consecutive spaces are not permitted. The profile name shall not contain a zero byte (null character). 68 | 69 | The compression method shall be method 0 (zlib datastream with deflate compression). The compression method entry is followed by a compressed datastream of an ICC profile as defined in [ICC] or [ICC-2010]. 70 | 71 | The ICC profile shall either be an output profile (Device Class = `prtr`) or a monitor profile (Device Class = `mntr`). Decompression of this datastream yields the embedded ICC profile. 72 | 73 | NOTE: This is exactly the same as `iCCP` except: 74 | 75 | * `iCCP` is ICCv2 (although many decoders treat it as ICCv4) while `iCCN` is explicitly ICCv4 76 | * the profile name is UTF-8 instead of Latin-1. Analogous to `tEXt` vs. `iTXt` 77 | 78 | #### Encoder 79 | 80 | If the `iCCN` chunk is present, the image samples conform to the colour space represented by the embedded ICC profile. 
The colour space of the ICC profile shall be an RGB (`RGB `) colour space for colour images (PNG colour types 2, 3, and 6), or a greyscale (`GRAY`) colour space for greyscale images (PNG colour types 0 and 4). A PNG encoder that writes the `iCCN` chunk is encouraged to also write `gAMA` and `cHRM` chunks that approximate the ICC profile, to provide compatibility with applications that do not use the iCCN chunk. 81 | 82 | ### Decoder 83 | 84 | The `iCCN` chunk should be interpreted according to [ICC] or [ICC-2010] as appropriate. 85 | 86 | ## References 87 | 88 | [ICC] ISO 15076-1:2010, Image technology colour management – Architecture, profile format and data structure — Part 1: Based on ICC.1:2010 89 | 90 | [ICC-2010] Specification ICC.1:2010-12 (Profile version 4.3.0.0) Image technology colour management - Architecture, profile format, and data structure 91 | 92 | [ITU-T Series H Supplement 19](https://www.itu.int/rec/T-REC-H.Sup19-201910-I). Series H: Audiovisual and multimedia systems - Usage of video signal type code points 93 | 94 | [BT2100] 95 | [Recommendation ITU-R BT.2100](https://www.itu.int/rec/R-REC-BT.2100), Image parameter values for high dynamic range television for use in production and international programme exchange 96 | 97 | [ITU-T H.273] 98 | [Technical Document ITU-T H.273](https://www.itu.int/rec/T-REC-H.273/en), Color Independent Coding Points for Images 99 | 100 | [PNG] 101 | [Portable Network Graphics (PNG) Specification (Second Edition)](https://www.w3.org/TR/PNG/). Tom Lane. W3C. 10 November 2003. W3C Recommendation. URL: https://www.w3.org/TR/PNG 102 | 103 | [BT2100-in-PNG] 104 | [Using the ITU BT.2100 PQ EOTF with the PNG format](https://www.w3.org/TR/png-hdr-pq/) 105 | 106 | [ISO/IEC 23091-2] 107 | [ISO/IEC 23091-2](https://www.iso.org/standard/81546.html) 108 | -------------------------------------------------------------------------------- /charter.html: -------------------------------------------------------------------------------- 1 | 2 | 3 |
4 |43 | The goal is to allow color experts from various fields - including 44 | print-oriented color management, display, media and entertainment 45 | community, CSS , image and video coding and graphics experts - to come 46 | together, share ideas, use cases, experiences and discuss technical 47 | solutions to improve the state of color on the Web. 48 |
49 |The work addresses the representation and processing of color 53 | information as applied to the Web.
54 |55 | Particular areas of interest include, but are not limited to: 56 |
87 | The group MAY produce test suites to support the Specifications. Please 88 | see the GitHub LICENSE file for test suite contribution licensing 89 | information. 90 |
91 |95 | The group expects to liaise with the following groups: 96 |
111 | The group operates under the Community and Business 113 | Group Process. Terms in this Charter that conflict with those of the 114 | Community and Business Group Process are void. 115 |
116 |117 | As with other Community Groups, W3C seeks organizational licensing 118 | commitments under the W3C Community 120 | Contributor License Agreement (CLA). When people request to 121 | participate without representing their organization's legal interests, 122 | W3C will in general approve those requests for this group with the 123 | following understanding: W3C will seek and expect an organizational 124 | commitment under the CLA starting with the individual's first request to 125 | make a contribution to a group Deliverable. 126 | The section on Contribution Mechanics describes 127 | how W3C expects to monitor these contribution requests. 128 |
129 | 130 |131 | The W3C Code of 132 | Ethics and Professional Conduct applies to participation in 133 | this group. 134 |
135 | 136 |140 | The group will not publish Specifications on topics other than those 141 | listed under Specifications above. See 142 | below for how to modify the charter. 143 |
144 |148 | Substantive Contributions to Specifications can only be made by Community 149 | Group Participants who have agreed to the W3C Community 151 | Contributor License Agreement (CLA). 152 |
153 |154 | Specifications created in the Community Group must use the 156 | W3C Software and Document License. All other documents produced by 157 | the group should use that License where possible. 158 |
159 |160 | Community Group participants agree to make all contributions in the 161 | GitHub repo the group is using for the particular document. This may be 162 | in the form of a pull request (preferred), by raising an issue, or by 163 | adding a comment to an existing issue. 164 |
165 |166 | All Github repositories attached to the Community Group must contain a 167 | copy of the CONTRIBUTING 169 | and LICENSE 171 | files. 172 |
173 |177 | The group will conduct all of its technical work in public. If the group 178 | uses GitHub, all technical work will occur in its GitHub repositories 179 | (and not in mailing list discussions). This is to ensure contributions 180 | can be tracked through a software tool. 181 |
182 |183 | Meetings may be restricted to Community Group participants, but a public 184 | summary or minutes must be posted to the group's public mailing list, or 185 | to a GitHub issue if the group uses GitHub. 186 |
187 |191 | This group will seek to make decisions where there is consensus. Groups 192 | are free to decide how to make decisions (e.g. Participants who have 193 | earned Committer status for a history of useful contributions assess 194 | consensus, or the Chair assesses consensus, or where consensus isn't 195 | clear there is a Call for Consensus [CfC] to allow multi-day online 196 | feedback for a proposed course of action). It is expected that 197 | participants can earn Committer status through a history of valuable 198 | contributions as is common in open source projects. After discussion and 199 | due consideration of different opinions, a decision should be publicly 200 | recorded (where GitHub is used as the resolution of an Issue). 201 |
202 |203 | If substantial disagreement remains (e.g. the group is divided) and the 204 | group needs to decide an Issue in order to continue to make progress, the 205 | Committers will choose an alternative that had substantial support (with 206 | a vote of Committers if necessary). Individuals who disagree with the 207 | choice are strongly encouraged to take ownership of their objection by 208 | taking ownership of an alternative fork. This is explicitly allowed (and 209 | preferred to blocking progress) with a goal of letting implementation 210 | experience inform which spec is ultimately chosen by the group to move 211 | ahead with. 212 |
213 |214 | Any decisions reached at any meeting are tentative and should be recorded 215 | in a GitHub Issue for groups that use GitHub and otherwise on the group's 216 | public mail list. Any group participant may object to a decision reached 217 | at an online or in-person meeting within 7 days of publication of the 218 | decision provided that they include clear technical reasons for their 219 | objection. The Chairs will facilitate discussion to try to resolve the 220 | objection according to the decision process. 221 |
222 |223 | It is the Chairs' responsibility to ensure that the decision process is 224 | fair, respects the consensus of the CG, and does not unreasonably favour 225 | or discriminate against any group participant or their employer. 226 |
227 |231 | Participants in this group choose their Chair(s) and can replace their 232 | Chair(s) at any time using whatever means they prefer. However, if 5 233 | participants, no two from the same organisation, call for an election, 234 | the group must use the following process to replace any current Chair(s) 235 | with a new Chair, consulting the Community Development Lead on election 236 | operations (e.g., voting infrastructure and using RFC 2777). 238 |
239 |254 | Participants dissatisfied with the outcome of an election may ask the 255 | Community Development Lead to intervene. The Community Development Lead, 256 | after evaluating the election, may take any action including no action. 257 |
258 |262 | The group can decide to work on a proposed amended charter, editing the 263 | text using the Decision Process described above. 264 | The decision on whether to adopt the amended charter is made by 265 | conducting a 30-day vote on the proposed new charter. The new charter, if 266 | approved, takes effect on either the proposed date in the charter itself, 267 | or 7 days after the result of the election is announced, whichever is 268 | later. A new charter must receive 2/3 of the votes cast in the approval 269 | vote to pass. The group may make simple corrections to the charter such 270 | as deliverable dates by the simpler group decision process rather than 271 | this charter amendment process. The group will use the amendment process 272 | for any substantive changes to the goals, scope, deliverables, decision 273 | process or rules for amending the charter. 274 |
275 | 276 | 277 | -------------------------------------------------------------------------------- /canvas_float.md: -------------------------------------------------------------------------------- 1 | # Canvas Floating Point Color Values 2 | 3 | ## Introduction 4 | 5 | This proposal introduces the ability to use floating-point pixel formats in `CanvasRenderingContext2D`, `OffscreenCanvasRenderingContext2D`, and `ImageData`. 6 | 7 | ## Background 8 | 9 | ## Current capabilities 10 | 11 | Both `CanvasRenderingContext2D` and `OffscreenCanvasRenderingContext2D` contain an output bitmap that they render to. 12 | The pixel format of this output bitmap is currently unspecified. 13 | Many implementations use an 8-bit-per-channel RGB or RGBA pixel format for this bitmap (this is likely the case for all implementations, but the author has not examined all implementations). 14 | 15 | An `ImageData` has a `data` member, which is a `Uint8ClampedArray`, making its format 8 bits per channel RGBA. 16 | 17 | ## Use Cases and Motivation 18 | 19 | High dynamic range and wide color gamut content often require more than 8 bits per channel to avoid banding artifacts. 20 | 21 | Medical applications (e.g., radiography) demand more than 8 bits per channel of resolution. 22 | 23 | Modern high-end displays are capable of displaying more than 8 bits per channel. 24 | 25 | ## Proposed changes 26 | 27 | ### Changes to `CanvasRenderingContext2DSettings` 28 | 29 | Create the new enum `CanvasColorType` to specify the type for the color channels of the output bitmap of a `CanvasRenderingContext2D` and `OffscreenCanvasRenderingContext2D`. 30 | 31 | ```idl 32 | enum CanvasColorType { 33 | "unorm8", 34 | "float16", 35 | }; 36 | ``` 37 | 38 | Add to `CanvasRenderingContext2DSettings` a `CanvasColorType` member, to specify this color type. 39 | 40 | ```idl 41 | partial dictionary CanvasRenderingContext2DSettings { 42 | CanvasColorType colorType = "unorm8"; 43 | }; 44 | ``` 45 | 46 | The value specified in `colorType` will determine the type for the color channels of the output bitmap. 47 | 48 | When a canvas has a color type of `"float16"`, color values may be outside of the range `[0, 1]`. 49 | Such values may be used to specify colors outside of the gamut determined by the canvas' color space's primaries. 50 | 51 | When rendering `"float16"` color values to an output device, color values will be converted to the output device's color space using relative colorimetric intent. 52 | Color values that specify a brightness outside of the standard dynamic range will have their brightness limited to the standard dynamic range of the output device, unless the canvas is explicitly indicated to be high dynamic range (and this proposal intentionally does not include this mechanism). 53 | 54 | ### Changes to `ImageData` 55 | 56 | Create a new enum `ImageDataColorType` to specify the type for the color channels of an `ImageData`. 57 | 58 | ```idl 59 | enum ImageDataColorType { 60 | "unorm8", 61 | "float32", 62 | }; 63 | ``` 64 | 65 | Add to `ImageDataSettings` an `ImageDataColorType` member, to specify this color type. 66 | 67 | ```idl 68 | partial dictionary ImageDataSettings { 69 | ImageDataColorType colorType; 70 | }; 71 | ``` 72 | 73 | Change `ImageData` to allow the `data` member to be either `Uint8ClampedArray` or `Float32Array` via a union type `ImageDataArray`.
74 | 75 | ```idl 76 | typedef (Uint8ClampedArray or Float32Array) ImageDataArray; 77 | 78 | partial interface ImageData { 79 | readonly attribute ImageDataColorType colorType; 80 | readonly attribute ImageDataArray data; 81 | } 82 | ``` 83 | 84 | In the constructor for `ImageData`, if an `ImageDataSettings` is specified, and that `ImageDataSettings` has a `colorType`, then the created `ImageData` will have the specified value for its `colorType`. 85 | If no `ImageDataSettings` is specified, or no `colorType` is specified, then the created `ImageData` will have `colorType` `"unorm8"`. 86 | 87 | The type of an `ImageData`'s `data` member is determined by its `colorType` member as follows: 88 | 89 | * If `colorType` is `"unorm8"` then `data` is of type `Uint8ClampedArray`. 90 | * If `colorType` is `"float32"` then `data` is of type `Float32Array`. 91 | 92 | ### Changes to `CanvasImageData` interface 93 | 94 | In the interface `CanvasImageData`, the functions `createImageData` and `getImageData` will create an `ImageData`, and also optionally take an `ImageDataSettings`. 95 | 96 | If the `ImageDataSettings` specifies `colorType`, then the resulting `ImageData` will have that specified `colorType`. 97 | 98 | If the `ImageDataSettings` does not specify `colorType`, then the resulting `ImageData`'s `colorType` will be `"unorm8"`. 99 | 100 | ### Values outside of the `[0, 1]` interval 101 | 102 | All color spaces supported by `PredefinedColorSpace` are defined for all real numbers. 103 | 104 | Both `"srgb"` and `"display-p3"` are extended using point symmetry around `0`. 105 | For the precise definition, see [the CSS definition of `"srgb"`](https://www.w3.org/TR/css-color-4/#predefined-sRGB). 106 | 107 | ### Type precision 108 | 109 | The exact storage format of the output bitmap of a `CanvasRenderingContext2D` and `OffscreenCanvasRenderingContext2D` is not directly observable. 110 | There exists no API through which the raw storage may be examined. 111 | 112 | If the color type of the output bitmap is `"unorm8"`, then the precision must be at least 8 bits per color channel. 113 | Stored values less than 0 or greater than 255 must be clamped to 0 and 255 respectively. 114 | 115 | If the color type of the output bitmap is `"float16"`, then the precision must be at least IEEE 754-2008 16-bit floating-point. 116 | 117 | ### Type conversions 118 | 119 | When reading an output bitmap of color type `"unorm8"` to a `Float32Array`, the resulting values will be normalized from the range `0` through `255` to the range `0.0` through `1.0`. 120 | 121 | When converting an output bitmap of color type `"float16"` to a `Uint8ClampedArray`, values less than `0.0` will be clamped to `0`, values greater than `1.0` will be clamped to `255`, and values in the range `0.0` through `1.0` will be scaled by `255` and rounded to the nearest integer value. 122 | 123 | When exporting an output bitmap of color type `"float16"` to an data URL or a `Blob` in a format that does not support floating point storage, values outside of the range `0.0` through `1.0` will be clamped. 124 | 125 | ## Related specifications 126 | 127 | ### WebGPU 128 | 129 | In WebGPU, floating-point canvas color types are already available. 130 | They may be specified in `GPUCanvasConfiguration` by indicating a `format` of `rgba16float`. 131 | 132 | ### WebGL 133 | 134 | In WebGL, floating-point canvas color types are proposed via [drawingBufferStorage](https://github.com/KhronosGroup/WebGL/pull/3222). 
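For comparison with the `"float16"` color type proposed here, the following sketch shows how a floating-point backing store is requested in WebGPU today, and the shape of the linked WebGL proposal (the WebGL call is the proposal's, not a settled API).

```javascript
// WebGPU: a floating-point canvas format is already expressible.
async function configureFloat16WebGPU(canvas) {
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();
  const context = canvas.getContext('webgpu');
  context.configure({device, format: 'rgba16float'});
  return context;
}

// WebGL: sketch of the proposed drawingBufferStorage entry point
// (names and constants may change as the proposal evolves).
function configureFloat16WebGL(canvas) {
  const gl = canvas.getContext('webgl2');
  gl.drawingBufferStorage(gl.RGBA16F, canvas.width, canvas.height);
  return gl;
}
```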
135 | 136 | ### Canvas High Dynamic Range 137 | 138 | This functionality is a prerequisite for the [Canvas High Dynamic Range proposal](https://github.com/w3c/ColorWeb-CG/blob/master/hdr_html_canvas_element.md). 139 | 140 | This functionality was separated off from the Canvas High Dynamic Range proposal. 141 | 142 | ## Examples 143 | 144 | ### Writing the same color in different ways 145 | 146 | The following example draws the color `color(display-p3 1 0 0)` in multiple ways. 147 | 148 | ```javascript 149 | let canvas = document.getElementById('MyCanvas'); 150 | 151 | let context = canvas.getContext('2d', {colorSpace:"display-p3"}); 152 | console.log(context.getContextAttributes().colorSpace); // Prints "display-p3" 153 | console.log(context.getContextAttributes().colorType); // Prints "unorm8" 154 | 155 | // Draw using CSS colors. 156 | context.fillStyle = "color(display-p3 1 0 0)"; 157 | context.fillRect(0, 0, 1, 1); 158 | 159 | // Draw using a floating-point sRGB ImageData, using values outside of the 160 | // [0, 1] interval. 161 | let imageDataSRGB = new ImageData(1, 1, {colorType:"float32"}); 162 | console.log(imageDataSRGB.colorSpace); // prints "srgb" 163 | console.log(imageDataSRGB.colorType); // prints "float32" 164 | imageDataSRGB.data[0] = 1.0930663624351615; 165 | imageDataSRGB.data[1] = -0.22674197356975417; 166 | imageDataSRGB.data[2] = -0.15013458093711934; 167 | imageDataSRGB.data[3] = 1.0; 168 | context.putImageData(imageDataSRGB, 1, 0); 169 | 170 | // Draw using a floating-point Display P3 ImageData. 171 | let imageDataDisplayP3 = new ImageData(1, 1, {colorSpace:"display-p3", colorType:"float32"}); 172 | console.log(imageDataDisplayP3.colorSpace); // prints "display-p3" 173 | console.log(imageDataDisplayP3.colorType); // prints "float32" 174 | imageDataDisplayP3.data[0] = 1.0; 175 | imageDataDisplayP3.data[1] = 0.0; 176 | imageDataDisplayP3.data[2] = 0.0; 177 | imageDataDisplayP3.data[3] = 1.0; 178 | context.putImageData(imageDataDisplayP3, 2, 0); 179 | 180 | // Read back the result. 181 | let imageDataReadback = context.getImageData(0, 0, 3, 1); 182 | console.log(imageDataReadback.colorSpace); // prints "display-p3" 183 | console.log(imageDataReadback.colorType); // prints "unorm8" 184 | console.log(imageDataReadback.data); // prints [255, 0, 0, 255 ...] three times. 185 | ``` 186 | 187 | ### Reading back contents 188 | 189 | The following example draws the color `color(display-p3 1 0 0)` and reads it back. 190 | 191 | ```javascript 192 | let canvas = document.getElementById('MyCanvas'); 193 | 194 | let context = canvas.getContext('2d', {colorSpace:"srgb", colorType:"float16"}); 195 | console.log(context.getContextAttributes().colorSpace); // Prints "srgb" 196 | console.log(context.getContextAttributes().colorType); // Prints "float16" 197 | 198 | // Draw using CSS colors. 199 | context.fillStyle = "color(display-p3 1 0 0)"; 200 | context.fillRect(0, 0, 1, 1); 201 | 202 | // Read back with default parameters. Note that the resulting values are 203 | // outside of the [0, 1] interval. 204 | let imageDataDefault = context.getImageData(0, 0, 1, 1); 205 | console.log(imageDataDefault.colorSpace); // prints "srgb" 206 | console.log(imageDataDefault.colorType); // prints "float32" 207 | console.log(imageDataDefault.data); // prints [1.0931, -0.2267, -0.1501, 1.0] 208 | 209 | // Read back forcing colorType to "unorm8". Results are clamped.
210 | let imageDataUnorm8 = context.getImageData(0, 0, 1, 1, {colorType:"unorm8"}); 211 | console.log(imageDataUnorm8.colorSpace); // prints "srgb" 212 | console.log(imageDataUnorm8.colorType); // prints "unorm8" 213 | console.log(imageDataUnorm8.data); // prints [255, 0, 0, 255] 214 | 215 | // Read back forcing colorSpace to "display-p3". The result has colorType 216 | // "float32" by default. 217 | let imageDataDisplayP3 = context.getImageData(0, 0, 1, 1, {colorSpace:"display-p3"}); 218 | console.log(imageDataDisplayP3.colorSpace); // prints "display-p3" 219 | console.log(imageDataDisplayP3.colorType); // prints "float32" 220 | console.log(imageDataDisplayP3.data); // prints [1.0, 0.0, 0.0, 1.0] 221 | ``` 222 | 223 | ### Determining ImageData type 224 | 225 | Consider the function `setToGradient` that initializes an `ImageData` to a horizontal gradient. 226 | This shows how to handle different `ImageData` types. 227 | 228 | ```javascript 229 | function setToGradient(imageData) { 230 | for (var x = 0; x < imageData.width; ++x) { 231 | for (var y = 0; y < imageData.height; ++y) { 232 | let offset = 4 * (y * imageData.width + x); 233 | switch (imageData.colorType) { 234 | case "unorm8": 235 | imageData.data[offset+0] = 236 | imageData.data[offset+1] = 237 | imageData.data[offset+2] = 255 * x / (imageData.width - 1); 238 | imageData.data[offset+3] = 255; 239 | break; 240 | case "float32": 241 | imageData.data[offset+0] = 242 | imageData.data[offset+1] = 243 | imageData.data[offset+2] = x / (imageData.width - 1); 244 | imageData.data[offset+3] = 1.0; 245 | break; 246 | default: 247 | throw new Error("Unexpected ImageData colorType " + imageData.colorType); 248 | } 249 | } 250 | } 251 | } 252 | ``` 253 | 254 | ## Remarks on choices 255 | 256 | ### Color type versus pixel format 257 | 258 | This proposal uses the term "color type" instead of "pixel format" intentionally to indicate just the precision of the color channels of the format, and not their layout. 259 | For example, a user agent may choose BGRA or RGBA as the representation of an 8 bit per channel canvas, and this implementation detail is hidden. 260 | 261 | ### Choice of `"float16"` for `CanvasColorType` 262 | 263 | The ability to texture from and render to 16 bit floating-point is universal among modern GPUs. 264 | 265 | ### Choice of `Float32Array` for `ImageData` 266 | 267 | There does not exist a `Float16Array` type, and so it is not an option. 268 | The only other floating-point type, `Float64Array`, is of unnecessarily high precision. 269 | 270 | ### Pitfall of `Float32Array` for `ImageData` 271 | 272 | There exists a potential pitfall wherein a naive user of `ImageData` may write values intended for a `Uint8ClampedArray` to a `Float32Array`; 273 | e.g., one would write `255` instead of `1.0`. 274 | 275 | This can be avoided by examining the `colorType` member of the `ImageData`. 276 | 277 | ### Choice of defaulting `getImageData` to `"unorm8"` 278 | 279 | Historically, all calls to `getImageData` have returned an `ImageData` with a `Uint8ClampedArray`. 280 | 281 | Changing the default type that is returned by this function will break any software that relies on 282 | that default type, which is currently all software that uses this function.
283 | 284 | Changing the default type that is returned by this function will significantly hinder the adoption 285 | of `"float16"` canvas, because no application can change the backing of any canvas without first 286 | ensuring that all libraries that it uses have been updated to support all possible return values. 287 | 288 | -------------------------------------------------------------------------------- /hdr_html_canvas_element.md: -------------------------------------------------------------------------------- 1 | # Adding support for High Dynamic Range (HDR) imagery to HTML Canvas: a baseline proposal (version 2) 2 | 3 | ## Introduction 4 | 5 | Today [HTML 6 | Canvas](https://html.spec.whatwg.org/multipage/canvas.html#imagedata) supports 7 | only 8 bits per color channel and two `PredefinedColorSpace` color spaces (`srgb` 8 | and `display-p3`). 9 | 10 | This is insufficient for High-Dynamic Range (HDR) imagery, which is in 11 | widespread use today: 12 | 13 | * As detailed, for example, at [Ultra HD Blu-ray Format Video Characteristics 14 | ](https://ieeexplore.ieee.org/document/7514362), 8-bit quantization (bit depth) 15 | results in contouring and banding, even for traditional standard dynamic range 16 | (SDR) imagery, like sRGB, which covers a typical luminance range between 0 and 17 | 100 cd/m2. These quantization artifacts become unacceptable with 18 | High-Dynamic Range (HDR) imagery, which supports luminance ranges from 0 19 | up to 10,000 cd/m2. 20 | 21 | * As specified at [Rec. ITU-R BT.2100](https://www.itu.int/rec/R-REC-BT.2100), 22 | two color spaces tailored for HDR imagery have been developed: BT.2100 PQ and 23 | BT.2100 HLG. HDR movies and TV shows are distributed using either of these two 24 | color spaces. Products with HDMI and/or DisplayPort interfaces also use these 25 | color spaces for HDR support. 26 | 27 | * To reproduce an HDR image on a display, it is useful to have information on 28 | both the color volume of that display and the color volume of the image since 29 | one can be significantly smaller than the other. 30 | 31 | Accordingly, the following API modifications are needed to manipulate HDR images 32 | in HTML Canvas: 33 | 34 | 1. add BT.2100 color spaces to `PredefinedColorSpace` 35 | 2. add higher bit depth capabilities to `CanvasRenderingContext2DSettings` 36 | 3. add higher bit depth capabilities to `ImageDataSettings` 37 | 4. add image color volume information to `ImageDataSettings` and 38 | `CanvasRenderingContext2DSettings` 39 | 5. add display color volume information to the `Screen` interface of the CSS 40 | Object Model, to determine the characteristics of the display on which the 41 | image is being reproduced 42 | 43 | ## Target use cases 44 | 45 | As illustrated below, the primary use case is the drawing of HDR images into an 46 | HTML Canvas element such that the images are displayed as they would have been 47 | if they had been introduced in an `img` or `video` element.
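In code terms, and using the strawman names introduced later in this document (`"rec2100-hlg"` and `dataType` are proposals, not shipped APIs), the use case illustrated below might look like this:

```javascript
// Sketch of the primary use case: draw an HDR video frame into an HDR canvas
// so that it looks the same as it would in the <video> element itself.
const canvas = document.getElementById('hdr-canvas');
const context = canvas.getContext('2d', {
  colorSpace: 'rec2100-hlg',  // proposed HDR color space
  dataType: 'float16',        // proposed higher-bit-depth backing store
});
const video = document.querySelector('video');
context.drawImage(video, 0, 0, canvas.width, canvas.height);
```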
48 | 49 |  50 | 51 | Example applications include: 52 | 53 | * drawing images retrieved from a file whose format is not supported by the 54 | `img` or `video` elements 55 | * collage of images, both HDR and SDR 56 | * adding graphics to an HDR image 57 | * drawing of visual elements that are related to an HDR presentation in a 58 | `video` element, with the expectation that the former match the look of the 59 | latter 60 | * authoring HDR images 61 | 62 | ## Scope 63 | 64 | We propose a minimal extension to the Web Platform to allow the HTML Canvas API 65 | to manipulate High Dynamic Range (HDR) images expressed using the widespread 66 | BT.2100 PQ and BT.2100 HLG color spaces. 67 | 68 | This proposal: 69 | 70 | * does not preclude adding other HDR capabilities to HTML Canvas, such as 71 | support for additional color spaces like a linear high-dynamic range color 72 | space. 73 | * neither specifies nor requires new capabilities for compositing `canvas` 74 | elements with other HTML elements. 75 | 76 | ## Add color spaces intended for use with HDR images 77 | 78 | ### General 79 | 80 | Extend [`PredefinedColorSpace`](https://html.spec.whatwg.org/multipage/canvas.html#predefinedcolorspace) to 81 | include the following HDR color spaces. 82 | 83 | ```idl 84 | partial enum PredefinedColorSpace { 85 | /* 86 | * Currently defined values: "srgb", "display-p3" 87 | */ 88 | 89 | "rec2100-hlg", 90 | "rec2100-pq", 91 | "rec2100-display-linear", 92 | } 93 | ``` 94 | 95 | Extending `PredefinedColorSpace` automatically extends 96 | `CanvasRenderingContext2DSettings` and `ImageDataSettings`. 97 | 98 | A Canvas that is initialized with an HDR color space is an HDR Canvas; otherwise 99 | it is an SDR Canvas. 100 | 101 | When drawing an image into a Canvas, the image will be transformed unless the 102 | color spaces of the image and the Canvas match. Annex A specifies transformation 103 | to and from `rec2100-pq` and `rec2100-hlg`. 104 | 105 | At the time of this writing, [the current draft of CSS Color HDR Module Level 1](https://drafts.csswg.org/css-color-hdr/) 106 | provides detailed definitions of `rec2100-hlg`, `rec2100-pq` and 107 | `rec2100-display-linear` (called `rec2100-linear` in that document). 108 | 109 | ### rec2100-hlg 110 | 111 | The non-linear component signals {R', G', B'} are mapped to red, green and blue 112 | tristimulus values according to the Hybrid Log-Gamma (HLG) system specified in 113 | Rec. ITU-R BT.2100. 114 | 115 | ### rec2100-pq 116 | 117 | The non-linear component signals {R', G', B'} are mapped to red, green and blue 118 | tristimulus values according to the Perceptual Quantizer (PQ) system 119 | specified in Rec. ITU-R BT.2100. 120 | 121 | _NOTE: {R', G', B'} are in the range [0, 1], i.e. they are not expressed in 122 | cd/m2_ 123 | 124 | ### rec2100-display-linear 125 | 126 | The linear display-referred component signals {R, G, B} are mapped to red, green 127 | and blue tristimulus values such that R = G = B = 1.0 represents HDR reference 128 | white with a nominal luminance of 203 cd/m². 129 | 130 | ## Extend `CanvasRenderingContext2DSettings` to support higher bit depths 131 | 132 | Add to `CanvasRenderingContext2DSettings` a `CanvasDataType` member that 133 | specifies the representation of each pixel of the [output bitmap](https://html.spec.whatwg.org/multipage/canvas.html#output-bitmap) of a 134 | `CanvasRenderingContext2D` and `OffscreenCanvasRenderingContext2D`.
135 | 136 | ```idl 137 | partial dictionary CanvasRenderingContext2DSettings { 138 | /* 139 | * Currently defined attributes: 140 | * 141 | * boolean alpha = true; 142 | * boolean desynchronized = false; 143 | * PredefinedColorSpace colorSpace = "srgb"; 144 | * boolean willReadFrequently = false; 145 | */ 146 | 147 | CanvasDataType dataType = "unorm8"; 148 | }; 149 | ``` 150 | 151 | ```idl 152 | enum CanvasDataType { 153 | "unorm8", 154 | "float16", 155 | }; 156 | ``` 157 | 158 | When `dataType = "unorm8"`, the non-linear component signals {R', G', B'} are 159 | quantized using full range quantization, i.e. they are multiplied by 255 and 160 | rounded to the nearest integer. For example, R' = 0.5 is represented by the 161 | integer value 128. 162 | 163 | When `dataType = "float16"`, the non-linear component signals {R', G', B'} are 164 | not quantized, i.e. R' = 0.5 is represented by the floating point value 0.5. 165 | 166 | _NOTE: `dataType = "unorm8"` corresponds to HTML Canvas as it exists today and 167 | should not be used to represent HDR signal, as detailed in the introduction._ 168 | 169 | The `HTMLCanvasElement` methods `toDataURL()` and `toBlob()` should produce 170 | resources that preserve the color space of the underlying Canvas context, 171 | for example via CICP metadata or an ICC profile. 172 | 173 | ## Extend `ImageDataSettings` to support higher bit depths 174 | 175 | Add to 176 | [`ImageDataSettings`](https://html.spec.whatwg.org/multipage/canvas.html#imagedatasettings) 177 | an `ImageDataType` member that specifies the conversion semantics and type of 178 | each of the items of the `data` member array of `ImageData`. 179 | 180 | ```idl 181 | partial dictionary ImageDataSettings { 182 | /* 183 | * Currently defined attributes: 184 | * 185 | * PredefinedColorSpace colorSpace; 186 | */ 187 | 188 | ImageDataType dataType = "unorm8"; 189 | }; 190 | ``` 191 | 192 | ```idl 193 | enum ImageDataType { 194 | "unorm8", 195 | "float16", 196 | "float32" 197 | // and potentially others 198 | }; 199 | ``` 200 | 201 | The values `"unorm8"`, `"float16"` and `"float32"` result in `data` returning an 202 | array with the type `Uint8ClampedArray`, `Float16Array`, `Float32Array`, 203 | respectively. 204 | 205 | _NOTE: The strawman requires at least one of `"float16"` and `"float32"` to 206 | support HDR imagery. Both are listed above to emphasize that the strawman can 207 | work with both or either._ 208 | 209 | ## Add HDR rendering behavior and HDR metadata to `CanvasRenderingContext2DSettings` 210 | 211 | Add a new `CanvasColorMetadata` dictionary: 212 | 213 | ```idl 214 | dictionary CanvasColorMetadata { 215 | optional ColorVolumeInfo contentColorVolume; 216 | } 217 | ``` 218 | 219 | ```idl 220 | dictionary Chromaticities { 221 | // The color primaries and white point of a color volume, in CIE 1931 xy 222 | // coordinates. 223 | required double redPrimaryX; 224 | required double redPrimaryY; 225 | required double greenPrimaryX; 226 | required double greenPrimaryY; 227 | required double bluePrimaryX; 228 | required double bluePrimaryY; 229 | required double whitePointX; 230 | required double whitePointY; 231 | } 232 | ``` 233 | 234 | ```idl 235 | dictionary ColorVolumeInfo { 236 | optional Chromaticities chromaticity; 237 | optional double minimumLuminance; 238 | optional double maximumLuminance; 239 | } 240 | ``` 241 | 242 | Add a mechanism for specifying this on `CanvasRenderingContext2D` and 243 | `OffscreenCanvasRenderingContext2D`.
244 | 245 | ```idl 246 | partial interface CanvasRenderingContext2D/OffscreenCanvasRenderingContext2D { 247 | attribute CanvasColorMetadata colorMetadata; 248 | } 249 | ``` 250 | 251 | `contentColorVolume` specifies the nominal color volume occupied by 252 | the image content in the CIE 1931 XYZ color space. The boundaries of the color 253 | volume are defined by: 254 | 255 | * the xy coordinates, as defined in [ISO 256 | 11664-3](https://www.iso.org/standard/74165.html), of three color primaries: 257 | `redPrimaryX`, `redPrimaryY`, `greenPrimaryX`, `greenPrimaryY`, 258 | `bluePrimaryX`, and `bluePrimaryY`; 259 | * the xy coordinates of a white point: `whitePointX` and `whitePointY`; and 260 | * a minimum and maximum luminance in cd/m²: `minimumLuminance` and `maximumLuminance`. 261 | 262 | If omitted, `chromaticities` is equal to the chromaticity of the color space of 263 | the Canvas. 264 | 265 | If omitted, `minimumLuminance` is equal to 0 cd/m². 266 | 267 | If omitted, `maximumLuminance` is equal to 1,000 cd/m². 268 | 269 | The color volume is nominal because it MAY be smaller or larger than the actual 270 | color volume of image content, but SHOULD not be smaller. 271 | 272 | If present, `contentColorVolume` SHOULD completely define the color volume mapping 273 | algorithm used when rendering the image to a display. For example, the 274 | _rec2100-pq to srgb_ mapping specified in Annex A uses the `minimumLuminance` 275 | and `maximumLuminance` parameters. 276 | 277 | If `contentColorVolume` is not present, the color volume mapping algorithm is left 278 | entirely to the implementation. 279 | 280 | `contentColorVolume` SHOULD be set if known, e.g. if obtained from metadata 281 | contained in a source image, and omitted otherwise. This is particularly 282 | important when drawing a temporal sequence of images. If `contentColorVolume` 283 | is not set, the color volume mapping algorithm can vary over the sequence, resulting in 284 | temporal artifacts. 285 | 286 | For example, `contentColorVolume` can be set according to the Mastering Display 287 | Color Volume and Content Light Level Information chunks found in a PNG image: 288 | the color volume of the image content is typically smaller than, or coincides 289 | with, that of the mastering display. For the color primaries and white point of 290 | the color volume, the colour primaries and white point parameters of the 291 | Mastering Display Color Volume chunk can be used. For the `minimumLuminance` 292 | parameter, the minimum luminance parameter of the Mastering Display Color Volume 293 | chunk can be used. For the `maximumLuminance` parameter, the MaxCLL parameter of 294 | the Content Light Level Information chunk can provide more accurate information 295 | than the maximum luminance parameter of the Mastering Display Color Volume 296 | chunk. 297 | 298 | As specified below, the platform does not generally apply color volume mapping 299 | if the color volume of the image is smaller than that of the display. 
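As an illustration, here is a sketch of how an application might populate this metadata for PQ content whose mastering metadata is known (the attribute and dictionary names are the strawman's; the chromaticity values are the standard BT.2100 primaries and D65 white point; `context` is assumed to be an HDR canvas context):

```javascript
// Strawman usage sketch: propagate known mastering metadata so that the
// platform's color volume mapping stays stable across a sequence of frames.
context.colorMetadata = {
  contentColorVolume: {
    chromaticity: {
      redPrimaryX: 0.708,   redPrimaryY: 0.292,    // BT.2100 red
      greenPrimaryX: 0.170, greenPrimaryY: 0.797,  // BT.2100 green
      bluePrimaryX: 0.131,  bluePrimaryY: 0.046,   // BT.2100 blue
      whitePointX: 0.3127,  whitePointY: 0.3290,   // D65
    },
    minimumLuminance: 0.0001, // cd/m², e.g. from a Mastering Display Color Volume chunk
    maximumLuminance: 1000,   // cd/m², e.g. from a MaxCLL value
  },
};
```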
300 | 301 | ## Add display color volume information to the `Screen` interface defined in the CSSOM View Module 302 | 303 | Add, in the [CSSOM View Module](https://www.w3.org/TR/cssom-view-1/), a new 304 | `colorInfo` attribute to the `Screen` interface: 305 | 306 | ```idl 307 | partial interface Screen { 308 | optional ScreenColorInfo colorInfo; 309 | } 310 | ``` 311 | 312 | ```idl 313 | dictionary ScreenColorInfo { 314 | optional ColorVolumeInfo colorVolume; 315 | optional double referenceWhiteLuminance; 316 | } 317 | ``` 318 | 319 | If present, 320 | 321 | * `colorVolume` specifies, as defined above, the set of colors that the screen 322 | of the output device can reproduce without significant color volume mapping; and 323 | * `referenceWhiteLuminance` specifies the luminance of reference white as 324 | reproduced by the screen of the output device, and reflects viewing environment 325 | and user settings. 326 | 327 | When omitted, the value of a parameter is unspecified. It is preferable to omit 328 | parameters than to provide default or nominal values that are not known to be 329 | valid or accurate. 330 | 331 | `referenceWhiteLuminance` must be larger than or equal to `minimumLuminance`, 332 | and smaller than or equal to `maximumLuminance`. The ratio of `maximumLuminance` 333 | to `referenceWhiteLuminance` effectively specifies the headroom available for 334 | HDR imagery. 335 | 336 | _Reference white_ is also commonly referred to as diffuse white and graphics 337 | white. 338 | 339 | _EXAMPLE_: [Report ITU-R BT.2408](https://www.itu.int/pub/R-REP-BT.2408) 340 | specifies that the luminance of reference white for a PQ reference display, or a 341 | 1 000 cd/m² HLG display, is 203 cd/m². 342 | 343 | _EXAMPLE_: A PC monitor in a bright environment might report a 344 | `maximumLuminance` of 600 cd/m² and a `referenceWhiteLuminance` of 400 cd/m². 345 | 346 | `colorInfo` can, for example, be used in the following scenarios: 347 | 348 | * an authoring application can use the information to (i) avoid image colors 349 | exceeding the color volume of the output device and (ii) correspondingly set 350 | `contentColorVolume` in `CanvasColorMetadata` (see above). 351 | * a player application can use the information to map the colors of the image to 352 | the color volume of the output device if some of the former are outside the 353 | latter -- this allows the application to use its own mapping algorithm, 354 | substituting those provided by the underlying platform. 355 | 356 | In the absence of some or all of the parameters of `colorInfo`: 357 | 358 | * the [`dynamic-range`](https://drafts.csswg.org/mediaqueries-5/#dynamic-range) 359 | media query can be used to determine whether the output device supports 360 | high-dynamic range imagery; and 361 | * the 362 | [`color-gamut`](https://drafts.csswg.org/mediaqueries-5/#descdef-media-color-gamut) 363 | media query feature can be used to determine whether the output device supports 364 | wide color gamut imagery. 365 | 366 | ## Color Volume Mapping 367 | 368 | As illustrated by (b) below, when the color volume of the image is not a 369 | subset of the color volume of the display, the platform performs color volume 370 | mapping, i.e. modifies the colors of the image to make them fit within the 371 | capabilities of the display. 372 | 373 | Conversely and as illustrated by (a) below, the platform does not perform color 374 | volume mapping if the color volume of the image is a subset of the color volume 375 | of the display.
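For example, an application deciding whether it can show HDR highlights without triggering platform mapping might probe the strawman `screen.colorInfo`, falling back to the existing media query when the attribute is absent (sketch):

```javascript
// Returns the available HDR headroom (peak luminance relative to reference
// white), or undefined when only a boolean capability signal is available.
function availableHdrHeadroom() {
  const info = screen.colorInfo;
  if (info && info.colorVolume && info.colorVolume.maximumLuminance &&
      info.referenceWhiteLuminance) {
    // e.g. 600 cd/m² peak over 400 cd/m² reference white gives 1.5x headroom
    return info.colorVolume.maximumLuminance / info.referenceWhiteLuminance;
  }
  return window.matchMedia('(dynamic-range: high)').matches ? undefined : 1;
}
```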
376 | 377 | It is possible for an application to avoid color volume mapping by the platform 378 | by ensuring that the color volume of the image, as specified 379 | by`contentColorVolume`, is within `colorInfo`. This can be achieved, 380 | for example, by: 381 | 382 | * preventing in the first place an author from creating colors exceeding the 383 | display color volume. 384 | * the application performing its own color volume mapping such that the 385 | resulting image color volume is within the display color volume, as 386 | illustrated by (c) below. 387 | 388 |  389 | 390 | ## Annex A: Color space conversions 391 | 392 | ### Background 393 | 394 | In general, applications should avoid conversions between color spaces and 395 | maintain imagery in its original color space: conversions between color spaces 396 | are not necessarily reversible and do not necessarily result in the same image 397 | appearance. In particular, conversion of an HDR image to SDR will result in a 398 | significant loss of information and an SDR image that is different from the 399 | SDR image that would have been mastered from the same source material. From that 400 | perspective, converting from HDR to SDR imagery is similar to converting RGBA 401 | images to 16-color palette images. 402 | 403 | Nevertheless, the HTML specification allows color space conversion in several 404 | scenarios, e.g., when [drawing images to a 405 | canvas](https://html.spec.whatwg.org/multipage/canvas.html#colour-spaces-and-colour-correction), 406 | [retrieving image data from a 407 | canvas](https://html.spec.whatwg.org/multipage/canvas.html#dom-context-2d-getimagedata), 408 | among others being added). The conversions between predefined SDR color spaces 409 | are defined at113 | The Note is a gap analysis document. It identifies the next steps for enabling Wide Color Gamut (WCG) and High 114 | Dynamic Range (HDR) on the Open Web Platform. 115 |
116 |121 | The initial commercial application of Color Science (principally colorimetry, rather than color appearance 122 | modelling) to the reproduction of color was concerned with surface colors such as painted or printed objects, 123 | lit by a single illuminant. The achievable luminance range was thus constrained to the luminance of a perfect 124 | diffuse reflector (practically, an unprinted sheet of good paper) at the high end, and the densest printed or 125 | painted black that could be achieved without the paper tearing or deforming, at the darkest end. This produced 126 | luminance ranges from as low as 20:1 to as high as 90:1 (Fairchild, p.399 [[Fairchild-CAM]]). The achievable 127 | Chroma range was also limited by the vibrancy of mass-produced paints and inks, and the prohibitive cost of 128 | using additional spot colors to extend the gamut. 129 | 130 | Self-luminous displays, in that commercial environment, were primarily used at low luminances for soft proofing 131 | of an eventual printed product, and were intended to replicate the appearance of a viewing booth. 132 |
133 |134 | Over time, self-luminous displays were seen as a worthy color management target in their own right: for computer 135 | displays, for the display of digital photography, for digital cinema, and for the consumption of media such as 136 | television and movies in the home. The luminance range increased: media consumed in near-dark environments such 137 | as the cinema could have much deeper blacks, while media consumed in dim to normal viewing environments enjoyed 138 | an increased peak white luminance of 200 to 300 cd/m2. The darkest blacks in such environments were 139 | constrained by viewing flare, which is why the sRGB standard mandated a viewing flare of 5%, typical for glossy 140 | glass CRT displays of the time; coupled with a peak luminance of only 80cd/m2 the luminance range was 141 | still relatively modest. The introduction of matte LCD and OLED screens, and the commercial success of HDTV, 142 | increased the dynamic range somewhat, but it was still comfortable encoded in 8 bits per gamma-corrected 143 | component. This is referred to as Standard Dynamic Range (SDR). The achievable Chroma range for sRGB and 144 | Rec.709 HDTV, although extending beyond that of printed material at the primaries, was also relatively modest. 145 | Wider gamut displays were available professionally, covering most of the Adobe 1998 RGB™ color space, but 146 | were typically not found in a home or office environment. 147 | 148 |
149 |150 | Commercial availability of displays covering the Display P3 gamut (a derivative of DCI P3 used for digital 151 | cinema, but altered to account for a non-dark viewing environment, with an sRGB transfer curve, and D65 white 152 | point) meant that Wide Color Gamut (WCG) displays became commonplace on laptops, external monitors, 153 | mobile phones, and even watches. WCG requires a modest increase in the number of bits per gamma-corrected 154 | component, from 8 to 10 or 12 (for the widest gamut in common use for content delivery, Rec.2020). 155 |
156 |157 | The deployment of 4K and then 8K television, accompanied by digital content delivery, brought not only a 158 | similarly increased color gamut to the media space, but also a greatly increased dynamic range of 4000:1 or 159 | more. Increased phosphor efficiency and purity, together with bright and modulatable backlights, brought 160 | High Dynamic Range (HDR) into widespread use. Computers, however, remained limited to SDR for the most 161 | part. 162 |
163 |164 | The human visual system can, with appropriate adaptation, function over an enormous luminance range — from 165 | starlight or moonlight at 0.01 cd/m2 and 0.1 cd/m2, through the dim lighting at dawn and 166 | dusk (10 cd/m2), office lighting (100 cd/m2), modern displays (800 cd/m2), 167 | overcast daylight (1,000 cd/m2), bright daylight (10,000 cd/m2) and full direct sunlight 168 | (100,000 cd/m2). However, the full range cannot be simultaneously perceived in the same scene at the 169 | same time. 170 |
171 |
172 | There are two main systems defined for HDR video: Hybrid Log Gamma (HLG), developed by BBC and NHK, and Dolby
173 | Perceptual Quantizer (PQ). While improvement in video quality has driven the innovation of HDR, support for
174 | content on the web more generally, e.g., for static images, the <canvas> element, and in CSS
175 | in general, is still needed.
176 |
178 | Add a brief description of PQ and HLG approaches, including: use of metadata, absolute vs relative brightness, 179 | proprietary vs open standard, etc. 180 |
181 |182 | The BBC has published a frequently-asked questions document [[hdr-hlg-faq]] that gives a high level introduction 183 | to HDR, and the PQ and HLG solutions. 184 |
185 |186 | Fredrik Hubinette from Google has written a useful document [[hdr-chrome]] that discusses the issues with 187 | presenting, and in particular compositing, SDR and HDR content. This was presented at TPAC 2017. It considers 188 | both PQ and HLG. (minutes of the TPAC 2017 189 | meeting) 190 |
191 |192 | On the web, SDR, and HDR content using both HLG and PQ is expected to coexist, potentially within the same page, 193 | as authors can include arbitrary content in their web pages. An example is a page that contains multiple 194 | embedded videos, or their poster images. This raises the question of how to composite content with different 195 | color spaces and dynamic range encodings. 196 |
197 |201 | Support for HDR and WCG on the web is important to allow color and luminance matching between HDR video content 202 | and surrounding or overlaid graphic and textual content in web pages. 203 |
204 |Some specific goals include:
205 |214 | There are a number of specifications potentially impacted by HDR. One of the purposes of this document is to 215 | identify all documents that are possibly affected, so that we can review them and determine any changes needed. 216 |
217 |234 | CSS defines several ways of specifying the colors to use in web pages, in the CSS Color Module specifications. 235 | The current Recommendation is Level 3 [[css-color-3]], and its successor, Level 4, is becoming stable and 236 | starting to be implemented [[css-color-4]]. The various methods in Level 4 cover WCG but not HDR, and are: 237 |
238 |rgb() and rgba() functionshsl() and hsla() functionshwb() function
245 | lab() and lch() functions
249 | display-p3, a98-rgb (compatible with Adobe® RGB
255 | (1998)), prophoto-rgb,
256 | rec-2020
257 | @profile to link to an ICC profile. This is not limited to
260 | RGB:
261 | CMYK and wide-gamut print spaces
262 | are also covered.
263 |
268 | Items marked (*) are new in Level 4. rgb(), named colors, HSL and
269 | HWB all resolve to sRGB. The others are WCG. To date, all of these are SDR color spaces.
270 |
272 | Section 12 in [[css-color-4]] is titled "Working Color Space", and currently empty. It is intended to cover 273 | compositing multiple color spaces. There is as yet no consensus on whether this should be per-document or a 274 | per-element property; nor on whether a single value is appropriate for all operations. We need to define how 275 | should compositing work, if there is HLG, PQ, and SDR content present. The [[hdr-chrome]] discussion document 276 | provides useful input. This will require defining where black, media/paper white, and peak whites (full-screen 277 | and small-area highlights) map in the various spaces. 278 |
279 |280 | The draft CSS Color Module Level 4 [[css-color-4]], adds WCG support. Rec. 2020 is covered, but only for SDR. 281 | The range of CIE Lightness (in Lab and LCH) is not constrained to 100, and a figure of 400 is mentioned 282 | because it is known that Lab is a fair model for up to four times the luminance of diffuse white (Fairchild, 283 | pp.403-413 [[Fairchild-CAM]]), but no meaning is yet ascribed for values greater than 100. 284 |
285 |286 | Rec. 2100, with both HLG and Dolby PQ is added to CSS in an Unofficial Draft [[css-color-hdr]]. It also adds 287 | Jzazbz (which uses the PQ transfer characteristic), 288 | JzCzhz (the polar form), and Dolby ICtCp. Lastly, it adds a (normative) 289 | section "Compositing SDR and HDR content". 290 |
291 |292 | Other CSS specifications assume that all colors are sRGB. For example, CSS Images (which defines linear, 293 | radial, and conic gradients) assumes that all color stops are sRGB colors and mandates that interpolation 294 | happen as premultiplied, gamma-encoded, sRGB values. 295 |
296 |There are a few open issues that relate to HDR and WCG support:
299 |
The HTML <canvas> element provides web applications with a resolution-dependent bitmap canvas, which can be used for rendering graphs, game graphics, art, or other visual images on the fly. The <canvas> element supports either 2D or WebGL rendering.
The 2D API [[canvas-2d]] defines primitive operations for drawing graphics into a <canvas> element, such as drawing lines, shapes, and text.
The "Color managing canvas contents" proposal [[canvas-colorspace]] attempts to address the following use cases:

* Content displayed through a <canvas> element should be color managed in order to minimize differences in appearance across browsers and display devices. Improving color fidelity matters a lot for artistic uses (e.g., photo and paint apps) and for e-commerce (product presentation).
* Canvases should be able to take advantage of the full color gamut and dynamic range of the display device.
* Creative apps that do image manipulation generally prefer compositing, filtering and interpolation calculations to be performed in a linear color space.
The [[canvas-colorspace]] proposal does not currently make <canvas> HDR capable; it adds WCG support. See the Blink Intent to Ship, the TAG review, and this issue against the HTML standard.
The merging of a recent pull request indicates the canvas-colorspace proposal might move forward again. In particular, color spaces are now defined relative to CSS Color 4 predefined RGB spaces, instead of being assumed to be sRGB.
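A hedged sketch of what using the proposal might look like, assuming the colorSpace member of the 2D context-creation settings described in [[canvas-colorspace]]; engines that predate the proposal simply ignore the unknown option, and whether wide-gamut fill values are preserved or clipped depends on the backing store.

```ts
// Sketch: request a Display P3 backing store for a 2D canvas, per the
// canvas-colorspace proposal. Unsupporting engines ignore the option.
const canvas = document.createElement("canvas");
const ctx = canvas.getContext("2d", { colorSpace: "display-p3" });

if (ctx) {
  // Fill colors may use CSS Color 4 syntax; values outside sRGB are only
  // meaningful if the backing store's color space can represent them.
  ctx.fillStyle = "color(display-p3 1 0 0)";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
}
```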
The CSS Filter Effects Module [[filter-effects]], which generalizes to CSS and HTML the filter effects already available in SVG, performs all operations in the linear-light sRGB color space. For example, the luminanceToAlpha definition hard-codes the matrix operation for converting linear-light sRGB to luminance. Operations are defined in terms of an equivalent SVG filter function.
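For reference, the luminance step that luminanceToAlpha hard-codes corresponds to applying the Rec. 709-derived luminance weights (approximately 0.2126, 0.7152, 0.0722) to linear-light sRGB components; a minimal sketch, with a helper name of our own choosing:

```ts
// Sketch of the linear-sRGB-to-luminance step hard-coded by luminanceToAlpha.
// Inputs are assumed to already be linear-light sRGB in the 0..1 range.
function luminanceFromLinearSrgb(rLin: number, gLin: number, bLin: number): number {
  return 0.2126 * rLin + 0.7152 * gLin + 0.0722 * bLin;
}

// A display-p3 or rec-2020 pixel would need different weights, which is one
// reason extending filters to other spaces is not just a gamut-relaxation.
```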
It would be desirable to extend the filter operations to at least other RGB spaces, such as display-p3, provided this can be done in an opt-in manner, without Web-incompatible changes to existing content. Currently, the SVG WG is focused on documenting existing implementations rather than extending SVG.
The HTML input element has a color type, which allows the user to pick or otherwise indicate a color. The picker is constrained to produce what HTML calls a 'simple color', such as '#123456'. This is limited to sRGB, with 8 bits per component of precision:

"A simple color consists of three 8-bit numbers in the range 0..255, representing the red, green, and blue components of the color respectively, in the sRGB color space."
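A small sketch of what that means in practice: parsing the string the control produces into its three 8-bit sRGB components (the helper name is ours).

```ts
// Parse an HTML "simple color" such as "#123456" into 8-bit sRGB components.
// The control serializes a "#" followed by six hexadecimal digits.
function parseSimpleColor(value: string): { r: number; g: number; b: number } | null {
  if (!/^#[0-9a-f]{6}$/i.test(value)) return null;
  return {
    r: parseInt(value.slice(1, 3), 16),
    g: parseInt(value.slice(3, 5), 16),
    b: parseInt(value.slice(5, 7), 16),
  };
}

// Usage with a color input assumed to exist in the page:
const picker = document.querySelector<HTMLInputElement>('input[type="color"]');
picker?.addEventListener("input", () => console.log(parseSimpleColor(picker.value)));
```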
Media Capabilities [[media-capabilities]] is a draft specification being developed by the Media Working Group. It intends to provide APIs to allow websites to make an optimal decision when picking audiovisual media content for the user. The APIs will expose information about the decoding and encoding capabilities for a given format but also output capabilities to find the best match based on the device's display.
The API is a replacement for the existing canPlayType() function in HTML [canplaytype], and isTypeSupported() in Media Source Extensions [[media-source]].
The Introduction document [media-capabilities-intro] gives a good explanation of the problem that this API is trying to solve.
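A hedged sketch of the kind of query the API enables. The HDR-related members (colorGamut, transferFunction, hdrMetadataType) are later additions to the VideoConfiguration dictionary and may not be present in every implementation; the codec string and numbers are examples only.

```ts
// Sketch: ask whether an HDR10-style stream is likely to decode smoothly.
async function canDecodeHdr(): Promise<boolean> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "file",
    video: {
      contentType: 'video/mp4; codecs="hev1.2.4.L153.B0"', // example codec string only
      width: 3840,
      height: 2160,
      bitrate: 20_000_000,
      framerate: 60,
      colorGamut: "rec2020",          // later additions; treated as optional
      transferFunction: "pq",
      hdrMetadataType: "smpteSt2086",
    } as VideoConfiguration,
  });
  return info.supported && info.smooth;
}
```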
There are a number of open issues where the Color on the Web CG could usefully provide input.
Media Queries 4 [[mediaqueries-4]] allow authors to test and query values or features of the browser or display device, independent of the document being rendered. They are used in the CSS @media rule to conditionally apply styles to a document, and in various other contexts and languages, such as HTML and JavaScript. Section 6.4 describes the color-gamut feature, which reports the approximate range of colors that are supported by the browser and output device.
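The feature can also be queried from script through the standard matchMedia() API; a small sketch:

```ts
// Sketch: probe the color-gamut media feature from script. The values form a
// superset chain: rec2020 implies p3, and p3 implies srgb.
function widestGamut(): "rec2020" | "p3" | "srgb" | "unknown" {
  if (window.matchMedia("(color-gamut: rec2020)").matches) return "rec2020";
  if (window.matchMedia("(color-gamut: p3)").matches) return "p3";
  if (window.matchMedia("(color-gamut: srgb)").matches) return "srgb";
  return "unknown";
}
```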
Media Queries 5 [[mediaqueries-5]] also allows authors to test and query the ambient light level (dim | normal | washed), although the assumption seems to be that dim means "excessive contrast and brightness would be distracting or uncomfortable" rather than "standard viewing conditions for HDR content". The enumeration is, deliberately, not tied to specific light levels, although ambient light sensors are mentioned.
Media Queries 5 also adds dynamic-range and video-dynamic-range features, both with the values standard and high, for SDR and HDR respectively. The two queries can return different results in the case of, for example, a 4K TV with an HDR video plane and an SDR content overlay plane.
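A short sketch of how a page might use the two features to choose sources, keeping in mind that, as noted above, they can disagree on devices that composite video and graphics in separate planes. The source URLs and class name are placeholders of ours.

```ts
// Sketch: pick video and overlay treatments independently, since the graphics
// plane and the video plane may differ in dynamic range on some devices.
const videoIsHdrCapable = window.matchMedia("(video-dynamic-range: high)").matches;
const pageIsHdrCapable = window.matchMedia("(dynamic-range: high)").matches;

const videoSrc = videoIsHdrCapable ? "movie-hdr.mp4" : "movie-sdr.mp4"; // placeholder URLs
document.querySelector("video")?.setAttribute("src", videoSrc);

if (!pageIsHdrCapable) {
  // e.g. avoid relying on HDR highlights in CSS or canvas overlays
  document.documentElement.classList.add("sdr-overlays");
}
```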
The CSS Object Model View Module [[cssom-view-1]] has a screen interface with a colorDepth attribute which typically returns the value 24 for an 8 bits-per-component display. An example in the specification combines colorDepth (with a value of 48!) with a Media Query for the p3 gamut and unspecified "other checks" to probe for an HDR screen.
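A sketch of that heuristic; since the "other checks" are left unspecified, this is only an approximation of the specification's example, not a reliable HDR detector.

```ts
// Rough sketch of the CSSOM View example's heuristic: treat a deep backing
// store plus a wide gamut as a hint (not proof) that the screen may be HDR.
const maybeHdrScreen =
  screen.colorDepth > 24 && window.matchMedia("(color-gamut: p3)").matches;

// Browsers commonly report 24 regardless of actual panel depth, so the
// dedicated dynamic-range media feature (above) is a more reliable signal.
console.log(maybeHdrScreen);
```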
No open WCG or HDR issues. However, the CSS WG decided that the color-related aspects of the CSS OM should migrate from that specification to CSS Color 4. That migration is in progress, with CSS Color 4 defining serialization of WCG color spaces.
The CSS Typed OM Level 1 [[css-typed-om]] is an extensible API for the CSSOM that reduces the burden of manipulating a CSS property's value via string manipulation. It does so by exposing CSS values as typed JavaScript objects rather than strings. There is an explainer document.

There is ongoing discussion on extending the CSS Typed OM for WCG and perhaps HDR color. Helper functions to calculate luminance contrast (for WCAG) are also in scope.

There is initial prototyping work for an object-oriented, typed CSS Color model.
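As an illustration of the kind of luminance-contrast helper under discussion, here is a minimal sketch of the WCAG 2.x contrast-ratio calculation for sRGB colors; the function names are ours and do not represent any proposed API.

```ts
// WCAG 2.x contrast ratio for two sRGB colors given as 0..255 components.
// Relative luminance uses the sRGB transfer function (WCAG's 0.03928 cut-off)
// and the Rec. 709 luminance weights.
type RGB = [number, number, number];

function relativeLuminance([r, g, b]: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio([255, 255, 255], [0, 0, 0])); // 21, the maximum possible
```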
Timed Text Markup Language [[TTML2]] has a tts:color attribute whose type is defined to be a value in sRGB, with 8-bit per-component precision. Titles in TTML are thus SDR.

However, subtitles are composited onto video content, and TTML does provide a (non-normative) Appendix Q, High Dynamic Range Compositing, which gives equations for converting the sRGB values into either PQ or HLG for compositing onto HDR video.

Realizing that sRGB is defined to have a peak luminance of 80 cd/m² in theory, and a somewhat higher value in practice, TTML also provides a tts:luminanceGain attribute to boost that value for systems (such as PQ) which require an absolute luminance value.
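The sketch below illustrates the kind of mapping Appendix Q describes, under our own simplifications: it handles a single component's transfer function and luminance scaling only, using the well-known SMPTE ST 2084 / BT.2100 PQ constants, and omits the color-primaries conversion a full implementation would also need. It is not the appendix's normative math.

```ts
// Sketch: map one sRGB subtitle component to a PQ signal value, applying
// tts:luminanceGain to the 80 cd/m² sRGB reference peak mentioned above.
const SRGB_PEAK_CDM2 = 80;

function srgbToLinear(c: number): number {            // c in 0..1
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function pqEncode(cdm2: number): number {             // absolute luminance -> PQ signal
  const m1 = 0.1593017578125, m2 = 78.84375;
  const c1 = 0.8359375, c2 = 18.8515625, c3 = 18.6875;
  const y = Math.min(cdm2 / 10000, 1);
  const yM1 = Math.pow(y, m1);
  return Math.pow((c1 + c2 * yM1) / (1 + c3 * yM1), m2);
}

// e.g. a full-white subtitle component with tts:luminanceGain="2.5":
const signal = pqEncode(srgbToLinear(1.0) * SRGB_PEAK_CDM2 * 2.5); // ≈ 0.58
console.log(signal);
```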
A static image format is required to store graphics (both camera-captured and artificially generated). Suggested use cases include, but are not limited to:
Any such graphics format would need to:
Knowledge of the image primaries allows correct display of color on all displays, including displays which would need to apply a gamut reduction to the image.
ICC profiles store data which allows the correct mapping of captured data to/from an all-encompassing color space, and from/to that all-encompassing color space to a display device. Version 4 has only a D50-based color space and lookup-table based transforms. Version 5 allows different color working spaces and an algorithmic calculator for the transforms. Version 5 is not well supported.
Open questions: how can we define an ICC / iccMAX profile for HLG? If we define an ICC / iccMAX profile for HLG, how can we test it? Lars Borg has written an HLG profile.
Netflix have published a blog post [[netflix-hdr]] that describes their approach for static images.
There is also an HDR-oriented comparison of image formats.
Portable Network Graphics (PNG) can store graphics using 16 bits per channel, and it is possible to store the primaries and white point. Two methods of storing the transfer function exist: the storage of a value for gamma, and the storage of an ICC profile. The first method, storage of gamma, would allow a backwards-compatible image to be displayed on non-HDR monitors. The second would allow the correct display of the image on an HDR display. A new, HDR-specific tag (with values "HLG" and "PQ") could be a third option.
The Timed Text Working Group has published a draft specification [[png-hdr-pq]] that extends the PNG format to include an iCCP chunk with the name "ITUR_2100_PQ_FULL". The chunk contains a given ICC profile, linked from the specification, although this profile is not actually used; only its presence is used to enable external rendering of the PQ-encoded image by the platform. See this discussion on the Color on the Web CG mailing list.
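For context, a minimal sketch of how a tool might inspect which of the color-related chunks mentioned above a PNG carries, including an iCCP chunk with that magic profile name. It only walks the chunk layout (8-byte signature, then length/type/data/CRC chunks) and does not decode image data; the function name is ours.

```ts
// Report color-related PNG chunks, and flag the "ITUR_2100_PQ_FULL" iCCP name.
function listColorChunks(png: Uint8Array): string[] {
  const ascii = (bytes: Uint8Array) => String.fromCharCode(...Array.from(bytes));
  const view = new DataView(png.buffer, png.byteOffset, png.byteLength);
  const found: string[] = [];
  let offset = 8; // skip the PNG signature

  while (offset + 8 <= png.length) {
    const length = view.getUint32(offset);               // big-endian data length
    const type = ascii(png.subarray(offset + 4, offset + 8));
    if (type === "iCCP") {
      // The profile name is a null-terminated string at the start of the data.
      const data = png.subarray(offset + 8, offset + 8 + length);
      const end = data.indexOf(0);
      const name = ascii(data.subarray(0, end < 0 ? data.length : end));
      found.push(name === "ITUR_2100_PQ_FULL" ? "iCCP (PQ signalling)" : `iCCP (${name})`);
    } else if (type === "gAMA" || type === "cHRM") {
      found.push(type);
    }
    offset += 12 + length; // 4 (length) + 4 (type) + data + 4 (CRC)
  }
  return found;
}
```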
Mozilla bug: AVIF (AV1 Image File Format) ICC profile support.
TODO: (might be moot, see WebGPU)
The GPU on the Web Working Group develops the WebGPU API [[webgpu]], a more modern successor to WebGL. The draft spec does not mention "color spaces". It has an RGBA color object, which consists of four double-precision floats. It therefore has the potential to cover HDR as well as WCG, but the meaning of the values, their range, transfer function, and color space is currently undefined. However, a passing mention of the naming convention for texture formats seems to indicate that both gamma-corrected and linear sRGB are supported, with conversion on the fly.
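A hedged sketch of that naming convention: the trailing "-srgb" on a texture format asks for automatic conversion between gamma-encoded storage and the linear values shaders work with. API names follow the draft [[webgpu]] specification; exact usage flags and the availability of navigator.gpu typings have varied across drafts.

```ts
// Sketch: "-srgb" formats store gamma-encoded texels that are converted to
// linear values when sampled; the plain format stores raw 8-bit values.
async function createExampleTextures(): Promise<void> {
  const adapter = await navigator.gpu?.requestAdapter();
  const device = await adapter?.requestDevice();
  if (!device) return; // WebGPU not available

  const usage = GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_DST;

  device.createTexture({ size: [256, 256], format: "rgba8unorm-srgb", usage }); // gamma-encoded sRGB
  device.createTexture({ size: [256, 256], format: "rgba8unorm", usage });      // raw / linear 8-bit
}
```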
TODO: ...