├── camera.md ├── dof-questions.md ├── dof-questions2.md ├── hiz.md ├── images ├── alpha.jpg ├── apertures1.jpg ├── apertures2.jpg ├── apertures3.jpg ├── apertures4.jpg ├── arc01.jpg ├── arc02.jpg ├── arc03.jpg ├── artifacts.jpg ├── background-weights.jpg ├── background.jpg ├── background2.jpg ├── cameras │ ├── dof.gif │ ├── res1.jpg │ ├── res10.jpg │ ├── res11.jpg │ ├── res2.jpg │ ├── res3.jpg │ ├── res4.jpg │ ├── res5.jpg │ ├── res6.gif │ ├── res6.jpg │ ├── res7.jpg │ ├── res8.jpg │ └── res9.jpg ├── comp-01.gif ├── comp-03.gif ├── comp1.gif ├── comp2.gif ├── compa.gif ├── compb.gif ├── confusion.jpg ├── diffuse.gif ├── discard01.jpg ├── discard02.jpg ├── discard03.jpg ├── discard04.jpg ├── discard05.jpg ├── discard06.jpg ├── discard07.jpg ├── distortion.jpg ├── example01.jpg ├── example02.jpg ├── example03.jpg ├── example1.jpg ├── example2.jpg ├── example3.jpg ├── example4.jpg ├── example5.jpg ├── fix-int1.jpg ├── fix-int2.jpg ├── fix-int3.gif ├── fix-int4.gif ├── fix-int5.gif ├── fix1.jpg ├── fix2.jpg ├── fix3.gif ├── fix4.gif ├── fix5.gif ├── fixa.gif ├── fixb.gif ├── fixc.gif ├── foreground-original.png ├── foreground-weights.jpg ├── foreground.jpg ├── foreground2.jpg ├── ghost01.jpg ├── ghost02.jpg ├── ghost03.jpg ├── ghost04.jpg ├── graph.gif ├── ies.gif ├── ies2.gif ├── intro.jpg ├── intro1.jpg ├── intro2.jpg ├── intro3.jpg ├── intro4.jpg ├── layers.gif ├── lens-area.jpg ├── lens-description.jpg ├── metal1.jpg ├── metal2.jpg ├── metal3.jpg ├── metal4.jpg ├── mir-1.jpg ├── poop.jpg ├── reflectance.jpg ├── res1.jpg ├── res2.jpg ├── res3.jpg ├── results-bad.jpg ├── results.jpg ├── scene.jpg ├── scene2.jpg ├── scene3.jpg ├── slide1.png ├── ssr-cam1.jpg ├── ssr-depth-threshold01.jpg ├── ssr-gif1.gif ├── ssr-gif10.gif ├── ssr-gif11.gif ├── ssr-gif2.gif ├── ssr-gif3.gif ├── ssr-gif4.gif ├── ssr-gif5.gif ├── ssr-gif6.gif ├── ssr-gif7.gif ├── ssr-gif8.gif ├── ssr-gif9.gif ├── ssr1.jpg ├── ssr10.jpg ├── ssr11.jpg ├── ssr12.jpg ├── ssr13.jpg ├── ssr14.jpg ├── 
ssr15.jpg ├── ssr16.jpg ├── ssr17.jpg ├── ssr18.jpg ├── ssr19.jpg ├── ssr2.jpg ├── ssr20.jpg ├── ssr21.jpg ├── ssr22.jpg ├── ssr23.jpg ├── ssr24.jpg ├── ssr25.jpg ├── ssr26.jpg ├── ssr3.jpg ├── ssr4.jpg ├── ssr5.jpg ├── ssr6.jpg ├── ssr7.jpg ├── ssr8.jpg ├── ssr9.jpg ├── starburst01.jpg ├── starburst02.jpg ├── starburst03.jpg ├── starburst04.jpg ├── starburst05.jpg ├── starburst06.jpg ├── tile-min-depth-linear.jpg ├── tile-min-depth-point.jpg ├── trace-01.jpg ├── trace-02.jpg ├── trace-03.jpg ├── trace-04.jpg └── trace-05.jpg ├── lens-flares.md ├── reflections.md ├── river.md └── validation.md /camera.md: -------------------------------------------------------------------------------- 1 | # Physical Cameras in Stingray (Part 1) # 2 | This is a quick blog to share some of the progress [Olivier Dionne](https://twitter.com/olivier_dionne) and I made lately with Physical Cameras in Stingray. Our goal of implementing a solid physically based pipeline has always been split in three phases. First we [validated our standard material](http://bitsquid.blogspot.ca/2017/07/validating-materials-and-lights-in.html). We then added physical lights. And now we are wrapping it up with a physical camera. 3 | 4 | We define a physical camera as an entity controlled by the same parameters a real world camera would use. These parameters are split into two groups which correspond to the two main parts of a camera. The camera _body_ is defined by its sensor size, ISO sensitivity, and a range of available shutter speeds. The camera _lens_ is defined by its focal length, focus range, and range of aperture diameters. Setting all of these parameters should expose the incoming light the same way a real world camera would. 5 | 6 | ### Stingray Representation ### 7 | Just like our physical lights, our camera is expressed as an entity with a bunch of components. The two main components are the Camera Body and the Camera Lens.
We then have a transform component and a camera component which together represent the view projection matrix of the camera. After that we have a list of shading environment components which we deem relevant to be controlled by a physical camera (all post effects relevant to a camera). The state of these shading environment components is controlled through a script component called the "Physical Camera Properties Mapper" (more on this later). Here is a glimpse of what the Physical Camera entity may look like (wip): 8 | 9 | ![](images/cameras/res11.jpg) 10 | 11 | So while there are a lot of components that belong to a physical camera, the user is expected to interact mainly with the body and the lens components. 12 | 13 | ### Post Effects ### 14 | A lot of our post effects are dedicated to simulating some sort of camera/lens artifact (DOF, motion blur, film grain, vignetting, bloom, chromatic aberration, etc.). One thing we wanted was the ability for physical cameras to override the post processes defined in our global shading environments. We also wanted to let users easily opt out of the physically based mapping that occurs between a camera and its corresponding post-effect. For example a physical camera will generate an accurate circle of confusion for the depth of field effect, but a user might be frustrated by the limitations imposed by a physically correct DOF effect. In this case a user can opt out by simply deleting the "Depth Of Field" component from the camera entity. 15 | 16 | It's nice to see how the expressiveness of the Stingray entity system is shaping up and how it enables us to build these complex entities without the need to change much of the engine. 17 | 18 | ### Properties Mapper ### 19 | All of the mapping occurs in the properties mapper component which I mentioned earlier. This is simply a Lua script that gets executed whenever any of the entity properties are edited.
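To make the mechanism concrete, the mapper can be pictured as a tiny observer that reruns whenever a property is edited. Below is a minimal Python sketch of that idea (my own illustration, not Stingray code; the class and property names are hypothetical, and the exposure formula used inside is the one covered in the next paragraphs):

```python
import math

class PhysicalCameraMapper:
    """Toy observer mirroring the properties mapper's role: it runs on
    every property edit and recomputes the derived values that would be
    pushed to the shading environment components."""

    def __init__(self):
        self.properties = {}  # body/lens parameters (aperture, shutter, ISO, ...)
        self.derived = {}     # values that drive the camera's post effects

    def set_property(self, name, value):
        self.properties[name] = value
        self._remap()  # the real component reruns its Lua script here

    def _remap(self):
        p = self.properties
        if {"aperture", "shutter_time", "iso"} <= p.keys():
            # Exposure mapping from the Frostbite course notes (same math
            # as the compute_ev Lua function shown below).
            ev_100 = math.log2((p["aperture"] ** 2 * 100.0) / (p["shutter_time"] * p["iso"]))
            self.derived["exposure"] = 1.0 / (1.2 * 2.0 ** ev_100)

mapper = PhysicalCameraMapper()
mapper.set_property("aperture", 16.0)
mapper.set_property("shutter_time", 1.0 / 100.0)
mapper.set_property("iso", 100.0)
print(mapper.derived["exposure"])  # small exposure scale for a bright setup
```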
20 | 21 | The most important property we wanted to map was the exposure value. We wanted the f-stop, shutter speed, and ISO values to map to an exposure value which would simulate how a real camera sensor reacts to incoming light. Lucky for us this topic is very well covered by Sebastien Lagarde and Charles de Rousiers in their awesome awesome awesome [Moving Frostbite to Physically Based Rendering](https://seblagarde.files.wordpress.com/2015/07/course_notes_moving_frostbite_to_pbr_v32.pdf) document. The mapping basically boils down to: 22 | 23 | ~~~ 24 | local function compute_ev(aperture, shutter_time, iso) 25 | local ev_100 = log2((aperture * aperture * 100) / (shutter_time * iso)) 26 | local max_luminance = 1.2 * math.pow(2, ev_100) 27 | return (1 / max_luminance) 28 | end 29 | ~~~ 30 | 31 | The second property we were really interested in mapping was the field of view of the camera. Usually the horizontal FOV is calculated as _2 x atan(h/2f)_ where _h_ is the camera sensor's width and _f_ is the current focal length of the lens. This by itself gives a good approximation of the FOV of a lens, but as was pointed out by the [MGS5 & Fox Engine presentation](https://youtu.be/FQMbxzTUuSg?t=50m12s), the focus distance of the lens should also be considered when calculating the FOV from the camera properties. 32 | 33 | Intuitively we thought that the change in the FOV was caused by a change in the effective focal length of the lens. Adjusting the focus usually shifts a group of lenses up and down the optical axis of a camera lens. Our best guess was that this shift would increase or decrease the effective focal length of the camera lens.
Using this idea we were able to simulate the effect that changing the focus point has on the FOV of a camera: 34 | 35 | [https://www.youtube.com/watch?v=KDwUi-vYYMQ](https://www.youtube.com/watch?v=KDwUi-vYYMQ&feature=youtu.be) 36 | 37 | ~~~ 38 | local function compute_fov(focal_length, film_back_height, focus) 39 | local normalized_focus = (focus - 0.38)/(5.0 - 0.38) 40 | local focal_length_offset = lerp(0.0, 1.0, normalized_focus) 41 | return 2.0 * math.atan(film_back_height/(2.0 * (focal_length + focal_length_offset))) 42 | end 43 | ~~~ 44 | 45 | 46 | While this gave us plausible results in some cases, it did not map accurately to a real world camera with certain lens settings. For example we can choose a focal length offset that gives good FOV mapping for a zoom lens set to 24mm but incorrect FOV results when it's set to 70mm (see [video](https://www.youtube.com/watch?v=KDwUi-vYYMQ&feature=youtu.be) above). This area of lens optics is one we would like to explore more in the future. 47 | 48 | In the future we will map more camera properties to their corresponding post-effects. More on this in a follow-up blog. 49 | 50 | ### Validating Results ### 51 | To validate our mappings we designed a small, controlled environment room that we re-created in Stingray. This idea was inspired by the "Conference Room" setup that was presented by Hideo Kojima in the [MGS5 & Fox Engine presentation](https://youtu.be/FQMbxzTUuSg?t=20m22s). We used our simplified environment room to compare our rendered results with real world photographs. 52 | 53 | Controlled Environment: 54 | ![](images/cameras/res4.jpg) 55 | 56 | Stingray Equivalent: 57 | ![](images/cameras/res3.jpg) 58 | 59 | Since there is no convenient way to adjust the white balancing in Stingray, we decided to white balance our camera data and use a pure white light in our Stingray scene. We also decided to compare the photos and renders in linear space in the hope of minimizing potential sources of error.
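The FOV mapping can also be checked numerically before comparing against photographs. Here is a Python transcription of compute_fov (a sketch of mine; the 24 mm film back height and focal length are example values, not measurements from the post) showing how much the simulated field of view narrows across the 0.38 m to 5 m focus range:

```python
import math

def lerp(a, b, t):
    return a + (b - a) * t

def compute_fov(focal_length, film_back_height, focus):
    # Transcription of the Lua above: the focus distance nudges the
    # effective focal length by up to 1 mm over the 0.38 m - 5 m range.
    normalized_focus = (focus - 0.38) / (5.0 - 0.38)
    focal_length_offset = lerp(0.0, 1.0, normalized_focus)
    return 2.0 * math.atan(film_back_height / (2.0 * (focal_length + focal_length_offset)))

near = math.degrees(compute_fov(24.0, 24.0, 0.38))  # ~53.1 degrees
far = math.degrees(compute_fov(24.0, 24.0, 5.0))    # ~51.3 degrees
print(round(near, 1), round(far, 1))
```

With these example numbers the simulated focus breathing amounts to a change of just under two degrees of horizontal FOV.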
60 | 61 | White balancing our photographs: 62 | ![](images/cameras/res6.gif) 63 | 64 | Our very first comparison was disappointing: 65 | ![](images/cameras/res10.jpg) 66 | 67 | We tracked down the difference in brightness to a problem with how we expressed our light intensity. We discovered that we made the mistake of using the specified lumen value of our lights as their light intensity. The total luminous flux is expressed in lumens, but the luminous intensity (what the material shader is interested in) is actually the luminous flux per solid angle. So while we let the users enter the "intensity" of lights in lumens, we need to map this value to luminous intensity. The mapping is done as _lumens/2π(1-cos(½α))_ where _α_ is the apex angle of the light. Lots of details can be found [here](https://www.compuphase.com/electronics/candela_lumen.htm). This works well for point lights and spot lights. In the future our directional lights will be assumed to be the sun or moon and will be expressed in lux, perhaps with a corresponding disk size. 68 | 69 | With this fix in place we started getting more encouraging results: 70 | ![](images/cameras/res1.jpg) 71 | 72 | ![](images/cameras/res2.jpg) 73 | 74 | There is lots left to do but this feels like a very good start to our physically based cameras. -------------------------------------------------------------------------------- /dof-questions.md: -------------------------------------------------------------------------------- 1 | #### Question 1) 2 | Are these really the raw results of what the foreground layer should look like? Or is this some kind of visualization mode of the foreground weights combined with the foreground result? 3 | 4 | ![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png) 5 | 6 | The results I seem to get for the foreground layer are more like (note this is using a "background range" of 2.5m so ~100in).
7 | 8 | foreground.rgb/foreground.a: 9 | ![](https://github.com/greje656/Questions/blob/master/images/foreground.jpg) 10 | Notice how I don't get any sample pixels whose weight is low? I'm wondering if I misunderstood something crucial here (i.e. no black pixels like your result). 11 | 12 | The weight of the samples looks like this: 13 | ![](https://github.com/greje656/Questions/blob/master/images/foreground-weights.jpg) 14 | 15 | Similarly the background looks like this: 16 | 17 | background.rgb/background.a 18 | ![](https://github.com/greje656/Questions/blob/master/images/background.jpg) 19 | 20 | And the background weights: 21 | ![](https://github.com/greje656/Questions/blob/master/images/background-weights.jpg) 22 | 23 | The normalized alpha value I get looks like: 24 | ![](https://github.com/greje656/Questions/blob/master/images/alpha.jpg) 25 | 26 | And finally "lerp(background/background.a, foreground/foreground.a, alpha)": 27 | ![](https://github.com/greje656/Questions/blob/master/images/results.jpg) 28 | 29 | I'm wondering if maybe the foreground/background layers should be weighted like this? (... nah, that doesn't make sense! I'm so confused) 30 | 31 | foreground.rgb/SAMPLE_COUNT: 32 | ![](https://github.com/greje656/Questions/blob/master/images/foreground2.jpg) 33 | 34 | background.rgb/SAMPLE_COUNT: 35 | ![](https://github.com/greje656/Questions/blob/master/images/background2.jpg) 36 | 37 | #### Question 2) 38 | Should the maxCoCMinDepth tiles be sampled using linear interpolation? Or point sampling? Intuitively it feels like it should be point sampling but I see artifacts at the edges of the tiles if I do. Note that with tiles of 20x20 pixels I'm making the assumption that the maximum kernel size allowed should be 10x10px. Maybe that is a false assumption?
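For reference, the tile buffer in question stores the minimum depth and the maximum CoC of each 20x20 pixel region. A toy Python sketch of that reduction (my own illustration; plain row-major lists stand in for the full-res depth and CoC textures):

```python
TILE_SIZE = 20  # tile dimensions in pixels, as in the question above

def build_tiles(depth, coc, width, height):
    """Reduce per-pixel linear depth and CoC radius into one
    (min_depth, max_coc) pair per tile -- the two quantities the
    maxCoCMinDepth buffer holds."""
    tiles = {}
    for y in range(height):
        for x in range(width):
            key = (x // TILE_SIZE, y // TILE_SIZE)
            d = depth[y * width + x]
            c = coc[y * width + x]
            if key in tiles:
                md, mc = tiles[key]
                tiles[key] = (min(md, d), max(mc, c))
            else:
                tiles[key] = (d, c)
    return tiles

# 40x20 image -> two tiles side by side.
depth = [1.0] * 800
depth[0] = 0.5             # a close pixel in the left tile
coc = [2.0] * 800
coc[19 * 40 + 30] = 5.0    # a large CoC in the right tile
print(build_tiles(depth, coc, 40, 20))  # {(0, 0): (0.5, 2.0), (1, 0): (1.0, 5.0)}
```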
39 | 40 | Linear (currently what I'm using): 41 | ![](https://github.com/greje656/Questions/blob/master/images/tile-min-depth-linear.jpg) 42 | 43 | Point: 44 | ![](https://github.com/greje656/Questions/blob/master/images/tile-min-depth-point.jpg) 45 | 46 | Point sampling artifacts: 47 | ![](https://github.com/greje656/Questions/blob/master/images/artifacts.jpg) 48 | 49 | #### Question 3) 50 | Finally (this one might be really dumb!). Does this need to differentiate samples that are behind/in front of the camera focus point? i.e. does this need signed CoCs? I think it does but that's not something that's 100% clear to me. Currently I've implemented a solution that uses abs(coc), but this doesn't work for focused objects on top of unfocused ones: 51 | ![](https://github.com/greje656/Questions/blob/master/images/results-bad.jpg) 52 | It's probable that we DO need to use signed circles of confusion but I just wanted to confirm that I'm not missing something obvious here (I'm sure I am) 53 | 54 | #### Code used to generate images 55 | #define NUM_SAMPLES 5 56 | #define COC_SIZE_IN_PIXELS 10 57 | #define BACKGROUND_RANGE 2.5 58 | 59 | struct PresortParams { 60 | float coc; 61 | float backgroundWeight; 62 | float foregroundWeight; 63 | }; 64 | 65 | float2 DepthCmp2(float depth, float closestTileDepth){ 66 | float d = saturate((depth - closestTileDepth)/BACKGROUND_RANGE); 67 | float2 depthCmp; 68 | depthCmp.x = smoothstep( 0.0, 1.0, d ); // Background 69 | depthCmp.y = 1.0 - depthCmp.x; // Foreground 70 | return depthCmp; 71 | } 72 | 73 | float SampleAlpha(float sampleCoc) { 74 | const float DOF_SINGLE_PIXEL_RADIUS = length(float2(0.5, 0.5)); 75 | return min( 76 | rcp(PI * sampleCoc * sampleCoc), 77 | rcp(PI * DOF_SINGLE_PIXEL_RADIUS * DOF_SINGLE_PIXEL_RADIUS) 78 | ); 79 | } 80 | 81 | PresortParams GetPresortParams(float sample_coc, float sample_depth, float closestTileDepth){ 82 | PresortParams presort_params; 83 | 84 | presort_params.coc = sample_coc; 85 |
presort_params.backgroundWeight = SampleAlpha(sample_coc) * DepthCmp2(sample_depth, closestTileDepth).x; 86 | presort_params.foregroundWeight = SampleAlpha(sample_coc) * DepthCmp2(sample_depth, closestTileDepth).y; 87 | 88 | return presort_params; 89 | } 90 | 91 | float4 ps_main(PS_INPUT input) : SV_TARGET0 { 92 | 93 | float tileMaxCoc = TEX2DLOD(input_texture3, input.uv, 0).g * COC_SIZE_IN_PIXELS; 94 | 95 | float SAMPLE_COUNT = 0.0; 96 | float4 background = 0; 97 | float4 foreground = 0; 98 | 99 | for (int x = -NUM_SAMPLES; x <= NUM_SAMPLES; ++x) { 100 | for (int y = -NUM_SAMPLES; y <= NUM_SAMPLES; ++y) { 101 | 102 | float2 samplePos = get_sampling_pos(x, y); 103 | float2 sampleUV = input.uv + samplePos/back_buffer_size * tileMaxCoc; 104 | 105 | float3 sampleColor = TEX2DLOD(input_texture0, sampleUV, 0).rgb; 106 | float sampleCoc = decode_coc(TEX2DLOD(input_texture1, sampleUV, 0).r) * COC_SIZE_IN_PIXELS; 107 | float sampleDepth = linearize_depth(TEX2DLOD(input_texture2, sampleUV, 0).r); 108 | float closestTileDepth = TEX2DLOD(input_texture3, input.uv, 0).r; 109 | 110 | PresortParams samplePresortParams = GetPresortParams(sampleCoc, sampleDepth, closestTileDepth); 111 | 112 | background += samplePresortParams.backgroundWeight * float4(sampleColor, 1.f); 113 | foreground += samplePresortParams.foregroundWeight * float4(sampleColor, 1.f); 114 | 115 | SAMPLE_COUNT += 1; 116 | } 117 | } 118 | 119 | float alpha = saturate( 2.0 * ( 1.0 / SAMPLE_COUNT ) * ( 1.0 / SampleAlpha(tileMaxCoc)) * foreground.a); 120 | float3 finalColor = lerp(background/background.a, foreground/foreground.a, alpha); 121 | 122 | return float4(finalColor, 1); 123 | } -------------------------------------------------------------------------------- /dof-questions2.md: -------------------------------------------------------------------------------- 1 | The "scene": 2 | ![](https://github.com/greje656/Questions/blob/master/images/scene3.jpg) 3 | 4 | The different layers: 5 | 
![](https://github.com/greje656/Questions/blob/master/images/layers.gif) 6 | 7 | I'm confused about whether or not the background samples are weighted correctly? 8 | ![](https://github.com/greje656/Questions/blob/master/images/confusion.jpg) 9 | 10 | Looking at the slides, it's almost as if any pixel classified as 100% background should use a coc defined as: 11 | lerp(tileMaxCoc, currentPixel'sCoC, DepthCmp2(currentPixel'sDepth, closestTileDepth)); 12 | I'm not sure!? Or maybe this is some kind of debug view and not the background layer "as is"? -------------------------------------------------------------------------------- /hiz.md: -------------------------------------------------------------------------------- 1 | # Notes On Screen Space HIZ Tracing 2 | 3 | Note: The [Markdown version](https://github.com/greje656/Questions/blob/master/hiz.md) of this document is available and might have better formatting on phones/tablets. 4 | 5 | The following is a small gathering of notes and findings that we made throughout the implementation of hiz tracing in screen space for ssr in Stingray. I recently heard a few claims regarding hiz tracing which motivated me to share some notes on the topic. Note that I also wrote about how we reproject reflections in a [previous entry](http://bitsquid.blogspot.ca/2017/06/reprojecting-reflections_22.html) which might be of interest. Also note that I've included all the code at the bottom of the blog. 6 | 7 | The original implementation of our hiz tracing method was basically a straight port of the "Hi-Z Screen-Space Tracing" described in [GPU-Pro 5](https://www.crcpress.com/GPU-Pro-5-Advanced-Rendering-Techniques/Engel/p/book/9781482208634) by [Yasin Uludag](https://twitter.com/yasinuludag).
The very first results we got looked something like this: 8 | 9 | Original scene: 10 | ![](https://github.com/greje656/Questions/blob/master/images/ssr1.jpg) 11 | 12 | Traced ssr using hiz tracing: 13 | ![](https://github.com/greje656/Questions/blob/master/images/ssr2.jpg) 14 | 15 | ## Artifacts 16 | 17 | The weird horizontal stripes were reported when ssr was enabled in the Stingray editor. They only revealed themselves at certain resolutions (they would appear and disappear as the viewport got resized). I started writing some tracing visualization views to help me track each hiz trace event: 18 | 19 | ![](https://github.com/greje656/Questions/blob/master/images/ssr-gif7.gif) 20 | 21 | Using these kinds of debug views I was able to see that for some resolutions, the starting position of a ray when traced at half-res happened to be exactly at the edge of a hiz cell. Since tracing the hiz structure relies on intersecting the current position of a ray with the boundary of the cell it lies in, it means that we need to do a ray/plane intersection. As the numerator of (planes - pos.xy)/dir.xy got closer and closer to zero the solutions for the intersection started to lose precision until it completely fell apart. 22 | 23 | To tackle this problem we snap the origin of each traced ray to the center of a hiz cell: 24 | 25 | ~~~ 26 | float2 cell_count_at_start = cell_count(HIZ_START_LEVEL); 27 | float2 aligned_uv = floor(input.uv * cell_count_at_start)/cell_count_at_start + 0.25/cell_count_at_start; 28 | ~~~ 29 | 30 | Rays traced with and without snapping the starting pos to the hiz cell center: 31 | ![](https://github.com/greje656/Questions/blob/master/images/ssr-gif6.gif) 32 | 33 | This looked better. However it didn't address all of the tracing artifacts we were seeing. The results were still plagued with lots of small pixels whose traced rays failed.
When investigating these failing cases I noticed that they would sometimes get stuck for no apparent reason in a cell along the way. It also occurred more frequently when rays travelled in the screen space axes (±1,0) or (0,±1). After drawing a bunch of ray diagrams on paper I realized that the cell intersection method proposed in GPU-Pro had a failing case! To ensure hiz cells are always crossed, the article offsets the intersection planes of a cell by a small offset. This is to ensure that the intersection point crosses the boundaries of the cell it's intersecting so that the trace continues to make progress. 34 | 35 | While this works in most cases there is one scenario which results in a ray that will not cross over into the next hiz cell (see diagram below). When this happens the ray wastes the rest of its allocated trace iterations intersecting the same cell without ever crossing it. To address this we changed the proposed method slightly. Instead of offsetting the bounding planes, we choose the appropriate offset to add depending on which plane was intersected (horizontal or vertical). This ensures that we will always cross a cell when tracing: 36 | 37 | ~~~~ 38 | float2 cell_size = 1.0 / cell_count; 39 | float2 planes = cell_id/cell_count + cell_size * cross_step + cross_offset; 40 | float2 solutions = (planes - pos.xy)/dir.xy; 41 | 42 | float3 intersection_pos = pos + dir * min(solutions.x, solutions.y); 43 | return intersection_pos; 44 | ~~~~ 45 | 46 | ~~~~ 47 | float2 cell_size = 1.0 / cell_count; 48 | float2 planes = cell_id/cell_count + cell_size * cross_step; 49 | float2 solutions = (planes - pos)/dir.xy; 50 | 51 | float3 intersection_pos = pos + dir * min(solutions.x, solutions.y); 52 | intersection_pos.xy += (solutions.x < solutions.y) ?
float2(cross_offset.x, 0.0) : float2(0.0, cross_offset.y); 53 | return intersection_pos; 54 | ~~~~ 55 | 56 | ![](https://github.com/greje656/Questions/blob/master/images/ssr19.jpg) 57 | 58 | Incorrect VS correct cell crossing: 59 | ![](https://github.com/greje656/Questions/blob/master/images/ssr-gif9.gif) 60 | 61 | Final result: 62 | ![](https://github.com/greje656/Questions/blob/master/images/ssr6.jpg) 63 | 64 | ## Ray Marching Towards the Camera 65 | 66 | At the end of the GPU-Pro chapter there is a small mention that raymarching towards the camera with hiz tracing would require storing both the minimum and maximum depth values in the hiz structure (requiring bumping the format to R32G32F). However if you visualize the trace of a ray leaving the surface and travelling towards the camera (i.e. away from the depth buffer plane) then you can simply account for that case and augment the algorithm described in GPU-Pro to navigate up and down the hierarchy until the ray finds the first hit with a hiz cell: 67 | 68 | ![](https://github.com/greje656/Questions/blob/master/images/ssr-cam1.jpg) 69 | 70 | ~~~ 71 | if(v.z > 0) { 72 | float min_minus_ray = min_z - ray.z; 73 | tmp_ray = min_minus_ray > 0 ?
ray + v_z*min_minus_ray : tmp_ray; 74 | float2 new_cell_id = cell(tmp_ray.xy, current_cell_count); 75 | if(crossed_cell_boundary(old_cell_id, new_cell_id)) { 76 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset); 77 | level = min(HIZ_MAX_LEVEL, level + 2.0f); 78 | } 79 | } else if(ray.z < min_z) { 80 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset); 81 | level = min(HIZ_MAX_LEVEL, level + 2.0f); 82 | } 83 | ~~~ 84 | 85 | This has proven to be fairly solid and enabled us to trace a wider range of the screen space: 86 | 87 | [https://youtu.be/BjoMu-yI3k8](https://youtu.be/BjoMu-yI3k8) 88 | ![](https://github.com/greje656/Questions/blob/master/images/ssr21.jpg) 89 | 90 | ## Ray Marching Behind Surfaces 91 | 92 | Another alteration that can be made to the hiz tracing algorithm is to add support for rays travelling behind surfaces. Of course to do this you must give the surfaces of the hiz cells a thickness. So instead of tracing against extruded hiz cells you trace against "floating" hiz cells.
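The difference between the two can be pinned down with a tiny sketch of the hit test (my own Python illustration; z grows away from the camera, and `thickness` plays the same role as a depth threshold):

```python
def hits_extruded_cell(ray_z, min_z):
    # Extruded cells: everything at or behind the depth buffer counts as
    # solid, so a ray can never travel behind a surface.
    return ray_z >= min_z

def hits_floating_cell(ray_z, min_z, thickness):
    # Floating cells: the surface occupies only [min_z, min_z + thickness],
    # so a ray that has gone deeper keeps tracing behind the surface.
    return min_z <= ray_z <= min_z + thickness

# A ray sample well behind a thin surface at depth 0.5:
ray_z, min_z, thickness = 0.62, 0.5, 0.05
print(hits_extruded_cell(ray_z, min_z))             # True: trace would stop here
print(hits_floating_cell(ray_z, min_z, thickness))  # False: trace continues behind
```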
93 | 94 | ![](https://github.com/greje656/Questions/blob/master/images/ssr23.jpg) 95 | 96 | With that in mind we can tighten the tracing algorithm so that it cannot end the trace unless it finds a collision with one of these floating cells: 97 | 98 | ~~~ 99 | if(level == HIZ_START_LEVEL && min_minus_ray > depth_threshold) { 100 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset); 101 | level = HIZ_START_LEVEL + 1; 102 | } 103 | ~~~ 104 | 105 | Tracing behind surfaces disabled VS enabled: 106 | ![](https://github.com/greje656/Questions/blob/master/images/ssr13.jpg) 107 | ![](https://github.com/greje656/Questions/blob/master/images/ssr17.jpg) 108 | 109 | Unfortunately this often means that the traced rays travelling behind a surface degenerate into a linear search and the cost can skyrocket for these traced pixels: 110 | 111 | Number of iterations to complete the trace (black=0, red=64): 112 | ![](https://github.com/greje656/Questions/blob/master/images/ssr14.jpg) 113 | 114 | 115 | ## The Problem of Tracing a Discrete Depth Buffer 116 | 117 | For me the most difficult artifact to understand and deal with when implementing ssr is (by far) the implications of tracing a discrete depth buffer. Unless you can fully commit to the idea of tracing objects with infinite thicknesses, you will need to use some kind of depth threshold to mask a reflection if its intersection with the geometry is not valid. If you do use a depth threshold then you can (will?) end up getting artifacts like these: 118 | 119 | [https://youtu.be/ZftaDG2q3D0](https://youtu.be/ZftaDG2q3D0) 120 | ![](https://github.com/greje656/Questions/blob/master/images/ssr25.jpg) 121 | 122 | The problem _as far as I understand it_, is that rays can oscillate between passing and failing the depth threshold test.
It is essentially an amplified aliasing problem caused by the finite resolution of the depth buffer: 123 | ![](https://github.com/greje656/Questions/blob/master/images/ssr24.jpg) 124 | 125 | I have experimented with adapting the depth threshold based on different properties of the intersection point (direction of reflected ray, angle of incidence at intersection, surface inclination at intersection) but I have never been able to find a silver bullet (or anything that resembles a bullet to be honest). Perhaps a good approach could be to interpolate the depth value of neighboring cells _if_ the neighbors belong to the same geometry? I think that [Mikkel Svendsen](https://twitter.com/ikarosav) proposed a solution to this problem while presenting [Low Complexity, High Fidelity: The Rendering of "INSIDE"](https://youtu.be/RdN06E6Xn9E?t=40m27s) but I have yet to wrap my head around the proposed solution and try it. 126 | 127 | ## All or Nothing 128 | 129 | Finally it's worth pointing out that hiz tracing is a very "all or nothing" way to find an intersection point. Neighboring rays that exhaust their maximum number of allowed iterations before finding an intersection can end up in very different parts of the screen, which can cause a noticeable discontinuity in the ssr buffer: 130 | 131 | ![](https://github.com/greje656/Questions/blob/master/images/ssr26.jpg) 132 | 133 | This is something that can be very distracting and is made much worse when dealing with a jittered depth buffer combined with taa. This side-effect should be considered carefully when choosing a tracing solution for ssr. 134 | 135 | ## Code 136 | 137 | ~~~ 138 | float2 cell(float2 ray, float2 cell_count, uint camera) { 139 | return floor(ray.xy * cell_count); 140 | } 141 | 142 | float2 cell_count(float level) { 143 | return input_texture2_size / (level == 0.0 ?
1.0 : exp2(level)); 144 | } 145 | 146 | float3 intersect_cell_boundary(float3 pos, float3 dir, float2 cell_id, float2 cell_count, float2 cross_step, float2 cross_offset, uint camera) { 147 | float2 cell_size = 1.0 / cell_count; 148 | float2 planes = cell_id/cell_count + cell_size * cross_step; 149 | 150 | float2 solutions = (planes - pos)/dir.xy; 151 | float3 intersection_pos = pos + dir * min(solutions.x, solutions.y); 152 | 153 | intersection_pos.xy += (solutions.x < solutions.y) ? float2(cross_offset.x, 0.0) : float2(0.0, cross_offset.y); 154 | 155 | return intersection_pos; 156 | } 157 | 158 | bool crossed_cell_boundary(float2 cell_id_one, float2 cell_id_two) { 159 | return (int)cell_id_one.x != (int)cell_id_two.x || (int)cell_id_one.y != (int)cell_id_two.y; 160 | } 161 | 162 | float minimum_depth_plane(float2 ray, float level, float2 cell_count, uint camera) { 163 | return input_texture2.Load(int3(vr_stereo_to_mono(ray.xy, camera) * cell_count, level)).r; 164 | } 165 | 166 | float3 hi_z_trace(float3 p, float3 v, in uint camera, out uint iterations) { 167 | float level = HIZ_START_LEVEL; 168 | float3 v_z = v/v.z; 169 | float2 hi_z_size = cell_count(level); 170 | float3 ray = p; 171 | 172 | float2 cross_step = float2(v.x >= 0.0 ? 1.0 : -1.0, v.y >= 0.0 ? 
1.0 : -1.0); 173 | float2 cross_offset = cross_step * 0.00001; 174 | cross_step = saturate(cross_step); 175 | 176 | float2 ray_cell = cell(ray.xy, hi_z_size.xy, camera); 177 | ray = intersect_cell_boundary(ray, v, ray_cell, hi_z_size, cross_step, cross_offset, camera); 178 | 179 | iterations = 0; 180 | while(level >= HIZ_STOP_LEVEL && iterations < MAX_ITERATIONS) { 181 | // get the cell number of the current ray 182 | float2 current_cell_count = cell_count(level); 183 | float2 old_cell_id = cell(ray.xy, current_cell_count, camera); 184 | 185 | // get the minimum depth plane in which the current ray resides 186 | float min_z = minimum_depth_plane(ray.xy, level, current_cell_count, camera); 187 | 188 | // intersect only if ray depth is below the minimum depth plane 189 | float3 tmp_ray = ray; 190 | if(v.z > 0) { 191 | float min_minus_ray = min_z - ray.z; 192 | tmp_ray = min_minus_ray > 0 ? ray + v_z*min_minus_ray : tmp_ray; 193 | float2 new_cell_id = cell(tmp_ray.xy, current_cell_count, camera); 194 | if(crossed_cell_boundary(old_cell_id, new_cell_id)) { 195 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset, camera); 196 | level = min(HIZ_MAX_LEVEL, level + 2.0f); 197 | }else{ 198 | if(level == 1 && abs(min_minus_ray) > 0.0001) { 199 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset, camera); 200 | level = 2; 201 | } 202 | } 203 | } else if(ray.z < min_z) { 204 | tmp_ray = intersect_cell_boundary(ray, v, old_cell_id, current_cell_count, cross_step, cross_offset, camera); 205 | level = min(HIZ_MAX_LEVEL, level + 2.0f); 206 | } 207 | 208 | ray.xyz = tmp_ray.xyz; 209 | --level; 210 | 211 | ++iterations; 212 | } 213 | return ray; 214 | } 215 | ~~~ -------------------------------------------------------------------------------- /images/alpha.jpg: -------------------------------------------------------------------------------- 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/ssr6.jpg -------------------------------------------------------------------------------- /images/ssr7.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/ssr7.jpg -------------------------------------------------------------------------------- /images/ssr8.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/ssr8.jpg -------------------------------------------------------------------------------- /images/ssr9.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/ssr9.jpg -------------------------------------------------------------------------------- /images/starburst01.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst01.jpg -------------------------------------------------------------------------------- /images/starburst02.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst02.jpg -------------------------------------------------------------------------------- /images/starburst03.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst03.jpg 
-------------------------------------------------------------------------------- /images/starburst04.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst04.jpg -------------------------------------------------------------------------------- /images/starburst05.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst05.jpg -------------------------------------------------------------------------------- /images/starburst06.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/starburst06.jpg -------------------------------------------------------------------------------- /images/tile-min-depth-linear.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/tile-min-depth-linear.jpg -------------------------------------------------------------------------------- /images/tile-min-depth-point.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/tile-min-depth-point.jpg -------------------------------------------------------------------------------- /images/trace-01.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greje656/Questions/1828104d8e01791cfbd55599fd0627d83e59eff1/images/trace-01.jpg -------------------------------------------------------------------------------- /images/trace-02.jpg: 
/lens-flares.md:
--------------------------------------------------------------------------------

# Physically Based Lens Flare

While playing Horizon Zero Dawn I was inspired by the lens flares it featured and decided to look into implementing some basic ones in Stingray. There were four types of flare I was particularly interested in.
1) Anamorphic flare
2) Aperture diffraction (starbursts)
3) Camera ghosts due to the sun or moon (high quality; what this post will cover)
4) Camera ghosts due to all other light sources (low quality; a screen-space effect)

*[Die Hard](http://www.imdb.com/title/tt0095016/)* (1), *[Just Another Dang Blog](https://blog.lopau.com/uv-lens-and-night-photography/)* (2), *[PEXELS](https://www.pexels.com/photo/sunrise-sunset-lens-flare-6889/)* (3), *[The Matrix Reloaded](http://www.imdb.com/title/tt0234215/)* (4)
![](https://github.com/greje656/Questions/blob/master/images/intro.jpg)

Once finished, I'll do a follow-up blog post on the Camera Lens Flare plugin, but for now I want to share the implementation details of the high-quality ghosts, which are an implementation of ["Physically-Based Real-Time Lens Flare Rendering"](http://resources.mpi-inf.mpg.de/lensflareRendering).

All the code used to generate the images and videos of this article can be found here: https://github.com/greje656/PhysicallyBasedLensFlare

### Ghosts

The basic idea of the "Physically-Based Lens Flare" paper is to ray trace "bundles" through a lens system; the light that reaches the sensor forms a ghost. A ghost here refers to the defocused light that reaches the sensor of a camera due to light reflecting off the lenses. Since a camera lens is typically made not of a single optical element but of many, numerous ghosts can form on its sensor. If we only consider the ghosts formed from two bounces, that's a total of [nCr(n,2)](https://www.desmos.com/calculator/rsrjo1mhy1) possible ghost [combinations](https://en.wikipedia.org/wiki/Combination), where n is the number of lens components in a camera lens.

![](https://github.com/greje656/Questions/blob/master/images/ghost04.jpg)

### Lens Interface Description

Ok, let's get into it.
To trace rays in an optical system, we obviously need to build an optical system first. This part can be tedious: not only do you have to find the "lens prescription" you are looking for, you also need to parse it manually. For example, parsing the Nikon 28-75mm patent data might look something like this:

![](https://github.com/greje656/Questions/blob/master/images/lens-description.jpg)

There is no standard way of describing such systems. You may find all the information you need in a lens patent, but often (especially for older lenses) you end up staring at an old document that seems to be missing information the algorithm requires. For example, the Russian lens MIR-1 apparently produces beautiful lens flare, but the only lens description I could find for it was this:

*[MIP.1B manual](http://allphotolenses.com/public/files/pdfs/ce6dd287abeae4f6a6716e27f0f82e41.pdf)*
![](https://github.com/greje656/Questions/blob/master/images/mir-1.jpg)

### Ray Tracing

Once you have parsed your lens description into something your trace algorithm can consume, you can start to ray trace. The idea is to initialize a tessellated patch at the camera's light entry point and trace each of its points in the direction of the incoming light. There are a couple of subtleties to note regarding the tracing algorithm.

First, when a ray misses a lens component, the ray-tracing routine isn't necessarily stopped. Instead, if the ray can continue along a meaningful path, the trace continues until the ray reaches the sensor. Only if the ray misses the sphere formed by the radius of the lens do we break out of the routine. The idea is to get as many traced points as possible to reach the sensor, so that the interpolated data remains as continuous as possible. Each ray tracks the maximum relative distance it had from a lens component while tracing through the interface.
This relative distance will be used in the pixel shader later to determine whether a ray left the interface.

*Relative distance visualized as a green/orange gradient (black means the ray missed the lens component completely).*
![](https://github.com/greje656/Questions/blob/master/images/trace-05.jpg)

Secondly, a ray bundle carries a fixed amount of energy, so it is important to account for the distortion of the bundle's area that occurs during tracing. In the paper, the author states:

*"At each vertex, we store the average value of its surrounding neighbours. The regular grid of rays, combined with the transform feedback (or the stream-out) mechanism of modern graphics hardware, makes this lookup of neighbouring quad values very easy"*

I don't understand how transform feedback, along with the adjacency information available in the geometry shader, could be enough to provide the information of the four quads surrounding a vertex (if you know, please leave a comment). Luckily we now have compute and UAVs, which turn this into a fairly trivial problem. Currently I only calculate an approximation of the surrounding areas by assuming the neighbouring quads are roughly parallelograms; I estimate their bases and heights as the average lengths of their top/bottom and left/right segments. The results show up as caustics forming on the sensor, where some bundles converge into tighter patches while others dilate:

![](https://github.com/greje656/Questions/blob/master/images/lens-area.jpg)

This works fairly well but is [expensive](https://github.com/greje656/PhysicallyBasedLensFlare/blob/master/Lens/lens.hlsl#L198), something I intend to improve in the future.

Now that we have a traced patch, we need to make some sense of it. The patch "as is" can look intimidating at first: due to early exits of some rays, the final vertices can sometimes look like something went terribly wrong.
Here is a particularly distorted ghost:

![](https://github.com/greje656/Questions/blob/master/images/discard03.jpg)

The first thing to do is discard the pixels that exited the lens system:

~~~~
float intensity1 = max_relative_distance < 1.0f;
float intensity = intensity1;
if (intensity == 0.f) discard;
~~~~

![](https://github.com/greje656/Questions/blob/master/images/discard04.jpg)

Then we can discard the rays that had no energy as they entered the system to begin with (say, outside the sun disk):

~~~~
float lens_distance = length(entry_coordinates.xy);
float sun_disk = 1 - saturate((lens_distance - 1.f + fade)/fade);
sun_disk = smoothstep(0, 1, sun_disk);
...
float intensity2 = sun_disk;
float intensity = intensity1 * intensity2;
if (intensity == 0.f) discard;
~~~~

![](https://github.com/greje656/Questions/blob/master/images/discard05.jpg)

Then we can discard the rays that were blocked by the aperture:

~~~~
...
float intensity3 = aperture_sample;
float intensity = intensity1 * intensity2 * intensity3;
if (intensity == 0.f) discard;
~~~~

![](https://github.com/greje656/Questions/blob/master/images/discard06.jpg)

Finally, we adjust the radiance of the beams based on their final areas:

~~~~
...
float intensity4 = (original_area/(new_area + eps)) * energy;
float intensity = intensity1 * intensity2 * intensity3 * intensity4;
if (intensity == 0.f) discard;
~~~~

![](https://github.com/greje656/Questions/blob/master/images/discard07.jpg)

The final value is the RGB reflectance of the ghost modulated by the incoming light color:

~~~~
float3 color = intensity * input.reflectance.xyz * TemperatureToColor(INCOMING_LIGHT_TEMP);
~~~~

### Aperture

*[6iee](http://6iee.com/755819.html)*
![](https://github.com/greje656/Questions/blob/master/images/apertures4.jpg)

The aperture shape is built procedurally. As suggested by [Padraic Hennessy's blog](https://placeholderart.wordpress.com/2015/01/19/implementation-notes-physically-based-lens-flares/), I use a signed distance field confined by "n" segments and threshold it against some distance value. I also experimented with approximating the light diffraction that occurs at the edge of the aperture blades using a [simple function](https://www.desmos.com/calculator/munv7q2ez3):

![](https://github.com/greje656/Questions/blob/master/images/apertures1.jpg)

Finally, I offset the signed distance field with a repeating sine function, which can give curved aperture blades:

![](https://github.com/greje656/Questions/blob/master/images/apertures2.jpg)

### Starburst

The starburst phenomenon is caused by light diffracting as it passes through the small aperture hole, a phenomenon known as single-slit diffraction. The author got really convincing results simulating this with the Fraunhofer approximation. The challenge with this approach is that it requires bringing the aperture texture into Fourier space, which is not trivial.
In previous projects I used CUDA's math library to perform the FFT of a signal, but since the goal is to bring this into Stingray I didn't want such a dependency. Luckily I found this little gem posted by [Joseph S. from Intel](https://software.intel.com/en-us/articles/fast-fourier-transform-for-image-processing-in-directx-11). He provides a clean and elegant compute implementation of the butterfly-pass method, which brings a signal to and from Fourier space. Using it, I can feed in the aperture shape and extract the Fourier power spectrum:

![](https://github.com/greje656/Questions/blob/master/images/starburst04.jpg)

This spectrum needs to be filtered further in order to look like a starburst. This is where the Fraunhofer approximation comes in. The idea is to reconstruct the diffraction of white light by summing the diffraction patterns of multiple wavelengths. The key observation is that the same Fourier signal can be used for all wavelengths; the only thing needed is to scale the sampling coordinates of the Fourier power spectrum: (x0, y0) = (u, v)·λ·z0.

350nm/435nm/525nm/700nm
![](https://github.com/greje656/Questions/blob/master/images/starburst05.jpg)

Summing up the wavelengths gives the starburst image. To get more interesting results I apply an extra filtering step: a spiral pattern mixed with a small rotation to get rid of any leftover radial ringing artifacts (judging by the author's starburst results, I suspect this is a step they are also doing).

![](https://github.com/greje656/Questions/blob/master/images/starburst06.jpg)

### Anti-Reflection Coating

While some appreciate the artistic aspect of lens flare, lens manufacturers work hard to minimize it by applying anti-reflection coatings to their lenses. The coating applied to each lens is usually designed to minimize the reflection of a specific wavelength.
Coatings are defined by their thickness and index of refraction. Given the wavelength to minimize reflections for, and the IORs of the two media involved in the reflection (say n0 and n2), the ideal IOR (n1) and thickness (d) of the coating are n1 = sqrt(n0·n2) and d = λ/(4·n1). This is known as a quarter-wavelength anti-reflection coating. I've found [this site](http://www.pveducation.org/pvcdrom/anti-reflection-coatings) very helpful for understanding this phenomenon.

In the current implementation each lens coating specifies a wavelength the coating should be optimized for, and the ideal thickness and IOR are used by default. I added a controllable offset to thicken the AR coating layer in order to conveniently reduce its anti-reflection properties:

No AR coating:
![](https://github.com/greje656/Questions/blob/master/images/arc01.jpg)

Ideal AR coating:
![](https://github.com/greje656/Questions/blob/master/images/arc02.jpg)

AR coating with offset thickness:
![](https://github.com/greje656/Questions/blob/master/images/arc03.jpg)

### Optimisations

Currently the full cost of the effect for a Nikon 28-75mm lens is xxxms. The performance degrades as the sun disk is made bigger, since that results in more and more overshading during the rasterisation of each ghost. With a simpler lens interface, like the 1955 Angenieux, the cost decreases significantly. In the current implementation every possible "two bounce ghost" is traced and drawn. For a lens system like the Nikon 28-75mm, which has 27 lens components, that's n!/(r!(n-r)!) = 351 ghosts. It's easy to see that this number can [increase dramatically](https://www.desmos.com/calculator/rsrjo1mhy1) with the number of components.

An obvious optimization would be to skip ghosts whose intensities are so low that their contributions are imperceptible.
Using compute and DrawIndirect, it would be fairly simple to first run a coarse pass and use it to cull non-contributing ghosts. This would dramatically reduce the compute and rasterization pressure on the GPU; something I intend to do in the future.

### Conclusion

I'm not sure this approach has ever been used in a game; it would probably be hard to justify its heavy cost. I feel it would have a better use case in pre-visualization, where a director might be interested in early feedback on how a certain lens might behave in a shot.

*[Wallup](http://wallup.net/car-sunset-lexus/)*
![](https://github.com/greje656/Questions/blob/master/images/example03.jpg)

*[Wallpapers Web](http://www.wallpapers-web.com/sunset-field-wallpapers/5485554.html/)*
![](https://github.com/greje656/Questions/blob/master/images/example02.jpg)

Finally, be aware that the author has filed a patent for the algorithm described in his paper, which may put limits on how you may use parts of what is described in my post. Please contact the [paper's author](http://resources.mpi-inf.mpg.de/lensflareRendering/) for more information on what restrictions might be in place.

/reflections.md:
--------------------------------------------------------------------------------

# Reprojecting reflections

Screen space reflections are such a pain. When combined with TAA they are even harder to manage. Ray tracing against jittered depth/normal g-buffers can easily cause reflection rays to have widely different intersection points from frame to frame. When using neighborhood clamping, it becomes difficult to handle the flickering caused by excessive clipping due to the high variance of the SSR signal. On top of this, reflections are very hard to reproject.
Since they are view dependent, simply fetching the motion vector of the current pixel tends to make the reprojection "smudge" under camera motion.

![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png)

Last year I spent some time trying to understand this problem a little more. I first drew a ray diagram describing how a reflection could be reprojected in theory:

1) Retrieve the surface motion vector (ms) corresponding to the reflection incidence point (v0)
2) Reproject the incidence point using (ms)
3) Using the previous depth buffer, reconstruct the previous reflection incidence point (v1)
4) Retrieve the motion vector (mr) corresponding to the reflected point (p0)
5) Reproject the reflection point using (mr)
6) Using the depth buffer history, reconstruct the previous reflection point (p1)
7) Using the previous view matrix transform, reconstruct the previous surface normal of the incidence point (n1)
8) Project the camera position (deye) and the reconstructed reflection point (dp1) onto the previous plane (defined by surface normal n1 and surface point v1)
9) Solve for the position of the previous reflection point (r) knowing (deye) and (dp1)
10) Finally, using the previous view-projection matrix, evaluate (r) in the previous reflection buffer.
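The solve in steps 8 and 9 reduces to similar triangles: the mirror point splits the segment between the two projected points in the ratio of their distances to the plane. A minimal sketch with made-up inputs (Python; `solve_mirror_point` is a hypothetical helper name, not from the actual shader):

```python
def solve_mirror_point(eye, p1, n, v1):
    # Signed distances from the eye and the previous reflection point to
    # the previous plane (normal n, point v1; n assumed to be unit length).
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    d_eye = dot([e - v for e, v in zip(eye, v1)], n)
    d_p1 = dot([p - v for p, v in zip(p1, v1)], n)
    # Step 8: project both points onto the plane.
    proj_eye = [e - d_eye * c for e, c in zip(eye, n)]
    proj_p1 = [p - d_p1 * c for p, c in zip(p1, n)]
    # Step 9: equal incident/exit angles mean the mirror point divides the
    # segment between the two projections in the ratio d_eye : d_p1.
    t = d_eye / (d_eye + d_p1)
    return [a + t * (b - a) for a, b in zip(proj_eye, proj_p1)]

# Mirror plane z = 0, eye 1 unit above it, reflected point at height 2:
r = solve_mirror_point((0.0, 0.0, 1.0), (2.0, 0.0, 2.0),
                       (0.0, 0.0, 1.0), (0.0, 0.0, 0.0))
```

For these inputs the mirror point lands a third of the way along the plane toward the reflected point's projection, which is easy to verify by checking that the incident and exit angles at `r` are equal.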
![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png)

By adding to Stingray a copy of the previous frame's depth buffer, and using the previous view-projection matrix, I was able to confirm that this approach could successfully reproject reflections:

![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png)

Ghosting was definitely minimised under camera motion (note that neighborhood clamping is disabled here to better visualize the success of the reprojection):

![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png)

Unfortunately, keeping a copy of the depth buffer is currently not a feasible or appealing solution in a lot of scenarios. But it was a good exercise to understand the problem.

So instead I tried a different approach. The new idea was to pick a few reprojection vectors that are likely to be meaningful in the context of a reflection. Looking at the cases of camera rotation, camera translation, and object motion, we can build three "interesting" vectors to consider:

We then declare the vector with the smallest magnitude to be the most likely successful reprojection vector. This simple idea alone improved the reprojection of the SSR buffer quite significantly:

![Imgur](http://i.imgur.com/QURbYF0.gifv)

Note that when casting multiple rays per pixel, averaging the sum of all reprojection vectors still gave better results than what we had previously.

Screen space reflections are one of the most difficult screen space effects I've had to deal with. They are plagued with artifacts which can often be difficult to explain or understand. In the last couple of years I've seen people propose really creative ways to minimize some of the artifacts that are inherent to SSR.
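For illustration, the smallest-magnitude selection described above is trivial to sketch (Python; the candidate vectors are made-up values that would be computed per pixel in practice):

```python
def pick_reprojection_vector(candidates):
    # Heuristic: the candidate with the smallest screen-space magnitude is
    # declared the most likely valid reprojection for the reflected feature.
    return min(candidates, key=lambda v: v[0] * v[0] + v[1] * v[1])

# Hypothetical candidates (screen UV offsets per frame) built from camera
# rotation, camera translation, and object motion respectively:
candidates = [(0.020, -0.010), (0.005, 0.000), (0.030, 0.040)]
best = pick_reprojection_vector(candidates)  # -> (0.005, 0.0)
```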
![](https://github.com/greje656/Questions/blob/master/images/foreground-original.png)

The results I seem to get for the foreground layer are more like this (note this is using a "background range" of 2.5m, so ~100in).

foreground.rgb/foreground.a:
![](https://github.com/greje656/Questions/blob/master/images/foreground.jpg)
Notice how I don't get any sample pixels whose weight is low? I'm wondering if I misunderstood something crucial here (i.e. no black pixels like in your result).

The weights of the samples look like this:
![](https://github.com/greje656/Questions/blob/master/images/foreground-weights.jpg)

Similarly, the background looks like this:

background.rgb/background.a:
![](https://github.com/greje656/Questions/blob/master/images/background.jpg)

And the background weights:
![](https://github.com/greje656/Questions/blob/master/images/background-weights.jpg)

The normalized alpha value I get looks like this:
![](https://github.com/greje656/Questions/blob/master/images/alpha.jpg)

And finally "lerp(background/background.a, foreground/foreground.a, alpha)":
![](https://github.com/greje656/Questions/blob/master/images/results.jpg)

I'm wondering if maybe the foreground/background layers should be weighted like this? (... nah, that doesn't make sense! I'm so confused)

foreground.rgb/SAMPLE_COUNT:
![](https://github.com/greje656/Questions/blob/master/images/foreground2.jpg)

background.rgb/SAMPLE_COUNT:
![](https://github.com/greje656/Questions/blob/master/images/background2.jpg)

#### Question 2)
Should the maxCoCMinDepth tiles be sampled using linear interpolation, or point sampling? Intuitively it feels like it should be point sampling, but I see artifacts at the edges of the tiles if I do.
Note that with tiles of 20x20 pixels I'm making the assumption that the maximum kernel size allowed should be 10x10px. Maybe that is a false assumption?

Linear (currently what I'm using):
![](https://github.com/greje656/Questions/blob/master/images/tile-min-depth-linear.jpg)

Point:
![](https://github.com/greje656/Questions/blob/master/images/tile-min-depth-point.jpg)

Point sampling artifacts:
![](https://github.com/greje656/Questions/blob/master/images/artifacts.jpg)

#### Question 3)
Finally (this one might be really dumb!): does this need to differentiate samples that are behind/in front of the camera's focus point? i.e. does this need signed CoCs? I think it does, but that's not 100% clear to me. Currently I've implemented a solution that uses abs(coc), but this doesn't work for focused objects on top of unfocused ones:
![](https://github.com/greje656/Questions/blob/master/images/results-bad.jpg)
It's probable that we DO need to use signed circles of confusion, but I just wanted to confirm that I'm not missing something obvious here (I'm sure I am).

#### Code used to generate images

~~~~
#define NUM_SAMPLES 5
#define COC_SIZE_IN_PIXELS 10
#define BACKGROUND_RANGE 2.5

struct PresortParams {
    float coc;
    float backgroundWeight;
    float foregroundWeight;
};

float2 DepthCmp2(float depth, float closestTileDepth) {
    float d = saturate((depth - closestTileDepth)/BACKGROUND_RANGE);
    float2 depthCmp;
    depthCmp.x = smoothstep(0.0, 1.0, d); // Background
    depthCmp.y = 1.0 - depthCmp.x;        // Foreground
    return depthCmp;
}

float SampleAlpha(float sampleCoc) {
    const float DOF_SINGLE_PIXEL_RADIUS = length(float2(0.5, 0.5));
    return min(
        rcp(PI * sampleCoc * sampleCoc),
        rcp(PI * DOF_SINGLE_PIXEL_RADIUS * DOF_SINGLE_PIXEL_RADIUS)
    );
}
~~~~
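To sanity-check my understanding of these weights, here is a scalar Python transcription of the DepthCmp2 and SampleAlpha functions above (a CPU-side sketch only, using the same BACKGROUND_RANGE of 2.5):

```python
import math

BACKGROUND_RANGE = 2.5

def smoothstep(e0, e1, x):
    # Same shape as the HLSL intrinsic.
    t = max(0.0, min(1.0, (x - e0) / (e1 - e0)))
    return t * t * (3.0 - 2.0 * t)

def depth_cmp2(depth, closest_tile_depth):
    # Same split as DepthCmp2: returns (background weight, foreground weight).
    d = max(0.0, min(1.0, (depth - closest_tile_depth) / BACKGROUND_RANGE))
    background = smoothstep(0.0, 1.0, d)
    return background, 1.0 - background

def sample_alpha(sample_coc):
    # Inverse-area weight, clamped at a single-pixel radius like SampleAlpha.
    single_pixel_radius = math.hypot(0.5, 0.5)
    clamp = 1.0 / (math.pi * single_pixel_radius * single_pixel_radius)
    if sample_coc <= 0.0:
        return clamp
    return min(1.0 / (math.pi * sample_coc * sample_coc), clamp)

# A sample at the tile's closest depth is pure foreground:
bg, fg = depth_cmp2(10.0, 10.0)
# A sample BACKGROUND_RANGE behind it is pure background:
bg2, fg2 = depth_cmp2(12.5, 10.0)
# A larger CoC spreads the same energy over more pixels, so its weight drops:
w_small, w_big = sample_alpha(1.0), sample_alpha(10.0)
```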
~~~~
PresortParams GetPresortParams(float sample_coc, float sample_depth, float closestTileDepth) {
    PresortParams presort_params;

    presort_params.coc = sample_coc;
    presort_params.backgroundWeight = SampleAlpha(sample_coc) * DepthCmp2(sample_depth, closestTileDepth).x;
    presort_params.foregroundWeight = SampleAlpha(sample_coc) * DepthCmp2(sample_depth, closestTileDepth).y;

    return presort_params;
}

float4 ps_main(PS_INPUT input) : SV_TARGET0 {

    float tileMaxCoc = TEX2DLOD(input_texture3, input.uv, 0).g * COC_SIZE_IN_PIXELS;

    float SAMPLE_COUNT = 0.0;
    float4 background = 0;
    float4 foreground = 0;

    for (int x = -NUM_SAMPLES; x <= NUM_SAMPLES; ++x) {
        for (int y = -NUM_SAMPLES; y <= NUM_SAMPLES; ++y) {

            float2 samplePos = get_sampling_pos(x, y);
            float2 sampleUV = input.uv + samplePos/back_buffer_size * tileMaxCoc;

            float3 sampleColor = TEX2DLOD(input_texture0, sampleUV, 0).rgb;
            float sampleCoc = decode_coc(TEX2DLOD(input_texture1, sampleUV, 0).r) * COC_SIZE_IN_PIXELS;
            float sampleDepth = linearize_depth(TEX2DLOD(input_texture2, sampleUV, 0).r);
            float closestTileDepth = TEX2DLOD(input_texture3, input.uv, 0).r;

            PresortParams samplePresortParams = GetPresortParams(sampleCoc, sampleDepth, closestTileDepth);

            background += samplePresortParams.backgroundWeight * float4(sampleColor, 1.f);
            foreground += samplePresortParams.foregroundWeight * float4(sampleColor, 1.f);

            SAMPLE_COUNT += 1;
        }
    }

    float alpha = saturate(2.0 * (1.0 / SAMPLE_COUNT) * (1.0 / SampleAlpha(tileMaxCoc)) * foreground.a);
    float3 finalColor = lerp(background/background.a, foreground/foreground.a, alpha);

    return float4(finalColor, 1);
}
~~~~

/river.md:
-------------------------------------------------------------------------------- 1 | # Real time river editor # 2 | 3 | I've always been interested in water in games. I played From Dust a lot when it came out and it fascinated me. More recently I was also very impressed with the mud and water systems of Spintires MudRunner and Uncharted 4. I was watching the Siggraph 2016 talk "Rendering Rapids in Uncharted 4" when something really caught my attention: a small demo of an idea called "Wave Particles", developed by Cem Yuksel et al. 4 | 5 | ![](images/slide1.png) 6 | 7 | I was amazed at the complexity that could arise from the simple idea of stacking moving wave particles on top of each other and had to try it for myself. I set up a small d3d12 app, and even without any lighting it was obvious that it was possible to build a system around this that was very intuitive and fun to use: 8 | 9 | [![](http://img.youtube.com/vi/MIGhIPTaTDI/0.jpg)](http://www.youtube.com/watch?v=MIGhIPTaTDI) 10 | 11 | Normally for open water the Phillips spectrum is used to distribute a number of wavelengths and amplitudes, but I've found that using a logarithmic distribution made it easier to reason about the coverage of the waves on a given surface. 12 | 13 | [[image of wave distribution]] 14 | 15 | One thing I tried early on was to combine the idea of wave particles with the wave packets presented by Stefan Jeschke at Siggraph 2017. A wave packet, roughly speaking, is similar to a wave particle, but it carries multiple wave fronts with it: 16 | 17 | [[image of wave packet]] 18 | 19 | The hope was that these more complex particles could capture subtle water surface behaviors which would not be possible with simple wave particles. While it was an interesting idea, I did not have much luck with them.
This was mainly because it was difficult to gather the longitudinal forces of the waves, but also because they prevented the use of an important optimization that plain wave particles offer (http://www.cemyuksel.com/research/waveparticles/waveparticles_sketch_slides.pdf). So I ended up abandoning this effort early on and stuck to plain wave particles: 20 | 21 | [![](http://img.youtube.com/vi/I1l587DTKIE/0.jpg)](http://www.youtube.com/watch?v=I1l587DTKIE) 22 | 23 | Once I had the lighting and modelling in a decent state, I turned my attention to generating an interesting vector field which could drive the motion of these waves. 24 | 25 | I first looked at the distance field approach made popular by Valve with Portal 2. Unfortunately this approach lacks the ability to create interesting vortices, which was a property I was hoping to have. I also looked at Jos Stam's stable fluids method. Since this is an actual simulation, the method captures some interesting fluid properties: 26 | 27 | [![](http://img.youtube.com/vi/V9bH3QCu90w/0.jpg)](http://www.youtube.com/watch?v=V9bH3QCu90w) 28 | 29 | But then I found out about LBM (lattice Boltzmann method) solvers. The more I looked into these, the more interesting properties they revealed. They seemed straightforward to parallelize and port to the GPU. They captured vorticity effects really well, and on top of that they could be used to solve the Shallow Water Equation. Some configurations even appeared to converge to a stable state, which seemed like a desirable property if the vector field needs to be stored offline. 30 | 31 | I always like to start implementing these complex systems on the CPU, since it allows me to set breakpoints and investigate the simulated data much more easily than when running on a GPU. But even at a very low resolution and frame rate, I could see that this method was promising.
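Since the core LBM update is so compact, its structure is worth sketching. The following is a minimal, generic D2Q9 BGK collide-and-stream step in Python/numpy. This is my own illustrative sketch, not the river tool's code; the shallow-water variant described in Zhou's book replaces the equilibrium with a depth-based one and adds force terms.

```python
import numpy as np

# D2Q9 lattice: 9 discrete velocities and their weights.
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    # Standard second-order BGK equilibrium distribution.
    feq = np.empty((9,) + rho.shape)
    usq = ux * ux + uy * uy
    for i, (ex, ey) in enumerate(E):
        eu = ex * ux + ey * uy
        feq[i] = W[i] * rho * (1 + 3 * eu + 4.5 * eu * eu - 1.5 * usq)
    return feq

def lbm_step(f, tau=0.6):
    # Macroscopic density and velocity from the populations.
    rho = f.sum(axis=0)
    ux = (f * E[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * E[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision: relax each population toward equilibrium.
    f = f + (equilibrium(rho, ux, uy) - f) / tau
    # Streaming: shift each population along its lattice velocity (periodic).
    for i, (ex, ey) in enumerate(E):
        f[i] = np.roll(np.roll(f[i], ex, axis=1), ey, axis=0)
    return f
```

Each population only ever reads a neighbor at a fixed offset, which is what makes the method so straightforward to port to a compute shader.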
32 | 33 | [![](http://img.youtube.com/vi/4aXSyiukvOI/0.jpg)](http://www.youtube.com/watch?v=4aXSyiukvOI) 34 | 35 | For more information on how to solve the Shallow Water Equation using the LBM method, I recommend this fantastic book by Dr. Jian Guo Zhou (https://www.springer.com/gp/book/9783540407461). 36 | 37 | I then ported the code to a compute shader, which allowed me to run the simulation at a much higher resolution: 38 | 39 | [![](http://img.youtube.com/vi/seChiG6V94Q/0.jpg)](http://www.youtube.com/watch?v=seChiG6V94Q) 40 | 41 | I was sold on the benefits of the LBM solver very early. The main drawback is the memory required to run it. The method needs to operate on 9 float components (so far 16F per component seems to be enough). To make it worse, I'm currently using a ping/pong scheme, which means twice the amount of data. For example, running the simulation on a 2d grid that's 256x128 takes (256x128 pixels) x (9 components) x (16 bits) x (2 for ping/ponging) ≈ 1.1 MB. 42 | 43 | The vector field is then used to advect some data, mostly foam related. At the moment the foam amount and 2 sets of uv coordinates are advected (5 more float components). We need two sets of uvs because we fetch the foam texture with each set and continually blend back and forth between the two samples as they stretch because of the advection. 44 | 45 | [image of advected textures only] 46 | 47 | Advecting the wave heights and normal maps didn't prove very successful. A very visible and annoying pulsing appeared. The pulsing could be minimized using some Perlin noise to add variation, but it was a difficult battle to fight. There is a lot of research on this subject, especially from Fabrice Neyret et al., but I haven't seen anything that comes across as a silver bullet. This idea 48 | http://www-evasion.imag.fr/Publications/2003/Ney03/ basically advects 3 sets of uv coordinates and lets you choose which set yields the least distortion.
This idea (https://www.youtube.com/watch?v=HXrCSJqXcl0) advects particles in the velocity field and then splats out a uv set with very limited stretching. Incredible results, but very expensive to calculate. 49 | 50 | [![](http://img.youtube.com/vi/NqSOnmh2_do/0.jpg)](http://www.youtube.com/watch?v=NqSOnmh2_do) 51 | 52 | Fortunately, at roughly the same time I was tackling these issues, Stefan Jeschke presented yet another new idea, called Wave Profile Buffers, during his Siggraph 2018 talk. As soon as I saw the presentation I started reading the paper. One key property of the wave profiles is that they don't require uv coordinates. To sample them you only need a world position, a time value, and a wave direction. This means that the wave fronts generated by sampling these wave profiles are spatially and temporally coherent, i.e. no stretching or tearing! 53 | 54 | [![](http://img.youtube.com/vi/iGu_1Yvkukg/0.jpg)](http://www.youtube.com/watch?v=iGu_1Yvkukg) 55 | 56 | ## A note on lighting ## 57 | The lighting of the water is fairly standard. It basically boils down to a specular term for the water surface and a scattering term which approximates the light scattered in the water volume back towards the eye. For the specular term I'm simply sampling a cube map which is generated using an atmospheric shader. There's a really good blog series on this subject by Alan Zucconi that I love: https://www.alanzucconi.com/2017/10/10/atmospheric-scattering-1/. But the specular term could be anything your pipeline is currently using. 58 | 59 | For the volumetric light scattering approximation, I define an imaginary point halfway between the water surface being shaded and the floor directly underneath it. I then light this point with two imaginary infinite planes which act as large area lights. The luminances of these two planes are defined as WaterColor*(N*L)*SunColor and GroundColor*(N*L)*SunColor*e^(-depth) respectively.
I then solve the two exponential integrals with an analytical solution and average them. 60 | 61 | When the sun hits a certain grazing angle threshold, I also refract a ray, fetch a sky value, and inject it as extra water scattering on the wave tips. Something I haven't tried yet but that might be interesting would be to support self-occlusion of the water surface by doing some screen space shadow tracing, but for now the wave heights I'm toying with are too small to see any interesting benefit. For larger waves it seems like self-occlusion could definitely be interesting (http://www.rutter.ca/files/large/0405f593e9fabc4). 62 | 63 | Future work: 64 | There is a lot of work left to do for this to be production ready. The next step will be to implement a system which will let me paint large river canals. I have been thinking about using some kind of virtual texture setup where the velocity field could be baked along with some initial conditions which could be used to bootstrap the advection for a tile as the camera approaches a certain area. This would eliminate the cost of running the simulation at runtime. 65 | 66 | Also, I'm interested in seeing how far one could go with pre-generating Wave Profile Buffers, which would make them much more feasible on a tighter GPU budget. -------------------------------------------------------------------------------- /validation.md: -------------------------------------------------------------------------------- 1 | # Validating materials and lights in Stingray # 2 | 3 | ![](images/comp-01.gif) 4 | 5 | Stingray 1.9 is just around the corner, and with it will come our new physical lights. I wanted to write a little bit about the validation process we went through to increase our confidence in the behaviour of our materials and lights.
6 | 7 | Early on we were quite set on building a small controlled "light room", similar to what the [Fox Engine team presented at GDC](https://youtu.be/FQMbxzTUuSg?t=19m25s), as a validation process. But while this seemed like a fantastic way to confirm that the entire pipeline gives plausible results, it felt like identifying the source of discontinuities when comparing photographs against renders might involve a lot of guesswork. So we decided to delay validation through a controlled light room and started thinking about comparing our results with a high quality offline renderer instead. Since [SolidAngle](https://www.solidangle.com/) joined Autodesk last year and we had access to an [Arnold](https://www.solidangle.com/arnold/) license server, it seemed like a good candidate. Note that the Arnold SDK is extremely easy to use and can be [downloaded](https://www.solidangle.com/arnold/download) for free. If you don't have a license you still have access to all the features; the only limitation is that the rendered frames are watermarked. 8 | 9 | We started writing a Stingray plugin that supported simple scene reflection into Arnold. We also implemented a custom Arnold Output Driver which allowed us to forward Arnold's linear data directly into the Stingray viewport, where it would be gamma corrected and tonemapped by Stingray (eliminating as many potential sources of error as possible). 10 | 11 | ### Material parameters mapping ### 12 | The trickiest part of the process was finding an Arnold material we could validate against. When we started this work we used Arnold 4.3 and realized early that Arnold's [Standard shader](https://support.solidangle.com/display/AFMUG/Standard) didn't map very well to the Metallic/Roughness model.
We had more luck using the [alSurface shader](http://www.anderslanglands.com/alshaders/alSurface.html) with the following mapping: 13 | 14 | ~~~~ 15 | // "alSurface" 16 | // ============================================================================================== 17 | AiNodeSetRGB(surface_shader, "diffuseColor", color.x, color.y, color.z); 18 | AiNodeSetInt(surface_shader, "specular1FresnelMode", 0); 19 | AiNodeSetInt(surface_shader, "specular1Distribution", 1); 20 | AiNodeSetFlt(surface_shader, "specular1Strength", 1.0f - metallic); 21 | AiNodeSetRGB(surface_shader, "specular1Color", white.x, white.y, white.z); 22 | AiNodeSetFlt(surface_shader, "specular1Roughness", roughness); 23 | AiNodeSetFlt(surface_shader, "specular1Ior", 1.5f); // solving F0 = ((n-1)/(n+1))^2 = 0.04 for n gives 1.5 24 | AiNodeSetRGB(surface_shader, "specular1Reflectivity", white.x, white.y, white.z); 25 | AiNodeSetRGB(surface_shader, "specular1EdgeTint", white.x, white.y, white.z); 26 | 27 | AiNodeSetInt(surface_shader, "specular2FresnelMode", 1); 28 | AiNodeSetInt(surface_shader, "specular2Distribution", 1); 29 | AiNodeSetFlt(surface_shader, "specular2Strength", metallic); 30 | AiNodeSetRGB(surface_shader, "specular2Color", color.x, color.y, color.z); 31 | AiNodeSetFlt(surface_shader, "specular2Roughness", roughness); 32 | AiNodeSetRGB(surface_shader, "specular2Reflectivity", white.x, white.y, white.z); 33 | AiNodeSetRGB(surface_shader, "specular2EdgeTint", white.x, white.y, white.z); 34 | ~~~~ 35 | 36 | Stingray VS Arnold: roughness = 0, metallicness = [0, 1] 37 | ![](images/res1.jpg) 38 | 39 | Stingray VS Arnold: metallicness = 1, roughness = [0, 1] 40 | ![](images/res3.jpg) 41 | 42 | Halfway through the validation process Arnold 5.0 was released, and with it came the new [Standard Surface shader](https://support.solidangle.com/display/A5AFMUG/Standard+Surface), which is based on a Metalness/Roughness workflow.
This allowed for a much simpler mapping: 43 | 44 | ~~~~ 45 | // "aiStandardSurface" 46 | // ============================================================================================== 47 | AiNodeSetFlt(standard_shader, "base", 1.f); 48 | AiNodeSetRGB(standard_shader, "base_color", color.x, color.y, color.z); 49 | AiNodeSetFlt(standard_shader, "diffuse_roughness", 0.f); // Use Lambert for diffuse 50 | 51 | AiNodeSetFlt(standard_shader, "specular", 1.f); 52 | AiNodeSetFlt(standard_shader, "specular_IOR", 1.5f); // solving F0 = ((n-1)/(n+1))^2 = 0.04 for n gives 1.5 53 | AiNodeSetRGB(standard_shader, "specular_color", 1, 1, 1); 54 | AiNodeSetFlt(standard_shader, "specular_roughness", roughness); 55 | AiNodeSetFlt(standard_shader, "metalness", metallic); 56 | ~~~~ 57 | 58 | ### Investigating material differences ### 59 | 60 | The first thing we noticed was an excess in reflection intensity for reflections with large incident angles. Arnold supports [Light Path Expressions](https://support.solidangle.com/display/A5AFMUG/Introduction+to+Light+Path+Expressions), which made it very easy to compare and identify the term causing the differences. In this particular case we quickly identified that we had an energy conservation problem. Specifically, the contribution from the Fresnel reflections was not removed from the diffuse contribution: 61 | 62 | ![](images/fix1.jpg) 63 | 64 | Scenes with a lot of smooth reflective surfaces demonstrate the impact of this issue noticeably: 65 | 66 | ![](images/fixc.gif) 67 | ![](images/fixa.gif) 68 | ![](images/fixb.gif) 69 | 70 | Another source of differences and confusion came from the tint of the Fresnel term for metallic surfaces. Different shaders I investigated had different behaviors: some tinted the Fresnel term with the base color while others didn't: 71 | 72 | ![](images/metal3.jpg) 73 | 74 | It wasn't clear to me how Fresnel's law of reflection applied to metals.
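For reference, the behavior in question can be made concrete with Schlick's approximation (a standard textbook approximation, not the exact shader code used here): the tint lives entirely in F0, while the reflectance at grazing incidence rises to 1 in every channel. The gold-ish F0 below is illustrative, not measured data.

```python
def schlick(f0, cos_theta):
    # Schlick's Fresnel approximation, evaluated per channel:
    # returns f0 at normal incidence (cos_theta = 1)
    # and 1.0 at grazing incidence (cos_theta = 0).
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

gold_f0 = (1.00, 0.71, 0.29)                  # illustrative tinted F0 for a metal
normal = [schlick(c, 1.0) for c in gold_f0]   # keeps the tint
grazing = [schlick(c, 0.0) for c in gold_f0]  # effectively white in all channels
```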
I asked on Twitter what people's thoughts were on this and got this simple and elegant [claim](https://twitter.com/BrookeHodgman/status/884532159331028992) made by Brooke Hodgman: *"Metalic reflections are coloured because their Fresnel is wavelength varying, but Fresnel still goes to 1 at 90deg for every wavelength"*. This convinced me instantly that the correct thing to do was indeed to use an un-tinted Fresnel contribution regardless of the metallicness of the material. I later found this [graph](https://en.wikipedia.org/wiki/Reflectance) which also confirmed it: 75 | 76 | ![](images/reflectance.jpg) 77 | 78 | For the Fresnel term we use a pre-filtered Fresnel offset stored in a 2d lut (as proposed by Brian Karis in [Real Shading in Unreal Engine 4](http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_slides.pdf)). While results can diverge slightly from Arnold's Standard Surface shader (see "the effect of metalness" in Zap Andersson's [Physical Material Whitepaper](https://www.dropbox.com/s/jt8dk65u14n2mi5/Physical%20Material%20-%20Whitepaper%20-%201.01.pdf?dl=0)), in most cases we get an edge tint that is pretty close: 79 | 80 | ![](images/metal4.jpg) 81 | 82 | ### Investigating light differences ### 83 | 84 | With the BRDF validated, we could start looking into validating our physical lights. Stingray currently supports point, spot, and directional lights (with more to come). The main problem we discovered with our lights is that the attenuation function we use is a bit awkward. Specifically, we attenuate by I/(d+1)^2 as opposed to I/d^2 (where 'I' is the intensity of the light source and 'd' is the distance to the light source from the shaded point). The main reason behind this decision is to manage the overflow that could occur in the light accumulation buffer. Adding the +1 effectively clamps the maximum intensity of the light to the intensity set for the light itself, i.e.
as 'd' approaches zero, 'I' approaches the intensity set for that light (as opposed to infinity). Unfortunately this decision also means we can't get physically [correct light falloffs](https://www.desmos.com/calculator/jydb51epow) in a scene: 85 | 86 | ![](images/graph.gif) 87 | 88 | Even if we scale the intensity of the light to match the intensity at a certain distance (say 1m), we still have a different falloff curve than the physically correct attenuation. It's not too bad in a game context, but in the architectural world this is a bigger issue: 89 | 90 | ![](images/fix-int5.gif) 91 | 92 | This issue will be fixed in Stingray 1.10. Using I/(d+e)^2 (where 'e' is 1/max_value), along with an EV shift up and down while writing to and reading from the accumulation buffer as described by [Nathan Reed](http://www.reedbeta.com/blog/artist-friendly-hdr-with-exposure-values/), is a good step forward. 93 | 94 | Finally, we were also able to validate our IES profile parser/shader, and our color temperatures behaved as expected: 95 | 96 | ![](images/ies2.gif) 97 | 98 | ### Results and final thoughts ### 99 | 100 | ![](images/comp-01.gif) 101 | ![](images/comp-03.gif) 102 | 103 | Integrating a high quality offline renderer like Arnold has proven invaluable in the process of validating our lights in Stingray. A similar validation process could be applied to many other aspects of our rendering pipeline (antialiasing, refractive materials, fur, hair, post-effects, volumetrics, etc.). 104 | 105 | I also think it can be a very powerful tool for content creators to build intuition about the impact of indirect lighting in a particular scene. For example, in a simple level like this, adding a diffuse plane dramatically changes the lighting on the Buddha statue: 106 | 107 | ![](images/diffuse.gif) 108 | 109 | The next step is now to compare our results with photographs gathered from a controlled environment. To be continued...
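As a numerical footnote to the attenuation discussion above, a small standalone sketch (illustrative values, not engine code) shows both the clamping behavior of I/(d+1)^2 and why rescaling it cannot reproduce the inverse-square curve:

```python
def physical(intensity, d):
    # Physically correct inverse-square falloff; unbounded as d -> 0.
    return intensity / d ** 2

def clamped(intensity, d, e=1.0):
    # Stingray 1.9 attenuation: intensity / (d + e)^2.
    # At d = 0 this returns intensity / e^2 instead of diverging,
    # so with e = 1 the light never exceeds its set intensity.
    return intensity / (d + e) ** 2

# Scale the clamped curve so both curves agree at d = 1m...
scale = physical(100.0, 1.0) / clamped(100.0, 1.0)
# ...and they still disagree everywhere else (e.g. at d = 2m).
```

Matching the two curves at 1m requires a scale of 4, yet at 2m the scaled clamped falloff gives about 44.4 versus the physically correct 25, which is the divergence visible in the falloff graph above.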
--------------------------------------------------------------------------------