├── README.md
├── example
│   ├── README.md
│   ├── e1.png
│   ├── e2.png
│   ├── output1.png
│   ├── output2.png
│   ├── sample-00.png
│   ├── sample-01.png
│   ├── sample-02.png
│   ├── sample-03.png
│   ├── sample-04.png
│   ├── sample-05.png
│   ├── sample2-00.jpg
│   ├── sample2-01.jpg
│   ├── sample2-02.jpg
│   ├── sample2-03.jpg
│   ├── sample2-04.jpg
│   ├── sample2-05.jpg
│   └── sample2-06.jpg
└── hdr.py
/README.md:
--------------------------------------------------------------------------------
# high-dynamic-range-image
Creating an HDR image from an image stack with multiple exposures

## Introduction
The goal of this project is to recover high dynamic range radiance maps from photographs and to create an image that captures details from the entire dynamic range. The project has two parts: radiance map construction and tone mapping. For the first part, we implement the algorithm from [Debevec, Malik](http://www.pauldebevec.com/Research/HDR/debevec-siggraph97.pdf) to recover a high dynamic range radiance map. We then apply tone mapping and intensity adjustment to convert the radiance map into a displayable image.

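The whole pipeline is exposed through `computeHDR` in `hdr.py`. The snippet below is a minimal sketch of how it can be driven; the mapping of the sample files to exposure times (shortest to longest) is assumed, and the output file name is arbitrary.

```python
import cv2
import numpy as np
from hdr import computeHDR

# Exposure stack; assumed to be ordered from shortest to longest exposure.
filenames = ['example/sample-%02d.png' % i for i in range(6)]
images = [cv2.imread(f) for f in filenames]

# Exposure times in seconds for each image, in the same order.
exposure_times = np.array([1/160., 1/125., 1/80., 1/60., 1/40., 1/15.])

hdr = computeHDR(images, np.log(exposure_times))
cv2.imwrite('hdr_output.png', hdr)
```
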
## Algorithm Overview
### High Dynamic Range Radiance Map Construction
1. Film Response Curve Recovery
>The film response curve is a function that maps observed pixel values in an image to the log of exposure: g(Zij) = ln(Ei) + ln(tj). To recover the function g, we implement the least-squares solution from Debevec, which minimizes the objective reproduced below.

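>The first sum fits the data and the second enforces smoothness of g; this is the linear system assembled by `computeResponseCurve` in `hdr.py`, where N is the number of sampled pixel locations and the smoothing weight λ is set to 100:

$$O = \sum_{i=1}^{N}\sum_{j=1}^{P}\left\{w(Z_{ij})\left[g(Z_{ij}) - \ln E_i - \ln t_j\right]\right\}^2 + \lambda \sum_{z=Z_{min}+1}^{Z_{max}-1}\left[w(z)\,g''(z)\right]^2$$
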
>>g is the unknown response function

>>w is a linear weighting function: w(z) = z - Zmin for z <= (Zmin + Zmax)/2, and w(z) = Zmax - z otherwise. Without it, g is less smooth and fits the data more poorly near the extremes (Z = 0 or Z = 255), so Debevec introduces this weighting to emphasize the fitting and smoothness terms toward the middle of the curve.

>>t is the exposure time

>>E is the unknown radiance

>>Z is the observed pixel value

>>i is the pixel location index

>>j is the exposure index

>>P is the total number of exposures

>This response curve can be used to determine radiance values in any image acquired by the imaging process associated with g, not just the images used to recover the response curve.

2. High Dynamic Range Radiance Map Construction
>Once the response curve g is recovered, we can construct a radiance map based on the equation from Debevec.

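>For a single exposure j, the log radiance of a pixel follows by rearranging g(Zij) = ln(Ei) + ln(tj):

$$\ln E_i = g(Z_{ij}) - \ln t_j$$
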
>In order to reduce noise in the recovered radiance values, we use all the available exposures for a particular pixel to compute its radiance, based on equation 6 in Debevec.

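>Equation 6 averages the estimate over all P exposures, weighting each by w; this is what `computeRadianceMap` in `hdr.py` evaluates per pixel:

$$\ln E_i = \frac{\sum_{j=1}^{P} w(Z_{ij})\left(g(Z_{ij}) - \ln t_j\right)}{\sum_{j=1}^{P} w(Z_{ij})}$$
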
### Tone Mapping
>Global tone mapping: in this project, we use gamma correction as the global tone mapping operator. Each output pixel is proportional to the input pixel raised to the power of the inverse of gamma, and the final result is scaled to pixel values from 0 to 255.

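>In `hdr.py` this is the `globalToneMapping` step. The core operation is a single element-wise power; a minimal NumPy sketch (the code itself uses `cv2.pow`):

```python
import numpy as np

def gamma_correct(image, gamma=0.6):
    # Map the 0-255 image into [0, 1], then apply the inverse-gamma power curve.
    return np.power(image / 255.0, 1.0 / gamma)
```
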
### Color Adjustment
>To keep the HDR output as close to the input images as possible, we adjust the average intensity of each output channel (B, G, R) to match that of a template image. In general, we use the middle image of the exposure stack as the template, since it is usually the most representative of the ground truth.

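>This is what `intensityAdjustment` in `hdr.py` does: each channel is rescaled by the ratio of the template's channel mean to the output's channel mean. A minimal NumPy sketch:

```python
import numpy as np

def match_channel_means(image, template):
    # Scale each channel so its average intensity matches the template's.
    output = np.zeros_like(image, dtype=np.float64)
    for ch in range(image.shape[2]):
        scale = np.average(template[..., ch]) / np.average(image[..., ch])
        output[..., ch] = image[..., ch] * scale
    return output
```
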
## Result 1
### Original image
Image stack with exposures 1/160, 1/125, 1/80, 1/60, 1/40, and 1/15 sec.

### HDR image

## Result 2
### Original image
Image stack with exposures 1/400, 1/250, 1/100, 1/40, 1/25, 1/8, and 1/3 sec.

### HDR image
--------------------------------------------------------------------------------
/example/README.md:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/example/e1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/e1.png
--------------------------------------------------------------------------------
/example/e2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/e2.png
--------------------------------------------------------------------------------
/example/output1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/output1.png
--------------------------------------------------------------------------------
/example/output2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/output2.png
--------------------------------------------------------------------------------
/example/sample-00.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-00.png
--------------------------------------------------------------------------------
/example/sample-01.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-01.png
--------------------------------------------------------------------------------
/example/sample-02.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-02.png
--------------------------------------------------------------------------------
/example/sample-03.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-03.png
--------------------------------------------------------------------------------
/example/sample-04.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-04.png
--------------------------------------------------------------------------------
/example/sample-05.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample-05.png
--------------------------------------------------------------------------------
/example/sample2-00.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-00.jpg
--------------------------------------------------------------------------------
/example/sample2-01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-01.jpg
--------------------------------------------------------------------------------
/example/sample2-02.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-02.jpg
--------------------------------------------------------------------------------
/example/sample2-03.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-03.jpg
--------------------------------------------------------------------------------
/example/sample2-04.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-04.jpg
--------------------------------------------------------------------------------
/example/sample2-05.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-05.jpg
--------------------------------------------------------------------------------
/example/sample2-06.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/vivianhylee/high-dynamic-range-image/396124d56e9dce0285c3911f384227fbfc94ec24/example/sample2-06.jpg
--------------------------------------------------------------------------------
/hdr.py:
--------------------------------------------------------------------------------
import numpy as np
import cv2
import random


def linearWeight(pixel_value):
    """ Linear weighting function based on pixel intensity that reduces the
    weight of pixel values that are near saturation.

    Parameters
    ----------
    pixel_value : np.uint8
        A pixel intensity value from 0 to 255

    Returns
    -------
    weight : np.float64
        The weight corresponding to the input pixel intensity

    """
    z_min, z_max = 0., 255.
    if pixel_value <= (z_min + z_max) / 2:
        return pixel_value - z_min
    return z_max - pixel_value


def sampleIntensities(images):
    """Randomly sample pixel intensities from the exposure stack.

    Parameters
    ----------
    images : list
        A list containing a stack of single-channel (i.e., grayscale)
        layers of an HDR exposure stack

    Returns
    -------
    intensity_values : numpy.array, dtype=np.uint8
        An array containing a uniformly sampled intensity value from each
        exposure layer (shape = num_intensities x num_images)

    """
    z_min, z_max = 0, 255
    num_intensities = z_max - z_min + 1
    num_images = len(images)
    intensity_values = np.zeros((num_intensities, num_images), dtype=np.uint8)

    # Find the middle image to use as the source for pixel intensity locations
    mid_img = images[num_images // 2]

    for i in range(z_min, z_max + 1):
        rows, cols = np.where(mid_img == i)
        if len(rows) != 0:
            idx = random.randrange(len(rows))
            for j in range(num_images):
                intensity_values[i, j] = images[j][rows[idx], cols[idx]]
    return intensity_values


def computeResponseCurve(intensity_samples, log_exposures, smoothing_lambda, weighting_function):
    """Find the camera response curve for a single color channel

    Parameters
    ----------
    intensity_samples : numpy.ndarray
        Stack of single channel input values (num_samples x num_images)

    log_exposures : numpy.ndarray
        Log exposure times (size == num_images)

    smoothing_lambda : float
        A constant value used to correct for scale differences between
        data and smoothing terms in the constraint matrix -- source
        paper suggests a value of 100.

    weighting_function : callable
        Function that computes a weight from a pixel intensity

    Returns
    -------
    numpy.ndarray, dtype=np.float64
        Return a vector g(z) where the element at index i is the log exposure
        of a pixel with intensity value z = i (e.g., g[0] is the log exposure
        of z=0, g[1] is the log exposure of z=1, etc.)
    """
    z_min, z_max = 0, 255
    intensity_range = 255  # difference between min and max possible pixel value for uint8
    num_samples = intensity_samples.shape[0]
    num_images = len(log_exposures)

    # N x P data constraints + (Zmax - Zmin - 1) smoothness constraints + 1 centering
    # constraint as rows; 256 response-curve values + N radiance values as columns
    mat_A = np.zeros((num_images * num_samples + intensity_range, num_samples + intensity_range + 1), dtype=np.float64)
    mat_b = np.zeros((mat_A.shape[0], 1), dtype=np.float64)

    # 1. Add data-fitting constraints:
    k = 0
    for i in range(num_samples):
        for j in range(num_images):
            z_ij = intensity_samples[i, j]
            w_ij = weighting_function(z_ij)
            mat_A[k, z_ij] = w_ij
            mat_A[k, (intensity_range + 1) + i] = -w_ij
            mat_b[k, 0] = w_ij * log_exposures[j]
            k += 1

    # 2. Add smoothing constraints:
    for z_k in range(z_min + 1, z_max):
        w_k = weighting_function(z_k)
        mat_A[k, z_k - 1] = w_k * smoothing_lambda
        mat_A[k, z_k] = -2 * w_k * smoothing_lambda
        mat_A[k, z_k + 1] = w_k * smoothing_lambda
        k += 1

    # 3. Add color curve centering constraint:
    mat_A[k, (z_max - z_min) // 2] = 1

    inv_A = np.linalg.pinv(mat_A)
    x = np.dot(inv_A, mat_b)

    g = x[0: intensity_range + 1]
    return g[:, 0]


def computeRadianceMap(images, log_exposure_times, response_curve, weighting_function):
    """Calculate a radiance map for each pixel from the response curve.

    Parameters
    ----------
    images : list
        Collection containing a single color layer (i.e., grayscale)
        from each image in the exposure stack. (size == num_images)

    log_exposure_times : numpy.ndarray
        Array containing the log exposure times for each image in the
        exposure stack (size == num_images)

    response_curve : numpy.ndarray
        Least-squares fitted log exposure of each pixel value z

    weighting_function : callable
        Function that computes the weights

    Returns
    -------
    numpy.ndarray(dtype=np.float64)
        The image radiance map (in log space)
    """
    img_shape = images[0].shape
    img_rad_map = np.zeros(img_shape, dtype=np.float64)

    num_images = len(images)
    for i in range(img_shape[0]):
        for j in range(img_shape[1]):
            g = np.array([response_curve[images[k][i, j]] for k in range(num_images)])
            w = np.array([weighting_function(images[k][i, j]) for k in range(num_images)])
            sum_w = np.sum(w)
            if sum_w > 0:
                img_rad_map[i, j] = np.sum(w * (g - log_exposure_times)) / sum_w
            else:
                img_rad_map[i, j] = g[num_images // 2] - log_exposure_times[num_images // 2]
    return img_rad_map


def globalToneMapping(image, gamma):
    """Global tone mapping using gamma correction

    Parameters
    ----------
    image : numpy.ndarray
        Image to be corrected
    gamma : float
        The value used for gamma correction. Higher values give a brighter
        result; lower values give a darker result

    Returns
    -------
    numpy.ndarray
        The resulting image after gamma correction
    """
    image_corrected = cv2.pow(image / 255., 1.0 / gamma)
    return image_corrected


def intensityAdjustment(image, template):
    """Tune image intensity based on a template

    Parameters
    ----------
    image : numpy.ndarray
        Image to be adjusted
    template : numpy.ndarray
        Typically the middle image from the exposure stack. The average
        intensity of each channel of the image is matched to the template's

    Returns
    -------
    numpy.ndarray
        The resulting image after intensity adjustment
    """
    m, n, channel = image.shape
    output = np.zeros((m, n, channel))
    for ch in range(channel):
        image_avg, template_avg = np.average(image[:, :, ch]), np.average(template[:, :, ch])
        output[..., ch] = image[..., ch] * (template_avg / image_avg)

    return output


def computeHDR(images, log_exposure_times, smoothing_lambda=100., gamma=0.6):
    """Computational pipeline to produce the HDR image

    Parameters
    ----------
    images : list
        A list containing an exposure stack of images
    log_exposure_times : numpy.ndarray
        The log exposure times for each image in the exposure stack
    smoothing_lambda : float, optional
        A constant value to correct for scale differences between
        data and smoothing terms in the constraint matrix -- source
        paper suggests a value of 100.
    gamma : float, optional
        The value used for gamma correction in global tone mapping

    Returns
    -------
    numpy.ndarray
        The resulting HDR image with intensities scaled to fit the uint8 range
    """
    num_channels = images[0].shape[2]
    hdr_image = np.zeros(images[0].shape, dtype=np.float64)

    for channel in range(num_channels):
        # Collect the current layer of each input image from the exposure stack
        layer_stack = [img[:, :, channel] for img in images]

        # Sample image intensities
        intensity_samples = sampleIntensities(layer_stack)

        # Compute response curve
        response_curve = computeResponseCurve(intensity_samples, log_exposure_times, smoothing_lambda, linearWeight)

        # Build radiance map
        img_rad_map = computeRadianceMap(layer_stack, log_exposure_times, response_curve, linearWeight)

        # Normalize the HDR layer to (0, 255)
        hdr_image[..., channel] = cv2.normalize(img_rad_map, None, alpha=0, beta=255, norm_type=cv2.NORM_MINMAX)

    # Global tone mapping
    image_mapped = globalToneMapping(hdr_image, gamma)

    # Adjust image intensity based on the middle image from the exposure stack
    template = images[len(images) // 2]
    image_tuned = intensityAdjustment(image_mapped, template)

    # Output image
    output = cv2.normalize(image_tuned, None, alpha=0, beta=255, norm_type=cv2.NORM_MINMAX)
    return output.astype(np.uint8)
--------------------------------------------------------------------------------