├── .gitignore
├── LICENSE
├── README.md
├── coordinates.png
├── docs
│   ├── Makefile
│   ├── make.bat
│   └── source
│       ├── conf.py
│       └── index.rst
├── envmap
│   ├── __init__.py
│   ├── environmentmap.py
│   ├── projections.py
│   ├── rotations.py
│   ├── tetrahedronSolidAngle.py
│   └── xmlhelper.py
├── ezexr
│   └── __init__.py
├── hdrio
│   └── __init__.py
├── hdrtools
│   ├── __init__.py
│   ├── gsolve.py
│   ├── sunutils.py
│   └── tonemapping
│       └── __init__.py
├── pyproject.toml
├── setup.cfg
├── setup.py
├── skydb
│   └── __init__.py
├── skylibs
│   └── __init__.py
├── test
│   ├── test_envmap.py
│   ├── test_projections.py
│   ├── test_sunutils.py
│   └── test_warping_operator.py
└── tools3d
    ├── __init__.py
    ├── blender_addon_transportmatrix
    │   ├── compress_matrix.py
    │   ├── render.py
    │   ├── render_compressed.py
    │   └── transportmatrix.py
    ├── display.py
    ├── spharm.py
    └── warping_operator
        ├── README.md
        ├── __init__.py
        └── example_warp_operator.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 |
5 | # C extensions
6 | *.so
7 |
8 | # Distribution / packaging
9 | .Python
10 | env/
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | *.egg-info/
23 | .installed.cfg
24 | *.egg
25 |
26 | # PyInstaller
27 | # Usually these files are written by a python script from a template
28 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
29 | *.manifest
30 | *.spec
31 |
32 | # Installer logs
33 | pip-log.txt
34 | pip-delete-this-directory.txt
35 |
36 | # Unit test / coverage reports
37 | htmlcov/
38 | .tox/
39 | .coverage
40 | .coverage.*
41 | .cache
42 | nosetests.xml
43 | coverage.xml
44 | *,cover
45 |
46 | # Translations
47 | *.mo
48 | *.pot
49 |
50 | # Django stuff:
51 | *.log
52 |
53 | # Sphinx documentation
54 | docs/build/
55 |
56 | # PyBuilder
57 | target/
58 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | GNU LESSER GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright © 2007 Free Software Foundation, Inc.
5 |
6 | Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.
7 |
8 | This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below.
9 |
10 | 0. Additional Definitions.
11 | As used herein, “this License” refers to version 3 of the GNU Lesser General Public License, and the “GNU GPL” refers to version 3 of the GNU General Public License.
12 |
13 | “The Library” refers to a covered work governed by this License, other than an Application or a Combined Work as defined below.
14 |
15 | An “Application” is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library.
16 |
17 | A “Combined Work” is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the “Linked Version”.
18 |
19 | The “Minimal Corresponding Source” for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version.
20 |
21 | The “Corresponding Application Code” for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work.
22 |
23 | 1. Exception to Section 3 of the GNU GPL.
24 | You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL.
25 |
26 | 2. Conveying Modified Versions.
27 | If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version:
28 |
29 | a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or
30 | b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy.
31 | 3. Object Code Incorporating Material from Library Header Files.
32 | The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following:
33 |
34 | a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License.
35 | b) Accompany the object code with a copy of the GNU GPL and this license document.
36 | 4. Combined Works.
37 | You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following:
38 |
39 | a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License.
40 | b) Accompany the Combined Work with a copy of the GNU GPL and this license document.
41 | c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document.
42 | d) Do one of the following:
43 | 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.
44 | 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version.
45 | e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.)
46 | 5. Combined Libraries.
47 | You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following:
48 |
49 | a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License.
50 | b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work.
51 | 6. Revised Versions of the GNU Lesser General Public License.
52 | The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
53 |
54 | Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License “or any later version” applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation.
55 |
56 | If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library.
57 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ## skylibs
2 |
3 | Tools used for LDR/HDR environment map (IBL) handling, conversion and I/O.
4 |
5 |
6 | ### Install & Develop
7 |
8 | Install using:
9 | ```
10 | pip install --upgrade skylibs
11 | ```
12 |
13 | To develop skylibs, clone the repository and execute `python setup.py develop`.
14 |
15 |
16 | ### OpenEXR
17 |
18 | To read and save `exr` files, install the following dependencies (works on win/mac/linux):
19 |
20 | ```
21 | conda install -c conda-forge openexr-python openexr
22 | ```
23 |
24 |
25 | ### Spherical Harmonics
26 |
27 | To use the spherical harmonics functionalities, install the following dependency (works on mac/linux):
28 |
29 | ```
30 | conda install -c conda-forge pyshtools
31 | ```
32 |
33 | ### envmap
34 |
35 | Example usage:
36 | ```
37 | from envmap import EnvironmentMap
38 |
39 | e = EnvironmentMap('envmap.exr', 'latlong')
40 | e_angular = e.copy().convertTo('angular')
41 | e_angular_sa = e_angular.solidAngles()
42 | ```
43 |
44 | `envmap.EnvironmentMap` is the environment map class. It converts easily between these formats:
45 |
46 | - latlong (equirectangular)
47 | - angular
48 | - sphere
49 | - cube
50 | - skyangular
51 | - skylatlong
52 |
53 | #### Coordinate system
54 |
55 | Skylibs employs the following right-handed coordinate system: +X = right, +Y = up, +Z = towards the camera. Here is its latitude-longitude map (see code below):
56 |
57 | 
58 |
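To make the convention concrete, here is a minimal, self-contained sketch of how a normalized latlong coordinate could map to a world direction in this frame. `latlong_to_world` is a hypothetical helper for illustration only; skylibs' own parameterization (in `envmap.projections`) may differ in offsets and signs, so use `EnvironmentMap.image2world` in practice.

```python
import numpy as np

def latlong_to_world(u, v):
    # Hypothetical sketch: map (u, v) in [0, 1]^2 to a unit direction
    # in a right-handed frame with +X right, +Y up, +Z toward the camera.
    # skylibs' actual parameterization may differ in offsets/signs.
    theta = np.pi * (2.0 * u - 1.0)   # azimuth in [-pi, pi]
    phi = np.pi * (0.5 - v)           # elevation in [-pi/2, pi/2]
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(phi)
    z = -np.cos(theta) * np.cos(phi)
    return np.array([x, y, z])

latlong_to_world(0.5, 0.0)  # the top row's center maps to +Y (up)
```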
59 |
60 | #### Available methods:
61 |
62 | - `.copy()`: deepcopy the instance.
63 | - `.solidAngles()`: provides the solid angle for each pixel of the current representation.
64 | - `.convertTo(targetFormat)`: convert to the `targetFormat`.
65 | - `.rotate(rotation)`: rotate the environment map using a Direction Cosine Matrix (DCM).
66 | - `.resize(targetHeight)`: resize the environment map. Down-scaling will use energy-preserving interpolation (best results with integer downscales), which may introduce aliasing.
67 | - `.toIntensity(mode, colorspace)`: convert to grayscale.
68 | - `.getHemisphere(normal)`: returns a mask of the hemisphere visible from a surface with `normal`.
69 | - `.setHemisphereValue(normal, value)`: sets all pixels visible from a surface with `normal` to `value`.
70 | - `.getMeanLightVectors(normals)`: compute the mean light vector of the environment map for the given normals.
71 | - `.project(vfov, rotation_matrix, ar, resolution, projection, mode)`: extracts a rectified image from the panorama, simulating a camera with field of view `vfov`, extrinsics `rotation_matrix`, aspect ratio `ar`, and `resolution`.
72 | - `.blur(kappa)`: blurs the environment map. `kappa` >= 0 is the blur bandwidth; higher values are sharper.
73 | - `.addSphericalGaussian(center, bandwidth, color)`: adds a spherical Gaussian to the environment map, in the unit direction of `center` with a specified `bandwidth` and `color`.
74 | - `.embed(vfov, rotation_matrix, image)`: inverse of `project`, embeds an image in the environment map.
75 | - `.imageCoordinates()`: returns the (u, v) coordinates at each pixel center.
76 | - `.worldCoordinates()`: returns the (x, y, z, valid) world coordinates for each pixel center, with mask `valid` (anything outside this mask does not project to the world).
77 | - `.world2image(x, y, z)`: returns the (u, v) coordinates of the vector (x, y, z). Pixel coordinates can be obtained with `floor(u)` and `floor(v)`.
78 | - `.image2world(u, v)`: returns the (x, y, z) world coordinates corresponding to the image coordinates (u, v).
79 | - `.interpolate(u, v, valid)`: interpolates the envmap to coordinates (u, v) masked with valid.
80 |
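As a sanity check on what `.solidAngles()` returns, the per-pixel solid angle of a latlong map can be sketched in a few lines of NumPy: the pixel solid angles of any full-sphere format must sum to 4π. This is only an illustration of the idea; the library computes it for every supported projection.

```python
import numpy as np

def latlong_solid_angles(height):
    # Solid angle of each pixel in a (height x 2*height) latlong map:
    # dOmega = sin(theta) * dtheta * dphi, evaluated at row centers.
    # Illustrative sketch only; EnvironmentMap.solidAngles() covers
    # every projection, not just latlong.
    width = 2 * height
    theta = (np.arange(height) + 0.5) / height * np.pi  # polar angle
    d_theta = np.pi / height
    d_phi = 2.0 * np.pi / width
    row = np.sin(theta) * d_theta * d_phi
    return np.tile(row[:, None], (1, width))

sa = latlong_solid_angles(64)
# sa.sum() is close to 4*pi, the area of the unit sphere.
```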
81 |
82 | ### Projection, cropping, simulating a camera
83 |
84 | To perform a crop from `pano.jpg`:
85 |
86 | ```
87 | import numpy as np
88 | from imageio import imread, imsave
89 | from envmap import EnvironmentMap, rotation_matrix
90 |
91 |
92 | e = EnvironmentMap('pano.jpg', 'latlong')
93 |
94 | dcm = rotation_matrix(azimuth=np.pi/6,
95 | elevation=np.pi/8,
96 | roll=np.pi/12)
97 | crop = e.project(vfov=85., # degrees
98 | rotation_matrix=dcm,
99 | ar=4./3.,
100 | resolution=(640, 480),
101 | projection="perspective",
102 | mode="normal")
103 |
104 | crop = np.clip(255.*crop, 0, 255).astype('uint8')
105 | imsave("crop.jpg", crop, quality=90)
106 | ```
107 |
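Under the hood, `project` amounts to casting one ray per output pixel and sampling the panorama along it. The sketch below builds such a pinhole ray grid for the camera above; `camera_rays` is a hypothetical helper, and skylibs' internal axis conventions in `project` may differ.

```python
import numpy as np

def camera_rays(vfov_deg, width, height):
    # Unit view rays of a pinhole camera looking down -Z with vertical
    # field of view vfov_deg. Hypothetical sketch; skylibs' internal
    # conventions in project() may differ.
    f = (height / 2.0) / np.tan(np.radians(vfov_deg) / 2.0)  # focal length (pixels)
    u = np.arange(width) - (width - 1) / 2.0    # pixel centers, horizontal
    v = np.arange(height) - (height - 1) / 2.0  # pixel centers, vertical
    uu, vv = np.meshgrid(u, v)
    d = np.stack([uu, -vv, -f * np.ones_like(uu)], axis=-1)
    return d / np.linalg.norm(d, axis=-1, keepdims=True)

rays = camera_rays(85.0, 640, 480)  # one unit direction per output pixel
```

Each ray would then be rotated by the camera's DCM and looked up in the panorama, e.g. with `world2image` followed by `interpolate`.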
108 | ### hdrio
109 |
110 | `imread` and `imwrite`/`imsave` supporting the following formats:
111 |
112 | - exr (ezexr)
113 | - cr2, nef, raw (dcraw)
114 | - hdr, pic (custom, beta)
115 | - tiff (tifffile)
116 | - All the formats supported by `imageio`
117 |
118 | ### ezexr
119 |
120 | Internal exr reader and writer, relies on `python-openexr`.
121 |
122 | ### tools3d
123 |
124 | - `getMaskDerivatives(mask)`: creates the dx+dy from a binary `mask`.
125 | - `NfromZ`: derives the normals from a depth map `surf`.
126 | - `ZfromN`: integrates a depth map from a normal map `normals`.
127 | - `display.plotDepth`: creates a 3-subplot figure that shows the depth map `Z` and two side views.
128 | - `spharm.SphericalHarmonic`: Spherical Harmonic Transform (uses `pyshtools`).
129 |
130 | Example usage of `spharm`:
131 | ```
132 | from envmap import EnvironmentMap
133 | from tools3d import spharm
134 |
135 | e = EnvironmentMap('envmap.exr', 'latlong')
136 | sh = spharm.SphericalHarmonic(e)
137 | print(sh.coeffs)
138 | reconstruction = sh.reconstruct(height=64)
139 | ```
140 | - `warping_operator.warpEnvironmentMap`: The warping operator of [Gardner et al., 2017](https://dl.acm.org/doi/10.1145/3130800.3130891). See documentation [here](./tools3d/warping_operator/README.md).
141 |
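The idea behind `NfromZ` can be sketched with finite differences: the normal of a depth map is proportional to (-dZ/dx, -dZ/dy, 1), renormalized. `normals_from_depth` below is a hypothetical stand-in for illustration; the library's sign and axis conventions may differ.

```python
import numpy as np

def normals_from_depth(Z):
    # Per-pixel normals from a depth map via finite differences.
    # Hypothetical sketch of the idea behind tools3d's NfromZ;
    # the library's sign/axis conventions may differ.
    dzdx = np.gradient(Z, axis=1)
    dzdy = np.gradient(Z, axis=0)
    n = np.stack([-dzdx, -dzdy, np.ones_like(Z)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

# A constant depth map yields normals pointing straight at the camera.
N = normals_from_depth(np.ones((8, 8)))
```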
142 | ### hdrtools
143 |
144 | Tonemapping using `pfstools`.
145 |
146 | ## Changelog
147 |
148 | - 0.7.6: Fixed division by zero in envmap projection's mask mode.
149 | - 0.7.5: Fixed spherical harmonics import with latest pyshtools, added spherical warping operator, added whitelist channels in `ezexr`.
150 | - 0.7.4: Fixed tools3d.spharm compatibility with latest pyshtools.
151 | - 0.7.3: Added vMF-based envmap blur functionality; `ezexr` no longer prints the number of channels to stdout.
152 | - 0.7.2: `ezexr.imwrite()` now orders channels correctly when > 10 are used.
153 | - 0.7.1: Added `sunFromPySolar` (thanks Ian!) and support for alpha values and imageio v3 in `ezexr` (thanks Hong-Xing!).
154 | - 0.7.0: Fixed `.setHemisphereValue()`, added mode to `.toIntensity()`, fixed angular and sphere projections for normals [0, 0, ±1], added `.getHemisphere(normal)`.
155 | - 0.6.8: Fixed resizing to be energy-preserving when downscaling; fixed conversion that shifted the envmap by half a pixel
156 | - 0.6.7: Fixed image envmap embedding to fit the projection coordinates; fixed crash in imwrite with specific channel names
157 | - 0.6.6: Fixed aspect ratio when embedding
158 | - 0.6.5: Added envmap embed feature
159 | - 0.6.4: Removed `pyshtools` as mandatory dependency
160 | - 0.6.3: Removed custom OpenEXR bindings (can be easily installed using conda)
161 | - 0.6.2: Removed `rotlib` dependency
162 | - 0.6.1: Aspect ratio in `project()` now in pixels
163 | - 0.6: Updated the transport matrix Blender plugin to 2.8+
164 |
165 |
166 | ## Roadmap
167 |
168 | - Improved display for environment maps (change intensity with keystroke/button)
169 | - Standalone `ezexr` on all platforms (investigate `pyexr`)
170 | - Add `worldCoordinates()` output in spherical coordinates instead of (x, y, z)
171 | - Add assert that data is float32 in convertTo/resize (internal bugs in scipy interpolation)
172 |
173 |
174 | ### Code for the coordinates system figure
175 |
176 | ```
177 | import numpy as np
178 | from matplotlib import pyplot as plt
179 | from envmap import EnvironmentMap
180 |
181 | sz = 1024
182 | e = EnvironmentMap(sz, 'cube', channels=3)
183 | e.data[:sz//4,:,:] = [0, 1, 0] # +Y
184 | e.data[sz//4:int(0.5*sz),:,:] = [1, 0, 1] # -Y
185 | e.data[:,int(0.5*sz):,:] = [1, 0, 0] # +X
186 | e.data[:,:int(0.25*sz),:] = [0, 1, 1] # -X
187 | e.data[int(3/4*sz):,:,:] = [0, 0, 1] # +Z
188 | e.data[int(0.5*sz):int(3/4*sz),:,:] = [1, 1, 0] # -Z
189 | e.convertTo('latlong')
190 |
191 | def getCoords(normal):
192 | u, v = e.world2image(*normal)
193 | return [u*e.data.shape[1], v*e.data.shape[0]]
194 |
195 | plt.imshow(e.data)
196 | plt.text(*(getCoords([1, 0, 0]) + ["+X"]))
197 | plt.text(*(getCoords([-1, 0, 0]) + ["-X"]))
198 | plt.text(*(getCoords([0, 1, 0]) + ["+Y"]))
199 | plt.text(*(getCoords([0, -1, 0]) + ["-Y"]))
200 | plt.text(*(getCoords([0, 0, 1]) + ["+Z"]))
201 | plt.text(*(getCoords([0, 0, -1]) + ["-Z"]))
202 | plt.axis('off')
203 | plt.savefig('coordinates.png', bbox_inches="tight", dpi=200)
204 | plt.show()
205 | ```
--------------------------------------------------------------------------------
/coordinates.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/soravux/skylibs/c163f9ba048a7ba73c42097593da6dae99e5f0ba/coordinates.png
--------------------------------------------------------------------------------
/docs/Makefile:
--------------------------------------------------------------------------------
1 | # Makefile for Sphinx documentation
2 | #
3 |
4 | # You can set these variables from the command line.
5 | SPHINXOPTS =
6 | SPHINXBUILD = sphinx-build
7 | PAPER =
8 | BUILDDIR = build
9 |
10 | # User-friendly check for sphinx-build
11 | ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
12 | $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
13 | endif
14 |
15 | # Internal variables.
16 | PAPEROPT_a4 = -D latex_paper_size=a4
17 | PAPEROPT_letter = -D latex_paper_size=letter
18 | ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
19 | # the i18n builder cannot share the environment and doctrees with the others
20 | I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
21 |
22 | .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
23 |
24 | help:
25 | 	@echo "Please use \`make <target>' where <target> is one of"
26 | @echo " html to make standalone HTML files"
27 | @echo " dirhtml to make HTML files named index.html in directories"
28 | @echo " singlehtml to make a single large HTML file"
29 | @echo " pickle to make pickle files"
30 | @echo " json to make JSON files"
31 | @echo " htmlhelp to make HTML files and a HTML help project"
32 | @echo " qthelp to make HTML files and a qthelp project"
33 | @echo " devhelp to make HTML files and a Devhelp project"
34 | @echo " epub to make an epub"
35 | @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
36 | @echo " latexpdf to make LaTeX files and run them through pdflatex"
37 | @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
38 | @echo " text to make text files"
39 | @echo " man to make manual pages"
40 | @echo " texinfo to make Texinfo files"
41 | @echo " info to make Texinfo files and run them through makeinfo"
42 | @echo " gettext to make PO message catalogs"
43 | @echo " changes to make an overview of all changed/added/deprecated items"
44 | @echo " xml to make Docutils-native XML files"
45 | @echo " pseudoxml to make pseudoxml-XML files for display purposes"
46 | @echo " linkcheck to check all external links for integrity"
47 | @echo " doctest to run all doctests embedded in the documentation (if enabled)"
48 |
49 | clean:
50 | rm -rf $(BUILDDIR)/*
51 |
52 | html:
53 | $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
54 | @echo
55 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
56 |
57 | dirhtml:
58 | $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
59 | @echo
60 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
61 |
62 | singlehtml:
63 | $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
64 | @echo
65 | @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
66 |
67 | pickle:
68 | $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
69 | @echo
70 | @echo "Build finished; now you can process the pickle files."
71 |
72 | json:
73 | $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
74 | @echo
75 | @echo "Build finished; now you can process the JSON files."
76 |
77 | htmlhelp:
78 | $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
79 | @echo
80 | @echo "Build finished; now you can run HTML Help Workshop with the" \
81 | ".hhp project file in $(BUILDDIR)/htmlhelp."
82 |
83 | qthelp:
84 | $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
85 | @echo
86 | @echo "Build finished; now you can run "qcollectiongenerator" with the" \
87 | ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
88 | @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/skylib.qhcp"
89 | @echo "To view the help file:"
90 | @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/skylib.qhc"
91 |
92 | devhelp:
93 | $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
94 | @echo
95 | @echo "Build finished."
96 | @echo "To view the help file:"
97 | @echo "# mkdir -p $$HOME/.local/share/devhelp/skylib"
98 | @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/skylib"
99 | @echo "# devhelp"
100 |
101 | epub:
102 | $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
103 | @echo
104 | @echo "Build finished. The epub file is in $(BUILDDIR)/epub."
105 |
106 | latex:
107 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
108 | @echo
109 | @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
110 | @echo "Run \`make' in that directory to run these through (pdf)latex" \
111 | "(use \`make latexpdf' here to do that automatically)."
112 |
113 | latexpdf:
114 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
115 | @echo "Running LaTeX files through pdflatex..."
116 | $(MAKE) -C $(BUILDDIR)/latex all-pdf
117 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
118 |
119 | latexpdfja:
120 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
121 | @echo "Running LaTeX files through platex and dvipdfmx..."
122 | $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
123 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
124 |
125 | text:
126 | $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
127 | @echo
128 | @echo "Build finished. The text files are in $(BUILDDIR)/text."
129 |
130 | man:
131 | $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
132 | @echo
133 | @echo "Build finished. The manual pages are in $(BUILDDIR)/man."
134 |
135 | texinfo:
136 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
137 | @echo
138 | @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
139 | @echo "Run \`make' in that directory to run these through makeinfo" \
140 | "(use \`make info' here to do that automatically)."
141 |
142 | info:
143 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
144 | @echo "Running Texinfo files through makeinfo..."
145 | make -C $(BUILDDIR)/texinfo info
146 | @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
147 |
148 | gettext:
149 | $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
150 | @echo
151 | @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
152 |
153 | changes:
154 | $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
155 | @echo
156 | @echo "The overview file is in $(BUILDDIR)/changes."
157 |
158 | linkcheck:
159 | $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
160 | @echo
161 | @echo "Link check complete; look for any errors in the above output " \
162 | "or in $(BUILDDIR)/linkcheck/output.txt."
163 |
164 | doctest:
165 | $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
166 | @echo "Testing of doctests in the sources finished, look at the " \
167 | "results in $(BUILDDIR)/doctest/output.txt."
168 |
169 | xml:
170 | $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
171 | @echo
172 | @echo "Build finished. The XML files are in $(BUILDDIR)/xml."
173 |
174 | pseudoxml:
175 | $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
176 | @echo
177 | @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
178 |
--------------------------------------------------------------------------------
/docs/make.bat:
--------------------------------------------------------------------------------
1 | @ECHO OFF
2 |
3 | REM Command file for Sphinx documentation
4 |
5 | if "%SPHINXBUILD%" == "" (
6 | set SPHINXBUILD=sphinx-build
7 | )
8 | set BUILDDIR=build
9 | set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% source
10 | set I18NSPHINXOPTS=%SPHINXOPTS% source
11 | if NOT "%PAPER%" == "" (
12 | set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
13 | set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
14 | )
15 |
16 | if "%1" == "" goto help
17 |
18 | if "%1" == "help" (
19 | :help
20 | 	echo.Please use `make ^<target^>` where ^<target^> is one of
21 | echo. html to make standalone HTML files
22 | echo. dirhtml to make HTML files named index.html in directories
23 | echo. singlehtml to make a single large HTML file
24 | echo. pickle to make pickle files
25 | echo. json to make JSON files
26 | echo. htmlhelp to make HTML files and a HTML help project
27 | echo. qthelp to make HTML files and a qthelp project
28 | echo. devhelp to make HTML files and a Devhelp project
29 | echo. epub to make an epub
30 | echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
31 | echo. text to make text files
32 | echo. man to make manual pages
33 | echo. texinfo to make Texinfo files
34 | echo. gettext to make PO message catalogs
35 | echo. changes to make an overview over all changed/added/deprecated items
36 | echo. xml to make Docutils-native XML files
37 | echo. pseudoxml to make pseudoxml-XML files for display purposes
38 | echo. linkcheck to check all external links for integrity
39 | echo. doctest to run all doctests embedded in the documentation if enabled
40 | goto end
41 | )
42 |
43 | if "%1" == "clean" (
44 | for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
45 | del /q /s %BUILDDIR%\*
46 | goto end
47 | )
48 |
49 |
50 | %SPHINXBUILD% 2> nul
51 | if errorlevel 9009 (
52 | echo.
53 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
54 | echo.installed, then set the SPHINXBUILD environment variable to point
55 | echo.to the full path of the 'sphinx-build' executable. Alternatively you
56 | echo.may add the Sphinx directory to PATH.
57 | echo.
58 | echo.If you don't have Sphinx installed, grab it from
59 | echo.http://sphinx-doc.org/
60 | exit /b 1
61 | )
62 |
63 | if "%1" == "html" (
64 | %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
65 | if errorlevel 1 exit /b 1
66 | echo.
67 | echo.Build finished. The HTML pages are in %BUILDDIR%/html.
68 | goto end
69 | )
70 |
71 | if "%1" == "dirhtml" (
72 | %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
73 | if errorlevel 1 exit /b 1
74 | echo.
75 | echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
76 | goto end
77 | )
78 |
79 | if "%1" == "singlehtml" (
80 | %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
81 | if errorlevel 1 exit /b 1
82 | echo.
83 | echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
84 | goto end
85 | )
86 |
87 | if "%1" == "pickle" (
88 | %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
89 | if errorlevel 1 exit /b 1
90 | echo.
91 | echo.Build finished; now you can process the pickle files.
92 | goto end
93 | )
94 |
95 | if "%1" == "json" (
96 | %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
97 | if errorlevel 1 exit /b 1
98 | echo.
99 | echo.Build finished; now you can process the JSON files.
100 | goto end
101 | )
102 |
103 | if "%1" == "htmlhelp" (
104 | %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
105 | if errorlevel 1 exit /b 1
106 | echo.
107 | echo.Build finished; now you can run HTML Help Workshop with the ^
108 | .hhp project file in %BUILDDIR%/htmlhelp.
109 | goto end
110 | )
111 |
112 | if "%1" == "qthelp" (
113 | %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
114 | if errorlevel 1 exit /b 1
115 | echo.
116 | echo.Build finished; now you can run "qcollectiongenerator" with the ^
117 | .qhcp project file in %BUILDDIR%/qthelp, like this:
118 | echo.^> qcollectiongenerator %BUILDDIR%\qthelp\skylib.qhcp
119 | echo.To view the help file:
120 | 	echo.^> assistant -collectionFile %BUILDDIR%\qthelp\skylib.qhc
121 | goto end
122 | )
123 |
124 | if "%1" == "devhelp" (
125 | %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
126 | if errorlevel 1 exit /b 1
127 | echo.
128 | echo.Build finished.
129 | goto end
130 | )
131 |
132 | if "%1" == "epub" (
133 | %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
134 | if errorlevel 1 exit /b 1
135 | echo.
136 | echo.Build finished. The epub file is in %BUILDDIR%/epub.
137 | goto end
138 | )
139 |
140 | if "%1" == "latex" (
141 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
142 | if errorlevel 1 exit /b 1
143 | echo.
144 | echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
145 | goto end
146 | )
147 |
148 | if "%1" == "latexpdf" (
149 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
150 | cd %BUILDDIR%/latex
151 | make all-pdf
152 | cd %BUILDDIR%/..
153 | echo.
154 | echo.Build finished; the PDF files are in %BUILDDIR%/latex.
155 | goto end
156 | )
157 |
158 | if "%1" == "latexpdfja" (
159 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
160 | cd %BUILDDIR%/latex
161 | make all-pdf-ja
162 | cd %BUILDDIR%/..
163 | echo.
164 | echo.Build finished; the PDF files are in %BUILDDIR%/latex.
165 | goto end
166 | )
167 |
168 | if "%1" == "text" (
169 | %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
170 | if errorlevel 1 exit /b 1
171 | echo.
172 | echo.Build finished. The text files are in %BUILDDIR%/text.
173 | goto end
174 | )
175 |
176 | if "%1" == "man" (
177 | %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
178 | if errorlevel 1 exit /b 1
179 | echo.
180 | echo.Build finished. The manual pages are in %BUILDDIR%/man.
181 | goto end
182 | )
183 |
184 | if "%1" == "texinfo" (
185 | %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
186 | if errorlevel 1 exit /b 1
187 | echo.
188 | echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
189 | goto end
190 | )
191 |
192 | if "%1" == "gettext" (
193 | %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
194 | if errorlevel 1 exit /b 1
195 | echo.
196 | echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
197 | goto end
198 | )
199 |
200 | if "%1" == "changes" (
201 | %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
202 | if errorlevel 1 exit /b 1
203 | echo.
204 | echo.The overview file is in %BUILDDIR%/changes.
205 | goto end
206 | )
207 |
208 | if "%1" == "linkcheck" (
209 | %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
210 | if errorlevel 1 exit /b 1
211 | echo.
212 | echo.Link check complete; look for any errors in the above output ^
213 | or in %BUILDDIR%/linkcheck/output.txt.
214 | goto end
215 | )
216 |
217 | if "%1" == "doctest" (
218 | %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
219 | if errorlevel 1 exit /b 1
220 | echo.
221 | echo.Testing of doctests in the sources finished, look at the ^
222 | results in %BUILDDIR%/doctest/output.txt.
223 | goto end
224 | )
225 |
226 | if "%1" == "xml" (
227 | %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
228 | if errorlevel 1 exit /b 1
229 | echo.
230 | echo.Build finished. The XML files are in %BUILDDIR%/xml.
231 | goto end
232 | )
233 |
234 | if "%1" == "pseudoxml" (
235 | %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
236 | if errorlevel 1 exit /b 1
237 | echo.
238 | echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
239 | goto end
240 | )
241 |
242 | :end
243 |
--------------------------------------------------------------------------------
/docs/source/conf.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | #
4 | # skylib documentation build configuration file, created by
5 | # sphinx-quickstart on Sun Mar 22 03:07:31 2015.
6 | #
7 | # This file is execfile()d with the current directory set to its
8 | # containing dir.
9 | #
10 | # Note that not all possible configuration values are present in this
11 | # autogenerated file.
12 | #
13 | # All configuration values have a default; values that are commented out
14 | # serve to show the default.
15 |
16 | import sys
17 | import os
18 |
19 | # If extensions (or modules to document with autodoc) are in another directory,
20 | # add these directories to sys.path here. If the directory is relative to the
21 | # documentation root, use os.path.abspath to make it absolute, like shown here.
22 | sys.path.insert(0, os.path.abspath('../..'))
23 |
24 | import ezexr
25 | import envmap
26 |
27 | # -- General configuration ------------------------------------------------
28 |
29 | # If your documentation needs a minimal Sphinx version, state it here.
30 | #needs_sphinx = '1.0'
31 |
32 | # Add any Sphinx extension module names here, as strings. They can be
33 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
34 | # ones.
35 | extensions = [
36 | 'sphinx.ext.autodoc',
37 | 'sphinx.ext.todo',
38 | 'sphinx.ext.mathjax',
39 | ]
40 |
41 | todo_include_todos = True
42 |
43 | # Add any paths that contain templates here, relative to this directory.
44 | templates_path = ['_templates']
45 |
46 | # The suffix of source filenames.
47 | source_suffix = '.rst'
48 |
49 | # The encoding of source files.
50 | #source_encoding = 'utf-8-sig'
51 |
52 | # The master toctree document.
53 | master_doc = 'index'
54 |
55 | # General information about the project.
56 | project = 'skylib'
57 | copyright = '2015, Marc-Andre Gardner & Yannick Hold-Geoffroy'
58 |
59 | # The version info for the project you're documenting, acts as replacement for
60 | # |version| and |release|, also used in various other places throughout the
61 | # built documents.
62 | #
63 | # The short X.Y version.
64 | version = '0.1'
65 | # The full version, including alpha/beta/rc tags.
66 | release = '0.1'
67 |
68 | # The language for content autogenerated by Sphinx. Refer to documentation
69 | # for a list of supported languages.
70 | #language = None
71 |
72 | # There are two options for replacing |today|: either, you set today to some
73 | # non-false value, then it is used:
74 | #today = ''
75 | # Else, today_fmt is used as the format for a strftime call.
76 | #today_fmt = '%B %d, %Y'
77 |
78 | # List of patterns, relative to source directory, that match files and
79 | # directories to ignore when looking for source files.
80 | exclude_patterns = []
81 |
82 | # The reST default role (used for this markup: `text`) to use for all
83 | # documents.
84 | #default_role = None
85 |
86 | # If true, '()' will be appended to :func: etc. cross-reference text.
87 | #add_function_parentheses = True
88 |
89 | # If true, the current module name will be prepended to all description
90 | # unit titles (such as .. function::).
91 | #add_module_names = True
92 |
93 | # If true, sectionauthor and moduleauthor directives will be shown in the
94 | # output. They are ignored by default.
95 | #show_authors = False
96 |
97 | # The name of the Pygments (syntax highlighting) style to use.
98 | pygments_style = 'sphinx'
99 |
100 | # A list of ignored prefixes for module index sorting.
101 | #modindex_common_prefix = []
102 |
103 | # If true, keep warnings as "system message" paragraphs in the built documents.
104 | #keep_warnings = False
105 |
106 |
107 | # -- Options for HTML output ----------------------------------------------
108 |
109 | # The theme to use for HTML and HTML Help pages. See the documentation for
110 | # a list of builtin themes.
111 | html_theme = 'haiku'
112 |
113 | # Theme options are theme-specific and customize the look and feel of a theme
114 | # further. For a list of options available for each theme, see the
115 | # documentation.
116 | #html_theme_options = {}
117 |
118 | # Add any paths that contain custom themes here, relative to this directory.
119 | #html_theme_path = []
120 |
121 | # The name for this set of Sphinx documents. If None, it defaults to
122 | # " v documentation".
123 | #html_title = None
124 |
125 | # A shorter title for the navigation bar. Default is the same as html_title.
126 | #html_short_title = None
127 |
128 | # The name of an image file (relative to this directory) to place at the top
129 | # of the sidebar.
130 | #html_logo = None
131 |
132 | # The name of an image file (within the static path) to use as favicon of the
133 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
134 | # pixels large.
135 | #html_favicon = None
136 |
137 | # Add any paths that contain custom static files (such as style sheets) here,
138 | # relative to this directory. They are copied after the builtin static files,
139 | # so a file named "default.css" will overwrite the builtin "default.css".
140 | html_static_path = ['_static']
141 |
142 | # Add any extra paths that contain custom files (such as robots.txt or
143 | # .htaccess) here, relative to this directory. These files are copied
144 | # directly to the root of the documentation.
145 | #html_extra_path = []
146 |
147 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
148 | # using the given strftime format.
149 | #html_last_updated_fmt = '%b %d, %Y'
150 |
151 | # If true, SmartyPants will be used to convert quotes and dashes to
152 | # typographically correct entities.
153 | #html_use_smartypants = True
154 |
155 | # Custom sidebar templates, maps document names to template names.
156 | #html_sidebars = {}
157 |
158 | # Additional templates that should be rendered to pages, maps page names to
159 | # template names.
160 | #html_additional_pages = {}
161 |
162 | # If false, no module index is generated.
163 | #html_domain_indices = True
164 |
165 | # If false, no index is generated.
166 | #html_use_index = True
167 |
168 | # If true, the index is split into individual pages for each letter.
169 | #html_split_index = False
170 |
171 | # If true, links to the reST sources are added to the pages.
172 | #html_show_sourcelink = True
173 |
174 | # If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
175 | #html_show_sphinx = True
176 |
177 | # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
178 | #html_show_copyright = True
179 |
180 | # If true, an OpenSearch description file will be output, and all pages will
181 | # contain a tag referring to it. The value of this option must be the
182 | # base URL from which the finished HTML is served.
183 | #html_use_opensearch = ''
184 |
185 | # This is the file name suffix for HTML files (e.g. ".xhtml").
186 | #html_file_suffix = None
187 |
188 | # Output file base name for HTML help builder.
189 | htmlhelp_basename = 'skylibdoc'
190 |
191 |
192 | # -- Options for LaTeX output ---------------------------------------------
193 |
194 | latex_elements = {
195 | # The paper size ('letterpaper' or 'a4paper').
196 | #'papersize': 'letterpaper',
197 |
198 | # The font size ('10pt', '11pt' or '12pt').
199 | #'pointsize': '10pt',
200 |
201 | # Additional stuff for the LaTeX preamble.
202 | #'preamble': '',
203 | }
204 |
205 | # Grouping the document tree into LaTeX files. List of tuples
206 | # (source start file, target name, title,
207 | # author, documentclass [howto, manual, or own class]).
208 | latex_documents = [
209 | ('index', 'skylib.tex', 'skylib Documentation',
210 | 'Marc-Andre Gardner \\& Yannick Hold-Geoffroy', 'manual'),
211 | ]
212 |
213 | # The name of an image file (relative to this directory) to place at the top of
214 | # the title page.
215 | #latex_logo = None
216 |
217 | # For "manual" documents, if this is true, then toplevel headings are parts,
218 | # not chapters.
219 | #latex_use_parts = False
220 |
221 | # If true, show page references after internal links.
222 | #latex_show_pagerefs = False
223 |
224 | # If true, show URL addresses after external links.
225 | #latex_show_urls = False
226 |
227 | # Documents to append as an appendix to all manuals.
228 | #latex_appendices = []
229 |
230 | # If false, no module index is generated.
231 | #latex_domain_indices = True
232 |
233 |
234 | # -- Options for manual page output ---------------------------------------
235 |
236 | # One entry per manual page. List of tuples
237 | # (source start file, name, description, authors, manual section).
238 | man_pages = [
239 | ('index', 'skylib', 'skylib Documentation',
240 | ['Marc-Andre Gardner & Yannick Hold-Geoffroy'], 1)
241 | ]
242 |
243 | # If true, show URL addresses after external links.
244 | #man_show_urls = False
245 |
246 |
247 | # -- Options for Texinfo output -------------------------------------------
248 |
249 | # Grouping the document tree into Texinfo files. List of tuples
250 | # (source start file, target name, title, author,
251 | # dir menu entry, description, category)
252 | texinfo_documents = [
253 | ('index', 'skylib', 'skylib Documentation',
254 | 'Marc-Andre Gardner & Yannick Hold-Geoffroy', 'skylib', 'One line description of project.',
255 | 'Miscellaneous'),
256 | ]
257 |
258 | # Documents to append as an appendix to all manuals.
259 | #texinfo_appendices = []
260 |
261 | # If false, no module index is generated.
262 | #texinfo_domain_indices = True
263 |
264 | # How to display URL addresses: 'footnote', 'no', or 'inline'.
265 | #texinfo_show_urls = 'footnote'
266 |
267 | # If true, do not generate a @detailmenu in the "Top" node's menu.
268 | #texinfo_no_detailmenu = False
269 |
--------------------------------------------------------------------------------
/docs/source/index.rst:
--------------------------------------------------------------------------------
1 | .. skylib documentation master file, created by
2 | sphinx-quickstart on Sun Mar 22 03:07:31 2015.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | Welcome to skylib's documentation!
7 | ==================================
8 |
9 | .. toctree::
10 | :maxdepth: 2
11 |
12 | List of TODOs
13 | -------------
14 |
15 | .. todolist::
16 |
17 | EXR I/O helper
18 | --------------
19 |
20 | .. automodule:: ezexr
21 | :members:
22 | :undoc-members:
23 |
24 | EnvironmentMap Class
25 | --------------------
26 |
27 | .. automodule:: envmap
28 | :members:
29 |
30 | .. autoclass:: envmap.EnvironmentMap
31 | :members:
32 | :undoc-members:
33 | :private-members:
34 | :special-members:
35 | :exclude-members: __dict__,__weakref__,__module__
36 |
37 |
38 | Indices and tables
39 | ==================
40 |
41 | * :ref:`genindex`
42 | * :ref:`modindex`
43 | * :ref:`search`
44 |
45 |
--------------------------------------------------------------------------------
/envmap/__init__.py:
--------------------------------------------------------------------------------
1 | from .environmentmap import EnvironmentMap, rotation_matrix, downscaleEnvmap
2 | from . import projections
3 | from skylibs import __version__
4 |
--------------------------------------------------------------------------------
/envmap/environmentmap.py:
--------------------------------------------------------------------------------
1 | import os
2 | import pathlib
3 | from copy import deepcopy
4 | from decimal import Decimal
5 |
6 | from tqdm import tqdm
7 | import numpy as np
8 | from scipy.ndimage import map_coordinates, zoom
9 | from skimage.transform import resize_local_mean, downscale_local_mean
10 |
11 | from hdrio import imread
12 | from .tetrahedronSolidAngle import tetrahedronSolidAngle
13 | from .projections import *
14 | from .xmlhelper import EnvmapXMLParser
15 | from .rotations import rotx, roty, rotz
16 |
17 |
18 | SUPPORTED_FORMATS = [
19 | 'angular',
20 | 'skyangular',
21 | 'latlong',
22 | 'skylatlong',
23 | 'sphere',
24 | 'cube',
25 | ]
26 |
27 | #From Dan:
28 | # I've generated these using the monochromatic albedo values from here:
29 | # http://agsys.cra-cin.it/tools/solarradiation/help/Albedo.html
30 | # (they cite some books as references). Since these are monochromatic,
31 | # I got unscaled r, g, b values from internet textures and scaled them
32 | # so that their mean matches the expected monochromatic albedo. Hopefully
33 | # this is a valid thing to do.
34 | GROUND_ALBEDOS = {
35 | "GreenGrass": np.array([ 0.291801, 0.344855, 0.113344 ]).T,
36 | "FreshSnow": np.array([ 0.797356, 0.835876, 0.916767 ]).T,
37 | "Asphalt": np.array([ 0.148077, 0.150000, 0.151923 ]).T,
38 | }
39 |
40 |
41 | def isPath(var):
42 | return isinstance(var, str) or isinstance(var, pathlib.PurePath)
43 |
44 |
45 | class EnvironmentMap:
46 | def __init__(self, im, format_=None, copy=True, channels=3):
47 | """
48 | Creates an EnvironmentMap.
49 |
50 | :param im: Image path (str, pathlib.Path) or data (np.ndarray) representing
51 | an EnvironmentMap, or the height (int) of an empty EnvironmentMap.
52 | :param format_: EnvironmentMap format. Can be any from SUPPORTED_FORMATS.
53 |         :param copy: When a numpy array is given, whether it should be copied.
54 | :param channels: Number of channels (e.g., 1=grayscale, 3=color).
55 | :type im: str, Path, int, np.ndarray
56 | :type format_: str
57 | :type copy: bool
58 | """
59 | if not format_ and isPath(im):
60 | filename = os.path.splitext(str(im))[0]
61 | metadata = EnvmapXMLParser("{}.meta.xml".format(filename))
62 | format_ = metadata.getFormat()
63 |
64 | assert format_ is not None, (
65 | "Please provide format (metadata file not found).")
66 | assert format_.lower() in SUPPORTED_FORMATS, (
67 | "Unknown format: {}".format(format_))
68 |
69 | self.format_ = format_.lower()
70 |
71 | if isPath(im):
72 | # We received the filename
73 | self.data = imread(str(im))
74 | elif isinstance(im, int):
75 | # We received a single scalar
76 | if self.format_ == 'latlong':
77 | self.data = np.zeros((im, im*2, channels))
78 | elif self.format_ == 'skylatlong':
79 | self.data = np.zeros((im, im*4, channels))
80 | elif self.format_ == 'cube':
81 | self.data = np.zeros((im, round(3/4*im), channels))
82 | else:
83 | self.data = np.zeros((im, im, channels))
84 | elif type(im).__module__ == np.__name__:
85 | # We received a numpy array
86 | self.data = np.asarray(im, dtype='double')
87 |
88 | if copy:
89 | self.data = self.data.copy()
90 | else:
91 | raise Exception('Could not understand input. Please provide a '
92 |                             'filename (str), a height (integer) or an image '
93 | '(np.ndarray).')
94 |
95 | self.backgroundColor = np.zeros(self.data.shape[-1])
96 | self.validate()
97 |
98 | def validate(self):
99 | # Ensure the envmap is valid
100 | if self.format_ in ['sphere', 'angular', 'skysphere', 'skyangular']:
101 | assert self.data.shape[0] == self.data.shape[1], (
102 | "Sphere/Angular formats must have the same width/height")
103 | elif self.format_ == 'latlong':
104 | assert 2*self.data.shape[0] == self.data.shape[1], (
105 | "LatLong format width should be twice the height")
106 | elif self.format_ == 'skylatlong':
107 | assert 4*self.data.shape[0] == self.data.shape[1], (
108 | "SkyLatLong format width should be four times the height")
109 |
110 | assert self.data.ndim == 3, "Expected 3-dim array. For grayscale, use [h,w,1]."
111 |
112 | @classmethod
113 | def fromSkybox(cls, top, bottom, left, right, front, back):
114 | """Create an environment map from skybox (cube) captures. Six images
115 | must be provided, one for each side of the virtual cube. This function
116 | will return an EnvironmentMap object.
117 |
118 | All the images must be square (width==height), have the same size
119 | and number of channels.
120 | """
121 | basedim = top.shape[0]
122 | channels = 1 if len(top.shape) == 2 else top.shape[2]
123 |
124 | cube = np.zeros((basedim*4, basedim*3, channels), dtype=top.dtype)
125 | cube[0:basedim, basedim:2*basedim] = top
126 | cube[basedim:2*basedim, basedim:2*basedim] = front
127 | cube[basedim:2*basedim, 2*basedim:3*basedim] = right
128 | cube[3*basedim:4*basedim, basedim:2*basedim] = np.fliplr(np.flipud(back))
129 | cube[1*basedim:2*basedim, 0:basedim] = left
130 | cube[2*basedim:3*basedim, basedim:2*basedim] = bottom
131 |
132 |         # We ensure that there are no visible artifacts at the junctions of the images
133 |         # due to the interpolation process
134 | cube[0:basedim, basedim-1] = left[0,...] # top-left
135 | cube[0:basedim, 2*basedim] = right[0,...][::-1] # top-right
136 | cube[basedim-1, 2*basedim:3*basedim] = top[:,-1][::-1] #right-top
137 | cube[2*basedim, 2*basedim:3*basedim] = bottom[:,-1] #right-bottom
138 | cube[1*basedim-1, 0:basedim] = top[:,0] # left-top
139 | cube[2*basedim, 0:basedim] = bottom[:,0][::-1] # left-bottom
140 | cube[2*basedim:3*basedim, basedim-1] = left[-1,...][::-1] #bottom-left
141 | cube[2*basedim:3*basedim, 2*basedim] = right[-1,...] #bottom-right
142 | cube[3*basedim:, basedim-1] = left[:,0][::-1] # back-left
143 | cube[3*basedim:, 2*basedim] = right[:,-1][::-1] # back-right
144 |
145 | return cls(cube, format_="cube")
146 |
147 | def __hash__(self):
148 | """Provide a hash of the environment map type and size.
149 | Warning: doesn't take into account the data, just the type,
150 |         and the size (useful for solidAngles)."""
151 | return hash((self.data.shape, self.format_))
152 |
153 | def copy(self):
154 | """Returns a copy of the current environment map."""
155 | return deepcopy(self)
156 |
157 | def solidAngles(self):
158 | """Computes the solid angle subtended by each pixel."""
159 | # If already computed, take it
160 | if hasattr(self, '_solidAngles') and hash(self) == self._solidAngles_hash:
161 | return self._solidAngles
162 |
163 | # Compute coordinates of pixel borders
164 | cols = np.linspace(0, 1, self.data.shape[1] + 1)
165 | rows = np.linspace(0, 1, self.data.shape[0] + 1)
166 |
167 | u, v = np.meshgrid(cols, rows)
168 | dx, dy, dz, _ = self.image2world(u, v)
169 |
170 | # Split each pixel into two triangles and compute the solid angle
171 |         # subtended by the two tetrahedra
172 | a = np.vstack((dx[:-1,:-1].ravel(), dy[:-1,:-1].ravel(), dz[:-1,:-1].ravel()))
173 | b = np.vstack((dx[:-1,1:].ravel(), dy[:-1,1:].ravel(), dz[:-1,1:].ravel()))
174 | c = np.vstack((dx[1:,:-1].ravel(), dy[1:,:-1].ravel(), dz[1:,:-1].ravel()))
175 | d = np.vstack((dx[1:,1:].ravel(), dy[1:,1:].ravel(), dz[1:,1:].ravel()))
176 | omega = tetrahedronSolidAngle(a, b, c)
177 | omega += tetrahedronSolidAngle(b, c, d)
178 |
179 | # Get pixel center coordinates
180 | _, _, _, valid = self.worldCoordinates()
181 | omega[~valid.ravel()] = np.nan
182 |
183 | self._solidAngles = omega.reshape(self.data.shape[0:2])
184 | self._solidAngles_hash = hash(self)
185 | return self._solidAngles
186 |
187 | def imageCoordinates(self):
188 | """Returns the (u, v) coordinates for each pixel center."""
189 | cols = np.linspace(0, 1, self.data.shape[1]*2 + 1)
190 | rows = np.linspace(0, 1, self.data.shape[0]*2 + 1)
191 |
192 | cols = cols[1::2]
193 | rows = rows[1::2]
194 |
195 | return [d.astype('float32') for d in np.meshgrid(cols, rows)]
196 |
197 | def worldCoordinates(self):
198 | """Returns the (x, y, z) world coordinates for each pixel center."""
199 | u, v = self.imageCoordinates()
200 | x, y, z, valid = self.image2world(u, v)
201 | return x, y, z, valid
202 |
203 | def image2world(self, u, v):
204 | """Returns the (x, y, z) coordinates in the [-1, 1] interval."""
205 | func = {
206 | 'angular': angular2world,
207 | 'skyangular': skyangular2world,
208 | 'latlong': latlong2world,
209 | 'skylatlong': skylatlong2world,
210 | 'sphere': sphere2world,
211 | 'cube': cube2world,
212 | }.get(self.format_)
213 | return func(u, v)
214 |
215 | def world2image(self, x, y, z):
216 | """Returns the (u, v) coordinates (in the [0, 1] interval)."""
217 | func = {
218 | 'angular': world2angular,
219 | 'skyangular': world2skyangular,
220 | 'latlong': world2latlong,
221 | 'skylatlong': world2skylatlong,
222 | 'sphere': world2sphere,
223 | 'cube': world2cube,
224 | }.get(self.format_)
225 | return func(x, y, z)
226 |
227 |
228 | def world2pixel(self, x, y, z):
229 |         """Returns the integer (u, v) pixel coordinates in the MxN image."""
230 | 
231 |         # Get (u, v) in the [0, 1] interval
232 |         u, v = self.world2image(x, y, z)
233 | 
234 |         # Scale the normalized coordinates to the MxN pixel grid
235 | u = np.floor(u*self.data.shape[1]).astype(int)
236 | v = np.floor(v*self.data.shape[0]).astype(int)
237 |
238 | return u, v
239 |
240 |
241 | def pixel2world(self, u, v):
242 |         """Returns the (x, y, z) world coordinates for the pixel coordinates (u, v) in the MxN image."""
243 | 
244 |         # Normalize coordinates to the [0, 1] interval
245 |         u = (u+0.5) / self.data.shape[1]
246 |         v = (v+0.5) / self.data.shape[0]
247 | 
248 |         # Get (x, y, z) and the validity mask
249 |         x, y, z, valid = self.image2world(u, v)
250 | 
251 |         return x, y, z, valid
252 |
253 | def interpolate(self, u, v, valid=None, order=1, filter=True):
254 | """
255 | Interpolate to get the desired pixel values.
256 |
257 | :param order: Interpolation order (0: nearest, 1: linear, ..., 5).
258 | :type order: integer (0,1,...,5)
259 | """
260 |
261 | u = u.copy()
262 | v = v.copy()
263 |
264 | # Repeat the first and last rows/columns for interpolation purposes
265 | h, w, d = self.data.shape
266 | source = np.empty((h + 2, w + 2, d))
267 |
268 | source[1:-1, 1:-1] = self.data
269 | source[0,1:-1] = self.data[0,:]; source[0,0] = self.data[0,0]; source[0,-1] = self.data[0,-1]
270 | source[-1,1:-1] = self.data[-1,:]; source[-1,0] = self.data[-1,0]; source[-1,-1] = self.data[-1,-1]
271 | source[1:-1,0] = self.data[:,0]
272 | source[1:-1,-1] = self.data[:,-1]
273 |
274 | # To avoid displacement due to the padding
275 | u += 0.5/self.data.shape[1]
276 | v += 0.5/self.data.shape[0]
277 | target = np.vstack((v.flatten()*self.data.shape[0], u.flatten()*self.data.shape[1]))
278 |
279 | data = np.zeros((u.shape[0], u.shape[1], d))
280 | for c in range(d):
281 | map_coordinates(source[:,:,c], target, output=data[:,:,c].reshape(-1), cval=np.nan, order=order, prefilter=filter)
282 | self.data = data
283 |
284 | if valid is not None:
285 | self.setBackgroundColor(self.backgroundColor, valid)
286 |
287 | return self
288 |
289 | def setBackgroundColor(self, color, valid=None):
290 | """Sets the area defined by ~valid to color."""
291 | if valid is None:
292 | _, _, _, valid = self.worldCoordinates()
293 |
294 | assert valid.dtype == 'bool', "`valid` must be a boolean array."
295 | assert valid.shape[:2] == self.data.shape[:2], "`valid` must be the same size as the EnvironmentMap."
296 |
297 | self.backgroundColor = np.asarray(color)
298 | if self.backgroundColor.size == 1 and self.data.shape[2] != self.backgroundColor.size:
299 | self.backgroundColor = np.tile(self.backgroundColor, (self.data.shape[2],))
300 |
301 | assert self.data.shape[2] == self.backgroundColor.size, "Channel number mismatch when setting background color"
302 |
303 | mask = np.invert(valid)
304 | if mask.sum() > 0:
305 | self.data[np.tile(mask[:,:,None], (1, 1, self.data.shape[2]))] = np.tile(self.backgroundColor, (mask.sum(),))
306 |
307 | return self
308 |
309 | def convertTo(self, targetFormat, targetDim=None, order=1):
310 | """
311 | Convert to another format.
312 |
313 | :param targetFormat: Target format.
314 | :param targetDim: Target dimension.
315 | :param order: Interpolation order (0: nearest, 1: linear, ..., 5).
316 |
317 | :type targetFormat: string
318 | :type targetDim: integer
319 | :type order: integer (0,1,...,5)
320 | """
321 | self.validate()
322 | assert targetFormat.lower() in SUPPORTED_FORMATS, (
323 | "Unknown format: {}".format(targetFormat))
324 | assert order in range(6), "Spline interpolation order must be between 0 and 5."
325 |
326 | if not targetDim:
327 | # By default, number of rows
328 | targetDim = self.data.shape[0]
329 |
330 | eTmp = EnvironmentMap(targetDim, targetFormat)
331 | dx, dy, dz, valid = eTmp.worldCoordinates()
332 | u, v = self.world2image(dx, dy, dz)
333 | self.format_ = targetFormat.lower()
334 | self.interpolate(u, v, valid, order)
335 |
336 | return self
337 |
338 | def rotate(self, dcm, order=1):
339 | """
340 | Rotate the environment map.
341 |
342 | :param dcm: Rotation information (currently only 3x3 numpy matrix)
343 | :param order: Integer interpolation order (0: nearest, 1: linear, ..., 5).
344 |
345 | """
346 |
347 | self.validate()
348 | assert type(dcm).__module__ == np.__name__ and dcm.ndim == 2 and dcm.shape == (3, 3)
349 | assert order in range(6), "Spline interpolation order must be between 0 and 5."
350 |
351 | dx, dy, dz, valid = self.worldCoordinates()
352 |
353 | ptR = np.dot(dcm, np.vstack((dx.flatten(), dy.flatten(), dz.flatten())))
354 | dx, dy, dz = ptR[0].reshape(dx.shape), ptR[1].reshape(dy.shape), ptR[2].reshape(dz.shape)
355 |
356 | dx = np.clip(dx, -1, 1)
357 | dy = np.clip(dy, -1, 1)
358 | dz = np.clip(dz, -1, 1)
359 |
360 | u, v = self.world2image(dx, dy, dz)
361 | self.interpolate(u, v, valid, order)
362 |
363 | return self
364 |
365 | def resize(self, targetSize, order=1, debug=False):
366 | """
367 |         Resize the current environment map to targetSize.
368 | The energy-preserving "pixel mixing" algorithm is used when downscaling unless
369 | order is set to 0 (nearest neighbor interpolation).
370 |
371 | `targetSize` is either the desired `height` or `(height, width)`.
372 | `order` is the order of the spline interpolation (0: nearest, 1: linear, ..., 5).
373 |
374 | """
375 |
376 | self.validate()
377 | assert order in range(6), "Spline interpolation order must be between 0 and 5."
378 |
379 | if not isinstance(targetSize, tuple):
380 | if self.format_ == 'latlong':
381 | targetSize = (targetSize, 2*targetSize)
382 | elif self.format_ == 'skylatlong':
383 | targetSize = (targetSize, 4*targetSize)
384 | elif self.format_ == 'cube':
385 | targetSize = (targetSize, round(3/4*targetSize))
386 | else:
387 | targetSize = (targetSize, targetSize)
388 |
389 | if debug == True:
390 | old_mean = self.data.mean()
391 |
392 | # downsampling
393 | if targetSize[0] < self.data.shape[0] and order != 0:
394 |
395 | # check if integer
396 | if (Decimal(self.data.shape[0]) / Decimal(targetSize[0])) % 1 == 0:
397 | if debug is True:
398 | print("integer resize")
399 | fac = self.data.shape[0] // targetSize[0]
400 | self.data = downscale_local_mean(self.data, (fac, fac, 1))
401 | else:
402 | if debug is True:
403 | print("non-integer resize")
404 | self.data = resize_local_mean(self.data, targetSize, grid_mode=True, preserve_range=True)
405 |
406 | else: # upsampling or nearest neighbor
407 | _size = []
408 | for i in range(2):
409 | _size.append(targetSize[i] / self.data.shape[i] if targetSize[i] > 1. else targetSize[i])
410 | _size.append(1.0)
411 | self.data = zoom(self.data, _size, order=order)
412 |
413 | if debug is True:
414 | print("Energy difference in resize: {:.04f}".format(self.data.mean()/old_mean - 1.))
415 |
416 | return self
417 |
418 | def toIntensity(self, mode="ITU BT.709", colorspace="linear"):
419 | """
420 | Converts the environment map to grayscale.
421 |
422 |         `mode` can be one of:
423 |             "ITU BT.601" (luma in sRGB),
424 |             "ITU BT.709" (assumes the envmap is linear),
425 |             "mean" (mean of the channels).
426 |         `colorspace`: either "sRGB" or "linear".
427 | """
428 | self.validate()
429 |
430 | if self.data.shape[2] != 3:
431 | print("Envmap does not have 3 channels. This function won't do anything.")
432 | return self
433 |
434 |         if colorspace.lower() == "srgb":
435 |             if self.data.max() > 1.:
436 |                 raise Exception("Error during sRGB to linear conversion: data is > 1. Please linearize "
437 |                                 "the data beforehand or normalize it to [0, 1].")
438 |             self.data = self.data**2.2
439 |
440 | if mode == "ITU BT.601":
441 | self.data = 0.299*self.data[...,0] + 0.587*self.data[...,1] + 0.114*self.data[...,2]
442 | self.data = self.data[:,:,np.newaxis]
443 | elif mode == "ITU BT.709":
444 | self.data = 0.2126*self.data[...,0] + 0.7152*self.data[...,1] + 0.0722*self.data[...,2]
445 | self.data = self.data[:,:,np.newaxis]
446 | elif mode == "mean":
447 | self.data = np.mean(self.data, axis=2, keepdims=True)
448 |
449 |         if colorspace.lower() == "srgb":
450 | self.data = self.data**(1./2.2)
451 |
452 | return self
453 |
454 | def getHemisphere(self, normal, channels=True):
455 |         """
456 |         Returns a boolean mask of the pixels whose direction lies in the hemisphere centered on `normal` (tiled across the image channels if `channels` is True).
457 |         """
458 | normal = np.asarray(normal, dtype=np.float32).reshape((-1))
459 | assert normal.size == 3, "unknown normal shape, should have 3 elements"
460 | normal /= np.linalg.norm(normal)
461 |
462 | x, y, z, _ = self.worldCoordinates()
463 | xyz = np.dstack((x, y, z))
464 |
465 | mask = xyz.dot(normal) > 0
466 | if channels == False:
467 | return mask
468 |
469 | return np.tile(mask[:,:,None], (1, 1, self.data.shape[2]))
470 |
471 | def setHemisphereValue(self, normal, value):
472 |         """Sets a whole hemisphere defined by `normal` to a given `value`."""
473 |
474 | mask = self.getHemisphere(normal, channels=True)
475 |
476 | value = np.asarray(value)
477 | assert value.size in (self.data.shape[2], 1), ("Cannot understand value: should "
478 | "be size 1 or channels ({})".format(self.data.shape[2]))
479 |
480 | if value.size == 1 and self.data.shape[2] != value.size:
481 | value = np.tile(value, (self.data.shape[2],))
482 |
483 | self.data[mask] = np.tile(value, (mask.sum()//self.data.shape[2],))
484 |
485 | return self
486 |
487 | def getMeanLightVectors(self, normals):
488 | """Compute the mean light vector of the environment map for the normals given.
489 | Normals should be 3xN.
490 | Output is 3xN.
491 | """
492 | self.validate()
493 |
494 | normals = np.asarray(normals)
495 |         normals /= np.linalg.norm(normals, axis=0)  # normalize each direction (column)
496 | solidAngles = self.solidAngles()
497 | solidAngles /= np.nansum(solidAngles) # Normalize to 1
498 |
499 | x, y, z, _ = self.worldCoordinates()
500 | xyz = np.dstack((x, y, z))
501 |
502 | visibility = xyz.dot(normals) > 0
503 |
504 | intensity = deepcopy(self).toIntensity()
505 | meanlight = visibility * intensity.data * solidAngles[:,:,np.newaxis]
506 | meanlight = np.nansum(xyz[...,np.newaxis] * meanlight[...,np.newaxis].transpose((0,1,3,2)), (0, 1))
507 |
508 | return meanlight
509 |
510 | def blur(self, kappa):
511 | """
512 | Blurs the environment map, taking into account the format.
513 | Uses the von Mises-Fisher distribution; `kappa` >= 0 is the concentration parameter (higher values produce less blur).
514 | """
515 |
516 | x, y, z, _ = self.worldCoordinates()
517 | xyz = np.dstack((x, y, z)).reshape((-1, 3))
518 |
519 | h, w, c = self.data.shape[:3]
520 | data = self.data.reshape((-1, c))
521 | sa = self.solidAngles().reshape((-1, 1))
522 |
523 | C3 = kappa/(4*np.pi*np.sinh(kappa))  # von Mises-Fisher normalization constant
524 |
525 | result = np.empty((h*w, c), dtype=np.float32)
526 |
527 | for i in tqdm(range(xyz.shape[0])):
528 | normal = xyz[i:i+1,:]
529 | result[i,:] = np.sum(data*sa*C3*np.exp(kappa*np.sum(xyz*normal, 1, keepdims=True)), 0)
530 |
531 | self.data = result.reshape((h, w, c))
532 | return self
533 |
534 | def addSphericalGaussian(self, center, bandwidth, color):
535 | """
536 | Adds a spherical Gaussian lobe to the environment map.
537 | `center` is the direction of the lobe, given as an (x, y, z) unit vector.
538 | `bandwidth` is the scalar bandwidth of the Gaussian (higher is narrower).
539 | `color` is the per-channel amplitude of the Gaussian.
540 | """
541 |
542 | x, y, z, _ = self.worldCoordinates()
543 | xyz = np.dstack((x, y, z))
544 |
545 | center = np.asarray(center, dtype=np.float32).reshape((1, 1, 3))
546 | center /= np.linalg.norm(center)
547 |
548 | color = np.asarray(color, dtype=np.float32).reshape((1, 1, self.data.shape[2]))
549 | self.data += color * (np.exp(bandwidth * (np.sum(xyz * center, 2, keepdims=True) - 1)))
550 |
551 | return self
552 |
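As a sanity check on the falloff used above, the weight exp(bandwidth * (dot(d, center) - 1)) peaks at exactly 1 along the center direction and decays with angle. A standalone sketch (directions and bandwidth chosen for illustration):

```python
import numpy as np

center = np.array([0., 0., -1.])   # lobe direction
bandwidth = 50.0

def sg_weight(d):
    # Falloff of addSphericalGaussian: exp(b * (cos(angle) - 1))
    return np.exp(bandwidth * (np.dot(d, center) - 1.0))

peak = sg_weight(np.array([0., 0., -1.]))   # looking along the lobe
side = sg_weight(np.array([1., 0., 0.]))    # 90 degrees away: nearly zero
```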
553 | def embed(self, vfov, rotation_matrix, image, order=1):
554 | """
555 | Projects an image onto the environment map.
556 |
557 | :vfov: Vertical Field of View (degrees).
558 | :rotation_matrix: Camera rotation matrix.
559 | :image: The image to project.
560 | :param order: Integer interpolation order (0: nearest, 1: linear, ..., 5).
561 | """
562 |
563 | self.validate()
564 | assert order in range(6), "Spline interpolation order must be between 0 and 5."
565 |
566 | targetDim = self.data.shape[0]
567 | targetFormat = self.format_
568 |
569 | eTmp = EnvironmentMap(targetDim, targetFormat)
570 | dx, dy, dz, valid = eTmp.worldCoordinates()
571 | ar = image.shape[1]/image.shape[0]
572 |
573 | vfov_half_rad = vfov/2.*np.pi/180.
574 | hfov_half_rad = np.arctan(np.tan(vfov_half_rad)*ar)
575 |
576 | fx = 0.5/np.tan(hfov_half_rad)
577 | fy = 0.5/np.tan(vfov_half_rad)
578 | u0 = 0.5
579 | v0 = 0.5
580 |
581 | # world2image for a camera
582 | K = np.array([[fx, 0, u0],
583 | [0, fy, v0],
584 | [0, 0, 1]])
585 | M = K.dot(rotation_matrix)
586 |
587 | xyz = np.dstack((dx, dy, dz)).reshape((-1, 3)).T
588 |
589 | # mask behind the camera
590 | forward_vector = rotation_matrix.T.dot(np.array([0, 0, -1]))
591 | mask = forward_vector.dot(xyz) <= 0
592 | xyz[:,mask] = np.inf
593 |
594 | uv = M.dot(xyz)
595 | u = (uv[0,:]/uv[2,:]).reshape(self.data.shape[:2])
596 | v = (uv[1,:]/uv[2,:]).reshape(self.data.shape[:2])
597 |
598 | self.format_ = targetFormat.lower()
599 | self.data = image.copy()[:,::-1,...]
600 | self.interpolate(u, v, valid, order)
601 | return self
602 |
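The intrinsics built in `embed()` relate the field of view to normalized focal lengths; note that fx = fy / ar falls out directly, since the horizontal half-FOV is derived from the vertical one through the aspect ratio. A small standalone check (60 degree vfov and 4:3 aspect are arbitrary example values):

```python
import numpy as np

vfov = 60.0                        # vertical field of view, degrees
ar = 4.0 / 3.0                     # aspect ratio (width / height)

vfov_half_rad = np.deg2rad(vfov) / 2
hfov_half_rad = np.arctan(np.tan(vfov_half_rad) * ar)

fy = 0.5 / np.tan(vfov_half_rad)   # normalized vertical focal length
fx = 0.5 / np.tan(hfov_half_rad)   # equals fy / ar
```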
603 | def project(self, vfov, rotation_matrix, ar=4./3., resolution=(640, 480),
604 | projection="perspective", mode="normal", order=1):
605 | """
606 | Perform a projection onto a plane (simulates a camera).
607 |
608 | Note: this function does not modify the foreshortening present in the
609 | environment map.
610 |
611 | :vfov: Vertical Field of View (degrees).
612 | :rotation_matrix: Camera rotation matrix.
613 | :ar: Aspect ratio (width / height), defaults to 4/3.
614 | :resolution: Output size in (cols, rows), defaults to (640,480).
615 | :projection: perspective (default) or orthographic.
616 | :mode: "normal": perform crop (default)
617 | "mask": show pixel mask in the envmap,
618 | "normal+uv": returns (crop, u, v), where (u,v) are the coordinates
619 | of the crop.
620 | :param order: Integer interpolation order (0: nearest, 1: linear, ..., 5).
621 | """
622 |
623 | self.validate()
624 | assert order in range(6), "Spline interpolation order must be between 0 and 5."
625 |
626 | coords = self._cameraCoordinates(vfov, rotation_matrix, ar,
627 | resolution, projection, mode)
628 |
629 | if mode == "mask":
630 | return coords
631 |
632 | target = self.copy()
633 | if target.format_ != "latlong":
634 | target = target.convertTo("LatLong")
635 |
636 | # Get image coordinates from world sphere coordinates
637 | u, v = target.world2image(coords[0,:], coords[1,:], coords[2,:])
638 | u, v = u.reshape(resolution[::-1]), v.reshape(resolution[::-1])
639 |
640 | crop = target.interpolate(u, v, order=order).data
641 |
642 | if mode == "normal+uv":
643 | return crop, u, v
644 |
645 | return crop
646 |
647 | def _cameraCoordinates(self, vfov, rotation_matrix, ar=4./3., resolution=(640, 480),
648 | projection="perspective", mode="normal"):
649 |
650 | if mode not in ("normal", "mask", "normal+uv"):
651 | raise Exception("Unknown mode: {}.".format(mode))
652 |
653 | if projection == "orthographic":
654 | vfov = np.arctan(np.sin(vfov*np.pi/180.))*180/np.pi
655 |
656 | # aspect ratio in pixels
657 | hfov = 2 * np.arctan(np.tan(vfov*np.pi/180./2)*ar)*180/np.pi
658 |
659 | # Project angle on the sphere to the +Z plane (distance=1 from the camera)
660 | mu = np.tan(hfov/2.*np.pi/180.)
661 | mv = np.tan(vfov/2.*np.pi/180.)
662 |
663 | if mode == "mask":
664 | x, y, z, _ = self.worldCoordinates()
665 | xy = np.sqrt( (x**2 + y**2) / np.maximum(-x**2 - y**2 + 1, 1e-10) )
666 | theta = np.arctan2(x, y)
667 | x = xy*np.sin(theta)
668 | y = xy*np.cos(theta)
669 |
670 | hmask = (x > -mu) & (x < mu)
671 | vmask = (y > -mv) & (y < mv)
672 | dmask = z < 0
673 | mask = hmask & vmask & dmask
674 |
675 | e = EnvironmentMap(mask[:,:,np.newaxis], self.format_)
676 | e.rotate(rotation_matrix)
677 |
678 | mask = e.data[:,:,0]
679 | return mask
680 |
681 | # Uniform sampling on the plane
682 | dy = np.linspace(mv, -mv, resolution[1])
683 | dx = np.linspace(-mu, mu, resolution[0])
684 | x, y = np.meshgrid(dx, dy)
685 |
686 | # Compute unit length vector (project back to sphere) from the plane
687 | x, y = x.ravel(), y.ravel()
688 | if projection == "perspective":
689 | xy = np.sqrt( (x**2 + y**2) / (x**2 + y**2 + 1) )
690 | theta = np.arctan2(x, y)
691 | x = xy*np.sin(theta)
692 | y = xy*np.cos(theta)
693 | elif projection == "orthographic":
694 | pass
695 | else:
696 | raise Exception("Unknown projection: {}.".format(projection))
697 | z = -np.sqrt(1 - (x**2 + y**2))
698 | coords = np.vstack((x, y, z))
699 |
700 | # Perform rotation
701 | coords = rotation_matrix.T.dot(coords)
702 |
703 | return coords
704 |
705 |
706 | def rotation_matrix(azimuth, elevation, roll=0):
707 | """Returns a camera rotation matrix.
708 | :azimuth: left (negative) to right (positive) [rad]
709 | :elevation: upward (negative) to downward (positive) [rad]
710 | :roll: counter-clockwise (negative) to clockwise (positive) [rad]"""
711 | return rotz(roll).dot(rotx(elevation)).dot(roty(-azimuth))
712 |
713 |
714 | def downscaleEnvmap(nenvmap, sao, sat, times):
715 | """Deprecated"""
716 | print("This function is deprecated and will be removed in a future version of skylibs.")
717 | sz = sat.shape[0]
718 | return nenvmap.resize(sz)
719 |
--------------------------------------------------------------------------------
/envmap/projections.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from numpy import logical_and as land, logical_or as lor
3 |
4 |
5 | def world2latlong(x, y, z):
6 | """Get the (u, v) coordinates of the point defined by (x, y, z) for
7 | a latitude-longitude map."""
8 | u = 1 + (1 / np.pi) * np.arctan2(x, -z)
9 | v = (1 / np.pi) * np.arccos(y)
10 | # because we want [0,1] interval
11 | u = u / 2
12 | return u, v
13 |
14 |
15 | def world2skylatlong(x, y, z):
16 | """Get the (u, v) coordinates of the point defined by (x, y, z) for
17 | a sky-latitude-longitude map (the zenith hemisphere of a latlong map)."""
18 | u = 1 + (1 / np.pi) * np.arctan2(x, -z)
19 | v = (1 / np.pi) * np.arccos(y) * 2
20 | # because we want [0,1] interval
21 | u = u / 2
22 | return u, v
23 |
24 |
25 | def world2angular(x, y, z):
26 | """Get the (u, v) coordinates of the point defined by (x, y, z) for
27 | an angular map."""
28 | # world -> angular
29 |
30 | # take advantage of the division by zero handling of numpy
31 | x, y, z = np.asarray(x), np.asarray(y), np.asarray(z)
32 |
33 | denum = (2 * np.pi * np.sqrt(x**2 + y**2))
34 | rAngular = np.arccos(-z) / denum
35 | v = np.atleast_1d(0.5 - rAngular * y)
36 | u = np.atleast_1d(0.5 + rAngular * x)
37 |
38 | u[~np.isfinite(rAngular)] = 0.5
39 | # handle [0, 0, -1]
40 | v[np.isnan(rAngular)] = 0.5
41 | # handle [0, 0, 1]
42 | v[np.isinf(rAngular)] = 0.
43 |
44 | if u.size == 1:
45 | return u.item(), v.item()
46 |
47 | return u, v
48 |
49 |
50 | def latlong2world(u, v):
51 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
52 | for a latlong map."""
53 | u = u * 2
54 |
55 | # lat-long -> world
56 | thetaLatLong = np.pi * (u - 1)
57 | phiLatLong = np.pi * v
58 |
59 | x = np.sin(phiLatLong) * np.sin(thetaLatLong)
60 | y = np.cos(phiLatLong)
61 | z = -np.sin(phiLatLong) * np.cos(thetaLatLong)
62 |
63 | valid = np.ones(x.shape, dtype='bool')
64 | return x, y, z, valid
65 |
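The two latlong mappings above are inverses of each other. A standalone round-trip sketch using the same formulas, with helper names and the sample (u, v) chosen for illustration:

```python
import numpy as np

def latlong2world_xyz(u, v):
    # Same math as latlong2world above, without the valid mask.
    theta = np.pi * (u * 2 - 1)
    phi = np.pi * v
    return (np.sin(phi) * np.sin(theta),
            np.cos(phi),
            -np.sin(phi) * np.cos(theta))

def world2latlong_uv(x, y, z):
    # Same math as world2latlong above.
    u = (1 + (1 / np.pi) * np.arctan2(x, -z)) / 2
    v = (1 / np.pi) * np.arccos(y)
    return u, v

u0, v0 = 0.25, 0.6
x, y, z = latlong2world_xyz(u0, v0)
u1, v1 = world2latlong_uv(x, y, z)   # recovers (0.25, 0.6)
```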
66 |
67 | def skylatlong2world(u, v):
68 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
69 | for a sky-latlong map (the zenith hemisphere of a latlong map)."""
70 | u = u * 2
71 |
72 | # lat-long -> world
73 | thetaLatLong = np.pi * (u - 1)
74 | phiLatLong = np.pi * v / 2
75 |
76 | x = np.sin(phiLatLong) * np.sin(thetaLatLong)
77 | y = np.cos(phiLatLong)
78 | z = -np.sin(phiLatLong) * np.cos(thetaLatLong)
79 |
80 | valid = np.ones(x.shape, dtype='bool')
81 | return x, y, z, valid
82 |
83 |
84 | def angular2world(u, v):
85 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
86 | for an angular map."""
87 | # angular -> world
88 | thetaAngular = np.arctan2(-2 * v + 1, 2 * u - 1)
89 | phiAngular = np.pi * np.sqrt((2 * u - 1)**2 + (2 * v - 1)**2)
90 |
91 | x = np.sin(phiAngular) * np.cos(thetaAngular)
92 | y = np.sin(phiAngular) * np.sin(thetaAngular)
93 | z = -np.cos(phiAngular)
94 |
95 | r = (u - 0.5)**2 + (v - 0.5)**2
96 | valid = r <= .25 # .5**2
97 |
98 | return x, y, z, valid
99 |
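`world2angular` and `angular2world` should also round-trip away from the poles. A standalone sketch for a single direction (the unit vector is an arbitrary example, and the pole special cases are deliberately skipped):

```python
import numpy as np

x, y, z = 0.5, 0.5, -np.sqrt(0.5)              # arbitrary unit direction

# world2angular (no special-case handling needed away from the poles)
rAngular = np.arccos(-z) / (2 * np.pi * np.sqrt(x**2 + y**2))
u = 0.5 + rAngular * x
v = 0.5 - rAngular * y

# angular2world
theta = np.arctan2(-2 * v + 1, 2 * u - 1)
phi = np.pi * np.sqrt((2 * u - 1)**2 + (2 * v - 1)**2)
back = np.array([np.sin(phi) * np.cos(theta),
                 np.sin(phi) * np.sin(theta),
                 -np.cos(phi)])                # recovers (x, y, z)
```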
100 |
101 | def skyangular2world(u, v):
102 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
103 | for a sky angular map."""
104 | # skyangular -> world
105 | thetaAngular = np.arctan2(-2 * v + 1, 2 * u - 1) # azimuth
106 | phiAngular = np.pi / 2 * np.sqrt((2 * u - 1)**2 + (2 * v - 1)**2) # zenith
107 |
108 | x = np.sin(phiAngular) * np.cos(thetaAngular)
109 | z = np.sin(phiAngular) * np.sin(thetaAngular)
110 | y = np.cos(phiAngular)
111 |
112 | r = (u - 0.5)**2 + (v - 0.5)**2
113 | valid = r <= .25 # .5**2
114 |
115 | return x, y, z, valid
116 |
117 |
118 | def world2skyangular(x, y, z):
119 | """Get the (u, v) coordinates of the point defined by (x, y, z) for
120 | a sky angular map."""
121 | # world -> skyangular
122 | thetaAngular = np.arctan2(x, z) # azimuth
123 | phiAngular = np.arctan2(np.sqrt(x**2 + z**2), y) # zenith
124 |
125 | r = phiAngular / (np.pi / 2)
126 |
127 | u = 1. / 2 + r * np.sin(thetaAngular) / 2
128 | v = 1. / 2 - r * np.cos(thetaAngular) / 2
129 |
130 | return u, v
131 |
132 |
133 | def sphere2world(u, v):
134 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
135 | for the sphere map."""
136 | u = u * 2 - 1
137 | v = v * 2 - 1
138 |
139 | # sphere -> world
140 | r = np.sqrt(u**2 + v**2)
141 | theta = np.arctan2(u, -v)
142 |
143 | phi = np.zeros(theta.shape)
144 | valid = r <= 1
145 | phi[valid] = 2 * np.arcsin(r[valid])
146 |
147 | x = np.sin(phi) * np.sin(theta)
148 | y = np.sin(phi) * np.cos(theta)
149 | z = -np.cos(phi)
150 | return x, y, z, valid
151 |
152 |
153 | def world2sphere(x, y, z):
154 | # world -> sphere
155 |
156 | # take advantage of the division by zero handling of numpy
157 | x, y, z = np.asarray(x), np.asarray(y), np.asarray(z)
158 |
159 | denum = (2 * np.sqrt(x**2 + y**2))
160 | with np.errstate(divide='ignore', invalid='ignore'):
161 | r = np.sin(.5 * np.arccos(-z)) / denum
162 |
163 | u = np.atleast_1d(.5 + r * x)
164 | v = np.atleast_1d(.5 - r * y)
165 |
166 | u[~np.isfinite(r)] = 0.5
167 | # handle [0, 0, -1]
168 | v[np.isnan(r)] = 0.5
169 | # handle [0, 0, 1]
170 | v[np.isinf(r)] = 0.
171 |
172 | if u.size == 1:
173 | return u.item(), v.item()
174 |
175 | return u, v
176 |
177 |
178 | def world2cube(x, y, z):
179 | # world -> cube
180 | x = np.atleast_1d(np.asarray(x))
181 | y = np.atleast_1d(np.asarray(y))
182 | z = np.atleast_1d(np.asarray(z))
183 | u = np.zeros(x.shape)
184 | v = np.zeros(x.shape)
185 |
186 | # forward
187 | indForward = np.nonzero(
188 | land(land(z <= 0, z <= -np.abs(x)), z <= -np.abs(y)))
189 | u[indForward] = 1.5 - 0.5 * x[indForward] / z[indForward]
190 | v[indForward] = 1.5 + 0.5 * y[indForward] / z[indForward]
191 |
192 | # backward
193 | indBackward = np.nonzero(
194 | land(land(z >= 0, z >= np.abs(x)), z >= np.abs(y)))
195 | u[indBackward] = 1.5 + 0.5 * x[indBackward] / z[indBackward]
196 | v[indBackward] = 3.5 + 0.5 * y[indBackward] / z[indBackward]
197 |
198 | # down
199 | indDown = np.nonzero(
200 | land(land(y <= 0, y <= -np.abs(x)), y <= -np.abs(z)))
201 | u[indDown] = 1.5 - 0.5 * x[indDown] / y[indDown]
202 | v[indDown] = 2.5 - 0.5 * z[indDown] / y[indDown]
203 |
204 | # up
205 | indUp = np.nonzero(land(land(y >= 0, y >= np.abs(x)), y >= np.abs(z)))
206 | u[indUp] = 1.5 + 0.5 * x[indUp] / y[indUp]
207 | v[indUp] = 0.5 - 0.5 * z[indUp] / y[indUp]
208 |
209 | # left
210 | indLeft = np.nonzero(
211 | land(land(x <= 0, x <= -np.abs(y)), x <= -np.abs(z)))
212 | u[indLeft] = 0.5 + 0.5 * z[indLeft] / x[indLeft]
213 | v[indLeft] = 1.5 + 0.5 * y[indLeft] / x[indLeft]
214 |
215 | # right
216 | indRight = np.nonzero(land(land(x >= 0, x >= np.abs(y)), x >= np.abs(z)))
217 | u[indRight] = 2.5 + 0.5 * z[indRight] / x[indRight]
218 | v[indRight] = 1.5 - 0.5 * y[indRight] / x[indRight]
219 |
220 | # bring back in the [0,1] intervals
221 | u = u / 3.
222 | v = v / 4.
223 |
224 | if u.size == 1:
225 | return u.item(), v.item()
226 |
227 | return u, v
228 |
229 |
230 | def cube2world(u, v):
231 | u = np.atleast_1d(np.asarray(u))
232 | v = np.atleast_1d(np.asarray(v))
233 |
234 | # [u,v] = meshgrid(0:3/(3*dim-1):3, 0:4/(4*dim-1):4);
235 | # u and v are in the [0,1] interval, so put them back to [0,3]
236 | # and [0,4]
237 | u = u * 3
238 | v = v * 4
239 |
240 | x = np.zeros(u.shape)
241 | y = np.zeros(u.shape)
242 | z = np.zeros(u.shape)
243 | valid = np.zeros(u.shape, dtype='bool')
244 |
245 | # up
246 | indUp = land(land(u >= 1, u < 2), v < 1)
247 | x[indUp] = (u[indUp] - 1.5) * 2
248 | y[indUp] = 1
249 | z[indUp] = (v[indUp] - 0.5) * -2
250 |
251 | # left
252 | indLeft = land(land(u < 1, v >= 1), v < 2)
253 | x[indLeft] = -1
254 | y[indLeft] = (v[indLeft] - 1.5) * -2
255 | z[indLeft] = (u[indLeft] - 0.5) * -2
256 |
257 | # forward
258 | indForward = land(land(land(u >= 1, u < 2), v >= 1), v < 2)
259 | x[indForward] = (u[indForward] - 1.5) * 2
260 | y[indForward] = (v[indForward] - 1.5) * -2
261 | z[indForward] = -1
262 |
263 | # right
264 | indRight = land(land(u >= 2, v >= 1), v < 2)
265 | x[indRight] = 1
266 | y[indRight] = (v[indRight] - 1.5) * -2
267 | z[indRight] = (u[indRight] - 2.5) * 2
268 |
269 | # down
270 | indDown = land(land(land(u >= 1, u < 2), v >= 2), v < 3)
271 | x[indDown] = (u[indDown] - 1.5) * 2
272 | y[indDown] = -1
273 | z[indDown] = (v[indDown] - 2.5) * 2
274 |
275 | # backward
276 | indBackward = land(land(u >= 1, u < 2), v >= 3)
277 | x[indBackward] = (u[indBackward] - 1.5) * 2
278 | y[indBackward] = (v[indBackward] - 3.5) * 2
279 | z[indBackward] = 1
280 |
281 | # normalize
282 | # np.hypot(x, y, z) #sqrt(x.^2 + y.^2 + z.^2);
283 | norm = np.sqrt(x**2 + y**2 + z**2)
284 | with np.errstate(divide='ignore', invalid='ignore'):
285 | x = x / norm
286 | y = y / norm
287 | z = z / norm
288 |
289 | # return valid indices
290 | valid_ind = lor(
291 | lor(lor(indUp, indLeft), lor(indForward, indRight)), lor(indDown, indBackward))
292 | valid[valid_ind] = 1
293 |
294 | if x.size == 1:
295 | return x.item(), y.item(), z.item(), valid.item()
296 |
297 | return x, y, z, valid
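The cube format lays its six faces on a 3x4 cross. For instance, the center of the forward face, at (u, v) = (0.5, 0.375), should map to the -Z direction. A standalone sketch of that single branch (face layout taken from `cube2world` above):

```python
import numpy as np

u, v = 0.5 * 3, 0.375 * 4      # back to the [0,3] x [0,4] face grid

# forward-face branch (1 <= u < 2, 1 <= v < 2)
x = (u - 1.5) * 2
y = (v - 1.5) * -2
z = -1.0

direction = np.array([x, y, z])
direction /= np.linalg.norm(direction)   # unit vector towards -Z
```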
298 |
--------------------------------------------------------------------------------
/envmap/rotations.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 |
4 | def rotx(theta):
5 | """
6 | Produces a counter-clockwise 3D rotation matrix around axis X with angle `theta` in radians.
7 | """
8 | return np.array([[1, 0, 0],
9 | [0, np.cos(theta), -np.sin(theta)],
10 | [0, np.sin(theta), np.cos(theta)]], dtype='float64')
11 |
12 |
13 | def roty(theta):
14 | """
15 | Produces a counter-clockwise 3D rotation matrix around axis Y with angle `theta` in radians.
16 | """
17 | return np.array([[np.cos(theta), 0, -np.sin(theta)],
18 | [0, 1, 0],
19 | [np.sin(theta), 0, np.cos(theta)]], dtype='float64')
20 |
21 |
22 | def rotz(theta):
23 | """
24 | Produces a counter-clockwise 3D rotation matrix around axis Z with angle `theta` in radians.
25 | """
26 | return np.array([[np.cos(theta), -np.sin(theta), 0],
27 | [np.sin(theta), np.cos(theta), 0],
28 | [0, 0, 1]], dtype='float64')
29 |
30 |
31 | def rot(theta=(0,0,0)):
32 | """
33 | Produces a counter-clockwise 3D rotation matrix around axis X, Y and Z with angles `theta` in radians.
34 | """
35 | return np.dot(rotz(theta[2]), np.dot(roty(theta[1]), rotx(theta[0])))
36 |
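A quick sanity check on the conventions above: a counter-clockwise 90 degree rotation around X sends +Y to +Z, and rotation matrices are orthonormal. The sketch redefines `rotx` locally so it stands alone:

```python
import numpy as np

def rotx(theta):
    # Same matrix as envmap.rotations.rotx above.
    return np.array([[1, 0, 0],
                     [0, np.cos(theta), -np.sin(theta)],
                     [0, np.sin(theta), np.cos(theta)]])

# +Y rotated 90 degrees counter-clockwise around X lands on +Z.
v = rotx(np.pi / 2).dot(np.array([0., 1., 0.]))

# Rotation matrices are orthonormal: R^T R = I.
R = rotx(0.3)
identity_check = R.T.dot(R)
```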
--------------------------------------------------------------------------------
/envmap/tetrahedronSolidAngle.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from numpy import tan, arctan, arccos, sqrt
3 |
4 |
5 | def tetrahedronSolidAngle(a, b, c, lhuillier=True):
6 | """ Computes the solid angle subtended by a tetrahedron.
7 |
8 | omega = tetrahedronSolidAngle(a, b, c)
9 |
10 | The tetrahedron is defined by three vectors (a, b, c) which define the
11 | vertices of the triangle with respect to an origin.
12 |
13 | For more details, see:
14 | http://en.wikipedia.org/wiki/Solid_angle#Tetrahedron
15 |
16 | Both methods are implemented, but L'Huillier (default) is easier to
17 | parallelize and thus much faster.
18 |
19 | ----------
20 | Jean-Francois Lalonde
21 | """
22 | assert a.shape[0] == 3, 'a must be a 3xN matrix'
23 | assert b.shape[0] == 3, 'b must be a 3xN matrix'
24 | assert c.shape[0] == 3, 'c must be a 3xN matrix'
25 |
26 | if lhuillier:
27 | theta_a = arccos(np.sum(b*c, 0))
28 | theta_b = arccos(np.sum(a*c, 0))
29 | theta_c = arccos(np.sum(a*b, 0))
30 |
31 | theta_s = (theta_a + theta_b + theta_c) / 2
32 |
33 | product = tan(theta_s/2) * tan((theta_s-theta_a) / 2) \
34 | * tan((theta_s-theta_b) / 2) * tan((theta_s-theta_c) / 2)
35 |
36 | product[product < 0] = 0
37 | omega = 4 * arctan( sqrt(product) )
38 | else:
39 | raise NotImplementedError()
40 |
41 | return omega
42 |
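As a sketch of the L'Huillier branch above: the spherical triangle spanned by the three unit basis vectors covers one octant of the sphere, so its solid angle should be 4*pi/8 = pi/2 steradians.

```python
import numpy as np
from numpy import tan, arctan, arccos, sqrt

# Octant spanned by the unit basis vectors, as 3x1 column vectors.
a = np.array([[1.], [0.], [0.]])
b = np.array([[0.], [1.], [0.]])
c = np.array([[0.], [0.], [1.]])

theta_a = arccos(np.sum(b * c, 0))
theta_b = arccos(np.sum(a * c, 0))
theta_c = arccos(np.sum(a * b, 0))
theta_s = (theta_a + theta_b + theta_c) / 2

product = tan(theta_s / 2) * tan((theta_s - theta_a) / 2) \
    * tan((theta_s - theta_b) / 2) * tan((theta_s - theta_c) / 2)
omega = 4 * arctan(sqrt(np.maximum(product, 0)))   # pi/2 for an octant
```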
--------------------------------------------------------------------------------
/envmap/xmlhelper.py:
--------------------------------------------------------------------------------
1 | import xml.etree.ElementTree as ET
2 |
3 |
4 | class EnvmapXMLParser:
5 | """
6 | Parser for the metadata file (filename.meta.xml).
7 | """
8 | def __init__(self, filename):
9 | self.tree = ET.parse(filename)
10 | self.root = self.tree.getroot()
11 |
12 | def _getFirstChildTag(self, tag):
13 | for elem in self.root:
14 | if elem.tag == tag:
15 | return elem.attrib
16 |
17 | def _getAttrib(self, node, attribute, default=None):
18 | if node is not None:
19 | return node.get(attribute, default)
20 | return default
21 |
22 | def getFormat(self):
23 | """Returns the format of the environment map."""
24 | node = self._getFirstChildTag('data')
25 | return self._getAttrib(node, 'format', 'Unknown')
26 |
27 | def getDate(self):
28 | """Returns the date of the environment map in dict format."""
29 | return self._getFirstChildTag('date')
30 |
31 | def getExposure(self):
32 | """Returns the exposure of the environment map in EV."""
33 | node = self._getFirstChildTag('exposure')
34 | return self._getAttrib(node, 'EV')
35 |
--------------------------------------------------------------------------------
/ezexr/__init__.py:
--------------------------------------------------------------------------------
1 | import warnings
2 |
3 | import numpy as np
4 | import re
5 |
6 | from skylibs import __version__
7 |
8 |
9 | try:
10 | import OpenEXR
11 | import Imath
12 |
13 | except Exception as e:
14 | pass
15 |
16 |
17 | def imread(filename, bufferImage=None, rgb=True, whitelisted_channels=None):
18 | r"""
19 | Reads an .exr image and returns a numpy array or a dict of channels.
20 |
21 | Does not support .exr with varying channels sizes.
22 |
23 | :bufferImage: If not None, then it should be a numpy array
24 | of a sufficient size to contain the data.
25 | If it is None, a new array is created and returned.
26 | :rgb: If True: tries to get the RGB(A) channels as an image
27 | If False: Returns all channels in a dict()
28 | If "hybrid": ".[R|G|B|A|X|Y|Z]" -> merged to an image
29 | Useful for Blender Cycles' output.
30 | :whitelisted_channels: If not None, then it should be a list of channel names to read in Regex format, e.g. [r"\.V", r""].
31 | By default, all channels are read, which may be slower when using some compression formats (e.g. DWAA).
32 | """
33 | if 'OpenEXR' not in globals():
34 | print(">>> Install OpenEXR-Python with `conda install -c conda-forge openexr openexr-python`\n\n")
35 | raise Exception("Please Install OpenEXR-Python")
36 |
37 | # Open the input file
38 | f = OpenEXR.InputFile(filename)
39 |
40 | # Get the header (stored in a variable because this function reads the file each time it is called)
41 | header = f.header()
42 |
43 | # Compute the size
44 | dw = header['dataWindow']
45 | h, w = dw.max.y - dw.min.y + 1, dw.max.x - dw.min.x + 1
46 |
47 | # Use the attribute "v" of PixelType objects because they have no __eq__
48 | pixformat_mapping = {Imath.PixelType(Imath.PixelType.FLOAT).v: np.float32,
49 | Imath.PixelType(Imath.PixelType.HALF).v: np.float16,
50 | Imath.PixelType(Imath.PixelType.UINT).v: np.uint32}
51 |
52 | # Get the number of channels
53 | nc = len(header['channels'])
54 |
55 | # Check the data type
56 | dtGlobal = list(header['channels'].values())[0].type
57 |
58 | if rgb is True:
59 | # Create the read buffer if needed
60 | data = bufferImage if bufferImage is not None else np.empty((h, w, nc), dtype=pixformat_mapping[dtGlobal.v])
61 |
62 | if nc == 1: # Greyscale
63 | cname = list(header['channels'].keys())[0]
64 | data = np.frombuffer(f.channel(cname), dtype=pixformat_mapping[dtGlobal.v]).reshape(h, w, 1)
65 | else:
66 | assert 'R' in header['channels'] and 'G' in header['channels'] and 'B' in header['channels'], "Not a grayscale image, but no RGB data!"
67 | channelsToUse = ('R', 'G', 'B', 'A') if 'A' in header['channels'] else ('R', 'G', 'B')
68 | nc = len(channelsToUse)
69 | for i, c in enumerate(channelsToUse):
70 | # Check the data type
71 | dt = header['channels'][c].type
72 | if dt.v != dtGlobal.v:
73 | data[:, :, i] = np.frombuffer(f.channel(c), dtype=pixformat_mapping[dt.v]).reshape((h, w)).astype(pixformat_mapping[dtGlobal.v])
74 | else:
75 | data[:, :, i] = np.frombuffer(f.channel(c), dtype=pixformat_mapping[dt.v]).reshape((h, w))
76 | else:
77 | data = {}
78 |
79 | for i, c in enumerate(header['channels']):
80 | if whitelisted_channels is not None:
81 | if not any([re.match(pattern, c) for pattern in whitelisted_channels]):
82 | continue
83 | dt = header['channels'][c].type
84 | data[c] = np.frombuffer(f.channel(c), dtype=pixformat_mapping[dt.v]).reshape((h, w))
85 |
86 | if rgb == "hybrid":
87 | ordering = {key: i for i, key in enumerate("RGBAXYZ")}
88 |
89 | new_data = {}
90 | for c in data.keys():
91 |
92 | ident = c.split(".")[0]
93 | try:
94 | chan = c.split(".")[1]
95 | except IndexError:
96 | chan = "R"
97 |
98 | if ident not in new_data:
99 | all_chans = [x.split(".")[1] for x in data if x.startswith(ident + ".")]
100 | nc = len(all_chans)
101 | new_data[ident] = np.empty((h, w, nc), dtype=np.float32)
102 | for i, chan in enumerate(sorted(all_chans, key=lambda v: ordering.get(v, len(ordering)))):
103 | new_data[ident][:,:,i] = data["{}.{}".format(ident, chan)].astype(new_data[ident].dtype)
104 |
105 | data = new_data
106 |
107 | f.close()
108 |
109 | return data
110 |
111 |
112 | def imwrite(filename, arr, **params):
113 | """
114 | Write an .exr file from an input array.
115 |
116 | Optional params :
117 | channel_names = name of the channels, defaults to "RGB" for 3-channel, "Y" for grayscale, and "Y{n}" for N channels.
118 | compression = 'NONE' | 'RLE' | 'ZIPS' | 'ZIP' | 'PIZ' | 'PXR24' | 'B44' | 'B44A' | 'DWAA' | 'DWAB' (default: PIZ)
119 | pixeltype = 'HALF' | 'FLOAT' | 'UINT' (default: matches the input dtype if float16, float32 or uint32; otherwise auto-detected from the value range)
120 |
121 | """
122 |
123 | if arr.ndim == 3:
124 | h, w, d = arr.shape
125 | elif arr.ndim == 2:
126 | h, w = arr.shape
127 | d = 1
128 | else:
129 | raise Exception("Could not understand dimensions in array.")
130 |
131 | if "channel_names" in params:
132 | ch_names = params["channel_names"]
133 | assert len(ch_names) >= d, "Provide as many channel names as channels in the array."
134 | else:
135 | if d == 1:
136 | ch_names = ["Y"]
137 | elif d == 3:
138 | ch_names = ["R","G","B"]
139 | elif d == 4:
140 | ch_names = ["R","G","B","A"]
141 | else:
142 | length = len(str(d - 1))
143 | ch_names = ['Y{}'.format(str(idx).zfill(length)) for idx in range(d)]
144 |
145 | if 'OpenEXR' not in globals():
146 | print(">>> Install OpenEXR-Python with `conda install -c conda-forge openexr openexr-python`\n\n")
147 | raise Exception("Please Install OpenEXR-Python")
148 |
149 | supported_compressions = ('NONE', 'RLE', 'ZIPS', 'ZIP', 'PIZ', 'PXR24', 'B44', 'B44A', 'DWAA', 'DWAB')
150 | compression = params['compression'] if params.get('compression') in supported_compressions else 'PIZ'
151 | imath_compression = {'NONE' : Imath.Compression(Imath.Compression.NO_COMPRESSION),
152 | 'RLE' : Imath.Compression(Imath.Compression.RLE_COMPRESSION),
153 | 'ZIPS' : Imath.Compression(Imath.Compression.ZIPS_COMPRESSION),
154 | 'ZIP' : Imath.Compression(Imath.Compression.ZIP_COMPRESSION),
155 | 'PIZ' : Imath.Compression(Imath.Compression.PIZ_COMPRESSION),
156 | 'PXR24' : Imath.Compression(Imath.Compression.PXR24_COMPRESSION),
157 | 'B44' : Imath.Compression(Imath.Compression.B44_COMPRESSION),
158 | 'B44A' : Imath.Compression(Imath.Compression.B44A_COMPRESSION),
159 | 'DWAA' : Imath.Compression(Imath.Compression.DWAA_COMPRESSION),
160 | 'DWAB' : Imath.Compression(Imath.Compression.DWAB_COMPRESSION)}[compression]
161 |
162 |
163 | if 'pixeltype' in params and params['pixeltype'] in ('HALF', 'FLOAT', 'UINT'):
164 | # User-defined pixel type
165 | pixformat = params['pixeltype']
166 | elif arr.dtype == np.float32:
167 | pixformat = 'FLOAT'
168 | elif arr.dtype == np.uint32:
169 | pixformat = 'UINT'
170 | elif arr.dtype == np.float16:
171 | pixformat = 'HALF'
172 | else:
173 | # Default : Auto detect
174 | arr_fin = arr[np.isfinite(arr)]
175 | the_max = np.abs(arr_fin).max()
176 | the_min = np.abs(arr_fin[arr_fin > 0]).min()
177 |
178 | if the_max <= 65504. and the_min >= 1e-7:
179 | print("Autodetected HALF (FLOAT16) format")
180 | pixformat = 'HALF'
181 | elif the_max < 3.402823e+38 and the_min >= 1.18e-38:
182 | print("Autodetected FLOAT32 format")
183 | pixformat = 'FLOAT'
184 | else:
185 | raise Exception('Could not convert array into exr without loss of information '
186 | '(a value would be rounded to infinity or 0)')
187 | warnings.warn("imwrite received an array with dtype={}, which cannot be saved in EXR format. "
188 | "Falling back to {}, which can represent all the values in the array.".format(arr.dtype, pixformat), RuntimeWarning)
189 |
190 | imath_pixformat = {'HALF' : Imath.PixelType(Imath.PixelType.HALF),
191 | 'FLOAT' : Imath.PixelType(Imath.PixelType.FLOAT),
192 | 'UINT' : Imath.PixelType(Imath.PixelType.UINT)}[pixformat]
193 | numpy_pixformat = {'HALF' : 'float16',
194 | 'FLOAT' : 'float32',
195 | 'UINT' : 'uint32'}[pixformat] # Not sure for the last one...
196 |
197 | # Convert to strings
198 | if d == 1:
199 | data = [ arr.astype(numpy_pixformat).tobytes() ]
200 | else:
201 | data = [ arr[:,:,c].astype(numpy_pixformat).tobytes() for c in range(d) ]
202 |
203 | outHeader = OpenEXR.Header(w, h)
204 | outHeader['compression'] = imath_compression # Apply compression
205 | outHeader['channels'] = { # Apply pixel format
206 | ch_names[i]: Imath.Channel(imath_pixformat, 1, 1) for i in range(d)
207 | }
208 |
209 | # Write the three color channels to the output file
210 | out = OpenEXR.OutputFile(filename, outHeader)
211 | if d == 1:
212 | out.writePixels({ch_names[0] : data[0] })
213 | elif d == 3:
214 | out.writePixels({ch_names[0] : data[0], ch_names[1] : data[1], ch_names[2] : data[2] })
215 | else:
216 | out.writePixels({ch_names[c] : data[c] for c in range(d)})
217 |
218 | out.close()
219 |
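The range check in the auto-detection branch above follows from the limits of EXR's HALF type: float16 overflows to infinity past 65504 and flushes to zero below the smallest subnormal (about 6e-8). A small standalone illustration:

```python
import numpy as np

# Largest finite float16 magnitude is 65504; anything larger overflows.
in_range = np.float16(65504.0)
overflow = np.float16(70000.0)       # becomes inf

# Values below the smallest float16 subnormal (~5.96e-8) round to zero.
underflow = np.float16(1e-8)
```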
220 |
221 | imsave = imwrite
222 |
223 |
224 | __all__ = ['imread', 'imwrite', 'imsave']
225 |
--------------------------------------------------------------------------------
/hdrio/__init__.py:
--------------------------------------------------------------------------------
1 | import os
2 | import subprocess
3 |
4 | import numpy as np
5 | import imageio.v3 as imageio
6 | from skylibs import __version__
7 |
8 |
9 | try:
10 | import ezexr
11 | except ImportError as e:
12 | print("Could not import exr module:", e)
13 |
14 | imsave_ldr = imageio.imwrite
15 |
16 |
17 | def imwrite(data, filename):
18 | _, ext = os.path.splitext(filename.lower())
19 | if ext == '.exr':
20 | ezexr.imwrite(filename, data)
21 | elif ext in ['.hdr', '.pic']:
22 | _hdr_write(filename, data)
23 | else:
24 | imsave_ldr(filename, np.clip(255.*data, 0, 255).astype('uint8'))
25 |
26 |
27 | def imsave(filename, data):
28 | imwrite(data, filename)
29 |
30 |
31 | def imread(filename, format_="float32"):
32 | """Reads an image. Supports exr, hdr, cr2, tiff, jpg, png and
33 | everything imageio supports.
34 |
35 | :filename: file path.
36 | :format_: format in which to return the value. If set to "native", the
37 | native format of the file will be given (e.g. uint8 for jpg).
38 | """
39 | ldr = False
40 | _, ext = os.path.splitext(filename.lower())
41 |
42 | if ext == '.exr':
43 | im = ezexr.imread(filename)
44 | elif ext in ['.hdr', '.pic']:
45 | im = _hdr_read(filename)
46 | elif ext in ['.cr2', '.nef', '.raw']:
47 | im = _raw_read(filename)
48 | elif ext in ['.tiff', '.tif']:
49 | try:
50 | import tifffile as tiff
51 | except ImportError:
52 | print('Install tifffile for better tiff support. Falling back to '
53 | 'imageio.')
54 | im = imageio.imread(filename)
55 | else:
56 | im = tiff.imread(filename)
57 | else:
58 | im = imageio.imread(filename)
59 | ldr = True
60 |
61 | if format_ == "native":
62 | return im
63 | elif ldr and not 'int' in format_:
64 | ii = np.iinfo(im.dtype)
65 | return im.astype(format_) / ii.max
66 | else:
67 | return im.astype(format_)
68 |
69 |
70 | def _raw_read(filename):
71 | """Calls the dcraw program to unmosaic the raw image."""
72 | fn, _ = os.path.splitext(filename.lower())
73 | target_file = "{}.tiff".format(fn)
74 | if not os.path.exists(target_file):
75 |         ret = subprocess.call(['dcraw', '-v', '-T', '-4', '-t', '0', '-j', filename])
76 | if ret != 0:
77 | raise Exception('Could not execute dcraw. Make sure the executable'
78 | ' is available.')
79 | try:
80 | import tifffile as tiff
81 | except ImportError:
82 | raise Exception('Install tifffile to read the converted tiff file.')
83 | else:
84 | return tiff.imread(target_file)
85 |
86 |
87 | def _hdr_write(filename, data, **kwargs):
88 | """Write a Radiance hdr file.
89 | Refer to the ImageIO API ( http://imageio.readthedocs.io/en/latest/userapi.html
90 | ) for parameter description."""
91 |
92 | imageio.imwrite(filename, data, **kwargs)
93 |
94 |
95 | def _hdr_read(filename, use_imageio=False):
96 | """Read hdr file.
97 |
98 | .. TODO:
99 |
100 | * Support axis other than -Y +X
101 | """
102 | if use_imageio:
103 |         return imageio.imread(filename)
104 |
105 | with open(filename, "rb") as f:
106 | MAGIC = f.readline().strip()
107 | assert MAGIC == b'#?RADIANCE', "Wrong header found in {}".format(filename)
108 | comments = b""
109 | while comments[:6] != b"FORMAT":
110 | comments = f.readline().strip()
111 | assert comments[:3] != b"-Y ", "Could not find data format"
112 | assert comments == b'FORMAT=32-bit_rle_rgbe', "Format not supported"
113 | while comments[:3] != b"-Y ":
114 | comments = f.readline().strip()
115 | _, height, _, width = comments.decode("ascii").split(" ")
116 | height, width = int(height), int(width)
117 | rgbe = np.fromfile(f, dtype=np.uint8).reshape((height, width, 4))
118 |         rgb = np.empty((height, width, 3), dtype=np.float64)
119 | rgb[...,0] = np.ldexp(rgbe[...,0], rgbe[...,3].astype('int') - 128)
120 | rgb[...,1] = np.ldexp(rgbe[...,1], rgbe[...,3].astype('int') - 128)
121 | rgb[...,2] = np.ldexp(rgbe[...,2], rgbe[...,3].astype('int') - 128)
122 | # TODO: This will rescale all the values to be in [0, 1]. Find a way to retrieve the original values.
123 | rgb /= rgb.max()
124 | return rgb
125 |
126 |
127 | __all__ = ['imwrite', 'imsave', 'imread']
128 |
--------------------------------------------------------------------------------
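The RGBE decoding in `_hdr_read` can be illustrated standalone: each Radiance pixel stores three 8-bit mantissas plus a shared exponent biased by 128, and `np.ldexp` applies the scale. A minimal sketch with made-up pixel values (note the true Radiance decode also divides the mantissa by 256; `_hdr_read` instead normalizes by the image maximum afterwards):

```python
import numpy as np

# Two hypothetical RGBE pixels: 8-bit mantissas + shared exponent (bias 128),
# decoded the same way as hdrio._hdr_read: value = mantissa * 2^(e - 128).
rgbe = np.array([[[128, 64, 32, 129],     # exponent 129 -> scale 2^1
                  [200, 100, 50, 127]]],  # exponent 127 -> scale 2^-1
                dtype=np.uint8)

exponent = rgbe[..., 3].astype(int) - 128
rgb = np.ldexp(rgbe[..., :3].astype(float), exponent[..., None])
# rgb[0, 0] -> [256., 128., 64.];  rgb[0, 1] -> [100., 50., 25.]
```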
/hdrtools/__init__.py:
--------------------------------------------------------------------------------
1 | from .gsolve import gsolve, weights
2 | from .sunutils import findBrightestSpot, sunPosition_fromEnvmap, sunPosition_pySolar_zenithAzimuth, sunPosition_pySolar_XYZ
--------------------------------------------------------------------------------
/hdrtools/gsolve.py:
--------------------------------------------------------------------------------
1 | # Taken from Debevec1997 "Recovering High Dynamic Range Radiance Maps
2 | # from Photographs"
3 | #
4 | # gsolve.py - Solve for imaging system response function
5 | #
6 | # Given a set of pixel values observed for several pixels in several
7 | # images with different exposure times, this function returns the
8 | # imaging system's response function g as well as the log film irradiance
9 | # values for the observed pixels.
10 | #
11 | # Assumes:
12 | #
13 | # Zmin = 0
14 | # Zmax = 255
15 | #
16 | # Arguments:
17 | #
18 | # Z(i,j) is the pixel values of pixel location number i in image j
19 | # B(j) is the log delta t, or log shutter speed, for image j
20 | # l is lambda, the constant that determines the amount of smoothness
21 | # w(z) is the weighting function value for pixel value z
22 | #
23 | # Returns:
24 | #
25 | # g(z) is the log exposure corresponding to pixel value z
26 | # lE(i) is the log film irradiance at pixel location i
27 | #
28 |
29 | import numpy as np
30 |
31 |
32 | def gsolve(Z, B, l, w):
33 | n = 256
34 | A = np.zeros((Z.shape[0]*Z.shape[1] + n - 1, n + Z.shape[0]), dtype=float)
35 | b = np.zeros((A.shape[0], 1), dtype=float)
36 |
37 | # Include the data-fitting equations
38 |     k = 0
39 | for i in range(Z.shape[0]):
40 | for j in range(Z.shape[1]):
41 | wij = w[Z[i,j]]
42 | A[k, Z[i,j]] = wij
43 | A[k, n + i] = -wij
44 | b[k, 0] = wij * B[j]
45 | k += 1
46 |
47 | # Fix the curve by setting its middle value to 0
48 | A[k, 128] = 1
49 | k += 1
50 |
51 | # Include the smoothness equations
52 | for i in range(n - 2):
53 | A[k, i+0] = l*w[i+1]
54 | A[k, i+1] = -2*l*w[i+1]
55 | A[k, i+2] = l*w[i+1]
56 | k += 1
57 |
58 | # Solve the system using SVD
59 |     x = np.linalg.lstsq(A, b, rcond=None)[0]
60 | g = x[:n]
61 | lE = x[n:]
62 |
63 | return g, lE
64 |
65 |
66 | def weights(z_min=0, z_max=255):
67 | """Outputs the weights of the z(i) input pixels.
68 | This is a direct implementation of eq. 4."""
69 | z = np.array(range(z_min, z_max + 1), dtype='float')
70 |
71 | lower = z <= 0.5*(z_min + z_max)
72 | upper = z > 0.5*(z_min + z_max)
73 | z[lower] = z[lower] - z_min
74 | z[upper] = z_max - z[upper]
75 | z /= z.max()
76 |
77 | return z
78 |
--------------------------------------------------------------------------------
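The `weights` function above implements the triangle ("hat") weighting of Debevec & Malik's eq. 4. A compact standalone restatement using `np.where` instead of boolean indexing behaves identically:

```python
import numpy as np

def hat_weights(z_min=0, z_max=255):
    # Triangle ("hat") weighting: zero at the extremes, peak at mid-range,
    # so under- and over-exposed pixels contribute little to the fit.
    z = np.arange(z_min, z_max + 1, dtype=float)
    w = np.where(z <= 0.5 * (z_min + z_max), z - z_min, z_max - z)
    return w / w.max()

w = hat_weights()
# w[0] == 0.0, w[127] == 1.0, w[255] == 0.0
```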
/hdrtools/sunutils.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import scipy.ndimage
4 |
5 | from pysolar import solar
6 |
7 | import envmap
8 | from envmap.projections import latlong2world
9 |
10 |
11 | def findBrightestSpot(image, minpct=99.99):
12 | """
13 | Find the sun position (in pixels, in the current projection) using the image.
14 | """
15 | if isinstance(image, envmap.EnvironmentMap):
16 | image = image.data
17 |
18 | # Gaussian filter
19 |     filteredimg = scipy.ndimage.gaussian_filter(image, (5, 5, 0))
20 |
21 | # Intensity image
22 |     if filteredimg.ndim == 3 and filteredimg.shape[2] >= 3:
23 | intensityimg = np.dot( filteredimg[:,:,:3], [.299, .587, .114] )
24 | else:
25 | intensityimg = filteredimg
26 | intensityimg[~np.isfinite(intensityimg)] = 0
27 |
28 |     # Look for the value at the *minpct* percentile and threshold at this value
29 | # We do not take into account the pixels with a value of 0
30 | minval = np.percentile(intensityimg[intensityimg > 0], minpct)
31 | thresholdmap = intensityimg >= minval
32 |
33 | # Label the regions in the thresholded image
34 |     labelarray, n = scipy.ndimage.label(thresholdmap, np.ones((3, 3), dtype="bool"))
35 |
36 | # Find the size of each of them
37 | funcsize = lambda x: x.size
38 |     patchsizes = scipy.ndimage.labeled_comprehension(intensityimg,
39 |                                                      labelarray,
40 |                                                      index=np.arange(1, n+1),
41 |                                                      func=funcsize,
42 |                                                      out_dtype=np.uint32,
43 |                                                      default=0.0)
44 |
45 | # Find the biggest one (we must add 1 because the label 0 is the background)
46 | biggestPatchIdx = np.argmax(patchsizes) + 1
47 |
48 | # Obtain the center of mass of the said biggest patch (we suppose that it is the sun)
49 |     centerpos = scipy.ndimage.center_of_mass(intensityimg, labelarray, biggestPatchIdx)
50 |
51 | return centerpos
52 |
53 |
54 | def sunPosition_fromEnvmap(envmapInput):
55 | """
56 |     Finds the azimuth and zenith of the sun using the environment map provided.
57 |     Returns a tuple containing (zenith, azimuth).
58 | """
59 | c = findBrightestSpot(envmapInput.data)
60 | u, v = (c[1]+0.5) / envmapInput.data.shape[1], (c[0]+0.5) / envmapInput.data.shape[0]
61 |
62 | azimuth = np.pi*(2*u - 1)
63 | zenith = np.pi*v
64 |
65 | return zenith, azimuth
66 |
67 |
68 | def sunPosition_pySolar_zenithAzimuth(latitude, longitude, time, elevation=0):
69 | """
70 | Finds the azimuth and zenith angle of the sun using the pySolar library.
71 | Takes latitude(deg), longitude(deg) and a datetime object.
72 |     Returns a tuple containing (zenith, azimuth) in RADIANS with world coordinate orientation.
73 |
74 |     Please note:
75 |         zenith angle = 90 degrees - elevation angle
76 |         azimuth angle: the north-based, clockwise azimuth requires an offset (+90 deg) and an inversion (*-1);
77 |         thus, azimuth = (pi/2) - azimuth
78 | """
79 |
80 | # Find azimuth and elevation from pySolar library.
81 | azimuth = solar.get_azimuth(latitude, longitude, time, elevation)
82 | altitude = solar.get_altitude(latitude, longitude, time, elevation)
83 |
84 | # Convert to radians
85 | azimuth = (np.pi/2) + np.deg2rad(-azimuth)
86 | zenith = np.deg2rad(90 - altitude)
87 |
88 | # Reset if degrees > 180
89 | if azimuth > np.pi: azimuth = azimuth - 2*np.pi
90 | if zenith > np.pi: zenith = zenith - 2*np.pi
91 |
92 | return zenith, azimuth
93 |
94 |
95 | def sunPosition_pySolar_UV(latitude, longitude, time, elevation=0):
96 | """
97 | Finds the azimuth and elevation of the sun using the pySolar library.
98 |     Takes latitude (in degrees), longitude (in degrees) and a datetime object.
99 |     Returns a tuple containing the (u, v) latlong coordinates.
100 |     """
104 |
105 | zenith, azimuth = sunPosition_pySolar_zenithAzimuth(
106 | latitude, longitude,
107 | time,
108 | elevation
109 | )
110 |
111 | # Fix orientation of azimuth
112 | azimuth = -(azimuth - (np.pi/2))
113 |
114 | # Convert to UV coordinates
115 | u = (azimuth/(2*np.pi))
116 | v = zenith/np.pi
117 | return u, v
118 |
119 |
120 | def sunPosition_pySolar_XYZ(latitude, longitude, time, elevation=0):
121 | """
122 | Finds the azimuth and elevation of the sun using the pySolar library.
123 |     Takes latitude (in degrees), longitude (in degrees) and a datetime object.
124 | Returns a tuple containing the (x, y, z) world coordinate.
125 |
126 | Note, the validity (v) of the coordinate is not returned.
127 |     Please check the coordinate with respect to your environment map.
128 | """
129 |
130 | u,v = sunPosition_pySolar_UV(
131 | latitude, longitude,
132 | time,
133 | elevation
134 | )
135 |
136 | # Convert to world coordinates
137 | x, y, z, _ = latlong2world(u, v)
138 | return x, y, z
139 |
--------------------------------------------------------------------------------
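The percentile-threshold-then-label strategy of `findBrightestSpot` can be exercised on synthetic data (the blob positions, values, and the 50th-percentile threshold below are made up for the toy image; the real function uses the 99.99th percentile to isolate the sun):

```python
import numpy as np
from scipy import ndimage

# Synthetic intensity image with two bright regions.
img = np.zeros((64, 64))
img[10:13, 40:43] = 5.0   # small, very bright blob
img[30:36, 20:26] = 4.0   # larger, slightly dimmer blob

# Threshold at a percentile of the non-zero values, label the connected
# regions, then take the centroid of the largest region.
minval = np.percentile(img[img > 0], 50)
labels, n = ndimage.label(img >= minval)
sizes = ndimage.sum(np.ones_like(img), labels, index=np.arange(1, n + 1))
biggest = np.argmax(sizes) + 1            # label 0 is the background
cy, cx = ndimage.center_of_mass(img, labels, biggest)
# (cy, cx) -> (32.5, 22.5): the larger blob wins by area
```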
/hdrtools/tonemapping/__init__.py:
--------------------------------------------------------------------------------
1 | import sys
2 | import subprocess
3 | import numpy as np
4 | import numpy.linalg as linalg
5 | import hdrio
6 | from itertools import chain
7 | from functools import partial
8 |
9 | # sRGB, D65 (same as pfstools)
10 | rgb2xyz_mat = np.array([[0.412453, 0.357580, 0.180423],
11 | [0.212671, 0.715160, 0.072169],
12 | [0.019334, 0.119193, 0.950227]], dtype="float32")
13 | xyz2rgb_mat = linalg.inv(rgb2xyz_mat)
14 |
15 | _availToneMappers = subprocess.check_output("compgen -c pfstmo", shell=True,
16 |                                             executable="/bin/bash",
17 |                                             stderr=subprocess.STDOUT).decode('ascii').strip().split("\n")
18 |
19 | def convertToXYZ(rgbimg):
20 | # Normalize RGB
21 | rgbimg = np.nan_to_num(rgbimg)
22 | rgbimg /= rgbimg.max()
23 | rgbimg = np.clip(rgbimg, 0.0, 1.0)
24 |
25 | # Convert to XYZ (sRGB, D65)
26 | pixelVec = rgbimg.reshape(-1, 3)
27 |
28 | # Convert float32 (mandatory for PFS)
29 | imgXYZ = np.dot(rgb2xyz_mat, pixelVec.T).T.reshape(rgbimg.shape).astype("float32")
30 |
31 | return imgXYZ
32 |
33 |
34 | def convertFromXYZ(xyzimg):
35 | # Convert XYZ to RGB
36 | pixelVec = xyzimg.reshape(-1, 3)
37 | img = np.dot(xyz2rgb_mat, pixelVec.T).T.reshape(xyzimg.shape)
38 |
39 | # The image will be returned with RGB values in
40 | # the [0, 1] range, type=float32 !
41 | return img
42 |
43 |
44 | def writePFS(hdrimg):
45 | """
46 | Return a bytes object encapsulating the hdrimg given as argument,
47 | including a valid header.
48 | The hdrimg should be a valid RGB image (HDR or not).
49 | """
50 | header = "PFS1\n{} {}\n{}\n0\nX\n0\nY\n0\nZ\n0\nENDH".format(hdrimg.shape[1],
51 | hdrimg.shape[0],
52 | hdrimg.shape[2])
53 | b = bytes(header, "ascii")
54 |
55 | imgXYZ = convertToXYZ(hdrimg)
56 |     b += imgXYZ.transpose(2, 1, 0).tobytes()
57 | return b
58 |
59 |
60 | def readPFS(data):
61 | """
62 | Return the image (HDR or not contained in the PFS data). The data argument
63 | should be a bytes object or an equivalent (bytearray, etc.) containing the
64 | PFS output, including the header.
65 | """
66 | headerEnd = data.find(b"\nENDH")
67 | assert headerEnd != -1, "Invalid PFS file (no header end marker)!"
68 | headerEnd += 5 # To get to the end of the ENDH tag
69 |
70 | headerLines = data[:headerEnd].decode('ascii').split("\n")
71 |
72 | # Read the header and extract width, height, and number of channels
73 | # TODO : Use the LUMINANCE tag from PFS? Is it useful?
74 | assert "PFS1" in headerLines[0], "Invalid PFS file (no PFS1 identifier)!"
75 | w, h = map(int, headerLines[1].split())
76 | channelsCount = int(headerLines[2])
77 |
78 | # Create the output image
79 |     img = np.frombuffer(data[headerEnd:], dtype='float32', count=h*w*channelsCount)
80 | img = img.reshape(channelsCount, w, h).transpose(2, 1, 0)
81 |
82 | # Return its RGB representation
83 | return convertFromXYZ(img)
84 |
85 |
86 | def _tonemapping(hdrimg, exec_, copy=True, **kwargs):
87 | """
88 | Wrapper function for all available tone mappers.
89 | If copy is set to True (default), then it will copy the array
90 | before normalizing it (as required by PFS format).
91 | """
92 | inPFS = writePFS(hdrimg if not copy else hdrimg.copy())
93 |
94 | listArgs = []
95 | for k,v in kwargs.items():
96 | listArgs.append("--"+str(k))
97 | listArgs.append(str(v))
98 |
99 | p = subprocess.Popen([exec_] + listArgs, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
100 | output, err = p.communicate(inPFS)
101 |
102 | ldrimgRGB = readPFS(output) * 255.
103 | ldrimgRGB = np.clip(ldrimgRGB, 0, 255).astype('uint8')
104 |
105 | return ldrimgRGB
106 |
107 | def getAvailableToneMappers():
108 | """
109 | Return the available tone mappers on the current platform,
110 | as a list of strings. These names are the ones that should be
111 | used as function names to actually use these tone mappers.
112 | """
113 | return [tm[7:] for tm in _availToneMappers]
114 |
115 |
116 | # Dynamically create the tone mapping functions
117 | for tm,tmName in zip(_availToneMappers, getAvailableToneMappers()):
118 | setattr(sys.modules[__name__], tmName, partial(_tonemapping, exec_=tm))
119 |
--------------------------------------------------------------------------------
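The PFS framing that `writePFS`/`readPFS` implement can be round-tripped without the pfstmo binaries. This sketch skips the XYZ color conversion and uses random data; the layout assumed here is the one the module writes (`PFS1`, width/height, channel count, `ENDH`, then raw float32 planes in channel-major order):

```python
import numpy as np

img = np.random.rand(4, 6, 3).astype("float32")
h, w, c = img.shape

# Serialize: text header, then raw float32 planes in (c, w, h) order.
header = "PFS1\n{} {}\n{}\n0\nX\n0\nY\n0\nZ\n0\nENDH".format(w, h, c)
blob = bytes(header, "ascii") + img.transpose(2, 1, 0).tobytes()

# Parse it back the way readPFS does.
end = blob.find(b"\nENDH") + 5
lines = blob[:end].decode("ascii").split("\n")
w2, h2 = map(int, lines[1].split())
c2 = int(lines[2])
data = np.frombuffer(blob[end:], dtype="float32", count=h2 * w2 * c2)
restored = data.reshape(c2, w2, h2).transpose(2, 1, 0)

assert np.array_equal(restored, img)
```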
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [build-system]
2 | requires = [
3 | "setuptools>=42",
4 | "wheel"
5 | ]
6 | build-backend = "setuptools.build_meta"
--------------------------------------------------------------------------------
/setup.cfg:
--------------------------------------------------------------------------------
1 | [metadata]
2 | description-file = README.md
3 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | from pathlib import Path
2 | from setuptools import setup
3 |
4 |
5 | skylibs_module = {}
6 | root = Path(__file__).resolve().parent
7 | with open(str(root / "skylibs/__init__.py")) as fhdl:
8 | exec(fhdl.read(), skylibs_module)
9 |
10 |
11 | setup(
12 | name='skylibs',
13 | description=('Tools to read, write, perform projections and handle LDR/HDR environment maps (IBL).'),
14 | author='Yannick Hold',
15 | author_email='yannickhold@gmail.com',
16 | license="LGPLv3",
17 | url='https://github.com/soravux/skylibs',
18 | version=skylibs_module['__version__'],
19 | packages=['ezexr', 'envmap', 'hdrio', 'hdrtools', 'hdrtools/tonemapping', 'skydb', 'tools3d', 'tools3d/warping_operator', 'skylibs'],
20 | include_package_data=True,
21 | install_requires=['imageio>=1.6', 'tqdm', 'numpy', 'scipy', 'scikit-image>=0.19', 'pysolar'],
22 | )
23 |
--------------------------------------------------------------------------------
/skydb/__init__.py:
--------------------------------------------------------------------------------
1 | import os
2 | from os import listdir
3 | from os.path import abspath, isdir, join
4 | import fnmatch
5 | import datetime
6 |
7 | import numpy as np
8 |
9 | from envmap import EnvironmentMap
10 | from hdrtools import sunutils
11 |
12 |
13 | class SkyDB:
14 | def __init__(self, path):
15 | """Creates a SkyDB.
16 |         The path should contain folders named YYYYMMDD (e.g. 20130619 for June 19th, 2013).
17 |         These folders should contain folders named HHMMSS (e.g. 102639 for 10:26:39).
18 |         Each of these folders should contain a file named envmap.exr.
19 | """
20 | p = abspath(path)
21 | self.intervals_dates = [join(p, f) for f in listdir(p) if isdir(join(p, f))]
22 | self.intervals = list(map(SkyInterval, self.intervals_dates))
23 |
24 |
25 | class SkyInterval:
26 | def __init__(self, path):
27 | """Represent an interval, usually a day.
28 |         The path should contain folders named HHMMSS (e.g. 102639 for 10:26:39).
29 | """
30 | self.path = path
31 | matches = []
32 | for root, dirnames, filenames in os.walk(path):
33 | for filename in fnmatch.filter(filenames, 'envmap.exr'):
34 | matches.append(join(root, filename))
35 |
36 | self.probes = list(map(SkyProbe, matches))
37 | self.reftimes = [x.datetime for x in self.probes]
38 |
39 | @property
40 | def sun_visibility(self):
41 | """
42 | Return sun_visibility of the interval
43 | """
44 | if len(self.probes) > 0:
45 | sun_visibility = sum(1 for x in self.probes if x.sun_visible) / len(self.probes)
46 | else:
47 | sun_visibility = 0
48 | return sun_visibility
49 |
50 | @property
51 | def date(self):
52 | """
53 | :returns: datetime.date object
54 | """
55 | date = os.path.normpath(self.path).split(os.sep)[-1]
56 | infos = {
57 | "day": int(date[-2:]),
58 | "month": int(date[4:6]),
59 | "year": int(date[:4]),
60 | }
61 | return datetime.date(**infos)
62 |
63 | def closestProbe(self, hours, minutes=0, seconds=0):
64 | """
65 | Return the SkyProbe object closest to the requested time.
66 | TODO : check for day change (if we ask for 6:00 AM and the probe sequence
67 | only begins at 7:00 PM and ends at 9:00 PM, then 9:00 PM is actually
68 | closer than 7:00 PM and will be wrongly selected; not a big deal but...)
69 | TODO : Take the code from skymangler.
70 | """
71 | cmpdate = datetime.datetime(year=1, month=1, day=1, hour=hours, minute=minutes, second=seconds)
72 | idx = np.argmin([np.abs((cmpdate - t).total_seconds()) for t in self.reftimes])
73 | return self.probes[idx]
74 |
75 |
76 | class SkyProbe:
77 | def __init__(self, path, format_=None):
78 | """Represent an environment map among an interval."""
79 | self.path = path
80 | self.format_ = format_
81 |
82 | def init_properties(self):
83 | """
84 | Cache properties that are resource intensive to generate.
85 | """
86 | if not hasattr(self, '_envmap'):
87 | self._envmap = self.environment_map
88 |
89 | def remove_envmap(self):
90 | """
91 | Delete probe's envmap from memory.
92 | """
93 | del self._envmap
94 |
95 | @property
96 | def sun_visible(self):
97 | """
98 | :returns: boolean, True if the sun is visible, False otherwise.
99 | """
100 | self.init_properties()
101 | return self._envmap.data.max() > 5000
102 |
103 | @property
104 | def datetime(self):
105 | """Datetime of the capture.
106 | :returns: datetime object.
107 | """
108 | time_ = os.path.normpath(self.path).split(os.sep)[-2]
109 | date = os.path.normpath(self.path).split(os.sep)[-3]
110 | infos = {
111 | "second": int(time_[-2:]),
112 | "minute": int(time_[2:4]),
113 | "hour": int(time_[:2]),
114 | "day": int(date[-2:]),
115 | "month": int(date[4:6]),
116 | "year": int(date[:4]),
117 | }
118 |
119 | if infos["second"] >= 60:
120 | infos["second"] = 59
121 |
122 | try:
123 | datetime_ = datetime.datetime(**infos)
124 | except ValueError:
125 | print('error on path:', self.path)
126 | raise
127 |
128 | return datetime_
129 |
130 | @property
131 | def environment_map(self):
132 | """
133 | :returns: EnvironmentMap object.
134 | """
135 | if self.format_:
136 | return EnvironmentMap(self.path, self.format_)
137 | else:
138 | return EnvironmentMap(self.path)
139 |
140 | def sun_position(self, method="coords"):
141 | """
142 |         :returns: (zenith, azimuth)
143 | """
144 | if method == "intensity":
145 | self.init_properties()
146 |             return sunutils.sunPosition_fromEnvmap(self._envmap)
147 |
148 | elif method == "coords":
149 | # Assume Laval University, Quebec, Canada
150 | latitude, longitude = 46.778969, -71.274914
151 | elevation = 125
152 |
153 | tz = datetime.timezone(datetime.timedelta(seconds=-17760))
154 | if self.datetime < datetime.datetime(2013, 12, 25, 10, 10, 10, tzinfo=tz):
155 | # Assume Carnegie Mellon University, Pittsburgh, PA
156 | latitude, longitude = 40.442794, -79.944115
157 | elevation = 300
158 |
159 | d = self.datetime
160 | if self.datetime.tzinfo is None:
161 | d += datetime.timedelta(hours=+4)
162 |
163 |             return sunutils.sunPosition_pySolar_zenithAzimuth(latitude, longitude, d, elevation=elevation)
164 |
--------------------------------------------------------------------------------
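The directory convention that `SkyDB` expects (and that `SkyProbe.datetime` parses) can be checked in isolation; the sample path below is hypothetical:

```python
import datetime
import os

# Hypothetical capture path following the .../YYYYMMDD/HHMMSS/envmap.exr layout.
path = os.path.join("db", "20130619", "102639", "envmap.exr")
parts = os.path.normpath(path).split(os.sep)
time_, date = parts[-2], parts[-3]

# Same slicing as SkyProbe.datetime: date folder gives Y/M/D, time folder H/M/S.
dt = datetime.datetime(
    year=int(date[:4]), month=int(date[4:6]), day=int(date[-2:]),
    hour=int(time_[:2]), minute=int(time_[2:4]), second=int(time_[-2:]),
)
# dt.isoformat() -> '2013-06-19T10:26:39'
```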
/skylibs/__init__.py:
--------------------------------------------------------------------------------
1 | __version__ = "0.7.6"
2 |
--------------------------------------------------------------------------------
/test/test_envmap.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from itertools import product
3 | import pytest
4 | from skimage.transform import resize
5 | from scipy.ndimage import binary_erosion
6 | from envmap import EnvironmentMap, rotation_matrix, rotations
7 | from envmap.environmentmap import SUPPORTED_FORMATS
8 |
9 |
10 | np.random.seed(31415926)
11 |
12 |
13 | # pytest [-s] [-k test_convert]
14 |
15 |
16 | def get_envmap(sz, up_factor, format_, channels=3):
17 | e = EnvironmentMap(sz, format_, channels=channels)
18 | e.data = np.random.rand(e.data.shape[0], e.data.shape[1], channels)
19 | if up_factor != 1.:
20 | e.data = resize(e.data, [up_factor*x for x in e.data.shape[:2]])
21 | e.setBackgroundColor(0.)
22 | return e
23 |
24 |
25 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
26 | def test_imageCoordinates(format_):
27 | s = 64
28 | e = EnvironmentMap(s, format_)
29 | u, v = e.imageCoordinates()
30 |
31 | s2, s1 = e.data.shape[:2]
32 |
33 | # All rows/columns are the same
34 | np.testing.assert_array_almost_equal(np.diff(u, axis=0), 0, decimal=5)
35 | np.testing.assert_array_almost_equal(np.diff(v, axis=1), 0, decimal=5)
36 | # All the columns/rows are spaced by 1/s
37 | np.testing.assert_array_almost_equal(np.diff(u, axis=1), 1/s1, decimal=5)
38 | np.testing.assert_array_almost_equal(np.diff(v, axis=0), 1/s2, decimal=5)
39 | # First element is (1/s)/2
40 | np.testing.assert_array_almost_equal(u[0,0], 1/s1/2, decimal=5)
41 | np.testing.assert_array_almost_equal(v[0,0], 1/s2/2, decimal=5)
42 |
43 |
44 | @pytest.mark.parametrize("envmap_type,in_sz,out_sz", product(SUPPORTED_FORMATS, [512, 431, 271], [512, 431, 271]))
45 | def test_resize_integer(envmap_type, in_sz, out_sz):
46 | e = get_envmap(in_sz, 1, envmap_type, 1)
47 | old_energy = e.data.mean()
48 | old = e.data.copy()
49 | e = e.copy().resize(out_sz, debug=True)
50 | new_energy = e.data.mean()
51 | print("Energy difference: {:g}".format(np.abs(new_energy/old_energy - 1.)))
52 | assert np.abs(new_energy/old_energy - 1.) < 5e-3
53 |
54 |
55 | @pytest.mark.parametrize("src_format,tgt_format", product(SUPPORTED_FORMATS, SUPPORTED_FORMATS))
56 | def test_convert(src_format, tgt_format):
57 | e_src = get_envmap(16, 6, src_format)
58 |
59 | # remove everything not in the sky if src or tgt format is sky-only
60 | if src_format[:3] == "sky" or tgt_format[:3] == "sky":
61 | _, y, _, _ = e_src.worldCoordinates()
62 | e_src.data[np.tile(y[:,:,None], (1, 1, 3)) < 0] = 0.
63 |
64 | sa_src = e_src.solidAngles()
65 | old_energy = np.nansum(sa_src[:,:,None]*e_src.data)
66 |
67 | e_tgt = e_src.copy().convertTo(tgt_format)
68 | sa_tgt = e_tgt.solidAngles()
69 | new_energy = np.nansum(sa_tgt[:,:,None]*e_tgt.data)
70 |
71 | assert new_energy/old_energy - 1. < 1
72 | print("Energy difference in convertTo: {:.08f}".format(new_energy/old_energy - 1.))
73 |
74 | recovered = e_tgt.copy().convertTo(src_format)
75 | recovered_energy = np.nansum(sa_src[:,:,None]*recovered.data)
76 |
77 | assert recovered_energy/old_energy - 1. < 1
78 | print("Recovered energy difference: {:.08f}".format(recovered_energy/old_energy - 1.))
79 |
80 |
81 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
82 | def test_convert_self(format_):
83 | e_src = get_envmap(16, 6, format_)
84 |
85 | e_tgt = e_src.copy().convertTo(format_)
86 | #from matplotlib import pyplot as plt
87 | #plt.imshow(e_tgt.data[:,:,0] - e_src.data[:,:,0]); plt.colorbar(); plt.title(format_); plt.show()
88 | assert np.nanmax(np.abs(e_tgt.data - e_src.data)) < 1e-4
89 |
90 |
91 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
92 | def test_project_embed(format_):
93 | e = get_envmap(16, 6, format_)
94 |
95 | dcm = rotation_matrix(azimuth=0./180*np.pi,
96 | elevation=-45./180*np.pi,
97 | roll=0./180*np.pi)
98 | crop = e.project(vfov=85., # degrees
99 | rotation_matrix=dcm,
100 | ar=4/3,
101 | resolution=(640, 480),
102 | projection="perspective",
103 | mode="normal")
104 |
105 | mask = e.project(vfov=85., # degrees
106 | rotation_matrix=dcm,
107 | ar=4/3,
108 | resolution=(640, 480),
109 | projection="perspective",
110 | mode="mask") > 0.9
111 |
112 | e_embed = EnvironmentMap(e.data.shape[0], format_, channels=1)
113 | e_embed = e_embed.embed(vfov=85.,
114 | rotation_matrix=dcm,
115 | image=crop)
116 |
117 | e_embed.data[~np.isfinite(e_embed.data)] = 0.
118 | recovered = mask[:,:,None]*e_embed.data
119 | source = mask[:,:,None]*e.data
120 | # from matplotlib import pyplot as plt
121 | # plt.subplot(141); plt.imshow(crop) # mask.astype('float32'))
122 | # plt.subplot(142); plt.imshow(recovered)
123 | # plt.subplot(143); plt.imshow(source); plt.title(format_)
124 | # plt.subplot(144); plt.imshow(np.abs(recovered - source)); plt.colorbar()
125 | # plt.show()
126 |
127 | assert np.mean(np.abs(recovered - source)) < 1e-1
128 |
129 | # edges are not pixel-perfect, remove boundary for check
130 | mask = binary_erosion(mask)
131 | recovered = mask[:,:,None]*e_embed.data
132 | source = mask[:,:,None]*e.data
133 | # from matplotlib import pyplot as plt
134 | # plt.subplot(131); plt.imshow(recovered)
135 | # plt.subplot(132); plt.imshow(source); plt.title(format_)
136 | # plt.subplot(133); plt.imshow(np.abs(recovered - source)); plt.colorbar()
137 | # plt.show()
138 | assert np.max(np.abs(recovered - source)) < 0.15
139 |
140 |
141 | @pytest.mark.parametrize("format_,mode,colorspace", product(SUPPORTED_FORMATS, ["ITU BT.601", "ITU BT.709", "mean"], ["sRGB", "linear"]))
142 | def test_intensity(format_, mode, colorspace):
143 | e = get_envmap(16, 6, format_, channels=3)
144 | e.toIntensity(mode=mode, colorspace=colorspace)
145 |
146 | assert e.data.shape[2] == 1
147 |
148 | @pytest.mark.parametrize("format_,normal,channels", product(SUPPORTED_FORMATS, [[0, 1, 0], [1, 0, 0], [0, 0, -1], [0.707, 0.707, 0], "rand"], [1, 3, 5, -3]))
149 | def test_set_hemisphere(format_, normal, channels):
150 | if channels < 0:
151 | value = np.asarray(np.random.rand())
152 | channels = np.abs(channels)
153 | else:
154 | value = np.random.rand(channels)
155 |
156 | if normal == "rand":
157 | normal = np.random.rand(3) + 1e-4
158 | normal /= np.linalg.norm(normal)
159 | else:
160 | normal = np.asarray(normal, dtype=np.float32)
161 |
162 | e = EnvironmentMap(128, format_, channels=channels)
163 | e.setHemisphereValue(normal, value)
164 |
165 | if value.size != e.data.shape[2]:
166 | value = np.tile(value, (e.data.shape[2],))
167 |
168 | u, v = e.world2image(normal[0:1], normal[1:2], normal[2:3])
169 | h, w = np.floor(v*e.data.shape[0]).astype('int16'), np.floor(u*e.data.shape[1]).astype('int16')
170 | h, w = np.minimum(h, e.data.shape[0] - 1), np.minimum(w, e.data.shape[1] - 1)
171 | assert np.all(e.data[h, w, :].squeeze().tolist() == value.squeeze().tolist()) , "normal not set"
172 |
173 | # skip sky-* envmaps as they might not represent the opposite normal
174 | if "sky" in format_:
175 | return
176 |
177 | u, v = e.world2image(-normal[0:1], -normal[1:2], -normal[2:3])
178 | h, w = np.floor(v*e.data.shape[0]).astype('int16'), np.floor(u*e.data.shape[1]).astype('int16')
179 | h, w = np.minimum(h, e.data.shape[0] - 1), np.minimum(w, e.data.shape[1] - 1)
180 |
181 | # try:
182 | assert np.sum(np.abs(e.data[h, w, :])) == 0. , "opposite normal not zeros"
183 | # except:
184 | # print(format_)
185 | # from matplotlib import pyplot as plt
186 | # plt.imshow(e.data)
187 | # plt.show()
188 | # import pdb; pdb.set_trace()
189 |
190 |
191 | @pytest.mark.parametrize("format_,normal", product(SUPPORTED_FORMATS, [[0, 1, 0], [1, 0, 0], [0, 0, -1], [0.707, 0.707, 0], "rand"]))
192 | def test_worldCoordinates_list(format_, normal):
193 | e = EnvironmentMap(128, format_)
194 | if normal == "rand":
195 | normal = np.random.rand(3) + 1e-4
196 | normal /= np.linalg.norm(normal)
197 |
198 | u, v = e.world2image(*normal)
199 |
200 |
201 | @pytest.mark.parametrize("format_,normal", product(SUPPORTED_FORMATS, [[0, 1, 0],
202 | [1, 0, 0],
203 | [0, 0, -1],
204 | [0.707, 0.707, 0],
205 | [[0.707, 0],
206 | [0.707, 0],
207 | [0, -1]],
208 | "rand"]))
209 | def test_worldCoordinates_ndarray(format_, normal):
210 | e = EnvironmentMap(128, format_)
211 | if normal == "rand":
212 | normal = np.random.rand(3) + 1e-4
213 | normal /= np.linalg.norm(normal)
214 |
215 | normal = np.asarray(normal)
216 | u, v = e.world2image(*normal)
217 |
218 |
219 | @pytest.mark.parametrize("format_1, format_2", product(SUPPORTED_FORMATS, SUPPORTED_FORMATS))
220 | def test_interpolation_convertTo_discrete_values(format_1, format_2):
221 | # test interpolation order 0 (nearest neighbor)
222 |
223 |     # convertTo() should not change the unique values in the data
224 | e_1 = EnvironmentMap(128, format_1)
225 | e_1.data = np.random.randint(0,512, size=e_1.data.shape)
226 | unique_1 = np.unique(e_1.data)
227 |
228 | # convertTo()
229 | e_2 = e_1.convertTo(format_2, order=0)
230 | # e_2 contains nan values... ¯\_(ツ)_/¯
231 | unique_2 = [x for x in np.unique(e_2.data) if not np.isnan(x)]
232 |
233 | assert len(unique_2) > 0, f"Format {format_1}->{format_2}: unique_2 is empty"
234 | for x in unique_2:
235 | assert x in unique_1, f"Format {format_1}->{format_2}: {x} was not in unique_1"
236 |
237 |
238 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
239 | def test_interpolation_rotate_discrete_values(format_):
240 | # rotate() should not change the unique values in the data
241 | for i in range(0,360,10):
242 | e_1 = EnvironmentMap(128, format_)
243 | e_1.data = np.random.randint(0, 512, size=e_1.data.shape)
244 | unique_1 = np.unique(e_1.data)
245 |
246 | # Rotate
247 | angle = np.deg2rad(i)
248 | dcm = rotations.roty(angle)
249 | e_2 = e_1.rotate(dcm, order=0)
250 |
251 | unique_2 = np.unique(e_2.data)
252 | assert len(unique_2) > 0, f"Format {format_}: unique_2 is empty"
253 | for x in unique_2:
254 | assert x in unique_1, f"Format {format_}: {x} was not in {unique_1}"
255 |
256 |
257 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
258 | def test_interpolation_resize_discrete_values(format_):
259 | # scale() should not change the unique values in the data while upscaling
260 | for s in [32,64,100,200,256,512]:
261 | e_1 = EnvironmentMap(128, format_)
262 | e_1.data = np.random.randint(0, 128, size=e_1.data.shape)
263 | unique_1 = [ x for x in np.unique(e_1.data) if not np.isnan(x) ]
264 |
265 | e_2 = e_1.resize(s, order=0)
266 | unique_2 = [ x for x in np.unique(e_2.data) if not np.isnan(x) ]
267 |
268 | assert len(unique_2) > 0, f"Format {format_}: unique_2 is empty"
269 | for x in unique_2:
270 | assert x in unique_1 , f"Format {format_}: {x} was not in {unique_1}"
271 |
272 |
273 | @pytest.mark.parametrize("format_", SUPPORTED_FORMATS)
274 | def test_add_spherical_gaussian(format_):
275 | for s in [32,64,100,200,256,512]:
276 | e = EnvironmentMap(s, format_)
277 | e.addSphericalGaussian((0,1,0), 0.2, (1, 0.5, 0.25))
278 | u, v = e.world2pixel(0, 1, 0)
279 | assert np.all(np.abs(e.data[v, u] - np.array((1, 0.5, 0.25))) < 1e-2)
280 | assert np.all(np.max(np.nan_to_num(e.data), axis=(0, 1)) <= e.data[v, u])
281 |
--------------------------------------------------------------------------------
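The discrete-value tests above rely on a property of order=0 (nearest-neighbour) interpolation: every output sample is copied from some input sample, so resizing or rotating can never introduce new values. A minimal standalone sketch of that property (plain NumPy, independent of skylibs):

```python
import numpy as np

def resize_nearest(img, new_h, new_w):
    # order=0 (nearest-neighbour) resampling: every output pixel is a copy of
    # some input pixel, so no new values can appear in the result
    h, w = img.shape
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return img[rows[:, None], cols]

rng = np.random.default_rng(0)
img = rng.integers(0, 128, size=(16, 32))
up = resize_nearest(img, 64, 128)

# Every value in the result already existed in the source
assert set(np.unique(up)) <= set(np.unique(img))
```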
/test/test_projections.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | import math
3 | import numpy as np
4 |
5 | from envmap import projections as t
6 | from envmap import environmentmap as env
7 |
8 |
9 | # pytest [-s] [-k test_projections_cube]
10 |
11 | @pytest.mark.parametrize("coordinate_UV, coordinate_XYZR",
12 | [
13 | # ( coordinate_UV(u,v), coordinate_XYZ(x,y,z,r) )
14 |
15 | # Diagonal
16 | ((0.1, 0.1), (float('nan'), float('nan'), float('nan'), False)),
17 | ((0.25, 0.25), (-0.6666666666666666, 0.6666666666666666, -0.3333333333333333, True)),
18 | ((0.5, 0.5), (0., -0.70710678, -0.70710678, True)),
19 | ((0.75, 0.75), (float('nan'), float('nan'), float('nan'), False)),
20 | ((0.9, 0.9), (float('nan'), float('nan'), float('nan'), False)),
21 |
22 | # Random
23 | ((0.6, 0.3), (0.4574957109978137, 0.45749571099781405, -0.7624928516630234, True)),
24 | ((0.3, 0.6), (float('nan'), float('nan'), float('nan'), False)),
25 | ]
26 | )
27 | def test_projections_cube(coordinate_UV, coordinate_XYZR):
28 | X, Y, Z, R = coordinate_XYZR
29 | U, V = coordinate_UV
30 |
31 | # --- Singular float coordinates --- #
32 |     if R:
33 | u, v = t.world2cube(X, Y, Z)
34 | assert type(u) == float
35 | assert type(v) == float
36 | assert u == pytest.approx(U, abs=1e-6)
37 | assert v == pytest.approx(V, abs=1e-6)
38 |
39 | x, y, z, r = t.cube2world(U, V)
40 | assert type(x) == float
41 | assert type(y) == float
42 | assert type(z) == float
43 | assert type(r) == bool
44 | assert r == R
45 | np.testing.assert_almost_equal(x, X, decimal=6)
46 | np.testing.assert_almost_equal(y, Y, decimal=6)
47 | np.testing.assert_almost_equal(z, Z, decimal=6)
48 |
49 | # --- Array of float coordinates --- #
50 | U_, V_ = np.array([U, U, U]), np.array([V, V, V])
51 | X_, Y_, Z_, R_ = np.array([X, X, X]), np.array([Y, Y, Y]), np.array([Z, Z, Z]), np.array([R, R, R])
52 |
53 |     if R:
54 | u_, v_ = t.world2cube(X_, Y_, Z_)
55 | assert type(u_) == np.ndarray
56 | assert type(v_) == np.ndarray
57 | assert u_ == pytest.approx(U_, abs=1e-6)
58 | assert v_ == pytest.approx(V_, abs=1e-6)
59 |
60 | x_, y_, z_, r_ = t.cube2world(U_, V_)
61 | assert type(x_) == np.ndarray
62 | assert type(y_) == np.ndarray
63 | assert type(z_) == np.ndarray
64 | assert type(r_) == np.ndarray
65 | assert (r_ == R_).all()
66 | np.testing.assert_almost_equal(x_, X_, decimal=6)
67 | np.testing.assert_almost_equal(y_, Y_, decimal=6)
68 | np.testing.assert_almost_equal(z_, Z_, decimal=6)
69 |
70 |
71 | @pytest.mark.parametrize("format_", env.SUPPORTED_FORMATS)
72 | def test_projections_pixel(format_):
73 | e = env.EnvironmentMap(64, format_, channels=2)
74 |
75 | # Meshgrid of Normalized Coordinates
76 | u, v = e.imageCoordinates()
77 |
78 | # Meshgrid of M*N image Coordinates
79 | cols = np.linspace(0, e.data.shape[1] - 1, e.data.shape[1])
80 | rows = np.linspace(0, e.data.shape[0] - 1, e.data.shape[0])
81 | U, V = np.meshgrid(cols, rows)
82 |
83 | x, y, z, valid = e.image2world(u, v)
84 | x_, y_, z_, valid_ = e.pixel2world(U, V)
85 |
86 | np.testing.assert_array_almost_equal(x[valid], x_[valid_], decimal=5)
87 | np.testing.assert_array_almost_equal(y[valid], y_[valid_], decimal=5)
88 | np.testing.assert_array_almost_equal(z[valid], z_[valid_], decimal=5)
89 | np.testing.assert_array_equal(valid, valid_)
90 |
91 | # world2pixel(x,y,z)
92 | U_, V_ = e.world2pixel(x, y, z)
93 |
94 | np.testing.assert_array_equal(U_[valid_], U[valid_])
95 | np.testing.assert_array_equal(V_[valid_], V[valid_])
96 |
97 |
98 | @pytest.mark.parametrize("format_", env.SUPPORTED_FORMATS)
99 | def test_projections_image(format_):
100 |
101 | e = env.EnvironmentMap(64, format_, channels=2)
102 |
103 | # Meshgrid of Normalized Coordinates
104 | cols = np.linspace(0, 1, e.data.shape[1]*2 + 1)[1::2]
105 | rows = np.linspace(0, 1, e.data.shape[0]*2 + 1)[1::2]
106 | u, v = np.meshgrid(cols, rows)
107 |
108 | # image2world(U,V)
109 | x, y, z, valid = e.image2world(u, v)
110 |
111 | # world2image(x,y,z)
112 | u_, v_ = e.world2image(x, y, z)
113 |
114 | np.testing.assert_array_almost_equal(u[valid], u_[valid], decimal=5)
115 | np.testing.assert_array_almost_equal(v[valid], v_[valid], decimal=5)
116 |
--------------------------------------------------------------------------------
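The image2world/world2image round trip exercised above can be sketched standalone for the latlong case. The forward mapping below copies the formulas from `latlong2world` in this repository; `world2latlong` here is the analytic inverse derived from those formulas (an assumption of this sketch, not a quote of skylibs' implementation):

```python
import numpy as np

def latlong2world(u, v):
    # Same forward mapping as in this repository's latlong2world
    theta = np.pi * (2 * u - 1)
    phi = np.pi * v
    x = np.sin(phi) * np.sin(theta)
    y = np.cos(phi)
    z = -np.sin(phi) * np.cos(theta)
    return x, y, z

def world2latlong(x, y, z):
    # Analytic inverse: recover (u, v) from a unit direction
    u = (np.arctan2(x, -z) / np.pi + 1) / 2
    v = np.arccos(np.clip(y, -1, 1)) / np.pi
    return u, v

# Pixel-center grid, built the same way as in test_projections_image
u, v = np.meshgrid(np.linspace(0, 1, 65)[1::2], np.linspace(0, 1, 33)[1::2])
x, y, z = latlong2world(u, v)
u2, v2 = world2latlong(x, y, z)
np.testing.assert_allclose(u2, u, atol=1e-9)
np.testing.assert_allclose(v2, v, atol=1e-9)
```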
/test/test_sunutils.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | import numpy as np
3 | from datetime import datetime as dt
4 |
5 | from hdrtools import sunutils
6 | from envmap.projections import latlong2world
7 |
8 |
9 | # pytest [-s] [-k test_sunPosition_pySolar_XYZ]
10 |
11 | @pytest.mark.parametrize("latitude,longitude,elevation,time,azimuth,altitude,expectedPosition_UV,envShape",
12 | [
13 | #(latitude, longitude, elevation, time, azimuth, altitude, expectedPosition_UV, envShape)
14 | (46.778969, -71.274914, 125, "2016-07-11 14:55:03-04:00", 236.0342180792503, 54.738195890813465, (202, 1347), (1024, 2048, 3)),
15 | (46.778969, -71.274914, 125, "2016-07-11 16:37:09-04:00", 261.48788303999396, 38.476585009805774, (295, 1486), (1024, 2048, 3)),
16 | (46.778969, -71.274914, 125, "2016-07-11 18:11:15-04:00", 278.7179058813837, 22.440617313357762, (383, 1582), (1024, 2048, 3)),
17 | (46.778969, -71.274914, 125, "2016-07-11 19:13:19-04:00", 289.2536012730952, 12.175347987926557, (432, 1639), (1024, 2048, 3)),
18 | (46.778969, -71.274914, 125, "2016-07-11 14:59:02-04:00", 237.28376586581751, 54.1684484374844, (205, 1354), (1024, 2048, 3)),
19 | (46.778969, -71.274914, 125, "2016-07-11 15:25:04-04:00", 244.80877604207072, 50.26812689923775, (228, 1396), (1024, 2048, 3)),
20 | (46.778969, -71.274914, 125, "2016-07-11 16:31:06-04:00", 260.2533901025191, 39.49882765736283, (289, 1478), (1024, 2048, 3)),
21 | (46.778969, -71.274914, 125, "2016-07-11 18:57:15-04:00", 286.52892194502806, 14.780838961049204, (421, 1625), (1024, 2048, 3)),
22 | (46.778969, -71.274914, 125, "2016-07-11 15:35:05-04:00", 247.44434330174641, 48.70012523679931, (238, 1411), (1024, 2048, 3)),
23 | (46.778969, -71.274914, 125, "2016-07-11 17:45:13-04:00", 274.2050868138487, 26.862260714722463, (359, 1557), (1024, 2048, 3)),
24 | (46.778969, -71.274914, 125, "2016-07-11 14:57:03-04:00", 236.6651568371286, 54.453150605734265, (204, 1351), (1024, 2048, 3)),
25 | (46.778969, -71.274914, 125, "2016-07-11 13:50:59-04:00", 211.324186709539, 62.36297164343926, (156, 1212), (1024, 2048, 3)),
26 | (46.778969, -71.274914, 125, "2016-07-11 17:43:13-04:00", 273.85317844978465, 27.2034602696621, (356, 1554), (1024, 2048, 3)),
27 | (46.778969, -71.274914, 125, "2016-07-11 14:12:57-04:00", 220.86582713099386, 60.14378307583763, (170, 1266), (1024, 2048, 3)),
28 | (46.778969, -71.274914, 125, "2016-07-11 16:17:04-04:00", 257.2962617745477, 41.85445355912644, (276, 1464), (1024, 2048, 3)),
29 | (46.778969, -71.274914, 125, "2016-07-11 14:05:00-04:00", 217.54924128918142, 61.0045783451481, (164, 1246), (1024, 2048, 3)),
30 | (46.778969, -71.274914, 125, "2016-07-11 16:05:04-04:00", 254.64954763786534, 43.84699443623451, (265, 1449), (1024, 2048, 3)),
31 | (46.778969, -71.274914, 125, "2016-07-11 18:17:15-04:00", 279.74423375929877, 21.42842020578881, (391, 1587), (1024, 2048, 3)),
32 | (46.778969, -71.274914, 125, "2016-07-11 14:39:02-04:00", 230.7075145629071, 56.939376143025406, (189, 1318), (1024, 2048, 3)),
33 | (46.778969, -71.274914, 125, "2016-07-11 19:03:18-04:00", 287.55365936964733, 13.794549384465714, (425, 1631), (1024, 2048, 3)),
34 | ]
35 | )
36 | def test_sunPosition_pySolar_XYZ(latitude,longitude,elevation,time,azimuth,altitude,expectedPosition_UV,envShape):
37 |     """Implicitly tests sunPosition_pySolar_UV()."""
38 |
39 | x,y,z = sunutils.sunPosition_pySolar_XYZ(latitude,longitude,dt.fromisoformat(time),elevation)
40 |
41 | # expectedPosition_UV is directly from findBrightestSpot()
42 | U, V = (expectedPosition_UV[1]+0.5) / envShape[1], \
43 | (expectedPosition_UV[0]+0.5) / envShape[0]
44 |
45 | # Convert UV to XYZ
46 | X,Y,Z, _ = latlong2world(U,V)
47 | assert x == pytest.approx(X, abs=1e-1)
48 | assert y == pytest.approx(Y, abs=1e-1)
49 | assert z == pytest.approx(Z, abs=1e-1)
50 |
51 |
52 | @pytest.mark.parametrize("latitude,longitude,elevation,time,azimuth,altitude,expectedPosition_UV,envShape",
53 | [
54 | #(latitude, longitude, elevation, time, azimuth, altitude, expectedPosition_UV, envShape)
55 | (46.778969, -71.274914, 125, "2016-07-11 14:55:03-04:00", 236.0342180792503, 54.738195890813465, (202, 1347), (1024, 2048, 3)),
56 | (46.778969, -71.274914, 125, "2016-07-11 16:37:09-04:00", 261.48788303999396, 38.476585009805774, (295, 1486), (1024, 2048, 3)),
57 | (46.778969, -71.274914, 125, "2016-07-11 18:11:15-04:00", 278.7179058813837, 22.440617313357762, (383, 1582), (1024, 2048, 3)),
58 | (46.778969, -71.274914, 125, "2016-07-11 19:13:19-04:00", 289.2536012730952, 12.175347987926557, (432, 1639), (1024, 2048, 3)),
59 | (46.778969, -71.274914, 125, "2016-07-11 14:59:02-04:00", 237.28376586581751, 54.1684484374844, (205, 1354), (1024, 2048, 3)),
60 | (46.778969, -71.274914, 125, "2016-07-11 15:25:04-04:00", 244.80877604207072, 50.26812689923775, (228, 1396), (1024, 2048, 3)),
61 | (46.778969, -71.274914, 125, "2016-07-11 16:31:06-04:00", 260.2533901025191, 39.49882765736283, (289, 1478), (1024, 2048, 3)),
62 | (46.778969, -71.274914, 125, "2016-07-11 18:57:15-04:00", 286.52892194502806, 14.780838961049204, (421, 1625), (1024, 2048, 3)),
63 | (46.778969, -71.274914, 125, "2016-07-11 15:35:05-04:00", 247.44434330174641, 48.70012523679931, (238, 1411), (1024, 2048, 3)),
64 | (46.778969, -71.274914, 125, "2016-07-11 17:45:13-04:00", 274.2050868138487, 26.862260714722463, (359, 1557), (1024, 2048, 3)),
65 | (46.778969, -71.274914, 125, "2016-07-11 14:57:03-04:00", 236.6651568371286, 54.453150605734265, (204, 1351), (1024, 2048, 3)),
66 | (46.778969, -71.274914, 125, "2016-07-11 13:50:59-04:00", 211.324186709539, 62.36297164343926, (156, 1212), (1024, 2048, 3)),
67 | (46.778969, -71.274914, 125, "2016-07-11 17:43:13-04:00", 273.85317844978465, 27.2034602696621, (356, 1554), (1024, 2048, 3)),
68 | (46.778969, -71.274914, 125, "2016-07-11 19:55:22-04:00", 296.48810555162686, 5.617312205018568, (430, 1821), (1024, 2048, 3)),
69 | (46.778969, -71.274914, 125, "2016-07-11 14:12:57-04:00", 220.86582713099386, 60.14378307583763, (170, 1266), (1024, 2048, 3)),
70 | (46.778969, -71.274914, 125, "2016-07-11 16:17:04-04:00", 257.2962617745477, 41.85445355912644, (276, 1464), (1024, 2048, 3)),
71 | (46.778969, -71.274914, 125, "2016-07-11 14:05:00-04:00", 217.54924128918142, 61.0045783451481, (164, 1246), (1024, 2048, 3)),
72 | (46.778969, -71.274914, 125, "2016-07-11 16:05:04-04:00", 254.64954763786534, 43.84699443623451, (265, 1449), (1024, 2048, 3)),
73 | (46.778969, -71.274914, 125, "2016-07-11 18:17:15-04:00", 279.74423375929877, 21.42842020578881, (391, 1587), (1024, 2048, 3)),
74 | (46.778969, -71.274914, 125, "2016-07-11 14:39:02-04:00", 230.7075145629071, 56.939376143025406, (189, 1318), (1024, 2048, 3)),
75 | (46.778969, -71.274914, 125, "2016-07-11 19:03:18-04:00", 287.55365936964733, 13.794549384465714, (425, 1631), (1024, 2048, 3)),
76 | ]
77 | )
78 | def test_sunPosition_pySolar_zenithAzimuth(latitude,longitude,elevation,time,azimuth,altitude,expectedPosition_UV,envShape):
79 |
80 | # azimuth and altitude are directly from pySolar for the given time
81 | zenith_, azimuth_ = sunutils.sunPosition_pySolar_zenithAzimuth(latitude, longitude, dt.fromisoformat(time), elevation)
82 | assert np.deg2rad(90-altitude) == pytest.approx(zenith_, abs=1e-6)
83 | assert (np.pi/2) + np.deg2rad(-azimuth) == pytest.approx(azimuth_, abs=1e-6)
84 | elevation_ = np.pi/2 - zenith_
85 |
86 | # expectedPosition_UV is directly from findBrightestSpot()
87 | U, V = (expectedPosition_UV[1]+0.5) / envShape[1], \
88 | (expectedPosition_UV[0]+0.5) / envShape[0]
89 |
90 | # Fix orientation of azimuth
91 | azimuth_ = -(azimuth_ - (np.pi/2))
92 |
93 | # Convert azimuth to U
94 | u = (azimuth_ / (2*np.pi))
95 | assert u == pytest.approx(U, abs=1e-1)
96 |
97 | # Convert zenith to V
98 | v = zenith_/np.pi
99 | assert v == pytest.approx(V, abs=1e-1)
100 | v = (np.pi/2 - elevation_)/np.pi
101 | assert v == pytest.approx(V, abs=1e-1)
102 |
103 |
--------------------------------------------------------------------------------
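The zenith/azimuth-to-UV conversion performed inline in `test_sunPosition_pySolar_zenithAzimuth` can be factored into a small helper. This is a sketch mirroring the test's arithmetic, not a skylibs API; the sample numbers are the first parametrized case above:

```python
import numpy as np

def sun_uv_from_zenith_azimuth(zenith, azimuth):
    # Mirrors the inline conversion in the test above:
    # flip the azimuth orientation, then map angles to latlong (u, v)
    azimuth = -(azimuth - np.pi / 2)
    u = (azimuth / (2 * np.pi)) % 1.0
    v = zenith / np.pi
    return u, v

# First parametrized case above: altitude/azimuth in degrees from pySolar
altitude, azimuth_deg = 54.738195890813465, 236.0342180792503
zenith = np.deg2rad(90 - altitude)
azimuth = (np.pi / 2) + np.deg2rad(-azimuth_deg)

u, v = sun_uv_from_zenith_azimuth(zenith, azimuth)
# Expected pixel (202, 1347) in a 1024x2048 latlong map
assert abs(u - (1347 + 0.5) / 2048) < 1e-1
assert abs(v - (202 + 0.5) / 1024) < 1e-1
```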
/test/test_warping_operator.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from itertools import product
3 | import pytest
4 | from envmap import EnvironmentMap
5 | from envmap.environmentmap import SUPPORTED_FORMATS
6 | from tools3d.warping_operator import warpEnvironmentMap
7 | from scipy import ndimage
8 |
9 | np.random.seed(31415926)
10 |
11 |
12 | # pytest [-s] [-k test_convert]
13 |
14 | @pytest.mark.parametrize('format_, nadir, size, theta, phi',
15 | product(
16 | SUPPORTED_FORMATS,
17 | [-np.pi/2 + np.pi/20, -np.pi/4, 0, np.pi/2 - np.pi/20],
18 | [512, 271],
19 | np.linspace(0, 2*np.pi, 5) + 1e-10,
20 | np.linspace(-np.pi/2, np.pi/2, 4) + 1e-10
21 | )
22 | )
23 | def test_warpEnvironmentMap(format_, nadir, size, theta, phi):
24 | """
25 |     This test adds a white one-pixel "blob" to the source environment map,
26 |     warps the map, and then checks that the blob appears at the expected
27 |     location in the warped environment map.
28 | """
29 | channels = 3
30 | blobSourceCoordinates = np.array([np.sin(phi) * np.cos(theta), np.sin(phi) * np.sin(theta), -np.cos(phi)])
31 |
32 | sourceEnvironmentMap = EnvironmentMap(size, format_, channels=channels)
33 | sourceEnvironmentMap.data = np.zeros((sourceEnvironmentMap.data.shape[0], sourceEnvironmentMap.data.shape[1], channels))
34 | sourceBlobCoordinatesPixel = np.mod(sourceEnvironmentMap.world2pixel(*blobSourceCoordinates.tolist()), (sourceEnvironmentMap.data.shape[1],
35 | sourceEnvironmentMap.data.shape[0]))
36 | # we recompute the world coordinates because of the quantization error introduced by the world2pixel function
37 | x, y, z, _ = tuple(sourceEnvironmentMap.pixel2world(*sourceBlobCoordinatesPixel.tolist()))
38 |
39 | # we add a white blob in the source environment map, which we are going to track after warping the envmap
40 | sourceEnvironmentMap.data[sourceBlobCoordinatesPixel[1], sourceBlobCoordinatesPixel[0], :] = 1.0
41 |
42 | warpedEnvironmentMap = warpEnvironmentMap(sourceEnvironmentMap.copy(), nadir, order=1)
43 | if warpedEnvironmentMap.data.max() == 0:
44 |         # Warping can erase the blob through interpolation/distortion; skip those cases
45 | pytest.skip()
46 |
47 |
48 | blobExpectedWarpedCoordinates = np.array([x, y, z - np.sin(nadir)])
49 | blobExpectedWarpedCoordinates /= np.linalg.norm(blobExpectedWarpedCoordinates, axis=0)
50 | blobExpectedWarpedCoordinatesPixel = np.array(EnvironmentMap(size, format_, channels=channels).world2pixel(*blobExpectedWarpedCoordinates.tolist()))
51 |
52 | warpedEnvironmentMapGray = warpedEnvironmentMap.data.mean(axis=2)
53 |
54 | # we detect the blob by checking if the pixel value is greater than 0
55 | assert warpedEnvironmentMapGray[blobExpectedWarpedCoordinatesPixel[1], blobExpectedWarpedCoordinatesPixel[0]] > 0.0
56 |
--------------------------------------------------------------------------------
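The expected blob location in the test above follows the warp model: translate the unit direction by `-sin(nadir)` along z, then renormalize back onto the unit sphere. A tiny numeric check of that step:

```python
import numpy as np

# Warp model used in the test above: shift the unit direction along z by
# -sin(nadir), then renormalize so it lies on the unit sphere again.
x, y, z = 0.0, 0.6, 0.8
nadir = np.pi / 6
d = np.array([x, y, z - np.sin(nadir)])
d /= np.linalg.norm(d)

assert abs(np.linalg.norm(d) - 1.0) < 1e-12
```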
/tools3d/__init__.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from scipy.sparse import coo_matrix, vstack as sparse_vstack
3 | from scipy.sparse.linalg import lsqr as sparse_lsqr
4 |
5 |
6 | from . import display
7 | from . import spharm
8 |
9 |
10 | def getMaskDerivatives(mask):
11 | """
12 | Build the derivatives of the input `mask`
13 |
14 | :returns: (`Mx`, `My`), containing the U-axis and V-axis derivatives of the mask.
15 | """
16 | idxmat = np.zeros_like(mask, np.int32)
17 | idxmat[mask] = np.arange(np.sum(mask))
18 |
19 | # Build the pixel list (for in-order iteration) and set (for O(1) `in` operator)
20 | pts = list(zip(*np.where(mask)))
21 | pts_set = set(pts)
22 |
23 | col_x, data_x = [], []
24 | col_y, data_y = [], []
25 | for x, y in pts:
26 | p0 = idxmat[x, y]
27 | # x-derivative
28 | if (x, y+1) in pts_set: # If pixel to the right
29 | pE = idxmat[x, y+1]
30 | col_x.extend([p0, pE]); data_x.extend([-1, 1])
31 | elif (x, y-1) in pts_set: # If pixel to the left
32 | pW = idxmat[x, y-1]
33 | col_x.extend([pW, p0]); data_x.extend([-1, 1])
34 | else: # Pixel has no right or left but is valid, so must have an entry
35 | col_x.extend([p0, p0]); data_x.extend([0, 0])
36 |
37 | # y-derivative
38 | if (x+1, y) in pts_set: # If pixel to the bottom
39 | pS = idxmat[x+1, y]
40 | col_y.extend([p0, pS]); data_y.extend([-1, 1])
41 | elif (x-1, y) in pts_set: # If pixel to the top
42 | pN = idxmat[x-1, y]
43 | col_y.extend([pN, p0]); data_y.extend([-1, 1])
44 |         else: # Pixel has no bottom or top but is valid, so must have an entry
45 | col_y.extend([p0, p0]); data_y.extend([0, 0])
46 |
47 | nelem = np.sum(mask)
48 |
49 | row_x = np.tile(np.arange(len(col_x)//2)[:,np.newaxis], [1, 2]).ravel()
50 | row_y = np.tile(np.arange(len(col_y)//2)[:,np.newaxis], [1, 2]).ravel()
51 | Mx = coo_matrix((data_x, (row_x, col_x)), shape=(len(col_x)//2, nelem))
52 | My = coo_matrix((data_y, (row_y, col_y)), shape=(len(col_y)//2, nelem))
53 |
54 | return Mx, My
55 |
56 |
57 | def NfromZ(surf, mask, Mx, My):
58 | """
59 |     Compute (differentiate) the normal map from a depth map.
60 | """
61 |
62 | normals = np.hstack((Mx.dot(surf.ravel())[:,np.newaxis], My.dot(surf.ravel())[:,np.newaxis], -np.ones((Mx.shape[0], 1), np.float64)))
63 | # Normalization
64 | normals = (normals.T / np.linalg.norm(normals, axis=1)).T
65 |
66 | # Apply mask
67 | out = np.zeros(mask.shape + (3,), np.float32)
68 | out[np.tile(mask[:,:,np.newaxis], [1, 1, 3])] = normals.ravel()
69 |
70 | return out
71 |
72 |
73 | def ZfromN(normals, mask, Mx, My):
74 | """
75 |     Compute (integrate) the depth map from a normal map.
76 |
77 | The reconstruction is up to a scaling factor.
78 | """
79 | b = -normals
80 | b[:,2] = 0
81 | b = b.T.ravel()
82 |
83 | N = normals.shape[0]
84 | ij = list(range(N))
85 | X = coo_matrix((normals[:,0], (ij, ij)), shape=Mx.shape)
86 | Y = coo_matrix((normals[:,1], (ij, ij)), shape=Mx.shape)
87 | Z = coo_matrix((normals[:,2], (ij, ij)), shape=Mx.shape)
88 | A = sparse_vstack((Z.dot(Mx),
89 | Z.dot(My),
90 | Y.dot(Mx) - X.dot(My)))
91 | # Is the 3rd constraint really useful?
92 |
93 | surf = sparse_lsqr(A, b)
94 | surf = surf[0]
95 | surf -= surf.min()
96 |
97 | out = np.zeros(mask.shape, np.float32)
98 | out[mask] = surf.ravel()
99 |
100 | return out
101 |
102 |
103 | if __name__ == '__main__':
104 | from matplotlib import pyplot as plt
105 |     # scipy.misc.imresize was removed from SciPy and is not needed here
106 |     from scipy.ndimage import zoom
107 |
108 | # Usage example
109 | # Step 1) a) Build a depth map ...
110 | surf = np.array([[1, 1, 1, 1],
111 | [1, 2, 3, 1],
112 | [2, 3, 5, 2],
113 | [3, 3, 2, 3]], np.float32)
114 |
115 |
116 | # Step 1) b) ... and its mask
117 |     mask = np.ones(surf.shape, bool)
118 | # Simulate mask, uncomment to test
119 | mask[1,1] = 0
120 | # mask[1,2] = 0
121 | # mask[0,2] = 0
122 | # mask[3,1] = 0
123 |
124 | # Step 1) c) Scale it up to spice up life
125 | surf = zoom(surf, (5, 5), order=1)
126 | mask = zoom(mask, (5, 5), order=0)
127 | surf[~mask] = 0
128 | surf_ori = surf.copy()
129 | surf = surf[mask]
130 |
131 | # Step 2) Compute the mask derivatives
132 | Ms = getMaskDerivatives(mask)
133 |
134 | # Step 3) Compute the normal map from the depth map
135 | normals = NfromZ(surf, mask, *Ms)
136 |
137 | # Step 4) Compute the depth map from the normal map
138 | masked_normals = normals[np.tile(mask[:,:,np.newaxis], [1, 1, 3])].reshape([-1, 3])
139 | surf_recons = ZfromN(masked_normals, mask, *Ms)
140 |
141 | # Visualize the results
142 | plt.subplot(131); plt.imshow(surf_ori, interpolation='nearest'); plt.colorbar()
143 | plt.subplot(132); plt.imshow((normals+1)/2, interpolation='nearest')
144 | plt.subplot(133); plt.imshow(surf_recons, interpolation='nearest'); plt.colorbar()
145 | plt.show()
146 |
--------------------------------------------------------------------------------
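`getMaskDerivatives` assembles sparse forward-difference operators with a left/top fallback at the mask boundary. A dense, loop-based sketch of the x-derivative operator for a fully valid mask (illustrative only; the real code builds the same structure with `coo_matrix`):

```python
import numpy as np

def forward_diff_matrix(h, w):
    # Row k holds -1/+1 for pixel k and its right-hand neighbour; on the last
    # column it falls back to the left neighbour, as getMaskDerivatives does.
    n = h * w
    M = np.zeros((n, n))
    for r in range(h):
        for c in range(w):
            k = r * w + c
            if c + 1 < w:          # neighbour to the right
                M[k, k] = -1
                M[k, k + 1] = 1
            else:                  # last column: fall back to the left neighbour
                M[k, k - 1] = -1
                M[k, k] = 1
    return M

img = np.arange(12, dtype=float).reshape(3, 4)
Mx = forward_diff_matrix(3, 4)
d = Mx.dot(img.ravel()).reshape(3, 4)

# Horizontal gradient of np.arange is 1 everywhere (edge column falls back)
assert np.all(d == 1)
```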
/tools3d/blender_addon_transportmatrix/compress_matrix.py:
--------------------------------------------------------------------------------
1 | import time
2 | import numpy as np
3 | from imageio import imsave
4 | import lz4.frame
5 | import msgpack
6 | import msgpack_numpy
7 | from ezexr import imread
8 | from scipy.ndimage import zoom
9 |
10 |
11 | from functools import partial
12 | lz4open = partial(lz4.frame.open, block_size=lz4.frame.BLOCKSIZE_MAX1MB,
13 | compression_level=lz4.frame.COMPRESSIONLEVEL_MAX)
14 |
15 |
16 | ts = time.time()
17 | with open("output.npz", "rb") as fhdl:
18 | data = np.load(fhdl)
19 | T = data["arr_0"]
20 | normals = data["arr_1"]
21 | image_size = data["arr_2"]
22 | print("Loading time: {:.03f}".format(time.time() - ts))
23 |
24 |
25 | ts = time.time()
26 | with lz4open("output.msgpack", "wb") as fhdl:
27 | msgpack.pack([T.astype('float16'), normals.astype('float16'), image_size], fhdl, default=msgpack_numpy.encode)
28 | print("Writing time: {:.03f}".format(time.time() - ts))
29 |
--------------------------------------------------------------------------------
/tools3d/blender_addon_transportmatrix/render.py:
--------------------------------------------------------------------------------
1 | import time
2 | import numpy as np
3 | from imageio import imsave
4 | from ezexr import imread
5 | from scipy.ndimage import zoom
6 |
7 |
8 | # Load the transport matrix
9 | ts = time.time()
10 | with open("output.npz", "rb") as fhdl:
11 | data = np.load(fhdl)
12 | T = data["arr_0"]
13 | normals = data["arr_1"]
14 | img_size = data["arr_2"]
15 | print("time: {:.03f}".format(time.time() - ts))
16 |
17 | # Save the normals
18 | imsave('normals.png', (normals + 1)/2)
19 | #from matplotlib import pyplot as plt
20 | #plt.imshow((normals + 1)/2); plt.show()
21 |
22 | # Load the envmap
23 | ts = time.time()
24 | envmap = imread("envmap.exr", rgb=True)
25 | envmap = zoom(envmap, (128/envmap.shape[0], 256/envmap.shape[1], 1), order=1, prefilter=True)
26 | envmap = envmap[:64,:,:]
27 | print("time: {:.03f}".format(time.time() - ts))
28 |
29 |
30 | for i in range(256):
31 | envmap = np.roll(envmap, shift=1, axis=1)
32 | # Perform the rendering
33 | ts = time.time()
34 | im = T.dot(envmap.reshape((-1, 3)))
35 | # Tonemap & reshape
36 | im = im.reshape((img_size[0], img_size[1], 3))**(1./2.2)
37 | print("Render performed in {:.3f}s".format(time.time() - ts))
38 |
39 |
40 | # Save the images
41 | im *= 1/200
42 | imsave('render_{:03d}.png'.format(i), np.clip(im, 0, 1))
43 | imsave('envmap_{:03d}.png'.format(i), np.clip(0.7*envmap**(1./2.2), 0, 1))
44 |     print("Saved images render_{:03d}.png & envmap_{:03d}.png".format(i, i))
45 |
46 |
--------------------------------------------------------------------------------
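The render step in the loop above is a single matrix product: the transport matrix maps the L envmap texels to the P image pixels, one RGB channel at a time. A toy sketch with hypothetical shapes (random stand-ins for the real `T` and `envmap`):

```python
import numpy as np

# Toy transport-matrix render: each of the P image pixels accumulates light
# from all L environment-map texels; shapes are (P, L) x (L, 3) -> (P, 3).
rng = np.random.default_rng(0)
P, L = 16, 64                       # tiny stand-ins for resy*resx and envmap texels
T = rng.random((P, L))              # hypothetical transport matrix
envmap = rng.random((8, 8, 3))      # hypothetical 8x8 RGB envmap (L = 64 texels)

im = T.dot(envmap.reshape(-1, 3))   # the render step used in render.py
assert im.shape == (P, 3)

# Light transport is linear: scaling the lighting scales the render
np.testing.assert_allclose(T.dot((2 * envmap).reshape(-1, 3)), 2 * im)
```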
/tools3d/blender_addon_transportmatrix/render_compressed.py:
--------------------------------------------------------------------------------
1 | import time
2 | import numpy as np
3 | from imageio import imsave
4 | import lz4.frame
5 | import msgpack
6 | import msgpack_numpy
7 | from ezexr import imread
8 | from scipy.ndimage import zoom
9 |
10 |
11 | from functools import partial
12 | lz4open = partial(lz4.frame.open, block_size=lz4.frame.BLOCKSIZE_MAX1MB,
13 | compression_level=lz4.frame.COMPRESSIONLEVEL_MAX)
14 |
15 |
16 | # Loading the transport matrix
17 | ts = time.time()
18 | with lz4open("output.msgpack", "rb") as fhdl:
19 | T, normals, img_size = msgpack.unpack(fhdl, object_hook=msgpack_numpy.decode, max_str_len=2**32-1)
20 | print("time: {:.03f}".format(time.time() - ts))
21 | T = T.astype('float32')
22 | normals = normals.astype('float32')
23 |
24 | # Save the normals
25 | imsave('normals.png', (normals + 1)/2)
26 | #from matplotlib import pyplot as plt
27 | #plt.imshow((normals + 1)/2); plt.show()
28 |
29 | # Load the envmap
30 | ts = time.time()
31 | envmap = imread("envmap.exr", rgb=True)
32 | envmap = zoom(envmap, (128/envmap.shape[0], 256/envmap.shape[1], 1), order=1, prefilter=True)
33 | envmap = envmap[:64,:,:].astype('float32')
34 | print("time: {:.03f}".format(time.time() - ts))
35 |
36 |
37 | for i in range(256):
38 | envmap = np.roll(envmap, shift=1, axis=1)
39 | # Perform the rendering
40 | ts = time.time()
41 | im = T.dot(envmap.reshape((-1, 3)))
42 | # Tonemap & reshape
43 | im = im.reshape((img_size[0], img_size[1], 3))**(1./2.2)
44 | print("Render performed in {:.3f}s".format(time.time() - ts))
45 |
46 |
47 | # Save the images
48 | im *= 1/200
49 | imsave('render_{:03d}.png'.format(i), np.clip(im, 0, 1))
50 | imsave('envmap_{:03d}.png'.format(i), np.clip(0.7*envmap**(1./2.2), 0, 1))
51 |     print("Saved images render_{:03d}.png & envmap_{:03d}.png".format(i, i))
52 |
53 |
--------------------------------------------------------------------------------
/tools3d/blender_addon_transportmatrix/transportmatrix.py:
--------------------------------------------------------------------------------
1 | bl_info = {
2 | "name": "Generate Transport Matrix",
3 | "author": "Yannick Hold-Geoffroy",
4 | "version": (0, 2, 1),
5 | "blender": (3, 0, 1),
6 | "category": "Import-Export",
7 | "location": "File > Export > Generate Transport Matrix",
8 | "description": "Export the current camera viewport to a transport matrix.",
9 | }
10 |
11 | import bpy
12 | from mathutils import Vector
13 | import bpy_extras
14 | import numpy as np
15 | import time
16 |
17 |
18 | # TODOs:
19 | # 1. Generate a mask to shrink the matrix size (remove pixels that don't intersect the scene)
20 | # 2. Add support for (lambertian) albedo from blender's UI
21 | # 3. Add anti-aliasing
22 |
23 |
24 | # For each pixel, check intersection
25 | # Inspirations:
26 | # https://blender.stackexchange.com/questions/90698/projecting-onto-a-mirror-and-back/91019#91019
27 | # https://blender.stackexchange.com/questions/13510/get-position-of-the-first-solid-crossing-limits-line-of-a-camera
28 | def getClosestIntersection(clip_end, ray_begin, ray_direction):
29 | min_normal = None
30 | min_location = None
31 | depsgraph = bpy.context.evaluated_depsgraph_get()
32 |     success, location, normal, index, obj, matrix = bpy.context.scene.ray_cast(depsgraph, ray_begin, ray_direction - ray_begin)
33 |
34 | if success:
35 | min_normal = matrix.to_3x3().normalized() @ normal.copy()
36 | min_location = location.copy()
37 | return min_normal, min_location
38 |
39 |
40 | # Taken from https://github.com/soravux/skylibs/blob/master/envmap/projections.py
41 | def latlong2world(u, v):
42 | """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
43 | for a latlong map."""
44 | u = u * 2
45 |
46 | # lat-long -> world
47 | thetaLatLong = np.pi * (u - 1)
48 | phiLatLong = np.pi * v
49 |
50 | x = np.sin(phiLatLong) * np.sin(thetaLatLong)
51 | y = np.cos(phiLatLong)
52 | z = -np.sin(phiLatLong) * np.cos(thetaLatLong)
53 |
54 | valid = np.ones(x.shape, dtype='bool')
55 | return x, y, z, valid
56 |
57 |
58 | def skylatlong2world(u, v):
59 |     """Get the (x, y, z, valid) coordinates of the point defined by (u, v)
60 |     for a skylatlong map."""
61 | u = u * 2
62 |
63 | # lat-long -> world
64 | thetaLatLong = np.pi * (u - 1)
65 | phiLatLong = np.pi * v / 2
66 |
67 | x = np.sin(phiLatLong) * np.sin(thetaLatLong)
68 | y = np.cos(phiLatLong)
69 | z = -np.sin(phiLatLong) * np.cos(thetaLatLong)
70 |
71 | valid = np.ones(x.shape, dtype='bool')
72 | return x, y, z, valid
73 |
74 |
75 | def getEnvmapDirections(envmap_size, envmap_type):
76 | """envmap_size = (rows, columns)
77 | envmap_type in ["latlong", "skylatlong"]"""
78 | cols = np.linspace(0, 1, envmap_size[1] * 2 + 1)
79 | rows = np.linspace(0, 1, envmap_size[0] * 2 + 1)
80 |
81 | cols = cols[1::2]
82 | rows = rows[1::2]
83 |
84 | u, v = [d.astype('float32') for d in np.meshgrid(cols, rows, indexing='ij')]
85 | if envmap_type.lower() == "latlong":
86 | x, y, z, valid = latlong2world(u, v)
87 | elif envmap_type.lower() == "skylatlong":
88 | x, y, z, valid = skylatlong2world(u, v)
89 | else:
90 |         raise ValueError("Unknown format: {}. Should be either \"latlong\" or "
91 |                          "\"skylatlong\".".format(envmap_type))
92 |
93 | return np.asarray([x, y, z]).reshape((3, -1)).T
94 |
95 |
96 | class GenerateTransportMatrix(bpy.types.Operator):
97 | bl_idname = "export.generate_transport_matrix"
98 | bl_label = "Generate Transport Matrix"
99 | bl_options = {'REGISTER'}
100 |
101 | filepath : bpy.props.StringProperty(default="scene.msgpack", subtype="FILE_PATH")
102 | envmap_type : bpy.props.EnumProperty(name="Environment Map Type",
103 | items=[("SKYLATLONG", "skylatlong", "width should be 4x the height", "", 1),
104 | ("LATLONG", "latlong", "width should be 2x the height", "", 2)])
105 | envmap_height : bpy.props.IntProperty(name="Environment Map Height (px)", default=64)
106 | only_surface_normals : bpy.props.BoolProperty(name="Output only surface normals", default=False)
107 |
108 | def execute(self, context):
109 |
110 | T, normals, resolution = self.compute_transport(self.envmap_height, self.envmap_type, self.only_surface_normals)
111 | print("Saving to ", self.filepath)
112 | with open(self.filepath, "wb") as fhdl:
113 | np.savez(fhdl, T, normals, resolution)
114 |
115 | return {'FINISHED'}
116 |
117 | def invoke(self, context, event):
118 | context.window_manager.fileselect_add(self)
119 | return {'RUNNING_MODAL'}
120 |
121 | @staticmethod
122 | def compute_occlusion(location, normal, envmap_directions, cam, envmap_directions_blender=None):
123 | if envmap_directions_blender is None:
124 | envmap_directions_blender = envmap_directions[:, [0, 2, 1]].copy()
125 | envmap_directions_blender[:, 1] = -envmap_directions_blender[:, 1]
126 |
127 | if normal.any():
128 | # For every envmap pixel
129 | # normal_np = np.array([normal[0], normal[2], -normal[1]])
130 | intensity = envmap_directions.dot(normal)
131 | # TODO: Add albedo here
132 |
133 | # Handle occlusions (single bounce)
134 | for idx in range(envmap_directions.shape[0]):
135 | # if the normal opposes the light direction, no need to raytrace.
136 | if intensity[idx] < 0:
137 | intensity[idx] = 0
138 | continue
139 |
140 | target_vec = Vector(envmap_directions_blender[idx, :])
141 | # Check for occlusions. The 1e-3 is just to be sure the
142 | # ray casting does not start right from the surface and
143 | # find itself as occlusion...
144 | normal_occ, _ = getClosestIntersection(cam.data.clip_end,
145 | location + (1 / cam.data.clip_end) * 1e-3 * target_vec,
146 | target_vec)
147 |
148 | if normal_occ:
149 | intensity[idx] = 0
150 | else:
151 | intensity = np.zeros(envmap_directions.shape[0])
152 | return intensity
153 |
154 | @staticmethod
155 | def compute_transport(envmap_height, envmap_type, only_surface_normals):
156 | start_time = time.time()
157 |
158 | cam = bpy.data.objects['Camera']
159 |
160 | envmap_size = (
161 | envmap_height,
162 | 2 * envmap_height if envmap_type.lower() == "latlong" else 4 * envmap_height)
163 | envmap_coords = cam.data.clip_end * getEnvmapDirections(envmap_size, envmap_type)
164 | envmap_coords_blender = envmap_coords[:, [0, 2, 1]].copy()
165 | envmap_coords_blender[:, 1] = -envmap_coords_blender[:, 1]
166 |
167 | # Get the render resolution
168 | resp = bpy.data.scenes["Scene"].render.resolution_percentage / 100.
169 | resx = int(bpy.data.scenes["Scene"].render.resolution_x * resp)
170 | resy = int(bpy.data.scenes["Scene"].render.resolution_y * resp)
171 |
172 | # Get the camera viewport corner 3D coordinates
173 | frame = cam.data.view_frame()
174 | tr, br, bl, tl = [cam.matrix_world @ corner for corner in frame]
175 | x = br - bl
176 | dx = x.normalized()
177 | y = tl - bl
178 | dy = y.normalized()
179 |
180 | whole_normals = []
181 | whole_positions = []
182 | print("Raycast scene...")
183 | # TODO: Shouldn't we use the center of the pixel instead of the corner?
184 | for s in range(resy):
185 | py = tl - s * y.length / float(resy) * dy
186 | normals_row = []
187 | position_row = []
188 | for b in range(resx):
189 | ray_direction = py + b * x.length / float(resx) * dx
190 | normal, location = getClosestIntersection(cam.data.clip_end, cam.location, ray_direction)
191 |                 # Convert from Blender's z-up to the skylibs y-up coordinate system
192 | normals_row.append([normal[0], normal[2], -normal[1]] if normal else [0, 0, 0])
193 | position_row.append(location)
194 |
195 | whole_normals.append(normals_row)
196 | whole_positions.append(position_row)
197 |
198 | whole_normals = np.asarray(whole_normals)
199 |
200 | print("Raycast light contribution")
201 | pixels = np.zeros((resx * resy, envmap_coords.shape[0]))
202 | if not only_surface_normals:
203 | for s in range(resy):
204 | for b in range(resx):
205 | index = s * resx + b
206 | intensity = GenerateTransportMatrix.compute_occlusion(whole_positions[s][b],
207 | whole_normals[s, b, :],
208 | envmap_coords,
209 | cam,
210 | envmap_coords_blender)
211 | pixels[index, :] = intensity
212 | print("{}/{}".format(s, resy))
213 |
214 | print("Total time : {}".format(time.time() - start_time))
215 | return pixels, whole_normals, np.array([resy, resx])
216 |
217 |
218 | def menu_export(self, context):
219 | # import os
220 | # default_path = os.path.splitext(bpy.data.filepath)[0] + ".npz"
221 | self.layout.operator(GenerateTransportMatrix.bl_idname, text="Transport Matrix (.npz)") # .filepath = default_path
222 |
223 |
224 | def register():
225 | bpy.types.TOPBAR_MT_file_export.append(menu_export)
226 | bpy.utils.register_class(GenerateTransportMatrix)
227 |
228 |
229 | def unregister():
230 | bpy.types.TOPBAR_MT_file_export.remove(menu_export)
231 | bpy.utils.unregister_class(GenerateTransportMatrix)
232 |
233 |
234 | if __name__ == "__main__":
235 | register()
236 |
237 | # Debug code
238 | #import time
239 | #time_start = time.time()
240 | #T, normals, res = GenerateTransportMatrix.compute_transport(12, "latlong", False)
241 | #print("Total time = {}".format(time.time() - time_start))
242 |
--------------------------------------------------------------------------------
/tools3d/display.py:
--------------------------------------------------------------------------------
1 | from matplotlib import pyplot as plt
2 | from matplotlib import cm
3 | import numpy as np
4 |
5 |
6 |
7 | def plotSubFigure(X, Y, Z, subfig, type_):
8 | fig = plt.gcf()
9 | ax = fig.add_subplot(1, 3, subfig, projection='3d')
10 | #ax = fig.gca(projection='3d')
11 | if type_ == "colormap":
12 | ax.plot_surface(X, Y, Z, cmap=cm.viridis, rstride=1, cstride=1,
13 | shade=True, linewidth=0, antialiased=False)
14 | else:
15 | ax.plot_surface(X, Y, Z, color=[0.7, 0.7, 0.7], rstride=1, cstride=1,
16 | shade=True, linewidth=0, antialiased=False)
17 |
18 | ax.set_aspect("equal")
19 |
20 | max_range = np.array([X.max()-X.min(), Y.max()-Y.min(), Z.max()-Z.min()]).max() / 2.0 * 0.6
21 | mid_x = (X.max()+X.min()) * 0.5
22 | mid_y = (Y.max()+Y.min()) * 0.5
23 | mid_z = (Z.max()+Z.min()) * 0.5
24 | ax.set_xlim(mid_x - max_range, mid_x + max_range)
25 | ax.set_ylim(mid_y - max_range, mid_y + max_range)
26 | ax.set_zlim(mid_z - max_range, mid_z + max_range)
27 |
28 | az, el = 90, 90
29 | if type_ == "top":
30 | az = 130
31 | elif type_ == "side":
32 | az, el = 40, 0
33 |
34 |     ax.view_init(elev=el, azim=az)
35 | fig.subplots_adjust(left=0, right=1, bottom=0, top=1)
36 |
37 | plt.grid(False)
38 | plt.axis('off')
39 |
40 |
41 | def plotDepth(Z):
42 |     x = np.linspace(0, Z.shape[1]-1, Z.shape[1])
43 |     y = np.linspace(0, Z.shape[0]-1, Z.shape[0])
44 |     X, Y = np.meshgrid(x, y)  # X, Y now match Z's (rows, cols) shape
45 |
46 | fig = plt.figure(figsize=(12, 6))
47 | plotSubFigure(X, Y, Z, 1, "colormap")
48 | plotSubFigure(X, Y, Z, 2, "top")
49 | plotSubFigure(X, Y, Z, 3, "side")
50 |
51 | fig.subplots_adjust(left=0, right=1, bottom=0, top=1)
52 |
--------------------------------------------------------------------------------
/tools3d/spharm.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | import numpy as np
4 | from scipy.special import sph_harm
5 | from pyshtools.backends import shtools
6 |
7 | from envmap import EnvironmentMap
8 |
9 | # Sanity check, sph_harm was bogus in some versions of scipy / Anaconda Python
10 | # http://stackoverflow.com/questions/33149777/scipy-spherical-harmonics-imaginary-part
11 | #assert np.isclose(sph_harm(2, 5, 2.1, 0.4), -0.17931012976432356-0.31877392205957022j), \
12 | # "Please update your SciPy version, the current version has a bug in its " \
13 | # "spherical harmonics basis computation."
14 |
15 |
16 | class SphericalHarmonic:
17 | def __init__(self, input_, copy_=True, max_l=None, norm=4):
18 | """
19 | Projects `input_` to its spherical harmonics basis up to degree `max_l`.
20 |
21 | norm = 4 means orthonormal harmonics.
22 | For more details, please see https://shtools.oca.eu/shtools/pyshexpanddh.html
23 | """
24 |
25 | if copy_:
26 | self.spatial = input_.copy()
27 | else:
28 | self.spatial = input_
29 |
30 | if not isinstance(self.spatial, EnvironmentMap):
31 | self.spatial = EnvironmentMap(self.spatial, 'LatLong')
32 |
33 | if self.spatial.format_ != "latlong":
34 | self.spatial = self.spatial.convertTo("latlong")
35 |
36 | self.norm = norm
37 |
38 | self.coeffs = []
39 | for i in range(self.spatial.data.shape[2]):
40 | self.coeffs.append(shtools.SHExpandDH(self.spatial.data[:,:,i], norm=norm, sampling=2, lmax_calc=max_l))
41 |
42 | def reconstruct(self, height=None, max_l=None, clamp_negative=True):
43 | """
44 |         :param height: resolution of the reconstruction (passed as `lmax` to `MakeGridDH`)
45 |         :param clamp_negative: clamp reconstructed values below 0 to 0
46 | """
47 |
48 | retval = []
49 | for i in range(len(self.coeffs)):
50 | retval.append(shtools.MakeGridDH(self.coeffs[i], norm=self.norm, sampling=2, lmax=height, lmax_calc=max_l))
51 |
52 | retval = np.asarray(retval).transpose((1,2,0))
53 |
54 | if clamp_negative:
55 | retval = np.maximum(retval, 0)
56 |
57 | return retval
58 |
59 | def window(self, function="sinc"):
60 | """
61 | Applies a windowing function to the coefficients to reduce ringing artifacts.
62 | See https://www.ppsloan.org/publication/StupidSH36.pdf
63 | """
64 | deg = self.coeffs[0].shape[2]
65 | x = np.linspace(0, 1, deg + 1)[:-1]
66 |
67 | if function == "sinc":
68 | kernel = np.sinc(x)
69 | else:
70 | raise NotImplementedError(f"Windowing function {function} is not implemented.")
71 |
72 |         for c in range(len(self.coeffs)):
73 |             for l in range(self.coeffs[c].shape[2]):
74 |                 self.coeffs[c][0,l,:] *= kernel[l]  # cosine terms, scaled by the degree-l window
75 |                 self.coeffs[c][1,l,:] *= kernel[l]  # sine terms, scaled by the degree-l window
76 |
77 | return self
78 |
79 |
80 | if __name__ == '__main__':
81 | from matplotlib import pyplot as plt
82 |
83 | e = EnvironmentMap('envmap.exr', 'angular')
84 | e.resize((64, 64))
85 | e.convertTo('latlong')
86 |
87 | se = SphericalHarmonic(e)
88 |
89 | err = []
90 | from tqdm import tqdm
91 | for i in tqdm(range(32)):
92 | recons = se.reconstruct(max_l=i)
93 | err.append(np.sum((recons - e.data)**2))
94 |
95 | plt.plot(err)
96 |
97 | plt.figure()
98 |     plt.imshow(recons)
99 | plt.show()
100 |
--------------------------------------------------------------------------------
/tools3d/warping_operator/README.md:
--------------------------------------------------------------------------------
1 | # Warping operator
2 |
3 | The warping operator of [Gardner et al., 2017](https://dl.acm.org/doi/10.1145/3130800.3130891) is implemented in the function `tools3d.warping_operator.warpEnvironmentMap`. It makes it possible to simulate a translation through an HDR environment map with unknown geometry by approximating the scene with a sphere. It can be used to relight 3D models from a position other than the camera position of the original panorama.
4 |
5 | ## Documentation
6 |
7 | The function `warpEnvironmentMap` is documented in the source code [here](__init__.py).
8 |
9 | ## Example usage
10 |
11 | The script `example_warp_operator.py` shows a fun example usage, where the camera moves from the far back to the far front of the environment map. First, have a panorama image ready, e.g. `pano.exr`. Then, run the script with:
12 |
13 | ```bash
14 | python tools3d/warping_operator/example_warp_operator.py --environment 'pano.exr'
15 | ```
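
## How the warp works

The internal `warpCoordinates` helper re-intersects each viewing ray with the unit sphere after moving the ray origin to `[0, 0, sin(nadir)]`. The following standalone NumPy sketch mirrors that step (`warp_coordinates` here is a local copy for illustration, not the library function) and checks that the warped points still lie on the unit sphere:

```python
import numpy as np

def warp_coordinates(x, y, z, z_offset):
    # Positive root of |t*d + [0, 0, z_offset]|^2 = 1 for a unit direction d:
    # the re-intersection of the translated ray with the unit sphere.
    t = -z * z_offset + np.sqrt(z_offset**2 * (z**2 - 1) + 1)
    return x * t, y * t, z * t + z_offset

# Random unit directions standing in for the panorama's pixel rays.
rng = np.random.default_rng(0)
d = rng.normal(size=(1000, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)

xw, yw, zw = warp_coordinates(d[:, 0], d[:, 1], d[:, 2], np.sin(0.3))
radii = np.sqrt(xw**2 + yw**2 + zw**2)
print(np.allclose(radii, 1.0))  # True: warped points remain on the unit sphere
```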
16 |
--------------------------------------------------------------------------------
/tools3d/warping_operator/__init__.py:
--------------------------------------------------------------------------------
1 | import envmap
2 | import numpy as np
3 |
4 | cachedWorldCoordinates = {}
5 | def warpEnvironmentMap(environmentMap, nadir, order=1):
6 | """
7 | Applies a warping operation to the environment map by simulating a camera translation along the z-axis
8 | (the environment map is approximated by a sphere, thus occlusions are not taken into account).
9 |     The translation amount is determined by the sine of the nadir angle.
10 |
11 |     :param environmentMap: Environment map to warp.
12 | :param nadir: Nadir angle in radians.
13 | :param order: Interpolation order (0: nearest, 1: linear, ..., 5).
14 |
15 |     This code is a refactored version of the original implementation by Marc-André Gardner.
16 |     The algorithm is described in the following paper:
17 |     Gardner et al., "Learning to Predict Indoor Illumination from a Single Image",
18 |     ACM Transactions on Graphics, 2017. https://dl.acm.org/doi/10.1145/3130800.3130891
19 | """
20 | assert isinstance(environmentMap, envmap.EnvironmentMap)
21 |
22 | global cachedWorldCoordinates
23 | cacheKey = (environmentMap.data.shape, environmentMap.format_)
24 |     if cacheKey not in cachedWorldCoordinates:
25 | cachedWorldCoordinates[cacheKey] = environmentMap.worldCoordinates()
26 |
27 | def warpCoordinates(x, y, z, zOffset):
28 | """
29 | Moves the x, y, z coordinates (ray intersections with the unit sphere,
30 | assuming the origin is at [0, 0, 0]) to their new position, where the ray origins
31 | are at [0, 0, zOffset] and the ray directions are unchanged.
32 |
33 |         The equation for the new coordinates is a simplified version of the
34 |         quadratic formula in eq. 3 of the paper, using the fact that v_x^2+v_y^2+v_z^2 = 1.
35 | We only keep the positive solution since the negative one would move the
36 | point to the other side of the sphere.
37 | """
38 | t = -z * zOffset + np.sqrt(zOffset**2 * (z**2 - 1) + 1)
39 | return x * t, y * t, z * t + zOffset
40 |
41 | xDestination, yDestination, zDestination, _ = cachedWorldCoordinates[cacheKey]
42 | xSource, ySource, zSource = warpCoordinates(xDestination, yDestination, zDestination, np.sin(nadir))
43 | uSource, vSource = environmentMap.world2image(xSource, ySource, zSource)
44 | environmentMap.interpolate(uSource, vSource, order=order)
45 |
46 | return environmentMap
47 |
48 |
--------------------------------------------------------------------------------
/tools3d/warping_operator/example_warp_operator.py:
--------------------------------------------------------------------------------
1 | from tools3d.warping_operator import warpEnvironmentMap
2 | from envmap import EnvironmentMap, rotation_matrix
3 | import numpy as np
4 | import hdrio
5 | import subprocess
6 | import os
7 | import argparse
8 | import tempfile
9 |
10 |
11 | def frames_to_video(frames_path_pattern, output_path):
12 | ffmpeg_cmd = [
13 | 'ffmpeg',
14 | '-y',
15 | '-r', '30',
16 | '-i', frames_path_pattern,
17 | '-c:v', 'libx264',
18 | '-vf', 'fps=30',
19 | '-pix_fmt', 'yuv420p',
20 | output_path
21 | ]
22 |
23 | subprocess.run(ffmpeg_cmd, check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
24 |
25 | def main():
26 | # validate ffmpeg is installed (without printing the output)
27 | if subprocess.run(['ffmpeg', '-version'], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL).returncode != 0:
28 | raise RuntimeError('ffmpeg is not installed')
29 |
30 | parser = argparse.ArgumentParser(description='Creates a video of the warping of an environment map by simulating a translation of the camera along the z-axis.')
31 | parser.add_argument('--output_dir', type=str, default='output')
32 | parser.add_argument('--environment', type=str, help='Path to the environment map to warp (latlong format in exr format).', required=True)
33 | parser.add_argument('--frames', type=int, default=60, help='Number of frames in the video.')
34 | args = parser.parse_args()
35 |
36 | originalEnvironmentMap = EnvironmentMap(args.environment, 'latlong')
37 |
38 | hdrScalingFactor = 0.5/np.mean(originalEnvironmentMap.data)
39 | def tonemap(x):
40 | return np.clip(x * hdrScalingFactor, 0, 1) ** (1/2.2)
41 |
42 | output_dir = args.output_dir
43 | os.makedirs(output_dir, exist_ok=True)
44 |
45 | with tempfile.TemporaryDirectory() as tmp_dir:
46 | for i, nadir_deg in enumerate(np.linspace(-90, 90, args.frames)):
47 | print(f'warping {i+1}/{args.frames}...')
48 |
49 | nadir = np.deg2rad(nadir_deg)
50 | warpedEnvironmentMap = warpEnvironmentMap(originalEnvironmentMap.copy(), nadir)
51 | hdrio.imsave(os.path.join(tmp_dir, f'warped_{i:05}_latlong.png'), tonemap(warpedEnvironmentMap.data))
52 |
53 | warpedEnvironmentMap.convertTo('cube')
54 |             hdrio.imsave(os.path.join(tmp_dir, f'warped_{i:05}_cube.png'), tonemap(warpedEnvironmentMap.data))
55 |
56 | forwardCrop = warpedEnvironmentMap.project(60, rotation_matrix(0, 0, 0))
57 | hdrio.imsave(os.path.join(tmp_dir, f'warped_{i:05}_forwardCrop.png'), tonemap(forwardCrop.data))
58 |
59 | print('creating videos...')
60 | frames_to_video(f'{tmp_dir}/warped_%05d_latlong.png', f'{output_dir}/warped_latlong.mp4')
61 | frames_to_video(f'{tmp_dir}/warped_%05d_cube.png', f'{output_dir}/warped_cube.mp4')
62 | frames_to_video(f'{tmp_dir}/warped_%05d_forwardCrop.png', f'{output_dir}/warped_crop.mp4')
63 |
64 | print('done.')
65 |
66 | if __name__ == '__main__':
67 | main()
68 |
--------------------------------------------------------------------------------