12 |
13 | **pyfor** is a Python package that assists in the processing of point cloud data in the context of forest inventory.
14 | This includes manipulation of point data, support for analysis, and a
15 | memory-optimized API for managing large collections of tiles.
16 |
17 | ## Release Status
18 |
19 | Current Release: 0.3.6
20 |
21 | Release Date: December 1st, 2019.
22 |
23 | Release Status: 0.3.6 is an adolescent LiDAR data processing package adequate for single-tile processing and large acquisitions.
24 |
25 | ## What Does pyfor Do?
26 |
27 | - [Normalization](http://brycefrank.com/pyfor/html/topics/normalization.html)
28 | - [Canopy Height Models](http://brycefrank.com/pyfor/html/topics/canopyheightmodel.html)
29 | - [Ground Filtering](http://brycefrank.com/pyfor/html/api/pyfor.ground_filter.html)
30 | - [Clipping](http://brycefrank.com/pyfor/html/topics/clipping.html)
31 | - [Large Acquisition Processing](http://brycefrank.com/pyfor/html/advanced/handlinglargeacquisitions.html)
32 |
33 | and many other tasks. See the [documentation](http://brycefrank.com/pyfor) for examples and applications.
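
For a single tile, a minimal workflow looks something like the sketch below (file names are placeholders; `Cloud`, `normalize`, `chm` and `write` are described in the documentation linked above):

```python
import pyfor

tile = pyfor.cloud.Cloud("my_tile.las")   # read a .las/.laz tile
tile.normalize(1)                         # filter ground points and normalize heights (1m BEM)
chm = tile.chm(0.5, interp_method="nearest", pit_filter="median")  # 0.5m canopy height model
chm.write("my_chm.tif")                   # write the raster to GeoTIFF
```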
34 |
35 | What about tree segmentation? Please see pyfor's sister package [`treeseg`](https://github.com/brycefrank/treeseg) which
36 | is a standalone package for tree segmentation and detection.
37 |
38 | ## Installation
39 |
40 | [miniconda](https://conda.io/miniconda.html) or Anaconda is required before beginning. pyfor depends on many packages that are otherwise tricky to install (especially gdal and its bindings), and conda provides a quick and easy way to manage many different Python environments on your system simultaneously.
41 |
42 | As of October 14th, 2019, we are proud to announce that `pyfor` is available on `conda-forge`, greatly simplifying the installation process:
43 |
44 | ```
45 | conda install -c conda-forge pyfor
46 | ```
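
After installation, pyfor should import cleanly in a Python session (the package exposes a `__version__` attribute you can check):

```python
import pyfor
print(pyfor.__version__)
```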
47 |
48 | ## Collaboration & Requests
49 |
50 | Contributions are welcome, especially from those experienced with `numba`, `numpy`, `gdal`, `ogr` and `pandas`. Please contact me at bfrank70@gmail.com
51 |
52 | I am also willing to implement features on request. Feel free to [open an issue](https://github.com/brycefrank/pyfor/issues) with your request or email me at the address above.
53 |
54 | pyfor will always remain free to use. Its development takes time, energy and a bit of money to maintain source code and host documentation. If you are so inclined, donations are accepted via the donation button at the top of this README.
55 |
56 |
--------------------------------------------------------------------------------
/docs/.nojekyll:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/.nojekyll
--------------------------------------------------------------------------------
/docs/Makefile:
--------------------------------------------------------------------------------
1 | # Minimal makefile for Sphinx documentation
2 | #
3 |
4 | # You can set these variables from the command line.
5 | SPHINXOPTS =
6 | SPHINXBUILD = sphinx-build
7 | SPHINXPROJ = pyfor
8 | SOURCEDIR = .
9 | BUILDDIR = ../docs
10 |
11 | # Put it first so that "make" without argument is like "make help".
12 | help:
13 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
14 |
15 | .PHONY: help Makefile
16 |
17 | # Catch-all target: route all unknown targets to Sphinx using the new
18 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
19 | %: Makefile
20 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
21 |
--------------------------------------------------------------------------------
/docs/advanced/groundfilter.rst:
--------------------------------------------------------------------------------
1 | Ground Filtering
2 | ================
--------------------------------------------------------------------------------
/docs/advanced/handlinglargeacquisitions.rst:
--------------------------------------------------------------------------------
1 | Handling Large Acquisitions
2 | ===========================
3 |
4 | The pyfor ``collection`` module allows for efficient handling of large acquisitions with parallel
5 | processing of user-defined functions. This document presents a couple of examples of large acquisition
6 | processing.
7 |
8 | Creating Project-Level Canopy Height Models
9 | -------------------------------------------
10 |
11 | A ``CloudDataFrame`` is the integral part of managing large point cloud acquisitions. It is initialized
12 | by passing a directory that contains ``.las`` or ``.laz`` files:
13 |
14 | .. code-block:: python
15 |
16 | import pyfor
17 | my_collection = pyfor.collection.from_dir("my_collection_dir", n_jobs=3)
18 |
19 | Here we have instantiated a collection such that the processing function will be done in parallel across
20 | three cores.
21 |
22 | It will be useful to set the coordinate reference system for the project. This is done via the ``.crs`` attribute:
23 |
24 | .. code-block:: python
25 |
26 | import pyproj
27 |     crs = pyproj.Proj({'init': 'epsg:26910'}).srs
28 | my_collection.crs = crs
29 |
30 | Here we must grapple with a few things. First, we will want to process buffered tiles to eliminate edge effects from
31 | normalization. This requires some additional configuration of the collection.
32 | 
33 | We want to set the collection tiles such
34 | that the output rasters will be consistent for a specified grid cell size. This is done by manipulating the internal
35 | geometries stored in ``my_collection.tiles``. By default, these describe the ``.las``/``.laz`` bounding boxes, but we want to
36 | ensure the output rasters are exactly the correct size so that they line up correctly. We can do this easily with the
37 | ``.retile_raster`` helper function, which modifies the collection **in place**.
38 |
39 | .. code-block:: python
40 |
41 | my_collection.retile_raster(0.5, 500, buffer=20)
42 |
43 | The first argument is the desired resolution of the output raster. The second argument is the size of the new
44 | processing tiles: ``500`` means the acquisition will be processed in 500m x 500m chunks. Finally, ``buffer=20`` means that each tile will
45 | be buffered by 20 meters, such that we can process a larger buffered tile to eliminate edge effects.
46 |
47 | Next, we want to define a function to process each buffered tile. This is where the flexibility of the collection
48 | comes into play.
49 |
50 |
51 | .. code-block:: python
52 |
53 |     def my_process_func(buffered_cloud, tile):
54 |         buffer_dist = 20
55 |         buffered_cloud.normalize(1)
56 | 
57 |         # Generate CHM of buffered cloud
58 |         chm = buffered_cloud.chm(0.5, interp_method="nearest", pit_filter="median")
59 | 
60 |         # Define output bounding box (remove the buffered part)
61 |         coords = list(tile.exterior.coords)
62 |         bbox = (coords[0][0] + buffer_dist, coords[2][0] - buffer_dist,
63 |                 coords[0][1] + buffer_dist, coords[1][1] - buffer_dist)
64 |         chm.force_extent(bbox)
65 | 
66 |         # Make a readable name for this particular tile (requires numpy imported as np)
67 |         flat_coords = [int(np.floor(coord)) for coord in bbox]
68 |         tile_str = '{}_{}_{}_{}'.format(*flat_coords)
69 | 
70 |         # Write out the canopy height model
71 |         chm.write('{}.tif'.format(tile_str))
72 |
73 | The above function is lengthy, but really quite simple. First, we normalize and create a canopy height model for some
74 | given buffered point cloud. Then, we restrict its output to remove the buffer using ``.force_extent``. Finally, we write
75 | out the canopy height model to file with a nicely formatted name.
76 |
77 | Finally, we can execute the processing job with the following:
78 |
79 | .. code-block:: python
80 |
81 | my_collection.par_apply(my_process_func)
82 |
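``par_apply`` can also operate on a per-file basis instead of on the retiled, buffered geometries.
The pattern below is a sketch of that approach (also shown on the project index page), where the
user-defined function receives the path of each ``.las`` file; the function name here is illustrative:

.. code-block:: python

    def normalize_tile(las_path):
        tile = pyfor.cloud.Cloud(las_path)
        tile.normalize(1)
        tile.write('{}_normalized.las'.format(tile.name))

    my_collection.par_apply(normalize_tile, by_file=True)

This skips the retiling and buffering steps, so it is best suited to tasks where edge effects are not a concern.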
--------------------------------------------------------------------------------
/docs/advanced/index.rst:
--------------------------------------------------------------------------------
1 | Advanced
2 | ==========
3 |
4 | .. toctree::
5 | :maxdepth: 1
6 |
7 | handlinglargeacquisitions
8 |
--------------------------------------------------------------------------------
/docs/advanced/understandingcomponents.rst:
--------------------------------------------------------------------------------
1 | Understanding the Components
2 | ============================
3 |
4 | The key to advanced usage of `pyfor` is understanding the base classes of the package. A full
5 | understanding of these allows for flexible and creative ways of conducting your processing task.
6 | This document outlines these classes in a qualitative way. See the documentation for particulars.
7 |
8 | Cloud
9 | -----
10 |
11 | Grid
12 | ----
13 |
14 | Raster
15 | ------
16 |
17 | Raster Derivatives
18 | -------------------
19 |
20 | CloudDataFrame
21 | --------------
22 |
--------------------------------------------------------------------------------
/docs/api/index.rst:
--------------------------------------------------------------------------------
1 | API Reference
2 | ==============
3 |
4 | .. toctree::
5 |
6 | pyfor
--------------------------------------------------------------------------------
/docs/api/modules.rst:
--------------------------------------------------------------------------------
1 | pyfor
2 | =====
3 |
4 | .. toctree::
5 | :maxdepth: 4
6 |
7 | pyfor
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.clip.rst:
--------------------------------------------------------------------------------
1 | pyfor.clip module
2 | =================
3 |
4 | .. automodule:: pyfor.clip
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.cloud.rst:
--------------------------------------------------------------------------------
1 | pyfor.cloud module
2 | ==================
3 |
4 | .. automodule:: pyfor.cloud
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.collection.rst:
--------------------------------------------------------------------------------
1 | pyfor.collection module
2 | =======================
3 |
4 | .. automodule:: pyfor.collection
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.gisexport.rst:
--------------------------------------------------------------------------------
1 | pyfor.gisexport module
2 | ======================
3 |
4 | .. automodule:: pyfor.gisexport
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.ground_filter.rst:
--------------------------------------------------------------------------------
1 | pyfor.ground\_filter module
2 | ===========================
3 |
4 | .. automodule:: pyfor.ground_filter
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.metrics.rst:
--------------------------------------------------------------------------------
1 | pyfor.metrics module
2 | ====================
3 |
4 | .. automodule:: pyfor.metrics
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.rasterizer.rst:
--------------------------------------------------------------------------------
1 | pyfor.rasterizer module
2 | =======================
3 |
4 | .. automodule:: pyfor.rasterizer
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/api/pyfor.rst:
--------------------------------------------------------------------------------
1 | pyfor package
2 | =============
3 |
4 | Submodules
5 | ----------
6 |
7 | .. toctree::
8 |
9 | pyfor.clip
10 | pyfor.cloud
11 | pyfor.collection
12 | pyfor.gisexport
13 | pyfor.ground_filter
14 | pyfor.metrics
15 | pyfor.rasterizer
16 | pyfor.voxelizer
17 |
18 | Module contents
19 | ---------------
20 |
21 | .. automodule:: pyfor
22 | :members:
23 | :undoc-members:
24 | :show-inheritance:
25 |
--------------------------------------------------------------------------------
/docs/api/pyfor.voxelizer.rst:
--------------------------------------------------------------------------------
1 | pyfor.voxelizer module
2 | ======================
3 |
4 | .. automodule:: pyfor.voxelizer
5 | :members:
6 | :undoc-members:
7 | :show-inheritance:
8 |
--------------------------------------------------------------------------------
/docs/conf.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | #
3 | # pyfor documentation build configuration file, created by
4 | # sphinx-quickstart on Sat Apr 14 07:55:21 2018.
5 | #
6 | # This file is execfile()d with the current directory set to its
7 | # containing dir.
8 | #
9 | # Note that not all possible configuration values are present in this
10 | # autogenerated file.
11 | #
12 | # All configuration values have a default; values that are commented out
13 | # serve to show the default.
14 |
15 | # If extensions (or modules to document with autodoc) are in another directory,
16 | # add these directories to sys.path here. If the directory is relative to the
17 | # documentation root, use os.path.abspath to make it absolute, like shown here.
18 | #
19 | import sphinx_rtd_theme
20 | import os
21 | import sys
22 |
23 | sys.path.insert(0, os.path.abspath("."))
24 | sys.path.insert(0, os.path.abspath("../"))
25 |
26 |
27 | # -- General configuration ------------------------------------------------
28 |
29 | # If your documentation needs a minimal Sphinx version, state it here.
30 | #
31 | # needs_sphinx = '1.0'
32 |
33 | # Add any Sphinx extension module names here, as strings. They can be
34 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
35 | # ones.
36 | extensions = ["sphinx.ext.autodoc", "sphinx.ext.doctest"]
37 |
38 | # Add any paths that contain templates here, relative to this directory.
39 | templates_path = ["_templates"]
40 |
41 | # The suffix(es) of source filenames.
42 | # You can specify multiple suffix as a list of string:
43 | #
44 | # source_suffix = ['.rst', '.md']
45 | source_suffix = ".rst"
46 |
47 | # The master toctree document.
48 | master_doc = "index"
49 |
50 | # General information about the project.
51 | project = u"pyfor"
52 | copyright = u"2019, Bryce Frank"
53 | author = u"Bryce Frank"
54 |
55 | # The version info for the project you're documenting, acts as replacement for
56 | # |version| and |release|, also used in various other places throughout the
57 | # built documents.
58 |
59 | try:
60 |     import pyfor
61 |     release = pyfor.__version__
62 | except ImportError:
63 |     with open("../pyfor/__init__.py") as f:
64 |         for line in f:
65 |             if line.find("__version__") >= 0:
66 |                 release = line.split("=")[1].strip().strip('"').strip("'")
67 |                 version = release
68 |
69 | # The language for content autogenerated by Sphinx. Refer to documentation
70 | # for a list of supported languages.
71 | #
72 | # This is also used if you do content translation via gettext catalogs.
73 | # Usually you set "language" from the command line for these cases.
74 | language = None
75 |
76 | # List of patterns, relative to source directory, that match files and
77 | # directories to ignore when looking for source files.
78 | # This patterns also effect to html_static_path and html_extra_path
79 | exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
80 |
81 | # The name of the Pygments (syntax highlighting) style to use.
82 | pygments_style = "sphinx"
83 |
84 | # If true, `todo` and `todoList` produce output, else they produce nothing.
85 | todo_include_todos = False
86 |
87 |
88 | # -- Options for HTML output ----------------------------------------------
89 |
90 | # The theme to use for HTML and HTML Help pages. See the documentation for
91 | # a list of builtin themes.
92 | #
93 | html_theme = "sphinx_rtd_theme"
94 |
95 | # Theme options are theme-specific and customize the look and feel of a theme
96 | # further. For a list of options available for each theme, see the
97 | # documentation.
98 | #
99 | # html_theme_options = {}
100 |
101 | # Add any paths that contain custom static files (such as style sheets) here,
102 | # relative to this directory. They are copied after the builtin static files,
103 | # so a file named "default.css" will overwrite the builtin "default.css".
104 | html_static_path = ["_static"]
105 |
106 | # Custom sidebar templates, must be a dictionary that maps document names
107 | # to template names.
108 | #
109 | # This is required for the alabaster theme
110 | # refs: http://alabaster.readthedocs.io/en/latest/installation.html#sidebars
111 | html_sidebars = {
112 | "**": [
113 | "relations.html", # needs 'show_related': True theme option to display
114 | "searchbox.html",
115 | ]
116 | }
117 |
118 |
119 | # -- Options for HTMLHelp output ------------------------------------------
120 |
121 | # Output file base name for HTML help builder.
122 | htmlhelp_basename = "pyfordoc"
123 |
124 |
125 | # -- Options for LaTeX output ---------------------------------------------
126 |
127 | latex_elements = {
128 | # The paper size ('letterpaper' or 'a4paper').
129 | #
130 | # 'papersize': 'letterpaper',
131 | # The font size ('10pt', '11pt' or '12pt').
132 | #
133 | # 'pointsize': '10pt',
134 | # Additional stuff for the LaTeX preamble.
135 | #
136 | # 'preamble': '',
137 | # Latex figure (float) alignment
138 | #
139 | # 'figure_align': 'htbp',
140 | }
141 |
142 | # Grouping the document tree into LaTeX files. List of tuples
143 | # (source start file, target name, title,
144 | # author, documentclass [howto, manual, or own class]).
145 | latex_documents = [
146 | (master_doc, "pyfor.tex", u"pyfor Documentation", u"Bryce Frank", "manual")
147 | ]
148 |
149 |
150 | # -- Options for manual page output ---------------------------------------
151 |
152 | # One entry per manual page. List of tuples
153 | # (source start file, name, description, authors, manual section).
154 | man_pages = [(master_doc, "pyfor", u"pyfor Documentation", [author], 1)]
155 |
156 |
157 | # -- Options for Texinfo output -------------------------------------------
158 |
159 | # Grouping the document tree into Texinfo files. List of tuples
160 | # (source start file, target name, title, author,
161 | # dir menu entry, description, category)
162 | texinfo_documents = [
163 | (
164 | master_doc,
165 | "pyfor",
166 | u"pyfor Documentation",
167 | author,
168 | "pyfor",
169 | "One line description of project.",
170 | "Miscellaneous",
171 | )
172 | ]
173 |
174 |
175 | # Include init
176 | def skip(app, what, name, obj, skip, options):
177 | if name == "__init__":
178 | return False
179 | return skip
180 |
181 |
182 | def setup(app):
183 | app.connect("autodoc-skip-member", skip)
184 |
--------------------------------------------------------------------------------
/docs/gettingstarted.rst:
--------------------------------------------------------------------------------
1 | Getting Started
2 | ===============
3 |
4 | This document describes a few basic operations, such as reading, writing, and simple point cloud
5 | manipulations for a single point cloud dataset.
6 |
7 | Reading a Point Cloud
8 | ---------------------
9 |
10 | Reading a point cloud means instantiating a `Cloud` object. The `Cloud` object is the integral
11 | part of a point cloud analysis in pyfor. Instantiating a `Cloud` is simple:
12 |
13 | .. code-block:: python
14 |
15 | import pyfor
16 | tile = pyfor.cloud.Cloud("../pyfortest/data/test.las")
17 |
18 | Once we have an instance of our Cloud object we can explore some information regarding the
19 | point cloud. We can print the Cloud object for a brief summary of the data within.
20 |
21 | .. code-block:: python
22 |
23 | print(tile)
24 |
25 | ::
26 |
27 | File Path: ../data/test.las
28 | File Size: 6082545
29 | Number of Points: 217222
30 | Minimum (x y z): [405000.01, 3276300.01, 36.29]
31 | Maximum (x y z): [405199.99, 3276499.99, 61.12]
32 | Las Version: 1.3
33 |
34 | An important attribute of all `Cloud` objects is `.data`. This represents the raw data and header
35 | information of a `Cloud` object. It is managed by a separate, internal class called `LASData` in
36 | the case of .las files and `PLYData` in the case of .ply files. These classes manage some
37 | monotonous reading, writing and updating tasks for us, and are generally not necessary
38 | to interact with directly. Still, it is important to know they exist.
39 |
40 | Filtering Raw Points
41 | --------------------
42 |
43 | Sometimes it is interesting to view the raw points. For a `Cloud` object, these are stored in a
44 | pandas dataframe in the `.data.points` attribute:
45 |
46 | .. code-block:: python
47 |
48 | tile.data.points.head()
49 |
50 | Direct modifications to the raw points should be done with caution, but are as simple as
51 | over-writing this dataframe. For example, to remove all points with an x dimension exceeding
52 | 405120:
53 |
54 | .. code-block:: python
55 |
56 | tile.data.points = tile.data.points[tile.data.points["x"] < 405120]
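
The same pattern works for any boolean filter that pandas supports. As a sketch (assuming the
standard `x`, `y` and `z` columns shown in the summary above), points can be filtered on several
dimensions at once:

.. code-block:: python

    points = tile.data.points
    tile.data.points = points[(points["x"] < 405120) & (points["z"] < 55)]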
57 |
58 | Plotting
59 | --------
60 |
61 | For quick visual inspection, a simple plotting method is available that uses `matplotlib` as a
62 | backend:
63 |
64 | .. code-block:: python
65 |
66 | tile.plot()
67 |
68 | .. image:: img/simple_plot.png
69 | :scale: 50%
70 | :align: center
71 |
72 | Writing Points
73 | ---------------
74 |
75 | Finally, we can write our point cloud out to a new file:
76 |
77 | .. code-block:: python
78 |
79 | tile.write('my_new_tile.las')
80 |
81 |
--------------------------------------------------------------------------------
/docs/html/.buildinfo:
--------------------------------------------------------------------------------
1 | # Sphinx build info version 1
2 | # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
3 | config: 8d31c8b1ae9d2c2d62d4bc9c3d430c12
4 | tags: 645f666f9bcd5a90fca523b33c5a78b7
5 |
--------------------------------------------------------------------------------
/docs/html/_images/chm_final.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_images/chm_final.png
--------------------------------------------------------------------------------
/docs/html/_images/clipped.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_images/clipped.png
--------------------------------------------------------------------------------
/docs/html/_images/simple_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_images/simple_plot.png
--------------------------------------------------------------------------------
/docs/html/_images/unclipped.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_images/unclipped.png
--------------------------------------------------------------------------------
/docs/html/_sources/index.rst.txt:
--------------------------------------------------------------------------------
1 | .. pyfor documentation master file, created by
2 | sphinx-quickstart on Sat Apr 14 07:55:21 2018.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | pyfor: point cloud analysis for forest inventory
7 | ================================================
8 |
9 | `pyfor <https://github.com/brycefrank/pyfor>`_ is a Python package for processing and manipulating point cloud data for analysis in large-scale forest inventory
10 | systems. pyfor is developed with a philosophy of flexibility, and offers solutions for advanced and novice Python users.
11 | This web page contains a user manual and source code documentation.
12 |
13 | pyfor is capable of processing large acquisitions of point data in just a few lines of code. Here is an example for
14 | performing a routine normalization for an entire collection of `.las` tiles.
15 |
16 | .. code:: python
17 |
18 | import pyfor
19 | collection = pyfor.collection.from_dir('./my_tiles')
20 |
21 | def normalize(las_path):
22 | tile = pyfor.cloud.Cloud(las_path)
23 |         tile.normalize(1)
24 | tile.write('{}_normalized.las'.format(tile.name))
25 |
26 | collection.par_apply(normalize, by_file=True)
27 |
28 | The above example only scratches the surface. Please see the Installation and Getting Started pages to learn more.
29 |
30 | .. toctree::
31 | :maxdepth: 2
32 |
33 | introduction
34 | installation
35 | gettingstarted
36 | topics/index
37 | advanced/index
38 | api/index
39 |
40 | Indices and Tables
41 | ==================
42 |
43 | * :ref:`genindex`
44 | * :ref:`modindex`
45 | * :ref:`search`
46 |
--------------------------------------------------------------------------------
/docs/html/_static/ajax-loader.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/ajax-loader.gif
--------------------------------------------------------------------------------
/docs/html/_static/comment-bright.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/comment-bright.png
--------------------------------------------------------------------------------
/docs/html/_static/comment-close.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/comment-close.png
--------------------------------------------------------------------------------
/docs/html/_static/comment.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/comment.png
--------------------------------------------------------------------------------
/docs/html/_static/css/badge_only.css:
--------------------------------------------------------------------------------
1 | .fa:before{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:before,.clearfix:after{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-weight:normal;font-style:normal;src:url("../fonts/fontawesome-webfont.eot");src:url("../fonts/fontawesome-webfont.eot?#iefix") format("embedded-opentype"),url("../fonts/fontawesome-webfont.woff") format("woff"),url("../fonts/fontawesome-webfont.ttf") format("truetype"),url("../fonts/fontawesome-webfont.svg#FontAwesome") format("svg")}.fa:before{display:inline-block;font-family:FontAwesome;font-style:normal;font-weight:normal;line-height:1;text-decoration:inherit}a .fa{display:inline-block;text-decoration:inherit}li .fa{display:inline-block}li .fa-large:before,li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-0.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before,ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before{content:""}.icon-book:before{content:""}.fa-caret-down:before{content:""}.icon-caret-down:before{content:""}.fa-caret-up:before{content:""}.icon-caret-up:before{content:""}.fa-caret-left:before{content:""}.icon-caret-left:before{content:""}.fa-caret-right:before{content:""}.icon-caret-right:before{content:""}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:"Lato","proxima-nova","Helvetica Neue",Arial,sans-serif;z-index:400}.rst-versions a{color:#2980B9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27AE60;*zoom:1}.rst-versions .rst-current-version:before,.rst-versions .rst-current-version:after{display:table;content:""}.rst-versions .rst-current-version:after{clear:both}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book{float:left}.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#E74C3C;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#F1C40F;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:gray;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:solid 1px #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .icon-book{float:none}.rst-versions.rst-badge .fa-book{float:none}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book{float:left}.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge .rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width: 768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}
2 |
--------------------------------------------------------------------------------
/docs/html/_static/documentation_options.js:
--------------------------------------------------------------------------------
1 | var DOCUMENTATION_OPTIONS = {
2 | URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'),
3 | VERSION: '',
4 | LANGUAGE: 'None',
5 | COLLAPSE_INDEX: false,
6 | FILE_SUFFIX: '.html',
7 | HAS_SOURCE: true,
8 | SOURCELINK_SUFFIX: '.txt',
9 | NAVIGATION_WITH_KEYS: false
10 | };
--------------------------------------------------------------------------------
/docs/html/_static/down-pressed.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/down-pressed.png
--------------------------------------------------------------------------------
/docs/html/_static/down.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/down.png
--------------------------------------------------------------------------------
/docs/html/_static/file.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/file.png
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bold.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bold.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bold.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bold.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bold.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bold.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bold.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bold.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bolditalic.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bolditalic.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bolditalic.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bolditalic.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bolditalic.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bolditalic.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-bolditalic.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-bolditalic.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-italic.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-italic.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-italic.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-italic.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-italic.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-italic.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-italic.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-italic.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-regular.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-regular.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-regular.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-regular.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-regular.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-regular.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/Lato/lato-regular.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/Lato/lato-regular.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-bold.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/RobotoSlab/roboto-slab-v7-regular.woff2
--------------------------------------------------------------------------------
/docs/html/_static/fonts/fontawesome-webfont.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/fontawesome-webfont.eot
--------------------------------------------------------------------------------
/docs/html/_static/fonts/fontawesome-webfont.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/fontawesome-webfont.ttf
--------------------------------------------------------------------------------
/docs/html/_static/fonts/fontawesome-webfont.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/fontawesome-webfont.woff
--------------------------------------------------------------------------------
/docs/html/_static/fonts/fontawesome-webfont.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/html/_static/fonts/fontawesome-webfont.woff2
--------------------------------------------------------------------------------
/docs/html/_static/js/theme.js:
--------------------------------------------------------------------------------
1 | /* sphinx_rtd_theme version 0.4.3 | MIT license */
2 | /* Built 20190212 16:02 */
3 | require=function r(s,a,l){function c(e,n){if(!a[e]){if(!s[e]){var i="function"==typeof require&&require;if(!n&&i)return i(e,!0);if(u)return u(e,!0);var t=new Error("Cannot find module '"+e+"'");throw t.code="MODULE_NOT_FOUND",t}var o=a[e]={exports:{}};s[e][0].call(o.exports,function(n){return c(s[e][1][n]||n)},o,o.exports,r,s,a,l)}return a[e].exports}for(var u="function"==typeof require&&require,n=0;n"),i("table.docutils.footnote").wrap(""),i("table.docutils.citation").wrap(""),i(".wy-menu-vertical ul").not(".simple").siblings("a").each(function(){var e=i(this);expand=i(''),expand.on("click",function(n){return t.toggleCurrent(e),n.stopPropagation(),!1}),e.prepend(expand)})},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),i=e.find('[href="'+n+'"]');if(0===i.length){var t=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(i=e.find('[href="#'+t.attr("id")+'"]')).length&&(i=e.find('[href="#"]'))}0this.docHeight||(this.navBar.scrollTop(i),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",function(){this.linkScroll=!1})},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current"),e.siblings().find("li.current").removeClass("current"),e.find("> ul li.current").removeClass("current"),e.toggleClass("current")}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:e.exports.ThemeNav,StickyNav:e.exports.ThemeNav}),function(){for(var r=0,n=["ms","moz","webkit","o"],e=0;e
--------------------------------------------------------------------------------
/docs/img/chm_final.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/img/chm_final.png
--------------------------------------------------------------------------------
/docs/img/clipped.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/img/clipped.png
--------------------------------------------------------------------------------
/docs/img/simple_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/img/simple_plot.png
--------------------------------------------------------------------------------
/docs/img/unclipped.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/img/unclipped.png
--------------------------------------------------------------------------------
/docs/index.html:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/docs/index.rst:
--------------------------------------------------------------------------------
1 | .. pyfor documentation master file, created by
2 | sphinx-quickstart on Sat Apr 14 07:55:21 2018.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | pyfor: point cloud analysis for forest inventory
7 | ================================================
8 |
9 | `pyfor <https://github.com/brycefrank/pyfor>`_ is a Python package for processing and manipulating point cloud data for analysis in large-scale forest inventory
10 | systems. pyfor is developed with a philosophy of flexibility, and offers solutions for advanced and novice Python users.
11 | This web page contains a user manual and source code documentation.
12 |
13 | pyfor is capable of processing large acquisitions of point data in just a few lines of code. Here is an example for
14 | performing a routine normalization for an entire collection of `.las` tiles.
15 |
16 | .. code:: python
17 |
18 | import pyfor
19 | collection = pyfor.collection.from_dir('./my_tiles')
20 |
21 | def normalize(las_path):
22 | tile = pyfor.cloud.Cloud(las_path)
23 |         tile.normalize(1)
24 | tile.write('{}_normalized.las'.format(tile.name))
25 |
26 | collection.par_apply(normalize, by_file=True)
27 |
28 | The above example only scratches the surface. Please see the Installation and Getting Started pages to learn more.
29 |
30 | .. toctree::
31 | :maxdepth: 2
32 |
33 | introduction
34 | installation
35 | gettingstarted
36 | topics/index
37 | advanced/index
38 | api/index
39 |
40 | Indices and Tables
41 | ==================
42 |
43 | * :ref:`genindex`
44 | * :ref:`modindex`
45 | * :ref:`search`
46 |
--------------------------------------------------------------------------------
/docs/installation.rst:
--------------------------------------------------------------------------------
1 | Installation
2 | ============
3 |
4 | `miniconda <https://conda.io/miniconda.html>`_ or Anaconda is required before beginning. pyfor depends on many
5 | packages that are otherwise tricky to install (especially gdal and its bindings),
6 | and conda provides a quick and easy way to manage many different Python environments on your
7 | system simultaneously.
8 |
9 | The following bash commands install the development version of pyfor from GitHub. They require miniconda (see above) and will install all of the prerequisites in a new environment named pyfor_env. pyfor depends on a lot of heavy libraries, so expect construction of the environment to take a little time.
10 |
11 | .. code:: bash
12 |
13 | git clone https://github.com/brycefrank/pyfor.git
14 | cd pyfor
15 | conda env create -f environment.yml
16 |
17 | # For Linux / macOS:
18 | source activate pyfor_env
19 |
20 | # For Windows:
21 | activate pyfor_env
22 |
23 | pip install .
24 |
25 | Following these commands, pyfor should load in an activated Python shell:
26 |
27 | .. code:: python
28 |
29 | import pyfor
30 |
31 | If you see no errors, you are ready to process.
--------------------------------------------------------------------------------
/docs/introduction.rst:
--------------------------------------------------------------------------------
1 | Introduction
2 | ============
3 |
4 |
5 | Welcome to pyfor, a Python module intended for processing large aerial LiDAR (and phodar)
6 | acquisitions for the use of large-scale forest inventories. This document serves as a general
7 | introduction to the package, its philosophy, and a bit of its history. If you would like to
8 | jump straight into analysis, feel free to skip over this "soft" document.
9 |
10 | About the Author
11 | ----------------
12 |
13 | I am Bryce Frank, a PhD student at Oregon State University. I work for the Forest Measurements
14 | and Biometrics Lab, which is run under the guidance of Dr. Temesgen Hailemariam.
15 | Our work focuses on developing statistical methodology for the analysis of forest resources
16 | at multiple scales. Some of the lab members work on small-scale issues, like biomass
17 | and taper modeling. My work, along with others, is focused on producing reliable
18 | estimates of forest attributes for large scale forest assessment.
19 |
20 | Package History
21 | ---------------
22 |
23 | I originally began pyfor as a class project to explore the use of object oriented programming
24 | in GIS. At the time, I had been programming in Python for about two years, but still struggled
25 | with some concepts like Classes, object inheritance, and the like. pyfor was a way for me to
26 | finally learn some of those concepts and implement them in an analysis environment that was
27 | useful for me. Around the Spring of 2017, I released the package on GitHub. At the time,
28 | the package was in very rough condition, was very inefficient, and only did a few rudimentary
29 | tasks.
30 |
31 | Around the Spring of 2018 I found a bit of time to rework the package from the ground up.
32 | I was deeply inspired by the lidR package, which I used extensively for a few months.
33 | I think lidR is a great tool, and pyfor is really just an alternative way of doing many of
34 | the same tasks. However, I prefer to work in Python for many reasons, and I also prefer to
35 | do my own scripting, so lidR fell by the wayside for me. Rather than keep
36 | my scripts locked up somewhere, I modified the early version of pyfor with my newest
37 | attempts. I am also indebted to Bob McGaughey's FUSION, which paved the way in terms
38 | of these sorts of software, and is still my go-to software package for production work.
39 |
40 | Philosophy
41 | ----------
42 |
43 | pyfor started as a means for me to learn OOP, and I think the framework is a very natural
44 | way to work with LiDAR data from an interactive standpoint. In this way, pyfor is a bit
45 | of a niche tool that is really designed more for research - being able to quickly change
46 | parameters on the fly and get quick visual feedback about the results is important for tree
47 | detection and other tasks. Because I am a bit selfish when I develop, and I am mainly a
48 | researcher at this point in my career, this will be the main objective for the package
49 | for the time being.
50 |
51 | However, I completely understand the desire for performant processing. As the structure of
52 | pyfor begins to solidify, more of my time can be spent on diagnosing performance issues
53 | within the package and optimizing toward that end. I think Python, specifically scientific
54 | Python packages, will continue to be a solid platform for developing reasonably performant
55 | code, if done well. It is unlikely that pyfor will achieve speeds equivalent to raw C++ or
56 | FORTRAN code, but I do not think it will be orders of magnitude away if functions are
57 | developed in a way that leverages some of these faster packages. This also comes with
58 | the benefit of increased maintainability, clarity and usability - goals that I feel
59 | are often overlooked in the GIS world.
60 |
61 | Acknowledgements
62 | ----------------
63 |
64 | A special thank you to Drs. Ben Weinstein, Francisco Mauro-Gutiérrez and Temesgen Hailemariam
65 | for their continued support and advice as this project matures.
--------------------------------------------------------------------------------
/docs/logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/logo.png
--------------------------------------------------------------------------------
/docs/make.bat:
--------------------------------------------------------------------------------
1 | @ECHO OFF
2 |
3 | pushd %~dp0
4 |
5 | REM Command file for Sphinx documentation
6 |
7 | if "%SPHINXBUILD%" == "" (
8 | set SPHINXBUILD=sphinx-build
9 | )
10 | set SOURCEDIR=.
11 | set BUILDDIR=_build
12 | set SPHINXPROJ=pyfor
13 |
14 | if "%1" == "" goto help
15 |
16 | %SPHINXBUILD% >NUL 2>NUL
17 | if errorlevel 9009 (
18 | echo.
19 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
20 | echo.installed, then set the SPHINXBUILD environment variable to point
21 | echo.to the full path of the 'sphinx-build' executable. Alternatively you
22 | echo.may add the Sphinx directory to PATH.
23 | echo.
24 | echo.If you don't have Sphinx installed, grab it from
25 | echo.http://sphinx-doc.org/
26 | exit /b 1
27 | )
28 |
29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
30 | goto end
31 |
32 | :help
33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
34 |
35 | :end
36 | popd
37 |
--------------------------------------------------------------------------------
/docs/norm_sample.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/norm_sample.png
--------------------------------------------------------------------------------
/docs/structure.rst:
--------------------------------------------------------------------------------
1 | The structure of pyfor
2 | ============================
3 |
4 | This document provides an in-depth discussion of the structure of pyfor.
5 |
6 | Introduction
7 | ============
8 |
9 | pyfor differs from other forest inventory LiDAR processing packages in that it is built on top of an object oriented programming (OOP) framework. Python is not strictly an OOP language, but it allows OOP to be applied in a flexible way. pyfor takes advantage of this feature of Python to offer a natural way of thinking about LiDAR data as a collection of objects, each of which flows into the next over the course of an analysis. Using pyfor means being comfortable with this framework and knowing what each of the following objects can do. This document is a primer on that philosophy.
10 |
11 | There are four main classes that define pyfor. They are:
12 |
13 | 1. Cloud
14 | 2. CloudDataFrame
15 | 3. Grid
16 | 4. Raster
17 |
18 | Cloud
19 | -----
20 |
21 | The Cloud class represents the point cloud data in its raw format: a list of x, y and z coordinates, along with other fields such as intensity. The Cloud object can be plotted in a variety of ways, including 2D and 3D plots (see .plot, .plot3d and .iplot3d). The Cloud is generally considered the starting point of a LiDAR analysis; from this object we can produce other objects, specifically the Grid and the Raster.
22 |
23 | Grid
24 | ----
25 |
26 | The Grid can be considered the next step in producing products from our Cloud. The Grid assigns each point of the Cloud to a grid cell. This process allows us to summarize information about the points in each grid cell.
27 |
28 | Raster
29 | ------
30 |
31 | Once we have decided how to summarize our grid cells, we produce a Raster. The Raster is a geo-referenced numpy array where the value of each cell is some summarization of the points within that cell. The most common implementation of a Raster is the canopy height model (CHM) that is a summary of the highest points within each grid cell, but other types are possible, such as bare earth models (BEM) and grid metrics. We can write Rasters to a GeoTIFF format using Raster.write().
32 |
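The sketch below illustrates how these first three objects flow into one another (a minimal sketch, assuming a hypothetical file ``my_tile.las``; normalize first if heights above ground are desired). ``Grid.raster`` summarizes a chosen dimension within each cell, here the maximum ``z``:

.. code-block:: python

    import numpy as np
    import pyfor

    tile = pyfor.cloud.Cloud('my_tile.las')  # Cloud: the raw points
    grid = tile.grid(20)                     # Grid: points binned into 20 x 20 unit cells
    max_z = grid.raster(np.max, "z")         # Raster: one summary value per cell
    max_z.write('max_z.tif')                 # Write the georeferenced array to GeoTIFF
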
33 | CloudDataFrame
34 | --------------
35 |
36 | At some point we will want to interact with our LiDAR tiles in toto. CloudDataFrame is a collection of Cloud objects that allows for efficient analysis and manipulation of many tiles at once. Because Cloud is the foundational class in pyfor, CloudDataFrame is our portal to producing all of the products described above at scale.
37 |
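As a minimal sketch (assuming a hypothetical directory ``my_tiles/`` containing ``.las`` files), a collection can be built from a directory of tiles:

.. code-block:: python

    import pyfor

    # Build a CloudDataFrame from every tile in the directory
    col = pyfor.collection.from_dir('my_tiles/')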
--------------------------------------------------------------------------------
/docs/tile.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/docs/tile.png
--------------------------------------------------------------------------------
/docs/topics/canopyheightmodel.rst:
--------------------------------------------------------------------------------
1 | Canopy Height Models
2 | ====================
3 |
4 | An integral part of any analysis is the production of a canopy height model, or CHM.
5 | The CHM is a rasterized representation of the canopy of the forest. The creation and filtering
6 | of CHMs play a large role in tree detection algorithms and are an interpretable way to display
7 | information.
8 |
9 | A basic canopy height model can be created using a convenience wrapper:
10 |
11 | .. code-block:: python
12 |
13 | tile = pyfor.cloud.Cloud('my_tile.las')
14 | tile.normalize(1)
15 | chm = tile.chm(0.5)
16 |
17 | The above block will load the las file, `my_tile.las`, remove the ground elevation (normalize)
18 | and compute a basic canopy height model. Here, we specify a resolution of 0.5 units.
19 |
20 | **Note**: pyfor makes
21 | no assumptions about the reference system, so always specify resolutions in the
22 | units that the point cloud is registered in. In this case the point cloud was registered in meters,
23 | so the output raster will have a resolution of 0.5 meters.
24 |
25 | Manipulating Canopy Height Models
26 | ---------------------------------
27 |
28 | Oftentimes, raw CHMs are not adequate for analysis. They contain issues such as missing
29 | values and data pits. We can add extra arguments to enable NaN interpolation and pit filtering.
30 |
31 | .. code-block:: python
32 |
33 | better_chm = tile.chm(0.5, interp_method = "nearest", pit_filter = "median")
34 |
35 | Here, we interpolate missing values using a nearest neighbor interpolator, and pass a median
36 | filter over the canopy height model to smooth pits.
37 |
38 | We can display our CHM with the `.plot` method:
39 |
40 | .. code-block:: python
41 |
42 | better_chm.plot()
43 |
44 | .. image:: ../img/chm_final.png
45 | :scale: 40%
46 | :align: center
47 |
48 |
49 | Writing Canopy Height Models
50 | ----------------------------
51 |
52 | A canopy height model is a `Raster` object, and it can be written out in the same way as any other raster.
53 |
54 | .. code-block:: python
55 |
56 | better_chm.write('my_chm.tif')
--------------------------------------------------------------------------------
/docs/topics/clipping.rst:
--------------------------------------------------------------------------------
1 | Clipping
2 | ========
3 |
4 | Oftentimes we want to clip out LiDAR points using a shapefile. This can be done using pyfor's
5 | Cloud.clip method. pyfor integrates with geopandas and shapely, convenient geospatial packages
6 | for Python, to provide a way to clip point clouds.
7 |
8 | .. code-block:: python
9 |
10 | import pyfor
11 | import geopandas
12 |
13 | # Load point cloud
14 | pc = pyfor.cloud.Cloud("../data/test.las")
15 | pc.plot3d()
16 |
17 |
18 | .. image:: ../img/unclipped.png
19 | :scale: 80%
20 | :align: center
21 |
22 | As input to the clipping function we need any `shapely.geometry.Polygon` our heart desires, as
23 | long as its coordinates correspond to the same physical space as the `Cloud` object. Here I extract
24 | a `Polygon` from a shapefile using `geopandas`:
25 |
26 | .. code-block:: python
27 |
28 | # Load the shapefile and extract the first polygon
29 | polys = geopandas.read_file("../data/clip.shp")
30 | poly = polys["geometry"].iloc[0]
31 |
32 | Finally, pass the `Polygon` to the clipping function. This function returns a new `Cloud` object.
33 |
34 | .. code-block:: python
35 |
36 | # Clip the point cloud to the polygon
37 | clipped = pc.clip(poly)
38 | clipped.plot3d()
39 |
40 |
41 | .. image:: ../img/clipped.png
42 | :scale: 80%
43 | :align: center
44 |
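Shapefiles are not required; any ``shapely.geometry.Polygon`` in the same coordinate system as the point cloud will do. A minimal sketch with hypothetical coordinates:

.. code-block:: python

    from shapely.geometry import Polygon

    # A hypothetical square, with vertices in the same CRS as the point cloud
    square = Polygon([(405000, 3276300), (405100, 3276300),
                      (405100, 3276400), (405000, 3276400)])
    clipped = pc.clip(square)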
--------------------------------------------------------------------------------
/docs/topics/index.rst:
--------------------------------------------------------------------------------
1 | The Basics
2 | ==========
3 |
4 | .. toctree::
5 | :maxdepth: 1
6 |
7 | canopyheightmodel
8 | clipping
9 | normalization
10 | metrics
--------------------------------------------------------------------------------
/docs/topics/metrics.rst:
--------------------------------------------------------------------------------
1 | Area-Based Metrics
2 | ===================
3 |
4 | The area-based approach (ABA) is ubiquitous in modern forest inventory systems. `pyfor` enables the
5 | computation of a set of area-based metrics for individual point clouds (`Cloud` objects)
6 | and for gridded point clouds (`Grid` objects).
7 |
8 | The following block demonstrates a minimal example of creating standard metrics for a gridded tile.
9 |
10 | .. code-block:: python
11 |
12 | import pyfor
13 | tile = pyfor.cloud.Cloud('my_tile.las')
14 | tile.normalize(1)
15 | grid = tile.grid(20)
16 | std_metrics = grid.standard_metrics(2)
17 |
18 | This returns a Python dictionary, where each key is the name of a metric and each value is
19 | a `Raster` object. The argument `2` is the heightbreak, in meters, above which canopy cover
20 | metrics are computed.
21 |
22 | .. code-block:: python
23 |
24 | {'max_z': <pyfor.rasterizer.Raster object at 0x...>,
25 | 'min_z': <pyfor.rasterizer.Raster object at 0x...>,
26 | ...
27 | 'pct_all_above_mean': <pyfor.rasterizer.Raster object at 0x...>}
28 |
29 | Interacting with these key value pairs is natural, since the values are simply `Raster` objects.
30 | For example we can plot the `max_z` raster from the dictionary:
31 |
32 | .. code-block:: python
33 |
34 | std_metrics['max_z'].plot()
35 |
36 | .. image:: ../img/max_z.png
37 | :scale: 40%
38 | :align: center
39 |
40 | Or, perhaps more usefully, write the raster with a custom name:
41 |
42 | .. code-block:: python
43 |
44 | std_metrics['max_z'].write('my_max_z.tif')
45 |
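Because every value in the dictionary is a `Raster`, the entire suite can be written to disk in a short loop (a sketch; adjust the output paths to taste):

.. code-block:: python

    for name, raster in std_metrics.items():
        raster.write('{}.tif'.format(name))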
46 |
47 | Standard Metrics Description
48 | ----------------------------
49 |
50 | A number of metrics are included in the standard suite and are modeled heavily after those produced by the FUSION software.
51 | Here is a brief description of each.
52 |
53 | .. code-block:: text
54 |
55 | p_*: The height of the *th percentile along the z dimension.
56 | * = (1, 5, 10, 20, 25, 30, 40, 50, 60, 70, 75, 80, 90, 95, 99)
57 | max_z: The maximum of the z dimension.
58 | min_z: The minimum of the z dimension.
59 | mean_z: The mean of the z dimension.
61 | stddev_z: The standard deviation of the z dimension.
62 | var_z: The variance of the z dimension.
63 | canopy_relief_ratio: (mean_z - min_z) / (max_z - min_z)
64 | pct_r_1_above_*: The percentage of first returns above a specified heightbreak.
65 | pct_r_1_above_mean: The percentage of first returns above mean_z.
66 | pct_all_above_*: The percentage of returns above a specified heightbreak.
67 | pct_all_above_mean: The percentage of returns above mean_z.
68 |
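If only one or two of these are needed, the underlying helpers in the ``pyfor.metrics`` module can also be called directly on a `Grid` (a sketch, continuing the gridded example above; each call returns a `Raster`):

.. code-block:: python

    from pyfor import metrics

    p95 = metrics.grid_percentile(grid, 95)  # 95th percentile of z in each cell
    max_z = metrics.z_max(grid)              # maximum z in each cell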
69 |
--------------------------------------------------------------------------------
/docs/topics/normalization.rst:
--------------------------------------------------------------------------------
1 | Normalization
2 | =============
3 |
4 | One of the most integral parts of LiDAR analysis is determining which points represent
5 | the ground. Once this step is completed, we can construct bare earth models (BEMs) and
6 | normalize our point clouds to produce reliable estimates of height, canopy height models
7 | and other products.
8 |
9 | `pyfor` offers a few avenues for normalization, ground filtering and the creation of bare earth
10 | models. All of these methods are covered in the advanced Ground Filtering document. Here, only
11 | the convenience wrapper is covered.
12 |
13 | The convenience wrapper `Cloud.normalize` is a function that filters the cloud object for ground
14 | points, creates a bare earth model, and uses this bare earth model to normalize the object in
15 | place. That is, **it conducts the entire normalization process from the raw data and does not
16 | leverage existing classified ground points**. See the advanced Ground Filtering document to use
17 | existing classified ground points.
18 |
19 | It uses the `Zhang2003` filter by default and takes as its first argument the resolution
20 | of the bare earth model:
21 |
22 | .. code-block:: python
23 |
24 | import pyfor
25 | tile = pyfor.cloud.Cloud('my_tile.las')
26 | tile.normalize(1)
27 |
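After normalization, ground returns should sit at or near an elevation of zero. A quick sanity check (a sketch, continuing the example above):

.. code-block:: python

    # The minimum z of a normalized cloud should be approximately 0
    tile.data.points["z"].min()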
28 |
--------------------------------------------------------------------------------
/docs/update_docs.bat:
--------------------------------------------------------------------------------
1 | sphinx-apidoc -f -o .\api ..\pyfor --separate
2 | sphinx-build . .\html
--------------------------------------------------------------------------------
/docs/update_docs.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | # This script autoregenerates pyfor's documentation for updates to master.
3 |
4 | DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"
5 | sphinx-apidoc -f -o api/ $DIR/../pyfor --separate
6 | cd $DIR
7 | make html
8 | cd -
9 |
--------------------------------------------------------------------------------
/environment.yml:
--------------------------------------------------------------------------------
1 | name: pyfor_env
2 |
3 | dependencies:
4 | - python=3.7
5 | - gdal
6 | - rasterio
7 | - conda-forge::pyproj
8 | - numpy
9 | - pandas
10 | - scipy
11 | - scikit-image
12 | - scikit-learn
13 | - pyqtgraph
14 | - matplotlib
15 | - pyopengl
16 | - numba
17 | - libkml
18 | - geopandas
19 | - joblib
20 | - poppler
21 | - conda-forge::lastools
22 | - pip:
23 | - laspy
24 | - laxpy
25 | - sphinx_rtd_theme
26 | - python-coveralls
27 | - plyfile
28 |
--------------------------------------------------------------------------------
/pyfor/__init__.py:
--------------------------------------------------------------------------------
1 | from __future__ import absolute_import
2 |
3 | __version__ = "0.3.6"
4 |
5 | from pyfor import cloud
6 | from pyfor import rasterizer
7 | from pyfor import gisexport
8 | from pyfor import clip
9 | from pyfor import ground_filter
10 | from pyfor import collection
11 | from pyfor import voxelizer
12 | from pyfor import metrics
13 |
--------------------------------------------------------------------------------
/pyfor/clip.py:
--------------------------------------------------------------------------------
1 | import json
2 | import numpy as np
3 | from numba import vectorize, bool_, float64
4 |
5 | # These are the lower level clipping functions.
6 |
7 |
8 | def square_clip(points, bounds):
9 | """
10 | Clips points to an axis-aligned square described by a bounding tuple.
11 |
12 | :param points: A pandas DataFrame of points with at least 'x' and 'y' columns.
13 | :param bounds: A tuple of length 4: the (min x, min y, max x, max y) coordinates of the square.
14 | :return: A boolean mask; true means the point is within the square, false means it is outside.
15 | """
16 |
17 | # Extract x and y coordinates from the points
18 | xy = points[["x", "y"]]
19 |
20 | # Create masks for each axis
21 | x_in = (xy["x"] >= bounds[0]) & (xy["x"] <= bounds[2])
22 | y_in = (xy["y"] >= bounds[1]) & (xy["y"] <= bounds[3])
23 | stack = np.stack((x_in, y_in), axis=1)
24 | in_clip = np.all(stack, axis=1)
25 |
26 | return in_clip
27 |
28 |
29 | def ray_trace(x, y, poly):
30 | """
31 | Determines for some set of x and y coordinates, which of those coordinates is within `poly`. Ray trace is \
32 | generally called as an internal function, see :func:`.poly_clip`
33 |
34 | :param x: A 1D numpy array of x coordinates.
35 | :param y: A 1D numpy array of y coordinates.
36 | :param poly: The coordinates of a polygon as a numpy array (e.g. from geo_json['coordinates']).
37 | :return: A 1D boolean numpy array, true values are those points that are within `poly`.
38 | """
39 |
40 | @vectorize([bool_(float64, float64)])
41 | def ray(x, y):
42 | # Even-odd rule ray casting: a point is inside if a horizontal ray from (x, y) crosses the polygon boundary an odd number of times
43 | n = len(poly)
44 | inside = False
45 | p2x = 0.0
46 | p2y = 0.0
47 | xints = 0.0
48 | p1x, p1y = poly[0]
49 | for i in range(n + 1):
50 | p2x, p2y = poly[i % n]
51 | if y > min(p1y, p2y):
52 | if y <= max(p1y, p2y):
53 | if x <= max(p1x, p2x):
54 | if p1y != p2y:
55 | xints = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
56 | if p1x == p2x or x <= xints:
57 | inside = not inside
58 | p1x, p1y = p2x, p2y
59 | return inside
60 |
61 | return ray(x, y)
62 |
63 |
64 | def poly_clip(points, poly):
65 | """
66 | Returns the indices of `points` that are within a given polygon. This differs from :func:`.ray_trace` \
67 | in that it enforces a small "pre-clip" optimization by first clipping to the polygon bounding box. This function \
68 | is directly called by :meth:`.Cloud.clip`.
69 |
70 | :param points: A pandas DataFrame of points with at least 'x' and 'y' columns.
71 | :param poly: A shapely Polygon, with coordinates in the same CRS as the point cloud.
72 | :return: A 1D numpy array of indices corresponding to points within the given polygon.
73 | """
74 | # Clip to bounding box
75 | bbox = poly.bounds
76 | pre_clip_mask = square_clip(points, bbox)
77 | pre_clip = points[["x", "y"]].iloc[pre_clip_mask].values
78 |
79 | # Store old indices
80 | pre_clip_inds = np.where(pre_clip_mask)[0]
81 |
82 | # Clip the preclip
83 | poly_coords = np.stack(
84 | (poly.exterior.coords.xy[0], poly.exterior.coords.xy[1]), axis=1
85 | )
86 |
87 | full_clip_mask = ray_trace(pre_clip[:, 0], pre_clip[:, 1], poly_coords)
88 | clipped = pre_clip_inds[full_clip_mask]
89 |
90 | return clipped
91 |
--------------------------------------------------------------------------------
/pyfor/gisexport.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import rasterio
3 |
4 | # This module holds internal functions for GIS processing.
5 |
6 | def project_indices(indices, raster):
7 | """
8 | Converts indices of an array (for example, those indices that describe the location of a local maxima) to the
9 | same space as the input cloud object.
10 |
11 | :param indices: The indices to project, an Nx2 matrix of indices where the first column are the rows (Y) and
12 | the second column is the columns (X)
13 | :param raster: An object of type pyfor.rasterizer.Raster
14 | :return: An N x 2 numpy array of projected x and y coordinates.
15 | """
16 |
17 | seed_xy = (
18 | indices[:, 1] + (raster._affine[2] / raster._affine[0]),
19 | indices[:, 0]
20 | + (
21 | raster._affine[5]
22 | - (raster.grid.cloud.data.max[1] - raster.grid.cloud.data.min[1])
23 | / abs(raster._affine[4])
24 | ),
25 | )
26 | seed_xy = np.stack(seed_xy, axis=1)
27 | return seed_xy
28 |
29 |
30 | def array_to_raster(array, affine, crs, path):
31 | """Writes a GeoTIFF raster from a numpy array.
32 |
33 | :param array: 2D numpy array of cell values
34 | :param affine: The affine transformation.
35 | :param crs: A rasterio-compatible coordinate reference (e.g. a proj4 string)
36 | :param path: The output path of the GeoTIFF
37 | """
38 | # Write a single band GeoTIFF using the provided affine transform and CRS
40 | out_dataset = rasterio.open(
41 | path,
42 | "w",
43 | driver="GTiff",
44 | height=array.shape[0],
45 | width=array.shape[1],
46 | count=1,
47 | dtype=str(array.dtype),
48 | crs=crs,
49 | transform=affine,
50 | )
51 | out_dataset.write(array, 1)
52 | out_dataset.close()
53 |
54 |
--------------------------------------------------------------------------------
/pyfor/metrics.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import pandas as pd
3 | import pyfor.rasterizer
4 |
5 | all_pct = (1, 5, 10, 20, 25, 30, 40, 50, 60, 70, 75, 80, 90, 95, 99)
6 |
7 |
8 | def summarize_return_num(return_nums):
9 | """
10 | Gets the number of returns by return number.
11 |
12 | :param return_nums: A :class:`pandas.Series` of return number that describes the return number of each point.
13 | :return: A :class:`pandas.Series` of return number counts by return number.
14 | """
15 | return return_nums.groupby(return_nums).agg("count")
16 |
17 |
18 | def summarize_percentiles(z, pct=all_pct):
19 | """
20 | :param z: A :class:`pandas.Series` of z values.
21 | """
22 | return (np.percentile(z, pct), pct)
23 |
24 |
25 | def pct_above_heightbreak(grid, r=0, heightbreak="mean"):
26 | """
27 | Calculates the percentage of returns, optionally restricted to a given return number, above a fixed heightbreak or above the cell mean. This needs its own function because it summarizes
28 | multiple columns of the point cloud, and is therefore more complex than typical summarizations
29 | (i.e. percentiles). This returns a `pyfor.rasterizer.Raster` object.
30 |
31 | :param grid: A `pyfor.rasterizer.Grid` object
32 | :param r: The return number to constrain to. Must be a positive integer. If r=0, all points will be considered
33 | (this is the default behavior).
34 | :param heightbreak: The height at which to summarize. If a number is given, this will be interpreted as the height
35 | at which points will be considered "above". If the string "mean" is given (this is the default), will use the mean
36 | height of that cell, for example, to construct the "pct_above_mean" metric.
37 | """
38 |
39 | if heightbreak == "mean":
40 | # Compute mean z in each cell
41 | mean_z = grid.cells.agg({"z": np.mean})
42 | mean_z = mean_z.rename(columns={"z": "mean_z"})
43 | mean_z = pd.merge(grid.cloud.data.points, mean_z, on=["bins_x", "bins_y"])[
44 | "mean_z"
45 | ]
46 | is_above = grid.cloud.data.points["z"] > mean_z
47 | else:
48 | is_above = grid.cloud.data.points["z"] > heightbreak
49 |
50 | if r > 0:
51 | out_col = "pct_r{}_above_{}".format(r, heightbreak)
52 | grid.cloud.data.points["is_r"] = grid.cloud.data.points["return_num"] == r
53 | grid.cloud.data.points["is_r_above"] = grid.cloud.data.points["is_r"] & is_above
54 | cells = grid.cloud.grid(grid.cell_size).cells
55 |
56 | summary = cells.agg({"is_r": np.sum, "is_r_above": np.sum}).reset_index()
58 | summary[out_col] = summary["is_r_above"] / summary["is_r"]
59 |
60 | else:
61 | out_col = "pct_all_above_{}".format(heightbreak)
62 | grid.cloud.data.points["is_above"] = is_above
63 | cells = grid.cloud.grid(grid.cell_size).cells
64 | summary = cells.agg({"x": "count", "is_above": np.sum}).reset_index()
65 | summary[out_col] = summary["is_above"] / summary["x"]
66 |
67 | array = np.full((grid.m, grid.n), np.nan)
68 | array[summary["bins_y"], summary["bins_x"]] = summary[out_col]
69 | return pyfor.rasterizer.Raster(array, grid)
70 |
71 |
72 | def grid_percentile(grid, percentile):
73 | """
74 | Calculates a percentile raster.
75 | :param percentile: The percentile (a number between 0 and 100) to compute.
76 | """
77 | return grid.raster(lambda z: np.percentile(z, percentile), "z")
78 |
79 |
80 | def z_max(grid):
81 | """
82 | Calculates maximum z value.
83 | """
84 |
85 | return grid.raster(np.max, "z")
86 |
87 |
88 | def z_min(grid):
89 | """
90 | Calculates minimum z value.
91 | """
92 |
93 | return grid.raster(np.min, "z")
94 |
95 |
96 | def z_std(grid):
97 | """
98 | Calculates standard deviation of z value.
99 | """
100 |
101 | return grid.raster(np.std, "z")
102 |
103 |
104 | def z_var(grid):
105 | """
106 | Calculates variance of z value.
107 | """
108 |
109 | return grid.raster(np.var, "z")
110 |
111 |
112 | def z_mean(grid):
113 | """
114 | Calculates mean of z value.
115 | """
116 |
117 | return grid.raster(np.mean, "z")
118 |
119 |
120 | def z_iqr(grid):
121 | """
122 | Calculates interquartile range of z value.
123 | """
124 |
125 | return grid.raster(lambda z: np.percentile(z, 75) - np.percentile(z, 25), "z")
126 |
127 |
128 | def vol_cov(grid, r, heightbreak):
129 | """
130 | Calculates the volume covariate (percentage of returns of return number `r` above `heightbreak`, multiplied by mean z)
131 | """
132 |
133 | pct_r_above_hb = pct_above_heightbreak(grid, r, heightbreak)
134 | mean_z = grid.raster(np.mean, "z")
135 | # Overwrite the percentage array in place (to save memory)
136 | pct_r_above_hb.array = pct_r_above_hb.array * mean_z.array
137 |
138 | return pct_r_above_hb
139 |
140 |
141 | def z_mean_sq(grid):
142 | """
143 | Calculates the square of the mean z value.
144 | """
145 |
146 | rast = z_mean(grid)
147 | rast.array = rast.array ** 2
148 | return rast
149 |
150 |
151 | def canopy_relief_ratio(grid, mean_z_arr, min_z_arr, max_z_arr):
152 | crr_arr = (mean_z_arr - min_z_arr) / (max_z_arr - min_z_arr)
153 | crr_rast = pyfor.rasterizer.Raster(crr_arr, grid)
154 | return crr_rast
155 |
156 |
157 | def return_num(grid, num):
158 | """Compute the number of returns that match `num` for a grid object"""
159 | counts = grid.cells["return_num"].value_counts()
160 | counts = pd.DataFrame(counts)
161 | counts = counts.rename(columns={"return_num": "occurrences"})
162 | counts = counts.reset_index()
163 | counts = counts.loc[counts["return_num"] == num, :]
164 |
165 | array = np.full((grid.m, grid.n), np.nan)
166 | array[counts["bins_y"], counts["bins_x"]] = counts["occurrences"]
167 |
168 | return pyfor.rasterizer.Raster(array, grid)
169 |
170 |
171 | def all_returns(grid):
172 | return grid.raster("count", "z")
173 |
174 |
175 | def total_returns(grid):
176 | counts = grid.cells["x"].count()
177 | counts = counts.reset_index()
178 | counts = counts.rename(columns={"x": "num"})
179 |
180 | array = np.full((grid.m, grid.n), np.nan)
181 | array[counts["bins_y"], counts["bins_x"]] = counts["num"]
182 |
183 | return pyfor.rasterizer.Raster(array, grid)
184 |
185 |
186 | def standard_metrics_grid(grid, heightbreak):
187 | metrics_dict = {}
188 | metrics_dict["max_z"] = z_max(grid)
189 | metrics_dict["min_z"] = z_min(grid)
190 | metrics_dict["mean_z"] = z_mean(grid)
191 | metrics_dict["stddev_z"] = z_std(grid)
192 | metrics_dict["var_z"] = z_var(grid)
193 | metrics_dict["canopy_relief_ratio"] = canopy_relief_ratio(
194 | grid,
195 | metrics_dict["mean_z"].array,
196 | metrics_dict["min_z"].array,
197 | metrics_dict["max_z"].array,
198 | )
199 |
200 | for pct in all_pct:
201 | metrics_dict["p_" + str(pct)] = grid_percentile(grid, pct)
202 |
203 | metrics_dict["pct_r_1_above_{}".format(heightbreak)] = pct_above_heightbreak(
204 | grid, 1, heightbreak
205 | )
206 | metrics_dict["pct_r_1_above_mean"] = pct_above_heightbreak(
207 | grid, 1, "mean"
208 | )
209 | metrics_dict["pct_all_above_{}".format(heightbreak)] = pct_above_heightbreak(
210 | grid, 0, heightbreak
211 | )
212 | metrics_dict["pct_all_above_mean"] = pct_above_heightbreak(
213 | grid, 0, "mean"
214 | )
215 | return metrics_dict
216 |
217 |
218 | def standard_metrics_cloud(points, heightbreak):
219 | metrics = pd.DataFrame()
220 |
221 | # Some values used multiple times
222 | mean_z = np.mean(points.z)
223 |
224 | metrics["total_returns"] = [len(points)]
225 |
226 | # Get number of returns by return number
227 | for i, num in enumerate(summarize_return_num(points.return_num)):
228 | metrics["r_{}".format(i + 1)] = [num]
229 |
230 | metrics["max_z"] = [np.max(points.z)]
231 | metrics["min_z"] = [np.min(points.z)]
232 | metrics["mean_z"] = [mean_z]
233 | metrics["median_z"] = [np.median(points.z)]
234 | metrics["stddev_z"] = [np.std(points.z)]
235 | metrics["var_z"] = [np.var(points.z)]
236 |
237 | for pct_z, pct in zip(*summarize_percentiles(points.z)):
238 | metrics["p_{}".format(pct)] = [pct_z]
239 |
240 | # "Cover metrics"
241 | metrics["canopy_relief_ratio"] = (metrics["mean_z"] - metrics["min_z"]) / (
242 | metrics["max_z"] - metrics["min_z"]
243 | )
244 | metrics["pct_r_1_above_{}".format(heightbreak)] = (
245 | np.sum((points["return_num"] == 1) & (points["z"] > heightbreak))
246 | / metrics["r_1"]
247 | )
248 | metrics["pct_r_1_above_mean"] = (
249 | np.sum((points["return_num"] == 1) & (points["z"] > mean_z)) / metrics["r_1"]
250 | )
251 | metrics["pct_all_above_{}".format(heightbreak)] = (
252 | np.sum(points["z"] > heightbreak) / metrics["total_returns"]
253 | )
254 | metrics["pct_all_above_mean"] = (
255 | np.sum(points["z"] > mean_z) / metrics["total_returns"]
256 | )
257 |
258 | return metrics
259 |
--------------------------------------------------------------------------------
/pyfor/voxelizer.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 |
4 | class VoxelGrid:
5 | """A 3 dimensional grid representation of a point cloud. This is analogous to the rasterizer.Grid class, but
6 | with three axes instead of two. VoxelGrids are generally used to produce VoxelRaster objects."""
7 |
8 | def __init__(self, cloud, cell_size):
9 | self.cell_size = cell_size
10 | self.cloud = cloud
12 |
13 | min_x, max_x = self.cloud.data.min[0], self.cloud.data.max[0]
14 | min_y, max_y = self.cloud.data.min[1], self.cloud.data.max[1]
15 | min_z, max_z = self.cloud.data.min[2], self.cloud.data.max[2]
16 |
17 | self.m = int(np.floor((max_y - min_y) / cell_size))
18 | self.n = int(np.floor((max_x - min_x) / cell_size))
19 | self.p = int(np.floor((max_z - min_z) / cell_size))
20 |
21 | # Create bins
22 | y_edges = np.linspace(min_y, max_y, self.m)
23 | bins_x = np.searchsorted(
24 | np.linspace(min_x, max_x, self.n), self.cloud.data.points["x"]
25 | )
26 | bins_y = (
27 | np.searchsorted(
28 | -y_edges,
29 | -self.cloud.data.points["y"],
30 | side="right",
31 | sorter=(-y_edges).argsort(),
32 | )
33 | - 1
34 | )
35 | bins_z = np.searchsorted(
36 | np.linspace(min_z, max_z, self.p), self.cloud.data.points["z"]
37 | )
38 |
39 | self.data = self.cloud.data.points
40 | self.data["bins_x"] = bins_x
41 | self.data["bins_y"] = bins_y
42 | self.data["bins_z"] = bins_z
43 |
44 | self.cells = self.data.groupby(["bins_x", "bins_y", "bins_z"])
45 |
46 | def voxel_raster(self, func, dim):
47 | """Creates a 3 dimensional voxel raster, analogous to rasterizer.Grid.raster.
48 |
49 | :param func: The function to summarize within each voxel.
50 | :param dim: The dimension upon which to summarize (i.e. "z", "intensity", etc.)
51 | """
52 | voxel_grid = np.zeros((self.m, self.n, self.p))
53 | cells = (
54 | self.data.groupby(["bins_x", "bins_y", "bins_z"])
55 | .agg({dim: func})
56 | .reset_index()
57 | )
58 |
59 | # Set the values of the grid
60 | voxel_grid[cells["bins_x"], cells["bins_y"], cells["bins_z"]] = cells[dim]
61 |
62 | return voxel_grid
63 |
--------------------------------------------------------------------------------
/pyfortest/__init__.py:
--------------------------------------------------------------------------------
1 | from pyfortest import test_pyfor
2 | from pyfortest import test_collection
3 |
--------------------------------------------------------------------------------
/pyfortest/data/chm.tif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/chm.tif
--------------------------------------------------------------------------------
/pyfortest/data/clip.dbf:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/pyfortest/data/clip.prj:
--------------------------------------------------------------------------------
1 | PROJCS["WGS_1984_UTM_Zone_10N",GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137,298.257223563]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",0],PARAMETER["central_meridian",-123],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["Meter",1]]
--------------------------------------------------------------------------------
/pyfortest/data/clip.qpj:
--------------------------------------------------------------------------------
1 | PROJCS["WGS 84 / UTM zone 10N",GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",0],PARAMETER["central_meridian",-123],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","32610"]]
2 |
--------------------------------------------------------------------------------
/pyfortest/data/clip.shp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/clip.shp
--------------------------------------------------------------------------------
/pyfortest/data/clip.shx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/clip.shx
--------------------------------------------------------------------------------
/pyfortest/data/mock_collection/0.las:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/mock_collection/0.las
--------------------------------------------------------------------------------
/pyfortest/data/mock_collection/1.las:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/mock_collection/1.las
--------------------------------------------------------------------------------
/pyfortest/data/mock_collection/2.las:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/mock_collection/2.las
--------------------------------------------------------------------------------
/pyfortest/data/mock_collection/3.las:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/mock_collection/3.las
--------------------------------------------------------------------------------
/pyfortest/data/test.las:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/test.las
--------------------------------------------------------------------------------
/pyfortest/data/test.laz:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/test.laz
--------------------------------------------------------------------------------
/pyfortest/data/test.ply:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/brycefrank/pyfor/efd34bcd7440358abea770c1bf5cad5e05a6fbe3/pyfortest/data/test.ply
--------------------------------------------------------------------------------
/pyfortest/test_collection.py:
--------------------------------------------------------------------------------
1 | from pyfor import *
2 | import unittest
3 | import os
4 |
5 | data_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), "data")
6 | proj4str = "+proj=utm +zone=10 +ellps=GRS80 +datum=NAD83 +units=m +no_defs"
7 |
8 |
9 | def test_buffered_func(pc, tile):
10 | pass
11 |
12 |
13 | def test_byfile_func(las_path):
14 | pass
15 |
16 | def test_func_arg(pc, tile, args):
17 | return args["test_arg"]
18 |
19 |
20 | def make_test_collection():
21 | """
22 | Splits the testing tile into 4 tiles to use for testing
23 | :return:
24 | """
25 |
26 | pc = cloud.Cloud(os.path.join(data_dir, "test.las"))
27 |
28 | # Sample to only 1000 points for speed
29 | pc.data.points = pc.data.points.sample(1000, random_state=12)
30 |
31 | tr = pc.data.points[
32 | (pc.data.points["x"] > 405100) & (pc.data.points["y"] > 3276400)
33 | ]
34 | tl = pc.data.points[
35 | (pc.data.points["x"] < 405100) & (pc.data.points["y"] > 3276400)
36 | ]
37 | br = pc.data.points[
38 | (pc.data.points["x"] > 405100) & (pc.data.points["y"] < 3276400)
39 | ]
40 | bl = pc.data.points[
41 | (pc.data.points["x"] < 405100) & (pc.data.points["y"] < 3276400)
42 | ]
43 |
44 | quadrants = [tr, tl, br, bl]
45 |
46 | for i, points in enumerate(quadrants):
47 | out = cloud.LASData(points, pc.data.header)
48 | out.write(os.path.join(data_dir, "mock_collection", "{}.las".format(i)))
49 |
50 | pc.data.header.reader.close()
51 |
52 |
53 | class CollectionTestCase(unittest.TestCase):
54 | def setUp(self):
55 | self.test_col = collection.from_dir(os.path.join(data_dir, "mock_collection"))
56 | self.test_col_path = os.path.join(data_dir, "mock_collection")
57 |
58 | def test_create_index(self):
59 | self.test_col.create_index()
60 | lax_paths = [
61 | os.path.join(self.test_col_path, lax_file)
62 | for lax_file in os.listdir(self.test_col_path)
63 | if lax_file.endswith(".lax")
64 | ]
65 | self.assertEqual(len(lax_paths), 4)
66 |
67 | def test_retile_raster(self):
68 | self.test_col.retile_raster(10, 50, buffer=10)
69 | self.assertEqual(len(self.test_col.tiles), 16)
70 | self.test_col.reset_tiles()
71 |
72 | def test_par_apply_buff_index(self):
73 | self.test_col.create_index()
74 | self.test_col.retile_raster(10, 50, buffer=10)
75 | self.test_col.par_apply(test_buffered_func, indexed=True)
76 |
77 | def test_par_apply_buff_noindex(self):
78 | self.test_col.par_apply(test_buffered_func, indexed=False)
79 |
80 | def test_par_apply_by_file(self):
81 | self.test_col.par_apply(test_byfile_func, by_file=True)
82 |
83 | def test_par_apply_arg(self):
84 | self.test_col.par_apply(test_func_arg, indexed=False, args={"test_arg": 3})
85 |
86 | def test_datetimes_retrieval(self):
87 | timestamp = self.test_col['datetime'].iloc[0]
88 | timestamp = timestamp.to_pydatetime()
89 | self.assertEqual(timestamp.year, 2017)
90 |
91 | def tearDown(self):
92 | # Delete any .lax files
93 | lax_paths = [
94 | os.path.join(self.test_col_path, lax_file)
95 | for lax_file in os.listdir(self.test_col_path)
96 | if lax_file.endswith(".lax")
97 | ]
98 | for lax_path in lax_paths:
99 | os.remove(lax_path)
100 |
--------------------------------------------------------------------------------
/setup.cfg:
--------------------------------------------------------------------------------
1 | [metadata]
2 | description-file = README.md
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | from setuptools import setup, find_packages, Extension
2 |
3 | setup(
4 | name="pyfor",
5 | version="0.3.6",
6 | author="Bryce Frank",
7 | author_email="bfrank70@gmail.com",
8 | packages=["pyfor", "pyfortest"],
9 | url="https://github.com/brycefrank/pyfor",
10 | license="LICENSE.txt",
11 | description="Tools for forest resource point cloud analysis.",
12 | install_requires=["laspy", "laxpy", "python-coveralls"], # Dependencies from pip
13 | )
14 |
--------------------------------------------------------------------------------
/test_environment.yml:
--------------------------------------------------------------------------------
1 | name: pyfor_env
2 |
3 | dependencies:
4 | - python=3.7
5 | - gdal
6 | - rasterio
7 | - conda-forge::pyproj
8 | - numpy
9 | - pandas
10 | - scipy
11 | - scikit-image
12 | - scikit-learn
13 | - pyqtgraph
14 | - matplotlib
15 | - pyopengl
16 | - numba
17 | - libkml
18 | - geopandas
19 | - joblib
20 | - poppler
21 | - conda-forge::lastools
22 | - pip:
23 | - laspy
24 | - laxpy
25 | - sphinx_rtd_theme
26 | - python-coveralls
27 | - plyfile
28 |
--------------------------------------------------------------------------------