├── .gitignore ├── Docs ├── Backgrounder.md ├── Helper_functions.md ├── Multipart_shapes.md ├── Scripts_desc.md └── npGeomTools.md ├── README.md ├── Voronoi2.png ├── _config.yml ├── _layouts └── default.html ├── arcpro_npg ├── README.md ├── images │ ├── FreeTools.png │ ├── README.md │ ├── circles.png │ ├── dissolve_sq2_0.png │ ├── dissolve_sq2_1.png │ └── sq.png └── npg │ ├── README.md │ ├── __init__.py │ ├── _npg_samples_.py │ ├── npGeom_32.atbx │ ├── npg │ ├── README.md │ ├── __init__.py │ ├── doc_t.py │ ├── extras │ │ ├── README.md │ │ ├── hulls.py │ │ └── npgCondaPkgs.py │ ├── npGeo.py │ ├── npgDocs.py │ ├── npg_analysis.py │ ├── npg_arc_npg.py │ ├── npg_bool_hlp.py │ ├── npg_bool_ops.py │ ├── npg_buffer.py │ ├── npg_clip_split.py │ ├── npg_create.py │ ├── npg_geom_hlp.py │ ├── npg_geom_ops.py │ ├── npg_helpers.py │ ├── npg_io.py │ ├── npg_maths.py │ ├── npg_min_circ.py │ ├── npg_overlay.py │ ├── npg_pip.py │ ├── npg_plots.py │ ├── npg_prn.py │ ├── npg_setops.py │ ├── npg_table.py │ ├── npg_utils.py │ ├── old │ │ ├── README.md │ │ ├── npg_boolean.py │ │ ├── npg_clip_lastgood.py │ │ ├── npg_erase.py │ │ ├── npg_geom.py │ │ ├── npg_helpers.py │ │ ├── npg_pgon.py │ │ └── npg_split_lastgood.py │ └── testing.py │ ├── npg_tools.tbx │ └── tbx_tools.py ├── assets └── css │ └── style.scss └── images ├── 1_README.md ├── Shape2.png ├── Voronoi2.png ├── bad_shape.png ├── circles.png ├── clones2.png ├── containers.png ├── double_cross_b3c1.png ├── double_cross_b4c1.png ├── extent_to_poly02.png ├── hexagons.png ├── npGeo.png ├── npGeo_conversion_tools.png ├── npGeomTools.png ├── npGeom_containers_tools0.png ├── npg_arc_npg.png ├── npg_create.png ├── npg_io.png ├── single_cross_c2CC.png ├── single_cross_s00_t0.png ├── sq.png └── sq2.png /.gitignore: -------------------------------------------------------------------------------- 1 | # gitignore 2 | # temporary/working/backup files # 3 | *.bak 4 | *.md 5 | *.yml 6 | .project 7 | 8 | # Compiled source examples from numpy# 9 | 
###################################### 10 | *.com 11 | *.class 12 | *.dll 13 | *.exe 14 | *.o 15 | *.o.d 16 | *.py[ocd] 17 | *.so 18 | 19 | # Things specific to this project # 20 | ################################# 21 | assets/css 22 | Docs 23 | images 24 | arcpro_npg/images 25 | arcpro_npg/npg/npg 26 | _layouts 27 | -------------------------------------------------------------------------------- /Docs/Helper_functions.md: -------------------------------------------------------------------------------- 1 | # Geo array helpers 2 | 3 | **Properties and methods** 4 | 5 | 6 | |Info | info | geo_info | structure| 7 | | --- | ---- | -------- | -------- | 8 | 9 | There are several levels of information that can be acquired for Geo arrays. 10 | 11 | The most basic is attached to the array itself as the proper case **Info** property. 12 | ```python 13 | 14 | g.Info 15 | 'rolled' 16 | ``` 17 | 18 | A fuller description can be derived using the lowercase **info** property. The following is returned: 19 | - extent 20 | - number of shapes 21 | - number of parts 22 | - number of points 23 | - textual presentation of the Geo array's ``IFT`` information. 24 | 25 | ```python 26 | 27 | g.info 28 | -------------- 29 | Extents : 30 | LL [ 300000.00 5000000.00] 31 | UR [ 300012.00 5000015.00] 32 | Shapes : 7 33 | Parts : 10 34 | Points : 55 35 | Sp Ref : NAD 1983 CSRS MTM 9 36 | 37 | | OID_ Fr_pnt To_pnt CW_CCW Part_ID Bit_ID 38 | ---------------------------------------------------------------- 39 | 000 1 0 7 1 1 0 40 | 001 1 7 11 0 1 1 41 | 002 2 11 19 1 1 0 42 | 003 3 19 25 1 1 0 43 | 004 4 25 30 1 1 0 44 | 005 5 30 36 1 1 0 45 | 006 5 36 40 0 1 1 46 | 007 6 40 46 1 1 0 47 | 008 6 46 50 0 1 1 48 | 009 9 50 55 1 1 0 49 | ``` 50 | 51 | The similarities and differences between the Geo array and the base ndarray are described by the ``geo_info`` method.
52 | 53 | All common properties can be determined using ``npg.dirr`` while the output from ``npg.geo_info`` is subdivided into the properties 54 | and methods specific to a geo array, their base and special properties. 55 | 56 | ```python 57 | 58 | g.geo_info() 59 | 60 | geo_info(geo_array) 61 | Geo methods and properties. 62 | Bit, CW, FT, Fr, H, IDs, IFT, IFT_str, IP, Info, K, LL, N, PID, SR, SVG, To, U, 63 | UR, X, XT, XY, Y, Z, __author__, __dict__, __module__, __name__, aoi_extent, 64 | aoi_rectangle, areas, as_arrays, as_lists, bit_IFT, bit_ids, bit_pnt_cnt, 65 | bit_seq, bits, boundary, bounding_circles, centers, centroids, change_indices, 66 | close_polylines, common_segments, convex_hulls, densify_by_distance, 67 | densify_by_factor, densify_by_percent, dupl_pnts, extent_centers, 68 | extent_corner, extent_pnts, extent_rectangles, extents, fill_holes, first_bit, 69 | first_part, fr_to_pnts, geo_info, geom_check, get_shapes, holes_to_shape, info, 70 | inner_IFT, inner_rings, is_clockwise, is_convex, is_in, is_multipart, 71 | is_multipart_report, lengths, maxs, means, min_area_rect, mins, moveto, 72 | multipart_to_singlepart, od_pairs, outer_IFT, outer_rings, part_IFT, part_ids, 73 | parts, pnt_counts, pnt_ids, pnt_indices, pnt_on_poly, polygon_angles, 74 | polygons_to_polylines, polyline_angles, polylines_to_polygons, polys_to_points, 75 | prn, prn_obj, radial_sort, roll_shapes, rotate, segment_angles, 76 | segment_pnt_ids, segment_polys, shapes, shift, shp_IFT, shp_ids, shp_part_cnt, 77 | shp_pnt_cnt, shp_pnt_ids, sort_by_area, sort_by_extent, sort_by_length, 78 | sort_coords, split_by, structure, svg, to_segments, translate, triangulate, 79 | uniq_pnts, unique_segments, xy_id 80 | ``` 81 | Another useful method for documentation purposes is ``structure``. It is largely a look at the IFT in verbose format. 
82 | 83 | ```python 84 | 85 | g.structure() # provide structural information and the IFT in verbose format 86 | 87 | Geo array structure 88 | ------------------- 89 | OID_ : self.Id shape id 90 | Fr_pnt : self.Fr from point id 91 | To_pnt : self.To to point id for a shape 92 | CW_CCW : self.CW outer (1) inner/hole (0) 93 | Part_ID : self.PID part id for each shape 94 | Bit_ID : self.Bit sequence order of each part in a shape 95 | ---- 96 | 97 | | OID_ Fr_pnt To_pnt CW_CCW Part_ID Bit_ID 98 | ---------------------------------------------------------------- 99 | 000 1 0 7 1 1 0 100 | 001 1 7 11 0 1 1 101 | 002 2 11 19 1 1 0 102 | 003 3 19 25 1 1 0 103 | 004 4 25 30 1 1 0 104 | 005 5 30 36 1 1 0 105 | 006 5 36 40 0 1 1 106 | 007 6 40 46 1 1 0 107 | 008 6 46 50 0 1 1 108 | 009 9 50 55 1 1 0 109 | 110 | ``` 111 | A number of properties and methods can be examined in the code section. 112 | 113 | ```python 114 | Geo base properties. 115 | Bit, CW, FT, Fr, IDs, IFT, IP, Info, K, LL, N, PID, SR, SVG, To, U, UR, X, XT, 116 | XY, Y, Z 117 | 118 | Geo special. 119 | __array_finalize__, __array_wrap__, __doc__, __new__ 120 | ``` 121 | -------------------------------------------------------------------------------- /Docs/Multipart_shapes.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Multipart shapes 4 | 5 | 6 | Some basics 7 | ----------- 8 | Consider the following multipart shapes. The first shape has its second part slightly offset and it also contains a hole. The second shape is a flip/mirror/translate of its first part. 9 | 10 | The centroids of each part are shown on the image. These locations have been confirmed using arcpy and npGeo methods. 11 | 12 | The point coordinates have (300,000 m, 5,000,000 m; MTM 9) subtracted from their values. So the data are in a projected coordinate system and all further measures will be in planar/metric units.
13 | 14 | 15 | 16 | ```python 17 | 18 | npg.prn_geo(g) # ---- Geo array representation of a polygon featureclass 19 | 20 | pnt shape part X Y 21 | -------------------------------- 22 | 000 1 10.00 10.00 first point of outer ring 23 | 001 1 10.00 0.00 24 | 002 1 0.00 0.00 # ---- shape 1: a polygon with 3 holes 25 | 003 1 0.00 10.00 26 | 004 1 10.00 10.00 last point of outer ring 27 | 005 1 -o 3.00 9.00 first point of first inner ring 28 | 006 1 3.00 3.00 29 | 007 1 9.00 3.00 30 | 008 1 9.00 9.00 31 | 009 1 3.00 9.00 last point of first inner ring 32 | 010 1 -o 8.00 8.00 first point of second inner ring 33 | 011 1 8.00 4.00 34 | 012 1 4.00 4.00 35 | 013 1 4.00 8.00 36 | 014 1 8.00 8.00 last point of second inner ring 37 | 015 1 -o 6.00 7.00 first point of third inner ring 38 | 016 1 5.00 5.00 39 | 017 1 7.00 5.00 40 | 018 1 ___ 6.00 7.00 last point of third inner ring AND the end of the first shape 41 | 019 2 -o 12.00 8.00 42 | 020 2 12.00 2.00 43 | 021 2 20.00 2.00 44 | 022 2 20.00 0.00 45 | 023 2 10.00 0.00 46 | 024 2 10.00 10.00 47 | 025 2 14.00 10.00 48 | 026 2 20.00 10.00 49 | 027 2 20.00 8.00 50 | 028 2 12.00 8.00 51 | 029 2 -o 25.00 14.00 # ---- shape 2: a two part polygon without holes 52 | 030 2 25.00 4.00 53 | 031 2 15.00 4.00 54 | 032 2 15.00 6.00 55 | 033 2 23.00 6.00 56 | 034 2 23.00 12.00 57 | 035 2 15.00 12.00 58 | 036 2 15.00 14.00 59 | 037 2 ___ 25.00 14.00 Now... you figure out the rest ;) 60 | 038 3 -o 14.00 10.00 61 | 039 3 10.00 10.00 62 | ... snip 63 | 64 | ``` 65 | 66 | This shape (s2) is simply represented by the last 2 columns, the first 2 columns are solely for printing purposes. 67 | The sequence of points is identified by their Id and From and To points (IFT) 68 | 69 | ```python 70 | 71 | g.IFT 72 | array([[ 1, 0, 5, 1, 1, 0], 73 | [ 1, 5, 10, 0, 1, 1], 74 | [ 1, 10, 15, 1, 2, 0], 75 | [ 1, 15, 19, 0, 2, 1], 76 | [ 2, 19, 29, 1, 1, 0], 77 | [ 2, 29, 38, 1, 2, 0], 78 | [ 3, 38, 42, 1, 1, 0], 79 | ... snip ... 
80 | ], dtype=int64) 81 | 82 | ``` 83 | I added another method to the package to expand upon the IFT information. 84 | 85 | ```python 86 | 87 | g.info # ---- g.info returns extent, shape, part, point and structure information 88 | -------------- 89 | Extents : 90 | LL [ 300000. 5000000.] 91 | UR [ 300036.71 5000033. ] 92 | Shapes : 12 93 | Parts : 16 94 | Points : 145 95 | 96 | ... OID_ From_pnt To_pnt Ring_type Part_ID Bit_seq 97 | -------------------------------------------------------------------- 98 | 000 1 0 5 1 1 0 99 | 001 1 5 10 0 1 1 100 | 002 1 10 15 1 2 0 101 | 003 1 15 19 0 2 1 102 | 004 2 19 29 1 1 0 103 | 005 2 29 38 1 2 0 104 | 006 3 38 42 1 1 0 105 | ... snip ... 106 | ``` 107 | There is an alternate approach... 108 | 109 | ```python 110 | 111 | npg.prn_tbl(g.IFT_str) 112 | 113 | ... OID_ From_pnt To_pnt Ring_type Part_ID Bit_seq 114 | -------------------------------------------------------------------- 115 | 000 1 0 5 1 1 0 116 | 001 1 5 10 0 1 1 117 | 002 1 10 15 1 2 0 118 | 003 1 15 19 0 2 1 119 | 004 2 19 29 1 1 0 120 | 005 2 29 38 1 2 0 121 | 006 3 38 42 1 1 0 122 | ... snip ... 123 | ``` 124 | As shown, **prn_tbl** produces a nicely labelled output from the structured array that can be returned from the **IFT_str** method. 125 | A quick survey shows repetition of the shape ID in the *Part* column. The *Points* for each part are given, from which the *From_ID* and *To_ID* values are derived. 126 | 127 | Either approach allows you to quickly look back at the geometry structure. 128 | 129 | 130 | ---- 131 | The methods and functions that will be shown use this information in their processing. In this fashion, it is possible to try and optimize the derivation of properties and the application of functions by working on either the whole point sequence or its subgroupings. 132 | 133 | This will obviously not be possible in all situations, but every bit helps.
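The gist of using the IFT this way can be sketched with plain numpy. The coordinates and index values below are illustrative (a toy one-shape, two-ring example, not the arrays above), and ``bits`` is a simplified stand-in, not the Geo class machinery:

```python
import numpy as np

# A flat (N, 2) coordinate array holding one shape: an outer ring
# followed by a hole.  Values are illustrative only.
xy = np.array([[0., 0.], [0., 10.], [10., 10.], [10., 0.], [0., 0.],
               [3., 3.], [7., 3.], [7., 7.], [3., 7.], [3., 3.]])

# Companion IFT-style index: OID_, Fr_pnt, To_pnt, CW_CCW, Part_ID, Bit_ID.
IFT = np.array([[1, 0, 5, 1, 1, 0],
                [1, 5, 10, 0, 1, 1]])

def bits(xy, IFT):
    """Return each ring as its own (n, 2) slice of the flat array."""
    return [xy[f:t] for i, f, t, cw, pid, bid in IFT]

rings = bits(xy, IFT)        # per-ring processing via slicing ...
shifted = xy + [2.0, 3.0]    # ... or one vectorized op on every point at once
```

Per-ring work (areas, hulls) uses the slices; whole-array work (shifting, extents) ignores them entirely.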
134 | 135 | 136 | ---- 137 | 138 | ndarray values from esri geometry 139 | --------------------------------- 140 | 141 | arcpy geometry 142 | ------------------ 143 | 144 | This is what the geometry looks like for the first shape (multipart with holes). 145 | 146 | ```python 147 | 148 | p0 149 | <Polygon object at ...> 150 | 151 | p0[:2] # ---- two parts, so slice 152 | [<Array [<Point (300010.0, 5000020.0, #, #)>, <Point (300010.0, 5000010.0, #, #)>, <Point (300000.0, 5000010.0, #, #)>, 153 | <Point (300000.0, 5000020.0, #, #)>, <Point (300010.0, 5000020.0, #, #)>, 154 | None, 155 | <Point (300003.0, 5000019.0, #, #)>, <Point (300003.0, 5000013.0, #, #)>, <Point (300009.0, 5000013.0, #, #)>, 156 | <Point (300009.0, 5000019.0, #, #)>, <Point (300003.0, 5000019.0, #, #)>]>, 157 | <Array [<Point (300008.0, 5000018.0, #, #)>, <Point (300008.0, 5000014.0, #, #)>, <Point (300004.0, 5000014.0, #, #)>, 158 | <Point (300004.0, 5000018.0, #, #)>, <Point (300008.0, 5000018.0, #, #)>, 159 | None, 160 | <Point (300006.0, 5000017.0, #, #)>, <Point (300005.0, 5000015.0, #, #)>, <Point (300007.0, 5000015.0, #, #)>, 161 | <Point (300006.0, 5000017.0, #, #)>]>] 162 | ``` 163 | The polygon consists of two parts, each represented as an arcpy.Array. This in turn consists of sequences of arcpy.Point values, with outer rings ordered clockwise and inner rings/holes ordered counter-clockwise. Inner and outer rings are separated by None rather than a null point, since a null point, unfortunately, has X and Y values of 0. 164 | 165 | ```python 166 | 167 | arcpy.Point() 168 | <Point (0.0, 0.0, #, #)> 169 | ``` 170 | 171 | 172 | FeatureClassToNumPyArray 173 | ------------------------ 174 | The standby, great for singlepart simple shapes. You have to read the X and Y coordinates separately or as a ``SHAPE@XY``, since reading the ``SHAPE@`` to retrieve the object directly is not permitted. 175 | 176 | In the examples below, extra effort would have to be made to subtract the extent minimum from each point to obtain their values relative to it.
177 | 178 | ```python 179 | 180 | a0 = arcpy.da.FeatureClassToNumPyArray(in_fc3, ['SHAPE@X', 'SHAPE@Y'], explode_to_points=True, spatial_reference=SR) 181 | a0 182 | array([(300010., 5000020.), (300010., 5000010.), (300000., 5000010.), (300000., 5000020.), 183 | (300010., 5000020.), (300003., 5000019.), (300003., 5000013.), (300009., 5000013.), 184 | (300009., 5000019.), (300003., 5000019.), (300008., 5000018.), (300008., 5000014.), 185 | (300004., 5000014.), (300004., 5000018.), (300008., 5000018.), (300006., 5000017.), 186 | (300005., 5000015.), (300007., 5000015.), (300006., 5000017.), (300012., 5000018.), 187 | (300012., 5000012.), (300020., 5000012.), (300020., 5000010.), (300010., 5000010.), 188 | (300010., 5000020.), (300020., 5000020.), (300020., 5000018.), (300012., 5000018.), 189 | (300025., 5000024.), (300025., 5000014.), (300015., 5000014.), (300015., 5000016.), 190 | (300023., 5000016.), (300023., 5000022.), (300015., 5000022.), (300015., 5000024.), 191 | (300025., 5000024.)], dtype=[('SHAPE@X', ' dtype([('OID@', ' ('OID@', 'SHAPE@X', 'SHAPE@Y') 288 | 289 | ``` 290 | 291 | 292 | 293 | 294 | The parts and the geometry are not identified within the sequences. Constructing points from the above is no big deal, but polylines and polygons would fail miserably... as shown in this example. 295 | 296 | The need to identify parts and holes in polygons prompted this study to see whether arcpy geometries could be represented in a different manner in numpy array format. 297 | Currently, there are operations that cannot be done simply on arcpy geometries that are so simple in numpy. 298 | 299 | Want to shift some polygons a couple of meters? 300 | - just shuffle through a search cursor disassemble the point to an arcpy.Array, 301 | - cycle through each point (checking for None), then doing the math on each point, 302 | - finally, just reassemble the points array and reconstitute the polygon. 
303 | 304 | In numpy, if the whole dataset can be represented as an Nx2 array... you just add/subtract from the whole array. 305 | Other functions, like convex hulls, will require you to operate on 'chunks' of the array, rather than on the whole dataset at once. At least nothing needs to be devolved to its smallest part first. 306 | 307 | More on this in subsequent sections. 308 | 309 | 310 | -------------------------------------------------------------------------------- /Docs/Scripts_desc.md: -------------------------------------------------------------------------------- 1 | # Scripts 2 | ---- 3 | **Last update: 2022-11-22** 4 | 5 | **Slowly updating** 6 | I will update the documentation as soon as I can. 7 | 8 | The following scripts are listed in this folder and the documentation guide. 9 | 10 | 1. \_\_init__.py 11 | 2. npg_io.py 12 | 3. npg_arc_npg.py 13 | 4. npGeo.py 14 | 5. npg_helpers.py 15 | 6. smallest_circle.py 16 | 7. npg_create.py 17 | 18 | Links to other documentation will be provided as appropriate. 19 | 20 | 21 | ---- 22 | ## npg_io.py 23 | 24 | 25 | 26 | Some useful functions to access and document featureclass information. 27 | 28 | **dtype_info(a, as_string=False)** 29 | 30 | Return dtype information as lists or a string. 31 | 32 | ```python 33 | a = np.array([(0, 1, 2), (3, 4, 5), (6, 7, 8), (9, 10, 11)], 34 | dtype=[('f0', '<i4'), ('f1', '<i4'), ('f2', '<i4')]) ``` **load_geo_attr(f_name)** >>> f_name = "C:/arcpro_npg/data/ontario_attr.npz" 104 | >>> names, arrs = load_geo_attr(f_name) 105 | >>> names # ['arr', 'fields'] 106 | >>> arr = arrs[names[0]] 107 | >>> fields = arrs[names[1]] 108 | 109 | **save_geo(g, f_name, folder)** 110 | 111 | Save an array as an npz file. 112 | 113 | **save_txt(a, name="arr.txt", sep=", ", dt_hdr=True)** 114 | 115 | Save a NumPy structured/recarray to text. 116 | 117 | **load_txt(name="arr.txt", data_type=None)** 118 | 119 | Read a structured/recarray created by save_txt. Many options are specified in save_txt. 120 | If you wish to modify this, modify save_txt as well.
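The round trip that ``save_txt`` and ``load_txt`` perform can be sketched with plain numpy calls. This is only the underlying idea, not the npg functions themselves, and the little table below is made up for the demonstration:

```python
import numpy as np
from io import StringIO

# A small structured array, a stand-in for the tables npg works with.
a = np.array([(1, 86.47, 78.0), (2, 112.0, 104.0), (3, 21.5, 16.0)],
             dtype=[('OID_', '<i4'), ('Perimeter', '<f8'), ('Area', '<f8')])

# Save: one header line carrying the field names, one text line per record.
lines = [', '.join(a.dtype.names)]
lines += ['{}, {:.2f}, {:.2f}'.format(*row) for row in a]
txt = '\n'.join(lines)

# Load: skip the header and let the original dtype rebuild the records.
b = np.genfromtxt(StringIO(txt), dtype=a.dtype, delimiter=',', skip_header=1)
```

Because the dtype travels with the reader, the reconstructed array keeps its field names and numeric types, which is the point of pairing the save and load functions.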
121 | 122 | **Others** 123 | 124 | There are functions to work with geojson files as well. 125 | 126 | ```python 127 | 128 | npg.npg_io 129 | [... 'dtype_info', 'geojson_Geo', 'load_geo', 'load_geo_attr', 'load_geojson', 'load_txt', 'save_geo', 'save_txt'] 130 | ``` 131 | ---- 132 | ## npg_arc_npg 133 | 134 | 135 | 136 | ```python 137 | 138 | in_fc = 'C:/Arc_projects/CoordGeom/CoordGeom.gdb/Shape2' 139 | ``` 140 | **get_SR(in_fc, verbose=False)** 141 | 142 | ```python 143 | 144 | get_SR(in_fc, verbose=False) 145 | 146 | 147 | get_SR(in_fc, verbose=True) 148 | SR name: NAD_1983_CSRS_MTM_9 factory code: 2951 149 | ``` 150 | 151 | **get_shape_K(in_fc)** 152 | 153 | Returns the shape type for a featureclass as (kind, k), where kind is polygon, polyline, multipoint, point and variants. k is 2, 1 or 0. 154 | 155 | **fc_to_Geo(in_fc, geom_kind=2, minX=0, minY=0, info="")** 156 | 157 | Convert FeatureClassToNumPyArray output to a Geo array. `in_fc` is the path to the featureclass. `geom_kind` is either 1 or 2, representing polylines or polygons. 158 | 159 | **id_fr_to(a, oids)** 160 | 161 | Produce the `id`, `from` and `to points` used to delineate poly* bit geometry. 162 | 163 | **Geo_to_shapes(geo, as_singlepart=True)** 164 | 165 | Convert a geo array back to esri geometry objects. 166 | 167 | **Geo_to_fc(geo, gdb=None, name=None, kind=None, SR=None)** 168 | 169 | Produce a geodatabase featureclass from a geo array. 170 | 171 | **Other functions** 172 | 173 | There are a variety of other functions that deal with converting between geometries. 174 | 175 | 176 | ---- 177 | ## npGeo.py 178 | 179 | 180 | 181 | 182 | This is where the Geo class is housed, along with the methods and properties applicable to it. 183 | 184 | The Geo class inherits from the numpy ndarray, and methods applied to Geo arrays generally return arrays of that class. 185 | 186 | Geo arrays can be constructed from other ndarrays using **arrays_Geo**. 187 | Three sample arrays are shown below.
They have been arranged in column format to save space. 189 | 190 | ```python 191 | array( array([[[12., 18.], array([[14., 20.], 192 | [array([[10., 20.], [12., 12.], [10., 20.], 193 | [10., 10.], [20., 12.], [15., 28.], 194 | [ 0., 10.], [20., 10.], [14., 20.]]) 195 | [ 0., 20.], [10., 10.], 196 | [10., 20.], [10., 20.], 197 | [ 3., 19.], [20., 20.], 198 | [ 3., 13.], [20., 18.], 199 | [ 9., 13.], [12., 18.]], 200 | [ 9., 19.], [[25., 24.], 201 | [ 3., 19.]]), [25., 14.], 202 | array([[ 8., 18.], [15., 14.], 203 | [ 8., 14.], [15., 16.], 204 | [ 4., 14.], [23., 16.], 205 | [ 4., 18.], [23., 22.], 206 | [ 8., 18.], [15., 22.], 207 | [ 6., 17.], [15., 24.], 208 | [ 5., 15.], [25., 24.]]]) 209 | [ 7., 15.], 210 | [ 6., 17.]])], 211 | dtype=object), 212 | ``` 213 | 214 | Both the array of arrays and the geo array are saved in the Scripts folder. 215 | To load the Geo array, save the files to disk. You can save and load arrays using the following syntax. This was used to create the files saved here. 216 | 217 | ```python 218 | # ---- For single arrays 219 | np.save("c:/path_to_file/three_arrays.npy", z, allow_pickle=True, fix_imports=False) # ---- save to disk 220 | 221 | arr = np.load("c:/path_to_file/three_arrays.npy", allow_pickle=True, fix_imports=False) # ---- load above arrays 222 | 223 | # ---- For multiple arrays 224 | np.savez("c:/temp/geo_array.npz", s2=s2, IFT=s2.IFT) # ---- save arrays, s2 and s2.IFT with names (s2, IFT) 225 | 226 | npzfiles = np.load("c:/temp/geo_array.npz") # ---- the Geo array and the array of I(ds)F(rom)T(o) values 227 | npzfiles.files # ---- will show ==> ['s2', 'IFT'] 228 | s2 = npzfiles['s2'] # ---- slice the arrays by name from the npz file to get each array 229 | IFT = npzfiles['IFT'] 230 | ``` 231 | 232 | 233 | ---- 234 | ## npg_create.py 235 | 236 | 237 | 238 | 239 | The functions here allow you to create geometry of various shapes. 240 | 241 | Many of the functions can be used for spatial tiling.
These would include rectangles, triangles and two variants of hexagons. 242 | 243 | Circle-based functions (arc, arc sector, circle, ellipse) are also included, since they are used in a variety of other functions (eg. triangulation). 244 | 245 | 246 | 247 | 248 | 249 | 250 | 251 |
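The core of the circle-based constructors amounts to parametric points around a centre. The helper below is a simplified stand-in for the npg_create versions, not the actual function:

```python
import numpy as np

def circle_pnts(radius=1.0, n=36, xc=0.0, yc=0.0):
    """Points on a circle, clockwise from the top, first point repeated to close."""
    ang = np.linspace(np.pi / 2, np.pi / 2 - 2 * np.pi, n + 1)  # decreasing -> clockwise
    return np.c_[xc + radius * np.cos(ang), yc + radius * np.sin(ang)]

c = circle_pnts(radius=2.0, n=72, xc=300005.0, yc=5000005.0)
```

Densifying `n`, restricting the angle range (arcs, sectors) or scaling x and y separately (ellipses) are small variations on the same pattern.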
252 | 253 | 254 | ---- 255 | ## npg_prn.py 256 | ---- 257 | 258 | For regular ndarrays, like `z` below with ndim=4, you can see that the array is split up and organized by dimensions. 259 | 260 | Print the array with 2 decimal places to a maximum width of 120 characters. 261 | 262 | `prn_(a, deci=2, width=120, prefix='. . ')` 263 | 264 | ``` 265 | z = np.arange(0, 2*3*4*5).reshape(2, 3, 4, 5) 266 | 267 | npg.npg_prn.prn_(z) 268 | 269 | (0, 3, 4, 5) 270 | . . 0 1 2 3 4 20 21 22 23 24 40 41 42 43 44 271 | . . 5 6 7 8 9 25 26 27 28 29 45 46 47 48 49 272 | . . 10 11 12 13 14 30 31 32 33 34 50 51 52 53 54 273 | . . 15 16 17 18 19 35 36 37 38 39 55 56 57 58 59 274 | 275 | (1, 3, 4, 5) 276 | . . 60 61 62 63 64 80 81 82 83 84 100 101 102 103 104 277 | . . 65 66 67 68 69 85 86 87 88 89 105 106 107 108 109 278 | . . 70 71 72 73 74 90 91 92 93 94 110 111 112 113 114 279 | . . 75 76 77 78 79 95 96 97 98 99 115 116 117 118 119 280 | ``` 281 | 282 | 283 | **npg_prn.prn_Geo_shapes** 284 | 285 | This can also be called using `array.prn()`.
286 | 287 | ``` 288 | sq2.prn() 289 | ID : Shape ID by part 290 | R : ring, outer 1, inner 0 291 | P : part 1 or more 292 | ID R P x y 293 | 1 1 1 [ 0.00 0.00] 294 | [ 2.00 8.00] 295 | [ 8.00 10.00] 296 | [ 10.00 10.00] 297 | [ 10.00 8.00] 298 | [ 9.00 1.00] 299 | [ 0.00 0.00] 300 | 1 0 1 [ 3.00 3.00] 301 | [ 7.00 3.00] 302 | [ 5.00 7.00] 303 | [ 3.00 3.00] 304 | 2 1 1 [ 8.00 10.00] 305 | [ 8.00 11.00] 306 | [ 8.00 12.00] 307 | [ 12.00 12.00] 308 | [ 12.00 8.00] 309 | [ 10.00 8.00] 310 | [ 10.00 10.00] 311 | [ 8.00 10.00] 312 | 3 1 1 [ 5.00 10.00] 313 | [ 5.00 12.00] 314 | [ 6.00 12.00] 315 | [ 8.00 12.00] 316 | [ 8.00 11.00] 317 | [ 5.00 10.00] 318 | 4 1 1 [ 5.00 12.00] 319 | [ 5.00 15.00] 320 | [ 7.00 14.00] 321 | [ 6.00 12.00] 322 | [ 5.00 12.00] 323 | 5 1 1 [ 0.00 10.00] 324 | [ 1.00 13.00] 325 | [ 3.00 14.00] 326 | [ 2.50 13.00] 327 | [ 1.00 11.50] 328 | [ 0.00 10.00] 329 | 5 0 1 [ 1.00 12.50] 330 | [ 1.50 12.50] 331 | [ 1.50 13.00] 332 | [ 1.00 12.50] 333 | 6 1 1 [ 1.00 11.50] 334 | [ 2.50 13.00] 335 | [ 4.00 12.50] 336 | [ 3.00 11.00] 337 | [ 2.00 11.00] 338 | [ 1.00 11.50] 339 | 6 0 1 [ 1.50 11.50] 340 | [ 2.50 11.50] 341 | [ 2.50 12.50] 342 | [ 1.50 11.50] 343 | 9 1 1 [ 2.00 11.00] 344 | [ 3.00 11.00] 345 | [ 4.00 9.00] 346 | [ 2.00 9.00] 347 | [ 2.00 11.00] 348 | 349 | ``` 350 | 351 | **prn_lists** 352 | 353 | Convert the geo array to a list of lists. 354 | 355 | ``` 356 | z = sq2.as_lists() 357 | 358 | npg.npg_prn.prn_lists(z) 359 | 360 | (0)... 361 | [(0.0, 0.0), (2.0, 8.0), (8.0, 10.0), (10.0, 10.0), (10.0, 8.0), (9.0, ... 362 | [(3.0, 3.0), (7.0, 3.0), (5.0, 7.0), (3.0, 3.0)]] 363 | 364 | (1)... 365 | [(8.0, 10.0), (8.0, 11.0), (8.0, 12.0), (12.0, 12.0), (12.0, 8.0), (10 ... 366 | 367 | (2)... 368 | [(5.0, 10.0), (5.0, 12.0), (6.0, 12.0), (8.0, 12.0), (8.0, 11.0), (5.0 ... 369 | 370 | (3)... 371 | [(5.0, 12.0), (5.0, 15.0), (7.0, 14.0), (6.0, 12.0), (5.0, 12.0)]] 372 | 373 | (4)... 
374 | [(0.0, 10.0), (1.0, 13.0), (3.0, 14.0), (2.5, 13.0), (1.0, 11.5), (0.0 ... 375 | [(1.0, 12.5), (1.5, 12.5), (1.5, 13.0), (1.0, 12.5)]] 376 | 377 | (5)... 378 | [(1.0, 11.5), (2.5, 13.0), (4.0, 12.5), (3.0, 11.0), (2.0, 11.0), (1.0 ... 379 | [(1.5, 11.5), (2.5, 11.5), (2.5, 12.5), (1.5, 11.5)]] 380 | 381 | (6)... 382 | [(2.0, 11.0), (3.0, 11.0), (4.0, 9.0), (2.0, 9.0), (2.0, 11.0)]] 383 | ``` 384 | 385 | 386 | ---- 387 | ## To continue updating below... 388 | ---- 389 | 390 | **fc_info(in_fc, prn=True)** 391 | 392 | ```python 393 | fc_info(in_fc, prn=True) 394 | 395 | FeatureClass: 396 | C:/Arc_projects/CoordGeom/CoordGeom.gdb/Shape2 397 | shapeFieldName OIDFieldName shapeType spatialReference 398 | Shape OBJECTID Polygon NAD_1983_CSRS_MTM_9 399 | ``` 400 | 401 | **fc_fld_info(in_fc, prn=True)** 402 | 403 | ```python 404 | 405 | fc_fld_info(in_fc, prn=True) 406 | 407 | FeatureClass: 408 | C:/Arc_projects/CoordGeom/CoordGeom.gdb/Shape2 409 | Name Type Length Nullable Required 410 | OBJECTID OID 4 False True 411 | Shape Geometry 0 True True 412 | Shape_Length Double 8 True True 413 | Shape_Area Double 8 True True 414 | CENTROID_X Double 8 True False 415 | CENTROID_Y Double 8 True False 416 | INSIDE_X Double 8 True False 417 | INSIDE_Y Double 8 True False 418 | ``` 419 | 420 | **fc_geom_info(in_fc, SR=None, prn=True, start=0, num=50)** 421 | 422 | ```python 423 | 424 | fc_geom_info(in_fc, SR=None, prn=True, start=0, num=50) 425 | 426 | Featureclass: 427 | C:/Arc_projects/CoordGeom/CoordGeom.gdb/Shape2 428 | Shape Parts Points From_pnt To_pnt 429 | 1 2 21 0 21 430 | 2 2 18 21 39 431 | 3 1 4 39 43 432 | ``` 433 | 434 | **fc_composition(in_fc, SR=None, prn=True, start=0, end=50)** 435 | 436 | ```python 437 | 438 | fc_composition(in_fc, SR=None, prn=True, start=0, end=50) 439 | 440 | C:/Arc_projects/CoordGeom/CoordGeom.gdb/Shape2 441 | Shapes : 3 442 | Parts : 5 443 | max : 2 444 | Points : 43 445 | min : 4 446 | median : 9 447 | max : 11 448 | IDs Part Points 
From_pnt To_pnt 449 | 1 0 11 0 11 450 | 1 1 10 11 21 451 | 2 0 9 21 30 452 | 2 1 9 30 39 453 | 3 0 4 39 43 454 | ``` 455 | 456 | **arr = tbl_arr(in_fc)** 457 | 458 | ```python 459 | 460 | arr = tbl_arr(in_fc) 461 | 462 | array([(1, 86.47, 78., 300004.72, 5000014.73, 300004.72, 5000014.73), 463 | (2, 112. , 104., 300017.5 , 5000017. , 300015. , 5000011. ), 464 | (3, 21.5 , 16., 300013. , 5000022.67, 300013. , 5000022.67)], 465 | dtype=[('OID_', ' 497 | -------------------------------------------------------------------------------- /Docs/npGeomTools.md: -------------------------------------------------------------------------------- 1 | # npGeom Tools 2 | 3 | ---- 4 | 5 | 6 | 7 | 8 | The following tools are implemented in the npGeom.tbx toolbox for use in ArcGIS Pro. 9 | 10 | The *Geo* array, based on a numpy array, is used along with *arcpy* functions to implement the tools. 11 | 12 | The tools here are the most recent version of those provided in *FreeTools* 13 | 14 |
15 | 16 | 17 | (1) **Attribute tools** 18 | 19 | The tools are self-evident and don't include multiple options. 20 | 21 | (2) **Containers** 22 | 23 | The containers toolset offers the options shown in the image. 24 | 25 | - `Bounding circles` is probably the most uncommon for GIS tools. 26 | 27 | - `extent polygon` is the axis aligned extent of the feature geometry. 28 | 29 | - `Convex hull` is included for convenience. It uses scipy's Qhull implementation. 30 | 31 | (3) **Conversion** 32 | 33 | The options for `conversion` are as follows: 34 | 35 | - `Feature to point` will return the geometry centroid for polygons. 36 | 37 | - `Split at vertices` is for polygon geometry. From-to point segmentation of the geometry will be returned. 38 | 39 | - `Vertices to points` applies to polyline/polygon geometry. 40 | 41 | - `Polygons to polylines` is also a convenience function since it only requires a `Kind (K)` conversion in the Geo class. 42 | 43 | The reciprocal function was not implemented because I didn't want to provide a whole load of geometry checks. 44 | 45 | If you have polyline geometry that you know will create well-formed polygons, simply change `K`. 46 | 47 | 48 | 49 | 50 | 51 |
52 |
53 | 54 | (4) **Sort Geometry** 55 | 56 | - `Extent sort` provides options for sorting geometries using keys like S - N, E - W etcetera. 57 | 58 | - `Geometry sort` can be used to sort by area or perimeter/length. 59 | *Source image* 60 | 61 | (5) **Alter Geometry** 62 | 63 | The tool listing provides densification options, filling holes in polygons, and shifting and rotating geometries either as a group or individually. 64 | 65 | (6) **Triangulation Tools** 66 | 67 | Voronoi diagram (aka Thiessen polygons) and Delaunay triangulations, round out the tools so far. 68 | 69 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # NumPy and Geometry 2 | 3 | ---- 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | **Links to tool and script documentation** 14 | 15 | * [NumPy Geometry and Free Tools](/arcpro_npg/README.md) 16 | 17 | **Background Documentation Links** 18 | 19 | * [Geo array Backgrounder](/Docs/Backgrounder.md) 20 | 21 | * [Geo array Helper functions](/Docs/Helper_functions.md) 22 | 23 | * [Script descriptions](/Docs/Scripts_desc.md) 24 | 25 | * [npGeom toolbox description](/Docs/npGeomTools.md) 26 | 27 | * [Multipart shapes](/Docs/Multipart_shapes.md) 28 | 29 | 30 | **NOTE** 31 | 32 | 1 This is the main working repository for *arcpro_npg*. I update that when major work is complete here. 33 | 34 | 2 See the **Docs** folder for other documentation, or click on the links above. 35 | 36 | 3 The **arcpro_npg** folder contains the scripts associated with the **Geo** array, a subclass of a *numpy* ndarray. The script, *tbx_tools.py* is the controlling script associated with the *npGeom.tbx* toolbox. 37 | 38 | 4 Folder structure in this repository. 
39 | 40 | - numpy_geometry 41 | - arcpro_npg : main folder for numpy geometry and FreeTools 42 | - images : documentation images 43 | - npg : toolbox and main script 44 | - npg : scripts location 45 | - Docs : main and ancillary documentation 46 | - images : main documentation images 47 | 48 | ---- 49 | A numpy geometry class and functions with a focus on polygon/polyline geometry. Some methods for converting other geometry representations are provided, but limited to geojson and ESRI featureclasses. 50 | The current incarnation is restricted to 2D vector geometry using the NumPy ndarray as its base. 51 | 52 | This is a work in progress, so bear with me. The intent of the Geo class is to treat the geometry of featureclasses as one entity. 53 | 54 | See **npGeo.py** in subsequent sections for details. 55 | 56 | Converting geojson or esri's arcpy geometry objects to array representations is contained in **npg_io.py**.\ 57 | Most approaches I have seen so far tend to construct the geometric representations of geometries using some variant of arcpy cursors. 58 | 59 | When trying to work with numpy and the geometries, this creates problems since geometry is rarely a collection of simple shapes (eg. rectangles, circles, triangles).\ 60 | Object arrays (aka ragged arrays) containing the coordinates are the norm. An object array is created when the number of points per feature and/or feature part are not uniform.\ 61 | For example, a square with a triangular hole in it will have an outer ring, oriented clockwise, consisting of a list of 5 points with the first and last point being the same. The triangular hole will be represented by 4 points oriented counterclockwise.\ 62 | Now that arrangement of points can be used to represent a polygon, a closed-loop polyline or a multipoint.\ 63 | The same points can be used to represent 3 distinctly different geometric objects.
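That point is easy to demonstrate with plain numpy. One closed 5-point ring is treated three ways below; only the interpretation changes, which is what the Geo class records in its kind value ``K`` (2 for polygons, 1 for polylines, 0 for points). The helper functions are illustrative sketches, not part of the package:

```python
import numpy as np

# A closed ring: 5 points, first equal to last (a 10 x 10 square).
ring = np.array([[0., 0.], [0., 10.], [10., 10.], [10., 0.], [0., 0.]])

def ring_area(p):
    """Shoelace formula for a closed ring (absolute value taken)."""
    x, y = p[:, 0], p[:, 1]
    return 0.5 * abs(np.sum(x[:-1] * y[1:] - x[1:] * y[:-1]))

def ring_length(p):
    """Sum of the segment lengths between consecutive points."""
    return np.sum(np.hypot(*np.diff(p, axis=0).T))

area = ring_area(ring)      # as a polygon ring     (K = 2) -> 100.0
length = ring_length(ring)  # as a closed polyline  (K = 1) -> 40.0
n_pnts = len(ring)          # as a multipoint       (K = 0) -> 5
```

The coordinates never change; only the bookkeeping around them decides which geometric object they stand for.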
64 | 65 | What I set out to do was create a uniform 2D array of coordinates and a companion array which denotes the feature ID and the from-to point pairs plus other useful information.\ 66 | This is similar to esri's FeatureClassToNumPyArray approach, but that particular function and its inverse are only useful for simple singlepart geometries. 67 | 68 | I will document and build on this tool set with examples.\ 69 | Generally, I am only working with featureclasses stored in a file geodatabase. None of this web-based stuff.\ 70 | There are tools to derive geometries from geojson format or other formats capable of generating array-like structures. 71 | 72 | ---- 73 | 74 | See the **geonumeracy** repository for more examples of numpy geometry functions. 75 | 76 | -------------- 77 | **Some links** 78 | 79 | *2025* 80 | 81 | - [Text Data... NumPy moves up a notch](https://community.esri.com/t5/python-blog/text-data-numpy-moves-up-a-notch/ba-p/1614353) 82 | - [Clean polygons boundaries](https://community.esri.com/t5/python-blog/clean-polygons-boundaries/ba-p/1600683) 83 | - [Einsum in geometry](https://community.esri.com/t5/python-snippets-blog/bg-p/python-snippetsblog-board) 84 | - [Buffers revisited](https://community.esri.com/t5/python-blog/buffers-revisited/ba-p/1599281) 85 | - [Polygon intersections](https://community.esri.com/t5/python-blog/polygon-intersections/ba-p/1585839) 86 | - [Intersections](https://community.esri.com/t5/python-blog/intersections/ba-p/1585828) 87 | 88 | 89 | *2024* 90 | 91 | - [Finding irregular patterns in data](https://community.esri.com/t5/python-blog/finding-irregular-patterns-in-data-numpy-snippets/ba-p/1549306) 92 | - [Adjacency ... 
numpy snippets](https://community.esri.com/t5/python-blog/adjacency-numpy-snippets/ba-p/1401241) 93 | - [Recursion: Using the same function within itself](https://community.esri.com/t5/python-blog/recursion-using-the-same-function-within-itself/ba-p/1512093) 94 | - [Keys in dictionaries: the geojson example](https://community.esri.com/t5/python-blog/keys-in-dictionaries-the-geojson-example/ba-p/1512072) 95 | 96 | *2023* 97 | 98 | - [Fun with structured arrays](https://community.esri.com/t5/python-blog/fun-with-structured-arrays/ba-p/1258790) 99 | - [Fun with data duplicates](https://community.esri.com/t5/python-blog/dealing-with-duplicates/ba-p/1258351) 100 | - [Fun with clipping ... part 1](https://community.esri.com/t5/python-blog/fun-with-clipping-part-1/ba-p/1265826) 101 | - [Simplify your poly* geometry ... part 2](https://community.esri.com/t5/python-blog/simplify-your-poly-geometry-part-2/ba-p/1266873) 102 | - [I don't feel like scrolling through the folders](https://community.esri.com/t5/python-blog/i-don-t-feel-like-scrolling-through-the-folders/ba-p/1270687) 103 | 104 | *2022* 105 | 106 | - [The *.atbx toolbox](https://community.esri.com/t5/python-blog/the-atbx-toolbox/ba-p/1234953/jump-to/first-unread-message) 107 | - [What is arcgisscripting](https://community.esri.com/t5/python-blog/what-is-arcgisscripting/ba-p/1210998) 108 | - [Raster Generalization Functions](https://community.esri.com/t5/python-blog/raster-generalization-functions/ba-p/1180857) 109 | 110 | *2021* 111 | 112 | - [Sharing ArcGIS Pro Packages : hidden gems](https://community.esri.com/t5/python-blog/sharing-packages-but-you-can-t-see-inside/ba-p/1124521) 113 | - [Intersections : Polygon overlay operations](https://community.esri.com/t5/python-blog/intersections-polygon-overlay-operations/ba-p/1122050) 114 | - [NumPy and Arcpy play nice: part 2](https://community.esri.com/t5/python-blog/numpy-and-arcpy-play-nice-part-2/ba-p/1120621) 115 | - [NumPy and Arcpy play 
nice](https://community.esri.com/t5/python-blog/numpy-and-arcpy-play-nice/ba-p/1119719) 116 | - [What's in that module?](https://community.esri.com/t5/python-blog/what-s-in-that-module/ba-p/1111083) 117 | - [Arcpy Dependencies](https://community.esri.com/t5/python-blog/arcpy-dependencies/ba-p/1089485) 118 | - [Group, split and reclass using numpy, python and arcpy](https://community.esri.com/t5/python-blog/group-split-and-reclass-using-numpy-python-and/ba-p/1084357) 119 | - [Documenting Python Code](https://community.esri.com/t5/python-blog/documenting-python-code/ba-p/1075535) 120 | 121 | *2020* 122 | 123 | - [Dissolve Boundaries](https://community.esri.com/t5/python-blog/dissolve-boundaries/ba-p/1011337) 124 | - [Conda, the dependency trail](https://community.esri.com/t5/python-blog/conda-the-dependency-trail/ba-p/904040) 125 | - [Densify by distance](https://community.esri.com/t5/python-blog/densify-by-distance/ba-p/1004894) 126 | - [Thiessen/Voronoi and Delaunay](https://community.esri.com/people/danretired/blog/2020/06/16/free-advanced-tools-thiessen-polygons-delaunay-triangulation) 127 | - [Point tools](https://community.esri.com/people/danretired/blog/2020/05/15/point-tools-for-pro) 128 | - [Polyline/Polygon tools](https://community.esri.com/people/danretired/blog/2020/05/19/polygonpolyline-tools-for-pro) 129 | - [Table tools](https://community.esri.com/people/danretired/blog/2020/05/18/free-tools-for-arcgis-pro-table-tools) 130 | - [Buffer ... Geometry Mysteries](https://community.esri.com/blogs/dan_patterson/2020/01/27/buffer-geometry-mysteries) 131 | - [Point in Polygon ... 
Geometry Mysteries](https://community.esri.com/blogs/dan_patterson/2020/02/18/point-in-polygon-geometry-mysteries) 132 | 133 | *2019* 134 | 135 | - [Geometry in NumPy](https://community.esri.com/blogs/dan_patterson/2019/03/17/geometry-in-numpy-1) 136 | - [Geometry, Arcpy and NumPy](https://community.esri.com/blogs/dan_patterson/2019/04/10/geometry-arcpy-and-numpy-2) 137 | - [Deconstructing poly* features](https://community.esri.com/blogs/dan_patterson/2019/04/10/geometry-deconstructing-poly-features-3) 138 | - [Reconstructing poly* features](https://community.esri.com/blogs/dan_patterson/2019/04/17/geometry-reconstructing-poly-features-4) 139 | - [Attributes.. the other bits](https://community.esri.com/blogs/dan_patterson/2019/04/17/geometry-attributes-actually-the-other-bits-5) 140 | - [Don't believe what you see](https://community.esri.com/blogs/dan_patterson/2019/05/09/geometry-dont-believe-what-you-see-6) 141 | - [Geometry: forms of the same feature](https://community.esri.com/t5/python-blog/geometry-forms-of-the-same-feature-7/ba-p/902680) 142 | 143 | *2018* 144 | 145 | - [Minimum area bounding ellipse](https://community.esri.com/t5/python-blog/minimum-area-bounding-ellipses-containers-for/ba-p/902600) 146 | 147 | -------------------------------------------------------------------------------- /Voronoi2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/Voronoi2.png -------------------------------------------------------------------------------- /_config.yml: -------------------------------------------------------------------------------- 1 | # --- Required options --- # 2 | 3 | title: NumPy Geometry 4 | theme: jekyll-theme-dinky 5 | # theme: jekyll-theme-cayman 6 | show_downloads: true 7 | 8 | # Exclude 9 | 10 | # Name of website 11 | name: NumPy Geometry 12 | description: Working with geometry using numpy. 
13 | 14 | # [npg](voronoi2.png) 15 | # Your name to show in the footer 16 | author: Dan Patterson 17 | 18 | url: "https://dan-patterson.github.io/numpy_geometry" 19 | 20 | # url: https://github.com/Dan-Patterson "https://dan-patterson.github.io/numpy_geometry" 21 | # exclude: 22 | # --- Logo --- # 23 | 24 | # Image to show in the navigation bar - works best with a square image 25 | # Remove this parameter if you don't want an image in the navbar 26 | # avatar: "/assets/img/avatar-icon.png" 27 | 28 | # If you want to have an image logo in the top-left corner instead of having the title of the website, 29 | # then specify the following parameter 30 | #title-img: /path/to/image 31 | 32 | # kramdown 33 | markdown: GFM 34 | 35 | # Alternatively, the navbar, footer, and page background can be set to an image 36 | # instead of colour 37 | 38 | #navbar-img: "/assets/img/bgimage.png" 39 | #footer-img: "/assets/img/bgimage.png" 40 | #page-img: "/assets/img/bgimage.png" 41 | plugins: 42 | - jekyll-paginate 43 | - jekyll-sitemap 44 | 45 | # Gems 46 | #gems: 47 | # - jekyll-paginate 48 | # --- reference --- 49 | # https://github.com/daattali/beautiful-jekyll/blob/master/_config.yml 50 | -------------------------------------------------------------------------------- /_layouts/default.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | {% seo %} 8 | 9 | 10 | 11 | 14 | 15 | 16 |
17 |
18 |

{{ site.title | default: site.github.repository_name }}

19 |

{{ site.description | default: site.github.project_tagline }}

20 | 21 | 28 | 29 | {% if site.github.is_project_page %} 30 |

Maintained by {{ site.github.owner_name }}

31 | {% endif %} 32 | 33 | {% if site.github.is_user_page %} 34 | 37 | {% endif %} 38 |
39 | 40 |
41 | {{ content }} 42 |
43 | 44 | 47 |
48 | 49 | {% if site.google_analytics %} 50 | 58 | {% endif %} 59 | 60 | 61 | -------------------------------------------------------------------------------- /arcpro_npg/README.md: -------------------------------------------------------------------------------- 1 | # NumPy Geometry 2 | 3 | ---- 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | **NOTE** 14 | 15 | Initially developed using Python <= 3.12, NumPy < 2.0 and ArcGIS Pro <= 3.4. 16 | 17 | I no longer have access to ArcGIS Pro, so I am no longer actively developing or supporting anything that involves arcpy or arcgisscripting. 18 | 19 | Current development uses Python >= 3.13 and NumPy > 2.0. NumPy 2.0 marks a major transition, so changes are denoted pre- and post-2.0. 20 | 21 | These demo scripts and the toolbox show how numpy and arcpy can play nice together and generate geometries that are normally only available at the ArcGIS Pro Advanced level. Tools are already provided to do this, but less attention is paid to the attributes. 22 | 23 | The image to the right reflects the tools as they stand on 2022/02/07. 24 | 25 | The */npg* folder contains the toolbox and scripts needed to run the tools in ArcGIS Pro. 26 | 27 |


28 | 29 | 30 | 31 | 32 | 33 | 34 | ---- 35 | The basic structure is as follows: 36 | 37 | file, folder | purpose 38 | ------------ | ------- 39 | npg/npg_tools.tbx | the toolbox for ArcGIS Pro (2.8, although it may work with 2.7). 40 | npg/tbx_tools.py | the main script controlling all tools in the toolbox 41 | npg/npg | the scripts folder which contains the modules imported by *tbx_tools*. 42 | 43 | 44 | 45 | 46 | Usually a spatial and/or attribute join enables one to bring the attributes from the input class to the output class. 47 | 48 | This can be done after the geometry is created, or I may have done so during script construction (depending on how bored I was). 49 | 50 | In some cases, the outputs are only one option of what the Esri tool provides; for example 51 | 52 | #6 Polygons to Polylines is just that... 53 | 54 | a simple conversion of the geometry type, no fancy intersection and overlap stuff... 55 | 56 | You get what you pay for, but the widest use is probably the simplest. 57 | 58 | ---- 59 | ## 2025-04-29 update ## 60 | The big split is done. I am now working with Python 3.13 and NumPy > 2.0. I also no longer have access to ArcGIS Pro. The functionality within *npg* works with numpy arrays, so that part is still being developed. Get your data into array format, and you can use geojson as an interchange format if you need to map within Pro. 61 | 62 | So ArcGIS Pro and arcpy-specific functionality will no longer be supported until the required NumPy and related stack packages are available. 63 | 64 | ## 2024-07-29 update ## 65 | Additions and modifications. 66 | 67 | ## 2022-02-07 update ## 68 | Fixed some errors... (I hope). 69 | 70 | ## 2021-06-04 update ## 71 | Rewrites of many functions to optimize array concatenations. 72 | *npg_clip* is still being tested. Convex polygons pose no problems, but clipping that results in multipart geometries is still being investigated.
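As an aside on the easy case: clipping one ring by a convex clipper can be done with the classic Sutherland-Hodgman scheme. This is a generic plain-Python sketch, not *npg_clip* itself; both rings are assumed open (no repeated end point) and counterclockwise:

```python
def clip_convex(subject, window):
    """Clip ring `subject` by convex ring `window` (both open, CCW)."""
    def inside(p, a, b):
        # p is on or to the left of the directed edge a->b (cross product).
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

    def cross_pnt(p, q, a, b):
        # Intersection of segment p-q with the infinite line through a-b.
        den = (p[0] - q[0]) * (a[1] - b[1]) - (p[1] - q[1]) * (a[0] - b[0])
        t = ((p[0] - a[0]) * (a[1] - b[1]) - (p[1] - a[1]) * (a[0] - b[0])) / den
        return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

    out = list(subject)
    for i in range(len(window)):            # clip against one edge at a time
        a, b = window[i], window[(i + 1) % len(window)]
        src, out = out, []
        for j in range(len(src)):
            p, q = src[j], src[(j + 1) % len(src)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    out.append(cross_pnt(p, q, a, b))
                out.append(q)
            elif inside(p, a, b):
                out.append(cross_pnt(p, q, a, b))
    return out

square = [(0., 0.), (4., 0.), (4., 4.), (0., 4.)]
window = [(2., -1.), (6., -1.), (6., 5.), (2., 5.)]
print(clip_convex(square, window))  # [(2.0, 0.0), (4.0, 0.0), (4.0, 4.0), (2.0, 4.0)]
```

The multipart trouble noted above arises precisely because this scheme returns a single ring; a concave clipper can slice the subject into several pieces.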
73 | 74 | ## 2020-12-20 update ## 75 | This is not an exhaustive or definitive list of the toolbox functionality... consider it a sampling: 76 | 77 | **Dissolve Boundaries** 78 | 79 | 80 | Dissolve shared edges in polygon geometry. The polygons do not need to share a common attribute. 81 | All shared edges are deleted and holes are removed. Edges that meet only at a point are not considered shared, since there is no 2D space between the points. 82 | 83 | *Blog post* [Dissolve boundaries](https://community.esri.com/t5/python-blog/dissolve-boundaries/ba-p/1011337) 84 | 85 | **Feature Envelope to Polygon** 86 | 87 | Using the Geo array class, the extents of polyline and polygon features are created from their constituent points. 88 | 89 | *ArcGIS Pro help* [Feature envelope to polygon](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/feature-envelope-to-polygon.htm) 90 | 91 | **Convex hulls** 92 | 93 | A simple convex hull implementation in Python, with SciPy used when the number of feature points exceeds a threshold. 94 | 95 | *ArcGIS Pro help* [Minimum bounding geometry](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/minimum-bounding-geometry.htm) 96 | 97 | **Feature to Point** 98 | 99 | For polygon features. Reduces the points to a representative centroid. 100 | 101 | *ArcGIS Pro help* [Feature to point](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/feature-to-point.htm) 102 | 103 | **Split Line at Vertices** 104 | 105 | As it says. I chose to keep the order of the resultant line segments as they were, and not to remove apparent `duplicates` for line segments that occur on shared borders. In such cases, the shared segments will have the same points, but their from-to order is reversed. There are cases to be made for keeping them or removing them... however, if they are removed, they are harder to add back in should one want to recreate polygon geometry from the line segments.
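The reversed `duplicates` mentioned above can be flagged without removing anything: put the two endpoints of every from-to pair into a canonical order, then look for repeated rows. The segments below are invented for illustration:

```python
import numpy as np

# Hypothetical from-to segments; two squares share the border x = 5.
segs = np.array([
    [[0., 0.], [0., 5.]],
    [[0., 5.], [5., 5.]],
    [[5., 5.], [5., 0.]],   # shared border, as traversed by the left square
    [[5., 0.], [0., 0.]],
    [[5., 0.], [5., 5.]],   # same border, reversed in the right square
    [[5., 5.], [10., 5.]],
])

flat = segs.reshape(-1, 4)                  # rows of [x0, y0, x1, y1]
swap = (flat[:, 0] > flat[:, 2]) | (
    (flat[:, 0] == flat[:, 2]) & (flat[:, 1] > flat[:, 3]))
canon = flat.copy()
canon[swap] = canon[swap][:, [2, 3, 0, 1]]  # sort each pair of endpoints

# Rows occurring more than once mark segments lying on a shared border.
_, inv, counts = np.unique(canon, axis=0, return_inverse=True,
                           return_counts=True)
shared = np.nonzero(counts[inv] > 1)[0]
print(shared)  # [2 4]
```

Nothing is deleted, so the original segment order survives and polygon geometry can still be rebuilt from the full set.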
106 | 107 | *ArcGIS Pro help* [Split line at vertices](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/split-line-at-vertices.htm) 108 | 109 | **Feature Vertices to Points** 110 | 111 | Convert polygon features to a centroid. One point is returned for multipart shapes, but this could be altered if someone has a use-case that might be relevant. 112 | 113 | *ArcGIS Pro help* [Feature vertices to points](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/feature-vertices-to-points.htm) 114 | 115 | **Polygons to Polylines** 116 | 117 | Really... You are just changing from polygons to polylines. They are still a closed geometry, nothing fancy geometry-wise. Should definitely be **Freed** from its shackles. 118 | 119 | *ArcGIS Pro help* [Feature to polygon](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/feature-to-polygon.htm) 120 | 121 | **Bounding Circles** 122 | 123 | 124 | 125 | Another container that has been around for a long time in a variety of formats and readily implemented in python. Sort-of ported this over from an old toolbox for ArcMap, but redone for the new geom class. Speedy and accurate. 126 | 127 | *ArcGIS Pro help* [Minimum area bounding circles](https://pro.arcgis.com/en/pro-app/tool-reference/data-management/minimum-bounding-geometry.htm) 128 | 129 | 130 | **Frequency** 131 | 132 | Another tool that should be free for such basic functionality. 
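A frequency count over multiple columns amounts to ``np.unique`` on a structured array. A minimal sketch with invented data, using ``repack_fields`` so the multi-field subset is safe to sort:

```python
import numpy as np
from numpy.lib import recfunctions as rfn

# Invented attribute table: two class fields plus a value field.
tbl = np.array([("A", "x", 1.), ("A", "y", 2.), ("A", "x", 3.), ("B", "y", 4.)],
               dtype=[("f1", "U1"), ("f2", "U1"), ("val", "f8")])

# repack_fields drops the padding a multi-field view keeps, so the
# two-field subset can be sorted and uniqued cleanly.
sub = rfn.repack_fields(tbl[["f1", "f2"]])
combos, counts = np.unique(sub, return_counts=True)
print(combos.tolist())  # [('A', 'x'), ('A', 'y'), ('B', 'y')]
print(counts.tolist())  # [2, 1, 1]
```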
133 | 134 | [Create classes from multiple columns](https://community.esri.com/blogs/dan_patterson/2016/03/03/create-classes-from-multiple-columns) 135 | 136 | *ArcGIS Pro help* [Frequency](https://pro.arcgis.com/en/pro-app/tool-reference/analysis/frequency.htm) 137 | 138 | 139 | 140 | -------------------------------------------------------------------------------- /arcpro_npg/images/FreeTools.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/images/FreeTools.png -------------------------------------------------------------------------------- /arcpro_npg/images/README.md: -------------------------------------------------------------------------------- 1 | Images for documents 2 | -------------------------------------------------------------------------------- /arcpro_npg/images/circles.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/images/circles.png -------------------------------------------------------------------------------- /arcpro_npg/images/dissolve_sq2_0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/images/dissolve_sq2_0.png -------------------------------------------------------------------------------- /arcpro_npg/images/dissolve_sq2_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/images/dissolve_sq2_1.png -------------------------------------------------------------------------------- /arcpro_npg/images/sq.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/images/sq.png -------------------------------------------------------------------------------- /arcpro_npg/npg/README.md: -------------------------------------------------------------------------------- 1 | # Working with npg 2 | 3 | **NOTE** 4 | 5 | The *.atbx is the first major incarnation of the ArcGIS Pro 3.x toolbox format. 6 | 7 | **Tools and the Toolbox** 8 | 9 | The toolbox ( ``*.atbx`` ) and the script ( ``tbx_tools.py`` ) in this folder, plus the contents of the ``npg`` folder, are used to define the Geo class and provide the ability to work with geometry. 10 | I will eventually get around to "packaging" it, but for now, copy the contents, retaining their structure, into a folder on your computer. Load the toolbox into an ArcGIS Pro project from that location. 11 | 12 | **Toolbox description** -------------------------------------------------------------------------------- /arcpro_npg/npg/__init__.py: -------------------------------------------------------------------------------- 1 | # -- this does nothing, but is needed in any folder that is part of a project path -- -------------------------------------------------------------------------------- /arcpro_npg/npg/npGeom_32.atbx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/npg/npGeom_32.atbx -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/README.md: -------------------------------------------------------------------------------- 1 | # Scripts Folder 2 | 3 | This is the repository for the NumPy geometry class and methods (aka *npg*).
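The core idea behind the Geo class, one uniform 2D array of coordinates plus a companion ``IFT`` array of (id, from, to) rows, can be sketched as follows. The coordinates and the stripped-down three-column IFT are invented for illustration; the real IFT carries extra columns (clockwise flag, part and bit ids):

```python
import numpy as np

# Invented rings: a square with a triangular hole, plus a second shape.
square = [[0., 0.], [0., 10.], [10., 10.], [10., 0.], [0., 0.]]
hole = [[2., 2.], [8., 2.], [5., 8.], [2., 2.]]
tri = [[20., 0.], [25., 10.], [30., 0.], [20., 0.]]

coords = np.concatenate([square, hole, tri])  # one uniform (13, 2) array
ift = np.array([[1, 0, 5],     # shape 1, outer ring -> coords[0:5]
                [1, 5, 9],     # shape 1, hole       -> coords[5:9]
                [2, 9, 13]])   # shape 2, outer ring -> coords[9:13]

# Any ring is recovered with a plain slice; no object arrays needed.
rings = [coords[fr:to] for _, fr, to in ift]
print(coords.shape, [r.shape for r in rings])  # (13, 2) [(5, 2), (4, 2), (4, 2)]
```

Because the coordinates stay in one homogeneous ndarray, vectorized operations run over all shapes at once, with the IFT mapping results back to features.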
4 | 5 | I use my own **npg.dirr** function to produce a formatted list of objects with their properties and methods... pretty well what Python's **dir** does. 6 | 7 | Here is a partial listing of the modules with their **dirr** contents. 8 | 9 | **npGeo** 10 | 11 | ```python 12 | ---------------------------------------------------------------------- 13 | | npg.dirr(npg.npGeo) ... 14 | | npg.dirr... 15 | ------- 16 | (001) FLOATS Geo Geo_hlp 17 | (002) Geo_to_arrays Geo_to_lists INTS 18 | (003) NUMS TwoPI __all__ 19 | (004) __builtins__ __cached__ __doc__ 20 | (005) __file__ __loader__ __name__ 21 | (006) __package__ __spec__ _angles_3pnt_ 22 | (007) _area_centroid_ _bit_area_ _bit_crossproduct_ 23 | (008) _bit_length_ _bit_min_max_ _clean_segments_ 24 | (009) _rotate_ array_IFT array_IFT_doc 25 | (010) arrays_to_Geo bounding_circles_doc check_geometry 26 | (011) clean_polygons convex_hulls_doc dedent 27 | (012) dirr dirr_doc extent_rectangles_doc 28 | (013) fill_float_array geom geom_angles 29 | (014) get_shapes_doc indent inner_rings_doc 30 | (015) inspect is_Geo is_in_doc 31 | (016) np npg npg_geom_hlp 32 | (017) npg_io npg_prn od_pairs_doc 33 | (018) outer_rings_doc parts_doc pnt_on_poly_doc 34 | (019) radial_sort_doc reindex_shapes remove_seq_dupl 35 | (020) repack_fields roll_arrays roll_coords 36 | (021) sc script shapes_doc 37 | (022) sort_by_area_doc sort_by_extent_doc sys 38 | (023) uniq_1d uts wrap 39 | 40 | ``` 41 | 42 | **npgDocs** 43 | ```python 44 | ---------------------------------------------------------------------- 45 | | npg.dirr(npg.npgDocs) ... 46 | | npg.dirr...
47 | ------- 48 | (001) Geo_hlp __all__ __builtins__ 49 | (002) __cached__ __doc__ __file__ 50 | (003) __loader__ __name__ __package__ 51 | (004) __spec__ _update_docstring array_IFT_doc 52 | (005) author_date bounding_circles_doc convex_hulls_doc 53 | (006) dirr_doc extent_rectangles_doc ft 54 | (007) get_shapes_doc inner_rings_doc is_in_doc 55 | (008) np npGeo_doc od_pairs_doc 56 | (009) outer_rings_doc parts_doc pnt_on_poly_doc 57 | (010) radial_sort_doc script shapes_doc 58 | (011) sort_by_area_doc sort_by_extent_doc sys 59 | ``` 60 | 61 | **npg.npg_geom_ops** 62 | ```python 63 | ---------------------------------------------------------------------- 64 | | npg.dirr(npg.npg_geom_ops) ... 65 | | npg.dirr... 66 | ------- 67 | (001) CH Delaunay __all__ 68 | (002) __builtins__ __cached__ __doc__ 69 | (003) __file__ __helpers__ __imports__ 70 | (004) __loader__ __name__ __package__ 71 | (005) __spec__ _add_pnts_on_line_ _angles_3pnt_ 72 | (006) _bit_min_max_ _ch_ _ch_scipy_ 73 | (007) _ch_simple_ _closest_pnt_on_poly_ _dist_along_ 74 | (008) _e_2d_ _get_base_ _is_pnt_on_line_ 75 | (009) _percent_along_ _pnt_on_segment_ _view_as_struct_ 76 | (010) bin_pnts common_extent densify_by_distance 77 | (011) densify_by_factor dist_array eucl_dist 78 | (012) extent_to_poly find_closest in_hole_check 79 | (013) mabr near_analysis np 80 | (014) npGeo np_wn npg_geom_hlp 81 | (015) npg_pip on_line_chk pnts_in_pnts 82 | (016) pnts_on_poly pnts_to_extent polys_to_segments 83 | (017) polys_to_unique_pnts prn_q prn_tbl 84 | (018) repack_fields script segments_to_polys 85 | (019) simplify simplify_lines spider_diagram 86 | (020) stu sys triangulate_pnts 87 | (021) uts which_quad 88 | 89 | ``` 90 | **npg.npg_geom_hlp** 91 | ```python 92 | ---------------------------------------------------------------------- 93 | | npg.dirr(npg.npg_geom_hlp) ... 94 | | npg.dirr... 
95 | ------- 96 | (001) __all__ __builtins__ __cached__ 97 | (002) __doc__ __file__ __helpers__ 98 | (003) __imports__ __loader__ __name__ 99 | (004) __package__ __spec__ _adj_within_ 100 | (005) _angle_between_ _angles_from_north_ _angles_from_xaxis_ 101 | (006) _area_centroid_ _bit_area_ _bit_check_ 102 | (007) _bit_crossproduct_ _bit_length_ _bit_min_max_ 103 | (008) _bit_segment_angles_ _clean_segments_ _from_to_pnts_ 104 | (009) _get_base_ _in_LBRT_ _in_extent_ 105 | (010) _is_ccw_ _is_clockwise_ _is_convex_ 106 | (011) _is_turn _od_angles_dist_ _perp_ 107 | (012) _pnts_in_extent_ _rotate_ _scale_ 108 | (013) _trans_rot_ _translate_ a_eq_b 109 | (014) classify_pnts close_pnts coerce2array 110 | (015) common_pnts compare_geom compare_segments 111 | (016) del_seq_dups dist_angle_sort flat 112 | (017) geom_angles interweave keep_geom 113 | (018) multi_check np nums 114 | (019) pnt_segment_info prn_tbl radial_sort 115 | (020) reclass_ids remove_geom script 116 | (021) segment_angles shape_finder sort_segment_pairs 117 | (022) sort_xy swap_segment_pnts sys 118 | (023) uts 119 | 120 | ``` 121 | **npg.npg_bool_ops** 122 | ```python 123 | ---------------------------------------------------------------------- 124 | | npg.dirr(npg.npg_bool_ops) ... 125 | | npg.dirr... 
126 | ------- 127 | (001) __all__ __builtins__ __cached__ 128 | (002) __doc__ __file__ __helpers__ 129 | (003) __imports__ __loader__ __name__ 130 | (004) __package__ __spec__ _adjacent_ 131 | (005) _bit_area_ _bit_check_ _cut_across_ 132 | (006) _cut_pairs_ _del_seq_dupl_pnts_ _tri_chk_ 133 | (007) _union_op_ add_intersections adjacency_array 134 | (008) adjacency_matrix append_ bail 135 | (009) clp_ erase_ merge_ 136 | (010) no_overlay_ np npGeo 137 | (011) npg nx nx_solve 138 | (012) one_overlay_ orient_clockwise overlay_to_geo 139 | (013) pnt_connections polygon_overlay prepare 140 | (014) prn_ prn_as_obj renumber_pnts 141 | (015) reorder_x_pnts roll_arrays rolling_match 142 | (016) script split_at_intersections sweep 143 | (017) symm_diff_ sys tri_array 144 | (018) triangle_check turns union_adj 145 | (019) union_over uts wrap_ 146 | 147 | ``` 148 | 149 | **npg.npg_bool_hlp** 150 | ```python 151 | ---------------------------------------------------------------------- 152 | | npg.dirr(npg.npg_bool_hlp) ... 153 | | npg.dirr... 154 | ------- 155 | (001) __all__ __builtins__ __cached__ 156 | (002) __doc__ __file__ __helpers__ 157 | (003) __imports__ __loader__ __name__ 158 | (004) __package__ __spec__ _add_pnts_ 159 | (005) _del_seq_dupl_pnts_ _node_type_ _roll_ 160 | (006) _seg_prep_ _w_ _wn_clip_ 161 | (007) add_intersections find_sequence np 162 | (008) np_wn npg p_ints_p 163 | (009) plot_2d plot_polygons prep_overlay 164 | (010) roll_arrays script segment_intersections 165 | (011) self_intersection_check sequences sort_segment_pairs 166 | (012) swv sys 167 | 168 | ``` 169 | **npg.npg_helpers** 170 | ```python 171 | ---------------------------------------------------------------------- 172 | | npg.dirr(npg.npg_helpers) ... 173 | | npg.dirr... 
174 | ------- 175 | (001) __all__ __builtins__ __cached__ 176 | (002) __doc__ __file__ __helpers__ 177 | (003) __loader__ __name__ __package__ 178 | (004) __spec__ _base_ _isin_2d_ 179 | (005) _iterate_ _to_lists_ _view_as_struct_ 180 | (006) cartesian_product flatten np 181 | (007) npg remove_seq_dupl script 182 | (008) separate_string_number sequences stride_2d 183 | (009) sys uniq_1d uniq_2d 184 | (010) unpack 185 | 186 | ``` 187 | **npg.npg_maths** 188 | ```python 189 | ---------------------------------------------------------------------- 190 | | npg.dirr(npg.npg_maths) ... 191 | | npg.dirr... 192 | ------- 193 | (001) __all__ __builtins__ __cached__ 194 | (002) __doc__ __file__ __helpers__ 195 | (003) __imports__ __loader__ __name__ 196 | (004) __package__ __spec__ _angle_between_ 197 | (005) _angles_3pnt_ _arc_mini_ _area_centroid_2 198 | (006) _offset_segment_ _pnt_on_segment_ _point_along_a_line 199 | (007) _resize_segment_ _trans_rot_2 circ_circ_intersection 200 | (008) cross_product_2d dot_product_2d flip_left_right 201 | (009) flip_up_down line_circ_intersection n_largest 202 | (010) n_smallest norm_2d np 203 | (011) npg pnt_to_array_distances project_pnt_to_line 204 | (012) rot180 rot270 rot90 205 | (013) running_count script segment_crossing 206 | (014) sys 207 | 208 | ``` 209 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/__init__.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # flake8 --per-file-ignores="__init__.py:F401 3 | # noqa: E401, E402, D205, D400, F401, F403 4 | # pylint: disable=C0410 5 | r""" 6 | npg NumPy Geometry 7 | =================== 8 | 9 | **npg module __init__ file** 10 | 11 | Normal usage: 12 | 13 | >>> import npg 14 | 15 | Script : 16 | __init__.py. 
17 | 18 | Author : 19 | Dan Patterson 20 | - Dan_Patterson@carleton.ca 21 | - https://github.com/Dan-Patterson 22 | 23 | Modified : 2025-03-11 24 | Creation date during 2019 as part of ``arraytools``. 25 | 26 | Purpose 27 | ------- 28 | Tools for working with point and poly features as an array class. 29 | Requires npGeo to implement the array geometry class. 30 | 31 | See Also 32 | -------- 33 | Many links and references in ``..docs/_npgeom_notes_.txt``. 34 | 35 | .. note:: 36 | 37 | This double dot, .. note:: thing produces a nice blue colored box with the 38 | note inside. 39 | 40 | Notes 41 | ----- 42 | Import suggestions and package properties and methods. The Geo class in npGeo 43 | provides the base class for this package. It is based on the numpy ndarray. 44 | 45 | >>> import npgeom as npg 46 | 47 | 48 | **Import options for arcpy functions** 49 | 50 | >>> import arcgisscripting as ags 51 | ... ['ContingentFieldValue', 'ContingentValue', 'DatabaseSequence', 'Describe', 52 | ... 'Domain', 'Editor', 'ExtendTable', 'FeatureClassToNumPyArray', 53 | ... 'InsertCursor', 'ListContingentValues', 'ListDatabaseSequences', 54 | ... 'ListDomains', 'ListFieldConflictFilters', 'ListReplicas', 'ListSubtypes', 55 | ... 'ListVersions', 'NumPyArrayToFeatureClass', 'NumPyArrayToTable', 'Replica', 56 | ... 'SearchCursor', 'TableToNumPyArray', 'UpdateCursor', 'Version', 'Walk'...] 57 | 58 | >>> ags.da.FeatureClassToNumPyArray(...) # usage 59 | 60 | >>> from arcpy import (env, AddMessage, Exists) 61 | 62 | >>> from arcpy.management import ( 63 | ... AddField, CopyFeatures, CreateFeatureclass, Delete, MakeFeatureLayer, 64 | ... MultipartToSinglepart, SelectLayerByLocation, XYToLine 65 | ... ) 66 | 67 | >>> from arcpy.analysis import Clip 68 | 69 | 70 | Spyder and conda 71 | ---------------- 72 | When using Spyder, you can access conda. 73 | 74 | Currently, use IPython line magics and change your directory to where conda.exe 75 | resides.
76 | 77 | >>> cd C:\arc_pro\bin\Python\Scripts 78 | >>> conda list # will provide a listing of your packages 79 | 80 | Note: Python resides in... (substitute `arc_pro` for your install folder). 81 | 82 | >>> C:\arc_pro\bin\Python\envs\arcgispro-py3 83 | 84 | """ 85 | # noqa: E401, F401, F403, C0410 86 | # pyflakes: disable=E0401,F403,F401 87 | # pylint: disable=unused-import 88 | # pylint: disable=E0401 89 | 90 | # ---- sys, np imports 91 | import sys 92 | import numpy as np 93 | 94 | # ---- import for npg 95 | # import npg 96 | 97 | from . import npgDocs, npGeo, npg_geom_hlp, npg_geom_ops, npg_helpers 98 | from . import npg_bool_hlp, npg_bool_ops, npg_buffer 99 | from . import npg_min_circ, npg_overlay, npg_analysis, npg_maths 100 | from . import npg_setops, npg_io, npg_table, npg_create, npg_utils 101 | from . import npg_pip, npg_prn # noqa 102 | # -- 103 | # requires arcpy 104 | # npg_create, npg_arc_npg, npg_geom_ops, npg_io 105 | 106 | from . npGeo import * # noqa 107 | from . npg_geom_hlp import * # noqa 108 | from . npg_geom_ops import * # noqa 109 | from . npg_buffer import * # noqa 110 | from . npg_helpers import * # noqa 111 | from . npg_bool_hlp import * # noqa 112 | from . npg_bool_ops import * # noqa 113 | from . npg_maths import * # noqa 114 | from . npg_clip_split import clip_poly, split_poly # noqa 115 | from . npg_io import * # noqa 116 | 117 | # from . npg_pip import * 118 | # from . npg_min_circ import * 119 | # from . npg_overlay import * 120 | # from . npg_analysis import * 121 | # from . npg_setops import * 122 | # from . npg_prn import * 123 | # from . npg_table import * 124 | # from . npg_create import * 125 | # from . 
npg_utils import * 126 | 127 | # ---- docstring info for Geo and some methods 128 | 129 | 130 | # ---- define __all__ 131 | __all__ = [ 132 | 'npgDocs', 'npGeo', 'npg_io', 'npg_geom_ops', 'npg_geom_hlp', 133 | 'npg_helpers', 'npg_bool_hlp', 'npg_bool_ops', 'npg_buffer', 134 | 'npg_overlay', 135 | 'npg_table', 'npg_create', 'npg_analysis', 'npg_maths', 'npg_utils', 136 | 'npg_setops', 'npg_min_circ' 137 | ] 138 | 139 | __helpers__ = [ 140 | 'npGeo_doc', 'Geo_hlp', 'array_IFT_doc', 'dirr_doc', 'shapes_doc', 141 | 'parts_doc', 'outer_rings_doc', 'inner_rings_doc', 'get_shapes_doc', 142 | 'sort_by_extent_doc', 'radial_sort_doc' 143 | ] 144 | 145 | __all__.extend(npgDocs.__all__) 146 | __all__.extend(npGeo.__all__) 147 | __all__.extend(npg_geom_ops.__all__) 148 | __all__.extend(npg_geom_hlp.__all__) 149 | __all__.extend(npg_bool_ops.__all__) 150 | __all__.extend(npg_bool_hlp.__all__) 151 | __all__.extend(npg_buffer.__all__) 152 | __all__.extend(npg_maths.__all__) 153 | __all__.sort() 154 | 155 | __helpers__.extend(npg_geom_hlp.__helpers__) 156 | __helpers__.extend(npg_geom_ops.__helpers__) 157 | __helpers__.extend(npg_bool_ops.__helpers__) 158 | __helpers__.extend(npg_bool_hlp.__helpers__) 159 | __helpers__.extend(npg_buffer.__helpers__) 160 | __helpers__.extend(npg_maths.__helpers__) 161 | __helpers__.sort() 162 | 163 | """ 164 | __all__.extend(npg.npg_io.__all__) 165 | __all__.extend(npg_prn.__all__) 166 | __all__.extend(npg_pip.__all__) 167 | __all__.extend(npg_table.__all__) 168 | __all__.extend(npg_create.__all__) 169 | __all__.extend(npg_analysis.__all__) 170 | __all__.extend(npg_overlay.__all__) 171 | __all__.extend(npg_setops.__all__) 172 | # __all__.extend(npg_min_circ.__all__) 173 | __helpers__.extend(npg_analysis.__helpers__) 174 | __helpers__.extend(npg_overlay.__helpers__) 175 | 176 | """ 177 | msg = """ 178 | ---------------------------------------------- 179 | ---- ... (n)um(p)y (g)eometry ... npg ... ---- 180 | location 181 | ... 
{} 182 | python version and location ... 183 | ... {} 184 | ... {} 185 | numpy version ... 186 | ... {} 187 | Usage... 188 | ... import npg 189 | 190 | Modules not imported by default... 191 | ... npg_arc_npg 192 | ... npg_plots 193 | 194 | ---------------------------------------------- 195 | """ 196 | 197 | pth = __path__[0] #noqa 198 | print(msg.format(pth, sys.version, sys.exec_prefix, np.__version__)) 199 | del pth 200 | del msg 201 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/doc_t.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # noqa: D205, D400, F403 3 | """ 4 | ===== 5 | doc_t 6 | ===== 7 | 8 | **Script** : doc_t.py 9 | 10 | **Author** : Dan_Patterson@carleton.ca 11 | 12 | **Modified** : 2023-08-23 13 | 14 | **Purpose** : Documenting scripts 15 | 16 | .. Table of contents appears first 17 | .. contents:: **Table of Contents** 18 | :depth: 2 19 | 20 | Functions 21 | --------- 22 | 23 | .. currentmodule:: doc_t 24 | .. autofunction:: a 25 | 26 | 27 | .. summary:: 28 | :toctree: 29 | 30 | a - doc_maker.a 31 | 32 | Notes 33 | ----- 34 | None 35 | 36 | References 37 | ---------- 38 | Supposedly a code block below 39 | 40 | .. code-block:: python 41 | 42 | a = np.arange(5) 43 | print(a) 44 | 45 | The functions 46 | ------------- 47 | None so far 48 | 49 | .. 
autosummary:: 50 | :toctree: 51 | 52 | a - doc_maker.a 53 | 54 | """ 55 | import sys 56 | import numpy as np 57 | 58 | import npg # noqa 59 | from npg.npg_utils import * # noqa 60 | 61 | ft = {'bool': lambda x: repr(x.astype(np.int32)), 62 | 'float_kind': '{: 0.3f}'.format} 63 | np.set_printoptions(edgeitems=10, linewidth=80, precision=2, suppress=True, 64 | threshold=100, formatter=ft) 65 | np.ma.masked_print_option.set_display('-') # change to a single - 66 | 67 | script = sys.argv[0] # print this should you need to locate the script 68 | 69 | 70 | # ---------------------------------------------------------------------- 71 | # ---- (6) doc_frmts .... code section ---- 72 | def a(): 73 | """Document format tester. 74 | 75 | .. Is this a comment? Yes it is ... you can't see me 76 | 77 | .. _format: 78 | .. contents:: **Table of Contents** 79 | :depth: 2 80 | 81 | .. note:: This is a **note** box. This one actually works 82 | 83 | .. details :: xxxx 84 | 85 | 86 | Parameters 87 | ---------- 88 | No need for a blank line after the underlines, but you need one after this 89 | line. 90 | 91 | a : stuff 92 | A space before the colon only bold faces before it. The second line 93 | gets a four space indent and the text line wraps. 94 | b: more stuff 95 | But if you don't leave a space before the colon, the whole line gets 96 | bolded. 97 | c : still more 98 | Now what? 99 | 100 | Code samples 101 | ------------ 102 | 103 | .. note:: 104 | 105 | In the example below, line numbers are added to the code block using 106 | the directives below. The second line can contain `lineno-start` or 107 | `linenos` depending on whether you want to number all or number from.:: 108 | 109 | .. code-block:: python 110 | :lineno-start: 5 111 | 112 | It is important to note that the colon ``:`` appears in the 3rd space 113 | directly beneath the space before ``code-block`` 114 | 115 | .. ------------------------------------------------------------------- 116 | .. 
line numbers with code block 117 | .. code-block:: python 118 | :lineno-start: 5 119 | 120 | a = np.arange(5) 121 | print(a) 122 | 123 | .. ------------------------------------------------------------------- 124 | .. Code block removed. 125 | 126 | The code-block without line numbers. 127 | 128 | .. code-block:: python 129 | 130 | a = np.arange(5) 131 | print(a) 132 | 133 | .. ------------------------------------------------------------------- 134 | 135 | Bullet lists 136 | ------------ 137 | Bullet lists need a blank line below and a blank line after 138 | 139 | - ``double backticks`` : monospaced 140 | - `single backticks` : italics 141 | - *one star* : italics 142 | - **two star** : bold 143 | 144 | So bullet is done. 145 | 146 | You can maintain indentation using vertical bars but the lines need to be 147 | indented. 148 | 149 | | Vertical bars with at least 1 space before 150 | | keep stuff lined up without needing a blank line. 151 | 152 | | Indentation and a blank line before 153 | | control the spaces before the bar 154 | but as soon as it is gone, it wraps. 155 | 156 | 157 | Test section 158 | ------------ 159 | 160 | **autosummary directive** 161 | 162 | Now this is cool. You will notice that I imported the ``npg_utils`` 163 | module at the beginning of the script. The ``autosummary`` directive can 164 | be used to produce a small table of contents of the methods/functions 165 | contained in it. The directive brings in the first line of the docstring. 166 | 167 | .. autosummary:: 168 | 169 | npg_utils.time_deco - Runs timing functions 170 | npg_utils.run_deco - Reports run information 171 | npg_utils.get_func - Retrieves function information 172 | npg_utils.get_module_info - Retrieves module information 173 | 174 | 175 | Warnings 176 | -------- 177 | numpydoc supports a main `warnings` section. Specific warnings within this 178 | or other sections can be raised as well. 179 | 180 | Warnings begin with ``.. 
warnings::`` but they don't show in the function 181 | help if the warning text is on the same line. 182 | 183 | .. warnings:: can't see me. 184 | 185 | Hidden warning above. Now a real warning. 186 | 187 | .. warning:: 188 | 189 | This is a real warning because the content is on a separate line. 190 | 191 | .. seealso:: This is a simple **seealso** note. 192 | 193 | List with 3 columns 194 | ------------------- 195 | 196 | .. hlist:: 197 | :columns: 3 198 | 199 | * hlist needed 200 | * so is columns 201 | * 3 is the number of columns 202 | * this is the last of col 2 203 | * and column 3 straggler 204 | 205 | Another section 206 | --------------- 207 | Which gets picked up until you turn autosummary off. 208 | 209 | References 210 | ---------- 211 | ``_. 213 | 214 | Headings 215 | -------- 216 | Created by underlining ``Heading`` with a dash - . 217 | 218 | Heading 219 | ======= 220 | Created by underlining ``Heading`` with equals = . 221 | 222 | """ 223 | print(a.__doc__) 224 | 225 | 226 | # ---------------------------------------------------------------------- 227 | # __main__ .... code section 228 | if __name__ == "__main__": 229 | """Optionally... 230 | : - print the script source name. 
231 | : - run the _demo 232 | """ 233 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/extras/README.md: -------------------------------------------------------------------------------- 1 | Extra scripts 2 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/extras/hulls.py: -------------------------------------------------------------------------------- 1 | # -*- coding: UTF-8 -*- 2 | # noqa: D205, D400 3 | r""" 4 | ===== 5 | hulls 6 | ===== 7 | 8 | Script : 9 | hulls.py 10 | 11 | Author : 12 | danpatterson@cunet.carleton.ca 13 | 14 | Modified : 15 | 2022-02-08 16 | 17 | Purpose : 18 | Working with numpy arrays to determine convex and concave hulls 19 | 20 | References 21 | ---------- 22 | `'_. 24 | 25 | `'_. 26 | 27 | `'_. 29 | 30 | `'_. 32 | :---------------------------------------------------------------------: 33 | """ 34 | 35 | # ---- imports, formats, constants ---- 36 | import sys 37 | import numpy as np 38 | from numpy.lib.recfunctions import structured_to_unstructured as stu 39 | from arcpytools_pnt import tweet, output_polylines, output_polygons 40 | import arcpy 41 | 42 | import math 43 | 44 | arcpy.env.overwriteOutput = True 45 | 46 | ft = {'bool': lambda x: repr(x.astype(np.int32)), 47 | 'float_kind': '{: 0.3f}'.format} 48 | np.set_printoptions(edgeitems=10, linewidth=80, precision=2, suppress=True, 49 | threshold=100, formatter=ft) 50 | np.ma.masked_print_option.set_display('-') # change to a single - 51 | 52 | script = sys.argv[0] # print this should you need to locate the script 53 | 54 | PI = math.pi 55 | 56 | 57 | msg = """\n 58 | ----------------------------------------------------------------------- 59 | ---- Concave/convex hull ---- 60 | script {} 61 | Testing {} 62 | in_fc {} 63 | group_by {} 64 | k_factor {} 65 | hull_type {} 66 | out_type {} 67 | out_fc {} 68 | ----------------------------------------------------------------------- 69 | 70 | 
""" 71 | 72 | 73 | def pnt_in_list(pnt, pnts_list): 74 | """Check to see if a point is in a list of points.""" 75 | is_in = np.any([np.isclose(pnt, i) for i in pnts_list]) 76 | return is_in 77 | 78 | 79 | def intersects(*args): 80 | """Line intersection check. Two lines or 4 points that form the lines. 81 | 82 | Requires 83 | -------- 84 | - intersects(line0, line1) or intersects(p0, p1, p2, p3) 85 | - p0, p1 -> line 1 86 | - p2, p3 -> line 2 87 | 88 | Returns 89 | ------- 90 | boolean, if the segments do intersect 91 | 92 | References 93 | ---------- 94 | ``_. 96 | 97 | """ 98 | if len(args) == 2: 99 | p0, p1, p2, p3 = *args[0], *args[1] 100 | elif len(args) == 4: 101 | p0, p1, p2, p3 = args 102 | else: 103 | raise AttributeError("Pass 2, 2-pnt lines or 4 points to the function") 104 | # 105 | # ---- First check ---- np.cross(p1-p0, p3-p2 ) 106 | p0_x, p0_y, p1_x, p1_y, p2_x, p2_y, p3_x, p3_y = *p0, *p1, *p2, *p3 107 | s10_x = p1_x - p0_x 108 | s10_y = p1_y - p0_y 109 | s32_x = p3_x - p2_x 110 | s32_y = p3_y - p2_y 111 | denom = s10_x * s32_y - s32_x * s10_y 112 | if denom == 0.0: 113 | return False 114 | # 115 | # ---- Second check ---- np.cross(p1-p0, p0-p2 ) 116 | den_gt0 = denom > 0 117 | s02_x = p0_x - p2_x 118 | s02_y = p0_y - p2_y 119 | s_numer = s10_x * s02_y - s10_y * s02_x 120 | if (s_numer < 0) == den_gt0: 121 | return False 122 | # 123 | # ---- Third check ---- np.cross(p3-p2, p0-p2) 124 | t_numer = s32_x * s02_y - s32_y * s02_x 125 | if (t_numer < 0) == den_gt0: 126 | return False 127 | # 128 | if ((s_numer > denom) == den_gt0) or ((t_numer > denom) == den_gt0): 129 | return False 130 | # 131 | # ---- check to see if the intersection point is one of the input points 132 | t = t_numer / denom 133 | # substitute p0 in the equation 134 | x = p0_x + (t * s10_x) 135 | y = p0_y + (t * s10_y) 136 | # be careful that you are comparing tuples to tuples, lists to lists 137 | if sum([(x, y) == tuple(i) for i in [p0, p1, p2, p3]]) > 0: 138 | return False 139 | 
return True 140 | 141 | 142 | def angle(p0, p1, prv_ang=0): 143 | """Angle between two points and the previous angle, or zero.""" 144 | ang = math.atan2(p0[1] - p1[1], p0[0] - p1[0]) 145 | a0 = (ang - prv_ang) 146 | a0 = a0 % (PI * 2) - PI 147 | return a0 148 | 149 | 150 | def point_in_polygon(pnt, poly): # pnt_in_poly(pnt, poly): # 151 | """Point is in polygon. ## fix this and use pip from arraytools.""" 152 | x, y = pnt 153 | N = len(poly) 154 | for i in range(N): 155 | x0, y0, xy = [poly[i][0], poly[i][1], poly[(i + 1) % N]] 156 | c_min = min([x0, xy[0]]) 157 | c_max = max([x0, xy[0]]) 158 | if c_min < x <= c_max: 159 | p = y0 - xy[1] 160 | q = x0 - xy[0] 161 | y_cal = (x - x0) * p / q + y0 162 | if y_cal < y: 163 | return True 164 | return False 165 | 166 | 167 | def knn(pnts, p, k): 168 | """Calculate k nearest neighbours for a given point. 169 | 170 | :param pnts: list of points 171 | :param p: reference point 172 | :param k: number of neighbours 173 | :return: list 174 | """ 175 | s = sorted(pnts, 176 | key=lambda x: math.sqrt((x[0]-p[0])**2 + (x[1]-p[1])**2))[0:k] 177 | return s 178 | 179 | 180 | def knn0(pnts, p, k): 181 | """Calculate k nearest neighbours for a given point. 182 | 183 | pnts : array 184 | list of points 185 | p : two number array-like 186 | reference point 187 | k : integer 188 | number of neighbours 189 | Returns 190 | ------- 191 | list of the k nearest neighbours, based on squared distance 192 | """ 193 | p = np.asarray(p) 194 | pnts = np.asarray(pnts) 195 | diff = pnts - p[np.newaxis, :] 196 | d = np.einsum('ij,ij->i', diff, diff) 197 | idx = np.argsort(d)[:k] 198 | # s = [i.tolist() for i in pnts[idx]] 199 | return pnts[idx].tolist() 200 | 201 | 202 | def concave(points, k): 203 | """Calculate the concave hull for given points. 204 | 205 | Requires 206 | -------- 207 | points : initially the input set of points with duplicates removed and 208 | sorted on the Y value first, lowest Y at the top (?) 
209 | k : initially the number of points to start forming the concave hull, 210 | k will be the initial set of neighbors 211 | 212 | Notes 213 | ----- 214 | This recursively calls itself to check concave hull. 215 | 216 | : p_set - The working copy of the input points 217 | :----- 218 | """ 219 | k = max(k, 3) # Make sure k >= 3 220 | p_set = list(set(points[:])) # Remove duplicates if not done already 221 | if len(p_set) < 3: 222 | raise Exception("p_set length cannot be smaller than 3") 223 | elif len(p_set) == 3: 224 | return p_set # Points are a polygon already 225 | k = min(k, len(p_set) - 1) # Make sure k neighbours can be found 226 | 227 | frst_p = cur_p = min(p_set, key=lambda x: x[1]) 228 | hull = [frst_p] # Initialize hull with first point 229 | p_set.remove(frst_p) # Remove first point from p_set 230 | prev_ang = 0 231 | 232 | while (cur_p != frst_p or len(hull) == 1) and len(p_set) > 0: 233 | if len(hull) == 3: 234 | p_set.append(frst_p) # Add first point again 235 | knn_pnts = knn(p_set, cur_p, k) # knn or knn0 236 | cur_pnts = sorted(knn_pnts, key=lambda x: -angle(x, cur_p, prev_ang)) 237 | 238 | its = True 239 | i = -1 240 | while its and i < len(cur_pnts) - 1: 241 | i += 1 242 | last_point = 1 if cur_pnts[i] == frst_p else 0 243 | j = 1 244 | its = False 245 | while not its and j < len(hull) - last_point: 246 | its = intersects(hull[-1], cur_pnts[i], hull[-j - 1], hull[-j]) 247 | j += 1 248 | if its: # All points intersect, try a higher number of neighbours 249 | return concave(points, k + 1) 250 | prev_ang = angle(cur_pnts[i], cur_p) 251 | cur_p = cur_pnts[i] 252 | hull.append(cur_p) # Valid candidate was found 253 | p_set.remove(cur_p) 254 | 255 | for point in p_set: 256 | if not point_in_polygon(point, hull): 257 | return concave(points, k + 1) 258 | # 259 | return hull 260 | 261 | 262 | # ---- convex hull ---------------------------------------------------------- 263 | # 264 | def cross(o, a, b): 265 | """Cross-product for vectors o-a and 
o-b.""" 266 | xo, yo = o 267 | xa, ya = a 268 | xb, yb = b 269 | return (xa - xo)*(yb - yo) - (ya - yo)*(xb - xo) 270 | # return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]) 271 | 272 | 273 | def convex(points): 274 | """Calculate the convex hull for given points. 275 | 276 | Input is a list of 2D points [(x, y), ...] 277 | """ 278 | points = sorted(set(points)) # Remove duplicates 279 | if len(points) <= 1: 280 | return points 281 | # Build lower hull 282 | lower = [] 283 | for p in points: 284 | while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0: 285 | lower.pop() 286 | lower.append(p) 287 | # Build upper hull 288 | upper = [] 289 | for p in reversed(points): 290 | while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0: 291 | upper.pop() 292 | upper.append(p) 293 | print("lower\n{}\nupper\n{}".format(lower, upper)) 294 | return np.array(lower[:-1] + upper) # upper[:-1]) # for open loop 295 | 296 | 297 | # ---- (1) Run the analysis 298 | def _run_(args): 299 | """Run the analysis.""" 300 | in_fc, group_by, k_factor, hull_type, out_type, out_fc = args 301 | desc = arcpy.da.Describe(in_fc) 302 | SR = desc['spatialReference'] 303 | # 304 | # (1) -- get the points 305 | in_flds = ['OID@', 'SHAPE@X', 'SHAPE@Y'] + group_by 306 | a = arcpy.da.FeatureClassToNumPyArray(in_fc, in_flds, "", SR, True) 307 | # 308 | # (2) -- determine the unique groupings of the points 309 | if group_by: 310 | uniq, idx, rev = np.unique(a[group_by], True, True) 311 | groups = [a[np.where(a[group_by] == i)[0]] for i in uniq] 312 | else: 313 | groups = [a] 314 | # 315 | # (3) -- for each group, perform the concave hull 316 | hulls = [] 317 | for i, group in enumerate(groups): 318 | p = groups[i] 319 | p = p[['SHAPE@X', 'SHAPE@Y']] 320 | p = stu(p) 321 | # 322 | # -- point preparation section 323 | p = np.array(list(set([tuple(i) for i in p]))) # Remove duplicates 324 | idx_cr = np.lexsort((p[:, 0], p[:, 1])) # indices of sorted array 325 | in_pnts = 
np.asarray([p[i] for i in idx_cr]) # p[idx_cr] # 326 | in_pnts = in_pnts.tolist() 327 | in_pnts = [tuple(i) for i in in_pnts] 328 | if hull_type == 'concave': 329 | cx = np.array(concave(in_pnts, k_factor)) # needs a list of tuples 330 | else: 331 | cx = np.array(convex(in_pnts)) 332 | hulls.append(cx.tolist()) 333 | # ---- 334 | # 335 | if out_type == 'Polyline': 336 | output_polylines(out_fc, SR, [hulls]) 337 | elif out_type == 'Polygon': 338 | output_polygons(out_fc, SR, [hulls]) 339 | else: 340 | for i in hulls: 341 | print("Hulls\n{}".format(np.array(i))) 342 | return hulls 343 | 344 | 345 | # ---------------------------------------------------------------------- 346 | # .... running script or testing code section 347 | def _tool(): 348 | """Run when script is from a tool.""" 349 | in_fc = sys.argv[1] 350 | group_by = str(sys.argv[2]) 351 | k_factor = int(sys.argv[3]) 352 | hull_type = str(sys.argv[4]) 353 | out_type = str(sys.argv[5]) 354 | out_fc = sys.argv[6] 355 | args = [in_fc, group_by, k_factor, hull_type, out_type, out_fc] 356 | hulls = _run_(args) 357 | return hulls 358 | 359 | 360 | def _testing_(args): 361 | """Run a test.""" 362 | script, testing, in_fc, group_by = args[:4] 363 | k_factor, hull_type, out_type, out_fc = args[4:] 364 | tweet(msg.format(*args)) 365 | # 366 | args = [in_fc, group_by, k_factor, hull_type, out_type, out_fc] 367 | hulls = _run_(args) 368 | return hulls 369 | 370 | 371 | # ---------------------------------------------------------------------- 372 | # __main__ .... code section 373 | if __name__ == "__main__": 374 | """Optionally... 375 | : - print the script source name. 
376 | : - run the _demo 377 | """ 378 | testing = True 379 | gdb = r"C:\Arc_projects\Test_29\Test_29.gdb" 380 | in_fc = r"C:\Arc_projects\Test_29\Test_29.gdb\some_pnts" 381 | if testing: 382 | group_by = ['Group_'] # or [] [Group_] 383 | k_factor = 3 # minimum 3 384 | hull_type = "concave" 385 | out_type = "" # "Polygon" or "Polyline" for output 386 | name = "concave02" 387 | out_fc = rf"{gdb}\{name}" 388 | args = [script, testing, in_fc, group_by, k_factor, 389 | hull_type, out_type, out_fc] 390 | hulls = _testing_(args) 391 | else: 392 | _tool() 393 | """ 394 | gdb_pth = "/".join(script.split("/")[:-2]) + "/Data/Point_tools.gdb" 395 | 396 | if len(sys.argv) == 1: 397 | testing = True 398 | in_fc = gdb_pth + r"/r_sorted" 399 | group_by = 'Group_' 400 | k_factor = 3 401 | hull_type = 'concave' # 'convex' 402 | out_type = 'Polyline' 403 | out_fc = gdb_pth + r"/r_11" 404 | else: 405 | testing = False 406 | in_fc, group_by, k_factor, hull_type, out_type, out_fc = _tool() 407 | 408 | """ 409 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/extras/npgCondaPkgs.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- # noqa 2 | 3 | r""" 4 | ------------ 5 | npgCondaPkgs 6 | ------------ 7 | 8 | Script : 9 | npgCondaPkgs.py 10 | 11 | Author : 12 | Dan_Patterson@carleton.ca 13 | 14 | Modified : 15 | 2023-11-22 16 | 17 | Purpose 18 | ------- 19 | Determine conda package information related to arcgis pro. There are two 20 | variants. 21 | 22 | - pkg_info_json(folder=None) 23 | This uses json files in a specific folder in the arcgis installation path by 24 | default, or you can specify a different folder containing json files. 25 | 26 | - package_info_conda(folder=None) 27 | The user profile installation path. 28 | 29 | Notes 30 | ----- 31 | The following can be used if you need an output text file. 
32 | 33 | out_file : text *`currently commented out`* 34 | The filename if an output `*.csv` is desired, e.g. `c:/temp/out.csv'. 35 | The file's field names are `Package, Filename, Dependencies`. The 36 | latter field is delimited by `; ` (semicolon space) if there are 37 | multiple dependencies. 38 | # if out_file: # commented out to just return arrays 39 | hdr = ", ".join(packages.dtype.names) 40 | frmt = "%s, %s, %s" 41 | np.savetxt(out_file, packages, fmt=frmt, header=hdr, comments="") 42 | print("\nFile saved...") 43 | 44 | To get a dictionary of packages/dependencies:: 45 | 46 | a = packages['Filename'].tolist() 47 | b = packages['Dependencies'].tolist() 48 | b0 = [i.split("; ") for i in b] 49 | z ={a : b for a, b in list(zip(a, b))} 50 | 51 | To reverse it use `reverse_dict`: 52 | """ 53 | 54 | import sys 55 | import os 56 | from textwrap import dedent, wrap 57 | import numpy as np 58 | from numpy.lib.recfunctions import unstructured_to_structured as uts 59 | from pathlib import Path 60 | import json 61 | #from arcpy.da import NumPyArrayToTable # noqa 62 | 63 | ft = {'bool': lambda x: repr(x.astype(np.int32)), 64 | 'float_kind': '{: 0.3f}'.format} 65 | np.set_printoptions(edgeitems=10, linewidth=80, precision=2, suppress=True, 66 | threshold=300, formatter=ft) 67 | np.ma.masked_print_option.set_display('-') # change to a single - 68 | 69 | script = sys.argv[0] # print this should you need to locate the script 70 | 71 | 72 | def reverse_dict(d, sort_keys=True): 73 | """Reverse a dictionary's keys and values.""" 74 | d_inv = {} 75 | for k, v in d.items(): 76 | if not isinstance(v, (list, tuple)): 77 | v = [v] 78 | for i in v: 79 | if i in d_inv: 80 | d_inv[i].append(k) 81 | else: 82 | d_inv[i] = [k] 83 | if sort_keys: 84 | return dict(sorted(d_inv.items())) 85 | return d_inv 86 | 87 | 88 | def parse_json(f, key="depends"): 89 | """Parse the json file.""" 90 | with open(f, "r") as f: 91 | data = json.load(f) 92 | keys = list(data.keys()) 93 | if key in keys: 94 
| return data[key] 95 | else: 96 | return [] 97 | 98 | 99 | def pkg_dependencies(pkg='arcpy-base', 100 | skip=['python', 'python >3.6', 101 | 'python >3.', 'python >=3.9,<3.10.0a0'], 102 | folder=None, 103 | sort_keys=True): 104 | r"""Access specific information for the specified `pkg`. 105 | 106 | Parameters 107 | ---------- 108 | pkg : text 109 | The package name. 110 | skip : None, text or list of text 111 | `None`, doesn't exclude packages. Use `text` for a single package 112 | exclusion, otherwise provide a list of package names. 113 | folder : text 114 | Path to the folder containing the `json` files. 115 | SEE `pkg_info_json` for details. 116 | sort_keys : boolean 117 | `True`, returns the packages in sorted order, `False` otherwise. 118 | 119 | """ 120 | packages = pkg_info_json(folder=folder, all_info=False) 121 | chk = packages[packages['Package'] == pkg] 122 | if packages is None or chk.size == 0: 123 | msg = "\nPackage json or Folder is incorrect, or doesn't exist." 124 | print(msg) 125 | return msg, None 126 | frst = chk[0] 127 | r0, r1, r2 = frst 128 | deps = [i.strip() for i in r2.split(";")] 129 | if not isinstance(deps, (list, tuple)): 130 | deps = [deps] 131 | msg = dedent(""" 132 | name : {} 133 | file : {} 134 | omitting : {} 135 | dependencies : 136 | """) 137 | msg = msg.format(r0, r1, skip) 138 | uniq = [] 139 | others = [] 140 | uni_dict = {} 141 | for d in deps: 142 | sub = d.split(" ")[0] 143 | if skip is not None: 144 | if sub == skip or sub in skip: 145 | msg += "\nSkipping ... 
" + sub 146 | continue # -- skip out 147 | if d == 'None': 148 | msg += "\n - {} \n{}".format(d, d) 149 | else: 150 | nxt = packages[packages['Package'] == sub] 151 | if nxt.size == 0: # -- check size before indexing 152 | msg += "\n - None" 153 | else: 154 | others.append(nxt[0][0]) 155 | r0, r1, r2 = nxt[0] 156 | r3 = [i.strip() for i in r2.split(";")] 157 | r3 = [i for i in r3 if i not in skip] 158 | # deps = "\n ".join(r3) 159 | deps = wrap(", ".join(r3), 70) 160 | deps = "\n ".join([i for i in deps]) 161 | uniq.append(sub) 162 | uniq.extend(r3) 163 | uni_dict[sub] = r3 164 | msg += "\n - {}\n {}".format(r0, deps) 165 | uniq2 = [i.split(" ")[0] for i in uniq] 166 | # uni = np.unique(np.asarray(uniq2)) 167 | uni, cnts = np.unique(np.asarray(uniq2), return_counts=True) 168 | row_frmt = " {!s:<15} {:>3.0f}\n" 169 | out = "" 170 | for cnt, u in enumerate(uni): 171 | out += row_frmt.format(u, cnts[cnt]) 172 | # out = ", ".join([i for i in uni if i != frst[0]]) 173 | # out = wrap(out, 70) 174 | msg += dedent("""\n 175 | General dependencies : 176 | Name Count 177 | ---- ----- 178 | """) 179 | msg += out 180 | if sort_keys: 181 | srt_dct = dict(sorted(uni_dict.items())) 182 | return msg, srt_dct, others 183 | return msg, uni_dict, others 184 | 185 | 186 | def pkg_info_json(folder=None, keyword='depends', all_info=False): 187 | r"""Access package info from `*.json` files in a `folder`. 188 | 189 | Parameters 190 | ---------- 191 | folder : text 192 | File path to the `json` file. See below. 193 | keyword : text 194 | See the options in `Notes`. `depends` is the default. 195 | all_info : boolean 196 | If True, a list of packages, their dependency counts and the modules 197 | it requires is returned. If False, just a dependency list is returned. 198 | 199 | The `folder` parameter can be derived from:: 200 | 201 | >>> sys.prefix 202 | ... r"C:\arc_pro\bin\Python\envs\arcgispro-py3" 203 | ... 
r"C:\arc_pro\bin\Python\pkgs\cache" 204 | >>> # ---- conda-meta is appended to yield `folder`, see *`Example`* 205 | 206 | Notes 207 | ----- 208 | The keyword to search on is **depends**. 209 | Other options in json files include:: 210 | 211 | arch, auth, build, build_number, channel, depends, files, fn, 212 | has_prefix, license, link, md5, name, noarch, platform, preferred_env, 213 | priority, requires, schannel, size, subdir, timestamp, url, version, 214 | with_features_depends 215 | 216 | Example 217 | ------- 218 | folder = "C:/...install path/bin/Python/envs/arcgispro-py3/conda-meta" :: 219 | 220 | # -- folder examples 221 | install_path = r"C:\arc_pro" 222 | sys.prefix 223 | ... 'C:\\arc_pro\\bin\\Python\\envs\\arcgispro-py3' # or 224 | ... install_path + r"\bin\Python\envs\arcgispro-py3 225 | # 226 | folder = install_path + r"\bin\Python\envs\arcgispro-py3\conda-meta" 227 | # 228 | # packages are unpacked to: 229 | # r"C:\Users\...You...\AppData\Local\ESRI\conda\pkgs" 230 | 231 | >>> folder = sys.prefix + r"\conda-meta" 232 | >>> out_ = pkg_info_json(folder, keyword='depends', all_info=True) 233 | >>> packages, dep_counts, required_by = out_ 234 | # create the output 235 | >>> f0 = r"C:\arcpro_npg\Project_npg\npgeom.gdb\dep_pkg_info" 236 | >>> f1 = r"C:\arcpro_npg\Project_npg\npgeom.gdb\dep_counts" 237 | >>> f2 = r"C:\arcpro_npg\Project_npg\npgeom.gdb\dep_required_by" 238 | >>> arcpy.da.NumPyArrayToTable(packages, f0) 239 | >>> arcpy.da.NumPyArrayToTable(dep_counts, f1) 240 | >>> arcpy.da.NumPyArrayToTable(required_by, f2) 241 | 242 | print required by:: 243 | msg = "" 244 | for r in required_by: 245 | pkg = r[0] 246 | req = r[1] 247 | msg += "{} : {}\n".format(pkg, req) 248 | print(msg) 249 | """ 250 | # -- Checks 251 | if not folder: 252 | folder = sys.prefix + "\\conda-meta" 253 | folder = Path(folder) 254 | if not folder.is_dir(): 255 | print("\nInvalid path... 
{}".format(folder)) 256 | return None 257 | files = list(folder.glob("*.json")) 258 | if not files: 259 | print("{} doesn't have any json files".format(folder)) 260 | return None 261 | # 262 | # -- Package, Filename, Dependencies 263 | packages = [] 264 | m0 = m1 = m2 = 0 265 | for f in files: 266 | ret = parse_json(f, key=keyword) # ---- look at dependencies only 267 | nme = str(f.name).rsplit("-", 2)[0] # ---- split off the last two 268 | if len(ret) == 1: 269 | ret = ret[0] 270 | elif len(ret) > 1: 271 | srted = sorted(ret) 272 | ret = "; ".join([i for i in srted]) # `; ` used 273 | else: 274 | ret = "None" 275 | m0 = max(m0, len(nme)) 276 | m1 = max(m1, len(str(f.name))) 277 | m2 = max(m2, len(ret)) 278 | packages.append((nme, f.name, ret)) 279 | dt1 = [("Package", " 0: 308 | v = w.tolist() 309 | v0 = ", ".join([i.split(" ")[0] for i in v]) 310 | max_len = max(max_len, len(v0)) 311 | required_by.append([nme, v0]) 312 | else: # -- no dependencies, hence `None` 313 | max_len = max(max_len, 4) 314 | required_by.append([nme, "None"]) 315 | r_dt = ">> pckg_info[pckg_info['Package'] == 'jupyter_client'] 348 | ... array([ 349 | ... ('jupyter_client', '6.1.12', 'pyhd3eb1b0_0', 350 | ... 'jupyter_core -dateutil pyzmq tornado traitlets'), 351 | ... ('jupyter_client', '7.3.5', 'py39haa95532_0', 352 | ... 'entrypoints jupyter_core nest-asyncio -dateutil pyzmq 353 | ... tornado traitlets')], 354 | ... dtype=[('Package', '>> pckg_deps[pckg_deps['Package'] == 'jupyter_client'] 360 | ... array([ 361 | ... ('jupyter_client', 362 | ... 'arcgis ipykernel ipykernel jupyter_console jupyter_server 363 | ... nbclient notebook qtconsole spyder-kernels spyder-kernels'), 364 | ... ('jupyter_client', 365 | ... 'arcgis ipykernel ipykernel jupyter_console jupyter_server 366 | ... nbclient notebook qtconsole spyder-kernels spyder-kernels')], 367 | ... 
dtype=[('Package', ' 0: 403 | v = names[w].tolist() 404 | v0 = " ".join([i.split(" ")[0] for i in v]) 405 | max_len = max(max_len, len(v0)) 406 | out2.append([nme, v0]) 407 | else: 408 | out2.append([nme, "None"]) 409 | r_dt = "`_. 19 | 20 | Modified : 21 | 2025-02-17 22 | 23 | Purpose 24 | ------- 25 | General helper functions. 26 | 27 | Notes 28 | ----- 29 | To add 30 | """ 31 | 32 | import sys 33 | import numpy as np 34 | import npg # noqa 35 | 36 | script = sys.argv[0] 37 | 38 | __all__ = [ 39 | 'cartesian_product', # (2) main functions 40 | 'remove_seq_dupl', 41 | 'separate_string_number', 42 | 'sequences', 43 | 'stride_2d', 44 | 'uniq_1d', 45 | 'uniq_2d', 46 | 'flatten', 47 | 'unpack' 48 | ] 49 | 50 | __helpers__ = [ # (1) private helpers 51 | '_base_', 52 | '_isin_2d_', 53 | '_iterate_', 54 | '_to_lists_', 55 | '_view_as_struct_' 56 | ] 57 | 58 | # __imports__ = ['roll_arrays'] 59 | 60 | 61 | # ---- --------------------------- 62 | # ---- (1) private helpers 63 | def _base_(a): 64 | """Return the base array of a Geo array. Shave off microseconds.""" 65 | if hasattr(a, "IFT"): 66 | return a.XY 67 | return a 68 | 69 | 70 | def _isin_2d_(a, b, as_integer=False): 71 | """Perform a 2d `isin` check for 2 arrays. 72 | 73 | Parameters 74 | ---------- 75 | a, b : arrays 76 | The arrays to compare. 77 | as_integer : boolean 78 | False, returns a list of booleans. True, returns an integer array 79 | which may useful for some operations. 
80 | 81 | Example 82 | ------- 83 | >>> a = np.array([[ 5.00, 10.00], [ 5.00, 12.00], [ 6.00, 12.00], 84 | [ 8.00, 12.00], [ 8.00, 11.00], [ 5.00, 10.00]]) 85 | >>> b = np.array([[ 5.00, 12.00], [ 5.00, 15.00], [ 7.00, 14.00], 86 | [ 6.00, 12.00], [ 5.00, 12.00]]) 87 | >>> w0 = (a[:, None] == b).all(-1).any(-1) 88 | array([0, 1, 1, 0, 0, 0]) 89 | >>> a[w0] 90 | array([[ 5.00, 12.00], [ 6.00, 12.00]]) 91 | >>> w1 = (b[:, None] == a).all(-1).any(-1) 92 | >>> b[w1] 93 | array([[ 5.00, 12.00], [ 6.00, 12.00], [ 5.00, 12.00]]) 94 | 95 | Reference 96 | --------- 97 | ``_. 98 | """ 99 | a = _base_(a) 100 | b = _base_(b) 101 | out = (a[:, None] == b).all(-1).any(-1) 102 | if as_integer: 103 | return out.astype('int') 104 | return out.tolist() 105 | 106 | 107 | def _iterate_(N, n): 108 | """Return combinations for array lengths.""" 109 | import itertools 110 | combos = itertools.combinations(np.arange(N), n) 111 | return list(combos) 112 | 113 | 114 | def _to_lists_(a, outer_only=True): 115 | """Return list or list of lists for a Geo or ndarray. 116 | 117 | Parameters 118 | ---------- 119 | a : array-like 120 | Either a Geo array or ndarray. 121 | outer_only : boolean 122 | True, returns the outer-rings of a Geo array. False, returns the bit. 123 | 124 | See Also 125 | -------- 126 | `Geo_to_lists`, `Geo_to_arrays` if you want to maintain the potentially 127 | nested structure of the geometry. 128 | """ 129 | if hasattr(a, "IFT"): 130 | if outer_only: 131 | return a.outer_rings(False) # a.bits 132 | return a.bits 133 | if isinstance(a, np.ndarray): 134 | if a.dtype.kind == 'O': 135 | return a 136 | if a.ndim == 2: 137 | return [a] 138 | if a.ndim == 3: 139 | return list(a) 140 | return a # a list already 141 | 142 | 143 | def _view_as_struct_(a, return_all=False): 144 | """Key function to get uniform 2d arrays to be viewed as structured arrays. 145 | 146 | A bit of trickery, but it works for all set-like functionality. 147 | Use `uts` for more complicated dtypes. 
148 | 149 | Parameters 150 | ---------- 151 | a : array 152 | Geo array or ndarray to be viewed. 153 | 154 | Returns 155 | ------- 156 | Array view as structured/recarray, with shape = (N, 1) 157 | 158 | References 159 | ---------- 160 | See `unstructured_to_structured` in... numpy/lib/recfunctions.py 161 | 162 | >>> from numpy.lib.recfunctions import unstructured_to_structured as uts 163 | """ 164 | shp = a.shape 165 | dt = a.dtype 166 | a_view = a.view(dt.descr * shp[1])[..., 0] 167 | if return_all: 168 | return a_view, shp, dt 169 | return a_view 170 | 171 | 172 | # --------------------------- 173 | # ---- (2) main functions 174 | # 175 | # 176 | 177 | def cartesian_product(sequences): 178 | """Construct an index grid using 1D array_like sequences. 179 | 180 | sequences : array_like 181 | At least 2 array_like sequences to form the indices/product. 182 | 183 | Example 184 | ------- 185 | >>> cartesian_product([[0, 1], [0, 1, 2]]) 186 | ...array([[0, 0], 187 | ... [0, 1], 188 | ... [0, 2], 189 | ... [1, 0], 190 | ... [1, 1], 191 | ... [1, 2]]) 192 | >>> cartesian_product([[0], [2, 3], [5, 4]]) 193 | ...array([[0, 2, 5], 194 | ... [0, 2, 4], 195 | ... [0, 3, 5], 196 | ... [0, 3, 4]]) 197 | 198 | Reference 199 | --------- 200 | ``_. 202 | """ 203 | arrays = [np.array(i) for i in sequences] 204 | len_ = len(arrays) 205 | dtype = np.result_type(*arrays) 206 | arr = np.empty([len(a) for a in arrays] + [len_], dtype=dtype) 207 | for i, a in enumerate(np.ix_(*arrays)): 208 | arr[..., i] = a 209 | return arr.reshape(-1, len_) 210 | 211 | 212 | def remove_seq_dupl(a): 213 | """Remove sequential duplicates from an array. 214 | 215 | The array is stacked with the first value in the sequence to retain it. 216 | Designed to remove sequential duplicates in point arrays. 
217 | """ 218 | uni = a[np.where(a[:-1] != a[1:])[0] + 1] 219 | if a.ndim == 1: 220 | uni = np.hstack((a[0], uni)) 221 | else: 222 | uni = np.vstack((a[0], uni)) 223 | uni = np.ascontiguousarray(uni) 224 | return uni 225 | 226 | 227 | 228 | def separate_string_number(string, as_list=False): 229 | """Return a string split into strings and numbers, as a list. 230 | 231 | z = 'Pj 60Pt 30Bw 10' 232 | z0 = 'PJ60PT30BW10' 233 | separate_string_number(z) 234 | separate_string_number(z0) 235 | returned value 236 | ['Pj', '60', 'Pt', '30', 'Bw', '10'] 237 | 238 | separate_string_number("A .1 in the 1.1 is not 1") 239 | ['A', '.1', 'in', 'the', '1.1', 'is', 'not', '1'] 240 | 241 | Modified from https://stackoverflow.com/a/57359921/6828711 242 | """ 243 | groups = [] 244 | prev = string[0] 245 | newword = string[0] 246 | if len(string) <= 1: 247 | return [string] 248 | for x, i in enumerate(string[1:]): 249 | if i.isalpha() and prev.isalpha(): 250 | newword += i 251 | elif (i.isnumeric() or i == '.') and (prev.isnumeric() or prev == '.'): 252 | newword += i 253 | else: 254 | groups.append(newword.strip()) 255 | newword = i 256 | prev = i 257 | if x == len(string) - 2: 258 | groups.append(newword.strip()) # strip any spaces 259 | newword = '' 260 | # remove extraneous space values in groups 261 | groups = [i for i in groups if i != ''] 262 | # -- for arrays 263 | # np.asarray(np.split(groups, groups.size//2) # split into pairs.) 264 | if as_list: 265 | return groups 266 | # -- pair values, special case 267 | s = " ".join(["".join(groups[pos:pos + 2]) 268 | for pos in range(0, len(groups), 2)] 269 | ) 270 | return s 271 | 272 | 273 | def sequences(data, stepsize=0): 274 | """Return an array of sequence information denoted by stepsize. 275 | 276 | Parameters 277 | ---------- 278 | data : array-like 279 | List/array of values in 1D 280 | stepsize : integer 281 | Separation between the values. 282 | If stepsize=0, sequences of equal values will be searched. 
If stepsize 283 | is 1, then sequences incrementing by 1 etcetera. 284 | 285 | Stepsize can be both positive or negative:: 286 | 287 | >>> # check for incrementing sequence by 1's 288 | >>> d = [1, 2, 3, 4, 4, 5] 289 | >>> s = sequences(d, 1) 290 | |array([(0, 0, 4, 1, 4), (1, 4, 6, 4, 2)], 291 | | dtype=[('ID', '>> npg.prn(s) 294 | id ID From_ To_ Value Count 295 | ---------------------------------------- 296 | 000 0 0 4 1 4 297 | 001 1 4 6 4 2 298 | 299 | Notes 300 | ----- 301 | For strings, use 302 | 303 | >>> partitions = np.where(a[1:] != a[:-1])[0] + 1 304 | 305 | Change **N** in the expression to find other splits in the data 306 | 307 | >>> np.split(data, np.where(np.abs(np.diff(data)) >= N)[0]+1) 308 | 309 | Keep for now:: 310 | 311 | checking sequences of 0, 1 312 | >>> a = np.array([1,0,1,1,1,0,0,0,0,1,1,0,0]) 313 | | np.hstack([[x.sum(), *[0]*(len(x) -1)] 314 | | if x[0] == 1 315 | | else x 316 | | for x in np.split(a, np.where(np.diff(a) != 0)[0]+1)]) 317 | >>> # array([1, 0, 3, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0]) 318 | 319 | References 320 | ---------- 321 | ``__. 323 | """ 324 | # 325 | a = np.array(data) 326 | a_dt = a.dtype.kind 327 | dt = [('ID', '>> a = np.array([[0, 1], [2, 3], [4, 5]]) 372 | >>> stride_2d(a.ravel(), win=(4,), stepby=(2,)) 373 | 374 | array([[0, 1, 2, 3], 375 | [2, 3, 4, 5]]) 376 | 377 | # -- alternatives, but they produce copies 378 | >>> np.concatenate((a[:-1], a[1:]), axis=1) 379 | >>> np.asarray(list(zip(a[:-1], a[1:]))) 380 | 381 | # -- concatenate is faster, with 500 points in `s`. 382 | %timeit stride_2d(s.ravel(), win=(4,), stepby=(2,)) 383 | 21.7 µs ± 476 ns per loop (mean ± std. dev. of 7 runs, 10000 loops 384 | 385 | %timeit np.concatenate((s[:-1], s[1:]), axis=1) 386 | 8.41 µs ± 158 ns per loop (mean ± std. dev. 
of 7 runs, 100000 loops 387 | 388 | A different stride:: 389 | 390 | >>> stride_2d(b, win=(2, 2), stepby=(1, 1)) 391 | array([[[0, 1], 392 | [2, 3]], 393 | 394 | [[2, 3], 395 | [4, 5]]]) 396 | 397 | """ 398 | from numpy.lib.stride_tricks import as_strided 399 | shp = np.array(a.shape) # array shape 2D (r, c) or 3D (d, r, c) 400 | win_shp = np.array(win) # window (4,) (3, 3) or (1, 3, 3) 401 | ss = np.array(stepby) # step by (2,) (1, 1) or (1, 1, 1) 402 | newshape = tuple(((shp - win_shp) // ss) + 1) + tuple(win_shp) 403 | newstrides = tuple(np.array(a.strides) * ss) + a.strides 404 | a_s = as_strided(a, shape=newshape, strides=newstrides, subok=True) 405 | return a_s.squeeze() 406 | 407 | 408 | def uniq_1d(arr): 409 | """Return mini `unique` 1D.""" 410 | mask = np.empty(arr.shape, dtype=np.bool_) 411 | mask[:1] = True 412 | a_copy = np.sort(arr) 413 | mask[1:] = a_copy[1:] != a_copy[:-1] 414 | return a_copy[mask] 415 | 416 | 417 | def uniq_2d(arr, return_sorted=False): # *** keep but slower than unique 418 | """Return mini `unique` for 2D coordinates. Derived from np.unique. 419 | 420 | Notes 421 | ----- 422 | For returning in the original order this is equivalent to:: 423 | 424 | u, idx = np.unique(x_pnts, return_index=True, axis=0) 425 | x_pnts[np.sort(idx)] 426 | 427 | References 428 | ---------- 429 | `NumPy unique 430 | `_. 
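The from-to pairing discussed in the `stride_2d` docstring can also be built with plain concatenation, which the timings above suggest is faster for small inputs (at the cost of a copy). A minimal sketch (the name `fr_to_pairs` is illustrative):

```python
import numpy as np

def fr_to_pairs(a):
    # From-to point pairs for an (N, 2) polyline array, shape (N-1, 4).
    # Copying alternative to stride_2d(a.ravel(), win=(4,), stepby=(2,)).
    return np.concatenate((a[:-1], a[1:]), axis=1)

a = np.array([[0, 1], [2, 3], [4, 5]])
print(fr_to_pairs(a))   # [[0 1 2 3]
                        #  [2 3 4 5]]
```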
432 | """ 433 | def _reshape_uniq_(uniq, dt, shp): 434 | n = len(uniq) 435 | uniq = uniq.view(dt) 436 | uniq = uniq.reshape(n, *shp[1:]) 437 | uniq = np.moveaxis(uniq, 0, 0) 438 | return uniq 439 | 440 | shp = arr.shape 441 | dt = arr.dtype 442 | st_arr = arr.view(dt.descr * shp[1]) 443 | ar = st_arr.flatten() 444 | if return_sorted: 445 | perm = ar.argsort(kind='mergesort') 446 | aux = ar[perm] 447 | else: # removed ar.sort() 448 | aux = ar 449 | mask = np.empty(aux.shape, dtype=np.bool_) 450 | mask[:1] = True 451 | mask[1:] = aux[1:] != aux[:-1] 452 | ret = aux[mask] 453 | uniq = _reshape_uniq_(ret, dt, shp) 454 | if return_sorted: # return_index in unique 455 | return uniq, perm[mask] 456 | return uniq 457 | 458 | 459 | def flatten(a_list, flat_list=None): 460 | """Change the isinstance as appropriate. 461 | 462 | : Flatten an object using recursion 463 | : see: itertools.chain() for an alternate method of flattening. 464 | """ 465 | if flat_list is None: 466 | flat_list = [] 467 | for item in a_list: 468 | if isinstance(item, list): 469 | flatten(item, flat_list) 470 | else: 471 | flat_list.append(item) 472 | return flat_list 473 | 474 | 475 | def unpack(iterable, param='__iter__'): 476 | """Unpack an iterable based on the param(eter) condition using recursion. 477 | 478 | :Notes: 479 | : ---- see main docs for more information and options ---- 480 | : To produce an array from this, use the following after this is done. 481 | : out = np.array(xy).reshape(len(xy)//2, 2) 482 | """ 483 | xy = [] 484 | for x in iterable: 485 | if hasattr(x, '__iter__'): 486 | xy.extend(unpack(x)) 487 | else: 488 | xy.append(x) 489 | return xy 490 | 491 | 492 | # ---- --------------------------- 493 | # ---- Final main section 494 | if __name__ == "__main__": 495 | """optional location for parameters""" 496 | print(f"\nRunning... 
{script}\n") 497 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/npg_min_circ.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # noqa: D205, D400 3 | r""" 4 | ------------ 5 | npg_min_circ 6 | ------------ 7 | 8 | Script : 9 | npg_min_circ.py # Minimum area bounding circles 10 | 11 | Author : 12 | Dan_Patterson@carleton.ca 13 | 14 | Modified : 15 | 2023-08-23 16 | 17 | Purpose 18 | ------- 19 | Returns the smallest circle enclosing a shape in the form of a center and 20 | radius. Original in smallestCircle.py in Bounding Containers. 21 | 22 | Requires 23 | -------- 24 | Must have at least two points. 25 | 26 | References 27 | ---------- 28 | de Berg et al., Computational Geometry with Applications, Springer-Verlag. 29 | 30 | Welzl, E. (1991), Smallest enclosing disks (balls and ellipsoids), 31 | Lecture Notes in Computer Science, Vol. 555, pp. 359-370. 32 | 33 | ``_. 35 | 36 | >>> cent = array([ 421645.83745955, 4596388.99204294]) 37 | >>> Xc, Yc, radius = 421646.74552, 4596389.82475, 24.323246 38 | 39 | mean of points : 40 | [ 421645.83745955 4596388.99204294] ## correct mean 41 | translated to origin : 42 | (0.9080813432488977, 0.8327111343034483, 24.323287017466253) 43 | direct calculation : 44 | (421646.74554089626, 4596389.8247540779, 24.323287017466253) 45 | 46 | """ 47 | 48 | import numpy as np 49 | 50 | np.set_printoptions( 51 | edgeitems=5, linewidth=10, precision=3, suppress=True, threshold=200, 52 | formatter={"bool": lambda x: repr(x.astype(np.int32)), 53 | "float_kind": '{: 7.3f}'.format}) 54 | 55 | 56 | def circle_mini(radius=1.0, theta=10.0, xc=0.0, yc=0.0): 57 | """Produce a circle/ellipse depending on parameters. 58 | 59 | Parameters 60 | ---------- 61 | radius : number 62 | Distance from centre. 63 | theta : number 64 | Angle of densification of the shape around 360 degrees. 
65 | """ 66 | angles = np.deg2rad(np.arange(180.0, -180.0 - theta, step=-theta)) 67 | x_s = radius * np.cos(angles) + xc # X values 68 | y_s = radius * np.sin(angles) + yc # Y values 69 | pnts = np.array([x_s, y_s]).T 70 | return pnts 71 | 72 | 73 | # ---- smallest circle implementation ---------------------------------------- 74 | # helpers : farthest, center, distance 75 | def farthest(a, check=False): 76 | """Distance matrix calculation for 2D points using einsum. 77 | 78 | This functions yielding the two points which have the greatest distance 79 | between them. 80 | """ 81 | if check: 82 | a = np.unique(a, axis=0) 83 | b = a.reshape(a.shape[0], 1, a.shape[-1]) 84 | diff = a - b 85 | dist_arr = np.sqrt(np.einsum('ijk,ijk->ij', diff, diff)) 86 | t_low = np.tril(dist_arr) # np.triu(dist_arr) 87 | dist_order = np.unique(t_low)[::-1] # largest to smallest unique dist 88 | mx = dist_order[0] 89 | r, c = np.where(t_low == mx) 90 | return a[c][0], a[r][0], dist_order 91 | 92 | 93 | def center(p0, p1): 94 | """Center point between two points.""" 95 | return np.mean((p0, p1), axis=0) 96 | 97 | 98 | def distance(p0, p1): 99 | """Distance between two points.""" 100 | return np.hypot(*(p1 - p0)) 101 | 102 | 103 | def small_circ(a): 104 | """Return the minimum area bounding circle for a points array. 105 | 106 | The ``unique`` points are used since np.unique removes reduncant calls and 107 | sorts the points in ascending order. 108 | 109 | Notes 110 | ----- 111 | This incarnation uses a mix of pure python and numpy functionality where 112 | appropriate. A simple check is first made to see if the farthest points 113 | enclose the point set. If not, then the search continues to attempt to 114 | find 2, 3 or more points that form a circle to completely enclose all the 115 | points. 
116 | """ 117 | a = np.unique(a, axis=0) 118 | N = a.shape[0] 119 | if N <= 1: 120 | return a[0], 0.0, a 121 | if N == 2: 122 | cent = center(*a[:2]) 123 | radius = distance(cent, a[0]) 124 | return cent, radius, a[:2] 125 | # corner-cases/garbage checking over 126 | p0, p1, _ = farthest(a, check=False) 127 | cent = center(p0, p1) 128 | radius = distance(cent, p0) 129 | check = np.sqrt(np.einsum('ij,ij->i', a - cent, a - cent)) <= radius 130 | if not np.all(check): # degenerate case found 131 | for i in range(1, N): 132 | ptP = a[i] 133 | if distance(cent, ptP) > radius: 134 | prev = i - 1 135 | cent, radius = sub_1(a, prev, ptP) 136 | check = np.sqrt(np.einsum('ij,ij->i', a - cent, a - cent)) - radius 137 | # pnts = a[np.isclose(check, 0.)] 138 | return cent[0], cent[1], radius # , pnts 139 | 140 | 141 | # ------------------------------------------------------------------- 142 | def sub_1(pnts, prev, ptQ): 143 | """Stage 1 check. Calls sub_2 to complete the search.""" 144 | N = prev 145 | cent = center(pnts[0], ptQ) 146 | radius = distance(cent, ptQ) 147 | for i in range(1, N + 1): 148 | ptP = pnts[i] 149 | if distance(ptP, cent) > radius: 150 | N = i - 1 151 | cent, radius = sub_2(pnts, N, ptQ, ptP) 152 | return cent, radius 153 | 154 | 155 | # ------------------------------------------------------------------- 156 | def sub_2(pnts, N, ptQ, ptP): 157 | """Return the {cent, radius} for the smallest disc. 158 | 159 | The disc encloses the points list with PointR, PointQ on its boundary. 
160 | """ 161 | if pnts.size == 0: 162 | pnts = np.array([[1.0, 1.0]]) # check 163 | N = 0 164 | ptQ = np.array([0.0, 0.0]) 165 | ptR = np.array([1.0, 0.0]) 166 | else: 167 | ptR = ptP 168 | cent = center(ptR, ptQ) 169 | radius = distance(cent, ptQ) 170 | ptO = np.array([0.0, 0.0]) 171 | ptB = ptR - ptQ 172 | c2 = (distance(ptR, ptO)**2 - distance(ptQ, ptO)**2) / 2.0 173 | for i in range(0, N + 1): 174 | ptP = pnts[i] 175 | if distance(ptP, cent) > radius: 176 | if np.all([0.0, 0.0] == ptB): 177 | cent = center(ptP, ptQ) 178 | radius = distance(ptQ, cent) 179 | else: 180 | ptA = ptQ - ptP 181 | xDelta = ptA[0] * ptB[1] - (ptA[1] * ptB[0]) 182 | if abs(xDelta) >= 1.0e-06: # 0.0: 183 | c1 = (distance(ptQ, ptO)**2 - (distance(ptP, ptO)**2)) / 2. 184 | x = (ptB[1] * c1 - (ptA[1] * c2)) / xDelta 185 | y = (ptA[0] * c2 - (ptB[0] * c1)) / xDelta 186 | cent = [x, y] 187 | radius = distance(cent, ptP) 188 | return cent, radius 189 | 190 | 191 | # =========================================================================== 192 | # 193 | if __name__ == "__main__": 194 | """optional location for parameters""" 195 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/npg_pip.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # noqa: D205, D400 3 | r""" 4 | ------- 5 | npg_pip 6 | ------- 7 | 8 | Point in Polygon implementation using winding numbers. This is for Geo arrays 9 | and uses numpy enhancements. 10 | 11 | ---- 12 | 13 | Script : 14 | npg_pip.py 15 | Author : 16 | Dan_Patterson@carleton.ca 17 | 18 | ``_. 19 | Modified : 20 | 2022-11-07 21 | 22 | Purpose 23 | ------- 24 | Functions for point partitioning and winding number inclusion tests for points 25 | in polygons. 26 | 27 | Notes 28 | ----- 29 | **np_wn notes** 30 | 31 | The polygon is represented as from-to pairs (fr_, to_). 
Their x, y values
32 | are obtained by translation and splitting (x0, y0, x1, y1).
33 | The input points are processed in a similar fashion (pnts --> px, py).
34 | The `winding number` is determined for all points at once for the given
35 | polygon.
36 | 
37 | **pnts_in_Geo notes**
38 | 
39 | Pre-processing to remove duplicates or partition the points hasn't proved
40 | to be optimal in all situations.  They are included for experimental
41 | purposes.  In such cases, the process is as follows::
42 | 
43 |     - Determine polygon extents for the Geo array `geo`.
44 |     - Derive the unique points for the test points `pnts`.
45 |     - Assign points to the appropriate extent.
46 |     - Run the `winding number` algorithm (or `crossing number` if so inclined).
47 |     - Deleting points as you go does not improve things.
48 | 
49 | How to remove points from an array, if found in another array.  In the example
50 | below `sub` is a subarray of `pnts`.  The indices where they are equal are `w`.
51 | 
52 | >>> w = np.where((pnts == sub[:, None]).all(-1))[1]
53 | >>> pnts = np.delete(pnts, w, 0)
54 | 
55 | References
56 | ----------
57 | ``_.
58 | 
59 | ``_.
61 | 
62 | ``_.
** good 63 | 64 | """ 65 | # pycodestyle D205 gets rid of that one blank line thing 66 | # pylint: disable=C0103,C0302,C0415 67 | # pylint: disable=E0402,E0611,E1136,E1121,R0904,R0914, 68 | # pylint: disable=W0201,W0212,W0221,W0612,W0621,W0105 69 | # pylint: disable=R0902 70 | 71 | 72 | import sys 73 | import numpy as np 74 | 75 | # ---- optional imports 76 | # import npgeom as npg 77 | # from numpy.lib.recfunctions import structured_to_unstructured as stu 78 | # from numpy.lib.recfunctions import unstructured_to_structured as uts 79 | # from numpy.lib.recfunctions import repack_fields 80 | 81 | # noqa: E501 82 | np.set_printoptions( 83 | edgeitems=10, linewidth=120, precision=3, suppress=True, threshold=200, 84 | formatter={"bool": lambda x: repr(x.astype(np.int32)), 85 | "float_kind": '{: 7.3f}'.format}) 86 | 87 | script = sys.argv[0] # print this should you need to locate the script 88 | 89 | __all__ = [ 90 | '_is_right_side', '_side_', 'crossing_num', 'winding_num', '_partition_', 91 | 'np_wn', 'pnts_in_Geo' 92 | ] 93 | 94 | 95 | # ---- single use helpers 96 | # 97 | def _side_(pnts, poly): # ** not used 98 | r"""Return points inside, outside or equal/crossing a convex poly feature. 99 | 100 | Returns 101 | ------- 102 | r the equation value array 103 | in_ the points based on the winding number 104 | inside (r < 0) 105 | outside (r > 0) 106 | equal_ (r == 0) 107 | 108 | Notes 109 | ----- 110 | See `_wn_clip_` as another option to return more information. 111 | 112 | >>> `r` == diff_ in _wn_ used in chk3 113 | >>> `r` == t_num = a_0 - a_1 ... in previous equations 114 | >>> r_lt0 = r < 0, r_gt0 = ~r_lt0, to yield => (r_lt0 * -1) - (r_gt0 + 0) 115 | >>> (r < 0).all(-1) # just the boolean locations 116 | ... array([0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 0, 0]) 117 | >>> (r < 0).all(-1).nonzero()[0] # the index numbers 118 | ... 
array([2, 3, 4, 6, 7], dtype=int64) 119 | """ 120 | if pnts.ndim < 2: 121 | pnts = np.atleast_2d(pnts) 122 | x0, y0 = pnts.T 123 | x2, y2 = poly[:-1].T # poly segment start points 124 | x3, y3 = poly[1:].T # poly segment end points 125 | r = (x3 - x2) * (y0[:, None] - y2) - (y3 - y2) * (x0[:, None] - x2) 126 | # -- from _wn_, winding numbers for concave/convex poly 127 | chk1 = ((y0[:, None] - y2) >= 0.) 128 | chk2 = (y0[:, None] < y3) 129 | chk3 = np.sign(r).astype(int) 130 | pos = (chk1 & chk2 & (chk3 > 0)).sum(axis=1, dtype=int) 131 | neg = (~chk1 & ~chk2 & (chk3 < 0)).sum(axis=1, dtype=int) 132 | wn_vals = pos - neg 133 | in_ = pnts[np.nonzero(wn_vals)] 134 | inside = pnts[(r < 0).all(axis=-1)] # all must be True along row, convex 135 | outside = pnts[(r > 0).any(-1)] # any must be True along row 136 | equal_ = pnts[(r == 0).any(-1)] # ditto 137 | return r, in_, inside, outside, equal_ 138 | 139 | 140 | def _is_right_side(p, strt, end): 141 | """Determine if point (p) is `inside` a line segment (strt-->end). 142 | 143 | See Also 144 | -------- 145 | line_crosses, in_out_crosses in npg_geom_hlp. 146 | position = sign((Bx - Ax) * (Y - Ay) - (By - Ay) * (X - Ax)) 147 | 148 | Returns 149 | ------- 150 | Negative for right of clockwise line, positive for left. So in essence, 151 | the reverse of _is_left_side with the outcomes reversed ;) 152 | """ 153 | x, y, x0, y0, x1, y1 = *p, *strt, *end 154 | return (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) 155 | 156 | 157 | def crossing_num(pnts, poly, line=True): 158 | """Crossing Number for point(s) in polygon. See `pnts_in_poly`. 159 | 160 | Parameters 161 | ---------- 162 | pnts : array of points 163 | Points are an N-2 array of point objects determined to be within the 164 | extent of the input polygons. 165 | poly : polygon array 166 | Polygon is an Nx2 array of point objects that form the clockwise 167 | boundary of the polygon. 168 | line : boolean 169 | True to include points that fall on a line as being inside. 
170 | """ 171 | def _in_ex_(pnts, ext): 172 | """Return the points within an extent or on the line of the extent.""" 173 | LB, RT = ext 174 | comp = np.logical_and(LB <= pnts, pnts <= RT) # using <= and <= 175 | idx = np.logical_and(comp[..., 0], comp[..., 1]) 176 | return idx, pnts[idx] 177 | 178 | pnts = np.atleast_2d(pnts) 179 | xs = poly[:, 0] 180 | ys = poly[:, 1] 181 | N = len(poly) 182 | xy_diff = np.diff(poly, axis=0) 183 | dx = xy_diff[:, 0] # np.diff(xs) 184 | dy = xy_diff[:, 1] # np.diff(ys) 185 | ext = np.array([poly.min(axis=0), poly.max(axis=0)]) 186 | idx, inside = _in_ex_(pnts, ext) 187 | is_in = [] 188 | for pnt in inside: 189 | cn = 0 # the crossing number counter 190 | x, y = pnt 191 | for i in range(N - 1): 192 | if line is True: 193 | c0 = (ys[i] < y <= ys[i + 1]) # changed to < <= 194 | c1 = (ys[i] > y >= ys[i + 1]) # and > >= 195 | else: 196 | c0 = (ys[i] < y < ys[i + 1]) 197 | c1 = (ys[i] > y > ys[i + 1]) 198 | if (c0 or c1): # or y in (ys[i], ys[i+1]): 199 | vt = (y - ys[i]) / dy[i] # compute x-coordinate 200 | if line is True: 201 | if (x == xs[i]) or (x < (xs[i] + vt * dx[i])): # include 202 | cn += 1 203 | else: 204 | if x < (xs[i] + vt * dx[i]): # exclude pnts on line 205 | cn += 1 206 | is_in.append(cn % 2) # either even or odd (0, 1) 207 | return inside[np.nonzero(is_in)] 208 | 209 | 210 | def winding_num(pnts, poly, batch=False): 211 | """Point in polygon using winding numbers. 212 | 213 | Parameters 214 | ---------- 215 | pnts : array 216 | This is simply an (x, y) point pair of the point in question. 217 | poly : array 218 | A clockwise oriented Nx2 array of points, with the first and last 219 | points being equal. 220 | 221 | Notes 222 | ----- 223 | Until this can be implemented in a full array of points and full suite of 224 | polygons, you have to test for all the points in each polygon. 225 | 226 | >>> w = [winding_num(p, e1) for p in g_uni] 227 | >>> g_uni[np.nonzero(w)] 228 | array([[ 20.00, 1.00], 229 | ... 
[ 21.00, 0.00]]) 230 | 231 | References 232 | ---------- 233 | ``_. 234 | """ 235 | def _is_right_side(p, strt, end): 236 | """Determine if a point (p) is `inside` a line segment (strt-->end). 237 | 238 | See Also 239 | -------- 240 | `line_crosses`, `in_out_crosses` in npg_geom_hlp. 241 | position = sign((Bx - Ax) * (Y - Ay) - (By - Ay) * (X - Ax)) 242 | negative for right of clockwise line, positive for left. So in essence, 243 | the reverse of _is_left_side with the outcomes reversed ;) 244 | """ 245 | x, y, x0, y0, x1, y1 = *p, *strt, *end 246 | return (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) 247 | 248 | def cal_w(p, poly): 249 | """Do the calculation.""" 250 | w = 0 251 | y = p[1] 252 | ys = poly[:, 1] 253 | for i in range(poly.shape[0]): 254 | if ys[i - 1] <= y: 255 | if ys[i] > y: 256 | if _is_right_side(p, poly[i - 1], poly[i]) > 0: 257 | w += 1 258 | elif ys[i] <= y: 259 | if _is_right_side(p, poly[i - 1], poly[i]) < 0: 260 | w -= 1 261 | return w 262 | 263 | if batch: 264 | w = [cal_w(p, poly) for p in pnts] 265 | return pnts[np.nonzero(w)], w 266 | else: 267 | return cal_w(pnts, poly) 268 | 269 | 270 | # ---------------------------------------------------------------------------- 271 | # ---- (1) ... points in polygons 272 | # 273 | def _partition_(pnts, geo, return_remainder=False): 274 | """Partition points into the first polygon they fall into. 275 | 276 | Parameters 277 | ---------- 278 | pnts, geo : ndarrays 279 | `pnts` is an Nx2 array representing point objects (x, y). 280 | `geo` is a Geo array. 
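The per-point loop in `winding_num` above reduces to a few lines for a single point. A self-contained sketch (clockwise, closed square; `wn_single` is an illustrative name — a nonzero winding number means the point is inside):

```python
import numpy as np

def wn_single(p, poly):
    # Winding number of point `p` about a closed ring `poly`
    # (first point equal to last).  Nonzero means inside.
    x, y = p
    wn = 0
    for (x0, y0), (x1, y1) in zip(poly[:-1], poly[1:]):
        side = (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)
        if y0 <= y < y1 and side > 0:     # upward crossing, p left of edge
            wn += 1
        elif y1 <= y < y0 and side < 0:   # downward crossing, p right of edge
            wn -= 1
    return wn

sq = np.array([[0., 0.], [0., 2.], [2., 2.], [2., 0.], [0., 0.]])  # clockwise
print(wn_single((1.0, 1.0), sq), wn_single((3.0, 1.0), sq))
```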
281 | return_remainder : boolean 282 | True, returns the inside and outside points 283 | 284 | Notes 285 | ----- 286 | This code block can be added to pnts_in_Geo if you want to test partition:: 287 | 288 | if partition: 289 | ps_in_exts = _partition_(pnts, geo) 290 | polys = geo.outer_rings(False) 291 | for i, pts in enumerate(ps_in_exts): 292 | if pts.size > 0: 293 | in_, w = np_wn(pts, polys[i]) 294 | w_s.append(w) # [w, pts]) 295 | out.append(in_) # [geo.shp_IFT[i] 296 | 297 | """ 298 | extents = geo.extents(splitter="shape") 299 | L_ = extents[:, 1] 300 | B_ = extents[:, 0] 301 | srt_idx = np.lexsort((B_, L_)).tolist() 302 | extents = extents[srt_idx] 303 | in_ = [] 304 | for e in extents: 305 | c0 = np.logical_and(e[0] <= pnts[:, 0], pnts[:, 0] <= e[2]) 306 | c1 = np.logical_and(e[1] <= pnts[:, 1], pnts[:, 1] <= e[3]) 307 | c2 = np.logical_and(c0, c1) 308 | in_.append(pnts[c2]) 309 | out_pnts = pnts[np.logical_not(c2)] 310 | in_pnts = np.asarray(in_)[sorted(srt_idx)] 311 | if return_remainder: 312 | return in_pnts, out_pnts 313 | return in_pnts 314 | 315 | 316 | def np_wn(pnts, poly, return_winding=False, extras=False): 317 | """Return points in polygon using a winding number algorithm in numpy. 318 | 319 | Parameters 320 | ---------- 321 | pnts : Nx2 array 322 | Points represented as an x,y array. 323 | poly : Nx2 array 324 | Polygon consisting of at least 4 points oriented in a clockwise manner. 325 | return_winding : boolean 326 | True, returns the winding number pattern for testing purposes. Keep as 327 | False to avoid downstream errors. 328 | 329 | Returns 330 | ------- 331 | The points within or on the boundary of the geometry. 332 | 333 | Notes 334 | ----- 335 | The polygon is represented as from-to pairs (`fr_`, `to_`). Their x,y 336 | values are obtained by translation and splitting (x0, y0, x1, y1). 337 | The input points are processed in a similar fashion (pnts --> x, y). 
338 |     The `winding number` is determined for all points at once for the given
339 |     polygon.
340 | 
341 |     Original form
342 | 
343 |     >>> c0 = (x1 - x0) * (y[:, None] - y0)
344 |     >>> c1 = (y1 - y0) * (x[:, None] - x0)
345 |     >>> diff_ = c0 - c1
346 | 
347 |     Usage
348 |     -----
349 |     >>> out_ = [np_wn(points, poly) for poly in polygons]
350 |     >>> final = np.unique(np.vstack(out_), axis=0)  # points only
351 | 
352 |     Inclusion checks
353 |     ----------------
354 |     Points on the perimeter are deemed `out`.
355 |     chk1  (y_y0 > 0.0) changed from >=
356 |     chk2  np.less is ok
357 |     chk3  leave
358 |     pos   leave
359 |     neg   chk3 <= 0 to keep all points inside poly on edge included
360 | 
361 |     References
362 |     ----------
363 |     ``_.  inspiration for this numpy version
365 |     """
366 |     x0, y0 = poly[:-1].T  # polygon `from` coordinates
367 |     x1, y1 = poly[1:].T  # polygon `to` coordinates
368 |     x, y = pnts.T  # point coordinates
369 |     y_y0 = y[:, None] - y0
370 |     y_y1 = y[:, None] - y1
371 |     x_x0 = x[:, None] - x0
372 |     # -- diff = np.sign(np.einsum("ikj, kj -> ij", pnts[:, None], poly[:-1]))
373 |     diff_ = ((x1 - x0) * y_y0 - (y1 - y0) * x_x0) + 0.0  # einsum originally
374 |     chk1 = (y_y0 >= 0.0)  # -- top and bottom point inclusion! try `>`
375 |     chk2 = (y_y1 < 0.0)  # was chk2 = np.less(y[:, None], y1) try `<`
376 |     chk3 = np.sign(diff_).astype(np.int32)
377 |     pos = (chk1 & chk2 & (chk3 > 0)).sum(axis=1, dtype=int)
378 |     neg = (~chk1 & ~chk2 & (chk3 < 0)).sum(axis=1, dtype=int)  # -- <= ??
379 |     wn = pos - neg
380 |     in_ = pnts[np.nonzero(wn)]
381 |     if extras:
382 |         eq_ids = np.isin(pnts, poly).all(-1).nonzero()[0]  # equal
383 |         extra_info = ["equal pnt ids", eq_ids]
384 |     if return_winding:
385 |         if extras:
386 |             return in_, wn, extra_info
387 |         return in_, wn
388 |     return in_
389 | 
390 | 
391 | def pnts_in_Geo(pnts, geo, stacked=True):
392 |     """Geo array implementation of points in polygon using `winding number`.
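The vectorized core of `np_wn` can be condensed into a self-contained sketch and checked against a simple square (the name `wn_vec` is illustrative; the check logic mirrors the function above):

```python
import numpy as np

def wn_vec(pnts, poly):
    # Vectorized winding-number inclusion for an (N, 2) point set and a
    # closed ring `poly`.  Returns the points with a nonzero winding.
    x0, y0 = poly[:-1].T          # polygon `from` coordinates
    x1, y1 = poly[1:].T           # polygon `to` coordinates
    x, y = pnts.T
    y_y0 = y[:, None] - y0
    y_y1 = y[:, None] - y1
    diff_ = (x1 - x0) * y_y0 - (y1 - y0) * (x[:, None] - x0)
    chk1 = y_y0 >= 0.0
    chk2 = y_y1 < 0.0
    chk3 = np.sign(diff_).astype(np.int32)
    pos = (chk1 & chk2 & (chk3 > 0)).sum(axis=1)
    neg = (~chk1 & ~chk2 & (chk3 < 0)).sum(axis=1)
    return pnts[np.nonzero(pos - neg)]

sq = np.array([[0., 0.], [0., 2.], [2., 2.], [2., 0.], [0., 0.]])
pts = np.array([[1., 1.], [3., 1.], [0.5, 1.5]])
print(wn_vec(pts, sq))   # the two interior points
```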
393 | 394 | Parameters 395 | ---------- 396 | pnts : array (N, 2) 397 | An ndarray of point objects. 398 | geo : Geo array 399 | The Geo array of polygons. Only the outer rings are used. 400 | stacked : boolean 401 | True, stack the inclusion points as one set. False, returns the points 402 | as separate entities. 403 | 404 | Returns 405 | ------- 406 | Points completely inside or on the boundary of a polygon are returned. 407 | 408 | Requires 409 | -------- 410 | The helper, `np_wn`, (winding number inclusion test). 411 | 412 | Notes 413 | ----- 414 | See docstring notes. 415 | 416 | >>> # for my testing 417 | >>> final = pnts_in_Geo(g_uni, g4, True) 418 | """ 419 | # 420 | out = [] 421 | polys = geo.outer_rings(False) 422 | for poly in polys: 423 | in_ = np_wn(pnts, poly, return_winding=False) # run np_wn 424 | out.append(in_) 425 | pts = [i for i in out if len(i) > 0] 426 | if len(pts) > 1 and stacked: 427 | return np.unique(np.vstack(pts), axis=0) 428 | return pts 429 | 430 | 431 | # 432 | # ---- Final main section ---------------------------------------------------- 433 | if __name__ == "__main__": 434 | """optional location for parameters""" 435 | print("\nRunning... {}\n".format(script)) 436 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/npg_setops.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # noqa: D205, D400 3 | """ 4 | ---------- 5 | npg_setops 6 | ---------- 7 | 8 | Script : 9 | npg_setops.py 10 | Author : 11 | Dan_Patterson@carleton.ca 12 | Modified : 13 | 2023-08-23 14 | 15 | Purpose 16 | ------- 17 | 18 | This set of functions is largely directed to extending some of numpy set 19 | functions to apply to Nxd shaped arrays as well as structured and recarrays. 20 | The functionality largely depends on using a `view` of the input array so that 21 | each row can be treated as a unique record in the array. 
22 | 23 | If you are working with arrays and wish to perform functions on certain columns 24 | then you will have to preprocess/preselect. You can only add so much to a 25 | function before it loses its readability and utility. 26 | 27 | ndset:: 28 | 29 | _view_as_struct_, is_in, nd_diff, nd_diffxor, nd_intersect, 30 | nd_union, nd_uniq 31 | 32 | Notes 33 | ----- 34 | _view_as_struct_ 35 | >>> a = np.array([[ 0, 0], [ 0, 100], [100, 100]]) 36 | >>> _view_as_struct_(a, return_all=False) 37 | ... array([[( 0, 0)], 38 | ... [( 0, 100)], 39 | ... [(100, 100)]], dtype=[('f0', '>> a_view, shp, dt = _view_as_struct_(a, return_all=True) 42 | ... shp # (3, 2) 43 | ... dt # dtype('int32') 44 | 45 | nd_is_in 46 | >>> a = np.array([[ 0, 0], [ 0, 100], [100, 100]]) 47 | >>> look_for = np.array([[ 0, 100], [100, 100]]) 48 | >>> nd_isin(a, look_for, reverse=False) 49 | array([[ 0, 100], 50 | ... [100, 100]]) 51 | >>> nd_isin(a, look_for, reverse=True) 52 | array([[0, 0]]) 53 | 54 | For the following: 55 | >>> a = np.array([[ 0, 0], [ 0, 100], [100, 100]]) 56 | >>> b = np.array([[ 0, 100], [100, 100]]) 57 | >>> c = np.array([[ 20, 20], [100, 20], [100, 0], [ 0, 0]]) 58 | 59 | nd_diff(a, b) 60 | >>> nd_diff(a, b) 61 | array([[0, 0]]) 62 | 63 | nd_diffxor(a, b, uni=False) 64 | >>> nd_diffxor(a, c, uni=False) 65 | array([[ 0, 100], 66 | [ 20, 20], 67 | [100, 0], 68 | [100, 20], 69 | [100, 100]]) 70 | 71 | nd_intersect(a, b, invert=False) 72 | >>> nd_intersect(a, b, invert=False) 73 | array([[ 0, 100], 74 | [100, 100]]) 75 | >>> nd_intersect(a, c, invert=False) 76 | array([[0, 0]]) 77 | 78 | nd_union(a, b) 79 | >>> nd_union(a, c) 80 | array([[ 0, 0], 81 | [ 0, 100], 82 | [ 20, 20], 83 | [100, 0], 84 | [100, 20], 85 | [100, 100]]) 86 | 87 | nd_uniq(a, counts=False) 88 | >>> d = np.array([[ 0, 0], [100, 100], [100, 100], [ 0, 0]]) 89 | nd_uniq(d) 90 | array([[ 0, 0], 91 | [100, 100]]) 92 | 93 | References 94 | ---------- 95 | ``_. 97 | 98 | ``_. 
99 | 100 | """ 101 | # pylint: disable=C0103 102 | # pylint: disable=R1710 103 | # pylint: disable=R0914 104 | 105 | import sys 106 | import numpy as np 107 | 108 | 109 | np.set_printoptions( 110 | edgeitems=10, linewidth=120, precision=3, suppress=True, threshold=200, 111 | formatter={"bool": lambda x: repr(x.astype(np.int32)), 112 | "float_kind": '{: 7.3f}'.format}) 113 | np.ma.masked_print_option.set_display('-') # change to a single - 114 | 115 | script = sys.argv[0] # print this should you need to locate the script 116 | 117 | 118 | __all__ = ['_view_as_struct_', 119 | '_check_dtype_', 120 | '_unique1d_', 121 | 'nd_diff', 122 | 'nd_diffxor', 123 | 'nd_intersect', 124 | 'nd_isin', 125 | 'nd_merge', 126 | 'nd_union', 127 | 'nd_uniq' 128 | ] 129 | 130 | 131 | def _view_as_struct_(a, return_all=False): 132 | """Key function to get uniform 2d arrays to be viewed as structured arrays. 133 | 134 | A bit of trickery, but it works for all set-like functionality 135 | 136 | Parameters 137 | ---------- 138 | a : array 139 | Geo array or ndarray to be viewed. 140 | 141 | Returns 142 | ------- 143 | Array view as structured/recarray, with shape = (N, 1) 144 | 145 | References 146 | ---------- 147 | See `unstructured_to_structured` in... numpy/lib/recfunctions.py 148 | 149 | >>> from numpy.lib.recfunctions import unstructured_to_structured as uts 150 | """ 151 | shp = a.shape 152 | dt = a.dtype 153 | a_view = a.view(dt.descr * shp[1])[..., 0] 154 | if return_all: 155 | return a_view, shp, dt 156 | return a_view 157 | 158 | 159 | def _check_dtype_(a_view, b_view): 160 | """Check for equivalency in the dtypes. 161 | 162 | If they are not equal, flag and return True or False. 163 | """ 164 | err = "\nData types are not equal, function failed.\n1. {}\n2. 
{}" 165 | adtype = a_view.dtype.descr 166 | bdtype = b_view.dtype.descr 167 | if adtype != bdtype: 168 | print(err.format(adtype, bdtype)) 169 | return False 170 | return True 171 | 172 | 173 | def _unique1d_(ar, return_index=False, return_inverse=False, 174 | return_counts=False): 175 | """Return unique array elements. From `np.lib.arraysetops`. 176 | 177 | ``_. 178 | """ 179 | ar = np.asanyarray(ar).flatten() 180 | any_indices = return_index or return_inverse 181 | if any_indices: 182 | p = ar.argsort(kind='mergesort' if return_index else 'quicksort') 183 | aux = ar[p] 184 | else: 185 | ar.sort() 186 | aux = ar 187 | mask = np.empty(aux.shape, dtype=np.bool_) 188 | mask[:1] = True 189 | mask[1:] = aux[1:] != aux[:-1] 190 | ret = (aux[mask],) 191 | if return_index: 192 | ret += (p[mask],) 193 | if return_inverse: 194 | imask = np.cumsum(mask) - 1 195 | inv_idx = np.empty(mask.shape, dtype=np.intp) 196 | inv_idx[p] = imask 197 | ret += (inv_idx,) 198 | if return_counts: 199 | idx = np.concatenate(np.nonzero(mask) + ([mask.size],)) 200 | ret += (np.diff(idx),) 201 | return ret 202 | 203 | 204 | # ---- Set functions 205 | # 206 | def nd_diff(a, b, invert=True): 207 | """See nd_intersect. This just returns the opposite/difference.""" 208 | return nd_intersect(a, b, invert=invert) 209 | 210 | 211 | def nd_diffxor(a, b, uni=False): 212 | """Use setxor... it is slower than nd_diff. 213 | 214 | 36 microseconds vs 18.2 but this is faster for large sets 215 | """ 216 | a_view = _view_as_struct_(a, return_all=False) 217 | b_view = _view_as_struct_(b, return_all=False) 218 | good = _check_dtype_(a_view, b_view) # check dtypes 219 | if not good: 220 | return None 221 | ab = np.setxor1d(a_view, b_view, assume_unique=uni) 222 | return ab.view(a.dtype).reshape(-1, ab.shape[0]).squeeze() 223 | 224 | 225 | def nd_in1d(a, b, assume_unique=False, invert=False): 226 | """Check for the presence of array in the other. Taken from `in1d` in. 227 | 228 | ``_. 
229 | """ 230 | ar1 = np.asarray(a).ravel() 231 | ar2 = np.asarray(b).ravel() 232 | contains_object = ar1.dtype.hasobject or ar2.dtype.hasobject 233 | if len(ar2) < 10 * len(ar1) ** 0.145 or contains_object: 234 | if invert: 235 | mask = np.ones(len(ar1), dtype=bool) 236 | for a in ar2: 237 | mask &= (ar1 != a) 238 | else: 239 | mask = np.zeros(len(ar1), dtype=bool) 240 | for a in ar2: 241 | mask |= (ar1 == a) 242 | return mask 243 | # Otherwise use sorting 244 | if not assume_unique: 245 | ar1, rev_idx = np.unique(ar1, return_inverse=True) 246 | ar2 = np.unique(ar2) 247 | ar = np.concatenate((ar1, ar2)) 248 | order = ar.argsort(kind='mergesort') 249 | sar = ar[order] 250 | if invert: 251 | bool_ar = (sar[1:] != sar[:-1]) 252 | else: 253 | bool_ar = (sar[1:] == sar[:-1]) 254 | flag = np.concatenate((bool_ar, [invert])) 255 | ret = np.empty(ar.shape, dtype=bool) 256 | ret[order] = flag 257 | if assume_unique: 258 | return ret[:len(ar1)] 259 | else: 260 | return ret[rev_idx] 261 | 262 | 263 | def nd_intersect(a, b, invert=False): 264 | """Intersect two, 2D arrays using views and in1d. 265 | 266 | Parameters 267 | ---------- 268 | a, b : arrays 269 | Arrays are assumed to have a shape = (N, 2) 270 | 271 | References 272 | ---------- 273 | ``_. 274 | 275 | ``_. 277 | 278 | ``_. 280 | """ 281 | a_view = _view_as_struct_(a, return_all=False) 282 | b_view = _view_as_struct_(b, return_all=False) 283 | good = _check_dtype_(a_view, b_view) # check dtypes 284 | if not good: 285 | return None 286 | if len(a) > len(b): 287 | idx = nd_in1d(a_view, b_view, assume_unique=False, invert=invert) 288 | return a[idx] 289 | else: 290 | idx = nd_in1d(b_view, a_view, assume_unique=False, invert=invert) 291 | return b[idx] 292 | 293 | 294 | def nd_isin(a, look_for, indices_only=False, reverse=False): 295 | """Check array `a` for the presence of records in array `look_for`. 
296 | 297 | Parameters 298 | ---------- 299 | a : array 300 | The array to check for the elements of `look_for`. 301 | look_for : number, list or array 302 | The records to search for. 303 | indices_only : boolean 304 | True, returns the indices of where `look_for` is in `a`. False, 305 | returns the values found. 306 | reverse : boolean 307 | True, return the records in `a` that are not in `look_for`. 308 | """ 309 | a_view = _view_as_struct_(a, return_all=False) 310 | b_view = _view_as_struct_(look_for, return_all=False) 311 | good = _check_dtype_(a_view, b_view) # check dtypes 312 | if not good: 313 | return None 314 | inv = False 315 | if reverse: 316 | inv = True 317 | idx = nd_in1d(a_view, b_view, assume_unique=False, invert=inv) 318 | if indices_only: 319 | return np.nonzero(idx)[0] 320 | return a[idx] 321 | 322 | 323 | def nd_merge(a, b): 324 | """Merge views of 2 ndarrays or recarrays. 325 | 326 | Duplicates are not removed, use nd_union instead. 327 | """ 328 | ab = None 329 | if (a.dtype.kind in ('f', 'i')) and (b.dtype.kind in ('f', 'i')): 330 | ab = np.concatenate((a, b), axis=0) 331 | else: 332 | a_view = _view_as_struct_(a, return_all=False) 333 | b_view = _view_as_struct_(b, return_all=False) 334 | good = _check_dtype_(a_view, b_view) # check dtypes 335 | if good: 336 | ab = np.concatenate((a_view, b_view), axis=None) 337 | ab = ab.view(a.dtype).reshape(ab.shape[0], -1).squeeze() 338 | return ab 339 | 340 | 341 | def nd_union(a, b): 342 | """Union views of arrays. 343 | 344 | Returns the unique, sorted array of values that are in either of the two 345 | input arrays.
346 | """ 347 | a_view = _view_as_struct_(a, return_all=False) 348 | b_view = _view_as_struct_(b, return_all=False) 349 | good = _check_dtype_(a_view, b_view) # check dtypes 350 | if not good: 351 | return None 352 | # ab = np.union1d(a_view, b_view) 353 | ab = np.unique(np.concatenate((a_view, b_view), axis=None)) 354 | return ab.view(a.dtype).reshape(ab.shape[0], -1).squeeze() 355 | 356 | 357 | def nd_uniq(a, return_index=True, 358 | return_inverse=False, 359 | return_counts=True, 360 | axis=0): 361 | """Taken from, but modified for Geo arrays. 362 | 363 | Parameters 364 | ---------- 365 | a : Geo array or ndarray 366 | For other array-like objects, see `unique` and `_unique1d` in: 367 | 368 | ``_. 369 | 370 | Notes 371 | ----- 372 | Using True for `return_index` and/or `return_inverse` speeds up the 373 | sorting process. Example for 750k points as structured array:: 374 | 375 | st.dtype # dtype([('f0', '`_. 19 | 20 | Modified : 21 | 2023-10-30 22 | 23 | Purpose 24 | ------- 25 | Functions for boolean operations on polygons: 26 | 27 | - erase 28 | 29 | """ 30 | # pylint: disable=C0103,C0201,C0209,C0302,C0415 31 | # pylint: disable=R0902,R0904,R0912,R0913,R0914,R0915 32 | # pylint: disable=W0105,W0201,W0212,W0221,W0611,W0612,W0613,W0621 33 | # pylint: disable=E0401,E0611,E1101,E1121 34 | 35 | import sys 36 | import copy 37 | import numpy as np 38 | import npg # noqa 39 | from npg.npg_bool_hlp import add_intersections, _del_seq_pnts_ # prep_overlay 40 | from npg.npg_plots import plot_polygons # noqa 41 | from npg.npg_prn import prn_ # noqa 42 | 43 | ft = {"bool": lambda x: repr(x.astype(np.int32)), 44 | "float_kind": '{: 6.2f}'.format} 45 | np.set_printoptions( 46 | edgeitems=10, linewidth=120, precision=3, suppress=True, threshold=200, 47 | formatter=ft 48 | ) 49 | 50 | script = sys.argv[0] 51 | 52 | __all__ = ['erase_poly'] 53 | __helpers__ = ['cut_pairs'] 54 | 55 | 56 | # ---- (1) difference polygons 57 | # 58 | def cut_pairs(arr): 59 | """Return cut lines from 
`onConP` or `id_plcl`.""" 60 | c_segs = [] 61 | p_segs = [] 62 | for cn, v in enumerate(arr[1:, 0], 0): 63 | prev = arr[cn, 0] 64 | dff = v - prev 65 | if dff == 1: 66 | vals = [arr[cn, 1], arr[cn + 1, 1]] 67 | c_segs.append([prev, v]) 68 | p_segs.append(vals) 69 | return c_segs, p_segs 70 | 71 | 72 | def erase_poly(poly, clp, as_geo=True): 73 | """Return the symmetrical difference between two polygons, `poly`, `clp`. 74 | 75 | Parameters 76 | ---------- 77 | poly, clp : array_like 78 | `poly` is the polygon being differenced by polygon `clp` 79 | 80 | Requires 81 | -------- 82 | `npg_helpers` : `a_eq_b` 83 | 84 | `_roll_`, `_wn_clip_`, `_node_type_`, `_add_pnts_`, `_del_seq_pnts_` 85 | 86 | Notes 87 | ----- 88 | The notations `p_p, p_c` refer to the previous and current poly points 89 | during iterations. Similarly `c_p, c_c` denote the previous and current 90 | clipper poly points. 91 | """ 92 | """ 93 | Create dictionary from list using first value as key 94 | 95 | ky = [i[0] for i in p_out] 96 | dct = dict(list(zip(ky, p_out))) 97 | """ 98 | def _in_c_(c_c, c_seen, c_inside): 99 | """Return sub lists.""" 100 | if len(c_inside) > 0: 101 | if c_c in c_inside[0]: 102 | vals = c_inside.pop(0) 103 | c_seen.extend(vals) 104 | return cl_n[vals] 105 | return [] 106 | 107 | def _out_c_(c_c, c_seen, c_outside): 108 | """Return sub lists.""" 109 | if len(c_outside) > 0: 110 | if c_c in c_outside[0]: 111 | vals = c_outside.pop(0) 112 | c_seen.extend(vals) 113 | return cl_n[vals] 114 | return [] 115 | 116 | def _in_p_(p_c, p_seen, p_inside): 117 | """Return sub lists.""" 118 | if len(p_inside) > 0: 119 | if p_c in p_inside[0]: 120 | vals = p_inside.pop(0) 121 | p_seen.extend(vals) 122 | return pl_n[vals] 123 | return [] 124 | 125 | def _out_p_(p_c, p_seen, p_outside): 126 | """Return sub lists.""" 127 | if len(p_outside) > 0: 128 | if p_c in p_outside[0]: 129 | vals = p_outside.pop(0) 130 | p_seen.extend(vals) 131 | return pl_n[vals] 132 | return [] 133 | 134 | def
in_out_chk(_n, _p, _c, _seen, _outside, _inside): 135 | """Last ditch check in case p_p and p_c are separated by a segment. 136 | 137 | Parameters 138 | ---------- 139 | parameter meanings 140 | 141 | +------+-------+-----+-----+--------+-----------+----------+ 142 | | | _n | _p | _c | _seen | _outside | _inside | 143 | +======+=======+=====+=====+========+===========+==========+ 144 | | poly | pl_n | p_p | p_c | p_seen | p_outside | p_inside | 145 | +------+-------+-----+-----+--------+-----------+----------+ 146 | |clip | cl_n | c_p | c_c | c_seen | c_outside | c_inside | 147 | +------+-------+-----+-----+--------+-----------+----------+ 148 | 149 | """ 150 | out_bits = [] 151 | in_bits = [] 152 | pc_max = max([_p, _c]) + 1 153 | for i in [_p, _c]: 154 | for cnt_, out_ in enumerate(_outside): 155 | if i in out_ and pc_max not in out_: # only take the first out 156 | vals = _outside.pop(cnt_) 157 | out_bits.append(vals) 158 | for cnt_, in_ in enumerate(_inside): 159 | if i in in_ and pc_max not in in_: # only take the first in 160 | vals = _inside.pop(cnt_) 161 | in_bits.append(vals) 162 | return out_bits, in_bits 163 | 164 | # -- Returns the intersections, the rolled input polygons, the new polygons 165 | # and how the points in both relate to one another. 166 | result = add_intersections(poly, clp, 167 | roll_to_minX=True, 168 | p0_pgon=True, 169 | p1_pgon=True, 170 | class_ids=True) 171 | pl_n, cl_n, id_plcl, x_pnts, p_out, p_in, c_out, c_in = result 172 | # -- 173 | # Get the intersections, new polys, points inside and outside and 174 | # x_pnt ids from `add_intersections`. Swap the order of the last. 
175 | w0 = np.argsort(id_plcl[:, 1]) # get the order and temporarily sort 176 | z = np.zeros((id_plcl.shape[0], 4), dtype=int) 177 | z[:, :2] = id_plcl[:, [1, 0]][w0] 178 | z[1:, 2] = z[1:, 0] - z[:-1, 0] 179 | z[1:, 3] = z[1:, 1] - z[:-1, 1] 180 | onConP = np.copy(z) 181 | # onConP = id_plcl[:, [1, 0]][w0] # slice to rearrange the columns 182 | # -- cut lines, where one crosses the other 183 | # -- two point cut lines, which cross the other polygon 184 | c_cut0, p_cut0 = cut_pairs(onConP[:, :2]) # use onConP since it is sorted 185 | p_cut1, c_cut1 = cut_pairs(id_plcl) # use id_plcl col 0 to save a sort 186 | c_cut = c_cut0 + c_cut1 # sorted(c_cut0 + c_cut1, key=lambda l:l[0]) 187 | p_cut = p_cut0 + p_cut1 # sorted(p_cut0 + p_cut1, key=lambda l:l[0]) 188 | # -- cut lines that are more than two points are either inside or 189 | # outside the other polygon 190 | p_outside = copy.deepcopy(p_out) 191 | p_inside = copy.deepcopy(p_in) 192 | c_outside = copy.deepcopy(c_out) 193 | c_inside = copy.deepcopy(c_in) 194 | # 195 | # Determine preceding points to first clip. 196 | out = [] # p_seen, c_seen = [], [], [] 197 | prev = onConP[0, :2] # -- set the first `previous` for enumerate 198 | p_seen, c_seen = [], [] 199 | in_clp = [] # collect `clipping` segments to use for clip. 200 | kind_ = [] # see below 201 | # -1 symmetrical diff : features don't overlap clip out, poly out 202 | # 0 erase : poly outside of clip 203 | # 1 clip : clip in, poly in 204 | # 2 hole : neither 205 | # 3 identity : features or parts that overlap 206 | for cnt, row in enumerate(onConP[1:], 1): # enumerate from onConP[1:] 207 | # current ids and differences...
this is an intersection point 208 | c_c, p_c, d0, d1 = row # row[:2], row[2], row[3] 209 | c_p, p_p = prev # previous ids 210 | sub, bts, sub0, sub1 = [], [], [], [] 211 | # -- 212 | chk0, chk1, chk2, chk3 = [False, False, False, False] 213 | c_out_f = sum(c_outside, []) # flatten list of sub lists 214 | c_in_f = sum(c_inside, []) 215 | p_out_f = sum(p_outside, []) 216 | p_in_f = sum(p_inside, []) 217 | if len(c_outside) > 0: 218 | chk0 = set([c_p, c_c]).issubset(set(c_out_f)) 219 | if len(c_inside) > 0: 220 | chk1 = set([c_p, c_c]).issubset(set(c_in_f)) 221 | if len(p_outside) > 0: 222 | chk2 = set([p_p, p_c]).issubset(set(p_out_f)) 223 | if len(p_inside) > 0: 224 | chk3 = set([p_p, p_c]).issubset(set(p_in_f)) 225 | # d0, d1, chk0, chk1, chk2, chk3 226 | # -- 227 | # -- d0 clp ids are sequential and are on the densified clp line 228 | # -- d0 should never be <= 0 since you are following clp sequence 229 | # -- When d0 == 1, this is a shared edge between the two polygons 230 | # it is equivalent to `[c_p, c_c] in c_cut` 231 | # -- 232 | if d0 == 1: # this is a `cutting` segment inside `c_cut` 233 | _clp_ = cl_n[[c_p, c_c]] 234 | if chk2: 235 | in_clp += [_clp_] 236 | elif chk3: 237 | in_clp += [pl_n[p_p: p_c + 1]] # poly inside clip 238 | # -- 239 | if d1 > 1: # poly inside and outside check 240 | r_ = in_out_chk(pl_n, p_p, p_c, p_seen, p_outside, p_inside) 241 | out_bits, in_bits = r_ 242 | if len(out_bits) > 0: # -- construct outside bits 243 | tmp = sum(out_bits, []) 244 | bts = [pl_n[tmp]] + [_clp_[::-1]] 245 | out.append(np.concatenate(bts, axis=0)) 246 | p_seen += tmp 247 | kind_.append(0) 248 | if len(in_bits) > 0: # -- inside bits 249 | tmp = sum(in_bits, []) 250 | if tmp[-1] - tmp[0] == 1: # edgy1-eclip last clp [76, 77] 251 | bts = [] 252 | elif p_p == tmp[-1]: # check to see if it is start or end 253 | bts = [pl_n[tmp]] + [_clp_] + [pl_n[tmp[0]][None, :]] 254 | else: 255 | bts = [_clp_] + [pl_n[tmp[::-1]]] 256 | if bts: # diff > 1 257 |
out.append(np.concatenate(bts, axis=0)) 258 | p_seen += tmp 259 | kind_.append(1) 260 | # -- 261 | elif d1 < 0: # not common, but accounted for (eg. E, d0_ polys) 262 | # fix this section 263 | if p_c + 1 in p_seen: # closes possible double cut triangle 264 | if [p_c, p_c + 1] in p_cut: 265 | to_add = pl_n[[p_c, p_c + 1, p_p, p_c]] 266 | kind_.append(1) 267 | out.append(to_add) 268 | bts = _out_p_(max([p_p, p_c]), p_seen, p_outside) 269 | if len(bts) > 0: 270 | # in_clp.extend([]) # fix !!! 271 | s0 = np.concatenate((bts, bts[0][None, :]), axis=0) 272 | kind_.append(0) # was -1 which is wrong 273 | out.append(s0) 274 | # -- try this for cnt = 7 275 | if min([p_p, p_c]) in p_seen: # or [p_p - 1, p_p] in p_on 276 | # in_clp.append([]) # fix !!! 277 | kind_.append(1) # fix !!! 278 | s1 = np.concatenate((_clp_, pl_n[[p_p - 1, p_p]]), axis=0) 279 | in_clp.append(s1) # ?? 2023-09-07 for E,d0_ first clp 280 | out.append(s1) 281 | # -- 282 | elif d1 == 1: # unexpected, but accounted for 283 | sub = [] 284 | # -- 285 | # -- Note: clip can be inside or outside 286 | elif d0 > 1: 287 | if chk0: # clp seg is outside 288 | sub0 = _out_c_(c_c, c_seen, c_outside) 289 | # in_clp += [cl_n[[c_p, c_c]]] # add sorted(...) ?? 290 | # in_clp.append(sub0) # 5-9 291 | elif chk1: # clp seg is inside 292 | sub0 = _in_c_(c_c, c_seen, c_inside) 293 | in_clp += [sub0] 294 | # -- 295 | if d1 < 0: # -- applies to E, d0_ because of wrapping crosses 296 | if chk0: # clip segment outside ply 297 | sub0 = sub0[::-1] if len(sub0) > 0 else [] # ??? 
298 | sub1 = pl_n[p_c:p_p + 1, :][::-1] 299 | if len(sub0) > 0 and len(sub1) > 0: 300 | sub = np.concatenate((sub1, sub0), axis=0) 301 | kind_.append(2) # a hole between the two 302 | else: 303 | sub = [] # or _out_bits_ 304 | # -- 305 | if d1 == 1: # poly ids are sequential, clp is inside or outside 306 | sub1 = pl_n[[p_p, p_c]] 307 | if chk0: 308 | in_clp += [sub1] 309 | sub = np.concatenate((sub0, sub1[::-1]), axis=0) 310 | kind_.append(-1) 311 | elif chk1: 312 | sub = np.concatenate((sub1, sub0[::-1]), axis=0) 313 | kind_.append(0) # ?? check 314 | # -- 315 | elif d1 > 1: # clp inside and outside check 316 | if chk0: # clip segment outside ply, chk3==True 317 | sub1 = _in_p_(p_c, p_seen, p_inside) 318 | if len(sub1) > 0: 319 | in_clp += [sub1] # ?? 320 | sub1 = sub1[::-1] if len(sub1) > 0 else [] 321 | kind_.append(-1) 322 | elif chk1: # clip segment inside ply, chk2==True? 323 | sub1 = _out_p_(p_c, p_seen, p_outside) 324 | if len(sub1) > 0: 325 | sub0 = sub0[::-1] if len(sub0) > 0 else [] 326 | kind_.append(0) 327 | if len(sub0) > 0 and len(sub1) > 0: 328 | sub = np.concatenate((sub0, sub1), axis=0) 329 | if len(sub) > 0: 330 | out.append(sub) 331 | # 332 | """ 333 | k_ = [] if len(kind_) == 0 else kind_[-1] 334 | o_ = np.asarray([]) if len(out) == 0 else out[-1] 335 | val_lst = [cnt, prev, row[:2], row[2:], chk0, chk1, chk2, chk3, k_] 336 | print("cnt {}: {} {} {} {} {} {} {} {}".format(*val_lst)) 337 | prn_(o_, deci=2, width=60, prefix=" ..") 338 | """ 339 | prev = [c_c, p_c] 340 | p_seen.append(p_c) 341 | c_seen.append(c_c) 342 | # # -- 343 | final = np.asarray(out, dtype='O') 344 | if as_geo: 345 | return npg.arrays_to_Geo(final, kind=2, info=None, to_origin=False) 346 | idx = np.array(kind_) 347 | clp_ply = np.concatenate(in_clp, axis=0) # intersect as well 348 | clp_ply = _del_seq_pnts_(clp_ply, True) 349 | 350 | idx_hole = np.nonzero(idx == 2)[0] # holes 351 | idx_all = np.nonzero(idx < 2)[0] # symmetrical difference 352 | idx_p_out = np.nonzero(idx == 
0)[0] # pairwise erase 353 | idx_c_out = np.nonzero(idx != 0)[0] # reverse pairwise erase 354 | idx_c_in = np.nonzero(idx == 1)[0] # clp ?? reverse pairwise erase 355 | # 356 | hole_ply = final[idx_hole] if len(idx_hole) > 0 else [] 357 | symm_ply = final[idx_all] 358 | clp_ply2 = final[idx_c_in] if len(idx_c_in) > 0 else [] 359 | erase_ply = final[idx_p_out] if len(idx_p_out) > 0 else [] 360 | rev_erase = final[idx_c_out] if len(idx_c_out) > 0 else [] 361 | # -- 362 | return erase_ply, clp_ply, clp_ply2, hole_ply, symm_ply, rev_erase 363 | 364 | # -- Extras 365 | 366 | # def on_pairs(col): 367 | # """Return sequential ids from the intersections not in or out.""" 368 | # segs = [] 369 | # for cn, v in enumerate(col[1:], 0): 370 | # prev = col[cn] 371 | # dff = v - prev 372 | # if dff == 1: 373 | # segs.append([prev, v]) 374 | # return segs 375 | 376 | # def _chk_in_lst(_p, _c, _case): 377 | # """Boolean check of poly or clip points. 378 | 379 | # Parameters 380 | # ---------- 381 | # _p, _c : integer 382 | # Previous or current point id values. 383 | # _case : list of lists 384 | # Inside or outside point lists. 385 | 386 | # Notes 387 | # ----- 388 | # This function is used to see if the previous (`_p`) or current (`_c`) 389 | # poly or clip points are inside or outside their counterpart. 390 | # The same function can be used for either case. 391 | # """ 392 | # for lst in _case: 393 | # if _p in lst and _c in lst: 394 | # return True, lst 395 | # return False, [] 396 | 397 | 398 | # ---- Final main section ---------------------------------------------------- 399 | if __name__ == "__main__": 400 | """optional location for parameters""" 401 | print(f"\nRunning... 
{script}\n") 402 | 403 | # out, final = clip_poly( 404 | # all work as of 2023-03-19 405 | # out, final = clip_poly(edgy1, eclip) 406 | # out, final = clip_poly(E, d0_) 407 | # out, final = clip_poly(pl_, cl_) 408 | # out, final = clip_poly(p00, c00) 409 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/old/npg_pgon.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # noqa: D205, D400, F403 3 | r""" 4 | ------------------------------------------ 5 | clp: segment intersection and clipping 6 | ------------------------------------------ 7 | 8 | Stuff 9 | 10 | ---- 11 | 12 | """ 13 | import numpy as np 14 | from numpy.lib.recfunctions import unstructured_to_structured as uts 15 | from numpy.lib.recfunctions import repack_fields 16 | 17 | from npg.npg_plots import plot_polygons 18 | 19 | np.set_printoptions( 20 | edgeitems=10, linewidth=120, precision=2, suppress=True, threshold=200, 21 | formatter={"bool": lambda x: repr(x.astype(np.int32)), 22 | "float_kind": '{: 6.2f}'.format}) 23 | 24 | 25 | # ---- Polygon class, properties and methods 26 | # 27 | class pgon(np.ndarray): 28 | """ 29 | Poly class for clipping. 30 | 31 | Requires 32 | -------- 33 | numpy array. 34 | """ 35 | 36 | def __new__(cls, 37 | arr=None, 38 | CFT=None, 39 | Kind=None, 40 | Info="clipper", 41 | Extent=None, 42 | SR=None 43 | ): 44 | # -- 45 | """See `Docs` for construction notes.""" 46 | arr = np.ascontiguousarray(arr) 47 | CFT = np.ascontiguousarray(CFT) 48 | if (arr.ndim != 2) or (CFT.ndim != 2): 49 | m = "Input error... 
arr.ndim != 2 : {} or CFT.ndim != 2 : {}" 50 | print(m.format(arr.ndim, CFT.ndim)) 51 | return None 52 | if (CFT.shape[-1] < 6) or (Kind not in (1, 2)): 53 | print("CFT requires >= 6 columns and Kind must be 1 or 2") 54 | return None 55 | # -- 56 | self = arr.view(cls) # view as Geo class 57 | self.CFT = CFT # array id, fr-to, cw, part id 58 | self.K = Kind # Points (0), Polylines (1), Polygons (2) 59 | self.Info = Info # any useful information 60 | self.Fr = CFT[:, 0] # from point id 61 | self.To = CFT[:, 1] # to point id 62 | # self.Bf = CFT[:, 2] # previous pnt id 63 | self.eqX = CFT[:, 2] # pnt equal intersection 64 | self.eqO = CFT[:, 3] # pnt equals other 65 | self.inO = CFT[:, 4] # pnt in other 66 | self.CT = CFT[:, 5] # crossing type 67 | return self 68 | 69 | def __array_finalize__(self, src_arr): 70 | """ 71 | Finalize new object.... 72 | 73 | See npgGeo for more details 74 | """ 75 | if src_arr is None: 76 | return 77 | self.CFT = getattr(src_arr, 'CFT', None) 78 | self.K = getattr(src_arr, 'K', None) 79 | self.Info = getattr(src_arr, 'Info', None) 80 | self.Fr = getattr(src_arr, 'Fr', None) 81 | self.To = getattr(src_arr, 'To', None) 82 | # self.Bf = getattr(src_arr, 'Bf', None) 83 | self.eqX = getattr(src_arr, 'eqX', None) 84 | self.eqO = getattr(src_arr, 'eqO', None) 85 | self.inO = getattr(src_arr, 'inO', None) 86 | self.CT = getattr(src_arr, 'CT', None) 87 | 88 | def __array_wrap__(self, out_arr, context=None): 89 | """Wrap it up.""" 90 | return np.ndarray.__array_wrap__(self, out_arr, context) 91 | 92 | # @property 93 | 94 | @property 95 | def CFT_str(self): 96 | """Clip poly structure. See self.structure for more information.""" 97 | nmes = ["Fr_pnt", "To_pnt", "CeX", "CeP", "CinP", "CT"] 98 | return uts(self.CFT, names=nmes, align=False) 99 | 100 | def update(self, eqX, eqOther, inOther): 101 | """Update a column in `CeX`, `CeP`, `CinP`.
102 | 103 | Parameters 104 | ---------- 105 | eqX, eqOther, inOther : ndarrays or lists of values 106 | - poly/clipper equal to an intersection point 107 | - one equals the other point 108 | - one is in the other 109 | CFT column names (3, 4, 5 positionally) 110 | 111 | Notes 112 | ----- 113 | Conversion values are based on binary conversion as shown in the 114 | `keys` line, then reclassed using a dictionary conversion. 115 | 116 | - keys = eqX * 100 + eqOther * 10 + inOther 117 | - 0 is outside 118 | - 1 is inside with no intersection 119 | - position before a 1 is an intersection point not at an endpoint 120 | - 5 endpoint intersects a segment 121 | - 111 (7) clp, poly and intersection meet at a point 122 | """ 123 | k = [0, 1, 10, 11, 100, 101, 110, 111] 124 | v = [0, 1, 2, 3, 4, 5, 6, 7] 125 | d = dict(zip(k, v)) 126 | self.CFT[:, 2][eqX] = 1 127 | self.CFT[:, 3][eqOther] = 1 128 | self.CFT[:, 4][inOther] = 1 129 | keys = self.CFT[:, 2] * 100 + self.CFT[:, 3] * 10 + self.CFT[:, 4] 130 | vals = [d[i] for i in keys.tolist()] 131 | self.CFT[:, 5] = vals 132 | return 133 | 134 | 135 | # ---- 136 | def array_CFT(in_arrays, shift_to_origin=False): 137 | """Produce the Geo array. Construction information in `npgDocs`. 138 | 139 | Parameters 140 | ---------- 141 | in_arrays : list of ndarrays 142 | shift_to_origin : boolean 143 | True, moves the geometry to the origin. False, uses the existing 144 | coordinates. 
145 | """ 146 | out_arrays = [] 147 | info_ = ["target", "clipper"] 148 | # -- create the polygons 149 | # 150 | for cnt, arr in enumerate(in_arrays): 151 | ids = np.arange(len(arr) + 1) 152 | f = ids[:-1, None] 153 | t = np.concatenate((ids[1:-1], [ids[0]]))[:, None] 154 | fr_to = np.concatenate((f, t), axis=1) 155 | # prev_ = np.concatenate((ids[[-2]], ids[:-2]))[:, None] 156 | CFT = np.concatenate( 157 | (fr_to, np.full((arr.shape[0], 4), 0)), axis=1) # prev_, 158 | extent = np.array([np.min(arr, axis=0), np.max(arr, axis=0)]) 159 | if shift_to_origin: 160 | arr = arr - extent[0] 161 | _type = info_[cnt] 162 | p = pgon(arr, CFT, 2, _type) 163 | out_arrays.append([p, CFT, extent]) 164 | out_ = out_arrays 165 | if len(out_arrays) == 1: 166 | out_ = out_arrays[0] 167 | return out_ 168 | -------------------------------------------------------------------------------- /arcpro_npg/npg/npg/testing.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | """ 3 | Created on Tue Feb 6 18:31:53 2024 4 | 5 | @author: dan_p 6 | """ 7 | 8 | import sys 9 | import numpy as np 10 | import npg 11 | from npg.npGeo import roll_arrays 12 | from npg.npg_plots import plot_polygons # noqa 13 | from copy import deepcopy 14 | from npg_bool_ops import add_intersections, connections_dict, _renumber_pnts_ 15 | 16 | def tst(poly, clp): 17 | result = add_intersections( 18 | poly, clp, 19 | roll_to_minX=True, 20 | p0_pgon=True, p1_pgon=True, 21 | class_ids=False 22 | ) 23 | # pl_ioo polygon in-out-on, cl_ioo for clip 24 | pl_n, cl_n, id_plcl, onConP, x_pnts, ps_info, cs_info = result 25 | p_out, p_on, p_in, pl_ioo = ps_info 26 | c_out, c_on, c_in, cl_ioo = cs_info 27 | # 28 | p_col = pl_ioo[:, 1] # noqa 29 | c_col = cl_ioo[:, 1] # noqa 30 | # 31 | N_c = cl_n.shape[0] - 1 # noqa 32 | N_p = pl_n.shape[0] - 1 # noqa 33 | # 34 | p_ft = list(zip(p_on[:-1], p_on[1:] + 1)) 35 | p_subs = [pl_ioo[i[0]: (i[1])] for i in p_ft] 36 | p_vals = 
[sub[1][1] for sub in p_subs] 37 | p_ft_v = np.concatenate((p_ft, np.array(p_vals)[:, None]), axis=1) 38 | # 39 | # 40 | c_ft = list(zip(c_on[:-1], c_on[1:] + 1)) 41 | c_subs = [cl_ioo[i[0]:i[1]] for i in c_ft] 42 | c_vals = [sub[1][1] for sub in c_subs] 43 | c_ft_v = np.concatenate((c_ft, np.array(c_vals)[:, None]), axis=1) 44 | # 45 | new_ids, old_new, CP_ = _renumber_pnts_(cl_n, pl_n) 46 | # 47 | # -- produce the connections dictionary 48 | c_ft_orig = np.array(list(zip(np.arange(0, N_c), np.arange(1, N_c + 1)))) 49 | # p_ft_orig = np.array(list(zip(np.arange(0, N_p), np.arange(1, N_p + 1)))) 50 | p_ft_new = np.array(list(zip(old_new[:-1, 2], old_new[1:, 2]))) 51 | ft_new = np.concatenate((c_ft_orig, p_ft_new), axis=0) 52 | conn_dict = connections_dict(ft_new, bidirectional=True) 53 | # 54 | # -- CP_ can be used to plot the combined, resultant polygon 55 | # -- poly = CP_[old_new[:, 2]] # to plot poly from CP_ 56 | # -- clp = CP_[old_new[:N_c + 1, 0]] # to plot poly from CP_ 57 | # 58 | # -- # uniq points and lex sorted l-r b-t 59 | cp_uni, idx_cp = np.unique(CP_, True, axis=0) 60 | lex_ids = new_ids[idx_cp] 61 | # -- poly, clp segments 62 | clp_s = [cl_n[i:j] for i,j in c_ft_v[:, :2]] 63 | ply_s = [pl_n[i:j] for i,j in p_ft_v[:, :2]] 64 | c_segs = np.arange(c_ft_v.shape[0]) 65 | p_segs = np.arange(p_ft_v.shape[0]) 66 | 67 | 68 | # pon_whr = np.nonzero((p_on == old_new[:, 0][:, None]).any(-1)) 69 | # pon_new = old_new[pon_whr] 70 | 71 | # p_new = new_ids[N_c + 1:] 72 | # p_ft_new = 73 | # c0, c1 = p_ft_v[:, :2].T 74 | # wc0 = np.nonzero((c0== old_new[:, 0][:, None]).any(-1))[0] 75 | # wc1 = np.nonzero((c1 == old_new[:, 0][:, None]).any(-1))[0] 76 | 77 | # p_fnew = old_new[c0][:, -1] 78 | # 79 | # p_out_new = p_out + N_c + 1 80 | # p_in_new = p_in + N_c + 1 81 | # p_on_new = p_on + N_c + 1 82 | 83 | p_ft_v_new = np.copy(p_ft_v) 84 | p_ft_v_new[:, 0] += N_c + 1 85 | p_ft_v_new[:, 1] += N_c + 1 86 | # 87 | # ft_c_p_new = np.concatenate((c_ft_v, p_ft_v), axis= 1) # 
or ... 88 | ft_c_p_new = np.concatenate((c_ft_v, p_ft_v_new), axis= 0) 89 | cp_dict = connections_dict(ft_c_p_new[:, :2], bidirectional=False) # single 90 | cp_dict = connections_dict(ft_c_p_new[:, :2], bidirectional=True) # multiple 91 | """ 92 | onConP = np.array([[0, 0, 0, 0], 93 | [1, 2, 1, 2], 94 | [2, 3, 1, 1], 95 | [3, 10, 1, 7], 96 | [5, 9, 2, -1], 97 | [6, 4, 1, -5], 98 | [7, 5, 1, 1], 99 | [8, 8, 1, 3], 100 | [10, 15, 2, 7], 101 | [11, 18, 1, 3], 102 | [12, 19, 1, 1], 103 | [13, 14, 1, -5], 104 | [15, 13, 2, -1], 105 | [16, 20, 1, 7], 106 | [17, 21, 1, 1]]) 107 | 108 | id_plcl = np.array([[0, 0], 109 | [2, 1], 110 | [3, 2], 111 | [4, 6], 112 | [5, 7], 113 | [8, 8], 114 | [9, 5], 115 | [10, 3], 116 | [13, 15], 117 | [14, 13], 118 | [15, 10], 119 | [18, 11], 120 | [19, 12], 121 | [20, 16], 122 | [21, 17]]) 123 | 124 | x_pnts = np.array([[0.00, 0.00], [2.00, 0.00], [2.00, 10.00], 125 | [2.67, 2.00], [3.33, 2.00], [3.60, 8.00], 126 | [4.00, 0.00], [4.00, 10.00], [4.80, 8.00], 127 | [5.50, 2.00], [6.00, 0.00], [6.00, 10.00], 128 | [7.00, 8.00], [8.00, 10.00]]) 129 | 130 | cl_n = [[0.0, 0.0], [2.0, 10.0], [4.0, 10.0], [3.60, 8.0], 131 | [3.0, 5.0], [4.8, 8.0], [6.0, 10.0], [8.0, 10.0], [7.0, 8.0], 132 | [5.0, 4.0], [5.5, 2.0], [6.0, 0.0], [4.0, 0.0], 133 | [3.33, 2.0], [3.0, 3.0], [2.67, 2.0], 134 | [2.0, 0.0], [0.0, 0.0]] 135 | cl_n = np.array(cl_n) 136 | 137 | pl_n = [[0.0, 0.0], [0.0, 10.0], [2.0, 10.0], [4.0, 10.0], [6.0, 10.0], 138 | [8.0, 10.0], [10.0, 10.0], [10.0, 8.0], [7.0, 8.0], [4.8, 8.0], 139 | [3.6, 8.0], [2.0, 8.0], [2.0, 2.0], 140 | [2.67, 2.0], [3.33, 2.0], [5.5, 2.0], 141 | [10.0, 2.0], [10.0, 0.0], [6.0, 0.0], [4.0, 0.0], [2.0, 0.0], 142 | [0.0, 0.0]] 143 | 144 | pl_n = np.array(pl_n) 145 | 146 | CP_ = np.concatenate((cl_n, pl_n), axis=0) 147 | 148 | # find first intersections 149 | ids = np.arange(0, CP_.shape[0]) 150 | N_c = cl_n.shape[0] - 1 151 | N_p = pl_n.shape[0] - 1 152 | 153 | 154 | 155 | wp, wc = np.nonzero((cl_n == pl_n[:, 
None]).all(-1)) # id_plcl 156 | # [ 0, 0, 2, 3, 4, 5, 8, 9, 10, 13, 14, 15, 18, 19, 20, 21, 21] 157 | # [ 0, 17, 1, 2, 6, 7, 8, 5, 3, 15, 13, 10, 11, 12, 16, 0, 17] 158 | 159 | wc0, wp0 = np.nonzero((pl_n == cl_n[:, None]).all(-1)) # onConP 160 | # [ 0, 0, 1, 2, 3, 5, 6, 7, 8, 10, 11, 12, 13, 15, 16, 17, 17] 161 | # [ 0, 21, 2, 3, 10, 9, 4, 5, 8, 15, 18, 19, 14, 13, 20, 0, 21] 162 | 163 | wpx, xp = np.nonzero((x_pnts == pl_n[:, None]).all(-1)) 164 | # [ 0, 2, 3, 4, 5, 8, 9, 10, 13, 14, 15, 18, 19, 20, 21] 165 | # [ 0, 2, 7, 11, 13, 12, 8, 5, 3, 4, 9, 10, 6, 1, 0] 166 | 167 | wcx, xc = np.nonzero((x_pnts == cl_n[:, None]).all(-1)) 168 | # [ 0, 1, 2, 3, 5, 6, 7, 8, 10, 11, 12, 13, 15, 16, 17] 169 | # [ 0, 2, 7, 5, 8, 11, 13, 12, 9, 10, 6, 4, 3, 1, 0] 170 | 171 | # --- new attempt with x_pnts lex sorted from top left 172 | # 173 | x_lex_ids = np.lexsort((-x_pnts[:, 1], x_pnts[:, 0])) 174 | x_lex = x_pnts[x_lex_ids] 175 | 176 | wpx_lex, xp_lex = np.nonzero((x_lex == pl_n[:, None]).all(-1)) 177 | wcx_lex, xc_lex = np.nonzero((x_lex == cl_n[:, None]).all(-1)) 178 | 179 | 180 | c_ids = np.arange(0, N_c + 1) 181 | c_ids[-1] = 0 182 | p_ids = np.arange(0, N_p + 1) 183 | p_ids[-1] = 0 184 | p_ids[1:-1] = np.arange(c_ids[-2] + 1, N_c + N_p - 1) 185 | 186 | cp_ids = np.concatenate((c_ids, p_ids), axis=0) 187 | 188 | zz = wp0[1:-1] + N_c + 1 189 | zz0 = list(zip(zz, wc0[1:-1])) 190 | zz0 = np.array(zz0) 191 | zz1 = np.copy(cp_ids) 192 | r0, r1 = (zz1 == zz0[:, 0][:, None]).nonzero() 193 | zz1[r1] = zz0[:, 1] 194 | zz1[(zz1 == N_c).nonzero()] = 0 195 | 196 | frto = np.concatenate((zz1[:-1][:, None], zz1[1:][:, None]), axis=1) 197 | p = CP_[zz1] 198 | 199 | # from 200 | # https://stackoverflow.com/questions/48705143/efficiency-2d-list-to-dictionary-in-python 201 | # second element is the key 202 | l = frto 203 | d = {} 204 | for elem in l: 205 | if elem[1] in d: 206 | d[elem[1]].append(elem[0]) 207 | else: 208 | d[elem[1]] = [elem[0]] 209 | 210 | # first element is the key 211 | 
l = frto 212 | d = {} 213 | for elem in l: 214 | if elem[0] in d: 215 | d[elem[0]].append(elem[1]) 216 | else: 217 | d[elem[0]] = [elem[1]] 218 | 219 | 220 | # Splitting example 221 | # ========================= 222 | # polygons poly, clp: E, d0_ 223 | 224 | p_ft_v = np.array([[0, 3, -1], 225 | [2, 6, -1], 226 | [5, 7, 0], 227 | [6, 9, -1], 228 | [8, 14, 1], 229 | [13, 16, -1], 230 | [15, 17, 0], 231 | [16, 20, -1], 232 | [19, 22, -1]]) 233 | 234 | c_ft_v = np.array([[0, 2, 0], 235 | [1, 4, -1], 236 | [3, 5, 0], 237 | [4, 6, 0], 238 | [5, 10, -1], 239 | [9, 11, 0], 240 | [10, 12, 0], 241 | [11, 14, -1], 242 | [13, 15, 0]]) 243 | 244 | 245 | out_p = [] 246 | for i in p_ft_v[:, :2]: 247 | fr_, to_ = i 248 | out_p.append(pl_n[fr_:to_]) 249 | out_c = [] 250 | for i in c_ft_v[:, :2]: 251 | fr_, to_ = i 252 | out_c.append(cl_n[fr_:to_]) 253 | 254 | # -- out_p 255 | 256 | # [array([[ 0.00, 5.00], 257 | # [ 0.00, 10.00], 258 | # [ 5.00, 10.00]]), 259 | # array([[ 5.00, 10.00], 260 | # [ 10.00, 10.00], 261 | # [ 10.00, 8.00], 262 | # [ 6.33, 8.00]]), 263 | # array([[ 6.33, 8.00], 264 | # [ 3.67, 8.00]]), 265 | # array([[ 3.67, 8.00], 266 | # [ 2.00, 8.00], 267 | # [ 2.00, 6.33]]), 268 | # array([[ 2.00, 6.33], 269 | # [ 2.00, 5.50], 270 | # [ 5.00, 5.50], 271 | # [ 5.00, 4.00], 272 | # [ 2.00, 4.00], 273 | # [ 2.00, 3.67]]), 274 | # array([[ 2.00, 3.67], 275 | # [ 2.00, 2.00], 276 | # [ 3.67, 2.00]]), 277 | # array([[ 3.67, 2.00], 278 | # [ 6.33, 2.00]]), 279 | # array([[ 6.33, 2.00], 280 | # [ 10.00, 2.00], 281 | # [ 10.00, 0.00], 282 | # [ 5.00, 0.00]]), 283 | # array([[ 5.00, 0.00], 284 | # [ 0.00, 0.00], 285 | # [ 0.00, 5.00]])] 286 | 287 | # -- out_c 288 | 289 | # [array([[ 0.00, 5.00], 290 | # [ 2.00, 6.33]]), 291 | # array([[ 2.00, 6.33], 292 | # [ 3.00, 7.00], 293 | # [ 3.67, 8.00]]), 294 | # array([[ 3.67, 8.00], 295 | # [ 5.00, 10.00]]), 296 | # array([[ 5.00, 10.00], 297 | # [ 6.33, 8.00]]), 298 | # array([[ 6.33, 8.00], 299 | # [ 7.00, 7.00], 300 | # [ 10.00, 
5.00], 301 | # [ 7.00, 3.00], 302 | # [ 6.33, 2.00]]), 303 | # array([[ 6.33, 2.00], 304 | # [ 5.00, 0.00]]), 305 | # array([[ 5.00, 0.00], 306 | # [ 3.67, 2.00]]), 307 | # array([[ 3.67, 2.00], 308 | # [ 3.00, 3.00], 309 | # [ 2.00, 3.67]]), 310 | # array([[ 2.00, 3.67], 311 | # [ 0.00, 5.00]])] 312 | 313 | 314 | ps = np.array_split(pl_n, p_ft_v[:, 1], axis=0) 315 | 316 | # [array([[ 0.00, 5.00], 317 | # [ 0.00, 10.00], 318 | # [ 5.00, 10.00]]), 319 | # array([[ 10.00, 10.00], 320 | # [ 10.00, 8.00], 321 | # [ 6.33, 8.00]]), 322 | # array([[ 3.67, 8.00]]), 323 | # array([[ 2.00, 8.00], 324 | # [ 2.00, 6.33]]), 325 | # array([[ 2.00, 5.50], 326 | # [ 5.00, 5.50], 327 | # [ 5.00, 4.00], 328 | # [ 2.00, 4.00], 329 | # [ 2.00, 3.67]]), 330 | # array([[ 2.00, 2.00], 331 | # [ 3.67, 2.00]]), 332 | # array([[ 6.33, 2.00]]), 333 | # array([[ 10.00, 2.00], 334 | # [ 10.00, 0.00], 335 | # [ 5.00, 0.00]]), 336 | # array([[ 0.00, 0.00], 337 | # [ 0.00, 5.00]]), 338 | # array([], shape=(0, 2), dtype=float64)] 339 | 340 | 341 | 342 | 343 | # def line_sweep(segments): 344 | # events = [] # List to store events (start and end points of segments) 345 | # for segment in segments: 346 | # events.append((segment.start, 'start', segment)) 347 | # events.append((segment.end, 'end', segment)) 348 | 349 | # events.sort() # Sort the events by x-coordinate 350 | 351 | # active_segments = set() # Set to keep track of active segments 352 | 353 | # intersections = [] # List to store intersection points 354 | 355 | # for event in events: 356 | # point, event_type, segment = event 357 | # if event_type == 'start': 358 | # for active_segment in active_segments: 359 | # if intersection(segment, active_segment): 360 | # intersections.append((point, active_segment, segment)) 361 | # active_segments.add(segment) 362 | # else: # event_type == 'end' 363 | # active_segments.remove(segment) 364 | 365 | # return intersections 366 | """ 367 | 368 | 369 | 
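The `(arr_a == arr_b[:, None]).all(-1)` pattern used repeatedly above finds matching coordinate rows between two point arrays. A minimal sketch with made-up data (the array values are illustrative, not the `pl_n`/`cl_n` points from the sample):

```python
import numpy as np

# Illustrative (n, 2) coordinate arrays (not the pl_n/cl_n data above);
# rows 1 and 2 of `a` also occur in `b`.
a = np.array([[0.0, 5.0], [5.0, 10.0], [2.0, 6.33], [9.0, 9.0]])
b = np.array([[5.0, 10.0], [2.0, 6.33], [7.0, 7.0]])

# b[:, None] has shape (len(b), 1, 2); comparing against `a` broadcasts to
# (len(b), len(a), 2), and .all(-1) yields a boolean row-match matrix.
w_b, w_a = np.nonzero((a == b[:, None]).all(-1))
# w_b indexes rows of `b`, w_a the matching rows of `a`; the array wrapped
# with [:, None] supplies the first index, as in the wc0/wp0 pairs above.
```

Exact float equality only works here because the matched rows hold copies of the same values; for computed intersection points, `np.isclose(a, b[:, None]).all(-1)` is the safer comparison.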
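The two `frto` dictionary loops above (adapted from the linked Stack Overflow answer) can be condensed with `collections.defaultdict`; a sketch over a small hypothetical from-to array:

```python
from collections import defaultdict

import numpy as np

frto = np.array([[0, 1], [1, 2], [2, 0], [1, 3]])  # hypothetical from-to pairs

succ = defaultdict(list)   # key on the first element (the `from` node)
pred = defaultdict(list)   # key on the second element (the `to` node)
for fr, to in frto:
    succ[int(fr)].append(int(to))
    pred[int(to)].append(int(fr))
```

`defaultdict(list)` removes the explicit `if key in d` branch; both mappings are built in a single pass over the pairs.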
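On the `np.array_split(pl_n, p_ft_v[:, 1], axis=0)` result above: splitting on the `to` column cuts only at those indices, so overlapping from-to ranges collapse to single-point fragments (plus a trailing empty array when the last index equals the array length), unlike the explicit `fr_:to_` loop, which repeats the shared boundary points in adjacent pieces. A small illustration with made-up data:

```python
import numpy as np

arr = np.arange(10).reshape(5, 2)   # five 2D "points", illustrative only
ft = np.array([[0, 3], [2, 5]])     # overlapping from-to ranges

# Explicit slicing: the boundary row arr[2] appears in both pieces.
pieces = [arr[fr:to] for fr, to in ft]

# array_split cuts only at the `to` indices 3 and 5,
# yielding arr[:3], arr[3:5] and the empty arr[5:].
splits = np.array_split(arr, ft[:, 1], axis=0)
```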
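The commented `line_sweep` outline above leaves `Segment` and `intersection` undefined. Below is a runnable sketch of the same idea, using a hypothetical `Segment` namedtuple and a standard orientation-based proper-crossing test; it is a simplified sweep, not the npg implementation, and it ignores collinear and shared-endpoint cases:

```python
from collections import namedtuple

Segment = namedtuple("Segment", ["start", "end"])  # points are (x, y) tuples


def _cross(o, a, b):
    """2D cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])


def intersects(s1, s2):
    """True if s1 and s2 properly cross (collinear/touching cases excluded)."""
    d1 = _cross(s2.start, s2.end, s1.start)
    d2 = _cross(s2.start, s2.end, s1.end)
    d3 = _cross(s1.start, s1.end, s2.start)
    d4 = _cross(s1.start, s1.end, s2.end)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)


def line_sweep(segments):
    """Sweep left to right, testing each starting segment against active ones."""
    events = []
    for i, seg in enumerate(segments):
        p, q = sorted([seg.start, seg.end])  # orient each segment left-to-right
        events.append((p, 0, i))             # kind 0: start sorts before end
        events.append((q, 1, i))             # kind 1: end
    events.sort()
    active = set()
    crossings = []
    for _point, kind, i in events:
        if kind == 0:
            for j in active:
                if intersects(segments[i], segments[j]):
                    crossings.append((i, j))
            active.add(i)
        else:
            active.discard(i)
    return crossings
```

The brute-force check against every active segment keeps the sketch short; a production sweep (Bentley-Ottmann) would instead maintain a y-ordered status structure and only test neighbouring segments.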
-------------------------------------------------------------------------------- /arcpro_npg/npg/npg_tools.tbx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/arcpro_npg/npg/npg_tools.tbx -------------------------------------------------------------------------------- /assets/css/style.scss: -------------------------------------------------------------------------------- 1 | --- 2 | --- 3 | 4 | @import "{{ site.theme }}"; 5 | /* Wrapper change from 960px */ 6 | .wrapper { 7 | width:1280px; 8 | } 9 | 10 | /* Code blocks */ 11 | 12 | code, pre { 13 | font-family: Monaco, "Bitstream Vera Sans Mono", "Lucida Console", Terminal, monospace; 14 | color:#000; 15 | font-size:12px; 16 | } 17 | 18 | pre { 19 | padding: 4px 12px; 20 | background: #FDFEFB; 21 | border-radius:4px; 22 | border:1px solid #D7D8C8; 23 | overflow: auto; 24 | overflow-y: hidden; 25 | margin-bottom: 32px; 26 | } 27 | 28 | /* Section - for main page content changed from 650*/ 29 | 30 | section { 31 | width:960px; 32 | float:right; 33 | padding-bottom:50px; 34 | } 35 | 36 | /* Tables */ 37 | 38 | table { 39 | width:100%; 40 | } 41 | 42 | table { 43 | border: 1px solid #ccc; 44 | margin-bottom: 32px; 45 | text-align: left; 46 | } 47 | 48 | th { 49 | font-family: 'Arvo', Helvetica, Arial, sans-serif; 50 | font-size: 14px; 51 | font-weight: normal; 52 | padding: 10px; 53 | background: #f2f6ff; 54 | color: #000000; 55 | } 56 | 57 | td { 58 | padding: 10px; 59 | background: #ccc; 60 | } 61 | 62 | /* Footer */ 63 | 64 | footer { 65 | width:170px; 66 | float:left; 67 | position:fixed; 68 | bottom:10px; 69 | padding-left: 50px; 70 | } 71 | 72 | /* changed max width from 960px */ 73 | @media print, screen and (max-width: 1280px) { 74 | 75 | div.wrapper { 76 | width:auto; 77 | margin:0; 78 | } 79 | 80 | header, section, footer { 81 | float:none; 82 | position:static; 83 | 
width:auto; 84 | } 85 | 86 | footer { 87 | border-top: 1px solid #ccc; 88 | margin:0 84px 0 50px; 89 | padding:0; 90 | } 91 | 92 | header { 93 | padding-right:320px; 94 | } 95 | 96 | section { 97 | padding:20px 84px 20px 50px; 98 | margin:0 0 20px; 99 | } 100 | 101 | header a small { 102 | display:inline; 103 | } 104 | 105 | header ul { 106 | position:absolute; 107 | right:130px; 108 | top:84px; 109 | } 110 | } 111 | -------------------------------------------------------------------------------- /images/1_README.md: -------------------------------------------------------------------------------- 1 | ## Image files ## 2 | 3 | Copies of documentation images are kept here. 4 | -------------------------------------------------------------------------------- /images/Shape2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/Shape2.png -------------------------------------------------------------------------------- /images/Voronoi2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/Voronoi2.png -------------------------------------------------------------------------------- /images/bad_shape.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/bad_shape.png -------------------------------------------------------------------------------- /images/circles.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/circles.png -------------------------------------------------------------------------------- 
/images/clones2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/clones2.png -------------------------------------------------------------------------------- /images/containers.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/containers.png -------------------------------------------------------------------------------- /images/double_cross_b3c1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/double_cross_b3c1.png -------------------------------------------------------------------------------- /images/double_cross_b4c1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/double_cross_b4c1.png -------------------------------------------------------------------------------- /images/extent_to_poly02.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/extent_to_poly02.png -------------------------------------------------------------------------------- /images/hexagons.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/hexagons.png -------------------------------------------------------------------------------- /images/npGeo.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npGeo.png -------------------------------------------------------------------------------- /images/npGeo_conversion_tools.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npGeo_conversion_tools.png -------------------------------------------------------------------------------- /images/npGeomTools.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npGeomTools.png -------------------------------------------------------------------------------- /images/npGeom_containers_tools0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npGeom_containers_tools0.png -------------------------------------------------------------------------------- /images/npg_arc_npg.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npg_arc_npg.png -------------------------------------------------------------------------------- /images/npg_create.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npg_create.png -------------------------------------------------------------------------------- /images/npg_io.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/npg_io.png -------------------------------------------------------------------------------- /images/single_cross_c2CC.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/single_cross_c2CC.png -------------------------------------------------------------------------------- /images/single_cross_s00_t0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/single_cross_s00_t0.png -------------------------------------------------------------------------------- /images/sq.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/sq.png -------------------------------------------------------------------------------- /images/sq2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Dan-Patterson/numpy_geometry/ef7acd77b8d98c1a2cef8a9dcfaeb3e54b79c47d/images/sq2.png --------------------------------------------------------------------------------