├── .gitignore
├── LICENSE
├── Readme.md
├── __init__.py
├── core
│   ├── __init__.py
│   ├── material.py
│   ├── objects.py
│   ├── operator.py
│   ├── prop.py
│   ├── ui.py
│   └── utils.py
├── images
│   ├── attributes.png
│   ├── new_object_empty.png
│   ├── new_object_mesh.png
│   ├── semantic_property.png
│   ├── semantics.png
│   └── world_properties.png
└── pylintrc
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | pip-wheel-metadata/
24 | share/python-wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 |
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 |
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 |
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .nox/
44 | .coverage
45 | .coverage.*
46 | .cache
47 | nosetests.xml
48 | coverage.xml
49 | *.cover
50 | *.py,cover
51 | .hypothesis/
52 | .pytest_cache/
53 |
54 | # Translations
55 | *.mo
56 | *.pot
57 |
58 | # Django stuff:
59 | *.log
60 | local_settings.py
61 | db.sqlite3
62 | db.sqlite3-journal
63 |
64 | # Flask stuff:
65 | instance/
66 | .webassets-cache
67 |
68 | # Scrapy stuff:
69 | .scrapy
70 |
71 | # Sphinx documentation
72 | docs/_build/
73 |
74 | # PyBuilder
75 | target/
76 |
77 | # Jupyter Notebook
78 | .ipynb_checkpoints
79 |
80 | # IPython
81 | profile_default/
82 | ipython_config.py
83 |
84 | # pyenv
85 | .python-version
86 |
87 | # pipenv
88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
91 | # install all needed dependencies.
92 | #Pipfile.lock
93 |
94 | # celery beat schedule file
95 | celerybeat-schedule
96 |
97 | # SageMath parsed files
98 | *.sage.py
99 |
100 | # Environments
101 | .env
102 | .venv
103 | env/
104 | venv/
105 | ENV/
106 | env.bak/
107 | venv.bak/
108 |
109 | # Spyder project settings
110 | .spyderproject
111 | .spyproject
112 |
113 | # Rope project settings
114 | .ropeproject
115 |
116 | # mkdocs documentation
117 | /site
118 |
119 | # mypy
120 | .mypy_cache/
121 | .dmypy.json
122 | dmypy.json
123 |
124 | # Pyre type checker
125 | .pyre/
126 |
127 | .noseids
128 | .vscode/
129 |
130 | test.json
131 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 Konstantinos Mastorakis
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/Readme.md:
--------------------------------------------------------------------------------
1 | *This add-on was originally developed by Konstantinos Mastorakis ([konmast3r](https://github.com/konmast3r/)) as part of their research orientation project for the MSc Geomatics programme of TU Delft. Its functionality was further developed for the needs of their MSc thesis [An integrative workflow for 3D city model versioning](http://resolver.tudelft.nl/uuid:a7f7f0c8-7a34-454e-973a-d55f5b8b0dfe)*
2 |
3 | I plan to keep developing this add-on as a free-time project, since I enjoy it a lot and I see that there are many enthusiasts out there already using it in ways more serious than I expected. Although I am very excited to see that, I can't guarantee how much and how often `Up3date` is going to be further developed.
4 |
5 |
6 | # Up3date
7 |
8 | A Blender add-on to import, edit and export new instances of [CityJSON](http://cityjson.org)-encoded 3D city models. All buildings' levels of detail (LoD), attributes and semantic surfaces are stored and can be accessed via Blender's graphical interface.
9 |
10 |
11 | ## Requirements
12 |
13 | - Blender Version >=2.80
14 |
15 |
16 | ## Testing Datasets
17 |
18 | You can find sample datasets at the official [CityJSON](https://www.cityjson.org/datasets/#datasets-converted-from-citygml) website. In case you have 3D city model datasets encoded in `CityGML`, you can use the free [Conversion Tool](https://www.cityjson.org/help/users/conversion/) to convert between `CityGML` and `CityJSON`.
19 |
20 | Importing really big datasets such as `New York` will take several minutes because of the amount of information they contain. The rest of the sample `CityJSON` files should load noticeably faster. Depending on your machine, it could take from a few seconds up to a few minutes to import the 3D city model.
21 |
22 |
23 | ## Installation
24 |
25 | 1. Download this repository as a ZIP (on GitHub, this can be done through the `Clone or download` > `Download ZIP` button).
26 |
27 | 2. Run `Blender` and go to `Edit > Preferences > Add-ons` and press the `Install...` button.
28 |
29 | 3. Select the downloaded ZIP and press `Install Add-on from File...`.
30 |
31 | 4. Enable the add-on from the list by ticking the empty box next to the add-on's name.
32 | *(Optional: If you wish to update to a newer version, un-tick and re-tick the add-on to reload it!)*
33 |
34 |
35 | ## Usage
36 |
37 | ### Before you start!
38 |
39 | - For a better understanding of the logic behind the add-on, it is **strongly** recommended to have a quick (or thorough :-)) look at the [CityJSON documentation](https://www.cityjson.org/specs/1.0.1/#city-object) if you are unfamiliar with it.
40 |
41 | - If you run `Blender` from the console, useful feedback about the progress of the import and export process is printed there. This can prove quite useful, since big files can take up to several minutes to import or export, and an error message is also printed in case of a crash, which helps with debugging.*
42 | **\*Important: Make sure `Blender's` viewport is in `Object Mode` before importing and exporting a CityJSON file.**
43 |
44 |
45 | ### Importing a 3D city model
46 |
47 | Go to `File > Import > CityJSON (.json)` and navigate to the directory where the `CityJSON` file is stored and open it.
48 |
49 |
50 |
51 | #### Options
52 |
53 | The following options are available during file selection:
54 |
55 | * `Materials' type`:
56 | * `Surfaces` will create materials according to semantic surfaces *(e.g. RoofSurface, WallSurface)*, if present, and load their attributes.
57 | * `City Objects` will create materials per city object, according to the city object's type *(e.g. Building, Road)*.
58 | * `Reuse materials`: Enable this if you want semantic surface materials to be reused when they share the same type. For example, all `RoofSurface` faces will have the same material. *This only works when `Surfaces` is selected as `Materials' type`*.
59 | **\*Important: Greatly improves loading speed, but semantic surfaces' attributes, if present, can be lost!**
60 | * `Clean scene`: Enable this if you want the script to clean the scene of any existing object prior to importing the `CityJSON` objects.
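
The same options are exposed on the import operator itself, so the import can also be scripted from `Blender's` Python console. A minimal sketch (the file path is a placeholder; the operator name and option identifiers come from the add-on's `__init__.py`):

```python
import bpy

# Scripted equivalent of File > Import > CityJSON (.json)
bpy.ops.cityjson.import_file(
    filepath="/path/to/model.json",  # placeholder path to a CityJSON file
    material_type='SURFACES',        # or 'CITY_OBJECTS'
    reuse_materials=True,            # share materials of the same surface type
    clean_scene=True                 # remove existing objects before importing
)
```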
61 |
62 |
63 |
64 | #### Useful tips
65 |
66 | - After a successful import, you should be able to see the model somewhere close to the axis origin. Rotating the scene and zooming in and out might help you locate the model.
67 | In case you can't see the model, select an object from the `Outliner`* (always in `Object Mode`) and click `View > Frame Selected`, or press the `Home` key right after importing and then try zooming in.
68 | **\*Important: Make sure the object you are selecting is a `mesh object` and not an `empty object`. You can check that from the small downward-pointing triangle icon next to the object's name.**
69 |
70 | - A separate `Collection` is created for each `LoD` present in the 3D city model. If more than one geometry exists for an object -representing different `LoDs` (levels of detail)-, every geometry is stored under the appropriate `Collection`, under the parent `CityObject`. You can toggle the visibility of `Collections` by clicking the `eye icon` in the `Outliner` at the top right of the interface (see screenshot below). By default, all the `LoD` collections are visible right after importing the 3D city model; if you see any artifacts, that is the reason! Keeping only one collection visible should remove them.
71 |
72 | - In case you want to visualize a certain area, press `Shift + B` and draw a rectangle with your mouse to zoom into that specific area of the 3D city model. This also moves the rotation center to that point, which comes in handy when you want to inspect specific areas of the model.
73 |
74 | - To see the attributes of each object, simply select the object on the screen and click on the `Object Properties` tab on the bottom right of `Blender's` interface. Then click the `Custom Properties` drop-down menu (see screenshot below).
75 |
76 | - To see the semantics of each surface, select an object in `Object Mode`, hit `TAB` to toggle `Edit Mode` and click `Face Select` (top left of the viewport, between the `Edit Mode` and the `View` button). Select a face of the object and click on the `Material Properties` tab at the bottom right. Scroll down and click on `Custom Properties` (see screenshot below).
77 |
78 |
79 | 
80 | 
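
The same information can also be read from `Blender's` Python console. A minimal sketch, assuming the active object is a geometry `mesh object` of an imported model that has semantic surfaces and was imported with `Surfaces` as `Materials' type`:

```python
import bpy

geom = bpy.context.active_object   # a geometry mesh of the imported model

# The CityObject's attributes live on the parent Empty as Custom Properties
parent = geom.parent
print({key: parent[key] for key in parent.keys() if key != '_RNA_UI'})

# Semantics live on the materials: each face points to a material whose
# 'type' Custom Property holds the semantic surface type
for face in geom.data.polygons:
    material = geom.data.materials[face.material_index]
    print(face.index, material.get('type', 'unknown'))
```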
81 |
82 | - `Blender` translates the 3D city model to the axis origin upon importing. The translation parameters and the `CRS` are stored under the `World Properties`, so the coordinates can be transformed back to the original ones if needed.
83 |
84 | 
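
A minimal sketch of reading these parameters back in the Python console to recover a vertex's original coordinates (the property names are the ones written by the importer; an imported mesh is assumed to be the active object):

```python
import bpy

world = bpy.context.scene.world
geom = bpy.context.active_object                        # a mesh of the imported model
x, y, z = geom.matrix_world @ geom.data.vertices[0].co  # scene (translated) coordinates

# The importer stores the translation as negative offsets, so subtracting
# them gives back the original CRS coordinates
original = (x - world['Axis_Origin_X_translation'],
            y - world['Axis_Origin_Y_translation'],
            z - world['Axis_Origin_Z_translation'])
print(world.get('CRS', 'no CRS stored'), original)
```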
85 |
86 |
87 |
88 | ### Exporting a 3D city model
89 |
90 | `Up3date's` exporting module was designed and implemented to be able to export any `Blender` scene into a `CityJSON` file.
91 |
92 | **Because there are certain differences between the two data models *(Blender and CityJSON)*, some conventions were made to allow lossless exporting.**
93 |
94 | In order to export objects from `Blender's` scene the following steps need to be followed:
95 |
96 | 1. For every `LoD/geometry`, a `Mesh object`* has to be added to `Blender's` scene. If `Collections` already exist from a previously imported `CityJSON` file, it is not necessary to add the `LoD/geometry` to them, but it is recommended for organization purposes.
97 |
98 | **\*Important: The mesh has to be named in a predefined way for `Up3date` to be able to parse it correctly. Example: a `LoD0` geometry should be named `0: [LoD0] ID_of_object`, preserving the spaces as well.**
99 |
100 | For every `Mesh / geometry`, two more things need to be added as `Custom Properties` for the exporter to work. You need to add them yourself: select the `Mesh object` in `Object Mode`, click the `Object Properties` button *(second screenshot of the documentation)*, expand `Custom Properties` and click the `Add` button to add a new `Custom Property`. After adding it, edit the property by hitting the `Edit` button next to it. You only need to change the `Property Name` and `Property Value`.
101 |
102 | * `type`: the_surface_type (*`MultiSurface`, `CompositeSurface` and `Solid`* are accepted)
103 | * `lod` : the_number_of_lod
104 |
105 | 
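
For reference, the same setup can also be done from the Python console. A minimal sketch, with `my_building` as a placeholder ID (the mesh data itself would still need its vertices and faces):

```python
import bpy

# Geometry object named the way the exporter expects: "index: [LoDx] ID"
name = "0: [LoD0] my_building"
geom = bpy.data.objects.new(name, bpy.data.meshes.new(name))
bpy.context.scene.collection.objects.link(geom)

# The two Custom Properties the exporter requires on every geometry
geom['type'] = 'MultiSurface'   # MultiSurface, CompositeSurface or Solid
geom['lod'] = 0                 # the level of detail, as a number
```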
106 |
107 | 2. An `Empty object` representing the `CityObject` has to be created and named `ID_of_object` *(exactly the same name as the `Mesh` described above, without the `0: [LoD0] ` prefix)*. To rename any object, just double-click on it in the `Outliner` and type the new name.
108 | This object will be the `parent` of the various `LoD` geometries that a `CityObject` might have. For any `CityObject` *(aka building)* attribute you wish to store, a new `Custom Property` has to be added to the `Empty object`. You have to add them manually via `Blender's` graphical interface, in exactly the same way as described in `step 1`.
109 | In case the attributes have to be nested, for example the `postal code` of an `address`, the `Custom Property` key should be `address.postalcode`, so that `Up3date` can understand the nested attribute structure from the `.` and handle it accordingly *(see picture below)*.
110 |
111 | 
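
A minimal sketch of the same setup from the Python console, continuing the `my_building` placeholder from step 1 (the `type` custom property mirrors what the add-on's `Cityjsonfy` panel sets; parenting keeps the hierarchy organised the same way imported models are):

```python
import bpy

# Empty object carrying the CityObject's type and attributes
cityobject = bpy.data.objects.new("my_building", None)   # None data = Empty
bpy.context.scene.collection.objects.link(cityobject)
bpy.data.objects["0: [LoD0] my_building"].parent = cityobject

cityobject['type'] = 'Building'               # CityObject type
cityobject['address.postalcode'] = '1234AB'   # "." encodes a nested attribute
```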
112 |
113 | 3. If the semantics of the surfaces of a *(`LoD 2` or above)* `geometry` are known and you want to add them, they can be assigned (again) as `Custom Properties` of the `Materials` linked to the respective faces. For every `Mesh / Geometry` object, `Blender` allows the creation of `Materials`. To assign semantics that will be exported to the `CityJSON` file, you first need to create (a) new material(s) inside the newly added `Mesh / Geometry` object (just select the object in `Object Mode` and go to the `Materials` tab). If working with a pre-imported file, you can select an already existing material. Don't worry if the materials' names look like `WallSurface.001` etc. The only information exported is the value of the material's `Custom Property` `type` (i.e. the semantic).
114 | When creating new materials, you need to add a `Custom Property` to each one of them, which must look like the following: `type: Semantic_name` *(`WallSurface`, `RoofSurface`, `GroundSurface` etc.)* (see also the picture below).
115 | After successfully adding the material(s) and the `Custom Property`, select the geometry in `Object Mode`, hit `TAB` to switch to `Edit Mode` and click the `Face Select` button right next to the `Edit Mode` option *(as explained under the 5th `Useful tip` in the section above)*.
116 | With the appropriate face selected, select the appropriate material and hit the `Assign` button to link that material to the face.
117 |
118 | 
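
A minimal sketch of the same idea from the Python console (here the material is assigned to every face just for illustration; normally you assign it to selected faces in `Edit Mode`, and `my_building` is the placeholder from step 1):

```python
import bpy

geom = bpy.data.objects["0: [LoD0] my_building"]

# Material whose 'type' Custom Property carries the semantic
mat = bpy.data.materials.new(name="RoofSurface")
mat['type'] = 'RoofSurface'
geom.data.materials.append(mat)

# Link faces to that material; only the 'type' value is exported
slot = geom.data.materials.find(mat.name)
for face in geom.data.polygons:
    face.material_index = slot
```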
119 |
120 | 4. Finally, go to `File > Export > CityJSON (.json)` and export the new instance. Voila!
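
Like the importer, the export operator can also be called from the Python console. A minimal sketch (the path is a placeholder; the option names come from the add-on's `__init__.py`):

```python
import bpy

# Scripted equivalent of File > Export > CityJSON (.json)
bpy.ops.cityjson.export_file(
    filepath="/path/to/exported_model.json",  # placeholder output path
    check_for_duplicates=True,                # merge duplicate vertices
    precision=5                               # decimals used when comparing vertices
)
```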
121 |
122 |
123 |
124 | ## Further Development
125 |
126 | If you are using `Visual Studio Code`, you may:
127 |
128 | - Install [Blender Development](https://marketplace.visualstudio.com/items?itemName=JacquesLucke.blender-development): a plugin that allows starting and debugging Python scripts from VSC.
129 | - Install the [fake-bpy-module](https://github.com/nutti/fake-bpy-module) to enable auto-completion: `pip install fake-bpy-module-2.80`.
130 |
131 | Clone this repository and have fun!
132 | If you experience any bugs or have recommendations etc., you can open a new issue providing all the necessary information. I can't promise to take them all into consideration, but I always appreciate them.
--------------------------------------------------------------------------------
/__init__.py:
--------------------------------------------------------------------------------
1 | """Main module of the CityJSON Blender addon"""
2 |
3 | import json
4 | import time
5 |
6 | import bpy
7 | from bpy.props import BoolProperty, EnumProperty, StringProperty, IntProperty
8 | from bpy.types import Operator
9 | from bpy_extras.io_utils import ExportHelper, ImportHelper
10 |
11 | from .core.objects import CityJSONParser, CityJSONExporter
12 | from .core import ui, prop, operator
13 |
14 | bl_info = {
15 | "name": "Up3date",
16 | "author": "Konstantinos Mastorakis",
17 | "version": (1, 0),
18 | "blender": (2, 80, 0),
19 | "location": "File > Import > CityJSON (.json) || File > Export > CityJSON (.json)",
20 | "description": "Visualize, edit and export 3D City Models encoded in CityJSON format",
21 | "warning": "",
22 | "wiki_url": "",
23 | "category": "Import-Export",
24 | }
25 |
26 | class ImportCityJSON(Operator, ImportHelper):
27 | "Load a CityJSON file"
28 |     bl_idname = "cityjson.import_file"  # important since it defines how bpy.ops.cityjson.import_file is constructed
29 | bl_label = "Import CityJSON"
30 |
31 | # ImportHelper mixin class uses this
32 | filename_ext = ".json"
33 |
34 | filter_glob: StringProperty(
35 | default="*.json",
36 | options={'HIDDEN'},
37 | maxlen=255, # Max internal buffer length, longer would be clamped.
38 | )
39 |
40 | material_type: EnumProperty(
41 | name="Materials' type",
42 | items=(('SURFACES', "Surfaces",
43 | "Creates materials based on semantic surface types"),
44 | ('CITY_OBJECTS', "City Objects",
45 | "Creates materials based on the type of city object")),
46 | description=(
47 | "Create materials based on city object or semantic"
48 | " surfaces"
49 | )
50 | )
51 |
52 | reuse_materials: BoolProperty(
53 | name="Reuse materials",
54 | description="Use common materials according to surface type",
55 | default=True
56 | )
57 |
58 | clean_scene: BoolProperty(
59 | name="Clean scene",
60 | description="Remove existing objects from the scene before importing",
61 | default=True
62 | )
63 |
64 | def execute(self, context):
65 | """Executes the import process"""
66 |
67 | parser = CityJSONParser(self.filepath,
68 | material_type=self.material_type,
69 | reuse_materials=self.reuse_materials,
70 | clear_scene=self.clean_scene)
71 |
72 | return parser.execute()
73 |
74 | class ExportCityJSON(Operator, ExportHelper):
75 | "Export scene as a CityJSON file"
76 | bl_idname = "cityjson.export_file"
77 | bl_label = "Export CityJSON"
78 |
79 | # ExportHelper mixin class uses this
80 | filename_ext = ".json"
81 |
82 | filter_glob: StringProperty(
83 | default="*.json",
84 | options={'HIDDEN'},
85 | maxlen=255, # Max internal buffer length, longer would be clamped.
86 | )
87 |
88 | check_for_duplicates: BoolProperty(
89 | name="Remove vertex duplicates",
90 | default=True,
91 | )
92 |
93 | precision: IntProperty(
94 | name="Precision",
95 | default=5,
96 | description="Decimals to check for vertex duplicates",
97 | min=0,
98 | max=12,
99 | )
100 | # single_lod_switch: BoolProperty(
101 | # name="Export single LoD",
102 | # description="Enable to export only a single LoD",
103 | # default=False,
104 | # )
105 |
106 | # export_single_lod: EnumProperty(
107 | # name="Select LoD :",
108 | # items=(('LoD0', "LoD 0",
109 | # "Export only LoD 0"),
110 | # ('LoD1', "LoD 1",
111 | # "Export only LoD 1"),
112 | # ('LoD2', "LoD 2",
113 | # "Export only LoD 2"),
114 | # ),
115 | # description=(
116 | # "Select which LoD should be exported"
117 | # )
118 | # )
119 | def execute(self, context):
120 |
121 | exporter = CityJSONExporter(self.filepath,
122 | check_for_duplicates=self.check_for_duplicates,
123 | precision=self.precision)
124 | return exporter.execute()
125 |
126 | classes = (
127 | ImportCityJSON,
128 | ExportCityJSON,
129 | prop.UP3DATE_CityjsonfyProperties,
130 | operator.UP3DATECityjsonfy,
131 | ui.UP3DATE_PT_gui
132 | )
133 |
134 | def menu_func_export(self, context):
135 | """Defines the menu item for CityJSON import"""
136 | self.layout.operator(ExportCityJSON.bl_idname, text="CityJSON (.json)")
137 |
138 | def menu_func_import(self, context):
139 | """Defines the menu item for CityJSON export"""
140 | self.layout.operator(ImportCityJSON.bl_idname, text="CityJSON (.json)")
141 |
142 | def register():
143 | """Registers the classes and functions of the addon"""
144 |
145 | for cls in classes:
146 | bpy.utils.register_class(cls)
147 | bpy.types.Scene.cityjsonfy_properties = bpy.props.PointerProperty(type=prop.UP3DATE_CityjsonfyProperties)
148 | bpy.types.TOPBAR_MT_file_import.append(menu_func_import)
149 | bpy.types.TOPBAR_MT_file_export.append(menu_func_export)
150 |
151 | def unregister():
152 | """Unregisters the classes and functions of the addon"""
153 |
154 | bpy.types.TOPBAR_MT_file_import.remove(menu_func_import)
155 | bpy.types.TOPBAR_MT_file_export.remove(menu_func_export)
156 |
157 | for cls in classes:
158 | bpy.utils.unregister_class(cls)
159 |
160 | del bpy.types.Scene.cityjsonfy_properties
161 |
162 | if __name__ == "__main__":
163 | register()
164 |
--------------------------------------------------------------------------------
/core/__init__.py:
--------------------------------------------------------------------------------
1 | """The core module of the CityJSON importer addon for Blender"""
--------------------------------------------------------------------------------
/core/material.py:
--------------------------------------------------------------------------------
1 | """Module to manipulate materials in Blender regarding CityJSON
2 |
3 | This module provides a set of factory classes to create materials
4 | based on the semantics of the CityJSON file.
5 | """
6 |
7 | import bpy
8 | from .utils import assign_properties, clean_list
9 |
10 | class BasicMaterialFactory:
11 | """A factory that creates a simple material for every city object"""
12 |
13 | material_colors = {
14 | "WallSurface": (0.8, 0.8, 0.8, 1),
15 | "RoofSurface": (0.9, 0.057, 0.086, 1),
16 | "GroundSurface": (0.507, 0.233, 0.036, 1)
17 | }
18 |
19 | default_color = (0, 0, 0, 1)
20 |
21 | def get_surface_color(self, surface_type):
22 | """Returns the material color of the appropriate surface type"""
23 |
24 | if surface_type in self.material_colors:
25 | return self.material_colors[surface_type]
26 |
27 | return self.default_color
28 |
29 | def create_material(self, surface):
30 | """Returns a new material based on the semantic surface of the object"""
31 | mat = bpy.data.materials.new(name=surface['type'])
32 |
33 | assign_properties(mat, surface)
34 |
35 | mat.diffuse_color = self.get_surface_color(surface['type'])
36 |
37 | return mat
38 |
39 | def get_material(self, surface):
40 | """Returns the material that corresponds to the semantic surface"""
41 |
42 | return self.create_material(surface)
43 |
44 | def get_materials(self, geometry=None, **params):
45 | """Returns the materials and material index list for the given
46 | geometry
47 | """
48 | mats = []
49 | values = []
50 | if 'semantics' in geometry:
51 | values = geometry['semantics']['values']
52 |
53 | for surface in geometry['semantics']['surfaces']:
54 | mats.append(self.get_material(surface))
55 |
56 | values = clean_list(values)
57 |
58 | return (mats, values)
59 |
60 | class ReuseMaterialFactory(BasicMaterialFactory):
61 | """A class that re-uses a material with similar semantics"""
62 |
63 | def check_material(self, material, surface):
64 | """Checks if the material can represent the provided surface"""
65 |
66 | if not material.name.startswith(surface['type']):
67 | return False
68 |
69 | # TODO: Add logic here to check for semantic surface attributes
70 |
71 | return True
72 |
73 | def get_material(self, surface):
74 | """Returns the material that corresponds to the semantic surface"""
75 |
76 | matches = [m for m in bpy.data.materials
77 | if self.check_material(m, surface)]
78 |
79 | if matches:
80 | return matches[0]
81 |
82 | return self.create_material(surface)
83 |
84 | class CityObjectTypeMaterialFactory:
85 | """A class that returns a material based on the object type"""
86 |
87 | type_color = {
88 | "Building": (0.9, 0.057, 0.086, 1),
89 | "BuildingPart": (0.9, 0.057, 0.086, 1),
90 | "BuildingInstallation": (0.9, 0.057, 0.086, 1),
91 | "Road": (0.4, 0.4, 0.4, 1),
92 | "LandUse": (242/255, 193/255, 25/255, 1),
93 | "PlantCover": (145/255, 191/255, 102/255, 1),
94 | "SolitaryVegetationObject": (145/255, 191/255, 102/255, 1),
95 | "TINRelief": (242/255, 193/255, 25/255, 1),
96 | "WaterBody": (54/255, 197/255, 214/255, 1)
97 | }
98 |
99 | default_color = (0.3, 0.3, 0.3, 1)
100 |
101 | def create_material(self, name, color):
102 | """Returns a new material based on the semantic surface of the object"""
103 | mat = bpy.data.materials.new(name=name)
104 |
105 | mat.diffuse_color = color
106 |
107 | return mat
108 |
109 | def get_type_color(self, object_type):
110 | """Returns the color that corresponds to the provided city object type"""
111 |
112 | if object_type in self.type_color:
113 | return self.type_color[object_type]
114 |
115 | return self.default_color
116 |
117 | def get_material(self, object_type):
118 | """Returns the material that corresponds to the provided
119 | object type
120 | """
121 |
122 | if object_type in bpy.data.materials:
123 | return bpy.data.materials[object_type]
124 |
125 | return self.create_material(object_type, self.get_type_color(object_type))
126 |
127 |
128 | def get_materials(self, cityobject=None, **params):
129 | """Returns the materials and material index list for the given
130 | geometry
131 | """
132 |
133 | return ([self.get_material(cityobject['type'])],
134 | [])
135 |
--------------------------------------------------------------------------------
/core/objects.py:
--------------------------------------------------------------------------------
1 | """Module to manipulate objects in Blender regarding CityJSON"""
2 |
3 | import json
4 | import time
5 | import sys
6 | import bpy
7 | import idprop
8 |
9 | from datetime import datetime
10 |
11 | from .material import (BasicMaterialFactory, ReuseMaterialFactory,
12 | CityObjectTypeMaterialFactory)
13 | from .utils import (assign_properties, clean_buffer, clean_list,
14 | coord_translate_axis_origin, coord_translate_by_offset,
15 | remove_scene_objects, get_geometry_name, create_empty_object,
16 | create_mesh_object, get_collection, write_vertices_to_CityJSON,
17 | remove_vertex_duplicates, export_transformation_parameters,
18 | export_metadata, export_parent_child, export_attributes,
19 | store_semantic_surfaces, link_face_semantic_surface)
20 |
21 | class CityJSONParser:
22 | """Class that parses a CityJSON file to Blender"""
23 |
24 | def __init__(self, filepath, material_type, reuse_materials=True, clear_scene=True):
25 | self.filepath = filepath
26 | self.clear_scene = clear_scene
27 |
28 | self.data = {}
29 | self.vertices = []
30 |
31 | if material_type == 'SURFACES':
32 | if reuse_materials:
33 | self.material_factory = ReuseMaterialFactory()
34 | else:
35 | self.material_factory = BasicMaterialFactory()
36 | else:
37 | self.material_factory = CityObjectTypeMaterialFactory()
38 |
39 | def load_data(self):
40 | """Loads the CityJSON data from the file"""
41 |
42 | with open(self.filepath) as json_file:
43 | self.data = json.load(json_file)
44 |
45 | def prepare_vertices(self):
46 | """Prepares the vertices by applying any required transformations"""
47 |
48 | vertices = []
49 |
50 | # Checking if coordinates need to be transformed and
51 | # transforming if necessary
52 | if 'transform' not in self.data:
53 | for vertex in self.data['vertices']:
54 | vertices.append(tuple(vertex))
55 | else:
56 | trans_param = self.data['transform']
57 | # Transforming coords to actual real world coords
58 | for vertex in self.data['vertices']:
59 | x = vertex[0]*trans_param['scale'][0] \
60 | + trans_param['translate'][0]
61 | y = vertex[1]*trans_param['scale'][1] \
62 | + trans_param['translate'][1]
63 | z = vertex[2]*trans_param['scale'][2] \
64 | + trans_param['translate'][2]
65 |
66 | vertices.append((x, y, z))
67 | #Creating transform properties
68 | bpy.context.scene.world['transformed'] = True
69 | bpy.context.scene.world['transform.X_scale'] = trans_param['scale'][0]
70 | bpy.context.scene.world['transform.Y_scale'] = trans_param['scale'][1]
71 | bpy.context.scene.world['transform.Z_scale'] = trans_param['scale'][2]
72 |
73 | bpy.context.scene.world['transform.X_translate'] = trans_param['translate'][0]
74 | bpy.context.scene.world['transform.Y_translate'] = trans_param['translate'][1]
75 | bpy.context.scene.world['transform.Z_translate'] = trans_param['translate'][2]
76 |
77 |
78 | if 'Axis_Origin_X_translation' in bpy.context.scene.world:
79 | offx = -bpy.context.scene.world['Axis_Origin_X_translation']
80 | offy = -bpy.context.scene.world['Axis_Origin_Y_translation']
81 | offz = -bpy.context.scene.world['Axis_Origin_Z_translation']
82 | translation = coord_translate_by_offset(vertices, offx, offy, offz)
83 | else:
84 | translation = coord_translate_axis_origin(vertices)
85 |
86 | bpy.context.scene.world['Axis_Origin_X_translation']=-translation[1]
87 | bpy.context.scene.world['Axis_Origin_Y_translation']=-translation[2]
88 | bpy.context.scene.world['Axis_Origin_Z_translation']=-translation[3]
89 |
90 | # Updating vertices with new translated vertices
91 | self.vertices = translation[0]
92 |
93 | def parse_geometry(self, theid, obj, geom, index):
94 | """Returns a mesh object for the provided geometry"""
95 | bound = []
96 |
97 | # Checking how nested the geometry is i.e what kind of 3D
98 | # geometry it contains
99 | if (geom['type'] == 'MultiSurface'
100 | or geom['type'] == 'CompositeSurface'):
101 | for face in geom['boundaries']:
102 | if face:
103 | bound.append(tuple(face[0]))
104 | elif geom['type'] == 'Solid':
105 | for shell in geom['boundaries']:
106 | for face in shell:
107 | if face:
108 | bound.append(tuple(face[0]))
109 | elif geom['type'] == 'MultiSolid':
110 | for solid in geom['boundaries']:
111 | for shell in solid:
112 | for face in shell:
113 | if face:
114 | bound.append(tuple(face[0]))
115 |
116 | temp_vertices, temp_bound = clean_buffer(self.vertices, bound)
117 |
118 | mats, values = self.material_factory.get_materials(cityobject=obj,
119 | geometry=geom)
120 |
121 | geom_obj = create_mesh_object(get_geometry_name(theid, geom, index),
122 | temp_vertices,
123 | temp_bound,
124 | mats,
125 | values)
126 |
127 | if 'lod' in geom:
128 | geom_obj['lod'] = geom['lod']
129 |
130 | geom_obj['type'] = geom['type']
131 |
132 | return geom_obj
133 |
134 | def execute(self):
135 | """Execute the import process"""
136 |
137 | if self.clear_scene:
138 | remove_scene_objects()
139 |
140 | print("\nImporting CityJSON file...")
141 |
142 | self.load_data()
143 |
144 | self.prepare_vertices()
145 |
146 | #Storing the reference system
147 | if 'metadata' in self.data:
148 |
149 | if 'referenceSystem' in self.data['metadata']:
150 | bpy.context.scene.world['CRS'] = self.data['metadata']['referenceSystem']
151 |
152 | new_objects = []
153 | cityobjs = {}
154 |
155 | progress_max = len(self.data['CityObjects'])
156 | progress = 0
157 | start_import = time.time()
158 |
159 | # Creating empty meshes for every CityObjects and linking its
160 | # geometries as children-meshes
161 | for objid, obj in self.data['CityObjects'].items():
162 | cityobject = create_empty_object(objid)
163 | cityobject = assign_properties(cityobject,
164 | obj)
165 | new_objects.append(cityobject)
166 | cityobjs[objid] = cityobject
167 |
168 | for i, geom in enumerate(obj['geometry']):
169 | geom_obj = self.parse_geometry(objid, obj, geom, i)
170 | geom_obj.parent = cityobject
171 | new_objects.append(geom_obj)
172 |
173 | progress += 1
174 | print("Importing: {percent}% completed"
175 | .format(percent=round(progress * 100 / progress_max, 1)),
176 | end="\r")
177 | end_import = time.time()
178 |
179 | progress = 0
180 | start_hier = time.time()
181 |
182 | #Assigning child building parts to parent buildings
183 | print ("\nBuilding hierarchy...")
184 | for objid, obj in self.data['CityObjects'].items():
185 | if 'parents' in obj:
186 | parent_id = obj['parents'][0]
187 | cityobjs[objid].parent = cityobjs[parent_id]
188 |
189 | progress += 1
190 | print("Building hierarchy: {percent}% completed"
191 | .format(percent=round(progress * 100 / progress_max, 1)),
192 | end="\r")
193 | end_hier = time.time()
194 |
195 | start_link = time.time()
196 |
197 | # Link everything to the scene
198 | print ("\nLinking objects to the scene...")
199 | collection = bpy.context.scene.collection
200 | for new_object in new_objects:
201 | if 'lod' in new_object:
202 | get_collection("LoD{}".format(new_object['lod'])).objects.link(new_object)
203 | else:
204 | collection.objects.link(new_object)
205 |
206 | end_link = time.time()
207 | #Console output
208 | print("Total importing time: ", round(end_import-start_import, 2), "s")
209 | print("Building hierarchy: ", round(end_hier-start_hier, 2), "s")
210 | print("Linking: ", round(end_link-start_link, 2), "s")
211 | print("Done!")
212 | timestamp = datetime.now()
213 | print("\n[" +timestamp.strftime("%d/%b/%Y @ %H:%M:%S")+ "]", "CityJSON file successfully imported from '"+str(self.filepath)+"'.")
214 |
215 | return {'FINISHED'}
216 |
217 | class CityJSONExporter:
218 |
219 | def __init__ (self, filepath, check_for_duplicates=True, precision=3):
220 | self.filepath = filepath
221 | self.check_for_duplicates = check_for_duplicates
222 | self.precision = precision
223 |
224 | def initialize_dictionary(self):
225 | empty_json = {
226 | "type": "CityJSON",
227 | "version": "1.0",
228 | # "extensions": {},
229 | "metadata": {},
230 | "CityObjects": {},
231 | "vertices":[],
232 | #"appearance":{}
233 | }
234 | return empty_json
235 |
236 | def get_custom_properties(self,city_object,init_json,CityObject_id):
237 | """Creates the required structure according to CityJSON and writes all the object's custom properties (aka attributes)"""
238 | init_json["CityObjects"].setdefault(CityObject_id,{})
239 | init_json["CityObjects"][CityObject_id].setdefault('geometry',[])
240 | cp=city_object.items()
241 | for prop in cp:
242 | # When a new empty object is added by user, Blender assigns some built in properties at the first index of the property list.
243 | # With this it is bypassed and continues to the actual properties of the object
244 | if prop[0]=="_RNA_UI":
245 | continue
246 |             # Upon import into Blender, every extra level of attribute nesting is encoded with a "." between the two attribute names.
247 |             # So, to store it back in its original form, the concatenated string must be split.
248 | # Split is the list containing the original attributes names.
249 | split = prop[0].split(".")
250 | # Check if the attribute is IDPropertyArray and convert to python list type because JSON encoder cannot handle type IDPropertyArray.
251 | if isinstance(prop[1],idprop.types.IDPropertyArray):
252 | attribute=prop[1].to_list()
253 | else:
254 | attribute=prop[1]
255 | export_attributes(split,init_json,CityObject_id,attribute)
256 |
257 | def create_mesh_structure(self,city_object,objid,init_json):
258 | "Prepares the structure within the empty mesh for storing the geometries, stored the lod and accesses the vertices and faces of the geometry within Blender"
259 | #Create geometry key within the empty object for storing the LoD(s)
260 | CityObject_id = objid.split(' ')[2]
261 | init_json["CityObjects"].setdefault(CityObject_id,{})
262 | init_json["CityObjects"][CityObject_id].setdefault('geometry',[])
263 | #Check if the user has assigned the custom properties 'lod' and 'type' correctly
264 | if ('lod' in city_object.keys() and (type(city_object['lod']) == float or type(city_object['lod'])==int) ):
265 | if ('type' in city_object.keys() and (city_object['type'] == "MultiSurface" or city_object['type'] == "CompositeSurface" or city_object['type'] == "Solid")):
266 | #Check if object has materials (in Blender) i.e semantics in real life and if yes create the extra keys (within_geometry) to store it.
267 | #Otherwise just create the rest of the tags
268 | if city_object.data.materials:
269 | init_json["CityObjects"][CityObject_id]['geometry'].append({'type':city_object['type'],'boundaries':[],'semantics':{'surfaces': [], 'values': [[]]},'texture':{},'lod':city_object['lod']})
270 | else:
271 | init_json["CityObjects"][CityObject_id]['geometry'].append({'type':city_object['type'],'boundaries':[],'lod':city_object['lod']})
272 | else:
273 |                 print("You either forgot to add `type` as a custom property of the geometry ", objid, ", or 'type' is not `MultiSurface`, `CompositeSurface` or `Solid`")
274 | sys.exit(None)
275 | else:
276 |             print("You either forgot to add `lod` as a custom property of the geometry ", objid, ", or 'lod' is not a number")
277 | sys.exit(None)
278 |
279 | #Accessing object's vertices
280 | object_verts = city_object.data.vertices
281 | #Accessing object's faces
282 | object_faces = city_object.data.polygons
283 |
284 | return CityObject_id,object_verts,object_faces
285 |
286 |
287 | def export_geometry_and_semantics(self,city_object,init_json,CityObject_id,object_faces,object_verts,
288 | vertices,cj_next_index):
289 | #Index in the geometry list that the new geometry needs to be stored.
290 | index = len(init_json["CityObjects"][CityObject_id]['geometry'])-1
291 |
292 | # Create semantic surfaces
293 | semantic_surfaces = store_semantic_surfaces(init_json, city_object, index, CityObject_id)
294 | if city_object['type'] == 'MultiSurface' or city_object['type'] == 'CompositeSurface':
295 | # Browsing through faces and their vertices of every object.
296 | for face in object_faces:
297 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"].append([[]])
298 |
299 |
300 | for i in range(len(object_faces[face.index].vertices)):
301 | original_index = object_faces[face.index].vertices[i]
302 | get_vertex = object_verts[original_index]
303 |
304 | #Write vertex to init_json at this point so the mesh_object.world_matrix (aka transformation matrix) is always the
305 | #correct one. With the previous way it would take the last object's transformation matrix and would potentially lead to wrong final
306 | #coordinate to be exported.
307 | write_vertices_to_CityJSON(city_object,get_vertex.co,init_json)
308 | vertices.append(get_vertex.co)
309 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"][face.index][0].append(cj_next_index)
310 | cj_next_index += 1
311 |
312 | # In case the object has semantics they are accordingly stored as well
313 | link_face_semantic_surface(init_json, city_object, index, CityObject_id, semantic_surfaces, face)
314 |
315 | if city_object['type'] == 'Solid':
316 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"].append([])
317 | for face in object_faces:
318 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"][0].append([[]])
319 | for i in range(len(object_faces[face.index].vertices)):
320 | original_index = object_faces[face.index].vertices[i]
321 | get_vertex = object_verts[original_index]
322 | if get_vertex.co in vertices:
323 | vert_index = vertices.index(get_vertex.co)
324 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"][0][face.index][0].append(vert_index)
325 | else:
326 | write_vertices_to_CityJSON(city_object,get_vertex.co,init_json)
327 | vertices.append(get_vertex.co)
328 | init_json["CityObjects"][CityObject_id]["geometry"][index]["boundaries"][0][face.index][0].append(cj_next_index)
329 | cj_next_index += 1
330 | # In case the object has semantics they are accordingly stored as well
331 | link_face_semantic_surface(init_json, city_object, index, CityObject_id, semantic_surfaces, face)
332 | return cj_next_index
333 |
334 | def execute(self):
335 | start=time.time()
336 | print("\nExporting Blender scene into CityJSON file...")
337 | #Create the initial structure of the cityjson dictionary
338 | init_json = self.initialize_dictionary()
339 | # Variables to keep up with the exporting progress. Used to print percentage in the terminal.
340 | progress_max = len(bpy.data.objects)
341 | # Initialize progress status
342 | progress = 0
343 | # Variable to store the next free index that a vertex should be saved in the cityjson file. Avoiding saving duplicates.
344 | cj_next_index = 0
345 | # Create a list of vertices to store the global vertices of all objects
346 | verts = list()
347 | for city_object in bpy.data.objects:
348 | #Get object's name
349 | objid = city_object.name
350 | #Empty objects have all the attributes so their properties are accessed to extract this information
351 | if city_object.type=='EMPTY':
352 | #Get all the custom properties of the object
353 | self.get_custom_properties(city_object,init_json,objid)
354 | #If the object is MESH means that is an actual geometry contained in the CityJSON file
355 | if city_object.type =='MESH':
356 | """ Export geometries with their semantics into CityJSON
357 | Geometry type is checked for every object, because the structure that the geometry has to be stored in the cityjson is different depending on the geometry type
358 | In case the object has semantics they are accordingly stored as well using the 'store_semantics' function
359 | Case of multisolid hasn't been taken under consideration!!
360 | """
361 | #Creating the structure for storing the geometries and get the initial ID of the CityObject its vertices and its faces
362 | CityObject_id,object_verts,object_faces = self.create_mesh_structure(city_object,objid,init_json)
363 |
364 | #Exporting geometry and semantics. CityJSON vertices_index is returned so it can be re-fed into the function at the correct point.
365 | cj_next_index = self.export_geometry_and_semantics(city_object,init_json,CityObject_id,object_faces,
366 | object_verts,verts,cj_next_index)
367 |
368 | progress += 1
369 | print("Appending geometries, vertices, semantics, attributes: {percent}% completed".format(percent=round(progress * 100 / progress_max, 1)),end="\r")
370 |
371 | if self.check_for_duplicates:
372 | remove_vertex_duplicates(init_json, self.precision)
373 | export_parent_child(init_json)
374 | export_transformation_parameters(init_json)
375 | export_metadata(init_json)
376 |
377 | print ("Writing to CityJSON file...")
378 | #Writing CityJSON file
379 | with open(self.filepath, 'w', encoding='utf-8') as f:
380 | json.dump(init_json, f, ensure_ascii=False)
381 |
382 | end=time.time()
383 | timestamp = datetime.now()
384 | print("\n[" +timestamp.strftime("%d/%b/%Y @ %H:%M:%S")+ "]", "Blender scene successfully exported to CityJSON at '"+str(self.filepath)+"'.")
385 | # print("\nBlender scene successfully exported to CityJSON at '"+str(self.filepath)+"'.")
386 | print("\nTotal exporting time: ", round(end-start, 2), "s")
387 |
388 | return{'FINISHED'}
--------------------------------------------------------------------------------
/core/operator.py:
--------------------------------------------------------------------------------
1 | import bpy
2 |
3 | class UP3DATECityjsonfy(bpy.types.Operator):
4 | bl_idname = "cityjson.cityjsonfy"
5 | bl_label = "Convert to cityjson"
6 | bl_context = "scene"
7 |
8 | def execute(self, context):
9 | scene = bpy.context.scene
10 | props = scene.cityjsonfy_properties
11 |
12 | # define properties
13 | LOD_fullversion = props.LOD
14 | if props.LOD_version != 0:
15 | LOD_fullversion = round(props.LOD + (props.LOD_version / 10), 1)
16 |
17 | # loop through selected objects
18 | for geom_obj in bpy.context.selected_objects:
19 | cityjson_id = geom_obj.name
20 | geom_location = geom_obj.location
21 |
22 | # create empty
23 | cityjson_object = bpy.data.objects.new("empty", None)
24 | scene.collection.objects.link(cityjson_object)
25 | cityjson_object.location = geom_location
26 |
27 | # set names and attributes
28 | geom_obj.name = f"{props.LOD}: [LOD{LOD_fullversion}] {cityjson_id}"
29 | geom_obj["type"] = props.geometry_type
30 | geom_obj["lod"] = props.LOD
31 | cityjson_object.name = cityjson_id
32 | cityjson_object["type"] = props.feature_type
33 |
34 | return {"FINISHED"}
--------------------------------------------------------------------------------
/core/prop.py:
--------------------------------------------------------------------------------
1 | import bpy
2 |
3 | class UP3DATE_CityjsonfyProperties(bpy.types.PropertyGroup):
4 | LOD: bpy.props.IntProperty(name="LOD", default=2)
5 | LOD_version: bpy.props.IntProperty(name="LODSubversion", default=0)
6 | feature_type: bpy.props.StringProperty(name="feature_type", default="Building")
7 | geometry_type: bpy.props.EnumProperty(name="geometry_type", description="",
8 | items=[("MultiSurface", "MultiSurface", "MultiSurface"),
9 | ("CompositeSurface", "CompositeSurface", "CompositeSurface"),
10 | ("Solid", "Solid", "Solid"),
11 | ("MultiSolid", "MultiSolid", "MultiSolid")])
12 |
--------------------------------------------------------------------------------
/core/ui.py:
--------------------------------------------------------------------------------
1 | import bpy
2 |
3 | class UP3DATE_PT_gui(bpy.types.Panel):
4 | """Creates a Panel in the scene context of properties editor"""
5 | bl_idname = "cityjson_PT_gui"
6 | bl_label = "Up3date"
7 | bl_space_type = "PROPERTIES"
8 | bl_region_type = "WINDOW"
9 | bl_context = "scene"
10 |
11 | def draw(self, context):
12 | layout = self.layout
13 |
14 | scene = context.scene
15 | props = scene.cityjsonfy_properties
16 |
17 | layout.label(text="Convert selected to CityJSON:")
18 | row = layout.row(align=True)
19 | row.prop(props, "LOD")
20 | row = layout.row()
21 | row.prop(props, "LOD_version")
22 | row = layout.row()
23 | row.prop(props, "feature_type")
24 | row = layout.row()
25 | row.prop(props, "geometry_type")
26 | row = layout.row()
27 | row.operator("cityjson.cityjsonfy")
--------------------------------------------------------------------------------
/core/utils.py:
--------------------------------------------------------------------------------
1 | """Blender CityJSON plugin utils module
2 |
3 | This module provides utility methods for the importing/exporting
4 | of CityJSON files
5 | """
6 |
7 | import bpy, idprop
8 |
9 |
10 | ########## Importer functions ##########
11 |
12 | def remove_scene_objects():
13 | """Clears the scenes of any objects and removes world's custom properties
14 | and collections"""
15 | # Delete world custom properties
16 | if bpy.context.scene.world.keys():
17 | for custom_property in bpy.context.scene.world.keys():
18 | del bpy.context.scene.world[custom_property]
19 | # Deleting previous objects every time a new CityJSON file is imported
20 | bpy.ops.object.select_all(action='SELECT')
21 | bpy.ops.object.delete()
22 | # Deleting previously existing collections
23 | for collection in bpy.data.collections:
24 | bpy.data.collections.remove(collection)
25 |
26 |
27 | def clean_list(values):
28 | """Creates a list of non list in case lists nested in lists exist"""
29 |
30 | while isinstance(values[0], list):
31 | values = values[0]
32 |
33 | return values
34 |
35 |
36 | def assign_properties(obj, props, prefix=[]):
37 | """Assigns the custom properties to obj based on the props"""
38 |
39 | for prop, value in props.items():
40 |
41 | if prop in ["geometry", "children", "parents"]:
42 | continue
43 |
44 | if isinstance(value, dict):
45 | obj = assign_properties(obj, value, prefix + [prop])
46 |
47 | else:
48 | obj[".".join(prefix + [prop])] = value
49 |
50 | return obj
51 |
52 |
53 | def coord_translate_axis_origin(vertices):
54 | """Translates the vertices to the origin (0, 0, 0)"""
55 | # Finding minimum value of x,y,z
56 | minx = min(i[0] for i in vertices)
57 | miny = min(i[1] for i in vertices)
58 | minz = min(i[2] for i in vertices)
59 |
60 | return coord_translate_by_offset(vertices, minx, miny, minz)
61 |
62 |
63 | def coord_translate_by_offset(vertices, offx, offy, offz):
64 | """Translates the vertices by minx, miny and minz"""
65 | # Calculating new coordinates
66 | translated_x = [i[0] - offx for i in vertices]
67 | translated_y = [i[1] - offy for i in vertices]
68 | translated_z = [i[2] - offz for i in vertices]
69 |
70 | return (tuple(zip(translated_x, translated_y, translated_z)),
71 | offx,
72 | offy,
73 | offz)
74 |
75 |
76 | def original_coordinates(vertices, minx, miny, minz):
77 | """Translates the vertices from origin to original"""
78 | # Calculating original coordinates
79 | original_x = [i[0] + minx for i in vertices]
80 | original_y = [i[1] + miny for i in vertices]
81 | original_z = [i[2] + minz for i in vertices]
82 |
83 | return tuple(zip(original_x, original_y, original_z))
84 |
85 |
86 | def clean_buffer(vertices, bounds):
87 | """Cleans the vertices index from unused vertices3"""
88 |
89 | new_bounds = list()
90 | new_vertices = list()
91 | i = 0
92 | for bound in bounds:
93 | new_bound = list()
94 |
95 | for vertex_id in bound:
96 | new_vertices.append(vertices[vertex_id])
97 | new_bound.append(i)
98 | i = i + 1
99 |
100 | new_bounds.append(tuple(new_bound))
101 |
102 | return new_vertices, new_bounds
103 |
104 |
105 | def get_geometry_name(objid, geom, index):
106 | """Returns the name of the provided geometry"""
107 | if 'lod' in geom:
108 | return "{index}: [LoD{lod}] {name}".format(name=objid, lod=geom['lod'], index=index)
109 | else:
110 | return "{index}: [GeometryInstance] {name}".format(name=objid, index=index)
111 |
112 |
113 | def create_empty_object(name):
114 | """Returns an empty blender object"""
115 |
116 | new_object = bpy.data.objects.new(name, None)
117 |
118 | return new_object
119 |
120 |
121 | def create_mesh_object(name, vertices, faces, materials=[], material_indices=[]):
122 | """Returns a mesh blender object"""
123 |
124 | mesh_data = None
125 |
126 | if faces:
127 | mesh_data = bpy.data.meshes.new(name)
128 |
129 | for material in materials:
130 | mesh_data.materials.append(material)
131 |
132 | indices = [i for face in faces for i in face]
133 |
134 | mesh_data.vertices.add(len(vertices))
135 | mesh_data.loops.add(len(indices))
136 | mesh_data.polygons.add(len(faces))
137 |
138 | coords = [c for v in vertices for c in v]
139 |
140 | loop_totals = [len(face) for face in faces]
141 | loop_starts = []
142 | i = 0
143 | for face in faces:
144 | loop_starts.append(i)
145 | i += len(face)
146 |
147 | mesh_data.vertices.foreach_set("co", coords)
148 | mesh_data.loops.foreach_set("vertex_index", indices)
149 | mesh_data.polygons.foreach_set("loop_start", loop_starts)
150 | mesh_data.polygons.foreach_set("loop_total", loop_totals)
151 | if len(material_indices) == len(faces):
152 | mesh_data.polygons.foreach_set("material_index", material_indices)
153 | elif len(material_indices) > len(faces):
154 | print("Object {name} has {num_faces} faces but {num_surfaces} semantic surfaces!"
155 | .format(name=name,
156 | num_faces=len(faces),
157 | num_surfaces=len(material_indices)))
158 |
159 | mesh_data.update()
160 |
161 | new_object = bpy.data.objects.new(name, mesh_data)
162 |
163 | return new_object
164 |
165 |
166 | def get_collection(collection_name):
167 | """Returns a collection with the given name"""
168 |
169 | if collection_name in bpy.data.collections:
170 | return bpy.data.collections[collection_name]
171 |
172 | new_collection = bpy.data.collections.new(collection_name)
173 | bpy.context.scene.collection.children.link(new_collection)
174 |
175 | return new_collection
176 |
177 |
178 | ########## Exporter functions ##########
179 |
180 | def store_semantic_surfaces(init_json, city_object, index, CityObject_Id):
181 | """Stores the semantics from the objects materials"""
182 | if not city_object.data.materials:
183 | return None
184 |
185 | semantics = init_json["CityObjects"][CityObject_Id]["geometry"][index]['semantics']
186 | semantic_surface_lookup = {}
187 | semantic_surface_index = 0
188 | for material in city_object.data.materials:
189 | if material is None:
190 | continue
191 |
192 | semantics['surfaces'].append({'type': material['type']})
193 | semantic_surface_lookup[material.name] = semantic_surface_index
194 | semantic_surface_index += 1
195 |
196 | return semantic_surface_lookup
197 |
198 |
199 | def link_face_semantic_surface(init_json, city_object, index, CityObject_Id, semantic_surface_lookup, face):
200 | """Links the object faces to corresponding semantic surfaces"""
201 | if not city_object.data.materials:
202 | return None
203 | if city_object.data.materials[face.material_index] is None:
204 | init_json["CityObjects"][CityObject_Id]["geometry"][index]['semantics']['values'][0].append(None)
205 | return None
206 |
207 | semantic_surface_name = city_object.data.materials[face.material_index].name
208 | semantic_surface_index = semantic_surface_lookup[semantic_surface_name]
209 | init_json["CityObjects"][CityObject_Id]["geometry"][index]['semantics']['values'][0].append(semantic_surface_index)
210 |
211 | return None
212 |
213 |
214 | def bbox(objects):
215 | """Calculates the bounding box of the objects given"""
216 | # Initialization
217 | obj = objects[0]
218 | bbox = obj.bound_box
219 | xmax = bbox[0][0]
220 | ymax = bbox[0][1]
221 | zmax = bbox[0][2]
222 | xmin = xmax
223 | ymin = ymax
224 | zmin = zmax
225 | world_max_extent = [xmax, ymax, zmax]
226 | world_min_extent = [xmin, ymin, zmin]
227 |
228 | # Calculating bbox of the whole scene
229 | for obj in objects:
230 | bbox = obj.bound_box
231 |
232 | xmax = bbox[0][0]
233 | ymax = bbox[0][1]
234 | zmax = bbox[0][2]
235 |
236 | xmin = xmax
237 | ymin = ymax
238 | zmin = zmax
239 |
240 | for i in range(len(bbox)):
241 | if bbox[i][0] > xmax:
242 | xmax = bbox[i][0]
243 | if bbox[i][0] < xmin:
244 | xmin = bbox[i][0]
245 |
246 | if bbox[i][1] > ymax:
247 | ymax = bbox[i][1]
248 | if bbox[i][1] < ymin:
249 | ymin = bbox[i][1]
250 |
251 | if bbox[i][2] > zmax:
252 | zmax = bbox[i][2]
253 | if bbox[i][2] < zmin:
254 | zmin = bbox[i][2]
255 |
256 | object_max_extent = [xmax, ymax, zmax]
257 | object_min_extent = [xmin, ymin, zmin]
258 |
259 | if object_max_extent[0] > world_max_extent[0]:
260 | world_max_extent[0] = object_max_extent[0]
261 | if object_max_extent[1] > world_max_extent[1]:
262 | world_max_extent[1] = object_max_extent[1]
263 | if object_max_extent[2] > world_max_extent[2]:
264 | world_max_extent[2] = object_max_extent[2]
265 | if object_min_extent[0] < world_min_extent[0]:
266 | world_min_extent[0] = object_min_extent[0]
267 | if object_min_extent[1] < world_min_extent[1]:
268 | world_min_extent[1] = object_min_extent[1]
269 | if object_min_extent[2] < world_min_extent[2]:
270 | world_min_extent[2] = object_min_extent[2]
271 |
272 | # Translating back to original
273 |
274 | if "Axis_Origin_X_translation" in bpy.context.scene.world:
275 | world_min_extent[0] -= bpy.context.scene.world["Axis_Origin_X_translation"]
276 | world_min_extent[1] -= bpy.context.scene.world["Axis_Origin_Y_translation"]
277 | world_min_extent[2] -= bpy.context.scene.world["Axis_Origin_Z_translation"]
278 |
279 | world_max_extent[0] -= bpy.context.scene.world["Axis_Origin_X_translation"]
280 | world_max_extent[1] -= bpy.context.scene.world["Axis_Origin_Y_translation"]
281 | world_max_extent[2] -= bpy.context.scene.world["Axis_Origin_Z_translation"]
282 |
283 | return world_min_extent, world_max_extent
284 |
285 |
286 | def write_vertices_to_CityJSON(city_object, vertex, init_json):
287 | """ Writing vertices to minimal_json after translation to the original position. """
288 | # Initialize progress status
289 | progress = 0
290 | coord = city_object.matrix_world @ vertex
291 | if 'transformed' in bpy.context.scene.world and "Axis_Origin_X_translation" in bpy.context.scene.world:
292 | # First translate back to the original CRS coordinates
293 | x, y, z = coord[0] - bpy.context.scene.world["Axis_Origin_X_translation"], coord[1] \
294 | - bpy.context.scene.world["Axis_Origin_Y_translation"], coord[2] \
295 | - bpy.context.scene.world["Axis_Origin_Z_translation"]
296 | # Second transform the original CRS coordinates based on the transform parameters of the original CityJSON file
297 | x = round((x - bpy.context.scene.world['transform.X_translate']) / bpy.context.scene.world['transform.X_scale'])
298 | y = round((y - bpy.context.scene.world['transform.Y_translate']) / bpy.context.scene.world['transform.Y_scale'])
299 | z = round((z - bpy.context.scene.world['transform.Z_translate']) / bpy.context.scene.world['transform.Z_scale'])
300 | init_json['vertices'].append([x, y, z])
301 | progress += 1
302 | # print("Appending vertices into CityJSON: {percent}% completed".format(percent=round(progress * 100 / progress_max, 1)),end="\r")
303 | elif "Axis_Origin_X_translation" in bpy.context.scene.world:
304 | init_json['vertices'].append([coord[0] - bpy.context.scene.world["Axis_Origin_X_translation"], \
305 | coord[1] - bpy.context.scene.world["Axis_Origin_Y_translation"], \
306 | coord[2] - bpy.context.scene.world["Axis_Origin_Z_translation"]])
307 | progress += 1
308 | # print("Appending vertices into CityJSON: {percent}% completed".format(percent=round(progress * 100 / progress_max, 1)),end="\r")
309 | else:
310 | init_json['vertices'].append([coord[0], coord[1], coord[2]])
311 | progress += 1
312 | # print("Appending vertices into CityJSON: {percent}% completed".format(percent=round(progress * 100 / progress_max, 1)),end="\r")
313 | return None
314 |
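# Worked example for write_vertices_to_CityJSON() above (all values hypothetical):
# once the Axis_Origin_* shift has been undone, suppose the CRS x coordinate is
# 85000.123 while the CityJSON transform has X_translate = 85000.0 and
# X_scale = 0.001. The exported integer component is then
# round((85000.123 - 85000.0) / 0.001) == 123, which is what gets appended to
# init_json['vertices'].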
315 |
316 | def remove_vertex_duplicates(init_json, precision=3):
317 |     """Finds all duplicate vertices within a given precision and merges them.
318 |     Method adapted from https://github.com/cityjson/cjio/blob/faf422afe94b4787aeffa9b2e53ee71b32546320/cjio/cityjson.py#L1208
319 |     """
320 |
321 | if "transform" in init_json:
322 | precision = 0
323 |
324 | def update_geom_indices(a, newids):
325 | for i, each in enumerate(a):
326 | if isinstance(each, list):
327 | update_geom_indices(each, newids)
328 | else:
329 | a[i] = newids[each]
330 |
331 | # --
332 | totalinput = len(init_json["vertices"])
333 | h = {}
334 | newids = [-1] * len(init_json["vertices"])
335 | newvertices = []
336 | for i, v in enumerate(init_json["vertices"]):
337 | s = "{{x:.{p}f}} {{y:.{p}f}} {{z:.{p}f}}".format(p=precision).format(x=v[0], y=v[1], z=v[2])
338 | if s not in h:
339 | newid = len(h)
340 | newids[i] = newid
341 | h[s] = newid
342 | newvertices.append(s)
343 | else:
344 | newids[i] = h[s]
345 | # -- update indices
346 | for theid in init_json["CityObjects"]:
347 | for g in init_json['CityObjects'][theid]['geometry']:
348 | update_geom_indices(g["boundaries"], newids)
349 |     # -- replace the vertices with the deduplicated list
350 | newv2 = []
351 | for v in newvertices:
352 | if "transform" in init_json:
353 | a = list(map(int, v.split()))
354 | else:
355 | a = list(map(float, v.split()))
356 | newv2.append(a)
357 | init_json["vertices"] = newv2
358 | return totalinput - len(init_json["vertices"])
359 |
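# Illustration of the keying above (vertex values are hypothetical): with the
# default precision of 3, the vertices [1.0004, 2.0, 3.0] and [1.0001, 2.0, 3.0]
# both format to the key "1.000 2.000 3.000", so only the first is kept and every
# boundary index that pointed at the second is remapped to it by
# update_geom_indices(). When init_json carries a "transform" member, precision is
# forced to 0, so only vertices that format to the same integer triple are merged.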
360 |
361 | def export_transformation_parameters(init_json):
362 | if 'transformed' in bpy.context.scene.world:
363 | print("Exporting transformation parameters...")
364 | init_json.update({'transform': {}})
365 | init_json['transform'].update({'scale': []})
366 | init_json['transform'].update({'translate': []})
367 |
368 | init_json['transform']['scale'].append(bpy.context.scene.world['transform.X_scale'])
369 | init_json['transform']['scale'].append(bpy.context.scene.world['transform.Y_scale'])
370 | init_json['transform']['scale'].append(bpy.context.scene.world['transform.Z_scale'])
371 |
372 | init_json['transform']['translate'].append(bpy.context.scene.world['transform.X_translate'])
373 | init_json['transform']['translate'].append(bpy.context.scene.world['transform.Y_translate'])
374 | init_json['transform']['translate'].append(bpy.context.scene.world['transform.Z_translate'])
375 |
376 | return None
377 |
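# Shape of the member written above (values are hypothetical): when the
# 'transformed' flag is present on the world, init_json ends up with e.g.
#     "transform": {"scale": [0.001, 0.001, 0.001],
#                   "translate": [85000.0, 447000.0, 0.0]}
# i.e. the scale/translate triplets read back from the world custom properties.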
378 |
379 | def export_metadata(init_json):
380 | print("Exporting metadata...")
381 | # Check if model's reference system exists
382 | if 'CRS' in bpy.context.scene.world:
383 | init_json['metadata'].update({'referenceSystem': bpy.context.scene.world["CRS"]})
384 | init_json['metadata'].update({'geographicalExtent': []})
385 | # Calculation of the bounding box of the whole area to get the geographic extents
386 | minim, maxim = bbox(bpy.data.objects)
387 |
388 | # Updating the metadata tag
389 | print("Appending geographical extent...")
390 | for extent_coord in minim:
391 | init_json['metadata']['geographicalExtent'].append(extent_coord)
392 | for extent_coord in maxim:
393 | init_json['metadata']['geographicalExtent'].append(round(extent_coord, 3))
394 |
395 | return None
396 |
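# Ordering note for the block above: the three minimum coordinates are appended
# first and the three maximums after them, so metadata['geographicalExtent'] is
# [min_x, min_y, min_z, max_x, max_y, max_z].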
397 |
398 | def export_parent_child(init_json):
399 |     """ Store the parents/children tags in the CityJSON file.
400 |     This is a separate loop because the object whose tag is being updated may
401 |     not have been created yet if this code runs inside the loop above.
402 |     TODO this can be done more efficiently. To be improved..."""
403 | print("\nSaving parents-children relations...")
404 | for city_object in bpy.data.objects:
405 |         # Parent-child relationships are stored in the empty objects, which also carry all the attributes
406 | if city_object.parent and city_object.type == "EMPTY":
407 | parents_id = city_object.parent.name
408 |             # Create the "children" tag under the parent's ID and append the child's name to it
409 | init_json["CityObjects"][parents_id].setdefault('children', [])
410 | init_json["CityObjects"][parents_id]['children'].append(city_object.name)
411 |             # Create the "parents" tag under the child's ID and append the parent's name to it
412 | init_json["CityObjects"][city_object.name].update({'parents': []})
413 | init_json["CityObjects"][city_object.name]['parents'].append(parents_id)
414 |
415 | return None
416 |
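# Illustration (hypothetical object names): if the Empty "building-1-roof" is
# parented to the Empty "building-1" in Blender, the loop above produces
#     init_json["CityObjects"]["building-1"]["children"] == ["building-1-roof"]
#     init_json["CityObjects"]["building-1-roof"]["parents"] == ["building-1"]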
417 |
418 | def export_attributes(split, init_json, CityObject_Id, attribute):
419 |     """ Store the attributes back into the dictionary.
420 |     The code below handles at most 3 levels of nested attributes.
421 |     TODO Future suggestion: make this work for any level of nesting (a depth-agnostic sketch follows this file)."""
422 | if len(split) == 3:
423 | if not (split[0] in init_json["CityObjects"][CityObject_Id]):
424 | init_json["CityObjects"][CityObject_Id].update({split[0]: {}})
425 | if not (split[1] in init_json["CityObjects"][CityObject_Id][split[0]]):
426 | init_json["CityObjects"][CityObject_Id][split[0]].update({split[1]: {}})
427 | if not (split[2] in init_json["CityObjects"][CityObject_Id][split[0]][split[1]]):
428 | init_json["CityObjects"][CityObject_Id][split[0]][split[1]].update({split[2]: attribute})
429 | elif len(split) == 2:
430 | if not (split[0] in init_json["CityObjects"][CityObject_Id]):
431 | init_json["CityObjects"][CityObject_Id].update({split[0]: {}})
432 | if not (split[1] in init_json["CityObjects"][CityObject_Id][split[0]]):
433 | init_json["CityObjects"][CityObject_Id][split[0]].update({split[1]: attribute})
434 | elif len(split) == 1:
435 | if not (split[0] in init_json["CityObjects"][CityObject_Id]):
436 | init_json["CityObjects"][CityObject_Id].update({split[0]: attribute})
437 |
438 | return None
439 |
--------------------------------------------------------------------------------
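The TODO in export_attributes above asks for a version that handles any nesting depth. Below is a minimal sketch of one way to do that, assuming the caller passes the same split list of key parts used by the exporter; the name export_attributes_nested is illustrative only and is not part of the add-on.

def export_attributes_nested(split, init_json, city_object_id, attribute):
    """Depth-agnostic variant of export_attributes: walk an arbitrarily long
    key path, creating intermediate dictionaries as needed, and write the
    attribute at the leaf only if it is not already present."""
    node = init_json["CityObjects"][city_object_id]
    # Descend through all but the last key, creating empty dicts on the way.
    for key in split[:-1]:
        node = node.setdefault(key, {})
    # Only set the leaf if the key does not exist yet (same rule as the 1-3 level code).
    node.setdefault(split[-1], attribute)
    return None

Called with the same arguments as export_attributes, this reproduces the 1-, 2- and 3-level behaviour and keeps working for deeper attribute paths.
--------------------------------------------------------------------------------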
/images/attributes.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/attributes.png
--------------------------------------------------------------------------------
/images/new_object_empty.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/new_object_empty.png
--------------------------------------------------------------------------------
/images/new_object_mesh.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/new_object_mesh.png
--------------------------------------------------------------------------------
/images/semantic_property.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/semantic_property.png
--------------------------------------------------------------------------------
/images/semantics.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/semantics.png
--------------------------------------------------------------------------------
/images/world_properties.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cityjson/Up3date/3e2a12b3ccc8e0fe3a0d05fcad2ffb330b801cd4/images/world_properties.png
--------------------------------------------------------------------------------
/pylintrc:
--------------------------------------------------------------------------------
1 | [MASTER]
2 |
3 | # Specify a configuration file.
4 | #rcfile=
5 |
6 | # Python code to execute, usually for sys.path manipulation such as
7 | # pygtk.require().
8 | #init-hook=
9 |
10 | # Profiled execution.
11 | profile=no
12 |
13 | # Add files or directories to the blacklist. They should be base names, not
14 | # paths.
15 | ignore=CVS
16 |
17 | # Pickle collected data for later comparisons.
18 | persistent=yes
19 |
20 | # List of plugins (as comma separated values of python modules names) to load,
21 | # usually to register additional checkers.
22 | load-plugins=
23 |
24 |
25 | [MESSAGES CONTROL]
26 |
27 | # Enable the message, report, category or checker with the given id(s). You can
28 | # either give multiple identifiers separated by comma (,) or put this option
29 | # multiple times. See also the "--disable" option for examples.
30 | #enable=
31 |
32 | # Disable the message, report, category or checker with the given id(s). You
33 | # can either give multiple identifiers separated by comma (,) or put this
34 | # option multiple times (only on the command line, not in the configuration
35 | # file where it should appear only once). You can also use "--disable=all" to
36 | # disable everything first and then reenable specific checks. For example, if
37 | # you want to run only the similarities checker, you can use "--disable=all
38 | # --enable=similarities". If you want to run only the classes checker, but have
39 | # no Warning level messages displayed, use "--disable=all --enable=classes
40 | # --disable=W"
41 | # see http://stackoverflow.com/questions/21487025/pylint-locally-defined-disables-still-give-warnings-how-to-suppress-them
42 | disable=locally-disabled,C0103
43 |
44 |
45 | [REPORTS]
46 |
47 | # Set the output format. Available formats are text, parseable, colorized, msvs
48 | # (visual studio) and html. You can also give a reporter class, eg
49 | # mypackage.mymodule.MyReporterClass.
50 | output-format=text
51 |
52 | # Put messages in a separate file for each module / package specified on the
53 | # command line instead of printing them on stdout. Reports (if any) will be
54 | # written in a file name "pylint_global.[txt|html]".
55 | files-output=no
56 |
57 | # Tells whether to display a full report or only the messages
58 | reports=yes
59 |
60 | # Python expression which should return a note less than 10 (10 is the highest
61 | # note). You have access to the variables errors, warning, statement, which
62 | # respectively contain the number of errors / warnings messages and the total
63 | # number of statements analyzed. This is used by the global evaluation report
64 | # (RP0004).
65 | evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
66 |
67 | # Add a comment according to your evaluation note. This is used by the global
68 | # evaluation report (RP0004).
69 | comment=no
70 |
71 | # Template used to display messages. This is a python new-style format string
72 | # used to format the message information. See doc for all details
73 | #msg-template=
74 |
75 |
76 | [BASIC]
77 |
78 | # Required attributes for module, separated by a comma
79 | required-attributes=
80 |
81 | # List of builtin function names that should not be used, separated by a comma
82 | bad-functions=map,filter,apply,input
83 |
84 | # Regular expression which should only match correct module names
85 | module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
86 |
87 | # Regular expression which should only match correct module level names
88 | const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$
89 |
90 | # Regular expression which should only match correct class names
91 | class-rgx=[A-Z_][a-zA-Z0-9]+$
92 |
93 | # Regular expression which should only match correct function names
94 | function-rgx=[a-z_][a-z0-9_]{2,30}$
95 |
96 | # Regular expression which should only match correct method names
97 | method-rgx=[a-z_][a-z0-9_]{2,30}$
98 |
99 | # Regular expression which should only match correct instance attribute names
100 | attr-rgx=[a-z_][a-z0-9_]{2,30}$
101 |
102 | # Regular expression which should only match correct argument names
103 | argument-rgx=[a-z_][a-z0-9_]{2,30}$
104 |
105 | # Regular expression which should only match correct variable names
106 | variable-rgx=[a-z_][a-z0-9_]{2,30}$
107 |
108 | # Regular expression which should only match correct attribute names in class
109 | # bodies
110 | class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
111 |
112 | # Regular expression which should only match correct list comprehension /
113 | # generator expression variable names
114 | inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
115 |
116 | # Good variable names which should always be accepted, separated by a comma
117 | good-names=i,j,k,ex,Run,_
118 |
119 | # Bad variable names which should always be refused, separated by a comma
120 | bad-names=foo,bar,baz,toto,tutu,tata
121 |
122 | # Regular expression which should only match function or class names that do
123 | # not require a docstring.
124 | no-docstring-rgx=__.*__
125 |
126 | # Minimum line length for functions/classes that require docstrings, shorter
127 | # ones are exempt.
128 | docstring-min-length=-1
129 |
130 |
131 | [MISCELLANEOUS]
132 |
133 | # List of note tags to take in consideration, separated by a comma.
134 | notes=FIXME,XXX,TODO
135 |
136 |
137 | [TYPECHECK]
138 |
139 | # Tells whether missing members accessed in mixin class should be ignored. A
140 | # mixin class is detected if its name ends with "mixin" (case insensitive).
141 | ignore-mixin-members=yes
142 |
143 | # List of class names for which member attributes should not be checked
144 | # (useful for classes with attributes dynamically set).
145 | ignored-classes=SQLObject
146 |
147 | # When zope mode is activated, add a predefined set of Zope acquired attributes
148 | # to generated-members.
149 | zope=no
150 |
151 | # List of members which are set dynamically and missed by pylint inference
152 | # system, and so shouldn't trigger E0201 when accessed. Python regular
153 | # expressions are accepted.
154 | generated-members=REQUEST,acl_users,aq_parent
155 |
156 |
157 | [VARIABLES]
158 |
159 | # Tells whether we should check for unused import in __init__ files.
160 | init-import=no
161 |
162 | # A regular expression matching the beginning of the name of dummy variables
163 | # (i.e. not used).
164 | dummy-variables-rgx=_$|dummy
165 |
166 | # List of additional names supposed to be defined in builtins. Remember that
167 | # you should avoid defining new builtins when possible.
168 | additional-builtins=
169 |
170 |
171 | [FORMAT]
172 |
173 | # Maximum number of characters on a single line.
174 | max-line-length=80
175 |
176 | # Regexp for a line that is allowed to be longer than the limit.
177 | ignore-long-lines=^\s*(# )??$
178 |
179 | # Allow the body of an if to be on the same line as the test if there is no
180 | # else.
181 | single-line-if-stmt=no
182 |
183 | # List of optional constructs for which whitespace checking is disabled
184 | no-space-check=trailing-comma,dict-separator
185 |
186 | # Maximum number of lines in a module
187 | max-module-lines=1000
188 |
189 | # String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
190 | # tab).
191 | indent-string=' '
192 |
193 |
194 | [SIMILARITIES]
195 |
196 | # Minimum lines number of a similarity.
197 | min-similarity-lines=4
198 |
199 | # Ignore comments when computing similarities.
200 | ignore-comments=yes
201 |
202 | # Ignore docstrings when computing similarities.
203 | ignore-docstrings=yes
204 |
205 | # Ignore imports when computing similarities.
206 | ignore-imports=no
207 |
208 |
209 | [IMPORTS]
210 |
211 | # Deprecated modules which should not be used, separated by a comma
212 | deprecated-modules=regsub,TERMIOS,Bastion,rexec
213 |
214 | # Create a graph of every (i.e. internal and external) dependencies in the
215 | # given file (report RP0402 must not be disabled)
216 | import-graph=
217 |
218 | # Create a graph of external dependencies in the given file (report RP0402 must
219 | # not be disabled)
220 | ext-import-graph=
221 |
222 | # Create a graph of internal dependencies in the given file (report RP0402 must
223 | # not be disabled)
224 | int-import-graph=
225 |
226 |
227 | [DESIGN]
228 |
229 | # Maximum number of arguments for function / method
230 | max-args=5
231 |
232 | # Argument names that match this expression will be ignored. Defaults to names
233 | # with a leading underscore.
234 | ignored-argument-names=_.*
235 |
236 | # Maximum number of locals for function / method body
237 | max-locals=15
238 |
239 | # Maximum number of return / yield for function / method body
240 | max-returns=6
241 |
242 | # Maximum number of branch for function / method body
243 | max-branches=12
244 |
245 | # Maximum number of statements in function / method body
246 | max-statements=50
247 |
248 | # Maximum number of parents for a class (see R0901).
249 | max-parents=7
250 |
251 | # Maximum number of attributes for a class (see R0902).
252 | max-attributes=7
253 |
254 | # Minimum number of public methods for a class (see R0903).
255 | min-public-methods=2
256 |
257 | # Maximum number of public methods for a class (see R0904).
258 | max-public-methods=20
259 |
260 |
261 | [CLASSES]
262 |
263 | # List of interface methods to ignore, separated by a comma. This is used, for
264 | # instance, to not check methods defined in Zope's Interface base class.
265 | ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
266 |
267 | # List of method names used to declare (i.e. assign) instance attributes.
268 | defining-attr-methods=__init__,__new__,setUp
269 |
270 | # List of valid names for the first argument in a class method.
271 | valid-classmethod-first-arg=cls
272 |
273 | # List of valid names for the first argument in a metaclass class method.
274 | valid-metaclass-classmethod-first-arg=mcs
275 |
276 |
277 | [EXCEPTIONS]
278 |
279 | # Exceptions that will emit a warning when being caught. Defaults to
280 | # "Exception"
281 | overgeneral-exceptions=Exception
282 |
--------------------------------------------------------------------------------