├── .gitignore ├── .gitmodules ├── .infragenie └── infrastructure_model.png ├── LICENSE ├── README.md ├── application ├── Dockerfile ├── config.json ├── config.py ├── config.schema ├── database.py ├── main.py ├── process_xml_to_json.py ├── queue.conf ├── requirements-production.txt ├── requirements-with-redis.txt ├── requirements.txt ├── run_debug.bat ├── run_debug.sh ├── static │ ├── App.js │ ├── dropzone.css │ ├── lib │ │ ├── Array.from.js │ │ ├── Object.assign.js │ │ ├── Object.entries.js │ │ ├── String.startsWith.js │ │ └── fetch.umd.js │ ├── main.css │ ├── package.json │ └── webpack.config.js ├── templates │ ├── error.html │ ├── index.html │ ├── log.html │ ├── progress.html │ └── viewer.html ├── utils.py ├── worker.py └── wsgi.py ├── docker-compose.yml ├── examples └── embed │ └── index.html ├── img └── architecture.png ├── init.sh └── nginx └── app.conf /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__/ 2 | *.py[cod] 3 | *.ifc 4 | *.cover 5 | docker-volumes 6 | application/static/bimsurfer 7 | application/static/node_modules 8 | application/nix 9 | application/win 10 | application/ifc-pipeline.db 11 | -------------------------------------------------------------------------------- /.gitmodules: -------------------------------------------------------------------------------- 1 | [submodule "application/bimsurfer"] 2 | path = application/bimsurfer 3 | url = https://github.com/AECgeeks/bimsurfer-ui 4 | -------------------------------------------------------------------------------- /.infragenie/infrastructure_model.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AECgeeks/ifc-pipeline/4d26c33f79a2b67bd86d9f3f48d1ee7e8333c625/.infragenie/infrastructure_model.png -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 AECgeeks 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ifc-pipeline 2 | 3 | A processing queue that uses [IfcOpenShell](https://github.com/IfcOpenShell/IfcOpenShell/) to convert IFC input files into a graphic display using glTF 2.0 and [BIMSurfer2](https://github.com/AECgeeks/BIMsurfer2/) for visualization. 4 | 5 | There is a small web application in Flask that accepts file uploads. HTTPS is provided by Nginx. Everything is tied together using Docker Compose. 6 | 7 | ## Architecture 8 | 9 | ![Architecture diagram of ifc-pipeline](img/architecture.png "Architecture diagram of ifc-pipeline") 10 | 11 | ### Routes 12 | 13 | There is a front-end in Flask with the following routes: 14 | 15 | - `GET /` HTML front-page with upload 16 | - `POST /` Form post upload. Redirects to `/p/<id>`. 17 | - `GET /p/<id>` HTML progress page 18 | - `GET /pp/<id>` JSON progress 19 | - `GET /log/<id>.<ext>` Conversion log. ext: html|json 20 | - `GET /v/<id>` HTML viewer page 21 | - `GET /m/<id>.<ext>` Artefact download. ext e.g.: `.svg` | `.glb` | `.tree.json` 22 | 23 | In the backend there is a worker process that uses IfcOpenShell's IfcConvert to convert the incoming IFC models into XML (for decomposition structure and property set data), glTF for 3D geometry and SVG for 2D floor plan geometry. 24 | 25 | Communication between the Flask web app and the task queue is handled by the RQ library, backed by Redis; model and conversion metadata are kept in a PostgreSQL database. Input and output files are stored on disk in a Docker volume shared between the two containers. 26 | 27 | Nginx is used as a reverse proxy with HTTPS, where certbot automatically creates and renews the certificates. 28 | 29 | In the front-end, the BIMsurfer2 viewer module takes the various converted artefacts for display. 30 | 31 | ## Local development 32 | 33 | Clone the ifc-pipeline repository recursively (with submodules). 34 | 35 | ### Linux 36 | 37 | This will set up an environment for easy development, without Docker, which uses SQLite instead of PostgreSQL and Python threads instead of the Redis-backed RQ processing queue. 38 | 39 | ~~~sh 40 | cd application 41 | 42 | python3 -m pip install -r requirements.txt 43 | 44 | # Download the IfcConvert binary 45 | mkdir nix 46 | cd nix/ 47 | wget https://s3.amazonaws.com/ifcopenshell-builds/IfcConvert-v0.7.0-2985bba-linux64.zip 48 | unzip IfcConvert-v0.7.0-2985bba-linux64.zip 49 | chmod +x IfcConvert 50 | cd .. 51 | 52 | # Install IfcOpenShell-python 53 | wget -O /tmp/ifcopenshell_python.zip https://s3.amazonaws.com/ifcopenshell-builds/ifcopenshell-python-`python3 -c 'import sys;print("".join(map(str, sys.version_info[0:2]))[0:2])'`-v0.7.0-2985bba-linux64.zip 54 | mkdir -p `python3 -c 'import site; print(site.getusersitepackages())'` 55 | unzip -d `python3 -c 'import site; print(site.getusersitepackages())'` /tmp/ifcopenshell_python.zip 56 | 57 | # Run flask with dev environment variables 58 | ./run_debug.sh 59 | ~~~ 60 | 61 | ### Windows 62 | 63 | * Install Python dependencies with `python -m pip install -r .\application\requirements.txt` 64 | * Download https://s3.amazonaws.com/ifcopenshell-builds/IfcConvert-v0.7.0-2985bba-win64.zip 65 | * Extract the archive and place IfcConvert.exe in a newly created directory, so that the result is `.\application\win\IfcConvert.exe` 66 | * Look up your Python version with: `python -c "import sys; print(sys.version_info);"` 67 | * Take the `major` and `minor` number and replace the `XY` in the URL below with those numbers. For example, for Python 3.10.x use `31` 68 | * Download https://s3.amazonaws.com/ifcopenshell-builds/ifcopenshell-python-XY-v0.7.0-2985bba-win64.zip 69 | * Take note of your Python site-packages directory with `python -c "import site; print(site.USER_SITE)"` 70 | * Extract the archive and place its contents into the site-packages folder of your Python interpreter 71 | * The result should be a directory structure with `...\site-packages\ifcopenshell\__init__.py` 72 | * Use `run_debug.bat` to start the application
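Once the development server is running (on either platform), the upload and progress routes listed above can be exercised from a short script. The following is a minimal sketch rather than part of the repository: it assumes the Flask development server is listening on `http://localhost:5000`, that a local file named `model.ifc` exists, and that the worker reports a progress value of 100 when conversion has finished. It uses the `requests` library, which is already listed in `requirements.txt`.

~~~python
import time

import requests

BASE = "http://localhost:5000"  # assumption: default Flask dev server address

# The upload handler collects every multipart field whose name starts with "file".
with open("model.ifc", "rb") as f:
    r = requests.post(BASE + "/", files={"file": ("model.ifc", f)},
                      headers={"Accept": "application/json"})
r.raise_for_status()
url = r.json()["url"]                      # e.g. "/p/<id>"
model_id = url.rstrip("/").split("/")[-1]

# Poll the JSON progress endpoint (assumption: 100 means the pipeline is done).
while True:
    progress = requests.get(f"{BASE}/pp/{model_id}").json()["progress"]
    if progress >= 100:
        break
    time.sleep(2)

# Download the glTF artefact produced by the worker.
with open(f"{model_id}.glb", "wb") as out:
    out.write(requests.get(f"{BASE}/m/{model_id}.glb").content)
~~~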
73 | 74 | ## Production deployment 75 | 76 | Install Docker and Docker Compose using the relevant guides for your operating system and then proceed with the following two steps. 77 | 78 | ~~~ 79 | # Do the first-time registration with certbot. 80 | # Further renewals will happen automatically. 81 | ./init.sh my.domain.name.com 82 | 83 | # Start the service using docker-compose 84 | docker-compose up -d 85 | ~~~ 86 | 87 | The service creates a fair amount of data for incoming files. Users are advised to set up a large disk for the storage of models and to implement their own vacuuming strategy to remove stale files. 88 | 89 | For example, to remove files older than 30 days (this can be run periodically using cron): 90 | 91 | ~~~sh 92 | find ./ifc-pipeline/models -mtime +30 -delete 93 | ~~~ 94 | 95 | By default the Nginx container listens on ports 80 and 443. Edit docker-compose.yml to change this. 96 | 97 | ### Maintenance commands 98 | 99 | * Get queue info 100 | 101 | `docker exec -it $(docker ps -q --filter name=worker) rq info` 102 | 103 | * Open a remote Python shell and print task information 104 | 105 | ~~~ 106 | docker exec -it $(docker ps -q --filter name=worker) python -c " 107 | import json, database, datetime; 108 | s = database.Session(); 109 | print(json.dumps([m.serialize() for m in s.query(database.model).filter(database.model.date+datetime.timedelta(days=2)>datetime.datetime.now()).order_by(database.model.date).all()], default=str)) 110 | " 111 | ~~~ 112 | 113 | 114 | ### Infrastructure model 115 | 116 | ![Infrastructure model](.infragenie/infrastructure_model.png) 117 | -------------------------------------------------------------------------------- /application/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM ubuntu:latest 2 | 3 | WORKDIR / 4 | ARG DEBIAN_FRONTEND=noninteractive 5 | RUN apt-get -y update && apt-get -y --no-install-recommends --no-install-suggests install python3 python3-pip unzip wget libpq-dev build-essential libssl-dev libffi-dev libxml2-dev libxslt1-dev zlib1g-dev python3-setuptools python3-dev python3-wheel supervisor libjpeg-dev 6 | RUN wget -qO- https://nodejs.org/dist/v16.18.0/node-v16.18.0-linux-x64.tar.gz | tar -xzC /opt 7 | ENV PATH /opt/node-v16.18.0-linux-x64/bin:$PATH 8 | RUN npm install -g jsdoc gltf-pipeline 9 | 10 | RUN wget -O /tmp/ifcopenshell_python.zip https://s3.amazonaws.com/ifcopenshell-builds/ifcopenshell-python-`python3 -c 'import sys;print("".join(map(str, sys.version_info[0:2]))[0:2])'`-v0.7.0-b5133c6-linux64.zip 11 | RUN mkdir -p `python3 -c 'import site; print(site.getusersitepackages())'` 12 | RUN unzip -d `python3 -c 'import site; print(site.getusersitepackages())'` /tmp/ifcopenshell_python.zip 13 | 14 | # Server 15 | WORKDIR /www 16 | COPY application/*.py application/*.txt application/config.json application/config.schema /www/ 17 | COPY application/templates /www/templates 18 | 19 | # Python dependencies 20 | RUN python3 -m pip install -r requirements-production.txt 21 | 22 | COPY .git/HEAD /tmp/.git/HEAD 23 | COPY
.git/refs/ /tmp/.git/refs/ 24 | RUN /bin/bash -c '(cat /tmp/.git/$(cat /tmp/.git/HEAD | cut -d \ -f 2)) || cat /tmp/.git/HEAD' > /version 25 | RUN sed -i "4i" /www/templates/*.html 26 | RUN rm -rf /tmp/.git 27 | 28 | COPY application/static /www/static/ 29 | COPY application/bimsurfer /www/bimsurfer 30 | 31 | WORKDIR /www/static 32 | RUN npm i && npx webpack --env postfix=$(cat /version) && rm -rf node_modules 33 | 34 | COPY application/queue.conf /etc/supervisord.conf 35 | RUN sed -i s/NUM_WORKERS/`python3 -c "import json; print(json.load(open('/www/config.json'))['performance']['num_workers'])"`/g /etc/supervisord.conf 36 | 37 | WORKDIR /www 38 | -------------------------------------------------------------------------------- /application/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "tasks": [ 3 | "ifc_validation_task", 4 | "xml_generation_task", 5 | "xml_to_json_conversion", 6 | "geometry_generation_task", 7 | "glb_optimize_task", 8 | "gzip_task", 9 | "svg_rename_task", 10 | "svg_generation_task" 11 | ], 12 | "treeview": { 13 | "label": "{{attr.Name or attr.GlobalId}}" 14 | }, 15 | "features": { 16 | "screen_share": { 17 | "enabled": true 18 | } 19 | }, 20 | "performance": { 21 | "num_workers": 1, 22 | "num_threads": "auto" 23 | } 24 | } 25 | -------------------------------------------------------------------------------- /application/config.py: -------------------------------------------------------------------------------- 1 | import os 2 | import json 3 | 4 | from multiprocessing import cpu_count 5 | 6 | import jsonschema 7 | 8 | # parse configuration file 9 | _config = json.load(open(os.path.join(os.path.dirname(__file__), "config.json"))) 10 | 11 | # parse configuration schema and validate _config 12 | schema = json.load(open(os.path.join(os.path.dirname(__file__), "config.schema"))) 13 | jsonschema.validate(schema=schema, instance=_config) 14 | 15 | # returns true if task is enabled in _config 16 | task_enabled = lambda nm: nm.__name__ in _config['tasks'] 17 | treeview_label = _config['treeview']['label'] 18 | with_screen_share = _config['features'].get("screen_share", {}).get("enabled", False) 19 | num_threads = _config['performance']['num_threads'] 20 | if num_threads == 'auto': 21 | num_threads = max(1, cpu_count() // _config['performance']['num_workers']) 22 | -------------------------------------------------------------------------------- /application/config.schema: -------------------------------------------------------------------------------- 1 | { 2 | "type": "object", 3 | "properties": { 4 | "tasks": { 5 | "type": "array", 6 | "items": { 7 | "type": "string" 8 | } 9 | }, 10 | "treeview": { 11 | "type": "object", 12 | "properties": { 13 | "label": { 14 | "type": "string" 15 | } 16 | }, 17 | "required": [ 18 | "label" 19 | ] 20 | }, 21 | "features": { 22 | "type": "object" 23 | }, 24 | "performance": { 25 | "type": "object", 26 | "properties": { 27 | "num_workers": { 28 | "type": "integer" 29 | }, 30 | "num_threads": { 31 | "oneOf": [ 32 | { 33 | "type": "integer" 34 | }, 35 | { 36 | "type": "string", 37 | "enum": [ 38 | "auto" 39 | ] 40 | } 41 | ] 42 | } 43 | }, 44 | "required": [ 45 | "num_workers" 46 | ] 47 | } 48 | }, 49 | "required": [ 50 | "tasks", 51 | "treeview", 52 | "features", 53 | "performance" 54 | ] 55 | } 56 | -------------------------------------------------------------------------------- /application/database.py: -------------------------------------------------------------------------------- 1 | 
################################################################################## 2 | # # 3 | # Copyright (c) 2020 AECgeeks # 4 | # # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy # 6 | # of this software and associated documentation files (the "Software"), to deal # 7 | # in the Software without restriction, including without limitation the rights # 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # 9 | # copies of the Software, and to permit persons to whom the Software is # 10 | # furnished to do so, subject to the following conditions: # 11 | # # 12 | # The above copyright notice and this permission notice shall be included in all # 13 | # copies or substantial portions of the Software. # 14 | # # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # 21 | # SOFTWARE. # 22 | # # 23 | ################################################################################## 24 | 25 | from sqlalchemy import create_engine 26 | from sqlalchemy.ext.declarative import declarative_base 27 | from sqlalchemy.orm import sessionmaker 28 | from sqlalchemy.sql import func 29 | from sqlalchemy.inspection import inspect 30 | from sqlalchemy import Column, Integer, String, DateTime, ForeignKey 31 | from sqlalchemy_utils import database_exists, create_database 32 | from sqlalchemy.orm import relationship 33 | import os 34 | 35 | DEVELOPMENT = os.environ.get('environment', 'production').lower() == 'development' 36 | 37 | if DEVELOPMENT: 38 | engine = create_engine('sqlite:///ifc-pipeline.db', connect_args={'check_same_thread': False}) 39 | else: 40 | engine = create_engine('postgresql://postgres:postgres@%s:5432/bimsurfer2' % os.environ.get('POSTGRES_HOST', 'localhost')) 41 | 42 | Session = sessionmaker(bind=engine) 43 | 44 | Base = declarative_base() 45 | 46 | 47 | class Serializable(object): 48 | def serialize(self): 49 | return {c: getattr(self, c) for c in inspect(self).attrs.keys()} 50 | 51 | 52 | class model(Base, Serializable): 53 | __tablename__ = 'models' 54 | 55 | id = Column(Integer, primary_key=True) 56 | code = Column(String) 57 | filename = Column(String) 58 | files = relationship("file") 59 | 60 | progress = Column(Integer, default=-1) 61 | date = Column(DateTime, server_default=func.now()) 62 | 63 | def __init__(self, code, filename): 64 | self.code = code 65 | self.filename = filename 66 | 67 | 68 | 69 | 70 | class file(Base, Serializable): 71 | __tablename__ = 'files' 72 | 73 | id = Column(Integer, primary_key=True) 74 | code = Column(String) 75 | filename = Column(String) 76 | 77 | model_id = Column(Integer, ForeignKey('models.id')) 78 | 79 | progress = Column(Integer, default=-1) 80 | date = Column(DateTime, server_default=func.now()) 81 | 82 | def __init__(self, code, filename): 83 | self.code = code 84 | self.filename = filename 85 | 86 | self.model_id = Column(Integer, ForeignKey('models.id')) 87 | 88 | 89 | def initialize(): 90 | if not database_exists(engine.url): 91 | create_database(engine.url) 92 | Base.metadata.create_all(engine) 93 | 94 | 95 | if __name__ == "__main__" or 
DEVELOPMENT: 96 | initialize() 97 | -------------------------------------------------------------------------------- /application/main.py: -------------------------------------------------------------------------------- 1 | ################################################################################## 2 | # # 3 | # Copyright (c) 2020 AECgeeks # 4 | # # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy # 6 | # of this software and associated documentation files (the "Software"), to deal # 7 | # in the Software without restriction, including without limitation the rights # 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # 9 | # copies of the Software, and to permit persons to whom the Software is # 10 | # furnished to do so, subject to the following conditions: # 11 | # # 12 | # The above copyright notice and this permission notice shall be included in all # 13 | # copies or substantial portions of the Software. # 14 | # # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # 21 | # SOFTWARE. # 22 | # # 23 | ################################################################################## 24 | 25 | from __future__ import print_function 26 | 27 | import os 28 | import glob 29 | import json 30 | import operator 31 | import threading 32 | 33 | from collections import defaultdict, namedtuple 34 | from flask_dropzone import Dropzone 35 | 36 | from werkzeug.middleware.proxy_fix import ProxyFix 37 | from flask import Flask, request, send_file, render_template, abort, jsonify, redirect, url_for, make_response, send_from_directory 38 | from flask_cors import CORS 39 | from flask_basicauth import BasicAuth 40 | from flasgger import Swagger 41 | 42 | import utils 43 | import worker 44 | import config 45 | import database 46 | 47 | # We have a custom static file handler that serves two directories, 48 | # but None cannot be supplied here because flask-dropzone depends on 49 | # it. 50 | application = Flask(__name__, static_folder="non-existant") 51 | dropzone = Dropzone(application) 52 | 53 | # application.config['DROPZONE_UPLOAD_MULTIPLE'] = True 54 | # application.config['DROPZONE_PARALLEL_UPLOADS'] = 3 55 | 56 | DEVELOPMENT = os.environ.get('environment', 'production').lower() == 'development' 57 | WITH_REDIS = os.environ.get('with_redis', 'false').lower() == 'true' 58 | 59 | 60 | if not DEVELOPMENT and os.path.exists("/version"): 61 | PIPELINE_POSTFIX = "." 
+ open("/version").read().strip() 62 | else: 63 | PIPELINE_POSTFIX = "" 64 | 65 | 66 | if not DEVELOPMENT: 67 | # In some setups this proved to be necessary for url_for() to pick up HTTPS 68 | application.wsgi_app = ProxyFix(application.wsgi_app, x_proto=1) 69 | 70 | CORS(application) 71 | application.config['SWAGGER'] = { 72 | 'title': os.environ.get('APP_NAME', 'ifc-pipeline request API'), 73 | 'openapi': '3.0.2', 74 | "specs": [ 75 | { 76 | "version": "0.1", 77 | "title": os.environ.get('APP_NAME', 'ifc-pipeline request API'), 78 | "description": os.environ.get('APP_NAME', 'ifc-pipeline request API'), 79 | "endpoint": "spec", 80 | "route": "/apispec", 81 | }, 82 | ] 83 | } 84 | swagger = Swagger(application) 85 | 86 | if DEVELOPMENT and not WITH_REDIS: 87 | redis_queue = None 88 | else: 89 | from redis import Redis 90 | from rq import Queue 91 | 92 | redis = Redis(host=os.environ.get("REDIS_HOST", "localhost")) 93 | redis_queue = Queue(connection=redis, default_timeout=3600) 94 | 95 | 96 | @application.route('/', methods=['GET']) 97 | def get_main(): 98 | return render_template('index.html') 99 | 100 | 101 | 102 | def process_upload(filewriter, callback_url=None): 103 | id = utils.generate_id() 104 | d = utils.storage_dir_for_id(id) 105 | os.makedirs(d) 106 | 107 | filewriter(os.path.join(d, id+".ifc")) 108 | 109 | session = database.Session() 110 | session.add(database.model(id, '')) 111 | session.commit() 112 | session.close() 113 | 114 | if redis_queue is None: 115 | t = threading.Thread(target=lambda: worker.process(id, callback_url)) 116 | t.start() 117 | else: 118 | redis_queue.enqueue(worker.process, id, callback_url) 119 | 120 | return id 121 | 122 | 123 | 124 | def process_upload_multiple(files, callback_url=None): 125 | id = utils.generate_id() 126 | d = utils.storage_dir_for_id(id) 127 | os.makedirs(d) 128 | 129 | file_id = 0 130 | session = database.Session() 131 | m = database.model(id, '') 132 | session.add(m) 133 | 134 | for file in files: 135 | fn = file.filename 136 | filewriter = lambda fn: file.save(fn) 137 | filewriter(os.path.join(d, id+"_"+str(file_id)+".ifc")) 138 | file_id += 1 139 | m.files.append(database.file(id, '')) 140 | 141 | session.commit() 142 | session.close() 143 | 144 | if redis_queue is None: 145 | t = threading.Thread(target=lambda: worker.process(id, callback_url)) 146 | t.start() 147 | else: 148 | redis_queue.enqueue(worker.process, id, callback_url) 149 | 150 | return id 151 | 152 | 153 | 154 | @application.route('/', methods=['POST']) 155 | def put_main(): 156 | """ 157 | Upload model 158 | --- 159 | requestBody: 160 | content: 161 | multipart/form-data: 162 | schema: 163 | type: object 164 | properties: 165 | ifc: 166 | type: string 167 | format: binary 168 | responses: 169 | '200': 170 | description: redirect 171 | """ 172 | ids = [] 173 | 174 | files = [] 175 | for key, f in request.files.items(): 176 | if key.startswith('file'): 177 | file = f 178 | files.append(file) 179 | 180 | 181 | id = process_upload_multiple(files) 182 | url = url_for('check_viewer', id=id) 183 | 184 | if request.accept_mimetypes.accept_json: 185 | return jsonify({"url":url}) 186 | else: 187 | return redirect(url) 188 | 189 | 190 | @application.route('/p/', methods=['GET']) 191 | def check_viewer(id): 192 | if not utils.validate_id(id): 193 | abort(404) 194 | return render_template('progress.html', id=id) 195 | 196 | 197 | @application.route('/pp/', methods=['GET']) 198 | def get_progress(id): 199 | if not utils.validate_id(id): 200 | abort(404) 201 | session = 
database.Session() 202 | model = session.query(database.model).filter(database.model.code == id).all()[0] 203 | session.close() 204 | return jsonify({"progress": model.progress}) 205 | 206 | 207 | @application.route('/log/.', methods=['GET']) 208 | def get_log(id, ext): 209 | log_entry_type = namedtuple('log_entry_type', ("level", "message", "instance", "product")) 210 | 211 | if ext not in {'html', 'json'}: 212 | abort(404) 213 | 214 | if not utils.validate_id(id): 215 | abort(404) 216 | logfn = os.path.join(utils.storage_dir_for_id(id), "log.json") 217 | if not os.path.exists(logfn): 218 | abort(404) 219 | 220 | if ext == 'html': 221 | log = [] 222 | for ln in open(logfn): 223 | l = ln.strip() 224 | if l: 225 | log.append(json.loads(l, object_hook=lambda d: log_entry_type(*(d.get(k, '') for k in log_entry_type._fields)))) 226 | return render_template('log.html', id=id, log=log) 227 | else: 228 | return send_file(logfn, mimetype='text/plain') 229 | 230 | 231 | @application.route('/v/', methods=['GET']) 232 | @application.route('/live//', methods=['GET']) 233 | def get_viewer(id, channel=None): 234 | if not utils.validate_id(id): 235 | abort(404) 236 | d = utils.storage_dir_for_id(id) 237 | 238 | if not os.path.exists(d): 239 | abort(404) 240 | 241 | ifc_files = [os.path.join(d, name) for name in os.listdir(d) if os.path.isfile(os.path.join(d, name)) and name.endswith('.ifc')] 242 | 243 | if len(ifc_files) == 0: 244 | abort(404) 245 | 246 | failedfn = os.path.join(utils.storage_dir_for_id(id), "failed") 247 | if os.path.exists(failedfn): 248 | return render_template('error.html', id=id) 249 | 250 | for ifc_fn in ifc_files: 251 | glbfn = ifc_fn.replace(".ifc", ".glb") 252 | if not os.path.exists(glbfn): 253 | abort(404) 254 | 255 | n_files = len(ifc_files) if "_" in ifc_files[0] else None 256 | 257 | return render_template( 258 | 'viewer.html', 259 | id=id, 260 | n_files=n_files, 261 | postfix=PIPELINE_POSTFIX, 262 | with_screen_share=config.with_screen_share, 263 | live_share_id=channel or utils.generate_id(), 264 | mode='listen' if channel else 'view' 265 | ) 266 | 267 | 268 | @application.route('/m/', methods=['GET']) 269 | def get_model(fn): 270 | """ 271 | Get model component 272 | --- 273 | parameters: 274 | - in: path 275 | name: fn 276 | required: true 277 | schema: 278 | type: string 279 | description: Model id and part extension 280 | example: BSESzzACOXGTedPLzNiNklHZjdJAxTGT.glb 281 | """ 282 | 283 | 284 | id, ext = fn.split('.', 1) 285 | 286 | if not utils.validate_id(id): 287 | abort(404) 288 | 289 | if ext not in {"xml", "svg", "glb", "unoptimized.glb", "tree.json"}: 290 | abort(404) 291 | 292 | path = utils.storage_file_for_id(id, ext) 293 | 294 | if not os.path.exists(path): 295 | abort(404) 296 | 297 | if os.path.exists(path + ".gz"): 298 | import mimetypes 299 | response = make_response( 300 | send_file(path + ".gz", 301 | mimetype=mimetypes.guess_type(fn, strict=False)[0]) 302 | ) 303 | response.headers['Content-Encoding'] = 'gzip' 304 | return response 305 | else: 306 | return send_file(path) 307 | 308 | 309 | @application.route('/live/', methods=['POST']) 310 | def post_live_viewer_update(channel): 311 | # body = request.get_json(force=True) 312 | # @todo validate schema? 
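    # A possible way to address the @todo above; a hedged sketch, not part of
    # the current implementation. jsonschema is already a dependency (see
    # requirements.txt), while LIVE_MESSAGE_SCHEMA is a hypothetical
    # module-level constant that does not exist in this repository:
    #
    #   import jsonschema
    #   LIVE_MESSAGE_SCHEMA = {"type": "object"}  # minimal placeholder schema
    #   try:
    #       jsonschema.validate(instance=request.get_json(force=True),
    #                           schema=LIVE_MESSAGE_SCHEMA)
    #   except jsonschema.ValidationError:
    #       abort(400)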
313 | # body = json.dumps(body) 314 | body = request.data.decode('ascii'); 315 | redis.publish(channel=f"live_{channel}", message=body) 316 | return "" 317 | 318 | 319 | @application.route('/static/') 320 | def static_handler(filename): 321 | # filenames = [os.path.join(root, fn)[len("static")+1:] for root, dirs, files in os.walk("static", topdown=False) for fn in files] 322 | if filename.startswith("bimsurfer/"): 323 | return send_from_directory("bimsurfer", "/".join(filename.split("/")[1:])) 324 | else: 325 | return send_from_directory("static", filename) 326 | 327 | 328 | @application.route('/live/', methods=['GET']) 329 | def get_viewer_update(channel): 330 | def format(obj): 331 | return f"data: {obj.decode('ascii')}\n\n" 332 | 333 | def stream(): 334 | pubsub = redis.pubsub() 335 | pubsub.subscribe(f"live_{channel}") 336 | try: 337 | msgs = pubsub.listen() 338 | yield from map(format, \ 339 | map(operator.itemgetter('data'), \ 340 | filter(lambda x: x.get('type') == 'message', msgs))) 341 | finally: 342 | import traceback 343 | traceback.print_exc() 344 | try: pubsub.unsubscribe(channel) 345 | except: pass 346 | 347 | return application.response_class( 348 | stream(), 349 | mimetype='text/event-stream', 350 | headers={'X-Accel-Buffering': 'no', 'Cache-Control': 'no-cache'}, 351 | ) 352 | 353 | """ 354 | # Create a file called routes.py with the following 355 | # example content to add application-specific routes 356 | 357 | from main import application 358 | 359 | @application.route('/test', methods=['GET']) 360 | def test_hello_world(): 361 | return 'Hello world' 362 | """ 363 | try: 364 | import routes 365 | except ImportError as e: 366 | pass 367 | -------------------------------------------------------------------------------- /application/process_xml_to_json.py: -------------------------------------------------------------------------------- 1 | import os 2 | import re 3 | import sys 4 | import json 5 | 6 | from functools import lru_cache, partial 7 | 8 | import ifcopenshell 9 | import ifcopenshell.util.element 10 | import lxml.etree as ET 11 | 12 | from jinja2 import Environment, BaseLoader, Undefined 13 | 14 | import config 15 | 16 | class Ignore(Undefined): 17 | def _fail_with_undefined_error(self, *args, **kwargs): 18 | return None 19 | 20 | 21 | T = Environment( 22 | loader=BaseLoader, 23 | undefined=Ignore 24 | ).from_string(config.treeview_label) 25 | 26 | 27 | def shorten_namespace(k): 28 | ns = re.findall(r"\{(.+?)\}(\w+)", k) 29 | if ns: 30 | return f"{ns[0][0].split('/')[-1]}:{ns[0][1]}" 31 | else: 32 | return k 33 | 34 | 35 | def attempt(fn): 36 | try: 37 | return fn() 38 | except: 39 | pass 40 | 41 | 42 | def map_attribute(k): 43 | return {'Name': 'name', 'id': 'guid'}.get(k, k) 44 | 45 | def to_dict(ifc, t): 46 | di = {map_attribute(shorten_namespace(k)): v for k, v in (t.attrib or {}).items()} 47 | is_product = attempt(lambda: ifc[di.get('guid')].is_a("IfcProduct")) 48 | is_project = attempt(lambda: ifc[di.get('guid')].is_a("IfcProject")) 49 | if is_product or is_project: 50 | di['label'] = T.render(**instance_template_lookup(ifc[di.get('guid')])) 51 | di['type'] = t.tag 52 | cs = list(map(partial(to_dict, f), t)) 53 | if cs: 54 | di['children'] = cs 55 | t = (t.text or '').strip() 56 | if t: 57 | di['text'] = t 58 | return di 59 | 60 | 61 | def dict_to_namespace(di): 62 | methods = { 63 | '__repr__': lambda self: '' 64 | } 65 | methods.update(di) 66 | return type('anonymous_', (object,), methods)() 67 | 68 | 69 | class instance_template_lookup: 70 | def 
__init__(self, inst): 71 | self.inst = inst 72 | di = {k: v for k, v in inst.get_info(include_identifier=False, recursive=False).items() if isinstance(v, (str, int, float)) and k != 'type'} 73 | self.attrs = dict_to_namespace(di) 74 | 75 | def __getitem__(self, k): 76 | if k == 'attr': 77 | return self.attrs 78 | elif k == 'prop': 79 | return instance_property_lookup(self.inst) 80 | elif k == 'type': 81 | return self.inst.is_a() 82 | else: 83 | raise KeyError(k) 84 | 85 | def keys(self): 86 | return 'attr', 'prop', 'type' 87 | 88 | 89 | class instance_property_lookup: 90 | def __init__(self, inst): 91 | self.inst = inst 92 | 93 | def __getattr__(self, k): 94 | v = self.props().get(k, {}) 95 | if isinstance(v, dict): 96 | return dict_to_namespace(v) 97 | else: 98 | return v 99 | 100 | @lru_cache(maxsize=32) 101 | def props(self): 102 | props = ifcopenshell.util.element.get_psets(self.inst) 103 | 104 | if ifcopenshell.util.element.get_type(self.inst): 105 | props.update(ifcopenshell.util.element.get_psets(ifcopenshell.util.element.get_type(self.inst))) 106 | 107 | for v in list(props.values()): 108 | props.update(v) 109 | 110 | return props 111 | 112 | 113 | if __name__ == "__main__": 114 | 115 | id = sys.argv[1] 116 | 117 | f = ifcopenshell.open(f"{id}.ifc") 118 | 119 | json.dump( 120 | to_dict(f, ET.parse( 121 | f"{id}.xml", 122 | parser=ET.XMLParser(encoding="utf-8") 123 | ).getroot()), 124 | open(f"{id}.tree.json", "w", encoding="utf-8") 125 | ) -------------------------------------------------------------------------------- /application/queue.conf: -------------------------------------------------------------------------------- 1 | [supervisord] 2 | 3 | [program:worker] 4 | command=rq worker --url redis://redis:6379 5 | numprocs=NUM_WORKERS 6 | process_name=%(program_name)s_%(process_num)02d 7 | autorestart=true 8 | stdout_logfile=/dev/fd/1 9 | stdout_logfile_maxbytes=0 10 | redirect_stderr=True 11 | -------------------------------------------------------------------------------- /application/requirements-production.txt: -------------------------------------------------------------------------------- 1 | -r requirements-with-redis.txt 2 | 3 | psycopg2 4 | gunicorn 5 | gevent 6 | -------------------------------------------------------------------------------- /application/requirements-with-redis.txt: -------------------------------------------------------------------------------- 1 | -r requirements.txt 2 | 3 | rq 4 | redis 5 | -------------------------------------------------------------------------------- /application/requirements.txt: -------------------------------------------------------------------------------- 1 | flask 2 | flask-cors 3 | sqlalchemy 4 | sqlalchemy-utils 5 | Flask-BasicAuth 6 | flasgger 7 | requests 8 | flask-dropzone 9 | jsonschema 10 | lxml 11 | -------------------------------------------------------------------------------- /application/run_debug.bat: -------------------------------------------------------------------------------- 1 | set environment=development 2 | set FLASK_APP=main.py 3 | set FLASK_DEBUG=1 4 | flask run 5 | -------------------------------------------------------------------------------- /application/run_debug.sh: -------------------------------------------------------------------------------- 1 | environment=development FLASK_APP=main.py FLASK_DEBUG=1 python3 -m flask run -h 0.0.0.0 --with-threads 2 | -------------------------------------------------------------------------------- /application/static/App.js: 
-------------------------------------------------------------------------------- 1 | import MultiModal from './bimsurfer/src/MultiModal.js'; 2 | 3 | let processed = false; 4 | 5 | document.addEventListener('DOMContentLoaded', () => { 6 | if (processed) { 7 | // For some reason the event fires twice with es-module-shims on chrome 8 | return; 9 | } 10 | processed = true; 11 | 12 | var v = window.viewer = new MultiModal({ 13 | domNode: 'right', 14 | svgDomNode: 'bottom', 15 | modelId: window.MODEL_ID, 16 | withTreeVisibilityToggle: true, 17 | withTreeViewIcons: true, 18 | treeLoadUntil: 3, 19 | n_files: window.NUM_FILES, 20 | withShadows: true, 21 | engine3d: localStorage.getItem('engine') || 'threejs', 22 | fromPipeline: true 23 | }); 24 | 25 | if (window.SPINNER_CLASS) { 26 | v.setSpinner({className: window.SPINNER_CLASS}); 27 | } else if (window.SPINNER_URL) { 28 | v.setSpinner({url: window.SPINNER_URL}); 29 | } 30 | 31 | v.load2d(); 32 | v.load3d(); 33 | v.loadMetadata('middle'); 34 | v.loadTreeView('top'); 35 | 36 | if (window.onViewerLoaded) { 37 | window.onViewerLoaded(self); 38 | } 39 | }); 40 | -------------------------------------------------------------------------------- /application/static/dropzone.css: -------------------------------------------------------------------------------- 1 | /* 2 | * The MIT License 3 | * Copyright (c) 2012 Matias Meno 4 | */ 5 | @-webkit-keyframes passing-through { 6 | 0% { 7 | opacity: 0; 8 | -webkit-transform: translateY(40px); 9 | -moz-transform: translateY(40px); 10 | -ms-transform: translateY(40px); 11 | -o-transform: translateY(40px); 12 | transform: translateY(40px); } 13 | 30%, 70% { 14 | opacity: 1; 15 | -webkit-transform: translateY(0px); 16 | -moz-transform: translateY(0px); 17 | -ms-transform: translateY(0px); 18 | -o-transform: translateY(0px); 19 | transform: translateY(0px); } 20 | 100% { 21 | opacity: 0; 22 | -webkit-transform: translateY(-40px); 23 | -moz-transform: translateY(-40px); 24 | -ms-transform: translateY(-40px); 25 | -o-transform: translateY(-40px); 26 | transform: translateY(-40px); } } 27 | @-moz-keyframes passing-through { 28 | 0% { 29 | opacity: 0; 30 | -webkit-transform: translateY(40px); 31 | -moz-transform: translateY(40px); 32 | -ms-transform: translateY(40px); 33 | -o-transform: translateY(40px); 34 | transform: translateY(40px); } 35 | 30%, 70% { 36 | opacity: 1; 37 | -webkit-transform: translateY(0px); 38 | -moz-transform: translateY(0px); 39 | -ms-transform: translateY(0px); 40 | -o-transform: translateY(0px); 41 | transform: translateY(0px); } 42 | 100% { 43 | opacity: 0; 44 | -webkit-transform: translateY(-40px); 45 | -moz-transform: translateY(-40px); 46 | -ms-transform: translateY(-40px); 47 | -o-transform: translateY(-40px); 48 | transform: translateY(-40px); } } 49 | @keyframes passing-through { 50 | 0% { 51 | opacity: 0; 52 | -webkit-transform: translateY(40px); 53 | -moz-transform: translateY(40px); 54 | -ms-transform: translateY(40px); 55 | -o-transform: translateY(40px); 56 | transform: translateY(40px); } 57 | 30%, 70% { 58 | opacity: 1; 59 | -webkit-transform: translateY(0px); 60 | -moz-transform: translateY(0px); 61 | -ms-transform: translateY(0px); 62 | -o-transform: translateY(0px); 63 | transform: translateY(0px); } 64 | 100% { 65 | opacity: 0; 66 | -webkit-transform: translateY(-40px); 67 | -moz-transform: translateY(-40px); 68 | -ms-transform: translateY(-40px); 69 | -o-transform: translateY(-40px); 70 | transform: translateY(-40px); } } 71 | @-webkit-keyframes slide-in { 72 | 0% { 73 | 
opacity: 0; 74 | -webkit-transform: translateY(40px); 75 | -moz-transform: translateY(40px); 76 | -ms-transform: translateY(40px); 77 | -o-transform: translateY(40px); 78 | transform: translateY(40px); } 79 | 30% { 80 | opacity: 1; 81 | -webkit-transform: translateY(0px); 82 | -moz-transform: translateY(0px); 83 | -ms-transform: translateY(0px); 84 | -o-transform: translateY(0px); 85 | transform: translateY(0px); } } 86 | @-moz-keyframes slide-in { 87 | 0% { 88 | opacity: 0; 89 | -webkit-transform: translateY(40px); 90 | -moz-transform: translateY(40px); 91 | -ms-transform: translateY(40px); 92 | -o-transform: translateY(40px); 93 | transform: translateY(40px); } 94 | 30% { 95 | opacity: 1; 96 | -webkit-transform: translateY(0px); 97 | -moz-transform: translateY(0px); 98 | -ms-transform: translateY(0px); 99 | -o-transform: translateY(0px); 100 | transform: translateY(0px); } } 101 | @keyframes slide-in { 102 | 0% { 103 | opacity: 0; 104 | -webkit-transform: translateY(40px); 105 | -moz-transform: translateY(40px); 106 | -ms-transform: translateY(40px); 107 | -o-transform: translateY(40px); 108 | transform: translateY(40px); } 109 | 30% { 110 | opacity: 1; 111 | -webkit-transform: translateY(0px); 112 | -moz-transform: translateY(0px); 113 | -ms-transform: translateY(0px); 114 | -o-transform: translateY(0px); 115 | transform: translateY(0px); } } 116 | @-webkit-keyframes pulse { 117 | 0% { 118 | -webkit-transform: scale(1); 119 | -moz-transform: scale(1); 120 | -ms-transform: scale(1); 121 | -o-transform: scale(1); 122 | transform: scale(1); } 123 | 10% { 124 | -webkit-transform: scale(1.1); 125 | -moz-transform: scale(1.1); 126 | -ms-transform: scale(1.1); 127 | -o-transform: scale(1.1); 128 | transform: scale(1.1); } 129 | 20% { 130 | -webkit-transform: scale(1); 131 | -moz-transform: scale(1); 132 | -ms-transform: scale(1); 133 | -o-transform: scale(1); 134 | transform: scale(1); } } 135 | @-moz-keyframes pulse { 136 | 0% { 137 | -webkit-transform: scale(1); 138 | -moz-transform: scale(1); 139 | -ms-transform: scale(1); 140 | -o-transform: scale(1); 141 | transform: scale(1); } 142 | 10% { 143 | -webkit-transform: scale(1.1); 144 | -moz-transform: scale(1.1); 145 | -ms-transform: scale(1.1); 146 | -o-transform: scale(1.1); 147 | transform: scale(1.1); } 148 | 20% { 149 | -webkit-transform: scale(1); 150 | -moz-transform: scale(1); 151 | -ms-transform: scale(1); 152 | -o-transform: scale(1); 153 | transform: scale(1); } } 154 | @keyframes pulse { 155 | 0% { 156 | -webkit-transform: scale(1); 157 | -moz-transform: scale(1); 158 | -ms-transform: scale(1); 159 | -o-transform: scale(1); 160 | transform: scale(1); } 161 | 10% { 162 | -webkit-transform: scale(1.1); 163 | -moz-transform: scale(1.1); 164 | -ms-transform: scale(1.1); 165 | -o-transform: scale(1.1); 166 | transform: scale(1.1); } 167 | 20% { 168 | -webkit-transform: scale(1); 169 | -moz-transform: scale(1); 170 | -ms-transform: scale(1); 171 | -o-transform: scale(1); 172 | transform: scale(1); } } 173 | .dropzone, .dropzone * { 174 | box-sizing: border-box; } 175 | 176 | .dropzone { 177 | min-height: 150px; 178 | border: 2px solid rgba(0, 0, 0, 0.3); 179 | background: white; 180 | padding: 20px 20px; } 181 | .dropzone.dz-clickable { 182 | cursor: pointer; } 183 | .dropzone.dz-clickable * { 184 | cursor: default; } 185 | .dropzone.dz-clickable .dz-message, .dropzone.dz-clickable .dz-message * { 186 | cursor: pointer; } 187 | .dropzone.dz-started .dz-message { 188 | display: none; } 189 | .dropzone.dz-drag-hover { 190 | border-style: 
solid; } 191 | .dropzone.dz-drag-hover .dz-message { 192 | opacity: 0.5; } 193 | .dropzone .dz-message { 194 | text-align: center; 195 | margin: 2em 0; } 196 | .dropzone .dz-message .dz-button { 197 | background: none; 198 | color: inherit; 199 | border: none; 200 | padding: 0; 201 | font: inherit; 202 | cursor: pointer; 203 | outline: inherit; } 204 | .dropzone .dz-preview { 205 | position: relative; 206 | display: inline-block; 207 | vertical-align: top; 208 | margin: 16px; 209 | min-height: 100px; } 210 | .dropzone .dz-preview:hover { 211 | z-index: 1000; } 212 | .dropzone .dz-preview:hover .dz-details { 213 | opacity: 1; } 214 | .dropzone .dz-preview.dz-file-preview .dz-image { 215 | border-radius: 20px; 216 | background: #999; 217 | background: linear-gradient(to bottom, #eee, #ddd); } 218 | .dropzone .dz-preview.dz-file-preview .dz-details { 219 | opacity: 1; } 220 | .dropzone .dz-preview.dz-image-preview { 221 | background: white; } 222 | .dropzone .dz-preview.dz-image-preview .dz-details { 223 | -webkit-transition: opacity 0.2s linear; 224 | -moz-transition: opacity 0.2s linear; 225 | -ms-transition: opacity 0.2s linear; 226 | -o-transition: opacity 0.2s linear; 227 | transition: opacity 0.2s linear; } 228 | .dropzone .dz-preview .dz-remove { 229 | font-size: 14px; 230 | text-align: center; 231 | display: block; 232 | cursor: pointer; 233 | border: none; } 234 | .dropzone .dz-preview .dz-remove:hover { 235 | text-decoration: underline; } 236 | .dropzone .dz-preview:hover .dz-details { 237 | opacity: 1; } 238 | .dropzone .dz-preview .dz-details { 239 | z-index: 20; 240 | position: absolute; 241 | top: 0; 242 | left: 0; 243 | opacity: 0; 244 | font-size: 13px; 245 | min-width: 100%; 246 | max-width: 100%; 247 | padding: 2em 1em; 248 | text-align: center; 249 | color: rgba(0, 0, 0, 0.9); 250 | line-height: 150%; } 251 | .dropzone .dz-preview .dz-details .dz-size { 252 | margin-bottom: 1em; 253 | font-size: 16px; } 254 | .dropzone .dz-preview .dz-details .dz-filename { 255 | white-space: nowrap; } 256 | .dropzone .dz-preview .dz-details .dz-filename:hover span { 257 | border: 1px solid rgba(200, 200, 200, 0.8); 258 | background-color: rgba(255, 255, 255, 0.8); } 259 | .dropzone .dz-preview .dz-details .dz-filename:not(:hover) { 260 | overflow: hidden; 261 | text-overflow: ellipsis; } 262 | .dropzone .dz-preview .dz-details .dz-filename:not(:hover) span { 263 | border: 1px solid transparent; } 264 | .dropzone .dz-preview .dz-details .dz-filename span, .dropzone .dz-preview .dz-details .dz-size span { 265 | background-color: rgba(255, 255, 255, 0.4); 266 | padding: 0 0.4em; 267 | border-radius: 3px; } 268 | .dropzone .dz-preview:hover .dz-image img { 269 | -webkit-transform: scale(1.05, 1.05); 270 | -moz-transform: scale(1.05, 1.05); 271 | -ms-transform: scale(1.05, 1.05); 272 | -o-transform: scale(1.05, 1.05); 273 | transform: scale(1.05, 1.05); 274 | -webkit-filter: blur(8px); 275 | filter: blur(8px); } 276 | .dropzone .dz-preview .dz-image { 277 | border-radius: 20px; 278 | overflow: hidden; 279 | width: 120px; 280 | height: 120px; 281 | position: relative; 282 | display: block; 283 | z-index: 10; } 284 | .dropzone .dz-preview .dz-image img { 285 | display: block; } 286 | .dropzone .dz-preview.dz-success .dz-success-mark { 287 | -webkit-animation: passing-through 3s cubic-bezier(0.77, 0, 0.175, 1); 288 | -moz-animation: passing-through 3s cubic-bezier(0.77, 0, 0.175, 1); 289 | -ms-animation: passing-through 3s cubic-bezier(0.77, 0, 0.175, 1); 290 | -o-animation: passing-through 3s 
cubic-bezier(0.77, 0, 0.175, 1); 291 | animation: passing-through 3s cubic-bezier(0.77, 0, 0.175, 1); } 292 | .dropzone .dz-preview.dz-error .dz-error-mark { 293 | opacity: 1; 294 | -webkit-animation: slide-in 3s cubic-bezier(0.77, 0, 0.175, 1); 295 | -moz-animation: slide-in 3s cubic-bezier(0.77, 0, 0.175, 1); 296 | -ms-animation: slide-in 3s cubic-bezier(0.77, 0, 0.175, 1); 297 | -o-animation: slide-in 3s cubic-bezier(0.77, 0, 0.175, 1); 298 | animation: slide-in 3s cubic-bezier(0.77, 0, 0.175, 1); } 299 | .dropzone .dz-preview .dz-success-mark, .dropzone .dz-preview .dz-error-mark { 300 | pointer-events: none; 301 | opacity: 0; 302 | z-index: 500; 303 | position: absolute; 304 | display: block; 305 | top: 50%; 306 | left: 50%; 307 | margin-left: -27px; 308 | margin-top: -27px; } 309 | .dropzone .dz-preview .dz-success-mark svg, .dropzone .dz-preview .dz-error-mark svg { 310 | display: block; 311 | width: 54px; 312 | height: 54px; } 313 | .dropzone .dz-preview.dz-processing .dz-progress { 314 | opacity: 1; 315 | -webkit-transition: all 0.2s linear; 316 | -moz-transition: all 0.2s linear; 317 | -ms-transition: all 0.2s linear; 318 | -o-transition: all 0.2s linear; 319 | transition: all 0.2s linear; } 320 | .dropzone .dz-preview.dz-complete .dz-progress { 321 | opacity: 0; 322 | -webkit-transition: opacity 0.4s ease-in; 323 | -moz-transition: opacity 0.4s ease-in; 324 | -ms-transition: opacity 0.4s ease-in; 325 | -o-transition: opacity 0.4s ease-in; 326 | transition: opacity 0.4s ease-in; } 327 | .dropzone .dz-preview:not(.dz-processing) .dz-progress { 328 | -webkit-animation: pulse 6s ease infinite; 329 | -moz-animation: pulse 6s ease infinite; 330 | -ms-animation: pulse 6s ease infinite; 331 | -o-animation: pulse 6s ease infinite; 332 | animation: pulse 6s ease infinite; } 333 | .dropzone .dz-preview .dz-progress { 334 | opacity: 1; 335 | z-index: 1000; 336 | pointer-events: none; 337 | position: absolute; 338 | height: 16px; 339 | left: 50%; 340 | top: 50%; 341 | margin-top: -8px; 342 | width: 80px; 343 | margin-left: -40px; 344 | background: rgba(255, 255, 255, 0.9); 345 | -webkit-transform: scale(1); 346 | border-radius: 8px; 347 | overflow: hidden; } 348 | .dropzone .dz-preview .dz-progress .dz-upload { 349 | background: #333; 350 | background: linear-gradient(to bottom, #666, #444); 351 | position: absolute; 352 | top: 0; 353 | left: 0; 354 | bottom: 0; 355 | width: 0; 356 | -webkit-transition: width 300ms ease-in-out; 357 | -moz-transition: width 300ms ease-in-out; 358 | -ms-transition: width 300ms ease-in-out; 359 | -o-transition: width 300ms ease-in-out; 360 | transition: width 300ms ease-in-out; } 361 | .dropzone .dz-preview.dz-error .dz-error-message { 362 | display: block; } 363 | .dropzone .dz-preview.dz-error:hover .dz-error-message { 364 | opacity: 1; 365 | pointer-events: auto; } 366 | .dropzone .dz-preview .dz-error-message { 367 | pointer-events: none; 368 | z-index: 1000; 369 | position: absolute; 370 | display: block; 371 | display: none; 372 | opacity: 0; 373 | -webkit-transition: opacity 0.3s ease; 374 | -moz-transition: opacity 0.3s ease; 375 | -ms-transition: opacity 0.3s ease; 376 | -o-transition: opacity 0.3s ease; 377 | transition: opacity 0.3s ease; 378 | border-radius: 8px; 379 | font-size: 13px; 380 | top: 130px; 381 | left: -10px; 382 | width: 140px; 383 | background: #be2626; 384 | background: linear-gradient(to bottom, #be2626, #a92222); 385 | padding: 0.5em 1.2em; 386 | color: white; } 387 | .dropzone .dz-preview .dz-error-message:after { 388 | content: 
''; 389 | position: absolute; 390 | top: -6px; 391 | left: 64px; 392 | width: 0; 393 | height: 0; 394 | border-left: 6px solid transparent; 395 | border-right: 6px solid transparent; 396 | border-bottom: 6px solid #be2626; } 397 | -------------------------------------------------------------------------------- /application/static/lib/Array.from.js: -------------------------------------------------------------------------------- 1 | /** 2 | * Polyfill for "fixing" IE's lack of support (IE < 9) for applying slice 3 | * on host objects like NamedNodeMap, NodeList, and HTMLCollection 4 | * (technically, since host objects are implementation-dependent, 5 | * IE hasn't needed to work this way, though I am informed this 6 | * will be changing with ES6). Also works on strings, 7 | * fixes IE to allow an explicit undefined for the 2nd argument 8 | * (as in Firefox), and prevents errors when called on other 9 | * DOM objects. 10 | * @license MIT, GPL, do whatever you want 11 | -* @see https://gist.github.com/brettz9/6093105 12 | */ 13 | (function () { 14 | 'use strict'; 15 | var _slice = Array.prototype.slice; 16 | 17 | try { 18 | _slice.call(document.documentElement); // Can't be used with DOM elements in IE < 9 19 | } 20 | catch (e) { // Fails in IE < 9 21 | Array.prototype.slice = function (begin, end) { 22 | var i, arrl = this.length, a = []; 23 | if (this.charAt) { // Although IE < 9 does not fail when applying Array.prototype.slice 24 | // to strings, here we do have to duck-type to avoid failing 25 | // with IE < 9's lack of support for string indexes 26 | for (i = 0; i < arrl; i++) { 27 | a.push(this.charAt(i)); 28 | } 29 | } 30 | else { // This will work for genuine arrays, array-like objects, NamedNodeMap (attributes, entities, notations), NodeList (e.g., getElementsByTagName), HTMLCollection (e.g., childNodes), and will not fail on other DOM objects (as do DOM elements in IE < 9) 31 | for (i = 0; i < this.length; i++) { // IE < 9 (at least IE < 9 mode in IE 10) does not work with node.attributes (NamedNodeMap) without a dynamically checked length here 32 | a.push(this[i]); 33 | } 34 | } 35 | return _slice.call(a, begin, end || a.length); // IE < 9 gives errors here if end is allowed as undefined (as opposed to just missing) so we default ourselves 36 | }; 37 | } 38 | }()); 39 | 40 | /** 41 | * @license MIT, GPL, do whatever you want 42 | * @requires polyfill: Array.prototype.slice fix {@link https://gist.github.com/brettz9/6093105} 43 | */ 44 | if (!Array.from) { 45 | Array.from = function (object) { 46 | 'use strict'; 47 | return [].slice.call(object); 48 | }; 49 | } 50 | -------------------------------------------------------------------------------- /application/static/lib/Object.assign.js: -------------------------------------------------------------------------------- 1 | if (typeof Object.assign !== 'function') { 2 | // Must be writable: true, enumerable: false, configurable: true 3 | Object.defineProperty(Object, "assign", { 4 | value: function assign(target, varArgs) { // .length of function is 2 5 | 'use strict'; 6 | if (target === null || target === undefined) { 7 | throw new TypeError('Cannot convert undefined or null to object'); 8 | } 9 | 10 | var to = Object(target); 11 | 12 | for (var index = 1; index < arguments.length; index++) { 13 | var nextSource = arguments[index]; 14 | 15 | if (nextSource !== null && nextSource !== undefined) { 16 | for (var nextKey in nextSource) { 17 | // Avoid bugs when hasOwnProperty is shadowed 18 | if 
(Object.prototype.hasOwnProperty.call(nextSource, nextKey)) { 19 | to[nextKey] = nextSource[nextKey]; 20 | } 21 | } 22 | } 23 | } 24 | return to; 25 | }, 26 | writable: true, 27 | configurable: true 28 | }); 29 | } 30 | -------------------------------------------------------------------------------- /application/static/lib/Object.entries.js: -------------------------------------------------------------------------------- 1 | if (!Object.entries) { 2 | Object.entries = function( obj ){ 3 | var ownProps = Object.keys( obj ), 4 | i = ownProps.length, 5 | resArray = new Array(i); // preallocate the Array 6 | while (i--) 7 | resArray[i] = [ownProps[i], obj[ownProps[i]]]; 8 | 9 | return resArray; 10 | }; 11 | } 12 | -------------------------------------------------------------------------------- /application/static/lib/String.startsWith.js: -------------------------------------------------------------------------------- 1 | if (!String.prototype.startsWith) { 2 | Object.defineProperty(String.prototype, 'startsWith', { 3 | value: function(search, rawPos) { 4 | var pos = rawPos > 0 ? rawPos|0 : 0; 5 | return this.substring(pos, pos + search.length) === search; 6 | } 7 | }); 8 | } 9 | -------------------------------------------------------------------------------- /application/static/lib/fetch.umd.js: -------------------------------------------------------------------------------- 1 | (function (global, factory) { 2 | typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports) : 3 | typeof define === 'function' && define.amd ? define(['exports'], factory) : 4 | (factory((global.WHATWGFetch = {}))); 5 | }(this, (function (exports) { 'use strict'; 6 | 7 | var global = 8 | (typeof globalThis !== 'undefined' && globalThis) || 9 | (typeof self !== 'undefined' && self) || 10 | (typeof global !== 'undefined' && global); 11 | 12 | var support = { 13 | searchParams: 'URLSearchParams' in global, 14 | iterable: 'Symbol' in global && 'iterator' in Symbol, 15 | blob: 16 | 'FileReader' in global && 17 | 'Blob' in global && 18 | (function() { 19 | try { 20 | new Blob(); 21 | return true 22 | } catch (e) { 23 | return false 24 | } 25 | })(), 26 | formData: 'FormData' in global, 27 | arrayBuffer: 'ArrayBuffer' in global 28 | }; 29 | 30 | function isDataView(obj) { 31 | return obj && DataView.prototype.isPrototypeOf(obj) 32 | } 33 | 34 | if (support.arrayBuffer) { 35 | var viewClasses = [ 36 | '[object Int8Array]', 37 | '[object Uint8Array]', 38 | '[object Uint8ClampedArray]', 39 | '[object Int16Array]', 40 | '[object Uint16Array]', 41 | '[object Int32Array]', 42 | '[object Uint32Array]', 43 | '[object Float32Array]', 44 | '[object Float64Array]' 45 | ]; 46 | 47 | var isArrayBufferView = 48 | ArrayBuffer.isView || 49 | function(obj) { 50 | return obj && viewClasses.indexOf(Object.prototype.toString.call(obj)) > -1 51 | }; 52 | } 53 | 54 | function normalizeName(name) { 55 | if (typeof name !== 'string') { 56 | name = String(name); 57 | } 58 | if (/[^a-z0-9\-#$%&'*+.^_`|~!]/i.test(name) || name === '') { 59 | throw new TypeError('Invalid character in header field name') 60 | } 61 | return name.toLowerCase() 62 | } 63 | 64 | function normalizeValue(value) { 65 | if (typeof value !== 'string') { 66 | value = String(value); 67 | } 68 | return value 69 | } 70 | 71 | // Build a destructive iterator for the value list 72 | function iteratorFor(items) { 73 | var iterator = { 74 | next: function() { 75 | var value = items.shift(); 76 | return {done: value === undefined, value: value} 77 | } 78 | }; 79 | 
80 | if (support.iterable) { 81 | iterator[Symbol.iterator] = function() { 82 | return iterator 83 | }; 84 | } 85 | 86 | return iterator 87 | } 88 | 89 | function Headers(headers) { 90 | this.map = {}; 91 | 92 | if (headers instanceof Headers) { 93 | headers.forEach(function(value, name) { 94 | this.append(name, value); 95 | }, this); 96 | } else if (Array.isArray(headers)) { 97 | headers.forEach(function(header) { 98 | this.append(header[0], header[1]); 99 | }, this); 100 | } else if (headers) { 101 | Object.getOwnPropertyNames(headers).forEach(function(name) { 102 | this.append(name, headers[name]); 103 | }, this); 104 | } 105 | } 106 | 107 | Headers.prototype.append = function(name, value) { 108 | name = normalizeName(name); 109 | value = normalizeValue(value); 110 | var oldValue = this.map[name]; 111 | this.map[name] = oldValue ? oldValue + ', ' + value : value; 112 | }; 113 | 114 | Headers.prototype['delete'] = function(name) { 115 | delete this.map[normalizeName(name)]; 116 | }; 117 | 118 | Headers.prototype.get = function(name) { 119 | name = normalizeName(name); 120 | return this.has(name) ? this.map[name] : null 121 | }; 122 | 123 | Headers.prototype.has = function(name) { 124 | return this.map.hasOwnProperty(normalizeName(name)) 125 | }; 126 | 127 | Headers.prototype.set = function(name, value) { 128 | this.map[normalizeName(name)] = normalizeValue(value); 129 | }; 130 | 131 | Headers.prototype.forEach = function(callback, thisArg) { 132 | for (var name in this.map) { 133 | if (this.map.hasOwnProperty(name)) { 134 | callback.call(thisArg, this.map[name], name, this); 135 | } 136 | } 137 | }; 138 | 139 | Headers.prototype.keys = function() { 140 | var items = []; 141 | this.forEach(function(value, name) { 142 | items.push(name); 143 | }); 144 | return iteratorFor(items) 145 | }; 146 | 147 | Headers.prototype.values = function() { 148 | var items = []; 149 | this.forEach(function(value) { 150 | items.push(value); 151 | }); 152 | return iteratorFor(items) 153 | }; 154 | 155 | Headers.prototype.entries = function() { 156 | var items = []; 157 | this.forEach(function(value, name) { 158 | items.push([name, value]); 159 | }); 160 | return iteratorFor(items) 161 | }; 162 | 163 | if (support.iterable) { 164 | Headers.prototype[Symbol.iterator] = Headers.prototype.entries; 165 | } 166 | 167 | function consumed(body) { 168 | if (body.bodyUsed) { 169 | return Promise.reject(new TypeError('Already read')) 170 | } 171 | body.bodyUsed = true; 172 | } 173 | 174 | function fileReaderReady(reader) { 175 | return new Promise(function(resolve, reject) { 176 | reader.onload = function() { 177 | resolve(reader.result); 178 | }; 179 | reader.onerror = function() { 180 | reject(reader.error); 181 | }; 182 | }) 183 | } 184 | 185 | function readBlobAsArrayBuffer(blob) { 186 | var reader = new FileReader(); 187 | var promise = fileReaderReady(reader); 188 | reader.readAsArrayBuffer(blob); 189 | return promise 190 | } 191 | 192 | function readBlobAsText(blob) { 193 | var reader = new FileReader(); 194 | var promise = fileReaderReady(reader); 195 | reader.readAsText(blob); 196 | return promise 197 | } 198 | 199 | function readArrayBufferAsText(buf) { 200 | var view = new Uint8Array(buf); 201 | var chars = new Array(view.length); 202 | 203 | for (var i = 0; i < view.length; i++) { 204 | chars[i] = String.fromCharCode(view[i]); 205 | } 206 | return chars.join('') 207 | } 208 | 209 | function bufferClone(buf) { 210 | if (buf.slice) { 211 | return buf.slice(0) 212 | } else { 213 | var view = new 
Uint8Array(buf.byteLength); 214 | view.set(new Uint8Array(buf)); 215 | return view.buffer 216 | } 217 | } 218 | 219 | function Body() { 220 | this.bodyUsed = false; 221 | 222 | this._initBody = function(body) { 223 | /* 224 | fetch-mock wraps the Response object in an ES6 Proxy to 225 | provide useful test harness features such as flush. However, on 226 | ES5 browsers without fetch or Proxy support pollyfills must be used; 227 | the proxy-pollyfill is unable to proxy an attribute unless it exists 228 | on the object before the Proxy is created. This change ensures 229 | Response.bodyUsed exists on the instance, while maintaining the 230 | semantic of setting Request.bodyUsed in the constructor before 231 | _initBody is called. 232 | */ 233 | this.bodyUsed = this.bodyUsed; 234 | this._bodyInit = body; 235 | if (!body) { 236 | this._bodyText = ''; 237 | } else if (typeof body === 'string') { 238 | this._bodyText = body; 239 | } else if (support.blob && Blob.prototype.isPrototypeOf(body)) { 240 | this._bodyBlob = body; 241 | } else if (support.formData && FormData.prototype.isPrototypeOf(body)) { 242 | this._bodyFormData = body; 243 | } else if (support.searchParams && URLSearchParams.prototype.isPrototypeOf(body)) { 244 | this._bodyText = body.toString(); 245 | } else if (support.arrayBuffer && support.blob && isDataView(body)) { 246 | this._bodyArrayBuffer = bufferClone(body.buffer); 247 | // IE 10-11 can't handle a DataView body. 248 | this._bodyInit = new Blob([this._bodyArrayBuffer]); 249 | } else if (support.arrayBuffer && (ArrayBuffer.prototype.isPrototypeOf(body) || isArrayBufferView(body))) { 250 | this._bodyArrayBuffer = bufferClone(body); 251 | } else { 252 | this._bodyText = body = Object.prototype.toString.call(body); 253 | } 254 | 255 | if (!this.headers.get('content-type')) { 256 | if (typeof body === 'string') { 257 | this.headers.set('content-type', 'text/plain;charset=UTF-8'); 258 | } else if (this._bodyBlob && this._bodyBlob.type) { 259 | this.headers.set('content-type', this._bodyBlob.type); 260 | } else if (support.searchParams && URLSearchParams.prototype.isPrototypeOf(body)) { 261 | this.headers.set('content-type', 'application/x-www-form-urlencoded;charset=UTF-8'); 262 | } 263 | } 264 | }; 265 | 266 | if (support.blob) { 267 | this.blob = function() { 268 | var rejected = consumed(this); 269 | if (rejected) { 270 | return rejected 271 | } 272 | 273 | if (this._bodyBlob) { 274 | return Promise.resolve(this._bodyBlob) 275 | } else if (this._bodyArrayBuffer) { 276 | return Promise.resolve(new Blob([this._bodyArrayBuffer])) 277 | } else if (this._bodyFormData) { 278 | throw new Error('could not read FormData body as blob') 279 | } else { 280 | return Promise.resolve(new Blob([this._bodyText])) 281 | } 282 | }; 283 | 284 | this.arrayBuffer = function() { 285 | if (this._bodyArrayBuffer) { 286 | var isConsumed = consumed(this); 287 | if (isConsumed) { 288 | return isConsumed 289 | } 290 | if (ArrayBuffer.isView(this._bodyArrayBuffer)) { 291 | return Promise.resolve( 292 | this._bodyArrayBuffer.buffer.slice( 293 | this._bodyArrayBuffer.byteOffset, 294 | this._bodyArrayBuffer.byteOffset + this._bodyArrayBuffer.byteLength 295 | ) 296 | ) 297 | } else { 298 | return Promise.resolve(this._bodyArrayBuffer) 299 | } 300 | } else { 301 | return this.blob().then(readBlobAsArrayBuffer) 302 | } 303 | }; 304 | } 305 | 306 | this.text = function() { 307 | var rejected = consumed(this); 308 | if (rejected) { 309 | return rejected 310 | } 311 | 312 | if (this._bodyBlob) { 313 | return 
readBlobAsText(this._bodyBlob) 314 | } else if (this._bodyArrayBuffer) { 315 | return Promise.resolve(readArrayBufferAsText(this._bodyArrayBuffer)) 316 | } else if (this._bodyFormData) { 317 | throw new Error('could not read FormData body as text') 318 | } else { 319 | return Promise.resolve(this._bodyText) 320 | } 321 | }; 322 | 323 | if (support.formData) { 324 | this.formData = function() { 325 | return this.text().then(decode) 326 | }; 327 | } 328 | 329 | this.json = function() { 330 | return this.text().then(JSON.parse) 331 | }; 332 | 333 | return this 334 | } 335 | 336 | // HTTP methods whose capitalization should be normalized 337 | var methods = ['DELETE', 'GET', 'HEAD', 'OPTIONS', 'POST', 'PUT']; 338 | 339 | function normalizeMethod(method) { 340 | var upcased = method.toUpperCase(); 341 | return methods.indexOf(upcased) > -1 ? upcased : method 342 | } 343 | 344 | function Request(input, options) { 345 | if (!(this instanceof Request)) { 346 | throw new TypeError('Please use the "new" operator, this DOM object constructor cannot be called as a function.') 347 | } 348 | 349 | options = options || {}; 350 | var body = options.body; 351 | 352 | if (input instanceof Request) { 353 | if (input.bodyUsed) { 354 | throw new TypeError('Already read') 355 | } 356 | this.url = input.url; 357 | this.credentials = input.credentials; 358 | if (!options.headers) { 359 | this.headers = new Headers(input.headers); 360 | } 361 | this.method = input.method; 362 | this.mode = input.mode; 363 | this.signal = input.signal; 364 | if (!body && input._bodyInit != null) { 365 | body = input._bodyInit; 366 | input.bodyUsed = true; 367 | } 368 | } else { 369 | this.url = String(input); 370 | } 371 | 372 | this.credentials = options.credentials || this.credentials || 'same-origin'; 373 | if (options.headers || !this.headers) { 374 | this.headers = new Headers(options.headers); 375 | } 376 | this.method = normalizeMethod(options.method || this.method || 'GET'); 377 | this.mode = options.mode || this.mode || null; 378 | this.signal = options.signal || this.signal; 379 | this.referrer = null; 380 | 381 | if ((this.method === 'GET' || this.method === 'HEAD') && body) { 382 | throw new TypeError('Body not allowed for GET or HEAD requests') 383 | } 384 | this._initBody(body); 385 | 386 | if (this.method === 'GET' || this.method === 'HEAD') { 387 | if (options.cache === 'no-store' || options.cache === 'no-cache') { 388 | // Search for a '_' parameter in the query string 389 | var reParamSearch = /([?&])_=[^&]*/; 390 | if (reParamSearch.test(this.url)) { 391 | // If it already exists then set the value with the current time 392 | this.url = this.url.replace(reParamSearch, '$1_=' + new Date().getTime()); 393 | } else { 394 | // Otherwise add a new '_' parameter to the end with the current time 395 | var reQueryString = /\?/; 396 | this.url += (reQueryString.test(this.url) ? 
'&' : '?') + '_=' + new Date().getTime(); 397 | } 398 | } 399 | } 400 | } 401 | 402 | Request.prototype.clone = function() { 403 | return new Request(this, {body: this._bodyInit}) 404 | }; 405 | 406 | function decode(body) { 407 | var form = new FormData(); 408 | body 409 | .trim() 410 | .split('&') 411 | .forEach(function(bytes) { 412 | if (bytes) { 413 | var split = bytes.split('='); 414 | var name = split.shift().replace(/\+/g, ' '); 415 | var value = split.join('=').replace(/\+/g, ' '); 416 | form.append(decodeURIComponent(name), decodeURIComponent(value)); 417 | } 418 | }); 419 | return form 420 | } 421 | 422 | function parseHeaders(rawHeaders) { 423 | var headers = new Headers(); 424 | // Replace instances of \r\n and \n followed by at least one space or horizontal tab with a space 425 | // https://tools.ietf.org/html/rfc7230#section-3.2 426 | var preProcessedHeaders = rawHeaders.replace(/\r?\n[\t ]+/g, ' '); 427 | preProcessedHeaders.split(/\r?\n/).forEach(function(line) { 428 | var parts = line.split(':'); 429 | var key = parts.shift().trim(); 430 | if (key) { 431 | var value = parts.join(':').trim(); 432 | headers.append(key, value); 433 | } 434 | }); 435 | return headers 436 | } 437 | 438 | Body.call(Request.prototype); 439 | 440 | function Response(bodyInit, options) { 441 | if (!(this instanceof Response)) { 442 | throw new TypeError('Please use the "new" operator, this DOM object constructor cannot be called as a function.') 443 | } 444 | if (!options) { 445 | options = {}; 446 | } 447 | 448 | this.type = 'default'; 449 | this.status = options.status === undefined ? 200 : options.status; 450 | this.ok = this.status >= 200 && this.status < 300; 451 | this.statusText = 'statusText' in options ? options.statusText : ''; 452 | this.headers = new Headers(options.headers); 453 | this.url = options.url || ''; 454 | this._initBody(bodyInit); 455 | } 456 | 457 | Body.call(Response.prototype); 458 | 459 | Response.prototype.clone = function() { 460 | return new Response(this._bodyInit, { 461 | status: this.status, 462 | statusText: this.statusText, 463 | headers: new Headers(this.headers), 464 | url: this.url 465 | }) 466 | }; 467 | 468 | Response.error = function() { 469 | var response = new Response(null, {status: 0, statusText: ''}); 470 | response.type = 'error'; 471 | return response 472 | }; 473 | 474 | var redirectStatuses = [301, 302, 303, 307, 308]; 475 | 476 | Response.redirect = function(url, status) { 477 | if (redirectStatuses.indexOf(status) === -1) { 478 | throw new RangeError('Invalid status code') 479 | } 480 | 481 | return new Response(null, {status: status, headers: {location: url}}) 482 | }; 483 | 484 | exports.DOMException = global.DOMException; 485 | try { 486 | new exports.DOMException(); 487 | } catch (err) { 488 | exports.DOMException = function(message, name) { 489 | this.message = message; 490 | this.name = name; 491 | var error = Error(message); 492 | this.stack = error.stack; 493 | }; 494 | exports.DOMException.prototype = Object.create(Error.prototype); 495 | exports.DOMException.prototype.constructor = exports.DOMException; 496 | } 497 | 498 | function fetch(input, init) { 499 | return new Promise(function(resolve, reject) { 500 | var request = new Request(input, init); 501 | 502 | if (request.signal && request.signal.aborted) { 503 | return reject(new exports.DOMException('Aborted', 'AbortError')) 504 | } 505 | 506 | var xhr = new XMLHttpRequest(); 507 | 508 | function abortXhr() { 509 | xhr.abort(); 510 | } 511 | 512 | xhr.onload = function() { 513 | var 
options = { 514 | status: xhr.status, 515 | statusText: xhr.statusText, 516 | headers: parseHeaders(xhr.getAllResponseHeaders() || '') 517 | }; 518 | options.url = 'responseURL' in xhr ? xhr.responseURL : options.headers.get('X-Request-URL'); 519 | var body = 'response' in xhr ? xhr.response : xhr.responseText; 520 | setTimeout(function() { 521 | resolve(new Response(body, options)); 522 | }, 0); 523 | }; 524 | 525 | xhr.onerror = function() { 526 | setTimeout(function() { 527 | reject(new TypeError('Network request failed')); 528 | }, 0); 529 | }; 530 | 531 | xhr.ontimeout = function() { 532 | setTimeout(function() { 533 | reject(new TypeError('Network request failed')); 534 | }, 0); 535 | }; 536 | 537 | xhr.onabort = function() { 538 | setTimeout(function() { 539 | reject(new exports.DOMException('Aborted', 'AbortError')); 540 | }, 0); 541 | }; 542 | 543 | function fixUrl(url) { 544 | try { 545 | return url === '' && global.location.href ? global.location.href : url 546 | } catch (e) { 547 | return url 548 | } 549 | } 550 | 551 | xhr.open(request.method, fixUrl(request.url), true); 552 | 553 | if (request.credentials === 'include') { 554 | xhr.withCredentials = true; 555 | } else if (request.credentials === 'omit') { 556 | xhr.withCredentials = false; 557 | } 558 | 559 | if ('responseType' in xhr) { 560 | if (support.blob) { 561 | xhr.responseType = 'blob'; 562 | } else if ( 563 | support.arrayBuffer && 564 | request.headers.get('Content-Type') && 565 | request.headers.get('Content-Type').indexOf('application/octet-stream') !== -1 566 | ) { 567 | xhr.responseType = 'arraybuffer'; 568 | } 569 | } 570 | 571 | if (init && typeof init.headers === 'object' && !(init.headers instanceof Headers)) { 572 | Object.getOwnPropertyNames(init.headers).forEach(function(name) { 573 | xhr.setRequestHeader(name, normalizeValue(init.headers[name])); 574 | }); 575 | } else { 576 | request.headers.forEach(function(value, name) { 577 | xhr.setRequestHeader(name, value); 578 | }); 579 | } 580 | 581 | if (request.signal) { 582 | request.signal.addEventListener('abort', abortXhr); 583 | 584 | xhr.onreadystatechange = function() { 585 | // DONE (success or failure) 586 | if (xhr.readyState === 4) { 587 | request.signal.removeEventListener('abort', abortXhr); 588 | } 589 | }; 590 | } 591 | 592 | xhr.send(typeof request._bodyInit === 'undefined' ? 
null : request._bodyInit); 593 | }) 594 | } 595 | 596 | fetch.polyfill = true; 597 | 598 | if (!global.fetch) { 599 | global.fetch = fetch; 600 | global.Headers = Headers; 601 | global.Request = Request; 602 | global.Response = Response; 603 | } 604 | 605 | exports.Headers = Headers; 606 | exports.Request = Request; 607 | exports.Response = Response; 608 | exports.fetch = fetch; 609 | 610 | Object.defineProperty(exports, '__esModule', { value: true }); 611 | 612 | }))); 613 | -------------------------------------------------------------------------------- /application/static/main.css: -------------------------------------------------------------------------------- 1 | html, body, canvas { 2 | display: block; 3 | margin: 0; 4 | padding: 0; 5 | width: 100%; 6 | height: 100%; 7 | } 8 | 9 | form.upload { 10 | padding: 32px; 11 | } 12 | 13 | .container { 14 | padding: 10px; 15 | } 16 | 17 | h1.logo { 18 | margin: 0; 19 | font-weight: normal; 20 | } 21 | 22 | .progress { 23 | width: 500px; 24 | height: 24px; 25 | border: solid 1px gray; 26 | border-radius: 2px; 27 | } 28 | 29 | .progress .percentage { 30 | position: absolute; 31 | width: 500px; 32 | text-align: center; 33 | } 34 | 35 | .progress .bar { 36 | height: 24px; 37 | border-right: solid 1px #aaa; 38 | background: #eee; 39 | width: 0px; 40 | } 41 | 42 | .bimsurfer-static-tree, .bimsurfer-metadata { 43 | overflow: hidden; 44 | overflow-y: scroll; 45 | } 46 | 47 | .bimsurfer-static-tree, .bimsurfer-static-tree * { 48 | user-select: none; 49 | -ms-user-select: none; 50 | -moz-user-select: none; 51 | -webkit-user-select: none; 52 | z-index: 1; 53 | } 54 | 55 | .bimsurfer-static-tree div.item { 56 | border-top: solid 1px #eee; 57 | line-height: 12pt; 58 | } 59 | 60 | .bimsurfer-static-tree > div > div.item { 61 | border: none; 62 | } 63 | 64 | .bimsurfer-static-tree div.bimsurfer-tree-label { 65 | margin-left: -100px; 66 | padding-left: 100px; 67 | cursor: pointer; 68 | white-space: nowrap; 69 | } 70 | 71 | .bimsurfer-static-tree .bimsurfer-tree-children-with-indent { 72 | padding-left: 10px; 73 | } 74 | 75 | .bimsurfer-static-tree div.selected { 76 | background: #6ae; 77 | color: white; 78 | } 79 | 80 | .bimsurfer-static-tree div.item, 81 | .bimsurfer-static-tree div.item i { 82 | font-size: 15px !important; 83 | line-height: 24px !important; 84 | } 85 | 86 | .bimsurfer-static-tree div.bimsurfer-tree-label .collapse:before { 87 | content: "\e5ce "; 88 | } 89 | 90 | .bimsurfer-static-tree div.bimsurfer-tree-node-collapsed .collapse:before { 91 | content: "\e5cf "; 92 | } 93 | 94 | .bimsurfer-static-tree div.bimsurfer-tree-node-collapsed .item { 95 | display: none; 96 | } 97 | 98 | .material-icons { 99 | color: #444; 100 | } 101 | 102 | .bimsurfer-static-tree i.icon.material-icons { 103 | font-size: 24px !important; 104 | margin-right: 0.3em; 105 | vertical-align: -5px; 106 | } 107 | 108 | .bimsurfer-static-tree .bimsurfer-tree-eye { 109 | float: right; 110 | padding: 0 0.5em; 111 | display: inline-block; 112 | background: white; 113 | z-index: 1000; 114 | position: relative; 115 | } 116 | 117 | .bimsurfer-static-tree .bimsurfer-tree-eye:before { 118 | content: "\e8f4"; 119 | } 120 | 121 | .bimsurfer-tree-column { 122 | display: inline-block; 123 | overflow: hidden; 124 | } 125 | 126 | i.bimsurfer-tree-eye-off { 127 | opacity: 0.2 !important; 128 | } 129 | 130 | .bimsurfer-metadata h3 { 131 | font-weight: bold; 132 | margin: 0; 133 | padding: 10px 0 0 0; 134 | width: 1000px; 135 | } 136 | 137 | .bimsurfer-metadata table { 138 | width: 100%; 
139 | } 140 | 141 | .bimsurfer-metadata h3:before { 142 | content: "\25b8 "; 143 | } 144 | 145 | .bimsurfer-metadata th, 146 | .bimsurfer-metadata td { 147 | width: 50%; 148 | border-bottom: solid 1px #eee; 149 | text-align: left; 150 | } 151 | 152 | .bimsurfer-metadata td:nth-child(1), 153 | .bimsurfer-metadata th:nth-child(1) { 154 | width: 30%; 155 | } 156 | 157 | .bimsurfer-metadata td:nth-child(2), 158 | .bimsurfer-metadata th:nth-child(2) { 159 | width: 70%; 160 | } 161 | 162 | .bimsurfer-metadata, 163 | .bimsurfer-metadata h3, 164 | .bimsurfer-static-tree { 165 | font-size: 10pt; 166 | } 167 | 168 | .spinner { 169 | position: fixed; 170 | top: calc(50% - 64px); 171 | left: calc(50% - 64px); 172 | width: 128px; 173 | height: 128px; 174 | border: solid 4px gray; 175 | border-bottom-color: white; 176 | border-radius: 50%; 177 | animation: spin 2s linear infinite; 178 | z-index: 9999; 179 | } 180 | 181 | @keyframes spin { 182 | to { 183 | transform: rotate(360deg); 184 | } 185 | } 186 | 187 | .log table { 188 | border-collapse: collapse; 189 | border-radius: 16px; 190 | overflow: hidden; 191 | width: 100%; 192 | border: 0; 193 | } 194 | .log td { 195 | padding-left: 10px; 196 | } 197 | .log tr { 198 | background: #f5f5f5; 199 | } 200 | .log table thead tr { 201 | height: 60px; 202 | background: #ccc; 203 | } 204 | .log table tbody tr { 205 | height: 50px; 206 | } 207 | .log th, .log td { 208 | border: none; 209 | } 210 | .log tr:nth-child(2n) { 211 | background-color: #f0f0f0; 212 | } 213 | 214 | .start_sharing, 215 | .listen_mode { 216 | position: fixed; 217 | bottom: 50px; 218 | left: 50%; 219 | width: 380px; 220 | margin-left: -220px; 221 | border-radius: 10px; 222 | background: green; 223 | padding: 10px 30px; 224 | color: white; 225 | font-size: 20px; 226 | cursor: pointer; 227 | text-align: center; 228 | } 229 | 230 | .start_sharing:hover { 231 | background: rgb(0, 170, 0); 232 | } 233 | 234 | .listen_mode { 235 | background: orange; 236 | cursor: default; 237 | } 238 | 239 | .start_sharing i, 240 | .listen_mode i { 241 | color: white; 242 | vertical-align: -6px; 243 | padding-right: 10px; 244 | } 245 | 246 | div.start_sharing > span.url > i { 247 | vertical-align: 8px; 248 | } 249 | 250 | .start_sharing pre { 251 | display: inline-block; 252 | font-size: 70%; 253 | overflow-wrap: anywhere; 254 | width: 300px; 255 | white-space: normal; 256 | } 257 | 258 | .start_sharing .action, 259 | .listen_mode .action { 260 | display: block; 261 | } 262 | 263 | .start_sharing .url { 264 | border: solid 1px #666; 265 | background: #eee9; 266 | color: #444; 267 | padding: 2px 12px; 268 | border-radius: 8px; 269 | display: none; 270 | } 271 | 272 | .bimsurfer-static-tree .controls div { 273 | display: inline-block; 274 | } 275 | 276 | .bimsurfer-static-tree .controls div i { 277 | margin: 8px; 278 | padding: 4px; 279 | border: solid 1px #ccc; 280 | border-radius: 2px; 281 | cursor: pointer; 282 | } 283 | 284 | .bimsurfer-static-tree .controls div.checked i { 285 | border: solid 1px #bbb; 286 | background: #eee; 287 | box-shadow: inset 1px 1px 2px 2px rgba(0,0,0,0.15); 288 | } 289 | 290 | .bimsurfer-static-tree .controls div i:hover { 291 | background: #ddd; 292 | border-color: #aaa; 293 | } 294 | 295 | .bimsurfer-static-tree .number-occurrences { 296 | padding-left: 4px; 297 | font-size: 80%; 298 | } 299 | 300 | .bimsurfer-static-tree .number-occurrences:before { 301 | content: '('; 302 | } 303 | 304 | .bimsurfer-static-tree .number-occurrences:after { 305 | content: ')'; 306 | } 307 | 
-------------------------------------------------------------------------------- /application/static/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "static", 3 | "version": "1.0.0", 4 | "description": "", 5 | "main": "App.built.es2015.js", 6 | "directories": { 7 | "lib": "lib" 8 | }, 9 | "scripts": { 10 | "test": "echo \"Error: no test specified\" && exit 1" 11 | }, 12 | "keywords": [], 13 | "author": "", 14 | "devDependencies": { 15 | "webpack": "^5.72.1", 16 | "webpack-cli": "^4.9.2" 17 | } 18 | } 19 | -------------------------------------------------------------------------------- /application/static/webpack.config.js: -------------------------------------------------------------------------------- 1 | const path = require('path'); 2 | 3 | module.exports = env => ({ 4 | entry: './App.js', 5 | resolve: { 6 | alias: { 7 | './bimsurfer': path.resolve(__dirname, '../bimsurfer'), 8 | 'three': path.resolve(__dirname, '../bimsurfer/src/v2/bimsurfer/lib/three/r140/three.module.js'), 9 | } 10 | }, 11 | output: { 12 | filename: `App.${env.postfix}.js`, 13 | path: path.resolve(__dirname, '.'), 14 | }, 15 | }); 16 | -------------------------------------------------------------------------------- /application/templates/error.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | IfcOpenShell viewer 5 | 6 | 7 | 8 | 9 | 10 | 11 |
12 | Oops 13 | There was an error processing your request with ID: {{ id }} 14 |
15 | 16 | 17 | 18 | -------------------------------------------------------------------------------- /application/templates/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | IfcOpenShell viewer 6 | 7 | 8 | 9 | 15 | 16 | 17 | {{ dropzone.load_css() }} 18 | {{ dropzone.style('border: 2px dashed #0087F7; margin: 20px 10%; min-height: 400px;') }} 19 | 20 | 50 | 51 | 52 |

IfcOpenShell viewer
53 | 54 | {{ dropzone.create('put_main') }} 55 | {{ dropzone.load_js() }} 56 | 57 | 83 | 84 | 85 | 86 | 87 | 88 | -------------------------------------------------------------------------------- /application/templates/log.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | IfcOpenShell viewer 5 | 6 | 7 | 8 | 9 | 10 | 11 |
12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | {% for item in log %} 23 | 24 | {% for row in item %} 25 | 26 | {% endfor %} 27 | 28 | {% endfor %} 29 | 30 |
Severity Message Instance Element
{{ row[0]|upper + row[1:] }}
31 |
32 | 33 | 34 | 35 | -------------------------------------------------------------------------------- /application/templates/progress.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | IfcOpenShell viewer 6 | 7 | 8 | 9 | 16 | 17 | 18 | 19 | 20 |
21 |

IfcOpenShell viewer
22 | Conversion in progress... 23 |
24 |
25 |
26 |
27 |
28 | 51 | 52 | 53 | 54 | -------------------------------------------------------------------------------- /application/templates/viewer.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | IfcOpenShell viewer 6 | 7 | 8 | 9 | 10 | 21 | 22 | 23 | 24 | 30 | 31 | 32 | 33 | 40 | 41 | 42 | 120 | 121 | 122 | 123 |
124 |
125 | 126 |
127 |
128 | 130 | 133 | 134 | 145 | 146 | {% if with_screen_share %} 147 | 148 | {% if mode == 'listen' %} 149 | 150 |
151 | ondemand_video Live view 152 |
153 | 154 | 159 | 160 | {% else %} 161 | 162 | 177 | 178 |
179 | ondemand_video Start sharing 180 | content_copy
/live/{{ id }}/{{ live_share_id }}
181 |
182 | 183 | {% endif %} 184 | 185 | {% endif %} 186 | 187 |
188 | video_settings Toggle engine 189 |
190 | 191 | 200 | 201 | 202 | 203 | -------------------------------------------------------------------------------- /application/utils.py: -------------------------------------------------------------------------------- 1 | ################################################################################## 2 | # # 3 | # Copyright (c) 2020 AECgeeks # 4 | # # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy # 6 | # of this software and associated documentation files (the "Software"), to deal # 7 | # in the Software without restriction, including without limitation the rights # 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # 9 | # copies of the Software, and to permit persons to whom the Software is # 10 | # furnished to do so, subject to the following conditions: # 11 | # # 12 | # The above copyright notice and this permission notice shall be included in all # 13 | # copies or substantial portions of the Software. # 14 | # # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # 21 | # SOFTWARE. # 22 | # # 23 | ################################################################################## 24 | 25 | import os 26 | import string 27 | import tempfile 28 | 29 | from random import SystemRandom 30 | choice = lambda seq: SystemRandom().choice(seq) 31 | letter_set = set(string.ascii_letters) 32 | 33 | STORAGE_DIR = os.environ.get("MODEL_DIR", tempfile.gettempdir()) 34 | 35 | def generate_id(): 36 | return "".join(choice(string.ascii_letters) for i in range(32)) 37 | 38 | 39 | def storage_dir_for_id(id): 40 | id = id.split("_")[0] 41 | return os.path.join(STORAGE_DIR, id[0:1], id[0:2], id[0:3], id) 42 | 43 | 44 | def storage_file_for_id(id, ext): 45 | return os.path.join(storage_dir_for_id(id), id + "." + ext) 46 | 47 | 48 | def validate_id(id): 49 | id_num = id.split("_") 50 | 51 | if len(id_num) == 1: 52 | id = id_num[0] 53 | elif len(id_num) == 2: 54 | id, num = id_num 55 | num = str(int(num)) 56 | else: 57 | return False 58 | 59 | return len(set(id) - set(string.ascii_letters)) == 0 60 | 61 | -------------------------------------------------------------------------------- /application/worker.py: -------------------------------------------------------------------------------- 1 | ################################################################################## 2 | # # 3 | # Copyright (c) 2020 AECgeeks # 4 | # # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy # 6 | # of this software and associated documentation files (the "Software"), to deal # 7 | # in the Software without restriction, including without limitation the rights # 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # 9 | # copies of the Software, and to permit persons to whom the Software is # 10 | # furnished to do so, subject to the following conditions: # 11 | # # 12 | # The above copyright notice and this permission notice shall be included in all # 13 | # copies or substantial portions of the Software. 
# 14 | # # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # 21 | # SOFTWARE. # 22 | # # 23 | ################################################################################## 24 | 25 | import os 26 | import sys 27 | import json 28 | import glob 29 | import platform 30 | import traceback 31 | import importlib 32 | import subprocess 33 | import tempfile 34 | import operator 35 | import itertools 36 | import shutil 37 | 38 | from multiprocessing import Process 39 | 40 | class ifcopenshell_file_dict(dict): 41 | def __missing__(self, key): 42 | import ifcopenshell 43 | self[key] = ifcopenshell.open(utils.storage_file_for_id(key, 'ifc')) 44 | return self[key] 45 | 46 | import requests 47 | 48 | import utils 49 | import config 50 | import database 51 | 52 | print(f"Using {config.num_threads} threads per worker") 53 | 54 | on_windows = platform.system() == 'Windows' 55 | 56 | def set_progress(id, progress): 57 | session = database.Session() 58 | 59 | id = id.split("_")[0] 60 | 61 | model = session.query(database.model).filter(database.model.code == id).all()[0] 62 | model.progress = int(progress) 63 | session.commit() 64 | session.close() 65 | 66 | 67 | class task(object): 68 | def __init__(self, id, progress_map): 69 | import inspect 70 | print(self.__class__.__name__, inspect.getfile(type(self)), *progress_map) 71 | self.id = id 72 | self.begin, self.end = progress_map 73 | 74 | def sub_progress(self, i): 75 | set_progress(self.id, self.begin + (self.end - self.begin) * i / 100.) 
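# Note: each task reports its own progress as a percentage (0-100); sub_progress rescales that into the task's [begin, end] share of the overall job progress that set_progress writes to the database.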
76 | 77 | def __call__(self, *args): 78 | self.execute(*args) 79 | self.sub_progress(100) 80 | 81 | 82 | class ifc_validation_task(task): 83 | est_time = 1 84 | 85 | def execute(self, context, id): 86 | import ifcopenshell.validate 87 | 88 | logger = ifcopenshell.validate.json_logger() 89 | f = context.models[id] 90 | 91 | ifcopenshell.validate.validate(f, logger) 92 | 93 | with open(os.path.join(context.directory, "log.json"), "w") as f: 94 | print("\n".join(json.dumps(x, default=str) for x in logger.statements), file=f) 95 | 96 | 97 | class xml_generation_task(task): 98 | est_time = 1 99 | 100 | def execute(self, context, id): 101 | import ifcopenshell.geom 102 | f = context.models[id] 103 | sr = ifcopenshell.geom.serializers.xml(f, os.path.join(context.directory, f"{id}.xml")) 104 | sr.finalize() 105 | 106 | 107 | class xml_to_json_conversion(task): 108 | est_time = 1 109 | 110 | def execute(self, context, id): 111 | subprocess.check_call([ 112 | sys.executable, 113 | os.path.join(os.path.dirname(__file__), "process_xml_to_json.py"), 114 | id 115 | ], cwd=context.directory) 116 | 117 | 118 | class geometry_generation_task(task): 119 | est_time = 10 120 | 121 | def execute(self, context, id): 122 | import ifcopenshell.geom 123 | 124 | ifcopenshell.ifcopenshell_wrapper.turn_off_detailed_logging() 125 | ifcopenshell.ifcopenshell_wrapper.set_log_format_json() 126 | 127 | f = context.models[id] 128 | settings = ifcopenshell.geom.settings( 129 | WELD_VERTICES=False, 130 | APPLY_DEFAULT_MATERIALS=True 131 | ) 132 | sr = ifcopenshell.geom.serializers.gltf(utils.storage_file_for_id(id, "glb"), settings) 133 | 134 | sr.writeHeader() 135 | 136 | for progress, elem in ifcopenshell.geom.iterate( 137 | settings, 138 | f, 139 | with_progress=True, 140 | exclude=("IfcSpace", "IfcOpeningElement"), 141 | cache=utils.storage_file_for_id(id, "cache.h5"), 142 | num_threads=config.num_threads 143 | ): 144 | sr.write(elem) 145 | self.sub_progress(progress) 146 | 147 | sr.finalize() 148 | 149 | with open(os.path.join(context.directory, f"log.json"), "w") as f: 150 | f.write(ifcopenshell.get_log()) 151 | 152 | class glb_optimize_task(task): 153 | est_time = 1 154 | 155 | def execute(self, context, id): 156 | try: 157 | if subprocess.call(["gltf-pipeline" + ('.cmd' if on_windows else ''), "-i", id + ".glb", "-o", id + ".optimized.glb", "-b", "-d"], cwd=context.directory) == 0: 158 | os.rename(os.path.join(context.directory, id + ".glb"), os.path.join(context.directory, id + ".unoptimized.glb")) 159 | os.rename(os.path.join(context.directory, id + ".optimized.glb"), os.path.join(context.directory, id + ".glb")) 160 | except FileNotFoundError: 161 | pass 162 | 163 | 164 | class gzip_task(task): 165 | est_time = 1 166 | order = 2000 167 | 168 | def execute(self, context, id): 169 | import gzip 170 | for ext in ["glb", "xml", "svg"]: 171 | fn = os.path.join(context.directory, id + "." 
+ ext) 172 | if os.path.exists(fn): 173 | with open(fn, 'rb') as orig_file: 174 | with gzip.open(fn + ".gz", 'wb') as zipped_file: 175 | zipped_file.writelines(orig_file) 176 | 177 | 178 | class svg_generation_task(task): 179 | est_time = 10 180 | aggregate_model = True 181 | 182 | def execute(self, context): 183 | import ifcopenshell.geom 184 | 185 | settings = ifcopenshell.geom.settings( 186 | INCLUDE_CURVES=True, 187 | EXCLUDE_SOLIDS_AND_SURFACES=False, 188 | APPLY_DEFAULT_MATERIALS=True, 189 | DISABLE_TRIANGULATION=True 190 | ) 191 | 192 | # cache = ifcopenshell.geom.serializers.hdf5("cache.h5", settings) 193 | 194 | sr = ifcopenshell.geom.serializers.svg(utils.storage_file_for_id(self.id, "svg"), settings) 195 | 196 | # @todo determine file to select here or unify building storeys accross files somehow 197 | sr.setFile(context.models[context.input_ids[0]]) 198 | sr.setSectionHeightsFromStoreys() 199 | 200 | sr.setDrawDoorArcs(True) 201 | sr.setPrintSpaceAreas(True) 202 | sr.setPrintSpaceNames(True) 203 | sr.setBoundingRectangle(1024., 1024.) 204 | 205 | """ 206 | sr.setProfileThreshold(128) 207 | sr.setPolygonal(True) 208 | sr.setAlwaysProject(True) 209 | sr.setAutoElevation(True) 210 | """ 211 | 212 | # sr.setAutoSection(True) 213 | 214 | sr.writeHeader() 215 | 216 | for ii in context.input_ids: 217 | f = context.models[ii] 218 | for progress, elem in ifcopenshell.geom.iterate( 219 | settings, 220 | f, 221 | with_progress=True, 222 | exclude=("IfcOpeningElement",), 223 | cache=utils.storage_file_for_id(self.id, "cache.h5"), 224 | num_threads=config.num_threads 225 | ): 226 | try: 227 | sr.write(elem) 228 | except: 229 | print("On %s:" % f[elem.id]) 230 | traceback.print_exc(file=sys.stdout) 231 | self.sub_progress(progress) 232 | 233 | sr.finalize() 234 | 235 | 236 | class task_execution_context: 237 | 238 | def __init__(self, id): 239 | self.id = id 240 | self.directory = utils.storage_dir_for_id(id) 241 | self.input_files = [name for name in os.listdir(self.directory) if os.path.isfile(os.path.join(self.directory, name)) and name.endswith(".ifc")] 242 | self.models = ifcopenshell_file_dict() 243 | 244 | tasks = [ 245 | ifc_validation_task, 246 | xml_generation_task, 247 | xml_to_json_conversion, 248 | geometry_generation_task, 249 | glb_optimize_task, 250 | gzip_task 251 | ] 252 | 253 | tasks_on_aggregate = [ 254 | svg_generation_task 255 | ] 256 | 257 | self.is_multiple = any("_" in n for n in self.input_files) 258 | 259 | self.n_files = len(self.input_files) 260 | 261 | self.input_ids = ["%s_%d" % (self.id, i) if self.is_multiple else self.id for i in range(self.n_files)] 262 | 263 | """ 264 | # Create a file called task_print.py with the following 265 | # example content to add application-specific tasks 266 | 267 | import sys 268 | 269 | from worker import task as base_task 270 | 271 | class task(base_task): 272 | est_time = 1 273 | 274 | def execute(self, context): 275 | print("Executing task 'print' on ", context.id, ' in ', context.directory, file=sys.stderr) 276 | """ 277 | 278 | for fn in glob.glob("task_*.py"): 279 | mdl = importlib.import_module(fn.split('.')[0]) 280 | if getattr(mdl.task, 'aggregate_model', False): 281 | tasks_on_aggregate.append(mdl.task) 282 | else: 283 | tasks.append(mdl.task) 284 | 285 | self.tasks = list(filter(config.task_enabled, tasks)) 286 | self.tasks_on_aggregate = list(filter(config.task_enabled, tasks_on_aggregate)) 287 | 288 | self.tasks.sort(key=lambda t: getattr(t, 'order', 10)) 289 | self.tasks_on_aggregate.sort(key=lambda t: getattr(t, 
'order', 10)) 290 | 291 | 292 | def run(self): 293 | elapsed = 0 294 | set_progress(self.id, elapsed) 295 | 296 | 297 | total_est_time = \ 298 | sum(map(operator.attrgetter('est_time'), self.tasks)) * self.n_files + \ 299 | sum(map(operator.attrgetter('est_time'), self.tasks_on_aggregate)) 300 | 301 | def run_task(t, *args, aggregate_model=False): 302 | nonlocal elapsed 303 | begin_end = (elapsed / total_est_time * 99, (elapsed + t.est_time) / total_est_time * 99) 304 | task = t(self.id, begin_end) 305 | try: 306 | task(self, *args) 307 | except: 308 | traceback.print_exc(file=sys.stdout) 309 | # Mark ID as failed 310 | with open(os.path.join(self.directory, 'failed'), 'w') as f: 311 | pass 312 | return False 313 | elapsed += t.est_time 314 | return True 315 | 316 | with_failure = False 317 | 318 | for t, ii in itertools.product(self.tasks, self.input_ids): 319 | if not run_task(t, ii): 320 | with_failure = True 321 | break 322 | 323 | if not with_failure: 324 | for t in self.tasks_on_aggregate: 325 | run_task(t, aggregate_model=True) 326 | 327 | elapsed = 100 328 | set_progress(self.id, elapsed) 329 | 330 | 331 | def do_process(id): 332 | tec = task_execution_context(id) 333 | # for local development 334 | # tec.run() 335 | 336 | p = Process(target=task_execution_context.run, args=(tec,)) 337 | p.start() 338 | p.join() 339 | if p.exitcode != 0: 340 | raise RuntimeError() 341 | 342 | 343 | def process(id, callback_url): 344 | try: 345 | do_process(id) 346 | status = "success" 347 | except Exception as e: 348 | traceback.print_exc(file=sys.stdout) 349 | status = "failure" 350 | 351 | if callback_url is not None: 352 | r = requests.post(callback_url, data={"status": status, "id": id}) 353 | 354 | -------------------------------------------------------------------------------- /application/wsgi.py: -------------------------------------------------------------------------------- 1 | from main import application 2 | 3 | if __name__ == "__main__": 4 | application.run() 5 | -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3' 2 | 3 | services: 4 | db: 5 | image: postgres:12.1-alpine 6 | expose: 7 | - "5432" 8 | volumes: 9 | - ./docker-volumes/db:/var/lib/postgresql/data 10 | environment: 11 | - POSTGRES_PASSWORD=postgres 12 | 13 | redis: 14 | image: redis:5.0.7-alpine 15 | expose: 16 | - "6379" 17 | volumes: 18 | - ./docker-volumes/redis:/data 19 | 20 | frontend: 21 | build: 22 | context: '.' 
23 | dockerfile: application/Dockerfile 24 | entrypoint: sh -c 'python3 database.py; gunicorn --bind 0.0.0.0:5000 --worker-class gevent -w 8 --access-logfile - --error-logfile - wsgi' 25 | environment: 26 | - MODEL_DIR=/data 27 | - REDIS_HOST=redis 28 | - POSTGRES_HOST=db 29 | expose: 30 | - "5000" 31 | depends_on: 32 | - redis 33 | - db 34 | volumes: 35 | - ./models:/data 36 | 37 | nginx: 38 | image: nginx:1.17-alpine 39 | depends_on: 40 | - frontend 41 | ports: 42 | - "80:80" 43 | volumes: 44 | - ./nginx:/etc/nginx/conf.d 45 | - ./docker-volumes/certbot/conf:/etc/letsencrypt 46 | - ./docker-volumes/certbot/www:/var/www/certbot 47 | entrypoint: "/bin/sh -c \"nginx -g 'daemon off;' & while true; do nginx -t && nginx -s reload; sleep 1h; done\"" 48 | 49 | 50 | certbot: 51 | ports: [] 52 | image: certbot/certbot 53 | volumes: 54 | - ./docker-volumes/certbot/conf:/etc/letsencrypt 55 | - ./docker-volumes/certbot/www:/var/www/certbot 56 | entrypoint: "/bin/sh -c 'trap exit INT TERM; while true; do certbot renew --webroot -w /var/www/certbot; sleep 12h; done'" 57 | 58 | worker: 59 | build: 60 | context: '.' 61 | dockerfile: application/Dockerfile 62 | entrypoint: supervisord -n 63 | environment: 64 | - MODEL_DIR=/data 65 | - REDIS_HOST=redis 66 | - POSTGRES_HOST=db 67 | depends_on: 68 | - redis 69 | - db 70 | volumes: 71 | - ./models:/data 72 | -------------------------------------------------------------------------------- /examples/embed/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | IFC pipeline viewer embed API example 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 24 | 25 | 26 | 28 | 51 | 52 | 53 | 77 | 78 | 79 | 80 | 81 | 82 | 83 |
84 |
85 |

Tree view
86 |
87 |
88 | 89 |
90 |

3D view
91 |
92 |
93 | 94 |
95 |

2D view
96 |
97 |
98 | 99 |
100 |

Meta-data view
101 |
102 |
103 |
104 | 105 | 106 | -------------------------------------------------------------------------------- /img/architecture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AECgeeks/ifc-pipeline/4d26c33f79a2b67bd86d9f3f48d1ee7e8333c625/img/architecture.png -------------------------------------------------------------------------------- /init.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | dotcount=`awk -F. '{print NF-1}' <<< "$1"` 4 | if [ $dotcount -eq 1 ]; then 5 | domain="-d $1 -d www.$1" 6 | elif [ $dotcount -eq 2 ]; then 7 | domain="-d $1" 8 | else 9 | echo "usage: init.sh " 10 | exit 1 11 | fi 12 | 13 | sudo docker run -it --rm \ 14 | -v $PWD/docker-volumes/certbot/conf:/etc/letsencrypt \ 15 | -p 80:80 \ 16 | certbot/certbot \ 17 | certonly --standalone \ 18 | --register-unsafely-without-email --agree-tos \ 19 | --cert-name host \ 20 | $domain 21 | 22 | # sudo openssl dhparam -out $PWD/certbot/dh-param/dhparam-2048.pem 2048 23 | -------------------------------------------------------------------------------- /nginx/app.conf: -------------------------------------------------------------------------------- 1 | server { 2 | listen 443 ssl deferred; 3 | 4 | ssl_certificate /etc/letsencrypt/live/host/fullchain.pem; 5 | ssl_certificate_key /etc/letsencrypt/live/host/privkey.pem; 6 | 7 | # https://github.com/certbot/certbot/blob/master/certbot-nginx/certbot_nginx/_internal/tls_configs/options-ssl-nginx.conf 8 | # @todo actually include this 9 | ssl_session_cache shared:le_nginx_SSL:10m; 10 | ssl_session_timeout 1440m; 11 | ssl_session_tickets off; 12 | 13 | ssl_protocols TLSv1.2 TLSv1.3; 14 | ssl_prefer_server_ciphers off; 15 | 16 | ssl_ciphers "ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384"; 17 | 18 | server_name localhost; 19 | client_max_body_size 4G; 20 | keepalive_timeout 5; 21 | 22 | location /.well-known/acme-challenge/ { 23 | root /var/www/certbot; 24 | } 25 | location / { 26 | try_files $uri @app; 27 | } 28 | location @app { 29 | proxy_pass http://frontend:5000; 30 | proxy_redirect off; 31 | 32 | proxy_set_header Host $http_host; 33 | proxy_set_header X-Real-IP $remote_addr; 34 | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; 35 | proxy_set_header X-Forwarded-Proto $scheme; 36 | } 37 | } 38 | 39 | server { 40 | listen 80 default_server; 41 | listen [::]:80 default_server; 42 | server_name _; 43 | return 301 https://$host$request_uri; 44 | } 45 | --------------------------------------------------------------------------------
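Usage note for init.sh: a short sketch of how the certificate bootstrap and the Compose stack above fit together. `example.com` and `viewer.example.com` are placeholder domains, and the compose invocation is the standard one for this docker-compose.yml rather than a repository-specific command.

~~~sh
# Bare domain (one dot): init.sh requests a certificate covering example.com and www.example.com
./init.sh example.com

# Subdomain (two dots): init.sh requests a certificate for viewer.example.com only
./init.sh viewer.example.com

# The certificate is stored under ./docker-volumes/certbot/conf with the name "host",
# which is where nginx/app.conf expects it; the stack can then be started with:
sudo docker-compose up -d --build
~~~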