├── .gitignore
├── LICENSE
├── README.md
├── detection_visualization_util.py
├── gui
│   ├── .gitignore
│   ├── app.js
│   ├── demo_nyc_tracklets.json
│   ├── index.html
│   ├── nodemon.json
│   ├── package.json
│   ├── src
│   │   ├── client
│   │   │   ├── analog-clock-timepicker
│   │   │   │   ├── LICENSE
│   │   │   │   ├── examples
│   │   │   │   │   ├── analog-clock.html
│   │   │   │   │   └── timepicker.html
│   │   │   │   └── libs
│   │   │   │       ├── timepicker.css
│   │   │   │       └── timepicker.js
│   │   │   ├── app_controls.css
│   │   │   ├── cheap-ruler.min.js
│   │   │   ├── format-tracklets.js
│   │   │   ├── mapbox-gl-0.48.0.css
│   │   │   ├── mapbox-gl-dev.js
│   │   │   ├── trips-layer-fragment.glsl.js
│   │   │   ├── trips-layer-vertex.glsl.js
│   │   │   └── trips-layer.js
│   │   └── server
│   │       └── index.js
│   └── webpack.config.js
├── lnglat_homography.py
├── multi_object_tracking.py
├── pyimagesearch
│   ├── __init__.py
│   ├── centroidtracker.py
│   └── trackableobject.py
└── requirements.txt
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | *.egg-info/
24 | .installed.cfg
25 | *.egg
26 | MANIFEST
27 | */node_modules/
28 |
29 | # PyInstaller
30 | # Usually these files are written by a python script from a template
31 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
32 | *.manifest
33 | *.spec
34 |
35 | # Installer logs
36 | pip-log.txt
37 | pip-delete-this-directory.txt
38 |
39 | # Unit test / coverage reports
40 | htmlcov/
41 | .tox/
42 | .coverage
43 | .coverage.*
44 | .cache
45 | nosetests.xml
46 | coverage.xml
47 | *.cover
48 | .hypothesis/
49 | .pytest_cache/
50 |
51 | # Translations
52 | *.mo
53 | *.pot
54 |
55 | # Django stuff:
56 | *.log
57 | local_settings.py
58 | db.sqlite3
59 |
60 | # Flask stuff:
61 | instance/
62 | .webassets-cache
63 |
64 | # Scrapy stuff:
65 | .scrapy
66 |
67 | # Sphinx documentation
68 | docs/_build/
69 |
70 | # PyBuilder
71 | target/
72 |
73 | # Jupyter Notebook
74 | .ipynb_checkpoints
75 |
76 | # pyenv
77 | .python-version
78 |
79 | # celery beat schedule file
80 | celerybeat-schedule
81 |
82 | # SageMath parsed files
83 | *.sage.py
84 |
85 | # Environments
86 | .env
87 | .venv
88 | env/
89 | venv/
90 | ENV/
91 | env.bak/
92 | venv.bak/
93 |
94 | # Spyder project settings
95 | .spyderproject
96 | .spyproject
97 |
98 | # Rope project settings
99 | .ropeproject
100 |
101 | # mkdocs documentation
102 | /site
103 |
104 | # mypy
105 | .mypy_cache/
106 |
107 | # -*- mode: gitignore; -*-
108 | *~
109 | \#*\#
110 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 David Thompson
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Grassland Node_Lite
2 |
3 | ### A Grassland mining node that can run on mini computers or older desktops and laptops. This repo is the client side to the Serverless [Node Lite Object Detection](https://github.com/grasslandnetwork/node_lite_object_detection) AWS Lambda function which handles the node's object detections.
4 |
5 |
6 | If you have questions related to this software, search in the 'Issues' tab of this Github repo to see if it's been answered. If not, feel free to open a new issue and I'll get back to you as soon as I can.
7 |
8 |
9 | ## Step 1: Installation
10 | #### Developed and tested on Ubuntu 16.04 and Raspbian (a rebuild of Debian) 9 Stretch. Requires at least 4 GB of RAM (slower hardware like Raspberry Pis can run the software locally but isn't powerful enough for mining at mainnet speed requirements), Python 3.6 or greater, and Node.js 8.10.0 or greater. It's recommended that you use a Python [virtual environment and virtual environment wrapper](https://docs.python-guide.org/dev/virtualenvs/) to create a separate virtual environment for your package dependencies
11 |
12 |
13 | ### Grassland Node Installation
14 |
15 | Clone this repo on your own machine. 'cd' to the project's root directory and install the required Python packages using
16 |
17 | ```pip install -r requirements.txt```
18 |
19 | #### AWS Credentials
20 | This node version uses [boto3](https://pypi.org/project/boto3/), the Amazon Web Services (AWS) SDK for Python, to communicate with the Serverless [Node Lite Object Detection](https://github.com/grasslandnetwork/node_lite_object_detection) AWS Lambda instance that does the necessary object detections. If you haven't deployed that, please do so now by following the instructions in that repo.
21 |
22 | You should already have your AWS Access Key and AWS Secret Key set as environment variables on your system by following the instructions in the Node Lite Object Detection [README](https://github.com/grasslandnetwork/node_lite_object_detection)
23 |
24 | ```
25 | export AWS_ACCESS_KEY_ID=
26 | export AWS_SECRET_ACCESS_KEY=
27 | export LAMBDA_DETECTION_URL=
28 |
29 | # 'export' command is valid only for unix shells. In Windows - use 'set' instead of 'export'
30 | ```
31 |
32 | You will now need to set two more environment variables: the name of the S3 bucket you'll use to temporarily store the frames from your camera (they're deleted after detection) for the Lambda function's object detection, and your AWS default region (e.g. ```us-east-1```)
33 |
34 | ```
35 | export AWS_DEFAULT_REGION=
36 | export GRASSLAND_FRAME_S3_BUCKET=
37 | ```
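Before starting the node, it can save a confusing failure later to confirm that all of the variables from the steps above are actually set. The helper below is illustrative only (it is not part of this repo); the variable names are the ones listed above:

```python
import os

# Variable names taken from the setup steps above.
REQUIRED_VARS = [
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "LAMBDA_DETECTION_URL",
    "AWS_DEFAULT_REGION",
    "GRASSLAND_FRAME_S3_BUCKET",
]


def missing_env_vars(environ=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]


if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        raise SystemExit("Missing environment variables: " + ", ".join(missing))
    print("All required environment variables are set.")
```

Running this in the same shell session you configured should print nothing but the success line; boto3 reads the AWS keys and region from these variables automatically.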
38 |
39 |
40 | ### Grassland GUI Installation
41 |
42 | You'll need to calibrate your Grassland node the first time you run it by letting it know where the camera it's viewing is positioned in the real world. To do that easily, we'll use the node GUI's (graphical user interface) simulated 3D map of the world to virtually set a position and viewing angle that matches that of the camera in the real world. Your node will use this to automatically calculate the right calibration.
43 |
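For intuition only: the repo's `lnglat_homography.py` filename suggests the calibration amounts to a plane-to-plane homography between pixel coordinates and (longitude, latitude). A minimal sketch of that math (my own illustration, not code from this repo) fits the 3x3 matrix from four point correspondences with the direct linear transform:

```python
import numpy as np


def fit_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H with dst ~ H @ src (direct linear transform).

    src_pts, dst_pts: sequences of (x, y) pairs, at least 4 correspondences.
    """
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (up to scale) is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)


def apply_homography(H, pt):
    """Map a point through H, dividing out the projective scale."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

Given four pixel corners and their known map coordinates, `apply_homography` then converts any tracked pixel position into a map position, which is conceptually what the calibration step establishes.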
44 |
45 | Before we start the node, open a second bash terminal and cd to the project's 'gui' subfolder.
46 |
47 | ```cd gui```
48 |
49 | Then type
50 |
51 | ```npm install```
52 |
53 | To use the map, you will need a free Mapbox Access Token, which you can get here -> https://docs.mapbox.com/help/how-mapbox-works/access-tokens/
54 |
55 | Make a note of your Mapbox token because we'll be using it later.
56 |
57 |
58 | ## Step 2: Run the Software
59 | ### Start The Grassland Node
60 |
61 | Return to the first terminal to start the Grassland node. Type
62 |
63 | ```python multi_object_tracking.py --mode CALIBRATING --display 1 [ --additional-options ]...```
64 |
65 | (See below for additional options)
66 |
67 | The software should start running and then pause while it waits for you to set the calibration through the GUI.
68 |
69 | ### Start The GUI
70 |
71 | Go back to your second terminal in the project's 'gui' directory and type either
72 |
73 |
74 | ```MapboxAccessToken='your-Mapbox-token-here' npm run dev-localhost```
75 | or
76 | ```MapboxAccessToken='your-Mapbox-token-here' npm run dev-external```
77 |
78 | Choose ```npm run dev-localhost``` to ensure your GUI server is only accessible to users on this computer via the loopback (localhost/127.0.0.1) interface
79 |
80 | Choose ```npm run dev-external``` if you want the server to bind to all IPv4 addresses on the local machine, making it also accessible to computers on your Local Area Network if you're behind a router, or to *any computer* on the internet if your computer is not behind a router and is connected directly to the internet
81 |
82 | ### **Unless you know exactly what you're doing and understand the risks involved, it is highly recommended that you choose "npm run dev-localhost"**
83 |
84 | (Instead of typing ```MapboxAccessToken='your-Mapbox-token-here'``` each time you run your GUI, you can add that line to your ~/.bashrc file to make it a permanent environment variable)
85 |
86 |
87 | After typing the above command, Webpack will begin bundling your software and your browser will automatically open to the map via port 3000.
88 |
89 | ## Step 3: Calibrate The Node
90 |
91 | Once the map loads, use your mouse's scroll wheel to zoom and the left and right mouse buttons to drag and rotate the map until you've adjusted your browser's view of the map to match the position and orientation of your camera in the real world. Once you've narrowed it down, click on the 'CALIBRATION' toggle button. The GUI's frame dimensions will adjust to match your camera frame's dimensions. Continue adjusting until your view precisely matches the position and orientation of the real camera.
92 |
93 | As you're adjusting, your node should be receiving new calibration measurements and placing tracked objects on the GUI's map. Continue adjusting while referring to the node's video display until objects tracked in the video display are in their correct positions in the GUI's map.
94 |
95 | In other words, you should have the video window showing the stream from the camera up on your computer screen (because the command you used to start the node included the "--display 1" option). Using your mouse, align the virtual map's viewport so it's looking from exactly the same vantage point (latitude, longitude, altitude, angle, etc.) as the real camera in real life.
96 |
97 | Once that's done, your calibration values should be set inside the node's database. Now click the 'CALIBRATION' toggle button again to turn CALIBRATING mode off.
98 |
99 |
100 | ## Step 4: Restart The Node In 'ONLINE' Mode
101 |
102 | Then return to your first terminal, press Ctrl-C to stop the node, then restart it in the default mode (ONLINE)
103 |
104 | ```python multi_object_tracking.py [ --additional-options ]...``` See additional options below
105 |
106 |
107 |
108 |
109 | ## Multi Object Tracking Command Line Options:
110 |
111 | --mode <ONLINE> | <CALIBRATING> [default: ONLINE] "If ONLINE, data is stored in the main database. CALIBRATING is used for setting the camera's orientation in the map"
112 |
113 | --display <0> | <1> [default: 0] "Displays the input video feed in a window with tracked objects and bounding boxes. Useful for debugging the tracker and object detector. If not needed, leave it off, as it consumes unnecessary computation."
114 |
115 | --picamera <0> | <1> [default: 0] "DEPRECATED: By default, the computer's webcamera is used as input. If running on a Raspberry Pi, set this option to use the Pi's attached camera as input"
116 |
117 | --rotation (<0> | <90> | <180> | <270>) [default: 0] "DEPRECATED: If a Raspberry Pi camera is used for input instead of the webcamera (default), this specifies the camera's clockwise rotation"
118 |
119 | --video "For debugging purposes, a video file can be used as input instead of an attached webcamera (default). This specifies the path to the video file"
120 |
121 | --num_workers <#> [default: 5] "For computers with multi-core CPUs, spreads tasks across separate processes to parallelize work and speed up the software"
122 |
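Taken together, the options above read like the surface of an `argparse` parser. Here is a sketch of how they might be declared (defaults taken from the list above; the actual parser in `multi_object_tracking.py` may differ):

```python
import argparse


def build_parser():
    # Options and defaults mirror the README's option list above.
    p = argparse.ArgumentParser(description="Grassland multi object tracking node")
    p.add_argument("--mode", default="ONLINE", choices=["ONLINE", "CALIBRATING"],
                   help="ONLINE stores data in the main database; CALIBRATING sets camera orientation")
    p.add_argument("--display", type=int, default=0, choices=[0, 1],
                   help="show the input video feed with tracked objects and bounding boxes")
    p.add_argument("--picamera", type=int, default=0, choices=[0, 1],
                   help="DEPRECATED: use a Raspberry Pi camera instead of a webcamera")
    p.add_argument("--rotation", type=int, default=0, choices=[0, 90, 180, 270],
                   help="DEPRECATED: clockwise rotation of the Pi camera")
    p.add_argument("--video", default=None,
                   help="path to a video file used as input instead of a webcamera")
    p.add_argument("--num_workers", type=int, default=5,
                   help="number of worker processes on multi-core CPUs")
    return p


if __name__ == "__main__":
    print(build_parser().parse_args())
```

So, for example, `python multi_object_tracking.py --mode CALIBRATING --display 1` corresponds to `mode="CALIBRATING"` and `display=1`, with every other option at its default.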
123 |
124 | ## Future Grassland Software Improvements
125 | [Link to current list](https://gist.github.com/00hello/0199d393e872ed7645979f5daf7bd62c) of Grassland features and modules that will be built next
126 |
127 |
128 | ## License
129 | #### This project is licensed under the terms of the MIT license.
130 |
--------------------------------------------------------------------------------
/detection_visualization_util.py:
--------------------------------------------------------------------------------
1 | # Modified from https://github.com/tensorflow/models/blob/master/research/object_detection/utils/visualization_utils.py
2 | import collections
3 | import numpy as np
4 | import PIL.Image as Image
5 | import PIL.ImageColor as ImageColor
6 | import PIL.ImageDraw as ImageDraw
7 | import PIL.ImageFont as ImageFont
8 |
9 |
10 | STANDARD_COLORS = [
11 | 'AliceBlue', 'Chartreuse', 'Aqua', 'Aquamarine', 'Azure', 'Beige', 'Bisque',
12 | 'BlanchedAlmond', 'BlueViolet', 'BurlyWood', 'CadetBlue', 'AntiqueWhite',
13 | 'Chocolate', 'Coral', 'CornflowerBlue', 'Cornsilk', 'Crimson', 'Cyan',
14 | 'DarkCyan', 'DarkGoldenRod', 'DarkGrey', 'DarkKhaki', 'DarkOrange',
15 | 'DarkOrchid', 'DarkSalmon', 'DarkSeaGreen', 'DarkTurquoise', 'DarkViolet',
16 | 'DeepPink', 'DeepSkyBlue', 'DodgerBlue', 'FireBrick', 'FloralWhite',
17 | 'ForestGreen', 'Fuchsia', 'Gainsboro', 'GhostWhite', 'Gold', 'GoldenRod',
18 | 'Salmon', 'Tan', 'HoneyDew', 'HotPink', 'IndianRed', 'Ivory', 'Khaki',
19 | 'Lavender', 'LavenderBlush', 'LawnGreen', 'LemonChiffon', 'LightBlue',
20 | 'LightCoral', 'LightCyan', 'LightGoldenRodYellow', 'LightGray', 'LightGrey',
21 | 'LightGreen', 'LightPink', 'LightSalmon', 'LightSeaGreen', 'LightSkyBlue',
22 | 'LightSlateGray', 'LightSlateGrey', 'LightSteelBlue', 'LightYellow', 'Lime',
23 | 'LimeGreen', 'Linen', 'Magenta', 'MediumAquaMarine', 'MediumOrchid',
24 | 'MediumPurple', 'MediumSeaGreen', 'MediumSlateBlue', 'MediumSpringGreen',
25 | 'MediumTurquoise', 'MediumVioletRed', 'MintCream', 'MistyRose', 'Moccasin',
26 | 'NavajoWhite', 'OldLace', 'Olive', 'OliveDrab', 'Orange', 'OrangeRed',
27 | 'Orchid', 'PaleGoldenRod', 'PaleGreen', 'PaleTurquoise', 'PaleVioletRed',
28 | 'PapayaWhip', 'PeachPuff', 'Peru', 'Pink', 'Plum', 'PowderBlue', 'Purple',
29 | 'Red', 'RosyBrown', 'RoyalBlue', 'SaddleBrown', 'Green', 'SandyBrown',
30 | 'SeaGreen', 'SeaShell', 'Sienna', 'Silver', 'SkyBlue', 'SlateBlue',
31 | 'SlateGray', 'SlateGrey', 'Snow', 'SpringGreen', 'SteelBlue', 'GreenYellow',
32 | 'Teal', 'Thistle', 'Tomato', 'Turquoise', 'Violet', 'Wheat', 'White',
33 | 'WhiteSmoke', 'Yellow', 'YellowGreen'
34 | ]
35 |
36 | category_index = {1: {'name': 'person', 'id': 1}, 2: {'name': 'bicycle', 'id': 2}, 3: {'name': 'car', 'id': 3}, 4: {'name': 'motorcycle', 'id': 4}, 5: {'name': 'airplane', 'id': 5}, 6: {'name': 'bus', 'id': 6}, 7: {'name': 'train', 'id': 7}, 8: {'name': 'truck', 'id': 8}, 9: {'name': 'boat', 'id': 9}, 10: {'name': 'traffic light', 'id': 10}, 11: {'name': 'fire hydrant', 'id': 11}, 13: {'name': 'stop sign', 'id': 13}, 14: {'name': 'parking meter', 'id': 14}, 15: {'name': 'bench', 'id': 15}, 16: {'name': 'bird', 'id': 16}, 17: {'name': 'cat', 'id': 17}, 18: {'name': 'dog', 'id': 18}, 19: {'name': 'horse', 'id': 19}, 20: {'name': 'sheep', 'id': 20}, 21: {'name': 'cow', 'id': 21}, 22: {'name': 'elephant', 'id': 22}, 23: {'name': 'bear', 'id': 23}, 24: {'name': 'zebra', 'id': 24}, 25: {'name': 'giraffe', 'id': 25}, 27: {'name': 'backpack', 'id': 27}, 28: {'name': 'umbrella', 'id': 28}, 31: {'name': 'handbag', 'id': 31}, 32: {'name': 'tie', 'id': 32}, 33: {'name': 'suitcase', 'id': 33}, 34: {'name': 'frisbee', 'id': 34}, 35: {'name': 'skis', 'id': 35}, 36: {'name': 'snowboard', 'id': 36}, 37: {'name': 'sports ball', 'id': 37}, 38: {'name': 'kite', 'id': 38}, 39: {'name': 'baseball bat', 'id': 39}, 40: {'name': 'baseball glove', 'id': 40}, 41: {'name': 'skateboard', 'id': 41}, 42: {'name': 'surfboard', 'id': 42}, 43: {'name': 'tennis racket', 'id': 43}, 44: {'name': 'bottle', 'id': 44}, 46: {'name': 'wine glass', 'id': 46}, 47: {'name': 'cup', 'id': 47}, 48: {'name': 'fork', 'id': 48}, 49: {'name': 'knife', 'id': 49}, 50: {'name': 'spoon', 'id': 50}, 51: {'name': 'bowl', 'id': 51}, 52: {'name': 'banana', 'id': 52}, 53: {'name': 'apple', 'id': 53}, 54: {'name': 'sandwich', 'id': 54}, 55: {'name': 'orange', 'id': 55}, 56: {'name': 'broccoli', 'id': 56}, 57: {'name': 'carrot', 'id': 57}, 58: {'name': 'hot dog', 'id': 58}, 59: {'name': 'pizza', 'id': 59}, 60: {'name': 'donut', 'id': 60}, 61: {'name': 'cake', 'id': 61}, 62: {'name': 'chair', 'id': 62}, 63: {'name': 
'couch', 'id': 63}, 64: {'name': 'potted plant', 'id': 64}, 65: {'name': 'bed', 'id': 65}, 67: {'name': 'dining table', 'id': 67}, 70: {'name': 'toilet', 'id': 70}, 72: {'name': 'tv', 'id': 72}, 73: {'name': 'laptop', 'id': 73}, 74: {'name': 'mouse', 'id': 74}, 75: {'name': 'remote', 'id': 75}, 76: {'name': 'keyboard', 'id': 76}, 77: {'name': 'cell phone', 'id': 77}, 78: {'name': 'microwave', 'id': 78}, 79: {'name': 'oven', 'id': 79}, 80: {'name': 'toaster', 'id': 80}, 81: {'name': 'sink', 'id': 81}, 82: {'name': 'refrigerator', 'id': 82}, 84: {'name': 'book', 'id': 84}, 85: {'name': 'clock', 'id': 85}, 86: {'name': 'vase', 'id': 86}, 87: {'name': 'scissors', 'id': 87}, 88: {'name': 'teddy bear', 'id': 88}, 89: {'name': 'hair drier', 'id': 89}, 90: {'name': 'toothbrush', 'id': 90}}
37 |
38 |
39 |
40 | def draw_mask_on_image_array(
41 | image,
42 | mask,
43 | color='red',
44 | alpha=0.4):
45 | """Draws mask on an image.
46 | Args:
47 | image: uint8 numpy array with shape (img_height, img_height, 3)
48 | mask: a uint8 numpy array of shape (img_height, img_height) with
49 | values between either 0 or 1.
50 | color: color to draw the keypoints with. Default is red.
51 | alpha: transparency value between 0 and 1. (default: 0.4)
52 | Raises:
53 | ValueError: On incorrect data type for image or masks.
54 | """
55 | if image.dtype != np.uint8:
56 | raise ValueError('`image` not of type np.uint8')
57 | if mask.dtype != np.uint8:
58 | raise ValueError('`mask` not of type np.uint8')
59 | if np.any(np.logical_and(mask != 1, mask != 0)):
60 | raise ValueError('`mask` elements should be in [0, 1]')
61 | if image.shape[:2] != mask.shape:
62 | raise ValueError('The image has spatial dimensions %s but the mask has '
63 | 'dimensions %s' % (image.shape[:2], mask.shape))
64 | rgb = ImageColor.getrgb(color)
65 | pil_image = Image.fromarray(image)
66 |
67 | solid_color = np.expand_dims(
68 | np.ones_like(mask), axis=2) * np.reshape(list(rgb), [1, 1, 3])
69 | pil_solid_color = Image.fromarray(np.uint8(solid_color)).convert('RGBA')
70 | pil_mask = Image.fromarray(np.uint8(255.0*alpha*mask)).convert('L')
71 | pil_image = Image.composite(pil_solid_color, pil_image, pil_mask)
72 | np.copyto(image, np.array(pil_image.convert('RGB')))
73 |
74 |
75 | def draw_keypoints_on_image_array(
76 | image,
77 | keypoints,
78 | color='red',
79 | radius=2,
80 | use_normalized_coordinates=True):
81 | """Draws keypoints on an image (numpy array).
82 | Args:
83 | image: a numpy array with shape [height, width, 3].
84 | keypoints: a numpy array with shape [num_keypoints, 2].
85 | color: color to draw the keypoints with. Default is red.
86 | radius: keypoint radius. Default value is 2.
87 | use_normalized_coordinates: if True (default), treat keypoint values as
88 | relative to the image. Otherwise treat them as absolute.
89 | """
90 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
91 | draw_keypoints_on_image(image_pil, keypoints, color, radius,
92 | use_normalized_coordinates)
93 |
94 | np.copyto(image, np.array(image_pil))
95 |
96 |
97 | def draw_bounding_box_on_image_array(
98 | image,
99 | ymin,
100 | xmin,
101 | ymax,
102 | xmax,
103 | color='red',
104 | thickness=4,
105 | display_str_list=(),
106 | use_normalized_coordinates=True):
107 | """Adds a bounding box to an image (numpy array).
108 | Bounding box coordinates can be specified in either absolute (pixel) or
109 | normalized coordinates by setting the use_normalized_coordinates argument.
110 | Args:
111 | image: a numpy array with shape [height, width, 3].
112 | ymin: ymin of bounding box.
113 | xmin: xmin of bounding box.
114 | ymax: ymax of bounding box.
115 | xmax: xmax of bounding box.
116 | color: color to draw bounding box. Default is red.
117 | thickness: line thickness. Default value is 4.
118 | display_str_list: list of strings to display in box
119 | (each to be shown on its own line).
120 | use_normalized_coordinates: If True (default), treat coordinates
121 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
122 | coordinates as absolute.
123 | """
124 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
125 | draw_bounding_box_on_image(image_pil, ymin, xmin, ymax, xmax, color,
126 | thickness, display_str_list,
127 | use_normalized_coordinates)
128 | np.copyto(image, np.array(image_pil))
129 |
130 |
131 | def draw_bounding_box_on_image(
132 | image,
133 | ymin,
134 | xmin,
135 | ymax,
136 | xmax,
137 | color='red',
138 | thickness=4,
139 | display_str_list=(),
140 | use_normalized_coordinates=True):
141 | """Adds a bounding box to an image.
142 | Bounding box coordinates can be specified in either absolute (pixel) or
143 | normalized coordinates by setting the use_normalized_coordinates argument.
144 | Each string in display_str_list is displayed on a separate line above the
145 | bounding box in black text on a rectangle filled with the input 'color'.
146 | If the top of the bounding box extends to the edge of the image, the strings
147 | are displayed below the bounding box.
148 | Args:
149 | image: a PIL.Image object.
150 | ymin: ymin of bounding box.
151 | xmin: xmin of bounding box.
152 | ymax: ymax of bounding box.
153 | xmax: xmax of bounding box.
154 | color: color to draw bounding box. Default is red.
155 | thickness: line thickness. Default value is 4.
156 | display_str_list: list of strings to display in box
157 | (each to be shown on its own line).
158 | use_normalized_coordinates: If True (default), treat coordinates
159 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
160 | coordinates as absolute.
161 | """
162 | draw = ImageDraw.Draw(image)
163 | im_width, im_height = image.size
164 | if use_normalized_coordinates:
165 | (left, right, top, bottom) = (xmin * im_width, xmax * im_width, ymin * im_height, ymax * im_height)
166 | else:
167 | (left, right, top, bottom) = (xmin, xmax, ymin, ymax)
168 |
169 | draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=thickness, fill=color)
170 | try:
171 | font = ImageFont.truetype('arial.ttf', 24)
172 | except IOError:
173 | font = ImageFont.load_default()
174 |
175 | # If the total height of the display strings added to the top of the bounding
176 | # box exceeds the top of the image, stack the strings below the bounding box
177 | # instead of above.
178 | display_str_heights = [font.getsize(ds)[1] for ds in display_str_list]
179 | # Each display_str has a top and bottom margin of 0.05x.
180 | total_display_str_height = (1 + 2 * 0.05) * sum(display_str_heights)
181 |
182 | if top > total_display_str_height:
183 | text_bottom = top
184 | else:
185 | text_bottom = bottom + total_display_str_height
186 | # Reverse list and print from bottom to top.
187 | for display_str in display_str_list[::-1]:
188 | text_width, text_height = font.getsize(display_str)
189 | margin = np.ceil(0.05 * text_height)
190 | draw.rectangle([(left, text_bottom - text_height - 2 * margin), (left + text_width, text_bottom)], fill=color)
191 | draw.text((left + margin, text_bottom - text_height - margin), display_str, fill='black', font=font)
192 | text_bottom -= text_height - 2 * margin
193 |
194 |
195 | def visualize_boxes_and_labels_on_image_array(
196 | image,
197 | boxes,
198 | classes,
199 | scores,
200 | #category_index,
201 | instance_masks=None,
202 | instance_boundaries=None,
203 | keypoints=None,
204 | use_normalized_coordinates=False,
205 | max_boxes_to_draw=20,
206 | min_score_thresh=.5,
207 | agnostic_mode=False,
208 | line_thickness=4,
209 | groundtruth_box_visualization_color='black',
210 | skip_scores=False,
211 | skip_labels=False):
212 | """Overlay labeled boxes on an image with formatted scores and label names.
213 | This function groups boxes that correspond to the same location
214 | and creates a display string for each detection and overlays these
215 | on the image. Note that this function modifies the image in place, and returns
216 | that same image.
217 | Args:
218 | image: uint8 numpy array with shape (img_height, img_width, 3)
219 | boxes: a numpy array of shape [N, 4]
220 | classes: a numpy array of shape [N]. Note that class indices are 1-based,
221 | and match the keys in the label map.
222 | scores: a numpy array of shape [N] or None. If scores=None, then
223 | this function assumes that the boxes to be plotted are groundtruth
224 | boxes and plot all boxes as black with no classes or scores.
225 | category_index: a dict containing category dictionaries (each holding
226 | category index `id` and category name `name`) keyed by category indices.
227 | instance_masks: a numpy array of shape [N, image_height, image_width] with
228 | values ranging between 0 and 1, can be None.
229 | instance_boundaries: a numpy array of shape [N, image_height, image_width]
230 | with values ranging between 0 and 1, can be None.
231 | keypoints: a numpy array of shape [N, num_keypoints, 2], can
232 | be None
233 | use_normalized_coordinates: whether boxes is to be interpreted as
234 | normalized coordinates or not.
235 | max_boxes_to_draw: maximum number of boxes to visualize. If None, draw
236 | all boxes.
237 | min_score_thresh: minimum score threshold for a box to be visualized
238 | agnostic_mode: boolean (default: False) controlling whether to evaluate in
239 | class-agnostic mode or not. This mode will display scores but ignore
240 | classes.
241 | line_thickness: integer (default: 4) controlling line width of the boxes.
242 | groundtruth_box_visualization_color: box color for visualizing groundtruth
243 | boxes
244 | skip_scores: whether to skip score when drawing a single detection
245 | skip_labels: whether to skip label when drawing a single detection
246 | Returns:
247 | uint8 numpy array with shape (img_height, img_width, 3) with overlaid boxes.
248 | """
249 | # Create a display string (and color) for every box location, group any boxes
250 | # that correspond to the same location.
251 | box_to_display_str_map = collections.defaultdict(list)
252 | box_to_color_map = collections.defaultdict(str)
253 | box_to_instance_masks_map = {}
254 | box_to_instance_boundaries_map = {}
255 | box_to_keypoints_map = collections.defaultdict(list)
256 | if not max_boxes_to_draw:
257 | max_boxes_to_draw = boxes.shape[0]
258 | for i in range(min(max_boxes_to_draw, boxes.shape[0])):
259 | if scores is None or scores[i] > min_score_thresh:
260 | box = tuple(boxes[i].tolist())
261 | if instance_masks is not None:
262 | box_to_instance_masks_map[box] = instance_masks[i]
263 | if instance_boundaries is not None:
264 | box_to_instance_boundaries_map[box] = instance_boundaries[i]
265 | if keypoints is not None:
266 | box_to_keypoints_map[box].extend(keypoints[i])
267 | if scores is None:
268 | box_to_color_map[box] = groundtruth_box_visualization_color
269 | else:
270 | display_str = ''
271 | if not skip_labels:
272 | if not agnostic_mode:
273 | if classes[i] in category_index.keys():
274 | class_name = category_index[classes[i]]['name']
275 | else:
276 | class_name = 'N/A'
277 | display_str = str(class_name)
278 | if not skip_scores:
279 | if not display_str:
280 | display_str = '{}%'.format(int(100*scores[i]))
281 | else:
282 | display_str = '{}: {}%'.format(display_str, int(100*scores[i]))
283 | box_to_display_str_map[box].append(display_str)
284 | if agnostic_mode:
285 | box_to_color_map[box] = 'DarkOrange'
286 | else:
287 | box_to_color_map[box] = STANDARD_COLORS[classes[i] % len(STANDARD_COLORS)]
288 |
289 | # Draw all boxes onto image.
290 | for box, color in box_to_color_map.items():
291 | ymin, xmin, ymax, xmax = box
292 | if instance_masks is not None:
293 | draw_mask_on_image_array(
294 | image,
295 | box_to_instance_masks_map[box],
296 | color=color
297 | )
298 | if instance_boundaries is not None:
299 | draw_mask_on_image_array(
300 | image,
301 | box_to_instance_boundaries_map[box],
302 | color='red',
303 | alpha=1.0
304 | )
305 | draw_bounding_box_on_image_array(
306 | image,
307 | ymin,
308 | xmin,
309 | ymax,
310 | xmax,
311 | color=color,
312 | thickness=line_thickness,
313 | display_str_list=box_to_display_str_map[box],
314 | use_normalized_coordinates=use_normalized_coordinates
315 | )
316 | if keypoints is not None:
317 | draw_keypoints_on_image_array(
318 | image,
319 | box_to_keypoints_map[box],
320 | color=color,
321 | radius=line_thickness / 2,
322 | use_normalized_coordinates=use_normalized_coordinates
323 | )
324 |
325 | return image
326 |
327 |
328 |
329 |
330 |
331 |
332 |
333 |
334 |
335 |
336 | # --------------------------------------------------------------------- New
337 |
338 |
339 |
340 |
341 |
342 |
343 | def get_bounding_box_for_image_array(
344 | image,
345 | ymin,
346 | xmin,
347 | ymax,
348 | xmax,
349 | color='red',
350 | thickness=4,
351 | display_str_list=(),
352 | use_normalized_coordinates=True):
353 | """Adds a bounding box to an image (numpy array).
354 | Bounding box coordinates can be specified in either absolute (pixel) or
355 | normalized coordinates by setting the use_normalized_coordinates argument.
356 | Args:
357 | image: a numpy array with shape [height, width, 3].
358 | ymin: ymin of bounding box.
359 | xmin: xmin of bounding box.
360 | ymax: ymax of bounding box.
361 | xmax: xmax of bounding box.
362 | color: color to draw bounding box. Default is red.
363 | thickness: line thickness. Default value is 4.
364 | display_str_list: list of strings to display in box
365 | (each to be shown on its own line).
366 | use_normalized_coordinates: If True (default), treat coordinates
367 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
368 | coordinates as absolute.
369 | """
370 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
371 | # draw_bounding_box_on_image(image_pil, ymin, xmin, ymax, xmax, color,
372 | # thickness, display_str_list,
373 | # use_normalized_coordinates)
374 | # np.copyto(image, np.array(image_pil))
375 |
376 | left, top, right, bottom = get_bounding_box_for_image(image_pil, ymin, xmin, ymax, xmax, color,
377 | thickness, display_str_list,
378 | use_normalized_coordinates)
379 |
380 | return (left, top, right, bottom)
381 |
382 |
383 | def get_bounding_box_for_image(
384 | image,
385 | ymin,
386 | xmin,
387 | ymax,
388 | xmax,
389 | color='red',
390 | thickness=4,
391 | display_str_list=(),
392 | use_normalized_coordinates=True):
393 | """Adds a bounding box to an image.
394 | Bounding box coordinates can be specified in either absolute (pixel) or
395 | normalized coordinates by setting the use_normalized_coordinates argument.
396 | Each string in display_str_list is displayed on a separate line above the
397 | bounding box in black text on a rectangle filled with the input 'color'.
398 | If the top of the bounding box extends to the edge of the image, the strings
399 | are displayed below the bounding box.
400 | Args:
401 | image: a PIL.Image object.
402 | ymin: ymin of bounding box.
403 | xmin: xmin of bounding box.
404 | ymax: ymax of bounding box.
405 | xmax: xmax of bounding box.
406 | color: color to draw bounding box. Default is red.
407 | thickness: line thickness. Default value is 4.
408 | display_str_list: list of strings to display in box
409 | (each to be shown on its own line).
410 | use_normalized_coordinates: If True (default), treat coordinates
411 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
412 | coordinates as absolute.
413 |   Returns:
414 |     A (left, top, right, bottom) tuple of int pixel coordinates.
415 |   """
414 |   # draw = ImageDraw.Draw(image)  # unused; only needed if the drawing code below is re-enabled
415 | im_width, im_height = image.size
416 | if use_normalized_coordinates:
417 | (left, top, right, bottom) = (xmin * im_width, ymin * im_height, xmax * im_width, ymax * im_height)
418 | else:
419 | (left, top, right, bottom) = (xmin, ymin, xmax, ymax)
420 |
421 |
422 | # draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=thickness, fill=color)
423 | # try:
424 | # font = ImageFont.truetype('arial.ttf', 24)
425 | # except IOError:
426 | # font = ImageFont.load_default()
427 |
428 | # # If the total height of the display strings added to the top of the bounding
429 | # # box exceeds the top of the image, stack the strings below the bounding box
430 | # # instead of above.
431 | # display_str_heights = [font.getsize(ds)[1] for ds in display_str_list]
432 | # # Each display_str has a top and bottom margin of 0.05x.
433 | # total_display_str_height = (1 + 2 * 0.05) * sum(display_str_heights)
434 |
435 | # if top > total_display_str_height:
436 | # text_bottom = top
437 | # else:
438 | # text_bottom = bottom + total_display_str_height
439 | # # Reverse list and print from bottom to top.
440 | # for display_str in display_str_list[::-1]:
441 | # text_width, text_height = font.getsize(display_str)
442 | # margin = np.ceil(0.05 * text_height)
443 | # draw.rectangle([(left, text_bottom - text_height - 2 * margin), (left + text_width, text_bottom)], fill=color)
444 | # draw.text((left + margin, text_bottom - text_height - margin), display_str, fill='black', font=font)
445 | # text_bottom -= text_height - 2 * margin
446 |
447 | return (int(left), int(top), int(right), int(bottom))
448 |
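As a quick sanity check, the coordinate conversion above can be exercised in isolation. The helper below is a hypothetical standalone sketch that mirrors the normalized-to-pixel branch of `get_bounding_box_for_image` (the image size and box values are invented for the example):

```python
def to_pixel_box(im_width, im_height, ymin, xmin, ymax, xmax,
                 use_normalized_coordinates=True):
    # mirrors the coordinate-conversion branch of get_bounding_box_for_image
    if use_normalized_coordinates:
        left, top = xmin * im_width, ymin * im_height
        right, bottom = xmax * im_width, ymax * im_height
    else:
        left, top, right, bottom = xmin, ymin, xmax, ymax
    return (int(left), int(top), int(right), int(bottom))

# a 640x480 image with a box covering its center quarter
print(to_pixel_box(640, 480, 0.25, 0.25, 0.75, 0.75))  # (160, 120, 480, 360)
```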
449 |
450 | def get_bounding_boxes_for_image_array(
451 | image,
452 | boxes,
453 | classes,
454 | scores,
455 |     category_index=None,  # restored as an optional arg; the body references it when skip_labels is False
456 | instance_masks=None,
457 | instance_boundaries=None,
458 | keypoints=None,
459 | use_normalized_coordinates=False,
460 | max_boxes_to_draw=None,
461 | min_score_thresh=.5,
462 | agnostic_mode=False,
463 | line_thickness=4,
464 | groundtruth_box_visualization_color='black',
465 | skip_scores=True,
466 | skip_labels=True):
467 |   """Computes labeled bounding boxes for an image array without drawing them.
468 |   This function groups boxes that correspond to the same location, creates a
469 |   display string for each detection, and returns the pixel coordinates and
470 |   class id of every box that passes the score threshold. Unlike the original
471 |   visualization utility, it does not modify the image.
472 | Args:
473 | image: uint8 numpy array with shape (img_height, img_width, 3)
474 | boxes: a numpy array of shape [N, 4]
475 | classes: a numpy array of shape [N]. Note that class indices are 1-based,
476 | and match the keys in the label map.
477 | scores: a numpy array of shape [N] or None. If scores=None, then
478 | this function assumes that the boxes to be plotted are groundtruth
479 |       boxes and plots all boxes as black with no classes or scores.
480 | category_index: a dict containing category dictionaries (each holding
481 | category index `id` and category name `name`) keyed by category indices.
482 | instance_masks: a numpy array of shape [N, image_height, image_width] with
483 | values ranging between 0 and 1, can be None.
484 | instance_boundaries: a numpy array of shape [N, image_height, image_width]
485 | with values ranging between 0 and 1, can be None.
486 | keypoints: a numpy array of shape [N, num_keypoints, 2], can
487 | be None
488 | use_normalized_coordinates: whether boxes is to be interpreted as
489 | normalized coordinates or not.
490 | max_boxes_to_draw: maximum number of boxes to visualize. If None, draw
491 | all boxes.
492 | min_score_thresh: minimum score threshold for a box to be visualized
493 | agnostic_mode: boolean (default: False) controlling whether to evaluate in
494 | class-agnostic mode or not. This mode will display scores but ignore
495 | classes.
496 | line_thickness: integer (default: 4) controlling line width of the boxes.
497 | groundtruth_box_visualization_color: box color for visualizing groundtruth
498 | boxes
499 | skip_scores: whether to skip score when drawing a single detection
500 | skip_labels: whether to skip label when drawing a single detection
501 | Returns:
502 |     A list of (left, top, right, bottom, detection_class_id) tuples in pixel coordinates.
503 | """
504 | # Create a display string (and color) for every box location, group any boxes
505 | # that correspond to the same location.
506 | box_to_display_str_map = collections.defaultdict(list)
507 | box_to_color_map = collections.defaultdict(str)
508 | box_to_instance_masks_map = {}
509 | box_to_instance_boundaries_map = {}
510 | box_to_keypoints_map = collections.defaultdict(list)
511 | if not max_boxes_to_draw:
512 | max_boxes_to_draw = boxes.shape[0]
513 | for i in range(min(max_boxes_to_draw, boxes.shape[0])):
514 | if scores is None or scores[i] > min_score_thresh:
515 | box = tuple(boxes[i].tolist())
516 | if instance_masks is not None:
517 | box_to_instance_masks_map[box] = instance_masks[i]
518 | if instance_boundaries is not None:
519 | box_to_instance_boundaries_map[box] = instance_boundaries[i]
520 | if keypoints is not None:
521 | box_to_keypoints_map[box].extend(keypoints[i])
522 | if scores is None:
523 |           box_to_color_map[box] = (groundtruth_box_visualization_color, classes[i])  # tuple form matches the (color, class_id) unpacking below
524 | else:
525 | display_str = ''
526 | if not skip_labels:
527 | if not agnostic_mode:
528 |           if category_index is not None and classes[i] in category_index.keys():
529 | class_name = category_index[classes[i]]['name']
530 | else:
531 | class_name = 'N/A'
532 | display_str = str(class_name)
533 | if not skip_scores:
534 | if not display_str:
535 | display_str = '{}%'.format(int(100*scores[i]))
536 | else:
537 | display_str = '{}: {}%'.format(display_str, int(100*scores[i]))
538 | box_to_display_str_map[box].append(display_str)
539 | if agnostic_mode:
540 | box_to_color_map[box] = ('DarkOrange', classes[i])
541 | else:
542 | box_to_color_map[box] = (STANDARD_COLORS[classes[i] % len(STANDARD_COLORS)], classes[i])
543 |
544 | bounding_boxes = []
545 | # Draw all boxes onto image.
546 | for box, (color, detection_class_id) in box_to_color_map.items():
547 | ymin, xmin, ymax, xmax = box
548 | left, top, right, bottom = get_bounding_box_for_image_array(
549 | image,
550 | ymin,
551 | xmin,
552 | ymax,
553 | xmax,
554 | color=color,
555 | thickness=line_thickness,
556 | display_str_list=box_to_display_str_map[box],
557 | use_normalized_coordinates=use_normalized_coordinates
558 | )
559 |
560 | bounding_boxes.append((left, top, right, bottom, detection_class_id))
561 |
562 | return bounding_boxes
563 |
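To illustrate the thresholding step at the top of `get_bounding_boxes_for_image_array`, here is a hypothetical, self-contained sketch of the same loop on plain lists (the function name and sample values are invented for the example):

```python
def filter_boxes(boxes, scores, min_score_thresh=0.5, max_boxes_to_draw=None):
    # keep at most max_boxes_to_draw boxes whose score exceeds the
    # threshold; when scores is None, every box is kept (groundtruth case)
    if not max_boxes_to_draw:
        max_boxes_to_draw = len(boxes)
    kept = []
    for i in range(min(max_boxes_to_draw, len(boxes))):
        if scores is None or scores[i] > min_score_thresh:
            kept.append(tuple(boxes[i]))
    return kept

boxes = [[0.1, 0.1, 0.2, 0.2], [0.3, 0.3, 0.5, 0.5], [0.6, 0.6, 0.9, 0.9]]
print(len(filter_boxes(boxes, [0.9, 0.4, 0.7])))  # 2: the middle box fails the 0.5 threshold
```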
564 |
--------------------------------------------------------------------------------
/gui/.gitignore:
--------------------------------------------------------------------------------
1 |
2 | node_modules/*
3 | profile.json
4 | performance_recording.json
5 | package-lock.json
6 |
--------------------------------------------------------------------------------
/gui/app.js:
--------------------------------------------------------------------------------
1 | /* global window */
2 | import React, {Component} from 'react';
3 | import {render} from 'react-dom';
4 | import ReactMapGL from 'react-map-gl';
5 | import DeckGL, {PolygonLayer, GeoJsonLayer} from 'deck.gl';
6 | // PolygonLayer, GeoJsonLayer etc. can't be used until mapbox-gl is upgraded to at least 0.50.0 (see note in webpack.config.js)
7 | //import {TripsLayer} from '@deck.gl/experimental-layers';
8 | import TripsLayer from './src/client/trips-layer.js';
9 |
10 | import {MapboxLayer} from '@deck.gl/mapbox'; // can't be used until mapbox-gl is upgraded to at least 0.50.0 (see note in webpack.config.js)
11 |
12 | import {Timepicker} from './src/client/analog-clock-timepicker/libs/timepicker.js';
13 |
14 | import {formatTracklets} from './src/client/format-tracklets.js';
15 | import {interpolatePositionAndBearing} from './src/client/format-tracklets.js';
16 | import cheapRuler from 'cheap-ruler';
17 |
18 | import {polygon as turfPolygon} from '@turf/turf';
19 | import {transformRotate as turfTransformRotate} from '@turf/turf';
20 | import {transformTranslate as turfTransformTranslate} from '@turf/turf';
21 |
22 |
23 |
24 | // Set your mapbox token here or set environment variable from cmd line and access using 'process.env.[variable_name]'
25 | const MAPBOX_TOKEN = process.env.MapboxAccessToken; // eslint-disable-line
26 |
27 |
28 | // Source demo data JSON. Change to use different demo data
29 | const DATA_URL = {
30 | DEMO_TRACKLETS: './demo_nyc_tracklets.json' // eslint-disable-line
31 | };
32 | const trackletsData = require('./demo_nyc_tracklets.json');
33 |
34 | // change DEMO_MODE to true to use demo data. Change INITIAL_VIEW_STATE latitude/longitude values to location of demo data
35 | const DEMO_MODE = false;
36 |
37 |
38 | const controlsWrapper = {
39 | 'position': 'relative',
40 | 'zIndex': 200
41 | }
42 | const inlineClockStyleWrapper = {
43 | }
44 | const inlineClockStyle = {
45 | 'paddingRight': '56px'
46 | };
47 |
48 | const LIGHT_SETTINGS = {
49 | lightsPosition: [-74.05, 40.7, 8000, -73.5, 41, 5000],
50 | ambientRatio: 0.05,
51 | diffuseRatio: 0.6,
52 | specularRatio: 0.8,
53 | lightsStrength: [2.0, 0.0, 0.0, 0.0],
54 | numberOfLights: 2
55 | };
56 |
57 | export const INITIAL_VIEW_STATE = {
58 | longitude: 0,
59 | latitude: 0,
60 | zoom: 1,
61 | maxZoom: 24,
62 | pitch: 0,
63 | maxPitch: 89.9,
64 | altitude: 1.5,
65 | bearing: 0
66 | };
67 |
68 | const mapboxBuildingLayer = {
69 | id: '3d-buildings',
70 | source: 'composite',
71 | 'source-layer': 'building',
72 | filter: ['==', 'extrude', 'true'],
73 | type: 'fill-extrusion',
74 | minzoom: 13,
75 | 'paint': {
76 | 'fill-extrusion-color': '#aaa',
77 |
78 | // use an 'interpolate' expression to add a smooth transition effect to the
79 | // buildings as the user zooms in
80 | 'fill-extrusion-height': [
81 | "interpolate", ["linear"], ["zoom"],
82 | 13, 0,
83 | 15.05, ["get", "height"]
84 | ],
85 | 'fill-extrusion-base': [
86 | "interpolate", ["linear"], ["zoom"],
87 | 13, 0,
88 | 15.05, ["get", "min_height"]
89 | ],
90 | 'fill-extrusion-opacity': .6
91 | }
92 |
93 | };
94 |
95 |
96 |
97 | export default class App extends Component {
98 | constructor(props) {
99 | super(props);
100 | this.state = {
101 | time: 0,
102 | trackableObjectsArray: [],
103 | canvasWidth: window.innerWidth,
104 | canvasHeight: window.innerHeight,
105 | calibrationMode: false
106 | };
107 |
108 | this.timepicker = new Timepicker();
109 | this.waitingToReceiveTrackableObjects = false;
110 | this.last_ws_get_tracklets = 0;
111 |     this.clockTimestamp = null;
112 |     this.last_query_timestamp = 0;
113 |     this.new_query_timestamp = null;
114 | this.query_timestamp_range = 60000;
115 | this.mapLoaded = false;
116 | this.lastRAFTimestamp = 0;
117 | this.lastMapLatitudeFocus = 0;
118 | this.currentZoomLevel = INITIAL_VIEW_STATE.zoom;
119 | this.trackedObjectMinZoomAppearance = 10;
120 |
121 | this.defaultFootprint = {};
122 |
123 | // This binding is necessary to make `this` work in the callback
124 | this._onWebGLInitialized = this._onWebGLInitialized.bind(this);
125 | this._onMapLoad = this._onMapLoad.bind(this);
126 | this._determineCalibration = this._determineCalibration.bind(this);
127 | this._adjustCanvas = this._adjustCanvas.bind(this);
128 | this._receiveTracklets = this._receiveTracklets.bind(this);
129 | this._interpolateTrackableObjects = this._interpolateTrackableObjects.bind(this);
130 | this._openCalFrameWebsocketConnection = this._openCalFrameWebsocketConnection.bind(this);
131 | this._openTrackletsWebsocketConnection = this._openTrackletsWebsocketConnection.bind(this);
132 |
133 | this._openCalFrameWebsocketConnection();
134 | this._openTrackletsWebsocketConnection();
135 |
136 | this._onCalibrationToggleClick = this._onCalibrationToggleClick.bind(this);
137 | this._setDefaultObjectFootprints = this._setDefaultObjectFootprints.bind(this);
138 |
139 | this._setDefaultObjectFootprints(1); // person
140 | this._setDefaultObjectFootprints(3); // car
141 |
142 | }
143 |
144 | componentDidMount() {
145 | this._animate();
146 | this.ws_send_cal_recv_frame.addEventListener('message', this._adjustCanvas);
147 | this.ws_get_tracklets.addEventListener('message', this._receiveTracklets);
148 |
149 | this.ws_send_cal_recv_frame.addEventListener('close', this._openCalFrameWebsocketConnection);
150 | this.ws_get_tracklets.addEventListener('close', this._openTrackletsWebsocketConnection);
151 |
152 | document.getElementById('timepicker').appendChild(this.timepicker.getElement());
153 | this.timepicker.show();
154 | }
155 |
156 | componentWillUnmount() {
157 | this.timepicker.destroy();
158 | if (this._animationFrame) {
159 | window.cancelAnimationFrame(this._animationFrame);
160 | }
161 | this.ws_send_cal_recv_frame.removeEventListener('message', this._adjustCanvas);
162 | this.ws_get_tracklets.removeEventListener('message', this._receiveTracklets);
163 |
164 | this.ws_send_cal_recv_frame.removeEventListener('close', this._openCalFrameWebsocketConnection);
165 | this.ws_get_tracklets.removeEventListener('close', this._openTrackletsWebsocketConnection);
166 |
167 | }
168 |
169 |
170 | _openCalFrameWebsocketConnection() {
171 | try {
172 | this.ws_send_cal_recv_frame = new WebSocket('ws://'+window.location.hostname+':8080/send_calibration');
173 | } catch (e) {
174 | console.log("error", e);
175 | }
176 | }
177 |
178 | _openTrackletsWebsocketConnection() {
179 | try {
180 | console.log("Reconnecting to ws_get_tracklets");
181 | this.ws_get_tracklets = new WebSocket('ws://'+window.location.hostname+':8080/get_tracklets');
182 | } catch(e) {
183 | console.log("error", e);
184 | }
185 | }
186 |
187 |
188 | _onCalibrationToggleClick() {
189 |
190 | if (this.state.calibrationMode) { // if we are CURRENTLY in calibration mode
191 | this.setState({
192 | canvasWidth: window.innerWidth,
193 | canvasHeight: window.innerHeight,
194 | calibrationMode: false
195 | });
196 | } else { // if we're NOT CURRENTLY in calibration mode
197 | this.setState({
198 | calibrationMode: true
199 | });
200 | }
201 |
202 | }
203 |
204 | _setDefaultObjectFootprints(category = 1) {
205 |
206 | var lat_dst;
207 | var lng_dst;
208 | var origin_dst;
209 |
210 | // Declare length/width of all object objects of that category (in lat and lng values)
211 | if (category == 1) { // person
212 |
213 | lat_dst = 0.00000492271;
214 | lng_dst = 0.00000628327;
215 |
216 | } else if (category == 3) { // car
217 |
218 | lat_dst = 0.00004556421;
219 | lng_dst = 0.00002992647;
220 |
221 | }
222 |
223 |     // Assume the object's position coordinates refer to its center with respect to the ground plane
224 | // Assume a view that has a north/south orientation
225 |
226 | // Mapbox bearing angles are in degrees rotating counter-clockwise from north
227 | // Thus, the right half of the (x,y) plane has negative angles
228 | // Assume the object is on an (x,y) plane with 'y' being north and the origin being the object's center (the 'position')
229 | // So along the 'y'/north-south axis the distance between the object's center and its points is lat_dst/2
230 |     // And along the 'x'/east-west axis, the distance between the object's center and its points is lng_dst/2
231 |
232 | //Get the distance of each of the four corners from the origin for our hypotenuse... a^2 + b^2 = c^2
233 | origin_dst = Math.sqrt( Math.pow((lat_dst/2), 2) + Math.pow((lng_dst/2), 2) );
234 |
235 | // get the angle theta (in degrees) of each corner
236 | var tr_deg = Math.acos( (lat_dst/2) / origin_dst ) * 180 / Math.PI;
237 | var br_deg = Math.acos( -(lat_dst/2) / origin_dst ) * 180 / Math.PI;
238 | var bl_deg = Math.acos( -(lat_dst/2) / origin_dst ) * 180 / Math.PI;
239 | var tl_deg = Math.acos( (lat_dst/2) / origin_dst ) * 180 / Math.PI;
240 |
241 |
242 | this.defaultFootprint[category] = {
243 | "lat_dst": lat_dst,
244 | "lng_dst": lng_dst,
245 | "origin_dst": origin_dst,
246 | "tr_deg": tr_deg,
247 | "br_deg": br_deg,
248 | "bl_deg": bl_deg,
249 | "tl_deg": tl_deg
250 | };
251 |
252 | }
253 |
254 | _moduloBearing(x) {
255 |
256 | // if it's greater than +180, it should be -180 + (x % 180 )
257 | if (x > 180) {
258 | return -180 + (x % 180);
259 | }
260 |
261 | // if it's smaller than -180, it should be 180 - (Math.abs(x) % 180)
262 | if (x < -180) {
263 | return 180 - (Math.abs(x) % 180);
264 | }
265 |     return x; // already within [-180, 180]
266 | }
267 |
268 |
269 | _objectFootprint(positionAndBearing, category=1) {
270 |
271 | // // add the new bearing to the current angle of each corner
272 | // var new_tr_deg = this._moduloBearing(this.defaultFootprint[category].tr_deg + positionAndBearing[1]);
273 | // var new_br_deg = this._moduloBearing(this.defaultFootprint[category].br_deg + positionAndBearing[1]);
274 | // var new_bl_deg = this._moduloBearing(this.defaultFootprint[category].bl_deg + positionAndBearing[1]);
275 | // var new_tl_deg = this._moduloBearing(this.defaultFootprint[category].tl_deg + positionAndBearing[1]);
276 |
277 |
278 | // // Multiply the distance from the origin by the cosine of each new degree to get the new displacement along the 'y' (latitude) axis
279 | // var tr_lat_displacement = this.defaultFootprint[category].origin_dst * Math.cos(new_tr_deg*Math.PI/180)
280 | // var br_lat_displacement = this.defaultFootprint[category].origin_dst * Math.cos(new_br_deg*Math.PI/180)
281 | // var bl_lat_displacement = this.defaultFootprint[category].origin_dst * Math.cos(new_bl_deg*Math.PI/180)
282 | // var tl_lat_displacement = this.defaultFootprint[category].origin_dst * Math.cos(new_tl_deg*Math.PI/180)
283 |
284 | // // Multiply the distance from the origin by the sine of each new degree to get the new displacement along the 'x' (longitude) axis
285 | // var tr_lng_displacement = this.defaultFootprint[category].origin_dst * Math.sin(new_tr_deg*Math.PI/180)
286 | // var br_lng_displacement = this.defaultFootprint[category].origin_dst * Math.sin(new_br_deg*Math.PI/180)
287 | // var bl_lng_displacement = this.defaultFootprint[category].origin_dst * Math.sin(new_bl_deg*Math.PI/180)
288 | // var tl_lng_displacement = this.defaultFootprint[category].origin_dst * Math.sin(new_tl_deg*Math.PI/180)
289 |
290 | var top_right;
291 | var bottom_right;
292 | var bottom_left;
293 | var top_left;
294 |
295 | // top_right = { "lat": positionAndBearing[0].lat + tr_lat_displacement, "lng": positionAndBearing[0].lng + tr_lng_displacement };
296 | // bottom_right = { "lat": positionAndBearing[0].lat + br_lat_displacement, "lng": positionAndBearing[0].lng + br_lng_displacement };
297 | // bottom_left = { "lat": positionAndBearing[0].lat + bl_lat_displacement, "lng": positionAndBearing[0].lng + bl_lng_displacement };
298 | // top_left = { "lat": positionAndBearing[0].lat + tl_lat_displacement, "lng": positionAndBearing[0].lng + tl_lng_displacement };
299 |
300 |
301 | top_right = [ positionAndBearing[0].lng + (this.defaultFootprint[category].lng_dst/2), positionAndBearing[0].lat + (this.defaultFootprint[category].lat_dst/2) ];
302 |
303 | bottom_right = [ positionAndBearing[0].lng + (this.defaultFootprint[category].lng_dst/2), positionAndBearing[0].lat - (this.defaultFootprint[category].lat_dst/2) ];
304 |
305 | bottom_left = [ positionAndBearing[0].lng - (this.defaultFootprint[category].lng_dst/2), positionAndBearing[0].lat - (this.defaultFootprint[category].lat_dst/2) ];
306 |
307 | top_left = [ positionAndBearing[0].lng - (this.defaultFootprint[category].lng_dst/2), positionAndBearing[0].lat + (this.defaultFootprint[category].lat_dst/2) ];
308 |
309 |
310 | var poly = turfPolygon(
311 | [
312 | [
313 | top_right,
314 | bottom_right,
315 | bottom_left,
316 | top_left,
317 | top_right
318 | ]
319 | ]
320 | );
321 |
322 | // console.log(positionAndBearing[1]);
323 |
324 | var rotatedPoly = turfTransformRotate(poly, positionAndBearing[1]);
325 |
326 | if (DEMO_MODE) {
327 | var translatedPoly = turfTransformTranslate(rotatedPoly, 3, positionAndBearing[1]+90, {"units": "meters"});
328 | return translatedPoly.geometry.coordinates;
329 | }
330 |
331 | return rotatedPoly.geometry.coordinates;
332 |
333 | // // Return coordinates
334 | // return [
335 | // [
336 | // [
337 | // top_right.lng,
338 | // top_right.lat
339 | // ],
340 | // [
341 | // bottom_right.lng,
342 | // bottom_right.lat
343 | // ],
344 | // [
345 | // bottom_left.lng,
346 | // bottom_left.lat
347 | // ],
348 | // [
349 | // top_left.lng,
350 | // top_left.lat
351 | // ],
352 | // [
353 | // top_right.lng,
354 | // top_right.lat
355 | // ]
356 | // ]
357 | // ]
358 |
359 |
360 | }
361 |
362 |
363 | _adjustCanvas(message) {
364 |
365 | console.log("Map received frame dim");
366 | // console.log(m.data);
367 | const frame_dim = JSON.parse(message.data);
368 |
369 | const nodeFrameWidth = Math.round(frame_dim.width);
370 | const nodeFrameHeight = Math.round(frame_dim.height);
371 |
372 | this.setState({
373 | canvasWidth: nodeFrameWidth,
374 | canvasHeight: nodeFrameHeight
375 | });
376 |
377 |
378 | }
379 |
380 | _determineCalibration({viewState, interactionState, oldViewState}) {
381 |
382 | // console.log(viewState);
383 |
384 | const {bearing, pitch, zoom, latitude, longitude, width, height} = viewState;
385 |
386 | this.currentZoomLevel = zoom;
387 |
388 | // const map = this._map;
389 | // var bearing = map.getBearing();
390 | // var pitch = map.getPitch();
391 | // var zoom = map.getZoom();
392 |
393 | // Not the location of the camera but the geographic focal point according to the mapbox/OSM camera
394 | // var lnglat_focus = map.getCenter();
395 | // var lng_focus = lnglat_focus.lng;
396 | // var lat_focus = lnglat_focus.lat;
397 |
398 | var lng_focus = longitude;
399 | var lat_focus = latitude;
400 |
401 |
402 | // (From docs) ...create a ruler object only once per a general area of calculation, and then reuse it as much as possible. Don't create a new ruler for every calculation.
403 | if (Math.abs(this.lastMapLatitudeFocus - lat_focus) > 0.05) {
404 | // Create a Cheap Ruler object that will approximate measurements around the given latitude.
405 | this.ruler = cheapRuler(lat_focus, 'meters');
406 |
407 | this.lastMapLatitudeFocus = lat_focus; // set lastMapLatitudeFocus to the new focus
408 | }
409 |
410 |
411 | if (!this.state.calibrationMode) return null; // If we're not in CALIBRATION mode, no need to continue
412 |
413 |
414 | const map = this._map;
415 |
416 | // Homography points
417 | var homography_points = {};
418 |
419 | // var canvas = map.getCanvas(),
420 | // w = canvas.width,
421 | // h = canvas.height;
422 |
423 | const w = width;
424 | const h = height;
425 |
426 | // The modification made to mapbox (https://github.com/mapbox/mapbox-gl-js/issues/3731#issuecomment-368641789) that allows a greater than 60 degree pitch has a bug with unprojecting points closer to the horizon. They get very "screwy". So the two top homography_point corners in the web app ('ul' and 'ur') actually start half way down the canvas as the starting point to start from below the horizon
427 | const b_h = h/2;
428 |
429 | const cUL = map.unproject([0,0]).toArray(),
430 | cUR = map.unproject([w,0]).toArray(),
431 | cLR = map.unproject([w,h]).toArray(),
432 | cLL = map.unproject([0,h]).toArray();
433 | homography_points['corners'] = {};
434 | homography_points['corners']['ul'] = { lng: map.unproject([0,b_h]).lng, lat: map.unproject([0,b_h]).lat };
435 | homography_points['corners']['ur'] = { lng: map.unproject([w,b_h]).lng, lat: map.unproject([w,b_h]).lat };
436 |     homography_points['corners']['ll'] = { lng: map.unproject([0,h]).lng, lat: map.unproject([0,h]).lat }; // lower-left is x=0 (matches cLL above)
437 |     homography_points['corners']['lr'] = { lng: map.unproject([w,h]).lng, lat: map.unproject([w,h]).lat }; // lower-right is x=w (matches cLR above)
438 |
439 | homography_points['markers'] = {};
440 |
441 | const calibration = {
442 | bearing: bearing,
443 | pitch: pitch,
444 | zoom: zoom,
445 | lng_focus: lng_focus,
446 | lat_focus: lat_focus,
447 | homography_points: homography_points
448 | };
449 |
450 |
451 | if (this.ws_send_cal_recv_frame.readyState == 1) {
452 | this.ws_send_cal_recv_frame.send(JSON.stringify(calibration));
453 | } else if (this.ws_send_cal_recv_frame.readyState == 3) { // if it's closed, reopen
454 | this._openCalFrameWebsocketConnection();
455 | }
456 |
457 | }
458 |
459 |
460 | _receiveTracklets(message) {
461 |
462 | // console.log("tracklets received");
463 |
464 |     // always assign newly received tracklets to trackableObjectsArray so it reflects the tracklets that are in the clock's range
465 | this.setState({
466 | trackableObjectsArray: formatTracklets(JSON.parse(message.data), this.loopTime, this.loopLength)
467 | });
468 |
469 | if (this.state.trackableObjectsArray.length > 0) console.log(this.state.trackableObjectsArray);
470 |
471 |
472 | this.last_query_timestamp = this.new_query_timestamp; // set the last_query_timestamp
473 | this.waitingToReceiveTrackableObjects = false;
474 | }
475 |
476 | _animate(rAFTimestamp=0) {
477 | const {
478 | loopLength = 1800, // unit corresponds to the timestamp in source data
479 | animationSpeed = 7 // unit time per second
480 | } = this.props;
481 |
482 | this.loopLength = loopLength;
483 | this.loopTime = loopLength / animationSpeed;
484 |
485 |
486 | // find out how long it's been since this was last called
487 | var elapsedMilliseconds = rAFTimestamp - this.lastRAFTimestamp;
488 | // move clock's time forward by {elapsedMilliseconds}
489 | this.clockTimestamp = this.timepicker.moveClockDateForward(elapsedMilliseconds);
490 |
491 | if (DEMO_MODE) {
492 | const timestamp = this.clockTimestamp / 1000;
493 | this.setState({
494 | time: ((timestamp % this.loopTime) / this.loopTime) * loopLength
495 | });
496 | } else {
497 | this.setState({
498 | time: this.clockTimestamp
499 | });
500 | }
501 |
502 |
503 | if (this.mapLoaded) { // if the map is loaded
504 |
505 | // calculate the next interpolated position of each trackable object
506 | if (DEMO_MODE) {
507 | this.state.trackableObjectsArray = trackletsData;
508 | }
509 |
510 | this._interpolateTrackableObjects(this.state.trackableObjectsArray);
511 |
512 | // set this.waitingToReceiveTrackableObjects to false if we've been waiting too long
513 | const trackableObjectsTimeout = 4000;
514 | if (rAFTimestamp - this.last_ws_get_tracklets > trackableObjectsTimeout) { // if it's been too long since we last called ws_get_tracklets.send, change waitingToReceiveTrackableObjects to false just in case something happens and the event listener is never fired changing waitingToReceiveTrackableObjects back to false
515 | this.waitingToReceiveTrackableObjects = false;
516 | }
517 |
518 | if (!DEMO_MODE && !this.waitingToReceiveTrackableObjects) { // if we're NOT in DEMO_MODE and NOT currently waiting on a request for more trackableObjects
519 |
520 |
521 | if (this.clockTimestamp < this.last_query_timestamp+15000 || this.clockTimestamp > this.last_query_timestamp+this.query_timestamp_range-15000) { // we'll still get back the range of tracklets. This just makes sure we don't experience a gap while we're waiting
522 |
523 | this.waitingToReceiveTrackableObjects = true;
524 | if (this.clockTimestamp < this.last_query_timestamp+15000) { // if clock is being dragged backwards, set query back
525 | this.new_query_timestamp = this.clockTimestamp-30000;
526 | } else {
527 | this.new_query_timestamp = this.clockTimestamp;
528 | }
529 |
530 | if (this.ws_get_tracklets.readyState == 1) {
531 | this.ws_get_tracklets.send(JSON.stringify({"timestamp": this.new_query_timestamp, "range": this.query_timestamp_range})); // ask server for more trackableObjects
532 |
533 | this.last_ws_get_tracklets = rAFTimestamp; // set the time we last called ws_get_tracklets
534 |
535 | } else if (this.ws_get_tracklets.readyState == 3) { // if it's closed, reopen
536 | this._openTrackletsWebsocketConnection();
537 | }
538 |
539 |
540 |           setTimeout(() => { this.waitingToReceiveTrackableObjects = false; }, trackableObjectsTimeout); // arrow function keeps `this` bound to the component; safety net in case the 'message' event never fires
541 |
542 | }
543 | }
544 |
545 | }
546 |
547 | // set this current rAFTimestamp as the last one for next time
548 | this.lastRAFTimestamp = rAFTimestamp;
549 |
550 | this._animationFrame = window.requestAnimationFrame(this._animate.bind(this));
551 | }
552 |
553 |
554 | _interpolateTrackableObjects(trackableObjectsArray) {
555 |
556 | // if this.ruler isn't set or the zoom level is too low
557 | if (this.ruler === undefined || this.currentZoomLevel < this.trackedObjectMinZoomAppearance) return null;
558 |
559 | const featureCollection = []; // make new array to hold these Objects and interpolated positions
560 |
561 |     for (var i = 0; i < trackableObjectsArray.length; i++) { // iterate over all objects (length-1 would skip the last one)
562 |
563 |
564 |       // tracklet timestamps are in milliseconds except in demo data
565 |       var inMilliseconds = !DEMO_MODE;
570 | // get the position
571 | var positionAndBearing = interpolatePositionAndBearing(trackableObjectsArray[i]['tracklets'], this.state.time, this.ruler, inMilliseconds);
572 |
573 | if (positionAndBearing[0] !== null) {
574 |
575 | featureCollection.push({
576 | "type": "Feature",
577 | "geometry": {
578 | "coordinates": this._objectFootprint(positionAndBearing, 3),
579 | "type": "Polygon"
580 | },
581 | "properties":{
582 | "category": "car",
583 | "id": trackableObjectsArray[i]["object_id"],
584 | "height": 2,
585 | "color": "rgb(255, 0, 0)"
586 | }
587 | });
588 | }
589 | }
590 |
591 | this._map.getSource('trackedObject').setData(
592 | {
593 | "type": "FeatureCollection",
594 | "features": featureCollection
595 | }
596 | );
597 |
598 |
599 | }
600 |
601 | _renderLayers() {
602 | const {tracklets = DATA_URL.DEMO_TRACKLETS, trailLength = 50} = this.props;
603 |
604 |
605 | // const geojson_ = {
606 | // "type": "Feature",
607 | // "geometry": {
608 | // "coordinates": this._objectFootprint([{"lng": -74.20992, "lat": 40.81773}, 0]),
609 | // "type": "Polygon"
610 | // },
611 | // "properties":{
612 | // "type":"person",
613 | // "id":"20092as9df2",
614 | // "height": 2,
615 | // "color": "rgb(0, 255, 0)"
616 | // }
617 | // }
618 |
619 | // const geoJsonData = {
620 | // "type":"FeatureCollection",
621 | // "features":[ geojson_ ]
622 | // };
623 |
624 |
625 | // const thisTrackedObjectsLayer = new GeoJsonLayer({
626 | // id: 'geojson-layer',
627 | // geoJsonData,
628 | // pickable: true,
629 | // stroked: false,
630 | // filled: true,
631 | // extruded: true,
632 | // lineWidthScale: 20,
633 | // lineWidthMinPixels: 2,
634 | // getFillColor: [160, 160, 180, 200],
635 | // getLineColor: d => colorToRGBArray(d.properties.color),
636 | // getRadius: 100,
637 | // getLineWidth: 1,
638 | // getElevation: 30,
639 | // });
640 |
641 |
642 | return [
643 |
644 | new TripsLayer({
645 | id: 'tracklets',
646 | data: this.state.trackableObjectsArray,
647 | getPath: d => d.tracklets,
648 | getColor: d => (d.detection_class_id === 0 ? [255, 0, 0] : [0, 255, 0]),
649 | opacity: 0.3,
650 | strokeWidth: 2,
651 | trailLength,
652 | currentTime: this.state.time
653 | })
654 |
655 | ];
656 | }
657 |
658 | // DeckGL and mapbox will both draw into this WebGL context
659 | _onWebGLInitialized(gl) {
660 | this.setState(state => ({
661 | gl
662 | }));
663 | }
664 |
665 | _onMapLoad() {
666 | this.mapLoaded = true;
667 |
668 | const map = this._map;
669 | const deck = this._deck;
670 |
671 | //map.addLayer(new MapboxLayer({id: 'geojson-layer', deck}));
672 | map.addLayer(mapboxBuildingLayer);
673 |
674 |
675 | // See https://www.mapbox.com/mapbox-gl-js/example/point-from-geocoder-result/
676 | map.addSource(
677 | 'trackedObject',
678 | {
679 | type: 'geojson',
680 | "data": {
681 | "type": "FeatureCollection",
682 | "features": []
683 | }
684 | }
685 | );
686 |
687 |
688 | // See https://www.mapbox.com/mapbox-gl-js/example/point-from-geocoder-result/
689 | map.addLayer({
690 | "id": "trackedObject",
691 | "type": "fill-extrusion",
692 | "source": "trackedObject",
693 | 'paint': {
694 | 'fill-extrusion-color': ["get", "color"],
695 |
696 | // modify 'interpolate' expression to add a smooth size transition effect to the
697 | // tracked objects as the camera zooms in
698 | 'fill-extrusion-height': [
699 | "interpolate", ["linear"], ["zoom"],
700 | this.trackedObjectMinZoomAppearance, 0,
701 | 15.05, ["get", "height"]
702 | ],
703 | 'fill-extrusion-base': [
704 | "interpolate", ["linear"], ["zoom"],
705 | this.trackedObjectMinZoomAppearance, 0,
706 | 15.05, 0
707 | ],
708 | 'fill-extrusion-opacity': .6
709 | }
710 |
711 | });
712 |
713 |
714 | var geojson_ = {
715 | "type": "Feature",
716 | "geometry": {
717 | "coordinates": this._objectFootprint([{"lng": -74.20986, "lat": 40.81773}, 0]),
718 | "type": "Polygon"
719 | },
720 | "properties":{
721 | "type":"person",
722 | "id":"20092as9df2",
723 | "height": 2,
724 | "color": "rgb(0, 255, 0)"
725 | }
726 | };
727 |
728 |
729 | map.getSource('trackedObject').setData(
730 | {
731 | "type": "FeatureCollection",
732 | "features": [geojson_]
733 | }
734 | );
735 |
736 | }
737 |
738 |
739 | render() {
740 | const {gl} = this.state;
741 | const {viewState, controller = true, baseMap = true} = this.props;
742 |
743 | return (
744 |
745 |
746 |
747 |
748 |
749 |
750 |
751 |
758 |
759 |
760 |
761 | <DeckGL
762 | ref={ref => {
763 | // save a reference to the Deck instance
764 | this._deck = ref && ref.deck;
765 | }}
766 | layers={this._renderLayers()}
767 | initialViewState={INITIAL_VIEW_STATE}
768 | viewState={viewState}
769 | width={this.state.canvasWidth}
770 | height={this.state.canvasHeight}
771 | controller={controller}
772 | onWebGLInitialized={this._onWebGLInitialized}
773 | onViewStateChange={this._determineCalibration}
774 | >
775 | {gl && (
776 | <StaticMap ref={ref => {
777 | // save a reference to the mapboxgl.Map instance
778 | this._map = ref && ref.getMap();
779 | }}
780 | gl={gl}
781 | reuseMaps
782 | mapStyle="mapbox://styles/mapbox/light-v9"
783 | preventStyleDiffing={true}
784 | visibilityConstraints={ {minZoom: 0, maxZoom: 24, minPitch: 0, maxPitch: 89.9} }
785 | mapboxApiAccessToken={MAPBOX_TOKEN}
786 | onLoad={this._onMapLoad}
787 | />
788 | )}
789 | </DeckGL>
790 |
791 |
792 | );
793 | }
794 | }
795 |
796 | export function renderToDOM(container) {
797 | render(<App />, container);
798 | }
799 |
--------------------------------------------------------------------------------
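Aside: the `fill-extrusion-height` paint property in `_onMapLoad` above uses a Mapbox `interpolate` expression so tracked objects grow smoothly as the camera zooms in between `this.trackedObjectMinZoomAppearance` and the upper stop 15.05. A minimal sketch of the value that expression yields at a given zoom (the `minZoom` parameter stands in for `trackedObjectMinZoomAppearance`, which is defined elsewhere in app.js):

```javascript
// Sketch of ["interpolate", ["linear"], ["zoom"], minZoom, 0, 15.05, fullHeight]
// evaluated in plain JS. Mapbox clamps the output outside the two stops.
function extrusionHeightAtZoom(zoom, minZoom, fullHeight) {
  const maxZoom = 15.05; // upper stop used in app.js
  if (zoom <= minZoom) return 0;          // object is flat (invisible) when zoomed out
  if (zoom >= maxZoom) return fullHeight; // full "height" property when zoomed in
  return fullHeight * (zoom - minZoom) / (maxZoom - minZoom); // linear ramp between stops
}
```

Below `minZoom` an object is extruded to height 0, so tracked objects only pop in once the camera is close enough; past 15.05 they are drawn at their full `height` property.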
/gui/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 | grassland
6 |
7 |
11 |
12 |
13 |
14 |
15 |
16 |
17 |
18 |
19 |
20 |
32 |
33 |
36 |
37 |
--------------------------------------------------------------------------------
/gui/nodemon.json:
--------------------------------------------------------------------------------
1 | {
2 | "watch": ["src/server/"]
3 | }
4 |
--------------------------------------------------------------------------------
/gui/package.json:
--------------------------------------------------------------------------------
1 | {
2 | "name": "gui",
3 | "version": "0.1.0",
4 | "license": "MIT",
5 | "private": true,
6 | "main": "src/server/index.js",
7 | "partial-credit": "https://github.com/crsandeep/simple-react-full-stack",
8 | "scripts": {
9 | "orig-start-localhost": "webpack-dev-server --progress --hot --open --env.HOST=localhost",
10 | "orig-start-external": "webpack-dev-server --progress --hot --open --env.HOST=0.0.0.0",
11 | "build": "webpack --mode production",
12 | "start": "npm run build && node src/server/index.js",
13 | "client": "webpack-dev-server --mode development --devtool inline-source-map --hot",
14 | "server": "nodemon src/server/index.js",
15 | "dev-localhost": "concurrently \"npm run server\" \"npm run orig-start-localhost\"",
16 | "dev-external": "concurrently \"npm run server\" \"npm run orig-start-external\""
17 | },
18 | "dependencies": {
19 | "@deck.gl/experimental-layers": "^6.4.0",
20 | "@deck.gl/mapbox": "^6.4.0",
21 | "@turf/turf": "^5.1.6",
22 | "acorn": "^6.1.1",
23 | "cheap-ruler": "^2.5.1",
24 | "deck.gl": "^6.4.0",
25 | "express-ws": "^4.0.0",
26 | "react": "^16.3.0",
27 | "react-dom": "^16.3.0",
28 | "react-map-gl": "^3.3.0"
29 | },
30 | "devDependencies": {
31 | "buble": "^0.19.3",
32 | "buble-loader": "^0.5.0",
33 | "clean-webpack-plugin": "^1.0.0",
34 | "concurrently": "^4.0.0",
35 | "html-webpack-plugin": "^3.2.0",
36 | "http-proxy-middleware": "^0.19.1",
37 | "nodemon": "^1.17.3",
38 | "webpack": "^4.20.2",
39 | "webpack-cli": "^3.1.2",
40 | "webpack-dev-server": "^3.1.1"
41 | }
42 | }
43 |
--------------------------------------------------------------------------------
/gui/src/client/analog-clock-timepicker/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2018 ZulNs
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/gui/src/client/analog-clock-timepicker/examples/analog-clock.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 |
10 |
11 |
12 | Realtime Analog Clock
13 |
14 |
15 |
16 |
17 | Designed by ZulNs @Yogyakarta, February 2016
18 |
19 | See API Documentation:
20 |
24 |
25 |
26 |
27 |
38 |
39 |
40 |
--------------------------------------------------------------------------------
/gui/src/client/analog-clock-timepicker/examples/timepicker.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 |
10 |
11 |
12 | Analog Clock Timepicker
13 |
14 |
15 |
16 |
17 | Designed by ZulNs @Yogyakarta, February 2016
18 |
19 |
20 |
21 | See API Documentation:
22 |
26 |
27 |
28 |
29 |
53 |
54 |
55 |
--------------------------------------------------------------------------------
/gui/src/client/analog-clock-timepicker/libs/timepicker.css:
--------------------------------------------------------------------------------
1 |
2 | /****************************************************
3 | * Directly Draggable Analog-Clock Style Timepicker *
4 | * *
5 | * Design by ZulNs @Yogyakarta, February 2016 *
6 | ****************************************************/
7 |
8 | .timepicker {
9 | position: relative;
10 | width: 257px;
11 | height: 286px;
12 | padding: 10px;
13 | font: 16px arial;
14 | display: block;
15 | /* margin-top: 10px; */
16 | float: right;
17 | }
18 | .timepicker .clock-face,
19 | .timepicker .hour-hand,
20 | .timepicker .minute-hand,
21 | .timepicker .second-hand {
22 | position: absolute;
23 | }
24 | .timepicker .hour-hand {
25 | -webkit-transform-origin: 50% 70px;
26 | -moz-transform-origin: 50% 70px;
27 | -ms-transform-origin: 50% 70px;
28 | -o-transform-origin: 50% 70px;
29 | transform-origin: 50% 70px;
30 | left: 120px;
31 | top: 60px;
32 | }
33 | .timepicker .minute-hand,
34 | .timepicker .second-hand {
35 | -webkit-transform-origin: 50% 90px;
36 | -moz-transform-origin: 50% 90px;
37 | -ms-transform-origin: 50% 90px;
38 | -o-transform-origin: 50% 90px;
39 | transform-origin: 50% 90px;
40 | top: 40px;
41 | }
42 | .timepicker .minute-hand {
43 | left: 124px;
44 | }
45 | .timepicker .second-hand {
46 | left: 126px;
47 | }
48 | .timepicker .picked-time,
49 | .timepicker .picked-date,
50 | .timepicker .button {
51 | position: absolute;
52 | text-align: center;
53 | vertical-align: middle;
54 | }
55 | .timepicker .picked-time,
56 | .timepicker .picked-date {
57 | top: 260px;
58 | left: 62px;
59 | width: 140px;
60 | height: 36px;
61 | line-height: 36px;
62 | color: #246;
63 | text-align: center;
64 | font-size: 1.75em;
65 | font-weight: bold;
66 | text-shadow: 1px 1px 2px #fff;
67 | cursor: default;
68 | }
69 | .timepicker .picked-date {
70 | top: 300px !important;
71 | font-size: 1.35em !important;
72 | }
73 | .timepicker .button.hour {
74 | left: 15px;
75 | }
76 | .timepicker .button.ok {
77 | right: 6px;
78 | }
79 | .timepicker .button {
80 | top: 263px;
81 | width: 48px;
82 | height: 30px;
83 | line-height: 30px;
84 | cursor: pointer;
85 | font-size: 0.9em;
86 | color: #eee;
87 | background: #07f;
88 | border: 1px solid #05a;
89 | -moz-border-radius: 4px;
90 | -webkit-border-radius: 4px;
91 | border-radius: 4px;
92 | -moz-box-shadow: 0 2px 4px rgba(0,0,0,.5);
93 | -webkit-box-shadow: 0 2px 4px rgba(0,0,0,.5);
94 | box-shadow: 0 2px 4px rgba(0,0,0,.5);
95 | }
96 | .timepicker .button:active {
97 | border: 2px solid #05a;
98 | -webkit-box-shadow: none;
99 | -moz-box-shadow: none;
100 | box-shadow: none;
101 | }
102 |
--------------------------------------------------------------------------------
/gui/src/client/analog-clock-timepicker/libs/timepicker.js:
--------------------------------------------------------------------------------
1 |
2 | /****************************************************
3 | * Directly Draggable Analog-Clock Style Timepicker *
4 | * *
5 | * Design by ZulNs @Yogyakarta, February 2016 *
6 | ****************************************************/
7 |
8 | export function Timepicker(isClockMode, is24HoursSystem, selectedHours, selectedMinutes, selectedSeconds) {
9 | isClockMode = !!isClockMode;
10 | is24HoursSystem = !!is24HoursSystem;
11 | selectedHours = selectedHours === undefined ? new Date().getHours() : ~~selectedHours % 24;
12 | selectedMinutes = selectedMinutes === undefined ? new Date().getMinutes() : ~~selectedMinutes % 60;
13 | selectedSeconds = selectedSeconds === undefined ? new Date().getSeconds() : ~~selectedSeconds % 60;
14 |
15 | // var clockDate = new Date();
16 | // clockDate.setHours(0,0,0,0);
17 |
18 | var self = this,
19 | timepicker = document.createElement('div'),
20 | clockFace = document.createElement('canvas'),
21 | hourHand = document.createElement('canvas'),
22 | minuteHand = document.createElement('canvas'),
23 | secondHand = document.createElement('canvas'),
24 | pickedTime = document.createElement('div'),
25 | pickedDate = document.createElement('div'),
26 | hourSystemButton = document.createElement('div'),
27 | okButton = document.createElement('div'),
28 | displayStyle = 'block',
29 | isHidden = true,
30 | isPM = selectedHours >= 12,
31 | lastIsPM = isPM,
32 | isHourHand,
33 | isMinuteHand,
34 | clockDate,
35 | clockDateIntervalSize = 10,
36 | clockTimestamp,
37 | isReverseRotate,
38 | isDragging = false,
39 | isFiredByMouse = false,
40 | touchId,
41 | lastHourDeg,
42 | lastMinuteDeg,
43 | lastSecondDeg,
44 | centerX,
45 | centerY,
46 | cssTransform = Timepicker.getSupportedTransformProp(),
47 | secondTimer,
48 |
49 |
50 | handleMouseDown = function(e) {
51 | if (!isDragging) {
52 | window.clearInterval(secondTimer); // stop the clock moving forward
53 | e = e || window.event;
54 | e.preventDefault();
55 | e.stopPropagation();
56 | isFiredByMouse = true;
57 | isHourHand = e.target === hourHand;
58 | isMinuteHand = e.target === minuteHand;
59 | onPointerStart(e.pageX, e.pageY);
60 | }
61 | },
62 |
63 | handleMouseMove = function(e) {
64 | if (isDragging && isFiredByMouse) {
65 | e = e || window.event;
66 | e.preventDefault();
67 | onPointerMove(e.pageX, e.pageY);
68 | }
69 | },
70 |
71 | handleMouseUp = function(e) {
72 | if (isDragging && isFiredByMouse) {
73 | // startClockDateInterval();
74 | e = e || window.event;
75 | e.preventDefault();
76 | isDragging = false;
77 | }
78 | },
79 |
80 | handleTouchStart = function(e) {
81 | e = e || window.event;
82 | if (isDragging && !isFiredByMouse && e.touches.length == 1) isDragging = false;
83 | if (!isDragging) {
84 | var touch = e.changedTouches[0];
85 | e.preventDefault();
86 | //e.stopPropagation();
87 | isFiredByMouse = false;
88 | touchId = touch.identifier;
89 | isHourHand = touch.target === hourHand;
90 | isMinuteHand = touch.target === minuteHand;
91 | onPointerStart(touch.pageX, touch.pageY);
92 | }
93 | },
94 |
95 | handleTouchMove = function(e) {
96 | if (isDragging && !isFiredByMouse) {
97 | e = e || window.event;
98 | var touches = e.changedTouches, touch;
99 | for (var i = 0; i < touches.length; i++) {
100 | touch = touches[i];
101 | if (touch.identifier === touchId) {
102 | e.preventDefault();
103 | onPointerMove(touch.pageX, touch.pageY);
104 | break;
105 | }
106 | }
107 | }
108 | },
109 |
110 | handleTouchEnd = function(e) {
111 | if (isDragging && !isFiredByMouse) {
112 | e = e || window.event;
113 | var touches = e.changedTouches, touch;
114 | for (var i = 0; i < touches.length; i++) {
115 | touch = touches[i];
116 | if (touch.identifier === touchId) {
117 | e.preventDefault();
118 | isDragging = false;
119 | return;
120 | }
121 | }
122 | }
123 | },
124 |
125 | updateLastMinuteDeg = function(pseudo_deg, manual=false) {
126 | if ((270 < lastMinuteDeg && lastMinuteDeg < 360 && 0 <= pseudo_deg && pseudo_deg < 90) || (270 < pseudo_deg && pseudo_deg < 360 && 0 <= lastMinuteDeg && lastMinuteDeg < 90)) {
127 | lastHourDeg = lastHourDeg + (pseudo_deg - lastMinuteDeg - Math.sign(pseudo_deg - lastMinuteDeg) * 360) / 12;
128 | if (lastHourDeg < 0) lastHourDeg += 360;
129 | lastHourDeg %= 360;
130 | if (345 < lastHourDeg || lastHourDeg < 15) isPM = !isPM;
131 | }
132 | else {
133 | lastHourDeg = lastHourDeg + (pseudo_deg - lastMinuteDeg) / 12;
134 | if (lastHourDeg < 0) lastHourDeg += 360;
135 | lastHourDeg %= 360;
136 | }
137 | lastMinuteDeg = pseudo_deg;
138 |
139 | if (!manual) {
140 | rotateElement(minuteHand, lastMinuteDeg);
141 | }
142 | rotateElement(hourHand, lastHourDeg);
143 | },
144 |
145 | onPointerStart = function(currentX, currentY) {
146 | isDragging = true;
147 | centerX = timepicker.offsetLeft + hourHand.offsetLeft + 10;
148 | centerY = timepicker.offsetTop + hourHand.offsetTop + 70;
149 | var last = isHourHand ? lastHourDeg : isMinuteHand ? lastMinuteDeg : lastSecondDeg,
150 | deg = -Math.atan2(centerX - currentX, centerY - currentY) * 180 / Math.PI,
151 | dif = Math.abs(deg - last);
152 | isReverseRotate = (160 < dif && dif < 200);
153 | },
154 |
155 | onPointerMove = function(currentX, currentY) {
156 | var deg, last, target;
157 | if (currentX !== centerX || currentY !== centerY) {
158 | deg = -Math.atan2(centerX - currentX, centerY - currentY) * 180 / Math.PI;
159 | if (isReverseRotate) deg = deg - 180;
160 | if (deg < 0) deg += 360;
161 | target = isHourHand ? hourHand : isMinuteHand ? minuteHand : secondHand;
162 | rotateElement(target, deg);
163 | lastIsPM = isPM;
164 | var manual; // retained for compatibility; the calls below pass booleans directly
165 | if (isHourHand) {
166 | if ((0 <= deg && deg < 90 && 270 < lastHourDeg && lastHourDeg < 360) || (0 <= lastHourDeg && lastHourDeg < 90 && 270 < deg && deg < 360)) isPM = !isPM;
167 |
168 | lastHourDeg = deg;
169 | lastMinuteDeg = deg % 30 * 12;
170 | rotateElement(minuteHand, lastMinuteDeg);
171 | lastSecondDeg = lastMinuteDeg % 6 * 60;
172 | rotateElement(secondHand, lastSecondDeg);
173 | } else if (isMinuteHand) {
174 | updateLastMinuteDeg(deg, true);
175 | lastSecondDeg = lastMinuteDeg % 6 * 60;
176 | rotateElement(secondHand, lastSecondDeg);
177 | } else {
178 | if ((270 < lastSecondDeg && lastSecondDeg < 360 && 0 <= deg && deg < 90) || (270 < deg && deg < 360 && 0 <= lastSecondDeg && lastSecondDeg < 90)) {
179 |
180 | var pseudo_deg = lastMinuteDeg + (deg - lastSecondDeg - Math.sign(deg - lastSecondDeg) * 360) / 60;
181 | if (pseudo_deg < 0) pseudo_deg += 360;
182 | pseudo_deg %= 360;
183 | updateLastMinuteDeg(pseudo_deg, false);
184 | } else {
185 | pseudo_deg = lastMinuteDeg + (deg - lastSecondDeg) / 60;
186 | if (pseudo_deg < 0) pseudo_deg += 360;
187 | pseudo_deg %= 360;
188 | updateLastMinuteDeg(pseudo_deg, false);
189 | }
190 | lastSecondDeg = deg;
191 | }
192 |
193 | selectedMinutes = 6 * lastHourDeg / 180;
194 | selectedHours = ~~selectedMinutes;
195 | selectedMinutes = Math.floor((selectedMinutes - selectedHours) * 60);
196 | selectedSeconds = parseFloat((lastSecondDeg / 6).toFixed(3));
197 | if (isPM) selectedHours += 12;
198 |
199 | if (lastIsPM && !isPM && selectedHours < 3) { // if we've gone from PM to AM and the time is less than 3:00, we've gone forward a day...
200 | // console.log("Forwards a day");
201 | clockDate.setDate(clockDate.getDate() + 1)
202 | // clockDate.setHours(0,0,0,0);
203 |
204 | } else if (!lastIsPM && isPM && selectedHours > 21) { // else if we've gone from AM to PM and the time is greater than 21:00, we've gone backwards a day
205 | // console.log("Backwards a day");
206 | clockDate.setDate(clockDate.getDate() - 1)
207 | // clockDate.setHours(0,0,0,0);
208 | }
209 |
210 | updateClockDate();
211 | updatePickedTime();
212 |
213 | }
214 | },
215 |
216 | handleChangeHourSystem = function() {
217 | is24HoursSystem = !is24HoursSystem;
218 | label24HoursSystem();
219 | updatePickedTime();
220 | },
221 |
222 | handleOkButton = function() {
223 | // self.hide();
224 | updateClockTime();
225 | if (typeof self.callback === 'function') self.callback();
226 | },
227 |
228 | updatePickedTime = function() {
229 | // console.log("You picked "+getPickedTimeString());
230 | pickedTime.innerHTML = getPickedTimeString();
231 |
232 | pickedDate.innerHTML = getPickedDateString();
233 |
234 | if (typeof self.callback === 'function') self.callback();
235 | },
236 |
237 | setClockDateToNow = function() {
238 | clockDate = new Date();
239 | },
240 |
241 | updateClockDate = function() {
242 | clockDate.setHours(selectedHours,selectedMinutes,selectedSeconds,0);
243 | },
244 |
245 | // startClockDateInterval = function() {
246 | // secondTimer = window.setInterval(moveClockDateForward, clockDateIntervalSize);
247 | // },
248 |
249 | updateClockTime = function() {
250 | selectedHours = new Date().getHours();
251 | selectedMinutes = new Date().getMinutes();
252 | selectedSeconds = new Date().getSeconds();
253 |
254 | setClockDateToNow();
255 |
256 | isPM = selectedHours >= 12;
257 |
258 | updateClockPointers();
259 | // if (selectedSeconds === 0) updatePickedTime();
260 | updatePickedTime();
261 |
262 | },
263 |
264 | updateClockPointers = function() {
265 | lastSecondDeg = selectedSeconds * 6;
266 | lastMinuteDeg = (selectedMinutes + lastSecondDeg / 360) * 6;
267 | lastHourDeg = (selectedHours % 12 + lastMinuteDeg / 360) * 30;
268 | rotateElement(hourHand, lastHourDeg);
269 | rotateElement(minuteHand, lastMinuteDeg);
270 | rotateElement(secondHand, lastSecondDeg);
271 | },
272 |
273 | getPickedDateString = function() {
274 | var pts = clockDate.toLocaleDateString();
275 | return pts;
276 | },
277 |
278 | getPickedTimeString = function() {
279 | var pts = ('0' + (is24HoursSystem ? selectedHours : selectedHours % 12 === 0 ? 12 : selectedHours % 12)).slice(-2) + ':' + ('0' + selectedMinutes).slice(-2);
280 | if (!is24HoursSystem) pts += ' ' + (isPM ? 'PM' : 'AM');
281 | return pts;
282 | },
283 |
284 | label24HoursSystem = function() {
285 | hourSystemButton.innerHTML = (is24HoursSystem ? '12' : '24') + 'H';
286 | },
287 |
288 | rotateElement = function(elm, deg) {
289 | elm.style[cssTransform] = 'rotate(' + deg + 'deg)';
290 | },
291 |
292 |
293 | setTimepickerDisplay = function() {
294 | timepicker.style.display = isHidden ? 'none' : displayStyle;
295 | },
296 |
297 | scrollToFix = function() {
298 | var dw = document.body.offsetWidth,
299 | vw = window.innerWidth,
300 | vh = window.innerHeight,
301 | rect = timepicker.getBoundingClientRect(),
302 | hsSpc = dw > vw ? 20 : 0,
303 | scrollX = rect.left < 0 ? rect.left : 0,
304 | scrollY = rect.bottom - rect.top > vh ? rect.top : rect.bottom > vh - hsSpc ? rect.bottom - vh + hsSpc : 0;
305 | window.scrollBy(scrollX, scrollY);
306 | },
307 |
308 | addEvents = function() {
309 | Timepicker.addEvent(hourHand, 'mousedown', handleMouseDown);
310 | Timepicker.addEvent(minuteHand, 'mousedown', handleMouseDown);
311 | Timepicker.addEvent(secondHand, 'mousedown', handleMouseDown);
312 | Timepicker.addEvent(document, 'mousemove', handleMouseMove);
313 | Timepicker.addEvent(document, 'mouseup', handleMouseUp);
314 | if ('touchstart' in window || navigator.maxTouchPoints > 0 || navigator.msMaxTouchPoints > 0) {
315 | Timepicker.addEvent(hourHand, 'touchstart', handleTouchStart);
316 | Timepicker.addEvent(hourHand, 'touchmove', handleTouchMove);
317 | Timepicker.addEvent(hourHand, 'touchcancel', handleTouchEnd);
318 | Timepicker.addEvent(hourHand, 'touchend', handleTouchEnd);
319 | Timepicker.addEvent(minuteHand, 'touchstart', handleTouchStart);
320 | Timepicker.addEvent(minuteHand, 'touchmove', handleTouchMove);
321 | Timepicker.addEvent(minuteHand, 'touchcancel', handleTouchEnd);
322 | Timepicker.addEvent(minuteHand, 'touchend', handleTouchEnd);
323 | Timepicker.addEvent(secondHand, 'touchstart', handleTouchStart);
324 | Timepicker.addEvent(secondHand, 'touchmove', handleTouchMove);
325 | Timepicker.addEvent(secondHand, 'touchcancel', handleTouchEnd);
326 | Timepicker.addEvent(secondHand, 'touchend', handleTouchEnd);
327 |
328 | }
329 | },
330 |
331 | removeEvents = function() {
332 | Timepicker.removeEvent(hourHand, 'mousedown', handleMouseDown);
333 | Timepicker.removeEvent(minuteHand, 'mousedown', handleMouseDown);
334 | Timepicker.removeEvent(secondHand, 'mousedown', handleMouseDown);
335 | Timepicker.removeEvent(document, 'mousemove', handleMouseMove);
336 | Timepicker.removeEvent(document, 'mouseup', handleMouseUp);
337 | if ('touchstart' in window || navigator.maxTouchPoints > 0 || navigator.msMaxTouchPoints > 0) {
338 | Timepicker.removeEvent(hourHand, 'touchstart', handleTouchStart);
339 | Timepicker.removeEvent(hourHand, 'touchmove', handleTouchMove);
340 | Timepicker.removeEvent(hourHand, 'touchcancel', handleTouchEnd);
341 | Timepicker.removeEvent(hourHand, 'touchend', handleTouchEnd);
342 | Timepicker.removeEvent(minuteHand, 'touchstart', handleTouchStart);
343 | Timepicker.removeEvent(minuteHand, 'touchmove', handleTouchMove);
344 | Timepicker.removeEvent(minuteHand, 'touchcancel', handleTouchEnd);
345 | Timepicker.removeEvent(minuteHand, 'touchend', handleTouchEnd);
346 | Timepicker.removeEvent(secondHand, 'touchstart', handleTouchStart);
347 | Timepicker.removeEvent(secondHand, 'touchmove', handleTouchMove);
348 | Timepicker.removeEvent(secondHand, 'touchcancel', handleTouchEnd);
349 | Timepicker.removeEvent(secondHand, 'touchend', handleTouchEnd);
350 |
351 | }
352 | },
353 |
354 | createTimepicker = function() {
355 | if (!cssTransform) {
356 | self.destroy();
357 | alert('Sorry, your browser does not support CSS transforms!');
358 | return;
359 | }
360 | // Initialize
361 | timepicker.classList.add('timepicker');
362 | clockFace.classList.add('clock-face');
363 | hourHand.classList.add('hour-hand');
364 | minuteHand.classList.add('minute-hand');
365 | secondHand.classList.add('second-hand');
366 | pickedTime.classList.add('picked-time');
367 | pickedDate.classList.add('picked-date');
368 | hourSystemButton.classList.add('button');
369 | hourSystemButton.classList.add('hour');
370 | hourSystemButton.style.padding = '0px';
371 | okButton.classList.add('button');
372 | okButton.classList.add('ok');
373 | okButton.style.padding = '0px';
374 | clockFace.setAttribute('width', 240);
375 | clockFace.setAttribute('height', 240);
376 | hourHand.setAttribute('width', 20);
377 | hourHand.setAttribute('height', 90);
378 | minuteHand.setAttribute('width', 12);
379 | minuteHand.setAttribute('height', 110);
380 | secondHand.setAttribute('width', 8);
381 | secondHand.setAttribute('height', 120);
382 | label24HoursSystem();
383 | okButton.innerHTML = 'Now';
384 | setTimepickerDisplay();
385 | timepicker.appendChild(clockFace);
386 | timepicker.appendChild(hourHand);
387 | timepicker.appendChild(minuteHand);
388 | timepicker.appendChild(secondHand);
389 | timepicker.appendChild(pickedTime);
390 | timepicker.appendChild(pickedDate);
391 | timepicker.appendChild(hourSystemButton);
392 | timepicker.appendChild(okButton);
393 | if (clockFace.getContext){
394 | // Create clock surface
395 | var ctx = clockFace.getContext('2d');
396 | ctx.strokeStyle = '#333';
397 | ctx.beginPath();
398 | ctx.arc(120, 120, 119, 0, 2 * Math.PI);
399 | ctx.stroke();
400 | var radGrd = ctx.createRadialGradient(100, 100, 140, 100, 100, 20);
401 | radGrd.addColorStop(0, '#fff');
402 | radGrd.addColorStop(1, '#ddd');
403 | ctx.fillStyle = radGrd;
404 | ctx.beginPath();
405 | ctx.arc(120, 120, 118, 0, 2 * Math.PI);
406 | ctx.fill();
407 | ctx.translate(120, 120);
408 | ctx.fillStyle = '#333';
409 | for (var i = 0; i < 12; i++) {
410 | ctx.beginPath();
411 | ctx.arc(0, -110, 3, 0, 2 * Math.PI);
412 | ctx.fill();
413 | ctx.rotate(Math.PI / 30);
414 | for (var j = 0; j < 4; j++) {
415 | ctx.beginPath();
416 | ctx.arc(0, -110, 2, 0, 2 * Math.PI);
417 | ctx.fill();
418 | ctx.rotate(Math.PI / 30);
419 | }
420 | }
421 | ctx.font = '16px serif';
422 | ctx.textAlign = 'center';
423 | ctx.textBaseline = 'middle';
424 | for (var i = 1; i <= 12; i++) {
425 | ctx.fillText(i, 94 * Math.sin(i * Math.PI / 6), -94 * Math.cos(i * Math.PI / 6));
426 | }
427 | // Create hour hand
428 | ctx = hourHand.getContext('2d');
429 | var radGrd = ctx.createRadialGradient(0, 0, 90, 70, 70, 20);
430 | radGrd.addColorStop(0, '#e40');
431 | radGrd.addColorStop(1, '#f51');
432 | ctx.fillStyle = radGrd;
433 | ctx.beginPath();
434 | ctx.moveTo(10, 0);
435 | ctx.lineTo(0, 90);
436 | ctx.lineTo(20, 90);
437 | ctx.lineTo(10, 0);
438 | ctx.fill();
439 | // Create minute hand
440 | ctx = minuteHand.getContext('2d');
441 | var radGrd = ctx.createRadialGradient(0, 0, 110, 90, 90, 20);
442 | radGrd.addColorStop(0, '#06e');
443 | radGrd.addColorStop(1, '#17f');
444 | ctx.fillStyle = radGrd;
445 | ctx.beginPath();
446 | ctx.moveTo(6, 0);
447 | ctx.lineTo(0, 110);
448 | ctx.lineTo(12, 110);
449 | ctx.lineTo(6, 0);
450 | ctx.fill();
451 | ctx.fillStyle = '#000';
452 | ctx.beginPath();
453 | ctx.arc(6, 90, 2, 0, 2 * Math.PI);
454 | ctx.fill();
455 | // Create second hand
456 | ctx = secondHand.getContext('2d');
457 | var radGrd = ctx.createRadialGradient(0, 0, 120, 100, 100, 20);
458 | radGrd.addColorStop(0, '#3a3');
459 | radGrd.addColorStop(1, '#4b4');
460 | ctx.fillStyle = radGrd;
461 | ctx.beginPath();
462 | ctx.moveTo(4, 0);
463 | ctx.lineTo(0, 120);
464 | ctx.lineTo(8, 120);
465 | ctx.lineTo(4, 0);
466 | ctx.fill();
467 | ctx.fillStyle = '#000';
468 | ctx.beginPath();
469 | ctx.arc(4, 90, 2, 0, 2 * Math.PI);
470 | ctx.fill();
471 | // Finalize
472 | Timepicker.addEvent(hourSystemButton, 'click', handleChangeHourSystem);
473 | Timepicker.addEvent(okButton, 'click', handleOkButton);
474 |
475 | if (isClockMode) secondTimer = window.setInterval(updateClockTime, 1000);
476 | else {
477 | addEvents();
478 | Timepicker.setCursor(hourHand, true);
479 | Timepicker.setCursor(minuteHand, true);
480 | Timepicker.setCursor(secondHand, true);
481 | // secondHand.style.display = 'none';
482 | // startClockDateInterval();
483 | }
484 |
485 | setClockDateToNow();
486 | updateClockDate();
487 | updateClockPointers();
488 | updatePickedTime();
489 |
490 |
491 | }
492 | else {
493 | self.destroy();
494 | alert('Sorry, your browser does not support HTML canvas!');
495 | }
496 | };
497 |
498 | this.getElement = function() {
499 | return timepicker;
500 | };
501 |
502 | this.getHours = function() {
503 | return selectedHours;
504 | };
505 |
506 | this.getMinutes = function() {
507 | return selectedMinutes;
508 | };
509 |
510 | this.getTimeString = function() {
511 | return getPickedTimeString();
512 | };
513 |
514 | this.getDateString = function() {
515 | return getPickedDateString();
516 | };
517 |
518 |
519 | this.getTimestamp = function() {
520 | return +clockDate;
521 | };
522 |
523 | this.isClockMode = function() {
524 | return isClockMode;
525 | };
526 |
527 | this.is24HoursSystem = function() {
528 | return is24HoursSystem;
529 | };
530 |
531 | this.isHidden = function() {
532 | return isHidden;
533 | };
534 |
535 |
536 | this.moveClockDateForward = function(elapsedMilliseconds) {
537 |
538 | if (isDragging) return +clockDate; // Don't move clock hands forward if they're being dragged
539 |
540 | var selectedMilliseconds = clockDate.getMilliseconds();
541 | // add elapsedMilliseconds to clockDate
542 | clockDate.setMilliseconds(selectedMilliseconds + elapsedMilliseconds);
543 |
544 | selectedHours = clockDate.getHours();
545 | selectedMinutes = clockDate.getMinutes();
546 | selectedSeconds = clockDate.getSeconds();
547 | selectedMilliseconds = clockDate.getMilliseconds();
548 |
549 | isPM = selectedHours >= 12;
550 | if (selectedMilliseconds <= 100) updateClockPointers(); // Since the clock has 1-second precision, if the current milliseconds are below 100 we assume we've just crossed a second boundary and should move the hands
551 |
552 |
553 | if (selectedSeconds < 5) updatePickedTime(); // Since pickedTime only has minute precision, if the current seconds are below 5 we assume we've just crossed a minute boundary and should update pickedTime
554 |
555 | return +clockDate;
556 |
557 | };
558 |
559 |
560 | this.setHours = function(hours) {
561 | if (!isNaN(hours)) selectedHours = parseInt(hours);
562 | if (isClockMode) updateClockPointers();
563 | updatePickedTime();
564 | };
565 |
566 | this.setMinutes = function(minutes) {
567 | if (!isNaN(minutes)) selectedMinutes = parseInt(minutes);
568 | if (isClockMode) updateClockPointers();
569 | updatePickedTime();
570 | };
571 |
572 | this.setSeconds = function(seconds) {
573 | if (!isNaN(seconds)) selectedSeconds = parseFloat(parseFloat(seconds).toFixed(3));
574 | if (isClockMode) updateClockPointers();
575 | updatePickedTime();
576 | };
577 |
578 | this.changeClockMode = function() {
579 | isClockMode = !isClockMode;
580 | Timepicker.setCursor(hourHand, !isClockMode);
581 | Timepicker.setCursor(minuteHand, !isClockMode);
582 | Timepicker.setCursor(secondHand, !isClockMode);
583 | // secondHand.style.display = isClockMode ? '' : 'none';
584 | if (isClockMode) {
585 | removeEvents();
586 | updateClockTime();
587 | updatePickedTime();
588 | secondTimer = window.setInterval(updateClockTime, 1000);
589 | }
590 | else {
591 | // startClockDateInterval();
592 | addEvents();
593 | }
594 | };
595 |
596 | this.changeHourSystem = function() {
597 | handleChangeHourSystem();
598 | };
599 |
600 | this.show = function() {
601 | if (isHidden) {
602 | isHidden = !isHidden;
603 | setTimepickerDisplay();
604 | scrollToFix();
605 | }
606 | };
607 |
608 | this.hide = function() {
609 | if (!isHidden) {
610 | isHidden = !isHidden;
611 | setTimepickerDisplay();
612 | }
613 | };
614 |
615 | this.destroy = function() {
616 | window.clearInterval(secondTimer);
617 | timepicker.remove();
618 | self = null;
619 | };
620 |
621 | this.setDisplayStyle = function(style) {
622 | displayStyle = style;
623 | setTimepickerDisplay();
624 | };
625 |
626 |   this.callback; // callback hook, assigned externally by the caller
627 |
628 | createTimepicker();
629 | }
630 |
631 | Timepicker.addEvent = function(elm, evt, callback) {
632 | if (window.addEventListener) elm.addEventListener(evt, callback);
633 | else elm.attachEvent('on' + evt, callback);
634 | };
635 |
636 | Timepicker.removeEvent = function(elm, evt, callback) {
637 | if (window.addEventListener) elm.removeEventListener(evt, callback);
638 | else elm.detachEvent('on' + evt, callback);
639 | };
640 |
641 | Timepicker.setCursor = function(elm, pointer) {
642 | elm.style.cursor = pointer ? 'pointer' : 'default';
643 | };
644 |
645 | Timepicker.getSupportedTransformProp = function() {
646 |     var props = ['transform', 'MozTransform', 'WebkitTransform', 'msTransform', 'OTransform'],
647 | root = document.documentElement;
648 | for (var i = 0; i < props.length; i++)
649 | if (props[i] in root.style) return props[i];
650 | return null;
651 | };
652 |
--------------------------------------------------------------------------------
/gui/src/client/app_controls.css:
--------------------------------------------------------------------------------
1 | /* The calibrationToggleBox - the box around the calibrationToggle */
2 | .switch {
3 | position: relative;
4 | display: inline-block;
5 | float: right;
6 | left: 267px;
7 | top: 354px;
8 | width: 254px;
9 | height: 34px;
10 | }
11 |
12 | .switch input {display:none;}
13 |
14 | .slider {
15 | position: absolute;
16 | cursor: pointer;
17 | top: 0;
18 | left: 0;
19 | right: 0;
20 | bottom: 0;
21 | background-color: #ca2222;
22 | -webkit-transition: .4s;
23 | transition: .4s;
24 | }
25 |
26 | .slider:before {
27 | position: absolute;
28 | content: "";
29 | height: 26px;
30 | width: 26px;
31 | left: 4px;
32 | bottom: 4px;
33 | background-color: white;
34 | -webkit-transition: .4s;
35 | transition: .4s;
36 | }
37 |
38 | input:checked + .slider {
39 | background-color: #2ab934;
40 | }
41 |
42 | input:focus + .slider {
43 | box-shadow: 0 0 1px #2196F3;
44 | }
45 |
46 | input:checked + .slider:before {
47 | -webkit-transform: translateX(55px);
48 | -ms-transform: translateX(55px);
49 | transform: translateX(55px);
50 | }
51 |
52 | /*------ ADDED CSS ---------*/
53 | .on {
54 |   display: none;
55 | }
56 |
57 | .on, .off {
58 |   color: white;
59 |   position: absolute;
60 |   transform: translate(-50%, -50%);
61 |   top: 50%;
62 |   left: 50%;
63 |   font-size: 10px;
64 |   font-family: Verdana, sans-serif;
65 | }
66 |
67 | input:checked + .slider .on {
68 |   display: block;
69 | }
70 |
71 | input:checked + .slider .off {
72 |   display: none;
73 | }
74 |
75 | /*--------- END --------*/
76 |
77 | /* Rounded sliders */
78 | .slider.round {
79 | border-radius: 34px;
80 | }
81 |
82 | .slider.round:before { border-radius: 50%; }
83 |
84 |
--------------------------------------------------------------------------------
/gui/src/client/cheap-ruler.min.js:
--------------------------------------------------------------------------------
1 | !function(t){if("object"==typeof exports&&"undefined"!=typeof module)module.exports=t();else if("function"==typeof define&&define.amd)define([],t);else{("undefined"!=typeof window?window:"undefined"!=typeof global?global:"undefined"!=typeof self?self:this).cheapRuler=t()}}(function(){return function t(n,e,r){function i(u,f){if(!e[u]){if(!n[u]){var a="function"==typeof require&&require;if(!f&&a)return a(u,!0);if(o)return o(u,!0);var s=new Error("Cannot find module '"+u+"'");throw s.code="MODULE_NOT_FOUND",s}var h=e[u]={exports:{}};n[u][0].call(h.exports,function(t){var e=n[u][1][t];return i(e||t)},h,h.exports,t,n,e,r)}return e[u].exports}for(var o="function"==typeof require&&require,u=0;u180&&(i-=360),i},destination:function(t,n,e){var r=(90-e)*Math.PI/180;return this.offset(t,Math.cos(r)*n,Math.sin(r)*n)},offset:function(t,n,e){return[t[0]+n/this.kx,t[1]+e/this.ky]},lineDistance:function(t){for(var n=0,e=0;en)return u(i,o,(n-(e-f))/f)}return t[t.length-1]},pointOnLine:function(t,n){for(var e,r,i,o,u=1/0,f=0;f1?(a=t[f+1][0],s=t[f+1][1]):l>0&&(a+=h/this.kx*l,s+=c/this.ky*l)}var d=(h=(n[0]-a)*this.kx)*h+(c=(n[1]-s)*this.ky)*c;di.index||r.index===i.index&&r.t>i.t){var u=r;r=i,i=u}var f=[r.point],a=r.index+1,s=i.index;!o(e[a],f[0])&&a<=s&&f.push(e[a]);for(var h=a+1;h<=s;h++)f.push(e[h]);return o(e[s],i.point)||f.push(i.point),f},lineSliceAlong:function(t,n,e){for(var r=0,i=[],o=0;ot&&0===i.length&&i.push(u(f,a,(t-(r-s))/s)),r>=n)return i.push(u(f,a,(n-(r-s))/s)),i;r>t&&i.push(a)}return i},bufferPoint:function(t,n){var e=n/this.ky,r=n/this.kx;return[t[0]-r,t[1]-e,t[0]+r,t[1]+e]},bufferBBox:function(t,n){var e=n/this.ky,r=n/this.kx;return[t[0]-r,t[1]-e,t[2]+r,t[3]+e]},insideBBox:function(t,n){return t[0]>=n[0]&&t[0]<=n[2]&&t[1]>=n[1]&&t[1]<=n[3]}}},{}]},{},[1])(1)});
2 |
--------------------------------------------------------------------------------
/gui/src/client/format-tracklets.js:
--------------------------------------------------------------------------------
1 | function compare(a, b) { // sort helper: orders tracklets by their timestamp (element [2])
2 | if (a[2] < b[2])
3 | return -1;
4 | if (a[2] > b[2])
5 | return 1;
6 | return 0;
7 | }
8 |
9 | export function formatTracklets(trackableObjectsArray, loopTime, loopLength) {
10 |
11 | for (var i=0; i < trackableObjectsArray.length; i++) { // go through each trackableObject in trackableObjectsArray
12 |
13 | var trackableObject = trackableObjectsArray[i];
14 |
15 | // if (trackableObject.rw_coords.length < 2) {
16 | // continue; // if there's only one position, there's no point in trying to track it
17 | // }
18 |
30 | // Sort the tracklets by frame_timestamp
31 | var tracklets = trackableObject.tracklets; // get each tracklet for this trackableObject with each frame_timestamp
32 | if(!trackableObject.sorted) { // only sort if it hasn't been sorted yet
33 | tracklets.sort(compare); // sort trackable by frame_timestamp
34 | trackableObject.tracklets = tracklets; // change "tracklets" property to the sorted version
35 | trackableObject.sorted = true; // mark it as sorted
36 |       trackableObjectsArray[i] = trackableObject; // update its version in the main array
37 | }
38 |
39 | }
40 |
41 | // trackableObjectsArray = truncateTimestamp(trackableObjectsArray, loopTime, loopLength);
42 | return trackableObjectsArray;
43 | }
44 |
45 | function truncateTimestamp(trackableObjectsArray, loopTime, loopLength) {
46 |
47 | for (var i=0; i < trackableObjectsArray.length; i++) { // go through each trackableObject in trackableObjectsArray
48 |
49 | var trackableObject = trackableObjectsArray[i];
50 | var tracklets = trackableObject.tracklets; // get each tracklet for this trackableObject with each frame_timestamp
51 |      for (var j=0; j < tracklets.length; j++) { // go through every recorded coordinate of that trackable object
52 |
53 | var frame_timestamp = tracklets[j][2];
54 |
55 | frame_timestamp = frame_timestamp / 1000;
56 |
57 | tracklets[j][2] = ((frame_timestamp % loopTime) / loopTime) * loopLength;
58 |
59 | }
60 |
61 | // attach modified tracklets to trackableObject
62 | trackableObject.tracklets = tracklets;
63 |
64 | // attach modified trackableObject to trackableObjectsArray
65 | trackableObjectsArray[i] = trackableObject;
66 |
67 | }
68 |
69 | return trackableObjectsArray;
70 |
71 | }
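The wrap-around mapping used above can be checked in isolation. This is a minimal sketch of the same arithmetic; `toLoopTime` and the sample values (a 60 s loop mapped onto 1800 animation units) are hypothetical, chosen only to illustrate the formula:

```javascript
// Map a millisecond timestamp into a repeating animation loop:
// the loop covers loopTime seconds of real time and loopLength animation units.
function toLoopTime(frameTimestampMs, loopTime, loopLength) {
  const seconds = frameTimestampMs / 1000;
  return ((seconds % loopTime) / loopTime) * loopLength;
}

const a = toLoopTime(5000, 60, 1800);  // 5 s into a 60 s loop -> 150 units
const b = toLoopTime(65000, 60, 1800); // 65 s wraps around to the same point
```

Timestamps one full `loopTime` apart land on the same loop position, which is what lets a finite recording replay continuously.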
72 |
73 |
74 | export function interpolateTracklets(sortedTrackletsOfObject, clockTimestamp, ruler) { // expects the tracklets of a single object, sorted by timestamp (t1, t2, ...)
75 |
76 |     // Declared before the loop because they are read again after it.
77 |     var useable = false;
78 |     var start_timestamp, end_timestamp;
79 |     var start_lng, start_lat;
80 |     var end_lng, end_lat;
81 |
82 |     // Find two successive tracklets whose timestamps bracket clockTimestamp.
83 |     for (var j=0; j < sortedTrackletsOfObject.length-1; j++) {
84 |
85 |
86 | if (sortedTrackletsOfObject[j][2] <= clockTimestamp && clockTimestamp < sortedTrackletsOfObject[j+1][2]) { // if the clockTimestamp is between the timestamp of two successive coordinates..
87 |
90 |
91 | // .. set the local variables to these coordinates
92 | var start_timestamp = sortedTrackletsOfObject[j][2];
93 | var end_timestamp = sortedTrackletsOfObject[j+1][2];
94 |
95 | var start_lng = sortedTrackletsOfObject[j][0];
96 | var start_lat = sortedTrackletsOfObject[j][1];
97 |
98 | var end_lng = sortedTrackletsOfObject[j+1][0];
99 | var end_lat = sortedTrackletsOfObject[j+1][1];
100 |
101 | useable = true; // let the next block know it can run
102 |
103 | break; // no need to continue
104 | }
105 | }
106 |
107 |     if (useable === true) {
108 |
109 |         // Interpolate linearly between the two bracketing coordinates,
110 |         // proportionally to how much of the time interval has elapsed.
111 |         var line = [[start_lng, start_lat], [end_lng, end_lat]];
112 |         var distance = ruler.distance(line[0], line[1]); // length of the bracketing segment
113 |
114 |         var progress = (clockTimestamp-start_timestamp) / (end_timestamp-start_timestamp);
115 |         var distance_along = progress*distance;
116 |
117 |         // ruler.along(line, dist) returns the [lng, lat] point that lies
118 |         // dist units along the line
119 |         var point = ruler.along(line, distance_along);
120 |
125 | return {"lng": point[0], "lat": point[1]};
126 |
127 | } else {
128 | return null;
129 | }
130 | }
131 |
132 | export function interpolatePositionAndBearing(sortedTrackletsOfObject, clockTimestamp, ruler, inMilliseconds) { // Only to be used with tracklets sorted by timestamp (t1,t2 ...)
133 |
134 |     var advance; // small time step used to probe a second position for the bearing
135 |     var bearing;
136 |
137 |     if (inMilliseconds === true) {
138 |         advance = 2; // 2 ms when timestamps are in milliseconds
139 |     } else {
140 |         advance = 0.002; // the same 2 ms expressed in seconds
141 |     }
142 |
143 | const interpolatedCurrentPosition = interpolateTracklets(sortedTrackletsOfObject, clockTimestamp, ruler);
144 | // get the next position
145 | const interpolatedNextPosition = interpolateTracklets(sortedTrackletsOfObject, clockTimestamp+advance, ruler);
146 |
147 | if (interpolatedCurrentPosition !== null && interpolatedNextPosition !== null) {
148 |
149 | // determine bearing
150 |     // ruler.bearing([lng, lat], [lng, lat]) returns the bearing between two points in degrees
151 |     bearing = ruler.bearing( [interpolatedCurrentPosition.lng, interpolatedCurrentPosition.lat], [interpolatedNextPosition.lng, interpolatedNextPosition.lat] );
152 |
153 | return [interpolatedCurrentPosition, bearing];
154 |
155 | } else {
156 | return [null, null];
157 | }
158 |
159 |
160 | }
161 |
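The bracketing-and-lerp logic of `interpolateTracklets` can be exercised in isolation. The sketch below mirrors that logic; `flatRuler`, `interpolateAt`, and the sample tracklets are hypothetical stand-ins for illustration only — the real code uses cheap-ruler, whose `distance`/`along` operate on geographic units:

```javascript
// Tracklets are [lng, lat, timestamp] triples, sorted by timestamp.
const tracklets = [
  [-73.990, 40.730, 1000],
  [-73.980, 40.740, 3000],
];

// Stub exposing the two cheap-ruler methods the interpolation relies on:
// distance(a, b) returns the segment length, along(line, d) walks d units along it.
const flatRuler = {
  distance: (a, b) => Math.hypot(b[0] - a[0], b[1] - a[1]),
  along: (line, d) => {
    const [a, b] = line;
    const total = Math.hypot(b[0] - a[0], b[1] - a[1]);
    const t = total === 0 ? 0 : d / total;
    return [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t];
  },
};

// Same bracketing-and-lerp logic as interpolateTracklets.
function interpolateAt(sorted, clockTimestamp, ruler) {
  for (let j = 0; j < sorted.length - 1; j++) {
    if (sorted[j][2] <= clockTimestamp && clockTimestamp < sorted[j + 1][2]) {
      const line = [[sorted[j][0], sorted[j][1]], [sorted[j + 1][0], sorted[j + 1][1]]];
      const progress = (clockTimestamp - sorted[j][2]) / (sorted[j + 1][2] - sorted[j][2]);
      const point = ruler.along(line, progress * ruler.distance(line[0], line[1]));
      return { lng: point[0], lat: point[1] };
    }
  }
  return null; // clockTimestamp falls outside the recorded window
}

const mid = interpolateAt(tracklets, 2000, flatRuler); // midpoint in time
```

At `clockTimestamp = 2000`, halfway between the two samples, the interpolated position lands halfway along the segment; timestamps outside the recorded window return `null`, which is why callers such as `interpolatePositionAndBearing` null-check both probes before computing a bearing.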
--------------------------------------------------------------------------------
/gui/src/client/mapbox-gl-0.48.0.css:
--------------------------------------------------------------------------------
1 | .mapboxgl-map {
2 | font: 12px/20px 'Helvetica Neue', Arial, Helvetica, sans-serif;
3 | overflow: hidden;
4 | position: relative;
5 | -webkit-tap-highlight-color: rgba(0, 0, 0, 0);
6 | }
7 |
8 | .mapboxgl-map:-webkit-full-screen {
9 | width: 100%;
10 | height: 100%;
11 | }
12 |
13 | .mapboxgl-canary {
14 | background-color: salmon;
15 | }
16 |
17 | .mapboxgl-canvas-container.mapboxgl-interactive,
18 | .mapboxgl-ctrl-group > button.mapboxgl-ctrl-compass {
19 | cursor: -webkit-grab;
20 | cursor: -moz-grab;
21 | cursor: grab;
22 | -moz-user-select: none;
23 | -webkit-user-select: none;
24 | -ms-user-select: none;
25 | user-select: none;
26 | }
27 |
28 | .mapboxgl-canvas-container.mapboxgl-interactive:active,
29 | .mapboxgl-ctrl-group > button.mapboxgl-ctrl-compass:active {
30 | cursor: -webkit-grabbing;
31 | cursor: -moz-grabbing;
32 | cursor: grabbing;
33 | }
34 |
35 | .mapboxgl-canvas-container.mapboxgl-touch-zoom-rotate,
36 | .mapboxgl-canvas-container.mapboxgl-touch-zoom-rotate .mapboxgl-canvas {
37 | touch-action: pan-x pan-y;
38 | }
39 |
40 | .mapboxgl-canvas-container.mapboxgl-touch-drag-pan,
41 | .mapboxgl-canvas-container.mapboxgl-touch-drag-pan .mapboxgl-canvas {
42 | touch-action: pinch-zoom;
43 | }
44 |
45 | .mapboxgl-canvas-container.mapboxgl-touch-zoom-rotate.mapboxgl-touch-drag-pan,
46 | .mapboxgl-canvas-container.mapboxgl-touch-zoom-rotate.mapboxgl-touch-drag-pan .mapboxgl-canvas {
47 | touch-action: none;
48 | }
49 |
50 | .mapboxgl-ctrl-top-left,
51 | .mapboxgl-ctrl-top-right,
52 | .mapboxgl-ctrl-bottom-left,
53 | .mapboxgl-ctrl-bottom-right { position: absolute; pointer-events: none; z-index: 2; }
54 | .mapboxgl-ctrl-top-left { top: 0; left: 0; }
55 | .mapboxgl-ctrl-top-right { top: 0; right: 0; }
56 | .mapboxgl-ctrl-bottom-left { bottom: 0; left: 0; }
57 | .mapboxgl-ctrl-bottom-right { right: 0; bottom: 0; }
58 |
59 | .mapboxgl-ctrl { clear: both; pointer-events: auto; }
60 | .mapboxgl-ctrl-top-left .mapboxgl-ctrl { margin: 10px 0 0 10px; float: left; }
61 | .mapboxgl-ctrl-top-right .mapboxgl-ctrl { margin: 10px 10px 0 0; float: right; }
62 | .mapboxgl-ctrl-bottom-left .mapboxgl-ctrl { margin: 0 0 10px 10px; float: left; }
63 | .mapboxgl-ctrl-bottom-right .mapboxgl-ctrl { margin: 0 10px 10px 0; float: right; }
64 |
65 | .mapboxgl-ctrl-group {
66 | border-radius: 4px;
67 | -moz-box-shadow: 0 0 2px rgba(0, 0, 0, 0.1);
68 | -webkit-box-shadow: 0 0 2px rgba(0, 0, 0, 0.1);
69 | box-shadow: 0 0 0 2px rgba(0, 0, 0, 0.1);
70 | overflow: hidden;
71 | background: #fff;
72 | }
73 |
74 | .mapboxgl-ctrl-group > button {
75 | width: 30px;
76 | height: 30px;
77 | display: block;
78 | padding: 0;
79 | outline: none;
80 | border: 0;
81 | box-sizing: border-box;
82 | background-color: transparent;
83 | cursor: pointer;
84 | }
85 |
86 | .mapboxgl-ctrl-group > button + button {
87 | border-top: 1px solid #ddd;
88 | }
89 |
90 | /* https://bugzilla.mozilla.org/show_bug.cgi?id=140562 */
91 | .mapboxgl-ctrl > button::-moz-focus-inner {
92 | border: 0;
93 | padding: 0;
94 | }
95 |
96 | .mapboxgl-ctrl > button:hover {
97 | background-color: rgba(0, 0, 0, 0.05);
98 | }
99 |
100 | .mapboxgl-ctrl-icon,
101 | .mapboxgl-ctrl-icon > .mapboxgl-ctrl-compass-arrow {
102 | speak: none;
103 | -webkit-font-smoothing: antialiased;
104 | -moz-osx-font-smoothing: grayscale;
105 | }
106 |
107 | .mapboxgl-ctrl-icon {
108 | padding: 5px;
109 | }
110 |
111 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-zoom-out {
112 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpath style='fill:%23333333;' d='m 7,9 c -0.554,0 -1,0.446 -1,1 0,0.554 0.446,1 1,1 l 6,0 c 0.554,0 1,-0.446 1,-1 0,-0.554 -0.446,-1 -1,-1 z'/%3E %3C/svg%3E");
113 | }
114 |
115 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-zoom-in {
116 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpath style='fill:%23333333;' d='M 10 6 C 9.446 6 9 6.4459904 9 7 L 9 9 L 7 9 C 6.446 9 6 9.446 6 10 C 6 10.554 6.446 11 7 11 L 9 11 L 9 13 C 9 13.55401 9.446 14 10 14 C 10.554 14 11 13.55401 11 13 L 11 11 L 13 11 C 13.554 11 14 10.554 14 10 C 14 9.446 13.554 9 13 9 L 11 9 L 11 7 C 11 6.4459904 10.554 6 10 6 z'/%3E %3C/svg%3E");
117 | }
118 |
119 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate {
120 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%23333'%3E %3Cpath d='M10 4C9 4 9 5 9 5L9 5.1A5 5 0 0 0 5.1 9L5 9C5 9 4 9 4 10 4 11 5 11 5 11L5.1 11A5 5 0 0 0 9 14.9L9 15C9 15 9 16 10 16 11 16 11 15 11 15L11 14.9A5 5 0 0 0 14.9 11L15 11C15 11 16 11 16 10 16 9 15 9 15 9L14.9 9A5 5 0 0 0 11 5.1L11 5C11 5 11 4 10 4zM10 6.5A3.5 3.5 0 0 1 13.5 10 3.5 3.5 0 0 1 10 13.5 3.5 3.5 0 0 1 6.5 10 3.5 3.5 0 0 1 10 6.5zM10 8.3A1.8 1.8 0 0 0 8.3 10 1.8 1.8 0 0 0 10 11.8 1.8 1.8 0 0 0 11.8 10 1.8 1.8 0 0 0 10 8.3z'/%3E %3C/svg%3E");
121 | }
122 |
123 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate:disabled {
124 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%23aaa'%3E %3Cpath d='M10 4C9 4 9 5 9 5L9 5.1A5 5 0 0 0 5.1 9L5 9C5 9 4 9 4 10 4 11 5 11 5 11L5.1 11A5 5 0 0 0 9 14.9L9 15C9 15 9 16 10 16 11 16 11 15 11 15L11 14.9A5 5 0 0 0 14.9 11L15 11C15 11 16 11 16 10 16 9 15 9 15 9L14.9 9A5 5 0 0 0 11 5.1L11 5C11 5 11 4 10 4zM10 6.5A3.5 3.5 0 0 1 13.5 10 3.5 3.5 0 0 1 10 13.5 3.5 3.5 0 0 1 6.5 10 3.5 3.5 0 0 1 10 6.5zM10 8.3A1.8 1.8 0 0 0 8.3 10 1.8 1.8 0 0 0 10 11.8 1.8 1.8 0 0 0 11.8 10 1.8 1.8 0 0 0 10 8.3z'/%3E %3C/svg%3E");
125 | }
126 |
127 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate.mapboxgl-ctrl-geolocate-active {
128 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%2333b5e5'%3E %3Cpath d='M10 4C9 4 9 5 9 5L9 5.1A5 5 0 0 0 5.1 9L5 9C5 9 4 9 4 10 4 11 5 11 5 11L5.1 11A5 5 0 0 0 9 14.9L9 15C9 15 9 16 10 16 11 16 11 15 11 15L11 14.9A5 5 0 0 0 14.9 11L15 11C15 11 16 11 16 10 16 9 15 9 15 9L14.9 9A5 5 0 0 0 11 5.1L11 5C11 5 11 4 10 4zM10 6.5A3.5 3.5 0 0 1 13.5 10 3.5 3.5 0 0 1 10 13.5 3.5 3.5 0 0 1 6.5 10 3.5 3.5 0 0 1 10 6.5zM10 8.3A1.8 1.8 0 0 0 8.3 10 1.8 1.8 0 0 0 10 11.8 1.8 1.8 0 0 0 11.8 10 1.8 1.8 0 0 0 10 8.3z'/%3E %3C/svg%3E");
129 | }
130 |
131 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate.mapboxgl-ctrl-geolocate-active-error {
132 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%23e58978'%3E %3Cpath d='M10 4C9 4 9 5 9 5L9 5.1A5 5 0 0 0 5.1 9L5 9C5 9 4 9 4 10 4 11 5 11 5 11L5.1 11A5 5 0 0 0 9 14.9L9 15C9 15 9 16 10 16 11 16 11 15 11 15L11 14.9A5 5 0 0 0 14.9 11L15 11C15 11 16 11 16 10 16 9 15 9 15 9L14.9 9A5 5 0 0 0 11 5.1L11 5C11 5 11 4 10 4zM10 6.5A3.5 3.5 0 0 1 13.5 10 3.5 3.5 0 0 1 10 13.5 3.5 3.5 0 0 1 6.5 10 3.5 3.5 0 0 1 10 6.5zM10 8.3A1.8 1.8 0 0 0 8.3 10 1.8 1.8 0 0 0 10 11.8 1.8 1.8 0 0 0 11.8 10 1.8 1.8 0 0 0 10 8.3z'/%3E %3C/svg%3E");
133 | }
134 |
135 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate.mapboxgl-ctrl-geolocate-background {
136 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%2333b5e5'%3E %3Cpath d='M 10,4 C 9,4 9,5 9,5 L 9,5.1 C 7.0357113,5.5006048 5.5006048,7.0357113 5.1,9 L 5,9 c 0,0 -1,0 -1,1 0,1 1,1 1,1 l 0.1,0 c 0.4006048,1.964289 1.9357113,3.499395 3.9,3.9 L 9,15 c 0,0 0,1 1,1 1,0 1,-1 1,-1 l 0,-0.1 c 1.964289,-0.400605 3.499395,-1.935711 3.9,-3.9 l 0.1,0 c 0,0 1,0 1,-1 C 16,9 15,9 15,9 L 14.9,9 C 14.499395,7.0357113 12.964289,5.5006048 11,5.1 L 11,5 c 0,0 0,-1 -1,-1 z m 0,2.5 c 1.932997,0 3.5,1.5670034 3.5,3.5 0,1.932997 -1.567003,3.5 -3.5,3.5 C 8.0670034,13.5 6.5,11.932997 6.5,10 6.5,8.0670034 8.0670034,6.5 10,6.5 Z'/%3E %3C/svg%3E");
137 | }
138 |
139 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate.mapboxgl-ctrl-geolocate-background-error {
140 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg' fill='%23e54e33'%3E %3Cpath d='M 10,4 C 9,4 9,5 9,5 L 9,5.1 C 7.0357113,5.5006048 5.5006048,7.0357113 5.1,9 L 5,9 c 0,0 -1,0 -1,1 0,1 1,1 1,1 l 0.1,0 c 0.4006048,1.964289 1.9357113,3.499395 3.9,3.9 L 9,15 c 0,0 0,1 1,1 1,0 1,-1 1,-1 l 0,-0.1 c 1.964289,-0.400605 3.499395,-1.935711 3.9,-3.9 l 0.1,0 c 0,0 1,0 1,-1 C 16,9 15,9 15,9 L 14.9,9 C 14.499395,7.0357113 12.964289,5.5006048 11,5.1 L 11,5 c 0,0 0,-1 -1,-1 z m 0,2.5 c 1.932997,0 3.5,1.5670034 3.5,3.5 0,1.932997 -1.567003,3.5 -3.5,3.5 C 8.0670034,13.5 6.5,11.932997 6.5,10 6.5,8.0670034 8.0670034,6.5 10,6.5 Z'/%3E %3C/svg%3E");
141 | }
142 |
143 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-geolocate.mapboxgl-ctrl-geolocate-waiting {
144 | -webkit-animation: mapboxgl-spin 2s infinite linear;
145 | -moz-animation: mapboxgl-spin 2s infinite linear;
146 | -o-animation: mapboxgl-spin 2s infinite linear;
147 | -ms-animation: mapboxgl-spin 2s infinite linear;
148 | animation: mapboxgl-spin 2s infinite linear;
149 | }
150 |
151 | @-webkit-keyframes mapboxgl-spin {
152 | 0% { -webkit-transform: rotate(0deg); }
153 | 100% { -webkit-transform: rotate(360deg); }
154 | }
155 |
156 | @-moz-keyframes mapboxgl-spin {
157 | 0% { -moz-transform: rotate(0deg); }
158 | 100% { -moz-transform: rotate(360deg); }
159 | }
160 |
161 | @-o-keyframes mapboxgl-spin {
162 | 0% { -o-transform: rotate(0deg); }
163 | 100% { -o-transform: rotate(360deg); }
164 | }
165 |
166 | @-ms-keyframes mapboxgl-spin {
167 | 0% { -ms-transform: rotate(0deg); }
168 | 100% { -ms-transform: rotate(360deg); }
169 | }
170 |
171 | @keyframes mapboxgl-spin {
172 | 0% { transform: rotate(0deg); }
173 | 100% { transform: rotate(360deg); }
174 | }
175 |
176 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-fullscreen {
177 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpath d='M 5 4 C 4.5 4 4 4.5 4 5 L 4 6 L 4 9 L 4.5 9 L 5.7773438 7.296875 C 6.7771319 8.0602131 7.835765 8.9565728 8.890625 10 C 7.8257121 11.0633 6.7761791 11.951675 5.78125 12.707031 L 4.5 11 L 4 11 L 4 15 C 4 15.5 4.5 16 5 16 L 9 16 L 9 15.5 L 7.2734375 14.205078 C 8.0428931 13.187886 8.9395441 12.133481 9.9609375 11.068359 C 11.042371 12.14699 11.942093 13.2112 12.707031 14.21875 L 11 15.5 L 11 16 L 14 16 L 15 16 C 15.5 16 16 15.5 16 15 L 16 14 L 16 11 L 15.5 11 L 14.205078 12.726562 C 13.177985 11.949617 12.112718 11.043577 11.037109 10.009766 C 12.151856 8.981061 13.224345 8.0798624 14.228516 7.3046875 L 15.5 9 L 16 9 L 16 5 C 16 4.5 15.5 4 15 4 L 11 4 L 11 4.5 L 12.703125 5.7773438 C 11.932647 6.7864834 11.026693 7.8554712 9.9707031 8.9199219 C 8.9584739 7.8204943 8.0698767 6.7627188 7.3046875 5.7714844 L 9 4.5 L 9 4 L 6 4 L 5 4 z '/%3E %3C/svg%3E");
178 | }
179 |
180 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-shrink {
181 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpath style='fill:%23000000;' d='M 4.2421875 3.4921875 A 0.750075 0.750075 0 0 0 3.71875 4.78125 L 5.9648438 7.0273438 L 4 8.5 L 4 9 L 8 9 C 8.500001 8.9999988 9 8.4999992 9 8 L 9 4 L 8.5 4 L 7.0175781 5.9550781 L 4.78125 3.71875 A 0.750075 0.750075 0 0 0 4.2421875 3.4921875 z M 15.734375 3.4921875 A 0.750075 0.750075 0 0 0 15.21875 3.71875 L 12.984375 5.953125 L 11.5 4 L 11 4 L 11 8 C 11 8.4999992 11.499999 8.9999988 12 9 L 16 9 L 16 8.5 L 14.035156 7.0273438 L 16.28125 4.78125 A 0.750075 0.750075 0 0 0 15.734375 3.4921875 z M 4 11 L 4 11.5 L 5.9648438 12.972656 L 3.71875 15.21875 A 0.75130096 0.75130096 0 1 0 4.78125 16.28125 L 7.0273438 14.035156 L 8.5 16 L 9 16 L 9 12 C 9 11.500001 8.500001 11.000001 8 11 L 4 11 z M 12 11 C 11.499999 11.000001 11 11.500001 11 12 L 11 16 L 11.5 16 L 12.972656 14.035156 L 15.21875 16.28125 A 0.75130096 0.75130096 0 1 0 16.28125 15.21875 L 14.035156 12.972656 L 16 11.5 L 16 11 L 12 11 z '/%3E %3C/svg%3E");
182 | }
183 |
184 | .mapboxgl-ctrl-icon.mapboxgl-ctrl-compass > .mapboxgl-ctrl-compass-arrow {
185 | width: 20px;
186 | height: 20px;
187 | margin: 5px;
188 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpolygon fill='%23333333' points='6,9 10,1 14,9'/%3E %3Cpolygon fill='%23CCCCCC' points='6,11 10,19 14,11 '/%3E %3C/svg%3E");
189 | background-repeat: no-repeat;
190 | display: inline-block;
191 | }
192 |
193 | a.mapboxgl-ctrl-logo {
194 | width: 85px;
195 | height: 21px;
196 | margin: 0 0 -3px -3px;
197 | display: block;
198 | background-repeat: no-repeat;
199 | cursor: pointer;
200 | background-image: url("data:image/svg+xml;charset=utf-8,%3C?xml version='1.0' encoding='utf-8'?%3E%3Csvg version='1.1' id='Layer_1' xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' x='0px' y='0px' viewBox='0 0 84.49 21' style='enable-background:new 0 0 84.49 21;' xml:space='preserve'%3E%3Cg%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M83.25,14.26c0,0.12-0.09,0.21-0.21,0.21h-1.61c-0.13,0-0.24-0.06-0.3-0.17l-1.44-2.39l-1.44,2.39 c-0.06,0.11-0.18,0.17-0.3,0.17h-1.61c-0.04,0-0.08-0.01-0.12-0.03c-0.09-0.06-0.13-0.19-0.06-0.28l0,0l2.43-3.68L76.2,6.84 c-0.02-0.03-0.03-0.07-0.03-0.12c0-0.12,0.09-0.21,0.21-0.21h1.61c0.13,0,0.24,0.06,0.3,0.17l1.41,2.36l1.4-2.35 c0.06-0.11,0.18-0.17,0.3-0.17H83c0.04,0,0.08,0.01,0.12,0.03c0.09,0.06,0.13,0.19,0.06,0.28l0,0l-2.37,3.63l2.43,3.67 C83.24,14.18,83.25,14.22,83.25,14.26z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M66.24,9.59c-0.39-1.88-1.96-3.28-3.84-3.28c-1.03,0-2.03,0.42-2.73,1.18V3.51c0-0.13-0.1-0.23-0.23-0.23h-1.4 c-0.13,0-0.23,0.11-0.23,0.23v10.72c0,0.13,0.1,0.23,0.23,0.23h1.4c0.13,0,0.23-0.11,0.23-0.23V13.5c0.71,0.75,1.7,1.18,2.73,1.18 c1.88,0,3.45-1.41,3.84-3.29C66.37,10.79,66.37,10.18,66.24,9.59L66.24,9.59z M62.08,13c-1.32,0-2.39-1.11-2.41-2.48v-0.06 c0.02-1.38,1.09-2.48,2.41-2.48s2.42,1.12,2.42,2.51S63.41,13,62.08,13z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M71.67,6.32c-1.98-0.01-3.72,1.35-4.16,3.29c-0.13,0.59-0.13,1.19,0,1.77c0.44,1.94,2.17,3.32,4.17,3.3 c2.35,0,4.26-1.87,4.26-4.19S74.04,6.32,71.67,6.32z M71.65,13.01c-1.33,0-2.42-1.12-2.42-2.51s1.08-2.52,2.42-2.52 c1.33,0,2.42,1.12,2.42,2.51S72.99,13,71.65,13.01L71.65,13.01z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M62.08,7.98c-1.32,0-2.39,1.11-2.41,2.48v0.06C59.68,11.9,60.75,13,62.08,13s2.42-1.12,2.42-2.51 S63.41,7.98,62.08,7.98z 
M62.08,11.76c-0.63,0-1.14-0.56-1.17-1.25v-0.04c0.01-0.69,0.54-1.25,1.17-1.25 c0.63,0,1.17,0.57,1.17,1.27C63.24,11.2,62.73,11.76,62.08,11.76z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M71.65,7.98c-1.33,0-2.42,1.12-2.42,2.51S70.32,13,71.65,13s2.42-1.12,2.42-2.51S72.99,7.98,71.65,7.98z M71.65,11.76c-0.64,0-1.17-0.57-1.17-1.27c0-0.7,0.53-1.26,1.17-1.26s1.17,0.57,1.17,1.27C72.82,11.21,72.29,11.76,71.65,11.76z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M45.74,6.53h-1.4c-0.13,0-0.23,0.11-0.23,0.23v0.73c-0.71-0.75-1.7-1.18-2.73-1.18 c-2.17,0-3.94,1.87-3.94,4.19s1.77,4.19,3.94,4.19c1.04,0,2.03-0.43,2.73-1.19v0.73c0,0.13,0.1,0.23,0.23,0.23h1.4 c0.13,0,0.23-0.11,0.23-0.23V6.74c0-0.12-0.09-0.22-0.22-0.22C45.75,6.53,45.75,6.53,45.74,6.53z M44.12,10.53 C44.11,11.9,43.03,13,41.71,13s-2.42-1.12-2.42-2.51s1.08-2.52,2.4-2.52c1.33,0,2.39,1.11,2.41,2.48L44.12,10.53z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M41.71,7.98c-1.33,0-2.42,1.12-2.42,2.51S40.37,13,41.71,13s2.39-1.11,2.41-2.48v-0.06 C44.1,9.09,43.03,7.98,41.71,7.98z M40.55,10.49c0-0.7,0.52-1.27,1.17-1.27c0.64,0,1.14,0.56,1.17,1.25v0.04 c-0.01,0.68-0.53,1.24-1.17,1.24C41.08,11.75,40.55,11.19,40.55,10.49z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M52.41,6.32c-1.03,0-2.03,0.42-2.73,1.18V6.75c0-0.13-0.1-0.23-0.23-0.23h-1.4c-0.13,0-0.23,0.11-0.23,0.23 v10.72c0,0.13,0.1,0.23,0.23,0.23h1.4c0.13,0,0.23-0.1,0.23-0.23V13.5c0.71,0.75,1.7,1.18,2.74,1.18c2.17,0,3.94-1.87,3.94-4.19 S54.58,6.32,52.41,6.32z M52.08,13.01c-1.32,0-2.39-1.11-2.42-2.48v-0.07c0.02-1.38,1.09-2.49,2.4-2.49c1.32,0,2.41,1.12,2.41,2.51 S53.4,13,52.08,13.01L52.08,13.01z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M52.08,7.98c-1.32,0-2.39,1.11-2.42,2.48v0.06c0.03,1.38,1.1,2.48,2.42,2.48s2.41-1.12,2.41-2.51 S53.4,7.98,52.08,7.98z 
M52.08,11.76c-0.63,0-1.14-0.56-1.17-1.25v-0.04c0.01-0.69,0.54-1.25,1.17-1.25c0.63,0,1.17,0.58,1.17,1.27 S52.72,11.76,52.08,11.76z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M36.08,14.24c0,0.13-0.1,0.23-0.23,0.23h-1.41c-0.13,0-0.23-0.11-0.23-0.23V9.68c0-0.98-0.74-1.71-1.62-1.71 c-0.8,0-1.46,0.7-1.59,1.62l0.01,4.66c0,0.13-0.11,0.23-0.23,0.23h-1.41c-0.13,0-0.23-0.11-0.23-0.23V9.68 c0-0.98-0.74-1.71-1.62-1.71c-0.85,0-1.54,0.79-1.6,1.8v4.48c0,0.13-0.1,0.23-0.23,0.23h-1.4c-0.13,0-0.23-0.11-0.23-0.23V6.74 c0.01-0.13,0.1-0.22,0.23-0.22h1.4c0.13,0,0.22,0.11,0.23,0.22V7.4c0.5-0.68,1.3-1.09,2.16-1.1h0.03c1.09,0,2.09,0.6,2.6,1.55 c0.45-0.95,1.4-1.55,2.44-1.56c1.62,0,2.93,1.25,2.9,2.78L36.08,14.24z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M84.34,13.59l-0.07-0.13l-1.96-2.99l1.94-2.95c0.44-0.67,0.26-1.56-0.41-2.02c-0.02,0-0.03,0-0.04-0.01 c-0.23-0.15-0.5-0.22-0.78-0.22h-1.61c-0.56,0-1.08,0.29-1.37,0.78L79.72,6.6l-0.34-0.56C79.09,5.56,78.57,5.27,78,5.27h-1.6 c-0.6,0-1.13,0.37-1.35,0.92c-2.19-1.66-5.28-1.47-7.26,0.45c-0.35,0.34-0.65,0.72-0.89,1.14c-0.9-1.62-2.58-2.72-4.5-2.72 c-0.5,0-1.01,0.07-1.48,0.23V3.51c0-0.82-0.66-1.48-1.47-1.48h-1.4c-0.81,0-1.47,0.66-1.47,1.47v3.75 c-0.95-1.36-2.5-2.18-4.17-2.19c-0.74,0-1.46,0.16-2.12,0.47c-0.24-0.17-0.54-0.26-0.84-0.26h-1.4c-0.45,0-0.87,0.21-1.15,0.56 c-0.02-0.03-0.04-0.05-0.07-0.08c-0.28-0.3-0.68-0.47-1.09-0.47h-1.39c-0.3,0-0.6,0.09-0.84,0.26c-0.67-0.3-1.39-0.46-2.12-0.46 c-1.83,0-3.43,1-4.37,2.5c-0.2-0.46-0.48-0.89-0.83-1.25c-0.8-0.81-1.89-1.25-3.02-1.25h-0.01c-0.89,0.01-1.75,0.33-2.46,0.88 c-0.74-0.57-1.64-0.88-2.57-0.88H28.1c-0.29,0-0.58,0.03-0.86,0.11c-0.28,0.06-0.56,0.16-0.82,0.28c-0.21-0.12-0.45-0.18-0.7-0.18 h-1.4c-0.82,0-1.47,0.66-1.47,1.47v7.5c0,0.82,0.66,1.47,1.47,1.47h1.4c0.82,0,1.48-0.66,1.48-1.48l0,0V9.79 c0.03-0.36,0.23-0.59,0.36-0.59c0.18,0,0.38,0.18,0.38,0.47v4.57c0,0.82,0.66,1.47,1.47,1.47h1.41c0.82,0,1.47-0.66,1.47-1.47 
l-0.01-4.57c0.06-0.32,0.25-0.47,0.35-0.47c0.18,0,0.38,0.18,0.38,0.47v4.57c0,0.82,0.66,1.47,1.47,1.47h1.41 c0.82,0,1.47-0.66,1.47-1.47v-0.38c0.96,1.29,2.46,2.06,4.06,2.06c0.74,0,1.46-0.16,2.12-0.47c0.24,0.17,0.54,0.26,0.84,0.26h1.39 c0.3,0,0.6-0.09,0.84-0.26v2.01c0,0.82,0.66,1.47,1.47,1.47h1.4c0.82,0,1.47-0.66,1.47-1.47v-1.77c0.48,0.15,0.99,0.23,1.49,0.22 c1.7,0,3.22-0.87,4.17-2.2v0.52c0,0.82,0.66,1.47,1.47,1.47h1.4c0.3,0,0.6-0.09,0.84-0.26c0.66,0.31,1.39,0.47,2.12,0.47 c1.92,0,3.6-1.1,4.49-2.73c1.54,2.65,4.95,3.53,7.58,1.98c0.18-0.11,0.36-0.22,0.53-0.36c0.22,0.55,0.76,0.91,1.35,0.9H78 c0.56,0,1.08-0.29,1.37-0.78l0.37-0.61l0.37,0.61c0.29,0.48,0.81,0.78,1.38,0.78h1.6c0.81,0,1.46-0.66,1.45-1.46 C84.49,14.02,84.44,13.8,84.34,13.59L84.34,13.59z M35.86,14.47h-1.41c-0.13,0-0.23-0.11-0.23-0.23V9.68 c0-0.98-0.74-1.71-1.62-1.71c-0.8,0-1.46,0.7-1.59,1.62l0.01,4.66c0,0.13-0.1,0.23-0.23,0.23h-1.41c-0.13,0-0.23-0.11-0.23-0.23 V9.68c0-0.98-0.74-1.71-1.62-1.71c-0.85,0-1.54,0.79-1.6,1.8v4.48c0,0.13-0.1,0.23-0.23,0.23h-1.4c-0.13,0-0.23-0.11-0.23-0.23 V6.74c0.01-0.13,0.11-0.22,0.23-0.22h1.4c0.13,0,0.22,0.11,0.23,0.22V7.4c0.5-0.68,1.3-1.09,2.16-1.1h0.03 c1.09,0,2.09,0.6,2.6,1.55c0.45-0.95,1.4-1.55,2.44-1.56c1.62,0,2.93,1.25,2.9,2.78l0.01,5.16C36.09,14.36,35.98,14.46,35.86,14.47 L35.86,14.47z M45.97,14.24c0,0.13-0.1,0.23-0.23,0.23h-1.4c-0.13,0-0.23-0.11-0.23-0.23V13.5c-0.7,0.76-1.69,1.18-2.72,1.18 c-2.17,0-3.94-1.87-3.94-4.19s1.77-4.19,3.94-4.19c1.03,0,2.02,0.43,2.73,1.18V6.74c0-0.13,0.1-0.23,0.23-0.23h1.4 c0.12-0.01,0.22,0.08,0.23,0.21c0,0.01,0,0.01,0,0.02v7.51h-0.01V14.24z M52.41,14.67c-1.03,0-2.02-0.43-2.73-1.18v3.97 c0,0.13-0.1,0.23-0.23,0.23h-1.4c-0.13,0-0.23-0.1-0.23-0.23V6.75c0-0.13,0.1-0.22,0.23-0.22h1.4c0.13,0,0.23,0.11,0.23,0.23v0.73 c0.71-0.76,1.7-1.18,2.73-1.18c2.17,0,3.94,1.86,3.94,4.18S54.58,14.67,52.41,14.67z M66.24,11.39c-0.39,1.87-1.96,3.29-3.84,3.29 
c-1.03,0-2.02-0.43-2.73-1.18v0.73c0,0.13-0.1,0.23-0.23,0.23h-1.4c-0.13,0-0.23-0.11-0.23-0.23V3.51c0-0.13,0.1-0.23,0.23-0.23 h1.4c0.13,0,0.23,0.11,0.23,0.23v3.97c0.71-0.75,1.7-1.18,2.73-1.17c1.88,0,3.45,1.4,3.84,3.28C66.37,10.19,66.37,10.8,66.24,11.39 L66.24,11.39L66.24,11.39z M71.67,14.68c-2,0.01-3.73-1.35-4.17-3.3c-0.13-0.59-0.13-1.19,0-1.77c0.44-1.94,2.17-3.31,4.17-3.3 c2.36,0,4.26,1.87,4.26,4.19S74.03,14.68,71.67,14.68L71.67,14.68z M83.04,14.47h-1.61c-0.13,0-0.24-0.06-0.3-0.17l-1.44-2.39 l-1.44,2.39c-0.06,0.11-0.18,0.17-0.3,0.17h-1.61c-0.04,0-0.08-0.01-0.12-0.03c-0.09-0.06-0.13-0.19-0.06-0.28l0,0l2.43-3.68 L76.2,6.84c-0.02-0.03-0.03-0.07-0.03-0.12c0-0.12,0.09-0.21,0.21-0.21h1.61c0.13,0,0.24,0.06,0.3,0.17l1.41,2.36l1.41-2.36 c0.06-0.11,0.18-0.17,0.3-0.17h1.61c0.04,0,0.08,0.01,0.12,0.03c0.09,0.06,0.13,0.19,0.06,0.28l0,0l-2.38,3.64l2.43,3.67 c0.02,0.03,0.03,0.07,0.03,0.12C83.25,14.38,83.16,14.47,83.04,14.47L83.04,14.47L83.04,14.47z'/%3E %3Cpath class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: new;' d='M10.5,1.24c-5.11,0-9.25,4.15-9.25,9.25s4.15,9.25,9.25,9.25s9.25-4.15,9.25-9.25 C19.75,5.38,15.61,1.24,10.5,1.24z M14.89,12.77c-1.93,1.93-4.78,2.31-6.7,2.31c-0.7,0-1.41-0.05-2.1-0.16c0,0-1.02-5.64,2.14-8.81 c0.83-0.83,1.95-1.28,3.13-1.28c1.27,0,2.49,0.51,3.39,1.42C16.59,8.09,16.64,11,14.89,12.77z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M10.5-0.01C4.7-0.01,0,4.7,0,10.49s4.7,10.5,10.5,10.5S21,16.29,21,10.49C20.99,4.7,16.3-0.01,10.5-0.01z M10.5,19.74c-5.11,0-9.25-4.15-9.25-9.25s4.14-9.26,9.25-9.26s9.25,4.15,9.25,9.25C19.75,15.61,15.61,19.74,10.5,19.74z'/%3E %3Cpath class='st1' style='opacity:0.35; enable-background:new;' d='M14.74,6.25C12.9,4.41,9.98,4.35,8.23,6.1c-3.16,3.17-2.14,8.81-2.14,8.81s5.64,1.02,8.81-2.14 C16.64,11,16.59,8.09,14.74,6.25z M12.47,10.34l-0.91,1.87l-0.9-1.87L8.8,9.43l1.86-0.9l0.9-1.87l0.91,1.87l1.86,0.9L12.47,10.34z'/%3E %3Cpolygon class='st0' style='opacity:0.9; fill: %23FFFFFF; enable-background: 
new;' points='14.33,9.43 12.47,10.34 11.56,12.21 10.66,10.34 8.8,9.43 10.66,8.53 11.56,6.66 12.47,8.53 '/%3E%3C/g%3E%3C/svg%3E");
201 | }
202 |
203 | a.mapboxgl-ctrl-logo.mapboxgl-compact {
204 | width: 21px;
205 | height: 21px;
206 | background-image: url("data:image/svg+xml;charset=utf-8,%3C?xml version='1.0' encoding='utf-8'?%3E %3Csvg version='1.1' id='Layer_1' xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' x='0px' y='0px' viewBox='0 0 21 21' style='enable-background:new 0 0 21 21;' xml:space='preserve'%3E%3Cg transform='translate(0,0.01)'%3E%3Cpath d='m 10.5,1.24 c -5.11,0 -9.25,4.15 -9.25,9.25 0,5.1 4.15,9.25 9.25,9.25 5.1,0 9.25,-4.15 9.25,-9.25 0,-5.11 -4.14,-9.25 -9.25,-9.25 z m 4.39,11.53 c -1.93,1.93 -4.78,2.31 -6.7,2.31 -0.7,0 -1.41,-0.05 -2.1,-0.16 0,0 -1.02,-5.64 2.14,-8.81 0.83,-0.83 1.95,-1.28 3.13,-1.28 1.27,0 2.49,0.51 3.39,1.42 1.84,1.84 1.89,4.75 0.14,6.52 z' style='opacity:0.9;fill:%23ffffff;enable-background:new' class='st0'/%3E%3Cpath d='M 10.5,-0.01 C 4.7,-0.01 0,4.7 0,10.49 c 0,5.79 4.7,10.5 10.5,10.5 5.8,0 10.5,-4.7 10.5,-10.5 C 20.99,4.7 16.3,-0.01 10.5,-0.01 Z m 0,19.75 c -5.11,0 -9.25,-4.15 -9.25,-9.25 0,-5.1 4.14,-9.26 9.25,-9.26 5.11,0 9.25,4.15 9.25,9.25 0,5.13 -4.14,9.26 -9.25,9.26 z' style='opacity:0.35;enable-background:new' class='st1'/%3E%3Cpath d='M 14.74,6.25 C 12.9,4.41 9.98,4.35 8.23,6.1 5.07,9.27 6.09,14.91 6.09,14.91 c 0,0 5.64,1.02 8.81,-2.14 C 16.64,11 16.59,8.09 14.74,6.25 Z m -2.27,4.09 -0.91,1.87 -0.9,-1.87 -1.86,-0.91 1.86,-0.9 0.9,-1.87 0.91,1.87 1.86,0.9 z' style='opacity:0.35;enable-background:new' class='st1'/%3E%3Cpolygon points='11.56,12.21 10.66,10.34 8.8,9.43 10.66,8.53 11.56,6.66 12.47,8.53 14.33,9.43 12.47,10.34 ' style='opacity:0.9;fill:%23ffffff;enable-background:new' class='st0'/%3E%3C/g%3E%3C/svg%3E");
207 | }
208 |
209 | .mapboxgl-ctrl.mapboxgl-ctrl-attrib {
210 | padding: 0 5px;
211 | background-color: rgba(255, 255, 255, 0.5);
212 | margin: 0;
213 | }
214 |
215 | @media screen {
216 | .mapboxgl-ctrl-attrib.mapboxgl-compact {
217 | padding-top: 2px;
218 | padding-bottom: 2px;
219 | margin: 0 10px 10px;
220 | position: relative;
221 | padding-right: 24px;
222 | background-color: #fff;
223 | border-radius: 3px 12px 12px 3px;
224 | visibility: hidden;
225 | }
226 |
227 | .mapboxgl-ctrl-attrib.mapboxgl-compact:hover {
228 | visibility: visible;
229 | }
230 |
231 | .mapboxgl-ctrl-attrib.mapboxgl-compact::after {
232 | content: '';
233 | cursor: pointer;
234 | position: absolute;
235 | bottom: 0;
236 | right: 0;
237 | background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg viewBox='0 0 20 20' xmlns='http://www.w3.org/2000/svg'%3E %3Cpath fill='%23333333' fill-rule='evenodd' d='M4,10a6,6 0 1,0 12,0a6,6 0 1,0 -12,0 M9,7a1,1 0 1,0 2,0a1,1 0 1,0 -2,0 M9,10a1,1 0 1,1 2,0l0,3a1,1 0 1,1 -2,0'/%3E %3C/svg%3E");
238 | background-color: rgba(255, 255, 255, 0.5);
239 | width: 24px;
240 | height: 24px;
241 | box-sizing: border-box;
242 | visibility: visible;
243 | border-radius: 12px;
244 | }
245 | }
246 |
247 | .mapboxgl-ctrl-attrib a {
248 | color: rgba(0, 0, 0, 0.75);
249 | text-decoration: none;
250 | }
251 |
252 | .mapboxgl-ctrl-attrib a:hover {
253 | color: inherit;
254 | text-decoration: underline;
255 | }
256 |
257 | /* stylelint-disable-next-line selector-class-pattern */
258 | .mapboxgl-ctrl-attrib .mapbox-improve-map {
259 | font-weight: bold;
260 | margin-left: 2px;
261 | }
262 |
263 | .mapboxgl-attrib-empty {
264 | display: none;
265 | }
266 |
267 | .mapboxgl-ctrl-scale {
268 | background-color: rgba(255, 255, 255, 0.75);
269 | font-size: 10px;
270 | border-width: medium 2px 2px;
271 | border-style: none solid solid;
272 | border-color: #333;
273 | padding: 0 5px;
274 | color: #333;
275 | box-sizing: border-box;
276 | }
277 |
278 | .mapboxgl-popup {
279 | position: absolute;
280 | top: 0;
281 | left: 0;
282 | display: -webkit-flex;
283 | display: flex;
284 | will-change: transform;
285 | pointer-events: none;
286 | }
287 |
288 | .mapboxgl-popup-anchor-top,
289 | .mapboxgl-popup-anchor-top-left,
290 | .mapboxgl-popup-anchor-top-right {
291 | -webkit-flex-direction: column;
292 | flex-direction: column;
293 | }
294 |
295 | .mapboxgl-popup-anchor-bottom,
296 | .mapboxgl-popup-anchor-bottom-left,
297 | .mapboxgl-popup-anchor-bottom-right {
298 | -webkit-flex-direction: column-reverse;
299 | flex-direction: column-reverse;
300 | }
301 |
302 | .mapboxgl-popup-anchor-left {
303 | -webkit-flex-direction: row;
304 | flex-direction: row;
305 | }
306 |
307 | .mapboxgl-popup-anchor-right {
308 | -webkit-flex-direction: row-reverse;
309 | flex-direction: row-reverse;
310 | }
311 |
312 | .mapboxgl-popup-tip {
313 | width: 0;
314 | height: 0;
315 | border: 10px solid transparent;
316 | z-index: 1;
317 | }
318 |
319 | .mapboxgl-popup-anchor-top .mapboxgl-popup-tip {
320 | -webkit-align-self: center;
321 | align-self: center;
322 | border-top: none;
323 | border-bottom-color: #fff;
324 | }
325 |
326 | .mapboxgl-popup-anchor-top-left .mapboxgl-popup-tip {
327 | -webkit-align-self: flex-start;
328 | align-self: flex-start;
329 | border-top: none;
330 | border-left: none;
331 | border-bottom-color: #fff;
332 | }
333 |
334 | .mapboxgl-popup-anchor-top-right .mapboxgl-popup-tip {
335 | -webkit-align-self: flex-end;
336 | align-self: flex-end;
337 | border-top: none;
338 | border-right: none;
339 | border-bottom-color: #fff;
340 | }
341 |
342 | .mapboxgl-popup-anchor-bottom .mapboxgl-popup-tip {
343 | -webkit-align-self: center;
344 | align-self: center;
345 | border-bottom: none;
346 | border-top-color: #fff;
347 | }
348 |
349 | .mapboxgl-popup-anchor-bottom-left .mapboxgl-popup-tip {
350 | -webkit-align-self: flex-start;
351 | align-self: flex-start;
352 | border-bottom: none;
353 | border-left: none;
354 | border-top-color: #fff;
355 | }
356 |
357 | .mapboxgl-popup-anchor-bottom-right .mapboxgl-popup-tip {
358 | -webkit-align-self: flex-end;
359 | align-self: flex-end;
360 | border-bottom: none;
361 | border-right: none;
362 | border-top-color: #fff;
363 | }
364 |
365 | .mapboxgl-popup-anchor-left .mapboxgl-popup-tip {
366 | -webkit-align-self: center;
367 | align-self: center;
368 | border-left: none;
369 | border-right-color: #fff;
370 | }
371 |
372 | .mapboxgl-popup-anchor-right .mapboxgl-popup-tip {
373 | -webkit-align-self: center;
374 | align-self: center;
375 | border-right: none;
376 | border-left-color: #fff;
377 | }
378 |
379 | .mapboxgl-popup-close-button {
380 | position: absolute;
381 | right: 0;
382 | top: 0;
383 | border: 0;
384 | border-radius: 0 3px 0 0;
385 | cursor: pointer;
386 | background-color: transparent;
387 | }
388 |
389 | .mapboxgl-popup-close-button:hover {
390 | background-color: rgba(0, 0, 0, 0.05);
391 | }
392 |
393 | .mapboxgl-popup-content {
394 | position: relative;
395 | background: #fff;
396 | border-radius: 3px;
397 | box-shadow: 0 1px 2px rgba(0, 0, 0, 0.1);
398 | padding: 10px 10px 15px;
399 | pointer-events: auto;
400 | }
401 |
402 | .mapboxgl-popup-anchor-top-left .mapboxgl-popup-content {
403 | border-top-left-radius: 0;
404 | }
405 |
406 | .mapboxgl-popup-anchor-top-right .mapboxgl-popup-content {
407 | border-top-right-radius: 0;
408 | }
409 |
410 | .mapboxgl-popup-anchor-bottom-left .mapboxgl-popup-content {
411 | border-bottom-left-radius: 0;
412 | }
413 |
414 | .mapboxgl-popup-anchor-bottom-right .mapboxgl-popup-content {
415 | border-bottom-right-radius: 0;
416 | }
417 |
418 | .mapboxgl-marker {
419 | position: absolute;
420 | top: 0;
421 | left: 0;
422 | will-change: transform;
423 | }
424 |
425 | .mapboxgl-user-location-dot {
426 | background-color: #1da1f2;
427 | width: 15px;
428 | height: 15px;
429 | border-radius: 50%;
430 | box-shadow: 0 0 2px rgba(0, 0, 0, 0.25);
431 | }
432 |
433 | .mapboxgl-user-location-dot::before {
434 | background-color: #1da1f2;
435 | content: '';
436 | width: 15px;
437 | height: 15px;
438 | border-radius: 50%;
439 | position: absolute;
440 | -webkit-animation: mapboxgl-user-location-dot-pulse 2s infinite;
441 | -moz-animation: mapboxgl-user-location-dot-pulse 2s infinite;
442 | -ms-animation: mapboxgl-user-location-dot-pulse 2s infinite;
443 | animation: mapboxgl-user-location-dot-pulse 2s infinite;
444 | }
445 |
446 | .mapboxgl-user-location-dot::after {
447 | border-radius: 50%;
448 | border: 2px solid #fff;
449 | content: '';
450 | height: 19px;
451 | left: -2px;
452 | position: absolute;
453 | top: -2px;
454 | width: 19px;
455 | box-sizing: border-box;
456 | }
457 |
458 | @-webkit-keyframes mapboxgl-user-location-dot-pulse {
459 | 0% { -webkit-transform: scale(1); opacity: 1; }
460 | 70% { -webkit-transform: scale(3); opacity: 0; }
461 | 100% { -webkit-transform: scale(1); opacity: 0; }
462 | }
463 |
464 | @-ms-keyframes mapboxgl-user-location-dot-pulse {
465 | 0% { -ms-transform: scale(1); opacity: 1; }
466 | 70% { -ms-transform: scale(3); opacity: 0; }
467 | 100% { -ms-transform: scale(1); opacity: 0; }
468 | }
469 |
470 | @keyframes mapboxgl-user-location-dot-pulse {
471 | 0% { transform: scale(1); opacity: 1; }
472 | 70% { transform: scale(3); opacity: 0; }
473 | 100% { transform: scale(1); opacity: 0; }
474 | }
475 |
476 | .mapboxgl-user-location-dot-stale {
477 | background-color: #aaa;
478 | }
479 |
480 | .mapboxgl-user-location-dot-stale::after {
481 | display: none;
482 | }
483 |
484 | .mapboxgl-crosshair,
485 | .mapboxgl-crosshair .mapboxgl-interactive,
486 | .mapboxgl-crosshair .mapboxgl-interactive:active {
487 | cursor: crosshair;
488 | }
489 |
490 | .mapboxgl-boxzoom {
491 | position: absolute;
492 | top: 0;
493 | left: 0;
494 | width: 0;
495 | height: 0;
496 | background: #fff;
497 | border: 2px dotted #202020;
498 | opacity: 0.5;
499 | }
500 |
501 | @media print {
502 | /* stylelint-disable-next-line selector-class-pattern */
503 | .mapbox-improve-map {
504 | display: none;
505 | }
506 | }
507 |
--------------------------------------------------------------------------------
/gui/src/client/trips-layer-fragment.glsl.js:
--------------------------------------------------------------------------------
1 | // Copyright (c) 2015 - 2017 Uber Technologies, Inc.
2 | //
3 | // Permission is hereby granted, free of charge, to any person obtaining a copy
4 | // of this software and associated documentation files (the "Software"), to deal
5 | // in the Software without restriction, including without limitation the rights
6 | // to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
7 | // copies of the Software, and to permit persons to whom the Software is
8 | // furnished to do so, subject to the following conditions:
9 | //
10 | // The above copyright notice and this permission notice shall be included in
11 | // all copies or substantial portions of the Software.
12 | //
13 | // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 | // IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 | // FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 | // AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 | // LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
18 | // OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
19 | // THE SOFTWARE.
20 |
21 | export default `\
22 | #define SHADER_NAME trips-layer-fragment-shader
23 |
24 | precision highp float;
25 |
26 | varying float vTime;
27 | varying vec4 vColor;
28 |
29 | void main(void) {
30 | if (vTime > 1.0 || vTime < 0.0) {
31 | discard;
32 | }
33 | gl_FragColor = vec4(vColor.rgb, vColor.a * vTime);
34 | }
35 | `;
36 |
--------------------------------------------------------------------------------
/gui/src/client/trips-layer-vertex.glsl.js:
--------------------------------------------------------------------------------
1 | // Copyright (c) 2015 - 2017 Uber Technologies, Inc.
2 | //
3 | // Permission is hereby granted, free of charge, to any person obtaining a copy
4 | // of this software and associated documentation files (the "Software"), to deal
5 | // in the Software without restriction, including without limitation the rights
6 | // to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
7 | // copies of the Software, and to permit persons to whom the Software is
8 | // furnished to do so, subject to the following conditions:
9 | //
10 | // The above copyright notice and this permission notice shall be included in
11 | // all copies or substantial portions of the Software.
12 | //
13 | // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 | // IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 | // FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 | // AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 | // LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
18 | // OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
19 | // THE SOFTWARE.
20 |
21 | export default `\
22 | #define SHADER_NAME trips-layer-vertex-shader
23 |
24 | attribute vec3 positions;
25 | attribute vec3 colors;
26 |
27 | uniform float opacity;
28 | uniform float currentTime;
29 | uniform float trailLength;
30 |
31 | varying float vTime;
32 | varying vec4 vColor;
33 |
34 | void main(void) {
35 | vec2 p = project_position(positions.xy);
36 | // the magic de-flickering factor
37 | vec4 shift = vec4(0., 0., mod(positions.z, trailLength) * 1e-4, 0.);
38 |
39 | gl_Position = project_to_clipspace(vec4(p, 1., 1.)) + shift;
40 |
41 | vColor = vec4(colors / 255.0, opacity);
42 | vTime = 1.0 - (currentTime - positions.z) / trailLength;
43 | }
44 | `;
45 |
--------------------------------------------------------------------------------
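The fade logic in these two shaders is small enough to sketch on the CPU. A minimal Python sketch of how `vTime` drives the trail, assuming `positions.z`, `currentTime`, and `trailLength` share the same timestamp units (function and parameter names here are illustrative, not from the shaders):

```python
def trail_alpha(vertex_time, current_time, trail_length, base_alpha=1.0):
    """Mimic the vTime fade: recent vertices are opaque, older ones fade out.

    Returns None where the fragment shader would discard (outside the trail).
    """
    v_time = 1.0 - (current_time - vertex_time) / trail_length
    if v_time > 1.0 or v_time < 0.0:
        return None  # fragment discarded
    return base_alpha * v_time

# A vertex stamped at the current time is fully opaque ...
print(trail_alpha(100.0, 100.0, 120.0))  # 1.0
# ... one older than trailLength falls outside the trail and is discarded
print(trail_alpha(100.0, 250.0, 120.0))  # None
```

The alpha multiply happens in the fragment shader (`vColor.a * vTime`), so each segment fades linearly from head to tail of the trail window.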
/gui/src/client/trips-layer.js:
--------------------------------------------------------------------------------
1 | import {Layer} from '@deck.gl/core';
2 |
3 | import {Model, Geometry} from 'luma.gl';
4 |
5 | import tripsVertex from './trips-layer-vertex.glsl';
6 | import tripsFragment from './trips-layer-fragment.glsl';
7 |
8 | const defaultProps = {
9 | trailLength: {type: 'number', value: 120, min: 0},
10 | currentTime: {type: 'number', value: 0, min: 0},
11 | getPath: {type: 'accessor', value: d => d.path},
12 | getColor: {type: 'accessor', value: d => d.color}
13 | };
14 |
15 | export default class TripsLayer extends Layer {
16 | initializeState() {
17 | const {gl} = this.context;
18 | const attributeManager = this.getAttributeManager();
19 |
20 | const model = this.getModel(gl);
21 |
22 | attributeManager.add({
23 | indices: {size: 1, update: this.calculateIndices, isIndexed: true},
24 | positions: {size: 3, update: this.calculatePositions},
25 | colors: {size: 3, accessor: 'getColor', update: this.calculateColors}
26 | });
27 |
28 | gl.getExtension('OES_element_index_uint');
29 | this.setState({model});
30 | }
31 |
32 | updateState({props, changeFlags: {dataChanged}}) {
33 | if (dataChanged) {
34 | this.countVertices(props.data);
35 | this.state.attributeManager.invalidateAll();
36 | }
37 | }
38 |
39 | getModel(gl) {
40 | return new Model(gl, {
41 | id: this.props.id,
42 | vs: tripsVertex,
43 | fs: tripsFragment,
44 | geometry: new Geometry({
45 | id: this.props.id,
46 | drawMode: 'LINES'
47 | }),
48 | vertexCount: 0,
49 | isIndexed: true,
50 |       // TODO-state-management: the onBeforeRender GL settings could move into the
51 |       // draw-call settings; onAfterRender's state restore should likewise move alongside the corresponding draw.
52 | onBeforeRender: () => {
53 | gl.enable(gl.BLEND);
54 | gl.enable(gl.POLYGON_OFFSET_FILL);
55 | gl.polygonOffset(2.0, 1.0);
56 | gl.blendFunc(gl.SRC_ALPHA, gl.ONE);
57 | gl.blendEquation(gl.FUNC_ADD);
58 | },
59 | onAfterRender: () => {
60 | gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
61 | gl.disable(gl.POLYGON_OFFSET_FILL);
62 | }
63 | });
64 | }
65 |
66 | countVertices(data) {
67 | if (!data) {
68 | return;
69 | }
70 |
71 | const {getPath} = this.props;
72 | let vertexCount = 0;
73 | const pathLengths = data.reduce((acc, d) => {
74 | const l = getPath(d).length;
75 | vertexCount += l;
76 | return [...acc, l];
77 | }, []);
78 | this.setState({pathLengths, vertexCount});
79 | }
80 |
81 | draw({uniforms}) {
82 | const {trailLength, currentTime} = this.props;
83 | this.state.model.render(
84 | Object.assign({}, uniforms, {
85 | trailLength,
86 | currentTime
87 | })
88 | );
89 | }
90 |
91 | calculateIndices(attribute) {
92 | const {pathLengths, vertexCount} = this.state;
93 |
94 | const indicesCount = (vertexCount - pathLengths.length) * 2;
95 | const indices = new Uint32Array(indicesCount);
96 |
97 | let offset = 0;
98 | let index = 0;
99 | for (let i = 0; i < pathLengths.length; i++) {
100 | const l = pathLengths[i];
101 | indices[index++] = offset;
102 | for (let j = 1; j < l - 1; j++) {
103 | indices[index++] = j + offset;
104 | indices[index++] = j + offset;
105 | }
106 | indices[index++] = offset + l - 1;
107 | offset += l;
108 | }
109 | attribute.value = indices;
110 | this.state.model.setVertexCount(indicesCount);
111 | }
112 |
113 | calculatePositions(attribute) {
114 | const {data, getPath} = this.props;
115 | const {vertexCount} = this.state;
116 | const positions = new Float32Array(vertexCount * 3);
117 |
118 | let index = 0;
119 | for (let i = 0; i < data.length; i++) {
120 | const path = getPath(data[i]);
121 | for (let j = 0; j < path.length; j++) {
122 | const pt = path[j];
123 | positions[index++] = pt[0];
124 | positions[index++] = pt[1];
125 | positions[index++] = pt[2];
126 | }
127 | }
128 | attribute.value = positions;
129 | }
130 |
131 | calculateColors(attribute) {
132 | const {data, getColor} = this.props;
133 | const {pathLengths, vertexCount} = this.state;
134 | const colors = new Float32Array(vertexCount * 3);
135 |
136 | let index = 0;
137 | for (let i = 0; i < data.length; i++) {
138 | const color = getColor(data[i]);
139 | const l = pathLengths[i];
140 | for (let j = 0; j < l; j++) {
141 | colors[index++] = color[0];
142 | colors[index++] = color[1];
143 | colors[index++] = color[2];
144 | }
145 | }
146 | attribute.value = colors;
147 | }
148 | }
149 |
150 | TripsLayer.layerName = 'TripsLayer';
151 | TripsLayer.defaultProps = defaultProps;
152 |
--------------------------------------------------------------------------------
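`calculateIndices` turns each path of N points into N - 1 `LINES` segments by emitting each interior vertex twice (once to close a segment, once to open the next). The same loop as a Python sketch (function name is mine, not from the file):

```python
def line_segment_indices(path_lengths):
    """Duplicate interior vertices so consecutive points become GL_LINES pairs."""
    indices = []
    offset = 0
    for l in path_lengths:
        indices.append(offset)          # first vertex of the path, used once
        for j in range(1, l - 1):
            indices.append(j + offset)  # interior vertex closes one segment ...
            indices.append(j + offset)  # ... and opens the next
        indices.append(offset + l - 1)  # last vertex, used once
        offset += l
    return indices

# Two paths of 3 and 2 points -> 2 + 1 segments -> 6 indices
print(line_segment_indices([3, 2]))  # [0, 1, 1, 2, 3, 4]
```

This also explains the `(vertexCount - pathLengths.length) * 2` sizing in the layer: each path contributes `2 * (l - 1)` indices.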
/gui/src/server/index.js:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env node
2 | 'use strict';
3 |
4 | const net = require('net');
5 | const express = require('express')
6 |
7 | // const app = express()
8 | const port = 8080
9 |
10 | // express-ws wraps the express app so that app.ws() routes become available
11 | var expressWs = require('express-ws')(express());
12 | var app = expressWs.app;
13 |
14 |
15 | var calibration;
16 |
17 |
18 |
19 | app.use(express.static('dist'))
20 |
21 | // Use this server to send calibration values received from map/browser down to node and ...
22 | // ... send frame dimensions (frame_dim) received from node up to map/browser
23 | app.ws('/send_calibration', (ws, req) => {
24 | // console.error('websocket connection from browser');
25 | // for (var t = 0; t < 3; t++)
26 | // setTimeout(() => ws.send('Map, the server received your message', ()=>{}), 1000*t);
27 |
28 | ws.on('message', function incoming(message) {
29 |
30 | // console.log('Server received calibration from map');
31 |
32 | // Setting 'calibration' variable to calibration value
33 | calibration = message;
34 |
35 | // Create new socket and connect to node
36 | var client = new net.Socket();
37 | client.setTimeout(4000);
38 |         client.setEncoding('utf8');
39 | client.connect(8765, '127.0.0.1', function() {
40 | // console.log('Connected to node calibration socket server');
41 |
42 | // Send calibration to node's websocket/socket server
43 | var if_sent = client.write(calibration);
44 | if (if_sent) {
45 | // console.log("Calibration sent");
46 | }
47 | });
48 |
49 |
50 | let chunks = []; // https://stackoverflow.com/a/45030133/8941739
51 | client.on('data', function(data) {
52 | chunks.push(data); // https://stackoverflow.com/a/45030133/8941739
53 | }).on('end', function() {
54 | if (chunks.length > 0) {
55 | // console.log("Received frame dimensions (frame_dim) ...");
56 | let this_data = chunks.join("");
57 | var parsed_data = JSON.parse(this_data.replace(/\'/g, '"'));
58 | // console.log(parsed_data);
59 | // Send frame_dim to browser
60 | ws.send(JSON.stringify(parsed_data));
61 | }
62 | client.destroy(); // kill client after server's response
63 | });
64 |
65 | client.on('error', function(ex) {
66 | // console.log("Something happened trying to send calibration to node ");
67 | client.destroy(); // kill client after error
68 | // console.log(ex);
69 | });
70 |
71 | client.on('timeout', function() {
72 | console.log('calibration socket timeout');
73 | client.destroy(); // have to .destroy() on timeout. If just .end(), it won't reconnect if user doesn't refresh browser
74 | });
75 |
76 |
77 | });
78 | });
79 |
80 |
81 |
82 |
83 |
84 | // When the browser requests tracklets over this websocket, the server fetches them from node and sends them back to the browser
85 |
86 | app.ws('/get_tracklets', (ws, req) => {
87 |
88 | // console.log("called get_tracklets");
89 |
90 | ws.on('message', function incoming(message) {
91 |
92 | // console.log('Server received get_tracklets from map');
93 |
94 | var client = new net.Socket();
95 | client.setTimeout(4000);
96 |         client.setEncoding('utf8');
97 | client.connect(8766, '127.0.0.1', function() {
98 | // console.log('Connected to tracklets');
99 | // console.log("message");
100 | //console.log(message);
101 | client.write(message);
102 | });
103 |
104 | let chunks = []; // https://stackoverflow.com/a/45030133/8941739
105 | client.on('data', function(data) {
106 | chunks.push(data); // https://stackoverflow.com/a/45030133/8941739
107 | }).on('end', function() {
108 |
109 | //console.log(chunks);
110 | if (chunks.length > 0) {
111 | let this_data = chunks.join("");
112 | var trackableObjectsArray = JSON.parse(this_data.replace(/'/g, '"'));
113 |
114 | if (typeof trackableObjectsArray !== 'undefined') {
115 | try {
116 |
117 | // console.log("-------------------------------------------------------");
118 | // console.log(JSON.stringify(trackableObjectsArray));
119 | // Send tracklets to browser
120 | ws.send(JSON.stringify(trackableObjectsArray));
121 |
122 | } catch(e) {
123 | console.log("error", e);
124 | }
125 | } else {
126 | // Send empty array to browser
127 | ws.send(JSON.stringify([]));
128 | }
129 | } else {
130 | // Send empty array to browser
131 | ws.send(JSON.stringify([]));
132 | }
133 |
134 | client.destroy(); // kill client after server's response
135 | });
136 |
137 | client.on('error', function(ex) {
138 |             console.log("Something happened trying to get tracklets from node");
139 | client.destroy(); // kill client after error
140 | console.log(ex);
141 | });
142 |
143 | client.on('timeout', function() {
144 | console.log('tracklet socket timeout');
145 | client.destroy(); // have to .destroy() on timeout. If just .end(), it won't reconnect if user doesn't refresh browser
146 | });
147 |
148 | });
149 | });
150 |
151 |
152 |
153 | app.listen(port, () => console.log(`Example app listening on port ${port}!`))
154 |
--------------------------------------------------------------------------------
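Both handlers above accumulate TCP chunks, join them on `'end'`, and normalize single quotes (the node side emits Python-`repr`-style output) before `JSON.parse`. The same step as a Python sketch (the payload is a made-up example, not the real tracklet schema):

```python
import json

def parse_node_reply(chunks):
    """Join socket chunks, then coerce single-quoted repr output to JSON."""
    raw = "".join(chunks)
    return json.loads(raw.replace("'", '"'))

# Chunks as they might arrive from the node's socket, split mid-message
chunks = ["{'height'", ": 281, 'wid", "th': 500}"]
print(parse_node_reply(chunks))  # {'height': 281, 'width': 500}
```

Note that, like the server's regex replace, this blanket quote substitution breaks if any string value contains an apostrophe; it works here because the payloads are numeric dimensions and identifiers.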
/gui/webpack.config.js:
--------------------------------------------------------------------------------
1 |
2 | // avoid destructuring for older Node version support
3 | const path = require('path');
4 | const resolve = require('path').resolve;
5 | const webpack = require('webpack');
6 | const outputDirectory = "dist";
7 |
8 |
9 | // const HtmlWebpackPlugin = require('html-webpack-plugin');
10 | // const CleanWebpackPlugin = require('clean-webpack-plugin');
11 |
12 | module.exports = env => {
13 |
14 | var CONFIG = {
15 | mode: 'development',
16 |
17 | entry: {
18 | app: resolve('./app.js')
19 | },
20 |
21 | resolve: {
22 | alias: {
23 | // From mapbox-gl-js README. Required for non-browserify bundlers (e.g. webpack) if you want to use your own modified mapbox-gl.js (https://github.com/uber/deck.gl/blob/b775505a04fe45504e787c99beb5e3d31f95a3fd/test/apps/mapbox-layers/webpack.config.js#L17):
24 |         // An 89.9 degree max pitch was needed, which stock mapbox-gl does not allow.
25 |         // So, following these instructions (https://github.com/uber/deck.gl/blob/b775505a04fe45504e787c99beb5e3d31f95a3fd/test/apps/mapbox-layers/README.md) with regard to the custom-layers branch (https://github.com/mapbox/mapbox-gl-js/tree/custom-layers),
26 |         // version 0.48.0 was downloaded and dev-built (yarn run build-dev), and the newly created mapbox-gl-dev.js was copied from its '/dist' directory into this project.
27 |         // Version 0.48.0 was chosen in accordance with this user's patch (https://github.com/mapbox/mapbox-gl-js/issues/3731#issuecomment-409980664); its custom maxPitch was set to 89.9 degrees and the result is stored here as mapbox-gl-dev.js.
28 |
29 |         // Until this is upgraded to mapbox-gl.js version 0.50.0, Mapbox's custom layers, and thus deck.gl's MapboxLayer, GeoJsonLayer, PolygonLayer, etc., cannot be used.
30 |
31 | 'mapbox-gl$': resolve('./src/client/mapbox-gl-dev.js')
32 | }
33 | },
34 |
35 | devtool: 'inline-source-map',
36 |
37 |
38 | output: {
39 | library: 'App'
40 | },
41 |
42 |
43 | module: {
44 | rules: [
45 | {
46 | // Compile ES2015 using buble
47 | test: /\.js$/,
48 | loader: 'buble-loader',
49 | include: [resolve('.')],
50 | exclude: [/node_modules/],
51 | options: {
52 | objectAssign: 'Object.assign'
53 | }
54 | }
55 | ]
56 | },
57 |
58 | // Optional: Enables reading mapbox token from environment variable
59 | plugins: [new webpack.EnvironmentPlugin(['MapboxAccessToken'])],
60 |
61 | devServer: {
62 | port: 3000,
63 | host: env.HOST,
64 | open: true,
65 | proxy: {
66 | '/api': {
67 | target: 'ws://'+env.HOST+':8080',
68 | ws: true
69 | },
70 | },
71 |
72 | },
73 |
74 | // plugins: [
75 | // new CleanWebpackPlugin([outputDirectory]),
76 | // new HtmlWebpackPlugin({
77 | // template: "./index.html"
78 | // })
79 | // ]
80 | };
81 |
82 | return CONFIG;
83 | };
84 |
85 |
--------------------------------------------------------------------------------
/lnglat_homography.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2018-2019 David Thompson
2 | #
3 | # This file is part of Grassland
4 | #
5 | # It is subject to the license terms in the LICENSE file found in the top-level
6 | # directory of this distribution.
7 | #
8 | # No part of Grassland, including this file, may be copied, modified,
9 | # propagated, or distributed except according to the terms contained in the
10 | # LICENSE file.
11 |
12 |
13 | import numpy as np
14 | import cv2
15 | import os
16 | import json
17 | import requests
18 | import plyvel
19 | import multiprocessing
20 | from multiprocessing import Queue, Pool
21 | import gevent
22 | from gevent.server import StreamServer
23 | import time
24 | from pathlib import Path
25 |
26 | class MyException(Exception):
27 | pass
28 |
29 | class RealWorldCoordinates:
30 | def __init__(self, tracking_frame):
31 |
32 | # Create node's personal leveldb database if missing
33 | self.node_db = plyvel.DB(str(Path.home())+'/.grassland/node_db/', create_if_missing=True)
34 | self.CALIBRATING = False
35 | self.tracking_frame = tracking_frame
36 | self.calibration = {}
37 |
38 | # pts_src and pts_dst are numpy arrays of points
39 | # in source and destination images. We need at least
40 | # 4 corresponding points.
41 | pts_src = np.array([[421, 695], [1587, 198], [368, 309], [1091, 98]])
42 | pts_dst = np.array([[581, 473], [618, 215], [296, 449], [281, 245]])
43 |         h, status = cv2.findHomography(pts_src, pts_dst) # NOTE: sample values; h is not stored -- set_transform() builds the transform actually used
44 |
45 |
46 |
47 | def set_transform(self, calibrating=False):
48 | self.CALIBRATING = calibrating
49 |
50 |         # Get the real-world transform that maps each pixel of the realigned image to longitude and latitude coordinates.
51 |         # Using the node calibration web app, we can build a function that lets the node know the real-world (lat/lng) coordinates of each pixel in its frame.
52 |         # The calibration web app, using Mapbox and OpenStreetMap, maps pixel coordinates (2D space 'F') to latitude/longitude coordinates (2D space 'W').
53 |         # By lining up the OpenStreetMap "camera" to exactly match the perspective of the node's real camera, we can take a few pixel coordinates from 'F'
54 |         # and their corresponding real-world coordinates 'W' from the web app, and use them to find a linear map (transformation matrix) 'L'
55 |         # that takes any pixel coordinate in space 'F' to the corresponding coordinate in 'W': L(f) = w.
56 |
57 | #
58 |         # Code adapted from https://stackoverflow.com/a/20555267/8941739
59 |
60 |
61 | #primary = np.array([[0.0, 0.0], [1366.0, 0.0], [1366.0, 662.0], [0.0, 662.0]]) # Average dimensions of monitor viewing webapp. Maybe I should change this to be dynamic
62 |
63 | height = float(self.tracking_frame['height'])
64 |         height = height / 2 # The mapbox modification allowing a greater-than-60-degree pitch (https://github.com/mapbox/mapbox-gl-js/issues/3731#issuecomment-368641789) has a bug unprojecting points close to the horizon: they get very "screwy". So the two top homography_point corners in the web app ('ul' and 'ur') actually start halfway down the canvas, keeping the starting points below the horizon.
65 |
66 | width = float(self.tracking_frame['width'])
67 | primary = np.array([[0.0, 0.0], [width, 0.0], [width, height], [0.0, height]])
68 |
69 |
70 | # if not dynamic: #
71 | # secondary = np.array([[-75.75021684378025, 45.393495598366655], [-75.7512298958311, 45.39309963711102], [-75.75150315621723, 45.393444401619234], [-75.75049010416637, 45.393840360459365]])
72 |
73 | secondary_array = []
74 |
75 |
76 | '''
77 | Sample Node Format
78 | {
79 | 'id': 'n68b5a19ef9364a74ae73b069934b21a4',
80 | 'tracking_frame': {'height': 281, 'width': 500},
81 | 'calibration': {
82 | 'lng_focus': -75.75107566872947,
83 | 'bearing': 62.60000000000002,
84 | 'lat_focus': 45.39331613895314,
85 | 'pitch': 55.00000000000001,
86 | 'homography_points': {
87 | 'corners': {
88 | 'ul': {
89 | 'lat': 45.395059987864016,
90 | 'lng': -75.75055046479982
91 | },
92 | 'll': {
93 | 'lat': 45.392791493630654,
94 | 'lng': -75.75123398120483
95 | },
96 | 'ur': {
97 | 'lat': 45.392869098373296,
98 | 'lng': -75.74893325620522
99 | },
100 | 'lr': {
101 | 'lat': 45.39362547029299,
102 | 'lng': -75.75184957418519
103 | }
104 | },
105 | 'markers': {}
106 | }
107 | }
108 | }
109 |
110 |
111 | '''
112 | # MySQL
113 |
114 | if self.CALIBRATING:
115 | # Get calibration values from Calibration Map
116 | # https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.get_event_loop
117 | # asyncio.get_event_loop().run_until_complete(self.call_websocket())
118 | #self.msl = multiprocessing.Process(target=self.mapserver_loop)
119 | #self.msl.daemon = True
120 | #self.msl.start()
121 | #print("Finished starting msl")
122 | self.calibration_socket_server = StreamServer(('127.0.0.1', 8765), self.calibration_socket_server_handler)
123 | self.calibration_socket_server.start()
124 |
125 |
126 | self.node_get()
127 |
128 | corner_names = ['ul', 'ur', 'll', 'lr']
129 | for corner_name in corner_names:
130 |             corner_lng = self.calibration['homography_points']['corners'][corner_name]['lng']
131 |             corner_lat = self.calibration['homography_points']['corners'][corner_name]['lat']
132 |             secondary_array.append([corner_lng, corner_lat])
133 |
134 |
135 | secondary = np.array(secondary_array)
136 |
137 |
138 | # Pad the data with ones, so that our transformation can do translations too
139 | n = primary.shape[0]
140 | pad = lambda x: np.hstack([x, np.ones((x.shape[0], 1))])
141 | unpad = lambda x: x[:,:-1]
142 | X = pad(primary)
143 | Y = pad(secondary)
144 |
145 | # Solve the least squares problem X * A = Y
146 | # to find our transformation matrix A
147 | A, res, rank, s = np.linalg.lstsq(X, Y, rcond=None)
148 |
149 | # Real World Transform
150 | self.rw_transform = lambda x: unpad(np.dot(pad(x), A))
151 |
152 | np.set_printoptions(suppress=True)
153 |
154 |
155 | print("Target:")
156 | print(secondary)
157 | print("Result:")
158 | print(self.rw_transform(primary))
159 | print("Max error:", np.abs(secondary - self.rw_transform(primary)).max())
160 | A[np.abs(A) < 1e-10] = 0 # set really small values to zero
161 | print(A)
162 | print("Now Try it")
163 | print(self.rw_transform(np.array([[300, 200]])))
164 | print(self.rw_transform(np.array([[300.0, 200.0]])))
165 |
166 |
167 |
168 | def node_update(self):
169 |
170 | self.node_get()
171 |
172 | #node_id = os.environ['NODE_ID']
173 | #gl_api_endpoint = os.environ['GRASSLAND_API_ENDPOINT']
174 | # data = { "id": node_id, "tracking_frame": self.tracking_frame, "calibration": self.calibration }
175 | #response = requests.put(gl_api_endpoint+"node_update", json=data)
176 |
177 | tracking_frame_string = json.dumps(self.tracking_frame)
178 | self.node_db.put(b'tracking_frame', bytes(tracking_frame_string, 'utf-8'))
179 |
180 | calibration_string = json.dumps(self.calibration)
181 | self.node_db.put(b'calibration', bytes(calibration_string, 'utf-8'))
182 |
183 |
184 |
185 | def node_get(self):
186 |
187 | if self.CALIBRATING:
188 | self.call_gevent_wait()
189 |
190 |
191 | #node_id = os.environ['NODE_ID']
192 | # gl_api_endpoint = os.environ['GRASSLAND_API_ENDPOINT']
193 | # response = requests.get(gl_api_endpoint+"node_get"+"?id="+str(node_id))
194 |
195 |
196 | # tracking_frame = self.node_db.get(b'tracking_frame')
197 | # if tracking_frame == None: # THROW ERROR
198 | # raise MyException("!!! leveldb get 'tracking_frame' returned None !!!!")
199 | # else:
200 | # print(tracking_frame)
201 | # self.tracking_frame = json.loads(tracking_frame.decode("utf-8"))
202 |
203 |
204 | if self.CALIBRATING:
205 | calibration = self.node_db.get(b'calibration')
206 |
207 | if calibration is None:
208 | self.call_gevent_wait()
209 | timeout = time.time() + 60*5 # 5 minutes from now
210 | print("WAITING FOR YOU TO USE THE MAPSERVER TO SET THE CALIBRATION VALUES IN THE DATABASE ...")
211 | while True:
212 | if time.time() > timeout:
213 | print("TIMED OUT WAITING FOR THE CALIBRATION TO BE SENT FROM THE MAP SERVER!!")
214 | break
215 |
216 | calibration = self.node_db.get(b'calibration')
217 |
218 | if calibration is None:
219 | self.call_gevent_wait()
220 | else:
221 | self.calibration = json.loads(calibration.decode("utf-8"))
222 | break
223 |
224 | else:
225 | print(calibration.decode("utf-8"))
226 | self.calibration = json.loads(calibration.decode("utf-8"))
227 |
228 | else:
229 | calibration = self.node_db.get(b'calibration')
230 | if calibration is None: # THROW ERROR
231 | raise MyException("!!! leveldb get 'calibration' returned None. Restart with '--mode CALIBRATING' !!!!")
232 | else:
233 | print(calibration)
234 | self.calibration = json.loads(calibration.decode("utf-8"))
235 |
236 |
237 |
238 |
239 | def coord(self, x, y):
240 |
241 | coord = self.rw_transform(np.array([[x, y]]))
242 |
243 | return {
244 | "lng": coord[0][0],
245 | "lat": coord[0][1]
246 | }
247 |
248 |
249 |
250 | def calibration_socket_server_handler(self, socket, address):
251 |
252 | calibration_bytes_object = socket.recv(4096)
253 | # print("calibration_bytes_object")
254 | # print(calibration_bytes_object)
255 |
256 |
257 | # Store calibration in leveldb
258 | #self.node_db.put(b'calibration', bytes(calibration_string, 'utf-8'))
259 | self.node_db.put(b'calibration', calibration_bytes_object)
260 |
261 | # Get it back
262 | calibration = self.node_db.get(b'calibration')
263 |
264 | self.calibration = json.loads(calibration.decode("utf-8"))
265 |
266 |
267 | # Get camera frame dimensions (frame_dim). Could pull from database but this is easier
268 | tracking_frame_string = json.dumps(self.tracking_frame)
269 | # Send camera frame dimensions (frame_dim)
270 | socket.sendall(bytes(tracking_frame_string, 'utf-8'))
271 |
272 |
273 | def call_gevent_wait(self):
274 | gevent.wait(timeout=1) # https://stackoverflow.com/a/10292950/8941739
275 |
--------------------------------------------------------------------------------
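The calibration code above fits a pixel-to-geographic transform by padding both point sets with a column of ones and solving `X * A = Y` with least squares. That pad/lstsq/unpad pattern can be sketched standalone; the corner coordinates below are hypothetical stand-ins for the `homography_points` corners stored in the node database.

```python
import numpy as np

# Four pixel-space corners of a 500x281 tracking frame (hypothetical values)
primary = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 281.0], [500.0, 281.0]])

# Corresponding (lng, lat) corners (hypothetical values)
secondary = np.array([
    [-75.7506, 45.3951],
    [-75.7489, 45.3929],
    [-75.7512, 45.3928],
    [-75.7518, 45.3936],
])

# Pad with ones so the fitted transform can include a translation term
pad = lambda x: np.hstack([x, np.ones((x.shape[0], 1))])
unpad = lambda x: x[:, :-1]

# Solve the least-squares problem pad(primary) @ A = pad(secondary)
A, res, rank, s = np.linalg.lstsq(pad(primary), pad(secondary), rcond=None)

# Real-world transform: pixel coordinates in, (lng, lat) out
rw_transform = lambda x: unpad(pad(x) @ A)

# The fitted transform reproduces the calibration corners closely
print(np.abs(secondary - rw_transform(primary)).max())
```

With four correspondences and three unknowns per output column, the fit is overdetermined, so the residual printed at the end is the affine approximation error over the quadrilateral.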
/pyimagesearch/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/grasslandnetwork/node_lite/26aa46be7cd092a611b6b46f92ff38bd4d3beabe/pyimagesearch/__init__.py
--------------------------------------------------------------------------------
/pyimagesearch/centroidtracker.py:
--------------------------------------------------------------------------------
1 | # import the necessary packages
2 | from scipy.spatial import distance as dist
3 | from collections import OrderedDict
4 | import numpy as np
5 | import uuid
6 |
7 | class CentroidTracker:
8 | def __init__(self, maxDisappeared=50, maxDistance=50):
9 | # initialize the next unique object ID along with two ordered
10 | # dictionaries used to keep track of mapping a given object
11 | # ID to its centroid and number of consecutive frames it has
12 | # been marked as "disappeared", respectively
13 | # self.nextObjectID = "t" + uuid.uuid4().hex # https://stackoverflow.com/a/534847
14 | self.nextObjectID = uuid.uuid4().hex # https://stackoverflow.com/a/534847
15 | self.objects = OrderedDict()
16 | self.disappeared = OrderedDict()
17 |
18 | # store the number of maximum consecutive frames a given
19 | # object is allowed to be marked as "disappeared" until we
20 | # need to deregister the object from tracking
21 | self.maxDisappeared = maxDisappeared
22 |
23 | # store the maximum distance between centroids to associate
24 | # an object -- if the distance is larger than this maximum
25 | # distance we'll start to mark the object as "disappeared"
26 | self.maxDistance = maxDistance
27 |
28 | def register(self, centroid_frame_timestamp, detection_class_id, centroid, boxoid):
29 | # when registering an object we use the next available object
30 | # ID to store the centroid
31 | self.objects[self.nextObjectID] = (centroid_frame_timestamp, detection_class_id, centroid, boxoid)
32 | self.disappeared[self.nextObjectID] = 0
33 | # self.nextObjectID = "t" + uuid.uuid4().hex
34 | self.nextObjectID = uuid.uuid4().hex
35 |
36 | def deregister(self, objectID):
37 | # to deregister an object ID we delete the object ID from
38 | # both of our respective dictionaries
39 | del self.objects[objectID]
40 | del self.disappeared[objectID]
41 |
42 | def update(self, rects, detectionsInput=False):
43 | # check to see if the list of input bounding box rectangles
44 | # is empty
45 | if len(rects) == 0:
46 | # loop over any existing tracked objects and mark them
47 | # as disappeared
48 | for objectID in list(self.disappeared.keys()):
49 | self.disappeared[objectID] += 1
50 |
51 | # if we have reached a maximum number of consecutive
52 | # frames where a given object has been marked as
53 | # missing, deregister it
54 | if self.disappeared[objectID] > self.maxDisappeared:
55 | self.deregister(objectID)
56 |
57 | # return early as there are no centroids or tracking info
58 | # to update
59 | return self.objects
60 |
61 | # initialize an array of input centroids for the current frame
62 | inputCentroids = np.zeros((len(rects), 2), dtype="int")
63 |
64 | # initialize an array of input boxoids for the current frame
65 | inputBoxoids = np.zeros((len(rects), 4), dtype="float")
66 |
67 | # initializa an array of detection_class_ids for the current frame
68 | inputDetectionClassIDs = np.zeros((len(rects), 1), dtype="int")
69 |
70 |
71 |
72 | # loop over the bounding box rectangles
73 | for (i, (startX, startY, endX, endY, centroid_frame_timestamp, detection_class_id)) in enumerate(rects):
74 | # use the bounding box coordinates to derive the centroid
75 | cX = int((startX + endX) / 2.0)
76 | cY = int((startY + endY) / 2.0)
77 | inputCentroids[i] = (cX, cY)
78 | inputBoxoids[i] = (startX, startY, endX, endY)
79 | inputDetectionClassIDs[i] = detection_class_id
80 |
81 | # centroid_frame_timestamp leaks out of the loop; it is the same for all items in rects
82 |
83 | # if we are currently not tracking any objects take the input
84 | # centroids and register each of them
85 |
86 | if len(self.objects) == 0:
87 | for i in range(0, len(inputCentroids)):
88 | self.register(centroid_frame_timestamp, inputDetectionClassIDs[i], inputCentroids[i], inputBoxoids[i])
89 |
90 | # otherwise, we are currently tracking objects so we need to
91 | # try to match the input centroids to existing object
92 | # centroids
93 | else:
94 | # grab the set of object IDs and corresponding centroids
95 | objectIDs = list(self.objects.keys())
96 | # print("values")
97 | # print(list(zip(*list(self.objects.values()))))
98 | objectCentroidFrameTimestamps, objectDetectionClassIDs, objectCentroids, objectBoxoids = list(zip(*list(self.objects.values())))
99 |
100 | # compute the distance between each pair of object
101 | # centroids and input centroids, respectively -- our
102 | # goal will be to match an input centroid to an existing
103 | # object centroid
104 | D = dist.cdist(np.array(objectCentroids), inputCentroids)
105 |
106 | # in order to perform this matching we must (1) find the
107 | # smallest value in each row and then (2) sort the row
108 | # indexes based on their minimum values so that the row
109 | # with the smallest value is at the *front* of the index
110 | # list
111 | rows = D.min(axis=1).argsort()
112 |
113 | # next, we perform a similar process on the columns by
114 | # finding the smallest value in each column and then
115 | # sorting using the previously computed row index list
116 | cols = D.argmin(axis=1)[rows]
117 |
118 | # in order to determine if we need to update, register,
119 | # or deregister an object we need to keep track of which
120 | # of the rows and column indexes we have already examined
121 | usedRows = set()
122 | usedCols = set()
123 |
124 | # loop over the combination of the (row, column) index
125 | # tuples
126 | for (row, col) in zip(rows, cols):
127 | # if we have already examined either the row or
128 | # column value before, ignore it
129 | if row in usedRows or col in usedCols:
130 | continue
131 |
132 | # if the distance between centroids is greater than
133 | # the maximum distance, do not associate the two
134 | # centroids to the same object
135 | if D[row, col] > self.maxDistance:
136 | continue
137 |
138 | # otherwise, grab the object ID for the current row,
139 | # set its new centroid, and reset the disappeared
140 | # counter
141 | objectID = objectIDs[row]
142 | self.objects[objectID] = (centroid_frame_timestamp, inputDetectionClassIDs[col], inputCentroids[col], inputBoxoids[col])
143 | self.disappeared[objectID] = 0
144 |
145 | # indicate that we have examined each of the row and
146 | # column indexes, respectively
147 | usedRows.add(row)
148 | usedCols.add(col)
149 |
150 | # compute both the row and column index we have NOT yet
151 | # examined
152 | unusedRows = set(range(0, D.shape[0])).difference(usedRows)
153 | unusedCols = set(range(0, D.shape[1])).difference(usedCols)
154 |
155 | # in the event that the number of object centroids is
156 | # equal or greater than the number of input centroids
157 | # we need to check and see if some of these objects have
158 | # potentially disappeared
159 | if D.shape[0] >= D.shape[1]:
160 | # loop over the unused row indexes
161 | for row in unusedRows:
162 | # grab the object ID for the corresponding row
163 | # index and increment the disappeared counter
164 | objectID = objectIDs[row]
165 | self.disappeared[objectID] += 1
166 |
167 |
168 | # if the input was detection boxes, then
169 | # mark for database deletion, those tracklets not acknowledged by detection boxes (objectless tracklets)
170 | if detectionsInput:
171 |
172 | (centroid_frame_timestamp, detection_class_id, centroid, boxoid) = self.objects[objectID]
173 | (startX, startY, endX, endY) = boxoid
174 | boxoid = (startX, startY, endX, endY)
175 | self.objects[objectID] = (centroid_frame_timestamp, np.array([-1]), centroid, boxoid) # Set detection_class_id to -1 to ignore tracklet
176 |
177 | else:
178 | # check to see if the number of consecutive
179 | # frames the object has been marked "disappeared"
180 | # for warrants deregistering the object
181 | if self.disappeared[objectID] > self.maxDisappeared:
182 | self.deregister(objectID)
183 |
184 |
185 |
186 |
187 | # otherwise, if the number of input centroids is greater
188 | # than the number of existing object centroids we need to
189 | # register each new input centroid as a trackable object
190 | else:
191 | for col in unusedCols:
192 | self.register(centroid_frame_timestamp, inputDetectionClassIDs[col], inputCentroids[col], inputBoxoids[col])
193 |
194 | # return the set of trackable objects
195 | return self.objects
196 |
197 |
--------------------------------------------------------------------------------
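The greedy row/column matching at the heart of `CentroidTracker.update()` can be illustrated in isolation. The centroids below are hypothetical: two tracked objects, three new detections, of which one is too far away from everything and would be registered as a new object.

```python
import numpy as np
from scipy.spatial import distance as dist

# Existing tracked centroids (rows) and new input centroids (cols); hypothetical
objectCentroids = np.array([[10, 10], [100, 100]])
inputCentroids = np.array([[102, 98], [12, 11], [300, 300]])

# Pairwise distance matrix between tracked objects and new detections
D = dist.cdist(objectCentroids, inputCentroids)

# Sort rows by their smallest distance, then take each row's nearest column,
# mirroring the matching loop in CentroidTracker.update()
rows = D.min(axis=1).argsort()
cols = D.argmin(axis=1)[rows]

matches, usedRows, usedCols = [], set(), set()
maxDistance = 50
for row, col in zip(rows, cols):
    # skip rows/columns already claimed, and pairs beyond the distance gate
    if row in usedRows or col in usedCols:
        continue
    if D[row, col] > maxDistance:
        continue
    matches.append((row, col))
    usedRows.add(row)
    usedCols.add(col)

print(matches)  # object 0 -> detection 1, object 1 -> detection 0
```

Detection 2 (`[300, 300]`) ends up in the unused columns, which is exactly the case where `update()` calls `register()` to start a new tracklet.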
/pyimagesearch/trackableobject.py:
--------------------------------------------------------------------------------
1 | class TrackableObject:
2 | def __init__(self, objectID, centroid_frame_timestamp, detection_class_id, centroid, boxoid, bbox_rw_coords):
3 | # store the object ID, then initialize a list of centroids
4 | # using the current centroid
5 | self.objectID = objectID
6 |
7 | # initialize instance variable, 'oids' as a list
8 | self.oids = []
9 |
10 | # initialize instance variable, 'centroids' as a list
11 | self.centroids = []
12 |
13 | # initialize instance variable, 'boxoids' as a list
14 | self.boxoids = []
15 |
16 | # initialize instance variable, 'bbox_rw_coords' as a list
17 | self.bbox_rw_coords = []
18 |
19 | # initialize instance variable 'detection_class_id' as 0
20 | self.detection_class_id = detection_class_id
21 |
22 | # initialize a boolean used to indicate if the object has
23 | # already been counted or not
24 | self.counted = False
25 |
26 | # initialize a boolean used to indicate if the object has left the node's field of view and the tracks complete
27 | self.complete = False
28 |
29 | # pass first boxoid to 'append_boxoids' method for processing
30 | self.append_boxoid(boxoid)
31 |
32 | # pass first centroid to 'append_centroids' method for processing
33 | self.append_centroid(centroid)
34 |
35 |
36 | self.append_oids(centroid_frame_timestamp, detection_class_id, centroid, boxoid, bbox_rw_coords)
37 |
38 |
39 |
40 | def append_centroid(self, centroid):
41 | pass
42 | #self.centroids.append(list(centroid))
43 |
44 |
45 | def append_boxoid(self, boxoid):
46 |
47 | #self.boxoids.append(list(boxoid))
48 |
49 | # if self.detection_class_id > 0 and boxoid[5] <= 0: # if object's class has been identified already but this isn't a new identification
50 | # pass # ... then don't change the current detection class. Even if the new detection_class_id is a -1, which means that the detection has changed but we'll stick with the first detected object class
51 | # else: # if the object's class hasn't been identified yet or this is a new identification from a detected frame or a -1
52 | # self.detection_class_id = boxoid[5]
53 |
54 | pass
55 |
56 |
57 | def append_oids(self, centroid_frame_timestamp, detection_class_id, centroid, boxoid, bbox_rw_coords):
58 |
59 | if self.detection_class_id > 0 and detection_class_id <= 0: # if object's class has been identified already but this isn't a new identification
60 | pass # ... then don't change the current detection class. Even if the new detection_class_id is a -1, which means that the detection has changed but we'll stick with the first detected object class
61 | else: # if the object's class hasn't been identified yet or this is a new identification from a detected frame or a -1
62 | self.detection_class_id = detection_class_id
63 |
64 |
65 | oid = {
66 | "frame_timestamp": centroid_frame_timestamp,
67 | "centroid": list(centroid),
68 | "boxoid": list(boxoid),
69 | "bbox_rw_coords": bbox_rw_coords
70 | }
71 |
72 | self.oids.append(oid)
73 |
--------------------------------------------------------------------------------
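The class-retention rule in `append_oids` keeps the first positively identified `detection_class_id` and ignores later non-identifications (`-1` marks a tracklet disowned by detections, `0` means not yet classified). A minimal sketch of that rule, with a hypothetical per-frame sequence of class IDs:

```python
def retain_class(current_id, new_id):
    """Keep the first positively identified class; ignore later
    non-identifications (new_id <= 0) once a class is known."""
    if current_id > 0 and new_id <= 0:
        return current_id
    return new_id

# Hypothetical stream of per-frame class IDs observed for one tracklet
class_id = 0
for observed in [0, 3, -1, 0, 3]:
    class_id = retain_class(class_id, observed)

print(class_id)  # stays 3 once identified
```

This matches the branch in `append_oids`: once `self.detection_class_id` is positive, a subsequent `-1` or `0` leaves it unchanged rather than discarding the original identification.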
/requirements.txt:
--------------------------------------------------------------------------------
1 | boto3==1.9.112
2 | botocore==1.12.112
3 | certifi==2019.3.9
4 | chardet==3.0.4
5 | docutils==0.14
6 | future==0.17.1
7 | gevent==1.4.0
8 | greenlet==0.4.15
9 | idna==2.8
10 | imutils==0.4.6
11 | jmespath==0.9.4
12 | numpy==1.22.0
13 | opencv-contrib-python==4.2.0.32
14 | Pillow==9.0.1
15 | plyvel
16 | python-dateutil==2.8.0
17 | requests==2.21.0
18 | s2sphere==0.2.5
19 | scipy
20 | six==1.12.0
21 |
--------------------------------------------------------------------------------