├── .gitignore
├── CONTRIBUTING.md
├── LICENSE.md
├── Procfile
├── README.md
├── app.py
├── assets
│   ├── base.css
│   ├── fonts.css
│   └── internal.css
├── config.py
├── data
│   ├── CarFootage_object_data.csv
│   ├── CarShowDrone_object_data.csv
│   ├── DroneCanalFestivalDetectionData.csv
│   ├── DroneCarFestival2DetectionData.csv
│   ├── FarmDroneDetectionData.csv
│   ├── ManCCTVDetectionData.csv
│   ├── RestaurantHoldupDetectionData.csv
│   ├── Zebra_object_data.csv
│   ├── james_bond_object_data.csv
│   └── original_footage.md
├── images
│   ├── Screencast.gif
│   ├── Screenshot1.png
│   └── Screenshot2.png
├── requirements.txt
├── runtime.txt
└── utils
    ├── __init__.py
    ├── dash_reusable_components.py
    ├── generate_video_data.py
    ├── mscoco_label_map.py
    └── visualization_utils.py
/.gitignore:
--------------------------------------------------------------------------------
1 | videos/*
2 | *.pyc
3 | .idea/*
4 | frozen_inference_graph.pb
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing
2 |
3 | When contributing to this repository, please first discuss the change you wish to make via issue,
4 | email, or any other method with the owners of this repository before making a change.
5 |
6 | Please note we have a code of conduct; please follow it in all your interactions with the project.
7 |
8 | ## Pull Request Process
9 |
10 | 1. Ensure any install or build dependencies are removed before the end of the layer when doing a
11 | build.
12 | 2. Update the README.md with details of changes to the interface, this includes new environment
13 | variables, exposed ports, useful file locations and container parameters.
14 | 3. Increase the version numbers in any examples files and the README.md to the new version that this
15 | Pull Request would represent. The versioning scheme we use is [SemVer](http://semver.org/).
16 | 4. You may merge the Pull Request in once you have the sign-off of two other developers, or if you
17 | do not have permission to do that, you may request the second reviewer to merge it for you.
18 |
19 | ## Code of Conduct
20 |
21 | ### Our Pledge
22 |
23 | In the interest of fostering an open and welcoming environment, we as
24 | contributors and maintainers pledge to making participation in our project and
25 | our community a harassment-free experience for everyone, regardless of age, body
26 | size, disability, ethnicity, gender identity and expression, level of experience,
27 | nationality, personal appearance, race, religion, or sexual identity and
28 | orientation.
29 |
30 | ### Our Standards
31 |
32 | Examples of behavior that contributes to creating a positive environment
33 | include:
34 |
35 | * Using welcoming and inclusive language
36 | * Being respectful of differing viewpoints and experiences
37 | * Gracefully accepting constructive criticism
38 | * Focusing on what is best for the community
39 | * Showing empathy towards other community members
40 |
41 | Examples of unacceptable behavior by participants include:
42 |
43 | * The use of sexualized language or imagery and unwelcome sexual attention or
44 | advances
45 | * Trolling, insulting/derogatory comments, and personal or political attacks
46 | * Public or private harassment
47 | * Publishing others' private information, such as a physical or electronic
48 | address, without explicit permission
49 | * Other conduct which could reasonably be considered inappropriate in a
50 | professional setting
51 |
52 | ### Our Responsibilities
53 |
54 | Project maintainers are responsible for clarifying the standards of acceptable
55 | behavior and are expected to take appropriate and fair corrective action in
56 | response to any instances of unacceptable behavior.
57 |
58 | Project maintainers have the right and responsibility to remove, edit, or
59 | reject comments, commits, code, wiki edits, issues, and other contributions
60 | that are not aligned to this Code of Conduct, or to ban temporarily or
61 | permanently any contributor for other behaviors that they deem inappropriate,
62 | threatening, offensive, or harmful.
63 |
64 | ### Scope
65 |
66 | This Code of Conduct applies both within project spaces and in public spaces
67 | when an individual is representing the project or its community. Examples of
68 | representing a project or community include using an official project e-mail
69 | address, posting via an official social media account, or acting as an appointed
70 | representative at an online or offline event. Representation of a project may be
71 | further defined and clarified by project maintainers.
72 |
73 | ### Enforcement
74 |
75 | Instances of abusive, harassing, or otherwise unacceptable behavior may be
76 | reported by contacting the project team at [INSERT EMAIL ADDRESS]. All
77 | complaints will be reviewed and investigated and will result in a response that
78 | is deemed necessary and appropriate to the circumstances. The project team is
79 | obligated to maintain confidentiality with regard to the reporter of an incident.
80 | Further details of specific enforcement policies may be posted separately.
81 |
82 | Project maintainers who do not follow or enforce the Code of Conduct in good
83 | faith may face temporary or permanent repercussions as determined by other
84 | members of the project's leadership.
85 |
86 | ### Attribution
87 |
88 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
89 | available at [http://contributor-covenant.org/version/1/4][version]
90 |
91 | [homepage]: http://contributor-covenant.org
92 | [version]: http://contributor-covenant.org/version/1/4/
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 |
3 | Copyright (c) 2018 plotly
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/Procfile:
--------------------------------------------------------------------------------
1 | web: gunicorn app:server --timeout 300
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Dash Object Detection Explorer
2 |
3 | This is a demo of the Dash interactive Python framework developed by [Plotly](https://plot.ly/).
4 |
5 | Dash abstracts away all of the technologies and protocols required to build an interactive web-based application and is a simple and effective way to bind a user interface around your Python code. To learn more check out our [documentation](https://plot.ly/dash).
6 |
7 | Try out the [demo app here](https://dash-object-detection.plot.ly/).
8 |
9 | 
10 |
11 | ## Getting Started
12 |
13 | ### Using the demo
14 |
15 | To get started, select the footage you want to view, and choose the display mode (with or without bounding boxes). Then, you can start playing the video, and the visualization will be displayed depending on the current time.
16 |
17 | ### Running the app locally
18 |
19 | First create a virtual environment with conda or venv, then activate it.
20 |
21 | ```
22 | virtualenv dash-object-detection
23 |
24 | # Windows
25 | dash-object-detection\Scripts\activate
26 | # Or Linux
27 | source dash-object-detection/bin/activate
28 | ```
29 |
30 | Clone the Git repo, then install the requirements with pip:
31 | ```
32 | git clone https://github.com/plotly/dash-object-detection.git
33 | cd dash-object-detection
34 | pip install -r requirements.txt
35 | ```
36 |
37 | Run the app:
38 | ```
39 | python app.py
40 | ```
41 |
42 | ## About the app
43 | The videos are displayed using a community-maintained Dash video component built by two Plotly community contributors. You can find the [source code here](https://github.com/SkyRatInd/Video-Engine-Dash).
44 |
45 | All videos used are open-sourced under Creative Commons. The [original links can be found here](data/original_footage.md).
46 |
47 | ### Model
48 | The object detection model is MobileNet v1, made by Google and trained on the COCO dataset. You can find the implementation in the [official GitHub repo](https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md). You are encouraged to try this app with other models.
49 |
50 | ### Bounding Box Generation
51 | The data displayed in the app are pre-generated for demo purposes. To generate the CSV files containing the objects detected in each frame, as well as the output video with bounding boxes, please refer to `utils/generate_video_data.py`. You will need recent versions of TensorFlow and OpenCV, as well as the frozen graph `ssd_mobilenet_v1_coco`, which you can [download from the Model Zoo](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md). Make sure to place the frozen graph inside the same folder as `generate_video_data.py`, i.e. `utils`.
52 |
53 | ## Built With
54 |
55 | * [Dash](https://dash.plot.ly/) - Main server and interactive components
56 | * [Plotly Python](https://plot.ly/python/) - Used to create the interactive plots
57 | * [OpenCV](https://docs.opencv.org/) - Create the video with bounding boxes
58 | * [TensorFlow](https://www.tensorflow.org/api_docs/) - Generate the bounding box data
59 |
60 | ## Contributing
61 |
62 | Please read [CONTRIBUTING.md](CONTRIBUTING.md) for details on our code of conduct, and the process for submitting pull requests to us.
63 |
64 | ## Authors
65 |
66 | * **Xing Han Lu** - *Initial Work* - [@xhlulu](https://github.com/xhlulu)
67 | * **Yi Cao** - *Restyle* - [@ycaokris](https://github.com/ycaokris)
68 |
69 | See also the list of [contributors](https://github.com/plotly/dash-object-detection/contributors) who participated in this project.
70 |
71 | ## License
72 |
73 | This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
74 |
75 | ## Screenshots
76 | 
77 | 
78 |
--------------------------------------------------------------------------------
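Note on the data files: the detection CSVs in `data/` are consumed by `app.py` through three columns, `frame`, `class_str`, and `score`. A minimal sketch of the app's per-frame, per-threshold filtering, using hypothetical rows in place of a real CSV:

```python
import pandas as pd

# Hypothetical rows mimicking the CSV schema read by app.py
# (only the columns the callbacks use: frame, class_str, score).
df = pd.DataFrame({
    "frame": [1, 1, 1, 2],
    "class_str": ["person", "car", "person", "car"],
    "score": [0.91, 0.45, 0.28, 0.83],
})

# Reproduce the filtering done in the update callbacks: keep the
# detections of the current frame whose confidence exceeds the
# slider threshold (here 30%).
threshold_dec = 30 / 100
frame_df = df[(df["frame"] == 1) & (df["score"] > threshold_dec)]
print(frame_df["class_str"].tolist())  # ['person', 'car']
```

The same two-step filter (frame match, then score threshold) backs the bar-chart, pie-chart, and heatmap callbacks in `app.py`.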
/app.py:
--------------------------------------------------------------------------------
1 | from textwrap import dedent
2 |
3 | import dash
4 | import dash_core_components as dcc
5 | import dash_html_components as html
6 | import dash_player as player
7 | import numpy as np
8 | import pandas as pd
9 | import plotly.graph_objs as go
10 | from dash.dependencies import Input, Output, State
11 |
12 |
13 | DEBUG = True
14 | FRAMERATE = 24.0
15 |
16 | app = dash.Dash(__name__)
17 | server = app.server
18 |
19 | app.scripts.config.serve_locally = True
20 | app.config['suppress_callback_exceptions'] = True
21 |
22 |
23 | def load_data(path):
24 |     """Load the data for a specific footage (given by its path). Returns a dictionary of useful
25 |     variables: the dataframe containing all detections and bounding-box locations, the number of
26 |     classes in that footage, the matrix of class names, the padded class list, and the rounded
27 |     square root of the number of classes."""
28 |
29 | # Load the dataframe containing all the processed object detections inside the video
30 | video_info_df = pd.read_csv(path)
31 |
32 | # The list of classes, and the number of classes
33 | classes_list = video_info_df["class_str"].value_counts().index.tolist()
34 | n_classes = len(classes_list)
35 |
36 | # Gets the smallest value needed to add to the end of the classes list to get a square matrix
37 | root_round = np.ceil(np.sqrt(len(classes_list)))
38 | total_size = root_round ** 2
39 | padding_value = int(total_size - n_classes)
40 | classes_padded = np.pad(classes_list, (0, padding_value), mode='constant')
41 |
42 | # The padded matrix containing all the classes inside a matrix
43 | classes_matrix = np.reshape(classes_padded, (int(root_round), int(root_round)))
44 |
45 | # Flip it for better looks
46 | classes_matrix = np.flip(classes_matrix, axis=0)
47 |
48 | data_dict = {
49 | "video_info_df": video_info_df,
50 | "n_classes": n_classes,
51 | "classes_matrix": classes_matrix,
52 | "classes_padded": classes_padded,
53 | "root_round": root_round
54 | }
55 |
56 | if DEBUG:
57 | print(f'{path} loaded.')
58 |
59 | return data_dict
60 |
61 |
62 | def markdown_popup():
63 | return html.Div(
64 | id='markdown',
65 | className="model",
66 | style={'display': 'none'},
67 | children=(
68 | html.Div(
69 | className="markdown-container",
70 | children=[
71 | html.Div(
72 | className='close-container',
73 | children=html.Button(
74 | "Close",
75 | id="markdown_close",
76 | n_clicks=0,
77 | className="closeButton",
78 | style={'border': 'none', 'height': '100%'}
79 | )
80 | ),
81 | html.Div(
82 | className='markdown-text',
83 | children=[dcc.Markdown(
84 | children=dedent(
85 | '''
86 | ##### What am I looking at?
87 |
88 | This app enhances visualization of objects detected using state-of-the-art Mobile Vision Neural Networks.
89 | Most user generated videos are dynamic and fast-paced, which might be hard to interpret. A confidence
90 | heatmap stays consistent through the video and intuitively displays the model predictions. The pie chart
91 | lets you interpret how the object classes are divided, which is useful when analyzing videos with numerous
92 | and differing objects.
93 |
94 | ##### More about this dash app
95 |
96 | The purpose of this demo is to explore alternative visualization methods for Object Detection. Therefore,
97 | the visualizations, predictions and videos are not generated in real time, but done beforehand. To read
98 | more about it, please visit the [project repo](https://github.com/plotly/dash-object-detection).
99 |
100 | '''
101 | ))
102 | ]
103 | )
104 | ]
105 | )
106 | )
107 | )
108 |
109 |
110 | # Main App
111 |
112 | app.layout = html.Div(
113 | children=[
114 | html.Div(
115 | id='top-bar',
116 | className='row',
117 | style={'backgroundColor': '#fa4f56',
118 | 'height': '5px',
119 | }
120 | ),
121 | html.Div(
122 | className='container',
123 | children=[
124 | html.Div(
125 | id='left-side-column',
126 | className='eight columns',
127 | style={'display': 'flex',
128 | 'flexDirection': 'column',
129 | 'flex': 1,
130 | 'height': 'calc(100vh - 5px)',
131 | 'backgroundColor': '#F2F2F2',
132 | 'overflow-y': 'scroll',
133 | 'marginLeft': '0px',
134 | 'justifyContent': 'flex-start',
135 | 'alignItems': 'center'},
136 | children=[
137 | html.Div(
138 | id='header-section',
139 | children=[
140 | html.H4(
141 | 'Object Detection Explorer'
142 | ),
143 | html.P(
144 |                                 'To get started, select the footage you want to view, and choose the display mode (with or without'
145 | ' bounding boxes). Then, you can start playing the video, and the visualization will '
146 | 'be displayed depending on the current time.'
147 | ),
148 | html.Button("Learn More", id="learn-more-button", n_clicks=0)
149 | ]
150 | ),
151 | html.Div(
152 | className='video-outer-container',
153 | children=html.Div(
154 | style={'width': '100%', 'paddingBottom': '56.25%', 'position': 'relative'},
155 | children=player.DashPlayer(
156 | id='video-display',
157 | style={'position': 'absolute', 'width': '100%',
158 | 'height': '100%', 'top': '0', 'left': '0', 'bottom': '0', 'right': '0'},
159 | url='https://www.youtube.com/watch?v=gPtn6hD7o8g',
160 | controls=True,
161 | playing=False,
162 | volume=1,
163 | width='100%',
164 | height='100%'
165 | )
166 | )
167 | ),
168 | html.Div(
169 | className='control-section',
170 | children=[
171 | html.Div(
172 | className='control-element',
173 | children=[
174 | html.Div(children=["Minimum Confidence Threshold:"], style={'width': '40%'}),
175 | html.Div(dcc.Slider(
176 | id='slider-minimum-confidence-threshold',
177 | min=20,
178 | max=80,
179 | marks={i: f'{i}%' for i in range(20, 81, 10)},
180 | value=30,
181 | updatemode='drag'
182 | ), style={'width': '60%'})
183 | ]
184 | ),
185 |
186 | html.Div(
187 | className='control-element',
188 | children=[
189 | html.Div(children=["Footage Selection:"], style={'width': '40%'}),
190 | dcc.Dropdown(
191 | id="dropdown-footage-selection",
192 | options=[
193 | {'label': 'Drone recording of canal festival',
194 | 'value': 'DroneCanalFestival'},
195 | {'label': 'Drone recording of car festival', 'value': 'car_show_drone'},
196 | {'label': 'Drone recording of car festival #2',
197 | 'value': 'DroneCarFestival2'},
198 | {'label': 'Drone recording of a farm', 'value': 'FarmDrone'},
199 | {'label': 'Lion fighting Zebras', 'value': 'zebra'},
200 | {'label': 'Man caught by a CCTV', 'value': 'ManCCTV'},
201 | {'label': 'Man driving expensive car', 'value': 'car_footage'},
202 | {'label': 'Restaurant Robbery', 'value': 'RestaurantHoldup'}
203 | ],
204 | value='car_show_drone',
205 | clearable=False,
206 | style={'width': '60%'}
207 | )
208 | ]
209 | ),
210 |
211 | html.Div(
212 | className='control-element',
213 | children=[
214 | html.Div(children=["Video Display Mode:"], style={'width': '40%'}),
215 | dcc.Dropdown(
216 | id="dropdown-video-display-mode",
217 | options=[
218 | {'label': 'Regular Display', 'value': 'regular'},
219 | {'label': 'Display with Bounding Boxes', 'value': 'bounding_box'},
220 | ],
221 | value='bounding_box',
222 | searchable=False,
223 | clearable=False,
224 | style={'width': '60%'}
225 | )
226 | ]
227 | ),
228 |
229 | html.Div(
230 | className='control-element',
231 | children=[
232 | html.Div(children=["Graph View Mode:"], style={'width': '40%'}),
233 | dcc.Dropdown(
234 | id="dropdown-graph-view-mode",
235 | options=[
236 | {'label': 'Visual Mode', 'value': 'visual'},
237 | {'label': 'Detection Mode', 'value': 'detection'}
238 | ],
239 | value='visual',
240 | searchable=False,
241 | clearable=False,
242 | style={'width': '60%'}
243 | )
244 | ]
245 | )
246 | ]
247 | )
248 | ]
249 | ),
250 | html.Div(
251 | id='right-side-column',
252 | className='four columns',
253 | style={
254 | 'height': 'calc(100vh - 5px)',
255 | 'overflow-y': 'scroll',
256 | 'marginLeft': '1%',
257 | 'display': 'flex',
258 | 'backgroundColor': '#F9F9F9',
259 | 'flexDirection': 'column'
260 | },
261 | children=[
262 | html.Div(
263 | className='img-container',
264 | children=html.Img(
265 | style={'height': '100%', 'margin': '2px'},
266 | src="https://s3-us-west-1.amazonaws.com/plotly-tutorials/logo/new-branding/dash-logo-by-plotly-stripe.png")
267 | ),
268 | html.Div(id="div-visual-mode"),
269 | html.Div(id="div-detection-mode")
270 | ]
271 | )]),
272 | markdown_popup()
273 | ]
274 | )
275 |
276 |
277 | # Data Loading
278 | @app.server.before_first_request
279 | def load_all_footage():
280 | global data_dict, url_dict
281 |
282 | # Load the dictionary containing all the variables needed for analysis
283 | data_dict = {
284 | 'james_bond': load_data("data/james_bond_object_data.csv"),
285 | 'zebra': load_data("data/Zebra_object_data.csv"),
286 | 'car_show_drone': load_data("data/CarShowDrone_object_data.csv"),
287 | 'car_footage': load_data("data/CarFootage_object_data.csv"),
288 | 'DroneCanalFestival': load_data("data/DroneCanalFestivalDetectionData.csv"),
289 | 'DroneCarFestival2': load_data("data/DroneCarFestival2DetectionData.csv"),
290 | 'FarmDrone': load_data("data/FarmDroneDetectionData.csv"),
291 | 'ManCCTV': load_data("data/ManCCTVDetectionData.csv"),
292 | 'RestaurantHoldup': load_data("data/RestaurantHoldupDetectionData.csv")
293 | }
294 |
295 | url_dict = {
296 | 'regular': {
297 | 'james_bond': 'https://www.youtube.com/watch?v=g9S5GndUhko',
298 | 'zebra': 'https://www.youtube.com/watch?v=TVvtD3AVt10',
299 | 'car_show_drone': 'https://www.youtube.com/watch?v=gPtn6hD7o8g',
300 | 'car_footage': 'https://www.youtube.com/watch?v=qX3bDxHuq6I',
301 | 'DroneCanalFestival': 'https://youtu.be/0oucTt2OW7M',
302 | 'DroneCarFestival2': 'https://youtu.be/vhJ7MHsJvwY',
303 | 'FarmDrone': 'https://youtu.be/aXfKuaP8v_A',
304 | 'ManCCTV': 'https://youtu.be/BYZORBIxgbc',
305 | 'RestaurantHoldup': 'https://youtu.be/WDin4qqgpac',
306 | },
307 |
308 | 'bounding_box': {
309 | 'james_bond': 'https://www.youtube.com/watch?v=g9S5GndUhko',
310 | 'zebra': 'https://www.youtube.com/watch?v=G2pbZgyWQ5E',
311 | 'car_show_drone': 'https://www.youtube.com/watch?v=9F5FdcVmLOY',
312 | 'car_footage': 'https://www.youtube.com/watch?v=EhnNosq1Lrc',
313 | 'DroneCanalFestival': 'https://youtu.be/6ZZmsnwk2HQ',
314 | 'DroneCarFestival2': 'https://youtu.be/2Gr4RQ-JHIs',
315 | 'FarmDrone': 'https://youtu.be/pvvW5yZlpyc',
316 | 'ManCCTV': 'https://youtu.be/1oMrHLrtOZw',
317 | 'RestaurantHoldup': 'https://youtu.be/HOIKOwixYEY',
318 | }
319 | }
320 |
321 |
322 | # Footage Selection
323 | @app.callback(Output("video-display", "url"),
324 | [Input('dropdown-footage-selection', 'value'),
325 | Input('dropdown-video-display-mode', 'value')])
326 | def select_footage(footage, display_mode):
327 | # Find desired footage and update player video
328 | url = url_dict[display_mode][footage]
329 | return url
330 |
331 |
332 | # Learn more popup
333 | @app.callback(Output("markdown", "style"),
334 | [Input("learn-more-button", "n_clicks"), Input("markdown_close", "n_clicks")])
335 | def update_click_output(button_click, close_click):
336 | if button_click > close_click:
337 | return {"display": "block"}
338 | else:
339 | return {"display": "none"}
340 |
341 |
342 | @app.callback(Output("div-visual-mode", "children"),
343 | [Input("dropdown-graph-view-mode", "value")])
344 | def update_output(dropdown_value):
345 | if dropdown_value == "visual":
346 | return [
347 | dcc.Interval(
348 | id="interval-visual-mode",
349 | interval=700,
350 | n_intervals=0
351 | ),
352 | html.Div(
353 | children=[
354 | html.P(children="Confidence Level of Object Presence",
355 | className='plot-title'),
356 | dcc.Graph(
357 | id="heatmap-confidence",
358 | style={'height': '45vh', 'width': '100%'}),
359 |
360 | html.P(children="Object Count",
361 | className='plot-title'),
362 | dcc.Graph(
363 | id="pie-object-count",
364 | style={'height': '40vh', 'width': '100%'}
365 | )
366 |
367 | ]
368 | )
369 | ]
370 | else:
371 | return []
372 |
373 |
374 | @app.callback(Output("div-detection-mode", "children"),
375 | [Input("dropdown-graph-view-mode", "value")])
376 | def update_detection_mode(value):
377 | if value == "detection":
378 | return [
379 | dcc.Interval(
380 | id="interval-detection-mode",
381 | interval=700,
382 | n_intervals=0
383 | ),
384 | html.Div(
385 | children=[
386 | html.P(children="Detection Score of Most Probable Objects",
387 | className='plot-title'),
388 | dcc.Graph(
389 | id="bar-score-graph",
390 | style={'height': '55vh'}
391 | )
392 | ]
393 | )
394 | ]
395 | else:
396 | return []
397 |
398 |
399 | # Updating Figures
400 | @app.callback(Output("bar-score-graph", "figure"),
401 | [Input("interval-detection-mode", "n_intervals")],
402 | [State("video-display", "currentTime"),
403 | State('dropdown-footage-selection', 'value'),
404 | State('slider-minimum-confidence-threshold', 'value')])
405 | def update_score_bar(n, current_time, footage, threshold):
406 | layout = go.Layout(
407 | showlegend=False,
408 | paper_bgcolor='rgb(249,249,249)',
409 | plot_bgcolor='rgb(249,249,249)',
410 | xaxis={
411 | 'automargin': True,
412 | },
413 | yaxis={
414 | 'title': 'Score',
415 | 'automargin': True,
416 | 'range': [0, 1]
417 | }
418 | )
419 |
420 | if current_time is not None:
421 | current_frame = round(current_time * FRAMERATE)
422 |
423 | if n > 0 and current_frame > 0:
424 | video_info_df = data_dict[footage]["video_info_df"]
425 |
426 | # Select the subset of the dataset that correspond to the current frame
427 | frame_df = video_info_df[video_info_df["frame"] == current_frame]
428 |
429 | # Select only the frames above the threshold
430 | threshold_dec = threshold / 100 # Threshold in decimal
431 | frame_df = frame_df[frame_df["score"] > threshold_dec]
432 |
433 |             # Select up to 8 detections with the highest scores
434 | frame_df = frame_df[:min(8, frame_df.shape[0])]
435 |
436 | # Add count to object names (e.g. person --> person 1, person --> person 2)
437 | objects = frame_df["class_str"].tolist()
438 | object_count_dict = {x: 0 for x in set(objects)} # Keeps count of the objects
439 | objects_wc = [] # Object renamed with counts
440 |             for obj in objects:
441 |                 object_count_dict[obj] += 1  # Increment count
442 |                 objects_wc.append(f"{obj} {object_count_dict[obj]}")
443 |
444 |             colors = ['rgb(250,79,86)'] * len(objects_wc)
445 |
446 | # Add text information
447 | y_text = [f"{round(value * 100)}% confidence" for value in frame_df["score"].tolist()]
448 |
449 | figure = go.Figure({
450 | 'data': [{'hoverinfo': 'x+text',
451 | 'name': 'Detection Scores',
452 | 'text': y_text,
453 | 'type': 'bar',
454 | 'x': objects_wc,
455 | 'marker': {'color': colors},
456 | 'y': frame_df["score"].tolist()}],
457 | 'layout': {'showlegend': False,
458 | 'autosize': False,
459 | 'paper_bgcolor': 'rgb(249,249,249)',
460 | 'plot_bgcolor': 'rgb(249,249,249)',
461 | 'xaxis': {'automargin': True, 'tickangle': -45},
462 | 'yaxis': {'automargin': True, 'range': [0, 1], 'title': {'text': 'Score'}}}
463 | }
464 | )
465 | return figure
466 |
467 | return go.Figure(data=[go.Bar()], layout=layout) # Returns empty bar
468 |
469 |
470 | @app.callback(Output("pie-object-count", "figure"),
471 | [Input("interval-visual-mode", "n_intervals")],
472 | [State("video-display", "currentTime"),
473 | State('dropdown-footage-selection', 'value'),
474 | State('slider-minimum-confidence-threshold', 'value')])
475 | def update_object_count_pie(n, current_time, footage, threshold):
476 | layout = go.Layout(
477 | showlegend=True,
478 | paper_bgcolor='rgb(249,249,249)',
479 | plot_bgcolor='rgb(249,249,249)',
480 | autosize=False,
481 | margin=go.layout.Margin(
482 | l=10,
483 | r=10,
484 | t=15,
485 | b=15
486 | )
487 | )
488 |
489 | if current_time is not None:
490 | current_frame = round(current_time * FRAMERATE)
491 |
492 | if n > 0 and current_frame > 0:
493 | video_info_df = data_dict[footage]["video_info_df"]
494 |
495 | # Select the subset of the dataset that correspond to the current frame
496 | frame_df = video_info_df[video_info_df["frame"] == current_frame]
497 |
498 | # Select only the frames above the threshold
499 | threshold_dec = threshold / 100 # Threshold in decimal
500 | frame_df = frame_df[frame_df["score"] > threshold_dec]
501 |
502 | # Get the count of each object class
503 | class_counts = frame_df["class_str"].value_counts()
504 |
505 | classes = class_counts.index.tolist() # List of each class
506 | counts = class_counts.tolist() # List of each count
507 |
508 | text = [f"{count} detected" for count in counts]
509 |
510 | # Set colorscale to piechart
511 | colorscale = ['#fa4f56', '#fe6767', '#ff7c79', '#ff908b', '#ffa39d', '#ffb6b0', '#ffc8c3', '#ffdbd7',
512 | '#ffedeb', '#ffffff']
513 |
514 | pie = go.Pie(
515 | labels=classes,
516 | values=counts,
517 | text=text,
518 | hoverinfo="text+percent",
519 | textinfo="label+percent",
520 | marker={'colors': colorscale[:len(classes)]}
521 | )
522 | return go.Figure(data=[pie], layout=layout)
523 |
524 | return go.Figure(data=[go.Pie()], layout=layout) # Returns empty pie chart
525 |
526 |
527 | @app.callback(Output("heatmap-confidence", "figure"),
528 | [Input("interval-visual-mode", "n_intervals")],
529 | [State("video-display", "currentTime"),
530 | State('dropdown-footage-selection', 'value'),
531 | State('slider-minimum-confidence-threshold', 'value')])
532 | def update_heatmap_confidence(n, current_time, footage, threshold):
533 | layout = go.Layout(
534 | showlegend=False,
535 | paper_bgcolor='rgb(249,249,249)',
536 | plot_bgcolor='rgb(249,249,249)',
537 | autosize=False,
538 | margin=go.layout.Margin(
539 | l=10,
540 | r=10,
541 | b=20,
542 | t=20,
543 | pad=4
544 | )
545 | )
546 |
547 | if current_time is not None:
548 | current_frame = round(current_time * FRAMERATE)
549 |
550 | if n > 0 and current_frame > 0:
551 | # Load variables from the data dictionary
552 | video_info_df = data_dict[footage]["video_info_df"]
553 | classes_padded = data_dict[footage]["classes_padded"]
554 | root_round = data_dict[footage]["root_round"]
555 | classes_matrix = data_dict[footage]["classes_matrix"]
556 |
557 | # Select the subset of the dataset that correspond to the current frame
558 | frame_df = video_info_df[video_info_df["frame"] == current_frame]
559 |
560 | # Select only the frames above the threshold
561 | threshold_dec = threshold / 100
562 | frame_df = frame_df[frame_df["score"] > threshold_dec]
563 |
564 | # Remove duplicate, keep the top result
565 | frame_no_dup = frame_df[["class_str", "score"]].drop_duplicates("class_str")
566 | frame_no_dup.set_index("class_str", inplace=True)
567 |
568 | # The list of scores
569 | score_list = []
570 | for el in classes_padded:
571 | if el in frame_no_dup.index.values:
572 | score_list.append(frame_no_dup.loc[el][0])
573 | else:
574 | score_list.append(0)
575 |
576 | # Generate the score matrix, and flip it for visual
577 | score_matrix = np.reshape(score_list, (-1, int(root_round)))
578 | score_matrix = np.flip(score_matrix, axis=0)
579 |
580 | # We set the color scale to white if there's nothing in the frame_no_dup
581 | if frame_no_dup.shape != (0, 1):
582 | colorscale = [[0, '#f9f9f9'], [1, '#fa4f56']]
583 | else:
584 | colorscale = [[0, '#f9f9f9'], [1, '#f9f9f9']]
585 |
586 | hover_text = [f"{score * 100:.2f}% confidence" for score in score_list]
587 | hover_text = np.reshape(hover_text, (-1, int(root_round)))
588 | hover_text = np.flip(hover_text, axis=0)
589 |
590 | # Add linebreak for multi-word annotation
591 | classes_matrix = classes_matrix.astype(dtype='|U40')
592 |
593 | for index, row in enumerate(classes_matrix):
594 |             row = list(map(lambda x: '<br>'.join(x.split()), row))
595 | classes_matrix[index] = row
596 |
597 | # Set up annotation text
598 | annotation = []
599 | for y_cord in range(int(root_round)):
600 | for x_cord in range(int(root_round)):
601 | annotation_dict = dict(
602 | showarrow=False,
603 | text=classes_matrix[y_cord][x_cord],
604 | xref='x',
605 | yref='y',
606 | x=x_cord,
607 | y=y_cord
608 | )
609 | if score_matrix[y_cord][x_cord] > 0:
610 | annotation_dict['font'] = {'color': '#F9F9F9', 'size': '11'}
611 | else:
612 | annotation_dict['font'] = {'color': '#606060', 'size': '11'}
613 | annotation.append(annotation_dict)
614 |
615 | # Generate heatmap figure
616 |
617 | figure = {
618 | 'data': [
619 | {'colorscale': colorscale,
620 | 'showscale': False,
621 | 'hoverinfo': 'text',
622 | 'text': hover_text,
623 | 'type': 'heatmap',
624 | 'zmin': 0,
625 | 'zmax': 1,
626 | 'xgap': 1,
627 | 'ygap': 1,
628 | 'z': score_matrix}],
629 | 'layout':
630 | {'showlegend': False,
631 | 'autosize': False,
632 | 'paper_bgcolor': 'rgb(249,249,249)',
633 | 'plot_bgcolor': 'rgb(249,249,249)',
634 | 'margin': {'l': 10, 'r': 10, 'b': 20, 't': 20, 'pad': 2},
635 | 'annotations': annotation,
636 | 'xaxis': {'showticklabels': False, 'showgrid': False, 'side': 'top', 'ticks': ''},
637 | 'yaxis': {'showticklabels': False, 'showgrid': False, 'side': 'left', 'ticks': ''}
638 | }
639 | }
640 |
641 | return figure
642 |
643 | # Returns empty figure
644 | return go.Figure(data=[go.Pie()], layout=layout)
645 |
646 |
647 | # Running the server
648 | if __name__ == '__main__':
649 | app.run_server(dev_tools_hot_reload=False, debug=DEBUG, host='0.0.0.0')
650 |
--------------------------------------------------------------------------------
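The score-matrix section of `app.py` above (lines 576–588) reshapes a flat per-frame list into a square grid and flips it so earlier frames render at the bottom of the heatmap. A minimal, self-contained sketch of that pipeline, with an illustrative 3×3 grid and made-up scores:

```python
import numpy as np

# Toy confidence scores for 9 frames; root_round = 3 gives a 3x3 grid.
# Values and grid size are illustrative, not taken from the app's data.
score_list = [0.9, 0.0, 0.7,
              0.4, 0.0, 0.0,
              0.8, 0.5, 0.2]
root_round = 3

# Reshape the flat list into a grid, then flip vertically so the
# first frames end up at the bottom row of the rendered heatmap.
score_matrix = np.reshape(score_list, (-1, int(root_round)))
score_matrix = np.flip(score_matrix, axis=0)

# Hover text gets the same reshape/flip so it stays aligned with z.
hover_text = [f"{score * 100:.2f}% confidence" for score in score_list]
hover_text = np.reshape(hover_text, (-1, int(root_round)))
hover_text = np.flip(hover_text, axis=0)

# After the flip, row 0 holds the scores that came last in score_list.
print(score_matrix[0])
```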
/assets/base.css:
--------------------------------------------------------------------------------
1 | /* Table of contents
2 | ––––––––––––––––––––––––––––––––––––––––––––––––––
3 | - Plotly.js
4 | - Grid
5 | - Base Styles
6 | - Typography
7 | - Links
8 | - Buttons
9 | - Forms
10 | - Lists
11 | - Code
12 | - Tables
13 | - Spacing
14 | - Utilities
15 | - Clearing
16 | - Media Queries
17 | */
18 |
19 | /* Plotly.js
20 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
21 | /* plotly.js's modebar's z-index is 1001 by default
22 | * https://github.com/plotly/plotly.js/blob/7e4d8ab164258f6bd48be56589dacd9bdd7fded2/src/css/_modebar.scss#L5
23 | * In case a dropdown is above the graph, the dropdown's options
24 | * will be rendered below the modebar
25 | * Increase the select option's z-index
26 | */
27 |
28 | /* This was actually not quite right -
29 | dropdowns were overlapping each other (edited October 26)
30 |
31 | .Select {
32 | z-index: 1002;
33 | }*/
34 |
35 | /* Grid
36 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
37 | .container {
38 | position: relative;
39 | width: 100%;
40 | /*max-width: 960px;*/
41 | /*margin: 0 auto;*/
42 | padding: 0 20px;
43 | box-sizing: border-box; }
44 | .column,
45 | .columns {
46 | width: 100%;
47 | float: left;
48 | box-sizing: border-box; }
49 |
50 | /* For devices larger than 400px */
51 | @media (min-width: 400px) {
52 | .container {
53 | width: 100%;
54 | padding: 0; }
55 | }
56 |
57 | /* For devices larger than 550px */
58 | @media (min-width: 550px) {
59 | .container {
60 | width: 100%; }
61 | .column,
62 | .columns {
63 | margin-left: 0; }
64 | .column:first-child,
65 | .columns:first-child {
66 | margin-left: 0; }
67 |
68 | .one.column,
69 | .one.columns { width: 4.66666666667%; }
70 | .two.columns { width: 13.3333333333%; }
71 | .three.columns { width: 22%; }
72 | .four.columns { width: 32.6666666667%; }
73 | .five.columns { width: 39.3333333333%; }
74 | .six.columns { width: 48%; }
75 | .seven.columns { width: 56.6666666667%; }
76 | .eight.columns { width: 65.3333333333%; }
77 | .nine.columns { width: 74.0%; }
78 | .ten.columns { width: 82.6666666667%; }
79 | .eleven.columns { width: 91.3333333333%; }
80 | .twelve.columns { width: 100%; margin-left: 0; }
81 |
82 | .one-third.column { width: 30.6666666667%; }
83 | .two-thirds.column { width: 65.3333333333%; }
84 |
85 | .one-half.column { width: 48%; }
86 |
87 | /* Offsets */
88 | .offset-by-one.column,
89 | .offset-by-one.columns { margin-left: 8.66666666667%; }
90 | .offset-by-two.column,
91 | .offset-by-two.columns { margin-left: 17.3333333333%; }
92 | .offset-by-three.column,
93 | .offset-by-three.columns { margin-left: 26%; }
94 | .offset-by-four.column,
95 | .offset-by-four.columns { margin-left: 34.6666666667%; }
96 | .offset-by-five.column,
97 | .offset-by-five.columns { margin-left: 43.3333333333%; }
98 | .offset-by-six.column,
99 | .offset-by-six.columns { margin-left: 52%; }
100 | .offset-by-seven.column,
101 | .offset-by-seven.columns { margin-left: 60.6666666667%; }
102 | .offset-by-eight.column,
103 | .offset-by-eight.columns { margin-left: 69.3333333333%; }
104 | .offset-by-nine.column,
105 | .offset-by-nine.columns { margin-left: 78.0%; }
106 | .offset-by-ten.column,
107 | .offset-by-ten.columns { margin-left: 86.6666666667%; }
108 | .offset-by-eleven.column,
109 | .offset-by-eleven.columns { margin-left: 95.3333333333%; }
110 |
111 | .offset-by-one-third.column,
112 | .offset-by-one-third.columns { margin-left: 34.6666666667%; }
113 | .offset-by-two-thirds.column,
114 | .offset-by-two-thirds.columns { margin-left: 69.3333333333%; }
115 |
116 | .offset-by-one-half.column,
117 | .offset-by-one-half.columns { margin-left: 52%; }
118 |
119 | }
120 |
121 |
122 | /* Base Styles
123 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
124 | /* NOTE
125 | html is set to 62.5% so that all the REM measurements throughout Skeleton
126 | are based on 10px sizing. So basically 1.5rem = 15px :) */
127 | html {
128 | font-size: 62.5%; }
129 | body {
130 | font-size: 1.5em; /* currently ems cause chrome bug misinterpreting rems on body element */
131 | line-height: 1.6;
132 | font-weight: 400;
133 | font-family: "Open Sans", "HelveticaNeue", "Helvetica Neue", Helvetica, Arial, sans-serif;
134 | color: rgb(50, 50, 50); }
135 |
136 |
137 | /* Typography
138 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
139 | h1, h2, h3, h4, h5, h6 {
140 | margin-top: 0;
141 | margin-bottom: 0;
142 | font-weight: 300; }
143 | h1 { font-size: 4.5rem; line-height: 1.2; letter-spacing: -.1rem; margin-bottom: 2rem; }
144 | h2 { font-size: 3.6rem; line-height: 1.25; letter-spacing: -.1rem; margin-bottom: 1.8rem; margin-top: 1.8rem;}
145 | h3 { font-size: 3.0rem; line-height: 1.3; letter-spacing: -.1rem; margin-bottom: 1.5rem; margin-top: 1.5rem;}
146 | h4 { font-size: 2.6rem; line-height: 1.35; letter-spacing: -.08rem; margin-bottom: 1.2rem; margin-top: 1.2rem;}
147 | h5 { font-size: 2.2rem; line-height: 1.5; letter-spacing: -.05rem; margin-bottom: 0.6rem; margin-top: 0.6rem;}
148 | h6 { font-size: 2.0rem; line-height: 1.6; letter-spacing: 0; margin-bottom: 0.75rem; margin-top: 0.75rem;}
149 |
150 | p {
151 | margin-top: 0; }
152 |
153 |
154 | /* Blockquotes
155 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
156 | blockquote {
157 | border-left: 4px lightgrey solid;
158 | padding-left: 1rem;
159 | margin-top: 2rem;
160 | margin-bottom: 2rem;
161 | margin-left: 0rem;
162 | }
163 |
164 |
165 | /* Links
166 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
167 | a {
168 | color: #1EAEDB;
169 | text-decoration: underline;
170 | cursor: pointer;}
171 | a:hover {
172 | color: #0FA0CE; }
173 |
174 |
175 | /* Buttons
176 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
177 | .button,
178 | button,
179 | input[type="submit"],
180 | input[type="reset"],
181 | input[type="button"] {
182 | display: inline-block;
183 | height: 38px;
184 | padding: 0 30px;
185 | color: #555;
186 | text-align: center;
187 | font-size: 11px;
188 | font-weight: 600;
189 | line-height: 38px;
190 | letter-spacing: .1rem;
191 | text-transform: uppercase;
192 | text-decoration: none;
193 | white-space: nowrap;
194 | background-color: transparent;
195 | border-radius: 4px;
196 | border: 1px solid #bbb;
197 | cursor: pointer;
198 | box-sizing: border-box; }
199 | .button:hover,
200 | button:hover,
201 | input[type="submit"]:hover,
202 | input[type="reset"]:hover,
203 | input[type="button"]:hover,
204 | .button:focus,
205 | button:focus,
206 | input[type="submit"]:focus,
207 | input[type="reset"]:focus,
208 | input[type="button"]:focus {
209 | color: #333;
210 | border-color: #888;
211 | outline: 0; }
212 | .button.button-primary,
213 | button.button-primary,
214 | input[type="submit"].button-primary,
215 | input[type="reset"].button-primary,
216 | input[type="button"].button-primary {
217 | color: #FFF;
218 | background-color: #33C3F0;
219 | border-color: #33C3F0; }
220 | .button.button-primary:hover,
221 | button.button-primary:hover,
222 | input[type="submit"].button-primary:hover,
223 | input[type="reset"].button-primary:hover,
224 | input[type="button"].button-primary:hover,
225 | .button.button-primary:focus,
226 | button.button-primary:focus,
227 | input[type="submit"].button-primary:focus,
228 | input[type="reset"].button-primary:focus,
229 | input[type="button"].button-primary:focus {
230 | color: #FFF;
231 | background-color: #1EAEDB;
232 | border-color: #1EAEDB; }
233 |
234 |
235 | /* Forms
236 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
237 | input[type="email"],
238 | input[type="number"],
239 | input[type="search"],
240 | input[type="text"],
241 | input[type="tel"],
242 | input[type="url"],
243 | input[type="password"],
244 | textarea,
245 | select {
246 | height: 38px;
247 | padding: 6px 10px; /* The 6px vertically centers text on FF, ignored by Webkit */
248 | background-color: #fff;
249 | border: 1px solid #D1D1D1;
250 | border-radius: 4px;
251 | box-shadow: none;
252 | box-sizing: border-box;
253 | font-family: inherit;
254 | font-size: inherit; /*https://stackoverflow.com/questions/6080413/why-doesnt-input-inherit-the-font-from-body*/}
255 | /* Removes awkward default styles on some inputs for iOS */
256 | input[type="email"],
257 | input[type="number"],
258 | input[type="search"],
259 | input[type="text"],
260 | input[type="tel"],
261 | input[type="url"],
262 | input[type="password"],
263 | textarea {
264 | -webkit-appearance: none;
265 | -moz-appearance: none;
266 | appearance: none; }
267 | textarea {
268 | min-height: 65px;
269 | padding-top: 6px;
270 | padding-bottom: 6px; }
271 | input[type="email"]:focus,
272 | input[type="number"]:focus,
273 | input[type="search"]:focus,
274 | input[type="text"]:focus,
275 | input[type="tel"]:focus,
276 | input[type="url"]:focus,
277 | input[type="password"]:focus,
278 | textarea:focus,
279 | select:focus {
280 | border: 1px solid #33C3F0;
281 | outline: 0; }
282 | label,
283 | legend {
284 | display: block;
285 | margin-bottom: 0px; }
286 | fieldset {
287 | padding: 0;
288 | border-width: 0; }
289 | input[type="checkbox"],
290 | input[type="radio"] {
291 | display: inline; }
292 | label > .label-body {
293 | display: inline-block;
294 | margin-left: .5rem;
295 | font-weight: normal; }
296 |
297 |
298 | /* Lists
299 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
300 | ul {
301 | list-style: circle inside; }
302 | ol {
303 | list-style: decimal inside; }
304 | ol, ul {
305 | padding-left: 0;
306 | margin-top: 0; }
307 | ul ul,
308 | ul ol,
309 | ol ol,
310 | ol ul {
311 | margin: 1.5rem 0 1.5rem 3rem;
312 | font-size: 90%; }
313 | li {
314 | margin-bottom: 1rem; }
315 |
316 |
317 | /* Tables
318 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
319 | table {
320 | border-collapse: collapse;
321 | }
322 | th,
323 | td {
324 | padding: 12px 15px;
325 | text-align: left;
326 | border-bottom: 1px solid #E1E1E1; }
327 | th:first-child,
328 | td:first-child {
329 | padding-left: 0; }
330 | th:last-child,
331 | td:last-child {
332 | padding-right: 0; }
333 |
334 |
335 | /* Spacing
336 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
337 | button,
338 | .button {
339 | margin-bottom: 0rem; }
340 | input,
341 | textarea,
342 | select,
343 | fieldset {
344 | margin-bottom: 0rem; }
345 | pre,
346 | dl,
347 | figure,
348 | table,
349 | form {
350 | margin-bottom: 0rem; }
351 | p,
352 | ul,
353 | ol {
354 | margin-bottom: 0.75rem; }
355 |
356 | /* Utilities
357 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
358 | .u-full-width {
359 | width: 100%;
360 | box-sizing: border-box; }
361 | .u-max-full-width {
362 | max-width: 100%;
363 | box-sizing: border-box; }
364 | .u-pull-right {
365 | float: right; }
366 | .u-pull-left {
367 | float: left; }
368 |
369 |
370 | /* Misc
371 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
372 | hr {
373 | margin-top: 3rem;
374 | margin-bottom: 3.5rem;
375 | border-width: 0;
376 | border-top: 1px solid #E1E1E1; }
377 |
378 |
379 | /* Clearing
380 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
381 |
382 | /* Self Clearing Goodness */
383 | .container:after,
384 | .row:after,
385 | .u-cf {
386 | content: "";
387 | display: table;
388 | clear: both; }
389 |
390 |
391 | /* Media Queries
392 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
393 | /*
394 | Note: The best way to structure the use of media queries is to create the queries
395 | near the relevant code. For example, if you wanted to change the styles for buttons
396 | on small devices, paste the mobile query code up in the buttons section and style it
397 | there.
398 | */
399 |
400 |
401 | /* Larger than mobile */
402 | @media (min-width: 400px) {}
403 |
404 | /* Larger than phablet (also point when grid becomes active) */
405 | @media (min-width: 550px) {}
406 |
407 | /* Larger than tablet */
408 | @media (min-width: 750px) {}
409 |
410 | /* Larger than desktop */
411 | @media (min-width: 1000px) {}
412 |
413 | /* Larger than Desktop HD */
414 | @media (min-width: 1200px) {}
--------------------------------------------------------------------------------
/assets/fonts.css:
--------------------------------------------------------------------------------
1 | /* latin-ext */
2 | @font-face {
3 | font-family: 'Raleway';
4 | font-style: italic;
5 | font-weight: 400;
6 | src: local('Raleway Italic'), local('Raleway-Italic'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptsg8zYS_SKggPNyCg4Q4FqPfE.woff2) format('woff2');
7 | unicode-range: U+0100-024F, U+0259, U+1E00-1EFF, U+2020, U+20A0-20AB, U+20AD-20CF, U+2113, U+2C60-2C7F, U+A720-A7FF;
8 | }
9 | /* latin */
10 | @font-face {
11 | font-family: 'Raleway';
12 | font-style: italic;
13 | font-weight: 400;
14 | src: local('Raleway Italic'), local('Raleway-Italic'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptsg8zYS_SKggPNyCg4TYFq.woff2) format('woff2');
15 | unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
16 | }
17 | /* latin-ext */
18 | @font-face {
19 | font-family: 'Raleway';
20 | font-style: italic;
21 | font-weight: 700;
22 | src: local('Raleway Bold Italic'), local('Raleway-BoldItalic'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptpg8zYS_SKggPNyCgw9qR_DNCb_Vo.woff2) format('woff2');
23 | unicode-range: U+0100-024F, U+0259, U+1E00-1EFF, U+2020, U+20A0-20AB, U+20AD-20CF, U+2113, U+2C60-2C7F, U+A720-A7FF;
24 | }
25 | /* latin */
26 | @font-face {
27 | font-family: 'Raleway';
28 | font-style: italic;
29 | font-weight: 700;
30 | src: local('Raleway Bold Italic'), local('Raleway-BoldItalic'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptpg8zYS_SKggPNyCgw9qR_AtCb.woff2) format('woff2');
31 | unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
32 | }
33 | /* latin-ext */
34 | @font-face {
35 | font-family: 'Raleway';
36 | font-style: normal;
37 | font-weight: 400;
38 | src: local('Raleway'), local('Raleway-Regular'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptug8zYS_SKggPNyCMIT5lu.woff2) format('woff2');
39 | unicode-range: U+0100-024F, U+0259, U+1E00-1EFF, U+2020, U+20A0-20AB, U+20AD-20CF, U+2113, U+2C60-2C7F, U+A720-A7FF;
40 | }
41 | /* latin */
42 | @font-face {
43 | font-family: 'Raleway';
44 | font-style: normal;
45 | font-weight: 400;
46 | src: local('Raleway'), local('Raleway-Regular'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptug8zYS_SKggPNyC0ITw.woff2) format('woff2');
47 | unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
48 | }
49 | /* latin-ext */
50 | @font-face {
51 | font-family: 'Raleway';
52 | font-style: normal;
53 | font-weight: 700;
54 | src: local('Raleway Bold'), local('Raleway-Bold'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptrg8zYS_SKggPNwJYtWqhPAMif.woff2) format('woff2');
55 | unicode-range: U+0100-024F, U+0259, U+1E00-1EFF, U+2020, U+20A0-20AB, U+20AD-20CF, U+2113, U+2C60-2C7F, U+A720-A7FF;
56 | }
57 | /* latin */
58 | @font-face {
59 | font-family: 'Raleway';
60 | font-style: normal;
61 | font-weight: 700;
62 | src: local('Raleway Bold'), local('Raleway-Bold'), url(https://fonts.gstatic.com/s/raleway/v12/1Ptrg8zYS_SKggPNwJYtWqZPAA.woff2) format('woff2');
63 | unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
64 | }
65 |
66 | /* cyrillic-ext */
67 | @font-face {
68 | font-family: 'Roboto';
69 | font-style: normal;
70 | font-weight: 400;
71 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu72xKKTU1Kvnz.woff2) format('woff2');
72 | unicode-range: U+0460-052F, U+1C80-1C88, U+20B4, U+2DE0-2DFF, U+A640-A69F, U+FE2E-FE2F;
73 | }
74 | /* cyrillic */
75 | @font-face {
76 | font-family: 'Roboto';
77 | font-style: normal;
78 | font-weight: 400;
79 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu5mxKKTU1Kvnz.woff2) format('woff2');
80 | unicode-range: U+0400-045F, U+0490-0491, U+04B0-04B1, U+2116;
81 | }
82 | /* greek-ext */
83 | @font-face {
84 | font-family: 'Roboto';
85 | font-style: normal;
86 | font-weight: 400;
87 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu7mxKKTU1Kvnz.woff2) format('woff2');
88 | unicode-range: U+1F00-1FFF;
89 | }
90 | /* greek */
91 | @font-face {
92 | font-family: 'Roboto';
93 | font-style: normal;
94 | font-weight: 400;
95 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu4WxKKTU1Kvnz.woff2) format('woff2');
96 | unicode-range: U+0370-03FF;
97 | }
98 | /* vietnamese */
99 | @font-face {
100 | font-family: 'Roboto';
101 | font-style: normal;
102 | font-weight: 400;
103 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu7WxKKTU1Kvnz.woff2) format('woff2');
104 | unicode-range: U+0102-0103, U+0110-0111, U+1EA0-1EF9, U+20AB;
105 | }
106 | /* latin-ext */
107 | @font-face {
108 | font-family: 'Roboto';
109 | font-style: normal;
110 | font-weight: 400;
111 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu7GxKKTU1Kvnz.woff2) format('woff2');
112 | unicode-range: U+0100-024F, U+0259, U+1E00-1EFF, U+2020, U+20A0-20AB, U+20AD-20CF, U+2113, U+2C60-2C7F, U+A720-A7FF;
113 | }
114 | /* latin */
115 | @font-face {
116 | font-family: 'Roboto';
117 | font-style: normal;
118 | font-weight: 400;
119 | src: local('Roboto'), local('Roboto-Regular'), url(https://fonts.gstatic.com/s/roboto/v18/KFOmCnqEu92Fr1Mu4mxKKTU1Kg.woff2) format('woff2');
120 | unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
121 | }
--------------------------------------------------------------------------------
/assets/internal.css:
--------------------------------------------------------------------------------
1 | /* Remove Undo
2 | –––––––––––––––––––––––––––––––––––––––––––––––––– */
3 | ._dash-undo-redo {
4 | display: none;
5 | }
6 |
7 | body {
8 | margin: 0 !important;
9 | color: #606060 !important;
10 | font-family: 'Raleway', sans-serif;
11 | background-color: #F9F9F9 !important;
12 | }
13 |
14 | h4 {
15 | font-family: 'Roboto', sans-serif;
16 | font-weight: 400;
17 | }
18 |
19 | p {
20 | font-size: 16px;
21 | font-weight: 300;
22 | }
23 |
24 | /*.four.columns {*/
25 | /*width: 32.6666666667% !important;*/
26 | /*}*/
27 |
28 | .model {
29 |     display: block; /* Shown by default */
30 | position: fixed; /* Stay in place */
31 | z-index: 1000; /* Sit on top */
32 | left: 0;
33 | top: 0;
34 | width: 100vw; /* Full width */
35 | height: 100vh; /* Full height */
36 | overflow: auto; /* Enable scroll if needed */
37 | background-color: rgb(0, 0, 0); /* Fallback color */
38 | background-color: rgba(0, 0, 0, 0.4); /* Black w/ opacity */
39 | }
40 |
41 | .markdown-container {
42 | width: 60vw;
43 | margin: 10% auto;
44 | padding: 10px 15px;
45 | background-color: #F9F9F9;
46 | border-radius: 10px;
47 | }
48 |
49 | .close-container {
50 | display: inline-block;
51 | width: 100%;
52 | margin: 0;
53 | padding: 0;
54 | }
55 |
56 | .markdown-text {
57 | width: 100%;
58 | margin: 0;
59 | padding: 0px 10px;
60 | }
61 |
62 |
63 | .closeButton {
64 | padding: 0 15px;
65 | font-weight: normal;
66 | float: right;
67 | font-size: 1.2rem;
68 | }
69 |
70 | .closeButton:hover {
71 |     color: red !important;
72 | }
73 |
74 | #header-section {
75 | width: 90%;
76 | margin-top: 2%;
77 | }
78 |
79 | .button:focus {
80 | color: #FFFFFF;
81 | border-color: #bbb;
82 | }
83 |
84 | #learn-more-button {
85 | background-color: #fa4f56;
86 | color: #FFFFFF;
87 | font-size: 13px;
88 | height: 30px;
89 | font-weight: 500;
90 | padding: 5px 25px 30px 25px;
91 | line-height: 30px;
92 | font-family: 'Raleway', sans-serif;
93 | }
94 |
95 | .video-outer-container {
96 | width: 65%;
97 | margin-top: 2%;
98 | margin-bottom: 2%;
99 | min-width: 500px;
100 | }
101 |
102 | .control-section {
103 | width: 85%;
104 | display: flex;
105 | flex-flow: column nowrap;
106 | padding: 5px;
107 | flex-shrink: 0;
108 | margin-bottom: 2%;
109 | }
110 |
111 | .control-element {
112 | font-size: 15px;
113 | padding: 10px;
114 | display: flex;
115 | flex-flow: row nowrap;
116 | }
117 |
118 | .Select-control {
119 | width: 100% !important;
120 | }
121 |
122 | .Select-menu-outer {
123 | position: relative;
124 | }
125 |
126 | .has-value.Select--single > .Select-control .Select-value .Select-value-label, .has-value.is-pseudo-focused.Select--single > .Select-control .Select-value .Select-value-label {
127 | color: #606060;
128 | }
129 |
130 | .rc-slider-track {
131 | background-color: #fa4f56;
132 | }
133 |
134 | .rc-slider-dot-active, .rc-slider-dot, .rc-slider-handle {
135 | border-color: #fa4f56;
136 | }
137 |
138 | .rc-slider-handle:hover {
139 | border-color: #fa4f56;
140 | }
141 |
142 | .img-container {
143 | border-color: black;
144 | display: flex;
145 | justify-content: flex-end;
146 | height: 40px;
147 | }
148 |
149 | .plot-title {
150 | margin-left: 5%;
151 | margin-bottom: 0px;
152 | font-weight: 500;
153 | }
--------------------------------------------------------------------------------
/config.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | # Replace with the name of your Dash app
4 | # This will end up being part of the URL of your deployed app,
5 | # so it can't contain any spaces, capitalizations, or special characters
6 | #
7 | # This name MUST match the name that you specified in the
8 | # Dash App Manager
9 | DASH_APP_NAME = 'dash-object-detection'
10 |
11 | # Set to 'private' if you want to add a login screen to your app
12 | # You can choose who can view the app in your list of files
13 | # at /organize.
14 | # Set to 'public' if you want your app to be accessible to
15 | # anyone who has access to your Plotly server on your network without
16 | # a login screen.
17 | # Set to 'secret' if you want to add a login screen, but allow it
18 | # to be bypassed by using a secret "share_key" parameter.
19 | DASH_APP_PRIVACY = 'public'
20 |
21 | # Dash On-Premise is configured with either "Path based routing"
22 | # or "Domain based routing"
23 | # Ask your server administrator which version was set up.
24 | # If a separate subdomain was created,
25 | # then set this to `False`. If it was not, set this to `True`.
26 | # Path based routing is the default option and most On-Premise
27 | # users use this option.
28 | PATH_BASED_ROUTING = True
29 |
30 | # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
31 | # This section only needs to be filled out #
32 | # if DASH_APP_PRIVACY is set to 'private' or 'secret' #
33 | # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
34 |
35 | # Fill in with your Plotly On-Premise username
36 | os.environ['PLOTLY_USERNAME'] = 'plotly-username'
37 |
38 | # Fill in with your Plotly On-Premise API key
39 | # See /settings/api to generate a key
40 | # If you have already created a key and saved it on your own machine
41 | # (from the Plotly-Python library instructions at https://plot.ly/python/getting-started)
42 | # then you can view that key in your ~/.plotly/.config file
43 | # or inside a Python session with these commands:
44 | # import plotly
45 | # print(plotly.tools.get_config_file())
46 | os.environ['PLOTLY_API_KEY'] = 'your-plotly-api-key'
47 |
48 | # Fill in with your Plotly On-Premise domain
49 | os.environ['PLOTLY_DOMAIN'] = 'https://your-plotly-domain.com'
50 | os.environ['PLOTLY_API_DOMAIN'] = os.environ['PLOTLY_DOMAIN']
51 |
52 | # Fill in with the domain of your Dash subdomain.
53 | # This matches the domain of the Dash App Manager
54 | PLOTLY_DASH_DOMAIN = 'https://your-dash-manager-plotly-domain.com'
55 |
56 | # Keep as True if your SSL certificates are valid.
57 | # If you are just trialing Plotly On-Premise with self signed certificates,
58 | # then you can set this to False. Note that self-signed certificates are not
59 | # safe for production.
60 | os.environ['PLOTLY_SSL_VERIFICATION'] = 'True'
61 |
--------------------------------------------------------------------------------
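`config.py` above distinguishes path-based from domain-based routing. A hypothetical sketch of how `DASH_APP_NAME` and `PATH_BASED_ROUTING` could translate into a URL prefix; the real wiring lives in `app.py` and may differ:

```python
# Hypothetical helper: derive the app's URL prefix from the two
# config.py settings. Names mirror config.py; the function itself
# is illustrative, not part of the repo.
DASH_APP_NAME = 'dash-object-detection'
PATH_BASED_ROUTING = True

def requests_pathname_prefix(name: str, path_based: bool) -> str:
    # Path-based routing serves the app under /<app-name>/;
    # domain-based routing serves it at the subdomain root.
    return f'/{name}/' if path_based else '/'

print(requests_pathname_prefix(DASH_APP_NAME, PATH_BASED_ROUTING))
```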
/data/ManCCTVDetectionData.csv:
--------------------------------------------------------------------------------
1 | frame,y,x,bottom,right,class,class_str,score
2 | 1,0.13252242,0.4093678,0.43429434,0.5772596,85,clock,0.6988673
3 | 2,0.13247202,0.40937623,0.4343151,0.57724833,85,clock,0.69374526
4 | 3,0.13247204,0.40935123,0.43433386,0.57725555,85,clock,0.693359
5 | 4,0.13249242,0.40934497,0.43432552,0.5772541,85,clock,0.6935436
6 | 5,0.1324829,0.40934348,0.43433905,0.57725614,85,clock,0.69321024
7 | 6,0.1324814,0.40934482,0.434339,0.5772563,85,clock,0.6931361
8 | 7,0.13248171,0.40934473,0.43433875,0.5772563,85,clock,0.69314146
9 | 8,0.13248138,0.4093448,0.43433887,0.5772564,85,clock,0.693137
10 | 9,0.13247812,0.40933725,0.43433946,0.57725716,85,clock,0.69307536
11 | 10,0.13247819,0.4093373,0.43433917,0.57725704,85,clock,0.69307613
12 | 11,0.13247812,0.40933725,0.43433946,0.57725716,85,clock,0.69307536
13 | 12,0.13247779,0.40933737,0.43433896,0.57725704,85,clock,0.6930805
14 | 13,0.13247779,0.40933737,0.43433896,0.57725704,85,clock,0.6930805
15 | 14,0.13247779,0.40933737,0.43433896,0.57725704,85,clock,0.6930805
16 | 15,0.13247779,0.40933737,0.43433896,0.57725704,85,clock,0.6930805
17 | 16,0.13247779,0.40933737,0.43433896,0.57725704,85,clock,0.6930805
18 | 17,0.13247368,0.40932876,0.43435535,0.57725745,85,clock,0.6925126
19 | 18,0.13247214,0.40932745,0.43435788,0.5772574,85,clock,0.6922269
20 | 19,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
21 | 20,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
22 | 21,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
23 | 22,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
24 | 23,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
25 | 24,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
26 | 25,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
27 | 26,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
28 | 27,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
29 | 28,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
30 | 29,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
31 | 30,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
32 | 31,0.13246039,0.4093049,0.43439928,0.5772589,85,clock,0.69109565
33 | 32,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
34 | 33,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
35 | 34,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
36 | 35,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
37 | 36,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
38 | 37,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
39 | 38,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
40 | 39,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
41 | 40,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
42 | 41,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
43 | 42,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
44 | 43,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
45 | 44,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
46 | 45,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
47 | 46,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
48 | 47,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
49 | 48,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
50 | 49,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
51 | 50,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
52 | 51,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
53 | 52,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
54 | 53,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
55 | 54,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
56 | 55,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
57 | 56,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
58 | 57,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
59 | 58,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
60 | 59,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
61 | 60,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
62 | 61,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
63 | 62,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
64 | 63,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
65 | 64,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
66 | 65,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
67 | 66,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
68 | 67,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
69 | 68,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
70 | 69,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
71 | 70,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
72 | 71,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
73 | 72,0.13251299,0.40932506,0.43438256,0.577259,85,clock,0.69371736
74 | 94,0.12989454,0.20621033,0.25301638,0.26437813,1,person,0.21171805
75 | 95,0.13006428,0.20577835,0.25290808,0.264589,1,person,0.22324668
76 | 106,0.0022269487,0.27747107,0.29946315,0.3537811,1,person,0.23271647
77 | 107,0.0022403896,0.2774123,0.2993853,0.35382283,1,person,0.2299701
78 | 108,0.00041650236,0.27840424,0.32549757,0.3708799,1,person,0.23916061
79 | 109,0.0009911358,0.27854967,0.32564723,0.37076503,1,person,0.23817919
80 | 110,0.0012103021,0.27868533,0.32596484,0.37080848,1,person,0.23277129
81 | 111,0.0011888742,0.27868575,0.3257637,0.37085223,1,person,0.22996847
82 | 112,0.0,0.2966817,0.35041165,0.39050186,1,person,0.3074168
83 | 113,0.0,0.2970157,0.3499587,0.39077118,1,person,0.31888554
84 | 116,0.037258267,0.33273765,0.35541368,0.4386073,1,person,0.2133668
85 | 117,0.03712678,0.3325244,0.35534126,0.43858966,1,person,0.21895397
86 | 118,0.037165925,0.33236486,0.35509998,0.4376108,1,person,0.22511394
87 | 119,0.09991081,0.33460414,0.3676684,0.43680924,1,person,0.24410802
88 | 119,0.09991081,0.33460414,0.3676684,0.43680924,18,dog,0.20710325
89 | 120,0.099660784,0.33462998,0.3677235,0.43732473,1,person,0.2518803
90 | 121,0.09954591,0.33459827,0.3675173,0.43760756,1,person,0.25875753
91 | 122,0.102022186,0.34292272,0.3827008,0.43799105,1,person,0.33808887
92 | 123,0.101841256,0.34283984,0.382568,0.43808472,1,person,0.34458145
93 | 132,0.06735273,0.41413727,0.41354942,0.48884752,1,person,0.23979302
94 | 133,0.06665832,0.4141612,0.4144037,0.48922712,1,person,0.23581375
95 | 134,0.065966085,0.41275606,0.41404253,0.48916015,1,person,0.23923187
96 | 135,0.066394255,0.4107654,0.43737906,0.50563896,18,dog,0.23565683
97 | 136,0.06571561,0.41009152,0.4370528,0.5059037,18,dog,0.23388486
98 | 137,0.06585282,0.41069874,0.43694264,0.50620055,18,dog,0.25103113
99 | 138,0.075203255,0.41204387,0.44281894,0.5106734,18,dog,0.5425906
100 | 139,0.07615201,0.41192317,0.442954,0.5107133,18,dog,0.55303484
101 | 140,0.07903157,0.43481958,0.46810514,0.54024935,18,dog,0.33982864
102 | 140,0.07085389,0.44276357,0.47712696,0.5363381,1,person,0.26757
103 | 141,0.07901575,0.43506396,0.46816856,0.540078,18,dog,0.33942902
104 | 141,0.070669204,0.4428999,0.47711095,0.5360586,1,person,0.2687298
105 | 142,0.078931466,0.43508363,0.46837467,0.5402254,18,dog,0.34225425
106 | 142,0.070715606,0.44260398,0.4775154,0.5362647,1,person,0.26983273
107 | 143,0.06373046,0.44919175,0.47976065,0.5480023,1,person,0.35121563
108 | 143,0.016124308,0.151325,0.5204238,0.5445348,15,bench,0.21252796
109 | 143,0.09052791,0.40385357,0.48750794,0.5233395,1,person,0.21112186
110 | 144,0.064235196,0.44948018,0.47894347,0.5478138,1,person,0.3577099
111 | 144,0.09085503,0.40392306,0.48718634,0.5232532,1,person,0.22022565
112 | 144,0.015448183,0.14990339,0.52057993,0.5443636,15,bench,0.200135
113 | 145,0.064579934,0.44954965,0.4790211,0.5475472,1,person,0.35729945
114 | 145,0.090812385,0.4041685,0.4871723,0.523683,1,person,0.22642004
115 | 145,0.015026867,0.14890167,0.5202584,0.54397655,15,bench,0.20157512
116 | 146,0.06654008,0.46691236,0.48268545,0.5584964,1,person,0.6534242
117 | 147,0.06581998,0.4665141,0.4826241,0.55845857,1,person,0.6462254
118 | 148,0.06320836,0.4717687,0.48928273,0.56616235,1,person,0.4857126
119 | 149,0.06359525,0.47208408,0.48923594,0.5663321,1,person,0.48201033
120 | 150,0.06306966,0.47178024,0.48902774,0.5664392,1,person,0.48462933
121 | 151,0.07564534,0.5053558,0.51935923,0.60094506,1,person,0.57151616
122 | 152,0.07565078,0.5050396,0.51922274,0.6014209,1,person,0.575136
123 | 153,0.07597938,0.50553364,0.52005196,0.60138434,1,person,0.57721114
124 | 154,0.077832565,0.5073523,0.53175426,0.61256987,1,person,0.6147246
125 | 155,0.07787363,0.50737566,0.53107816,0.612493,1,person,0.60368836
126 | 156,0.118296266,0.5399774,0.5321716,0.6336548,1,person,0.23674718
127 | 157,0.11883938,0.53997123,0.5315086,0.63298154,1,person,0.24992873
128 | 158,0.11868097,0.53997946,0.53153646,0.63296795,1,person,0.2517877
129 | 159,0.12400837,0.566057,0.5741063,0.6586012,1,person,0.2132713
130 | 160,0.12441313,0.5659575,0.57345486,0.65862757,1,person,0.22545294
131 | 161,0.12436703,0.5662334,0.57327664,0.65876615,1,person,0.22267863
132 | 162,0.16382393,0.5773331,0.57619286,0.6635431,1,person,0.3249605
133 | 163,0.16319017,0.5772945,0.57589024,0.6637946,1,person,0.3077547
134 | 164,0.13908437,0.60004675,0.53884137,0.689168,1,person,0.2567934
135 | 165,0.13792346,0.59987956,0.53989136,0.6895037,1,person,0.27048007
136 | 166,0.13803037,0.5999952,0.540422,0.6897375,1,person,0.25065786
137 | 167,0.16382666,0.6046202,0.6009163,0.6942539,1,person,0.34128252
138 | 168,0.16355594,0.6041764,0.6013153,0.6944332,1,person,0.35099295
139 | 169,0.16365817,0.60440135,0.60168695,0.6945406,1,person,0.35059544
140 | 170,0.17385782,0.61489815,0.6149012,0.7082296,1,person,0.24438839
141 | 171,0.2011872,0.61556673,0.60268414,0.70056975,1,person,0.2371161
142 | 172,0.27974737,0.60594094,0.655689,0.71684945,1,person,0.20156129
143 | 173,0.28228223,0.6055097,0.6555332,0.71689194,1,person,0.20246306
144 | 174,0.2826295,0.60545874,0.6554732,0.7167882,1,person,0.20105003
145 | 175,0.23311517,0.6790656,0.6311252,0.7670598,1,person,0.25742134
146 | 176,0.23312451,0.6791413,0.6302309,0.76706934,1,person,0.25558433
147 | 177,0.23327276,0.6791332,0.6308763,0.76722085,1,person,0.23273957
148 | 178,0.23324853,0.6791861,0.63045824,0.7672547,1,person,0.2315905
149 | 179,0.23361382,0.6797866,0.626493,0.76708597,1,person,0.23165858
150 | 221,0.31264114,0.31171495,0.60036314,0.3833992,1,person,0.41304368
151 | 222,0.31258965,0.3117426,0.6003331,0.38342226,1,person,0.41756198
152 | 223,0.3121506,0.31128755,0.5979592,0.3834599,1,person,0.43515986
153 | 224,0.31209606,0.31125158,0.5978322,0.38346297,1,person,0.44092506
154 | 225,0.31666097,0.30860424,0.64554036,0.3955381,1,person,0.44271976
155 | 226,0.3181525,0.3077012,0.64440954,0.3951353,1,person,0.42892623
156 | 227,0.31706744,0.3086091,0.64485824,0.3956165,1,person,0.43937647
157 | 228,0.31709778,0.3087089,0.6444427,0.39541477,1,person,0.43976188
158 | 229,0.30082008,0.341665,0.6632031,0.4401628,1,person,0.55300105
159 | 230,0.3014146,0.342538,0.66164297,0.44053704,1,person,0.5430963
160 | 231,0.30411488,0.35329637,0.64712346,0.45044407,1,person,0.5386491
161 | 232,0.30383694,0.3516294,0.6478303,0.45084724,1,person,0.5462782
162 | 233,0.31713206,0.35530382,0.6437407,0.46302783,1,person,0.5155349
163 | 233,0.38049734,0.33304757,0.6760982,0.47105193,1,person,0.2007579
164 | 234,0.31716496,0.35543323,0.64460164,0.4630751,1,person,0.49798837
165 | 235,0.31718498,0.3549512,0.6451507,0.4630468,1,person,0.5074518
166 | 236,0.31736892,0.35507306,0.6450435,0.46307346,1,person,0.50692606
167 | 237,0.33978623,0.3711374,0.69631404,0.49126628,1,person,0.5683907
168 | 238,0.33997822,0.37046957,0.6950624,0.49054188,1,person,0.6064167
169 | 239,0.34390533,0.3908744,0.6833401,0.49788713,1,person,0.7047791
170 | 240,0.34435043,0.39123794,0.6839467,0.49796703,1,person,0.7089793
171 | 241,0.33849293,0.4188037,0.6616604,0.5050387,1,person,0.5830589
172 | 241,0.20956224,0.03610623,0.9812446,0.99205565,5,airplane,0.22764418
173 | 242,0.33843863,0.41862622,0.6615342,0.50500953,1,person,0.5855583
174 | 242,0.21005207,0.03574568,0.98124975,0.9917794,5,airplane,0.22748329
175 | 243,0.33826023,0.4183588,0.6613066,0.50502264,1,person,0.5995281
176 | 243,0.2064963,0.035553843,0.9810831,0.9909203,5,airplane,0.22823626
177 | 244,0.3382864,0.41853487,0.6611405,0.5049881,1,person,0.59575945
178 | 244,0.20629856,0.03532335,0.9811301,0.9908807,5,airplane,0.22959289
179 | 244,0.5375652,0.45237288,0.6795128,0.534235,1,person,0.20058137
180 | 245,0.32460555,0.43462005,0.6464484,0.53008664,1,person,0.52488476
181 | 245,0.15781161,0.03943789,0.9778007,0.98764384,5,airplane,0.22088644
182 | 246,0.32409245,0.43478066,0.6468846,0.53025126,1,person,0.5328035
183 | 246,0.16205233,0.03892067,0.97905236,0.98962164,5,airplane,0.22468767
184 | 247,0.34787816,0.43770042,0.6803934,0.5467024,1,person,0.7459561
185 | 247,0.40931475,0.41058135,0.6817503,0.52534914,1,person,0.3364101
186 | 247,0.14446142,0.043819338,0.978029,0.9886608,5,airplane,0.208872
187 | 248,0.3482647,0.4382367,0.6805872,0.5463147,1,person,0.7369621
188 | 248,0.40978038,0.4103327,0.68236494,0.5254198,1,person,0.33128944
189 | 248,0.1488421,0.043854862,0.97815907,0.9884654,5,airplane,0.20775728
190 | 249,0.3517816,0.45977047,0.676358,0.5533389,1,person,0.70428455
191 | 249,0.11344877,0.042742252,0.9610225,0.9552403,5,airplane,0.24380895
192 | 249,0.41461307,0.4208938,0.6804405,0.5250883,1,person,0.22935395
193 | 250,0.3519377,0.45989627,0.67637485,0.55322486,1,person,0.70278925
194 | 250,0.11316967,0.042865932,0.9611052,0.9550008,5,airplane,0.2440282
195 | 250,0.41442633,0.42096126,0.6802771,0.5250207,1,person,0.22988726
196 | 251,0.3515228,0.4612616,0.6764704,0.553567,1,person,0.69648665
197 | 251,0.11196038,0.04078242,0.9619899,0.9561262,5,airplane,0.2561104
198 | 251,0.4159158,0.42003292,0.6807404,0.52525663,1,person,0.22253393
199 | 252,0.35161597,0.4611992,0.6766177,0.5536216,1,person,0.69775873
200 | 252,0.11176783,0.0410358,0.96159524,0.95605135,5,airplane,0.25630066
201 | 252,0.41582733,0.41995174,0.6807551,0.525295,1,person,0.22214007
202 | 253,0.34762567,0.4878651,0.6512645,0.56319535,1,person,0.56906694
203 | 253,0.10982275,0.022954673,0.9717301,0.9762007,5,airplane,0.27996048
204 | 254,0.34719425,0.48695067,0.65146035,0.5631727,1,person,0.57064784
205 | 254,0.10826534,0.021087795,0.9725285,0.9766544,5,airplane,0.28838775
206 | 255,0.3471604,0.4869163,0.6514733,0.56322336,1,person,0.5753504
207 | 255,0.1081247,0.02183956,0.9724479,0.97631115,5,airplane,0.28583902
208 | 256,0.34699905,0.48679048,0.65160483,0.56331235,1,person,0.57882273
209 | 256,0.10774091,0.021383673,0.97251606,0.97661555,5,airplane,0.28620514
210 | 257,0.36184207,0.5028991,0.64776444,0.60299224,1,person,0.3294558
211 | 257,0.2975507,0.002993703,0.93782103,0.30112606,47,cup,0.253079
212 | 257,0.18716875,0.044875562,0.98053706,0.99446267,5,airplane,0.22829685
213 | 258,0.36182505,0.50259906,0.6479525,0.6029431,1,person,0.33156437
214 | 258,0.29731864,0.0031303912,0.9372954,0.3009709,47,cup,0.25486645
215 | 258,0.1858963,0.045300245,0.98033583,0.9944122,5,airplane,0.22746523
216 | 259,0.36183938,0.5030119,0.6486962,0.603293,1,person,0.33611044
217 | 259,0.29691386,0.003132537,0.9378872,0.30059117,47,cup,0.25726596
218 | 259,0.1857461,0.043557644,0.98100054,0.99481475,5,airplane,0.23753405
219 | 260,0.36169878,0.50315243,0.6485293,0.6032184,1,person,0.33642682
220 | 260,0.2967532,0.0031419396,0.938424,0.3006508,47,cup,0.2565626
221 | 260,0.18658304,0.042922914,0.9810568,0.99486786,5,airplane,0.23992787
222 | 261,0.36412996,0.5305823,0.6440467,0.6097951,1,person,0.43686843
223 | 261,0.1875658,0.041733533,0.98265314,0.9958365,5,airplane,0.2765846
224 | 261,0.29595762,0.0034813732,0.938662,0.3004706,47,cup,0.25631455
225 | 262,0.36429325,0.5308633,0.6431519,0.6099112,1,person,0.4244156
226 | 262,0.19055876,0.041014254,0.9827341,0.99581784,5,airplane,0.27462694
227 | 262,0.2967419,0.0034540594,0.93843454,0.30062756,47,cup,0.2603062
228 | 262,0.37701085,0.3720324,0.99564314,0.9957131,5,airplane,0.20032208
229 | 263,0.35585213,0.54878914,0.64980876,0.6380961,1,person,0.3619331
230 | 263,0.29649535,0.0046129823,0.93833756,0.30140883,47,cup,0.24931139
231 | 263,0.22046307,0.06082818,0.983716,0.9961271,5,airplane,0.23227264
232 | 263,0.5545862,0.5717854,0.65140903,0.62698466,1,person,0.20788623
233 | 264,0.3555379,0.54880553,0.6490153,0.6381667,1,person,0.37191615
234 | 264,0.2960039,0.0049464703,0.93742406,0.30089244,47,cup,0.24933575
235 | 264,0.2131021,0.060275048,0.98332846,0.9958925,5,airplane,0.23435102
236 | 264,0.5550067,0.57166463,0.65135485,0.62703854,1,person,0.20882481
237 | 265,0.34223908,0.5529831,0.6473865,0.64615613,1,person,0.6432217
238 | 265,0.2954662,0.0050786138,0.94045854,0.30106717,47,cup,0.2736909
239 | 266,0.34219533,0.552937,0.64732647,0.64623207,1,person,0.6402714
240 | 266,0.2955821,0.005067095,0.9407923,0.30100203,47,cup,0.27526614
241 | 267,0.34223786,0.55286103,0.64728916,0.64634377,1,person,0.6411626
242 | 267,0.29505593,0.005149573,0.9410232,0.30082312,47,cup,0.27065772
243 | 268,0.34227097,0.5528807,0.6472506,0.6464334,1,person,0.6403734
244 | 268,0.29504824,0.0051599294,0.94114065,0.30076408,47,cup,0.2705052
245 | 269,0.3426966,0.5531163,0.6459599,0.6469842,1,person,0.63043106
246 | 269,0.29490247,0.005289927,0.9409894,0.30066878,47,cup,0.2698208
247 | 270,0.34251142,0.5527132,0.6463329,0.6470905,1,person,0.6255914
248 | 270,0.29526466,0.005288914,0.9417928,0.30098736,47,cup,0.27933717
249 | 271,0.34783268,0.5699529,0.64038444,0.6536667,1,person,0.5267052
250 | 271,0.2950518,0.005080506,0.9415642,0.30104238,47,cup,0.26888514
251 | 272,0.3477909,0.56980455,0.64028776,0.6538162,1,person,0.48944098
252 | 272,0.29545838,0.0053025335,0.94069976,0.30119246,47,cup,0.27022874
253 | 273,0.29534167,0.0050084144,0.9392014,0.30145657,47,cup,0.26381096
254 | 273,0.12977222,0.061579227,0.96966493,0.97436905,5,airplane,0.21980307
255 | 274,0.2954593,0.0047888607,0.94030225,0.3015824,47,cup,0.26568502
256 | 274,0.13104486,0.06124279,0.96987915,0.97511554,5,airplane,0.2211624
257 | 275,0.29600307,0.004737854,0.9422252,0.30146778,47,cup,0.2588684
258 | 275,0.12930244,0.062360883,0.9706549,0.976143,5,airplane,0.21980152
259 | 276,0.29628956,0.004571244,0.94296074,0.30162936,47,cup,0.25882292
260 | 276,0.12930328,0.06219706,0.97075754,0.9763261,5,airplane,0.22153027
261 | 277,0.29676372,0.004634291,0.9424636,0.3013203,47,cup,0.23256999
262 | 278,0.2956678,0.00442113,0.9429972,0.30171782,47,cup,0.22747627
263 | 279,0.29524463,0.0046039373,0.942308,0.30185062,47,cup,0.23315416
264 | 280,0.29528716,0.004748717,0.9422662,0.3019005,47,cup,0.23623855
265 | 281,0.29620528,0.0034530163,0.9440025,0.30233115,47,cup,0.27276486
266 | 281,0.3532202,0.6170656,0.647949,0.7030255,1,person,0.2502822
267 | 282,0.29729635,0.0027715117,0.9439449,0.30301225,47,cup,0.27816036
268 | 282,0.35356238,0.61815786,0.6482649,0.70350266,1,person,0.2501785
269 | 283,0.29760167,0.0025659353,0.9443053,0.30310017,47,cup,0.26599053
270 | 283,0.35427207,0.61793417,0.6477043,0.70387083,1,person,0.24028154
271 | 284,0.29780838,0.0026481003,0.9439465,0.30300015,47,cup,0.2660449
272 | 284,0.3542003,0.6173266,0.64783543,0.7036983,1,person,0.23920752
273 | 285,0.014379352,0.6410736,0.67294693,0.81480825,1,person,0.47603554
274 | 285,0.37279326,0.61974585,0.6494389,0.7159635,1,person,0.34511405
275 | 285,0.29784548,0.0028529465,0.94362175,0.30278584,47,cup,0.27297008
276 | 286,0.015355647,0.6431955,0.6724936,0.81332374,1,person,0.45150426
277 | 286,0.37184787,0.6203519,0.6497377,0.71576285,1,person,0.3674154
278 | 286,0.29790688,0.0028146058,0.9435123,0.30275816,47,cup,0.2677215
279 | 287,0.012499332,0.6348482,0.67394626,0.81971043,1,person,0.5398772
280 | 287,0.29758674,0.0024759322,0.9430484,0.30292088,47,cup,0.2537035
281 | 288,0.011672467,0.633633,0.6741979,0.82110417,1,person,0.566434
282 | 288,0.29754406,0.002425611,0.94297,0.3029321,47,cup,0.24994226
283 | 289,0.021931857,0.67571306,0.6546366,0.8239112,1,person,0.38582262
284 | 289,0.2962959,0.0029353797,0.9424579,0.3025536,47,cup,0.2547898
285 | 290,0.022646934,0.6754777,0.6540743,0.8244459,1,person,0.40288797
286 | 290,0.29624072,0.0029340088,0.9423677,0.30244634,47,cup,0.25233123
287 | 291,0.022708446,0.67600965,0.6544876,0.8262632,1,person,0.37816948
288 | 291,0.2959735,0.0031507611,0.9418638,0.30223018,47,cup,0.2766228
289 | 292,0.022957504,0.6762053,0.65439534,0.82655525,1,person,0.3657585
290 | 292,0.29586476,0.0031289011,0.9433474,0.302337,47,cup,0.2851911
291 | 293,0.025773168,0.6830766,0.6706729,0.819458,1,person,0.39391753
292 | 293,0.29616064,0.0033557862,0.9446896,0.30206972,47,cup,0.2937415
293 | 294,0.025750846,0.6832559,0.6705191,0.8196378,1,person,0.38928914
294 | 294,0.29628852,0.0032536387,0.9447663,0.30209932,47,cup,0.2912306
295 | 295,0.027024984,0.6833916,0.67030543,0.8200079,1,person,0.38290316
296 | 295,0.2961997,0.0032034218,0.94465387,0.3020178,47,cup,0.28409278
297 | 296,0.027161658,0.68329483,0.6702574,0.82024246,1,person,0.38538533
298 | 296,0.29629278,0.003329888,0.9448236,0.3019929,47,cup,0.2810965
299 | 297,0.024081886,0.68354875,0.6780344,0.8324985,1,person,0.352562
300 | 297,0.29726648,0.0032575577,0.9450946,0.30206376,47,cup,0.2559624
301 | 298,0.024154752,0.68347484,0.6777762,0.8323278,1,person,0.3438391
302 | 298,0.29710576,0.003311947,0.94479096,0.3020264,47,cup,0.25590214
303 | 299,0.021346658,0.683399,0.6788962,0.83332115,1,person,0.3564497
304 | 299,0.2962597,0.0034703314,0.9429681,0.30168885,47,cup,0.25654137
305 | 300,0.02095741,0.6834187,0.6788453,0.8334524,1,person,0.35631466
306 | 300,0.29640195,0.0032606125,0.94301784,0.30185524,47,cup,0.25495112
307 | 301,0.015803069,0.6848496,0.6820861,0.8246236,1,person,0.43486628
308 | 301,0.29616413,0.00283359,0.9431406,0.30254656,47,cup,0.2409832
309 | 302,0.013490319,0.6852822,0.6829461,0.82471865,1,person,0.42865768
310 | 302,0.29584888,0.002629891,0.9415984,0.30239302,47,cup,0.2269731
311 | 303,0.0076743364,0.69100726,0.6862676,0.827703,1,person,0.5028909
312 | 303,0.29648858,0.0025786757,0.9420609,0.30244854,47,cup,0.22235909
313 | 304,0.00829941,0.6916246,0.68631077,0.82744914,1,person,0.5071034
314 | 304,0.29647803,0.0028681308,0.94188845,0.30252874,47,cup,0.222107
315 | 305,0.010312617,0.69472206,0.6812233,0.8263782,1,person,0.32014087
316 | 305,0.29662946,0.0022151172,0.94564474,0.30264837,47,cup,0.2323933
317 | 306,0.01004979,0.69476694,0.6811962,0.82637304,1,person,0.31767684
318 | 306,0.29665452,0.0022029728,0.9456056,0.3028087,47,cup,0.23258598
319 | 307,0.010194689,0.69430864,0.68124557,0.8267242,1,person,0.3259848
320 | 307,0.2970612,0.0019292533,0.94677174,0.3030881,47,cup,0.21368442
321 | 308,0.0100981,0.6941812,0.6808915,0.8269578,1,person,0.32207647
322 | 308,0.29710084,0.0018758178,0.9471447,0.3032089,47,cup,0.21252994
323 | 309,0.2960392,0.0018420815,0.9473196,0.30319527,47,cup,0.22389697
324 | 309,0.012335837,0.6969006,0.673033,0.8307167,1,person,0.2031546
325 | 310,0.29641107,0.0017573088,0.94851065,0.3033024,47,cup,0.2204743
326 | 310,0.012390435,0.69715613,0.6722037,0.83085495,1,person,0.2009718
327 | 311,0.0054935515,0.6963767,0.6773344,0.82663524,1,person,0.26252508
328 | 311,0.2955076,0.0026217997,0.9434784,0.30230823,47,cup,0.21743113
329 | 312,0.0066202283,0.69628996,0.6753413,0.825496,1,person,0.2709965
330 | 312,0.29546848,0.0026646852,0.9433222,0.30222902,47,cup,0.21403734
331 | 313,0.2955545,0.00309439,0.94261277,0.30250466,47,cup,0.21530914
332 | 314,0.29544225,0.0030520856,0.94273484,0.3025103,47,cup,0.21712078
333 | 315,0.29033378,0.012631401,0.9070709,0.280635,47,cup,0.22250757
334 | 316,0.29053628,0.012981564,0.9055939,0.28048128,47,cup,0.22438549
335 | 317,0.29523727,0.0022507608,0.94334984,0.30365786,47,cup,0.24160197
336 | 318,0.29532722,0.002236575,0.9430449,0.30363104,47,cup,0.23870751
337 | 319,0.29531437,0.002400741,0.9427721,0.30343795,47,cup,0.23462692
338 | 320,0.2953576,0.002376452,0.9426558,0.30347127,47,cup,0.23011555
339 | 321,0.2909648,0.012422025,0.9022788,0.28002954,47,cup,0.21787478
340 | 322,0.2911201,0.012627825,0.90146655,0.2798534,47,cup,0.21796192
341 | 323,0.29079908,0.012704879,0.90195066,0.28004837,47,cup,0.21935406
342 | 324,0.29084617,0.01278995,0.9004161,0.2800017,47,cup,0.21839614
343 | 325,0.29103208,0.013351858,0.90533304,0.2800311,47,cup,0.22559448
344 | 326,0.29138085,0.013924807,0.9022807,0.28006896,47,cup,0.22082627
345 | 327,0.2912407,0.014059812,0.9040532,0.28015578,47,cup,0.22870114
346 | 328,0.29122695,0.014061749,0.9075484,0.28045678,47,cup,0.2318847
347 | 329,0.29131964,0.013928071,0.89980316,0.28014076,47,cup,0.24079517
348 | 330,0.2949025,0.0033104718,0.93873984,0.3020733,47,cup,0.23337255
349 | 331,0.2949557,0.003992513,0.9394764,0.30210435,47,cup,0.22423361
350 | 332,0.2942035,0.0036753118,0.9419882,0.30286586,47,cup,0.23361416
351 | 333,0.29482928,0.0041764677,0.9398279,0.30244774,47,cup,0.24867927
352 | 334,0.2950413,0.00410752,0.93959427,0.30256832,47,cup,0.23743686
353 | 335,0.2947738,0.0038642883,0.9412036,0.30266467,47,cup,0.22771913
354 | 336,0.29478917,0.004072383,0.9405949,0.30257618,47,cup,0.23112836
355 | 337,0.2962166,0.0038336664,0.9400933,0.30277503,47,cup,0.254367
356 | 338,0.2961535,0.0037153512,0.93952805,0.30271345,47,cup,0.2534379
357 | 339,0.29550493,0.0032265335,0.93854606,0.30280483,47,cup,0.23775832
358 | 340,0.29537055,0.0031110942,0.93842506,0.3023616,47,cup,0.2288415
359 | 341,0.29097664,0.013976365,0.89028597,0.2796774,47,cup,0.20409742
360 | 345,0.29073688,0.013924897,0.8937137,0.2796345,47,cup,0.20437457
361 | 346,0.29072887,0.013932183,0.893495,0.27962875,47,cup,0.20440301
362 | 347,0.09431794,0.37962848,0.4682376,0.6080859,85,clock,0.50831
363 | 348,0.09429897,0.37964728,0.4681357,0.6080916,85,clock,0.5066786
364 | 349,0.09428178,0.37965834,0.46813738,0.6080906,85,clock,0.50555086
365 | 350,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
366 | 351,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
367 | 352,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
368 | 353,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
369 | 354,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
370 | 355,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
371 | 356,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
372 | 357,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
373 | 358,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
374 | 359,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
375 | 360,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
376 | 361,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
377 | 362,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
378 | 363,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
379 | 364,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
380 | 365,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
381 | 366,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
382 | 367,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
383 | 368,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
384 | 369,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
385 | 370,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
386 | 371,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
387 | 372,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
388 | 373,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
389 | 374,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
390 | 375,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
391 | 376,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
392 | 377,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
393 | 378,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
394 | 379,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
395 | 380,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
396 | 381,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
397 | 382,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
398 | 383,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
399 | 384,0.094281435,0.3796584,0.4681381,0.60809064,85,clock,0.50554234
400 | 385,0.09390959,0.3790092,0.4678389,0.60826,85,clock,0.55909914
401 | 386,0.093902454,0.37899497,0.467857,0.60825837,85,clock,0.5598696
402 | 387,0.09390618,0.3790143,0.46780634,0.60825735,85,clock,0.5588542
403 | 388,0.09390555,0.37902686,0.4678018,0.6082615,85,clock,0.55856943
404 | 389,0.09390667,0.3790281,0.46780726,0.6082593,85,clock,0.5584626
405 | 390,0.09390649,0.3790281,0.46780726,0.6082593,85,clock,0.5584839
406 | 391,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
407 | 392,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
408 | 393,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
409 | 394,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
410 | 395,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.5584568
411 | 396,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.5584568
412 | 397,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
413 | 398,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
414 | 399,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
415 | 400,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
416 | 401,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
417 | 402,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
418 | 403,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
419 | 404,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
420 | 405,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
421 | 406,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
422 | 407,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
423 | 408,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
424 | 409,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
425 | 410,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
426 | 411,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
427 | 412,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
428 | 413,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
429 | 414,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
430 | 415,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
431 | 416,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
432 | 417,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
433 | 418,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
434 | 419,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
435 | 420,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
436 | 421,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
437 | 422,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
438 | 423,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
439 | 424,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
440 | 425,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
441 | 426,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
442 | 427,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
443 | 428,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
444 | 429,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
445 | 430,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
446 | 431,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
447 | 432,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
448 | 433,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
449 | 434,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
450 | 435,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
451 | 436,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
452 | 437,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
453 | 438,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
454 | 439,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
455 | 440,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
456 | 441,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
457 | 442,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
458 | 443,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
459 | 444,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
460 | 445,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
461 | 446,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
462 | 447,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
463 | 448,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
464 | 449,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
465 | 450,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
466 | 451,0.09390679,0.37902814,0.46780697,0.6082592,85,clock,0.55845684
467 |
--------------------------------------------------------------------------------
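
Note on the data layout: the `*_object_data.csv` files ship without a header row. Each row appears to hold a frame index, four normalized bounding-box coordinates (presumably ymin, xmin, ymax, xmax, following the TensorFlow Object Detection API convention — an assumption, not stated in the repo), a COCO class id, a class name, and a confidence score. A minimal pandas sketch for loading one of these files, with hypothetical column names:

```python
import io
import pandas as pd

# Column names are an assumption inferred from the data layout above.
COLUMNS = ["frame", "ymin", "xmin", "ymax", "xmax",
           "class_id", "class_str", "score"]

# Two rows copied from the data stand in for a full file;
# in practice you would pass a path like "data/ManCCTVDetectionData.csv".
sample = io.StringIO(
    "119,0.09991081,0.33460414,0.3676684,0.43680924,1,person,0.24410802\n"
    "119,0.09991081,0.33460414,0.3676684,0.43680924,18,dog,0.20710325\n"
)

df = pd.read_csv(sample, header=None, names=COLUMNS)

# e.g. keep only detections above a confidence threshold
confident = df[df["score"] > 0.22]
print(confident["class_str"].tolist())  # ['person']
```

Note that a single frame can contribute several rows, one per detected object, so frame indices repeat.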
/data/original_footage.md:
--------------------------------------------------------------------------------
1 | ## List of original footage
2 | * Lions fighting Zebras: https://www.youtube.com/watch?v=7Wexb_7ALmU
3 | * Man driving expensive car: https://www.youtube.com/watch?v=pHk2LGraF90
4 | * Drone recording of car festival: https://www.youtube.com/watch?v=ojgtWheibFM
5 | * Drone recording of car festival #2: https://www.youtube.com/watch?v=GxCQDBPimms
6 | * Drone recording of canal festival: https://www.youtube.com/watch?v=k31eo5XsIdM
7 | * Man caught in CCTV: https://www.youtube.com/watch?v=yGdkAiWY8Wk
8 | * Drone recording of Farm: https://www.youtube.com/watch?v=5F_MJH2jbrg
9 |
--------------------------------------------------------------------------------
/images/Screencast.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/plotly/dash-object-detection/62f2028aa14dd317457bc250a44604f84a43efc8/images/Screencast.gif
--------------------------------------------------------------------------------
/images/Screenshot1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/plotly/dash-object-detection/62f2028aa14dd317457bc250a44604f84a43efc8/images/Screenshot1.png
--------------------------------------------------------------------------------
/images/Screenshot2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/plotly/dash-object-detection/62f2028aa14dd317457bc250a44604f84a43efc8/images/Screenshot2.png
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | # Core
2 | dash==0.35.2
3 | dash-auth==1.1.2
4 | dash-html-components==0.13.5
5 | dash-core-components==0.43.0
6 | dash-renderer==0.17.0
7 | gunicorn==19.9.0
8 | plotly==3.6.0
9 | pillow==5.4.1
10 | Flask==1.0.1
11 | scipy==1.2.1
12 | numpy==1.16.1
13 | pandas==0.24.1
14 | dash-player==0.0.1
15 |
16 |
--------------------------------------------------------------------------------
/runtime.txt:
--------------------------------------------------------------------------------
1 | python-3.6.6
--------------------------------------------------------------------------------
/utils/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/plotly/dash-object-detection/62f2028aa14dd317457bc250a44604f84a43efc8/utils/__init__.py
--------------------------------------------------------------------------------
/utils/dash_reusable_components.py:
--------------------------------------------------------------------------------
1 | from textwrap import dedent
2 |
3 | import dash_core_components as dcc
4 | import dash_html_components as html
5 |
6 |
7 | def DemoDescriptionCard(markdown_text):
8 |     """
9 |     Wrap the given markdown text in a styled, centered card:
10 |     80% wide (max-width 1024px), set in the Roboto font family.
11 |
12 |     :param markdown_text: markdown string describing the demo
13 |     :return: html.Div containing the rendered markdown
14 |     """
15 | return html.Div(
16 | className='row',
17 | style={
18 | 'padding': '15px 30px 27px',
19 | 'margin': '10px auto 45px',
20 | 'width': '80%',
21 | 'max-width': '1024px',
22 | 'borderRadius': 5,
23 | 'border': 'thin lightgrey solid',
24 | 'font-family': 'Roboto, sans-serif'
25 | },
26 | children=dcc.Markdown(dedent(markdown_text))
27 | )
28 |
--------------------------------------------------------------------------------
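
The `dedent` call in `DemoDescriptionCard` lets callers pass an indented triple-quoted markdown string without the indentation leaking into the rendered output. A stdlib-only sketch of the behavior the component relies on (the markdown text here is illustrative, not from the app):

```python
from textwrap import dedent

markdown_text = """
    ## Object Detection Explorer

    This demo plays annotated footage alongside detection data.
"""

# dedent strips the common leading whitespace from every line,
# so dcc.Markdown receives clean, left-aligned markdown.
clean = dedent(markdown_text)
print(clean.splitlines()[1])  # prints "## Object Detection Explorer"
```

Without the `dedent`, the four-space indentation would make Markdown treat every line as a code block instead of headings and paragraphs.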
/utils/generate_video_data.py:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 | import numpy as np
3 | import tensorflow as tf
4 | import cv2 as cv
5 | import time
6 | import base64
7 | import pandas as pd
8 | from utils.visualization_utils import visualize_boxes_and_labels_on_image_array # Taken from Google Research GitHub
9 | from utils.mscoco_label_map import category_index
10 |
11 | ############################# MODIFY BELOW #############################
12 |
13 | # Whether to encode each frame as a base64 string (not recommended)
14 | ENCODE_B64 = False
15 | # Print processing information to the console
16 | VERBOSE = True
17 | # Show the video being processed in a window
18 | SHOW_PROCESS = True
19 | # Create a video with the bounding boxes drawn in
20 | WRITE_VIDEO_OUT = True
21 | # Minimum score for a bounding box to be recorded in the data
22 | THRESHOLD = 0.2
23 | OUTPUT_FPS = 24.0
24 | # Name of the video to process (without extension)
25 | VIDEO_FILE_NAME = "../videos/DroneCarFestival3"
26 | VIDEO_EXTENSION = ".mp4"
27 |
28 | ############################# MODIFY ABOVE #############################
29 |
30 | # Load a (frozen) Tensorflow model into memory.
31 | detection_graph = tf.Graph()
32 | with detection_graph.as_default():
33 | od_graph_def = tf.GraphDef()
34 | with tf.gfile.GFile("frozen_inference_graph.pb", 'rb') as fid:
35 | serialized_graph = fid.read()
36 | od_graph_def.ParseFromString(serialized_graph)
37 | tf.import_graph_def(od_graph_def, name='')
38 |
39 | # Load the video capture object
40 | cap = cv.VideoCapture(f'{VIDEO_FILE_NAME}{VIDEO_EXTENSION}')
41 |
42 | if WRITE_VIDEO_OUT:
43 | # Setup the video creation process
44 |     fourcc = cv.VideoWriter_fourcc(*'mp4v')  # lowercase fourcc avoids a warning on some OpenCV builds
45 | out = cv.VideoWriter(f'{VIDEO_FILE_NAME}WithBoundingBoxes.mp4', fourcc, OUTPUT_FPS, (1280, 720))
46 | out_orig = cv.VideoWriter(f'{VIDEO_FILE_NAME}Original.mp4', fourcc, OUTPUT_FPS, (1280, 720))
47 |
48 | # Start the session
49 | with detection_graph.as_default():
50 | with tf.Session(graph=detection_graph) as sess:
51 |         # Define input and output Tensors for detection_graph
52 | image_tensor = detection_graph.get_tensor_by_name('image_tensor:0')
53 | # Each box represents a part of the image where a particular object was detected.
54 | detection_boxes = detection_graph.get_tensor_by_name('detection_boxes:0')
55 |         # Each score represents the level of confidence for each of the objects.
56 | # Score is shown on the result image, together with the class label.
57 | detection_scores = detection_graph.get_tensor_by_name('detection_scores:0')
58 | detection_classes = detection_graph.get_tensor_by_name('detection_classes:0')
59 | num_detections = detection_graph.get_tensor_by_name('num_detections:0')
60 |
61 |         frame_base64_ls = []  # The list containing the frames in base64 format and their frame numbers
62 | frame_info_ls = [] # The list containing the information about the frames
63 |
64 | counter = 0
65 | while cap.isOpened():
66 | ret, image = cap.read()
67 |
68 | if ret:
69 |                 # Retrieve the current frame number
70 | curr_frame = int(cap.get(cv.CAP_PROP_POS_FRAMES))
71 |
72 | # Convert image into an np array
73 | image_np = np.array(image)
74 | image_np_expanded = np.expand_dims(image_np, axis=0)
75 |
76 | t1 = time.time()
77 |
78 |                 # Run the model and retrieve the boxes, scores and classes
79 | (boxes, scores, classes, num) = sess.run(
80 | [detection_boxes, detection_scores, detection_classes, num_detections],
81 | feed_dict={image_tensor: image_np_expanded})
82 |
83 | t2 = time.time()
84 |
85 | # Remove the leading 1 dimension
86 | boxes = np.squeeze(boxes)
87 | classes = np.squeeze(classes).astype(np.int32)
88 | scores = np.squeeze(scores)
89 |
90 | # Draw the bounding boxes with information about the predictions
91 | visualize_boxes_and_labels_on_image_array(
92 | image_np,
93 | boxes,
94 | classes,
95 | scores,
96 | category_index,
97 | use_normalized_coordinates=True,
98 | line_thickness=2
99 | )
100 |
101 | # Encode the image into base64
102 | if ENCODE_B64:
103 | retval, buffer = cv.imencode('.png', image_np)
104 | img_str = base64.b64encode(buffer)
105 | image_b64 = 'data:image/png;base64,{}'.format(img_str.decode('ascii'))
106 |
107 | # Append the image along with timestamp to the frame_base64_ls
108 | frame_base64_ls.append([curr_frame, image_b64])
109 |
110 | # Update the output video
111 | if WRITE_VIDEO_OUT:
112 | out.write(image_np)
113 | out_orig.write(image) # Writes the original image
114 |
115 | # Process the information about the video at that exact timestamp
116 |                     timestamp_df = pd.DataFrame([curr_frame for _ in range(int(num[0]))], columns=["frame"])
117 | boxes_df = pd.DataFrame(boxes, columns=['y', 'x', 'bottom', 'right'])
118 | classes_df = pd.DataFrame(classes, columns=['class'])
119 | score_df = pd.DataFrame(scores, columns=['score'])
120 |                     # Map an np array of class ids to their COCO class names
121 | coco_map = np.vectorize(lambda i: category_index[i]['name'])
122 | classes_str_df = pd.DataFrame(coco_map(classes), columns=['class_str'])
123 |
124 | # Concatenate all the information
125 | info_df = pd.concat([timestamp_df, boxes_df, classes_df, classes_str_df, score_df], axis=1)
126 |
127 | # Only keep the entries with a score over the threshold
128 | narrow_info_df = info_df[info_df['score'] > THRESHOLD]
129 |
130 |                     # Append it to the list of information about all the frames
131 | frame_info_ls.append(narrow_info_df)
132 |
133 | t3 = time.time()
134 |
135 | counter += 1
136 | if VERBOSE:
137 |                     print(f"Algorithm runtime at frame {counter}: {t2-t1:.2f}s")
138 |
139 | if SHOW_PROCESS:
140 | cv.imshow('Object detection', image_np)
141 |
142 | if cv.waitKey(1) & 0xFF == ord('q'):
143 | break
144 |
145 | else:
146 | break
147 |
148 | if ENCODE_B64:
149 | # Save the frames in base64
150 | frame_base64_df = pd.DataFrame(frame_base64_ls, columns=['frame', 'source'])
151 | frame_base64_df.to_csv("video_frames_b64.csv", index=False)
152 |
153 | frame_info_df = pd.concat(frame_info_ls)
154 | frame_info_df.to_csv(f"{VIDEO_FILE_NAME}DetectionData.csv", index=False)
155 |
156 | # Release processes
157 | cap.release()
158 |
159 | if WRITE_VIDEO_OUT:
160 | out.release()
161 | out_orig.release()
162 |
163 | cv.destroyAllWindows()
164 |
--------------------------------------------------------------------------------
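The script above writes one row per detected box (with its frame number, normalized coordinates, class, class name, and score) to `{VIDEO_FILE_NAME}DetectionData.csv`. A minimal sketch of consuming that schema downstream; the DataFrame here is a hypothetical in-memory stand-in for the real CSV, not actual detection data:

```python
import pandas as pd

# Hypothetical detections following the CSV schema produced by generate_video_data.py:
# frame, y, x, bottom, right, class, class_str, score
detections = pd.DataFrame({
    'frame': [1, 1, 2],
    'y': [0.1, 0.2, 0.3],
    'x': [0.1, 0.4, 0.2],
    'bottom': [0.5, 0.6, 0.7],
    'right': [0.5, 0.8, 0.6],
    'class': [1, 3, 1],
    'class_str': ['person', 'car', 'person'],
    'score': [0.9, 0.4, 0.85],
})

# Keep only confident detections, then count objects per frame --
# the same score filtering the script applies with THRESHOLD.
confident = detections[detections['score'] > 0.5]
counts = confident.groupby('frame')['class_str'].count()
print(counts.to_dict())  # {1: 1, 2: 1}
```

In the real app, `pd.read_csv(f"{VIDEO_FILE_NAME}DetectionData.csv")` would replace the hand-built DataFrame.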
/utils/mscoco_label_map.py:
--------------------------------------------------------------------------------
1 | category_map = {
2 | 1: 'person',
3 | 2: 'bicycle',
4 | 3: 'car',
5 | 4: 'motorcycle',
6 | 5: 'airplane',
7 | 6: 'bus',
8 | 7: 'train',
9 | 8: 'truck',
10 | 9: 'boat',
11 | 10: 'traffic light',
12 | 11: 'fire hydrant',
13 | 13: 'stop sign',
14 | 14: 'parking meter',
15 | 15: 'bench',
16 | 16: 'bird',
17 | 17: 'cat',
18 | 18: 'dog',
19 | 19: 'horse',
20 | 20: 'sheep',
21 | 21: 'cow',
22 | 22: 'elephant',
23 | 23: 'bear',
24 | 24: 'zebra',
25 | 25: 'giraffe',
26 | 27: 'backpack',
27 | 28: 'umbrella',
28 | 31: 'handbag',
29 | 32: 'tie',
30 | 33: 'suitcase',
31 | 34: 'frisbee',
32 | 35: 'skis',
33 | 36: 'snowboard',
34 | 37: 'sports ball',
35 | 38: 'kite',
36 | 39: 'baseball bat',
37 | 40: 'baseball glove',
38 | 41: 'skateboard',
39 | 42: 'surfboard',
40 | 43: 'tennis racket',
41 | 44: 'bottle',
42 | 46: 'wine glass',
43 | 47: 'cup',
44 | 48: 'fork',
45 | 49: 'knife',
46 | 50: 'spoon',
47 | 51: 'bowl',
48 | 52: 'banana',
49 | 53: 'apple',
50 | 54: 'sandwich',
51 | 55: 'orange',
52 | 56: 'broccoli',
53 | 57: 'carrot',
54 | 58: 'hot dog',
55 | 59: 'pizza',
56 | 60: 'donut',
57 | 61: 'cake',
58 | 62: 'chair',
59 | 63: 'couch',
60 | 64: 'potted plant',
61 | 65: 'bed',
62 | 67: 'dining table',
63 | 70: 'toilet',
64 | 72: 'tv',
65 | 73: 'laptop',
66 | 74: 'mouse',
67 | 75: 'remote',
68 | 76: 'keyboard',
69 | 77: 'cell phone',
70 | 78: 'microwave',
71 | 79: 'oven',
72 | 80: 'toaster',
73 | 81: 'sink',
74 | 82: 'refrigerator',
75 | 84: 'book',
76 | 85: 'clock',
77 | 86: 'vase',
78 | 87: 'scissors',
79 | 88: 'teddy bear',
80 | 89: 'hair drier',
81 | 90: 'toothbrush'
82 | }
83 |
84 | category_index = {
85 | 1: {'id': 1, 'name': 'person'},
86 | 2: {'id': 2, 'name': 'bicycle'},
87 | 3: {'id': 3, 'name': 'car'},
88 | 4: {'id': 4, 'name': 'motorcycle'},
89 | 5: {'id': 5, 'name': 'airplane'},
90 | 6: {'id': 6, 'name': 'bus'},
91 | 7: {'id': 7, 'name': 'train'},
92 | 8: {'id': 8, 'name': 'truck'},
93 | 9: {'id': 9, 'name': 'boat'},
94 | 10: {'id': 10, 'name': 'traffic light'},
95 | 11: {'id': 11, 'name': 'fire hydrant'},
96 | 13: {'id': 13, 'name': 'stop sign'},
97 | 14: {'id': 14, 'name': 'parking meter'},
98 | 15: {'id': 15, 'name': 'bench'},
99 | 16: {'id': 16, 'name': 'bird'},
100 | 17: {'id': 17, 'name': 'cat'},
101 | 18: {'id': 18, 'name': 'dog'},
102 | 19: {'id': 19, 'name': 'horse'},
103 | 20: {'id': 20, 'name': 'sheep'},
104 | 21: {'id': 21, 'name': 'cow'},
105 | 22: {'id': 22, 'name': 'elephant'},
106 | 23: {'id': 23, 'name': 'bear'},
107 | 24: {'id': 24, 'name': 'zebra'},
108 | 25: {'id': 25, 'name': 'giraffe'},
109 | 27: {'id': 27, 'name': 'backpack'},
110 | 28: {'id': 28, 'name': 'umbrella'},
111 | 31: {'id': 31, 'name': 'handbag'},
112 | 32: {'id': 32, 'name': 'tie'},
113 | 33: {'id': 33, 'name': 'suitcase'},
114 | 34: {'id': 34, 'name': 'frisbee'},
115 | 35: {'id': 35, 'name': 'skis'},
116 | 36: {'id': 36, 'name': 'snowboard'},
117 | 37: {'id': 37, 'name': 'sports ball'},
118 | 38: {'id': 38, 'name': 'kite'},
119 | 39: {'id': 39, 'name': 'baseball bat'},
120 | 40: {'id': 40, 'name': 'baseball glove'},
121 | 41: {'id': 41, 'name': 'skateboard'},
122 | 42: {'id': 42, 'name': 'surfboard'},
123 | 43: {'id': 43, 'name': 'tennis racket'},
124 | 44: {'id': 44, 'name': 'bottle'},
125 | 46: {'id': 46, 'name': 'wine glass'},
126 | 47: {'id': 47, 'name': 'cup'},
127 | 48: {'id': 48, 'name': 'fork'},
128 | 49: {'id': 49, 'name': 'knife'},
129 | 50: {'id': 50, 'name': 'spoon'},
130 | 51: {'id': 51, 'name': 'bowl'},
131 | 52: {'id': 52, 'name': 'banana'},
132 | 53: {'id': 53, 'name': 'apple'},
133 | 54: {'id': 54, 'name': 'sandwich'},
134 | 55: {'id': 55, 'name': 'orange'},
135 | 56: {'id': 56, 'name': 'broccoli'},
136 | 57: {'id': 57, 'name': 'carrot'},
137 | 58: {'id': 58, 'name': 'hot dog'},
138 | 59: {'id': 59, 'name': 'pizza'},
139 | 60: {'id': 60, 'name': 'donut'},
140 | 61: {'id': 61, 'name': 'cake'},
141 | 62: {'id': 62, 'name': 'chair'},
142 | 63: {'id': 63, 'name': 'couch'},
143 | 64: {'id': 64, 'name': 'potted plant'},
144 | 65: {'id': 65, 'name': 'bed'},
145 | 67: {'id': 67, 'name': 'dining table'},
146 | 70: {'id': 70, 'name': 'toilet'},
147 | 72: {'id': 72, 'name': 'tv'},
148 | 73: {'id': 73, 'name': 'laptop'},
149 | 74: {'id': 74, 'name': 'mouse'},
150 | 75: {'id': 75, 'name': 'remote'},
151 | 76: {'id': 76, 'name': 'keyboard'},
152 | 77: {'id': 77, 'name': 'cell phone'},
153 | 78: {'id': 78, 'name': 'microwave'},
154 | 79: {'id': 79, 'name': 'oven'},
155 | 80: {'id': 80, 'name': 'toaster'},
156 | 81: {'id': 81, 'name': 'sink'},
157 | 82: {'id': 82, 'name': 'refrigerator'},
158 | 84: {'id': 84, 'name': 'book'},
159 | 85: {'id': 85, 'name': 'clock'},
160 | 86: {'id': 86, 'name': 'vase'},
161 | 87: {'id': 87, 'name': 'scissors'},
162 | 88: {'id': 88, 'name': 'teddy bear'},
163 | 89: {'id': 89, 'name': 'hair drier'},
164 | 90: {'id': 90, 'name': 'toothbrush'}
165 | }
166 |
--------------------------------------------------------------------------------
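`generate_video_data.py` uses `category_index` together with `np.vectorize` to turn an array of integer class ids into class-name strings. A small self-contained sketch of that lookup, using a trimmed-down copy of `category_index` (only three entries, for illustration):

```python
import numpy as np

# Trimmed-down copy of category_index: MS COCO ids are not contiguous,
# which is why a dict (not a list) is used for the lookup.
category_index = {
    1: {'id': 1, 'name': 'person'},
    3: {'id': 3, 'name': 'car'},
    24: {'id': 24, 'name': 'zebra'},
}

# The same vectorized lookup applied per frame in generate_video_data.py.
coco_map = np.vectorize(lambda i: category_index[i]['name'])
classes = np.array([1, 24, 3], dtype=np.int32)
print(list(coco_map(classes)))  # ['person', 'zebra', 'car']
```

Note that `np.vectorize` infers the output string dtype from the first result, so it is a convenience wrapper rather than a performance optimization.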
/utils/visualization_utils.py:
--------------------------------------------------------------------------------
1 | # Copyright 2017 The TensorFlow Authors. All Rights Reserved.
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | # ==============================================================================
15 |
16 | """A set of functions that are used for visualization.
17 |
18 | These functions receive an image and perform some visualization on it.
19 | Most functions do not return a value; instead, they modify the image in place.
20 |
21 | """
22 | import collections
23 | import functools
24 | # Set headless-friendly backend.
25 | import matplotlib; matplotlib.use('Agg') # pylint: disable=multiple-statements
26 | import matplotlib.pyplot as plt # pylint: disable=g-import-not-at-top
27 | import numpy as np
28 | import PIL.Image as Image
29 | import PIL.ImageColor as ImageColor
30 | import PIL.ImageDraw as ImageDraw
31 | import PIL.ImageFont as ImageFont
32 | import six
33 | import tensorflow as tf
34 |
35 | # from object_detection.core import standard_fields as fields # Commented out by xhlulu
36 |
37 |
38 | _TITLE_LEFT_MARGIN = 10
39 | _TITLE_TOP_MARGIN = 10
40 | STANDARD_COLORS = [
41 | 'AliceBlue', 'Chartreuse', 'Aqua', 'Aquamarine', 'Azure', 'Beige', 'Bisque',
42 | 'BlanchedAlmond', 'BlueViolet', 'BurlyWood', 'CadetBlue', 'AntiqueWhite',
43 | 'Chocolate', 'Coral', 'CornflowerBlue', 'Cornsilk', 'Crimson', 'Cyan',
44 | 'DarkCyan', 'DarkGoldenRod', 'DarkGrey', 'DarkKhaki', 'DarkOrange',
45 | 'DarkOrchid', 'DarkSalmon', 'DarkSeaGreen', 'DarkTurquoise', 'DarkViolet',
46 | 'DeepPink', 'DeepSkyBlue', 'DodgerBlue', 'FireBrick', 'FloralWhite',
47 | 'ForestGreen', 'Fuchsia', 'Gainsboro', 'GhostWhite', 'Gold', 'GoldenRod',
48 | 'Salmon', 'Tan', 'HoneyDew', 'HotPink', 'IndianRed', 'Ivory', 'Khaki',
49 | 'Lavender', 'LavenderBlush', 'LawnGreen', 'LemonChiffon', 'LightBlue',
50 | 'LightCoral', 'LightCyan', 'LightGoldenRodYellow', 'LightGray', 'LightGrey',
51 | 'LightGreen', 'LightPink', 'LightSalmon', 'LightSeaGreen', 'LightSkyBlue',
52 | 'LightSlateGray', 'LightSlateGrey', 'LightSteelBlue', 'LightYellow', 'Lime',
53 | 'LimeGreen', 'Linen', 'Magenta', 'MediumAquaMarine', 'MediumOrchid',
54 | 'MediumPurple', 'MediumSeaGreen', 'MediumSlateBlue', 'MediumSpringGreen',
55 | 'MediumTurquoise', 'MediumVioletRed', 'MintCream', 'MistyRose', 'Moccasin',
56 | 'NavajoWhite', 'OldLace', 'Olive', 'OliveDrab', 'Orange', 'OrangeRed',
57 | 'Orchid', 'PaleGoldenRod', 'PaleGreen', 'PaleTurquoise', 'PaleVioletRed',
58 | 'PapayaWhip', 'PeachPuff', 'Peru', 'Pink', 'Plum', 'PowderBlue', 'Purple',
59 | 'Red', 'RosyBrown', 'RoyalBlue', 'SaddleBrown', 'Green', 'SandyBrown',
60 | 'SeaGreen', 'SeaShell', 'Sienna', 'Silver', 'SkyBlue', 'SlateBlue',
61 | 'SlateGray', 'SlateGrey', 'Snow', 'SpringGreen', 'SteelBlue', 'GreenYellow',
62 | 'Teal', 'Thistle', 'Tomato', 'Turquoise', 'Violet', 'Wheat', 'White',
63 | 'WhiteSmoke', 'Yellow', 'YellowGreen'
64 | ]
65 |
66 |
67 | def save_image_array_as_png(image, output_path):
68 | """Saves an image (represented as a numpy array) to PNG.
69 |
70 | Args:
71 | image: a numpy array with shape [height, width, 3].
72 | output_path: path to which image should be written.
73 | """
74 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
75 | with tf.gfile.Open(output_path, 'w') as fid:
76 | image_pil.save(fid, 'PNG')
77 |
78 |
79 | def encode_image_array_as_png_str(image):
80 | """Encodes a numpy array into a PNG string.
81 |
82 | Args:
83 | image: a numpy array with shape [height, width, 3].
84 |
85 | Returns:
86 | PNG encoded image string.
87 | """
88 | image_pil = Image.fromarray(np.uint8(image))
89 | output = six.BytesIO()
90 | image_pil.save(output, format='PNG')
91 | png_string = output.getvalue()
92 | output.close()
93 | return png_string
94 |
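The PNG round trip performed by `encode_image_array_as_png_str` can be sketched with the standard library's `io.BytesIO` in place of `six.BytesIO` (they are interchangeable on Python 3; this standalone example does not import the module above):

```python
import io
import numpy as np
import PIL.Image as Image

image = np.arange(12, dtype=np.uint8).reshape(2, 2, 3)

# Encode the array to an in-memory PNG, as the function above does.
output = io.BytesIO()
Image.fromarray(image).save(output, format='PNG')
png_bytes = output.getvalue()
output.close()

# Decoding recovers the exact pixels: PNG is lossless.
decoded = np.array(Image.open(io.BytesIO(png_bytes)))
print(bool((decoded == image).all()))  # True
```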
95 |
96 | def draw_bounding_box_on_image_array(image,
97 | ymin,
98 | xmin,
99 | ymax,
100 | xmax,
101 | color='red',
102 | thickness=4,
103 | display_str_list=(),
104 | use_normalized_coordinates=True):
105 | """Adds a bounding box to an image (numpy array).
106 |
107 | Bounding box coordinates can be specified in either absolute (pixel) or
108 | normalized coordinates by setting the use_normalized_coordinates argument.
109 |
110 | Args:
111 | image: a numpy array with shape [height, width, 3].
112 | ymin: ymin of bounding box.
113 | xmin: xmin of bounding box.
114 | ymax: ymax of bounding box.
115 | xmax: xmax of bounding box.
116 | color: color to draw bounding box. Default is red.
117 | thickness: line thickness. Default value is 4.
118 | display_str_list: list of strings to display in box
119 | (each to be shown on its own line).
120 | use_normalized_coordinates: If True (default), treat coordinates
121 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
122 | coordinates as absolute.
123 | """
124 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
125 | draw_bounding_box_on_image(image_pil, ymin, xmin, ymax, xmax, color,
126 | thickness, display_str_list,
127 | use_normalized_coordinates)
128 | np.copyto(image, np.array(image_pil))
129 |
130 |
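The array variant above follows a pattern used throughout this module: convert the numpy array to a PIL image, draw on it, and copy the pixels back in place with `np.copyto`. A minimal standalone sketch of that idea (a black 100x100 test image with an assumed box of (ymin, xmin, ymax, xmax) = (0.2, 0.2, 0.8, 0.8)):

```python
import numpy as np
import PIL.Image as Image
import PIL.ImageDraw as ImageDraw

image = np.zeros((100, 100, 3), dtype=np.uint8)

# Convert to PIL, draw the box outline as a closed polyline, copy back.
image_pil = Image.fromarray(image).convert('RGB')
draw = ImageDraw.Draw(image_pil)
# Normalized (0.2, 0.2, 0.8, 0.8) scaled to a 100x100 image.
left, right, top, bottom = 20, 80, 20, 80
draw.line([(left, top), (left, bottom), (right, bottom),
           (right, top), (left, top)], width=2, fill='red')
np.copyto(image, np.array(image_pil))

print(bool(image.sum() > 0))  # True: the box pixels were written in place
```

Because `np.copyto` writes into the original buffer, the caller's array is mutated without any return value, matching the in-place contract described in the module docstring.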
131 | def draw_bounding_box_on_image(image,
132 | ymin,
133 | xmin,
134 | ymax,
135 | xmax,
136 | color='red',
137 | thickness=4,
138 | display_str_list=(),
139 | use_normalized_coordinates=True):
140 | """Adds a bounding box to an image.
141 |
142 | Bounding box coordinates can be specified in either absolute (pixel) or
143 | normalized coordinates by setting the use_normalized_coordinates argument.
144 |
145 | Each string in display_str_list is displayed on a separate line above the
146 | bounding box in black text on a rectangle filled with the input 'color'.
147 | If the top of the bounding box extends to the edge of the image, the strings
148 | are displayed below the bounding box.
149 |
150 | Args:
151 | image: a PIL.Image object.
152 | ymin: ymin of bounding box.
153 | xmin: xmin of bounding box.
154 | ymax: ymax of bounding box.
155 | xmax: xmax of bounding box.
156 | color: color to draw bounding box. Default is red.
157 | thickness: line thickness. Default value is 4.
158 | display_str_list: list of strings to display in box
159 | (each to be shown on its own line).
160 | use_normalized_coordinates: If True (default), treat coordinates
161 | ymin, xmin, ymax, xmax as relative to the image. Otherwise treat
162 | coordinates as absolute.
163 | """
164 | draw = ImageDraw.Draw(image)
165 | im_width, im_height = image.size
166 | if use_normalized_coordinates:
167 | (left, right, top, bottom) = (xmin * im_width, xmax * im_width,
168 | ymin * im_height, ymax * im_height)
169 | else:
170 | (left, right, top, bottom) = (xmin, xmax, ymin, ymax)
171 | draw.line([(left, top), (left, bottom), (right, bottom),
172 | (right, top), (left, top)], width=thickness, fill=color)
173 | try:
174 | font = ImageFont.truetype('arial.ttf', 24)
175 | except IOError:
176 | font = ImageFont.load_default()
177 |
178 | # If the total height of the display strings added to the top of the bounding
179 | # box exceeds the top of the image, stack the strings below the bounding box
180 | # instead of above.
181 | display_str_heights = [font.getsize(ds)[1] for ds in display_str_list]
182 | # Each display_str has a top and bottom margin of 0.05x.
183 | total_display_str_height = (1 + 2 * 0.05) * sum(display_str_heights)
184 |
185 | if top > total_display_str_height:
186 | text_bottom = top
187 | else:
188 | text_bottom = bottom + total_display_str_height
189 | # Reverse list and print from bottom to top.
190 | for display_str in display_str_list[::-1]:
191 | text_width, text_height = font.getsize(display_str)
192 | margin = np.ceil(0.05 * text_height)
193 | draw.rectangle(
194 | [(left, text_bottom - text_height - 2 * margin), (left + text_width,
195 | text_bottom)],
196 | fill=color)
197 | draw.text(
198 | (left + margin, text_bottom - text_height - margin),
199 | display_str,
200 | fill='black',
201 | font=font)
202 | text_bottom -= text_height - 2 * margin
203 |
204 |
205 | def draw_bounding_boxes_on_image_array(image,
206 | boxes,
207 | color='red',
208 | thickness=4,
209 | display_str_list_list=()):
210 | """Draws bounding boxes on image (numpy array).
211 |
212 | Args:
213 | image: a numpy array object.
214 | boxes: a 2 dimensional numpy array of [N, 4]: (ymin, xmin, ymax, xmax).
215 | The coordinates are in normalized format between [0, 1].
216 | color: color to draw bounding box. Default is red.
217 | thickness: line thickness. Default value is 4.
218 | display_str_list_list: list of list of strings.
219 | a list of strings for each bounding box.
220 | The reason to pass a list of strings for a
221 | bounding box is that it might contain
222 | multiple labels.
223 |
224 | Raises:
225 | ValueError: if boxes is not a [N, 4] array
226 | """
227 | image_pil = Image.fromarray(image)
228 | draw_bounding_boxes_on_image(image_pil, boxes, color, thickness,
229 | display_str_list_list)
230 | np.copyto(image, np.array(image_pil))
231 |
232 |
233 | def draw_bounding_boxes_on_image(image,
234 | boxes,
235 | color='red',
236 | thickness=4,
237 | display_str_list_list=()):
238 | """Draws bounding boxes on image.
239 |
240 | Args:
241 | image: a PIL.Image object.
242 | boxes: a 2 dimensional numpy array of [N, 4]: (ymin, xmin, ymax, xmax).
243 | The coordinates are in normalized format between [0, 1].
244 | color: color to draw bounding box. Default is red.
245 | thickness: line thickness. Default value is 4.
246 | display_str_list_list: list of list of strings.
247 | a list of strings for each bounding box.
248 | The reason to pass a list of strings for a
249 | bounding box is that it might contain
250 | multiple labels.
251 |
252 | Raises:
253 | ValueError: if boxes is not a [N, 4] array
254 | """
255 | boxes_shape = boxes.shape
256 | if not boxes_shape:
257 | return
258 | if len(boxes_shape) != 2 or boxes_shape[1] != 4:
259 | raise ValueError('Input must be of size [N, 4]')
260 | for i in range(boxes_shape[0]):
261 | display_str_list = ()
262 | if display_str_list_list:
263 | display_str_list = display_str_list_list[i]
264 | draw_bounding_box_on_image(image, boxes[i, 0], boxes[i, 1], boxes[i, 2],
265 | boxes[i, 3], color, thickness, display_str_list)
266 |
267 |
268 | def _visualize_boxes(image, boxes, classes, scores, category_index, **kwargs):
269 | return visualize_boxes_and_labels_on_image_array(
270 | image, boxes, classes, scores, category_index=category_index, **kwargs)
271 |
272 |
273 | def _visualize_boxes_and_masks(image, boxes, classes, scores, masks,
274 | category_index, **kwargs):
275 | return visualize_boxes_and_labels_on_image_array(
276 | image,
277 | boxes,
278 | classes,
279 | scores,
280 | category_index=category_index,
281 | instance_masks=masks,
282 | **kwargs)
283 |
284 |
285 | def _visualize_boxes_and_keypoints(image, boxes, classes, scores, keypoints,
286 | category_index, **kwargs):
287 | return visualize_boxes_and_labels_on_image_array(
288 | image,
289 | boxes,
290 | classes,
291 | scores,
292 | category_index=category_index,
293 | keypoints=keypoints,
294 | **kwargs)
295 |
296 |
297 | def _visualize_boxes_and_masks_and_keypoints(
298 | image, boxes, classes, scores, masks, keypoints, category_index, **kwargs):
299 | return visualize_boxes_and_labels_on_image_array(
300 | image,
301 | boxes,
302 | classes,
303 | scores,
304 | category_index=category_index,
305 | instance_masks=masks,
306 | keypoints=keypoints,
307 | **kwargs)
308 |
309 |
310 | def draw_bounding_boxes_on_image_tensors(images,
311 | boxes,
312 | classes,
313 | scores,
314 | category_index,
315 | instance_masks=None,
316 | keypoints=None,
317 | max_boxes_to_draw=20,
318 | min_score_thresh=0.2):
319 | """Draws bounding boxes, masks, and keypoints on batch of image tensors.
320 |
321 | Args:
322 | images: A 4D uint8 image tensor of shape [N, H, W, C].
323 | boxes: [N, max_detections, 4] float32 tensor of detection boxes.
324 | classes: [N, max_detections] int tensor of detection classes. Note that
325 | classes are 1-indexed.
326 | scores: [N, max_detections] float32 tensor of detection scores.
327 | category_index: a dict that maps integer ids to category dicts. e.g.
328 |       {1: {'id': 1, 'name': 'dog'}, 2: {'id': 2, 'name': 'cat'}, ...}
329 | instance_masks: A 4D uint8 tensor of shape [N, max_detection, H, W] with
330 | instance masks.
331 | keypoints: A 4D float32 tensor of shape [N, max_detection, num_keypoints, 2]
332 | with keypoints.
333 | max_boxes_to_draw: Maximum number of boxes to draw on an image. Default 20.
334 | min_score_thresh: Minimum score threshold for visualization. Default 0.2.
335 |
336 | Returns:
337 | 4D image tensor of type uint8, with boxes drawn on top.
338 | """
339 | visualization_keyword_args = {
340 | 'use_normalized_coordinates': True,
341 | 'max_boxes_to_draw': max_boxes_to_draw,
342 | 'min_score_thresh': min_score_thresh,
343 | 'agnostic_mode': False,
344 | 'line_thickness': 4
345 | }
346 |
347 | if instance_masks is not None and keypoints is None:
348 | visualize_boxes_fn = functools.partial(
349 | _visualize_boxes_and_masks,
350 | category_index=category_index,
351 | **visualization_keyword_args)
352 | elems = [images, boxes, classes, scores, instance_masks]
353 | elif instance_masks is None and keypoints is not None:
354 | visualize_boxes_fn = functools.partial(
355 | _visualize_boxes_and_keypoints,
356 | category_index=category_index,
357 | **visualization_keyword_args)
358 | elems = [images, boxes, classes, scores, keypoints]
359 | elif instance_masks is not None and keypoints is not None:
360 | visualize_boxes_fn = functools.partial(
361 | _visualize_boxes_and_masks_and_keypoints,
362 | category_index=category_index,
363 | **visualization_keyword_args)
364 | elems = [images, boxes, classes, scores, instance_masks, keypoints]
365 | else:
366 | visualize_boxes_fn = functools.partial(
367 | _visualize_boxes,
368 | category_index=category_index,
369 | **visualization_keyword_args)
370 | elems = [images, boxes, classes, scores]
371 |
372 | def draw_boxes(image_and_detections):
373 | """Draws boxes on image."""
374 | image_with_boxes = tf.py_func(visualize_boxes_fn, image_and_detections,
375 | tf.uint8)
376 | return image_with_boxes
377 |
378 | images = tf.map_fn(draw_boxes, elems, dtype=tf.uint8, back_prop=False)
379 | return images
380 |
381 |
382 | def draw_side_by_side_evaluation_image(eval_dict,
383 | category_index,
384 | max_boxes_to_draw=20,
385 | min_score_thresh=0.2):
386 | """Creates a side-by-side image with detections and groundtruth.
387 |
388 | Bounding boxes (and instance masks, if available) are visualized on both
389 | subimages.
390 |
391 | Args:
392 | eval_dict: The evaluation dictionary returned by
393 | eval_util.result_dict_for_single_example().
394 | category_index: A category index (dictionary) produced from a labelmap.
395 | max_boxes_to_draw: The maximum number of boxes to draw for detections.
396 | min_score_thresh: The minimum score threshold for showing detections.
397 |
398 | Returns:
399 | A [1, H, 2 * W, C] uint8 tensor. The subimage on the left corresponds to
400 | detections, while the subimage on the right corresponds to groundtruth.
401 | """
402 |   detection_fields = fields.DetectionResultFields()  # NOTE: requires the standard_fields import commented out above
403 | input_data_fields = fields.InputDataFields()
404 | instance_masks = None
405 | if detection_fields.detection_masks in eval_dict:
406 | instance_masks = tf.cast(
407 | tf.expand_dims(eval_dict[detection_fields.detection_masks], axis=0),
408 | tf.uint8)
409 | keypoints = None
410 | if detection_fields.detection_keypoints in eval_dict:
411 | keypoints = tf.expand_dims(
412 | eval_dict[detection_fields.detection_keypoints], axis=0)
413 | groundtruth_instance_masks = None
414 | if input_data_fields.groundtruth_instance_masks in eval_dict:
415 | groundtruth_instance_masks = tf.cast(
416 | tf.expand_dims(
417 | eval_dict[input_data_fields.groundtruth_instance_masks], axis=0),
418 | tf.uint8)
419 | images_with_detections = draw_bounding_boxes_on_image_tensors(
420 | eval_dict[input_data_fields.original_image],
421 | tf.expand_dims(eval_dict[detection_fields.detection_boxes], axis=0),
422 | tf.expand_dims(eval_dict[detection_fields.detection_classes], axis=0),
423 | tf.expand_dims(eval_dict[detection_fields.detection_scores], axis=0),
424 | category_index,
425 | instance_masks=instance_masks,
426 | keypoints=keypoints,
427 | max_boxes_to_draw=max_boxes_to_draw,
428 | min_score_thresh=min_score_thresh)
429 | images_with_groundtruth = draw_bounding_boxes_on_image_tensors(
430 | eval_dict[input_data_fields.original_image],
431 | tf.expand_dims(eval_dict[input_data_fields.groundtruth_boxes], axis=0),
432 | tf.expand_dims(eval_dict[input_data_fields.groundtruth_classes], axis=0),
433 | tf.expand_dims(
434 | tf.ones_like(
435 | eval_dict[input_data_fields.groundtruth_classes],
436 | dtype=tf.float32),
437 | axis=0),
438 | category_index,
439 | instance_masks=groundtruth_instance_masks,
440 | keypoints=None,
441 | max_boxes_to_draw=None,
442 | min_score_thresh=0.0)
443 | return tf.concat([images_with_detections, images_with_groundtruth], axis=2)
444 |
445 |
446 | def draw_keypoints_on_image_array(image,
447 | keypoints,
448 | color='red',
449 | radius=2,
450 | use_normalized_coordinates=True):
451 | """Draws keypoints on an image (numpy array).
452 |
453 | Args:
454 | image: a numpy array with shape [height, width, 3].
455 | keypoints: a numpy array with shape [num_keypoints, 2].
456 | color: color to draw the keypoints with. Default is red.
457 | radius: keypoint radius. Default value is 2.
458 | use_normalized_coordinates: if True (default), treat keypoint values as
459 | relative to the image. Otherwise treat them as absolute.
460 | """
461 | image_pil = Image.fromarray(np.uint8(image)).convert('RGB')
462 | draw_keypoints_on_image(image_pil, keypoints, color, radius,
463 | use_normalized_coordinates)
464 | np.copyto(image, np.array(image_pil))
465 |
466 |
467 | def draw_keypoints_on_image(image,
468 | keypoints,
469 | color='red',
470 | radius=2,
471 | use_normalized_coordinates=True):
472 | """Draws keypoints on an image.
473 |
474 | Args:
475 | image: a PIL.Image object.
476 | keypoints: a numpy array with shape [num_keypoints, 2].
477 | color: color to draw the keypoints with. Default is red.
478 | radius: keypoint radius. Default value is 2.
479 | use_normalized_coordinates: if True (default), treat keypoint values as
480 | relative to the image. Otherwise treat them as absolute.
481 | """
482 | draw = ImageDraw.Draw(image)
483 | im_width, im_height = image.size
484 | keypoints_x = [k[1] for k in keypoints]
485 | keypoints_y = [k[0] for k in keypoints]
486 | if use_normalized_coordinates:
487 | keypoints_x = tuple([im_width * x for x in keypoints_x])
488 | keypoints_y = tuple([im_height * y for y in keypoints_y])
489 | for keypoint_x, keypoint_y in zip(keypoints_x, keypoints_y):
490 | draw.ellipse([(keypoint_x - radius, keypoint_y - radius),
491 | (keypoint_x + radius, keypoint_y + radius)],
492 | outline=color, fill=color)
493 |
494 |
495 | def draw_mask_on_image_array(image, mask, color='red', alpha=0.4):
496 | """Draws mask on an image.
497 |
498 | Args:
499 |     image: uint8 numpy array with shape (img_height, img_width, 3)
500 |     mask: a uint8 numpy array of shape (img_height, img_width) with
501 |       values of either 0 or 1.
502 |     color: color to draw the mask with. Default is red.
503 | alpha: transparency value between 0 and 1. (default: 0.4)
504 |
505 | Raises:
506 | ValueError: On incorrect data type for image or masks.
507 | """
508 | if image.dtype != np.uint8:
509 | raise ValueError('`image` not of type np.uint8')
510 | if mask.dtype != np.uint8:
511 | raise ValueError('`mask` not of type np.uint8')
512 | if np.any(np.logical_and(mask != 1, mask != 0)):
513 | raise ValueError('`mask` elements should be in [0, 1]')
514 | if image.shape[:2] != mask.shape:
515 | raise ValueError('The image has spatial dimensions %s but the mask has '
516 | 'dimensions %s' % (image.shape[:2], mask.shape))
517 | rgb = ImageColor.getrgb(color)
518 | pil_image = Image.fromarray(image)
519 |
520 | solid_color = np.expand_dims(
521 | np.ones_like(mask), axis=2) * np.reshape(list(rgb), [1, 1, 3])
522 | pil_solid_color = Image.fromarray(np.uint8(solid_color)).convert('RGBA')
523 | pil_mask = Image.fromarray(np.uint8(255.0*alpha*mask)).convert('L')
524 | pil_image = Image.composite(pil_solid_color, pil_image, pil_mask)
525 | np.copyto(image, np.array(pil_image.convert('RGB')))
526 |
527 |
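The compositing step inside `draw_mask_on_image_array` can be sketched standalone: a solid color plane is alpha-blended over the image, using the (scaled) mask as the blend weight. This assumes a tiny 2x2 black image with the mask covering only the top row:

```python
import numpy as np
import PIL.Image as Image
import PIL.ImageColor as ImageColor

image = np.zeros((2, 2, 3), dtype=np.uint8)
mask = np.array([[1, 1], [0, 0]], dtype=np.uint8)
alpha = 0.4

# Build a solid red plane and an L-mode blend mask scaled by alpha,
# mirroring the function above.
rgb = ImageColor.getrgb('red')
solid_color = np.expand_dims(np.ones_like(mask), axis=2) * np.reshape(list(rgb), [1, 1, 3])
pil_solid_color = Image.fromarray(np.uint8(solid_color)).convert('RGBA')
pil_mask = Image.fromarray(np.uint8(255.0 * alpha * mask)).convert('L')
pil_image = Image.composite(pil_solid_color, Image.fromarray(image), pil_mask)
np.copyto(image, np.array(pil_image.convert('RGB')))

# Top row is tinted red; bottom row (mask == 0) is untouched.
print(image[1, 1].tolist())  # [0, 0, 0]
```

Where the mask is 0 the pixel is preserved exactly, and where it is 1 the pixel is blended toward the color with weight `alpha`.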
528 | def visualize_boxes_and_labels_on_image_array(
529 | image,
530 | boxes,
531 | classes,
532 | scores,
533 | category_index,
534 | instance_masks=None,
535 | instance_boundaries=None,
536 | keypoints=None,
537 | use_normalized_coordinates=False,
538 | max_boxes_to_draw=20,
539 | min_score_thresh=.5,
540 | agnostic_mode=False,
541 | line_thickness=4,
542 | groundtruth_box_visualization_color='black',
543 | skip_scores=False,
544 | skip_labels=False):
545 | """Overlay labeled boxes on an image with formatted scores and label names.
546 |
547 | This function groups boxes that correspond to the same location
548 | and creates a display string for each detection and overlays these
549 | on the image. Note that this function modifies the image in place, and returns
550 | that same image.
551 |
552 | Args:
553 | image: uint8 numpy array with shape (img_height, img_width, 3)
554 | boxes: a numpy array of shape [N, 4]
555 | classes: a numpy array of shape [N]. Note that class indices are 1-based,
556 | and match the keys in the label map.
557 | scores: a numpy array of shape [N] or None. If scores=None, then
558 | this function assumes that the boxes to be plotted are groundtruth
559 | boxes and plot all boxes as black with no classes or scores.
560 | category_index: a dict containing category dictionaries (each holding
561 | category index `id` and category name `name`) keyed by category indices.
562 | instance_masks: a numpy array of shape [N, image_height, image_width] with
563 | values ranging between 0 and 1, can be None.
564 | instance_boundaries: a numpy array of shape [N, image_height, image_width]
565 | with values ranging between 0 and 1, can be None.
566 | keypoints: a numpy array of shape [N, num_keypoints, 2], can
567 | be None.
568 | use_normalized_coordinates: whether `boxes` should be interpreted as
569 | normalized coordinates or not.
570 | max_boxes_to_draw: maximum number of boxes to visualize. If None, draw
571 | all boxes.
572 | min_score_thresh: minimum score threshold for a box to be visualized
573 | agnostic_mode: boolean (default: False) controlling whether to evaluate in
574 | class-agnostic mode or not. This mode will display scores but ignore
575 | classes.
576 | line_thickness: integer (default: 4) controlling line width of the boxes.
577 | groundtruth_box_visualization_color: box color for visualizing groundtruth
578 | boxes
579 | skip_scores: whether to skip score when drawing a single detection
580 | skip_labels: whether to skip label when drawing a single detection
581 |
582 | Returns:
583 | uint8 numpy array with shape (img_height, img_width, 3) with overlaid boxes.
584 | """
585 | # Create a display string (and color) for every box location, group any boxes
586 | # that correspond to the same location.
587 | box_to_display_str_map = collections.defaultdict(list)
588 | box_to_color_map = collections.defaultdict(str)
589 | box_to_instance_masks_map = {}
590 | box_to_instance_boundaries_map = {}
591 | box_to_keypoints_map = collections.defaultdict(list)
592 | if not max_boxes_to_draw:
593 | max_boxes_to_draw = boxes.shape[0]
594 | for i in range(min(max_boxes_to_draw, boxes.shape[0])):
595 | if scores is None or scores[i] > min_score_thresh:
596 | box = tuple(boxes[i].tolist())
597 | if instance_masks is not None:
598 | box_to_instance_masks_map[box] = instance_masks[i]
599 | if instance_boundaries is not None:
600 | box_to_instance_boundaries_map[box] = instance_boundaries[i]
601 | if keypoints is not None:
602 | box_to_keypoints_map[box].extend(keypoints[i])
603 | if scores is None:
604 | box_to_color_map[box] = groundtruth_box_visualization_color
605 | else:
606 | display_str = ''
607 | if not skip_labels:
608 | if not agnostic_mode:
609 | if classes[i] in category_index.keys():
610 | class_name = category_index[classes[i]]['name']
611 | else:
612 | class_name = 'N/A'
613 | display_str = str(class_name)
614 | if not skip_scores:
615 | if not display_str:
616 | display_str = '{}%'.format(int(100*scores[i]))
617 | else:
618 | display_str = '{}: {}%'.format(display_str, int(100*scores[i]))
619 | box_to_display_str_map[box].append(display_str)
620 | if agnostic_mode:
621 | box_to_color_map[box] = 'DarkOrange'
622 | else:
623 | box_to_color_map[box] = STANDARD_COLORS[
624 | classes[i] % len(STANDARD_COLORS)]
625 |
626 | # Draw all boxes onto image.
627 | for box, color in box_to_color_map.items():
628 | ymin, xmin, ymax, xmax = box
629 | if instance_masks is not None:
630 | draw_mask_on_image_array(
631 | image,
632 | box_to_instance_masks_map[box],
633 | color=color
634 | )
635 | if instance_boundaries is not None:
636 | draw_mask_on_image_array(
637 | image,
638 | box_to_instance_boundaries_map[box],
639 | color='red',
640 | alpha=1.0
641 | )
642 | draw_bounding_box_on_image_array(
643 | image,
644 | ymin,
645 | xmin,
646 | ymax,
647 | xmax,
648 | color=color,
649 | thickness=line_thickness,
650 | display_str_list=box_to_display_str_map[box],
651 | use_normalized_coordinates=use_normalized_coordinates)
652 | if keypoints is not None:
653 | draw_keypoints_on_image_array(
654 | image,
655 | box_to_keypoints_map[box],
656 | color=color,
657 | radius=line_thickness / 2,
658 | use_normalized_coordinates=use_normalized_coordinates)
659 |
660 | return image
661 |
662 |
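The first loop of `visualize_boxes_and_labels_on_image_array` is the heart of the function: detections at the exact same box location are grouped under one key, low-scoring boxes are dropped, and each kept detection contributes a `"label: score%"` string. A simplified, self-contained sketch of just that grouping step (the helper name `group_detections` is hypothetical; it mirrors, but is not, the code above):

```python
# Hypothetical, simplified sketch of the grouping logic in
# visualize_boxes_and_labels_on_image_array: boxes at the same location
# share one entry, and each kept detection contributes a display string.
import collections

def group_detections(boxes, classes, scores, category_index,
                     min_score_thresh=0.5, max_boxes_to_draw=20):
    box_to_display_str_map = collections.defaultdict(list)
    for i in range(min(max_boxes_to_draw, len(boxes))):
        if scores[i] > min_score_thresh:
            box = tuple(boxes[i])
            name = category_index.get(classes[i], {}).get('name', 'N/A')
            box_to_display_str_map[box].append(
                '{}: {}%'.format(name, int(100 * scores[i])))
    return dict(box_to_display_str_map)

category_index = {1: {'id': 1, 'name': 'person'}, 3: {'id': 3, 'name': 'car'}}
boxes = [[0.1, 0.1, 0.5, 0.5], [0.1, 0.1, 0.5, 0.5], [0.2, 0.6, 0.4, 0.9]]
# Two detections share the first box; the 0.3-score box is dropped.
print(group_detections(boxes, [1, 3, 1], [0.9, 0.8, 0.3], category_index))
```

Using the box tuple itself as the dict key is what lets a detector that emits duplicate boxes (e.g. a box with both a mask and keypoints) render a single rectangle with stacked labels.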
663 | def add_cdf_image_summary(values, name):
664 | """Adds a tf.summary.image for a CDF plot of the values.
665 |
666 | Normalizes `values` such that they sum to 1, plots the cumulative distribution
667 | function and creates a tf image summary.
668 |
669 | Args:
670 | values: a 1-D float32 tensor containing the values.
671 | name: name for the image summary.
672 | """
673 | def cdf_plot(values):
674 | """Numpy function to plot CDF."""
675 | normalized_values = values / np.sum(values)
676 | sorted_values = np.sort(normalized_values)
677 | cumulative_values = np.cumsum(sorted_values)
678 | fraction_of_examples = (np.arange(cumulative_values.size, dtype=np.float32)
679 | / cumulative_values.size)
680 | fig = plt.figure(frameon=False)
681 | ax = fig.add_subplot(111)
682 | ax.plot(fraction_of_examples, cumulative_values)
683 | ax.set_ylabel('cumulative normalized values')
684 | ax.set_xlabel('fraction of examples')
685 | fig.canvas.draw()
686 | width, height = fig.get_size_inches() * fig.get_dpi()
687 | image = np.frombuffer(fig.canvas.tostring_rgb(), dtype='uint8').reshape(
688 | 1, int(height), int(width), 3)
689 | return image
690 | cdf_plot = tf.py_func(cdf_plot, [values], tf.uint8)
691 | tf.summary.image(name, cdf_plot)
692 |
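Stripped of the matplotlib and TensorFlow plumbing, `cdf_plot` computes a short numerical recipe: normalize the values so they sum to 1, sort them, take the running sum, and pair each point with the fraction of examples seen so far. A pure-Python sketch of just that math (the name `cdf_points` is illustrative):

```python
# Hypothetical pure-Python sketch of the CDF math inside cdf_plot:
# normalize values to sum to 1, sort, then take the running sum.

def cdf_points(values):
    total = sum(values)
    sorted_vals = sorted(v / total for v in values)
    cumulative, running = [], 0.0
    for v in sorted_vals:
        running += v
        cumulative.append(running)
    n = len(cumulative)
    fraction_of_examples = [i / n for i in range(n)]
    return fraction_of_examples, cumulative

xs, ys = cdf_points([1.0, 3.0, 2.0, 2.0])
print(xs)  # x-axis: fraction of examples
print(ys)  # y-axis: cumulative normalized values, ending at 1.0
```

The resulting curve makes skew easy to read: the more the mass is concentrated in a few large values, the longer the curve hugs the bottom before shooting up.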
693 |
694 | def add_hist_image_summary(values, bins, name):
695 | """Adds a tf.summary.image for a histogram plot of the values.
696 |
697 | Plots the histogram of values and creates a tf image summary.
698 |
699 | Args:
700 | values: a 1-D float32 tensor containing the values.
701 | bins: bin edges which will be directly passed to np.histogram.
702 | name: name for the image summary.
703 | """
704 |
705 | def hist_plot(values, bins):
706 | """Numpy function to plot hist."""
707 | fig = plt.figure(frameon=False)
708 | ax = fig.add_subplot(111)
709 | y, x = np.histogram(values, bins=bins)
710 | ax.plot(x[:-1], y)
711 | ax.set_ylabel('count')
712 | ax.set_xlabel('value')
713 | fig.canvas.draw()
714 | width, height = fig.get_size_inches() * fig.get_dpi()
715 | image = np.frombuffer(
716 | fig.canvas.tostring_rgb(), dtype='uint8').reshape(
717 | 1, int(height), int(width), 3)
718 | return image
719 | hist_plot = tf.py_func(hist_plot, [values, bins], tf.uint8)
720 | tf.summary.image(name, hist_plot)
721 |
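The counting that `np.histogram` does for `hist_plot` follows a specific edge convention: each value falls into the half-open bin `[edges[i], edges[i+1])`, except that the last bin is closed on the right so the maximum edge is not dropped. A small pure-Python sketch of that bucketing (the function name `histogram` here is an illustrative stand-in, not the NumPy implementation):

```python
# Hypothetical pure-Python sketch of the bucketing np.histogram performs
# inside hist_plot: values are counted into half-open bins
# [edges[i], edges[i+1]), with the last bin closed on the right.

def histogram(values, bin_edges):
    counts = [0] * (len(bin_edges) - 1)
    for v in values:
        for i in range(len(counts)):
            last = i == len(counts) - 1
            if bin_edges[i] <= v < bin_edges[i + 1] or (last and v == bin_edges[-1]):
                counts[i] += 1
                break
    return counts

# 0.5 lands in the second bin (left edge is inclusive); 1.0 is kept
# because the final bin is closed on the right.
print(histogram([0.1, 0.4, 0.5, 0.9, 1.0], [0.0, 0.5, 1.0]))
# → [2, 3]
```

`hist_plot` then plots `counts` against the left bin edges (`x[:-1]`), which is why the plotted line has one fewer point than there are edges.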
--------------------------------------------------------------------------------