├── .gitignore
├── CONTRIBUTING.md
├── LICENSE
├── README.md
├── download_models.sh
├── gstreamer
│   ├── README.md
│   ├── common.py
│   ├── detect.py
│   ├── gstreamer.py
│   ├── install_requirements.sh
│   ├── requirements_for_sort_tracker.txt
│   └── tracker.py
└── third_party
    └── README.md

/.gitignore:
--------------------------------------------------------------------------------
1 | all_models/
2 | **__pycache__
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # How to Contribute
2 |
3 | We'd love to accept your patches and contributions to this project. There are
4 | just a few small guidelines you need to follow.
5 |
6 | ## Contributor License Agreement
7 |
8 | Contributions to this project must be accompanied by a Contributor License
9 | Agreement. You (or your employer) retain the copyright to your contribution;
10 | this simply gives us permission to use and redistribute your contributions as
11 | part of the project. Head over to <https://cla.developers.google.com/> to see
12 | your current agreements on file or to sign a new one.
13 |
14 | You generally only need to submit a CLA once, so if you've already submitted one
15 | (even if it was for a different project), you probably don't need to do it
16 | again.
17 |
18 | ## Code reviews
19 |
20 | All submissions, including submissions by project members, require review. We
21 | use GitHub pull requests for this purpose. Consult
22 | [GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
23 | information on using pull requests.
24 |
25 | ## Community Guidelines
26 |
27 | This project follows [Google's Open Source Community
28 | Guidelines](https://opensource.google.com/conduct/).
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 |
2 |                                  Apache License
3 |                            Version 2.0, January 2004
4 |                         http://www.apache.org/licenses/
5 |
6 |    TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
7 |
8 |    1. Definitions.
9 |
10 |       "License" shall mean the terms and conditions for use, reproduction,
11 |       and distribution as defined by Sections 1 through 9 of this document.
12 |
13 |       "Licensor" shall mean the copyright owner or entity authorized by
14 |       the copyright owner that is granting the License.
15 |
16 |       "Legal Entity" shall mean the union of the acting entity and all
17 |       other entities that control, are controlled by, or are under common
18 |       control with that entity. For the purposes of this definition,
19 |       "control" means (i) the power, direct or indirect, to cause the
20 |       direction or management of such entity, whether by contract or
21 |       otherwise, or (ii) ownership of fifty percent (50%) or more of the
22 |       outstanding shares, or (iii) beneficial ownership of such entity.
23 |
24 |       "You" (or "Your") shall mean an individual or Legal Entity
25 |       exercising permissions granted by this License.
26 |
27 |       "Source" form shall mean the preferred form for making modifications,
28 |       including but not limited to software source code, documentation
29 |       source, and configuration files.
30 |
31 |       "Object" form shall mean any form resulting from mechanical
32 |       transformation or translation of a Source form, including but
33 |       not limited to compiled object code, generated documentation,
34 |       and conversions to other media types.
35 | 36 | "Work" shall mean the work of authorship, whether in Source or 37 | Object form, made available under the License, as indicated by a 38 | copyright notice that is included in or attached to the work 39 | (an example is provided in the Appendix below). 40 | 41 | "Derivative Works" shall mean any work, whether in Source or Object 42 | form, that is based on (or derived from) the Work and for which the 43 | editorial revisions, annotations, elaborations, or other modifications 44 | represent, as a whole, an original work of authorship. For the purposes 45 | of this License, Derivative Works shall not include works that remain 46 | separable from, or merely link (or bind by name) to the interfaces of, 47 | the Work and Derivative Works thereof. 48 | 49 | "Contribution" shall mean any work of authorship, including 50 | the original version of the Work and any modifications or additions 51 | to that Work or Derivative Works thereof, that is intentionally 52 | submitted to Licensor for inclusion in the Work by the copyright owner 53 | or by an individual or Legal Entity authorized to submit on behalf of 54 | the copyright owner. For the purposes of this definition, "submitted" 55 | means any form of electronic, verbal, or written communication sent 56 | to the Licensor or its representatives, including but not limited to 57 | communication on electronic mailing lists, source code control systems, 58 | and issue tracking systems that are managed by, or on behalf of, the 59 | Licensor for the purpose of discussing and improving the Work, but 60 | excluding communication that is conspicuously marked or otherwise 61 | designated in writing by the copyright owner as "Not a Contribution." 62 | 63 | "Contributor" shall mean Licensor and any individual or Legal Entity 64 | on behalf of whom a Contribution has been received by Licensor and 65 | subsequently incorporated within the Work. 66 | 67 | 2. Grant of Copyright License. Subject to the terms and conditions of 68 | this License, each Contributor hereby grants to You a perpetual, 69 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 70 | copyright license to reproduce, prepare Derivative Works of, 71 | publicly display, publicly perform, sublicense, and distribute the 72 | Work and such Derivative Works in Source or Object form. 73 | 74 | 3. Grant of Patent License. Subject to the terms and conditions of 75 | this License, each Contributor hereby grants to You a perpetual, 76 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 77 | (except as stated in this section) patent license to make, have made, 78 | use, offer to sell, sell, import, and otherwise transfer the Work, 79 | where such license applies only to those patent claims licensable 80 | by such Contributor that are necessarily infringed by their 81 | Contribution(s) alone or by combination of their Contribution(s) 82 | with the Work to which such Contribution(s) was submitted. If You 83 | institute patent litigation against any entity (including a 84 | cross-claim or counterclaim in a lawsuit) alleging that the Work 85 | or a Contribution incorporated within the Work constitutes direct 86 | or contributory patent infringement, then any patent licenses 87 | granted to You under this License for that Work shall terminate 88 | as of the date such litigation is filed. 89 | 90 | 4. Redistribution. 
You may reproduce and distribute copies of the 91 | Work or Derivative Works thereof in any medium, with or without 92 | modifications, and in Source or Object form, provided that You 93 | meet the following conditions: 94 | 95 | (a) You must give any other recipients of the Work or 96 | Derivative Works a copy of this License; and 97 | 98 | (b) You must cause any modified files to carry prominent notices 99 | stating that You changed the files; and 100 | 101 | (c) You must retain, in the Source form of any Derivative Works 102 | that You distribute, all copyright, patent, trademark, and 103 | attribution notices from the Source form of the Work, 104 | excluding those notices that do not pertain to any part of 105 | the Derivative Works; and 106 | 107 | (d) If the Work includes a "NOTICE" text file as part of its 108 | distribution, then any Derivative Works that You distribute must 109 | include a readable copy of the attribution notices contained 110 | within such NOTICE file, excluding those notices that do not 111 | pertain to any part of the Derivative Works, in at least one 112 | of the following places: within a NOTICE text file distributed 113 | as part of the Derivative Works; within the Source form or 114 | documentation, if provided along with the Derivative Works; or, 115 | within a display generated by the Derivative Works, if and 116 | wherever such third-party notices normally appear. The contents 117 | of the NOTICE file are for informational purposes only and 118 | do not modify the License. You may add Your own attribution 119 | notices within Derivative Works that You distribute, alongside 120 | or as an addendum to the NOTICE text from the Work, provided 121 | that such additional attribution notices cannot be construed 122 | as modifying the License. 123 | 124 | You may add Your own copyright statement to Your modifications and 125 | may provide additional or different license terms and conditions 126 | for use, reproduction, or distribution of Your modifications, or 127 | for any such Derivative Works as a whole, provided Your use, 128 | reproduction, and distribution of the Work otherwise complies with 129 | the conditions stated in this License. 130 | 131 | 5. Submission of Contributions. Unless You explicitly state otherwise, 132 | any Contribution intentionally submitted for inclusion in the Work 133 | by You to the Licensor shall be under the terms and conditions of 134 | this License, without any additional terms or conditions. 135 | Notwithstanding the above, nothing herein shall supersede or modify 136 | the terms of any separate license agreement you may have executed 137 | with Licensor regarding such Contributions. 138 | 139 | 6. Trademarks. This License does not grant permission to use the trade 140 | names, trademarks, service marks, or product names of the Licensor, 141 | except as required for reasonable and customary use in describing the 142 | origin of the Work and reproducing the content of the NOTICE file. 143 | 144 | 7. Disclaimer of Warranty. Unless required by applicable law or 145 | agreed to in writing, Licensor provides the Work (and each 146 | Contributor provides its Contributions) on an "AS IS" BASIS, 147 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 148 | implied, including, without limitation, any warranties or conditions 149 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 150 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 151 | appropriateness of using or redistributing the Work and assume any 152 | risks associated with Your exercise of permissions under this License. 153 | 154 | 8. Limitation of Liability. In no event and under no legal theory, 155 | whether in tort (including negligence), contract, or otherwise, 156 | unless required by applicable law (such as deliberate and grossly 157 | negligent acts) or agreed to in writing, shall any Contributor be 158 | liable to You for damages, including any direct, indirect, special, 159 | incidental, or consequential damages of any character arising as a 160 | result of this License or out of the use or inability to use the 161 | Work (including but not limited to damages for loss of goodwill, 162 | work stoppage, computer failure or malfunction, or any and all 163 | other commercial damages or losses), even if such Contributor 164 | has been advised of the possibility of such damages. 165 | 166 | 9. Accepting Warranty or Additional Liability. While redistributing 167 | the Work or Derivative Works thereof, You may choose to offer, 168 | and charge a fee for, acceptance of support, warranty, indemnity, 169 | or other liability obligations and/or rights consistent with this 170 | License. However, in accepting such obligations, You may act only 171 | on Your own behalf and on Your sole responsibility, not on behalf 172 | of any other Contributor, and only if You agree to indemnify, 173 | defend, and hold each Contributor harmless for any liability 174 | incurred by, or claims asserted against, such Contributor by reason 175 | of your accepting any such warranty or additional liability. 176 | 177 | END OF TERMS AND CONDITIONS 178 | 179 | APPENDIX: How to apply the Apache License to your work. 180 | 181 | To apply the Apache License to your work, attach the following 182 | boilerplate notice, with the fields enclosed by brackets "[]" 183 | replaced with your own identifying information. (Don't include 184 | the brackets!) The text should be enclosed in the appropriate 185 | comment syntax for the file format. We also recommend that a 186 | file or class name and description of purpose be included on the 187 | same "printed page" as the copyright notice for easier 188 | identification within third-party archives. 189 | 190 | Copyright [yyyy] [name of copyright owner] 191 | 192 | Licensed under the Apache License, Version 2.0 (the "License"); 193 | you may not use this file except in compliance with the License. 194 | You may obtain a copy of the License at 195 | 196 | http://www.apache.org/licenses/LICENSE-2.0 197 | 198 | Unless required by applicable law or agreed to in writing, software 199 | distributed under the License is distributed on an "AS IS" BASIS, 200 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 201 | See the License for the specific language governing permissions and 202 | limitations under the License. 
203 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Edge TPU Object Tracker Example
2 |
3 | This repo contains a collection of examples that use camera streams
4 | together with the [TensorFlow Lite API](https://tensorflow.org/lite) on a
5 | Coral device such as the
6 | [USB Accelerator](https://coral.withgoogle.com/products/accelerator) or
7 | [Dev Board](https://coral.withgoogle.com/products/dev-board), and provides an object tracker for the detected objects.
8 |
9 |
10 | ## Installation
11 |
12 | 1. First, be sure you have completed the [setup instructions for your Coral
13 |    device](https://coral.ai/docs/setup/). If it's been a while, repeat to be sure
14 |    you have the latest software.
15 |
16 |    Importantly, you should have the latest TensorFlow Lite runtime installed
17 |    (as per the [Python quickstart](
18 |    https://www.tensorflow.org/lite/guide/python)).
19 |
20 | 2. Clone this Git repo onto your computer:
21 |
22 |    ```
23 |    mkdir google-coral && cd google-coral
24 |
25 |    git clone https://github.com/google-coral/example-object-tracker.git
26 |
27 |    cd example-object-tracker/
28 |    ```
29 |
30 | 3. Download the models:
31 |
32 |    ```
33 |    sh download_models.sh
34 |    ```
35 |
36 |    These models will be downloaded to a new folder
37 |    ```models```.
38 |
39 |
40 | Further requirements may be needed by the different camera libraries; check the
41 | README file in the respective subfolder.
42 |
43 | ## Contents
44 |
45 |   * __gstreamer__: Python examples using gstreamer to obtain camera stream. These
46 |     examples work on Linux using a webcam, Raspberry Pi with
47 |     the Raspicam, and on the Coral DevBoard using the Coral camera. For the
48 |     former two, you will also need a Coral USB Accelerator to run the models.
49 |
50 |     This demo provides support for an object tracker. After following the setup
51 |     instructions in the README file of the ```gstreamer``` subfolder, you can run the tracker demo:
52 |
53 |     ```
54 |     cd gstreamer
55 |     python3 detect.py --tracker sort
56 |     ```
57 |
58 | ## Models
59 |
60 | For the demos in this repository you can change the model and the labels
61 | file by using the flags ```--model``` and
62 | ```--labels```. Be sure to use the models labeled _edgetpu, as those are
63 | compiled for the Edge TPU accelerator; otherwise the model will run on the CPU and
64 | be much slower.
65 |
66 |
67 | For detection, you need to select one of the SSD detection models
68 | and its corresponding labels file:
69 |
70 | ```
71 | mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite, coco_labels.txt
72 | ```
73 |
74 |
--------------------------------------------------------------------------------
/download_models.sh:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 | # Copyright 2019 Google LLC
3 | #
4 | # Licensed under the Apache License, Version 2.0 (the "License");
5 | # you may not use this file except in compliance with the License.
6 | # You may obtain a copy of the License at
7 | #
8 | #     https://www.apache.org/licenses/LICENSE-2.0
9 | #
10 | # Unless required by applicable law or agreed to in writing, software
11 | # distributed under the License is distributed on an "AS IS" BASIS,
12 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 | # See the License for the specific language governing permissions and
14 | # limitations under the License.
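# The commands below create a local models/ directory, fetch the pre-compiled
# Edge TPU SSD MobileNet model and its COCO labels file, and move both into
# that directory.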
15 |
16 | mkdir -p models
17 | wget https://dl.google.com/coral/canned_models/mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite
18 | wget https://dl.google.com/coral/canned_models/coco_labels.txt
19 | mv mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite coco_labels.txt models/
--------------------------------------------------------------------------------
/gstreamer/README.md:
--------------------------------------------------------------------------------
1 | # GStreamer based Object Tracking Example
2 |
3 | This folder contains example code using [GStreamer](https://github.com/GStreamer/gstreamer) to
4 | obtain camera images and perform image classification and object detection on the Edge TPU.
5 |
6 | This code works on Linux using a webcam, Raspberry Pi with the Pi Camera, and on the Coral Dev
7 | Board using the Coral Camera or a webcam. For the first two, you also need a Coral
8 | USB/PCIe/M.2 Accelerator.
9 |
10 |
11 | ## Set up your device
12 |
13 | 1. First, be sure you have completed the [setup instructions for your Coral
14 |    device](https://coral.ai/docs/setup/). If it's been a while, repeat to be sure
15 |    you have the latest software.
16 |
17 |    Importantly, you should have the latest TensorFlow Lite runtime installed
18 |    (as per the [Python quickstart](
19 |    https://www.tensorflow.org/lite/guide/python)). You can check which version is installed
20 |    using the ```pip3 show tflite_runtime``` command.
21 |
22 | 2. Install the GStreamer libraries and trackers:
23 |
24 |    ```
25 |    bash install_requirements.sh
26 |    ```
27 | 3. Run the detection model with the SORT tracker:
28 |    ```
29 |    python3 detect.py --tracker sort
30 |    ```
31 |
32 | ## Run the detection demo without any tracker (SSD models)
33 |
34 | ```
35 | python3 detect.py
36 | ```
37 | You can change the model and the labels file using ```--model``` and ```--labels```.
38 |
39 | By default, the example uses the attached Coral Camera. If you want to use a USB camera,
40 | edit the ```gstreamer.py``` file and change ```device=/dev/video0``` to ```device=/dev/video1```.
--------------------------------------------------------------------------------
/gstreamer/common.py:
--------------------------------------------------------------------------------
1 | # Copyright 2019 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | #     https://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 15 | """Common utilities.""" 16 | import collections 17 | import gi 18 | gi.require_version('Gst', '1.0') 19 | from gi.repository import Gst 20 | import numpy as np 21 | import svgwrite 22 | import tflite_runtime.interpreter as tflite 23 | import time 24 | 25 | EDGETPU_SHARED_LIB = 'libedgetpu.so.1' 26 | 27 | def make_interpreter(model_file): 28 | model_file, *device = model_file.split('@') 29 | return tflite.Interpreter( 30 | model_path=model_file, 31 | experimental_delegates=[ 32 | tflite.load_delegate(EDGETPU_SHARED_LIB, 33 | {'device': device[0]} if device else {}) 34 | ]) 35 | 36 | def input_image_size(interpreter): 37 | """Returns input size as (width, height, channels) tuple.""" 38 | _, height, width, channels = interpreter.get_input_details()[0]['shape'] 39 | return width, height, channels 40 | 41 | def input_tensor(interpreter): 42 | """Returns input tensor view as numpy array of shape (height, width, channels).""" 43 | tensor_index = interpreter.get_input_details()[0]['index'] 44 | return interpreter.tensor(tensor_index)()[0] 45 | 46 | def set_input(interpreter, buf): 47 | """Copies data to input tensor.""" 48 | result, mapinfo = buf.map(Gst.MapFlags.READ) 49 | if result: 50 | np_buffer = np.reshape(np.frombuffer(mapinfo.data, dtype=np.uint8), 51 | interpreter.get_input_details()[0]['shape']) 52 | input_tensor(interpreter)[:, :] = np_buffer 53 | buf.unmap(mapinfo) 54 | 55 | def output_tensor(interpreter, i): 56 | """Returns dequantized output tensor if quantized before.""" 57 | output_details = interpreter.get_output_details()[i] 58 | output_data = np.squeeze(interpreter.tensor(output_details['index'])()) 59 | if 'quantization' not in output_details: 60 | return output_data 61 | scale, zero_point = output_details['quantization'] 62 | if scale == 0: 63 | return output_data - zero_point 64 | return scale * (output_data - zero_point) 65 | 66 | def avg_fps_counter(window_size): 67 | window = collections.deque(maxlen=window_size) 68 | prev = time.monotonic() 69 | yield 0.0 # First fps value. 70 | 71 | while True: 72 | curr = time.monotonic() 73 | window.append(curr - prev) 74 | prev = curr 75 | yield len(window) / sum(window) 76 | -------------------------------------------------------------------------------- /gstreamer/detect.py: -------------------------------------------------------------------------------- 1 | # Copyright 2019 Google LLC 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the "License"); 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # https://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an "AS IS" BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | 15 | """ 16 | A demo which runs object detection on camera frames using GStreamer. 17 | It also provides support for Object Tracker. 18 | 19 | Run default object detection: 20 | python3 detect.py 21 | 22 | Choose different camera and input encoding 23 | python3 detect.py --videosrc /dev/video1 --videofmt jpeg 24 | 25 | Choose an Object Tracker. 
Example: to run the SORT tracker
26 |     python3 detect.py --tracker sort
27 |
28 | TEST_DATA=../models
29 |
30 | Run the COCO model:
31 | python3 detect.py \
32 |   --model ${TEST_DATA}/mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite \
33 |   --labels ${TEST_DATA}/coco_labels.txt
34 | """
35 | import argparse
36 | import collections
37 | import common
38 | import gstreamer
39 | import numpy as np
40 | import os
41 | import re
42 | import svgwrite
43 | import time
44 | from tracker import ObjectTracker
45 |
46 |
47 | Object = collections.namedtuple('Object', ['id', 'score', 'bbox'])
48 |
49 |
50 | def load_labels(path):
51 |     p = re.compile(r'\s*(\d+)(.+)')
52 |     with open(path, 'r', encoding='utf-8') as f:
53 |         lines = (p.match(line).groups() for line in f.readlines())
54 |         return {int(num): text.strip() for num, text in lines}
55 |
56 |
57 | def shadow_text(dwg, x, y, text, font_size=20):
58 |     dwg.add(dwg.text(text, insert=(x+1, y+1), fill='black', font_size=font_size))
59 |     dwg.add(dwg.text(text, insert=(x, y), fill='white', font_size=font_size))
60 |
61 |
62 | def generate_svg(src_size, inference_size, inference_box, objs, labels, text_lines, trdata, trackerFlag):
63 |     dwg = svgwrite.Drawing('', size=src_size)
64 |     src_w, src_h = src_size
65 |     inf_w, inf_h = inference_size
66 |     box_x, box_y, box_w, box_h = inference_box
67 |     scale_x, scale_y = src_w / box_w, src_h / box_h
68 |
69 |     for y, line in enumerate(text_lines, start=1):
70 |         shadow_text(dwg, 10, y*20, line)
71 |     if trackerFlag and (np.array(trdata)).size:
72 |         for td in trdata:
73 |             x0, y0, x1, y1, trackID = td[0].item(), td[1].item(
74 |             ), td[2].item(), td[3].item(), td[4].item()
75 |             overlap, obj = 0, None
76 |             for ob in objs:
77 |                 dx0, dy0, dx1, dy1 = ob.bbox.xmin.item(), ob.bbox.ymin.item(
78 |                 ), ob.bbox.xmax.item(), ob.bbox.ymax.item()
79 |                 area = (min(dx1, x1)-max(dx0, x0))*(min(dy1, y1)-max(dy0, y0))
80 |                 if (area > overlap):
81 |                     overlap = area
82 |                     obj = ob
83 |             if obj is None: continue  # No detection overlaps this track.
84 |             # Relative coordinates.
85 |             x, y, w, h = x0, y0, x1 - x0, y1 - y0
86 |             # Absolute coordinates, input tensor space.
87 |             x, y, w, h = int(x * inf_w), int(y *
88 |                                              inf_h), int(w * inf_w), int(h * inf_h)
89 |             # Subtract boxing offset.
90 |             x, y = x - box_x, y - box_y
91 |             # Scale to source coordinate space.
92 |             x, y, w, h = x * scale_x, y * scale_y, w * scale_x, h * scale_y
93 |             percent = int(100 * obj.score)
94 |             label = '{}% {} ID:{}'.format(
95 |                 percent, labels.get(obj.id, obj.id), int(trackID))
96 |             shadow_text(dwg, x, y - 5, label)
97 |             dwg.add(dwg.rect(insert=(x, y), size=(w, h),
98 |                              fill='none', stroke='red', stroke_width='2'))
99 |     else:
100 |         for obj in objs:
101 |             x0, y0, x1, y1 = list(obj.bbox)
102 |             # Relative coordinates.
103 |             x, y, w, h = x0, y0, x1 - x0, y1 - y0
104 |             # Absolute coordinates, input tensor space.
105 |             x, y, w, h = int(x * inf_w), int(y *
106 |                                              inf_h), int(w * inf_w), int(h * inf_h)
107 |             # Subtract boxing offset.
108 |             x, y = x - box_x, y - box_y
109 |             # Scale to source coordinate space.
110 |             x, y, w, h = x * scale_x, y * scale_y, w * scale_x, h * scale_y
111 |             percent = int(100 * obj.score)
112 |             label = '{}% {}'.format(percent, labels.get(obj.id, obj.id))
113 |             shadow_text(dwg, x, y - 5, label)
114 |             dwg.add(dwg.rect(insert=(x, y), size=(w, h),
115 |                              fill='none', stroke='red', stroke_width='2'))
116 |     return dwg.tostring()
117 |
118 |
119 | class BBox(collections.namedtuple('BBox', ['xmin', 'ymin', 'xmax', 'ymax'])):
120 |     """Bounding box.
121 |     Represents a rectangle whose sides are either vertical or horizontal, parallel
122 |     to the x or y axis.
123 |     """
124 |     __slots__ = ()
125 |
126 |
127 | def get_output(interpreter, score_threshold, top_k, image_scale=1.0):
128 |     """Returns list of detected objects."""
129 |     boxes = common.output_tensor(interpreter, 0)
130 |     category_ids = common.output_tensor(interpreter, 1)
131 |     scores = common.output_tensor(interpreter, 2)
132 |
133 |     def make(i):
134 |         ymin, xmin, ymax, xmax = boxes[i]
135 |         return Object(
136 |             id=int(category_ids[i]),
137 |             score=scores[i],
138 |             bbox=BBox(xmin=np.maximum(0.0, xmin),
139 |                       ymin=np.maximum(0.0, ymin),
140 |                       xmax=np.minimum(1.0, xmax),
141 |                       ymax=np.minimum(1.0, ymax)))
142 |     return [make(i) for i in range(top_k) if scores[i] >= score_threshold]
143 |
144 |
145 | def main():
146 |     default_model_dir = '../models'
147 |     default_model = 'mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite'
148 |     default_labels = 'coco_labels.txt'
149 |     parser = argparse.ArgumentParser()
150 |     parser.add_argument('--model', help='.tflite model path',
151 |                         default=os.path.join(default_model_dir, default_model))
152 |     parser.add_argument('--labels', help='label file path',
153 |                         default=os.path.join(default_model_dir, default_labels))
154 |     parser.add_argument('--top_k', type=int, default=3,
155 |                         help='number of categories with highest score to display')
156 |     parser.add_argument('--threshold', type=float, default=0.1,
157 |                         help='detector score threshold')
158 |     parser.add_argument('--videosrc', help='Which video source to use.',
159 |                         default='/dev/video0')
160 |     parser.add_argument('--videofmt', help='Input video format.',
161 |                         default='raw',
162 |                         choices=['raw', 'h264', 'jpeg'])
163 |     parser.add_argument('--tracker', help='Name of the object tracker to be used.',
164 |                         default=None,
165 |                         choices=[None, 'sort'])
166 |     args = parser.parse_args()
167 |
168 |     print('Loading {} with {} labels.'.format(args.model, args.labels))
169 |     interpreter = common.make_interpreter(args.model)
170 |     interpreter.allocate_tensors()
171 |     labels = load_labels(args.labels)
172 |
173 |     w, h, _ = common.input_image_size(interpreter)
174 |     inference_size = (w, h)
175 |     # Average fps over last 30 frames.
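    # avg_fps_counter (in common.py) is a generator: each next() call appends
    # the latest frame interval to a sliding window and yields the windowed
    # average frame rate.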
176 | fps_counter = common.avg_fps_counter(30) 177 | 178 | def user_callback(input_tensor, src_size, inference_box, mot_tracker): 179 | nonlocal fps_counter 180 | start_time = time.monotonic() 181 | common.set_input(interpreter, input_tensor) 182 | interpreter.invoke() 183 | # For larger input image sizes, use the edgetpu.classification.engine for better performance 184 | objs = get_output(interpreter, args.threshold, args.top_k) 185 | end_time = time.monotonic() 186 | detections = [] # np.array([]) 187 | for n in range(0, len(objs)): 188 | element = [] # np.array([]) 189 | element.append(objs[n].bbox.xmin) 190 | element.append(objs[n].bbox.ymin) 191 | element.append(objs[n].bbox.xmax) 192 | element.append(objs[n].bbox.ymax) 193 | element.append(objs[n].score) # print('element= ',element) 194 | detections.append(element) # print('dets: ',dets) 195 | # convert to numpy array # print('npdets: ',dets) 196 | detections = np.array(detections) 197 | trdata = [] 198 | trackerFlag = False 199 | if detections.any(): 200 | if mot_tracker != None: 201 | trdata = mot_tracker.update(detections) 202 | trackerFlag = True 203 | text_lines = [ 204 | 'Inference: {:.2f} ms'.format((end_time - start_time) * 1000), 205 | 'FPS: {} fps'.format(round(next(fps_counter))), ] 206 | if len(objs) != 0: 207 | return generate_svg(src_size, inference_size, inference_box, objs, labels, text_lines, trdata, trackerFlag) 208 | 209 | result = gstreamer.run_pipeline(user_callback, 210 | src_size=(640, 480), 211 | appsink_size=inference_size, 212 | trackerName=args.tracker, 213 | videosrc=args.videosrc, 214 | videofmt=args.videofmt) 215 | 216 | 217 | if __name__ == '__main__': 218 | main() 219 | -------------------------------------------------------------------------------- /gstreamer/gstreamer.py: -------------------------------------------------------------------------------- 1 | # Copyright 2019 Google LLC 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the 'License'); 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # https://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an 'AS IS' BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | 15 | import sys 16 | import svgwrite 17 | import threading 18 | from tracker import ObjectTracker 19 | 20 | import gi 21 | gi.require_version('Gst', '1.0') 22 | gi.require_version('GstBase', '1.0') 23 | gi.require_version('Gtk', '3.0') 24 | from gi.repository import GLib, GObject, Gst, GstBase, Gtk 25 | 26 | GObject.threads_init() 27 | Gst.init(None) 28 | 29 | class GstPipeline: 30 | def __init__(self, pipeline, user_function, src_size, mot_tracker): 31 | self.user_function = user_function 32 | self.running = False 33 | self.gstbuffer = None 34 | self.sink_size = None 35 | self.src_size = src_size 36 | self.box = None 37 | self.condition = threading.Condition() 38 | self.mot_tracker = mot_tracker 39 | self.pipeline = Gst.parse_launch(pipeline) 40 | self.overlay = self.pipeline.get_by_name('overlay') 41 | self.overlaysink = self.pipeline.get_by_name('overlaysink') 42 | appsink = self.pipeline.get_by_name('appsink') 43 | appsink.connect('new-sample', self.on_new_sample) 44 | 45 | # Set up a pipeline bus watch to catch errors. 
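# on_bus_message (below) logs WARNING messages to stderr and quits the GTK
# main loop on EOS or ERROR (errors are also logged).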
46 | bus = self.pipeline.get_bus() 47 | bus.add_signal_watch() 48 | bus.connect('message', self.on_bus_message) 49 | 50 | # Set up a full screen window on Coral, no-op otherwise. 51 | self.setup_window() 52 | 53 | def run(self): 54 | # Start inference worker. 55 | self.running = True 56 | worker = threading.Thread(target=self.inference_loop) 57 | worker.start() 58 | 59 | # Run pipeline. 60 | self.pipeline.set_state(Gst.State.PLAYING) 61 | try: 62 | Gtk.main() 63 | except: 64 | pass 65 | 66 | # Clean up. 67 | self.pipeline.set_state(Gst.State.NULL) 68 | while GLib.MainContext.default().iteration(False): 69 | pass 70 | with self.condition: 71 | self.running = False 72 | self.condition.notify_all() 73 | worker.join() 74 | 75 | def on_bus_message(self, bus, message): 76 | t = message.type 77 | if t == Gst.MessageType.EOS: 78 | Gtk.main_quit() 79 | elif t == Gst.MessageType.WARNING: 80 | err, debug = message.parse_warning() 81 | sys.stderr.write('Warning: %s: %s\n' % (err, debug)) 82 | elif t == Gst.MessageType.ERROR: 83 | err, debug = message.parse_error() 84 | sys.stderr.write('Error: %s: %s\n' % (err, debug)) 85 | Gtk.main_quit() 86 | return True 87 | 88 | def on_new_sample(self, sink): 89 | sample = sink.emit('pull-sample') 90 | if not self.sink_size: 91 | s = sample.get_caps().get_structure(0) 92 | self.sink_size = (s.get_value('width'), s.get_value('height')) 93 | with self.condition: 94 | self.gstbuffer = sample.get_buffer() 95 | self.condition.notify_all() 96 | return Gst.FlowReturn.OK 97 | 98 | def get_box(self): 99 | if not self.box: 100 | glbox = self.pipeline.get_by_name('glbox') 101 | if glbox: 102 | glbox = glbox.get_by_name('filter') 103 | box = self.pipeline.get_by_name('box') 104 | assert glbox or box 105 | assert self.sink_size 106 | if glbox: 107 | self.box = (glbox.get_property('x'), glbox.get_property('y'), 108 | glbox.get_property('width'), glbox.get_property('height')) 109 | else: 110 | self.box = (-box.get_property('left'), -box.get_property('top'), 111 | self.sink_size[0] + box.get_property('left') + box.get_property('right'), 112 | self.sink_size[1] + box.get_property('top') + box.get_property('bottom')) 113 | return self.box 114 | 115 | def inference_loop(self): 116 | while True: 117 | with self.condition: 118 | while not self.gstbuffer and self.running: 119 | self.condition.wait() 120 | if not self.running: 121 | break 122 | gstbuffer = self.gstbuffer 123 | self.gstbuffer = None 124 | 125 | # Passing Gst.Buffer as input tensor avoids 2 copies of it: 126 | # * Python bindings copies the data when mapping gstbuffer 127 | # * Numpy copies the data when creating ndarray. 128 | # This requires a recent version of the python3-edgetpu package. If this 129 | # raises an exception please make sure dependencies are up to date. 130 | input_tensor = gstbuffer 131 | svg = self.user_function(input_tensor, self.src_size, self.get_box(), self.mot_tracker) 132 | if svg: 133 | if self.overlay: 134 | self.overlay.set_property('data', svg) 135 | if self.overlaysink: 136 | self.overlaysink.set_property('svg', svg) 137 | 138 | def setup_window(self): 139 | # Only set up our own window if we have Coral overlay sink in the pipeline. 140 | if not self.overlaysink: 141 | return 142 | 143 | gi.require_version('GstGL', '1.0') 144 | gi.require_version('GstVideo', '1.0') 145 | from gi.repository import GstGL, GstVideo 146 | 147 | # Needed to commit the wayland sub-surface. 148 | def on_gl_draw(sink, widget): 149 | widget.queue_draw() 150 | 151 | # Needed to account for window chrome etc. 
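# The configure-event handler keeps the overlay sink's render rectangle in
# sync with the drawing area's current allocation as the window geometry
# changes.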
152 | def on_widget_configure(widget, event, overlaysink): 153 | allocation = widget.get_allocation() 154 | overlaysink.set_render_rectangle(allocation.x, allocation.y, 155 | allocation.width, allocation.height) 156 | return False 157 | 158 | window = Gtk.Window(Gtk.WindowType.TOPLEVEL) 159 | window.fullscreen() 160 | 161 | drawing_area = Gtk.DrawingArea() 162 | window.add(drawing_area) 163 | drawing_area.realize() 164 | 165 | self.overlaysink.connect('drawn', on_gl_draw, drawing_area) 166 | 167 | # Wayland window handle. 168 | wl_handle = self.overlaysink.get_wayland_window_handle(drawing_area) 169 | self.overlaysink.set_window_handle(wl_handle) 170 | 171 | # Wayland display context wrapped as a GStreamer context. 172 | wl_display = self.overlaysink.get_default_wayland_display_context() 173 | self.overlaysink.set_context(wl_display) 174 | 175 | drawing_area.connect('configure-event', on_widget_configure, self.overlaysink) 176 | window.connect('delete-event', Gtk.main_quit) 177 | window.show_all() 178 | 179 | # The appsink pipeline branch must use the same GL display as the screen 180 | # rendering so they get the same GL context. This isn't automatically handled 181 | # by GStreamer as we're the ones setting an external display handle. 182 | def on_bus_message_sync(bus, message, overlaysink): 183 | if message.type == Gst.MessageType.NEED_CONTEXT: 184 | _, context_type = message.parse_context_type() 185 | if context_type == GstGL.GL_DISPLAY_CONTEXT_TYPE: 186 | sinkelement = overlaysink.get_by_interface(GstVideo.VideoOverlay) 187 | gl_context = sinkelement.get_property('context') 188 | if gl_context: 189 | display_context = Gst.Context.new(GstGL.GL_DISPLAY_CONTEXT_TYPE, True) 190 | GstGL.context_set_gl_display(display_context, gl_context.get_display()) 191 | message.src.set_context(display_context) 192 | return Gst.BusSyncReply.PASS 193 | 194 | bus = self.pipeline.get_bus() 195 | bus.set_sync_handler(on_bus_message_sync, self.overlaysink) 196 | 197 | def detectCoralDevBoard(): 198 | try: 199 | if 'MX8MQ' in open('/sys/firmware/devicetree/base/model').read(): 200 | print('Detected Edge TPU dev board.') 201 | return True 202 | except: pass 203 | return False 204 | 205 | def run_pipeline(user_function, 206 | src_size, 207 | appsink_size, 208 | trackerName, 209 | videosrc='/dev/video1', 210 | videofmt='raw'): 211 | objectOfTracker = None 212 | if videofmt == 'h264': 213 | SRC_CAPS = 'video/x-h264,width={width},height={height},framerate=30/1' 214 | elif videofmt == 'jpeg': 215 | SRC_CAPS = 'image/jpeg,width={width},height={height},framerate=30/1' 216 | else: 217 | SRC_CAPS = 'video/x-raw,width={width},height={height},framerate=30/1' 218 | if videosrc.startswith('/dev/video'): 219 | PIPELINE = 'v4l2src device=%s ! {src_caps}'%videosrc 220 | elif videosrc.startswith('http'): 221 | PIPELINE = 'souphttpsrc location=%s'%videosrc 222 | elif videosrc.startswith('rtsp'): 223 | PIPELINE = 'rtspsrc location=%s'%videosrc 224 | else: 225 | demux = 'avidemux' if videosrc.endswith('avi') else 'qtdemux' 226 | PIPELINE = """filesrc location=%s ! %s name=demux demux.video_0 227 | ! queue ! decodebin ! videorate 228 | ! videoconvert n-threads=4 ! videoscale n-threads=4 229 | ! {src_caps} ! {leaky_q} """ % (videosrc, demux) 230 | ''' Check for the object tracker.''' 231 | if trackerName != None: 232 | if trackerName == 'mediapipe': 233 | if detectCoralDevBoard(): 234 | objectOfTracker = ObjectTracker('mediapipe') 235 | else: 236 | print("Tracker MediaPipe is only available on the Dev Board. 
Keeping the tracker as None") 237 | trackerName = None 238 | else: 239 | objectOfTracker = ObjectTracker(trackerName) 240 | else: 241 | pass 242 | 243 | if detectCoralDevBoard(): 244 | scale_caps = None 245 | PIPELINE += """ ! decodebin ! glupload ! tee name=t 246 | t. ! queue ! glfilterbin filter=glbox name=glbox ! {sink_caps} ! {sink_element} 247 | t. ! queue ! glsvgoverlaysink name=overlaysink 248 | """ 249 | else: 250 | scale = min(appsink_size[0] / src_size[0], appsink_size[1] / src_size[1]) 251 | scale = tuple(int(x * scale) for x in src_size) 252 | scale_caps = 'video/x-raw,width={width},height={height}'.format(width=scale[0], height=scale[1]) 253 | PIPELINE += """ ! tee name=t 254 | t. ! {leaky_q} ! videoconvert ! videoscale ! {scale_caps} ! videobox name=box autocrop=true 255 | ! {sink_caps} ! {sink_element} 256 | t. ! {leaky_q} ! videoconvert 257 | ! rsvgoverlay name=overlay ! videoconvert ! ximagesink sync=false 258 | """ 259 | if objectOfTracker: 260 | mot_tracker = objectOfTracker.trackerObject.mot_tracker 261 | else: 262 | mot_tracker = None 263 | SINK_ELEMENT = 'appsink name=appsink emit-signals=true max-buffers=1 drop=true' 264 | SINK_CAPS = 'video/x-raw,format=RGB,width={width},height={height}' 265 | LEAKY_Q = 'queue max-size-buffers=1 leaky=downstream' 266 | 267 | src_caps = SRC_CAPS.format(width=src_size[0], height=src_size[1]) 268 | sink_caps = SINK_CAPS.format(width=appsink_size[0], height=appsink_size[1]) 269 | pipeline = PIPELINE.format(leaky_q=LEAKY_Q, 270 | src_caps=src_caps, sink_caps=sink_caps, 271 | sink_element=SINK_ELEMENT, scale_caps=scale_caps) 272 | 273 | print('Gstreamer pipeline:\n', pipeline) 274 | 275 | pipeline = GstPipeline(pipeline, user_function, src_size, mot_tracker) 276 | pipeline.run() 277 | -------------------------------------------------------------------------------- /gstreamer/install_requirements.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # Copyright 2019 Google LLC 3 | # 4 | # Licensed under the Apache License, Version 2.0 (the "License"); 5 | # you may not use this file except in compliance with the License. 6 | # You may obtain a copy of the License at 7 | # 8 | # https://www.apache.org/licenses/LICENSE-2.0 9 | # 10 | # Unless required by applicable law or agreed to in writing, software 11 | # distributed under the License is distributed on an "AS IS" BASIS, 12 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | # See the License for the specific language governing permissions and 14 | # limitations under the License. 15 | 16 | if grep -s -q "MX8MQ" /sys/firmware/devicetree/base/model; then 17 | echo "Installing DevBoard specific dependencies" 18 | sudo apt-get install -y python3-pip python3-edgetpuvision 19 | sudo python3 -m pip install svgwrite 20 | else 21 | # Install gstreamer 22 | sudo apt-get install -y gstreamer1.0-plugins-bad gstreamer1.0-plugins-good python3-gst-1.0 python3-gi gir1.2-gtk-3.0 23 | python3 -m pip install svgwrite 24 | 25 | if grep -s -q "Raspberry Pi" /sys/firmware/devicetree/base/model; then 26 | echo "Installing Raspberry Pi specific dependencies" 27 | sudo apt-get install python3-rpi.gpio 28 | # Add v4l2 video module to kernel 29 | if ! grep -q "bcm2835-v4l2" /etc/modules; then 30 | echo bcm2835-v4l2 | sudo tee -a /etc/modules 31 | fi 32 | sudo modprobe bcm2835-v4l2 33 | fi 34 | fi 35 | 36 | # Verify models are downloaded 37 | if [ ! -d "../models" ] 38 | then 39 | cd .. 40 | echo "Downloading models." 
41 | bash download_models.sh
42 | cd -
43 | fi
44 |
45 | # Install Tracker Dependencies
46 | echo
47 | echo "Installing tracker dependencies."
48 | echo
49 | echo "Note that the trackers have their own licenses, many of which
50 | are not Apache. Take care when using a tracker with a restrictive
51 | license in end applications."
52 |
53 | read -p "Install SORT (GPLv3)? " -n 1 -r
54 | if [[ $REPLY =~ ^[Yy]$ ]]
55 | then
56 |   wget https://github.com/abewley/sort/archive/master.zip -O sort.zip
57 |   unzip sort.zip -d ../third_party
58 |   rm sort.zip
59 |   sudo apt install python3-skimage
60 |   sudo apt install python3-dev
61 |   python3 -m pip install -r requirements_for_sort_tracker.txt
62 | fi
63 | echo
--------------------------------------------------------------------------------
/gstreamer/requirements_for_sort_tracker.txt:
--------------------------------------------------------------------------------
1 | filterpy==1.1.0
2 | lap==0.4.0
--------------------------------------------------------------------------------
/gstreamer/tracker.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | #     https://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | """
16 | This module provides support for tracker objects.
17 | It creates the object for the specific tracker based on the tracker name provided
18 | on the command line of the demo.
19 |
20 | To add more trackers here, simply replicate the SortTracker() code and adapt it to
21 | the new tracker as required.
22 |
23 | A developer simply needs to instantiate ObjectTracker(trackerObjectName) with a valid
24 | trackerObjectName.
25 |
26 | """
27 | import os, sys
28 |
29 | class ObjectTracker(object):
30 |     def __init__(self, trackerObjectName):
31 |         if trackerObjectName == 'sort':  # Add more trackers in elif branches as needed.
32 |             self.trackerObject = SortTracker()
33 |         else:
34 |             print("Invalid Tracker Name")
35 |             self.trackerObject = None
36 |
37 |
38 | class SortTracker(ObjectTracker):
39 |     def __init__(self):
40 |         sys.path.append(os.path.join(os.path.dirname(__file__), '../third_party', 'sort-master'))
41 |         from sort import Sort
42 |         self.mot_tracker = Sort()
--------------------------------------------------------------------------------
/third_party/README.md:
--------------------------------------------------------------------------------
1 | # Third Party Trackers
2 | This directory contains third party trackers.
3 |
4 | While the overall project has a permissive Apache 2.0 license, certain trackers
5 | (especially those intended for research) have more restrictive licenses. These
6 | trackers are downloaded by the install_requirements scripts, which will notify
7 | you of the license used. Care should be taken if restrictive licenses are an issue.
8 |
--------------------------------------------------------------------------------
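A note on extending gstreamer/tracker.py: its docstring says new trackers are added by replicating the SortTracker() pattern. Below is a minimal sketch of that shape under stated assumptions — `iou-tracker`, `iou_matching`, and `IOUSort` are hypothetical placeholder names, not code shipped with this repo. Whatever tracker is wired in must expose an `update(detections)` method like SORT's, accepting the Nx5 array of [xmin, ymin, xmax, ymax, score] rows that detect.py builds.

```python
# Hypothetical sketch only: mirrors SortTracker in gstreamer/tracker.py.
# If this class is appended to tracker.py itself, drop the import below.
import os
import sys

from tracker import ObjectTracker


class IOUTracker(ObjectTracker):
    def __init__(self):
        # Make the assumed third-party checkout importable, exactly as
        # SortTracker does for third_party/sort-master.
        sys.path.append(os.path.join(os.path.dirname(__file__),
                                     '../third_party', 'iou-tracker'))
        from iou_matching import IOUSort  # hypothetical module and class
        # detect.py only touches this attribute, via mot_tracker.update(...).
        self.mot_tracker = IOUSort()
```

ObjectTracker.__init__ would then gain a matching `elif trackerObjectName == 'iou':` branch that instantiates this class, and the `--tracker` flag in detect.py a matching entry in its `choices` list.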