├── .github ├── actions │ ├── Dockerfile │ ├── action.yml │ └── entrypoint.sh └── workflows │ └── build.yml ├── CONTRIBUTING.md ├── LICENSE ├── MAINTAINERS.md ├── README.md ├── docs ├── css │ ├── extra.css │ └── pdf-print.css ├── part1 │ ├── CLOUDSETUP.md │ ├── LABEL.md │ ├── README.md │ └── TRAIN.md ├── part2 │ ├── README.md │ ├── adding-tf.md │ ├── create-dashboard.md │ ├── display-objects.md │ ├── install.md │ └── installing-nodes.md ├── part3 │ ├── EDGEINFER.md │ ├── JETSON.md │ └── README.md ├── part4 │ ├── HZNCLIENT.md │ ├── HZNHUB.md │ └── README.md ├── part5 │ ├── README.md │ ├── dockerfile.md │ ├── package.md │ └── settings.md └── part6 │ ├── HZNDEPLOY.md │ └── README.md ├── images ├── COS-BucketName.png ├── COS-Create.png ├── COS-CreateBucket.png ├── COS-Upload-1.png ├── COS-Upload-2.png ├── COS-Upload-3.png ├── COS-Upload-Folder-1.png ├── COS-Upload-Folder-2.png ├── IBM-Cloud-Catalog-WML.png ├── IBM-Cloud-Catalog-search-WML.png ├── IBM-Cloud-Login.png ├── IBM-Cloud-WML-Create.png ├── IBM-Cloud-WML-Instance.png ├── Jetson-HardHat-Detection-Author.png ├── Sample-HardHat-Detection-Author.png ├── WorkerSafety-HardHat-Labeled-Image.png ├── cacli-complete.png ├── cacli-download.png ├── cacli-list.png ├── cacli-login.png ├── cacli-progress.png ├── cacli-train1.png ├── cacli-train2.png └── edge-objectdetection-arch-diagram.png ├── mkdocs.yml ├── part1 ├── CLOUDSETUP.md ├── LABEL.md └── TRAIN.md ├── part2 ├── EDGEINFER.md └── JETSON.md ├── part3 ├── HZNCLIENT.md └── HZNHUB.md └── part4 ├── DOCKERMODEL.md └── HZNDEPLOY.md /.github/actions/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:alpine 2 | 3 | RUN apk add --no-cache \ 4 | build-base \ 5 | git \ 6 | git-fast-import \ 7 | openssh \ 8 | gcc \ 9 | musl-dev \ 10 | jpeg-dev \ 11 | zlib-dev \ 12 | libffi-dev \ 13 | cairo-dev \ 14 | pango-dev \ 15 | gdk-pixbuf-dev \ 16 | ttf-ubuntu-font-family \ 17 | msttcorefonts-installer \ 18 | fontconfig && \ 19 | 
update-ms-fonts && \ 20 | fc-cache -f 21 | 22 | RUN pip install --no-cache-dir mkdocs mkdocs-with-pdf mkdocs-material 23 | 24 | COPY entrypoint.sh /entrypoint.sh 25 | RUN chmod +x /entrypoint.sh 26 | 27 | ENTRYPOINT ["/entrypoint.sh"] -------------------------------------------------------------------------------- /.github/actions/action.yml: -------------------------------------------------------------------------------- 1 | # action.yml 2 | name: 'Deploy to GitHub Pages' 3 | description: 'Publish Markdown docs as GitHub Pages static site' 4 | runs: 5 | using: 'docker' 6 | image: 'Dockerfile' -------------------------------------------------------------------------------- /.github/actions/entrypoint.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | ENABLE_PDF_EXPORT=1 mkdocs gh-deploy --config-file "${GITHUB_WORKSPACE}/mkdocs.yml" --force -------------------------------------------------------------------------------- /.github/workflows/build.yml: -------------------------------------------------------------------------------- 1 | name: GenerateSite 2 | 3 | on: 4 | push: 5 | branches: [main] 6 | pull_request: 7 | branches: [main] 8 | jobs: 9 | generate: 10 | name: 'Run mkdocs gh-deploy' 11 | runs-on: ubuntu-latest 12 | steps: 13 | - name: Check out repository 14 | uses: actions/checkout@v2 15 | 16 | - name: generate site 17 | uses: ./.github/actions/ 18 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing 2 | 3 | This is an open source project, and we appreciate your help! 4 | 5 | We use the GitHub issue tracker to discuss new features and non-trivial bugs. 6 | 7 | To contribute code, documentation, or tests, please submit a pull request to 8 | the GitHub repository. 
Generally, we expect the maintainers to review your pull 9 | request before it is approved for merging. For more details, see the 10 | [MAINTAINERS](MAINTAINERS.md) page. 11 | 12 | Contributions are subject to the [Developer Certificate of Origin, Version 1.1](https://developercertificate.org/) and the [Apache License, Version 2](https://www.apache.org/licenses/LICENSE-2.0.txt). 13 | 14 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 
29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /MAINTAINERS.md: -------------------------------------------------------------------------------- 1 | # Project Maintainers 2 | 3 | EdgeComputing-WorkerSafety-HardHat-Detection is currently managed by the following individuals: 4 | 5 | The maintainers are listed in alphabetical order of their Github username. 6 | 7 | - **John Walicki** ([johnwalicki](https://github.com/johnwalicki)) 8 | 9 | ## Methodology 10 | 11 | The quality of the master branch should never be compromised. 12 | 13 | The remainder of this document details how to merge pull requests to the 14 | repositories. 15 | 16 | ## Merge Approval 17 | 18 | The project maintainers use LGTM (Looks Good To Me) in comments on the pull 19 | request to indicate acceptance prior to merging. A change requires LGTMs from 20 | two project maintainers. If the code is written by a maintainer, the change 21 | only requires one additional LGTM. 22 | 23 | ## Reviewing Pull Requests 24 | 25 | We recommend reviewing pull requests directly within GitHub. 
This allows a 26 | public commentary on changes, providing transparency for all users. When 27 | providing feedback be civil, courteous, and kind. Disagreement is fine, so long 28 | as the discourse is carried out politely. If we see a record of uncivil or 29 | abusive comments, we will revoke your commit privileges and invite you to leave 30 | the project. 31 | 32 | During your review, consider the following points: 33 | 34 | ### Does the change have positive impact? 35 | 36 | Some proposed changes may not represent a positive impact to the project. Ask 37 | whether or not the change will make understanding the code easier, or if it 38 | could simply be a personal preference on the part of the author (see 39 | [bikeshedding](https://en.wiktionary.org/wiki/bikeshedding)). 40 | 41 | Pull requests that do not have a clear positive impact should be closed without 42 | merging. 43 | 44 | ### Do the changes make sense? 45 | 46 | If you do not understand what the changes are or what they accomplish, ask the 47 | author for clarification. Ask the author to add comments and/or clarify test 48 | case names to make the intentions clear. 49 | 50 | At times, such clarification will reveal that the author may not be using the 51 | code correctly, or is unaware of features that accommodate their needs. If you 52 | feel this is the case, work up a code sample that would address the pull 53 | request for them, and feel free to close the pull request once they confirm. 54 | 55 | ### Does the change introduce a new feature? 56 | 57 | For any given pull request, ask yourself "is this a new feature?" If so, does 58 | the pull request (or associated issue) contain narrative indicating the need 59 | for the feature? If not, ask them to provide that information. 60 | 61 | Are new unit tests in place that test all new behaviors introduced? If not, do 62 | not merge the feature until they are! Is documentation in place for the new 63 | feature? (See the documentation guidelines). 
If not, do not merge the feature 64 | until it is! Is the feature necessary for general use cases? Try and keep the 65 | scope of any given component narrow. If a proposed feature does not fit that 66 | scope, recommend to the user that they maintain the feature on their own, and 67 | close the request. You may also recommend that they see if the feature gains 68 | traction among other users, and suggest they re-submit when they can show such 69 | support. 70 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # This tutorial has been archived 2 | 3 | # Edge Computing Worker Safety Hard Hat Detection 4 | Protect workers with a TensorFlow Hard Hat object detection model running on a Jetson Nano 5 | 6 | ## Overview 7 | 8 | In this code pattern, we will describe how to implement a workplace safety use case using open source edge computing technologies and IBM's management architecture for deploying AI workloads to edge devices. 9 | 10 | In many factory environments, when employees enter a designated area, they must be wearing proper Personal Protective Equipment (PPE) such as a hard hat. This pattern demonstrates a solution which monitors the designated area and issues an alert only when an employee has been detected and is not wearing a hard hat. To reduce load on the network, the video stream object detection will be performed on edge devices managed by Open Horizon and IBM Edge Application Manager. 11 | 12 | To implement the architecture, the following components are required: 13 | 14 | - Models need to be trained to identify a person wearing a hard hat. This is accomplished using IBM Cloud Annotations and Watson Machine Learning. 15 | 16 | - The models need to be containerized and deployed to the edge. 
This is accomplished using [Open Horizon](https://www.lfedge.org/projects/openhorizon/). 17 | 18 | - Models need to be deployed to the camera to identify a human, which will trigger the camera to start streaming. This is done using [IBM Edge Application Manager](https://developer.ibm.com/components/ibm-edge-application-manager/). 19 | 20 | 21 | ## Architecture diagram 22 | 23 | The following diagram shows the workflow for the workplace safety hard hat detection system. 24 | 25 | ![Edge Architecture Diagram](/images/edge-objectdetection-arch-diagram.png) 26 | 27 | ## Learning Objectives 28 | In this code pattern you will learn how to: 29 | 30 | ### Part 1 31 | 32 | - Label a dataset of hard hat images with White / Blue / Yellow / Person objects using Cloud Annotations 33 | - Create an instance of Cloud Object Storage (COS) in IBM Cloud 34 | - Upload the hard hat dataset to COS 35 | - Create a Watson Machine Learning instance 36 | - Install the Cloud Annotations command line interface (cacli) 37 | - Train a TensorFlow model using Watson Machine Learning 38 | - Download the model with the cacli 39 | - Clone the object-detection-react repository 40 | - Test the model on your laptop 41 | 42 | ### Part 2 43 | 44 | - Set up a Jetson Nano with Ubuntu 45 | - Attach a USB webcam and use a 5V barrel power supply to provide sufficient power 46 | - Download the hard hat model with the cacli 47 | - Clone the object-detection-react repository 48 | - Test the model on the Jetson 49 | 50 | ### Part 3 51 | 52 | - Set up a laptop with HZN Exchange Hub Services 53 | - Install Docker on your Jetson Nano 54 | - Install the Open Horizon anax client 55 | - Register the Jetson into the Hub using the Open Horizon anax client 56 | 57 | ### Part 4 58 | 59 | - Build a Docker image which contains the model 60 | - Register the workload pattern on the Exchange server 61 | - Deploy the pattern to the edge device 62 | 63 | 64 | ## Prerequisites 65 | This tutorial can be completed using an IBM Cloud Lite account.
66 | 67 | - Create an [IBM Cloud account](https://cloud.ibm.com) 68 | - Log into [IBM Cloud](https://cloud.ibm.com/login) 69 | - Nvidia Jetson Nano 70 | - If you do not have a Jetson Nano or Jetson TX2, you can still complete Sections 1, 2, and 3 71 | 72 | ## Part 1 - Build a Hard Hat Object Detection Model 73 | 74 | ### Section 1 - Label a dataset of hard hat images with White / Blue / Yellow / Person objects 75 | 76 | This section shows you how to use Cloud Annotations to label a dataset of hard hat images with White / Blue / Yellow / Person objects. 77 | 78 | - Instructions : [Label Hard Hat Data](part1/LABEL.md) 79 | 80 | ### Section 2 - Create IBM Cloud resources 81 | 82 | This section shows you how to create an instance of Cloud Object Storage (COS) in IBM Cloud, create a COS bucket, and upload the hard hat dataset to COS. Finally, create a Watson Machine Learning instance. 83 | 84 | - Instructions : [Create IBM Cloud Resources](part1/CLOUDSETUP.md) 85 | 86 | ### Section 3 - Train and Test a Hard Hat Detection Model 87 | 88 | This section demonstrates how to install the Cloud Annotations cacli, train a TensorFlow model, download the model, set up a sample localhost web application, and test the model on your laptop. 89 | 90 | - Instructions : [Train your Hard Hat model](part1/TRAIN.md) 91 | 92 | ## Part 2 - Configure a Jetson Nano 93 | 94 | ### Section 4 - Jetson Nano setup 95 | 96 | This section shows you how to configure a Jetson Nano to run object detection. 97 | 98 | - Instructions : [Configure Nvidia Jetson Nano](part2/JETSON.md) 99 | 100 | ### Section 5 - Deploy and Test Hard Hat Object Detection 101 | 102 | This section shows you how to download and test the hard hat model.
103 | 104 | - Instructions : [Deploy / Test Model on the Jetson](part2/EDGEINFER.md) 105 | 106 | ## Part 3 - Open Horizon 107 | 108 | ### Section 6 - Open Horizon Exchange Hub 109 | 110 | This section guides you through setting up a laptop with HZN Exchange Hub Services. 111 | 112 | - Instructions : [Open Horizon Exchange Hub Services](part3/HZNHUB.md) 113 | 114 | ### Section 7 - Open Horizon client setup 115 | 116 | Learn how to install the Open Horizon anax client and register the Jetson into your hub. 117 | 118 | - Instructions : [Open Horizon agent on the Jetson](part3/HZNCLIENT.md) 119 | 120 | ## Part 4 - Containerize and Deploy the Model 121 | 122 | ### Section 8 - Containerize your model in a Docker image 123 | 124 | This section shows you how to build a Docker image that contains the model. 125 | 126 | - Instructions : [Build a Docker Image](part4/DOCKERMODEL.md) 127 | 128 | ### Section 9 - Register and deploy the Pattern to the edge 129 | 130 | This final section demonstrates how to register the workload pattern on the Exchange server and deploy the pattern to the edge device. 131 | 132 | - Instructions : [Deploy Horizon Pattern](part4/HZNDEPLOY.md) 133 | 134 | ___ 135 | [**Home**](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 136 | ___ 137 | 138 | Enjoy! Give me [feedback](https://github.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/issues) if you have suggestions on how to improve this workshop. 139 | 140 | ## License 141 | 142 | This workshop and code examples are licensed under the Apache Software License, Version 2. Separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses.
Contributions are subject to the [Developer Certificate of Origin, Version 1.1 (DCO)](https://developercertificate.org/) and the [Apache Software License, Version 2](http://www.apache.org/licenses/LICENSE-2.0.txt). 143 | 144 | [Apache Software License (ASL) FAQ](http://www.apache.org/foundation/license-faq.html#WhatDoesItMEAN) 145 | -------------------------------------------------------------------------------- /docs/css/extra.css: -------------------------------------------------------------------------------- 1 | div.col-md-9 h1:first-of-type { 2 | text-align: center; 3 | font-size: 60px; 4 | font-weight: 300; 5 | } 6 | 7 | div.col-md-9 h1:first-of-type .headerlink { 8 | display: none; 9 | } 10 | 11 | code.no-highlight { 12 | color: black; 13 | } 14 | 15 | /* Definition List styles */ 16 | 17 | dd { 18 | padding-left: 20px; 19 | } 20 | 21 | /* Center images*/ 22 | 23 | img.center { 24 | display: block; 25 | margin: 0 auto; 26 | } 27 | 28 | :root>* { 29 | --md-primary-fg-color: #a44; 30 | --md-footer-bg-color: #444; 31 | } 32 | .md-typeset .admonition.note, .md-typeset details.note { 33 | border-color: #a44; 34 | } 35 | .md-typeset .note>.admonition-title, .md-typeset .note>summary { 36 | border-color: #a44; 37 | background-color: rgba(170,68,68,.1); 38 | } 39 | 40 | .md-typeset .note>.admonition-title::before, .md-typeset .note>summary::before { 41 | background-color: #a44; 42 | } 43 | 44 | -------------------------------------------------------------------------------- /docs/css/pdf-print.css: -------------------------------------------------------------------------------- 1 | /* 2 | ** for PDF Printing 3 | */ 4 | 5 | 6 | /* fixed for `base.css` of mkdocs v1.0.4 */ 7 | 8 | code, 9 | pre code { 10 | font-family: Menlo, Monaco, Consolas, "Courier New", monospace !important; 11 | } 12 | 13 | @media print { 14 | hr { 15 | display: none; 16 | } 17 | p { 18 | font-size: inherit; 19 | } 20 | ol, 21 | ul, 22 | li { 23 | break-inside: avoid; 24 | break-before: avoid; 25 
| break-after: avoid; 26 | } 27 | } -------------------------------------------------------------------------------- /docs/part1/CLOUDSETUP.md: -------------------------------------------------------------------------------- 1 | ## Create IBM Cloud resources 2 | 3 | This section shows you how to create an instance of Cloud Object Storage (COS) in IBM Cloud, upload the hard hat dataset to a COS bucket, and create a Watson Machine Learning instance. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Create an instance of Cloud Object Storage (COS) in IBM Cloud 10 | - Upload the hard hat dataset to COS 11 | - Create a Watson Machine Learning instance 12 | 13 | ### Log in to IBM Cloud 14 | 15 | - Visit [IBM Cloud](http://cloud.ibm.com) and, if you don't yet have an account, register for a (free) IBM Cloud Lite account. 16 | ![IBM Cloud login](/images/IBM-Cloud-Login.png) 17 | 18 | ### Create a Cloud Object Storage instance 19 | 20 | - IBM Cloud will provision 25GB for free. 21 | - Visit the [IBM Cloud Catalog](https://cloud.ibm.com/catalog/services/cloud-object-storage) COS page. 22 | - Click on the **Create** button 23 | ![COS Create](/images/COS-Create.png) 24 | 25 | ### Create a Cloud Object Storage bucket 26 | 27 | - Click on the **Create bucket** button 28 | ![COS Create Bucket](/images/COS-CreateBucket.png) 29 | 30 | - Select a **Standard** predefined bucket 31 | - Name your bucket ```hardhat-detection-``` 32 | - The bucket name must be unique across the whole IBM Cloud Object Storage system 33 | ![COS Bucket name](/images/COS-BucketName.png) 34 | - Press the **Next** button 35 | - Unzip the hardhat-dataset.zip that was downloaded in the previous section. 36 | - Make certain you extract / unzip the .zip file. 37 | - Navigate to the extracted files on your computer 38 | - Drag the ```_annotations.json``` file into the bucket. 39 | - A progress bar and a confirmation will display. 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 51 | 54 | 57 |
49 | 50 | 52 | 53 | 55 | 56 |
58 | 59 | - Hold off on uploading the ```images``` folder until the next step, where you will be able to upload the entire folder. 60 | - Press the **Next** button. 61 | - On the next panel, scroll down and press the **View bucket configuration** button. 62 | - On the next panel, in the left navigation sidebar, click on **Objects** 63 | - Then, expand the **Upload** dropdown, and choose **Folders** 64 | ![COS Folder Upload](/images/COS-Upload-Folder-1.png) 65 | - In the popup dialog, click on **Select folders** 66 | - An operating system file dialog will open; navigate to and select the **images** folder that you have extracted. 67 | - A progress bar and a confirmation will display as the files are uploaded. 68 | ![COS Folder Upload](/images/COS-Upload-Folder-2.png) 69 | - Congratulations - You have uploaded the hard hat model dataset to IBM Cloud Object Storage. 70 | 71 | ### Create a Watson Machine Learning Instance 72 | 73 | - Visit the [IBM Cloud Catalog](https://cloud.ibm.com/catalog/services/machine-learning) 74 | - Search for **Watson Machine Learning** 75 | ![Catalog WML](/images/IBM-Cloud-Catalog-search-WML.png) 76 | - Click on the Machine Learning tile 77 | ![WML Service](/images/IBM-Cloud-Catalog-WML.png) 78 | - Click on the **Create** button 79 | ![WML Create](/images/IBM-Cloud-WML-Create.png) 80 | - You now have a **Watson Machine Learning** instance, which will be used to train your Hard Hat model. 81 | ![WML Instance](/images/IBM-Cloud-WML-Instance.png) 82 | 83 | *** 84 | You are now ready to train your TensorFlow hard hat model, so proceed to the next [Train Model section](TRAIN.md).
85 | -------------------------------------------------------------------------------- /docs/part1/LABEL.md: -------------------------------------------------------------------------------- 1 | ## Label a dataset of hard hat images with White / Blue / Yellow / Person objects 2 | 3 | This section shows you how to use Cloud Annotations to label a dataset of hard hat images with White / Blue / Yellow / Person objects. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Label a dataset of hard hat images with White / Blue / Yellow / Person objects using Cloud Annotations 10 | 11 | ### Hard Hat Image Data Set 12 | 13 | Create a set of videos with individuals wearing hardhats in several industrial factory settings. Make sure to include varied scenarios with different lighting conditions. If you have different colored hats that you want to recognize, you might record video of people wearing blue, white, yellow hard hats. Use a tool to capture frames at the desired time interval from the videos in your data set. Label each frame with objects that are of interest - blue, white, yellow hard hats and people. Draw bounding boxes around the objects. Repeat this step for all frames. 14 | 15 | To proceed with this code pattern example, several hundred annotated and labeled hard hat images are provided in this repository. 16 | 17 | ![Worker Safety Hard Hat Labeled Image](/images/WorkerSafety-HardHat-Labeled-Image.png) 18 | 19 | ### Cloud Annotations 20 | 21 | Cloud Annotations makes labeling images and training machine learning models easy. Whether you’ve never touched a line of code in your life or you’re a TensorFlow ninja, the [Cloud Annotations docs](https://cloud.annotations.ai/docs) will help you build what you need. 22 | 23 | The Cloud Annotations tool will help you [prepare the training data](https://cloud.annotations.ai/docs#preparing-training-data). 
To sound the alarms, our worker safety use case needs to identify **localized hard hat objects** in the frames. 24 | 25 | A classification model can tell you what an image is and how confident it is about its decision. A localized object detection model can provide you with much more information: 26 | - **Location** : The coordinates and area of where the object is in the image. 27 | - **Count** : The number of objects found in the image. 28 | - **Size** : How large the object is with respect to the image dimensions. 29 | 30 | Cloud Annotations stores the annotated labeled metadata in a file named ```_annotations.json``` and images in a directory named ```images```. Cloud Annotations stores this training set in IBM Cloud Object Storage. 31 | 32 | ### Download the Hard Hat training dataset 33 | 34 | [Download](/dataset/hardhat-dataset.zip) the Hard Hat annotated labeled metadata and images (50MB zipfile) to your system. In the next section we will upload the training data to IBM Cloud Object Storage (COS). 35 | 36 | *** 37 | You are now ready to set up your IBM Cloud resources, so proceed to the next [Cloud Setup section](CLOUDSETUP.md). 38 | -------------------------------------------------------------------------------- /docs/part1/README.md: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/docs/part1/README.md -------------------------------------------------------------------------------- /docs/part1/TRAIN.md: -------------------------------------------------------------------------------- 1 | ## Train and Test a Hard Hat Detection Model 2 | 3 | This section demonstrates how to install the Cloud Annotations cacli, train a TensorFlow model, download the model, set up a sample localhost web application, and test the model on your laptop.
4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Install the Cloud Annotations cacli 10 | - Train a TensorFlow model 11 | - Download the model with the cacli 12 | - git clone object-detection-react 13 | - Test the model on your laptop 14 | 15 | ### Cloud Annotations cacli Install 16 | 17 | Now that the Hard Hat training data set has been uploaded to IBM Cloud Object Storage, we can train our model using the Cloud Annotations command line interface, ```cacli```. 18 | 19 | The cacli is available for Linux, Mac, and Windows. Follow the [download and installation instructions](https://cloud.annotations.ai/docs#installing-the-cli). 20 | 21 | ### Cloud Annotations - Login / Train 22 | 23 | - Log in to your IBM Cloud account using ```cacli login``` and answer the OTP prompts 24 | ``` 25 | $ cacli login 26 | ``` 27 | ![cacli login](/images/cacli-login.png) 28 | 29 | - Train your hard hat object detection model 30 | ``` 31 | $ cacli train 32 | ``` 33 | - Select your ```hardhat-detection-``` Bucket.
34 | ![cacli train](/images/cacli-train1.png) 35 | - Note the training run **Model ID** 36 | ![cacli train](/images/cacli-train2.png) 37 | - List training runs 38 | ``` 39 | $ cacli list 40 | ``` 41 | ![cacli list](/images/cacli-list.png) 42 | - Monitor the progress of your training runs 43 | ``` 44 | $ cacli progress model-58sdpqyx 45 | ``` 46 | ![cacli progress](/images/cacli-progress.png) 47 | - Wait for the model training to complete (15-30 minutes) 48 | ![cacli complete](/images/cacli-complete.png) 49 | - Download the model 50 | ``` 51 | $ cacli download model-58sdpqyx 52 | success download complete 53 | ``` 54 | ![cacli download](/images/cacli-download.png) 55 | 56 | ### Set up an Object Detection web application on your local system 57 | 58 | The [Cloud Annotations github repository](https://github.com/cloud-annotations) includes an [object detection react](https://github.com/cloud-annotations/object-detection-react) web application that you can install locally. 59 | 60 | - Follow the [README.md instructions](https://github.com/cloud-annotations/object-detection-react/blob/master/README.md) 61 | - (or) Run these commands: 62 | ``` 63 | $ git clone git@github.com:cloud-annotations/object-detection-react.git 64 | $ cd object-detection-react 65 | $ npm install 66 | ``` 67 | - Copy the **model_web** directory created by ```cacli download``` and paste it into the ```public``` folder of this repo directory. On my Linux system I just create a symlink. 68 | 69 | - Start the server 70 | ``` 71 | $ npm start 72 | ``` 73 | 74 | ### Test the Hard Hat model on your laptop 75 | 76 | Now that the **object detection react** web application is running, open http://localhost:3000 to view it in the browser. Share your laptop camera with the browser tab. 77 | 78 | Find your hard hat and observe the hard hat model prediction.
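The symlink approach mentioned in the setup steps above can be sketched as follows. This is illustrative only — `MODEL_DIR` and `APP_DIR` are assumed locations, so point them at wherever `cacli download` placed `model_web` and at your `object-detection-react` checkout:

```shell
# Link the downloaded model into the web app's public folder.
# MODEL_DIR and APP_DIR are assumptions -- adjust them to your system.
MODEL_DIR="${MODEL_DIR:-$PWD/model_web}"
APP_DIR="${APP_DIR:-$PWD/object-detection-react}"
mkdir -p "$MODEL_DIR" "$APP_DIR/public"   # no-ops if they already exist
ln -sfn "$MODEL_DIR" "$APP_DIR/public/model_web"
ls -ld "$APP_DIR/public/model_web"        # should show the symlink
```

Copying the directory works just as well; the symlink simply saves re-copying the files after each retraining.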
79 | 80 | ![author with a hardhat](/images/Sample-HardHat-Detection-Author.png) 81 | 82 | ### Congratulations 83 | 84 | You are now ready to build your Node-RED dashboard, so proceed to the next [Node-RED section](/part2/). 85 | -------------------------------------------------------------------------------- /docs/part2/README.md: -------------------------------------------------------------------------------- 1 | # Introducing Node-RED 2 | 3 | Node-RED is a programming tool for building event-driven applications. It takes 4 | a low-code approach - which means rather than writing lots of code, you build 5 | applications by using its visual editor to create flows of nodes that describe 6 | what should happen when particular events occur. 7 | 8 | For example, the `HTTP In` node can be configured to listen on a particular path. 9 | When an HTTP request arrives on that path, the node is triggered. It generates 10 | a message containing information about the request and passes it on to the nodes 11 | it is wired to. They in turn do whatever work they need to do using the message, 12 | such as generating HTML content and adding it to the message before passing it 13 | on through the flow. In this example, the flow ends with an `HTTP Response` 14 | node which responds to the original HTTP request using the information in the 15 | message. 16 |
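As a concrete illustration of the flow just described, here is a simplified sketch of what such a flow looks like when exported from the editor as JSON. Real exports include extra properties (tab ids, x/y coordinates, names); the node ids and the `/hello` path here are invented for the example:

```json
[
  { "id": "a1", "type": "http in", "url": "/hello", "method": "get", "wires": [["b1"]] },
  { "id": "b1", "type": "template", "field": "payload", "template": "<h1>Hello from Node-RED</h1>", "wires": [["c1"]] },
  { "id": "c1", "type": "http response", "wires": [] }
]
```

Each node lists, in `wires`, the ids of the nodes its output is connected to — importing something like this through the editor's Import dialog and browsing to `/hello` would exercise the request/response path end to end.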
17 | 18 | ## Next Steps 19 | 20 | In this part of the tutorial you will: 21 | 22 | - [Install Node-RED](install.md) 23 | - [Install extra nodes into the palette](installing-nodes.md) 24 | - [Create your initial Hard Hat Detection Dashboard](create-dashboard.md) 25 | - [Add TensorFlow to the Hard Hat Detection dashboard](adding-tf.md) 26 | - [Display detected objects on the dashboard](display-objects.md) 27 | -------------------------------------------------------------------------------- /docs/part2/adding-tf.md: -------------------------------------------------------------------------------- 1 | # TensorFlow in Node-RED 2 | 3 | 4 | ## Installing TensorFlow nodes 5 | 6 | For this workshop, we're going to use the `node-red-contrib-cloud-annotations` 7 | module that provides the `tf model` node. 8 | 9 | This module can be installed from the Manage Palette option in the editor, or by 10 | running the following command in `~/.node-red`: 11 | 12 | ``` 13 | npm install node-red-contrib-cloud-annotations 14 | ``` 15 | 16 | This will install the module and the TensorFlow library it depends on. 17 | 18 | 19 | ## Connecting TensorFlow to the WebCam 20 | 21 | In this part, we'll set up the TensorFlow node to receive images from the WebCam. 22 | 23 | 1. Add an instance of the `tf coco ssd` node from the "analysis" category of the 24 | palette into your workspace. 25 | 2. Wire the output of the WebCam node to the input of the tf node. 26 | 3. Make sure the WebCam node is configured to capture `jpeg` images - we said we'd 27 | remind you about this. 28 | 4. Add a Debug node and connect it to the output of the tf node. 29 | 5. Click the Deploy button to save your changes. 30 | 31 | !!! note "Muting Debug nodes" 32 | In this screenshot you can see I have muted the Debug node attached to the 33 | webcam node by clicking its button in the workspace. This can be useful 34 | to turn off different bits of Debug without unwiring or removing the nodes 35 | entirely.
36 | 37 | ![](../images/flow-tf.png) 38 | 39 | On the dashboard, make sure your webcam can see you and click the capture button. 40 | Switch back to the Node-RED editor and open the Debug sidebar panel. You should see 41 | the message sent by the tf node. Its payload consists of a list of the objects 42 | it has detected. 43 | 44 | Each entry in the list has: 45 | 46 | - `class` - the type of object 47 | - `score` - the confidence level of the detection, from `0` to `1`. 48 | - `bbox` - an array giving the corners of the bounding box surrounding the detected object 49 | 50 | 51 | ![](../images/flow-capture-debug.png) 52 | 53 | !!! note "`msg.payload` format" 54 | Each of the TensorFlow nodes uses a slightly different object format. For 55 | example, the `node-red-contrib-tf-*` nodes set the `className` property 56 | rather than `class` as we have here. If you experiment with the other nodes, 57 | make sure you read their documentation and use the Debug node to understand 58 | their message format. 59 | 60 | 61 | ## Next Steps 62 | 63 | With TensorFlow integrated into the dashboard, the next task is to 64 | [display the detected objects on the dashboard](display-objects.md). 65 | -------------------------------------------------------------------------------- /docs/part2/create-dashboard.md: -------------------------------------------------------------------------------- 1 | # Create a dashboard 2 | 3 | In this part of the workshop you will begin to create your Hard Hat Detection application. 4 | 5 | ## Dashboard Layouts 6 | 7 | The Dashboard uses a grid based layout. Widgets, such as buttons or text boxes, are 8 | given a size in grid-cells. They are packed into a group that has a defined width. 9 | The Groups are then placed on a Tab - laid out using a flow-based layout, filling 10 | the width of the page before wrapping. This arrangement provides some flexibility 11 | in how the page displays on different screen sizes.
12 | 13 | ## Dashboard Sidebar 14 | 15 | Within the editor, the Dashboard module provides a sidebar where you can customize 16 | its features as well as manage its tabs, groups and widgets. 17 | 18 | 1. Open the Dashboard sidebar 19 | 2. Click the `+ tab` button to create a new tab. 20 | 3. Hover over the newly added tab and click the `edit` button. 21 | 22 | ![](../images/db-sidebar.png){: style="width: 250px"} 23 | 24 | 4. In the edit dialog, give the tab a name of `AI Photo Booth` and click Update 25 | to close the dialog. 26 | 5. Hover over the tab again and click the `+ group` button. 27 | 6. Edit the new group and set its properties: 28 | 29 | ![](../images/db-group.png){: style="width: 250px"} 30 | 31 | - Set the name to 'WebCam' 32 | - Set the width to `10` by clicking the button and dragging the box out to 10 33 | units wide. 34 | - Untick the 'Display group name' option 35 | 36 | ![](../images/db-group-edit.png){: style="width: 300px"} 37 | 38 | This has created the initial layout components needed for the dashboard. You can 39 | now start to add content. 40 | 41 | 42 | 43 | 44 | ## Next Steps 45 | 46 | The next task is to [add TensorFlow to the dashboard](adding-tf.md). 47 | -------------------------------------------------------------------------------- /docs/part2/display-objects.md: -------------------------------------------------------------------------------- 1 | # Displaying the detected objects 2 | 3 | In this part we're going to display the detected objects on the dashboard in two 4 | different ways. 5 | 6 | First we will display an annotated version of the captured image with all of the 7 | objects highlighted. We will then add a table to the dashboard that lists them out. 8 | 9 | ## Displaying an annotated image 10 | 11 | The `tf coco ssd` node has an option to output an annotated version of the image 12 | with all of the detected objects highlighted. The image is set on the `msg.image` 13 | message property. 14 | 15 | 1. 
Edit the `tf` node and configure the "Passthru" field to `Annotated Image` 16 | 17 | ![](../images/tf-passthru.png){: style="width:400px;"} 18 | 19 | 2. Add a Change node, wired to the output of the `tf` node, and configure it to move 20 | `msg.image` to `msg.payload`. 21 | 22 | ![](../images/change-img-payload.png){:style="width:400px"} 23 | 24 | 3. Wire the Change node to the input of the WebCam node. 25 | 4. Click the Deploy button to save your changes. 26 | 27 | ![](../images/flow-tf-to-webcam.png) 28 | 29 | !!! note "Laying out flows" 30 | With this latest addition, you can see we now have wires crossing each other 31 | and looping back on themselves. As flows evolve, their wiring can become 32 | quite complex. It is always worth spending some time trying to find a layout 33 | that remains 'readable'. 34 | 35 | There is a [Flow Developer guide](https://nodered.org/docs/developing-flows/) 36 | in the Node-RED documentation that provides a number of tips on how to lay out flows. 37 | 38 | Now when you take an image on the dashboard, you should see the annotated version 39 | of the image. 40 | 41 | ![](../images/booth-display-all.png) 42 | 43 | 44 | ## Adding a table of objects 45 | 46 | Install the module `node-red-node-ui-table` using the Manage Palette option in the 47 | editor, or by running the following command in `~/.node-red`: 48 | 49 | ``` 50 | npm install node-red-node-ui-table 51 | ``` 52 | 53 | This adds the `ui_table` node to the palette which can be used to display tabular 54 | data. 55 | 56 | 1. In the Dashboard sidebar of the Node-RED editor, hover over the `AI Photo Booth` 57 | tab and click the `+ group` button. 58 | 2. Edit the new group and set its properties: 59 | - Set the name to 'Objects' 60 | - Set the width to `6` by clicking the button and dragging the box out to 6 61 | units wide. 62 | - Untick the 'Display group name' option 63 | 3. Add a new `ui_table` node from the "dashboard" section of the palette into your 64 | workspace.
Edit its properties as follows: 65 | - Add it to the 'Objects' group 66 | - Set its size to `6x8` 67 | - Add two columns by clicking the `+ add` button at the bottom. Configure them as: 68 | - Property: `class`, Title: `Object Type` 69 | - Property: `score`, Title: `Score`, Format: `Progress (0-100)` 70 | 71 | ![](../images/table-config.png){:style="width:400px"} 72 | 73 | 74 | 4. Add a Change node to the workspace. Configure it to set `msg.payload` to the 75 | expression `$append([],payload.{"class":class,"score":score*100,"bbox":bbox})` 76 | 77 | !!! note 78 | Make sure you select the `expression` type for the `to` field of the Change node. 79 | This uses the [JSONata](https://jsonata.org){:target="blank"} expression language. 80 | 81 | 5. Create the following wires between nodes: 82 | - wire the output of the `tf` node to the Change node. 83 | - wire the output of the Change node to the Table node 84 | 6. Click the Deploy button to save your changes. 85 | 86 | Now when you capture an image on the dashboard, the table should list the detected 87 | objects. 88 | 89 | ![](../images/flow-table.png) 90 | 91 | ![](../images/booth-display-table.png) 92 | 93 | 94 | !!! idea "Side Quest - Star Ratings" 95 | The JSONata expression used in the Change node mapped the `score` property 96 | of each detected object from its original `0-1` range to the `0-100` range 97 | expected by the `ui_table` node's "Progress" column type. 98 | 99 | The table supports a number of other formats of displaying numeric values. 100 | For example, it can map a number in the 0-100 range to a traffic light colour. 101 | It can also display a value in the range 0-5 as a number of stars. 102 | 103 | Edit the table node to display the score using the star format. See if you can 104 | modify the expression in the Change node to map the original score to the required 105 | `0-5` range. 106 | 107 | 108 | !!! 
idea "Side Quest - Clear the table" 109 | With the current dashboard, when an image is captured it gets displayed in 110 | place of the live web cam view until the clear button is clicked. 111 | 112 | However, clicking the button does not clear the table we've just added. 113 | 114 | Using what you've learnt so far, build a flow between the Clear button and 115 | the table node that will clear the table when the button is clicked. 116 | 117 | *Hint:* think about what payload must be passed to the table in order to clear it. 118 | 119 | 120 | 121 | ## Next Steps 122 | 123 | With the list of detected objects on the dashboard, you are ready to move on to 124 | [setting up the Jetson Nano](../part3/README.md). 125 | 126 | -------------------------------------------------------------------------------- /docs/part2/install.md: -------------------------------------------------------------------------------- 1 | # Installing Node-RED 2 | 3 | Node-RED is published as a Node.js module available on npm, as well as a container 4 | available on Docker Hub. 5 | 6 | The full guide for installing and running Node-RED is available [here](https://nodered.org/docs/getting-started/local). 7 | 8 | !!! note "Linux" 9 | The following steps assume you are running on Windows or OSX. If you are running 10 | on a Linux OS, or a device like a Raspberry Pi, the project provides a set of 11 | install scripts that will get node, npm and Node-RED all installed at the latest 12 | stable versions. Refer to the docs linked above. 13 | 14 | You must have a supported version of Node.js installed. Node-RED supports 15 | the Active and LTS releases, 12.x and 14.x. 16 | 17 | You can then install Node-RED as a global module with the command: 18 | 19 | ``` 20 | npm install -g --unsafe-perm node-red 21 | ``` 22 | 23 | Depending on your Node.js installation, you may need to run this command using `sudo`.
24 | 25 | The install log output may contain some warnings - these can be ignored as long 26 | as the output ends with something like: 27 | 28 | 29 | ``` 30 | + node-red@1.2.2 31 | added 332 packages from 341 contributors in 18.494s 32 | ``` 33 | 34 | ## Running Node-RED 35 | 36 | Once installed, you should now have the `node-red` command available to run. 37 | 38 | !!! note "Command not found" 39 | If you do not have the `node-red` command available it may be a problem 40 | with your `PATH` configuration. 41 | 42 | Find where your global node modules are installed by running: 43 | 44 | npm get prefix 45 | 46 | Then ensure the `bin` subdirectory of that location is on your `PATH`. 47 | 48 | When you run `node-red`, the log output will appear 49 | 50 | ``` 51 | 23 Oct 00:12:01 - [info] 52 | 53 | Welcome to Node-RED 54 | =================== 55 | 56 | 23 Oct 00:12:01 - [info] Node-RED version: v1.2.2 57 | 23 Oct 00:12:01 - [info] Node.js version: v12.19.0 58 | 23 Oct 00:12:01 - [info] Darwin 18.7.0 x64 LE 59 | 23 Oct 00:12:01 - [info] Loading palette nodes 60 | 23 Oct 00:12:03 - [info] Settings file : /Users/nol/.node-red/settings.js 61 | 23 Oct 00:12:03 - [info] User directory : /Users/nol/.node-red 62 | 23 Oct 00:12:03 - [info] Server now running at http://127.0.0.1:1880/ 63 | 23 Oct 00:12:03 - [info] Flows file : /Users/nol/.node-red/flows.json 64 | 23 Oct 00:12:03 - [info] Starting flows 65 | 23 Oct 00:12:03 - [info] Started flows 66 | ``` 67 | 68 | This output contains an important piece of information you will need - 69 | the location of your `User directory`. 70 | 71 | 72 | ## Accessing the Node-RED editor 73 | 74 | Assuming you are running Node-RED on your local computer, open a browser and access 75 | the url [http://127.0.0.1:1880/](http://127.0.0.1:1880/){: target="blank"}. This will load the 76 | Node-RED editor - the tool used to build your applications. 
77 | 78 | 79 | 80 | ## Next Steps 81 | 82 | The next task is to [Install extra nodes into the palette](installing-nodes.md). 83 | -------------------------------------------------------------------------------- /docs/part2/installing-nodes.md: -------------------------------------------------------------------------------- 1 | # Installing Nodes 2 | 3 | The building blocks of any Node-RED application are the nodes in its palette. 4 | 5 | Node-RED comes with a number of core nodes that provide the basic components, 6 | but the palette can be easily extended by installing additional nodes. 7 | 8 | Nodes are published as npm modules and the project provides an online catalogue 9 | of them at [https://flows.nodered.org](https://flows.nodered.org). 10 | 11 | 12 | There are two ways to install nodes - via the command-line or from within the 13 | Node-RED editor. 14 | 15 | ### Node-RED Palette Manager 16 | 17 | To install a node from within the editor, select the Manage Palette option from 18 | the main menu. 19 | 20 | This opens the Palette Manager which shows two tabs - a list of the modules you 21 | have installed and a searchable catalogue of modules available to install. 22 | 23 | Switch to the Install tab and search for `random` - you should see `node-red-node-random` in the list below. Click the `install` button next to it. 24 | 25 | ![](../images/palette-install.png){: style="width:350px"} 26 | 27 | After a short time the node will be installed and added to the palette. 28 | 29 | ### Command-line 30 | 31 | To install on the command-line, switch to the Node-RED user directory and run the appropriate `npm install` command. For example: 32 | 33 | ``` 34 | npm install node-red-node-random 35 | ``` 36 | 37 | !!! note "Node-RED User Directory" 38 | By default, Node-RED creates a directory called `.node-red` in the user's 39 | home directory. As it starts with a `.` it may be hidden from view by your file 40 | browser. 
41 | 42 | As mentioned in the [Install](install.md) section, Node-RED logs the full 43 | path to the user directory when it starts up. If in doubt, check what it says. 44 | 45 | !!! note 46 | Some nodes will have external dependencies that cannot be automatically 47 | installed by Node-RED or npm. You should always check a module's readme 48 | for further information. This will be particularly true of some of the 49 | TensorFlow nodes we'll be using later in this workshop. 50 | 51 | ## Next Steps 52 | 53 | The next task is to [create your initial Hard Hat Detection Dashboard](create-dashboard.md). 54 | 55 | -------------------------------------------------------------------------------- /docs/part3/EDGEINFER.md: -------------------------------------------------------------------------------- 1 | ## Download and Test the Hard Hat Detection Model 2 | 3 | This section shows you how to download and test the hard hat model on your Jetson Nano. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Download the hard hat model with the cacli 10 | - git clone object-detection-react 11 | - Test the model on the Jetson 12 | 13 | ### Inferencing on the Jetson 14 | 15 | In a previous section you uploaded hard hat image training data and trained an object detection model using the Cloud Annotations command line interface, cacli, TensorFlow, and Watson Machine Learning. You also downloaded the model and tested it on your laptop using a browser and your laptop web camera. 16 | 17 | This section will repeat some of those steps on the Jetson Nano. 18 | 19 | ### Cloud Annotations cacli Install 20 | 21 | Our model is already trained, so we can use the Cloud Annotations command line interface, ```cacli```, to download it. 22 | 23 | The cacli is available for Linux, Mac, and Windows. 
Follow the [download and installation instructions](https://cloud.annotations.ai/docs#installing-the-cli) 24 | 25 | ### Cloud Annotations - Login / List Models 26 | 27 | - Log in to your IBM Cloud account using ```cacli login``` and answer the OTP prompts 28 | ``` 29 | $ cacli login 30 | ``` 31 | ![cacli login](/images/cacli-login.png) 32 | 33 | - List training runs and trained models 34 | ``` 35 | $ cacli list 36 | ``` 37 | ![cacli list](/images/cacli-list.png) 38 | 39 | ### Download your hard hat object detection model 40 | 41 | - Download the model 42 | ``` 43 | $ cacli download model-58sdpqyx 44 | success download complete 45 | ``` 46 | ![cacli download](/images/cacli-download.png) 47 | 48 | ### Install Node.js 49 | 50 | Before we can run the Object Detection web application on the Jetson Nano, run these commands to install Node.js: 51 | 52 | ```bash 53 | $ curl -sL https://deb.nodesource.com/setup_12.x | sudo -E bash - 54 | $ sudo apt-get install -y nodejs 55 | ``` 56 | 57 | ### Set up an Object Detection web application on your Jetson Nano 58 | 59 | The [Cloud Annotations github repository](https://github.com/cloud-annotations) includes an [object detection react](https://github.com/cloud-annotations/object-detection-react) web application that you can install locally. 60 | 61 | - Follow the [README.md instructions](https://github.com/cloud-annotations/object-detection-react/blob/master/README.md) 62 | - (or) Run these commands: 63 | ``` 64 | $ git clone git@github.com:cloud-annotations/object-detection-react.git 65 | $ cd object-detection-react 66 | $ npm install 67 | ``` 68 | - Move the **model_web** directory created by ```cacli download``` into the ```public``` folder of this repo directory. On the Ubuntu Linux system I just create a symlink.
69 | 70 | - Start the server 71 | ``` 72 | $ npm start 73 | ``` 74 | 75 | ### Test the Hard Hat model on your Jetson Nano 76 | 77 | Now that the **object detection react** web application is running, launch Chromium on your Jetson Nano and open http://localhost:3000 to view it in the browser. Share your Jetson web camera with the browser tab. 78 | 79 | Find your hard hat and observe the hard hat model prediction running on the Jetson! 80 | 81 | ![author with a hardhat](/images/Jetson-HardHat-Detection-Author.png) 82 | 83 | ### Congratulations - Object Detection on the Edge! 84 | 85 | You are now ready to set up Open Horizon Exchange Hub Services, so proceed to the next [Horizon Hub Setup section](/part4/). 86 | -------------------------------------------------------------------------------- /docs/part3/JETSON.md: -------------------------------------------------------------------------------- 1 | ## Jetson Nano setup 2 | 3 | This section shows you how to configure a Jetson Nano to run object detection. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Set up a Jetson Nano with Ubuntu 10 | - Attach a USB webcam, and use a 5V barrel power supply to provide sufficient power 11 | 12 | ### Set up a Jetson Nano Developer Kit 13 | 14 | Nvidia provides thorough setup [documentation on the Jetson Nano Developer Kit](https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit), so this guide will not attempt to reproduce those steps. In addition, there are [numerous Internet resources](https://www.hackster.io/news/getting-started-with-the-nvidia-jetson-nano-developer-kit-43aa7c298797) that walk through the configuration and setup.
15 | 16 | ![Jetson Nano hardware picture](https://developer.nvidia.com/sites/default/files/akamai/embedded/images/jetsonNano/gettingStarted/jetson-nano-dev-kit-top-r6-HR-B01.png) 17 | 18 | This tutorial will emphasize several parts of those guides that can help ensure a successful implementation of the Worker Safety and Hard Hat Detection model. 19 | 20 | #### 5V Barrel Jack Power Adapter 21 | 22 | The Jetson Nano can be powered by a Micro-USB 5V 2A power supply, but the camera and GPU require additional power to operate. Avoid the frustration of indeterminate results and switch to a 5V barrel jack power supply (4A). Closing the J48 jumper with a standard 2.54mm pitch jumper will switch the power from the micro USB jack to the barrel jack. Follow the [recommended instructions](https://forums.developer.nvidia.com/t/power-supply-considerations-for-jetson-nano-developer-kit/71637) to close jumper J48 on the Nano PCB. 23 | 24 | Once you are powering the Jetson Nano with a 5V barrel jack power supply, you can enable the 'maximum performance mode' and the system and inferencing should run faster. 25 | 26 | ```bash 27 | sudo nvpmodel -m 0 28 | ``` 29 | 30 | ### Attach a USB web camera 31 | 32 | Object Detection does not require the highest quality camera. Most images will get scaled down. 33 | 34 | ### Internet Connectivity 35 | 36 | You will need an ethernet cable (the Jetson Nano developer kit does not include WiFi) or a WiFi USB dongle. 37 | 38 | ### Now the fun begins! 39 | 40 | You are now ready to run the hard hat model on the Jetson edge device, so proceed to the next [Edge Inferencing section](EDGEINFER.md). 41 | -------------------------------------------------------------------------------- /docs/part3/README.md: -------------------------------------------------------------------------------- 1 | # Edge Computing Worker Safety Hard Hat Detection - Jetson Nano setup 2 | 3 | This section shows you how to configure a Jetson Nano to run object detection.
4 | 5 | ## Next Steps 6 | 7 | In this part of the workshop you will: 8 | 9 | - Set up a [Jetson Nano](JETSON.md) with Ubuntu 10 | - Use a 5V barrel power supply to provide sufficient power 11 | - Download and run the Docker container 12 | - Test the model on the Jetson 13 | -------------------------------------------------------------------------------- /docs/part4/HZNCLIENT.md: -------------------------------------------------------------------------------- 1 | ## Deploy and Test Hard Hat Object Detection 2 | 3 | Learn how to install the Open Horizon HZN anax client and register the Jetson HZN anax client with your hub. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Install Docker on your Jetson Nano 10 | - Install the Open Horizon HZN anax client 11 | - Register the Jetson HZN anax client with the Hub 12 | 13 | ### Next Steps 14 | 15 | You are now ready to containerize the hard hat model in a Docker image, so proceed to the next [Docker Model section](/part5). 16 | -------------------------------------------------------------------------------- /docs/part4/HZNHUB.md: -------------------------------------------------------------------------------- 1 | ## Open Horizon Exchange Hub Set Up 2 | 3 | This section guides you through the setup of a laptop with HZN Exchange Hub Services. 4 | 5 | ### Lab Objectives 6 | 7 | In this lab you will learn how to: 8 | 9 | - Set up a laptop with Horizon Exchange Hub Services 10 | 11 | ### Next Steps 12 | 13 | You are now ready to set up your Open Horizon anax client on the Jetson Nano, so proceed to the next [Horizon Client section](HZNCLIENT.md).
14 | -------------------------------------------------------------------------------- /docs/part4/README.md: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/docs/part4/README.md -------------------------------------------------------------------------------- /docs/part5/README.md: -------------------------------------------------------------------------------- 1 | # Containerization 2 | 3 | ## Why create a container? 4 | 5 | Up to this point in the workshop, you've been developing an application using 6 | Node-RED on your local device. 7 | 8 | The fact the editor and runtime are bundled together makes it very convenient 9 | to quickly start building applications. 10 | 11 | But that model is less suitable when you think about creating applications that 12 | run in production, or that need to be distributed to remote devices. You don't 13 | want to be using the editor to edit the application directly - you want to be 14 | able to develop and test your application locally and then have a controlled 15 | way to deploy it into your production environment. 16 | 17 | In this part of the workshop, we're going to step through the process of 18 | wrapping the application as a Docker container. Once the container has been 19 | created, it can be deployed just as you would any other container - pushing it 20 | to a cloud environment or down to edge devices. 21 | 22 | ## Next Steps 23 | 24 | At the start of this workshop, you enabled the Projects feature. That gave 25 | you a git repository you can use to manage your application. 26 | 27 | In this section we are going to make some updates to the project files to help 28 | create a deployable container. 
29 | 
30 | This involves:
31 | 
32 | - [Updating the project's `package.json` file](package.md)
33 | - [Adding a `settings.js` file](settings.md)
34 | - [Adding a `Dockerfile`](dockerfile.md)
35 | 
36 | 
37 | 
38 | 
39 | 
40 | 
41 | 
42 | 
-------------------------------------------------------------------------------- /docs/part5/dockerfile.md: --------------------------------------------------------------------------------
1 | # Add a `Dockerfile`
2 | 
3 | The final task is to add a Dockerfile and to build the container.
4 | 
5 | Create the file `~/.node-red/projects//Dockerfile` with the following
6 | contents:
7 | 
8 | ```
9 | FROM node:lts as build
10 | 
11 | RUN apt-get update \
12 |     && apt-get install -y build-essential
13 | 
14 | WORKDIR /data
15 | 
16 | COPY ./package.json /data/
17 | 
18 | RUN npm install
19 | 
20 | COPY ./settings.js /data/
21 | COPY ./flows.json /data/
22 | COPY ./flows_cred.json /data/
23 | 
24 | ## Release image
25 | 
26 | FROM node:lts-slim
27 | RUN apt-get update
28 | 
29 | RUN mkdir -p /data
30 | 
31 | COPY --from=build /data /data
32 | 
33 | WORKDIR /data
34 | 
35 | ENV PORT 1880
36 | ENV NODE_ENV=production
37 | ENV NODE_PATH=/data/node_modules
38 | EXPOSE 1880
39 | CMD ["npm", "start"]
40 | ```
41 | 
42 | This Dockerfile has two parts. It first creates a build image using the latest
43 | `node:lts` image. It installs the build tools, copies in the project's `package.json`,
44 | installs all of the node modules, and then copies in the remaining project
45 | files.
46 | 
47 | After that, it creates the real image using the `node:lts-slim` image - a smaller
48 | base image. It copies over the required parts from the build image, sets up some
49 | default environment variables and then runs Node-RED.
50 | 
51 | ## Building the image
52 | 
53 | To build the image, run the following command from the `~/.node-red/projects//`
54 | directory:
55 | 
56 | ```
57 | docker build .
-t node-red-photobooth
58 | ```
59 | 
60 | This will take a few minutes the first time you run it as it will have to download
61 | the base images. Subsequent runs will be quicker as those downloads are cached.
62 | 
63 | ## Running the image locally
64 | 
65 | Once built, you can test the image locally by running:
66 | 
67 | ```
68 | docker run -p 9000:1880 --name photobooth node-red-photobooth
69 | ```
70 | 
71 | Once that runs, you will be able to open `http://localhost:9000` to access the
72 | photo booth dashboard.
73 | 
74 | 
75 | !!! note "Cleaning up Docker"
76 |     To stop the running container, you can run the command:
77 | 
78 |     ```
79 |     docker stop photobooth
80 |     ```
81 | 
82 |     To delete the container, run:
83 | 
84 |     ```
85 |     docker rm photobooth
86 |     ```
87 | 
88 |     To delete the image, run:
89 | 
90 |     ```
91 |     docker rmi node-red-photobooth
92 |     ```
93 | 
94 | 
-------------------------------------------------------------------------------- /docs/part5/package.md: --------------------------------------------------------------------------------
1 | # Update `package.json`
2 | 
3 | ## Adding dependencies
4 | 
5 | Through the workshop you've installed a number of additional modules into
6 | Node-RED. We need to make sure they are listed in the project's `package.json`
7 | file so they will get installed into the container.
8 | 
9 | 1. Open the Project Settings dialog from the main menu in the editor (`Projects -> Project Settings`)
10 | 2. Switch to the Dependencies tab. You will see a list of the additional modules
11 | that are *used by the project*. Click the 'add to project' button next to each one
12 | and then close the dialog.
13 | 
14 | ![](../images/project-dependencies.png){:style="width: 400px"}
15 | 
16 | 
17 | That will have updated the `package.json` file for you. However, for this
18 | scenario there's one more dependency we need to add manually - `node-red`
19 | itself.
20 | 
21 | The `package.json` file can be found in `~/.node-red/projects//package.json`.
Open that file in a text editor and make the following changes:
22 | 
23 | 1. Add `node-red` to the dependencies section - this will ensure Node-RED gets installed when the container is
24 |    built.
25 | 
26 |         "dependencies": {
27 |             "node-red": "1.x",
28 |             ...
29 |         },
30 | 
31 | 2. Add a `scripts` section to define a `start` command - this is how the
32 |    container will run Node-RED:
33 | 
34 |         "scripts": {
35 |             "start": "node --max-old-space-size=256 ./node_modules/node-red/red.js --userDir . --settings ./settings.js flows.json"
36 |         }
37 | 
38 |     Let's take a closer look at the start command:
39 | 
40 |         node
41 |             --max-old-space-size=256 (a)
42 |             ./node_modules/node-red/red.js (b)
43 |             --userDir . (c)
44 |             --settings ./settings.js (d)
45 |             flows.json (e)
46 | 
47 | 
48 |     1. This argument is used to tell node when it should start garbage collecting.
49 |     2. With `node-red` listed as an npm dependency of the project, we know exactly where it will get installed and where the `red.js` main entry point is.
50 |     3. We want Node-RED to use the current directory as its user directory.
51 |     4. Just to be sure, we point at the settings file it should use - something we’ll add in the next step.
52 |     5. Finally we specify the flow file to use. If you picked a different flow file name at the start, make sure you use the right name here.
53 | 
54 | 3. Having made those changes, restart Node-RED and reload the editor in your browser.
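If you want to sanity-check the two edits before building the container, a few lines of Python can verify them. The JSON below is inlined purely for illustration; point `json.load` at your real `package.json` instead:

```python
import json

# Inlined stand-in for ~/.node-red/projects/<project>/package.json
pkg = json.loads("""
{
  "dependencies": { "node-red": "1.x" },
  "scripts": { "start": "node --max-old-space-size=256 ./node_modules/node-red/red.js --userDir . --settings ./settings.js flows.json" }
}
""")

# The container build needs node-red itself listed as a dependency,
# and an npm "start" script so the container's CMD can launch Node-RED.
assert "node-red" in pkg.get("dependencies", {}), "add node-red to dependencies"
assert pkg.get("scripts", {}).get("start", "").startswith("node "), "add a start script"
print("package.json has both additions")
```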
55 | 
56 | 
57 | Your complete `package.json` file should look something like this:
58 | 
59 | ```
60 | {
61 |     "name": "photobooth-workshop",
62 |     "description": "A Node-RED Project",
63 |     "version": "0.0.1",
64 |     "dependencies": {
65 |         "node-red": "1.x",
66 |         "node-red-contrib-tfjs-coco-ssd": "0.5.2",
67 |         "node-red-dashboard": "2.23.5",
68 |         "node-red-node-annotate-image": "0.1.0",
69 |         "node-red-node-ui-table": "0.3.7",
70 |         "node-red-node-ui-webcam": "0.2.1"
71 |     },
72 |     "node-red": {
73 |         "settings": {
74 |             "flowFile": "flows.json",
75 |             "credentialsFile": "flows_cred.json"
76 |         }
77 |     },
78 |     "scripts": {
79 |         "start": "node --max-old-space-size=256 ./node_modules/node-red/red.js --userDir . --settings ./settings.js flows.json"
80 |     }
81 | }
82 | ```
83 | 
84 | 
85 | ## Next Steps
86 | 
87 | With the `package.json` file updated, the next task is to [add a `settings.js` file](settings.md).
88 | 
-------------------------------------------------------------------------------- /docs/part5/settings.md: --------------------------------------------------------------------------------
1 | # Add a `settings.js` file
2 | 
3 | You are already familiar with the Node-RED `settings.js` file you had to edit
4 | in the earlier part of the workshop.
5 | 
6 | The containerized version of your application will need its own settings file
7 | to use.
8 | 
9 | Create the file `~/.node-red/projects//settings.js` with the following
10 | contents:
11 | 
12 | ```
13 | module.exports = {
14 |     uiPort: process.env.PORT || 1880,
15 |     credentialSecret: process.env.NODE_RED_CREDENTIAL_SECRET,
16 |     httpAdminRoot: false,
17 |     ui: { path: "/" }
18 | }
19 | ```
20 | 
21 | - Setting `httpAdminRoot` to `false` will disable the Node-RED editor entirely - we
22 | do not want the flows to be edited directly in our production environment.
23 | - `credentialSecret` is how you can provide the key for decrypting your credentials
24 | file.
Rather than hardcode the key in the file - which is a bad idea - we set
25 | it using the environment variable `NODE_RED_CREDENTIAL_SECRET`.
26 | - Having disabled the editor, the `ui` setting moves the root URL of the dashboard
27 | page back to `/` rather than its default `/ui`.
28 | 
29 | ## Next Steps
30 | 
31 | The final task is to [add a `Dockerfile`](dockerfile.md).
-------------------------------------------------------------------------------- /docs/part6/HZNDEPLOY.md: --------------------------------------------------------------------------------
1 | *Quick links :*
2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [**Horizon Deploy**](/part4/HZNDEPLOY.md)
3 | ***
4 | 
5 | # Edge Computing Worker Safety Hard Hat Detection - Section 9
6 | 
7 | ## Register and deploy the Pattern to the edge
8 | 
9 | This final section demonstrates how to register the workload pattern on the Exchange server and deploy the pattern to the edge device.
10 | 
11 | ### Lab Objectives
12 | 
13 | In this lab you will learn how to:
14 | 
15 | - Register the workload pattern on the Exchange server
16 | - Deploy the pattern to the edge device
17 | 
18 | ***
19 | 
20 | Congratulations - You have completed this Code Pattern.
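For reference, the deployment pattern registered with `hzn exchange pattern publish -f pattern.json` is a small JSON document. A minimal sketch follows; the service name, org variable, and version are illustrative placeholders, not the exact pattern this code pattern publishes:

```json
{
  "label": "Worker Safety Hard Hat Detection",
  "public": false,
  "services": [
    {
      "serviceUrl": "hardhat-detection",
      "serviceOrgid": "$HZN_ORG_ID",
      "serviceArch": "arm64",
      "serviceVersions": [ { "version": "1.0.0" } ]
    }
  ]
}
```

On the edge device, registering against a pattern (for example with `hzn register -p <pattern-name>`) is what causes the agent to pull down and run the services the pattern references.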
21 | 22 | *** 23 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [**Horizon Deploy**](/part4/HZNDEPLOY.md) 24 | *** 25 | -------------------------------------------------------------------------------- /docs/part6/README.md: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/docs/part6/README.md -------------------------------------------------------------------------------- /images/COS-BucketName.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-BucketName.png -------------------------------------------------------------------------------- /images/COS-Create.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Create.png -------------------------------------------------------------------------------- /images/COS-CreateBucket.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-CreateBucket.png -------------------------------------------------------------------------------- /images/COS-Upload-1.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Upload-1.png -------------------------------------------------------------------------------- /images/COS-Upload-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Upload-2.png -------------------------------------------------------------------------------- /images/COS-Upload-3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Upload-3.png -------------------------------------------------------------------------------- /images/COS-Upload-Folder-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Upload-Folder-1.png -------------------------------------------------------------------------------- /images/COS-Upload-Folder-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/COS-Upload-Folder-2.png -------------------------------------------------------------------------------- /images/IBM-Cloud-Catalog-WML.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/IBM-Cloud-Catalog-WML.png 
-------------------------------------------------------------------------------- /images/IBM-Cloud-Catalog-search-WML.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/IBM-Cloud-Catalog-search-WML.png -------------------------------------------------------------------------------- /images/IBM-Cloud-Login.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/IBM-Cloud-Login.png -------------------------------------------------------------------------------- /images/IBM-Cloud-WML-Create.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/IBM-Cloud-WML-Create.png -------------------------------------------------------------------------------- /images/IBM-Cloud-WML-Instance.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/IBM-Cloud-WML-Instance.png -------------------------------------------------------------------------------- /images/Jetson-HardHat-Detection-Author.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/Jetson-HardHat-Detection-Author.png -------------------------------------------------------------------------------- /images/Sample-HardHat-Detection-Author.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/Sample-HardHat-Detection-Author.png -------------------------------------------------------------------------------- /images/WorkerSafety-HardHat-Labeled-Image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/WorkerSafety-HardHat-Labeled-Image.png -------------------------------------------------------------------------------- /images/cacli-complete.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-complete.png -------------------------------------------------------------------------------- /images/cacli-download.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-download.png -------------------------------------------------------------------------------- /images/cacli-list.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-list.png -------------------------------------------------------------------------------- /images/cacli-login.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-login.png -------------------------------------------------------------------------------- /images/cacli-progress.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-progress.png -------------------------------------------------------------------------------- /images/cacli-train1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-train1.png -------------------------------------------------------------------------------- /images/cacli-train2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/cacli-train2.png -------------------------------------------------------------------------------- /images/edge-objectdetection-arch-diagram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection/cbdd114d90310365994c1c84b73c5fbedd7ed015/images/edge-objectdetection-arch-diagram.png -------------------------------------------------------------------------------- /mkdocs.yml: -------------------------------------------------------------------------------- 1 | site_name: Edge Computing Worker Safety Hard Hat Detection 2 | site_description: >- 3 | This code pattern demonstrates how to protect workers with a TensorFlow Hard Hat object detection model running on a Jetson Nano edge device. 
4 | site_url: https://johnwalicki.github.io/EdgeComputing-WorkerSafety-HardHat-Detection 5 | site_author: John Walicki 6 | repo_name: johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection 7 | repo_url: https://github.com/johnwalicki/EdgeComputing-WorkerSafety-HardHat-Detection 8 | edit_uri: edit/main/docs/ 9 | theme: 10 | name: material 11 | use_directory_urls: false 12 | plugins: 13 | - search 14 | markdown_extensions: 15 | - attr_list 16 | - admonition 17 | - toc: 18 | permalink: true 19 | toc_depth: 3 20 | extra_css: 21 | - css/extra.css 22 | - css/pdf-print.css 23 | nav: 24 | - Introduction: README.md 25 | - 1 - Build a Hard Hat Object Detection Model: 26 | - Introduction: part1/README.md 27 | - Label Hard Hat Data: part1/LABEL.md 28 | - Create IBM Cloud Resources: part1/CLOUDSETUP.md 29 | - Train your Hard Hat model: part1/TRAIN.md 30 | - 2 - Build a Dashboard Application: 31 | - Introduction: part2/README.md 32 | - Install Node-RED: part2/install.md 33 | - Installing Nodes: part2/installing-nodes.md 34 | - TensorFlow in Node-RED: part2/adding-tf.md 35 | - Create a dashboard: part2/create-dashboard.md 36 | - Displaying detected objects: part2/display-objects.md 37 | - 3 - Configure a Jetson Nano: 38 | - Introduction: part3/README.md 39 | - Configure Nvidia Jetson Nano: part3/JETSON.md 40 | - Deploy / Test Model on the Jetson: part3/EDGEINFER.md 41 | - 4 - Open Horizon: 42 | - Introduction: part4/README.md 43 | - Open Horizon Exchange Hub Services: part4/HZNHUB.md 44 | - Open Horizon agent on the Jetson: part4/HZNCLIENT.md 45 | - 5 - Containerized Workloads: 46 | - Introduction: part5/README.md 47 | - Package file: part5/package.md 48 | - Settings file: part5/settings.md 49 | - Build a multistage Dockerfile: part5/dockerfile.md 50 | - Build a MultiArch Docker Image: part5/DOCKERMODEL.md 51 | - Deploy container to DockerHub: part5/DOCKERDEPLOY.md 52 | - 6 - IEAM / Horizon Exchange Hub: 53 | - Introduction: part6/README.md 54 | - Deploy Horizon Pattern: 
part6/HZNDEPLOY.md
55 | 
-------------------------------------------------------------------------------- /part1/CLOUDSETUP.md: --------------------------------------------------------------------------------
1 | *Quick links :*
2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [**Cloud Setup**](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md)
3 | ***
4 | 
5 | # Edge Computing Worker Safety Hard Hat Detection - Section 2
6 | 
7 | ## Create IBM Cloud resources
8 | 
9 | This section shows you how to create the IBM Cloud resources used to store the labeled hard hat dataset and train your model.
10 | 
11 | ### Lab Objectives
12 | 
13 | In this lab you will learn how to:
14 | 
15 | - Create an instance of Cloud Object Storage (COS) in IBM Cloud
16 | - Upload the hard hat dataset to COS
17 | - Create a Watson Machine Learning instance
18 | 
19 | ### Login to IBM Cloud
20 | 
21 | - Visit [IBM Cloud](http://cloud.ibm.com) and, if you don't yet have an account, register for a (free) IBM Cloud Lite account.
22 | ![IBM Cloud login](/images/IBM-Cloud-Login.png)
23 | 
24 | ### Create a Cloud Object Storage instance
25 | 
26 | - IBM Cloud will provision 25GB for free.
27 | - Visit the [IBM Cloud Catalog](https://cloud.ibm.com/catalog/services/cloud-object-storage) COS page.
28 | - Click on the **Create** button
29 | ![COS Create](/images/COS-Create.png)
30 | 
31 | ### Create a Cloud Object Storage bucket
32 | 
33 | - Click on the **Create bucket** button
34 | ![COS Create Bucket](/images/COS-CreateBucket.png)
35 | 
36 | - Select a **Standard** predefined bucket
37 | - Name your bucket ```hardhat-detection-```
38 | - The bucket name must be unique across the whole IBM Cloud Object Storage system
39 | ![COS Bucket name](/images/COS-BucketName.png)
40 | - Press the **Next** button
41 | - Unzip the hardhat-dataset.zip that was downloaded in the previous section.
42 | - Make certain you extract / unzip the .zip file.
43 | - Navigate to the extracted files on your computer
44 | - Drag the ```_annotations.json``` into the bucket.
45 | - A progress bar and a confirmation will display.
46 | 
47 | ![COS Upload](/images/COS-Upload-1.png)
48 | ![COS Upload](/images/COS-Upload-2.png)
49 | ![COS Upload](/images/COS-Upload-3.png)
64 | 
65 | - Hold off on uploading the ```images``` folder until the next step where you will be able to upload the entire folder.
66 | - Press the **Next** button.
67 | - On the next panel, scroll down and press the **View bucket configuration** button.
68 | - On the next panel, in the left navigation sidebar, click on **Objects**
69 | - Then, expand the **Upload** dropdown, and choose **Folders**
70 | ![COS Folder Upload](/images/COS-Upload-Folder-1.png)
71 | - In the popup dialog, click on **Select folders**
72 | - An operating system file dialog will open; navigate to and select the **images** folder that you have extracted.
73 | - A progress bar and a confirmation will display as the files are uploaded.
74 | ![COS Folder Upload](/images/COS-Upload-Folder-2.png)
75 | - Congratulations - You have uploaded the hardhat model dataset to IBM Cloud Object Storage.
76 | 
77 | ### Create a Watson Machine Learning Instance
78 | 
79 | - Visit the [IBM Cloud Catalog](https://cloud.ibm.com/catalog/services/machine-learning)
80 | - Search for **Watson Machine Learning**
81 | ![Catalog WML](/images/IBM-Cloud-Catalog-search-WML.png)
82 | - Click on the Machine Learning tile
83 | ![WML Service](/images/IBM-Cloud-Catalog-WML.png)
84 | - Click on the **Create** button
85 | ![WML Create](/images/IBM-Cloud-WML-Create.png)
86 | - You now have a **Watson Machine Learning** instance which will be used to train your Hard Hat model.
87 | ![WML Instance](/images/IBM-Cloud-WML-Instance.png)
88 | 
89 | ***
90 | You are now ready to train your Tensorflow hard hat model, so proceed to the next [Train Model section](/part1/TRAIN.md).
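As an aside on the naming requirement above - bucket names are unique across all of IBM Cloud Object Storage - one simple way to generate a candidate name that avoids collisions (purely illustrative) is:

```python
import uuid

# Bucket names must be globally unique, so append a random
# hex suffix to the recommended hardhat-detection- prefix.
suffix = uuid.uuid4().hex[:8]
bucket_name = f"hardhat-detection-{suffix}"
print(bucket_name)
```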
91 | 92 | *** 93 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [**Cloud Setup**](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 94 | *** 95 | -------------------------------------------------------------------------------- /part1/LABEL.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [**Label Data**](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 1 6 | 7 | ## Label a dataset of hard hat images with White / Blue / Yellow / Person objects 8 | 9 | This section shows you how to use Cloud Annotations to label a dataset of hard hat images with White / Blue / Yellow / Person objects. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Label a dataset of hard hat images with White / Blue / Yellow / Person objects using Cloud Annotations 16 | 17 | ### Hard Hat Image Data Set 18 | 19 | Create a set of videos with individuals wearing hardhats in several industrial factory settings. Make sure to include varied scenarios with different lighting conditions. If you have different colored hats that you want to recognize, you might record video of people wearing blue, white, yellow hard hats. Use a tool to capture frames at the desired time interval from the videos in your data set. Label each frame with objects that are of interest - blue, white, yellow hard hats and people. 
Draw bounding boxes around the objects. Repeat this step for all frames.
20 | 
21 | To proceed with this code pattern example, several hundred annotated and labeled hard hat images are provided in this repository.
22 | 
23 | ![Worker Safety Hard Hat Labeled Image](/images/WorkerSafety-HardHat-Labeled-Image.png)
24 | 
25 | ### Cloud Annotations
26 | 
27 | Cloud Annotations makes labeling images and training machine learning models easy. Whether you’ve never touched a line of code in your life or you’re a TensorFlow ninja, the [Cloud Annotations docs](https://cloud.annotations.ai/docs) will help you build what you need.
28 | 
29 | The Cloud Annotations tool will help you [prepare the training data](https://cloud.annotations.ai/docs#preparing-training-data). To sound the alarms, our worker safety use case needs to identify **localized hard hat objects** in the frames.
30 | 
31 | A classification model can tell you what an image is and how confident it is about its decision. A localized object detection model can provide you with much more information:
32 | - **Location** : The coordinates and area of where the object is in the image.
33 | - **Count** : The number of objects found in the image.
34 | - **Size** : How large the object is with respect to the image dimensions.
35 | 
36 | Cloud Annotations stores the annotated labeled metadata in a file named ```_annotations.json``` and images in a directory named ```images```. Cloud Annotations stores this training set in IBM Cloud Object Storage.
37 | 
38 | ### Download the Hard Hat training dataset
39 | 
40 | [Download](/dataset/hardhat-dataset.zip) the Hard Hat annotated labeled metadata and images (50MB zipfile) to your system. In the next section we will upload the training data to IBM Cloud Object Storage (COS).
41 | 
42 | ***
43 | You are now ready to set up your IBM Cloud resources, so proceed to the next [Cloud Setup section](/part1/CLOUDSETUP.md).
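To get a feel for the ```_annotations.json``` layout described above, a few lines of Python can summarize the label counts in a file. The sample below is hand-written for illustration; the authoritative schema is whatever the Cloud Annotations tool exports, so treat these field names as assumptions:

```python
import json
from collections import Counter

# Hand-written stand-in for a Cloud Annotations export;
# real files are produced by the annotation tool itself.
sample = """
{
  "labels": ["blue", "white", "yellow", "person"],
  "annotations": {
    "frame-001.jpg": [
      {"x": 0.41, "y": 0.12, "x2": 0.52, "y2": 0.30, "label": "yellow"},
      {"x": 0.35, "y": 0.10, "x2": 0.60, "y2": 0.95, "label": "person"}
    ]
  }
}
"""
data = json.loads(sample)

# Count how many bounding boxes carry each label across all images
counts = Counter(box["label"] for boxes in data["annotations"].values() for box in boxes)
print(dict(counts))
```

A quick summary like this is handy for spotting under-represented labels before you spend time and money on a training run.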
44 | 
45 | ***
46 | [Home](/README.md) - [**Label Data**](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md)
47 | ***
48 | 
-------------------------------------------------------------------------------- /part1/TRAIN.md: --------------------------------------------------------------------------------
1 | *Quick links :*
2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [**Train Model**](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md)
3 | ***
4 | 
5 | # Edge Computing Worker Safety Hard Hat Detection - Section 3
6 | 
7 | ## Train and Test a Hard Hat Detection Model
8 | 
9 | This section demonstrates how to install the Cloud Annotations cacli, train a Tensorflow model, download the model, set up a sample localhost web application, and test the model on your laptop.
10 | 
11 | ### Lab Objectives
12 | 
13 | In this lab you will learn how to:
14 | 
15 | - Install the Cloud Annotations cacli
16 | - Train a Tensorflow model
17 | - Download the model with the cacli
18 | - git clone object-detection-react
19 | - Test the model on your laptop
20 | 
21 | ### Cloud Annotations cacli Install
22 | 
23 | Now that the Hard Hat training data set has been uploaded to IBM Cloud Object Storage, we can train our model using the Cloud Annotations command line interface, ```cacli```.
24 | 
25 | The cacli is available for Linux, Mac, and Windows.
Follow the [download and installation instructions](https://cloud.annotations.ai/docs#installing-the-cli). 26 | 27 | ### Cloud Annotations - Login / Train 28 | 29 | - Log in to your IBM Cloud account using ```cacli login``` and answer the OTP prompts 30 | ``` 31 | $ cacli login 32 | ``` 33 | ![cacli login](/images/cacli-login.png) 34 | 35 | - Train your hard hat object detection model 36 | ``` 37 | $ cacli train 38 | ``` 39 | - Select your ```hardhat-detection-``` bucket. 40 | ![cacli train](/images/cacli-train1.png) 41 | - Note the training run **Model ID** 42 | ![cacli train](/images/cacli-train2.png) 43 | - List training runs 44 | ``` 45 | $ cacli list 46 | ``` 47 | ![cacli list](/images/cacli-list.png) 48 | - Monitor the progress of your training runs 49 | ``` 50 | $ cacli progress model-58sdpqyx 51 | ``` 52 | ![cacli progress](/images/cacli-progress.png) 53 | - Wait for the model training to complete (15-30 minutes) 54 | ![cacli complete](/images/cacli-complete.png) 55 | - Download the model 56 | ``` 57 | $ cacli download model-58sdpqyx 58 | success download complete 59 | ``` 60 | ![cacli download](/images/cacli-download.png) 61 | 62 | ### Set up an Object Detection web application on your local system 63 | 64 | The [Cloud Annotations GitHub repository](https://github.com/cloud-annotations) includes an [object detection react](https://github.com/cloud-annotations/object-detection-react) web application that you can install locally. 65 | 66 | - Follow the [README.md instructions](https://github.com/cloud-annotations/object-detection-react/blob/master/README.md) 67 | - (or) Run these commands: 68 | ``` 69 | $ git clone git@github.com:cloud-annotations/object-detection-react.git 70 | $ cd object-detection-react 71 | $ npm install 72 | ``` 73 | - Copy the **model_web** directory created by ```cacli download``` and paste it into the ```public``` folder of this repository. On my Linux system, I just created a symlink.
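The symlink variant of that copy step can be sketched as follows; the paths under ```/tmp/demo``` are illustrative placeholders for wherever ```cacli download``` and ```git clone``` put their output.

```shell
# Illustrative paths only -- substitute the real locations of the
# downloaded model_web directory and the cloned React app.
MODEL_DIR=/tmp/demo/model_web
APP_PUBLIC=/tmp/demo/object-detection-react/public
mkdir -p "$MODEL_DIR" "$APP_PUBLIC"

# Symlink instead of copying, so a freshly re-downloaded model is
# picked up without repeating the copy step.
ln -sfn "$MODEL_DIR" "$APP_PUBLIC/model_web"
ls -l "$APP_PUBLIC"
```

```ln -sfn``` replaces any existing link, so re-running this after a new ```cacli download``` is safe.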
74 | 75 | - Start the server 76 | ``` 77 | $ npm start 78 | ``` 79 | 80 | ## Test the Hard Hat model on your laptop 81 | 82 | Now that the **object detection react** web application is running, open http://localhost:3000 to view it in the browser. Share your laptop camera with the browser tab. 83 | 84 | Find your hard hat and observe the hard hat model prediction. 85 | 86 | ![author with a hardhat](/images/Sample-HardHat-Detection-Author.png) 87 | 88 | ### Congratulations 89 | 90 | You are now ready to set up your Nvidia Jetson Nano, so proceed to the next [Jetson Setup section](/part2/JETSON.md). 91 | 92 | *** 93 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [**Train Model**](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 94 | *** 95 | -------------------------------------------------------------------------------- /part2/EDGEINFER.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [**Edge Inferencing**](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 5 6 | 7 | ## Download and Test the Hard Hat Detection Model on the Jetson 8 | 9 | This section shows you how to download and test the hard hat model on your Jetson Nano.
10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Download the hard hat model with the cacli 16 | - git clone object-detection-react 17 | - Test the model on the Jetson 18 | 19 | ### Inferencing on the Jetson 20 | 21 | In a previous section you uploaded hard hat image training data and trained an object detection model using the Cloud Annotations command line interface, cacli, TensorFlow, and Watson Machine Learning. You also downloaded and tested the model on your laptop using a browser and your laptop web camera. 22 | 23 | This section will repeat some of those steps on the Jetson Nano. 24 | 25 | ### Cloud Annotations cacli Install 26 | 27 | Our model is already trained, so we can use the Cloud Annotations command line interface, ```cacli```, to download it. 28 | 29 | The cacli is available for Linux, Mac, and Windows. Follow the [download and installation instructions](https://cloud.annotations.ai/docs#installing-the-cli). 30 | 31 | ### Cloud Annotations - Login / List Models 32 | 33 | - Log in to your IBM Cloud account using ```cacli login``` and answer the OTP prompts 34 | ``` 35 | $ cacli login 36 | ``` 37 | ![cacli login](/images/cacli-login.png) 38 | 39 | - List training runs and trained models 40 | ``` 41 | $ cacli list 42 | ``` 43 | ![cacli list](/images/cacli-list.png) 44 | 45 | ### Download your hard hat object detection model 46 | 47 | - Download the model 48 | ``` 49 | $ cacli download model-58sdpqyx 50 | success download complete 51 | ``` 52 | ![cacli download](/images/cacli-download.png) 53 | 54 | ### Install Node.js 55 | 56 | Before we can run the Object Detection web application on your Jetson Nano, run these commands to install Node.js: 57 | 58 | ```bash 59 | $ curl -sL https://deb.nodesource.com/setup_12.x | sudo -E bash - 60 | $ sudo apt-get install -y nodejs 61 | ``` 62 | 63 | ### Set up an Object Detection web application on your Jetson Nano 64 | 65 | The [Cloud Annotations GitHub
repository](https://github.com/cloud-annotations) includes an [object detection react](https://github.com/cloud-annotations/object-detection-react) web application that you can install locally. 66 | 67 | - Follow the [README.md instructions](https://github.com/cloud-annotations/object-detection-react/blob/master/README.md) 68 | - (or) Run these commands: 69 | ``` 70 | $ git clone git@github.com:cloud-annotations/object-detection-react.git 71 | $ cd object-detection-react 72 | $ npm install 73 | ``` 74 | - Move the **model_web** directory created by ```cacli download``` into the ```public``` folder of this repository. On the Ubuntu Linux system, I just created a symlink. 75 | 76 | - Start the server 77 | ``` 78 | $ npm start 79 | ``` 80 | 81 | ## Test the Hard Hat model on your Jetson Nano 82 | 83 | Now that the **object detection react** web application is running, launch Chromium on your Jetson Nano and open http://localhost:3000 to view it in the browser. Share your Jetson web camera with the browser tab. 84 | 85 | Find your hard hat and observe the hard hat model prediction running on the Jetson! 86 | 87 | ![author with a hardhat](/images/Jetson-HardHat-Detection-Author.png) 88 | 89 | ### Congratulations - Object Detection on the Edge! 90 | 91 | You are now ready to set up Open Horizon Exchange Hub Services, so proceed to the next [Horizon Hub Setup section](/part3/HZNHUB.md).
92 | 93 | *** 94 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [**Edge Inferencing**](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 95 | *** 96 | -------------------------------------------------------------------------------- /part2/JETSON.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [**Setup Jetson**](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 4 6 | 7 | ## Jetson Nano setup 8 | 9 | This section shows you how to configure a Jetson Nano to run object detection. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Set up a Jetson Nano with Ubuntu 16 | - Attach a USB webcam and use a 5V barrel jack power supply to provide sufficient power 17 | 18 | ### Set up a Jetson Nano Developer Kit 19 | 20 | Nvidia provides thorough setup [documentation on the Jetson Nano Developer Kit](https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit), so this guide will not attempt to reproduce those steps. In addition, there are [numerous Internet resources](https://www.hackster.io/news/getting-started-with-the-nvidia-jetson-nano-developer-kit-43aa7c298797) that walk through the configuration and setup.
21 | 22 | ![Jetson Nano hardware picture](https://developer.nvidia.com/sites/default/files/akamai/embedded/images/jetsonNano/gettingStarted/jetson-nano-dev-kit-top-r6-HR-B01.png) 23 | 24 | This tutorial will emphasize several parts of those guides that can ensure a successful implementation of the Worker Safety and Hard Hat Detection model. 25 | 26 | #### 5V Barrel Jack Power Adapter 27 | 28 | The Jetson Nano can be powered by a Micro-USB 5V 2A power supply, but the camera and GPU require additional power to operate. Avoid the frustration of indeterminate results and switch to a 5V barrel jack power supply (4A). Closing the J48 jumper with a standard 2.54mm pitch jumper will switch the power from the micro USB jack to the barrel jack. Follow the [recommended instructions](https://forums.developer.nvidia.com/t/power-supply-considerations-for-jetson-nano-developer-kit/71637) to close jumper J48 on the Nano PCB. 29 | 30 | Once you are powering the Jetson Nano with a 5V barrel jack power supply, you can enable the 'maximum performance mode' and the system and inferencing should run faster. 31 | 32 | ```bash 33 | sudo nvpmodel -m 0 34 | ``` 35 | 36 | ### Attach a USB Web camera 37 | 38 | Object Detection does not require the highest-quality camera. Most images will get scaled down. Find a reasonably priced webcam. The Sony PlayStation Eye camera for the PS3 is only a few dollars. Check for Linux compatibility when making your choice. 39 | 40 | ### Internet Connectivity 41 | 42 | You will need an Ethernet cable (the Jetson Nano developer kit does not include WiFi) or a WiFi USB dongle. 43 | 44 | ### Now the fun begins! 45 | 46 | You are now ready to run the hard hat model on the Jetson edge device, so proceed to the next [Edge Inferencing section](/part2/EDGEINFER.md).
47 | 48 | 49 | *** 50 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [**Setup Jetson**](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 51 | *** 52 | -------------------------------------------------------------------------------- /part3/HZNCLIENT.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [**Horizon Client**](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 7 6 | 7 | ## Deploy and Test Hard Hat Object Detection 8 | 9 | Learn how to install the Open Horizon anax client on your Jetson Nano and register it with your Exchange Hub. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Install Docker on your Jetson Nano 16 | - Install the Open Horizon anax client 17 | - Register the Jetson anax client with the Hub 18 | 19 | ### Next Steps 20 | 21 | You are now ready to containerize the hard hat model in a Docker image, so proceed to the next [Docker Model section](/part4/DOCKERMODEL.md).
22 | 23 | 24 | *** 25 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [**Horizon Client**](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 26 | *** 27 | -------------------------------------------------------------------------------- /part3/HZNHUB.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [**Horizon Hub**](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 6 6 | 7 | ## Open Horizon Exchange Hub Set Up 8 | 9 | This section guides you through the setup of a laptop with Horizon Exchange Hub Services. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Set up a laptop with Horizon Exchange Hub Services 16 | 17 | ### Next Steps 18 | 19 | You are now ready to set up your Open Horizon anax client on the Jetson Nano, so proceed to the next [Horizon Client section](/part3/HZNCLIENT.md).
20 | 21 | 22 | *** 23 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [**Horizon Hub**](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 24 | *** 25 | -------------------------------------------------------------------------------- /part4/DOCKERMODEL.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [**Docker Model**](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 8 6 | 7 | ## Containerize your model in a Docker image 8 | 9 | This section shows you how to build a Docker image that contains the model. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Build a Docker image that contains the model 16 | 17 | ### Next Steps 18 | 19 | You are now ready to register and deploy the dockerized pattern, so proceed to the next [Horizon Deployment section](/part4/HZNDEPLOY.md).
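As a rough sketch of what a model-carrying image could look like, assuming the ```model_web``` directory from ```cacli download``` sits in the build context and a simple static file server is enough to expose it, a Dockerfile along these lines is one option (the base image, port, and static server choice are illustrative and not the pattern's actual Dockerfile):

```
# Illustrative sketch only, not the Code Pattern's actual Dockerfile.
# Assumes model_web/ (produced by `cacli download`) is in the build context.
FROM node:12-slim

WORKDIR /app
COPY model_web/ ./model_web/

# `serve` is a small static file server used here to expose the model files
RUN npm install -g serve

EXPOSE 5000
CMD ["serve", "-l", "5000", "/app"]
```

You would then build and run it on the Jetson with something like ```docker build -t hardhat-model .``` followed by ```docker run -p 5000:5000 hardhat-model```.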
20 | 21 | *** 22 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [**Docker Model**](/part4/DOCKERMODEL.md) - [Horizon Deploy](/part4/HZNDEPLOY.md) 23 | *** 24 | -------------------------------------------------------------------------------- /part4/HZNDEPLOY.md: -------------------------------------------------------------------------------- 1 | *Quick links :* 2 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [**Horizon Deploy**](/part4/HZNDEPLOY.md) 3 | *** 4 | 5 | # Edge Computing Worker Safety Hard Hat Detection - Section 9 6 | 7 | ## Register and deploy the Pattern to the edge 8 | 9 | This final section demonstrates how to register the workload pattern on the Exchange server and deploy the pattern to the edge device. 10 | 11 | ### Lab Objectives 12 | 13 | In this lab you will learn how to: 14 | 15 | - Register the workload pattern on the Exchange server 16 | - Deploy the pattern to the edge device 17 | 18 | ### Congratulations 19 | 20 | You have completed this Code Pattern. 21 | 22 | *** 23 | [Home](/README.md) - [Label Data](/part1/LABEL.md) - [Cloud Setup](/part1/CLOUDSETUP.md) - [Train Model](/part1/TRAIN.md) - [Setup Jetson](/part2/JETSON.md) - [Edge Inferencing](/part2/EDGEINFER.md) - [Horizon Hub](/part3/HZNHUB.md) - [Horizon Client](/part3/HZNCLIENT.md) - [Docker Model](/part4/DOCKERMODEL.md) - [**Horizon Deploy**](/part4/HZNDEPLOY.md) 24 | *** 25 | --------------------------------------------------------------------------------