├── LICENSE
├── README.md
├── detect_tf.py
├── detect_trt.py
├── resources_tf
│   ├── data
│   │   └── object-detection.pbtxt
│   └── fine_tuned_model
│       ├── checkpoint
│       ├── frozen_inference_graph.pb
│       ├── model.ckpt.data-00000-of-00001
│       ├── model.ckpt.index
│       ├── model.ckpt.meta
│       ├── pipeline.config
│       └── saved_model
│           └── saved_model.pb
├── resources_trt
│   ├── frozen_graph_to_trt.py
│   ├── pb_to_pb_trt_transfere.py
│   ├── read.me
│   └── trt_graph.pb
├── result_output_detect_tf.out
├── result_output_detect_trt.out
└── test
    └── images
        ├── 85.jpg
        ├── 86.jpg
        ├── 87.jpg
        ├── 88.jpg
        ├── 89.jpg
        ├── ddeteccted_result.png
        └── result.png
/LICENSE:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 |  Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 |     <one line to give the program's name and a brief idea of what it does.>
635 |     Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 |     along with this program.  If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 |     <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
675 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # JetsonNanoInsulatorDetection
2 | Insulator detection with a custom-trained ssd_mobilenet_v1 network.
3 | ***
4 | After completing the getting-started steps for the NVIDIA Jetson Nano development board (link - https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit), we need to install all the requirements for TensorFlow inference tests with the custom ssd_mobilenet_v1 network (trained to detect insulators at a power supply substation).
5 | ***
6 | 1. Installing base dependencies and pip
7 | ```
8 | sudo apt-get update
9 | sudo apt-get install git cmake
10 | sudo su
11 | apt-get install libatlas-base-dev gfortran
12 | apt-get install libhdf5-serial-dev hdf5-tools
13 | sudo apt-get install python3-dev
14 |
15 | wget https://bootstrap.pypa.io/get-pip.py
16 | sudo python3 get-pip.py
17 | sudo apt install python3-testresources
18 | sudo rm get-pip.py
19 | ```
20 | 2. TensorFlow and other libraries
21 | ```
22 | sudo pip install numpy
23 | sudo pip install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu==1.13.1+nv19.3
24 |
25 | sudo pip install scipy
26 | sudo pip install keras
27 | ```
28 | These libraries need a good internet connection and some time to install (from 15 to 40 minutes).
29 |
30 | Now any pre-trained Keras or TensorFlow model can run on the Jetson Nano board.
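   |
   | To quickly verify the installation (a minimal sanity check; the exact version string depends on the wheel installed above):
   | ```
   | python3 -c "import tensorflow as tf; print(tf.__version__); print(tf.test.is_gpu_available())"
   | ```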
31 | ***
32 | 3. Installing pretrained models for TensorRT from https://github.com/NVIDIA-AI-IOT/tf_trt_models
33 | ```
34 | cd ~
35 | git clone --recursive https://github.com/NVIDIA-Jetson/tf_trt_models.git
36 | cd tf_trt_models
37 | ./install.sh python3
38 | ```
39 | You now have all the TensorRT models installed in /tf_trt_models, and the full TensorFlow model zoo (https://github.com/tensorflow/models) in /tf_trt_models/third_party/models.
40 |
41 | Directory and file structure
42 | - resources_tf - TensorFlow trained graph for insulator detection.
43 |   This frozen model is converted to a TensorRT graph inside the resources_trt directory.
44 | - resources_trt - directory that also holds the Python scripts to prepare the TensorRT graph from the TensorFlow graph:
45 |   - frozen_graph_to_trt.py - script to convert the TF frozen graph to TRT
46 |   - pb_to_pb_trt_transfere.py - a script with another approach to preparing the TRT frozen graph
47 |   - trt_graph.pb - frozen TRT graph
48 | - test/images - directory with test images
49 |
50 | - detect_tf.py - script to test TensorFlow inference speed (see run commands below)
51 | - detect_trt.py - script to test TensorRT inference speed
52 |
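   | To reproduce the timings below, run the test scripts from the repository root, since both use relative paths such as ./test/images/ (detect_tf.py also assumes the TensorFlow models research/ and research/slim/ directories are on the Python path, as noted in its header comments). A minimal sketch, assuming the repository was cloned into the home directory:
   | ```
   | cd ~/JetsonNanoInsulatorDetection
   | python3 detect_tf.py    # TensorFlow frozen graph
   | python3 detect_trt.py   # TensorRT-optimized graph
   | ```
   |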
53 | ***
54 | Results
55 | ***
56 | Testing with the TensorFlow frozen graph gives about 0.07 sec per image (~15 FPS).
57 | Testing with the TensorRT frozen graph gives about 0.09 sec per image (~11 FPS).
58 | Loading the TF program into memory takes about 15 seconds.
59 | Loading the TRT program for execution takes about 200 seconds.
60 |
61 | One possible problem with the TRT approach is that the base TensorFlow graph used for TRT was trained on differently sized images (not 224x224 or 300x300) - but this is not certain and needs further investigation.
62 |
63 | Also, detection with the TensorRT graph gives poor results on pictures with many insulators or with small insulators...
64 |
65 |
66 | Example:
67 | 
68 |
69 | Update!
70 | With the approach from https://github.com/NVIDIA-AI-IOT/tf_trt_models/blob/master/examples/detection/detection.ipynb
71 | I have received better results (about 20 FPS) with the TensorRT library.
72 | 
73 |
--------------------------------------------------------------------------------
/detect_tf.py:
--------------------------------------------------------------------------------
1 | # detect-and-classify image test program for Jetson Nano
2 | # Python PIL/cv2 implementation with the plain TensorFlow graph, not TensorRT optimized
3 | # 2019-05-25
4 | # also checks the libraries installed by the tf_trt_models installation
5 |
6 | import numpy as np
7 | import os
8 | import sys
9 | import tensorflow as tf
10 | # with PIL, time to process 1 image is about 0.25 sec
11 | #from PIL import Image # choose this library for image processing
12 | # with cv2, time to process 1 image is about 0.07 sec
13 | import cv2 # choose which library will process the images
14 | import base64
15 | import time
16 |
17 | ################################################################################
18 | # need to add the object detection libraries and code downloaded from
19 | # https://github.com/tensorflow/models and stored in the /home/jnano/ directory.
20 | # This matters if tf_trt_models was NOT installed with the install.sh script,
21 | # because that script sets up all the PATHs during installation, and the tensorflow
22 | # models are located in the tf_trt_models/third_party/models directory.
23 | ################################################################################
24 | #sys.path.append('/home/jnano/tf_trt_models/third_party/models/research/')
25 | #sys.path.append('/home/jnano/tf_trt_models/third_party/models/research/slim/')
26 |
27 | from object_detection.utils import ops as utils_ops
28 | from object_detection.utils import label_map_util
29 | from object_detection.utils import visualization_utils as vis_util
30 |
31 | ################################################################################
32 | # load image into a numpy array - helper used when working with PIL image processing
33 | ################################################################################
34 | def load_image_into_numpy_array(image):
35 | (im_width, im_height) = image.size
36 | return np.array(image.getdata()).reshape(
37 | (im_height, im_width, 3)).astype(np.uint8)
38 |
39 | ################################################################################
40 | #MAIN procedure
41 | ################################################################################
42 | # start timing of processes
43 | time_start = time.time()
44 |
45 | # Path to frozen detection graph. This is the actual model that is used for the object detection.
46 | PATH_TO_CKPT = './resources_tf/fine_tuned_model' + '/frozen_inference_graph.pb'
47 | # List of the strings that is used to add correct label for each box.
48 | PATH_TO_LABELS = os.path.join('./resources_tf/data/', 'object-detection.pbtxt')
49 | # number of classes for classification
50 | NUM_CLASSES = 1
51 |
52 | #for local testing
53 | # If you want to test the code with your own images, just add their paths to TEST_IMAGE_PATHS.
54 | PATH_TO_TEST_IMAGES_DIR = './test/images/'
55 | TEST_IMAGE_PATHS = [ os.path.join(PATH_TO_TEST_IMAGES_DIR, '8{}.jpg'.format(i)) for i in range(5, 10) ]
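   | # this yields ./test/images/85.jpg ... ./test/images/89.jpg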
56 |
57 |
58 |
59 | # init graph
60 | detection_graph = tf.Graph()
61 | with detection_graph.as_default():
62 | od_graph_def = tf.GraphDef()
63 | with tf.gfile.GFile(PATH_TO_CKPT, 'rb') as fid:
64 | serialized_graph = fid.read()
65 | od_graph_def.ParseFromString(serialized_graph)
66 | tf.import_graph_def(od_graph_def, name='')
67 |
68 | # create labels
69 | label_map = label_map_util.load_labelmap(PATH_TO_LABELS)
70 | categories = label_map_util.convert_label_map_to_categories(label_map, max_num_classes=NUM_CLASSES, use_display_name=True)
71 | category_index = label_map_util.create_category_index(categories)
72 |
73 | #print(categories)
74 | # return [{'name': 'insulator', 'id': 1}]
75 | #print(category_index)
76 | #return {1: {'name': 'insulator', 'id': 1}}
77 |
78 | # init the graph and get handles to all the tensors
79 | with detection_graph.as_default():
80 | with tf.Session() as sess:
81 | # Get handles to input and output tensors
82 | ops = tf.get_default_graph().get_operations()
83 | all_tensor_names = {output.name for op in ops for output in op.outputs}
84 | tensor_dict = {}
85 | for key in [
86 | 'num_detections', 'detection_boxes', 'detection_scores',
87 | 'detection_classes', 'detection_masks'
88 | ]:
89 | tensor_name = key + ':0'
90 | if tensor_name in all_tensor_names:
91 | tensor_dict[key] = tf.get_default_graph().get_tensor_by_name(
92 | tensor_name)
93 | if 'detection_masks' in tensor_dict:
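   | # NOTE: this branch never runs for the ssd_mobilenet_v1 model used here,
   | # since its graph has no 'detection_masks' output; for a mask model,
   | # 'image' would also need to be defined before the reframe call below.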
94 | # The following processing is only for single image
95 | detection_boxes = tf.squeeze(tensor_dict['detection_boxes'], [0])
96 | detection_masks = tf.squeeze(tensor_dict['detection_masks'], [0])
97 | # Reframe is required to translate mask from box coordinates to image coordinates and fit the image size.
98 | real_num_detection = tf.cast(tensor_dict['num_detections'][0], tf.int32)
99 | detection_boxes = tf.slice(detection_boxes, [0, 0], [real_num_detection, -1])
100 | detection_masks = tf.slice(detection_masks, [0, 0, 0], [real_num_detection, -1, -1])
101 | detection_masks_reframed = utils_ops.reframe_box_masks_to_image_masks(
102 | detection_masks, detection_boxes, image.shape[0], image.shape[1])
103 | detection_masks_reframed = tf.cast(
104 | tf.greater(detection_masks_reframed, 0.5), tf.uint8)
105 | # Follow the convention by adding back the batch dimension
106 | tensor_dict['detection_masks'] = tf.expand_dims(
107 | detection_masks_reframed, 0)
108 | image_tensor = tf.get_default_graph().get_tensor_by_name('image_tensor:0')
109 |
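   | # report graph/session setup time (everything above runs once, before the timing loop)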
110 | print(time.time()-time_start)
111 |
112 | for counter_ in range(0,10): # average detection time 0.067 sec per image ~ 15 FPS
113 | for image_path in TEST_IMAGE_PATHS:
114 | #this part is for PIL image processing
115 | """
116 | image = Image.open(image_path)
117 | # the array based representation of the image will be used later in order to prepare the
118 | # result image with boxes and labels on it.
119 | image_np = load_image_into_numpy_array(image)
120 | #end of part for PIL image processing
121 | """
122 | # this part is for cv2 image processing (the base64 round-trip below seems intended to emulate receiving the image as an encoded string, e.g. over a network)
123 |
124 | in_file = open(image_path, "rb")
125 | data = in_file.read()
126 | in_file.close()
127 | encoded_string = base64.standard_b64encode(data)
128 | file_string = str(encoded_string, 'ascii', 'ignore')
129 |
130 | file_bytes = np.asarray(bytearray(base64.b64decode(file_string)), dtype=np.uint8)
131 | image_ = cv2.imdecode(file_bytes, 1)
132 | # convert color from BGR to RGB (flag 4 == cv2.COLOR_BGR2RGB)
133 | image_np = cv2.cvtColor(image_, 4)
134 |
135 | # Actual detection.
136 | time_cycle = time.time()
137 |
138 | # Run inference
139 | output_dict = sess.run(tensor_dict,
140 | feed_dict={image_tensor: np.expand_dims(image_np, 0)})
141 |
142 | # all outputs are float32 numpy arrays, so convert types as appropriate
143 | output_dict['num_detections'] = int(output_dict['num_detections'][0])
144 | output_dict['detection_classes'] = output_dict[
145 | 'detection_classes'][0].astype(np.uint8)
146 | output_dict['detection_boxes'] = output_dict['detection_boxes'][0]
147 | output_dict['detection_scores'] = output_dict['detection_scores'][0]
148 | if 'detection_masks' in output_dict:
149 | output_dict['detection_masks'] = output_dict['detection_masks'][0]
150 | print('time of inference:'+str(time.time() - time_cycle))
151 | print("detected "+str(output_dict['num_detections']))
152 | #print(output_dict['detection_boxes'])
153 | #print(output_dict['detection_classes'])
154 | #print(output_dict['detection_scores'])
155 | #print(output_dict['detection_classes'])
156 | #print(output_dict.get('detection_masks'))
157 |
158 | # Visualization of the results of a detection.
159 | """
160 | vis_util.visualize_boxes_and_labels_on_image_array(
161 | image_np,
162 | output_dict['detection_boxes'],
163 | output_dict['detection_classes'],
164 | output_dict['detection_scores'],
165 | category_index,
166 | instance_masks=output_dict.get('detection_masks'),
167 | use_normalized_coordinates=True,
168 | line_thickness=3)
169 |
170 | vis_util.save_image_array_as_png(image_np,'./test/images/result.png')
171 | #make result png string from image numpy array
172 | #result_string = vis_util.encode_image_array_as_png_str(image_np)
173 | #print(result_string)
174 | """
175 |
176 | print('thats all folks')
177 | #sys.exit(0)
178 |
--------------------------------------------------------------------------------
/detect_trt.py:
--------------------------------------------------------------------------------
1 | # 2019-05-19: detection and classification with a TensorRT-optimized graph (directory resources_trt)
2 | # needs a specially prepared graph made with the pb_to_pb_trt_transfere.py script
3 | # loads into memory very slowly (about 3 min)
4 | import tensorflow as tf
5 | import cv2
6 | import os
7 | import time
8 |
9 | init_time = time.time()
10 |
11 | def get_frozen_graph(graph_file):
12 | """Read Frozen Graph file from disk."""
13 | with tf.gfile.GFile(graph_file, "rb") as f:
14 | graph_def = tf.GraphDef()
15 | graph_def.ParseFromString(f.read())
16 | return graph_def
17 |
18 |
19 | PATH_TO_TEST_IMAGES_DIR = './test/images/'
20 | TEST_IMAGE_PATHS = [ os.path.join(PATH_TO_TEST_IMAGES_DIR, '8{}.jpg'.format(i)) for i in range(5, 10) ]
21 |
22 | # The TensorRT frozen inference graph
23 | pb_fname = "./resources_trt/trt_graph.pb"
24 | trt_graph = get_frozen_graph(pb_fname)
25 |
26 | input_names = ['image_tensor']
27 |
28 | # Create session and load graph
29 | tf_config = tf.ConfigProto()
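   | # allow_growth stops TF from reserving all GPU memory up front - useful on the
   | # Jetson Nano, where the CPU and GPU share the same physical memory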
30 | tf_config.gpu_options.allow_growth = True
31 | tf_sess = tf.Session(config=tf_config)
32 | tf.import_graph_def(trt_graph, name='')
33 |
34 | tf_input = tf_sess.graph.get_tensor_by_name(input_names[0] + ':0')
35 | tf_scores = tf_sess.graph.get_tensor_by_name('detection_scores:0')
36 | tf_boxes = tf_sess.graph.get_tensor_by_name('detection_boxes:0')
37 | tf_classes = tf_sess.graph.get_tensor_by_name('detection_classes:0')
38 | tf_num_detections = tf_sess.graph.get_tensor_by_name('num_detections:0')
39 |
40 | print('time of loading:'+str(time.time()-init_time))
41 |
42 | for counter_ in range(0,10): # average detection time 0.087 sec per image ~ 11 FPS
43 | for image_path in TEST_IMAGE_PATHS:
44 | cycle_time = time.time()
45 | image = cv2.imread(image_path)
46 | #image = cv2.resize(image, (300, 300))
47 |
48 | scores, boxes, classes, num_detections = tf_sess.run([tf_scores, tf_boxes, tf_classes, tf_num_detections], feed_dict={
49 | tf_input: image[None, ...]
50 | })
51 | boxes = boxes[0] # index by 0 to remove batch dimension
52 | scores = scores[0]
53 | classes = classes[0]
54 | num_detections = int(num_detections[0])
55 |
56 | print('time of inference:'+str(time.time()-cycle_time))
57 | print("detected " + str(num_detections))
58 |
--------------------------------------------------------------------------------
/resources_tf/data/object-detection.pbtxt:
--------------------------------------------------------------------------------
1 | item {
2 | id: 1
3 | name: 'insulator'
4 | }
5 |
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/checkpoint:
--------------------------------------------------------------------------------
1 | model_checkpoint_path: "model.ckpt"
2 | all_model_checkpoint_paths: "model.ckpt"
3 |
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/frozen_inference_graph.pb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_tf/fine_tuned_model/frozen_inference_graph.pb
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/model.ckpt.data-00000-of-00001:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_tf/fine_tuned_model/model.ckpt.data-00000-of-00001
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/model.ckpt.index:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_tf/fine_tuned_model/model.ckpt.index
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/model.ckpt.meta:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_tf/fine_tuned_model/model.ckpt.meta
--------------------------------------------------------------------------------
/resources_tf/fine_tuned_model/pipeline.config:
--------------------------------------------------------------------------------
1 | model {
2 | ssd {
3 | num_classes: 1
4 | image_resizer {
5 | fixed_shape_resizer {
6 | height: 300
7 | width: 300
8 | }
9 | }
10 | feature_extractor {
11 | type: "ssd_mobilenet_v1"
12 | depth_multiplier: 1.0
13 | min_depth: 16
14 | conv_hyperparams {
15 | regularizer {
16 | l2_regularizer {
17 | weight: 3.9999998989515007e-05
18 | }
19 | }
20 | initializer {
21 | truncated_normal_initializer {
22 | mean: 0.0
23 | stddev: 0.029999999329447746
24 | }
25 | }
26 | activation: RELU_6
27 | batch_norm {
28 | decay: 0.9997000098228455
29 | center: true
30 | scale: true
31 | epsilon: 0.0010000000474974513
32 | train: true
33 | }
34 | }
35 | }
36 | box_coder {
37 | faster_rcnn_box_coder {
38 | y_scale: 10.0
39 | x_scale: 10.0
40 | height_scale: 5.0
41 | width_scale: 5.0
42 | }
43 | }
44 | matcher {
45 | argmax_matcher {
46 | matched_threshold: 0.5
47 | unmatched_threshold: 0.5
48 | ignore_thresholds: false
49 | negatives_lower_than_unmatched: true
50 | force_match_for_each_row: true
51 | }
52 | }
53 | similarity_calculator {
54 | iou_similarity {
55 | }
56 | }
57 | box_predictor {
58 | convolutional_box_predictor {
59 | conv_hyperparams {
60 | regularizer {
61 | l2_regularizer {
62 | weight: 3.9999998989515007e-05
63 | }
64 | }
65 | initializer {
66 | truncated_normal_initializer {
67 | mean: 0.0
68 | stddev: 0.029999999329447746
69 | }
70 | }
71 | activation: RELU_6
72 | batch_norm {
73 | decay: 0.9997000098228455
74 | center: true
75 | scale: true
76 | epsilon: 0.0010000000474974513
77 | train: true
78 | }
79 | }
80 | min_depth: 0
81 | max_depth: 0
82 | num_layers_before_predictor: 0
83 | use_dropout: false
84 | dropout_keep_probability: 0.800000011920929
85 | kernel_size: 1
86 | box_code_size: 4
87 | apply_sigmoid_to_scores: false
88 | }
89 | }
90 | anchor_generator {
91 | ssd_anchor_generator {
92 | num_layers: 6
93 | min_scale: 0.20000000298023224
94 | max_scale: 0.949999988079071
95 | aspect_ratios: 1.0
96 | aspect_ratios: 2.0
97 | aspect_ratios: 0.5
98 | aspect_ratios: 3.0
99 | aspect_ratios: 0.33329999446868896
100 | }
101 | }
102 | post_processing {
103 | batch_non_max_suppression {
104 | score_threshold: 0.30000001192092896
105 | iou_threshold: 0.6000000238418579
106 | max_detections_per_class: 100
107 | max_total_detections: 100
108 | }
109 | score_converter: SIGMOID
110 | }
111 | normalize_loss_by_num_matches: true
112 | loss {
113 | localization_loss {
114 | weighted_smooth_l1 {
115 | }
116 | }
117 | classification_loss {
118 | weighted_sigmoid {
119 | }
120 | }
121 | hard_example_miner {
122 | num_hard_examples: 3000
123 | iou_threshold: 0.9900000095367432
124 | loss_type: CLASSIFICATION
125 | max_negatives_per_positive: 3
126 | min_negatives_per_image: 0
127 | }
128 | classification_weight: 1.0
129 | localization_weight: 1.0
130 | }
131 | }
132 | }
133 | train_config {
134 | batch_size: 24
135 | data_augmentation_options {
136 | random_horizontal_flip {
137 | }
138 | }
139 | data_augmentation_options {
140 | ssd_random_crop {
141 | }
142 | }
143 | optimizer {
144 | rms_prop_optimizer {
145 | learning_rate {
146 | exponential_decay_learning_rate {
147 | initial_learning_rate: 0.004000000189989805
148 | decay_steps: 800720
149 | decay_factor: 0.949999988079071
150 | }
151 | }
152 | momentum_optimizer_value: 0.8999999761581421
153 | decay: 0.8999999761581421
154 | epsilon: 1.0
155 | }
156 | }
157 | fine_tune_checkpoint: "/content/data_insulators/models/ssd_mobilenet_v1_coco/model.ckpt"
158 | from_detection_checkpoint: true
159 | num_steps: 200000
160 | }
161 | train_input_reader {
162 | label_map_path: "/content/data_insulators/data/object-detection.pbtxt"
163 | tf_record_input_reader {
164 | input_path: "/content/data_insulators/data/train.record"
165 | }
166 | }
167 | eval_config {
168 | num_examples: 8000
169 | max_evals: 10
170 | use_moving_averages: false
171 | }
172 | eval_input_reader {
173 | label_map_path: "/content/data_insulators/data/object-detection.pbtxt"
174 | shuffle: false
175 | num_readers: 1
176 | tf_record_input_reader {
177 | input_path: "/content/data_insulators/data/test.record"
178 | }
179 | }
180 |
--------------------------------------------------------------------------------
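Note: the pipeline.config above is a text-format TrainEvalPipelineConfig protobuf from the TF Object Detection API. A minimal sketch of parsing it to double-check the training settings — assuming the tensorflow/models research packages are on PYTHONPATH; the printed values should match the config above:

# Minimal sketch: parse pipeline.config with the TF Object Detection API protos.
# Assumes tensorflow/models research (and slim) are on PYTHONPATH.
from google.protobuf import text_format
from object_detection.protos import pipeline_pb2

pipeline = pipeline_pb2.TrainEvalPipelineConfig()
with open('./resources_tf/fine_tuned_model/pipeline.config', 'r') as f:
    text_format.Merge(f.read(), pipeline)

print(pipeline.model.ssd.num_classes)        # 1
print(pipeline.train_config.batch_size)      # 24
print(pipeline.train_config.num_steps)       # 200000

--------------------------------------------------------------------------------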
/resources_tf/fine_tuned_model/saved_model/saved_model.pb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_tf/fine_tuned_model/saved_model/saved_model.pb
--------------------------------------------------------------------------------
/resources_trt/frozen_graph_to_trt.py:
--------------------------------------------------------------------------------
1 | # Import TensorFlow and TensorRT
2 | # from https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html
3 | import tensorflow as tf
4 | import tensorflow.contrib.tensorrt as trt
5 | # TF-TRT frozen graph conversion workflow:
6 |
7 | frozen_graph_path = './resources_tf/fine_tuned_model/frozen_inference_graph.pb'
8 | output_names = ['num_detections','detection_classes','detection_boxes','detection_scores']
9 |
10 | graph = tf.Graph()
11 | with graph.as_default():
12 | with tf.Session() as sess:
13 | # First deserialize your frozen graph:
14 | with tf.gfile.GFile(frozen_graph_path, 'rb') as f:
15 | graph_def = tf.GraphDef()
16 | graph_def.ParseFromString(f.read())
17 | # Now you can create a TensorRT inference graph from your
18 | # frozen graph:
19 | trt_graph = trt.create_inference_graph(
20 | input_graph_def=graph_def,
21 | outputs=output_names,
22 | max_batch_size=1,
23 | max_workspace_size_bytes=1<<25,
24 | precision_mode='FP16')
25 | # Serialize the converted graph so it can be imported and run later:
26 | with open('/home/jnano/tf/prog10/resources_trt/trt_graph.pb', 'wb') as f:
27 | f.write(trt_graph.SerializeToString())
28 |
--------------------------------------------------------------------------------
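Note: once trt_graph.pb has been written by the script above, it can be loaded and run like any other frozen graph. A minimal inference sketch, assuming the standard TF Object Detection API tensor names ('image_tensor', 'num_detections', ...) survive the conversion — verify them against the actual graph before relying on this:

# Minimal sketch: run the converted TF-TRT graph on one image (TF 1.x).
# Tensor names follow the TF Object Detection API convention; verify
# them against your own graph.
import numpy as np
import tensorflow as tf

trt_graph_def = tf.GraphDef()
with tf.gfile.GFile('./resources_trt/trt_graph.pb', 'rb') as f:
    trt_graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(trt_graph_def, name='')
    image_tensor = graph.get_tensor_by_name('image_tensor:0')
    fetches = [graph.get_tensor_by_name(n + ':0') for n in
               ['num_detections', 'detection_classes',
                'detection_boxes', 'detection_scores']]
    with tf.Session(graph=graph) as sess:
        # dummy 300x300 RGB frame; replace with a real image batch
        image = np.zeros((1, 300, 300, 3), dtype=np.uint8)
        num, classes, boxes, scores = sess.run(
            fetches, feed_dict={image_tensor: image})
        print('detected', int(num[0]))

--------------------------------------------------------------------------------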
/resources_trt/pb_to_pb_trt_transfere.py:
--------------------------------------------------------------------------------
1 | import tensorflow.contrib.tensorrt as trt
2 | import tensorflow as tf
3 | import os
4 | #from tf_trt_models.detection import build_detection_graph
5 |
6 | ################################################################################
7 | #requires the object detection libraries and code downloaded from
8 | #https://github.com/tensorflow/models and stored in the /home/jnano/ directory
9 | ################################################################################
10 | #sys.path.append('/home/jnano/tf/models/research/')
11 | #sys.path.append('/home/jnano/tf/models/research/slim/')
12 | config_path = './resources_tf/fine_tuned_model/pipeline.config'
13 | checkpoint_path = './resources_tf/fine_tuned_model/model.ckpt'
14 | frozen_graph_path = './resources_tf/fine_tuned_model'
15 | output_names = ['num_detections','detection_classes','detection_boxes','detection_scores']
16 | """
17 | frozen_graph, input_names, output_names = build_detection_graph(
18 | config=config_path, # path to model’s pipeline.config file
19 | checkpoint=checkpoint_path, # path to model.ckpt file
20 | score_threshold=0.3,
21 | #iou_threshold=0.5,
22 | batch_size=1
23 | )
24 | """
25 | # init graph
26 | # read frozen graph from file
27 | frozen_graph = tf.GraphDef()
28 | with open(os.path.join(frozen_graph_path, 'frozen_inference_graph.pb'), 'rb') as f:
29 | frozen_graph.ParseFromString(f.read())
30 |
31 | """
32 | link https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html
33 |
34 | def create_inference_graph(input_graph_def,
35 | outputs,
36 | max_batch_size=1,
37 | max_workspace_size_bytes=2 << 20,
38 | precision_mode="fp32",
39 | minimum_segment_size=3,
40 | is_dynamic_op=False,
41 | maximum_cached_engines=1,
42 | cached_engine_batch_sizes=None,
43 | use_calibration=True,
44 | rewriter_config=None,
45 | input_saved_model_dir=None,
46 | input_saved_model_tags=None,
47 | output_saved_model_dir=None,
48 | session_config=None):
49 |
50 | Where:
51 | -input_graph_def
52 | This parameter is the GraphDef object that contains the model to be transformed.
53 | -outputs
54 | This parameter lists the output nodes in the graph. Tensors which are not marked
55 | as outputs are considered to be transient values that may be optimized away by
56 | the builder.
57 | -max_batch_size
58 | This parameter is the maximum batch size for which TensorRT will optimize.
59 | At runtime, a smaller batch size may be chosen, but a larger batch size is
60 | not supported.
61 | -max_workspace_size_bytes
62 | TensorRT operators often require temporary workspace. This parameter limits the
63 | maximum size that any layer in the network can use. If insufficient scratch is
64 | provided, it is possible that TensorRT may not be able to find an implementation
65 | for a given layer.
66 | -precision_mode
67 | TF-TRT only supports models trained in FP32, in other words all the weights of
68 | the model should be stored in FP32 precision. That being said, TensorRT can
69 | convert tensors and weights to lower precisions during the optimization.
70 | The precision_mode parameter sets the precision mode; which can be one of
71 | fp32, fp16, or int8. Precision lower than FP32, meaning FP16 and INT8, would
72 | improve the performance of inference. The FP16 mode uses Tensor Cores or half
73 | precision hardware instructions, if possible. The INT8 precision mode uses
74 | integer hardware instructions.
75 | -minimum_segment_size
76 | This parameter determines the minimum number of TensorFlow nodes in a TensorRT
77 | engine, which means the TensorFlow subgraphs that have fewer nodes than this
78 | number will not be converted to TensorRT. Therefore, in general smaller numbers
79 | such as 5 are preferred. This can also be used to change the minimum number of
80 | nodes in the optimized INT8 engines to change the final optimized graph to
81 | fine tune result accuracy.
82 | -is_dynamic_op
83 | If this parameter is set to True, the conversion and building the TensorRT
84 | engines will happen during the runtime, which would be necessary if there are
85 | tensors in the graphs with unknown initial shapes or dynamic shapes. For more
86 | information, see index.html#static-dynamic-mode.
87 | Note: Conversion during runtime may increase the latency, depending on your
88 | model and how you use it.
89 | -maximum_cached_engines
90 | In dynamic mode, this sets the maximum number of cached TensorRT engines per
91 | TRTEngineOp. For more information, see index.html#static-dynamic-mode.
92 | -cached_engine_batch_sizes
93 | The list of batch sizes used to create cached engines, only used when
94 | is_dynamic_op is True. The length of the list should be smaller than
95 | maximum_cached_engines, and the dynamic TensorRT op will use this list to
96 | determine the batch sizes of the cached engines, instead of making the decision
97 | while in progress. This is useful when we know the most common batch size(s)
98 | the application is going to generate.
99 | -cached_engine_batches
100 | The batch sizes used to pre-create cached engines.
101 | -use_calibration
102 | This argument is ignored if precision_mode is not INT8.
103 | If set to True, a calibration graph will be created to calibrate the missing
104 | ranges. The calibration graph must be converted to an inference graph using
105 | calib_graph_to_infer_graph() after running calibration.
106 | If set to False, quantization nodes will be expected for every tensor in the
107 | graph (excluding those which will be fused). If a range is missing, an error will occur.
108 | Note: Accuracy may be negatively affected if there is a mismatch between which
109 | tensors TensorRT quantizes and which tensors were trained with fake quantization.
110 | -rewriter_config
111 | A RewriterConfig proto to append the TensorRTOptimizer to. If None, it will
112 | create one with default settings.
113 | -input_saved_model_dir
114 | The directory to load the SavedModel containing the input graph to transform.
115 | Used only when input_graph_def is None.
116 | -input_saved_model_tags
117 | A list of tags used to identify the MetaGraphDef of the SavedModel to load.
118 | -output_saved_model_dir
119 | If not None, construct a SavedModel using the returned GraphDef and save it to
120 | the specified directory. This option only works when the input graph is loaded
121 | from a SavedModel, in other words, when input_saved_model_dir is specified and
122 | input_graph_def is None.
123 | -session_config
124 | The ConfigProto used to create a Session. If not specified, a default ConfigProto
125 | will be used.
126 |
127 | Returns:
128 | New GraphDef with TRTEngineOps placed in graph replacing subgraphs.
129 | Raises:
130 | ValueError: If the provided precision mode is invalid.
131 | RuntimeError: If the returned status message is malformed.
132 | """
133 |
134 | trt_graph = trt.create_inference_graph(
135 | input_graph_def=frozen_graph,
136 | outputs=output_names,
137 | max_batch_size=1,
138 | max_workspace_size_bytes=1<<25,
139 | precision_mode='FP16',
140 | minimum_segment_size=5)
141 |
142 | with open('/home/jnano/tf/prog10/resources_trt/trt_graph.pb', 'wb') as f:
143 | f.write(trt_graph.SerializeToString())
144 |
--------------------------------------------------------------------------------
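Note: the use_calibration notes quoted above describe the TF 1.x contrib INT8 flow. A minimal sketch of that flow under the same API; the calibration step is only outlined, and the trt_graph_int8.pb output path is illustrative:

# Minimal sketch of the INT8 calibration flow described above
# (TF 1.x tf.contrib.tensorrt only; calibration inputs are placeholders).
import tensorflow as tf
import tensorflow.contrib.tensorrt as trt

frozen_graph = tf.GraphDef()
with open('./resources_tf/fine_tuned_model/frozen_inference_graph.pb', 'rb') as f:
    frozen_graph.ParseFromString(f.read())
output_names = ['num_detections', 'detection_classes',
                'detection_boxes', 'detection_scores']

# Step 1: build a calibration graph instead of a final engine.
calib_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 25,
    precision_mode='INT8')

# Step 2: import calib_graph and run representative images through it in a
# Session so TensorRT can record activation ranges (omitted here).

# Step 3: convert the calibrated graph into the final INT8 inference graph.
int8_graph = trt.calib_graph_to_infer_graph(calib_graph)
with open('./resources_trt/trt_graph_int8.pb', 'wb') as f:
    f.write(int8_graph.SerializeToString())

--------------------------------------------------------------------------------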
/resources_trt/read.me:
--------------------------------------------------------------------------------
1 | #readme file: preparing a TensorRT model from a tensorflow frozen graph
2 |
3 | #article here
4 | https://www.dlology.com/blog/how-to-run-tensorflow-object-detection-model-on-jetson-nano/
5 |
6 | #first, clone the tf_trt_models repo from the official nvidia github
7 | cd ~
8 | git clone --recursive https://github.com/NVIDIA-Jetson/tf_trt_models.git
9 | cd tf_trt_models
10 | ./install.sh python3
11 | #this step also installs the tensorflow models into the tf_trt_models directory
12 |
13 |
14 | #create the resources_trt directory
15 | #write a program to transfer the tensorflow pb to a trt pb
16 |
17 | #link for converting tensorflow to tensorrt
18 | https://github.com/NVIDIA-AI-IOT/tf_to_trt_image_classification
19 |
--------------------------------------------------------------------------------
/resources_trt/trt_graph.pb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/resources_trt/trt_graph.pb
--------------------------------------------------------------------------------
/result_output_detect_tf.out:
--------------------------------------------------------------------------------
1 | python3 detect_tf.py
2 | 2019-05-25 20:36:50.318603: W tensorflow/core/platform/profile_utils/cpu_utils.cc:98] Failed to find bogomips in /proc/cpuinfo; cannot determine CPU frequency
3 | 2019-05-25 20:36:50.319259: I tensorflow/compiler/xla/service/service.cc:161] XLA service 0x2c88cd40 executing computations on platform Host. Devices:
4 | 2019-05-25 20:36:50.319322: I tensorflow/compiler/xla/service/service.cc:168] StreamExecutor device (0): ,
5 | 2019-05-25 20:36:50.431621: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:965] ARM64 does not support NUMA - returning NUMA node zero
6 | 2019-05-25 20:36:50.432253: I tensorflow/compiler/xla/service/service.cc:161] XLA service 0x242e08f0 executing computations on platform CUDA. Devices:
7 | 2019-05-25 20:36:50.432329: I tensorflow/compiler/xla/service/service.cc:168] StreamExecutor device (0): NVIDIA Tegra X1, Compute Capability 5.3
8 | 2019-05-25 20:36:50.432750: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 0 with properties:
9 | name: NVIDIA Tegra X1 major: 5 minor: 3 memoryClockRate(GHz): 0.9216
10 | pciBusID: 0000:00:00.0
11 | totalMemory: 3.87GiB freeMemory: 1.51GiB
12 | 2019-05-25 20:36:50.432829: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1512] Adding visible gpu devices: 0
13 | 2019-05-25 20:36:51.634978: I tensorflow/core/common_runtime/gpu/gpu_device.cc:984] Device interconnect StreamExecutor with strength 1 edge matrix:
14 | 2019-05-25 20:36:51.635065: I tensorflow/core/common_runtime/gpu/gpu_device.cc:990] 0
15 | 2019-05-25 20:36:51.635104: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1003] 0: N
16 | 2019-05-25 20:36:51.635286: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 1147 MB memory) -> physical GPU (device: 0, name: NVIDIA Tegra X1, pci bus id: 0000:00:00.0, compute capability: 5.3)
17 | 6.787420272827148
18 | 2019-05-25 20:37:06.781820: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.05GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
19 | 2019-05-25 20:37:07.483913: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.07GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
20 | 2019-05-25 20:37:07.938149: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.13GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
21 | 2019-05-25 20:37:08.132305: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.13GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
22 | 2019-05-25 20:37:08.185010: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.26GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
23 | time of inference:18.9060161113739
24 | detected 5
25 | time of inference:0.32711172103881836
26 | detected 2
27 | time of inference:0.11663103103637695
28 | detected 2
29 | time of inference:0.10812783241271973
30 | detected 2
31 | time of inference:0.10611200332641602
32 | detected 2
33 | thats all folks
34 |
--------------------------------------------------------------------------------
/result_output_detect_trt.out:
--------------------------------------------------------------------------------
1 | python3 detect_trt.py
2 | 2019-05-25 20:23:16.170712: W tensorflow/core/platform/profile_utils/cpu_utils.cc:98] Failed to find bogomips in /proc/cpuinfo; cannot determine CPU frequency
3 | 2019-05-25 20:23:16.171777: I tensorflow/compiler/xla/service/service.cc:161] XLA service 0x1902fce0 executing computations on platform Host. Devices:
4 | 2019-05-25 20:23:16.172430: I tensorflow/compiler/xla/service/service.cc:168] StreamExecutor device (0): ,
5 | 2019-05-25 20:23:16.283447: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:965] ARM64 does not support NUMA - returning NUMA node zero
6 | 2019-05-25 20:23:16.283737: I tensorflow/compiler/xla/service/service.cc:161] XLA service 0x14c17c70 executing computations on platform CUDA. Devices:
7 | 2019-05-25 20:23:16.283795: I tensorflow/compiler/xla/service/service.cc:168] StreamExecutor device (0): NVIDIA Tegra X1, Compute Capability 5.3
8 | 2019-05-25 20:23:16.284193: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 0 with properties:
9 | name: NVIDIA Tegra X1 major: 5 minor: 3 memoryClockRate(GHz): 0.9216
10 | pciBusID: 0000:00:00.0
11 | totalMemory: 3.87GiB freeMemory: 1.41GiB
12 | 2019-05-25 20:23:16.284264: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1512] Adding visible gpu devices: 0
13 | 2019-05-25 20:23:21.062719: I tensorflow/core/common_runtime/gpu/gpu_device.cc:984] Device interconnect StreamExecutor with strength 1 edge matrix:
14 | 2019-05-25 20:23:21.062799: I tensorflow/core/common_runtime/gpu/gpu_device.cc:990] 0
15 | 2019-05-25 20:23:21.062836: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1003] 0: N
16 | 2019-05-25 20:23:21.063019: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 721 MB memory) -> physical GPU (device: 0, name: NVIDIA Tegra X1, pci bus id: 0000:00:00.0, compute capability: 5.3)
17 | time of loading:199.7234411239624
18 | 2019-05-25 20:24:02.881493: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.02GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
19 | 2019-05-25 20:24:02.897610: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.02GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
20 | 2019-05-25 20:24:02.916653: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.05GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
21 | 2019-05-25 20:24:02.933499: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.04GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
22 | 2019-05-25 20:24:02.950453: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.07GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
23 | 2019-05-25 20:24:02.968622: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.06GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
24 | 2019-05-25 20:24:02.990567: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.13GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
25 | 2019-05-25 20:24:03.110873: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.13GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
26 | 2019-05-25 20:24:03.388728: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.26GiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
27 | 2019-05-25 20:24:04.083007: W tensorflow/core/common_runtime/bfc_allocator.cc:211] Allocator (GPU_0_bfc) ran out of memory trying to allocate 579.25MiB. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
28 | time of inference:8.978713750839233
29 | detected 2
30 | time of inference:0.21228814125061035
31 | detected 2
32 | time of inference:0.12187004089355469
33 | detected 2
34 | time of inference:0.09182429313659668
35 | detected 3
36 | time of inference:0.1100008487701416
37 | detected 2
38 |
--------------------------------------------------------------------------------
/test/images/85.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/85.jpg
--------------------------------------------------------------------------------
/test/images/86.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/86.jpg
--------------------------------------------------------------------------------
/test/images/87.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/87.jpg
--------------------------------------------------------------------------------
/test/images/88.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/88.jpg
--------------------------------------------------------------------------------
/test/images/89.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/89.jpg
--------------------------------------------------------------------------------
/test/images/ddeteccted_result.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/ddeteccted_result.png
--------------------------------------------------------------------------------
/test/images/result.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/toborobot/JetsonNanoInsulatorDetection/33213261a40de098b9ec5f4507e4cc47ed079c26/test/images/result.png
--------------------------------------------------------------------------------