├── .gitignore
├── COPYING
├── README.md
├── centroidnet
│   ├── __init__.py
│   ├── backbones
│   │   ├── LICENSE
│   │   ├── __init__.py
│   │   └── unet.py
│   ├── centroidnet_core.py
│   └── dataloaders.py
├── config.py
├── dataset
│   ├── LICENSE.txt
│   ├── training
│   │   ├── PotatoPlant0.png
│   │   ├── PotatoPlant0.xml
│   │   ├── PotatoPlant1024.png
│   │   ├── PotatoPlant1024.xml
│   │   ├── PotatoPlant1187.png
│   │   ├── PotatoPlant1187.xml
│   │   ├── PotatoPlant1321.png
│   │   ├── PotatoPlant1321.xml
│   │   ├── PotatoPlant417.png
│   │   └── PotatoPlant417.xml
│   └── validation
│       ├── PotatoPlant143.png
│       ├── PotatoPlant143.xml
│       ├── PotatoPlant297.png
│       ├── PotatoPlant297.xml
│       ├── PotatoPlant552.png
│       └── PotatoPlant552.xml
├── misc
│   └── create_dataset.py
├── predict.py
└── train.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Compiled source #
2 | ###################
3 | *.com
4 | *.class
5 | *.dll
6 | *.exe
7 | *.o
8 | *.so
9 | *.pyc
10 |
11 | # Packages #
12 | ############
13 | *.7z
14 | *.dmg
15 | *.gz
16 | *.iso
17 | *.rar
18 | *.tar
19 | *.zip
20 |
21 | # Logs and databases #
22 | ######################
23 | *.log
24 | *.sqlite
25 |
26 | # OS generated files #
27 | ######################
28 | .DS_Store
29 | ehthumbs.db
30 | Icon
31 | Thumbs.db
32 | .tmtags
33 | .idea
34 | **/.idea
35 | tags
36 | vendor.tags
37 | tmtagsHistory
38 | *.sublime-project
39 | *.sublime-workspace
40 | .bundle
41 |
42 | # Package files #
43 | #################
44 | build
45 | centroidnet.egg-info
46 | dist
47 | centroidnet/**/__pycache__
48 |
49 | # clion files #
50 | cmake-build-*
51 |
52 | # other stuff #
53 | .RData
54 | data/
--------------------------------------------------------------------------------
/COPYING:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 | <one line to give the program's name and a brief idea of what it does.>
635 | Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program. If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 | <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/philosophy/why-not-lgpl.html>.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Getting started with OpenCentroidNet
2 | CentroidNet is a hybrid convolutional neural network for joint object localization and counting.
3 |
4 | 1) run **create_dataset.py** to generate a synthetic dataset of your liking.
5 | 2) run **train.py** to train a model using this generated dataset.
6 | 3) run **predict.py** to predict a single image in production.
7 | 4) adjust **config.py** to change hyperparameters.
8 |
9 | # Generated data files and folders
10 | ## The input images and annotation data
11 | **./data/dataset/** contains the images generated by **create_dataset.py** and the annotations in **train.csv** and **validation.csv**.\
12 | Replace this data with your own set.
13 |
14 | ## Aerial Potato Dataset
15 | The images from the original paper can be found in **./dataset/**.
16 |
17 | ## The model
18 | The weights of the CentroidNet model are stored in **./data/CentroidNet.pth**
19 |
20 | ## The input tensors to the model
21 | **./data/validation_result/_inputs.npy** contains the normalized input tensor.\
22 | **./data/validation_result/_targets.npy** contains the normalized target tensor.
23 |
24 | ## The output tensors of the model
25 | **./data/validation_result/_vectors.npy** contains the normalized 2-d voting vectors (use this to see the quality of the training).\
26 | **./data/validation_result/_votes.npy** contains the voting space (use this to tune Config.centroid_threshold and Config.nm_size).\
27 | **./data/validation_result/_centroids.npy** contains a value of one for each centroid (final result).\
28 | **./data/validation_result/_class_ids.npy** contains the class ids for every pixel.\
29 | **./data/validation_result/_class_probs.npy** contains the class probability for every pixel.\
30 | **./data/validation_result/validation.txt** contains the final centroid coordinates and class info.
31 |
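These outputs are plain NumPy arrays, so they are easy to inspect. A minimal sketch (the paths are the ones listed above; depending on your run the file names may carry an image-specific prefix before the underscore):

```python
import numpy as np

votes = np.load("./data/validation_result/_votes.npy")          # voting space
centroids = np.load("./data/validation_result/_centroids.npy")  # 1 at every detected centroid
print(votes.shape, votes.max())
print(np.argwhere(centroids > 0))  # centroid pixel coordinates
```
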
32 | # Citing OpenCentroidNet
33 |
34 | If this code benefits your research, please cite:
35 |
36 | @inproceedings{dijkstra2018centroidnet,\
37 | title={CentroidNet: A deep neural network for joint object localization and counting},\
38 | author={Dijkstra, Klaas and van de Loosdrecht, Jaap and Schomaker, L.R.B. and Wiering, Marco A.},\
39 | booktitle={Joint European Conference on Machine Learning and Knowledge Discovery in Databases},\
40 | pages={585--601},\
41 | year={2018},\
42 | organization={Springer}\
43 | }
44 |
45 | # Copyright notice
46 | Copyright (C) 2019 Klaas Dijkstra
47 |
48 | OpenCentroidNet is free software: you can redistribute it and/or modify
49 | it under the terms of the GNU General Public License as published by
50 | the Free Software Foundation, either version 3 of the License, or
51 | (at your option) any later version.
52 |
53 | This program is distributed in the hope that it will be useful,
54 | but WITHOUT ANY WARRANTY; without even the implied warranty of
55 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
56 | GNU General Public License for more details.
57 |
58 | You should have received a copy of the GNU General Public License
59 | along with this program. If not, see <https://www.gnu.org/licenses/>.
60 |
61 | The Aerial Potato Dataset is licensed under CC BY-NC-SA 4.0.
62 |
63 | # p.s.
64 | For a numpy viewer go to: https://github.com/ArendJanKramer/Numpyviewer.
65 |
66 |
67 |
68 |
69 |
--------------------------------------------------------------------------------
/centroidnet/__init__.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | from .dataloaders import *
19 | from .backbones import *
20 | from .centroidnet_core import *
21 |
--------------------------------------------------------------------------------
/centroidnet/backbones/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2017 Jackson Huang
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/centroidnet/backbones/__init__.py:
--------------------------------------------------------------------------------
1 | from .unet import UNet
2 |
--------------------------------------------------------------------------------
/centroidnet/backbones/unet.py:
--------------------------------------------------------------------------------
1 | #source: https://github.com/jaxony/unet-pytorch/blob/master/model.py
2 |
3 | import torch
4 | import torch.nn as nn
5 | import torch.nn.functional as F
6 | from torch.nn import init
7 | import numpy as np
8 |
9 |
10 | def conv3x3(in_channels, out_channels, stride=1,
11 | padding=1, bias=True, groups=1):
12 | return nn.Conv2d(
13 | in_channels,
14 | out_channels,
15 | kernel_size=3,
16 | stride=stride,
17 | padding=padding,
18 | bias=bias,
19 | groups=groups)
20 |
21 |
22 | def upconv2x2(in_channels, out_channels, mode='transpose'):
23 | if mode == 'transpose':
24 | return nn.ConvTranspose2d(
25 | in_channels,
26 | out_channels,
27 | kernel_size=2,
28 | stride=2)
29 | else:
30 | # bilinear upsampling keeps the number of channels, so a
31 | # 1x1 convolution maps in_channels to out_channels afterwards
32 | return nn.Sequential(
33 | nn.Upsample(mode='bilinear', scale_factor=2),
34 | conv1x1(in_channels, out_channels))
35 |
36 |
37 | def conv1x1(in_channels, out_channels, groups=1):
38 | return nn.Conv2d(
39 | in_channels,
40 | out_channels,
41 | kernel_size=1,
42 | groups=groups,
43 | stride=1)
44 |
45 |
46 | class DownConv(nn.Module):
47 | """
48 | A helper Module that performs 2 convolutions and 1 MaxPool.
49 | A ReLU activation follows each convolution.
50 | """
51 |
52 | def __init__(self, in_channels, out_channels, pooling=True):
53 | super(DownConv, self).__init__()
54 |
55 | self.in_channels = in_channels
56 | self.out_channels = out_channels
57 | self.pooling = pooling
58 |
59 | self.conv1 = conv3x3(self.in_channels, self.out_channels)
60 | self.conv2 = conv3x3(self.out_channels, self.out_channels)
61 |
62 | if self.pooling:
63 | self.pool = nn.MaxPool2d(kernel_size=2, stride=2)
64 |
65 | def forward(self, x):
66 | x = F.relu(self.conv1(x))
67 | x = F.relu(self.conv2(x))
68 | before_pool = x
69 | if self.pooling:
70 | x = self.pool(x)
71 | return x, before_pool
72 |
73 |
74 | class UpConv(nn.Module):
75 | """
76 | A helper Module that performs 2 convolutions and 1 UpConvolution.
77 | A ReLU activation follows each convolution.
78 | """
79 |
80 | def __init__(self, in_channels, out_channels,
81 | merge_mode='concat', up_mode='transpose'):
82 | super(UpConv, self).__init__()
83 |
84 | self.in_channels = in_channels
85 | self.out_channels = out_channels
86 | self.merge_mode = merge_mode
87 | self.up_mode = up_mode
88 |
89 | self.upconv = upconv2x2(self.in_channels, self.out_channels,
90 | mode=self.up_mode)
91 |
92 | if self.merge_mode == 'concat':
93 | self.conv1 = conv3x3(
94 | 2 * self.out_channels, self.out_channels)
95 | else:
96 | # with 'add' merging the number of input channels to conv1 stays the same
97 | self.conv1 = conv3x3(self.out_channels, self.out_channels)
98 | self.conv2 = conv3x3(self.out_channels, self.out_channels)
99 |
100 | def forward(self, from_down, from_up):
101 | """ Forward pass
102 | Arguments:
103 | from_down: tensor from the encoder pathway
104 | from_up: upconv'd tensor from the decoder pathway
105 | """
106 | from_up = self.upconv(from_up)
107 | if not np.array_equal(from_up.data.shape, from_down.data.shape):
108 | from_up = F.upsample(from_up, from_down.data.shape[2:], mode="bilinear")
109 | if self.merge_mode == 'concat':
110 | x = torch.cat((from_up, from_down), 1)
111 | else:
112 | x = from_up + from_down
113 | x = F.relu(self.conv1(x))
114 | x = F.relu(self.conv2(x))
115 | return x
116 |
117 |
118 | class UNet(nn.Module):
119 | """ `UNet` class is based on https://arxiv.org/abs/1505.04597
120 | The U-Net is a convolutional encoder-decoder neural network.
121 | Contextual spatial information (from the decoding,
122 | expansive pathway) about an input tensor is merged with
123 | information representing the localization of details
124 | (from the encoding, compressive pathway).
125 | Modifications to the original paper:
126 | (1) padding is used in 3x3 convolutions to prevent loss
127 | of border pixels
128 | (2) merging outputs does not require cropping due to (1)
129 | (3) residual connections can be used by specifying
130 | UNet(merge_mode='add')
131 | (4) if non-parametric upsampling is used in the decoder
132 | pathway (specified by upmode='upsample'), then an
133 | additional 1x1 2d convolution occurs after upsampling
134 | to reduce channel dimensionality by a factor of 2.
135 | With up_mode='transpose', this channel halving happens inside
136 | the transpose convolution itself.
137 | """
138 |
139 | def __init__(self, num_classes, in_channels=3, depth=5,
140 | start_filts=64, up_mode='transpose',
141 | merge_mode='concat'):
142 | """
143 | Arguments:
144 | in_channels: int, number of channels in the input tensor.
145 | Default is 3 for RGB images.
146 | depth: int, number of MaxPools in the U-Net.
147 | start_filts: int, number of convolutional filters for the
148 | first conv.
149 | up_mode: string, type of upconvolution. Choices: 'transpose'
150 | for transpose convolution or 'upsample' for bilinear
151 | upsampling.
152 | """
153 | super(UNet, self).__init__()
154 |
155 | if up_mode in ('transpose', 'upsample'):
156 | self.up_mode = up_mode
157 | else:
158 | raise ValueError("\"{}\" is not a valid mode for "
159 | "upsampling. Only \"transpose\" and "
160 | "\"upsample\" are allowed.".format(up_mode))
161 |
162 | if merge_mode in ('concat', 'add'):
163 | self.merge_mode = merge_mode
164 | else:
165 | raise ValueError("\"{}\" is not a valid mode for "
166 | "merging up and down paths. "
167 | "Only \"concat\" and "
168 | "\"add\" are allowed.".format(merge_mode))
169 |
170 | # NOTE: up_mode 'upsample' is incompatible with merge_mode 'add'
171 | if self.up_mode == 'upsample' and self.merge_mode == 'add':
172 | raise ValueError("up_mode \"upsample\" is incompatible "
173 | "with merge_mode \"add\" at the moment "
174 | "because it doesn't make sense to use "
175 | "nearest neighbour to reduce "
176 | "depth channels (by half).")
177 |
178 | self.num_classes = num_classes
179 | self.in_channels = in_channels
180 | self.start_filts = start_filts
181 | self.depth = depth
182 |
183 | self.down_convs = []
184 | self.up_convs = []
185 |
186 | # create the encoder pathway and add to a list
187 | for i in range(depth):
188 | ins = self.in_channels if i == 0 else outs
189 | outs = self.start_filts * (2 ** i)
190 | pooling = True if i < depth - 1 else False
191 |
192 | down_conv = DownConv(ins, outs, pooling=pooling)
193 | self.down_convs.append(down_conv)
194 |
195 | # create the decoder pathway and add to a list
196 | # - careful! decoding only requires depth-1 blocks
197 | for i in range(depth - 1):
198 | ins = outs
199 | outs = ins // 2
200 | up_conv = UpConv(ins, outs, up_mode=up_mode,
201 | merge_mode=merge_mode)
202 | self.up_convs.append(up_conv)
203 |
204 | self.conv_final = conv1x1(outs, self.num_classes)
205 |
206 | # add the list of modules to current module
207 | self.down_convs = nn.ModuleList(self.down_convs)
208 | self.up_convs = nn.ModuleList(self.up_convs)
209 |
210 | self.reset_params()
211 |
212 | @staticmethod
213 | def weight_init(m):
214 | if isinstance(m, nn.Conv2d):
215 | init.xavier_normal_(m.weight)
216 | init.constant_(m.bias, 0)
217 |
218 | def reset_params(self):
219 | for i, m in enumerate(self.modules()):
220 | self.weight_init(m)
221 |
222 | def forward(self, x):
223 | encoder_outs = []
224 |
225 | # encoder pathway, save outputs for merging
226 | for i, module in enumerate(self.down_convs):
227 | x, before_pool = module(x)
228 | encoder_outs.append(before_pool)
229 |
230 | for i, module in enumerate(self.up_convs):
231 | before_pool = encoder_outs[-(i + 2)]
232 | x = module(before_pool, x)
233 |
234 | # No softmax is used. This means you need to use
235 | # nn.CrossEntropyLoss in your training script,
236 | # as that loss already includes a (log-)softmax.
237 | x = self.conv_final(x)
238 |
239 | #x = F.tanh(x)
240 | return x
241 |
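# Minimal smoke test (illustrative only; the shapes below are arbitrary example
# values, not settings used by OpenCentroidNet). Because the 3x3 convolutions are
# padded, the output keeps the spatial size of the input.
if __name__ == "__main__":
    model = UNet(num_classes=2, in_channels=3, depth=5, start_filts=64)
    dummy = torch.randn(1, 3, 128, 128)  # one RGB image of 128x128 pixels
    out = model(dummy)
    print(out.shape)  # expected: torch.Size([1, 2, 128, 128])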
--------------------------------------------------------------------------------
/centroidnet/centroidnet_core.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | import torch
19 | import torch.nn
20 | import torch.autograd
21 | import torch.nn.modules.loss
22 | import torch.nn.functional as F
23 | from centroidnet.backbones import UNet
24 | import numpy as np
25 | from typing import List
26 | from skimage.draw import ellipse
27 | from skimage.feature import peak_local_max
28 |
29 | class CentroidNet(torch.nn.Module):
30 | def __init__(self, num_classes, num_channels):
31 | torch.nn.Module.__init__(self)
32 | self.backbone = UNet(num_classes=num_classes+2, in_channels=num_channels, depth=5, start_filts=64)
33 | self.num_classes = num_classes
34 | self.num_channels = num_channels
35 |
36 | def forward(self, x: torch.Tensor) -> torch.Tensor:
37 | return self.backbone(x)
38 |
39 | def __str__(self):
40 | return f"CentroidNet: {self.backbone}"
41 |
42 |
43 | class CentroidLoss(torch.nn.Module):
44 | def __init__(self):
45 | torch.nn.Module.__init__(self)
46 | self.loss = 0
47 |
48 | def forward(self, result, target):
49 | loss = F.mse_loss(result, target, size_average=True, reduce=True)
50 | self.loss = loss.item()
51 | return loss
52 |
53 | def __str__(self):
54 | return f"{self.loss}"
55 |
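# Usage sketch (illustrative shapes only; real values come from config.py and the
# data loader). The backbone predicts num_classes + 2 channels: two voting-vector
# channels followed by the per-class logits.
#
#   model = CentroidNet(num_classes=2, num_channels=3)
#   prediction = model(torch.randn(1, 3, 128, 128))  # -> shape (1, num_classes + 2, 128, 128)
#   loss = CentroidLoss()(prediction, torch.zeros_like(prediction))
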
56 | def encode(coords, image_height: int, image_width: int, max_dist: int, num_classes: int):
57 | y_coords, x_coords, _, _, _, _, _ = np.transpose(coords)
58 |
59 | #Encode vectors
60 | target_vectors = calc_vector_distance(y_coords, x_coords, image_height, image_width, max_dist)
61 | if max_dist is not None:
62 | target_vectors /= max_dist
63 | target_vectors = np.transpose(target_vectors, [2, 0, 1])
64 |
65 | #Encode logits (bounding box is drawn as ellipses)
66 | target_logits = np.zeros((num_classes, image_height, image_width))
67 | target_logits[0] = 1
68 | for (y, x, ymin, ymax, xmin, xmax, id) in coords:
69 | ymin, ymax, xmin, xmax = min(ymin, ymax), max(ymin, ymax), min(xmin, xmax), max(xmin, xmax)
70 | rr, cc = ellipse(y, x, (ymax - ymin) // 2, (xmax - xmin) // 2, shape=(image_height, image_width))
71 | target_logits[id + 1][rr, cc] = 1
72 | target_logits[0][rr, cc] = 0
73 |
74 | target = np.concatenate((target_vectors, target_logits))
75 | return target
76 |
77 |
78 | def decode(input : np.ndarray, max_dist: int, binning: int, nm_size: int, centroid_threshold: int):
79 | _, image_height, image_width = input.shape
80 | centroid_vectors = input[0:2] * max_dist
81 | logits = input[2:]
82 |
83 | #Calculate class ids and class probabilities
84 | class_ids = np.expand_dims(np.argmax(logits, axis=0), axis=0)
85 | sum_logits = np.expand_dims(np.sum(logits, axis=0), axis=0)
86 | class_probs = np.expand_dims(np.max((logits / sum_logits), axis=0), axis=0)
87 | class_probs = np.clip(class_probs, 0, 1)
88 |
89 | # Calculate the centroid images
90 | votes = calc_vote_image(centroid_vectors, binning)
91 | votes_nm = peak_local_max(votes[0], min_distance=nm_size, threshold_abs=centroid_threshold, indices=False)
92 | votes_nm = np.expand_dims(votes_nm, axis=0)
93 |
94 | # Calculate list of centroid statistics
95 | coords = np.transpose(np.where(votes_nm[0] > 0))
96 | centroids = [[y * binning, x * binning, class_ids[0, y * binning, x * binning] - 1, class_probs[0, y * binning, x * binning]] for (y, x) in coords]
97 | return centroid_vectors, votes, class_ids, class_probs, votes_nm, centroids
98 |
99 |
100 | def calc_vector_distance(y_coords: List[int], x_coords: List[int], image_height: int, image_width: int, max_dist) -> np.ndarray:
101 | assert (len(y_coords) == len(x_coords)), "list of coordinates should be the same"
102 | assert (len(y_coords) > 0), "No centroids in source image"
103 |
104 | # Prepare datastructures
105 | shape = [image_height, image_width]
106 | image_coords = np.indices(shape)
107 | image_coords_planar = np.transpose(image_coords, [1, 2, 0])
108 |
109 | dist_cube = np.empty([len(y_coords), image_height, image_width])
110 | vec_cube = np.empty([len(y_coords), image_height, image_width, 2])
111 |
112 | # Create multichannel image with distances and vectors
113 | for (i, (y, x)) in enumerate(zip(y_coords, x_coords)):
114 | vec = np.array([y, x]) - image_coords_planar
115 | vec_cube[i] = vec
116 | dist = vec ** 2
117 | dist = np.sum(dist, axis=2)
118 | dist = np.sqrt(dist)
119 | dist_cube[i] = dist
120 |
121 | # Get the smallest centroid distance index
122 | dist_ctr_labels = np.argmin(dist_cube, axis=0)
123 |
124 | # Get the smallest distance vectors [h, w, yx]
125 | vec_ctr = vec_cube[dist_ctr_labels, image_coords[0], image_coords[1]]
126 |
127 | # Clip vectors
128 | if not max_dist is None:
129 | active = np.sqrt(np.sum(vec_ctr ** 2, axis=2)) > max_dist
130 | vec_ctr[active, :] = 0
131 |
132 | return vec_ctr
133 |
134 |
135 | def calc_vote_image(centroid_vectors: np.array, f) -> np.ndarray:
136 | channels, height, width = centroid_vectors.shape
137 |
138 |     size = np.array(np.array((height, width), dtype=float) * ((1/f), (1/f)), dtype=int)
139 | indices = np.indices((height, width), dtype=centroid_vectors.dtype)
140 |
141 | # Calculate absolute vectors
142 |     vectors = ((centroid_vectors + indices) * (1/f)).astype(int)
143 | nimage = np.zeros((size[0], size[1]))
144 |
145 | # Clip pixels
146 | logic = np.logical_and(np.logical_and(vectors[0] >= 0, vectors[1] >= 0), np.logical_and(vectors[0] < size[0], vectors[1] < size[1]))
147 | coords = vectors[:, logic]
148 |
149 | # Accumulate
150 | np.add.at(nimage, (coords[0], coords[1]), 1)
151 | return np.expand_dims(nimage, axis=0)
152 |
--------------------------------------------------------------------------------
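A minimal round-trip sketch of the encode/decode API above (not part of the repository; the coordinates and parameter values are illustrative, and it assumes the package is importable as `centroidnet` the way dataloaders.py and train.py use it, plus a scikit-image version whose peak_local_max still accepts indices=False):

    import numpy as np
    import centroidnet

    # Each annotation row is (y, x, ymin, ymax, xmin, xmax, class_id).
    coords = np.array([[ 40,  60,  30,  50,  50,  70, 0],
                       [120, 200, 110, 130, 190, 210, 1]])

    # encode() builds a (2 + num_classes, H, W) target: two vector channels plus one-hot logits.
    target = centroidnet.encode(coords, image_height=200, image_width=300,
                                max_dist=30, num_classes=3)

    # Decoding the ideal target should recover the annotated centroids.
    vectors, votes, class_ids, class_probs, votes_nm, centroids = centroidnet.decode(
        target, max_dist=30, binning=1, nm_size=3, centroid_threshold=10)
    print(centroids)  # [[y, x, class_id, probability], ...]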
/centroidnet/dataloaders.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | import os
19 | from torch.utils.data import Dataset
20 | import numpy as np
21 | import cv2
22 | import random
23 | import centroidnet
24 |
25 | class CentroidNetDataset(Dataset):
26 | """
27 | CentroidNetDataset Dataset
28 |     Load centroids from a CSV annotation file and apply vector-aware data augmentation
29 |
30 | Arguments:
31 |         filename: filename of the input data, one annotation per line: (image_file_name,xmin,xmax,ymin,ymax,class_id\n)
32 | crop (h, w): Random crop size
33 | transpose ((dim2, dim3)): List of random transposes to choose from
34 | stride ((dim2, dim3)): List of random strides to choose from
35 | """
36 | def convert_path(self, path):
37 | if os.path.isabs(path):
38 | return path
39 | else:
40 | return os.path.join(self.data_path, path)
41 |
42 | def load_and_convert_data(self, filename, max_dist=None):
43 | with open(filename) as f:
44 | lines = f.readlines()
45 | lines = [x.strip().split(",") for x in lines]
46 | self.count = len(lines)
47 | img_ctrs = {}
48 | for line in lines:
49 | fn, xmin, xmax, ymin, ymax, id = line
50 | xmin, xmax, ymin, ymax, id = int(xmin), int(xmax), int(ymin), int(ymax), int(id)
51 | x, y = (xmin + xmax) // 2, (ymin + ymax) // 2
52 | if not fn in img_ctrs:
53 | img_ctrs[fn] = list()
54 |
55 | self.num_classes = id+2 if id+2 > self.num_classes else self.num_classes #including background class
56 | img_ctrs[fn].append(np.array([y, x, ymin, ymax, xmin, xmax, id], dtype=int))
57 |
58 | for key in img_ctrs.keys():
59 | img_ctrs[key] = np.stack(img_ctrs[key])
60 | fn = self.convert_path(key)
61 | img = cv2.imread(fn)
62 |
63 | if img is None:
64 | raise Exception("Could not read {}".format(fn))
65 |
66 | crop = min(img.shape[0], img.shape[1], self.crop[0], self.crop[1])
67 | if crop != self.crop[0]:
68 | print(f"Warning: random crop adjusted to {[crop, crop]}")
69 | self.set_crop([crop, crop])
70 |
71 | target = centroidnet.encode(img_ctrs[key], img.shape[0], img.shape[1], max_dist, self.num_classes)
72 | img = (np.transpose(img, [2, 0, 1]).astype(np.float32) - self.sub) / self.div
73 | target = target.astype(np.float32)
74 |
75 | self.images.append(img)
76 | self.targets.append(target)
77 |
78 |
79 | def __init__(self, filename: str, crop=(256, 256), max_dist=100, repeat=1, sub=127, div=256, transpose=np.array([[0, 1], [1, 0]]), stride=np.array([[1, 1], [-1, -1], [-1, 1], [1, -1]]), data_path=None):
80 | self.count = 0
81 | if data_path is None:
82 | self.data_path = os.path.dirname(filename)
83 | else:
84 | self.data_path = data_path
85 |
86 | self.filename = filename
87 | self.images = list()
88 | self.targets = list()
89 |
90 | self.sub = sub
91 | self.div = div
92 | self.repeat = repeat
93 | self.set_crop(crop)
94 | self.set_repeat(repeat)
95 | self.set_transpose(transpose)
96 | self.set_stride(stride)
97 | self.num_classes = 0
98 | self.train()
99 | self.load_and_convert_data(filename, max_dist=max_dist)
100 |
101 | def eval(self):
102 | self.train_mode = False
103 |
104 | def train(self):
105 | self.train_mode = True
106 |
107 | def set_repeat(self, repeat):
108 |         if repeat < 1:
109 |             repeat = 1
110 |         self.repeat = repeat
111 |
112 | def set_crop(self, crop):
113 | self.crop = crop
114 |
115 | def set_transpose(self, transpose):
116 | if np.all(transpose == np.array([[0, 1]])):
117 | self.transpose = None
118 | else:
119 | self.transpose = transpose
120 |
121 | def set_stride(self, stride):
122 | if np.all(stride == np.array([[1, 1]])):
123 | self.stride = None
124 | else:
125 | self.stride = stride
126 |
127 | def adjust_vectors(self, img, transpose, stride):
128 | if not transpose is None:
129 | img2 = img.copy()
130 | img[0] = img2[transpose[0]]
131 | img[1] = img2[transpose[1]]
132 | if not stride is None:
133 | img2 = img.copy()
134 | img[0] = img2[0] * stride[0]
135 | img[1] = img2[1] * stride[1]
136 | return img
137 |
138 | def adjust_image(self, img, transpose, slice, crop, stride):
139 | if not transpose is None:
140 | img = np.transpose(img, (0, transpose[0] + 1, transpose[1] + 1))
141 | if not slice is None:
142 | img = img[:, slice[0]:slice[0] + crop[0], slice[1]:slice[1] + crop[1]]
143 | if not stride is None:
144 | img = img[:, ::stride[0], ::stride[1]]
145 | return img
146 |
147 | def get_target(self, img: np.array, transpose, slice, crop, stride):
148 | img[0:2] = self.adjust_vectors(img[0:2], transpose, stride)
149 | img = self.adjust_image(img, transpose, slice, crop, stride)
150 | return img
151 |
152 | def get_input(self, img: np.array, transpose, slice, crop, stride):
153 | img = self.adjust_image(img, transpose, slice, crop, stride)
154 | return img
155 |
156 | def __getitem__(self, index):
157 | index = index // self.repeat
158 | input, target = self.images[index], self.targets[index]
159 |
160 | if self.stride is None and self.transpose is None and self.crop is None or not self.train_mode:
161 | return input.astype(np.float32), target.astype(np.float32)
162 |
163 | if not self.transpose is None:
164 | transpose = random.choice(self.transpose)
165 | else:
166 | transpose = None
167 |
168 | if not self.stride is None:
169 | stride = random.choice(self.stride)
170 | else:
171 | stride = None
172 |
173 | if not self.crop is None:
174 | min = np.array([0, 0])
175 | if not transpose is None:
176 | max = np.array([input.shape[transpose[0] + 1], input.shape[transpose[1] + 1]], dtype=int) - self.crop
177 | else:
178 | max = np.array([input.shape[1] - self.crop[0], input.shape[2] - self.crop[1]])
179 | slice = [random.randint(mn, mx) for mn, mx in zip(min, max)]
180 | else:
181 | slice = None
182 |
183 | input = self.get_input(input, transpose, slice, self.crop, stride).astype(np.float32)
184 | target = self.get_target(target, transpose, slice, self.crop, stride).astype(np.float32)
185 |
186 | return input, target
187 |
188 | def __len__(self):
189 | return len(self.images) * self.repeat
190 |
--------------------------------------------------------------------------------
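A usage sketch for the dataset class above (illustrative; the CSV path and the example rows are made up, and the dataset is constructed through the package namespace the same way train.py does it):

    import centroidnet

    # Each CSV row describes one object: image_file_name,xmin,xmax,ymin,ymax,class_id
    # e.g.  train_0.png,45,85,25,65,0
    #       train_0.png,190,210,110,130,1
    dataset = centroidnet.CentroidNetDataset("data/dataset/training.csv",
                                             crop=(100, 100), max_dist=30,
                                             repeat=1, sub=127, div=256)
    image, target = dataset[0]  # float32 arrays: (channels, crop_h, crop_w) and (2 + num_classes, crop_h, crop_w)
    print(dataset.num_classes, image.shape, target.shape)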
/config.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | class Config:
19 | dev = "cuda:0"
20 |
21 | # Random crops to take from the image during training and data augmentation.
22 | # This value should be large enough compared to your image size (in this case the image size was 200 x 300 pixels).
23 | crop = [100, 100]
24 |
25 | # These should reflect the mean and range of the values in your dataset (or leave the defaults for 8-bit images).
26 | sub = 127
27 | div = 256
28 |
29 | # Maximum allowed voting vector length. All voting vectors are divided by this value during training.
30 | # This value should roughly be twice the max diameter of an object.
31 | max_dist = 30
32 |
33 | # Number of epochs to train. The model with the best validation loss is kept automatically.
34 | # This value should typically be large. Check vectors.npy to verify that the quality of the vectors is ok.
35 | epochs = 500
36 |
37 | # Batch size for training. Choose a batch size which maximizes GPU memory usage.
38 | batch_size = 20
39 |
40 | # Learning rate. Usually this value is sufficient.
41 | learn_rate = 0.001
42 |
43 | # Determines at what epoch interval validation occurs.
44 | validation_interval = 10
45 |
46 | # Number of input channels of the image. Default is RGB.
47 | num_channels = 3
48 |
49 | # Number of classes. This includes the background, so it should be one more than the number of classes in the training file.
50 | num_classes = 4
51 |
52 | # The amount of spatial binning to use (values could be: 1, 2, 4, etc.).
53 | # Increase binning to increase the robustness of the detection.
54 | # Decrease binning to increase spatial accuracy.
55 | binning = 1
56 |
57 | # The minimum distance between two centroids (values could be: 3, 7, 11, 15, 17, etc.).
58 | # Increase the value to suppress detections that are close together.
59 | # Decrease the value to detect objects that are very close together.
60 | nm_size = 3
61 |
62 | # How many votes constitute a centroid.
63 | # Increase the value to increase precision and decrease recall (fewer detections).
64 | # Decrease the value to increase recall and decrease precision (more detections).
65 | # Determine a correct value by reviewing votes.npy.
66 | centroid_threshold = 10
67 |
--------------------------------------------------------------------------------
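A short sketch of adapting these settings to a different dataset (all numbers below are assumptions for a hypothetical dataset with roughly 60-pixel objects, 1024 x 1024 images and 5 foreground classes; the Config fields are plain class attributes, so they can be overridden before training):

    from config import Config

    Config.dev = "cpu"                # or "cuda:0" when a GPU is available
    Config.crop = [256, 256]          # random training crop; must fit inside the images
    Config.max_dist = 120             # roughly twice the maximum object diameter
    Config.num_classes = 6            # 5 foreground classes + 1 background class
    Config.centroid_threshold = 10    # tune by inspecting votes.npy after a prediction run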
/dataset/LICENSE.txt:
--------------------------------------------------------------------------------
1 | Aerial Potato Dataset (c) by K. Dijkstra
2 |
3 | Aerial Potato Dataset is licensed under a
4 | Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
5 |
6 | You should have received a copy of the license along with this
7 | work. If not, see <http://creativecommons.org/licenses/by-nc-sa/4.0/>.
8 |
9 | If this dataset benefits your research, please cite:
10 |
11 | @inproceedings{dijkstra2018centroidnet,
12 | title={CentroidNet: A deep neural network for joint object localization and counting},
13 | author={Dijkstra, Klaas and van de Loosdrecht, Jaap and Schomaker, L.R.B. and Wiering, Marco A.},
14 | booktitle={Joint European Conference on Machine Learning and Knowledge Discovery in Databases},
15 | pages={585--601},
16 | year={2018},
17 | organization={Springer}
18 | }
19 |
20 | or
21 |
22 | @article{dijkstra2020centroidnetv2,
23 | title={CentroidNetV2: A Hybrid Deep Neural Network for Small-Object Segmentation and Counting},
24 | author={Dijkstra, Klaas and van de Loosdrecht, Jaap and Atsma, Waatze A. and Schomaker, L.R.B. and Wiering, Marco A.},
25 | journal={Neurocomputing},
26 | year={2020},
27 | publisher={Elsevier},
28 | doi={10.1016/j.neucom.2020.10.075}
29 | }
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/training/PotatoPlant0.png
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant1024.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/training/PotatoPlant1024.png
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant1024.xml:
--------------------------------------------------------------------------------
[Annotation XML for PotatoPlant1024.png; the XML markup was stripped during extraction. Recoverable header values: folder "Kiem Drone", filename PotatoPlant1024.png, path F:\Onedrive\Documenten\Kiem Drone\PotatoPlant1024.png, "Unknown", image size 1800 x 1500 with 3 channels, segmented flag 0. The remainder of the file held the per-plant bounding-box object entries, which are not recoverable from this dump.]
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant1187.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/training/PotatoPlant1187.png
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant1321.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/training/PotatoPlant1321.png
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant1321.xml:
--------------------------------------------------------------------------------
[Annotation XML for PotatoPlant1321.png; the XML markup was stripped during extraction. Recoverable header values: folder "Kiem Drone", filename PotatoPlant1321.png, path F:\Onedrive\Documenten\Kiem Drone\PotatoPlant1321.png, "Unknown", image size 1800 x 1500 with 3 channels, segmented flag 0. The remainder of the file held the per-plant bounding-box object entries, which are not recoverable from this dump.]
--------------------------------------------------------------------------------
/dataset/training/PotatoPlant417.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/training/PotatoPlant417.png
--------------------------------------------------------------------------------
/dataset/validation/PotatoPlant143.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/validation/PotatoPlant143.png
--------------------------------------------------------------------------------
/dataset/validation/PotatoPlant297.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/validation/PotatoPlant297.png
--------------------------------------------------------------------------------
/dataset/validation/PotatoPlant552.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kdijkstra13/OpenCentroidNet/a491477351f050f9713072dd1a218314f5580f7a/dataset/validation/PotatoPlant552.png
--------------------------------------------------------------------------------
/misc/create_dataset.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | import numpy as np
19 | from skimage.draw import circle
20 | import os
21 | import cv2
22 | import random
23 |
24 | def create_dataset(folder, file: str = "training.csv", n: int = 50, height: int = 200, width: int = 300, n_circles: int = 5, min_r: int = 15, max_r: int = 20, prefix="train", n_classes=3):
25 | boxes = []
26 | for i in range(n):
27 | out_fn = f"{prefix}_{i}.png"
28 | img = np.random.randint(10, 30, (3, height, width))
29 | for a in range(n_circles):
30 | if min_r < max_r:
31 | r = np.random.randint(min_r, max_r)
32 | else:
33 | r = min_r
34 | y = np.random.randint(r, height-r)
35 | x = np.random.randint(r, width-r)
36 | rr, cc = circle(y, x, r)
37 | d = (r - ((((rr - y) ** 2) + ((cc - x) ** 2)) ** 0.5)) * (150 / r) + 50
38 | id = a % min(n_classes, 3)
39 | img[id, rr, cc] = d
40 | xmin,xmax,ymin,ymax = x-r,x+r,y-r,y+r
41 | boxes.append(f"{out_fn},{xmin},{xmax},{ymin},{ymax},{id}\n")
42 | n_points = round(height * width * 0.2)
43 | y_rand = np.random.randint(0, height, n_points)
44 | x_rand = np.random.randint(0, width, n_points)
45 | img[:, y_rand, x_rand] = np.random.randint(0, 64, size=(3, n_points))
46 | img = np.transpose(img.astype(np.uint8), (1, 2, 0))
47 | cv2.imwrite(os.path.join(folder, out_fn), img)
48 | f = open(os.path.join(folder, file), "w")
49 | f.writelines(boxes)
50 |
51 |
52 | np.random.seed(42)  # the images are generated with np.random, so seed NumPy's RNG
53 | folder = os.path.join("..", "data", "dataset")
54 | train_folder = os.path.join(folder, "training")
55 | validation_folder = os.path.join(folder, "validation")
56 | os.makedirs(folder, exist_ok=True)
57 | create_dataset(folder, file="training.csv", prefix="train", n=50)
58 | create_dataset(folder, file="validation.csv", prefix="valid", n=10)
59 |
60 |
61 |
--------------------------------------------------------------------------------
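A small sketch that reads back one generated annotation row, to show that the CSV written here matches the layout the dataloader parses (illustrative; the path assumes the script has been run from misc/ so the files end up in data/dataset/):

    # One generated row per drawn circle: file,xmin,xmax,ymin,ymax,class_id
    with open("data/dataset/training.csv") as f:
        fn, xmin, xmax, ymin, ymax, class_id = f.readline().strip().split(",")
    print(fn, xmin, xmax, ymin, ymax, class_id)  # e.g. train_0.png 108 138 42 72 0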
/predict.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | import torch.utils.data
19 | from config import Config
20 |
21 | from centroidnet import *
22 |
23 | dev = "cuda:1"
24 |
25 | def create_centroidnet(num_channels, num_classes):
26 | model = centroidnet.CentroidNet(num_classes, num_channels)
27 | return model
28 |
29 |
30 | def load_model(filename, model):
31 | print(f"Load snapshot from: {os.path.abspath(filename)}")
32 | with open(filename, "rb") as f:
33 | state_dict = torch.load(f)
34 | model.load_state_dict(state_dict)
35 | return model
36 |
37 |
38 | def predict(image, model, max_dist, binning, nm_size, centroid_threshold, sub, div):
39 | # Prepare network input
40 | inputs = np.expand_dims(np.transpose(image, (2, 0, 1)), axis=0).astype(np.float32)
41 | inputs = torch.Tensor((inputs - sub) / div)
42 |
43 | # Upload to device
44 | inputs = inputs.to(Config.dev)
45 | model.to(Config.dev)
46 |
47 | # Do inference and decoding
48 | outputs = model(inputs)[0].cpu().detach().numpy()
49 | centroid_vectors, votes, class_ids, class_probs, votes_nm, centroids = centroidnet.decode(outputs, max_dist, binning, nm_size, centroid_threshold)
50 |
51 | # Only return the list of centroids
52 | return centroids
53 |
54 |
55 | def main():
56 | file = "data/dataset/valid_0.png"
57 |
58 | print(f"Load image: {file}")
59 | image = cv2.imread(file)
60 | assert image is not None, f"Image {os.path.abspath(file)} not found"
61 |
62 | print(f"Predicting.")
63 | model = create_centroidnet(num_channels=Config.num_channels, num_classes=Config.num_classes)
64 | model = load_model(os.path.join("data", "CentroidNet.pth"), model)
65 | centroids = predict(image, model, Config.max_dist, Config.binning, Config.nm_size, Config.centroid_threshold, Config.sub, Config.div)
66 | centroids = np.stack(centroids, axis=0)
67 | print(f"Found {centroids.shape[0]} centroids (y, x, class_id, probability):\n {centroids}")
68 |
69 |
70 | if __name__ == '__main__':
71 | main()
72 |
--------------------------------------------------------------------------------
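A follow-up sketch (not part of the repository) showing one way to visualise the returned centroids; it assumes the image and centroids variables from main() above, and the overlay path is made up:

    import cv2

    for y, x, class_id, prob in centroids:
        cv2.circle(image, (int(x), int(y)), 5, (0, 0, 255), 2)
        cv2.putText(image, f"{int(class_id)}:{prob:.2f}", (int(x) + 6, int(y)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.4, (0, 0, 255), 1)
    cv2.imwrite("data/prediction_overlay.png", image)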
/train.py:
--------------------------------------------------------------------------------
1 | # Copyright (C) 2019 Klaas Dijkstra
2 | #
3 | # This file is part of OpenCentroidNet.
4 | #
5 | # OpenCentroidNet is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU General Public License as published by
7 | # the Free Software Foundation, either version 3 of the License, or
8 | # (at your option) any later version.
9 |
10 | # OpenCentroidNet is distributed in the hope that it will be useful,
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 | # GNU General Public License for more details.
14 |
15 | # You should have received a copy of the GNU General Public License
16 | # along with OpenCentroidNet. If not, see <https://www.gnu.org/licenses/>.
17 |
18 | import torch.utils.data
19 | import torch.optim as optim
20 | from tqdm import tqdm
21 | import copy
22 | from config import Config
23 | from centroidnet import *
24 |
25 |
26 | def create_data_loader(training_file, validation_file, crop, max_dist, repeat, sub, div):
27 | training_set = centroidnet.CentroidNetDataset(training_file, crop=crop, max_dist=max_dist, repeat=repeat, sub=sub, div=div)
28 | validation_set = centroidnet.CentroidNetDataset(validation_file, crop=crop, max_dist=max_dist, sub=sub, div=div)
29 | return training_set, validation_set
30 |
31 |
32 | def create_centroidnet(num_channels, num_classes):
33 | model = centroidnet.CentroidNet(num_classes, num_channels)
34 | return model
35 |
36 |
37 | def create_centroidnet_loss():
38 | loss = centroidnet.CentroidLoss()
39 | return loss
40 |
41 |
42 | def validate(validation_loss, epoch, validation_set_loader, model, loss, validation_interval=10):
43 | if epoch % validation_interval == 0:
44 | with torch.no_grad():
45 | # Validate using validation data loader
46 | model.eval() # put in evaluation mode
47 | validation_loss = 0
48 | idx = 0
49 | for inputs, targets in validation_set_loader:
50 | inputs = inputs.to(Config.dev)
51 | targets = targets.to(Config.dev)
52 | outputs = model(inputs)
53 | mse = loss(outputs, targets)
54 | validation_loss += mse.item()
55 | idx += 1
56 | model.train() # put back in training mode
57 | return validation_loss / idx
58 | else:
59 | return validation_loss
60 |
61 |
62 | def save_model(filename, model):
63 | print(f"Save snapshot to: {os.path.abspath(filename)}")
64 | with open(filename, "wb") as f:
65 | torch.save(model.state_dict(), f)
66 |
67 |
68 | def load_model(filename, model):
69 | print(f"Load snapshot from: {os.path.abspath(filename)}")
70 | with open(filename, "rb") as f:
71 | state_dict = torch.load(f)
72 | model.load_state_dict(state_dict)
73 | return model
74 |
75 |
76 | def train(training_set, validation_set, model, loss, epochs, batch_size, learn_rate, validation_interval):
77 | print(f"Training {len(training_set)} images for {epochs} epochs with a batch size of {batch_size}.\n"
78 | f"Validate {len(validation_set)} images each {validation_interval} epochs and learning rate {learn_rate}.\n")
79 |
80 | #training_set.eval()
81 |
82 | best_model = copy.deepcopy(model)
83 | model.to(Config.dev)
84 |
85 | optimizer = optim.Adam(model.parameters(), lr=learn_rate)
86 | scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.1)
87 |
88 | training_set_loader = torch.utils.data.DataLoader(training_set, batch_size=batch_size, shuffle=True, num_workers=10, drop_last=True)
89 | validation_set_loader = torch.utils.data.DataLoader(validation_set, batch_size=min(len(validation_set), batch_size), shuffle=True, num_workers=10, drop_last=True)
90 |
91 | if len(training_set_loader) == 0:
92 |         raise Exception("The training dataset does not contain any samples. Is the minibatch larger than the number of samples?")
93 |     if len(validation_set_loader) == 0:
94 |         raise Exception("The validation dataset does not contain any samples. Is the minibatch larger than the number of samples?")
95 |
96 | bar = tqdm(range(1, epochs))
97 | validation_loss = 9999
98 | best_loss = 9999
99 | for epoch in bar:
100 | training_loss = 0
101 | idx = 0
102 | # Train one minibatch
103 | for (inputs, targets) in training_set_loader:
104 | inputs = inputs.to(Config.dev)
105 | targets = targets.to(Config.dev)
106 | optimizer.zero_grad()
107 | outputs = model(inputs)
108 | ls = loss(outputs, targets)
109 | ls.backward()
110 | optimizer.step()
111 | training_loss += ls.item()
112 | idx += 1
113 |
114 | scheduler.step(epoch)
115 |
116 | # Update progress bar
117 |         bar.set_description("Epoch {}/{} Loss(T): {:.5f} and Loss(V): {:.5f}".format(epoch, epochs, training_loss / idx, validation_loss))
118 | bar.refresh()
119 |
120 | # Validate and save
121 | validation_loss = validate(validation_loss, epoch, validation_set_loader, model, loss, validation_interval)
122 | if validation_loss < best_loss:
123 | print(f"Update model with loss {validation_loss}")
124 | best_loss = validation_loss
125 | best_model.load_state_dict(model.state_dict())
126 |
127 | return best_model
128 |
129 |
130 | def predict(data_set, model, loss, max_dist, binning, nm_size, centroid_threshold):
131 | print(f"Predicting {len(data_set)} files with loss {type(loss)}")
132 | with torch.no_grad():
133 | data_set.eval()
134 | model.eval()
135 | model.to(Config.dev)
136 | loss_value = 0
137 | idx = 0
138 | set_loader = torch.utils.data.DataLoader(data_set, batch_size=5, shuffle=False, num_workers=1, drop_last=False)
139 | result_images = []
140 | result_centroids = []
141 | for inputs, targets in tqdm(set_loader):
142 | inputs = inputs.to(Config.dev)
143 | targets = targets.to(Config.dev)
144 | outputs = model(inputs)
145 | ls = loss(outputs, targets)
146 | loss_value += ls.item()
147 | decoded = [centroidnet.decode(img, max_dist, binning, nm_size, centroid_threshold) for img in outputs.cpu().numpy()]
148 |
149 | # Add all numpy arrays to a list
150 | result_images.extend([{"inputs": i.cpu().numpy(),
151 | "targets": t.cpu().numpy(),
152 | "vectors": d[0],
153 | "votes": d[1],
154 | "class_ids": d[2],
155 | "class_probs": d[3],
156 | "centroids": d[4]} for i, t, o, d in zip(inputs, targets, outputs, decoded)])
157 |
158 | # Add image_id to centroid locations and add to list
159 |             result_centroids.extend([np.stack([ctr for ctr in d[5]]) for d in decoded])
160 | idx = idx + 1
161 | print("Aggregated loss is {:.5f}".format(loss_value / idx))
162 | return result_images, result_centroids
163 |
164 |
165 | def output(folder, result_images, result_centroids):
166 | os.makedirs(folder, exist_ok=True)
167 | print(f"Created output folder {os.path.abspath(folder)}")
168 | for i, sample in enumerate(result_images):
169 | for name, arr in sample.items():
170 | np.save(os.path.join(folder, f"{i}_{name}.npy"), arr)
171 |
172 | lines = ["image_nr centroid_y centroid_x class_id probability \r\n"]
173 | with open(os.path.join(folder, "validation.txt"), "w") as f:
174 | for i, image in enumerate(result_centroids):
175 | for line in image:
176 | line_str = [str(i), *[str(elm) for elm in line]]
177 | lines.append(" ".join(line_str) + "\r\n")
178 | f.writelines(lines)
179 |
180 |
181 | def main():
182 | # Perform retraining.
183 | do_train = True
184 | # Perform loading of the model.
185 | do_load = True
186 | # Perform final prediction and export data.
187 | do_predict = True
188 |
189 | # Start script
190 | assert (do_train or do_load), "Enable do_train and/or do_load"
191 |
192 | # Load datasets
193 | training_set, validation_set = create_data_loader(os.path.join("data", "dataset", "training.csv"),
194 | os.path.join("data", "dataset", "validation.csv"),
195 | crop=Config.crop, max_dist=Config.max_dist, repeat=1, sub=Config.sub, div=Config.div)
196 |
197 |     assert training_set.num_classes == Config.num_classes, f"Number of classes in config.py is incorrect. Should be {training_set.num_classes}"
198 |
199 | # Create loss function
200 | loss = create_centroidnet_loss()
201 | model = None
202 |
203 | # Train network
204 | if do_train:
205 | # Create network and load snapshots
206 | model = create_centroidnet(num_channels=Config.num_channels, num_classes=Config.num_classes)
207 | model = train(training_set, validation_set, model, loss, epochs=Config.epochs, batch_size=Config.batch_size, learn_rate=Config.learn_rate, validation_interval=Config.validation_interval)
208 | save_model(os.path.join("data", "CentroidNet.pth"), model)
209 |
210 | # Load model
211 | if do_load:
212 | model = create_centroidnet(num_channels=3, num_classes=training_set.num_classes)
213 | model = load_model(os.path.join("data", "CentroidNet.pth"), model)
214 |
215 | # Predict
216 | if do_predict:
217 | result_images, result_centroids = predict(validation_set, model, loss, max_dist=Config.max_dist, binning=Config.binning, nm_size=Config.nm_size, centroid_threshold=Config.centroid_threshold)
218 | output(os.path.join("data", "validation_result"), result_images, result_centroids)
219 |
220 |
221 | if __name__ == '__main__':
222 | main()
223 |
--------------------------------------------------------------------------------
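A small follow-up sketch (illustrative) for inspecting the arrays that output() writes; the file names follow the f"{i}_{name}.npy" pattern above, and index 0 is assumed to exist after a prediction run:

    import numpy as np

    votes = np.load("data/validation_result/0_votes.npy")      # accumulated vote image
    vectors = np.load("data/validation_result/0_vectors.npy")  # (2, h, w) voting-vector field
    print(votes.shape, vectors.shape)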