├── LICENSE
├── README.md
├── __init__.py
├── examples
│   ├── avoid_and_erase.json
│   └── avoid_and_erase.png
├── node_list.json
├── nodes
│   ├── nodes_pred.py
│   └── nodes_sigma.py
└── patches
    └── colorPalette.js.patch
/LICENSE:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 |  Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 |     <one line to give the program's name and a brief idea of what it does.>
635 |     Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program.  If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 |     <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/philosophy/why-not-lgpl.html>.
675 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # ComfyUI-Prediction
2 | Fully customizable Classifier Free Guidance for [ComfyUI](https://github.com/comfyanonymous/ComfyUI).
3 |
4 | 
5 |
6 | Copyright 2024 by @RedHotTensors and released by [Project RedRocket](https://huggingface.co/RedRocket).
7 |
8 | # Installation
9 | Clone this repo into ``ComfyUI/custom_nodes`` or use [ComfyUI-Manager](https://github.com/ltdrdata/ComfyUI-Manager).
10 |
11 | (Optional) If you want beautiful teal PREDICTION edges like the example, apply [patches/colorPalette.js.patch](https://raw.githubusercontent.com/redhottensors/ComfyUI-Prediction/main/patches/colorPalette.js.patch) to ``ComfyUI/web/extensions/core/colorPalette.js``.
12 |
13 | # Usage
14 | All custom nodes are provided under Add Node > sampling > prediction. An example workflow is in ``examples/avoid_and_erase.json``.
15 |
16 | Follow these steps for fully custom prediction:
17 | 1. You will need to use the sampling > prediction > Sample Predictions node as your sampler.
18 | 2. The *sampler* input comes from sampling > custom_sampling > samplers. Generally you'll use **KSamplerSelect**.
19 | 3. The *sigmas* input comes from sampling > custom_sampling > schedulers. If you don't know what sigmas you are using, try **BasicScheduler**. (NOTE: These nodes are **not** in the "sigmas" menu.)
20 | 4. You'll need one or more prompts. Chain conditioning > CLIP Text Encode (Prompt) to sampling > prediction > Conditioned Prediction to get started.
21 | 5. After your prediction chain, connect the result to the *noise_prediction* input of your **Sample Predictions** node.
22 |
23 | Some utility nodes for manipulating sigmas are provided under Add Node > sampling > custom_sampling > sigmas.
24 |
25 | # Predictors
26 |
27 | ## Primitive Nodes
28 | All other predictions can be implemented in terms of these nodes. However, it may get a little messy.
29 |
30 | ### Conditioned Prediction
31 | Evaluates your chosen model with a prompt (conditioning). You need to pick a unique conditioning name like "positive", "negative", or "empty".
32 |
33 | The names are arbitrary and you can choose any name, but they may eventually interact with ControlNet if/when it's implemented.
34 |
35 | ### Combine Predictions
36 | Operates on two predictions. Supports add (+), subtract (-), multiply (*), divide (/), [vector projection](https://en.wikipedia.org/wiki/Vector_projection) (proj), [vector rejection](https://en.wikipedia.org/wiki/Vector_projection) (oproj), min, and max.
37 |
38 | ``prediction_A <op> prediction_B``
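The projection operators may be easier to see in scalar form. Below is a minimal pure-Python sketch; the node itself operates on latent tensors, and `dot`, `proj`, and `oproj` are illustrative names, not part of the node API.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def proj(a, b):
    # Vector projection of A onto B: (A·B / B·B) * B.
    s = dot(a, b) / dot(b, b)
    return [s * x for x in b]

def oproj(a, b):
    # Vector rejection of A from B: A minus its projection onto B.
    return [x - y for x, y in zip(a, proj(a, b))]

# The component of [3, 4] along [1, 0] is [3, 0]; the remainder, [0, 4], is orthogonal.
assert proj([3.0, 4.0], [1.0, 0.0]) == [3.0, 0.0]
assert oproj([3.0, 4.0], [1.0, 0.0]) == [0.0, 4.0]
```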
39 |
40 | ### Scale Prediction
41 | Linearly scales a prediction.
42 |
43 | ``prediction * scale``
44 |
45 | ### Switch Predictions
46 | Switches from one prediction to another based on the timestep sigma. Use sampling > custom_sampling > sigmas > Select Sigmas to create sub-ranges of timestep sigmas.
47 |
48 | ``prediction_B when current_sigma in sigmas_B otherwise prediction_A``
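The switching rule amounts to a membership test on the current sigma. A hedged sketch in plain Python (`switch_prediction` and the tolerance handling are illustrative, not the node's actual code):

```python
def switch_prediction(current_sigma, sigmas_b, prediction_a, prediction_b, tol=1e-5):
    # Use prediction_B whenever the current timestep sigma appears in sigmas_B,
    # otherwise fall back to prediction_A. Floating-point sigmas are matched
    # with a small tolerance.
    if any(abs(current_sigma - s) <= tol for s in sigmas_b):
        return prediction_b
    return prediction_a
```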
49 |
50 | ### Scaled Guidance Prediction
51 | Combines a baseline prediction with a scaled guidance prediction using optional standard deviation rescaling, similar to CFG.
52 |
53 | - Without ``stddev_rescale``: ``baseline + guidance * scale``
54 | - With ``stddev_rescale``: [see §3.4 of this paper](https://arxiv.org/pdf/2305.08891.pdf). As usual, start out around 0.7 and tune from there.
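A sketch of how the two modes might combine, on flat Python lists. The unrescaled branch is exactly the formula above; the rescaled branch is one plausible reading of the paper's §3.4 (the node's actual rescaling target and per-channel handling may differ):

```python
import math

def std(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def scaled_guidance(baseline, guidance, scale, stddev_rescale=0.0):
    # Without rescaling: baseline + guidance * scale.
    out = [b + g * scale for b, g in zip(baseline, guidance)]
    if stddev_rescale <= 0.0:
        return out
    # One plausible reading of stddev rescaling (after arXiv:2305.08891 §3.4):
    # rescale the combined result toward the baseline's standard deviation,
    # then blend by the stddev_rescale factor. This is an assumption, not
    # necessarily the node's exact implementation.
    factor = std(baseline) / std(out)
    return [stddev_rescale * (x * factor) + (1.0 - stddev_rescale) * x for x in out]
```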
55 |
56 | ### Characteristic Guidance Prediction
57 | Combines an unconditioned (or negative) prediction with a desired, conditioned (or positive) prediction using a [characteristic correction](https://arxiv.org/pdf/2312.07586.pdf).
58 |
59 | - ``cond``: Desired conditioned or positive prediction. This prediction should not be independent of the unconditioned prediction.
60 | - ``uncond``: Unconditioned or negative prediction.
61 | - ``fallback``: Optional prediction to use in the case of non-convergence. Defaults to vanilla CFG if not connected.
62 | - ``guidance_scale``: Scale of the independent conditioned prediction, like vanilla CFG.
63 | - ``history``: Number of prior states to retain for Anderson acceleration. Generally improves convergence speed but may introduce instabilities. Setting to 1 disables Anderson acceleration, which will require a larger max_steps and a smaller log_step_size to converge.
64 | - ``log_step_size``: log10 learning rate. Higher values improve convergence speed but will introduce instabilities.
65 | - ``log_tolerance``: log10 convergence tolerance. Higher values improve convergence speed but will introduce artifacts.
66 | - ``keep_tolerance``: A multiplier greater than 1 relaxes the tolerance requirement on the final step, returning a mostly-converged result instead of using the fallback.
67 | - ``reuse_scale``: A multiplier greater than 0 retains a portion of the prior correction term between samples. May improve convergence speed and consistency, but may also introduce instabilities. Use with caution.
68 | - ``max_steps``: Maximum number of optimizer steps before giving up and using the fallback prediction.
69 | - ``precondition_gradients``: Precondition gradients during Anderson acceleration. This is strongly recommended, but may decrease result quality in specific cases where the gradients are reliably well-conditioned.
70 |
71 | This node is extremely expensive to evaluate. It requires four full model evaluations, plus two full evaluations for each optimizer step.
72 | It's recommended that you use the **Switch Predictions** node to skip CHG on the first timestep, where convergence is unlikely, and to disable it towards the end of sampling, where it has marginal effect.
73 |
74 | ## Prebuilt Nodes
75 |
76 | ### Switch Early/Middle/Late Predictions
77 | Convenience node similar to a three-way **Switch Predictions** node, where the number of ``early_steps`` and ``late_steps`` can be specified directly.
78 |
79 | The whole applicable ``sigmas`` schedule should be provided to the node, and the schedule must be unambiguous when split. (Restart schedules can often be ambiguous.)
80 |
81 | ### Interpolate Predictions
82 | Linearly interpolates two predictions.
83 |
84 | ``prediction_A * (1.0 - scale_B) + prediction_B * scale_B``
85 |
86 | ### CFG Prediction
87 | Vanilla Classifier Free Guidance (CFG) with a positive prompt and a negative/empty prompt. Does not support CFG rescale.
88 |
89 | ``(positive - negative) * cfg_scale + negative``
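The formula applied elementwise, in plain Python (`cfg_prediction` is an illustrative name; the node operates on latent tensors):

```python
def cfg_prediction(positive, negative, cfg_scale):
    # Vanilla CFG: move from the negative prediction toward the positive one,
    # scaled by cfg_scale: (positive - negative) * cfg_scale + negative.
    return [(p - n) * cfg_scale + n for p, n in zip(positive, negative)]

# With cfg_scale = 1 this is exactly the positive prediction.
assert cfg_prediction([2.0], [1.0], 1.0) == [2.0]
```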
90 |
91 | ### Perp-Neg Prediction
92 | Implements https://arxiv.org/abs/2304.04968.
93 |
94 | ``pos_ind = positive - empty; neg_ind = negative - empty``
95 | ``(pos_ind - (neg_ind oproj pos_ind) * neg_scale) * cfg_scale + empty``
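The two formulas above can be sketched in plain Python (illustrative names; ``A oproj B`` is read as the rejection of A from B, consistent with the examples here):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def oproj(a, b):
    # Vector rejection of A from B: A minus its projection onto B.
    s = dot(a, b) / dot(b, b)
    return [x - s * y for x, y in zip(a, b)]

def perp_neg(positive, negative, empty, cfg_scale, neg_scale):
    # Guide along the positive direction while subtracting only the component
    # of the negative direction that is perpendicular to it.
    pos_ind = [p - e for p, e in zip(positive, empty)]
    neg_ind = [n - e for n, e in zip(negative, empty)]
    rej = oproj(neg_ind, pos_ind)  # part of neg_ind perpendicular to pos_ind
    guided = [(p - r * neg_scale) * cfg_scale for p, r in zip(pos_ind, rej)]
    return [g + e for g, e in zip(guided, empty)]
```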
96 |
97 | ### Avoid and Erase Prediction (v2)
98 | Implements a modification of the Perp-Neg algorithm that avoids moving in the direction of the negative guidance.
99 | The result is a guidance vector that should be recombined with the empty prediction using the **Scaled Guidance Prediction** node or addition in preparation for **Characteristic Guidance Prediction**.
100 |
101 | ``pos_ind = positive - empty; neg_ind = negative - empty``
102 | ``pos_ind oproj neg_ind - (neg_ind oproj pos_ind) * erase_scale``
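A sketch of the guidance-vector computation in plain Python (illustrative names; note the result intentionally excludes the empty prediction, which is recombined in a later node):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def oproj(a, b):
    # Vector rejection of A from B: A minus its projection onto B.
    s = dot(a, b) / dot(b, b)
    return [x - s * y for x, y in zip(a, b)]

def avoid_and_erase(positive, negative, empty, erase_scale):
    pos_ind = [p - e for p, e in zip(positive, empty)]
    neg_ind = [n - e for n, e in zip(negative, empty)]
    avoid = oproj(pos_ind, neg_ind)  # positive guidance steered off the negative direction
    erase = oproj(neg_ind, pos_ind)  # negative guidance orthogonal to the positive
    return [a - b * erase_scale for a, b in zip(avoid, erase)]
```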
103 |
104 | # Sigma Utilities
105 | These utility nodes are provided for manipulating sigmas for use with the **Switch Predictions** node. These nodes are under sampling > custom_sampling > sigmas.
106 |
107 | Keep in mind that ComfyUI SIGMAS lists contain the ending output sigma, even though the model is not evaluated at this sigma.
108 | So, a 30-step full denoising schedule actually contains 31 sigmas (indexed 0 to 30), with the final sigma being ``0.0``.
109 |
110 | ### Select Sigmas
111 | This node is an alternative to the built-in **Split Sigmas** node. It allows you to specify the ``sigmas`` to ``select`` by 0-based index as a comma-separated list.
112 |
113 | Ranges are supported with ``[start]:[end]`` syntax. The ending bound is exclusive, and negative indices are supported, which index from the end of the list. Both the start and the end may be omitted.
114 |
115 | As a convenience, instead of specifying a list, you may specify ``mod N`` to select every N'th sigma. You can use this to easily alternate between prediction strategies.
116 |
117 | If ``chained`` is disabled, the final sigma in the input list will be dropped. If you're chaining **Select Sigmas** nodes, you should enable ``chained`` in almost all cases.
118 |
119 | If a specified index is out of range, it is ignored.
120 |
121 | Examples, assuming a 10-timestep schedule with sigmas ``10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0``:
122 |
123 | ```
124 | select: 0, 1, 2 chained: false => 10, 9, 8
125 | select: 0, 1, 3:6, -2 chained: false => 10, 9, 7, 6, 5, 2
126 | select: mod 2 chained: false => 9, 7, 5, 3, 1
127 | select: mod 3 chained: false => 8, 5, 2
128 | select: 100, 0, -100 chained: false => 10
129 |
130 | select: 5: chained: false => 5, 4, 3, 2, 1
131 | select: :5 chained: false => 10, 9, 8, 7, 6
132 | select: : chained: false => 10, 9, 8, 7, 6, 5, 4, 3, 2, 1
133 |
134 | select: -1 chained: false => 1
135 | select: -1 chained: true => 0
136 | select: -1:-4 chained: true => 0, 1, 2
137 | ```
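The rules and examples above can be sketched in pure Python (`select_sigmas` is an illustrative name; the node itself operates on SIGMAS tensors and its exact parsing may differ):

```python
def select_sigmas(sigmas, select, chained=False):
    # Pure-Python sketch of the documented Select Sigmas behavior.
    if not chained:
        sigmas = sigmas[:-1]  # drop the trailing output sigma
    n = len(sigmas)
    sel = select.strip()
    if sel.startswith("mod"):
        step = int(sel.split()[1])  # every N'th sigma
        return [sigmas[i] for i in range(step - 1, n, step)]
    indices = []
    for part in sel.split(","):
        part = part.strip()
        if ":" in part:
            start_s, end_s = part.split(":")
            start = int(start_s) if start_s else 0
            end = int(end_s) if end_s else n
            if start < 0:
                start += n
            if end < 0:
                end += n
            # A reversed range (start after end) steps backwards.
            step = 1 if end >= start else -1
            indices.extend(i for i in range(start, end, step) if 0 <= i < n)
        else:
            i = int(part)
            if -n <= i < n:  # out-of-range indices are ignored
                indices.append(i % n)
    return [sigmas[i] for i in indices]

SIGMAS = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
assert select_sigmas(SIGMAS, "0, 1, 3:6, -2") == [10, 9, 7, 6, 5, 2]
assert select_sigmas(SIGMAS, "mod 2") == [9, 7, 5, 3, 1]
assert select_sigmas(SIGMAS, "-1:-4", chained=True) == [0, 1, 2]
```

Note that a reversed range such as ``-1:-4`` selects indices stepping backwards, which is why it yields ``0, 1, 2``.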
138 |
139 | ### Split At Sigma
140 | Similar to the built-in **Split Sigmas** node, this node splits a list of sigmas based on the *value* of the sigma instead of the index.
141 | It takes a monotonically decreasing list of ``sigmas`` and splits at the earliest sigma that is less than or equal to the specified ``sigma``.
142 |
143 | This node is primarily useful in setting up refiner models without having to reverse engineer the timestep schedule for differing numbers of steps.
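The split rule can be sketched in a few lines of Python (`split_at_sigma` is an illustrative name; whether the boundary sigma is also duplicated into the first half is an implementation detail of the node, and this sketch keeps it only in the second half):

```python
def split_at_sigma(sigmas, sigma):
    # Scan a monotonically decreasing sigma list and split at the earliest
    # entry that is less than or equal to the given value.
    for i, s in enumerate(sigmas):
        if s <= sigma:
            return sigmas[:i], sigmas[i:]
    return sigmas, []
```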
144 |
145 | ### Log Sigmas
146 | Prints the provided list of ``sigmas`` to the text console, with an optional ``message`` prefix to differentiate nodes. This is useful for debugging workflows.
147 |
148 | If the sigmas list and message don't change, ComfyUI will not re-evaluate the node, so nothing will be printed.
149 |
150 | # Limitations
151 | ControlNet is not supported at this time.
152 |
153 | Regional prompting may work but is totally untested.
154 |
155 | Any other advanced features affecting conditioning are not likely to work.
156 |
157 | # License
158 | The license is the same as ComfyUI, GPL 3.0.
159 |
--------------------------------------------------------------------------------
/__init__.py:
--------------------------------------------------------------------------------
1 | """
2 | @author: RedHotTensors
3 | @title: ComfyUI-Prediction
4 | @nickname: ComfyUI-Prediction
5 | @description: Fully customizable Classifier Free Guidance for ComfyUI
6 | """
7 |
8 | from .nodes import nodes_pred, nodes_sigma
9 |
10 | NODE_CLASS_MAPPINGS = {
11 | **nodes_pred.NODE_CLASS_MAPPINGS,
12 | **nodes_sigma.NODE_CLASS_MAPPINGS,
13 | }
14 |
15 | NODE_DISPLAY_NAME_MAPPINGS = {
16 | **nodes_pred.NODE_DISPLAY_NAME_MAPPINGS,
17 | **nodes_sigma.NODE_DISPLAY_NAME_MAPPINGS,
18 | }
19 |
--------------------------------------------------------------------------------
/examples/avoid_and_erase.json:
--------------------------------------------------------------------------------
1 | {
2 | "last_node_id": 75,
3 | "last_link_id": 155,
4 | "nodes": [
5 | {
6 | "id": 25,
7 | "type": "Reroute",
8 | "pos": [
9 | 263,
10 | 474
11 | ],
12 | "size": [
13 | 75,
14 | 26
15 | ],
16 | "flags": {},
17 | "order": 6,
18 | "mode": 0,
19 | "inputs": [
20 | {
21 | "name": "",
22 | "type": "*",
23 | "link": 116
24 | }
25 | ],
26 | "outputs": [
27 | {
28 | "name": "",
29 | "type": "CLIP",
30 | "links": [
31 | 25,
32 | 26,
33 | 34
34 | ],
35 | "slot_index": 0
36 | }
37 | ],
38 | "properties": {
39 | "showOutputText": false,
40 | "horizontal": false
41 | }
42 | },
43 | {
44 | "id": 24,
45 | "type": "LoraLoader",
46 | "pos": [
47 | -200,
48 | 455
49 | ],
50 | "size": {
51 | "0": 353.758056640625,
52 | "1": 126
53 | },
54 | "flags": {},
55 | "order": 3,
56 | "mode": 0,
57 | "inputs": [
58 | {
59 | "name": "model",
60 | "type": "MODEL",
61 | "link": 29
62 | },
63 | {
64 | "name": "clip",
65 | "type": "CLIP",
66 | "link": 42
67 | }
68 | ],
69 | "outputs": [
70 | {
71 | "name": "MODEL",
72 | "type": "MODEL",
73 | "links": [
74 | 30
75 | ],
76 | "shape": 3,
77 | "slot_index": 0
78 | },
79 | {
80 | "name": "CLIP",
81 | "type": "CLIP",
82 | "links": [
83 | 116
84 | ],
85 | "shape": 3,
86 | "slot_index": 1
87 | }
88 | ],
89 | "properties": {
90 | "Node name for S&R": "LoraLoader"
91 | },
92 | "widgets_values": [
93 | "fluffyrock-quality-tags-v4.safetensors",
94 | 1,
95 | 1
96 | ]
97 | },
98 | {
99 | "id": 72,
100 | "type": "Reroute",
101 | "pos": [
102 | 2650.116147237216,
103 | 347.9113717632276
104 | ],
105 | "size": [
106 | 75,
107 | 26
108 | ],
109 | "flags": {},
110 | "order": 20,
111 | "mode": 0,
112 | "inputs": [
113 | {
114 | "name": "",
115 | "type": "*",
116 | "link": 155
117 | }
118 | ],
119 | "outputs": [
120 | {
121 | "name": "",
122 | "type": "PREDICTION",
123 | "links": [
124 | 141
125 | ],
126 | "slot_index": 0
127 | }
128 | ],
129 | "properties": {
130 | "showOutputText": false,
131 | "horizontal": false
132 | }
133 | },
134 | {
135 | "id": 38,
136 | "type": "KSamplerSelect",
137 | "pos": [
138 | 393,
139 | 1212
140 | ],
141 | "size": {
142 | "0": 315,
143 | "1": 58
144 | },
145 | "flags": {},
146 | "order": 0,
147 | "mode": 0,
148 | "outputs": [
149 | {
150 | "name": "SAMPLER",
151 | "type": "SAMPLER",
152 | "links": [
153 | 53
154 | ],
155 | "shape": 3,
156 | "slot_index": 0
157 | }
158 | ],
159 | "properties": {
160 | "Node name for S&R": "KSamplerSelect"
161 | },
162 | "widgets_values": [
163 | "dpmpp_2m"
164 | ]
165 | },
166 | {
167 | "id": 39,
168 | "type": "BasicScheduler",
169 | "pos": [
170 | 392,
171 | 1323
172 | ],
173 | "size": {
174 | "0": 315,
175 | "1": 106
176 | },
177 | "flags": {},
178 | "order": 7,
179 | "mode": 0,
180 | "inputs": [
181 | {
182 | "name": "model",
183 | "type": "MODEL",
184 | "link": 55
185 | }
186 | ],
187 | "outputs": [
188 | {
189 | "name": "SIGMAS",
190 | "type": "SIGMAS",
191 | "links": [
192 | 56
193 | ],
194 | "shape": 3,
195 | "slot_index": 0
196 | }
197 | ],
198 | "properties": {
199 | "Node name for S&R": "BasicScheduler"
200 | },
201 | "widgets_values": [
202 | "normal",
203 | 40,
204 | 1
205 | ]
206 | },
207 | {
208 | "id": 14,
209 | "type": "ModelSamplingDiscrete",
210 | "pos": [
211 | 10,
212 | 1222
213 | ],
214 | "size": {
215 | "0": 315,
216 | "1": 82
217 | },
218 | "flags": {},
219 | "order": 5,
220 | "mode": 0,
221 | "inputs": [
222 | {
223 | "name": "model",
224 | "type": "MODEL",
225 | "link": 30
226 | }
227 | ],
228 | "outputs": [
229 | {
230 | "name": "MODEL",
231 | "type": "MODEL",
232 | "links": [
233 | 54,
234 | 55
235 | ],
236 | "shape": 3,
237 | "slot_index": 0
238 | }
239 | ],
240 | "properties": {
241 | "Node name for S&R": "ModelSamplingDiscrete"
242 | },
243 | "widgets_values": [
244 | "v_prediction",
245 | true
246 | ]
247 | },
248 | {
249 | "id": 4,
250 | "type": "CheckpointLoaderSimple",
251 | "pos": [
252 | -646,
253 | 454
254 | ],
255 | "size": {
256 | "0": 358.83648681640625,
257 | "1": 98
258 | },
259 | "flags": {},
260 | "order": 1,
261 | "mode": 0,
262 | "outputs": [
263 | {
264 | "name": "MODEL",
265 | "type": "MODEL",
266 | "links": [
267 | 29
268 | ],
269 | "slot_index": 0
270 | },
271 | {
272 | "name": "CLIP",
273 | "type": "CLIP",
274 | "links": [
275 | 42
276 | ],
277 | "slot_index": 1
278 | },
279 | {
280 | "name": "VAE",
281 | "type": "VAE",
282 | "links": [
283 | 142
284 | ],
285 | "slot_index": 2
286 | }
287 | ],
288 | "properties": {
289 | "Node name for S&R": "CheckpointLoaderSimple"
290 | },
291 | "widgets_values": [
292 | "Fluffyrock-Unleashed-v1-0.safetensors"
293 | ]
294 | },
295 | {
296 | "id": 68,
297 | "type": "Reroute",
298 | "pos": [
299 | -252,
300 | 1362
301 | ],
302 | "size": [
303 | 75,
304 | 26
305 | ],
306 | "flags": {},
307 | "order": 4,
308 | "mode": 0,
309 | "inputs": [
310 | {
311 | "name": "",
312 | "type": "*",
313 | "link": 142,
314 | "pos": [
315 | 37.5,
316 | 0
317 | ],
318 | "slot_index": 0
319 | }
320 | ],
321 | "outputs": [
322 | {
323 | "name": "",
324 | "type": "VAE",
325 | "links": [
326 | 130
327 | ],
328 | "slot_index": 0
329 | }
330 | ],
331 | "properties": {
332 | "showOutputText": false,
333 | "horizontal": true
334 | }
335 | },
336 | {
337 | "id": 6,
338 | "type": "CLIPTextEncode",
339 | "pos": [
340 | 394,
341 | 474
342 | ],
343 | "size": {
344 | "0": 422.84503173828125,
345 | "1": 164.31304931640625
346 | },
347 | "flags": {},
348 | "order": 8,
349 | "mode": 0,
350 | "inputs": [
351 | {
352 | "name": "clip",
353 | "type": "CLIP",
354 | "link": 25
355 | }
356 | ],
357 | "outputs": [
358 | {
359 | "name": "CONDITIONING",
360 | "type": "CONDITIONING",
361 | "links": [
362 | 75,
363 | 86,
364 | 91
365 | ],
366 | "slot_index": 0
367 | }
368 | ],
369 | "title": "Positive Prompt",
370 | "properties": {
371 | "Node name for S&R": "CLIP Text Encode (Prompt)"
372 | },
373 | "widgets_values": [
374 | "dragon, feral, male, solo, good quality, best quality, detailed scales, detailed, realistic, full-length portrait, gradient background, action pose"
375 | ]
376 | },
377 | {
378 | "id": 7,
379 | "type": "CLIPTextEncode",
380 | "pos": [
381 | 401,
382 | 706
383 | ],
384 | "size": {
385 | "0": 425.27801513671875,
386 | "1": 180.6060791015625
387 | },
388 | "flags": {},
389 | "order": 9,
390 | "mode": 0,
391 | "inputs": [
392 | {
393 | "name": "clip",
394 | "type": "CLIP",
395 | "link": 26
396 | }
397 | ],
398 | "outputs": [
399 | {
400 | "name": "CONDITIONING",
401 | "type": "CONDITIONING",
402 | "links": [
403 | 78,
404 | 87,
405 | 92
406 | ],
407 | "slot_index": 0
408 | }
409 | ],
410 | "title": "Negative Prompt",
411 | "properties": {
412 | "Node name for S&R": "CLIPTextEncode"
413 | },
414 | "widgets_values": [
415 | "bad quality, worst quality, anthro, text, signature, patreon logo, blurred background, genitals"
416 | ]
417 | },
418 | {
419 | "id": 23,
420 | "type": "CLIPTextEncode",
421 | "pos": [
422 | 398,
423 | 947
424 | ],
425 | "size": {
426 | "0": 425.2000427246094,
427 | "1": 203.55630493164062
428 | },
429 | "flags": {},
430 | "order": 10,
431 | "mode": 0,
432 | "inputs": [
433 | {
434 | "name": "clip",
435 | "type": "CLIP",
436 | "link": 34
437 | }
438 | ],
439 | "outputs": [
440 | {
441 | "name": "CONDITIONING",
442 | "type": "CONDITIONING",
443 | "links": [
444 | 82,
445 | 93
446 | ],
447 | "shape": 3,
448 | "slot_index": 0
449 | }
450 | ],
451 | "title": "Empty Prompt",
452 | "properties": {
453 | "Node name for S&R": "CLIPTextEncode"
454 | },
455 | "widgets_values": [
456 | ""
457 | ]
458 | },
459 | {
460 | "id": 5,
461 | "type": "EmptyLatentImage",
462 | "pos": [
463 | 391,
464 | 1608
465 | ],
466 | "size": {
467 | "0": 315,
468 | "1": 106
469 | },
470 | "flags": {},
471 | "order": 2,
472 | "mode": 0,
473 | "outputs": [
474 | {
475 | "name": "LATENT",
476 | "type": "LATENT",
477 | "links": [
478 | 94
479 | ],
480 | "slot_index": 0
481 | }
482 | ],
483 | "properties": {
484 | "Node name for S&R": "EmptyLatentImage"
485 | },
486 | "widgets_values": [
487 | 1080,
488 | 720,
489 | 2
490 | ]
491 | },
492 | {
493 | "id": 52,
494 | "type": "ConditionedPrediction",
495 | "pos": [
496 | 924.7399340820311,
497 | 183.41999908447266
498 | ],
499 | "size": {
500 | "0": 315,
501 | "1": 58
502 | },
503 | "flags": {},
504 | "order": 11,
505 | "mode": 0,
506 | "inputs": [
507 | {
508 | "name": "conditioning",
509 | "type": "CONDITIONING",
510 | "link": 75
511 | }
512 | ],
513 | "outputs": [
514 | {
515 | "name": "prediction",
516 | "type": "PREDICTION",
517 | "links": [
518 | 145
519 | ],
520 | "shape": 3,
521 | "slot_index": 0
522 | }
523 | ],
524 | "properties": {
525 | "Node name for S&R": "ConditionedPrediction"
526 | },
527 | "widgets_values": [
528 | "positive"
529 | ]
530 | },
531 | {
532 | "id": 53,
533 | "type": "ConditionedPrediction",
534 | "pos": [
535 | 929.7399340820311,
536 | 295.41999908447264
537 | ],
538 | "size": {
539 | "0": 315,
540 | "1": 58
541 | },
542 | "flags": {},
543 | "order": 12,
544 | "mode": 0,
545 | "inputs": [
546 | {
547 | "name": "conditioning",
548 | "type": "CONDITIONING",
549 | "link": 78
550 | }
551 | ],
552 | "outputs": [
553 | {
554 | "name": "prediction",
555 | "type": "PREDICTION",
556 | "links": [
557 | 146
558 | ],
559 | "shape": 3,
560 | "slot_index": 0
561 | }
562 | ],
563 | "properties": {
564 | "Node name for S&R": "ConditionedPrediction"
565 | },
566 | "widgets_values": [
567 | "negative"
568 | ]
569 | },
570 | {
571 | "id": 73,
572 | "type": "AvoidErasePrediction",
573 | "pos": [
574 | 1314,
575 | 187
576 | ],
577 | "size": {
578 | "0": 315,
579 | "1": 98
580 | },
581 | "flags": {},
582 | "order": 16,
583 | "mode": 0,
584 | "inputs": [
585 | {
586 | "name": "positive",
587 | "type": "PREDICTION",
588 | "link": 145
589 | },
590 | {
591 | "name": "negative",
592 | "type": "PREDICTION",
593 | "link": 146
594 | },
595 | {
596 | "name": "empty",
597 | "type": "PREDICTION",
598 | "link": 147
599 | }
600 | ],
601 | "outputs": [
602 | {
603 | "name": "guidance",
604 | "type": "PREDICTION",
605 | "links": [
606 | 148,
607 | 150
608 | ],
609 | "shape": 3,
610 | "slot_index": 0
611 | }
612 | ],
613 | "properties": {
614 | "Node name for S&R": "AvoidErasePrediction"
615 | },
616 | "widgets_values": [
617 | 0.2
618 | ]
619 | },
620 | {
621 | "id": 54,
622 | "type": "ConditionedPrediction",
623 | "pos": [
624 | 925.7399340820311,
625 | 406.41999908447264
626 | ],
627 | "size": {
628 | "0": 315,
629 | "1": 58
630 | },
631 | "flags": {},
632 | "order": 14,
633 | "mode": 0,
634 | "inputs": [
635 | {
636 | "name": "conditioning",
637 | "type": "CONDITIONING",
638 | "link": 82
639 | }
640 | ],
641 | "outputs": [
642 | {
643 | "name": "prediction",
644 | "type": "PREDICTION",
645 | "links": [
646 | 110,
647 | 147,
648 | 149,
649 | 153
650 | ],
651 | "shape": 3,
652 | "slot_index": 0
653 | }
654 | ],
655 | "properties": {
656 | "Node name for S&R": "ConditionedPrediction"
657 | },
658 | "widgets_values": [
659 | "empty"
660 | ]
661 | },
662 | {
663 | "id": 74,
664 | "type": "CombinePredictions",
665 | "pos": [
666 | 1701,
667 | 263
668 | ],
669 | "size": {
670 | "0": 315,
671 | "1": 78
672 | },
673 | "flags": {},
674 | "order": 17,
675 | "mode": 0,
676 | "inputs": [
677 | {
678 | "name": "prediction_A",
679 | "type": "PREDICTION",
680 | "link": 148
681 | },
682 | {
683 | "name": "prediction_B",
684 | "type": "PREDICTION",
685 | "link": 149
686 | }
687 | ],
688 | "outputs": [
689 | {
690 | "name": "prediction",
691 | "type": "PREDICTION",
692 | "links": [
693 | 152
694 | ],
695 | "shape": 3,
696 | "slot_index": 0
697 | }
698 | ],
699 | "properties": {
700 | "Node name for S&R": "CombinePredictions"
701 | },
702 | "widgets_values": [
703 | "A + B"
704 | ]
705 | },
706 | {
707 | "id": 56,
708 | "type": "CFGPrediction",
709 | "pos": [
710 | 2220.2986090624995,
711 | 642.8204918027345
712 | ],
713 | "size": {
714 | "0": 315,
715 | "1": 78
716 | },
717 | "flags": {},
718 | "order": 13,
719 | "mode": 0,
720 | "inputs": [
721 | {
722 | "name": "positive",
723 | "type": "CONDITIONING",
724 | "link": 86
725 | },
726 | {
727 | "name": "negative",
728 | "type": "CONDITIONING",
729 | "link": 87
730 | }
731 | ],
732 | "outputs": [
733 | {
734 | "name": "prediction",
735 | "type": "PREDICTION",
736 | "links": [],
737 | "shape": 3,
738 | "slot_index": 0
739 | }
740 | ],
741 | "properties": {
742 | "Node name for S&R": "CFGPrediction"
743 | },
744 | "widgets_values": [
745 | 6
746 | ]
747 | },
748 | {
749 | "id": 57,
750 | "type": "PerpNegPrediction",
751 | "pos": [
752 | 2219.2986090624995,
753 | 771.8204918027345
754 | ],
755 | "size": {
756 | "0": 315,
757 | "1": 122
758 | },
759 | "flags": {},
760 | "order": 15,
761 | "mode": 0,
762 | "inputs": [
763 | {
764 | "name": "positive",
765 | "type": "CONDITIONING",
766 | "link": 91
767 | },
768 | {
769 | "name": "negative",
770 | "type": "CONDITIONING",
771 | "link": 92
772 | },
773 | {
774 | "name": "empty",
775 | "type": "CONDITIONING",
776 | "link": 93
777 | }
778 | ],
779 | "outputs": [
780 | {
781 | "name": "prediction",
782 | "type": "PREDICTION",
783 | "links": [],
784 | "shape": 3,
785 | "slot_index": 0
786 | }
787 | ],
788 | "properties": {
789 | "Node name for S&R": "PerpNegPrediction"
790 | },
791 | "widgets_values": [
792 | 8,
793 | 1
794 | ]
795 | },
796 | {
797 | "id": 28,
798 | "type": "SamplerCustomPrediction",
799 | "pos": [
800 | 2834,
801 | 896
802 | ],
803 | "size": [
804 | 405.5999755859375,
805 | 398
806 | ],
807 | "flags": {},
808 | "order": 21,
809 | "mode": 0,
810 | "inputs": [
811 | {
812 | "name": "model",
813 | "type": "MODEL",
814 | "link": 54
815 | },
816 | {
817 | "name": "sampler",
818 | "type": "SAMPLER",
819 | "link": 53
820 | },
821 | {
822 | "name": "sigmas",
823 | "type": "SIGMAS",
824 | "link": 56
825 | },
826 | {
827 | "name": "latent_image",
828 | "type": "LATENT",
829 | "link": 94
830 | },
831 | {
832 | "name": "noise_prediction",
833 | "type": "PREDICTION",
834 | "link": 141
835 | }
836 | ],
837 | "outputs": [
838 | {
839 | "name": "output",
840 | "type": "LATENT",
841 | "links": [
842 | 136
843 | ],
844 | "shape": 3,
845 | "slot_index": 0
846 | },
847 | {
848 | "name": "denoised_output",
849 | "type": "LATENT",
850 | "links": null,
851 | "shape": 3
852 | }
853 | ],
854 | "properties": {
855 | "Node name for S&R": "SamplerCustomPrediction"
856 | },
857 | "widgets_values": [
858 | true,
859 | 1,
860 | "fixed"
861 | ]
862 | },
863 | {
864 | "id": 8,
865 | "type": "VAEDecode",
866 | "pos": [
867 | 3279,
868 | 902
869 | ],
870 | "size": {
871 | "0": 210,
872 | "1": 46
873 | },
874 | "flags": {},
875 | "order": 22,
876 | "mode": 0,
877 | "inputs": [
878 | {
879 | "name": "samples",
880 | "type": "LATENT",
881 | "link": 136,
882 | "slot_index": 0
883 | },
884 | {
885 | "name": "vae",
886 | "type": "VAE",
887 | "link": 130
888 | }
889 | ],
890 | "outputs": [
891 | {
892 | "name": "IMAGE",
893 | "type": "IMAGE",
894 | "links": [
895 | 139
896 | ],
897 | "slot_index": 0
898 | }
899 | ],
900 | "properties": {
901 | "Node name for S&R": "VAEDecode"
902 | }
903 | },
904 | {
905 | "id": 20,
906 | "type": "PreviewImage",
907 | "pos": [
908 | 3525,
909 | 903
910 | ],
911 | "size": {
912 | "0": 1251.4755859375,
913 | "1": 826.39794921875
914 | },
915 | "flags": {},
916 | "order": 23,
917 | "mode": 0,
918 | "inputs": [
919 | {
920 | "name": "images",
921 | "type": "IMAGE",
922 | "link": 139
923 | }
924 | ],
925 | "properties": {
926 | "Node name for S&R": "PreviewImage"
927 | }
928 | },
929 | {
930 | "id": 75,
931 | "type": "CharacteristicGuidancePrediction",
932 | "pos": [
933 | 2225.0138359374996,
934 | 328.648005126953
935 | ],
936 | "size": {
937 | "0": 315,
938 | "1": 266
939 | },
940 | "flags": {},
941 | "order": 19,
942 | "mode": 0,
943 | "inputs": [
944 | {
945 | "name": "cond",
946 | "type": "PREDICTION",
947 | "link": 152
948 | },
949 | {
950 | "name": "uncond",
951 | "type": "PREDICTION",
952 | "link": 153
953 | },
954 | {
955 | "name": "fallback",
956 | "type": "PREDICTION",
957 | "link": 151
958 | }
959 | ],
960 | "outputs": [
961 | {
962 | "name": "prediction",
963 | "type": "PREDICTION",
964 | "links": [],
965 | "shape": 3,
966 | "slot_index": 0
967 | }
968 | ],
969 | "properties": {
970 | "Node name for S&R": "CharacteristicGuidancePrediction"
971 | },
972 | "widgets_values": [
973 | 6,
974 | 2,
975 | -3,
976 | -4,
977 | 1,
978 | 0,
979 | 20,
980 | true
981 | ]
982 | },
983 | {
984 | "id": 61,
985 | "type": "ScaledGuidancePrediction",
986 | "pos": [
987 | 2221.116147237216,
988 | 185.91137176322826
989 | ],
990 | "size": {
991 | "0": 315,
992 | "1": 102
993 | },
994 | "flags": {},
995 | "order": 18,
996 | "mode": 0,
997 | "inputs": [
998 | {
999 | "name": "guidance",
1000 | "type": "PREDICTION",
1001 | "link": 150,
1002 | "slot_index": 0
1003 | },
1004 | {
1005 | "name": "baseline",
1006 | "type": "PREDICTION",
1007 | "link": 110,
1008 | "slot_index": 1
1009 | }
1010 | ],
1011 | "outputs": [
1012 | {
1013 | "name": "prediction",
1014 | "type": "PREDICTION",
1015 | "links": [
1016 | 151,
1017 | 155
1018 | ],
1019 | "shape": 3,
1020 | "slot_index": 0
1021 | }
1022 | ],
1023 | "properties": {
1024 | "Node name for S&R": "ScaledGuidancePrediction"
1025 | },
1026 | "widgets_values": [
1027 | 7,
1028 | 0
1029 | ]
1030 | }
1031 | ],
1032 | "links": [
1033 | [
1034 | 25,
1035 | 25,
1036 | 0,
1037 | 6,
1038 | 0,
1039 | "CLIP"
1040 | ],
1041 | [
1042 | 26,
1043 | 25,
1044 | 0,
1045 | 7,
1046 | 0,
1047 | "CLIP"
1048 | ],
1049 | [
1050 | 29,
1051 | 4,
1052 | 0,
1053 | 24,
1054 | 0,
1055 | "MODEL"
1056 | ],
1057 | [
1058 | 30,
1059 | 24,
1060 | 0,
1061 | 14,
1062 | 0,
1063 | "MODEL"
1064 | ],
1065 | [
1066 | 34,
1067 | 25,
1068 | 0,
1069 | 23,
1070 | 0,
1071 | "CLIP"
1072 | ],
1073 | [
1074 | 42,
1075 | 4,
1076 | 1,
1077 | 24,
1078 | 1,
1079 | "CLIP"
1080 | ],
1081 | [
1082 | 53,
1083 | 38,
1084 | 0,
1085 | 28,
1086 | 1,
1087 | "SAMPLER"
1088 | ],
1089 | [
1090 | 54,
1091 | 14,
1092 | 0,
1093 | 28,
1094 | 0,
1095 | "MODEL"
1096 | ],
1097 | [
1098 | 55,
1099 | 14,
1100 | 0,
1101 | 39,
1102 | 0,
1103 | "MODEL"
1104 | ],
1105 | [
1106 | 56,
1107 | 39,
1108 | 0,
1109 | 28,
1110 | 2,
1111 | "SIGMAS"
1112 | ],
1113 | [
1114 | 75,
1115 | 6,
1116 | 0,
1117 | 52,
1118 | 0,
1119 | "CONDITIONING"
1120 | ],
1121 | [
1122 | 78,
1123 | 7,
1124 | 0,
1125 | 53,
1126 | 0,
1127 | "CONDITIONING"
1128 | ],
1129 | [
1130 | 82,
1131 | 23,
1132 | 0,
1133 | 54,
1134 | 0,
1135 | "CONDITIONING"
1136 | ],
1137 | [
1138 | 86,
1139 | 6,
1140 | 0,
1141 | 56,
1142 | 0,
1143 | "CONDITIONING"
1144 | ],
1145 | [
1146 | 87,
1147 | 7,
1148 | 0,
1149 | 56,
1150 | 1,
1151 | "CONDITIONING"
1152 | ],
1153 | [
1154 | 91,
1155 | 6,
1156 | 0,
1157 | 57,
1158 | 0,
1159 | "CONDITIONING"
1160 | ],
1161 | [
1162 | 92,
1163 | 7,
1164 | 0,
1165 | 57,
1166 | 1,
1167 | "CONDITIONING"
1168 | ],
1169 | [
1170 | 93,
1171 | 23,
1172 | 0,
1173 | 57,
1174 | 2,
1175 | "CONDITIONING"
1176 | ],
1177 | [
1178 | 94,
1179 | 5,
1180 | 0,
1181 | 28,
1182 | 3,
1183 | "LATENT"
1184 | ],
1185 | [
1186 | 110,
1187 | 54,
1188 | 0,
1189 | 61,
1190 | 1,
1191 | "PREDICTION"
1192 | ],
1193 | [
1194 | 116,
1195 | 24,
1196 | 1,
1197 | 25,
1198 | 0,
1199 | "*"
1200 | ],
1201 | [
1202 | 130,
1203 | 68,
1204 | 0,
1205 | 8,
1206 | 1,
1207 | "VAE"
1208 | ],
1209 | [
1210 | 136,
1211 | 28,
1212 | 0,
1213 | 8,
1214 | 0,
1215 | "LATENT"
1216 | ],
1217 | [
1218 | 139,
1219 | 8,
1220 | 0,
1221 | 20,
1222 | 0,
1223 | "IMAGE"
1224 | ],
1225 | [
1226 | 141,
1227 | 72,
1228 | 0,
1229 | 28,
1230 | 4,
1231 | "PREDICTION"
1232 | ],
1233 | [
1234 | 142,
1235 | 4,
1236 | 2,
1237 | 68,
1238 | 0,
1239 | "*"
1240 | ],
1241 | [
1242 | 145,
1243 | 52,
1244 | 0,
1245 | 73,
1246 | 0,
1247 | "PREDICTION"
1248 | ],
1249 | [
1250 | 146,
1251 | 53,
1252 | 0,
1253 | 73,
1254 | 1,
1255 | "PREDICTION"
1256 | ],
1257 | [
1258 | 147,
1259 | 54,
1260 | 0,
1261 | 73,
1262 | 2,
1263 | "PREDICTION"
1264 | ],
1265 | [
1266 | 148,
1267 | 73,
1268 | 0,
1269 | 74,
1270 | 0,
1271 | "PREDICTION"
1272 | ],
1273 | [
1274 | 149,
1275 | 54,
1276 | 0,
1277 | 74,
1278 | 1,
1279 | "PREDICTION"
1280 | ],
1281 | [
1282 | 150,
1283 | 73,
1284 | 0,
1285 | 61,
1286 | 0,
1287 | "PREDICTION"
1288 | ],
1289 | [
1290 | 151,
1291 | 61,
1292 | 0,
1293 | 75,
1294 | 2,
1295 | "PREDICTION"
1296 | ],
1297 | [
1298 | 152,
1299 | 74,
1300 | 0,
1301 | 75,
1302 | 0,
1303 | "PREDICTION"
1304 | ],
1305 | [
1306 | 153,
1307 | 54,
1308 | 0,
1309 | 75,
1310 | 1,
1311 | "PREDICTION"
1312 | ],
1313 | [
1314 | 155,
1315 | 61,
1316 | 0,
1317 | 72,
1318 | 0,
1319 | "*"
1320 | ]
1321 | ],
1322 | "groups": [
1323 | {
1324 | "title": "Choose",
1325 | "bounding": [
1326 | 2204,
1327 | 110,
1328 | 500,
1329 | 796
1330 | ],
1331 | "color": "#3f789e",
1332 | "font_size": 24
1333 | },
1334 | {
1335 | "title": "Avoid and Erase Workflow (add node > sampling > prediction)",
1336 | "bounding": [
1337 | 914,
1338 | 109,
1339 | 1269,
1340 | 366
1341 | ],
1342 | "color": "#3f789e",
1343 | "font_size": 24
1344 | }
1345 | ],
1346 | "config": {},
1347 | "extra": {},
1348 | "version": 0.4
1349 | }
--------------------------------------------------------------------------------
/examples/avoid_and_erase.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/redhottensors/ComfyUI-Prediction/2a9c7ea7e076776d4224fa61dad0d1e9a6ae2bce/examples/avoid_and_erase.png
--------------------------------------------------------------------------------
/node_list.json:
--------------------------------------------------------------------------------
1 | {
2 | "SamplerCustomPrediction":"",
3 | "ConditionedPrediction":"",
4 | "CombinePredictions":"",
5 | "InterpolatePredictions":"",
6 | "SwitchPredictions":"",
7 | "EarlyMiddleLatePrediction":"",
8 | "ScaledGuidancePrediction":"",
9 | "CharacteristicGuidancePrediction":"",
10 | "AvoidErasePrediction":"",
11 | "ScalePrediction":"",
12 | "CFGPrediction":"",
13 | "PerpNegPrediction":"",
14 | "SelectSigmas":"",
15 | "SplitAtSigma":"",
16 | "LogSigmas":""
17 | }
18 |
--------------------------------------------------------------------------------
/nodes/nodes_pred.py:
--------------------------------------------------------------------------------
1 | import sys
2 |
3 | import comfy
4 | import torch
5 | import latent_preview
6 | from tqdm import tqdm
7 |
8 | try:
9 | from comfy.samplers import calc_cond_batch
10 | except ImportError:
11 | from comfy.samplers import calc_cond_uncond_batch
12 |
13 | def calc_cond_batch(model, conds, x_in, timestep, model_options):
14 | outputs = []
15 |
16 | index = 0
17 | while index < len(conds) - 1:
18 | outputs.extend(calc_cond_uncond_batch(model, conds[index], conds[index + 1], x_in, timestep, model_options))
19 | index += 2
20 |
21 | if index < len(conds):
22 | outputs.append(calc_cond_uncond_batch(model, conds[index], None, x_in, timestep, model_options)[0])
23 |
24 | return outputs
25 |
26 | try:
27 | from comfy.sampler_helpers import convert_cond
28 | except ImportError:
29 | from comfy.sample import convert_cond
30 |
31 | try:
32 | from comfy.sampler_helpers import get_models_from_cond
33 | except ImportError:
34 | from comfy.sample import get_models_from_cond
35 |
36 | try:
37 | from comfy.sampler_helpers import prepare_mask
38 | except ImportError:
39 | from comfy.sample import prepare_mask
40 |
41 | class CustomNoisePredictor(torch.nn.Module):
42 | def __init__(self, model, pred, preds, conds):
43 | super().__init__()
44 |
45 | self.inner_model = model
46 | self.pred = pred
47 | self.preds = preds
48 | self.conds = conds
49 |
50 | def apply_model(self, x, timestep, cond=None, uncond=None, cond_scale=None, model_options=None, seed=None):
51 | if model_options is None:
52 | model_options = {}
53 |
54 | for pred in self.preds:
55 | pred.begin_sample()
56 |
57 | try:
58 | return self.pred.predict_noise(x, timestep, self.inner_model, self.conds, model_options, seed)
59 | finally:
60 | for pred in self.preds:
61 | pred.end_sample()
62 |
63 | def forward(self, *args, **kwargs):
64 | return self.apply_model(*args, **kwargs)
65 |
66 | class SamplerCustomPrediction:
67 | @classmethod
68 | def INPUT_TYPES(cls):
69 | return {
70 | "required": {
71 | "model": ("MODEL",),
72 | "add_noise": ("BOOLEAN", {"default": True}),
73 | "noise_seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
74 | "sampler": ("SAMPLER",),
75 | "sigmas": ("SIGMAS",),
76 | "latent_image": ("LATENT",),
77 | "noise_prediction": ("PREDICTION",),
78 | }
79 | }
80 |
81 | RETURN_TYPES = ("LATENT", "LATENT")
82 | RETURN_NAMES = ("output", "denoised_output")
83 | FUNCTION = "sample"
84 | CATEGORY = "sampling/prediction"
85 |
86 | def sample(self, model, add_noise, noise_seed, sampler, sigmas, latent_image, noise_prediction):
87 | latent_samples = latent_image["samples"]
88 |
89 | if not add_noise:
90 | torch.manual_seed(noise_seed)
91 |
92 | noise = torch.zeros(
93 | latent_samples.size(),
94 | dtype=latent_samples.dtype,
95 | layout=latent_samples.layout,
96 | device="cpu"
97 | )
98 | else:
99 | batch_inds = latent_image["batch_index"] if "batch_index" in latent_image else None
100 | noise = comfy.sample.prepare_noise(latent_samples, noise_seed, batch_inds)
101 |
102 | noise_mask = None
103 | if "noise_mask" in latent_image:
104 | noise_mask = latent_image["noise_mask"]
105 |
106 | x0_output = {}
107 | callback = latent_preview.prepare_callback(model, sigmas.shape[-1] - 1, x0_output)
108 |
109 | samples = sample_pred(
110 | model, noise, noise_prediction, sampler, sigmas, latent_samples,
111 | noise_mask=noise_mask,
112 | callback=callback,
113 | disable_pbar=not comfy.utils.PROGRESS_BAR_ENABLED,
114 | seed=noise_seed
115 | )
116 |
117 | out = latent_image.copy()
118 | out["samples"] = samples
119 |
120 | if "x0" in x0_output:
121 | out_denoised = latent_image.copy()
122 | out_denoised["samples"] = model.model.process_latent_out(x0_output["x0"].cpu())
123 | else:
124 | out_denoised = out
125 |
126 | return (out, out_denoised)
127 |
128 | def sample_pred(
129 | model,
130 | noise,
131 | predictor,
132 | sampler,
133 | sigmas,
134 | latent,
135 | noise_mask=None,
136 | callback=None,
137 | disable_pbar=False,
138 | seed=None
139 | ):
140 | if noise_mask is not None:
141 | noise_mask = prepare_mask(noise_mask, noise.shape, model.load_device)
142 |
143 | dtype = model.model_dtype()
144 | device = model.load_device
145 |
146 | models = predictor.get_models()
147 | conds = predictor.get_conds()
148 | preds = predictor.get_preds()
149 |
150 | n_samples = 0
151 | for pred in preds:
152 | n_samples += pred.n_samples()
153 |
154 | inference_memory = model.memory_required([noise.shape[0] * n_samples] + list(noise.shape[1:]))
155 |
156 | for addtl in models:
157 | if "inference_memory_requirements" in addtl:
158 | inference_memory += addtl.inference_memory_requirements(dtype)
159 |
160 | comfy.model_management.load_models_gpu(models | set([model]), inference_memory)
161 |
162 | noise = noise.to(device)
163 | latent = latent.to(device)
164 | sigmas = sigmas.to(device)
165 |
166 | for name, cond in conds.items():
167 | conds[name] = cond[:]
168 |
169 | for cond in conds.values():
170 | comfy.samplers.resolve_areas_and_cond_masks(cond, noise.shape[2], noise.shape[3], device)
171 |
172 | for cond in conds.values():
173 | comfy.samplers.calculate_start_end_timesteps(model.model, cond)
174 |
175 | if latent is not None:
176 | latent = model.model.process_latent_in(latent)
177 |
178 | if hasattr(model.model, "extra_conds"):
179 | for name, cond in conds.items():
180 | conds[name] = comfy.samplers.encode_model_conds(
181 | model.model.extra_conds,
182 | cond, noise, device, name,
183 | latent_image=latent,
184 | denoise_mask=noise_mask,
185 | seed=seed
186 | )
187 |
188 | # ensure each cond area corresponds with all other areas
189 | for name1, cond1 in conds.items():
190 | for name2, cond2 in conds.items():
191 | if name2 == name1:
192 | continue
193 |
194 | for c1 in cond1:
195 | comfy.samplers.create_cond_with_same_area_if_none(cond2, c1)
196 |
197 | # TODO: support controlnet how?
198 |
199 | predictor_model = CustomNoisePredictor(model.model, predictor, preds, conds)
200 | extra_args = { "model_options": model.model_options, "seed": seed }
201 |
202 | for pred in preds:
203 | pred.begin_sampling()
204 |
205 | try:
206 | samples = sampler.sample(predictor_model, sigmas, extra_args, callback, noise, latent, noise_mask, disable_pbar)
207 | finally:
208 | for pred in preds:
209 | pred.end_sampling()
210 |
211 | samples = model.model.process_latent_out(samples.to(torch.float32))
212 | samples = samples.to(comfy.model_management.intermediate_device())
213 |
214 | comfy.sample.cleanup_additional_models(models)
215 | return samples
216 |
217 | class NoisePredictor:
218 | OUTPUTS = { "prediction": "PREDICTION" }
219 |
220 | def get_models(self):
221 | """Returns all additional models transitively used by this predictor."""
222 | return set()
223 |
224 | def get_conds(self):
225 | """Returns all conditionings transitively defined by this predictor."""
226 | return {}
227 |
228 | def get_preds(self):
229 |         """Returns all NoisePredictors transitively used by this predictor, including itself."""
230 | return {self}
231 |
232 | def n_samples(self):
233 | """Returns the number of times a model will be sampled directly by this predictor."""
234 | return 0
235 |
236 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
237 | raise NotImplementedError
238 |
239 | def begin_sampling(self):
240 | """Called when sampling begins for a batch."""
241 | pass
242 |
243 | def begin_sample(self):
244 | """Called when one sampling step begins."""
245 | pass
246 |
247 | def end_sample(self):
248 | """Called when one sampling step ends."""
249 | pass
250 |
251 | def end_sampling(self):
252 | """Called when sampling completes for a batch."""
253 | pass
254 |
255 | def get_models_from_conds(self):
256 | models = set()
257 |
258 |         for cond in self.get_conds().values():
259 |             for cnet in get_models_from_cond(cond, "control"):
260 |                 models.update(cnet.get_models())
261 |
262 |             for gligen in get_models_from_cond(cond, "gligen"):
263 |                 models.update(x[1] for x in gligen)
264 |
265 | return models
266 |
267 | @staticmethod
268 | def merge_models(*args):
269 | merged = set()
270 |
271 | for arg in args:
272 | if arg is None:
273 | continue
274 |
275 | if isinstance(arg, NoisePredictor):
276 | merged |= arg.get_models()
277 | elif isinstance(arg, set):
278 | merged |= arg
279 | else:
280 | merged.add(arg)
281 |
282 | return merged
283 |
284 | @staticmethod
285 | def merge_conds(*args):
286 | merged = {}
287 |
288 | for arg in args:
289 | if arg is None:
290 | continue
291 |
292 | if isinstance(arg, NoisePredictor):
293 | arg = arg.get_conds()
294 |
295 | for name, cond in arg.items():
296 | if name not in merged:
297 | merged[name] = cond
298 | elif merged[name] != cond:
299 | raise RuntimeError(f"Conditioning \"{name}\" is not unique.")
300 |
301 | return merged
302 |
303 | def merge_preds(self, *args):
304 | merged = {self}
305 |
306 | for arg in args:
307 | if arg is not None and arg not in merged:
308 | merged |= arg.get_preds()
309 |
310 | return merged
311 |
312 | class CachingNoisePredictor(NoisePredictor):
313 | def __init__(self):
314 | self.cached_prediction = None
315 |
316 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
317 | if self.cached_prediction is None:
318 | self.cached_prediction = self.predict_noise_uncached(x, timestep, model, conds, model_options, seed)
319 |
320 | return self.cached_prediction
321 |
322 | def predict_noise_uncached(self, x, timestep, model, conds, model_options, seed):
323 | raise NotImplementedError
324 |
325 | def begin_sample(self):
326 | self.cached_prediction = None
327 |
328 | def end_sample(self):
329 | self.cached_prediction = None
330 |
331 | class ConditionedPredictor(CachingNoisePredictor):
332 | INPUTS = {
333 | "required": {
334 | "conditioning": ("CONDITIONING",),
335 | "name": ("STRING", {
336 | "multiline": False,
337 | "default": "positive"
338 | }),
339 | }
340 | }
341 |
342 | def __init__(self, conditioning, name):
343 | super().__init__()
344 |
345 | self.cond = convert_cond(conditioning)
346 | self.cond_name = name
347 |
348 | def get_conds(self):
349 | return { self.cond_name: self.cond }
350 |
351 | def get_models(self):
352 | return self.get_models_from_conds()
353 |
354 | def n_samples(self):
355 | return 1
356 |
357 | def predict_noise_uncached(self, x, timestep, model, conds, model_options, seed):
358 | return calc_cond_batch(model, [conds[self.cond_name]], x, timestep, model_options)[0]
359 |
360 | class CombinePredictor(NoisePredictor):
361 | INPUTS = {
362 | "required": {
363 | "prediction_A": ("PREDICTION",),
364 | "prediction_B": ("PREDICTION",),
365 | "operation": ([
366 | "A + B",
367 | "A - B",
368 | "A * B",
369 | "A / B",
370 | "A proj B",
371 | "A oproj B",
372 | "min(A, B)",
373 | "max(A, B)",
374 | ],)
375 | }
376 | }
377 |
378 | def __init__(self, prediction_A, prediction_B, operation):
379 | self.lhs = prediction_A
380 | self.rhs = prediction_B
381 |
382 | match operation:
383 | case "A + B":
384 | self.op = torch.add
385 | case "A - B":
386 | self.op = torch.sub
387 | case "A * B":
388 | self.op = torch.mul
389 | case "A / B":
390 | self.op = torch.div
391 | case "A proj B":
392 | self.op = proj
393 | case "A oproj B":
394 | self.op = oproj
395 | case "min(A, B)":
396 | self.op = torch.minimum
397 | case "max(A, B)":
398 | self.op = torch.maximum
399 | case _:
400 |                 raise RuntimeError(f"unsupported operation: {operation}")
401 |
402 | def get_conds(self):
403 | return self.merge_conds(self.lhs, self.rhs)
404 |
405 | def get_models(self):
406 | return self.merge_models(self.lhs, self.rhs)
407 |
408 | def get_preds(self):
409 | return self.merge_preds(self.lhs, self.rhs)
410 |
411 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
412 | lhs = self.lhs.predict_noise(x, timestep, model, conds, model_options, seed)
413 | rhs = self.rhs.predict_noise(x, timestep, model, conds, model_options, seed)
414 | return self.op(lhs, rhs)
415 |
416 | class InterpolatePredictor(NoisePredictor):
417 | INPUTS = {
418 | "required": {
419 | "prediction_A": ("PREDICTION",),
420 | "prediction_B": ("PREDICTION",),
421 | "scale_B": ("FLOAT", {"default": 0.5, "step": 0.01, "min": 0.0, "max": 1.0})
422 | }
423 | }
424 |
425 | def __init__(self, prediction_A, prediction_B, scale_B):
426 | self.lhs = prediction_A
427 | self.rhs = prediction_B
428 | self.lerp = scale_B
429 |
430 | def get_conds(self):
431 | return self.merge_conds(self.lhs, self.rhs)
432 |
433 | def get_models(self):
434 | return self.merge_models(self.lhs, self.rhs)
435 |
436 | def get_preds(self):
437 | return self.merge_preds(self.lhs, self.rhs)
438 |
439 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
440 | match self.lerp:
441 | case 0.0:
442 | return self.lhs.predict_noise(x, timestep, model, conds, model_options, seed)
443 |
444 | case 1.0:
445 | return self.rhs.predict_noise(x, timestep, model, conds, model_options, seed)
446 |
447 | case _:
448 | return torch.lerp(
449 | self.lhs.predict_noise(x, timestep, model, conds, model_options, seed),
450 | self.rhs.predict_noise(x, timestep, model, conds, model_options, seed),
451 | self.lerp
452 | )
453 |
454 | class SwitchPredictor(NoisePredictor):
455 |     """Switches predictions for the specified sigmas."""
456 | INPUTS = {
457 | "required": {
458 | "prediction_A": ("PREDICTION",),
459 | "prediction_B": ("PREDICTION",),
460 | "sigmas_B": ("SIGMAS",),
461 | }
462 | }
463 |
464 | def __init__(self, prediction_A, prediction_B, sigmas_B):
465 | self.lhs = prediction_A
466 | self.rhs = prediction_B
467 | self.sigmas = sigmas_B
468 |
469 | def get_conds(self):
470 | return self.merge_conds(self.lhs, self.rhs)
471 |
472 | def get_models(self):
473 | return self.merge_models(self.lhs, self.rhs)
474 |
475 | def get_preds(self):
476 | return self.merge_preds(self.lhs, self.rhs)
477 |
478 | def predict_noise(self, x, sigma, model, conds, model_options, seed):
479 | rhs_mask = torch.isin(sigma.cpu(), self.sigmas)
480 | lhs_inds = torch.argwhere(~rhs_mask).squeeze(1)
481 | rhs_inds = torch.argwhere(rhs_mask).squeeze(1)
482 |
483 | if len(lhs_inds) == 0:
484 | return self.rhs.predict_noise(x, sigma, model, conds, model_options, seed)
485 |
486 | if len(rhs_inds) == 0:
487 | return self.lhs.predict_noise(x, sigma, model, conds, model_options, seed)
488 |
489 | preds = torch.empty_like(x)
490 | preds[lhs_inds] = self.lhs.predict_noise(
491 | x[lhs_inds],
492 | sigma[lhs_inds],
493 | model,
494 | conds,
495 | model_options,
496 | seed
497 | )
498 | preds[rhs_inds] = self.rhs.predict_noise(
499 | x[rhs_inds],
500 | sigma[rhs_inds],
501 | model,
502 | conds,
503 | model_options,
504 | seed
505 | )
506 |
507 | return preds
508 |
509 | class EarlyMiddleLatePredictor(NoisePredictor):
510 | """Switches predictions based on an early-middle-late schedule."""
511 | INPUTS = {
512 | "required": {
513 | "sigmas": ("SIGMAS",),
514 | "early_prediction": ("PREDICTION",),
515 | "middle_prediction": ("PREDICTION",),
516 | "late_prediction": ("PREDICTION",),
517 | "early_steps": ("INT", { "min": 0, "max": 1000, "default": 1 }),
518 | "late_steps": ("INT", { "min": 0, "max": 1000, "default": 5 }),
519 | }
520 | }
521 |
522 | def __init__(self, early_prediction, middle_prediction, late_prediction, sigmas, early_steps, late_steps):
523 | self.early_pred = early_prediction
524 | self.middle_pred = middle_prediction
525 | self.late_pred = late_prediction
526 |
527 | late_step = -late_steps - 1
528 |
529 | self.early_sigmas = sigmas[:early_steps]
530 | self.middle_sigmas = sigmas[early_steps:late_step]
531 | self.late_sigmas = sigmas[late_step:-1]
532 |
533 | if torch.any(torch.isin(self.early_sigmas, self.middle_sigmas)) \
534 | or torch.any(torch.isin(self.early_sigmas, self.late_sigmas)) \
535 | or torch.any(torch.isin(self.middle_sigmas, self.late_sigmas)) \
536 | :
537 | raise ValueError("Sigma schedule is ambiguous.")
538 |
539 | def get_conds(self):
540 | return self.merge_conds(self.early_pred, self.middle_pred, self.late_pred)
541 |
542 | def get_models(self):
543 | return self.merge_models(self.early_pred, self.middle_pred, self.late_pred)
544 |
545 | def get_preds(self):
546 | return self.merge_preds(self.early_pred, self.middle_pred, self.late_pred)
547 |
548 | def predict_noise(self, x, sigma, model, conds, model_options, seed):
549 | cpu_sigmas = sigma.cpu()
550 | early_inds = torch.argwhere(torch.isin(cpu_sigmas, self.early_sigmas)).squeeze(1)
551 | middle_inds = torch.argwhere(torch.isin(cpu_sigmas, self.middle_sigmas)).squeeze(1)
552 | late_inds = torch.argwhere(torch.isin(cpu_sigmas, self.late_sigmas)).squeeze(1)
553 |
554 | preds = torch.empty_like(x)
555 |
556 | assert (len(early_inds) + len(middle_inds) + len(late_inds)) == len(x)
557 |
558 | if len(early_inds) > 0:
559 | preds[early_inds] = self.early_pred.predict_noise(
560 | x[early_inds],
561 | sigma[early_inds],
562 | model,
563 | conds,
564 | model_options,
565 | seed
566 | )
567 |
568 | if len(middle_inds) > 0:
569 | preds[middle_inds] = self.middle_pred.predict_noise(
570 | x[middle_inds],
571 | sigma[middle_inds],
572 | model,
573 | conds,
574 | model_options,
575 | seed
576 | )
577 |
578 | if len(late_inds) > 0:
579 | preds[late_inds] = self.late_pred.predict_noise(
580 | x[late_inds],
581 | sigma[late_inds],
582 | model,
583 | conds,
584 | model_options,
585 | seed
586 | )
587 |
588 | return preds
589 |
590 | class ScaledGuidancePredictor(NoisePredictor):
591 | """Implements A * scale + B"""
592 | INPUTS = {
593 | "required": {
594 | "guidance": ("PREDICTION",),
595 | "baseline": ("PREDICTION",),
596 | "guidance_scale": ("FLOAT", {
597 | "default": 6.0,
598 | "step": 0.1,
599 | "min": 0.0,
600 | "max": 100.0,
601 | }),
602 | "stddev_rescale": ("FLOAT", {
603 | "default": 0.0,
604 | "step": 0.1,
605 | "min": 0.0,
606 | "max": 1.0,
607 | }),
608 | }
609 | }
610 |
611 | def __init__(self, guidance, baseline, guidance_scale, stddev_rescale):
612 | self.lhs = guidance
613 | self.rhs = baseline
614 | self.scale = guidance_scale
615 | self.rescale = stddev_rescale
616 |
617 | def get_conds(self):
618 | return self.merge_conds(self.lhs, self.rhs)
619 |
620 | def get_models(self):
621 | return self.merge_models(self.lhs, self.rhs)
622 |
623 | def get_preds(self):
624 | return self.merge_preds(self.lhs, self.rhs)
625 |
626 | def predict_noise(self, x, sigma, model, conds, model_options, seed):
627 | g = self.lhs.predict_noise(x, sigma, model, conds, model_options, seed)
628 | b = self.rhs.predict_noise(x, sigma, model, conds, model_options, seed)
629 |
630 | if self.rescale <= 0.0:
631 | return g * self.scale + b
632 |
633 | # CFG Rescale https://arxiv.org/pdf/2305.08891.pdf Sec. 3.4
634 | # Originally in eps -> v space, but now very clean in x0 thanks to Gaeros.
635 |
636 | sigma = sigma.view(sigma.shape[:1] + (1,) * (b.ndim - 1))
637 | x_norm = x * (1. / (sigma.square() + 1.0))
638 |
639 | x0 = x_norm - (g + b)
640 | x0_cfg = x_norm - (g * self.scale + b)
641 |
642 | std_x0 = torch.std(x0, dim=(1, 2, 3), keepdim=True)
643 | std_x0_cfg = torch.std(x0_cfg, dim=(1, 2, 3), keepdim=True)
644 |
645 | x0_cfg_norm = x0_cfg * (std_x0 / std_x0_cfg)
646 | x0_rescaled = torch.lerp(x0_cfg, x0_cfg_norm, self.rescale)
647 |
648 | return x_norm - x0_rescaled
649 |
650 | class CharacteristicGuidancePredictor(CachingNoisePredictor):
651 | """Implements Characteristic Guidance with Anderson Acceleration
652 |
653 | https://arxiv.org/pdf/2312.07586.pdf"""
654 |
655 | INPUTS = {
656 | "required": {
657 | "cond": ("PREDICTION",),
658 | "uncond": ("PREDICTION",),
659 | "guidance_scale": ("FLOAT", { "default": 6.0, "step": 0.1, "min": 1.0, "max": 100.0 }),
660 | "history": ("INT", { "default": 2, "min": 1 }),
661 | "log_step_size": ("FLOAT", { "default": -3, "step": 0.1, "min": -6, "max": 0 }),
662 | "log_tolerance": ("FLOAT", { "default": -4.0, "step": 0.1, "min": -6.0, "max": -2.0 }),
663 | "keep_tolerance": ("FLOAT", { "default": 1.0, "step": 1.0, "min": 1.0, "max": 1000.0 }),
664 | "reuse_scale": ("FLOAT", { "default": 0.0, "step": 0.0001, "min": 0.0, "max": 1.0 }),
665 | "max_steps": ("INT", { "default": 20, "min": 5, "max": 1000 }),
666 | "precondition_gradients": ("BOOLEAN", { "default": True }),
667 | },
668 | "optional": {
669 | "fallback": ("PREDICTION",),
670 | }
671 | }
672 |
673 | def __init__(
674 | self,
675 | cond,
676 | uncond,
677 | guidance_scale,
678 | history,
679 | log_step_size,
680 | log_tolerance,
681 | keep_tolerance,
682 | max_steps,
683 | reuse_scale,
684 | precondition_gradients = True,
685 | fallback = None
686 | ):
687 | super().__init__()
688 |
689 | self.cond = cond
690 | self.uncond = uncond
691 | self.fallback = fallback
692 | self.scale = guidance_scale
693 | self.history = history
694 | self.step_size = 10 ** log_step_size
695 | self.tolerance = 10 ** log_tolerance
696 | self.keep_tolerance = keep_tolerance
697 | self.max_steps = max_steps
698 | self.reuse = reuse_scale
699 | self.precondition_gradients = precondition_gradients
700 |
701 | self.cond_deps = set()
702 | self.uncond_deps = set()
703 | self.restore_preds = []
704 | self.prev_dx = None
705 | self.sample = 0
706 | self.pbar = None
707 |
708 | def get_conds(self):
709 | return self.merge_conds(self.cond, self.uncond, self.fallback)
710 |
711 | def get_models(self):
712 | return self.merge_models(self.cond, self.uncond, self.fallback)
713 |
714 | def get_preds(self):
715 | return self.merge_preds(self.cond, self.uncond, self.fallback)
716 |
717 | def begin_sampling(self):
718 | super().begin_sampling()
719 |
720 | self.cond_deps = self.cond.get_preds()
721 | self.uncond_deps = self.uncond.get_preds()
722 | self.restore_preds.clear()
723 | self.prev_dx = None
724 | self.sample = 0
725 |
726 | if comfy.utils.PROGRESS_BAR_ENABLED:
727 | self.pbar = tqdm(bar_format="[{rate_fmt}] {desc} ")
728 | self.pbar.set_description_str("CHG")
729 |
730 | def end_sampling(self):
731 | super().end_sampling()
732 |
733 | self.cond_deps.clear()
734 | self.uncond_deps.clear()
735 | self.restore_preds.clear()
736 | self.prev_dx = None
737 | self.sample = 0
738 |
739 | if self.pbar is not None:
740 | self.pbar.close()
741 | self.pbar = None
742 |
743 | def predict_noise_cond(self, *args):
744 | return self.predict_noise_sample(self.cond, self.cond_deps, *args)
745 |
746 | def predict_noise_uncond(self, *args):
747 | return self.predict_noise_sample(self.uncond, self.uncond_deps, *args)
748 |
749 | def predict_noise_fallback(self, *args):
750 | return self.predict_noise_sample(self.fallback, self.fallback.get_preds(), *args)
751 |
752 | @staticmethod
753 | def predict_noise_sample(pred, deps, *args):
754 | for dep in deps:
755 | dep.begin_sample()
756 |
757 | try:
758 | return pred.predict_noise(*args)
759 | finally:
760 | for dep in deps:
761 | dep.end_sample()
762 |
763 | def status(self, msg, start=False, end=False):
764 | if self.pbar is not None:
765 | if start:
766 | self.pbar.unpause()
767 |
768 | self.pbar.set_description_str(f"CHG sample {self.sample:>2}: {msg}")
769 |
770 | if end:
771 | print()
772 | self.pbar.set_description_str("CHG")
773 |
774 | def progress(self):
775 | if self.pbar is not None:
776 | self.pbar.update()
777 |
778 | def restore(self):
779 | while len(self.restore_preds) > 0:
780 | pred, cached = self.restore_preds.pop()
781 | pred.cached_prediction = cached
782 | del cached
783 |
784 | def predict_noise_uncached(self, x, sigma, model, conds, model_options, seed):
785 | self.sample += 1
786 | self.status(f"starting ({len(x)}/{len(x)})", start=True)
787 |
788 | xb, xc, xh, xw = x.shape
789 | cvg = []
790 | uncvg = list(range(xb))
791 |
792 | dx_b = []
793 | g_b = []
794 |
795 | # initial prediction, for regularization, step 1, and fallback
796 | # p_r = (model(x) - model(x|c)) * sigma
797 | p_c = self.cond.predict_noise(x, sigma, model, conds, model_options, seed)
798 | p_u = self.uncond.predict_noise(x, sigma, model, conds, model_options, seed)
799 | p_r = (p_u - p_c) * sigma[:, None, None, None]
800 |
801 | if self.fallback is not None:
802 | del p_c, p_u
803 |
804 | # remember predictions for the original latents so we can restore them later
805 | # end the current sample step since we're about to sample with a different x
806 | for pred in (self.cond_deps | self.uncond_deps):
807 | if isinstance(pred, CachingNoisePredictor):
808 | self.restore_preds.append((pred, pred.cached_prediction))
809 |
810 | pred.end_sample()
811 |
812 | # when dx is 0, we can re-use the predictions made for regularization and save a step
813 | #
814 | # dx = 0
815 | # r = dx - p_r = -p_r
816 | # v = (model(x + (w+1)*dx) - model(x + w*dx|c)) * sigma = (model(x) - model(x|c)) * sigma = p_r
817 | #
818 | # P_r o v = proj(v, r) = proj(p_r, -p_r) = p_r
819 | # g = dx - P_r o v = -p_r
820 | #
821 | # dx' = dx - gamma * g = p_r * gamma
822 |
823 | if self.prev_dx is None:
824 | start_at = 1
825 | dx = p_r * self.step_size
826 |
827 | if self.reuse != 0.0:
828 | self.prev_dx = dx
829 |
830 | if self.history > 1:
831 | dx_b.append(torch.zeros_like(x))
832 | g_b.append(-p_r)
833 | else:
834 | start_at = 0
835 | dx = self.prev_dx
836 | dx *= self.reuse
837 |
838 | self.progress()
839 |
840 | for s in range(start_at, self.max_steps):
841 | comfy.model_management.throw_exception_if_processing_interrupted()
842 | self.status(f"step {s+1}/{self.max_steps} ({len(cvg)}/{len(x)})")
843 |
844 | # we only want to step the unconverged samples
845 | ub = len(uncvg)
846 | dxu = dx[uncvg]
847 |
848 | sigu = sigma[uncvg]
849 | cxu = x[uncvg] + dxu * (self.scale + 1)
850 |
851 | # p = (model(x + dx * (scale + 1)) - model(x + dx * scale|c)) * sigma
852 | p = (
853 | self.predict_noise_uncond(cxu, sigu, model, conds, model_options, seed) -
854 | self.predict_noise_cond(cxu - dxu, sigu, model, conds, model_options, seed)
855 | ) * sigu.view(ub, 1, 1, 1)
856 | del cxu, sigu
857 |
858 | # g = dx - P o (model(x + dx * (scale + 1)) - model(x + dx * scale|c)) * sigma
859 | g = dxu - proj(p, dxu - p_r[uncvg])
860 | del p
861 |
862 | # remember norm before AA to test for convergence
863 | g_norm = torch.linalg.vector_norm(g, dim=(1, 2, 3)).cpu()
864 |
865 | if self.history > 1:
866 | dx_b.append(dxu.clone())
867 | g_b.append(g.clone())
868 |
869 | if len(dx_b) >= 2:
870 | dx_b[-2] = dxu - dx_b[-2]
871 | g_b[-2] = g - g_b[-2]
872 |
873 | def as_mat(buf):
874 | # (M[], B, C, H, W) -> (B, CHW, M[:-1])
875 | return torch.stack([buf[m].view(ub, xc*xh*xw) for m in range(len(buf) - 1)], dim=2)
876 |
877 | # w = argmin_w ||A_g w - b_g||_l2
878 | a_g = as_mat(g_b)
879 | a_g_norm = None
880 |
881 | if self.precondition_gradients:
882 | a_g_norm = torch.maximum(torch.linalg.vector_norm(a_g, dim=-2, keepdim=True), torch.tensor(1e-04))
883 | a_g /= a_g_norm
884 |
885 | w = torch.linalg.lstsq(a_g, g.view(ub, xc*xh*xw, 1)).solution
886 |
887 | # g_AA = b_g - A_g w
888 | g -= (a_g @ w).view(ub, xc, xh, xw)
889 | del a_g
890 |
891 | # dx_AA = b_x - A_x w
892 | a_dx_b = as_mat(dx_b)
893 |
894 | if self.precondition_gradients:
895 | a_dx_b /= a_g_norm
896 |
897 | dx[uncvg] -= (a_dx_b @ w).view(ub, xc, xh, xw)
898 | del a_dx_b, w, a_g_norm
899 |
900 | if len(dx_b) >= self.history:
901 | del dx_b[0], g_b[0]
902 |
903 | # dx = dx_AA - gamma g_AA
904 | dx[uncvg] -= g * self.step_size
905 | del g
906 |
907 | # until ||g||_l2 <= tolerance * dim(g)
908 | uidx = []
909 |
910 | tolerance = self.tolerance * xc * xh * xw
911 | if s == self.max_steps - 1:
912 | tolerance *= self.keep_tolerance
913 |
914 | for i in reversed(range(len(uncvg))):
915 | if g_norm[i] <= tolerance:
916 | cvg.append(uncvg.pop(i))
917 | else:
918 | uidx.append(i)
919 |
920 | del g_norm
921 |
922 | if len(uncvg) < ub:
923 | cvg.sort()
924 |
925 | if len(uncvg) == 0:
926 | self.progress()
927 | break
928 |
929 | if len(uncvg) < ub:
930 | uidx.reverse()
931 |
932 | for m in range(len(dx_b)):
933 | dx_b[m] = dx_b[m][uidx]
934 | g_b[m] = g_b[m][uidx]
935 |
936 | self.progress()
937 |
938 | dx_b.clear()
939 | g_b.clear()
940 | del p_r
941 |
942 | result = torch.empty_like(x)
943 |
944 | if len(cvg) != 0:
945 | self.status(f"sampling ({len(cvg)}/{len(x)})")
946 |
947 | # predict only the converged samples
948 | dxc = dx[cvg]
949 | sigc = sigma[cvg]
950 | cxc = x[cvg] + dxc * self.scale
951 |
952 | # chg = model(x + dx * scale|c) * (scale + 1) - model(x + dx * (scale + 1)) * scale
953 | result[cvg] = (
954 | self.predict_noise_cond(cxc, sigc, model, conds, model_options, seed) * (self.scale + 1.0) -
955 | self.predict_noise_uncond(cxc + dxc, sigc, model, conds, model_options, seed) * self.scale
956 | )
957 | del dxc, sigc, cxc
958 |
959 | self.progress()
960 |
961 | if len(uncvg) != 0:
962 | if self.pbar is None:
963 | print(f"CHG sample {self.sample}: {len(uncvg)}/{len(x)} samples did not converge")
964 |
965 | if self.fallback is not None:
966 | self.status(f"fallback ({len(uncvg)}/{len(x)})")
967 |
968 | # in the very special case that nothing converged, we can restore now to hopefully speed up the fallback
969 | if len(cvg) == 0:
970 | self.restore()
971 | result = self.fallback.predict_noise(x, sigma, model, conds, model_options, seed)
972 | else:
973 | result[uncvg] = self.predict_noise_fallback(x[uncvg], sigma[uncvg], model, conds, model_options, seed)
974 |
975 | self.progress()
976 | else:
977 | # use vanilla CFG for unconverged samples if no fallback was specified
978 | result[uncvg] = p_c[uncvg] * (self.scale + 1.0) - p_u[uncvg] * self.scale
979 |
980 | # make sure the unconverged corrections are not reused
981 | if self.reuse != 0.0:
982 | for i in uncvg:
983 | dx[i].zero_()
984 |
985 | if self.fallback is None:
986 | del p_c, p_u
987 |
988 | if self.pbar is not None:
989 | self.status(f"{s+1} steps, {len(uncvg)} unconverged", end=True)
990 |
991 | self.restore()
992 | return result
993 |
994 | class AvoidErasePredictor(NoisePredictor):
995 | """Implements Avoid and Erase V2 guidance."""
996 |
997 | INPUTS = {
998 | "required": {
999 | "positive": ("PREDICTION",),
1000 | "negative": ("PREDICTION",),
1001 | "empty": ("PREDICTION",),
1002 | "erase_scale": ("FLOAT", {
1003 | "default": 0.2,
1004 | "step": 0.01,
1005 | "min": 0.0,
1006 | "max": 1.0,
1007 | }),
1008 | }
1009 | }
1010 |
1011 | OUTPUTS = { "guidance": "PREDICTION" }
1012 |
1013 | def __init__(self, positive, negative, empty, erase_scale):
1014 | self.positive_pred = positive
1015 | self.negative_pred = negative
1016 | self.empty_pred = empty
1017 | self.erase_scale = erase_scale
1018 |
1019 | def get_conds(self):
1020 | return self.merge_conds(self.positive_pred, self.negative_pred, self.empty_pred)
1021 |
1022 | def get_models(self):
1023 | return self.merge_models(self.positive_pred, self.negative_pred, self.empty_pred)
1024 |
1025 | def get_preds(self):
1026 | return self.merge_preds(self.positive_pred, self.negative_pred, self.empty_pred)
1027 |
1028 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
1029 | pos = self.positive_pred.predict_noise(x, timestep, model, conds, model_options, seed)
1030 | neg = self.negative_pred.predict_noise(x, timestep, model, conds, model_options, seed)
1031 | empty = self.empty_pred.predict_noise(x, timestep, model, conds, model_options, seed)
1032 |
1033 | pos_ind = pos - empty
1034 | neg_ind = neg - empty
1035 | return oproj(pos_ind, neg_ind) - oproj(neg_ind, pos_ind) * self.erase_scale
1036 |
1037 | def dot(a, b):
1038 | return (a * b).sum(dim=(1, 2, 3), keepdim=True)
1039 |
1040 | def proj(a, b):
1041 | a_dot_b = dot(a, b)
1042 | b_dot_b = dot(b, b)
1043 | divisor = torch.where(
1044 | b_dot_b != 0,
1045 | b_dot_b,
1046 | torch.ones_like(b_dot_b)
1047 | )
1048 |
1049 | return b * (a_dot_b / divisor)
1050 |
1051 | def oproj(a, b):
1052 | return a - proj(a, b)
1053 |
1054 | class ScalePredictor(NoisePredictor):
1055 | INPUTS = {
1056 | "required": {
1057 | "prediction": ("PREDICTION",),
1058 | "scale": ("FLOAT", {
1059 | "default": 1.0,
1060 | "step": 0.01,
1061 | "min": -100.0,
1062 | "max": 100.0,
1063 | })
1064 | }
1065 | }
1066 |
1067 | def __init__(self, prediction, scale):
1068 | self.inner = prediction
1069 | self.scale = scale
1070 |
1071 | def get_conds(self):
1072 | return self.inner.get_conds()
1073 |
1074 | def get_models(self):
1075 | return self.inner.get_models()
1076 |
1077 | def get_preds(self):
1078 | return self.merge_preds(self.inner)
1079 |
1080 | def predict_noise(self, x, timestep, model, conds, model_options, seed):
1081 | return self.inner.predict_noise(x, timestep, model, conds, model_options, seed) * self.scale
1082 |
1083 | class CFGPredictor(CachingNoisePredictor):
1084 | INPUTS = {
1085 | "required": {
1086 | "positive": ("CONDITIONING",),
1087 | "negative": ("CONDITIONING",),
1088 | "cfg_scale": ("FLOAT", {
1089 | "default": 6.0,
1090 | "min": 1.0,
1091 | "max": 100.0,
1092 | "step": 0.5,
1093 | })
1094 | }
1095 | }
1096 |
1097 | def __init__(self, positive, negative, cfg_scale):
1098 | super().__init__()
1099 |
1100 | self.positive = convert_cond(positive)
1101 | self.negative = convert_cond(negative)
1102 | self.cfg_scale = cfg_scale
1103 |
1104 | def get_conds(self):
1105 | return {
1106 | "positive": self.positive,
1107 | "negative": self.negative,
1108 | }
1109 |
1110 | def get_models(self):
1111 | return self.get_models_from_conds()
1112 |
1113 | def n_samples(self):
1114 | return 2
1115 |
1116 | def predict_noise_uncached(self, x, timestep, model, conds, model_options, seed):
1117 | cond, uncond = calc_cond_batch(
1118 | model,
1119 | [conds["positive"], conds["negative"]],
1120 | x,
1121 | timestep,
1122 | model_options
1123 | )
1124 |
1125 | return uncond + (cond - uncond) * self.cfg_scale
1126 |
1127 | class PerpNegPredictor(CachingNoisePredictor):
1128 | INPUTS = {
1129 | "required": {
1130 | "positive": ("CONDITIONING",),
1131 | "negative": ("CONDITIONING",),
1132 | "empty": ("CONDITIONING",),
1133 | "cfg_scale": ("FLOAT", {
1134 | "default": 6.0,
1135 | "min": 1.0,
1136 | "max": 100.0,
1137 | "step": 0.5,
1138 | }),
1139 | "neg_scale": ("FLOAT", {
1140 | "default": 1.0,
1141 | "min": 0.0,
1142 | "max": 2.0,
1143 | "step": 0.05,
1144 | })
1145 | }
1146 | }
1147 |
1148 | def __init__(self, positive, negative, empty, cfg_scale, neg_scale):
1149 | super().__init__()
1150 |
1151 | self.positive = convert_cond(positive)
1152 | self.negative = convert_cond(negative)
1153 | self.empty = convert_cond(empty)
1154 | self.cfg_scale = cfg_scale
1155 | self.neg_scale = neg_scale
1156 |
1157 | def get_conds(self):
1158 | return {
1159 | "positive": self.positive,
1160 | "negative": self.negative,
1161 | "empty": self.empty,
1162 | }
1163 |
1164 | def get_models(self):
1165 | return self.get_models_from_conds()
1166 |
1167 | def n_samples(self):
1168 | return 3
1169 |
1170 | def predict_noise_uncached(self, x, timestep, model, conds, model_options, seed):
1171 | cond, uncond, empty = comfy.samplers.calc_cond_batch(
1172 | model,
1173 | [conds["positive"], conds["negative"], conds["empty"]],
1174 | x,
1175 | timestep,
1176 | model_options
1177 | )
1178 |
1179 | positive = cond - empty
1180 | negative = uncond - empty
1181 | perp_neg = oproj(negative, positive) * self.neg_scale
1182 | return empty + (positive - perp_neg) * self.cfg_scale
1183 |
1184 | NODE_CLASS_MAPPINGS = {
1185 | "SamplerCustomPrediction": SamplerCustomPrediction,
1186 | }
1187 | NODE_DISPLAY_NAME_MAPPINGS = {
1188 | "SamplerCustomPrediction": "Sample Predictions",
1189 | }
1190 |
1191 | def make_node(predictor, display_name, class_name=None, category="sampling/prediction"):
1192 | if class_name is None:
1193 | class_name = predictor.__name__[:-2] + "ion" # Predictor -> Prediction
1194 |
1195 | cls = type(class_name, (), {
1196 | "FUNCTION": "get_predictor",
1197 | "CATEGORY": category,
1198 | "INPUT_TYPES": classmethod(lambda cls: predictor.INPUTS),
1199 | "RETURN_TYPES": tuple(predictor.OUTPUTS.values()),
1200 | "RETURN_NAMES": tuple(predictor.OUTPUTS.keys()),
1201 | "get_predictor": lambda self, **kwargs: (predictor(**kwargs),),
1202 | })
1203 |
1204 | setattr(sys.modules[__name__], class_name, cls)
1205 | NODE_CLASS_MAPPINGS[class_name] = cls
1206 | NODE_DISPLAY_NAME_MAPPINGS[class_name] = display_name
1207 |
1208 | make_node(ConditionedPredictor, "Conditioned Prediction")
1209 | make_node(CombinePredictor, "Combine Predictions", class_name="CombinePredictions")
1210 | make_node(InterpolatePredictor, "Interpolate Predictions", class_name="InterpolatePredictions")
1211 | make_node(SwitchPredictor, "Switch Predictions", class_name="SwitchPredictions")
1212 | make_node(EarlyMiddleLatePredictor, "Switch Early/Middle/Late Predictions")
1213 | make_node(ScaledGuidancePredictor, "Scaled Guidance Prediction")
1214 | make_node(CharacteristicGuidancePredictor, "Characteristic Guidance Prediction")
1215 | make_node(AvoidErasePredictor, "Avoid and Erase Prediction")
1216 | make_node(ScalePredictor, "Scale Prediction")
1217 | make_node(CFGPredictor, "CFG Prediction")
1218 | make_node(PerpNegPredictor, "Perp-Neg Prediction")
1219 |
--------------------------------------------------------------------------------
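The `proj`/`oproj` helpers above drive both the CHG regularization and the avoid-and-erase / perp-neg math. As a minimal pure-Python sketch of the same geometry on flat vectors (the repo's versions operate batchwise over `(C, H, W)` tensors with the same zero-norm guard; the toy vectors here are illustrative only):

```python
def dot(a, b):
    # plain dot product over flat vectors
    return sum(x * y for x, y in zip(a, b))

def proj(a, b):
    # projection of a onto b; a zero-norm b projects to the zero vector
    b_dot_b = dot(b, b)
    if b_dot_b == 0:
        return [0.0] * len(a)
    scale = dot(a, b) / b_dot_b
    return [x * scale for x in b]

def oproj(a, b):
    # rejection: the component of a orthogonal to b
    return [x - y for x, y in zip(a, proj(a, b))]

a, b = [3.0, 4.0], [1.0, 0.0]
print(proj(a, b))           # [3.0, 0.0]
print(oproj(a, b))          # [0.0, 4.0]
print(dot(oproj(a, b), b))  # 0.0 -- rejection is orthogonal to b
```

The `AvoidErasePredictor` result `oproj(pos_ind, neg_ind) - oproj(neg_ind, pos_ind) * erase_scale` is then just "the positive direction with the negative component removed, minus a scaled copy of the reverse".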
/nodes/nodes_sigma.py:
--------------------------------------------------------------------------------
1 | import comfy
2 | import torch
3 |
4 | class SelectSigmas:
5 | """Selects a subset of sigmas."""
6 |
7 | @classmethod
8 | def INPUT_TYPES(cls):
9 | return {
10 | "required": {
11 | "sigmas": ("SIGMAS",),
12 | "select": ("STRING", { "multiline": False, "default": "mod 2" }),
13 | "chained": ("BOOLEAN", { "default": False })
14 | }
15 | }
16 |
17 | RETURN_TYPES = ("SIGMAS",)
18 | RETURN_NAMES = ("selection",)
19 | FUNCTION = "get_sigmas"
20 | CATEGORY = "sampling/custom_sampling/sigmas"
21 |
22 | def get_sigmas(self, sigmas, select, chained):
23 | count = len(sigmas)
24 | if not chained:
25 | count -= 1
26 |
27 | if select.startswith("mod "):
28 | arg = int(select[4:])
29 | mask = list(range(arg - 1, count, arg))
30 | else:
31 | mask = []
32 | for idx in select.split(","):
33 | idx = idx.strip()
34 | if ":" in idx:
35 | start, end = idx.split(":")
36 | start = int(start) if start != "" else 0
37 | end = int(end) if end != "" else count
38 |
39 | mask.extend(
40 | idx for idx in range(start, end, 1 if start <= end else -1)
41 | if -count <= idx < count
42 | )
43 | elif idx != "":
44 | idx = int(idx)
45 | if -count <= idx < count:
46 | mask.append(idx)
47 |
48 | return sigmas[mask],
49 |
50 | class SplitAtSigma:
51 | """Splits a descending list of sigmas at the specified sigma."""
52 |
53 | @classmethod
54 | def INPUT_TYPES(cls):
55 | return {
56 | "required": {
57 | "sigmas": ("SIGMAS",),
58 | "sigma": ("FLOAT", {"default": 1.0, "min": 0.0}),
59 | }
60 | }
61 |
62 | RETURN_TYPES = ("SIGMAS", "SIGMAS")
63 | FUNCTION = "get_sigmas"
64 | CATEGORY = "sampling/custom_sampling/sigmas"
65 |
66 | def get_sigmas(self, sigmas, sigma):
67 | index = 0
68 | while index < len(sigmas) - 1 and sigmas[index].item() > sigma:
69 | index += 1
70 |
71 | return sigmas[:index+1], sigmas[index:]
72 |
73 | class LogSigmas:
74 | """Logs a list of sigmas to the console."""
75 |
76 | @classmethod
77 | def INPUT_TYPES(cls):
78 | return {
79 | "required": {
80 | "sigmas": ("SIGMAS",),
81 | "message": ("STRING", { "multiline": False, "default": "" }),
82 | }
83 | }
84 |
85 | RETURN_TYPES = ()
86 | FUNCTION = "log_sigmas"
87 | CATEGORY = "sampling/custom_sampling/sigmas"
88 | OUTPUT_NODE = True
89 |
90 | def log_sigmas(self, sigmas, message):
91 | print(f"{message or 'SIGMAS'}: {sigmas.tolist()}")
92 | return ()
93 |
94 | NODE_CLASS_MAPPINGS = {
95 | "SelectSigmas": SelectSigmas,
96 | "SplitAtSigma": SplitAtSigma,
97 | "LogSigmas": LogSigmas,
98 | }
99 |
100 | NODE_DISPLAY_NAME_MAPPINGS = {
101 | "SelectSigmas": "Select Sigmas",
102 | "SplitAtSigma": "Split at Sigma",
103 | "LogSigmas": "Log Sigmas",
104 | }
105 |
--------------------------------------------------------------------------------
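`SelectSigmas` above accepts either `mod k` (keep every k-th sigma) or a comma-separated list of indices and Python-style `start:end` ranges, with negatives allowed. A standalone sketch of that index parsing on plain integers (the function name is hypothetical; `count` stands for the length `get_sigmas` computes after the `chained` adjustment):

```python
def select_indices(count, select):
    # "mod k": keep every k-th index (k-1, 2k-1, ...)
    if select.startswith("mod "):
        k = int(select[4:])
        return list(range(k - 1, count, k))

    # otherwise: comma-separated ints and start:end ranges, negatives allowed
    mask = []
    for part in select.split(","):
        part = part.strip()
        if ":" in part:
            start, end = part.split(":")
            start = int(start) if start else 0
            end = int(end) if end else count
            step = 1 if start <= end else -1
            mask.extend(i for i in range(start, end, step) if -count <= i < count)
        elif part:
            i = int(part)
            if -count <= i < count:
                mask.append(i)
    return mask

print(select_indices(10, "mod 2"))     # [1, 3, 5, 7, 9]
print(select_indices(10, "0,2:5,-1"))  # [0, 2, 3, 4, -1]
```

The returned list is used directly as a fancy index into the `SIGMAS` tensor, so out-of-range entries are dropped rather than raising.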
/patches/colorPalette.js.patch:
--------------------------------------------------------------------------------
1 | --- a/web/extensions/core/colorPalette.js
2 | +++ b/web/extensions/core/colorPalette.js
3 | @@ -18,6 +18,7 @@ const colorPalettes = {
4 | "LATENT": "#FF9CF9", // light pink-purple
5 | "MASK": "#81C784", // muted green
6 | "MODEL": "#B39DDB", // light lavender-purple
7 | + "PREDICTION": "#30D5C8", // deep turquoise
8 | "STYLE_MODEL": "#C2FFAE", // light green-yellow
9 | "VAE": "#FF6E6E", // bright red
10 | "TAESD": "#DCC274", // cheesecake
11 |
--------------------------------------------------------------------------------
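The Characteristic Guidance loop in `nodes/nodes_pred.py` solves for the correction `dx` by Anderson-accelerated fixed-point iteration: the `torch.linalg.lstsq` solve over the `dx_b`/`g_b` history buffers. As a hypothetical illustration of that scheme (not the repo's API), here is a history-1 version on the toy problem `x = cos(x)`, where the least-squares solve collapses to a single ratio; all names and constants are illustrative:

```python
from math import cos

def anderson_step(x, g, prev_x, prev_g, gamma=1.0):
    # history-1 Anderson update: least squares over a single difference column
    dg = [a - b for a, b in zip(g, prev_g)]
    dx = [a - b for a, b in zip(x, prev_x)]
    denom = sum(d * d for d in dg) or 1.0            # guard against a zero column
    w = sum(a * d for a, d in zip(g, dg)) / denom    # w = argmin ||A_g w - g||
    g_aa = [a - w * d for a, d in zip(g, dg)]        # g_AA = g - A_g w
    x_aa = [a - w * d for a, d in zip(x, dx)]        # x_AA = x - A_x w
    return [a - gamma * b for a, b in zip(x_aa, g_aa)]

def residual(x):
    # residual of the fixed point x = cos(x)
    return [a - cos(a) for a in x]

x_prev = [0.0]
g_prev = residual(x_prev)
x = [a - b for a, b in zip(x_prev, g_prev)]  # one plain damped step to seed the history
for _ in range(8):
    g = residual(x)
    x_prev, g_prev, x = x, g, anderson_step(x, g, x_prev, g_prev)

print(x)  # close to the Dottie number 0.7390851...
```

With history 1 on a scalar this reduces to the secant method; the repo's version does the same thing per batch element over flattened `(C*H*W)` columns, with optional column preconditioning (`precondition_gradients`) before the `lstsq` solve.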