├── .github └── workflows │ └── publish.yml ├── .gitignore ├── LICENSE ├── README.jp.md ├── README.md ├── __init__.py ├── examples ├── efficiency-nodes.json ├── efficiency-nodes.png ├── everywhere_prompt_utilities.json ├── everywhere_prompt_utilities.png ├── extra_metadata.json ├── extra_metadata.png ├── filename_format.json ├── filename_format.png ├── lora_embedding_vae.json ├── lora_embedding_vae.png ├── sd3.json └── sd3.png ├── img └── save_image_with_metadata.png ├── py ├── __init__.py ├── capture.py ├── defs │ ├── __init__.py │ ├── captures.py │ ├── combo.py │ ├── ext │ │ ├── ComfyUI-FluxSettingsNode.py │ │ ├── easyuse_nodes.py │ │ ├── efficiency_nodes.py │ │ ├── rgthree.py │ │ └── size_from_presets.py │ ├── formatters.py │ ├── meta.py │ ├── samplers.py │ └── validators.py ├── hook.py ├── nodes │ ├── __init__.py │ ├── base.py │ └── node.py ├── trace.py └── utils │ ├── embedding.py │ └── hash.py └── pyproject.toml /.github/workflows/publish.yml: -------------------------------------------------------------------------------- 1 | name: Publish to Comfy registry 2 | on: 3 | workflow_dispatch: 4 | push: 5 | branches: 6 | - main 7 | paths: 8 | - "pyproject.toml" 9 | 10 | permissions: 11 | issues: write 12 | 13 | jobs: 14 | publish-node: 15 | name: Publish Custom Node to registry 16 | runs-on: ubuntu-latest 17 | if: ${{ github.repository_owner == 'nkchocoai' }} 18 | steps: 19 | - name: Check out code 20 | uses: actions/checkout@v4 21 | - name: Publish Custom Node 22 | uses: Comfy-Org/publish-node-action@v1 23 | with: 24 | ## Add your own personal access token to your Github Repository secrets and reference it here. 25 | personal_access_token: ${{ secrets.REGISTRY_ACCESS_TOKEN }} 26 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__ -------------------------------------------------------------------------------- /README.jp.md: -------------------------------------------------------------------------------- 1 | # ComfyUI-SaveImageWithMetaData 2 | ![SaveImageWithMetaData Preview](img/save_image_with_metadata.png) 3 | - [ComfyUI](https://github.com/comfyanonymous/ComfyUI)用のカスタムノードです。 4 | - 各ノードの入力値から取得したメタデータ(PNGInfo)つきの画像を保存するノードを追加します。 5 | - 動的に値を取得するため、色々な拡張機能のノードで出力された値をメタデータに追加することができます。 6 | 7 | ## インストール手順 8 | ``` 9 | cd /custom_nodes 10 | git clone https://github.com/nkchocoai/ComfyUI-SaveImageWithMetaData.git 11 | ``` 12 | 13 | ## 追加されるノード 14 | ### Save Image With Metadata 15 | - 入力として受け取った `images` をメタデータ(PNGInfo)つきの画像として保存します。 16 | - メタデータは `sampler_selection_method` で見つけたKSamplerノードの入力と以前に実行されたノードの入力から取得します。 17 | - 対象となるKSamplerノードは[py/defs/samplers.py](py/defs/samplers.py)と[py/defs/ext/](py/defs/ext/)配下のファイルの`SAMPLERS`のキーです。 18 | 19 | #### filename_prefix 20 | - `filename_prefix` で指定した文字列(Key)は取得した情報に置換されます。 21 | 22 | | Key | 置換先の情報 | 23 | | --------------------- | -------------------------- | 24 | | %seed% | シード値 | 25 | | %width% | 画像の幅 | 26 | | %height% | 画像の高さ | 27 | | %pprompt% | Positive Prompt | 28 | | %pprompt:<文字数n>% | Positive Promptの先頭n文字 | 29 | | %nprompt% | Negative Prompt | 30 | | %nprompt:<文字数n>% | Negative Promptの先頭n文字 | 31 | | %model% | Checkpoint名 | 32 | | %model:<文字数n>% | Checkpoint名の先頭n文字 | 33 | | %date% | 生成日時(yyyyMMddhhmmss) | 34 | | %date:<フォーマット>% | 生成日時 | 35 | 36 | - `%date:<フォーマット>%` の `<フォーマット>` で指定する識別子は以下の表を参照ください。 37 | 38 | | 識別子 | 説明 | 39 | | ------ | ---- | 40 | | yyyy | 年 | 41 | | MM | 月 | 42 | | dd | 日 | 43 | | 
hh | 時 | 44 | | mm | 分 | 45 | | ss | 秒 | 46 | 47 | #### sampler_selection_method 48 | - このノードよりも前に実行されたKSamplerノードを選ぶ方法を指定します。 49 | 50 | ##### Farthest 51 | - このノードに最も遠いKSamplerノードを選びます。 52 | - 例: [everywhere_prompt_utilities.png](examples/everywhere_prompt_utilities.png) において、上段のKSamplerノード(seed=12345)を選びます。 53 | 54 | ##### Nearest 55 | - このノードに最も近いKSamplerノードを選びます。 56 | - 例: [everywhere_prompt_utilities.png](examples/everywhere_prompt_utilities.png) において、下段のKSamplerノード(seed=67890)を選びます。 57 | 58 | ##### By node ID 59 | - ノードIDが `sampler_selection_node_id` であるKSamplerノードを選びます。 60 | 61 | ### Create Extra MetaData 62 | - 保存する画像に追加するメタデータを指定します。 63 | - 例: [extra_metadata.png](examples/extra_metadata.png)。 64 | 65 | ## 付与されるメタデータ 66 | - Positive prompt 67 | - Negative prompt 68 | - Steps 69 | - Sampler 70 | - CFG Scale 71 | - Seed 72 | - Clip skip 73 | - Size 74 | - Model 75 | - Model hash 76 | - VAE 77 | - KSamplerノードではなくSaveImageWithMetadataノードの入力から参照されます。 78 | - VAE hash 79 | - KSamplerノードではなくSaveImageWithMetadataノードの入力から参照されます。 80 | - Loras 81 | - Model name 82 | - Model hash 83 | - Strength model 84 | - Strength clip 85 | - Embeddings 86 | - Name 87 | - Hash 88 | - batch size >= 2の場合 : 89 | - Batch index 90 | - Batch size 91 | - Hashes 92 | - Model, Loras, Embeddings 93 | - [Civitai](https://civitai.com/)用 94 | 95 | ## 対応しているノード・拡張機能 96 | - 対応しているノードは以下のファイルをご確認ください。 97 | - [py/defs/captures.py](py/defs/captures.py) 98 | - [py/defs/samplers.py](py/defs/samplers.py) 99 | - 対応している拡張機能は以下のディレクトリをご確認ください。 100 | - [py/defs/ext/](py/defs/ext/) 101 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ComfyUI-SaveImageWithMetaData 2 | ![SaveImageWithMetaData Preview](img/save_image_with_metadata.png) 3 | 日本語版READMEは[こちら](README.jp.md)。 4 | 5 | - Custom node for [ComfyUI](https://github.com/comfyanonymous/ComfyUI). 6 | - Add a node to save images with metadata (PNGInfo) extracted from the input values of each node. 7 | - Since the values are extracted dynamically, values output by various extension nodes can be added to metadata. 8 | 9 | ## Installation 10 | ``` 11 | cd /custom_nodes 12 | git clone https://github.com/nkchocoai/ComfyUI-SaveImageWithMetaData.git 13 | ``` 14 | 15 | ## Nodes 16 | ### Save Image With Metadata 17 | - Saves the `images` received as input as an image with metadata (PNGInfo). 18 | - Metadata is extracted from the input of the KSampler node found by `sampler_selection_method` and the input of the previously executed node. 19 | - Target KSampler nodes are the key of `SAMPLERS` in the file [py/defs/samplers.py](py/defs/samplers.py) and the file in [py/defs/ext/](py/defs/ext/). 20 | 21 | #### filename_prefix 22 | - The string (Key) specified in `filename_prefix` will be replaced with the retrieved information. 
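- For a complete example, the workflow in [examples/filename_format.json](examples/filename_format.json) uses the prefix `siwm-%model:10%/%pprompt:20%-%nprompt:20%/%seed%-%width%x%height%_%date:yyyyMMdd-hhmmss%`, which builds the file path from the checkpoint name, both prompts, the seed, the image size, and the generation time (see the tables below for each key).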
23 | 24 | | Key | Replaced with | 25 | | --------------- | ------------------------------------- | 26 | | %seed% | Seed value | 27 | | %width% | Image width | 28 | | %height% | Image height | 29 | | %pprompt% | Positive Prompt | 30 | | %pprompt:[n]% | First n characters of Positive Prompt | 31 | | %nprompt% | Negative Prompt | 32 | | %nprompt:[n]% | First n characters of Negative Prompt | 33 | | %model% | Checkpoint name | 34 | | %model:[n]% | First n characters of Checkpoint name | 35 | | %date% | Date of generation (yyyyMMddhhmmss) | 36 | | %date:[format]% | Date of generation in the specified format | 37 | 38 | - See the following table for the identifiers that can be used in `[format]` of `%date:[format]%`. 39 | 40 | | Identifier | Description | 41 | | ---------- | ----------- | 42 | | yyyy | year | 43 | | MM | month | 44 | | dd | day | 45 | | hh | hour | 46 | | mm | minute | 47 | | ss | second | 48 | 49 | #### sampler_selection_method 50 | - Specifies how to select a KSampler node from those executed before this node. 51 | 52 | ##### Farthest 53 | - Selects the KSampler node farthest from this node. 54 | - Example: In [everywhere_prompt_utilities.png](examples/everywhere_prompt_utilities.png), the upper KSampler node (seed=12345) is selected. 55 | 56 | ##### Nearest 57 | - Selects the KSampler node nearest to this node. 58 | - Example: In [everywhere_prompt_utilities.png](examples/everywhere_prompt_utilities.png), the lower KSampler node (seed=67890) is selected. 59 | 60 | ##### By node ID 61 | - Selects the KSampler node whose node ID is `sampler_selection_node_id`. 62 | 63 | ### Create Extra MetaData 64 | - Specifies extra metadata to be added to the saved image. 65 | - Example: [extra_metadata.png](examples/extra_metadata.png). 66 | 67 | ## Metadata to be added 68 | - Positive prompt 69 | - Negative prompt 70 | - Steps 71 | - Sampler 72 | - CFG Scale 73 | - Seed 74 | - Clip skip 75 | - Size 76 | - Model 77 | - Model hash 78 | - VAE 79 | - Referenced from the input of the SaveImageWithMetaData node, not the KSampler node. 80 | - VAE hash 81 | - Referenced from the input of the SaveImageWithMetaData node, not the KSampler node. 82 | - Loras 83 | - Model name 84 | - Model hash 85 | - Strength model 86 | - Strength clip 87 | - Embeddings 88 | - Name 89 | - Hash 90 | - If batch size >= 2: 91 | - Batch index 92 | - Batch size 93 | - Hashes 94 | - Model, Loras, Embeddings 95 | - For [Civitai](https://civitai.com/) 96 | 97 | ## Supported nodes and extensions 98 | - Please check the following files for supported nodes. 99 | - [py/defs/captures.py](py/defs/captures.py) 100 | - [py/defs/samplers.py](py/defs/samplers.py) 101 | - Please check the following directory for supported extensions.
102 | - [py/defs/ext/](py/defs/ext/) 103 | -------------------------------------------------------------------------------- /__init__.py: -------------------------------------------------------------------------------- 1 | from .py.nodes.node import SaveImageWithMetaData, CreateExtraMetaData 2 | 3 | NODE_CLASS_MAPPINGS = { 4 | "SaveImageWithMetaData": SaveImageWithMetaData, 5 | "CreateExtraMetaData": CreateExtraMetaData, 6 | } 7 | 8 | NODE_DISPLAY_NAME_MAPPINGS = { 9 | "SaveImageWithMetaData": "Save Image With Metadata", 10 | "CreateExtraMetaData": "Create Extra MetaData", 11 | } 12 | 13 | __all__ = ["NODE_CLASS_MAPPINGS", "NODE_DISPLAY_NAME_MAPPINGS"] 14 | -------------------------------------------------------------------------------- /examples/efficiency-nodes.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 11, 3 | "last_link_id": 17, 4 | "nodes": [ 5 | { 6 | "id": 1, 7 | "type": "Efficient Loader", 8 | "pos": [ 9 | 136, 10 | 89 11 | ], 12 | "size": { 13 | "0": 400, 14 | "1": 462 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | "inputs": [ 20 | { 21 | "name": "lora_stack", 22 | "type": "LORA_STACK", 23 | "link": null 24 | }, 25 | { 26 | "name": "cnet_stack", 27 | "type": "CONTROL_NET_STACK", 28 | "link": null 29 | } 30 | ], 31 | "outputs": [ 32 | { 33 | "name": "MODEL", 34 | "type": "MODEL", 35 | "links": [ 36 | 1, 37 | 9 38 | ], 39 | "shape": 3 40 | }, 41 | { 42 | "name": "CONDITIONING+", 43 | "type": "CONDITIONING", 44 | "links": [ 45 | 2, 46 | 10 47 | ], 48 | "shape": 3, 49 | "slot_index": 1 50 | }, 51 | { 52 | "name": "CONDITIONING-", 53 | "type": "CONDITIONING", 54 | "links": [ 55 | 3, 56 | 11 57 | ], 58 | "shape": 3 59 | }, 60 | { 61 | "name": "LATENT", 62 | "type": "LATENT", 63 | "links": [ 64 | 4, 65 | 12 66 | ], 67 | "shape": 3, 68 | "slot_index": 3 69 | }, 70 | { 71 | "name": "VAE", 72 | "type": "VAE", 73 | "links": [ 74 | 5, 75 | 13 76 | ], 77 | "shape": 3, 78 | "slot_index": 4 79 | }, 80 | { 81 | "name": "CLIP", 82 | "type": "CLIP", 83 | "links": null, 84 | "shape": 3 85 | }, 86 | { 87 | "name": "DEPENDENCIES", 88 | "type": "DEPENDENCIES", 89 | "links": null, 90 | "shape": 3 91 | } 92 | ], 93 | "properties": { 94 | "Node name for S&R": "Efficient Loader" 95 | }, 96 | "widgets_values": [ 97 | "animagine-xl-3.0.safetensors", 98 | "Baked VAE", 99 | -1, 100 | "None", 101 | 1, 102 | 1, 103 | "1girl, hatsune miku, upper body, smile, looking at viewer, masterpiece, best quality", 104 | "lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name", 105 | "none", 106 | "comfy", 107 | 1024, 108 | 1024, 109 | 1 110 | ], 111 | "bgcolor": "#335533", 112 | "shape": 1 113 | }, 114 | { 115 | "id": 9, 116 | "type": "SaveImageWithMetaData", 117 | "pos": [ 118 | 1464, 119 | 366 120 | ], 121 | "size": { 122 | "0": 315, 123 | "1": 318 124 | }, 125 | "flags": {}, 126 | "order": 7, 127 | "mode": 0, 128 | "inputs": [ 129 | { 130 | "name": "images", 131 | "type": "IMAGE", 132 | "link": 15, 133 | "slot_index": 0 134 | } 135 | ], 136 | "properties": { 137 | "Node name for S&R": "SaveImageWithMetaData" 138 | }, 139 | "widgets_values": [ 140 | "eff_adv", 141 | "Farthest", 142 | 0 143 | ] 144 | }, 145 | { 146 | "id": 4, 147 | "type": "Eff. 
Loader SDXL", 148 | "pos": [ 149 | 141, 150 | 628 151 | ], 152 | "size": { 153 | "0": 400, 154 | "1": 402.0000305175781 155 | }, 156 | "flags": {}, 157 | "order": 1, 158 | "mode": 0, 159 | "inputs": [ 160 | { 161 | "name": "lora_stack", 162 | "type": "LORA_STACK", 163 | "link": null 164 | }, 165 | { 166 | "name": "cnet_stack", 167 | "type": "CONTROL_NET_STACK", 168 | "link": null 169 | } 170 | ], 171 | "outputs": [ 172 | { 173 | "name": "SDXL_TUPLE", 174 | "type": "SDXL_TUPLE", 175 | "links": [ 176 | 6 177 | ], 178 | "shape": 3 179 | }, 180 | { 181 | "name": "LATENT", 182 | "type": "LATENT", 183 | "links": [ 184 | 7 185 | ], 186 | "shape": 3, 187 | "slot_index": 1 188 | }, 189 | { 190 | "name": "VAE", 191 | "type": "VAE", 192 | "links": [ 193 | 8 194 | ], 195 | "shape": 3 196 | }, 197 | { 198 | "name": "DEPENDENCIES", 199 | "type": "DEPENDENCIES", 200 | "links": null, 201 | "shape": 3 202 | } 203 | ], 204 | "properties": { 205 | "Node name for S&R": "Eff. Loader SDXL" 206 | }, 207 | "widgets_values": [ 208 | "animagine-xl-3.0.safetensors", 209 | -2, 210 | "None", 211 | -2, 212 | 6, 213 | 2, 214 | "Baked VAE", 215 | "1girl, hatsune miku, upper body, smile, looking at viewer, masterpiece, best quality", 216 | "lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name", 217 | "none", 218 | "comfy", 219 | 832, 220 | 1216, 221 | 1 222 | ], 223 | "bgcolor": "#335555", 224 | "shape": 1 225 | }, 226 | { 227 | "id": 8, 228 | "type": "SaveImageWithMetaData", 229 | "pos": [ 230 | 1468, 231 | -21 232 | ], 233 | "size": { 234 | "0": 315, 235 | "1": 318 236 | }, 237 | "flags": {}, 238 | "order": 5, 239 | "mode": 0, 240 | "inputs": [ 241 | { 242 | "name": "images", 243 | "type": "IMAGE", 244 | "link": 14, 245 | "slot_index": 0 246 | } 247 | ], 248 | "properties": { 249 | "Node name for S&R": "SaveImageWithMetaData" 250 | }, 251 | "widgets_values": [ 252 | "eff", 253 | "Farthest", 254 | 0 255 | ] 256 | }, 257 | { 258 | "id": 10, 259 | "type": "SaveImageWithMetaData", 260 | "pos": [ 261 | 1472, 262 | 772 263 | ], 264 | "size": { 265 | "0": 315, 266 | "1": 318 267 | }, 268 | "flags": {}, 269 | "order": 8, 270 | "mode": 0, 271 | "inputs": [ 272 | { 273 | "name": "images", 274 | "type": "IMAGE", 275 | "link": 16, 276 | "slot_index": 0 277 | } 278 | ], 279 | "properties": { 280 | "Node name for S&R": "SaveImageWithMetaData" 281 | }, 282 | "widgets_values": [ 283 | "eff_xl", 284 | "Farthest", 285 | 0 286 | ] 287 | }, 288 | { 289 | "id": 6, 290 | "type": "KSampler Adv. 
(Efficient)", 291 | "pos": [ 292 | 1050, 293 | 266 294 | ], 295 | "size": { 296 | "0": 323.57684326171875, 297 | "1": 422 298 | }, 299 | "flags": {}, 300 | "order": 3, 301 | "mode": 0, 302 | "inputs": [ 303 | { 304 | "name": "model", 305 | "type": "MODEL", 306 | "link": 9, 307 | "slot_index": 0 308 | }, 309 | { 310 | "name": "positive", 311 | "type": "CONDITIONING", 312 | "link": 10 313 | }, 314 | { 315 | "name": "negative", 316 | "type": "CONDITIONING", 317 | "link": 11, 318 | "slot_index": 2 319 | }, 320 | { 321 | "name": "latent_image", 322 | "type": "LATENT", 323 | "link": 12 324 | }, 325 | { 326 | "name": "optional_vae", 327 | "type": "VAE", 328 | "link": 13, 329 | "slot_index": 4 330 | }, 331 | { 332 | "name": "script", 333 | "type": "SCRIPT", 334 | "link": null 335 | } 336 | ], 337 | "outputs": [ 338 | { 339 | "name": "MODEL", 340 | "type": "MODEL", 341 | "links": null, 342 | "shape": 3 343 | }, 344 | { 345 | "name": "CONDITIONING+", 346 | "type": "CONDITIONING", 347 | "links": null, 348 | "shape": 3 349 | }, 350 | { 351 | "name": "CONDITIONING-", 352 | "type": "CONDITIONING", 353 | "links": null, 354 | "shape": 3 355 | }, 356 | { 357 | "name": "LATENT", 358 | "type": "LATENT", 359 | "links": null, 360 | "shape": 3 361 | }, 362 | { 363 | "name": "VAE", 364 | "type": "VAE", 365 | "links": null, 366 | "shape": 3 367 | }, 368 | { 369 | "name": "IMAGE", 370 | "type": "IMAGE", 371 | "links": [ 372 | 15 373 | ], 374 | "shape": 3 375 | } 376 | ], 377 | "properties": { 378 | "Node name for S&R": "KSampler Adv. (Efficient)" 379 | }, 380 | "widgets_values": [ 381 | "enable", 382 | 456, 383 | null, 384 | 25, 385 | 7, 386 | "euler_ancestral", 387 | "normal", 388 | 0, 389 | 10000, 390 | "disable", 391 | "none", 392 | "true" 393 | ], 394 | "bgcolor": "#665533", 395 | "shape": 1 396 | }, 397 | { 398 | "id": 3, 399 | "type": "KSampler (Efficient)", 400 | "pos": [ 401 | 603, 402 | 89 403 | ], 404 | "size": { 405 | "0": 339.13238525390625, 406 | "1": 350 407 | }, 408 | "flags": {}, 409 | "order": 2, 410 | "mode": 0, 411 | "inputs": [ 412 | { 413 | "name": "model", 414 | "type": "MODEL", 415 | "link": 1, 416 | "slot_index": 0 417 | }, 418 | { 419 | "name": "positive", 420 | "type": "CONDITIONING", 421 | "link": 2 422 | }, 423 | { 424 | "name": "negative", 425 | "type": "CONDITIONING", 426 | "link": 3, 427 | "slot_index": 2 428 | }, 429 | { 430 | "name": "latent_image", 431 | "type": "LATENT", 432 | "link": 4 433 | }, 434 | { 435 | "name": "optional_vae", 436 | "type": "VAE", 437 | "link": 5 438 | }, 439 | { 440 | "name": "script", 441 | "type": "SCRIPT", 442 | "link": null 443 | } 444 | ], 445 | "outputs": [ 446 | { 447 | "name": "MODEL", 448 | "type": "MODEL", 449 | "links": null, 450 | "shape": 3 451 | }, 452 | { 453 | "name": "CONDITIONING+", 454 | "type": "CONDITIONING", 455 | "links": null, 456 | "shape": 3 457 | }, 458 | { 459 | "name": "CONDITIONING-", 460 | "type": "CONDITIONING", 461 | "links": null, 462 | "shape": 3 463 | }, 464 | { 465 | "name": "LATENT", 466 | "type": "LATENT", 467 | "links": null, 468 | "shape": 3 469 | }, 470 | { 471 | "name": "VAE", 472 | "type": "VAE", 473 | "links": null, 474 | "shape": 3 475 | }, 476 | { 477 | "name": "IMAGE", 478 | "type": "IMAGE", 479 | "links": [ 480 | 14, 481 | 17 482 | ], 483 | "shape": 3 484 | } 485 | ], 486 | "properties": { 487 | "Node name for S&R": "KSampler (Efficient)" 488 | }, 489 | "widgets_values": [ 490 | 123, 491 | null, 492 | 20, 493 | 7, 494 | "euler_ancestral", 495 | "normal", 496 | 1, 497 | "none", 498 | "true" 499 | ], 500 | 
"bgcolor": "#553333", 501 | "shape": 1 502 | }, 503 | { 504 | "id": 5, 505 | "type": "KSampler SDXL (Eff.)", 506 | "pos": [ 507 | 603, 508 | 720 509 | ], 510 | "size": { 511 | "0": 312.2434997558594, 512 | "1": 334 513 | }, 514 | "flags": {}, 515 | "order": 4, 516 | "mode": 0, 517 | "inputs": [ 518 | { 519 | "name": "sdxl_tuple", 520 | "type": "SDXL_TUPLE", 521 | "link": 6, 522 | "slot_index": 0 523 | }, 524 | { 525 | "name": "latent_image", 526 | "type": "LATENT", 527 | "link": 7 528 | }, 529 | { 530 | "name": "optional_vae", 531 | "type": "VAE", 532 | "link": 8, 533 | "slot_index": 2 534 | }, 535 | { 536 | "name": "script", 537 | "type": "SCRIPT", 538 | "link": null 539 | } 540 | ], 541 | "outputs": [ 542 | { 543 | "name": "SDXL_TUPLE", 544 | "type": "SDXL_TUPLE", 545 | "links": null, 546 | "shape": 3 547 | }, 548 | { 549 | "name": "LATENT", 550 | "type": "LATENT", 551 | "links": null, 552 | "shape": 3 553 | }, 554 | { 555 | "name": "VAE", 556 | "type": "VAE", 557 | "links": null, 558 | "shape": 3 559 | }, 560 | { 561 | "name": "IMAGE", 562 | "type": "IMAGE", 563 | "links": [ 564 | 16 565 | ], 566 | "shape": 3 567 | } 568 | ], 569 | "properties": { 570 | "Node name for S&R": "KSampler SDXL (Eff.)" 571 | }, 572 | "widgets_values": [ 573 | 789, 574 | null, 575 | 20, 576 | 7, 577 | "euler_ancestral", 578 | "normal", 579 | 0, 580 | -1, 581 | "none", 582 | "true" 583 | ], 584 | "bgcolor": "#553355", 585 | "shape": 1 586 | }, 587 | { 588 | "id": 11, 589 | "type": "SaveImage", 590 | "pos": [ 591 | 1077, 592 | -79 593 | ], 594 | "size": { 595 | "0": 263.2173767089844, 596 | "1": 273.5193786621094 597 | }, 598 | "flags": {}, 599 | "order": 6, 600 | "mode": 0, 601 | "inputs": [ 602 | { 603 | "name": "images", 604 | "type": "IMAGE", 605 | "link": 17, 606 | "slot_index": 0 607 | } 608 | ], 609 | "properties": {}, 610 | "widgets_values": [ 611 | "eff-without-meta" 612 | ] 613 | } 614 | ], 615 | "links": [ 616 | [ 617 | 1, 618 | 1, 619 | 0, 620 | 3, 621 | 0, 622 | "MODEL" 623 | ], 624 | [ 625 | 2, 626 | 1, 627 | 1, 628 | 3, 629 | 1, 630 | "CONDITIONING" 631 | ], 632 | [ 633 | 3, 634 | 1, 635 | 2, 636 | 3, 637 | 2, 638 | "CONDITIONING" 639 | ], 640 | [ 641 | 4, 642 | 1, 643 | 3, 644 | 3, 645 | 3, 646 | "LATENT" 647 | ], 648 | [ 649 | 5, 650 | 1, 651 | 4, 652 | 3, 653 | 4, 654 | "VAE" 655 | ], 656 | [ 657 | 6, 658 | 4, 659 | 0, 660 | 5, 661 | 0, 662 | "SDXL_TUPLE" 663 | ], 664 | [ 665 | 7, 666 | 4, 667 | 1, 668 | 5, 669 | 1, 670 | "LATENT" 671 | ], 672 | [ 673 | 8, 674 | 4, 675 | 2, 676 | 5, 677 | 2, 678 | "VAE" 679 | ], 680 | [ 681 | 9, 682 | 1, 683 | 0, 684 | 6, 685 | 0, 686 | "MODEL" 687 | ], 688 | [ 689 | 10, 690 | 1, 691 | 1, 692 | 6, 693 | 1, 694 | "CONDITIONING" 695 | ], 696 | [ 697 | 11, 698 | 1, 699 | 2, 700 | 6, 701 | 2, 702 | "CONDITIONING" 703 | ], 704 | [ 705 | 12, 706 | 1, 707 | 3, 708 | 6, 709 | 3, 710 | "LATENT" 711 | ], 712 | [ 713 | 13, 714 | 1, 715 | 4, 716 | 6, 717 | 4, 718 | "VAE" 719 | ], 720 | [ 721 | 14, 722 | 3, 723 | 5, 724 | 8, 725 | 0, 726 | "IMAGE" 727 | ], 728 | [ 729 | 15, 730 | 6, 731 | 5, 732 | 9, 733 | 0, 734 | "IMAGE" 735 | ], 736 | [ 737 | 16, 738 | 5, 739 | 3, 740 | 10, 741 | 0, 742 | "IMAGE" 743 | ], 744 | [ 745 | 17, 746 | 3, 747 | 5, 748 | 11, 749 | 0, 750 | "IMAGE" 751 | ] 752 | ], 753 | "groups": [], 754 | "config": {}, 755 | "extra": {}, 756 | "version": 0.4 757 | } -------------------------------------------------------------------------------- /examples/efficiency-nodes.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/efficiency-nodes.png -------------------------------------------------------------------------------- /examples/everywhere_prompt_utilities.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 28, 3 | "last_link_id": 40, 4 | "nodes": [ 5 | { 6 | "id": 10, 7 | "type": "KSampler", 8 | "pos": [ 9 | 1090, 10 | 640 11 | ], 12 | "size": { 13 | "0": 315, 14 | "1": 262 15 | }, 16 | "flags": {}, 17 | "order": 14, 18 | "mode": 0, 19 | "inputs": [ 20 | { 21 | "name": "model", 22 | "type": "MODEL", 23 | "link": null, 24 | "slot_index": 0 25 | }, 26 | { 27 | "name": "positive", 28 | "type": "CONDITIONING", 29 | "link": null 30 | }, 31 | { 32 | "name": "negative", 33 | "type": "CONDITIONING", 34 | "link": null, 35 | "slot_index": 2 36 | }, 37 | { 38 | "name": "latent_image", 39 | "type": "LATENT", 40 | "link": 26 41 | } 42 | ], 43 | "outputs": [ 44 | { 45 | "name": "LATENT", 46 | "type": "LATENT", 47 | "links": [ 48 | 18 49 | ], 50 | "shape": 3 51 | } 52 | ], 53 | "properties": { 54 | "Node name for S&R": "KSampler" 55 | }, 56 | "widgets_values": [ 57 | 67890, 58 | "fixed", 59 | 15, 60 | 7, 61 | "euler_ancestral", 62 | "normal", 63 | 0.5 64 | ] 65 | }, 66 | { 67 | "id": 8, 68 | "type": "VAEDecode", 69 | "pos": [ 70 | 1470, 71 | 650 72 | ], 73 | "size": { 74 | "0": 210, 75 | "1": 46 76 | }, 77 | "flags": {}, 78 | "order": 17, 79 | "mode": 0, 80 | "inputs": [ 81 | { 82 | "name": "samples", 83 | "type": "LATENT", 84 | "link": 18, 85 | "slot_index": 0 86 | }, 87 | { 88 | "name": "vae", 89 | "type": "VAE", 90 | "link": null, 91 | "slot_index": 1 92 | } 93 | ], 94 | "outputs": [ 95 | { 96 | "name": "IMAGE", 97 | "type": "IMAGE", 98 | "links": [ 99 | 11, 100 | 19, 101 | 20 102 | ], 103 | "shape": 3, 104 | "slot_index": 0 105 | } 106 | ], 107 | "properties": { 108 | "Node name for S&R": "VAEDecode" 109 | } 110 | }, 111 | { 112 | "id": 16, 113 | "type": "VAEDecode", 114 | "pos": [ 115 | 1470, 116 | 170 117 | ], 118 | "size": { 119 | "0": 210, 120 | "1": 46 121 | }, 122 | "flags": {}, 123 | "order": 12, 124 | "mode": 0, 125 | "inputs": [ 126 | { 127 | "name": "samples", 128 | "type": "LATENT", 129 | "link": 21, 130 | "slot_index": 0 131 | }, 132 | { 133 | "name": "vae", 134 | "type": "VAE", 135 | "link": null, 136 | "slot_index": 1 137 | } 138 | ], 139 | "outputs": [ 140 | { 141 | "name": "IMAGE", 142 | "type": "IMAGE", 143 | "links": [ 144 | 23 145 | ], 146 | "shape": 3, 147 | "slot_index": 0 148 | } 149 | ], 150 | "properties": { 151 | "Node name for S&R": "VAEDecode" 152 | } 153 | }, 154 | { 155 | "id": 17, 156 | "type": "PreviewImage", 157 | "pos": [ 158 | 1750, 159 | 170 160 | ], 161 | "size": { 162 | "0": 210, 163 | "1": 246 164 | }, 165 | "flags": {}, 166 | "order": 15, 167 | "mode": 0, 168 | "inputs": [ 169 | { 170 | "name": "images", 171 | "type": "IMAGE", 172 | "link": 23, 173 | "slot_index": 0 174 | } 175 | ], 176 | "properties": { 177 | "Node name for S&R": "PreviewImage" 178 | } 179 | }, 180 | { 181 | "id": 11, 182 | "type": "LatentUpscaleBy", 183 | "pos": [ 184 | 1110, 185 | 480 186 | ], 187 | "size": { 188 | "0": 263.09246826171875, 189 | "1": 82 190 | }, 191 | "flags": {}, 192 | "order": 11, 193 | "mode": 0, 194 | "inputs": [ 195 | { 196 | "name": "samples", 197 | "type": "LATENT", 198 | "link": 13 199 | } 200 | ], 201 | "outputs": [ 202 | { 203 | "name": "LATENT", 204 | "type": "LATENT", 205 | "links": [ 206 | 26 
207 | ], 208 | "shape": 3, 209 | "slot_index": 0 210 | } 211 | ], 212 | "properties": { 213 | "Node name for S&R": "LatentUpscaleBy" 214 | }, 215 | "widgets_values": [ 216 | "nearest-exact", 217 | 1.5 218 | ] 219 | }, 220 | { 221 | "id": 22, 222 | "type": "PromptUtilitiesConstString", 223 | "pos": [ 224 | 21, 225 | 204 226 | ], 227 | "size": { 228 | "0": 210, 229 | "1": 58 230 | }, 231 | "flags": {}, 232 | "order": 0, 233 | "mode": 0, 234 | "outputs": [ 235 | { 236 | "name": "STRING", 237 | "type": "STRING", 238 | "links": [ 239 | 29 240 | ], 241 | "shape": 3 242 | } 243 | ], 244 | "properties": { 245 | "Node name for S&R": "PromptUtilitiesConstString" 246 | }, 247 | "widgets_values": [ 248 | "1girl" 249 | ] 250 | }, 251 | { 252 | "id": 23, 253 | "type": "PromptUtilitiesConstStringMultiLine", 254 | "pos": [ 255 | 23, 256 | 313 257 | ], 258 | "size": { 259 | "0": 210, 260 | "1": 93.83313751220703 261 | }, 262 | "flags": {}, 263 | "order": 1, 264 | "mode": 0, 265 | "outputs": [ 266 | { 267 | "name": "STRING", 268 | "type": "STRING", 269 | "links": [ 270 | 30 271 | ], 272 | "shape": 3, 273 | "slot_index": 0 274 | } 275 | ], 276 | "properties": { 277 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 278 | }, 279 | "widgets_values": [ 280 | "hatsune miku, vocaloid" 281 | ] 282 | }, 283 | { 284 | "id": 25, 285 | "type": "PromptUtilitiesLoadPreset", 286 | "pos": [ 287 | 31, 288 | 614 289 | ], 290 | "size": { 291 | "0": 306.8652038574219, 292 | "1": 58 293 | }, 294 | "flags": {}, 295 | "order": 2, 296 | "mode": 0, 297 | "outputs": [ 298 | { 299 | "name": "STRING", 300 | "type": "STRING", 301 | "links": [ 302 | 32 303 | ], 304 | "shape": 3, 305 | "slot_index": 0 306 | } 307 | ], 308 | "properties": { 309 | "Node name for S&R": "PromptUtilitiesLoadPreset" 310 | }, 311 | "widgets_values": [ 312 | "sample\\animagineXL : pos postfix" 313 | ] 314 | }, 315 | { 316 | "id": 26, 317 | "type": "PromptUtilitiesJoinStringList", 318 | "pos": [ 319 | 289, 320 | 385 321 | ], 322 | "size": { 323 | "0": 210, 324 | "1": 138 325 | }, 326 | "flags": {}, 327 | "order": 10, 328 | "mode": 0, 329 | "inputs": [ 330 | { 331 | "name": "arg1", 332 | "type": "STRING", 333 | "link": 29, 334 | "widget": { 335 | "name": "arg1" 336 | }, 337 | "slot_index": 0 338 | }, 339 | { 340 | "name": "arg2", 341 | "type": "STRING", 342 | "link": 30 343 | }, 344 | { 345 | "name": "arg3", 346 | "type": "STRING", 347 | "link": 31, 348 | "slot_index": 2 349 | }, 350 | { 351 | "name": "arg4", 352 | "type": "STRING", 353 | "link": 32 354 | }, 355 | { 356 | "name": "arg5", 357 | "type": "STRING", 358 | "link": null 359 | } 360 | ], 361 | "outputs": [ 362 | { 363 | "name": "STRING", 364 | "type": "STRING", 365 | "links": [ 366 | 33 367 | ], 368 | "shape": 3, 369 | "slot_index": 0 370 | } 371 | ], 372 | "properties": { 373 | "Node name for S&R": "PromptUtilitiesJoinStringList" 374 | }, 375 | "widgets_values": [ 376 | ", ", 377 | "" 378 | ] 379 | }, 380 | { 381 | "id": 4, 382 | "type": "KSampler", 383 | "pos": [ 384 | 1080, 385 | 170 386 | ], 387 | "size": { 388 | "0": 315, 389 | "1": 262 390 | }, 391 | "flags": {}, 392 | "order": 9, 393 | "mode": 0, 394 | "inputs": [ 395 | { 396 | "name": "model", 397 | "type": "MODEL", 398 | "link": null 399 | }, 400 | { 401 | "name": "positive", 402 | "type": "CONDITIONING", 403 | "link": null 404 | }, 405 | { 406 | "name": "negative", 407 | "type": "CONDITIONING", 408 | "link": null, 409 | "slot_index": 2 410 | }, 411 | { 412 | "name": "latent_image", 413 | "type": "LATENT", 414 | "link": 28 415 | } 416 
| ], 417 | "outputs": [ 418 | { 419 | "name": "LATENT", 420 | "type": "LATENT", 421 | "links": [ 422 | 13, 423 | 21 424 | ], 425 | "shape": 3, 426 | "slot_index": 0 427 | } 428 | ], 429 | "properties": { 430 | "Node name for S&R": "KSampler" 431 | }, 432 | "widgets_values": [ 433 | 12345, 434 | "fixed", 435 | 20, 436 | 8, 437 | "euler_ancestral", 438 | "normal", 439 | 1 440 | ] 441 | }, 442 | { 443 | "id": 1, 444 | "type": "CheckpointLoaderSimple", 445 | "pos": [ 446 | 19, 447 | 37 448 | ], 449 | "size": { 450 | "0": 315, 451 | "1": 98 452 | }, 453 | "flags": {}, 454 | "order": 3, 455 | "mode": 0, 456 | "outputs": [ 457 | { 458 | "name": "MODEL", 459 | "type": "MODEL", 460 | "links": [ 461 | 34 462 | ], 463 | "shape": 3, 464 | "slot_index": 0 465 | }, 466 | { 467 | "name": "CLIP", 468 | "type": "CLIP", 469 | "links": [ 470 | 37 471 | ], 472 | "shape": 3 473 | }, 474 | { 475 | "name": "VAE", 476 | "type": "VAE", 477 | "links": [ 478 | 38 479 | ], 480 | "shape": 3, 481 | "slot_index": 2 482 | } 483 | ], 484 | "properties": { 485 | "Node name for S&R": "CheckpointLoaderSimple" 486 | }, 487 | "widgets_values": [ 488 | "animagine-xl-3.0.safetensors" 489 | ] 490 | }, 491 | { 492 | "id": 20, 493 | "type": "PromptUtilitiesLoadPreset", 494 | "pos": [ 495 | 35, 496 | 733 497 | ], 498 | "size": { 499 | "0": 270.42572021484375, 500 | "1": 58 501 | }, 502 | "flags": {}, 503 | "order": 4, 504 | "mode": 0, 505 | "outputs": [ 506 | { 507 | "name": "STRING", 508 | "type": "STRING", 509 | "links": [ 510 | 27 511 | ], 512 | "shape": 3, 513 | "slot_index": 0 514 | } 515 | ], 516 | "properties": { 517 | "Node name for S&R": "PromptUtilitiesLoadPreset" 518 | }, 519 | "widgets_values": [ 520 | "sample\\animagineXL : neg" 521 | ] 522 | }, 523 | { 524 | "id": 3, 525 | "type": "CLIPTextEncode", 526 | "pos": [ 527 | 355, 528 | 739 529 | ], 530 | "size": { 531 | "0": 210, 532 | "1": 54 533 | }, 534 | "flags": {}, 535 | "order": 8, 536 | "mode": 0, 537 | "inputs": [ 538 | { 539 | "name": "clip", 540 | "type": "CLIP", 541 | "link": null, 542 | "slot_index": 0 543 | }, 544 | { 545 | "name": "text", 546 | "type": "STRING", 547 | "link": 27, 548 | "widget": { 549 | "name": "text" 550 | } 551 | } 552 | ], 553 | "outputs": [ 554 | { 555 | "name": "CONDITIONING", 556 | "type": "CONDITIONING", 557 | "links": [ 558 | 40 559 | ], 560 | "shape": 3 561 | } 562 | ], 563 | "properties": { 564 | "Node name for S&R": "CLIPTextEncode" 565 | }, 566 | "widgets_values": [ 567 | "lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name" 568 | ] 569 | }, 570 | { 571 | "id": 2, 572 | "type": "CLIPTextEncode", 573 | "pos": [ 574 | 552, 575 | 365 576 | ], 577 | "size": { 578 | "0": 210, 579 | "1": 54 580 | }, 581 | "flags": {}, 582 | "order": 13, 583 | "mode": 0, 584 | "inputs": [ 585 | { 586 | "name": "clip", 587 | "type": "CLIP", 588 | "link": null, 589 | "slot_index": 0 590 | }, 591 | { 592 | "name": "text", 593 | "type": "STRING", 594 | "link": 33, 595 | "widget": { 596 | "name": "text" 597 | } 598 | } 599 | ], 600 | "outputs": [ 601 | { 602 | "name": "CONDITIONING", 603 | "type": "CONDITIONING", 604 | "links": [ 605 | 39 606 | ], 607 | "shape": 3, 608 | "slot_index": 0 609 | } 610 | ], 611 | "properties": { 612 | "Node name for S&R": "CLIPTextEncode" 613 | }, 614 | "widgets_values": [ 615 | "1girl, hatsune miku, vocaloid, upper body, smile, waving, looking at viewer, masterpiece, best 
quality" 616 | ] 617 | }, 618 | { 619 | "id": 28, 620 | "type": "Prompts Everywhere", 621 | "pos": [ 622 | 809, 623 | 485 624 | ], 625 | "size": { 626 | "0": 210, 627 | "1": 46 628 | }, 629 | "flags": {}, 630 | "order": 16, 631 | "mode": 0, 632 | "inputs": [ 633 | { 634 | "name": "CONDITIONING", 635 | "type": "*", 636 | "link": 39, 637 | "slot_index": 0, 638 | "color_on": "#FFA931" 639 | }, 640 | { 641 | "name": "CONDITIONING", 642 | "type": "*", 643 | "link": 40, 644 | "slot_index": 1, 645 | "color_on": "#FFA931" 646 | } 647 | ], 648 | "properties": { 649 | "group_restricted": false, 650 | "color_restricted": false, 651 | "Node name for S&R": "Prompts Everywhere" 652 | } 653 | }, 654 | { 655 | "id": 15, 656 | "type": "SaveImageWithMetaData", 657 | "pos": [ 658 | 2384, 659 | 577 660 | ], 661 | "size": { 662 | "0": 315, 663 | "1": 318 664 | }, 665 | "flags": {}, 666 | "order": 20, 667 | "mode": 0, 668 | "inputs": [ 669 | { 670 | "name": "images", 671 | "type": "IMAGE", 672 | "link": 20, 673 | "slot_index": 0 674 | } 675 | ], 676 | "properties": { 677 | "Node name for S&R": "SaveImageWithMetaData" 678 | }, 679 | "widgets_values": [ 680 | "ComfyUI-with-meta-near", 681 | "Nearest", 682 | 0 683 | ] 684 | }, 685 | { 686 | "id": 21, 687 | "type": "EmptyLatentImageFromPresetsSDXL", 688 | "pos": [ 689 | 675, 690 | 161 691 | ], 692 | "size": { 693 | "0": 319.20001220703125, 694 | "1": 122 695 | }, 696 | "flags": {}, 697 | "order": 5, 698 | "mode": 0, 699 | "outputs": [ 700 | { 701 | "name": "latent", 702 | "type": "LATENT", 703 | "links": [ 704 | 28 705 | ], 706 | "shape": 3, 707 | "slot_index": 0 708 | }, 709 | { 710 | "name": "w", 711 | "type": "INT", 712 | "links": null, 713 | "shape": 3 714 | }, 715 | { 716 | "name": "h", 717 | "type": "INT", 718 | "links": null, 719 | "shape": 3 720 | } 721 | ], 722 | "properties": { 723 | "Node name for S&R": "EmptyLatentImageFromPresetsSDXL" 724 | }, 725 | "widgets_values": [ 726 | " 832 x 1216", 727 | 1 728 | ] 729 | }, 730 | { 731 | "id": 24, 732 | "type": "PromptUtilitiesConstStringMultiLine", 733 | "pos": [ 734 | 26, 735 | 458 736 | ], 737 | "size": { 738 | "0": 210, 739 | "1": 93.83313751220703 740 | }, 741 | "flags": {}, 742 | "order": 6, 743 | "mode": 0, 744 | "outputs": [ 745 | { 746 | "name": "STRING", 747 | "type": "STRING", 748 | "links": [ 749 | 31 750 | ], 751 | "shape": 3 752 | } 753 | ], 754 | "properties": { 755 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 756 | }, 757 | "widgets_values": [ 758 | "upper body, v sign, smile, looking at viewer" 759 | ] 760 | }, 761 | { 762 | "id": 27, 763 | "type": "Anything Everywhere3", 764 | "pos": [ 765 | 400, 766 | 37 767 | ], 768 | "size": { 769 | "0": 210, 770 | "1": 66 771 | }, 772 | "flags": {}, 773 | "order": 7, 774 | "mode": 0, 775 | "inputs": [ 776 | { 777 | "name": "MODEL", 778 | "type": "*", 779 | "link": 34, 780 | "slot_index": 0, 781 | "color_on": "#B39DDB" 782 | }, 783 | { 784 | "name": "CLIP", 785 | "type": "*", 786 | "link": 37, 787 | "color_on": "#FFD500", 788 | "slot_index": 1 789 | }, 790 | { 791 | "name": "VAE", 792 | "type": "*", 793 | "link": 38, 794 | "slot_index": 2, 795 | "color_on": "#FF6E6E" 796 | } 797 | ], 798 | "properties": { 799 | "group_restricted": false, 800 | "color_restricted": false, 801 | "Node name for S&R": "Anything Everywhere3" 802 | } 803 | }, 804 | { 805 | "id": 9, 806 | "type": "SaveImageWithMetaData", 807 | "pos": [ 808 | 2029, 809 | 572 810 | ], 811 | "size": [ 812 | 307.9046630859375, 813 | 318 814 | ], 815 | "flags": {}, 816 | "order": 18, 817 
| "mode": 0, 818 | "inputs": [ 819 | { 820 | "name": "images", 821 | "type": "IMAGE", 822 | "link": 11, 823 | "slot_index": 0 824 | } 825 | ], 826 | "properties": { 827 | "Node name for S&R": "SaveImageWithMetaData" 828 | }, 829 | "widgets_values": [ 830 | "ComfyUI-with-meta-far", 831 | "Farthest", 832 | 0 833 | ] 834 | }, 835 | { 836 | "id": 14, 837 | "type": "SaveImage", 838 | "pos": [ 839 | 1740, 840 | 570 841 | ], 842 | "size": { 843 | "0": 263.2173767089844, 844 | "1": 273.5193786621094 845 | }, 846 | "flags": {}, 847 | "order": 19, 848 | "mode": 0, 849 | "inputs": [ 850 | { 851 | "name": "images", 852 | "type": "IMAGE", 853 | "link": 19 854 | } 855 | ], 856 | "properties": {}, 857 | "widgets_values": [ 858 | "ComfyUI-without-meta" 859 | ] 860 | } 861 | ], 862 | "links": [ 863 | [ 864 | 11, 865 | 8, 866 | 0, 867 | 9, 868 | 0, 869 | "IMAGE" 870 | ], 871 | [ 872 | 13, 873 | 4, 874 | 0, 875 | 11, 876 | 0, 877 | "LATENT" 878 | ], 879 | [ 880 | 18, 881 | 10, 882 | 0, 883 | 8, 884 | 0, 885 | "LATENT" 886 | ], 887 | [ 888 | 19, 889 | 8, 890 | 0, 891 | 14, 892 | 0, 893 | "IMAGE" 894 | ], 895 | [ 896 | 20, 897 | 8, 898 | 0, 899 | 15, 900 | 0, 901 | "IMAGE" 902 | ], 903 | [ 904 | 21, 905 | 4, 906 | 0, 907 | 16, 908 | 0, 909 | "LATENT" 910 | ], 911 | [ 912 | 23, 913 | 16, 914 | 0, 915 | 17, 916 | 0, 917 | "IMAGE" 918 | ], 919 | [ 920 | 26, 921 | 11, 922 | 0, 923 | 10, 924 | 3, 925 | "LATENT" 926 | ], 927 | [ 928 | 27, 929 | 20, 930 | 0, 931 | 3, 932 | 1, 933 | "STRING" 934 | ], 935 | [ 936 | 28, 937 | 21, 938 | 0, 939 | 4, 940 | 3, 941 | "LATENT" 942 | ], 943 | [ 944 | 29, 945 | 22, 946 | 0, 947 | 26, 948 | 0, 949 | "STRING" 950 | ], 951 | [ 952 | 30, 953 | 23, 954 | 0, 955 | 26, 956 | 1, 957 | "STRING" 958 | ], 959 | [ 960 | 31, 961 | 24, 962 | 0, 963 | 26, 964 | 2, 965 | "STRING" 966 | ], 967 | [ 968 | 32, 969 | 25, 970 | 0, 971 | 26, 972 | 3, 973 | "STRING" 974 | ], 975 | [ 976 | 33, 977 | 26, 978 | 0, 979 | 2, 980 | 1, 981 | "STRING" 982 | ], 983 | [ 984 | 34, 985 | 1, 986 | 0, 987 | 27, 988 | 0, 989 | "*" 990 | ], 991 | [ 992 | 37, 993 | 1, 994 | 1, 995 | 27, 996 | 1, 997 | "*" 998 | ], 999 | [ 1000 | 38, 1001 | 1, 1002 | 2, 1003 | 27, 1004 | 2, 1005 | "*" 1006 | ], 1007 | [ 1008 | 39, 1009 | 2, 1010 | 0, 1011 | 28, 1012 | 0, 1013 | "*" 1014 | ], 1015 | [ 1016 | 40, 1017 | 3, 1018 | 0, 1019 | 28, 1020 | 1, 1021 | "*" 1022 | ] 1023 | ], 1024 | "groups": [], 1025 | "config": {}, 1026 | "extra": {}, 1027 | "version": 0.4 1028 | } -------------------------------------------------------------------------------- /examples/everywhere_prompt_utilities.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/everywhere_prompt_utilities.png -------------------------------------------------------------------------------- /examples/extra_metadata.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 21, 3 | "last_link_id": 29, 4 | "nodes": [ 5 | { 6 | "id": 10, 7 | "type": "VAEDecode", 8 | "pos": [ 9 | 812, 10 | 429 11 | ], 12 | "size": { 13 | "0": 210, 14 | "1": 46 15 | }, 16 | "flags": {}, 17 | "order": 9, 18 | "mode": 0, 19 | "inputs": [ 20 | { 21 | "name": "samples", 22 | "type": "LATENT", 23 | "link": 9, 24 | "slot_index": 0 25 | }, 26 | { 27 | "name": "vae", 28 | "type": "VAE", 29 | "link": 18 30 | } 31 | ], 32 | "outputs": [ 33 | { 34 | "name": "IMAGE", 35 | "type": "IMAGE", 36 | 
"links": [ 37 | 19 38 | ], 39 | "shape": 3 40 | } 41 | ], 42 | "properties": { 43 | "Node name for S&R": "VAEDecode" 44 | } 45 | }, 46 | { 47 | "id": 16, 48 | "type": "CreateExtraMetaData", 49 | "pos": [ 50 | 763, 51 | 962 52 | ], 53 | "size": [ 54 | 245.1999969482422, 55 | 226 56 | ], 57 | "flags": {}, 58 | "order": 8, 59 | "mode": 0, 60 | "inputs": [ 61 | { 62 | "name": "extra_metadata", 63 | "type": "EXTRA_METADATA", 64 | "link": 20 65 | }, 66 | { 67 | "name": "value1", 68 | "type": "STRING", 69 | "link": 27, 70 | "widget": { 71 | "name": "value1" 72 | } 73 | }, 74 | { 75 | "name": "value2", 76 | "type": "STRING", 77 | "link": 26, 78 | "widget": { 79 | "name": "value2" 80 | } 81 | } 82 | ], 83 | "outputs": [ 84 | { 85 | "name": "EXTRA_METADATA", 86 | "type": "EXTRA_METADATA", 87 | "links": [ 88 | 25 89 | ], 90 | "shape": 3, 91 | "slot_index": 0 92 | } 93 | ], 94 | "properties": { 95 | "Node name for S&R": "CreateExtraMetaData" 96 | }, 97 | "widgets_values": [ 98 | "custom_w", 99 | "", 100 | "custom_h", 101 | "", 102 | "", 103 | "", 104 | "", 105 | "" 106 | ] 107 | }, 108 | { 109 | "id": 17, 110 | "type": "CreateExtraMetaData", 111 | "pos": [ 112 | 440, 113 | 960 114 | ], 115 | "size": { 116 | "0": 245.1999969482422, 117 | "1": 226 118 | }, 119 | "flags": {}, 120 | "order": 0, 121 | "mode": 0, 122 | "inputs": [ 123 | { 124 | "name": "extra_metadata", 125 | "type": "EXTRA_METADATA", 126 | "link": null 127 | } 128 | ], 129 | "outputs": [ 130 | { 131 | "name": "EXTRA_METADATA", 132 | "type": "EXTRA_METADATA", 133 | "links": [ 134 | 20 135 | ], 136 | "shape": 3, 137 | "slot_index": 0 138 | } 139 | ], 140 | "properties": { 141 | "Node name for S&R": "CreateExtraMetaData" 142 | }, 143 | "widgets_values": [ 144 | "custom_key", 145 | "custom_value", 146 | "hello", 147 | "world", 148 | "", 149 | "", 150 | "", 151 | "" 152 | ] 153 | }, 154 | { 155 | "id": 1, 156 | "type": "CheckpointLoaderSimple", 157 | "pos": [ 158 | -340, 159 | 370 160 | ], 161 | "size": { 162 | "0": 315, 163 | "1": 98 164 | }, 165 | "flags": {}, 166 | "order": 1, 167 | "mode": 0, 168 | "outputs": [ 169 | { 170 | "name": "MODEL", 171 | "type": "MODEL", 172 | "links": [ 173 | 17 174 | ], 175 | "shape": 3, 176 | "slot_index": 0 177 | }, 178 | { 179 | "name": "CLIP", 180 | "type": "CLIP", 181 | "links": [ 182 | 13, 183 | 14 184 | ], 185 | "shape": 3, 186 | "slot_index": 1 187 | }, 188 | { 189 | "name": "VAE", 190 | "type": "VAE", 191 | "links": [ 192 | 18 193 | ], 194 | "shape": 3, 195 | "slot_index": 2 196 | } 197 | ], 198 | "properties": { 199 | "Node name for S&R": "CheckpointLoaderSimple" 200 | }, 201 | "widgets_values": [ 202 | "animagine-xl-3.1.safetensors" 203 | ] 204 | }, 205 | { 206 | "id": 13, 207 | "type": "CLIPTextEncode", 208 | "pos": [ 209 | 40, 210 | 600 211 | ], 212 | "size": { 213 | "0": 315, 214 | "1": 98 215 | }, 216 | "flags": {}, 217 | "order": 4, 218 | "mode": 0, 219 | "inputs": [ 220 | { 221 | "name": "clip", 222 | "type": "CLIP", 223 | "link": 14 224 | } 225 | ], 226 | "outputs": [ 227 | { 228 | "name": "CONDITIONING", 229 | "type": "CONDITIONING", 230 | "links": [ 231 | 15 232 | ], 233 | "shape": 3, 234 | "slot_index": 0 235 | } 236 | ], 237 | "properties": { 238 | "Node name for S&R": "CLIPTextEncode" 239 | }, 240 | "widgets_values": [ 241 | "lowres, (bad), text, error, fewer, extra, missing, worst quality, jpeg artifacts, low quality, watermark, unfinished, displeasing, oldest, early, chromatic aberration, signature, extra digits, artistic error, username, scan, [abstract]", 242 | true 243 | ] 244 | }, 
245 | { 246 | "id": 14, 247 | "type": "CLIPTextEncode", 248 | "pos": [ 249 | 40, 250 | 430 251 | ], 252 | "size": { 253 | "0": 315, 254 | "1": 98 255 | }, 256 | "flags": {}, 257 | "order": 3, 258 | "mode": 0, 259 | "inputs": [ 260 | { 261 | "name": "clip", 262 | "type": "CLIP", 263 | "link": 13 264 | } 265 | ], 266 | "outputs": [ 267 | { 268 | "name": "CONDITIONING", 269 | "type": "CONDITIONING", 270 | "links": [ 271 | 16 272 | ], 273 | "shape": 3, 274 | "slot_index": 0 275 | } 276 | ], 277 | "properties": { 278 | "Node name for S&R": "CLIPTextEncode" 279 | }, 280 | "widgets_values": [ 281 | "1girl, hatsune miku, vocaloid, aqua hair, aqua eyes, twintails, aqua necktie, cyan shirt, sleeveless, detached sleeves, black sleeves, upper body, smile, looking at viewer, waving, masterpiece, best quality, very aesthetic, absurdres", 282 | true 283 | ] 284 | }, 285 | { 286 | "id": 9, 287 | "type": "KSampler", 288 | "pos": [ 289 | 445, 290 | 430 291 | ], 292 | "size": { 293 | "0": 315, 294 | "1": 262 295 | }, 296 | "flags": {}, 297 | "order": 7, 298 | "mode": 0, 299 | "inputs": [ 300 | { 301 | "name": "model", 302 | "type": "MODEL", 303 | "link": 17 304 | }, 305 | { 306 | "name": "positive", 307 | "type": "CONDITIONING", 308 | "link": 16 309 | }, 310 | { 311 | "name": "negative", 312 | "type": "CONDITIONING", 313 | "link": 15 314 | }, 315 | { 316 | "name": "latent_image", 317 | "type": "LATENT", 318 | "link": 8, 319 | "slot_index": 3 320 | } 321 | ], 322 | "outputs": [ 323 | { 324 | "name": "LATENT", 325 | "type": "LATENT", 326 | "links": [ 327 | 9 328 | ], 329 | "shape": 3 330 | } 331 | ], 332 | "properties": { 333 | "Node name for S&R": "KSampler" 334 | }, 335 | "widgets_values": [ 336 | 0, 337 | "fixed", 338 | 20, 339 | 8, 340 | "euler_ancestral", 341 | "karras", 342 | 1 343 | ] 344 | }, 345 | { 346 | "id": 20, 347 | "type": "CR Integer To String", 348 | "pos": [ 349 | 457, 350 | 743 351 | ], 352 | "size": [ 353 | 210, 354 | 54 355 | ], 356 | "flags": {}, 357 | "order": 5, 358 | "mode": 0, 359 | "inputs": [ 360 | { 361 | "name": "int_", 362 | "type": "INT", 363 | "link": 29, 364 | "widget": { 365 | "name": "int_" 366 | } 367 | } 368 | ], 369 | "outputs": [ 370 | { 371 | "name": "STRING", 372 | "type": "STRING", 373 | "links": [ 374 | 27 375 | ], 376 | "shape": 3, 377 | "slot_index": 0 378 | }, 379 | { 380 | "name": "show_help", 381 | "type": "STRING", 382 | "links": null, 383 | "shape": 3 384 | } 385 | ], 386 | "properties": { 387 | "Node name for S&R": "CR Integer To String" 388 | }, 389 | "widgets_values": [ 390 | 0 391 | ] 392 | }, 393 | { 394 | "id": 21, 395 | "type": "CR Integer To String", 396 | "pos": [ 397 | 460, 398 | 847 399 | ], 400 | "size": [ 401 | 210, 402 | 54 403 | ], 404 | "flags": {}, 405 | "order": 6, 406 | "mode": 0, 407 | "inputs": [ 408 | { 409 | "name": "int_", 410 | "type": "INT", 411 | "link": 28, 412 | "widget": { 413 | "name": "int_" 414 | } 415 | } 416 | ], 417 | "outputs": [ 418 | { 419 | "name": "STRING", 420 | "type": "STRING", 421 | "links": [ 422 | 26 423 | ], 424 | "shape": 3, 425 | "slot_index": 0 426 | }, 427 | { 428 | "name": "show_help", 429 | "type": "STRING", 430 | "links": null, 431 | "shape": 3 432 | } 433 | ], 434 | "properties": { 435 | "Node name for S&R": "CR Integer To String" 436 | }, 437 | "widgets_values": [ 438 | 0 439 | ] 440 | }, 441 | { 442 | "id": 8, 443 | "type": "EmptyLatentImageFromPresetsSDXL", 444 | "pos": [ 445 | 40, 446 | 800 447 | ], 448 | "size": { 449 | "0": 315, 450 | "1": 122 451 | }, 452 | "flags": {}, 453 | "order": 2, 454 | 
"mode": 0, 455 | "outputs": [ 456 | { 457 | "name": "latent", 458 | "type": "LATENT", 459 | "links": [ 460 | 8 461 | ], 462 | "shape": 3 463 | }, 464 | { 465 | "name": "w", 466 | "type": "INT", 467 | "links": [ 468 | 29 469 | ], 470 | "shape": 3, 471 | "slot_index": 1 472 | }, 473 | { 474 | "name": "h", 475 | "type": "INT", 476 | "links": [ 477 | 28 478 | ], 479 | "shape": 3, 480 | "slot_index": 2 481 | } 482 | ], 483 | "properties": { 484 | "Node name for S&R": "EmptyLatentImageFromPresetsSDXL" 485 | }, 486 | "widgets_values": [ 487 | " 832 x 1216", 488 | 1 489 | ] 490 | }, 491 | { 492 | "id": 15, 493 | "type": "SaveImageWithMetaData", 494 | "pos": [ 495 | 1106, 496 | 430 497 | ], 498 | "size": [ 499 | 315, 500 | 482 501 | ], 502 | "flags": {}, 503 | "order": 10, 504 | "mode": 0, 505 | "inputs": [ 506 | { 507 | "name": "images", 508 | "type": "IMAGE", 509 | "link": 19 510 | }, 511 | { 512 | "name": "extra_metadata", 513 | "type": "EXTRA_METADATA", 514 | "link": 25 515 | } 516 | ], 517 | "properties": { 518 | "Node name for S&R": "SaveImageWithMetaData" 519 | }, 520 | "widgets_values": [ 521 | "extra_metadata", 522 | "Farthest", 523 | 0, 524 | "png", 525 | true, 526 | 100, 527 | false, 528 | true, 529 | false 530 | ] 531 | } 532 | ], 533 | "links": [ 534 | [ 535 | 8, 536 | 8, 537 | 0, 538 | 9, 539 | 3, 540 | "LATENT" 541 | ], 542 | [ 543 | 9, 544 | 9, 545 | 0, 546 | 10, 547 | 0, 548 | "LATENT" 549 | ], 550 | [ 551 | 13, 552 | 1, 553 | 1, 554 | 14, 555 | 0, 556 | "CLIP" 557 | ], 558 | [ 559 | 14, 560 | 1, 561 | 1, 562 | 13, 563 | 0, 564 | "CLIP" 565 | ], 566 | [ 567 | 15, 568 | 13, 569 | 0, 570 | 9, 571 | 2, 572 | "CONDITIONING" 573 | ], 574 | [ 575 | 16, 576 | 14, 577 | 0, 578 | 9, 579 | 1, 580 | "CONDITIONING" 581 | ], 582 | [ 583 | 17, 584 | 1, 585 | 0, 586 | 9, 587 | 0, 588 | "MODEL" 589 | ], 590 | [ 591 | 18, 592 | 1, 593 | 2, 594 | 10, 595 | 1, 596 | "VAE" 597 | ], 598 | [ 599 | 19, 600 | 10, 601 | 0, 602 | 15, 603 | 0, 604 | "IMAGE" 605 | ], 606 | [ 607 | 20, 608 | 17, 609 | 0, 610 | 16, 611 | 0, 612 | "EXTRA_METADATA" 613 | ], 614 | [ 615 | 25, 616 | 16, 617 | 0, 618 | 15, 619 | 1, 620 | "EXTRA_METADATA" 621 | ], 622 | [ 623 | 26, 624 | 21, 625 | 0, 626 | 16, 627 | 2, 628 | "STRING" 629 | ], 630 | [ 631 | 27, 632 | 20, 633 | 0, 634 | 16, 635 | 1, 636 | "STRING" 637 | ], 638 | [ 639 | 28, 640 | 8, 641 | 2, 642 | 21, 643 | 0, 644 | "INT" 645 | ], 646 | [ 647 | 29, 648 | 8, 649 | 1, 650 | 20, 651 | 0, 652 | "INT" 653 | ] 654 | ], 655 | "groups": [], 656 | "config": {}, 657 | "extra": { 658 | "ds": { 659 | "scale": 0.7513148009015779, 660 | "offset": [ 661 | 653.6784224287038, 662 | -213.77225271772178 663 | ] 664 | } 665 | }, 666 | "version": 0.4 667 | } 668 | -------------------------------------------------------------------------------- /examples/extra_metadata.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/extra_metadata.png -------------------------------------------------------------------------------- /examples/filename_format.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 20, 3 | "last_link_id": 29, 4 | "nodes": [ 5 | { 6 | "id": 7, 7 | "type": "PromptUtilitiesConstStringMultiLine", 8 | "pos": [ 9 | 21, 10 | 378 11 | ], 12 | "size": { 13 | "0": 287.9851379394531, 14 | "1": 99.69133758544922 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | 
"outputs": [ 20 | { 21 | "name": "STRING", 22 | "type": "STRING", 23 | "links": [ 24 | 7 25 | ], 26 | "shape": 3 27 | } 28 | ], 29 | "properties": { 30 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 31 | }, 32 | "widgets_values": [ 33 | "(low quality, worst quality:1.4), embedding:EasyNegative" 34 | ] 35 | }, 36 | { 37 | "id": 9, 38 | "type": "CLIPTextEncode", 39 | "pos": [ 40 | 394, 41 | 382 42 | ], 43 | "size": { 44 | "0": 210, 45 | "1": 54 46 | }, 47 | "flags": {}, 48 | "order": 6, 49 | "mode": 0, 50 | "inputs": [ 51 | { 52 | "name": "clip", 53 | "type": "CLIP", 54 | "link": null, 55 | "slot_index": 0 56 | }, 57 | { 58 | "name": "text", 59 | "type": "STRING", 60 | "link": 7, 61 | "widget": { 62 | "name": "text" 63 | }, 64 | "slot_index": 1 65 | } 66 | ], 67 | "outputs": [ 68 | { 69 | "name": "CONDITIONING", 70 | "type": "CONDITIONING", 71 | "links": [ 72 | 10 73 | ], 74 | "shape": 3 75 | } 76 | ], 77 | "properties": { 78 | "Node name for S&R": "CLIPTextEncode" 79 | }, 80 | "widgets_values": [ 81 | "" 82 | ] 83 | }, 84 | { 85 | "id": 8, 86 | "type": "CLIPTextEncode", 87 | "pos": [ 88 | 390, 89 | 257 90 | ], 91 | "size": { 92 | "0": 210, 93 | "1": 54 94 | }, 95 | "flags": {}, 96 | "order": 7, 97 | "mode": 0, 98 | "inputs": [ 99 | { 100 | "name": "clip", 101 | "type": "CLIP", 102 | "link": null 103 | }, 104 | { 105 | "name": "text", 106 | "type": "STRING", 107 | "link": 8, 108 | "widget": { 109 | "name": "text" 110 | } 111 | } 112 | ], 113 | "outputs": [ 114 | { 115 | "name": "CONDITIONING", 116 | "type": "CONDITIONING", 117 | "links": [ 118 | 9 119 | ], 120 | "shape": 3, 121 | "slot_index": 0 122 | } 123 | ], 124 | "properties": { 125 | "Node name for S&R": "CLIPTextEncode" 126 | }, 127 | "widgets_values": [ 128 | "" 129 | ] 130 | }, 131 | { 132 | "id": 10, 133 | "type": "Prompts Everywhere", 134 | "pos": [ 135 | 674, 136 | 302 137 | ], 138 | "size": { 139 | "0": 210, 140 | "1": 46 141 | }, 142 | "flags": {}, 143 | "order": 10, 144 | "mode": 0, 145 | "inputs": [ 146 | { 147 | "name": "CONDITIONING", 148 | "type": "*", 149 | "link": 9, 150 | "color_on": "#FFA931" 151 | }, 152 | { 153 | "name": "CONDITIONING", 154 | "type": "*", 155 | "link": 10, 156 | "slot_index": 1, 157 | "color_on": "#FFA931" 158 | } 159 | ], 160 | "properties": { 161 | "Node name for S&R": "Prompts Everywhere", 162 | "group_restricted": false, 163 | "color_restricted": false 164 | }, 165 | "widgets_values": [] 166 | }, 167 | { 168 | "id": 6, 169 | "type": "PromptUtilitiesConstStringMultiLine", 170 | "pos": [ 171 | 18, 172 | 225 173 | ], 174 | "size": { 175 | "0": 287.9851379394531, 176 | "1": 99.69133758544922 177 | }, 178 | "flags": {}, 179 | "order": 1, 180 | "mode": 0, 181 | "outputs": [ 182 | { 183 | "name": "STRING", 184 | "type": "STRING", 185 | "links": [ 186 | 8 187 | ], 188 | "shape": 3, 189 | "slot_index": 0 190 | } 191 | ], 192 | "properties": { 193 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 194 | }, 195 | "widgets_values": [ 196 | "masterpiece, best quality, hatsune miku, upper body" 197 | ] 198 | }, 199 | { 200 | "id": 1, 201 | "type": "CheckpointLoaderSimple", 202 | "pos": [ 203 | 24, 204 | 54 205 | ], 206 | "size": { 207 | "0": 315, 208 | "1": 98 209 | }, 210 | "flags": {}, 211 | "order": 2, 212 | "mode": 0, 213 | "outputs": [ 214 | { 215 | "name": "MODEL", 216 | "type": "MODEL", 217 | "links": [ 218 | 24 219 | ], 220 | "shape": 3, 221 | "slot_index": 0 222 | }, 223 | { 224 | "name": "CLIP", 225 | "type": "CLIP", 226 | "links": [ 227 | 25 228 | ], 229 | "shape": 3, 230 | 
"slot_index": 1 231 | }, 232 | { 233 | "name": "VAE", 234 | "type": "VAE", 235 | "links": [ 236 | 26 237 | ], 238 | "shape": 3 239 | } 240 | ], 241 | "properties": { 242 | "Node name for S&R": "CheckpointLoaderSimple" 243 | }, 244 | "widgets_values": [ 245 | "CounterfeitV30_v30.safetensors" 246 | ] 247 | }, 248 | { 249 | "id": 5, 250 | "type": "Anything Everywhere3", 251 | "pos": [ 252 | 395, 253 | 54 254 | ], 255 | "size": { 256 | "0": 210, 257 | "1": 66 258 | }, 259 | "flags": {}, 260 | "order": 8, 261 | "mode": 0, 262 | "inputs": [ 263 | { 264 | "name": "MODEL", 265 | "type": "*", 266 | "link": 24, 267 | "slot_index": 0, 268 | "color_on": "#B39DDB" 269 | }, 270 | { 271 | "name": "CLIP", 272 | "type": "*", 273 | "link": 25, 274 | "color_on": "#FFD500" 275 | }, 276 | { 277 | "name": "VAE", 278 | "type": "*", 279 | "link": 26, 280 | "color_on": "#FF6E6E", 281 | "slot_index": 2 282 | } 283 | ], 284 | "properties": { 285 | "Node name for S&R": "Anything Everywhere3", 286 | "group_restricted": false, 287 | "color_restricted": false 288 | }, 289 | "widgets_values": [] 290 | }, 291 | { 292 | "id": 2, 293 | "type": "VAELoader", 294 | "pos": [ 295 | 402, 296 | 858 297 | ], 298 | "size": { 299 | "0": 323.17578125, 300 | "1": 58 301 | }, 302 | "flags": {}, 303 | "order": 3, 304 | "mode": 0, 305 | "outputs": [ 306 | { 307 | "name": "VAE", 308 | "type": "VAE", 309 | "links": [ 310 | 13 311 | ], 312 | "shape": 3, 313 | "slot_index": 0 314 | } 315 | ], 316 | "properties": { 317 | "Node name for S&R": "VAELoader" 318 | }, 319 | "widgets_values": [ 320 | "clearvae_nanslesstest.safetensors" 321 | ] 322 | }, 323 | { 324 | "id": 13, 325 | "type": "VAEDecode", 326 | "pos": [ 327 | 790, 328 | 531 329 | ], 330 | "size": { 331 | "0": 210, 332 | "1": 46 333 | }, 334 | "flags": {}, 335 | "order": 11, 336 | "mode": 0, 337 | "inputs": [ 338 | { 339 | "name": "samples", 340 | "type": "LATENT", 341 | "link": 28, 342 | "slot_index": 0 343 | }, 344 | { 345 | "name": "vae", 346 | "type": "VAE", 347 | "link": 13 348 | } 349 | ], 350 | "outputs": [ 351 | { 352 | "name": "IMAGE", 353 | "type": "IMAGE", 354 | "links": [ 355 | 15 356 | ], 357 | "shape": 3 358 | } 359 | ], 360 | "properties": { 361 | "Node name for S&R": "VAEDecode" 362 | } 363 | }, 364 | { 365 | "id": 12, 366 | "type": "EmptyLatentImageFromPresetsSD15", 367 | "pos": [ 368 | 23, 369 | 550 370 | ], 371 | "size": { 372 | "0": 327.6000061035156, 373 | "1": 122 374 | }, 375 | "flags": {}, 376 | "order": 4, 377 | "mode": 0, 378 | "outputs": [ 379 | { 380 | "name": "latent", 381 | "type": "LATENT", 382 | "links": [ 383 | 11 384 | ], 385 | "shape": 3, 386 | "slot_index": 0 387 | }, 388 | { 389 | "name": "w", 390 | "type": "INT", 391 | "links": null, 392 | "shape": 3 393 | }, 394 | { 395 | "name": "h", 396 | "type": "INT", 397 | "links": null, 398 | "shape": 3 399 | } 400 | ], 401 | "properties": { 402 | "Node name for S&R": "EmptyLatentImageFromPresetsSD15" 403 | }, 404 | "widgets_values": [ 405 | "512 x 768", 406 | 4 407 | ] 408 | }, 409 | { 410 | "id": 11, 411 | "type": "KSampler", 412 | "pos": [ 413 | 402, 414 | 532 415 | ], 416 | "size": { 417 | "0": 315, 418 | "1": 262 419 | }, 420 | "flags": {}, 421 | "order": 9, 422 | "mode": 0, 423 | "inputs": [ 424 | { 425 | "name": "model", 426 | "type": "MODEL", 427 | "link": null 428 | }, 429 | { 430 | "name": "positive", 431 | "type": "CONDITIONING", 432 | "link": null 433 | }, 434 | { 435 | "name": "negative", 436 | "type": "CONDITIONING", 437 | "link": null 438 | }, 439 | { 440 | "name": "latent_image", 441 | "type": 
"LATENT", 442 | "link": 11 443 | } 444 | ], 445 | "outputs": [ 446 | { 447 | "name": "LATENT", 448 | "type": "LATENT", 449 | "links": [ 450 | 28 451 | ], 452 | "shape": 3, 453 | "slot_index": 0 454 | } 455 | ], 456 | "properties": { 457 | "Node name for S&R": "KSampler" 458 | }, 459 | "widgets_values": [ 460 | 123, 461 | "fixed", 462 | 20, 463 | 8, 464 | "dpmpp_2m", 465 | "karras", 466 | 1 467 | ] 468 | }, 469 | { 470 | "id": 20, 471 | "type": "PromptUtilitiesConstStringMultiLine", 472 | "pos": [ 473 | 800, 474 | 645 475 | ], 476 | "size": [ 477 | 255.52172323794503, 478 | 110.68967742638506 479 | ], 480 | "flags": {}, 481 | "order": 5, 482 | "mode": 0, 483 | "outputs": [ 484 | { 485 | "name": "STRING", 486 | "type": "STRING", 487 | "links": [ 488 | 29 489 | ], 490 | "shape": 3, 491 | "slot_index": 0 492 | } 493 | ], 494 | "properties": { 495 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 496 | }, 497 | "widgets_values": [ 498 | "siwm-%model:10%/%pprompt:20%-%nprompt:20%/%seed%-%width%x%height%_%date:yyyyMMdd-hhmmss%" 499 | ] 500 | }, 501 | { 502 | "id": 15, 503 | "type": "SaveImageWithMetaData", 504 | "pos": [ 505 | 1132, 506 | 495 507 | ], 508 | "size": [ 509 | 277.5890906467905, 510 | 411.8033565804974 511 | ], 512 | "flags": {}, 513 | "order": 12, 514 | "mode": 0, 515 | "inputs": [ 516 | { 517 | "name": "images", 518 | "type": "IMAGE", 519 | "link": 15, 520 | "slot_index": 0 521 | }, 522 | { 523 | "name": "filename_prefix", 524 | "type": "STRING", 525 | "link": 29, 526 | "widget": { 527 | "name": "filename_prefix" 528 | } 529 | } 530 | ], 531 | "properties": { 532 | "Node name for S&R": "SaveImageWithMetaData" 533 | }, 534 | "widgets_values": [ 535 | "siwm-%model:10%/%pprompt:20%-%nprompt:20%/%seed%-%width%x%height%_%date:yyyyMMdd-hhmmss%", 536 | "Farthest", 537 | 0 538 | ] 539 | } 540 | ], 541 | "links": [ 542 | [ 543 | 7, 544 | 7, 545 | 0, 546 | 9, 547 | 1, 548 | "STRING" 549 | ], 550 | [ 551 | 8, 552 | 6, 553 | 0, 554 | 8, 555 | 1, 556 | "STRING" 557 | ], 558 | [ 559 | 9, 560 | 8, 561 | 0, 562 | 10, 563 | 0, 564 | "*" 565 | ], 566 | [ 567 | 10, 568 | 9, 569 | 0, 570 | 10, 571 | 1, 572 | "*" 573 | ], 574 | [ 575 | 11, 576 | 12, 577 | 0, 578 | 11, 579 | 3, 580 | "LATENT" 581 | ], 582 | [ 583 | 13, 584 | 2, 585 | 0, 586 | 13, 587 | 1, 588 | "VAE" 589 | ], 590 | [ 591 | 15, 592 | 13, 593 | 0, 594 | 15, 595 | 0, 596 | "IMAGE" 597 | ], 598 | [ 599 | 24, 600 | 1, 601 | 0, 602 | 5, 603 | 0, 604 | "*" 605 | ], 606 | [ 607 | 25, 608 | 1, 609 | 1, 610 | 5, 611 | 1, 612 | "*" 613 | ], 614 | [ 615 | 26, 616 | 1, 617 | 2, 618 | 5, 619 | 2, 620 | "*" 621 | ], 622 | [ 623 | 28, 624 | 11, 625 | 0, 626 | 13, 627 | 0, 628 | "LATENT" 629 | ], 630 | [ 631 | 29, 632 | 20, 633 | 0, 634 | 15, 635 | 1, 636 | "STRING" 637 | ] 638 | ], 639 | "groups": [], 640 | "config": {}, 641 | "extra": {}, 642 | "version": 0.4 643 | } 644 | -------------------------------------------------------------------------------- /examples/filename_format.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/filename_format.png -------------------------------------------------------------------------------- /examples/lora_embedding_vae.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 19, 3 | "last_link_id": 22, 4 | "nodes": [ 5 | { 6 | "id": 1, 7 | "type": "CheckpointLoaderSimple", 8 | "pos": [ 9 | 24, 
10 | 54 11 | ], 12 | "size": { 13 | "0": 315, 14 | "1": 98 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | "outputs": [ 20 | { 21 | "name": "MODEL", 22 | "type": "MODEL", 23 | "links": [ 24 | 1 25 | ], 26 | "shape": 3, 27 | "slot_index": 0 28 | }, 29 | { 30 | "name": "CLIP", 31 | "type": "CLIP", 32 | "links": [ 33 | 2 34 | ], 35 | "shape": 3 36 | }, 37 | { 38 | "name": "VAE", 39 | "type": "VAE", 40 | "links": null, 41 | "shape": 3 42 | } 43 | ], 44 | "properties": { 45 | "Node name for S&R": "CheckpointLoaderSimple" 46 | }, 47 | "widgets_values": [ 48 | "CounterfeitV30_v30.safetensors" 49 | ] 50 | }, 51 | { 52 | "id": 5, 53 | "type": "Anything Everywhere3", 54 | "pos": [ 55 | 1143, 56 | 52 57 | ], 58 | "size": { 59 | "0": 210, 60 | "1": 66 61 | }, 62 | "flags": {}, 63 | "order": 12, 64 | "mode": 0, 65 | "inputs": [ 66 | { 67 | "name": "MODEL", 68 | "type": "*", 69 | "link": 5, 70 | "slot_index": 0, 71 | "color_on": "#B39DDB" 72 | }, 73 | { 74 | "name": "CLIP", 75 | "type": "*", 76 | "link": 6, 77 | "color_on": "#FFD500" 78 | }, 79 | { 80 | "name": "anything", 81 | "type": "*", 82 | "link": null 83 | } 84 | ], 85 | "properties": { 86 | "group_restricted": false, 87 | "color_restricted": false, 88 | "Node name for S&R": "Anything Everywhere3" 89 | } 90 | }, 91 | { 92 | "id": 7, 93 | "type": "PromptUtilitiesConstStringMultiLine", 94 | "pos": [ 95 | 21, 96 | 378 97 | ], 98 | "size": { 99 | "0": 287.9851379394531, 100 | "1": 99.69133758544922 101 | }, 102 | "flags": {}, 103 | "order": 1, 104 | "mode": 0, 105 | "outputs": [ 106 | { 107 | "name": "STRING", 108 | "type": "STRING", 109 | "links": [ 110 | 7 111 | ], 112 | "shape": 3 113 | } 114 | ], 115 | "properties": { 116 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 117 | }, 118 | "widgets_values": [ 119 | "(low quality, worst quality:1.4), embedding:EasyNegative" 120 | ] 121 | }, 122 | { 123 | "id": 9, 124 | "type": "CLIPTextEncode", 125 | "pos": [ 126 | 394, 127 | 382 128 | ], 129 | "size": { 130 | "0": 210, 131 | "1": 54 132 | }, 133 | "flags": {}, 134 | "order": 6, 135 | "mode": 0, 136 | "inputs": [ 137 | { 138 | "name": "clip", 139 | "type": "CLIP", 140 | "link": null, 141 | "slot_index": 0 142 | }, 143 | { 144 | "name": "text", 145 | "type": "STRING", 146 | "link": 7, 147 | "widget": { 148 | "name": "text" 149 | }, 150 | "slot_index": 1 151 | } 152 | ], 153 | "outputs": [ 154 | { 155 | "name": "CONDITIONING", 156 | "type": "CONDITIONING", 157 | "links": [ 158 | 10 159 | ], 160 | "shape": 3 161 | } 162 | ], 163 | "properties": { 164 | "Node name for S&R": "CLIPTextEncode" 165 | }, 166 | "widgets_values": [ 167 | "" 168 | ] 169 | }, 170 | { 171 | "id": 8, 172 | "type": "CLIPTextEncode", 173 | "pos": [ 174 | 390, 175 | 257 176 | ], 177 | "size": { 178 | "0": 210, 179 | "1": 54 180 | }, 181 | "flags": {}, 182 | "order": 8, 183 | "mode": 0, 184 | "inputs": [ 185 | { 186 | "name": "clip", 187 | "type": "CLIP", 188 | "link": null 189 | }, 190 | { 191 | "name": "text", 192 | "type": "STRING", 193 | "link": 8, 194 | "widget": { 195 | "name": "text" 196 | } 197 | } 198 | ], 199 | "outputs": [ 200 | { 201 | "name": "CONDITIONING", 202 | "type": "CONDITIONING", 203 | "links": [ 204 | 9 205 | ], 206 | "shape": 3, 207 | "slot_index": 0 208 | } 209 | ], 210 | "properties": { 211 | "Node name for S&R": "CLIPTextEncode" 212 | }, 213 | "widgets_values": [ 214 | "" 215 | ] 216 | }, 217 | { 218 | "id": 10, 219 | "type": "Prompts Everywhere", 220 | "pos": [ 221 | 674, 222 | 302 223 | ], 224 | "size": { 225 | "0": 210, 226 | "1": 
46 227 | }, 228 | "flags": {}, 229 | "order": 11, 230 | "mode": 0, 231 | "inputs": [ 232 | { 233 | "name": "CONDITIONING", 234 | "type": "*", 235 | "link": 9, 236 | "color_on": "#FFA931" 237 | }, 238 | { 239 | "name": "CONDITIONING", 240 | "type": "*", 241 | "link": 10, 242 | "slot_index": 1, 243 | "color_on": "#FFA931" 244 | } 245 | ], 246 | "properties": { 247 | "group_restricted": false, 248 | "color_restricted": false, 249 | "Node name for S&R": "Prompts Everywhere" 250 | } 251 | }, 252 | { 253 | "id": 12, 254 | "type": "EmptyLatentImageFromPresetsSD15", 255 | "pos": [ 256 | 23, 257 | 550 258 | ], 259 | "size": { 260 | "0": 327.6000061035156, 261 | "1": 122 262 | }, 263 | "flags": {}, 264 | "order": 2, 265 | "mode": 0, 266 | "outputs": [ 267 | { 268 | "name": "latent", 269 | "type": "LATENT", 270 | "links": [ 271 | 11 272 | ], 273 | "shape": 3, 274 | "slot_index": 0 275 | }, 276 | { 277 | "name": "w", 278 | "type": "INT", 279 | "links": null, 280 | "shape": 3 281 | }, 282 | { 283 | "name": "h", 284 | "type": "INT", 285 | "links": null, 286 | "shape": 3 287 | } 288 | ], 289 | "properties": { 290 | "Node name for S&R": "EmptyLatentImageFromPresetsSD15" 291 | }, 292 | "widgets_values": [ 293 | "512 x 768", 294 | 1 295 | ] 296 | }, 297 | { 298 | "id": 14, 299 | "type": "SaveImage", 300 | "pos": [ 301 | 1840, 302 | 260 303 | ], 304 | "size": { 305 | "0": 315, 306 | "1": 270 307 | }, 308 | "flags": {}, 309 | "order": 15, 310 | "mode": 0, 311 | "inputs": [ 312 | { 313 | "name": "images", 314 | "type": "IMAGE", 315 | "link": 14, 316 | "slot_index": 0 317 | } 318 | ], 319 | "properties": {}, 320 | "widgets_values": [ 321 | "without-meta" 322 | ] 323 | }, 324 | { 325 | "id": 17, 326 | "type": "LatentUpscaleBy", 327 | "pos": [ 328 | 772, 329 | 532 330 | ], 331 | "size": { 332 | "0": 262.8999938964844, 333 | "1": 82 334 | }, 335 | "flags": {}, 336 | "order": 10, 337 | "mode": 0, 338 | "inputs": [ 339 | { 340 | "name": "samples", 341 | "type": "LATENT", 342 | "link": 18, 343 | "slot_index": 0 344 | } 345 | ], 346 | "outputs": [ 347 | { 348 | "name": "LATENT", 349 | "type": "LATENT", 350 | "links": [ 351 | 19 352 | ], 353 | "shape": 3, 354 | "slot_index": 0 355 | } 356 | ], 357 | "properties": { 358 | "Node name for S&R": "LatentUpscaleBy" 359 | }, 360 | "widgets_values": [ 361 | "nearest-exact", 362 | 2 363 | ] 364 | }, 365 | { 366 | "id": 15, 367 | "type": "SaveImageWithMetaData", 368 | "pos": [ 369 | 1840, 370 | 580 371 | ], 372 | "size": { 373 | "0": 315, 374 | "1": 318 375 | }, 376 | "flags": {}, 377 | "order": 16, 378 | "mode": 0, 379 | "inputs": [ 380 | { 381 | "name": "images", 382 | "type": "IMAGE", 383 | "link": 15, 384 | "slot_index": 0 385 | } 386 | ], 387 | "properties": { 388 | "Node name for S&R": "SaveImageWithMetaData" 389 | }, 390 | "widgets_values": [ 391 | "with-meta", 392 | "Farthest", 393 | 0 394 | ] 395 | }, 396 | { 397 | "id": 11, 398 | "type": "KSampler", 399 | "pos": [ 400 | 402, 401 | 532 402 | ], 403 | "size": { 404 | "0": 315, 405 | "1": 262 406 | }, 407 | "flags": {}, 408 | "order": 7, 409 | "mode": 0, 410 | "inputs": [ 411 | { 412 | "name": "model", 413 | "type": "MODEL", 414 | "link": null 415 | }, 416 | { 417 | "name": "positive", 418 | "type": "CONDITIONING", 419 | "link": null 420 | }, 421 | { 422 | "name": "negative", 423 | "type": "CONDITIONING", 424 | "link": null 425 | }, 426 | { 427 | "name": "latent_image", 428 | "type": "LATENT", 429 | "link": 11 430 | } 431 | ], 432 | "outputs": [ 433 | { 434 | "name": "LATENT", 435 | "type": "LATENT", 436 | "links": [ 437 
| 18 438 | ], 439 | "shape": 3, 440 | "slot_index": 0 441 | } 442 | ], 443 | "properties": { 444 | "Node name for S&R": "KSampler" 445 | }, 446 | "widgets_values": [ 447 | 123, 448 | "fixed", 449 | 20, 450 | 8, 451 | "dpmpp_2m", 452 | "karras", 453 | 1 454 | ] 455 | }, 456 | { 457 | "id": 13, 458 | "type": "VAEDecode", 459 | "pos": [ 460 | 1501, 461 | 530 462 | ], 463 | "size": { 464 | "0": 210, 465 | "1": 46 466 | }, 467 | "flags": {}, 468 | "order": 14, 469 | "mode": 0, 470 | "inputs": [ 471 | { 472 | "name": "samples", 473 | "type": "LATENT", 474 | "link": 17, 475 | "slot_index": 0 476 | }, 477 | { 478 | "name": "vae", 479 | "type": "VAE", 480 | "link": 13 481 | } 482 | ], 483 | "outputs": [ 484 | { 485 | "name": "IMAGE", 486 | "type": "IMAGE", 487 | "links": [ 488 | 14, 489 | 15 490 | ], 491 | "shape": 3 492 | } 493 | ], 494 | "properties": { 495 | "Node name for S&R": "VAEDecode" 496 | } 497 | }, 498 | { 499 | "id": 16, 500 | "type": "KSampler", 501 | "pos": [ 502 | 1090, 503 | 470 504 | ], 505 | "size": { 506 | "0": 315, 507 | "1": 262 508 | }, 509 | "flags": {}, 510 | "order": 13, 511 | "mode": 0, 512 | "inputs": [ 513 | { 514 | "name": "model", 515 | "type": "MODEL", 516 | "link": null 517 | }, 518 | { 519 | "name": "positive", 520 | "type": "CONDITIONING", 521 | "link": null 522 | }, 523 | { 524 | "name": "negative", 525 | "type": "CONDITIONING", 526 | "link": null 527 | }, 528 | { 529 | "name": "latent_image", 530 | "type": "LATENT", 531 | "link": 19 532 | } 533 | ], 534 | "outputs": [ 535 | { 536 | "name": "LATENT", 537 | "type": "LATENT", 538 | "links": [ 539 | 17 540 | ], 541 | "shape": 3, 542 | "slot_index": 0 543 | } 544 | ], 545 | "properties": { 546 | "Node name for S&R": "KSampler" 547 | }, 548 | "widgets_values": [ 549 | 123, 550 | "fixed", 551 | 20, 552 | 8, 553 | "dpmpp_2m", 554 | "karras", 555 | 0.6 556 | ] 557 | }, 558 | { 559 | "id": 4, 560 | "type": "LoraLoader", 561 | "pos": [ 562 | 761, 563 | 52 564 | ], 565 | "size": { 566 | "0": 340.8315734863281, 567 | "1": 126 568 | }, 569 | "flags": {}, 570 | "order": 9, 571 | "mode": 0, 572 | "inputs": [ 573 | { 574 | "name": "model", 575 | "type": "MODEL", 576 | "link": 3, 577 | "slot_index": 0 578 | }, 579 | { 580 | "name": "clip", 581 | "type": "CLIP", 582 | "link": 4 583 | } 584 | ], 585 | "outputs": [ 586 | { 587 | "name": "MODEL", 588 | "type": "MODEL", 589 | "links": [ 590 | 5 591 | ], 592 | "shape": 3, 593 | "slot_index": 0 594 | }, 595 | { 596 | "name": "CLIP", 597 | "type": "CLIP", 598 | "links": [ 599 | 6 600 | ], 601 | "shape": 3, 602 | "slot_index": 1 603 | } 604 | ], 605 | "properties": { 606 | "Node name for S&R": "LoraLoader" 607 | }, 608 | "widgets_values": [ 609 | "leco\\leco_cyborg_last.safetensors", 610 | -3, 611 | -3 612 | ] 613 | }, 614 | { 615 | "id": 2, 616 | "type": "VAELoader", 617 | "pos": [ 618 | 1090, 619 | 810 620 | ], 621 | "size": { 622 | "0": 323.17578125, 623 | "1": 58 624 | }, 625 | "flags": {}, 626 | "order": 3, 627 | "mode": 0, 628 | "outputs": [ 629 | { 630 | "name": "VAE", 631 | "type": "VAE", 632 | "links": [ 633 | 13 634 | ], 635 | "shape": 3, 636 | "slot_index": 0 637 | } 638 | ], 639 | "properties": { 640 | "Node name for S&R": "VAELoader" 641 | }, 642 | "widgets_values": [ 643 | "clearvae_nanslesstest.safetensors" 644 | ] 645 | }, 646 | { 647 | "id": 6, 648 | "type": "PromptUtilitiesConstStringMultiLine", 649 | "pos": [ 650 | 18, 651 | 225 652 | ], 653 | "size": { 654 | "0": 287.9851379394531, 655 | "1": 99.69133758544922 656 | }, 657 | "flags": {}, 658 | "order": 4, 659 | "mode": 
0, 660 | "outputs": [ 661 | { 662 | "name": "STRING", 663 | "type": "STRING", 664 | "links": [ 665 | 8 666 | ], 667 | "shape": 3, 668 | "slot_index": 0 669 | } 670 | ], 671 | "properties": { 672 | "Node name for S&R": "PromptUtilitiesConstStringMultiLine" 673 | }, 674 | "widgets_values": [ 675 | "masterpiece, best quality, hatsune miku, upper body" 676 | ] 677 | }, 678 | { 679 | "id": 3, 680 | "type": "LoraLoader", 681 | "pos": [ 682 | 397, 683 | 50 684 | ], 685 | "size": { 686 | "0": 315, 687 | "1": 126 688 | }, 689 | "flags": {}, 690 | "order": 5, 691 | "mode": 0, 692 | "inputs": [ 693 | { 694 | "name": "model", 695 | "type": "MODEL", 696 | "link": 1 697 | }, 698 | { 699 | "name": "clip", 700 | "type": "CLIP", 701 | "link": 2, 702 | "slot_index": 1 703 | } 704 | ], 705 | "outputs": [ 706 | { 707 | "name": "MODEL", 708 | "type": "MODEL", 709 | "links": [ 710 | 3 711 | ], 712 | "shape": 3 713 | }, 714 | { 715 | "name": "CLIP", 716 | "type": "CLIP", 717 | "links": [ 718 | 4 719 | ], 720 | "shape": 3, 721 | "slot_index": 1 722 | } 723 | ], 724 | "properties": { 725 | "Node name for S&R": "LoraLoader" 726 | }, 727 | "widgets_values": [ 728 | "util\\flat2.safetensors", 729 | -1, 730 | -1 731 | ] 732 | } 733 | ], 734 | "links": [ 735 | [ 736 | 1, 737 | 1, 738 | 0, 739 | 3, 740 | 0, 741 | "MODEL" 742 | ], 743 | [ 744 | 2, 745 | 1, 746 | 1, 747 | 3, 748 | 1, 749 | "CLIP" 750 | ], 751 | [ 752 | 3, 753 | 3, 754 | 0, 755 | 4, 756 | 0, 757 | "MODEL" 758 | ], 759 | [ 760 | 4, 761 | 3, 762 | 1, 763 | 4, 764 | 1, 765 | "CLIP" 766 | ], 767 | [ 768 | 5, 769 | 4, 770 | 0, 771 | 5, 772 | 0, 773 | "*" 774 | ], 775 | [ 776 | 6, 777 | 4, 778 | 1, 779 | 5, 780 | 1, 781 | "*" 782 | ], 783 | [ 784 | 7, 785 | 7, 786 | 0, 787 | 9, 788 | 1, 789 | "STRING" 790 | ], 791 | [ 792 | 8, 793 | 6, 794 | 0, 795 | 8, 796 | 1, 797 | "STRING" 798 | ], 799 | [ 800 | 9, 801 | 8, 802 | 0, 803 | 10, 804 | 0, 805 | "*" 806 | ], 807 | [ 808 | 10, 809 | 9, 810 | 0, 811 | 10, 812 | 1, 813 | "*" 814 | ], 815 | [ 816 | 11, 817 | 12, 818 | 0, 819 | 11, 820 | 3, 821 | "LATENT" 822 | ], 823 | [ 824 | 13, 825 | 2, 826 | 0, 827 | 13, 828 | 1, 829 | "VAE" 830 | ], 831 | [ 832 | 14, 833 | 13, 834 | 0, 835 | 14, 836 | 0, 837 | "IMAGE" 838 | ], 839 | [ 840 | 15, 841 | 13, 842 | 0, 843 | 15, 844 | 0, 845 | "IMAGE" 846 | ], 847 | [ 848 | 17, 849 | 16, 850 | 0, 851 | 13, 852 | 0, 853 | "LATENT" 854 | ], 855 | [ 856 | 18, 857 | 11, 858 | 0, 859 | 17, 860 | 0, 861 | "LATENT" 862 | ], 863 | [ 864 | 19, 865 | 17, 866 | 0, 867 | 16, 868 | 3, 869 | "LATENT" 870 | ] 871 | ], 872 | "groups": [], 873 | "config": {}, 874 | "extra": {}, 875 | "version": 0.4 876 | } -------------------------------------------------------------------------------- /examples/lora_embedding_vae.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/lora_embedding_vae.png -------------------------------------------------------------------------------- /examples/sd3.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 277, 3 | "last_link_id": 606, 4 | "nodes": [ 5 | { 6 | "id": 266, 7 | "type": "Note", 8 | "pos": [ 9 | -2352, 10 | 576 11 | ], 12 | "size": { 13 | "0": 308.061279296875, 14 | "1": 102.86902618408203 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | "properties": { 20 | "text": "" 21 | }, 22 | "widgets_values": [ 23 | "Resolution should be around 
1 megapixel and width/height must be multiple of 64" 24 | ], 25 | "color": "#432", 26 | "bgcolor": "#653" 27 | }, 28 | { 29 | "id": 135, 30 | "type": "EmptySD3LatentImage", 31 | "pos": [ 32 | -2352, 33 | 438 34 | ], 35 | "size": { 36 | "0": 315, 37 | "1": 106 38 | }, 39 | "flags": {}, 40 | "order": 1, 41 | "mode": 0, 42 | "inputs": [], 43 | "outputs": [ 44 | { 45 | "name": "LATENT", 46 | "type": "LATENT", 47 | "links": [ 48 | 593 49 | ], 50 | "shape": 3, 51 | "slot_index": 0 52 | } 53 | ], 54 | "properties": { 55 | "Node name for S&R": "EmptySD3LatentImage" 56 | }, 57 | "widgets_values": [ 58 | 1024, 59 | 1024, 60 | 1 61 | ] 62 | }, 63 | { 64 | "id": 71, 65 | "type": "CLIPTextEncode", 66 | "pos": [ 67 | -1869.2871546875003, 68 | 560.071803930664 69 | ], 70 | "size": { 71 | "0": 380.4615783691406, 72 | "1": 102.07693481445312 73 | }, 74 | "flags": {}, 75 | "order": 6, 76 | "mode": 0, 77 | "inputs": [ 78 | { 79 | "name": "clip", 80 | "type": "CLIP", 81 | "link": 601 82 | } 83 | ], 84 | "outputs": [ 85 | { 86 | "name": "CONDITIONING", 87 | "type": "CONDITIONING", 88 | "links": [ 89 | 93, 90 | 580 91 | ], 92 | "shape": 3, 93 | "slot_index": 0 94 | } 95 | ], 96 | "title": "CLIP Text Encode (Negative Prompt)", 97 | "properties": { 98 | "Node name for S&R": "CLIPTextEncode" 99 | }, 100 | "widgets_values": [ 101 | "bad quality, poor quality, disfigured, jpg, bad anatomy, missing limbs, missing fingers" 102 | ], 103 | "color": "#322", 104 | "bgcolor": "#533" 105 | }, 106 | { 107 | "id": 272, 108 | "type": "PrimitiveNode", 109 | "pos": [ 110 | -2342, 111 | 278 112 | ], 113 | "size": { 114 | "0": 210, 115 | "1": 82 116 | }, 117 | "flags": {}, 118 | "order": 2, 119 | "mode": 0, 120 | "outputs": [ 121 | { 122 | "name": "INT", 123 | "type": "INT", 124 | "links": [ 125 | 597 126 | ], 127 | "slot_index": 0, 128 | "widget": { 129 | "name": "seed" 130 | } 131 | } 132 | ], 133 | "title": "Seed", 134 | "properties": { 135 | "Run widget replace on values": false 136 | }, 137 | "widgets_values": [ 138 | 0, 139 | "fixed" 140 | ] 141 | }, 142 | { 143 | "id": 6, 144 | "type": "CLIPTextEncode", 145 | "pos": [ 146 | -1876, 147 | 284 148 | ], 149 | "size": { 150 | "0": 389.06927490234375, 151 | "1": 207.84902954101562 152 | }, 153 | "flags": {}, 154 | "order": 5, 155 | "mode": 0, 156 | "inputs": [ 157 | { 158 | "name": "clip", 159 | "type": "CLIP", 160 | "link": 600 161 | } 162 | ], 163 | "outputs": [ 164 | { 165 | "name": "CONDITIONING", 166 | "type": "CONDITIONING", 167 | "links": [ 168 | 595 169 | ], 170 | "shape": 3, 171 | "slot_index": 0 172 | } 173 | ], 174 | "properties": { 175 | "Node name for S&R": "CLIPTextEncode" 176 | }, 177 | "widgets_values": [ 178 | "Chibi figure of a wizard girl with a big wand. She wears a wizard's hat and a blue robe. The wand is based on a clock. A magic circle is deployed on the underside." 
179 | ], 180 | "color": "#232", 181 | "bgcolor": "#353" 182 | }, 183 | { 184 | "id": 252, 185 | "type": "CheckpointLoaderSimple", 186 | "pos": [ 187 | -2355.546164772728, 188 | 16.90906663374463 189 | ], 190 | "size": [ 191 | 404.13517457201715, 192 | 98 193 | ], 194 | "flags": {}, 195 | "order": 3, 196 | "mode": 0, 197 | "outputs": [ 198 | { 199 | "name": "MODEL", 200 | "type": "MODEL", 201 | "links": [ 202 | 565 203 | ], 204 | "shape": 3, 205 | "slot_index": 0 206 | }, 207 | { 208 | "name": "CLIP", 209 | "type": "CLIP", 210 | "links": [ 211 | 600, 212 | 601 213 | ], 214 | "shape": 3, 215 | "slot_index": 1 216 | }, 217 | { 218 | "name": "VAE", 219 | "type": "VAE", 220 | "links": [ 221 | 557 222 | ], 223 | "shape": 3, 224 | "slot_index": 2 225 | } 226 | ], 227 | "properties": { 228 | "Node name for S&R": "CheckpointLoaderSimple" 229 | }, 230 | "widgets_values": [ 231 | "sd3_medium_incl_clips_t5xxlfp8.safetensors" 232 | ] 233 | }, 234 | { 235 | "id": 13, 236 | "type": "ModelSamplingSD3", 237 | "pos": [ 238 | -1774, 239 | 12 240 | ], 241 | "size": { 242 | "0": 315, 243 | "1": 58 244 | }, 245 | "flags": { 246 | "collapsed": false 247 | }, 248 | "order": 4, 249 | "mode": 0, 250 | "inputs": [ 251 | { 252 | "name": "model", 253 | "type": "MODEL", 254 | "link": 565 255 | } 256 | ], 257 | "outputs": [ 258 | { 259 | "name": "MODEL", 260 | "type": "MODEL", 261 | "links": [ 262 | 591 263 | ], 264 | "shape": 3, 265 | "slot_index": 0 266 | } 267 | ], 268 | "properties": { 269 | "Node name for S&R": "ModelSamplingSD3" 270 | }, 271 | "widgets_values": [ 272 | 3 273 | ] 274 | }, 275 | { 276 | "id": 67, 277 | "type": "ConditioningZeroOut", 278 | "pos": [ 279 | -1405, 280 | 392 281 | ], 282 | "size": { 283 | "0": 211.60000610351562, 284 | "1": 26 285 | }, 286 | "flags": {}, 287 | "order": 8, 288 | "mode": 0, 289 | "inputs": [ 290 | { 291 | "name": "conditioning", 292 | "type": "CONDITIONING", 293 | "link": 580 294 | } 295 | ], 296 | "outputs": [ 297 | { 298 | "name": "CONDITIONING", 299 | "type": "CONDITIONING", 300 | "links": [ 301 | 90 302 | ], 303 | "shape": 3, 304 | "slot_index": 0 305 | } 306 | ], 307 | "properties": { 308 | "Node name for S&R": "ConditioningZeroOut" 309 | } 310 | }, 311 | { 312 | "id": 70, 313 | "type": "ConditioningSetTimestepRange", 314 | "pos": [ 315 | -1108, 316 | 549 317 | ], 318 | "size": { 319 | "0": 317.4000244140625, 320 | "1": 82 321 | }, 322 | "flags": {}, 323 | "order": 7, 324 | "mode": 0, 325 | "inputs": [ 326 | { 327 | "name": "conditioning", 328 | "type": "CONDITIONING", 329 | "link": 93, 330 | "slot_index": 0 331 | } 332 | ], 333 | "outputs": [ 334 | { 335 | "name": "CONDITIONING", 336 | "type": "CONDITIONING", 337 | "links": [ 338 | 92 339 | ], 340 | "shape": 3, 341 | "slot_index": 0 342 | } 343 | ], 344 | "properties": { 345 | "Node name for S&R": "ConditioningSetTimestepRange" 346 | }, 347 | "widgets_values": [ 348 | 0, 349 | 0.1 350 | ] 351 | }, 352 | { 353 | "id": 68, 354 | "type": "ConditioningSetTimestepRange", 355 | "pos": [ 356 | -1111, 357 | 391 358 | ], 359 | "size": { 360 | "0": 317.4000244140625, 361 | "1": 82 362 | }, 363 | "flags": {}, 364 | "order": 9, 365 | "mode": 0, 366 | "inputs": [ 367 | { 368 | "name": "conditioning", 369 | "type": "CONDITIONING", 370 | "link": 90 371 | } 372 | ], 373 | "outputs": [ 374 | { 375 | "name": "CONDITIONING", 376 | "type": "CONDITIONING", 377 | "links": [ 378 | 91 379 | ], 380 | "shape": 3, 381 | "slot_index": 0 382 | } 383 | ], 384 | "properties": { 385 | "Node name for S&R": "ConditioningSetTimestepRange" 386 | }, 387 
| "widgets_values": [ 388 | 0.1, 389 | 1 390 | ] 391 | }, 392 | { 393 | "id": 69, 394 | "type": "ConditioningCombine", 395 | "pos": [ 396 | -734, 397 | 450 398 | ], 399 | "size": { 400 | "0": 228.39999389648438, 401 | "1": 46 402 | }, 403 | "flags": {}, 404 | "order": 10, 405 | "mode": 0, 406 | "inputs": [ 407 | { 408 | "name": "conditioning_1", 409 | "type": "CONDITIONING", 410 | "link": 91 411 | }, 412 | { 413 | "name": "conditioning_2", 414 | "type": "CONDITIONING", 415 | "link": 92 416 | } 417 | ], 418 | "outputs": [ 419 | { 420 | "name": "CONDITIONING", 421 | "type": "CONDITIONING", 422 | "links": [ 423 | 592 424 | ], 425 | "shape": 3, 426 | "slot_index": 0 427 | } 428 | ], 429 | "properties": { 430 | "Node name for S&R": "ConditioningCombine" 431 | } 432 | }, 433 | { 434 | "id": 271, 435 | "type": "KSampler", 436 | "pos": [ 437 | -420, 438 | 116 439 | ], 440 | "size": { 441 | "0": 301.5874938964844, 442 | "1": 234 443 | }, 444 | "flags": {}, 445 | "order": 11, 446 | "mode": 0, 447 | "inputs": [ 448 | { 449 | "name": "model", 450 | "type": "MODEL", 451 | "link": 591 452 | }, 453 | { 454 | "name": "positive", 455 | "type": "CONDITIONING", 456 | "link": 595 457 | }, 458 | { 459 | "name": "negative", 460 | "type": "CONDITIONING", 461 | "link": 592 462 | }, 463 | { 464 | "name": "latent_image", 465 | "type": "LATENT", 466 | "link": 593 467 | }, 468 | { 469 | "name": "seed", 470 | "type": "INT", 471 | "link": 597, 472 | "widget": { 473 | "name": "seed" 474 | }, 475 | "slot_index": 4 476 | } 477 | ], 478 | "outputs": [ 479 | { 480 | "name": "LATENT", 481 | "type": "LATENT", 482 | "links": [ 483 | 596 484 | ], 485 | "shape": 3, 486 | "slot_index": 0 487 | } 488 | ], 489 | "properties": { 490 | "Node name for S&R": "KSampler" 491 | }, 492 | "widgets_values": [ 493 | 0, 494 | "fixed", 495 | 28, 496 | 4.5, 497 | "dpmpp_2m", 498 | "sgm_uniform", 499 | 1 500 | ] 501 | }, 502 | { 503 | "id": 231, 504 | "type": "VAEDecode", 505 | "pos": [ 506 | -30, 507 | 28 508 | ], 509 | "size": { 510 | "0": 210, 511 | "1": 46 512 | }, 513 | "flags": {}, 514 | "order": 12, 515 | "mode": 0, 516 | "inputs": [ 517 | { 518 | "name": "samples", 519 | "type": "LATENT", 520 | "link": 596 521 | }, 522 | { 523 | "name": "vae", 524 | "type": "VAE", 525 | "link": 557 526 | } 527 | ], 528 | "outputs": [ 529 | { 530 | "name": "IMAGE", 531 | "type": "IMAGE", 532 | "links": [ 533 | 606 534 | ], 535 | "shape": 3, 536 | "slot_index": 0 537 | } 538 | ], 539 | "properties": { 540 | "Node name for S&R": "VAEDecode" 541 | } 542 | }, 543 | { 544 | "id": 277, 545 | "type": "SaveImageWithMetaData", 546 | "pos": [ 547 | 271, 548 | 29 549 | ], 550 | "size": { 551 | "0": 315, 552 | "1": 438 553 | }, 554 | "flags": {}, 555 | "order": 13, 556 | "mode": 0, 557 | "inputs": [ 558 | { 559 | "name": "images", 560 | "type": "IMAGE", 561 | "link": 606 562 | } 563 | ], 564 | "properties": { 565 | "Node name for S&R": "SaveImageWithMetaData" 566 | }, 567 | "widgets_values": [ 568 | "SD3", 569 | "Farthest", 570 | 0, 571 | "png", 572 | true, 573 | 100, 574 | false, 575 | true 576 | ] 577 | } 578 | ], 579 | "links": [ 580 | [ 581 | 90, 582 | 67, 583 | 0, 584 | 68, 585 | 0, 586 | "CONDITIONING" 587 | ], 588 | [ 589 | 91, 590 | 68, 591 | 0, 592 | 69, 593 | 0, 594 | "CONDITIONING" 595 | ], 596 | [ 597 | 92, 598 | 70, 599 | 0, 600 | 69, 601 | 1, 602 | "CONDITIONING" 603 | ], 604 | [ 605 | 93, 606 | 71, 607 | 0, 608 | 70, 609 | 0, 610 | "CONDITIONING" 611 | ], 612 | [ 613 | 557, 614 | 252, 615 | 2, 616 | 231, 617 | 1, 618 | "VAE" 619 | ], 620 | [ 621 | 565, 
622 | 252, 623 | 0, 624 | 13, 625 | 0, 626 | "MODEL" 627 | ], 628 | [ 629 | 580, 630 | 71, 631 | 0, 632 | 67, 633 | 0, 634 | "CONDITIONING" 635 | ], 636 | [ 637 | 591, 638 | 13, 639 | 0, 640 | 271, 641 | 0, 642 | "MODEL" 643 | ], 644 | [ 645 | 592, 646 | 69, 647 | 0, 648 | 271, 649 | 2, 650 | "CONDITIONING" 651 | ], 652 | [ 653 | 593, 654 | 135, 655 | 0, 656 | 271, 657 | 3, 658 | "LATENT" 659 | ], 660 | [ 661 | 595, 662 | 6, 663 | 0, 664 | 271, 665 | 1, 666 | "CONDITIONING" 667 | ], 668 | [ 669 | 596, 670 | 271, 671 | 0, 672 | 231, 673 | 0, 674 | "LATENT" 675 | ], 676 | [ 677 | 597, 678 | 272, 679 | 0, 680 | 271, 681 | 4, 682 | "INT" 683 | ], 684 | [ 685 | 600, 686 | 252, 687 | 1, 688 | 6, 689 | 0, 690 | "CLIP" 691 | ], 692 | [ 693 | 601, 694 | 252, 695 | 1, 696 | 71, 697 | 0, 698 | "CLIP" 699 | ], 700 | [ 701 | 606, 702 | 231, 703 | 0, 704 | 277, 705 | 0, 706 | "IMAGE" 707 | ] 708 | ], 709 | "groups": [ 710 | { 711 | "title": "Load Models", 712 | "bounding": [ 713 | -2407, 714 | -85, 715 | 521, 716 | 245 717 | ], 718 | "color": "#3f789e", 719 | "font_size": 24, 720 | "locked": false 721 | }, 722 | { 723 | "title": "Input", 724 | "bounding": [ 725 | -2409, 726 | 181, 727 | 972, 728 | 523 729 | ], 730 | "color": "#3f789e", 731 | "font_size": 24, 732 | "locked": false 733 | } 734 | ], 735 | "config": {}, 736 | "extra": { 737 | "ds": { 738 | "scale": 1.3310000000000004, 739 | "offset": [ 740 | 252.86689786068672, 741 | 111.40230534343857 742 | ] 743 | } 744 | }, 745 | "version": 0.4 746 | } 747 | -------------------------------------------------------------------------------- /examples/sd3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/examples/sd3.png -------------------------------------------------------------------------------- /img/save_image_with_metadata.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/img/save_image_with_metadata.png -------------------------------------------------------------------------------- /py/__init__.py: -------------------------------------------------------------------------------- 1 | import functools 2 | 3 | from .hook import pre_execute, pre_get_input_data 4 | import execution 5 | 6 | 7 | # refer. https://stackoverflow.com/a/35758398 8 | def prefix_function(function, prefunction): 9 | @functools.wraps(function) 10 | def run(*args, **kwargs): 11 | prefunction(*args, **kwargs) 12 | return function(*args, **kwargs) 13 | 14 | return run 15 | 16 | 17 | execution.PromptExecutor.execute = prefix_function( 18 | execution.PromptExecutor.execute, pre_execute 19 | ) 20 | 21 | 22 | execution.get_input_data = prefix_function(execution.get_input_data, pre_get_input_data) 23 | -------------------------------------------------------------------------------- /py/capture.py: -------------------------------------------------------------------------------- 1 | import json 2 | import os 3 | 4 | from . 
import hook 5 | from .defs.captures import CAPTURE_FIELD_LIST 6 | from .defs.meta import MetaField 7 | 8 | from nodes import NODE_CLASS_MAPPINGS 9 | from execution import get_input_data 10 | from comfy_execution.graph import DynamicPrompt 11 | 12 | 13 | class Capture: 14 | @classmethod 15 | def get_inputs(cls): 16 | inputs = {} 17 | prompt = hook.current_prompt 18 | extra_data = hook.current_extra_data 19 | outputs = hook.prompt_executer.caches.outputs 20 | 21 | for node_id, obj in prompt.items(): 22 | class_type = obj["class_type"] 23 | obj_class = NODE_CLASS_MAPPINGS[class_type] 24 | node_inputs = prompt[node_id]["inputs"] 25 | input_data = get_input_data( 26 | node_inputs, obj_class, node_id, outputs, DynamicPrompt(prompt), extra_data 27 | ) 28 | for node_class, metas in CAPTURE_FIELD_LIST.items(): 29 | if class_type == node_class: 30 | for meta, field_data in metas.items(): 31 | validate = field_data.get("validate") 32 | if validate is not None and not validate( 33 | node_id, obj, prompt, extra_data, outputs, input_data 34 | ): 35 | continue 36 | 37 | if meta not in inputs: 38 | inputs[meta] = [] 39 | 40 | value = field_data.get("value") 41 | if value is not None: 42 | inputs[meta].append((node_id, value)) 43 | continue 44 | 45 | selector = field_data.get("selector") 46 | if selector is not None: 47 | v = selector( 48 | node_id, obj, prompt, extra_data, outputs, input_data 49 | ) 50 | if isinstance(v, list): 51 | for x in v: 52 | inputs[meta].append((node_id, x)) 53 | elif v is not None: 54 | inputs[meta].append((node_id, v)) 55 | continue 56 | 57 | field_name = field_data["field_name"] 58 | value = input_data[0].get(field_name) 59 | if value is not None: 60 | format = field_data.get("format") 61 | v = value 62 | if isinstance(value, list) and len(value) > 0: 63 | v = value[0] 64 | if format is not None: 65 | v = format(v, input_data) 66 | if isinstance(v, list): 67 | for x in v: 68 | inputs[meta].append((node_id, x)) 69 | else: 70 | inputs[meta].append((node_id, v)) 71 | return inputs 72 | 73 | @classmethod 74 | def gen_pnginfo_dict(cls, inputs_before_sampler_node, inputs_before_this_node, save_civitai_sampler = False): 75 | pnginfo_dict = {} 76 | 77 | def update_pnginfo_dict(inputs, metafield, key): 78 | x = inputs.get(metafield, []) 79 | if len(x) > 0: 80 | pnginfo_dict[key] = x[0][1] 81 | 82 | update_pnginfo_dict( 83 | inputs_before_sampler_node, MetaField.POSITIVE_PROMPT, "Positive prompt" 84 | ) 85 | update_pnginfo_dict( 86 | inputs_before_sampler_node, MetaField.NEGATIVE_PROMPT, "Negative prompt" 87 | ) 88 | 89 | update_pnginfo_dict(inputs_before_sampler_node, MetaField.STEPS, "Steps") 90 | 91 | sampler_names = inputs_before_sampler_node.get(MetaField.SAMPLER_NAME, []) 92 | schedulers = inputs_before_sampler_node.get(MetaField.SCHEDULER, []) 93 | 94 | if (save_civitai_sampler): 95 | pnginfo_dict["Sampler"] = cls.get_sampler_for_civitai(sampler_names, schedulers) 96 | else: 97 | if len(sampler_names) > 0: 98 | pnginfo_dict["Sampler"] = sampler_names[0][1] 99 | 100 | if len(schedulers) > 0: 101 | scheduler = schedulers[0][1] 102 | if scheduler != "normal": 103 | pnginfo_dict["Sampler"] += "_" + scheduler 104 | 105 | update_pnginfo_dict(inputs_before_sampler_node, MetaField.CFG, "CFG scale") 106 | update_pnginfo_dict(inputs_before_sampler_node, MetaField.SEED, "Seed") 107 | 108 | update_pnginfo_dict( 109 | inputs_before_sampler_node, MetaField.CLIP_SKIP, "Clip skip" 110 | ) 111 | 112 | image_widths = inputs_before_sampler_node.get(MetaField.IMAGE_WIDTH, []) 113 | image_heights = 
inputs_before_sampler_node.get(MetaField.IMAGE_HEIGHT, []) 114 | if len(image_widths) > 0 and len(image_heights) > 0: 115 | pnginfo_dict["Size"] = f"{image_widths[0][1]}x{image_heights[0][1]}" 116 | 117 | update_pnginfo_dict(inputs_before_sampler_node, MetaField.MODEL_NAME, "Model") 118 | update_pnginfo_dict( 119 | inputs_before_sampler_node, MetaField.MODEL_HASH, "Model hash" 120 | ) 121 | 122 | update_pnginfo_dict(inputs_before_this_node, MetaField.VAE_NAME, "VAE") 123 | update_pnginfo_dict(inputs_before_this_node, MetaField.VAE_HASH, "VAE hash") 124 | 125 | pnginfo_dict.update(cls.gen_loras(inputs_before_sampler_node)) 126 | pnginfo_dict.update(cls.gen_embeddings(inputs_before_sampler_node)) 127 | 128 | hashes_for_civitai = cls.get_hashes_for_civitai( 129 | inputs_before_sampler_node, inputs_before_this_node 130 | ) 131 | if len(hashes_for_civitai) > 0: 132 | pnginfo_dict["Hashes"] = json.dumps(hashes_for_civitai) 133 | 134 | return pnginfo_dict 135 | 136 | @classmethod 137 | def gen_parameters_str(cls, pnginfo_dict): 138 | result = pnginfo_dict.get("Positive prompt", "") + "\n" 139 | result += "Negative prompt: " + pnginfo_dict.get("Negative prompt", "") + "\n" 140 | 141 | s_list = [] 142 | pnginfo_dict_without_prompt = { 143 | k: v 144 | for k, v in pnginfo_dict.items() 145 | if k not in {"Positive prompt", "Negative prompt"} 146 | } 147 | for k, v in pnginfo_dict_without_prompt.items(): 148 | s = str(v).strip().replace("\n", " ") 149 | s_list.append(f"{k}: {s}") 150 | 151 | return result + ", ".join(s_list) 152 | 153 | @classmethod 154 | def get_hashes_for_civitai( 155 | cls, inputs_before_sampler_node, inputs_before_this_node 156 | ): 157 | resource_hashes = {} 158 | model_hashes = inputs_before_sampler_node.get(MetaField.MODEL_HASH, []) 159 | if len(model_hashes) > 0: 160 | resource_hashes["model"] = model_hashes[0][1] 161 | 162 | vae_hashes = inputs_before_this_node.get(MetaField.VAE_HASH, []) 163 | if len(vae_hashes) > 0: 164 | resource_hashes["vae"] = vae_hashes[0][1] 165 | 166 | lora_model_names = inputs_before_sampler_node.get(MetaField.LORA_MODEL_NAME, []) 167 | lora_model_hashes = inputs_before_sampler_node.get( 168 | MetaField.LORA_MODEL_HASH, [] 169 | ) 170 | for lora_model_name, lora_model_hash in zip( 171 | lora_model_names, lora_model_hashes 172 | ): 173 | lora_model_name = os.path.splitext(os.path.basename(lora_model_name[1]))[0] 174 | resource_hashes[f"lora:{lora_model_name}"] = lora_model_hash[1] 175 | 176 | embedding_names = inputs_before_sampler_node.get(MetaField.EMBEDDING_NAME, []) 177 | embedding_hashes = inputs_before_sampler_node.get(MetaField.EMBEDDING_HASH, []) 178 | for embedding_name, embedding_hash in zip(embedding_names, embedding_hashes): 179 | embedding_name = os.path.splitext(os.path.basename(embedding_name[1]))[0] 180 | resource_hashes[f"embed:{embedding_name}"] = embedding_hash[1] 181 | 182 | return resource_hashes 183 | 184 | @classmethod 185 | def gen_loras(cls, inputs): 186 | pnginfo_dict = {} 187 | 188 | model_names = inputs.get(MetaField.LORA_MODEL_NAME, []) 189 | model_hashes = inputs.get(MetaField.LORA_MODEL_HASH, []) 190 | strength_models = inputs.get(MetaField.LORA_STRENGTH_MODEL, []) 191 | strength_clips = inputs.get(MetaField.LORA_STRENGTH_CLIP, []) 192 | 193 | index = 0 194 | for model_name, model_hashe, strength_model, strength_clip in zip( 195 | model_names, model_hashes, strength_models, strength_clips 196 | ): 197 | field_prefix = f"Lora_{index}" 198 | pnginfo_dict[f"{field_prefix} Model name"] = os.path.basename(model_name[1]) 199 | 
pnginfo_dict[f"{field_prefix} Model hash"] = model_hashe[1] 200 | pnginfo_dict[f"{field_prefix} Strength model"] = strength_model[1] 201 | pnginfo_dict[f"{field_prefix} Strength clip"] = strength_clip[1] 202 | index += 1 203 | 204 | return pnginfo_dict 205 | 206 | @classmethod 207 | def gen_embeddings(cls, inputs): 208 | pnginfo_dict = {} 209 | 210 | embedding_names = inputs.get(MetaField.EMBEDDING_NAME, []) 211 | embedding_hashes = inputs.get(MetaField.EMBEDDING_HASH, []) 212 | 213 | index = 0 214 | for embedding_name, embedding_hashe in zip(embedding_names, embedding_hashes): 215 | field_prefix = f"Embedding_{index}" 216 | pnginfo_dict[f"{field_prefix} name"] = os.path.basename(embedding_name[1]) 217 | pnginfo_dict[f"{field_prefix} hash"] = embedding_hashe[1] 218 | index += 1 219 | 220 | return pnginfo_dict 221 | 222 | @classmethod 223 | def get_sampler_for_civitai(cls, sampler_names, schedulers): 224 | """ 225 | Get the pretty sampler name for Civitai in the form of ` `. 226 | - `dpmpp_2m` and `karras` will return `DPM++ 2M Karras` 227 | 228 | If there is a matching sampler name but no matching scheduler name, return only the matching sampler name. 229 | - `dpmpp_2m` and `exponential` will return only `DPM++ 2M` 230 | 231 | if there is no matching sampler and scheduler name, return `_` 232 | - `ipndm` and `normal` will return `ipndm` 233 | - `ipndm` and `karras` will return `ipndm_karras` 234 | 235 | Reference: https://github.com/civitai/civitai/blob/main/src/server/common/constants.ts 236 | 237 | Last update: https://github.com/civitai/civitai/blob/a2e6d267eefe6f44811a640c570739bcb078e4a5/src/server/common/constants.ts#L138-L165 238 | """ 239 | 240 | def sampler_with_karras_exponential(sampler, scheduler): 241 | match scheduler: 242 | case "karras": 243 | sampler += " Karras" 244 | case "exponential": 245 | sampler += " Exponential" 246 | return sampler 247 | 248 | def sampler_with_karras(sampler, scheduler): 249 | if scheduler == "karras": 250 | return sampler + " Karras" 251 | return sampler 252 | 253 | if len(sampler_names) > 0: 254 | sampler = sampler_names[0][1] 255 | if len(schedulers) > 0: 256 | scheduler = schedulers[0][1] 257 | 258 | match sampler: 259 | case "euler" | "euler_cfg_pp": 260 | return "Euler" 261 | case "euler_ancestral" | "euler_ancestral_cfg_pp": 262 | return "Euler a" 263 | case "heun" | "heunpp2": 264 | return "Huen" 265 | case "dpm_2": 266 | return sampler_with_karras("DPM2", scheduler) 267 | case "dpm_2_ancestral": 268 | return sampler_with_karras("DPM2 a", scheduler) 269 | case "lms": 270 | return sampler_with_karras("LMS", scheduler) 271 | case "dpm_fast": 272 | return "DPM fast" 273 | case "dpm_adaptive": 274 | return "DPM adaptive" 275 | case "dpmpp_2s_ancestral": 276 | return sampler_with_karras("DPM++ 2S a", scheduler) 277 | case "dpmpp_sde" | "dpmpp_sde_gpu": 278 | return sampler_with_karras("DPM++ SDE", scheduler) 279 | case "dpmpp_2m": 280 | return sampler_with_karras("DPM++ 2M", scheduler) 281 | case "dpmpp_2m_sde" | "dpmpp_2m_sde_gpu": 282 | return sampler_with_karras("DPM++ 2M SDE", scheduler) 283 | case "dpmpp_3m_sde" | "dpmpp_3m_sde_gpu": 284 | return sampler_with_karras_exponential("DPM++ 3M SDE", scheduler) 285 | case "lcm": 286 | return "LCM" 287 | case "ddim": 288 | return "DDIM" 289 | case "uni_pc" | "uni_pc_bh2": 290 | return "UniPC" 291 | 292 | if scheduler == "normal": 293 | return sampler 294 | return sampler + "_" + scheduler -------------------------------------------------------------------------------- /py/defs/__init__.py: 
-------------------------------------------------------------------------------- 1 | import glob 2 | import importlib 3 | import os 4 | 5 | from .captures import CAPTURE_FIELD_LIST 6 | from .samplers import SAMPLERS 7 | 8 | # load CAPTURE_FIELD_LIST and SAMPLERS in ext folder 9 | dir_name = os.path.dirname(os.path.abspath(__file__)) 10 | for module_path in glob.glob(dir_name + "/ext/*.py"): 11 | module_name = os.path.basename(module_path) 12 | module_name = os.path.splitext(module_name)[0] 13 | package_name = ( 14 | f"custom_nodes.ComfyUI-SaveImageWithMetaData.py.defs.ext.{module_name}" 15 | ) 16 | module = importlib.import_module(package_name) 17 | CAPTURE_FIELD_LIST.update(getattr(module, "CAPTURE_FIELD_LIST", {})) 18 | SAMPLERS.update(getattr(module, "SAMPLERS", {})) 19 | -------------------------------------------------------------------------------- /py/defs/captures.py: -------------------------------------------------------------------------------- 1 | from .meta import MetaField 2 | from .validators import is_positive_prompt, is_negative_prompt 3 | from .formatters import ( 4 | calc_model_hash, 5 | calc_vae_hash, 6 | calc_lora_hash, 7 | calc_unet_hash, 8 | convert_skip_clip, 9 | get_scaled_width, 10 | get_scaled_height, 11 | extract_embedding_names, 12 | extract_embedding_hashes, 13 | ) 14 | 15 | 16 | CAPTURE_FIELD_LIST = { 17 | "CheckpointLoaderSimple": { 18 | MetaField.MODEL_NAME: {"field_name": "ckpt_name"}, 19 | MetaField.MODEL_HASH: {"field_name": "ckpt_name", "format": calc_model_hash}, 20 | }, 21 | "CLIPSetLastLayer": { 22 | MetaField.CLIP_SKIP: { 23 | "field_name": "stop_at_clip_layer", 24 | "format": convert_skip_clip, 25 | }, 26 | }, 27 | "VAELoader": { 28 | MetaField.VAE_NAME: {"field_name": "vae_name"}, 29 | MetaField.VAE_HASH: {"field_name": "vae_name", "format": calc_vae_hash}, 30 | }, 31 | "EmptyLatentImage": { 32 | MetaField.IMAGE_WIDTH: {"field_name": "width"}, 33 | MetaField.IMAGE_HEIGHT: {"field_name": "height"}, 34 | }, 35 | "CLIPTextEncode": { 36 | MetaField.POSITIVE_PROMPT: { 37 | "field_name": "text", 38 | "validate": is_positive_prompt, 39 | }, 40 | MetaField.NEGATIVE_PROMPT: { 41 | "field_name": "text", 42 | "validate": is_negative_prompt, 43 | }, 44 | MetaField.EMBEDDING_NAME: { 45 | "field_name": "text", 46 | "format": extract_embedding_names, 47 | }, 48 | MetaField.EMBEDDING_HASH: { 49 | "field_name": "text", 50 | "format": extract_embedding_hashes, 51 | }, 52 | }, 53 | "KSampler": { 54 | MetaField.SEED: {"field_name": "seed"}, 55 | MetaField.STEPS: {"field_name": "steps"}, 56 | MetaField.CFG: {"field_name": "cfg"}, 57 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 58 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 59 | }, 60 | "KSamplerAdvanced": { 61 | MetaField.SEED: {"field_name": "noise_seed"}, 62 | MetaField.STEPS: {"field_name": "steps"}, 63 | MetaField.CFG: {"field_name": "cfg"}, 64 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 65 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 66 | }, 67 | "LatentUpscale": { 68 | MetaField.IMAGE_WIDTH: {"field_name": "width"}, 69 | MetaField.IMAGE_HEIGHT: {"field_name": "height"}, 70 | }, 71 | "LatentUpscaleBy": { 72 | MetaField.IMAGE_WIDTH: {"field_name": "scale_by", "format": get_scaled_width}, 73 | MetaField.IMAGE_HEIGHT: { 74 | "field_name": "scale_by", 75 | "format": get_scaled_height, 76 | }, 77 | }, 78 | "LoraLoader": { 79 | MetaField.LORA_MODEL_NAME: {"field_name": "lora_name"}, 80 | MetaField.LORA_MODEL_HASH: { 81 | "field_name": "lora_name", 82 | "format": calc_lora_hash, 
83 | }, 84 | MetaField.LORA_STRENGTH_MODEL: {"field_name": "strength_model"}, 85 | MetaField.LORA_STRENGTH_CLIP: {"field_name": "strength_clip"}, 86 | }, 87 | "LoraLoaderModelOnly": { 88 | MetaField.LORA_MODEL_NAME: {"field_name": "lora_name"}, 89 | MetaField.LORA_MODEL_HASH: { 90 | "field_name": "lora_name", 91 | "format": calc_lora_hash, 92 | }, 93 | MetaField.LORA_STRENGTH_MODEL: {"field_name": "strength_model"}, 94 | MetaField.LORA_STRENGTH_CLIP: {"value": 0}, 95 | }, 96 | # Flux - https://comfyanonymous.github.io/ComfyUI_examples/flux/ 97 | "UNETLoader": { 98 | MetaField.MODEL_NAME: {"field_name": "unet_name"}, 99 | MetaField.MODEL_HASH: {"field_name": "unet_name", "format": calc_unet_hash}, 100 | }, 101 | "RandomNoise": { 102 | MetaField.SEED: {"field_name": "noise_seed"}, 103 | }, 104 | "BasicScheduler": { 105 | MetaField.STEPS: {"field_name": "steps"}, 106 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 107 | }, 108 | "KSamplerSelect": { 109 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 110 | }, 111 | } 112 | -------------------------------------------------------------------------------- /py/defs/combo.py: -------------------------------------------------------------------------------- 1 | SAMPLER_SELECTION_METHOD = ["Farthest", "Nearest", "By node ID"] 2 | -------------------------------------------------------------------------------- /py/defs/ext/ComfyUI-FluxSettingsNode.py: -------------------------------------------------------------------------------- 1 | #https://github.com/Light-x02/ComfyUI-FluxSettingsNode 2 | from ..meta import MetaField 3 | from ..formatters import calc_model_hash, calc_lora_hash, convert_skip_clip 4 | 5 | 6 | SAMPLERS = { 7 | "FluxSettingsNode": { 8 | "positive": "conditioning.positive", 9 | "negative": "conditioning.negative", 10 | }, 11 | } 12 | 13 | 14 | CAPTURE_FIELD_LIST = { 15 | "FluxSettingsNode": { 16 | MetaField.MODEL_NAME: {"field_name": "model"}, 17 | MetaField.CFG: {"field_name": "guidance"}, 18 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 19 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 20 | MetaField.STEPS: {"field_name": "steps"}, 21 | MetaField.SEED: {"field_name": "noise_seed"}, 22 | MetaField.POSITIVE_PROMPT: {"field_name": "conditioning.positive"}, 23 | MetaField.NEGATIVE_PROMPT: {"field_name": "conditioning.negative"}, 24 | }, 25 | } -------------------------------------------------------------------------------- /py/defs/ext/easyuse_nodes.py: -------------------------------------------------------------------------------- 1 | # https://github.com/yolain/ComfyUI-Easy-Use 2 | from ..meta import MetaField 3 | from ..formatters import calc_model_hash, calc_lora_hash, convert_skip_clip 4 | import re 5 | 6 | def get_lora_model_name_stack(node_id, obj, prompt, extra_data, outputs, input_data): 7 | toggled_on = input_data[0]["toggle"][0] 8 | 9 | if toggled_on: 10 | return get_lora_data_stack(input_data, "lora_\d_name") 11 | else: 12 | return [] 13 | 14 | 15 | def get_lora_model_hash_stack(node_id, obj, prompt, extra_data, outputs, input_data): 16 | return [ 17 | calc_lora_hash(model_name, input_data) 18 | for model_name in get_lora_data_stack(input_data, "lora_\d_name") 19 | ] 20 | 21 | 22 | def get_lora_strength_model_stack(node_id, obj, prompt, extra_data, outputs, input_data): 23 | if input_data[0]["mode"][0] == "advanced": 24 | return get_lora_data_stack(input_data, "lora_\d_model_strength") 25 | return get_lora_data_stack(input_data, "lora_\d_strength") 26 | 27 | 28 | def 
get_lora_strength_clip_stack(node_id, obj, prompt, extra_data, outputs, input_data): 29 | if input_data[0]["mode"][0] == "advanced": 30 | return get_lora_data_stack(input_data, "lora_\d_clip_strength") 31 | return get_lora_data_stack(input_data, "lora_\d_strength") 32 | 33 | 34 | def get_lora_data_stack(input_data, attribute): 35 | lora_count = input_data[0]["num_loras"][0] 36 | return [ 37 | v[0] 38 | for k, v in input_data[0].items() 39 | if re.search(attribute, k) != None and v[0] != "None" 40 | ][:lora_count] 41 | 42 | 43 | def get_lora_model_hash(node_id, obj, prompt, extra_data, outputs, input_data): 44 | if input_data[0]["lora_name"][0] != "None": 45 | return calc_lora_hash(input_data[0]["lora_name"][0], input_data) 46 | else: 47 | return "" 48 | 49 | 50 | SAMPLERS = { 51 | "easy fullkSampler": { 52 | "positive": "positive", 53 | "negative": "negative", 54 | }, 55 | "easy preSampling": { 56 | 57 | }, 58 | "easy preSamplingAdvanced": { 59 | 60 | }, 61 | "easy preSamplingCascade": { 62 | 63 | }, 64 | "easy preSamplingCustom": { 65 | 66 | }, 67 | "easy preSamplingDynamicCFG": { 68 | 69 | }, 70 | "easy preSamplingLayerDiffusion": { 71 | 72 | }, 73 | "easy preSamplingNoiseIn": { 74 | 75 | }, 76 | "easy preSamplingSdTurbo": { 77 | 78 | } 79 | } 80 | 81 | 82 | 83 | 84 | 85 | CAPTURE_FIELD_LIST = { 86 | "easy fullLoader": { 87 | MetaField.MODEL_NAME: {"field_name": "ckpt_name"}, 88 | MetaField.MODEL_HASH: {"field_name": "ckpt_name", "format": calc_model_hash}, 89 | MetaField.CLIP_SKIP: {"field_name": "clip_skip", "format": convert_skip_clip}, 90 | MetaField.POSITIVE_PROMPT: {"field_name": "positive"}, 91 | MetaField.NEGATIVE_PROMPT: {"field_name": "negative"}, 92 | MetaField.IMAGE_WIDTH: {"field_name": "empty_latent_width"}, 93 | MetaField.IMAGE_HEIGHT: {"field_name": "empty_latent_height"}, 94 | MetaField.LORA_MODEL_NAME: {"field_name": "lora_name"}, 95 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash}, 96 | MetaField.LORA_STRENGTH_MODEL: {"field_name": "lora_model_strength"}, 97 | MetaField.LORA_STRENGTH_CLIP: {"field_name": "lora_clip_strength"}, 98 | }, 99 | "easy comfyLoader": { 100 | MetaField.MODEL_NAME: {"field_name": "ckpt_name"}, 101 | MetaField.MODEL_HASH: {"field_name": "ckpt_name", "format": calc_model_hash}, 102 | MetaField.CLIP_SKIP: {"field_name": "clip_skip", "format": convert_skip_clip}, 103 | MetaField.POSITIVE_PROMPT: {"field_name": "positive"}, 104 | MetaField.NEGATIVE_PROMPT: {"field_name": "negative"}, 105 | MetaField.IMAGE_WIDTH: {"field_name": "empty_latent_width"}, 106 | MetaField.IMAGE_HEIGHT: {"field_name": "empty_latent_height"}, 107 | MetaField.LORA_MODEL_NAME: {"field_name": "lora_name"}, 108 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash}, 109 | MetaField.LORA_STRENGTH_MODEL: {"field_name": "lora_model_strength"}, 110 | MetaField.LORA_STRENGTH_CLIP: {"field_name": "lora_clip_strength"}, 111 | }, 112 | "easy fullkSampler": { 113 | MetaField.SEED: {"field_name": "seed"}, 114 | MetaField.STEPS: {"field_name": "steps"}, 115 | MetaField.CFG: {"field_name": "cfg"}, 116 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 117 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 118 | }, 119 | "easy preSampling": { 120 | MetaField.SEED: {"field_name": "seed"}, 121 | MetaField.STEPS: {"field_name": "steps"}, 122 | MetaField.CFG: {"field_name": "cfg"}, 123 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 124 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 125 | }, 126 | "easy preSamplingAdvanced": { 127 | MetaField.SEED: 
{"field_name": "seed"}, 128 | MetaField.STEPS: {"field_name": "steps"}, 129 | MetaField.CFG: {"field_name": "cfg"}, 130 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 131 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 132 | }, 133 | "easy preSamplingCascade": { 134 | MetaField.SEED: {"field_name": "seed"}, 135 | MetaField.STEPS: {"field_name": "steps"}, 136 | MetaField.CFG: {"field_name": "cfg"}, 137 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 138 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 139 | }, 140 | "easy preSamplingCustom": { 141 | MetaField.SEED: {"field_name": "seed"}, 142 | MetaField.STEPS: {"field_name": "steps"}, 143 | MetaField.CFG: {"field_name": "cfg"}, 144 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 145 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 146 | }, 147 | "easy preSamplingDynamicCFG": { 148 | MetaField.SEED: {"field_name": "seed"}, 149 | MetaField.STEPS: {"field_name": "steps"}, 150 | MetaField.CFG: {"field_name": "cfg"}, 151 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 152 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 153 | }, 154 | "easy preSamplingLayerDiffusion": { 155 | MetaField.SEED: {"field_name": "seed"}, 156 | MetaField.STEPS: {"field_name": "steps"}, 157 | MetaField.CFG: {"field_name": "cfg"}, 158 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 159 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 160 | }, 161 | "easy preSamplingNoiseIn": { 162 | MetaField.SEED: {"field_name": "seed"}, 163 | MetaField.STEPS: {"field_name": "steps"}, 164 | MetaField.CFG: {"field_name": "cfg"}, 165 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 166 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 167 | }, 168 | "easy preSamplingSdTurbo": { 169 | MetaField.SEED: {"field_name": "seed"}, 170 | MetaField.STEPS: {"field_name": "steps"}, 171 | MetaField.CFG: {"field_name": "cfg"}, 172 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 173 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 174 | }, 175 | "easy loraStack": { 176 | MetaField.LORA_MODEL_NAME: {"selector": get_lora_model_name_stack}, 177 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash_stack}, 178 | MetaField.LORA_STRENGTH_MODEL: {"selector": get_lora_strength_model_stack}, 179 | MetaField.LORA_STRENGTH_CLIP: {"selector": get_lora_strength_clip_stack}, 180 | }, 181 | } 182 | -------------------------------------------------------------------------------- /py/defs/ext/efficiency_nodes.py: -------------------------------------------------------------------------------- 1 | # https://github.com/jags111/efficiency-nodes-comfyui 2 | from ..meta import MetaField 3 | from ..formatters import calc_model_hash, calc_lora_hash, convert_skip_clip 4 | 5 | 6 | def get_lora_model_name_stack(node_id, obj, prompt, extra_data, outputs, input_data): 7 | return get_lora_data_stack(input_data, "lora_name") 8 | 9 | 10 | def get_lora_model_hash_stack(node_id, obj, prompt, extra_data, outputs, input_data): 11 | return [ 12 | calc_lora_hash(model_name, input_data) 13 | for model_name in get_lora_data_stack(input_data, "lora_name") 14 | ] 15 | 16 | 17 | def get_lora_strength_model_stack( 18 | node_id, obj, prompt, extra_data, outputs, input_data 19 | ): 20 | if input_data[0]["input_mode"][0] == "advanced": 21 | return get_lora_data_stack(input_data, "model_str") 22 | return get_lora_data_stack(input_data, "lora_wt") 23 | 24 | 25 | def get_lora_strength_clip_stack(node_id, obj, prompt, extra_data, outputs, input_data): 26 
| if input_data[0]["input_mode"][0] == "advanced": 27 | return get_lora_data_stack(input_data, "clip_str") 28 | return get_lora_data_stack(input_data, "lora_wt") 29 | 30 | 31 | def get_lora_data_stack(input_data, attribute): 32 | lora_count = input_data[0]["lora_count"][0] 33 | return [ 34 | v[0] 35 | for k, v in input_data[0].items() 36 | if k.startswith(attribute) and v[0] != "None" 37 | ][:lora_count] 38 | 39 | 40 | SAMPLERS = { 41 | "KSampler (Efficient)": { 42 | "positive": "positive", 43 | "negative": "negative", 44 | }, 45 | "KSampler Adv. (Efficient)": { 46 | "positive": "positive", 47 | "negative": "negative", 48 | }, 49 | "KSampler SDXL (Eff.)": { 50 | "positive": "positive", 51 | "negative": "negative", 52 | }, 53 | } 54 | 55 | CAPTURE_FIELD_LIST = { 56 | "Efficient Loader": { 57 | MetaField.MODEL_NAME: {"field_name": "ckpt_name"}, 58 | MetaField.MODEL_HASH: {"field_name": "ckpt_name", "format": calc_model_hash}, 59 | MetaField.CLIP_SKIP: {"field_name": "clip_skip", "format": convert_skip_clip}, 60 | MetaField.POSITIVE_PROMPT: {"field_name": "positive"}, 61 | MetaField.NEGATIVE_PROMPT: {"field_name": "negative"}, 62 | MetaField.IMAGE_WIDTH: {"field_name": "empty_latent_width"}, 63 | MetaField.IMAGE_HEIGHT: {"field_name": "empty_latent_height"}, 64 | }, 65 | "Eff. Loader SDXL": { 66 | MetaField.MODEL_NAME: {"field_name": "base_ckpt_name"}, 67 | MetaField.MODEL_HASH: { 68 | "field_name": "base_ckpt_name", 69 | "format": calc_model_hash, 70 | }, 71 | MetaField.CLIP_SKIP: { 72 | "field_name": "base_clip_skip", 73 | "format": convert_skip_clip, 74 | }, 75 | MetaField.POSITIVE_PROMPT: {"field_name": "positive"}, 76 | MetaField.NEGATIVE_PROMPT: {"field_name": "negative"}, 77 | MetaField.IMAGE_WIDTH: {"field_name": "empty_latent_width"}, 78 | MetaField.IMAGE_HEIGHT: {"field_name": "empty_latent_height"}, 79 | }, 80 | "KSampler (Efficient)": { 81 | MetaField.SEED: {"field_name": "seed"}, 82 | MetaField.STEPS: {"field_name": "steps"}, 83 | MetaField.CFG: {"field_name": "cfg"}, 84 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 85 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 86 | }, 87 | "KSampler Adv. 
(Efficient)": { 88 | MetaField.SEED: {"field_name": "noise_seed"}, 89 | MetaField.STEPS: {"field_name": "steps"}, 90 | MetaField.CFG: {"field_name": "cfg"}, 91 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 92 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 93 | }, 94 | "KSampler SDXL (Eff.)": { 95 | MetaField.SEED: {"field_name": "noise_seed"}, 96 | MetaField.STEPS: {"field_name": "steps"}, 97 | MetaField.CFG: {"field_name": "cfg"}, 98 | MetaField.SAMPLER_NAME: {"field_name": "sampler_name"}, 99 | MetaField.SCHEDULER: {"field_name": "scheduler"}, 100 | }, 101 | "LoRA Stacker": { 102 | MetaField.LORA_MODEL_NAME: {"selector": get_lora_model_name_stack}, 103 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash_stack}, 104 | MetaField.LORA_STRENGTH_MODEL: {"selector": get_lora_strength_model_stack}, 105 | MetaField.LORA_STRENGTH_CLIP: {"selector": get_lora_strength_clip_stack}, 106 | }, 107 | } 108 | -------------------------------------------------------------------------------- /py/defs/ext/rgthree.py: -------------------------------------------------------------------------------- 1 | # https://github.com/rgthree/rgthree-comfy 2 | from ..meta import MetaField 3 | from ..formatters import calc_lora_hash 4 | 5 | 6 | def get_lora_model_name(node_id, obj, prompt, extra_data, outputs, input_data): 7 | return get_lora_data(input_data, "lora") 8 | 9 | 10 | def get_lora_model_hash(node_id, obj, prompt, extra_data, outputs, input_data): 11 | return [ 12 | calc_lora_hash(model_name, input_data) 13 | for model_name in get_lora_data(input_data, "lora") 14 | ] 15 | 16 | 17 | def get_lora_strength(node_id, obj, prompt, extra_data, outputs, input_data): 18 | return get_lora_data(input_data, "strength") 19 | 20 | 21 | def get_lora_data(input_data, attribute): 22 | return [ 23 | v[0][attribute] 24 | for k, v in input_data[0].items() 25 | if k.startswith("lora_") and v[0]["on"] 26 | ] 27 | 28 | 29 | def get_lora_model_name_stack(node_id, obj, prompt, extra_data, outputs, input_data): 30 | return get_lora_data_stack(input_data, "lora") 31 | 32 | 33 | def get_lora_model_hash_stack(node_id, obj, prompt, extra_data, outputs, input_data): 34 | return [ 35 | calc_lora_hash(model_name, input_data) 36 | for model_name in get_lora_data_stack(input_data, "lora") 37 | ] 38 | 39 | 40 | def get_lora_strength_stack(node_id, obj, prompt, extra_data, outputs, input_data): 41 | return get_lora_data_stack(input_data, "strength") 42 | 43 | 44 | def get_lora_data_stack(input_data, attribute): 45 | return [ 46 | v[0] 47 | for k, v in input_data[0].items() 48 | if k.startswith(attribute + "_") and v[0] != "None" 49 | ] 50 | 51 | 52 | CAPTURE_FIELD_LIST = { 53 | "Power Lora Loader (rgthree)": { 54 | MetaField.LORA_MODEL_NAME: {"selector": get_lora_model_name}, 55 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash}, 56 | MetaField.LORA_STRENGTH_MODEL: {"selector": get_lora_strength}, 57 | MetaField.LORA_STRENGTH_CLIP: {"selector": get_lora_strength}, 58 | }, 59 | "Lora Loader Stack (rgthree)": { 60 | MetaField.LORA_MODEL_NAME: {"selector": get_lora_model_name_stack}, 61 | MetaField.LORA_MODEL_HASH: {"selector": get_lora_model_hash_stack}, 62 | MetaField.LORA_STRENGTH_MODEL: {"selector": get_lora_strength_stack}, 63 | MetaField.LORA_STRENGTH_CLIP: {"selector": get_lora_strength_stack}, 64 | }, 65 | } 66 | -------------------------------------------------------------------------------- /py/defs/ext/size_from_presets.py: 
-------------------------------------------------------------------------------- 1 | # https://github.com/nkchocoai/ComfyUI-SizeFromPresets/ 2 | from ..meta import MetaField 3 | 4 | 5 | def get_width(preset, input_data): 6 | return preset.split("x")[0].strip() 7 | 8 | 9 | def get_height(preset, input_data): 10 | return preset.split("x")[1].strip() 11 | 12 | 13 | CAPTURE_FIELD_LIST = { 14 | "EmptyLatentImageFromPresetsSD15": { 15 | MetaField.IMAGE_WIDTH: {"field_name": "preset", "format": get_width}, 16 | MetaField.IMAGE_HEIGHT: {"field_name": "preset", "format": get_height}, 17 | }, 18 | "EmptyLatentImageFromPresetsSDXL": { 19 | MetaField.IMAGE_WIDTH: {"field_name": "preset", "format": get_width}, 20 | MetaField.IMAGE_HEIGHT: {"field_name": "preset", "format": get_height}, 21 | }, 22 | # TODO RandomEmptyLatentImageFromPresetsSD.. 23 | } 24 | -------------------------------------------------------------------------------- /py/defs/formatters.py: -------------------------------------------------------------------------------- 1 | import os 2 | 3 | import folder_paths 4 | 5 | from ..utils.hash import calc_hash 6 | from ..utils.embedding import get_embedding_file_path 7 | 8 | from comfy.sd1_clip import escape_important, token_weights, unescape_important 9 | from comfy.sd1_clip import SD1Tokenizer 10 | from comfy.text_encoders.sd2_clip import SD2Tokenizer 11 | from comfy.text_encoders.sd3_clip import SD3Tokenizer 12 | from comfy.text_encoders.flux import FluxTokenizer 13 | from comfy.sdxl_clip import SDXLTokenizer 14 | 15 | cache_model_hash = {} 16 | 17 | 18 | def calc_model_hash(model_name, input_data): 19 | filename = folder_paths.get_full_path("checkpoints", model_name) 20 | return calc_hash(filename) 21 | 22 | 23 | def calc_vae_hash(model_name, input_data): 24 | filename = folder_paths.get_full_path("vae", model_name) 25 | return calc_hash(filename) 26 | 27 | 28 | def calc_lora_hash(model_name, input_data): 29 | filename = folder_paths.get_full_path("loras", model_name) 30 | return calc_hash(filename) 31 | 32 | 33 | def calc_unet_hash(model_name, input_data): 34 | filename = folder_paths.get_full_path("unet", model_name) 35 | return calc_hash(filename) 36 | 37 | 38 | def convert_skip_clip(stop_at_clip_layer, input_data): 39 | return stop_at_clip_layer * -1 40 | 41 | 42 | def get_scaled_width(scaled_by, input_data): 43 | samples = input_data[0]["samples"][0]["samples"] 44 | return round(samples.shape[3] * scaled_by * 8) 45 | 46 | 47 | def get_scaled_height(scaled_by, input_data): 48 | samples = input_data[0]["samples"][0]["samples"] 49 | return round(samples.shape[2] * scaled_by * 8) 50 | 51 | 52 | def extract_embedding_names(text, input_data): 53 | embedding_names, _ = _extract_embedding_names(text, input_data) 54 | 55 | return [os.path.basename(embedding_name) for embedding_name in embedding_names] 56 | 57 | 58 | def extract_embedding_hashes(text, input_data): 59 | embedding_names, clip = _extract_embedding_names(text, input_data) 60 | embedding_hashes = [] 61 | for embedding_name in embedding_names: 62 | embedding_file_path = get_embedding_file_path(embedding_name, clip) 63 | embedding_hashes.append(calc_hash(embedding_file_path)) 64 | 65 | return embedding_hashes 66 | 67 | 68 | def _extract_embedding_names(text, input_data): 69 | embedding_identifier = "embedding:" 70 | clip_ = input_data[0]["clip"][0] 71 | clip = None 72 | if clip_ is not None: 73 | tokenizer = clip_.tokenizer 74 | if isinstance(tokenizer, SD1Tokenizer): 75 | clip = tokenizer.clip_l 76 | elif isinstance(tokenizer, 
SD2Tokenizer): 77 | clip = tokenizer.clip_h 78 | elif isinstance(tokenizer, SDXLTokenizer): 79 | clip = tokenizer.clip_l 80 | elif isinstance(tokenizer, SD3Tokenizer): 81 | clip = tokenizer.clip_l 82 | elif isinstance(tokenizer, FluxTokenizer): 83 | clip = tokenizer.clip_l 84 | if clip is not None and hasattr(clip, "embedding_identifier"): 85 | embedding_identifier = clip.embedding_identifier 86 | if not isinstance(text, str): 87 | text = "".join(str(item) if item is not None else "" for item in text) 88 | text = escape_important(text) 89 | parsed_weights = token_weights(text, 1.0) 90 | 91 | # tokenize words 92 | embedding_names = [] 93 | for weighted_segment, weight in parsed_weights: 94 | to_tokenize = unescape_important(weighted_segment).replace("\n", " ").split(" ") 95 | to_tokenize = [x for x in to_tokenize if x != ""] 96 | for word in to_tokenize: 97 | # find an embedding, deal with the embedding 98 | if ( 99 | word.startswith(embedding_identifier) 100 | and clip.embedding_directory is not None 101 | ): 102 | embedding_name = word[len(embedding_identifier) :].strip("\n") 103 | embedding_names.append(embedding_name) 104 | 105 | return embedding_names, clip 106 | -------------------------------------------------------------------------------- /py/defs/meta.py: -------------------------------------------------------------------------------- 1 | from enum import IntEnum 2 | 3 | 4 | class MetaField(IntEnum): 5 | MODEL_NAME = 0 6 | MODEL_HASH = 1 7 | VAE_NAME = 2 8 | VAE_HASH = 3 9 | POSITIVE_PROMPT = 10 10 | NEGATIVE_PROMPT = 11 11 | CLIP_SKIP = 12 12 | SEED = 20 13 | STEPS = 21 14 | CFG = 22 15 | SAMPLER_NAME = 23 16 | SCHEDULER = 24 17 | IMAGE_WIDTH = 30 18 | IMAGE_HEIGHT = 31 19 | EMBEDDING_NAME = 40 20 | EMBEDDING_HASH = 41 21 | LORA_MODEL_NAME = 50 22 | LORA_MODEL_HASH = 51 23 | LORA_STRENGTH_MODEL = 52 24 | LORA_STRENGTH_CLIP = 53 25 | -------------------------------------------------------------------------------- /py/defs/samplers.py: -------------------------------------------------------------------------------- 1 | SAMPLERS = { 2 | "KSampler": { 3 | "positive": "positive", 4 | "negative": "negative", 5 | }, 6 | "KSamplerAdvanced": { 7 | "positive": "positive", 8 | "negative": "negative", 9 | }, 10 | # Flux - https://comfyanonymous.github.io/ComfyUI_examples/flux/ 11 | "SamplerCustomAdvanced": { 12 | "positive": "guider", 13 | }, 14 | } 15 | -------------------------------------------------------------------------------- /py/defs/validators.py: -------------------------------------------------------------------------------- 1 | from collections import deque 2 | 3 | from .samplers import SAMPLERS 4 | 5 | 6 | def is_positive_prompt(node_id, obj, prompt, extra_data, outputs, input_data_all): 7 | return node_id in _get_node_id_list(prompt, "positive") 8 | 9 | 10 | def is_negative_prompt(node_id, obj, prompt, extra_data, outputs, input_data_all): 11 | return node_id in _get_node_id_list(prompt, "negative") 12 | 13 | 14 | def _get_node_id_list(prompt, field_name): 15 | node_id_list = {} 16 | for nid, node in prompt.items(): 17 | for sampler_type, field_map in SAMPLERS.items(): 18 | if node["class_type"] == sampler_type: 19 | # There are nodes between "KSampler" and "CLIP Text Encode" in the SD3 workflow 20 | d = deque() 21 | if field_name in field_map and field_map[field_name] in node["inputs"]: 22 | d.append(node["inputs"][field_map[field_name]][0]) 23 | while len(d) > 0: 24 | nid2 = d.popleft() 25 | class_type = prompt[nid2]["class_type"] 26 | if class_type == "CLIPTextEncode": 27 | 
node_id_list[nid] = nid2 28 | break 29 | inputs = prompt[nid2]["inputs"] 30 | for k, v in inputs.items(): 31 | if isinstance(v, list): 32 | d.append(v[0]) 33 | 34 | return node_id_list.values() 35 | -------------------------------------------------------------------------------- /py/hook.py: -------------------------------------------------------------------------------- 1 | from .nodes.node import SaveImageWithMetaData 2 | 3 | current_prompt = {} 4 | current_extra_data = {} 5 | prompt_executer = None 6 | current_save_image_node_id = -1 7 | 8 | 9 | def pre_execute(self, prompt, prompt_id, extra_data, execute_outputs): 10 | global current_prompt 11 | global current_extra_data 12 | global prompt_executer 13 | 14 | current_prompt = prompt 15 | current_extra_data = extra_data 16 | prompt_executer = self 17 | 18 | 19 | def pre_get_input_data(inputs, class_def, unique_id, *args): 20 | global current_save_image_node_id 21 | 22 | if class_def == SaveImageWithMetaData: 23 | current_save_image_node_id = unique_id 24 | -------------------------------------------------------------------------------- /py/nodes/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/nkchocoai/ComfyUI-SaveImageWithMetaData/c9a87453bc4257614c87d28a64a5be9c4fdccba9/py/nodes/__init__.py -------------------------------------------------------------------------------- /py/nodes/base.py: -------------------------------------------------------------------------------- 1 | class BaseNode: 2 | CATEGORY = "SaveImage" 3 | -------------------------------------------------------------------------------- /py/nodes/node.py: -------------------------------------------------------------------------------- 1 | import json 2 | import os 3 | import re 4 | 5 | from datetime import datetime 6 | 7 | from PIL import Image 8 | from PIL.PngImagePlugin import PngInfo 9 | import numpy as np 10 | 11 | import piexif 12 | import piexif.helper 13 | 14 | import folder_paths 15 | from comfy.cli_args import args 16 | 17 | from .base import BaseNode 18 | 19 | from ..capture import Capture 20 | from .. import hook 21 | from ..trace import Trace 22 | 23 | from ..defs.combo import SAMPLER_SELECTION_METHOD 24 | 25 | 26 | # refer. 
https://github.com/comfyanonymous/ComfyUI/blob/38b7ac6e269e6ecc5bdd6fefdfb2fb1185b09c9d/nodes.py#L1411 27 | class SaveImageWithMetaData(BaseNode): 28 | SAVE_FILE_FORMATS = ["png", "jpeg", "webp"] 29 | 30 | def __init__(self): 31 | self.output_dir = folder_paths.get_output_directory() 32 | self.type = "output" 33 | self.prefix_append = "" 34 | self.compress_level = 4 35 | 36 | @classmethod 37 | def INPUT_TYPES(s): 38 | return { 39 | "required": { 40 | "images": ("IMAGE",), 41 | "filename_prefix": ("STRING", {"default": "ComfyUI"}), 42 | "sampler_selection_method": (SAMPLER_SELECTION_METHOD,), 43 | "sampler_selection_node_id": ( 44 | "INT", 45 | {"default": 0, "min": 0, "max": 999999999, "step": 1}, 46 | ), 47 | "file_format": (s.SAVE_FILE_FORMATS,), 48 | }, 49 | "optional": { 50 | "lossless_webp": ("BOOLEAN", {"default": True}), 51 | "quality": ("INT", {"default": 100, "min": 1, "max": 100}), 52 | "save_workflow_json": ("BOOLEAN", {"default": False}), 53 | "add_counter_to_filename": ("BOOLEAN", {"default": True}), 54 | "civitai_sampler": ("BOOLEAN", {"default": False}), 55 | "extra_metadata": ("EXTRA_METADATA", {}), 56 | "save_workflow_image": ("BOOLEAN", {"default": True}), 57 | }, 58 | "hidden": {"prompt": "PROMPT", "extra_pnginfo": "EXTRA_PNGINFO"}, 59 | } 60 | 61 | RETURN_TYPES = () 62 | FUNCTION = "save_images" 63 | 64 | OUTPUT_NODE = True 65 | 66 | pattern_format = re.compile(r"(%[^%]+%)") 67 | 68 | def save_images( 69 | self, 70 | images, 71 | filename_prefix="ComfyUI", 72 | sampler_selection_method=SAMPLER_SELECTION_METHOD[0], 73 | sampler_selection_node_id=0, 74 | file_format="png", 75 | lossless_webp=True, 76 | quality=100, 77 | save_workflow_json=False, 78 | add_counter_to_filename=True, 79 | civitai_sampler=False, 80 | extra_metadata={}, 81 | prompt=None, 82 | extra_pnginfo=None, 83 | save_workflow_image=True, 84 | ): 85 | pnginfo_dict_src = self.gen_pnginfo( 86 | sampler_selection_method, sampler_selection_node_id, civitai_sampler 87 | ) 88 | for k, v in extra_metadata.items(): 89 | if k and v: 90 | pnginfo_dict_src[k] = v.replace(",", "/") 91 | 92 | results = list() 93 | for index, image in enumerate(images): 94 | i = 255.0 * image.cpu().numpy() 95 | img = Image.fromarray(np.clip(i, 0, 255).astype(np.uint8)) 96 | 97 | pnginfo_dict = pnginfo_dict_src.copy() 98 | if len(images) >= 2: 99 | pnginfo_dict["Batch index"] = index 100 | pnginfo_dict["Batch size"] = len(images) 101 | 102 | metadata = None 103 | parameters = "" 104 | if not args.disable_metadata: 105 | metadata = PngInfo() 106 | parameters = Capture.gen_parameters_str(pnginfo_dict) 107 | if pnginfo_dict: 108 | metadata.add_text("parameters", parameters) 109 | if prompt is not None and save_workflow_image: 110 | metadata.add_text("prompt", json.dumps(prompt)) 111 | if extra_pnginfo is not None: 112 | for x in extra_pnginfo: 113 | metadata.add_text(x, json.dumps(extra_pnginfo[x])) 114 | if save_workflow_image == False: 115 | metadata.add_text("workflow", "") 116 | 117 | filename_prefix = self.format_filename(filename_prefix, pnginfo_dict) 118 | output_path = os.path.join(self.output_dir, filename_prefix) 119 | if not os.path.exists(os.path.dirname(output_path)): 120 | os.makedirs(os.path.dirname(output_path), exist_ok=True) 121 | ( 122 | full_output_folder, 123 | filename, 124 | counter, 125 | subfolder, 126 | filename_prefix, 127 | ) = folder_paths.get_save_image_path( 128 | filename_prefix, self.output_dir, images[0].shape[1], images[0].shape[0] 129 | ) 130 | base_filename = filename 131 | if add_counter_to_filename: 132 
| base_filename += f"_{counter:05}_" 133 | file = base_filename + "." + file_format 134 | file_path = os.path.join(full_output_folder, file) 135 | 136 | if file_format == "png": 137 | img.save( 138 | file_path, 139 | pnginfo=metadata, 140 | compress_level=self.compress_level, 141 | ) 142 | else: 143 | img.save( 144 | file_path, 145 | optimize=True, 146 | quality=quality, 147 | lossless=lossless_webp, 148 | ) 149 | exif_bytes = piexif.dump( 150 | { 151 | "Exif": { 152 | piexif.ExifIFD.UserComment: piexif.helper.UserComment.dump( 153 | parameters, encoding="unicode" 154 | ), 155 | }, 156 | } 157 | ) 158 | piexif.insert(exif_bytes, file_path) 159 | 160 | if save_workflow_json: 161 | file_path_workflow = os.path.join( 162 | full_output_folder, f"{base_filename}.json" 163 | ) 164 | with open(file_path_workflow, "w", encoding="utf-8") as f: 165 | json.dump(extra_pnginfo["workflow"], f) 166 | 167 | results.append( 168 | {"filename": file, "subfolder": subfolder, "type": self.type} 169 | ) 170 | counter += 1 171 | 172 | return {"ui": {"images": results}} 173 | 174 | @classmethod 175 | def gen_pnginfo( 176 | cls, sampler_selection_method, sampler_selection_node_id, save_civitai_sampler 177 | ): 178 | # get all node inputs 179 | inputs = Capture.get_inputs() 180 | 181 | # get sampler node before this node 182 | trace_tree_from_this_node = Trace.trace( 183 | hook.current_save_image_node_id, hook.current_prompt 184 | ) 185 | inputs_before_this_node = Trace.filter_inputs_by_trace_tree( 186 | inputs, trace_tree_from_this_node 187 | ) 188 | sampler_node_id = Trace.find_sampler_node_id( 189 | trace_tree_from_this_node, 190 | sampler_selection_method, 191 | sampler_selection_node_id, 192 | ) 193 | 194 | # get inputs before sampler node 195 | trace_tree_from_sampler_node = Trace.trace(sampler_node_id, hook.current_prompt) 196 | inputs_before_sampler_node = Trace.filter_inputs_by_trace_tree( 197 | inputs, trace_tree_from_sampler_node 198 | ) 199 | 200 | # generate PNGInfo from inputs 201 | pnginfo_dict = Capture.gen_pnginfo_dict( 202 | inputs_before_sampler_node, inputs_before_this_node, save_civitai_sampler 203 | ) 204 | return pnginfo_dict 205 | 206 | @classmethod 207 | def format_filename(cls, filename, pnginfo_dict): 208 | result = re.findall(cls.pattern_format, filename) 209 | for segment in result: 210 | parts = segment.replace("%", "").split(":") 211 | key = parts[0] 212 | if key == "seed": 213 | filename = filename.replace(segment, str(pnginfo_dict.get("Seed", ""))) 214 | elif key == "width": 215 | w = pnginfo_dict.get("Size", "x").split("x")[0] 216 | filename = filename.replace(segment, str(w)) 217 | elif key == "height": 218 | w = pnginfo_dict.get("Size", "x").split("x")[1] 219 | filename = filename.replace(segment, str(w)) 220 | elif key == "pprompt": 221 | prompt = pnginfo_dict.get("Positive prompt", "").replace("\n", " ") 222 | if len(parts) >= 2: 223 | length = int(parts[1]) 224 | prompt = prompt[:length] 225 | filename = filename.replace(segment, prompt.strip()) 226 | elif key == "nprompt": 227 | prompt = pnginfo_dict.get("Negative prompt", "").replace("\n", " ") 228 | if len(parts) >= 2: 229 | length = int(parts[1]) 230 | prompt = prompt[:length] 231 | filename = filename.replace(segment, prompt.strip()) 232 | elif key == "model": 233 | model = pnginfo_dict.get("Model", "") 234 | model = os.path.splitext(os.path.basename(model))[0] 235 | if len(parts) >= 2: 236 | length = int(parts[1]) 237 | model = model[:length] 238 | filename = filename.replace(segment, model) 239 | elif key == "date": 240 
| now = datetime.now() 241 | date_table = { 242 | "yyyy": now.year, 243 | "MM": now.month, 244 | "dd": now.day, 245 | "hh": now.hour, 246 | "mm": now.minute, 247 | "ss": now.second, 248 | } 249 | if len(parts) >= 2: 250 | date_format = parts[1] 251 | for k, v in date_table.items(): 252 | date_format = date_format.replace(k, str(v).zfill(len(k))) 253 | filename = filename.replace(segment, date_format) 254 | else: 255 | date_format = "yyyyMMddhhmmss" 256 | for k, v in date_table.items(): 257 | date_format = date_format.replace(k, str(v).zfill(len(k))) 258 | filename = filename.replace(segment, date_format) 259 | 260 | return filename 261 | 262 | 263 | class CreateExtraMetaData(BaseNode): 264 | @classmethod 265 | def INPUT_TYPES(s): 266 | return { 267 | "required": { 268 | "key1": ("STRING", {"default": "", "multiline": False}), 269 | "value1": ("STRING", {"default": "", "multiline": False}), 270 | }, 271 | "optional": { 272 | "key2": ("STRING", {"default": "", "multiline": False}), 273 | "value2": ("STRING", {"default": "", "multiline": False}), 274 | "key3": ("STRING", {"default": "", "multiline": False}), 275 | "value3": ("STRING", {"default": "", "multiline": False}), 276 | "key4": ("STRING", {"default": "", "multiline": False}), 277 | "value4": ("STRING", {"default": "", "multiline": False}), 278 | "extra_metadata": ("EXTRA_METADATA",), 279 | }, 280 | } 281 | 282 | RETURN_TYPES = ("EXTRA_METADATA",) 283 | FUNCTION = "create_extra_metadata" 284 | 285 | def create_extra_metadata( 286 | self, 287 | extra_metadata={}, 288 | key1="", 289 | value1="", 290 | key2="", 291 | value2="", 292 | key3="", 293 | value3="", 294 | key4="", 295 | value4="", 296 | ): 297 | extra_metadata.update( 298 | { 299 | key1: value1, 300 | key2: value2, 301 | key3: value3, 302 | key4: value4, 303 | } 304 | ) 305 | return (extra_metadata,) 306 | -------------------------------------------------------------------------------- /py/trace.py: -------------------------------------------------------------------------------- 1 | from collections import deque 2 | 3 | from .defs.samplers import SAMPLERS 4 | from .defs.combo import SAMPLER_SELECTION_METHOD 5 | 6 | 7 | class Trace: 8 | @classmethod 9 | def trace(cls, start_node_id, prompt): 10 | class_type = prompt[start_node_id]["class_type"] 11 | Q = deque() 12 | Q.append((start_node_id, 0)) 13 | trace_tree = {start_node_id: (0, class_type)} 14 | while len(Q) > 0: 15 | current_node_id, distance = Q.popleft() 16 | input_fields = prompt[current_node_id]["inputs"] 17 | for value in input_fields.values(): 18 | if isinstance(value, list): 19 | nid = value[0] 20 | class_type = prompt[nid]["class_type"] 21 | trace_tree[nid] = (distance + 1, class_type) 22 | Q.append((nid, distance + 1)) 23 | return trace_tree 24 | 25 | @classmethod 26 | def find_sampler_node_id(cls, trace_tree, sampler_selection_method, node_id): 27 | if sampler_selection_method == SAMPLER_SELECTION_METHOD[2]: 28 | node_id = str(node_id) 29 | _, class_type = trace_tree.get(node_id, (-1, None)) 30 | if class_type in SAMPLERS.keys(): 31 | return node_id 32 | return -1 33 | 34 | sorted_by_distance_trace_tree = sorted( 35 | [(k, v[0], v[1]) for k, v in trace_tree.items()], 36 | key=lambda x: x[1], 37 | reverse=(sampler_selection_method == SAMPLER_SELECTION_METHOD[0]), 38 | ) 39 | for nid, _, class_type in sorted_by_distance_trace_tree: 40 | if class_type in SAMPLERS.keys(): 41 | return nid 42 | return -1 43 | 44 | @classmethod 45 | def filter_inputs_by_trace_tree(cls, inputs, trace_tree): 46 | filtered_inputs = {} 47 | 
for meta, inputs_list in inputs.items(): 48 | for node_id, input_value in inputs_list: 49 | trace = trace_tree.get(node_id) 50 | if trace is not None: 51 | distance = trace[0] 52 | if meta not in filtered_inputs: 53 | filtered_inputs[meta] = [] 54 | filtered_inputs[meta].append((node_id, input_value, distance)) 55 | 56 | # sort by distance 57 | for k, v in filtered_inputs.items(): 58 | filtered_inputs[k] = sorted(v, key=lambda x: x[2]) 59 | return filtered_inputs 60 | -------------------------------------------------------------------------------- /py/utils/embedding.py: -------------------------------------------------------------------------------- 1 | import os 2 | from comfy.sd1_clip import expand_directory_list 3 | 4 | def get_embedding_file_path(embedding_name, clip): 5 | """ 6 | Resolves the file path for an embedding by searching directories and checking file extensions. 7 | 8 | Args: 9 | embedding_name (str): The name of the embedding file (without an extension). 10 | clip (object): An object containing the attribute `embedding_directory`, 11 | which specifies directories to search. 12 | 13 | Returns: 14 | str or None: Full path to the embedding file if found, otherwise None. 15 | """ 16 | # Validate embedding_directory 17 | embedding_directory = getattr(clip, "embedding_directory", None) 18 | if not embedding_directory: 19 | raise ValueError("The 'embedding_directory' attribute in the clip object is None or empty.") 20 | 21 | if isinstance(embedding_directory, str): 22 | embedding_directory = [embedding_directory] 23 | 24 | # Expand directories using the provided function 25 | try: 26 | embedding_directory = expand_directory_list(embedding_directory) 27 | except Exception as e: 28 | raise ValueError(f"Error expanding directory list: {e}") 29 | 30 | if not embedding_directory: 31 | raise ValueError("No valid directories found after expansion.") 32 | 33 | valid_file = None 34 | extensions = [".safetensors", ".pt", ".bin"] 35 | 36 | for embed_dir in embedding_directory: 37 | embed_dir = os.path.abspath(embed_dir) 38 | if not os.path.isdir(embed_dir): 39 | # Skip invalid directories 40 | continue 41 | 42 | # Construct the absolute path for the embedding name 43 | embed_path = os.path.abspath(os.path.join(embed_dir, embedding_name)) 44 | 45 | try: 46 | # Ensure embed_path is within embed_dir (security check) 47 | if os.path.commonpath([embed_dir, embed_path]) != embed_dir: 48 | continue 49 | except Exception as e: 50 | # Skip this directory on exception (e.g., invalid path comparison) 51 | continue 52 | 53 | # Check if the file exists with or without extensions 54 | if os.path.isfile(embed_path): 55 | valid_file = embed_path 56 | else: 57 | for ext in extensions: 58 | candidate_path = embed_path + ext 59 | if os.path.isfile(candidate_path): 60 | valid_file = candidate_path 61 | break 62 | 63 | # Stop searching if a valid file is found 64 | if valid_file: 65 | break 66 | 67 | return valid_file 68 | -------------------------------------------------------------------------------- /py/utils/hash.py: -------------------------------------------------------------------------------- 1 | import hashlib 2 | 3 | 4 | cache_model_hash = {} 5 | 6 | 7 | def calc_hash(filename): 8 | if filename in cache_model_hash: 9 | return cache_model_hash[filename] 10 | sha256_hash = hashlib.sha256() 11 | 12 | with open(filename, "rb") as f: 13 | for byte_block in iter(lambda: f.read(4096), b""): 14 | sha256_hash.update(byte_block) 15 | model_hash = sha256_hash.hexdigest()[:10] 16 | 17 | 
cache_model_hash[filename] = model_hash 18 | return model_hash 19 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [project] 2 | name = "comfyui-saveimagewithmetadata" 3 | description = "Add a node to save images with metadata (PNGInfo) extracted from the input values of each node.\nSince the values are extracted dynamically, values output by various extension nodes can be added to metadata." 4 | version = "1.0.0" 5 | license = { file = "LICENSE" } 6 | 7 | [project.urls] 8 | Repository = "https://github.com/nkchocoai/ComfyUI-SaveImageWithMetaData" 9 | # Used by Comfy Registry https://comfyregistry.org 10 | 11 | [tool.comfy] 12 | PublisherId = "nkchocoai" 13 | DisplayName = "ComfyUI-SaveImageWithMetaData" 14 | Icon = "" 15 | --------------------------------------------------------------------------------
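
Note on extending the metadata definitions: each module under py/defs/ext/ shown above follows the same pattern, an optional module-level `SAMPLERS` dict naming which inputs of a sampler node carry the positive and negative conditioning, plus a `CAPTURE_FIELD_LIST` dict mapping a node's class name to `MetaField` entries that are resolved either through a `"field_name"` (optionally post-processed by a `"format"` callable such as `calc_model_hash`) or through a `"selector"` callable. The sketch below shows how a definition for an additional third-party node could be written in that style. The node class names (`MyCustomKSampler`, `MyCustomLoader`) and their widget names are hypothetical placeholders, and the mechanism that discovers modules under py/defs/ext/ is not part of this listing, so treat this as an illustrative pattern rather than tested extension code.

```
# Hypothetical sketch in the style of py/defs/ext/*.py (not a file in this repository).
# Node class names and widget names below are placeholders.
from ..meta import MetaField
from ..formatters import calc_model_hash

# Which inputs of the (hypothetical) sampler node hold the conditioning,
# so the prompt tracer can locate the CLIPTextEncode nodes behind them.
SAMPLERS = {
    "MyCustomKSampler": {
        "positive": "positive",
        "negative": "negative",
    },
}

# Map node widgets to MetaFields; "format" converts the raw widget value
# (here: checkpoint name -> truncated SHA-256 hash, as in formatters.py).
CAPTURE_FIELD_LIST = {
    "MyCustomKSampler": {
        MetaField.SEED: {"field_name": "noise_seed"},
        MetaField.STEPS: {"field_name": "steps"},
        MetaField.CFG: {"field_name": "cfg"},
        MetaField.SAMPLER_NAME: {"field_name": "sampler_name"},
        MetaField.SCHEDULER: {"field_name": "scheduler"},
    },
    "MyCustomLoader": {
        MetaField.MODEL_NAME: {"field_name": "ckpt_name"},
        MetaField.MODEL_HASH: {"field_name": "ckpt_name", "format": calc_model_hash},
    },
}
```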