├── .github └── workflows │ └── publish.yml ├── .gitignore ├── README.md ├── __init__.py ├── images ├── CivitaiTriggerWords.png ├── LoraListNames.png ├── LoraLoader.png ├── LoraLoaderStacked.png ├── LoraTagsOnly.png ├── TagsFormater.png ├── TagsSelector.png ├── ViewInfo.png ├── loaderAdvanced.png ├── loaderStacked.png ├── main.png └── stackingLoras.png ├── nodes_autotrigger.py ├── nodes_utils.py ├── pyproject.toml ├── utils.py └── web └── js └── loraInfo.js /.github/workflows/publish.yml: -------------------------------------------------------------------------------- 1 | name: Publish to Comfy registry 2 | on: 3 | workflow_dispatch: 4 | push: 5 | branches: 6 | - main 7 | paths: 8 | - "pyproject.toml" 9 | 10 | jobs: 11 | publish-node: 12 | name: Publish Custom Node to registry 13 | runs-on: ubuntu-latest 14 | steps: 15 | - name: Check out code 16 | uses: actions/checkout@v4 17 | - name: Publish Custom Node 18 | uses: Comfy-Org/publish-node-action@main 19 | with: 20 | ## Add your own personal access token to your GitHub Repository secrets and reference it here. 21 | personal_access_token: ${{ secrets.REGISTRY_ACCESS_TOKEN }} -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__ -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ComfyUI-Lora-Auto-Trigger-Words 2 | 3 | This project is a fork of https://github.com/Extraltodeus/LoadLoraWithTags 4 | The aim of these custom nodes is to get _easy_ access to the tags used to trigger a lora. 5 | This project is compatible with Stacked Loras from https://github.com/jags111/efficiency-nodes-comfyui 6 | When I talk about **lora**, I also mean **lycoris**. 7 | 8 | ## Install 9 | Some of this project's nodes depend on https://github.com/pythongosssss/ComfyUI-Custom-Scripts: 10 | - LoraLoaderAdvanced 11 | - LoraLoaderStackedAdvanced 12 | - `View info...` 13 | 14 | They have vanilla equivalents with fewer features. 15 | 16 | Overall, Custom-Scripts is recommended so you can inspect the content of the tag lists with the `showText` node. 17 | 18 | ## Features 19 | ### Lora Trigger Words 20 | Lora trigger words are imported from two sources: 21 | - Civitai API (only for Civitai models) 22 | 23 | ![image](./images/CivitaiTriggerWords.png) 24 | - Model training metadata (when available) 25 | 26 | #### Vanilla vs Advanced 27 | > Vanilla refers to nodes that have no lora preview in the menu, nor the lora list. The features provided are otherwise the same. 28 | 29 | ![image](./images/main.png) 30 | ### Nodes 31 | #### LoraLoader (Vanilla and Advanced) 32 | ![image](./images/LoraLoader.png) 33 | INPUT 34 | - `override_lora_name` (optional): Used to ignore the `lora_name` field and use the passed name instead. Should be fed from [LoraListNames](#loralistnames) or the `lora_name` output.
35 | 36 | FIELDS 37 | - `force_fetch`: Force fetching the data from Civitai even if something is already saved locally 38 | - `enable_preview`: Toggle on/off the saved lora preview, if any (only in Advanced) 39 | - `append_loraname_if_empty`: Add the name of the lora to the list of tags if the list is empty 40 | 41 | OUTPUT 42 | - `civitai_tags_list`: a Python list of the tags related to this lora on Civitai 43 | - `meta_tags_list`: a Python list of the tags used for training the lora, embedded in it (if any) 44 | - `lora_name`: the name of the currently selected lora 45 | #### LoraLoaderStacked (Vanilla and Advanced) 46 | ![image](./images/LoraLoaderStacked.png) 47 | INPUT 48 | - `lora_stack` (optional): another lora stack to chain with this one. 49 | - `override_lora_name` (optional): Used to ignore the `lora_name` field and use the passed name instead. Should be fed from [LoraListNames](#loralistnames) or the `lora_name` output. 50 | 51 | FIELDS 52 | - `force_fetch`: Force fetching the data from Civitai even if something is already saved locally 53 | - `enable_preview`: Toggle on/off the saved lora preview, if any (only in Advanced) 54 | - `append_loraname_if_empty`: Add the name of the lora to the list of tags if the list is empty 55 | 56 | OUTPUT 57 | - `civitai_tags_list`: a Python list of the tags related to this lora on Civitai 58 | - `meta_tags_list`: a Python list of the tags used for training the lora, embedded in it (if any) 59 | - `lora_name`: the name of the currently selected lora 60 | #### LoraTagsOnly 61 | ![image](./images/LoraTagsOnly.png) 62 | To get the tags without loading the lora. 63 | - `override_lora_name` (optional): Used to ignore the `lora_name` field and use the passed name instead. Should be fed from [LoraListNames](#loralistnames) or the `lora_name` output. 64 | 65 | OUTPUT 66 | - `civitai_tags_list`: a Python list of the tags related to this lora on Civitai 67 | - `meta_tags_list`: a Python list of the tags used for training the lora, embedded in it (if any) 68 | 69 | 70 | #### TagsFormater 71 | ![image](./images/TagsFormater.png) 72 | Helper to show the available tags and their indexes. Tags are sorted by training frequency: the more a tag was used, the higher it is in the list. Works for both `civitai_tags_list` and `meta_tags_list`. 73 | 74 | #### TagsSelector 75 | ![image](./images/TagsSelector.png) 76 | Allows you to filter tags and apply a weight to them. 77 | TagsSelector has the following parameters: 78 | - `selector`: see the [Filtering](#filtering) section below 79 | - `weight`: formats the tag like `(tag:weight)`. Defaults to 1, which outputs the tag without a weight, like `tag`. 80 | - `ensure_comma`: properly appends a comma if a prefix or suffix is added. 81 | 82 | #### LoraListNames 83 | ![image](./images/LoraListNames.png) 84 | Lists all the existing lora names. It is used as an input for `override_lora_name`. 85 | 86 | ### Filtering 87 | The format is simple. It is the same as Python list indexing, but you can select multiple indexes or ranges of indexes separated by commas, as illustrated in the sketch after this list. 88 | `Ex: 0, 3, 5:8, -8:` 89 | - Select a specific list of indexes: `0, 2, 3, 15`... 90 | - Select ranges of indexes: `2:5, 10:15`... 91 | - Select a range from the beginning to a specific index: `:5` 92 | - Select a range from a specific index to the end: `5:` 93 | - You can use negative indexes, like `-1` to select the last tag 94 | - By default `:` selects everything
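For reference, the selector is essentially shorthand for Python list indexing and slicing over the tag list (the node's actual parsing lives in `parse_selector` in `utils.py`). A minimal sketch of the equivalence, with made-up tag names:

```python
# Rough equivalent of the selector "0, 2:4, -2:" applied to a tag list.
tags = ["tag0", "tag1", "tag2", "tag3", "tag4", "tag5"]

picked = []
picked += [tags[0]]   # "0"   -> a single index
picked += tags[2:4]   # "2:4" -> a range of indexes (end excluded, as in Python)
picked += tags[-2:]   # "-2:" -> from the second-to-last tag to the end
print(", ".join(picked))  # tag0, tag2, tag3, tag4, tag5
```

Out-of-range indexes are simply ignored by the node, and ranges are clamped to the list boundaries.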
95 | 96 | ### View Info 97 | ![image](./images/ViewInfo.png) 98 | pythongosssss's [View Info...](https://github.com/pythongosssss/ComfyUI-Custom-Scripts?tab=readme-ov-file#checkpointloraembedding-info) feature from ComfyUI-Custom-Scripts 99 | 100 | ### Examples 101 | #### Example of normal workflow 102 | ![image](./images/loaderAdvanced.png) 103 | 104 | #### Example of Stacked workflow 105 | ![image](./images/loaderStacked.png) 106 | 107 | #### Chaining Selectors and Stacked 108 | Tags selectors can be chained to select different tags with different weights: `(tag1:0.8), tag2, (tag3:1.1)`. 109 | Lora Stacks can also be chained together to load multiple loras into an Efficient Loader. 110 | ![image](./images/stackingLoras.png) 111 | 112 | ### Side nodes I made and kept here 113 | - FusionText: takes two text inputs and joins them together 114 | - Randomizer: takes two text + lora_stack pairs and randomly returns one of them 115 | - TextInputBasic: just a text input with two additional inputs (prefix and suffix) for text chaining 116 | -------------------------------------------------------------------------------- /__init__.py: -------------------------------------------------------------------------------- 1 | #from .nodes_autotrigger import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS as na_NCM, na_NDNM 2 | #from .nodes_utils import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS as nu_NCM, nu_NDNM 3 | from .nodes_autotrigger import NODE_CLASS_MAPPINGS as na_NCM 4 | from .nodes_utils import NODE_CLASS_MAPPINGS as nu_NCM 5 | 6 | NODE_CLASS_MAPPINGS = dict(na_NCM, **nu_NCM) 7 | #NODE_DISPLAY_NAME_MAPPINGS = dict(na_NDNM, **nu_NDNM) 8 | WEB_DIRECTORY = "./web" 9 | __all__ = ["NODE_CLASS_MAPPINGS", "WEB_DIRECTORY"]#, "NODE_DISPLAY_NAME_MAPPINGS"] 10 | -------------------------------------------------------------------------------- /images/CivitaiTriggerWords.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/CivitaiTriggerWords.png -------------------------------------------------------------------------------- /images/LoraListNames.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/LoraListNames.png -------------------------------------------------------------------------------- /images/LoraLoader.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/LoraLoader.png -------------------------------------------------------------------------------- /images/LoraLoaderStacked.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/LoraLoaderStacked.png -------------------------------------------------------------------------------- /images/LoraTagsOnly.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/LoraTagsOnly.png
-------------------------------------------------------------------------------- /images/TagsFormater.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/TagsFormater.png -------------------------------------------------------------------------------- /images/TagsSelector.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/TagsSelector.png -------------------------------------------------------------------------------- /images/ViewInfo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/ViewInfo.png -------------------------------------------------------------------------------- /images/loaderAdvanced.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/loaderAdvanced.png -------------------------------------------------------------------------------- /images/loaderStacked.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/loaderStacked.png -------------------------------------------------------------------------------- /images/main.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/main.png -------------------------------------------------------------------------------- /images/stackingLoras.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words/52157dff5ad65d370b1de05013f940cb79329ec7/images/stackingLoras.png -------------------------------------------------------------------------------- /nodes_autotrigger.py: -------------------------------------------------------------------------------- 1 | from comfy.sd import load_lora_for_models 2 | from comfy.utils import load_torch_file 3 | import folder_paths 4 | 5 | from .utils import * 6 | 7 | class LoraLoaderVanilla: 8 | def __init__(self): 9 | self.loaded_lora = None 10 | 11 | @classmethod 12 | def INPUT_TYPES(s): 13 | LORA_LIST = sorted(folder_paths.get_filename_list("loras"), key=str.lower) 14 | return { 15 | "required": { 16 | "model": ("MODEL",), 17 | "lora_name": (LORA_LIST, ), 18 | "strength_model": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 19 | "strength_clip": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 20 | "force_fetch": ("BOOLEAN", {"default": False}), 21 | "append_loraname_if_empty": ("BOOLEAN", {"default": False}), 22 | }, 23 | "optional": { 24 | "clip": ("CLIP", ), 25 | "override_lora_name":("STRING", {"forceInput": True}), 26 | } 27 | } 28 | 29 | RETURN_TYPES = ("MODEL", "CLIP", "LIST", "LIST", "STRING") 30 | RETURN_NAMES = ("MODEL", "CLIP", "civitai_tags_list", "meta_tags_list", "lora_name") 31 | FUNCTION = "load_lora" 32 | CATEGORY = "autotrigger" 
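    # ComfyUI node wiring (standard custom-node conventions):
    # RETURN_TYPES / RETURN_NAMES declare the node's output sockets,
    # FUNCTION is the name of the method ComfyUI calls when the node executes,
    # and CATEGORY is where the node shows up in the "Add Node" menu.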
33 | 34 | def load_lora(self, model, lora_name, strength_model, strength_clip, force_fetch, append_loraname_if_empty, clip=None, override_lora_name=""): 35 | if clip is None: 36 | strength_clip=0 37 | if override_lora_name != "": 38 | lora_name = override_lora_name 39 | meta_tags_list = sort_tags_by_frequency(get_metadata(lora_name, "loras")) 40 | civitai_tags_list = load_and_save_tags(lora_name, force_fetch) 41 | 42 | meta_tags_list = append_lora_name_if_empty(meta_tags_list, lora_name, append_loraname_if_empty) 43 | civitai_tags_list = append_lora_name_if_empty(civitai_tags_list, lora_name, append_loraname_if_empty) 44 | 45 | lora_path = folder_paths.get_full_path("loras", lora_name) 46 | lora = None 47 | if self.loaded_lora is not None: 48 | if self.loaded_lora[0] == lora_path: 49 | lora = self.loaded_lora[1] 50 | else: 51 | temp = self.loaded_lora 52 | self.loaded_lora = None 53 | del temp 54 | 55 | if lora is None: 56 | lora = load_torch_file(lora_path, safe_load=True) 57 | self.loaded_lora = (lora_path, lora) 58 | 59 | model_lora, clip_lora = load_lora_for_models(model, clip, lora, strength_model, strength_clip) 60 | 61 | return (model_lora, clip_lora, civitai_tags_list, meta_tags_list, lora_name) 62 | 63 | class LoraLoaderStackedVanilla: 64 | @classmethod 65 | def INPUT_TYPES(s): 66 | LORA_LIST = folder_paths.get_filename_list("loras") 67 | return { 68 | "required": { 69 | "lora_name": (LORA_LIST,), 70 | "lora_weight": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 71 | "force_fetch": ("BOOLEAN", {"default": False}), 72 | "append_loraname_if_empty": ("BOOLEAN", {"default": False}), 73 | }, 74 | "optional": { 75 | "lora_stack": ("LORA_STACK", ), 76 | "override_lora_name":("STRING", {"forceInput": True}), 77 | } 78 | } 79 | 80 | RETURN_TYPES = ("LIST", "LIST", "LORA_STACK", "STRING") 81 | RETURN_NAMES = ("civitai_tags_list", "meta_tags_list", "LORA_STACK", "lora_name") 82 | FUNCTION = "set_stack" 83 | #OUTPUT_NODE = False 84 | CATEGORY = "autotrigger" 85 | 86 | def set_stack(self, lora_name, lora_weight, force_fetch, append_loraname_if_empty, lora_stack=None, override_lora_name=""): 87 | if override_lora_name != "": 88 | lora_name = override_lora_name 89 | civitai_tags_list = load_and_save_tags(lora_name, force_fetch) 90 | 91 | meta_tags = get_metadata(lora_name, "loras") 92 | meta_tags_list = sort_tags_by_frequency(meta_tags) 93 | 94 | civitai_tags_list = append_lora_name_if_empty(civitai_tags_list, lora_name, append_loraname_if_empty) 95 | meta_tags_list = append_lora_name_if_empty(meta_tags_list, lora_name, append_loraname_if_empty) 96 | 97 | loras = [(lora_name,lora_weight,lora_weight,)] 98 | if lora_stack is not None: 99 | loras.extend(lora_stack) 100 | 101 | return (civitai_tags_list, meta_tags_list, loras, lora_name) 102 | 103 | class LoraLoaderAdvanced: 104 | def __init__(self): 105 | self.loaded_lora = None 106 | 107 | @classmethod 108 | def INPUT_TYPES(s): 109 | LORA_LIST = sorted(folder_paths.get_filename_list("loras"), key=str.lower) 110 | populate_items(LORA_LIST, "loras") 111 | return { 112 | "required": { 113 | "model": ("MODEL",), 114 | "lora_name": (LORA_LIST, ), 115 | "strength_model": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 116 | "strength_clip": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 117 | "force_fetch": ("BOOLEAN", {"default": False}), 118 | "enable_preview": ("BOOLEAN", {"default": False}), 119 | "append_loraname_if_empty": ("BOOLEAN", {"default": False}), 120 | }, 121 | 
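                # Inputs declared under "optional" may be left unconnected.
                # "forceInput": True turns the string widget into an input socket,
                # so override_lora_name can be wired from another node (e.g. LoraListNames).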
"optional": { 122 | "clip": ("CLIP", ), 123 | "override_lora_name":("STRING", {"forceInput": True}), 124 | } 125 | } 126 | 127 | RETURN_TYPES = ("MODEL", "CLIP", "LIST", "LIST", "STRING") 128 | RETURN_NAMES = ("MODEL", "CLIP", "civitai_tags_list", "meta_tags_list", "lora_name") 129 | FUNCTION = "load_lora" 130 | CATEGORY = "autotrigger" 131 | 132 | def load_lora(self, model, lora_name, strength_model, strength_clip, force_fetch, enable_preview, append_loraname_if_empty, clip=None, override_lora_name=""): 133 | if clip is None: 134 | strength_clip=0 135 | if override_lora_name != "": 136 | has_preview, prev = get_preview_path(override_lora_name, "loras") 137 | prev = f"loras/{prev}" if has_preview else None 138 | lora_name = {"content": override_lora_name, "image": prev, "type": "loras"} 139 | 140 | meta_tags_list = sort_tags_by_frequency(get_metadata(lora_name["content"], "loras")) 141 | civitai_tags_list = load_and_save_tags(lora_name["content"], force_fetch) 142 | 143 | civitai_tags_list = append_lora_name_if_empty(civitai_tags_list, lora_name["content"], append_loraname_if_empty) 144 | meta_tags_list = append_lora_name_if_empty(meta_tags_list, lora_name["content"], append_loraname_if_empty) 145 | 146 | lora_path = folder_paths.get_full_path("loras", lora_name["content"]) 147 | lora = None 148 | if self.loaded_lora is not None: 149 | if self.loaded_lora[0] == lora_path: 150 | lora = self.loaded_lora[1] 151 | else: 152 | temp = self.loaded_lora 153 | self.loaded_lora = None 154 | del temp 155 | 156 | if lora is None: 157 | lora = load_torch_file(lora_path, safe_load=True) 158 | self.loaded_lora = (lora_path, lora) 159 | 160 | model_lora, clip_lora = load_lora_for_models(model, clip, lora, strength_model, strength_clip) 161 | if enable_preview: 162 | _, preview = copy_preview_to_temp(lora_name["image"]) 163 | if preview is not None: 164 | preview_output = { 165 | "filename": preview, 166 | "subfolder": "lora_preview", 167 | "type": "temp" 168 | } 169 | return {"ui": {"images": [preview_output]}, "result": (model_lora, clip_lora, civitai_tags_list, meta_tags_list, lora_name["content"])} 170 | 171 | 172 | return (model_lora, clip_lora, civitai_tags_list, meta_tags_list, lora_name["content"]) 173 | 174 | class LoraLoaderStackedAdvanced: 175 | @classmethod 176 | def INPUT_TYPES(s): 177 | LORA_LIST = folder_paths.get_filename_list("loras") 178 | populate_items(LORA_LIST, "loras") 179 | return { 180 | "required": { 181 | "lora_name": (LORA_LIST,), 182 | "lora_weight": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 183 | "force_fetch": ("BOOLEAN", {"default": False}), 184 | "enable_preview": ("BOOLEAN", {"default": False}), 185 | "append_loraname_if_empty": ("BOOLEAN", {"default": False}), 186 | }, 187 | "optional": { 188 | "lora_stack": ("LORA_STACK", ), 189 | "override_lora_name":("STRING", {"forceInput": True}), 190 | } 191 | } 192 | 193 | RETURN_TYPES = ("LIST", "LIST", "LORA_STACK", "STRING") 194 | RETURN_NAMES = ("civitai_tags_list", "meta_tags_list", "LORA_STACK", "lora_name") 195 | FUNCTION = "set_stack" 196 | #OUTPUT_NODE = False 197 | CATEGORY = "autotrigger" 198 | 199 | def set_stack(self, lora_name, lora_weight, force_fetch, enable_preview, append_loraname_if_empty, lora_stack=None, override_lora_name=""): 200 | if override_lora_name != "": 201 | has_preview, prev = get_preview_path(override_lora_name, "loras") 202 | prev = f"loras/{prev}" if has_preview else None 203 | lora_name = {"content": override_lora_name, "image": prev, "type": "loras"} 204 | 205 | 
civitai_tags_list = load_and_save_tags(lora_name["content"], force_fetch) 206 | 207 | meta_tags = get_metadata(lora_name["content"], "loras") 208 | meta_tags_list = sort_tags_by_frequency(meta_tags) 209 | 210 | civitai_tags_list = append_lora_name_if_empty(civitai_tags_list, lora_name["content"], append_loraname_if_empty) 211 | meta_tags_list = append_lora_name_if_empty(meta_tags_list, lora_name["content"], append_loraname_if_empty) 212 | 213 | loras = [(lora_name["content"],lora_weight,lora_weight,)] 214 | if lora_stack is not None: 215 | loras.extend(lora_stack) 216 | 217 | if enable_preview: 218 | _, preview = copy_preview_to_temp(lora_name["image"]) 219 | if preview is not None: 220 | preview_output = { 221 | "filename": preview, 222 | "subfolder": "lora_preview", 223 | "type": "temp" 224 | } 225 | return {"ui": {"images": [preview_output]}, "result": (civitai_tags_list, meta_tags_list, loras, lora_name["content"])} 226 | 227 | return {"result": (civitai_tags_list, meta_tags_list, loras, lora_name["content"])} 228 | 229 | class LoraTagsOnly: 230 | @classmethod 231 | def INPUT_TYPES(s): 232 | LORA_LIST = sorted(folder_paths.get_filename_list("loras"), key=str.lower) 233 | return { 234 | "required": { 235 | "lora_name": (LORA_LIST,), 236 | "force_fetch": ("BOOLEAN", {"default": False}), 237 | "append_loraname_if_empty": ("BOOLEAN", {"default": False}), 238 | }, 239 | "optional": { 240 | "override_lora_name":("STRING", {"forceInput": True}), 241 | } 242 | } 243 | 244 | RETURN_TYPES = ("LIST", "LIST") 245 | RETURN_NAMES = ("civitai_tags_list", "meta_tags_list") 246 | FUNCTION = "ask_lora" 247 | CATEGORY = "autotrigger" 248 | 249 | def ask_lora(self, lora_name, force_fetch, append_loraname_if_empty, override_lora_name=""): 250 | if override_lora_name != "": 251 | lora_name = override_lora_name 252 | meta_tags_list = sort_tags_by_frequency(get_metadata(lora_name, "loras")) 253 | civitai_tags_list = load_and_save_tags(lora_name, force_fetch) 254 | 255 | meta_tags_list = append_lora_name_if_empty(meta_tags_list, lora_name, append_loraname_if_empty) 256 | civitai_tags_list = append_lora_name_if_empty(civitai_tags_list, lora_name, append_loraname_if_empty) 257 | 258 | return (civitai_tags_list, meta_tags_list) 259 | 260 | # A dictionary that contains all nodes you want to export with their names 261 | # NOTE: names should be globally unique 262 | NODE_CLASS_MAPPINGS = { 263 | "LoraLoaderVanilla": LoraLoaderVanilla, 264 | "LoraLoaderStackedVanilla": LoraLoaderStackedVanilla, 265 | "LoraLoaderAdvanced": LoraLoaderAdvanced, 266 | "LoraLoaderStackedAdvanced": LoraLoaderStackedAdvanced, 267 | "LoraTagsOnly": LoraTagsOnly, 268 | } 269 | 270 | # A dictionary that contains the friendly/humanly readable titles for the nodes 271 | NODE_DISPLAY_NAME_MAPPINGS = { 272 | "LoraLoaderVanilla": "LoraLoaderVanilla", 273 | "LoraLoaderStackedVanilla": "LoraLoaderStackedVanilla", 274 | "LoraLoaderAdvanced": "LoraLoaderAdvanced", 275 | "LoraLoaderStackedAdvanced": "LoraLoaderStackedAdvanced", 276 | "LoraTagsOnly": "LoraTagsOnly", 277 | } 278 | -------------------------------------------------------------------------------- /nodes_utils.py: -------------------------------------------------------------------------------- 1 | from .utils import * 2 | 3 | class FusionText: 4 | @classmethod 5 | def INPUT_TYPES(s): 6 | return {"required": {"text_1": ("STRING", {"default": "", "forceInput": True}), "text_2": ("STRING", {"default": "", "forceInput": True})}} 7 | RETURN_TYPES = ("STRING",) 8 | FUNCTION = "combine" 9 | 
CATEGORY = "autotrigger" 10 | 11 | def combine(self, text_1, text_2): 12 | return (text_1 + text_2, ) 13 | 14 | 15 | class Randomizer: 16 | @classmethod 17 | def INPUT_TYPES(s): 18 | return { 19 | "required": { 20 | "text_1":("STRING", {"forceInput": True}), 21 | "text_2":("STRING", {"forceInput": True} ), 22 | "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}), 23 | }, 24 | "optional": { 25 | "lora_1":("LORA_STACK", ), 26 | "lora_2":("LORA_STACK", ), 27 | } 28 | } 29 | 30 | RETURN_TYPES = ("STRING", "LORA_STACK") 31 | RETURN_NAMES = ("text", "lora stack") 32 | FUNCTION = "randomize" 33 | 34 | #OUTPUT_NODE = False 35 | 36 | CATEGORY = "autotrigger" 37 | 38 | def randomize(self, text_1, text_2, seed, lora_1=[], lora_2=[]): 39 | if seed %2 == 0: 40 | return (text_1, lora_1) 41 | return (text_2, lora_2) 42 | 43 | class TextInputBasic: 44 | @classmethod 45 | def INPUT_TYPES(s): 46 | return { 47 | "required": { 48 | "text":("STRING", {"default":"", "multiline":True}), 49 | }, 50 | "optional": { 51 | "prefix":("STRING", {"default":"", "forceInput": True}), 52 | "suffix":("STRING", {"default":"", "forceInput": True}), 53 | } 54 | } 55 | 56 | RETURN_TYPES = ("STRING",) 57 | RETURN_NAMES = ("text", ) 58 | FUNCTION = "get_text" 59 | 60 | #OUTPUT_NODE = False 61 | 62 | CATEGORY = "autotrigger" 63 | 64 | def get_text(self, text, prefix="", suffix=""): 65 | return (prefix + text + suffix, ) 66 | 67 | 68 | class TagsSelector: 69 | @classmethod 70 | def INPUT_TYPES(s): 71 | return { 72 | "required": { 73 | "tags_list": ("LIST", {"default": []}), 74 | "selector": ("STRING", {"default": ":"}), 75 | "weight": ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01}), 76 | "ensure_comma": ("BOOLEAN", {"default": True}) 77 | }, 78 | "optional": { 79 | "prefix":("STRING", {"default":"", "forceInput": True}), 80 | "suffix":("STRING", {"default":"", "forceInput": True}), 81 | } 82 | } 83 | 84 | RETURN_TYPES = ("STRING",) 85 | FUNCTION = "select_tags" 86 | CATEGORY = "autotrigger" 87 | 88 | def select_tags(self, tags_list, selector, weight, ensure_comma, prefix="", suffix=""): 89 | if weight != 1.0: 90 | tags_list = [f"({tag}:{weight})" for tag in tags_list] 91 | output = parse_selector(selector, tags_list) 92 | if ensure_comma: 93 | striped_prefix = prefix.strip() 94 | striped_suffix = suffix.strip() 95 | if striped_prefix != "" and not striped_prefix.endswith(",") and output != "" and not output.startswith(","): 96 | prefix = striped_prefix + ", " 97 | if output != "" and not output.endswith(",") and striped_suffix != "" and not striped_suffix.startswith(","): 98 | suffix = ", " + striped_suffix 99 | return (prefix + output + suffix, ) 100 | 101 | class TagsFormater: 102 | @classmethod 103 | def INPUT_TYPES(s): 104 | return { 105 | "required": { 106 | "tags_list": ("LIST", {"default": []}), 107 | }, 108 | } 109 | 110 | RETURN_TYPES = ("STRING",) 111 | FUNCTION = "format_tags" 112 | CATEGORY = "autotrigger" 113 | 114 | def format_tags(self, tags_list): 115 | output = "" 116 | i = 0 117 | for tag in tags_list: 118 | output += f'{i} : "{tag}"\n' 119 | i+=1 120 | 121 | return (output,) 122 | 123 | class LoraListNames: 124 | @classmethod 125 | def INPUT_TYPES(s): 126 | LORA_LIST = sorted(folder_paths.get_filename_list("loras"), key=str.lower) 127 | return { 128 | "required": { 129 | "lora_name": (LORA_LIST,), 130 | } 131 | } 132 | 133 | RETURN_TYPES = ("STRING",) 134 | RETURN_NAMES = ("lora_name",) 135 | FUNCTION = "output_selected" 136 | CATEGORY = "autotrigger" 137 | 138 | def 
output_selected(self, lora_name): 139 | name = lora_name 140 | return (name,) 141 | 142 | # A dictionary that contains all nodes you want to export with their names 143 | # NOTE: names should be globally unique 144 | NODE_CLASS_MAPPINGS = { 145 | "Randomizer": Randomizer, 146 | "FusionText": FusionText, 147 | "TextInputBasic": TextInputBasic, 148 | "TagsSelector": TagsSelector, 149 | "TagsFormater": TagsFormater, 150 | "LoraListNames": LoraListNames, 151 | } 152 | 153 | # A dictionary that contains the friendly/humanly readable titles for the nodes 154 | NODE_DISPLAY_NAME_MAPPINGS = { 155 | "Randomizer": "Randomizer", 156 | "FusionText": "FusionText", 157 | "TextInputBasic": "TextInputBasic", 158 | "TagsSelector": "TagsSelector", 159 | "TagsFormater": "TagsFormater", 160 | "LoraListNames": "LoraListNames", 161 | } 162 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [project] 2 | name = "comfyui-lora-auto-trigger-words" 3 | description = "The aim of these custom nodes is to get easy access to the tags used to trigger a lora / lycoris. Extracts the tags from Civitai or from the safetensors metadata when available." 4 | version = "1.0.4" 5 | license = "MIT" 6 | 7 | [project.urls] 8 | Repository = "https://github.com/idrirap/ComfyUI-Lora-Auto-Trigger-Words" 9 | # Used by Comfy Registry https://comfyregistry.org 10 | 11 | [tool.comfy] 12 | PublisherId = "dijkstra" 13 | DisplayName = "ComfyUI-Lora-Auto-Trigger-Words" 14 | Icon = "" 15 | -------------------------------------------------------------------------------- /utils.py: -------------------------------------------------------------------------------- 1 | import folder_paths 2 | import hashlib 3 | import json 4 | import os 5 | import requests 6 | import shutil 7 | 8 | def get_preview_path(name, type): 9 | file_name = os.path.splitext(name)[0] 10 | file_path = folder_paths.get_full_path(type, name) 11 | 12 | if file_path is None: 13 | print(f"Unable to get path for {type} {name}") 14 | return False, None  # keep the (has_image, item_image) pair so callers can unpack it safely 15 | 16 | file_path_no_ext = os.path.splitext(file_path)[0] 17 | item_image=None 18 | for ext in ["png", "jpg", "jpeg", "gif"]: 19 | if item_image is not None: 20 | break 21 | for ext2 in ["", ".preview"]: 22 | has_image = os.path.isfile(file_path_no_ext + ext2 + "."
+ ext) 23 | if has_image: 24 | item_image = f"{file_name}{ext2}.{ext}" 25 | break 26 | 27 | return has_image, item_image 28 | 29 | 30 | def copy_preview_to_temp(file_name): 31 | if file_name is None: 32 | return None, None 33 | base_name = os.path.basename(file_name) 34 | lora_less = "/".join(file_name.split("/")[1:]) 35 | 36 | file_path = folder_paths.get_full_path("loras", lora_less) 37 | if file_path is None: 38 | return None, None 39 | 40 | temp_path = folder_paths.get_temp_directory() 41 | preview_path = os.path.join(temp_path, "lora_preview") 42 | if not os.path.isdir(preview_path) : 43 | os.makedirs(preview_path) 44 | preview_path = os.path.join(preview_path, base_name) 45 | 46 | 47 | shutil.copyfile(file_path, preview_path) 48 | return preview_path, base_name 49 | 50 | # add previews in selectors 51 | def populate_items(names, type): 52 | for idx, item_name in enumerate(names): 53 | 54 | has_image, item_image = get_preview_path(item_name, type) 55 | 56 | names[idx] = { 57 | "content": item_name, 58 | "image": f"{type}/{item_image}" if has_image else None, 59 | "type": "loras", 60 | } 61 | names.sort(key=lambda i: i["content"].lower()) 62 | 63 | 64 | def load_json_from_file(file_path): 65 | try: 66 | with open(file_path, 'r') as json_file: 67 | data = json.load(json_file) 68 | return data 69 | except FileNotFoundError: 70 | print(f"File not found: {file_path}") 71 | return None 72 | except json.JSONDecodeError: 73 | print(f"Error decoding JSON in file: {file_path}") 74 | raise 75 | 76 | def save_dict_to_json(data_dict, file_path): 77 | try: 78 | with open(file_path, 'w') as json_file: 79 | json.dump(data_dict, json_file, indent=4) 80 | print(f"Data saved to {file_path}") 81 | except Exception as e: 82 | print(f"Error saving JSON to file: {e}") 83 | 84 | def get_model_version_info(hash_value): 85 | api_url = f"https://civitai.com/api/v1/model-versions/by-hash/{hash_value}" 86 | try: 87 | response = requests.get(api_url) 88 | except Exception as e: 89 | print(f"[Lora-Auto-Trigger] {e}") 90 | return None 91 | if response.status_code == 200: 92 | return response.json() 93 | else: 94 | return None 95 | 96 | def calculate_sha256(file_path): 97 | sha256_hash = hashlib.sha256() 98 | with open(file_path, "rb") as f: 99 | for chunk in iter(lambda: f.read(4096), b""): 100 | sha256_hash.update(chunk) 101 | return sha256_hash.hexdigest() 102 | 103 | 104 | def load_and_save_tags(lora_name, force_fetch): 105 | json_tags_path = "./loras_tags.json" 106 | lora_tags = load_json_from_file(json_tags_path) 107 | output_tags = lora_tags.get(lora_name, None) if lora_tags is not None else None 108 | if output_tags is not None: 109 | output_tags_list = output_tags 110 | else: 111 | output_tags_list = [] 112 | 113 | lora_path = folder_paths.get_full_path("loras", lora_name) 114 | if lora_tags is None or force_fetch or output_tags is None: # search on civitai only if no local cache or forced 115 | print("[Lora-Auto-Trigger] calculating lora hash") 116 | LORAsha256 = calculate_sha256(lora_path) 117 | print("[Lora-Auto-Trigger] requesting infos") 118 | model_info = get_model_version_info(LORAsha256) 119 | if model_info is not None: 120 | if "trainedWords" in model_info: 121 | print("[Lora-Auto-Trigger] tags found!") 122 | if lora_tags is None: 123 | lora_tags = {} 124 | lora_tags[lora_name] = model_info["trainedWords"] 125 | save_dict_to_json(lora_tags, json_tags_path) 126 | output_tags_list = model_info["trainedWords"] 127 | else: 128 | print("[Lora-Auto-Trigger] No informations found.") 129 | if lora_tags is 
None: 130 | lora_tags = {} 131 | lora_tags[lora_name] = [] 132 | save_dict_to_json(lora_tags,json_tags_path) 133 | 134 | return output_tags_list 135 | 136 | def show_list(list_input): 137 | i = 0 138 | output = "" 139 | for debug in list_input: 140 | output += f"{i} : {debug}\n" 141 | i+=1 142 | return output 143 | 144 | def get_metadata(filepath, type): 145 | filepath = folder_paths.get_full_path(type, filepath) 146 | with open(filepath, "rb") as file: 147 | # https://github.com/huggingface/safetensors#format 148 | # 8 bytes: N, an unsigned little-endian 64-bit integer, containing the size of the header 149 | header_size = int.from_bytes(file.read(8), "little", signed=False) 150 | 151 | if header_size <= 0: 152 | raise BufferError("Invalid header size") 153 | 154 | header = file.read(header_size) 155 | if header_size <= 0: 156 | raise BufferError("Invalid header") 157 | header_json = json.loads(header) 158 | return header_json["__metadata__"] if "__metadata__" in header_json else None 159 | 160 | # parse the __metadata__ json looking for trained tags 161 | def sort_tags_by_frequency(meta_tags): 162 | if meta_tags is None: 163 | return [] 164 | if "ss_tag_frequency" in meta_tags: 165 | meta_tags = meta_tags["ss_tag_frequency"] 166 | meta_tags = json.loads(meta_tags) 167 | sorted_tags = {} 168 | for _, dataset in meta_tags.items(): 169 | for tag, count in dataset.items(): 170 | tag = str(tag).strip() 171 | if tag in sorted_tags: 172 | sorted_tags[tag] = sorted_tags[tag] + count 173 | else: 174 | sorted_tags[tag] = count 175 | # sort tags by training frequency. Most seen tags firsts 176 | sorted_tags = dict(sorted(sorted_tags.items(), key=lambda item: item[1], reverse=True)) 177 | return list(sorted_tags.keys()) 178 | else: 179 | return [] 180 | 181 | def parse_selector(selector, tags_list): 182 | if len(tags_list) == 0: 183 | return "" 184 | range_index_list = selector.split(",") 185 | output = {} 186 | for range_index in range_index_list: 187 | # single value 188 | if range_index.count(":") == 0: 189 | # remove empty values 190 | if range_index.strip() == "": 191 | continue 192 | index = int(range_index) 193 | # ignore out of bound indexes 194 | if abs(index) > len(tags_list) - 1: 195 | continue 196 | output[index] = tags_list[index] 197 | 198 | # actual range 199 | if range_index.count(":") == 1: 200 | indexes = range_index.split(":") 201 | # check empty 202 | if indexes[0] == "": 203 | start = 0 204 | else: 205 | start = int(indexes[0]) 206 | if indexes[1] == "": 207 | end = len(tags_list) 208 | else: 209 | end = int(indexes[1]) 210 | # check negative 211 | if start < 0: 212 | start = len(tags_list) + start 213 | if end < 0: 214 | end = len(tags_list) + end 215 | # clamp start and end values within list boundaries 216 | start, end = min(start, len(tags_list)), min(end, len(tags_list)) 217 | start, end = max(start, 0), max(end, 0) 218 | # merge all 219 | for i in range(start, end): 220 | output[i] = tags_list[i] 221 | return ", ".join(list(output.values())) 222 | 223 | def append_lora_name_if_empty(tags_list, lora_path, enabled): 224 | if not enabled or len(tags_list) > 0: 225 | return tags_list 226 | filename = os.path.splitext(lora_path)[0] 227 | filename = os.path.basename(filename) 228 | 229 | tags_list.append(filename) 230 | return tags_list 231 | -------------------------------------------------------------------------------- /web/js/loraInfo.js: -------------------------------------------------------------------------------- 1 | import { app } from "../../../scripts/app.js"; 2 | 
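// Note: this import reaches into the web assets of pythongosssss/ComfyUI-Custom-Scripts,
// so that extension must be installed for the "View info..." menu entry to work.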
import { LoraInfoDialog } from "../../ComfyUI-Custom-Scripts/js/modelInfo.js"; 3 | 4 | const infoHandlers = { 5 | "LoraLoaderVanilla":true, 6 | "LoraLoaderStackedVanilla":true, 7 | "LoraLoaderAdvanced":true, 8 | "LoraLoaderStackedAdvanced":true 9 | } 10 | 11 | app.registerExtension({ 12 | name: "autotrigger.LoraInfo", 13 | beforeRegisterNodeDef(nodeType) { 14 | if (! infoHandlers[nodeType.comfyClass]) { 15 | return; 16 | } 17 | const getExtraMenuOptions = nodeType.prototype.getExtraMenuOptions; 18 | nodeType.prototype.getExtraMenuOptions = function (_, options) { 19 | let value = this.widgets[0].value; 20 | if (!value) { 21 | return; 22 | } 23 | if (value.content) { 24 | value = value.content; 25 | } 26 | options.unshift({ 27 | content: "View info...", 28 | callback: async () => { 29 | new LoraInfoDialog(value).show("loras", value); 30 | }, 31 | }); 32 | 33 | return getExtraMenuOptions?.apply(this, arguments); 34 | }; 35 | } 36 | }); --------------------------------------------------------------------------------