├── .DS_Store
├── README.md
├── README_CN.md
├── __init__.py
├── image
│   ├── story_lora.png
│   └── story_with_inf.png
├── nodes.py
├── story_attention.py
└── workflow
    ├── story_base.json
    ├── story_with_inf.json
    └── story_with_lora.json
/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SeaArtLab/comfyui_storydiffusion/9a6e3e379e5b790f128a8a15403f3fc3aab11d24/.DS_Store
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # comfyui-storydiffusion
2 | This project is a ComfyUI implementation of StoryDiffusion. It does not use the diffusers code structure, so it is fully compatible with all of ComfyUI's UNet model structures: in addition to the original SDXL it also supports SD1.5 and SD2.x, as well as components such as LoRA and clip_skip. SeaArtApplyStory patches some of ComfyUI's original modules; following the paper's implementation, only the attention modules in the UNet upsampling stage are affected. Because id_length information is cached inside the attention modules during processing, the cache is currently kept on the CPU rather than on CUDA, so we recommend not making id_length too large. We also provide SeaArtStoryKSampler and SeaArtStoryKSamplerAdvanced, which are essentially thin wrappers around ComfyUI's samplers; the model they return retains the necessary cache information so that you can keep using it in the story's subsequent generation mode (write=false). Once write=false, the cache no longer grows, so in theory you can generate an unlimited number of images. The notes below cover what to watch out for when using each component.
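The write/read caching described above can be sketched in a few lines of plain Python. This is a hypothetical illustration only — `StoryCache` and `attend` are illustrative names, not part of this repo, and the real cached entries are attention hidden states kept on the CPU:

```python
# Minimal sketch of the two-phase cache: the write phase stores each step's
# id-image features, and the read phase reuses them as extra attention context
# without ever growing the bank, which is why generation can continue
# indefinitely once write is False.
class StoryCache:
    def __init__(self):
        self.id_bank = {}    # step index -> cached id-image features
        self.write = True    # True while the id images are being generated

    def attend(self, step, features):
        if self.write:
            # write phase: remember this step's features (kept on CPU in the real node)
            self.id_bank[step] = list(features)
            return list(features)
        # read phase: prepend the cached features as extra context; the bank
        # is read-only here, so memory use stays constant
        return self.id_bank[step] + list(features)
```

For example, `StoryCache().attend(0, feats)` during the write stage caches `feats`; calling it again after setting `write = False` returns the cached features concatenated with the new ones.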
3 | 
4 | SeaArtApplyStory starts the write=true stage. width and height are the size of the images you generate, and id_length is the total number of images the attention layer attends to, so the number of conditions, the number of latents, and id_length must all match.
5 | 
6 | SeaArtStoryKSampler and SeaArtStoryKSamplerAdvanced are simple wrappers around ComfyUI's original samplers that return the model structure carrying the extra cached information.
7 | 
8 | 
9 | After SeaArtStoryInfKSampler or SeaArtStoryInfKSamplerAdvanced, the story enters the write=false stage and the cache no longer grows. You can connect any sampler afterwards to keep the character consistent.
10 | 
11 | SeaArtMergeStoryCondition is an extra component for organizing conditions. Because CLIP's maximum length is 77 tokens, conditions that are too long are detected automatically and all conditions are padded to a common length so that attention can be computed. If the lengths already match, no extra handling is needed.
12 | 
13 | SeaArtCharactorPrompt and SeaArtAppendPrompt simply concatenate strings; you can of course obtain the strings some other way.
14 | 
15 | If you want to use this with ELLA, you may need to adapt the components slightly, or star the repo and leave us an issue. We will consider handling all issues, and compatibility with the various ComfyUI ecosystems, in our spare time. If you like this project, please give us a star; your encouragement is our motivation to keep updating.
16 | (We also believe this method may be applicable to DiT, and we will consider supporting the upcoming SD3 once we have evaluated the results.)
17 | 
18 | In principle this project can support most components in the ComfyUI ecosystem, but adapters and the like have not been tested.
19 | This project does not include handling of strings such as style prompts, nor the process of writing text onto the images.
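The padding rule SeaArtMergeStoryCondition applies can be sketched with plain token counts. This is a hypothetical illustration — `pad_token_counts` is not a function in this repo; the real node concatenates empty-prompt embeddings instead of adding integers:

```python
# Sketch of the 77-token alignment: every condition is padded with whole
# empty-prompt chunks (77 tokens each) until it matches the longest condition.
def pad_token_counts(lengths, chunk=77):
    target = max(lengths)
    padded = []
    for n in lengths:
        missing_chunks = (target - n) // chunk   # how many empty chunks to append
        padded.append(n + missing_chunks * chunk)
    return padded
```

With conditions of 77, 154, and 231 tokens, all three end up at 231 tokens, so their embeddings can be stacked for attention.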
20 | ## Workflow
21 | We have prepared usage examples, including one that uses LoRA. To keep the demonstration simple, the examples are minimal and you do not need to install any other plugins.
22 | ![story_with_lora](./image/story_lora.png)
23 | ![story_with_inf](./image/story_with_inf.png)
24 | Finally, special thanks to [StoryDiffusion](https://github.com/HVision-NKU/StoryDiffusion)
--------------------------------------------------------------------------------
/README_CN.md:
--------------------------------------------------------------------------------
1 | # comfyui-storydiffusion
2 | This project is a ComfyUI implementation of StoryDiffusion. It does not use the diffusers code structure, so it is fully compatible with all of ComfyUI's UNet model structures: in addition to the original SDXL it also supports SD1.5 and SD2.x, including components such as LoRA and clip_skip. SeaArtApplyStory patches some of ComfyUI's original modules; following the paper's implementation, only the attention modules in the UNet upsampling stage are affected. Because id_length information is cached in the attention modules during processing, we currently keep it on the CPU rather than on CUDA, so we recommend not making id_length too large. We also provide SeaArtStoryKSampler and SeaArtStoryKSamplerAdvanced, which are essentially simple wrappers around ComfyUI's samplers; the returned model keeps the necessary cache information so you can continue to use it in the story's subsequent generation mode (write=false). After write=false, the cache no longer grows, so in theory you can generate indefinitely. The following are the points to note when using the components.
3 | SeaArtApplyStory is the write=true stage. width and height are the size of the images you generate, and id_length is the total number of images the attention layer needs to attend to, so the number of conditions, the number of latents, and id_length must be kept consistent.
4 | SeaArtStoryKSampler and SeaArtStoryKSamplerAdvanced are simple wrappers around ComfyUI's original samplers, returning the model structure with the extra cached information.
5 | After SeaArtStoryInfKSampler or SeaArtStoryInfKSamplerAdvanced, the story enters the write=false stage and the cache no longer grows. You can connect any sampler afterwards to keep the character consistent.
6 | SeaArtMergeStoryCondition is an extra component for organizing conditions. Because CLIP's maximum length is 77 tokens, we automatically detect conditions that are too long and pad all conditions to a common length so that attention can be computed. If the lengths already match, no extra handling is needed.
7 | SeaArtCharactorPrompt and SeaArtAppendPrompt simply concatenate strings; you can of course obtain the strings some other way.
8 | 
9 | If you want to use this with ELLA, you may need to adapt the components slightly, or star the repo and leave us an issue. We will consider handling all issues, and compatibility with the various ComfyUI ecosystems, in our spare time. If you like this project, please give us a star; your encouragement is our motivation to keep updating.
10 | (We also believe this method may be applicable to DiT, and we will consider supporting the upcoming SD3 once we have evaluated the results.)
11 | In principle this project can support most components in the ComfyUI ecosystem, but adapters and the like have not been tested.
12 | This project does not include handling of strings such as style prompts, nor the process of writing text onto the images.
13 | ## Workflow
14 | We have prepared usage examples, including one that uses LoRA. To keep the demonstration simple, the examples are minimal and you do not need to install any other plugins.
15 | ![story_with_lora](./image/story_lora.png)
16 | ![story_with_inf](./image/story_with_inf.png)
17 | Finally, special thanks to [StoryDiffusion](https://github.com/HVision-NKU/StoryDiffusion)
--------------------------------------------------------------------------------
/__init__.py:
--------------------------------------------------------------------------------
1 | from . import nodes as nodes
2 | 
3 | NODE_CLASS_MAPPINGS = {
4 |     "SeaArtApplyStory": nodes.SeaArtApplyStory,
5 |     "SeaArtMergeStoryCondition": nodes.SeaArtMergeStoryCondition,
6 |     "SeaArtAppendPrompt": nodes.SeaArtAppendPrompt,
7 |     "SeaArtCharactorPrompt": nodes.SeaArtCharactorPrompt,
8 |     "SeaArtStoryKSampler": nodes.SeaArtStoryKSampler,
9 |     "SeaArtStoryKSamplerAdvanced": nodes.SeaArtStoryKSamplerAdvanced,
10 |     "SeaArtStoryInfKSampler": nodes.SeaArtStoryInfKSampler,
11 |     "SeaArtStoryInfKSamplerAdvanced": nodes.SeaArtStoryInfKSamplerAdvanced,
12 |     "SeaArtFirstImage": nodes.SeaArtFirstImage,
13 | }
14 | 
15 | NODE_DISPLAY_NAME_MAPPINGS = {
16 |     "SeaArtApplyStory": "SeaArtApplyStory",
17 |     "SeaArtMergeStoryCondition": "SeaArtMergeStoryCondition",
18 |     "SeaArtAppendPrompt": "SeaArtAppendPrompt",
19 |     "SeaArtCharactorPrompt": "SeaArtCharactorPrompt",
20 |     "SeaArtStoryKSampler": "SeaArtStoryKSampler",
21 |     "SeaArtStoryKSamplerAdvanced": "SeaArtStoryKSamplerAdvanced",
22 |     "SeaArtStoryInfKSampler": "SeaArtStoryInfKSampler",
23 |     "SeaArtStoryInfKSamplerAdvanced": "SeaArtStoryInfKSamplerAdvanced",
24 |     "SeaArtFirstImage": "SeaArtFirstImage"
25 | }
26 | 
--------------------------------------------------------------------------------
/image/story_lora.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SeaArtLab/comfyui_storydiffusion/9a6e3e379e5b790f128a8a15403f3fc3aab11d24/image/story_lora.png
--------------------------------------------------------------------------------
/image/story_with_inf.png:
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/SeaArtLab/comfyui_storydiffusion/9a6e3e379e5b790f128a8a15403f3fc3aab11d24/image/story_with_inf.png -------------------------------------------------------------------------------- /nodes.py: -------------------------------------------------------------------------------- 1 | from . import story_attention 2 | from comfy.ldm.modules.attention import SpatialTransformer 3 | import comfy.model_management as model_management 4 | import torch 5 | import comfy.samplers 6 | import nodes 7 | import copy 8 | 9 | class SeaArtApplyStory: 10 | @classmethod 11 | def INPUT_TYPES(s): 12 | return {"required": { 13 | "model": ("MODEL",), 14 | "id_length": ("INT", {"default": 3, "min": 0, "max": 10, "step": 1}), 15 | "width": ("INT", {"default": 1024, "min": 64, "max": 4096, "step": 1}), 16 | "height": ("INT", {"default": 1024, "min": 64, "max": 4096, "step": 1}), 17 | "same": ("FLOAT", {"default": 0.5, "min": 0.0, "max": 1.0, "step":0.01, "round": 0.01}), 18 | }} 19 | RETURN_TYPES = ("MODEL",) 20 | FUNCTION = "apply" 21 | 22 | CATEGORY = "SeaArt" 23 | 24 | def apply(self, model, id_length, width, height, same): 25 | import gc 26 | import comfy.model_management as model_management 27 | gc.collect() 28 | model_management.soft_empty_cache(True) 29 | model = model.clone() 30 | model.model = copy.deepcopy(model.model) 31 | story_attention.width = width 32 | story_attention.height = height 33 | story_attention.total_count = 0 34 | story_attention.total_length = id_length + 1 35 | story_attention.id_length = id_length 36 | story_attention.mask1024, story_attention.mask4096 = story_attention.cal_attn_mask_xl( 37 | story_attention.total_length,story_attention.id_length,story_attention.sa32,story_attention.sa64,story_attention.height,story_attention.width,device=model_management.get_torch_device(),dtype= torch.float16) 38 | story_attention.write = True 39 | story_attention.sa32 = 
same
40 |         story_attention.sa64 = same
41 |         for time_step in model.model.diffusion_model.output_blocks:
42 |             # whether the block contains upsampling
43 |             for block in time_step:
44 |                 if isinstance(block, SpatialTransformer):
45 |                     for transformer_block in block.transformer_blocks:
46 |                         if transformer_block.attn1 is not None:
47 |                             transformer_block.attn1 = story_attention.StoryCrossAttention(transformer_block.attn1,torch.float16)
48 |                             story_attention.total_count += 1
49 |         print("total patch:",story_attention.total_count)
50 |         return (model,)
51 | 
52 | class SeaArtCharactorPrompt:
53 |     @classmethod
54 |     def INPUT_TYPES(s):
55 |         return {"required": {
56 |             "prompt": ("STRING", {"multiline": True,"default": ""})},
57 |         }
58 | 
59 |     RETURN_TYPES = ("STRING",)
60 |     FUNCTION = "input"
61 | 
62 |     CATEGORY = "SeaArt"
63 | 
64 |     def input(self,prompt):
65 |         return (prompt,)
66 | 
67 | class SeaArtAppendPrompt:
68 |     @classmethod
69 |     def INPUT_TYPES(s):
70 |         return {"required": {
71 |             "charactor_prompt": ("STRING", {"multiline": True,"default": ""}),
72 |             "prompt": ("STRING", {"multiline": True,"default": ""})
73 |         },
74 |     }
75 | 
76 |     RETURN_TYPES = ("STRING",)
77 |     FUNCTION = "input"
78 | 
79 |     CATEGORY = "SeaArt"
80 | 
81 |     def input(self,charactor_prompt,prompt):
82 |         return (charactor_prompt + ", " + prompt,)
83 | 
84 | class SeaArtMergeStoryCondition:
85 |     @classmethod
86 |     def INPUT_TYPES(s):
87 |         return {"required": {
88 |             "clip": ("CLIP", ),
89 |             "conditioning_1": ("CONDITIONING", ),
90 |             },
91 |             "optional":{
92 |             "conditioning_2": ("CONDITIONING", ),
93 |             "conditioning_3": ("CONDITIONING", ),
94 |             "conditioning_4": ("CONDITIONING", ),
95 |             "conditioning_5": ("CONDITIONING", )
96 |         }}
97 | 
98 |     RETURN_TYPES = ("CONDITIONING",)
99 |     FUNCTION = "merge"
100 | 
101 |     CATEGORY = "SeaArt"
102 | 
103 |     def clip_tensor_clean(self,clip,list):
104 |         ## find the tensor with the largest token dimension
105 |         max_token = 0
106 |         for item in list:
107 |             max_token = max(max_token,item.shape[1])
108 |         tokens = clip.tokenize("")
109 |         empty_cond, empty_pooled = 
clip.encode_from_tokens(tokens, return_pooled=True)
110 |         for index,item in enumerate(list):
111 |             if item.shape[1] < max_token:
112 |                 r = (max_token - item.shape[1]) // 77
113 |                 for i in range(r):
114 |                     list[index] = torch.cat((list[index],empty_cond),dim=1)  # pad with the empty-prompt embedding (77 tokens per chunk)
115 |         return list
116 | 
117 |     def merge(self,clip,conditioning_1,conditioning_2=None,conditioning_3=None,conditioning_4=None,conditioning_5=None):
118 |         filter_conds = []
119 |         for x in [conditioning_1,conditioning_2,conditioning_3,conditioning_4,conditioning_5]:
120 |             if isinstance(x,list):
121 |                 for c in x:
122 |                     filter_conds.append(c)
123 |         positive_conds_list = []
124 |         positive_conds_pool_list = []
125 |         for condition in filter_conds:
126 |             positive_conds_list.append(condition[0])
127 |             if len(condition) > 1 and torch.is_tensor(condition[1]["pooled_output"]):
128 |                 positive_conds_pool_list.append(condition[1]["pooled_output"])
129 |         positive_conds_list = self.clip_tensor_clean(clip,positive_conds_list)
130 |         positive_conds = [[torch.cat(positive_conds_list), {"pooled_output": torch.cat(positive_conds_pool_list)}]]
131 |         return (positive_conds,)
132 | 
133 | class SeaArtStoryKSampler:
134 |     @classmethod
135 |     def INPUT_TYPES(s):
136 |         return {"required":
137 |             {"model": ("MODEL",),
138 |             "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
139 |             "steps": ("INT", {"default": 20, "min": 1, "max": 10000}),
140 |             "cfg": ("FLOAT", {"default": 8.0, "min": 0.0, "max": 100.0, "step":0.1, "round": 0.01}),
141 |             "sampler_name": (comfy.samplers.KSampler.SAMPLERS, ),
142 |             "scheduler": (comfy.samplers.KSampler.SCHEDULERS, ),
143 |             "positive": ("CONDITIONING", ),
144 |             "negative": ("CONDITIONING", ),
145 |             "latent_image": ("LATENT", ),
146 |             "denoise": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0, "step": 0.01}),
147 |             }
148 |         }
149 | 
150 |     RETURN_TYPES = ("LATENT","MODEL",)
151 |     FUNCTION = "sample"
152 | 
153 |     CATEGORY = "SeaArt"
154 | 
155 |     def sample(self, model, seed, steps, cfg, 
sampler_name, scheduler, positive, negative, latent_image, denoise=1.0): 156 | story_attention.write = True 157 | story_attention.cur_step = 0 158 | latent = nodes.KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise)[0] 159 | return (latent,model,) 160 | 161 | class SeaArtStoryKSamplerAdvanced: 162 | @classmethod 163 | def INPUT_TYPES(s): 164 | return {"required": 165 | {"model": ("MODEL",), 166 | "add_noise": (["enable", "disable"], ), 167 | "noise_seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}), 168 | "steps": ("INT", {"default": 20, "min": 1, "max": 10000}), 169 | "cfg": ("FLOAT", {"default": 8.0, "min": 0.0, "max": 100.0, "step":0.1, "round": 0.01}), 170 | "sampler_name": (comfy.samplers.KSampler.SAMPLERS, ), 171 | "scheduler": (comfy.samplers.KSampler.SCHEDULERS, ), 172 | "positive": ("CONDITIONING", ), 173 | "negative": ("CONDITIONING", ), 174 | "latent_image": ("LATENT", ), 175 | "start_at_step": ("INT", {"default": 0, "min": 0, "max": 10000}), 176 | "end_at_step": ("INT", {"default": 10000, "min": 0, "max": 10000}), 177 | "return_with_leftover_noise": (["disable", "enable"], ), 178 | } 179 | } 180 | 181 | RETURN_TYPES = ("LATENT","MODEL",) 182 | FUNCTION = "sample" 183 | 184 | CATEGORY = "SeaArt" 185 | 186 | def sample(self, model, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, start_at_step, end_at_step, return_with_leftover_noise, denoise=1.0): 187 | story_attention.write = True 188 | story_attention.cur_step = 0 189 | latent = nodes.KSamplerAdvanced().sample(model, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, start_at_step, end_at_step, return_with_leftover_noise, denoise)[0] 190 | return (latent,model,) 191 | 192 | class SeaArtStoryInfKSampler: 193 | @classmethod 194 | def INPUT_TYPES(s): 195 | return {"required": 196 | {"model": ("MODEL",), 197 | "seed": ("INT", {"default": 0, 
"min": 0, "max": 0xffffffffffffffff}), 198 | "steps": ("INT", {"default": 20, "min": 1, "max": 10000}), 199 | "cfg": ("FLOAT", {"default": 8.0, "min": 0.0, "max": 100.0, "step":0.1, "round": 0.01}), 200 | "sampler_name": (comfy.samplers.KSampler.SAMPLERS, ), 201 | "scheduler": (comfy.samplers.KSampler.SCHEDULERS, ), 202 | "positive": ("CONDITIONING", ), 203 | "negative": ("CONDITIONING", ), 204 | "latent_image": ("LATENT", ), 205 | "denoise": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0, "step": 0.01}), 206 | } 207 | } 208 | 209 | RETURN_TYPES = ("LATENT","MODEL",) 210 | FUNCTION = "sample" 211 | 212 | CATEGORY = "SeaArt" 213 | 214 | def sample(self, model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=1.0): 215 | story_attention.write = False 216 | story_attention.cur_step = 0 217 | latent = nodes.KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise)[0] 218 | return (latent,model,) 219 | 220 | class SeaArtFirstImage: 221 | @classmethod 222 | def INPUT_TYPES(s): 223 | return {"required": 224 | {"image": ("IMAGE", ),}, 225 | } 226 | 227 | 228 | RETURN_TYPES = ("IMAGE",) 229 | FUNCTION = "do" 230 | 231 | CATEGORY = "SeaArt" 232 | 233 | def do(self,image): 234 | image = image[0] 235 | return (image.unsqueeze(0),) 236 | 237 | class SeaArtStoryInfKSamplerAdvanced: 238 | @classmethod 239 | def INPUT_TYPES(s): 240 | return {"required": 241 | {"model": ("MODEL",), 242 | "add_noise": (["enable", "disable"], ), 243 | "noise_seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}), 244 | "steps": ("INT", {"default": 20, "min": 1, "max": 10000}), 245 | "cfg": ("FLOAT", {"default": 8.0, "min": 0.0, "max": 100.0, "step":0.1, "round": 0.01}), 246 | "sampler_name": (comfy.samplers.KSampler.SAMPLERS, ), 247 | "scheduler": (comfy.samplers.KSampler.SCHEDULERS, ), 248 | "positive": ("CONDITIONING", ), 249 | "negative": ("CONDITIONING", ), 250 | "latent_image": ("LATENT", ), 
251 | "start_at_step": ("INT", {"default": 0, "min": 0, "max": 10000}), 252 | "end_at_step": ("INT", {"default": 10000, "min": 0, "max": 10000}), 253 | "return_with_leftover_noise": (["disable", "enable"], ), 254 | } 255 | } 256 | 257 | RETURN_TYPES = ("LATENT","MODEL",) 258 | FUNCTION = "sample" 259 | 260 | CATEGORY = "SeaArt" 261 | 262 | def sample(self, model, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, start_at_step, end_at_step, return_with_leftover_noise, denoise=1.0): 263 | story_attention.write = False 264 | story_attention.cur_step = 0 265 | latent = nodes.KSamplerAdvanced().sample(model, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, start_at_step, end_at_step, return_with_leftover_noise, denoise)[0] 266 | return (latent,model,) 267 | -------------------------------------------------------------------------------- /story_attention.py: -------------------------------------------------------------------------------- 1 | from torch import nn 2 | 3 | import comfy 4 | import comfy.ops as ops 5 | from comfy.ldm.modules.attention import default,CrossAttention,optimized_attention,optimized_attention_masked 6 | import comfy.ops 7 | ops = comfy.ops.disable_weight_init 8 | import torch.nn.functional as F 9 | import torch 10 | import random 11 | 12 | total_count = 0 13 | attn_count = 0 14 | cur_step = 0 15 | mask1024 = None 16 | mask4096 = None 17 | indices1024 = None 18 | indices4096 = None 19 | sa32 = 0.5 20 | sa64 = 0.5 21 | write = False 22 | height = 0 23 | width = 0 24 | id_length = 0 25 | total_length = 0 26 | 27 | def cal_attn_indice_xl_effcient_memory(total_length,id_length,sa32,sa64,height,width,device="cpu",dtype= torch.float16): 28 | nums_1024 = (height // 32) * (width // 32) 29 | nums_4096 = (height // 16) * (width // 16) 30 | bool_matrix1024 = torch.rand((total_length,nums_1024),device = device,dtype = dtype) < sa32 31 | bool_matrix4096 = 
torch.rand((total_length,nums_4096),device = device,dtype = dtype) < sa64
32 |     # use nonzero() to get the indices of all the True values
33 |     indices1024 = [torch.nonzero(bool_matrix1024[i], as_tuple=True)[0] for i in range(total_length)]
34 |     indices4096 = [torch.nonzero(bool_matrix4096[i], as_tuple=True)[0] for i in range(total_length)]
35 | 
36 |     return indices1024,indices4096
37 | 
38 | def cal_attn_mask_xl(total_length,id_length,sa32,sa64,height,width,device="cuda",dtype= torch.float16):
39 |     nums_1024 = (height // 32) * (width // 32)
40 |     nums_4096 = (height // 16) * (width // 16)
41 |     bool_matrix1024 = torch.rand((1, total_length * nums_1024),device = device,dtype = dtype) < sa32
42 |     bool_matrix4096 = torch.rand((1, total_length * nums_4096),device = device,dtype = dtype) < sa64
43 |     bool_matrix1024 = bool_matrix1024.repeat(total_length,1)
44 |     bool_matrix4096 = bool_matrix4096.repeat(total_length,1)
45 |     for i in range(total_length):
46 |         bool_matrix1024[i:i+1,id_length*nums_1024:] = False
47 |         bool_matrix4096[i:i+1,id_length*nums_4096:] = False
48 |         bool_matrix1024[i:i+1,i*nums_1024:(i+1)*nums_1024] = True
49 |         bool_matrix4096[i:i+1,i*nums_4096:(i+1)*nums_4096] = True
50 |     mask1024 = bool_matrix1024.unsqueeze(1).repeat(1,nums_1024,1).reshape(-1,total_length * nums_1024)
51 |     mask4096 = bool_matrix4096.unsqueeze(1).repeat(1,nums_4096,1).reshape(-1,total_length * nums_4096)
52 |     return mask1024,mask4096
53 | 
54 | def __call1__(
55 |     self,
56 |     hidden_states,
57 |     encoder_hidden_states=None,
58 |     attention_mask=None,
59 |     value=None,
60 | ):
61 |     input_ndim = hidden_states.ndim
62 |     if input_ndim == 4:
63 |         total_batch_size, channel, height, width = hidden_states.shape
64 |         hidden_states = hidden_states.view(total_batch_size, channel, height * width).transpose(1, 2)
65 |     total_batch_size,nums_token,channel = hidden_states.shape
66 |     img_nums = total_batch_size//2
67 |     hidden_states = hidden_states.view(-1,img_nums,nums_token,channel).reshape(-1,img_nums * nums_token,channel)
68 | 
69 | 
query = self.to_q(hidden_states) 70 | if encoder_hidden_states is None: 71 | encoder_hidden_states = hidden_states # B, N, C 72 | else: 73 | encoder_hidden_states = encoder_hidden_states.view(-1,id_length+1,nums_token,channel).reshape(-1,(id_length+1) * nums_token,channel) 74 | key = self.to_k(encoder_hidden_states) 75 | value = self.to_v(encoder_hidden_states) 76 | batch_size, sequence_length, _ = hidden_states.shape 77 | 78 | query = query.view(batch_size, -1, self.heads, self.dim_head).transpose(1, 2) 79 | 80 | key = key.view(batch_size, -1, self.heads, self.dim_head).transpose(1, 2) 81 | value = value.view(batch_size, -1, self.heads, self.dim_head).transpose(1, 2) 82 | hidden_states = F.scaled_dot_product_attention( 83 | query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False 84 | ) 85 | 86 | hidden_states = hidden_states.transpose(1, 2).reshape(total_batch_size, -1, self.heads * self.dim_head) 87 | hidden_states = hidden_states.to(query.dtype) 88 | return self.to_out(hidden_states) 89 | 90 | def __call2__( 91 | self, 92 | hidden_states, 93 | encoder_hidden_states=None, 94 | attention_mask=None, 95 | value=None): 96 | input_ndim = hidden_states.ndim 97 | if input_ndim == 4: 98 | batch_size, channel, height, width = hidden_states.shape 99 | hidden_states = hidden_states.view(batch_size, channel, height * width).transpose(1, 2) 100 | batch_size, sequence_length, channel = ( 101 | hidden_states.shape 102 | ) 103 | if encoder_hidden_states is None: 104 | encoder_hidden_states = hidden_states # B, N, C 105 | else: 106 | encoder_hidden_states = encoder_hidden_states.view(-1,id_length+1,sequence_length,channel).reshape(-1,(id_length+1) * sequence_length,channel) 107 | query = self.to_q(hidden_states) 108 | key = self.to_k(encoder_hidden_states) 109 | value = self.to_v(encoder_hidden_states) 110 | 111 | query = query.view(batch_size, -1, self.heads, self.dim_head).transpose(1, 2) 112 | 113 | key = key.view(batch_size, -1, self.heads, 
self.dim_head).transpose(1, 2) 114 | value = value.view(batch_size, -1, self.heads, self.dim_head).transpose(1, 2) 115 | 116 | hidden_states = F.scaled_dot_product_attention( 117 | query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False 118 | ) 119 | 120 | hidden_states = hidden_states.transpose(1, 2).reshape(batch_size, -1, self.heads * self.dim_head) 121 | hidden_states = hidden_states.to(query.dtype) 122 | return self.to_out(hidden_states) 123 | 124 | 125 | 126 | class StoryCrossAttention(nn.Module): 127 | def __init__(self, cross_attention, dtype): 128 | import comfy.model_management as model_management 129 | super().__init__() 130 | 131 | self.heads = cross_attention.heads 132 | self.dim_head = cross_attention.dim_head 133 | 134 | self.to_q = cross_attention.to_q 135 | self.to_k = cross_attention.to_k 136 | self.to_v = cross_attention.to_v 137 | 138 | self.to_out = cross_attention.to_out 139 | self.device = model_management.get_torch_device() 140 | self.dtype = dtype 141 | self.id_bank = {} 142 | self.origin_attn = cross_attention 143 | 144 | def forward(self, x, context=None, value=None, mask=None): 145 | return self.new_forward(self,hidden_states=x,encoder_hidden_states=context,temb=value,attention_mask=mask) 146 | 147 | def old_forward(self, x, context=None, value=None, mask=None): 148 | global total_count,attn_count,cur_step,mask1024,mask4096 149 | global sa32, sa64 150 | global write 151 | global height,width 152 | if write: 153 | # print(f"white:{cur_step}") 154 | self.id_bank[cur_step] = [x[:id_length], x[id_length:]] 155 | else: 156 | context = torch.cat((self.id_bank[cur_step][0].to(self.device),x[:1],self.id_bank[cur_step][1].to(self.device),x[1:])) 157 | # skip in early step 158 | if cur_step < 5: 159 | x = __call2__(self,x,context,mask,value) 160 | else: # 256 1024 4096 161 | random_number = random.random() 162 | if cur_step <20: 163 | rand_num = 0.3 164 | else: 165 | rand_num = 0.1 166 | if random_number > rand_num: 167 | if 
not write:
168 |                     if x.shape[1] == (height//32) * (width//32):
169 |                         attention_mask = mask1024[mask1024.shape[0] // total_length * id_length:]
170 |                     else:
171 |                         attention_mask = mask4096[mask4096.shape[0] // total_length * id_length:]
172 |                 else:
173 |                     if x.shape[1] == (height//32) * (width//32):
174 |                         attention_mask = mask1024[:mask1024.shape[0] // total_length * id_length,:mask1024.shape[0] // total_length * id_length]
175 |                     else:
176 |                         attention_mask = mask4096[:mask4096.shape[0] // total_length * id_length,:mask4096.shape[0] // total_length * id_length]
177 |                 attention_mask = attention_mask.to(self.device)
178 |                 x = __call1__(self,x,context,attention_mask,value)
179 |             else:
180 |                 x = __call2__(self,x,None,mask,value)
181 |         attn_count +=1
182 |         if attn_count == total_count:
183 |             attn_count = 0
184 |             cur_step += 1
185 |             mask1024,mask4096 = cal_attn_mask_xl(total_length,id_length,sa32,sa64,height,width, device=self.device, dtype= self.dtype)
186 |         return x
187 | 
188 | 
189 |     def new_forward(
190 |         self,
191 |         attn,
192 |         hidden_states,
193 |         encoder_hidden_states=None,
194 |         attention_mask=None,
195 |         temb=None):
196 |         # un_cond_hidden_states, cond_hidden_states = hidden_states.chunk(2)
197 |         # un_cond_hidden_states = self.__call2__(attn, un_cond_hidden_states,encoder_hidden_states,attention_mask,temb)
198 |         # generate a random number between 0 and 1
199 |         global total_count,attn_count,cur_step, indices1024,indices4096
200 |         global sa32, sa64
201 |         global write
202 |         global height,width
203 |         global total_length,id_length
204 |         if attn_count == 0 and cur_step == 0:
205 |             indices1024,indices4096 = cal_attn_indice_xl_effcient_memory(total_length,id_length,sa32,sa64,height,width, dtype= self.dtype)
206 |         if write:
207 |             if hidden_states.shape[1] == (height//32) * (width//32):
208 |                 indices = indices1024
209 |             else:
210 |                 indices = indices4096
211 |             # print(f"write:{cur_step}")
212 |             total_batch_size,nums_token,channel = hidden_states.shape
213 |             img_nums = total_batch_size // 2
214 | 
hidden_states = hidden_states.reshape(-1,img_nums,nums_token,channel)
215 |             cache_hidden_states = hidden_states.to("cpu")
216 |             self.id_bank[cur_step] = [cache_hidden_states[:,img_ind,indices[img_ind],:].reshape(2,-1,channel).clone() for img_ind in range(img_nums)]
217 |             hidden_states = hidden_states.reshape(-1,nums_token,channel)
218 |             #self.id_bank[cur_step] = [hidden_states[:self.id_length].clone(), hidden_states[self.id_length:].clone()]
219 |         else:
220 |             #encoder_hidden_states = torch.cat((self.id_bank[cur_step][0].to(self.device),self.id_bank[cur_step][1].to(self.device)))
221 |             encoder_arr = [tensor.to(self.device) for tensor in self.id_bank[cur_step]]
222 |         # check whether the random number is greater than 0.5
223 |         if cur_step <1:
224 |             hidden_states = self.__call2__(attn, hidden_states,None,attention_mask,temb)
225 |         else: # 256 1024 4096
226 |             random_number = random.random()
227 |             if cur_step <20:
228 |                 rand_num = 0.3
229 |             else:
230 |                 rand_num = 0.1
231 |             # print(f"hidden state shape {hidden_states.shape[1]}")
232 |             if random_number > rand_num:
233 |                 if hidden_states.shape[1] == (height//32) * (width//32):
234 |                     indices = indices1024
235 |                 else:
236 |                     indices = indices4096
237 |                 # print("before attention",hidden_states.shape,attention_mask.shape,encoder_hidden_states.shape if encoder_hidden_states is not None else "None")
238 |                 if write:
239 |                     total_batch_size,nums_token,channel = hidden_states.shape
240 |                     img_nums = total_batch_size // 2
241 |                     hidden_states = hidden_states.reshape(-1,img_nums,nums_token,channel)
242 |                     encoder_arr = [hidden_states[:,img_ind,indices[img_ind],:].reshape(2,-1,channel) for img_ind in range(img_nums)]
243 |                     for img_ind in range(img_nums):
244 |                         encoder_hidden_states_tmp = torch.cat(encoder_arr[0:img_ind] + encoder_arr[img_ind+1:] + [hidden_states[:,img_ind,:,:]],dim=1)
245 |                         hidden_states[:,img_ind,:,:] = self.__call2__(attn, hidden_states[:,img_ind,:,:],encoder_hidden_states_tmp,None,temb)
246 |                 else:
247 |                     _,nums_token,channel = hidden_states.shape
248 |                     # 
img_nums = total_batch_size // 2 249 | # encoder_hidden_states = encoder_hidden_states.reshape(-1,img_nums,nums_token,channel) 250 | hidden_states = hidden_states.reshape(2,-1,nums_token,channel) 251 | # print(len(indices)) 252 | # encoder_arr = [encoder_hidden_states[:,img_ind,indices[img_ind],:].reshape(2,-1,channel) for img_ind in range(img_nums)] 253 | encoder_hidden_states_tmp = torch.cat(encoder_arr+[hidden_states[:,0,:,:]],dim=1) 254 | hidden_states[:,0,:,:] = self.__call2__(attn, hidden_states[:,0,:,:],encoder_hidden_states_tmp,None,temb) 255 | hidden_states = hidden_states.reshape(-1,nums_token,channel) 256 | else: 257 | hidden_states = self.__call2__(attn, hidden_states,None,attention_mask,temb) 258 | attn_count +=1 259 | if attn_count == total_count: 260 | attn_count = 0 261 | cur_step += 1 262 | indices1024,indices4096 = cal_attn_indice_xl_effcient_memory(total_length,id_length,sa32,sa64,height,width, dtype= self.dtype) 263 | 264 | return hidden_states 265 | def __call1__( 266 | self, 267 | attn, 268 | hidden_states, 269 | encoder_hidden_states=None, 270 | attention_mask=None, 271 | temb=None, 272 | attn_indices = None, 273 | ): 274 | # print("hidden state shape",hidden_states.shape,self.id_length) 275 | residual = hidden_states 276 | # if encoder_hidden_states is not None: 277 | # raise Exception("not implement") 278 | # if attn.spatial_norm is not None: 279 | # hidden_states = attn.spatial_norm(hidden_states, temb) 280 | input_ndim = hidden_states.ndim 281 | 282 | if input_ndim == 4: 283 | total_batch_size, channel, height, width = hidden_states.shape 284 | hidden_states = hidden_states.view(total_batch_size, channel, height * width).transpose(1, 2) 285 | total_batch_size,nums_token,channel = hidden_states.shape 286 | img_nums = total_batch_size//2 287 | hidden_states = hidden_states.view(-1,img_nums,nums_token,channel).reshape(-1,img_nums * nums_token,channel) 288 | batch_size, sequence_length, _ = hidden_states.shape 289 | 290 | # if attn.group_norm 
is not None: 291 | # hidden_states = attn.group_norm(hidden_states.transpose(1, 2)).transpose(1, 2) 292 | 293 | query = attn.to_q(hidden_states) 294 | 295 | if encoder_hidden_states is None: 296 | encoder_hidden_states = hidden_states # B, N, C 297 | else: 298 | encoder_hidden_states = encoder_hidden_states.view(-1,id_length+1,nums_token,channel).reshape(-1,(id_length+1) * nums_token,channel) 299 | 300 | key = attn.to_k(encoder_hidden_states) 301 | value = attn.to_v(encoder_hidden_states) 302 | 303 | 304 | inner_dim = key.shape[-1] 305 | head_dim = inner_dim // attn.heads 306 | 307 | query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 308 | 309 | key = key.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 310 | value = value.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 311 | # print(key.shape,value.shape,query.shape,attention_mask.shape) 312 | # the output of sdp = (batch, num_heads, seq_len, head_dim) 313 | # TODO: add support for attn.scale when we move to Torch 2.1 314 | #print(query.shape,key.shape,value.shape,attention_mask.shape) 315 | hidden_states = F.scaled_dot_product_attention( 316 | query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False 317 | ) 318 | 319 | hidden_states = hidden_states.transpose(1, 2).reshape(total_batch_size, -1, attn.heads * head_dim) 320 | hidden_states = hidden_states.to(query.dtype) 321 | 322 | 323 | 324 | # # linear proj 325 | # hidden_states = attn.to_out[0](hidden_states) 326 | # # dropout 327 | # hidden_states = attn.to_out[1](hidden_states) 328 | 329 | hidden_states = attn.to_out(hidden_states) 330 | 331 | # if input_ndim == 4: 332 | # tile_hidden_states = tile_hidden_states.transpose(-1, -2).reshape(batch_size, channel, height, width) 333 | 334 | # if attn.residual_connection: 335 | # tile_hidden_states = tile_hidden_states + residual 336 | 337 | if input_ndim == 4: 338 | hidden_states = hidden_states.transpose(-1, -2).reshape(total_batch_size, channel, height, 
width) 339 | # if attn.residual_connection: 340 | # hidden_states = hidden_states + residual 341 | # hidden_states = hidden_states / attn.rescale_output_factor 342 | # print(hidden_states.shape) 343 | return hidden_states 344 | 345 | def __call2__( 346 | self, 347 | attn, 348 | hidden_states, 349 | encoder_hidden_states=None, 350 | attention_mask=None, 351 | temb=None): 352 | residual = hidden_states 353 | 354 | input_ndim = hidden_states.ndim 355 | 356 | if input_ndim == 4: 357 | batch_size, channel, height, width = hidden_states.shape 358 | hidden_states = hidden_states.view(batch_size, channel, height * width).transpose(1, 2) 359 | 360 | batch_size, sequence_length, channel = ( 361 | hidden_states.shape 362 | ) 363 | # print(hidden_states.shape) 364 | if attention_mask is not None: 365 | attention_mask = attn.prepare_attention_mask(attention_mask, sequence_length, batch_size) 366 | # scaled_dot_product_attention expects attention_mask shape to be 367 | # (batch, heads, source_length, target_length) 368 | attention_mask = attention_mask.view(batch_size, attn.heads, -1, attention_mask.shape[-1]) 369 | 370 | query = attn.to_q(hidden_states) 371 | 372 | if encoder_hidden_states is None: 373 | encoder_hidden_states = hidden_states # B, N, C 374 | # else: 375 | # encoder_hidden_states = encoder_hidden_states.view(-1,self.id_length+1,sequence_length,channel).reshape(-1,(self.id_length+1) * sequence_length,channel) 376 | 377 | key = attn.to_k(encoder_hidden_states) 378 | value = attn.to_v(encoder_hidden_states) 379 | 380 | inner_dim = key.shape[-1] 381 | head_dim = inner_dim // attn.heads 382 | 383 | query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 384 | 385 | key = key.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 386 | value = value.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2) 387 | 388 | # the output of sdp = (batch, num_heads, seq_len, head_dim) 389 | # TODO: add support for attn.scale when we move to Torch 2.1 390 
| hidden_states = F.scaled_dot_product_attention( 391 | query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False 392 | ) 393 | 394 | hidden_states = hidden_states.transpose(1, 2).reshape(batch_size, -1, attn.heads * head_dim) 395 | hidden_states = hidden_states.to(query.dtype) 396 | 397 | # # linear proj 398 | # hidden_states = attn.to_out[0](hidden_states) 399 | # # dropout 400 | # hidden_states = attn.to_out[1](hidden_states) 401 | hidden_states = attn.to_out(hidden_states) 402 | 403 | if input_ndim == 4: 404 | hidden_states = hidden_states.transpose(-1, -2).reshape(batch_size, channel, height, width) 405 | 406 | # if attn.residual_connection: 407 | # hidden_states = hidden_states + residual 408 | 409 | # hidden_states = hidden_states / attn.rescale_output_factor 410 | 411 | return hidden_states -------------------------------------------------------------------------------- /workflow/story_base.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 29, 3 | "last_link_id": 71, 4 | "nodes": [ 5 | { 6 | "id": 11, 7 | "type": "SeaArtCharactorPrompt", 8 | "pos": [ 9 | -84, 10 | -241 11 | ], 12 | "size": { 13 | "0": 400, 14 | "1": 200 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | "outputs": [ 20 | { 21 | "name": "STRING", 22 | "type": "STRING", 23 | "links": [ 24 | 10, 25 | 11, 26 | 12 27 | ], 28 | "shape": 3, 29 | "slot_index": 0 30 | } 31 | ], 32 | "properties": { 33 | "Node name for S&R": "SeaArtCharactorPrompt" 34 | }, 35 | "widgets_values": [ 36 | "a cute girl, blue hair, red cloth " 37 | ] 38 | }, 39 | { 40 | "id": 13, 41 | "type": "SeaArtAppendPrompt", 42 | "pos": [ 43 | 427, 44 | 282 45 | ], 46 | "size": { 47 | "0": 400, 48 | "1": 200 49 | }, 50 | "flags": {}, 51 | "order": 5, 52 | "mode": 0, 53 | "inputs": [ 54 | { 55 | "name": "charactor_prompt", 56 | "type": "STRING", 57 | "link": 11, 58 | "widget": { 59 | "name": "charactor_prompt" 60 | } 61 | } 62 | ], 63 | 
"outputs": [ 64 | { 65 | "name": "STRING", 66 | "type": "STRING", 67 | "links": [ 68 | 17 69 | ], 70 | "shape": 3, 71 | "slot_index": 0 72 | } 73 | ], 74 | "properties": { 75 | "Node name for S&R": "SeaArtAppendPrompt" 76 | }, 77 | "widgets_values": [ 78 | "", 79 | "in the park" 80 | ] 81 | }, 82 | { 83 | "id": 12, 84 | "type": "SeaArtAppendPrompt", 85 | "pos": [ 86 | 421, 87 | -54 88 | ], 89 | "size": { 90 | "0": 400, 91 | "1": 200 92 | }, 93 | "flags": {}, 94 | "order": 4, 95 | "mode": 0, 96 | "inputs": [ 97 | { 98 | "name": "charactor_prompt", 99 | "type": "STRING", 100 | "link": 10, 101 | "widget": { 102 | "name": "charactor_prompt" 103 | } 104 | } 105 | ], 106 | "outputs": [ 107 | { 108 | "name": "STRING", 109 | "type": "STRING", 110 | "links": [ 111 | 18 112 | ], 113 | "shape": 3, 114 | "slot_index": 0 115 | } 116 | ], 117 | "properties": { 118 | "Node name for S&R": "SeaArtAppendPrompt" 119 | }, 120 | "widgets_values": [ 121 | "", 122 | "in the bed" 123 | ] 124 | }, 125 | { 126 | "id": 8, 127 | "type": "VAEDecode", 128 | "pos": [ 129 | 2104, 130 | 637 131 | ], 132 | "size": { 133 | "0": 210, 134 | "1": 46 135 | }, 136 | "flags": {}, 137 | "order": 17, 138 | "mode": 0, 139 | "inputs": [ 140 | { 141 | "name": "samples", 142 | "type": "LATENT", 143 | "link": 7 144 | }, 145 | { 146 | "name": "vae", 147 | "type": "VAE", 148 | "link": 8 149 | } 150 | ], 151 | "outputs": [ 152 | { 153 | "name": "IMAGE", 154 | "type": "IMAGE", 155 | "links": [ 156 | 25 157 | ], 158 | "slot_index": 0 159 | } 160 | ], 161 | "properties": { 162 | "Node name for S&R": "VAEDecode" 163 | } 164 | }, 165 | { 166 | "id": 3, 167 | "type": "KSampler", 168 | "pos": [ 169 | 2084, 170 | 262 171 | ], 172 | "size": { 173 | "0": 315, 174 | "1": 262 175 | }, 176 | "flags": {}, 177 | "order": 16, 178 | "mode": 0, 179 | "inputs": [ 180 | { 181 | "name": "model", 182 | "type": "MODEL", 183 | "link": 14 184 | }, 185 | { 186 | "name": "positive", 187 | "type": "CONDITIONING", 188 | "link": 48 189 | }, 190 
| { 191 | "name": "negative", 192 | "type": "CONDITIONING", 193 | "link": 49 194 | }, 195 | { 196 | "name": "latent_image", 197 | "type": "LATENT", 198 | "link": 2 199 | } 200 | ], 201 | "outputs": [ 202 | { 203 | "name": "LATENT", 204 | "type": "LATENT", 205 | "links": [ 206 | 7 207 | ], 208 | "slot_index": 0 209 | } 210 | ], 211 | "properties": { 212 | "Node name for S&R": "KSampler" 213 | }, 214 | "widgets_values": [ 215 | 1076421734178196, 216 | "randomize", 217 | 25, 218 | 8, 219 | "dpmpp_2m_sde_gpu", 220 | "karras", 221 | 1 222 | ] 223 | }, 224 | { 225 | "id": 14, 226 | "type": "SeaArtAppendPrompt", 227 | "pos": [ 228 | 423, 229 | 574 230 | ], 231 | "size": { 232 | "0": 400, 233 | "1": 200 234 | }, 235 | "flags": {}, 236 | "order": 6, 237 | "mode": 0, 238 | "inputs": [ 239 | { 240 | "name": "charactor_prompt", 241 | "type": "STRING", 242 | "link": 12, 243 | "widget": { 244 | "name": "charactor_prompt" 245 | } 246 | } 247 | ], 248 | "outputs": [ 249 | { 250 | "name": "STRING", 251 | "type": "STRING", 252 | "links": [ 253 | 16 254 | ], 255 | "shape": 3, 256 | "slot_index": 0 257 | } 258 | ], 259 | "properties": { 260 | "Node name for S&R": "SeaArtAppendPrompt" 261 | }, 262 | "widgets_values": [ 263 | "", 264 | "read book" 265 | ] 266 | }, 267 | { 268 | "id": 18, 269 | "type": "CLIPTextEncode", 270 | "pos": [ 271 | 957, 272 | -50 273 | ], 274 | "size": { 275 | "0": 389.5223388671875, 276 | "1": 74.33411407470703 277 | }, 278 | "flags": {}, 279 | "order": 11, 280 | "mode": 0, 281 | "inputs": [ 282 | { 283 | "name": "clip", 284 | "type": "CLIP", 285 | "link": 64 286 | }, 287 | { 288 | "name": "text", 289 | "type": "STRING", 290 | "link": 18, 291 | "widget": { 292 | "name": "text" 293 | } 294 | } 295 | ], 296 | "outputs": [ 297 | { 298 | "name": "CONDITIONING", 299 | "type": "CONDITIONING", 300 | "links": [ 301 | 40 302 | ], 303 | "shape": 3, 304 | "slot_index": 0 305 | } 306 | ], 307 | "properties": { 308 | "Node name for S&R": "CLIPTextEncode" 309 | }, 310 | 
"widgets_values": [ 311 | "" 312 | ] 313 | }, 314 | { 315 | "id": 19, 316 | "type": "CLIPTextEncode", 317 | "pos": [ 318 | 958, 319 | 266 320 | ], 321 | "size": { 322 | "0": 389.5223388671875, 323 | "1": 74.33411407470703 324 | }, 325 | "flags": {}, 326 | "order": 12, 327 | "mode": 0, 328 | "inputs": [ 329 | { 330 | "name": "clip", 331 | "type": "CLIP", 332 | "link": 67 333 | }, 334 | { 335 | "name": "text", 336 | "type": "STRING", 337 | "link": 17, 338 | "widget": { 339 | "name": "text" 340 | } 341 | } 342 | ], 343 | "outputs": [ 344 | { 345 | "name": "CONDITIONING", 346 | "type": "CONDITIONING", 347 | "links": [ 348 | 41 349 | ], 350 | "shape": 3, 351 | "slot_index": 0 352 | } 353 | ], 354 | "properties": { 355 | "Node name for S&R": "CLIPTextEncode" 356 | }, 357 | "widgets_values": [ 358 | "" 359 | ] 360 | }, 361 | { 362 | "id": 20, 363 | "type": "CLIPTextEncode", 364 | "pos": [ 365 | 960, 366 | 589 367 | ], 368 | "size": { 369 | "0": 389.5223388671875, 370 | "1": 74.33411407470703 371 | }, 372 | "flags": {}, 373 | "order": 13, 374 | "mode": 0, 375 | "inputs": [ 376 | { 377 | "name": "clip", 378 | "type": "CLIP", 379 | "link": 68 380 | }, 381 | { 382 | "name": "text", 383 | "type": "STRING", 384 | "link": 16, 385 | "widget": { 386 | "name": "text" 387 | } 388 | } 389 | ], 390 | "outputs": [ 391 | { 392 | "name": "CONDITIONING", 393 | "type": "CONDITIONING", 394 | "links": [ 395 | 42 396 | ], 397 | "shape": 3, 398 | "slot_index": 0 399 | } 400 | ], 401 | "properties": { 402 | "Node name for S&R": "CLIPTextEncode" 403 | }, 404 | "widgets_values": [ 405 | "" 406 | ] 407 | }, 408 | { 409 | "id": 23, 410 | "type": "CLIPTextEncode", 411 | "pos": [ 412 | 976, 413 | 904 414 | ], 415 | "size": { 416 | "0": 389.5223388671875, 417 | "1": 74.33411407470703 418 | }, 419 | "flags": {}, 420 | "order": 8, 421 | "mode": 0, 422 | "inputs": [ 423 | { 424 | "name": "clip", 425 | "type": "CLIP", 426 | "link": 69 427 | }, 428 | { 429 | "name": "text", 430 | "type": "STRING", 431 | 
"link": 28, 432 | "widget": { 433 | "name": "text" 434 | } 435 | } 436 | ], 437 | "outputs": [ 438 | { 439 | "name": "CONDITIONING", 440 | "type": "CONDITIONING", 441 | "links": [ 442 | 44 443 | ], 444 | "shape": 3, 445 | "slot_index": 0 446 | } 447 | ], 448 | "properties": { 449 | "Node name for S&R": "CLIPTextEncode" 450 | }, 451 | "widgets_values": [ 452 | "" 453 | ] 454 | }, 455 | { 456 | "id": 24, 457 | "type": "CLIPTextEncode", 458 | "pos": [ 459 | 970, 460 | 1130 461 | ], 462 | "size": { 463 | "0": 389.5223388671875, 464 | "1": 74.33411407470703 465 | }, 466 | "flags": {}, 467 | "order": 10, 468 | "mode": 0, 469 | "inputs": [ 470 | { 471 | "name": "clip", 472 | "type": "CLIP", 473 | "link": 71 474 | }, 475 | { 476 | "name": "text", 477 | "type": "STRING", 478 | "link": 27, 479 | "widget": { 480 | "name": "text" 481 | } 482 | } 483 | ], 484 | "outputs": [ 485 | { 486 | "name": "CONDITIONING", 487 | "type": "CONDITIONING", 488 | "links": [ 489 | 45 490 | ], 491 | "shape": 3, 492 | "slot_index": 0 493 | } 494 | ], 495 | "properties": { 496 | "Node name for S&R": "CLIPTextEncode" 497 | }, 498 | "widgets_values": [ 499 | "" 500 | ] 501 | }, 502 | { 503 | "id": 25, 504 | "type": "CLIPTextEncode", 505 | "pos": [ 506 | 970, 507 | 1351 508 | ], 509 | "size": { 510 | "0": 389.5223388671875, 511 | "1": 74.33411407470703 512 | }, 513 | "flags": {}, 514 | "order": 9, 515 | "mode": 0, 516 | "inputs": [ 517 | { 518 | "name": "clip", 519 | "type": "CLIP", 520 | "link": 70 521 | }, 522 | { 523 | "name": "text", 524 | "type": "STRING", 525 | "link": 29, 526 | "widget": { 527 | "name": "text" 528 | } 529 | } 530 | ], 531 | "outputs": [ 532 | { 533 | "name": "CONDITIONING", 534 | "type": "CONDITIONING", 535 | "links": [ 536 | 46 537 | ], 538 | "shape": 3, 539 | "slot_index": 0 540 | } 541 | ], 542 | "properties": { 543 | "Node name for S&R": "CLIPTextEncode" 544 | }, 545 | "widgets_values": [ 546 | "" 547 | ] 548 | }, 549 | { 550 | "id": 27, 551 | "type": 
"SeaArtMergeStoryCondition", 552 | "pos": [ 553 | 1551, 554 | 217 555 | ], 556 | "size": { 557 | "0": 342.5999755859375, 558 | "1": 126 559 | }, 560 | "flags": {}, 561 | "order": 15, 562 | "mode": 0, 563 | "inputs": [ 564 | { 565 | "name": "clip", 566 | "type": "CLIP", 567 | "link": 65 568 | }, 569 | { 570 | "name": "conditioning_1", 571 | "type": "CONDITIONING", 572 | "link": 40 573 | }, 574 | { 575 | "name": "conditioning_2", 576 | "type": "CONDITIONING", 577 | "link": 41 578 | }, 579 | { 580 | "name": "conditioning_3", 581 | "type": "CONDITIONING", 582 | "link": 42 583 | }, 584 | { 585 | "name": "conditioning_4", 586 | "type": "CONDITIONING", 587 | "link": null 588 | }, 589 | { 590 | "name": "conditioning_5", 591 | "type": "CONDITIONING", 592 | "link": null 593 | } 594 | ], 595 | "outputs": [ 596 | { 597 | "name": "CONDITIONING", 598 | "type": "CONDITIONING", 599 | "links": [ 600 | 48 601 | ], 602 | "shape": 3, 603 | "slot_index": 0 604 | } 605 | ], 606 | "properties": { 607 | "Node name for S&R": "SeaArtMergeStoryCondition" 608 | } 609 | }, 610 | { 611 | "id": 22, 612 | "type": "SeaArtAppendPrompt", 613 | "pos": [ 614 | 420, 615 | 883 616 | ], 617 | "size": { 618 | "0": 400, 619 | "1": 200 620 | }, 621 | "flags": {}, 622 | "order": 1, 623 | "mode": 0, 624 | "inputs": [], 625 | "outputs": [ 626 | { 627 | "name": "STRING", 628 | "type": "STRING", 629 | "links": [ 630 | 27, 631 | 28, 632 | 29 633 | ], 634 | "shape": 3, 635 | "slot_index": 0 636 | } 637 | ], 638 | "properties": { 639 | "Node name for S&R": "SeaArtAppendPrompt" 640 | }, 641 | "widgets_values": [ 642 | "", 643 | "water,bad" 644 | ] 645 | }, 646 | { 647 | "id": 16, 648 | "type": "SeaArtApplyStory", 649 | "pos": [ 650 | 1603, 651 | 49 652 | ], 653 | "size": { 654 | "0": 210, 655 | "1": 106 656 | }, 657 | "flags": {}, 658 | "order": 7, 659 | "mode": 0, 660 | "inputs": [ 661 | { 662 | "name": "model", 663 | "type": "MODEL", 664 | "link": 63 665 | } 666 | ], 667 | "outputs": [ 668 | { 669 | "name": 
"MODEL", 670 | "type": "MODEL", 671 | "links": [ 672 | 14 673 | ], 674 | "shape": 3, 675 | "slot_index": 0 676 | } 677 | ], 678 | "properties": { 679 | "Node name for S&R": "SeaArtApplyStory" 680 | }, 681 | "widgets_values": [ 682 | 3, 683 | 1024, 684 | 1024 685 | ] 686 | }, 687 | { 688 | "id": 28, 689 | "type": "SeaArtMergeStoryCondition", 690 | "pos": [ 691 | 1561, 692 | 1047 693 | ], 694 | "size": { 695 | "0": 342.5999755859375, 696 | "1": 126 697 | }, 698 | "flags": {}, 699 | "order": 14, 700 | "mode": 0, 701 | "inputs": [ 702 | { 703 | "name": "clip", 704 | "type": "CLIP", 705 | "link": 66 706 | }, 707 | { 708 | "name": "conditioning_1", 709 | "type": "CONDITIONING", 710 | "link": 44 711 | }, 712 | { 713 | "name": "conditioning_2", 714 | "type": "CONDITIONING", 715 | "link": 45 716 | }, 717 | { 718 | "name": "conditioning_3", 719 | "type": "CONDITIONING", 720 | "link": 46 721 | }, 722 | { 723 | "name": "conditioning_4", 724 | "type": "CONDITIONING", 725 | "link": null 726 | }, 727 | { 728 | "name": "conditioning_5", 729 | "type": "CONDITIONING", 730 | "link": null 731 | } 732 | ], 733 | "outputs": [ 734 | { 735 | "name": "CONDITIONING", 736 | "type": "CONDITIONING", 737 | "links": [ 738 | 49 739 | ], 740 | "shape": 3, 741 | "slot_index": 0 742 | } 743 | ], 744 | "properties": { 745 | "Node name for S&R": "SeaArtMergeStoryCondition" 746 | } 747 | }, 748 | { 749 | "id": 21, 750 | "type": "PreviewImage", 751 | "pos": [ 752 | 2110, 753 | 757 754 | ], 755 | "size": { 756 | "0": 210, 757 | "1": 246 758 | }, 759 | "flags": {}, 760 | "order": 18, 761 | "mode": 0, 762 | "inputs": [ 763 | { 764 | "name": "images", 765 | "type": "IMAGE", 766 | "link": 25 767 | } 768 | ], 769 | "properties": { 770 | "Node name for S&R": "PreviewImage" 771 | } 772 | }, 773 | { 774 | "id": 5, 775 | "type": "EmptyLatentImage", 776 | "pos": [ 777 | 1584, 778 | 549 779 | ], 780 | "size": { 781 | "0": 315, 782 | "1": 106 783 | }, 784 | "flags": {}, 785 | "order": 2, 786 | "mode": 0, 787 | 
"outputs": [ 788 | { 789 | "name": "LATENT", 790 | "type": "LATENT", 791 | "links": [ 792 | 2 793 | ], 794 | "slot_index": 0 795 | } 796 | ], 797 | "properties": { 798 | "Node name for S&R": "EmptyLatentImage" 799 | }, 800 | "widgets_values": [ 801 | 1024, 802 | 1024, 803 | 3 804 | ] 805 | }, 806 | { 807 | "id": 4, 808 | "type": "CheckpointLoaderSimple", 809 | "pos": [ 810 | -696, 811 | 174 812 | ], 813 | "size": { 814 | "0": 315, 815 | "1": 98 816 | }, 817 | "flags": {}, 818 | "order": 3, 819 | "mode": 0, 820 | "outputs": [ 821 | { 822 | "name": "MODEL", 823 | "type": "MODEL", 824 | "links": [ 825 | 63 826 | ], 827 | "slot_index": 0 828 | }, 829 | { 830 | "name": "CLIP", 831 | "type": "CLIP", 832 | "links": [ 833 | 64, 834 | 65, 835 | 66, 836 | 67, 837 | 68, 838 | 69, 839 | 70, 840 | 71 841 | ], 842 | "slot_index": 1 843 | }, 844 | { 845 | "name": "VAE", 846 | "type": "VAE", 847 | "links": [ 848 | 8 849 | ], 850 | "slot_index": 2 851 | } 852 | ], 853 | "properties": { 854 | "Node name for S&R": "CheckpointLoaderSimple" 855 | }, 856 | "widgets_values": [ 857 | "Dark_Sushi_Mix.safetensors" 858 | ] 859 | } 860 | ], 861 | "links": [ 862 | [ 863 | 2, 864 | 5, 865 | 0, 866 | 3, 867 | 3, 868 | "LATENT" 869 | ], 870 | [ 871 | 7, 872 | 3, 873 | 0, 874 | 8, 875 | 0, 876 | "LATENT" 877 | ], 878 | [ 879 | 8, 880 | 4, 881 | 2, 882 | 8, 883 | 1, 884 | "VAE" 885 | ], 886 | [ 887 | 10, 888 | 11, 889 | 0, 890 | 12, 891 | 0, 892 | "STRING" 893 | ], 894 | [ 895 | 11, 896 | 11, 897 | 0, 898 | 13, 899 | 0, 900 | "STRING" 901 | ], 902 | [ 903 | 12, 904 | 11, 905 | 0, 906 | 14, 907 | 0, 908 | "STRING" 909 | ], 910 | [ 911 | 14, 912 | 16, 913 | 0, 914 | 3, 915 | 0, 916 | "MODEL" 917 | ], 918 | [ 919 | 16, 920 | 14, 921 | 0, 922 | 20, 923 | 1, 924 | "STRING" 925 | ], 926 | [ 927 | 17, 928 | 13, 929 | 0, 930 | 19, 931 | 1, 932 | "STRING" 933 | ], 934 | [ 935 | 18, 936 | 12, 937 | 0, 938 | 18, 939 | 1, 940 | "STRING" 941 | ], 942 | [ 943 | 25, 944 | 8, 945 | 0, 946 | 21, 947 | 0, 948 | 
"IMAGE" 949 | ], 950 | [ 951 | 27, 952 | 22, 953 | 0, 954 | 24, 955 | 1, 956 | "STRING" 957 | ], 958 | [ 959 | 28, 960 | 22, 961 | 0, 962 | 23, 963 | 1, 964 | "STRING" 965 | ], 966 | [ 967 | 29, 968 | 22, 969 | 0, 970 | 25, 971 | 1, 972 | "STRING" 973 | ], 974 | [ 975 | 40, 976 | 18, 977 | 0, 978 | 27, 979 | 1, 980 | "CONDITIONING" 981 | ], 982 | [ 983 | 41, 984 | 19, 985 | 0, 986 | 27, 987 | 2, 988 | "CONDITIONING" 989 | ], 990 | [ 991 | 42, 992 | 20, 993 | 0, 994 | 27, 995 | 3, 996 | "CONDITIONING" 997 | ], 998 | [ 999 | 44, 1000 | 23, 1001 | 0, 1002 | 28, 1003 | 1, 1004 | "CONDITIONING" 1005 | ], 1006 | [ 1007 | 45, 1008 | 24, 1009 | 0, 1010 | 28, 1011 | 2, 1012 | "CONDITIONING" 1013 | ], 1014 | [ 1015 | 46, 1016 | 25, 1017 | 0, 1018 | 28, 1019 | 3, 1020 | "CONDITIONING" 1021 | ], 1022 | [ 1023 | 48, 1024 | 27, 1025 | 0, 1026 | 3, 1027 | 1, 1028 | "CONDITIONING" 1029 | ], 1030 | [ 1031 | 49, 1032 | 28, 1033 | 0, 1034 | 3, 1035 | 2, 1036 | "CONDITIONING" 1037 | ], 1038 | [ 1039 | 63, 1040 | 4, 1041 | 0, 1042 | 16, 1043 | 0, 1044 | "MODEL" 1045 | ], 1046 | [ 1047 | 64, 1048 | 4, 1049 | 1, 1050 | 18, 1051 | 0, 1052 | "CLIP" 1053 | ], 1054 | [ 1055 | 65, 1056 | 4, 1057 | 1, 1058 | 27, 1059 | 0, 1060 | "CLIP" 1061 | ], 1062 | [ 1063 | 66, 1064 | 4, 1065 | 1, 1066 | 28, 1067 | 0, 1068 | "CLIP" 1069 | ], 1070 | [ 1071 | 67, 1072 | 4, 1073 | 1, 1074 | 19, 1075 | 0, 1076 | "CLIP" 1077 | ], 1078 | [ 1079 | 68, 1080 | 4, 1081 | 1, 1082 | 20, 1083 | 0, 1084 | "CLIP" 1085 | ], 1086 | [ 1087 | 69, 1088 | 4, 1089 | 1, 1090 | 23, 1091 | 0, 1092 | "CLIP" 1093 | ], 1094 | [ 1095 | 70, 1096 | 4, 1097 | 1, 1098 | 25, 1099 | 0, 1100 | "CLIP" 1101 | ], 1102 | [ 1103 | 71, 1104 | 4, 1105 | 1, 1106 | 24, 1107 | 0, 1108 | "CLIP" 1109 | ] 1110 | ], 1111 | "groups": [], 1112 | "config": {}, 1113 | "extra": { 1114 | "ds": { 1115 | "scale": 0.513158118230707, 1116 | "offset": [ 1117 | 840.0659744450127, 1118 | 339.40740507128805 1119 | ] 1120 | } 1121 | }, 1122 | "version": 0.4 1123 | } 
-------------------------------------------------------------------------------- /workflow/story_with_inf.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 39, 3 | "last_link_id": 103, 4 | "nodes": [ 5 | { 6 | "id": 13, 7 | "type": "SeaArtAppendPrompt", 8 | "pos": [ 9 | 427, 10 | 282 11 | ], 12 | "size": { 13 | "0": 400, 14 | "1": 200 15 | }, 16 | "flags": {}, 17 | "order": 6, 18 | "mode": 0, 19 | "inputs": [ 20 | { 21 | "name": "charactor_prompt", 22 | "type": "STRING", 23 | "link": 11, 24 | "widget": { 25 | "name": "charactor_prompt" 26 | } 27 | } 28 | ], 29 | "outputs": [ 30 | { 31 | "name": "STRING", 32 | "type": "STRING", 33 | "links": [ 34 | 17 35 | ], 36 | "shape": 3, 37 | "slot_index": 0 38 | } 39 | ], 40 | "properties": { 41 | "Node name for S&R": "SeaArtAppendPrompt" 42 | }, 43 | "widgets_values": [ 44 | "", 45 | "in the park" 46 | ] 47 | }, 48 | { 49 | "id": 8, 50 | "type": "VAEDecode", 51 | "pos": [ 52 | 2104, 53 | 637 54 | ], 55 | "size": { 56 | "0": 210, 57 | "1": 46 58 | }, 59 | "flags": {}, 60 | "order": 21, 61 | "mode": 0, 62 | "inputs": [ 63 | { 64 | "name": "samples", 65 | "type": "LATENT", 66 | "link": 98 67 | }, 68 | { 69 | "name": "vae", 70 | "type": "VAE", 71 | "link": 8 72 | } 73 | ], 74 | "outputs": [ 75 | { 76 | "name": "IMAGE", 77 | "type": "IMAGE", 78 | "links": [ 79 | 25 80 | ], 81 | "slot_index": 0 82 | } 83 | ], 84 | "properties": { 85 | "Node name for S&R": "VAEDecode" 86 | } 87 | }, 88 | { 89 | "id": 14, 90 | "type": "SeaArtAppendPrompt", 91 | "pos": [ 92 | 423, 93 | 574 94 | ], 95 | "size": { 96 | "0": 400, 97 | "1": 200 98 | }, 99 | "flags": {}, 100 | "order": 7, 101 | "mode": 0, 102 | "inputs": [ 103 | { 104 | "name": "charactor_prompt", 105 | "type": "STRING", 106 | "link": 12, 107 | "widget": { 108 | "name": "charactor_prompt" 109 | } 110 | } 111 | ], 112 | "outputs": [ 113 | { 114 | "name": "STRING", 115 | "type": "STRING", 116 | "links": [ 117 | 16 118 | ], 119 | 
"shape": 3, 120 | "slot_index": 0 121 | } 122 | ], 123 | "properties": { 124 | "Node name for S&R": "SeaArtAppendPrompt" 125 | }, 126 | "widgets_values": [ 127 | "", 128 | "read book" 129 | ] 130 | }, 131 | { 132 | "id": 18, 133 | "type": "CLIPTextEncode", 134 | "pos": [ 135 | 957, 136 | -50 137 | ], 138 | "size": { 139 | "0": 389.5223388671875, 140 | "1": 74.33411407470703 141 | }, 142 | "flags": {}, 143 | "order": 14, 144 | "mode": 0, 145 | "inputs": [ 146 | { 147 | "name": "clip", 148 | "type": "CLIP", 149 | "link": 64 150 | }, 151 | { 152 | "name": "text", 153 | "type": "STRING", 154 | "link": 18, 155 | "widget": { 156 | "name": "text" 157 | } 158 | } 159 | ], 160 | "outputs": [ 161 | { 162 | "name": "CONDITIONING", 163 | "type": "CONDITIONING", 164 | "links": [ 165 | 40 166 | ], 167 | "shape": 3, 168 | "slot_index": 0 169 | } 170 | ], 171 | "properties": { 172 | "Node name for S&R": "CLIPTextEncode" 173 | }, 174 | "widgets_values": [ 175 | "" 176 | ] 177 | }, 178 | { 179 | "id": 20, 180 | "type": "CLIPTextEncode", 181 | "pos": [ 182 | 960, 183 | 589 184 | ], 185 | "size": { 186 | "0": 389.5223388671875, 187 | "1": 74.33411407470703 188 | }, 189 | "flags": {}, 190 | "order": 16, 191 | "mode": 0, 192 | "inputs": [ 193 | { 194 | "name": "clip", 195 | "type": "CLIP", 196 | "link": 68 197 | }, 198 | { 199 | "name": "text", 200 | "type": "STRING", 201 | "link": 16, 202 | "widget": { 203 | "name": "text" 204 | } 205 | } 206 | ], 207 | "outputs": [ 208 | { 209 | "name": "CONDITIONING", 210 | "type": "CONDITIONING", 211 | "links": [ 212 | 42 213 | ], 214 | "shape": 3, 215 | "slot_index": 0 216 | } 217 | ], 218 | "properties": { 219 | "Node name for S&R": "CLIPTextEncode" 220 | }, 221 | "widgets_values": [ 222 | "" 223 | ] 224 | }, 225 | { 226 | "id": 24, 227 | "type": "CLIPTextEncode", 228 | "pos": [ 229 | 970, 230 | 1130 231 | ], 232 | "size": { 233 | "0": 389.5223388671875, 234 | "1": 74.33411407470703 235 | }, 236 | "flags": {}, 237 | "order": 12, 238 | "mode": 0, 
239 | "inputs": [ 240 | { 241 | "name": "clip", 242 | "type": "CLIP", 243 | "link": 71 244 | }, 245 | { 246 | "name": "text", 247 | "type": "STRING", 248 | "link": 27, 249 | "widget": { 250 | "name": "text" 251 | } 252 | } 253 | ], 254 | "outputs": [ 255 | { 256 | "name": "CONDITIONING", 257 | "type": "CONDITIONING", 258 | "links": [ 259 | 45 260 | ], 261 | "shape": 3, 262 | "slot_index": 0 263 | } 264 | ], 265 | "properties": { 266 | "Node name for S&R": "CLIPTextEncode" 267 | }, 268 | "widgets_values": [ 269 | "" 270 | ] 271 | }, 272 | { 273 | "id": 25, 274 | "type": "CLIPTextEncode", 275 | "pos": [ 276 | 970, 277 | 1351 278 | ], 279 | "size": { 280 | "0": 389.5223388671875, 281 | "1": 74.33411407470703 282 | }, 283 | "flags": {}, 284 | "order": 11, 285 | "mode": 0, 286 | "inputs": [ 287 | { 288 | "name": "clip", 289 | "type": "CLIP", 290 | "link": 70 291 | }, 292 | { 293 | "name": "text", 294 | "type": "STRING", 295 | "link": 29, 296 | "widget": { 297 | "name": "text" 298 | } 299 | } 300 | ], 301 | "outputs": [ 302 | { 303 | "name": "CONDITIONING", 304 | "type": "CONDITIONING", 305 | "links": [ 306 | 46 307 | ], 308 | "shape": 3, 309 | "slot_index": 0 310 | } 311 | ], 312 | "properties": { 313 | "Node name for S&R": "CLIPTextEncode" 314 | }, 315 | "widgets_values": [ 316 | "" 317 | ] 318 | }, 319 | { 320 | "id": 27, 321 | "type": "SeaArtMergeStoryCondition", 322 | "pos": [ 323 | 1551, 324 | 217 325 | ], 326 | "size": { 327 | "0": 342.5999755859375, 328 | "1": 126 329 | }, 330 | "flags": {}, 331 | "order": 19, 332 | "mode": 0, 333 | "inputs": [ 334 | { 335 | "name": "clip", 336 | "type": "CLIP", 337 | "link": 65 338 | }, 339 | { 340 | "name": "conditioning_1", 341 | "type": "CONDITIONING", 342 | "link": 40 343 | }, 344 | { 345 | "name": "conditioning_2", 346 | "type": "CONDITIONING", 347 | "link": 41 348 | }, 349 | { 350 | "name": "conditioning_3", 351 | "type": "CONDITIONING", 352 | "link": 42 353 | }, 354 | { 355 | "name": "conditioning_4", 356 | "type": 
"CONDITIONING", 357 | "link": null 358 | }, 359 | { 360 | "name": "conditioning_5", 361 | "type": "CONDITIONING", 362 | "link": null 363 | } 364 | ], 365 | "outputs": [ 366 | { 367 | "name": "CONDITIONING", 368 | "type": "CONDITIONING", 369 | "links": [ 370 | 95 371 | ], 372 | "shape": 3, 373 | "slot_index": 0 374 | } 375 | ], 376 | "properties": { 377 | "Node name for S&R": "SeaArtMergeStoryCondition" 378 | } 379 | }, 380 | { 381 | "id": 28, 382 | "type": "SeaArtMergeStoryCondition", 383 | "pos": [ 384 | 1561, 385 | 1047 386 | ], 387 | "size": { 388 | "0": 342.5999755859375, 389 | "1": 126 390 | }, 391 | "flags": {}, 392 | "order": 18, 393 | "mode": 0, 394 | "inputs": [ 395 | { 396 | "name": "clip", 397 | "type": "CLIP", 398 | "link": 66 399 | }, 400 | { 401 | "name": "conditioning_1", 402 | "type": "CONDITIONING", 403 | "link": 44 404 | }, 405 | { 406 | "name": "conditioning_2", 407 | "type": "CONDITIONING", 408 | "link": 45 409 | }, 410 | { 411 | "name": "conditioning_3", 412 | "type": "CONDITIONING", 413 | "link": 46 414 | }, 415 | { 416 | "name": "conditioning_4", 417 | "type": "CONDITIONING", 418 | "link": null 419 | }, 420 | { 421 | "name": "conditioning_5", 422 | "type": "CONDITIONING", 423 | "link": null 424 | } 425 | ], 426 | "outputs": [ 427 | { 428 | "name": "CONDITIONING", 429 | "type": "CONDITIONING", 430 | "links": [ 431 | 96 432 | ], 433 | "shape": 3, 434 | "slot_index": 0 435 | } 436 | ], 437 | "properties": { 438 | "Node name for S&R": "SeaArtMergeStoryCondition" 439 | } 440 | }, 441 | { 442 | "id": 21, 443 | "type": "PreviewImage", 444 | "pos": [ 445 | 2062, 446 | 752 447 | ], 448 | "size": { 449 | "0": 210, 450 | "1": 246 451 | }, 452 | "flags": {}, 453 | "order": 23, 454 | "mode": 0, 455 | "inputs": [ 456 | { 457 | "name": "images", 458 | "type": "IMAGE", 459 | "link": 25 460 | } 461 | ], 462 | "properties": { 463 | "Node name for S&R": "PreviewImage" 464 | } 465 | }, 466 | { 467 | "id": 12, 468 | "type": "SeaArtAppendPrompt", 469 | "pos": [ 
470 | 421, 471 | -54 472 | ], 473 | "size": { 474 | "0": 400, 475 | "1": 200 476 | }, 477 | "flags": {}, 478 | "order": 5, 479 | "mode": 0, 480 | "inputs": [ 481 | { 482 | "name": "charactor_prompt", 483 | "type": "STRING", 484 | "link": 10, 485 | "widget": { 486 | "name": "charactor_prompt" 487 | } 488 | } 489 | ], 490 | "outputs": [ 491 | { 492 | "name": "STRING", 493 | "type": "STRING", 494 | "links": [ 495 | 18 496 | ], 497 | "shape": 3, 498 | "slot_index": 0 499 | } 500 | ], 501 | "properties": { 502 | "Node name for S&R": "SeaArtAppendPrompt" 503 | }, 504 | "widgets_values": [ 505 | "", 506 | "in the bed" 507 | ] 508 | }, 509 | { 510 | "id": 11, 511 | "type": "SeaArtCharactorPrompt", 512 | "pos": [ 513 | -84, 514 | -241 515 | ], 516 | "size": { 517 | "0": 400, 518 | "1": 200 519 | }, 520 | "flags": {}, 521 | "order": 0, 522 | "mode": 0, 523 | "outputs": [ 524 | { 525 | "name": "STRING", 526 | "type": "STRING", 527 | "links": [ 528 | 10, 529 | 11, 530 | 12, 531 | 78 532 | ], 533 | "shape": 3, 534 | "slot_index": 0 535 | } 536 | ], 537 | "properties": { 538 | "Node name for S&R": "SeaArtCharactorPrompt" 539 | }, 540 | "widgets_values": [ 541 | "a cute girl, blue hair, red cloth " 542 | ] 543 | }, 544 | { 545 | "id": 19, 546 | "type": "CLIPTextEncode", 547 | "pos": [ 548 | 958, 549 | 266 550 | ], 551 | "size": { 552 | "0": 389.5223388671875, 553 | "1": 74.33411407470703 554 | }, 555 | "flags": {}, 556 | "order": 15, 557 | "mode": 0, 558 | "inputs": [ 559 | { 560 | "name": "clip", 561 | "type": "CLIP", 562 | "link": 67 563 | }, 564 | { 565 | "name": "text", 566 | "type": "STRING", 567 | "link": 17, 568 | "widget": { 569 | "name": "text" 570 | } 571 | } 572 | ], 573 | "outputs": [ 574 | { 575 | "name": "CONDITIONING", 576 | "type": "CONDITIONING", 577 | "links": [ 578 | 41 579 | ], 580 | "shape": 3, 581 | "slot_index": 0 582 | } 583 | ], 584 | "properties": { 585 | "Node name for S&R": "CLIPTextEncode" 586 | }, 587 | "widgets_values": [ 588 | "" 589 | ] 590 | }, 
591 | { 592 | "id": 33, 593 | "type": "CLIPTextEncode", 594 | "pos": [ 595 | 1994, 596 | -277 597 | ], 598 | "size": { 599 | "0": 389.5223388671875, 600 | "1": 74.33411407470703 601 | }, 602 | "flags": {}, 603 | "order": 17, 604 | "mode": 0, 605 | "inputs": [ 606 | { 607 | "name": "clip", 608 | "type": "CLIP", 609 | "link": 80 610 | }, 611 | { 612 | "name": "text", 613 | "type": "STRING", 614 | "link": 79, 615 | "widget": { 616 | "name": "text" 617 | } 618 | } 619 | ], 620 | "outputs": [ 621 | { 622 | "name": "CONDITIONING", 623 | "type": "CONDITIONING", 624 | "links": [ 625 | 100 626 | ], 627 | "shape": 3, 628 | "slot_index": 0 629 | } 630 | ], 631 | "properties": { 632 | "Node name for S&R": "CLIPTextEncode" 633 | }, 634 | "widgets_values": [ 635 | "" 636 | ] 637 | }, 638 | { 639 | "id": 23, 640 | "type": "CLIPTextEncode", 641 | "pos": [ 642 | 976, 643 | 904 644 | ], 645 | "size": { 646 | "0": 389.5223388671875, 647 | "1": 74.33411407470703 648 | }, 649 | "flags": {}, 650 | "order": 10, 651 | "mode": 0, 652 | "inputs": [ 653 | { 654 | "name": "clip", 655 | "type": "CLIP", 656 | "link": 69 657 | }, 658 | { 659 | "name": "text", 660 | "type": "STRING", 661 | "link": 28, 662 | "widget": { 663 | "name": "text" 664 | } 665 | } 666 | ], 667 | "outputs": [ 668 | { 669 | "name": "CONDITIONING", 670 | "type": "CONDITIONING", 671 | "links": [ 672 | 44 673 | ], 674 | "shape": 3, 675 | "slot_index": 0 676 | } 677 | ], 678 | "properties": { 679 | "Node name for S&R": "CLIPTextEncode" 680 | }, 681 | "widgets_values": [ 682 | "" 683 | ] 684 | }, 685 | { 686 | "id": 22, 687 | "type": "SeaArtAppendPrompt", 688 | "pos": [ 689 | 420, 690 | 883 691 | ], 692 | "size": { 693 | "0": 400, 694 | "1": 200 695 | }, 696 | "flags": {}, 697 | "order": 1, 698 | "mode": 0, 699 | "inputs": [], 700 | "outputs": [ 701 | { 702 | "name": "STRING", 703 | "type": "STRING", 704 | "links": [ 705 | 27, 706 | 28, 707 | 29, 708 | 83 709 | ], 710 | "shape": 3, 711 | "slot_index": 0 712 | } 713 | ], 714 | 
"properties": { 715 | "Node name for S&R": "SeaArtAppendPrompt" 716 | }, 717 | "widgets_values": [ 718 | "", 719 | "water,bad" 720 | ] 721 | }, 722 | { 723 | "id": 34, 724 | "type": "CLIPTextEncode", 725 | "pos": [ 726 | 2383, 727 | 797 728 | ], 729 | "size": { 730 | "0": 389.5223388671875, 731 | "1": 74.33411407470703 732 | }, 733 | "flags": {}, 734 | "order": 13, 735 | "mode": 0, 736 | "inputs": [ 737 | { 738 | "name": "clip", 739 | "type": "CLIP", 740 | "link": 82 741 | }, 742 | { 743 | "name": "text", 744 | "type": "STRING", 745 | "link": 83, 746 | "widget": { 747 | "name": "text" 748 | } 749 | } 750 | ], 751 | "outputs": [ 752 | { 753 | "name": "CONDITIONING", 754 | "type": "CONDITIONING", 755 | "links": [ 756 | 101 757 | ], 758 | "shape": 3, 759 | "slot_index": 0 760 | } 761 | ], 762 | "properties": { 763 | "Node name for S&R": "CLIPTextEncode" 764 | }, 765 | "widgets_values": [ 766 | "" 767 | ] 768 | }, 769 | { 770 | "id": 4, 771 | "type": "CheckpointLoaderSimple", 772 | "pos": [ 773 | -696, 774 | 174 775 | ], 776 | "size": { 777 | "0": 315, 778 | "1": 98 779 | }, 780 | "flags": {}, 781 | "order": 2, 782 | "mode": 0, 783 | "outputs": [ 784 | { 785 | "name": "MODEL", 786 | "type": "MODEL", 787 | "links": [ 788 | 63 789 | ], 790 | "slot_index": 0 791 | }, 792 | { 793 | "name": "CLIP", 794 | "type": "CLIP", 795 | "links": [ 796 | 64, 797 | 65, 798 | 66, 799 | 67, 800 | 68, 801 | 69, 802 | 70, 803 | 71, 804 | 80, 805 | 82 806 | ], 807 | "slot_index": 1 808 | }, 809 | { 810 | "name": "VAE", 811 | "type": "VAE", 812 | "links": [ 813 | 8, 814 | 87 815 | ], 816 | "slot_index": 2 817 | } 818 | ], 819 | "properties": { 820 | "Node name for S&R": "CheckpointLoaderSimple" 821 | }, 822 | "widgets_values": [ 823 | "Dark_Sushi_Mix.safetensors" 824 | ] 825 | }, 826 | { 827 | "id": 36, 828 | "type": "VAEDecode", 829 | "pos": [ 830 | 2916, 831 | 630 832 | ], 833 | "size": { 834 | "0": 210, 835 | "1": 46 836 | }, 837 | "flags": {}, 838 | "order": 24, 839 | "mode": 0, 840 | 
"inputs": [ 841 | { 842 | "name": "samples", 843 | "type": "LATENT", 844 | "link": 103 845 | }, 846 | { 847 | "name": "vae", 848 | "type": "VAE", 849 | "link": 87 850 | } 851 | ], 852 | "outputs": [ 853 | { 854 | "name": "IMAGE", 855 | "type": "IMAGE", 856 | "links": [ 857 | 88 858 | ], 859 | "shape": 3, 860 | "slot_index": 0 861 | } 862 | ], 863 | "properties": { 864 | "Node name for S&R": "VAEDecode" 865 | } 866 | }, 867 | { 868 | "id": 37, 869 | "type": "PreviewImage", 870 | "pos": [ 871 | 2921.312784313015, 872 | 751.8902889385316 873 | ], 874 | "size": [ 875 | 210, 876 | 246 877 | ], 878 | "flags": {}, 879 | "order": 25, 880 | "mode": 0, 881 | "inputs": [ 882 | { 883 | "name": "images", 884 | "type": "IMAGE", 885 | "link": 88 886 | } 887 | ], 888 | "properties": { 889 | "Node name for S&R": "PreviewImage" 890 | } 891 | }, 892 | { 893 | "id": 32, 894 | "type": "SeaArtAppendPrompt", 895 | "pos": [ 896 | 1446, 897 | -276 898 | ], 899 | "size": { 900 | "0": 400, 901 | "1": 200 902 | }, 903 | "flags": {}, 904 | "order": 8, 905 | "mode": 0, 906 | "inputs": [ 907 | { 908 | "name": "charactor_prompt", 909 | "type": "STRING", 910 | "link": 78, 911 | "widget": { 912 | "name": "charactor_prompt" 913 | } 914 | } 915 | ], 916 | "outputs": [ 917 | { 918 | "name": "STRING", 919 | "type": "STRING", 920 | "links": [ 921 | 79 922 | ], 923 | "shape": 3, 924 | "slot_index": 0 925 | } 926 | ], 927 | "properties": { 928 | "Node name for S&R": "SeaArtAppendPrompt" 929 | }, 930 | "widgets_values": [ 931 | "", 932 | "on the street" 933 | ] 934 | }, 935 | { 936 | "id": 30, 937 | "type": "SeaArtStoryKSampler", 938 | "pos": [ 939 | 2075, 940 | 198 941 | ], 942 | "size": { 943 | "0": 315, 944 | "1": 262 945 | }, 946 | "flags": {}, 947 | "order": 20, 948 | "mode": 0, 949 | "inputs": [ 950 | { 951 | "name": "model", 952 | "type": "MODEL", 953 | "link": 94 954 | }, 955 | { 956 | "name": "positive", 957 | "type": "CONDITIONING", 958 | "link": 95 959 | }, 960 | { 961 | "name": "negative", 962 
| "type": "CONDITIONING", 963 | "link": 96 964 | }, 965 | { 966 | "name": "latent_image", 967 | "type": "LATENT", 968 | "link": 97 969 | } 970 | ], 971 | "outputs": [ 972 | { 973 | "name": "LATENT", 974 | "type": "LATENT", 975 | "links": [ 976 | 98 977 | ], 978 | "shape": 3, 979 | "slot_index": 0 980 | }, 981 | { 982 | "name": "MODEL", 983 | "type": "MODEL", 984 | "links": [ 985 | 99 986 | ], 987 | "shape": 3, 988 | "slot_index": 1 989 | } 990 | ], 991 | "properties": { 992 | "Node name for S&R": "SeaArtStoryKSampler" 993 | }, 994 | "widgets_values": [ 995 | 890539045590046, 996 | "randomize", 997 | 25, 998 | 8, 999 | "dpmpp_2m_sde_gpu", 1000 | "karras", 1001 | 1 1002 | ] 1003 | }, 1004 | { 1005 | "id": 16, 1006 | "type": "SeaArtApplyStory", 1007 | "pos": [ 1008 | 1603, 1009 | 49 1010 | ], 1011 | "size": { 1012 | "0": 210, 1013 | "1": 106 1014 | }, 1015 | "flags": {}, 1016 | "order": 9, 1017 | "mode": 0, 1018 | "inputs": [ 1019 | { 1020 | "name": "model", 1021 | "type": "MODEL", 1022 | "link": 63 1023 | } 1024 | ], 1025 | "outputs": [ 1026 | { 1027 | "name": "MODEL", 1028 | "type": "MODEL", 1029 | "links": [ 1030 | 94 1031 | ], 1032 | "shape": 3, 1033 | "slot_index": 0 1034 | } 1035 | ], 1036 | "properties": { 1037 | "Node name for S&R": "SeaArtApplyStory" 1038 | }, 1039 | "widgets_values": [ 1040 | 3, 1041 | 768, 1042 | 768 1043 | ] 1044 | }, 1045 | { 1046 | "id": 5, 1047 | "type": "EmptyLatentImage", 1048 | "pos": [ 1049 | 1584, 1050 | 549 1051 | ], 1052 | "size": { 1053 | "0": 315, 1054 | "1": 106 1055 | }, 1056 | "flags": {}, 1057 | "order": 3, 1058 | "mode": 0, 1059 | "outputs": [ 1060 | { 1061 | "name": "LATENT", 1062 | "type": "LATENT", 1063 | "links": [ 1064 | 97 1065 | ], 1066 | "slot_index": 0 1067 | } 1068 | ], 1069 | "properties": { 1070 | "Node name for S&R": "EmptyLatentImage" 1071 | }, 1072 | "widgets_values": [ 1073 | 768, 1074 | 768, 1075 | 3 1076 | ] 1077 | }, 1078 | { 1079 | "id": 35, 1080 | "type": "EmptyLatentImage", 1081 | "pos": [ 1082 | 
2402, 1083 | 611 1084 | ], 1085 | "size": { 1086 | "0": 315, 1087 | "1": 106 1088 | }, 1089 | "flags": {}, 1090 | "order": 4, 1091 | "mode": 0, 1092 | "outputs": [ 1093 | { 1094 | "name": "LATENT", 1095 | "type": "LATENT", 1096 | "links": [ 1097 | 102 1098 | ], 1099 | "slot_index": 0 1100 | } 1101 | ], 1102 | "properties": { 1103 | "Node name for S&R": "EmptyLatentImage" 1104 | }, 1105 | "widgets_values": [ 1106 | 768, 1107 | 768, 1108 | 1 1109 | ] 1110 | }, 1111 | { 1112 | "id": 39, 1113 | "type": "SeaArtStoryInfKSampler", 1114 | "pos": [ 1115 | 2816, 1116 | 188 1117 | ], 1118 | "size": { 1119 | "0": 315, 1120 | "1": 262 1121 | }, 1122 | "flags": {}, 1123 | "order": 22, 1124 | "mode": 0, 1125 | "inputs": [ 1126 | { 1127 | "name": "model", 1128 | "type": "MODEL", 1129 | "link": 99 1130 | }, 1131 | { 1132 | "name": "positive", 1133 | "type": "CONDITIONING", 1134 | "link": 100 1135 | }, 1136 | { 1137 | "name": "negative", 1138 | "type": "CONDITIONING", 1139 | "link": 101 1140 | }, 1141 | { 1142 | "name": "latent_image", 1143 | "type": "LATENT", 1144 | "link": 102 1145 | } 1146 | ], 1147 | "outputs": [ 1148 | { 1149 | "name": "LATENT", 1150 | "type": "LATENT", 1151 | "links": [ 1152 | 103 1153 | ], 1154 | "shape": 3, 1155 | "slot_index": 0 1156 | }, 1157 | { 1158 | "name": "MODEL", 1159 | "type": "MODEL", 1160 | "links": null, 1161 | "shape": 3 1162 | } 1163 | ], 1164 | "properties": { 1165 | "Node name for S&R": "SeaArtStoryInfKSampler" 1166 | }, 1167 | "widgets_values": [ 1168 | 62194287431100, 1169 | "randomize", 1170 | 25, 1171 | 8, 1172 | "dpmpp_2m_sde_gpu", 1173 | "karras", 1174 | 1 1175 | ] 1176 | } 1177 | ], 1178 | "links": [ 1179 | [ 1180 | 8, 1181 | 4, 1182 | 2, 1183 | 8, 1184 | 1, 1185 | "VAE" 1186 | ], 1187 | [ 1188 | 10, 1189 | 11, 1190 | 0, 1191 | 12, 1192 | 0, 1193 | "STRING" 1194 | ], 1195 | [ 1196 | 11, 1197 | 11, 1198 | 0, 1199 | 13, 1200 | 0, 1201 | "STRING" 1202 | ], 1203 | [ 1204 | 12, 1205 | 11, 1206 | 0, 1207 | 14, 1208 | 0, 1209 | "STRING" 1210 
| ], 1211 | [ 1212 | 16, 1213 | 14, 1214 | 0, 1215 | 20, 1216 | 1, 1217 | "STRING" 1218 | ], 1219 | [ 1220 | 17, 1221 | 13, 1222 | 0, 1223 | 19, 1224 | 1, 1225 | "STRING" 1226 | ], 1227 | [ 1228 | 18, 1229 | 12, 1230 | 0, 1231 | 18, 1232 | 1, 1233 | "STRING" 1234 | ], 1235 | [ 1236 | 25, 1237 | 8, 1238 | 0, 1239 | 21, 1240 | 0, 1241 | "IMAGE" 1242 | ], 1243 | [ 1244 | 27, 1245 | 22, 1246 | 0, 1247 | 24, 1248 | 1, 1249 | "STRING" 1250 | ], 1251 | [ 1252 | 28, 1253 | 22, 1254 | 0, 1255 | 23, 1256 | 1, 1257 | "STRING" 1258 | ], 1259 | [ 1260 | 29, 1261 | 22, 1262 | 0, 1263 | 25, 1264 | 1, 1265 | "STRING" 1266 | ], 1267 | [ 1268 | 40, 1269 | 18, 1270 | 0, 1271 | 27, 1272 | 1, 1273 | "CONDITIONING" 1274 | ], 1275 | [ 1276 | 41, 1277 | 19, 1278 | 0, 1279 | 27, 1280 | 2, 1281 | "CONDITIONING" 1282 | ], 1283 | [ 1284 | 42, 1285 | 20, 1286 | 0, 1287 | 27, 1288 | 3, 1289 | "CONDITIONING" 1290 | ], 1291 | [ 1292 | 44, 1293 | 23, 1294 | 0, 1295 | 28, 1296 | 1, 1297 | "CONDITIONING" 1298 | ], 1299 | [ 1300 | 45, 1301 | 24, 1302 | 0, 1303 | 28, 1304 | 2, 1305 | "CONDITIONING" 1306 | ], 1307 | [ 1308 | 46, 1309 | 25, 1310 | 0, 1311 | 28, 1312 | 3, 1313 | "CONDITIONING" 1314 | ], 1315 | [ 1316 | 63, 1317 | 4, 1318 | 0, 1319 | 16, 1320 | 0, 1321 | "MODEL" 1322 | ], 1323 | [ 1324 | 64, 1325 | 4, 1326 | 1, 1327 | 18, 1328 | 0, 1329 | "CLIP" 1330 | ], 1331 | [ 1332 | 65, 1333 | 4, 1334 | 1, 1335 | 27, 1336 | 0, 1337 | "CLIP" 1338 | ], 1339 | [ 1340 | 66, 1341 | 4, 1342 | 1, 1343 | 28, 1344 | 0, 1345 | "CLIP" 1346 | ], 1347 | [ 1348 | 67, 1349 | 4, 1350 | 1, 1351 | 19, 1352 | 0, 1353 | "CLIP" 1354 | ], 1355 | [ 1356 | 68, 1357 | 4, 1358 | 1, 1359 | 20, 1360 | 0, 1361 | "CLIP" 1362 | ], 1363 | [ 1364 | 69, 1365 | 4, 1366 | 1, 1367 | 23, 1368 | 0, 1369 | "CLIP" 1370 | ], 1371 | [ 1372 | 70, 1373 | 4, 1374 | 1, 1375 | 25, 1376 | 0, 1377 | "CLIP" 1378 | ], 1379 | [ 1380 | 71, 1381 | 4, 1382 | 1, 1383 | 24, 1384 | 0, 1385 | "CLIP" 1386 | ], 1387 | [ 1388 | 78, 1389 | 11, 1390 | 0, 1391 | 
32, 1392 | 0, 1393 | "STRING" 1394 | ], 1395 | [ 1396 | 79, 1397 | 32, 1398 | 0, 1399 | 33, 1400 | 1, 1401 | "STRING" 1402 | ], 1403 | [ 1404 | 80, 1405 | 4, 1406 | 1, 1407 | 33, 1408 | 0, 1409 | "CLIP" 1410 | ], 1411 | [ 1412 | 82, 1413 | 4, 1414 | 1, 1415 | 34, 1416 | 0, 1417 | "CLIP" 1418 | ], 1419 | [ 1420 | 83, 1421 | 22, 1422 | 0, 1423 | 34, 1424 | 1, 1425 | "STRING" 1426 | ], 1427 | [ 1428 | 87, 1429 | 4, 1430 | 2, 1431 | 36, 1432 | 1, 1433 | "VAE" 1434 | ], 1435 | [ 1436 | 88, 1437 | 36, 1438 | 0, 1439 | 37, 1440 | 0, 1441 | "IMAGE" 1442 | ], 1443 | [ 1444 | 94, 1445 | 16, 1446 | 0, 1447 | 30, 1448 | 0, 1449 | "MODEL" 1450 | ], 1451 | [ 1452 | 95, 1453 | 27, 1454 | 0, 1455 | 30, 1456 | 1, 1457 | "CONDITIONING" 1458 | ], 1459 | [ 1460 | 96, 1461 | 28, 1462 | 0, 1463 | 30, 1464 | 2, 1465 | "CONDITIONING" 1466 | ], 1467 | [ 1468 | 97, 1469 | 5, 1470 | 0, 1471 | 30, 1472 | 3, 1473 | "LATENT" 1474 | ], 1475 | [ 1476 | 98, 1477 | 30, 1478 | 0, 1479 | 8, 1480 | 0, 1481 | "LATENT" 1482 | ], 1483 | [ 1484 | 99, 1485 | 30, 1486 | 1, 1487 | 39, 1488 | 0, 1489 | "MODEL" 1490 | ], 1491 | [ 1492 | 100, 1493 | 33, 1494 | 0, 1495 | 39, 1496 | 1, 1497 | "CONDITIONING" 1498 | ], 1499 | [ 1500 | 101, 1501 | 34, 1502 | 0, 1503 | 39, 1504 | 2, 1505 | "CONDITIONING" 1506 | ], 1507 | [ 1508 | 102, 1509 | 35, 1510 | 0, 1511 | 39, 1512 | 3, 1513 | "LATENT" 1514 | ], 1515 | [ 1516 | 103, 1517 | 39, 1518 | 0, 1519 | 36, 1520 | 0, 1521 | "LATENT" 1522 | ] 1523 | ], 1524 | "groups": [], 1525 | "config": {}, 1526 | "extra": { 1527 | "ds": { 1528 | "scale": 1.2100000000000006, 1529 | "offset": [ 1530 | -1796.6191053800735, 1531 | -431.6310747064882 1532 | ] 1533 | } 1534 | }, 1535 | "version": 0.4 1536 | } -------------------------------------------------------------------------------- /workflow/story_with_lora.json: -------------------------------------------------------------------------------- 1 | { 2 | "last_node_id": 29, 3 | "last_link_id": 62, 4 | "nodes": [ 5 | { 6 | "id": 11, 7 | 
"type": "SeaArtCharactorPrompt", 8 | "pos": [ 9 | -84, 10 | -241 11 | ], 12 | "size": { 13 | "0": 400, 14 | "1": 200 15 | }, 16 | "flags": {}, 17 | "order": 0, 18 | "mode": 0, 19 | "outputs": [ 20 | { 21 | "name": "STRING", 22 | "type": "STRING", 23 | "links": [ 24 | 10, 25 | 11, 26 | 12 27 | ], 28 | "shape": 3, 29 | "slot_index": 0 30 | } 31 | ], 32 | "properties": { 33 | "Node name for S&R": "SeaArtCharactorPrompt" 34 | }, 35 | "widgets_values": [ 36 | "a cute girl, blue hair, red cloth " 37 | ] 38 | }, 39 | { 40 | "id": 13, 41 | "type": "SeaArtAppendPrompt", 42 | "pos": [ 43 | 427, 44 | 282 45 | ], 46 | "size": { 47 | "0": 400, 48 | "1": 200 49 | }, 50 | "flags": {}, 51 | "order": 5, 52 | "mode": 0, 53 | "inputs": [ 54 | { 55 | "name": "charactor_prompt", 56 | "type": "STRING", 57 | "link": 11, 58 | "widget": { 59 | "name": "charactor_prompt" 60 | } 61 | } 62 | ], 63 | "outputs": [ 64 | { 65 | "name": "STRING", 66 | "type": "STRING", 67 | "links": [ 68 | 17 69 | ], 70 | "shape": 3, 71 | "slot_index": 0 72 | } 73 | ], 74 | "properties": { 75 | "Node name for S&R": "SeaArtAppendPrompt" 76 | }, 77 | "widgets_values": [ 78 | "", 79 | "in the park" 80 | ] 81 | }, 82 | { 83 | "id": 12, 84 | "type": "SeaArtAppendPrompt", 85 | "pos": [ 86 | 421, 87 | -54 88 | ], 89 | "size": { 90 | "0": 400, 91 | "1": 200 92 | }, 93 | "flags": {}, 94 | "order": 4, 95 | "mode": 0, 96 | "inputs": [ 97 | { 98 | "name": "charactor_prompt", 99 | "type": "STRING", 100 | "link": 10, 101 | "widget": { 102 | "name": "charactor_prompt" 103 | } 104 | } 105 | ], 106 | "outputs": [ 107 | { 108 | "name": "STRING", 109 | "type": "STRING", 110 | "links": [ 111 | 18 112 | ], 113 | "shape": 3, 114 | "slot_index": 0 115 | } 116 | ], 117 | "properties": { 118 | "Node name for S&R": "SeaArtAppendPrompt" 119 | }, 120 | "widgets_values": [ 121 | "", 122 | "in the bed" 123 | ] 124 | }, 125 | { 126 | "id": 8, 127 | "type": "VAEDecode", 128 | "pos": [ 129 | 2104, 130 | 637 131 | ], 132 | "size": { 133 | "0": 
210, 134 | "1": 46 135 | }, 136 | "flags": {}, 137 | "order": 18, 138 | "mode": 0, 139 | "inputs": [ 140 | { 141 | "name": "samples", 142 | "type": "LATENT", 143 | "link": 7 144 | }, 145 | { 146 | "name": "vae", 147 | "type": "VAE", 148 | "link": 8 149 | } 150 | ], 151 | "outputs": [ 152 | { 153 | "name": "IMAGE", 154 | "type": "IMAGE", 155 | "links": [ 156 | 25 157 | ], 158 | "slot_index": 0 159 | } 160 | ], 161 | "properties": { 162 | "Node name for S&R": "VAEDecode" 163 | } 164 | }, 165 | { 166 | "id": 3, 167 | "type": "KSampler", 168 | "pos": [ 169 | 2084, 170 | 262 171 | ], 172 | "size": { 173 | "0": 315, 174 | "1": 262 175 | }, 176 | "flags": {}, 177 | "order": 17, 178 | "mode": 0, 179 | "inputs": [ 180 | { 181 | "name": "model", 182 | "type": "MODEL", 183 | "link": 14 184 | }, 185 | { 186 | "name": "positive", 187 | "type": "CONDITIONING", 188 | "link": 48 189 | }, 190 | { 191 | "name": "negative", 192 | "type": "CONDITIONING", 193 | "link": 49 194 | }, 195 | { 196 | "name": "latent_image", 197 | "type": "LATENT", 198 | "link": 2 199 | } 200 | ], 201 | "outputs": [ 202 | { 203 | "name": "LATENT", 204 | "type": "LATENT", 205 | "links": [ 206 | 7 207 | ], 208 | "slot_index": 0 209 | } 210 | ], 211 | "properties": { 212 | "Node name for S&R": "KSampler" 213 | }, 214 | "widgets_values": [ 215 | 919099699469284, 216 | "randomize", 217 | 25, 218 | 8, 219 | "dpmpp_2m_sde_gpu", 220 | "karras", 221 | 1 222 | ] 223 | }, 224 | { 225 | "id": 14, 226 | "type": "SeaArtAppendPrompt", 227 | "pos": [ 228 | 423, 229 | 574 230 | ], 231 | "size": { 232 | "0": 400, 233 | "1": 200 234 | }, 235 | "flags": {}, 236 | "order": 6, 237 | "mode": 0, 238 | "inputs": [ 239 | { 240 | "name": "charactor_prompt", 241 | "type": "STRING", 242 | "link": 12, 243 | "widget": { 244 | "name": "charactor_prompt" 245 | } 246 | } 247 | ], 248 | "outputs": [ 249 | { 250 | "name": "STRING", 251 | "type": "STRING", 252 | "links": [ 253 | 16 254 | ], 255 | "shape": 3, 256 | "slot_index": 0 257 | } 258 | 
], 259 | "properties": { 260 | "Node name for S&R": "SeaArtAppendPrompt" 261 | }, 262 | "widgets_values": [ 263 | "", 264 | "read book" 265 | ] 266 | }, 267 | { 268 | "id": 18, 269 | "type": "CLIPTextEncode", 270 | "pos": [ 271 | 957, 272 | -50 273 | ], 274 | "size": { 275 | "0": 389.5223388671875, 276 | "1": 74.33411407470703 277 | }, 278 | "flags": {}, 279 | "order": 9, 280 | "mode": 0, 281 | "inputs": [ 282 | { 283 | "name": "clip", 284 | "type": "CLIP", 285 | "link": 53 286 | }, 287 | { 288 | "name": "text", 289 | "type": "STRING", 290 | "link": 18, 291 | "widget": { 292 | "name": "text" 293 | } 294 | } 295 | ], 296 | "outputs": [ 297 | { 298 | "name": "CONDITIONING", 299 | "type": "CONDITIONING", 300 | "links": [ 301 | 40 302 | ], 303 | "shape": 3, 304 | "slot_index": 0 305 | } 306 | ], 307 | "properties": { 308 | "Node name for S&R": "CLIPTextEncode" 309 | }, 310 | "widgets_values": [ 311 | "" 312 | ] 313 | }, 314 | { 315 | "id": 19, 316 | "type": "CLIPTextEncode", 317 | "pos": [ 318 | 958, 319 | 266 320 | ], 321 | "size": { 322 | "0": 389.5223388671875, 323 | "1": 74.33411407470703 324 | }, 325 | "flags": {}, 326 | "order": 10, 327 | "mode": 0, 328 | "inputs": [ 329 | { 330 | "name": "clip", 331 | "type": "CLIP", 332 | "link": 54 333 | }, 334 | { 335 | "name": "text", 336 | "type": "STRING", 337 | "link": 17, 338 | "widget": { 339 | "name": "text" 340 | } 341 | } 342 | ], 343 | "outputs": [ 344 | { 345 | "name": "CONDITIONING", 346 | "type": "CONDITIONING", 347 | "links": [ 348 | 41 349 | ], 350 | "shape": 3, 351 | "slot_index": 0 352 | } 353 | ], 354 | "properties": { 355 | "Node name for S&R": "CLIPTextEncode" 356 | }, 357 | "widgets_values": [ 358 | "" 359 | ] 360 | }, 361 | { 362 | "id": 20, 363 | "type": "CLIPTextEncode", 364 | "pos": [ 365 | 960, 366 | 589 367 | ], 368 | "size": { 369 | "0": 389.5223388671875, 370 | "1": 74.33411407470703 371 | }, 372 | "flags": {}, 373 | "order": 11, 374 | "mode": 0, 375 | "inputs": [ 376 | { 377 | "name": "clip", 378 
| "type": "CLIP", 379 | "link": 55 380 | }, 381 | { 382 | "name": "text", 383 | "type": "STRING", 384 | "link": 16, 385 | "widget": { 386 | "name": "text" 387 | } 388 | } 389 | ], 390 | "outputs": [ 391 | { 392 | "name": "CONDITIONING", 393 | "type": "CONDITIONING", 394 | "links": [ 395 | 42 396 | ], 397 | "shape": 3, 398 | "slot_index": 0 399 | } 400 | ], 401 | "properties": { 402 | "Node name for S&R": "CLIPTextEncode" 403 | }, 404 | "widgets_values": [ 405 | "" 406 | ] 407 | }, 408 | { 409 | "id": 23, 410 | "type": "CLIPTextEncode", 411 | "pos": [ 412 | 976, 413 | 904 414 | ], 415 | "size": { 416 | "0": 389.5223388671875, 417 | "1": 74.33411407470703 418 | }, 419 | "flags": {}, 420 | "order": 12, 421 | "mode": 0, 422 | "inputs": [ 423 | { 424 | "name": "clip", 425 | "type": "CLIP", 426 | "link": 56 427 | }, 428 | { 429 | "name": "text", 430 | "type": "STRING", 431 | "link": 28, 432 | "widget": { 433 | "name": "text" 434 | } 435 | } 436 | ], 437 | "outputs": [ 438 | { 439 | "name": "CONDITIONING", 440 | "type": "CONDITIONING", 441 | "links": [ 442 | 44 443 | ], 444 | "shape": 3, 445 | "slot_index": 0 446 | } 447 | ], 448 | "properties": { 449 | "Node name for S&R": "CLIPTextEncode" 450 | }, 451 | "widgets_values": [ 452 | "" 453 | ] 454 | }, 455 | { 456 | "id": 24, 457 | "type": "CLIPTextEncode", 458 | "pos": [ 459 | 970, 460 | 1130 461 | ], 462 | "size": { 463 | "0": 389.5223388671875, 464 | "1": 74.33411407470703 465 | }, 466 | "flags": {}, 467 | "order": 13, 468 | "mode": 0, 469 | "inputs": [ 470 | { 471 | "name": "clip", 472 | "type": "CLIP", 473 | "link": 57 474 | }, 475 | { 476 | "name": "text", 477 | "type": "STRING", 478 | "link": 27, 479 | "widget": { 480 | "name": "text" 481 | } 482 | } 483 | ], 484 | "outputs": [ 485 | { 486 | "name": "CONDITIONING", 487 | "type": "CONDITIONING", 488 | "links": [ 489 | 45 490 | ], 491 | "shape": 3, 492 | "slot_index": 0 493 | } 494 | ], 495 | "properties": { 496 | "Node name for S&R": "CLIPTextEncode" 497 | }, 498 | 
"widgets_values": [ 499 | "" 500 | ] 501 | }, 502 | { 503 | "id": 25, 504 | "type": "CLIPTextEncode", 505 | "pos": [ 506 | 970, 507 | 1351 508 | ], 509 | "size": { 510 | "0": 389.5223388671875, 511 | "1": 74.33411407470703 512 | }, 513 | "flags": {}, 514 | "order": 14, 515 | "mode": 0, 516 | "inputs": [ 517 | { 518 | "name": "clip", 519 | "type": "CLIP", 520 | "link": 58 521 | }, 522 | { 523 | "name": "text", 524 | "type": "STRING", 525 | "link": 29, 526 | "widget": { 527 | "name": "text" 528 | } 529 | } 530 | ], 531 | "outputs": [ 532 | { 533 | "name": "CONDITIONING", 534 | "type": "CONDITIONING", 535 | "links": [ 536 | 46 537 | ], 538 | "shape": 3, 539 | "slot_index": 0 540 | } 541 | ], 542 | "properties": { 543 | "Node name for S&R": "CLIPTextEncode" 544 | }, 545 | "widgets_values": [ 546 | "" 547 | ] 548 | }, 549 | { 550 | "id": 27, 551 | "type": "SeaArtMergeStoryCondition", 552 | "pos": [ 553 | 1551, 554 | 217 555 | ], 556 | "size": { 557 | "0": 342.5999755859375, 558 | "1": 126 559 | }, 560 | "flags": {}, 561 | "order": 15, 562 | "mode": 0, 563 | "inputs": [ 564 | { 565 | "name": "clip", 566 | "type": "CLIP", 567 | "link": 59 568 | }, 569 | { 570 | "name": "conditioning_1", 571 | "type": "CONDITIONING", 572 | "link": 40 573 | }, 574 | { 575 | "name": "conditioning_2", 576 | "type": "CONDITIONING", 577 | "link": 41 578 | }, 579 | { 580 | "name": "conditioning_3", 581 | "type": "CONDITIONING", 582 | "link": 42 583 | }, 584 | { 585 | "name": "conditioning_4", 586 | "type": "CONDITIONING", 587 | "link": null 588 | }, 589 | { 590 | "name": "conditioning_5", 591 | "type": "CONDITIONING", 592 | "link": null 593 | } 594 | ], 595 | "outputs": [ 596 | { 597 | "name": "CONDITIONING", 598 | "type": "CONDITIONING", 599 | "links": [ 600 | 48 601 | ], 602 | "shape": 3, 603 | "slot_index": 0 604 | } 605 | ], 606 | "properties": { 607 | "Node name for S&R": "SeaArtMergeStoryCondition" 608 | } 609 | }, 610 | { 611 | "id": 22, 612 | "type": "SeaArtAppendPrompt", 613 | "pos": [ 
614 | 420, 615 | 883 616 | ], 617 | "size": { 618 | "0": 400, 619 | "1": 200 620 | }, 621 | "flags": {}, 622 | "order": 1, 623 | "mode": 0, 624 | "inputs": [], 625 | "outputs": [ 626 | { 627 | "name": "STRING", 628 | "type": "STRING", 629 | "links": [ 630 | 27, 631 | 28, 632 | 29 633 | ], 634 | "shape": 3, 635 | "slot_index": 0 636 | } 637 | ], 638 | "properties": { 639 | "Node name for S&R": "SeaArtAppendPrompt" 640 | }, 641 | "widgets_values": [ 642 | "", 643 | "water,bad" 644 | ] 645 | }, 646 | { 647 | "id": 16, 648 | "type": "SeaArtApplyStory", 649 | "pos": [ 650 | 1603, 651 | 49 652 | ], 653 | "size": { 654 | "0": 210, 655 | "1": 106 656 | }, 657 | "flags": {}, 658 | "order": 8, 659 | "mode": 0, 660 | "inputs": [ 661 | { 662 | "name": "model", 663 | "type": "MODEL", 664 | "link": 62 665 | } 666 | ], 667 | "outputs": [ 668 | { 669 | "name": "MODEL", 670 | "type": "MODEL", 671 | "links": [ 672 | 14 673 | ], 674 | "shape": 3, 675 | "slot_index": 0 676 | } 677 | ], 678 | "properties": { 679 | "Node name for S&R": "SeaArtApplyStory" 680 | }, 681 | "widgets_values": [ 682 | 3, 683 | 1024, 684 | 1024 685 | ] 686 | }, 687 | { 688 | "id": 28, 689 | "type": "SeaArtMergeStoryCondition", 690 | "pos": [ 691 | 1561, 692 | 1047 693 | ], 694 | "size": { 695 | "0": 342.5999755859375, 696 | "1": 126 697 | }, 698 | "flags": {}, 699 | "order": 16, 700 | "mode": 0, 701 | "inputs": [ 702 | { 703 | "name": "clip", 704 | "type": "CLIP", 705 | "link": 60 706 | }, 707 | { 708 | "name": "conditioning_1", 709 | "type": "CONDITIONING", 710 | "link": 44 711 | }, 712 | { 713 | "name": "conditioning_2", 714 | "type": "CONDITIONING", 715 | "link": 45 716 | }, 717 | { 718 | "name": "conditioning_3", 719 | "type": "CONDITIONING", 720 | "link": 46 721 | }, 722 | { 723 | "name": "conditioning_4", 724 | "type": "CONDITIONING", 725 | "link": null 726 | }, 727 | { 728 | "name": "conditioning_5", 729 | "type": "CONDITIONING", 730 | "link": null 731 | } 732 | ], 733 | "outputs": [ 734 | { 735 | 
"name": "CONDITIONING", 736 | "type": "CONDITIONING", 737 | "links": [ 738 | 49 739 | ], 740 | "shape": 3, 741 | "slot_index": 0 742 | } 743 | ], 744 | "properties": { 745 | "Node name for S&R": "SeaArtMergeStoryCondition" 746 | } 747 | }, 748 | { 749 | "id": 4, 750 | "type": "CheckpointLoaderSimple", 751 | "pos": [ 752 | -696, 753 | 174 754 | ], 755 | "size": { 756 | "0": 315, 757 | "1": 98 758 | }, 759 | "flags": {}, 760 | "order": 2, 761 | "mode": 0, 762 | "outputs": [ 763 | { 764 | "name": "MODEL", 765 | "type": "MODEL", 766 | "links": [ 767 | 50 768 | ], 769 | "slot_index": 0 770 | }, 771 | { 772 | "name": "CLIP", 773 | "type": "CLIP", 774 | "links": [ 775 | 51 776 | ], 777 | "slot_index": 1 778 | }, 779 | { 780 | "name": "VAE", 781 | "type": "VAE", 782 | "links": [ 783 | 8 784 | ], 785 | "slot_index": 2 786 | } 787 | ], 788 | "properties": { 789 | "Node name for S&R": "CheckpointLoaderSimple" 790 | }, 791 | "widgets_values": [ 792 | "Dark_Sushi_Mix.safetensors" 793 | ] 794 | }, 795 | { 796 | "id": 21, 797 | "type": "PreviewImage", 798 | "pos": [ 799 | 2110, 800 | 757 801 | ], 802 | "size": { 803 | "0": 210, 804 | "1": 246 805 | }, 806 | "flags": {}, 807 | "order": 19, 808 | "mode": 0, 809 | "inputs": [ 810 | { 811 | "name": "images", 812 | "type": "IMAGE", 813 | "link": 25 814 | } 815 | ], 816 | "properties": { 817 | "Node name for S&R": "PreviewImage" 818 | } 819 | }, 820 | { 821 | "id": 5, 822 | "type": "EmptyLatentImage", 823 | "pos": [ 824 | 1584, 825 | 549 826 | ], 827 | "size": { 828 | "0": 315, 829 | "1": 106 830 | }, 831 | "flags": {}, 832 | "order": 3, 833 | "mode": 0, 834 | "outputs": [ 835 | { 836 | "name": "LATENT", 837 | "type": "LATENT", 838 | "links": [ 839 | 2 840 | ], 841 | "slot_index": 0 842 | } 843 | ], 844 | "properties": { 845 | "Node name for S&R": "EmptyLatentImage" 846 | }, 847 | "widgets_values": [ 848 | 1024, 849 | 1024, 850 | 3 851 | ] 852 | }, 853 | { 854 | "id": 29, 855 | "type": "LoraLoader", 856 | "pos": [ 857 | -209, 858 | 172 
859 | ], 860 | "size": { 861 | "0": 315, 862 | "1": 126 863 | }, 864 | "flags": {}, 865 | "order": 7, 866 | "mode": 0, 867 | "inputs": [ 868 | { 869 | "name": "model", 870 | "type": "MODEL", 871 | "link": 50 872 | }, 873 | { 874 | "name": "clip", 875 | "type": "CLIP", 876 | "link": 51 877 | } 878 | ], 879 | "outputs": [ 880 | { 881 | "name": "MODEL", 882 | "type": "MODEL", 883 | "links": [ 884 | 62 885 | ], 886 | "shape": 3, 887 | "slot_index": 0 888 | }, 889 | { 890 | "name": "CLIP", 891 | "type": "CLIP", 892 | "links": [ 893 | 53, 894 | 54, 895 | 55, 896 | 56, 897 | 57, 898 | 58, 899 | 59, 900 | 60 901 | ], 902 | "shape": 3, 903 | "slot_index": 1 904 | } 905 | ], 906 | "properties": { 907 | "Node name for S&R": "LoraLoader" 908 | }, 909 | "widgets_values": [ 910 | "blindbox.safetensors", 911 | 0.6, 912 | 0.6 913 | ] 914 | } 915 | ], 916 | "links": [ 917 | [ 918 | 2, 919 | 5, 920 | 0, 921 | 3, 922 | 3, 923 | "LATENT" 924 | ], 925 | [ 926 | 7, 927 | 3, 928 | 0, 929 | 8, 930 | 0, 931 | "LATENT" 932 | ], 933 | [ 934 | 8, 935 | 4, 936 | 2, 937 | 8, 938 | 1, 939 | "VAE" 940 | ], 941 | [ 942 | 10, 943 | 11, 944 | 0, 945 | 12, 946 | 0, 947 | "STRING" 948 | ], 949 | [ 950 | 11, 951 | 11, 952 | 0, 953 | 13, 954 | 0, 955 | "STRING" 956 | ], 957 | [ 958 | 12, 959 | 11, 960 | 0, 961 | 14, 962 | 0, 963 | "STRING" 964 | ], 965 | [ 966 | 14, 967 | 16, 968 | 0, 969 | 3, 970 | 0, 971 | "MODEL" 972 | ], 973 | [ 974 | 16, 975 | 14, 976 | 0, 977 | 20, 978 | 1, 979 | "STRING" 980 | ], 981 | [ 982 | 17, 983 | 13, 984 | 0, 985 | 19, 986 | 1, 987 | "STRING" 988 | ], 989 | [ 990 | 18, 991 | 12, 992 | 0, 993 | 18, 994 | 1, 995 | "STRING" 996 | ], 997 | [ 998 | 25, 999 | 8, 1000 | 0, 1001 | 21, 1002 | 0, 1003 | "IMAGE" 1004 | ], 1005 | [ 1006 | 27, 1007 | 22, 1008 | 0, 1009 | 24, 1010 | 1, 1011 | "STRING" 1012 | ], 1013 | [ 1014 | 28, 1015 | 22, 1016 | 0, 1017 | 23, 1018 | 1, 1019 | "STRING" 1020 | ], 1021 | [ 1022 | 29, 1023 | 22, 1024 | 0, 1025 | 25, 1026 | 1, 1027 | "STRING" 1028 | ], 
1029 | [ 1030 | 40, 1031 | 18, 1032 | 0, 1033 | 27, 1034 | 1, 1035 | "CONDITIONING" 1036 | ], 1037 | [ 1038 | 41, 1039 | 19, 1040 | 0, 1041 | 27, 1042 | 2, 1043 | "CONDITIONING" 1044 | ], 1045 | [ 1046 | 42, 1047 | 20, 1048 | 0, 1049 | 27, 1050 | 3, 1051 | "CONDITIONING" 1052 | ], 1053 | [ 1054 | 44, 1055 | 23, 1056 | 0, 1057 | 28, 1058 | 1, 1059 | "CONDITIONING" 1060 | ], 1061 | [ 1062 | 45, 1063 | 24, 1064 | 0, 1065 | 28, 1066 | 2, 1067 | "CONDITIONING" 1068 | ], 1069 | [ 1070 | 46, 1071 | 25, 1072 | 0, 1073 | 28, 1074 | 3, 1075 | "CONDITIONING" 1076 | ], 1077 | [ 1078 | 48, 1079 | 27, 1080 | 0, 1081 | 3, 1082 | 1, 1083 | "CONDITIONING" 1084 | ], 1085 | [ 1086 | 49, 1087 | 28, 1088 | 0, 1089 | 3, 1090 | 2, 1091 | "CONDITIONING" 1092 | ], 1093 | [ 1094 | 50, 1095 | 4, 1096 | 0, 1097 | 29, 1098 | 0, 1099 | "MODEL" 1100 | ], 1101 | [ 1102 | 51, 1103 | 4, 1104 | 1, 1105 | 29, 1106 | 1, 1107 | "CLIP" 1108 | ], 1109 | [ 1110 | 53, 1111 | 29, 1112 | 1, 1113 | 18, 1114 | 0, 1115 | "CLIP" 1116 | ], 1117 | [ 1118 | 54, 1119 | 29, 1120 | 1, 1121 | 19, 1122 | 0, 1123 | "CLIP" 1124 | ], 1125 | [ 1126 | 55, 1127 | 29, 1128 | 1, 1129 | 20, 1130 | 0, 1131 | "CLIP" 1132 | ], 1133 | [ 1134 | 56, 1135 | 29, 1136 | 1, 1137 | 23, 1138 | 0, 1139 | "CLIP" 1140 | ], 1141 | [ 1142 | 57, 1143 | 29, 1144 | 1, 1145 | 24, 1146 | 0, 1147 | "CLIP" 1148 | ], 1149 | [ 1150 | 58, 1151 | 29, 1152 | 1, 1153 | 25, 1154 | 0, 1155 | "CLIP" 1156 | ], 1157 | [ 1158 | 59, 1159 | 29, 1160 | 1, 1161 | 27, 1162 | 0, 1163 | "CLIP" 1164 | ], 1165 | [ 1166 | 60, 1167 | 29, 1168 | 1, 1169 | 28, 1170 | 0, 1171 | "CLIP" 1172 | ], 1173 | [ 1174 | 62, 1175 | 29, 1176 | 0, 1177 | 16, 1178 | 0, 1179 | "MODEL" 1180 | ] 1181 | ], 1182 | "groups": [], 1183 | "config": {}, 1184 | "extra": { 1185 | "ds": { 1186 | "scale": 0.683013455365071, 1187 | "offset": [ 1188 | 57.99931960174787, 1189 | -70.24046594239127 1190 | ] 1191 | } 1192 | }, 1193 | "version": 0.4 1194 | } 
--------------------------------------------------------------------------------
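
The workflow files above share the standard ComfyUI graph schema: a `nodes` array (each node carrying `id`, `type`, `inputs`, `outputs`, `widgets_values`) and a `links` array whose entries are `[link_id, src_node_id, src_slot, dst_node_id, dst_slot, type]`. As an illustration only (this helper is not part of the repository), a minimal sketch that summarizes such a workflow — here fed a tiny inline example mirroring the `LoraLoader → SeaArtApplyStory` MODEL edge from `story_with_lora.json`:

```python
import json

def summarize_workflow(workflow: dict) -> dict:
    """Summarize a ComfyUI workflow dict: count nodes and resolve each
    link entry [link_id, src_node, src_slot, dst_node, dst_slot, type]
    into (source node type, destination node type, link type)."""
    nodes = {n["id"]: n["type"] for n in workflow.get("nodes", [])}
    edges = [
        (nodes.get(src, "?"), nodes.get(dst, "?"), ltype)
        for _link_id, src, _src_slot, dst, _dst_slot, ltype in workflow.get("links", [])
    ]
    return {"node_count": len(nodes), "edges": edges}

# Tiny inline example (hypothetical subset, same schema as story_with_lora.json):
example = json.loads("""
{
  "nodes": [
    {"id": 29, "type": "LoraLoader"},
    {"id": 16, "type": "SeaArtApplyStory"}
  ],
  "links": [[62, 29, 0, 16, 0, "MODEL"]]
}
""")

print(summarize_workflow(example))
# → {'node_count': 2, 'edges': [('LoraLoader', 'SeaArtApplyStory', 'MODEL')]}
```

The same traversal applied to the full `story_with_lora.json` would show the LoRA-patched MODEL flowing into SeaArtApplyStory and the LoRA CLIP output fanning out to all CLIPTextEncode nodes, matching the README's note that lora is supported alongside the story attention patch.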