├── .gitignore
├── LICENSE
├── README.md
├── assets
│   ├── Screenshot1.png
│   ├── Screenshot2.png
│   └── banner.png
├── components
│   ├── CLIP-AUX-TOKEN.txt
│   ├── CLIP-L-LONG.txt
│   ├── CLIP-SvD-XT-Decoder.txt
│   ├── CLIP-SvD-XT-Visual.txt
│   ├── CLIP-SvD-XT-VisualDecoder.txt
│   ├── CLIP-SvD-XT.txt
│   ├── CLIP-XL-SD.txt
│   ├── CLIP-XL-SSD.txt
│   ├── CLIP-XL-TOKEMB.txt
│   ├── CLIP-XL-TOKEN.txt
│   ├── CLIP-v1-LONG.txt
│   ├── CLIP-v1-POS.txt
│   ├── CLIP-v1-SCPR.txt
│   ├── CLIP-v1-SD.txt
│   ├── CLIP-v1-TOKEN.txt
│   ├── CLIP-v2-SD.txt
│   ├── CLIP-v2-WD.txt
│   ├── CLIP-v3-G.txt
│   ├── CLIP-v3-L.txt
│   ├── CLIP-v3-T5.txt
│   ├── Clipvision-G.txt
│   ├── ControlNet-v1-SD.txt
│   ├── Depth-v2-SD.txt
│   ├── LoCON-XL-UNET.txt
│   ├── LoKr-XL-AUX-CLIP.txt
│   ├── LoKr-XL-CLIP.txt
│   ├── LoKr-XL-UNET.txt
│   ├── LoRA-XL-AUX-CLIP.txt
│   ├── LoRA-XL-CLIP.txt
│   ├── LoRA-XL-UNET-D.txt
│   ├── LoRA-XL-UNET-M.txt
│   ├── LoRA-XL-UNET-S.txt
│   ├── LoRA-XL-UNET.txt
│   ├── LoRA-v1-CLIP.txt
│   ├── LoRA-v1-UNET.txt
│   ├── LoRA-v1A-CLIP.txt
│   ├── LoRA-v1A-UNET.txt
│   ├── LyCO-XL-AUX-CLIP.txt
│   ├── LyCO-XL-CLIP.txt
│   ├── LyCO-XL-UNET.txt
│   ├── Prefixed-v3-G.txt
│   ├── Prefixed-v3-L.txt
│   ├── Prefixed-v3-MMDIT.txt
│   ├── Prefixed-v3-T5.txt
│   ├── Prefixed-v3-VAE.txt
│   ├── UNET-CAS-C.txt
│   ├── UNET-SvD-XT-Decoder.txt
│   ├── UNET-SvD-XT.txt
│   ├── UNET-XL-A-MAIN.txt
│   ├── UNET-XL-B-Inpainting.txt
│   ├── UNET-XL-B-SD.txt
│   ├── UNET-XL-DOWN.txt
│   ├── UNET-XL-Inpainting.txt
│   ├── UNET-XL-Refiner.txt
│   ├── UNET-XL-SD.txt
│   ├── UNET-v1-DOWN.txt
│   ├── UNET-v1-EMA.txt
│   ├── UNET-v1-EMBLAY.txt
│   ├── UNET-v1-Inpainting.txt
│   ├── UNET-v1-MID.txt
│   ├── UNET-v1-Pix2Pix-EMA.txt
│   ├── UNET-v1-Pix2Pix.txt
│   ├── UNET-v1-SD.txt
│   ├── UNET-v1-SKIP.txt
│   ├── UNET-v1-TIME-EMB.txt
│   ├── UNET-v1-UP.txt
│   ├── UNET-v2-Depth.txt
│   ├── UNET-v2-Inpainting.txt
│   ├── UNET-v2-Refiner.txt
│   ├── UNET-v2-SD.txt
│   ├── UNET-v3-MMDIT.txt
│   ├── VAE-MU-DE.txt
│   ├── VAE-MU-EN.txt
│   ├── VAE-NU-DE.txt
│   ├── VAE-NU-EN.txt
│   ├── VAE-UNK-DE.txt
│   ├── VAE-UNK-EN.txt
│   ├── VAE-v1-DS.txt
│   ├── VAE-v1-SD.txt
│   ├── VAE-v3-SD.txt
│   ├── VAE-vX-D.txt
│   ├── VAE-vX-S.txt
│   ├── cascade_vae.txt
│   ├── clip_g.txt
│   └── clip_l_vision.txt
├── scripts
│   └── toolkit_gui.py
└── 
toolkit.py /.gitignore: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/silveroxides/stable-diffusion-webui-model-toolkit-revisited/d80cc166046a70744b56cf18745ee8358019d021/.gitignore -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 arenatemp 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # stable-diffusion-model-toolkit 2 | 3 | A Multipurpose toolkit for managing, editing and creating models. 4 | 5 | ![](https://cdn.discordapp.com/attachments/973151736946622467/1067839983781679165/image.png) 6 | 7 |
8 | Screenshots 9 | 10 | - ![Basic Tab](assets/Screenshot1.png) 11 | 12 | - ![Advanced Tab](assets/Screenshot2.png) 13 |
14 | 15 | Install by: `Extensions tab > Install from URL > Paste in this pages URL > Install` 16 | 17 | ## Features 18 | - Cleaning/pruning models. 19 | - Converting to/from safetensors. 20 | - Extracting/replacing model components. 21 | - Identifying/debugging model architectures. 22 | - Creating custom models from components. 23 | 24 | ## Planned additions 25 | - Complete LoRA,LyCORIS,DyLoRA..etc component support 26 | - Settings options for level of pruning 27 | - Key restoration options for repairing badly saved models 28 | 29 | # Example 30 | Many models being distributed are quite bloated, most of their size being redundant or useless data. 31 | 32 | For example, Anything-v3.0 is 7.7gb and requires a separate 800mb VAE. These can be combined and cleaned into a **2.1gb standalone model**, with the correct VAE included. 33 | 34 | Easy way, replace the VAE directly: 35 | ``` 36 | 1. Select Anything-v3.0.ckpt from the source dropdown, press Load. 37 | 2. Change to the Advanced tab. 38 | 3. Select the class VAE-v1 and leave the component on auto 39 | 4. Select Anything-v3.0.vae.ckpt from the import dropdown, press Import. 40 | 5. Change the model name to something appropriate like Anything-v3.0.safetensors 41 | 6. Press Save. 42 | ``` 43 | 44 | Hard way, build the model from components: 45 | ``` 46 | 1. Select Anything-v3.0.ckpt from the dropdown, press Load. 47 | 2. Change to the Advanced tab. 48 | 3. Select the class CLIP-v1, press Export. 49 | 4. Select the class UNET-v1, press Export. 50 | 5. Press Clear 51 | 6. Select NEW SDv1 from the source dropdown, press Load. 52 | 7. Select the class CLIP-v1, and select the Anything-v3 CLIP (just exported), press Import. 53 | 8. Select the class UNET-v1, and select the Anything-v3 UNET (just exported), press Import. 54 | 9. Select the class VAE-v1, and select the Anything-v3 VAE, press Import. 55 | 10. Change the model name to something appropriate like Anything-v3.0.safetensors 56 | 11. Press Save. 
57 | ``` 58 | 59 | ## Advanced 60 | The advanced tab lets you replace and extract model components; it also shows the detailed report. Import can extract components from full models, so if you want to replace the CLIP in your model with the SD 1.4 CLIP, you can simply specify the CLIP component and import the SD 1.4 checkpoint. The report will show all matched architectures, all rejected architectures (and the reasons they were rejected), and the list of all unknown keys. This is mostly useful for debugging models to see why they won't load. 61 | 62 | ## Autopruning 63 | Toggle the `Enable Autopruning` option in settings. On startup of the WebUI, everything in the `models/Autoprune` folder will be pruned into FP16 `.safetensors` models (except VAEs, which will be `.pt`) and moved into their proper folders. Broken or unknown models will be skipped. Models will be renamed if there is a conflict when moving (`NAI.safetensors` to `NAI(1).safetensors`, etc.). The `Reload UI` button will also trigger the autopruning. 64 | 65 | ## Metric 66 | During analysis a metric is computed which attempts to uniquely identify a model's weights. The AnythingV3 model produced above has the metric `(2020/2982/0130)`, which corresponds to `(UNET/VAE/CLIP)`. This toolkit knows the metrics for a few common components and will include any matches in its report. So for `(2020/2982/0130)` it knows the VAE metric of `2982` corresponds to the NAI VAE and will report `Uses the NAI VAE`. In fact, many VAEs being distributed are just the NAI VAE renamed; without a metric this would be difficult to detect. This system isn't foolproof, though, and incorrect matches can happen. 67 | 68 | ## Notes 69 | Some things that may be useful to know when manipulating models. 70 | 71 | ### Components 72 | Stable Diffusion requires three different components to function: the VAE, the UNET, and the CLIP. Checkpoints contain all of these components. 
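Inside a checkpoint, each component lives under its own key prefix in the state dict. A minimal sketch of how such a checkpoint can be split into components, assuming the standard SDv1 (ldm) prefixes — the component class names used here are illustrative, and real checkpoints also carry extra bookkeeping keys:

```python
# Sketch only: SDv1 (ldm) checkpoints group component weights under
# these key prefixes; other architectures use different names.
PREFIXES = {
    "model.diffusion_model.": "UNET-v1",
    "first_stage_model.": "VAE-v1",
    "cond_stage_model.": "CLIP-v1",
}

def split_components(state_dict):
    """Group a checkpoint's keys by component prefix, stripping the prefix."""
    components = {name: {} for name in PREFIXES.values()}
    unknown = {}
    for key, tensor in state_dict.items():
        for prefix, name in PREFIXES.items():
            if key.startswith(prefix):
                components[name][key[len(prefix):]] = tensor
                break
        else:
            unknown[key] = tensor  # keys no known component claims
    return components, unknown
```

Keys that match no prefix end up in the `unknown` bucket, which is essentially what the report's "unknown keys" list shows.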
The toolkit can still recognize checkpoints that are missing components, but they won't be considered intact. For example, a checkpoint that's missing its CLIP will be recognized as containing the `UNET-v1-BROKEN` and `VAE-v1-BROKEN` models, which can be exported like normal if you want to fix the model. Any checkpoint under 2gb will be missing a component (or multiple). 73 | 74 | **The WebUI expects a checkpoint to contain all of these components; if one is missing, it will continue using whatever was last loaded (unless you load it first, in which case it's left uninitialized and you will see NaN-related errors).** 75 | 76 | ### Precision 77 | A comparison between the 2.1gb model and the original 8.5gb model. 78 | **By default the webui will cast all loaded models to FP16**. Without `--no-half` the models will be exactly identical. 79 | Shown is a comparison with `--no-half` enabled. 80 | ![](https://cdn.discordapp.com/attachments/973151736946622467/1060445743707603035/comparison.png) 81 | But which is FP16 and which is FP32? 82 | 83 | ### EMA 84 | EMA data is stored to enable finetuning to stop and restart as needed without losing momentum. Upload the original checkpoint output by the trainer if you want people to make effective use of the EMA data. 85 | 86 | After merging, the EMA no longer accurately reflects the UNET and will be a detriment to training. 87 | 88 | The EMA data is itself an independent and functional UNET. You can export the `UNET-v1-EMA` component to extract the EMA UNET, then replace the regular UNET with it. For example, the EMA UNET vs. the normal UNET in AnythingV3: 89 | ![](https://cdn.discordapp.com/attachments/973151736946622467/1060767681692827718/ema.png) 90 | 91 | ### CLIP 92 | During merging, a CLIP key called `embeddings.position_ids` is sometimes broken. This is an int64 tensor holding the values 0 to 76; merging converts these to float and introduces errors. 
For example, in AnythingV3 the value `76` has become `75.9975`, which is cast back to int64 when loaded by the webui, resulting in `75`. The option `Fix broken CLIP position IDs` (in settings) will repair this tensor; it is off by default because it changes the model output slightly. Fixed vs. broken: 93 | ![](https://cdn.discordapp.com/attachments/973151736946622467/1060777823624765470/clip_fix.png) 94 | 95 | ### VAE 96 | The UNET somehow takes to merging quite well, but that's not the case for the VAE. Any merge between different VAEs will result in something broken. 97 | This is why the VAE in the AnythingV3 checkpoint produces horrible outputs, and why people have resorted to distributing and loading the VAE separately. 98 | 99 | But replacing the VAE is not a perfect solution. UNETs operate inside the latent space produced by the VAE. We don't know what effect merging has on the latent space the UNET expects; there's no reason to assume it interpolates along with the weights. Luckily, VAEs produce very similar latent spaces (by design), so in practice using one of the original VAEs will do a decent job. 100 | 101 | The effect of merging on CLIP is currently unknown to me, but evidently it's not so devastating. 
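The position-ID repair described in the CLIP section above amounts to rebuilding the canonical integer sequence. A minimal sketch, not the toolkit's actual implementation — the `text_model.embeddings.position_ids` key name (the Hugging Face `CLIPTextModel` layout) is an assumption; check the report for the name your checkpoint actually uses:

```python
import torch

def fix_position_ids(clip_state_dict):
    """Rebuild the canonical int64 0..76 position IDs, discarding float drift."""
    # Key names are assumptions: different checkpoint layouts store the
    # tensor under different prefixes.
    for key in ("text_model.embeddings.position_ids", "embeddings.position_ids"):
        if key in clip_state_dict:
            clip_state_dict[key] = torch.arange(77, dtype=torch.int64).reshape(1, -1)
    return clip_state_dict
```

Because merging only perturbs the values slightly (e.g. `76` to `75.9975`), regenerating the sequence outright is safer than rounding the drifted floats.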
102 | -------------------------------------------------------------------------------- /assets/Screenshot1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/silveroxides/stable-diffusion-webui-model-toolkit-revisited/d80cc166046a70744b56cf18745ee8358019d021/assets/Screenshot1.png -------------------------------------------------------------------------------- /assets/Screenshot2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/silveroxides/stable-diffusion-webui-model-toolkit-revisited/d80cc166046a70744b56cf18745ee8358019d021/assets/Screenshot2.png -------------------------------------------------------------------------------- /assets/banner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/silveroxides/stable-diffusion-webui-model-toolkit-revisited/d80cc166046a70744b56cf18745ee8358019d021/assets/banner.png -------------------------------------------------------------------------------- /components/CLIP-AUX-TOKEN.txt: -------------------------------------------------------------------------------- 1 | embeddings.token_embedding.weight [49408,768] -------------------------------------------------------------------------------- /components/CLIP-SvD-XT-Decoder.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,4,3,3] 3 | decoder.conv_out.bias [3] 4 | decoder.conv_out.weight [3,128,3,3] 5 | decoder.mid.attn_1.k.bias [512] 6 | decoder.mid.attn_1.k.weight [512,512,1,1] 7 | decoder.mid.attn_1.norm.bias [512] 8 | decoder.mid.attn_1.norm.weight [512] 9 | decoder.mid.attn_1.proj_out.bias [512] 10 | decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | decoder.mid.attn_1.q.bias [512] 12 | decoder.mid.attn_1.q.weight [512,512,1,1] 13 | decoder.mid.attn_1.v.bias [512] 14 | 
decoder.mid.attn_1.v.weight [512,512,1,1] 15 | decoder.mid.block_1.conv1.bias [512] 16 | decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | decoder.mid.block_1.conv2.bias [512] 18 | decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | decoder.mid.block_1.norm1.bias [512] 20 | decoder.mid.block_1.norm1.weight [512] 21 | decoder.mid.block_1.norm2.bias [512] 22 | decoder.mid.block_1.norm2.weight [512] 23 | decoder.mid.block_2.conv1.bias [512] 24 | decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | decoder.mid.block_2.conv2.bias [512] 26 | decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | decoder.mid.block_2.norm1.bias [512] 28 | decoder.mid.block_2.norm1.weight [512] 29 | decoder.mid.block_2.norm2.bias [512] 30 | decoder.mid.block_2.norm2.weight [512] 31 | decoder.norm_out.bias [128] 32 | decoder.norm_out.weight [128] 33 | decoder.up.0.block.0.conv1.bias [128] 34 | decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | decoder.up.0.block.0.conv2.bias [128] 36 | decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | decoder.up.0.block.0.nin_shortcut.bias [128] 38 | decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | decoder.up.0.block.0.norm1.bias [256] 40 | decoder.up.0.block.0.norm1.weight [256] 41 | decoder.up.0.block.0.norm2.bias [128] 42 | decoder.up.0.block.0.norm2.weight [128] 43 | decoder.up.0.block.1.conv1.bias [128] 44 | decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | decoder.up.0.block.1.conv2.bias [128] 46 | decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | decoder.up.0.block.1.norm1.bias [128] 48 | decoder.up.0.block.1.norm1.weight [128] 49 | decoder.up.0.block.1.norm2.bias [128] 50 | decoder.up.0.block.1.norm2.weight [128] 51 | decoder.up.0.block.2.conv1.bias [128] 52 | decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | decoder.up.0.block.2.conv2.bias [128] 54 | decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | decoder.up.0.block.2.norm1.bias [128] 56 | decoder.up.0.block.2.norm1.weight [128] 57 | decoder.up.0.block.2.norm2.bias [128] 
58 | decoder.up.0.block.2.norm2.weight [128] 59 | decoder.up.1.block.0.conv1.bias [256] 60 | decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | decoder.up.1.block.0.conv2.bias [256] 62 | decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | decoder.up.1.block.0.nin_shortcut.bias [256] 64 | decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | decoder.up.1.block.0.norm1.bias [512] 66 | decoder.up.1.block.0.norm1.weight [512] 67 | decoder.up.1.block.0.norm2.bias [256] 68 | decoder.up.1.block.0.norm2.weight [256] 69 | decoder.up.1.block.1.conv1.bias [256] 70 | decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | decoder.up.1.block.1.conv2.bias [256] 72 | decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | decoder.up.1.block.1.norm1.bias [256] 74 | decoder.up.1.block.1.norm1.weight [256] 75 | decoder.up.1.block.1.norm2.bias [256] 76 | decoder.up.1.block.1.norm2.weight [256] 77 | decoder.up.1.block.2.conv1.bias [256] 78 | decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | decoder.up.1.block.2.conv2.bias [256] 80 | decoder.up.1.block.2.conv2.weight [256,256,3,3] 81 | decoder.up.1.block.2.norm1.bias [256] 82 | decoder.up.1.block.2.norm1.weight [256] 83 | decoder.up.1.block.2.norm2.bias [256] 84 | decoder.up.1.block.2.norm2.weight [256] 85 | decoder.up.1.upsample.conv.bias [256] 86 | decoder.up.1.upsample.conv.weight [256,256,3,3] 87 | decoder.up.2.block.0.conv1.bias [512] 88 | decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | decoder.up.2.block.0.conv2.bias [512] 90 | decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | decoder.up.2.block.0.norm1.bias [512] 92 | decoder.up.2.block.0.norm1.weight [512] 93 | decoder.up.2.block.0.norm2.bias [512] 94 | decoder.up.2.block.0.norm2.weight [512] 95 | decoder.up.2.block.1.conv1.bias [512] 96 | decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | decoder.up.2.block.1.conv2.bias [512] 98 | decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | decoder.up.2.block.1.norm1.bias [512] 100 | decoder.up.2.block.1.norm1.weight 
[512] 101 | decoder.up.2.block.1.norm2.bias [512] 102 | decoder.up.2.block.1.norm2.weight [512] 103 | decoder.up.2.block.2.conv1.bias [512] 104 | decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | decoder.up.2.block.2.conv2.bias [512] 106 | decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | decoder.up.2.block.2.norm1.bias [512] 108 | decoder.up.2.block.2.norm1.weight [512] 109 | decoder.up.2.block.2.norm2.bias [512] 110 | decoder.up.2.block.2.norm2.weight [512] 111 | decoder.up.2.upsample.conv.bias [512] 112 | decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | decoder.up.3.block.0.conv1.bias [512] 114 | decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | decoder.up.3.block.0.conv2.bias [512] 116 | decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | decoder.up.3.block.0.norm1.bias [512] 118 | decoder.up.3.block.0.norm1.weight [512] 119 | decoder.up.3.block.0.norm2.bias [512] 120 | decoder.up.3.block.0.norm2.weight [512] 121 | decoder.up.3.block.1.conv1.bias [512] 122 | decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | decoder.up.3.block.1.conv2.bias [512] 124 | decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | decoder.up.3.block.1.norm1.bias [512] 126 | decoder.up.3.block.1.norm1.weight [512] 127 | decoder.up.3.block.1.norm2.bias [512] 128 | decoder.up.3.block.1.norm2.weight [512] 129 | decoder.up.3.block.2.conv1.bias [512] 130 | decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | decoder.up.3.block.2.conv2.bias [512] 132 | decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | decoder.up.3.block.2.norm1.bias [512] 134 | decoder.up.3.block.2.norm1.weight [512] 135 | decoder.up.3.block.2.norm2.bias [512] 136 | decoder.up.3.block.2.norm2.weight [512] 137 | decoder.up.3.upsample.conv.bias [512] 138 | decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | encoder.conv_in.bias [128] 140 | encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_out.bias [8] 142 | encoder.conv_out.weight [8,512,3,3] 143 | encoder.down.0.block.0.conv1.bias [128] 144 | 
encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | encoder.down.0.block.0.conv2.bias [128] 146 | encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | encoder.down.0.block.0.norm1.bias [128] 148 | encoder.down.0.block.0.norm1.weight [128] 149 | encoder.down.0.block.0.norm2.bias [128] 150 | encoder.down.0.block.0.norm2.weight [128] 151 | encoder.down.0.block.1.conv1.bias [128] 152 | encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | encoder.down.0.block.1.conv2.bias [128] 154 | encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | encoder.down.0.block.1.norm1.bias [128] 156 | encoder.down.0.block.1.norm1.weight [128] 157 | encoder.down.0.block.1.norm2.bias [128] 158 | encoder.down.0.block.1.norm2.weight [128] 159 | encoder.down.0.downsample.conv.bias [128] 160 | encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | encoder.down.1.block.0.conv1.bias [256] 162 | encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | encoder.down.1.block.0.conv2.bias [256] 164 | encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | encoder.down.1.block.0.nin_shortcut.bias [256] 166 | encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | encoder.down.1.block.0.norm1.bias [128] 168 | encoder.down.1.block.0.norm1.weight [128] 169 | encoder.down.1.block.0.norm2.bias [256] 170 | encoder.down.1.block.0.norm2.weight [256] 171 | encoder.down.1.block.1.conv1.bias [256] 172 | encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | encoder.down.1.block.1.conv2.bias [256] 174 | encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | encoder.down.1.block.1.norm1.bias [256] 176 | encoder.down.1.block.1.norm1.weight [256] 177 | encoder.down.1.block.1.norm2.bias [256] 178 | encoder.down.1.block.1.norm2.weight [256] 179 | encoder.down.1.downsample.conv.bias [256] 180 | encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | encoder.down.2.block.0.conv1.bias [512] 182 | encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | encoder.down.2.block.0.conv2.bias [512] 184 | 
encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | encoder.down.2.block.0.nin_shortcut.bias [512] 186 | encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | encoder.down.2.block.0.norm1.bias [256] 188 | encoder.down.2.block.0.norm1.weight [256] 189 | encoder.down.2.block.0.norm2.bias [512] 190 | encoder.down.2.block.0.norm2.weight [512] 191 | encoder.down.2.block.1.conv1.bias [512] 192 | encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | encoder.down.2.block.1.conv2.bias [512] 194 | encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | encoder.down.2.block.1.norm1.bias [512] 196 | encoder.down.2.block.1.norm1.weight [512] 197 | encoder.down.2.block.1.norm2.bias [512] 198 | encoder.down.2.block.1.norm2.weight [512] 199 | encoder.down.2.downsample.conv.bias [512] 200 | encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | encoder.down.3.block.0.conv1.bias [512] 202 | encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | encoder.down.3.block.0.conv2.bias [512] 204 | encoder.down.3.block.0.conv2.weight [512,512,3,3] 205 | encoder.down.3.block.0.norm1.bias [512] 206 | encoder.down.3.block.0.norm1.weight [512] 207 | encoder.down.3.block.0.norm2.bias [512] 208 | encoder.down.3.block.0.norm2.weight [512] 209 | encoder.down.3.block.1.conv1.bias [512] 210 | encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | encoder.down.3.block.1.conv2.bias [512] 212 | encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | encoder.down.3.block.1.norm1.bias [512] 214 | encoder.down.3.block.1.norm1.weight [512] 215 | encoder.down.3.block.1.norm2.bias [512] 216 | encoder.down.3.block.1.norm2.weight [512] 217 | encoder.mid.attn_1.k.bias [512] 218 | encoder.mid.attn_1.k.weight [512,512,1,1] 219 | encoder.mid.attn_1.norm.bias [512] 220 | encoder.mid.attn_1.norm.weight [512] 221 | encoder.mid.attn_1.proj_out.bias [512] 222 | encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | encoder.mid.attn_1.q.bias [512] 224 | encoder.mid.attn_1.q.weight [512,512,1,1] 225 | 
encoder.mid.attn_1.v.bias [512] 226 | encoder.mid.attn_1.v.weight [512,512,1,1] 227 | encoder.mid.block_1.conv1.bias [512] 228 | encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | encoder.mid.block_1.conv2.bias [512] 230 | encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | encoder.mid.block_1.norm1.bias [512] 232 | encoder.mid.block_1.norm1.weight [512] 233 | encoder.mid.block_1.norm2.bias [512] 234 | encoder.mid.block_1.norm2.weight [512] 235 | encoder.mid.block_2.conv1.bias [512] 236 | encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | encoder.mid.block_2.conv2.bias [512] 238 | encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | encoder.mid.block_2.norm1.bias [512] 240 | encoder.mid.block_2.norm1.weight [512] 241 | encoder.mid.block_2.norm2.bias [512] 242 | encoder.mid.block_2.norm2.weight [512] 243 | encoder.norm_out.bias [512] 244 | encoder.norm_out.weight [512] 245 | post_quant_conv.bias [4] 246 | post_quant_conv.weight [4,4,1,1] 247 | quant_conv.bias [8] 248 | quant_conv.weight [8,8,1,1] -------------------------------------------------------------------------------- /components/CLIP-SvD-XT.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,4,3,3] 3 | decoder.conv_out.bias [3] 4 | decoder.conv_out.weight [3,128,3,3] 5 | decoder.mid.attn_1.k.bias [512] 6 | decoder.mid.attn_1.k.weight [512,512,1,1] 7 | decoder.mid.attn_1.norm.bias [512] 8 | decoder.mid.attn_1.norm.weight [512] 9 | decoder.mid.attn_1.proj_out.bias [512] 10 | decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | decoder.mid.attn_1.q.bias [512] 12 | decoder.mid.attn_1.q.weight [512,512,1,1] 13 | decoder.mid.attn_1.v.bias [512] 14 | decoder.mid.attn_1.v.weight [512,512,1,1] 15 | decoder.mid.block_1.conv1.bias [512] 16 | decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | decoder.mid.block_1.conv2.bias [512] 18 | decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | decoder.mid.block_1.norm1.bias 
[512] 20 | decoder.mid.block_1.norm1.weight [512] 21 | decoder.mid.block_1.norm2.bias [512] 22 | decoder.mid.block_1.norm2.weight [512] 23 | decoder.mid.block_2.conv1.bias [512] 24 | decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | decoder.mid.block_2.conv2.bias [512] 26 | decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | decoder.mid.block_2.norm1.bias [512] 28 | decoder.mid.block_2.norm1.weight [512] 29 | decoder.mid.block_2.norm2.bias [512] 30 | decoder.mid.block_2.norm2.weight [512] 31 | decoder.norm_out.bias [128] 32 | decoder.norm_out.weight [128] 33 | decoder.up.0.block.0.conv1.bias [128] 34 | decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | decoder.up.0.block.0.conv2.bias [128] 36 | decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | decoder.up.0.block.0.nin_shortcut.bias [128] 38 | decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | decoder.up.0.block.0.norm1.bias [256] 40 | decoder.up.0.block.0.norm1.weight [256] 41 | decoder.up.0.block.0.norm2.bias [128] 42 | decoder.up.0.block.0.norm2.weight [128] 43 | decoder.up.0.block.1.conv1.bias [128] 44 | decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | decoder.up.0.block.1.conv2.bias [128] 46 | decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | decoder.up.0.block.1.norm1.bias [128] 48 | decoder.up.0.block.1.norm1.weight [128] 49 | decoder.up.0.block.1.norm2.bias [128] 50 | decoder.up.0.block.1.norm2.weight [128] 51 | decoder.up.0.block.2.conv1.bias [128] 52 | decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | decoder.up.0.block.2.conv2.bias [128] 54 | decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | decoder.up.0.block.2.norm1.bias [128] 56 | decoder.up.0.block.2.norm1.weight [128] 57 | decoder.up.0.block.2.norm2.bias [128] 58 | decoder.up.0.block.2.norm2.weight [128] 59 | decoder.up.1.block.0.conv1.bias [256] 60 | decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | decoder.up.1.block.0.conv2.bias [256] 62 | decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | 
decoder.up.1.block.0.nin_shortcut.bias [256] 64 | decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | decoder.up.1.block.0.norm1.bias [512] 66 | decoder.up.1.block.0.norm1.weight [512] 67 | decoder.up.1.block.0.norm2.bias [256] 68 | decoder.up.1.block.0.norm2.weight [256] 69 | decoder.up.1.block.1.conv1.bias [256] 70 | decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | decoder.up.1.block.1.conv2.bias [256] 72 | decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | decoder.up.1.block.1.norm1.bias [256] 74 | decoder.up.1.block.1.norm1.weight [256] 75 | decoder.up.1.block.1.norm2.bias [256] 76 | decoder.up.1.block.1.norm2.weight [256] 77 | decoder.up.1.block.2.conv1.bias [256] 78 | decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | decoder.up.1.block.2.conv2.bias [256] 80 | decoder.up.1.block.2.conv2.weight [256,256,3,3] 81 | decoder.up.1.block.2.norm1.bias [256] 82 | decoder.up.1.block.2.norm1.weight [256] 83 | decoder.up.1.block.2.norm2.bias [256] 84 | decoder.up.1.block.2.norm2.weight [256] 85 | decoder.up.1.upsample.conv.bias [256] 86 | decoder.up.1.upsample.conv.weight [256,256,3,3] 87 | decoder.up.2.block.0.conv1.bias [512] 88 | decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | decoder.up.2.block.0.conv2.bias [512] 90 | decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | decoder.up.2.block.0.norm1.bias [512] 92 | decoder.up.2.block.0.norm1.weight [512] 93 | decoder.up.2.block.0.norm2.bias [512] 94 | decoder.up.2.block.0.norm2.weight [512] 95 | decoder.up.2.block.1.conv1.bias [512] 96 | decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | decoder.up.2.block.1.conv2.bias [512] 98 | decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | decoder.up.2.block.1.norm1.bias [512] 100 | decoder.up.2.block.1.norm1.weight [512] 101 | decoder.up.2.block.1.norm2.bias [512] 102 | decoder.up.2.block.1.norm2.weight [512] 103 | decoder.up.2.block.2.conv1.bias [512] 104 | decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | decoder.up.2.block.2.conv2.bias [512] 106 | 
decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | decoder.up.2.block.2.norm1.bias [512] 108 | decoder.up.2.block.2.norm1.weight [512] 109 | decoder.up.2.block.2.norm2.bias [512] 110 | decoder.up.2.block.2.norm2.weight [512] 111 | decoder.up.2.upsample.conv.bias [512] 112 | decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | decoder.up.3.block.0.conv1.bias [512] 114 | decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | decoder.up.3.block.0.conv2.bias [512] 116 | decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | decoder.up.3.block.0.norm1.bias [512] 118 | decoder.up.3.block.0.norm1.weight [512] 119 | decoder.up.3.block.0.norm2.bias [512] 120 | decoder.up.3.block.0.norm2.weight [512] 121 | decoder.up.3.block.1.conv1.bias [512] 122 | decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | decoder.up.3.block.1.conv2.bias [512] 124 | decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | decoder.up.3.block.1.norm1.bias [512] 126 | decoder.up.3.block.1.norm1.weight [512] 127 | decoder.up.3.block.1.norm2.bias [512] 128 | decoder.up.3.block.1.norm2.weight [512] 129 | decoder.up.3.block.2.conv1.bias [512] 130 | decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | decoder.up.3.block.2.conv2.bias [512] 132 | decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | decoder.up.3.block.2.norm1.bias [512] 134 | decoder.up.3.block.2.norm1.weight [512] 135 | decoder.up.3.block.2.norm2.bias [512] 136 | decoder.up.3.block.2.norm2.weight [512] 137 | decoder.up.3.upsample.conv.bias [512] 138 | decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | encoder.conv_in.bias [128] 140 | encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_out.bias [8] 142 | encoder.conv_out.weight [8,512,3,3] 143 | encoder.down.0.block.0.conv1.bias [128] 144 | encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | encoder.down.0.block.0.conv2.bias [128] 146 | encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | encoder.down.0.block.0.norm1.bias [128] 148 | encoder.down.0.block.0.norm1.weight [128] 
149 | encoder.down.0.block.0.norm2.bias [128] 150 | encoder.down.0.block.0.norm2.weight [128] 151 | encoder.down.0.block.1.conv1.bias [128] 152 | encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | encoder.down.0.block.1.conv2.bias [128] 154 | encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | encoder.down.0.block.1.norm1.bias [128] 156 | encoder.down.0.block.1.norm1.weight [128] 157 | encoder.down.0.block.1.norm2.bias [128] 158 | encoder.down.0.block.1.norm2.weight [128] 159 | encoder.down.0.downsample.conv.bias [128] 160 | encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | encoder.down.1.block.0.conv1.bias [256] 162 | encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | encoder.down.1.block.0.conv2.bias [256] 164 | encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | encoder.down.1.block.0.nin_shortcut.bias [256] 166 | encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | encoder.down.1.block.0.norm1.bias [128] 168 | encoder.down.1.block.0.norm1.weight [128] 169 | encoder.down.1.block.0.norm2.bias [256] 170 | encoder.down.1.block.0.norm2.weight [256] 171 | encoder.down.1.block.1.conv1.bias [256] 172 | encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | encoder.down.1.block.1.conv2.bias [256] 174 | encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | encoder.down.1.block.1.norm1.bias [256] 176 | encoder.down.1.block.1.norm1.weight [256] 177 | encoder.down.1.block.1.norm2.bias [256] 178 | encoder.down.1.block.1.norm2.weight [256] 179 | encoder.down.1.downsample.conv.bias [256] 180 | encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | encoder.down.2.block.0.conv1.bias [512] 182 | encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | encoder.down.2.block.0.conv2.bias [512] 184 | encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | encoder.down.2.block.0.nin_shortcut.bias [512] 186 | encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | encoder.down.2.block.0.norm1.bias [256] 188 | 
encoder.down.2.block.0.norm1.weight [256] 189 | encoder.down.2.block.0.norm2.bias [512] 190 | encoder.down.2.block.0.norm2.weight [512] 191 | encoder.down.2.block.1.conv1.bias [512] 192 | encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | encoder.down.2.block.1.conv2.bias [512] 194 | encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | encoder.down.2.block.1.norm1.bias [512] 196 | encoder.down.2.block.1.norm1.weight [512] 197 | encoder.down.2.block.1.norm2.bias [512] 198 | encoder.down.2.block.1.norm2.weight [512] 199 | encoder.down.2.downsample.conv.bias [512] 200 | encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | encoder.down.3.block.0.conv1.bias [512] 202 | encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | encoder.down.3.block.0.conv2.bias [512] 204 | encoder.down.3.block.0.conv2.weight [512,512,3,3] 205 | encoder.down.3.block.0.norm1.bias [512] 206 | encoder.down.3.block.0.norm1.weight [512] 207 | encoder.down.3.block.0.norm2.bias [512] 208 | encoder.down.3.block.0.norm2.weight [512] 209 | encoder.down.3.block.1.conv1.bias [512] 210 | encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | encoder.down.3.block.1.conv2.bias [512] 212 | encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | encoder.down.3.block.1.norm1.bias [512] 214 | encoder.down.3.block.1.norm1.weight [512] 215 | encoder.down.3.block.1.norm2.bias [512] 216 | encoder.down.3.block.1.norm2.weight [512] 217 | encoder.mid.attn_1.k.bias [512] 218 | encoder.mid.attn_1.k.weight [512,512,1,1] 219 | encoder.mid.attn_1.norm.bias [512] 220 | encoder.mid.attn_1.norm.weight [512] 221 | encoder.mid.attn_1.proj_out.bias [512] 222 | encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | encoder.mid.attn_1.q.bias [512] 224 | encoder.mid.attn_1.q.weight [512,512,1,1] 225 | encoder.mid.attn_1.v.bias [512] 226 | encoder.mid.attn_1.v.weight [512,512,1,1] 227 | encoder.mid.block_1.conv1.bias [512] 228 | encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | encoder.mid.block_1.conv2.bias [512] 
230 | encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | encoder.mid.block_1.norm1.bias [512] 232 | encoder.mid.block_1.norm1.weight [512] 233 | encoder.mid.block_1.norm2.bias [512] 234 | encoder.mid.block_1.norm2.weight [512] 235 | encoder.mid.block_2.conv1.bias [512] 236 | encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | encoder.mid.block_2.conv2.bias [512] 238 | encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | encoder.mid.block_2.norm1.bias [512] 240 | encoder.mid.block_2.norm1.weight [512] 241 | encoder.mid.block_2.norm2.bias [512] 242 | encoder.mid.block_2.norm2.weight [512] 243 | encoder.norm_out.bias [512] 244 | encoder.norm_out.weight [512] 245 | post_quant_conv.bias [4] 246 | post_quant_conv.weight [4,4,1,1] 247 | quant_conv.bias [8] 248 | quant_conv.weight [8,8,1,1] -------------------------------------------------------------------------------- /components/CLIP-XL-TOKEMB.txt: -------------------------------------------------------------------------------- 1 | conditioner.embedders.0.transformer.text_model.embeddings.token_embedding.weight [49408,768] 2 | conditioner.embedders.1.model.token_embedding.weight [49408,1280] -------------------------------------------------------------------------------- /components/CLIP-XL-TOKEN.txt: -------------------------------------------------------------------------------- 1 | token_embedding.weight [49408,1280] -------------------------------------------------------------------------------- /components/CLIP-v1-POS.txt: -------------------------------------------------------------------------------- 1 | positional_embedding [248,768] 2 | positional_embedding_res [248,768] -------------------------------------------------------------------------------- /components/CLIP-v1-SCPR.txt: -------------------------------------------------------------------------------- 1 | logit_scale [] 2 | text_projection [768,768] -------------------------------------------------------------------------------- 
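The component files above are plain-text fingerprints: one tensor per line, as `name [dim,dim,...]` (an empty `[]` denotes a scalar). As a minimal sketch of how such a file could be consumed, the snippet below parses a fingerprint into a name→shape map and checks whether a checkpoint's tensors match it. `parse_component` and `matches` are illustrative names for this sketch, not the toolkit's actual API.

```python
# Hypothetical sketch (not the toolkit's real interface): parse a component
# fingerprint file such as components/CLIP-v1-SCPR.txt, whose lines look
# like "text_projection [768,768]", then compare a checkpoint's tensor
# shapes against it.

def parse_component(text: str) -> dict:
    """Map each tensor name to its expected shape tuple ("[]" -> scalar ())."""
    spec = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        # Tensor names contain no spaces, so the last field is the shape.
        name, _, dims = line.rpartition(" ")
        dims = dims.strip("[]")
        spec[name] = tuple(int(d) for d in dims.split(",")) if dims else ()
    return spec

def matches(spec: dict, shapes: dict) -> bool:
    """True if every tensor in the spec is present with the expected shape."""
    return all(shapes.get(name) == shape for name, shape in spec.items())

# Example using the two-line CLIP-v1-SCPR component listed above.
component = parse_component("logit_scale []\ntext_projection [768,768]")
checkpoint_shapes = {"logit_scale": (), "text_projection": (768, 768)}
```

In practice the `shapes` dict would come from the loaded checkpoint (e.g. `{k: tuple(v.shape) for k, v in state_dict.items()}`); a checkpoint that satisfies every tensor in a component file can be classified as containing that component.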
/components/CLIP-v1-SD.txt: -------------------------------------------------------------------------------- 1 | embeddings.position_embedding.weight [77,768] 2 | embeddings.token_embedding.weight [49408,768] 3 | encoder.layers.0.layer_norm1.bias [768] 4 | encoder.layers.0.layer_norm1.weight [768] 5 | encoder.layers.0.layer_norm2.bias [768] 6 | encoder.layers.0.layer_norm2.weight [768] 7 | encoder.layers.0.mlp.fc1.bias [3072] 8 | encoder.layers.0.mlp.fc1.weight [3072,768] 9 | encoder.layers.0.mlp.fc2.bias [768] 10 | encoder.layers.0.mlp.fc2.weight [768,3072] 11 | encoder.layers.0.self_attn.k_proj.bias [768] 12 | encoder.layers.0.self_attn.k_proj.weight [768,768] 13 | encoder.layers.0.self_attn.out_proj.bias [768] 14 | encoder.layers.0.self_attn.out_proj.weight [768,768] 15 | encoder.layers.0.self_attn.q_proj.bias [768] 16 | encoder.layers.0.self_attn.q_proj.weight [768,768] 17 | encoder.layers.0.self_attn.v_proj.bias [768] 18 | encoder.layers.0.self_attn.v_proj.weight [768,768] 19 | encoder.layers.1.layer_norm1.bias [768] 20 | encoder.layers.1.layer_norm1.weight [768] 21 | encoder.layers.1.layer_norm2.bias [768] 22 | encoder.layers.1.layer_norm2.weight [768] 23 | encoder.layers.1.mlp.fc1.bias [3072] 24 | encoder.layers.1.mlp.fc1.weight [3072,768] 25 | encoder.layers.1.mlp.fc2.bias [768] 26 | encoder.layers.1.mlp.fc2.weight [768,3072] 27 | encoder.layers.1.self_attn.k_proj.bias [768] 28 | encoder.layers.1.self_attn.k_proj.weight [768,768] 29 | encoder.layers.1.self_attn.out_proj.bias [768] 30 | encoder.layers.1.self_attn.out_proj.weight [768,768] 31 | encoder.layers.1.self_attn.q_proj.bias [768] 32 | encoder.layers.1.self_attn.q_proj.weight [768,768] 33 | encoder.layers.1.self_attn.v_proj.bias [768] 34 | encoder.layers.1.self_attn.v_proj.weight [768,768] 35 | encoder.layers.10.layer_norm1.bias [768] 36 | encoder.layers.10.layer_norm1.weight [768] 37 | encoder.layers.10.layer_norm2.bias [768] 38 | encoder.layers.10.layer_norm2.weight [768] 39 | 
encoder.layers.10.mlp.fc1.bias [3072] 40 | encoder.layers.10.mlp.fc1.weight [3072,768] 41 | encoder.layers.10.mlp.fc2.bias [768] 42 | encoder.layers.10.mlp.fc2.weight [768,3072] 43 | encoder.layers.10.self_attn.k_proj.bias [768] 44 | encoder.layers.10.self_attn.k_proj.weight [768,768] 45 | encoder.layers.10.self_attn.out_proj.bias [768] 46 | encoder.layers.10.self_attn.out_proj.weight [768,768] 47 | encoder.layers.10.self_attn.q_proj.bias [768] 48 | encoder.layers.10.self_attn.q_proj.weight [768,768] 49 | encoder.layers.10.self_attn.v_proj.bias [768] 50 | encoder.layers.10.self_attn.v_proj.weight [768,768] 51 | encoder.layers.11.layer_norm1.bias [768] 52 | encoder.layers.11.layer_norm1.weight [768] 53 | encoder.layers.11.layer_norm2.bias [768] 54 | encoder.layers.11.layer_norm2.weight [768] 55 | encoder.layers.11.mlp.fc1.bias [3072] 56 | encoder.layers.11.mlp.fc1.weight [3072,768] 57 | encoder.layers.11.mlp.fc2.bias [768] 58 | encoder.layers.11.mlp.fc2.weight [768,3072] 59 | encoder.layers.11.self_attn.k_proj.bias [768] 60 | encoder.layers.11.self_attn.k_proj.weight [768,768] 61 | encoder.layers.11.self_attn.out_proj.bias [768] 62 | encoder.layers.11.self_attn.out_proj.weight [768,768] 63 | encoder.layers.11.self_attn.q_proj.bias [768] 64 | encoder.layers.11.self_attn.q_proj.weight [768,768] 65 | encoder.layers.11.self_attn.v_proj.bias [768] 66 | encoder.layers.11.self_attn.v_proj.weight [768,768] 67 | encoder.layers.2.layer_norm1.bias [768] 68 | encoder.layers.2.layer_norm1.weight [768] 69 | encoder.layers.2.layer_norm2.bias [768] 70 | encoder.layers.2.layer_norm2.weight [768] 71 | encoder.layers.2.mlp.fc1.bias [3072] 72 | encoder.layers.2.mlp.fc1.weight [3072,768] 73 | encoder.layers.2.mlp.fc2.bias [768] 74 | encoder.layers.2.mlp.fc2.weight [768,3072] 75 | encoder.layers.2.self_attn.k_proj.bias [768] 76 | encoder.layers.2.self_attn.k_proj.weight [768,768] 77 | encoder.layers.2.self_attn.out_proj.bias [768] 78 | encoder.layers.2.self_attn.out_proj.weight [768,768] 
79 | encoder.layers.2.self_attn.q_proj.bias [768] 80 | encoder.layers.2.self_attn.q_proj.weight [768,768] 81 | encoder.layers.2.self_attn.v_proj.bias [768] 82 | encoder.layers.2.self_attn.v_proj.weight [768,768] 83 | encoder.layers.3.layer_norm1.bias [768] 84 | encoder.layers.3.layer_norm1.weight [768] 85 | encoder.layers.3.layer_norm2.bias [768] 86 | encoder.layers.3.layer_norm2.weight [768] 87 | encoder.layers.3.mlp.fc1.bias [3072] 88 | encoder.layers.3.mlp.fc1.weight [3072,768] 89 | encoder.layers.3.mlp.fc2.bias [768] 90 | encoder.layers.3.mlp.fc2.weight [768,3072] 91 | encoder.layers.3.self_attn.k_proj.bias [768] 92 | encoder.layers.3.self_attn.k_proj.weight [768,768] 93 | encoder.layers.3.self_attn.out_proj.bias [768] 94 | encoder.layers.3.self_attn.out_proj.weight [768,768] 95 | encoder.layers.3.self_attn.q_proj.bias [768] 96 | encoder.layers.3.self_attn.q_proj.weight [768,768] 97 | encoder.layers.3.self_attn.v_proj.bias [768] 98 | encoder.layers.3.self_attn.v_proj.weight [768,768] 99 | encoder.layers.4.layer_norm1.bias [768] 100 | encoder.layers.4.layer_norm1.weight [768] 101 | encoder.layers.4.layer_norm2.bias [768] 102 | encoder.layers.4.layer_norm2.weight [768] 103 | encoder.layers.4.mlp.fc1.bias [3072] 104 | encoder.layers.4.mlp.fc1.weight [3072,768] 105 | encoder.layers.4.mlp.fc2.bias [768] 106 | encoder.layers.4.mlp.fc2.weight [768,3072] 107 | encoder.layers.4.self_attn.k_proj.bias [768] 108 | encoder.layers.4.self_attn.k_proj.weight [768,768] 109 | encoder.layers.4.self_attn.out_proj.bias [768] 110 | encoder.layers.4.self_attn.out_proj.weight [768,768] 111 | encoder.layers.4.self_attn.q_proj.bias [768] 112 | encoder.layers.4.self_attn.q_proj.weight [768,768] 113 | encoder.layers.4.self_attn.v_proj.bias [768] 114 | encoder.layers.4.self_attn.v_proj.weight [768,768] 115 | encoder.layers.5.layer_norm1.bias [768] 116 | encoder.layers.5.layer_norm1.weight [768] 117 | encoder.layers.5.layer_norm2.bias [768] 118 | encoder.layers.5.layer_norm2.weight [768] 
119 | encoder.layers.5.mlp.fc1.bias [3072] 120 | encoder.layers.5.mlp.fc1.weight [3072,768] 121 | encoder.layers.5.mlp.fc2.bias [768] 122 | encoder.layers.5.mlp.fc2.weight [768,3072] 123 | encoder.layers.5.self_attn.k_proj.bias [768] 124 | encoder.layers.5.self_attn.k_proj.weight [768,768] 125 | encoder.layers.5.self_attn.out_proj.bias [768] 126 | encoder.layers.5.self_attn.out_proj.weight [768,768] 127 | encoder.layers.5.self_attn.q_proj.bias [768] 128 | encoder.layers.5.self_attn.q_proj.weight [768,768] 129 | encoder.layers.5.self_attn.v_proj.bias [768] 130 | encoder.layers.5.self_attn.v_proj.weight [768,768] 131 | encoder.layers.6.layer_norm1.bias [768] 132 | encoder.layers.6.layer_norm1.weight [768] 133 | encoder.layers.6.layer_norm2.bias [768] 134 | encoder.layers.6.layer_norm2.weight [768] 135 | encoder.layers.6.mlp.fc1.bias [3072] 136 | encoder.layers.6.mlp.fc1.weight [3072,768] 137 | encoder.layers.6.mlp.fc2.bias [768] 138 | encoder.layers.6.mlp.fc2.weight [768,3072] 139 | encoder.layers.6.self_attn.k_proj.bias [768] 140 | encoder.layers.6.self_attn.k_proj.weight [768,768] 141 | encoder.layers.6.self_attn.out_proj.bias [768] 142 | encoder.layers.6.self_attn.out_proj.weight [768,768] 143 | encoder.layers.6.self_attn.q_proj.bias [768] 144 | encoder.layers.6.self_attn.q_proj.weight [768,768] 145 | encoder.layers.6.self_attn.v_proj.bias [768] 146 | encoder.layers.6.self_attn.v_proj.weight [768,768] 147 | encoder.layers.7.layer_norm1.bias [768] 148 | encoder.layers.7.layer_norm1.weight [768] 149 | encoder.layers.7.layer_norm2.bias [768] 150 | encoder.layers.7.layer_norm2.weight [768] 151 | encoder.layers.7.mlp.fc1.bias [3072] 152 | encoder.layers.7.mlp.fc1.weight [3072,768] 153 | encoder.layers.7.mlp.fc2.bias [768] 154 | encoder.layers.7.mlp.fc2.weight [768,3072] 155 | encoder.layers.7.self_attn.k_proj.bias [768] 156 | encoder.layers.7.self_attn.k_proj.weight [768,768] 157 | encoder.layers.7.self_attn.out_proj.bias [768] 158 | 
encoder.layers.7.self_attn.out_proj.weight [768,768] 159 | encoder.layers.7.self_attn.q_proj.bias [768] 160 | encoder.layers.7.self_attn.q_proj.weight [768,768] 161 | encoder.layers.7.self_attn.v_proj.bias [768] 162 | encoder.layers.7.self_attn.v_proj.weight [768,768] 163 | encoder.layers.8.layer_norm1.bias [768] 164 | encoder.layers.8.layer_norm1.weight [768] 165 | encoder.layers.8.layer_norm2.bias [768] 166 | encoder.layers.8.layer_norm2.weight [768] 167 | encoder.layers.8.mlp.fc1.bias [3072] 168 | encoder.layers.8.mlp.fc1.weight [3072,768] 169 | encoder.layers.8.mlp.fc2.bias [768] 170 | encoder.layers.8.mlp.fc2.weight [768,3072] 171 | encoder.layers.8.self_attn.k_proj.bias [768] 172 | encoder.layers.8.self_attn.k_proj.weight [768,768] 173 | encoder.layers.8.self_attn.out_proj.bias [768] 174 | encoder.layers.8.self_attn.out_proj.weight [768,768] 175 | encoder.layers.8.self_attn.q_proj.bias [768] 176 | encoder.layers.8.self_attn.q_proj.weight [768,768] 177 | encoder.layers.8.self_attn.v_proj.bias [768] 178 | encoder.layers.8.self_attn.v_proj.weight [768,768] 179 | encoder.layers.9.layer_norm1.bias [768] 180 | encoder.layers.9.layer_norm1.weight [768] 181 | encoder.layers.9.layer_norm2.bias [768] 182 | encoder.layers.9.layer_norm2.weight [768] 183 | encoder.layers.9.mlp.fc1.bias [3072] 184 | encoder.layers.9.mlp.fc1.weight [3072,768] 185 | encoder.layers.9.mlp.fc2.bias [768] 186 | encoder.layers.9.mlp.fc2.weight [768,3072] 187 | encoder.layers.9.self_attn.k_proj.bias [768] 188 | encoder.layers.9.self_attn.k_proj.weight [768,768] 189 | encoder.layers.9.self_attn.out_proj.bias [768] 190 | encoder.layers.9.self_attn.out_proj.weight [768,768] 191 | encoder.layers.9.self_attn.q_proj.bias [768] 192 | encoder.layers.9.self_attn.q_proj.weight [768,768] 193 | encoder.layers.9.self_attn.v_proj.bias [768] 194 | encoder.layers.9.self_attn.v_proj.weight [768,768] 195 | final_layer_norm.bias [768] 196 | final_layer_norm.weight [768] 197 | 
-------------------------------------------------------------------------------- /components/CLIP-v1-TOKEN.txt: -------------------------------------------------------------------------------- 1 | embeddings.token_embedding.weight [49408,768] -------------------------------------------------------------------------------- /components/CLIP-v2-SD.txt: -------------------------------------------------------------------------------- 1 | ln_final.bias [1024] 2 | ln_final.weight [1024] 3 | logit_scale [] 4 | positional_embedding [77,1024] 5 | text_projection [1024,1024] 6 | token_embedding.weight [49408,1024] 7 | transformer.resblocks.0.attn.in_proj_bias [3072] 8 | transformer.resblocks.0.attn.in_proj_weight [3072,1024] 9 | transformer.resblocks.0.attn.out_proj.bias [1024] 10 | transformer.resblocks.0.attn.out_proj.weight [1024,1024] 11 | transformer.resblocks.0.ln_1.bias [1024] 12 | transformer.resblocks.0.ln_1.weight [1024] 13 | transformer.resblocks.0.ln_2.bias [1024] 14 | transformer.resblocks.0.ln_2.weight [1024] 15 | transformer.resblocks.0.mlp.c_fc.bias [4096] 16 | transformer.resblocks.0.mlp.c_fc.weight [4096,1024] 17 | transformer.resblocks.0.mlp.c_proj.bias [1024] 18 | transformer.resblocks.0.mlp.c_proj.weight [1024,4096] 19 | transformer.resblocks.1.attn.in_proj_bias [3072] 20 | transformer.resblocks.1.attn.in_proj_weight [3072,1024] 21 | transformer.resblocks.1.attn.out_proj.bias [1024] 22 | transformer.resblocks.1.attn.out_proj.weight [1024,1024] 23 | transformer.resblocks.1.ln_1.bias [1024] 24 | transformer.resblocks.1.ln_1.weight [1024] 25 | transformer.resblocks.1.ln_2.bias [1024] 26 | transformer.resblocks.1.ln_2.weight [1024] 27 | transformer.resblocks.1.mlp.c_fc.bias [4096] 28 | transformer.resblocks.1.mlp.c_fc.weight [4096,1024] 29 | transformer.resblocks.1.mlp.c_proj.bias [1024] 30 | transformer.resblocks.1.mlp.c_proj.weight [1024,4096] 31 | transformer.resblocks.10.attn.in_proj_bias [3072] 32 | transformer.resblocks.10.attn.in_proj_weight 
[3072,1024] 33 | transformer.resblocks.10.attn.out_proj.bias [1024] 34 | transformer.resblocks.10.attn.out_proj.weight [1024,1024] 35 | transformer.resblocks.10.ln_1.bias [1024] 36 | transformer.resblocks.10.ln_1.weight [1024] 37 | transformer.resblocks.10.ln_2.bias [1024] 38 | transformer.resblocks.10.ln_2.weight [1024] 39 | transformer.resblocks.10.mlp.c_fc.bias [4096] 40 | transformer.resblocks.10.mlp.c_fc.weight [4096,1024] 41 | transformer.resblocks.10.mlp.c_proj.bias [1024] 42 | transformer.resblocks.10.mlp.c_proj.weight [1024,4096] 43 | transformer.resblocks.11.attn.in_proj_bias [3072] 44 | transformer.resblocks.11.attn.in_proj_weight [3072,1024] 45 | transformer.resblocks.11.attn.out_proj.bias [1024] 46 | transformer.resblocks.11.attn.out_proj.weight [1024,1024] 47 | transformer.resblocks.11.ln_1.bias [1024] 48 | transformer.resblocks.11.ln_1.weight [1024] 49 | transformer.resblocks.11.ln_2.bias [1024] 50 | transformer.resblocks.11.ln_2.weight [1024] 51 | transformer.resblocks.11.mlp.c_fc.bias [4096] 52 | transformer.resblocks.11.mlp.c_fc.weight [4096,1024] 53 | transformer.resblocks.11.mlp.c_proj.bias [1024] 54 | transformer.resblocks.11.mlp.c_proj.weight [1024,4096] 55 | transformer.resblocks.12.attn.in_proj_bias [3072] 56 | transformer.resblocks.12.attn.in_proj_weight [3072,1024] 57 | transformer.resblocks.12.attn.out_proj.bias [1024] 58 | transformer.resblocks.12.attn.out_proj.weight [1024,1024] 59 | transformer.resblocks.12.ln_1.bias [1024] 60 | transformer.resblocks.12.ln_1.weight [1024] 61 | transformer.resblocks.12.ln_2.bias [1024] 62 | transformer.resblocks.12.ln_2.weight [1024] 63 | transformer.resblocks.12.mlp.c_fc.bias [4096] 64 | transformer.resblocks.12.mlp.c_fc.weight [4096,1024] 65 | transformer.resblocks.12.mlp.c_proj.bias [1024] 66 | transformer.resblocks.12.mlp.c_proj.weight [1024,4096] 67 | transformer.resblocks.13.attn.in_proj_bias [3072] 68 | transformer.resblocks.13.attn.in_proj_weight [3072,1024] 69 | 
transformer.resblocks.13.attn.out_proj.bias [1024] 70 | transformer.resblocks.13.attn.out_proj.weight [1024,1024] 71 | transformer.resblocks.13.ln_1.bias [1024] 72 | transformer.resblocks.13.ln_1.weight [1024] 73 | transformer.resblocks.13.ln_2.bias [1024] 74 | transformer.resblocks.13.ln_2.weight [1024] 75 | transformer.resblocks.13.mlp.c_fc.bias [4096] 76 | transformer.resblocks.13.mlp.c_fc.weight [4096,1024] 77 | transformer.resblocks.13.mlp.c_proj.bias [1024] 78 | transformer.resblocks.13.mlp.c_proj.weight [1024,4096] 79 | transformer.resblocks.14.attn.in_proj_bias [3072] 80 | transformer.resblocks.14.attn.in_proj_weight [3072,1024] 81 | transformer.resblocks.14.attn.out_proj.bias [1024] 82 | transformer.resblocks.14.attn.out_proj.weight [1024,1024] 83 | transformer.resblocks.14.ln_1.bias [1024] 84 | transformer.resblocks.14.ln_1.weight [1024] 85 | transformer.resblocks.14.ln_2.bias [1024] 86 | transformer.resblocks.14.ln_2.weight [1024] 87 | transformer.resblocks.14.mlp.c_fc.bias [4096] 88 | transformer.resblocks.14.mlp.c_fc.weight [4096,1024] 89 | transformer.resblocks.14.mlp.c_proj.bias [1024] 90 | transformer.resblocks.14.mlp.c_proj.weight [1024,4096] 91 | transformer.resblocks.15.attn.in_proj_bias [3072] 92 | transformer.resblocks.15.attn.in_proj_weight [3072,1024] 93 | transformer.resblocks.15.attn.out_proj.bias [1024] 94 | transformer.resblocks.15.attn.out_proj.weight [1024,1024] 95 | transformer.resblocks.15.ln_1.bias [1024] 96 | transformer.resblocks.15.ln_1.weight [1024] 97 | transformer.resblocks.15.ln_2.bias [1024] 98 | transformer.resblocks.15.ln_2.weight [1024] 99 | transformer.resblocks.15.mlp.c_fc.bias [4096] 100 | transformer.resblocks.15.mlp.c_fc.weight [4096,1024] 101 | transformer.resblocks.15.mlp.c_proj.bias [1024] 102 | transformer.resblocks.15.mlp.c_proj.weight [1024,4096] 103 | transformer.resblocks.16.attn.in_proj_bias [3072] 104 | transformer.resblocks.16.attn.in_proj_weight [3072,1024] 105 | transformer.resblocks.16.attn.out_proj.bias 
[1024] 106 | transformer.resblocks.16.attn.out_proj.weight [1024,1024] 107 | transformer.resblocks.16.ln_1.bias [1024] 108 | transformer.resblocks.16.ln_1.weight [1024] 109 | transformer.resblocks.16.ln_2.bias [1024] 110 | transformer.resblocks.16.ln_2.weight [1024] 111 | transformer.resblocks.16.mlp.c_fc.bias [4096] 112 | transformer.resblocks.16.mlp.c_fc.weight [4096,1024] 113 | transformer.resblocks.16.mlp.c_proj.bias [1024] 114 | transformer.resblocks.16.mlp.c_proj.weight [1024,4096] 115 | transformer.resblocks.17.attn.in_proj_bias [3072] 116 | transformer.resblocks.17.attn.in_proj_weight [3072,1024] 117 | transformer.resblocks.17.attn.out_proj.bias [1024] 118 | transformer.resblocks.17.attn.out_proj.weight [1024,1024] 119 | transformer.resblocks.17.ln_1.bias [1024] 120 | transformer.resblocks.17.ln_1.weight [1024] 121 | transformer.resblocks.17.ln_2.bias [1024] 122 | transformer.resblocks.17.ln_2.weight [1024] 123 | transformer.resblocks.17.mlp.c_fc.bias [4096] 124 | transformer.resblocks.17.mlp.c_fc.weight [4096,1024] 125 | transformer.resblocks.17.mlp.c_proj.bias [1024] 126 | transformer.resblocks.17.mlp.c_proj.weight [1024,4096] 127 | transformer.resblocks.18.attn.in_proj_bias [3072] 128 | transformer.resblocks.18.attn.in_proj_weight [3072,1024] 129 | transformer.resblocks.18.attn.out_proj.bias [1024] 130 | transformer.resblocks.18.attn.out_proj.weight [1024,1024] 131 | transformer.resblocks.18.ln_1.bias [1024] 132 | transformer.resblocks.18.ln_1.weight [1024] 133 | transformer.resblocks.18.ln_2.bias [1024] 134 | transformer.resblocks.18.ln_2.weight [1024] 135 | transformer.resblocks.18.mlp.c_fc.bias [4096] 136 | transformer.resblocks.18.mlp.c_fc.weight [4096,1024] 137 | transformer.resblocks.18.mlp.c_proj.bias [1024] 138 | transformer.resblocks.18.mlp.c_proj.weight [1024,4096] 139 | transformer.resblocks.19.attn.in_proj_bias [3072] 140 | transformer.resblocks.19.attn.in_proj_weight [3072,1024] 141 | transformer.resblocks.19.attn.out_proj.bias [1024] 142 | 
transformer.resblocks.19.attn.out_proj.weight [1024,1024] 143 | transformer.resblocks.19.ln_1.bias [1024] 144 | transformer.resblocks.19.ln_1.weight [1024] 145 | transformer.resblocks.19.ln_2.bias [1024] 146 | transformer.resblocks.19.ln_2.weight [1024] 147 | transformer.resblocks.19.mlp.c_fc.bias [4096] 148 | transformer.resblocks.19.mlp.c_fc.weight [4096,1024] 149 | transformer.resblocks.19.mlp.c_proj.bias [1024] 150 | transformer.resblocks.19.mlp.c_proj.weight [1024,4096] 151 | transformer.resblocks.2.attn.in_proj_bias [3072] 152 | transformer.resblocks.2.attn.in_proj_weight [3072,1024] 153 | transformer.resblocks.2.attn.out_proj.bias [1024] 154 | transformer.resblocks.2.attn.out_proj.weight [1024,1024] 155 | transformer.resblocks.2.ln_1.bias [1024] 156 | transformer.resblocks.2.ln_1.weight [1024] 157 | transformer.resblocks.2.ln_2.bias [1024] 158 | transformer.resblocks.2.ln_2.weight [1024] 159 | transformer.resblocks.2.mlp.c_fc.bias [4096] 160 | transformer.resblocks.2.mlp.c_fc.weight [4096,1024] 161 | transformer.resblocks.2.mlp.c_proj.bias [1024] 162 | transformer.resblocks.2.mlp.c_proj.weight [1024,4096] 163 | transformer.resblocks.20.attn.in_proj_bias [3072] 164 | transformer.resblocks.20.attn.in_proj_weight [3072,1024] 165 | transformer.resblocks.20.attn.out_proj.bias [1024] 166 | transformer.resblocks.20.attn.out_proj.weight [1024,1024] 167 | transformer.resblocks.20.ln_1.bias [1024] 168 | transformer.resblocks.20.ln_1.weight [1024] 169 | transformer.resblocks.20.ln_2.bias [1024] 170 | transformer.resblocks.20.ln_2.weight [1024] 171 | transformer.resblocks.20.mlp.c_fc.bias [4096] 172 | transformer.resblocks.20.mlp.c_fc.weight [4096,1024] 173 | transformer.resblocks.20.mlp.c_proj.bias [1024] 174 | transformer.resblocks.20.mlp.c_proj.weight [1024,4096] 175 | transformer.resblocks.21.attn.in_proj_bias [3072] 176 | transformer.resblocks.21.attn.in_proj_weight [3072,1024] 177 | transformer.resblocks.21.attn.out_proj.bias [1024] 178 | 
transformer.resblocks.21.attn.out_proj.weight [1024,1024] 179 | transformer.resblocks.21.ln_1.bias [1024] 180 | transformer.resblocks.21.ln_1.weight [1024] 181 | transformer.resblocks.21.ln_2.bias [1024] 182 | transformer.resblocks.21.ln_2.weight [1024] 183 | transformer.resblocks.21.mlp.c_fc.bias [4096] 184 | transformer.resblocks.21.mlp.c_fc.weight [4096,1024] 185 | transformer.resblocks.21.mlp.c_proj.bias [1024] 186 | transformer.resblocks.21.mlp.c_proj.weight [1024,4096] 187 | transformer.resblocks.22.attn.in_proj_bias [3072] 188 | transformer.resblocks.22.attn.in_proj_weight [3072,1024] 189 | transformer.resblocks.22.attn.out_proj.bias [1024] 190 | transformer.resblocks.22.attn.out_proj.weight [1024,1024] 191 | transformer.resblocks.22.ln_1.bias [1024] 192 | transformer.resblocks.22.ln_1.weight [1024] 193 | transformer.resblocks.22.ln_2.bias [1024] 194 | transformer.resblocks.22.ln_2.weight [1024] 195 | transformer.resblocks.22.mlp.c_fc.bias [4096] 196 | transformer.resblocks.22.mlp.c_fc.weight [4096,1024] 197 | transformer.resblocks.22.mlp.c_proj.bias [1024] 198 | transformer.resblocks.22.mlp.c_proj.weight [1024,4096] 199 | transformer.resblocks.23.attn.in_proj_bias [3072] 200 | transformer.resblocks.23.attn.in_proj_weight [3072,1024] 201 | transformer.resblocks.23.attn.out_proj.bias [1024] 202 | transformer.resblocks.23.attn.out_proj.weight [1024,1024] 203 | transformer.resblocks.23.ln_1.bias [1024] 204 | transformer.resblocks.23.ln_1.weight [1024] 205 | transformer.resblocks.23.ln_2.bias [1024] 206 | transformer.resblocks.23.ln_2.weight [1024] 207 | transformer.resblocks.23.mlp.c_fc.bias [4096] 208 | transformer.resblocks.23.mlp.c_fc.weight [4096,1024] 209 | transformer.resblocks.23.mlp.c_proj.bias [1024] 210 | transformer.resblocks.23.mlp.c_proj.weight [1024,4096] 211 | transformer.resblocks.3.attn.in_proj_bias [3072] 212 | transformer.resblocks.3.attn.in_proj_weight [3072,1024] 213 | transformer.resblocks.3.attn.out_proj.bias [1024] 214 | 
transformer.resblocks.3.attn.out_proj.weight [1024,1024] 215 | transformer.resblocks.3.ln_1.bias [1024] 216 | transformer.resblocks.3.ln_1.weight [1024] 217 | transformer.resblocks.3.ln_2.bias [1024] 218 | transformer.resblocks.3.ln_2.weight [1024] 219 | transformer.resblocks.3.mlp.c_fc.bias [4096] 220 | transformer.resblocks.3.mlp.c_fc.weight [4096,1024] 221 | transformer.resblocks.3.mlp.c_proj.bias [1024] 222 | transformer.resblocks.3.mlp.c_proj.weight [1024,4096] 223 | transformer.resblocks.4.attn.in_proj_bias [3072] 224 | transformer.resblocks.4.attn.in_proj_weight [3072,1024] 225 | transformer.resblocks.4.attn.out_proj.bias [1024] 226 | transformer.resblocks.4.attn.out_proj.weight [1024,1024] 227 | transformer.resblocks.4.ln_1.bias [1024] 228 | transformer.resblocks.4.ln_1.weight [1024] 229 | transformer.resblocks.4.ln_2.bias [1024] 230 | transformer.resblocks.4.ln_2.weight [1024] 231 | transformer.resblocks.4.mlp.c_fc.bias [4096] 232 | transformer.resblocks.4.mlp.c_fc.weight [4096,1024] 233 | transformer.resblocks.4.mlp.c_proj.bias [1024] 234 | transformer.resblocks.4.mlp.c_proj.weight [1024,4096] 235 | transformer.resblocks.5.attn.in_proj_bias [3072] 236 | transformer.resblocks.5.attn.in_proj_weight [3072,1024] 237 | transformer.resblocks.5.attn.out_proj.bias [1024] 238 | transformer.resblocks.5.attn.out_proj.weight [1024,1024] 239 | transformer.resblocks.5.ln_1.bias [1024] 240 | transformer.resblocks.5.ln_1.weight [1024] 241 | transformer.resblocks.5.ln_2.bias [1024] 242 | transformer.resblocks.5.ln_2.weight [1024] 243 | transformer.resblocks.5.mlp.c_fc.bias [4096] 244 | transformer.resblocks.5.mlp.c_fc.weight [4096,1024] 245 | transformer.resblocks.5.mlp.c_proj.bias [1024] 246 | transformer.resblocks.5.mlp.c_proj.weight [1024,4096] 247 | transformer.resblocks.6.attn.in_proj_bias [3072] 248 | transformer.resblocks.6.attn.in_proj_weight [3072,1024] 249 | transformer.resblocks.6.attn.out_proj.bias [1024] 250 | transformer.resblocks.6.attn.out_proj.weight 
[1024,1024] 251 | transformer.resblocks.6.ln_1.bias [1024] 252 | transformer.resblocks.6.ln_1.weight [1024] 253 | transformer.resblocks.6.ln_2.bias [1024] 254 | transformer.resblocks.6.ln_2.weight [1024] 255 | transformer.resblocks.6.mlp.c_fc.bias [4096] 256 | transformer.resblocks.6.mlp.c_fc.weight [4096,1024] 257 | transformer.resblocks.6.mlp.c_proj.bias [1024] 258 | transformer.resblocks.6.mlp.c_proj.weight [1024,4096] 259 | transformer.resblocks.7.attn.in_proj_bias [3072] 260 | transformer.resblocks.7.attn.in_proj_weight [3072,1024] 261 | transformer.resblocks.7.attn.out_proj.bias [1024] 262 | transformer.resblocks.7.attn.out_proj.weight [1024,1024] 263 | transformer.resblocks.7.ln_1.bias [1024] 264 | transformer.resblocks.7.ln_1.weight [1024] 265 | transformer.resblocks.7.ln_2.bias [1024] 266 | transformer.resblocks.7.ln_2.weight [1024] 267 | transformer.resblocks.7.mlp.c_fc.bias [4096] 268 | transformer.resblocks.7.mlp.c_fc.weight [4096,1024] 269 | transformer.resblocks.7.mlp.c_proj.bias [1024] 270 | transformer.resblocks.7.mlp.c_proj.weight [1024,4096] 271 | transformer.resblocks.8.attn.in_proj_bias [3072] 272 | transformer.resblocks.8.attn.in_proj_weight [3072,1024] 273 | transformer.resblocks.8.attn.out_proj.bias [1024] 274 | transformer.resblocks.8.attn.out_proj.weight [1024,1024] 275 | transformer.resblocks.8.ln_1.bias [1024] 276 | transformer.resblocks.8.ln_1.weight [1024] 277 | transformer.resblocks.8.ln_2.bias [1024] 278 | transformer.resblocks.8.ln_2.weight [1024] 279 | transformer.resblocks.8.mlp.c_fc.bias [4096] 280 | transformer.resblocks.8.mlp.c_fc.weight [4096,1024] 281 | transformer.resblocks.8.mlp.c_proj.bias [1024] 282 | transformer.resblocks.8.mlp.c_proj.weight [1024,4096] 283 | transformer.resblocks.9.attn.in_proj_bias [3072] 284 | transformer.resblocks.9.attn.in_proj_weight [3072,1024] 285 | transformer.resblocks.9.attn.out_proj.bias [1024] 286 | transformer.resblocks.9.attn.out_proj.weight [1024,1024] 287 | 
transformer.resblocks.9.ln_1.bias [1024] 288 | transformer.resblocks.9.ln_1.weight [1024] 289 | transformer.resblocks.9.ln_2.bias [1024] 290 | transformer.resblocks.9.ln_2.weight [1024] 291 | transformer.resblocks.9.mlp.c_fc.bias [4096] 292 | transformer.resblocks.9.mlp.c_fc.weight [4096,1024] 293 | transformer.resblocks.9.mlp.c_proj.bias [1024] 294 | transformer.resblocks.9.mlp.c_proj.weight [1024,4096] -------------------------------------------------------------------------------- /components/CLIP-v2-WD.txt: -------------------------------------------------------------------------------- 1 | ln_final.bias [1024] 2 | ln_final.weight [1024] 3 | positional_embedding [77,1024] 4 | token_embedding.weight [49408,1024] 5 | transformer.resblocks.0.attn.in_proj_bias [3072] 6 | transformer.resblocks.0.attn.in_proj_weight [3072,1024] 7 | transformer.resblocks.0.attn.out_proj.bias [1024] 8 | transformer.resblocks.0.attn.out_proj.weight [1024,1024] 9 | transformer.resblocks.0.ln_1.bias [1024] 10 | transformer.resblocks.0.ln_1.weight [1024] 11 | transformer.resblocks.0.ln_2.bias [1024] 12 | transformer.resblocks.0.ln_2.weight [1024] 13 | transformer.resblocks.0.mlp.c_fc.bias [4096] 14 | transformer.resblocks.0.mlp.c_fc.weight [4096,1024] 15 | transformer.resblocks.0.mlp.c_proj.bias [1024] 16 | transformer.resblocks.0.mlp.c_proj.weight [1024,4096] 17 | transformer.resblocks.1.attn.in_proj_bias [3072] 18 | transformer.resblocks.1.attn.in_proj_weight [3072,1024] 19 | transformer.resblocks.1.attn.out_proj.bias [1024] 20 | transformer.resblocks.1.attn.out_proj.weight [1024,1024] 21 | transformer.resblocks.1.ln_1.bias [1024] 22 | transformer.resblocks.1.ln_1.weight [1024] 23 | transformer.resblocks.1.ln_2.bias [1024] 24 | transformer.resblocks.1.ln_2.weight [1024] 25 | transformer.resblocks.1.mlp.c_fc.bias [4096] 26 | transformer.resblocks.1.mlp.c_fc.weight [4096,1024] 27 | transformer.resblocks.1.mlp.c_proj.bias [1024] 28 | transformer.resblocks.1.mlp.c_proj.weight [1024,4096] 29 
| transformer.resblocks.10.attn.in_proj_bias [3072] 30 | transformer.resblocks.10.attn.in_proj_weight [3072,1024] 31 | transformer.resblocks.10.attn.out_proj.bias [1024] 32 | transformer.resblocks.10.attn.out_proj.weight [1024,1024] 33 | transformer.resblocks.10.ln_1.bias [1024] 34 | transformer.resblocks.10.ln_1.weight [1024] 35 | transformer.resblocks.10.ln_2.bias [1024] 36 | transformer.resblocks.10.ln_2.weight [1024] 37 | transformer.resblocks.10.mlp.c_fc.bias [4096] 38 | transformer.resblocks.10.mlp.c_fc.weight [4096,1024] 39 | transformer.resblocks.10.mlp.c_proj.bias [1024] 40 | transformer.resblocks.10.mlp.c_proj.weight [1024,4096] 41 | transformer.resblocks.11.attn.in_proj_bias [3072] 42 | transformer.resblocks.11.attn.in_proj_weight [3072,1024] 43 | transformer.resblocks.11.attn.out_proj.bias [1024] 44 | transformer.resblocks.11.attn.out_proj.weight [1024,1024] 45 | transformer.resblocks.11.ln_1.bias [1024] 46 | transformer.resblocks.11.ln_1.weight [1024] 47 | transformer.resblocks.11.ln_2.bias [1024] 48 | transformer.resblocks.11.ln_2.weight [1024] 49 | transformer.resblocks.11.mlp.c_fc.bias [4096] 50 | transformer.resblocks.11.mlp.c_fc.weight [4096,1024] 51 | transformer.resblocks.11.mlp.c_proj.bias [1024] 52 | transformer.resblocks.11.mlp.c_proj.weight [1024,4096] 53 | transformer.resblocks.12.attn.in_proj_bias [3072] 54 | transformer.resblocks.12.attn.in_proj_weight [3072,1024] 55 | transformer.resblocks.12.attn.out_proj.bias [1024] 56 | transformer.resblocks.12.attn.out_proj.weight [1024,1024] 57 | transformer.resblocks.12.ln_1.bias [1024] 58 | transformer.resblocks.12.ln_1.weight [1024] 59 | transformer.resblocks.12.ln_2.bias [1024] 60 | transformer.resblocks.12.ln_2.weight [1024] 61 | transformer.resblocks.12.mlp.c_fc.bias [4096] 62 | transformer.resblocks.12.mlp.c_fc.weight [4096,1024] 63 | transformer.resblocks.12.mlp.c_proj.bias [1024] 64 | transformer.resblocks.12.mlp.c_proj.weight [1024,4096] 65 | transformer.resblocks.13.attn.in_proj_bias 
[3072] 66 | transformer.resblocks.13.attn.in_proj_weight [3072,1024] 67 | transformer.resblocks.13.attn.out_proj.bias [1024] 68 | transformer.resblocks.13.attn.out_proj.weight [1024,1024] 69 | transformer.resblocks.13.ln_1.bias [1024] 70 | transformer.resblocks.13.ln_1.weight [1024] 71 | transformer.resblocks.13.ln_2.bias [1024] 72 | transformer.resblocks.13.ln_2.weight [1024] 73 | transformer.resblocks.13.mlp.c_fc.bias [4096] 74 | transformer.resblocks.13.mlp.c_fc.weight [4096,1024] 75 | transformer.resblocks.13.mlp.c_proj.bias [1024] 76 | transformer.resblocks.13.mlp.c_proj.weight [1024,4096] 77 | transformer.resblocks.14.attn.in_proj_bias [3072] 78 | transformer.resblocks.14.attn.in_proj_weight [3072,1024] 79 | transformer.resblocks.14.attn.out_proj.bias [1024] 80 | transformer.resblocks.14.attn.out_proj.weight [1024,1024] 81 | transformer.resblocks.14.ln_1.bias [1024] 82 | transformer.resblocks.14.ln_1.weight [1024] 83 | transformer.resblocks.14.ln_2.bias [1024] 84 | transformer.resblocks.14.ln_2.weight [1024] 85 | transformer.resblocks.14.mlp.c_fc.bias [4096] 86 | transformer.resblocks.14.mlp.c_fc.weight [4096,1024] 87 | transformer.resblocks.14.mlp.c_proj.bias [1024] 88 | transformer.resblocks.14.mlp.c_proj.weight [1024,4096] 89 | transformer.resblocks.15.attn.in_proj_bias [3072] 90 | transformer.resblocks.15.attn.in_proj_weight [3072,1024] 91 | transformer.resblocks.15.attn.out_proj.bias [1024] 92 | transformer.resblocks.15.attn.out_proj.weight [1024,1024] 93 | transformer.resblocks.15.ln_1.bias [1024] 94 | transformer.resblocks.15.ln_1.weight [1024] 95 | transformer.resblocks.15.ln_2.bias [1024] 96 | transformer.resblocks.15.ln_2.weight [1024] 97 | transformer.resblocks.15.mlp.c_fc.bias [4096] 98 | transformer.resblocks.15.mlp.c_fc.weight [4096,1024] 99 | transformer.resblocks.15.mlp.c_proj.bias [1024] 100 | transformer.resblocks.15.mlp.c_proj.weight [1024,4096] 101 | transformer.resblocks.16.attn.in_proj_bias [3072] 102 | 
transformer.resblocks.16.attn.in_proj_weight [3072,1024] 103 | transformer.resblocks.16.attn.out_proj.bias [1024] 104 | transformer.resblocks.16.attn.out_proj.weight [1024,1024] 105 | transformer.resblocks.16.ln_1.bias [1024] 106 | transformer.resblocks.16.ln_1.weight [1024] 107 | transformer.resblocks.16.ln_2.bias [1024] 108 | transformer.resblocks.16.ln_2.weight [1024] 109 | transformer.resblocks.16.mlp.c_fc.bias [4096] 110 | transformer.resblocks.16.mlp.c_fc.weight [4096,1024] 111 | transformer.resblocks.16.mlp.c_proj.bias [1024] 112 | transformer.resblocks.16.mlp.c_proj.weight [1024,4096] 113 | transformer.resblocks.17.attn.in_proj_bias [3072] 114 | transformer.resblocks.17.attn.in_proj_weight [3072,1024] 115 | transformer.resblocks.17.attn.out_proj.bias [1024] 116 | transformer.resblocks.17.attn.out_proj.weight [1024,1024] 117 | transformer.resblocks.17.ln_1.bias [1024] 118 | transformer.resblocks.17.ln_1.weight [1024] 119 | transformer.resblocks.17.ln_2.bias [1024] 120 | transformer.resblocks.17.ln_2.weight [1024] 121 | transformer.resblocks.17.mlp.c_fc.bias [4096] 122 | transformer.resblocks.17.mlp.c_fc.weight [4096,1024] 123 | transformer.resblocks.17.mlp.c_proj.bias [1024] 124 | transformer.resblocks.17.mlp.c_proj.weight [1024,4096] 125 | transformer.resblocks.18.attn.in_proj_bias [3072] 126 | transformer.resblocks.18.attn.in_proj_weight [3072,1024] 127 | transformer.resblocks.18.attn.out_proj.bias [1024] 128 | transformer.resblocks.18.attn.out_proj.weight [1024,1024] 129 | transformer.resblocks.18.ln_1.bias [1024] 130 | transformer.resblocks.18.ln_1.weight [1024] 131 | transformer.resblocks.18.ln_2.bias [1024] 132 | transformer.resblocks.18.ln_2.weight [1024] 133 | transformer.resblocks.18.mlp.c_fc.bias [4096] 134 | transformer.resblocks.18.mlp.c_fc.weight [4096,1024] 135 | transformer.resblocks.18.mlp.c_proj.bias [1024] 136 | transformer.resblocks.18.mlp.c_proj.weight [1024,4096] 137 | transformer.resblocks.19.attn.in_proj_bias [3072] 138 | 
transformer.resblocks.19.attn.in_proj_weight [3072,1024] 139 | transformer.resblocks.19.attn.out_proj.bias [1024] 140 | transformer.resblocks.19.attn.out_proj.weight [1024,1024] 141 | transformer.resblocks.19.ln_1.bias [1024] 142 | transformer.resblocks.19.ln_1.weight [1024] 143 | transformer.resblocks.19.ln_2.bias [1024] 144 | transformer.resblocks.19.ln_2.weight [1024] 145 | transformer.resblocks.19.mlp.c_fc.bias [4096] 146 | transformer.resblocks.19.mlp.c_fc.weight [4096,1024] 147 | transformer.resblocks.19.mlp.c_proj.bias [1024] 148 | transformer.resblocks.19.mlp.c_proj.weight [1024,4096] 149 | transformer.resblocks.2.attn.in_proj_bias [3072] 150 | transformer.resblocks.2.attn.in_proj_weight [3072,1024] 151 | transformer.resblocks.2.attn.out_proj.bias [1024] 152 | transformer.resblocks.2.attn.out_proj.weight [1024,1024] 153 | transformer.resblocks.2.ln_1.bias [1024] 154 | transformer.resblocks.2.ln_1.weight [1024] 155 | transformer.resblocks.2.ln_2.bias [1024] 156 | transformer.resblocks.2.ln_2.weight [1024] 157 | transformer.resblocks.2.mlp.c_fc.bias [4096] 158 | transformer.resblocks.2.mlp.c_fc.weight [4096,1024] 159 | transformer.resblocks.2.mlp.c_proj.bias [1024] 160 | transformer.resblocks.2.mlp.c_proj.weight [1024,4096] 161 | transformer.resblocks.20.attn.in_proj_bias [3072] 162 | transformer.resblocks.20.attn.in_proj_weight [3072,1024] 163 | transformer.resblocks.20.attn.out_proj.bias [1024] 164 | transformer.resblocks.20.attn.out_proj.weight [1024,1024] 165 | transformer.resblocks.20.ln_1.bias [1024] 166 | transformer.resblocks.20.ln_1.weight [1024] 167 | transformer.resblocks.20.ln_2.bias [1024] 168 | transformer.resblocks.20.ln_2.weight [1024] 169 | transformer.resblocks.20.mlp.c_fc.bias [4096] 170 | transformer.resblocks.20.mlp.c_fc.weight [4096,1024] 171 | transformer.resblocks.20.mlp.c_proj.bias [1024] 172 | transformer.resblocks.20.mlp.c_proj.weight [1024,4096] 173 | transformer.resblocks.21.attn.in_proj_bias [3072] 174 | 
transformer.resblocks.21.attn.in_proj_weight [3072,1024] 175 | transformer.resblocks.21.attn.out_proj.bias [1024] 176 | transformer.resblocks.21.attn.out_proj.weight [1024,1024] 177 | transformer.resblocks.21.ln_1.bias [1024] 178 | transformer.resblocks.21.ln_1.weight [1024] 179 | transformer.resblocks.21.ln_2.bias [1024] 180 | transformer.resblocks.21.ln_2.weight [1024] 181 | transformer.resblocks.21.mlp.c_fc.bias [4096] 182 | transformer.resblocks.21.mlp.c_fc.weight [4096,1024] 183 | transformer.resblocks.21.mlp.c_proj.bias [1024] 184 | transformer.resblocks.21.mlp.c_proj.weight [1024,4096] 185 | transformer.resblocks.22.attn.in_proj_bias [3072] 186 | transformer.resblocks.22.attn.in_proj_weight [3072,1024] 187 | transformer.resblocks.22.attn.out_proj.bias [1024] 188 | transformer.resblocks.22.attn.out_proj.weight [1024,1024] 189 | transformer.resblocks.22.ln_1.bias [1024] 190 | transformer.resblocks.22.ln_1.weight [1024] 191 | transformer.resblocks.22.ln_2.bias [1024] 192 | transformer.resblocks.22.ln_2.weight [1024] 193 | transformer.resblocks.22.mlp.c_fc.bias [4096] 194 | transformer.resblocks.22.mlp.c_fc.weight [4096,1024] 195 | transformer.resblocks.22.mlp.c_proj.bias [1024] 196 | transformer.resblocks.22.mlp.c_proj.weight [1024,4096] 197 | transformer.resblocks.3.attn.in_proj_bias [3072] 198 | transformer.resblocks.3.attn.in_proj_weight [3072,1024] 199 | transformer.resblocks.3.attn.out_proj.bias [1024] 200 | transformer.resblocks.3.attn.out_proj.weight [1024,1024] 201 | transformer.resblocks.3.ln_1.bias [1024] 202 | transformer.resblocks.3.ln_1.weight [1024] 203 | transformer.resblocks.3.ln_2.bias [1024] 204 | transformer.resblocks.3.ln_2.weight [1024] 205 | transformer.resblocks.3.mlp.c_fc.bias [4096] 206 | transformer.resblocks.3.mlp.c_fc.weight [4096,1024] 207 | transformer.resblocks.3.mlp.c_proj.bias [1024] 208 | transformer.resblocks.3.mlp.c_proj.weight [1024,4096] 209 | transformer.resblocks.4.attn.in_proj_bias [3072] 210 | 
transformer.resblocks.4.attn.in_proj_weight [3072,1024] 211 | transformer.resblocks.4.attn.out_proj.bias [1024] 212 | transformer.resblocks.4.attn.out_proj.weight [1024,1024] 213 | transformer.resblocks.4.ln_1.bias [1024] 214 | transformer.resblocks.4.ln_1.weight [1024] 215 | transformer.resblocks.4.ln_2.bias [1024] 216 | transformer.resblocks.4.ln_2.weight [1024] 217 | transformer.resblocks.4.mlp.c_fc.bias [4096] 218 | transformer.resblocks.4.mlp.c_fc.weight [4096,1024] 219 | transformer.resblocks.4.mlp.c_proj.bias [1024] 220 | transformer.resblocks.4.mlp.c_proj.weight [1024,4096] 221 | transformer.resblocks.5.attn.in_proj_bias [3072] 222 | transformer.resblocks.5.attn.in_proj_weight [3072,1024] 223 | transformer.resblocks.5.attn.out_proj.bias [1024] 224 | transformer.resblocks.5.attn.out_proj.weight [1024,1024] 225 | transformer.resblocks.5.ln_1.bias [1024] 226 | transformer.resblocks.5.ln_1.weight [1024] 227 | transformer.resblocks.5.ln_2.bias [1024] 228 | transformer.resblocks.5.ln_2.weight [1024] 229 | transformer.resblocks.5.mlp.c_fc.bias [4096] 230 | transformer.resblocks.5.mlp.c_fc.weight [4096,1024] 231 | transformer.resblocks.5.mlp.c_proj.bias [1024] 232 | transformer.resblocks.5.mlp.c_proj.weight [1024,4096] 233 | transformer.resblocks.6.attn.in_proj_bias [3072] 234 | transformer.resblocks.6.attn.in_proj_weight [3072,1024] 235 | transformer.resblocks.6.attn.out_proj.bias [1024] 236 | transformer.resblocks.6.attn.out_proj.weight [1024,1024] 237 | transformer.resblocks.6.ln_1.bias [1024] 238 | transformer.resblocks.6.ln_1.weight [1024] 239 | transformer.resblocks.6.ln_2.bias [1024] 240 | transformer.resblocks.6.ln_2.weight [1024] 241 | transformer.resblocks.6.mlp.c_fc.bias [4096] 242 | transformer.resblocks.6.mlp.c_fc.weight [4096,1024] 243 | transformer.resblocks.6.mlp.c_proj.bias [1024] 244 | transformer.resblocks.6.mlp.c_proj.weight [1024,4096] 245 | transformer.resblocks.7.attn.in_proj_bias [3072] 246 | transformer.resblocks.7.attn.in_proj_weight 
[3072,1024] 247 | transformer.resblocks.7.attn.out_proj.bias [1024] 248 | transformer.resblocks.7.attn.out_proj.weight [1024,1024] 249 | transformer.resblocks.7.ln_1.bias [1024] 250 | transformer.resblocks.7.ln_1.weight [1024] 251 | transformer.resblocks.7.ln_2.bias [1024] 252 | transformer.resblocks.7.ln_2.weight [1024] 253 | transformer.resblocks.7.mlp.c_fc.bias [4096] 254 | transformer.resblocks.7.mlp.c_fc.weight [4096,1024] 255 | transformer.resblocks.7.mlp.c_proj.bias [1024] 256 | transformer.resblocks.7.mlp.c_proj.weight [1024,4096] 257 | transformer.resblocks.8.attn.in_proj_bias [3072] 258 | transformer.resblocks.8.attn.in_proj_weight [3072,1024] 259 | transformer.resblocks.8.attn.out_proj.bias [1024] 260 | transformer.resblocks.8.attn.out_proj.weight [1024,1024] 261 | transformer.resblocks.8.ln_1.bias [1024] 262 | transformer.resblocks.8.ln_1.weight [1024] 263 | transformer.resblocks.8.ln_2.bias [1024] 264 | transformer.resblocks.8.ln_2.weight [1024] 265 | transformer.resblocks.8.mlp.c_fc.bias [4096] 266 | transformer.resblocks.8.mlp.c_fc.weight [4096,1024] 267 | transformer.resblocks.8.mlp.c_proj.bias [1024] 268 | transformer.resblocks.8.mlp.c_proj.weight [1024,4096] 269 | transformer.resblocks.9.attn.in_proj_bias [3072] 270 | transformer.resblocks.9.attn.in_proj_weight [3072,1024] 271 | transformer.resblocks.9.attn.out_proj.bias [1024] 272 | transformer.resblocks.9.attn.out_proj.weight [1024,1024] 273 | transformer.resblocks.9.ln_1.bias [1024] 274 | transformer.resblocks.9.ln_1.weight [1024] 275 | transformer.resblocks.9.ln_2.bias [1024] 276 | transformer.resblocks.9.ln_2.weight [1024] 277 | transformer.resblocks.9.mlp.c_fc.bias [4096] 278 | transformer.resblocks.9.mlp.c_fc.weight [4096,1024] 279 | transformer.resblocks.9.mlp.c_proj.bias [1024] 280 | transformer.resblocks.9.mlp.c_proj.weight [1024,4096] -------------------------------------------------------------------------------- /components/CLIP-v3-L.txt: 
-------------------------------------------------------------------------------- 1 | text_model.embeddings.position_embedding.weight [77,768] 2 | text_model.embeddings.token_embedding.weight [49408,768] 3 | text_model.encoder.layers.0.layer_norm1.bias [768] 4 | text_model.encoder.layers.0.layer_norm1.weight [768] 5 | text_model.encoder.layers.0.layer_norm2.bias [768] 6 | text_model.encoder.layers.0.layer_norm2.weight [768] 7 | text_model.encoder.layers.0.mlp.fc1.bias [3072] 8 | text_model.encoder.layers.0.mlp.fc1.weight [3072,768] 9 | text_model.encoder.layers.0.mlp.fc2.bias [768] 10 | text_model.encoder.layers.0.mlp.fc2.weight [768,3072] 11 | text_model.encoder.layers.0.self_attn.k_proj.bias [768] 12 | text_model.encoder.layers.0.self_attn.k_proj.weight [768,768] 13 | text_model.encoder.layers.0.self_attn.out_proj.bias [768] 14 | text_model.encoder.layers.0.self_attn.out_proj.weight [768,768] 15 | text_model.encoder.layers.0.self_attn.q_proj.bias [768] 16 | text_model.encoder.layers.0.self_attn.q_proj.weight [768,768] 17 | text_model.encoder.layers.0.self_attn.v_proj.bias [768] 18 | text_model.encoder.layers.0.self_attn.v_proj.weight [768,768] 19 | text_model.encoder.layers.1.layer_norm1.bias [768] 20 | text_model.encoder.layers.1.layer_norm1.weight [768] 21 | text_model.encoder.layers.1.layer_norm2.bias [768] 22 | text_model.encoder.layers.1.layer_norm2.weight [768] 23 | text_model.encoder.layers.1.mlp.fc1.bias [3072] 24 | text_model.encoder.layers.1.mlp.fc1.weight [3072,768] 25 | text_model.encoder.layers.1.mlp.fc2.bias [768] 26 | text_model.encoder.layers.1.mlp.fc2.weight [768,3072] 27 | text_model.encoder.layers.1.self_attn.k_proj.bias [768] 28 | text_model.encoder.layers.1.self_attn.k_proj.weight [768,768] 29 | text_model.encoder.layers.1.self_attn.out_proj.bias [768] 30 | text_model.encoder.layers.1.self_attn.out_proj.weight [768,768] 31 | text_model.encoder.layers.1.self_attn.q_proj.bias [768] 32 | text_model.encoder.layers.1.self_attn.q_proj.weight 
[768,768] 33 | text_model.encoder.layers.1.self_attn.v_proj.bias [768] 34 | text_model.encoder.layers.1.self_attn.v_proj.weight [768,768] 35 | text_model.encoder.layers.10.layer_norm1.bias [768] 36 | text_model.encoder.layers.10.layer_norm1.weight [768] 37 | text_model.encoder.layers.10.layer_norm2.bias [768] 38 | text_model.encoder.layers.10.layer_norm2.weight [768] 39 | text_model.encoder.layers.10.mlp.fc1.bias [3072] 40 | text_model.encoder.layers.10.mlp.fc1.weight [3072,768] 41 | text_model.encoder.layers.10.mlp.fc2.bias [768] 42 | text_model.encoder.layers.10.mlp.fc2.weight [768,3072] 43 | text_model.encoder.layers.10.self_attn.k_proj.bias [768] 44 | text_model.encoder.layers.10.self_attn.k_proj.weight [768,768] 45 | text_model.encoder.layers.10.self_attn.out_proj.bias [768] 46 | text_model.encoder.layers.10.self_attn.out_proj.weight [768,768] 47 | text_model.encoder.layers.10.self_attn.q_proj.bias [768] 48 | text_model.encoder.layers.10.self_attn.q_proj.weight [768,768] 49 | text_model.encoder.layers.10.self_attn.v_proj.bias [768] 50 | text_model.encoder.layers.10.self_attn.v_proj.weight [768,768] 51 | text_model.encoder.layers.11.layer_norm1.bias [768] 52 | text_model.encoder.layers.11.layer_norm1.weight [768] 53 | text_model.encoder.layers.11.layer_norm2.bias [768] 54 | text_model.encoder.layers.11.layer_norm2.weight [768] 55 | text_model.encoder.layers.11.mlp.fc1.bias [3072] 56 | text_model.encoder.layers.11.mlp.fc1.weight [3072,768] 57 | text_model.encoder.layers.11.mlp.fc2.bias [768] 58 | text_model.encoder.layers.11.mlp.fc2.weight [768,3072] 59 | text_model.encoder.layers.11.self_attn.k_proj.bias [768] 60 | text_model.encoder.layers.11.self_attn.k_proj.weight [768,768] 61 | text_model.encoder.layers.11.self_attn.out_proj.bias [768] 62 | text_model.encoder.layers.11.self_attn.out_proj.weight [768,768] 63 | text_model.encoder.layers.11.self_attn.q_proj.bias [768] 64 | text_model.encoder.layers.11.self_attn.q_proj.weight [768,768] 65 | 
text_model.encoder.layers.11.self_attn.v_proj.bias [768] 66 | text_model.encoder.layers.11.self_attn.v_proj.weight [768,768] 67 | text_model.encoder.layers.2.layer_norm1.bias [768] 68 | text_model.encoder.layers.2.layer_norm1.weight [768] 69 | text_model.encoder.layers.2.layer_norm2.bias [768] 70 | text_model.encoder.layers.2.layer_norm2.weight [768] 71 | text_model.encoder.layers.2.mlp.fc1.bias [3072] 72 | text_model.encoder.layers.2.mlp.fc1.weight [3072,768] 73 | text_model.encoder.layers.2.mlp.fc2.bias [768] 74 | text_model.encoder.layers.2.mlp.fc2.weight [768,3072] 75 | text_model.encoder.layers.2.self_attn.k_proj.bias [768] 76 | text_model.encoder.layers.2.self_attn.k_proj.weight [768,768] 77 | text_model.encoder.layers.2.self_attn.out_proj.bias [768] 78 | text_model.encoder.layers.2.self_attn.out_proj.weight [768,768] 79 | text_model.encoder.layers.2.self_attn.q_proj.bias [768] 80 | text_model.encoder.layers.2.self_attn.q_proj.weight [768,768] 81 | text_model.encoder.layers.2.self_attn.v_proj.bias [768] 82 | text_model.encoder.layers.2.self_attn.v_proj.weight [768,768] 83 | text_model.encoder.layers.3.layer_norm1.bias [768] 84 | text_model.encoder.layers.3.layer_norm1.weight [768] 85 | text_model.encoder.layers.3.layer_norm2.bias [768] 86 | text_model.encoder.layers.3.layer_norm2.weight [768] 87 | text_model.encoder.layers.3.mlp.fc1.bias [3072] 88 | text_model.encoder.layers.3.mlp.fc1.weight [3072,768] 89 | text_model.encoder.layers.3.mlp.fc2.bias [768] 90 | text_model.encoder.layers.3.mlp.fc2.weight [768,3072] 91 | text_model.encoder.layers.3.self_attn.k_proj.bias [768] 92 | text_model.encoder.layers.3.self_attn.k_proj.weight [768,768] 93 | text_model.encoder.layers.3.self_attn.out_proj.bias [768] 94 | text_model.encoder.layers.3.self_attn.out_proj.weight [768,768] 95 | text_model.encoder.layers.3.self_attn.q_proj.bias [768] 96 | text_model.encoder.layers.3.self_attn.q_proj.weight [768,768] 97 | text_model.encoder.layers.3.self_attn.v_proj.bias [768] 98 | 
text_model.encoder.layers.3.self_attn.v_proj.weight [768,768] 99 | text_model.encoder.layers.4.layer_norm1.bias [768] 100 | text_model.encoder.layers.4.layer_norm1.weight [768] 101 | text_model.encoder.layers.4.layer_norm2.bias [768] 102 | text_model.encoder.layers.4.layer_norm2.weight [768] 103 | text_model.encoder.layers.4.mlp.fc1.bias [3072] 104 | text_model.encoder.layers.4.mlp.fc1.weight [3072,768] 105 | text_model.encoder.layers.4.mlp.fc2.bias [768] 106 | text_model.encoder.layers.4.mlp.fc2.weight [768,3072] 107 | text_model.encoder.layers.4.self_attn.k_proj.bias [768] 108 | text_model.encoder.layers.4.self_attn.k_proj.weight [768,768] 109 | text_model.encoder.layers.4.self_attn.out_proj.bias [768] 110 | text_model.encoder.layers.4.self_attn.out_proj.weight [768,768] 111 | text_model.encoder.layers.4.self_attn.q_proj.bias [768] 112 | text_model.encoder.layers.4.self_attn.q_proj.weight [768,768] 113 | text_model.encoder.layers.4.self_attn.v_proj.bias [768] 114 | text_model.encoder.layers.4.self_attn.v_proj.weight [768,768] 115 | text_model.encoder.layers.5.layer_norm1.bias [768] 116 | text_model.encoder.layers.5.layer_norm1.weight [768] 117 | text_model.encoder.layers.5.layer_norm2.bias [768] 118 | text_model.encoder.layers.5.layer_norm2.weight [768] 119 | text_model.encoder.layers.5.mlp.fc1.bias [3072] 120 | text_model.encoder.layers.5.mlp.fc1.weight [3072,768] 121 | text_model.encoder.layers.5.mlp.fc2.bias [768] 122 | text_model.encoder.layers.5.mlp.fc2.weight [768,3072] 123 | text_model.encoder.layers.5.self_attn.k_proj.bias [768] 124 | text_model.encoder.layers.5.self_attn.k_proj.weight [768,768] 125 | text_model.encoder.layers.5.self_attn.out_proj.bias [768] 126 | text_model.encoder.layers.5.self_attn.out_proj.weight [768,768] 127 | text_model.encoder.layers.5.self_attn.q_proj.bias [768] 128 | text_model.encoder.layers.5.self_attn.q_proj.weight [768,768] 129 | text_model.encoder.layers.5.self_attn.v_proj.bias [768] 130 | 
text_model.encoder.layers.5.self_attn.v_proj.weight [768,768] 131 | text_model.encoder.layers.6.layer_norm1.bias [768] 132 | text_model.encoder.layers.6.layer_norm1.weight [768] 133 | text_model.encoder.layers.6.layer_norm2.bias [768] 134 | text_model.encoder.layers.6.layer_norm2.weight [768] 135 | text_model.encoder.layers.6.mlp.fc1.bias [3072] 136 | text_model.encoder.layers.6.mlp.fc1.weight [3072,768] 137 | text_model.encoder.layers.6.mlp.fc2.bias [768] 138 | text_model.encoder.layers.6.mlp.fc2.weight [768,3072] 139 | text_model.encoder.layers.6.self_attn.k_proj.bias [768] 140 | text_model.encoder.layers.6.self_attn.k_proj.weight [768,768] 141 | text_model.encoder.layers.6.self_attn.out_proj.bias [768] 142 | text_model.encoder.layers.6.self_attn.out_proj.weight [768,768] 143 | text_model.encoder.layers.6.self_attn.q_proj.bias [768] 144 | text_model.encoder.layers.6.self_attn.q_proj.weight [768,768] 145 | text_model.encoder.layers.6.self_attn.v_proj.bias [768] 146 | text_model.encoder.layers.6.self_attn.v_proj.weight [768,768] 147 | text_model.encoder.layers.7.layer_norm1.bias [768] 148 | text_model.encoder.layers.7.layer_norm1.weight [768] 149 | text_model.encoder.layers.7.layer_norm2.bias [768] 150 | text_model.encoder.layers.7.layer_norm2.weight [768] 151 | text_model.encoder.layers.7.mlp.fc1.bias [3072] 152 | text_model.encoder.layers.7.mlp.fc1.weight [3072,768] 153 | text_model.encoder.layers.7.mlp.fc2.bias [768] 154 | text_model.encoder.layers.7.mlp.fc2.weight [768,3072] 155 | text_model.encoder.layers.7.self_attn.k_proj.bias [768] 156 | text_model.encoder.layers.7.self_attn.k_proj.weight [768,768] 157 | text_model.encoder.layers.7.self_attn.out_proj.bias [768] 158 | text_model.encoder.layers.7.self_attn.out_proj.weight [768,768] 159 | text_model.encoder.layers.7.self_attn.q_proj.bias [768] 160 | text_model.encoder.layers.7.self_attn.q_proj.weight [768,768] 161 | text_model.encoder.layers.7.self_attn.v_proj.bias [768] 162 | 
text_model.encoder.layers.7.self_attn.v_proj.weight [768,768] 163 | text_model.encoder.layers.8.layer_norm1.bias [768] 164 | text_model.encoder.layers.8.layer_norm1.weight [768] 165 | text_model.encoder.layers.8.layer_norm2.bias [768] 166 | text_model.encoder.layers.8.layer_norm2.weight [768] 167 | text_model.encoder.layers.8.mlp.fc1.bias [3072] 168 | text_model.encoder.layers.8.mlp.fc1.weight [3072,768] 169 | text_model.encoder.layers.8.mlp.fc2.bias [768] 170 | text_model.encoder.layers.8.mlp.fc2.weight [768,3072] 171 | text_model.encoder.layers.8.self_attn.k_proj.bias [768] 172 | text_model.encoder.layers.8.self_attn.k_proj.weight [768,768] 173 | text_model.encoder.layers.8.self_attn.out_proj.bias [768] 174 | text_model.encoder.layers.8.self_attn.out_proj.weight [768,768] 175 | text_model.encoder.layers.8.self_attn.q_proj.bias [768] 176 | text_model.encoder.layers.8.self_attn.q_proj.weight [768,768] 177 | text_model.encoder.layers.8.self_attn.v_proj.bias [768] 178 | text_model.encoder.layers.8.self_attn.v_proj.weight [768,768] 179 | text_model.encoder.layers.9.layer_norm1.bias [768] 180 | text_model.encoder.layers.9.layer_norm1.weight [768] 181 | text_model.encoder.layers.9.layer_norm2.bias [768] 182 | text_model.encoder.layers.9.layer_norm2.weight [768] 183 | text_model.encoder.layers.9.mlp.fc1.bias [3072] 184 | text_model.encoder.layers.9.mlp.fc1.weight [3072,768] 185 | text_model.encoder.layers.9.mlp.fc2.bias [768] 186 | text_model.encoder.layers.9.mlp.fc2.weight [768,3072] 187 | text_model.encoder.layers.9.self_attn.k_proj.bias [768] 188 | text_model.encoder.layers.9.self_attn.k_proj.weight [768,768] 189 | text_model.encoder.layers.9.self_attn.out_proj.bias [768] 190 | text_model.encoder.layers.9.self_attn.out_proj.weight [768,768] 191 | text_model.encoder.layers.9.self_attn.q_proj.bias [768] 192 | text_model.encoder.layers.9.self_attn.q_proj.weight [768,768] 193 | text_model.encoder.layers.9.self_attn.v_proj.bias [768] 194 | 
text_model.encoder.layers.9.self_attn.v_proj.weight [768,768] 195 | text_model.final_layer_norm.bias [768] 196 | text_model.final_layer_norm.weight [768] -------------------------------------------------------------------------------- /components/CLIP-v3-T5.txt: -------------------------------------------------------------------------------- 1 | encoder.block.0.layer.0.SelfAttention.k.weight [4096,4096] 2 | encoder.block.0.layer.0.SelfAttention.o.weight [4096,4096] 3 | encoder.block.0.layer.0.SelfAttention.q.weight [4096,4096] 4 | encoder.block.0.layer.0.SelfAttention.relative_attention_bias.weight [32,64] 5 | encoder.block.0.layer.0.SelfAttention.v.weight [4096,4096] 6 | encoder.block.0.layer.0.layer_norm.weight [4096] 7 | encoder.block.0.layer.1.DenseReluDense.wi_0.weight [10240,4096] 8 | encoder.block.0.layer.1.DenseReluDense.wi_1.weight [10240,4096] 9 | encoder.block.0.layer.1.DenseReluDense.wo.weight [4096,10240] 10 | encoder.block.0.layer.1.layer_norm.weight [4096] 11 | encoder.block.1.layer.0.SelfAttention.k.weight [4096,4096] 12 | encoder.block.1.layer.0.SelfAttention.o.weight [4096,4096] 13 | encoder.block.1.layer.0.SelfAttention.q.weight [4096,4096] 14 | encoder.block.1.layer.0.SelfAttention.v.weight [4096,4096] 15 | encoder.block.1.layer.0.layer_norm.weight [4096] 16 | encoder.block.1.layer.1.DenseReluDense.wi_0.weight [10240,4096] 17 | encoder.block.1.layer.1.DenseReluDense.wi_1.weight [10240,4096] 18 | encoder.block.1.layer.1.DenseReluDense.wo.weight [4096,10240] 19 | encoder.block.1.layer.1.layer_norm.weight [4096] 20 | encoder.block.10.layer.0.SelfAttention.k.weight [4096,4096] 21 | encoder.block.10.layer.0.SelfAttention.o.weight [4096,4096] 22 | encoder.block.10.layer.0.SelfAttention.q.weight [4096,4096] 23 | encoder.block.10.layer.0.SelfAttention.v.weight [4096,4096] 24 | encoder.block.10.layer.0.layer_norm.weight [4096] 25 | encoder.block.10.layer.1.DenseReluDense.wi_0.weight [10240,4096] 26 | encoder.block.10.layer.1.DenseReluDense.wi_1.weight 
[10240,4096] 27 | encoder.block.10.layer.1.DenseReluDense.wo.weight [4096,10240] 28 | encoder.block.10.layer.1.layer_norm.weight [4096] 29 | encoder.block.11.layer.0.SelfAttention.k.weight [4096,4096] 30 | encoder.block.11.layer.0.SelfAttention.o.weight [4096,4096] 31 | encoder.block.11.layer.0.SelfAttention.q.weight [4096,4096] 32 | encoder.block.11.layer.0.SelfAttention.v.weight [4096,4096] 33 | encoder.block.11.layer.0.layer_norm.weight [4096] 34 | encoder.block.11.layer.1.DenseReluDense.wi_0.weight [10240,4096] 35 | encoder.block.11.layer.1.DenseReluDense.wi_1.weight [10240,4096] 36 | encoder.block.11.layer.1.DenseReluDense.wo.weight [4096,10240] 37 | encoder.block.11.layer.1.layer_norm.weight [4096] 38 | encoder.block.12.layer.0.SelfAttention.k.weight [4096,4096] 39 | encoder.block.12.layer.0.SelfAttention.o.weight [4096,4096] 40 | encoder.block.12.layer.0.SelfAttention.q.weight [4096,4096] 41 | encoder.block.12.layer.0.SelfAttention.v.weight [4096,4096] 42 | encoder.block.12.layer.0.layer_norm.weight [4096] 43 | encoder.block.12.layer.1.DenseReluDense.wi_0.weight [10240,4096] 44 | encoder.block.12.layer.1.DenseReluDense.wi_1.weight [10240,4096] 45 | encoder.block.12.layer.1.DenseReluDense.wo.weight [4096,10240] 46 | encoder.block.12.layer.1.layer_norm.weight [4096] 47 | encoder.block.13.layer.0.SelfAttention.k.weight [4096,4096] 48 | encoder.block.13.layer.0.SelfAttention.o.weight [4096,4096] 49 | encoder.block.13.layer.0.SelfAttention.q.weight [4096,4096] 50 | encoder.block.13.layer.0.SelfAttention.v.weight [4096,4096] 51 | encoder.block.13.layer.0.layer_norm.weight [4096] 52 | encoder.block.13.layer.1.DenseReluDense.wi_0.weight [10240,4096] 53 | encoder.block.13.layer.1.DenseReluDense.wi_1.weight [10240,4096] 54 | encoder.block.13.layer.1.DenseReluDense.wo.weight [4096,10240] 55 | encoder.block.13.layer.1.layer_norm.weight [4096] 56 | encoder.block.14.layer.0.SelfAttention.k.weight [4096,4096] 57 | encoder.block.14.layer.0.SelfAttention.o.weight [4096,4096] 
58 | encoder.block.14.layer.0.SelfAttention.q.weight [4096,4096] 59 | encoder.block.14.layer.0.SelfAttention.v.weight [4096,4096] 60 | encoder.block.14.layer.0.layer_norm.weight [4096] 61 | encoder.block.14.layer.1.DenseReluDense.wi_0.weight [10240,4096] 62 | encoder.block.14.layer.1.DenseReluDense.wi_1.weight [10240,4096] 63 | encoder.block.14.layer.1.DenseReluDense.wo.weight [4096,10240] 64 | encoder.block.14.layer.1.layer_norm.weight [4096] 65 | encoder.block.15.layer.0.SelfAttention.k.weight [4096,4096] 66 | encoder.block.15.layer.0.SelfAttention.o.weight [4096,4096] 67 | encoder.block.15.layer.0.SelfAttention.q.weight [4096,4096] 68 | encoder.block.15.layer.0.SelfAttention.v.weight [4096,4096] 69 | encoder.block.15.layer.0.layer_norm.weight [4096] 70 | encoder.block.15.layer.1.DenseReluDense.wi_0.weight [10240,4096] 71 | encoder.block.15.layer.1.DenseReluDense.wi_1.weight [10240,4096] 72 | encoder.block.15.layer.1.DenseReluDense.wo.weight [4096,10240] 73 | encoder.block.15.layer.1.layer_norm.weight [4096] 74 | encoder.block.16.layer.0.SelfAttention.k.weight [4096,4096] 75 | encoder.block.16.layer.0.SelfAttention.o.weight [4096,4096] 76 | encoder.block.16.layer.0.SelfAttention.q.weight [4096,4096] 77 | encoder.block.16.layer.0.SelfAttention.v.weight [4096,4096] 78 | encoder.block.16.layer.0.layer_norm.weight [4096] 79 | encoder.block.16.layer.1.DenseReluDense.wi_0.weight [10240,4096] 80 | encoder.block.16.layer.1.DenseReluDense.wi_1.weight [10240,4096] 81 | encoder.block.16.layer.1.DenseReluDense.wo.weight [4096,10240] 82 | encoder.block.16.layer.1.layer_norm.weight [4096] 83 | encoder.block.17.layer.0.SelfAttention.k.weight [4096,4096] 84 | encoder.block.17.layer.0.SelfAttention.o.weight [4096,4096] 85 | encoder.block.17.layer.0.SelfAttention.q.weight [4096,4096] 86 | encoder.block.17.layer.0.SelfAttention.v.weight [4096,4096] 87 | encoder.block.17.layer.0.layer_norm.weight [4096] 88 | encoder.block.17.layer.1.DenseReluDense.wi_0.weight [10240,4096] 89 | 
encoder.block.17.layer.1.DenseReluDense.wi_1.weight [10240,4096] 90 | encoder.block.17.layer.1.DenseReluDense.wo.weight [4096,10240] 91 | encoder.block.17.layer.1.layer_norm.weight [4096] 92 | encoder.block.18.layer.0.SelfAttention.k.weight [4096,4096] 93 | encoder.block.18.layer.0.SelfAttention.o.weight [4096,4096] 94 | encoder.block.18.layer.0.SelfAttention.q.weight [4096,4096] 95 | encoder.block.18.layer.0.SelfAttention.v.weight [4096,4096] 96 | encoder.block.18.layer.0.layer_norm.weight [4096] 97 | encoder.block.18.layer.1.DenseReluDense.wi_0.weight [10240,4096] 98 | encoder.block.18.layer.1.DenseReluDense.wi_1.weight [10240,4096] 99 | encoder.block.18.layer.1.DenseReluDense.wo.weight [4096,10240] 100 | encoder.block.18.layer.1.layer_norm.weight [4096] 101 | encoder.block.19.layer.0.SelfAttention.k.weight [4096,4096] 102 | encoder.block.19.layer.0.SelfAttention.o.weight [4096,4096] 103 | encoder.block.19.layer.0.SelfAttention.q.weight [4096,4096] 104 | encoder.block.19.layer.0.SelfAttention.v.weight [4096,4096] 105 | encoder.block.19.layer.0.layer_norm.weight [4096] 106 | encoder.block.19.layer.1.DenseReluDense.wi_0.weight [10240,4096] 107 | encoder.block.19.layer.1.DenseReluDense.wi_1.weight [10240,4096] 108 | encoder.block.19.layer.1.DenseReluDense.wo.weight [4096,10240] 109 | encoder.block.19.layer.1.layer_norm.weight [4096] 110 | encoder.block.2.layer.0.SelfAttention.k.weight [4096,4096] 111 | encoder.block.2.layer.0.SelfAttention.o.weight [4096,4096] 112 | encoder.block.2.layer.0.SelfAttention.q.weight [4096,4096] 113 | encoder.block.2.layer.0.SelfAttention.v.weight [4096,4096] 114 | encoder.block.2.layer.0.layer_norm.weight [4096] 115 | encoder.block.2.layer.1.DenseReluDense.wi_0.weight [10240,4096] 116 | encoder.block.2.layer.1.DenseReluDense.wi_1.weight [10240,4096] 117 | encoder.block.2.layer.1.DenseReluDense.wo.weight [4096,10240] 118 | encoder.block.2.layer.1.layer_norm.weight [4096] 119 | encoder.block.20.layer.0.SelfAttention.k.weight [4096,4096] 
120 | encoder.block.20.layer.0.SelfAttention.o.weight [4096,4096] 121 | encoder.block.20.layer.0.SelfAttention.q.weight [4096,4096] 122 | encoder.block.20.layer.0.SelfAttention.v.weight [4096,4096] 123 | encoder.block.20.layer.0.layer_norm.weight [4096] 124 | encoder.block.20.layer.1.DenseReluDense.wi_0.weight [10240,4096] 125 | encoder.block.20.layer.1.DenseReluDense.wi_1.weight [10240,4096] 126 | encoder.block.20.layer.1.DenseReluDense.wo.weight [4096,10240] 127 | encoder.block.20.layer.1.layer_norm.weight [4096] 128 | encoder.block.21.layer.0.SelfAttention.k.weight [4096,4096] 129 | encoder.block.21.layer.0.SelfAttention.o.weight [4096,4096] 130 | encoder.block.21.layer.0.SelfAttention.q.weight [4096,4096] 131 | encoder.block.21.layer.0.SelfAttention.v.weight [4096,4096] 132 | encoder.block.21.layer.0.layer_norm.weight [4096] 133 | encoder.block.21.layer.1.DenseReluDense.wi_0.weight [10240,4096] 134 | encoder.block.21.layer.1.DenseReluDense.wi_1.weight [10240,4096] 135 | encoder.block.21.layer.1.DenseReluDense.wo.weight [4096,10240] 136 | encoder.block.21.layer.1.layer_norm.weight [4096] 137 | encoder.block.22.layer.0.SelfAttention.k.weight [4096,4096] 138 | encoder.block.22.layer.0.SelfAttention.o.weight [4096,4096] 139 | encoder.block.22.layer.0.SelfAttention.q.weight [4096,4096] 140 | encoder.block.22.layer.0.SelfAttention.v.weight [4096,4096] 141 | encoder.block.22.layer.0.layer_norm.weight [4096] 142 | encoder.block.22.layer.1.DenseReluDense.wi_0.weight [10240,4096] 143 | encoder.block.22.layer.1.DenseReluDense.wi_1.weight [10240,4096] 144 | encoder.block.22.layer.1.DenseReluDense.wo.weight [4096,10240] 145 | encoder.block.22.layer.1.layer_norm.weight [4096] 146 | encoder.block.23.layer.0.SelfAttention.k.weight [4096,4096] 147 | encoder.block.23.layer.0.SelfAttention.o.weight [4096,4096] 148 | encoder.block.23.layer.0.SelfAttention.q.weight [4096,4096] 149 | encoder.block.23.layer.0.SelfAttention.v.weight [4096,4096] 150 | 
encoder.block.23.layer.0.layer_norm.weight [4096] 151 | encoder.block.23.layer.1.DenseReluDense.wi_0.weight [10240,4096] 152 | encoder.block.23.layer.1.DenseReluDense.wi_1.weight [10240,4096] 153 | encoder.block.23.layer.1.DenseReluDense.wo.weight [4096,10240] 154 | encoder.block.23.layer.1.layer_norm.weight [4096] 155 | encoder.block.3.layer.0.SelfAttention.k.weight [4096,4096] 156 | encoder.block.3.layer.0.SelfAttention.o.weight [4096,4096] 157 | encoder.block.3.layer.0.SelfAttention.q.weight [4096,4096] 158 | encoder.block.3.layer.0.SelfAttention.v.weight [4096,4096] 159 | encoder.block.3.layer.0.layer_norm.weight [4096] 160 | encoder.block.3.layer.1.DenseReluDense.wi_0.weight [10240,4096] 161 | encoder.block.3.layer.1.DenseReluDense.wi_1.weight [10240,4096] 162 | encoder.block.3.layer.1.DenseReluDense.wo.weight [4096,10240] 163 | encoder.block.3.layer.1.layer_norm.weight [4096] 164 | encoder.block.4.layer.0.SelfAttention.k.weight [4096,4096] 165 | encoder.block.4.layer.0.SelfAttention.o.weight [4096,4096] 166 | encoder.block.4.layer.0.SelfAttention.q.weight [4096,4096] 167 | encoder.block.4.layer.0.SelfAttention.v.weight [4096,4096] 168 | encoder.block.4.layer.0.layer_norm.weight [4096] 169 | encoder.block.4.layer.1.DenseReluDense.wi_0.weight [10240,4096] 170 | encoder.block.4.layer.1.DenseReluDense.wi_1.weight [10240,4096] 171 | encoder.block.4.layer.1.DenseReluDense.wo.weight [4096,10240] 172 | encoder.block.4.layer.1.layer_norm.weight [4096] 173 | encoder.block.5.layer.0.SelfAttention.k.weight [4096,4096] 174 | encoder.block.5.layer.0.SelfAttention.o.weight [4096,4096] 175 | encoder.block.5.layer.0.SelfAttention.q.weight [4096,4096] 176 | encoder.block.5.layer.0.SelfAttention.v.weight [4096,4096] 177 | encoder.block.5.layer.0.layer_norm.weight [4096] 178 | encoder.block.5.layer.1.DenseReluDense.wi_0.weight [10240,4096] 179 | encoder.block.5.layer.1.DenseReluDense.wi_1.weight [10240,4096] 180 | encoder.block.5.layer.1.DenseReluDense.wo.weight [4096,10240] 181 
| encoder.block.5.layer.1.layer_norm.weight [4096] 182 | encoder.block.6.layer.0.SelfAttention.k.weight [4096,4096] 183 | encoder.block.6.layer.0.SelfAttention.o.weight [4096,4096] 184 | encoder.block.6.layer.0.SelfAttention.q.weight [4096,4096] 185 | encoder.block.6.layer.0.SelfAttention.v.weight [4096,4096] 186 | encoder.block.6.layer.0.layer_norm.weight [4096] 187 | encoder.block.6.layer.1.DenseReluDense.wi_0.weight [10240,4096] 188 | encoder.block.6.layer.1.DenseReluDense.wi_1.weight [10240,4096] 189 | encoder.block.6.layer.1.DenseReluDense.wo.weight [4096,10240] 190 | encoder.block.6.layer.1.layer_norm.weight [4096] 191 | encoder.block.7.layer.0.SelfAttention.k.weight [4096,4096] 192 | encoder.block.7.layer.0.SelfAttention.o.weight [4096,4096] 193 | encoder.block.7.layer.0.SelfAttention.q.weight [4096,4096] 194 | encoder.block.7.layer.0.SelfAttention.v.weight [4096,4096] 195 | encoder.block.7.layer.0.layer_norm.weight [4096] 196 | encoder.block.7.layer.1.DenseReluDense.wi_0.weight [10240,4096] 197 | encoder.block.7.layer.1.DenseReluDense.wi_1.weight [10240,4096] 198 | encoder.block.7.layer.1.DenseReluDense.wo.weight [4096,10240] 199 | encoder.block.7.layer.1.layer_norm.weight [4096] 200 | encoder.block.8.layer.0.SelfAttention.k.weight [4096,4096] 201 | encoder.block.8.layer.0.SelfAttention.o.weight [4096,4096] 202 | encoder.block.8.layer.0.SelfAttention.q.weight [4096,4096] 203 | encoder.block.8.layer.0.SelfAttention.v.weight [4096,4096] 204 | encoder.block.8.layer.0.layer_norm.weight [4096] 205 | encoder.block.8.layer.1.DenseReluDense.wi_0.weight [10240,4096] 206 | encoder.block.8.layer.1.DenseReluDense.wi_1.weight [10240,4096] 207 | encoder.block.8.layer.1.DenseReluDense.wo.weight [4096,10240] 208 | encoder.block.8.layer.1.layer_norm.weight [4096] 209 | encoder.block.9.layer.0.SelfAttention.k.weight [4096,4096] 210 | encoder.block.9.layer.0.SelfAttention.o.weight [4096,4096] 211 | encoder.block.9.layer.0.SelfAttention.q.weight [4096,4096] 212 | 
encoder.block.9.layer.0.SelfAttention.v.weight [4096,4096] 213 | encoder.block.9.layer.0.layer_norm.weight [4096] 214 | encoder.block.9.layer.1.DenseReluDense.wi_0.weight [10240,4096] 215 | encoder.block.9.layer.1.DenseReluDense.wi_1.weight [10240,4096] 216 | encoder.block.9.layer.1.DenseReluDense.wo.weight [4096,10240] 217 | encoder.block.9.layer.1.layer_norm.weight [4096] 218 | encoder.embed_tokens.weight [32128,4096] 219 | encoder.final_layer_norm.weight [4096] 220 | shared.weight [32128,4096] -------------------------------------------------------------------------------- /components/LoRA-XL-UNET-D.txt: -------------------------------------------------------------------------------- 1 | lora_unet_input_blocks_1_0_emb_layers_1.alpha [] 2 | lora_unet_input_blocks_1_0_emb_layers_1.lora_down.weight [-1,-1] 3 | lora_unet_input_blocks_1_0_emb_layers_1.lora_up.weight [-1,-1] 4 | lora_unet_input_blocks_1_0_in_layers_2.alpha [] 5 | lora_unet_input_blocks_1_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 6 | lora_unet_input_blocks_1_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 7 | lora_unet_input_blocks_1_0_out_layers_3.alpha [] 8 | lora_unet_input_blocks_1_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 9 | lora_unet_input_blocks_1_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 10 | lora_unet_input_blocks_2_0_emb_layers_1.alpha [] 11 | lora_unet_input_blocks_2_0_emb_layers_1.lora_down.weight [-1,-1] 12 | lora_unet_input_blocks_2_0_emb_layers_1.lora_up.weight [-1,-1] 13 | lora_unet_input_blocks_2_0_in_layers_2.alpha [] 14 | lora_unet_input_blocks_2_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 15 | lora_unet_input_blocks_2_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 16 | lora_unet_input_blocks_2_0_out_layers_3.alpha [] 17 | lora_unet_input_blocks_2_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 18 | lora_unet_input_blocks_2_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 19 | lora_unet_input_blocks_3_0_op.alpha [] 20 | lora_unet_input_blocks_3_0_op.lora_down.weight [-1,-1,-1,-1] 21 | 
lora_unet_input_blocks_3_0_op.lora_up.weight [-1,-1,-1,-1] 22 | lora_unet_input_blocks_4_0_emb_layers_1.alpha [] 23 | lora_unet_input_blocks_4_0_emb_layers_1.lora_down.weight [-1,-1] 24 | lora_unet_input_blocks_4_0_emb_layers_1.lora_up.weight [-1,-1] 25 | lora_unet_input_blocks_4_0_in_layers_2.alpha [] 26 | lora_unet_input_blocks_4_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 27 | lora_unet_input_blocks_4_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 28 | lora_unet_input_blocks_4_0_out_layers_3.alpha [] 29 | lora_unet_input_blocks_4_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 30 | lora_unet_input_blocks_4_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 31 | lora_unet_input_blocks_4_0_skip_connection.alpha [] 32 | lora_unet_input_blocks_4_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 33 | lora_unet_input_blocks_4_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 34 | lora_unet_input_blocks_5_0_emb_layers_1.alpha [] 35 | lora_unet_input_blocks_5_0_emb_layers_1.lora_down.weight [-1,-1] 36 | lora_unet_input_blocks_5_0_emb_layers_1.lora_up.weight [-1,-1] 37 | lora_unet_input_blocks_5_0_in_layers_2.alpha [] 38 | lora_unet_input_blocks_5_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 39 | lora_unet_input_blocks_5_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 40 | lora_unet_input_blocks_5_0_out_layers_3.alpha [] 41 | lora_unet_input_blocks_5_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 42 | lora_unet_input_blocks_5_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 43 | lora_unet_input_blocks_6_0_op.alpha [] 44 | lora_unet_input_blocks_6_0_op.lora_down.weight [-1,-1,-1,-1] 45 | lora_unet_input_blocks_6_0_op.lora_up.weight [-1,-1,-1,-1] 46 | lora_unet_input_blocks_7_0_emb_layers_1.alpha [] 47 | lora_unet_input_blocks_7_0_emb_layers_1.lora_down.weight [-1,-1] 48 | lora_unet_input_blocks_7_0_emb_layers_1.lora_up.weight [-1,-1] 49 | lora_unet_input_blocks_7_0_in_layers_2.alpha [] 50 | lora_unet_input_blocks_7_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 51 | 
lora_unet_input_blocks_7_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 52 | lora_unet_input_blocks_7_0_out_layers_3.alpha [] 53 | lora_unet_input_blocks_7_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 54 | lora_unet_input_blocks_7_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 55 | lora_unet_input_blocks_7_0_skip_connection.alpha [] 56 | lora_unet_input_blocks_7_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 57 | lora_unet_input_blocks_7_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 58 | lora_unet_input_blocks_8_0_emb_layers_1.alpha [] 59 | lora_unet_input_blocks_8_0_emb_layers_1.lora_down.weight [-1,-1] 60 | lora_unet_input_blocks_8_0_emb_layers_1.lora_up.weight [-1,-1] 61 | lora_unet_input_blocks_8_0_in_layers_2.alpha [] 62 | lora_unet_input_blocks_8_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 63 | lora_unet_input_blocks_8_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 64 | lora_unet_input_blocks_8_0_out_layers_3.alpha [] 65 | lora_unet_input_blocks_8_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 66 | lora_unet_input_blocks_8_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 67 | lora_unet_middle_block_0_emb_layers_1.alpha [] 68 | lora_unet_middle_block_0_emb_layers_1.lora_down.weight [-1,-1] 69 | lora_unet_middle_block_0_emb_layers_1.lora_up.weight [-1,-1] 70 | lora_unet_middle_block_0_in_layers_2.alpha [] 71 | lora_unet_middle_block_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 72 | lora_unet_middle_block_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 73 | lora_unet_middle_block_0_out_layers_3.alpha [] 74 | lora_unet_middle_block_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 75 | lora_unet_middle_block_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 76 | lora_unet_middle_block_2_emb_layers_1.alpha [] 77 | lora_unet_middle_block_2_emb_layers_1.lora_down.weight [-1,-1] 78 | lora_unet_middle_block_2_emb_layers_1.lora_up.weight [-1,-1] 79 | lora_unet_middle_block_2_in_layers_2.alpha [] 80 | lora_unet_middle_block_2_in_layers_2.lora_down.weight [-1,-1,-1,-1] 81 | 
lora_unet_middle_block_2_in_layers_2.lora_up.weight [-1,-1,-1,-1] 82 | lora_unet_middle_block_2_out_layers_3.alpha [] 83 | lora_unet_middle_block_2_out_layers_3.lora_down.weight [-1,-1,-1,-1] 84 | lora_unet_middle_block_2_out_layers_3.lora_up.weight [-1,-1,-1,-1] 85 | lora_unet_output_blocks_0_0_emb_layers_1.alpha [] 86 | lora_unet_output_blocks_0_0_emb_layers_1.lora_down.weight [-1,-1] 87 | lora_unet_output_blocks_0_0_emb_layers_1.lora_up.weight [-1,-1] 88 | lora_unet_output_blocks_0_0_in_layers_2.alpha [] 89 | lora_unet_output_blocks_0_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 90 | lora_unet_output_blocks_0_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 91 | lora_unet_output_blocks_0_0_out_layers_3.alpha [] 92 | lora_unet_output_blocks_0_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 93 | lora_unet_output_blocks_0_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 94 | lora_unet_output_blocks_0_0_skip_connection.alpha [] 95 | lora_unet_output_blocks_0_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 96 | lora_unet_output_blocks_0_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 97 | lora_unet_output_blocks_1_0_emb_layers_1.alpha [] 98 | lora_unet_output_blocks_1_0_emb_layers_1.lora_down.weight [-1,-1] 99 | lora_unet_output_blocks_1_0_emb_layers_1.lora_up.weight [-1,-1] 100 | lora_unet_output_blocks_1_0_in_layers_2.alpha [] 101 | lora_unet_output_blocks_1_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 102 | lora_unet_output_blocks_1_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 103 | lora_unet_output_blocks_1_0_out_layers_3.alpha [] 104 | lora_unet_output_blocks_1_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 105 | lora_unet_output_blocks_1_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 106 | lora_unet_output_blocks_1_0_skip_connection.alpha [] 107 | lora_unet_output_blocks_1_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 108 | lora_unet_output_blocks_1_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 109 | lora_unet_output_blocks_2_0_emb_layers_1.alpha [] 110 | 
lora_unet_output_blocks_2_0_emb_layers_1.lora_down.weight [-1,-1] 111 | lora_unet_output_blocks_2_0_emb_layers_1.lora_up.weight [-1,-1] 112 | lora_unet_output_blocks_2_0_in_layers_2.alpha [] 113 | lora_unet_output_blocks_2_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 114 | lora_unet_output_blocks_2_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 115 | lora_unet_output_blocks_2_0_out_layers_3.alpha [] 116 | lora_unet_output_blocks_2_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 117 | lora_unet_output_blocks_2_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 118 | lora_unet_output_blocks_2_0_skip_connection.alpha [] 119 | lora_unet_output_blocks_2_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 120 | lora_unet_output_blocks_2_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 121 | lora_unet_output_blocks_2_2_conv.alpha [] 122 | lora_unet_output_blocks_2_2_conv.lora_down.weight [-1,-1,-1,-1] 123 | lora_unet_output_blocks_2_2_conv.lora_up.weight [-1,-1,-1,-1] 124 | lora_unet_output_blocks_3_0_emb_layers_1.alpha [] 125 | lora_unet_output_blocks_3_0_emb_layers_1.lora_down.weight [-1,-1] 126 | lora_unet_output_blocks_3_0_emb_layers_1.lora_up.weight [-1,-1] 127 | lora_unet_output_blocks_3_0_in_layers_2.alpha [] 128 | lora_unet_output_blocks_3_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 129 | lora_unet_output_blocks_3_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 130 | lora_unet_output_blocks_3_0_out_layers_3.alpha [] 131 | lora_unet_output_blocks_3_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 132 | lora_unet_output_blocks_3_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 133 | lora_unet_output_blocks_3_0_skip_connection.alpha [] 134 | lora_unet_output_blocks_3_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 135 | lora_unet_output_blocks_3_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 136 | lora_unet_output_blocks_4_0_emb_layers_1.alpha [] 137 | lora_unet_output_blocks_4_0_emb_layers_1.lora_down.weight [-1,-1] 138 | lora_unet_output_blocks_4_0_emb_layers_1.lora_up.weight [-1,-1] 139 | 
lora_unet_output_blocks_4_0_in_layers_2.alpha [] 140 | lora_unet_output_blocks_4_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 141 | lora_unet_output_blocks_4_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 142 | lora_unet_output_blocks_4_0_out_layers_3.alpha [] 143 | lora_unet_output_blocks_4_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 144 | lora_unet_output_blocks_4_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 145 | lora_unet_output_blocks_4_0_skip_connection.alpha [] 146 | lora_unet_output_blocks_4_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 147 | lora_unet_output_blocks_4_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 148 | lora_unet_output_blocks_5_0_emb_layers_1.alpha [] 149 | lora_unet_output_blocks_5_0_emb_layers_1.lora_down.weight [-1,-1] 150 | lora_unet_output_blocks_5_0_emb_layers_1.lora_up.weight [-1,-1] 151 | lora_unet_output_blocks_5_0_in_layers_2.alpha [] 152 | lora_unet_output_blocks_5_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 153 | lora_unet_output_blocks_5_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 154 | lora_unet_output_blocks_5_0_out_layers_3.alpha [] 155 | lora_unet_output_blocks_5_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 156 | lora_unet_output_blocks_5_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 157 | lora_unet_output_blocks_5_0_skip_connection.alpha [] 158 | lora_unet_output_blocks_5_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 159 | lora_unet_output_blocks_5_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 160 | lora_unet_output_blocks_5_2_conv.alpha [] 161 | lora_unet_output_blocks_5_2_conv.lora_down.weight [-1,-1,-1,-1] 162 | lora_unet_output_blocks_5_2_conv.lora_up.weight [-1,-1,-1,-1] 163 | lora_unet_output_blocks_6_0_emb_layers_1.alpha [] 164 | lora_unet_output_blocks_6_0_emb_layers_1.lora_down.weight [-1,-1] 165 | lora_unet_output_blocks_6_0_emb_layers_1.lora_up.weight [-1,-1] 166 | lora_unet_output_blocks_6_0_in_layers_2.alpha [] 167 | lora_unet_output_blocks_6_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 168 | 
lora_unet_output_blocks_6_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 169 | lora_unet_output_blocks_6_0_out_layers_3.alpha [] 170 | lora_unet_output_blocks_6_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 171 | lora_unet_output_blocks_6_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 172 | lora_unet_output_blocks_6_0_skip_connection.alpha [] 173 | lora_unet_output_blocks_6_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 174 | lora_unet_output_blocks_6_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 175 | lora_unet_output_blocks_7_0_emb_layers_1.alpha [] 176 | lora_unet_output_blocks_7_0_emb_layers_1.lora_down.weight [-1,-1] 177 | lora_unet_output_blocks_7_0_emb_layers_1.lora_up.weight [-1,-1] 178 | lora_unet_output_blocks_7_0_in_layers_2.alpha [] 179 | lora_unet_output_blocks_7_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 180 | lora_unet_output_blocks_7_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 181 | lora_unet_output_blocks_7_0_out_layers_3.alpha [] 182 | lora_unet_output_blocks_7_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 183 | lora_unet_output_blocks_7_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 184 | lora_unet_output_blocks_7_0_skip_connection.alpha [] 185 | lora_unet_output_blocks_7_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 186 | lora_unet_output_blocks_7_0_skip_connection.lora_up.weight [-1,-1,-1,-1] 187 | lora_unet_output_blocks_8_0_emb_layers_1.alpha [] 188 | lora_unet_output_blocks_8_0_emb_layers_1.lora_down.weight [-1,-1] 189 | lora_unet_output_blocks_8_0_emb_layers_1.lora_up.weight [-1,-1] 190 | lora_unet_output_blocks_8_0_in_layers_2.alpha [] 191 | lora_unet_output_blocks_8_0_in_layers_2.lora_down.weight [-1,-1,-1,-1] 192 | lora_unet_output_blocks_8_0_in_layers_2.lora_up.weight [-1,-1,-1,-1] 193 | lora_unet_output_blocks_8_0_out_layers_3.alpha [] 194 | lora_unet_output_blocks_8_0_out_layers_3.lora_down.weight [-1,-1,-1,-1] 195 | lora_unet_output_blocks_8_0_out_layers_3.lora_up.weight [-1,-1,-1,-1] 196 | 
lora_unet_output_blocks_8_0_skip_connection.alpha [] 197 | lora_unet_output_blocks_8_0_skip_connection.lora_down.weight [-1,-1,-1,-1] 198 | lora_unet_output_blocks_8_0_skip_connection.lora_up.weight [-1,-1,-1,-1] -------------------------------------------------------------------------------- /components/LoRA-v1-CLIP.txt: -------------------------------------------------------------------------------- 1 | lora_te_text_model_encoder_layers_0_mlp_fc1.lora_down.weight [-1,768] 2 | lora_te_text_model_encoder_layers_0_mlp_fc1.lora_up.weight [3072,-1] 3 | lora_te_text_model_encoder_layers_0_mlp_fc2.lora_down.weight [-1,3072] 4 | lora_te_text_model_encoder_layers_0_mlp_fc2.lora_up.weight [768,-1] 5 | lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight [-1,768] 6 | lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight [768,-1] 7 | lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight [-1,768] 8 | lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight [768,-1] 9 | lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight [-1,768] 10 | lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight [768,-1] 11 | lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight [-1,768] 12 | lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight [768,-1] 13 | lora_te_text_model_encoder_layers_10_mlp_fc1.lora_down.weight [-1,768] 14 | lora_te_text_model_encoder_layers_10_mlp_fc1.lora_up.weight [3072,-1] 15 | lora_te_text_model_encoder_layers_10_mlp_fc2.lora_down.weight [-1,3072] 16 | lora_te_text_model_encoder_layers_10_mlp_fc2.lora_up.weight [768,-1] 17 | lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight [-1,768] 18 | lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight [768,-1] 19 | lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight [-1,768] 20 | lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight 
[768,-1] 21 | lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight [-1,768] 22 | lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight [768,-1] 23 | lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight [-1,768] 24 | lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight [768,-1] 25 | lora_te_text_model_encoder_layers_11_mlp_fc1.lora_down.weight [-1,768] 26 | lora_te_text_model_encoder_layers_11_mlp_fc1.lora_up.weight [3072,-1] 27 | lora_te_text_model_encoder_layers_11_mlp_fc2.lora_down.weight [-1,3072] 28 | lora_te_text_model_encoder_layers_11_mlp_fc2.lora_up.weight [768,-1] 29 | lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight [-1,768] 30 | lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight [768,-1] 31 | lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight [-1,768] 32 | lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight [768,-1] 33 | lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight [-1,768] 34 | lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight [768,-1] 35 | lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight [-1,768] 36 | lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight [768,-1] 37 | lora_te_text_model_encoder_layers_1_mlp_fc1.lora_down.weight [-1,768] 38 | lora_te_text_model_encoder_layers_1_mlp_fc1.lora_up.weight [3072,-1] 39 | lora_te_text_model_encoder_layers_1_mlp_fc2.lora_down.weight [-1,3072] 40 | lora_te_text_model_encoder_layers_1_mlp_fc2.lora_up.weight [768,-1] 41 | lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight [-1,768] 42 | lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight [768,-1] 43 | lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight [-1,768] 44 | lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight [768,-1] 45 | 
lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight [-1,768] 46 | lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight [768,-1] 47 | lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight [-1,768] 48 | lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight [768,-1] 49 | lora_te_text_model_encoder_layers_2_mlp_fc1.lora_down.weight [-1,768] 50 | lora_te_text_model_encoder_layers_2_mlp_fc1.lora_up.weight [3072,-1] 51 | lora_te_text_model_encoder_layers_2_mlp_fc2.lora_down.weight [-1,3072] 52 | lora_te_text_model_encoder_layers_2_mlp_fc2.lora_up.weight [768,-1] 53 | lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight [-1,768] 54 | lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight [768,-1] 55 | lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight [-1,768] 56 | lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight [768,-1] 57 | lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight [-1,768] 58 | lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight [768,-1] 59 | lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight [-1,768] 60 | lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight [768,-1] 61 | lora_te_text_model_encoder_layers_3_mlp_fc1.lora_down.weight [-1,768] 62 | lora_te_text_model_encoder_layers_3_mlp_fc1.lora_up.weight [3072,-1] 63 | lora_te_text_model_encoder_layers_3_mlp_fc2.lora_down.weight [-1,3072] 64 | lora_te_text_model_encoder_layers_3_mlp_fc2.lora_up.weight [768,-1] 65 | lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight [-1,768] 66 | lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight [768,-1] 67 | lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight [-1,768] 68 | lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight [768,-1] 69 | 
lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight [-1,768] 70 | lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight [768,-1] 71 | lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_down.weight [-1,768] 72 | lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_up.weight [768,-1] 73 | lora_te_text_model_encoder_layers_4_mlp_fc1.lora_down.weight [-1,768] 74 | lora_te_text_model_encoder_layers_4_mlp_fc1.lora_up.weight [3072,-1] 75 | lora_te_text_model_encoder_layers_4_mlp_fc2.lora_down.weight [-1,3072] 76 | lora_te_text_model_encoder_layers_4_mlp_fc2.lora_up.weight [768,-1] 77 | lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_down.weight [-1,768] 78 | lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_up.weight [768,-1] 79 | lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_down.weight [-1,768] 80 | lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_up.weight [768,-1] 81 | lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_down.weight [-1,768] 82 | lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_up.weight [768,-1] 83 | lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_down.weight [-1,768] 84 | lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_up.weight [768,-1] 85 | lora_te_text_model_encoder_layers_5_mlp_fc1.lora_down.weight [-1,768] 86 | lora_te_text_model_encoder_layers_5_mlp_fc1.lora_up.weight [3072,-1] 87 | lora_te_text_model_encoder_layers_5_mlp_fc2.lora_down.weight [-1,3072] 88 | lora_te_text_model_encoder_layers_5_mlp_fc2.lora_up.weight [768,-1] 89 | lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_down.weight [-1,768] 90 | lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_up.weight [768,-1] 91 | lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_down.weight [-1,768] 92 | lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_up.weight [768,-1] 93 | 
lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_down.weight [-1,768] 94 | lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight [768,-1] 95 | lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_down.weight [-1,768] 96 | lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_up.weight [768,-1] 97 | lora_te_text_model_encoder_layers_6_mlp_fc1.lora_down.weight [-1,768] 98 | lora_te_text_model_encoder_layers_6_mlp_fc1.lora_up.weight [3072,-1] 99 | lora_te_text_model_encoder_layers_6_mlp_fc2.lora_down.weight [-1,3072] 100 | lora_te_text_model_encoder_layers_6_mlp_fc2.lora_up.weight [768,-1] 101 | lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_down.weight [-1,768] 102 | lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_up.weight [768,-1] 103 | lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_down.weight [-1,768] 104 | lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_up.weight [768,-1] 105 | lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_down.weight [-1,768] 106 | lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_up.weight [768,-1] 107 | lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_down.weight [-1,768] 108 | lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_up.weight [768,-1] 109 | lora_te_text_model_encoder_layers_7_mlp_fc1.lora_down.weight [-1,768] 110 | lora_te_text_model_encoder_layers_7_mlp_fc1.lora_up.weight [3072,-1] 111 | lora_te_text_model_encoder_layers_7_mlp_fc2.lora_down.weight [-1,3072] 112 | lora_te_text_model_encoder_layers_7_mlp_fc2.lora_up.weight [768,-1] 113 | lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_down.weight [-1,768] 114 | lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_up.weight [768,-1] 115 | lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_down.weight [-1,768] 116 | lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_up.weight [768,-1] 117 | 
lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_down.weight [-1,768] 118 | lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_up.weight [768,-1] 119 | lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_down.weight [-1,768] 120 | lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_up.weight [768,-1] 121 | lora_te_text_model_encoder_layers_8_mlp_fc1.lora_down.weight [-1,768] 122 | lora_te_text_model_encoder_layers_8_mlp_fc1.lora_up.weight [3072,-1] 123 | lora_te_text_model_encoder_layers_8_mlp_fc2.lora_down.weight [-1,3072] 124 | lora_te_text_model_encoder_layers_8_mlp_fc2.lora_up.weight [768,-1] 125 | lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight [-1,768] 126 | lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight [768,-1] 127 | lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight [-1,768] 128 | lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight [768,-1] 129 | lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight [-1,768] 130 | lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight [768,-1] 131 | lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight [-1,768] 132 | lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight [768,-1] 133 | lora_te_text_model_encoder_layers_9_mlp_fc1.lora_down.weight [-1,768] 134 | lora_te_text_model_encoder_layers_9_mlp_fc1.lora_up.weight [3072,-1] 135 | lora_te_text_model_encoder_layers_9_mlp_fc2.lora_down.weight [-1,3072] 136 | lora_te_text_model_encoder_layers_9_mlp_fc2.lora_up.weight [768,-1] 137 | lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight [-1,768] 138 | lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight [768,-1] 139 | lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight [-1,768] 140 | lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight [768,-1] 141 | 
lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight [-1,768] 142 | lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight [768,-1] 143 | lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight [-1,768] 144 | lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight [768,-1] -------------------------------------------------------------------------------- /components/Prefixed-v3-VAE.txt: -------------------------------------------------------------------------------- 1 | first_stage_model.decoder.conv_in.bias [512] 2 | first_stage_model.decoder.conv_in.weight [512,16,3,3] 3 | first_stage_model.decoder.conv_out.bias [3] 4 | first_stage_model.decoder.conv_out.weight [3,128,3,3] 5 | first_stage_model.decoder.mid.attn_1.k.bias [512] 6 | first_stage_model.decoder.mid.attn_1.k.weight [512,512,1,1] 7 | first_stage_model.decoder.mid.attn_1.norm.bias [512] 8 | first_stage_model.decoder.mid.attn_1.norm.weight [512] 9 | first_stage_model.decoder.mid.attn_1.proj_out.bias [512] 10 | first_stage_model.decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | first_stage_model.decoder.mid.attn_1.q.bias [512] 12 | first_stage_model.decoder.mid.attn_1.q.weight [512,512,1,1] 13 | first_stage_model.decoder.mid.attn_1.v.bias [512] 14 | first_stage_model.decoder.mid.attn_1.v.weight [512,512,1,1] 15 | first_stage_model.decoder.mid.block_1.conv1.bias [512] 16 | first_stage_model.decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | first_stage_model.decoder.mid.block_1.conv2.bias [512] 18 | first_stage_model.decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | first_stage_model.decoder.mid.block_1.norm1.bias [512] 20 | first_stage_model.decoder.mid.block_1.norm1.weight [512] 21 | first_stage_model.decoder.mid.block_1.norm2.bias [512] 22 | first_stage_model.decoder.mid.block_1.norm2.weight [512] 23 | first_stage_model.decoder.mid.block_2.conv1.bias [512] 24 | first_stage_model.decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | 
first_stage_model.decoder.mid.block_2.conv2.bias [512] 26 | first_stage_model.decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | first_stage_model.decoder.mid.block_2.norm1.bias [512] 28 | first_stage_model.decoder.mid.block_2.norm1.weight [512] 29 | first_stage_model.decoder.mid.block_2.norm2.bias [512] 30 | first_stage_model.decoder.mid.block_2.norm2.weight [512] 31 | first_stage_model.decoder.norm_out.bias [128] 32 | first_stage_model.decoder.norm_out.weight [128] 33 | first_stage_model.decoder.up.0.block.0.conv1.bias [128] 34 | first_stage_model.decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | first_stage_model.decoder.up.0.block.0.conv2.bias [128] 36 | first_stage_model.decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | first_stage_model.decoder.up.0.block.0.nin_shortcut.bias [128] 38 | first_stage_model.decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | first_stage_model.decoder.up.0.block.0.norm1.bias [256] 40 | first_stage_model.decoder.up.0.block.0.norm1.weight [256] 41 | first_stage_model.decoder.up.0.block.0.norm2.bias [128] 42 | first_stage_model.decoder.up.0.block.0.norm2.weight [128] 43 | first_stage_model.decoder.up.0.block.1.conv1.bias [128] 44 | first_stage_model.decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | first_stage_model.decoder.up.0.block.1.conv2.bias [128] 46 | first_stage_model.decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | first_stage_model.decoder.up.0.block.1.norm1.bias [128] 48 | first_stage_model.decoder.up.0.block.1.norm1.weight [128] 49 | first_stage_model.decoder.up.0.block.1.norm2.bias [128] 50 | first_stage_model.decoder.up.0.block.1.norm2.weight [128] 51 | first_stage_model.decoder.up.0.block.2.conv1.bias [128] 52 | first_stage_model.decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | first_stage_model.decoder.up.0.block.2.conv2.bias [128] 54 | first_stage_model.decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | first_stage_model.decoder.up.0.block.2.norm1.bias [128] 56 | 
first_stage_model.decoder.up.0.block.2.norm1.weight [128] 57 | first_stage_model.decoder.up.0.block.2.norm2.bias [128] 58 | first_stage_model.decoder.up.0.block.2.norm2.weight [128] 59 | first_stage_model.decoder.up.1.block.0.conv1.bias [256] 60 | first_stage_model.decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | first_stage_model.decoder.up.1.block.0.conv2.bias [256] 62 | first_stage_model.decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | first_stage_model.decoder.up.1.block.0.nin_shortcut.bias [256] 64 | first_stage_model.decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | first_stage_model.decoder.up.1.block.0.norm1.bias [512] 66 | first_stage_model.decoder.up.1.block.0.norm1.weight [512] 67 | first_stage_model.decoder.up.1.block.0.norm2.bias [256] 68 | first_stage_model.decoder.up.1.block.0.norm2.weight [256] 69 | first_stage_model.decoder.up.1.block.1.conv1.bias [256] 70 | first_stage_model.decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | first_stage_model.decoder.up.1.block.1.conv2.bias [256] 72 | first_stage_model.decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | first_stage_model.decoder.up.1.block.1.norm1.bias [256] 74 | first_stage_model.decoder.up.1.block.1.norm1.weight [256] 75 | first_stage_model.decoder.up.1.block.1.norm2.bias [256] 76 | first_stage_model.decoder.up.1.block.1.norm2.weight [256] 77 | first_stage_model.decoder.up.1.block.2.conv1.bias [256] 78 | first_stage_model.decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | first_stage_model.decoder.up.1.block.2.conv2.bias [256] 80 | first_stage_model.decoder.up.1.block.2.conv2.weight [256,256,3,3] 81 | first_stage_model.decoder.up.1.block.2.norm1.bias [256] 82 | first_stage_model.decoder.up.1.block.2.norm1.weight [256] 83 | first_stage_model.decoder.up.1.block.2.norm2.bias [256] 84 | first_stage_model.decoder.up.1.block.2.norm2.weight [256] 85 | first_stage_model.decoder.up.1.upsample.conv.bias [256] 86 | first_stage_model.decoder.up.1.upsample.conv.weight [256,256,3,3] 87 
| first_stage_model.decoder.up.2.block.0.conv1.bias [512] 88 | first_stage_model.decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | first_stage_model.decoder.up.2.block.0.conv2.bias [512] 90 | first_stage_model.decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | first_stage_model.decoder.up.2.block.0.norm1.bias [512] 92 | first_stage_model.decoder.up.2.block.0.norm1.weight [512] 93 | first_stage_model.decoder.up.2.block.0.norm2.bias [512] 94 | first_stage_model.decoder.up.2.block.0.norm2.weight [512] 95 | first_stage_model.decoder.up.2.block.1.conv1.bias [512] 96 | first_stage_model.decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | first_stage_model.decoder.up.2.block.1.conv2.bias [512] 98 | first_stage_model.decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | first_stage_model.decoder.up.2.block.1.norm1.bias [512] 100 | first_stage_model.decoder.up.2.block.1.norm1.weight [512] 101 | first_stage_model.decoder.up.2.block.1.norm2.bias [512] 102 | first_stage_model.decoder.up.2.block.1.norm2.weight [512] 103 | first_stage_model.decoder.up.2.block.2.conv1.bias [512] 104 | first_stage_model.decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | first_stage_model.decoder.up.2.block.2.conv2.bias [512] 106 | first_stage_model.decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | first_stage_model.decoder.up.2.block.2.norm1.bias [512] 108 | first_stage_model.decoder.up.2.block.2.norm1.weight [512] 109 | first_stage_model.decoder.up.2.block.2.norm2.bias [512] 110 | first_stage_model.decoder.up.2.block.2.norm2.weight [512] 111 | first_stage_model.decoder.up.2.upsample.conv.bias [512] 112 | first_stage_model.decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | first_stage_model.decoder.up.3.block.0.conv1.bias [512] 114 | first_stage_model.decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | first_stage_model.decoder.up.3.block.0.conv2.bias [512] 116 | first_stage_model.decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | 
first_stage_model.decoder.up.3.block.0.norm1.bias [512] 118 | first_stage_model.decoder.up.3.block.0.norm1.weight [512] 119 | first_stage_model.decoder.up.3.block.0.norm2.bias [512] 120 | first_stage_model.decoder.up.3.block.0.norm2.weight [512] 121 | first_stage_model.decoder.up.3.block.1.conv1.bias [512] 122 | first_stage_model.decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | first_stage_model.decoder.up.3.block.1.conv2.bias [512] 124 | first_stage_model.decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | first_stage_model.decoder.up.3.block.1.norm1.bias [512] 126 | first_stage_model.decoder.up.3.block.1.norm1.weight [512] 127 | first_stage_model.decoder.up.3.block.1.norm2.bias [512] 128 | first_stage_model.decoder.up.3.block.1.norm2.weight [512] 129 | first_stage_model.decoder.up.3.block.2.conv1.bias [512] 130 | first_stage_model.decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | first_stage_model.decoder.up.3.block.2.conv2.bias [512] 132 | first_stage_model.decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | first_stage_model.decoder.up.3.block.2.norm1.bias [512] 134 | first_stage_model.decoder.up.3.block.2.norm1.weight [512] 135 | first_stage_model.decoder.up.3.block.2.norm2.bias [512] 136 | first_stage_model.decoder.up.3.block.2.norm2.weight [512] 137 | first_stage_model.decoder.up.3.upsample.conv.bias [512] 138 | first_stage_model.decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | first_stage_model.encoder.conv_in.bias [128] 140 | first_stage_model.encoder.conv_in.weight [128,3,3,3] 141 | first_stage_model.encoder.conv_out.bias [32] 142 | first_stage_model.encoder.conv_out.weight [32,512,3,3] 143 | first_stage_model.encoder.down.0.block.0.conv1.bias [128] 144 | first_stage_model.encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | first_stage_model.encoder.down.0.block.0.conv2.bias [128] 146 | first_stage_model.encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | first_stage_model.encoder.down.0.block.0.norm1.bias [128] 148 | 
first_stage_model.encoder.down.0.block.0.norm1.weight [128] 149 | first_stage_model.encoder.down.0.block.0.norm2.bias [128] 150 | first_stage_model.encoder.down.0.block.0.norm2.weight [128] 151 | first_stage_model.encoder.down.0.block.1.conv1.bias [128] 152 | first_stage_model.encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | first_stage_model.encoder.down.0.block.1.conv2.bias [128] 154 | first_stage_model.encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | first_stage_model.encoder.down.0.block.1.norm1.bias [128] 156 | first_stage_model.encoder.down.0.block.1.norm1.weight [128] 157 | first_stage_model.encoder.down.0.block.1.norm2.bias [128] 158 | first_stage_model.encoder.down.0.block.1.norm2.weight [128] 159 | first_stage_model.encoder.down.0.downsample.conv.bias [128] 160 | first_stage_model.encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | first_stage_model.encoder.down.1.block.0.conv1.bias [256] 162 | first_stage_model.encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | first_stage_model.encoder.down.1.block.0.conv2.bias [256] 164 | first_stage_model.encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | first_stage_model.encoder.down.1.block.0.nin_shortcut.bias [256] 166 | first_stage_model.encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | first_stage_model.encoder.down.1.block.0.norm1.bias [128] 168 | first_stage_model.encoder.down.1.block.0.norm1.weight [128] 169 | first_stage_model.encoder.down.1.block.0.norm2.bias [256] 170 | first_stage_model.encoder.down.1.block.0.norm2.weight [256] 171 | first_stage_model.encoder.down.1.block.1.conv1.bias [256] 172 | first_stage_model.encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | first_stage_model.encoder.down.1.block.1.conv2.bias [256] 174 | first_stage_model.encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | first_stage_model.encoder.down.1.block.1.norm1.bias [256] 176 | first_stage_model.encoder.down.1.block.1.norm1.weight [256] 177 | 
first_stage_model.encoder.down.1.block.1.norm2.bias [256] 178 | first_stage_model.encoder.down.1.block.1.norm2.weight [256] 179 | first_stage_model.encoder.down.1.downsample.conv.bias [256] 180 | first_stage_model.encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | first_stage_model.encoder.down.2.block.0.conv1.bias [512] 182 | first_stage_model.encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | first_stage_model.encoder.down.2.block.0.conv2.bias [512] 184 | first_stage_model.encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | first_stage_model.encoder.down.2.block.0.nin_shortcut.bias [512] 186 | first_stage_model.encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | first_stage_model.encoder.down.2.block.0.norm1.bias [256] 188 | first_stage_model.encoder.down.2.block.0.norm1.weight [256] 189 | first_stage_model.encoder.down.2.block.0.norm2.bias [512] 190 | first_stage_model.encoder.down.2.block.0.norm2.weight [512] 191 | first_stage_model.encoder.down.2.block.1.conv1.bias [512] 192 | first_stage_model.encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | first_stage_model.encoder.down.2.block.1.conv2.bias [512] 194 | first_stage_model.encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | first_stage_model.encoder.down.2.block.1.norm1.bias [512] 196 | first_stage_model.encoder.down.2.block.1.norm1.weight [512] 197 | first_stage_model.encoder.down.2.block.1.norm2.bias [512] 198 | first_stage_model.encoder.down.2.block.1.norm2.weight [512] 199 | first_stage_model.encoder.down.2.downsample.conv.bias [512] 200 | first_stage_model.encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | first_stage_model.encoder.down.3.block.0.conv1.bias [512] 202 | first_stage_model.encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | first_stage_model.encoder.down.3.block.0.conv2.bias [512] 204 | first_stage_model.encoder.down.3.block.0.conv2.weight [512,512,3,3] 205 | first_stage_model.encoder.down.3.block.0.norm1.bias [512] 206 | 
first_stage_model.encoder.down.3.block.0.norm1.weight [512] 207 | first_stage_model.encoder.down.3.block.0.norm2.bias [512] 208 | first_stage_model.encoder.down.3.block.0.norm2.weight [512] 209 | first_stage_model.encoder.down.3.block.1.conv1.bias [512] 210 | first_stage_model.encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | first_stage_model.encoder.down.3.block.1.conv2.bias [512] 212 | first_stage_model.encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | first_stage_model.encoder.down.3.block.1.norm1.bias [512] 214 | first_stage_model.encoder.down.3.block.1.norm1.weight [512] 215 | first_stage_model.encoder.down.3.block.1.norm2.bias [512] 216 | first_stage_model.encoder.down.3.block.1.norm2.weight [512] 217 | first_stage_model.encoder.mid.attn_1.k.bias [512] 218 | first_stage_model.encoder.mid.attn_1.k.weight [512,512,1,1] 219 | first_stage_model.encoder.mid.attn_1.norm.bias [512] 220 | first_stage_model.encoder.mid.attn_1.norm.weight [512] 221 | first_stage_model.encoder.mid.attn_1.proj_out.bias [512] 222 | first_stage_model.encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | first_stage_model.encoder.mid.attn_1.q.bias [512] 224 | first_stage_model.encoder.mid.attn_1.q.weight [512,512,1,1] 225 | first_stage_model.encoder.mid.attn_1.v.bias [512] 226 | first_stage_model.encoder.mid.attn_1.v.weight [512,512,1,1] 227 | first_stage_model.encoder.mid.block_1.conv1.bias [512] 228 | first_stage_model.encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | first_stage_model.encoder.mid.block_1.conv2.bias [512] 230 | first_stage_model.encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | first_stage_model.encoder.mid.block_1.norm1.bias [512] 232 | first_stage_model.encoder.mid.block_1.norm1.weight [512] 233 | first_stage_model.encoder.mid.block_1.norm2.bias [512] 234 | first_stage_model.encoder.mid.block_1.norm2.weight [512] 235 | first_stage_model.encoder.mid.block_2.conv1.bias [512] 236 | first_stage_model.encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | 
first_stage_model.encoder.mid.block_2.conv2.bias [512] 238 | first_stage_model.encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | first_stage_model.encoder.mid.block_2.norm1.bias [512] 240 | first_stage_model.encoder.mid.block_2.norm1.weight [512] 241 | first_stage_model.encoder.mid.block_2.norm2.bias [512] 242 | first_stage_model.encoder.mid.block_2.norm2.weight [512] 243 | first_stage_model.encoder.norm_out.bias [512] 244 | first_stage_model.encoder.norm_out.weight [512] -------------------------------------------------------------------------------- /components/UNET-XL-B-Inpainting.txt: -------------------------------------------------------------------------------- 1 | input_blocks.0.0.weight [320,9,3,3] -------------------------------------------------------------------------------- /components/UNET-XL-B-SD.txt: -------------------------------------------------------------------------------- 1 | input_blocks.0.0.weight [320,4,3,3] -------------------------------------------------------------------------------- /components/UNET-v1-EMBLAY.txt: -------------------------------------------------------------------------------- 1 | input_blocks.1.0.emb_layers.1.bias [320] 2 | input_blocks.1.0.emb_layers.1.weight [320,1280] 3 | input_blocks.2.0.emb_layers.1.bias [320] 4 | input_blocks.2.0.emb_layers.1.weight [320,1280] 5 | input_blocks.4.0.emb_layers.1.bias [640] 6 | input_blocks.4.0.emb_layers.1.weight [640,1280] 7 | input_blocks.5.0.emb_layers.1.bias [640] 8 | input_blocks.5.0.emb_layers.1.weight [640,1280] 9 | input_blocks.7.0.emb_layers.1.bias [1280] 10 | input_blocks.7.0.emb_layers.1.weight [1280,1280] 11 | input_blocks.8.0.emb_layers.1.bias [1280] 12 | input_blocks.8.0.emb_layers.1.weight [1280,1280] 13 | middle_block.0.emb_layers.1.bias [1280] 14 | middle_block.0.emb_layers.1.weight [1280,1280] 15 | middle_block.2.emb_layers.1.bias [1280] 16 | middle_block.2.emb_layers.1.weight [1280,1280] 17 | output_blocks.0.0.emb_layers.1.bias [1280] 18 | 
output_blocks.0.0.emb_layers.1.weight [1280,1280] 19 | output_blocks.1.0.emb_layers.1.bias [1280] 20 | output_blocks.1.0.emb_layers.1.weight [1280,1280] 21 | output_blocks.2.0.emb_layers.1.bias [1280] 22 | output_blocks.2.0.emb_layers.1.weight [1280,1280] -------------------------------------------------------------------------------- /components/UNET-v1-MID.txt: -------------------------------------------------------------------------------- 1 | middle_block.0.emb_layers.1.bias [1280] 2 | middle_block.0.emb_layers.1.weight [1280,1280] 3 | middle_block.0.in_layers.0.bias [1280] 4 | middle_block.0.in_layers.0.weight [1280] 5 | middle_block.0.in_layers.2.bias [1280] 6 | middle_block.0.in_layers.2.weight [1280,1280,3,3] 7 | middle_block.0.out_layers.0.bias [1280] 8 | middle_block.0.out_layers.0.weight [1280] 9 | middle_block.0.out_layers.3.bias [1280] 10 | middle_block.0.out_layers.3.weight [1280,1280,3,3] 11 | middle_block.1.norm.bias [1280] 12 | middle_block.1.norm.weight [1280] 13 | middle_block.1.proj_in.bias [1280] 14 | middle_block.1.proj_out.bias [1280] 15 | middle_block.1.transformer_blocks.0.attn1.to_k.weight [1280,1280] 16 | middle_block.1.transformer_blocks.0.attn1.to_out.0.bias [1280] 17 | middle_block.1.transformer_blocks.0.attn1.to_out.0.weight [1280,1280] 18 | middle_block.1.transformer_blocks.0.attn1.to_q.weight [1280,1280] 19 | middle_block.1.transformer_blocks.0.attn1.to_v.weight [1280,1280] 20 | middle_block.1.transformer_blocks.0.attn2.to_out.0.bias [1280] 21 | middle_block.1.transformer_blocks.0.attn2.to_out.0.weight [1280,1280] 22 | middle_block.1.transformer_blocks.0.attn2.to_q.weight [1280,1280] 23 | middle_block.1.transformer_blocks.0.ff.net.0.proj.bias [10240] 24 | middle_block.1.transformer_blocks.0.ff.net.0.proj.weight [10240,1280] 25 | middle_block.1.transformer_blocks.0.ff.net.2.bias [1280] 26 | middle_block.1.transformer_blocks.0.ff.net.2.weight [1280,5120] 27 | middle_block.1.transformer_blocks.0.norm1.bias [1280] 28 | 
middle_block.1.transformer_blocks.0.norm1.weight [1280] 29 | middle_block.1.transformer_blocks.0.norm2.bias [1280] 30 | middle_block.1.transformer_blocks.0.norm2.weight [1280] 31 | middle_block.1.transformer_blocks.0.norm3.bias [1280] 32 | middle_block.1.transformer_blocks.0.norm3.weight [1280] 33 | middle_block.2.emb_layers.1.bias [1280] 34 | middle_block.2.emb_layers.1.weight [1280,1280] 35 | middle_block.2.in_layers.0.bias [1280] 36 | middle_block.2.in_layers.0.weight [1280] 37 | middle_block.2.in_layers.2.bias [1280] 38 | middle_block.2.in_layers.2.weight [1280,1280,3,3] 39 | middle_block.2.out_layers.0.bias [1280] 40 | middle_block.2.out_layers.0.weight [1280] 41 | middle_block.2.out_layers.3.bias [1280] 42 | middle_block.2.out_layers.3.weight [1280,1280,3,3] -------------------------------------------------------------------------------- /components/UNET-v1-SKIP.txt: -------------------------------------------------------------------------------- 1 | input_blocks.4.0.skip_connection.bias [640] 2 | input_blocks.4.0.skip_connection.weight [640,320,1,1] 3 | input_blocks.7.0.skip_connection.bias [1280] 4 | input_blocks.7.0.skip_connection.weight [1280,640,1,1] 5 | output_blocks.0.0.skip_connection.bias [1280] 6 | output_blocks.0.0.skip_connection.weight [1280,2560,1,1] 7 | output_blocks.1.0.skip_connection.bias [1280] 8 | output_blocks.1.0.skip_connection.weight [1280,2560,1,1] 9 | output_blocks.2.0.skip_connection.bias [1280] -------------------------------------------------------------------------------- /components/UNET-v1-TIME-EMB.txt: -------------------------------------------------------------------------------- 1 | time_embed.0.bias [1280] 2 | time_embed.0.weight [1280,320] 3 | time_embed.2.bias [1280] 4 | time_embed.2.weight [1280,1280] -------------------------------------------------------------------------------- /components/UNET-v1-UP.txt: -------------------------------------------------------------------------------- 1 | input_blocks.0.0.bias [320] 
2 | input_blocks.0.0.weight [320,4,3,3] 3 | input_blocks.1.0.emb_layers.1.bias [320] 4 | input_blocks.1.0.emb_layers.1.weight [320,1280] 5 | input_blocks.1.0.in_layers.0.bias [320] 6 | input_blocks.1.0.in_layers.0.weight [320] 7 | input_blocks.1.0.in_layers.2.bias [320] 8 | input_blocks.1.0.in_layers.2.weight [320,320,3,3] 9 | input_blocks.1.0.out_layers.0.bias [320] 10 | input_blocks.1.0.out_layers.0.weight [320] 11 | input_blocks.1.0.out_layers.3.bias [320] 12 | input_blocks.1.0.out_layers.3.weight [320,320,3,3] 13 | input_blocks.2.0.emb_layers.1.bias [320] 14 | input_blocks.2.0.emb_layers.1.weight [320,1280] 15 | input_blocks.2.0.in_layers.0.bias [320] 16 | input_blocks.2.0.in_layers.0.weight [320] 17 | input_blocks.2.0.in_layers.2.bias [320] 18 | input_blocks.2.0.in_layers.2.weight [320,320,3,3] 19 | input_blocks.2.0.out_layers.0.bias [320] 20 | input_blocks.2.0.out_layers.0.weight [320] 21 | input_blocks.2.0.out_layers.3.bias [320] 22 | input_blocks.2.0.out_layers.3.weight [320,320,3,3] 23 | input_blocks.3.0.op.bias [320] 24 | input_blocks.3.0.op.weight [320,320,3,3] 25 | input_blocks.4.0.emb_layers.1.bias [640] 26 | input_blocks.4.0.emb_layers.1.weight [640,1280] 27 | input_blocks.4.0.in_layers.0.bias [320] 28 | input_blocks.4.0.in_layers.0.weight [320] 29 | input_blocks.4.0.in_layers.2.bias [640] 30 | input_blocks.4.0.in_layers.2.weight [640,320,3,3] 31 | input_blocks.4.0.out_layers.0.bias [640] 32 | input_blocks.4.0.out_layers.0.weight [640] 33 | input_blocks.4.0.out_layers.3.bias [640] 34 | input_blocks.4.0.out_layers.3.weight [640,640,3,3] 35 | input_blocks.4.0.skip_connection.bias [640] 36 | input_blocks.4.0.skip_connection.weight [640,320,1,1] 37 | input_blocks.4.1.norm.bias [640] 38 | input_blocks.4.1.norm.weight [640] 39 | input_blocks.4.1.proj_in.bias [640] 40 | input_blocks.4.1.proj_out.bias [640] 41 | input_blocks.4.1.transformer_blocks.0.attn1.to_k.weight [640,640] 42 | input_blocks.4.1.transformer_blocks.0.attn1.to_out.0.bias [640] 43 | 
input_blocks.4.1.transformer_blocks.0.attn1.to_out.0.weight [640,640] 44 | input_blocks.4.1.transformer_blocks.0.attn1.to_q.weight [640,640] 45 | input_blocks.4.1.transformer_blocks.0.attn1.to_v.weight [640,640] 46 | input_blocks.4.1.transformer_blocks.0.attn2.to_out.0.bias [640] 47 | input_blocks.4.1.transformer_blocks.0.attn2.to_out.0.weight [640,640] 48 | input_blocks.4.1.transformer_blocks.0.attn2.to_q.weight [640,640] 49 | input_blocks.4.1.transformer_blocks.0.ff.net.0.proj.bias [5120] 50 | input_blocks.4.1.transformer_blocks.0.ff.net.0.proj.weight [5120,640] 51 | input_blocks.4.1.transformer_blocks.0.ff.net.2.weight [640,2560] 52 | input_blocks.4.1.transformer_blocks.0.norm1.bias [640] 53 | input_blocks.4.1.transformer_blocks.0.norm1.weight [640] 54 | input_blocks.4.1.transformer_blocks.0.norm2.bias [640] 55 | input_blocks.4.1.transformer_blocks.0.norm2.weight [640] 56 | input_blocks.4.1.transformer_blocks.0.norm3.bias [640] 57 | input_blocks.4.1.transformer_blocks.0.norm3.weight [640] 58 | input_blocks.5.0.emb_layers.1.bias [640] 59 | input_blocks.5.0.emb_layers.1.weight [640,1280] 60 | input_blocks.5.0.in_layers.0.bias [640] 61 | input_blocks.5.0.in_layers.0.weight [640] 62 | input_blocks.5.0.in_layers.2.bias [640] 63 | input_blocks.5.0.in_layers.2.weight [640,640,3,3] 64 | input_blocks.5.0.out_layers.0.bias [640] 65 | input_blocks.5.0.out_layers.0.weight [640] 66 | input_blocks.5.0.out_layers.3.bias [640] 67 | input_blocks.5.0.out_layers.3.weight [640,640,3,3] 68 | input_blocks.5.1.norm.bias [640] 69 | input_blocks.5.1.norm.weight [640] 70 | input_blocks.5.1.proj_in.bias [640] 71 | input_blocks.5.1.proj_out.bias [640] 72 | input_blocks.5.1.transformer_blocks.0.attn1.to_k.weight [640,640] 73 | input_blocks.5.1.transformer_blocks.0.attn1.to_out.0.bias [640] 74 | input_blocks.5.1.transformer_blocks.0.attn1.to_out.0.weight [640,640] 75 | input_blocks.5.1.transformer_blocks.0.attn1.to_q.weight [640,640] 76 | 
input_blocks.5.1.transformer_blocks.0.attn1.to_v.weight [640,640] 77 | input_blocks.5.1.transformer_blocks.0.attn2.to_out.0.bias [640] 78 | input_blocks.5.1.transformer_blocks.0.attn2.to_out.0.weight [640,640] 79 | input_blocks.5.1.transformer_blocks.0.attn2.to_q.weight [640,640] 80 | input_blocks.5.1.transformer_blocks.0.ff.net.0.proj.bias [5120] 81 | input_blocks.5.1.transformer_blocks.0.ff.net.0.proj.weight [5120,640] 82 | input_blocks.5.1.transformer_blocks.0.ff.net.2.bias [640] 83 | input_blocks.5.1.transformer_blocks.0.ff.net.2.weight [640,2560] 84 | input_blocks.5.1.transformer_blocks.0.norm1.bias [640] 85 | input_blocks.5.1.transformer_blocks.0.norm1.weight [640] 86 | input_blocks.5.1.transformer_blocks.0.norm2.bias [640] 87 | input_blocks.5.1.transformer_blocks.0.norm2.weight [640] 88 | input_blocks.5.1.transformer_blocks.0.norm3.bias [640] 89 | input_blocks.5.1.transformer_blocks.0.norm3.weight [640] 90 | input_blocks.6.0.op.bias [640] 91 | input_blocks.6.0.op.weight [640,640,3,3] 92 | input_blocks.7.0.emb_layers.1.bias [1280] 93 | input_blocks.7.0.emb_layers.1.weight [1280,1280] 94 | input_blocks.7.0.in_layers.0.bias [640] 95 | input_blocks.7.0.in_layers.0.weight [640] 96 | input_blocks.7.0.in_layers.2.bias [1280] 97 | input_blocks.7.0.in_layers.2.weight [1280,640,3,3] 98 | input_blocks.7.0.out_layers.0.bias [1280] 99 | input_blocks.7.0.out_layers.0.weight [1280] 100 | input_blocks.7.0.out_layers.3.bias [1280] 101 | input_blocks.7.0.out_layers.3.weight [1280,1280,3,3] 102 | input_blocks.7.0.skip_connection.bias [1280] 103 | input_blocks.7.0.skip_connection.weight [1280,640,1,1] 104 | input_blocks.7.1.norm.bias [1280] 105 | input_blocks.7.1.norm.weight [1280] 106 | input_blocks.7.1.proj_in.bias [1280] 107 | input_blocks.7.1.proj_out.bias [1280] 108 | input_blocks.7.1.transformer_blocks.0.attn1.to_k.weight [1280,1280] 109 | input_blocks.7.1.transformer_blocks.0.attn1.to_out.0.bias [1280] 110 | input_blocks.7.1.transformer_blocks.0.attn1.to_out.0.weight 
[1280,1280] 111 | input_blocks.7.1.transformer_blocks.0.attn1.to_q.weight [1280,1280] 112 | input_blocks.7.1.transformer_blocks.0.attn1.to_v.weight [1280,1280] 113 | input_blocks.7.1.transformer_blocks.0.attn2.to_out.0.bias [1280] 114 | input_blocks.7.1.transformer_blocks.0.attn2.to_out.0.weight [1280,1280] 115 | input_blocks.7.1.transformer_blocks.0.attn2.to_q.weight [1280,1280] 116 | input_blocks.7.1.transformer_blocks.0.ff.net.0.proj.bias [10240] 117 | input_blocks.7.1.transformer_blocks.0.ff.net.0.proj.weight [10240,1280] 118 | input_blocks.7.1.transformer_blocks.0.ff.net.2.bias [1280] 119 | input_blocks.7.1.transformer_blocks.0.ff.net.2.weight [1280,5120] 120 | input_blocks.7.1.transformer_blocks.0.norm1.bias [1280] 121 | input_blocks.7.1.transformer_blocks.0.norm1.weight [1280] 122 | input_blocks.7.1.transformer_blocks.0.norm2.bias [1280] 123 | input_blocks.7.1.transformer_blocks.0.norm2.weight [1280] 124 | input_blocks.7.1.transformer_blocks.0.norm3.bias [1280] 125 | input_blocks.7.1.transformer_blocks.0.norm3.weight [1280] 126 | input_blocks.8.0.emb_layers.1.bias [1280] 127 | input_blocks.8.0.emb_layers.1.weight [1280,1280] 128 | input_blocks.8.0.in_layers.0.bias [1280] 129 | input_blocks.8.0.in_layers.0.weight [1280] 130 | input_blocks.8.0.in_layers.2.bias [1280] 131 | input_blocks.8.0.in_layers.2.weight [1280,1280,3,3] 132 | input_blocks.8.0.out_layers.0.bias [1280] 133 | input_blocks.8.0.out_layers.0.weight [1280] 134 | input_blocks.8.0.out_layers.3.bias [1280] 135 | input_blocks.8.0.out_layers.3.weight [1280,1280,3,3] 136 | input_blocks.8.1.norm.bias [1280] 137 | input_blocks.8.1.norm.weight [1280] 138 | input_blocks.8.1.proj_in.bias [1280] 139 | input_blocks.8.1.proj_out.bias [1280] 140 | input_blocks.8.1.transformer_blocks.0.attn1.to_k.weight [1280,1280] 141 | input_blocks.8.1.transformer_blocks.0.attn1.to_out.0.bias [1280] 142 | input_blocks.8.1.transformer_blocks.0.attn1.to_out.0.weight [1280,1280] 143 | 
input_blocks.8.1.transformer_blocks.0.attn1.to_q.weight [1280,1280] 144 | input_blocks.8.1.transformer_blocks.0.attn1.to_v.weight [1280,1280] 145 | input_blocks.8.1.transformer_blocks.0.attn2.to_out.0.bias [1280] 146 | input_blocks.8.1.transformer_blocks.0.attn2.to_out.0.weight [1280,1280] 147 | input_blocks.8.1.transformer_blocks.0.attn2.to_q.weight [1280,1280] 148 | input_blocks.8.1.transformer_blocks.0.ff.net.0.proj.bias [10240] 149 | input_blocks.8.1.transformer_blocks.0.ff.net.0.proj.weight [10240,1280] 150 | input_blocks.8.1.transformer_blocks.0.ff.net.2.bias [1280] 151 | input_blocks.8.1.transformer_blocks.0.ff.net.2.weight [1280,5120] 152 | input_blocks.8.1.transformer_blocks.0.norm1.bias [1280] 153 | input_blocks.8.1.transformer_blocks.0.norm1.weight [1280] 154 | input_blocks.8.1.transformer_blocks.0.norm2.bias [1280] 155 | input_blocks.8.1.transformer_blocks.0.norm2.weight [1280] 156 | input_blocks.8.1.transformer_blocks.0.norm3.bias [1280] 157 | input_blocks.8.1.transformer_blocks.0.norm3.weight [1280] 158 | middle_block.0.emb_layers.1.bias [1280] 159 | middle_block.0.emb_layers.1.weight [1280,1280] 160 | middle_block.0.in_layers.0.bias [1280] 161 | middle_block.0.in_layers.0.weight [1280] 162 | middle_block.0.in_layers.2.bias [1280] 163 | middle_block.0.in_layers.2.weight [1280,1280,3,3] 164 | middle_block.0.out_layers.0.bias [1280] 165 | middle_block.0.out_layers.0.weight [1280] 166 | middle_block.0.out_layers.3.bias [1280] 167 | middle_block.0.out_layers.3.weight [1280,1280,3,3] 168 | middle_block.1.norm.bias [1280] 169 | middle_block.1.norm.weight [1280] 170 | middle_block.1.proj_in.bias [1280] 171 | middle_block.1.proj_out.bias [1280] 172 | middle_block.1.transformer_blocks.0.attn1.to_k.weight [1280,1280] 173 | middle_block.1.transformer_blocks.0.attn1.to_out.0.bias [1280] 174 | middle_block.1.transformer_blocks.0.attn1.to_out.0.weight [1280,1280] 175 | middle_block.1.transformer_blocks.0.attn1.to_q.weight [1280,1280] 176 | 
middle_block.1.transformer_blocks.0.attn1.to_v.weight [1280,1280] 177 | middle_block.1.transformer_blocks.0.attn2.to_out.0.bias [1280] 178 | middle_block.1.transformer_blocks.0.attn2.to_out.0.weight [1280,1280] 179 | middle_block.1.transformer_blocks.0.attn2.to_q.weight [1280,1280] 180 | middle_block.1.transformer_blocks.0.ff.net.0.proj.bias [10240] 181 | middle_block.1.transformer_blocks.0.ff.net.0.proj.weight [10240,1280] 182 | middle_block.1.transformer_blocks.0.ff.net.2.bias [1280] 183 | middle_block.1.transformer_blocks.0.ff.net.2.weight [1280,5120] 184 | middle_block.1.transformer_blocks.0.norm1.bias [1280] 185 | middle_block.1.transformer_blocks.0.norm1.weight [1280] 186 | middle_block.1.transformer_blocks.0.norm2.bias [1280] 187 | middle_block.1.transformer_blocks.0.norm2.weight [1280] 188 | middle_block.1.transformer_blocks.0.norm3.bias [1280] 189 | middle_block.1.transformer_blocks.0.norm3.weight [1280] 190 | middle_block.2.emb_layers.1.bias [1280] 191 | middle_block.2.emb_layers.1.weight [1280,1280] 192 | middle_block.2.in_layers.0.bias [1280] 193 | middle_block.2.in_layers.0.weight [1280] 194 | middle_block.2.in_layers.2.bias [1280] 195 | middle_block.2.in_layers.2.weight [1280,1280,3,3] 196 | middle_block.2.out_layers.0.bias [1280] 197 | middle_block.2.out_layers.0.weight [1280] 198 | middle_block.2.out_layers.3.bias [1280] 199 | middle_block.2.out_layers.3.weight [1280,1280,3,3] 200 | out.0.bias [320] 201 | out.0.weight [320] 202 | out.2.bias [4] 203 | out.2.weight [4,320,3,3] 204 | output_blocks.0.0.emb_layers.1.bias [1280] 205 | output_blocks.0.0.emb_layers.1.weight [1280,1280] 206 | output_blocks.0.0.in_layers.0.bias [2560] 207 | output_blocks.0.0.in_layers.0.weight [2560] 208 | output_blocks.0.0.in_layers.2.bias [1280] 209 | output_blocks.0.0.in_layers.2.weight [1280,2560,3,3] 210 | output_blocks.0.0.out_layers.0.bias [1280] 211 | output_blocks.0.0.out_layers.0.weight [1280] 212 | output_blocks.0.0.out_layers.3.bias [1280] 213 | 
output_blocks.0.0.out_layers.3.weight [1280,1280,3,3] 214 | output_blocks.0.0.skip_connection.bias [1280] 215 | output_blocks.0.0.skip_connection.weight [1280,2560,1,1] 216 | output_blocks.1.0.emb_layers.1.bias [1280] 217 | output_blocks.1.0.emb_layers.1.weight [1280,1280] 218 | output_blocks.1.0.in_layers.0.bias [2560] 219 | output_blocks.1.0.in_layers.0.weight [2560] 220 | output_blocks.1.0.in_layers.2.bias [1280] 221 | output_blocks.1.0.in_layers.2.weight [1280,2560,3,3] 222 | output_blocks.1.0.out_layers.0.bias [1280] 223 | output_blocks.1.0.out_layers.0.weight [1280] 224 | output_blocks.1.0.out_layers.3.bias [1280] 225 | output_blocks.1.0.out_layers.3.weight [1280,1280,3,3] 226 | output_blocks.1.0.skip_connection.bias [1280] 227 | output_blocks.1.0.skip_connection.weight [1280,2560,1,1] 228 | output_blocks.2.0.emb_layers.1.bias [1280] 229 | output_blocks.2.0.emb_layers.1.weight [1280,1280] 230 | output_blocks.2.0.in_layers.2.bias [1280] 231 | output_blocks.2.0.out_layers.0.bias [1280] 232 | output_blocks.2.0.out_layers.0.weight [1280] 233 | output_blocks.2.0.out_layers.3.bias [1280] 234 | output_blocks.2.0.out_layers.3.weight [1280,1280,3,3] 235 | output_blocks.2.0.skip_connection.bias [1280] 236 | time_embed.0.bias [1280] 237 | time_embed.0.weight [1280,320] 238 | time_embed.2.bias [1280] 239 | time_embed.2.weight [1280,1280] -------------------------------------------------------------------------------- /components/VAE-MU-DE.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.bias [128] 2 | params.blocks_0_1.conv.kernel [3,3,256,128] 3 | params.blocks_0_2_0.conv.bias [256] 4 | params.blocks_0_2_0.conv.kernel [3,3,256,256] 5 | params.blocks_0_2_0.group_norm.bias [256] 6 | params.blocks_0_2_0.group_norm.scale [256] 7 | params.blocks_0_2_0.pointwise_contract.bias [256] 8 | params.blocks_0_2_0.pointwise_contract.kernel [256,256] 9 | params.blocks_0_2_1.conv.bias [256] 10 | 
params.blocks_0_2_1.conv.kernel [3,3,256,256] 11 | params.blocks_0_2_1.group_norm.bias [256] 12 | params.blocks_0_2_1.group_norm.scale [256] 13 | params.blocks_0_2_1.pointwise_contract.bias [256] 14 | params.blocks_0_2_1.pointwise_contract.kernel [256,256] 15 | params.blocks_0_2_2.conv.bias [256] 16 | params.blocks_0_2_2.conv.kernel [3,3,256,256] 17 | params.blocks_0_2_2.group_norm.bias [256] 18 | params.blocks_0_2_2.group_norm.scale [256] 19 | params.blocks_0_2_2.pointwise_contract.bias [256] 20 | params.blocks_0_2_2.pointwise_contract.kernel [256,256] 21 | params.blocks_0_2_3.conv.bias [256] 22 | params.blocks_0_2_3.conv.kernel [3,3,256,256] 23 | params.blocks_0_2_3.group_norm.bias [256] 24 | params.blocks_0_2_3.group_norm.scale [256] 25 | params.blocks_0_2_3.pointwise_contract.bias [256] 26 | params.blocks_0_2_3.pointwise_contract.kernel [256,256] 27 | params.blocks_1_1.conv.bias [128] 28 | params.blocks_1_1.conv.kernel [3,3,128,128] 29 | params.blocks_1_2_0.conv.bias [128] 30 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 31 | params.blocks_1_2_0.group_norm.bias [128] 32 | params.blocks_1_2_0.group_norm.scale [128] 33 | params.blocks_1_2_0.pointwise_contract.bias [128] 34 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 35 | params.blocks_1_2_1.conv.bias [128] 36 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 37 | params.blocks_1_2_1.group_norm.bias [128] 38 | params.blocks_1_2_1.group_norm.scale [128] 39 | params.blocks_1_2_1.pointwise_contract.bias [128] 40 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 41 | params.blocks_1_2_2.conv.bias [128] 42 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 43 | params.blocks_1_2_2.group_norm.bias [128] 44 | params.blocks_1_2_2.group_norm.scale [128] 45 | params.blocks_1_2_2.pointwise_contract.bias [128] 46 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 47 | params.blocks_1_2_3.conv.bias [128] 48 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_1_2_3.group_norm.bias [128] 
50 | params.blocks_1_2_3.group_norm.scale [128] 51 | params.blocks_1_2_3.pointwise_contract.bias [128] 52 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 53 | params.blocks_2_1.conv.bias [128] 54 | params.blocks_2_1.conv.kernel [3,3,128,128] 55 | params.blocks_2_2_0.conv.bias [128] 56 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 57 | params.blocks_2_2_0.group_norm.bias [128] 58 | params.blocks_2_2_0.group_norm.scale [128] 59 | params.blocks_2_2_0.pointwise_contract.bias [128] 60 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 61 | params.blocks_2_2_1.conv.bias [128] 62 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 63 | params.blocks_2_2_1.group_norm.bias [128] 64 | params.blocks_2_2_1.group_norm.scale [128] 65 | params.blocks_2_2_1.pointwise_contract.bias [128] 66 | params.blocks_2_2_1.pointwise_contract.kernel [128,128] 67 | params.blocks_2_2_2.conv.bias [128] 68 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 69 | params.blocks_2_2_2.group_norm.bias [128] 70 | params.blocks_2_2_2.group_norm.scale [128] 71 | params.blocks_2_2_2.pointwise_contract.bias [128] 72 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 73 | params.blocks_2_2_3.conv.bias [128] 74 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 75 | params.blocks_2_2_3.group_norm.bias [128] 76 | params.blocks_2_2_3.group_norm.scale [128] 77 | params.blocks_2_2_3.pointwise_contract.bias [128] 78 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 79 | params.blocks_3_1.conv.bias [128] 80 | params.blocks_3_1.conv.kernel [3,3,128,128] 81 | params.blocks_3_2_0.conv.bias [128] 82 | params.blocks_3_2_0.conv.kernel [3,3,128,128] 83 | params.blocks_3_2_0.group_norm.bias [128] 84 | params.blocks_3_2_0.group_norm.scale [128] 85 | params.blocks_3_2_0.pointwise_contract.bias [128] 86 | params.blocks_3_2_0.pointwise_contract.kernel [128,128] 87 | params.blocks_3_2_1.conv.bias [128] 88 | params.blocks_3_2_1.conv.kernel [3,3,128,128] 89 | params.blocks_3_2_1.group_norm.bias [128] 90 | 
params.blocks_3_2_1.group_norm.scale [128] 91 | params.blocks_3_2_1.pointwise_contract.bias [128] 92 | params.blocks_3_2_1.pointwise_contract.kernel [128,128] 93 | params.blocks_3_2_2.conv.bias [128] 94 | params.blocks_3_2_2.conv.kernel [3,3,128,128] 95 | params.blocks_3_2_2.group_norm.bias [128] 96 | params.blocks_3_2_2.group_norm.scale [128] 97 | params.blocks_3_2_2.pointwise_contract.bias [128] 98 | params.blocks_3_2_2.pointwise_contract.kernel [128,128] 99 | params.blocks_3_2_3.conv.bias [128] 100 | params.blocks_3_2_3.conv.kernel [3,3,128,128] 101 | params.blocks_3_2_3.group_norm.bias [128] 102 | params.blocks_3_2_3.group_norm.scale [128] 103 | params.blocks_3_2_3.pointwise_contract.bias [128] 104 | params.blocks_3_2_3.pointwise_contract.kernel [128,128] 105 | params.final_conv.bias [3] 106 | params.final_conv.kernel [3,3,128,3] 107 | params.final_norm.bias [128] 108 | params.final_norm.scale [128] 109 | params.projections.bias [256] 110 | params.projections.kernel [64,256] -------------------------------------------------------------------------------- /components/VAE-MU-EN.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.kernel [3,3,128,128] 2 | params.blocks_0_2_0.conv.kernel [3,3,128,128] 3 | params.blocks_0_2_0.group_norm.bias [128] 4 | params.blocks_0_2_0.group_norm.scale [128] 5 | params.blocks_0_2_0.pointwise_contract.kernel [128,128] 6 | params.blocks_0_2_1.conv.kernel [3,3,128,128] 7 | params.blocks_0_2_1.group_norm.bias [128] 8 | params.blocks_0_2_1.group_norm.scale [128] 9 | params.blocks_0_2_1.pointwise_contract.kernel [128,128] 10 | params.blocks_0_2_2.conv.kernel [3,3,128,128] 11 | params.blocks_0_2_2.group_norm.bias [128] 12 | params.blocks_0_2_2.group_norm.scale [128] 13 | params.blocks_0_2_2.pointwise_contract.kernel [128,128] 14 | params.blocks_0_2_3.conv.kernel [3,3,128,128] 15 | params.blocks_0_2_3.group_norm.bias [128] 16 | params.blocks_0_2_3.group_norm.scale [128] 17 | 
params.blocks_0_2_3.pointwise_contract.kernel [128,128] 18 | params.blocks_1_1.conv.kernel [3,3,128,128] 19 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 20 | params.blocks_1_2_0.group_norm.bias [128] 21 | params.blocks_1_2_0.group_norm.scale [128] 22 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 23 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 24 | params.blocks_1_2_1.group_norm.bias [128] 25 | params.blocks_1_2_1.group_norm.scale [128] 26 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 27 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 28 | params.blocks_1_2_2.group_norm.bias [128] 29 | params.blocks_1_2_2.group_norm.scale [128] 30 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 31 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 32 | params.blocks_1_2_3.group_norm.bias [128] 33 | params.blocks_1_2_3.group_norm.scale [128] 34 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 35 | params.blocks_2_1.conv.kernel [3,3,128,128] 36 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 37 | params.blocks_2_2_0.group_norm.bias [128] 38 | params.blocks_2_2_0.group_norm.scale [128] 39 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 40 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 41 | params.blocks_2_2_1.group_norm.bias [128] 42 | params.blocks_2_2_1.group_norm.scale [128] 43 | params.blocks_2_2_1.pointwise_contract.kernel [128,128] 44 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 45 | params.blocks_2_2_2.group_norm.bias [128] 46 | params.blocks_2_2_2.group_norm.scale [128] 47 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 48 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_2_2_3.group_norm.bias [128] 50 | params.blocks_2_2_3.group_norm.scale [128] 51 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 52 | params.blocks_3_1.conv.kernel [3,3,128,256] 53 | params.blocks_3_2_0.conv.kernel [3,3,256,256] 54 | params.blocks_3_2_0.group_norm.bias [256] 55 | params.blocks_3_2_0.group_norm.scale 
[256] 56 | params.blocks_3_2_0.pointwise_contract.kernel [256,256] 57 | params.blocks_3_2_1.conv.kernel [3,3,256,256] 58 | params.blocks_3_2_1.group_norm.bias [256] 59 | params.blocks_3_2_1.group_norm.scale [256] 60 | params.blocks_3_2_1.pointwise_contract.kernel [256,256] 61 | params.blocks_3_2_2.conv.kernel [3,3,256,256] 62 | params.blocks_3_2_2.group_norm.bias [256] 63 | params.blocks_3_2_2.group_norm.scale [256] 64 | params.blocks_3_2_2.pointwise_contract.kernel [256,256] 65 | params.blocks_3_2_3.conv.kernel [3,3,256,256] 66 | params.blocks_3_2_3.group_norm.bias [256] 67 | params.blocks_3_2_3.group_norm.scale [256] 68 | params.blocks_3_2_3.pointwise_contract.kernel [256,256] 69 | params.final_norm.bias [256] 70 | params.final_norm.scale [256] 71 | params.input_conv.bias [128] 72 | params.input_conv.kernel [3,3,3,128] 73 | params.projections.bias [128] 74 | params.projections.kernel [3,3,256,128] -------------------------------------------------------------------------------- /components/VAE-NU-DE.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.bias [128] 2 | params.blocks_0_1.conv.kernel [3,3,256,128] 3 | params.blocks_0_2_0.conv.bias [256] 4 | params.blocks_0_2_0.conv.kernel [3,3,256,256] 5 | params.blocks_0_2_0.group_norm.bias [256] 6 | params.blocks_0_2_0.group_norm.scale [256] 7 | params.blocks_0_2_0.pointwise_contract.bias [256] 8 | params.blocks_0_2_0.pointwise_contract.kernel [256,256] 9 | params.blocks_0_2_1.conv.bias [256] 10 | params.blocks_0_2_1.conv.kernel [3,3,256,256] 11 | params.blocks_0_2_1.group_norm.bias [256] 12 | params.blocks_0_2_1.group_norm.scale [256] 13 | params.blocks_0_2_1.pointwise_contract.bias [256] 14 | params.blocks_0_2_1.pointwise_contract.kernel [256,256] 15 | params.blocks_0_2_2.conv.bias [256] 16 | params.blocks_0_2_2.conv.kernel [3,3,256,256] 17 | params.blocks_0_2_2.group_norm.bias [256] 18 | params.blocks_0_2_2.group_norm.scale [256] 19 | 
params.blocks_0_2_2.pointwise_contract.bias [256] 20 | params.blocks_0_2_2.pointwise_contract.kernel [256,256] 21 | params.blocks_0_2_3.conv.bias [256] 22 | params.blocks_0_2_3.conv.kernel [3,3,256,256] 23 | params.blocks_0_2_3.group_norm.bias [256] 24 | params.blocks_0_2_3.group_norm.scale [256] 25 | params.blocks_0_2_3.pointwise_contract.bias [256] 26 | params.blocks_0_2_3.pointwise_contract.kernel [256,256] 27 | params.blocks_1_1.conv.bias [128] 28 | params.blocks_1_1.conv.kernel [3,3,128,128] 29 | params.blocks_1_2_0.conv.bias [128] 30 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 31 | params.blocks_1_2_0.group_norm.bias [128] 32 | params.blocks_1_2_0.group_norm.scale [128] 33 | params.blocks_1_2_0.pointwise_contract.bias [128] 34 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 35 | params.blocks_1_2_1.conv.bias [128] 36 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 37 | params.blocks_1_2_1.group_norm.bias [128] 38 | params.blocks_1_2_1.group_norm.scale [128] 39 | params.blocks_1_2_1.pointwise_contract.bias [128] 40 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 41 | params.blocks_1_2_2.conv.bias [128] 42 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 43 | params.blocks_1_2_2.group_norm.bias [128] 44 | params.blocks_1_2_2.group_norm.scale [128] 45 | params.blocks_1_2_2.pointwise_contract.bias [128] 46 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 47 | params.blocks_1_2_3.conv.bias [128] 48 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_1_2_3.group_norm.bias [128] 50 | params.blocks_1_2_3.group_norm.scale [128] 51 | params.blocks_1_2_3.pointwise_contract.bias [128] 52 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 53 | params.blocks_2_1.conv.bias [128] 54 | params.blocks_2_1.conv.kernel [3,3,128,128] 55 | params.blocks_2_2_0.conv.bias [128] 56 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 57 | params.blocks_2_2_0.group_norm.bias [128] 58 | params.blocks_2_2_0.group_norm.scale [128] 59 | 
params.blocks_2_2_0.pointwise_contract.bias [128] 60 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 61 | params.blocks_2_2_1.conv.bias [128] 62 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 63 | params.blocks_2_2_1.group_norm.bias [128] 64 | params.blocks_2_2_1.group_norm.scale [128] 65 | params.blocks_2_2_1.pointwise_contract.bias [128] 66 | params.blocks_2_2_1.pointwise_contract.kernel [128,128] 67 | params.blocks_2_2_2.conv.bias [128] 68 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 69 | params.blocks_2_2_2.group_norm.bias [128] 70 | params.blocks_2_2_2.group_norm.scale [128] 71 | params.blocks_2_2_2.pointwise_contract.bias [128] 72 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 73 | params.blocks_2_2_3.conv.bias [128] 74 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 75 | params.blocks_2_2_3.group_norm.bias [128] 76 | params.blocks_2_2_3.group_norm.scale [128] 77 | params.blocks_2_2_3.pointwise_contract.bias [128] 78 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 79 | params.blocks_3_1.conv.bias [128] 80 | params.blocks_3_1.conv.kernel [3,3,128,128] 81 | params.blocks_3_2_0.conv.bias [128] 82 | params.blocks_3_2_0.conv.kernel [3,3,128,128] 83 | params.blocks_3_2_0.group_norm.bias [128] 84 | params.blocks_3_2_0.group_norm.scale [128] 85 | params.blocks_3_2_0.pointwise_contract.bias [128] 86 | params.blocks_3_2_0.pointwise_contract.kernel [128,128] 87 | params.blocks_3_2_1.conv.bias [128] 88 | params.blocks_3_2_1.conv.kernel [3,3,128,128] 89 | params.blocks_3_2_1.group_norm.bias [128] 90 | params.blocks_3_2_1.group_norm.scale [128] 91 | params.blocks_3_2_1.pointwise_contract.bias [128] 92 | params.blocks_3_2_1.pointwise_contract.kernel [128,128] 93 | params.blocks_3_2_2.conv.bias [128] 94 | params.blocks_3_2_2.conv.kernel [3,3,128,128] 95 | params.blocks_3_2_2.group_norm.bias [128] 96 | params.blocks_3_2_2.group_norm.scale [128] 97 | params.blocks_3_2_2.pointwise_contract.bias [128] 98 | 
params.blocks_3_2_2.pointwise_contract.kernel [128,128] 99 | params.blocks_3_2_3.conv.bias [128] 100 | params.blocks_3_2_3.conv.kernel [3,3,128,128] 101 | params.blocks_3_2_3.group_norm.bias [128] 102 | params.blocks_3_2_3.group_norm.scale [128] 103 | params.blocks_3_2_3.pointwise_contract.bias [128] 104 | params.blocks_3_2_3.pointwise_contract.kernel [128,128] 105 | params.final_conv.bias [3] 106 | params.final_conv.kernel [3,3,128,3] 107 | params.final_norm.bias [128] 108 | params.final_norm.scale [128] 109 | params.projections.bias [256] 110 | params.projections.kernel [64,256] -------------------------------------------------------------------------------- /components/VAE-NU-EN.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.kernel [3,3,128,128] 2 | params.blocks_0_2_0.conv.kernel [3,3,128,128] 3 | params.blocks_0_2_0.group_norm.bias [128] 4 | params.blocks_0_2_0.group_norm.scale [128] 5 | params.blocks_0_2_0.pointwise_contract.kernel [128,128] 6 | params.blocks_0_2_1.conv.kernel [3,3,128,128] 7 | params.blocks_0_2_1.group_norm.bias [128] 8 | params.blocks_0_2_1.group_norm.scale [128] 9 | params.blocks_0_2_1.pointwise_contract.kernel [128,128] 10 | params.blocks_0_2_2.conv.kernel [3,3,128,128] 11 | params.blocks_0_2_2.group_norm.bias [128] 12 | params.blocks_0_2_2.group_norm.scale [128] 13 | params.blocks_0_2_2.pointwise_contract.kernel [128,128] 14 | params.blocks_0_2_3.conv.kernel [3,3,128,128] 15 | params.blocks_0_2_3.group_norm.bias [128] 16 | params.blocks_0_2_3.group_norm.scale [128] 17 | params.blocks_0_2_3.pointwise_contract.kernel [128,128] 18 | params.blocks_1_1.conv.kernel [3,3,128,128] 19 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 20 | params.blocks_1_2_0.group_norm.bias [128] 21 | params.blocks_1_2_0.group_norm.scale [128] 22 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 23 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 24 | params.blocks_1_2_1.group_norm.bias [128] 
25 | params.blocks_1_2_1.group_norm.scale [128] 26 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 27 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 28 | params.blocks_1_2_2.group_norm.bias [128] 29 | params.blocks_1_2_2.group_norm.scale [128] 30 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 31 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 32 | params.blocks_1_2_3.group_norm.bias [128] 33 | params.blocks_1_2_3.group_norm.scale [128] 34 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 35 | params.blocks_2_1.conv.kernel [3,3,128,128] 36 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 37 | params.blocks_2_2_0.group_norm.bias [128] 38 | params.blocks_2_2_0.group_norm.scale [128] 39 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 40 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 41 | params.blocks_2_2_1.group_norm.bias [128] 42 | params.blocks_2_2_1.group_norm.scale [128] 43 | params.blocks_2_2_1.pointwise_contract.kernel [128,128] 44 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 45 | params.blocks_2_2_2.group_norm.bias [128] 46 | params.blocks_2_2_2.group_norm.scale [128] 47 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 48 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_2_2_3.group_norm.bias [128] 50 | params.blocks_2_2_3.group_norm.scale [128] 51 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 52 | params.blocks_3_1.conv.kernel [3,3,128,256] 53 | params.blocks_3_2_0.conv.kernel [3,3,256,256] 54 | params.blocks_3_2_0.group_norm.bias [256] 55 | params.blocks_3_2_0.group_norm.scale [256] 56 | params.blocks_3_2_0.pointwise_contract.kernel [256,256] 57 | params.blocks_3_2_1.conv.kernel [3,3,256,256] 58 | params.blocks_3_2_1.group_norm.bias [256] 59 | params.blocks_3_2_1.group_norm.scale [256] 60 | params.blocks_3_2_1.pointwise_contract.kernel [256,256] 61 | params.blocks_3_2_2.conv.kernel [3,3,256,256] 62 | params.blocks_3_2_2.group_norm.bias [256] 63 | 
params.blocks_3_2_2.group_norm.scale [256] 64 | params.blocks_3_2_2.pointwise_contract.kernel [256,256] 65 | params.blocks_3_2_3.conv.kernel [3,3,256,256] 66 | params.blocks_3_2_3.group_norm.bias [256] 67 | params.blocks_3_2_3.group_norm.scale [256] 68 | params.blocks_3_2_3.pointwise_contract.kernel [256,256] 69 | params.final_norm.bias [256] 70 | params.final_norm.scale [256] 71 | params.input_conv.bias [128] 72 | params.input_conv.kernel [3,3,3,128] 73 | params.projections.bias [128] 74 | params.projections.kernel [3,3,256,128] -------------------------------------------------------------------------------- /components/VAE-UNK-DE.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.bias [128] 2 | params.blocks_0_1.conv.kernel [3,3,256,128] 3 | params.blocks_0_2_0.conv.bias [256] 4 | params.blocks_0_2_0.conv.kernel [3,3,256,256] 5 | params.blocks_0_2_0.group_norm.bias [256] 6 | params.blocks_0_2_0.group_norm.scale [256] 7 | params.blocks_0_2_0.pointwise_contract.bias [256] 8 | params.blocks_0_2_0.pointwise_contract.kernel [256,256] 9 | params.blocks_0_2_1.conv.bias [256] 10 | params.blocks_0_2_1.conv.kernel [3,3,256,256] 11 | params.blocks_0_2_1.group_norm.bias [256] 12 | params.blocks_0_2_1.group_norm.scale [256] 13 | params.blocks_0_2_1.pointwise_contract.bias [256] 14 | params.blocks_0_2_1.pointwise_contract.kernel [256,256] 15 | params.blocks_0_2_2.conv.bias [256] 16 | params.blocks_0_2_2.conv.kernel [3,3,256,256] 17 | params.blocks_0_2_2.group_norm.bias [256] 18 | params.blocks_0_2_2.group_norm.scale [256] 19 | params.blocks_0_2_2.pointwise_contract.bias [256] 20 | params.blocks_0_2_2.pointwise_contract.kernel [256,256] 21 | params.blocks_0_2_3.conv.bias [256] 22 | params.blocks_0_2_3.conv.kernel [3,3,256,256] 23 | params.blocks_0_2_3.group_norm.bias [256] 24 | params.blocks_0_2_3.group_norm.scale [256] 25 | params.blocks_0_2_3.pointwise_contract.bias [256] 26 | 
params.blocks_0_2_3.pointwise_contract.kernel [256,256] 27 | params.blocks_1_1.conv.bias [128] 28 | params.blocks_1_1.conv.kernel [3,3,128,128] 29 | params.blocks_1_2_0.conv.bias [128] 30 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 31 | params.blocks_1_2_0.group_norm.bias [128] 32 | params.blocks_1_2_0.group_norm.scale [128] 33 | params.blocks_1_2_0.pointwise_contract.bias [128] 34 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 35 | params.blocks_1_2_1.conv.bias [128] 36 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 37 | params.blocks_1_2_1.group_norm.bias [128] 38 | params.blocks_1_2_1.group_norm.scale [128] 39 | params.blocks_1_2_1.pointwise_contract.bias [128] 40 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 41 | params.blocks_1_2_2.conv.bias [128] 42 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 43 | params.blocks_1_2_2.group_norm.bias [128] 44 | params.blocks_1_2_2.group_norm.scale [128] 45 | params.blocks_1_2_2.pointwise_contract.bias [128] 46 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 47 | params.blocks_1_2_3.conv.bias [128] 48 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_1_2_3.group_norm.bias [128] 50 | params.blocks_1_2_3.group_norm.scale [128] 51 | params.blocks_1_2_3.pointwise_contract.bias [128] 52 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 53 | params.blocks_2_1.conv.bias [128] 54 | params.blocks_2_1.conv.kernel [3,3,128,128] 55 | params.blocks_2_2_0.conv.bias [128] 56 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 57 | params.blocks_2_2_0.group_norm.bias [128] 58 | params.blocks_2_2_0.group_norm.scale [128] 59 | params.blocks_2_2_0.pointwise_contract.bias [128] 60 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 61 | params.blocks_2_2_1.conv.bias [128] 62 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 63 | params.blocks_2_2_1.group_norm.bias [128] 64 | params.blocks_2_2_1.group_norm.scale [128] 65 | params.blocks_2_2_1.pointwise_contract.bias [128] 66 | 
params.blocks_2_2_1.pointwise_contract.kernel [128,128] 67 | params.blocks_2_2_2.conv.bias [128] 68 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 69 | params.blocks_2_2_2.group_norm.bias [128] 70 | params.blocks_2_2_2.group_norm.scale [128] 71 | params.blocks_2_2_2.pointwise_contract.bias [128] 72 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 73 | params.blocks_2_2_3.conv.bias [128] 74 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 75 | params.blocks_2_2_3.group_norm.bias [128] 76 | params.blocks_2_2_3.group_norm.scale [128] 77 | params.blocks_2_2_3.pointwise_contract.bias [128] 78 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 79 | params.blocks_3_1.conv.bias [128] 80 | params.blocks_3_1.conv.kernel [3,3,128,128] 81 | params.blocks_3_2_0.conv.bias [128] 82 | params.blocks_3_2_0.conv.kernel [3,3,128,128] 83 | params.blocks_3_2_0.group_norm.bias [128] 84 | params.blocks_3_2_0.group_norm.scale [128] 85 | params.blocks_3_2_0.pointwise_contract.bias [128] 86 | params.blocks_3_2_0.pointwise_contract.kernel [128,128] 87 | params.blocks_3_2_1.conv.bias [128] 88 | params.blocks_3_2_1.conv.kernel [3,3,128,128] 89 | params.blocks_3_2_1.group_norm.bias [128] 90 | params.blocks_3_2_1.group_norm.scale [128] 91 | params.blocks_3_2_1.pointwise_contract.bias [128] 92 | params.blocks_3_2_1.pointwise_contract.kernel [128,128] 93 | params.blocks_3_2_2.conv.bias [128] 94 | params.blocks_3_2_2.conv.kernel [3,3,128,128] 95 | params.blocks_3_2_2.group_norm.bias [128] 96 | params.blocks_3_2_2.group_norm.scale [128] 97 | params.blocks_3_2_2.pointwise_contract.bias [128] 98 | params.blocks_3_2_2.pointwise_contract.kernel [128,128] 99 | params.blocks_3_2_3.conv.bias [128] 100 | params.blocks_3_2_3.conv.kernel [3,3,128,128] 101 | params.blocks_3_2_3.group_norm.bias [128] 102 | params.blocks_3_2_3.group_norm.scale [128] 103 | params.blocks_3_2_3.pointwise_contract.bias [128] 104 | params.blocks_3_2_3.pointwise_contract.kernel [128,128] 105 | params.final_conv.bias [3] 
106 | params.final_conv.kernel [3,3,128,3] 107 | params.final_norm.bias [128] 108 | params.final_norm.scale [128] 109 | params.projections.bias [256] 110 | params.projections.kernel [64,256] -------------------------------------------------------------------------------- /components/VAE-UNK-EN.txt: -------------------------------------------------------------------------------- 1 | params.blocks_0_1.conv.kernel [3,3,128,128] 2 | params.blocks_0_2_0.conv.kernel [3,3,128,128] 3 | params.blocks_0_2_0.group_norm.bias [128] 4 | params.blocks_0_2_0.group_norm.scale [128] 5 | params.blocks_0_2_0.pointwise_contract.kernel [128,128] 6 | params.blocks_0_2_1.conv.kernel [3,3,128,128] 7 | params.blocks_0_2_1.group_norm.bias [128] 8 | params.blocks_0_2_1.group_norm.scale [128] 9 | params.blocks_0_2_1.pointwise_contract.kernel [128,128] 10 | params.blocks_0_2_2.conv.kernel [3,3,128,128] 11 | params.blocks_0_2_2.group_norm.bias [128] 12 | params.blocks_0_2_2.group_norm.scale [128] 13 | params.blocks_0_2_2.pointwise_contract.kernel [128,128] 14 | params.blocks_0_2_3.conv.kernel [3,3,128,128] 15 | params.blocks_0_2_3.group_norm.bias [128] 16 | params.blocks_0_2_3.group_norm.scale [128] 17 | params.blocks_0_2_3.pointwise_contract.kernel [128,128] 18 | params.blocks_1_1.conv.kernel [3,3,128,128] 19 | params.blocks_1_2_0.conv.kernel [3,3,128,128] 20 | params.blocks_1_2_0.group_norm.bias [128] 21 | params.blocks_1_2_0.group_norm.scale [128] 22 | params.blocks_1_2_0.pointwise_contract.kernel [128,128] 23 | params.blocks_1_2_1.conv.kernel [3,3,128,128] 24 | params.blocks_1_2_1.group_norm.bias [128] 25 | params.blocks_1_2_1.group_norm.scale [128] 26 | params.blocks_1_2_1.pointwise_contract.kernel [128,128] 27 | params.blocks_1_2_2.conv.kernel [3,3,128,128] 28 | params.blocks_1_2_2.group_norm.bias [128] 29 | params.blocks_1_2_2.group_norm.scale [128] 30 | params.blocks_1_2_2.pointwise_contract.kernel [128,128] 31 | params.blocks_1_2_3.conv.kernel [3,3,128,128] 32 | 
params.blocks_1_2_3.group_norm.bias [128] 33 | params.blocks_1_2_3.group_norm.scale [128] 34 | params.blocks_1_2_3.pointwise_contract.kernel [128,128] 35 | params.blocks_2_1.conv.kernel [3,3,128,128] 36 | params.blocks_2_2_0.conv.kernel [3,3,128,128] 37 | params.blocks_2_2_0.group_norm.bias [128] 38 | params.blocks_2_2_0.group_norm.scale [128] 39 | params.blocks_2_2_0.pointwise_contract.kernel [128,128] 40 | params.blocks_2_2_1.conv.kernel [3,3,128,128] 41 | params.blocks_2_2_1.group_norm.bias [128] 42 | params.blocks_2_2_1.group_norm.scale [128] 43 | params.blocks_2_2_1.pointwise_contract.kernel [128,128] 44 | params.blocks_2_2_2.conv.kernel [3,3,128,128] 45 | params.blocks_2_2_2.group_norm.bias [128] 46 | params.blocks_2_2_2.group_norm.scale [128] 47 | params.blocks_2_2_2.pointwise_contract.kernel [128,128] 48 | params.blocks_2_2_3.conv.kernel [3,3,128,128] 49 | params.blocks_2_2_3.group_norm.bias [128] 50 | params.blocks_2_2_3.group_norm.scale [128] 51 | params.blocks_2_2_3.pointwise_contract.kernel [128,128] 52 | params.blocks_3_1.conv.kernel [3,3,128,256] 53 | params.blocks_3_2_0.conv.kernel [3,3,256,256] 54 | params.blocks_3_2_0.group_norm.bias [256] 55 | params.blocks_3_2_0.group_norm.scale [256] 56 | params.blocks_3_2_0.pointwise_contract.kernel [256,256] 57 | params.blocks_3_2_1.conv.kernel [3,3,256,256] 58 | params.blocks_3_2_1.group_norm.bias [256] 59 | params.blocks_3_2_1.group_norm.scale [256] 60 | params.blocks_3_2_1.pointwise_contract.kernel [256,256] 61 | params.blocks_3_2_2.conv.kernel [3,3,256,256] 62 | params.blocks_3_2_2.group_norm.bias [256] 63 | params.blocks_3_2_2.group_norm.scale [256] 64 | params.blocks_3_2_2.pointwise_contract.kernel [256,256] 65 | params.blocks_3_2_3.conv.kernel [3,3,256,256] 66 | params.blocks_3_2_3.group_norm.bias [256] 67 | params.blocks_3_2_3.group_norm.scale [256] 68 | params.blocks_3_2_3.pointwise_contract.kernel [256,256] 69 | params.final_norm.bias [256] 70 | params.final_norm.scale [256] 71 | 
params.input_conv.bias [128] 72 | params.input_conv.kernel [3,3,3,128] 73 | params.projections.bias [128] 74 | params.projections.kernel [3,3,256,128] -------------------------------------------------------------------------------- /components/VAE-v1-DS.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,4,3,3] 3 | decoder.conv_norm_out.bias [128] 4 | decoder.conv_norm_out.weight [128] 5 | decoder.conv_out.bias [3] 6 | decoder.conv_out.weight [3,128,3,3] 7 | decoder.mid_block.attentions.0.group_norm.bias [512] 8 | decoder.mid_block.attentions.0.group_norm.weight [512] 9 | decoder.mid_block.attentions.0.to_k.bias [512] 10 | decoder.mid_block.attentions.0.to_k.weight [512,512] 11 | decoder.mid_block.attentions.0.to_out.0.bias [512] 12 | decoder.mid_block.attentions.0.to_out.0.weight [512,512] 13 | decoder.mid_block.attentions.0.to_q.bias [512] 14 | decoder.mid_block.attentions.0.to_q.weight [512,512] 15 | decoder.mid_block.attentions.0.to_v.bias [512] 16 | decoder.mid_block.attentions.0.to_v.weight [512,512] 17 | decoder.mid_block.resnets.0.conv1.bias [512] 18 | decoder.mid_block.resnets.0.conv1.weight [512,512,3,3] 19 | decoder.mid_block.resnets.0.conv2.bias [512] 20 | decoder.mid_block.resnets.0.conv2.weight [512,512,3,3] 21 | decoder.mid_block.resnets.0.norm1.bias [512] 22 | decoder.mid_block.resnets.0.norm1.weight [512] 23 | decoder.mid_block.resnets.0.norm2.bias [512] 24 | decoder.mid_block.resnets.0.norm2.weight [512] 25 | decoder.mid_block.resnets.1.conv1.bias [512] 26 | decoder.mid_block.resnets.1.conv1.weight [512,512,3,3] 27 | decoder.mid_block.resnets.1.conv2.bias [512] 28 | decoder.mid_block.resnets.1.conv2.weight [512,512,3,3] 29 | decoder.mid_block.resnets.1.norm1.bias [512] 30 | decoder.mid_block.resnets.1.norm1.weight [512] 31 | decoder.mid_block.resnets.1.norm2.bias [512] 32 | decoder.mid_block.resnets.1.norm2.weight [512] 33 | 
decoder.up_blocks.0.resnets.0.conv1.bias [512] 34 | decoder.up_blocks.0.resnets.0.conv1.weight [512,512,3,3] 35 | decoder.up_blocks.0.resnets.0.conv2.bias [512] 36 | decoder.up_blocks.0.resnets.0.conv2.weight [512,512,3,3] 37 | decoder.up_blocks.0.resnets.0.norm1.bias [512] 38 | decoder.up_blocks.0.resnets.0.norm1.weight [512] 39 | decoder.up_blocks.0.resnets.0.norm2.bias [512] 40 | decoder.up_blocks.0.resnets.0.norm2.weight [512] 41 | decoder.up_blocks.0.resnets.1.conv1.bias [512] 42 | decoder.up_blocks.0.resnets.1.conv1.weight [512,512,3,3] 43 | decoder.up_blocks.0.resnets.1.conv2.bias [512] 44 | decoder.up_blocks.0.resnets.1.conv2.weight [512,512,3,3] 45 | decoder.up_blocks.0.resnets.1.norm1.bias [512] 46 | decoder.up_blocks.0.resnets.1.norm1.weight [512] 47 | decoder.up_blocks.0.resnets.1.norm2.bias [512] 48 | decoder.up_blocks.0.resnets.1.norm2.weight [512] 49 | decoder.up_blocks.0.resnets.2.conv1.bias [512] 50 | decoder.up_blocks.0.resnets.2.conv1.weight [512,512,3,3] 51 | decoder.up_blocks.0.resnets.2.conv2.bias [512] 52 | decoder.up_blocks.0.resnets.2.conv2.weight [512,512,3,3] 53 | decoder.up_blocks.0.resnets.2.norm1.bias [512] 54 | decoder.up_blocks.0.resnets.2.norm1.weight [512] 55 | decoder.up_blocks.0.resnets.2.norm2.bias [512] 56 | decoder.up_blocks.0.resnets.2.norm2.weight [512] 57 | decoder.up_blocks.0.upsamplers.0.conv.bias [512] 58 | decoder.up_blocks.0.upsamplers.0.conv.weight [512,512,3,3] 59 | decoder.up_blocks.1.resnets.0.conv1.bias [512] 60 | decoder.up_blocks.1.resnets.0.conv1.weight [512,512,3,3] 61 | decoder.up_blocks.1.resnets.0.conv2.bias [512] 62 | decoder.up_blocks.1.resnets.0.conv2.weight [512,512,3,3] 63 | decoder.up_blocks.1.resnets.0.norm1.bias [512] 64 | decoder.up_blocks.1.resnets.0.norm1.weight [512] 65 | decoder.up_blocks.1.resnets.0.norm2.bias [512] 66 | decoder.up_blocks.1.resnets.0.norm2.weight [512] 67 | decoder.up_blocks.1.resnets.1.conv1.bias [512] 68 | decoder.up_blocks.1.resnets.1.conv1.weight [512,512,3,3] 69 | 
decoder.up_blocks.1.resnets.1.conv2.bias [512] 70 | decoder.up_blocks.1.resnets.1.conv2.weight [512,512,3,3] 71 | decoder.up_blocks.1.resnets.1.norm1.bias [512] 72 | decoder.up_blocks.1.resnets.1.norm1.weight [512] 73 | decoder.up_blocks.1.resnets.1.norm2.bias [512] 74 | decoder.up_blocks.1.resnets.1.norm2.weight [512] 75 | decoder.up_blocks.1.resnets.2.conv1.bias [512] 76 | decoder.up_blocks.1.resnets.2.conv1.weight [512,512,3,3] 77 | decoder.up_blocks.1.resnets.2.conv2.bias [512] 78 | decoder.up_blocks.1.resnets.2.conv2.weight [512,512,3,3] 79 | decoder.up_blocks.1.resnets.2.norm1.bias [512] 80 | decoder.up_blocks.1.resnets.2.norm1.weight [512] 81 | decoder.up_blocks.1.resnets.2.norm2.bias [512] 82 | decoder.up_blocks.1.resnets.2.norm2.weight [512] 83 | decoder.up_blocks.1.upsamplers.0.conv.bias [512] 84 | decoder.up_blocks.1.upsamplers.0.conv.weight [512,512,3,3] 85 | decoder.up_blocks.2.resnets.0.conv1.bias [256] 86 | decoder.up_blocks.2.resnets.0.conv1.weight [256,512,3,3] 87 | decoder.up_blocks.2.resnets.0.conv2.bias [256] 88 | decoder.up_blocks.2.resnets.0.conv2.weight [256,256,3,3] 89 | decoder.up_blocks.2.resnets.0.conv_shortcut.bias [256] 90 | decoder.up_blocks.2.resnets.0.conv_shortcut.weight [256,512,1,1] 91 | decoder.up_blocks.2.resnets.0.norm1.bias [512] 92 | decoder.up_blocks.2.resnets.0.norm1.weight [512] 93 | decoder.up_blocks.2.resnets.0.norm2.bias [256] 94 | decoder.up_blocks.2.resnets.0.norm2.weight [256] 95 | decoder.up_blocks.2.resnets.1.conv1.bias [256] 96 | decoder.up_blocks.2.resnets.1.conv1.weight [256,256,3,3] 97 | decoder.up_blocks.2.resnets.1.conv2.bias [256] 98 | decoder.up_blocks.2.resnets.1.conv2.weight [256,256,3,3] 99 | decoder.up_blocks.2.resnets.1.norm1.bias [256] 100 | decoder.up_blocks.2.resnets.1.norm1.weight [256] 101 | decoder.up_blocks.2.resnets.1.norm2.bias [256] 102 | decoder.up_blocks.2.resnets.1.norm2.weight [256] 103 | decoder.up_blocks.2.resnets.2.conv1.bias [256] 104 | decoder.up_blocks.2.resnets.2.conv1.weight 
[256,256,3,3] 105 | decoder.up_blocks.2.resnets.2.conv2.bias [256] 106 | decoder.up_blocks.2.resnets.2.conv2.weight [256,256,3,3] 107 | decoder.up_blocks.2.resnets.2.norm1.bias [256] 108 | decoder.up_blocks.2.resnets.2.norm1.weight [256] 109 | decoder.up_blocks.2.resnets.2.norm2.bias [256] 110 | decoder.up_blocks.2.resnets.2.norm2.weight [256] 111 | decoder.up_blocks.2.upsamplers.0.conv.bias [256] 112 | decoder.up_blocks.2.upsamplers.0.conv.weight [256,256,3,3] 113 | decoder.up_blocks.3.resnets.0.conv1.bias [128] 114 | decoder.up_blocks.3.resnets.0.conv1.weight [128,256,3,3] 115 | decoder.up_blocks.3.resnets.0.conv2.bias [128] 116 | decoder.up_blocks.3.resnets.0.conv2.weight [128,128,3,3] 117 | decoder.up_blocks.3.resnets.0.conv_shortcut.bias [128] 118 | decoder.up_blocks.3.resnets.0.conv_shortcut.weight [128,256,1,1] 119 | decoder.up_blocks.3.resnets.0.norm1.bias [256] 120 | decoder.up_blocks.3.resnets.0.norm1.weight [256] 121 | decoder.up_blocks.3.resnets.0.norm2.bias [128] 122 | decoder.up_blocks.3.resnets.0.norm2.weight [128] 123 | decoder.up_blocks.3.resnets.1.conv1.bias [128] 124 | decoder.up_blocks.3.resnets.1.conv1.weight [128,128,3,3] 125 | decoder.up_blocks.3.resnets.1.conv2.bias [128] 126 | decoder.up_blocks.3.resnets.1.conv2.weight [128,128,3,3] 127 | decoder.up_blocks.3.resnets.1.norm1.bias [128] 128 | decoder.up_blocks.3.resnets.1.norm1.weight [128] 129 | decoder.up_blocks.3.resnets.1.norm2.bias [128] 130 | decoder.up_blocks.3.resnets.1.norm2.weight [128] 131 | decoder.up_blocks.3.resnets.2.conv1.bias [128] 132 | decoder.up_blocks.3.resnets.2.conv1.weight [128,128,3,3] 133 | decoder.up_blocks.3.resnets.2.conv2.bias [128] 134 | decoder.up_blocks.3.resnets.2.conv2.weight [128,128,3,3] 135 | decoder.up_blocks.3.resnets.2.norm1.bias [128] 136 | decoder.up_blocks.3.resnets.2.norm1.weight [128] 137 | decoder.up_blocks.3.resnets.2.norm2.bias [128] 138 | decoder.up_blocks.3.resnets.2.norm2.weight [128] 139 | encoder.conv_in.bias [128] 140 | 
encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_norm_out.bias [512] 142 | encoder.conv_norm_out.weight [512] 143 | encoder.conv_out.bias [8] 144 | encoder.conv_out.weight [8,512,3,3] 145 | encoder.down_blocks.0.downsamplers.0.conv.bias [128] 146 | encoder.down_blocks.0.downsamplers.0.conv.weight [128,128,3,3] 147 | encoder.down_blocks.0.resnets.0.conv1.bias [128] 148 | encoder.down_blocks.0.resnets.0.conv1.weight [128,128,3,3] 149 | encoder.down_blocks.0.resnets.0.conv2.bias [128] 150 | encoder.down_blocks.0.resnets.0.conv2.weight [128,128,3,3] 151 | encoder.down_blocks.0.resnets.0.norm1.bias [128] 152 | encoder.down_blocks.0.resnets.0.norm1.weight [128] 153 | encoder.down_blocks.0.resnets.0.norm2.bias [128] 154 | encoder.down_blocks.0.resnets.0.norm2.weight [128] 155 | encoder.down_blocks.0.resnets.1.conv1.bias [128] 156 | encoder.down_blocks.0.resnets.1.conv1.weight [128,128,3,3] 157 | encoder.down_blocks.0.resnets.1.conv2.bias [128] 158 | encoder.down_blocks.0.resnets.1.conv2.weight [128,128,3,3] 159 | encoder.down_blocks.0.resnets.1.norm1.bias [128] 160 | encoder.down_blocks.0.resnets.1.norm1.weight [128] 161 | encoder.down_blocks.0.resnets.1.norm2.bias [128] 162 | encoder.down_blocks.0.resnets.1.norm2.weight [128] 163 | encoder.down_blocks.1.downsamplers.0.conv.bias [256] 164 | encoder.down_blocks.1.downsamplers.0.conv.weight [256,256,3,3] 165 | encoder.down_blocks.1.resnets.0.conv1.bias [256] 166 | encoder.down_blocks.1.resnets.0.conv1.weight [256,128,3,3] 167 | encoder.down_blocks.1.resnets.0.conv2.bias [256] 168 | encoder.down_blocks.1.resnets.0.conv2.weight [256,256,3,3] 169 | encoder.down_blocks.1.resnets.0.conv_shortcut.bias [256] 170 | encoder.down_blocks.1.resnets.0.conv_shortcut.weight [256,128,1,1] 171 | encoder.down_blocks.1.resnets.0.norm1.bias [128] 172 | encoder.down_blocks.1.resnets.0.norm1.weight [128] 173 | encoder.down_blocks.1.resnets.0.norm2.bias [256] 174 | encoder.down_blocks.1.resnets.0.norm2.weight [256] 175 | 
encoder.down_blocks.1.resnets.1.conv1.bias [256] 176 | encoder.down_blocks.1.resnets.1.conv1.weight [256,256,3,3] 177 | encoder.down_blocks.1.resnets.1.conv2.bias [256] 178 | encoder.down_blocks.1.resnets.1.conv2.weight [256,256,3,3] 179 | encoder.down_blocks.1.resnets.1.norm1.bias [256] 180 | encoder.down_blocks.1.resnets.1.norm1.weight [256] 181 | encoder.down_blocks.1.resnets.1.norm2.bias [256] 182 | encoder.down_blocks.1.resnets.1.norm2.weight [256] 183 | encoder.down_blocks.2.downsamplers.0.conv.bias [512] 184 | encoder.down_blocks.2.downsamplers.0.conv.weight [512,512,3,3] 185 | encoder.down_blocks.2.resnets.0.conv1.bias [512] 186 | encoder.down_blocks.2.resnets.0.conv1.weight [512,256,3,3] 187 | encoder.down_blocks.2.resnets.0.conv2.bias [512] 188 | encoder.down_blocks.2.resnets.0.conv2.weight [512,512,3,3] 189 | encoder.down_blocks.2.resnets.0.conv_shortcut.bias [512] 190 | encoder.down_blocks.2.resnets.0.conv_shortcut.weight [512,256,1,1] 191 | encoder.down_blocks.2.resnets.0.norm1.bias [256] 192 | encoder.down_blocks.2.resnets.0.norm1.weight [256] 193 | encoder.down_blocks.2.resnets.0.norm2.bias [512] 194 | encoder.down_blocks.2.resnets.0.norm2.weight [512] 195 | encoder.down_blocks.2.resnets.1.conv1.bias [512] 196 | encoder.down_blocks.2.resnets.1.conv1.weight [512,512,3,3] 197 | encoder.down_blocks.2.resnets.1.conv2.bias [512] 198 | encoder.down_blocks.2.resnets.1.conv2.weight [512,512,3,3] 199 | encoder.down_blocks.2.resnets.1.norm1.bias [512] 200 | encoder.down_blocks.2.resnets.1.norm1.weight [512] 201 | encoder.down_blocks.2.resnets.1.norm2.bias [512] 202 | encoder.down_blocks.2.resnets.1.norm2.weight [512] 203 | encoder.down_blocks.3.resnets.0.conv1.bias [512] 204 | encoder.down_blocks.3.resnets.0.conv1.weight [512,512,3,3] 205 | encoder.down_blocks.3.resnets.0.conv2.bias [512] 206 | encoder.down_blocks.3.resnets.0.conv2.weight [512,512,3,3] 207 | encoder.down_blocks.3.resnets.0.norm1.bias [512] 208 | encoder.down_blocks.3.resnets.0.norm1.weight 
[512] 209 | encoder.down_blocks.3.resnets.0.norm2.bias [512] 210 | encoder.down_blocks.3.resnets.0.norm2.weight [512] 211 | encoder.down_blocks.3.resnets.1.conv1.bias [512] 212 | encoder.down_blocks.3.resnets.1.conv1.weight [512,512,3,3] 213 | encoder.down_blocks.3.resnets.1.conv2.bias [512] 214 | encoder.down_blocks.3.resnets.1.conv2.weight [512,512,3,3] 215 | encoder.down_blocks.3.resnets.1.norm1.bias [512] 216 | encoder.down_blocks.3.resnets.1.norm1.weight [512] 217 | encoder.down_blocks.3.resnets.1.norm2.bias [512] 218 | encoder.down_blocks.3.resnets.1.norm2.weight [512] 219 | encoder.mid_block.attentions.0.group_norm.bias [512] 220 | encoder.mid_block.attentions.0.group_norm.weight [512] 221 | encoder.mid_block.attentions.0.to_k.bias [512] 222 | encoder.mid_block.attentions.0.to_k.weight [512,512] 223 | encoder.mid_block.attentions.0.to_out.0.bias [512] 224 | encoder.mid_block.attentions.0.to_out.0.weight [512,512] 225 | encoder.mid_block.attentions.0.to_q.bias [512] 226 | encoder.mid_block.attentions.0.to_q.weight [512,512] 227 | encoder.mid_block.attentions.0.to_v.bias [512] 228 | encoder.mid_block.attentions.0.to_v.weight [512,512] 229 | encoder.mid_block.resnets.0.conv1.bias [512] 230 | encoder.mid_block.resnets.0.conv1.weight [512,512,3,3] 231 | encoder.mid_block.resnets.0.conv2.bias [512] 232 | encoder.mid_block.resnets.0.conv2.weight [512,512,3,3] 233 | encoder.mid_block.resnets.0.norm1.bias [512] 234 | encoder.mid_block.resnets.0.norm1.weight [512] 235 | encoder.mid_block.resnets.0.norm2.bias [512] 236 | encoder.mid_block.resnets.0.norm2.weight [512] 237 | encoder.mid_block.resnets.1.conv1.bias [512] 238 | encoder.mid_block.resnets.1.conv1.weight [512,512,3,3] 239 | encoder.mid_block.resnets.1.conv2.bias [512] 240 | encoder.mid_block.resnets.1.conv2.weight [512,512,3,3] 241 | encoder.mid_block.resnets.1.norm1.bias [512] 242 | encoder.mid_block.resnets.1.norm1.weight [512] 243 | encoder.mid_block.resnets.1.norm2.bias [512] 244 | 
encoder.mid_block.resnets.1.norm2.weight [512] 245 | post_quant_conv.bias [4] 246 | post_quant_conv.weight [4,4,1,1] 247 | quant_conv.bias [8] 248 | quant_conv.weight [8,8,1,1] -------------------------------------------------------------------------------- /components/VAE-v1-SD.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,4,3,3] 3 | decoder.conv_out.bias [3] 4 | decoder.conv_out.weight [3,128,3,3] 5 | decoder.mid.attn_1.k.bias [512] 6 | decoder.mid.attn_1.k.weight [512,512,1,1] 7 | decoder.mid.attn_1.norm.bias [512] 8 | decoder.mid.attn_1.norm.weight [512] 9 | decoder.mid.attn_1.proj_out.bias [512] 10 | decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | decoder.mid.attn_1.q.bias [512] 12 | decoder.mid.attn_1.q.weight [512,512,1,1] 13 | decoder.mid.attn_1.v.bias [512] 14 | decoder.mid.attn_1.v.weight [512,512,1,1] 15 | decoder.mid.block_1.conv1.bias [512] 16 | decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | decoder.mid.block_1.conv2.bias [512] 18 | decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | decoder.mid.block_1.norm1.bias [512] 20 | decoder.mid.block_1.norm1.weight [512] 21 | decoder.mid.block_1.norm2.bias [512] 22 | decoder.mid.block_1.norm2.weight [512] 23 | decoder.mid.block_2.conv1.bias [512] 24 | decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | decoder.mid.block_2.conv2.bias [512] 26 | decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | decoder.mid.block_2.norm1.bias [512] 28 | decoder.mid.block_2.norm1.weight [512] 29 | decoder.mid.block_2.norm2.bias [512] 30 | decoder.mid.block_2.norm2.weight [512] 31 | decoder.norm_out.bias [128] 32 | decoder.norm_out.weight [128] 33 | decoder.up.0.block.0.conv1.bias [128] 34 | decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | decoder.up.0.block.0.conv2.bias [128] 36 | decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | decoder.up.0.block.0.nin_shortcut.bias [128] 38 | 
decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | decoder.up.0.block.0.norm1.bias [256] 40 | decoder.up.0.block.0.norm1.weight [256] 41 | decoder.up.0.block.0.norm2.bias [128] 42 | decoder.up.0.block.0.norm2.weight [128] 43 | decoder.up.0.block.1.conv1.bias [128] 44 | decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | decoder.up.0.block.1.conv2.bias [128] 46 | decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | decoder.up.0.block.1.norm1.bias [128] 48 | decoder.up.0.block.1.norm1.weight [128] 49 | decoder.up.0.block.1.norm2.bias [128] 50 | decoder.up.0.block.1.norm2.weight [128] 51 | decoder.up.0.block.2.conv1.bias [128] 52 | decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | decoder.up.0.block.2.conv2.bias [128] 54 | decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | decoder.up.0.block.2.norm1.bias [128] 56 | decoder.up.0.block.2.norm1.weight [128] 57 | decoder.up.0.block.2.norm2.bias [128] 58 | decoder.up.0.block.2.norm2.weight [128] 59 | decoder.up.1.block.0.conv1.bias [256] 60 | decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | decoder.up.1.block.0.conv2.bias [256] 62 | decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | decoder.up.1.block.0.nin_shortcut.bias [256] 64 | decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | decoder.up.1.block.0.norm1.bias [512] 66 | decoder.up.1.block.0.norm1.weight [512] 67 | decoder.up.1.block.0.norm2.bias [256] 68 | decoder.up.1.block.0.norm2.weight [256] 69 | decoder.up.1.block.1.conv1.bias [256] 70 | decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | decoder.up.1.block.1.conv2.bias [256] 72 | decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | decoder.up.1.block.1.norm1.bias [256] 74 | decoder.up.1.block.1.norm1.weight [256] 75 | decoder.up.1.block.1.norm2.bias [256] 76 | decoder.up.1.block.1.norm2.weight [256] 77 | decoder.up.1.block.2.conv1.bias [256] 78 | decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | decoder.up.1.block.2.conv2.bias [256] 80 | decoder.up.1.block.2.conv2.weight 
[256,256,3,3] 81 | decoder.up.1.block.2.norm1.bias [256] 82 | decoder.up.1.block.2.norm1.weight [256] 83 | decoder.up.1.block.2.norm2.bias [256] 84 | decoder.up.1.block.2.norm2.weight [256] 85 | decoder.up.1.upsample.conv.bias [256] 86 | decoder.up.1.upsample.conv.weight [256,256,3,3] 87 | decoder.up.2.block.0.conv1.bias [512] 88 | decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | decoder.up.2.block.0.conv2.bias [512] 90 | decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | decoder.up.2.block.0.norm1.bias [512] 92 | decoder.up.2.block.0.norm1.weight [512] 93 | decoder.up.2.block.0.norm2.bias [512] 94 | decoder.up.2.block.0.norm2.weight [512] 95 | decoder.up.2.block.1.conv1.bias [512] 96 | decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | decoder.up.2.block.1.conv2.bias [512] 98 | decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | decoder.up.2.block.1.norm1.bias [512] 100 | decoder.up.2.block.1.norm1.weight [512] 101 | decoder.up.2.block.1.norm2.bias [512] 102 | decoder.up.2.block.1.norm2.weight [512] 103 | decoder.up.2.block.2.conv1.bias [512] 104 | decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | decoder.up.2.block.2.conv2.bias [512] 106 | decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | decoder.up.2.block.2.norm1.bias [512] 108 | decoder.up.2.block.2.norm1.weight [512] 109 | decoder.up.2.block.2.norm2.bias [512] 110 | decoder.up.2.block.2.norm2.weight [512] 111 | decoder.up.2.upsample.conv.bias [512] 112 | decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | decoder.up.3.block.0.conv1.bias [512] 114 | decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | decoder.up.3.block.0.conv2.bias [512] 116 | decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | decoder.up.3.block.0.norm1.bias [512] 118 | decoder.up.3.block.0.norm1.weight [512] 119 | decoder.up.3.block.0.norm2.bias [512] 120 | decoder.up.3.block.0.norm2.weight [512] 121 | decoder.up.3.block.1.conv1.bias [512] 122 | decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | 
decoder.up.3.block.1.conv2.bias [512] 124 | decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | decoder.up.3.block.1.norm1.bias [512] 126 | decoder.up.3.block.1.norm1.weight [512] 127 | decoder.up.3.block.1.norm2.bias [512] 128 | decoder.up.3.block.1.norm2.weight [512] 129 | decoder.up.3.block.2.conv1.bias [512] 130 | decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | decoder.up.3.block.2.conv2.bias [512] 132 | decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | decoder.up.3.block.2.norm1.bias [512] 134 | decoder.up.3.block.2.norm1.weight [512] 135 | decoder.up.3.block.2.norm2.bias [512] 136 | decoder.up.3.block.2.norm2.weight [512] 137 | decoder.up.3.upsample.conv.bias [512] 138 | decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | encoder.conv_in.bias [128] 140 | encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_out.bias [8] 142 | encoder.conv_out.weight [8,512,3,3] 143 | encoder.down.0.block.0.conv1.bias [128] 144 | encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | encoder.down.0.block.0.conv2.bias [128] 146 | encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | encoder.down.0.block.0.norm1.bias [128] 148 | encoder.down.0.block.0.norm1.weight [128] 149 | encoder.down.0.block.0.norm2.bias [128] 150 | encoder.down.0.block.0.norm2.weight [128] 151 | encoder.down.0.block.1.conv1.bias [128] 152 | encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | encoder.down.0.block.1.conv2.bias [128] 154 | encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | encoder.down.0.block.1.norm1.bias [128] 156 | encoder.down.0.block.1.norm1.weight [128] 157 | encoder.down.0.block.1.norm2.bias [128] 158 | encoder.down.0.block.1.norm2.weight [128] 159 | encoder.down.0.downsample.conv.bias [128] 160 | encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | encoder.down.1.block.0.conv1.bias [256] 162 | encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | encoder.down.1.block.0.conv2.bias [256] 164 | encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | 
encoder.down.1.block.0.nin_shortcut.bias [256] 166 | encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | encoder.down.1.block.0.norm1.bias [128] 168 | encoder.down.1.block.0.norm1.weight [128] 169 | encoder.down.1.block.0.norm2.bias [256] 170 | encoder.down.1.block.0.norm2.weight [256] 171 | encoder.down.1.block.1.conv1.bias [256] 172 | encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | encoder.down.1.block.1.conv2.bias [256] 174 | encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | encoder.down.1.block.1.norm1.bias [256] 176 | encoder.down.1.block.1.norm1.weight [256] 177 | encoder.down.1.block.1.norm2.bias [256] 178 | encoder.down.1.block.1.norm2.weight [256] 179 | encoder.down.1.downsample.conv.bias [256] 180 | encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | encoder.down.2.block.0.conv1.bias [512] 182 | encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | encoder.down.2.block.0.conv2.bias [512] 184 | encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | encoder.down.2.block.0.nin_shortcut.bias [512] 186 | encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | encoder.down.2.block.0.norm1.bias [256] 188 | encoder.down.2.block.0.norm1.weight [256] 189 | encoder.down.2.block.0.norm2.bias [512] 190 | encoder.down.2.block.0.norm2.weight [512] 191 | encoder.down.2.block.1.conv1.bias [512] 192 | encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | encoder.down.2.block.1.conv2.bias [512] 194 | encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | encoder.down.2.block.1.norm1.bias [512] 196 | encoder.down.2.block.1.norm1.weight [512] 197 | encoder.down.2.block.1.norm2.bias [512] 198 | encoder.down.2.block.1.norm2.weight [512] 199 | encoder.down.2.downsample.conv.bias [512] 200 | encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | encoder.down.3.block.0.conv1.bias [512] 202 | encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | encoder.down.3.block.0.conv2.bias [512] 204 | encoder.down.3.block.0.conv2.weight 
[512,512,3,3] 205 | encoder.down.3.block.0.norm1.bias [512] 206 | encoder.down.3.block.0.norm1.weight [512] 207 | encoder.down.3.block.0.norm2.bias [512] 208 | encoder.down.3.block.0.norm2.weight [512] 209 | encoder.down.3.block.1.conv1.bias [512] 210 | encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | encoder.down.3.block.1.conv2.bias [512] 212 | encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | encoder.down.3.block.1.norm1.bias [512] 214 | encoder.down.3.block.1.norm1.weight [512] 215 | encoder.down.3.block.1.norm2.bias [512] 216 | encoder.down.3.block.1.norm2.weight [512] 217 | encoder.mid.attn_1.k.bias [512] 218 | encoder.mid.attn_1.k.weight [512,512,1,1] 219 | encoder.mid.attn_1.norm.bias [512] 220 | encoder.mid.attn_1.norm.weight [512] 221 | encoder.mid.attn_1.proj_out.bias [512] 222 | encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | encoder.mid.attn_1.q.bias [512] 224 | encoder.mid.attn_1.q.weight [512,512,1,1] 225 | encoder.mid.attn_1.v.bias [512] 226 | encoder.mid.attn_1.v.weight [512,512,1,1] 227 | encoder.mid.block_1.conv1.bias [512] 228 | encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | encoder.mid.block_1.conv2.bias [512] 230 | encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | encoder.mid.block_1.norm1.bias [512] 232 | encoder.mid.block_1.norm1.weight [512] 233 | encoder.mid.block_1.norm2.bias [512] 234 | encoder.mid.block_1.norm2.weight [512] 235 | encoder.mid.block_2.conv1.bias [512] 236 | encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | encoder.mid.block_2.conv2.bias [512] 238 | encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | encoder.mid.block_2.norm1.bias [512] 240 | encoder.mid.block_2.norm1.weight [512] 241 | encoder.mid.block_2.norm2.bias [512] 242 | encoder.mid.block_2.norm2.weight [512] 243 | encoder.norm_out.bias [512] 244 | encoder.norm_out.weight [512] 245 | post_quant_conv.bias [4] 246 | post_quant_conv.weight [4,4,1,1] 247 | quant_conv.bias [8] 248 | quant_conv.weight [8,8,1,1] 
-------------------------------------------------------------------------------- /components/VAE-v3-SD.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,16,3,3] 3 | decoder.conv_out.bias [3] 4 | decoder.conv_out.weight [3,128,3,3] 5 | decoder.mid.attn_1.k.bias [512] 6 | decoder.mid.attn_1.k.weight [512,512,1,1] 7 | decoder.mid.attn_1.norm.bias [512] 8 | decoder.mid.attn_1.norm.weight [512] 9 | decoder.mid.attn_1.proj_out.bias [512] 10 | decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | decoder.mid.attn_1.q.bias [512] 12 | decoder.mid.attn_1.q.weight [512,512,1,1] 13 | decoder.mid.attn_1.v.bias [512] 14 | decoder.mid.attn_1.v.weight [512,512,1,1] 15 | decoder.mid.block_1.conv1.bias [512] 16 | decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | decoder.mid.block_1.conv2.bias [512] 18 | decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | decoder.mid.block_1.norm1.bias [512] 20 | decoder.mid.block_1.norm1.weight [512] 21 | decoder.mid.block_1.norm2.bias [512] 22 | decoder.mid.block_1.norm2.weight [512] 23 | decoder.mid.block_2.conv1.bias [512] 24 | decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | decoder.mid.block_2.conv2.bias [512] 26 | decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | decoder.mid.block_2.norm1.bias [512] 28 | decoder.mid.block_2.norm1.weight [512] 29 | decoder.mid.block_2.norm2.bias [512] 30 | decoder.mid.block_2.norm2.weight [512] 31 | decoder.norm_out.bias [128] 32 | decoder.norm_out.weight [128] 33 | decoder.up.0.block.0.conv1.bias [128] 34 | decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | decoder.up.0.block.0.conv2.bias [128] 36 | decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | decoder.up.0.block.0.nin_shortcut.bias [128] 38 | decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | decoder.up.0.block.0.norm1.bias [256] 40 | decoder.up.0.block.0.norm1.weight [256] 41 | decoder.up.0.block.0.norm2.bias [128] 42 | 
decoder.up.0.block.0.norm2.weight [128] 43 | decoder.up.0.block.1.conv1.bias [128] 44 | decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | decoder.up.0.block.1.conv2.bias [128] 46 | decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | decoder.up.0.block.1.norm1.bias [128] 48 | decoder.up.0.block.1.norm1.weight [128] 49 | decoder.up.0.block.1.norm2.bias [128] 50 | decoder.up.0.block.1.norm2.weight [128] 51 | decoder.up.0.block.2.conv1.bias [128] 52 | decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | decoder.up.0.block.2.conv2.bias [128] 54 | decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | decoder.up.0.block.2.norm1.bias [128] 56 | decoder.up.0.block.2.norm1.weight [128] 57 | decoder.up.0.block.2.norm2.bias [128] 58 | decoder.up.0.block.2.norm2.weight [128] 59 | decoder.up.1.block.0.conv1.bias [256] 60 | decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | decoder.up.1.block.0.conv2.bias [256] 62 | decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | decoder.up.1.block.0.nin_shortcut.bias [256] 64 | decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | decoder.up.1.block.0.norm1.bias [512] 66 | decoder.up.1.block.0.norm1.weight [512] 67 | decoder.up.1.block.0.norm2.bias [256] 68 | decoder.up.1.block.0.norm2.weight [256] 69 | decoder.up.1.block.1.conv1.bias [256] 70 | decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | decoder.up.1.block.1.conv2.bias [256] 72 | decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | decoder.up.1.block.1.norm1.bias [256] 74 | decoder.up.1.block.1.norm1.weight [256] 75 | decoder.up.1.block.1.norm2.bias [256] 76 | decoder.up.1.block.1.norm2.weight [256] 77 | decoder.up.1.block.2.conv1.bias [256] 78 | decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | decoder.up.1.block.2.conv2.bias [256] 80 | decoder.up.1.block.2.conv2.weight [256,256,3,3] 81 | decoder.up.1.block.2.norm1.bias [256] 82 | decoder.up.1.block.2.norm1.weight [256] 83 | decoder.up.1.block.2.norm2.bias [256] 84 | decoder.up.1.block.2.norm2.weight [256] 85 | 
decoder.up.1.upsample.conv.bias [256] 86 | decoder.up.1.upsample.conv.weight [256,256,3,3] 87 | decoder.up.2.block.0.conv1.bias [512] 88 | decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | decoder.up.2.block.0.conv2.bias [512] 90 | decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | decoder.up.2.block.0.norm1.bias [512] 92 | decoder.up.2.block.0.norm1.weight [512] 93 | decoder.up.2.block.0.norm2.bias [512] 94 | decoder.up.2.block.0.norm2.weight [512] 95 | decoder.up.2.block.1.conv1.bias [512] 96 | decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | decoder.up.2.block.1.conv2.bias [512] 98 | decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | decoder.up.2.block.1.norm1.bias [512] 100 | decoder.up.2.block.1.norm1.weight [512] 101 | decoder.up.2.block.1.norm2.bias [512] 102 | decoder.up.2.block.1.norm2.weight [512] 103 | decoder.up.2.block.2.conv1.bias [512] 104 | decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | decoder.up.2.block.2.conv2.bias [512] 106 | decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | decoder.up.2.block.2.norm1.bias [512] 108 | decoder.up.2.block.2.norm1.weight [512] 109 | decoder.up.2.block.2.norm2.bias [512] 110 | decoder.up.2.block.2.norm2.weight [512] 111 | decoder.up.2.upsample.conv.bias [512] 112 | decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | decoder.up.3.block.0.conv1.bias [512] 114 | decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | decoder.up.3.block.0.conv2.bias [512] 116 | decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | decoder.up.3.block.0.norm1.bias [512] 118 | decoder.up.3.block.0.norm1.weight [512] 119 | decoder.up.3.block.0.norm2.bias [512] 120 | decoder.up.3.block.0.norm2.weight [512] 121 | decoder.up.3.block.1.conv1.bias [512] 122 | decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | decoder.up.3.block.1.conv2.bias [512] 124 | decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | decoder.up.3.block.1.norm1.bias [512] 126 | decoder.up.3.block.1.norm1.weight [512] 127 | 
decoder.up.3.block.1.norm2.bias [512] 128 | decoder.up.3.block.1.norm2.weight [512] 129 | decoder.up.3.block.2.conv1.bias [512] 130 | decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | decoder.up.3.block.2.conv2.bias [512] 132 | decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | decoder.up.3.block.2.norm1.bias [512] 134 | decoder.up.3.block.2.norm1.weight [512] 135 | decoder.up.3.block.2.norm2.bias [512] 136 | decoder.up.3.block.2.norm2.weight [512] 137 | decoder.up.3.upsample.conv.bias [512] 138 | decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | encoder.conv_in.bias [128] 140 | encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_out.bias [32] 142 | encoder.conv_out.weight [32,512,3,3] 143 | encoder.down.0.block.0.conv1.bias [128] 144 | encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | encoder.down.0.block.0.conv2.bias [128] 146 | encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | encoder.down.0.block.0.norm1.bias [128] 148 | encoder.down.0.block.0.norm1.weight [128] 149 | encoder.down.0.block.0.norm2.bias [128] 150 | encoder.down.0.block.0.norm2.weight [128] 151 | encoder.down.0.block.1.conv1.bias [128] 152 | encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | encoder.down.0.block.1.conv2.bias [128] 154 | encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | encoder.down.0.block.1.norm1.bias [128] 156 | encoder.down.0.block.1.norm1.weight [128] 157 | encoder.down.0.block.1.norm2.bias [128] 158 | encoder.down.0.block.1.norm2.weight [128] 159 | encoder.down.0.downsample.conv.bias [128] 160 | encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | encoder.down.1.block.0.conv1.bias [256] 162 | encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | encoder.down.1.block.0.conv2.bias [256] 164 | encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | encoder.down.1.block.0.nin_shortcut.bias [256] 166 | encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | encoder.down.1.block.0.norm1.bias [128] 168 | 
encoder.down.1.block.0.norm1.weight [128] 169 | encoder.down.1.block.0.norm2.bias [256] 170 | encoder.down.1.block.0.norm2.weight [256] 171 | encoder.down.1.block.1.conv1.bias [256] 172 | encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | encoder.down.1.block.1.conv2.bias [256] 174 | encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | encoder.down.1.block.1.norm1.bias [256] 176 | encoder.down.1.block.1.norm1.weight [256] 177 | encoder.down.1.block.1.norm2.bias [256] 178 | encoder.down.1.block.1.norm2.weight [256] 179 | encoder.down.1.downsample.conv.bias [256] 180 | encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | encoder.down.2.block.0.conv1.bias [512] 182 | encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | encoder.down.2.block.0.conv2.bias [512] 184 | encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | encoder.down.2.block.0.nin_shortcut.bias [512] 186 | encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | encoder.down.2.block.0.norm1.bias [256] 188 | encoder.down.2.block.0.norm1.weight [256] 189 | encoder.down.2.block.0.norm2.bias [512] 190 | encoder.down.2.block.0.norm2.weight [512] 191 | encoder.down.2.block.1.conv1.bias [512] 192 | encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | encoder.down.2.block.1.conv2.bias [512] 194 | encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | encoder.down.2.block.1.norm1.bias [512] 196 | encoder.down.2.block.1.norm1.weight [512] 197 | encoder.down.2.block.1.norm2.bias [512] 198 | encoder.down.2.block.1.norm2.weight [512] 199 | encoder.down.2.downsample.conv.bias [512] 200 | encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | encoder.down.3.block.0.conv1.bias [512] 202 | encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | encoder.down.3.block.0.conv2.bias [512] 204 | encoder.down.3.block.0.conv2.weight [512,512,3,3] 205 | encoder.down.3.block.0.norm1.bias [512] 206 | encoder.down.3.block.0.norm1.weight [512] 207 | encoder.down.3.block.0.norm2.bias [512] 208 | 
encoder.down.3.block.0.norm2.weight [512] 209 | encoder.down.3.block.1.conv1.bias [512] 210 | encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | encoder.down.3.block.1.conv2.bias [512] 212 | encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | encoder.down.3.block.1.norm1.bias [512] 214 | encoder.down.3.block.1.norm1.weight [512] 215 | encoder.down.3.block.1.norm2.bias [512] 216 | encoder.down.3.block.1.norm2.weight [512] 217 | encoder.mid.attn_1.k.bias [512] 218 | encoder.mid.attn_1.k.weight [512,512,1,1] 219 | encoder.mid.attn_1.norm.bias [512] 220 | encoder.mid.attn_1.norm.weight [512] 221 | encoder.mid.attn_1.proj_out.bias [512] 222 | encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | encoder.mid.attn_1.q.bias [512] 224 | encoder.mid.attn_1.q.weight [512,512,1,1] 225 | encoder.mid.attn_1.v.bias [512] 226 | encoder.mid.attn_1.v.weight [512,512,1,1] 227 | encoder.mid.block_1.conv1.bias [512] 228 | encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | encoder.mid.block_1.conv2.bias [512] 230 | encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | encoder.mid.block_1.norm1.bias [512] 232 | encoder.mid.block_1.norm1.weight [512] 233 | encoder.mid.block_1.norm2.bias [512] 234 | encoder.mid.block_1.norm2.weight [512] 235 | encoder.mid.block_2.conv1.bias [512] 236 | encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | encoder.mid.block_2.conv2.bias [512] 238 | encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | encoder.mid.block_2.norm1.bias [512] 240 | encoder.mid.block_2.norm1.weight [512] 241 | encoder.mid.block_2.norm2.bias [512] 242 | encoder.mid.block_2.norm2.weight [512] 243 | encoder.norm_out.bias [512] 244 | encoder.norm_out.weight [512] -------------------------------------------------------------------------------- /components/VAE-vX-D.txt: -------------------------------------------------------------------------------- 1 | post_quant_conv.bias [4] 2 | quant_conv.weight [8,8,1,1] 3 | quant_conv.bias [8] 4 | post_quant_conv.weight [4,4,1,1] 
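The component templates above all share one line format: a tensor name followed by its shape in brackets. As a hypothetical sketch (this is not necessarily how `toolkit.py` actually loads them), such a template can be parsed into a name-to-shape map and used to test whether a candidate state dict contains that component:

```python
# Hypothetical sketch, not the toolkit's actual loader: parse a component
# template (e.g. components/VAE-vX-D.txt), where each line is a tensor name
# followed by its shape in brackets, then check whether a candidate state
# dict (name -> shape) contains every listed tensor with a matching shape.
import re

LINE = re.compile(r"^(?P<name>\S+)\s+\[(?P<shape>[\d,]+)\]$")

def parse_component(text: str) -> dict[str, tuple[int, ...]]:
    """Map tensor name -> expected shape for one component template."""
    spec = {}
    for line in text.strip().splitlines():
        m = LINE.match(line.strip())
        if m:
            spec[m.group("name")] = tuple(int(d) for d in m.group("shape").split(","))
    return spec

def matches(spec: dict[str, tuple[int, ...]], shapes: dict[str, tuple[int, ...]]) -> bool:
    """True if every tensor named in the template is present with the right shape."""
    return all(shapes.get(name) == shape for name, shape in spec.items())

# Example: the four tensors of VAE-vX-D.txt, copied from the listing above.
template = """\
post_quant_conv.bias [4]
quant_conv.weight [8,8,1,1]
quant_conv.bias [8]
post_quant_conv.weight [4,4,1,1]
"""
spec = parse_component(template)
```

A real checkpoint would supply the `shapes` dict via something like `{k: tuple(v.shape) for k, v in state_dict.items()}`; the sketch deliberately avoids a torch dependency.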
-------------------------------------------------------------------------------- /components/VAE-vX-S.txt: -------------------------------------------------------------------------------- 1 | decoder.conv_in.bias [512] 2 | decoder.conv_in.weight [512,4,3,3] 3 | decoder.conv_out.bias [3] 4 | decoder.conv_out.weight [3,128,3,3] 5 | decoder.mid.attn_1.k.bias [512] 6 | decoder.mid.attn_1.k.weight [512,512,1,1] 7 | decoder.mid.attn_1.norm.bias [512] 8 | decoder.mid.attn_1.norm.weight [512] 9 | decoder.mid.attn_1.proj_out.bias [512] 10 | decoder.mid.attn_1.proj_out.weight [512,512,1,1] 11 | decoder.mid.attn_1.q.bias [512] 12 | decoder.mid.attn_1.q.weight [512,512,1,1] 13 | decoder.mid.attn_1.v.bias [512] 14 | decoder.mid.attn_1.v.weight [512,512,1,1] 15 | decoder.mid.block_1.conv1.bias [512] 16 | decoder.mid.block_1.conv1.weight [512,512,3,3] 17 | decoder.mid.block_1.conv2.bias [512] 18 | decoder.mid.block_1.conv2.weight [512,512,3,3] 19 | decoder.mid.block_1.norm1.bias [512] 20 | decoder.mid.block_1.norm1.weight [512] 21 | decoder.mid.block_1.norm2.bias [512] 22 | decoder.mid.block_1.norm2.weight [512] 23 | decoder.mid.block_2.conv1.bias [512] 24 | decoder.mid.block_2.conv1.weight [512,512,3,3] 25 | decoder.mid.block_2.conv2.bias [512] 26 | decoder.mid.block_2.conv2.weight [512,512,3,3] 27 | decoder.mid.block_2.norm1.bias [512] 28 | decoder.mid.block_2.norm1.weight [512] 29 | decoder.mid.block_2.norm2.bias [512] 30 | decoder.mid.block_2.norm2.weight [512] 31 | decoder.norm_out.bias [128] 32 | decoder.norm_out.weight [128] 33 | decoder.up.0.block.0.conv1.bias [128] 34 | decoder.up.0.block.0.conv1.weight [128,256,3,3] 35 | decoder.up.0.block.0.conv2.bias [128] 36 | decoder.up.0.block.0.conv2.weight [128,128,3,3] 37 | decoder.up.0.block.0.nin_shortcut.bias [128] 38 | decoder.up.0.block.0.nin_shortcut.weight [128,256,1,1] 39 | decoder.up.0.block.0.norm1.bias [256] 40 | decoder.up.0.block.0.norm1.weight [256] 41 | decoder.up.0.block.0.norm2.bias [128] 42 | 
decoder.up.0.block.0.norm2.weight [128] 43 | decoder.up.0.block.1.conv1.bias [128] 44 | decoder.up.0.block.1.conv1.weight [128,128,3,3] 45 | decoder.up.0.block.1.conv2.bias [128] 46 | decoder.up.0.block.1.conv2.weight [128,128,3,3] 47 | decoder.up.0.block.1.norm1.bias [128] 48 | decoder.up.0.block.1.norm1.weight [128] 49 | decoder.up.0.block.1.norm2.bias [128] 50 | decoder.up.0.block.1.norm2.weight [128] 51 | decoder.up.0.block.2.conv1.bias [128] 52 | decoder.up.0.block.2.conv1.weight [128,128,3,3] 53 | decoder.up.0.block.2.conv2.bias [128] 54 | decoder.up.0.block.2.conv2.weight [128,128,3,3] 55 | decoder.up.0.block.2.norm1.bias [128] 56 | decoder.up.0.block.2.norm1.weight [128] 57 | decoder.up.0.block.2.norm2.bias [128] 58 | decoder.up.0.block.2.norm2.weight [128] 59 | decoder.up.1.block.0.conv1.bias [256] 60 | decoder.up.1.block.0.conv1.weight [256,512,3,3] 61 | decoder.up.1.block.0.conv2.bias [256] 62 | decoder.up.1.block.0.conv2.weight [256,256,3,3] 63 | decoder.up.1.block.0.nin_shortcut.bias [256] 64 | decoder.up.1.block.0.nin_shortcut.weight [256,512,1,1] 65 | decoder.up.1.block.0.norm1.bias [512] 66 | decoder.up.1.block.0.norm1.weight [512] 67 | decoder.up.1.block.0.norm2.bias [256] 68 | decoder.up.1.block.0.norm2.weight [256] 69 | decoder.up.1.block.1.conv1.bias [256] 70 | decoder.up.1.block.1.conv1.weight [256,256,3,3] 71 | decoder.up.1.block.1.conv2.bias [256] 72 | decoder.up.1.block.1.conv2.weight [256,256,3,3] 73 | decoder.up.1.block.1.norm1.bias [256] 74 | decoder.up.1.block.1.norm1.weight [256] 75 | decoder.up.1.block.1.norm2.bias [256] 76 | decoder.up.1.block.1.norm2.weight [256] 77 | decoder.up.1.block.2.conv1.bias [256] 78 | decoder.up.1.block.2.conv1.weight [256,256,3,3] 79 | decoder.up.1.block.2.conv2.bias [256] 80 | decoder.up.1.block.2.conv2.weight [256,256,3,3] 81 | decoder.up.1.block.2.norm1.bias [256] 82 | decoder.up.1.block.2.norm1.weight [256] 83 | decoder.up.1.block.2.norm2.bias [256] 84 | decoder.up.1.block.2.norm2.weight [256] 85 | 
decoder.up.1.upsample.conv.bias [256] 86 | decoder.up.1.upsample.conv.weight [256,256,3,3] 87 | decoder.up.2.block.0.conv1.bias [512] 88 | decoder.up.2.block.0.conv1.weight [512,512,3,3] 89 | decoder.up.2.block.0.conv2.bias [512] 90 | decoder.up.2.block.0.conv2.weight [512,512,3,3] 91 | decoder.up.2.block.0.norm1.bias [512] 92 | decoder.up.2.block.0.norm1.weight [512] 93 | decoder.up.2.block.0.norm2.bias [512] 94 | decoder.up.2.block.0.norm2.weight [512] 95 | decoder.up.2.block.1.conv1.bias [512] 96 | decoder.up.2.block.1.conv1.weight [512,512,3,3] 97 | decoder.up.2.block.1.conv2.bias [512] 98 | decoder.up.2.block.1.conv2.weight [512,512,3,3] 99 | decoder.up.2.block.1.norm1.bias [512] 100 | decoder.up.2.block.1.norm1.weight [512] 101 | decoder.up.2.block.1.norm2.bias [512] 102 | decoder.up.2.block.1.norm2.weight [512] 103 | decoder.up.2.block.2.conv1.bias [512] 104 | decoder.up.2.block.2.conv1.weight [512,512,3,3] 105 | decoder.up.2.block.2.conv2.bias [512] 106 | decoder.up.2.block.2.conv2.weight [512,512,3,3] 107 | decoder.up.2.block.2.norm1.bias [512] 108 | decoder.up.2.block.2.norm1.weight [512] 109 | decoder.up.2.block.2.norm2.bias [512] 110 | decoder.up.2.block.2.norm2.weight [512] 111 | decoder.up.2.upsample.conv.bias [512] 112 | decoder.up.2.upsample.conv.weight [512,512,3,3] 113 | decoder.up.3.block.0.conv1.bias [512] 114 | decoder.up.3.block.0.conv1.weight [512,512,3,3] 115 | decoder.up.3.block.0.conv2.bias [512] 116 | decoder.up.3.block.0.conv2.weight [512,512,3,3] 117 | decoder.up.3.block.0.norm1.bias [512] 118 | decoder.up.3.block.0.norm1.weight [512] 119 | decoder.up.3.block.0.norm2.bias [512] 120 | decoder.up.3.block.0.norm2.weight [512] 121 | decoder.up.3.block.1.conv1.bias [512] 122 | decoder.up.3.block.1.conv1.weight [512,512,3,3] 123 | decoder.up.3.block.1.conv2.bias [512] 124 | decoder.up.3.block.1.conv2.weight [512,512,3,3] 125 | decoder.up.3.block.1.norm1.bias [512] 126 | decoder.up.3.block.1.norm1.weight [512] 127 | 
decoder.up.3.block.1.norm2.bias [512] 128 | decoder.up.3.block.1.norm2.weight [512] 129 | decoder.up.3.block.2.conv1.bias [512] 130 | decoder.up.3.block.2.conv1.weight [512,512,3,3] 131 | decoder.up.3.block.2.conv2.bias [512] 132 | decoder.up.3.block.2.conv2.weight [512,512,3,3] 133 | decoder.up.3.block.2.norm1.bias [512] 134 | decoder.up.3.block.2.norm1.weight [512] 135 | decoder.up.3.block.2.norm2.bias [512] 136 | decoder.up.3.block.2.norm2.weight [512] 137 | decoder.up.3.upsample.conv.bias [512] 138 | decoder.up.3.upsample.conv.weight [512,512,3,3] 139 | encoder.conv_in.bias [128] 140 | encoder.conv_in.weight [128,3,3,3] 141 | encoder.conv_out.bias [8] 142 | encoder.conv_out.weight [8,512,3,3] 143 | encoder.down.0.block.0.conv1.bias [128] 144 | encoder.down.0.block.0.conv1.weight [128,128,3,3] 145 | encoder.down.0.block.0.conv2.bias [128] 146 | encoder.down.0.block.0.conv2.weight [128,128,3,3] 147 | encoder.down.0.block.0.norm1.bias [128] 148 | encoder.down.0.block.0.norm1.weight [128] 149 | encoder.down.0.block.0.norm2.bias [128] 150 | encoder.down.0.block.0.norm2.weight [128] 151 | encoder.down.0.block.1.conv1.bias [128] 152 | encoder.down.0.block.1.conv1.weight [128,128,3,3] 153 | encoder.down.0.block.1.conv2.bias [128] 154 | encoder.down.0.block.1.conv2.weight [128,128,3,3] 155 | encoder.down.0.block.1.norm1.bias [128] 156 | encoder.down.0.block.1.norm1.weight [128] 157 | encoder.down.0.block.1.norm2.bias [128] 158 | encoder.down.0.block.1.norm2.weight [128] 159 | encoder.down.0.downsample.conv.bias [128] 160 | encoder.down.0.downsample.conv.weight [128,128,3,3] 161 | encoder.down.1.block.0.conv1.bias [256] 162 | encoder.down.1.block.0.conv1.weight [256,128,3,3] 163 | encoder.down.1.block.0.conv2.bias [256] 164 | encoder.down.1.block.0.conv2.weight [256,256,3,3] 165 | encoder.down.1.block.0.nin_shortcut.bias [256] 166 | encoder.down.1.block.0.nin_shortcut.weight [256,128,1,1] 167 | encoder.down.1.block.0.norm1.bias [128] 168 | 
encoder.down.1.block.0.norm1.weight [128] 169 | encoder.down.1.block.0.norm2.bias [256] 170 | encoder.down.1.block.0.norm2.weight [256] 171 | encoder.down.1.block.1.conv1.bias [256] 172 | encoder.down.1.block.1.conv1.weight [256,256,3,3] 173 | encoder.down.1.block.1.conv2.bias [256] 174 | encoder.down.1.block.1.conv2.weight [256,256,3,3] 175 | encoder.down.1.block.1.norm1.bias [256] 176 | encoder.down.1.block.1.norm1.weight [256] 177 | encoder.down.1.block.1.norm2.bias [256] 178 | encoder.down.1.block.1.norm2.weight [256] 179 | encoder.down.1.downsample.conv.bias [256] 180 | encoder.down.1.downsample.conv.weight [256,256,3,3] 181 | encoder.down.2.block.0.conv1.bias [512] 182 | encoder.down.2.block.0.conv1.weight [512,256,3,3] 183 | encoder.down.2.block.0.conv2.bias [512] 184 | encoder.down.2.block.0.conv2.weight [512,512,3,3] 185 | encoder.down.2.block.0.nin_shortcut.bias [512] 186 | encoder.down.2.block.0.nin_shortcut.weight [512,256,1,1] 187 | encoder.down.2.block.0.norm1.bias [256] 188 | encoder.down.2.block.0.norm1.weight [256] 189 | encoder.down.2.block.0.norm2.bias [512] 190 | encoder.down.2.block.0.norm2.weight [512] 191 | encoder.down.2.block.1.conv1.bias [512] 192 | encoder.down.2.block.1.conv1.weight [512,512,3,3] 193 | encoder.down.2.block.1.conv2.bias [512] 194 | encoder.down.2.block.1.conv2.weight [512,512,3,3] 195 | encoder.down.2.block.1.norm1.bias [512] 196 | encoder.down.2.block.1.norm1.weight [512] 197 | encoder.down.2.block.1.norm2.bias [512] 198 | encoder.down.2.block.1.norm2.weight [512] 199 | encoder.down.2.downsample.conv.bias [512] 200 | encoder.down.2.downsample.conv.weight [512,512,3,3] 201 | encoder.down.3.block.0.conv1.bias [512] 202 | encoder.down.3.block.0.conv1.weight [512,512,3,3] 203 | encoder.down.3.block.0.conv2.bias [512] 204 | encoder.down.3.block.0.conv2.weight [512,512,3,3] 205 | encoder.down.3.block.0.norm1.bias [512] 206 | encoder.down.3.block.0.norm1.weight [512] 207 | encoder.down.3.block.0.norm2.bias [512] 208 | 
encoder.down.3.block.0.norm2.weight [512] 209 | encoder.down.3.block.1.conv1.bias [512] 210 | encoder.down.3.block.1.conv1.weight [512,512,3,3] 211 | encoder.down.3.block.1.conv2.bias [512] 212 | encoder.down.3.block.1.conv2.weight [512,512,3,3] 213 | encoder.down.3.block.1.norm1.bias [512] 214 | encoder.down.3.block.1.norm1.weight [512] 215 | encoder.down.3.block.1.norm2.bias [512] 216 | encoder.down.3.block.1.norm2.weight [512] 217 | encoder.mid.attn_1.k.bias [512] 218 | encoder.mid.attn_1.k.weight [512,512,1,1] 219 | encoder.mid.attn_1.norm.bias [512] 220 | encoder.mid.attn_1.norm.weight [512] 221 | encoder.mid.attn_1.proj_out.bias [512] 222 | encoder.mid.attn_1.proj_out.weight [512,512,1,1] 223 | encoder.mid.attn_1.q.bias [512] 224 | encoder.mid.attn_1.q.weight [512,512,1,1] 225 | encoder.mid.attn_1.v.bias [512] 226 | encoder.mid.attn_1.v.weight [512,512,1,1] 227 | encoder.mid.block_1.conv1.bias [512] 228 | encoder.mid.block_1.conv1.weight [512,512,3,3] 229 | encoder.mid.block_1.conv2.bias [512] 230 | encoder.mid.block_1.conv2.weight [512,512,3,3] 231 | encoder.mid.block_1.norm1.bias [512] 232 | encoder.mid.block_1.norm1.weight [512] 233 | encoder.mid.block_1.norm2.bias [512] 234 | encoder.mid.block_1.norm2.weight [512] 235 | encoder.mid.block_2.conv1.bias [512] 236 | encoder.mid.block_2.conv1.weight [512,512,3,3] 237 | encoder.mid.block_2.conv2.bias [512] 238 | encoder.mid.block_2.conv2.weight [512,512,3,3] 239 | encoder.mid.block_2.norm1.bias [512] 240 | encoder.mid.block_2.norm1.weight [512] 241 | encoder.mid.block_2.norm2.bias [512] 242 | encoder.mid.block_2.norm2.weight [512] 243 | encoder.norm_out.bias [512] 244 | encoder.norm_out.weight [512] --------------------------------------------------------------------------------