├── .python-version ├── LICENSE ├── README.md ├── config.json ├── docs ├── overview.png └── title.png ├── example.py ├── model ├── __init__.py ├── best_rq_framework.py ├── config.py └── random_projection_quanzier.py └── requirements.txt /.python-version: -------------------------------------------------------------------------------- 1 | 3.10.5 2 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 
29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |
2 | 3 |
4 | 5 |
6 |
7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 |
23 | 24 | *** 25 | 26 | ## Overview 27 |
28 | 29 |
30 | 31 | ## Installation 32 | 33 | ``` 34 | pip install -r requirements.txt 35 | ``` 36 | 37 | ## Usage 38 | 39 | Usage is demonstrated in example.py: 40 | ``` 41 | python example.py 42 | ``` 43 | 44 | ## Code Style 45 | I follow [PEP-8](https://www.python.org/dev/peps/pep-0008/) for code style. The docstring style matters in particular, because the documentation is generated from the docstrings. 46 | 47 | ## Reference 48 | - [Self-supervised Learning with Random-projection Quantizer for Speech Recognition](https://arxiv.org/abs/2202.01855) 49 | 50 | ## Author 51 | 52 | * [Harunori Kawano](https://harunorikawano.github.io/) -------------------------------------------------------------------------------- /config.json: -------------------------------------------------------------------------------- 1 | { 2 | "mask_prob": 0.01, 3 | "mask_time": 0.4, 4 | "input_feature_size": 80, 5 | "stride_time": 0.01, 6 | "code_book_size": 16, 7 | "num_code_books": 8192, 8 | "num_temporal_dimension_reduction_steps": 4, 9 | "encoder_hidden_size": 768 10 | } -------------------------------------------------------------------------------- /docs/overview.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/HarunoriKawano/BEST-RQ/cd5d70cf55808ebd90b46a5243b4b67b6f772917/docs/overview.png -------------------------------------------------------------------------------- /docs/title.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/HarunoriKawano/BEST-RQ/cd5d70cf55808ebd90b46a5243b4b67b6f772917/docs/title.png -------------------------------------------------------------------------------- /example.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | import torch 4 | from torch import nn 5 | from torch.nn.functional import cross_entropy 6 | 7 | from model import Config, BestRqFramework 8 | 9 | 10 | class ExampleEncoder(nn.Module): 11 | def 
__init__(self): 12 | super().__init__() 13 | self.linear = nn.Linear(80, 192) 14 | self.num_temporal_dimension_reduction_steps = 4 15 | 16 | def forward(self, inputs, _): 17 | batch_size, length, _ = inputs.size() 18 | 19 | hidden_states = self.linear(inputs) 20 | hidden_states = hidden_states.view(batch_size, length // self.num_temporal_dimension_reduction_steps, -1) 21 | 22 | return hidden_states 23 | 24 | 25 | if __name__ == '__main__': 26 | device = "cuda:0" if torch.cuda.is_available() else "cpu" 27 | print(f"Use device: {device}") 28 | 29 | encoder = ExampleEncoder().to(device) 30 | 31 | # `(batch size, time steps, feature size)` 32 | inputs = torch.rand(4, 1000, 80).to(device) 33 | # `(batch size)` Number of available time steps per batch 34 | input_lengths = torch.tensor([1000, 871, 389, 487]).to(device) 35 | 36 | with open("config.json", "r", encoding="utf-8") as f: 37 | config = Config(**json.load(f)) 38 | 39 | model = BestRqFramework(config, encoder).to(device) 40 | 41 | targets, labels = model(inputs, input_lengths) 42 | loss = cross_entropy(targets, labels) 43 | 44 | print(loss) 45 | loss.backward() 46 | -------------------------------------------------------------------------------- /model/__init__.py: -------------------------------------------------------------------------------- 1 | from model.config import Config 2 | from model.best_rq_framework import BestRqFramework 3 | -------------------------------------------------------------------------------- /model/best_rq_framework.py: -------------------------------------------------------------------------------- 1 | import random 2 | 3 | import torch 4 | from torch import nn 5 | 6 | from model.config import Config 7 | from model.random_projection_quanzier import RandomProjectionQuantizer 8 | 9 | 10 | class BestRqFramework(nn.Module): 11 | def __init__(self, config: Config, encoder: nn.Module): 12 | super().__init__() 13 | self.K = config.num_temporal_dimension_reduction_steps 14 | self.layer_norm = 
nn.LayerNorm(config.input_feature_size) 15 | self.random_projection_quantizer = RandomProjectionQuantizer(config) 16 | self.encoder = encoder 17 | self.config = config 18 | self.out_linear = nn.Linear(config.encoder_hidden_size, config.num_code_books) 19 | self.num_time_steps = int(config.mask_time // (config.stride_time * self.K)) 20 | 21 | def forward(self, input_values: torch.Tensor, input_lengths: torch.Tensor): 22 | """ 23 | Args: 24 | input_values (torch.Tensor): with shape `(B, T, D)` 25 | input_lengths (torch.Tensor): with shape `(B)` 26 | 27 | Returns: 28 | tuple of torch.Tensor with shape `(N, num_code_books)` (logits at masked steps) and torch.Tensor with shape `(N)` (code book labels) 29 | """ 30 | batch_size, num_steps, hidden_size = input_values.size() 31 | 32 | input_values = self.layer_norm(input_values) 33 | 34 | if num_steps % self.K != 0: 35 | transformed_num_steps = (num_steps // self.K + 1) * self.K 36 | padding = torch.zeros( 37 | batch_size, transformed_num_steps - num_steps, hidden_size, device=input_values.device 38 | ) 39 | input_values = torch.cat([input_values, padding], dim=1) 40 | num_steps = transformed_num_steps 41 | 42 | # Reshape to number of encoder out steps 43 | input_values = input_values.view(batch_size, -1, self.K * hidden_size) 44 | quantized_input_lengths = input_lengths // self.K - 1 45 | 46 | masked_input_values, time_mask_indices = self.masking(input_values.clone(), quantized_input_lengths) 47 | masked_input_values = masked_input_values.view(batch_size, num_steps, hidden_size) 48 | 49 | labels = self.random_projection_quantizer(input_values, time_mask_indices) 50 | 51 | encoder_out = self.encoder(masked_input_values, input_lengths) 52 | 53 | targets = encoder_out[time_mask_indices] 54 | targets_out = self.out_linear(targets) 55 | 56 | return targets_out, labels 57 | 58 | def masking(self, input_values: torch.Tensor, input_lengths: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]: 59 | """ 60 | Args: 61 | input_values (torch.Tensor): with shape `(B, L, D)` 62 | input_lengths (torch.Tensor): with shape `
(B)` 63 | 64 | Returns: 65 | tuple( 66 | torch.Tensor with shape `(B, L, D)` 67 | torch.Tensor with shape `(B, L)` 68 | ) 69 | """ 70 | batch_size, num_steps, hidden_size = input_values.size() 71 | 72 | # not masked: 0, masked: 1 73 | time_mask_indices = torch.zeros( 74 | batch_size, num_steps + self.num_time_steps, 75 | device=input_values.device, dtype=torch.bool 76 | ) 77 | 78 | for batch in range(batch_size): 79 | time_mask_idx_candidates = list(range(int(input_lengths[batch]))) 80 | k = int(self.config.mask_prob * input_lengths[batch]) 81 | start_time_mask_idx_array = torch.tensor( 82 | random.sample(time_mask_idx_candidates, k=k), device=input_values.device, dtype=torch.long 83 | ) 84 | 85 | for i in range(self.num_time_steps): 86 | time_mask_indices[batch, start_time_mask_idx_array + i] = 1 87 | 88 | time_mask_indices = time_mask_indices[:, :-self.num_time_steps] 89 | num_masks = int(time_mask_indices.sum()) 90 | 91 | # Replace masked positions with random values 92 | random_values = torch.normal(mean=0, std=0.1, size=(num_masks, hidden_size), device=input_values.device) 93 | input_values[time_mask_indices == 1] = random_values 94 | 95 | return input_values, time_mask_indices 96 | -------------------------------------------------------------------------------- /model/config.py: -------------------------------------------------------------------------------- 1 | from dataclasses import dataclass 2 | 3 | 4 | @dataclass(frozen=True) 5 | class Config: 6 | mask_prob: float # 0.0 - 1.0 7 | mask_time: float # Mask time sec (Default: 0.4) 8 | input_feature_size: int # Dimension of input. 9 | stride_time: float # stride_time sec. 
10 | code_book_size: int # Dimension of code book (Default: 16) 11 | num_code_books: int # Number of code books (Default: 8192) 12 | num_temporal_dimension_reduction_steps: int # Number of temporal dimension reduction steps by the encoder 13 | encoder_hidden_size: int # Number of encoder output dimensions 14 | -------------------------------------------------------------------------------- /model/random_projection_quanzier.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from torch import nn 3 | from torch.linalg import vector_norm 4 | 5 | from model.config import Config 6 | 7 | 8 | class RandomProjectionQuantizer(nn.Module): 9 | def __init__(self, config: Config): 10 | super().__init__() 11 | self.random_projection = nn.Linear( 12 | config.input_feature_size * config.num_temporal_dimension_reduction_steps, config.code_book_size, bias=False 13 | ) 14 | nn.init.xavier_uniform_(self.random_projection.weight) 15 | 16 | self.code_book = nn.Parameter(torch.randn(config.num_code_books, config.code_book_size)) 17 | 18 | self.random_projection.weight.requires_grad = False 19 | self.code_book.requires_grad = False 20 | 21 | @torch.no_grad() 22 | def forward(self, input_values: torch.Tensor, mask_time_indices: torch.Tensor) -> torch.Tensor: 23 | """ 24 | Args: 25 | input_values (torch.Tensor): with shape `(B, L, D)` 26 | mask_time_indices (torch.Tensor): with shape `(B, L)` 27 | 28 | Returns: 29 | torch.Tensor with shape `(N)` 30 | 31 | """ 32 | targets = self.random_projection(input_values[mask_time_indices == 1]).unsqueeze(1) 33 | 34 | # Compute L2 distances between targets and code book vectors 35 | vector_distances = vector_norm(targets - self.code_book, dim=-1) 36 | 37 | labels = torch.argmin(vector_distances, dim=-1) 38 | 39 | return labels 40 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | 
--extra-index-url https://download.pytorch.org/whl/cu118 2 | torch 3 | torchvision 4 | torchaudio --------------------------------------------------------------------------------
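The span masking in `model/best_rq_framework.py` picks `int(mask_prob * length)` random start indices and extends each into `num_time_steps` consecutive masked frames. The core idea can be sketched in pure Python, without torch (an illustrative sketch of the same logic, not part of the repository's API; `sample_mask` is a hypothetical helper name):

```python
import random

def sample_mask(length, mask_prob, span):
    """Choose int(mask_prob * length) start indices at random, then extend
    each start into `span` consecutive masked steps (clipped to `length`),
    mirroring the span logic of BestRqFramework.masking."""
    mask = [False] * length
    num_starts = int(mask_prob * length)
    for start in random.sample(range(length), k=num_starts):
        for i in range(start, min(start + span, length)):
            mask[i] = True
    return mask

random.seed(0)
mask = sample_mask(100, mask_prob=0.1, span=10)
```

Because spans may overlap, the total number of masked steps lies between the number of starts and `num_starts * span`.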
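The label assignment in `model/random_projection_quanzier.py` reduces to a nearest-neighbour search: each projected target is assigned the index of the closest code book row under L2 distance. A minimal pure-Python sketch of that argmin (illustrative only; `nearest_code` and the toy code book are hypothetical, not the repository's API):

```python
import math

def nearest_code(vec, code_book):
    """Return the index of the code book row closest to vec in L2 distance,
    the same argmin computed in RandomProjectionQuantizer.forward."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(code_book)), key=lambda i: dist(vec, code_book[i]))

code_book = [[0.0, 0.0], [1.0, 1.0], [-1.0, 1.0]]
label = nearest_code([0.9, 1.1], code_book)  # closest to [1.0, 1.0], index 1
```

In the actual module this runs under `torch.no_grad()` with a frozen random projection and code book, so the quantizer provides fixed training targets rather than learned ones.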