├── CONTRIBUTING.md ├── LICENSE ├── README.md ├── README_Original.md ├── css ├── colors.module.css ├── styles.css ├── theme.css ├── theme.dark.css ├── theme.light.css ├── tokens.css └── typography.module.css ├── googled873bf132668a2f1.html ├── index.html └── sitemap.xml /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contribution Guide 2 | 3 | 1. Put the bibtex in `Advances-in-Partial-and-Complementary-Label-Learning/citations/.txt`. 4 | 2. In the `README.md`, add the paper to **one** proper section in the format `TITLE (AUTHOR, CONFERENCE YEAR) OPTIONAL_PROJECT_LINK OPTIONAL_GITHUB BIBTEX`. 5 | 6 | + The title should be a link to the paper. When the paper is hosted on arXiv, please use the abstract link (rather than the PDF link). 7 | 8 | + Change the number in "Papers-number-FF6F00". 9 | 10 | + Use the following icons to represent other links: 11 | + :globe_with_meridians: (`:globe_with_meridians:`) Project Page 12 | + :octocat: (`:octocat:`) Code 13 | + :book: (`:book:`) `bibtex` 14 | 15 | For example: 16 | 17 | + [Dataset Distillation](https://arxiv.org/abs/1811.10959) (Tongzhou Wang et al., 2018) [:globe_with_meridians:](https://www.tongzhouwang.info/dataset_distillation/) [:octocat:](https://github.com/SsnL/dataset-distillation) [:book:](./citations/wang2018datasetdistillation.txt) 18 | 19 | 20 | The above entry is generated using the following Markdown: 21 | 22 | ``` 23 | + [Dataset Distillation](https://arxiv.org/abs/1811.10959) (Tongzhou Wang et al., 2018) 24 | [:globe_with_meridians:](https://www.tongzhouwang.info/dataset_distillation/) 25 | [:octocat:](https://github.com/SsnL/dataset-distillation) 26 | [:book:](./citations/wang2018datasetdistillation.txt) 27 | 28 | ``` 29 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2022 Guang Li, Bo Zhao, and Tongzhou Wang 4 | 5 |
Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Advances in Partial/Complementary Label Learning 2 | 3 | [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome) 4 | Contrib PaperNum ![Stars](https://img.shields.io/github/stars/wu-dd/Advances-in-Partial-and-Complementary-Label-Learning?color=yellow&label=Stars) ![Forks](https://img.shields.io/github/forks/wu-dd/Advances-in-Partial-and-Complementary-Label-Learning?color=green&label=Forks) 5 | 6 | 7 | 8 | 9 | 10 | **Advances in Partial/Complementary Label Learning** provides up-to-date and detailed information on the field of partial/complementary label learning.
11 | 12 | **Partial/complementary label learning** is an emerging framework in weakly supervised machine learning with broad application prospects. It handles the case in which each training example corresponds to a set of candidate labels, only one of which (concealed in the set) is the ground-truth label. In the complementary-label counterpart, each example is instead annotated with a class that it does *not* belong to. 13 | 14 | **This project is curated and maintained by [Dong-Dong Wu](https://wu-dd.github.io/)**. I will do my best to keep the project up to date. If you have any suggestions or are interested in becoming a contributor, feel free to drop me an email. 15 | 16 | 17 | 18 | #### [How to submit a pull request?](./CONTRIBUTING.md) 19 | 20 | + :globe_with_meridians: Project Page 21 | + :octocat: Code 22 | + :book: `bibtex` 23 | 24 | ## Latest Updates 25 | 26 | + [2024/12/08] Updated some code links of papers. 27 | + [2024/09/13] A major overhaul of the original GitHub repository. 28 | 29 | ## Contents 30 | 31 | - [Main](#main) 32 | - [Early Work](#early-work) 33 | - [Survey](#survey) 34 | - [Generative Modeling](#generative-modeling) 35 | - [Understanding](#understanding) 36 | - [Better Optimization](#better-optimization) 37 | - [Partial Multi-Label Learning](#partial-multi-label-learning) 38 | - [Noisy Partial Label Learning](#noisy-partial-label-learning) 39 | - [Semi-Supervised Partial Label Learning](#semi-supervised-partial-label-learning) 40 | - [Multi-Instance Partial Label Learning](#multi-instance-partial-label-learning) 41 | - [Imbalanced Partial Label Learning](#imbalanced-partial-label-learning) 42 | - [Out-of-distribution Partial Label Learning](#out-of-distribution-partial-label-learning) 43 | - [Federated Partial Label Learning](#federated-partial-label-learning) 44 | - [Partial Label Regression](#partial-label-regression) 45 | - [Dimensionality Reduction](#dimensionality-reduction) 46 | - [Multi-Complementary Label Learning](#multi-complementary-label-learning) 47 | - [Multi-View Learning](#multi-view-learning) 48 | - [Adversarial Training](#adversarial-training) 49 | - [Negative Learning](#negative-learning) 50 | - [Incremental Learning](#incremental-learning) 51 | - [Online Learning](#online-learning) 52 | - [Conformal Prediction](#conformal-prediction) 53 | - [Few-Shot Learning](#few-shot-learning) 54 | - [Open-Set Problem](#open-set-problem) 55 | - [Data Augmentation](#data-augmentation) 56 | - [Multi-Dimensional](#multi-dimensional) 57 | - [Domain Adaptation](#domain-adaptation) 58 | - [Applications](#applications) 59 | - [Audio](#audio) 60 | - [Text](#text) 61 | - [Sequence](#sequence) 62 | - [Recognition](#recognition) 63 | - [Object Localization](#object-localization) 64 | - [Map Reconstruction](#map-reconstruction) 65 | - [Semi-Supervised Learning](#semi-supervised-learning) 66 | - [Active Learning](#active-learning) 67 | - [Noisy Label Learning](#noisy-label-learning) 68 | - [Test-Time Adaptation](#test-time-adaptation) 69 | 70 | 71 | 72 | ## Main 73 | 74 | - [Learning from Complementary Labels](https://arxiv.org/abs/1705.07541) (NeurIPS 2017) [:octocat:](https://github.com/takashiishida/comp) 75 | - [Learning from Partial Labels](https://jmlr.org/papers/v12/cour11a.html) (JMLR 2011) 76 | 77 | 78 | 79 | ### Early Work 80 | 81 | + To be continued.
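The candidate-label-set setup described in the introduction can be sketched in a few lines of NumPy. This is a minimal illustration only (the function names and the uniform-averaging strategy are illustrative assumptions, not taken from any paper listed here): a naive baseline treats every candidate label as equally plausible and averages the cross-entropy loss over the candidate set.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over class logits."""
    e = np.exp(z - z.max())
    return e / e.sum()

def average_candidate_loss(logits, candidates):
    """Naive partial-label loss: cross-entropy averaged over the
    candidate label set, which contains the hidden ground-truth label."""
    p = softmax(np.asarray(logits, dtype=float))
    return -np.mean([np.log(p[y]) for y in sorted(candidates)])

# One training example whose candidate set is {0, 2}; the true label
# is one of the two, but the learner is never told which.
uninformed = average_candidate_loss([0.0, 0.0, 0.0, 0.0], {0, 2})
confident = average_candidate_loss([4.0, 0.0, 4.0, 0.0], {0, 2})
# Concentrating probability mass on the candidates lowers the loss.
assert confident < uninformed
```

Most methods under "Better Optimization" below refine exactly this step, replacing the uniform average over candidates with learned per-candidate confidences (label disambiguation).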
82 | 83 | 84 | 85 | ### Survey 86 | 87 | - [Partial label learning: Taxonomy, analysis and outlook](https://www.sciencedirect.com/science/article/abs/pii/S0893608023000825) (NN 2023) 88 | 89 | 90 | 91 | ### Generative Modeling 92 | 93 | - [Learning with Biased Complementary Labels](https://arxiv.org/abs/1711.09535) (ECCV 2018) [:octocat:](https://github.com/takashiishida/comp) 94 | - [Complementary-Label Learning for Arbitrary Losses and Models](https://arxiv.org/abs/1810.04327) (ICML 2019) [:octocat:](https://github.com/takashiishida/comp) 95 | - [Unbiased Risk Estimators Can Mislead: A Case Study of Learning with Complementary Labels](https://arxiv.org/abs/2007.02235) (ICML 2020) 96 | - [Provably Consistent Partial-Label Learning](https://arxiv.org/abs/2007.08929) (NeurIPS 2020) [:octocat:](https://lfeng1995.github.io/Code/RCCC.zip) 97 | - [Leveraged Weighted Loss for Partial Label Learning](https://arxiv.org/abs/2106.05731) (ICML 2021) [:octocat:](https://github.com/hongwei-wen/LW-loss-for-partial-label) 98 | - [Unbiased Risk Estimator to Multi-Labeled Complementary Label Learning](https://www.ijcai.org/proceedings/2023/415) (IJCAI 2023) [:octocat:](https://github.com/GaoYi439/GDF) 99 | - [Learning with Complementary Labels Revisited: The Selected-Completely-at-Random Setting Is More Practical](https://arxiv.org/abs/2311.15502) (ICML 2024) [:octocat:](https://github.com/wwangwitsel/SCARCE) 100 | - [Towards Unbiased Exploration in Partial Label Learning](https://arxiv.org/abs/2307.00465) (2023) 101 | 102 | 103 | 104 | ### Understanding 105 | 106 | - [Bridging Ordinary-Label Learning and Complementary-Label Learning](https://proceedings.mlr.press/v129/katsura20a.html) (ACML 2020) 107 | - [On the Power of Deep but Naive Partial Label Learning](https://arxiv.org/abs/2010.11600) (ICASSP 2021) [:octocat:](https://github.com/mikigom/DNPL-PyTorch) 108 | - [Learning from a Complementary-label Source Domain: Theory and Algorithms](https://arxiv.org/abs/2008.01454) (TNNLS 
2021) 109 | - [A Unifying Probabilistic Framework for Partially Labeled Data Learning](https://ieeexplore.ieee.org/document/9983986) (TPAMI 2023) 110 | - [Candidate Label Set Pruning: A Data-centric Perspective for Deep Partial-label Learning](https://openreview.net/forum?id=Fk5IzauJ7F) (ICLR 2024) 111 | - [Understanding Self-Distillation and Partial Label Learning in Multi-Class Classification with Label Noise](https://arxiv.org/abs/2402.10482) (2024) 112 | 113 | 114 | 115 | ### Better Optimization 116 | 117 | - [A Conditional Multinomial Mixture Model for Superset Label Learning](https://papers.nips.cc/paper_files/paper/2012/hash/aaebdb8bb6b0e73f6c3c54a0ab0c6415-Abstract.html) (NeurIPS 2012) 118 | - [GM-PLL: Graph Matching based Partial Label Learning](https://arxiv.org/abs/1901.03073) (TKDE 2019) 119 | - [Partial Label Learning via Label Enhancement](https://ojs.aaai.org/index.php/AAAI/article/view/4497) (AAAI 2019) 120 | - [Partial Label Learning with Self-Guided Retraining](https://arxiv.org/abs/1902.03045) (AAAI 2019) 121 | - [Partial Label Learning by Semantic Difference Maximization](https://www.ijcai.org/proceedings/2019/318) (IJCAI 2019) 122 | - [Partial Label Learning with Unlabeled Data](https://www.ijcai.org/proceedings/2019/521) (IJCAI 2019) 123 | - [Adaptive Graph Guided Disambiguation for Partial Label Learning](https://dl.acm.org/doi/10.1145/3292500.3330840) (KDD 2019) 124 | - [A Self-Paced Regularization Framework for Partial-Label Learning](https://ieeexplore.ieee.org/document/9094702) (TCYB 2020) 125 | - [Large Margin Partial Label Machine](https://ieeexplore.ieee.org/document/8826247) (TNNLS 2020) 126 | - [Learning with Noisy Partial Labels by Simultaneously Leveraging Global and Local Consistencies](https://dl.acm.org/doi/10.1145/3340531.3411885) (CIKM 2020) 127 | - [Network Cooperation with Progressive Disambiguation for Partial Label Learning](https://arxiv.org/abs/2002.11919) (ECML-PKDD 2020) 128 | - [Deep Discriminative CNN with Temporal
Ensembling for Ambiguously-Labeled Image Classification](https://ojs.aaai.org/index.php/AAAI/article/view/6959) (AAAI 2020) 129 | - [Generative-Discriminative Complementary Learning](https://arxiv.org/abs/1904.01612) (AAAI 2020) 130 | - [Partial Label Learning with Batch Label Correction](https://ojs.aaai.org/index.php/AAAI/article/view/6132) (AAAI 2020) 131 | - [Progressive Identification of True Labels for Partial-Label Learning](https://arxiv.org/abs/2002.08053) (ICML 2020) 132 | - [Generalized Large Margin kNN for Partial Label Learning](https://ieeexplore.ieee.org/document/9529072) (TMM 2021) 133 | - [Adaptive Graph Guided Disambiguation for Partial Label Learning](https://ieeexplore.ieee.org/document/9573413) (TPAMI 2022) 134 | - [Discriminative Metric Learning for Partial Label Learning](https://ieeexplore.ieee.org/document/9585342) (TNNLS 2021) 135 | - [Top-k Partial Label Machine](https://ieeexplore.ieee.org/document/9447152) (TNNLS 2021) 136 | - [Detecting the Fake Candidate Instances: Ambiguous Label Learning with Generative Adversarial Networks](https://dl.acm.org/doi/abs/10.1145/3459637.3482251) (CIKM 2021) 137 | - [Discriminative Complementary-Label Learning with Weighted Loss](https://proceedings.mlr.press/v139/gao21d.html) (ICML 2021) 138 | - [Instance-Dependent Partial Label Learning](https://arxiv.org/abs/2110.12911) (NeurIPS 2021) 139 | - [A Generative Model for Partial Label Learning](https://ieeexplore.ieee.org/abstract/document/9428103) (ICME 2021) 140 | - [Learning with Proper Partial Labels](https://arxiv.org/abs/2112.12303) (Neurocomputing 2022) 141 | - [Biased Complementary-Label Learning Without True Labels](https://ieeexplore.ieee.org/document/9836971) (TNNLS 2022) 142 | - [Exploiting Class Activation Value for Partial-Label Learning](https://openreview.net/forum?id=qqdXHUGec9h) (ICLR 2022) [:octocat:](https://github.com/Ferenas/CAVL) 143 | - [PiCO: Contrastive Label Disambiguation for Partial Label Learning](https://openreview.net/forum?id=EhYjZy6e1gJ) (ICLR 2022) 144 | - [Deep Graph Matching for Partial Label Learning](https://www.ijcai.org/proceedings/2022/459) (IJCAI 2022) 145 | - [Exploring Binary Classification Hidden within Partial Label Learning](https://www.ijcai.org/proceedings/2022/456) (IJCAI 2022) 146 | - [Ambiguity-Induced Contrastive Learning for Instance-Dependent Partial Label Learning](https://www.ijcai.org/proceedings/2022/502) (IJCAI 2022) 147 | - [Partial Label Learning via Label Influence Function](https://proceedings.mlr.press/v162/gong22c.html) (ICML 2022) 148 | - [Revisiting Consistency Regularization for Deep Partial Label Learning](https://proceedings.mlr.press/v162/wu22l.html) (ICML 2022) 149 | - [Partial Label Learning with Semantic Label Representations](https://dl.acm.org/doi/abs/10.1145/3534678.3539434) (KDD 2022) 150 | - [Progressive Purification for Instance-Dependent Partial Label Learning](https://arxiv.org/abs/2206.00830) (ICML 2023) [:octocat:](https://github.com/palm-ml/POP) 151 | - [GraphDPI: Partial label disambiguation by graph representation learning via mutual information maximization](https://www.sciencedirect.com/science/article/abs/pii/S0031320322006136) (PR 2023) 152 | - [Variational Label Enhancement](https://ieeexplore.ieee.org/document/9875104) (TPAMI 2023) 153 | - [CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning](https://arxiv.org/abs/2202.05613) (TPAMI 2023) 154 | - [Reduction from Complementary-Label Learning to Probability Estimates](https://arxiv.org/abs/2209.09500) (PAKDD 2023) 155 | - [Decompositional Generation Process for Instance-Dependent Partial Label Learning](https://arxiv.org/abs/2204.03845) (ICLR 2023) 156 | - [Mutual Partial Label Learning with Competitive Label Noise](https://openreview.net/forum?id=EUrxG8IBCrC) (ICLR 2023) 157 | - [Can Label-Specific Features Help Partial-Label Learning?](https://ojs.aaai.org/index.php/AAAI/article/view/25904) (AAAI 2023) 158 | - [Learning with Partial Labels from Semi-supervised Perspective](https://arxiv.org/abs/2211.13655) (AAAI 2023) 159 | - [Consistent Complementary-Label Learning via Order-Preserving Losses](https://proceedings.mlr.press/v206/liu23g.html) (AISTATS 2023) 160 | - [Complementary Classifier Induced Partial Label Learning](https://arxiv.org/abs/2305.09897) (KDD 2023) 161 | - [Towards Effective Visual Representations for Partial-Label Learning](https://arxiv.org/abs/2305.06080) (CVPR 2023) 162 | - [Candidate-aware Selective Disambiguation Based On Normalized Entropy for Instance-dependent Partial-label Learning](https://ieeexplore.ieee.org/document/10376678) (ICCV 2023) 163 | - [Partial Label Learning with Dissimilarity Propagation guided Candidate Label Shrinkage](https://papers.nips.cc/paper_files/paper/2023/hash/6b97236d90d945be7c58268207a14f4f-Abstract-Conference.html) (NeurIPS 2023) 164 | - [Learning From Biased Soft Labels](https://arxiv.org/abs/2302.08155) (NeurIPS 2023) 165 | - [Meta Objective Guided Disambiguation for Partial Label Learning](https://arxiv.org/abs/2208.12459) (2023) 166 | - [Adversary-Aware Partial Label Learning with Label Distillation](https://arxiv.org/abs/2304.00498) (2023) 167 | - [Solving Partial Label Learning Problem with Multi-Agent Reinforcement Learning](https://openreview.net/forum?id=BNsuf5g-JRd) (2023) 168 | - [Learning from Stochastic Labels](https://arxiv.org/abs/2302.00299) (2023) 169 | - [Deep Duplex Learning for Weak Supervision](https://openreview.net/forum?id=SeZ5ONageGl) (2023) 170 | - [Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations](https://arxiv.org/abs/2305.12715) (2023) 171 | - [Self-distillation and self-supervision for partial label learning](https://www.sciencedirect.com/science/article/pii/S0031320323007136) (PR 2023) 172 | - [Partial Label Learning with a
Partner](https://ojs.aaai.org/index.php/AAAI/article/view/29424) (AAAI 2024) 173 | - [Distilling Reliable Knowledge for Instance-Dependent Partial Label Learning](https://ojs.aaai.org/index.php/AAAI/article/view/29519) (AAAI 2024) 174 | - [Disentangled Partial Label Learning](https://ojs.aaai.org/index.php/AAAI/article/view/28976) (AAAI 2024) 175 | - [CroSel: Cross Selection of Confident Pseudo Labels for Partial-Label Learning](https://arxiv.org/abs/2303.10365) (CVPR 2024) 176 | - [A General Framework for Learning from Weak Supervision](https://arxiv.org/abs/2402.01922) (ICML 2024) 177 | - [Does Label Smoothing Help Deep Partial Label Learning?](https://openreview.net/forum?id=drjjxmi2Ha) (ICML 2024) 178 | - [Label Dropout: Improved Deep Learning Echocardiography Segmentation Using Multiple Datasets With Domain Shift and Partial Labelling](https://arxiv.org/abs/2403.07818) (SCIS 2024) 179 | - [Appeal: Allow Mislabeled Samples the Chance to be Rectified in Partial Label Learning](https://arxiv.org/abs/2312.11034) (2024) 180 | - [Graph Partial Label Learning with Potential Cause Discovering](https://arxiv.org/abs/2403.11449) (2024) 181 | - [Reduction-based Pseudo-label Generation for Instance-dependent Partial Label Learning](https://arxiv.org/abs/2410.20797) (2024) 182 | 183 | 184 | 185 | ### Partial Multi-Label Learning 186 | 187 | - [Learning a Deep ConvNet for Multi-label Classification with Partial Labels](https://arxiv.org/abs/1902.09720) (CVPR 2019) 188 | - [Multi-View Partial Multi-Label Learning with Graph-Based Disambiguation](https://ojs.aaai.org/index.php/AAAI/article/view/5761) (AAAI 2020) 189 | - [Partial Multi-Label Learning via Multi-Subspace Representation](https://www.ijcai.org/proceedings/2020/362) (IJCAI 2020) 190 | - [Feature-Induced Manifold Disambiguation for Multi-View Partial Multi-label Learning](https://dl.acm.org/doi/10.1145/3394486.3403098) (KDD 2020) 191 | - [Prior Knowledge Regularized Self-Representation Model for Partial Multilabel Learning](https://ieeexplore.ieee.org/document/9533180) (TCYB 2021) 192 | - [Global-Local Label Correlation for Partial Multi-Label Learning](https://ieeexplore.ieee.org/document/9343691) (TMM 2021) 193 | - [Progressive Enhancement of Label Distributions for Partial Multilabel Learning](https://ieeexplore.ieee.org/document/9615493) (TNNLS 2021) 194 | - [Partial Multi-Label Learning With Noisy Label Identification](https://ieeexplore.ieee.org/document/9354590) (TPAMI 2021) 195 | - [Partial Multi-Label Learning via Credible Label Elicitation](https://ieeexplore.ieee.org/document/9057438) (TPAMI 2021) 196 | - [Adversarial Partial Multi-Label Learning](https://arxiv.org/abs/1909.06717) (AAAI 2021) 197 | - [Learning from Complementary Labels via Partial-Output Consistency Regularization](https://www.ijcai.org/proceedings/2021/423) (IJCAI 2021) 198 | - [Partial Multi-Label Learning with Meta Disambiguation](https://dl.acm.org/doi/abs/10.1145/3447548.3467259) (KDD 2021) 199 | - [Understanding Partial Multi-Label Learning via Mutual Information](https://proceedings.neurips.cc/paper/2021/hash/217c0e01c1828e7279051f1b6675745d-Abstract.html) (NeurIPS 2021) 200 | - [Semantic-Aware Representation Blending for Multi-Label Image Recognition with Partial Labels](https://arxiv.org/abs/2203.02172) (AAAI 2022) 201 | - [Structured Semantic Transfer for Multi-Label Recognition with Partial Labels](https://arxiv.org/abs/2112.10941) (AAAI 2022) 202 | - [Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation](https://arxiv.org/abs/2205.10986) (IJCAI 2022) 203 | - [Multi-label Classification with Partial Annotations using Class-aware Selective Loss](https://arxiv.org/abs/2110.10955) (CVPR 2022) 204 | - [Deep Double Incomplete Multi-View Multi-Label Learning With Incomplete Labels and Missing Views](https://ieeexplore.ieee.org/document/10086538) (TNNLS 2023) 205 | - [Towards Enabling Binary Decomposition for Partial Multi-Label Learning](https://ieeexplore.ieee.org/document/10168295) (TPAMI 2023) 206 | - [Deep Partial Multi-Label Learning with Graph Disambiguation](https://arxiv.org/abs/2305.05882) (IJCAI 2023) 207 | - [Learning in Imperfect Environment: Multi-Label Classification with Long-Tailed Distribution and Partial Labels](https://arxiv.org/abs/2304.10539) (ICCV 2023) 208 | - [Partial Multi-Label Learning with Probabilistic Graphical Disambiguation](https://papers.nips.cc/paper_files/paper/2023/hash/04e05ba5cbc36044f6499d1edf15247e-Abstract-Conference.html) (NeurIPS 2023) 209 | - [ProPML: Probability Partial Multi-label Learning](https://arxiv.org/abs/2403.07603) (DSAA 2023) 210 | - [A Deep Model for Partial Multi-Label Image Classification with Curriculum Based Disambiguation](https://arxiv.org/abs/2207.02410) (ML 2024) 211 | - [Partial Multi-View Multi-Label Classification via Semantic Invariance Learning and Prototype Modeling](https://icml.cc/virtual/2024/poster/34972) (ICML 2024) 212 | - [Reliable Representations Learning for Incomplete Multi-View Partial Multi-Label Classification](https://arxiv.org/abs/2303.17117) (2024) 213 | - [PLMCL: Partial-Label Momentum Curriculum Learning for Multi-Label Image Classification](https://arxiv.org/abs/2208.09999) (2024) 214 | - [Combining Supervised Learning and Reinforcement Learning for Multi-Label Classification Tasks with Partial Labels](https://arxiv.org/abs/2406.16293) (2024) 215 | - [Free Performance Gain from Mixing Multiple Partially Labeled Samples in Multi-label Image Classification](https://arxiv.org/abs/2405.15860) (2024) 216 | 217 | 218 | 219 | ### Noisy Partial Label Learning 220 | 221 | - [PiCO+: Contrastive Label Disambiguation for Robust Partial Label Learning](https://arxiv.org/abs/2201.08984) (TPAMI 2023) 222 | - [On the Robustness of Average Losses for Partial-Label Learning](https://arxiv.org/abs/2106.06152) (TPAMI 2023) 223 | - [FREDIS: A Fusion Framework of Refinement and Disambiguation for Unreliable Partial Label
Learning](https://proceedings.mlr.press/v202/qiao23b.html) (ICML 2023) 224 | - [Unreliable Partial Label Learning with Recursive Separation](https://arxiv.org/abs/2302.09891) (IJCAI 2023) 225 | - [ALIM: Adjusting Label Importance Mechanism for Noisy Partial Label Learning](https://arxiv.org/abs/2301.12077) (NeurIPS 2023) 226 | - [IRNet: Iterative Refinement Network for Noisy Partial Label Learning](https://arxiv.org/abs/2211.04774) (2023) 227 | - [Robust Representation Learning for Unreliable Partial Label Learning](https://arxiv.org/abs/2308.16718) (2023) 228 | - [Pseudo-labelling meets Label Smoothing for Noisy Partial Label Learning](https://arxiv.org/abs/2402.04835) (2024) 229 | 230 | 231 | 232 | ### Semi-Supervised Partial Label Learning 233 | 234 | - [Semi-Supervised Partial Label Learning via Confidence-Rated Margin Maximization](https://proceedings.neurips.cc/paper/2020/hash/4dea382d82666332fb564f2e711cbc71-Abstract.html) (NeurIPS 2020) 235 | - [Exploiting Unlabeled Data via Partial Label Assignment for Multi-Class Semi-Supervised Learning](https://ojs.aaai.org/index.php/AAAI/article/view/17310) (AAAI 2021) 236 | - [Distributed Semisupervised Partial Label Learning Over Networks](https://ieeexplore.ieee.org/document/9699063) (AI 2022) 237 | - [Learning with Partial-Label and Unlabeled Data: A Uniform Treatment for Supervision Redundancy and Insufficiency](https://proceedings.mlr.press/v235/liu24ar.html) (ICML 2024) 238 | - [FairMatch: Promoting Partial Label Learning by Unlabeled Samples](https://dl.acm.org/doi/abs/10.1145/3637528.3671685) (KDD 2024) 239 | 240 | 241 | 242 | ### Multi-Instance Partial Label Learning 243 | 244 | - [Multi-Instance Partial-Label Learning: Towards Exploiting Dual Inexact Supervision](https://arxiv.org/abs/2212.08997) (SCIS 2023) 245 | - [Disambiguated Attention Embedding for Multi-Instance Partial-Label Learning](https://arxiv.org/abs/2305.16912) (NeurIPS 2023) 246 | - [Exploiting Conjugate Label Information for Multi-Instance 
Partial-Label Learning](https://arxiv.org/abs/2408.14369) (2024) 247 | - [On Characterizing and Mitigating Imbalances in Multi-Instance Partial Label Learning](https://arxiv.org/abs/2407.10000) (2024) 248 | 249 | 250 | 251 | ### Imbalanced Partial Label Learning 252 | 253 | - [Towards Mitigating the Class-Imbalance Problem for Partial Label Learning](https://dl.acm.org/doi/10.1145/3219819.3220008) (KDD 2018) [:octocat:](https://github.com/seu71wj/CIMAP) 254 | - [A Partial Label Metric Learning Algorithm for Class Imbalanced Data](https://proceedings.mlr.press/v157/liu21f.html) (ACML 2021) 255 | - [SoLar: Sinkhorn Label Refinery for Imbalanced Partial-Label Learning](https://arxiv.org/abs/2209.10365) (NeurIPS 2022) 256 | - [Class-Imbalanced Complementary-Label Learning via Weighted Loss](https://arxiv.org/abs/2209.14189) (NN 2023) 257 | - [Long-Tailed Partial Label Learning via Dynamic Rebalancing](https://arxiv.org/abs/2302.05080) (ICLR 2023) 258 | - [Long-Tailed Partial Label Learning by Head Classifier and Tail Classifier Cooperation](https://ojs.aaai.org/index.php/AAAI/article/view/29182) (AAAI 2024) 259 | - [Pseudo Labels Regularization for Imbalanced Partial-Label Learning](https://arxiv.org/abs/2303.03946) (ICASSP 2024) 260 | 261 | 262 | 263 | ### Out-of-distribution Partial Label Learning 264 | 265 | - [Out-of-distribution Partial Label Learning](https://arxiv.org/abs/2403.06681) (2024) 266 | 267 | 268 | 269 | ### Federated Partial Label Learning 270 | 271 | - [Federated Partial Label Learning with Local-Adaptive Augmentation and Regularization](https://ojs.aaai.org/index.php/AAAI/article/view/29562) (AAAI 2024) 272 | 273 | 274 | 275 | ### Partial Label Regression 276 | 277 | - [Partial-Label Regression](https://arxiv.org/abs/2306.08968) (AAAI 2023) 278 | - [Partial-Label Learning with a Reject Option](https://arxiv.org/abs/2402.00592) (2024) 279 | 280 | 281 | 282 | ### Dimensionality Reduction 283 | 284 | - [Partial Label Dimensionality Reduction via
Confidence-Based Dependence Maximization](https://dl.acm.org/doi/abs/10.1145/3447548.3467313) (KDD 2021) 285 | - [Disambiguation Enabled Linear Discriminant Analysis for Partial Label Dimensionality Reduction](https://dl.acm.org/doi/abs/10.1145/3494565) (TKDD 2022) 286 | - [Submodular Feature Selection for Partial Label Learning](https://dl.acm.org/doi/abs/10.1145/3534678.3539292) (KDD 2022) 287 | - [Dimensionality Reduction for Partial Label Learning: A Unified and Adaptive Approach](https://www.computer.org/csdl/journal/tk/2024/08/10440495/1UGSdp8q3Wo) (TKDE 2024) 288 | 289 | 290 | 291 | ### Multi-Complementary Label Learning 292 | 293 | - [Learning with Multiple Complementary Labels](https://arxiv.org/abs/1912.12927) (ICML 2020) 294 | - [Multi-Complementary and Unlabeled Learning for Arbitrary Losses and Models](https://arxiv.org/abs/2001.04243) (PR 2022) 295 | 296 | 297 | 298 | ### Multi-View Learning 299 | 300 | - [Deep Partial Multi-View Learning](https://ieeexplore.ieee.org/document/9258396) (TPAMI 2022) 301 | 302 | 303 | 304 | ### Adversarial Training 305 | 306 | - [Adversarial Training with Complementary Labels: On the Benefit of Gradually Informative Attacks](https://openreview.net/forum?id=s7SukMH7ie9) (NeurIPS 2022) 307 | 308 | 309 | 310 | ### Negative Learning 311 | 312 | - [NLNL: Negative Learning for Noisy Labels](https://arxiv.org/abs/1908.07387) (ICCV 2019) 313 | 314 | 315 | 316 | ### Incremental Learning 317 | 318 | - [Partial label learning with emerging new labels](https://link.springer.com/article/10.1007/s10994-022-06244-2) (ML 2022) 319 | - [Complementary Labels Learning with Augmented Classes](https://arxiv.org/abs/2211.10701) (2022) 320 | - [An Unbiased Risk Estimator for Partial Label Learning with Augmented Classes](https://arxiv.org/pdf/2409.19600) (2024) 321 | 322 | 323 | 324 | ### Online Learning 325 | 326 | - [Online Partial Label Learning](https://dl.acm.org/doi/abs/10.1007/978-3-030-67661-2_27) (ECML-PKDD 2020) 327 | 328 | 329 | 330 
| ### Conformal Prediction 331 | 332 | - [Conformal Prediction with Partially Labeled Data](https://arxiv.org/abs/2306.01191) (COPA 2023) 333 | 334 | 335 | 336 | ### Few-Shot Learning 337 | 338 | - [Few-Shot Partial-Label Learning](https://arxiv.org/abs/2106.00984) (IJCAI 2021) 339 | 340 | 341 | 342 | ### Open-Set Problem 343 | 344 | - [Partial-label Learning with Mixed Closed-set and Open-set Out-of-candidate Examples](https://arxiv.org/abs/2307.00553) (KDD 2023) 345 | 346 | 347 | 348 | ### Data Augmentation 349 | 350 | - [Partial Label Learning with Discrimination Augmentation](https://dl.acm.org/doi/abs/10.1145/3534678.3539363) (KDD 2022) [:octocat:](https://github.com/wwangwitsel/PLDA) 351 | - [Enhancing Label Sharing Efficiency in Complementary-Label Learning with Label Augmentation](https://arxiv.org/abs/2305.08344) (2023) 352 | 353 | 354 | 355 | ### Multi-Dimensional 356 | 357 | - [Learning From Multi-Dimensional Partial Labels](https://www.ijcai.org/proceedings/2020/407) (IJCAI 2020) 358 | 359 | 360 | 361 | ### Domain Adaptation 362 | 363 | - [Partial Label Unsupervised Domain Adaptation with Class-Prototype Alignment](https://openreview.net/forum?id=jpq0qHggw3t) (ICLR 2023) 364 | 365 | 366 | 367 | 368 | 369 | ## Applications 370 | 371 | 372 | 373 | ### Audio 374 | 375 | - [Semi-Supervised Audio Classification with Partially Labeled Data](https://arxiv.org/abs/2111.12761) (2021) 376 | 377 | 378 | 379 | ### Text 380 | 381 | - [Complementary Auxiliary Classifiers for Label-Conditional Text Generation](https://ojs.aaai.org/index.php/AAAI/article/view/6346) (AAAI 2020) 382 | 383 | 384 | 385 | ### Sequence 386 | 387 | - [Star Temporal Classification: Sequence Classification with Partially Labeled Data](https://arxiv.org/abs/2201.12208) (2022) 388 | 389 | 390 | 391 | ### Recognition 392 | 393 | - [Webly-Supervised Fine-Grained Recognition with Partial Label Learning](https://www.ijcai.org/proceedings/2022/209) (IJCAI 2022)
[:octocat:](https://github.com/msfuxian/WSFGPLL) 394 | - [Partial Label Learning with Focal Loss for Sea Ice Classification Based on Ice Charts](https://arxiv.org/abs/2406.03645) (AEORS 2023) 395 | - [Partial Label Learning for Emotion Recognition from EEG](https://arxiv.org/abs/2302.13170) (2023) 396 | - [A Confidence-based Partial Label Learning Model for Crowd-Annotated Named Entity Recognition](https://arxiv.org/abs/2305.12485) (ACL 2023) [:octocat:](https://github.com/LemXiong/CPLL?tab=readme-ov-file) 397 | 398 | 399 | 400 | ### Object Localization 401 | 402 | - [Adversarial Complementary Learning for Weakly Supervised Object Localization](https://arxiv.org/abs/1804.06962) (CVPR 2018) [:octocat:](https://github.com/halbielee/ACoL_pytorch) 403 | - [Learning to Detect Instance-level Salient Objects Using Complementary Image Labels](https://arxiv.org/abs/2111.10137) (2021) 404 | 405 | 406 | 407 | ### Map Reconstruction 408 | 409 | - [Deep Learning with Partially Labeled Data for Radio Map Reconstruction](https://arxiv.org/abs/2306.05294) (2023) 410 | 411 | 412 | 413 | ### Semi-supervised Learning 414 | 415 | - [Boosting Semi-Supervised Learning with Contrastive Complementary Labeling](https://arxiv.org/abs/2212.06643) (NN 2023) 416 | - [Controller-Guided Partial Label Consistency Regularization with Unlabeled Data](https://arxiv.org/abs/2210.11194) (2024) 417 | - [Semi-supervised Contrastive Learning Using Partial Label Information](https://arxiv.org/abs/2003.07921) (2024) 418 | 419 | 420 | 421 | ### Active Learning 422 | 423 | - [Exploiting counter-examples for active learning with partial labels](https://link.springer.com/article/10.1007/s10994-023-06485-9) (ML 2023) [:octocat:](https://github.com/Ferenas/APLL) 424 | - [Active Learning with Partial Labels](https://openreview.net/forum?id=1VuBdlNBuR) (2023) 425 | 426 | 427 | 428 | ### Noisy Label Learning 429 | 430 | - [Learning from Noisy Labels with Complementary Loss 
Functions](https://ojs.aaai.org/index.php/AAAI/article/view/17213) (AAAI 2021) 431 | - [Adaptive Integration of Partial Label Learning and Negative Learning for Enhanced Noisy Label Learning](https://arxiv.org/abs/2312.09505) (AAAI 2024) 432 | - [Partial Label Supervision for Agnostic Generative Noisy Label Learning](https://arxiv.org/abs/2308.01184) (2024) 433 | 434 | 435 | 436 | ### Test-Time Adaptation 437 | 438 | - [Rethinking Precision of Pseudo Label: Test-Time Adaptation via Complementary Learning](https://arxiv.org/abs/2301.06013) (2023) 439 | 440 | 441 | 442 | 443 | 444 | 445 | ## Dataset 446 | 447 | ### Tabular Dataset: 448 | 449 | **Notice:** The following partial label learning datasets were collected and pre-processed by Prof. [Min-Ling Zhang](https://palm.seu.edu.cn/zhangml/); they are provided courtesy of, and remain proprietary to, the authors of the literature that introduced them. The pre-processed datasets may be used at your own risk and for academic purposes only. More information can be found [here](http://palm.seu.edu.cn/zhangml/). 450 | 451 | Datasets for partial label learning: 452 | 453 | | [FG-NET](https://palm.seu.edu.cn/zhangml/files/FG-NET.rar) | [Lost](https://palm.seu.edu.cn/zhangml/files/lost.rar) | [MSRCv2](https://palm.seu.edu.cn/zhangml/files/MSRCv2.rar) | [BirdSong](https://palm.seu.edu.cn/zhangml/files/BirdSong.rar) | [Soccer Player](https://palm.seu.edu.cn/zhangml/files/SoccerPlayer.rar) | [Yahoo! 
News](https://palm.seu.edu.cn/zhangml/files/Yahoo!News.rar) | [Mirflickr](https://palm.seu.edu.cn/zhangml/files/Mirflickr.zip) | 454 | | ---------------------------------------------------------- | ------------------------------------------------------ | ---------------------------------------------------------- | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | 455 | 456 | Dataset for partial multi-label learning: 457 | 458 | | [Music_emotion](https://palm.seu.edu.cn/zhangml/files/pml_music_emotion.zip) | [Music_style](https://palm.seu.edu.cn/zhangml/files/pml_music_style.zip) | [Mirflickr](https://palm.seu.edu.cn/zhangml/files/pml_mirflickr.zip) | [YeastBP](https://palm.seu.edu.cn/zhangml/files/pml_YeastBP.zip) | [YeastCC](https://palm.seu.edu.cn/zhangml/files/pml_YeastCC.zip) | [YeastMF](https://palm.seu.edu.cn/zhangml/files/pml_YeastMF.zip) | 459 | | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | 460 | 461 | Data sets for multi-instance partial-label learning: 462 | 463 | | [MNIST](https://palm.seu.edu.cn/zhangml/files/MNIST_MIPL.zip) | [FMNIST](https://palm.seu.edu.cn/zhangml/files/FMNIST_MIPL.zip) | [Newsgroups](https://palm.seu.edu.cn/zhangml/files/Newsgroups_MIPL.zip) | [Birdsong](https://palm.seu.edu.cn/zhangml/files/Birdsong_MIPL.zip) | [SIVAL](https://palm.seu.edu.cn/zhangml/files/SIVAL_MIPL.zip) | [CRC-Row](https://palm.seu.edu.cn/zhangml/files/CRC-MIPL-Row.zip) | [CRC-SBN](https://palm.seu.edu.cn/zhangml/files/CRC-MIPL-SBN.zip) 
| [CRC-KMeansSeg](https://palm.seu.edu.cn/zhangml/files/CRC-MIPL-KMeansSeg.zip) | [CRC-SIFT](https://palm.seu.edu.cn/zhangml/files/CRC-MIPL-SIFT.zip) | 464 | | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | 465 | 466 | ### Image Dataset: 467 | 468 | - [CLImage: Human-Annotated Datasets for Complementary-Label Learning](https://arxiv.org/abs/2305.08295) (2024) 469 | 470 | 471 | 472 | ## Leaderboard 473 | 474 | To be continued. 475 | 476 | 477 | 478 | ## Star History 479 | 480 | [![Star History Chart](https://api.star-history.com/svg?repos=wu-dd/Advances-in-Partial-and-Complementary-Label-Learning&type=Date)](https://star-history.com/#wu-dd/Advances-in-Partial-and-Complementary-Label-Learning&Date) 481 | 482 | ## Citing Advances in Partial/Complementary Label Learning 483 | 484 | If you find this project useful for your research, please use the following BibTeX entry. 
485 | 486 | ``` 487 | @misc{Wu2022advances, 488 | author={Dong-Dong Wu}, 489 | title={Advances in Partial/Complementary Label Learning}, 490 | howpublished={\url{https://github.com/wu-dd/Advances-in-Partial-and-Complementary-Label-Learning}}, 491 | year={2022} 492 | } 493 | ``` 494 | 495 | ## Acknowledgments 496 | -------------------------------------------------------------------------------- /README_Original.md: -------------------------------------------------------------------------------- 1 | # Awesome Dataset Distillation 2 | 3 | [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome) 4 | Contrib PaperNum ![Stars](https://img.shields.io/github/stars/Guang000/Awesome-Dataset-Distillation?color=yellow&label=Stars) ![Forks](https://img.shields.io/github/forks/Guang000/Awesome-Dataset-Distillation?color=green&label=Forks) 5 | 6 | **Awesome Dataset Distillation** provides the most comprehensive and detailed information on the Dataset Distillation field. 7 | 8 | **Dataset distillation** is the task of synthesizing a small dataset such that models trained on it achieve high performance on the original large dataset. A dataset distillation algorithm takes as **input** a large real dataset to be distilled (training set), and **outputs** a small synthetic distilled dataset, which is evaluated via testing models trained on this distilled dataset on a separate real dataset (validation/test set). A good small distilled dataset is not only useful in dataset understanding, but has various applications (e.g., continual learning, privacy, neural architecture search, etc.). This task was first introduced in the paper [*Dataset Distillation* [Tongzhou Wang et al., '18]](https://www.tongzhouwang.info/dataset_distillation/), along with a proposed algorithm using backpropagation through optimization steps. 
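As a toy illustration of what "backpropagation through optimization steps" optimizes, the sketch below distills a tiny synthetic set for a linear regression model: it learns synthetic points such that a single inner gradient step on them already fits the real data, differentiating the real-data loss through that inner step (the chain rule is worked out by hand for the linear case; every size, step count, and learning rate here is invented for illustration and is not taken from the paper):

```python
import numpy as np

# Toy dataset distillation for linear regression: learn (X_syn, y_syn)
# so that ONE gradient step on the synthetic set, from any of a few
# fixed initializations, yields a model that fits the real data.
rng = np.random.default_rng(0)

n_real, n_syn, dim = 200, 10, 5
w_true = rng.normal(size=dim)
X_real = rng.normal(size=(n_real, dim))
y_real = X_real @ w_true                 # the large "real" training set

X_syn = rng.normal(size=(n_syn, dim))    # learnable distilled inputs
y_syn = rng.normal(size=n_syn)           # learnable distilled labels

probes = rng.normal(size=(8, dim))       # fixed model initializations
alpha, lr = 0.5, 0.01                    # inner / outer step sizes

def outer_loss(X_s, y_s, w0):
    """Real-data loss after one inner gradient step on the synthetic set."""
    w1 = w0 - alpha * X_s.T @ (X_s @ w0 - y_s) / n_syn
    return 0.5 * np.mean((X_real @ w1 - y_real) ** 2)

def avg_outer_loss(X_s, y_s):
    return float(np.mean([outer_loss(X_s, y_s, w0) for w0 in probes]))

loss_before = avg_outer_loss(X_syn, y_syn)
for _ in range(1500):
    dX = np.zeros_like(X_syn)
    dy = np.zeros_like(y_syn)
    for w0 in probes:
        w1 = w0 - alpha * X_syn.T @ (X_syn @ w0 - y_syn) / n_syn
        g_out = X_real.T @ (X_real @ w1 - y_real) / n_real   # d loss / d w1
        resid = X_syn @ w0 - y_syn
        # Chain rule through the inner update, derived by hand for the
        # linear model (a network version would use autodiff instead).
        dX += -(alpha / n_syn) * (np.outer(resid, g_out)
                                  + np.outer(X_syn @ g_out, w0))
        dy += (alpha / n_syn) * (X_syn @ g_out)
    X_syn -= lr * dX / len(probes)
    y_syn -= lr * dy / len(probes)

loss_after = avg_outer_loss(X_syn, y_syn)
print(round(loss_before, 3), round(loss_after, 3))
```

After optimization, one gradient step on the ten distilled points should fit the 200 real points noticeably better than at initialization; deep-learning versions of this idea replace the hand-derived chain rule with automatic differentiation and unroll many inner steps.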
The task was then extended to real-world datasets in the paper [*Medical Dataset Distillation* [Guang Li et al., '19]](https://arxiv.org/abs/2104.02857), which also explored the privacy-preservation possibilities of dataset distillation. In the paper [*Dataset Condensation* [Bo Zhao et al., '20]](https://arxiv.org/abs/2006.05929), gradient matching was introduced, which greatly advanced the development of the dataset distillation field. 9 | 10 | In recent years (2022-now), dataset distillation has gained increasing attention in the research community, across many institutes and labs. More papers are now being published each year. These works have steadily improved dataset distillation and explored its many variants and applications. 11 | 12 | **This project is curated and maintained by [Guang Li](https://www-lmd.ist.hokudai.ac.jp/member/guang-li/), [Bo Zhao](https://www.bozhao.me/), and [Tongzhou Wang](https://www.tongzhouwang.info/).** 13 | 14 | 15 | 16 | #### [How to submit a pull request?](./CONTRIBUTING.md) 17 | 18 | + :globe_with_meridians: Project Page 19 | + :octocat: Code 20 | + :book: `bibtex` 21 | 22 | ## Latest Updates 23 | + [2024/09/11] [DCFL: Non-IID Awareness Dataset Condensation Aided Federated Learning](https://ieeexplore.ieee.org/document/10650791) (Xingwang Wang et al., IJCNN 2024) [:book:](./citations/wang2024dcfl.txt) 24 | + [2024/09/05] [Dataset Distillation from First Principles: Integrating Core Information Extraction and Purposeful Learning](https://arxiv.org/abs/2409.01410) (Vyacheslav Kungurtsev et al., 2024) [:book:](./citations/kungurtsev2024first.txt) 25 | + [2024/09/05] [Neural Spectral Decomposition for Dataset Distillation](https://arxiv.org/abs/2408.16236) (Shaolei Yang et al., ECCV 2024) [:octocat:](https://github.com/slyang2021/NSD) [:book:](./citations/yang2024nsd.txt) 26 | + [2024/09/03] [Hierarchical Features Matter: A Deep Exploration of GAN Priors for Improved Dataset 
Distillation](https://arxiv.org/abs/2406.05704) (Xinhao Zhong & Hao Fang et al., 2024) [:octocat:](https://github.com/ndhg1213/H-GLaD) [:book:](./citations/zhong2024hglad.txt) 27 | + [2024/08/29] [Distilling Long-tailed Datasets](https://arxiv.org/abs/2408.14506) (Zhenghao Zhao & Haoxuan Wang et al., 2024) [:book:](./citations/zhao2024long.txt) 28 | + [2024/08/24] [Adaptive Backdoor Attacks Against Dataset Distillation for Federated Learning](https://ieeexplore.ieee.org/abstract/document/10622462?casa_token=tHyZ-Pz7DpUAAAAA:vmCYI4cUcKzMluUsASHhIhr0CvBkjzBR-0N7REVj7aFN5hT5TinQTpSEsE0Bo3Fl8auh52Fipm_v) (Ze Chai et al., ICC 2024) [:book:](./citations/chai2024backdoor.txt) 29 | + [2024/08/23] [Not All Samples Should Be Utilized Equally: Towards Understanding and Improving Dataset Distillation](https://arxiv.org/abs/2408.12483) (Shaobo Wang et al., 2024) [:book:](./citations/wang2024samples.txt) 30 | + [2024/08/22] [Dataset Condensation with Latent Quantile Matching](https://openaccess.thecvf.com/content/CVPR2024W/DDCV/html/Wei_Dataset_Condensation_with_Latent_Quantile_Matching_CVPRW_2024_paper.html) (Wei Wei et al., CVPR 2024 Workshop) [:book:](./citations/wei2024lqm.txt) 31 | + [2024/08/20] [Generative Dataset Distillation Based on Diffusion Model](https://arxiv.org/abs/2408.08610) (Duo Su & Junjie Hou & Guang Li et al., ECCV 2024 Workshop) [:octocat:](https://github.com/Guang000/Generative-Dataset-Distillation-Based-on-Diffusion-Model) [:book:](./citations/su2024diffusion.txt) 32 | + [2024/08/07] [Prioritize Alignment in Dataset Distillation](https://arxiv.org/abs/2408.03360) (Zekai Li & Ziyao Guo et al., 2024) [:octocat:](https://github.com/NUS-HPC-AI-Lab/PAD) [:book:](./citations/li2024pad.txt) 33 | 34 | ## Contents 35 | - [Main](#main) 36 | - [Early Work](#early-work) 37 | - [Gradient/Trajectory Matching Surrogate Objective](#gradient-objective) 38 | - [Distribution/Feature Matching Surrogate Objective](#feature-objective) 39 | - [Better 
Optimization](#optimization) 40 | - [Better Understanding](#understanding) 41 | - [Distilled Dataset Parametrization](#parametrization) 42 | - [Generative Prior](#generative) 43 | - [Label Distillation](#label) 44 | - [Dataset Quantization](#quant) 45 | - [Multimodal Distillation](#multi) 46 | - [Self-Supervised Distillation](#self) 47 | - [Long-Tailed Distillation](#long) 48 | - [Benchmark](#benchmark) 49 | - [Survey](#survey) 50 | - [Ph.D. Thesis](#thesis) 51 | - [Workshop](#workshop) 52 | - [Challenge](#challenge) 53 | - [Applications](#applications) 54 | - [Continual Learning](#continual) 55 | - [Privacy](#privacy) 56 | - [Medical](#medical) 57 | - [Federated Learning](#fed) 58 | - [Graph Neural Network](#gnn) 59 | - [Neural Architecture Search](#nas) 60 | - [Fashion, Art, and Design](#fashion) 61 | - [Knowledge Distillation](#kd) 62 | - [Recommender Systems](#rec) 63 | - [Blackbox Optimization](#blackbox) 64 | - [Trustworthy](#trustworthy) 65 | - [Text](#text) 66 | - [Tabular](#tabular) 67 | - [Retrieval](#retrieval) 68 | - [Video](#video) 69 | - [Domain Adaptation](#domain) 70 | - [Super Resolution](#super) 71 | - [Time Series](#time) 72 | - [Speech](#speech) 73 | - [Machine Unlearning](#unlearning) 74 | - [Reinforcement Learning](#rl) 75 | 76 | 77 | ## Main 78 | + [Dataset Distillation](https://arxiv.org/abs/1811.10959) (Tongzhou Wang et al., 2018) [:globe_with_meridians:](https://ssnl.github.io/dataset_distillation/) [:octocat:](https://github.com/SsnL/dataset-distillation) [:book:](./citations/wang2018datasetdistillation.txt) 79 | 80 | 81 | 82 | ### Early Work 83 | + [Gradient-Based Hyperparameter Optimization Through Reversible Learning](https://arxiv.org/abs/1502.03492) (Dougal Maclaurin et al., ICML 2015) [:octocat:](https://github.com/HIPS/hypergrad) [:book:](./citations/maclaurin2015gradient.txt) 84 | 85 | 86 | 87 | ### Gradient/Trajectory Matching Surrogate Objective 88 | + [Dataset Condensation with Gradient 
Matching](https://arxiv.org/abs/2006.05929) (Bo Zhao et al., ICLR 2021) [:octocat:](https://github.com/VICO-UoE/DatasetCondensation) [:book:](./citations/zhao2021datasetcondensation.txt) 89 | + [Dataset Condensation with Differentiable Siamese Augmentation](https://arxiv.org/abs/2102.08259) (Bo Zhao et al., ICML 2021) [:octocat:](https://github.com/VICO-UoE/DatasetCondensation) [:book:](./citations/zhao2021differentiatble.txt) 90 | + [Dataset Distillation by Matching Training Trajectories](https://arxiv.org/abs/2203.11932) (George Cazenavette et al., CVPR 2022) [:globe_with_meridians:](https://georgecazenavette.github.io/mtt-distillation/) [:octocat:](https://github.com/georgecazenavette/mtt-distillation) [:book:](./citations/cazenavette2022dataset.txt) 91 | + [Dataset Condensation with Contrastive Signals](https://arxiv.org/abs/2202.02916) (Saehyung Lee et al., ICML 2022) [:octocat:](https://github.com/saehyung-lee/dcc) [:book:](./citations/lee2022dataset.txt) 92 | + [Loss-Curvature Matching for Dataset Selection and Condensation](https://arxiv.org/abs/2303.04449) (Seungjae Shin & Heesun Bae et al., AISTATS 2023) [:octocat:](https://github.com/SJShin-AI/LCMat) [:book:](./citations/shin2023lcmat.txt) 93 | + [Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation](https://arxiv.org/abs/2211.11004) (Jiawei Du & Yidi Jiang et al., CVPR 2023) [:octocat:](https://github.com/AngusDujw/FTD-distillation) [:book:](./citations/du2023minimizing.txt) 94 | + [Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory](https://arxiv.org/abs/2211.10586) (Justin Cui et al., ICML 2023) [:octocat:](https://github.com/justincui03/tesla) [:book:](./citations/cui2022scaling.txt) 95 | + [Sequential Subset Matching for Dataset Distillation](https://arxiv.org/abs/2311.01570) (Jiawei Du et al., NeurIPS 2023) [:octocat:](https://github.com/shqii1j/seqmatch) [:book:](./citations/du2023seqmatch.txt) 96 | + [Towards Lossless Dataset Distillation via 
Difficulty-Aligned Trajectory Matching](https://arxiv.org/abs/2310.05773) (Ziyao Guo & Kai Wang et al., ICLR 2024) [:globe_with_meridians:](https://gzyaftermath.github.io/DATM/) [:octocat:](https://github.com/GzyAftermath/DATM) [:book:](./citations/guo2024datm.txt) 97 | + [SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching](https://arxiv.org/abs/2406.18561) (Yongmin Lee et al., ICML 2024) [:octocat:](https://github.com/Yongalls/SelMatch) [:book:](./citations/lee2024selmatch.txt) 98 | + [Dataset Distillation by Automatic Training Trajectories](https://arxiv.org/abs/2407.14245) (Dai Liu et al., ECCV 2024) [:octocat:](https://github.com/NiaLiu/ATT) [:book:](./citations/liu2024att.txt) 99 | + [Neural Spectral Decomposition for Dataset Distillation](https://arxiv.org/abs/2408.16236) (Shaolei Yang et al., ECCV 2024) [:octocat:](https://github.com/slyang2021/NSD) [:book:](./citations/yang2024nsd.txt) 100 | + [Prioritize Alignment in Dataset Distillation](https://arxiv.org/abs/2408.03360) (Zekai Li & Ziyao Guo et al., 2024) [:octocat:](https://github.com/NUS-HPC-AI-Lab/PAD) [:book:](./citations/li2024pad.txt) 101 | 102 | 103 | 104 | ### Distribution/Feature Matching Surrogate Objective 105 | + [CAFE: Learning to Condense Dataset by Aligning Features](https://arxiv.org/abs/2203.01531) (Kai Wang & Bo Zhao et al., CVPR 2022) [:octocat:](https://github.com/kaiwang960112/cafe) [:book:](./citations/wang2022cafe.txt) 106 | + [Dataset Condensation with Distribution Matching](https://arxiv.org/abs/2110.04181) (Bo Zhao et al., WACV 2023) [:octocat:](https://github.com/VICO-UoE/DatasetCondensation) [:book:](./citations/zhao2023distribution.txt) 107 | + [Improved Distribution Matching for Dataset Condensation](https://arxiv.org/abs/2307.09742) (Ganlong Zhao et al., CVPR 2023) [:octocat:](https://github.com/uitrbn/IDM) [:book:](./citations/zhao2023idm.txt) 108 | + [DataDAM: Efficient Dataset Distillation 
with Attention Matching](https://arxiv.org/abs/2310.00093) (Ahmad Sajedi & Samir Khaki et al., ICCV 2023) [:globe_with_meridians:](https://datadistillation.github.io/DataDAM/) [:octocat:](https://github.com/DataDistillation/DataDAM) [:book:](./citations/sajedi2023datadam.txt) 109 | + [Dataset Distillation via the Wasserstein Metric](https://arxiv.org/abs/2311.18531) (Haoyang Liu et al., 2023) [:book:](./citations/liu2023wasserstein.txt) 110 | + [M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy](https://arxiv.org/abs/2312.15927) (Hansong Zhang & Shikun Li et al., AAAI 2024) [:octocat:](https://github.com/Hansong-Zhang/M3D) [:book:](./citations/zhang2024m3d.txt) 111 | + [On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm](https://arxiv.org/abs/2312.03526) (Peng Sun et al., CVPR 2024) [:octocat:](https://github.com/LINs-lab/RDED) [:book:](./citations/sun2024diversity.txt) 112 | + [Exploiting Inter-sample and Inter-feature Relations in Dataset Distillation](https://arxiv.org/abs/2404.00563) (Wenxiao Deng et al., CVPR 2024) [:octocat:](https://github.com/VincenDen/IID) [:book:](./citations/deng2024iid.txt) 113 | + [Dataset Condensation with Latent Quantile Matching](https://openaccess.thecvf.com/content/CVPR2024W/DDCV/html/Wei_Dataset_Condensation_with_Latent_Quantile_Matching_CVPRW_2024_paper.html) (Wei Wei et al., CVPR 2024 Workshop) [:book:](./citations/wei2024lqm.txt) 114 | + [DANCE: Dual-View Distribution Alignment for Dataset Condensation](https://arxiv.org/abs/2406.01063) (Hansong Zhang et al., IJCAI 2024) [:octocat:](https://github.com/Hansong-Zhang/DANCE) [:book:](./citations/zhang2024dance.txt) 115 | 116 | 117 | 118 | ### Better Optimization 119 | + [Dataset Meta-Learning from Kernel Ridge-Regression](https://arxiv.org/abs/2011.00050) (Timothy Nguyen et al., ICLR 2021) [:octocat:](https://github.com/google/neural-tangents) [:book:](./citations/nguyen2021kip.txt) 120 | + [Dataset Distillation with 
Infinitely Wide Convolutional Networks](https://arxiv.org/abs/2107.13034) (Timothy Nguyen et al., NeurIPS 2021) [:octocat:](https://github.com/google/neural-tangents) [:book:](./citations/nguyen2021kipimprovedresults.txt) 121 | + [Dataset Distillation using Neural Feature Regression](https://arxiv.org/abs/2206.00719) (Yongchao Zhou et al., NeurIPS 2022) [:globe_with_meridians:](https://sites.google.com/view/frepo) [:octocat:](https://github.com/yongchao97/FRePo) [:book:](./citations/zhou2022dataset.txt) 122 | + [Efficient Dataset Distillation using Random Feature Approximation](https://arxiv.org/abs/2210.12067) (Noel Loo et al., NeurIPS 2022) [:octocat:](https://github.com/yolky/RFAD) [:book:](./citations/loo2022efficient.txt) 123 | + [Accelerating Dataset Distillation via Model Augmentation](https://arxiv.org/abs/2212.06152) (Lei Zhang & Jie Zhang et al., CVPR 2023) [:octocat:](https://github.com/ncsu-dk-lab/Acc-DD) [:book:](./citations/zhang2023accelerating.txt) 124 | + [Dataset Distillation with Convexified Implicit Gradients](https://arxiv.org/abs/2302.06755) (Noel Loo et al., ICML 2023) [:octocat:](https://github.com/yolky/RCIG) [:book:](./citations/loo2023dataset.txt) 125 | + [DREAM: Efficient Dataset Distillation by Representative Matching](https://arxiv.org/abs/2302.14416) (Yanqing Liu & Jianyang Gu & Kai Wang et al., ICCV 2023) [:octocat:](https://github.com/lyq312318224/DREAM) [:book:](./citations/liu2023dream.txt) 126 | + [Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective](https://arxiv.org/abs/2306.13092) (Zeyuan Yin & Zhiqiang Shen et al., NeurIPS 2023) [:globe_with_meridians:](https://zeyuanyin.github.io/projects/SRe2L/) [:octocat:](https://github.com/VILA-Lab/SRe2L) [:book:](./citations/yin2023sre2l.txt) 127 | + [You Only Condense Once: Two Rules for Pruning Condensed Datasets](https://arxiv.org/abs/2310.14019) (Yang He et al., NeurIPS 2023) [:octocat:](https://github.com/he-y/you-only-condense-once) 
[:book:](./citations/he2023yoco.txt) 128 | + [MIM4DD: Mutual Information Maximization for Dataset Distillation](https://arxiv.org/abs/2312.16627) (Yuzhang Shang et al., NeurIPS 2023) [:book:](./citations/shang2023mim4dd.txt) 129 | + [Can Pre-Trained Models Assist in Dataset Distillation?](https://arxiv.org/abs/2310.03295) (Yao Lu et al., 2023) [:octocat:](https://github.com/yaolu-zjut/DDInterpreter) [:book:](./citations/lu2023pre.txt) 130 | + [DREAM+: Efficient Dataset Distillation by Bidirectional Representative Matching](https://arxiv.org/abs/2310.15052) (Yanqing Liu & Jianyang Gu & Kai Wang et al., 2023) [:octocat:](https://github.com/lyq312318224/DREAM) [:book:](./citations/liu2023dream+.txt) 131 | + [Dataset Distillation in Latent Space](https://arxiv.org/abs/2311.15547) (Yuxuan Duan et al., 2023) [:book:](./citations/duan2023latent.txt) 132 | + [Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality](https://arxiv.org/abs/2310.06982) (Xuxi Chen & Yu Yang et al., ICLR 2024) [:octocat:](https://github.com/VITA-Group/ProgressiveDD) [:book:](./citations/chen2024vodka.txt) 133 | + [Embarassingly Simple Dataset Distillation](https://arxiv.org/abs/2311.07025) (Yunzhen Feng et al., ICLR 2024) [:octocat:](https://github.com/fengyzpku/Simple_Dataset_Distillation) [:book:](./citations/yunzhen2024embarassingly.txt) 134 | + [Multisize Dataset Condensation](https://arxiv.org/abs/2403.06075) (Yang He et al., ICLR 2024) [:octocat:](https://github.com/he-y/Multisize-Dataset-Condensation) [:book:](./citations/he2024mdc.txt) 135 | + [Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching](https://arxiv.org/abs/2311.17950) (Shitong Shao et al., CVPR 2024) [:octocat:](https://github.com/shaoshitong/G_VBSM_Dataset_Condensation) [:book:](./citations/shao2024gvbsm.txt) 136 | + [Ameliorate Spurious Correlations in Dataset Condensation](https://arxiv.org/abs/2406.06609) (Justin Cui et al., ICML 2024) 
[:book:](./citations/cui2024bias.txt) 137 | + [Large Scale Dataset Distillation with Domain Shift](https://openreview.net/forum?id=0FWPKHMCSc) (Noel Loo & Alaa Maalouf et al., ICML 2024) [:book:](./citations/loo2024d3s.txt) 138 | + [Distill Gold from Massive Ores: Efficient Dataset Distillation via Critical Samples Selection](https://arxiv.org/abs/2305.18381) (Yue Xu et al., ECCV 2024) [:octocat:](https://github.com/silicx/GoldFromOres) [:book:](./citations/xu2024distill.txt) 139 | + [FYI: Flip Your Images for Dataset Distillation](https://arxiv.org/abs/2407.08113) (Byunggwan Son et al., ECCV 2024) [:globe_with_meridians:](https://cvlab.yonsei.ac.kr/projects/FYI/) [:book:](./citations/son2024fyi.txt) 140 | + [Curriculum Dataset Distillation](https://arxiv.org/abs/2405.09150) (Zhiheng Ma & Anjia Cao et al., 2024) [:book:](./citations/ma2024cudd.txt) 141 | + [BACON: Bayesian Optimal Condensation Framework for Dataset Distillation](https://arxiv.org/abs/2406.01112) (Zheng Zhou et al., 2024) [:octocat:](https://github.com/zhouzhengqd/BACON) [:book:](./citations/zhou2024bacon.txt) 142 | + [Not All Samples Should Be Utilized Equally: Towards Understanding and Improving Dataset Distillation](https://arxiv.org/abs/2408.12483) (Shaobo Wang et al., 2024) [:book:](./citations/wang2024samples.txt) 143 | 144 | 145 | 146 | ### Better Understanding 147 | + [Optimizing Millions of Hyperparameters by Implicit Differentiation](https://arxiv.org/abs/1911.02590) (Jonathan Lorraine et al., AISTATS 2020) [:octocat:](https://github.com/MaximeVandegar/Papers-in-100-Lines-of-Code/tree/main/Optimizing_Millions_of_Hyperparameters_by_Implicit_Differentiation) [:book:](./citations/lorraine2020optimizing.txt) 148 | + [On Implicit Bias in Overparameterized Bilevel Optimization](https://proceedings.mlr.press/v162/vicol22a.html) (Paul Vicol et al., ICML 2022) [:book:](./citations/vicol2022implicit.txt) 149 | + [On the Size and Approximation Error of Distilled 
Sets](https://arxiv.org/abs/2305.14113) (Alaa Maalouf & Murad Tukan et al., NeurIPS 2023) [:book:](./citations/maalouf2023size.txt) 150 | + [A Theoretical Study of Dataset Distillation](https://openreview.net/forum?id=dq5QGXGxoJ) (Zachary Izzo et al., NeurIPS 2023 Workshop) [:book:](./citations/izzo2023theo.txt) 151 | + [What is Dataset Distillation Learning?](https://arxiv.org/abs/2406.04284) (William Yang et al., ICML 2024) [:octocat:](https://github.com/princetonvisualai/What-is-Dataset-Distillation-Learning) [:book:](./citations/yang2024learning.txt) 152 | + [Dataset Distillation from First Principles: Integrating Core Information Extraction and Purposeful Learning](https://arxiv.org/abs/2409.01410) (Vyacheslav Kungurtsev et al., 2024) [:book:](./citations/kungurtsev2024first.txt) 153 | 154 | 155 | 156 | ### Distilled Dataset Parametrization 157 | + [Dataset Condensation via Efficient Synthetic-Data Parameterization](https://arxiv.org/abs/2205.14959) (Jang-Hyun Kim et al., ICML 2022) [:octocat:](https://github.com/snu-mllab/efficient-dataset-condensation) [:book:](./citations/kim2022dataset.txt) 158 | + [Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks](https://arxiv.org/abs/2206.02916) (Zhiwei Deng et al., NeurIPS 2022) [:octocat:](https://github.com/princetonvisualai/RememberThePast-DatasetDistillation) [:book:](./citations/deng2022remember.txt) 159 | + [On Divergence Measures for Bayesian Pseudocoresets](https://arxiv.org/abs/2210.06205) (Balhae Kim et al., NeurIPS 2022) [:octocat:](https://github.com/balhaekim/bpc-divergences) [:book:](./citations/kim2022divergence.txt) 160 | + [Dataset Distillation via Factorization](https://arxiv.org/abs/2210.16774) (Songhua Liu et al., NeurIPS 2022) [:octocat:](https://github.com/Huage001/DatasetFactorization) [:book:](./citations/liu2022dataset.txt) 161 | + [PRANC: Pseudo RAndom Networks for Compacting Deep Models](https://arxiv.org/abs/2206.08464) (Parsa Nooralinejad et al., 2022) 
[:octocat:](https://github.com/UCDvision/PRANC) [:book:](./citations/nooralinejad2022pranc.txt) 162 | + [Dataset Condensation with Latent Space Knowledge Factorization and Sharing](https://arxiv.org/abs/2208.10494) (Hae Beom Lee & Dong Bok Lee et al., 2022) [:book:](./citations/lee2022kfs.txt) 163 | + [Slimmable Dataset Condensation](https://openaccess.thecvf.com/content/CVPR2023/html/Liu_Slimmable_Dataset_Condensation_CVPR_2023_paper.html) (Songhua Liu et al., CVPR 2023) [:book:](./citations/liu2023slimmable.txt) 164 | + [Few-Shot Dataset Distillation via Translative Pre-Training](https://openaccess.thecvf.com/content/ICCV2023/html/Liu_Few-Shot_Dataset_Distillation_via_Translative_Pre-Training_ICCV_2023_paper.html) (Songhua Liu et al., ICCV 2023) [:book:](./citations/liu2023fewshot.txt) 165 | + [MGDD: A Meta Generator for Fast Dataset Distillation](https://openreview.net/forum?id=D9CMRR5Lof) (Songhua Liu et al., NeurIPS 2023) [:book:](./citations/liu2023mgdd.txt) 166 | + [Sparse Parameterization for Epitomic Dataset Distillation](https://openreview.net/forum?id=ZIfhYAE2xg) (Xing Wei & Anjia Cao et al., NeurIPS 2023) [:octocat:](https://github.com/MIV-XJTU/SPEED) [:book:](./citations/wei2023sparse.txt) 167 | + [Frequency Domain-based Dataset Distillation](https://arxiv.org/abs/2311.08819) (Donghyeok Shin & Seungjae Shin et al., NeurIPS 2023) [:octocat:](https://github.com/sdh0818/FreD) [:book:](./citations/shin2023fred.txt) 168 | 169 | 170 | 171 | ### Generative Prior 172 | + [Synthesizing Informative Training Samples with GAN](https://arxiv.org/abs/2204.07513) (Bo Zhao et al., NeurIPS 2022 Workshop) [:octocat:](https://github.com/vico-uoe/it-gan) [:book:](./citations/zhao2022synthesizing.txt) 173 | + [Generalizing Dataset Distillation via Deep Generative Prior](https://arxiv.org/abs/2305.01649) (George Cazenavette et al., CVPR 2023) [:globe_with_meridians:](https://georgecazenavette.github.io/glad/) [:octocat:](https://github.com/georgecazenavette/glad) 
[:book:](./citations/cazenavette2023glad.txt) 174 | + [DiM: Distilling Dataset into Generative Model](https://arxiv.org/abs/2303.04707) (Kai Wang & Jianyang Gu et al., 2023) [:octocat:](https://github.com/vimar-gu/DiM) [:book:](./citations/wang2023dim.txt) 175 | + [Dataset Condensation via Generative Model](https://arxiv.org/abs/2309.07698) (Junhao Zhang et al., 2023) [:book:](./citations/zhang2023dc.txt) 176 | + [Efficient Dataset Distillation via Minimax Diffusion](https://arxiv.org/abs/2311.15529) (Jianyang Gu et al., CVPR 2024) [:octocat:](https://github.com/vimar-gu/MinimaxDiffusion) [:book:](./citations/gu2024efficient.txt) 177 | + [D4M: Dataset Distillation via Disentangled Diffusion Model](https://arxiv.org/abs/2407.15138) (Duo Su & Junjie Hou et al., CVPR 2024) [:globe_with_meridians:](https://junjie31.github.io/D4M/) [:octocat:](https://github.com/suduo94/D4M) [:book:](./citations/su2024d4m.txt) 178 | + [Generative Dataset Distillation: Balancing Global Structure and Local Details](https://arxiv.org/abs/2404.17732) (Longzhen Li & Guang Li et al., CVPR 2024 Workshop) [:book:](./citations/li2024generative.txt) 179 | + [Generative Dataset Distillation Based on Diffusion Model](https://arxiv.org/abs/2408.08610) (Duo Su & Junjie Hou & Guang Li et al., ECCV 2024 Workshop) [:octocat:](https://github.com/Guang000/Generative-Dataset-Distillation-Based-on-Diffusion-Model) [:book:](./citations/su2024diffusion.txt) 180 | + [Latent Dataset Distillation with Diffusion Models](https://arxiv.org/abs/2403.03881) (Brian B. 
Moser & Federico Raue et al., 2024) [:book:](./citations/moser2024ld3m.txt) 181 | + [Hierarchical Features Matter: A Deep Exploration of GAN Priors for Improved Dataset Distillation](https://arxiv.org/abs/2406.05704) (Xinhao Zhong & Hao Fang et al., 2024) [:octocat:](https://github.com/ndhg1213/H-GLaD) [:book:](./citations/zhong2024hglad.txt) 182 | 183 | 184 | 185 | ### Label Distillation 186 | + [Flexible Dataset Distillation: Learn Labels Instead of Images](https://arxiv.org/abs/2006.08572) (Ondrej Bohdal et al., NeurIPS 2020 Workshop) [:octocat:](https://github.com/ondrejbohdal/label-distillation) [:book:](./citations/bohdal2020flexible.txt) 187 | + [Soft-Label Dataset Distillation and Text Dataset Distillation](https://arxiv.org/abs/1910.02551) (Ilia Sucholutsky et al., IJCNN 2021) [:octocat:](https://github.com/ilia10000/dataset-distillation) [:book:](./citations/sucholutsky2021soft.txt) 188 | 189 | 190 | 191 | ### Dataset Quantization 192 | + [Dataset Quantization](https://arxiv.org/abs/2308.10524) (Daquan Zhou & Kai Wang & Jianyang Gu et al., ICCV 2023) [:octocat:](https://github.com/magic-research/Dataset_Quantization) [:book:](./citations/zhou2023dataset.txt) 193 | + [Dataset Quantization with Active Learning based Adaptive Sampling](https://arxiv.org/abs/2407.07268) (Zhenghao Zhao et al., ECCV 2024) [:octocat:](https://github.com/ichbill/DQAS) [:book:](./citations/zhao2024dqas.txt) 194 | 195 | 196 | 197 | ### Multimodal Distillation 198 | + [Vision-Language Dataset Distillation](https://arxiv.org/abs/2308.07545) (Xindi Wu et al., TMLR 2024) [:globe_with_meridians:](https://princetonvisualai.github.io/multimodal_dataset_distillation/) [:octocat:](https://github.com/princetonvisualai/multimodal_dataset_distillation) [:book:](./citations/wu2024multi.txt) 199 | + [Low-Rank Similarity Mining for Multimodal Dataset Distillation](https://arxiv.org/abs/2406.03793) (Yue Xu et al., ICML 2024) [:octocat:](https://github.com/silicx/LoRS_Distill) 
[:book:](./citations/xu2024lors.txt) 200 | 201 | 202 | 203 | ### Self-Supervised Distillation 204 | + [Self-Supervised Dataset Distillation for Transfer Learning](https://arxiv.org/abs/2310.06511) (Dong Bok Lee & Seanie Lee et al., ICLR 2024) [:octocat:](https://github.com/db-Lee/selfsup_dd) [:book:](./citations/lee2024self.txt) 205 | + [Self-supervised Dataset Distillation: A Good Compression Is All You Need](https://arxiv.org/abs/2404.07976) (Muxin Zhou et al., 2024) [:octocat:](https://github.com/VILA-Lab/SRe2L/tree/main/SCDD/) [:book:](./citations/zhou2024self.txt) 206 | + [Efficiency for Free: Ideal Data Are Transportable Representations](https://arxiv.org/abs/2405.14669) (Peng Sun et al., 2024) [:octocat:](https://github.com/LINs-lab/ReLA) [:book:](./citations/sun2024efficiency.txt) 207 | 208 | 209 | 210 | ### Long-Tailed Distillation 211 | + [Distilling Long-tailed Datasets](https://arxiv.org/abs/2408.14506) (Zhenghao Zhao & Haoxuan Wang et al., 2024) [:book:](./citations/zhao2024long.txt) 212 | 213 | 214 | 215 | ### Benchmark 216 | 217 | + [DC-BENCH: Dataset Condensation Benchmark](https://arxiv.org/abs/2207.09639) (Justin Cui et al., NeurIPS 2022) [:globe_with_meridians:](https://dc-bench.github.io/) [:octocat:](https://github.com/justincui03/dc_benchmark) [:book:](./citations/cui2022dc.txt) 218 | + [A Comprehensive Study on Dataset Distillation: Performance, Privacy, Robustness and Fairness](https://arxiv.org/abs/2305.03355) (Zongxiong Chen & Jiahui Geng et al., 2023) [:book:](./citations/chen2023study.txt) 219 | + [DD-RobustBench: An Adversarial Robustness Benchmark for Dataset Distillation](https://arxiv.org/abs/2403.13322) (Yifan Wu et al., 2024) [:book:](./citations/wu2024robust.txt) 220 | 221 | 222 | 223 | ### Survey 224 | 225 | + [Data Distillation: A Survey](https://arxiv.org/abs/2301.04272) (Noveen Sachdeva et al., TMLR 2023) [:book:](./citations/sachdeva2023survey.txt) 226 | + [A Survey on Dataset Distillation: Approaches, Applications and
Future Directions](https://arxiv.org/abs/2305.01975) (Jiahui Geng & Zongxiong Chen et al., IJCAI 2023) [:octocat:](https://github.com/Guang000/Awesome-Dataset-Distillation) [:book:](./citations/geng2023survey.txt) 227 | + [A Comprehensive Survey to Dataset Distillation](https://arxiv.org/abs/2301.05603) (Shiye Lei et al., TPAMI 2023) [:octocat:](https://github.com/Guang000/Awesome-Dataset-Distillation) [:book:](./citations/lei2023survey.txt) 228 | + [Dataset Distillation: A Comprehensive Review](https://arxiv.org/abs/2301.07014) (Ruonan Yu & Songhua Liu et al., TPAMI 2023) [:octocat:](https://github.com/Guang000/Awesome-Dataset-Distillation) [:book:](./citations/yu2023review.txt) 229 | 230 | 231 | 232 | ### Ph.D. Thesis 233 | + [Data-efficient Neural Network Training with Dataset Condensation](https://era.ed.ac.uk/handle/1842/39756) (Bo Zhao, The University of Edinburgh 2023) [:book:](./citations/zhao2023thesis.txt) 234 | 235 | 236 | 237 | ### Workshop 238 | + 1st CVPR Workshop on Dataset Distillation (Saeed Vahidian et al., CVPR 2024) [:globe_with_meridians:](https://sites.google.com/view/dd-cvpr2024/home) 239 | 240 | 241 | 242 | ### Challenge 243 | + The First Dataset Distillation Challenge (Kai Wang & Ahmad Sajedi et al., ECCV 2024) [:globe_with_meridians:](https://www.dd-challenge.com/) [:octocat:](https://github.com/DataDistillation/ECCV2024-Dataset-Distillation-Challenge) 244 | 245 | ## Applications 246 | 247 | 248 | 249 | ### Continual Learning 250 | + [Reducing Catastrophic Forgetting with Learning on Synthetic Data](https://arxiv.org/abs/2004.14046) (Wojciech Masarczyk et al., CVPR 2020 Workshop) [:book:](./citations/masarczyk2020reducing.txt) 251 | + [Condensed Composite Memory Continual Learning](https://arxiv.org/abs/2102.09890) (Felix Wiewel et al., IJCNN 2021) [:octocat:](https://github.com/FelixWiewel/CCMCL) [:book:](./citations/wiewel2021soft.txt) 252 | + [Distilled Replay: Overcoming Forgetting through Synthetic 
Samples](https://arxiv.org/abs/2103.15851) (Andrea Rosasco et al., IJCAI 2021 Workshop) [:octocat:](https://github.com/andrearosasco/DistilledReplay) [:book:](./citations/rosasco2021distilled.txt) 253 | + [Sample Condensation in Online Continual Learning](https://arxiv.org/abs/2206.11849) (Mattia Sangermano et al., IJCNN 2022) [:book:](./citations/sangermano2022sample.txt) 254 | + [An Efficient Dataset Condensation Plugin and Its Application to Continual Learning](https://openreview.net/forum?id=Murj6wcjRw) (Enneng Yang et al., NeurIPS 2023) [:book:](./citations/yang2023efficient.txt) 255 | + [Summarizing Stream Data for Memory-Restricted Online Continual Learning](https://arxiv.org/abs/2305.16645) (Jianyang Gu et al., AAAI 2024) [:octocat:](https://github.com/vimar-gu/SSD) [:book:](./citations/gu2024ssd.txt) 256 | 257 | 258 | 259 | ### Privacy 260 | + [Privacy for Free: How does Dataset Condensation Help Privacy?](https://arxiv.org/abs/2206.00240) (Tian Dong et al., ICML 2022) [:book:](./citations/dong2022privacy.txt) 261 | + [Private Set Generation with Discriminative Information](https://arxiv.org/abs/2211.04446) (Dingfan Chen et al., NeurIPS 2022) [:octocat:](https://github.com/DingfanChen/Private-Set) [:book:](./citations/chen2022privacy.txt) 262 | + [No Free Lunch in "Privacy for Free: How does Dataset Condensation Help Privacy"](https://arxiv.org/abs/2209.14987) (Nicholas Carlini et al., 2022) [:book:](./citations/carlini2022no.txt) 263 | + [Backdoor Attacks Against Dataset Distillation](https://arxiv.org/abs/2301.01197) (Yugeng Liu et al., NDSS 2023) [:octocat:](https://github.com/liuyugeng/baadd) [:book:](./citations/liu2023backdoor.txt) 264 | + [Differentially Private Kernel Inducing Points (DP-KIP) for Privacy-preserving Data Distillation](https://arxiv.org/abs/2301.13389) (Margarita Vinaroz et al., 2023) [:octocat:](https://github.com/dpclip/dpclip) [:book:](./citations/vinaroz2023dpkip.txt) 265 | + [Understanding Reconstruction Attacks with the Neural 
Tangent Kernel and Dataset Distillation](https://arxiv.org/abs/2302.01428) (Noel Loo et al., ICLR 2024) [:book:](./citations/loo2024attack.txt) 266 | + [Rethinking Backdoor Attacks on Dataset Distillation: A Kernel Method Perspective](https://arxiv.org/abs/2311.16646) (Ming-Yu Chung et al., ICLR 2024) [:book:](./citations/chung2024backdoor.txt) 267 | + [Differentially Private Dataset Condensation](https://www.ndss-symposium.org/ndss-paper/auto-draft-542/) (Zheng et al., NDSS 2024 Workshop) [:book:](./citations/zheng2024differentially.txt) 268 | + [Adaptive Backdoor Attacks Against Dataset Distillation for Federated Learning](https://ieeexplore.ieee.org/abstract/document/10622462) (Ze Chai et al., ICC 2024) [:book:](./citations/chai2024backdoor.txt) 269 | 270 | 271 | 272 | ### Medical 273 | + [Soft-Label Anonymous Gastric X-ray Image Distillation](https://arxiv.org/abs/2104.02857) (Guang Li et al., ICIP 2020) [:octocat:](https://github.com/Guang000/dataset-distillation) [:book:](./citations/li2020soft.txt) 274 | + [Compressed Gastric Image Generation Based on Soft-Label Dataset Distillation for Medical Data Sharing](https://arxiv.org/abs/2209.14635) (Guang Li et al., CMPB 2022) [:octocat:](https://github.com/Guang000/dataset-distillation) [:book:](./citations/li2022compressed.txt) 275 | + [Dataset Distillation for Medical Dataset Sharing](https://r2hcai.github.io/AAAI-23/pages/accepted-papers.html) (Guang Li et al., AAAI 2023 Workshop) [:octocat:](https://github.com/Guang000/mtt-distillation) [:book:](./citations/li2023sharing.txt) 276 | + [Communication-Efficient Federated Skin Lesion Classification with Generalizable Dataset Distillation](https://link.springer.com/chapter/10.1007/978-3-031-47401-9_2) (Yuchen Tian & Jiacheng Wang, MICCAI 2023 Workshop) [:book:](./citations/tian2023gdd.txt) 277 | + [Image Distillation for Safe Data Sharing in 
Histopathology](https://arxiv.org/abs/2406.13536) (Zhe Li et al., MICCAI 2024) [:octocat:](https://github.com/ZheLi2020/InfoDist) [:book:](./citations/li2024infodist.txt) 278 | + [Progressive Trajectory Matching for Medical Dataset Distillation](https://arxiv.org/abs/2403.13469) (Zhen Yu et al., 2024) [:book:](./citations/yu2024progressive.txt) 279 | + [Dataset Distillation in Medical Imaging: A Feasibility Study](https://arxiv.org/abs/2407.14429) (Muyang Li et al., 2024) [:book:](./citations/li2024medical.txt) 280 | 281 | 282 | 283 | ### Federated Learning 284 | + [Federated Learning via Synthetic Data](https://arxiv.org/abs/2008.04489) (Jack Goetz et al., 2020) [:book:](./citations/goetz2020federated.txt) 285 | + [Distilled One-Shot Federated Learning](https://arxiv.org/abs/2009.07999) (Yanlin Zhou et al., 2020) [:book:](./citations/zhou2020distilled.txt) 286 | + [DENSE: Data-Free One-Shot Federated Learning](https://arxiv.org/abs/2112.12371) (Jie Zhang & Chen Chen et al., NeurIPS 2022) [:octocat:](https://github.com/zj-jayzhang/DENSE) [:book:](./citations/zhang2022dense.txt) 287 | + [FedSynth: Gradient Compression via Synthetic Data in Federated Learning](https://arxiv.org/abs/2204.01273) (Shengyuan Hu et al., 2022) [:book:](./citations/hu2022fedsynth.txt) 288 | + [Meta Knowledge Condensation for Federated Learning](https://arxiv.org/abs/2209.14851) (Ping Liu et al., ICLR 2023) [:book:](./citations/liu2023meta.txt) 289 | + [DYNAFED: Tackling Client Data Heterogeneity with Global Dynamics](https://arxiv.org/abs/2211.10878) (Renjie Pi et al., CVPR 2023) [:octocat:](https://github.com/pipilurj/dynafed) [:book:](./citations/pi2023dynafed.txt) 290 | + [FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning](https://arxiv.org/abs/2207.09653) (Yuanhao Xiong & Ruochen Wang et al., CVPR 2023) [:book:](./citations/xiong2023feddm.txt) 291 | + [Federated Learning via Decentralized Dataset Distillation in Resource-Constrained Edge 
Environments](https://arxiv.org/abs/2208.11311) (Rui Song et al., IJCNN 2023) [:octocat:](https://github.com/rruisong/fedd3) [:book:](./citations/song2023federated.txt) 292 | + [Fed-GLOSS-DP: Federated, Global Learning using Synthetic Sets with Record Level Differential Privacy](https://arxiv.org/abs/2302.01068) (Hui-Po Wang et al., 2023) [:book:](./citations/wang2023fed.txt) 293 | + [Federated Virtual Learning on Heterogeneous Data with Local-global Distillation](https://arxiv.org/abs/2303.02278) (Chun-Yin Huang et al., 2023) [:book:](./citations/huang2023federated.txt) 294 | + [An Aggregation-Free Federated Learning for Tackling Data Heterogeneity](https://arxiv.org/abs/2404.18962) (Yuan Wang et al., CVPR 2024) [:book:](./citations/wang2024fed.txt) 295 | + [DCFL: Non-IID Awareness Dataset Condensation Aided Federated Learning](https://ieeexplore.ieee.org/document/10650791) (Xingwang Wang et al., IJCNN 2024) [:book:](./citations/wang2024dcfl.txt) 296 | 297 | 298 | 299 | ### Graph Neural Network 300 | + [Graph Condensation for Graph Neural Networks](https://arxiv.org/abs/2110.07580) (Wei Jin et al., ICLR 2022) [:octocat:](https://github.com/chandlerbang/gcond) [:book:](./citations/jin2022graph.txt) 301 | + [Condensing Graphs via One-Step Gradient Matching](https://arxiv.org/abs/2206.07746) (Wei Jin et al., KDD 2022) [:octocat:](https://github.com/amazon-research/DosCond) [:book:](./citations/jin2022condensing.txt) 302 | + [Graph Condensation via Receptive Field Distribution Matching](https://arxiv.org/abs/2206.13697) (Mengyang Liu et al., 2022) [:book:](./citations/liu2022graph.txt) 303 | + [Kernel Ridge Regression-Based Graph Dataset Distillation](https://dl.acm.org/doi/10.1145/3580305.3599398) (Zhe Xu et al., KDD 2023) [:octocat:](https://github.com/pricexu/KIDD) [:book:](./citations/xu2023kidd.txt) 304 | + [Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data](https://arxiv.org/abs/2306.02664) (Xin Zheng et al., NeurIPS 2023) 
[:octocat:](https://github.com/amanda-zheng/sfgc) [:book:](./citations/zheng2023sfgc.txt) 305 | + [Does Graph Distillation See Like Vision Dataset Counterpart?](https://arxiv.org/abs/2310.09192) (Beining Yang & Kai Wang et al., NeurIPS 2023) [:octocat:](https://github.com/RingBDStack/SGDD) [:book:](./citations/yang2023sgdd.txt) 306 | + [Fair Graph Distillation](https://openreview.net/forum?id=xW0ayZxPWs) (Qizhang Feng et al., NeurIPS 2023) [:book:](./citations/feng2023fair.txt) 307 | + [CaT: Balanced Continual Graph Learning with Graph Condensation](https://arxiv.org/abs/2309.09455) (Yilun Liu et al., ICDM 2023) [:octocat:](https://github.com/superallen13/CaT-CGL) [:book:](./citations/liu2023cat.txt) 308 | + [Mirage: Model-Agnostic Graph Distillation for Graph Classification](https://arxiv.org/abs/2310.09486) (Mridul Gupta & Sahil Manchanda et al., ICLR 2024) [:octocat:](https://github.com/frigategnn/Mirage) [:book:](./citations/gupta2024mirage.txt) 309 | + [Graph Distillation with Eigenbasis Matching](https://arxiv.org/abs/2310.09202) (Yang Liu & Deyu Bo et al., ICML 2024) [:octocat:](https://github.com/liuyang-tian/GDEM) [:book:](./citations/liu2024gdem.txt) 310 | + [Navigating Complexity: Toward Lossless Graph Condensation via Expanding Window Matching](https://arxiv.org/abs/2402.05011) (Yuchen Zhang & Tianle Zhang & Kai Wang et al., ICML 2024) [:octocat:](https://github.com/nus-hpc-ai-lab/geom) [:book:](./citations/zhang2024geom.txt) 311 | + [Graph Data Condensation via Self-expressive Graph Structure Reconstruction](https://arxiv.org/abs/2403.07294) (Zhanyu Liu & Chaolv Zeng et al., KDD 2024) [:octocat:](https://github.com/zclzcl0223/GCSR) [:book:](./citations/liu2024gcsr.txt) 312 | + [Two Trades is not Baffled: Condensing Graph via Crafting Rational Gradient Matching](https://arxiv.org/abs/2402.04924) (Tianle Zhang & Yuchen Zhang & Kai Wang et al., 2024) [:octocat:](https://github.com/nus-hpc-ai-lab/ctrl) [:book:](./citations/zhang2024ctrl.txt) 313 | 314 | 
315 | #### Survey 316 | + [A Comprehensive Survey on Graph Reduction: Sparsification, Coarsening, and Condensation](https://arxiv.org/abs/2402.03358) (Mohammad Hashemi et al., IJCAI 2024) [:octocat:](https://github.com/Emory-Melody/awesome-graph-reduction) [:book:](./citations/hashemi2024awesome.txt) 317 | + [Graph Condensation: A Survey](https://arxiv.org/abs/2401.11720) (Xinyi Gao et al., 2024) [:book:](./citations/gao2024graph.txt) 318 | + [A Survey on Graph Condensation](https://arxiv.org/abs/2402.02000) (Hongjia Xu, 2024) [:book:](./citations/xu2024survey.txt) 319 | 320 | #### Benchmark 321 | + [GCondenser: Benchmarking Graph Condensation](https://arxiv.org/abs/2405.14246) (Yilun Liu et al., 2024) [:octocat:](https://github.com/superallen13/GCondenser) [:book:](./citations/liu2024gcondenser.txt) 322 | 323 | 324 | 325 | ### Neural Architecture Search 326 | + [Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data](https://arxiv.org/abs/1912.07768) (Felipe Petroski Such et al., ICML 2020) [:octocat:](https://github.com/uber-research/GTN) [:book:](./citations/such2020generative.txt) 327 | + [Learning to Generate Synthetic Training Data using Gradient Matching and Implicit Differentiation](https://arxiv.org/abs/2203.08559) (Dmitry Medvedev et al., AIST 2021) [:octocat:](https://github.com/dm-medvedev/efficientdistillation) [:book:](./citations/medvedev2021tabular.txt) 328 | 329 | 330 | 331 | ### Fashion, Art, and Design 332 | + [Wearable ImageNet: Synthesizing Tileable Textures via Dataset Distillation](https://openaccess.thecvf.com/content/CVPR2022W/CVFAD/html/Cazenavette_Wearable_ImageNet_Synthesizing_Tileable_Textures_via_Dataset_Distillation_CVPRW_2022_paper.html) (George Cazenavette et al., CVPR 2022 Workshop) [:globe_with_meridians:](https://georgecazenavette.github.io/mtt-distillation/) [:octocat:](https://github.com/georgecazenavette/mtt-distillation) 
[:book:](./citations/cazenavette2022textures.txt) 333 | + [Learning from Designers: Fashion Compatibility Analysis Via Dataset Distillation](https://ieeexplore.ieee.org/document/9897234) (Yulan Chen et al., ICIP 2022) [:book:](./citations/chen2022fashion.txt) 334 | + [Galaxy Dataset Distillation with Self-Adaptive Trajectory Matching](https://arxiv.org/abs/2311.17967) (Haowen Guan et al., NeurIPS 2023 Workshop) [:octocat:](https://github.com/HaowenGuan/Galaxy-Dataset-Distillation) [:book:](./citations/guan2023galaxy.txt) 335 | 336 | 337 | 338 | ### Knowledge Distillation 339 | + [Knowledge Condensation Distillation](https://arxiv.org/abs/2207.05409) (Chenxin Li et al., ECCV 2022) [:octocat:](https://github.com/dzy3/KCD) [:book:](./citations/li2022knowledge.txt) 340 | 341 | 342 | 343 | ### Recommender Systems 344 | + [Infinite Recommendation Networks: A Data-Centric Approach](https://arxiv.org/abs/2206.02626) (Noveen Sachdeva et al., NeurIPS 2022) [:octocat:](https://github.com/noveens/distill_cf) [:book:](./citations/sachdeva2022data.txt) 345 | + [Gradient Matching for Categorical Data Distillation in CTR Prediction](https://dl.acm.org/doi/10.1145/3604915.3608769) (Chen Wang et al., RecSys 2023) [:book:](./citations/wang2023cgm.txt) 346 | 347 | 348 | 349 | ### Blackbox Optimization 350 | + [Bidirectional Learning for Offline Infinite-width Model-based Optimization](https://arxiv.org/abs/2209.07507) (Can Chen et al., NeurIPS 2022) [:octocat:](https://github.com/ggchen1997/bdi) [:book:](./citations/chen2022bidirectional.txt) 351 | + [Bidirectional Learning for Offline Model-based Biological Sequence Design](https://arxiv.org/abs/2301.02931) (Can Chen et al., ICML 2023) [:octocat:](https://github.com/GGchen1997/BIB-ICML2023-Submission) [:book:](./citations/chen2023bidirectional.txt) 352 | 353 | 354 | 355 | ### Trustworthy 356 | + [Can We Achieve Robustness from Data Alone?](https://arxiv.org/abs/2207.11727) (Nikolaos Tsilivis et al., ICML 2022 Workshop) 
[:book:](./citations/tsilivis2022robust.txt) 357 | + [Towards Robust Dataset Learning](https://arxiv.org/abs/2211.10752) (Yihan Wu et al., 2022) [:book:](./citations/wu2022towards.txt) 358 | + [Rethinking Data Distillation: Do Not Overlook Calibration](https://arxiv.org/abs/2307.12463) (Dongyao Zhu et al., ICCV 2023) [:book:](./citations/zhu2023calibration.txt) 359 | + [Towards Trustworthy Dataset Distillation](https://arxiv.org/abs/2307.09165) (Shijie Ma et al., PR 2024) [:octocat:](https://github.com/mashijie1028/TrustDD/) [:book:](./citations/ma2024trustworthy.txt) 360 | + [Group Distributionally Robust Dataset Distillation with Risk Minimization](https://arxiv.org/abs/2402.04676) (Saeed Vahidian & Mingyu Wang & Jianyang Gu et al., 2024) [:octocat:](https://github.com/Mming11/RobustDatasetDistillation) [:book:](./citations/vahidian2024group.txt) 361 | + [Towards Adversarially Robust Dataset Distillation by Curvature Regularization](https://arxiv.org/abs/2403.10045) (Eric Xue et al., 2024) [:book:](./citations/xue2024robust.txt) 362 | 363 | 364 | 365 | ### Text 366 | + [Data Distillation for Text Classification](https://arxiv.org/abs/2104.08448) (Yongqi Li et al., 2021) [:book:](./citations/li2021text.txt) 367 | + [Dataset Distillation with Attention Labels for Fine-tuning BERT](https://aclanthology.org/2023.acl-short.12/) (Aru Maekawa et al., ACL 2023) [:octocat:](https://github.com/arumaekawa/dataset-distillation-with-attention-labels) [:book:](./citations/maekawa2023text.txt) 368 | + [DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation](https://arxiv.org/abs/2404.00264) (Aru Maekawa et al., NAACL 2024) [:octocat:](https://github.com/arumaekawa/DiLM) [:book:](./citations/maekawa2024dilm.txt) 369 | 370 | 371 | 372 | ### Tabular 373 | + [New Properties of the Data Distillation Method When Working With Tabular Data](https://arxiv.org/abs/2010.09839) (Dmitry Medvedev et al., AIST 2020) 
[:octocat:](https://github.com/dm-medvedev/dataset-distillation) [:book:](./citations/medvedev2020tabular.txt) 374 | 375 | 376 | 377 | ### Retrieval 378 | + [Towards Efficient Deep Hashing Retrieval: Condensing Your Data via Feature-Embedding Matching](https://arxiv.org/abs/2305.18076) (Tao Feng & Jie Zhang et al., 2023) [:book:](./citations/feng2023hash.txt) 379 | 380 | 381 | 382 | ### Video 383 | + [Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement](https://arxiv.org/abs/2312.00362) (Ziyu Wang & Yue Xu et al., CVPR 2024) [:octocat:](https://github.com/yuz1wan/video_distillation) [:book:](./citations/wang2023dancing.txt) 384 | 385 | 386 | 387 | ### Domain Adaptation 388 | + [Multi-Source Domain Adaptation Meets Dataset Distillation through Dataset Dictionary Learning](https://arxiv.org/abs/2309.07666) (Eduardo Montesuma et al., ICASSP 2024) [:book:](./citations/montesuma2024multi.txt) 389 | 390 | 391 | 392 | ### Super Resolution 393 | + [GSDD: Generative Space Dataset Distillation for Image Super-resolution](https://ojs.aaai.org/index.php/AAAI/article/view/28534) (Haiyu Zhang et al., AAAI 2024) [:book:](./citations/zhang2024super.txt) 394 | 395 | 396 | 397 | ### Time Series 398 | + [Dataset Condensation for Time Series Classification via Dual Domain Matching](https://arxiv.org/abs/2403.07245) (Zhanyu Liu et al., KDD 2024) [:octocat:](https://github.com/zhyliu00/TimeSeriesCond) [:book:](./citations/liu2024time.txt) 399 | + [CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting](https://arxiv.org/abs/2406.02131) (Jianrong Ding & Zhanyu Liu et al., 2024) [:octocat:](https://github.com/RafaDD/CondTSF) [:book:](./citations/ding2024time.txt) 400 | 401 | 402 | 403 | ### Speech 404 | + [Dataset-Distillation Generative Model for Speech Emotion Recognition](https://arxiv.org/abs/2406.02963) (Fabian Ritter-Gutierrez et al., Interspeech 2024) [:book:](./citations/fabian2024speech.txt) 405 | 406 | 407 | 408 | ### Machine 
Unlearning 409 | + [Distilled Datamodel with Reverse Gradient Matching](https://arxiv.org/abs/2404.14006) (Jingwen Ye et al., CVPR 2024) [:book:](./citations/ye2024datamodel.txt) 410 | 411 | 412 | 413 | ### Reinforcement Learning 414 | + [Dataset Distillation for Offline Reinforcement Learning](https://arxiv.org/abs/2407.20299) (Jonathan Light & Yuanzhe Liu et al., ICML 2024 Workshop) [:globe_with_meridians:](https://datasetdistillation4rl.github.io/) [:octocat:](https://github.com/ggflow123/DDRL) [:book:](./citations/light2024rl.txt) 415 | 416 | ## Media Coverage 417 | + [Beginning of Awesome Dataset Distillation](https://twitter.com/TongzhouWang/status/1560043815204970497) 418 | + [Most Popular AI Research Aug 2022](https://www.libhunt.com/posts/874974-d-most-popular-ai-research-aug-2022-ranked-based-on-github-stars) 419 | + [A Project to Help You Understand Dataset Distillation](https://www.jiqizhixin.com/articles/2022-10-11-22) 420 | + [The Condensed Is the Essence: A Unified View of Dataset Distillation](https://mp.weixin.qq.com/s/__IjS0_FMpu35X9cNhNhPg) 421 | 422 | ## Star History 423 | [![Star History Chart](https://api.star-history.com/svg?repos=Guang000/Awesome-Dataset-Distillation&type=Date)](https://star-history.com/#Guang000/Awesome-Dataset-Distillation&Date) 424 | 425 | ## Citing Awesome Dataset Distillation 426 | If you find this project useful for your research, please use the following BibTeX entry. 
427 | ``` 428 | @misc{li2022awesome, 429 | author={Li, Guang and Zhao, Bo and Wang, Tongzhou}, 430 | title={Awesome Dataset Distillation}, 431 | howpublished={\url{https://github.com/Guang000/Awesome-Dataset-Distillation}}, 432 | year={2022} 433 | } 434 | ``` 435 | 436 | ## Acknowledgments 437 | We would like to express our heartfelt thanks to [Nikolaos Tsilivis](https://github.com/Tsili42), [Wei Jin](https://github.com/ChandlerBang), [Yongchao Zhou](https://github.com/yongchao97), [Noveen Sachdeva](https://github.com/noveens), [Can Chen](https://github.com/GGchen1997), [Guangxiang Zhao](https://github.com/zhaoguangxiang), [Shiye Lei](https://github.com/LeavesLei), [Xinchao Wang](https://sites.google.com/site/sitexinchaowang/), [Dmitry Medvedev](https://github.com/dm-medvedev), [Seungjae Shin](https://github.com/SJShin-AI), [Jiawei Du](https://github.com/AngusDujw), [Yidi Jiang](https://github.com/Jiang-Yidi), [Xindi Wu](https://github.com/XindiWu), [Guangyi Liu](https://github.com/lgy0404), [Yilun Liu](https://github.com/superallen13), [Kai Wang](https://github.com/kaiwang960112), [Yue Xu](https://github.com/silicx), [Anjia Cao](https://github.com/CAOANJIA), [Jianyang Gu](https://github.com/vimar-gu), [Yuanzhen Feng](https://github.com/fengyzpku), [Peng Sun](https://github.com/sp12138), [Ahmad Sajedi](https://github.com/AhmadSajedii), 438 | [Zhihao Sui](https://github.com/suizhihao), [Ziyu Wang](https://github.com/yuz1wan), [Haoyang Liu](https://github.com/Liu-Hy), [Eduardo Montesuma](https://github.com/eddardd), [Shengbo Gong](https://github.com/rockcor), [Zheng Zhou](https://github.com/zhouzhengqd), [Zhenghao Zhao](https://github.com/ichbill), [Duo Su](https://github.com/suduo94), [Tianhang Zheng](https://github.com/tianzheng4), [Shijie Ma](https://github.com/mashijie1028), [Wei Wei](https://github.com/WeiWeic6222848), [Yantai Yang](https://github.com/Hiter-Q), [Shaobo Wang](https://github.com/gszfwsb) and [Xinhao Zhong](https://github.com/ndhg1213) for their 
valuable suggestions and contributions. 439 | 440 | The [Homepage](https://guang000.github.io/Awesome-Dataset-Distillation/) of Awesome Dataset Distillation was designed and maintained by [Longzhen Li](https://github.com/LOVELESSG). 441 | -------------------------------------------------------------------------------- /css/colors.module.css: -------------------------------------------------------------------------------- 1 | .primary { 2 | background-color: var(--md-sys-color-primary); 3 | } 4 | .primary-text { 5 | color: var(--md-sys-color-primary); 6 | } 7 | .on-primary { 8 | background-color: var(--md-sys-color-on-primary); 9 | } 10 | .on-primary-text { 11 | color: var(--md-sys-color-on-primary); 12 | } 13 | .primary-container { 14 | background-color: var(--md-sys-color-primary-container); 15 | } 16 | .primary-container-text { 17 | color: var(--md-sys-color-primary-container); 18 | } 19 | .on-primary-container { 20 | background-color: var(--md-sys-color-on-primary-container); 21 | } 22 | .on-primary-container-text { 23 | color: var(--md-sys-color-on-primary-container); 24 | } 25 | .secondary { 26 | background-color: var(--md-sys-color-secondary); 27 | } 28 | .secondary-text { 29 | color: var(--md-sys-color-secondary); 30 | } 31 | .on-secondary { 32 | background-color: var(--md-sys-color-on-secondary); 33 | } 34 | .on-secondary-text { 35 | color: var(--md-sys-color-on-secondary); 36 | } 37 | .secondary-container { 38 | background-color: var(--md-sys-color-secondary-container); 39 | } 40 | .secondary-container-text { 41 | color: var(--md-sys-color-secondary-container); 42 | } 43 | .on-secondary-container { 44 | background-color: var(--md-sys-color-on-secondary-container); 45 | } 46 | .on-secondary-container-text { 47 | color: var(--md-sys-color-on-secondary-container); 48 | } 49 | .tertiary { 50 | background-color: var(--md-sys-color-tertiary); 51 | } 52 | .tertiary-text { 53 | color: var(--md-sys-color-tertiary); 54 | } 55 | .on-tertiary { 56 | background-color: 
var(--md-sys-color-on-tertiary); 57 | } 58 | .on-tertiary-text { 59 | color: var(--md-sys-color-on-tertiary); 60 | } 61 | .tertiary-container { 62 | background-color: var(--md-sys-color-tertiary-container); 63 | } 64 | .tertiary-container-text { 65 | color: var(--md-sys-color-tertiary-container); 66 | } 67 | .on-tertiary-container { 68 | background-color: var(--md-sys-color-on-tertiary-container); 69 | } 70 | .on-tertiary-container-text { 71 | color: var(--md-sys-color-on-tertiary-container); 72 | } 73 | .error { 74 | background-color: var(--md-sys-color-error); 75 | } 76 | .error-text { 77 | color: var(--md-sys-color-error); 78 | } 79 | .error-container { 80 | background-color: var(--md-sys-color-error-container); 81 | } 82 | .error-container-text { 83 | color: var(--md-sys-color-error-container); 84 | } 85 | .on-error { 86 | background-color: var(--md-sys-color-on-error); 87 | } 88 | .on-error-text { 89 | color: var(--md-sys-color-on-error); 90 | } 91 | .on-error-container { 92 | background-color: var(--md-sys-color-on-error-container); 93 | } 94 | .on-error-container-text { 95 | color: var(--md-sys-color-on-error-container); 96 | } 97 | .background { 98 | background-color: var(--md-sys-color-background); 99 | } 100 | .background-text { 101 | color: var(--md-sys-color-background); 102 | } 103 | .on-background { 104 | background-color: var(--md-sys-color-on-background); 105 | } 106 | .on-background-text { 107 | color: var(--md-sys-color-on-background); 108 | } 109 | .surface { 110 | background-color: var(--md-sys-color-surface); 111 | } 112 | .surface-text { 113 | color: var(--md-sys-color-surface); 114 | } 115 | .on-surface { 116 | background-color: var(--md-sys-color-on-surface); 117 | } 118 | .on-surface-text { 119 | color: var(--md-sys-color-on-surface); 120 | } 121 | .surface-variant { 122 | background-color: var(--md-sys-color-surface-variant); 123 | } 124 | .surface-variant-text { 125 | color: var(--md-sys-color-surface-variant); 126 | } 127 | 
.on-surface-variant { 128 | background-color: var(--md-sys-color-on-surface-variant); 129 | } 130 | .on-surface-variant-text { 131 | color: var(--md-sys-color-on-surface-variant); 132 | } 133 | .outline { 134 | background-color: var(--md-sys-color-outline); 135 | } 136 | .outline-text { 137 | color: var(--md-sys-color-outline); 138 | } 139 | .inverse-on-surface { 140 | background-color: var(--md-sys-color-inverse-on-surface); 141 | } 142 | .inverse-on-surface-text { 143 | color: var(--md-sys-color-inverse-on-surface); 144 | } 145 | .inverse-surface { 146 | background-color: var(--md-sys-color-inverse-surface); 147 | } 148 | .inverse-surface-text { 149 | color: var(--md-sys-color-inverse-surface); 150 | } 151 | .inverse-primary { 152 | background-color: var(--md-sys-color-inverse-primary); 153 | } 154 | .inverse-primary-text { 155 | color: var(--md-sys-color-inverse-primary); 156 | } 157 | .shadow { 158 | background-color: var(--md-sys-color-shadow); 159 | } 160 | .shadow-text { 161 | color: var(--md-sys-color-shadow); 162 | } 163 | .surface-tint { 164 | background-color: var(--md-sys-color-surface-tint); 165 | } 166 | .surface-tint-text { 167 | color: var(--md-sys-color-surface-tint); 168 | } 169 | .outline-variant { 170 | background-color: var(--md-sys-color-outline-variant); 171 | } 172 | .outline-variant-text { 173 | color: var(--md-sys-color-outline-variant); 174 | } 175 | .scrim { 176 | background-color: var(--md-sys-color-scrim); 177 | } 178 | .scrim-text { 179 | color: var(--md-sys-color-scrim); 180 | } 181 | -------------------------------------------------------------------------------- /css/styles.css: -------------------------------------------------------------------------------- 1 | @import url(theme.css); 2 | 3 | html { 4 | scroll-padding-top: 72px; 5 | } 6 | 7 | a { 8 | text-decoration: none; 9 | } 10 | 11 | a:hover { 12 | text-decoration: underline; 13 | } 14 | 15 | ul { 16 | margin: 0; 17 | padding: 0; 18 | list-style-type: none; 19 | } 20 | 21 | 
body { 22 | margin-top: 0; 23 | } 24 | 25 | header { 26 | margin: 0; 27 | width: 100%; 28 | position: fixed; 29 | } 30 | 31 | hr { 32 | text-align: center; 33 | margin-top: 0; 34 | margin-bottom: 0; 35 | } 36 | 37 | .top-bar { 38 | display: block; 39 | margin: 0; 40 | padding: 0; 41 | height: 64px; 42 | } 43 | 44 | .top-bar-left { 45 | display: flex; 46 | justify-content: flex-start; 47 | order: -1; 48 | flex: 1 1 auto; 49 | align-items: center; 50 | } 51 | 52 | .navigation-button { 53 | fill: var(--md-sys-color-on-background); 54 | display: grid; 55 | height: 48px; 56 | width: 48px; 57 | border-radius: 50%; 58 | place-items: center; 59 | } 60 | 61 | .navigation-button:hover { 62 | background-color: var(--md-sys-color-outline-variant); 63 | } 64 | 65 | .close-navigation-button { 66 | fill: var(--md-sys-color-on-surface); 67 | display: grid; 68 | height: 48px; 69 | width: 48px; 70 | margin-left: 8px; 71 | border-radius: 50%; 72 | place-items: center; 73 | } 74 | 75 | .close-navigation-button:hover { 76 | background-color: var(--md-sys-color-inverse-on-surface); 77 | } 78 | 79 | .top-bar-title { 80 | padding-left: 16px; 81 | } 82 | 83 | .sidebar { 84 | height: 100%; 85 | width: 0; 86 | position: fixed; 87 | z-index: 1; 88 | top: 0; 89 | left: 0; 90 | padding-top: 0; 91 | overflow-x: hidden; 92 | transition: 0.3s; 93 | } 94 | 95 | .overlay { 96 | position: fixed; 97 | z-index: 1; 98 | background-color: rgba(0, 0, 0, 0.8); 99 | } 100 | 101 | .navigation-list { 102 | display: block; 103 | } 104 | 105 | .navigation-item { 106 | margin: 8px 0; 107 | } 108 | 109 | .list-section { 110 | display: block; 111 | margin: 0 16px 8px 16px; 112 | padding: 24px; 113 | border-radius: 24px; 114 | } 115 | 116 | .top-title { 117 | width: 100%; 118 | margin: 0 auto; 119 | max-width: 1760px; 120 | } 121 | 122 | .count-container { 123 | border-radius: 24px; 124 | padding: 80px 24px 0 24px; 125 | margin: 0; 126 | display: grid; 127 | justify-content: space-between; 128 | grid-column-gap: 
20px; 129 | column-gap: 20px; 130 | grid-template-columns: 6fr 1fr 6fr; 131 | user-select: none; 132 | } 133 | 134 | .item-number-container { 135 | border-radius: 24px; 136 | cursor: pointer; 137 | } 138 | 139 | .item-number-container:hover { 140 | color: var(--md-sys-color-on-primary-container); 141 | background-color: var(--md-sys-color-primary-container); 142 | } 143 | 144 | .count-title { 145 | margin: 0; 146 | text-align: center; 147 | } 148 | 149 | .count-number { 150 | margin: 16px 0 0 0; 151 | text-align: center; 152 | } 153 | 154 | .count-divider { 155 | width: 0; 156 | height: 100%; 157 | margin: 0 auto; 158 | } 159 | 160 | .title-brief { 161 | background-image: url(../images/titleBackground.png); 162 | background-position: 95% 5%; 163 | border-radius: 24px; 164 | padding: 56px; 165 | margin: 0; 166 | text-align: center; 167 | } 168 | 169 | 170 | 171 | .title { 172 | margin: 0 0 8px 0; 173 | font-weight: 475; 174 | font-size: 88px; 175 | } 176 | 177 | .brief { 178 | margin: 0; 179 | padding: 0; 180 | border: 0; 181 | } 182 | 183 | .github-homepage { 184 | margin: 24px 0 0 0; 185 | border: 0; 186 | border-radius: 48px; 187 | height: 60px; 188 | width: 160px; 189 | text-align: center; 190 | align-items: center; 191 | justify-content: center; 192 | cursor: pointer; 193 | } 194 | 195 | .github-homepage:hover { 196 | background-color: var(--md-sys-color-inverse-primary); 197 | } 198 | 199 | .overall-container { 200 | width: 100%; 201 | margin: 0 auto; 202 | max-width: 1200px; 203 | } 204 | 205 | .section-title { 206 | margin: 80px 24px 8px 24px; 207 | } 208 | 209 | .section-subtitle { 210 | margin: 24px 24px; 211 | } 212 | 213 | .background-vision-container { 214 | border-radius: 24px; 215 | padding: 24px; 216 | } 217 | 218 | .background-vision { 219 | line-height: 24px; 220 | margin: 0; 221 | } 222 | 223 | .lastupdate-container { 224 | border-radius: 24px; 225 | overflow: auto; 226 | padding: 8px 0 0 0; 227 | height: 360px; 228 | scrollbar-width: none; 229 | 
} 230 | 231 | .last-essay-container { 232 | border-radius: 24px; 233 | padding: 24px; 234 | margin: 0; 235 | display: grid; 236 | justify-content: space-between; 237 | grid-column-gap: 20px; 238 | column-gap: 20px; 239 | grid-template-columns: 3fr 1fr; 240 | } 241 | 242 | .last-essay-container:hover { 243 | color: var(--md-sys-color-on-primary-container); 244 | background-color: var(--md-sys-color-primary-container); 245 | } 246 | 247 | /*.lastupdate-container::-webkit-scrollbar { 248 | display: none; 249 | }*/ 250 | 251 | .edit-date { 252 | padding: 8px 24px; 253 | } 254 | 255 | .essay-container { 256 | border-radius: 24px; 257 | padding: 24px; 258 | margin: 0 0 8px 0; 259 | display: grid; 260 | justify-content: space-between; 261 | grid-column-gap: 20px; 262 | column-gap: 20px; 263 | grid-template-columns: 3fr 1fr; 264 | } 265 | 266 | .essay-container:hover { 267 | color: var(--md-sys-color-on-primary-container); 268 | background-color: var(--md-sys-color-primary-container); 269 | } 270 | 271 | .essay-content { 272 | margin: 0; 273 | padding: 0; 274 | } 275 | 276 | .essay-author { 277 | margin: 0; 278 | padding: 0; 279 | } 280 | 281 | .button-group { 282 | display: grid; 283 | grid-auto-flow: column; 284 | place-items: center; 285 | } 286 | 287 | .essay-button { 288 | display: grid; 289 | fill: var(--md-sys-color-on-surface-variant); 290 | height: 40px; 291 | width: 40px; 292 | border-radius: 50%; 293 | place-items: center; 294 | } 295 | 296 | .essay-button:hover { 297 | fill: var(--md-sys-color-on-primary); 298 | background-color: var(--md-sys-color-primary); 299 | } 300 | 301 | .placeholder { 302 | height: 40px; 303 | width: 40px; 304 | border-radius: 50%; 305 | } 306 | 307 | .media-coverage-content-container { 308 | border-radius: 24px; 309 | padding: 24px; 310 | margin: 0 0 8px 0; 311 | } 312 | 313 | .media-coverage-container { 314 | display: grid; 315 | justify-content: space-between; 316 | column-gap: 8px; 317 | grid-template-columns: 1fr 1fr; 318 | } 319 | 
320 | .page-footer-container { 321 | display: flex; 322 | flex-direction: column; 323 | align-items: center; 324 | margin: 120px 8px 0; 325 | } 326 | 327 | .page-footer { 328 | padding: 64px 40px; 329 | } 330 | 331 | .about { 332 | display: grid; 333 | justify-content: space-between; 334 | max-width: 1200px; 335 | grid-column-gap: 20px; 336 | column-gap: 20px; 337 | grid-template-columns: 1fr 1fr; 338 | } 339 | 340 | .page-footer-title { 341 | margin: 0; 342 | } 343 | 344 | .cite { 345 | margin-top: 24px; 346 | margin-right: 64px; 347 | } 348 | 349 | .visitor-map { 350 | margin-top: 24px; 351 | margin-right: 64px; 352 | } 353 | 354 | .page-footer-instruction { 355 | margin: 0; 356 | line-height: 24px; 357 | } 358 | 359 | @media screen and (max-width: 857px) { 360 | .show-on-middle-screens { 361 | display: none; 362 | } 363 | .show-on-large-screens { 364 | display: none; 365 | } 366 | .title { 367 | margin: 0 0 8px 0; 368 | font-weight: 475; 369 | font-size: 45px 370 | } 371 | .page-footer-container{ 372 | display: block; 373 | width: auto; 374 | } 375 | .about { 376 | display: block; 377 | width: auto; 378 | } 379 | .media-coverage-container { 380 | display: block; 381 | } 382 | } 383 | 384 | @media screen and (min-width: 858px) and (max-width: 1294px) { 385 | .show-on-large-screens { 386 | display: none; 387 | } 388 | .show-on-small-screens { 389 | display: none; 390 | } 391 | } 392 | 393 | @media screen and (min-width: 1295px) { 394 | .show-on-middle-screens { 395 | display: none; 396 | } 397 | .show-on-small-screens { 398 | display: none; 399 | } 400 | } 401 | 402 | @media (prefers-color-scheme: dark) { 403 | .title-brief { 404 | filter: invert(100%); 405 | } 406 | 407 | .visitor-map { 408 | filter: invert(89%); 409 | } 410 | } -------------------------------------------------------------------------------- /css/theme.css: -------------------------------------------------------------------------------- 1 | @import url(tokens.css); 2 | @import 
url(colors.module.css); 3 | @import url(typography.module.css); 4 | @import url(theme.light.css) (prefers-color-scheme: light); 5 | @import url(theme.dark.css) (prefers-color-scheme: dark); 6 | -------------------------------------------------------------------------------- /css/theme.dark.css: -------------------------------------------------------------------------------- 1 | :root { 2 | --md-sys-color-primary: var(--md-sys-color-primary-dark); 3 | --md-sys-color-on-primary: var(--md-sys-color-on-primary-dark); 4 | --md-sys-color-primary-container: var(--md-sys-color-primary-container-dark); 5 | --md-sys-color-on-primary-container: var(--md-sys-color-on-primary-container-dark); 6 | --md-sys-color-secondary: var(--md-sys-color-secondary-dark); 7 | --md-sys-color-on-secondary: var(--md-sys-color-on-secondary-dark); 8 | --md-sys-color-secondary-container: var(--md-sys-color-secondary-container-dark); 9 | --md-sys-color-on-secondary-container: var(--md-sys-color-on-secondary-container-dark); 10 | --md-sys-color-tertiary: var(--md-sys-color-tertiary-dark); 11 | --md-sys-color-on-tertiary: var(--md-sys-color-on-tertiary-dark); 12 | --md-sys-color-tertiary-container: var(--md-sys-color-tertiary-container-dark); 13 | --md-sys-color-on-tertiary-container: var(--md-sys-color-on-tertiary-container-dark); 14 | --md-sys-color-error: var(--md-sys-color-error-dark); 15 | --md-sys-color-error-container: var(--md-sys-color-error-container-dark); 16 | --md-sys-color-on-error: var(--md-sys-color-on-error-dark); 17 | --md-sys-color-on-error-container: var(--md-sys-color-on-error-container-dark); 18 | --md-sys-color-background: var(--md-sys-color-background-dark); 19 | --md-sys-color-on-background: var(--md-sys-color-on-background-dark); 20 | --md-sys-color-surface: var(--md-sys-color-surface-dark); 21 | --md-sys-color-on-surface: var(--md-sys-color-on-surface-dark); 22 | --md-sys-color-surface-variant: var(--md-sys-color-surface-variant-dark); 23 | --md-sys-color-on-surface-variant: 
var(--md-sys-color-on-surface-variant-dark); 24 | --md-sys-color-outline: var(--md-sys-color-outline-dark); 25 | --md-sys-color-inverse-on-surface: var(--md-sys-color-inverse-on-surface-dark); 26 | --md-sys-color-inverse-surface: var(--md-sys-color-inverse-surface-dark); 27 | --md-sys-color-inverse-primary: var(--md-sys-color-inverse-primary-dark); 28 | --md-sys-color-shadow: var(--md-sys-color-shadow-dark); 29 | --md-sys-color-surface-tint: var(--md-sys-color-surface-tint-dark); 30 | --md-sys-color-outline-variant: var(--md-sys-color-outline-variant-dark); 31 | --md-sys-color-scrim: var(--md-sys-color-scrim-dark); 32 | } 33 | -------------------------------------------------------------------------------- /css/theme.light.css: -------------------------------------------------------------------------------- 1 | :root { 2 | --md-sys-color-primary: var(--md-sys-color-primary-light); 3 | --md-sys-color-on-primary: var(--md-sys-color-on-primary-light); 4 | --md-sys-color-primary-container: var(--md-sys-color-primary-container-light); 5 | --md-sys-color-on-primary-container: var(--md-sys-color-on-primary-container-light); 6 | --md-sys-color-secondary: var(--md-sys-color-secondary-light); 7 | --md-sys-color-on-secondary: var(--md-sys-color-on-secondary-light); 8 | --md-sys-color-secondary-container: var(--md-sys-color-secondary-container-light); 9 | --md-sys-color-on-secondary-container: var(--md-sys-color-on-secondary-container-light); 10 | --md-sys-color-tertiary: var(--md-sys-color-tertiary-light); 11 | --md-sys-color-on-tertiary: var(--md-sys-color-on-tertiary-light); 12 | --md-sys-color-tertiary-container: var(--md-sys-color-tertiary-container-light); 13 | --md-sys-color-on-tertiary-container: var(--md-sys-color-on-tertiary-container-light); 14 | --md-sys-color-error: var(--md-sys-color-error-light); 15 | --md-sys-color-error-container: var(--md-sys-color-error-container-light); 16 | --md-sys-color-on-error: var(--md-sys-color-on-error-light); 17 | 
--md-sys-color-on-error-container: var(--md-sys-color-on-error-container-light); 18 | --md-sys-color-background: var(--md-sys-color-background-light); 19 | --md-sys-color-on-background: var(--md-sys-color-on-background-light); 20 | --md-sys-color-surface: var(--md-sys-color-surface-light); 21 | --md-sys-color-on-surface: var(--md-sys-color-on-surface-light); 22 | --md-sys-color-surface-variant: var(--md-sys-color-surface-variant-light); 23 | --md-sys-color-on-surface-variant: var(--md-sys-color-on-surface-variant-light); 24 | --md-sys-color-outline: var(--md-sys-color-outline-light); 25 | --md-sys-color-inverse-on-surface: var(--md-sys-color-inverse-on-surface-light); 26 | --md-sys-color-inverse-surface: var(--md-sys-color-inverse-surface-light); 27 | --md-sys-color-inverse-primary: var(--md-sys-color-inverse-primary-light); 28 | --md-sys-color-shadow: var(--md-sys-color-shadow-light); 29 | --md-sys-color-surface-tint: var(--md-sys-color-surface-tint-light); 30 | --md-sys-color-outline-variant: var(--md-sys-color-outline-variant-light); 31 | --md-sys-color-scrim: var(--md-sys-color-scrim-light); 32 | } 33 | -------------------------------------------------------------------------------- /css/tokens.css: -------------------------------------------------------------------------------- 1 | :root { 2 | --md-source: #97cfea; 3 | /* primary */ 4 | --md-ref-palette-primary0: #000000; 5 | --md-ref-palette-primary10: #001f2a; 6 | --md-ref-palette-primary20: #003546; 7 | --md-ref-palette-primary25: #004155; 8 | --md-ref-palette-primary30: #004d64; 9 | --md-ref-palette-primary35: #005a74; 10 | --md-ref-palette-primary40: #006684; 11 | --md-ref-palette-primary50: #0081a5; 12 | --md-ref-palette-primary60: #0b9cc7; 13 | --md-ref-palette-primary70: #41b7e4; 14 | --md-ref-palette-primary80: #68d3ff; 15 | --md-ref-palette-primary90: #bee9ff; 16 | --md-ref-palette-primary95: #e0f4ff; 17 | --md-ref-palette-primary98: #f4faff; 18 | --md-ref-palette-primary99: #fafcff; 19 | 
--md-ref-palette-primary100: #ffffff; 20 | /* secondary */ 21 | --md-ref-palette-secondary0: #000000; 22 | --md-ref-palette-secondary10: #081e27; 23 | --md-ref-palette-secondary20: #1f333c; 24 | --md-ref-palette-secondary25: #2a3e48; 25 | --md-ref-palette-secondary30: #354a53; 26 | --md-ref-palette-secondary35: #41555f; 27 | --md-ref-palette-secondary40: #4d616c; 28 | --md-ref-palette-secondary50: #657a85; 29 | --md-ref-palette-secondary60: #7f949f; 30 | --md-ref-palette-secondary70: #99aeba; 31 | --md-ref-palette-secondary80: #b4cad6; 32 | --md-ref-palette-secondary90: #d0e6f2; 33 | --md-ref-palette-secondary95: #e0f4ff; 34 | --md-ref-palette-secondary98: #f4faff; 35 | --md-ref-palette-secondary99: #fafcff; 36 | --md-ref-palette-secondary100: #ffffff; 37 | /* tertiary */ 38 | --md-ref-palette-tertiary0: #000000; 39 | --md-ref-palette-tertiary10: #1a1836; 40 | --md-ref-palette-tertiary20: #2f2d4d; 41 | --md-ref-palette-tertiary25: #3a3858; 42 | --md-ref-palette-tertiary30: #454364; 43 | --md-ref-palette-tertiary35: #514f71; 44 | --md-ref-palette-tertiary40: #5d5b7d; 45 | --md-ref-palette-tertiary50: #767397; 46 | --md-ref-palette-tertiary60: #908db2; 47 | --md-ref-palette-tertiary70: #aba7ce; 48 | --md-ref-palette-tertiary80: #c6c2ea; 49 | --md-ref-palette-tertiary90: #e3dfff; 50 | --md-ref-palette-tertiary95: #f3eeff; 51 | --md-ref-palette-tertiary98: #fcf8ff; 52 | --md-ref-palette-tertiary99: #fffbff; 53 | --md-ref-palette-tertiary100: #ffffff; 54 | /* neutral */ 55 | --md-ref-palette-neutral0: #000000; 56 | --md-ref-palette-neutral10: #191c1e; 57 | --md-ref-palette-neutral20: #2e3132; 58 | --md-ref-palette-neutral25: #393c3e; 59 | --md-ref-palette-neutral30: #444749; 60 | --md-ref-palette-neutral35: #505355; 61 | --md-ref-palette-neutral40: #5c5f61; 62 | --md-ref-palette-neutral50: #757779; 63 | --md-ref-palette-neutral60: #8f9193; 64 | --md-ref-palette-neutral70: #a9abad; 65 | --md-ref-palette-neutral80: #c5c7c9; 66 | --md-ref-palette-neutral90: #e1e2e4; 67 | 
--md-ref-palette-neutral95: #eff1f3; 68 | --md-ref-palette-neutral98: #f8f9fb; 69 | --md-ref-palette-neutral99: #fbfcfe; 70 | --md-ref-palette-neutral100: #ffffff; 71 | /* neutral-variant */ 72 | --md-ref-palette-neutral-variant0: #000000; 73 | --md-ref-palette-neutral-variant10: #151d20; 74 | --md-ref-palette-neutral-variant20: #2a3136; 75 | --md-ref-palette-neutral-variant25: #353d41; 76 | --md-ref-palette-neutral-variant30: #40484c; 77 | --md-ref-palette-neutral-variant35: #4c5458; 78 | --md-ref-palette-neutral-variant40: #585f64; 79 | --md-ref-palette-neutral-variant50: #70787d; 80 | --md-ref-palette-neutral-variant60: #8a9297; 81 | --md-ref-palette-neutral-variant70: #a5acb1; 82 | --md-ref-palette-neutral-variant80: #c0c8cd; 83 | --md-ref-palette-neutral-variant90: #dce4e9; 84 | --md-ref-palette-neutral-variant95: #eaf2f7; 85 | --md-ref-palette-neutral-variant98: #f4faff; 86 | --md-ref-palette-neutral-variant99: #fafcff; 87 | --md-ref-palette-neutral-variant100: #ffffff; 88 | /* error */ 89 | --md-ref-palette-error0: #000000; 90 | --md-ref-palette-error10: #410002; 91 | --md-ref-palette-error20: #690005; 92 | --md-ref-palette-error25: #7e0007; 93 | --md-ref-palette-error30: #93000a; 94 | --md-ref-palette-error35: #a80710; 95 | --md-ref-palette-error40: #ba1a1a; 96 | --md-ref-palette-error50: #de3730; 97 | --md-ref-palette-error60: #ff5449; 98 | --md-ref-palette-error70: #ff897d; 99 | --md-ref-palette-error80: #ffb4ab; 100 | --md-ref-palette-error90: #ffdad6; 101 | --md-ref-palette-error95: #ffedea; 102 | --md-ref-palette-error98: #fff8f7; 103 | --md-ref-palette-error99: #fffbff; 104 | --md-ref-palette-error100: #ffffff; 105 | /* light */ 106 | --md-sys-color-primary-light: #006684; 107 | --md-sys-color-on-primary-light: #ffffff; 108 | --md-sys-color-primary-container-light: #bee9ff; 109 | --md-sys-color-on-primary-container-light: #001f2a; 110 | --md-sys-color-secondary-light: #4d616c; 111 | --md-sys-color-on-secondary-light: #ffffff; 112 | 
--md-sys-color-secondary-container-light: #d0e6f2; 113 | --md-sys-color-on-secondary-container-light: #081e27; 114 | --md-sys-color-tertiary-light: #5d5b7d; 115 | --md-sys-color-on-tertiary-light: #ffffff; 116 | --md-sys-color-tertiary-container-light: #e3dfff; 117 | --md-sys-color-on-tertiary-container-light: #1a1836; 118 | --md-sys-color-error-light: #ba1a1a; 119 | --md-sys-color-error-container-light: #ffdad6; 120 | --md-sys-color-on-error-light: #ffffff; 121 | --md-sys-color-on-error-container-light: #410002; 122 | --md-sys-color-background-light: #fbfcfe; 123 | --md-sys-color-on-background-light: #191c1e; 124 | --md-sys-color-surface-light: #fbfcfe; 125 | --md-sys-color-on-surface-light: #191c1e; 126 | --md-sys-color-surface-variant-light: #dce4e9; 127 | --md-sys-color-on-surface-variant-light: #40484c; 128 | --md-sys-color-outline-light: #70787d; 129 | --md-sys-color-inverse-on-surface-light: #eff1f3; 130 | --md-sys-color-inverse-surface-light: #2e3132; 131 | --md-sys-color-inverse-primary-light: #68d3ff; 132 | --md-sys-color-shadow-light: #000000; 133 | --md-sys-color-surface-tint-light: #006684; 134 | --md-sys-color-outline-variant-light: #c0c8cd; 135 | --md-sys-color-scrim-light: #000000; 136 | /* dark */ 137 | --md-sys-color-primary-dark: #68d3ff; 138 | --md-sys-color-on-primary-dark: #003546; 139 | --md-sys-color-primary-container-dark: #004d64; 140 | --md-sys-color-on-primary-container-dark: #bee9ff; 141 | --md-sys-color-secondary-dark: #b4cad6; 142 | --md-sys-color-on-secondary-dark: #1f333c; 143 | --md-sys-color-secondary-container-dark: #354a53; 144 | --md-sys-color-on-secondary-container-dark: #d0e6f2; 145 | --md-sys-color-tertiary-dark: #c6c2ea; 146 | --md-sys-color-on-tertiary-dark: #2f2d4d; 147 | --md-sys-color-tertiary-container-dark: #454364; 148 | --md-sys-color-on-tertiary-container-dark: #e3dfff; 149 | --md-sys-color-error-dark: #ffb4ab; 150 | --md-sys-color-error-container-dark: #93000a; 151 | --md-sys-color-on-error-dark: #690005; 152 | 
--md-sys-color-on-error-container-dark: #ffdad6; 153 | --md-sys-color-background-dark: #191c1e; 154 | --md-sys-color-on-background-dark: #e1e2e4; 155 | --md-sys-color-surface-dark: #191c1e; 156 | --md-sys-color-on-surface-dark: #e1e2e4; 157 | --md-sys-color-surface-variant-dark: #40484c; 158 | --md-sys-color-on-surface-variant-dark: #c0c8cd; 159 | --md-sys-color-outline-dark: #8a9297; 160 | --md-sys-color-inverse-on-surface-dark: #191c1e; 161 | --md-sys-color-inverse-surface-dark: #e1e2e4; 162 | --md-sys-color-inverse-primary-dark: #006684; 163 | --md-sys-color-shadow-dark: #000000; 164 | --md-sys-color-surface-tint-dark: #68d3ff; 165 | --md-sys-color-outline-variant-dark: #40484c; 166 | --md-sys-color-scrim-dark: #000000; 167 | /* display - large */ 168 | --md-sys-typescale-display-large-font-family-name: Outfit; 169 | --md-sys-typescale-display-large-font-family-style: Regular; 170 | --md-sys-typescale-display-large-font-weight: 475; 171 | --md-sys-typescale-display-large-font-size: 57px; 172 | --md-sys-typescale-display-large-line-height: 64px; 173 | --md-sys-typescale-display-large-letter-spacing: -0.25px; 174 | /* display - medium */ 175 | --md-sys-typescale-display-medium-font-family-name: Outfit; 176 | --md-sys-typescale-display-medium-font-family-style: Regular; 177 | --md-sys-typescale-display-medium-font-weight: 475; 178 | --md-sys-typescale-display-medium-font-size: 45px; 179 | --md-sys-typescale-display-medium-line-height: 52px; 180 | --md-sys-typescale-display-medium-letter-spacing: 0px; 181 | /* display - small */ 182 | --md-sys-typescale-display-small-font-family-name: Outfit; 183 | --md-sys-typescale-display-small-font-family-style: Regular; 184 | --md-sys-typescale-display-small-font-weight: 475; 185 | --md-sys-typescale-display-small-font-size: 36px; 186 | --md-sys-typescale-display-small-line-height: 44px; 187 | --md-sys-typescale-display-small-letter-spacing: 0px; 188 | /* headline - large */ 189 | 
--md-sys-typescale-headline-large-font-family-name: Outfit; 190 | --md-sys-typescale-headline-large-font-family-style: Regular; 191 | --md-sys-typescale-headline-large-font-weight: 400; 192 | --md-sys-typescale-headline-large-font-size: 32px; 193 | --md-sys-typescale-headline-large-line-height: 40px; 194 | --md-sys-typescale-headline-large-letter-spacing: 0px; 195 | /* headline - medium */ 196 | --md-sys-typescale-headline-medium-font-family-name: Outfit; 197 | --md-sys-typescale-headline-medium-font-family-style: Regular; 198 | --md-sys-typescale-headline-medium-font-weight: 400; 199 | --md-sys-typescale-headline-medium-font-size: 28px; 200 | --md-sys-typescale-headline-medium-line-height: 36px; 201 | --md-sys-typescale-headline-medium-letter-spacing: 0px; 202 | /* headline - small */ 203 | --md-sys-typescale-headline-small-font-family-name: Outfit; 204 | --md-sys-typescale-headline-small-font-family-style: Regular; 205 | --md-sys-typescale-headline-small-font-weight: 400; 206 | --md-sys-typescale-headline-small-font-size: 24px; 207 | --md-sys-typescale-headline-small-line-height: 32px; 208 | --md-sys-typescale-headline-small-letter-spacing: 0px; 209 | /* body - large */ 210 | --md-sys-typescale-body-large-font-family-name: Outfit; 211 | --md-sys-typescale-body-large-font-family-style: Regular; 212 | --md-sys-typescale-body-large-font-weight: 400; 213 | --md-sys-typescale-body-large-font-size: 16px; 214 | --md-sys-typescale-body-large-line-height: 24px; 215 | --md-sys-typescale-body-large-letter-spacing: 0.50px; 216 | /* body - medium */ 217 | --md-sys-typescale-body-medium-font-family-name: Outfit; 218 | --md-sys-typescale-body-medium-font-family-style: Regular; 219 | --md-sys-typescale-body-medium-font-weight: 400; 220 | --md-sys-typescale-body-medium-font-size: 14px; 221 | --md-sys-typescale-body-medium-line-height: 20px; 222 | --md-sys-typescale-body-medium-letter-spacing: 0.25px; 223 | /* body - small */ 224 | 
--md-sys-typescale-body-small-font-family-name: Outfit; 225 | --md-sys-typescale-body-small-font-family-style: Regular; 226 | --md-sys-typescale-body-small-font-weight: 400; 227 | --md-sys-typescale-body-small-font-size: 12px; 228 | --md-sys-typescale-body-small-line-height: 16px; 229 | --md-sys-typescale-body-small-letter-spacing: 0.40px; 230 | /* label - large */ 231 | --md-sys-typescale-label-large-font-family-name: Outfit; 232 | --md-sys-typescale-label-large-font-family-style: Medium; 233 | --md-sys-typescale-label-large-font-weight: 500; 234 | --md-sys-typescale-label-large-font-size: 14px; 235 | --md-sys-typescale-label-large-line-height: 20px; 236 | --md-sys-typescale-label-large-letter-spacing: 0.10px; 237 | /* label - medium */ 238 | --md-sys-typescale-label-medium-font-family-name: Outfit; 239 | --md-sys-typescale-label-medium-font-family-style: Medium; 240 | --md-sys-typescale-label-medium-font-weight: 500; 241 | --md-sys-typescale-label-medium-font-size: 12px; 242 | --md-sys-typescale-label-medium-line-height: 16px; 243 | --md-sys-typescale-label-medium-letter-spacing: 0.50px; 244 | /* label - small */ 245 | --md-sys-typescale-label-small-font-family-name: Outfit; 246 | --md-sys-typescale-label-small-font-family-style: Medium; 247 | --md-sys-typescale-label-small-font-weight: 500; 248 | --md-sys-typescale-label-small-font-size: 11px; 249 | --md-sys-typescale-label-small-line-height: 16px; 250 | --md-sys-typescale-label-small-letter-spacing: 0.50px; 251 | /* title - large */ 252 | --md-sys-typescale-title-large-font-family-name: Outfit; 253 | --md-sys-typescale-title-large-font-family-style: Regular; 254 | --md-sys-typescale-title-large-font-weight: 400; 255 | --md-sys-typescale-title-large-font-size: 22px; 256 | --md-sys-typescale-title-large-line-height: 28px; 257 | --md-sys-typescale-title-large-letter-spacing: 0px; 258 | /* title - medium */ 259 | --md-sys-typescale-title-medium-font-family-name: Outfit; 260 | 
--md-sys-typescale-title-medium-font-family-style: Medium; 261 | --md-sys-typescale-title-medium-font-weight: 500; 262 | --md-sys-typescale-title-medium-font-size: 16px; 263 | --md-sys-typescale-title-medium-line-height: 24px; 264 | --md-sys-typescale-title-medium-letter-spacing: 0.15px; 265 | /* title - small */ 266 | --md-sys-typescale-title-small-font-family-name: Outfit; 267 | --md-sys-typescale-title-small-font-family-style: Medium; 268 | --md-sys-typescale-title-small-font-weight: 500; 269 | --md-sys-typescale-title-small-font-size: 14px; 270 | --md-sys-typescale-title-small-line-height: 20px; 271 | --md-sys-typescale-title-small-letter-spacing: 0.10px; 272 | } 273 | -------------------------------------------------------------------------------- /css/typography.module.css: -------------------------------------------------------------------------------- 1 | .display-large{ 2 | font-family: var(--md-sys-typescale-display-large-font-family-name); 3 | font-style: var(--md-sys-typescale-display-large-font-family-style); 4 | font-weight: var(--md-sys-typescale-display-large-font-weight); 5 | font-size: var(--md-sys-typescale-display-large-font-size); 6 | letter-spacing: var(--md-sys-typescale-display-large-letter-spacing); 7 | line-height: var(--md-sys-typescale-display-large-line-height); 8 | text-transform: var(--md-sys-typescale-display-large-text-transform); 9 | text-decoration: var(--md-sys-typescale-display-large-text-decoration); 10 | } 11 | .display-medium{ 12 | font-family: var(--md-sys-typescale-display-medium-font-family-name); 13 | font-style: var(--md-sys-typescale-display-medium-font-family-style); 14 | font-weight: var(--md-sys-typescale-display-medium-font-weight); 15 | font-size: var(--md-sys-typescale-display-medium-font-size); 16 | letter-spacing: var(--md-sys-typescale-display-medium-letter-spacing); 17 | line-height: var(--md-sys-typescale-display-medium-line-height); 18 | text-transform: var(--md-sys-typescale-display-medium-text-transform); 19 | 
text-decoration: var(--md-sys-typescale-display-medium-text-decoration); 20 | } 21 | .display-small{ 22 | font-family: var(--md-sys-typescale-display-small-font-family-name); 23 | font-style: var(--md-sys-typescale-display-small-font-family-style); 24 | font-weight: var(--md-sys-typescale-display-small-font-weight); 25 | font-size: var(--md-sys-typescale-display-small-font-size); 26 | letter-spacing: var(--md-sys-typescale-display-small-letter-spacing); 27 | line-height: var(--md-sys-typescale-display-small-line-height); 28 | text-transform: var(--md-sys-typescale-display-small-text-transform); 29 | text-decoration: var(--md-sys-typescale-display-small-text-decoration); 30 | } 31 | .headline-large{ 32 | font-family: var(--md-sys-typescale-headline-large-font-family-name); 33 | font-style: var(--md-sys-typescale-headline-large-font-family-style); 34 | font-weight: var(--md-sys-typescale-headline-large-font-weight); 35 | font-size: var(--md-sys-typescale-headline-large-font-size); 36 | letter-spacing: var(--md-sys-typescale-headline-large-letter-spacing); 37 | line-height: var(--md-sys-typescale-headline-large-line-height); 38 | text-transform: var(--md-sys-typescale-headline-large-text-transform); 39 | text-decoration: var(--md-sys-typescale-headline-large-text-decoration); 40 | } 41 | .headline-medium{ 42 | font-family: var(--md-sys-typescale-headline-medium-font-family-name); 43 | font-style: var(--md-sys-typescale-headline-medium-font-family-style); 44 | font-weight: var(--md-sys-typescale-headline-medium-font-weight); 45 | font-size: var(--md-sys-typescale-headline-medium-font-size); 46 | letter-spacing: var(--md-sys-typescale-headline-medium-letter-spacing); 47 | line-height: var(--md-sys-typescale-headline-medium-line-height); 48 | text-transform: var(--md-sys-typescale-headline-medium-text-transform); 49 | text-decoration: var(--md-sys-typescale-headline-medium-text-decoration); 50 | } 51 | .headline-small{ 52 | font-family: var(--md-sys-typescale-headline-small-font-family-name); 53 | 
font-style: var(--md-sys-typescale-headline-small-font-family-style); 54 | font-weight: var(--md-sys-typescale-headline-small-font-weight); 55 | font-size: var(--md-sys-typescale-headline-small-font-size); 56 | letter-spacing: var(--md-sys-typescale-headline-small-letter-spacing); 57 | line-height: var(--md-sys-typescale-headline-small-line-height); 58 | text-transform: var(--md-sys-typescale-headline-small-text-transform); 59 | text-decoration: var(--md-sys-typescale-headline-small-text-decoration); 60 | } 61 | .body-large{ 62 | font-family: var(--md-sys-typescale-body-large-font-family-name); 63 | font-style: var(--md-sys-typescale-body-large-font-family-style); 64 | font-weight: var(--md-sys-typescale-body-large-font-weight); 65 | font-size: var(--md-sys-typescale-body-large-font-size); 66 | letter-spacing: var(--md-sys-typescale-body-large-letter-spacing); 67 | line-height: var(--md-sys-typescale-body-large-line-height); 68 | text-transform: var(--md-sys-typescale-body-large-text-transform); 69 | text-decoration: var(--md-sys-typescale-body-large-text-decoration); 70 | } 71 | .body-medium{ 72 | font-family: var(--md-sys-typescale-body-medium-font-family-name); 73 | font-style: var(--md-sys-typescale-body-medium-font-family-style); 74 | font-weight: var(--md-sys-typescale-body-medium-font-weight); 75 | font-size: var(--md-sys-typescale-body-medium-font-size); 76 | letter-spacing: var(--md-sys-typescale-body-medium-letter-spacing); 77 | line-height: var(--md-sys-typescale-body-medium-line-height); 78 | text-transform: var(--md-sys-typescale-body-medium-text-transform); 79 | text-decoration: var(--md-sys-typescale-body-medium-text-decoration); 80 | } 81 | .body-small{ 82 | font-family: var(--md-sys-typescale-body-small-font-family-name); 83 | font-style: var(--md-sys-typescale-body-small-font-family-style); 84 | font-weight: var(--md-sys-typescale-body-small-font-weight); 85 | font-size: var(--md-sys-typescale-body-small-font-size); 86 | letter-spacing: 
var(--md-sys-typescale-body-small-letter-spacing); 87 | line-height: var(--md-sys-typescale-body-small-line-height); 88 | text-transform: var(--md-sys-typescale-body-small-text-transform); 89 | text-decoration: var(--md-sys-typescale-body-small-text-decoration); 90 | } 91 | .label-large{ 92 | font-family: var(--md-sys-typescale-label-large-font-family-name); 93 | font-style: var(--md-sys-typescale-label-large-font-family-style); 94 | font-weight: var(--md-sys-typescale-label-large-font-weight); 95 | font-size: var(--md-sys-typescale-label-large-font-size); 96 | letter-spacing: var(--md-sys-typescale-label-large-letter-spacing); 97 | line-height: var(--md-sys-typescale-label-large-line-height); 98 | text-transform: var(--md-sys-typescale-label-large-text-transform); 99 | text-decoration: var(--md-sys-typescale-label-large-text-decoration); 100 | } 101 | .label-medium{ 102 | font-family: var(--md-sys-typescale-label-medium-font-family-name); 103 | font-style: var(--md-sys-typescale-label-medium-font-family-style); 104 | font-weight: var(--md-sys-typescale-label-medium-font-weight); 105 | font-size: var(--md-sys-typescale-label-medium-font-size); 106 | letter-spacing: var(--md-sys-typescale-label-medium-letter-spacing); 107 | line-height: var(--md-sys-typescale-label-medium-line-height); 108 | text-transform: var(--md-sys-typescale-label-medium-text-transform); 109 | text-decoration: var(--md-sys-typescale-label-medium-text-decoration); 110 | } 111 | .label-small{ 112 | font-family: var(--md-sys-typescale-label-small-font-family-name); 113 | font-style: var(--md-sys-typescale-label-small-font-family-style); 114 | font-weight: var(--md-sys-typescale-label-small-font-weight); 115 | font-size: var(--md-sys-typescale-label-small-font-size); 116 | letter-spacing: var(--md-sys-typescale-label-small-letter-spacing); 117 | line-height: var(--md-sys-typescale-label-small-line-height); 118 | text-transform: var(--md-sys-typescale-label-small-text-transform); 119 | text-decoration: 
var(--md-sys-typescale-label-small-text-decoration); 120 | } 121 | .title-large{ 122 | font-family: var(--md-sys-typescale-title-large-font-family-name); 123 | font-style: var(--md-sys-typescale-title-large-font-family-style); 124 | font-weight: var(--md-sys-typescale-title-large-font-weight); 125 | font-size: var(--md-sys-typescale-title-large-font-size); 126 | letter-spacing: var(--md-sys-typescale-title-large-letter-spacing); 127 | line-height: var(--md-sys-typescale-title-large-line-height); 128 | text-transform: var(--md-sys-typescale-title-large-text-transform); 129 | text-decoration: var(--md-sys-typescale-title-large-text-decoration); 130 | } 131 | .title-medium{ 132 | font-family: var(--md-sys-typescale-title-medium-font-family-name); 133 | font-style: var(--md-sys-typescale-title-medium-font-family-style); 134 | font-weight: var(--md-sys-typescale-title-medium-font-weight); 135 | font-size: var(--md-sys-typescale-title-medium-font-size); 136 | letter-spacing: var(--md-sys-typescale-title-medium-letter-spacing); 137 | line-height: var(--md-sys-typescale-title-medium-line-height); 138 | text-transform: var(--md-sys-typescale-title-medium-text-transform); 139 | text-decoration: var(--md-sys-typescale-title-medium-text-decoration); 140 | } 141 | .title-small{ 142 | font-family: var(--md-sys-typescale-title-small-font-family-name); 143 | font-style: var(--md-sys-typescale-title-small-font-family-style); 144 | font-weight: var(--md-sys-typescale-title-small-font-weight); 145 | font-size: var(--md-sys-typescale-title-small-font-size); 146 | letter-spacing: var(--md-sys-typescale-title-small-letter-spacing); 147 | line-height: var(--md-sys-typescale-title-small-line-height); 148 | text-transform: var(--md-sys-typescale-title-small-text-transform); 149 | text-decoration: var(--md-sys-typescale-title-small-text-decoration); 150 | } 151 | -------------------------------------------------------------------------------- /googled873bf132668a2f1.html: 
-------------------------------------------------------------------------------- 1 | google-site-verification: googled873bf132668a2f1.html -------------------------------------------------------------------------------- /sitemap.xml: -------------------------------------------------------------------------------- 1 | <?xml version="1.0" encoding="UTF-8"?> 2 | <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> 3 | <url> 4 | <loc>https://guang000.github.io/Awesome-Dataset-Distillation/</loc> 5 | <lastmod>2024-03-19T12:19:46+00:00</lastmod> 6 | </url> 7 | </urlset> --------------------------------------------------------------------------------