Knowledge distillation can be used for incremental (also called continual, continuous, or lifelong) learning to mitigate catastrophic forgetting. A minimal sketch of the underlying distillation loss follows the paper list.

## Papers

### 2022
- Focal and Global Knowledge Distillation for Detectors (CVPR 2022) [[paper]](https://arxiv.org/abs/2111.11837v1) [[code]](https://github.com/yzd-v/FGD)
- Localization Distillation for Dense Object Detection (CVPR 2022) [[paper]](https://arxiv.org/abs/2102.12252) [[code]](https://github.com/HikariTJU/LD)
- Decoupled Knowledge Distillation (CVPR 2022) [[paper]](https://arxiv.org/abs/2203.08679) [[code]](https://github.com/megvii-research/mdistiller)
- Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation (AAAI 2022) [[paper]](https://arxiv.org/abs/2112.04840)
- Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation (CVPR 2022) [[paper]](https://arxiv.org/abs/2204.02136) [[code]](https://github.com/Hi-FT/ERD)
- Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (CVPR 2022) [[paper]](https://arxiv.org/abs/2204.00895) [[code]](https://github.com/kminsoo/AFC)

### 2021
- G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (ICCV 2021) [[paper]](https://arxiv.org/abs/2108.07482)
- Incremental Object Detection via Meta-Learning (TPAMI 2021) [[paper]](https://arxiv.org/abs/2003.08798) [[code]](https://github.com/JosephKJ/iOD)
- Bridging Non Co-occurrence with Unlabeled In-the-wild Data for Incremental Object Detection (NeurIPS 2021) [[paper]](https://papers.nips.cc/paper/2021/file/ffc58105bf6f8a91aba0fa2d99e6f106-Paper.pdf)
- Wanderlust: Online Continual Object Detection in the Real World (ICCV 2021) [[paper]](https://openaccess.thecvf.com/content/ICCV2021/papers/Wang_Wanderlust_Online_Continual_Object_Detection_in_the_Real_World_ICCV_2021_paper.pdf)
- Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning (ICCV 2021) [[paper]](https://arxiv.org/abs/2106.09701) [[code]](https://github.com/GT-RIPL/AlwaysBeDreaming-DFCIL)
- Towards Open World Object Detection (CVPR 2021) [[paper]](https://openaccess.thecvf.com/content/CVPR2021/papers/Joseph_Towards_Open_World_Object_Detection_CVPR_2021_paper.pdf) [[code]](https://github.com/JosephKJ/OWOD) [[video]](https://www.youtube.com/watch?v=aB2ZFAR-OZg)
- End-to-End Semi-Supervised Object Detection with Soft Teacher (ICCV 2021) [[paper]](https://arxiv.org/abs/2106.09018) [[code]](https://github.com/microsoft/SoftTeacher)
- Distilling Knowledge via Knowledge Review (CVPR 2021) [[paper]](https://arxiv.org/abs/2104.09044) [[code]](https://github.com/dvlab-research/ReviewKD)
- General Instance Distillation for Object Detection (CVPR 2021) [[paper]](https://arxiv.org/abs/2103.02340)
- Distilling Object Detectors via Decoupled Features (CVPR 2021) [[paper]](https://arxiv.org/abs/2103.14475) [[code]](https://github.com/ggjy/DeFeat.pytorch)
- Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (ICLR 2021) [[paper]](https://openreview.net/forum?id=uKhGRvM8QNH) [[code]](https://github.com/ArchipLab-LinfengZhang/Object-Detection-Knowledge-Distillation-ICLR2021)

### 2020
- Incremental Few-Shot Object Detection (CVPR 2020) [[paper]](https://openaccess.thecvf.com/content_CVPR_2020/papers/Perez-Rua_Incremental_Few-Shot_Object_Detection_CVPR_2020_paper.pdf)
- RODEO: Replay for Online Object Detection (BMVC 2020) [[paper]](https://arxiv.org/abs/2008.06439) [[code]](https://github.com/manoja328/rodeo)

### 2019
- Distilling Object Detectors with Fine-grained Feature Imitation (CVPR 2019) [[paper]](https://arxiv.org/abs/1906.03609) [[code]](https://github.com/twangnh/Distilling-Object-Detectors)
- An End-to-End Architecture for Class-Incremental Object Detection with Knowledge Distillation (IEEE ICME 2019) [[paper]](https://ieeexplore.ieee.org/document/8784755)

### 2017
- Learning Efficient Object Detection Models with Knowledge Distillation (NIPS 2017) [[paper]](https://proceedings.neurips.cc/paper/2017/hash/e1e32e235eee1f970470a3a6658dfdd5-Abstract.html)
- Incremental Learning of Object Detectors Without Catastrophic Forgetting (ICCV 2017) [[paper]](https://arxiv.org/abs/1708.06977v1)
- iCaRL: Incremental Classifier and Representation Learning (CVPR 2017) [[paper]](https://arxiv.org/abs/1611.07725) [[code]](https://github.com/srebuffi/iCaRL)

### 2016
- Learning without Forgetting (ECCV 2016) [[paper]](https://arxiv.org/abs/1606.09282) [[code]](https://github.com/lizhitwo/LearningWithoutForgetting)

### 2014
- Distilling the Knowledge in a Neural Network (NIPS 2014 Workshop) [[paper]](https://arxiv.org/abs/1503.02531)
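## Minimal distillation loss (sketch)

None of the repositories above are reproduced here; this is only a minimal PyTorch sketch of the temperature-scaled soft-target loss from "Distilling the Knowledge in a Neural Network", combined with a Learning-without-Forgetting-style training step for class-incremental learning. The names `kd_loss`, `train_step`, `num_old_classes`, `alpha`, and `T` are illustrative, and the assumption that the student's output head extends the teacher's old-class head is a simplification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss (Hinton et al., 2015).

    KL divergence between the teacher's and student's softened class
    distributions; the T^2 factor keeps gradient magnitudes comparable
    across temperatures.
    """
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def train_step(student, teacher, x, y, optimizer,
               num_old_classes, alpha=0.5, T=2.0):
    """One LwF-style incremental step (illustrative, not from any one paper).

    The frozen teacher is a copy of the model from before new classes were
    added; distilling its outputs on the old classes penalizes drift
    (catastrophic forgetting) while cross-entropy fits the new labels.
    """
    student_logits = student(x)
    with torch.no_grad():
        teacher_logits = teacher(x)  # teacher only knows the old classes
    ce = F.cross_entropy(student_logits, y)
    kd = kd_loss(student_logits[:, :num_old_classes], teacher_logits, T)
    loss = (1 - alpha) * ce + alpha * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: 10 old classes, 5 new ones, linear models as stand-ins.
teacher = nn.Linear(64, 10).eval()
student = nn.Linear(64, 15)
opt = torch.optim.SGD(student.parameters(), lr=0.01)
x, y = torch.randn(8, 64), torch.randint(0, 15, (8,))
print(train_step(student, teacher, x, y, opt, num_old_classes=10))
```

Detector-specific methods in the list extend this idea beyond classification logits, e.g. to feature maps (FGD, DeFeat), localization outputs (LD), or elastic response selection (ERD).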