├── .gitignore
├── LICENSE
├── README.md
├── bert
│   ├── extract_features.py
│   ├── modeling.py
│   ├── optimization.py
│   ├── optimization_custom.py
│   └── tokenization.py
├── create_pretraining_data.py
├── create_teacher_output_data.py
├── requirements.txt
└── run_distill.py

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
.idea/
__pycache__/

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/LICENSE

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/README.md

--------------------------------------------------------------------------------
/bert/extract_features.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/bert/extract_features.py

--------------------------------------------------------------------------------
/bert/modeling.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/bert/modeling.py

--------------------------------------------------------------------------------
/bert/optimization.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/bert/optimization.py

--------------------------------------------------------------------------------
/bert/optimization_custom.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/bert/optimization_custom.py

--------------------------------------------------------------------------------
/bert/tokenization.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/bert/tokenization.py

--------------------------------------------------------------------------------
/create_pretraining_data.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/create_pretraining_data.py

--------------------------------------------------------------------------------
/create_teacher_output_data.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/create_teacher_output_data.py

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
tensorflow>=1.12.0
tqdm>=4.28.1
jieba>=0.3x

--------------------------------------------------------------------------------
/run_distill.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiongma/roberta-wwm-base-distill/HEAD/run_distill.py

--------------------------------------------------------------------------------
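
The file names suggest a teacher-student distillation pipeline: create_pretraining_data.py prepares whole-word-masked pretraining data, create_teacher_output_data.py records the RoBERTa-wwm-base teacher's outputs, and run_distill.py trains the student against them. For orientation only, the following is a minimal, generic sketch of a temperature-scaled soft-label distillation loss written against TensorFlow 1.x (the framework pinned in requirements.txt). It is not the code in run_distill.py; the function name and the teacher_logits, student_logits, and temperature parameters are illustrative assumptions.

import tensorflow as tf

def soft_label_distill_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between temperature-softened teacher and student
    distributions -- the standard soft-label term in knowledge distillation.
    (Generic sketch; not taken from run_distill.py.)"""
    # Soften both distributions with the same temperature.
    teacher_probs = tf.nn.softmax(teacher_logits / temperature, axis=-1)
    student_log_probs = tf.nn.log_softmax(student_logits / temperature, axis=-1)
    # Per-example cross-entropy against the teacher's soft targets.
    per_example = -tf.reduce_sum(teacher_probs * student_log_probs, axis=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return tf.reduce_mean(per_example) * (temperature ** 2)

The exact loss used here, how padded positions are masked out, and whether any hidden-state or feature-matching terms are added would need to be checked against run_distill.py and create_teacher_output_data.py directly.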