└── README.md
/README.md:
--------------------------------------------------------------------------------
1 | # A collection of BERT-related resources.
2 |
3 | See also: [Jiakui/awesome-gcn](https://github.com/Jiakui/awesome-gcn), a companion repository collecting resources for graph convolutional networks.
4 |
5 |
6 | # Papers:
7 |
8 | 1. [arXiv:1810.04805](https://arxiv.org/abs/1810.04805), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Authors: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
10 |
11 |
12 |
14 |
15 | 2. [arXiv:1812.06705](https://arxiv.org/abs/1812.06705), Conditional BERT Contextual Augmentation, Authors: Xing Wu, Shangwen Lv, Liangjun Zang, Jizhong Han, Songlin Hu
16 |
17 | 3. [arXiv:1812.03593](https://arxiv.org/pdf/1812.03593), SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering, Authors: Chenguang Zhu, Michael Zeng, Xuedong Huang
18 |
19 | 4. [arXiv:1901.02860](https://arxiv.org/abs/1901.02860), Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context, Authors: Zihang Dai, Zhilin Yang, Yiming Yang, William W. Cohen, Jaime Carbonell, Quoc V. Le and Ruslan Salakhutdinov.
20 |
21 | 5. [arXiv:1901.04085](https://arxiv.org/pdf/1901.04085.pdf), Passage Re-ranking with BERT, Authors: Rodrigo Nogueira, Kyunghyun Cho
22 |
23 | 6. [arXiv:1902.02671](https://arxiv.org/pdf/1902.02671.pdf), BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning, Authors: Asa Cooper Stickland, Iain Murray
24 |
25 | 7. [arXiv:1904.02232](https://arxiv.org/abs/1904.02232), BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis, Authors: Hu Xu, Bing Liu, Lei Shu, Philip S. Yu, [[code](https://github.com/howardhsu/BERT-for-RRC-ABSA)]
26 |
27 |
28 |
29 |
30 |
31 |
32 | # GitHub Repositories:
33 |
34 | ## Official implementation:
35 |
36 | 1. [google-research/bert](https://github.com/google-research/bert), **official** TensorFlow code and pre-trained models for BERT,
37 | 
38 |
39 |
40 | ## Implementations of BERT besides TensorFlow:
41 |
42 |
43 |
44 |
45 | 1. [codertimo/BERT-pytorch](https://github.com/codertimo/BERT-pytorch), PyTorch implementation of Google AI's 2018 BERT,
46 | 
47 |
48 | 2. [huggingface/pytorch-pretrained-BERT](https://github.com/huggingface/pytorch-pretrained-BERT), A PyTorch implementation of Google AI's BERT model with a script to load Google's pre-trained models (see the usage sketch after this list),
49 | 
50 |
51 |
52 |
53 | 3. [dmlc/gluon-nlp](https://github.com/dmlc/gluon-nlp), Gluon + MXNet implementation that reproduces BERT pretraining and finetuning on the GLUE benchmark, SQuAD, etc.,
54 | 
55 |
56 | 4. [dbiir/UER-py](https://github.com/dbiir/UER-py), UER-py is a toolkit for pre-training on general-domain corpora and fine-tuning on downstream tasks. UER-py maintains model modularity and supports research extensibility. It facilitates the use of different pre-training models (e.g. BERT), and provides interfaces for users to extend it further.
57 | 
58 |
59 |
60 | 5. [BrikerMan/Kashgari](https://github.com/BrikerMan/Kashgari), Simple, Keras-powered multilingual NLP framework that lets you build models in 5 minutes for named entity recognition (NER), part-of-speech tagging (PoS) and text classification tasks. Includes BERT, GPT-2 and word2vec embeddings.
61 | 
62 |
63 | 6. [kaushaltrivedi/fast-bert](https://github.com/kaushaltrivedi/fast-bert), Super easy library for BERT-based NLP models,
64 | 
65 |
66 |
67 |
68 |
69 |
70 |
72 |
73 |
74 | 7. [Separius/BERT-keras](https://github.com/Separius/BERT-keras), Keras implementation of BERT with pre-trained weights,
75 | 
76 |
77 | 8. [soskek/bert-chainer](https://github.com/soskek/bert-chainer), Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
78 | 
79 |
80 | 9. [innodatalabs/tbert](https://github.com/innodatalabs/tbert), PyTorch port of BERT ML model
81 | 
82 |
83 | 10. [guotong1988/BERT-tensorflow](https://github.com/guotong1988/BERT-tensorflow), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
84 | 
85 |
86 | 11. [dreamgonfly/BERT-pytorch](https://github.com/dreamgonfly/BERT-pytorch), PyTorch implementation of BERT in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
88 | 
89 |
90 | 12. [CyberZHG/keras-bert](https://github.com/CyberZHG/keras-bert), Implementation of BERT that could load official pre-trained models for feature extraction and prediction
91 | 
92 |
95 |
96 | 13. [MaZhiyuanBUAA/bert-tf1.4.0](https://github.com/MaZhiyuanBUAA/bert-tf1.4.0), BERT ported to TensorFlow 1.4.0
97 | 
98 |
99 | 14. [dhlee347/pytorchic-bert](https://github.com/dhlee347/pytorchic-bert), PyTorch implementation of Google BERT,
100 | 
101 |
102 | 15. [kpot/keras-transformer](https://github.com/kpot/keras-transformer), Keras library for building (Universal) Transformers, facilitating BERT and GPT models,
103 | 
104 |
105 | 16. [miroozyx/BERT_with_keras](https://github.com/miroozyx/BERT_with_keras), A Keras version of Google's BERT model,
106 | 
107 |
108 | 17. [conda-forge/pytorch-pretrained-bert-feedstock](https://github.com/conda-forge/pytorch-pretrained-bert-feedstock), A conda-smithy repository for pytorch-pretrained-bert,
109 | 
110 |
111 |
112 | 18. [Rshcaroline/BERT_Pytorch_fastNLP](https://github.com/Rshcaroline/BERT_Pytorch_fastNLP), A PyTorch & fastNLP implementation of Google AI's BERT model.
113 | 
114 |
115 | 19. [nghuyong/ERNIE-Pytorch](https://github.com/nghuyong/ERNIE-Pytorch), ERNIE PyTorch version,
116 | 
117 |
118 |
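A minimal usage sketch for entry 2 above (huggingface/pytorch-pretrained-BERT): extracting contextual features for a sentence with the pre-trained `bert-base-uncased` weights. This follows the package's documented API at the time of writing, so treat it as a sketch rather than a pinned recipe (the project has since been renamed).

```python
# pip install pytorch-pretrained-bert
import torch
from pytorch_pretrained_bert import BertTokenizer, BertModel

# Load the pre-trained tokenizer and model (weights download on first use).
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

# BERT expects [CLS] ... [SEP] framing and integer token ids.
tokens = tokenizer.tokenize("[CLS] bert learns contextual representations [SEP]")
tokens_tensor = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

# encoded_layers holds one hidden-state tensor per Transformer layer
# (12 for BERT-base); pooled_output is the [CLS]-based sentence vector.
with torch.no_grad():
    encoded_layers, pooled_output = model(tokens_tensor)
print(len(encoded_layers), encoded_layers[-1].shape)  # 12, (1, seq_len, 768)
```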
119 |
120 |
121 |
122 | ## Pretrained BERT weights:
123 | 1. [brightmart/roberta_zh](https://github.com/brightmart/roberta_zh), RoBERTa for Chinese: pre-trained Chinese RoBERTa models,
124 | 
125 |
126 | 2. [ymcui/Chinese-BERT-wwm](https://github.com/ymcui/Chinese-BERT-wwm), Pre-Training with Whole Word Masking for Chinese BERT (pre-trained Chinese BERT-wwm models), https://arxiv.org/abs/1906.08101,
127 | 
128 |
129 | 3. [thunlp/OpenCLaP](https://github.com/thunlp/OpenCLaP), OpenCLaP: Open Chinese Language Pre-trained Model Zoo, a multi-domain collection of open-source Chinese pre-trained language models,
130 | 
131 |
132 | 4. [ymcui/Chinese-PreTrained-XLNet](https://github.com/ymcui/Chinese-PreTrained-XLNet), Pre-trained Chinese XLNet models,
133 | 
134 |
135 | 5. [brightmart/xlnet_zh](https://github.com/brightmart/xlnet_zh), Pre-trained Chinese XLNet_Large model,
136 | 
137 |
138 |
139 | ## Improvements over BERT:
140 | 1. [thunlp/ERNIE](https://github.com/thunlp/ERNIE), Source code and dataset for the ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities"; improves BERT with heterogeneous information fusion.
141 | 
142 |
143 | 2. [PaddlePaddle/LARK](https://github.com/PaddlePaddle/LARK), LAnguage Representations Kit, a PaddlePaddle implementation of BERT. It also contains ERNIE, an improved version of BERT for Chinese NLP tasks,
144 | 
145 |
146 | 3. [zihangdai/xlnet](https://github.com/zihangdai/xlnet), XLNet: Generalized Autoregressive Pretraining for Language Understanding,
147 | 
148 |
149 | 4. [kimiyoung/transformer-xl](https://github.com/kimiyoung/transformer-xl), Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context; contains the code for the paper in both PyTorch and TensorFlow.
150 | 
151 |
152 | 5. [GaoPeng97/transformer-xl-chinese](https://github.com/GaoPeng97/transformer-xl-chinese), Transformer-XL applied to Chinese text generation,
153 | 
154 |
155 | 6. [PaddlePaddle/ERNIE](https://github.com/PaddlePaddle/ERNIE), An implementation of ERNIE for language understanding (including pre-training models and fine-tuning tools); Baidu's Chinese improvement over BERT,
156 | 
157 |
158 | 7. [pytorch/fairseq](https://github.com/pytorch/fairseq), Facebook AI Research Sequence-to-Sequence Toolkit written in Python; includes RoBERTa: A Robustly Optimized BERT Pretraining Approach,
159 | 
160 |
161 |
162 | 8. [facebookresearch/SpanBERT](https://github.com/facebookresearch/SpanBERT), Code and models for using and evaluating SpanBERT, from the paper "SpanBERT: Improving Pre-training by Representing and Predicting Spans",
164 | 
165 |
166 | 9. [brightmart/albert_zh](https://github.com/brightmart/albert_zh), Large-scale pre-trained Chinese ALBERT models; ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, https://arxiv.org/pdf/1909.11942.pdf,
167 | 
168 |
169 | 10. [lonePatient/albert_pytorch](https://github.com/lonePatient/albert_pytorch), A Lite BERT for Self-supervised Learning of Language Representations,
170 | 
171 |
172 |
173 | 11. [kpe/bert-for-tf2](https://github.com/kpe/bert-for-tf2), A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT,
174 | 
175 |
176 |
177 |
178 | ## Other resources for BERT:
179 |
180 | 1. [brightmart/bert_language_understanding](https://github.com/brightmart/bert_language_understanding), Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN,
181 | 
182 |
183 | 2. [Y1ran/NLP-BERT--ChineseVersion](https://github.com/Y1ran/NLP-BERT--ChineseVersion), Google's BERT NLP model: paper analysis and Python code,
184 | 
185 |
186 |
187 |
189 |
190 |
191 | 3. [yangbisheng2009/cn-bert](https://github.com/yangbisheng2009/cn-bert), BERT applied to Chinese NLP: grammar checking
192 | 
193 |
194 | 4. [JayYip/bert-multiple-gpu](https://github.com/JayYip/bert-multiple-gpu), A version of BERT with multi-GPU support,
195 | 
196 |
197 | 5. [HighCWu/keras-bert-tpu](https://github.com/HighCWu/keras-bert-tpu), Implementation of BERT that could load official pre-trained models for feature extraction and prediction on TPU,
198 | 
199 |
200 | 6. [Willyoung2017/Bert_Attempt](https://github.com/Willyoung2017/Bert_Attempt), PyTorch Pretrained Bert,
201 | 
202 |
203 | 7. [Pydataman/bert_examples](https://github.com/Pydataman/bert_examples), some examples of BERT: run_classifier.py implements binary classification for the Quora Insincere Questions Classification Kaggle competition on top of Google's BERT; run_ner.py implements named entity recognition on data from season one of the Ruijin Hospital AI contest.
204 | 
205 |
206 | 8. [guotong1988/BERT-chinese](https://github.com/guotong1988/BERT-chinese), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, for Chinese
207 | 
208 |
209 | 9. [zhongyunuestc/bert_multitask](https://github.com/zhongyunuestc/bert_multitask), multi-task learning with BERT
210 | 
211 |
212 | 10. [Microsoft/AzureML-BERT](https://github.com/Microsoft/AzureML-BERT), End-to-end walkthrough for fine-tuning BERT using Azure Machine Learning,
213 | 
214 |
215 | 11. [bigboNed3/bert_serving](https://github.com/bigboNed3/bert_serving), export bert model for serving,
216 | 
217 |
218 | 12. [yoheikikuta/bert-japanese](https://github.com/yoheikikuta/bert-japanese), BERT with SentencePiece for Japanese text.
219 | 
220 |
221 | 13. [whqwill/seq2seq-keyphrase-bert](https://github.com/whqwill/seq2seq-keyphrase-bert), add BERT to encoder part for https://github.com/memray/seq2seq-keyphrase-pytorch,
222 | 
223 |
224 | 14. [algteam/bert-examples](https://github.com/algteam/bert-examples), bert-demo,
225 | 
226 |
227 | 15. [cedrickchee/awesome-bert-nlp](https://github.com/cedrickchee/awesome-bert-nlp), A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
228 | 
229 |
230 | 16. [cnfive/cnbert](https://github.com/cnfive/cnbert), BERT code annotated with Chinese comments,
231 | 
232 |
233 | 17. [brightmart/bert_customized](https://github.com/brightmart/bert_customized), bert with customized features,
234 | 
235 |
236 |
237 | 18. [JayYip/bert-multitask-learning](https://github.com/JayYip/bert-multitask-learning), BERT for Multitask Learning,
238 | 
239 |
240 | 19. [yuanxiaosc/BERT_Paper_Chinese_Translation](https://github.com/yuanxiaosc/BERT_Paper_Chinese_Translation), Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", https://yuanxiaosc.github.io/2018/12/…,
241 | 
242 |
243 | 20. [yaserkl/BERTvsULMFIT](https://github.com/yaserkl/BERTvsULMFIT), Comparing text classification results using BERT embeddings and ULMFiT embeddings,
244 | 
245 |
248 |
249 | 21. [1234560o/Bert-model-code-interpretation](https://github.com/1234560o/Bert-model-code-interpretation), A walkthrough of the data flow in modeling.py of the TensorFlow version of BERT
250 | 
251 |
252 | 22. [cdathuraliya/bert-inference](https://github.com/cdathuraliya/bert-inference), A helper class for Google BERT (Devlin et al., 2018) to support online prediction and model pipelining.
253 | 
254 |
255 |
256 | 23. [gameofdimension/java-bert-predict](https://github.com/gameofdimension/java-bert-predict), turn a BERT pretrain checkpoint into a saved model for a feature-extracting demo in Java
257 | 
258 |
261 |
262 |
263 |
264 |
265 | ## Domain-specific BERT:
266 |
267 | 1. [allenai/scibert](https://github.com/allenai/scibert), A BERT model for scientific text. https://arxiv.org/abs/1903.10676,
268 | 
269 |
270 | 2. [MeRajat/SolvingAlmostAnythingWithBert](https://github.com/MeRajat/SolvingAlmostAnythingWithBert), BioBERT in PyTorch
271 | 
272 |
273 | 3. [kexinhuang12345/clinicalBERT](https://github.com/kexinhuang12345/clinicalBERT), ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission https://arxiv.org/abs/1904.05342
274 | 
275 |
276 | 4. [EmilyAlsentzer/clinicalBERT](https://github.com/EmilyAlsentzer/clinicalBERT), repository for Publicly Available Clinical BERT Embeddings
277 | 
278 |
279 |
280 | ## BERT Deployment Tricks:
281 |
282 | 1. [zhihu/cuBERT](https://github.com/zhihu/cuBERT), Fast implementation of BERT inference directly on NVIDIA (CUDA, CUBLAS) and Intel MKL
283 | 
284 |
285 | 2. [xmxoxo/BERT-train2deploy](https://github.com/xmxoxo/BERT-train2deploy), BERT model from training to deployment,
286 | 
287 |
288 |
289 | 3. [NVIDIA/DeepLearningExamples](https://github.com/NVIDIA/DeepLearningExamples/tree/master/TensorFlow/LanguageModeling/BERT), BERT for TensorFlow: a script and recipe to train BERT to state-of-the-art accuracy, tested and maintained by NVIDIA.
290 | 
291 |
292 | 4. [qiangsiwei/bert_distill](https://github.com/qiangsiwei/bert_distill), BERT distillation experiments,
293 | 
294 |
295 | 5. [kevinmtian/distill-bert](https://github.com/kevinmtian/distill-bert), Knowledge Distillation from BERT,
296 | 
297 |
298 |
299 |
300 |
301 |
302 | ## BERT QA & RC task:
303 |
304 | 1. [sogou/SMRCToolkit](https://github.com/sogou/SMRCToolkit), a toolkit designed for fast and efficient development of modern machine comprehension models, including both published models and original prototypes,
305 | 
306 |
307 |
308 | 2. [benywon/ChineseBert](https://github.com/benywon/ChineseBert), a Chinese BERT model specialized for question answering,
309 | 
310 |
311 | 3. [matthew-z/R-net](https://github.com/matthew-z/R-net), R-net in PyTorch, with BERT and ELMo,
312 | 
313 |
314 | 4. [nyu-dl/dl4marco-bert](https://github.com/nyu-dl/dl4marco-bert), Passage Re-ranking with BERT,
315 | 
316 |
317 | 5. [xzp27/BERT-for-Chinese-Question-Answering](https://github.com/xzp27/BERT-for-Chinese-Question-Answering),
318 | 
319 |
320 | 6. [chiayewken/bert-qa](https://github.com/chiayewken/bert-qa), BERT for question answering starting with HotpotQA,
321 | 
322 |
323 | 7. [ankit-ai/BertQA-Attention-on-Steroids](https://github.com/ankit-ai/BertQA-Attention-on-Steroids), BertQA - Attention on Steroids,
324 | 
325 |
326 | 8. [NoviScl/BERT-RACE](https://github.com/NoviScl/BERT-RACE), based on the PyTorch implementation of BERT (https://github.com/huggingface/pytorch-pretrained-BERT); adapts the original BERT model to multiple-choice machine comprehension.
327 | 
328 |
329 | 9. [eva-n27/BERT-for-Chinese-Question-Answering](https://github.com/eva-n27/BERT-for-Chinese-Question-Answering),
330 | 
331 |
332 | 10. [allenai/allennlp-bert-qa-wrapper](https://github.com/allenai/allennlp-bert-qa-wrapper), a simple wrapper on top of pretrained BERT-based QA models from pytorch-pretrained-bert to make AllenNLP model archives, so that you can serve demos from AllenNLP (see the sketch after this list).
333 | 
334 |
335 | 11. [edmondchensj/ChineseQA-with-BERT](https://github.com/edmondchensj/ChineseQA-with-BERT), EECS 496: Advanced Topics in Deep Learning Final Project: Chinese Question Answering with BERT (Baidu DuReader Dataset)
336 | 
337 |
338 | 12. [graykode/toeicbert](https://github.com/graykode/toeicbert), solving TOEIC (Test of English for International Communication) questions using the pytorch-pretrained-BERT model,
339 | 
340 |
341 | 13. [graykode/KorQuAD-beginner](https://github.com/graykode/KorQuAD-beginner), a beginner's starter for KorQuAD (the Korean question answering dataset)
342 | 
343 |
344 | 14. [krishna-sharma19/SBU-QA](https://github.com/krishna-sharma19/SBU-QA), uses pretrained BERT embeddings for transfer learning in the QA domain
345 | 
346 |
347 | 15. [basketballandlearn/Dureader-Bert](https://github.com/basketballandlearn/Dureader-Bert), single-model code for the 2019 DuReader multi-document machine reading comprehension competition, ranked 7th,
348 | 
349 |
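Several of the repositories above fine-tune a span-prediction head on top of BERT for SQuAD-style QA. Below is a minimal sketch of that pattern using the `BertForQuestionAnswering` head from pytorch-pretrained-BERT; the toy question/passage pair is invented, and a freshly loaded head is untrained, so fine-tuning on a QA dataset is needed before the predicted span means anything.

```python
import torch
from pytorch_pretrained_bert import BertTokenizer, BertForQuestionAnswering

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForQuestionAnswering.from_pretrained('bert-base-uncased')
model.eval()

# Question and passage are packed into one sequence; segment ids tell BERT
# which tokens belong to which part (0 = question, 1 = passage).
q_tokens = tokenizer.tokenize("[CLS] who wrote bert ? [SEP]")
p_tokens = tokenizer.tokenize("bert was written by researchers at google . [SEP]")
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(q_tokens + p_tokens)])
segment_ids = torch.tensor([[0] * len(q_tokens) + [1] * len(p_tokens)])

# The head emits one start logit and one end logit per token; the argmaxes
# delimit the predicted answer span inside the packed sequence.
with torch.no_grad():
    start_logits, end_logits = model(input_ids, segment_ids)
print(start_logits.argmax(dim=1).item(), end_logits.argmax(dim=1).item())
```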
350 |
351 |
352 | ## BERT classification task:
353 |
354 | 1. [zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification](https://github.com/zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification), baselines for the Kaggle competition: a BERT fine-tuning approach and a tensor2tensor-based Transformer encoder approach
355 | 
356 |
357 | 2. [maksna/bert-fine-tuning-for-chinese-multiclass-classification](https://github.com/maksna/bert-fine-tuning-for-chinese-multiclass-classification), fine-tuning Google's pre-trained BERT model for Chinese multiclass classification
358 | 
359 |
360 | 3. [NLPScott/bert-Chinese-classification-task](https://github.com/NLPScott/bert-Chinese-classification-task), BERT Chinese text classification in practice,
361 | 
362 |
363 | 4. [Socialbird-AILab/BERT-Classification-Tutorial](https://github.com/Socialbird-AILab/BERT-Classification-Tutorial),
364 | 
365 |
366 | 5. [fooSynaptic/BERT_classifer_trial](https://github.com/fooSynaptic/BERT_classifer_trial), BERT trials for Chinese corpus classification
367 | 
368 |
369 | 6. [xiaopingzhong/bert-finetune-for-classfier](https://github.com/xiaopingzhong/bert-finetune-for-classfier), fine-tuning BERT while building a custom dataset for classification
370 | 
371 |
372 | 7. [pengming617/bert_classification](https://github.com/pengming617/bert_classification), text classification with the pre-trained Chinese BERT model,
373 | 
374 |
375 | 8. [xieyufei1993/Bert-Pytorch-Chinese-TextClassification](https://github.com/xieyufei1993/Bert-Pytorch-Chinese-TextClassification), PyTorch BERT fine-tuning for Chinese text classification,
376 | 
377 |
378 | 9. [liyibo/text-classification-demos](https://github.com/liyibo/text-classification-demos), Neural models for text classification in TensorFlow, such as CNN, DPCNN, fastText, BERT, etc.,
379 | 
380 |
381 | 10. [circlePi/BERT_Chinese_Text_Class_By_pytorch](https://github.com/circlePi/BERT_Chinese_Text_Class_By_pytorch), A PyTorch implementation of Chinese text classification based on a pre-trained BERT model,
382 | 
383 |
384 | 11. [kaushaltrivedi/bert-toxic-comments-multilabel](https://github.com/kaushaltrivedi/bert-toxic-comments-multilabel), Multilabel classification for the Toxic Comments challenge using BERT,
385 | 
386 |
387 | 12. [lonePatient/BERT-chinese-text-classification-pytorch](https://github.com/lonePatient/BERT-chinese-text-classification-pytorch), a PyTorch implementation of a pretrained BERT model for text classification (see the sketch after this list),
388 | 
389 |
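Most of the repositories in this list share one recipe: attach a classification head to BERT's pooled [CLS] output and fine-tune the whole network with a small learning rate. A condensed sketch with `BertForSequenceClassification` and `BertAdam` from pytorch-pretrained-BERT; the two-sentence toy batch and its labels are invented for illustration.

```python
import torch
from pytorch_pretrained_bert import BertTokenizer, BertForSequenceClassification
from pytorch_pretrained_bert.optimization import BertAdam

tokenizer = BertTokenizer.from_pretrained('bert-base-chinese')
model = BertForSequenceClassification.from_pretrained('bert-base-chinese', num_labels=2)

# Toy batch: two sentences padded to equal length, one 0/1 label each.
texts = ["这部电影很好看", "这部电影很无聊"]
ids = [tokenizer.convert_tokens_to_ids(tokenizer.tokenize("[CLS] " + t + " [SEP]"))
       for t in texts]
max_len = max(len(x) for x in ids)
input_ids = torch.tensor([x + [0] * (max_len - len(x)) for x in ids])
attention_mask = (input_ids != 0).long()  # 0 is the [PAD] id
labels = torch.tensor([1, 0])

# Fine-tune end-to-end; 2e-5 is in the range the BERT paper recommends.
optimizer = BertAdam(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few steps on the toy batch
    loss = model(input_ids, attention_mask=attention_mask, labels=labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```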
390 |
391 |
392 | ## BERT Sentiment Analysis
393 |
394 | 1. [Chung-I/Douban-Sentiment-Analysis](https://github.com/Chung-I/Douban-Sentiment-Analysis), Sentiment Analysis on Douban Movie Short Comments Dataset using BERT.
395 | 
396 |
397 | 2. [lynnna-xu/bert_sa](https://github.com/lynnna-xu/bert_sa), BERT sentiment analysis, served with TensorFlow Serving and a RESTful API
398 | 
399 |
400 | 3. [HSLCY/ABSA-BERT-pair](https://github.com/HSLCY/ABSA-BERT-pair), Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence (NAACL 2019) https://arxiv.org/abs/1903.09588,
401 | 
402 |
403 | 4. [songyouwei/ABSA-PyTorch](https://github.com/songyouwei/ABSA-PyTorch), Aspect-Based Sentiment Analysis, PyTorch implementations,
404 | 
405 |
406 | 5. [howardhsu/BERT-for-RRC-ABSA](https://github.com/howardhsu/BERT-for-RRC-ABSA), code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis",
407 | 
408 |
409 | 6. [brightmart/sentiment_analysis_fine_grain](https://github.com/brightmart/sentiment_analysis_fine_grain), Multi-label Classification with BERT; Fine Grained Sentiment Analysis from AI challenger,
410 | 
411 |
412 |
413 | ## BERT NER task:
414 |
415 | 1. [zhpmatrix/bert-sequence-tagging](https://github.com/zhpmatrix/bert-sequence-tagging), Chinese sequence tagging based on BERT
416 | 
417 |
418 | 2. [kyzhouhzau/BERT-NER](https://github.com/kyzhouhzau/BERT-NER), Use Google BERT for CoNLL-2003 NER,
419 | 
420 |
421 | 3. [king-menin/ner-bert](https://github.com/king-menin/ner-bert), NER solution (BERT + Bi-LSTM + CRF) with Google BERT, https://github.com/google-research.
422 | 
423 |
424 | 4. [macanv/BERT-BiLSMT-CRF-NER](https://github.com/macanv/BERT-BiLSMT-CRF-NER), TensorFlow solution for NER using a BiLSTM-CRF model with Google BERT fine-tuning,
425 | 
426 |
427 | 5. [FuYanzhe2/Name-Entity-Recognition](https://github.com/FuYanzhe2/Name-Entity-Recognition), LSTM-CRF, Lattice-CRF and BERT-NER implementations, plus a follow-up of recent NER papers,
428 | 
429 |
430 | 6. [mhcao916/NER_Based_on_BERT](https://github.com/mhcao916/NER_Based_on_BERT), Chinese NER based on the Google BERT model
431 | 
432 |
433 | 7. [ProHiryu/bert-chinese-ner](https://github.com/ProHiryu/bert-chinese-ner), Chinese NER using the pre-trained BERT language model,
434 | 
435 |
436 | 8. [sberbank-ai/ner-bert](https://github.com/sberbank-ai/ner-bert), BERT-NER (nert-bert) with google bert,
437 | 
438 |
439 | 9. [kyzhouhzau/Bert-BiLSTM-CRF](https://github.com/kyzhouhzau/Bert-BiLSTM-CRF), based on bert-as-service; model structure: BERT embeddings + BiLSTM + CRF,
440 | 
441 |
442 | 10. [Hoiy/berserker](https://github.com/Hoiy/berserker), Berserker (BERt chineSE woRd toKenizER), a Chinese tokenizer built on top of Google's BERT model,
443 | 
444 |
445 | 11. [Kyubyong/bert_ner](https://github.com/Kyubyong/bert_ner), NER with BERT,
446 | 
447 |
448 | 12. [jiangpinglei/BERT_ChineseWordSegment](https://github.com/jiangpinglei/BERT_ChineseWordSegment), A Chinese word segmentation model based on BERT, F1 score 97%,
449 | 
450 |
451 | 13. [yanwii/ChineseNER](https://github.com/yanwii/ChineseNER), Chinese NER (organization and person names) based on Bi-GRU + CRF, with support for the Google BERT model
452 | 
453 |
454 | 14. [lemonhu/NER-BERT-pytorch](https://github.com/lemonhu/NER-BERT-pytorch), PyTorch solution for NER using Google AI's pre-trained BERT model (see the sketch after this list).
455 | 
456 |
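The BiLSTM-CRF variants above differ mainly in the decoder; the simplest setup drops the CRF and predicts one tag per wordpiece directly, which is what the `BertForTokenClassification` head from pytorch-pretrained-BERT provides. A sketch with an illustrative 5-tag BIO scheme (a real setup would fine-tune on labeled NER data first):

```python
import torch
from pytorch_pretrained_bert import BertTokenizer, BertForTokenClassification

# Illustrative BIO tag set for person/location entities.
tags = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]

tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = BertForTokenClassification.from_pretrained('bert-base-cased',
                                                   num_labels=len(tags))
model.eval()

tokens = tokenizer.tokenize("[CLS] Jim Henson lives in London [SEP]")
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

# One logit vector per wordpiece; argmax picks a tag id per position.
with torch.no_grad():
    logits = model(input_ids)  # shape: (1, seq_len, num_labels)
pred = logits.argmax(dim=-1)[0].tolist()
print(list(zip(tokens, [tags[i] for i in pred])))
```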
457 |
458 | ## BERT Text Summarization Task:
459 |
460 | 1. [nlpyang/BertSum](https://github.com/nlpyang/BertSum), Code for the paper "Fine-tune BERT for Extractive Summarization",
461 | 
462 |
463 | 2. [santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning](https://github.com/santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning), Abstractive summarization using BERT as the encoder and a Transformer decoder,
464 | 
465 |
466 | 3. [nayeon7lee/bert-summarization](https://github.com/nayeon7lee/bert-summarization), Implementation of 'Pretraining-Based Natural Language Generation for Text Summarization', Paper: https://arxiv.org/pdf/1902.09243.pdf
467 | 
468 |
469 | 4. [dmmiller612/lecture-summarizer](https://github.com/dmmiller612/lecture-summarizer), Lecture summarizer with BERT
470 | 
471 |
472 |
473 |
474 |
475 | ## BERT Text Generation Task:
476 | 1. [asyml/texar](https://github.com/asyml/texar), Toolkit for Text Generation and Beyond, https://texar.io. Texar is a general-purpose text generation toolkit; it also implements BERT for classification, and for text generation applications when combined with Texar's other modules.
477 | 
478 |
479 | 2. [voidful/BertGenerate](https://github.com/voidful/BertGenerate), Fine-tuning BERT for text generation (some experiments on text generation with BERT)
480 | 
481 |
482 | 3. [Tiiiger/bert_score](https://github.com/Tiiiger/bert_score), BERT score for language generation (see the sketch below),
483 | 
484 |
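BERTScore replaces n-gram overlap with token-level cosine similarity between BERT embeddings of a candidate and a reference. A minimal sketch of the scoring call; the toy sentences are invented, and the argument names follow the package's documented interface, which may differ across versions.

```python
# pip install bert-score
from bert_score import score

cands = ["the cat sat on the mat"]
refs = ["a cat was sitting on the mat"]

# Returns per-sentence precision/recall/F1 tensors computed by greedy
# token matching over contextual BERT embeddings.
P, R, F1 = score(cands, refs, lang="en", verbose=False)
print(f"BERTScore F1: {F1.mean().item():.3f}")
```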
485 |
486 |
487 | ## BERT Knowledge Graph Task:
488 |
489 | 1. [lvjianxin/Knowledge-extraction](https://github.com/lvjianxin/Knowledge-extraction), Chinese knowledge extraction; baseline: Bi-LSTM+CRF, upgraded version: BERT pre-training
490 | 
491 |
492 | 2. [sakuranew/BERT-AttributeExtraction](https://github.com/sakuranew/BERT-AttributeExtraction), using BERT for attribute extraction in knowledge graphs, via fine-tuning and feature extraction; applied to extracting person attributes from Baidu Baike entries,
493 | 
494 |
495 | 3. [aditya-AI/Information-Retrieval-System-using-BERT](https://github.com/aditya-AI/Information-Retrieval-System-using-BERT),
496 | 
497 |
498 | 4. [jkszw2014/bert-kbqa-NLPCC2017](https://github.com/jkszw2014/bert-kbqa-NLPCC2017), A trial of KBQA based on BERT for NLPCC2016/2017 Task 5 (a working BERT-based Chinese knowledge-base QA implementation); blog post: https://blog.csdn.net/ai_1046067944/article/details/86707784 ,
499 | 
500 |
501 | 5. [yuanxiaosc/Schema-based-Knowledge-Extraction](https://github.com/yuanxiaosc/Schema-based-Knowledge-Extraction), Code for the information extraction task at http://lic2019.ccf.org.cn/kg; an end-to-end joint model for BERT-based entity and relation extraction (code and usage notes to be completed after the competition),
502 | 
503 |
504 | 6. [yuanxiaosc/Entity-Relation-Extraction](https://github.com/yuanxiaosc/Entity-Relation-Extraction), Entity and relation extraction based on TensorFlow; a pipeline-style solution to the information extraction task of the 2019 Language and Intelligence Challenge (Schema-based Knowledge Extraction, SKE 2019, http://lic2019.ccf.org.cn),
505 | 
506 |
507 | 7. [WenRichard/KBQA-BERT](https://github.com/WenRichard/KBQA-BERT), a knowledge-graph-based QA system using BERT for named entity recognition and sentence similarity, with online and offline modes; blog post: https://zhuanlan.zhihu.com/p/62946533 ,
508 | 
509 |
510 | 8. [zhpmatrix/BERTem](https://github.com/zhpmatrix/BERTem), implementation of the ACL 2019 paper "Matching the Blanks: Distributional Similarity for Relation Learning",
511 | 
512 |
513 |
514 |
515 |
516 | ## BERT Coreference Resolution
517 | 1. [ianycxu/RGCN-with-BERT](https://github.com/ianycxu/RGCN-with-BERT), Gated Relational Graph Convolutional Networks (RGCN) with BERT for the coreference resolution task
518 | 
519 |
520 | 2. [isabellebouchard/BERT_for_GAP-coreference](https://github.com/isabellebouchard/BERT_for_GAP-coreference), BERT finetuning for GAP unbiased pronoun resolution
521 | 
522 |
523 |
524 |
525 | ## BERT visualization toolkit:
526 | 1. [jessevig/bertviz](https://github.com/jessevig/bertviz), Tool for visualizing BERT's attention,
527 | 
528 |
529 |
530 | ## BERT chatbot:
531 | 1. [GaoQ1/rasa_nlu_gq](https://github.com/GaoQ1/rasa_nlu_gq), turn natural language into structured data (supports Chinese; defines several custom models for different scenarios and tasks),
532 | 
533 |
534 | 2. [GaoQ1/rasa_chatbot_cn](https://github.com/GaoQ1/rasa_chatbot_cn), a dialogue system demo built on rasa-nlu and rasa-core,
535 | 
536 |
537 | 3. [GaoQ1/rasa-bert-finetune](https://github.com/GaoQ1/rasa-bert-finetune), BERT fine-tuning support for rasa-nlu,
538 | 
539 |
540 | 4. [geodge831012/bert_robot](https://github.com/geodge831012/bert_robot), training a smart assistant to answer questions, built by adapting and training the BERT model
541 | 
542 |
543 | 5. [yuanxiaosc/BERT-for-Sequence-Labeling-and-Text-Classification](https://github.com/yuanxiaosc/BERT-for-Sequence-Labeling-and-Text-Classification), template code for using BERT for sequence labeling and text classification, to make it easier to apply BERT to more tasks. Currently the template covers CoNLL-2003 named entity recognition, Snips slot filling and intent prediction.
544 | 
545 |
546 | 6. [guillaume-chevalier/ReuBERT](https://github.com/guillaume-chevalier/ReuBERT), A question-answering chatbot, simply.
547 | 
548 |
549 |
550 |
551 | ## BERT language model and embedding:
552 |
553 | 1. [hanxiao/bert-as-service](https://github.com/hanxiao/bert-as-service), Mapping a variable-length sentence to a fixed-length vector using a pretrained BERT model (see the sketch after this list),
554 | 
555 |
556 | 2. [YC-wind/embedding_study](https://github.com/YC-wind/embedding_study), learning character embeddings from Chinese pre-trained models; evaluates BERT and ELMo on Chinese,
557 | 
558 |
559 | 3. [Kyubyong/bert-token-embeddings](https://github.com/Kyubyong/bert-token-embeddings), Bert Pretrained Token Embeddings,
560 | 
561 |
562 | 4. [xu-song/bert_as_language_model](https://github.com/xu-song/bert_as_language_model), BERT as a language model, fork of https://github.com/google-research/bert,
563 | 
564 |
565 | 5. [yuanxiaosc/Deep_dynamic_word_representation](https://github.com/yuanxiaosc/Deep_dynamic_word_representation), TensorFlow code and pre-trained models for deep dynamic word representation (DDWR); combines the BERT model and ELMo's deep context word representation,
566 | 
567 |
568 | 6. [imgarylai/bert-embedding](https://github.com/imgarylai/bert-embedding), Token level embeddings from BERT model on mxnet and gluonnlp http://bert-embedding.readthedocs.io/,
569 | 
570 |
571 | 7. [terrifyzhao/bert-utils](https://github.com/terrifyzhao/bert-utils), sentence embeddings, text classification and text-similarity computation with BERT,
572 | 
573 |
574 | 8. [fennuDetudou/BERT_implement](https://github.com/fennuDetudou/BERT_implement), text classification, sentence-similarity judgment and part-of-speech tagging with BERT,
575 | 
576 |
577 | 9. [whqwill/seq2seq-keyphrase-bert](https://github.com/whqwill/seq2seq-keyphrase-bert), add BERT to encoder part for https://github.com/memray/seq2seq-keyphrase-pytorch,
578 | 
579 |
580 | 10. [charles9n/bert-sklearn](https://github.com/charles9n/bert-sklearn), a scikit-learn wrapper for Google's BERT model,
581 | 
582 |
583 |
584 | 11. [NVIDIA/Megatron-LM](https://github.com/NVIDIA/Megatron-LM), Ongoing research training transformer language models at scale, including: BERT,
585 | 
586 |
587 | 12. [hankcs/BERT-token-level-embedding](https://github.com/hankcs/BERT-token-level-embedding), Generate BERT token level embedding without pain
588 | 
589 |
590 | 13. [facebookresearch/LAMA](https://github.com/facebookresearch/LAMA), LAMA: LAnguage Model Analysis, LAMA is a set of connectors to pre-trained language models.
591 | 
592 |
593 |
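bert-as-service (entry 1) splits the work between a server process that runs the TensorFlow graph and lightweight clients that request sentence vectors over the network. A client-side sketch; it assumes a server was started separately against a downloaded BERT checkpoint, e.g. `bert-serving-start -model_dir /path/to/bert -num_worker=2`, with the path pointing at whatever checkpoint you unpacked.

```python
# pip install bert-serving-server bert-serving-client
from bert_serving.client import BertClient

# Connects to a bert-serving-start process on localhost by default.
bc = BertClient()

# Each variable-length sentence maps to one fixed-length vector
# (768 dimensions for a BERT-base checkpoint).
vecs = bc.encode(["First do it", "then do it right", "then do it better"])
print(vecs.shape)  # (3, 768)
```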
594 |
595 |
596 | ## BERT Text Match:
597 |
598 | 1. [pengming617/bert_textMatching](https://github.com/pengming617/bert_textMatching), a BERT-based semantic matching model built on the pre-trained Chinese model; trained on the official LCQMC dataset
599 | 
600 |
601 | 2. [Brokenwind/BertSimilarity](https://github.com/Brokenwind/BertSimilarity), Computing similarity of two sentences with google's BERT algorithm
602 | 
603 |
604 | 3. [policeme/chinese_bert_similarity](https://github.com/policeme/chinese_bert_similarity), BERT Chinese sentence similarity
605 | 
606 |
607 | 4. [lonePatient/bert-sentence-similarity-pytorch](https://github.com/lonePatient/bert-sentence-similarity-pytorch), This repo contains a PyTorch implementation of a pretrained BERT model for sentence similarity task.
608 | 
609 |
610 | 5. [nouhadziri/DialogEntailment](https://github.com/nouhadziri/DialogEntailment), The implementation of the paper "Evaluating Coherence in Dialogue Systems using Entailment" https://arxiv.org/abs/1904.03371
611 | 
612 |
613 | 6. [UKPLab/sentence-transformers](https://github.com/UKPLab/sentence-transformers), Sentence Transformers: sentence embeddings using BERT / RoBERTa / XLNet with PyTorch (see the sketch after this list),
615 | 
616 |
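Sentence-BERT (entry 6) fine-tunes BERT in a siamese setup so that plain cosine similarity over pooled sentence embeddings becomes a usable matching score. A sketch using `bert-base-nli-mean-tokens`, a model name from the project's early releases; current releases may ship different names.

```python
# pip install sentence-transformers
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('bert-base-nli-mean-tokens')

sentences = ["A man is eating food.", "Someone is eating a meal.", "The sky is blue."]
embeddings = model.encode(sentences)  # one fixed-size vector per sentence

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The paraphrase pair scores higher than the unrelated pair.
print(cosine(embeddings[0], embeddings[1]), cosine(embeddings[0], embeddings[2]))
```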
617 |
618 | ## BERT tutorials:
619 |
620 | 1. [graykode/nlp-tutorial](https://github.com/graykode/nlp-tutorial), Natural Language Processing Tutorial for Deep Learning Researchers https://www.reddit.com/r/MachineLearn…,
621 | 
622 |
623 | 2. [dragen1860/TensorFlow-2.x-Tutorials](https://github.com/dragen1860/TensorFlow-2.x-Tutorials), TensorFlow 2.x tutorials and examples, including CNN, RNN, GAN, autoencoders, Faster R-CNN, GPT, BERT examples, etc.; introductory example code and hands-on tutorials for TF 2.0,
624 | 
625 |
626 |
627 |
--------------------------------------------------------------------------------