├── 2021
│   ├── July
│   │   ├── 2021-07-01_simple_recurrent_units_for_highly_parallelizable_recurrence.pdf
│   │   ├── 2021-07-08_taming_pretrained_transformers_for_extreme_multi-label_text_classification.pdf
│   │   └── 2021-07-15_data_augmentation_via_dependency_tree_morphing_for_low_resource_languages.pdf
│   └── June
│       ├── 2021-06-10 Seq2Seq.pdf
│       ├── 2021-06-17_Bleu.pdf
│       └── 2021-06-24_Attention_is_All_you_need.pdf
└── README.md

/2021/July/2021-07-01_simple_recurrent_units_for_highly_parallelizable_recurrence.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/July/2021-07-01_simple_recurrent_units_for_highly_parallelizable_recurrence.pdf
--------------------------------------------------------------------------------

/2021/July/2021-07-08_taming_pretrained_transformers_for_extreme_multi-label_text_classification.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/July/2021-07-08_taming_pretrained_transformers_for_extreme_multi-label_text_classification.pdf
--------------------------------------------------------------------------------

/2021/July/2021-07-15_data_augmentation_via_dependency_tree_morphing_for_low_resource_languages.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/July/2021-07-15_data_augmentation_via_dependency_tree_morphing_for_low_resource_languages.pdf
--------------------------------------------------------------------------------

/2021/June/2021-06-10 Seq2Seq.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/June/2021-06-10 Seq2Seq.pdf
--------------------------------------------------------------------------------

/2021/June/2021-06-17_Bleu.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/June/2021-06-17_Bleu.pdf
--------------------------------------------------------------------------------

/2021/June/2021-06-24_Attention_is_All_you_need.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/DataTalksClub/reading-club-nlp/fb1a7953169736409691c70c3b23cc01c5f5559b/2021/June/2021-06-24_Attention_is_All_you_need.pdf
--------------------------------------------------------------------------------

/README.md:
--------------------------------------------------------------------------------
1 | # reading-club-nlp
2 | Notes from our NLP reading club!
3 | 
4 | ## Papers we have read
5 | 
6 | ### 2021 June
7 | 
8 | [2021-06-10 Sequence to Sequence Learning with Neural Networks](./2021/June/2021-06-10%20Seq2Seq.pdf) (Sutskever et al., 2014)
9 | 
10 | [2021-06-17 Evaluating Text Output in NLP: BLEU at your own risk](./2021/June/2021-06-17_Bleu.pdf) (Rachael Tatman, 2019)
11 | 
12 | [2021-06-24 Attention is All You Need](./2021/June/2021-06-24_Attention_is_All_you_need.pdf) (Vaswani et al., 2017)
13 | 
14 | ### 2021 July
15 | 
16 | [2021-07-01 Simple Recurrent Units for Highly Parallelizable Recurrence](./2021/July/2021-07-01_simple_recurrent_units_for_highly_parallelizable_recurrence.pdf) (Lei et al., 2018)
17 | 
18 | [2021-07-08 Taming Pretrained Transformers for Extreme Multi-label Text Classification](./2021/July/2021-07-08_taming_pretrained_transformers_for_extreme_multi-label_text_classification.pdf) (Chang et al., 2020)
19 | 
20 | [2021-07-15 Data Augmentation via Dependency Tree Morphing for Low-Resource Languages](./2021/July/2021-07-15_data_augmentation_via_dependency_tree_morphing_for_low_resource_languages.pdf) (Sahin et al., 2019)
21 | 
22 | 
--------------------------------------------------------------------------------