├── .gitignore
├── LICENSE
├── README.md
├── papers
│   ├── A Neural Probabilistic Language Model.pdf
│   ├── Attention Is All You Need.pdf
│   ├── BERT - Pre-training of Deep Bidirectional Transformers for.pdf
│   ├── Effective Approaches to Attention-based Neural Machine Translation.pdf
│   ├── Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems.pdf
│   ├── Learning Phrase Representations using RNN Encoder–Decoder.pdf
│   ├── NEURAL MACHINE TRANSLATION.pdf
│   ├── Neural Machine Translation and Sequence-to-sequence Models - A Tutorial.pdf
│   └── Universal Sentence Encoder.pdf
├── presentation
│   └── DTL NLP Manning - DS.pdf
└── tutorials
    ├── Deep_TL_for_NLP_Multi_task_NLP_Transformers.ipynb
    └── Deep_TL_for_NLP_Text_Classification.ipynb

/.gitignore:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/.gitignore
--------------------------------------------------------------------------------

/LICENSE:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/LICENSE
--------------------------------------------------------------------------------

/README.md:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/README.md
--------------------------------------------------------------------------------

/papers/A Neural Probabilistic Language Model.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/A Neural Probabilistic Language Model.pdf
--------------------------------------------------------------------------------

/papers/Attention Is All You Need.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Attention Is All You Need.pdf
--------------------------------------------------------------------------------

/papers/BERT - Pre-training of Deep Bidirectional Transformers for.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/BERT - Pre-training of Deep Bidirectional Transformers for.pdf
--------------------------------------------------------------------------------

/papers/Effective Approaches to Attention-based Neural Machine Translation.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Effective Approaches to Attention-based Neural Machine Translation.pdf
--------------------------------------------------------------------------------

/papers/Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems.pdf
--------------------------------------------------------------------------------

/papers/Learning Phrase Representations using RNN Encoder–Decoder.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Learning Phrase Representations using RNN Encoder–Decoder.pdf
--------------------------------------------------------------------------------

/papers/NEURAL MACHINE TRANSLATION.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/NEURAL MACHINE TRANSLATION.pdf
--------------------------------------------------------------------------------

/papers/Neural Machine Translation and Sequence-to-sequence Models - A Tutorial.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Neural Machine Translation and Sequence-to-sequence Models - A Tutorial.pdf
--------------------------------------------------------------------------------

/papers/Universal Sentence Encoder.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/papers/Universal Sentence Encoder.pdf
--------------------------------------------------------------------------------

/presentation/DTL NLP Manning - DS.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/presentation/DTL NLP Manning - DS.pdf
--------------------------------------------------------------------------------

/tutorials/Deep_TL_for_NLP_Multi_task_NLP_Transformers.ipynb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/tutorials/Deep_TL_for_NLP_Multi_task_NLP_Transformers.ipynb
--------------------------------------------------------------------------------

/tutorials/Deep_TL_for_NLP_Text_Classification.ipynb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dipanjanS/live-manning-nlpconf20/HEAD/tutorials/Deep_TL_for_NLP_Text_Classification.ipynb
--------------------------------------------------------------------------------