# EMNLP 2021 Tutorial: Multi-Domain Multilingual Question Answering

If you have registered for the conference, you can find the conference session on Underline [here](https://underline.io/events/192/sessions?eventSessionId=7844).

The slides are publicly available at: https://tinyurl.com/multi-qa-tutorial

If you found the content of [the tutorial](https://aclanthology.org/2021.emnlp-tutorials.4/) helpful, consider citing it as:

```
@inproceedings{ruder-sil-2021-multi,
    title = "Multi-Domain Multilingual Question Answering",
    author = "Ruder, Sebastian and
      Sil, Avi",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic {\&} Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-tutorials.4",
    pages = "17--21",
    abstract = "Question answering (QA) is one of the most challenging and impactful tasks in natural language processing. Most research in QA, however, has focused on the open-domain or monolingual setting while most real-world applications deal with specific domains or languages. In this tutorial, we attempt to bridge this gap. Firstly, we introduce standard benchmarks in multi-domain and multilingual QA. In both scenarios, we discuss state-of-the-art approaches that achieve impressive performance, ranging from zero-shot transfer learning to out-of-the-box training with open-domain QA systems. Finally, we will present open research problems that this new research agenda poses such as multi-task learning, cross-lingual transfer learning, domain adaptation and training large scale pre-trained multilingual language models.",
}
```