├── LICENSE
└── README.md

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2020 Aditya Khandelwal

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Transformers-For-Negation-and-Speculation
1. Transformers_for_Negation_and_Speculation.ipynb:
> This is the code for the following papers:
* [NegBERT: A Transfer Learning Approach for Negation Detection and Scope Resolution](https://arxiv.org/abs/1911.04211) (Published at LREC 2020)
* [Resolving the Scope of Speculation and Negation using Transformer-Based Architectures](https://arxiv.org/abs/2001.02885)
2. Multitask_Learning_of_Negation_and_Speculation.ipynb:
> This is the code for the following paper:
* [Multitask Learning of Negation and Speculation using Transformers](https://www.aclweb.org/anthology/2020.louhi-1.9.pdf) (Published at LOUHI 2020)

Code from the following sources has been used:
* The starter code was taken from [this article on Named Entity Recognition with BERT](https://www.depends-on-the-definition.com/named-entity-recognition-with-bert/).
* To implement XLNetForTokenClassification and RobertaForTokenClassification, some code was copied from the [Transformers library by Hugging Face](https://github.com/huggingface/transformers) code base (a minimal token-classification sketch appears at the end of this README).
* The early-stopping implementation for PyTorch was adapted from [Bjarten's implementation (GitHub)](https://github.com/Bjarten/early-stopping-pytorch/blob/master/pytorchtools.py) (see the sketch at the end of this README).

The Colab notebooks can be executed directly from Google Colaboratory.

Datasets:
1. [BioScope Corpus](https://rgai.sed.hu/node/105)
2. [Sherlock (*SEM 2012 Shared Task)](https://www.clips.uantwerpen.be/sem2012-st-neg/)
3. [SFU Review Corpus](https://www.sfu.ca/~mtaboada/SFU_Review_Corpus.html)

Contributors: [Aditya Khandelwal](https://github.com/adityak6798), [Benita Kathleen Britto](https://github.com/benitakbritto)
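
For orientation, here is a minimal sketch of token classification with the Hugging Face Transformers library, in the spirit of the cue-detection / scope-resolution setup used in the notebooks. The base model, the `O`/`CUE`/`SCOPE` tag set, and the example sentence are illustrative assumptions, not the exact configuration from the papers; the classification head is randomly initialised until fine-tuned.

```python
# Minimal token-classification sketch (illustrative; not the papers' exact setup).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "bert-base-uncased"   # NegBERT-style models fine-tune a BERT encoder
labels = ["O", "CUE", "SCOPE"]     # hypothetical per-token tag set

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=len(labels))

sentence = "The patient did not show any signs of infection."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits            # shape: (1, seq_len, num_labels)

# Per-token predictions (meaningless until the model is fine-tuned on cue/scope data).
predictions = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, tag_id in zip(tokens, predictions):
    print(f"{token:15s} {labels[tag_id]}")
```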
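
Similarly, this is a minimal early-stopping helper in the spirit of the adapted pytorchtools implementation. The patience value and checkpoint path are illustrative choices, and `validate` in the usage comment stands in for a user-defined validation step.

```python
# Minimal early-stopping helper (illustrative sketch, not the adapted code verbatim).
import torch

class EarlyStopping:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=3, path="checkpoint.pt"):
        self.patience = patience
        self.path = path
        self.counter = 0
        self.best_loss = float("inf")
        self.early_stop = False

    def __call__(self, val_loss, model):
        if val_loss < self.best_loss:
            # Improvement: checkpoint the weights and reset the counter.
            self.best_loss = val_loss
            torch.save(model.state_dict(), self.path)
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.early_stop = True

# Typical use inside a training loop:
# stopper = EarlyStopping(patience=3)
# for epoch in range(num_epochs):
#     val_loss = validate(model, val_loader)   # user-defined validation step
#     stopper(val_loss, model)
#     if stopper.early_stop:
#         break
```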