├── .gitignore ├── 01-introduction.md ├── 02-ner-intro.md ├── 03-gmb-ner-bert.ipynb ├── 03a-gmb-ner-xlmr.ipynb ├── 03b-gmb-ner-distilbert.ipynb ├── 04-re-intro.md ├── 05-nyt-re-bert.ipynb ├── 05a-nyt-re-bert.ipynb ├── 05b-nyt-re-bert.ipynb ├── 06-conclusion.md ├── LICENSE ├── README.md └── figures ├── bag-of-features-kernel.png ├── bilstm-crf.png ├── book-oreilly.png ├── book-packt.png ├── linear-crf.png ├── my-headshot.jpg ├── odsc-2022-blog-fig-1.png ├── odsc-2022-blog-fig-2.png ├── pcnn-arch.png ├── re-transformer-archs.png ├── re-triples-table.png └── transformer-based-ner.png /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | pip-wheel-metadata/ 24 | share/python-wheels/ 25 | *.egg-info/ 26 | .installed.cfg 27 | *.egg 28 | MANIFEST 29 | 30 | # PyInstaller 31 | # Usually these files are written by a python script from a template 32 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 33 | *.manifest 34 | *.spec 35 | 36 | # Installer logs 37 | pip-log.txt 38 | pip-delete-this-directory.txt 39 | 40 | # Unit test / coverage reports 41 | htmlcov/ 42 | .tox/ 43 | .nox/ 44 | .coverage 45 | .coverage.* 46 | .cache 47 | nosetests.xml 48 | coverage.xml 49 | *.cover 50 | *.py,cover 51 | .hypothesis/ 52 | .pytest_cache/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | target/ 76 | 77 | # Jupyter Notebook 78 | .ipynb_checkpoints 79 | 80 | # IPython 81 | profile_default/ 82 | ipython_config.py 83 | 84 | # pyenv 85 | .python-version 86 | 87 | # pipenv 88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 91 | # install all needed dependencies. 92 | #Pipfile.lock 93 | 94 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow 95 | __pypackages__/ 96 | 97 | # Celery stuff 98 | celerybeat-schedule 99 | celerybeat.pid 100 | 101 | # SageMath parsed files 102 | *.sage.py 103 | 104 | # Environments 105 | .env 106 | .venv 107 | env/ 108 | venv/ 109 | ENV/ 110 | env.bak/ 111 | venv.bak/ 112 | 113 | # Spyder project settings 114 | .spyderproject 115 | .spyproject 116 | 117 | # Rope project settings 118 | .ropeproject 119 | 120 | # mkdocs documentation 121 | /site 122 | 123 | # mypy 124 | .mypy_cache/ 125 | .dmypy.json 126 | dmypy.json 127 | 128 | # Pyre type checker 129 | .pyre/ 130 | -------------------------------------------------------------------------------- /01-introduction.md: -------------------------------------------------------------------------------- 1 | ## Introductions 2 | 3 | * **Who am I** -- Machine Learning Engineer at Elsevier Labs, with interests in Deep Learning, NLP, Search, Knowledge Graphs, etc. 
4 | * **Who you should be (ideally)** 5 | * have some experience training PyTorch models, 6 | * have some familiarity with the HuggingFace Transformers and Datasets APIs, 7 | * be interested in Named Entity Recognition (NER) and Relation Extraction (RE), 8 | * be curious about what one can do in this area with HuggingFace Transformers. 9 | * **What you will learn** -- how to implement and fine-tune NER and RE components using HuggingFace transformers. 10 | 11 | --- 12 | 13 | ## Agenda 14 | 15 | * **Introductions** -- first 15 mins (we are here) 16 | * Introduce the different components 17 | * **Hands on Transformer based NER** -- 1 hour 18 | * Intuition behind Transformer based NER 19 | * Walk-through of code 20 | * **Hands on Transformer based RE** -- 1 hour 21 | * Intuition behind Transformer based RE 22 | * Walk-through of code 23 | * **Wrap-up** -- last 15 mins 24 | * References -- where you can find out more 25 | 26 | --- 27 | 28 | ## The Actors 29 | 30 | * Named Entity Recognition (NER) 31 | * Relation Extraction (RE) 32 | * Transformers 33 | * Transfer Learning 34 | 35 | --- 36 | 37 | ## Named Entity Recognition 38 | 39 | * **Named Entity Recognition (NER)** (also known as **(named) entity identification**, **entity chunking** and **entity extraction**) is a subtask of information extraction that seeks to locate and classify named entities mentioned in unstructured text into pre-defined categories such as person names, organizations, locations, medical codes, time expressions, quantities, monetary values, percentages, etc. (_Wikipedia_). 40 | 41 | * Converts unstructured text to structured list of Named Entities. 42 | * 43 | * 44 | | Matched Text | Start Offset | End Offset | Entity Type | 45 | | ------------------------------------- | ------------ | ---------- | ----------- | 46 | | December 1903 | 3 | 16 | DATE | 47 | | the Royal Swedish Academy of Sciences | 18 | 55 | ORG | 48 | | Marie | 64 | 69 | PER | 49 | | Pierre Curie | 74 | 86 | PER | 50 | | Henri Becquerel | 99 | 114 | PER | 51 | | the Nobel Prize in Physics | 115 | 141 | WORK_OF_ART | 52 | 53 | * **Applications** 54 | * Information Retrieval (things not strings) 55 | * Clustering / Categorization / Classification 56 | * Summarization (derive salient topics from named entities) 57 | * Foundation for downstream tasks such as Relation Extraction 58 | 59 |

60 | 61 | _Image Credit: [DisplaCy Named Entity Visualizer](https://explosion.ai/demos/displacy-ent)_ 62 | 63 |

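A minimal sketch (not part of the tutorial notebooks) of producing entity spans like those in the table above with the HuggingFace `pipeline` API; the checkpoint name is only an illustrative public model, and the exact entity labels depend on the tag set it was trained on:

```python
# Illustrative sketch: run an off-the-shelf NER pipeline to get entity spans
# with character offsets, similar to the table above.
from transformers import pipeline

# "dslim/bert-base-NER" is just an example public checkpoint; any
# token-classification model from the Hub can be substituted.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

text = ("In December 1903 the Royal Swedish Academy of Sciences awarded "
        "Marie and Pierre Curie and Henri Becquerel the Nobel Prize in Physics.")

for ent in ner(text):
    print(ent["word"], ent["start"], ent["end"], ent["entity_group"])
```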
64 | 65 | --- 66 | 67 | ## Relation Extraction 68 | 69 | * **Relation Extraction** requires the detection and classification of semantic relationship mentions within a set of named entities. Relationship extraction involves the identification of relations between entities and it usually focuses on the extraction of binary relations. (_Wikipedia, slightly paraphrased_). 70 | * Discovers Relations that connect Named Entities, converting unstructured text to a Graph. 71 | 72 | 73 | 74 | 75 | 76 |
77 | 78 | * **Applications** 79 | * Knowledge Base Construction 80 | * Question Answering 81 | * Text Analysis in different domains (legal, biomedical) 82 | 83 |

84 | 85 | _Image Credit: Built using Neo4J Console and Cypher_ 86 | 87 |

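To make the "text to graph" idea concrete, a small illustrative sketch that loads hand-written (subject, relation, object) triples into a graph structure -- `networkx` is used here purely for illustration (the figure above was built with Neo4j), and the triples are hypothetical examples:

```python
# Illustrative sketch: binary relations as (subject, relation, object) triples,
# loaded into a graph structure.
import networkx as nx

triples = [
    ("Marie Curie", "AWARDED", "Nobel Prize in Physics"),
    ("Pierre Curie", "AWARDED", "Nobel Prize in Physics"),
    ("Marie Curie", "SPOUSE_OF", "Pierre Curie"),   # hypothetical example relations
]

g = nx.MultiDiGraph()
for subj, rel, obj in triples:
    g.add_edge(subj, obj, label=rel)

print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")
```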
88 | 89 | --- 90 | 91 | ## Transformers 92 | 93 | * Proposed in 2017 by Vaswani et al. ([Attention is all you need](https://arxiv.org/abs/1706.03762)) 94 | * Basic component behind the NER and RE architectures we will talk about today 95 | * Transformer based models have achieved SOTA results on many NLP tasks 96 | * Improves on ConvNets -- the receptive field of **Self-Attention** is the full input. 97 | * Improves on RNNs -- handles sequential input in parallel using positional embeddings. 98 | * Both Transformer based NER and RE models use only the **Encoder** portion of the Transformer architecture (see the loading sketch after the figure below). 99 | 100 |

101 | 102 | 103 | _The Transformer Architecture (Image Source: [Dive Into Deep Learning](https://d2l.ai/index.html))_ 104 |

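A minimal sketch of the encoder-only usage described above (assuming the `transformers` library and an example checkpoint): tokenize a sentence and pull out one contextual vector per subword token. This is also the "feature extractor" mode of transfer learning discussed on the next slide:

```python
# Minimal sketch: use a pre-trained Transformer encoder as a feature extractor.
# "bert-base-cased" is just an example checkpoint.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
encoder = AutoModel.from_pretrained("bert-base-cased")

inputs = tokenizer("Marie Curie won the Nobel Prize in Physics.", return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**inputs)

# one contextual vector per subword token: [batch, seq_len, hidden_size]
print(outputs.last_hidden_state.shape)
```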
105 | 106 | --- 107 | 108 | ## Transfer Learning 109 | 110 | * Process of transferring knowledge from one model to another. 111 | * **Foundation Models** -- large transformer models (many parameters) pre-trained on large volumes of data. 112 | * Training pre-trained foundation models on new tasks usually results in better performance than training from scratch. 113 | * **Feature Extractor** -- encode data using pre-trained model and use encoding to train a simpler model with less training data. 114 | * **Fine Tuning** -- replace / add task specific layer and continue training the whole model; parameter values of trained model are used as initial starting point for task specific training. 115 | * HuggingFace 🤗 provides one-stop shop for using Transformers: 116 | * [Pre-trained models](https://huggingface.co/models) 117 | * [Major NLP Datasets](https://huggingface.co/datasets) 118 | * [APIs to train/fine-tune transformers and handle datasets](https://huggingface.co/docs) -- includes Tokenizers, Transformers and Transformer based networks for specific applications. 119 | 120 | --- 121 | 122 | -------------------------------------------------------------------------------- /02-ner-intro.md: -------------------------------------------------------------------------------- 1 | ## Techniques for Named Entity Recognition 2 | 3 | * **Dictionary Based** 4 | * **Regex based** 5 | * **Model based** 6 | * **Sequence to Sequence** model -- sequence of tokens go in, sequence of IOB tags come out. 7 | 8 | | **tokens** | Joe | Biden | is | the | president | of | the | United | States | . | 9 | | ---------- | --------- | --------- | -- | --- | --------- | -- | --- | ---------- | ---------- | - | 10 | | **tags** | **B-PER** | **I-PER** | O | O | O | O | O | **B-GPE** | **B-GPE** | O | 11 | 12 | * Popular NER models 13 | * **Hidden Markov Model** 14 | * **Linear CRF (Conditional Random Field)** 15 | * **BiLSTM-CRF** 16 | * **Transformers** 17 | 18 | --- 19 | 20 | ## Hidden Markov Model 21 | 22 | * Markov assumption: Event at time _t_ can be predicted from events at times (_t-1, ..., t-N_) where _N_ is small. 23 | * Uses engineered word level features. 24 | * Features from recent past used to predict tag at position _t_. 25 | * Most probable tag sequence assigned to token sequence using Viterbi algorithm. 26 | 27 | --- 28 | 29 | ## Linear Chain CRF 30 | 31 | * Graphical model that calculates the conditional probability of a tag sequence _c = (c1, ..., cN)_ given an observed token sequence _o = (o1, ..., oN)_. 32 | * Considers neighboring examples when making prediction. 33 | * Still uses features generated through feature engineering. 34 | 35 |

36 | 37 | 38 | _A Linear chain Conditional Random Fields model. Image Source: [Building a Named Entity Recognition model using a BiLSTM-CRF network](https://blog.dominodatalab.com/named-entity-recognition-ner-challenges-and-model))_ 39 |

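A small illustrative sketch of a feature-engineered linear chain CRF tagger, using the third-party `sklearn-crfsuite` package (an assumption; it is not used in the tutorial notebooks) with a toy sentence and hand-crafted word-level features:

```python
# Illustrative sketch of a feature-engineered linear chain CRF for NER.
import sklearn_crfsuite

def word_features(tokens, i):
    # simple hand-crafted features for the token at position i
    w = tokens[i]
    return {
        "word.lower": w.lower(),
        "word.istitle": w.istitle(),
        "word.isdigit": w.isdigit(),
        "prev.lower": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next.lower": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }

# toy training data: one tokenized sentence with IOB tags
sents = [["Joe", "Biden", "is", "the", "president", "."]]
tags = [["B-PER", "I-PER", "O", "O", "O", "O"]]

X = [[word_features(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, tags)
print(crf.predict(X))
```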
40 | 41 | --- 42 | 43 | ## BiLSTM-CRF 44 | 45 | * Popular neural architecture for NER. 46 | * End-to-end model, features are inferred during training. 47 | * Recurrent Network (LSTM) considers all time steps prior to current time step (LHS context). 48 | * Bidirectional LSTM considers LHS and RHS context together, provides neighborhood context information. 49 | * Outputs of both LSTM directions are fed to a linear chain CRF, which effectively acts as an attention mechanism over neighboring tags (a minimal layer sketch follows the figure below). 50 | 51 |

52 | 53 | 54 | _Architecture of a BiLSTM-CRF Model. (Image Source: [Building a Named Entity Recognition model using a BiLSTM-CRF network](https://blog.dominodatalab.com/named-entity-recognition-ner-challenges-and-model))_ 55 |

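The layer sketch referenced above -- a minimal PyTorch version of the BiLSTM portion of the tagger; the CRF layer on top is omitted here and would in practice come from a package such as `pytorch-crf` or a custom implementation:

```python
# Minimal sketch of the BiLSTM part of a BiLSTM-CRF tagger (PyTorch).
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        # 2 * hidden_dim because forward and backward states are concatenated
        self.emissions = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embedding(token_ids)   # [batch, seq, emb]
        x, _ = self.lstm(x)             # [batch, seq, 2*hidden]
        return self.emissions(x)        # per-token scores over the tag vocabulary

scores = BiLSTMTagger(vocab_size=10000, num_tags=9)(torch.randint(0, 10000, (2, 12)))
print(scores.shape)  # torch.Size([2, 12, 9])
```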
56 | 57 | --- 58 | 59 | ## Transformer based NER 60 | 61 | * Modeled as a token classification task, HuggingFace provides `XXXForTokenClassification` models to do NER. 62 | * Input token sequence is wrapped in [CLS] and [SEP] special tokens, and subword tokenized. 63 | * Transformer Encoder (BERT) used to generate embeddings, output sent to shared Linear layer to produce logits across the tag vocabulary (see the fine-tuning sketch after the figure below). 64 | * Self-attention plus linear layer serves as equivalent of CRF head. 65 | 66 |

67 | 68 | 69 | _Architecture of a Transformer based NER Model (Image Source: [Tuning Multilingual Transformers for Named Entity Recognition on Slavic Languages](https://aclanthology.org/W19-3712.pdf))_ 70 |

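The fine-tuning sketch referenced above -- a minimal illustration of NER as token classification with HuggingFace Transformers. The checkpoint and label names are illustrative (the notebooks use the GMB tag set), and `word_ids()` shows how word-level IOB tags are aligned to subword tokens:

```python
# Minimal sketch of NER as token classification with HuggingFace transformers.
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-GEO", "I-GEO"]  # example subset
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels))

words = ["Joe", "Biden", "is", "the", "president", "."]
enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# word_ids() maps each subword (and [CLS]/[SEP]) back to its source word,
# which is how word-level IOB tags get aligned to subword tokens.
print(enc.word_ids())

logits = model(**enc).logits   # [batch, num_subwords, num_labels]
print(logits.shape)
```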
71 | 72 | -------------------------------------------------------------------------------- /04-re-intro.md: -------------------------------------------------------------------------------- 1 | ## Techniques for Relation Extraction 2 | 3 | * Non-trivial task, usually domain dependent 4 | > General purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction. Efforts have been made to build general purpose extractors that represent relations with their surface forms, or which jointly embed surface forms with relations from an existing knowledge graph. _(from [Matching the Blanks: Distributional Similarity for Relation Learning](https://arxiv.org/abs/1906.03158))_ 5 | * Most RE work focuses on extracting binary relations, i.e. of the form: _AWARDED(Marie Curie, Nobel Prize)_ 6 | * Relation Extraction is an **N-way Classification Problem** 7 | * **RE Techniques** 8 | * Unsupervised / Semi-supervised 9 | * Supervised 10 | * Feature based Methods -- involve feature engineering and domain specific heuristic choices. 11 | * Kernel Methods 12 | * **Bag of Features Kernel** 13 | * Tree Kernel 14 | * **Piecewise CNN (PCNN)** 15 | * **Transformer based RE** 16 | 17 | --- 18 | 19 | ## Bag of Features Kernel 20 | 21 | 22 | 23 | * Context around entity mentions (span before, span middle, span after) is useful to determine the relation type between the entities **Infosys** and **India**. 24 | * 3 sub-kernels (composed of string subsequences from each span) are created. 25 | * An SVM RE classifier is built on the intuition that another sentence with similar sub-kernels is likely to indicate a similar relation. 26 | * Tree kernels compute similarity between two entity-augmented shallow parse tree structures. 27 | 28 | --- 29 | 30 | ## Piecewise CNN (PCNN) 31 | 32 | * Popular general purpose neural architecture for Relation Extraction. 33 | * Each sentence can naturally be divided into 3 spans based on the positions of the entities in focus: 34 | * span-before + entity-1, 35 | * span-middle + entity-2, 36 | * span-after 37 | * MAXPOOL from each segment is computed separately and concatenated. 38 | * Finally passed to an N-way Linear (Dense) layer for classifying the relation type (see the pooling sketch after the figure below). 39 | 40 |

41 | 42 | 43 | _Architecture of the PCNN Model (Image Source: [Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks](https://aclanthology.org/D15-1203/))_ 44 |

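The pooling sketch referenced above -- a minimal PyTorch illustration of piecewise max pooling, with illustrative entity positions and dimensions:

```python
# Minimal sketch of piecewise max pooling (the "P" in PCNN).
import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=100, out_channels=64, kernel_size=3, padding=1)
classifier = nn.Linear(3 * 64, 5)          # 5 = example number of relation classes

emb = torch.randn(1, 100, 20)              # [batch, emb_dim, seq_len] for 20 tokens
feats = conv(emb)                          # [1, 64, 20]

e1, e2 = 4, 12                             # illustrative positions of the two entities
segments = [feats[:, :, : e1 + 1], feats[:, :, e1 + 1 : e2 + 1], feats[:, :, e2 + 1 :]]
pooled = torch.cat([seg.max(dim=2).values for seg in segments], dim=1)  # [1, 3*64]

logits = classifier(pooled)                # relation scores
print(logits.shape)                        # torch.Size([1, 5])
```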
45 | 46 | --- 47 | 48 | ## Transformer based RE 49 | 50 | * HuggingFace 🤗 does not provide a `XXXForRelationExtraction` model out of the box. 51 | * We create one with an `XXXForPreTraining` body and a (`Dropout`+`LayerNorm`+`Linear`) classifier head. 52 | * Our implementation is model (e) below. 53 | * We introduce new marker token pairs for subject and object mentions for each ENTITY_TYPE. 54 | * Enclose each of our entity mentions (subject and object) within these entity marker tokens. 55 | * MAXPOOL the tokens between the subject and object entity marker tags and concatenate. 56 | * Input the concatenated vector into the custom classifier head and learn to predict one of N relation classes (a minimal sketch follows the figure below). 57 | 58 |

59 | 60 | 61 | _Architecture Variations for Transformer based Relation Extractors (Image Source: [Matching the Blanks: Distributional Similarity for Relation Learning](https://arxiv.org/abs/1906.03158))_ 62 |

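The sketch referenced above -- a minimal illustration of the entity markers + mention pooling idea. The marker token strings, checkpoint, and number of relation classes are illustrative, not the exact names used in the notebooks:

```python
# Minimal sketch of "entity markers + mention pooling" for relation extraction.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

markers = ["<S:PER>", "</S:PER>", "<O:ORG>", "</O:ORG>"]   # illustrative marker tokens
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
tokenizer.add_tokens(markers)

encoder = AutoModel.from_pretrained("bert-base-cased")
encoder.resize_token_embeddings(len(tokenizer))
head = nn.Sequential(nn.Dropout(0.1), nn.LayerNorm(2 * 768), nn.Linear(2 * 768, 24))

text = "<S:PER> Marie Curie </S:PER> worked at the <O:ORG> Sorbonne </O:ORG> in Paris ."
enc = tokenizer(text, return_tensors="pt")
hidden = encoder(**enc).last_hidden_state[0]                # [seq_len, 768]

ids = enc["input_ids"][0].tolist()
def pooled_mention(open_tok, close_tok):
    # MAXPOOL over the token vectors between an entity's marker tokens
    i = ids.index(tokenizer.convert_tokens_to_ids(open_tok))
    j = ids.index(tokenizer.convert_tokens_to_ids(close_tok))
    return hidden[i + 1 : j].max(dim=0).values

pooled = torch.cat([pooled_mention("<S:PER>", "</S:PER>"),
                    pooled_mention("<O:ORG>", "</O:ORG>")])
print(head(pooled).shape)                                   # scores over relation classes
```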
63 | 64 | ___ 65 | 66 | -------------------------------------------------------------------------------- /06-conclusion.md: -------------------------------------------------------------------------------- 1 | ## Third Party NER/RE Tools 2 | 3 | * Named Entity Recognition (NER) 4 | * [SpaCy](https://spacy.io/) -- command line interface to train NER using configuration files. 5 | * [Flair](https://github.com/flairNLP/flair) -- programmatic interface to neural and Transformer based NER. 6 | * [SimpleTransformers](https://simpletransformers.ai/) -- simplified interface to Transformer models in general, including NER. 7 | * [NERDS](https://github.com/elsevierlabs-os/nerds) -- unified programmatic interface to various third party NER libraries. 8 | * Relationship Extraction (RE) 9 | * [SpaCy](https://spacy.io/) -- RE component is available but not provided OOB. 10 | * [Flair](https://github.com/flairNLP/flair) -- programmatic interface to neural RE model. 11 | * [OpenNRE](https://github.com/thunlp/OpenNRE) -- programmatic interface to neural (PCNN) and Transformer based RE models. 12 | 13 | ## References 14 | 15 | * **Papers** 16 | * [Recent Named Entity Recognition and Classification Techniques: A systematic review](https://www.sciencedirect.com/science/article/pii/S1574013717302782) -- Archana Goyal, Vishal Gupta, and Manish Kumar, 2018. 17 | * [A Survey on Recent Advances in Named Entity Recognition from Deep Learning models](https://arxiv.org/abs/1910.11470) -- Vikas Yadav and Steven Bethard, 2019. 18 | * [A Frustratingly Easy Approach for Entity and Relation Extraction](https://arxiv.org/abs/2010.12812) -- Zexuan Zhong and Danqi Chen, 2021. 19 | * [A Review of Relation Extraction](https://www.cs.cmu.edu/~nbach/papers/A-survey-on-Relation-Extraction.pdf) -- Nguyen Bach and Sameer Badaskar, 2007. 20 | * [A Survey of Deep Learning Methods for Relation Extraction](https://arxiv.org/abs/1705.03645) -- Shantanu Kumar, 2017. 21 | * [Matching the Blanks: Distributional Similarity for Relation Learning](https://arxiv.org/abs/1906.03158) -- Livio Baldini Soares, Nicholas Fitzgerald, Jeffrey Ling and Tom Kwiatkowski, 2019. 22 | * **Books** 23 | * [Transformers for Natural Language Processing](https://www.packtpub.com/product/transformers-for-natural-language-processing/9781800565791) 24 | * [Natural Language Processing with Transformers](https://www.oreilly.com/library/view/natural-language-processing/9781098103231/) 25 | 26 |

27 | 28 | 29 | 30 | 31 | 32 |
33 |

34 | 35 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 
179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ner-re-with-transformers-odsc2022 2 | 3 | ## Title 4 | 5 | [Transformer based approaches to Named Entity Recognition (NER) and Relationship Extraction (RE)](https://odsc.com/speakers/transformer-based-approaches-to-named-entity-recognition-ner-and-relationship-extraction-re/) 6 | 7 | ## Session type 8 | 9 | Workshop (hands-on) 10 | 11 | ## Abstract 12 | 13 | Named Entity Recognition (NER) and Relationship Extraction (RE) are foundational for many downstream NLP tasks such as Information Retrieval and Knowledge Base construction. While pre-trained models exist for both NER and RE tasks, they are usually specialized for some narrow application domain. If your application domain is different, your best bet is to train your own models. However, the costs associated with training, specifically generating training data, can be a significant deterrent for doing so. Fortunately, Language Models learned by pre-trained Transformers learn a lot about the language of the domain it is trained and fine-tuned on, and therefore NER and RE models based on these Language Models require fewer training examples to deliver the same level of performance. In this workshop, participants will learn about, train, and evaluate Transformer based neural models for NER and RE. 
14 | 15 | ## Outline 16 | 17 | * **Background** (25 mins) 18 | * [Introduction and General Concepts](01-introduction.md) 19 | * **Named Entity Recognition** (1 hour) 20 | * [Neural and Transformer based architectures for Named Entity Recognition](02-ner-intro.md) 21 | * [Case Study #1: BERT based NER fine-tuned using the Groningen Meaning Bank (GMB) dataset](03-gmb-ner-bert.ipynb) 22 | * [Case Study #2: Switching out BERT with DistilBERT](03b-gmb-ner-distilbert.ipynb) 23 | * [Case Study #3: XLM-RoBERTa based NER fine-tuned using HuggingFace Trainer API](03a-gmb-ner-xlmr.ipynb) 24 | * **Relationship Extraction** (1 hour) 25 | * [Neural and Transformer based architectures for Relationship Extraction](04-re-intro.md) 26 | * [Case Study #4: BERT based RE (e: entity markers mention pooling) fine-tuned using New York Times dataset](05-nyt-re-bert.ipynb) 27 | * [Case Study #5: BERT based RE (b: standard mention pooling) fine-tuned using New York Times dataset](05a-nyt-re-bert.ipynb) 28 | * [Case Study #6: BERT based RE (f: entity markers entity start) fine-tuned using New York Times dataset](05b-nyt-re-bert.ipynb) 29 | * **Conclusion** (20 mins) 30 | * [Wrap-up](06-conclusion.md) 31 | * Q/A session 32 | 33 | 34 | ## Running the Code 35 | 36 | * (optional) Fork this repository 37 | * Navigate to the [Colab Web UI](https://colab.research.google.com/) 38 | * Click on the GitHub tab 39 | * Enter the URL of your forked (or this) repository in the field titled "Enter a GitHub URL" and hit the Search icon 40 | * You should see the notebooks appear in the results. Click on the notebook you want to work with in Colab 41 | 42 | ## Datasets 43 | 44 | * [GMB (Groningen Meaning Bank) dataset for NER](https://www.kaggle.com/abhinavwalia95/entity-annotated-corpus) 45 | * [NYT (New York Times) dataset for RE](https://www.kaggle.com/daishinkan002/new-york-times-relation-extraction-dataset) 46 | 47 | ## Additional Links 48 | 49 | * [Blog post on ODSC](https://odsc.com/blog/building-named-entity-recognition-and-relationship-extraction-components-with-huggingface-transformers/?utm_campaign=Learning%20Posts&utm_content=200655503&utm_medium=social&utm_source=twitter&hss_channel=tw-1357730263481122817) describing why you might want to consider taking this tutorial.
50 | 51 | -------------------------------------------------------------------------------- /figures/bag-of-features-kernel.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/bag-of-features-kernel.png -------------------------------------------------------------------------------- /figures/bilstm-crf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/bilstm-crf.png -------------------------------------------------------------------------------- /figures/book-oreilly.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/book-oreilly.png -------------------------------------------------------------------------------- /figures/book-packt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/book-packt.png -------------------------------------------------------------------------------- /figures/linear-crf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/linear-crf.png -------------------------------------------------------------------------------- /figures/my-headshot.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/my-headshot.jpg -------------------------------------------------------------------------------- /figures/odsc-2022-blog-fig-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/odsc-2022-blog-fig-1.png -------------------------------------------------------------------------------- /figures/odsc-2022-blog-fig-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/odsc-2022-blog-fig-2.png -------------------------------------------------------------------------------- /figures/pcnn-arch.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/pcnn-arch.png -------------------------------------------------------------------------------- /figures/re-transformer-archs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/re-transformer-archs.png -------------------------------------------------------------------------------- /figures/re-triples-table.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/re-triples-table.png -------------------------------------------------------------------------------- /figures/transformer-based-ner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sujitpal/ner-re-with-transformers-odsc2022/b713a91ef29956ddb1c31f7d0b5a9c8731c501c5/figures/transformer-based-ner.png --------------------------------------------------------------------------------