├── LICENSE
└── README.md

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.
      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

# bert-ancient-chinese

## Introduction

With the wave of artificial intelligence and digital humanities sweeping the world, the automatic analysis of modern Chinese has made great progress. Research on the automatic analysis of ancient Chinese, however, remains comparatively weak and falls short of the practical needs of Sinology, history, philology, the history of the Chinese language, and education in Sinology and traditional culture. Ancient Chinese raises many disputed questions about characters, words, and parts of speech, which makes resource construction difficult. Digital humanities research needs the support of large-scale corpora and high-performance natural language processing tools for ancient texts. Given that pre-trained language models have greatly improved the accuracy of text mining for English and modern Chinese, a pre-trained model dedicated to the automatic processing of ancient Chinese is urgently needed.

In 2021, two efficient pre-trained models for ancient Chinese processing tasks, [`SikuBERT` and `SikuRoBERTa`](https://github.com/hsc748NLP/SikuBERT-for-digital-humanities-and-classical-Chinese-information-processing), were released and adopted as the pre-trained models for the closed environment of **[EvaHan 2022](https://circse.github.io/LT4HALA/2022/EvaHan)**, the first evaluation campaign for NLP tools on ancient Chinese. We trained **`bert-ancient-chinese`** to further improve model performance in the open environment.

If you want to cite our work, you can cite this [paper](https://aclanthology.org/2022.lt4hala-1.25/):

```
@inproceedings{wang2022uncertainty,
  title={The Uncertainty-based Retrieval Framework for Ancient Chinese CWS and POS},
  author={Wang, Pengyu and Ren, Zhichen},
  booktitle={Proceedings of the Second Workshop on Language Technologies for Historical and Ancient Languages},
  pages={164--168},
  year={2022}
}
```

## Further Pre-training

**Compared with previous pre-trained models, `bert-ancient-chinese` has the following characteristics:**

- Ancient Chinese texts are mostly written in traditional characters and contain many rare characters, some of which are missing from the vocabularies of existing pre-trained models. By learning from a large-scale corpus, `bert-ancient-chinese` further expands the model's vocabulary to a final size of **38208** entries, compared with **21128** for `bert-base-chinese` and **29791** for `siku-bert`. The larger vocabulary covers more rare characters, which helps the model perform better on downstream tasks. The vocabulary is stored in the `vocab.txt` file shipped with the pre-trained model.
- `bert-ancient-chinese` uses a larger training corpus. Whereas `siku-bert` was pre-trained only on the *Siku Quanshu*, we use a dataset about six times its size, covering eleven categories (collectanea, Taoism, Buddhism, literary collections, Confucianism, poetry, history, medicine, arts, the *Book of Changes*, and the masters), which is richer in content and broader in scope than the *Siku Quanshu*.
- Following the idea of domain-adaptive pretraining, `bert-ancient-chinese` continues training from `bert-base-chinese` on the ancient Chinese corpus, yielding a pre-trained model for the automatic processing of ancient Chinese (a sketch of this continued-pretraining setup appears at the end of the How to Use section below).

## How to Use

### Huggingface Transformers

The `from_pretrained` method of [Huggingface Transformers](https://github.com/huggingface/transformers) fetches the `bert-ancient-chinese` model directly online:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Jihuai/bert-ancient-chinese")

model = AutoModel.from_pretrained("Jihuai/bert-ancient-chinese")
```
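As a quick sanity check, the loaded model can encode a short classical sentence. The example sentence and the shape checks below are our own illustration, not part of the original documentation:

```python
# Encode one classical sentence and inspect the output.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Jihuai/bert-ancient-chinese")
model = AutoModel.from_pretrained("Jihuai/bert-ancient-chinese")

print(len(tokenizer))  # vocabulary size; should match the 38208 entries above

inputs = tokenizer("學而時習之，不亦說乎。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size),
# i.e. (1, number of tokens, 768) for this BERT-base-sized model.
print(outputs.last_hidden_state.shape)
```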
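Returning to the Further Pre-training section above: the domain-adaptive continuation from `bert-base-chinese` can be outlined with the Hugging Face `Trainer`. This is a minimal sketch under our own assumptions (the corpus file `ancient_corpus.txt`, the masking rate, and all hyperparameters are illustrative placeholders), not the authors' exact recipe:

```python
# Minimal sketch of domain-adaptive (continued) masked-language-model
# pretraining. Corpus path and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("bert-base-chinese")

# If rare characters are added to the vocabulary (as described above),
# the embedding matrix must be resized to match:
# tokenizer.add_tokens(rare_chars); model.resize_token_embeddings(len(tokenizer))

dataset = load_dataset("text", data_files={"train": "ancient_corpus.txt"})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Randomly masks 15% of tokens in each batch for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-ancient-chinese-ckpt",
        per_device_train_batch_size=16,
        num_train_epochs=1,
    ),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```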
## Model Download

The model we provide is the `PyTorch` version.

### From Hugging Face

Download directly from the Hugging Face Hub; the model there is kept in sync with the latest version:

- **bert-ancient-chinese: [Jihuai/bert-ancient-chinese · Hugging Face](https://huggingface.co/Jihuai/bert-ancient-chinese)**

### From Cloud Disk

Download address:

| Model | Link |
| :------------------: | :----------------------------------------------------------: |
| bert-ancient-chinese | [Link](https://pan.baidu.com/s/1JC5_64gLT07wgG2hjzqxjg) Extraction code: qs7x |

## Evaluation & Results

We tested and compared different pre-trained models on the training and test sets provided by [EvaHan 2022](https://circse.github.io/LT4HALA/2022/EvaHan), comparing their performance after fine-tuning on the downstream tasks of Chinese word segmentation (CWS) and part-of-speech (POS) tagging.

We use `BERT+CRF` as the baseline model and compare `siku-bert`, `siku-roberta`, and `bert-ancient-chinese` on these downstream tasks. To make full use of the entire training dataset, we employ K-fold cross-validation while keeping all other hyperparameters identical. The evaluation metric is the F1 score. A fine-tuning sketch follows the results table below.
| Model | *Zuozhuan* CWS | *Zuozhuan* POS | *Shiji* CWS | *Shiji* POS |
| :------------------: | :------: | :------: | :------: | :------: |
| siku-bert | 96.0670% | 92.0156% | 92.7909% | 87.1188% |
| siku-roberta | 96.0689% | 92.0496% | 93.0183% | 87.5339% |
| bert-ancient-chinese | **96.3273%** | **92.5027%** | **93.2917%** | **87.8749%** |
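The fine-tuning itself is standard sequence labelling. The sketch below shows the shape of such a setup for CWS; the BMES tag scheme, the toy example, and the hyperparameters are our own assumptions, and the CRF layer of the `BERT+CRF` baseline is omitted for brevity (a plain softmax classification head stands in for it):

```python
# Minimal sketch of fine-tuning bert-ancient-chinese for CWS as
# character-level sequence labelling. Tags, data, and hyperparameters
# are illustrative assumptions; the baseline's CRF layer is omitted.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

TAGS = ["B", "M", "E", "S"]  # word-boundary tags (begin/middle/end/single)
tag2id = {t: i for i, t in enumerate(TAGS)}

tokenizer = AutoTokenizer.from_pretrained("Jihuai/bert-ancient-chinese")
model = AutoModelForTokenClassification.from_pretrained(
    "Jihuai/bert-ancient-chinese", num_labels=len(TAGS)
)

# One toy training example: characters with gold boundary tags.
chars = ["天", "下", "大", "亂"]
gold = ["B", "E", "B", "E"]

enc = tokenizer(chars, is_split_into_words=True, return_tensors="pt")
# [CLS] and [SEP] receive the ignore index -100 so they carry no loss.
enc["labels"] = torch.tensor([[-100] + [tag2id[t] for t in gold] + [-100]])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**enc).loss  # cross-entropy over the four tags
loss.backward()
optimizer.step()

# For K-fold cross-validation as in the evaluation above,
# sklearn.model_selection.KFold can generate the train/validation splits.
```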
## Citing

If our work helps your research, we would appreciate a citation in your paper.

## Disclaimer

The experimental results presented here only reflect performance under a specific dataset and hyperparameter combination and do not characterize the essence of each model. Results may change with random seeds and computing devices. **Users may use the model freely within the scope of the license, but we are not responsible for direct or indirect losses caused by using the contents of this project.**

## Acknowledgments

`bert-ancient-chinese` was obtained by continued training from [bert-base-chinese](https://huggingface.co/bert-base-chinese).

Thanks to Prof. [Xipeng Qiu](https://xpqiu.github.io/) and the [Natural Language Processing Laboratory of Fudan University](https://nlp.fudan.edu.cn/).

## Contact Us

Pengyu Wang: wpyjihuai@gmail.com

--------------------------------------------------------------------------------