# Transfer-Prompts-for-Text-Generation

This is the repository for the NAACL 2022 paper "Learning to Transfer Prompts for Text Generation". The implementation is built entirely on our newly developed text generation library **[TextBox 2.0](https://github.com/RUCAIBox/TextBox)**.

## Prompt Source

The `prompt_source.pth` file in this repository contains the source task prompts (*i.e.*, tensors of shape `[200, 1024]`) trained on the 14 datasets introduced in our paper:
- CNN/Daily Mail (cnndm)
- XSum (xsum)
- MSNews (msn)
- Multi-News (mn)
- NEWSROOM (nr)
- SQuAD (squad)
- Wiki Neutrality (wiki)
- Quora (quora)
- PersonaChat (pc)
- DailyDialog (dd)
- Topical-Chat (tc)
- DSTC7-AVSD (da)
- MultiWOZ (mw)
- WritingPrompts (wp)

You can also download these datasets [here](https://github.com/RUCAIBox/TextBox#dataset).

## Installation

First, clone the TextBox repository and follow its [installation instructions](https://github.com/RUCAIBox/TextBox#installation).

Then, copy `prompt_source.pth` into the cloned TextBox folder.

## Running PTG based on TextBox

For example, you can run our cross-dataset experiment on the cnndm dataset with this command:
```bash
python run_textbox.py --model=PTG --dataset=cnndm --model_path=facebook/bart-large
```
In this default case, the source tasks (datasets) are msn, mn, and nr.

You can use `--dataset=xxx` to specify the target dataset.

In addition, you can specify the source tasks explicitly with `--source_task=list_of_tasks`. The default setting is equivalent to `--source_task=\[\'msn\',\'mn\',\'nr\'\]`.

We also provide several preset source-task combinations used in our paper:
- `--source_task=cross-dataset2`: tc, da, mw.
- `--source_task=cross-task1`: squad, wiki, quora, wp, cnndm.
- `--source_task=cross-task2`: squad, wiki, quora, wp, pc.

## Reference
```bibtex
@inproceedings{li-etal-2022-learning-transfer,
    title = "Learning to Transfer Prompts for Text Generation",
    author = "Li, Junyi and
      Tang, Tianyi and
      Nie, Jian-Yun and
      Wen, Ji-Rong and
      Zhao, Xin",
    booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jul,
    year = "2022",
    address = "Seattle, United States",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.naacl-main.257",
    doi = "10.18653/v1/2022.naacl-main.257",
    pages = "3506--3518",
}
```
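
## Inspecting the Prompt File

If you want a quick look at what `prompt_source.pth` contains outside of TextBox, the snippet below is a minimal sketch, not part of the original codebase. It assumes the file was written with `torch.save` and holds per-task prompt tensors of shape `[200, 1024]`; the exact container type (dict, list, or a single stacked tensor) is not documented here, so the code simply reports what it finds:

```python
import torch

# Load the saved prompts on CPU; the object layout depends on how the file was written.
prompts = torch.load("prompt_source.pth", map_location="cpu")

if isinstance(prompts, dict):
    # Assumed layout: one [200, 1024] prompt tensor per source task.
    for task, tensor in prompts.items():
        print(task, tuple(tensor.shape))
else:
    # Fall back to printing whatever structure the file actually stores.
    print(type(prompts), getattr(prompts, "shape", None))
```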