├── .gitignore
├── README.md
├── __config.ini
├── common.py
├── config.py
├── contents
│   └── 测试.html
├── drafts
│   └── 新娘逃跑了,我谢谢她_ollama.txt
├── get_words.py
├── requirements.txt
└── tokenizer-human
    ├── added_tokens.json
    ├── special_tokens_map.json
    ├── tokenizer.model
    └── tokenizer_config.json

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/.gitignore

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/README.md

--------------------------------------------------------------------------------
/__config.ini:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/__config.ini

--------------------------------------------------------------------------------
/common.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/common.py

--------------------------------------------------------------------------------
/config.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/config.py

--------------------------------------------------------------------------------
/contents/测试.html:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/contents/测试.html

--------------------------------------------------------------------------------
/drafts/新娘逃跑了,我谢谢她_ollama.txt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/drafts/新娘逃跑了,我谢谢她_ollama.txt

--------------------------------------------------------------------------------
/get_words.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/get_words.py

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/requirements.txt

--------------------------------------------------------------------------------
/tokenizer-human/added_tokens.json:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/tokenizer-human/added_tokens.json

--------------------------------------------------------------------------------
/tokenizer-human/special_tokens_map.json:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/tokenizer-human/special_tokens_map.json

--------------------------------------------------------------------------------
/tokenizer-human/tokenizer.model:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/tokenizer-human/tokenizer.model

--------------------------------------------------------------------------------
/tokenizer-human/tokenizer_config.json:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/KittenCN/get_words/HEAD/tokenizer-human/tokenizer_config.json
--------------------------------------------------------------------------------