├── .gitignore
├── README.md
├── data
│   ├── char_std_5990.txt
│   └── demo.png
├── demo.py
├── expr
│   └── attentioncnn
│       └── encoder_600.pth
├── models
│   ├── __init__.py
│   ├── crnn.py
│   └── crnn_lang.py
├── requirements.txt
├── src
│   ├── __init__.py
│   ├── class_attention.py
│   ├── dataset.py
│   └── utils.py
├── test_img
│   ├── 20436312_1683447152.jpg
│   ├── 20437109_1639581473.jpg
│   ├── 20437421_2143654630.jpg
│   ├── 20437531_1514396900.jpg
│   ├── 20438953_2386326516.jpg
│   ├── 20438984_3963047307.jpg
│   ├── 20439171_260546633.jpg
│   ├── 20439281_953270478.jpg
│   ├── 20439562_2199433254.jpg
│   ├── 20439906_2889507409.jpg
│   ├── 20440000_4153137105.jpg
│   ├── 20441140_1981970116.jpg
│   ├── 20441328_2992333319.jpg
│   ├── 20441531_4212871437.jpg
│   ├── 20441750_1134599790.jpg
│   ├── 20441765_2389222735.jpg
│   ├── 20441859_4060744030.jpg
│   ├── 20441968_1253178421.jpg
│   ├── 20442421_1459247642.jpg
│   ├── 20442468_3479649590.jpg
│   ├── 20442515_3243295335.jpg
│   ├── 20442640_2409912250.jpg
│   ├── 20442750_205861976.jpg
│   ├── 20442984_333319517.jpg
│   └── md_img
│       ├── attentionV2.png
│       ├── attentionocr.png
│       └── attention结果.png
└── train.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .DS_Store
2 | *.pyc
3 | *.pyo
4 | *.log
5 | *.tmp
6 | .idea/
7 | .vscode/
8 | expr/
9 | data/*.txt
10 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | attention-ocr.pytorch: Encoder+Decoder+attention model
2 | ======================================
3 | 
4 | This repository implements an encoder-decoder model with attention for OCR: the encoder uses a CNN+Bi-LSTM and the decoder uses a GRU. It is modified from https://github.com/meijieru/crnn.pytorch
5 | An earlier open-source version had some problems recognizing fixed-width images. The model has recently been modified to support recognition of variable-width images; its functionality is now the same as CRNN's. Due to time constraints there is no pretrained model with this release; it will be added later.
6 | 
7 | # Requirements
8 | pytorch 0.4.1
9 | opencv_python
10 | ```bash
11 | cd Attention_ocr.pytorch
12 | pip install -r requirements.txt
13 | ```
14 | 
15 | # Test
16 | Pretrained model coming soon.
17 | 
18 | 
19 | # Train
20 | 1. Here I chose a small dataset from [Synthetic_Chinese_String_Dataset](https://github.com/chenjun2hao/caffe_ocr): about 270,000 images for training and 20,000 images for testing.
21 | Download the image data from [Baidu](https://pan.baidu.com/s/1hIurFJ73XbzL-QG4V-oe0w).
22 | 2. The train_list.txt and test_list.txt files are created in the following form:
23 | ```
24 | # path/to/image_name.jpg label
25 | path/AttentionData/50843500_2726670787.jpg 情笼罩在他们满是沧桑
26 | path/AttentionData/57724421_3902051606.jpg 心态的松弛决定了比赛
27 | path/AttentionData/52041437_3766953320.jpg 虾的鲜美自是不可待言
28 | ```
29 | 3. Change the **trainlist** and **vallist** parameters in train.py, then start training:
30 | ```bash
31 | cd Attention_ocr.pytorch
32 | python train.py --trainlist ./data/ch_train.txt --vallist ./data/ch_test.txt
33 | ```
34 | Then you should see output like the following in the terminal:
35 | ![attentionocr](./test_img/md_img/attentionV2.png)
36 | Here the decoderV2 model is used as the decoder.
37 | 
38 | 
39 | # The previous version
40 | 
41 | **_git checkout AttentionOcrV1_**
42 | 
43 | 
44 | # Reference
45 | 1. [crnn.pytorch](https://github.com/meijieru/crnn.pytorch)
46 | 2. [Attention-OCR](https://github.com/da03/Attention-OCR)
47 | 3. 
[Seq2Seq-PyTorch](https://github.com/MaximumEntropy/Seq2Seq-PyTorch) 48 | 4. [caffe_ocr](https://github.com/senlinuc/caffe_ocr) 49 | 50 | 51 | # TO DO 52 | - [ ] change LSTM to Conv1D, it can greatly accelerate the inference 53 | - [ ] change the cnn bone model with inception net, densenet 54 | - [ ] realize the decoder with transformer model 55 | -------------------------------------------------------------------------------- /data/char_std_5990.txt: -------------------------------------------------------------------------------- 1 | , 2 | 的 3 | 。 4 | 一 5 | 是 6 | 0 7 | 不 8 | 在 9 | 有 10 | 、 11 | 人 12 | “ 13 | ” 14 | 了 15 | 中 16 | 国 17 | 大 18 | 为 19 | 1 20 | : 21 | 上 22 | 2 23 | 这 24 | 个 25 | 以 26 | 年 27 | 生 28 | 和 29 | 我 30 | 时 31 | 之 32 | 也 33 | 来 34 | 到 35 | 要 36 | 会 37 | 学 38 | 对 39 | 业 40 | 出 41 | 行 42 | 公 43 | 能 44 | 他 45 | 于 46 | 5 47 | e 48 | 3 49 | 而 50 | 发 51 | 地 52 | 可 53 | 作 54 | 就 55 | 自 56 | 们 57 | 后 58 | 成 59 | 家 60 | 日 61 | 者 62 | 分 63 | 多 64 | 下 65 | 其 66 | 用 67 | 方 68 | 本 69 | 得 70 | 子 71 | . 72 | 高 73 | 4 74 | 过 75 | 经 76 | 6 77 | 现 78 | 说 79 | 与 80 | 前 81 | o 82 | 理 83 | 工 84 | 所 85 | 力 86 | t 87 | 如 88 | 将 89 | 军 90 | 部 91 | , 92 | 事 93 | 进 94 | 9 95 | 司 96 | 场 97 | 同 98 | 机 99 | 主 100 | 都 101 | 实 102 | 天 103 | 面 104 | 市 105 | 8 106 | i 107 | a 108 | 新 109 | 动 110 | 开 111 | n 112 | 关 113 | 定 114 | 还 115 | 长 116 | 此 117 | 月 118 | 7 119 | 道 120 | 美 121 | 心 122 | 法 123 | 最 124 | 文 125 | 等 126 | 当 127 | 第 128 | 好 129 | 然 130 | 体 131 | 全 132 | 比 133 | 股 134 | 通 135 | 性 136 | 重 137 | 三 138 | 外 139 | s 140 | 但 141 | 战 142 | ; 143 | 相 144 | 从 145 | 你 146 | r 147 | 内 148 | 无 149 | 考 150 | 因 151 | 小 152 | 资 153 | 种 154 | 合 155 | 情 156 | 去 157 | 里 158 | 化 159 | 次 160 | 入 161 | 加 162 | 间 163 | 些 164 | 度 165 | ? 166 | 员 167 | 意 168 | 没 169 | 产 170 | 正 171 | 表 172 | 很 173 | 队 174 | 报 175 | 已 176 | 名 177 | 海 178 | 点 179 | 目 180 | 着 181 | 应 182 | 解 183 | 那 184 | 看 185 | 数 186 | 东 187 | 位 188 | 题 189 | 利 190 | 起 191 | 二 192 | 民 193 | 提 194 | 及 195 | 明 196 | 教 197 | 问 198 | ) 199 | 制 200 | 期 201 | ( 202 | 元 203 | 游 204 | 女 205 | - 206 | 并 207 | 曰 208 | 十 209 | 果 210 | ) 211 | 么 212 | 注 213 | 两 214 | 专 215 | 样 216 | 信 217 | 王 218 | 平 219 | 己 220 | 金 221 | 务 222 | 使 223 | 电 224 | 网 225 | 代 226 | 手 227 | 知 228 | 计 229 | 至 230 | 常 231 | ( 232 | 只 233 | 展 234 | 品 235 | 更 236 | 系 237 | 科 238 | 门 239 | 特 240 | 想 241 | 西 242 | l 243 | 水 244 | 做 245 | 被 246 | 北 247 | 由 248 | c 249 | 》 250 | 万 251 | 老 252 | 向 253 | 《 254 | 记 255 | 政 256 | 今 257 | 据 258 | 量 259 | 保 260 | 建 261 | 物 262 | 区 263 | 管 264 | 见 265 | 安 266 | 集 267 | 或 268 | 认 269 | 程 270 | h 271 | 总 272 | — 273 | 少 274 | 身 275 | 先 276 | 师 277 | 球 278 | 价 279 | 空 280 | 旅 281 | 又 282 | 求 283 | 校 284 | 强 285 | 各 286 | 非 287 | 立 288 | 受 289 | 术 290 | 基 291 | 活 292 | 反 293 | ! 
294 | 世 295 | 何 296 | 职 297 | 导 298 | 任 299 | 取 300 | 式 301 | [ 302 | ] 303 | 试 304 | 才 305 | 结 306 | 费 307 | 把 308 | 收 309 | 联 310 | 直 311 | 规 312 | 持 313 | 赛 314 | 社 315 | 四 316 | 山 317 | 统 318 | 投 319 | 南 320 | 原 321 | 该 322 | 院 323 | 交 324 | 达 325 | 接 326 | 头 327 | 打 328 | 设 329 | 每 330 | 别 331 | 示 332 | 则 333 | 调 334 | 处 335 | 义 336 | 权 337 | 台 338 | 感 339 | 斯 340 | 证 341 | 言 342 | 五 343 | 议 344 | d 345 | 给 346 | 决 347 | 论 348 | 她 349 | 告 350 | 广 351 | 企 352 | 格 353 | 增 354 | 让 355 | 指 356 | 研 357 | 商 358 | 客 359 | 太 360 | 息 361 | 近 362 | 城 363 | 变 364 | 技 365 | 医 366 | 件 367 | 几 368 | 书 369 | 选 370 | 周 371 | 备 372 | m 373 | 流 374 | 士 375 | 京 376 | 传 377 | u 378 | 放 379 | 病 380 | 华 381 | 单 382 | 话 383 | 招 384 | 路 385 | 界 386 | 药 387 | 回 388 | 再 389 | % 390 | 服 391 | 什 392 | 改 393 | 育 394 | 口 395 | 张 396 | 需 397 | 治 398 | 德 399 | 复 400 | 准 401 | 马 402 | 习 403 | 真 404 | 语 405 | 难 406 | 始 407 | " 408 | 际 409 | 观 410 | 完 411 | 标 412 | 共 413 | 项 414 | 容 415 | 级 416 | 即 417 | 必 418 | 类 419 | 领 420 | A 421 | C 422 | 未 423 | w 424 | 型 425 | 案 426 | 线 427 | 运 428 | 历 429 | 首 430 | 风 431 | 视 432 | 色 433 | 尔 434 | 整 435 | 质 436 | 参 437 | 较 438 | 云 439 | 具 440 | 布 441 | 组 442 | 办 443 | 气 444 | 造 445 | 争 446 | 往 447 | 形 448 | 份 449 | 防 450 | p 451 | 它 452 | 车 453 | 深 454 | 神 455 | 称 456 | g 457 | 况 458 | 推 459 | 越 460 | 英 461 | 易 462 | 且 463 | 营 464 | 条 465 | 消 466 | 命 467 | 团 468 | 确 469 | S 470 | 划 471 | 精 472 | 足 473 | 儿 474 | 局 475 | 飞 476 | 究 477 | 功 478 | 索 479 | 走 480 | 望 481 | 却 482 | 查 483 | 武 484 | 思 485 | 兵 486 | 识 487 | 克 488 | 故 489 | 步 490 | 影 491 | 带 492 | 乐 493 | 白 494 | 源 495 | 史 496 | 航 497 | 志 498 | 州 499 | 限 500 | 清 501 | 光 502 | 装 503 | 节 504 | 号 505 | 转 506 | 图 507 | 根 508 | 省 509 | 许 510 | 引 511 | 势 512 | 失 513 | 候 514 | 济 515 | 显 516 | 百 517 | 击 518 | f 519 | 器 520 | 象 521 | 效 522 | 仅 523 | 爱 524 | 官 525 | 包 526 | 供 527 | 低 528 | 演 529 | 连 530 | 夫 531 | 快 532 | 续 533 | 支 534 | 验 535 | 阳 536 | 男 537 | 觉 538 | 花 539 | 死 540 | 字 541 | 创 542 | 素 543 | 半 544 | 预 545 | 音 546 | 户 547 | 约 548 | 率 549 | 声 550 | 请 551 | 票 552 | … 553 | 便 554 | 构 555 | T 556 | 存 557 | 食 558 | y 559 | 段 560 | 远 561 | 责 562 | M 563 | 拉 564 | 房 565 | 随 566 | 断 567 | 极 568 | 销 569 | 林 570 | 亚 571 | 隐 572 | 超 573 | 获 574 | 升 575 | B 576 | 采 577 | I 578 | 算 579 | 益 580 | 优 581 | 愿 582 | 找 583 | 按 584 | 维 585 | 态 586 | 满 587 | 尽 588 | 令 589 | 汉 590 | 委 591 | 八 592 | 终 593 | 训 594 | 值 595 | 负 596 | 境 597 | 练 598 | 母 599 | 热 600 | 适 601 | 江 602 | 住 603 | 列 604 | 举 605 | 景 606 | 置 607 | 黄 608 | 听 609 | 除 610 | 读 611 | 众 612 | 响 613 | 友 614 | 助 615 | 弹 616 | 干 617 | 孩 618 | 边 619 | 李 620 | 六 621 | 甚 622 | 罗 623 | 致 624 | 施 625 | 模 626 | 料 627 | 火 628 | 像 629 | 古 630 | 眼 631 | 搜 632 | 离 633 | D 634 | 闻 635 | 府 636 | 章 637 | 早 638 | 照 639 | 速 640 | 录 641 | 页 642 | 卫 643 | 青 644 | 例 645 | 石 646 | 父 647 | 状 648 | 农 649 | 排 650 | 降 651 | 千 652 | P 653 | 择 654 | 评 655 | 疗 656 | 班 657 | 购 658 | 属 659 | 革 660 | 够 661 | 环 662 | 占 663 | 养 664 | 曾 665 | 米 666 | 略 667 | 站 668 | 胜 669 | ① 670 | 核 671 | 否 672 | 独 673 | 护 674 | 钱 675 | / 676 | 红 677 | 范 678 | 另 679 | 须 680 | 余 681 | 居 682 | 虽 683 | 毕 684 | 攻 685 | 族 686 | 吃 687 | 喜 688 | 陈 689 | G 690 | 轻 691 | 亲 692 | 积 693 | 星 694 | 假 695 | b 696 | 县 697 | 写 698 | 刘 699 | 财 700 | 亿 701 | 某 702 | 括 703 | 律 704 | 酒 705 | 策 706 | 初 707 | 批 708 | 普 709 | 片 710 | 协 711 | 售 712 | 乃 713 | 落 714 | 留 715 | 岁 716 | 突 717 | 双 718 | 绝 719 | 险 720 | 季 721 | 谓 722 | 严 723 | 村 724 | E 725 | 兴 726 | 围 727 | 依 728 | 念 729 | 苏 730 | 底 731 | 压 732 | 破 733 | 河 734 | 怎 735 | 细 736 | 富 737 | 切 
738 | 乎 739 | 待 740 | 室 741 | 血 742 | 帝 743 | 君 744 | 均 745 | 络 746 | 牌 747 | 陆 748 | 印 749 | 层 750 | 斗 751 | 简 752 | 讲 753 | 买 754 | 谈 755 | 纪 756 | 板 757 | 希 758 | 聘 759 | 充 760 | 归 761 | 左 762 | 测 763 | 止 764 | 笑 765 | 差 766 | 控 767 | 担 768 | 杀 769 | 般 770 | 朝 771 | 监 772 | 承 773 | 播 774 | k 775 | 亦 776 | 临 777 | 银 778 | 尼 779 | 介 780 | v 781 | 博 782 | 软 783 | 欢 784 | 害 785 | 七 786 | 良 787 | 善 788 | ’ 789 | 移 790 | 土 791 | 课 792 | 免 793 | 射 794 | 审 795 | 健 796 | 角 797 | 伊 798 | 欲 799 | 似 800 | 配 801 | 既 802 | 拿 803 | 刚 804 | 绩 805 | 密 806 | 织 807 | 九 808 | 编 809 | 狐 810 | 右 811 | 龙 812 | 异 813 | 若 814 | 登 815 | 检 816 | 继 817 | 析 818 | 款 819 | 纳 820 | 威 821 | 微 822 | 域 823 | 齐 824 | 久 825 | 宣 826 | 阿 827 | 俄 828 | 店 829 | 康 830 | 执 831 | 露 832 | 香 833 | 额 834 | 紧 835 | 培 836 | 激 837 | 卡 838 | 短 839 | 群 840 | ② 841 | 春 842 | 仍 843 | 伤 844 | 韩 845 | 楚 846 | 缺 847 | 洲 848 | 版 849 | 答 850 | O 851 | 修 852 | 媒 853 | 秦 854 | ‘ 855 | 错 856 | 欧 857 | 园 858 | 减 859 | 急 860 | 叫 861 | 诉 862 | 述 863 | 钟 864 | 遇 865 | 港 866 | 补 867 | N 868 | · 869 | 送 870 | 托 871 | 夜 872 | 兰 873 | 诸 874 | 呢 875 | 席 876 | 尚 877 | 福 878 | 奖 879 | 党 880 | 坐 881 | 巴 882 | 毛 883 | 察 884 | 奇 885 | 孙 886 | 竞 887 | 宁 888 | 申 889 | L 890 | 疑 891 | 黑 892 | 劳 893 | 脑 894 | R 895 | 舰 896 | 晚 897 | 盘 898 | 征 899 | 波 900 | 背 901 | 访 902 | 互 903 | 败 904 | 苦 905 | 阶 906 | 味 907 | 跟 908 | 沙 909 | 湾 910 | 岛 911 | 挥 912 | 礼 913 | F 914 | 词 915 | 宝 916 | 券 917 | 虑 918 | 徐 919 | 患 920 | 贵 921 | 换 922 | 矣 923 | 戏 924 | 艺 925 | 侯 926 | 顾 927 | 副 928 | 妇 929 | 董 930 | 坚 931 | 含 932 | 授 933 | 皇 934 | 付 935 | 坛 936 | 皆 937 | 抗 938 | 藏 939 | 潜 940 | 封 941 | 础 942 | 材 943 | 停 944 | 判 945 | 吸 946 | 轮 947 | 守 948 | 涨 949 | 派 950 | 彩 951 | 哪 952 | 笔 953 | . 954 | ﹑ 955 | 氏 956 | 尤 957 | 逐 958 | 冲 959 | 询 960 | 铁 961 | W 962 | 衣 963 | 绍 964 | 赵 965 | 弟 966 | 洋 967 | 午 968 | 奥 969 | 昨 970 | 雷 971 | 耳 972 | 谢 973 | 乡 974 | 追 975 | 皮 976 | 句 977 | 刻 978 | 油 979 | 误 980 | 宫 981 | 巨 982 | 架 983 | 湖 984 | 固 985 | 痛 986 | 楼 987 | 杯 988 | 套 989 | 恐 990 | 敢 991 | H 992 | 遂 993 | 透 994 | 薪 995 | 婚 996 | 困 997 | 秀 998 | 帮 999 | 融 1000 | 鲁 1001 | 遗 1002 | 烈 1003 | 吗 1004 | 吴 1005 | 竟 1006 | ③ 1007 | 惊 1008 | 幅 1009 | 温 1010 | 臣 1011 | 鲜 1012 | 画 1013 | 拥 1014 | 罪 1015 | 呼 1016 | 警 1017 | 卷 1018 | 松 1019 | 甲 1020 | 牛 1021 | 诺 1022 | 庭 1023 | 休 1024 | 圣 1025 | 馆 1026 | _ 1027 | 退 1028 | 莫 1029 | 讯 1030 | 渐 1031 | 熟 1032 | 肯 1033 | V 1034 | 冠 1035 | 谁 1036 | 乱 1037 | 朗 1038 | 怪 1039 | 夏 1040 | 危 1041 | 码 1042 | 跳 1043 | 卖 1044 | 签 1045 | 块 1046 | 盖 1047 | 束 1048 | 毒 1049 | 杨 1050 | 饮 1051 | 届 1052 | 序 1053 | 灵 1054 | 怀 1055 | 障 1056 | 永 1057 | 顺 1058 | 载 1059 | 倒 1060 | 姓 1061 | 丽 1062 | 靠 1063 | 概 1064 | 输 1065 | 货 1066 | 症 1067 | 避 1068 | 寻 1069 | 丰 1070 | 操 1071 | 针 1072 | 穿 1073 | 延 1074 | 敌 1075 | 悉 1076 | 召 1077 | 田 1078 | 稳 1079 | 典 1080 | 吧 1081 | 犯 1082 | 饭 1083 | 握 1084 | 染 1085 | 怕 1086 | 端 1087 | 央 1088 | 阴 1089 | 胡 1090 | 座 1091 | 著 1092 | 损 1093 | 借 1094 | 朋 1095 | 救 1096 | 库 1097 | 餐 1098 | 堂 1099 | 庆 1100 | 忽 1101 | 润 1102 | 迎 1103 | 亡 1104 | 肉 1105 | 静 1106 | 阅 1107 | 盛 1108 | 综 1109 | 木 1110 | 疾 1111 | 恶 1112 | 享 1113 | 妻 1114 | 厂 1115 | 杂 1116 | 刺 1117 | 秘 1118 | 僧 1119 | 幸 1120 | 扩 1121 | 裁 1122 | 佳 1123 | 趣 1124 | 智 1125 | 促 1126 | 弃 1127 | 伯 1128 | 吉 1129 | 宜 1130 | 剧 1131 | 野 1132 | 附 1133 | 距 1134 | 唐 1135 | 释 1136 | 草 1137 | 币 1138 | 骨 1139 | 弱 1140 | 俱 1141 | 顿 1142 | 散 1143 | 讨 1144 | 睡 1145 | 探 1146 | 郑 1147 | 频 1148 | 船 1149 | 虚 1150 | 途 1151 | 旧 1152 | 树 1153 | 掌 1154 | 遍 1155 | 予 1156 | 梦 1157 | 圳 1158 | 森 1159 | 泰 1160 | 慢 1161 | 牙 
1162 | 盟 1163 | 挑 1164 | 键 1165 | 阵 1166 | 暴 1167 | 脱 1168 | 汇 1169 | 歌 1170 | 禁 1171 | 浪 1172 | 冷 1173 | 艇 1174 | 雅 1175 | 迷 1176 | 拜 1177 | 旦 1178 | 私 1179 | 您 1180 | ④ 1181 | 启 1182 | 纷 1183 | 哈 1184 | 订 1185 | 折 1186 | 累 1187 | 玉 1188 | 脚 1189 | 亮 1190 | 晋 1191 | 祖 1192 | 菜 1193 | 鱼 1194 | 醒 1195 | 谋 1196 | 姐 1197 | 填 1198 | 纸 1199 | 泽 1200 | 戒 1201 | 床 1202 | 努 1203 | 液 1204 | 咨 1205 | 塞 1206 | 遭 1207 | 玩 1208 | 津 1209 | 伦 1210 | 夺 1211 | 辑 1212 | 癌 1213 | x 1214 | 丹 1215 | 荣 1216 | 仪 1217 | 献 1218 | 符 1219 | 翻 1220 | 估 1221 | 乘 1222 | 诚 1223 | K 1224 | 川 1225 | 惠 1226 | 涉 1227 | 街 1228 | 诗 1229 | 曲 1230 | 孔 1231 | 娘 1232 | 怒 1233 | 扬 1234 | 闲 1235 | 蒙 1236 | 尊 1237 | 坦 1238 | = 1239 | 衡 1240 | 迪 1241 | 镇 1242 | 沉 1243 | 署 1244 | 妖 1245 | 脸 1246 | 净 1247 | 哥 1248 | 顶 1249 | 掉 1250 | 厚 1251 | 魏 1252 | 旗 1253 | 兄 1254 | 荐 1255 | 童 1256 | 剂 1257 | 乏 1258 | 倍 1259 | 萨 1260 | 偏 1261 | 洗 1262 | 惯 1263 | 灭 1264 | 径 1265 | 犹 1266 | 趋 1267 | 拍 1268 | 档 1269 | 罚 1270 | 纯 1271 | 洛 1272 | 毫 1273 | 梁 1274 | 雨 1275 | 瑞 1276 | 宗 1277 | 鼓 1278 | 辞 1279 | 洞 1280 | 秋 1281 | 郎 1282 | 舍 1283 | 蓝 1284 | 措 1285 | 篮 1286 | 贷 1287 | 佛 1288 | 坏 1289 | 俗 1290 | 殊 1291 | 炮 1292 | 厅 1293 | 筑 1294 | 姆 1295 | 译 1296 | 摄 1297 | 卒 1298 | 谷 1299 | 妈 1300 | 聚 1301 | 违 1302 | 忘 1303 | 鬼 1304 | 触 1305 | 丁 1306 | 羽 1307 | 贫 1308 | 刑 1309 | 岗 1310 | 庄 1311 | 伟 1312 | 兼 1313 | 乳 1314 | 叶 1315 | 凡 1316 | 龄 1317 | 宽 1318 | 峰 1319 | 宋 1320 | 硬 1321 | 岸 1322 | 迅 1323 | 喝 1324 | 拟 1325 | 雄 1326 | 役 1327 | 零 1328 | 舞 1329 | 暗 1330 | 潮 1331 | 绿 1332 | 倾 1333 | 详 1334 | 税 1335 | 酸 1336 | 徒 1337 | 伴 1338 | 诊 1339 | 跑 1340 | 吾 1341 | 燕 1342 | 澳 1343 | 啊 1344 | 塔 1345 | 宿 1346 | 恩 1347 | 忙 1348 | 督 1349 | 末 1350 | ⑤ 1351 | + 1352 | 伐 1353 | 篇 1354 | 敏 1355 | 贸 1356 | 巧 1357 | 截 1358 | 沟 1359 | 肝 1360 | 迹 1361 | 烟 1362 | 勇 1363 | 乌 1364 | 赞 1365 | 锋 1366 | 返 1367 | 迫 1368 | 凭 1369 | 虎 1370 | 朱 1371 | 拔 1372 | 援 1373 | 搞 1374 | 爆 1375 | 勤 1376 | 抢 1377 | 敬 1378 | 赶 1379 | 抱 1380 | 仁 1381 | 秒 1382 | 缓 1383 | 御 1384 | 唯 1385 | 缩 1386 | 尝 1387 | 贴 1388 | 奔 1389 | 跨 1390 | 炎 1391 | 汤 1392 | 侵 1393 | 骑 1394 | 励 1395 | 戴 1396 | 肤 1397 | 枪 1398 | 植 1399 | 瘤 1400 | 埃 1401 | 汽 1402 | 羊 1403 | 宾 1404 | 替 1405 | 幕 1406 | 贝 1407 | 刀 1408 | 映 1409 | 彻 1410 | 驻 1411 | 披 1412 | 抓 1413 | 奉 1414 | 抵 1415 | 肿 1416 | 麻 1417 | U 1418 | 炸 1419 | 繁 1420 | 赢 1421 | 茶 1422 | 伏 1423 | 梅 1424 | 狂 1425 | 忧 1426 | 豪 1427 | 暂 1428 | 贾 1429 | 洁 1430 | 绪 1431 | 刊 1432 | 忆 1433 | 桥 1434 | 晓 1435 | 册 1436 | 漫 1437 | 圆 1438 | 默 1439 | 妾 1440 | 侧 1441 | 址 1442 | 横 1443 | □ 1444 | 偶 1445 | 狗 1446 | 陵 1447 | ' 1448 | 伙 1449 | 杜 1450 | 忍 1451 | 薄 1452 | 雪 1453 | 陷 1454 | 仙 1455 | 恋 1456 | 焦 1457 | 焉 1458 | 烦 1459 | 甘 1460 | 腺 1461 | 颇 1462 | 赏 1463 | 肠 1464 | 废 1465 | 墙 1466 | 债 1467 | 艾 1468 | 杰 1469 | 残 1470 | 冒 1471 | 屋 1472 | 堡 1473 | 曹 1474 | 储 1475 | 莱 1476 | 挂 1477 | 纵 1478 | 孝 1479 | 珍 1480 | 麦 1481 | 逃 1482 | 奋 1483 | J 1484 | 览 1485 | 镜 1486 | 缘 1487 | 昭 1488 | 摆 1489 | 跌 1490 | 胁 1491 | 昌 1492 | 耶 1493 | 腹 1494 | 偿 1495 | 蛋 1496 | 盈 1497 | 瓦 1498 | 摩 1499 | 沈 1500 | 惟 1501 | 迁 1502 | 冰 1503 | 辛 1504 | 震 1505 | 旁 1506 | 泉 1507 | 圈 1508 | 巡 1509 | 罢 1510 | 泛 1511 | 穷 1512 | 伸 1513 | 曼 1514 | 滋 1515 | 丈 1516 | 颜 1517 | 勒 1518 | 悲 1519 | 肥 1520 | 郭 1521 | 混 1522 | 灯 1523 | 租 1524 | ⑥ 1525 | 鸡 1526 | 阻 1527 | 邑 1528 | 伍 1529 | 践 1530 | 驾 1531 | 魔 1532 | X 1533 | 拒 1534 | 懂 1535 | 糖 1536 | 脏 1537 | 沿 1538 | 翁 1539 | 胆 1540 | 惧 1541 | 聊 1542 | 携 1543 | 晨 1544 | 滑 1545 | 菌 1546 | 辅 1547 | 贤 1548 | 鉴 1549 | 丝 1550 | 尾 1551 | 赴 1552 | 吨 1553 | 宇 1554 | 眠 1555 | 脂 1556 | 籍 
1557 | 彼 1558 | 污 1559 | 貌 1560 | 弄 1561 | 郡 1562 | 【 1563 | 奶 1564 | 菲 1565 | 烧 1566 | 垂 1567 | 壮 1568 | 浮 1569 | 弗 1570 | 赖 1571 | 】 1572 | 珠 1573 | 迟 1574 | 渠 1575 | 寿 1576 | 隆 1577 | 剑 1578 | 胞 1579 | 跃 1580 | 稍 1581 | 愈 1582 | 荷 1583 | 壁 1584 | 卿 1585 | 邦 1586 | 忠 1587 | 摇 1588 | 悟 1589 | 锦 1590 | 扰 1591 | 袭 1592 | 盾 1593 | 艘 1594 | 浓 1595 | 筹 1596 | 盗 1597 | 哭 1598 | 淡 1599 | 孕 1600 | 扣 1601 | 呈 1602 | 怨 1603 | 琳 1604 | 孤 1605 | 奴 1606 | 驱 1607 | 振 1608 | 闭 1609 | ~ 1610 | 隔 1611 | 寒 1612 | 汝 1613 | 贯 1614 | 恢 1615 | 饰 1616 | 荡 1617 | 姑 1618 | 械 1619 | * 1620 | 猛 1621 | 亏 1622 | 锁 1623 | 硕 1624 | 舒 1625 | 嘉 1626 | 宏 1627 | 劲 1628 | 帅 1629 | 誉 1630 | 番 1631 | 惜 1632 | 胸 1633 | 抽 1634 | 脉 1635 | 孟 1636 | 遣 1637 | 碍 1638 | 辆 1639 | 玄 1640 | 陶 1641 | 丧 1642 | 矿 1643 | 链 1644 | 矛 1645 | 鸟 1646 | 夷 1647 | 嘴 1648 | 坡 1649 | 吕 1650 | 侦 1651 | 鸣 1652 | 妹 1653 | 邓 1654 | 钢 1655 | 妙 1656 | z 1657 | 欣 1658 | 骗 1659 | 浙 1660 | 辽 1661 | 奏 1662 | 唱 1663 | 腐 1664 | 仆 1665 | 祝 1666 | 冬 1667 | 韦 1668 | 邮 1669 | 酬 1670 | 尺 1671 | 涯 1672 | 毁 1673 | 粉 1674 | 井 1675 | 腰 1676 | 肌 1677 | 搭 1678 | 恨 1679 | 乙 1680 | 勿 1681 | 婆 1682 | ★ 1683 | 闹 1684 | 猎 1685 | 厉 1686 | 哀 1687 | 递 1688 | 廉 1689 | 卧 1690 | 豆 1691 | 揭 1692 | 瓶 1693 | ⑦ 1694 | 蒋 1695 | 忌 1696 | 贡 1697 | 邀 1698 | 覆 1699 | 墓 1700 | 捷 1701 | Q 1702 | 骂 1703 | 芳 1704 | 耗 1705 | 奈 1706 | 腾 1707 | 抑 1708 | 牵 1709 | 履 1710 | 绕 1711 | 睛 1712 | 炼 1713 | 描 1714 | 辉 1715 | 肃 1716 | 循 1717 | 仿 1718 | 葬 1719 | 漏 1720 | 恰 1721 | 殿 1722 | 遥 1723 | 尿 1724 | 凯 1725 | 仲 1726 | 婢 1727 | 胃 1728 | 翼 1729 | 卢 1730 | 慎 1731 | 厦 1732 | 颈 1733 | 哉 1734 | 疲 1735 | 惑 1736 | 汗 1737 | 衰 1738 | 剩 1739 | 昆 1740 | 耐 1741 | 疫 1742 | 霸 1743 | 赚 1744 | 彭 1745 | 狼 1746 | 洪 1747 | 枚 1748 | 媪 1749 | 纲 1750 | 窗 1751 | 偷 1752 | 鼻 1753 | 池 1754 | 磨 1755 | 尘 1756 | 账 1757 | 拼 1758 | 榜 1759 | 拨 1760 | 扫 1761 | 妆 1762 | 槽 1763 | 蔡 1764 | 扎 1765 | 叔 1766 | 辈 1767 | ― 1768 | 泡 1769 | 伪 1770 | 邻 1771 | 锡 1772 | 仰 1773 | 寸 1774 | 盐 1775 | 叹 1776 | 囊 1777 | 幼 1778 | 拓 1779 | 郁 1780 | 桌 1781 | 舟 1782 | 丘 1783 | 棋 1784 | 裂 1785 | 扶 1786 | 逼 1787 | 熊 1788 | 轰 1789 | 允 1790 | 箱 1791 | 挺 1792 | 赤 1793 | 晶 1794 | ● 1795 | 祭 1796 | 寄 1797 | 爷 1798 | 呆 1799 | 胶 1800 | 佩 1801 | 泪 1802 | 沃 1803 | 婴 1804 | 娱 1805 | 霍 1806 | 肾 1807 | 诱 1808 | 扁 1809 | 辩 1810 | 粗 1811 | 夕 1812 | 灾 1813 | 哲 1814 | 涂 1815 | 艰 1816 | 猪 1817 | Y 1818 | 铜 1819 | 踏 1820 | 赫 1821 | 吹 1822 | 屈 1823 | 谐 1824 | 仔 1825 | 沪 1826 | 殷 1827 | 辄 1828 | 渡 1829 | 屏 1830 | 悦 1831 | 漂 1832 | 祸 1833 | 赔 1834 | 涛 1835 | 谨 1836 | 赐 1837 | 劝 1838 | 泌 1839 | 凤 1840 | 庙 1841 | 墨 1842 | 寺 1843 | 淘 1844 | 勃 1845 | 崇 1846 | 灰 1847 | 虫 1848 | 逆 1849 | 闪 1850 | 竹 1851 | 疼 1852 | 旨 1853 | 旋 1854 | 蒂 1855 | ⑧ 1856 | 悬 1857 | 紫 1858 | 慕 1859 | 贪 1860 | 慧 1861 | 腿 1862 | 赌 1863 | 捉 1864 | 疏 1865 | 卜 1866 | 漠 1867 | 堪 1868 | 廷 1869 | 氧 1870 | 牢 1871 | 吏 1872 | 帕 1873 | 棒 1874 | 纽 1875 | 荒 1876 | 屡 1877 | 戈 1878 | 氛 1879 | 黎 1880 | 桃 1881 | 幽 1882 | 尖 1883 | 猫 1884 | 捕 1885 | 嫁 1886 | 窃 1887 | 燃 1888 | 禽 1889 | 稿 1890 | 掩 1891 | 踪 1892 | 姻 1893 | 陪 1894 | 凉 1895 | 阔 1896 | 碰 1897 | 幻 1898 | 迈 1899 | 铺 1900 | 堆 1901 | 柔 1902 | 姿 1903 | 膜 1904 | 爸 1905 | 斤 1906 | 轨 1907 | 疆 1908 | 丢 1909 | 仓 1910 | 岂 1911 | 柳 1912 | 敦 1913 | 祥 1914 | 栏 1915 | 邪 1916 | 魂 1917 | 箭 1918 | 煤 1919 | 惨 1920 | 聪 1921 | 艳 1922 | 儒 1923 | & 1924 | 仇 1925 | 徽 1926 | 厌 1927 | 潘 1928 | 袖 1929 | 宅 1930 | 恒 1931 | 逻 1932 | 肺 1933 | 昂 1934 | 炒 1935 | 醉 1936 | 掘 1937 | 宪 1938 | 摸 1939 | 愤 1940 | 畅 1941 | 汪 1942 | 贺 1943 | 肪 1944 | 撑 1945 | 桂 1946 | 耀 1947 | 柏 1948 | 韂 1949 | 扑 1950 | 淮 1951 | j 
1952 | 凌 1953 | 遵 1954 | 钻 1955 | 摘 1956 | 碎 1957 | 抛 1958 | 匹 1959 | 腔 1960 | 纠 1961 | 吐 1962 | 滚 1963 | 凝 1964 | 插 1965 | 鹰 1966 | 郊 1967 | 琴 1968 | 悄 1969 | 撤 1970 | 驶 1971 | 粮 1972 | 辱 1973 | 斩 1974 | 暖 1975 | 杭 1976 | 齿 1977 | 欺 1978 | 殖 1979 | 撞 1980 | 颁 1981 | 匈 1982 | 翔 1983 | 挤 1984 | 乔 1985 | 抚 1986 | 泥 1987 | 饱 1988 | 劣 1989 | 鞋 1990 | 肩 1991 | 雇 1992 | 驰 1993 | 莲 1994 | 岩 1995 | 酷 1996 | 玛 1997 | 赠 1998 | 斋 1999 | 辨 2000 | 泄 2001 | 姬 2002 | 拖 2003 | 湿 2004 | 滨 2005 | 鹏 2006 | 兽 2007 | 锐 2008 | 捧 2009 | 尸 2010 | 宰 2011 | 舆 2012 | 宠 2013 | 胎 2014 | 凶 2015 | 割 2016 | 虹 2017 | 俊 2018 | 糊 2019 | 兹 2020 | 瓜 2021 | 悔 2022 | 慰 2023 | 浦 2024 | 锻 2025 | 削 2026 | 唤 2027 | 戚 2028 | 撒 2029 | 冯 2030 | 丑 2031 | 亭 2032 | 寝 2033 | 嫌 2034 | 袁 2035 | ⑨ 2036 | 尉 2037 | 芬 2038 | 挖 2039 | 弥 2040 | 喊 2041 | 纤 2042 | 辟 2043 | 菩 2044 | 埋 2045 | 呀 2046 | 昏 2047 | 傅 2048 | 桑 2049 | 稀 2050 | 帐 2051 | 添 2052 | 塑 2053 | 赋 2054 | 扮 2055 | 芯 2056 | 喷 2057 | 夸 2058 | 抬 2059 | 旺 2060 | 襄 2061 | 岭 2062 | 颗 2063 | 柱 2064 | 欠 2065 | 逢 2066 | 鼎 2067 | 苗 2068 | 庸 2069 | 甜 2070 | 贼 2071 | 烂 2072 | 怜 2073 | 盲 2074 | 浅 2075 | 霞 2076 | 畏 2077 | 诛 2078 | 倡 2079 | 磁 2080 | 茨 2081 | 毅 2082 | 鲍 2083 | 骇 2084 | 峡 2085 | 妨 2086 | 雕 2087 | 袋 2088 | 裕 2089 | 哩 2090 | 怖 2091 | 阁 2092 | 函 2093 | 浩 2094 | 侍 2095 | 拳 2096 | 寡 2097 | 鸿 2098 | 眉 2099 | 穆 2100 | 狱 2101 | 牧 2102 | 拦 2103 | 雾 2104 | 猜 2105 | 顷 2106 | 昔 2107 | 慈 2108 | 朴 2109 | 疯 2110 | 苍 2111 | ■ 2112 | 渴 2113 | 慌 2114 | 绳 2115 | 闷 2116 | 陕 2117 | 宴 2118 | 辖 2119 | 「 2120 | 」 2121 | 舜 2122 | 讼 2123 | 柯 2124 | 丞 2125 | 姚 2126 | 崩 2127 | 绘 2128 | 枝 2129 | 牲 2130 | 涌 2131 | 虔 2132 | 姜 2133 | 擦 2134 | 桓 2135 | 逊 2136 | 汰 2137 | 斥 2138 | ﹒ 2139 | 颖 2140 | 悠 2141 | 恼 2142 | 灌 2143 | q 2144 | 梯 2145 | 捐 2146 | ∶ 2147 | 挣 2148 | 衷 2149 | 啡 2150 | 娜 2151 | 旬 2152 | 呵 2153 | 刷 2154 | 帽 2155 | 岳 2156 | 豫 2157 | 咖 2158 | 飘 2159 | 臂 2160 | 寂 2161 | 粒 2162 | 募 2163 | 嘱 2164 | 蔬 2165 | 苹 2166 | 泣 2167 | 吊 2168 | 淳 2169 | 诞 2170 | 诈 2171 | 咸 2172 | 猴 2173 | ~ 2174 | 奸 2175 | 淫 2176 | 佐 2177 | 晰 2178 | 崔 2179 | 雍 2180 | 葛 2181 | 鼠 2182 | 爵 2183 | 奢 2184 | 仗 2185 | 涵 2186 | 淋 2187 | 挽 2188 | 敲 2189 | 沛 2190 | 蛇 2191 | 锅 2192 | 庞 2193 | 朵 2194 | 押 2195 | 鹿 2196 | 滩 2197 | 祠 2198 | 枕 2199 | 扭 2200 | 厘 2201 | 魅 2202 | ⑩ 2203 | 湘 2204 | 柴 2205 | 炉 2206 | 荆 2207 | 卓 2208 | 碗 2209 | 夹 2210 | 脆 2211 | 颠 2212 | 窥 2213 | 逾 2214 | 诘 2215 | 贿 2216 | 虞 2217 | 茫 2218 | 榻 2219 | 碑 2220 | 傲 2221 | 骄 2222 | 卑 2223 | × 2224 | Z 2225 | 蓄 2226 | 煮 2227 | 劫 2228 | 卵 2229 | 碳 2230 | 痕 2231 | 攀 2232 | 搬 2233 | 拆 2234 | 谊 2235 | 禹 2236 | 窦 2237 | 绣 2238 | 叉 2239 | 爽 2240 | 肆 2241 | 羞 2242 | 爬 2243 | 泊 2244 | 腊 2245 | 愚 2246 | 牺 2247 | 胖 2248 | 弘 2249 | 秩 2250 | 娶 2251 | 妃 2252 | 柜 2253 | 觽 2254 | 躲 2255 | 葡 2256 | 浴 2257 | 兆 2258 | 滴 2259 | 衔 2260 | 燥 2261 | 斑 2262 | 挡 2263 | 笼 2264 | 徙 2265 | 憾 2266 | 垄 2267 | 肖 2268 | 溪 2269 | 叙 2270 | 茅 2271 | 膏 2272 | 甫 2273 | 缴 2274 | 姊 2275 | 逸 2276 | 淀 2277 | 擅 2278 | 催 2279 | 丛 2280 | 舌 2281 | 竭 2282 | 禅 2283 | 隶 2284 | 歧 2285 | 妥 2286 | 煌 2287 | 玻 2288 | 刃 2289 | ☆ 2290 | 肚 2291 | 惩 2292 | 赂 2293 | 耻 2294 | 詹 2295 | 璃 2296 | 舱 2297 | 溃 2298 | 斜 2299 | 祀 2300 | 翰 2301 | 汁 2302 | 妄 2303 | 枭 2304 | 萄 2305 | 契 2306 | 骤 2307 | 醇 2308 | 泼 2309 | 咽 2310 | 拾 2311 | 廊 2312 | 犬 2313 | 筋 2314 | 扯 2315 | 狠 2316 | 挫 2317 | 钛 2318 | 扇 2319 | 蓬 2320 | 吞 2321 | 帆 2322 | 戎 2323 | 稽 2324 | 娃 2325 | 蜜 2326 | 庐 2327 | 盆 2328 | 胀 2329 | 乞 2330 | 堕 2331 | 趁 2332 | 吓 2333 | 框 2334 | 顽 2335 | 硅 2336 | 宛 2337 | 瘦 2338 | 剥 2339 | 睹 2340 | 烛 2341 | 晏 2342 | 巾 2343 | 狮 2344 | 辰 2345 | 茂 2346 | ○ 
2347 | 裙 2348 | 匆 2349 | 霉 2350 | 杖 2351 | 杆 2352 | 糟 2353 | 畜 2354 | 躁 2355 | 愁 2356 | 缠 2357 | 糕 2358 | 峻 2359 | 贱 2360 | 辣 2361 | 歼 2362 | 慨 2363 | 亨 2364 | 芝 2365 | 惕 2366 | 娇 2367 | ⑾ 2368 | 渔 2369 | 冥 2370 | 咱 2371 | 栖 2372 | 浑 2373 | 禄 2374 | 帖 2375 | 巫 2376 | 喻 2377 | 毋 2378 | 泳 2379 | 饿 2380 | 尹 2381 | 穴 2382 | 沫 2383 | 串 2384 | 邹 2385 | 厕 2386 | 蒸 2387 | + 2388 | 滞 2389 | 铃 2390 | 寓 2391 | 萧 2392 | 弯 2393 | 窝 2394 | 杏 2395 | 冻 2396 | 愉 2397 | 逝 2398 | 诣 2399 | 溢 2400 | 嘛 2401 | 兮 2402 | 暮 2403 | 豹 2404 | 骚 2405 | 跪 2406 | 懒 2407 | 缝 2408 | 盒 2409 | 亩 2410 | 寇 2411 | 弊 2412 | 巢 2413 | 咬 2414 | 粹 2415 | 冤 2416 | 陌 2417 | 涕 2418 | 翠 2419 | 勾 2420 | 拘 2421 | 侨 2422 | 肢 2423 | 裸 2424 | 恭 2425 | 叛 2426 | 纹 2427 | 摊 2428 | # 2429 | 兑 2430 | 萝 2431 | 饥 2432 | > 2433 | 浸 2434 | 叟 2435 | 滥 2436 | 灿 2437 | 衍 2438 | 喘 2439 | 吁 2440 | 晒 2441 | 谱 2442 | 堵 2443 | 暑 2444 | 撰 2445 | 棉 2446 | 蔽 2447 | 屠 2448 | 讳 2449 | 庶 2450 | 巩 2451 | 钩 2452 | 丸 2453 | 诏 2454 | 朔 2455 | 瞬 2456 | 抹 2457 | 矢 2458 | 浆 2459 | 蜀 2460 | 洒 2461 | 耕 2462 | 虏 2463 | 诵 2464 | 陛 2465 | 绵 2466 | 尴 2467 | 坤 2468 | ─ 2469 | 尬 2470 | 搏 2471 | 钙 2472 | 饼 2473 | 枯 2474 | 灼 2475 | 饶 2476 | 杉 2477 | 盼 2478 | 蒲 2479 | 尧 2480 | 俘 2481 | 伞 2482 | 庚 2483 | 摧 2484 | 遮 2485 | 痴 2486 | 罕 2487 | 桶 2488 | 巷 2489 | 乖 2490 | { 2491 | 啦 2492 | 纺 2493 | 闯 2494 | → 2495 | 敛 2496 | 弓 2497 | 喉 2498 | 酿 2499 | 彪 2500 | 垃 2501 | 歇 2502 | 圾 2503 | 倦 2504 | 狭 2505 | 晕 2506 | 裤 2507 | 蜂 2508 | } 2509 | 垣 2510 | 莉 2511 | 谍 2512 | 俩 2513 | 妪 2514 | ⑿ 2515 | 钓 2516 | 逛 2517 | 椅 2518 | 砖 2519 | 烤 2520 | 熬 2521 | 悼 2522 | 倘 2523 | 鸭 2524 | 馈 2525 | 惹 2526 | 旭 2527 | 薛 2528 | 诀 2529 | 渗 2530 | 痒 2531 | 蛮 2532 | 罩 2533 | 渊 2534 | 踢 2535 | 崖 2536 | 粟 2537 | 唇 2538 | 辐 2539 | 愧 2540 | 玲 2541 | 遏 2542 | 昼 2543 | 芦 2544 | 纣 2545 | 琼 2546 | 椎 2547 | 咳 2548 | 熙 2549 | 钉 2550 | 剖 2551 | 歉 2552 | 坠 2553 | 誓 2554 | 啤 2555 | 碧 2556 | 郅 2557 | 吻 2558 | 莎 2559 | 屯 2560 | 吟 2561 | 臭 2562 | 谦 2563 | 刮 2564 | 掠 2565 | 垫 2566 | 宙 2567 | 冀 2568 | 栗 2569 | 壳 2570 | 崛 2571 | 瑟 2572 | 哄 2573 | 谏 2574 | 丙 2575 | 叩 2576 | 缪 2577 | 雌 2578 | 叠 2579 | 奠 2580 | 髃 2581 | 碘 2582 | 暨 2583 | 劭 2584 | 霜 2585 | 妓 2586 | 厨 2587 | 脾 2588 | 俯 2589 | 槛 2590 | 芒 2591 | 沸 2592 | 盯 2593 | 坊 2594 | 咒 2595 | 觅 2596 | 剪 2597 | 遽 2598 | 贩 2599 | 寨 2600 | 铸 2601 | 炭 2602 | 绑 2603 | 蹈 2604 | 抄 2605 | 阎 2606 | 窄 2607 | 冈 2608 | 侈 2609 | 匿 2610 | 斌 2611 | 沾 2612 | 壤 2613 | 哨 2614 | 僵 2615 | 坎 2616 | 舅 2617 | 洽 2618 | 勉 2619 | 侣 2620 | 屿 2621 | 啼 2622 | 侠 2623 | 枢 2624 | 膝 2625 | 谒 2626 | 砍 2627 | 厢 2628 | 昧 2629 | 嫂 2630 | 羡 2631 | 铭 2632 | 碱 2633 | 棺 2634 | 漆 2635 | 睐 2636 | 缚 2637 | 谭 2638 | 溶 2639 | 烹 2640 | 雀 2641 | 擎 2642 | 棍 2643 | 瞄 2644 | 裹 2645 | 曝 2646 | 傻 2647 | 旱 2648 | 坑 2649 | 驴 2650 | 弦 2651 | 贬 2652 | 龟 2653 | 塘 2654 | 贞 2655 | 氨 2656 | 盎 2657 | 掷 2658 | 胺 2659 | 焚 2660 | 黏 2661 | 乒 2662 | 耍 2663 | 讶 2664 | 纱 2665 | 蠢 2666 | 掀 2667 | 藤 2668 | 蕴 2669 | 邯 2670 | 瘾 2671 | 婿 2672 | 卸 2673 | 斧 2674 | 鄙 2675 | 冕 2676 | 苑 2677 | 耿 2678 | 腻 2679 | 躺 2680 | 矩 2681 | 蝶 2682 | 浏 2683 | 壶 2684 | 凸 2685 | 臧 2686 | 墅 2687 | 粘 2688 | ⒀ 2689 | 魄 2690 | 杞 2691 | 焰 2692 | 靶 2693 | 邵 2694 | 倚 2695 | 帘 2696 | 鞭 2697 | 僚 2698 | 酶 2699 | 靡 2700 | 虐 2701 | 阐 2702 | 韵 2703 | 迄 2704 | 樊 2705 | 畔 2706 | 钯 2707 | 菊 2708 | 亥 2709 | 嵌 2710 | 狄 2711 | 拱 2712 | 伺 2713 | 潭 2714 | 缆 2715 | 慑 2716 | 厮 2717 | 晃 2718 | 媚 2719 | 吵 2720 | 骃 2721 | 稷 2722 | 涅 2723 | 阪 2724 | 挨 2725 | 珊 2726 | 殆 2727 | 璞 2728 | 婉 2729 | 翟 2730 | 栋 2731 | 醋 2732 | 鹤 2733 | 椒 2734 | 囚 2735 | 瞒 2736 | 竖 2737 | 肴 2738 | 仕 2739 | 钦 2740 | 妒 2741 | 晴 
2742 | 裔 2743 | 筛 2744 | 泻 2745 | 阙 2746 | 垒 2747 | 孰 2748 | 抖 2749 | 衬 2750 | 炫 2751 | 兢 2752 | 屑 2753 | 赦 2754 | 宵 2755 | 沮 2756 | 谎 2757 | 苟 2758 | 碌 2759 | 屁 2760 | 腕 2761 | 沦 2762 | 懈 2763 | 扉 2764 | 揖 2765 | 摔 2766 | 塌 2767 | 廖 2768 | 铝 2769 | 嘲 2770 | 胥 2771 | 曳 2772 | 敖 2773 | 傍 2774 | 筒 2775 | 朕 2776 | 扳 2777 | 鑫 2778 | 硝 2779 | 暇 2780 | @ 2781 | 冶 2782 | 靖 2783 | 袍 2784 | 凑 2785 | 悍 2786 | 兔 2787 | 邢 2788 | 熏 2789 | 株 2790 | 哮 2791 | 鹅 2792 | 乾 2793 | 鄂 2794 | 矶 2795 | 逵 2796 | 坟 2797 | 佣 2798 | 髓 2799 | 隙 2800 | 惭 2801 | 轴 2802 | 掏 2803 | 苛 2804 | 偃 2805 | 榴 2806 | ⒁ 2807 | 赎 2808 | 谅 2809 | 裴 2810 | 缅 2811 | 皂 2812 | 淑 2813 | 噪 2814 | 阀 2815 | 咎 2816 | 揽 2817 | 绮 2818 | 瞻 2819 | 谜 2820 | 拐 2821 | 渭 2822 | 啥 2823 | 彦 2824 | 遁 2825 | 琐 2826 | 喧 2827 | 藉 2828 | 嫩 2829 | 寞 2830 | 梳 2831 | 溜 2832 | 粥 2833 | 恤 2834 | 迭 2835 | 瀑 2836 | 蓉 2837 | 寥 2838 | 彬 2839 | 俺 2840 | 忿 2841 | 螺 2842 | 膀 2843 | 惫 2844 | 扔 2845 | 匪 2846 | 毙 2847 | 怠 2848 | 彰 2849 | 啸 2850 | 荻 2851 | 逮 2852 | 删 2853 | 脊 2854 | 轩 2855 | 躬 2856 | 澡 2857 | 衫 2858 | 娥 2859 | 捆 2860 | 牡 2861 | 茎 2862 | 秉 2863 | 俭 2864 | 闺 2865 | 溺 2866 | 萍 2867 | 陋 2868 | 驳 2869 | 撼 2870 | 沽 2871 | 僮 2872 | 厥 2873 | 沧 2874 | 轿 2875 | 棘 2876 | 怡 2877 | 梭 2878 | 嗣 2879 | 凄 2880 | ℃ 2881 | 铅 2882 | 绛 2883 | 祈 2884 | 斐 2885 | 箍 2886 | 爪 2887 | 琦 2888 | 惶 2889 | 刹 2890 | 嗜 2891 | 窜 2892 | 匠 2893 | 锤 2894 | 筵 2895 | 瑶 2896 | 幌 2897 | 捞 2898 | 敷 2899 | 酌 2900 | 阜 2901 | 哗 2902 | 聂 2903 | 絮 2904 | 阱 2905 | 膨 2906 | 坪 2907 | 歪 2908 | 旷 2909 | 翅 2910 | 揣 2911 | 樱 2912 | 甸 2913 | 颐 2914 | 兜 2915 | 頉 2916 | 伽 2917 | 绸 2918 | 拂 2919 | 狎 2920 | 颂 2921 | 谬 2922 | 昊 2923 | 皋 2924 | 嚷 2925 | 徊 2926 | ⒂ 2927 | 曙 2928 | 麟 2929 | 嚣 2930 | 哑 2931 | 灞 2932 | 钧 2933 | 挪 2934 | 奎 2935 | 肇 2936 | 磊 2937 | 蕉 2938 | 荧 2939 | 嗽 2940 | 瓒 2941 | 苯 2942 | 躯 2943 | 绎 2944 | 鸦 2945 | 茵 2946 | 澜 2947 | 搅 2948 | 渺 2949 | 恕 2950 | 矫 2951 | 讽 2952 | 匀 2953 | 畴 2954 | 坞 2955 | 谥 2956 | 趟 2957 | 蔓 2958 | 帛 2959 | 寅 2960 | 呜 2961 | 枣 2962 | 萌 2963 | 磷 2964 | 涤 2965 | 蚀 2966 | 疮 2967 | 浊 2968 | 煎 2969 | 叮 2970 | 倩 2971 | 拯 2972 | 瑰 2973 | 涩 2974 | 绅 2975 | 枉 2976 | 朽 2977 | 哺 2978 | 邱 2979 | 凿 2980 | 莽 2981 | 隋 2982 | 炳 2983 | 睁 2984 | 澄 2985 | 厄 2986 | 惰 2987 | 粤 2988 | 黯 2989 | 纬 2990 | 哦 2991 | 徘 2992 | 炜 2993 | 擒 2994 | 捏 2995 | 帷 2996 | 攒 2997 | 湛 2998 | 夙 2999 | 滤 3000 | 浐 3001 | 霄 3002 | 豁 3003 | 甄 3004 | 剔 3005 | 丫 3006 | 愕 3007 | 袜 3008 | 呕 3009 | | 3010 | 蹲 3011 | 皱 3012 | 勘 3013 | 辜 3014 | 唬 3015 | 葱 3016 | 甩 3017 | 诡 3018 | 猿 3019 | 稻 3020 | 宦 3021 | 姨 3022 | 橡 3023 | 涧 3024 | 亢 3025 | 芽 3026 | 濒 3027 | 蹄 3028 | 窍 3029 | 譬 3030 | 驿 3031 | 拢 3032 | 叱 3033 | 喂 3034 | 怯 3035 | 坝 3036 | 椰 3037 | 孽 3038 | 阖 3039 | 瞩 3040 | 萎 3041 | 镑 3042 | 簿 3043 | 婷 3044 | 咐 3045 | 郸 3046 | 瑜 3047 | 瑚 3048 | 矮 3049 | 祷 3050 | 窟 3051 | 藩 3052 | 牟 3053 | 疡 3054 | 仑 3055 | 谣 3056 | 侄 3057 | 沐 3058 | 孜 3059 | 劈 3060 | 枸 3061 | 妮 3062 | 蔚 3063 | 勋 3064 | 玫 3065 | 虾 3066 | 谴 3067 | 莹 3068 | 紊 3069 | 瓷 3070 | 魁 3071 | 淄 3072 | 扛 3073 | 曩 3074 | 柄 3075 | 滔 3076 | 缀 3077 | 闽 3078 | 莞 3079 | 恳 3080 | 磅 3081 | 耸 3082 | 灶 3083 | 埠 3084 | 嚼 3085 | 汲 3086 | 恍 3087 | 逗 3088 | 畸 3089 | 翩 3090 | 甥 3091 | 蚁 3092 | 耽 3093 | 稚 3094 | 戟 3095 | 戊 3096 | 侃 3097 | 帜 3098 | 璧 3099 | 碟 3100 | 敞 3101 | 晖 3102 | 匙 3103 | 烫 3104 | 眷 3105 | 娟 3106 | 卦 3107 | 寐 3108 | 苌 3109 | 馨 3110 | 锣 3111 | 谛 3112 | 桐 3113 | 钥 3114 | 琅 3115 | 赁 3116 | 蜡 3117 | 颤 3118 | 陇 3119 | 僻 3120 | 埔 3121 | 腥 3122 | 皎 3123 | 酝 3124 | 媳 3125 | ⒃ 3126 | 翘 3127 | 缔 3128 | 葫 3129 | 吼 3130 | 侮 3131 | 淹 3132 | 瘫 3133 | 窘 3134 | 啖 3135 | 犀 3136 | 弒 
3137 | 蕾 3138 | 偕 3139 | 笃 3140 | 栽 3141 | 唾 3142 | 陀 3143 | 汾 3144 | 俨 3145 | 呐 3146 | 膳 3147 | 锌 3148 | 瞧 3149 | 骏 3150 | 笨 3151 | 琢 3152 | 踩 3153 | 濮 3154 | 黛 3155 | 墟 3156 | 蒿 3157 | 歹 3158 | 绰 3159 | 捍 3160 | 诫 3161 | 漓 3162 | 篷 3163 | 咄 3164 | 诬 3165 | 乓 3166 | 梨 3167 | 奕 3168 | 睿 3169 | 嫡 3170 | 幢 3171 | 砸 3172 | 俞 3173 | 亟 3174 | 捣 3175 | 溯 3176 | 饵 3177 | 嘘 3178 | 砂 3179 | 凰 3180 | 丕 3181 | 荥 3182 | 赀 3183 | 薇 3184 | 滕 3185 | 袱 3186 | 辍 3187 | 疹 3188 | 泗 3189 | 韧 3190 | 撕 3191 | 磕 3192 | 梗 3193 | 挚 3194 | 挠 3195 | 嫉 3196 | 奚 3197 | 弩 3198 | 蝉 3199 | 罐 3200 | 敝 3201 | 鞍 3202 | 晦 3203 | 酣 3204 | 搁 3205 | 柿 3206 | 菠 3207 | 卞 3208 | 煞 3209 | 堤 3210 | 蟹 3211 | 骼 3212 | 晤 3213 | 娡 3214 | 潇 3215 | 胰 3216 | 酱 3217 | 郦 3218 | 脖 3219 | 檐 3220 | 桩 3221 | 踵 3222 | 禾 3223 | 狩 3224 | 盏 3225 | 弈 3226 | 牒 3227 | 拙 3228 | 喇 3229 | 舶 3230 | 炊 3231 | 喀 3232 | 黔 3233 | 挟 3234 | 钞 3235 | 缕 3236 | 俏 3237 | 娄 3238 | 粪 3239 | 颅 3240 | 锏 3241 | 凹 3242 | 饲 3243 | 肘 3244 | 赟 3245 | 吝 3246 | 襟 3247 | 琪 3248 | 谕 3249 | 飙 3250 | 秽 3251 | 颊 3252 | 渝 3253 | 卯 3254 | 捡 3255 | 氢 3256 | 桀 3257 | 裳 3258 | 滇 3259 | 浇 3260 | 礁 3261 | ◎ 3262 | 蚊 3263 | 芙 3264 | 荀 3265 | 吩 3266 | 凳 3267 | 峨 3268 | 巍 3269 | 雉 3270 | 郢 3271 | 铲 3272 | 倪 3273 | 杳 3274 | 汹 3275 | 豚 3276 | 乍 3277 | 蛙 3278 | 驼 3279 | 嗅 3280 | 讫 3281 | 痰 3282 | 棵 3283 | 睫 3284 | 绒 3285 | 捻 3286 | 罔 3287 | 杠 3288 | 氟 3289 | 堰 3290 | 羁 3291 | 穰 3292 | 钠 3293 | 骸 3294 | 睾 3295 | 鳞 3296 | 邸 3297 | 於 3298 | 谧 3299 | 睢 3300 | 泾 3301 | 芹 3302 | 钾 3303 | 颓 3304 | Ⅱ 3305 | 笋 3306 | 橘 3307 | 卉 3308 | 岐 3309 | 懿 3310 | 巅 3311 | 垮 3312 | 嵩 3313 | 柰 3314 | 鲨 3315 | 涡 3316 | 弧 3317 | ◆ 3318 | 钝 3319 | 啃 3320 | 熹 3321 | 芭 3322 | 隅 3323 | 拌 3324 | 锥 3325 | 抒 3326 | 焕 3327 | 漳 3328 | 鸽 3329 | 烘 3330 | 瞪 3331 | ⒄ 3332 | 箕 3333 | 驯 3334 | 恃 3335 | 靴 3336 | 刁 3337 | 聋 3338 | 剿 3339 | 筝 3340 | 绞 3341 | 鞅 3342 | 夯 3343 | 抉 3344 | 嘻 3345 | 弛 3346 | 垢 3347 | 衾 3348 | 丐 3349 | 斟 3350 | 恙 3351 | 雁 3352 | 匮 3353 | 娼 3354 | 鞠 3355 | 扼 3356 | 镶 3357 | 樵 3358 | 菇 3359 | 兖 3360 | 夭 3361 | 戌 3362 | 褚 3363 | 渲 3364 | 硫 3365 | 挞 3366 | 衙 3367 | 闫 3368 | 绾 3369 | 衅 3370 | 掣 3371 | 磋 3372 | 袒 3373 | 龚 3374 | 叨 3375 | 揉 3376 | 贻 3377 | 瑛 3378 | 俾 3379 | 薯 3380 | 憎 3381 | 傣 3382 | 炬 3383 | 荤 3384 | 烁 3385 | 沂 3386 | 粑 3387 | 蚌 3388 | 渣 3389 | 茄 3390 | 荼 3391 | 愍 3392 | 蒜 3393 | 菱 3394 | 狡 3395 | 蠡 3396 | 戍 3397 | 畤 3398 | 闵 3399 | 颍 3400 | 酋 3401 | 芮 3402 | 渎 3403 | 霆 3404 | 哼 3405 | 韬 3406 | 荫 3407 | 辙 3408 | 榄 3409 | 骆 3410 | 锂 3411 | 肛 3412 | 菑 3413 | 揪 3414 | 皖 3415 | 秃 3416 | 拽 3417 | 诟 3418 | 槐 3419 | 髦 3420 | 脓 3421 | 殡 3422 | 闾 3423 | 怅 3424 | 雯 3425 | \ 3426 | 戮 3427 | 澎 3428 | 悖 3429 | 嗓 3430 | 贮 3431 | 炙 3432 | 跋 3433 | 玮 3434 | 霖 3435 | 皓 3436 | 煽 3437 | 娠 3438 | 肋 3439 | 闸 3440 | 眩 3441 | 慷 3442 | 迂 3443 | 酉 3444 | 赘 3445 | 蝇 3446 | 羌 3447 | 蔑 3448 | 氯 3449 | 蚕 3450 | 汀 3451 | 憋 3452 | 臾 3453 | 汕 3454 | 缸 3455 | 棚 3456 | 唉 3457 | 棕 3458 | 裟 3459 | 蚡 3460 | 驮 3461 | 簇 3462 | 橙 3463 | 〉 3464 | 蹇 3465 | 庇 3466 | 佼 3467 | 禧 3468 | 崎 3469 | 痘 3470 | 芜 3471 | 姥 3472 | 绷 3473 | 惮 3474 | 雏 3475 | ⒅ 3476 | 恬 3477 | 庵 3478 | 瞎 3479 | 臀 3480 | 胚 3481 | 嘶 3482 | 铀 3483 | 靳 3484 | 呻 3485 | 膺 3486 | 醛 3487 | 憧 3488 | 嫦 3489 | 橄 3490 | 褐 3491 | 讷 3492 | 趾 3493 | 讹 3494 | 鹊 3495 | 谯 3496 | 喋 3497 | 篡 3498 | 郝 3499 | 嗟 3500 | 琉 3501 | 逞 3502 | 袈 3503 | 鲧 3504 | 虢 3505 | 穗 3506 | 踰 3507 | 栓 3508 | 钊 3509 | 鬻 3510 | 羹 3511 | 掖 3512 | 笞 3513 | 恺 3514 | 掬 3515 | 憨 3516 | 狸 3517 | 瑕 3518 | 匡 3519 | 〈 3520 | 痪 3521 | 冢 3522 | 梧 3523 | 眺 3524 | 佑 3525 | 愣 3526 | 撇 3527 | 阏 3528 | 疚 3529 | 攘 3530 | 昕 3531 | 瓣 
3532 | 烯 3533 | 谗 3534 | 隘 3535 | 酰 3536 | 绊 3537 | 鳌 3538 | 俟 3539 | 嫔 3540 | 崭 3541 | 妊 3542 | 雒 3543 | 荔 3544 | 毯 3545 | 纶 3546 | 祟 3547 | 爹 3548 | 辗 3549 | 竿 3550 | 裘 3551 | 犁 3552 | 柬 3553 | 恣 3554 | 阑 3555 | 榆 3556 | 翦 3557 | 佟 3558 | 钜 3559 | 札 3560 | 隧 3561 | ⒆ 3562 | 腌 3563 | 砌 3564 | 酥 3565 | 辕 3566 | 铬 3567 | 痔 3568 | 讥 3569 | 毓 3570 | 橐 3571 | 跻 3572 | 酮 3573 | 殉 3574 | 哙 3575 | 亵 3576 | 锯 3577 | 糜 3578 | 壬 3579 | 瞭 3580 | 恻 3581 | 轲 3582 | 糙 3583 | 涿 3584 | 绚 3585 | 荟 3586 | 梢 3587 | 赣 3588 | 沼 3589 | 腑 3590 | 朦 3591 | 徇 3592 | 咋 3593 | 膊 3594 | 陡 3595 | 骋 3596 | 伶 3597 | 涓 3598 | 芷 3599 | 弋 3600 | 枫 3601 | 觑 3602 | 髻 3603 | 巳 3604 | 匣 3605 | 蠕 3606 | 恪 3607 | 槟 3608 | 栎 3609 | 噩 3610 | 葵 3611 | 殃 3612 | 淤 3613 | 诠 3614 | 昵 3615 | 眸 3616 | 馁 3617 | 奄 3618 | 绽 3619 | 闱 3620 | 蛛 3621 | 矜 3622 | 馔 3623 | 遐 3624 | 骡 3625 | 罹 3626 | 遑 3627 | 隍 3628 | 拭 3629 | 祁 3630 | ︰ 3631 | 霁 3632 | 釜 3633 | 钵 3634 | 栾 3635 | 睦 3636 | 蚤 3637 | 咏 3638 | 憬 3639 | 韶 3640 | 圭 3641 | 觇 3642 | 芸 3643 | 氓 3644 | 伎 3645 | 氮 3646 | 靓 3647 | 淆 3648 | 绢 3649 | 眈 3650 | 掐 3651 | 簪 3652 | 搀 3653 | 玺 3654 | 镐 3655 | 竺 3656 | 峪 3657 | 冉 3658 | 拴 3659 | 忡 3660 | 卤 3661 | 撮 3662 | 胧 3663 | 邛 3664 | 彝 3665 | 楠 3666 | 缭 3667 | 棠 3668 | 腮 3669 | 祛 3670 | 棱 3671 | 睨 3672 | 嫖 3673 | 圉 3674 | 杵 3675 | 萃 3676 | 沁 3677 | 嬉 3678 | 擂 3679 | 澈 3680 | 麽 3681 | 轸 3682 | 彘 3683 | 褥 3684 | 廓 3685 | 狙 3686 | 笛 3687 | 彗 3688 | 啬 3689 | 盂 3690 | 贲 3691 | 忏 3692 | 驺 3693 | 悚 3694 | 豨 3695 | 旌 3696 | 娩 3697 | 扃 3698 | 蹦 3699 | 扈 3700 | 凛 3701 | 驹 3702 | 剃 3703 | 孺 3704 | 〕 3705 | 吆 3706 | 驷 3707 | 迸 3708 | 毗 3709 | 〔 3710 | 熔 3711 | 逍 3712 | 癸 3713 | 稼 3714 | 溥 3715 | 嫣 3716 | 瓮 3717 | 胱 3718 | 痊 3719 | 逡 3720 | 疟 3721 | 苻 3722 | 曪 3723 | 拣 3724 | 戛 3725 | 臻 3726 | 缉 3727 | 懊 3728 | 竣 3729 | 囤 3730 | 侑 3731 | 肽 3732 | 缮 3733 | 绥 3734 | 踝 3735 | 壑 3736 | 娴 3737 | 猝 3738 | 焻 3739 | 禀 3740 | 漱 3741 | 碁 3742 | 蹬 3743 | 祗 3744 | 濡 3745 | 挝 3746 | 亳 3747 | 萦 3748 | 癖 3749 | 彀 3750 | 毡 3751 | 锈 3752 | 憩 3753 | 筷 3754 | 莒 3755 | 噬 3756 | 珀 3757 | 砝 3758 | 鬓 3759 | 瑾 3760 | 澧 3761 | 栈 3762 | 恚 3763 | 搓 3764 | 褒 3765 | 疤 3766 | 沌 3767 | 絷 3768 | 镖 3769 | 塾 3770 | 钗 3771 | 骊 3772 | 拷 3773 | 铂 3774 | 郄 3775 | 窒 3776 | 驸 3777 | 裨 3778 | 矗 3779 | 烙 3780 | 惬 3781 | 炖 3782 | 赍 3783 | 迥 3784 | 蹴 3785 | 炽 3786 | 诧 3787 | 闰 3788 | 糯 3789 | 捅 3790 | 茜 3791 | 漯 3792 | ﹐ 3793 | 峭 3794 | 哇 3795 | 鹑 3796 | 疵 3797 | 梓 3798 | 骠 3799 | 咫 3800 | 鹦 3801 | 檀 3802 | 痹 3803 | 侥 3804 | 蘑 3805 | 衢 3806 | 灸 3807 | 琵 3808 | 琶 3809 | 懦 3810 | 邺 3811 | 扪 3812 | 痿 3813 | 苔 3814 | 拇 3815 | 腋 3816 | 薨 3817 | 馅 3818 | 雠 3819 | 敕 3820 | 捂 3821 | 鴈 3822 | 栅 3823 | 瓯 3824 | 嘿 3825 | 溉 3826 | 胳 3827 | 拎 3828 | 巿 3829 | 赃 3830 | 咕 3831 | 诃 3832 | 谤 3833 | 舁 3834 | 禺 3835 | 榨 3836 | – 3837 | 拈 3838 | 瘙 3839 | 眯 3840 | 篱 3841 | 鬟 3842 | 咯 3843 | 抨 3844 | 桨 3845 | 岱 3846 | 赡 3847 | 蹶 3848 | 惚 3849 | 嗔 3850 | 喏 3851 | 聆 3852 | 曜 3853 | 窑 3854 | 瘢 3855 | 柠 3856 | 蕃 3857 | 寤 3858 | 攫 3859 | 饷 3860 | 佬 3861 | 臼 3862 | 皈 3863 | 蟒 3864 | 啜 3865 | 蔗 3866 | 汶 3867 | 酪 3868 | 豕 3869 | 窖 3870 | 膛 3871 | 檬 3872 | 戾 3873 | 蟠 3874 | 黍 3875 | 鲸 3876 | 漾 3877 | 猾 3878 | 驭 3879 | 踊 3880 | 稠 3881 | 脯 3882 | 潍 3883 | 倭 3884 | 谑 3885 | 猖 3886 | 聒 3887 | 骞 3888 | 熄 3889 | 渍 3890 | 瞳 3891 | 蒯 3892 | 陉 3893 | 褪 3894 | 筐 3895 | 彤 3896 | 蝴 3897 | 廪 3898 | 嬴 3899 | 沱 3900 | 闼 3901 | 橱 3902 | 蜚 3903 | 蹭 3904 | 鄢 3905 | 臆 3906 | 邳 3907 | 盔 3908 | 眶 3909 | 沓 3910 | 飨 3911 | 覃 3912 | 彷 3913 | 淌 3914 | 岚 3915 | 霹 3916 | 辔 3917 | 袂 3918 | 嗤 3919 | 榔 3920 | 鸾 3921 | 綦 3922 | 莘 3923 | 媲 3924 | 翊 3925 | 雳 3926 | 箸 
3927 | 蚩 3928 | 茸 3929 | 嗦 3930 | 楷 3931 | 韭 3932 | 簸 3933 | 帚 3934 | 坍 3935 | 後 3936 | 璋 3937 | 剽 3938 | 渤 3939 | 骥 3940 | 犊 3941 | 迩 3942 | 悯 3943 | 饪 3944 | 搂 3945 | 鹉 3946 | 岑 3947 | 觞 3948 | 棣 3949 | 蕊 3950 | 诳 3951 | 黥 3952 | 藻 3953 | 郜 3954 | 舵 3955 | 毂 3956 | 茗 3957 | 忱 3958 | 铿 3959 | 谙 3960 | 怆 3961 | 钳 3962 | 佗 3963 | 瀚 3964 | 亘 3965 | 铎 3966 | 咀 3967 | 濯 3968 | 鼾 3969 | 酵 3970 | 酯 3971 | 麾 3972 | Ⅰ 3973 | 笙 3974 | ü 3975 | 缨 3976 | 翳 3977 | 龈 3978 | 忒 3979 | 煦 3980 | 顼 3981 | 俎 3982 | 圃 3983 | 刍 3984 | 喙 3985 | 羲 3986 | 陨 3987 | 嘤 3988 | 梏 3989 | 颛 3990 | 蜒 3991 | 啮 3992 | 镁 3993 | 辇 3994 | 葆 3995 | 蔺 3996 | 筮 3997 | 溅 3998 | 佚 3999 | 匾 4000 | 暄 4001 | 谀 4002 | 媵 4003 | 纫 4004 | 砀 4005 | 悸 4006 | 啪 4007 | 迢 4008 | 瞽 4009 | 莓 4010 | 瞰 4011 | 俸 4012 | 珑 4013 | 骜 4014 | 穹 4015 | 麓 4016 | 潢 4017 | 妞 4018 | 铢 4019 | 忻 4020 | 铤 4021 | 劾 4022 | 樟 4023 | 俐 4024 | 缗 4025 | 煲 4026 | 粱 4027 | 虱 4028 | 淇 4029 | 徼 4030 | 脐 4031 | 鼋 4032 | 嘈 4033 | 悴 4034 | 捶 4035 | 嚏 4036 | 挛 4037 | 谚 4038 | 螃 4039 | 殴 4040 | 瘟 4041 | 掺 4042 | 〇 4043 | 酚 4044 | 梵 4045 | 栩 4046 | 褂 4047 | 摹 4048 | 蜿 4049 | 钮 4050 | 箧 4051 | 胫 4052 | 馒 4053 | 焱 4054 | 嘟 4055 | 芋 4056 | 踌 4057 | 圜 4058 | 衿 4059 | 峙 4060 | 宓 4061 | 腆 4062 | 佞 4063 | 砺 4064 | 婪 4065 | 瀛 4066 | 苷 4067 | 昱 4068 | 贰 4069 | 秤 4070 | 扒 4071 | 龁 4072 | 躇 4073 | 翡 4074 | 宥 4075 | 弼 4076 | 醮 4077 | 缤 4078 | 瘗 4079 | 鳖 4080 | 擞 4081 | 眨 4082 | 礶 4083 | 锢 4084 | 辫 4085 | 儋 4086 | 纭 4087 | 洼 4088 | 漕 4089 | 飓 4090 | 纂 4091 | 繇 4092 | 舷 4093 | 勺 4094 | 诲 4095 | 捺 4096 | 瞑 4097 | 啻 4098 | 蹙 4099 | 佯 4100 | 茹 4101 | 怏 4102 | 蛟 4103 | 鹭 4104 | 烬 4105 | ■ 4106 | 兀 4107 | 檄 4108 | 浒 4109 | 胤 4110 | 踞 4111 | 僖 4112 | 卬 4113 | 爇 4114 | 璀 4115 | 暧 4116 | 髡 4117 | 蚂 4118 | 饽 4119 | 镰 4120 | 陂 4121 | 瞌 4122 | 诽 4123 | 钺 4124 | 沥 4125 | 镍 4126 | 耘 4127 | 燎 4128 | 祚 4129 | 儣 4130 | 莺 4131 | 屎 4132 | 辘 4133 | 鸥 4134 | 驩 4135 | 氐 4136 | 匕 4137 | 銮 4138 | ━ 4139 | 苴 4140 | 憔 4141 | 渥 4142 | 袅 4143 | 瞿 4144 | 瓢 4145 | 痣 4146 | 蘸 4147 | 蹑 4148 | 玷 4149 | 惺 4150 | 轧 4151 | 喃 4152 | 潺 4153 | 唏 4154 | 逅 4155 | 懵 4156 | 帏 4157 | 唠 4158 | 徨 4159 | 咤 4160 | 抠 4161 | 蛊 4162 | 苇 4163 | 铮 4164 | 疙 4165 | 闳 4166 | 砥 4167 | 羸 4168 | 遨 4169 | 哎 4170 | 捽 4171 | 钏 4172 | 壹 4173 | 昇 4174 | 擢 4175 | 贽 4176 | 汴 4177 | 砰 4178 | 牝 4179 | 蔼 4180 | 熠 4181 | 粽 4182 | 绌 4183 | 杼 4184 | 麒 4185 | 叭 4186 | 颔 4187 | 锭 4188 | 妍 4189 | 姒 4190 | 邂 4191 | 濞 4192 | 轶 4193 | 搔 4194 | 蹊 4195 | 阂 4196 | 垦 4197 | 猕 4198 | 伫 4199 | 瘩 4200 | 璐 4201 | 黠 4202 | 婺 4203 | 噫 4204 | 潞 4205 | 呱 4206 | 幡 4207 | 汞 4208 | 缯 4209 | 骁 4210 | 墩 4211 | 赧 4212 | 瞥 4213 | 媛 4214 | 瞠 4215 | 羔 4216 | 轼 4217 | Ⅲ 4218 | 拗 4219 | 鹞 4220 | 搴 4221 | 诮 4222 | 趴 4223 | 凋 4224 | 撩 4225 | 芥 4226 | 缎 4227 | 摒 4228 | 泮 4229 | 惘 4230 | 骛 4231 | 瘳 4232 | 姝 4233 | β 4234 | 渚 4235 | 吠 4236 | 稣 4237 | 獘 4238 | 篃 4239 | 罄 4240 | 吒 4241 | 茧 4242 | 黜 4243 | 缢 4244 | 獗 4245 | 诅 4246 | 絜 4247 | 蜕 4248 | 屹 4249 | 哽 4250 | 缄 4251 | 俑 4252 | 坷 4253 | 杓 4254 | 剁 4255 | 锺 4256 | 鹜 4257 | 谩 4258 | 岔 4259 | 籽 4260 | 磬 4261 | 溍 4262 | 邃 4263 | 钨 4264 | 甬 4265 | 笥 4266 | 蝠 4267 | 龋 4268 | 鸱 4269 | 孚 4270 | 馍 4271 | 溴 4272 | 妫 4273 | 偎 4274 | 烽 4275 | 椽 4276 | 阮 4277 | 酗 4278 | 惋 4279 | 牍 4280 | 觥 4281 | 瞅 4282 | 涣 4283 | 狈 4284 | 锰 4285 | 椟 4286 | 饺 4287 | 溲 4288 | 谪 4289 | 掇 4290 | 蓟 4291 | 倔 4292 | 鞫 4293 | 猢 4294 | 笄 4295 | 翕 4296 | 嗥 4297 | 卺 4298 | 寰 4299 | 狞 4300 | 洮 4301 | 炕 4302 | 夡 4303 | 瘠 4304 | 磺 4305 | 肱 4306 | 奭 4307 | 耆 4308 | 棂 4309 | 娅 4310 | 咚 4311 | 豌 4312 | 樗 4313 | 诩 4314 | 斡 4315 | 榈 4316 | 琛 4317 | 狲 4318 | 蕲 4319 | 捎 4320 | 戳 4321 | 炯 
4322 | 峦 4323 | 嘎 4324 | 睬 4325 | 怙 4326 | 疱 4327 | 霎 4328 | 哂 4329 | 鱿 4330 | 涸 4331 | 咦 4332 | 痉 4333 | $ 4334 | 抟 4335 | 庖 4336 | 沅 4337 | 瑙 4338 | 珏 4339 | 祜 4340 | 楞 4341 | 漉 4342 | 鸠 4343 | 镂 4344 | 诰 4345 | 谄 4346 | 蜗 4347 | 嗒 4348 | 珂 4349 | 祯 4350 | 鸳 4351 | 殒 4352 | 潼 4353 | 柩 4354 | 萤 4355 | 柑 4356 | 轵 4357 | 缰 4358 | 淼 4359 | 冗 4360 | 蕙 4361 | 鳄 4362 | 嘀 4363 | 彊 4364 | 峥 4365 | 雹 4366 | 藜 4367 | 笠 4368 | 岖 4369 | 傥 4370 | 潦 4371 | 苞 4372 | 蛰 4373 | 嬖 4374 | 僦 4375 | 碣 4376 | 裰 4377 | 疸 4378 | 湮 4379 | 昴 4380 | 榷 4381 | 涎 4382 | 攸 4383 | 砾 4384 | 跖 4385 | 恂 4386 | 舄 4387 | 麝 4388 | 貂 4389 | 孢 4390 | 捋 4391 | 笈 4392 | 璨 4393 | 粕 4394 | 浚 4395 | 鹃 4396 | 歆 4397 | 漪 4398 | 岷 4399 | 咧 4400 | 殁 4401 | 篆 4402 | 湃 4403 | 侏 4404 | 傈 4405 | 殇 4406 | 霭 4407 | 嚎 4408 | 拊 4409 | 崂 4410 | 鬲 4411 | 碉 4412 | 菁 4413 | 庾 4414 | 拚 4415 | 旃 4416 | 幺 4417 | 皿 4418 | 焊 4419 | 噢 4420 | 祺 4421 | 锚 4422 | 痤 4423 | 翎 4424 | 醺 4425 | 噶 4426 | 傀 4427 | 俛 4428 | 秧 4429 | 谆 4430 | 僳 4431 | 菽 4432 | 绯 4433 | 瘥 4434 | 盥 4435 | 蹋 4436 | 髯 4437 | 岌 4438 | 痧 4439 | 偌 4440 | 禳 4441 | 簧 4442 | 跤 4443 | 伉 4444 | 腼 4445 | 爰 4446 | 箫 4447 | 曦 4448 | 蜘 4449 | 霓 4450 | 愆 4451 | 姗 4452 | 陬 4453 | 楂 4454 | 嵘 4455 | 蜓 4456 | 浼 4457 | 癫 4458 | 瓠 4459 | 跷 4460 | 绐 4461 | 枷 4462 | 墀 4463 | 馕 4464 | 盹 4465 | 聩 4466 | 镯 4467 | 砚 4468 | 晁 4469 | 僊 4470 | ° 4471 | 坂 4472 | 煜 4473 | 俚 4474 | 眛 4475 | 焘 4476 | 阍 4477 | 袄 4478 | 夔 4479 | 馋 4480 | 泸 4481 | 庠 4482 | 毐 4483 | 飚 4484 | 刭 4485 | 琏 4486 | 羿 4487 | 斓 4488 | 稔 4489 | 阉 4490 | 喾 4491 | 恸 4492 | 耦 4493 | 咪 4494 | 蝎 4495 | 唿 4496 | 桔 4497 | 缑 4498 | 诋 4499 | 訾 4500 | 迨 4501 | 鹄 4502 | 蟾 4503 | 鬣 4504 | 廿 4505 | 莅 4506 | 荞 4507 | 槌 4508 | 媾 4509 | 愦 4510 | 郏 4511 | 淖 4512 | 嗪 4513 | 镀 4514 | 畦 4515 | 颦 4516 | 浃 4517 | 牖 4518 | 襁 4519 | 怂 4520 | 唆 4521 | 嚭 4522 | 涟 4523 | 拮 4524 | 腓 4525 | 缥 4526 | 郫 4527 | 遴 4528 | 邾 4529 | 悒 4530 | 嗝 4531 | 殽 4532 | 跛 4533 | 掂 4534 | 撬 4535 | 鄣 4536 | 鄱 4537 | 斫 4538 | 窿 4539 | 兕 4540 | 壕 4541 | 疽 4542 | 铙 4543 | 吱 4544 | 厩 4545 | 甭 4546 | 镪 4547 | 篝 4548 | 踣 4549 | 眦 4550 | 啧 4551 | 糠 4552 | 鲤 4553 | 粲 4554 | 噱 4555 | 椭 4556 | 哟 4557 | 潸 4558 | 铆 4559 | 姣 4560 | 馥 4561 | 胙 4562 | 迦 4563 | 偻 4564 | 嗯 4565 | 陟 4566 | 爲 4567 | 桧 4568 | 鸯 4569 | 恿 4570 | 晌 4571 | 臱 4572 | 骈 4573 | 喽 4574 | 淅 4575 | 澹 4576 | 叽 4577 | 桢 4578 | 刨 4579 | 忑 4580 | 忐 4581 | 猩 4582 | 蝙 4583 | 旄 4584 | 晾 4585 | 吭 4586 | 荏 4587 | 觐 4588 | 胄 4589 | 榛 4590 | 豢 4591 | 堑 4592 | 帔 4593 | 咙 4594 | 柚 4595 | 僭 4596 | 锵 4597 | √ 4598 | 肮 4599 | 囿 4600 | 忤 4601 | 惴 4602 | 燮 4603 | 棹 4604 | 摈 4605 | 缈 4606 | 幛 4607 | 墉 4608 | 诎 4609 | 仞 4610 | 剌 4611 | 氇 4612 | 泯 4613 | 茱 4614 | 獾 4615 | 豺 4616 | 蜃 4617 | 殂 4618 | 窈 4619 | 倨 4620 | 褓 4621 | 詈 4622 | 砷 4623 | 邕 4624 | 薰 4625 | 頫 4626 | 焖 4627 | 痫 4628 | 痢 4629 | 掾 4630 | 獐 4631 | 簌 4632 | 雎 4633 | é 4634 | 帧 4635 | 鸩 4636 | 匝 4637 | 桅 4638 | 椁 4639 | 绫 4640 | 桡 4641 | 氆 4642 | 哌 4643 | 咛 4644 | 鞘 4645 | 辎 4646 | 缙 4647 | 玑 4648 | 佤 4649 | 垓 4650 | 槿 4651 | 蛤 4652 | 烨 4653 | 泓 4654 | 罴 4655 | 鄜 4656 | 褶 4657 | 瘀 4658 | 颌 4659 | 蹂 4660 | 弑 4661 | 珪 4662 | 曷 4663 | 膑 4664 | 惦 4665 | 咆 4666 | 梆 4667 | 蛾 4668 | 牂 4669 | 髅 4670 | 捱 4671 | 拧 4672 | 婧 4673 | 踱 4674 | 怵 4675 | 侗 4676 | 屉 4677 | 讪 4678 | 衲 4679 | 麋 4680 | 宕 4681 | 畿 4682 | 唧 4683 | 怛 4684 | 豉 4685 | 籁 4686 | 觌 4687 | 舂 4688 | 蓦 4689 | 廨 4690 | 胪 4691 | 怍 4692 | 鄄 4693 | 绶 4694 | 飕 4695 | 蜻 4696 | 欷 4697 | 邬 4698 | 杲 4699 | 汧 4700 | 唑 4701 | 冽 4702 | 邰 4703 | 鼍 4704 | 魇 4705 | 铐 4706 | 哝 4707 | 泱 4708 | 扞 4709 | 飒 4710 | 醴 4711 | 陲 4712 | 喟 4713 | 筠 4714 | 殓 4715 | 瘸 4716 | 倏 
4717 | 嗳 4718 | 啕 4719 | 睑 4720 | 翌 4721 | à 4722 | 幄 4723 | 娓 4724 | 蓺 4725 | 妩 4726 | 奁 4727 | 璜 4728 | 桦 4729 | 朐 4730 | 榕 4731 | 礴 4732 | 儡 4733 | 婕 4734 | 觎 4735 | 觊 4736 | 绦 4737 | 猥 4738 | 涮 4739 | 倬 4740 | 袤 4741 | 啄 4742 | 掳 4743 | 椿 4744 | 俪 4745 | 噜 4746 | 摞 4747 | ※ 4748 | 鄗 4749 | 漩 4750 | 悝 4751 | 淞 4752 | 袴 4753 | 僇 4754 | 酹 4755 | 搒 4756 | 跽 4757 | 鳍 4758 | 疣 4759 | 姁 4760 | 猗 4761 | 舛 4762 | 鞮 4763 | 砭 4764 | 郯 4765 | 徕 4766 | 纥 4767 | 梃 4768 | 卮 4769 | 肣 4770 | 湎 4771 | 怦 4772 | 揄 4773 | 迕 4774 | 芍 4775 | 珥 4776 | 羚 4777 | 喔 4778 | 缁 4779 | 涝 4780 | 栉 4781 | 犷 4782 | 汜 4783 | 悻 4784 | 呛 4785 | 赭 4786 | 淬 4787 | 泫 4788 | 炀 4789 | 箴 4790 | 镌 4791 | 髫 4792 | 拄 4793 | 怔 4794 | 炷 4795 | 桎 4796 | 巽 4797 | 汭 4798 | 鹫 4799 | 挈 4800 | 蝄 4801 | 噙 4802 | 锄 4803 | 邴 4804 | 歔 4805 | 瘪 4806 | 腴 4807 | 呗 4808 | 慵 4809 | 撺 4810 | 欤 4811 | 阡 4812 | 傩 4813 | 苫 4814 | 掰 4815 | 盅 4816 | 冑 4817 | 躏 4818 | 茉 4819 | 霾 4820 | 耄 4821 | 楹 4822 | 蹻 4823 | 苋 4824 | 鲠 4825 | 哆 4826 | 傒 4827 | 榭 4828 | 牦 4829 | 婶 4830 | 仃 4831 | 囱 4832 | 皙 4833 | 醦 4834 | 隰 4835 | 掼 4836 | 琖 4837 | 駆 4838 | 暲 4839 | 砒 4840 | 舀 4841 | 鹗 4842 | 犒 4843 | 斛 4844 | 甑 4845 | 楫 4846 | 嫪 4847 | 胭 4848 | 瘁 4849 | 铛 4850 | 藕 4851 | 簋 4852 | 腭 4853 | 睽 4854 | 阕 4855 | 裀 4856 | 砧 4857 | 蓼 4858 | 贳 4859 | 劬 4860 | 搽 4861 | 龏 4862 | 荃 4863 | 奘 4864 | 祎 4865 | 泵 4866 | 攥 4867 | 翱 4868 | 晟 4869 | 酎 4870 | 睇 4871 | 逋 4872 | 箔 4873 | 羟 4874 | 诙 4875 | 饬 4876 | 跆 4877 | 眇 4878 | 佻 4879 | 铠 4880 | 娑 4881 | 郧 4882 | 葭 4883 | 蝗 4884 | 郓 4885 | 幞 4886 | 鉏 4887 | 碾 4888 | 硒 4889 | 釉 4890 | 磔 4891 | 殄 4892 | 藐 4893 | 莠 4894 | 颧 4895 | 熨 4896 | 獠 4897 | 浞 4898 | 笺 4899 | 癣 4900 | 茬 4901 | 衽 4902 | 喳 4903 | 裾 4904 | 倜 4905 | 鸢 4906 | 蠹 4907 | 廛 4908 | 惆 4909 | 芈 4910 | 燔 4911 | 伛 4912 | 妗 4913 | 佃 4914 | 缜 4915 | 咣 4916 | 龛 4917 | 挎 4918 | 徵 4919 | 粼 4920 | 锉 4921 | 啾 4922 | 隼 4923 | 猬 4924 | 镳 4925 | 璇 4926 | 胯 4927 | 饕 4928 | 揩 4929 | 縠 4930 | 虮 4931 | 苓 4932 | 噎 4933 | 祓 4934 | 筰 4935 | 奂 4936 | 搪 4937 | 喁 4938 | 俦 4939 | 隗 4940 | 馏 4941 | 圩 4942 | 褫 4943 | 僰 4944 | 吮 4945 | 哧 4946 | 湫 4947 | 旻 4948 | 筏 4949 | 搢 4950 | 佶 4951 | 茕 4952 | 铣 4953 | 娆 4954 | 揍 4955 | 嗷 4956 | 柈 4957 | 蕨 4958 | 绖 4959 | 旎 4960 | 汨 4961 | 畑 4962 | 鳏 4963 | 厝 4964 | 溷 4965 | 楯 4966 | 卅 4967 | 祇 4968 | ′ 4969 | 怼 4970 | 焯 4971 | ± 4972 | 柘 4973 | 骷 4974 | 澍 4975 | ▲ 4976 | ` 4977 | 珞 4978 | 褊 4979 | ╱ 4980 | 痂 4981 | 罘 4982 | 殚 4983 | 垠 4984 | 缧 4985 | 瑁 4986 | 齮 4987 | 蓐 4988 | 怿 4989 | 蹿 4990 | 豳 4991 | 犴 4992 | 孵 4993 | 筱 4994 | 蜷 4995 | 窋 4996 | 泞 4997 | 肄 4998 | 祐 4999 | 窕 5000 | 酆 5001 | 谶 5002 | 阗 5003 | 讙 5004 | 镝 5005 | 匍 5006 | 腱 5007 | ^ 5008 | 镬 5009 | 仡 5010 | 樾 5011 | 驽 5012 | 峒 5013 | 蟆 5014 | 葳 5015 | 徉 5016 | 昙 5017 | 罡 5018 | 耜 5019 | 嗨 5020 | 氲 5021 | 骅 5022 | 襦 5023 | 浔 5024 | 纮 5025 | 洱 5026 | 氦 5027 | 舐 5028 | 黙 5029 | 臊 5030 | 縯 5031 | 汛 5032 | 蹀 5033 | 溟 5034 | 枥 5035 | 祉 5036 | 铄 5037 | 豸 5038 | 揶 5039 | 馀 5040 | 闇 5041 | 呷 5042 | 仄 5043 | 焒 5044 | 嗡 5045 | 崆 5046 | 匳 5047 | 皑 5048 | 匐 5049 | ÷ 5050 | 诿 5051 | 髭 5052 | 鲰 5053 | 鲲 5054 | 筴 5055 | 侬 5056 | 鹳 5057 | 滂 5058 | △ 5059 | 橹 5060 | 邈 5061 | 弭 5062 | 弁 5063 | 樽 5064 | 揆 5065 | 幔 5066 | 纨 5067 | 踉 5068 | 帼 5069 | 跸 5070 | 搠 5071 | 缞 5072 | 氤 5073 | 旒 5074 | 旖 5075 | 屣 5076 | 孱 5077 | 槁 5078 | 铉 5079 | 榼 5080 | 沣 5081 | 娣 5082 | 娈 5083 | 夤 5084 | 壅 5085 | 枇 5086 | 讴 5087 | 埶 5088 | 阆 5089 | 杷 5090 | 浣 5091 | 狰 5092 | 愠 5093 | 蚓 5094 | 咿 5095 | 藿 5096 | 欻 5097 | 萸 5098 | 刽 5099 | 稞 5100 | 刎 5101 | 骖 5102 | 冁 5103 | 骰 5104 | 嵯 5105 | 濂 5106 | 跚 5107 | 湄 5108 | 釂 5109 | 麤 5110 | 珰 5111 | 舔 
5112 | 谮 5113 | 坨 5114 | 嗲 5115 | 埒 5116 | 锲 5117 | 鲇 5118 | 煨 5119 | 耎 5120 | 绻 5121 | 楣 5122 | 噉 5123 | 谟 5124 | 嗖 5125 | 裆 5126 | 晗 5127 | 囹 5128 | 黝 5129 | 讣 5130 | 薏 5131 | ⑴ 5132 | 貉 5133 | 椹 5134 | 蟜 5135 | 犍 5136 | 蜇 5137 | 秏 5138 | 呶 5139 | 箩 5140 | 悞 5141 | 妤 5142 | 搐 5143 | 芪 5144 | 呦 5145 | 恽 5146 | 赊 5147 | 侩 5148 | 绁 5149 | 猱 5150 | 遒 5151 | 镵 5152 | 鸮 5153 | 趺 5154 | 簏 5155 | 迤 5156 | 坼 5157 | 痼 5158 | 棰 5159 | 凫 5160 | 诂 5161 | 骀 5162 | 瘴 5163 | 螨 5164 | 阚 5165 | 臃 5166 | 葩 5167 | 篓 5168 | 谲 5169 | 悌 5170 | 嬗 5171 | 颉 5172 | 赉 5173 | 珈 5174 | 汩 5175 | 薮 5176 | 亶 5177 | 鬃 5178 | 蒽 5179 | 黾 5180 | 噤 5181 | 螫 5182 | 嶲 5183 | 湍 5184 | 畲 5185 | 徜 5186 | 衮 5187 | 茀 5188 | 蓍 5189 | ┐ 5190 | 遛 5191 | 磐 5192 | 篁 5193 | 遘 5194 | 乩 5195 | 蹒 5196 | ≥ 5197 | 鸵 5198 | 褴 5199 | 苒 5200 | 郈 5201 | 踽 5202 | 叵 5203 | 咻 5204 | 伋 5205 | 襆 5206 | 歙 5207 | 伧 5208 | 醳 5209 | 鄠 5210 | 茴 5211 | 赳 5212 | 矾 5213 | 圄 5214 | 楮 5215 | 坯 5216 | 蕤 5217 | 迓 5218 | 锱 5219 | 腉 5220 | 滦 5221 | 饯 5222 | 诤 5223 | 懋 5224 | 呤 5225 | 纡 5226 | 隽 5227 | 妲 5228 | 蜴 5229 | ┌ 5230 | 疋 5231 | 噻 5232 | 愀 5233 | 龊 5234 | 琨 5235 | 镭 5236 | 藓 5237 | 镣 5238 | 滈 5239 | 蓓 5240 | 杪 5241 | 糗 5242 | 菅 5243 | 椀 5244 | 懑 5245 | 苎 5246 | 劓 5247 | 囫 5248 | α 5249 | 啰 5250 | 钼 5251 | 烷 5252 | 兒 5253 | 脔 5254 | 郴 5255 | 忖 5256 | 芎 5257 | 啶 5258 | 巉 5259 | 钒 5260 | 缒 5261 | 蝼 5262 | 龌 5263 | 沔 5264 | 醢 5265 | 晔 5266 | 孳 5267 | 忝 5268 | 嗫 5269 | 橇 5270 | 勖 5271 | 宸 5272 | 佰 5273 | 蜈 5274 | 酞 5275 | 蔷 5276 | 糅 5277 | 噭 5278 | 猊 5279 | 儇 5280 | 觳 5281 | 缟 5282 | 郐 5283 | 眙 5284 | 赅 5285 | 剜 5286 | 徭 5287 | 蛭 5288 | 愎 5289 | 唔 5290 | 瘘 5291 | 魋 5292 | 镉 5293 | 殛 5294 | 茏 5295 | 邋 5296 | 垛 5297 | 垩 5298 | 焙 5299 | 篾 5300 | 羯 5301 | 浍 5302 | 鏖 5303 | 嚓 5304 | 躞 5305 | 堃 5306 | 烩 5307 | 莴 5308 | ¥ 5309 | 绠 5310 | 纔 5311 | 衩 5312 | 糁 5313 | ≤ 5314 | 町 5315 | 粝 5316 | 玳 5317 | 穑 5318 | 葺 5319 | 钲 5320 | 徂 5321 | ﹖ 5322 | 棓 5323 | 泷 5324 | 涪 5325 | 囵 5326 | 怫 5327 | 屦 5328 | 歘 5329 | 鐘 5330 | 『 5331 | 裱 5332 | 缱 5333 | 圹 5334 | 罂 5335 | 荦 5336 | 腈 5337 | 愬 5338 | 坭 5339 | 嗛 5340 | 铩 5341 | 馐 5342 | 媸 5343 | 遢 5344 | て 5345 | 渑 5346 | 曛 5347 | 粳 5348 | 蹰 5349 | 舫 5350 | 勐 5351 | 窭 5352 | 濠 5353 | 亹 5354 | 跄 5355 | 琥 5356 | 戢 5357 | 駹 5358 | 燧 5359 | 嫜 5360 | 峄 5361 | 竽 5362 | 膈 5363 | 荚 5364 | 姞 5365 | 赇 5366 | 樭 5367 | 澙 5368 | 笮 5369 | 嶙 5370 | 氰 5371 | 孀 5372 | 崧 5373 | 郾 5374 | 蜥 5375 | 阊 5376 | 篙 5377 | 狻 5378 | 靛 5379 | 虬 5380 | 赝 5381 | 篑 5382 | 榇 5383 | 鞑 5384 | 侪 5385 | 盍 5386 | 疝 5387 | 矽 5388 | 堙 5389 | 毶 5390 | 泠 5391 | 瞟 5392 | 癀 5393 | 镞 5394 | 酤 5395 | 涔 5396 | 譄 5397 | 唁 5398 | 薜 5399 | 郿 5400 | ⑵ 5401 | 爻 5402 | 盱 5403 | 膻 5404 | 菡 5405 | ⒉ 5406 | 绨 5407 | 埽 5408 | О 5409 | 鳜 5410 | 醚 5411 | 阃 5412 | 遶 5413 | 岿 5414 | 張 5415 | 椐 5416 | 酺 5417 | 蔟 5418 | 螂 5419 | 辂 5420 | 窠 5421 | 淙 5422 | 鷪 5423 | 貋 5424 | 刳 5425 | 骶 5426 | 恫 5427 | 挹 5428 | 婀 5429 | 铳 5430 | 蒍 5431 | 孥 5432 | 蚣 5433 | 唳 5434 | 纻 5435 | Ⅳ 5436 | 甾 5437 | 旘 5438 | 膘 5439 | < 5440 | 脍 5441 | 耨 5442 | 翮 5443 | 赈 5444 | 浜 5445 | 洹 5446 | 蛎 5447 | 魉 5448 | 纰 5449 | 岫 5450 | 坌 5451 | 捭 5452 | 睒 5453 | 轺 5454 | 锗 5455 | 稗 5456 | 崚 5457 | 仫 5458 | 珩 5459 | 庑 5460 | 邽 5461 | 麃 5462 | 』 5463 | 縻 5464 | 荼 5465 | 嗑 5466 | 瞋 5467 | 螭 5468 | 绔 5469 | 喱 5470 | ‰ 5471 | 痞 5472 | 咔 5473 | 埤 5474 | 疥 5475 | 猷 5476 | 洺 5477 | 啁 5478 | 讦 5479 | 礻 5480 | 餮 5481 | 泅 5482 | 蛹 5483 | 癞 5484 | 妁 5485 | 桞 5486 | 匏 5487 | 琮 5488 | 铨 5489 | 杌 5490 | 孑 5491 | 菟 5492 | 骐 5493 | 钡 5494 | 钚 5495 | 莆 5496 | 荪 5497 | 魑 5498 | 峇 5499 | 斄 5500 | 缶 5501 | 茭 5502 | 煅 5503 | 酩 5504 | 酢 5505 | 湟 5506 | 潏 
5507 | 嘌 5508 | 韪 5509 | 苣 5510 | 蛆 5511 | 侔 5512 | 帑 5513 | 鸨 5514 | 愫 5515 | 芫 5516 | 郪 5517 | 踔 5518 | 骧 5519 | 茁 5520 | 溧 5521 | 皁 5522 | 蜔 5523 | 魍 5524 | 瀹 5525 | 楔 5526 | 祧 5527 | 粜 5528 | 晡 5529 | 蹩 5530 | 畎 5531 | 啱 5532 | 窳 5533 | 瞾 5534 | 甙 5535 | 㛃 5536 | 絪 5537 | 绺 5538 | 貔 5539 | 崂 5540 | 痈 5541 | 舡 5542 | 葴 5543 | 耋 5544 | 囔 5545 | П 5546 | 蚯 5547 | 笆 5548 | 鲐 5549 | 踧 5550 | 遫 5551 | 踟 5552 | Р 5553 | 溊 5554 | 咂 5555 | 锹 5556 | 笫 5557 | 癔 5558 | 觜 5559 | 涒 5560 | 碓 5561 | 蛲 5562 | 跺 5563 | 枞 5564 | 茔 5565 | 1 5566 | 谸 5567 | 抿 5568 | 擘 5569 | 跬 5570 | 愛 5571 | 浿 5572 | ∩ 5573 | 黟 5574 | 枰 5575 | な 5576 | 轘 5577 | 荠 5578 | 郇 5579 | 姮 5580 | 锑 5581 | 妳 5582 | 饴 5583 | 绡 5584 | 奡 5585 | 夥 5586 | 钤 5587 | 俅 5588 | 酊 5589 | 潴 5590 | 绀 5591 | 髋 5592 | 獬 5593 | 儆 5594 | 産 5595 | 乂 5596 | 餍 5597 | 颡 5598 | 胾 5599 | 碛 5600 | 貊 5601 | 魭 5602 | 钿 5603 | 鸬 5604 | 喑 5605 | 哏 5606 | 牯 5607 | 蜍 5608 | 摁 5609 | 嶓 5610 | 俳 5611 | 蟭 5612 | 躅 5613 | 羖 5614 | 鳃 5615 | 孛 5616 | 羑 5617 | 濑 5618 | 雩 5619 | 焜 5620 | 鸷 5621 | 箦 5622 | 茯 5623 | 醪 5624 | 鹂 5625 | 铚 5626 | 缳 5627 | 螳 5628 | 酇 5629 | 蛔 5630 | 罃 5631 | 珐 5632 | 苕 5633 | 罅 5634 | 蛀 5635 | 庳 5636 | 褛 5637 | 罥 5638 | 艮 5639 | 娲 5640 | 蒺 5641 | 娉 5642 | 撵 5643 | 禨 5644 | 蓖 5645 | 姹 5646 | 戕 5647 | 庥 5648 | 岬 5649 | 痍 5650 | 烜 5651 | 窴 5652 | 邠 5653 | 蹉 5654 | 诨 5655 | 狁 5656 | 顒 5657 | 莨 5658 | 阈 5659 | 嘹 5660 | 戆 5661 | 窎 5662 | 儙 5663 | 螾 5664 | 纾 5665 | 嵋 5666 | 镕 5667 | 跣 5668 | 繻 5669 | 枳 5670 | 菏 5671 | 赜 5672 | 槃 5673 | 趄 5674 | 煊 5675 | 嬛 5676 | 抡 5677 | 睚 5678 | 跹 5679 | 壖 5680 | 戗 5681 | ⑶ 5682 | 榫 5683 | 沬 5684 | 崴 5685 | 颚 5686 | 畼 5687 | 嫚 5688 | 嚋 5689 | 珮 5690 | ◇ 5691 | 娀 5692 | 枋 5693 | 獭 5694 | 畀 5695 | 谇 5696 | 欃 5697 | 瓴 5698 | 龂 5699 | 鲋 5700 | 鹆 5701 | 鳝 5702 | 郕 5703 | 疴 5704 | 偈 5705 | 诒 5706 | 讧 5707 | 惇 5708 | 跂 5709 | 扢 5710 | 爨 5711 | 赪 5712 | 苡 5713 | 鈇 5714 | 晞 5715 | 亓 5716 | 釐 5717 | 槊 5718 | 寘 5719 | 暾 5720 | 莩 5721 | 徳 5722 | 钹 5723 | 冏 5724 | 書 5725 | 麂 5726 | 撂 5727 | 犨 5728 | 滁 5729 | 孪 5730 | 刓 5731 | 逶 5732 | 澝 5733 | 嬃 5734 | 黡 5735 | 沕 5736 | 恝 5737 | 洟 5738 | 秸 5739 | 逑 5740 | 滓 5741 | 緃 5742 | 媢 5743 | 叼 5744 | 霣 5745 | 3 5746 | 慝 5747 | 厍 5748 | 炟 5749 | 皤 5750 | 囐 5751 | 僤 5752 | 硼 5753 | 楸 5754 | 瞀 5755 | 烝 5756 | 炔 5757 | 瓻 5758 | 耙 5759 | 腩 5760 | 醵 5761 | 锽 5762 | 殪 5763 | 樯 5764 | 芡 5765 | ∈ 5766 | ↓ 5767 | 缵 5768 | 伻 5769 | 玊 5770 | 桠 5771 | 觚 5772 | 踯 5773 | 噔 5774 | 碴 5775 | 砣 5776 | 忪 5777 | 藁 5778 | 镒 5779 | 佝 5780 | 峤 5781 | 峣 5782 | 搤 5783 | 汐 5784 | 嗾 5785 | 鞚 5786 | 巂 5787 | 楗 5788 | 呓 5789 | 狒 5790 | 開 5791 | 坻 5792 | 蘧 5793 | 趵 5794 | 榱 5795 | 锷 5796 | 锾 5797 | 隳 5798 | 饟 5799 | 饦 5800 | 馎 5801 | 驵 5802 | 骘 5803 | 髀 5804 | 髑 5805 | 鮼 5806 | 鲑 5807 | 鲔 5808 | 鹘 5809 | 鹚 5810 | ﹔ 5811 | │ 5812 | 刈 5813 | 刖 5814 | 剎 5815 | 啐 5816 | 嘭 5817 | 噌 5818 | 噗 5819 | 嚬 5820 | 嚰 5821 | 圯 5822 | 坳 5823 | 嫄 5824 | 寖 5825 | 尻 5826 | 峋 5827 | 崃 5828 | 嶂 5829 | 嶶 5830 | 帇 5831 | 幤 5832 | 悫 5833 | 慙 5834 | 扌 5835 | 揜 5836 | 撝 5837 | 旳 5838 | 昀 5839 | 昃 5840 | 暹 5841 | 玕 5842 | 琰 5843 | 璆 5844 | 玃 5845 | 疃 5846 | 猃 5847 | 皴 5848 | 狃 5849 | 祊 5850 | 燹 5851 | 燠 5852 | 熛 5853 | 窣 5854 | 窬 5855 | 糌 5856 | 糍 5857 | 紬 5858 | 濩 5859 | 飧 5860 | 肸 5861 | 脲 5862 | 臬 5863 | 芘 5864 | 荜 5865 | 蔫 5866 | 襜 5867 | 觖 5868 | 豭 5869 | 贇 5870 | 氩 5871 | 氖 5872 | 趸 5873 | 檠 5874 | 檇 5875 | 邘 5876 | 鄏 5877 | 酡 5878 | 鑙 5879 | 钴 5880 | 铵 5881 | 氅 5882 | 莜 5883 | 柢 5884 | 悭 5885 | 鄳 5886 | 蒗 5887 | 虺 5888 | 沇 5889 | 薤 5890 | 踹 5891 | 墠 5892 | 唶 5893 | 骍 5894 | 镊 5895 | 镛 5896 | 帨 5897 | 逖 5898 | 氡 5899 | 鹣 5900 | 恹 5901 | 臛 
5902 | 呃 5903 | 幂 5904 | 鹖 5905 | 間 5906 | 磛 5907 | 弢 5908 | 蛐 5909 | 懜 5910 | 凇 5911 | 闟 5912 | 璟 5913 | 遹 5914 | 肓 5915 | 剐 5916 | 垝 5917 | 杅 5918 | 笤 5919 | 佈 5920 | 撷 5921 | 佘 5922 | 嚅 5923 | 蝮 5924 | 谳 5925 | 蚝 5926 | 栀 5927 | 眢 5928 | ∵ 5929 | 蓿 5930 | 枵 5931 | 橪 5932 | 騳 5933 | ≠ 5934 | 蟋 5935 | 嗌 5936 | 玦 5937 | 嗄 5938 | 劙 5939 | 騠 5940 | 鞣 5941 | 唢 5942 | 茆 5943 | 蚰 5944 | 喹 5945 | 趱 5946 | 珅 5947 | 喆 5948 | 谔 5949 | 苄 5950 | 靥 5951 | 鲛 5952 | 洫 5953 | 颀 5954 | 趹 5955 | 蛩 5956 | 馓 5957 | 轫 5958 | 叡 5959 | 蒉 5960 | 睪 5961 | 漦 5962 | 胝 5963 | 瘐 5964 | 逦 5965 | 嶷 5966 | 傕 5967 | 斲 5968 | 嵬 5969 | 缇 5970 | 洙 5971 | 瘵 5972 | 縢 5973 | 渖 5974 | 價 5975 | 灊 5976 | 訇 5977 | 醍 5978 | 膦 5979 | 癜 5980 | 歃 5981 | 钎 5982 | 讵 5983 | 钰 5984 | 嫱 5985 | 婊 5986 | 狝 5987 | 榧 5988 | 脁 5989 | 柞 5990 | -------------------------------------------------------------------------------- /data/demo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/data/demo.png -------------------------------------------------------------------------------- /demo.py: -------------------------------------------------------------------------------- 1 | # coding:utf-8 2 | 3 | ''' 4 | March 2019 by Chen Jun 5 | https://github.com/chenjun2hao/Attention_ocr.pytorch 6 | 7 | ''' 8 | 9 | import torch 10 | from torch.autograd import Variable 11 | import utils 12 | import dataset 13 | from PIL import Image 14 | from utils import alphabet 15 | import models.crnn_lang as crnn 16 | 17 | use_gpu = True 18 | 19 | encoder_path = './expr/attentioncnn/encoder_5.pth' 20 | # decoder_path = './expr/attentioncnn/decoder_5.pth' 21 | img_path = './test_img/20441531_4212871437.jpg' 22 | max_length = 15 # 最长字符串的长度 23 | EOS_TOKEN = 1 24 | 25 | nclass = len(alphabet) + 3 26 | encoder = crnn.CNN(32, 1, 256) # 编码器 27 | # decoder = crnn.decoder(256, nclass) # seq to seq的解码器, nclass在decoder中还加了2 28 | decoder = crnn.decoderV2(256, nclass) 29 | 30 | 31 | if encoder_path and decoder_path: 32 | print('loading pretrained models ......') 33 | encoder.load_state_dict(torch.load(encoder_path)) 34 | decoder.load_state_dict(torch.load(decoder_path)) 35 | if torch.cuda.is_available() and use_gpu: 36 | encoder = encoder.cuda() 37 | decoder = decoder.cuda() 38 | 39 | 40 | converter = utils.strLabelConverterForAttention(alphabet) 41 | 42 | transformer = dataset.resizeNormalize((280, 32)) 43 | image = Image.open(img_path).convert('L') 44 | image = transformer(image) 45 | if torch.cuda.is_available() and use_gpu: 46 | image = image.cuda() 47 | image = image.view(1, *image.size()) 48 | image = Variable(image) 49 | 50 | encoder.eval() 51 | decoder.eval() 52 | encoder_out = encoder(image) 53 | 54 | decoded_words = [] 55 | prob = 1.0 56 | decoder_attentions = torch.zeros(max_length, 71) 57 | decoder_input = torch.zeros(1).long() # 初始化decoder的开始,从0开始输出 58 | decoder_hidden = decoder.initHidden(1) 59 | if torch.cuda.is_available() and use_gpu: 60 | decoder_input = decoder_input.cuda() 61 | decoder_hidden = decoder_hidden.cuda() 62 | loss = 0.0 63 | # 预测的时候采用非强制策略,将前一次的输出,作为下一次的输入,直到标签为EOS_TOKEN时停止 64 | for di in range(max_length): # 最大字符串的长度 65 | decoder_output, decoder_hidden, decoder_attention = decoder( 66 | decoder_input, decoder_hidden, encoder_out) 67 | probs = torch.exp(decoder_output) 68 | decoder_attentions[di] = decoder_attention.data 69 | topv, topi = decoder_output.data.topk(1) 70 | ni = topi.squeeze(1) 71 | decoder_input = ni 
72 | prob *= probs[:, ni] 73 | if ni == EOS_TOKEN: 74 | # decoded_words.append('') 75 | break 76 | else: 77 | decoded_words.append(converter.decode(ni)) 78 | 79 | words = ''.join(decoded_words) 80 | prob = prob.item() 81 | print('predict_str:%-20s => prob:%-20s' % (words, prob)) 82 | -------------------------------------------------------------------------------- /expr/attentioncnn/encoder_600.pth: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/expr/attentioncnn/encoder_600.pth -------------------------------------------------------------------------------- /models/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/models/__init__.py -------------------------------------------------------------------------------- /models/crnn.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | import torch.nn.functional as F 4 | from torch.autograd import Variable 5 | 6 | 7 | class BidirectionalLSTM(nn.Module): 8 | 9 | def __init__(self, nIn, nHidden, nOut): 10 | super(BidirectionalLSTM, self).__init__() 11 | 12 | self.rnn = nn.LSTM(nIn, nHidden, bidirectional=True) 13 | self.embedding = nn.Linear(nHidden * 2, nOut) 14 | 15 | def forward(self, input): 16 | recurrent, _ = self.rnn(input) 17 | T, b, h = recurrent.size() 18 | t_rec = recurrent.view(T * b, h) 19 | 20 | output = self.embedding(t_rec) # [T * b, nOut] 21 | output = output.view(T, b, -1) 22 | 23 | return output 24 | 25 | class AttentionCell(nn.Module): 26 | def __init__(self, input_size, hidden_size): 27 | super(AttentionCell, self).__init__() 28 | self.i2h = nn.Linear(input_size, hidden_size,bias=False) 29 | self.h2h = nn.Linear(hidden_size, hidden_size) 30 | self.score = nn.Linear(hidden_size, 1, bias=False) 31 | self.rnn = nn.GRUCell(input_size, hidden_size) 32 | self.hidden_size = hidden_size 33 | self.input_size = input_size 34 | self.processed_batches = 0 35 | 36 | def forward(self, prev_hidden, feats): 37 | self.processed_batches = self.processed_batches + 1 38 | nT = feats.size(0) 39 | nB = feats.size(1) 40 | nC = feats.size(2) 41 | hidden_size = self.hidden_size 42 | input_size = self.input_size 43 | 44 | feats_proj = self.i2h(feats.view(-1,nC)) 45 | prev_hidden_proj = self.h2h(prev_hidden).view(1,nB, hidden_size).expand(nT, nB, hidden_size).contiguous().view(-1, hidden_size) 46 | emition = self.score(torch.tanh(feats_proj + prev_hidden_proj).view(-1, hidden_size)).view(nT,nB).transpose(0,1) 47 | alpha = F.softmax(emition, dim=1) # nB * nT 48 | 49 | if self.processed_batches % 10000 == 0: 50 | print('emition ', list(emition.data[0])) 51 | print('alpha ', list(alpha.data[0])) 52 | 53 | context = (feats * alpha.transpose(0,1).contiguous().view(nT,nB,1).expand(nT, nB, nC)).sum(0).squeeze(0) 54 | cur_hidden = self.rnn(context, prev_hidden) 55 | return cur_hidden, alpha 56 | 57 | class Attention(nn.Module): 58 | def __init__(self, input_size, hidden_size, num_classes): 59 | super(Attention, self).__init__() 60 | self.attention_cell = AttentionCell(input_size, hidden_size) 61 | self.input_size = input_size 62 | self.hidden_size = hidden_size 63 | self.generator = nn.Linear(hidden_size, num_classes) 64 | self.processed_batches = 0 65 | 66 | def forward(self, feats, 
text_length): 67 | self.processed_batches = self.processed_batches + 1 68 | nT = feats.size(0) 69 | nB = feats.size(1) 70 | nC = feats.size(2) 71 | hidden_size = self.hidden_size 72 | input_size = self.input_size 73 | assert(input_size == nC) 74 | assert(nB == text_length.numel()) 75 | 76 | num_steps = text_length.data.max() 77 | num_labels = text_length.data.sum() 78 | 79 | output_hiddens = Variable(torch.zeros(num_steps, nB, hidden_size).type_as(feats.data)) 80 | hidden = Variable(torch.zeros(nB,hidden_size).type_as(feats.data)) 81 | max_locs = torch.zeros(num_steps, nB) 82 | max_vals = torch.zeros(num_steps, nB) 83 | for i in range(num_steps): 84 | hidden, alpha = self.attention_cell(hidden, feats) 85 | output_hiddens[i] = hidden 86 | if self.processed_batches % 500 == 0: 87 | max_val, max_loc = alpha.data.max(1) 88 | max_locs[i] = max_loc.cpu() 89 | max_vals[i] = max_val.cpu() 90 | if self.processed_batches % 500 == 0: 91 | print('max_locs', list(max_locs[0:text_length.data[0],0])) 92 | print('max_vals', list(max_vals[0:text_length.data[0],0])) 93 | new_hiddens = Variable(torch.zeros(num_labels, hidden_size).type_as(feats.data)) 94 | b = 0 95 | start = 0 96 | for length in text_length.data: 97 | new_hiddens[start:start+length] = output_hiddens[0:length,b,:] 98 | start = start + length 99 | b = b + 1 100 | probs = self.generator(new_hiddens) 101 | return probs 102 | 103 | class CRNN(nn.Module): 104 | 105 | def __init__(self, imgH, nc, nclass, nh, n_rnn=2, leakyRelu=False): 106 | super(CRNN, self).__init__() 107 | assert imgH % 16 == 0, 'imgH has to be a multiple of 16' 108 | 109 | self.cnn = nn.Sequential( 110 | nn.Conv2d(nc, 64, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d(2, 2), # 64x16x50 111 | nn.Conv2d(64, 128, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d(2, 2), # 128x8x25 112 | nn.Conv2d(128, 256, 3, 1, 1), nn.BatchNorm2d(256), nn.ReLU(True), # 256x8x25 113 | nn.Conv2d(256, 256, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d((2,2), (2,1), (0,1)), # 256x4x25 114 | nn.Conv2d(256, 512, 3, 1, 1), nn.BatchNorm2d(512), nn.ReLU(True), # 512x4x25 115 | nn.Conv2d(512, 512, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d((2,2), (2,1), (0,1)), # 512x2x25 116 | nn.Conv2d(512, 512, 2, 1, 0), nn.BatchNorm2d(512), nn.ReLU(True)) # 512x1x25 117 | #self.cnn = cnn 118 | self.rnn = nn.Sequential( 119 | BidirectionalLSTM(512, nh, nh), 120 | BidirectionalLSTM(nh, nh, nh)) 121 | self.attention = Attention(nh, nh, nclass) 122 | 123 | def forward(self, input, length): 124 | # conv features 125 | conv = self.cnn(input) 126 | b, c, h, w = conv.size() 127 | assert h == 1, "the height of conv must be 1" 128 | conv = conv.squeeze(2) 129 | conv = conv.permute(2, 0, 1) # [w, b, c] 130 | 131 | # rnn features 132 | rnn = self.rnn(conv) 133 | output = self.attention(rnn, length) 134 | 135 | return output 136 | -------------------------------------------------------------------------------- /models/crnn_lang.py: -------------------------------------------------------------------------------- 1 | # coding:utf-8 2 | import torch 3 | import torch.nn as nn 4 | import torch.nn.functional as F 5 | from torch.autograd import Variable 6 | from torch.nn.parameter import Parameter 7 | 8 | GO = 0 9 | EOS_TOKEN = 1 # 结束标志的标签 10 | 11 | class BidirectionalLSTM(nn.Module): 12 | 13 | def __init__(self, nIn, nHidden, nOut): 14 | super(BidirectionalLSTM, self).__init__() 15 | 16 | self.rnn = nn.LSTM(nIn, nHidden, bidirectional=True) 17 | self.embedding = nn.Linear(nHidden * 2, nOut) 18 | 19 | def forward(self, input): 20 | recurrent, _ = self.rnn(input) 21 | T, b, 
h = recurrent.size() 22 | t_rec = recurrent.view(T * b, h) 23 | 24 | output = self.embedding(t_rec) # [T * b, nOut] 25 | output = output.view(T, b, -1) 26 | 27 | return output 28 | 29 | 30 | class AttentionCell(nn.Module): 31 | def __init__(self, input_size, hidden_size, num_embeddings=128): 32 | super(AttentionCell, self).__init__() 33 | self.i2h = nn.Linear(input_size, hidden_size,bias=False) 34 | self.h2h = nn.Linear(hidden_size, hidden_size) 35 | self.score = nn.Linear(hidden_size, 1, bias=False) 36 | self.rnn = nn.GRUCell(input_size+num_embeddings, hidden_size) 37 | self.hidden_size = hidden_size 38 | self.input_size = input_size 39 | self.num_embeddings = num_embeddings 40 | self.processed_batches = 0 41 | 42 | def forward(self, prev_hidden, feats, cur_embeddings): 43 | nT = feats.size(0) 44 | nB = feats.size(1) 45 | nC = feats.size(2) 46 | hidden_size = self.hidden_size 47 | input_size = self.input_size 48 | 49 | feats_proj = self.i2h(feats.view(-1,nC)) 50 | prev_hidden_proj = self.h2h(prev_hidden).view(1,nB, hidden_size).expand(nT, nB, hidden_size).contiguous().view(-1, hidden_size) 51 | emition = self.score(torch.tanh(feats_proj + prev_hidden_proj).view(-1, hidden_size)).view(nT,nB).transpose(0,1) 52 | self.processed_batches = self.processed_batches + 1 53 | 54 | if self.processed_batches % 10000 == 0: 55 | print('processed_batches = %d' %(self.processed_batches)) 56 | 57 | alpha = F.softmax(emition) # nB * nT 58 | if self.processed_batches % 10000 == 0: 59 | print('emition ', list(emition.data[0])) 60 | print('alpha ', list(alpha.data[0])) 61 | context = (feats * alpha.transpose(0,1).contiguous().view(nT,nB,1).expand(nT, nB, nC)).sum(0).squeeze(0) # nB * nC//感觉不应该sum,输出4×256 62 | context = torch.cat([context, cur_embeddings], 1) 63 | cur_hidden = self.rnn(context, prev_hidden) 64 | return cur_hidden, alpha 65 | 66 | 67 | class DecoderRNN(nn.Module): 68 | """ 69 | 采用RNN进行解码 70 | """ 71 | def __init__(self, hidden_size, output_size): 72 | super(DecoderRNN, self).__init__() 73 | self.hidden_size = hidden_size 74 | 75 | self.embedding = nn.Embedding(output_size, hidden_size) 76 | self.gru = nn.GRU(hidden_size, hidden_size) 77 | self.out = nn.Linear(hidden_size, output_size) 78 | self.softmax = nn.LogSoftmax(dim=1) 79 | 80 | def forward(self, input, hidden): 81 | output = self.embedding(input).view(1, 1, -1) 82 | output = F.relu(output) 83 | output, hidden = self.gru(output, hidden) 84 | output = self.softmax(self.out(output[0])) 85 | return output, hidden 86 | 87 | def initHidden(self): 88 | result = Variable(torch.zeros(1, 1, self.hidden_size)) 89 | 90 | return result 91 | 92 | 93 | class Attentiondecoder(nn.Module): 94 | """ 95 | 采用attention注意力机制,进行解码 96 | """ 97 | def __init__(self, hidden_size, output_size, dropout_p=0.1, max_length=71): 98 | super(Attentiondecoder, self).__init__() 99 | self.hidden_size = hidden_size 100 | self.output_size = output_size 101 | self.dropout_p = dropout_p 102 | self.max_length = max_length 103 | 104 | self.embedding = nn.Embedding(self.output_size, self.hidden_size) 105 | self.attn = nn.Linear(self.hidden_size * 2, self.max_length) 106 | self.attn_combine = nn.Linear(self.hidden_size * 2, self.hidden_size) 107 | self.dropout = nn.Dropout(self.dropout_p) 108 | self.gru = nn.GRU(self.hidden_size, self.hidden_size) 109 | self.out = nn.Linear(self.hidden_size, self.output_size) 110 | 111 | def forward(self, input, hidden, encoder_outputs): 112 | # calculate the attention weight and weight * encoder_output feature 113 | embedded = self.embedding(input) 
# 前一次的输出进行词嵌入 114 | embedded = self.dropout(embedded) 115 | 116 | attn_weights = F.softmax( 117 | self.attn(torch.cat((embedded, hidden[0]), 1)), dim=1) # 上一次的输出和隐藏状态求出权重, 主要使用一个linear layer从512维到71维,所以只能处理固定宽度的序列 118 | attn_applied = torch.matmul(attn_weights.unsqueeze(1), 119 | encoder_outputs.permute((1, 0, 2))) # 矩阵乘法,bmm(8×1×56,8×56×256)=8×1×256 120 | 121 | output = torch.cat((embedded, attn_applied.squeeze(1) ), 1) # 上一次的输出和attention feature做一个融合,再加一个linear layer 122 | output = self.attn_combine(output).unsqueeze(0) 123 | 124 | output = F.relu(output) 125 | output, hidden = self.gru(output, hidden) # just as sequence to sequence decoder 126 | 127 | output = F.log_softmax(self.out(output[0]), dim=1) # use log_softmax for nllloss 128 | return output, hidden, attn_weights 129 | 130 | def initHidden(self, batch_size): 131 | result = Variable(torch.zeros(1, batch_size, self.hidden_size)) 132 | 133 | return result 134 | 135 | 136 | def target_txt_decode(batch_size, text_length, text): 137 | ''' 138 | 对target txt每个字符串的开始加上GO,最后加上EOS,并用最长的字符串做对齐 139 | return: 140 | targets: num_steps+1 * batch_size 141 | ''' 142 | nB = batch_size # batch 143 | 144 | # 将text分离出来 145 | num_steps = text_length.data.max() 146 | num_steps = int(num_steps.cpu().numpy()) 147 | targets = torch.ones(nB, num_steps + 2) * 2 # 用$符号填充较短的字符串, 在最开始加上GO,结束加上EOS_TOKEN 148 | targets = targets.long().cuda() # 用 149 | start_id = 0 150 | for i in range(nB): 151 | targets[i][0] = GO # 在开始的加上开始标签 152 | targets[i][1:text_length.data[i] + 1] = text.data[start_id:start_id+text_length.data[i]] # 是否要加1 153 | targets[i][text_length.data[i] + 1] = EOS_TOKEN # 加上结束标签 154 | start_id = start_id+text_length.data[i] # 拆分每个目标的target label,为:batch×最长字符的numel 155 | targets = Variable(targets.transpose(0, 1).contiguous()) 156 | return targets 157 | 158 | 159 | class CNN(nn.Module): 160 | ''' 161 | CNN+BiLstm做特征提取 162 | ''' 163 | def __init__(self, imgH, nc, nh): 164 | super(CNN, self).__init__() 165 | assert imgH % 16 == 0, 'imgH has to be a multiple of 16' 166 | 167 | self.cnn = nn.Sequential( 168 | nn.Conv2d(nc, 64, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d(2, 2), # 64x16x50 169 | nn.Conv2d(64, 128, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d(2, 2), # 128x8x25 170 | nn.Conv2d(128, 256, 3, 1, 1), nn.BatchNorm2d(256), nn.ReLU(True), # 256x8x25 171 | nn.Conv2d(256, 256, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d((2,2), (2,1), (0,1)), # 256x4x25 172 | nn.Conv2d(256, 512, 3, 1, 1), nn.BatchNorm2d(512), nn.ReLU(True), # 512x4x25 173 | nn.Conv2d(512, 512, 3, 1, 1), nn.ReLU(True), nn.MaxPool2d((2,2), (2,1), (0,1)), # 512x2x25 174 | nn.Conv2d(512, 512, 2, 1, 0), nn.BatchNorm2d(512), nn.ReLU(True)) # 512x1x25 175 | self.rnn = nn.Sequential( 176 | BidirectionalLSTM(512, nh, nh), 177 | BidirectionalLSTM(nh, nh, nh)) 178 | 179 | def forward(self, input): 180 | # conv features 181 | conv = self.cnn(input) 182 | b, c, h, w = conv.size() 183 | assert h == 1, "the height of conv must be 1" 184 | conv = conv.squeeze(2) 185 | conv = conv.permute(2, 0, 1) # [w, b, c] 186 | 187 | # rnn features calculate 188 | encoder_outputs = self.rnn(conv) # seq * batch * n_classes// 25 × batchsize × 256(隐藏节点个数) 189 | 190 | return encoder_outputs 191 | 192 | 193 | class decoder(nn.Module): 194 | ''' 195 | decoder from image features 196 | ''' 197 | def __init__(self, nh=256, nclass=13, dropout_p=0.1, max_length=71): 198 | super(decoder, self).__init__() 199 | self.hidden_size = nh 200 | self.decoder = Attentiondecoder(nh, nclass, dropout_p, max_length) 201 | 202 | def forward(self, input, 
hidden, encoder_outputs): 203 | return self.decoder(input, hidden, encoder_outputs) 204 | 205 | def initHidden(self, batch_size): 206 | result = Variable(torch.zeros(1, batch_size, self.hidden_size)) 207 | return result 208 | 209 | 210 | class AttentiondecoderV2(nn.Module): 211 | """ 212 | 采用seq to seq模型,修改注意力权重的计算方式 213 | """ 214 | def __init__(self, hidden_size, output_size, dropout_p=0.1): 215 | super(AttentiondecoderV2, self).__init__() 216 | self.hidden_size = hidden_size 217 | self.output_size = output_size 218 | self.dropout_p = dropout_p 219 | 220 | self.embedding = nn.Embedding(self.output_size, self.hidden_size) 221 | self.attn_combine = nn.Linear(self.hidden_size * 2, self.hidden_size) 222 | self.dropout = nn.Dropout(self.dropout_p) 223 | self.gru = nn.GRU(self.hidden_size, self.hidden_size) 224 | self.out = nn.Linear(self.hidden_size, self.output_size) 225 | 226 | # test 227 | self.vat = nn.Linear(hidden_size, 1) 228 | 229 | def forward(self, input, hidden, encoder_outputs): 230 | embedded = self.embedding(input) # 前一次的输出进行词嵌入 231 | embedded = self.dropout(embedded) 232 | 233 | # test 234 | batch_size = encoder_outputs.shape[1] 235 | alpha = hidden + encoder_outputs # 特征融合采用+/concat其实都可以 236 | alpha = alpha.view(-1, alpha.shape[-1]) 237 | attn_weights = self.vat( torch.tanh(alpha)) # 将encoder_output:batch*seq*features,将features的维度降为1 238 | attn_weights = attn_weights.view(-1, 1, batch_size).permute((2,1,0)) 239 | attn_weights = F.softmax(attn_weights, dim=2) 240 | 241 | # attn_weights = F.softmax( 242 | # self.attn(torch.cat((embedded, hidden[0]), 1)), dim=1) # 上一次的输出和隐藏状态求出权重 243 | 244 | attn_applied = torch.matmul(attn_weights, 245 | encoder_outputs.permute((1, 0, 2))) # 矩阵乘法,bmm(8×1×56,8×56×256)=8×1×256 246 | output = torch.cat((embedded, attn_applied.squeeze(1) ), 1) # 上一次的输出和attention feature,做一个线性+GRU 247 | output = self.attn_combine(output).unsqueeze(0) 248 | 249 | output = F.relu(output) 250 | output, hidden = self.gru(output, hidden) 251 | 252 | output = F.log_softmax(self.out(output[0]), dim=1) # 最后输出一个概率 253 | return output, hidden, attn_weights 254 | 255 | def initHidden(self, batch_size): 256 | result = Variable(torch.zeros(1, batch_size, self.hidden_size)) 257 | 258 | return result 259 | 260 | 261 | class decoderV2(nn.Module): 262 | ''' 263 | decoder from image features 264 | ''' 265 | 266 | def __init__(self, nh=256, nclass=13, dropout_p=0.1): 267 | super(decoderV2, self).__init__() 268 | self.hidden_size = nh 269 | self.decoder = AttentiondecoderV2(nh, nclass, dropout_p) 270 | 271 | def forward(self, input, hidden, encoder_outputs): 272 | return self.decoder(input, hidden, encoder_outputs) 273 | 274 | def initHidden(self, batch_size): 275 | result = Variable(torch.zeros(1, batch_size, self.hidden_size)) 276 | return result 277 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | certifi==2018.11.29 2 | cycler==0.10.0 3 | kiwisolver==1.0.1 4 | lxml==4.2.5 5 | matplotlib==3.0.2 6 | numpy==1.15.4 7 | opencv-python==3.4.4.19 8 | Pillow==5.3.0 9 | pyparsing==2.3.0 10 | six==1.12.0 11 | torch==0.4.1 12 | torchfile==0.1.0 13 | torchvision==0.2.1 14 | -------------------------------------------------------------------------------- /src/__init__.py: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/src/__init__.py -------------------------------------------------------------------------------- /src/class_attention.py: -------------------------------------------------------------------------------- 1 | # coding:utf-8 2 | import utils 3 | import torch 4 | import cv2 5 | import numpy as np 6 | from PIL import Image 7 | import torchvision.transforms as transforms 8 | import models.crnn_lang as crnn 9 | 10 | class attention_ocr(): 11 | '''使用attention_ocr进行字符识别 12 | 返回: 13 | ocr读数,置信度 14 | ''' 15 | def __init__(self): 16 | encoder_path = './expr/attentioncnn/encoder_600.pth' 17 | decoder_path = './expr/attentioncnn/decoder_600.pth' 18 | self.alphabet = '0123456789' 19 | self.max_length = 7 # 最长字符串的长度 20 | self.EOS_TOKEN = 1 21 | self.use_gpu = True 22 | self.max_width = 220 23 | self.converter = utils.strLabelConverterForAttention(self.alphabet) 24 | self.transform = transforms.ToTensor() 25 | 26 | nclass = len(self.alphabet) + 3 27 | encoder = crnn.CNN(32, 1, 256) # 编码器 28 | decoder = crnn.decoder(256, nclass) # seq to seq的解码器, nclass在decoder中还加了2 29 | 30 | if encoder_path and decoder_path: 31 | print('loading pretrained models ......') 32 | encoder.load_state_dict(torch.load(encoder_path)) 33 | decoder.load_state_dict(torch.load(decoder_path)) 34 | if torch.cuda.is_available() and self.use_gpu: 35 | encoder = encoder.cuda() 36 | decoder = decoder.cuda() 37 | self.encoder = encoder.eval() 38 | self.decoder = decoder.eval() 39 | 40 | def constant_pad(self, img_crop): 41 | '''把图片等比例缩放到高度:32,再或resize填充到220宽度 42 | img_crop: 43 | cv2图片,rgb顺序 44 | 返回: 45 | img tensor 46 | ''' 47 | h, w, c = img_crop.shape 48 | ratio = h / 32 49 | new_w = int(w / ratio) 50 | new_img = cv2.resize(img_crop,(new_w, 32)) 51 | container = np.ones((32, self.max_width, 3), dtype=np.uint8) * new_img[-3,-3,:] 52 | if new_w <= self.max_width: 53 | container[:,:new_w,:] = new_img 54 | elif new_w > self.max_width: 55 | container = cv2.resize(new_img, (self.max_width, 32)) 56 | 57 | img = Image.fromarray(container.astype('uint8')).convert('L') 58 | img = self.transform(img) 59 | img.sub_(0.5).div_(0.5) 60 | if self.use_gpu: 61 | img = img.cuda() 62 | return img.unsqueeze(0) 63 | 64 | def predict(self, img_crop): 65 | '''attention ocr 做文字识别 66 | img_crop: 67 | cv2图片,rgb顺序 68 | 返回: 69 | ocr读数,prob置信度 70 | ''' 71 | img_tensor = self.constant_pad(img_crop) 72 | encoder_out = self.encoder(img_tensor) 73 | 74 | decoded_words = [] 75 | prob = 1.0 76 | decoder_input = torch.zeros(1).long() # 初始化decoder的开始,从0开始输出 77 | decoder_hidden = self.decoder.initHidden(1) 78 | if torch.cuda.is_available() and self.use_gpu: 79 | decoder_input = decoder_input.cuda() 80 | decoder_hidden = decoder_hidden.cuda() 81 | # 预测的时候采用非强制策略,将前一次的输出,作为下一次的输入,直到标签为EOS_TOKEN时停止 82 | for di in range(self.max_length): # 最大字符串的长度 83 | decoder_output, decoder_hidden, decoder_attention = self.decoder( 84 | decoder_input, decoder_hidden, encoder_out) 85 | probs = torch.exp(decoder_output) 86 | # decoder_attentions[di] = decoder_attention.data 87 | topv, topi = decoder_output.data.topk(1) 88 | ni = topi.squeeze(1) 89 | decoder_input = ni 90 | prob *= probs[:, ni] 91 | if ni == self.EOS_TOKEN: 92 | # decoded_words.append('') 93 | break 94 | else: 95 | decoded_words.append(self.converter.decode(ni)) 96 | 97 | words = ''.join(decoded_words) 98 | prob = prob.item() 99 | 100 | return words, prob 101 | 102 | if __name__ == '__main__': 103 | path = 
'./test_img/00027_299021_27.jpg' 104 | img = cv2.imread(path) 105 | attention = attention_ocr() 106 | res = attention.predict(img) 107 | print(res) -------------------------------------------------------------------------------- /src/dataset.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | # encoding: utf-8 3 | 4 | import random 5 | import torch 6 | from torch.utils.data import Dataset 7 | from torch.utils.data import sampler 8 | import torchvision.transforms as transforms 9 | # import lmdb 10 | import six 11 | import sys 12 | from PIL import Image 13 | import numpy as np 14 | 15 | 16 | class listDataset(Dataset): 17 | def __init__(self, list_file=None, transform=None, target_transform=None): 18 | self.list_file = list_file 19 | with open(list_file) as fp: 20 | self.lines = fp.readlines() 21 | self.nSamples = len(self.lines) 22 | 23 | self.transform = transform 24 | self.target_transform = target_transform 25 | 26 | def __len__(self): 27 | return self.nSamples 28 | 29 | def __getitem__(self, index): 30 | assert index <= len(self), 'index range error' 31 | 32 | line_splits = self.lines[index].strip().split(' ') 33 | imgpath = line_splits[0] 34 | try: 35 | if 'train' in self.list_file: 36 | img = Image.open(imgpath).convert('L') 37 | else: 38 | img = Image.open(imgpath).convert('L') 39 | except IOError: 40 | print('Corrupted image for %d' % index) 41 | return self[index + 1] 42 | 43 | if self.transform is not None: 44 | img = self.transform(img) 45 | 46 | label = line_splits[1].decode('utf-8') 47 | 48 | if self.target_transform is not None: 49 | label = self.target_transform(label) 50 | 51 | return (img, label) 52 | 53 | class resizeNormalize(object): 54 | 55 | def __init__(self, size, interpolation=Image.BILINEAR): 56 | self.size = size 57 | self.interpolation = interpolation 58 | self.toTensor = transforms.ToTensor() 59 | 60 | def __call__(self, img): 61 | img = img.resize(self.size, self.interpolation) 62 | img = self.toTensor(img) 63 | img.sub_(0.5).div_(0.5) 64 | return img 65 | 66 | 67 | class randomSequentialSampler(sampler.Sampler): 68 | 69 | def __init__(self, data_source, batch_size): 70 | self.num_samples = len(data_source) 71 | self.batch_size = batch_size 72 | 73 | def __iter__(self): 74 | n_batch = len(self) // self.batch_size 75 | tail = len(self) % self.batch_size 76 | index = torch.LongTensor(len(self)).fill_(0) 77 | for i in range(n_batch): 78 | random_start = random.randint(0, len(self) - self.batch_size) 79 | batch_index = random_start + torch.arange(0, self.batch_size) 80 | index[i * self.batch_size:(i + 1) * self.batch_size] = batch_index 81 | # deal with tail 82 | if tail: 83 | random_start = random.randint(0, len(self) - self.batch_size) 84 | tail_index = random_start + torch.arange(0, tail) 85 | index[(i + 1) * self.batch_size:] = tail_index 86 | 87 | return iter(index) 88 | 89 | def __len__(self): 90 | return self.num_samples 91 | 92 | 93 | class alignCollate(object): 94 | 95 | def __init__(self, imgH=32, imgW=100, keep_ratio=False, min_ratio=1): 96 | self.imgH = imgH 97 | self.imgW = imgW 98 | self.keep_ratio = keep_ratio 99 | self.min_ratio = min_ratio 100 | 101 | def __call__(self, batch): 102 | images, labels = zip(*batch) 103 | 104 | imgH = self.imgH 105 | imgW = self.imgW 106 | if self.keep_ratio: 107 | ratios = [] 108 | for image in images: 109 | w, h = image.size 110 | ratios.append(w / float(h)) 111 | ratios.sort() 112 | max_ratio = ratios[-1] 113 | imgW = int(np.floor(max_ratio * imgH)) 114 | 
imgW = max(imgH * self.min_ratio, imgW) # assure imgH >= imgW 115 | 116 | transform = resizeNormalize((imgW, imgH)) 117 | images = [transform(image) for image in images] 118 | images = torch.cat([t.unsqueeze(0) for t in images], 0) 119 | 120 | return images, labels 121 | -------------------------------------------------------------------------------- /src/utils.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | # encoding: utf-8 3 | # -*- encoding: utf-8 -*- 4 | 5 | import torch 6 | import torch.nn as nn 7 | from torch.autograd import Variable 8 | import collections 9 | from PIL import Image, ImageFilter 10 | import math 11 | import random 12 | import numpy as np 13 | import cv2 14 | 15 | with open('./data/char_std_5990.txt') as f: 16 | data = f.readlines() 17 | alphabet = [x.rstrip() for x in data] 18 | alphabet = ''.join(alphabet).decode('utf-8') # python2不加decode的时候会乱码 19 | 20 | 21 | class strLabelConverterForAttention(object): 22 | """Convert between str and label. 23 | 24 | NOTE: 25 | Insert `EOS` to the alphabet for attention. 26 | 27 | Args: 28 | alphabet (str): set of the possible characters. 29 | ignore_case (bool, default=True): whether or not to ignore all of the case. 30 | """ 31 | 32 | def __init__(self, alphabet): 33 | self.alphabet = alphabet 34 | 35 | self.dict = {} 36 | self.dict['SOS'] = 0 # 开始 37 | self.dict['EOS'] = 1 # 结束 38 | self.dict['$'] = 2 # blank标识符 39 | for i, item in enumerate(self.alphabet): 40 | # NOTE: 0 is reserved for 'blank' required by wrap_ctc 41 | self.dict[item] = i + 3 # 从3开始编码 42 | 43 | def encode(self, text): 44 | """对target_label做编码和对齐 45 | 对target txt每个字符串的开始加上GO,最后加上EOS,并用最长的字符串做对齐 46 | Args: 47 | text (str or list of str): texts to convert. 48 | 49 | Returns: 50 | torch.IntTensor targets:max_length × batch_size 51 | """ 52 | if isinstance(text, unicode): 53 | text = [self.dict[item] for item in text] 54 | elif isinstance(text, collections.Iterable): 55 | text = [self.encode(s) for s in text] # 编码 56 | 57 | max_length = max([len(x) for x in text]) # 对齐 58 | nb = len(text) 59 | targets = torch.ones(nb, max_length + 2) * 2 # use ‘blank’ for pading 60 | for i in range(nb): 61 | targets[i][0] = 0 # 开始 62 | targets[i][1:len(text[i]) + 1] = text[i] 63 | targets[i][len(text[i]) + 1] = 1 64 | text = targets.transpose(0, 1).contiguous() 65 | text = text.long() 66 | return torch.LongTensor(text) 67 | 68 | def decode(self, t): 69 | """Decode encoded texts back into strs. 70 | 71 | Args: 72 | torch.IntTensor [length_0 + length_1 + ... length_{n - 1}]: encoded texts. 73 | torch.IntTensor [n]: length of each text. 74 | 75 | Raises: 76 | AssertionError: when the texts and its length does not match. 77 | 78 | Returns: 79 | text (str or list of str): texts to convert. 80 | """ 81 | 82 | texts = list(self.dict.keys())[list(self.dict.values()).index(t)] 83 | return texts 84 | 85 | class strLabelConverterForCTC(object): 86 | """Convert between str and label. 87 | 88 | NOTE: 89 | Insert `blank` to the alphabet for CTC. 90 | 91 | Args: 92 | alphabet (str): set of the possible characters. 93 | ignore_case (bool, default=True): whether or not to ignore all of the case. 
94 | """ 95 | 96 | def __init__(self, alphabet, sep): 97 | self.sep = sep 98 | self.alphabet = alphabet.split(sep) 99 | self.alphabet.append('-') # for `-1` index 100 | 101 | self.dict = {} 102 | for i, item in enumerate(self.alphabet): 103 | # NOTE: 0 is reserved for 'blank' required by wrap_ctc 104 | self.dict[item] = i + 1 105 | 106 | def encode(self, text): 107 | """Support batch or single str. 108 | 109 | Args: 110 | text (str or list of str): texts to convert. 111 | 112 | Returns: 113 | torch.IntTensor [length_0 + length_1 + ... length_{n - 1}]: encoded texts. 114 | torch.IntTensor [n]: length of each text. 115 | """ 116 | if isinstance(text, str): 117 | text = text.split(self.sep) 118 | text = [self.dict[item] for item in text] 119 | length = [len(text)] 120 | elif isinstance(text, collections.Iterable): 121 | length = [len(s.split(self.sep)) for s in text] 122 | text = self.sep.join(text) 123 | text, _ = self.encode(text) 124 | return (torch.IntTensor(text), torch.IntTensor(length)) 125 | 126 | def decode(self, t, length, raw=False): 127 | """Decode encoded texts back into strs. 128 | 129 | Args: 130 | torch.IntTensor [length_0 + length_1 + ... length_{n - 1}]: encoded texts. 131 | torch.IntTensor [n]: length of each text. 132 | 133 | Raises: 134 | AssertionError: when the texts and its length does not match. 135 | 136 | Returns: 137 | text (str or list of str): texts to convert. 138 | """ 139 | if length.numel() == 1: 140 | length = length[0] 141 | assert t.numel() == length, "text with length: {} does not match declared length: {}".format(t.numel(), length) 142 | if raw: 143 | return ''.join([self.alphabet[i - 1] for i in t]) 144 | else: 145 | char_list = [] 146 | for i in range(length): 147 | if t[i] != 0 and (not (i > 0 and t[i - 1] == t[i])): 148 | char_list.append(self.alphabet[t[i] - 1]) 149 | return ''.join(char_list) 150 | else: 151 | # batch mode 152 | assert t.numel() == length.sum(), "texts with length: {} does not match declared length: {}".format(t.numel(), length.sum()) 153 | texts = [] 154 | index = 0 155 | for i in range(length.numel()): 156 | l = length[i] 157 | texts.append( 158 | self.decode( 159 | t[index:index + l], torch.IntTensor([l]), raw=raw)) 160 | index += l 161 | return texts 162 | 163 | 164 | class averager(object): 165 | """Compute average for `torch.Variable` and `torch.Tensor`. 
""" 166 | 167 | def __init__(self): 168 | self.reset() 169 | 170 | def add(self, v): 171 | if isinstance(v, Variable): 172 | count = v.data.numel() 173 | v = v.data.sum() 174 | elif isinstance(v, torch.Tensor): 175 | count = v.numel() 176 | v = v.sum() 177 | 178 | self.n_count += count 179 | self.sum += v 180 | 181 | def reset(self): 182 | self.n_count = 0 183 | self.sum = 0 184 | 185 | def val(self): 186 | res = 0 187 | if self.n_count != 0: 188 | res = self.sum / float(self.n_count) 189 | return res 190 | 191 | 192 | def oneHot(v, v_length, nc): 193 | batchSize = v_length.size(0) 194 | maxLength = v_length.max() 195 | v_onehot = torch.FloatTensor(batchSize, maxLength, nc).fill_(0) 196 | acc = 0 197 | for i in range(batchSize): 198 | length = v_length[i] 199 | label = v[acc:acc + length].view(-1, 1).long() 200 | v_onehot[i, :length].scatter_(1, label, 1.0) 201 | acc += length 202 | return v_onehot 203 | 204 | 205 | def loadData(v, data): 206 | v.data.resize_(data.size()).copy_(data) 207 | 208 | 209 | def prettyPrint(v): 210 | print('Size {0}, Type: {1}'.format(str(v.size()), v.data.type())) 211 | print('| Max: %f | Min: %f | Mean: %f' % (v.max().data[0], v.min().data[0], 212 | v.mean().data[0])) 213 | 214 | 215 | def assureRatio(img): 216 | """Ensure imgH <= imgW.""" 217 | b, c, h, w = img.size() 218 | if h > w: 219 | main = nn.UpsamplingBilinear2d(size=(h, h), scale_factor=None) 220 | img = main(img) 221 | return img 222 | 223 | 224 | class halo(): 225 | ''' 226 | u:高斯分布的均值 227 | sigma:方差 228 | nums:在一张图片中随机添加几个光点 229 | prob:使用halo的概率 230 | ''' 231 | 232 | def __init__(self, nums, u=0, sigma=0.2, prob=0.5): 233 | self.u = u # 均值μ 234 | self.sig = math.sqrt(sigma) # 标准差δ 235 | self.nums = nums 236 | self.prob = prob 237 | 238 | def create_kernel(self, maxh=32, maxw=50): 239 | height_scope = [10, maxh] # 高度范围 随机生成高斯 240 | weight_scope = [20, maxw] # 宽度范围 241 | 242 | x = np.linspace(self.u - 3 * self.sig, self.u + 3 * self.sig, random.randint(*height_scope)) 243 | y = np.linspace(self.u - 3 * self.sig, self.u + 3 * self.sig, random.randint(*weight_scope)) 244 | Gauss_map = np.zeros((len(x), len(y))) 245 | for i in range(len(x)): 246 | for j in range(len(y)): 247 | Gauss_map[i, j] = np.exp(-((x[i] - self.u) ** 2 + (y[j] - self.u) ** 2) / (2 * self.sig ** 2)) / ( 248 | math.sqrt(2 * math.pi) * self.sig) 249 | 250 | return Gauss_map 251 | 252 | def __call__(self, img): 253 | if random.random() < self.prob: 254 | Gauss_map = self.create_kernel(32, 60) # 初始化一个高斯核,32为高度方向的最大值,60为w方向 255 | img1 = np.asarray(img) 256 | img1.flags.writeable = True # 将数组改为读写模式 257 | nums = random.randint(1, self.nums) # 随机生成nums个光点 258 | img1 = img1.astype(np.float) 259 | # print(nums) 260 | for i in range(nums): 261 | img_h, img_w = img1.shape 262 | pointx = random.randint(0, img_h - 10) # 在原图中随机找一个点 263 | pointy = random.randint(0, img_w - 10) 264 | 265 | h, w = Gauss_map.shape # 判断是否超限 266 | endx = pointx + h 267 | endy = pointy + w 268 | 269 | if pointx + h > img_h: 270 | endx = img_h 271 | Gauss_map = Gauss_map[1:img_h - pointx + 1, :] 272 | if img_w < pointy + w: 273 | endy = img_w 274 | Gauss_map = Gauss_map[:, 1:img_w - pointy + 1] 275 | 276 | # 加上不均匀光照 277 | img1[pointx:endx, pointy:endy] = img1[pointx:endx, pointy:endy] + Gauss_map * 255.0 278 | img1[img1 > 255.0] = 255.0 # 进行限幅,不然uint8会从0开始重新计数 279 | img = img1 280 | return Image.fromarray(np.uint8(img)) 281 | 282 | 283 | class MyGaussianBlur(ImageFilter.Filter): 284 | name = "GaussianBlur" 285 | 286 | def __init__(self, radius=2, bounds=None): 287 | 
self.radius = radius 288 | self.bounds = bounds 289 | 290 | def filter(self, image): 291 | if self.bounds: 292 | clips = image.crop(self.bounds).gaussian_blur(self.radius) 293 | image.paste(clips, self.bounds) 294 | return image 295 | else: 296 | return image.gaussian_blur(self.radius) 297 | 298 | 299 | class GBlur(object): 300 | def __init__(self, radius=2, prob=0.5): 301 | radius = random.randint(0, radius) 302 | self.blur = MyGaussianBlur(radius=radius) 303 | self.prob = prob 304 | 305 | def __call__(self, img): 306 | if random.random() < self.prob: 307 | img = img.filter(self.blur) 308 | return img 309 | 310 | 311 | class RandomBrightness(object): 312 | """随机改变亮度 313 | pil:pil格式的图片 314 | """ 315 | 316 | def __init__(self, prob=1.5): 317 | self.prob = prob 318 | 319 | def __call__(self, pil): 320 | rgb = np.asarray(pil) 321 | if random.random() < self.prob: 322 | hsv = cv2.cvtColor(rgb, cv2.COLOR_RGB2HSV) 323 | h, s, v = cv2.split(hsv) 324 | adjust = random.choice([0.5, 0.7, 0.9, 1.2, 1.5, 1.7]) # 随机选择一个 325 | # adjust = random.choice([1.2, 1.5, 1.7, 2.0]) # 随机选择一个 326 | v = v * adjust 327 | v = np.clip(v, 0, 255).astype(hsv.dtype) 328 | hsv = cv2.merge((h, s, v)) 329 | rgb = cv2.cvtColor(hsv, cv2.COLOR_HSV2RGB) 330 | return Image.fromarray(np.uint8(rgb)).convert('L') 331 | 332 | 333 | class randapply(object): 334 | """随机决定是否应用光晕、模糊或者二者都用 335 | 336 | Args: 337 | transforms (list or tuple): list of transformations 338 | """ 339 | 340 | def __init__(self, transforms): 341 | assert isinstance(transforms, (list, tuple)) 342 | self.transforms = transforms 343 | 344 | def __call__(self, img): 345 | for t in self.transforms: 346 | img = t(img) 347 | return img 348 | 349 | def __repr__(self): 350 | format_string = self.__class__.__name__ + '(' 351 | format_string += '\n p={}'.format(self.p) 352 | for t in self.transforms: 353 | format_string += '\n' 354 | format_string += ' {0}'.format(t) 355 | format_string += '\n)' 356 | return format_string 357 | 358 | 359 | def weights_init(model): 360 | # Official init from torch repo. 
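    # Conv2d weights get Kaiming-normal init, BatchNorm weights/biases are set to 1/0,
    # and Linear biases are zeroed (Linear weights keep PyTorch's default init).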
361 | for m in model.modules(): 362 | if isinstance(m, nn.Conv2d): 363 | nn.init.kaiming_normal_(m.weight) 364 | elif isinstance(m, nn.BatchNorm2d): 365 | nn.init.constant_(m.weight, 1) 366 | nn.init.constant_(m.bias, 0) 367 | elif isinstance(m, nn.Linear): 368 | nn.init.constant_(m.bias, 0) -------------------------------------------------------------------------------- /test_img/20436312_1683447152.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20436312_1683447152.jpg -------------------------------------------------------------------------------- /test_img/20437109_1639581473.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20437109_1639581473.jpg -------------------------------------------------------------------------------- /test_img/20437421_2143654630.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20437421_2143654630.jpg -------------------------------------------------------------------------------- /test_img/20437531_1514396900.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20437531_1514396900.jpg -------------------------------------------------------------------------------- /test_img/20438953_2386326516.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20438953_2386326516.jpg -------------------------------------------------------------------------------- /test_img/20438984_3963047307.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20438984_3963047307.jpg -------------------------------------------------------------------------------- /test_img/20439171_260546633.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20439171_260546633.jpg -------------------------------------------------------------------------------- /test_img/20439281_953270478.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20439281_953270478.jpg -------------------------------------------------------------------------------- /test_img/20439562_2199433254.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20439562_2199433254.jpg -------------------------------------------------------------------------------- /test_img/20439906_2889507409.jpg: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20439906_2889507409.jpg -------------------------------------------------------------------------------- /test_img/20440000_4153137105.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20440000_4153137105.jpg -------------------------------------------------------------------------------- /test_img/20441140_1981970116.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441140_1981970116.jpg -------------------------------------------------------------------------------- /test_img/20441328_2992333319.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441328_2992333319.jpg -------------------------------------------------------------------------------- /test_img/20441531_4212871437.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441531_4212871437.jpg -------------------------------------------------------------------------------- /test_img/20441750_1134599790.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441750_1134599790.jpg -------------------------------------------------------------------------------- /test_img/20441765_2389222735.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441765_2389222735.jpg -------------------------------------------------------------------------------- /test_img/20441859_4060744030.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441859_4060744030.jpg -------------------------------------------------------------------------------- /test_img/20441968_1253178421.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20441968_1253178421.jpg -------------------------------------------------------------------------------- /test_img/20442421_1459247642.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442421_1459247642.jpg -------------------------------------------------------------------------------- /test_img/20442468_3479649590.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442468_3479649590.jpg 
-------------------------------------------------------------------------------- /test_img/20442515_3243295335.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442515_3243295335.jpg -------------------------------------------------------------------------------- /test_img/20442640_2409912250.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442640_2409912250.jpg -------------------------------------------------------------------------------- /test_img/20442750_205861976.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442750_205861976.jpg -------------------------------------------------------------------------------- /test_img/20442984_333319517.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/20442984_333319517.jpg -------------------------------------------------------------------------------- /test_img/md_img/attentionV2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/md_img/attentionV2.png -------------------------------------------------------------------------------- /test_img/md_img/attentionocr.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/md_img/attentionocr.png -------------------------------------------------------------------------------- /test_img/md_img/attention结果.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/chenjun2hao/Attention_ocr.pytorch/81785f3591cd89cbc4fc7744f727e7f2b8961e54/test_img/md_img/attention结果.png -------------------------------------------------------------------------------- /train.py: -------------------------------------------------------------------------------- 1 | # coding:utf-8 2 | from __future__ import print_function 3 | import argparse 4 | import random 5 | import torch 6 | import torch.backends.cudnn as cudnn 7 | import torch.optim as optim 8 | import torch.utils.data 9 | import numpy as np 10 | import os 11 | import src.utils as utils 12 | import src.dataset as dataset 13 | import time 14 | from src.utils import alphabet 15 | from src.utils import weights_init 16 | 17 | import models.crnn_lang as crnn 18 | print(crnn.__name__) 19 | 20 | parser = argparse.ArgumentParser() 21 | parser.add_argument('--trainlist', default='./data/ch_train.txt') 22 | parser.add_argument('--vallist', default='./data/ch_test.txt') 23 | parser.add_argument('--workers', type=int, help='number of data loading workers', default=2) 24 | parser.add_argument('--batchSize', type=int, default=4, help='input batch size') 25 | parser.add_argument('--imgH', type=int, default=32, help='the height of the input image to network') 26 | parser.add_argument('--imgW', type=int, default=280, 
help='the width of the input image to network') 27 | parser.add_argument('--nh', type=int, default=256, help='size of the lstm hidden state') 28 | parser.add_argument('--niter', type=int, default=21, help='number of epochs to train for') 29 | parser.add_argument('--lr', type=float, default=0.001, help='learning rate for Critic, default=0.00005') 30 | parser.add_argument('--beta1', type=float, default=0.5, help='beta1 for adam. default=0.5') 31 | parser.add_argument('--cuda', action='store_true', help='enables cuda', default=True) 32 | parser.add_argument('--ngpu', type=int, default=1, help='number of GPUs to use') 33 | parser.add_argument('--encoder', type=str, default='./expr/attentioncnn/encoder_600.pth', help="path to encoder (to continue training)") 34 | parser.add_argument('--decoder', type=str, default='', help='path to decoder (to continue training)') 35 | parser.add_argument('--experiment', default='./expr/attentioncnn', help='Where to store samples and models') 36 | parser.add_argument('--displayInterval', type=int, default=10, help='Interval to be displayed') 37 | parser.add_argument('--valInterval', type=int, default=1, help='Interval to be displayed') 38 | parser.add_argument('--saveInterval', type=int, default=10, help='Interval to be displayed') 39 | parser.add_argument('--adam', default=True, action='store_true', help='Whether to use adam (default is rmsprop)') 40 | parser.add_argument('--adadelta', action='store_true', help='Whether to use adadelta (default is rmsprop)') 41 | parser.add_argument('--keep_ratio', action='store_true', help='whether to keep ratio for image resize') 42 | parser.add_argument('--random_sample', default=True, action='store_true', help='whether to sample the dataset with random sampler') 43 | parser.add_argument('--teaching_forcing_prob', type=float, default=0.5, help='where to use teach forcing') 44 | parser.add_argument('--max_width', type=int, default=71, help='the width of the featuremap out from cnn') 45 | opt = parser.parse_args() 46 | print(opt) 47 | 48 | SOS_token = 0 49 | EOS_TOKEN = 1 # 结束标志的标签 50 | BLANK = 2 # blank for padding 51 | 52 | 53 | if opt.experiment is None: 54 | opt.experiment = 'expr' 55 | os.system('mkdir -p {0}'.format(opt.experiment)) # 创建多级目录 56 | 57 | opt.manualSeed = random.randint(1, 10000) # fix seed 58 | print("Random Seed: ", opt.manualSeed) 59 | random.seed(opt.manualSeed) 60 | np.random.seed(opt.manualSeed) 61 | torch.manual_seed(opt.manualSeed) 62 | 63 | cudnn.benchmark = True 64 | 65 | if torch.cuda.is_available() and not opt.cuda: 66 | print("WARNING: You have a CUDA device, so you should probably run with --cuda") 67 | 68 | transform = None 69 | train_dataset = dataset.listDataset(list_file =opt.trainlist, transform=transform) 70 | assert train_dataset 71 | if not opt.random_sample: 72 | sampler = dataset.randomSequentialSampler(train_dataset, opt.batchSize) 73 | else: 74 | sampler = None 75 | train_loader = torch.utils.data.DataLoader( 76 | train_dataset, batch_size=opt.batchSize, 77 | shuffle=False, sampler=sampler, 78 | num_workers=int(opt.workers), 79 | collate_fn=dataset.alignCollate(imgH=opt.imgH, imgW=opt.imgW, keep_ratio=opt.keep_ratio)) 80 | 81 | test_dataset = dataset.listDataset(list_file =opt.vallist, transform=dataset.resizeNormalize((opt.imgW, opt.imgH))) 82 | 83 | nclass = len(alphabet) + 3 # decoder的时候,需要的类别数,3 for SOS,EOS和blank 84 | nc = 1 85 | 86 | converter = utils.strLabelConverterForAttention(alphabet) 87 | # criterion = torch.nn.CrossEntropyLoss() 88 | criterion = torch.nn.NLLLoss() # 
最后的输出要为log_softmax 89 | 90 | 91 | encoder = crnn.CNN(opt.imgH, nc, opt.nh) 92 | # decoder = crnn.decoder(opt.nh, nclass, dropout_p=0.1, max_length=opt.max_width) # max_length:w/4,为encoder特征提取之后宽度方向上的序列长度 93 | decoder = crnn.decoderV2(opt.nh, nclass, dropout_p=0.1) # For prediction of an indefinite long sequence 94 | encoder.apply(weights_init) 95 | decoder.apply(weights_init) 96 | # continue training or use the pretrained model to initial the parameters of the encoder and decoder 97 | if opt.encoder: 98 | print('loading pretrained encoder model from %s' % opt.encoder) 99 | encoder.load_state_dict(torch.load(opt.encoder)) 100 | if opt.decoder: 101 | print('loading pretrained encoder model from %s' % opt.decoder) 102 | encoder.load_state_dict(torch.load(opt.encoder)) 103 | print(encoder) 104 | print(decoder) 105 | 106 | image = torch.FloatTensor(opt.batchSize, 3, opt.imgH, opt.imgH) 107 | text = torch.LongTensor(opt.batchSize * 5) 108 | length = torch.IntTensor(opt.batchSize) 109 | 110 | if opt.cuda: 111 | encoder.cuda() 112 | decoder.cuda() 113 | # encoder = torch.nn.DataParallel(encoder, device_ids=range(opt.ngpu)) 114 | # decoder = torch.nn.DataParallel(decoder, device_ids=range(opt.ngpu)) 115 | image = image.cuda() 116 | text = text.cuda() 117 | criterion = criterion.cuda() 118 | 119 | # loss averager 120 | loss_avg = utils.averager() 121 | 122 | # setup optimizer 123 | if opt.adam: 124 | encoder_optimizer = optim.Adam(encoder.parameters(), lr=opt.lr, 125 | betas=(opt.beta1, 0.999)) 126 | decoder_optimizer = optim.Adam(decoder.parameters(), lr=opt.lr, 127 | betas=(opt.beta1, 0.999)) 128 | elif opt.adadelta: 129 | optimizer = optim.Adadelta(encoder.parameters(), lr=opt.lr) 130 | else: 131 | encoder_optimizer = optim.RMSprop(encoder.parameters(), lr=opt.lr) 132 | decoder_optimizer = optim.RMSprop(decoder.parameters(), lr=opt.lr) 133 | 134 | 135 | def val(encoder, decoder, criterion, batchsize, dataset, teach_forcing=False, max_iter=100): 136 | print('Start val') 137 | 138 | for e, d in zip(encoder.parameters(), decoder.parameters()): 139 | e.requires_grad = False 140 | d.requires_grad = False 141 | 142 | encoder.eval() 143 | decoder.eval() 144 | data_loader = torch.utils.data.DataLoader( 145 | dataset, shuffle=False, batch_size=batchsize, num_workers=int(opt.workers)) 146 | val_iter = iter(data_loader) 147 | 148 | n_correct = 0 149 | n_total = 0 150 | loss_avg = utils.averager() 151 | 152 | max_iter = min(max_iter, len(data_loader)) 153 | # max_iter = len(data_loader) - 1 154 | for i in range(max_iter): 155 | data = val_iter.next() 156 | i += 1 157 | cpu_images, cpu_texts = data 158 | b = cpu_images.size(0) 159 | utils.loadData(image, cpu_images) 160 | 161 | target_variable = converter.encode(cpu_texts) 162 | n_total += len(cpu_texts[0]) + 1 # 还要准确预测出EOS停止位 163 | 164 | decoded_words = [] 165 | decoded_label = [] 166 | decoder_attentions = torch.zeros(len(cpu_texts[0]) + 1, opt.max_width) 167 | encoder_outputs = encoder(image) # cnn+biLstm做特征提取 168 | target_variable = target_variable.cuda() 169 | decoder_input = target_variable[0].cuda() # 初始化decoder的开始,从0开始输出 170 | decoder_hidden = decoder.initHidden(b).cuda() 171 | loss = 0.0 172 | if not teach_forcing: 173 | # 预测的时候采用非强制策略,将前一次的输出,作为下一次的输入,直到标签为EOS_TOKEN时停止 174 | for di in range(1, target_variable.shape[0]): # 最大字符串的长度 175 | decoder_output, decoder_hidden, decoder_attention = decoder( 176 | decoder_input, decoder_hidden, encoder_outputs) 177 | loss += criterion(decoder_output, target_variable[di]) # 每次预测一个字符 178 | loss_avg.add(loss) 179 
| decoder_attentions[di-1] = decoder_attention.data 180 | topv, topi = decoder_output.data.topk(1) 181 | ni = topi.squeeze(1) 182 | decoder_input = ni 183 | if ni == EOS_TOKEN: 184 | decoded_words.append('') 185 | decoded_label.append(EOS_TOKEN) 186 | break 187 | else: 188 | decoded_words.append(converter.decode(ni)) 189 | decoded_label.append(ni) 190 | 191 | # 计算正确个数 192 | for pred, target in zip(decoded_label, target_variable[1:,:]): 193 | if pred == target: 194 | n_correct += 1 195 | 196 | if i % 100 == 0: # 每100次输出一次 197 | texts = cpu_texts[0] 198 | print('pred:%-20s, gt: %-20s' % (decoded_words, texts)) 199 | 200 | accuracy = n_correct / float(n_total) 201 | print('Test loss: %f, accuray: %f' % (loss_avg.val(), accuracy)) 202 | 203 | 204 | def trainBatch(encoder, decoder, criterion, encoder_optimizer, decoder_optimizer, teach_forcing_prob=1): 205 | ''' 206 | target_label:采用后处理的方式,进行编码和对齐,以便进行batch训练 207 | ''' 208 | data = train_iter.next() 209 | cpu_images, cpu_texts = data 210 | b = cpu_images.size(0) 211 | target_variable = converter.encode(cpu_texts) 212 | utils.loadData(image, cpu_images) 213 | 214 | encoder_outputs = encoder(image) # cnn+biLstm做特征提取 215 | target_variable = target_variable.cuda() 216 | decoder_input = target_variable[0].cuda() # 初始化decoder的开始,从0开始输出 217 | decoder_hidden = decoder.initHidden(b).cuda() 218 | loss = 0.0 219 | teach_forcing = True if random.random() > teach_forcing_prob else False 220 | if teach_forcing: 221 | # 教师强制:将目标label作为下一个输入 222 | for di in range(1, target_variable.shape[0]): # 最大字符串的长度 223 | decoder_output, decoder_hidden, decoder_attention = decoder( 224 | decoder_input, decoder_hidden, encoder_outputs) 225 | loss += criterion(decoder_output, target_variable[di]) # 每次预测一个字符 226 | decoder_input = target_variable[di] # Teacher forcing/前一次的输出 227 | else: 228 | for di in range(1, target_variable.shape[0]): 229 | decoder_output, decoder_hidden, decoder_attention = decoder( 230 | decoder_input, decoder_hidden, encoder_outputs) 231 | loss += criterion(decoder_output, target_variable[di]) # 每次预测一个字符 232 | topv, topi = decoder_output.data.topk(1) 233 | ni = topi.squeeze() 234 | decoder_input = ni 235 | encoder.zero_grad() 236 | decoder.zero_grad() 237 | loss.backward() 238 | encoder_optimizer.step() 239 | decoder_optimizer.step() 240 | return loss 241 | 242 | 243 | if __name__ == '__main__': 244 | t0 = time.time() 245 | for epoch in range(opt.niter): 246 | train_iter = iter(train_loader) 247 | i = 0 248 | while i < len(train_loader)-1: 249 | for e, d in zip(encoder.parameters(), decoder.parameters()): 250 | e.requires_grad = True 251 | d.requires_grad = True 252 | encoder.train() 253 | decoder.train() 254 | cost = trainBatch(encoder, decoder, criterion, encoder_optimizer, 255 | decoder_optimizer, teach_forcing_prob=opt.teaching_forcing_prob) 256 | loss_avg.add(cost) 257 | i += 1 258 | 259 | if i % opt.displayInterval == 0: 260 | print('[%d/%d][%d/%d] Loss: %f' % 261 | (epoch, opt.niter, i, len(train_loader), loss_avg.val()), end=' ') 262 | loss_avg.reset() 263 | t1 = time.time() 264 | print('time elapsed %d' % (t1-t0)) 265 | t0 = time.time() 266 | 267 | # do checkpointing 268 | if epoch % opt.saveInterval == 0: 269 | val(encoder, decoder, criterion, 1, dataset=test_dataset, teach_forcing=False) # batchsize:1 270 | torch.save( 271 | encoder.state_dict(), '{0}/encoder_{1}.pth'.format(opt.experiment, epoch)) 272 | torch.save( 273 | decoder.state_dict(), '{0}/decoder_{1}.pth'.format(opt.experiment, epoch)) 
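# Checkpoints are written to opt.experiment as encoder_<epoch>.pth / decoder_<epoch>.pth
# every `saveInterval` epochs (after a validation pass with batch size 1); such a pair can
# be passed back in through the --encoder / --decoder flags parsed above to continue training.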
--------------------------------------------------------------------------------
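For readers tracing tensor shapes through `AttentiondecoderV2`, the following is a minimal shape-check sketch rather than part of the repository: it assumes the repo root is on `PYTHONPATH`, uses a random tensor in place of a real `CNN` encoder output, and picks an arbitrary `nclass`. One decode step maps a batch of previous tokens plus the encoder feature sequence to per-class log-probabilities and one attention weight per encoder timestep.

```python
# Minimal shape-check sketch for AttentiondecoderV2 (assumes the repo root is on
# PYTHONPATH and torch is installed; T, B and nclass are arbitrary illustrative values).
import torch
import models.crnn_lang as crnn

T, B, nh, nclass = 71, 4, 256, 10        # encoder timesteps, batch size, hidden size, classes
decoder = crnn.decoderV2(nh, nclass)
decoder.eval()                           # disable dropout for a deterministic check

encoder_outputs = torch.randn(T, B, nh)  # stands in for CNN(imgH, nc, nh)(image): seq * batch * nh
decoder_input = torch.zeros(B).long()    # previous tokens, here all GO (= 0)
decoder_hidden = decoder.initHidden(B)   # (1, B, nh)

log_probs, decoder_hidden, attn = decoder(decoder_input, decoder_hidden, encoder_outputs)
print(log_probs.shape)                   # torch.Size([4, 10])   - log-softmax over the alphabet
print(attn.shape)                        # torch.Size([4, 1, 71]) - one weight per encoder timestep
```

This is the same call pattern that demo.py and trainBatch in train.py repeat step by step, feeding either the greedy argmax or the ground-truth token back in as the next `decoder_input`.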