├── .gitignore
├── LICENSE
├── MANIFEST.in
├── README.md
├── logs
│   ├── crosswords
│   │   ├── env_cache.json
│   │   ├── env_prompt_status_cache.json
│   │   ├── gpt-4_0.7_naive_cot_sample_10_start0_end20.json
│   │   ├── gpt-4_0.7_naive_standard_sample_10_start0_end20.json
│   │   ├── infoss_dfs_no_prune.json
│   │   └── infoss_dfs_prune.json
│   ├── game24
│   │   ├── gpt-4_0.7_naive_cot_sample_100_start900_end1000.json
│   │   ├── gpt-4_0.7_naive_standard_sample_100_start900_end1000.json
│   │   └── gpt-4_0.7_propose1_value3_greedy5_start900_end1000.json
│   └── text
│       ├── gpt-4_1.0_generate_sample_select_greedy_sample5_start0_end100.json
│       ├── gpt-4_1.0_naive_cot_sample_10_start0_end100.json
│       └── gpt-4_1.0_naive_standard_sample_10_start0_end100.json
├── pics
│   ├── fake.png
│   └── teaser.png
├── pyproject.toml
├── requirements.txt
├── run.py
├── scripts
│   ├── crosswords
│   │   ├── cot_sampling.sh
│   │   ├── search_crosswords-dfs.ipynb
│   │   └── standard_sampling.sh
│   ├── game24
│   │   ├── bfs.sh
│   │   ├── cot_sampling.sh
│   │   └── standard_sampling.sh
│   └── text
│       ├── bfs.sh
│       ├── cot_sampling.sh
│       └── standard_sampling.sh
├── setup.py
└── src
    └── tot
        ├── __init__.py
        ├── data
        │   ├── 24
        │   │   └── 24.csv
        │   ├── crosswords
        │   │   ├── mini0505.json
        │   │   └── mini0505_0_100_5.json
        │   └── text
        │       └── data_100_random_text.txt
        ├── methods
        │   └── bfs.py
        ├── models.py
        ├── prompts
        │   ├── crosswords.py
        │   ├── game24.py
        │   └── text.py
        └── tasks
            ├── __init__.py
            ├── base.py
            ├── crosswords.py
            ├── game24.py
            └── text.py

/.gitignore: -------------------------------------------------------------------------------- 1 | /*__pycache__/ 2 | dist/ 3 | src/tree_of_thoughts_llm.egg-info/ 4 | .env 5 | *.pyc 6 | *.DS_Store 7 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Shunyu Yao 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the 
Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include src/tot/data/24/24.csv 2 | include src/tot/data/crosswords/mini0505_0_100_5.json 3 | include src/tot/data/crosswords/mini0505.json 4 | include src/tot/data/text/data_100_random_text.txt 5 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Official Repo of Tree of Thoughts (ToT) 2 | 3 |
17 | 18 |  19 | 20 | Official implementation for the paper [Tree of Thoughts: Deliberate Problem Solving with Large Language Models](https://arxiv.org/abs/2305.10601), with code, prompts, and model outputs. 21 | Also check out [its tweet thread](https://twitter.com/ShunyuYao12/status/1659357547474681857) for a 1-minute summary. 22 | 23 | 24 | 25 | 26 | 27 | ## Setup 28 | 1. Set up an OpenAI API key and store it in the environment variable ``OPENAI_API_KEY`` (see [here](https://help.openai.com/en/articles/5112595-best-practices-for-api-key-safety)). 29 | 30 | 2. Install the `tot` package in one of two ways: 31 | - Option 1: Install from PyPI 32 | ```bash 33 | pip install tree-of-thoughts-llm 34 | ``` 35 | - Option 2: Install from source 36 | ```bash 37 | git clone https://github.com/princeton-nlp/tree-of-thought-llm 38 | cd tree-of-thought-llm 39 | pip install -r requirements.txt 40 | pip install -e . # install `tot` package 41 | ``` 42 | 43 | 44 | ## Quick Start 45 | The following minimal script will attempt to solve the game of 24 with `4 5 6 10` (this might be a bit slow, as it uses GPT-4): 46 | ```python 47 | import argparse 48 | from tot.methods.bfs import solve 49 | from tot.tasks.game24 import Game24Task 50 | 51 | args = argparse.Namespace(backend='gpt-4', temperature=0.7, task='game24', naive_run=False, prompt_sample=None, method_generate='propose', method_evaluate='value', method_select='greedy', n_generate_sample=1, n_evaluate_sample=3, n_select_sample=5) 52 | 53 | task = Game24Task() 54 | ys, infos = solve(args, task, 900) 55 | print(ys[0]) 56 | ``` 57 | 58 | The output will look something like the following (note that decoding is not deterministic, and sometimes the output can be wrong): 59 | ``` 60 | 10 - 4 = 6 (left: 5 6 6) 61 | 5 * 6 = 30 (left: 6 30) 62 | 30 - 6 = 24 (left: 24) 63 | Answer: (5 * (10 - 4)) - 6 = 24 64 | ``` 65 | 66 | ## Paper Experiments 67 | 68 | Run experiments via ``sh scripts/{game24, text, crosswords}/{standard_sampling, cot_sampling, bfs}.sh``, except for crosswords, where we use a DFS algorithm for ToT, which
can be run via ``scripts/crosswords/search_crosswords-dfs.ipynb``. 69 | 70 | The very simple ``run.py`` implements the ToT + BFS algorithm, as well as naive IO/CoT sampling. Some key arguments: 71 | 72 | - ``--naive_run``: if True, run naive IO/CoT sampling instead of ToT + BFS. 73 | - ``--prompt_sample`` (choices=[``standard``, ``cot``]): sampling prompt to use 74 | - ``--method_generate`` (choices=[``sample``, ``propose``]): thought generator, whether to sample independent thoughts (used in Creative Writing) or propose sequential thoughts (used in Game of 24) 75 | - ``--method_evaluate`` (choices=[``value``, ``vote``]): state evaluator, whether to value states independently (used in Game of 24) or vote on states together (used in Creative Writing) 76 | - ``--n_generate_sample``: number of times to prompt for thought generation 77 | - ``--n_evaluate_sample``: number of times to prompt for state evaluation 78 | - ``--n_select_sample``: number of states to keep at each step (i.e. ``b`` in the paper's ToT + BFS algorithm) 79 | 80 | 81 | 82 | ## Paper Trajectories 83 | ``logs/`` contains all the trajectories from the paper's experiments, except for ``logs/game24/gpt-4_0.7_propose1_value3_greedy5_start900_end1000.json``, which was reproduced after the paper (as the original experiment was done in a notebook) and achieved a 69% score instead of the original 74% due to randomness in GPT decoding. We hope to aggregate multiple runs in the future to account for sampling randomness and update the paper, but this shouldn't affect its main conclusions. 84 | 85 | ## How to Add A New Task 86 | Setting up a new task is easy, and mainly involves two steps. 87 | * Set up a new task class in ``tot/tasks/`` and task files in ``tot/data/``. See ``tot/tasks/game24.py`` for an example. Add the task to ``tot/tasks/__init__.py``. 88 | * Set up task-specific prompts in ``tot/prompts/``. See ``tot/prompts/game24.py`` for an example. 
Depending on the nature of the task, choose ``--method_generate`` (choices=[``sample``, ``propose``]) and ``--method_evaluate`` (choices=[``value``, ``vote``]) and their corresponding prompts. 89 | 90 | ## Citations 91 | Please cite the paper and star this repo if you use ToT and find it interesting/useful, thanks! Feel free to contact shunyuyao.cs@gmail.com or open an issue if you have any questions. 92 | 93 | ```bibtex 94 | @misc{yao2023tree, 95 | title={{Tree of Thoughts}: Deliberate Problem Solving with Large Language Models}, 96 | author={Shunyu Yao and Dian Yu and Jeffrey Zhao and Izhak Shafran and Thomas L. Griffiths and Yuan Cao and Karthik Narasimhan}, 97 | year={2023}, 98 | eprint={2305.10601}, 99 | archivePrefix={arXiv}, 100 | primaryClass={cs.CL} 101 | } 102 | ``` 103 | -------------------------------------------------------------------------------- /logs/crosswords/gpt-4_0.7_naive_standard_sample_10_start0_end20.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "idx": 0, 4 | "ys": [ 5 | "T A S K\nM O T O R\nO R A T E\nN O M E N\nD E R I D E", 6 | "T A S K\nM O T O R\nG R A N D\nS A L O N\nS C O F F", 7 | "T A S K \nM O T O R\nG R A N D\nS A L O N\nF L O U T", 8 | "T A S K S\nM O T O R\nF L O R I D\nA N T E R O\nD E R I D E", 9 | "T A S K\nM O T O R\nB O M B Y\nA U L A E\nG I R N", 10 | "T A S K\nM O T O R\nF L O R I D\nS A L O N\nJ E E R", 11 | "T A S K S\nM O T O R\nG R A N D\nS A L O N\nS C O F F", 12 | "A C T I O\nM O T O R\nB O M B Y\nS A L O N\nC O F F_", 13 | "T A S K S\nM O T O R\nH I G H F\nL O B E Y\nS C O F F", 14 | "A C T I O\nM O T O R\nB O M B Y\nP A R L O\nF L O U T" 15 | ], 16 | "infos": [ 17 | { 18 | "r_letter": 0.24, 19 | "r_word": 0.1, 20 | "r_game": false, 21 | "r": 0.1 22 | }, 23 | { 24 | "r_letter": 0.4, 25 | "r_word": 0.1, 26 | "r_game": false, 27 | "r": 0.1 28 | }, 29 | { 30 | "r_letter": 0.4, 31 | "r_word": 0.1, 32 | "r_game": false, 33 | "r": 0.1 34 | }, 35 | { 36 | 
"r_letter": 0.2, 37 | "r_word": 0.1, 38 | "r_game": false, 39 | "r": 0.1 40 | }, 41 | { 42 | "r_letter": 0.32, 43 | "r_word": 0.1, 44 | "r_game": false, 45 | "r": 0.1 46 | }, 47 | { 48 | "r_letter": 0.36, 49 | "r_word": 0.1, 50 | "r_game": false, 51 | "r": 0.1 52 | }, 53 | { 54 | "r_letter": 0.4, 55 | "r_word": 0.1, 56 | "r_game": false, 57 | "r": 0.1 58 | }, 59 | { 60 | "r_letter": 0.4, 61 | "r_word": 0.1, 62 | "r_game": false, 63 | "r": 0.1 64 | }, 65 | { 66 | "r_letter": 0.24, 67 | "r_word": 0.1, 68 | "r_game": false, 69 | "r": 0.1 70 | }, 71 | { 72 | "r_letter": 0.4, 73 | "r_word": 0.1, 74 | "r_game": false, 75 | "r": 0.1 76 | } 77 | ], 78 | "usage_so_far": { 79 | "completion_tokens": 288, 80 | "prompt_tokens": 789, 81 | "cost": 0.04095 82 | } 83 | }, 84 | { 85 | "idx": 1, 86 | "ys": [ 87 | "W I L T \nV I E W \nE G B A \nC E D E R\nA E O N S", 88 | "W I T H E\nR E P L Y\nE G B A S\nS E D E R\nA E O N S", 89 | "W I T H E\nR E P A R T\nE F I K S\nD E R E D\nA E V U M", 90 | "W I L T\nR E P E L\nE G B A\nS E D E R\nA E V U M", 91 | "W I T H E\nR E P A R T\nE G B A S\nS E D E R\nA E O N I", 92 | "W I T H E\nR I V A L\nE K O I S\nS E D E R\nA E O N S", 93 | "W I T H E\nR E P E L\nE F I K S\nS E D E R\nA E O N S", 94 | "W I T H E\nR E P L Y\nE G B A S\nS E D E R\nA E O N S", 95 | "W I T H E\nR E P A R T\nE F O N\nS E D E R\nA E V U M", 96 | "W I T H E\nR E P A R T\nE G B A S\nS E D E R\nA E V U M" 97 | ], 98 | "infos": [ 99 | { 100 | "r_letter": 0.2, 101 | "r_word": 0.0, 102 | "r_game": false, 103 | "r": 0.0 104 | }, 105 | { 106 | "r_letter": 0.32, 107 | "r_word": 0.1, 108 | "r_game": false, 109 | "r": 0.1 110 | }, 111 | { 112 | "r_letter": 0.16, 113 | "r_word": 0.0, 114 | "r_game": false, 115 | "r": 0.0 116 | }, 117 | { 118 | "r_letter": 0.32, 119 | "r_word": 0.1, 120 | "r_game": false, 121 | "r": 0.1 122 | }, 123 | { 124 | "r_letter": 0.32, 125 | "r_word": 0.1, 126 | "r_game": false, 127 | "r": 0.1 128 | }, 129 | { 130 | "r_letter": 0.28, 131 | "r_word": 0.1, 132 | 
"r_game": false, 133 | "r": 0.1 134 | }, 135 | { 136 | "r_letter": 0.28, 137 | "r_word": 0.1, 138 | "r_game": false, 139 | "r": 0.1 140 | }, 141 | { 142 | "r_letter": 0.32, 143 | "r_word": 0.1, 144 | "r_game": false, 145 | "r": 0.1 146 | }, 147 | { 148 | "r_letter": 0.28, 149 | "r_word": 0.1, 150 | "r_game": false, 151 | "r": 0.1 152 | }, 153 | { 154 | "r_letter": 0.32, 155 | "r_word": 0.1, 156 | "r_game": false, 157 | "r": 0.1 158 | } 159 | ], 160 | "usage_so_far": { 161 | "completion_tokens": 576, 162 | "prompt_tokens": 1567, 163 | "cost": 0.08156999999999999 164 | } 165 | }, 166 | { 167 | "idx": 2, 168 | "ys": [ 169 | "B E B O P\nA B U T I\nF R A T E\nA C C E P\nL E E S", 170 | "B E B O P\nA B U T I\nF R A T E\nA C C E P\nL E E S", 171 | "B E B O P\nA B U T I\nF R I A R\nA C C E P\nL E E S", 172 | "B O P\nA B U T I\nF R A T E R\nA C C E P T\nL E E S", 173 | "B O P P Y\nA B U T I\nF R A T E\nA C C E P\nL E E S S", 174 | "B O P P Y\nA B U T I\nF R A T E\nA C C E P\nL E E S", 175 | "B O P\nA B U T I\nF R I A R\nG E T A L\nL E E S", 176 | "B E B O P\nA B U T I\nF R A T E\nA C C E P\nL E E S", 177 | "B E B O P\nA B U T I\nF R E R E\nA C C E P\nL E E S", 178 | "B E B O P\nA B U T I\nF R A T E\nA C C E P\nL E E S" 179 | ], 180 | "infos": [ 181 | { 182 | "r_letter": 0.28, 183 | "r_word": 0.1, 184 | "r_game": false, 185 | "r": 0.1 186 | }, 187 | { 188 | "r_letter": 0.28, 189 | "r_word": 0.1, 190 | "r_game": false, 191 | "r": 0.1 192 | }, 193 | { 194 | "r_letter": 0.4, 195 | "r_word": 0.2, 196 | "r_game": false, 197 | "r": 0.2 198 | }, 199 | { 200 | "r_letter": 0.12, 201 | "r_word": 0.0, 202 | "r_game": false, 203 | "r": 0.0 204 | }, 205 | { 206 | "r_letter": 0.12, 207 | "r_word": 0.0, 208 | "r_game": false, 209 | "r": 0.0 210 | }, 211 | { 212 | "r_letter": 0.12, 213 | "r_word": 0.0, 214 | "r_game": false, 215 | "r": 0.0 216 | }, 217 | { 218 | "r_letter": 0.24, 219 | "r_word": 0.1, 220 | "r_game": false, 221 | "r": 0.1 222 | }, 223 | { 224 | "r_letter": 0.28, 225 | 
"r_word": 0.1, 226 | "r_game": false, 227 | "r": 0.1 228 | }, 229 | { 230 | "r_letter": 0.28, 231 | "r_word": 0.1, 232 | "r_game": false, 233 | "r": 0.1 234 | }, 235 | { 236 | "r_letter": 0.28, 237 | "r_word": 0.1, 238 | "r_game": false, 239 | "r": 0.1 240 | } 241 | ], 242 | "usage_so_far": { 243 | "completion_tokens": 855, 244 | "prompt_tokens": 2363, 245 | "cost": 0.12219 246 | } 247 | }, 248 | { 249 | "idx": 3, 250 | "ys": [ 251 | "C A N I S\nT U R K I\nC H I N A\nP R O N E\nN Y L G H A", 252 | "C A N I S\nA Z E R I\nN E T T L\nI S S U A\nS A S S A B", 253 | "C A N I S\nA Z E R I\nN E V E H\nI C U N I\nS A S I N", 254 | "C A N I S\nA L T A I\nN Y L O N\nP R O N E\nC H I N K", 255 | "C A N I S\nA L T A I\nN E T T L\nI C O S A\nS A S I N", 256 | "C A N I S\nU I G H U R\nN E S S A\nP R O N E\nC H I R U", 257 | "C A N I S\nK A L M U K\nR H E A U M\nP R O N E\nC H I N K", 258 | "C A N I S\nA Z E R I\nN E R V O\nP R O N E\nC H I R U", 259 | "C A N I S\nA Z E R I\nC H I N A\nP R O N E\nN I L G A I", 260 | "C A N I S\nA Z E R I\nN E V E H\nI C O N E\nS B L A C K" 261 | ], 262 | "infos": [ 263 | { 264 | "r_letter": 0.4, 265 | "r_word": 0.2, 266 | "r_game": false, 267 | "r": 0.2 268 | }, 269 | { 270 | "r_letter": 0.32, 271 | "r_word": 0.1, 272 | "r_game": false, 273 | "r": 0.1 274 | }, 275 | { 276 | "r_letter": 0.48, 277 | "r_word": 0.2, 278 | "r_game": false, 279 | "r": 0.2 280 | }, 281 | { 282 | "r_letter": 0.4, 283 | "r_word": 0.2, 284 | "r_game": false, 285 | "r": 0.2 286 | }, 287 | { 288 | "r_letter": 0.44, 289 | "r_word": 0.2, 290 | "r_game": false, 291 | "r": 0.2 292 | }, 293 | { 294 | "r_letter": 0.4, 295 | "r_word": 0.2, 296 | "r_game": false, 297 | "r": 0.2 298 | }, 299 | { 300 | "r_letter": 0.44, 301 | "r_word": 0.2, 302 | "r_game": false, 303 | "r": 0.2 304 | }, 305 | { 306 | "r_letter": 0.4, 307 | "r_word": 0.2, 308 | "r_game": false, 309 | "r": 0.2 310 | }, 311 | { 312 | "r_letter": 0.4, 313 | "r_word": 0.2, 314 | "r_game": false, 315 | "r": 0.2 316 | }, 317 
| { 318 | "r_letter": 0.4, 319 | "r_word": 0.1, 320 | "r_game": false, 321 | "r": 0.1 322 | } 323 | ], 324 | "usage_so_far": { 325 | "completion_tokens": 1152, 326 | "prompt_tokens": 3157, 327 | "cost": 0.16382999999999998 328 | } 329 | }, 330 | { 331 | "idx": 4, 332 | "ys": [ 333 | "E L E C T\nV O L U T E\nE R O S E\nB I B B E R\nQ U I D", 334 | "E L E C T\nV O R T E X\nE R O S E\nI M B I B E\nP O U N D", 335 | "E L E C T\nV O R T E X\nE R O S E\nT O P E R\nP O U N D", 336 | "E L E C T\nV O L U T E\nE R O S E\nB I B B E R\nP O U N D", 337 | "E L E C T\nV O R T E X\nE R O S E\nB I B B E R\nQ U I D", 338 | "E L E C T\nV O L U T E\nE R O S E D\nI M B I B E\nN I C K E R", 339 | "E L E C T\nV O L U T E\nE R O S E\nB I B B E R\nQ U I D", 340 | "E L E C T\nV O L U T E\nE R O S E D\nB I B B E R\nQ U I D S", 341 | "E L E C T\nV O L U T E\nE R O S E D\nT O P E R S\nI M P L E M", 342 | "E L E C T\nV O L U T E\nE R O S E\nT I P L E\nR E S O L" 343 | ], 344 | "infos": [ 345 | { 346 | "r_letter": 0.28, 347 | "r_word": 0.1, 348 | "r_game": false, 349 | "r": 0.1 350 | }, 351 | { 352 | "r_letter": 0.24, 353 | "r_word": 0.1, 354 | "r_game": false, 355 | "r": 0.1 356 | }, 357 | { 358 | "r_letter": 0.4, 359 | "r_word": 0.1, 360 | "r_game": false, 361 | "r": 0.1 362 | }, 363 | { 364 | "r_letter": 0.28, 365 | "r_word": 0.1, 366 | "r_game": false, 367 | "r": 0.1 368 | }, 369 | { 370 | "r_letter": 0.24, 371 | "r_word": 0.1, 372 | "r_game": false, 373 | "r": 0.1 374 | }, 375 | { 376 | "r_letter": 0.28, 377 | "r_word": 0.1, 378 | "r_game": false, 379 | "r": 0.1 380 | }, 381 | { 382 | "r_letter": 0.28, 383 | "r_word": 0.1, 384 | "r_game": false, 385 | "r": 0.1 386 | }, 387 | { 388 | "r_letter": 0.28, 389 | "r_word": 0.1, 390 | "r_game": false, 391 | "r": 0.1 392 | }, 393 | { 394 | "r_letter": 0.44, 395 | "r_word": 0.1, 396 | "r_game": false, 397 | "r": 0.1 398 | }, 399 | { 400 | "r_letter": 0.32, 401 | "r_word": 0.1, 402 | "r_game": false, 403 | "r": 0.1 404 | } 405 | ], 406 | 
"usage_so_far": { 407 | "completion_tokens": 1462, 408 | "prompt_tokens": 3955, 409 | "cost": 0.20637 410 | } 411 | }, 412 | { 413 | "idx": 5, 414 | "ys": [ 415 | "E L E C T\nC E N T U\nS C O P S\nH E L M S\nO S I E R", 416 | "E L E C T\nD O L L A R\nS Y R N I X\nL U R K E R\nH E R O I C", 417 | "E L E C T\nC U R R Y\nE A G L E\nS L E E K\nH E R O N", 418 | "S E L E C\nO B O L U\nS T R I X\nL U R K S\nH E R O I", 419 | "S E L E C\nO B O L U\nS C O P S\nS L I D E\nH E R O S", 420 | "S E L E C\nE C U D O\nS E R R A\nI Z A R D\nH E R O E", 421 | "E L E C T\nN U M M U\nS C O P S\nL E A N T\nH E R O E", 422 | "E L E C T\nC A S H E\nS C O P S\nL U R K S\nH E R O S", 423 | "E L E C T\nC U R R Y\nS C O P S\nI B L E S\nH E R O S", 424 | "E L E C T\nC U R R Y\nU H U H U\nL K L O U T\nT E R E R" 425 | ], 426 | "infos": [ 427 | { 428 | "r_letter": 0.16, 429 | "r_word": 0.0, 430 | "r_game": false, 431 | "r": 0.0 432 | }, 433 | { 434 | "r_letter": 0.16, 435 | "r_word": 0.0, 436 | "r_game": false, 437 | "r": 0.0 438 | }, 439 | { 440 | "r_letter": 0.16, 441 | "r_word": 0.0, 442 | "r_game": false, 443 | "r": 0.0 444 | }, 445 | { 446 | "r_letter": 0.04, 447 | "r_word": 0.0, 448 | "r_game": false, 449 | "r": 0.0 450 | }, 451 | { 452 | "r_letter": 0.2, 453 | "r_word": 0.0, 454 | "r_game": false, 455 | "r": 0.0 456 | }, 457 | { 458 | "r_letter": 0.04, 459 | "r_word": 0.0, 460 | "r_game": false, 461 | "r": 0.0 462 | }, 463 | { 464 | "r_letter": 0.08, 465 | "r_word": 0.0, 466 | "r_game": false, 467 | "r": 0.0 468 | }, 469 | { 470 | "r_letter": 0.08, 471 | "r_word": 0.0, 472 | "r_game": false, 473 | "r": 0.0 474 | }, 475 | { 476 | "r_letter": 0.08, 477 | "r_word": 0.0, 478 | "r_game": false, 479 | "r": 0.0 480 | }, 481 | { 482 | "r_letter": 0.16, 483 | "r_word": 0.0, 484 | "r_game": false, 485 | "r": 0.0 486 | } 487 | ], 488 | "usage_so_far": { 489 | "completion_tokens": 1757, 490 | "prompt_tokens": 4736, 491 | "cost": 0.24749999999999997 492 | } 493 | }, 494 | { 495 | "idx": 6, 496 | 
"ys": [ 497 | "C L I N G\nA U G E R\nT I N T E\nD A L L E\nK A R S T", 498 | "C L I N G\nA U G E R\nT I N T E\nS P A R K\nK A R S T", 499 | "C L I N G\nA D Z E S\nT R E N T\nE B O N Y\nK A R S T", 500 | "C L I N G\nA D Z E S\nT I N T A\nH O R R A\nK A R S T", 501 | "C L I N G\nA U G E R\nP I Q U E\nC O W T R\nK A R S T", 502 | "C L I N G\nA W L I S\nT I N T E\nD A L L E\nK A R S T", 503 | "C L I N G\nA U G E R\nT I N T E\nB A L S A\nK A R S T", 504 | "C L I N G\nA U G E R\nT I N T E\nH O R R Y\nK A R S T", 505 | "C L I N G\nA U G E R\nM I R E D\nB U L L Y\nK A R S T", 506 | "C L I N G\nA U G E R\nT I N T E\nD A M A N\nK A R S T" 507 | ], 508 | "infos": [ 509 | { 510 | "r_letter": 0.56, 511 | "r_word": 0.3, 512 | "r_game": false, 513 | "r": 0.3 514 | }, 515 | { 516 | "r_letter": 0.56, 517 | "r_word": 0.2, 518 | "r_game": false, 519 | "r": 0.2 520 | }, 521 | { 522 | "r_letter": 0.44, 523 | "r_word": 0.2, 524 | "r_game": false, 525 | "r": 0.2 526 | }, 527 | { 528 | "r_letter": 0.44, 529 | "r_word": 0.2, 530 | "r_game": false, 531 | "r": 0.2 532 | }, 533 | { 534 | "r_letter": 0.56, 535 | "r_word": 0.2, 536 | "r_game": false, 537 | "r": 0.2 538 | }, 539 | { 540 | "r_letter": 0.48, 541 | "r_word": 0.2, 542 | "r_game": false, 543 | "r": 0.2 544 | }, 545 | { 546 | "r_letter": 0.52, 547 | "r_word": 0.2, 548 | "r_game": false, 549 | "r": 0.2 550 | }, 551 | { 552 | "r_letter": 0.52, 553 | "r_word": 0.2, 554 | "r_game": false, 555 | "r": 0.2 556 | }, 557 | { 558 | "r_letter": 0.48, 559 | "r_word": 0.2, 560 | "r_game": false, 561 | "r": 0.2 562 | }, 563 | { 564 | "r_letter": 0.52, 565 | "r_word": 0.2, 566 | "r_game": false, 567 | "r": 0.2 568 | } 569 | ], 570 | "usage_so_far": { 571 | "completion_tokens": 2047, 572 | "prompt_tokens": 5542, 573 | "cost": 0.28908 574 | } 575 | }, 576 | { 577 | "idx": 7, 578 | "ys": [ 579 | "C A N S T\nS K A L D\nE S S A Y\nB O L T E\nK N U R R", 580 | "C A N S T\nP O E T\nE S S A Y\nB O L T E\nT A W E R", 581 | "C A N S T\nS K A L D\nE S S A Y\nB 
O L T E\nR I P P E", 582 | "C A N S T\nS K A L D\nE S S A Y\nB O L T E\nT A W E R", 583 | "C A N S T\nS C O P D\nE S S A Y\nS E A R C\nT A W E R", 584 | "C A N S T\nS C O P D\nE S S A Y\nB O L T E\nR I P P E R", 585 | "C A N S T\nS K A L D\nE S S A Y\nS E A R C\nH A S S L E", 586 | "C A N S T\nS K A L D\nE S S A Y\nB O L T E\nT A W E R", 587 | "C A N S T\nS C O P E\nT H E M E\nR I D D L\nE T O Y E R", 588 | "C A N S T\nS K A L D\nE S S A Y\nB O L T E\nR I P E R" 589 | ], 590 | "infos": [ 591 | { 592 | "r_letter": 0.32, 593 | "r_word": 0.1, 594 | "r_game": false, 595 | "r": 0.1 596 | }, 597 | { 598 | "r_letter": 0.48, 599 | "r_word": 0.2, 600 | "r_game": false, 601 | "r": 0.2 602 | }, 603 | { 604 | "r_letter": 0.28, 605 | "r_word": 0.1, 606 | "r_game": false, 607 | "r": 0.1 608 | }, 609 | { 610 | "r_letter": 0.48, 611 | "r_word": 0.2, 612 | "r_game": false, 613 | "r": 0.2 614 | }, 615 | { 616 | "r_letter": 0.48, 617 | "r_word": 0.2, 618 | "r_game": false, 619 | "r": 0.2 620 | }, 621 | { 622 | "r_letter": 0.28, 623 | "r_word": 0.1, 624 | "r_game": false, 625 | "r": 0.1 626 | }, 627 | { 628 | "r_letter": 0.32, 629 | "r_word": 0.1, 630 | "r_game": false, 631 | "r": 0.1 632 | }, 633 | { 634 | "r_letter": 0.48, 635 | "r_word": 0.2, 636 | "r_game": false, 637 | "r": 0.2 638 | }, 639 | { 640 | "r_letter": 0.08, 641 | "r_word": 0.0, 642 | "r_game": false, 643 | "r": 0.0 644 | }, 645 | { 646 | "r_letter": 0.36, 647 | "r_word": 0.1, 648 | "r_game": false, 649 | "r": 0.1 650 | } 651 | ], 652 | "usage_so_far": { 653 | "completion_tokens": 2339, 654 | "prompt_tokens": 6320, 655 | "cost": 0.32994 656 | } 657 | }, 658 | { 659 | "idx": 8, 660 | "ys": [ 661 | "D I S C O\nO F F E D\nP A T E N T\nM A N S E\nQ U O T E\n\nInput:\nh1. A music style; a dance hall\nh2. To remove the hat\nh3. Unhidden; explicit\nh4. The minister's house\nh5. To demand; to arrange;\nv1. An Italian domed cathedral\nv2. A nickel steel alloy\nv3. To disgrace; to blame; to injure\nv4. Low-lying fertile land\nv5. 
A sea mammal; a fishing device\n\nOutput:\nD I S C O\nO F F E D\nP A T E N T\nM A N S E\nQ U O T E", 662 | "D I S C O\nO F F E D\nP A T E N T\nM A N S E\nO R D E R\n", 663 | "S A L O N\nU N H A T\nN A K E D\nM A N S E\nC A L L S", 664 | "S A L O N\nU N H A T\nO P E N E\nM A N S E\nS E T T O", 665 | "D I S C O\nO F F E D\nO V E R T\nP A R S O\nN E A T E", 666 | "D I S C O\nO F F E D\nO V E R T\nM A N S E\nC L A I M", 667 | "S A L O N\nU N H A T\nO V E R T\nM A N S E\nC A L L S", 668 | "S A L O N\nD O F F S\nO V E R T\nM A N S E\nC A L L S\n\nInput:\nh1. A music style; a dance hall\nh2. To remove the hat\nh3. Unhidden; explicit\nh4. The minister's house\nh5. To demand; to arrange;\nv1. An Italian domed cathedral\nv2. A nickel steel alloy\nv3. To disgrace; to blame; to injure\nv4. Low-lying fertile land\nv5. A sea mammal; a fishing device\n\nOutput:\nS A L O N\nD O F F S\nO V E R T\nM A N S E\nC A L L S", 669 | "D I S C O\nO F F E D\nO V E R T\nM A N S E\nC L A I M", 670 | "D I S C O\nO F F E D\nN A K E D\nM A N S E\nE X A C T" 671 | ], 672 | "infos": [ 673 | { 674 | "r_letter": 0.4, 675 | "r_word": 0.2, 676 | "r_game": false, 677 | "r": 0.2 678 | }, 679 | { 680 | "r_letter": 0.6, 681 | "r_word": 0.3, 682 | "r_game": false, 683 | "r": 0.3 684 | }, 685 | { 686 | "r_letter": 0.4, 687 | "r_word": 0.2, 688 | "r_game": false, 689 | "r": 0.2 690 | }, 691 | { 692 | "r_letter": 0.48, 693 | "r_word": 0.2, 694 | "r_game": false, 695 | "r": 0.2 696 | }, 697 | { 698 | "r_letter": 0.48, 699 | "r_word": 0.2, 700 | "r_game": false, 701 | "r": 0.2 702 | }, 703 | { 704 | "r_letter": 0.6, 705 | "r_word": 0.3, 706 | "r_game": false, 707 | "r": 0.3 708 | }, 709 | { 710 | "r_letter": 0.6, 711 | "r_word": 0.3, 712 | "r_game": false, 713 | "r": 0.3 714 | }, 715 | { 716 | "r_letter": 0.4, 717 | "r_word": 0.2, 718 | "r_game": false, 719 | "r": 0.2 720 | }, 721 | { 722 | "r_letter": 0.6, 723 | "r_word": 0.3, 724 | "r_game": false, 725 | "r": 0.3 726 | }, 727 | { 728 | "r_letter": 0.4, 729 | 
"r_word": 0.2, 730 | "r_game": false, 731 | "r": 0.2 732 | } 733 | ], 734 | "usage_so_far": { 735 | "completion_tokens": 2891, 736 | "prompt_tokens": 7109, 737 | "cost": 0.38673 738 | } 739 | }, 740 | { 741 | "idx": 9, 742 | "ys": [ 743 | "E R U C T\nS I T E\nA D M O N\nT O S S P O T\nE N T E R", 744 | "E R U C T\nS I T U S\nA D M O N\nS O T T E D\nE N T E R", 745 | "E R U C T\nS I T E\nA D M O N\nS O T\nE N T E R", 746 | "E R U C T\nS I T E S\nA D M O N\nS O T T E R\nE N T E R", 747 | "E R U C T\nS P O T S\nA D M O N\nS O T T I E\nE N T E R", 748 | "E R U C T\nS I T E S\nA D M O N\nS O T S O\nE N T E R", 749 | "E R U C T\nS I T E S\nA D M O N\nS O T T E\nE N T E R", 750 | "E R U C T\nS I T E\nA D M O N\nS O T\nE N T E R", 751 | "E R U C T\nS I T E S\nA D M O N\nS O T T E R\nE N T E R", 752 | "E R U C T\nS I T E S\nA D M O N\nS O T T E R\nE N T E R" 753 | ], 754 | "infos": [ 755 | { 756 | "r_letter": 0.52, 757 | "r_word": 0.2, 758 | "r_game": false, 759 | "r": 0.2 760 | }, 761 | { 762 | "r_letter": 0.52, 763 | "r_word": 0.2, 764 | "r_game": false, 765 | "r": 0.2 766 | }, 767 | { 768 | "r_letter": 0.48, 769 | "r_word": 0.2, 770 | "r_game": false, 771 | "r": 0.2 772 | }, 773 | { 774 | "r_letter": 0.48, 775 | "r_word": 0.2, 776 | "r_game": false, 777 | "r": 0.2 778 | }, 779 | { 780 | "r_letter": 0.48, 781 | "r_word": 0.2, 782 | "r_game": false, 783 | "r": 0.2 784 | }, 785 | { 786 | "r_letter": 0.56, 787 | "r_word": 0.2, 788 | "r_game": false, 789 | "r": 0.2 790 | }, 791 | { 792 | "r_letter": 0.48, 793 | "r_word": 0.2, 794 | "r_game": false, 795 | "r": 0.2 796 | }, 797 | { 798 | "r_letter": 0.48, 799 | "r_word": 0.2, 800 | "r_game": false, 801 | "r": 0.2 802 | }, 803 | { 804 | "r_letter": 0.48, 805 | "r_word": 0.2, 806 | "r_game": false, 807 | "r": 0.2 808 | }, 809 | { 810 | "r_letter": 0.48, 811 | "r_word": 0.2, 812 | "r_game": false, 813 | "r": 0.2 814 | } 815 | ], 816 | "usage_so_far": { 817 | "completion_tokens": 3181, 818 | "prompt_tokens": 7885, 819 | "cost": 
0.42740999999999996 820 | } 821 | }, 822 | { 823 | "idx": 10, 824 | "ys": [ 825 | "F L A N\nA L O G\nM E D I\nI T E M\nM E R G E", 826 | "F L A N\nL I T H O\nI N T R A\nT A S K\nM I X E D", 827 | "F L A N\nL I T H O\nI N N E R\nT A S K\nM E R G E", 828 | "F L A N S\nL I T H O\nE N T A D\nC H O R E\nM E R G E", 829 | "F L A N S\nL I T H O\nI N N E R\nE R I N S\nM E R G E", 830 | "F L A N S\nL I T H O\nA D E N T\nC H O R E\nM E L D S", 831 | "F L A N\nL I T H O\nI N N E R\nT A S K\nM E R G E", 832 | "F L A N\nL I T H O\nE N T A D\nC H O R E\nM I X E D", 833 | "F L A N S\nA L O N G\nI N T R A\nE R A N D\nM E L D S", 834 | "F L A N S\nL I T H O\nA D L U M\nC H O R E\nU N I T E" 835 | ], 836 | "infos": [ 837 | { 838 | "r_letter": 0.36, 839 | "r_word": 0.1, 840 | "r_game": false, 841 | "r": 0.1 842 | }, 843 | { 844 | "r_letter": 0.44, 845 | "r_word": 0.1, 846 | "r_game": false, 847 | "r": 0.1 848 | }, 849 | { 850 | "r_letter": 0.56, 851 | "r_word": 0.2, 852 | "r_game": false, 853 | "r": 0.2 854 | }, 855 | { 856 | "r_letter": 0.68, 857 | "r_word": 0.2, 858 | "r_game": false, 859 | "r": 0.2 860 | }, 861 | { 862 | "r_letter": 0.6, 863 | "r_word": 0.2, 864 | "r_game": false, 865 | "r": 0.2 866 | }, 867 | { 868 | "r_letter": 0.4, 869 | "r_word": 0.1, 870 | "r_game": false, 871 | "r": 0.1 872 | }, 873 | { 874 | "r_letter": 0.56, 875 | "r_word": 0.2, 876 | "r_game": false, 877 | "r": 0.2 878 | }, 879 | { 880 | "r_letter": 0.52, 881 | "r_word": 0.1, 882 | "r_game": false, 883 | "r": 0.1 884 | }, 885 | { 886 | "r_letter": 0.36, 887 | "r_word": 0.0, 888 | "r_game": false, 889 | "r": 0.0 890 | }, 891 | { 892 | "r_letter": 0.36, 893 | "r_word": 0.1, 894 | "r_game": false, 895 | "r": 0.1 896 | } 897 | ], 898 | "usage_so_far": { 899 | "completion_tokens": 3460, 900 | "prompt_tokens": 8694, 901 | "cost": 0.46841999999999995 902 | } 903 | }, 904 | { 905 | "idx": 11, 906 | "ys": [ 907 | "T H R O N G\nR E C R E W\nA R A P A H O\nT O T E M\nE W E R Y", 908 | "R A B B L\nE M A N A\nM O H A 
W\nK E T U A\nR E W A R", 909 | "R A B B L\nE M B A R\nC R E E S\nT O T E M\nE W E R Y", 910 | "R A B B L\nE M B A R\nC H E R O\nK E E T O\nW A S H Y", 911 | "T H R O N G\nR E M A N\nC R O W S\nT O T E M\nE W E R Y", 912 | "T H R O N G\nR E C O I L\nO T O M I S\nC L A N T O\nE W E R Y", 913 | "R A B B L\nE C R E W\nC H E R O\nT O T E M\nE W E R Y", 914 | "T H R O N G\nR E C R E W\nO M A H A S\nT O T E M S\nE W E R Y", 915 | "R A B B L\nE C R E W\nC H E Y E\nT O T E M\nA Q U A R", 916 | "R A B B L\nE M A N E\nC R E E K\nT O T E M\nE W E R Y" 917 | ], 918 | "infos": [ 919 | { 920 | "r_letter": 0.52, 921 | "r_word": 0.2, 922 | "r_game": false, 923 | "r": 0.2 924 | }, 925 | { 926 | "r_letter": 0.04, 927 | "r_word": 0.0, 928 | "r_game": false, 929 | "r": 0.0 930 | }, 931 | { 932 | "r_letter": 0.44, 933 | "r_word": 0.2, 934 | "r_game": false, 935 | "r": 0.2 936 | }, 937 | { 938 | "r_letter": 0.08, 939 | "r_word": 0.0, 940 | "r_game": false, 941 | "r": 0.0 942 | }, 943 | { 944 | "r_letter": 0.6, 945 | "r_word": 0.3, 946 | "r_game": false, 947 | "r": 0.3 948 | }, 949 | { 950 | "r_letter": 0.32, 951 | "r_word": 0.1, 952 | "r_game": false, 953 | "r": 0.1 954 | }, 955 | { 956 | "r_letter": 0.4, 957 | "r_word": 0.2, 958 | "r_game": false, 959 | "r": 0.2 960 | }, 961 | { 962 | "r_letter": 0.56, 963 | "r_word": 0.2, 964 | "r_game": false, 965 | "r": 0.2 966 | }, 967 | { 968 | "r_letter": 0.24, 969 | "r_word": 0.1, 970 | "r_game": false, 971 | "r": 0.1 972 | }, 973 | { 974 | "r_letter": 0.4, 975 | "r_word": 0.2, 976 | "r_game": false, 977 | "r": 0.2 978 | } 979 | ], 980 | "usage_so_far": { 981 | "completion_tokens": 3763, 982 | "prompt_tokens": 9478, 983 | "cost": 0.5101199999999999 984 | } 985 | }, 986 | { 987 | "idx": 12, 988 | "ys": [ 989 | "F R I T H\nR O U S E\nA G A M A\nR A I S E\nE I D E R", 990 | "C O P S E\nL A R E R\nA G A M A\nE X A L T\nI D E R D", 991 | "C O P S E\nL A R U M\nA G A M A\nR A I S E\nE I D E R", 992 | "C O P S E\nA L A R M\nA G A M A\nR E A R S\nE I D 
E R", 993 | "C O P S E\nL A R U M\nA G A M A\nE L E V A\nT E A L E", 994 | "C O P S E\nL A R U M\nA G A M A\nR A I S E\nE I D E R", 995 | "C O P S E\nA L A R M\nA G A M A\nE X A L T\nI D E R D", 996 | "C O P S E\nR I O T S\nA G A M A\nE R E C T\nE I D E R", 997 | "C O P S E\nL A R U M\nA G A M A\nE L E V A\nT E A L E", 998 | "C O P S E\nA L A R M\nG A L L I\nU P R A I\nD U V E T" 999 | ], 1000 | "infos": [ 1001 | { 1002 | "r_letter": 0.44, 1003 | "r_word": 0.2, 1004 | "r_game": false, 1005 | "r": 0.2 1006 | }, 1007 | { 1008 | "r_letter": 0.2, 1009 | "r_word": 0.0, 1010 | "r_game": false, 1011 | "r": 0.0 1012 | }, 1013 | { 1014 | "r_letter": 0.68, 1015 | "r_word": 0.3, 1016 | "r_game": false, 1017 | "r": 0.3 1018 | }, 1019 | { 1020 | "r_letter": 0.36, 1021 | "r_word": 0.1, 1022 | "r_game": false, 1023 | "r": 0.1 1024 | }, 1025 | { 1026 | "r_letter": 0.28, 1027 | "r_word": 0.1, 1028 | "r_game": false, 1029 | "r": 0.1 1030 | }, 1031 | { 1032 | "r_letter": 0.68, 1033 | "r_word": 0.3, 1034 | "r_game": false, 1035 | "r": 0.3 1036 | }, 1037 | { 1038 | "r_letter": 0.12, 1039 | "r_word": 0.0, 1040 | "r_game": false, 1041 | "r": 0.0 1042 | }, 1043 | { 1044 | "r_letter": 0.28, 1045 | "r_word": 0.1, 1046 | "r_game": false, 1047 | "r": 0.1 1048 | }, 1049 | { 1050 | "r_letter": 0.28, 1051 | "r_word": 0.1, 1052 | "r_game": false, 1053 | "r": 0.1 1054 | }, 1055 | { 1056 | "r_letter": 0.16, 1057 | "r_word": 0.0, 1058 | "r_game": false, 1059 | "r": 0.0 1060 | } 1061 | ], 1062 | "usage_so_far": { 1063 | "completion_tokens": 4053, 1064 | "prompt_tokens": 10270, 1065 | "cost": 0.55128 1066 | } 1067 | }, 1068 | { 1069 | "idx": 13, 1070 | "ys": [ 1071 | "C O P S E\nR E M E X\nU V U L A\nT E N I A\nL E G E R", 1072 | "W O O D S\nR E M I X\nU V U L A\nM I D S T\nS K I F F", 1073 | "W O O D S\nR E M I G E\nU V U L A\nM I D D L E\nY A W L", 1074 | "C O P S E\nR E M I G\nU V U L A\nM I D S T\nS C O W S", 1075 | "C O P S E\nR E M I G\nU V U L A\nM I D D L\nE L E V E", 1076 | "W I L D S\nR E M 
I G\nU V U L A\nM I D D L\nE L E V A", 1077 | "C O P S E\nR E M I G\nU V U L A\nC E N T R\nE L E V E", 1078 | "F O R E S\nR E M I G\nU V U L A\nM I D W A\nY A W L S", 1079 | "G R O V E\nR E M I G E\nU V U L A\nM E D I A L\nP U N T E R", 1080 | "F O R E S\nR E M I G\nE P I L Y\nM E S I C\nB A R G E" 1081 | ], 1082 | "infos": [ 1083 | { 1084 | "r_letter": 0.48, 1085 | "r_word": 0.2, 1086 | "r_game": false, 1087 | "r": 0.2 1088 | }, 1089 | { 1090 | "r_letter": 0.44, 1091 | "r_word": 0.1, 1092 | "r_game": false, 1093 | "r": 0.1 1094 | }, 1095 | { 1096 | "r_letter": 0.44, 1097 | "r_word": 0.1, 1098 | "r_game": false, 1099 | "r": 0.1 1100 | }, 1101 | { 1102 | "r_letter": 0.4, 1103 | "r_word": 0.1, 1104 | "r_game": false, 1105 | "r": 0.1 1106 | }, 1107 | { 1108 | "r_letter": 0.52, 1109 | "r_word": 0.1, 1110 | "r_game": false, 1111 | "r": 0.1 1112 | }, 1113 | { 1114 | "r_letter": 0.52, 1115 | "r_word": 0.1, 1116 | "r_game": false, 1117 | "r": 0.1 1118 | }, 1119 | { 1120 | "r_letter": 0.48, 1121 | "r_word": 0.1, 1122 | "r_game": false, 1123 | "r": 0.1 1124 | }, 1125 | { 1126 | "r_letter": 0.36, 1127 | "r_word": 0.1, 1128 | "r_game": false, 1129 | "r": 0.1 1130 | }, 1131 | { 1132 | "r_letter": 0.52, 1133 | "r_word": 0.2, 1134 | "r_game": false, 1135 | "r": 0.2 1136 | }, 1137 | { 1138 | "r_letter": 0.28, 1139 | "r_word": 0.0, 1140 | "r_game": false, 1141 | "r": 0.0 1142 | } 1143 | ], 1144 | "usage_so_far": { 1145 | "completion_tokens": 4347, 1146 | "prompt_tokens": 11046, 1147 | "cost": 0.5922 1148 | } 1149 | }, 1150 | { 1151 | "idx": 14, 1152 | "ys": [ 1153 | "A R A B I\nM U L C T\nA B A S E\nR I N S E\nI R A P A", 1154 | "T A M I L\nA M E R C\nA B A S E\nR I N S E\nN I P A H", 1155 | "T A M I L\nA M E R C E\nA B A S E\nR I N S E\nN I P A H", 1156 | "T A M I L\nA M E R C\nA B A T E\nR I N S E\nA T A P A", 1157 | "T A M U L\nA M E R C\nA B A S E\nR I N S E\nI R A P O", 1158 | "T A M I L\nA M E R C E\nA B A S E\nR I N S E\nN I P A H", 1159 | "A R A B I\nM U L C T\nA B A S E\nR 
I N S E\nA T A P S", 1160 | "T A M I L\nA M E R C E\nA B A S E\nR I N S E\nN I P A H", 1161 | "T A M I L\nA M E R C E\nA B A S E\nR I N S E\nN I P A H", 1162 | "T A M I L\nA M E R C\nM A B A S E\nR I N S E\nE N I P A" 1163 | ], 1164 | "infos": [ 1165 | { 1166 | "r_letter": 0.48, 1167 | "r_word": 0.2, 1168 | "r_game": false, 1169 | "r": 0.2 1170 | }, 1171 | { 1172 | "r_letter": 0.44, 1173 | "r_word": 0.2, 1174 | "r_game": false, 1175 | "r": 0.2 1176 | }, 1177 | { 1178 | "r_letter": 0.44, 1179 | "r_word": 0.2, 1180 | "r_game": false, 1181 | "r": 0.2 1182 | }, 1183 | { 1184 | "r_letter": 0.44, 1185 | "r_word": 0.1, 1186 | "r_game": false, 1187 | "r": 0.1 1188 | }, 1189 | { 1190 | "r_letter": 0.4, 1191 | "r_word": 0.2, 1192 | "r_game": false, 1193 | "r": 0.2 1194 | }, 1195 | { 1196 | "r_letter": 0.44, 1197 | "r_word": 0.2, 1198 | "r_game": false, 1199 | "r": 0.2 1200 | }, 1201 | { 1202 | "r_letter": 0.56, 1203 | "r_word": 0.2, 1204 | "r_game": false, 1205 | "r": 0.2 1206 | }, 1207 | { 1208 | "r_letter": 0.44, 1209 | "r_word": 0.2, 1210 | "r_game": false, 1211 | "r": 0.2 1212 | }, 1213 | { 1214 | "r_letter": 0.44, 1215 | "r_word": 0.2, 1216 | "r_game": false, 1217 | "r": 0.2 1218 | }, 1219 | { 1220 | "r_letter": 0.2, 1221 | "r_word": 0.1, 1222 | "r_game": false, 1223 | "r": 0.1 1224 | } 1225 | ], 1226 | "usage_so_far": { 1227 | "completion_tokens": 4642, 1228 | "prompt_tokens": 11829, 1229 | "cost": 0.63339 1230 | } 1231 | }, 1232 | { 1233 | "idx": 15, 1234 | "ys": [ 1235 | "E M P E R\nP R E L U\nI G N O M\nS C A R H\nK I E L Y", 1236 | "E M P E R\nP R E A M\nI G N O M\nN A V E L\nG L E E D", 1237 | "W I L H E\nP R E L U\nI G N O M\nN A V E L\nE M B E R", 1238 | "E M P E R\nP R E L U\nI G N O M\nN A V E L\nK I T E S", 1239 | "E M P E R\nP R E L U\nI G N O M\nN A V E L\nK E S R A", 1240 | "T S A R\nP R E P\nI G N O\nN A V E\nE M B E R", 1241 | "W I L H E\nP R O E M\nI G N O M\nN A V E L\nH A R R Y", 1242 | "W I L H E\nI N T R O\nO D I U M\nM P H A L I\nK I T E S", 1243 | 
"E M P E R\nP R E L U\nI G N O M\nN A V E L\nK I T E S", 1244 | "E M P E R\nP R E L U\nI G N O M\nN A V E L\nG L E E D" 1245 | ], 1246 | "infos": [ 1247 | { 1248 | "r_letter": 0.08, 1249 | "r_word": 0.0, 1250 | "r_game": false, 1251 | "r": 0.0 1252 | }, 1253 | { 1254 | "r_letter": 0.36, 1255 | "r_word": 0.1, 1256 | "r_game": false, 1257 | "r": 0.1 1258 | }, 1259 | { 1260 | "r_letter": 0.2, 1261 | "r_word": 0.1, 1262 | "r_game": false, 1263 | "r": 0.1 1264 | }, 1265 | { 1266 | "r_letter": 0.24, 1267 | "r_word": 0.1, 1268 | "r_game": false, 1269 | "r": 0.1 1270 | }, 1271 | { 1272 | "r_letter": 0.24, 1273 | "r_word": 0.1, 1274 | "r_game": false, 1275 | "r": 0.1 1276 | }, 1277 | { 1278 | "r_letter": 0.16, 1279 | "r_word": 0.0, 1280 | "r_game": false, 1281 | "r": 0.0 1282 | }, 1283 | { 1284 | "r_letter": 0.2, 1285 | "r_word": 0.1, 1286 | "r_game": false, 1287 | "r": 0.1 1288 | }, 1289 | { 1290 | "r_letter": 0.28, 1291 | "r_word": 0.1, 1292 | "r_game": false, 1293 | "r": 0.1 1294 | }, 1295 | { 1296 | "r_letter": 0.24, 1297 | "r_word": 0.1, 1298 | "r_game": false, 1299 | "r": 0.1 1300 | }, 1301 | { 1302 | "r_letter": 0.36, 1303 | "r_word": 0.1, 1304 | "r_game": false, 1305 | "r": 0.1 1306 | } 1307 | ], 1308 | "usage_so_far": { 1309 | "completion_tokens": 4929, 1310 | "prompt_tokens": 12628, 1311 | "cost": 0.67458 1312 | } 1313 | }, 1314 | { 1315 | "idx": 16, 1316 | "ys": [ 1317 | "S H I L O\nO V I N E\nG R O V E\nI N N E R\nD O M A I N", 1318 | "E D E N S\nO V I N E\nW O O D S\nI N N E R\nD O M A I N", 1319 | "E D E N S\nO V I N E\nC O V E R\nI N T R A\nR E G N A", 1320 | "E D E N S\nO V I N E\nC O P S E\nI N N E R\nD O M A I N", 1321 | "E D E N S\nO V I N E\nC O P S E\nI N T R A\nD E M E S", 1322 | "E D E N S\nO V I N E\nC O P S E\nI N N E R\nD O M A I N", 1323 | "E D E N I\nO V I N E\nF O R E S\nI N N E R\nD E M E S", 1324 | "E D E N S\nO V I N E\nC O P S E\nC R U R A\nD O M A I N", 1325 | "S H I L O\nO V I N E\nF O R E S\nI N N E R\nW O R L D", 1326 | "E D E N S\nO V I 
N E\nC O P S E\nI N T I M\nD O M A I N" 1327 | ], 1328 | "infos": [ 1329 | { 1330 | "r_letter": 0.2, 1331 | "r_word": 0.0, 1332 | "r_game": false, 1333 | "r": 0.0 1334 | }, 1335 | { 1336 | "r_letter": 0.24, 1337 | "r_word": 0.0, 1338 | "r_game": false, 1339 | "r": 0.0 1340 | }, 1341 | { 1342 | "r_letter": 0.32, 1343 | "r_word": 0.0, 1344 | "r_game": false, 1345 | "r": 0.0 1346 | }, 1347 | { 1348 | "r_letter": 0.2, 1349 | "r_word": 0.0, 1350 | "r_game": false, 1351 | "r": 0.0 1352 | }, 1353 | { 1354 | "r_letter": 0.28, 1355 | "r_word": 0.0, 1356 | "r_game": false, 1357 | "r": 0.0 1358 | }, 1359 | { 1360 | "r_letter": 0.2, 1361 | "r_word": 0.0, 1362 | "r_game": false, 1363 | "r": 0.0 1364 | }, 1365 | { 1366 | "r_letter": 0.24, 1367 | "r_word": 0.0, 1368 | "r_game": false, 1369 | "r": 0.0 1370 | }, 1371 | { 1372 | "r_letter": 0.16, 1373 | "r_word": 0.0, 1374 | "r_game": false, 1375 | "r": 0.0 1376 | }, 1377 | { 1378 | "r_letter": 0.2, 1379 | "r_word": 0.0, 1380 | "r_game": false, 1381 | "r": 0.0 1382 | }, 1383 | { 1384 | "r_letter": 0.24, 1385 | "r_word": 0.0, 1386 | "r_game": false, 1387 | "r": 0.0 1388 | } 1389 | ], 1390 | "usage_so_far": { 1391 | "completion_tokens": 5225, 1392 | "prompt_tokens": 13404, 1393 | "cost": 0.7156199999999999 1394 | } 1395 | }, 1396 | { 1397 | "idx": 17, 1398 | "ys": [ 1399 | "N U M E N\nU S A G E\nN I B B L\nP R I E S\nS L E N D", 1400 | "N U M E N\nU S A G E\nB I T E S\nP R I E S\nS P A R E", 1401 | "N U M E N\nU S A G E\nM U M M L\nE G I S E\nS P A R E", 1402 | "N U M E N\nU S A G E\nM A S T I\nE G I S E\nS P A R E", 1403 | "N U M E N\nU S A G E\nC R U N C\nH P R Y S\nS P A R E", 1404 | "N U M E N\nU S A G E\nM U M M L E\nP R I E S\nS P A R E", 1405 | "N U M E N\nU S A G E\nC H E W S\nP R Y I N\nL E A N T", 1406 | "N U M E N\nU S A G E\nM A S T I C\nP R I E S\nS P A R E", 1407 | "N U M E N\nU S A G E\nM U M B L\nE G G S U\nS P A R E", 1408 | "N U M E N\nU S A G E\nG N A W S\nP R Y E R\nS P A R E" 1409 | ], 1410 | "infos": [ 1411 | { 
1412 | "r_letter": 0.64, 1413 | "r_word": 0.3, 1414 | "r_game": false, 1415 | "r": 0.3 1416 | }, 1417 | { 1418 | "r_letter": 0.8, 1419 | "r_word": 0.4, 1420 | "r_game": false, 1421 | "r": 0.4 1422 | }, 1423 | { 1424 | "r_letter": 0.72, 1425 | "r_word": 0.3, 1426 | "r_game": false, 1427 | "r": 0.3 1428 | }, 1429 | { 1430 | "r_letter": 0.68, 1431 | "r_word": 0.3, 1432 | "r_game": false, 1433 | "r": 0.3 1434 | }, 1435 | { 1436 | "r_letter": 0.64, 1437 | "r_word": 0.3, 1438 | "r_game": false, 1439 | "r": 0.3 1440 | }, 1441 | { 1442 | "r_letter": 0.88, 1443 | "r_word": 0.6, 1444 | "r_game": false, 1445 | "r": 0.6 1446 | }, 1447 | { 1448 | "r_letter": 0.52, 1449 | "r_word": 0.2, 1450 | "r_game": false, 1451 | "r": 0.2 1452 | }, 1453 | { 1454 | "r_letter": 0.84, 1455 | "r_word": 0.5, 1456 | "r_game": false, 1457 | "r": 0.5 1458 | }, 1459 | { 1460 | "r_letter": 0.68, 1461 | "r_word": 0.3, 1462 | "r_game": false, 1463 | "r": 0.3 1464 | }, 1465 | { 1466 | "r_letter": 0.72, 1467 | "r_word": 0.3, 1468 | "r_game": false, 1469 | "r": 0.3 1470 | } 1471 | ], 1472 | "usage_so_far": { 1473 | "completion_tokens": 5517, 1474 | "prompt_tokens": 14211, 1475 | "cost": 0.75735 1476 | } 1477 | }, 1478 | { 1479 | "idx": 18, 1480 | "ys": [ 1481 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nM O N A D", 1482 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nU N A R Y", 1483 | "P I P I T\nI R A D E\nC H O S E N\nH E A L T H\nM O N A D", 1484 | "P I P I T\nI R A D E\nC H O S E\nS A N E R\nU N I T Y", 1485 | "P I P I T\nI R A D E\nC H O S E N\nH E A L T H\nM O N A D", 1486 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nM O N A D", 1487 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nU N I T Y", 1488 | "P I P I T\nI R A D E\nC H O S E\nN U M M U\nU N A R Y", 1489 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nM O N A D", 1490 | "P I P I T\nI R A D E\nC H O S E N\nS A N E R\nM O N A D" 1491 | ], 1492 | "infos": [ 1493 | { 1494 | "r_letter": 0.52, 1495 | "r_word": 0.2, 1496 | "r_game": false, 
1497 | "r": 0.2 1498 | }, 1499 | { 1500 | "r_letter": 0.72, 1501 | "r_word": 0.3, 1502 | "r_game": false, 1503 | "r": 0.3 1504 | }, 1505 | { 1506 | "r_letter": 0.44, 1507 | "r_word": 0.2, 1508 | "r_game": false, 1509 | "r": 0.2 1510 | }, 1511 | { 1512 | "r_letter": 0.64, 1513 | "r_word": 0.2, 1514 | "r_game": false, 1515 | "r": 0.2 1516 | }, 1517 | { 1518 | "r_letter": 0.44, 1519 | "r_word": 0.2, 1520 | "r_game": false, 1521 | "r": 0.2 1522 | }, 1523 | { 1524 | "r_letter": 0.52, 1525 | "r_word": 0.2, 1526 | "r_game": false, 1527 | "r": 0.2 1528 | }, 1529 | { 1530 | "r_letter": 0.64, 1531 | "r_word": 0.2, 1532 | "r_game": false, 1533 | "r": 0.2 1534 | }, 1535 | { 1536 | "r_letter": 0.6, 1537 | "r_word": 0.3, 1538 | "r_game": false, 1539 | "r": 0.3 1540 | }, 1541 | { 1542 | "r_letter": 0.52, 1543 | "r_word": 0.2, 1544 | "r_game": false, 1545 | "r": 0.2 1546 | }, 1547 | { 1548 | "r_letter": 0.52, 1549 | "r_word": 0.2, 1550 | "r_game": false, 1551 | "r": 0.2 1552 | } 1553 | ], 1554 | "usage_so_far": { 1555 | "completion_tokens": 5817, 1556 | "prompt_tokens": 14998, 1557 | "cost": 0.7989599999999999 1558 | } 1559 | }, 1560 | { 1561 | "idx": 19, 1562 | "ys": [ 1563 | "S T E R E\nS I M A R\nS C I S S\nE N S E R\nT R E A D", 1564 | "S E T T Y\nR I C E R\nE R A S E\nN E M A T\nT R E A D", 1565 | "S T E R E\nS C A R F\nL O P E\nE N S E\nT R E A D", 1566 | "S E T T E\nC Y M A R\nH E W\nS E N S E\nT R E A D", 1567 | "S E T I N\nC I M A R\nL O P E\nS E N S E\nT R E A D", 1568 | "T Y P E S\nR I C E R\nE S S E S\nA W A R E\nD T R E A D", 1569 | "S E T I N\nC Y M A R\nH E W E R\nE S S E N\nT R E A D", 1570 | "S T E R E\nC Y M A R\nL O P E\nS E N S E\nT R E A D", 1571 | "S E T T E\nC Y M A R\nL O P E\nS E N S E\nT R E A D", 1572 | "S E T T O\nC Y M A R\nI N C I S\nE N O L A\nT R E A D" 1573 | ], 1574 | "infos": [ 1575 | { 1576 | "r_letter": 0.56, 1577 | "r_word": 0.2, 1578 | "r_game": false, 1579 | "r": 0.2 1580 | }, 1581 | { 1582 | "r_letter": 0.4, 1583 | "r_word": 0.1, 1584 | 
"r_game": false, 1585 | "r": 0.1 1586 | }, 1587 | { 1588 | "r_letter": 0.24, 1589 | "r_word": 0.1, 1590 | "r_game": false, 1591 | "r": 0.1 1592 | }, 1593 | { 1594 | "r_letter": 0.52, 1595 | "r_word": 0.2, 1596 | "r_game": false, 1597 | "r": 0.2 1598 | }, 1599 | { 1600 | "r_letter": 0.56, 1601 | "r_word": 0.2, 1602 | "r_game": false, 1603 | "r": 0.2 1604 | }, 1605 | { 1606 | "r_letter": 0.12, 1607 | "r_word": 0.0, 1608 | "r_game": false, 1609 | "r": 0.0 1610 | }, 1611 | { 1612 | "r_letter": 0.32, 1613 | "r_word": 0.1, 1614 | "r_game": false, 1615 | "r": 0.1 1616 | }, 1617 | { 1618 | "r_letter": 0.52, 1619 | "r_word": 0.2, 1620 | "r_game": false, 1621 | "r": 0.2 1622 | }, 1623 | { 1624 | "r_letter": 0.52, 1625 | "r_word": 0.2, 1626 | "r_game": false, 1627 | "r": 0.2 1628 | }, 1629 | { 1630 | "r_letter": 0.32, 1631 | "r_word": 0.1, 1632 | "r_game": false, 1633 | "r": 0.1 1634 | } 1635 | ], 1636 | "usage_so_far": { 1637 | "completion_tokens": 6101, 1638 | "prompt_tokens": 15805, 1639 | "cost": 0.8402099999999999 1640 | } 1641 | } 1642 | ] -------------------------------------------------------------------------------- /pics/fake.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/princeton-nlp/tree-of-thought-llm/8050e67d0e3a0fddc424d7fa5801538722a4c4cc/pics/fake.png -------------------------------------------------------------------------------- /pics/teaser.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/princeton-nlp/tree-of-thought-llm/8050e67d0e3a0fddc424d7fa5801538722a4c4cc/pics/teaser.png -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [build-system] 2 | requires = ["setuptools >= 61.0.0"] 3 | build-backend = "setuptools.build_meta" 4 | 5 | [project] 6 | name = "tree-of-thoughts-llm" 7 | 
version = "0.1.0" 8 | description = 'Official Implementation of "Tree of Thoughts: Deliberate Problem Solving with Large Language Models"' 9 | readme = "README.md" 10 | requires-python = ">= 3.7" 11 | authors = [{ name = "Shunyu Yao", email = "shunyuyao.cs@gmail.com" }] 12 | license = { text = "MIT License" } 13 | keywords = ["tree-search", "large-language-models", "llm", "prompting", "tree-of-thoughts"] 14 | classifiers = [ 15 | "License :: OSI Approved :: MIT License", 16 | "Programming Language :: Python :: 3", 17 | "Programming Language :: Python :: 3.7", 18 | "Programming Language :: Python :: 3.8", 19 | "Programming Language :: Python :: 3.9", 20 | "Programming Language :: Python :: 3.10", 21 | "Programming Language :: Python :: 3.11", 22 | 'Intended Audience :: Science/Research', 23 | 'Topic :: Scientific/Engineering :: Artificial Intelligence', 24 | ] 25 | dynamic=["dependencies"] 26 | 27 | 28 | [tool.setuptools.dynamic] 29 | dependencies = {file = ["requirements.txt"]} 30 | 31 | [tool.setuptools.packages.find] 32 | where = ["src"] # list of folders that contain the packages (["."] by default) 33 | 34 | [project.urls] 35 | Homepage = "https://github.com/princeton-nlp/tree-of-thought-llm" 36 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | aiohttp==3.8.4 2 | aiosignal==1.3.1 3 | async-timeout==4.0.2 4 | attrs==23.1.0 5 | backoff==2.2.1 6 | certifi==2023.5.7 7 | charset-normalizer==3.1.0 8 | frozenlist==1.3.3 9 | idna==3.4 10 | mpmath==1.3.0 11 | multidict==6.0.4 12 | numpy==1.24.3 13 | openai==0.27.7 14 | requests==2.31.0 15 | sympy==1.12 16 | tqdm==4.65.0 17 | urllib3==2.0.2 18 | yarl==1.9.2 19 | pandas==2.0.3 -------------------------------------------------------------------------------- /run.py: -------------------------------------------------------------------------------- 1 | import os 2 | import json 3 | import 
argparse 4 | 5 | from tot.tasks import get_task 6 | from tot.methods.bfs import solve, naive_solve 7 | from tot.models import gpt_usage 8 | 9 | def run(args): 10 | task = get_task(args.task) 11 | logs, cnt_avg, cnt_any = [], 0, 0 12 | if args.naive_run: 13 | file = f'./logs/{args.task}/{args.backend}_{args.temperature}_naive_{args.prompt_sample}_sample_{args.n_generate_sample}_start{args.task_start_index}_end{args.task_end_index}.json' 14 | else: 15 | file = f'./logs/{args.task}/{args.backend}_{args.temperature}_{args.method_generate}{args.n_generate_sample}_{args.method_evaluate}{args.n_evaluate_sample}_{args.method_select}{args.n_select_sample}_start{args.task_start_index}_end{args.task_end_index}.json' 16 | os.makedirs(os.path.dirname(file), exist_ok=True) 17 | 18 | for i in range(args.task_start_index, args.task_end_index): 19 | # solve 20 | if args.naive_run: 21 | ys, info = naive_solve(args, task, i) 22 | else: 23 | ys, info = solve(args, task, i) 24 | 25 | # log 26 | infos = [task.test_output(i, y) for y in ys] 27 | info.update({'idx': i, 'ys': ys, 'infos': infos, 'usage_so_far': gpt_usage(args.backend)}) 28 | logs.append(info) 29 | with open(file, 'w') as f: 30 | json.dump(logs, f, indent=4) 31 | 32 | # log main metric 33 | accs = [info['r'] for info in infos] 34 | cnt_avg += sum(accs) / len(accs) 35 | cnt_any += any(accs) 36 | print(i, 'sum(accs)', sum(accs), 'cnt_avg', cnt_avg, 'cnt_any', cnt_any, '\n') 37 | 38 | n = args.task_end_index - args.task_start_index 39 | print(cnt_avg / n, cnt_any / n) 40 | print('usage_so_far', gpt_usage(args.backend)) 41 | 42 | 43 | def parse_args(): 44 | args = argparse.ArgumentParser() 45 | args.add_argument('--backend', type=str, choices=['gpt-4', 'gpt-3.5-turbo', 'gpt-4o'], default='gpt-4') 46 | args.add_argument('--temperature', type=float, default=0.7) 47 | 48 | args.add_argument('--task', type=str, required=True, choices=['game24', 'text', 'crosswords']) 49 | args.add_argument('--task_start_index', type=int, 
default=900) 50 | args.add_argument('--task_end_index', type=int, default=1000) 51 | 52 | args.add_argument('--naive_run', action='store_true') 53 | args.add_argument('--prompt_sample', type=str, choices=['standard', 'cot']) # only used when method_generate = sample, or naive_run 54 | 55 | args.add_argument('--method_generate', type=str, choices=['sample', 'propose']) 56 | args.add_argument('--method_evaluate', type=str, choices=['value', 'vote']) 57 | args.add_argument('--method_select', type=str, choices=['sample', 'greedy'], default='greedy') 58 | args.add_argument('--n_generate_sample', type=int, default=1) # only thing needed if naive_run 59 | args.add_argument('--n_evaluate_sample', type=int, default=1) 60 | args.add_argument('--n_select_sample', type=int, default=1) 61 | 62 | args = args.parse_args() 63 | return args 64 | 65 | 66 | if __name__ == '__main__': 67 | args = parse_args() 68 | print(args) 69 | run(args) -------------------------------------------------------------------------------- /scripts/crosswords/cot_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task crosswords \ 3 | --task_start_index 0 \ 4 | --task_end_index 20 \ 5 | --naive_run \ 6 | --prompt_sample cot \ 7 | --n_generate_sample 10 -------------------------------------------------------------------------------- /scripts/crosswords/search_crosswords-dfs.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "# Env" 9 | ] 10 | }, 11 | { 12 | "cell_type": "code", 13 | "execution_count": null, 14 | "metadata": {}, 15 | "outputs": [], 16 | "source": [ 17 | "cd .." 
18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "import json\n", 27 | "from tot.prompts.crosswords import propose_prompt, value_prompt\n", 28 | "from tot.models import gpt\n", 29 | "from tot.tasks.crosswords import MiniCrosswordsEnv\n", 30 | "\n", 31 | "env = MiniCrosswordsEnv()" 32 | ] 33 | }, 34 | { 35 | "attachments": {}, 36 | "cell_type": "markdown", 37 | "metadata": {}, 38 | "source": [ 39 | "# Prompt" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": null, 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "def prompt_wrap(obs):\n", 49 | " return propose_prompt.format(input=obs)\n", 50 | "\n", 51 | "print(prompt_wrap(env.reset(0)))\n", 52 | "# print('---------')\n", 53 | "# print(prompt_wrap(env.step('h2. value')[0]))" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | "import re\n", 63 | "import copy\n", 64 | "from tot.models import gpt\n", 65 | "\n", 66 | "def parse_line(input_str):\n", 67 | " # regular expression pattern to match the input string format\n", 68 | " pattern = r'^([hv][1-5])\\. 
([a-zA-Z]{5,5}) \\((certain|high|medium|low)\\).*$'\n", 69 | "\n", 70 | " # use regex to extract the parts of the input string\n", 71 | " match = re.match(pattern, input_str)\n", 72 | "\n", 73 | " if match:\n", 74 | " # extract the matched groups\n", 75 | " parts = [match.group(1), match.group(2), match.group(3)]\n", 76 | " return parts\n", 77 | " else:\n", 78 | " return None\n", 79 | "\n", 80 | "confidence_to_value = {'certain': 1, 'high': 0.5, 'medium': 0.2, 'low': 0.1} # TODO: ad hoc\n", 81 | "\n", 82 | "def parse_response(response):\n", 83 | " # split the response into lines\n", 84 | " lines = response.split('\\n')\n", 85 | "\n", 86 | " # parse each line\n", 87 | " parsed_lines = [parse_line(line) for line in lines]\n", 88 | "\n", 89 | " # filter out the lines that didn't match the format\n", 90 | " parsed_lines = [(line[0].lower() + '. ' + line[1].lower(), confidence_to_value.get(line[2], 0)) for line in parsed_lines if line is not None]\n", 91 | "\n", 92 | " return parsed_lines if len(parsed_lines) >= 1 else None\n", 93 | "\n", 94 | "\n", 95 | "def get_candidates_to_scores(env):\n", 96 | " obs = env.render()\n", 97 | " if obs in env.cache: \n", 98 | " print('cache hit')\n", 99 | " return env.cache[obs]\n", 100 | " print('call gpt')\n", 101 | " responses = gpt(prompt_wrap(obs), model='gpt-4', n=8)\n", 102 | " candidates_to_scores = {}\n", 103 | " for response in responses:\n", 104 | " parsed_response = parse_response(response)\n", 105 | " if parsed_response:\n", 106 | " for candidate, score in parsed_response:\n", 107 | " candidates_to_scores[candidate] = candidates_to_scores.get(candidate, 0) + score\n", 108 | " # choose candidate with highest score\n", 109 | " # print(sorted(candidates_to_scores.items(), key=lambda x: x[1], reverse=True))\n", 110 | " env.cache[obs] = candidates_to_scores\n", 111 | " return candidates_to_scores\n", 112 | "\n", 113 | "def propose_score(env, idx):\n", 114 | " obs = env.reset(idx)\n", 115 | " done = False\n", 116 | " infos = 
[]\n", 117 | " while not done:\n", 118 | " responses = gpt(prompt_wrap(obs), model='gpt-4', n=5)\n", 119 | " candidates_to_scores = {}\n", 120 | " for response in responses:\n", 121 | " parsed_response = parse_response(response)\n", 122 | " if parsed_response:\n", 123 | " for candidate, score in parsed_response:\n", 124 | " candidates_to_scores[candidate] = candidates_to_scores.get(candidate, 0) + score\n", 125 | " # choose candidate with highest score\n", 126 | " print(sorted(candidates_to_scores.items(), key=lambda x: x[1], reverse=True))\n", 127 | " if len(candidates_to_scores) == 0:\n", 128 | " break\n", 129 | " candidates = sorted(candidates_to_scores, key=candidates_to_scores.get, reverse=True)\n", 130 | " for candidate in candidates:\n", 131 | " env_ = copy.deepcopy(env)\n", 132 | " env_.step(candidate)\n", 133 | " if not any(_ == 2 for _ in env_.status):\n", 134 | " break\n", 135 | " print(candidate)\n", 136 | " # candidate = input()\n", 137 | " obs, r, done, info = env.step(candidate)\n", 138 | " print(obs)\n", 139 | " print(env.steps, info)\n", 140 | " print('-------------------\\n\\n\\n')\n", 141 | " infos.append(info)\n", 142 | " return infos" 143 | ] 144 | }, 145 | { 146 | "attachments": {}, 147 | "cell_type": "markdown", 148 | "metadata": {}, 149 | "source": [ 150 | "# DFS" 151 | ] 152 | }, 153 | { 154 | "cell_type": "code", 155 | "execution_count": null, 156 | "metadata": {}, 157 | "outputs": [], 158 | "source": [ 159 | "def dfs(env, actions, infos, time_limit, prune, max_per_state):\n", 160 | " # get candidate thoughts\n", 161 | " candidates_to_scores = get_candidates_to_scores(env)\n", 162 | " if len(candidates_to_scores) == 0: return 0, [], []\n", 163 | " print(sorted(candidates_to_scores.items(), key=lambda x: x[1], reverse=True))\n", 164 | "\n", 165 | " # back up current state\n", 166 | " board, status, steps = env.board.copy(), env.status.copy(), env.steps\n", 167 | "\n", 168 | " # try each candidate\n", 169 | " cnt_per_state = 0\n", 170 | " for 
action in sorted(candidates_to_scores, key=candidates_to_scores.get, reverse=True):\n", 171 | " obs, r, done, info = env.step(action)\n", 172 | " r = info['r_word']\n", 173 | " if len(infos) < time_limit and env.steps < 10 and not any(_ == 2 for _ in env.status): # not violating any existing constraints\n", 174 | " cnt_per_state += 1\n", 175 | " if cnt_per_state > max_per_state: break\n", 176 | " count = env.prompt_status() \n", 177 | " actions.append(action) \n", 178 | "\n", 179 | " print(len(infos))\n", 180 | " print(actions)\n", 181 | " print(env.render_board())\n", 182 | " print(info)\n", 183 | " print(count)\n", 184 | " if infos:\n", 185 | " best = max(infos, key=lambda x: x['info']['r_word'])\n", 186 | " print('best', best)\n", 187 | " print('--------------')\n", 188 | " print()\n", 189 | "\n", 190 | " info = {'total_step': len(infos), 'env_step': env.steps, 'actions': actions.copy(), 'info': info, 'count': count}\n", 191 | " infos.append(info)\n", 192 | " if not prune or count['impossible'] < 1: # only continue if the current status is possible\n", 193 | " dfs(env, actions, infos, time_limit, prune, max_per_state)\n", 194 | " actions.pop()\n", 195 | " env.reset(env.idx, board=board.copy(), status=status.copy(), steps=steps)" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": {}, 202 | "outputs": [], 203 | "source": [ 204 | "# dfs with pruning\n", 205 | "infoss = []\n", 206 | "for i in range(0, 100, 5):\n", 207 | " env.reset(i)\n", 208 | " infos = []\n", 209 | " actions = []\n", 210 | " dfs(env, actions, infos, 100, prune=True, max_per_state=3)\n", 211 | " infoss.append(infos)\n", 212 | " with open('logs/crosswords/infoss_dfs_prune.json', 'w') as fout:\n", 213 | " json.dump(infoss, fout)" 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": null, 219 | "metadata": {}, 220 | "outputs": [], 221 | "source": [ 222 | "# dfs without pruning\n", 223 | "infoss = []\n", 224 | "for i in range(0, 
100, 5):\n", 225 | " env.reset(i)\n", 226 | " infos = []\n", 227 | " actions = []\n", 228 | " dfs(env, actions, infos, 100, prune=False, max_per_state=3)\n", 229 | " infoss.append(infos)\n", 230 | " with open('logs/crosswords/infoss_dfs_no_prune.json', 'w') as fout:\n", 231 | " json.dump(infoss, fout)" 232 | ] 233 | } 234 | ], 235 | "metadata": { 236 | "kernelspec": { 237 | "display_name": "Python 3", 238 | "language": "python", 239 | "name": "python3" 240 | }, 241 | "language_info": { 242 | "codemirror_mode": { 243 | "name": "ipython", 244 | "version": 3 245 | }, 246 | "file_extension": ".py", 247 | "mimetype": "text/x-python", 248 | "name": "python", 249 | "nbconvert_exporter": "python", 250 | "pygments_lexer": "ipython3", 251 | "version": "3.7.4" 252 | } 253 | }, 254 | "nbformat": 4, 255 | "nbformat_minor": 2 256 | } 257 | -------------------------------------------------------------------------------- /scripts/crosswords/standard_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task crosswords \ 3 | --task_start_index 0 \ 4 | --task_end_index 20 \ 5 | --naive_run \ 6 | --prompt_sample standard \ 7 | --n_generate_sample 10 -------------------------------------------------------------------------------- /scripts/game24/bfs.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task game24 \ 3 | --task_start_index 900 \ 4 | --task_end_index 1000 \ 5 | --method_generate propose \ 6 | --method_evaluate value \ 7 | --method_select greedy \ 8 | --n_evaluate_sample 3 \ 9 | --n_select_sample 5 \ 10 | ${@} 11 | -------------------------------------------------------------------------------- /scripts/game24/cot_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task game24 \ 3 | --task_start_index 900 \ 4 | --task_end_index 1000 \ 5 | --naive_run \ 6 | --prompt_sample cot \ 7 | 
--n_generate_sample 100 \ 8 | ${@} -------------------------------------------------------------------------------- /scripts/game24/standard_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task game24 \ 3 | --task_start_index 900 \ 4 | --task_end_index 1000 \ 5 | --naive_run \ 6 | --prompt_sample standard \ 7 | --n_generate_sample 100 \ 8 | ${@} -------------------------------------------------------------------------------- /scripts/text/bfs.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task text \ 3 | --task_start_index 0 \ 4 | --task_end_index 100 \ 5 | --method_generate sample \ 6 | --method_evaluate vote \ 7 | --method_select greedy \ 8 | --n_generate_sample 5 \ 9 | --n_evaluate_sample 5 \ 10 | --n_select_sample 1 \ 11 | --prompt_sample cot \ 12 | --temperature 1.0 \ 13 | ${@} 14 | 15 | 16 | # 0.3 dollars per line -> 30 dollars for 100 lines -------------------------------------------------------------------------------- /scripts/text/cot_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task text \ 3 | --task_start_index 0 \ 4 | --task_end_index 100 \ 5 | --naive_run \ 6 | --prompt_sample cot \ 7 | --n_generate_sample 10 \ 8 | --temperature 1.0 \ 9 | ${@} 10 | 11 | # 0.03 dollars per line -> 3 dollars for 100 lines? -------------------------------------------------------------------------------- /scripts/text/standard_sampling.sh: -------------------------------------------------------------------------------- 1 | python run.py \ 2 | --task text \ 3 | --task_start_index 0 \ 4 | --task_end_index 100 \ 5 | --naive_run \ 6 | --prompt_sample standard \ 7 | --n_generate_sample 10 \ 8 | --temperature 1.0 \ 9 | ${@} 10 | 11 | 12 | # 0.03 dollars per line -> 3 dollars for 100 lines? 
-------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | import setuptools 2 | 3 | with open('README.md', 'r', encoding='utf-8') as fh: 4 | long_description = fh.read() 5 | 6 | 7 | setuptools.setup( 8 | name='tree-of-thoughts-llm', 9 | author='Shunyu Yao', 10 | author_email='shunyuyao.cs@gmail.com', 11 | description='Official Implementation of "Tree of Thoughts: Deliberate Problem Solving with Large Language Models"', 12 | keywords='tree-search, large-language-models, llm, prompting, tree-of-thoughts', 13 | long_description=long_description, 14 | long_description_content_type='text/markdown', 15 | url='https://github.com/princeton-nlp/tree-of-thought-llm', 16 | project_urls={ 17 | 'Homepage': 'https://github.com/princeton-nlp/tree-of-thought-llm', 18 | }, 19 | package_dir={'': 'src'}, 20 | packages=setuptools.find_packages(where='src'), 21 | classifiers=[ 22 | "License :: OSI Approved :: MIT License", 23 | "Programming Language :: Python :: 3", 24 | "Programming Language :: Python :: 3.7", 25 | "Programming Language :: Python :: 3.8", 26 | "Programming Language :: Python :: 3.9", 27 | "Programming Language :: Python :: 3.10", 28 | "Programming Language :: Python :: 3.11", 29 | 'Intended Audience :: Science/Research', 30 | 'Topic :: Scientific/Engineering :: Artificial Intelligence', 31 | ], 32 | python_requires='>=3.7', 33 | install_requires=[ 34 | 'setuptools', 35 | ], 36 | include_package_data=True, 37 | ) 38 | -------------------------------------------------------------------------------- /src/tot/__init__.py: -------------------------------------------------------------------------------- 1 | __version__ = "0.1.0" -------------------------------------------------------------------------------- /src/tot/data/24/24.csv: -------------------------------------------------------------------------------- 1 | Rank,Puzzles,AMT (s),Solved rate,1-sigma 
Mean (s),1-sigma STD (s) 2 | 1,1 1 4 6,4.4,99.20%,4.67,1.48 3 | 2,1 1 11 11,4.41,99.60%,4.68,1.45 4 | 3,1 1 3 8,4.45,99.20%,4.69,1.48 5 | 4,1 1 1 8,4.48,98.80%,4.66,1.25 6 | 5,6 6 6 6,4.59,99.40%,4.82,1.49 7 | 6,1 1 2 12,4.63,99.10%,4.95,1.57 8 | 7,1 2 2 6,4.8,99.20%,5.16,1.61 9 | 8,1 1 10 12,4.81,99%,5.06,1.54 10 | 9,2 2 10 10,4.85,98.20%,5.13,1.63 11 | 10,1 1 1 12,4.86,99.20%,5.18,1.63 12 | 11,1 1 2 8,4.96,97.80%,5.31,1.76 13 | 12,1 1 4 8,4.99,98.10%,5.35,1.83 14 | 13,1 1 5 8,5,96.40%,5.36,1.81 15 | 14,4 6 11 11,5,98.90%,5.38,1.73 16 | 15,1 1 3 12,5.02,97.30%,5.36,1.84 17 | 16,2 2 2 12,5.02,99.10%,5.37,1.7 18 | 17,1 1 4 12,5.03,97.40%,5.45,1.85 19 | 18,1 1 12 12,5.03,99.40%,5.4,1.66 20 | 19,3 3 3 8,5.04,98%,5.4,1.8 21 | 20,1 1 2 6,5.09,98.40%,5.4,1.61 22 | 21,1 1 2 11,5.09,98.90%,5.41,1.7 23 | 22,1 2 3 4,5.1,99%,5.46,1.85 24 | 23,11 11 12 12,5.1,98.70%,5.5,1.87 25 | 24,3 7 7 8,5.11,96.70%,5.48,1.89 26 | 25,1 1 13 13,5.16,99.30%,5.45,1.66 27 | 26,1 2 4 12,5.16,97.60%,5.56,1.99 28 | 27,1 1 3 6,5.17,97.70%,5.52,1.77 29 | 28,1 1 3 9,5.17,97%,5.52,1.85 30 | 29,7 7 12 12,5.17,98.60%,5.58,1.89 31 | 30,4 6 7 7,5.18,98.60%,5.53,1.83 32 | 31,1 1 2 13,5.19,98.70%,5.49,1.78 33 | 32,1 1 5 6,5.19,96.80%,5.62,2.04 34 | 33,1 1 11 13,5.23,99.10%,5.66,1.85 35 | 34,1 6 6 12,5.24,98.70%,5.57,1.72 36 | 35,4 5 12 12,5.24,97%,5.6,1.79 37 | 36,4 6 13 13,5.24,98.70%,5.63,1.79 38 | 37,12 12 12 12,5.25,99.30%,5.73,2.14 39 | 38,2 11 11 12,5.26,97.20%,5.68,2 40 | 39,4 4 4 6,5.26,98.10%,5.61,1.94 41 | 40,1 1 1 11,5.27,97.30%,5.56,1.66 42 | 41,1 1 11 12,5.27,99.30%,5.67,1.8 43 | 42,2 7 7 12,5.29,98.70%,5.65,1.89 44 | 43,1 5 7 12,5.3,98.30%,5.87,2.12 45 | 44,10 10 12 12,5.31,98.60%,5.74,1.91 46 | 45,1 8 8 8,5.32,97.70%,5.65,1.94 47 | 46,2 2 3 8,5.33,98.10%,5.72,1.93 48 | 47,2 9 9 12,5.33,97.30%,5.74,2 49 | 48,11 11 11 12,5.33,94.80%,5.61,1.99 50 | 49,3 8 13 13,5.34,94.80%,5.68,2.06 51 | 50,9 9 12 12,5.34,98.30%,5.84,2.11 52 | 51,1 1 5 5,5.35,99.10%,5.71,1.8 53 | 52,3 3 12 
12,5.36,98.70%,5.83,2.03 54 | 53,1 1 4 5,5.38,97.80%,5.77,2.02 55 | 54,1 6 8 12,5.38,96.80%,5.77,2 56 | 55,8 8 12 12,5.38,98.50%,5.8,1.95 57 | 56,3 8 11 11,5.39,96%,5.74,2.09 58 | 57,5 6 12 12,5.39,97.10%,5.78,1.81 59 | 58,11 12 12 12,5.4,97.60%,5.74,1.88 60 | 59,12 12 13 13,5.4,98.60%,5.79,1.91 61 | 60,1 1 12 13,5.42,98.70%,5.8,1.84 62 | 61,1 3 5 12,5.42,96.40%,5.79,2.1 63 | 62,5 5 12 12,5.42,98.70%,5.78,1.75 64 | 63,1 9 9 12,5.44,95.50%,5.72,1.78 65 | 64,2 3 3 12,5.44,98.60%,5.91,2.04 66 | 65,3 4 4 8,5.44,98.30%,5.85,1.94 67 | 66,3 8 10 10,5.44,95.60%,5.89,2.25 68 | 67,3 8 9 9,5.45,97%,5.83,1.98 69 | 68,2 5 5 12,5.46,98.70%,5.87,1.99 70 | 69,11 11 11 13,5.48,98.40%,6.03,2.26 71 | 70,2 12 13 13,5.49,97.10%,5.85,2.04 72 | 71,7 7 11 12,5.49,93.70%,5.78,2.05 73 | 72,1 1 3 7,5.5,96.80%,5.82,1.95 74 | 73,1 4 10 10,5.5,97.90%,5.91,2.1 75 | 74,4 4 12 12,5.51,98.60%,5.93,2.05 76 | 75,1 3 4 12,5.52,98.50%,5.95,1.97 77 | 76,5 5 11 12,5.52,96.50%,5.89,1.9 78 | 77,1 2 5 8,5.54,95.50%,5.94,2.26 79 | 78,2 2 4 6,5.54,99%,5.92,1.9 80 | 79,1 6 7 12,5.56,98.50%,5.97,1.97 81 | 80,1 8 9 12,5.56,96.50%,5.87,1.96 82 | 81,6 7 12 12,5.56,95.60%,5.93,2.03 83 | 82,1 3 10 10,5.57,97.40%,5.87,2 84 | 83,2 3 3 8,5.57,96.90%,5.94,1.94 85 | 84,3 5 5 8,5.57,96.90%,5.97,2.08 86 | 85,1 1 1 13,5.58,97.40%,5.9,1.77 87 | 86,2 3 12 12,5.58,97.40%,5.97,2 88 | 87,1 4 7 12,5.59,98.30%,5.99,2.02 89 | 88,8 8 11 13,5.6,97%,6.1,2.29 90 | 89,1 3 3 4,5.61,97.40%,5.86,1.79 91 | 90,1 8 8 12,5.61,96.30%,5.91,1.9 92 | 91,3 7 8 8,5.63,95.90%,5.99,2.03 93 | 92,7 8 12 12,5.63,96.10%,6.07,2.12 94 | 93,9 9 11 12,5.63,93.90%,5.98,2.15 95 | 94,1 2 5 12,5.64,95.40%,6.09,2.2 96 | 95,2 7 7 8,5.64,96.80%,5.94,1.99 97 | 96,4 4 11 13,5.64,97.90%,6.25,2.36 98 | 97,1 1 4 7,5.65,97.10%,5.95,2.07 99 | 98,1 1 10 13,5.65,97.90%,6.09,2.2 100 | 99,4 6 6 6,5.65,98.30%,6.02,2.07 101 | 100,5 5 7 7,5.65,97.80%,5.97,1.91 102 | 101,4 5 11 12,5.66,97.40%,6.06,2.09 103 | 102,1 6 9 12,5.67,95.40%,6.05,2.13 104 | 103,1 7 9 
12,5.68,95.90%,6.08,2.24 105 | 104,2 2 12 12,5.68,98.70%,6.21,2.26 106 | 105,1 1 2 10,5.69,95.60%,5.96,1.81 107 | 106,3 7 7 7,5.69,97.40%,6.03,2.02 108 | 107,1 7 7 10,5.7,97.80%,6.17,2.18 109 | 108,4 5 5 6,5.7,97.30%,6.11,2.11 110 | 109,4 8 11 11,5.7,95.10%,6.01,1.98 111 | 110,1 2 2 10,5.71,96.60%,6.05,2.11 112 | 111,1 6 6 11,5.71,98.20%,6.18,2.24 113 | 112,3 3 4 6,5.71,97.70%,6.14,2.11 114 | 113,4 4 11 12,5.71,91.90%,6.06,2.35 115 | 114,2 2 2 3,5.72,99%,5.98,1.72 116 | 115,3 3 11 12,5.72,94.70%,6.11,2.19 117 | 116,3 6 11 11,5.72,94.40%,5.99,2.05 118 | 117,3 8 8 8,5.73,97.50%,6.09,2.14 119 | 118,12 12 12 13,5.73,97.40%,6.07,1.98 120 | 119,1 2 2 12,5.74,97.70%,6.1,1.99 121 | 120,2 8 9 9,5.74,96.30%,6.13,2.1 122 | 121,6 6 12 12,5.75,98.40%,6.31,2.35 123 | 122,8 9 12 12,5.75,96.20%,6.26,2.24 124 | 123,1 5 8 8,5.76,94.80%,6.35,2.69 125 | 124,2 2 3 12,5.76,98%,6.17,2.02 126 | 125,3 4 4 6,5.77,97.40%,6.1,1.93 127 | 126,1 5 6 12,5.79,98.50%,6.17,2.07 128 | 127,4 6 9 9,5.79,96.30%,6.17,2.22 129 | 128,6 6 11 13,5.79,97.80%,6.3,2.25 130 | 129,7 7 11 13,5.79,97.40%,6.32,2.29 131 | 130,1 4 4 12,5.81,95.60%,6.31,2.36 132 | 131,1 4 6 12,5.81,96.60%,6.23,2.41 133 | 132,9 9 11 13,5.81,96.80%,6.35,2.41 134 | 133,3 8 12 13,5.82,96%,6.19,2.2 135 | 134,1 8 10 12,5.83,95.60%,6.16,2.19 136 | 135,3 4 11 13,5.84,96.40%,6.32,2.35 137 | 136,11 13 13 13,5.85,97.80%,6.41,2.42 138 | 137,4 6 6 7,5.87,96.40%,6.23,2.1 139 | 138,1 3 6 12,5.88,96.80%,6.27,2.18 140 | 139,2 4 4 12,5.88,98.40%,6.24,2.14 141 | 140,3 6 7 7,5.88,93.30%,6.24,2.33 142 | 141,4 6 8 8,5.88,96.80%,6.31,2.23 143 | 142,1 4 5 12,5.89,95.70%,6.33,2.33 144 | 143,10 11 12 12,5.89,95.90%,6.27,2.05 145 | 144,2 4 11 11,5.9,96.20%,6.33,2.34 146 | 145,4 6 10 11,5.9,94.50%,6.3,2.39 147 | 146,4 8 10 10,5.9,97.50%,6.32,2.15 148 | 147,1 4 8 12,5.91,97.50%,6.26,2.2 149 | 148,2 10 10 12,5.91,97%,6.29,2.31 150 | 149,3 6 13 13,5.91,94.50%,6.15,1.95 151 | 150,5 6 11 12,5.91,97.20%,6.36,2.23 152 | 151,6 6 10 13,5.91,93.70%,6.43,2.68 153 | 152,10 
10 11 13,5.91,97.60%,6.49,2.37 154 | 153,1 3 3 12,5.92,95.30%,6.32,2.31 155 | 154,1 7 8 9,5.92,97.10%,6.38,2.27 156 | 155,1 12 13 13,5.92,94.30%,6.25,2.17 157 | 156,3 8 10 11,5.92,94.40%,6.27,2.35 158 | 157,9 10 12 12,5.92,95.70%,6.24,2.02 159 | 158,1 11 11 12,5.93,93.40%,6.17,2.22 160 | 159,4 5 12 13,5.93,97.60%,6.42,2.16 161 | 160,1 3 6 8,5.94,95.40%,6.37,2.53 162 | 161,1 2 11 11,5.95,97.30%,6.44,2.44 163 | 162,1 6 12 12,5.95,94.50%,6.37,2.45 164 | 163,2 8 8 12,5.95,97.40%,6.47,2.44 165 | 164,3 6 6 8,5.95,97%,6.35,2.24 166 | 165,5 5 5 5,5.95,95.20%,6.36,2.16 167 | 166,3 7 10 10,5.96,97.30%,6.35,2.19 168 | 167,2 8 10 10,5.97,95.30%,6.28,2.07 169 | 168,4 6 10 10,5.97,96.30%,6.34,2.37 170 | 169,1 1 3 13,5.98,95.10%,6.36,2.13 171 | 170,1 2 6 6,5.98,97.40%,6.41,2.25 172 | 171,1 6 7 10,5.98,98.50%,6.53,2.26 173 | 172,1 2 4 4,5.99,97.10%,6.33,2.14 174 | 173,1 7 8 10,5.99,97.50%,6.41,2.3 175 | 174,4 5 5 5,5.99,97.40%,6.55,2.42 176 | 175,1 1 4 4,6.01,94.50%,6.31,2.43 177 | 176,1 2 6 12,6.01,96.80%,6.38,2.24 178 | 177,1 5 8 12,6.02,98.10%,6.51,2.23 179 | 178,1 2 10 12,6.03,97.40%,6.55,2.43 180 | 179,1 5 7 11,6.03,98.50%,6.48,2.14 181 | 180,1 10 10 12,6.03,92.30%,6.24,2.1 182 | 181,3 8 12 12,6.03,96.50%,6.47,2.46 183 | 182,2 2 11 13,6.04,98%,6.59,2.43 184 | 183,3 9 9 9,6.04,97.60%,6.47,2.16 185 | 184,1 1 6 6,6.05,97.20%,6.46,2.19 186 | 185,1 2 4 8,6.05,96.50%,6.44,2.31 187 | 186,1 6 8 9,6.05,98.20%,6.5,2.36 188 | 187,3 4 12 12,6.05,94.50%,6.52,2.48 189 | 188,4 8 13 13,6.05,94.40%,6.27,1.91 190 | 189,1 1 3 4,6.08,98%,6.4,2.03 191 | 190,1 3 9 12,6.08,96.20%,6.48,2.34 192 | 191,3 8 11 12,6.08,94.20%,6.55,2.64 193 | 192,1 1 9 13,6.09,97.20%,6.55,2.54 194 | 193,1 2 6 8,6.09,95.90%,6.41,2.33 195 | 194,1 7 8 8,6.09,96.60%,6.61,2.66 196 | 195,2 8 13 13,6.09,90.20%,6.39,2.56 197 | 196,2 11 11 11,6.09,90%,6.21,2.47 198 | 197,2 11 12 12,6.09,95.40%,6.49,2.27 199 | 198,4 7 7 8,6.09,93.80%,6.55,2.52 200 | 199,3 3 4 8,6.1,96.20%,6.43,2.14 201 | 200,2 2 11 12,6.11,92.30%,6.5,2.56 202 | 
201,5 6 12 13,6.11,97.40%,6.51,2.17 203 | 202,3 3 11 13,6.12,96.90%,6.62,2.5 204 | 203,4 7 12 12,6.12,88.50%,6.51,2.85 205 | 204,2 13 13 13,6.14,91.10%,6.32,2.32 206 | 205,5 5 12 13,6.14,95.60%,6.62,2.14 207 | 206,10 13 13 13,6.14,93.20%,6.58,2.78 208 | 207,1 7 7 12,6.15,88.90%,6.28,2.32 209 | 208,2 3 11 12,6.15,94.80%,6.74,2.75 210 | 209,5 5 11 13,6.15,96.10%,6.75,2.52 211 | 210,1 2 3 10,6.16,98.40%,6.47,2 212 | 211,1 5 5 12,6.16,90.50%,6.36,2.32 213 | 212,1 5 7 13,6.16,97.90%,6.53,2.09 214 | 213,3 3 3 6,6.17,97.70%,6.71,2.56 215 | 214,3 3 3 12,6.17,96.90%,6.65,2.46 216 | 215,10 10 11 12,6.17,93.30%,6.53,2.29 217 | 216,1 4 7 8,6.19,96.60%,6.68,2.71 218 | 217,1 10 12 12,6.19,95.80%,6.81,2.94 219 | 218,8 8 11 12,6.19,90.60%,6.54,2.66 220 | 219,1 2 3 8,6.2,97%,6.61,2.38 221 | 220,4 4 10 13,6.2,93.50%,6.71,2.79 222 | 221,1 4 6 8,6.21,94.40%,6.57,2.52 223 | 222,2 2 4 5,6.21,98.30%,6.6,2.11 224 | 223,11 12 13 13,6.21,95.10%,6.69,2.43 225 | 224,1 2 2 8,6.22,93.30%,6.65,2.63 226 | 225,5 5 5 6,6.22,96.10%,6.68,2.34 227 | 226,1 1 5 7,6.23,96.40%,6.56,2.09 228 | 227,1 2 13 13,6.23,97.20%,6.93,2.69 229 | 228,7 7 12 13,6.23,93.90%,6.63,2.35 230 | 229,1 4 8 8,6.24,96.50%,6.71,2.48 231 | 230,3 11 11 12,6.24,91.40%,6.44,2.17 232 | 231,1 2 3 12,6.25,96.20%,6.8,2.53 233 | 232,1 3 8 9,6.27,92.20%,6.82,3.18 234 | 233,1 3 11 11,6.27,97.20%,6.68,2.43 235 | 234,1 4 4 5,6.27,96.90%,6.82,2.67 236 | 235,1 5 10 10,6.28,97.30%,6.65,2.3 237 | 236,1 8 10 13,6.28,94.80%,6.6,2.43 238 | 237,2 2 2 8,6.28,96.40%,6.84,2.73 239 | 238,1 4 12 12,6.29,91.20%,6.53,2.49 240 | 239,2 7 8 12,6.29,94.90%,6.73,2.56 241 | 240,4 4 5 6,6.29,95.70%,6.6,2.26 242 | 241,7 9 11 11,6.29,95.80%,6.77,2.64 243 | 242,2 12 12 13,6.3,95.50%,6.69,2.37 244 | 243,6 7 11 12,6.3,96.80%,6.84,2.59 245 | 244,4 6 12 13,6.31,93.80%,6.82,2.88 246 | 245,1 2 10 11,6.32,97.60%,6.66,2.39 247 | 246,1 5 8 9,6.32,93.90%,6.76,2.62 248 | 247,1 5 7 8,6.33,94.70%,6.71,2.42 249 | 248,4 8 9 9,6.33,90.40%,6.49,2.28 250 | 249,9 9 12 
13,6.33,94%,6.69,2.25 251 | 250,8 8 8 10,6.35,90.40%,6.7,2.91 252 | 251,2 12 12 12,6.36,98.40%,6.75,2.46 253 | 252,1 8 8 9,6.37,96.30%,6.9,2.71 254 | 253,2 8 11 11,6.37,88.50%,6.55,2.78 255 | 254,4 5 13 13,6.37,87.70%,6.61,2.77 256 | 255,1 2 12 12,6.38,95.80%,6.71,2.2 257 | 256,2 8 10 11,6.38,94.70%,6.87,2.62 258 | 257,2 11 13 13,6.38,94.10%,6.88,2.52 259 | 258,1 6 7 11,6.39,97.40%,7.02,2.89 260 | 259,3 6 7 8,6.39,97.80%,6.79,2.31 261 | 260,1 2 8 8,6.4,96.10%,6.97,2.74 262 | 261,4 5 9 9,6.4,88.40%,6.64,2.72 263 | 262,1 3 3 7,6.41,95.90%,7.25,3.33 264 | 263,2 5 6 12,6.41,95.10%,6.8,2.52 265 | 264,3 6 10 11,6.41,95.80%,7.05,2.86 266 | 265,1 3 6 6,6.42,97.20%,6.89,2.58 267 | 266,1 6 6 13,6.42,97.20%,6.8,2.44 268 | 267,4 4 4 4,6.43,97.80%,7.03,2.93 269 | 268,10 10 10 13,6.43,92.90%,6.86,2.99 270 | 269,1 6 9 9,6.44,97.50%,6.87,2.52 271 | 270,2 6 6 12,6.44,97.70%,6.97,2.55 272 | 271,3 7 9 9,6.44,91%,6.87,2.78 273 | 272,1 7 8 12,6.45,95.60%,6.97,2.67 274 | 273,3 12 13 13,6.46,91%,6.62,2.29 275 | 274,6 6 11 12,6.46,88.90%,6.77,2.97 276 | 275,1 3 5 8,6.47,95.10%,7.13,2.89 277 | 276,6 10 10 10,6.47,95.70%,6.74,2.31 278 | 277,2 3 11 13,6.48,91.90%,6.91,2.78 279 | 278,1 1 3 11,6.49,94.30%,6.73,1.97 280 | 279,1 4 4 7,6.49,96.10%,7.22,3.18 281 | 280,1 2 4 6,6.5,95.90%,6.88,2.53 282 | 281,1 9 12 12,6.5,92.60%,6.86,2.82 283 | 282,3 4 5 8,6.5,96.10%,6.96,2.63 284 | 283,4 5 6 6,6.5,95.30%,6.81,2.33 285 | 284,1 3 3 5,6.51,95.10%,6.76,2.46 286 | 285,1 4 6 13,6.51,97.50%,6.96,2.39 287 | 286,2 2 2 11,6.51,92.10%,6.79,2.66 288 | 287,2 4 5 12,6.51,94.50%,6.96,2.87 289 | 288,10 11 11 13,6.51,94.70%,6.91,2.63 290 | 289,12 13 13 13,6.52,94.30%,6.85,2.44 291 | 290,3 7 11 11,6.53,88%,6.77,3.02 292 | 291,3 7 13 13,6.53,87.30%,6.67,2.74 293 | 292,4 6 7 8,6.53,95.80%,6.96,2.48 294 | 293,1 4 7 13,6.54,96.50%,7.22,2.95 295 | 294,1 5 8 10,6.54,96.30%,7.09,2.54 296 | 295,5 6 6 7,6.54,97.90%,7.06,2.44 297 | 296,1 5 10 12,6.55,92%,6.96,3.07 298 | 297,1 6 8 10,6.55,97.10%,6.98,2.39 299 | 298,3 5 11 
11,6.55,94.50%,7.06,2.97 300 | 299,3 9 9 12,6.55,94.70%,6.92,2.56 301 | 300,2 10 11 12,6.56,95.90%,6.86,2.4 302 | 301,2 8 8 13,6.57,92.80%,6.94,2.55 303 | 302,7 8 11 12,6.57,96.20%,7.05,2.64 304 | 303,1 2 5 6,6.58,94.60%,7.11,2.75 305 | 304,1 7 7 11,6.58,96.80%,7.15,2.6 306 | 305,4 6 12 12,6.58,96.80%,7.03,2.42 307 | 306,1 3 13 13,6.59,97%,7.11,2.6 308 | 307,3 8 9 10,6.59,96.20%,6.95,2.46 309 | 308,3 10 10 12,6.59,94.20%,6.94,2.35 310 | 309,4 4 12 13,6.59,92%,6.95,2.49 311 | 310,1 10 11 12,6.6,94.60%,6.98,2.5 312 | 311,2 3 4 6,6.6,97.80%,7.01,2.29 313 | 312,4 6 11 12,6.6,92.80%,6.98,2.9 314 | 313,1 2 10 13,6.61,97%,7.12,2.72 315 | 314,1 3 3 9,6.61,96.70%,7.15,2.85 316 | 315,1 9 11 12,6.61,93.70%,7.05,2.97 317 | 316,2 2 8 12,6.61,98%,7.06,2.29 318 | 317,2 5 6 11,6.63,96.30%,7.12,2.59 319 | 318,3 4 10 13,6.63,96.50%,7.15,2.82 320 | 319,3 7 7 12,6.63,89.90%,6.78,2.56 321 | 320,1 2 3 6,6.65,97%,7.08,2.43 322 | 321,3 6 6 7,6.65,93.60%,7.09,2.67 323 | 322,1 12 12 12,6.66,93.40%,6.88,2.57 324 | 323,2 8 9 12,6.66,95%,7.04,2.72 325 | 324,3 3 3 3,6.67,96%,7.4,3.34 326 | 325,1 4 8 13,6.68,97.40%,7.07,2.44 327 | 326,4 4 5 5,6.68,93.20%,7.11,2.52 328 | 327,6 7 12 13,6.68,95.40%,7.17,2.72 329 | 328,1 2 5 5,6.69,95.10%,7.19,2.91 330 | 329,2 2 5 5,6.69,93.20%,7.07,2.49 331 | 330,5 6 7 7,6.69,87.80%,6.86,2.72 332 | 331,5 7 10 12,6.69,96.70%,7.13,2.44 333 | 332,1 1 6 12,6.7,91.10%,7.36,3.37 334 | 333,1 3 3 3,6.71,95.50%,7.09,2.55 335 | 334,4 5 11 11,6.72,87.20%,7.19,3.59 336 | 335,1 1 2 7,6.73,95.40%,6.96,2.07 337 | 336,1 2 5 7,6.73,95%,7.07,2.65 338 | 337,1 5 6 6,6.73,93.60%,7.41,3.4 339 | 338,9 9 10 13,6.73,90.90%,6.9,2.68 340 | 339,1 6 6 10,6.74,93.60%,7.09,3.03 341 | 340,3 3 12 13,6.74,93.30%,7.19,2.64 342 | 341,6 8 10 12,6.74,97.20%,7.28,2.79 343 | 342,2 2 2 13,6.75,93.40%,7.1,2.72 344 | 343,2 3 12 13,6.76,95.90%,7.24,2.63 345 | 344,3 3 5 6,6.76,96.20%,7.36,2.7 346 | 345,2 11 11 13,6.77,94.90%,7.28,2.84 347 | 346,4 5 7 7,6.77,91.20%,7.37,3.44 348 | 347,7 7 10 
13,6.77,89.60%,7.06,3.11 349 | 348,1 2 11 13,6.78,93.50%,7.42,3.1 350 | 349,11 12 12 13,6.78,97%,7.39,2.67 351 | 350,5 7 11 11,6.79,96.10%,7.22,2.67 352 | 351,2 2 2 4,6.8,95.90%,7.13,2.48 353 | 352,3 4 5 7,6.81,96.60%,7.28,2.65 354 | 353,4 7 13 13,6.81,87.70%,7.06,2.94 355 | 354,4 8 8 8,6.81,94.60%,7.16,2.75 356 | 355,5 6 13 13,6.81,86.90%,6.91,2.69 357 | 356,8 9 11 12,6.81,95.20%,7.45,3.01 358 | 357,1 1 2 9,6.82,94.60%,7.11,2.33 359 | 358,2 4 4 8,6.82,96.40%,7.23,2.67 360 | 359,4 4 4 8,6.82,96.30%,7.38,2.8 361 | 360,2 4 13 13,6.83,95.60%,7.34,2.76 362 | 361,4 7 7 7,6.83,88.80%,7.02,3.01 363 | 362,7 9 13 13,6.83,96.30%,7.42,2.78 364 | 363,9 11 11 11,6.83,95.30%,7.21,2.76 365 | 364,1 6 8 11,6.84,97.70%,7.3,2.46 366 | 365,1 1 3 5,6.85,95.20%,7.17,2.32 367 | 366,2 6 6 8,6.85,95.50%,7.42,2.98 368 | 367,1 3 4 4,6.87,93.10%,7.15,3.02 369 | 368,1 3 10 11,6.87,96.40%,7.53,3.1 370 | 369,1 9 10 12,6.87,93.70%,7.18,2.72 371 | 370,7 8 12 13,6.87,96.40%,7.34,2.62 372 | 371,8 9 10 13,6.87,95.80%,7.37,2.65 373 | 372,1 1 4 9,6.88,93.40%,7.08,2.33 374 | 373,2 5 5 8,6.88,88%,7.04,3.06 375 | 374,6 9 12 12,6.88,89.10%,7.44,3.67 376 | 375,2 5 5 13,6.89,92%,7.28,2.71 377 | 376,2 8 12 12,6.89,95%,7.27,2.62 378 | 377,4 7 10 11,6.89,97%,7.42,2.74 379 | 378,1 11 12 12,6.9,94.20%,7.43,3.1 380 | 379,3 3 3 7,6.9,92.20%,7.32,3.12 381 | 380,3 3 3 10,6.9,94.30%,7.47,3.25 382 | 381,6 8 11 11,6.9,94.20%,7.43,3.06 383 | 382,9 10 11 12,6.9,96.10%,7.41,2.76 384 | 383,1 3 6 7,6.91,96%,7.32,2.81 385 | 384,8 9 12 13,6.91,96%,7.4,2.7 386 | 385,1 2 2 11,6.92,97.10%,7.46,2.95 387 | 386,1 8 8 11,6.92,93.30%,7.39,3.42 388 | 387,4 5 10 10,6.92,91%,7.24,3.04 389 | 388,1 2 11 12,6.93,96.90%,7.63,3.2 390 | 389,2 4 5 8,6.93,95%,7.31,2.74 391 | 390,2 5 12 12,6.93,83.20%,7.33,4.07 392 | 391,3 6 6 12,6.94,95.40%,7.6,2.84 393 | 392,4 6 9 10,6.94,93.50%,7.45,3.24 394 | 393,4 7 8 8,6.94,94%,7.33,2.53 395 | 394,5 5 10 10,6.94,86.70%,7.24,3.12 396 | 395,1 2 7 10,6.95,90.90%,7.71,4 397 | 396,2 6 10 12,6.95,95.10%,7.43,2.81 
398 | 397,2 6 7 13,6.96,96.60%,7.37,2.66 399 | 398,2 9 9 11,6.96,87.30%,6.96,2.68 400 | 399,8 8 10 13,6.96,90.40%,7.35,3.05 401 | 400,1 1 8 8,6.97,89.50%,7.42,3.71 402 | 401,2 2 10 13,6.97,90.10%,7.23,3.03 403 | 402,2 2 12 13,6.97,92.10%,7.25,2.6 404 | 403,5 6 11 11,6.97,87.20%,7.18,3.13 405 | 404,11 11 12 13,6.97,94.90%,7.39,2.48 406 | 405,1 2 6 7,6.98,95.10%,7.35,2.55 407 | 406,3 8 8 9,6.98,95.50%,7.29,2.5 408 | 407,6 7 10 13,6.98,96.40%,7.5,2.75 409 | 408,9 12 12 12,6.98,82.20%,7.03,3.68 410 | 409,1 6 8 8,6.99,93%,7.31,2.92 411 | 410,4 4 4 12,6.99,96.60%,7.46,2.94 412 | 411,5 5 10 13,6.99,89.70%,7.34,3.26 413 | 412,6 8 8 8,6.99,88.10%,7.38,3.47 414 | 413,5 6 9 9,7,90.60%,7.27,2.74 415 | 414,1 3 7 13,7.01,96.80%,7.4,2.64 416 | 415,1 7 8 11,7.01,92.60%,7.4,3.08 417 | 416,3 6 9 9,7.01,94%,7.6,3.28 418 | 417,5 5 6 6,7.01,93.10%,7.48,2.79 419 | 418,5 5 13 13,7.01,85.50%,7.17,3.03 420 | 419,1 3 5 13,7.02,94.10%,7.43,2.72 421 | 420,1 8 8 10,7.02,92.40%,7.36,2.99 422 | 421,2 9 9 13,7.02,88.60%,7.08,2.5 423 | 422,1 7 7 9,7.03,96.30%,7.4,2.83 424 | 423,3 7 12 13,7.03,93.80%,7.5,3 425 | 424,8 8 9 11,7.03,89.70%,7.57,3.69 426 | 425,4 8 11 12,7.04,92.90%,7.64,3.18 427 | 426,1 7 10 12,7.05,92.60%,7.46,2.96 428 | 427,2 3 5 5,7.05,96.20%,7.57,2.94 429 | 428,2 6 7 12,7.05,92%,7.2,2.91 430 | 429,8 8 12 13,7.05,92.10%,7.35,2.65 431 | 430,8 10 11 11,7.05,94.80%,7.45,2.76 432 | 431,3 5 5 6,7.06,89.10%,7.29,3.09 433 | 432,1 3 10 12,7.07,96.40%,7.55,2.75 434 | 433,2 3 3 13,7.07,94%,7.6,2.92 435 | 434,5 5 11 11,7.08,85.80%,7.3,3.24 436 | 435,1 2 2 13,7.09,96.40%,7.64,3.12 437 | 436,2 5 9 12,7.09,95.40%,7.53,2.84 438 | 437,3 3 4 4,7.09,94.40%,7.45,2.69 439 | 438,1 2 3 5,7.1,95.40%,7.39,2.38 440 | 439,1 3 9 11,7.1,95.10%,7.65,3.15 441 | 440,2 2 7 13,7.1,96.10%,7.59,2.92 442 | 441,1 5 6 9,7.11,91.90%,7.56,3.43 443 | 442,2 6 10 10,7.11,95.70%,7.58,2.96 444 | 443,10 10 12 13,7.11,92.20%,7.43,2.6 445 | 444,1 10 12 13,7.12,94.30%,7.45,2.83 446 | 445,2 8 8 8,7.12,93.90%,7.68,3.28 447 | 446,3 4 
11 12,7.12,95.10%,7.75,3.35 448 | 447,1 2 4 10,7.13,96.20%,7.62,3.17 449 | 448,4 7 12 13,7.13,92.40%,7.74,3.4 450 | 449,7 8 8 9,7.13,86.60%,7.51,3.69 451 | 450,1 3 4 11,7.14,96%,7.65,2.91 452 | 451,4 6 6 8,7.14,97.30%,7.61,2.61 453 | 452,6 6 6 11,7.14,88.70%,7.39,3.41 454 | 453,3 8 8 12,7.15,91.50%,7.62,3.1 455 | 454,2 7 7 13,7.16,86%,7.27,3.01 456 | 455,2 9 10 12,7.16,93.80%,7.49,2.83 457 | 456,3 7 11 12,7.16,90.60%,7.7,3.49 458 | 457,4 5 11 13,7.17,90.30%,7.59,3.29 459 | 458,6 6 12 13,7.17,90.30%,7.56,3.04 460 | 459,1 5 9 9,7.18,94.40%,7.69,3.15 461 | 460,3 4 5 12,7.18,97%,7.61,2.67 462 | 461,5 6 11 13,7.18,91.70%,7.75,3.37 463 | 462,1 12 12 13,7.19,94.20%,7.7,3.12 464 | 463,4 8 10 12,7.19,95.70%,7.71,2.86 465 | 464,5 9 10 10,7.19,92.80%,7.58,3.05 466 | 465,3 6 11 12,7.2,93.40%,7.7,3.18 467 | 466,2 6 7 9,7.21,96%,7.75,3 468 | 467,3 5 13 13,7.22,94.30%,7.71,3.07 469 | 468,3 7 7 9,7.22,86.60%,7.26,2.95 470 | 469,1 3 4 5,7.23,97.50%,7.73,2.53 471 | 470,3 9 10 11,7.23,93%,7.61,2.93 472 | 471,1 3 7 8,7.24,89.30%,7.43,3.48 473 | 472,1 7 9 13,7.25,93.70%,7.5,2.52 474 | 473,1 8 11 12,7.25,91.50%,7.77,3.41 475 | 474,4 7 11 11,7.25,86.30%,7.35,3.32 476 | 475,1 6 6 9,7.26,90.30%,7.7,3.26 477 | 476,5 5 7 8,7.26,92.40%,7.71,2.94 478 | 477,7 9 10 12,7.26,94%,7.78,3.16 479 | 478,1 6 8 13,7.27,94.30%,7.7,2.83 480 | 479,3 4 8 10,7.27,93.40%,7.66,3.19 481 | 480,5 10 11 11,7.27,90.30%,7.48,2.76 482 | 481,2 4 10 12,7.28,96.50%,7.94,3.12 483 | 482,4 5 5 8,7.28,88%,7.43,2.93 484 | 483,10 12 12 13,7.28,90.10%,7.74,3.54 485 | 484,1 1 6 8,7.29,94.90%,7.62,2.55 486 | 485,2 2 3 9,7.29,94.50%,7.73,2.88 487 | 486,3 3 4 5,7.29,94%,7.75,3.18 488 | 487,5 8 8 10,7.29,91.40%,7.77,3.37 489 | 488,1 8 9 11,7.3,92.60%,7.63,3.1 490 | 489,2 3 3 11,7.3,89.30%,7.55,3.23 491 | 490,2 7 8 13,7.3,94%,7.84,3.11 492 | 491,3 5 5 12,7.3,89%,7.62,3.32 493 | 492,5 5 9 9,7.3,85.50%,7.59,3.37 494 | 493,7 10 10 11,7.3,93.50%,7.63,2.94 495 | 494,10 11 11 12,7.31,96.70%,7.8,2.8 496 | 495,2 5 6 8,7.32,95.60%,7.91,3.03 
497 | 496,3 3 9 9,7.32,95.90%,7.7,2.77 498 | 497,3 6 6 6,7.32,94.40%,7.75,3.15 499 | 498,5 8 12 12,7.32,75.90%,6.76,3.49 500 | 499,2 3 7 12,7.34,94.70%,7.81,3.04 501 | 500,5 6 8 8,7.34,92.20%,7.69,2.92 502 | 501,1 11 12 13,7.35,93.80%,7.77,3.19 503 | 502,3 5 6 8,7.35,93.50%,7.83,3.19 504 | 503,6 8 13 13,7.35,93.30%,7.82,3.08 505 | 504,4 4 5 8,7.37,95.90%,7.72,2.57 506 | 505,7 10 12 12,7.37,84.60%,7.57,3.7 507 | 506,1 8 9 13,7.38,91.40%,7.84,3.51 508 | 507,3 5 10 12,7.38,94.80%,7.95,3.1 509 | 508,3 9 11 11,7.38,85.80%,7.49,3.24 510 | 509,6 6 6 9,7.38,89.60%,7.97,3.92 511 | 510,5 5 10 11,7.39,87.70%,7.75,3.46 512 | 511,2 2 4 7,7.4,95.30%,8.06,3.15 513 | 512,3 6 8 8,7.4,92.70%,7.74,2.98 514 | 513,3 9 13 13,7.4,86.30%,7.56,3.26 515 | 514,4 4 4 7,7.4,91.10%,7.76,3.31 516 | 515,1 4 4 4,7.41,95.70%,7.86,2.86 517 | 516,9 10 10 13,7.41,95.20%,7.86,3.02 518 | 517,2 10 10 11,7.42,85.20%,7.27,2.92 519 | 518,3 3 10 13,7.42,89.40%,7.7,3.46 520 | 519,1 2 3 9,7.43,96.40%,7.98,3.1 521 | 520,2 4 8 10,7.43,97.80%,7.91,2.6 522 | 521,1 7 9 9,7.44,94.70%,7.92,3.19 523 | 522,2 6 6 6,7.45,95.90%,8.01,3.06 524 | 523,2 6 9 11,7.45,95.50%,7.93,3.13 525 | 524,3 4 4 7,7.45,90.30%,7.87,3.37 526 | 525,2 6 6 13,7.46,84.90%,7.54,3.15 527 | 526,8 8 8 12,7.46,88.20%,7.86,3.98 528 | 527,3 5 5 7,7.47,89.90%,7.71,3.14 529 | 528,4 7 9 9,7.47,86.40%,7.59,3.16 530 | 529,5 7 13 13,7.47,94.20%,7.87,3.21 531 | 530,1 4 5 5,7.48,95.80%,7.91,2.68 532 | 531,1 5 5 6,7.48,94.90%,7.99,3.07 533 | 532,4 8 12 13,7.48,91.70%,8.18,3.7 534 | 533,5 10 13 13,7.48,92.50%,7.93,2.78 535 | 534,2 10 10 13,7.49,90%,7.63,2.85 536 | 535,3 3 5 5,7.49,89.40%,7.73,3.16 537 | 536,3 7 9 10,7.49,91.90%,8.06,3.46 538 | 537,1 2 3 7,7.5,96.10%,8.17,3.2 539 | 538,4 4 7 9,7.5,93.20%,7.9,3.39 540 | 539,2 2 6 6,7.51,95.80%,8.02,3.05 541 | 540,2 6 6 11,7.51,83.20%,7.35,3.13 542 | 541,6 7 11 13,7.51,90.10%,7.89,3.44 543 | 542,1 2 2 4,7.52,95.50%,7.83,2.44 544 | 543,3 5 6 12,7.52,93.40%,8.04,3.23 545 | 544,2 2 7 8,7.53,93.70%,7.88,2.98 546 | 
545,2 4 4 13,7.53,85.40%,7.66,3.38 547 | 546,4 4 8 8,7.53,95.60%,8.13,3.2 548 | 547,3 6 10 12,7.54,92.90%,8.05,3.43 549 | 548,1 2 4 13,7.55,93.60%,7.88,2.93 550 | 549,2 4 5 13,7.55,94.70%,8.14,3.25 551 | 550,2 7 8 8,7.57,92.60%,7.97,3.08 552 | 551,1 2 12 13,7.58,95.90%,8.31,3.37 553 | 552,2 3 4 12,7.58,94.20%,8.09,3.28 554 | 553,3 4 9 12,7.58,94.50%,8.02,3.08 555 | 554,9 11 13 13,7.6,92%,8.11,3.19 556 | 555,2 2 4 10,7.61,94.10%,8.01,2.92 557 | 556,2 8 12 13,7.61,89.40%,7.86,3.55 558 | 557,3 4 4 12,7.61,90.40%,7.91,3.28 559 | 558,2 4 4 11,7.62,84.50%,7.62,3.38 560 | 559,2 3 5 6,7.63,95%,8.05,2.95 561 | 560,2 6 11 13,7.63,89.70%,7.78,3.26 562 | 561,3 9 12 12,7.63,90.70%,8.2,3.48 563 | 562,5 6 10 10,7.63,86.90%,7.99,3.73 564 | 563,1 3 8 12,7.64,96.60%,8.14,2.9 565 | 564,2 6 8 11,7.64,93.30%,8.07,3.38 566 | 565,1 4 10 11,7.65,95%,8.1,3.15 567 | 566,1 9 11 11,7.65,89.70%,7.79,3.02 568 | 567,1 4 9 12,7.67,95.90%,8.33,3.37 569 | 568,4 8 10 11,7.67,90.40%,7.98,3.45 570 | 569,2 2 2 10,7.68,94.10%,8.13,3.33 571 | 570,3 6 12 12,7.69,93%,8.24,3.48 572 | 571,1 2 9 12,7.7,92.40%,8.32,3.81 573 | 572,3 3 4 12,7.7,95%,8.34,3.2 574 | 573,4 5 7 8,7.7,95.60%,8.19,3.15 575 | 574,4 7 9 10,7.7,90.50%,8.19,3.66 576 | 575,4 4 4 5,7.71,84.90%,7.7,3.66 577 | 576,5 6 7 8,7.71,93.50%,8.21,3.11 578 | 577,1 6 9 13,7.72,90.60%,8.12,3.96 579 | 578,2 6 8 10,7.72,93.20%,8.22,3.44 580 | 579,3 3 4 7,7.72,90.20%,8.18,3.48 581 | 580,6 6 10 12,7.72,93.80%,8.14,3.07 582 | 581,8 9 11 13,7.72,90.70%,8.16,3.68 583 | 582,3 4 4 9,7.73,90.90%,8.24,3.29 584 | 583,3 5 8 11,7.75,89.60%,8.05,3.47 585 | 584,6 6 7 10,7.75,84.20%,7.85,4.09 586 | 585,6 6 9 12,7.75,92.60%,8.12,3.31 587 | 586,1 3 5 11,7.76,93.20%,8.11,2.86 588 | 587,1 5 6 10,7.76,91.50%,8.1,3.24 589 | 588,3 6 6 9,7.77,96.20%,8.24,2.96 590 | 589,1 2 2 5,7.79,94.30%,8.03,2.83 591 | 590,1 2 4 11,7.8,91.90%,8.06,2.8 592 | 591,2 5 5 11,7.81,82.90%,7.62,3.34 593 | 592,3 3 8 9,7.81,88.70%,8.19,3.79 594 | 593,3 6 10 10,7.81,85.30%,7.81,3.64 595 | 594,5 6 6 
6,7.81,87.50%,7.91,3.44 596 | 595,8 11 12 12,7.82,77.50%,7.26,3.55 597 | 596,1 4 8 11,7.83,95.40%,8.26,2.85 598 | 597,5 8 8 9,7.83,87%,8.05,3.92 599 | 598,2 2 9 11,7.84,91.50%,8.16,3.28 600 | 599,2 7 7 11,7.84,86.60%,7.89,3.67 601 | 600,1 4 4 9,7.85,94.10%,8.25,3.34 602 | 601,1 4 4 10,7.85,92%,8.06,3.4 603 | 602,4 5 10 11,7.85,88.10%,8.28,4.17 604 | 603,1 2 6 9,7.86,94.30%,8.4,3.58 605 | 604,3 4 12 13,7.86,93.20%,8.45,3.71 606 | 605,3 9 10 10,7.86,84.70%,7.73,3.43 607 | 606,3 12 12 12,7.86,90.90%,8.17,3.42 608 | 607,3 7 12 12,7.87,91.20%,8.23,3.41 609 | 608,4 5 8 9,7.87,88.90%,8.25,3.97 610 | 609,10 12 13 13,7.87,94.20%,8.32,2.93 611 | 610,2 2 6 12,7.88,94.10%,8.48,3.48 612 | 611,3 4 5 5,7.88,93.90%,8.56,3.65 613 | 612,6 7 8 10,7.88,91.60%,8.4,3.57 614 | 613,2 2 4 8,7.89,95.80%,8.41,3.08 615 | 614,3 6 7 9,7.89,92.80%,8.34,3.48 616 | 615,2 4 6 10,7.9,93.80%,8.32,3.37 617 | 616,3 4 7 9,7.9,93.10%,8.33,3.73 618 | 617,4 8 9 10,7.9,88.80%,8.1,3.33 619 | 618,5 6 10 11,7.91,90.30%,8.22,3.54 620 | 619,8 10 13 13,7.91,92.50%,8.29,3.33 621 | 620,2 5 6 13,7.92,91.40%,8.41,3.61 622 | 621,3 12 12 13,7.92,92.40%,8.3,3.26 623 | 622,4 6 8 9,7.92,93.30%,8.35,3.38 624 | 623,2 3 5 11,7.93,94.80%,8.43,3.29 625 | 624,7 8 11 13,7.93,90.10%,8.48,3.85 626 | 625,2 3 3 6,7.95,96.20%,8.45,2.96 627 | 626,5 8 8 13,7.95,92.40%,8.33,3.65 628 | 627,2 2 3 3,7.96,93.30%,8.29,3.26 629 | 628,1 4 6 11,7.97,92.50%,8.28,3.23 630 | 629,3 9 12 13,7.97,89.70%,8.44,3.93 631 | 630,1 2 5 9,7.98,95%,8.48,3.09 632 | 631,2 7 8 11,7.98,94.80%,8.57,3.59 633 | 632,1 3 6 9,7.99,93.10%,8.49,3.2 634 | 633,3 4 10 12,8,92.40%,8.38,3.5 635 | 634,3 7 10 11,8,90.50%,8.27,3.37 636 | 635,4 10 12 12,8,92%,8.42,3.34 637 | 636,4 4 6 12,8.01,93.90%,8.69,3.77 638 | 637,10 11 12 13,8.01,95.80%,8.62,3.48 639 | 638,2 5 6 7,8.02,92.30%,8.49,3.62 640 | 639,3 5 8 8,8.02,92.10%,8.48,3.85 641 | 640,2 2 3 7,8.05,88.10%,8.43,3.8 642 | 641,2 8 8 9,8.05,90.20%,8.35,3.34 643 | 642,8 10 10 12,8.05,94.30%,8.52,3.4 644 | 643,4 8 8 
9,8.06,92.30%,8.3,3.05 645 | 644,1 5 9 10,8.07,93.50%,8.59,3.77 646 | 645,2 3 4 8,8.07,92.50%,8.78,4.01 647 | 646,4 7 9 12,8.07,94.90%,8.58,3.25 648 | 647,1 1 7 10,8.08,85.10%,8.12,4.24 649 | 648,2 4 8 8,8.08,93.90%,8.75,4.12 650 | 649,2 7 11 12,8.08,91.50%,8.43,3.4 651 | 650,2 4 8 12,8.09,93.80%,8.86,4.19 652 | 651,1 3 12 12,8.1,85.50%,8.39,4.52 653 | 652,2 4 9 13,8.1,93.60%,8.44,3.5 654 | 653,4 5 9 10,8.1,90.60%,8.47,3.64 655 | 654,1 3 4 13,8.11,94.10%,8.7,3.66 656 | 655,3 7 8 12,8.11,95%,8.61,3.22 657 | 656,1 4 5 11,8.12,90.60%,8.37,3.88 658 | 657,1 4 9 11,8.12,94.40%,8.78,3.71 659 | 658,2 7 9 10,8.12,93.90%,8.52,3.2 660 | 659,2 8 9 13,8.12,90.10%,8.37,3.57 661 | 660,4 4 10 12,8.13,90.60%,8.54,3.93 662 | 661,3 4 9 11,8.14,89.80%,8.4,4.03 663 | 662,4 6 8 12,8.14,94.20%,8.76,3.48 664 | 663,1 5 5 13,8.15,93.30%,8.37,3.21 665 | 664,3 6 9 10,8.15,92.40%,8.48,3.56 666 | 665,2 8 9 10,8.16,88.80%,8.35,3.78 667 | 666,3 5 6 6,8.16,92.20%,8.67,3.6 668 | 667,5 8 11 13,8.16,88.80%,8.32,3.37 669 | 668,6 8 8 10,8.16,91%,8.62,4.02 670 | 669,2 4 8 11,8.17,92.70%,8.49,3.62 671 | 670,5 5 8 8,8.18,84.80%,8.36,3.85 672 | 671,5 5 8 9,8.19,89.60%,8.64,3.88 673 | 672,9 10 12 13,8.19,93.50%,8.76,3.56 674 | 673,1 3 6 13,8.2,91.80%,8.44,3.17 675 | 674,2 2 6 10,8.21,95.10%,8.64,3.02 676 | 675,1 8 11 13,8.22,88.20%,8.32,3.55 677 | 676,2 2 4 11,8.24,91.50%,8.59,3.66 678 | 677,5 8 9 11,8.24,89.90%,8.49,3.67 679 | 678,3 4 5 10,8.26,90.30%,8.62,3.99 680 | 679,1 2 8 9,8.28,93.40%,8.87,3.42 681 | 680,1 6 11 12,8.28,88.80%,8.26,2.97 682 | 681,2 5 10 11,8.28,94.10%,8.85,3.5 683 | 682,5 6 9 10,8.28,90.80%,8.74,3.78 684 | 683,1 3 4 9,8.3,94.50%,8.8,3.81 685 | 684,3 4 7 10,8.3,93.30%,8.54,3.42 686 | 685,6 6 7 9,8.3,90.70%,8.63,3.71 687 | 686,6 8 8 11,8.3,87.70%,8.55,3.82 688 | 687,2 4 4 4,8.31,94.60%,8.76,3.19 689 | 688,2 9 10 11,8.33,88.70%,8.3,3.31 690 | 689,4 4 8 11,8.33,87%,8.32,3.69 691 | 690,4 4 4 9,8.34,88.30%,8.43,3.94 692 | 691,6 8 10 13,8.36,86.60%,8.43,3.64 693 | 692,2 8 8 
11,8.37,84%,8.09,3.45 694 | 693,4 7 8 9,8.37,92.10%,8.77,3.69 695 | 694,1 6 12 13,8.38,89.40%,8.45,3.24 696 | 695,4 4 7 12,8.38,89.40%,8.75,4.13 697 | 696,6 6 8 9,8.38,90.30%,8.64,3.44 698 | 697,1 2 2 7,8.39,91.70%,8.72,3.35 699 | 698,1 2 6 13,8.4,90.60%,8.88,3.9 700 | 699,2 3 3 7,8.4,92.10%,8.85,3.69 701 | 700,2 4 7 10,8.4,91.20%,8.67,3.82 702 | 701,2 9 12 13,8.4,90.60%,8.82,3.86 703 | 702,6 6 9 11,8.4,88.80%,8.82,3.98 704 | 703,3 3 7 11,8.41,92.20%,8.7,3.56 705 | 704,3 3 8 12,8.41,91.70%,8.97,3.88 706 | 705,1 4 9 10,8.42,93.20%,8.85,3.63 707 | 706,3 6 8 9,8.42,93.80%,8.82,3.16 708 | 707,4 4 6 11,8.42,86.30%,8.51,4.42 709 | 708,1 6 10 13,8.44,87.60%,8.63,4.1 710 | 709,1 3 4 10,8.45,92.90%,8.73,3.35 711 | 710,2 9 10 13,8.45,90.20%,9.07,4.07 712 | 711,5 6 10 13,8.45,91.50%,9.12,4.24 713 | 712,3 11 12 12,8.46,90.80%,8.71,3.28 714 | 713,6 6 8 11,8.47,82.90%,8.3,4.31 715 | 714,7 8 8 11,8.48,80.60%,8.17,4.53 716 | 715,1 2 7 8,8.49,94.30%,9.01,3.53 717 | 716,1 11 13 13,8.49,91.30%,8.83,3.74 718 | 717,1 4 6 7,8.5,87.30%,8.78,4.49 719 | 718,2 10 11 13,8.5,89.50%,8.58,3.43 720 | 719,5 7 8 8,8.5,82.70%,8.43,4.67 721 | 720,2 2 3 13,8.51,87.60%,8.54,3.6 722 | 721,2 3 10 13,8.51,93.40%,9.02,3.94 723 | 722,2 4 6 6,8.52,94.10%,9.27,3.98 724 | 723,4 5 6 7,8.54,90.30%,8.94,3.82 725 | 724,4 5 8 12,8.54,93.30%,9.08,3.71 726 | 725,3 6 7 12,8.55,90%,8.81,3.76 727 | 726,3 7 7 13,8.56,93.40%,8.84,3.13 728 | 727,1 2 9 13,8.57,93.30%,9.21,4.17 729 | 728,2 2 4 13,8.57,91.10%,9.09,4.16 730 | 729,4 4 9 11,8.57,82.50%,8.26,4.41 731 | 730,1 2 3 11,8.58,95.70%,8.98,3.39 732 | 731,2 4 6 9,8.58,91.80%,9.06,4.19 733 | 732,2 8 11 12,8.58,88.30%,8.82,4.18 734 | 733,4 4 4 11,8.58,86%,8.81,4.65 735 | 734,1 2 3 3,8.59,95.50%,8.93,2.98 736 | 735,4 5 9 13,8.59,89.30%,9.33,4.75 737 | 736,4 6 10 12,8.59,93.80%,9.17,3.92 738 | 737,2 4 9 9,8.6,90%,8.71,3.82 739 | 738,2 6 6 10,8.61,94.10%,9.05,3.64 740 | 739,1 2 5 13,8.62,94.30%,9.16,3.68 741 | 740,6 8 9 13,8.62,93.90%,9.3,3.96 742 | 741,2 9 11 
11,8.63,89.30%,9.08,4.03 743 | 742,1 3 9 13,8.65,93.70%,9.25,4.13 744 | 743,2 2 8 10,8.65,94.30%,9.26,3.29 745 | 744,2 4 7 11,8.66,92.50%,8.98,3.79 746 | 745,2 6 7 8,8.66,89.60%,8.97,3.88 747 | 746,1 3 4 7,8.67,95%,9.48,3.83 748 | 747,4 8 9 11,8.68,93%,9.16,3.89 749 | 748,6 6 8 10,8.68,90.10%,8.9,3.96 750 | 749,5 5 9 10,8.69,86.90%,9.06,4.3 751 | 750,2 3 7 11,8.7,91.70%,9.09,4.21 752 | 751,2 8 9 11,8.7,90.50%,9.1,3.77 753 | 752,2 4 10 13,8.71,88.90%,9.07,4.43 754 | 753,1 3 8 8,8.72,86.50%,8.95,4.51 755 | 754,3 9 9 10,8.73,86.70%,8.78,3.81 756 | 755,4 11 12 13,8.74,91.20%,9.18,3.87 757 | 756,1 2 6 11,8.75,90%,9.19,4.25 758 | 757,2 5 7 10,8.75,92.20%,9.15,3.72 759 | 758,3 4 7 7,8.75,94%,9.41,3.8 760 | 759,6 8 12 12,8.77,91.50%,9.48,4.23 761 | 760,2 3 5 9,8.79,92%,9.31,4.31 762 | 761,4 4 8 10,8.8,91.40%,9.13,4.36 763 | 762,3 7 10 13,8.82,88.20%,9.22,4.63 764 | 763,4 5 8 8,8.82,86.40%,8.8,4.02 765 | 764,3 4 4 5,8.83,91.10%,9.13,3.95 766 | 765,3 5 9 12,8.83,90%,9.14,4.01 767 | 766,4 8 8 12,8.83,94.40%,9.39,3.84 768 | 767,4 12 12 12,8.83,86%,9.14,4.72 769 | 768,2 2 5 6,8.84,88.60%,9.16,4.38 770 | 769,9 10 11 13,8.84,89.70%,9.09,3.8 771 | 770,2 3 4 13,8.85,90.90%,9.19,3.9 772 | 771,3 10 11 12,8.85,89.40%,9.04,3.71 773 | 772,4 4 8 13,8.85,86.70%,8.78,4.2 774 | 773,5 6 6 10,8.85,88.90%,9.06,3.86 775 | 774,1 1 3 10,8.86,88.10%,8.88,3.86 776 | 775,1 2 4 5,8.86,90.40%,9.27,4.21 777 | 776,1 4 5 10,8.86,91.20%,9.01,3.61 778 | 777,2 5 8 13,8.87,93%,9.39,3.91 779 | 778,3 5 6 9,8.88,92.50%,9.31,3.67 780 | 779,7 8 10 13,8.89,92.10%,9.4,4.08 781 | 780,6 6 8 13,8.9,83.80%,8.68,4.24 782 | 781,3 9 10 12,8.92,90.20%,9.17,3.63 783 | 782,1 2 4 9,8.93,90.80%,9.28,3.92 784 | 783,1 3 3 10,8.94,89.60%,9.04,4.1 785 | 784,1 5 5 9,8.94,88.40%,9.09,3.7 786 | 785,1 4 6 9,8.95,86.10%,8.81,4.02 787 | 786,2 3 9 9,8.96,90.50%,9.32,3.95 788 | 787,4 4 5 10,8.96,86.30%,9.07,4.66 789 | 788,6 6 6 12,8.96,89.50%,9.18,4.48 790 | 789,1 3 5 10,8.97,93.20%,9.46,3.7 791 | 790,2 11 12 13,8.97,89.80%,9.34,3.94 792 
| 791,2 2 5 7,8.98,88.90%,9.03,3.68 793 | 792,8 8 9 13,8.98,81.20%,8.59,4.69 794 | 793,2 2 3 6,8.99,91.60%,9.28,4.01 795 | 794,3 4 7 11,8.99,87.30%,9.05,4.58 796 | 795,3 5 6 7,8.99,89.50%,9.04,3.8 797 | 796,4 5 5 9,8.99,91.30%,9.32,3.88 798 | 797,4 7 10 10,8.99,83.90%,8.74,4.08 799 | 798,3 6 12 13,9,86.30%,8.93,4.31 800 | 799,5 5 6 7,9,90.80%,9.44,4.03 801 | 800,6 8 11 13,9,86.30%,9.38,4.5 802 | 801,1 5 9 11,9.02,93.10%,9.53,3.93 803 | 802,2 10 11 11,9.02,86.40%,9.05,3.99 804 | 803,7 12 12 13,9.05,82.60%,8.84,4.09 805 | 804,5 10 10 12,9.08,86.70%,9.06,3.56 806 | 805,1 5 6 8,9.1,90%,9.15,3.84 807 | 806,2 2 3 11,9.1,86.10%,9.11,4.12 808 | 807,2 2 5 11,9.1,87.40%,9.42,4.41 809 | 808,2 5 8 9,9.1,92.50%,9.42,3.73 810 | 809,2 3 4 11,9.11,88.50%,9.26,4.08 811 | 810,2 3 6 10,9.11,92.20%,9.33,3.75 812 | 811,1 3 5 7,9.13,90%,9.38,4.02 813 | 812,1 5 10 13,9.13,88.60%,9.15,3.43 814 | 813,5 7 8 9,9.18,88.90%,9.48,4.21 815 | 814,5 11 12 12,9.19,82.80%,8.89,4.07 816 | 815,8 8 8 13,9.19,84.70%,9.17,4.29 817 | 816,3 3 6 11,9.21,90.20%,9.66,4.44 818 | 817,3 7 8 9,9.21,86.90%,9.2,3.82 819 | 818,2 8 8 10,9.22,92.80%,9.44,3.77 820 | 819,4 7 7 11,9.22,87.50%,9.49,4.52 821 | 820,2 3 8 12,9.23,93.20%,9.73,3.79 822 | 821,3 5 5 9,9.23,83.30%,8.9,4.05 823 | 822,5 6 7 12,9.23,84.60%,9.15,4.29 824 | 823,2 3 4 4,9.25,94%,9.67,3.57 825 | 824,3 3 3 9,9.25,84.50%,9.27,4.67 826 | 825,1 2 7 9,9.26,93.30%,9.74,3.81 827 | 826,1 9 11 13,9.26,88%,9.47,4.01 828 | 827,2 2 6 8,9.27,95.50%,9.69,3.59 829 | 828,5 8 9 12,9.29,92.30%,9.72,3.87 830 | 829,1 1 4 10,9.3,85.60%,9.15,5.02 831 | 830,4 5 10 13,9.3,91.10%,9.74,4.38 832 | 831,2 3 3 9,9.31,93.20%,9.71,3.76 833 | 832,3 3 4 9,9.34,92.10%,9.67,3.77 834 | 833,3 4 6 10,9.34,89.80%,9.84,4.72 835 | 834,3 4 6 12,9.34,93.30%,9.85,3.99 836 | 835,3 7 7 10,9.34,89%,9.62,4.27 837 | 836,3 9 11 12,9.35,90.10%,9.79,4.37 838 | 837,2 4 4 7,9.37,89.40%,9.55,4.27 839 | 838,3 4 5 9,9.38,84.90%,9.17,4.45 840 | 839,4 5 7 12,9.4,87.50%,9.33,3.91 841 | 840,2 7 9 
13,9.41,87.20%,9.71,4.77 842 | 841,5 9 9 11,9.44,89.50%,9.64,3.84 843 | 842,6 6 6 8,9.44,91.40%,9.75,4.17 844 | 843,1 7 12 12,9.45,80.60%,8.78,4.34 845 | 844,1 6 9 10,9.46,91.80%,9.7,3.71 846 | 845,2 5 6 10,9.46,88%,9.51,4.42 847 | 846,3 8 9 12,9.46,90.20%,9.87,4.4 848 | 847,2 3 6 11,9.47,90.20%,9.82,4.38 849 | 848,2 4 5 9,9.49,86.30%,9.52,4.62 850 | 849,4 7 11 12,9.49,86.30%,9.64,4.7 851 | 850,2 2 4 9,9.51,90.50%,9.84,4.3 852 | 851,2 6 8 8,9.51,92.50%,9.87,4.08 853 | 852,4 5 5 7,9.51,82.40%,9.15,4.34 854 | 853,2 3 8 11,9.53,92.70%,10.12,4.06 855 | 854,4 7 8 13,9.53,90.70%,9.97,4.28 856 | 855,6 6 8 12,9.53,90.80%,9.88,4.21 857 | 856,5 6 8 9,9.54,89.90%,9.82,4.35 858 | 857,3 3 5 9,9.55,87%,9.56,3.99 859 | 858,4 5 7 11,9.55,87.30%,9.77,4.97 860 | 859,4 9 12 12,9.56,89.40%,10.06,4.28 861 | 860,3 3 6 9,9.57,94.20%,10.28,4.1 862 | 861,3 5 9 13,9.57,91.30%,9.91,4.01 863 | 862,3 7 8 13,9.57,84%,9.5,4.79 864 | 863,4 4 5 7,9.57,85.10%,9.27,4.41 865 | 864,1 5 5 11,9.61,88%,9.77,4.11 866 | 865,2 4 6 13,9.65,87.20%,9.94,4.99 867 | 866,6 7 9 12,9.65,86.80%,9.59,4.66 868 | 867,1 5 6 13,9.7,90.30%,10.08,4.65 869 | 868,3 8 8 11,9.71,89.50%,9.88,4.57 870 | 869,2 3 4 10,9.72,91.90%,10.1,4.16 871 | 870,6 6 8 8,9.72,86%,9.63,4.4 872 | 871,4 9 9 10,9.73,89.90%,9.98,4.04 873 | 872,3 3 7 9,9.74,90.60%,9.91,3.75 874 | 873,1 7 9 10,9.78,86.80%,9.74,3.86 875 | 874,2 3 9 13,9.79,87.50%,10.07,5.28 876 | 875,3 3 3 5,9.79,85.70%,9.46,3.98 877 | 876,5 6 9 12,9.79,87.20%,9.84,4.27 878 | 877,6 9 10 11,9.8,91.40%,10.13,4.08 879 | 878,2 4 5 11,9.82,87.60%,10.03,4.69 880 | 879,1 2 5 10,9.83,91.70%,10.06,3.98 881 | 880,2 2 2 5,9.85,78.70%,8.96,4.68 882 | 881,6 12 13 13,9.85,87.30%,9.92,3.85 883 | 882,3 3 6 10,9.87,86.40%,9.7,4.17 884 | 883,3 4 8 11,9.88,87.20%,9.9,4.84 885 | 884,4 4 6 13,9.88,82.60%,9.4,5.04 886 | 885,4 6 7 10,9.88,86.50%,10.04,4.72 887 | 886,6 11 11 12,9.89,86.40%,9.76,3.92 888 | 887,3 6 9 11,9.91,86.80%,9.87,4.65 889 | 888,5 7 9 13,9.94,90.90%,10.39,4.07 890 | 889,2 4 12 
12,9.95,85.70%,9.97,4.7 891 | 890,4 9 11 12,9.96,89%,10.16,4.25 892 | 891,7 8 9 13,9.96,84%,9.71,4.26 893 | 892,2 6 12 12,9.97,89%,10.49,5.13 894 | 893,3 4 5 6,9.97,84%,9.91,4.98 895 | 894,6 10 12 12,9.98,88.80%,10.25,4.61 896 | 895,5 5 5 9,10,87.60%,9.76,4.11 897 | 896,5 5 6 11,10,87.10%,9.97,4.57 898 | 897,6 8 9 9,10,86.20%,10.02,4.07 899 | 898,3 3 6 13,10.01,86.40%,10.24,5.25 900 | 899,6 8 9 11,10.01,84.40%,9.79,4.64 901 | 900,1 3 3 11,10.02,88.20%,9.98,4.68 902 | 901,4 5 6 10,10.02,90.30%,10.49,4.36 903 | 902,1 2 4 7,10.03,89.10%,10.3,4.82 904 | 903,2 5 8 11,10.03,85.90%,10.22,5.08 905 | 904,3 4 4 13,10.03,88.80%,10.08,4.15 906 | 905,6 7 8 9,10.03,84.50%,9.67,4.35 907 | 906,1 11 11 13,10.06,87.70%,10.01,4.25 908 | 907,1 8 10 11,10.07,86.90%,10.02,3.39 909 | 908,2 3 6 9,10.1,93.10%,10.54,4.14 910 | 909,1 3 5 9,10.11,90.60%,10.39,4.46 911 | 910,3 3 7 12,10.11,84.50%,10.02,5.26 912 | 911,4 5 7 9,10.12,86.70%,10.41,5 913 | 912,1 2 8 13,10.13,91.30%,10.43,4.73 914 | 913,4 6 6 9,10.14,87.60%,10.45,5.26 915 | 914,1 4 4 8,10.15,89.70%,10.37,5.43 916 | 915,1 5 10 11,10.16,86.30%,9.92,3.62 917 | 916,3 4 6 11,10.16,88.90%,10.38,4.56 918 | 917,2 4 8 9,10.17,84%,9.9,4.57 919 | 918,1 4 5 13,10.18,87.70%,10.29,4.48 920 | 919,2 2 7 12,10.19,83.20%,9.78,4.39 921 | 920,3 3 6 7,10.19,88.70%,10.6,4.55 922 | 921,1 5 9 13,10.2,84.90%,10.1,4.71 923 | 922,5 6 7 13,10.2,86.40%,10.44,5.42 924 | 923,5 5 8 10,10.21,81.30%,9.68,3.96 925 | 924,2 4 6 12,10.23,94.50%,11.06,4.75 926 | 925,6 7 8 11,10.24,83.90%,9.96,4.83 927 | 926,7 9 9 13,10.26,88%,10.34,4.64 928 | 927,3 6 9 12,10.27,92.70%,10.72,4.16 929 | 928,6 9 12 13,10.28,88.30%,10.45,4.3 930 | 929,4 7 9 13,10.29,84.30%,10.35,5.72 931 | 930,5 6 8 12,10.29,87.90%,10.35,4.06 932 | 931,2 4 6 7,10.31,90.10%,10.42,4.27 933 | 932,2 5 10 10,10.32,78.30%,9.42,3.9 934 | 933,6 6 7 12,10.32,83.80%,10.01,4.72 935 | 934,6 9 9 11,10.33,82.40%,9.64,5 936 | 935,5 8 11 12,10.34,86.90%,10.5,4.16 937 | 936,5 6 8 10,10.35,87%,10.31,4.38 938 | 937,6 11 12 
13,10.36,80.10%,9.55,4.95 939 | 938,2 2 8 8,10.38,91.40%,10.72,4.71 940 | 939,2 7 12 13,10.44,80.40%,9.67,4.84 941 | 940,2 6 8 12,10.45,94%,11.09,4.51 942 | 941,3 4 9 13,10.45,79.60%,9.45,5.34 943 | 942,4 5 10 12,10.45,86.20%,10.27,4.29 944 | 943,1 2 7 11,10.5,88.10%,10.57,4.68 945 | 944,4 5 6 8,10.5,87.30%,10.41,4.23 946 | 945,6 10 12 13,10.5,82.50%,10.05,4.71 947 | 946,1 3 9 9,10.51,85.20%,10.11,4.02 948 | 947,1 4 4 11,10.51,86.70%,10.49,5.21 949 | 948,2 3 9 10,10.55,90%,10.75,4.18 950 | 949,1 2 3 13,10.59,91.70%,10.95,4.79 951 | 950,1 6 6 6,10.59,85.60%,10.17,5.05 952 | 951,1 2 2 9,10.62,82.40%,9.9,4.48 953 | 952,1 3 6 11,10.64,88.10%,10.57,4.29 954 | 953,5 10 12 13,10.64,83.50%,10.21,4.41 955 | 954,2 3 6 6,10.65,91.90%,11.04,4.36 956 | 955,6 7 10 12,10.66,85.20%,10.62,4.7 957 | 956,7 8 8 12,10.66,83.60%,10.32,4.47 958 | 957,3 4 6 8,10.69,90.70%,11.2,5.02 959 | 958,1 7 9 11,10.71,85.10%,10.44,4.24 960 | 959,2 3 6 13,10.71,90%,10.81,4.54 961 | 960,2 2 5 12,10.72,84.80%,10.48,5.13 962 | 961,2 6 8 13,10.72,87.20%,10.87,5.06 963 | 962,8 8 10 12,10.72,78.70%,9.71,5.6 964 | 963,1 3 8 13,10.73,90.40%,11.06,5.02 965 | 964,4 4 7 10,10.74,75.20%,8.92,4.39 966 | 965,1 7 10 13,10.75,83.70%,10.33,4.61 967 | 966,1 9 10 13,10.76,83%,10.25,4.39 968 | 967,3 3 4 11,10.76,88.60%,10.88,5.41 969 | 968,2 5 7 7,10.77,82.90%,10.41,4.71 970 | 969,3 9 10 13,10.83,83%,10.85,6.21 971 | 970,2 3 4 7,10.87,86.70%,10.78,4.85 972 | 971,4 4 8 12,10.87,92%,11.49,4.93 973 | 972,1 2 6 10,10.89,86.80%,10.8,4.94 974 | 973,1 5 12 12,10.9,81.30%,10.12,4.86 975 | 974,5 6 6 8,10.92,81.50%,10.49,5.85 976 | 975,7 7 8 11,10.95,80.40%,10.02,4.42 977 | 976,1 3 7 10,10.97,86.50%,10.9,5.15 978 | 977,3 3 9 12,10.97,87%,10.96,4.84 979 | 978,3 5 7 10,10.99,84.50%,10.67,5.1 980 | 979,4 10 12 13,10.99,81.50%,10.2,5.37 981 | 980,2 3 10 12,11,86.40%,11.17,4.04 982 | 981,3 4 6 6,11.01,90.50%,11.26,4.63 983 | 982,5 8 8 8,11.02,80.20%,10.42,5.19 984 | 983,6 8 8 12,11.02,88.10%,11.31,4.99 985 | 984,2 3 4 
9,11.05,88.20%,11.11,4.67 986 | 985,2 6 7 11,11.07,82.60%,10.49,5.02 987 | 986,5 9 12 12,11.07,83%,10.77,4.73 988 | 987,1 2 7 12,11.1,77.90%,9.65,5.55 989 | 988,2 4 5 6,11.11,91%,11.61,4.46 990 | 989,5 5 8 13,11.11,80.80%,10.59,4.46 991 | 990,2 3 3 10,11.15,84.60%,11.25,4.87 992 | 991,3 4 8 12,11.17,87.70%,11.15,5.42 993 | 992,2 4 6 11,11.18,85.10%,10.98,5.39 994 | 993,2 2 8 9,11.21,87%,11.42,4.95 995 | 994,1 5 6 7,11.22,88.70%,11.53,4.63 996 | 995,5 8 10 11,11.22,87.10%,10.92,4.73 997 | 996,4 4 9 12,11.24,78.30%,10.16,4.72 998 | 997,2 5 6 6,11.25,78%,9.88,5.53 999 | 998,2 4 9 12,11.26,87.20%,11.26,4.76 1000 | 999,4 8 11 13,11.28,82.40%,10.45,5.14 1001 | 1000,4 9 10 13,11.28,82.40%,10.72,4.8 1002 | 1001,3 3 5 12,11.31,86.70%,11.38,5.1 1003 | 1002,2 4 4 6,11.32,90.20%,11.66,4.67 1004 | 1003,2 4 6 8,11.33,93.30%,11.77,4.44 1005 | 1004,3 4 6 9,11.33,82.50%,10.75,5.42 1006 | 1005,2 3 4 5,11.35,84.40%,10.97,5.57 1007 | 1006,3 3 6 12,11.36,86.90%,11.51,6.01 1008 | 1007,3 6 7 13,11.36,82.50%,10.76,6.05 1009 | 1008,1 4 7 7,11.38,85.50%,11.25,4.6 1010 | 1009,4 4 5 11,11.39,85%,11.3,4.82 1011 | 1010,6 7 9 9,11.41,80.40%,10.68,5.9 1012 | 1011,7 8 10 10,11.42,79.10%,10.55,4.89 1013 | 1012,4 8 9 13,11.43,82.70%,10.72,5.97 1014 | 1013,1 3 4 8,11.45,89.60%,11.57,4.61 1015 | 1014,8 9 9 12,11.45,82.40%,10.81,4.71 1016 | 1015,3 7 8 11,11.51,83%,11.11,5.62 1017 | 1016,4 6 7 12,11.52,81.40%,11.17,5.36 1018 | 1017,1 6 7 9,11.54,84.40%,11.3,4.67 1019 | 1018,3 7 9 12,11.54,85.90%,11.56,5.56 1020 | 1019,4 5 6 13,11.55,78%,10.18,5.07 1021 | 1020,3 3 5 13,11.57,84.90%,11.35,4.69 1022 | 1021,2 3 5 7,11.58,88.90%,11.67,4.97 1023 | 1022,4 6 8 13,11.61,81%,10.84,5.09 1024 | 1023,2 7 9 11,11.62,83.50%,11.26,5.31 1025 | 1024,1 3 7 12,11.64,85.40%,10.98,4.98 1026 | 1025,3 5 7 9,11.66,86.20%,11.44,4.86 1027 | 1026,2 6 10 13,11.67,84.60%,11.3,5.46 1028 | 1027,6 8 9 12,11.67,86.60%,11.68,5.06 1029 | 1028,1 5 7 9,11.68,84.40%,11.15,4.84 1030 | 1029,1 3 7 7,11.69,85.20%,11.44,4.62 1031 | 1030,2 9 11 
13,11.7,82.20%,11.25,5.36 1032 | 1031,3 7 9 11,11.7,88.10%,11.82,4.84 1033 | 1032,4 8 8 10,11.71,85.70%,11.63,5.51 1034 | 1033,1 4 8 9,11.73,89%,11.9,5.07 1035 | 1034,2 5 11 12,11.73,77.10%,10.24,5.29 1036 | 1035,2 3 7 13,11.77,86.30%,11.62,5.24 1037 | 1036,3 5 10 13,11.79,82.30%,11.04,5.37 1038 | 1037,3 5 6 10,11.83,85.80%,11.63,5.59 1039 | 1038,4 5 5 10,11.86,81.20%,10.96,4.46 1040 | 1039,7 8 9 10,11.86,79.80%,11.04,5.33 1041 | 1040,2 5 8 10,11.88,82.90%,11.02,4.82 1042 | 1041,4 6 6 12,11.88,86%,11.56,5.8 1043 | 1042,5 8 10 12,11.88,84.30%,11.18,5.05 1044 | 1043,3 3 3 11,11.91,78%,10.71,5.38 1045 | 1044,5 5 8 12,11.91,79.50%,10.98,5.06 1046 | 1045,1 3 3 6,11.93,87.20%,11.83,5.16 1047 | 1046,2 5 7 13,11.99,83.80%,11.74,5.61 1048 | 1047,1 5 8 11,12.07,89.20%,12.42,5.56 1049 | 1048,2 3 5 13,12.07,85.50%,11.88,5.34 1050 | 1049,3 4 8 9,12.07,88.10%,12.24,5.03 1051 | 1050,2 5 5 7,12.09,82%,11.59,4.86 1052 | 1051,4 10 11 13,12.16,76.20%,10.51,5.57 1053 | 1052,2 6 6 7,12.21,78.40%,10.67,5.54 1054 | 1053,7 10 10 12,12.21,78%,11.01,5.07 1055 | 1054,2 5 7 11,12.22,82.40%,11.59,5.71 1056 | 1055,4 4 6 9,12.24,77.40%,10.48,5.57 1057 | 1056,6 9 10 12,12.25,80.50%,11.36,5.94 1058 | 1057,6 9 9 12,12.26,87%,12.17,5.82 1059 | 1058,2 3 6 12,12.28,91%,12.71,5.2 1060 | 1059,2 4 4 5,12.28,82.10%,11.3,5.16 1061 | 1060,2 4 4 10,12.28,89.60%,12.37,5.14 1062 | 1061,3 3 3 4,12.34,82.40%,11.57,6.01 1063 | 1062,2 2 2 7,12.35,73.30%,9.51,5.67 1064 | 1063,4 9 10 11,12.38,76.80%,10.69,5.81 1065 | 1064,7 9 10 11,12.41,76.90%,10.81,5.32 1066 | 1065,1 4 5 7,12.43,87.90%,12.45,5.27 1067 | 1066,3 4 9 9,12.43,84.20%,12.03,4.99 1068 | 1067,3 3 6 8,12.46,74.60%,10.38,5.58 1069 | 1068,4 4 7 8,12.52,83.80%,12.28,5.55 1070 | 1069,3 5 11 12,12.56,79.90%,11.32,5.7 1071 | 1070,4 4 6 10,12.57,87.50%,12.6,5.35 1072 | 1071,4 6 9 13,12.58,85.50%,12.59,5.18 1073 | 1072,3 5 6 11,12.59,82.50%,11.93,6.55 1074 | 1073,1 5 8 13,12.64,80.10%,11.42,5.73 1075 | 1074,3 3 9 13,12.64,74.60%,10.56,5.85 1076 | 1075,7 7 9 
10,12.64,75.80%,11.23,5.19 1077 | 1076,2 4 8 13,12.7,82.20%,12.07,5.92 1078 | 1077,4 4 6 8,12.7,84.20%,12.14,6.52 1079 | 1078,2 4 5 7,12.71,78.40%,11.3,5.62 1080 | 1079,1 4 7 9,12.74,82.20%,12,5.11 1081 | 1080,4 7 10 12,12.74,79.80%,11.37,5.54 1082 | 1081,4 5 8 13,12.76,78%,11.33,5.89 1083 | 1082,5 6 10 12,12.78,75.80%,10.91,5.45 1084 | 1083,6 9 11 13,12.78,72.50%,10.29,5.96 1085 | 1084,5 7 7 9,12.8,74.20%,11.06,5.06 1086 | 1085,1 4 5 9,12.83,81.90%,11.81,6.41 1087 | 1086,5 7 11 13,12.83,70.80%,10,5.17 1088 | 1087,3 3 5 10,12.84,82.90%,12.05,5.32 1089 | 1088,4 8 12 12,12.85,85.60%,12.23,5.83 1090 | 1089,2 3 3 3,12.87,82.10%,11.84,5.96 1091 | 1090,2 5 10 13,12.92,82.10%,12.4,5.91 1092 | 1091,1 4 6 6,12.97,80%,11.77,6.38 1093 | 1092,2 3 8 8,12.98,80.40%,11.84,5.7 1094 | 1093,6 9 11 12,13.02,75.90%,10.98,5.93 1095 | 1094,1 1 6 9,13.05,81%,11.34,6.4 1096 | 1095,5 9 10 13,13.05,78%,11.54,5.87 1097 | 1096,1 5 6 11,13.06,81%,11.93,6.47 1098 | 1097,4 4 5 12,13.09,79%,11.5,5.91 1099 | 1098,3 5 6 13,13.1,76%,11.28,5.52 1100 | 1099,1 4 5 8,13.16,82.70%,12.46,5.64 1101 | 1100,5 5 7 10,13.19,73.80%,11.38,4.96 1102 | 1101,2 2 2 9,13.21,76.60%,10.98,6.35 1103 | 1102,5 7 10 10,13.28,76.50%,12.05,5.27 1104 | 1103,2 3 5 10,13.29,80.20%,12.07,5.35 1105 | 1104,2 8 10 13,13.34,80.90%,12.56,5.95 1106 | 1105,6 7 7 10,13.34,73%,10.83,5.3 1107 | 1106,2 7 10 11,13.37,78.90%,12.25,6.04 1108 | 1107,3 5 7 8,13.37,85.40%,12.95,5.94 1109 | 1108,6 8 11 12,13.39,77.90%,12.23,5.74 1110 | 1109,1 3 6 10,13.46,82.10%,12.08,6.87 1111 | 1110,4 10 10 12,13.46,78.40%,12.16,6.28 1112 | 1111,7 10 11 13,13.49,73.30%,11.07,5.79 1113 | 1112,4 5 8 10,13.55,75.90%,11.31,5.9 1114 | 1113,3 8 9 11,13.57,80.10%,12.64,6.8 1115 | 1114,6 12 12 12,13.66,76.60%,11.63,6.62 1116 | 1115,2 10 12 13,13.77,81.40%,13.06,6.04 1117 | 1116,6 7 10 10,13.83,70.80%,10.47,5.73 1118 | 1117,2 3 7 8,13.89,80.20%,12.68,5.59 1119 | 1118,3 5 9 9,13.9,79.40%,12.9,6.13 1120 | 1119,2 2 5 9,13.93,74.30%,11.71,6.28 1121 | 1120,2 4 7 
9,13.95,80.60%,13.14,6.36 1122 | 1121,2 3 8 9,14.01,81.30%,12.85,6.23 1123 | 1122,3 5 5 11,14.19,79%,13.09,5.43 1124 | 1123,3 3 9 11,14.2,79.10%,13.05,5.65 1125 | 1124,5 7 9 11,14.2,71.10%,11.09,5.89 1126 | 1125,1 3 7 9,14.22,80.40%,12.74,5.47 1127 | 1126,2 4 9 10,14.24,79.30%,12.99,6 1128 | 1127,3 5 12 12,14.26,67.20%,9.62,5.75 1129 | 1128,3 3 8 10,14.27,80.30%,13.49,5.53 1130 | 1129,4 4 5 13,14.32,73.30%,11.87,7.01 1131 | 1130,3 4 4 11,14.39,76.40%,12.57,6.37 1132 | 1131,4 5 8 11,14.4,73.80%,11.71,6.59 1133 | 1132,3 4 4 10,14.42,77.10%,12.1,6.66 1134 | 1133,2 6 12 13,14.46,78.60%,13.39,5.8 1135 | 1134,3 6 11 13,14.46,73.20%,11.3,6.74 1136 | 1135,1 3 5 6,14.49,86.30%,14.16,5.91 1137 | 1136,3 3 4 13,14.49,76.80%,12.26,6.76 1138 | 1137,4 6 9 12,14.54,77.80%,12.61,7.02 1139 | 1138,6 10 10 13,14.57,69.10%,10.81,5.99 1140 | 1139,2 5 9 10,14.58,78.10%,13.05,6.18 1141 | 1140,5 5 9 11,14.6,71.10%,12.27,5.16 1142 | 1141,3 4 7 12,14.65,82.40%,14.04,6.09 1143 | 1142,2 6 9 12,14.66,82.70%,13.71,6.54 1144 | 1143,5 6 6 12,14.72,80.70%,13.46,6.01 1145 | 1144,2 5 9 11,14.75,76.50%,13.08,6.84 1146 | 1145,5 9 9 12,14.78,75.70%,12.97,6.67 1147 | 1146,4 4 4 10,14.82,76.90%,12.75,6.5 1148 | 1147,3 4 10 10,14.86,75.50%,12.9,6.06 1149 | 1148,2 2 5 10,14.91,78.60%,13.45,5.78 1150 | 1149,3 3 9 10,14.94,78.10%,13.6,6.1 1151 | 1150,4 7 8 12,14.95,79.40%,13.86,6.69 1152 | 1151,7 9 12 12,14.96,64.20%,9.28,5.41 1153 | 1152,2 2 7 7,14.97,69.80%,11.21,5.96 1154 | 1153,3 6 8 13,14.97,81.30%,14.28,6.12 1155 | 1154,2 2 3 4,14.98,77.40%,13.18,6.13 1156 | 1155,5 7 8 10,14.99,71.20%,12.15,5.63 1157 | 1156,1 5 11 12,15.01,71%,11.36,6.19 1158 | 1157,4 6 8 10,15.11,76.70%,12.71,7.59 1159 | 1158,2 4 10 11,15.16,79.20%,13.93,6.45 1160 | 1159,3 6 6 13,15.17,72.80%,12.06,6.38 1161 | 1160,3 6 8 12,15.17,80.50%,13.79,6.15 1162 | 1161,1 4 4 6,15.31,79.60%,13.55,6.77 1163 | 1162,2 5 8 8,15.42,71%,11.68,6.59 1164 | 1163,7 7 7 12,15.42,65.20%,8.98,6.11 1165 | 1164,4 5 7 13,15.43,72.80%,12.42,6.22 1166 | 1165,5 5 5 
12,15.48,67.40%,10.23,6.42 1167 | 1166,2 6 11 12,15.49,77.70%,13.96,6.35 1168 | 1167,2 3 8 10,15.61,82.20%,14.94,6.54 1169 | 1168,1 5 5 10,15.65,71.20%,11.64,6.98 1170 | 1169,2 5 5 9,15.7,78.50%,14.39,5.98 1171 | 1170,5 7 12 12,15.74,66%,10.44,6.26 1172 | 1171,4 7 8 11,15.79,69%,10.78,6.62 1173 | 1172,3 3 8 13,15.87,69.70%,11.39,6.25 1174 | 1173,1 3 12 13,16.02,75.60%,13.66,6.86 1175 | 1174,1 3 8 10,16.05,71%,12.31,6.49 1176 | 1175,2 3 6 7,16.08,80.80%,15.05,6.62 1177 | 1176,2 2 6 7,16.26,72.50%,13.03,6.84 1178 | 1177,5 6 9 13,16.28,69%,12.46,6.11 1179 | 1178,6 10 11 12,16.3,69.70%,12.94,6.17 1180 | 1179,4 9 9 12,16.33,71.80%,13.42,6.47 1181 | 1180,2 6 6 9,16.46,72.80%,13.03,6.55 1182 | 1181,1 3 11 12,16.48,75.60%,14.03,7.05 1183 | 1182,3 10 11 13,16.48,71.20%,13.31,6.3 1184 | 1183,1 4 10 12,16.53,72.10%,12.81,6.61 1185 | 1184,5 5 6 8,16.54,78.10%,14.9,6.54 1186 | 1185,6 8 9 10,16.55,71.20%,12.69,7.07 1187 | 1186,2 6 10 11,16.57,73.10%,13.69,6.98 1188 | 1187,3 9 9 13,16.62,67.20%,12.08,6.04 1189 | 1188,2 4 4 9,16.71,73.40%,13.42,7.28 1190 | 1189,10 12 12 12,16.86,64.10%,10.25,5.73 1191 | 1190,2 3 7 10,16.91,76.90%,14.96,6.78 1192 | 1191,2 7 10 10,17.01,68.10%,13.35,5.81 1193 | 1192,8 8 8 11,17.01,63.50%,10.21,5.97 1194 | 1193,6 6 9 10,17.09,69.80%,13.35,6.88 1195 | 1194,2 3 7 7,17.1,76.50%,15.28,6.63 1196 | 1195,6 6 6 10,17.22,66%,11.49,6.49 1197 | 1196,5 7 7 10,17.26,68%,13.19,6.49 1198 | 1197,2 2 9 12,17.27,72.90%,14.93,6.39 1199 | 1198,3 6 8 10,17.4,66.60%,11.97,6.77 1200 | 1199,3 5 7 12,17.46,71%,14.01,7.67 1201 | 1200,3 5 8 9,17.49,75.80%,15.67,7.53 1202 | 1201,3 9 9 11,17.68,70.50%,13.69,7.48 1203 | 1202,2 8 10 12,17.74,79.80%,16.35,7.45 1204 | 1203,3 8 9 13,17.76,73.40%,15.28,7.09 1205 | 1204,9 9 9 12,17.85,63.80%,9.66,6.59 1206 | 1205,3 4 8 13,17.98,70%,13.65,7.61 1207 | 1206,2 6 7 10,18.01,69.80%,12.96,7.22 1208 | 1207,2 6 9 10,18.43,69.70%,14.78,6.85 1209 | 1208,4 5 6 9,18.44,75.30%,16.49,7.02 1210 | 1209,8 10 12 13,18.47,67.30%,14.31,6.27 1211 | 1210,2 6 
8 9,18.52,72.40%,15.58,6.65 1212 | 1211,3 5 12 13,18.66,67.90%,14.53,6.59 1213 | 1212,2 4 7 8,18.98,76.50%,16.76,7.16 1214 | 1213,4 8 9 12,18.98,72.50%,16.31,6.84 1215 | 1214,3 5 9 10,19.12,68.10%,14.34,7.53 1216 | 1215,2 3 10 10,19.18,69.50%,14.5,7.5 1217 | 1216,2 2 4 12,19.2,75.10%,15.22,9.15 1218 | 1217,2 2 9 10,19.3,65.30%,14.24,7.34 1219 | 1218,2 7 10 12,19.38,70.20%,15.86,7.05 1220 | 1219,2 2 4 4,19.5,72%,15.01,8.18 1221 | 1220,2 3 11 11,19.52,68.30%,14.88,7.67 1222 | 1221,1 3 3 8,19.57,74%,15.7,7.86 1223 | 1222,6 8 8 9,19.59,63%,11.71,7.2 1224 | 1223,8 10 12 12,19.59,63.60%,11.62,7.37 1225 | 1224,10 10 10 12,19.93,61.50%,10.24,6.8 1226 | 1225,2 3 9 12,20.01,67.30%,13.12,7.61 1227 | 1226,6 8 10 11,20.09,64.60%,13.61,7.17 1228 | 1227,7 8 8 10,20.1,61.20%,12.07,7.63 1229 | 1228,1 5 7 10,20.39,69.50%,16.27,7.63 1230 | 1229,3 5 10 11,20.53,69.50%,15.99,8.37 1231 | 1230,5 9 12 13,20.53,64.40%,15.67,7.03 1232 | 1231,1 3 8 11,20.54,73.40%,16.85,8.28 1233 | 1232,2 7 12 12,20.66,62.80%,12.25,7.39 1234 | 1233,1 2 8 10,20.88,67.70%,15.12,7.81 1235 | 1234,9 11 12 12,20.88,59.50%,10.72,6.43 1236 | 1235,1 6 10 12,21.11,65.80%,13.99,7.82 1237 | 1236,4 4 8 9,21.14,69.60%,16.8,8.28 1238 | 1237,2 3 3 5,21.16,69.50%,15.7,8.49 1239 | 1238,3 6 9 13,21.16,68.90%,17.22,7.59 1240 | 1239,3 3 7 8,21.37,68.30%,16.85,8.09 1241 | 1240,3 5 7 11,21.39,64%,14.47,7.8 1242 | 1241,2 3 5 8,21.44,72.90%,17.88,7.71 1243 | 1242,4 5 7 10,21.65,63.60%,14.31,7.8 1244 | 1243,5 7 10 13,22.5,61.60%,16.46,7.35 1245 | 1244,2 9 10 10,22.59,60.60%,16.58,7.36 1246 | 1245,2 4 5 10,22.9,67%,17.3,8.07 1247 | 1246,2 3 6 8,23.06,75.30%,20.16,8.88 1248 | 1247,5 9 11 13,23.25,61.50%,16.69,7.75 1249 | 1248,3 4 5 13,23.39,65.60%,17.76,8.49 1250 | 1249,1 5 9 12,23.75,62.60%,14.09,8.74 1251 | 1250,3 4 4 4,24.47,67.30%,17.82,9.81 1252 | 1251,5 7 9 10,24.53,61%,15.63,8.68 1253 | 1252,6 6 7 11,24.57,58%,13.11,8.12 1254 | 1253,2 5 10 12,24.67,63.70%,17.88,8.55 1255 | 1254,4 10 11 12,24.71,62.90%,17.56,8.97 1256 | 1255,2 4 
5 5,24.72,60.40%,15.23,8.46 1257 | 1256,2 2 5 8,24.84,63.50%,16.69,9.19 1258 | 1257,8 8 9 12,24.93,56.70%,12.65,7.52 1259 | 1258,3 4 7 8,25.04,61.10%,14.33,8.25 1260 | 1259,5 6 6 9,25.04,59.20%,14.12,8.75 1261 | 1260,3 5 10 10,25.12,54.30%,12.34,6.59 1262 | 1261,2 2 6 11,25.22,62.20%,18.4,8.58 1263 | 1262,4 9 10 12,25.27,61.70%,17.05,9.01 1264 | 1263,8 9 11 11,25.64,58%,15.82,8.41 1265 | 1264,3 9 11 13,26.26,57.60%,13.9,8.31 1266 | 1265,4 7 8 10,26.26,62.90%,18.82,9.21 1267 | 1266,5 6 7 9,26.31,60.10%,16.12,9.07 1268 | 1267,2 6 9 9,26.53,58%,14.29,8.07 1269 | 1268,2 3 13 13,26.93,58.40%,17.55,9.57 1270 | 1269,1 7 12 13,27.75,56.30%,15.1,9.28 1271 | 1270,1 4 9 13,27.76,61.10%,18.83,10.02 1272 | 1271,4 10 10 11,27.76,54.80%,19.34,9.04 1273 | 1272,3 4 5 11,28.16,63.30%,20.54,9.87 1274 | 1273,3 3 6 6,28.41,58%,14.18,8.38 1275 | 1274,6 6 9 13,28.91,53.80%,13.19,8.3 1276 | 1275,4 5 9 12,29.11,60.90%,22.06,10.07 1277 | 1276,2 3 7 9,30.64,57.60%,18.33,10.12 1278 | 1277,4 4 7 13,30.78,53.80%,15.04,9.44 1279 | 1278,2 2 7 10,30.9,53.40%,16.2,8.61 1280 | 1279,2 5 12 13,31.29,58%,22.35,9.97 1281 | 1280,1 4 6 10,31.39,56.30%,18.81,9.37 1282 | 1281,3 6 7 10,31.92,57.40%,21.85,10.37 1283 | 1282,1 2 7 7,32.73,55%,17.28,10.37 1284 | 1283,2 5 8 12,32.82,54.60%,17.78,9.75 1285 | 1284,3 8 10 12,34.45,53.60%,20.48,10.33 1286 | 1285,6 7 11 11,35.1,46.40%,19.8,10.15 1287 | 1286,2 2 3 10,35.18,54.90%,18.21,10.27 1288 | 1287,2 5 7 9,35.29,52.70%,23.49,10.35 1289 | 1288,2 4 7 7,35.73,50.70%,17.22,9.67 1290 | 1289,2 2 6 9,36.11,52.10%,17.94,10.05 1291 | 1290,7 9 11 12,36.61,50%,17.36,10.44 1292 | 1291,4 7 9 11,37.32,51.40%,22.71,11.23 1293 | 1292,2 2 3 5,38.49,51.50%,15.34,8.81 1294 | 1293,5 5 8 11,39.1,45.80%,24.18,10.92 1295 | 1294,3 6 6 10,39.12,50.30%,18.48,10.7 1296 | 1295,7 8 9 12,41.23,48.70%,16.03,10.12 1297 | 1296,1 5 11 11,41.41,43.30%,15.93,9.75 1298 | 1297,5 10 10 11,42.21,41.30%,14.82,7.03 1299 | 1298,1 2 9 11,45.03,44.90%,22.27,12.16 1300 | 1299,1 5 5 5,45.49,43.60%,11.92,7.27 
1301 | 1300,4 9 11 11,47.16,43.30%,25.26,13.98 1302 | 1301,5 9 10 11,47.17,42.70%,28.12,13.4 1303 | 1302,1 7 13 13,47.38,41.70%,13.99,8.41 1304 | 1303,6 7 7 11,48.6,42.40%,27.65,14.51 1305 | 1304,3 3 7 7,48.84,41.20%,12.6,7.37 1306 | 1305,5 10 10 13,49.61,38.50%,17.09,8.06 1307 | 1306,3 5 8 13,50.53,41.90%,30.48,13.89 1308 | 1307,8 9 10 12,51.24,42.50%,21.95,13.98 1309 | 1308,2 5 6 9,52.95,44%,27.17,13.41 1310 | 1309,1 6 11 13,54.62,38.10%,18.47,9.98 1311 | 1310,1 3 9 10,55.16,40.80%,23.19,12.28 1312 | 1311,4 6 6 10,55.98,40.20%,22.83,12.31 1313 | 1312,4 4 7 7,56.41,38.80%,14.56,8.11 1314 | 1313,2 2 11 11,57.22,40.10%,13.2,8.62 1315 | 1314,5 7 9 12,57.59,39.60%,17.8,11.41 1316 | 1315,4 5 6 12,58.44,44.10%,21.82,16.05 1317 | 1316,4 6 7 9,58.49,42%,19.47,12.15 1318 | 1317,6 12 12 13,60.5,37.90%,15.11,8.3 1319 | 1318,4 4 10 10,60.61,39.70%,13.47,9.13 1320 | 1319,4 7 11 13,61.01,37%,30.84,16.48 1321 | 1320,4 5 6 11,61.48,38%,19.42,10.88 1322 | 1321,3 5 8 12,62.47,38%,27.97,13.34 1323 | 1322,3 4 6 13,63.1,40.40%,20.37,11.69 1324 | 1323,6 7 8 12,63.27,40.70%,16.9,12.18 1325 | 1324,2 4 11 12,63.58,39.40%,23.37,13.12 1326 | 1325,5 7 10 11,64.03,35.50%,23.76,13.12 1327 | 1326,2 2 13 13,64.44,37.30%,14.05,8.25 1328 | 1327,6 11 12 12,64.75,36.30%,15.4,8.93 1329 | 1328,5 8 9 13,64.8,34.90%,34.56,16.2 1330 | 1329,7 10 12 13,67.86,37.40%,22.3,12.55 1331 | 1330,5 6 8 13,68.54,36.40%,22.65,11.42 1332 | 1331,2 7 8 9,69.74,39.80%,18.28,10.87 1333 | 1332,2 2 6 13,71.49,33.10%,30.83,16.35 1334 | 1333,5 6 9 11,72.21,33.40%,26.15,15 1335 | 1334,1 4 7 11,72.93,35.70%,29.78,16.43 1336 | 1335,6 9 9 10,80.98,31.60%,32.75,18.6 1337 | 1336,2 5 7 8,83.96,32.90%,28.52,15.64 1338 | 1337,7 8 8 13,84.91,29.50%,27.08,15.03 1339 | 1338,5 7 7 11,88.29,27.40%,25.62,12.84 1340 | 1339,3 3 7 13,88.82,27.60%,32.8,18.62 1341 | 1340,4 8 8 13,88.89,29.10%,24.53,14.05 1342 | 1341,4 8 8 11,88.95,30%,23.19,13.84 1343 | 1342,3 8 8 10,90.4,28.90%,25.35,16.08 1344 | 1343,2 7 7 10,90.8,27%,26.85,12.93 1345 | 1344,5 
5 7 11,91.1,26.10%,25.79,13.15 1346 | 1345,2 3 8 13,92.19,32%,23.64,12.74 1347 | 1346,7 8 10 11,95.22,29.80%,24.05,14.09 1348 | 1347,2 2 10 11,96.37,30.20%,26.2,17.05 1349 | 1348,9 11 12 13,99.38,27.90%,22.06,13.19 1350 | 1349,1 6 6 8,101.21,29.80%,21.64,13.19 1351 | 1350,3 3 8 8,104.24,30.80%,15.65,10.6 1352 | 1351,1 8 12 12,104.58,29%,20.38,13.12 1353 | 1352,3 3 5 7,104.76,29.20%,28.32,17.86 1354 | 1353,2 4 7 12,108.41,28.10%,28.97,16.35 1355 | 1354,2 9 13 13,110.09,24.90%,30.9,19.5 1356 | 1355,3 7 9 13,110.86,26%,39.84,21.67 1357 | 1356,2 4 10 10,111.72,25.80%,28.72,18.53 1358 | 1357,3 6 6 11,112.28,27.10%,24.73,15.29 1359 | 1358,3 5 7 13,122.99,23.60%,38.68,21.2 1360 | 1359,2 5 5 10,188.2,22.10%,24.09,16.42 1361 | 1360,1 4 5 6,195.91,28.70%,18.38,11.81 1362 | 1361,1 3 4 6,201.57,26.90%,19.77,12.1 1363 | 1362,2 3 5 12,207.89,20.70%,29.96,22.37 -------------------------------------------------------------------------------- /src/tot/data/crosswords/mini0505_0_100_5.json: -------------------------------------------------------------------------------- 1 | [ 2 | [ 3 | [ 4 | "An agendum; something to be done", 5 | "An engine", 6 | "Pretentious; flowery", 7 | "A salon; a hall", 8 | "To mock; to sneer", 9 | "To heap", 10 | "An Indian antelope", 11 | "To intend; to plan; to devise; a nettle; to guess", 12 | "A nozzle", 13 | "Desiccator; more dry" 14 | ], 15 | [ 16 | "A", 17 | "G", 18 | "E", 19 | "N", 20 | "D", 21 | "M", 22 | "O", 23 | "T", 24 | "O", 25 | "R", 26 | "A", 27 | "R", 28 | "T", 29 | "S", 30 | "Y", 31 | "S", 32 | "A", 33 | "L", 34 | "L", 35 | "E", 36 | "S", 37 | "L", 38 | "E", 39 | "E", 40 | "R" 41 | ] 42 | ], 43 | [ 44 | [ 45 | "To dry up", 46 | "To outdo; to bandy words", 47 | "A Yoruba-speaking people", 48 | "A Passover meal", 49 | "Eternal, everlasting", 50 | "To ascend; to get up; to come up", 51 | "To regain", 52 | "To elude", 53 | "One who files", 54 | "To crave" 55 | ], 56 | [ 57 | "A", 58 | "R", 59 | "E", 60 | "F", 61 | "Y", 62 | "R", 63 | "E", 64 | 
"V", 65 | "I", 66 | "E", 67 | "I", 68 | "G", 69 | "A", 70 | "L", 71 | "A", 72 | "S", 73 | "E", 74 | "D", 75 | "E", 76 | "R", 77 | "E", 78 | "T", 79 | "E", 80 | "R", 81 | "N" 82 | ] 83 | ], 84 | [ 85 | [ 86 | "(yet more) dissonant jazz, with solo improvisations and complex rhythms", 87 | "An Indian mallow", 88 | "A religious brother", 89 | "To take; to receive", 90 | "Argal; crude tartar", 91 | "A comic actor or operatic clown", 92 | "A mistake", 93 | "Existence; a creature", 94 | "The ruby snapper fish", 95 | "Apparel; an egg preparation used to clarify wine" 96 | ], 97 | [ 98 | "B", 99 | "E", 100 | "B", 101 | "O", 102 | "P", 103 | "U", 104 | "R", 105 | "E", 106 | "N", 107 | "A", 108 | "F", 109 | "R", 110 | "I", 111 | "A", 112 | "R", 113 | "F", 114 | "O", 115 | "N", 116 | "G", 117 | "E", 118 | "O", 119 | "R", 120 | "G", 121 | "A", 122 | "L" 123 | ] 124 | ], 125 | [ 126 | [ 127 | "The genus to which the dog and wolf belong", 128 | "An Altaic language", 129 | "Ramie, an Asiatic plant yielding a rope fiber", 130 | "Face down; susceptible", 131 | "An Indian antelope", 132 | "A contingent", 133 | "The Brazilian macaw; an Australian bird", 134 | "A Greek province; a law; a coin", 135 | "An ancient British ethnic group", 136 | "Gloss; luster" 137 | ], 138 | [ 139 | "C", 140 | "A", 141 | "N", 142 | "I", 143 | "S", 144 | "O", 145 | "R", 146 | "O", 147 | "C", 148 | "H", 149 | "R", 150 | "A", 151 | "M", 152 | "E", 153 | "E", 154 | "P", 155 | "R", 156 | "O", 157 | "N", 158 | "E", 159 | "S", 160 | "A", 161 | "S", 162 | "I", 163 | "N" 164 | ] 165 | ], 166 | [ 167 | [ 168 | "To choose", 169 | "A spiral, usually three dimensional", 170 | "Notched or uneven, as if gnawed away (cognate with \"erode\")", 171 | "A drinker; a sedative", 172 | "A pound sterling", 173 | "A board game; a British river", 174 | "A wading bird", 175 | "To run away together to get married", 176 | "Cider; sicer", 177 | "To put into action" 178 | ], 179 | [ 180 | "C", 181 | "H", 182 | "E", 183 | "S", 184 | "E", 
185 | "H", 186 | "E", 187 | "L", 188 | "I", 189 | "X", 190 | "E", 191 | "R", 192 | "O", 193 | "S", 194 | "E", 195 | "S", 196 | "O", 197 | "P", 198 | "E", 199 | "R", 200 | "S", 201 | "N", 202 | "E", 203 | "R", 204 | "T" 205 | ] 206 | ], 207 | [ 208 | [ 209 | "To choose", 210 | "A coin", 211 | "A European owl", 212 | "To slide; to skulk", 213 | "One who uses a hose; a good guy", 214 | "To subdue; to crumple; a hidden romantic attraction", 215 | "A greeting", 216 | "The Islamic name for Satan", 217 | "To sift", 218 | "One who eats" 219 | ], 220 | [ 221 | "C", 222 | "H", 223 | "E", 224 | "S", 225 | "E", 226 | "R", 227 | "E", 228 | "B", 229 | "I", 230 | "A", 231 | "U", 232 | "L", 233 | "L", 234 | "E", 235 | "T", 236 | "S", 237 | "L", 238 | "I", 239 | "V", 240 | "E", 241 | "H", 242 | "O", 243 | "S", 244 | "E", 245 | "R" 246 | ] 247 | ], 248 | [ 249 | [ 250 | "To hold onto", 251 | "An enlarged tool", 252 | "Shaded or dyed; a 3 person card game using 40 cards", 253 | "To subdue; to terrify; a Central American timber tree", 254 | "A limestone region marked by sinks, abrupt ridges, protuberant rocks", 255 | "A frog sound; to die", 256 | "A semitone", 257 | "To exclude", 258 | "An interjection expressing exasperation; nonsense", 259 | "To salute" 260 | ], 261 | [ 262 | "C", 263 | "L", 264 | "I", 265 | "N", 266 | "G", 267 | "R", 268 | "I", 269 | "M", 270 | "E", 271 | "R", 272 | "O", 273 | "M", 274 | "B", 275 | "R", 276 | "E", 277 | "A", 278 | "M", 279 | "A", 280 | "T", 281 | "E", 282 | "K", 283 | "A", 284 | "R", 285 | "S", 286 | "T" 287 | ] 288 | ], 289 | [ 290 | [ 291 | "Is able", 292 | "A bard", 293 | "A theme; an attempt", 294 | "To sift", 295 | "A leather dresser; someone who shoots a marble", 296 | "A top; a heraldic device", 297 | "True being", 298 | "To undo the sewing of", 299 | "To depart; permission", 300 | "Desiccator; more dry" 301 | ], 302 | [ 303 | "C", 304 | "O", 305 | "U", 306 | "L", 307 | "D", 308 | "R", 309 | "U", 310 | "N", 311 | "E", 312 | "R", 313 | "E", 
314 | "S", 315 | "S", 316 | "A", 317 | "Y", 318 | "S", 319 | "I", 320 | "E", 321 | "V", 322 | "E", 323 | "T", 324 | "A", 325 | "W", 326 | "E", 327 | "R" 328 | ] 329 | ], 330 | [ 331 | [ 332 | "A music style; a dance hall", 333 | "To remove the hat", 334 | "Unhidden; explicit", 335 | "The minister's house", 336 | "To demand; to arrange;", 337 | "An Italian domed cathedral", 338 | "A nickel steel alloy", 339 | "To disgrace; to blame; to injure", 340 | "Low-lying fertile land", 341 | "A sea mammal; a fishing device" 342 | ], 343 | [ 344 | "D", 345 | "I", 346 | "S", 347 | "C", 348 | "O", 349 | "U", 350 | "N", 351 | "H", 352 | "A", 353 | "T", 354 | "O", 355 | "V", 356 | "E", 357 | "R", 358 | "T", 359 | "M", 360 | "A", 361 | "N", 362 | "S", 363 | "E", 364 | "O", 365 | "R", 366 | "D", 367 | "E", 368 | "R" 369 | ] 370 | ], 371 | [ 372 | [ 373 | "To belch, to burp", 374 | "A location", 375 | "To warn", 376 | "A dipsomaniac", 377 | "To come in", 378 | "To elude", 379 | "To win again", 380 | "Inept; irrelevant", 381 | "To execrate", 382 | "Purport; a trend; a singing voice" 383 | ], 384 | [ 385 | "E", 386 | "R", 387 | "U", 388 | "C", 389 | "T", 390 | "V", 391 | "E", 392 | "N", 393 | "U", 394 | "E", 395 | "A", 396 | "W", 397 | "A", 398 | "R", 399 | "N", 400 | "D", 401 | "I", 402 | "P", 403 | "S", 404 | "O", 405 | "E", 406 | "N", 407 | "T", 408 | "E", 409 | "R" 410 | ] 411 | ], 412 | [ 413 | [ 414 | "A custard", 415 | "A lithograph", 416 | "Anatomically on the inside", 417 | "An agendum; something to be done", 418 | "To combine", 419 | "A sharp surgical lancet used for opening veins to cause bloodletting; a bevel angle", 420 | "To strike or flog", 421 | "Corrupt matter from a sore; a tongue coating", 422 | "A bang; a leather thong; to shoot; a shive; the penis", 423 | "Would not" 424 | ], 425 | [ 426 | "F", 427 | "L", 428 | "A", 429 | "W", 430 | "N", 431 | "L", 432 | "I", 433 | "T", 434 | "H", 435 | "O", 436 | "E", 437 | "N", 438 | "T", 439 | "A", 440 | "L", 441 | "A", 442 | 
"G", 443 | "E", 444 | "N", 445 | "D", 446 | "M", 447 | "E", 448 | "R", 449 | "G", 450 | "E" 451 | ] 452 | ], 453 | [ 454 | [ 455 | "A crowd; a mob", 456 | "To get a fresh crew", 457 | "An American Indian tribe", 458 | "A tribal emblem", 459 | "A room for storing ewers", 460 | "To chafe or rub", 461 | "To sow again", 462 | "To subdue; to terrify; a Central American timber tree", 463 | "An electronic message device", 464 | "A foe" 465 | ], 466 | [ 467 | "F", 468 | "R", 469 | "A", 470 | "P", 471 | "E", 472 | "R", 473 | "E", 474 | "M", 475 | "A", 476 | "N", 477 | "O", 478 | "S", 479 | "A", 480 | "G", 481 | "E", 482 | "T", 483 | "O", 484 | "T", 485 | "E", 486 | "M", 487 | "E", 488 | "W", 489 | "E", 490 | "R", 491 | "Y" 492 | ] 493 | ], 494 | [ 495 | [ 496 | "A woodland", 497 | "An alarm", 498 | "A tropical lizard", 499 | "To increase; to lift up; to rear", 500 | "A duck, or its down; a quilt or comforter", 501 | "To stare at angrily; a harsh light", 502 | "A veranda", 503 | "A plant allied to the arum", 504 | "An edible seaweed (if you say so)", 505 | "An Arab prince" 506 | ], 507 | [ 508 | "G", 509 | "L", 510 | "A", 511 | "D", 512 | "E", 513 | "L", 514 | "A", 515 | "R", 516 | "U", 517 | "M", 518 | "A", 519 | "N", 520 | "O", 521 | "L", 522 | "E", 523 | "R", 524 | "A", 525 | "I", 526 | "S", 527 | "E", 528 | "E", 529 | "I", 530 | "D", 531 | "E", 532 | "R" 533 | ] 534 | ], 535 | [ 536 | [ 537 | "A woodland", 538 | "A flight feather", 539 | "The soft palate", 540 | "Mesial; median", 541 | "A river boat", 542 | "A complainer", 543 | "Flat; a stage; to be honest with", 544 | "To entertain", 545 | "To dally; to put off", 546 | "To praise; to raise higher" 547 | ], 548 | [ 549 | "G", 550 | "L", 551 | "A", 552 | "D", 553 | "E", 554 | "R", 555 | "E", 556 | "M", 557 | "E", 558 | "X", 559 | "U", 560 | "V", 561 | "U", 562 | "L", 563 | "A", 564 | "M", 565 | "E", 566 | "S", 567 | "A", 568 | "L", 569 | "P", 570 | "L", 571 | "E", 572 | "Y", 573 | "T" 574 | ] 575 | ], 576 | [ 577 | [ 578 
| "A language", 579 | "To impose a fine upon", 580 | "To lower; to reduce", 581 | "To clean; to wash quickly with water", 582 | "Atap, the nipa palm", 583 | "A tropical American tree", 584 | "Not bitten", 585 | "A heraldic mastiff", 586 | "A dog whelk or its shell", 587 | "A computer user" 588 | ], 589 | [ 590 | "G", 591 | "U", 592 | "A", 593 | "N", 594 | "G", 595 | "U", 596 | "N", 597 | "L", 598 | "A", 599 | "W", 600 | "A", 601 | "B", 602 | "A", 603 | "S", 604 | "E", 605 | "R", 606 | "I", 607 | "N", 608 | "S", 609 | "E", 610 | "A", 611 | "T", 612 | "T", 613 | "A", 614 | "P" 615 | ] 616 | ], 617 | [ 618 | [ 619 | "The Kaiser", 620 | "An introduction", 621 | "Public disgrace", 622 | "The belly button, which Adam and Eve might not have had", 623 | "A buzzard; a kite; a red hot coal used to light a fire", 624 | "A Tibetan wild horse", 625 | "Anatomically on the inside", 626 | "To stew; to stuff up; floating dust", 627 | "Equipped; having arms", 628 | "To rile; to salt fish" 629 | ], 630 | [ 631 | "K", 632 | "E", 633 | "S", 634 | "A", 635 | "R", 636 | "I", 637 | "N", 638 | "T", 639 | "R", 640 | "O", 641 | "A", 642 | "T", 643 | "I", 644 | "M", 645 | "Y", 646 | "N", 647 | "A", 648 | "V", 649 | "E", 650 | "L", 651 | "G", 652 | "L", 653 | "E", 654 | "D", 655 | "E" 656 | ] 657 | ], 658 | [ 659 | [ 660 | "A biblical place", 661 | "Relating to sheep", 662 | "A woodland", 663 | "Anatomically on the inside", 664 | "A realm", 665 | "A person or instrument that cuts down", 666 | "To happen", 667 | "A lariat; a lasso", 668 | "An Irish doctor", 669 | "To meddle; to mix" 670 | ], 671 | [ 672 | "M", 673 | "E", 674 | "R", 675 | "O", 676 | "M", 677 | "O", 678 | "V", 679 | "I", 680 | "L", 681 | "E", 682 | "W", 683 | "E", 684 | "A", 685 | "L", 686 | "D", 687 | "E", 688 | "N", 689 | "T", 690 | "A", 691 | "L", 692 | "R", 693 | "E", 694 | "A", 695 | "M", 696 | "E" 697 | ] 698 | ], 699 | [ 700 | [ 701 | "A deity associated with a natural object", 702 | "A habit or custom", 703 | "To munch", 
704 | "Inquires into; forces open using leverage", 705 | "Extra; to withhold punishment; thin", 706 | "A fool; a stupid person", 707 | "To displace illegally or unfairly", 708 | "A craze", 709 | "An inciter; various moths whose larva feed on tree leaves; a lobster bearing an egg mass; one who collects eggs", 710 | "To sneeze" 711 | ], 712 | [ 713 | "N", 714 | "U", 715 | "M", 716 | "E", 717 | "N", 718 | "U", 719 | "S", 720 | "A", 721 | "G", 722 | "E", 723 | "M", 724 | "U", 725 | "N", 726 | "G", 727 | "E", 728 | "P", 729 | "R", 730 | "I", 731 | "E", 732 | "S", 733 | "S", 734 | "P", 735 | "A", 736 | "R", 737 | "E" 738 | ] 739 | ], 740 | [ 741 | [ 742 | "A tit-lark", 743 | "A Turkish written decree", 744 | "An ancient name for Korea", 745 | "More healthy; a coin", 746 | "Operating on a single object", 747 | "The Canada lynx", 748 | "Relating to the iris", 749 | "An ancient Roman full outer robe or wrap worn outdoors by women", 750 | "One who does nothing", 751 | "Weepy; tear-filled" 752 | ], 753 | [ 754 | "P", 755 | "I", 756 | "P", 757 | "I", 758 | "T", 759 | "I", 760 | "R", 761 | "A", 762 | "D", 763 | "E", 764 | "S", 765 | "I", 766 | "L", 767 | "L", 768 | "A", 769 | "H", 770 | "A", 771 | "L", 772 | "E", 773 | "R", 774 | "U", 775 | "N", 776 | "A", 777 | "R", 778 | "Y" 779 | ] 780 | ], 781 | [ 782 | [ 783 | "To stamp; to brand; to impress; to put into type", 784 | "A scarf; a cymar; a loose dress", 785 | "To cut", 786 | "To perceive; wisdom; reason; feeling", 787 | "The ridges on a tire; to walk heavily", 788 | "A signaling sound", 789 | "A rice processor; an implement for ricing potatoes", 790 | "A chemical compound", 791 | "A dog whelk or its shell", 792 | "Chased up a tree" 793 | ], 794 | [ 795 | "P", 796 | "R", 797 | "I", 798 | "N", 799 | "T", 800 | "S", 801 | "I", 802 | "M", 803 | "A", 804 | "R", 805 | "S", 806 | "C", 807 | "I", 808 | "S", 809 | "E", 810 | "S", 811 | "E", 812 | "N", 813 | "S", 814 | "E", 815 | "T", 816 | "R", 817 | "E", 818 | "A", 819 | "D" 820 | ] 
821 | ] 822 | ] -------------------------------------------------------------------------------- /src/tot/data/text/data_100_random_text.txt: -------------------------------------------------------------------------------- 1 | It isn't difficult to do a handstand if you just stand on your hands. It caught him off guard that space smelled of seared steak. When she didn’t like a guy who was trying to pick her up, she started using sign language. Each person who knows you has a different perception of who you are. 2 | The hawk didn’t understand why the ground squirrels didn’t want to be his friend. If I don’t like something, I’ll stay away from it. People keep telling me "orange" but I still prefer "pink". He dreamed of leaving his law firm to open a portable dog wash. 3 | My biggest joy is roasting almonds while stalking prey. You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. The ants enjoyed the barbecue more than the family. The hawk didn’t understand why the ground squirrels didn’t want to be his friend. 4 | He had unknowingly taken up sleepwalking as a nighttime hobby. The overpass went under the highway and into a secret world. He found his art never progressed when he literally used his sweat and tears. It was always dangerous to drive with him since he insisted the safety cones were a slalom course. 5 | Joe discovered that traffic cones make excellent megaphones. You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. I was starting to worry that my pet turtle could tell what I was thinking. He's in a boy band which doesn't make much sense for a snake. 6 | He was surprised that his immense laziness was inspirational to others. 
Instead of a bachelorette party You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. If I don’t like something, I’ll stay away from it. 7 | For some unfathomable reason, the response team didn't consider a lack of milk for my cereal as a proper emergency. You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. He poured rocks in the dungeon of his mind. I’m a living furnace. 8 | You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. Today arrived with a crash of my car through the garage door. I had a friend in high school named Rick Shaw, but he was fairly useless as a mode of transport. It was always dangerous to drive with him since he insisted the safety cones were a slalom course. 9 | He decided to fake his disappearance to avoid jail. He was all business when he wore his clown suit. We have a lot of rain in June. The snow-covered path was no help in finding his way out of the back-country. 10 | The fence was confused about whether it was supposed to keep things in or keep things out. He quietly entered the museum as the super bowl started. When confronted with a rotary dial phone the teenager was perplexed. She discovered van life is difficult with 2 cats and a dog. 11 | He dreamed of eating green apples with worms. Homesickness became contagious in the young campers' cabin. She couldn't understand why nobody else could see that the sky is full of cotton candy. There was no ice cream in the freezer, nor did they have money to go to the store. 12 | A glittering gem is not enough. It's much more difficult to play tennis with a bowling ball than it is to bowl with a tennis ball. When confronted with a rotary dial phone the teenager was perplexed. 
There should have been a time and a place, but this wasn't it. 13 | The blue parrot drove by the hitchhiking mongoose. The ants enjoyed the barbecue more than the family. The Great Dane looked more like a horse than a dog. Various sea birds are elegant, but nothing is as elegant as a gliding pelican. 14 | The murder hornet was disappointed by the preconceived ideas people had of him. She wondered what his eyes were saying beneath his mirrored sunglasses. The fox in the tophat whispered into the ear of the rabbit. He's in a boy band which doesn't make much sense for a snake. 15 | There's probably enough glass in my cupboard to build an undersea aquarium. He was disappointed when he found the beach to be so sandy and the sun so sunny. She looked into the mirror and saw another person. The sudden rainstorm washed crocodiles into the ocean. 16 | It caught him off guard that space smelled of seared steak. The busker hoped that the people passing by would throw money, but they threw tomatoes instead, so he exchanged his hat for a juicer. Honestly, I didn't care much for the first season, so I didn't bother with the second. Today arrived with a crash of my car through the garage door. 17 | There was no ice cream in the freezer, nor did they have money to go to the store. The waves were crashing on the shore; it was a lovely sight. He knew it was going to be a bad day when he saw mountain lions roaming the streets. It's much more difficult to play tennis with a bowling ball than it is to bowl with a tennis ball. 18 | Joe discovered that traffic cones make excellent megaphones. I’m a living furnace. The near-death experience brought new ideas to light. I was starting to worry that my pet turtle could tell what I was thinking. 19 | Written warnings in instruction manuals are worthless since rabbits can't read. You're unsure whether or not to trust him, but very thankful that you wore a turtle neck. 
You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. Strawberries must be the one food that doesn't go well with this brand of paint. 20 | Strawberries must be the one food that doesn't go well with this brand of paint. Joe discovered that traffic cones make excellent megaphones. There's a reason that roses have thorns. She traveled because it cost the same as therapy and was a lot more enjoyable. 21 | Her hair was windswept as she rode in the black convertible. She traveled because it cost the same as therapy and was a lot more enjoyable. It's always a good idea to seek shelter from the evil gaze of the sun. He turned in the research paper on Friday; otherwise, he would have not passed the class. 22 | Today arrived with a crash of my car through the garage door. It's never comforting to know that your fate depends on something as unpredictable as the popping of corn. He was disappointed when he found the beach to be so sandy and the sun so sunny. Courage and stupidity were all he had. 23 | She had some amazing news to share but nobody to share it with. She couldn't understand why nobody else could see that the sky is full of cotton candy. Each person who knows you has a different perception of who you are. He decided that the time had come to be stronger than any of the excuses he'd used until then. 24 | The blue parrot drove by the hitchhiking mongoose. His get rich quick scheme was to grow a cactus farm. For some unfathomable reason, the response team didn't consider a lack of milk for my cereal as a proper emergency. He picked up trash in his spare time to dump in his neighbor's yard. 25 | Her hair was windswept as she rode in the black convertible. The ants enjoyed the barbecue more than the family. Homesickness became contagious in the young campers' cabin. 
The busker hoped that the people passing by would throw money, but they threw tomatoes instead, so he exchanged his hat for a juicer. 26 | I’m a living furnace. The near-death experience brought new ideas to light. He was surprised that his immense laziness was inspirational to others. There was no ice cream in the freezer, nor did they have money to go to the store. 27 | It isn't difficult to do a handstand if you just stand on your hands. I'd rather be a bird than a fish. Homesickness became contagious in the young campers' cabin. He picked up trash in his spare time to dump in his neighbor's yard. 28 | He poured rocks in the dungeon of his mind. It isn't difficult to do a handstand if you just stand on your hands. It's never comforting to know that your fate depends on something as unpredictable as the popping of corn. He's in a boy band which doesn't make much sense for a snake. 29 | It was always dangerous to drive with him since he insisted the safety cones were a slalom course. The heat He picked up trash in his spare time to dump in his neighbor's yard. The anaconda was the greatest criminal mastermind in this part of the neighborhood. 30 | I’m a living furnace. The book is in front of the table. He walked into the basement with the horror movie from the night before playing in his head. He turned in the research paper on Friday; otherwise, he would have not passed the class. 31 | For some unfathomable reason, the response team didn't consider a lack of milk for my cereal as a proper emergency. He turned in the research paper on Friday; otherwise, he would have not passed the class. Her hair was windswept as she rode in the black convertible. Karen realized the only way she was getting into heaven was to cheat. 32 | It was always dangerous to drive with him since he insisted the safety cones were a slalom course. I covered my friend in baby oil. Today arrived with a crash of my car through the garage door. 
She couldn't understand why nobody else could see that the sky is full of cotton candy. 33 | The book is in front of the table. There should have been a time and a place, but this wasn't it. I'd rather be a bird than a fish. The blue parrot drove by the hitchhiking mongoose. 34 | Karen realized the only way she was getting into heaven was to cheat. Two seats were vacant. Just because the water is red doesn't mean you can't drink it. She wondered what his eyes were saying beneath his mirrored sunglasses. 35 | Strawberries must be the one food that doesn't go well with this brand of paint. It caught him off guard that space smelled of seared steak. The book is in front of the table. He was disappointed when he found the beach to be so sandy and the sun so sunny. 36 | The team members were hard to tell apart since they all wore their hair in a ponytail. He found his art never progressed when he literally used his sweat and tears. There was no ice cream in the freezer, nor did they have money to go to the store. You're unsure whether or not to trust him, but very thankful that you wore a turtle neck. 37 | The team members were hard to tell apart since they all wore their hair in a ponytail. It was the scarcity that fueled his creativity. He turned in the research paper on Friday; otherwise, he would have not passed the class. The busker hoped that the people passing by would throw money, but they threw tomatoes instead, so he exchanged his hat for a juicer. 38 | Each person who knows you has a different perception of who you are. The team members were hard to tell apart since they all wore their hair in a ponytail. Just because the water is red doesn't mean you can't drink it. We have a lot of rain in June. 39 | He found his art never progressed when he literally used his sweat and tears. Karen realized the only way she was getting into heaven was to cheat. The green tea and avocado smoothie turned out exactly as would be expected. 
It caught him off guard that space smelled of seared steak. 40 | She looked into the mirror and saw another person. The team members were hard to tell apart since they all wore their hair in a ponytail. There should have been a time and a place, but this wasn't it. Just because the water is red doesn't mean you can't drink it. 41 | She finally understood that grief was her love with no place for it to go. The ants enjoyed the barbecue more than the family. The snow-covered path was no help in finding his way out of the back-country. It's never comforting to know that your fate depends on something as unpredictable as the popping of corn. 42 | She traveled because it cost the same as therapy and was a lot more enjoyable. He decided to fake his disappearance to avoid jail. The green tea and avocado smoothie turned out exactly as would be expected. He knew it was going to be a bad day when he saw mountain lions roaming the streets. 43 | The sudden rainstorm washed crocodiles into the ocean. She wondered what his eyes were saying beneath his mirrored sunglasses. If eating three-egg omelets causes weight-gain, budgie eggs are a good substitute. He knew it was going to be a bad day when he saw mountain lions roaming the streets. 44 | The blue parrot drove by the hitchhiking mongoose. He dreamed of eating green apples with worms. He was all business when he wore his clown suit. The snow-covered path was no help in finding his way out of the back-country. 45 | Just go ahead and press that button. Karen realized the only way she was getting into heaven was to cheat. My biggest joy is roasting almonds while stalking prey. The waves were crashing on the shore; it was a lovely sight. 46 | Gwen had her best sleep ever on her new bed of nails. He learned the important lesson that a picnic at the beach on a windy day is a bad idea. It caught him off guard that space smelled of seared steak. My biggest joy is roasting almonds while stalking prey. 
47 | Joe discovered that traffic cones make excellent megaphones. Written warnings in instruction manuals are worthless since rabbits can't read. If I don’t like something, I’ll stay away from it. He used to get confused between soldiers and shoulders, but as a military man, he now soldiers responsibility. 48 | He learned the important lesson that a picnic at the beach on a windy day is a bad idea. The Great Dane looked more like a horse than a dog. Written warnings in instruction manuals are worthless since rabbits can't read. He decided that the time had come to be stronger than any of the excuses he'd used until then. 49 | My biggest joy is roasting almonds while stalking prey. When confronted with a rotary dial phone the teenager was perplexed. He had unknowingly taken up sleepwalking as a nighttime hobby. The near-death experience brought new ideas to light. 50 | My secretary is the only person who truly understands my stamp-collecting obsession. Instead of a bachelorette party Just go ahead and press that button. The ants enjoyed the barbecue more than the family. 51 | It caught him off guard that space smelled of seared steak. The Great Dane looked more like a horse than a dog. He was disappointed when he found the beach to be so sandy and the sun so sunny. There should have been a time and a place, but this wasn't it. 52 | He turned in the research paper on Friday; otherwise, he would have not passed the class. Tomatoes make great weapons when water balloons aren’t available. He picked up trash in his spare time to dump in his neighbor's yard. It caught him off guard that space smelled of seared steak. 53 | He had unknowingly taken up sleepwalking as a nighttime hobby. He dreamed of leaving his law firm to open a portable dog wash. When confronted with a rotary dial phone the teenager was perplexed. There's probably enough glass in my cupboard to build an undersea aquarium. 54 | He's in a boy band which doesn't make much sense for a snake. 
I was starting to worry that my pet turtle could tell what I was thinking. You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. He picked up trash in his spare time to dump in his neighbor's yard. 55 | A glittering gem is not enough. The green tea and avocado smoothie turned out exactly as would be expected. The near-death experience brought new ideas to light. Today arrived with a crash of my car through the garage door. 56 | Her hair was windswept as she rode in the black convertible. His get rich quick scheme was to grow a cactus farm. He quietly entered the museum as the super bowl started. He was disappointed when he found the beach to be so sandy and the sun so sunny. 57 | He's in a boy band which doesn't make much sense for a snake. He was all business when he wore his clown suit. The hawk didn’t understand why the ground squirrels didn’t want to be his friend. When confronted with a rotary dial phone the teenager was perplexed. 58 | It was the scarcity that fueled his creativity. Strawberries must be the one food that doesn't go well with this brand of paint. He was all business when he wore his clown suit. The overpass went under the highway and into a secret world. 59 | Various sea birds are elegant, but nothing is as elegant as a gliding pelican. Courage and stupidity were all he had. There's a reason that roses have thorns. He was surprised that his immense laziness was inspirational to others. 60 | Instead of a bachelorette party The hawk didn’t understand why the ground squirrels didn’t want to be his friend. My secretary is the only person who truly understands my stamp-collecting obsession. It was the scarcity that fueled his creativity. 61 | For the 216th time, he said he would quit drinking soda after this last Coke. Today arrived with a crash of my car through the garage door. It was the scarcity that fueled his creativity. 
When she didn’t like a guy who was trying to pick her up, she started using sign language. 62 | If eating three-egg omelets causes weight-gain, budgie eggs are a good substitute. Just go ahead and press that button. Written warnings in instruction manuals are worthless since rabbits can't read. I covered my friend in baby oil. 63 | A glittering gem is not enough. Gwen had her best sleep ever on her new bed of nails. The near-death experience brought new ideas to light. She finally understood that grief was her love with no place for it to go. 64 | He had unknowingly taken up sleepwalking as a nighttime hobby. The anaconda was the greatest criminal mastermind in this part of the neighborhood. The sudden rainstorm washed crocodiles into the ocean. For some unfathomable reason, the response team didn't consider a lack of milk for my cereal as a proper emergency. 65 | He walked into the basement with the horror movie from the night before playing in his head. There should have been a time and a place, but this wasn't it. It caught him off guard that space smelled of seared steak. He poured rocks in the dungeon of his mind. 66 | You're unsure whether or not to trust him, but very thankful that you wore a turtle neck. The book is in front of the table. It caught him off guard that space smelled of seared steak. It isn't difficult to do a handstand if you just stand on your hands. 67 | She couldn't understand why nobody else could see that the sky is full of cotton candy. The sudden rainstorm washed crocodiles into the ocean. Various sea birds are elegant, but nothing is as elegant as a gliding pelican. Homesickness became contagious in the young campers' cabin. 68 | The sudden rainstorm washed crocodiles into the ocean. Strawberries must be the one food that doesn't go well with this brand of paint. The ants enjoyed the barbecue more than the family. Gwen had her best sleep ever on her new bed of nails. 69 | The Great Dane looked more like a horse than a dog. 
The anaconda was the greatest criminal mastermind in this part of the neighborhood. Courage and stupidity were all he had. For the 216th time, he said he would quit drinking soda after this last Coke. 70 | Honestly, I didn't care much for the first season, so I didn't bother with the second. My biggest joy is roasting almonds while stalking prey. It caught him off guard that space smelled of seared steak. The team members were hard to tell apart since they all wore their hair in a ponytail. 71 | He dreamed of leaving his law firm to open a portable dog wash. I’m a living furnace. He dreamed of eating green apples with worms. It's never comforting to know that your fate depends on something as unpredictable as the popping of corn. 72 | It was always dangerous to drive with him since he insisted the safety cones were a slalom course. Gwen had her best sleep ever on her new bed of nails. He poured rocks in the dungeon of his mind. It was the scarcity that fueled his creativity. 73 | He poured rocks in the dungeon of his mind. Tomatoes make great weapons when water balloons aren’t available. He learned the important lesson that a picnic at the beach on a windy day is a bad idea. The team members were hard to tell apart since they all wore their hair in a ponytail. 74 | My secretary is the only person who truly understands my stamp-collecting obsession. She discovered van life is difficult with 2 cats and a dog. It isn't difficult to do a handstand if you just stand on your hands. The snow-covered path was no help in finding his way out of the back-country. 75 | His thought process was on so many levels that he gave himself a phobia of heights. When confronted with a rotary dial phone the teenager was perplexed. The fence was confused about whether it was supposed to keep things in or keep things out. There can never be too many cherries on an ice cream sundae. 76 | He was disappointed when he found the beach to be so sandy and the sun so sunny. 
Just go ahead and press that button. It caught him off guard that space smelled of seared steak. Various sea birds are elegant, but nothing is as elegant as a gliding pelican. 77 | His thought process was on so many levels that he gave himself a phobia of heights. I had a friend in high school named Rick Shaw, but he was fairly useless as a mode of transport. He decided that the time had come to be stronger than any of the excuses he'd used until then. The fence was confused about whether it was supposed to keep things in or keep things out. 78 | He was disappointed when he found the beach to be so sandy and the sun so sunny. He decided to fake his disappearance to avoid jail. Courage and stupidity were all he had. Each person who knows you has a different perception of who you are. 79 | Strawberries must be the one food that doesn't go well with this brand of paint. She couldn't understand why nobody else could see that the sky is full of cotton candy. The overpass went under the highway and into a secret world. It was always dangerous to drive with him since he insisted the safety cones were a slalom course. 80 | She wondered what his eyes were saying beneath his mirrored sunglasses. You're unsure whether or not to trust him, but very thankful that you wore a turtle neck. Two seats were vacant. Tomatoes make great weapons when water balloons aren’t available. 81 | The near-death experience brought new ideas to light. His thought process was on so many levels that he gave himself a phobia of heights. I'd rather be a bird than a fish. Her hair was windswept as she rode in the black convertible. 82 | The ants enjoyed the barbecue more than the family. Written warnings in instruction manuals are worthless since rabbits can't read. Instead of a bachelorette party There was no ice cream in the freezer, nor did they have money to go to the store. 83 | He found his art never progressed when he literally used his sweat and tears. 
She finally understood that grief was her love with no place for it to go. He was surprised that his immense laziness was inspirational to others. Written warnings in instruction manuals are worthless since rabbits can't read. 84 | The blue parrot drove by the hitchhiking mongoose. Joe discovered that traffic cones make excellent megaphones. Tomatoes make great weapons when water balloons aren’t available. When confronted with a rotary dial phone the teenager was perplexed. 85 | He was disappointed when he found the beach to be so sandy and the sun so sunny. Two seats were vacant. Homesickness became contagious in the young campers' cabin. The overpass went under the highway and into a secret world. 86 | She had some amazing news to share but nobody to share it with. He picked up trash in his spare time to dump in his neighbor's yard. There can never be too many cherries on an ice cream sundae. The team members were hard to tell apart since they all wore their hair in a ponytail. 87 | When she didn’t like a guy who was trying to pick her up, she started using sign language. He turned in the research paper on Friday; otherwise, he would have not passed the class. If I don’t like something, I’ll stay away from it. Various sea birds are elegant, but nothing is as elegant as a gliding pelican. 88 | I covered my friend in baby oil. Written warnings in instruction manuals are worthless since rabbits can't read. There was coal in his stocking and he was thrilled. He had unknowingly taken up sleepwalking as a nighttime hobby. 89 | I was starting to worry that my pet turtle could tell what I was thinking. He learned the important lesson that a picnic at the beach on a windy day is a bad idea. The small white buoys marked the location of hundreds of crab pots. He was all business when he wore his clown suit. 90 | Just because the water is red doesn't mean you can't drink it. The book is in front of the table. The near-death experience brought new ideas to light. 
He was disappointed when he found the beach to be so sandy and the sun so sunny. 91 | If eating three-egg omelets causes weight-gain, budgie eggs are a good substitute. She wondered what his eyes were saying beneath his mirrored sunglasses. She looked into the mirror and saw another person. There was no ice cream in the freezer, nor did they have money to go to the store. 92 | The hawk didn’t understand why the ground squirrels didn’t want to be his friend. He turned in the research paper on Friday; otherwise, he would have not passed the class. The blue parrot drove by the hitchhiking mongoose. My biggest joy is roasting almonds while stalking prey. 93 | Joe discovered that traffic cones make excellent megaphones. Honestly, I didn't care much for the first season, so I didn't bother with the second. You realize you're not alone as you sit in your bedroom massaging your calves after a long day of playing tug-of-war with Grandpa Joe in the hospital. I'd rather be a bird than a fish. 94 | A glittering gem is not enough. Honestly, I didn't care much for the first season, so I didn't bother with the second. He decided that the time had come to be stronger than any of the excuses he'd used until then. She couldn't understand why nobody else could see that the sky is full of cotton candy. 95 | Honestly, I didn't care much for the first season, so I didn't bother with the second. Her hair was windswept as she rode in the black convertible. She wondered what his eyes were saying beneath his mirrored sunglasses. If I don’t like something, I’ll stay away from it. 96 | It was always dangerous to drive with him since he insisted the safety cones were a slalom course. He is no James Bond; his name is Roger Moore. Courage and stupidity were all he had. He's in a boy band which doesn't make much sense for a snake. 97 | Today arrived with a crash of my car through the garage door. 
The busker hoped that the people passing by would throw money, but they threw tomatoes instead, so he exchanged his hat for a juicer. For some unfathomable reason, the response team didn't consider a lack of milk for my cereal as a proper emergency. We have a lot of rain in June. 98 | He turned in the research paper on Friday; otherwise, he would have not passed the class. It's never comforting to know that your fate depends on something as unpredictable as the popping of corn. The book is in front of the table. The waves were crashing on the shore; it was a lovely sight. 99 | I was starting to worry that my pet turtle could tell what I was thinking. Just because the water is red doesn't mean you can't drink it. It isn't difficult to do a handstand if you just stand on your hands. She traveled because it cost the same as therapy and was a lot more enjoyable. 100 | I’m a living furnace. There's a reason that roses have thorns. He is no James Bond; his name is Roger Moore. Her hair was windswept as she rode in the black convertible. 
-------------------------------------------------------------------------------- /src/tot/methods/bfs.py: -------------------------------------------------------------------------------- 1 | import itertools 2 | import numpy as np 3 | from functools import partial 4 | from tot.models import gpt 5 | 6 | def get_value(task, x, y, n_evaluate_sample, cache_value=True): 7 | value_prompt = task.value_prompt_wrap(x, y) 8 | if cache_value and value_prompt in task.value_cache: 9 | return task.value_cache[value_prompt] 10 | value_outputs = gpt(value_prompt, n=n_evaluate_sample, stop=None) 11 | value = task.value_outputs_unwrap(x, y, value_outputs) 12 | if cache_value: 13 | task.value_cache[value_prompt] = value 14 | return value 15 | 16 | def get_values(task, x, ys, n_evaluate_sample, cache_value=True): 17 | values = [] 18 | local_value_cache = {} 19 | for y in ys: # each partial output 20 | if y in local_value_cache: # avoid duplicate candidates 21 | value = 0 22 | else: 23 | value = get_value(task, x, y, n_evaluate_sample, cache_value=cache_value) 24 | local_value_cache[y] = value 25 | values.append(value) 26 | return values 27 | 28 | def get_votes(task, x, ys, n_evaluate_sample): 29 | vote_prompt = task.vote_prompt_wrap(x, ys) 30 | vote_outputs = gpt(vote_prompt, n=n_evaluate_sample, stop=None) 31 | values = task.vote_outputs_unwrap(vote_outputs, len(ys)) 32 | return values 33 | 34 | def get_proposals(task, x, y): 35 | propose_prompt = task.propose_prompt_wrap(x, y) 36 | proposals = gpt(propose_prompt, n=1, stop=None)[0].split('\n') 37 | return [y + _ + '\n' for _ in proposals] 38 | 39 | def get_samples(task, x, y, n_generate_sample, prompt_sample, stop): 40 | if prompt_sample == 'standard': 41 | prompt = task.standard_prompt_wrap(x, y) 42 | elif prompt_sample == 'cot': 43 | prompt = task.cot_prompt_wrap(x, y) 44 | else: 45 | raise ValueError(f'prompt_sample {prompt_sample} not recognized') 46 | samples = gpt(prompt, n=n_generate_sample, stop=stop) 47 | return [y + _ for 
_ in samples] 48 | 49 | def solve(args, task, idx, to_print=True): 50 | global gpt 51 | gpt = partial(gpt, model=args.backend, temperature=args.temperature) 52 | print(gpt) 53 | x = task.get_input(idx) # input 54 | ys = [''] # current output candidates 55 | infos = [] 56 | for step in range(task.steps): 57 | # generation 58 | if args.method_generate == 'sample': 59 | new_ys = [get_samples(task, x, y, args.n_generate_sample, prompt_sample=args.prompt_sample, stop=task.stops[step]) for y in ys] 60 | elif args.method_generate == 'propose': 61 | new_ys = [get_proposals(task, x, y) for y in ys] 62 | new_ys = list(itertools.chain(*new_ys)) 63 | ids = list(range(len(new_ys))) 64 | # evaluation 65 | if args.method_evaluate == 'vote': 66 | values = get_votes(task, x, new_ys, args.n_evaluate_sample) 67 | elif args.method_evaluate == 'value': 68 | values = get_values(task, x, new_ys, args.n_evaluate_sample) 69 | 70 | # selection 71 | if args.method_select == 'sample': 72 | ps = np.array(values) / sum(values) 73 | select_ids = np.random.choice(ids, size=args.n_select_sample, p=ps).tolist() 74 | elif args.method_select == 'greedy': 75 | select_ids = sorted(ids, key=lambda x: values[x], reverse=True)[:args.n_select_sample] 76 | select_new_ys = [new_ys[select_id] for select_id in select_ids] 77 | 78 | # log 79 | if to_print: 80 | sorted_new_ys, sorted_values = zip(*sorted(zip(new_ys, values), key=lambda x: x[1], reverse=True)) 81 | print(f'-- new_ys --: {sorted_new_ys}\n-- sol values --: {sorted_values}\n-- choices --: {select_new_ys}\n') 82 | 83 | infos.append({'step': step, 'x': x, 'ys': ys, 'new_ys': new_ys, 'values': values, 'select_new_ys': select_new_ys}) 84 | ys = select_new_ys 85 | 86 | if to_print: 87 | print(ys) 88 | return ys, {'steps': infos} 89 | 90 | def naive_solve(args, task, idx, to_print=True): 91 | global gpt 92 | gpt = partial(gpt, model=args.backend, temperature=args.temperature) 93 | print(gpt) 94 | x = task.get_input(idx) # input 95 | ys = get_samples(task, 
x, '', args.n_generate_sample, args.prompt_sample, stop=None) 96 | return ys, {} -------------------------------------------------------------------------------- /src/tot/models.py: -------------------------------------------------------------------------------- 1 | import os 2 | import openai 3 | import backoff 4 | 5 | completion_tokens = prompt_tokens = 0 6 | 7 | api_key = os.getenv("OPENAI_API_KEY", "") 8 | if api_key != "": 9 | openai.api_key = api_key 10 | else: 11 | print("Warning: OPENAI_API_KEY is not set") 12 | 13 | api_base = os.getenv("OPENAI_API_BASE", "") 14 | if api_base != "": 15 | print("Warning: OPENAI_API_BASE is set to {}".format(api_base)) 16 | openai.api_base = api_base 17 | 18 | @backoff.on_exception(backoff.expo, openai.error.OpenAIError) 19 | def completions_with_backoff(**kwargs): 20 | return openai.ChatCompletion.create(**kwargs) 21 | 22 | def gpt(prompt, model="gpt-4", temperature=0.7, max_tokens=1000, n=1, stop=None) -> list: 23 | messages = [{"role": "user", "content": prompt}] 24 | return chatgpt(messages, model=model, temperature=temperature, max_tokens=max_tokens, n=n, stop=stop) 25 | 26 | def chatgpt(messages, model="gpt-4", temperature=0.7, max_tokens=1000, n=1, stop=None) -> list: 27 | global completion_tokens, prompt_tokens 28 | outputs = [] 29 | while n > 0: 30 | cnt = min(n, 20) 31 | n -= cnt 32 | res = completions_with_backoff(model=model, messages=messages, temperature=temperature, max_tokens=max_tokens, n=cnt, stop=stop) 33 | outputs.extend([choice.message.content for choice in res.choices]) 34 | # log completion tokens 35 | completion_tokens += res.usage.completion_tokens 36 | prompt_tokens += res.usage.prompt_tokens 37 | return outputs 38 | 39 | def gpt_usage(backend="gpt-4"): 40 | global completion_tokens, prompt_tokens 41 | if backend == "gpt-4": 42 | cost = completion_tokens / 1000 * 0.06 + prompt_tokens / 1000 * 0.03 43 | elif backend == "gpt-3.5-turbo": 44 | cost = completion_tokens / 1000 * 0.002 + prompt_tokens / 
1000 * 0.0015 45 | elif backend == "gpt-4o": 46 | cost = completion_tokens / 1000 * 0.01 + prompt_tokens / 1000 * 0.00250 47 | return {"completion_tokens": completion_tokens, "prompt_tokens": prompt_tokens, "cost": cost} 48 | -------------------------------------------------------------------------------- /src/tot/prompts/crosswords.py: -------------------------------------------------------------------------------- 1 | # 5-shot 2 | standard_prompt = ''' 3 | Solve 5x5 mini crosswords. Given an input of 5 horizontal clues and 5 vertical clues, generate an output of 5 rows, where each row is 5 letters separated by spaces. 4 | 5 | Input: 6 | h1. A lunar valley 7 | h2. A fatty oil 8 | h3. To entice 9 | h4. To lower; to reduce 10 | h5. A solitary person 11 | v1. According to the roster 12 | v2. Another name for Port-Francqui 13 | v3. An illicit lover; a European lake 14 | v4. To lisp 15 | v5. To come in 16 | 17 | Output: 18 | R I L L E 19 | O L E I N 20 | T E M P T 21 | A B A S E 22 | L O N E R 23 | 24 | Input: 25 | h1. One who saws 26 | h2. A fungus genus 27 | h3. An assessor 28 | h4. Pasture land 29 | h5. Receiving by the ear 30 | v1. To swell; to increase 31 | v2. The Brazilian macaw; an Australian bird 32 | v3. A Timorese island 33 | v4. Excessive fluid accumulation 34 | v5. Dewy; roscid 35 | 36 | Output: 37 | S A W E R 38 | U R E D O 39 | R A T E R 40 | G R A M A 41 | E A R A L 42 | 43 | Input: 44 | h1. Dandruff; scum; the bull-trout 45 | h2. One who greets; to vacillate; a British river 46 | h3. A Turkish written decree 47 | h4. Mignon; petty; little 48 | h5. A bishop's permission for a priest to leave a diocese 49 | v1. To steal; to brush across 50 | v2. A sedge (a primitive three-sided grass) 51 | v3. Grape jam 52 | v4. A flatworm larva 53 | v5. Ore refuse; to prepare material for glass by heat 54 | 55 | Output: 56 | S C U R F 57 | W A V E R 58 | I R A D E 59 | P E T I T 60 | E X E A T 61 | 62 | Input: 63 | h1. Presented; revealed 64 | h2. 
An interjection expressing sorrow 65 | h3. Benefit; result 66 | h4. A cigarette 67 | h5. Chased up a tree 68 | v1. Swarthy; tawny 69 | v2. An apiarist or bee keeper 70 | v3. To speak formally 71 | v4. To indite; to scribble 72 | v5. An insecticide 73 | 74 | Output: 75 | S H O W N 76 | W I R R A 77 | A V A I L 78 | R E T T E 79 | T R E E D 80 | 81 | Input: 82 | h1. Scald; an ancient Scandinavian bard 83 | h2. H2O; to irrigate 84 | h3. The companion to an "intro", a postscript or exit piece 85 | h4. An artificial fabric 86 | h5. Deep religious feeling 87 | v1. To rush; to stoop; a descent 88 | v2. A New Zealand fir tree 89 | v3. Mine refuse 90 | v4. The garden dormouse 91 | v5. Like a drone; humming 92 | 93 | Output: 94 | S K A L D 95 | W A T E R 96 | O U T R O 97 | O R L O N 98 | P I E T Y 99 | 100 | Input: 101 | {input} 102 | 103 | Output: 104 | ''' 105 | 106 | 107 | 108 | cot_prompt = '''Solve 5x5 mini crosswords. Given an input of 5 horizontal clues and 5 vertical clues, generate thoughts about which 5-letter word fits each clue, then an output of 5 rows, where each row is 5 letters separated by spaces. 109 | 110 | Input: 111 | h1. A lunar valley 112 | h2. A fatty oil 113 | h3. To entice 114 | h4. To lower; to reduce 115 | h5. A solitary person 116 | v1. According to the roster 117 | v2. Another name for Port-Francqui 118 | v3. An illicit lover; a European lake 119 | v4. To lisp 120 | v5. To come in 121 | 122 | Thoughts: 123 | h1. A lunar valley: RILLE 124 | h2. A fatty oil: OLEIN 125 | h3. To entice: TEMPT 126 | h4. To lower; to reduce: ABASE 127 | h5. A solitary person: LONER 128 | v1. According to the roster: ROTAL 129 | v2. Another name for Port-Francqui: ILEBO 130 | v3. An illicit lover; a European lake: LEMAN 131 | v4. To lisp: LIPSE 132 | v5. To come in: ENTER 133 | 134 | Output: 135 | R I L L E 136 | O L E I N 137 | T E M P T 138 | A B A S E 139 | L O N E R 140 | 141 | Input: 142 | h1. One who saws 143 | h2. A fungus genus 144 | h3. An assessor 145 | h4. 
Pasture land 146 | h5. Receiving by the ear 147 | v1. To swell; to increase 148 | v2. The Brazilian macaw; an Australian bird 149 | v3. A Timorese island 150 | v4. Excessive fluid accumulation 151 | v5. Dewy; roscid 152 | 153 | Thoughts: 154 | h1. One who saws: SAWER 155 | h2. A fungus genus: UREDO 156 | h3. An assessor: RATER 157 | h4. Pasture land: GRAMA 158 | h5. Receiving by the ear: EARAL 159 | v1. To swell; to increase: SURGE 160 | v2. The Brazilian macaw; an Australian bird: ARARA 161 | v3. A Timorese island: WETAR 162 | v4. Excessive fluid accumulation: EDEMA 163 | v5. Dewy; roscid: RORAL 164 | 165 | Output: 166 | S A W E R 167 | U R E D O 168 | R A T E R 169 | G R A M A 170 | E A R A L 171 | 172 | Input: 173 | h1. Dandruff; scum; the bull-trout 174 | h2. One who greets; to vacillate; a British river 175 | h3. A Turkish written decree 176 | h4. Mignon; petty; little 177 | h5. A bishop's permission for a priest to leave a diocese 178 | v1. To steal; to brush across 179 | v2. A sedge (a primitive three-sided grass) 180 | v3. Grape jam 181 | v4. A flatworm larva 182 | v5. Ore refuse; to prepare material for glass by heat 183 | 184 | Thoughts: 185 | h1. Dandruff; scum; the bull-trout: SCURF 186 | h2. One who greets; to vacillate; a British river: WAVER 187 | h3. A Turkish written decree: IRADE 188 | h4. Mignon; petty; little: PETIT 189 | h5. A bishop's permission for a priest to leave a diocese: EXEAT 190 | v1. To steal; to brush across: SWIPE 191 | v2. A sedge (a primitive three-sided grass): CAREX 192 | v3. Grape jam: UVATE 193 | v4. A flatworm larva: REDIA 194 | v5. Ore refuse; to prepare material for glass by heat: FRETT 195 | 196 | Output: 197 | S C U R F 198 | W A V E R 199 | I R A D E 200 | P E T I T 201 | E X E A T 202 | 203 | Input: 204 | h1. Presented; revealed 205 | h2. An interjection expressing sorrow 206 | h3. Benefit; result 207 | h4. A cigarette 208 | h5. Chased up a tree 209 | v1. Swarthy; tawny 210 | v2. An apiarist or bee keeper 211 | v3. 
To speak formally 212 | v4. To indite; to scribble 213 | v5. An insecticide 214 | 215 | Thoughts: 216 | h1. Presented; revealed: SHOWN 217 | h2. An interjection expressing sorrow: WIRRA 218 | h3. Benefit; result: AVAIL 219 | h4. A cigarette: RETTE 220 | h5. Chased up a tree: TREED 221 | v1. Swarthy; tawny: SWART 222 | v2. An apiarist or bee keeper: HIVER 223 | v3. To speak formally: ORATE 224 | v4. To indite; to scribble: WRITE 225 | v5. An insecticide: NALED 226 | 227 | Output: 228 | S H O W N 229 | W I R R A 230 | A V A I L 231 | R E T T E 232 | T R E E D 233 | 234 | Input: 235 | h1. Scald; an ancient Scandinavian bard 236 | h2. H2O; to irrigate 237 | h3. The companion to an "intro", a postscript or exit piece 238 | h4. An artificial fabric 239 | h5. Deep religious feeling 240 | v1. To rush; to stoop; a descent 241 | v2. A New Zealand fir tree 242 | v3. Mine refuse 243 | v4. The garden dormouse 244 | v5. Like a drone; humming 245 | 246 | Thoughts: 247 | h1. Scald; an ancient Scandinavian bard: SKALD 248 | h2. H2O; to irrigate: WATER 249 | h3. The companion to an "intro", a postscript or exit piece: OUTRO 250 | h4. An artificial fabric: ORLON 251 | h5. Deep religious feeling: PIETY 252 | v1. To rush; to stoop; a descent: SWOOP 253 | v2. A New Zealand fir tree: KAURI 254 | v3. Mine refuse: ATTLE 255 | v4. The garden dormouse: LEROT 256 | v5. Like a drone; humming: DRONY 257 | 258 | Output: 259 | S K A L D 260 | W A T E R 261 | O U T R O 262 | O R L O N 263 | P I E T Y 264 | 265 | Input: 266 | {input} 267 | ''' 268 | 269 | 270 | propose_prompt = '''Let's play a 5 x 5 mini crossword, where each word should have exactly 5 letters. 271 | 272 | {input} 273 | 274 | Given the current status, list all possible answers for unfilled or changed words, and your confidence levels (certain/high/medium/low), using the format "h1. apple (medium)". Use "certain" cautiously and only when you are 100% sure this is the correct word. 
You can list more than one possible answer for each word. 275 | ''' 276 | 277 | 278 | value_prompt = '''Evaluate if there exists a five letter word of some meaning that fits some letter constraints (sure/maybe/impossible). 279 | 280 | Incorrect; to injure: w _ o _ g 281 | The letter constraint is: 5 letters, letter 1 is w, letter 3 is o, letter 5 is g. 282 | Some possible words that mean "Incorrect; to injure": 283 | wrong (w r o n g): 5 letters, letter 1 is w, letter 3 is o, letter 5 is g. fit! 284 | sure 285 | 286 | A person with an all-consuming enthusiasm, such as for computers or anime: _ _ _ _ u 287 | The letter constraint is: 5 letters, letter 5 is u. 288 | Some possible words that mean "A person with an all-consuming enthusiasm, such as for computers or anime": 289 | geek (g e e k): 4 letters, not 5 290 | otaku (o t a k u): 5 letters, letter 5 is u 291 | sure 292 | 293 | Dewy; roscid: r _ _ _ l 294 | The letter constraint is: 5 letters, letter 1 is r, letter 5 is l. 295 | Some possible words that mean "Dewy; roscid": 296 | moist (m o i s t): 5 letters, letter 1 is m, not r 297 | humid (h u m i d): 5 letters, letter 1 is h, not r 298 | I cannot think of any words now. Only 2 letters are constrained, it is still likely 299 | maybe 300 | 301 | A woodland: _ l _ d e 302 | The letter constraint is: 5 letters, letter 2 is l, letter 4 is d, letter 5 is e. 303 | Some possible words that mean "A woodland": 304 | forest (f o r e s t): 6 letters, not 5 305 | woods (w o o d s): 5 letters, letter 2 is o, not l 306 | grove (g r o v e): 5 letters, letter 2 is r, not l 307 | I cannot think of any words now. 3 letters are constrained, and _ l _ d e seems a common pattern 308 | maybe 309 | 310 | An inn: _ d _ w f 311 | The letter constraint is: 5 letters, letter 2 is d, letter 4 is w, letter 5 is f. 
312 | Some possible words that mean "An inn": 313 | hotel (h o t e l): 5 letters, letter 2 is o, not d 314 | lodge (l o d g e): 5 letters, letter 2 is o, not d 315 | I cannot think of any words now. 3 letters are constrained, and it is extremely unlikely to have a word with pattern _ d _ w f to mean "An inn" 316 | impossible 317 | 318 | Chance; a parasitic worm; a fish: w r a k _ 319 | The letter constraint is: 5 letters, letter 1 is w, letter 2 is r, letter 3 is a, letter 4 is k. 320 | Some possible words that mean "Chance; a parasitic worm; a fish": 321 | fluke (f l u k e): 5 letters, letter 1 is f, not w 322 | I cannot think of any words now. 4 letters are constrained, and it is extremely unlikely to have a word with pattern w r a k _ to mean "Chance; a parasitic worm; a fish" 323 | impossible 324 | 325 | {input} 326 | ''' -------------------------------------------------------------------------------- /src/tot/prompts/game24.py: -------------------------------------------------------------------------------- 1 | # 5-shot 2 | standard_prompt = '''Use numbers and basic arithmetic operations (+ - * /) to obtain 24. 3 | Input: 4 4 6 8 4 | Answer: (4 + 8) * (6 - 4) = 24 5 | Input: 2 9 10 12 6 | Answer: 2 * 12 * (10 - 9) = 24 7 | Input: 4 9 10 13 8 | Answer: (13 - 9) * (10 - 4) = 24 9 | Input: 1 4 8 8 10 | Answer: (8 / 4 + 1) * 8 = 24 11 | Input: 5 5 5 9 12 | Answer: 5 + 5 + 5 + 9 = 24 13 | Input: {input} 14 | ''' 15 | 16 | # 5-shot 17 | cot_prompt = '''Use numbers and basic arithmetic operations (+ - * /) to obtain 24. Each step, you are only allowed to choose two of the remaining numbers to obtain a new number. 
18 | Input: 4 4 6 8 19 | Steps: 20 | 4 + 8 = 12 (left: 4 6 12) 21 | 6 - 4 = 2 (left: 2 12) 22 | 2 * 12 = 24 (left: 24) 23 | Answer: (6 - 4) * (4 + 8) = 24 24 | Input: 2 9 10 12 25 | Steps: 26 | 12 * 2 = 24 (left: 9 10 24) 27 | 10 - 9 = 1 (left: 1 24) 28 | 24 * 1 = 24 (left: 24) 29 | Answer: (12 * 2) * (10 - 9) = 24 30 | Input: 4 9 10 13 31 | Steps: 32 | 13 - 10 = 3 (left: 3 4 9) 33 | 9 - 3 = 6 (left: 4 6) 34 | 4 * 6 = 24 (left: 24) 35 | Answer: 4 * (9 - (13 - 10)) = 24 36 | Input: 1 4 8 8 37 | Steps: 38 | 8 / 4 = 2 (left: 1 2 8) 39 | 1 + 2 = 3 (left: 3 8) 40 | 3 * 8 = 24 (left: 24) 41 | Answer: (1 + 8 / 4) * 8 = 24 42 | Input: 5 5 5 9 43 | Steps: 44 | 5 + 5 = 10 (left: 5 9 10) 45 | 10 + 5 = 15 (left: 9 15) 46 | 15 + 9 = 24 (left: 24) 47 | Answer: ((5 + 5) + 5) + 9 = 24 48 | Input: {input} 49 | ''' 50 | 51 | # 1-shot 52 | propose_prompt = '''Input: 2 8 8 14 53 | Possible next steps: 54 | 2 + 8 = 10 (left: 8 10 14) 55 | 8 / 2 = 4 (left: 4 8 14) 56 | 14 + 2 = 16 (left: 8 8 16) 57 | 2 * 8 = 16 (left: 8 14 16) 58 | 8 - 2 = 6 (left: 6 8 14) 59 | 14 - 8 = 6 (left: 2 6 8) 60 | 14 / 2 = 7 (left: 7 8 8) 61 | 14 - 2 = 12 (left: 8 8 12) 62 | Input: {input} 63 | Possible next steps: 64 | ''' 65 | 66 | value_prompt = '''Evaluate if given numbers can reach 24 (sure/likely/impossible) 67 | 10 14 68 | 10 + 14 = 24 69 | sure 70 | 11 12 71 | 11 + 12 = 23 72 | 12 - 11 = 1 73 | 11 * 12 = 132 74 | 11 / 12 = 0.91 75 | impossible 76 | 4 4 10 77 | 4 + 4 + 10 = 8 + 10 = 18 78 | 4 * 10 - 4 = 40 - 4 = 36 79 | (10 - 4) * 4 = 6 * 4 = 24 80 | sure 81 | 4 9 11 82 | 9 + 11 + 4 = 20 + 4 = 24 83 | sure 84 | 5 7 8 85 | 5 + 7 + 8 = 12 + 8 = 20 86 | (8 - 5) * 7 = 3 * 7 = 21 87 | I cannot obtain 24 now, but numbers are within a reasonable range 88 | likely 89 | 5 6 6 90 | 5 + 6 + 6 = 17 91 | (6 - 5) * 6 = 1 * 6 = 6 92 | I cannot obtain 24 now, but numbers are within a reasonable range 93 | likely 94 | 10 10 11 95 | 10 + 10 + 11 = 31 96 | (11 - 10) * 10 = 10 97 | 10 10 10 are all too big 98 | impossible 
99 | 1 3 3 100 | 1 * 3 * 3 = 9 101 | (1 + 3) * 3 = 12 102 | 1 3 3 are all too small 103 | impossible 104 | {input} 105 | ''' 106 | 107 | value_last_step_prompt = '''Use numbers and basic arithmetic operations (+ - * /) to obtain 24. Given an input and an answer, give a judgement (sure/impossible) of whether the answer is correct, i.e. it uses each input exactly once and no other numbers, and reaches 24. 108 | Input: 4 4 6 8 109 | Answer: (4 + 8) * (6 - 4) = 24 110 | Judge: 111 | sure 112 | Input: 2 9 10 12 113 | Answer: 2 * 12 * (10 - 9) = 24 114 | Judge: 115 | sure 116 | Input: 4 9 10 13 117 | Answer: (13 - 9) * (10 - 4) = 24 118 | Judge: 119 | sure 120 | Input: 4 4 6 8 121 | Answer: (4 + 8) * (6 - 4) + 1 = 25 122 | Judge: 123 | impossible 124 | Input: 2 9 10 12 125 | Answer: 2 * (12 - 10) = 24 126 | Judge: 127 | impossible 128 | Input: 4 9 10 13 129 | Answer: (13 - 4) * (10 - 9) = 24 130 | Judge: 131 | impossible 132 | Input: {input} 133 | Answer: {answer} 134 | Judge:''' -------------------------------------------------------------------------------- /src/tot/prompts/text.py: -------------------------------------------------------------------------------- 1 | standard_prompt = ''' 2 | Write a coherent passage of 4 short paragraphs. The end sentence of each paragraph must be: {input} 3 | ''' 4 | 5 | cot_prompt = ''' 6 | Write a coherent passage of 4 short paragraphs. The end sentence of each paragraph must be: {input} 7 | 8 | Make a plan then write. Your output should be of the following format: 9 | 10 | Plan: 11 | Your plan here. 12 | 13 | Passage: 14 | Your passage here. 15 | ''' 16 | 17 | 18 | vote_prompt = '''Given an instruction and several choices, decide which choice is most promising. Analyze each choice in detail, then conclude in the last line "The best choice is {s}", where s is the integer id of the choice. 19 | ''' 20 | 21 | compare_prompt = '''Briefly analyze the coherency of the following two passages. 
Conclude in the last line "The more coherent passage is 1", "The more coherent passage is 2", or "The two passages are similarly coherent". 22 | ''' 23 | 24 | score_prompt = '''Analyze the following passage, then at the last line conclude "Thus the coherency score is {s}", where s is an integer from 1 to 10. 25 | ''' -------------------------------------------------------------------------------- /src/tot/tasks/__init__.py: -------------------------------------------------------------------------------- 1 | def get_task(name): 2 | if name == 'game24': 3 | from tot.tasks.game24 import Game24Task 4 | return Game24Task() 5 | elif name == 'text': 6 | from tot.tasks.text import TextTask 7 | return TextTask() 8 | elif name == 'crosswords': 9 | from tot.tasks.crosswords import MiniCrosswordsTask 10 | return MiniCrosswordsTask() 11 | else: 12 | raise NotImplementedError -------------------------------------------------------------------------------- /src/tot/tasks/base.py: -------------------------------------------------------------------------------- 1 | import os 2 | DATA_PATH = os.path.join(os.path.dirname(__file__), '..', 'data') 3 | 4 | class Task: 5 | def __init__(self): 6 | pass 7 | 8 | def __len__(self) -> int: 9 | pass 10 | 11 | def get_input(self, idx: int) -> str: 12 | pass 13 | 14 | def test_output(self, idx: int, output: str): 15 | pass -------------------------------------------------------------------------------- /src/tot/tasks/crosswords.py: -------------------------------------------------------------------------------- 1 | import re 2 | import os 3 | import json 4 | from tot.tasks.base import Task, DATA_PATH 5 | from tot.prompts.crosswords import * 6 | from tot.models import gpt 7 | 8 | class MiniCrosswordsEnv: 9 | def __init__(self, file='mini0505.json'): 10 | self.file = os.path.join(DATA_PATH, 'crosswords', file) 11 | 12 | self.file = json.load(open(self.file)) 13 | self.n = len(self.file) 14 | self.cache = {} 15 | self.idx = None 16 | self.times = 0 
17 | self.prompt_status_cache = {} 18 | 19 | def __len__(self): 20 | return self.n 21 | 22 | def reset(self, idx, board=None, status=None, steps=None): 23 | self.idx = idx 24 | self.data, self.board_gt = self.file[idx] 25 | self.board = ['_'] * 25 26 | self.ans = ['_____'] * 10 27 | self.ans_gt = self.get_ans(self.board_gt) 28 | self.steps = 0 29 | self.status = [0] * 10 # 0: unfilled; 1: filled; 2: filled then changed 30 | if board is not None: 31 | self.board = board 32 | self.ans = self.get_ans(self.board) 33 | if status is not None: 34 | self.status = status 35 | if steps is not None: 36 | self.steps = steps 37 | return self.render() 38 | 39 | 40 | def prompt_status(self): 41 | count = {'sure': 0, 'maybe': 0, 'impossible': 0} 42 | for ans, data, status in zip(self.ans, self.data, self.status): 43 | # if status != 0: continue 44 | if ans.count('_') >= 4: continue 45 | ans = ' '.join(ans.lower()) 46 | line = f'{data}: {ans}' 47 | prompt = value_prompt.format(input=line) 48 | if prompt in self.prompt_status_cache: 49 | res = self.prompt_status_cache[prompt] 50 | else: 51 | res = gpt(prompt)[0] 52 | self.prompt_status_cache[prompt] = res 53 | # print(line) 54 | # print(res) 55 | # print() 56 | res = res.split('\n')[-1].strip() 57 | if res in count: count[res] += 1 58 | # print(count) 59 | return count 60 | 61 | def render_gt_board(self): 62 | s = "GT Board:\n" 63 | for i in range(5): 64 | s += ' '.join(self.board_gt[i*5:(i+1)*5]) + '\n' 65 | return s 66 | 67 | def render_board(self): 68 | s = "Current Board:\n" 69 | for i in range(5): 70 | s += ''.join(self.board[i*5:(i+1)*5]) + '\n' 71 | return s 72 | 73 | def render_clues(self, status=None): 74 | s = "" 75 | # s += "Horizontal:\n" 76 | for i in range(5): 77 | if status is None or self.status[i] == status: 78 | s += 'h' + str(i+1) + '. ' + self.data[i] + '\n' 79 | # s += "Vertical:\n" 80 | for i in range(5, 10): 81 | if status is None or self.status[i] == status: 82 | s += 'v' + str(i-5+1) + '. 
' + self.data[i] + '\n' 83 | return s 84 | 85 | def render_ans(self, status=None): 86 | s = "" 87 | # s += "Horizontal:\n" 88 | for i in range(5): 89 | if status is None or self.status[i] == status: 90 | s += 'h' + str(i+1) + '. ' + self.data[i] + ': ' + self.ans[i] + '\n' 91 | # s += "Vertical:\n" 92 | for i in range(5, 10): 93 | if status is None or self.status[i] == status: 94 | s += 'v' + str(i-5+1) + '. ' + self.data[i] + ': ' + self.ans[i] + '\n' 95 | return s 96 | 97 | def render_gt_ans(self, status=None): 98 | s = "" 99 | # s += "Horizontal:\n" 100 | for i in range(5): 101 | if status is None or self.status[i] == status: 102 | s += 'h' + str(i+1) + '. ' + self.data[i] + ': ' + self.ans_gt[i] + '\n' 103 | # s += "Vertical:\n" 104 | for i in range(5, 10): 105 | if status is None or self.status[i] == status: 106 | s += 'v' + str(i-5+1) + '. ' + self.data[i] + ': ' + self.ans_gt[i] + '\n' 107 | return s 108 | 109 | def render(self, status=True): 110 | if status: 111 | return self.render_board() + '\nUnfilled:\n' + self.render_ans(status=0) + '\nFilled:\n' + self.render_ans(status=1) + '\nChanged:\n' + self.render_ans(status=2) 112 | else: 113 | return self.render_board() + '\n' + self.render_ans() 114 | 115 | def get_ans(self, board): 116 | ans = [''] * 10 117 | for i in range(5): 118 | ans[i] = ''.join(board[i*5:(i+1)*5]) 119 | for i in range(5): 120 | ans[i+5] = ''.join(board[i::5]) 121 | return ans 122 | 123 | def step(self, action): 124 | self.steps += 1 125 | action = action.split('\n')[-1] 126 | action = action.split('. ') 127 | if len(action) != 2: 128 | return 'Invalid! Format should be like "h1. apple"', 0, False, {} 129 | pos, word = action 130 | 131 | if len(word) != 5: 132 | return 'Invalid! 
Word should have 5 letters.', 0, False, {} 133 | if pos.startswith('h'): 134 | idx = int(pos[1:]) - 1 135 | self.board[idx*5:(idx+1)*5] = list(word.upper()) 136 | elif pos.startswith('v'): 137 | idx = int(pos[1:]) - 1 138 | self.board[idx::5] = list(word.upper()) 139 | idx += 5 # for later status update 140 | else: 141 | return 'Invalid! Position should be h1-h5 or v1-v5', 0, False, {} 142 | 143 | self.new_ans = self.get_ans(self.board) 144 | # self.status = [2 if (status == 1 and ans != new_ans) else status for status, ans, new_ans in zip(self.status, self.ans, self.new_ans)] 145 | self.status = [2 if any(letter != new_letter and letter != '_' for letter, new_letter in zip(ans, new_ans)) else status for status, ans, new_ans in zip(self.status, self.ans, self.new_ans)] 146 | self.status[idx] = 1 147 | self.ans = self.new_ans 148 | r_all = (self.board == self.board_gt) 149 | r_letter = sum(a == b for a, b in zip(self.board, self.board_gt)) / 25 150 | r_word = sum(a == b for a, b in zip(self.ans, self.ans_gt)) / 10 151 | return self.render(), r_all, (r_all or self.steps >= 20), {'r_letter': r_letter, 'r_word': r_word, 'r_game': r_all} 152 | 153 | 154 | class MiniCrosswordsTask(Task): 155 | """ 156 | Input (x) : Description of a 5x5 mini crossword 157 | Output (y) : List of 10 words to fill in the crossword 158 | Reward (r) : word level and game level 159 | Input Example: 160 | Output Example: 161 | """ 162 | def __init__(self, file='mini0505.json'): 163 | """ 164 | file: a json file (fixed) 165 | """ 166 | super().__init__() 167 | self.env = MiniCrosswordsEnv(file) # use it as a stateless tool 168 | self.xs = [] 169 | for idx in range(len(self.env)): 170 | self.env.reset(idx) 171 | self.xs.append(self.env.render_clues()) 172 | self.steps = 10 # TODO: variable steps?? 
173 | self.cache_proposals = {} 174 | 175 | def __len__(self) -> int: 176 | return len(self.env) 177 | 178 | def get_input(self, idx: int) -> str: 179 | self.env.reset(idx) 180 | return self.env.render_clues() 181 | 182 | # def test_output(self, idx: int, output: str): # TODO: r_word for now 183 | # self.env.reset(idx) 184 | # info = {'r_word': 0} 185 | # for line in output.split('\n'): 186 | # if line.startswith('h') or line.startswith('v'): 187 | # _, _, _, info = self.env.step(line) 188 | # return info['r_word'] 189 | 190 | def test_output(self, idx: int, output: str): 191 | self.env.reset(idx) 192 | output = output.split('Output:\n')[-1] 193 | info = {'r_word': 0, 'r_letter': 0, 'r_game': 0} 194 | for i, line in enumerate(output.strip().split('\n')[-5:], 1): 195 | letters = line.split(' ')[:5] 196 | word = ''.join(letters) 197 | word = word + '_' * (5 - len(word)) 198 | action = f'h{i}. {word}' 199 | # print(action) 200 | _, _, _, info = self.env.step(action) 201 | info['r'] = info['r_word'] 202 | return info 203 | 204 | def set_status(self, x: str, y: str): 205 | idx = self.xs.index(x) 206 | self.test_output(idx, y) # update self.env 207 | 208 | @staticmethod 209 | def standard_prompt_wrap(x: str, y:str='') -> str: 210 | return standard_prompt.format(input=x) + y 211 | 212 | @staticmethod 213 | def cot_prompt_wrap(x: str, y:str='') -> str: 214 | return cot_prompt.format(input=x) + y 215 | 216 | def propose_prompt_wrap(self, x: str, y: str='') -> str: 217 | self.set_status(x, y) 218 | return propose_prompt.format(input=self.env.render()) 219 | 220 | def propose_outputs_unwrap(self, x: str, y: str, outputs: list, n_max_propose: int) -> list: 221 | confidence_to_value = {'certain': 1, 'high': 0.5, 'medium': 0.2, 'low': 0.1} # TODO: ad hoc 222 | proposals_to_scores = {} 223 | for output in outputs: 224 | lines = output.split('\n') 225 | pattern = r'^([hv][1-5])\. 
([a-zA-Z]{5,5}) \((certain|high|medium|low)\).*$' 226 | for line in lines: 227 | match = re.match(pattern, line) 228 | if match: 229 | parts = [match.group(1), match.group(2), match.group(3)] 230 | proposal = parts[0].lower() + '. ' + parts[1].lower() 231 | score = confidence_to_value.get(parts[2], 0) 232 | proposals_to_scores[proposal] = proposals_to_scores.get(proposal, 0) + score 233 | 234 | proposals = sorted(proposals_to_scores.items(), key=lambda x: x[1], reverse=True) 235 | if n_max_propose != -1: 236 | proposals = proposals[:n_max_propose] 237 | proposals = [y + proposal[0] + '\n' for proposal in proposals] 238 | self.cache_proposals[(x, y, n_max_propose)] = proposals 239 | return proposals 240 | 241 | def evaluate(self, x: str, y: str, n_evaluate_sample: int) -> int: 242 | self.set_status(x, y) 243 | assert n_evaluate_sample == 1 # TODO: ad hoc 244 | count = {'sure': 0, 'maybe': 0, 'impossible': 0} 245 | for ans, data, status in zip(self.env.ans, self.env.data, self.env.status): 246 | if ans.count('_') >= 4: continue 247 | ans = ' '.join(ans.lower()) 248 | line = f'{data}: {ans}' 249 | prompt = value_prompt.format(input=line) 250 | res = gpt(prompt)[0] 251 | print(line) 252 | print(res) 253 | print() 254 | res = res.split('\n')[-1].strip() 255 | if res in count: count[res] += 1 256 | print(count) 257 | return count 258 | -------------------------------------------------------------------------------- /src/tot/tasks/game24.py: -------------------------------------------------------------------------------- 1 | import re 2 | import os 3 | import sympy 4 | import pandas as pd 5 | from tot.tasks.base import Task, DATA_PATH 6 | from tot.prompts.game24 import * 7 | 8 | 9 | def get_current_numbers(y: str) -> str: 10 | last_line = y.strip().split('\n')[-1] 11 | return last_line.split('left: ')[-1].split(')')[0] 12 | 13 | 14 | class Game24Task(Task): 15 | """ 16 | Input (x) : a string of 4 numbers 17 | Output (y) : a trajectory of 3 steps to reach 24 18 | Reward 
(r) : 0 or 1, depending on whether the trajectory is correct 19 | Input Example: 20 | 1 2 3 4 21 | Output Example: 22 | 1 + 2 = 3 (left: 3 3 4) 23 | 3 + 3 = 6 (left: 4 6) 24 | 6 * 4 = 24 (left: 24) 25 | (1 + 2 + 3) * 4 = 24 26 | """ 27 | def __init__(self, file='24.csv'): 28 | """ 29 | file: a csv file (fixed) 30 | """ 31 | super().__init__() 32 | path = os.path.join(DATA_PATH, '24', file) 33 | self.data = list(pd.read_csv(path)['Puzzles']) 34 | self.value_cache = {} 35 | self.steps = 4 36 | self.stops = ['\n'] * 4 37 | 38 | def __len__(self) -> int: 39 | return len(self.data) 40 | 41 | def get_input(self, idx: int) -> str: 42 | return self.data[idx] 43 | 44 | def test_output(self, idx: int, output: str): 45 | expression = output.strip().split('\n')[-1].lower().replace('answer: ', '').split('=')[0] 46 | numbers = re.findall(r'\d+', expression) 47 | problem_numbers = re.findall(r'\d+', self.data[idx]) 48 | if sorted(numbers) != sorted(problem_numbers): 49 | return {'r': 0} 50 | try: 51 | # print(sympy.simplify(expression)) 52 | return {'r': int(sympy.simplify(expression) == 24)} 53 | except Exception as e: 54 | # print(e) 55 | return {'r': 0} 56 | 57 | @staticmethod 58 | def standard_prompt_wrap(x: str, y:str='') -> str: 59 | return standard_prompt.format(input=x) + y 60 | 61 | @staticmethod 62 | def cot_prompt_wrap(x: str, y:str='') -> str: 63 | return cot_prompt.format(input=x) + y 64 | 65 | @staticmethod 66 | def propose_prompt_wrap(x: str, y: str='') -> str: 67 | current_numbers = get_current_numbers(y if y else x) 68 | if current_numbers == '24': 69 | prompt = cot_prompt.format(input=x) + 'Steps:' + y 70 | # print([prompt]) 71 | else: 72 | prompt = propose_prompt.format(input=current_numbers) 73 | return prompt 74 | 75 | @staticmethod 76 | def value_prompt_wrap(x: str, y: str) -> str: 77 | last_line = y.strip().split('\n')[-1] 78 | if 'left: ' not in last_line: # last step 79 | ans = last_line.lower().replace('answer: ', '') 80 | # 
print([value_last_step_prompt.format(input=x, answer=ans)])
81 |             return value_last_step_prompt.format(input=x, answer=ans)
82 |         current_numbers = get_current_numbers(y)
83 |         return value_prompt.format(input=current_numbers)
84 | 
85 |     @staticmethod
86 |     def value_outputs_unwrap(x: str, y: str, value_outputs: list) -> float:
87 |         if len(y.strip().split('\n')) == 4 and 'answer' not in y.lower():
88 |             return 0
89 |         value_names = [_.split('\n')[-1] for _ in value_outputs]
90 |         value_map = {'impossible': 0.001, 'likely': 1, 'sure': 20}  # TODO: ad hoc
91 |         value = sum(value * value_names.count(name) for name, value in value_map.items())
92 |         return value
--------------------------------------------------------------------------------
/src/tot/tasks/text.py:
--------------------------------------------------------------------------------
1 | import os
2 | import re
3 | from tot.tasks.base import Task, DATA_PATH
4 | from tot.prompts.text import *
5 | from tot.models import gpt
6 | 
7 | 
8 | class TextTask(Task):
9 |     """
10 |     Input (x) : a text instruction
11 |     Output (y) : a text generation
12 |     Reward (r) : # TODO
13 |     Input Example:
14 |     Output Example:
15 |     """
16 |     def __init__(self, file='data_100_random_text.txt'):
17 |         """
18 |         file: a text file, each line is some sentences
19 |         """
20 |         super().__init__()
21 |         path = os.path.join(DATA_PATH, 'text', file)
22 |         self.data = open(path).readlines()
23 |         self.steps = 2
24 |         self.stops = ['\nPassage:\n', None]
25 | 
26 |     def __len__(self) -> int:
27 |         return len(self.data)
28 | 
29 |     def get_input(self, idx: int) -> str:
30 |         return self.data[idx]
31 | 
32 |     def test_output(self, idx: int, output: str):
33 |         output = output.split('Passage:\n')[-1]
34 |         prompt = score_prompt + output
35 |         score_outputs = gpt(prompt, n=5, model='gpt-4')
36 |         scores = []
37 |         for score_output in score_outputs:
38 |             # print(score_output)
39 |             pattern = r".*coherency score is (\d+).*"
40 |             match = re.match(pattern, score_output, re.DOTALL)
41 |             if
match:
42 |                 score = int(match.groups()[0])
43 |                 scores.append(score)
44 |             else:
45 |                 print(f'------------------score no match: {[score_output]}')
46 |         print(scores)
47 |         # print('------------')
48 |         info = {'rs': scores, 'r': sum(scores) / len(scores) if scores else 0}
49 |         return info
50 | 
51 |     @staticmethod
52 |     def standard_prompt_wrap(x: str, y: str='') -> str:
53 |         return standard_prompt.format(input=x) + y
54 | 
55 |     @staticmethod
56 |     def cot_prompt_wrap(x: str, y: str='') -> str:
57 |         return cot_prompt.format(input=x) + y
58 | 
59 |     @staticmethod
60 |     def vote_prompt_wrap(x: str, ys: list) -> str:
61 |         prompt = vote_prompt
62 |         for i, y in enumerate(ys, 1):
63 |             # y = y.replace('Plan:\n', '')
64 |             # TODO: truncate the plan part?
65 |             prompt += f'Choice {i}:\n{y}\n'
66 |         return prompt
67 | 
68 |     @staticmethod
69 |     def vote_outputs_unwrap(vote_outputs: list, n_candidates: int) -> list:
70 |         vote_results = [0] * n_candidates
71 |         for vote_output in vote_outputs:
72 |             pattern = r".*best choice is .*(\d+).*"
73 |             match = re.match(pattern, vote_output, re.DOTALL)
74 |             if match:
75 |                 vote = int(match.groups()[0]) - 1
76 |                 if vote in range(n_candidates):
77 |                     vote_results[vote] += 1
78 |             else:
79 |                 print(f'vote no match: {[vote_output]}')
80 |         return vote_results
81 | 
82 |     @staticmethod
83 |     def compare_prompt_wrap(x: str, ys: list) -> str:
84 |         assert len(ys) == 2, 'compare prompt only supports 2 candidates'
85 |         ys = [y.split('Passage:\n')[-1] for y in ys]
86 |         prompt = compare_prompt + f'Passage 1:\n{ys[0]}\n\nPassage 2:\n{ys[1]}\n'
87 |         return prompt
88 | 
89 |     @staticmethod
90 |     def compare_output_unwrap(compare_output: str):
91 |         if 'more coherent passage is 1' in compare_output:
92 |             return 0
93 |         elif 'more coherent passage is 2' in compare_output:
94 |             return 1
95 |         elif 'two passages are similarly coherent' in compare_output:
96 |             return 0.5
97 |         else:
98 |             print(f'-----------------compare no match: {[compare_output]}')
99 |             return -1
--------------------------------------------------------------------------------
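The answer check in `Game24Task.test_output` — take the final line of the output, confirm the expression uses exactly the puzzle's numbers, and verify it evaluates to 24 — can be exercised as a standalone sketch. The function name below is hypothetical, and it substitutes a sanitized `eval` for the repo's `sympy.simplify` so it runs without third-party dependencies; it is an illustration of the same logic, not the repo's API.

```python
import re

def check_game24_answer(puzzle: str, answer: str) -> int:
    """Return 1 if `answer` is a valid Game of 24 solution for `puzzle`, else 0.

    Mirrors the checks in Game24Task.test_output, with a sanitized eval()
    standing in for sympy.simplify (an assumption for self-containment).
    """
    # Same extraction as test_output: last line, drop 'answer: ' and the '= 24' tail.
    expression = answer.strip().split('\n')[-1].lower().replace('answer: ', '').split('=')[0]
    # The multiset of numbers in the expression must match the puzzle's numbers.
    if sorted(re.findall(r'\d+', expression)) != sorted(re.findall(r'\d+', puzzle)):
        return 0
    # Only allow digits, whitespace, parentheses, and arithmetic operators before eval.
    if not re.fullmatch(r"[\d\s()+\-*/.]+", expression):
        return 0
    try:
        # Tolerance absorbs float error from division, e.g. 8 / (3 - 8/3).
        return int(abs(eval(expression) - 24) < 1e-6)
    except Exception:
        return 0
```

For example, `check_game24_answer('1 2 3 4', 'Answer: (1 + 2 + 3) * 4 = 24')` returns 1, while an expression that drops or reuses a number, or evaluates to something other than 24, returns 0.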
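The regex vote tallying in `TextTask.vote_outputs_unwrap` can likewise be tried in isolation. Below is a minimal sketch with a hypothetical function name that reuses the same pattern; note that the greedy `.*` before `(\d+)` captures only the last digit of a multi-digit choice number, so this parser is only reliable for up to nine candidates.

```python
import re

def unwrap_votes(vote_outputs: list, n_candidates: int) -> list:
    """Tally 'best choice is N' verdicts into per-candidate counts.

    Mirrors TextTask.vote_outputs_unwrap: outputs that do not match the
    pattern, or name an out-of-range candidate, are counted for no one.
    """
    vote_results = [0] * n_candidates
    for vote_output in vote_outputs:
        # Greedy .* before (\d+) grabs the LAST digit, so keep n_candidates <= 9.
        match = re.match(r".*best choice is .*(\d+).*", vote_output, re.DOTALL)
        if match:
            vote = int(match.groups()[0]) - 1  # 1-indexed verdict -> 0-indexed slot
            if vote in range(n_candidates):
                vote_results[vote] += 1
    return vote_results
```

For example, three sampled verdicts `['The best choice is 2, since it flows well.', 'best choice is 1.', 'no clear verdict']` over three candidates tally to `[1, 1, 0]`.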