├── LICENSE
├── README.md
├── fuzz.py
└── hooks
    ├── deprecated_get_options.py
    ├── deprecated_get_settings.py
    ├── gerpocalypse.py
    └── profile.py


/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2025 Bastien Orivel
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Archipelago fuzzer
2 | ==================
3 | 
4 | This is a fairly dumb fuzzer that will generate multiworlds with N random YAMLs and record failures.
5 | 
6 | ## How to run this?
7 | 
8 | You need to run archipelago from source. If you don't know how to do that, there's documentation from the archipelago project [here](https://github.com/ArchipelagoMW/Archipelago/blob/main/docs/running%20from%20source.md).
9 | 
10 | Copy the `fuzz.py` file to the root of the archipelago project. You can then run the fuzzer like any other archipelago entry point:
11 | 
12 | ```
13 | python fuzz.py -r 100 -j 16 -g alttp -n 1
14 | ```
15 | 
16 | This will run 100 tests on the alttp world, with 1 YAML per generation, using 16 jobs.
17 | The output will be available in `./fuzz_output`.
18 | 
19 | ## Flags
20 | 
21 | - `-g` selects the apworld to fuzz. If omitted, every run will use a random loaded world.
22 | - `-j` specifies the number of jobs to run in parallel. Defaults to 10; the recommended value is the number of cores of your CPU.
23 | - `-r` specifies the number of generations to run. This setting is mandatory.
24 | - `-n` specifies how many YAMLs to use per generation. Defaults to 1. You can
25 |   also specify ranges like `1-10` to make all generations pick a number between
26 |   1 and 10 YAMLs.
27 | - `-t` specifies the maximum time per generation in seconds. Defaults to 15s.
28 | - `-m` specifies a meta file that overrides specific option values.
29 | - `--skip-output` skips the output step of generation.
30 | - `--dump-ignored` also dumps option errors in the results.
31 | - `--with-static-worlds` takes a path to a directory containing YAMLs to include in every generation. Not recursive.
32 | - `--hook` takes a `module:class` string pointing to a hook and can be specified multiple times. More information about hooks below.
33 | 
34 | ## Meta files
35 | 
36 | You can force some options to always be the same value by providing a meta file via the `-m` flag.
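For example, assuming you save the meta file shown below as `meta.yaml` next to `fuzz.py` (the file name is arbitrary), a run that applies it to every generated YAML looks like this:

```
python fuzz.py -r 100 -j 16 -m meta.yaml
```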
37 | The syntax is very similar to the archipelago meta.yaml syntax:
38 | 
39 | ```yaml
40 | null:
41 |   progression_balancing: 50
42 | Pokemon FireRed and LeafGreen:
43 |   ability_blacklist: []
44 |   move_blacklist: []
45 | ```
46 | 
47 | Note that unlike an archipelago meta file, this will override the values in the
48 | generated YAML; there's no implicit application of options at generation time,
49 | so you don't need to provide the meta file to report bugs.
50 | 
51 | ## Hooks
52 | 
53 | To repurpose the fuzzer for testing a specific bug, it can be useful to
54 | monkeypatch archipelago before generation and/or to reclassify some failures.
55 | That's where hooks come in.
56 | 
57 | You can declare a class like this one in a file alongside `fuzz.py` in your
58 | archipelago installation:
59 | 
60 | ```py
61 | from fuzz import BaseHook, GenOutcome
62 | 
63 | class Hook(BaseHook):
64 |     def setup_main(self, args):
65 |         """
66 |         The args parameter is the `Namespace` containing the parsed arguments from the CLI.
67 |         setup_main is called as early as possible after argument parsing in the
68 |         main process. It is guaranteed to only ever be called once. It will
69 |         always be called before any worker process is started.
70 |         """
71 |         pass
72 | 
73 |     def setup_worker(self, args):
74 |         """
75 |         The args parameter is the `Namespace` containing the parsed arguments from the CLI.
76 |         setup_worker is called as early as possible after starting a worker process.
77 |         It is guaranteed to only ever be called once per worker process, before
78 |         any generation attempt.
79 |         """
80 |         pass
81 | 
82 |     def reclassify_outcome(self, outcome, exception):
83 |         """
84 |         The outcome is a `GenOutcome` from generation.
85 |         The exception is the exception raised during generation if one happened, None otherwise.
86 | 
87 |         This function is called in the worker process just after the result is first decided.
88 |         The one exception is for timeouts, where the outcome has to be processed by the main process.
89 |         As such, this function must do very minimal work and not make
90 |         assumptions as to whether it's running in a worker or in the main process.
91 |         """
92 |         return GenOutcome.Success
93 | 
94 |     def before_generate(self):
95 |         """
96 |         This method will be called once per generation, just before we actually call into archipelago.
97 |         """
98 |         pass
99 | 
100 |     def after_generate(self):
101 |         """
102 |         This method will be called once per generation, except if the generation timed out.
103 |         If you need to inspect the failure, use `reclassify_outcome` instead.
104 |         """
105 |         pass
106 | 
107 |     def finalize(self):
108 |         """
109 |         This method will be called once just before the main process exits. It
110 |         will only be called on the main process.
111 |         """
112 |         pass
113 | ```
114 | 
115 | You can then pass the following argument: `--hook your_file:Hook`. Note that `your_file` should be the name of your file, without the extension.
116 | The `hooks` folder in this repository contains examples of how I have personally used hooks.
117 | 
118 | ### Profiler hook
119 | 
120 | You can get a profile in callgrind format by using the provided `profile` hook.
121 | 
122 | Example:
123 | 
124 | ```
125 | python -O fuzz.py -r 1000 -n 1 -g pokemon_crystal -j24 --hook hooks.profile:Hook
126 | ```
127 | 
128 | The output (`fuzz_output/full.prof`) can be read with a tool such as `qcachegrind`.
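For instance, assuming `qcachegrind` is installed and on your PATH, the aggregated profile can be opened directly:

```
qcachegrind fuzz_output/full.prof
```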
129 | 
--------------------------------------------------------------------------------
/fuzz.py:
--------------------------------------------------------------------------------
1 | import sys
2 | import os
3 | 
4 | ap_path = os.path.abspath(os.path.dirname(sys.argv[0]))
5 | sys.path.insert(0, ap_path)
6 | 
7 | # Prevent multiprocess workers from spamming nonsense when KeyboardInterrupted
8 | # I can't wait for this to hide actual issues...
9 | if __name__ == "__mp_main__":
10 |     sys.stderr = None
11 | 
12 | from worlds import AutoWorldRegister
13 | from Options import (
14 |     get_option_groups,
15 |     Choice,
16 |     Toggle,
17 |     Range,
18 |     ItemSet,
19 |     ItemDict,
20 |     LocationSet,
21 |     NumericOption,
22 |     OptionSet,
23 |     FreeText,
24 |     PlandoConnections,
25 |     OptionList,
26 |     PlandoTexts,
27 |     OptionDict,
28 |     OptionError,
29 | )
30 | from Utils import __version__, local_path
31 | import Utils
32 | import settings
33 | 
34 | from Generate import main as GenMain
35 | from Main import main as ERmain
36 | from settings import get_settings
37 | from argparse import Namespace, ArgumentParser
38 | from concurrent.futures import TimeoutError
39 | import ctypes
40 | import threading
41 | from contextlib import redirect_stderr, redirect_stdout
42 | from enum import Enum
43 | from functools import wraps
44 | from io import StringIO
45 | from multiprocessing import Pool
46 | 
47 | import functools
48 | import logging
49 | import multiprocessing
50 | import platform
51 | import random
52 | import shutil
53 | import signal
54 | import string
55 | import tempfile
56 | import time
57 | import traceback
58 | import yaml
59 | 
60 | 
61 | OUT_DIR = "fuzz_output"
62 | settings.no_gui = True
63 | settings.skip_autosave = True
64 | MP_HOOKS = []
65 | 
66 | 
67 | # We patch this because AP can't keep its hands to itself and has to start a thread to clean stuff up.
68 | # We could monkey patch the hell out of it but since it's an inner function, I feel like the complexity
69 | # of it is unreasonable compared to just reimplementing a logger,
70 | # especially since it allows us to not have to cheat user_path
71 | 
72 | # Taken from https://github.com/ArchipelagoMW/Archipelago/blob/0.5.1.Hotfix1/Utils.py#L488
73 | # and removed everything that had to do with files, typing and cleanup
74 | def patched_init_logging(
75 |     name,
76 |     loglevel = logging.INFO,
77 |     write_mode = "w",
78 |     log_format = "[%(name)s at %(asctime)s]: %(message)s",
79 |     exception_logger = None,
80 |     *args,
81 |     **kwargs
82 | ):
83 |     loglevel: int = Utils.loglevel_mapping.get(loglevel, loglevel)
84 |     root_logger = logging.getLogger()
85 |     for handler in root_logger.handlers[:]:
86 |         root_logger.removeHandler(handler)
87 |         handler.close()
88 |     root_logger.setLevel(loglevel)
89 | 
90 |     class Filter(logging.Filter):
91 |         def __init__(self, filter_name, condition) -> None:
92 |             super().__init__(filter_name)
93 |             self.condition = condition
94 | 
95 |         def filter(self, record: logging.LogRecord) -> bool:
96 |             return self.condition(record)
97 | 
98 |     stream_handler = logging.StreamHandler(sys.stdout)
99 |     stream_handler.addFilter(Filter("NoFile", lambda record: not getattr(record, "NoStream", False)))
100 |     root_logger.addHandler(stream_handler)
101 | 
102 |     # Relay unhandled exceptions to logger.
103 |     if not getattr(sys.excepthook, "_wrapped", False):  # skip if already modified
104 |         orig_hook = sys.excepthook
105 | 
106 |         def handle_exception(exc_type, exc_value, exc_traceback):
107 |             if issubclass(exc_type, KeyboardInterrupt):
108 |                 sys.__excepthook__(exc_type, exc_value, exc_traceback)
109 |                 return
110 |             logging.getLogger(exception_logger).exception("Uncaught exception",
111 |                                                            exc_info=(exc_type, exc_value, exc_traceback))
112 |             return orig_hook(exc_type, exc_value, exc_traceback)
113 | 
114 |         handle_exception._wrapped = True
115 | 
116 |         sys.excepthook = handle_exception
117 | 
118 |     logging.info(
119 |         f"Archipelago ({__version__}) logging initialized"
120 |         f" on {platform.platform()}"
121 |         f" running Python {sys.version_info.major}.{sys.version_info.minor}.{sys.version_info.micro}"
122 |     )
123 | 
124 | Utils.init_logging = patched_init_logging
125 | 
126 | 
127 | def exception_in_causes(e, ty):
128 |     if isinstance(e, ty):
129 |         return True
130 |     if e.__cause__ is not None:
131 |         return exception_in_causes(e.__cause__, ty)
132 |     return False
133 | 
134 | 
135 | def world_from_apworld_name(apworld_name):
136 |     for name, world in AutoWorldRegister.world_types.items():
137 |         if world.__module__.startswith(f"worlds.{apworld_name}"):
138 |             return name, world
139 | 
140 |     raise Exception(f"Couldn't find loaded world for apworld: {apworld_name}")
141 | 
142 | 
143 | # See https://github.com/yaml/pyyaml/issues/103
144 | yaml.SafeDumper.ignore_aliases = lambda *args: True
145 | 
146 | # Adapted from archipelago's generate_yaml_templates
147 | # https://github.com/ArchipelagoMW/Archipelago/blob/f75a1ae1174fb467e5c5bd5568d7de3c806d5b1c/Options.py#L1504
148 | def generate_random_yaml(world_name, meta):
149 |     def dictify_range(option):
150 |         data = {option.default: 50}
151 |         for sub_option in ["random", "random-low", "random-high"]:
152 |             if sub_option != option.default:
153 |                 data[sub_option] = 0
154 | 
155 |         notes = {}
156 |         for name, number in getattr(option, "special_range_names", {}).items():
157 |             notes[name] = f"equivalent to {number}"
158 |             if number in data:
159 |                 data[name] = data[number]
160 |                 del data[number]
161 |             else:
162 |                 data[name] = 0
163 | 
164 |         return data, notes
165 | 
166 |     def sanitize(value):
167 |         if isinstance(value, frozenset):
168 |             return list(value)
169 |         return value
170 | 
171 |     game_name, world = world_from_apworld_name(world_name)
172 |     if world is None:
173 |         raise Exception(f"Failed to resolve apworld from apworld name: {world_name}")
174 | 
175 |     game_options = {}
176 |     option_groups = get_option_groups(world)
177 |     for group, options in option_groups.items():
178 |         for option_name, option_value in options.items():
179 |             override = meta.get(None, {}).get(option_name)
180 |             if override is None:  # `is None` so falsy overrides like 0 or [] still apply
181 |                 override = meta.get(game_name, {}).get(option_name)
182 | 
183 |             if override is not None:
184 |                 game_options[option_name] = override
185 |                 continue
186 | 
187 |             game_options[option_name] = sanitize(
188 |                 get_random_value(option_name, option_value)
189 |             )
190 | 
191 |     yaml_content = {
192 |         "description": "%s Template, generated with https://github.com/Eijebong/Archipelago-fuzzer"
193 |         % game_name,
194 |         "game": game_name,
195 |         "requires": {
196 |             "version": __version__,
197 |         },
198 |         game_name: game_options,
199 |     }
200 | 
201 |     res = yaml.safe_dump(yaml_content, sort_keys=False)
202 | 
203 |     return res
204 | 
205 | 
206 | def get_random_value(name, option):
207 |     if name == "item_links":
208 |         # Let's not fuck with item links right now, I'm scared
209 |         return option.default
210 | 211 | if name == "megamix_mod_data": 212 | # Megamix is a special child and requires this to be valid JSON. Since we can't provide that, just ignore it 213 | return option.default 214 | 215 | if issubclass(option, (PlandoConnections, PlandoTexts)): 216 | # See, I was already afraid with item_links but now it's plain terror. Let's not ever touch this ever. 217 | return option.default 218 | 219 | if name == "gfxmod": 220 | # XXX: LADX has this and it should be a choice but is freetext for some reason... 221 | # Putting invalid values here means the gen fails even though it doesn't affect any logic 222 | # Just return Link for now. 223 | return "Link" 224 | 225 | if issubclass(option, OptionDict): 226 | # This is for example factorio's start_items and worldgen settings. I don't think it's worth randomizing those as I'm not expecting the generation outcome to change from them. 227 | # Plus I have no idea how to randomize them in the first place :) 228 | return option.default 229 | 230 | if issubclass(option, (Choice, Toggle)): 231 | valid_choices = [key for key in option.options.keys() if key not in option.aliases] 232 | if not valid_choices: 233 | valid_choices = list(option.options.keys()) 234 | 235 | return random.choice(valid_choices) 236 | 237 | if issubclass(option, Range): 238 | return random.randint(option.range_start, option.range_end) 239 | 240 | if issubclass(option, (ItemSet, ItemDict, LocationSet)): 241 | # I don't know what to do here so just return the default value instead of a random one. 242 | # This affects options like start inventory, local items, non local 243 | # items so it's not the end of the world if they don't get randomized 244 | # but we might want to look into that later on 245 | return option.default 246 | 247 | if issubclass(option, OptionSet): 248 | return random.sample( 249 | list(option.valid_keys), k=random.randint(0, len(option.valid_keys)) 250 | ) 251 | 252 | if issubclass(option, OptionList): 253 | return random.sample( 254 | list(option.valid_keys), k=random.randint(0, len(option.valid_keys)) 255 | ) 256 | 257 | if issubclass(option, NumericOption): 258 | return option("random").value 259 | 260 | if issubclass(option, FreeText): 261 | return "".join( 262 | random.choice(string.ascii_letters) for i in range(random.randint(0, 255)) 263 | ) 264 | 265 | return option.default 266 | 267 | 268 | def call_generate(yaml_path, args): 269 | from settings import get_settings 270 | 271 | settings = get_settings() 272 | 273 | with tempfile.TemporaryDirectory(prefix="apfuzz") as output_path: 274 | args = Namespace( 275 | **{ 276 | "weights_file_path": settings.generator.weights_file_path, 277 | "sameoptions": False, 278 | "player_files_path": yaml_path, 279 | "seed": random.randint(0, 1000000000), 280 | "multi": 1, 281 | "spoiler": 1, 282 | "outputpath": output_path, 283 | "race": False, 284 | "meta_file_path": "meta-doesnt-exist.yaml", 285 | "log_level": "info", 286 | "yaml_output": 1, 287 | "plando": [], 288 | "skip_prog_balancing": False, 289 | "skip_output": args.skip_output, 290 | "csv_output": False, 291 | "log_time": False, 292 | "spoiler_only": False, 293 | } 294 | ) 295 | erargs, seed = GenMain(args) 296 | ERmain(erargs, seed) 297 | 298 | 299 | def gen_wrapper(yaml_path, apworld_name, i, args, queue): 300 | global MP_HOOKS 301 | 302 | out_buf = StringIO() 303 | 304 | myself = os.getpid() 305 | def stop(): 306 | queue.put_nowait((myself, apworld_name, i, yaml_path, out_buf)) 307 | queue.join() 308 | timer = threading.Timer(args.timeout, stop) 309 | 
timer.start() 310 | 311 | 312 | raised = None 313 | 314 | with redirect_stdout(out_buf), redirect_stderr(out_buf): 315 | try: 316 | # If we have hooks defined in args but they're not registered yet, register them 317 | if args.hook and not MP_HOOKS: 318 | for hook_class_path in args.hook: 319 | hook = find_hook(hook_class_path) 320 | hook.setup_worker(args) 321 | MP_HOOKS.append(hook) 322 | 323 | for hook in MP_HOOKS: 324 | hook.before_generate() 325 | 326 | call_generate(yaml_path.name, args) 327 | except Exception as e: 328 | raised = e 329 | finally: 330 | timer.cancel() 331 | timer.join() 332 | root_logger = logging.getLogger() 333 | handlers = root_logger.handlers[:] 334 | for handler in handlers: 335 | root_logger.removeHandler(handler) 336 | handler.close() 337 | 338 | for hook in MP_HOOKS: 339 | hook.after_generate() 340 | 341 | outcome = GenOutcome.Success 342 | if raised: 343 | is_timeout = isinstance(raised, TimeoutError) 344 | is_option_error = exception_in_causes(raised, OptionError) 345 | 346 | if is_timeout: 347 | outcome = GenOutcome.Timeout 348 | elif is_option_error: 349 | outcome = GenOutcome.OptionError 350 | else: 351 | outcome = GenOutcome.Failure 352 | 353 | for hook in MP_HOOKS: 354 | outcome = hook.reclassify_outcome(outcome, raised) 355 | 356 | if outcome == GenOutcome.Success: 357 | return outcome 358 | 359 | if outcome == GenOutcome.OptionError and not args.dump_ignored: 360 | return outcome 361 | 362 | if outcome == GenOutcome.Timeout: 363 | extra = f"[...] Generation killed here after {args.timeout}s" 364 | else: 365 | extra = "".join(traceback.format_exception(raised)) 366 | 367 | dump_generation_output(outcome, apworld_name, i, yaml_path, out_buf, extra) 368 | 369 | return outcome 370 | 371 | 372 | def dump_generation_output(outcome, apworld_name, i, yamls_dir, out_buf, extra=None): 373 | if outcome == GenOutcome.Success: 374 | return 375 | 376 | if outcome == GenOutcome.OptionError: 377 | error_ty = "ignored" 378 | elif outcome == GenOutcome.Timeout: 379 | error_ty = "timeout" 380 | else: 381 | error_ty = "error" 382 | 383 | error_output_dir = os.path.join(OUT_DIR, error_ty, apworld_name, str(i)) 384 | os.makedirs(error_output_dir) 385 | 386 | for yaml_file in os.listdir(yamls_dir.name): 387 | shutil.copy(os.path.join(yamls_dir.name, yaml_file), error_output_dir) 388 | 389 | error_log_path = os.path.join(error_output_dir, f"{i}.log") 390 | with open(error_log_path, "w") as fd: 391 | fd.write(out_buf.getvalue()) 392 | if extra is not None: 393 | fd.write(extra) 394 | 395 | 396 | class GenOutcome: 397 | Success = 0 398 | Failure = 1 399 | Timeout = 2 400 | OptionError = 3 401 | 402 | 403 | IS_TTY = sys.stdout.isatty() 404 | SUCCESS = 0 405 | FAILURE = 0 406 | TIMEOUTS = 0 407 | OPTION_ERRORS = 0 408 | SUBMITTED = 0 409 | 410 | 411 | def gen_callback(yamls_dir, args, outcome): 412 | global SUCCESS, FAILURE, SUBMITTED, OPTION_ERRORS, TIMEOUTS 413 | SUBMITTED -= 1 414 | 415 | if outcome == GenOutcome.Success: 416 | SUCCESS += 1 417 | if IS_TTY: 418 | print(".", end="") 419 | elif outcome == GenOutcome.Failure: 420 | FAILURE += 1 421 | if IS_TTY: 422 | print("F", end="") 423 | elif outcome == GenOutcome.Timeout: 424 | TIMEOUTS += 1 425 | if IS_TTY: 426 | print("T", end="") 427 | elif outcome == GenOutcome.OptionError: 428 | OPTION_ERRORS += 1 429 | if IS_TTY: 430 | print("I", end="") 431 | 432 | # If we're not on a TTY, print progress every once in a while 433 | if not IS_TTY: 434 | checks_done = SUCCESS + FAILURE + TIMEOUTS + OPTION_ERRORS 435 | if (checks_done 
% max(args.runs // 50, 1)) == 0:  # max() avoids a modulo by zero when runs < 50
436 |             print(f"{checks_done} / {args.runs} done. {FAILURE} failures, {TIMEOUTS} timeouts, {OPTION_ERRORS} ignored.")
437 | 
438 |     sys.stdout.flush()
439 | 
440 | 
441 | def error(yamls_dir, args, raised):
442 |     return gen_callback(yamls_dir, args, GenOutcome.Failure)
443 | 
444 | 
445 | def print_status():
446 |     print()
447 |     print("Success:", SUCCESS)
448 |     print("Failures:", FAILURE)
449 |     print("Timeouts:", TIMEOUTS)
450 |     print("Ignored:", OPTION_ERRORS)
451 |     print()
452 |     print("Time taken: {:.2f}s".format(time.time() - START))
453 | 
454 | 
455 | def find_hook(hook_path):
456 |     modulepath, objectpath = hook_path.split(':')
457 |     obj = __import__(modulepath)
458 |     for inner in modulepath.split('.')[1:]:
459 |         obj = getattr(obj, inner)
460 |     for inner in objectpath.split('.'):
461 |         obj = getattr(obj, inner)
462 | 
463 |     if not isinstance(obj, type):
464 |         raise RuntimeError("the hook argument should refer to a class in a module")
465 | 
466 |     if not issubclass(obj, BaseHook):
467 |         raise RuntimeError("the hook {} is not a subclass of `fuzz.BaseHook`".format(hook_path))
468 | 
469 |     return obj()
470 | 
471 | 
472 | class BaseHook:
473 |     def setup_main(self, args):
474 |         """
475 |         This function is guaranteed to only ever be called once, in the main process.
476 |         """
477 |         pass
478 | 
479 |     def setup_worker(self, args):
480 |         """
481 |         This function is guaranteed to only ever be called once per worker process. It can be used to load extra apworlds for example.
482 |         """
483 |         pass
484 | 
485 |     def reclassify_outcome(self, outcome, raised):
486 |         """
487 |         This function is called once after a generation outcome has been decided.
488 |         You can reclassify the outcome with this before it is returned to the main process by returning a new `GenOutcome`.
489 |         Note that because timeouts are processed by the main process and not by the worker itself (as it is busy timing out),
490 |         this function can be called from both the main process and the workers.
491 |         """
492 |         return outcome
493 | 
494 |     def before_generate(self):
495 |         pass
496 | 
497 |     def after_generate(self):
498 |         pass
499 | 
500 |     def finalize(self):
501 |         pass
502 | 
503 | if __name__ == "__main__":
504 |     MAIN_HOOKS = []
505 | 
506 |     def main(p, args):
507 |         global SUBMITTED
508 | 
509 |         apworld_name = args.game
510 |         if args.meta:
511 |             with open(args.meta, "r") as fd:
512 |                 meta = yaml.safe_load(fd.read())
513 |         else:
514 |             meta = {}
515 | 
516 |         if apworld_name is not None:
517 |             world = world_from_apworld_name(apworld_name)
518 |             if world is None:
519 |                 raise Exception(
520 |                     f"Failed to resolve apworld from apworld name: {apworld_name}"
521 |                 )
522 | 
523 |         if os.path.exists(OUT_DIR):
524 |             shutil.rmtree(OUT_DIR)
525 |         os.makedirs(OUT_DIR)
526 | 
527 |         for hook_class_path in args.hook:
528 |             hook = find_hook(hook_class_path)
529 |             hook.setup_main(args)
530 | 
531 |             MAIN_HOOKS.append(hook)
532 | 
533 |         sys.stdout.write("\x1b[2J\x1b[H")
534 |         sys.stdout.flush()
535 | 
536 |         i = 0
537 |         valid_worlds = [
538 |             world.__module__.split(".")[1]
539 |             for world in AutoWorldRegister.world_types.values()
540 |         ]
541 |         if "apsudoku" in valid_worlds:
542 |             valid_worlds.remove("apsudoku")
543 | 
544 |         yamls_per_run_bounds = [int(arg) for arg in args.yamls_per_run.split("-")]
545 | 
546 |         if len(yamls_per_run_bounds) not in {1, 2}:
547 |             raise Exception(
548 |                 "Invalid value passed for `yamls_per_run`. Either pass an int or a range like `1-10`"
549 |             )
550 | 
551 |         if len(yamls_per_run_bounds) == 2:
552 |             if yamls_per_run_bounds[0] >= yamls_per_run_bounds[1]:
553 |                 raise Exception("Invalid range value passed for `yamls_per_run`.")
554 | 
555 |         static_yamls = []
556 |         if args.with_static_worlds:
557 |             for yaml_file in os.listdir(args.with_static_worlds):
558 |                 path = os.path.join(args.with_static_worlds, yaml_file)
559 |                 if not os.path.isfile(path):
560 |                     continue
561 |                 with open(path, "r") as fd:
562 |                     static_yamls.append(fd.read())
563 | 
564 | 
565 |         manager = multiprocessing.Manager()
566 |         queue = manager.Queue(1000)
567 |         def handle_timeouts():
568 |             while True:
569 |                 try:
570 |                     pid, apworld_name, i, yamls_dir, out_buf = queue.get()
571 |                     os.kill(pid, signal.SIGTERM)
572 | 
573 |                     extra = f"[...] Generation killed here after {args.timeout}s"
574 |                     outcome = GenOutcome.Timeout
575 |                     for hook in MAIN_HOOKS:
576 |                         outcome = hook.reclassify_outcome(outcome, TimeoutError())  # timeouts are reclassified from the main process
577 |                     dump_generation_output(outcome, apworld_name, i, yamls_dir, out_buf, extra)
578 |                     gen_callback(yamls_dir, args, outcome)
579 |                 except:
580 |                     break
581 | 
582 |         timeout_handler = threading.Thread(target=handle_timeouts)
583 |         timeout_handler.daemon = True
584 |         timeout_handler.start()
585 | 
586 |         while i < args.runs:
587 |             if apworld_name is None:
588 |                 actual_apworld = random.choice(valid_worlds)
589 |             else:
590 |                 actual_apworld = apworld_name
591 | 
592 |             if len(yamls_per_run_bounds) == 1:
593 |                 yamls_this_run = yamls_per_run_bounds[0]
594 |             else:
595 |                 # +1 here to make the range inclusive
596 |                 yamls_this_run = random.randrange(
597 |                     yamls_per_run_bounds[0], yamls_per_run_bounds[1] + 1
598 |                 )
599 | 
600 |             random_yamls = [
601 |                 generate_random_yaml(actual_apworld, meta) for _ in range(yamls_this_run)
602 |             ]
603 | 
604 |             SUBMITTED += 1
605 | 
606 |             # We don't care about the actual gen output, just trash it immediately after gen
607 |             yamls_dir = tempfile.TemporaryDirectory(prefix="apfuzz")
608 |             for nb, yaml_content in enumerate(random_yamls):
609 |                 yaml_path = os.path.join(yamls_dir.name, f"{i}-{nb}.yaml")
610 |                 open(yaml_path, "wb").write(yaml_content.encode("utf-8"))
611 | 
612 |             for nb, yaml_content in enumerate(static_yamls):
613 |                 yaml_path = os.path.join(yamls_dir.name, f"static-{i}-{nb}.yaml")
614 |                 open(yaml_path, "wb").write(yaml_content.encode("utf-8"))
615 | 
616 |             last_job = p.apply_async(
617 |                 gen_wrapper,
618 |                 args=(yamls_dir, actual_apworld, i, args, queue),
619 |                 callback=functools.partial(gen_callback, yamls_dir, args),  # The yamls_dir arg isn't used but we abuse functools.partial to keep the object and thus the tempdir alive
620 |                 error_callback=functools.partial(error, yamls_dir, args),
621 |             )
622 | 
623 |             while SUBMITTED >= args.jobs * 10:
624 |                 # Poll the last job to keep the queue running
625 |                 last_job.ready()
626 |                 time.sleep(0.001)
627 | 
628 |             i += 1
629 | 
630 |         while SUBMITTED > 0:
631 |             last_job.ready()
632 |             time.sleep(0.05)
633 | 
634 |     parser = ArgumentParser(prog="apfuzz")
635 |     parser.add_argument("-g", "--game", default=None)
636 |     parser.add_argument("-j", "--jobs", default=10, type=int)
637 |     parser.add_argument("-r", "--runs", type=int, required=True)
638 |     parser.add_argument("-n", "--yamls_per_run", default="1", type=str)
639 |     parser.add_argument("-t", "--timeout", default=15, type=int)
640 |     parser.add_argument("-m", "--meta", default=None, type=str)
641 |     parser.add_argument("--dump-ignored", default=False, action="store_true")
642 |     parser.add_argument("--with-static-worlds",
default=None) 643 | parser.add_argument("--hook", action="append", default=[]) 644 | parser.add_argument("--skip-output", default=False, action="store_true") 645 | 646 | args = parser.parse_args() 647 | 648 | # This is just to make sure that the host.yaml file exists by the time we fork 649 | # so that a first run on a new installation doesn't throw out failures until 650 | # the host.yaml from the first gen is written 651 | get_settings() 652 | try: 653 | can_fork = hasattr(os, "fork") 654 | # fork here is way faster because it doesn't have to reload all worlds, but it's only available on some platforms 655 | # forking for every job also has the advantage of being sure that the process is "clean". Although I don't know if that actually matters 656 | start_method = "fork" if can_fork else "spawn" 657 | multiprocessing.set_start_method(start_method) 658 | with Pool(processes=args.jobs, maxtasksperchild=None) as p: 659 | START = time.time() 660 | main(p, args) 661 | except KeyboardInterrupt: 662 | pass 663 | except Exception as e: 664 | traceback.print_exc() 665 | finally: 666 | print_status() 667 | 668 | for hook in MAIN_HOOKS: 669 | hook.finalize() 670 | 671 | sys.exit((FAILURE + TIMEOUTS) != 0) 672 | 673 | -------------------------------------------------------------------------------- /hooks/deprecated_get_options.py: -------------------------------------------------------------------------------- 1 | from fuzz import GenOutcome, BaseHook 2 | 3 | class FuzzException(Exception): 4 | pass 5 | 6 | def raise_fuzz_exception(*args, **kwargs): 7 | raise FuzzException("Caught usage of deprecated Utils.get_options") 8 | 9 | 10 | class Hook(BaseHook): 11 | def setup_worker(self, _args): 12 | import Utils 13 | Utils.get_options = raise_fuzz_exception 14 | 15 | def reclassify_outcome(self, outcome, exception): 16 | if isinstance(exception, FuzzException): 17 | return GenOutcome.Failure 18 | return GenOutcome.Success 19 | -------------------------------------------------------------------------------- /hooks/deprecated_get_settings.py: -------------------------------------------------------------------------------- 1 | from fuzz import GenOutcome, BaseHook 2 | 3 | class FuzzException(Exception): 4 | pass 5 | 6 | def raise_fuzz_exception(*args, **kwargs): 7 | raise FuzzException("Caught usage of deprecated Utils.get_settings") 8 | 9 | 10 | class Hook(BaseHook): 11 | def setup_worker(self, _args): 12 | import Utils 13 | Utils.get_settings = raise_fuzz_exception 14 | 15 | def reclassify_outcome(self, outcome, exception): 16 | if isinstance(exception, FuzzException): 17 | return GenOutcome.Failure 18 | return GenOutcome.Success 19 | -------------------------------------------------------------------------------- /hooks/gerpocalypse.py: -------------------------------------------------------------------------------- 1 | from fuzz import GenOutcome, BaseHook 2 | from worlds import AutoWorldRegister, WorldSource 3 | import os 4 | import tempfile 5 | import shutil 6 | 7 | class Hook(BaseHook): 8 | def setup_main(self, args): 9 | self._tmp = tempfile.TemporaryDirectory(prefix="apfuzz") 10 | with open(os.path.join(self._tmp.name, "kh.yaml"), "w") as fd: 11 | fd.write(""" 12 | name: Player{number} 13 | description: Default Kingdom Hearts Template 14 | game: Kingdom Hearts 15 | Kingdom Hearts: {} 16 | """) 17 | args.with_static_worlds = self._tmp.name 18 | 19 | def setup_worker(self, args): 20 | if 'Kingdom Hearts' not in AutoWorldRegister.world_types: 21 | # This is correct for the ap-yaml-checker container 22 
| if os.path.isfile('/ap/supported_worlds/kh1-0.6.1.apworld'): 23 | shutil.copy('/ap/supported_worlds/kh1-0.6.1.apworld', '/ap/archipelago/worlds/kh1.apworld') 24 | WorldSource('/ap/archipelago/worlds/kh1.apworld', is_zip=True, relative=False).load() 25 | 26 | if 'Kingdom Hearts' not in AutoWorldRegister.world_types: 27 | raise RuntimeError("kh1 needs to be loaded") 28 | 29 | def reclassify_outcome(self, outcome, exception): 30 | message = str(exception).lower() 31 | if "no connected region" in message or "tried to search through an entrance" in message: 32 | return GenOutcome.Failure 33 | return GenOutcome.Success 34 | -------------------------------------------------------------------------------- /hooks/profile.py: -------------------------------------------------------------------------------- 1 | import os 2 | import yappi 3 | from fuzz import BaseHook, OUT_DIR 4 | 5 | class Hook(BaseHook): 6 | def before_generate(self): 7 | yappi.start(builtins=True, profile_threads=True) 8 | 9 | def after_generate(self): 10 | yappi.stop() 11 | yappi.get_func_stats().save(os.path.join(OUT_DIR, 'profile', f'profile_{os.getpid()}.prof')) 12 | 13 | def setup_main(self, args): 14 | os.makedirs(os.path.join(OUT_DIR, 'profile')) 15 | 16 | def finalize(self): 17 | aggregated_stats = yappi.YFuncStats([os.path.join(OUT_DIR, 'profile', f) for f in os.listdir(os.path.join(OUT_DIR, 'profile')) if f.startswith('profile_')]) 18 | aggregated_stats.strip_dirs() 19 | aggregated_stats.save(os.path.join(OUT_DIR, 'full.prof'), "callgrind") 20 | --------------------------------------------------------------------------------