├── .gitignore ├── CITATION.cff ├── CONTRIBUTING.md ├── EvaluationScripts ├── Dockerfile ├── IOCTLpicture.py ├── README.md ├── dbConnection.py ├── docker-compose.yml ├── figures │ ├── IOCTLpicture.svg │ ├── gatheringResults1.svg │ ├── gatheringResults2.svg │ ├── housekeeperResults.svg │ ├── interestingFunctions.svg │ ├── pathingResults.svg │ └── pipelineEfficiency.svg ├── file_cache.py ├── fuzzingGatherData │ ├── createSeeds.py │ ├── fuzzImprovementEval.py │ └── readme.md ├── fuzzingResults.py ├── gatheringResults.py ├── housekeeperResults.py ├── interestingFunctions.py ├── pathingGatherData │ ├── evalTargets │ │ └── targets.txt │ ├── pathfinderImproveEval.py │ ├── timing_data.json │ ├── timing_data_with_debug.json │ ├── timing_data_with_imp.json │ └── timing_data_without_imp.json ├── pathingResults.py ├── pipelineEfficiency.py ├── requirements.txt └── runAll.py ├── FinalPresentation.pdf ├── LICENSE ├── MasterThesis.pdf ├── Pipeline ├── Certificator │ ├── README.md │ ├── certificator.py │ ├── certificatorService.py │ ├── requirements.txt │ ├── sigcheck.json │ ├── sigcheck.txt │ └── sigcheckParser.py ├── Coordinator │ ├── Dockerfile │ ├── LICENSE_peresults │ ├── README.md │ ├── coordinator.py │ ├── databaseDefinition.dbml │ ├── docker-compose.yml │ ├── interestingFunctions.csv │ ├── knownVulnerableDrivers.csv │ ├── knownVulnerableDrivers │ │ ├── VDR.csv │ │ ├── downloadParseKnownVulnerableDrivers.py │ │ └── requirements.txt │ ├── models.py │ ├── openThroughRedirector.sh │ ├── peresults.py │ ├── requirements.txt │ └── storage │ │ ├── files │ │ └── .gitkeep │ │ ├── postgres │ │ └── .gitkeep │ │ └── uploads │ │ └── .gitkeep ├── Frontender │ ├── .eslintrc.json │ ├── Dockerfile │ ├── LICENSE_nextjs │ ├── LICENSE_shadcnui │ ├── README.md │ ├── components.json │ ├── docker-compose.yml │ ├── next-env.d.ts │ ├── next.config.mjs │ ├── package-lock.json │ ├── package.json │ ├── postcss.config.mjs │ ├── src │ │ ├── app │ │ │ ├── driver │ │ │ │ └── page.tsx │ │ │ ├── drivers │ │ │ │ └── page.tsx │ │ │ ├── favicon.ico │ │ │ ├── fuzzing-queue │ │ │ │ └── page.tsx │ │ │ ├── fuzzing-results │ │ │ │ └── page.tsx │ │ │ ├── globals.css │ │ │ ├── ida-pathing │ │ │ │ └── page.tsx │ │ │ ├── import-drivers │ │ │ │ └── page.tsx │ │ │ ├── known-vulnerable-drivers │ │ │ │ └── page.tsx │ │ │ ├── layout.tsx │ │ │ ├── origin-drivers │ │ │ │ └── page.tsx │ │ │ └── page.tsx │ │ ├── components │ │ │ ├── data-fetcher.tsx │ │ │ ├── data-pusher.tsx │ │ │ ├── ida-pathing.tsx │ │ │ ├── mode-toggle.tsx │ │ │ ├── navbar.tsx │ │ │ ├── theme-provider.tsx │ │ │ └── ui │ │ │ │ ├── button.tsx │ │ │ │ ├── card.tsx │ │ │ │ ├── column-header.tsx │ │ │ │ ├── column-toggle.tsx │ │ │ │ ├── data-pagination.tsx │ │ │ │ ├── data-table.tsx │ │ │ │ ├── dialog.tsx │ │ │ │ ├── driver.tsx │ │ │ │ ├── dropdown-menu.tsx │ │ │ │ ├── fuzzingQueueElem.tsx │ │ │ │ ├── hover-card.tsx │ │ │ │ ├── input.tsx │ │ │ │ ├── label.tsx │ │ │ │ ├── menubar.tsx │ │ │ │ ├── navigation-menu.tsx │ │ │ │ ├── radio-group.tsx │ │ │ │ ├── select.tsx │ │ │ │ ├── separator.tsx │ │ │ │ ├── table.tsx │ │ │ │ ├── toast.tsx │ │ │ │ ├── toaster.tsx │ │ │ │ └── use-toast.ts │ │ ├── lib │ │ │ └── utils.ts │ │ └── types │ │ │ ├── Driver.tsx │ │ │ ├── FuzzingQueue.tsx │ │ │ ├── KnownVulnerable.tsx │ │ │ └── Pathing.tsx │ ├── tailwind.config.ts │ └── tsconfig.json ├── Fuzzifier │ ├── README.md │ ├── fuzzifier.py │ ├── payload │ │ ├── .gitkeep │ │ ├── fuzzingDecoder.py │ │ ├── hexdump.py │ │ └── ioctlDecoder.py │ ├── seeds │ │ └── .gitkeep │ └── vuln_test.c.template ├── Housekeeper │ 
├── Dockerfile │ ├── README.md │ ├── docker-compose.yml │ ├── housekeeper.py │ ├── requirements.txt │ └── utils.py ├── Identifier │ ├── Dockerfile │ ├── LICENSE.old │ ├── README.md │ ├── docker-compose.yml │ ├── go.mod │ ├── go.sum │ ├── identifier.go │ ├── requirements.txt │ ├── scan.go │ └── template.go ├── Importers │ ├── README.md │ ├── VTinterface.py │ ├── VTresults │ │ └── .gitkeep │ ├── cursors │ │ └── .gitkeep │ ├── gatherMDEfiles.py │ ├── recursiveFileImporter.py │ ├── smartVTscrape.py │ └── utils.py ├── Pathfinder │ ├── README.md │ ├── VDR │ │ ├── .env │ │ ├── LICENSE │ │ ├── find_all_ioctl_codes.py │ │ ├── ida_ioctl_propagate.py │ │ ├── kmdf_re │ │ │ ├── README.md │ │ │ └── code │ │ │ │ ├── WDFStructs.h │ │ │ │ ├── kmdf_re_ida_68.py │ │ │ │ └── kmdf_re_ida_74.py │ │ ├── offsetCreation.py │ │ ├── requirements.txt │ │ ├── vdr_arm.md │ │ ├── wdfFindFunctions.py │ │ └── wdfFunctionDict.py │ ├── WinARMTil │ │ ├── ARMtils │ │ │ ├── NTAPI_win10_ARM64.til │ │ │ ├── NTDDK_win10_ARM64.til │ │ │ └── NTWDF_win10_ARM64.til │ │ ├── NTAPI_win10_ARM64.til │ │ ├── NTDDK_win10_ARM64.til │ │ ├── README.md │ │ ├── buildNTAPI.bat │ │ ├── buildNTDDK.bat │ │ ├── buildNTWDF.bat │ │ ├── ntddk_mod.h │ │ └── wdf_mod.h │ ├── functionsTree.py │ ├── ida_ioctl_res.json │ ├── interestingFunctions.csv │ ├── pathfinder.py │ └── start_multiple_instances.bat ├── UpdateCataloger │ ├── Dockerfile │ ├── LICENCE_get_microsoft_updates │ ├── README.md │ ├── catalogUpdater.py │ ├── catalogUpdaterCache.json │ ├── docker-compose.yml │ ├── get_microsoft_updates.py │ ├── utils.py │ └── vendorIDs.txt ├── docker-compose.yml └── storage │ ├── files │ └── .gitkeep │ ├── postgres │ └── .gitkeep │ └── uploads │ └── .gitkeep └── README.md /.gitignore: -------------------------------------------------------------------------------- 1 | **/*/.DS_Store 2 | **/.DS_Store 3 | 4 | **/.git 5 | 6 | Share/* 7 | Tooling/* 8 | Drivers/* 9 | Created/driverResource.hpp 10 | Created/Pipeline/Coordinator/storage/* 11 | TODO/pyPefile 12 | Created/Pipeline/Importers/VTresults/* 13 | Created/Pipeline/Importers/downloaded_files* 14 | Created/Pipeline/Pathfinder/vdrAllChanges.diff 15 | Created/Pipeline/Importers/cursors/ 16 | **/*/~$*.xlsx 17 | Documents/TO_MasterThesis* 18 | Documents/ResponsibleDisclosures.zip 19 | Created/EvaluationScripts/figures/*.eps 20 | Created/EvaluationScripts/pathingGatherData/evalTargets/* 21 | 22 | # Ignore all python cache files 23 | **/__pycache__ 24 | **/*.pyc 25 | **/*venv 26 | 27 | Created/Pipeline/storage 28 | Created/Pipeline/Importers/getVTfiles/downloaded_files/* 29 | Created/Pipeline/Importers/test 30 | Created/Pipeline/Importers/venv 31 | Created/Pipeline/Housekeeper/venv 32 | 33 | Created/Pipeline/Certificator/sigcheck64.exe 34 | 35 | 36 | ### Node stuff ### 37 | # Logs 38 | logs 39 | *.log 40 | npm-debug.log* 41 | 42 | # Runtime data 43 | pids 44 | *.pid 45 | *.seed 46 | 47 | # Directory for instrumented libs generated by jscoverage/JSCover 48 | lib-cov 49 | 50 | # Coverage directory used by tools like istanbul 51 | coverage 52 | 53 | # nyc test coverage 54 | .nyc_output 55 | 56 | # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) 57 | .grunt 58 | 59 | # node-waf configuration 60 | .lock-wscript 61 | 62 | # Compiled binary addons (http://nodejs.org/api/addons.html) 63 | build/Release 64 | 65 | # Dependency directories 66 | node_modules 67 | jspm_packages 68 | 69 | # Optional npm cache directory 70 | .npm 71 | 72 | # Optional REPL history 73 | .node_repl_history 74 | .next 
-------------------------------------------------------------------------------- /CITATION.cff: -------------------------------------------------------------------------------- 1 | cff-version: 1.2.0 2 | message: "If you use this software, please cite it as below." 3 | authors: 4 | - family-names: Oberdoerfer 5 | given-names: Tobias 6 | orcid: https://orcid.org/TODO 7 | title: "Efficient Pipelining for Windows Driver Vulnerability Research" 8 | version: 1.0.0 9 | identifiers: 10 | - type: doi 11 | value: TODO 12 | date-released: 2024-07-31 -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing to Windows Kernel Driver Security Pipeline 2 | 3 | Thank you for your interest in contributing to the Windows Kernel Driver Security Pipeline project! 4 | Your contributions help enhance the security of Windows kernel drivers and improve the overall quality of the system. 5 | 6 | ## How You Can Contribute 7 | 8 | ### Reporting Issues 9 | 10 | If you encounter any bugs or have suggestions for improvements, please report them using the [Issues](https://github.com/Toroto006/windows-kernel-driver-pipeline/issues) section of the repository. 11 | When reporting issues, provide as much detail as possible: 12 | 13 | - A clear description of the problem or suggestion. 14 | - Steps to reproduce the issue (if applicable). 15 | - Any relevant logs or screenshots. 16 | 17 | ### Requesting Features 18 | 19 | If you have a feature request or enhancement idea, please open a new issue and label it as a "Feature Request." 20 | Include: 21 | 22 | - A description of the proposed feature or enhancement. 23 | - The problem it solves or the benefit it provides. 24 | - Any ideas on how it might be implemented (optional). 25 | 26 | ### Contributing Code 27 | 28 | We welcome code contributions to improve the system. To contribute code: 29 | 30 | 1. **Fork the Repository**: Click on the "Fork" button at the top right of the repository page. 31 | 32 | 2. **Clone Your Fork**: Clone your forked repository to your local machine: 33 | ```bash 34 | git clone https://github.com/YOUR_USERNAME/windows-kernel-driver-pipeline 35 | ``` 36 | 37 | 3. **Create a Branch**: Create a new branch for your changes: 38 | ```bash 39 | git checkout -b feature/your-feature-name 40 | ``` 41 | 42 | 4. **Make Your Changes**: Implement your changes or new features in this branch. 43 | 44 | 5. **Write Tests**: Add or update tests to cover your changes if applicable. 45 | 46 | 6. **Commit Your Changes**: Commit your changes with a clear and descriptive commit message: 47 | ```bash 48 | git add . 49 | git commit -m "Add feature: your feature description" 50 | ``` 51 | 52 | 7. **Push Your Changes**: Push your changes to your fork: 53 | ```bash 54 | git push origin feature/your-feature-name 55 | ``` 56 | 57 | 8. **Create a Pull Request**: Go to the [Pull Requests](https://github.com/Toroto006/windows-kernel-driver-pipeline/pulls) section of the repository and click on "New Pull Request." Select your branch and provide a description of your changes. 58 | 59 | ### Coding Guidelines 60 | 61 | - **Code Style**: Follow the existing code style and formatting used in the project (or improve upon it, given it is not great yet). 62 | - **Documentation**: Ensure that any new features or changes are well-documented. Update README files or add new documentation as needed. 
63 | 64 | ## Reviewing Contributions 65 | 66 | All contributions will be reviewed by the project maintainers. 67 | Feedback will be provided, and changes may be requested before the pull request can be merged. 68 | 69 | ## Contact 70 | 71 | If you have any questions or need further assistance, feel free to reach out through the [Discussions](https://github.com/Toroto006/windows-kernel-driver-pipeline/discussions) section or by opening an issue. 72 | 73 | Thank you for contributing to the Windows Kernel Driver Security Pipeline project! -------------------------------------------------------------------------------- /EvaluationScripts/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM --platform=linux/amd64 python:3.9.19-slim-bullseye 2 | 3 | RUN apt update && \ 4 | apt install -y texlive dvipng texlive-latex-extra texlive-fonts-recommended cm-super 5 | 6 | RUN mkdir /evaluation 7 | COPY . /evaluation 8 | 9 | RUN python3 -m pip install -r /evaluation/requirements.txt 10 | WORKDIR /evaluation 11 | 12 | ENTRYPOINT [ "tail", "-f", "/dev/null" ] 13 | #ENTRYPOINT [ "python3", "runAll.py" ] -------------------------------------------------------------------------------- /EvaluationScripts/IOCTLpicture.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | import matplotlib.pyplot as plt 3 | from matplotlib.lines import Line2D 4 | from matplotlib.legend import Legend 5 | Line2D._us_dashSeq = property(lambda self: self._dash_pattern[1]) 6 | Line2D._us_dashOffset = property(lambda self: self._dash_pattern[0]) 7 | Legend._ncol = property(lambda self: self._ncols) 8 | import matplotlib.transforms as mtrans 9 | 10 | def IOCTLpicture(save_tex=False): 11 | print(f"IOCTL picture creation") 12 | FONT_SIZE = 12 13 | fig, ax = plt.subplots(figsize=(12, 2)) 14 | 15 | # Add vertical lines separating the bit segments 16 | for i in [2, 13, 15]: 17 | ax.axvline(x=i, color='black', linestyle='--') 18 | 19 | # Labels for the bit segments 20 | ax.text(1, 3.5, 'Transfer\nType', ha='center', va='center', fontsize=FONT_SIZE) 21 | ax.text(7.5, 3.5, 'Function Code', ha='center', va='center', fontsize=FONT_SIZE) 22 | ax.text(14, 3.5, 'Required\nAccess', ha='center', va='center', fontsize=FONT_SIZE) 23 | ax.text(23, 3.5, 'Device Type', ha='center', va='center', fontsize=FONT_SIZE) 24 | 25 | # Set the limits, labels, and title 26 | ax.set_xlim(0, 32) 27 | ax.set_ylim(2.5, 4.5) 28 | ax.set_yticks([]) 29 | ax.set_xticks(range(0, 32)) 30 | ax.set_xticklabels(range(0, 32)) 31 | trans = mtrans.Affine2D().translate(12, 0) 32 | for t in ax.get_xticklabels(): 33 | t.set_transform(t.get_transform()+trans) 34 | ax.set_xlabel('Bit', fontsize=14) 35 | 36 | plt.grid(False) 37 | #plt.show() 38 | 39 | fig.tight_layout() 40 | #plt.show() 41 | 42 | if not save_tex: 43 | fig.savefig('figures/IOCTLpicture.svg') 44 | else: 45 | # To save it for the thesis 46 | fig.savefig('figures/IOCTLpicture.pdf', format="pdf", dpi=1200, bbox_inches="tight", transparent=True) 47 | 48 | if __name__ == "__main__": 49 | save_tex = False 50 | 51 | import os 52 | os.system('clear') 53 | 54 | IOCTLpicture(save_tex=save_tex) 55 | -------------------------------------------------------------------------------- /EvaluationScripts/README.md: -------------------------------------------------------------------------------- 1 | # Evaluation Script Code for Master Thesis 2 | 3 | ## Overview 4 | This folder contains the evaluation scripts used in my master thesis. 
5 | These scripts are designed to connect to the pipeline database and generate the result graphics and facts presented in the thesis. 6 | The primary script, [runAll.py](./runAll.py), orchestrates the execution of the other scripts to produce comprehensive evaluation results. 7 | 8 | ## Folder Structure 9 | - `runAll.py`: Main script that runs all other evaluation scripts. 10 | - `Dockerfile`: Docker configuration for the evaluation environment. 11 | - `docker-compose.yml`: Docker Compose configuration to set up this container. 12 | - `file_cache.py`: A file caching system taken from [here](https://github.com/sweepai/sweep/blob/main/docs/public/file_cache.py) to speed up development. 13 | 14 | ## Prerequisites 15 | * Docker 16 | * Docker Compose 17 | 18 | ## Setup Instructions 19 | 1. Build and Run the Docker Containers 20 | Ensure you are in this directory, where the docker-compose.yml file is located, then run: 21 | ```bash 22 | docker-compose up --build 23 | ``` 24 | 25 | 2. Running the Evaluation Script 26 | Once the Docker containers are up and running, execute the main script: 27 | ```bash 28 | docker exec -it your-container-name python3 runAll.py 29 | ``` 30 | Replace your-container-name with the actual name of the container running the evaluation environment. 31 | 32 | ## Configuration 33 | The connection to the pipeline database and other configuration settings are defined in the [dbConnection.py](./dbConnection.py) file. 34 | Modify these settings as necessary to match your database credentials and other configurations. 35 | 36 | ## Output 37 | The results generated by the evaluation scripts will be stored in the [figures](./figures/) directory. -------------------------------------------------------------------------------- /EvaluationScripts/dbConnection.py: -------------------------------------------------------------------------------- 1 | import psycopg2 2 | from psycopg2 import sql 3 | from file_cache import file_cache 4 | 5 | # Define the connection parameters 6 | conn_params = { 7 | 'dbname': 'pipeline', 8 | 'user': 'pipeline', 9 | 'password': 'CHANGE_PASSWORD', 10 | 'host': 'COORDINATOR_IP', 11 | 'port': '5432' 12 | } 13 | 14 | cached_conn = None 15 | cached_cursor = None 16 | 17 | @file_cache() 18 | def run_query(query): 19 | global cached_conn, cached_cursor 20 | # Establish the connection 21 | try: 22 | # cache the connection 23 | if cached_conn is None: 24 | cached_conn = psycopg2.connect(**conn_params) 25 | cached_cursor = cached_conn.cursor() 26 | 27 | # Execute the query 28 | cached_cursor.execute(query) 29 | 30 | # Fetch all results 31 | results = cached_cursor.fetchall() 32 | 33 | return results 34 | except Exception as e: 35 | print(f"An error occurred: {e}") 36 | return None 37 | 38 | def close_connection(): 39 | if cached_conn is not None: 40 | cached_cursor.close() 41 | cached_conn.close() -------------------------------------------------------------------------------- /EvaluationScripts/docker-compose.yml: -------------------------------------------------------------------------------- 1 | services: 2 | evaluation: 3 | build: . 
4 | restart: "no" 5 | container_name: evaluation 6 | volumes: 7 | - .:/evaluation 8 | -------------------------------------------------------------------------------- /EvaluationScripts/file_cache.py: -------------------------------------------------------------------------------- 1 | # https://github.com/sweepai/sweep/blob/main/docs/public/file_cache.py 2 | 3 | import hashlib 4 | import inspect 5 | import os 6 | import pickle 7 | import shutil 8 | 9 | DISABLE_CACHE = False 10 | 11 | MAX_DEPTH = 6 12 | if DISABLE_CACHE: 13 | print("File cache is disabled.") 14 | 15 | 16 | def recursive_hash(value, depth=0, ignore_params=[]): 17 | """Hash primitives recursively with maximum depth.""" 18 | if depth > MAX_DEPTH: 19 | return hashlib.md5("max_depth_reached".encode()).hexdigest() 20 | 21 | if isinstance(value, (int, float, str, bool, bytes)): 22 | return hashlib.md5(str(value).encode()).hexdigest() 23 | elif isinstance(value, (list, tuple)): 24 | return hashlib.md5( 25 | "".join( 26 | [recursive_hash(item, depth + 1, ignore_params) for item in value] 27 | ).encode() 28 | ).hexdigest() 29 | elif isinstance(value, dict): 30 | return hashlib.md5( 31 | "".join( 32 | [ 33 | recursive_hash(key, depth + 1, ignore_params) 34 | + recursive_hash(val, depth + 1, ignore_params) 35 | for key, val in value.items() 36 | if key not in ignore_params 37 | ] 38 | ).encode() 39 | ).hexdigest() 40 | elif hasattr(value, "__dict__") and value.__class__.__name__ not in ignore_params: 41 | return recursive_hash(value.__dict__, depth + 1, ignore_params) 42 | else: 43 | return hashlib.md5("unknown".encode()).hexdigest() 44 | 45 | 46 | def hash_code(code): 47 | return hashlib.md5(code.encode()).hexdigest() 48 | 49 | 50 | cache_dir = "/tmp/caches/file_cache" 51 | 52 | def file_cache(ignore_params=[], verbose=False): 53 | """Decorator to cache function output based on its inputs, ignoring specified parameters. 54 | Ignore parameters are used to avoid caching on non-deterministic inputs, such as timestamps. 55 | We can also ignore parameters that are slow to serialize/constant across runs, such as large objects. 
56 | """ 57 | global cache_dir 58 | 59 | def decorator(func): 60 | if DISABLE_CACHE: 61 | if verbose: 62 | print("Cache is disabled for function: " + func.__name__) 63 | return func 64 | func_source_code_hash = hash_code(inspect.getsource(func)) 65 | 66 | def wrapper(*args, **kwargs): 67 | os.makedirs(cache_dir, exist_ok=True) 68 | 69 | # Convert args to a dictionary based on the function's signature 70 | args_names = func.__code__.co_varnames[: func.__code__.co_argcount] 71 | args_dict = dict(zip(args_names, args)) 72 | 73 | # Remove ignored params 74 | kwargs_clone = kwargs.copy() 75 | for param in ignore_params: 76 | args_dict.pop(param, None) 77 | kwargs_clone.pop(param, None) 78 | 79 | # Create hash based on argument names, argument values, and function source code 80 | arg_hash = ( 81 | recursive_hash(args_dict, ignore_params=ignore_params) 82 | + recursive_hash(kwargs_clone, ignore_params=ignore_params) 83 | + func_source_code_hash 84 | ) 85 | cache_file = os.path.join( 86 | cache_dir, f"{func.__module__}_{func.__name__}_{arg_hash}.pickle" 87 | ) 88 | 89 | try: 90 | # If cache exists, load and return it 91 | if os.path.exists(cache_file): 92 | if verbose: 93 | print("Used cache for function: " + func.__name__) 94 | with open(cache_file, "rb") as f: 95 | return pickle.load(f) 96 | except Exception: 97 | print("Unpickling of file cache failed") 98 | 99 | # Otherwise, call the function and save its result to the cache 100 | result = func(*args, **kwargs) 101 | try: 102 | with open(cache_file, "wb") as f: 103 | pickle.dump(result, f) 104 | except Exception as e: 105 | print(f"Pickling of file cache failed: {e}") 106 | return result 107 | 108 | return wrapper 109 | 110 | return decorator 111 | 112 | def clear_cache(): 113 | """Clear all cache files.""" 114 | shutil.rmtree(cache_dir) -------------------------------------------------------------------------------- /EvaluationScripts/fuzzingGatherData/createSeeds.py: -------------------------------------------------------------------------------- 1 | import struct 2 | import base64 3 | 4 | def cyclic(length, n=4): 5 | pattern = b'' 6 | sequence = b'' 7 | count = 0 8 | 9 | while count < length: 10 | if len(sequence) == n: 11 | pattern += sequence 12 | sequence = b'' 13 | sequence += struct.pack('" in ioctl['op']: 35 | ioctl_set.add(ioctl['val'] + 0x4) 36 | ioctl_set.add(ioctl['val']) 37 | 38 | 39 | seed_obj = [] 40 | for ioctl in ioctl_set: 41 | seeds = [ 42 | #seed(ioctl, inputSize=8, outputSize=8, empty=True), 43 | seed(ioctl, inputSize=16, outputSize=16, empty=True), 44 | #seed(ioctl, inputSize=0x80, outputSize=0x80, empty=True), 45 | seed(ioctl, inputSize=0x80, outputSize=0x80, empty=False), 46 | seed(ioctl, inputSize=0x1000, outputSize=0x1000, empty=True), 47 | #seed(ioctl, inputSize=0x1000, outputSize=0x1000, empty=False), 48 | ] 49 | seed_obj.append(seeds) 50 | return seed_obj 51 | 52 | 53 | if __name__ == '__main__': 54 | ioctl_comp = [ 55 | { 56 | 'op': '==', 57 | 'val': 0x222003 58 | }, 59 | { 60 | 'op': '==', 61 | 'val': 0x222007 62 | } 63 | ] 64 | seed_obj = create_ioctl_seeds_for(ioctl_comp) 65 | for s in seed_obj: 66 | print(f"'{s}',") 67 | 68 | print(f"{seed(0x000000, 16, 16, True)}") 69 | print(f"{seed(0x000000, 0x80, 0x80, False)}") 70 | print(f"{seed(0x000000, 0x1000, 0x1000, True)}") 71 | 72 | -------------------------------------------------------------------------------- /EvaluationScripts/fuzzingGatherData/readme.md: -------------------------------------------------------------------------------- 1 | This python script 
is a slight modification of the fuzzifier to gather data about the improvements the IOCTL seeding made. 2 | Run it in the same location you would run the fuzzifier. -------------------------------------------------------------------------------- /EvaluationScripts/housekeeperResults.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | import matplotlib.pyplot as plt 3 | from matplotlib.lines import Line2D 4 | from matplotlib.legend import Legend 5 | Line2D._us_dashSeq = property(lambda self: self._dash_pattern[1]) 6 | Line2D._us_dashOffset = property(lambda self: self._dash_pattern[0]) 7 | Legend._ncol = property(lambda self: self._ncols) 8 | from dbConnection import run_query, close_connection 9 | 10 | import numpy as np 11 | import matplotlib.ticker as tkr 12 | 13 | def sizeof_fmt(num, pos): 14 | for x in ['B', 'KB', 'MB', '', 'TB']: 15 | if num < 1024.0: 16 | return f"{num:3.0f}{x}" 17 | num /= 1024 18 | 19 | def sizeof_fmt_si(x, pos): 20 | for x_unit in ['', 'k', 'M', 'G', 'T']: 21 | if x < 1000: 22 | return f"{x:3.0f}{x_unit}" 23 | x /= 1000 24 | 25 | def housekeeperResults(save_tex=False): 26 | print(f"Housekeeper statistics") 27 | 28 | query = """SELECT files.id, filename, path FROM files 29 | JOIN notes ON notes.isfor = files.id 30 | WHERE filename LIKE '%.xrs' 31 | AND notes.title = 'magic' 32 | AND notes.content NOT LIKE '%PE%executable%';""" 33 | xrs_files_that_are_not_exes = run_query(query) 34 | print(f"XRS files that are not executables: {len(xrs_files_that_are_not_exes)}") 35 | 36 | query = """SELECT 37 | LOWER(SPLIT_PART(filename, '.', array_length(string_to_array(filename, '.'), 1))) AS extension, 38 | COUNT(*) AS extension_count 39 | FROM files 40 | WHERE LENGTH(SPLIT_PART(filename, '.', array_length(string_to_array(filename, '.'), 1))) <= 5 41 | GROUP BY LOWER(SPLIT_PART(filename, '.', array_length(string_to_array(filename, '.'), 1))) 42 | ORDER BY extension_count DESC;""" 43 | all_result = run_query(query) 44 | 45 | query = """SELECT 46 | LOWER(SPLIT_PART(filename, '.', array_length(string_to_array(filename, '.'), 1))) AS extension, 47 | SUM(size) AS total_size 48 | FROM files 49 | WHERE LENGTH(SPLIT_PART(filename, '.', array_length(string_to_array(filename, '.'), 1))) <= 5 50 | GROUP BY extension 51 | ORDER BY total_size DESC;""" 52 | all_result_size = run_query(query) 53 | 54 | query = """SELECT COUNT(*) FROM files;""" 55 | total_files_count = run_query(query)[0][0] 56 | query = """SELECT SUM(size) FROM files;""" 57 | total_files_size = run_query(query)[0][0] 58 | 59 | # create the graphic 60 | X = 15 61 | amount_files = all_result[:X] 62 | amount_files_size = all_result_size[:X] 63 | percent_covered_top_X = sum([row[1] for row in amount_files]) / total_files_count * 100 64 | percent_covered_top_X_size = sum([row[1] for row in amount_files_size]) / total_files_size * 100 65 | print(f"Top {X} extensions cover {percent_covered_top_X:.2f}% of all extensions by count and {percent_covered_top_X_size:.2f}% by size") 66 | 67 | # Plot these amount_filess in a bar chart 68 | plt.grid(False) 69 | fig, ax = plt.subplots(2, 1, figsize=(5,4)) 70 | plt.xticks(rotation=45) 71 | 72 | extensions = [row[0] for row in amount_files] 73 | counts = [row[1] for row in amount_files] 74 | def color_ext(ext): 75 | if ext in ['xml', 'wtl', 'htm', 'html', 'txt', 'log', 'ini', 'mui', 'mum', 'rdata', 'text', 'data']: 76 | # any text files except inf are useless 77 | return 'tab:gray' 78 | if ext in ['inf', 'cat']: 79 | return 
'tab:green' 80 | if ext in ['dll', 'sys']: 81 | return 'tab:red' 82 | if ext in ['cab', 'exe']: 83 | return 'tab:orange' 84 | else: 85 | return 'tab:blue' 86 | colors = list(map(color_ext, extensions)) 87 | 88 | ax[0].bar(extensions, counts, color=colors) 89 | ax[0].set_xticklabels(extensions, rotation=30) 90 | ax[0].set_ylabel('Count') 91 | #ax[0].set_xlabel('Extension of File') 92 | ax[0].yaxis.set_major_formatter(tkr.FuncFormatter(sizeof_fmt_si)) 93 | 94 | # plot the size of the files 95 | extensions = [row[0] for row in amount_files_size] 96 | # shorten all too long exten names 97 | 98 | sizes = [row[1] for row in amount_files_size] 99 | colors = list(map(color_ext, extensions)) 100 | ax[1].bar(extensions, sizes, color=colors) 101 | ax[1].set_xticklabels(extensions, rotation=30) 102 | ax[1].set_ylabel('Size in GB') 103 | if not save_tex: 104 | ax[1].set_xlabel('Extension of File') 105 | ax[1].yaxis.set_major_formatter(tkr.FuncFormatter(sizeof_fmt)) 106 | 107 | fig.tight_layout() 108 | #plt.show() 109 | 110 | if not save_tex: 111 | fig.savefig('figures/housekeeperResults.svg') 112 | else: 113 | # To save it for the thesis 114 | fig.savefig('figures/housekeeperResults.pdf', format="pdf", dpi=1200, bbox_inches="tight", transparent=True) 115 | 116 | if __name__ == "__main__": 117 | save_tex = False 118 | 119 | import os 120 | os.system('clear') 121 | 122 | housekeeperResults(save_tex=save_tex) 123 | 124 | # cleanup DB connections 125 | close_connection() 126 | -------------------------------------------------------------------------------- /EvaluationScripts/pathingGatherData/timing_data.json: -------------------------------------------------------------------------------- 1 | {"2046": [9.17704153060913, 7.836297988891602, 7.6308753490448, 7.668422698974609, 8.115129470825195, 7.744079113006592, 7.602044105529785, 8.473036527633667, 8.265427827835083, 7.841711521148682, 7.5999555587768555, 7.703534841537476, 8.155822992324829, 7.540645360946655, 7.727557897567749, 7.56475567817688, 8.086885452270508, 7.461124897003174, 7.810689210891724, 7.81720495223999, 7.693303346633911, 7.8214616775512695, 7.962069511413574, 7.722068786621094, 9.77800440788269, 8.123475551605225, 7.621859312057495, 8.06055760383606, 7.909996032714844, 7.686694383621216], "25993": [10.4516122341156, 10.374760866165161, 9.93375849723816, 9.748517274856567, 9.988101482391357, 10.739658117294312, 9.859022617340088, 9.538267850875854, 9.5648832321167, 10.513072967529297, 9.676513433456421, 10.0924391746521, 10.745392084121704, 9.33002257347107, 9.827680110931396, 10.734796524047852, 11.629641056060791, 10.189533233642578, 10.294504880905151, 9.308673858642578, 9.96841311454773, 10.155933380126953, 10.512485027313232, 10.06449270248413, 9.998260736465454, 9.522744178771973, 9.266030073165894, 9.96322774887085, 9.900075912475586, 9.655819416046143], "332": [64.03578662872314, 67.1960928440094, 60.79214787483215, 63.364659547805786, 65.61525702476501, 60.672457218170166, 62.0703980922699, 60.57488799095154, 65.91709923744202, 63.611979722976685, 63.513726234436035, 60.90448451042175, 65.28287315368652, 68.42104411125183, 61.946877002716064, 64.31804323196411, 62.50108766555786, 62.678194761276245, 63.283971309661865, 64.35687923431396, 63.657902002334595, 67.50174474716187, 65.08932399749756, 67.37360429763794, 61.37448978424072, 61.33926248550415, 61.69741463661194, 63.57839560508728, 62.73291873931885, 65.84966802597046], "676": [12.289759159088135, 11.07045841217041, 11.631366729736328, 11.25057053565979, 
10.975219964981079, 11.718161344528198, 11.147015571594238, 11.337153196334839, 11.418219566345215, 11.497079849243164, 10.81413984298706, 11.955076932907104, 10.705798864364624, 10.822819471359253, 11.316617250442505, 11.786760568618774, 11.481260299682617, 11.733560800552368, 11.667461633682251, 11.295334339141846, 11.753819227218628, 11.702272176742554, 10.885707139968872, 11.17809510231018, 10.598897457122803, 11.043208837509155, 11.575322389602661, 11.401673316955566, 11.400685787200928, 11.572119951248169], "30105": [49.983612060546875, 52.41014552116394, 52.14942812919617, 49.48392343521118, 50.100000619888306, 51.05495238304138, 52.214457988739014, 51.43062901496887, 55.30897641181946, 53.2883780002594, 53.477614879608154, 49.75772047042847, 51.34149384498596, 51.61863660812378, 50.86578130722046, 54.41053485870361, 51.52240014076233, 50.495676040649414, 51.27362084388733, 52.03613352775574, 50.55678606033325, 52.05858325958252, 50.44626522064209, 49.574227809906006, 48.08893370628357, 48.54989218711853, 51.51060223579407, 49.76343774795532, 51.355294704437256, 51.80850386619568], "284": [11.768394708633423, 12.210420846939087, 11.844457149505615, 11.779604196548462, 11.487494945526123, 11.476407289505005, 11.778305530548096, 11.39381718635559, 12.366632461547852, 12.260750770568848, 11.891938209533691, 12.133279085159302, 11.406816720962524, 11.432368516921997, 12.205097436904907, 11.88157606124878, 11.177547216415405, 12.359823226928711, 11.236374378204346, 12.399032354354858, 11.513599634170532, 11.0072021484375, 10.550741195678711, 11.763375997543335, 11.272939682006836, 11.016903162002563, 11.233101844787598, 11.059524297714233, 10.935725927352905, 11.264850616455078], "4123": [68.44997596740723, 71.75470328330994, 72.55065298080444, 72.05062222480774, 72.4514639377594]} -------------------------------------------------------------------------------- /EvaluationScripts/requirements.txt: -------------------------------------------------------------------------------- 1 | contourpy==1.2.1 2 | cycler==0.12.1 3 | fonttools==4.53.1 4 | kiwisolver==1.4.5 5 | matplotlib==3.6 6 | packaging==24.1 7 | pillow==10.4.0 8 | pyparsing==3.1.2 9 | python-dateutil==2.9.0.post0 10 | six==1.16.0 11 | tikzplotlib==0.10.1 12 | webcolors==1.13 13 | numpy==1.26.4 14 | psycopg2-binary 15 | seaborn 16 | requests 17 | scipy -------------------------------------------------------------------------------- /EvaluationScripts/runAll.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | 3 | import os 4 | os.system('clear') 5 | 6 | from pipelineEfficiency import pipelineEfficiency 7 | from gatheringResults import gatheringResults1, gatheringResults2, gatheringResults3 8 | from pathingResults import pathingResults 9 | from housekeeperResults import housekeeperResults 10 | from interestingFunctions import interestingFunctions 11 | from IOCTLpicture import IOCTLpicture 12 | from fuzzingResults import fuzzingResults 13 | 14 | # latex will be slower and require a working LaTeX installation 15 | SAVE_TEX = True 16 | 17 | if SAVE_TEX: 18 | ## reset defaults 19 | plt.rcdefaults() 20 | 21 | plt.style.use('default') 22 | 23 | ## Set up LaTeX fonts 24 | # plt.rcParams.update({ 25 | # "text.usetex": True, 26 | # "font.family": "serif", 27 | # "font.serif": ["Computer Modern Roman"], 28 | # "font.size": 12, 29 | # }) 30 | 31 | print(f"Running all evaluation scripts against DB and saving to {'tex' if SAVE_TEX else 'svg'} files.") 32 | 33 | 
pipelineEfficiency(save_tex=SAVE_TEX) 34 | gatheringResults1(save_tex=SAVE_TEX) 35 | gatheringResults2(save_tex=SAVE_TEX) 36 | gatheringResults3(save_tex=SAVE_TEX) 37 | pathingResults(save_tex=SAVE_TEX) 38 | housekeeperResults(SAVE_TEX) 39 | interestingFunctions(SAVE_TEX) 40 | IOCTLpicture(SAVE_TEX) 41 | fuzzingResults(SAVE_TEX) -------------------------------------------------------------------------------- /FinalPresentation.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/FinalPresentation.pdf -------------------------------------------------------------------------------- /MasterThesis.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/MasterThesis.pdf -------------------------------------------------------------------------------- /Pipeline/Certificator/README.md: -------------------------------------------------------------------------------- 1 | # How To Use 2 | 3 | This Certificator is supposed to run on a Windows system that has a connection to the Coordinator. 4 | 5 | Add to this folder the [sigcheck executable](https://learn.microsoft.com/en-us/sysinternals/downloads/sigcheck), which will be executed by this script. 6 | 7 | For obvious reasons, it requires disabling [Windows Defender](https://superuser.com/a/1757341) or any similar system. -------------------------------------------------------------------------------- /Pipeline/Certificator/certificator.py: -------------------------------------------------------------------------------- 1 | # coding=utf8 2 | import requests 3 | import time 4 | from urllib3.exceptions import InsecureRequestWarning 5 | import os 6 | from sigcheckParser import parse_sigcheck_output 7 | 8 | # Suppress only the single warning from urllib3 needed. 9 | requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning) 10 | 11 | COORDINATOR_BASE_URL = 'http://COORDINATOR_IP:5000' 12 | 13 | # the work dir is the directory this file is in 14 | WORKDIR = os.path.dirname(os.path.realpath(__file__)) 15 | 16 | def fetch_driver_signatures_todo(page=1): 17 | url = f'{COORDINATOR_BASE_URL}/todo-signatures' 18 | # catch Max retries exceeded with url 19 | try: 20 | response = requests.get(url, verify=False) 21 | except requests.exceptions.RequestException as e: 22 | print(f"Failed to fetch the drivers: {e}") 23 | return [] 24 | if response.status_code == 200: 25 | return response.json()['drivers'] 26 | else: 27 | print(f"Failed to fetch the drivers that require signature checking. Status code: {response.status_code}") 28 | return [] 29 | 30 | def do_driver_certificat_checking(driver): 31 | # first download the file 32 | url = f'{COORDINATOR_BASE_URL}/files/{driver["file"]}' 33 | response = requests.get(url, verify=False) 34 | if response.status_code != 200: 35 | print(f"Failed to download the file {driver['file']}. Status code: {response.status_code}") 36 | return 37 | 38 | driver_filename = driver['filename'] 39 | # first check if file exists, if so error out 40 | if os.path.exists(f'C:\\Windows\\Temp\\{driver_filename}'): 41 | print(f"File {driver['filename']} already exists in temp folder. 
Exiting.") 42 | return 43 | with open(f'C:\\Windows\\Temp\\{driver_filename}', 'wb') as f: 44 | f.write(response.content) 45 | 46 | temp_output = os.path.join(WORKDIR, "sigcheck.txt") 47 | # then run sigcheck over it 48 | sigcheck_cmd = os.path.join(WORKDIR, "sigcheck64.exe") 49 | sigcheck_cmd += f' -accepteula -h -i -e -w {temp_output}' 50 | sigcheck_cmd += f' "C:\\Windows\\Temp\\{driver_filename}"' 51 | sigcheck_cmd += " > nul 2>&1" 52 | 53 | os.system(sigcheck_cmd) 54 | time.sleep(1) # wait for output to be written 55 | 56 | # get results from sigcheck.txt 57 | sigcheck_output = "" 58 | with open(temp_output, 'r', encoding='UTF-16') as f: 59 | sigcheck_output = f.read() 60 | 61 | if len(sigcheck_output) == 0: 62 | print(f"Failed to get any output from sigcheck for {driver['filename']}") 63 | return 64 | 65 | try: 66 | output = parse_sigcheck_output(sigcheck_output) 67 | except Exception as e: 68 | print(f"Failed to parse the output from sigcheck for {driver['filename']}. Error: {e}") 69 | return 70 | 71 | # check sha256 is correct 72 | if output['SHA256'].lower() != driver['sha256'].lower(): 73 | print(f"\nERROR: downloaded wrong file?!? ({driver['filename']} [{driver['id']}] with driver file sha256:{driver['sha256']} and sigcheck sha256:{output['SHA256']})") 74 | return 75 | 76 | # send the results to the coordinator 77 | url = f'{COORDINATOR_BASE_URL}/driver-signature/{driver["id"]}' 78 | response = requests.post(url, json=output, verify=False) 79 | if response.status_code != 200: 80 | print(f"Failed to send the results to the coordinator for {driver['filename']}. Status code: {response.status_code}") 81 | return 82 | 83 | # Cleanup 84 | os.remove(f'C:\\Windows\\Temp\\{driver_filename}') 85 | os.remove(temp_output) 86 | 87 | # Main for if this is run within a container, or by hand 88 | def main(): 89 | while True: 90 | drivers = fetch_driver_signatures_todo() 91 | if len(drivers) == 0: # We are done with one full iteration 92 | time.sleep(30) 93 | continue 94 | 95 | for driver in drivers: 96 | print(f"Doing {driver['filename']} ({driver['id']}) ... 
", end="") 97 | do_driver_certificat_checking(driver) 98 | print("done") 99 | 100 | if __name__ == "__main__": 101 | main() -------------------------------------------------------------------------------- /Pipeline/Certificator/certificatorService.py: -------------------------------------------------------------------------------- 1 | # currently not used in the pipeline, idea was to make 2 | # a service instead of having to run it manually after restarting the VM 3 | 4 | import win32serviceutil 5 | import win32service 6 | import win32event 7 | import servicemanager 8 | import socket 9 | 10 | from certificator import fetch_driver_signatures_todo, do_driver_certificat_checking 11 | import time 12 | 13 | class AppServerSvc (win32serviceutil.ServiceFramework): 14 | _svc_name_ = "CertificatorService" 15 | _svc_display_name_ = "Certificator Service" 16 | 17 | def __init__(self,args): 18 | self.isStopped = True 19 | win32serviceutil.ServiceFramework.__init__(self,args) 20 | self.hWaitStop = win32event.CreateEvent(None,0,0,None) 21 | socket.setdefaulttimeout(60) 22 | 23 | def SvcStop(self): 24 | self.ReportServiceStatus(win32service.SERVICE_STOP_PENDING) 25 | self.isStopped = True 26 | win32event.SetEvent(self.hWaitStop) 27 | 28 | def SvcDoRun(self): 29 | servicemanager.LogMsg(servicemanager.EVENTLOG_INFORMATION_TYPE, 30 | servicemanager.PYS_SERVICE_STARTED, 31 | (self._svc_name_,'')) 32 | self.isStopped = False 33 | self.main() 34 | 35 | def main(self): 36 | while not self.isStopped: 37 | drivers = fetch_driver_signatures_todo() 38 | if len(drivers) == 0: # We are done with one full iteration 39 | time.sleep(30) 40 | continue 41 | 42 | for driver in drivers: 43 | print(f"Doing {driver['filename']} ({driver['id']}) ... ", end="") 44 | do_driver_certificat_checking(driver) 45 | print("done") 46 | 47 | if __name__ == '__main__': 48 | win32serviceutil.HandleCommandLine(AppServerSvc) -------------------------------------------------------------------------------- /Pipeline/Certificator/requirements.txt: -------------------------------------------------------------------------------- 1 | requests -------------------------------------------------------------------------------- /Pipeline/Certificator/sigcheck.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Certificator/sigcheck.txt -------------------------------------------------------------------------------- /Pipeline/Coordinator/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.11-slim 2 | 3 | RUN mkdir /app /tmp/uploads 4 | 5 | WORKDIR /app 6 | 7 | COPY requirements.txt /app/requirements.txt 8 | 9 | RUN pip3 install --upgrade pip && \ 10 | pip3 install -r requirements.txt 11 | 12 | RUN apt update && apt install -y binutils 13 | #COPY StaticAnalyzer/peresults.py /app/peresults.py 14 | 15 | COPY coordinator.py /app/coordinator.py 16 | COPY models.py /app/models.py 17 | COPY peresults.py /app/peresults.py 18 | COPY knownVulnerableDrivers.csv /app/knownVulnerableDrivers.csv 19 | COPY interestingFunctions.csv /app/interestingFunctions.csv 20 | 21 | # infinite loop to keep the container running 22 | #ENTRYPOINT ["tail", "-f", "/dev/null"] 23 | ENTRYPOINT [ "python3", "coordinator.py" ] -------------------------------------------------------------------------------- /Pipeline/Coordinator/LICENSE_peresults: 
-------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2016 Malice.IO 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | 23 | Copyright (c) 2013-2016, Claudio "nex" Guarnieri 24 | All rights reserved. 25 | 26 | Redistribution and use in source and binary forms, with or without modification, 27 | are permitted provided that the following conditions are met: 28 | 29 | * Redistributions of source code must retain the above copyright notice, this 30 | list of conditions and the following disclaimer. 31 | 32 | * Redistributions in binary form must reproduce the above copyright notice, this 33 | list of conditions and the following disclaimer in the documentation and/or 34 | other materials provided with the distribution. 35 | 36 | * Neither the name of the {organization} nor the names of its 37 | contributors may be used to endorse or promote products derived from 38 | this software without specific prior written permission. 39 | 40 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND 41 | ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 42 | WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 43 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR 44 | ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES 45 | (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 46 | LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON 47 | ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT 48 | (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS 49 | SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 50 | 51 | 52 | Individual Licenses for modules 53 | 54 | IDX Module Licensed under the Apache License, Version 2.0. Original Copyright @bbaskin 55 | pymacho Licensed under the GNU License Copyright 2013 Jérémie BOUTOILLE -------------------------------------------------------------------------------- /Pipeline/Coordinator/README.md: -------------------------------------------------------------------------------- 1 | # Coordinator Module 2 | ## Overview 3 | The [Coordinator](./coordinator.py) module is the central component of the pipeline, responsible for coordinating file movement through various analysis modules and storing metadata. 
It mainly comprises two containers: a PostgreSQL database container for metadata storage and a Python container running a Flask web server for database access. 4 | Both containers use persistent volumes for data storage, and Flask-SQLAlchemy facilitates interaction with the database via Object-Relational Mapping (ORM). 5 | These containers can be started through the [docker-compose](./docker-compose.yml) file. 6 | 7 | ## Database Schema and ORM 8 | The database schema, defined in the Database Markup Language (DBML) file [databaseDefinition.dbml](./databaseDefinition.dbml), consists of 14 tables to manage a diverse dataset. 9 | DBML simplifies database design by abstracting complexities irrelevant to the schema design. The schema is developed and converted to a PostgreSQL definition using dbdiagram.io. 10 | The ORM model, created with Flask-SQLAlchemy, translates the DBML definition into SQLAlchemy ORM, manually addressing many-to-many relationships and unique multi-indexes. 11 | 12 | ## Identifying Vulnerable Drivers 13 | The Coordinator module identifies known vulnerable drivers by aggregating data from Microsoft recommended driver block rules, loldrivers.io, and a manually compiled list of vulnerable drivers using the script in [knownVulnerableDrivers](./knownVulnerableDrivers/). 14 | 15 | ## Exposed Endpoints 16 | The Coordinator module exposes HTTP endpoints for various tasks within the pipeline. These endpoints are categorized into note-taking, file operations, origin file operations, extractions, certificate operations, pathing operations, fuzzing operations, fuzzing queue management, driver filtering, and miscellaneous tasks. Each category supports specific functions, such as updating file information, managing origin files, and coordinating fuzzing processes, ensuring efficient pipeline operation and data retrieval. 17 | 18 | ## License 19 | This project is under [GPLv3](../../LICENSE), but [peresults.py](./peresults.py) is based on another published work and hence has its own license at [LICENSE_peresults](./LICENSE_peresults). -------------------------------------------------------------------------------- /Pipeline/Coordinator/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3.8" 2 | 3 | services: 4 | coordinator-db: 5 | image: postgres:latest 6 | restart: always 7 | container_name: coordinator-db 8 | environment: 9 | POSTGRES_PASSWORD: POSTGRES_PASSWORD 10 | POSTGRES_DB: pipeline 11 | POSTGRES_USER: pipeline 12 | volumes: 13 | - ./storage/postgres:/var/lib/postgresql 14 | - ./storage/postgres/data:/var/lib/postgresql/data 15 | ports: # only for evaluation and debugging purposes 16 | - "5432:5432" 17 | healthcheck: 18 | test: ["CMD-SHELL", "pg_isready -U pipeline"] 19 | interval: 5s 20 | timeout: 5s 21 | retries: 5 22 | 23 | coordinator: 24 | container_name: coordinator 25 | build: . 
26 | volumes: 27 | - .:/app 28 | - ./storage/files:/storage/files 29 | - ./storage/uploads:/storage/uploads 30 | ports: 31 | - "5000:5000" 32 | restart: unless-stopped 33 | depends_on: 34 | coordinator-db: 35 | condition: service_healthy 36 | 37 | coordinator-db-admin: 38 | container_name: coordinator-db-admin 39 | image: dpage/pgadmin4 40 | environment: 41 | - PGADMIN_DEFAULT_EMAIL=user@test.com 42 | - PGADMIN_DEFAULT_PASSWORD=test 43 | ports: 44 | - "5050:80" 45 | depends_on: 46 | coordinator-db: 47 | condition: service_healthy -------------------------------------------------------------------------------- /Pipeline/Coordinator/interestingFunctions.csv: -------------------------------------------------------------------------------- 1 | function, value 2 | MmMapIoSpace, 5 3 | MmMapIoSpaceEx, 5 4 | IoCreateSymbolicLink, 5 5 | WdfDeviceCreateSymbolicLink, 5 6 | ZwMapViewOfSection, 4 7 | ZwMapViewOfSectionEx, 4 8 | IoCreateDevice, 3 9 | WdfDeviceCreateDeviceInterface, 3 10 | WdfIoQueueCreate, 3 11 | ZwOpenSection, 3 12 | IoAllocateMdl, 3 13 | SeSinglePrivilegeCheck, 3 14 | SePrivilegeCheck, 3 15 | MmInitializeMdl, 3 16 | WdfMemoryGetBuffer, 3 17 | MmBuildMdlForNonPagedPool, 1 18 | MmProbeAndLockPages, 1 19 | MmMapLockedPagesSpecifyCache, 1 20 | MmMapLockedPages, 1 21 | MmGetPhysicalAddress, 2 22 | KeStackAttachProcess, 2 23 | WdfControlDeviceInitAllocate, 2 24 | IoGetRequestorProcessId, 1 25 | PsGetCurrentProcessId, 1 26 | MmGetSystemRoutineAddress, 1 27 | MmMapMemoryDumpMdl, 2 28 | ZwUnmapViewOfSection, 1 29 | MmUnmapIoSpace, 2 30 | IoFreeMdl, 1 31 | WdfDeviceInitSetIoInCallerContextCallback, 3 32 | WdfDeviceInitAssignWdmIrpPreprocessCallback, 2 33 | WdmlibIoCreateDeviceSecure, 2 34 | WdfIoTargetOpen, 2 35 | WdfIoTargetSendInternalIoctlSynchronously, 1 36 | WdfChildListAddOrUpdateChildDescriptionAsPresent, 1 37 | WdfRequestRetrieveUnsafeUserOutputBuffer, 1 38 | WdfRequestProbeAndLockUserBufferForWrite, 1 39 | WdfRequestRetrieveUnsafeUserInputBuffer, 1 40 | WdfRequestProbeAndLockUserBufferForRead, 1 41 | memcpy, 2 42 | MmCopyMemory, 1 43 | MmCopyVirtualMemory, 1 44 | memset, 1 45 | memmove, 2 -------------------------------------------------------------------------------- /Pipeline/Coordinator/knownVulnerableDrivers/requirements.txt: -------------------------------------------------------------------------------- 1 | requests -------------------------------------------------------------------------------- /Pipeline/Coordinator/openThroughRedirector.sh: -------------------------------------------------------------------------------- 1 | #! /bin/bash 2 | 3 | # This script opens the reverse SSH tunnel that exposes the coordinator through the redirector 4 | # don't forget to open the correct firewall ports 5 | 6 | # CONFIGURATION 7 | REDIRECTOR_IP=TODO 8 | SSH_PRIVATE_KEY_PATH=./sshKey/id_redirector 9 | 10 | # RUNNING 11 | # open the reverse portforward from a screen session 12 | # remember to configure the redirector correctly: https://serverfault.com/questions/861909/ssh-r-make-target-host-accept-connection-on-all-interfaces 13 | # when nothing comes through, the tunnel dies (prob. 
bc of firewalls) --> autossh 14 | RPW_CMD="autossh -M 20000 -N -T -R 0.0.0.0:5000:localhost:5000 -i $SSH_PRIVATE_KEY_PATH root@$REDIRECTOR_IP" 15 | echo "Running in screen: $RPW_CMD" 16 | screen -dmS redirector $RPW_CMD -------------------------------------------------------------------------------- /Pipeline/Coordinator/peresults.py: -------------------------------------------------------------------------------- 1 | # Heavily based on https://github.com/malice-plugins/pescan/blob/master/malice/__init__.py#L40 2 | 3 | from os import path 4 | import pefile 5 | import subprocess 6 | 7 | class PeResults(object): 8 | 9 | def __init__(self, file_path): 10 | self.file = file_path 11 | self.results = {} 12 | self.result_sections = None 13 | if not path.exists(self.file): 14 | raise Exception("file does not exist: {}".format(self.file)) 15 | 16 | def imports(self, pef): 17 | imports = [] 18 | if hasattr(pef, 'DIRECTORY_ENTRY_IMPORT') and len(pef.DIRECTORY_ENTRY_IMPORT) > 0: 19 | for entry in pef.DIRECTORY_ENTRY_IMPORT: 20 | try: 21 | if isinstance(entry.dll, bytes): 22 | dll = entry.dll.decode() 23 | else: 24 | dll = entry.dll 25 | dlls = {dll: []} 26 | for symbol in entry.imports: 27 | if isinstance(symbol.name, bytes): 28 | name = symbol.name.decode() 29 | else: 30 | name = symbol.name 31 | dlls[dll].append(dict(address=hex(symbol.address), name=name)) 32 | imports.append(dlls) 33 | except Exception: 34 | continue 35 | self.results['imports'] = imports 36 | 37 | def strings(self): 38 | # run the `strings -e l ` command on the file 39 | strings = subprocess.check_output(["strings", "-e", "l", self.file]) 40 | self.results['strings'] = strings.decode().split('\n') 41 | 42 | def run_analysis(self): 43 | try: 44 | self.strings() 45 | pef = pefile.PE(self.file) 46 | self.imports(pef) 47 | self.results['imphash'] = pef.get_imphash() 48 | except Exception as e: 49 | print("[E] Unable to parse PE file: {0}".format(e)) 50 | 51 | return self.results -------------------------------------------------------------------------------- /Pipeline/Coordinator/requirements.txt: -------------------------------------------------------------------------------- 1 | blinker==1.7.0 2 | click==8.1.7 3 | Flask==3.0.2 4 | Flask-SQLAlchemy==3.1.1 5 | greenlet==3.0.3 6 | itsdangerous==2.1.2 7 | Jinja2==3.1.3 8 | MarkupSafe==2.1.5 9 | pydbml==1.0.10 10 | pyparsing==3.1.2 11 | SQLAlchemy==2.0.29 12 | typing_extensions==4.10.0 13 | Werkzeug==3.0.1 14 | pefile 15 | flask-cors 16 | psycopg2-binary -------------------------------------------------------------------------------- /Pipeline/Coordinator/storage/files/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Coordinator/storage/files/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Coordinator/storage/postgres/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Coordinator/storage/postgres/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Coordinator/storage/uploads/.gitkeep: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Coordinator/storage/uploads/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Frontender/.eslintrc.json: -------------------------------------------------------------------------------- 1 | { 2 | "extends": "next/core-web-vitals" 3 | } 4 | -------------------------------------------------------------------------------- /Pipeline/Frontender/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM node:18-alpine 2 | 3 | WORKDIR /app 4 | 5 | # COPY package.json ./ 6 | # RUN npm install 7 | # COPY . . 8 | 9 | CMD ["npm", "run", "dev"] -------------------------------------------------------------------------------- /Pipeline/Frontender/LICENSE_nextjs: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2024 Vercel, Inc. 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: 6 | 7 | The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 8 | 9 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. -------------------------------------------------------------------------------- /Pipeline/Frontender/LICENSE_shadcnui: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 shadcn 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
-------------------------------------------------------------------------------- /Pipeline/Frontender/README.md: -------------------------------------------------------------------------------- 1 | # Frontender Module: Security Researcher Interaction 2 | Frontender is a module designed to assist security researchers by presenting driver analysis results in a navigable format. It is part of a pipeline that collects and analyzes drivers to identify potential vulnerabilities through static and dynamic methods. The Frontender provides a website that allows researchers to verify these results, modify driver metadata, and add jobs to the fuzzing queue. 3 | 4 | Frontender streamlines the process of verifying driver vulnerabilities, providing a comprehensive tool for security researchers to manage and analyze driver data effectively. 5 | 6 | ## Technical Overview 7 | Frontender runs as a containerized Node.js (version 18) server hosting a React website written in TypeScript. It uses shadcn/ui components for styling and Vercel's SWR (stale-while-revalidate) library for client-side data fetching and HTTP caching to enhance the user experience. 8 | 9 | ## Website Features 10 | The website features five main tabs for displaying pipeline data: 11 | 12 | - `Drivers Tab`: Displays all drivers in the database with filtering and search options. 13 | - `Drivers by Origin Tab`: Restricts the displayed drivers to those from a specific origin. 14 | - `Drivers by Imports Tab`: Shows drivers based on the imported functions specified in the search field. 15 | - `Fuzzing Queue Tab`: Displays tables of drivers currently being fuzzed, those that finished or encountered errors, and those still in the queue. 16 | - `Known Vulnerable Drivers Tab`: Lists known vulnerable drivers in a table format similar to the Drivers tab. 17 | 18 | Each driver can be inspected in detail by clicking its filename, revealing metadata, identification results, certificate details, and fuzzing statistics. Users can modify driver tags, add drivers to the fuzzing queue, and download drivers. 19 | 20 | ## Learn More about Next.js 21 | This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app). 22 | 23 | To learn more about Next.js, take a look at the following resources: 24 | 25 | - [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API. 26 | - [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial. 27 | 28 | You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome! 29 | 30 | ## License 31 | This project is under [GPLv3](../../LICENSE), but any of the shadcn/ui elements are licensed under [LICENSE_shadcnui](./LICENSE_shadcnui) and similarly any Next.js elements under [LICENSE_nextjs](./LICENSE_nextjs).
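As a supplement to the overview above, here is a minimal, hypothetical sketch of the SWR fetch pattern the site follows. Only the `/drivers` endpoint, the `data.drivers` response shape, and the `COORDINATOR_IP` placeholder are taken from `data-fetcher.tsx` further below; the component itself and its markup are illustrative, not the actual page code.

```tsx
"use client";

import useSWR from "swr";

// Same placeholder backend URL as in data-fetcher.tsx.
const backendUrl = "http://COORDINATOR_IP:5000";
const fetcher = (url: string) => fetch(url).then((res) => res.json());

// Hypothetical, stripped-down version of the Drivers tab.
export default function DriversSketch() {
  const { data, error, isLoading } = useSWR(`${backendUrl}/drivers`, fetcher);

  if (isLoading) return <div>Loading...</div>;
  if (error) return <div>Error loading drivers.</div>;

  return (
    <ul>
      {data.drivers.map((d: { id: number; filename: string }) => (
        <li key={d.id}>{d.filename}</li>
      ))}
    </ul>
  );
}
```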
-------------------------------------------------------------------------------- /Pipeline/Frontender/components.json: -------------------------------------------------------------------------------- 1 | { 2 | "$schema": "https://ui.shadcn.com/schema.json", 3 | "style": "default", 4 | "rsc": true, 5 | "tsx": true, 6 | "tailwind": { 7 | "config": "tailwind.config.ts", 8 | "css": "src/app/globals.css", 9 | "baseColor": "slate", 10 | "cssVariables": true, 11 | "prefix": "" 12 | }, 13 | "aliases": { 14 | "components": "@/components", 15 | "utils": "@/lib/utils" 16 | } 17 | } -------------------------------------------------------------------------------- /Pipeline/Frontender/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3.8" 2 | 3 | services: 4 | frontender: 5 | build: . 6 | volumes: 7 | - .:/app 8 | ports: 9 | - "3000:3000" -------------------------------------------------------------------------------- /Pipeline/Frontender/next-env.d.ts: -------------------------------------------------------------------------------- 1 | /// 2 | /// 3 | 4 | // NOTE: This file should not be edited 5 | // see https://nextjs.org/docs/basic-features/typescript for more information. 6 | -------------------------------------------------------------------------------- /Pipeline/Frontender/next.config.mjs: -------------------------------------------------------------------------------- 1 | /** @type {import('next').NextConfig} */ 2 | const nextConfig = {}; 3 | 4 | export default nextConfig; 5 | -------------------------------------------------------------------------------- /Pipeline/Frontender/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "frontender", 3 | "version": "0.1.0", 4 | "private": true, 5 | "scripts": { 6 | "dev": "next dev", 7 | "build": "next build", 8 | "start": "next start", 9 | "lint": "next lint" 10 | }, 11 | "dependencies": { 12 | "@radix-ui/react-dialog": "^1.0.5", 13 | "@radix-ui/react-dropdown-menu": "^2.0.6", 14 | "@radix-ui/react-hover-card": "^1.0.7", 15 | "@radix-ui/react-icons": "^1.3.0", 16 | "@radix-ui/react-label": "^2.0.2", 17 | "@radix-ui/react-menubar": "^1.0.4", 18 | "@radix-ui/react-navigation-menu": "^1.1.4", 19 | "@radix-ui/react-radio-group": "^1.1.3", 20 | "@radix-ui/react-select": "^2.0.0", 21 | "@radix-ui/react-separator": "^1.0.3", 22 | "@radix-ui/react-slot": "^1.0.2", 23 | "@radix-ui/react-toast": "^1.1.5", 24 | "@tanstack/react-table": "^8.16.0", 25 | "class-variance-authority": "^0.7.0", 26 | "clsx": "^2.1.0", 27 | "lucide-react": "^0.368.0", 28 | "next": "14.2.1", 29 | "next-themes": "^0.3.0", 30 | "react": "^18", 31 | "react-dom": "^18", 32 | "swr": "^2.2.5", 33 | "tailwind-merge": "^2.2.2", 34 | "tailwindcss-animate": "^1.0.7" 35 | }, 36 | "devDependencies": { 37 | "@types/node": "^20", 38 | "@types/react": "^18", 39 | "@types/react-dom": "^18", 40 | "eslint": "^8", 41 | "eslint-config-next": "14.2.1", 42 | "postcss": "^8", 43 | "tailwindcss": "^3.4.1", 44 | "typescript": "^5" 45 | } 46 | } 47 | -------------------------------------------------------------------------------- /Pipeline/Frontender/postcss.config.mjs: -------------------------------------------------------------------------------- 1 | /** @type {import('postcss-load-config').Config} */ 2 | const config = { 3 | plugins: { 4 | tailwindcss: {}, 5 | }, 6 | }; 7 | 8 | export default config; 9 | -------------------------------------------------------------------------------- 
/Pipeline/Frontender/src/app/driver/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { getDriver } from '@/components/data-fetcher'; 5 | import { Driver } from '@/types/Driver' 6 | import DriverComponent from "@/components/ui/driver"; 7 | import { useSearchParams } from 'next/navigation' 8 | 9 | export default function DriverPage() { 10 | const searchParams = useSearchParams() 11 | const driverID = searchParams.get('id') ? parseInt(searchParams.get('id') as string) : -1; 12 | 13 | let driver: Driver | null = null; 14 | driver = getDriver(driverID).driver; 15 | let error: boolean = getDriver(driverID).isError; 16 | 17 | return ( 18 |
19 | 20 | {/* center the table and give it a border */} 21 | {getDriver(driverID).isLoading &&

Loading...

} 22 | {error &&

Error in getting this specific driver!

} 23 | 24 | {!error && driver && 25 |
26 | 27 |
28 | } 29 |
30 | ); 31 | } 32 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/drivers/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { allDrivers } from '@/components/data-fetcher'; 5 | 6 | import { columns } from "@/types/Driver" 7 | import { DataTable } from "@/components/ui/data-table" 8 | 9 | export default function DriversPage() { 10 | return ( 11 |
12 | 13 | {/* center the table and give it a border */} 14 | {allDrivers().isLoading &&

Loading...

} 15 | {allDrivers().isError &&

Error: {allDrivers().isError}

} 16 | 17 | {allDrivers().drivers && 18 |
19 | 29 |
30 | } 31 |
32 | ); 33 | } 34 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/favicon.ico: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Frontender/src/app/favicon.ico -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/fuzzing-queue/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { fuzzingQueue } from '@/components/data-fetcher'; 5 | 6 | import { DataTable } from "@/components/ui/data-table" 7 | import { columns, FuzzingQueueElem } from "@/types/FuzzingQueue" 8 | 9 | export default function FuzzingQueuePage() { 10 | let fuzzingQueued: FuzzingQueueElem[] | null = null; 11 | let fuzzingDone: FuzzingQueueElem[] | null = null; 12 | let fuzzingRunning: FuzzingQueueElem[] | null = null; 13 | let res = fuzzingQueue(); 14 | if (!res.isError) { 15 | fuzzingQueued = res.queued; 16 | fuzzingDone = res.done; 17 | fuzzingRunning = res.running; 18 | } else { 19 | console.error(res.isError); 20 | } 21 | 22 | return ( 23 |
24 | 25 | {/* center the table and give it a border */} 26 | {!res.isError && !fuzzingQueued &&

Loading...

} 27 | {res.isError &&

Error: {res.isError}

} 28 | 29 | {/* Fuzzing currently running */} 30 | {!res.isError && fuzzingRunning && fuzzingRunning.length === 0 &&

No fuzzing jobs currently running.

} 31 | {!res.isError && fuzzingRunning && fuzzingRunning.length > 0 && 32 |
33 | 42 |
43 | } 44 | 45 | {/* Fuzzing done */} 46 | {!res.isError && fuzzingDone && fuzzingDone.length === 0 &&

No fuzzing jobs done.

} 47 | {!res.isError && fuzzingDone && fuzzingDone.length > 0 && 48 |
49 | 58 |
59 | } 60 | 61 | {/* Fuzzing queue*/} 62 | {!res.isError && fuzzingQueued && fuzzingQueued.length === 0 &&

No fuzzing jobs queued.

} 63 | {!res.isError && fuzzingQueued && fuzzingQueued.length > 0 && 64 |
65 | 75 |
76 | } 77 |
78 | ); 79 | } 80 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/fuzzing-results/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { getDriver } from '@/components/data-fetcher'; 5 | import { Driver } from '@/types/Driver' 6 | import DriverComponent from "@/components/ui/driver"; 7 | import { useSearchParams } from 'next/navigation' 8 | 9 | export default function DriverFuzzingPage() { 10 | const searchParams = useSearchParams() 11 | const driverID = searchParams.get('id') ? parseInt(searchParams.get('id') as string) : -1; 12 | 13 | let driver: Driver | null = null; 14 | driver = getDriver(driverID).driver; 15 | let error: boolean = getDriver(driverID).isError; 16 | 17 | return ( 18 |
19 | 20 | {/* center the table and give it a border */} 21 | {getDriver(driverID).isLoading &&

Loading...

} 22 | {error &&

Error in getting this specific driver!

} 23 | 24 | {!error && driver && 25 |
26 | 27 |
28 | } 29 |
30 | ); 31 | } 32 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/globals.css: -------------------------------------------------------------------------------- 1 | @tailwind base; 2 | @tailwind components; 3 | @tailwind utilities; 4 | 5 | @layer base { 6 | :root { 7 | --background: 0 0% 100%; 8 | --foreground: 222.2 84% 4.9%; 9 | 10 | --card: 0 0% 100%; 11 | --card-foreground: 222.2 84% 4.9%; 12 | 13 | --popover: 0 0% 100%; 14 | --popover-foreground: 222.2 84% 4.9%; 15 | 16 | --primary: 222.2 47.4% 11.2%; 17 | --primary-foreground: 210 40% 98%; 18 | 19 | --secondary: 210 40% 96.1%; 20 | --secondary-foreground: 222.2 47.4% 11.2%; 21 | 22 | --muted: 210 40% 96.1%; 23 | --muted-foreground: 215.4 16.3% 46.9%; 24 | 25 | --accent: 210 40% 96.1%; 26 | --accent-foreground: 222.2 47.4% 11.2%; 27 | 28 | --destructive: 0 84.2% 60.2%; 29 | --destructive-foreground: 210 40% 98%; 30 | 31 | --border: 214.3 31.8% 91.4%; 32 | --input: 214.3 31.8% 91.4%; 33 | --ring: 222.2 84% 4.9%; 34 | 35 | --radius: 0.5rem; 36 | } 37 | 38 | .dark { 39 | --background: 222.2 84% 4.9%; 40 | --foreground: 210 40% 98%; 41 | 42 | --card: 222.2 84% 4.9%; 43 | --card-foreground: 210 40% 98%; 44 | 45 | --popover: 222.2 84% 4.9%; 46 | --popover-foreground: 210 40% 98%; 47 | 48 | --primary: 210 40% 98%; 49 | --primary-foreground: 222.2 47.4% 11.2%; 50 | 51 | --secondary: 217.2 32.6% 17.5%; 52 | --secondary-foreground: 210 40% 98%; 53 | 54 | --muted: 217.2 32.6% 17.5%; 55 | --muted-foreground: 215 20.2% 65.1%; 56 | 57 | --accent: 217.2 32.6% 17.5%; 58 | --accent-foreground: 210 40% 98%; 59 | 60 | --destructive: 0 62.8% 30.6%; 61 | --destructive-foreground: 210 40% 98%; 62 | 63 | --border: 217.2 32.6% 17.5%; 64 | --input: 217.2 32.6% 17.5%; 65 | --ring: 212.7 26.8% 83.9%; 66 | } 67 | } 68 | 69 | @layer base { 70 | * { 71 | @apply border-border; 72 | } 73 | body { 74 | @apply bg-background text-foreground; 75 | } 76 | } -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/ida-pathing/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { getPathing } from '@/components/data-fetcher'; 5 | import { useSearchParams } from 'next/navigation' 6 | import { PathingResult } from "@/types/Pathing"; 7 | 8 | import FunctionTree from "@/components/ida-pathing"; 9 | 10 | export default function PathingPage() { 11 | const searchParams = useSearchParams() 12 | const driverID = searchParams.get('id') ? parseInt(searchParams.get('id') as string) : -1; 13 | 14 | let pathing: PathingResult | null = null; 15 | pathing = getPathing(driverID).pathing; 16 | let error: boolean = getPathing(driverID).isError; 17 | 18 | return ( 19 |
20 | 21 | {/* center the table and give it a border */} 22 | {getPathing(driverID).isLoading &&

Loading...

} 23 | {!error && pathing && 24 |
25 |

Type {pathing.type} return code {pathing.ret_code} at {pathing.created_at}.

26 |

Total of {pathing.ioctl_comp.length} probable IOCTL codes found.

27 |
28 | } 29 | {!error && pathing && 30 |
31 |

Following paths were found:

32 | {pathing.handler_addrs && 33 | 34 | } 35 |
36 | } 37 |
38 | ); 39 | } 40 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/import-drivers/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { importDrivers } from '@/components/data-fetcher'; 5 | import { Input } from "@/components/ui/input" 6 | 7 | import { columns } from "@/types/Driver" 8 | import { DataTable } from "@/components/ui/data-table" 9 | import { useState } from "react"; 10 | 11 | export default function DriversPage() { 12 | const [search, searchSetter] = useState("MmMapIoSpace"); 13 | 14 | const drivers = importDrivers(search); 15 | 16 | return ( 17 |
18 | 19 | Drivers by Import 20 |
21 | { 25 | if (event.target.value.length > 0) 26 | searchSetter(event.target.value) 27 | else { 28 | searchSetter("MmMapIoSpace") 29 | } 30 | } 31 | } 32 | className="max-w-sm" 33 | /> 34 |
35 | 36 | {/* center the table and give it a border */} 37 | {drivers.isLoading &&

Loading...

} 38 | {drivers.isError &&

Error: {drivers.isError.toString()}

} 39 | 40 | {!drivers.isLoading && drivers.drivers.length > 0 && 41 |
42 | 52 |
53 | } 54 |
55 | ); 56 | } 57 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/known-vulnerable-drivers/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { getVulnerableDrivers } from '@/components/data-fetcher'; 5 | 6 | import { columns } from "@/types/KnownVulnerable" 7 | import { DataTable } from "@/components/ui/data-table" 8 | 9 | export default function DriversPage() { 10 | let knownVulnerableDrivers = getVulnerableDrivers().drivers; 11 | 12 | return ( 13 |
14 | 15 | {/* center the table and give it a border */} 16 | {/* {isLoading &&

Loading...

} 17 | {isError &&

Error: {knownVulnerableDrivers}

} */} 18 | 19 | {knownVulnerableDrivers && 20 |
21 | 22 |
23 | } 24 |
25 | ); 26 | } 27 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/layout.tsx: -------------------------------------------------------------------------------- 1 | import type { Metadata } from "next"; 2 | import { Inter } from "next/font/google"; 3 | import { ThemeProvider } from "@/components/theme-provider" 4 | import "./globals.css"; 5 | 6 | const inter = Inter({ subsets: ["latin"] }); 7 | 8 | export const metadata: Metadata = { 9 | title: "Frontender", 10 | description: "Frontend to search Vulnerable Driver Database", 11 | }; 12 | 13 | export default function RootLayout({ 14 | children, 15 | }: Readonly<{ 16 | children: React.ReactNode; 17 | }>) { 18 | return ( 19 | 20 | 21 | 27 | {children} 28 | 29 | 30 | 31 | ); 32 | } 33 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/origin-drivers/page.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import { NavBar } from "@/components/navbar"; 4 | import { originDrivers } from '@/components/data-fetcher'; 5 | import { Input } from "@/components/ui/input" 6 | 7 | import { columns } from "@/types/Driver" 8 | import { DataTable } from "@/components/ui/data-table" 9 | import { useState } from "react"; 10 | 11 | export default function DriversPage() { 12 | const [search, searchSetter] = useState("CDC"); 13 | 14 | const drivers = originDrivers(search); 15 | 16 | return ( 17 |
18 | 19 | Drivers by Origin 20 |
21 | { 25 | if (event.target.value.length > 0) 26 | searchSetter(event.target.value) 27 | else { 28 | searchSetter("CDC") 29 | } 30 | } 31 | } 32 | className="max-w-sm" 33 | /> 34 |
35 | 36 | {/* center the table and give it a border */} 37 | {drivers.isLoading &&

Loading...

} 38 | {drivers.isError &&

Error: {drivers.isError.toString()}

} 39 | 40 | {!drivers.isLoading && drivers.drivers.length > 0 && 41 |
42 | 52 |
53 | } 54 |
55 | ); 56 | } 57 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/app/page.tsx: -------------------------------------------------------------------------------- 1 | import { NavBar } from "@/components/navbar"; 2 | import DriverComponent from "@/components/ui/driver"; 3 | import { getDriver } from "@/components/data-fetcher"; 4 | 5 | import { allDrivers } from '@/components/data-fetcher'; 6 | 7 | const testID = 100; 8 | 9 | export default function Home() { 10 | return ( 11 |
12 | 13 | 14 |
15 | ); 16 | } 17 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/data-fetcher.tsx: -------------------------------------------------------------------------------- 1 | "use client"; 2 | 3 | import useSWR from 'swr' 4 | import { DriverOverview, Driver } from '@/types/Driver' 5 | import { PathingResult, Path } from '@/types/Pathing' 6 | import { VulnerableDriver } from '@/types/KnownVulnerable' 7 | import { FuzzingQueueElem } from '@/types/FuzzingQueue'; 8 | 9 | const fetcher = (input:string | URL | Request) => fetch(input).then(res => res.json()) 10 | 11 | const backendUrl = "http://COORDINATOR_IP:5000" 12 | // TODO FIX the blocked: mixed content issue, i.e. make the backend serve https 13 | 14 | export function allDrivers (): { drivers: DriverOverview[], isLoading: boolean, isError: any } { 15 | const { data, error, isLoading } = useSWR(`${backendUrl}/drivers`, fetcher) 16 | 17 | if (!isLoading && !error) { 18 | return { 19 | drivers: data.drivers, 20 | isLoading, 21 | isError: error 22 | } 23 | } 24 | 25 | return { 26 | drivers: [], 27 | isLoading, 28 | isError: error 29 | } 30 | } 31 | 32 | export function originDrivers (search:string): { drivers: DriverOverview[], isLoading: boolean, isError: any } { 33 | const { data, error, isLoading } = useSWR(`${backendUrl}/drivers-filter/origin/${search}`, fetcher) 34 | 35 | if (!isLoading && !error) { 36 | return { 37 | drivers: data.drivers, 38 | isLoading, 39 | isError: error 40 | } 41 | } 42 | 43 | return { 44 | drivers: [], 45 | isLoading, 46 | isError: error 47 | } 48 | } 49 | 50 | export function importDrivers (search:string): { drivers: DriverOverview[], isLoading: boolean, isError: any } { 51 | const { data, error, isLoading } = useSWR(`${backendUrl}/drivers-filter/imports/${search}`, fetcher) 52 | 53 | if (!isLoading && !error) { 54 | return { 55 | drivers: data.drivers, 56 | isLoading, 57 | isError: error 58 | } 59 | } 60 | 61 | return { 62 | drivers: [], 63 | isLoading, 64 | isError: error 65 | } 66 | } 67 | 68 | export function getDriver (id: number) : { driver: Driver, isLoading: boolean, isError: any }{ 69 | const { data, error, isLoading } = useSWR(`${backendUrl}/drivers/${id}`, fetcher) 70 | 71 | if (!isLoading && !error) { 72 | return { 73 | driver: data.driver, 74 | isLoading, 75 | isError: false 76 | } 77 | } 78 | 79 | return { 80 | driver: data, 81 | isLoading, 82 | isError: true 83 | } 84 | } 85 | 86 | export function getPathing (id: number) : { pathing: PathingResult, isLoading: boolean, isError: any }{ 87 | const { data, error, isLoading } = useSWR(`${backendUrl}/driver-paths/${id}`, fetcher) 88 | 89 | if (!isLoading && !error && data.path) { 90 | 91 | return { 92 | pathing: data.path, 93 | isLoading, 94 | isError: false 95 | } 96 | } 97 | 98 | return { 99 | pathing: data, 100 | isLoading, 101 | isError: true 102 | } 103 | } 104 | 105 | export function getPossibleTags () : { tags: string[], isLoading: boolean, isError: any }{ 106 | const { data, error, isLoading } = useSWR(`${backendUrl}/driver-tags`, fetcher) 107 | 108 | if (!isLoading && !error && data.tags) { 109 | 110 | return { 111 | tags: data.tags, 112 | isLoading, 113 | isError: false 114 | } 115 | } 116 | 117 | return { 118 | tags: data, 119 | isLoading, 120 | isError: true 121 | } 122 | } 123 | 124 | export function getVulnerableDrivers () : { drivers: VulnerableDriver[], isLoading: boolean, isError: any }{ 125 | const { data, error, isLoading } = useSWR(`${backendUrl}/known-vulnerable-list`, 
fetcher) 126 | 127 | if (!isLoading && !error && data.drivers) { 128 | 129 | return { 130 | drivers: data.drivers, 131 | isLoading, 132 | isError: false 133 | } 134 | } 135 | 136 | return { 137 | drivers: data, 138 | isLoading, 139 | isError: true 140 | } 141 | } 142 | 143 | export function fuzzingQueue () : { queued: FuzzingQueueElem[], done: FuzzingQueueElem[], running: FuzzingQueueElem[], isLoading: boolean, isError: any }{ 144 | const { data, error, isLoading } = useSWR(`${backendUrl}/fuzzing-queue`, fetcher) 145 | 146 | if (!isLoading && !error && data.queued) { 147 | 148 | return { 149 | queued: data.queued, 150 | done: data.done.concat(data.errored), 151 | running: data.running, 152 | isLoading, 153 | isError: false 154 | } 155 | } 156 | 157 | return { 158 | queued: data, 159 | done: data, 160 | running: data, 161 | isLoading, 162 | isError: true 163 | } 164 | } 165 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/data-pusher.tsx: -------------------------------------------------------------------------------- 1 | const backendUrl = "http://COORDINATOR_IP:5000" 2 | // TODO FIX the blocked: mixed content issue, i.e. make the backend serve https 3 | 4 | export function updateFilename(id: number, newFilename: string) { 5 | const response = fetch(`${backendUrl}/files/${id}`, { 6 | method: 'PUT', 7 | headers: { 8 | 'Content-Type': 'application/json', 9 | }, 10 | body: JSON.stringify({ 11 | 'filename': newFilename, 12 | }), 13 | }); 14 | 15 | return response 16 | } 17 | 18 | export function updateDriverTag(id: number, newTag: string) { 19 | const response = fetch(`${backendUrl}/driver-tags/${id}`, { 20 | method: 'PUT', 21 | headers: { 22 | 'Content-Type': 'application/json', 23 | }, 24 | body: JSON.stringify({ 25 | 'tag': newTag, 26 | }), 27 | }); 28 | 29 | return response 30 | } 31 | 32 | import { FuzzingQueueElem } from '@/types/FuzzingQueue' 33 | export function addFuzzingQueue(queueElem: FuzzingQueueElem) { 34 | let data : { [id: string] : string | number | number[] } = { 35 | 'driver': queueElem.driver, 36 | 'priority': queueElem.priority, 37 | //'seeds': queueElem.seeds, // TODO: Implement seeds 38 | }; 39 | if ( queueElem.max_runtime ) { 40 | data['max_runtime'] = queueElem.max_runtime 41 | } 42 | if ( queueElem.max_last_crash ) { 43 | data['max_last_crash'] = queueElem.max_last_crash 44 | } 45 | if ( queueElem.max_last_any ) { 46 | data['max_last_any'] = queueElem.max_last_any 47 | } 48 | if ( queueElem.dos_device_str ) { 49 | data['dos_device_str'] = queueElem.dos_device_str 50 | } 51 | 52 | const response = fetch(`${backendUrl}/fuzzing-queue`, { 53 | method: 'POST', 54 | headers: { 55 | 'Content-Type': 'application/json', 56 | }, 57 | body: JSON.stringify(data), 58 | }); 59 | 60 | return response 61 | } 62 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ida-pathing.tsx: -------------------------------------------------------------------------------- 1 | import React from 'react'; 2 | 3 | import { Path } from '@/types/Pathing'; 4 | import { 5 | HoverCard, 6 | HoverCardContent, 7 | HoverCardTrigger, 8 | } from "@/components/ui/hover-card" 9 | 10 | 11 | interface LocContext { 12 | context: string; 13 | ui: () => JSX.Element; 14 | } 15 | 16 | interface FuncNode { 17 | addr: number; 18 | name?: string; 19 | context?: LocContext; 20 | depth: number; 21 |
children: FuncNode[]; 22 | } 23 | 24 | function createContext(context: string, title:string): LocContext { 25 | /* returns a location context object containing the React.FC */ 26 | let lines = context.split('\n'); 27 | let maxWidthNecessary = Math.max(...lines.map(l => l.length)); 28 | console.log(maxWidthNecessary); 29 | return { 30 | context, 31 | ui: () => ( 32 | 33 | {title} 34 | 35 |
36 | {lines.map((line, idx) => ( 37 |
38 | {line} 39 |
40 | ))} 41 |
42 |
43 |
44 | ), 45 | }; 46 | } 47 | 48 | function makeFunctionTree(handlerAddr: number, paths: Path[], rootName?: string): FuncNode { 49 | const root: FuncNode = { addr: handlerAddr, depth: 0, children: [] }; 50 | root.name = rootName; 51 | 52 | for (const p of paths) { 53 | // First check if the path is relevant to this tree 54 | if (p.path[0] !== handlerAddr) { 55 | continue; 56 | } 57 | 58 | // Add all intermediate nodes 59 | let current: FuncNode | undefined = root; 60 | if (p.path.length > 1) { 61 | for (let i = 1; i < p.path.length; i++) { 62 | const addr = p.path[i]; 63 | let child : FuncNode | undefined = current.children.find(c => c.addr === addr); 64 | if (!child) { 65 | child = { 66 | addr, 67 | depth: current.depth + 1, 68 | children: [], 69 | }; 70 | current.children.push(child); 71 | } 72 | current = child; 73 | } 74 | } 75 | 76 | // Add the leaf node name 77 | if (current) { 78 | current.name = p.name; 79 | current.context = createContext(p.context, `0x${current.addr.toString(16)} ${current.name}`); 80 | } 81 | } 82 | 83 | return root; 84 | } 85 | 86 | interface FunctionTreeProps { 87 | handlerAddrs: number[]; 88 | paths: Path[]; 89 | } 90 | 91 | const renderNode = (node: FuncNode): JSX.Element => ( 92 |
93 | {/* add element on hover to show the context */} 94 | {node.name && node.context && node.context.ui()} 95 | {node.addr && !node.context && `0x${node.addr.toString(16)} ${node.name ? `(${node.name})` : ''}`} 96 | {node.children.map(child => renderNode(child))} 97 |
98 | ); 99 | 100 | const FunctionTree: React.FC = ({ handlerAddrs, paths }) => { 101 | let roots = handlerAddrs.map(addr => makeFunctionTree(addr, paths)); 102 | 103 | return ( 104 |
105 | {handlerAddrs && roots.map(root => renderNode(root))} 106 |
107 | ); 108 | }; 109 | 110 | export default FunctionTree; 111 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/mode-toggle.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import { Moon, Sun } from "lucide-react" 5 | import { useTheme } from "next-themes" 6 | 7 | import { Button } from "@/components/ui/button" 8 | import { 9 | DropdownMenu, 10 | DropdownMenuContent, 11 | DropdownMenuItem, 12 | DropdownMenuTrigger, 13 | } from "@/components/ui/dropdown-menu" 14 | 15 | export function ModeToggle() { 16 | const { setTheme } = useTheme() 17 | 18 | return ( 19 | 20 | 21 | 26 | 27 | 28 | setTheme("light")}> 29 | Light 30 | 31 | setTheme("dark")}> 32 | Dark 33 | 34 | setTheme("system")}> 35 | System 36 | 37 | 38 | 39 | ) 40 | } 41 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/theme-provider.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import { ThemeProvider as NextThemesProvider } from "next-themes" 5 | import { type ThemeProviderProps } from "next-themes/dist/types" 6 | 7 | export function ThemeProvider({ children, ...props }: ThemeProviderProps) { 8 | return {children} 9 | } 10 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/button.tsx: -------------------------------------------------------------------------------- 1 | import * as React from "react" 2 | import { Slot } from "@radix-ui/react-slot" 3 | import { cva, type VariantProps } from "class-variance-authority" 4 | 5 | import { cn } from "@/lib/utils" 6 | 7 | const buttonVariants = cva( 8 | "inline-flex items-center justify-center whitespace-nowrap rounded-md text-sm font-medium ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50", 9 | { 10 | variants: { 11 | variant: { 12 | default: "bg-primary text-primary-foreground hover:bg-primary/90", 13 | destructive: 14 | "bg-destructive text-destructive-foreground hover:bg-destructive/90", 15 | outline: 16 | "border border-input bg-background hover:bg-accent hover:text-accent-foreground", 17 | secondary: 18 | "bg-secondary text-secondary-foreground hover:bg-secondary/80", 19 | ghost: "hover:bg-accent hover:text-accent-foreground", 20 | link: "text-primary underline-offset-4 hover:underline", 21 | }, 22 | size: { 23 | default: "h-10 px-4 py-2", 24 | sm: "h-9 rounded-md px-3", 25 | lg: "h-11 rounded-md px-8", 26 | icon: "h-10 w-10", 27 | }, 28 | }, 29 | defaultVariants: { 30 | variant: "default", 31 | size: "default", 32 | }, 33 | } 34 | ) 35 | 36 | export interface ButtonProps 37 | extends React.ButtonHTMLAttributes, 38 | VariantProps { 39 | asChild?: boolean 40 | } 41 | 42 | const Button = React.forwardRef( 43 | ({ className, variant, size, asChild = false, ...props }, ref) => { 44 | const Comp = asChild ? 
Slot : "button" 45 | return ( 46 | 51 | ) 52 | } 53 | ) 54 | Button.displayName = "Button" 55 | 56 | export { Button, buttonVariants } 57 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/card.tsx: -------------------------------------------------------------------------------- 1 | import * as React from "react" 2 | 3 | import { cn } from "@/lib/utils" 4 | 5 | const Card = React.forwardRef< 6 | HTMLDivElement, 7 | React.HTMLAttributes 8 | >(({ className, ...props }, ref) => ( 9 |
17 | )) 18 | Card.displayName = "Card" 19 | 20 | const CardHeader = React.forwardRef< 21 | HTMLDivElement, 22 | React.HTMLAttributes 23 | >(({ className, ...props }, ref) => ( 24 |
29 | )) 30 | CardHeader.displayName = "CardHeader" 31 | 32 | const CardTitle = React.forwardRef< 33 | HTMLParagraphElement, 34 | React.HTMLAttributes 35 | >(({ className, ...props }, ref) => ( 36 |

44 | )) 45 | CardTitle.displayName = "CardTitle" 46 | 47 | const CardDescription = React.forwardRef< 48 | HTMLParagraphElement, 49 | React.HTMLAttributes 50 | >(({ className, ...props }, ref) => ( 51 |

56 | )) 57 | CardDescription.displayName = "CardDescription" 58 | 59 | const CardContent = React.forwardRef< 60 | HTMLDivElement, 61 | React.HTMLAttributes 62 | >(({ className, ...props }, ref) => ( 63 |

64 | )) 65 | CardContent.displayName = "CardContent" 66 | 67 | const CardFooter = React.forwardRef< 68 | HTMLDivElement, 69 | React.HTMLAttributes 70 | >(({ className, ...props }, ref) => ( 71 |
76 | )) 77 | CardFooter.displayName = "CardFooter" 78 | 79 | export { Card, CardHeader, CardFooter, CardTitle, CardDescription, CardContent } 80 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/column-header.tsx: -------------------------------------------------------------------------------- 1 | import { 2 | ArrowDownIcon, 3 | ArrowUpIcon, 4 | CaretSortIcon, 5 | EyeNoneIcon, 6 | } from "@radix-ui/react-icons" 7 | import { Column } from "@tanstack/react-table" 8 | 9 | import { cn } from "@/lib/utils" 10 | import { Button } from "@/components/ui/button" 11 | import { 12 | DropdownMenu, 13 | DropdownMenuContent, 14 | DropdownMenuItem, 15 | DropdownMenuSeparator, 16 | DropdownMenuTrigger, 17 | } from "@/components/ui/dropdown-menu" 18 | 19 | interface DataTableColumnHeaderProps 20 | extends React.HTMLAttributes { 21 | column: Column 22 | title: string 23 | } 24 | 25 | export function DataTableColumnHeader({ 26 | column, 27 | title, 28 | className, 29 | }: DataTableColumnHeaderProps) { 30 | if (!column.getCanSort()) { 31 | return
{title}
32 | } 33 | 34 | return ( 35 |
36 | 37 | 38 | 52 | 53 | 54 | column.toggleSorting(false)}> 55 | 56 | Asc 57 | 58 | column.toggleSorting(true)}> 59 | 60 | Desc 61 | 62 | 63 | column.toggleVisibility(false)}> 64 | 65 | Hide 66 | 67 | 68 | 69 |
70 | ) 71 | } 72 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/column-toggle.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import { DropdownMenuTrigger } from "@radix-ui/react-dropdown-menu" 4 | import { MixerHorizontalIcon } from "@radix-ui/react-icons" 5 | import { Table } from "@tanstack/react-table" 6 | 7 | import { Button } from "@/components/ui/button" 8 | import { 9 | DropdownMenu, 10 | DropdownMenuCheckboxItem, 11 | DropdownMenuContent, 12 | DropdownMenuLabel, 13 | DropdownMenuSeparator, 14 | } from "@/components/ui/dropdown-menu" 15 | 16 | interface DataTableViewOptionsProps { 17 | table: Table 18 | } 19 | 20 | export function DataTableViewOptions({ 21 | table, 22 | }: DataTableViewOptionsProps) { 23 | return ( 24 | 25 | 26 | 34 | 35 | 36 | Toggle columns 37 | 38 | {table 39 | .getAllColumns() 40 | .filter( 41 | (column) => 42 | typeof column.accessorFn !== "undefined" && column.getCanHide() 43 | ) 44 | .map((column) => { 45 | return ( 46 | column.toggleVisibility(!!value)} 51 | > 52 | {column.id} 53 | 54 | ) 55 | })} 56 | 57 | 58 | ) 59 | } 60 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/data-pagination.tsx: -------------------------------------------------------------------------------- 1 | import { 2 | ChevronLeftIcon, 3 | ChevronRightIcon, 4 | DoubleArrowLeftIcon, 5 | DoubleArrowRightIcon, 6 | } from "@radix-ui/react-icons" 7 | import { Table } from "@tanstack/react-table" 8 | 9 | import { Button } from "@/components/ui/button" 10 | import { 11 | Select, 12 | SelectContent, 13 | SelectItem, 14 | SelectTrigger, 15 | SelectValue, 16 | } from "@/components/ui/select" 17 | 18 | interface DataTablePaginationProps { 19 | table: Table 20 | } 21 | 22 | export function DataTablePagination({ 23 | table, 24 | }: DataTablePaginationProps) { 25 | return ( 26 |
27 |
28 | {/* {table.getFilteredSelectedRowModel().rows.length} of{" "} */} 29 | {table.getFilteredRowModel().rows.length} rows total 30 |
31 |
32 |
33 |

Rows per page

34 | 51 |
52 |
53 | Page {table.getState().pagination.pageIndex + 1} of{" "} 54 | {table.getPageCount()} 55 |
56 |
57 | 66 | 75 | 84 | 93 |
94 |
95 |
96 | ) 97 | } 98 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/data-table.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | import * as React from "react" 3 | import { 4 | ColumnDef, 5 | SortingState, 6 | flexRender, 7 | getPaginationRowModel, 8 | getCoreRowModel, 9 | getSortedRowModel, 10 | getFilteredRowModel, 11 | useReactTable, 12 | ColumnFiltersState, 13 | } from "@tanstack/react-table" 14 | import { Input } from "@/components/ui/input" 15 | import { 16 | Table, 17 | TableBody, 18 | TableCell, 19 | TableHead, 20 | TableHeader, 21 | TableRow, 22 | } from "@/components/ui/table" 23 | 24 | import { DataTableViewOptions } from "@/components/ui/column-toggle" 25 | import { DataTablePagination } from "@/components/ui/data-pagination" 26 | 27 | interface DataTableProps { 28 | columns: ColumnDef[] 29 | data: TData[] 30 | tableTitle?: string | undefined 31 | filterBy?: string | undefined 32 | columnVisibility?: any | undefined 33 | } 34 | 35 | export function DataTable({ 36 | columns, 37 | data, 38 | tableTitle, 39 | filterBy, 40 | columnVisibility, 41 | }: DataTableProps) { 42 | const [sorting, setSorting] = React.useState([]) 43 | const [columnFilters, setColumnFilters] = React.useState( 44 | [] 45 | ) 46 | 47 | const table = useReactTable({ 48 | data, 49 | columns, 50 | getCoreRowModel: getCoreRowModel(), 51 | getPaginationRowModel: getPaginationRowModel(), 52 | onSortingChange: setSorting, 53 | getSortedRowModel: getSortedRowModel(), 54 | onColumnFiltersChange: setColumnFilters, 55 | getFilteredRowModel: getFilteredRowModel(), 56 | state: { 57 | sorting, 58 | columnFilters, 59 | }, 60 | initialState: { 61 | columnVisibility, 62 | }, 63 | }) 64 | 65 | return ( 66 |
67 |
68 | {filterBy && 69 | 73 | table.getColumn(filterBy)?.setFilterValue(event.target.value) 74 | } 75 | className="max-w-sm" 76 | /> 77 | } 78 | {tableTitle && 79 |
{tableTitle}
80 | } 81 | 82 |
83 |
84 | 85 | 86 | {table.getHeaderGroups().map((headerGroup) => ( 87 | 88 | {headerGroup.headers.map((header) => { 89 | return ( 90 | 91 | {header.isPlaceholder 92 | ? null 93 | : flexRender( 94 | header.column.columnDef.header, 95 | header.getContext() 96 | )} 97 | 98 | ) 99 | })} 100 | 101 | ))} 102 | 103 | 104 | {table.getRowModel().rows?.length ? ( 105 | table.getRowModel().rows.map((row) => ( 106 | 110 | {row.getVisibleCells().map((cell) => ( 111 | 112 | {flexRender(cell.column.columnDef.cell, cell.getContext())} 113 | 114 | ))} 115 | 116 | )) 117 | ) : ( 118 | 119 | 120 | No results. 121 | 122 | 123 | )} 124 | 125 |
126 |
127 | {/*
128 | 136 | 144 |
*/} 145 | 146 |
147 | ) 148 | } 149 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/dialog.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as DialogPrimitive from "@radix-ui/react-dialog" 5 | import * as DropdownMenuPrimitive from "@radix-ui/react-dropdown-menu" 6 | import { Cross2Icon } from "@radix-ui/react-icons" 7 | import { X } from "lucide-react" 8 | 9 | import { cn } from "@/lib/utils" 10 | 11 | const Dialog = DialogPrimitive.Root 12 | 13 | const DialogTrigger = DialogPrimitive.Trigger 14 | 15 | const DialogPortal = DialogPrimitive.Portal 16 | 17 | const DialogClose = DialogPrimitive.Close 18 | 19 | const DialogOverlay = React.forwardRef< 20 | React.ElementRef, 21 | React.ComponentPropsWithoutRef 22 | >(({ className, ...props }, ref) => ( 23 | 31 | )) 32 | DialogOverlay.displayName = DialogPrimitive.Overlay.displayName 33 | 34 | const DialogContent = React.forwardRef< 35 | React.ElementRef, 36 | React.ComponentPropsWithoutRef 37 | >(({ className, children, ...props }, ref) => ( 38 | 39 | 40 | 48 | {children} 49 | 50 | 51 | Close 52 | 53 | 54 | 55 | )) 56 | DialogContent.displayName = DialogPrimitive.Content.displayName 57 | 58 | const DialogHeader = ({ 59 | className, 60 | ...props 61 | }: React.HTMLAttributes) => ( 62 |
69 | ) 70 | DialogHeader.displayName = "DialogHeader" 71 | 72 | const DialogFooter = ({ 73 | className, 74 | ...props 75 | }: React.HTMLAttributes) => ( 76 |
83 | ) 84 | DialogFooter.displayName = "DialogFooter" 85 | 86 | const DialogTitle = React.forwardRef< 87 | React.ElementRef, 88 | React.ComponentPropsWithoutRef 89 | >(({ className, ...props }, ref) => ( 90 | 98 | )) 99 | DialogTitle.displayName = DialogPrimitive.Title.displayName 100 | 101 | const DialogDescription = React.forwardRef< 102 | React.ElementRef, 103 | React.ComponentPropsWithoutRef 104 | >(({ className, ...props }, ref) => ( 105 | 110 | )) 111 | DialogDescription.displayName = DialogPrimitive.Description.displayName 112 | 113 | export { 114 | Dialog, 115 | DialogPortal, 116 | DialogOverlay, 117 | DialogClose, 118 | DialogTrigger, 119 | DialogContent, 120 | DialogHeader, 121 | DialogFooter, 122 | DialogTitle, 123 | DialogDescription, 124 | } 125 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/fuzzingQueueElem.tsx: -------------------------------------------------------------------------------- 1 | import { FuzzingQueueElem } from "@/types/FuzzingQueue"; 2 | import { 3 | Card, 4 | CardContent, 5 | CardDescription, 6 | CardFooter, 7 | CardHeader, 8 | CardTitle, 9 | } from "@/components/ui/card" 10 | import { Input } from "@/components/ui/input" 11 | import { Label } from "@/components/ui/label" 12 | import { 13 | Select, 14 | SelectContent, 15 | SelectItem, 16 | SelectTrigger, 17 | SelectValue, 18 | } from "@/components/ui/select" 19 | 20 | // UI element showing the fuzzing queue element, allowing for input changes and value changes 21 | export function FuzzingQueueElemUI( 22 | { queueElem, setQueueElem }: { queueElem: FuzzingQueueElem, setQueueElem: (queueElem: FuzzingQueueElem) => void } 23 | ) { 24 | return ( 25 | 26 | 27 | 28 | setQueueElem({ ...queueElem, priority: parseInt(e.target.value) })} /> 29 | 30 | setQueueElem({ ...queueElem, max_runtime: parseInt(e.target.value) })} /> 31 | 32 | setQueueElem({ ...queueElem, max_last_crash: parseInt(e.target.value) })} /> 33 | 34 | setQueueElem({ ...queueElem, max_last_any: parseInt(e.target.value) })} /> 35 | 36 | setQueueElem({ ...queueElem, dos_device_str: e.target.value })} /> 37 | 38 | 39 | ); 40 | }; -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/hover-card.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as HoverCardPrimitive from "@radix-ui/react-hover-card" 5 | 6 | import { cn } from "@/lib/utils" 7 | 8 | const HoverCard = HoverCardPrimitive.Root 9 | 10 | const HoverCardTrigger = HoverCardPrimitive.Trigger 11 | 12 | const HoverCardContent = React.forwardRef< 13 | React.ElementRef, 14 | React.ComponentPropsWithoutRef 15 | >(({ className, align = "center", sideOffset = 4, ...props }, ref) => ( 16 | 26 | )) 27 | HoverCardContent.displayName = HoverCardPrimitive.Content.displayName 28 | 29 | export { HoverCard, HoverCardTrigger, HoverCardContent } 30 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/input.tsx: -------------------------------------------------------------------------------- 1 | import * as React from "react" 2 | 3 | import { cn } from "@/lib/utils" 4 | 5 | export interface InputProps 6 | extends React.InputHTMLAttributes {} 7 | 8 | const Input = React.forwardRef( 9 | ({ className, type, ...props }, ref) => { 10 | return ( 11 | 20 | ) 21 | } 22 | ) 23 | Input.displayName = "Input" 24 | 25 | 
export { Input } 26 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/label.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as LabelPrimitive from "@radix-ui/react-label" 5 | import { cva, type VariantProps } from "class-variance-authority" 6 | 7 | import { cn } from "@/lib/utils" 8 | 9 | const labelVariants = cva( 10 | "text-sm font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70" 11 | ) 12 | 13 | const Label = React.forwardRef< 14 | React.ElementRef, 15 | React.ComponentPropsWithoutRef & 16 | VariantProps 17 | >(({ className, ...props }, ref) => ( 18 | 23 | )) 24 | Label.displayName = LabelPrimitive.Root.displayName 25 | 26 | export { Label } 27 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/radio-group.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as RadioGroupPrimitive from "@radix-ui/react-radio-group" 5 | import { Circle } from "lucide-react" 6 | 7 | import { cn } from "@/lib/utils" 8 | 9 | const RadioGroup = React.forwardRef< 10 | React.ElementRef, 11 | React.ComponentPropsWithoutRef 12 | >(({ className, ...props }, ref) => { 13 | return ( 14 | 19 | ) 20 | }) 21 | RadioGroup.displayName = RadioGroupPrimitive.Root.displayName 22 | 23 | const RadioGroupItem = React.forwardRef< 24 | React.ElementRef, 25 | React.ComponentPropsWithoutRef 26 | >(({ className, ...props }, ref) => { 27 | return ( 28 | 36 | 37 | 38 | 39 | 40 | ) 41 | }) 42 | RadioGroupItem.displayName = RadioGroupPrimitive.Item.displayName 43 | 44 | export { RadioGroup, RadioGroupItem } 45 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/separator.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as SeparatorPrimitive from "@radix-ui/react-separator" 5 | 6 | import { cn } from "@/lib/utils" 7 | 8 | const Separator = React.forwardRef< 9 | React.ElementRef, 10 | React.ComponentPropsWithoutRef 11 | >( 12 | ( 13 | { className, orientation = "horizontal", decorative = true, ...props }, 14 | ref 15 | ) => ( 16 | 27 | ) 28 | ) 29 | Separator.displayName = SeparatorPrimitive.Root.displayName 30 | 31 | export { Separator } 32 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/table.tsx: -------------------------------------------------------------------------------- 1 | import * as React from "react" 2 | 3 | import { cn } from "@/lib/utils" 4 | 5 | const Table = React.forwardRef< 6 | HTMLTableElement, 7 | React.HTMLAttributes 8 | >(({ className, ...props }, ref) => ( 9 |
10 | 15 | 16 | )) 17 | Table.displayName = "Table" 18 | 19 | const TableHeader = React.forwardRef< 20 | HTMLTableSectionElement, 21 | React.HTMLAttributes 22 | >(({ className, ...props }, ref) => ( 23 | 24 | )) 25 | TableHeader.displayName = "TableHeader" 26 | 27 | const TableBody = React.forwardRef< 28 | HTMLTableSectionElement, 29 | React.HTMLAttributes 30 | >(({ className, ...props }, ref) => ( 31 | 36 | )) 37 | TableBody.displayName = "TableBody" 38 | 39 | const TableFooter = React.forwardRef< 40 | HTMLTableSectionElement, 41 | React.HTMLAttributes 42 | >(({ className, ...props }, ref) => ( 43 | tr]:last:border-b-0", 47 | className 48 | )} 49 | {...props} 50 | /> 51 | )) 52 | TableFooter.displayName = "TableFooter" 53 | 54 | const TableRow = React.forwardRef< 55 | HTMLTableRowElement, 56 | React.HTMLAttributes 57 | >(({ className, ...props }, ref) => ( 58 | 66 | )) 67 | TableRow.displayName = "TableRow" 68 | 69 | const TableHead = React.forwardRef< 70 | HTMLTableCellElement, 71 | React.ThHTMLAttributes 72 | >(({ className, ...props }, ref) => ( 73 |
81 | )) 82 | TableHead.displayName = "TableHead" 83 | 84 | const TableCell = React.forwardRef< 85 | HTMLTableCellElement, 86 | React.TdHTMLAttributes 87 | >(({ className, ...props }, ref) => ( 88 | 93 | )) 94 | TableCell.displayName = "TableCell" 95 | 96 | const TableCaption = React.forwardRef< 97 | HTMLTableCaptionElement, 98 | React.HTMLAttributes 99 | >(({ className, ...props }, ref) => ( 100 |
105 | )) 106 | TableCaption.displayName = "TableCaption" 107 | 108 | export { 109 | Table, 110 | TableHeader, 111 | TableBody, 112 | TableFooter, 113 | TableHead, 114 | TableRow, 115 | TableCell, 116 | TableCaption, 117 | } 118 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/toast.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import * as React from "react" 4 | import * as ToastPrimitives from "@radix-ui/react-toast" 5 | import { cva, type VariantProps } from "class-variance-authority" 6 | import { X } from "lucide-react" 7 | 8 | import { cn } from "@/lib/utils" 9 | 10 | const ToastProvider = ToastPrimitives.Provider 11 | 12 | const ToastViewport = React.forwardRef< 13 | React.ElementRef, 14 | React.ComponentPropsWithoutRef 15 | >(({ className, ...props }, ref) => ( 16 | 24 | )) 25 | ToastViewport.displayName = ToastPrimitives.Viewport.displayName 26 | 27 | const toastVariants = cva( 28 | "group pointer-events-auto relative flex w-full items-center justify-between space-x-4 overflow-hidden rounded-md border p-6 pr-8 shadow-lg transition-all data-[swipe=cancel]:translate-x-0 data-[swipe=end]:translate-x-[var(--radix-toast-swipe-end-x)] data-[swipe=move]:translate-x-[var(--radix-toast-swipe-move-x)] data-[swipe=move]:transition-none data-[state=open]:animate-in data-[state=closed]:animate-out data-[swipe=end]:animate-out data-[state=closed]:fade-out-80 data-[state=closed]:slide-out-to-right-full data-[state=open]:slide-in-from-top-full data-[state=open]:sm:slide-in-from-bottom-full", 29 | { 30 | variants: { 31 | variant: { 32 | default: "border bg-background text-foreground", 33 | destructive: 34 | "destructive group border-destructive bg-destructive text-destructive-foreground", 35 | }, 36 | }, 37 | defaultVariants: { 38 | variant: "default", 39 | }, 40 | } 41 | ) 42 | 43 | const Toast = React.forwardRef< 44 | React.ElementRef, 45 | React.ComponentPropsWithoutRef & 46 | VariantProps 47 | >(({ className, variant, ...props }, ref) => { 48 | return ( 49 | 54 | ) 55 | }) 56 | Toast.displayName = ToastPrimitives.Root.displayName 57 | 58 | const ToastAction = React.forwardRef< 59 | React.ElementRef, 60 | React.ComponentPropsWithoutRef 61 | >(({ className, ...props }, ref) => ( 62 | 70 | )) 71 | ToastAction.displayName = ToastPrimitives.Action.displayName 72 | 73 | const ToastClose = React.forwardRef< 74 | React.ElementRef, 75 | React.ComponentPropsWithoutRef 76 | >(({ className, ...props }, ref) => ( 77 | 86 | 87 | 88 | )) 89 | ToastClose.displayName = ToastPrimitives.Close.displayName 90 | 91 | const ToastTitle = React.forwardRef< 92 | React.ElementRef, 93 | React.ComponentPropsWithoutRef 94 | >(({ className, ...props }, ref) => ( 95 | 100 | )) 101 | ToastTitle.displayName = ToastPrimitives.Title.displayName 102 | 103 | const ToastDescription = React.forwardRef< 104 | React.ElementRef, 105 | React.ComponentPropsWithoutRef 106 | >(({ className, ...props }, ref) => ( 107 | 112 | )) 113 | ToastDescription.displayName = ToastPrimitives.Description.displayName 114 | 115 | type ToastProps = React.ComponentPropsWithoutRef 116 | 117 | type ToastActionElement = React.ReactElement 118 | 119 | export { 120 | type ToastProps, 121 | type ToastActionElement, 122 | ToastProvider, 123 | ToastViewport, 124 | Toast, 125 | ToastTitle, 126 | ToastDescription, 127 | ToastClose, 128 | ToastAction, 129 | } 130 | 
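The Table primitives exported above are normally composed by `data-table.tsx` via @tanstack/react-table; as a standalone illustration, here is a minimal hypothetical usage sketch (the component name and row data are invented for the example):

```tsx
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table";

// Hypothetical example data; the real tables are driven by @tanstack/react-table.
const rows = [{ id: 1, filename: "example.sys" }];

export function MiniDriverTable() {
  return (
    <Table>
      <TableHeader>
        <TableRow>
          <TableHead>ID</TableHead>
          <TableHead>Filename</TableHead>
        </TableRow>
      </TableHeader>
      <TableBody>
        {rows.map((row) => (
          <TableRow key={row.id}>
            <TableCell>{row.id}</TableCell>
            <TableCell>{row.filename}</TableCell>
          </TableRow>
        ))}
      </TableBody>
    </Table>
  );
}
```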
-------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/toaster.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import { 4 | Toast, 5 | ToastClose, 6 | ToastDescription, 7 | ToastProvider, 8 | ToastTitle, 9 | ToastViewport, 10 | } from "@/components/ui/toast" 11 | import { useToast } from "@/components/ui/use-toast" 12 | 13 | export function Toaster() { 14 | const { toasts } = useToast() 15 | 16 | return ( 17 | 18 | {toasts.map(function ({ id, title, description, action, ...props }) { 19 | return ( 20 | 21 |
22 | {title && {title}} 23 | {description && ( 24 | {description} 25 | )} 26 |
27 | {action} 28 | 29 |
30 | ) 31 | })} 32 | 33 |
34 | ) 35 | } 36 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/components/ui/use-toast.ts: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | // Inspired by react-hot-toast library 4 | import * as React from "react" 5 | 6 | import type { 7 | ToastActionElement, 8 | ToastProps, 9 | } from "@/components/ui/toast" 10 | 11 | const TOAST_LIMIT = 1 12 | const TOAST_REMOVE_DELAY = 1000000 13 | 14 | type ToasterToast = ToastProps & { 15 | id: string 16 | title?: React.ReactNode 17 | description?: React.ReactNode 18 | action?: ToastActionElement 19 | } 20 | 21 | const actionTypes = { 22 | ADD_TOAST: "ADD_TOAST", 23 | UPDATE_TOAST: "UPDATE_TOAST", 24 | DISMISS_TOAST: "DISMISS_TOAST", 25 | REMOVE_TOAST: "REMOVE_TOAST", 26 | } as const 27 | 28 | let count = 0 29 | 30 | function genId() { 31 | count = (count + 1) % Number.MAX_SAFE_INTEGER 32 | return count.toString() 33 | } 34 | 35 | type ActionType = typeof actionTypes 36 | 37 | type Action = 38 | | { 39 | type: ActionType["ADD_TOAST"] 40 | toast: ToasterToast 41 | } 42 | | { 43 | type: ActionType["UPDATE_TOAST"] 44 | toast: Partial 45 | } 46 | | { 47 | type: ActionType["DISMISS_TOAST"] 48 | toastId?: ToasterToast["id"] 49 | } 50 | | { 51 | type: ActionType["REMOVE_TOAST"] 52 | toastId?: ToasterToast["id"] 53 | } 54 | 55 | interface State { 56 | toasts: ToasterToast[] 57 | } 58 | 59 | const toastTimeouts = new Map>() 60 | 61 | const addToRemoveQueue = (toastId: string) => { 62 | if (toastTimeouts.has(toastId)) { 63 | return 64 | } 65 | 66 | const timeout = setTimeout(() => { 67 | toastTimeouts.delete(toastId) 68 | dispatch({ 69 | type: "REMOVE_TOAST", 70 | toastId: toastId, 71 | }) 72 | }, TOAST_REMOVE_DELAY) 73 | 74 | toastTimeouts.set(toastId, timeout) 75 | } 76 | 77 | export const reducer = (state: State, action: Action): State => { 78 | switch (action.type) { 79 | case "ADD_TOAST": 80 | return { 81 | ...state, 82 | toasts: [action.toast, ...state.toasts].slice(0, TOAST_LIMIT), 83 | } 84 | 85 | case "UPDATE_TOAST": 86 | return { 87 | ...state, 88 | toasts: state.toasts.map((t) => 89 | t.id === action.toast.id ? { ...t, ...action.toast } : t 90 | ), 91 | } 92 | 93 | case "DISMISS_TOAST": { 94 | const { toastId } = action 95 | 96 | // ! Side effects ! - This could be extracted into a dismissToast() action, 97 | // but I'll keep it here for simplicity 98 | if (toastId) { 99 | addToRemoveQueue(toastId) 100 | } else { 101 | state.toasts.forEach((toast) => { 102 | addToRemoveQueue(toast.id) 103 | }) 104 | } 105 | 106 | return { 107 | ...state, 108 | toasts: state.toasts.map((t) => 109 | t.id === toastId || toastId === undefined 110 | ? 
{ 111 | ...t, 112 | open: false, 113 | } 114 | : t 115 | ), 116 | } 117 | } 118 | case "REMOVE_TOAST": 119 | if (action.toastId === undefined) { 120 | return { 121 | ...state, 122 | toasts: [], 123 | } 124 | } 125 | return { 126 | ...state, 127 | toasts: state.toasts.filter((t) => t.id !== action.toastId), 128 | } 129 | } 130 | } 131 | 132 | const listeners: Array<(state: State) => void> = [] 133 | 134 | let memoryState: State = { toasts: [] } 135 | 136 | function dispatch(action: Action) { 137 | memoryState = reducer(memoryState, action) 138 | listeners.forEach((listener) => { 139 | listener(memoryState) 140 | }) 141 | } 142 | 143 | type Toast = Omit 144 | 145 | function toast({ ...props }: Toast) { 146 | const id = genId() 147 | 148 | const update = (props: ToasterToast) => 149 | dispatch({ 150 | type: "UPDATE_TOAST", 151 | toast: { ...props, id }, 152 | }) 153 | const dismiss = () => dispatch({ type: "DISMISS_TOAST", toastId: id }) 154 | 155 | dispatch({ 156 | type: "ADD_TOAST", 157 | toast: { 158 | ...props, 159 | id, 160 | open: true, 161 | onOpenChange: (open) => { 162 | if (!open) dismiss() 163 | }, 164 | }, 165 | }) 166 | 167 | return { 168 | id: id, 169 | dismiss, 170 | update, 171 | } 172 | } 173 | 174 | function useToast() { 175 | const [state, setState] = React.useState(memoryState) 176 | 177 | React.useEffect(() => { 178 | listeners.push(setState) 179 | return () => { 180 | const index = listeners.indexOf(setState) 181 | if (index > -1) { 182 | listeners.splice(index, 1) 183 | } 184 | } 185 | }, [state]) 186 | 187 | return { 188 | ...state, 189 | toast, 190 | dismiss: (toastId?: string) => dispatch({ type: "DISMISS_TOAST", toastId }), 191 | } 192 | } 193 | 194 | export { useToast, toast } 195 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/lib/utils.ts: -------------------------------------------------------------------------------- 1 | import { type ClassValue, clsx } from "clsx" 2 | import { twMerge } from "tailwind-merge" 3 | 4 | export function cn(...inputs: ClassValue[]) { 5 | return twMerge(clsx(inputs)) 6 | } 7 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/types/FuzzingQueue.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import { ColumnDef } from "@tanstack/react-table" 4 | import { DataTableColumnHeader } from "@/components/ui/column-header" 5 | 6 | import { cn } from "@/lib/utils" 7 | import { Button } from "@/components/ui/button" 8 | import { 9 | DropdownMenu, 10 | DropdownMenuContent, 11 | DropdownMenuItem, 12 | DropdownMenuTrigger, 13 | } from "@/components/ui/dropdown-menu" 14 | import { ArrowDownIcon, ArrowUpIcon, CaretSortIcon } from "@radix-ui/react-icons" 15 | 16 | export type FuzzingQueueElem = { 17 | driver: number; 18 | priority: number; 19 | max_runtime: number | null; 20 | max_last_crash: number | null; 21 | max_last_any: number | null; 22 | dos_device_str: string | null; 23 | seeds: number[]; 24 | // the following are not used for adding new elements 25 | created_at: Date | null; 26 | finished_at: Date | null; 27 | id: number | null; 28 | state: string | null; 29 | }; 30 | 31 | export const columns: ColumnDef[] = [ 32 | { 33 | accessorKey: "driver_id", 34 | header: ({ column }) => ( 35 |
36 | 37 | 38 | 52 | 53 | 54 | column.toggleSorting(false)}> 55 | 56 | Asc 57 | 58 | column.toggleSorting(true)}> 59 | 60 | Desc 61 | 62 | 63 | 64 |
65 | ), 66 | cell: props => {props.row.original.driver} 67 | }, 68 | { 69 | accessorKey: "priority", 70 | header: ({ column }) => ( 71 | 72 | ), 73 | }, 74 | { 75 | accessorKey: "state", 76 | header: ({ column }) => ( 77 | 78 | ), 79 | }, 80 | { 81 | accessorKey: "dos_device_str", 82 | header: ({ column }) => ( 83 | 84 | ), 85 | }, 86 | { 87 | accessorKey: "finished_at", 88 | header: ({ column }) => ( 89 | 90 | ), 91 | cell: props => { 92 | if (props.row.original.finished_at == null) { 93 | return "N/A"; 94 | } else { 95 | const date = new Date(props.row.original.finished_at); 96 | return date.toLocaleString(); 97 | } 98 | } 99 | }, 100 | { 101 | accessorKey: "created_at", 102 | header: ({ column }) => ( 103 | 104 | ), 105 | cell: props => { 106 | if (props.row.original.created_at == null) { 107 | return "N/A"; 108 | } else { 109 | const date = new Date(props.row.original.created_at); 110 | return date.toLocaleString(); 111 | } 112 | } 113 | }, 114 | // { 115 | // accessorKey: "og_file_id", 116 | // header: ({ column }) => ( 117 | // 118 | // ), 119 | // }, 120 | // { 121 | // accessorKey: "file", 122 | // header: "Underlying File ID", 123 | // }, 124 | ] 125 | 126 | 127 | // FuzzingQueueElem creator, setting defaults 128 | export function createFuzzingQueueElem(driver_id: number, priority: number = 0, max_runtime: number | null = 42800, max_last_crash: number | null = null, max_last_any: number | null = null, dos_device_str: string | null = null, seeds: number[] = []): FuzzingQueueElem { 129 | return { 130 | driver: driver_id, 131 | priority: priority, 132 | max_runtime: max_runtime, 133 | max_last_crash: max_last_crash, 134 | max_last_any: max_last_any, 135 | dos_device_str: dos_device_str, 136 | seeds: seeds, 137 | created_at: null, 138 | finished_at: null, 139 | id: null, 140 | state: null 141 | }; 142 | } 143 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/types/KnownVulnerable.tsx: -------------------------------------------------------------------------------- 1 | "use client" 2 | 3 | import { ColumnDef } from "@tanstack/react-table" 4 | import { DataTableColumnHeader } from "@/components/ui/column-header" 5 | 6 | import { cn } from "@/lib/utils" 7 | import { Button } from "@/components/ui/button" 8 | import { 9 | DropdownMenu, 10 | DropdownMenuContent, 11 | DropdownMenuItem, 12 | DropdownMenuTrigger, 13 | } from "@/components/ui/dropdown-menu" 14 | import { ArrowDownIcon, ArrowUpIcon, CaretSortIcon } from "@radix-ui/react-icons" 15 | 16 | export interface VulnerableDriver { 17 | description: string; 18 | file: number; 19 | filename: string; 20 | id: number; 21 | origin: string; 22 | sha256: string; 23 | } 24 | 25 | export const columns: ColumnDef<VulnerableDriver>[] = [ 26 | { 27 | accessorKey: "filename", 28 | header: ({ column }) => ( 29 |
30 | 31 | 32 | 46 | 47 | 48 | column.toggleSorting(false)}> 49 | 50 | Asc 51 | 52 | column.toggleSorting(true)}> 53 | 54 | Desc 55 | 56 | 57 | 58 |
59 | ), 60 | }, 61 | { 62 | accessorKey: "origin", 63 | header: ({ column }) => ( 64 | 65 | ), 66 | }, 67 | { 68 | accessorKey: "description", 69 | header: ({ column }) => ( 70 | 71 | ), 72 | }, 73 | { 74 | accessorKey: "sha256", 75 | header: ({ column }) => ( 76 | 77 | ), 78 | }, 79 | { 80 | accessorKey: "file", 81 | header: ({ column }) => ( 82 | 83 | ), 84 | }, 85 | ] 86 | -------------------------------------------------------------------------------- /Pipeline/Frontender/src/types/Pathing.tsx: -------------------------------------------------------------------------------- 1 | export interface Path { 2 | id: number; 3 | isfor: number; 4 | name: string; 5 | context: string; 6 | path: number[]; // function addresses along the call path 7 | } 8 | 9 | export interface IoctlComparison { 10 | op: string; 11 | val: number; 12 | line: string; 13 | } 14 | 15 | export interface PathingResult { 16 | combined_sub_functions: number; 17 | created_at: string; 18 | handler_addrs: number[]; // addresses of the identified IOCTL handlers 19 | ioctl_comp: IoctlComparison[]; // IOCTL code comparisons extracted from the handler 20 | id: number; 21 | paths: Path[]; 22 | ret_code: number; 23 | type: string; 24 | } -------------------------------------------------------------------------------- /Pipeline/Frontender/tailwind.config.ts: -------------------------------------------------------------------------------- 1 | import type { Config } from "tailwindcss" 2 | 3 | const config = { 4 | darkMode: ["class"], 5 | content: [ 6 | './pages/**/*.{ts,tsx}', 7 | './components/**/*.{ts,tsx}', 8 | './app/**/*.{ts,tsx}', 9 | './src/**/*.{ts,tsx}', 10 | ], 11 | prefix: "", 12 | theme: { 13 | container: { 14 | center: true, 15 | padding: "2rem", 16 | screens: { 17 | "2xl": "1400px", 18 | }, 19 | }, 20 | extend: { 21 | colors: { 22 | border: "hsl(var(--border))", 23 | input: "hsl(var(--input))", 24 | ring: "hsl(var(--ring))", 25 | background: "hsl(var(--background))", 26 | foreground: "hsl(var(--foreground))", 27 | primary: { 28 | DEFAULT: "hsl(var(--primary))", 29 | foreground: "hsl(var(--primary-foreground))", 30 | }, 31 | secondary: { 32 | DEFAULT: "hsl(var(--secondary))", 33 | foreground: "hsl(var(--secondary-foreground))", 34 | }, 35 | destructive: { 36 | DEFAULT: "hsl(var(--destructive))", 37 | foreground: "hsl(var(--destructive-foreground))", 38 | }, 39 | muted: { 40 | DEFAULT: "hsl(var(--muted))", 41 | foreground: "hsl(var(--muted-foreground))", 42 | }, 43 | accent: { 44 | DEFAULT: "hsl(var(--accent))", 45 | foreground: "hsl(var(--accent-foreground))", 46 | }, 47 | popover: { 48 | DEFAULT: "hsl(var(--popover))", 49 | foreground: "hsl(var(--popover-foreground))", 50 | }, 51 | card: { 52 | DEFAULT: "hsl(var(--card))", 53 | foreground: "hsl(var(--card-foreground))", 54 | }, 55 | }, 56 | borderRadius: { 57 | lg: "var(--radius)", 58 | md: "calc(var(--radius) - 2px)", 59 | sm: "calc(var(--radius) - 4px)", 60 | }, 61 | keyframes: { 62 | "accordion-down": { 63 | from: { height: "0" }, 64 | to: { height: "var(--radix-accordion-content-height)" }, 65 | }, 66 | "accordion-up": { 67 | from: { height: "var(--radix-accordion-content-height)" }, 68 | to: { height: "0" }, 69 | }, 70 | }, 71 | animation: { 72 | "accordion-down": "accordion-down 0.2s ease-out", 73 | "accordion-up": "accordion-up 0.2s ease-out", 74 | }, 75 | }, 76 | }, 77 | plugins: [require("tailwindcss-animate")], 78 | } satisfies Config 79 | 80 | export default config
-------------------------------------------------------------------------------- /Pipeline/Frontender/tsconfig.json: -------------------------------------------------------------------------------- 1 | { 2 | "compilerOptions": { 3 | "lib": ["dom", "dom.iterable", "esnext"], 4 | "allowJs": true, 5 | "skipLibCheck": true, 6 | "strict": true, 7 | "noEmit": true, 8 | "esModuleInterop": true, 9 | "module": "esnext", 10 | "moduleResolution": "bundler", 11 | "resolveJsonModule": true, 12 | "isolatedModules": true, 13 | "jsx": "preserve", 14 | "incremental": true, 15 | "plugins": [ 16 | { 17 | "name": "next" 18 | } 19 | ], 20 | "paths": { 21 | "@/*": ["./src/*"] 22 | } 23 | }, 24 | "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"], 25 | "exclude": ["node_modules"] 26 | } 27 | -------------------------------------------------------------------------------- /Pipeline/Fuzzifier/README.md: -------------------------------------------------------------------------------- 1 | # Fuzzifier Module: Automated Dynamic Analysis 2 | 3 | ## Overview 4 | The Fuzzifier is the final analytical module, designed for fuzz testing to identify vulnerabilities in drivers. 5 | This module requires significant computational resources and relies on the outputs from previous modules. 6 | Utilizing [Intel's kAFL](https://github.com/IntelLabs/kAFL/), the Fuzzifier automates the fuzzing of selected drivers, featuring a custom wrapper and integration with a modified Windows kernel driver fuzzing harness. 7 | 8 | ## Requirements 9 | - Dedicated hardware meeting specific criteria for kAFL 10 | - kAFL installed with a modified Linux kernel 11 | - Valid kAFL environment setup (i.e. venv as per documentation) 12 | - Access to the Coordinator API 13 | 14 | ## What the Fuzzifier does 15 | 1. Retrieve the next driver fuzzing configuration from the fuzzing queue. 16 | 2. Prepare the virtual machine as specified by kAFL. 17 | 3. Execute `kAFL fuzz` using Python's subprocess library. 18 | 4. Collect and send relevant data to the Coordinator, then clean up for the next target. 19 | 20 | Errors during setup or fuzzing are logged, and the configuration is marked as unsuccessful. Successful runs are marked accordingly, and the process repeats for the next driver; a minimal sketch of this loop is included at the end of this README. 21 | 22 | ## Modifications to kAFL fuzzing harness 23 | - Templating and recompiling the fuzzing agent for each driver 24 | - Handling and validating fuzzing payloads and IOCTL codes 25 | - Enforcing crashes on detecting potential memory access issues to save payloads for inspection 26 | 27 | ## License 28 | This project is under [GPLv3](../../LICENSE), but the original version of the [vuln_test.c.template](./vuln_test.c.template) file is under the MIT license.
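
## Appendix: loop sketch

The following is a minimal sketch of the polling loop described under "What the Fuzzifier does". The Coordinator endpoint names, queue field names, and kAFL flags shown here are assumptions for illustration only, not the actual `fuzzifier.py` implementation:

```python
import subprocess
import requests

COORDINATOR_BASE_URL = "http://COORDINATOR_IP:5000"  # as in the other modules

def run_one_job():
    # 1. Retrieve the next fuzzing configuration (hypothetical endpoint).
    job = requests.get(f"{COORDINATOR_BASE_URL}/fuzzing-queue/next", verify=False).json()
    state = "finished"
    try:
        # 2./3. Prepare the target VM, then run kAFL through subprocess; the
        # timeout mirrors max_runtime from the queue element (the frontend
        # defaults it to 42800 seconds).
        subprocess.run(
            ["kafl", "fuzz", "--purge", "--work-dir", "/tmp/kafl_workdir"],
            check=True,
            timeout=job.get("max_runtime") or 42800,
        )
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired):
        # Errors during setup or fuzzing mark the configuration unsuccessful.
        state = "failed"
    # 4. Report the outcome back and clean up for the next target.
    requests.post(
        f"{COORDINATOR_BASE_URL}/fuzzing-queue/{job['id']}",
        json={"state": state},
        verify=False,
    )
```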
-------------------------------------------------------------------------------- /Pipeline/Fuzzifier/payload/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Fuzzifier/payload/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Fuzzifier/payload/fuzzingDecoder.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | import struct 3 | import sys 4 | import os 5 | if __name__ != "__main__": 6 | from payload.ioctlDecoder import decodeIt 7 | from payload.hexdump import hexdump 8 | else: 9 | from ioctlDecoder import decodeIt 10 | from hexdump import hexdump 11 | import base64 12 | 13 | def decode_output_(data): 14 | # Unpack the data to retrieve ioctl, input size, and output size 15 | ioctl, input_size = struct.unpack(' 1: 96 | path = sys.argv[1] 97 | if os.path.isdir(path) or os.path.isfile(path): 98 | print(decode_files_in_folder(path, visible=False)) 99 | else: 100 | try: 101 | bin_data = base64.b64decode(path) 102 | ioctl, input_bytes, output_bytes = decode_output_(bin_data) 103 | decoded_ioctl = decodeIt(ioctl) 104 | print(f"IOCTL: {hex(ioctl)} which is:\n\tDevice: {decoded_ioctl['Device']}\n\tFunction: {decoded_ioctl['Function']}\n\tAccess:{decoded_ioctl['Access'].name}\n\tMethod:{decoded_ioctl['Method'].name}") 105 | print(f"Input Bytes ({len(input_bytes)}) and Output Bytes ({len(output_bytes)})") 106 | except: 107 | print("Invalid base64 encoded string") 108 | else: 109 | print("This is an IOCTL decoder; please provide a folder path, file path or base64-encoded string as an argument") -------------------------------------------------------------------------------- /Pipeline/Fuzzifier/payload/hexdump.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: latin-1 -*- 3 | 4 | # <-- removing this magic comment breaks Python 3.4 on Windows 5 | import binascii  # binascii is required for Python 3 6 | import sys 7 | 8 | # hexdump code from https://raw.githubusercontent.com/smorin/hextool/master/hexdump.py 9 | # --- constants 10 | PY3K = sys.version_info >= (3, 0) 11 | # --- - chunking helpers 12 | def chunks(seq, size): 13 | '''Generator that cuts sequence (bytes, memoryview, etc.) 14 | into chunks of given size. If `seq` length is not a multiple 15 | of `size`, the length of the last chunk returned will be 16 | less than requested. 17 | 18 | >>> list( chunks([1,2,3,4,5,6,7], 3) ) 19 | [[1, 2, 3], [4, 5, 6], [7]] 20 | ''' 21 | d, m = divmod(len(seq), size) 22 | for i in range(d): 23 | yield seq[i*size:(i+1)*size] 24 | if m: 25 | yield seq[d*size:] 26 | 27 | def chunkread(f, size): 28 | '''Generator that reads from file like object. May return less 29 | data than requested on the last read.''' 30 | c = f.read(size) 31 | while len(c): 32 | yield c 33 | c = f.read(size) 34 | 35 | def genchunks(mixed, size): 36 | '''Generator to chunk binary sequences or file like objects. 37 | The size of the last chunk returned may be less than 38 | requested.''' 39 | if hasattr(mixed, 'read'): 40 | return chunkread(mixed, size) 41 | else: 42 | return chunks(mixed, size) 43 | # --- - /chunking helpers 44 | 45 | def dump(binary, size=2, sep=' '): 46 | ''' 47 | Convert binary data (bytes in Python 3 and str in 48 | Python 2) to hex string like '00 DE AD BE EF'.
49 | `size` argument specifies length of text chunks 50 | and `sep` sets chunk separator. 51 | ''' 52 | hexstr = binascii.hexlify(binary) 53 | if PY3K: 54 | hexstr = hexstr.decode('ascii') 55 | return sep.join(chunks(hexstr.upper(), size)) 56 | 57 | def dumpgen(data): 58 | ''' 59 | Generator that produces strings: 60 | 61 | '00000000: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................' 62 | ''' 63 | generator = genchunks(data, 16) 64 | for addr, d in enumerate(generator): 65 | # 00000000: 66 | line = '%08X: ' % (addr*16) 67 | # 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 68 | dumpstr = dump(d) 69 | line += dumpstr[:8*3] 70 | if len(d) > 8: # insert separator if needed 71 | line += ' ' + dumpstr[8*3:] 72 | # calculate indentation, which may be different for the last line 73 | pad = 2 74 | if len(d) < 16: 75 | pad += 3*(16 - len(d)) 76 | if len(d) <= 8: 77 | pad += 1 78 | line += ' '*pad 79 | 80 | for byte in d: 81 | # printable ASCII range 0x20 to 0x7E 82 | if not PY3K: 83 | byte = ord(byte) 84 | if 0x20 <= byte <= 0x7E: 85 | line += chr(byte) 86 | else: 87 | line += '.' 88 | yield line 89 | 90 | def hexdump(data): 91 | gen = dumpgen(data) 92 | return '\n'.join(gen) 93 | -------------------------------------------------------------------------------- /Pipeline/Fuzzifier/payload/ioctlDecoder.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import sys 3 | from enum import Enum 4 | 5 | class Device(Enum): 6 | BEEP = 1 7 | CD_ROM = 2 8 | CD_ROM_FILE_SYSTEM = 3 9 | CONTROLLER = 4 10 | DATALINK = 5 11 | DFS = 6 12 | DISK = 7 13 | DISK_FILE_SYSTEM = 8 14 | FILE_SYSTEM = 9 15 | INPORT_PORT = 10 16 | KEYBOARD = 11 17 | MAILSLOT = 12 18 | MIDI_IN = 13 19 | MIDI_OUT = 14 20 | MOUSE = 15 21 | MULTI_UNC_PROVIDER = 16 22 | NAMED_PIPE = 17 23 | NETWORK = 18 24 | NETWORK_BROWSER = 19 25 | NETWORK_FILE_SYSTEM = 20 26 | NULL = 21 27 | PARALLEL_PORT = 22 28 | PHYSICAL_NETCARD = 23 29 | PRINTER = 24 30 | SCANNER = 25 31 | SERIAL_MOUSE_PORT = 26 32 | SERIAL_PORT = 27 33 | SCREEN = 28 34 | SOUND = 29 35 | STREAMS = 30 36 | TAPE = 31 37 | TAPE_FILE_SYSTEM = 32 38 | TRANSPORT = 33 39 | UNKNOWN = 34 40 | VIDEO = 35 41 | VIRTUAL_DISK = 36 42 | WAVE_IN = 37 43 | WAVE_OUT = 38 44 | _8042_PORT = 39 45 | NETWORK_REDIRECTOR = 40 46 | BATTERY = 41 47 | BUS_EXTENDER = 42 48 | MODEM = 43 49 | VDM = 44 50 | MASS_STORAGE = 45 51 | SMB = 46 52 | KS = 47 53 | CHANGER = 48 54 | SMARTCARD = 49 55 | ACPI = 50 56 | DVD = 51 57 | FULLSCREEN_VIDEO = 52 58 | DFS_FILE_SYSTEM = 53 59 | DFS_VOLUME = 54 60 | 61 | class Access(Enum): 62 | FILE_ANY_ACCESS = 0 63 | FILE_READ_ACCESS = 1 64 | FILE_WRITE_ACCESS = 2 65 | READ_WRITE_ACCESS = 3 66 | 67 | class Method(Enum): 68 | METHOD_BUFFERED = 0 69 | METHOD_IN_DIRECT = 1 70 | METHOD_OUT_DIRECT = 2 71 | METHOD_NEITHER = 3 72 | 73 | def decodeIt(ioctl_code): 74 | if isinstance(ioctl_code, str): 75 | input_val = int(ioctl_code, 16) 76 | else: 77 | input_val = ioctl_code 78 | 79 | if input_val == 0 or input_val > 0xFFFFFFFF: 80 | return None 81 | 82 | device_val = (input_val >> 16) & 0xFFF 83 | func_val = (input_val >> 2) & 0xFFF 84 | 85 | device_str = "" 86 | if 0 < device_val <= 54: 87 | device_str = f"{Device(device_val).name} (0x{device_val:03X})" 88 | else: 89 | device_str = f"0x{device_val:03X}" 90 | 91 | access = (input_val >> 14) & 3 92 | method = input_val & 3 93 | 94 | return { 95 | "Device": device_str, 96 | "Function": f"0x{func_val:03X}", 97 | "Access": Access(access), 98 | "Method": Method(method), 
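        # Descriptive note on the bit layout decoded above (it matches the
        # Windows CTL_CODE macro): device type = bits 16-27, access = bits 14-15,
        # function = bits 2-13, method = bits 0-1. Bit 31, returned below, is the
        # top bit of the 16-bit device-type field and is set for vendor-defined codes.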
99 | "Common Bit": input_val >> 31 100 | } 101 | 102 | def printDecoded(decoded_ioctl): 103 | if decoded_ioctl is None: 104 | print("Invalid IOCTL code") 105 | else: 106 | print("Device:", decoded_ioctl["Device"]) 107 | print("Function:", decoded_ioctl["Function"]) 108 | print("Access:", decoded_ioctl["Access"].name) 109 | print("Method:", decoded_ioctl["Method"].name) 110 | print("Common Bit:", decoded_ioctl["Common Bit"]) 111 | 112 | if __name__ == "__main__": 113 | if len(sys.argv) > 1: 114 | ioctl_code = sys.argv[1] 115 | decoded_ioctl = decodeIt(ioctl_code) 116 | printDecoded(decoded_ioctl) 117 | else: 118 | print("This is a IOCTL decoder; please provide a IOCTL code to decode as an argument") 119 | # Sample IOCTL code 120 | ioctl_code = "0x830020C3" 121 | -------------------------------------------------------------------------------- /Pipeline/Fuzzifier/seeds/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Fuzzifier/seeds/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Housekeeper/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.11-slim 2 | 3 | RUN mkdir /app 4 | 5 | RUN apt-get update && apt-get install -y \ 6 | cabextract p7zip-full 7 | 8 | WORKDIR /app 9 | 10 | COPY requirements.txt /app/requirements.txt 11 | 12 | RUN pip3 install --upgrade pip && \ 13 | pip3 install -r requirements.txt 14 | 15 | COPY housekeeper.py /app/housekeeper.py 16 | COPY utils.py /app/utils.py 17 | 18 | # infinite loop to keep the container running 19 | #ENTRYPOINT ["tail", "-f", "/dev/null"] 20 | ENTRYPOINT [ "python3", "-O", "housekeeper.py" ] -------------------------------------------------------------------------------- /Pipeline/Housekeeper/README.md: -------------------------------------------------------------------------------- 1 | # Housekeeper Module: Storage Cleanup & Extraction 2 | 3 | ## Overview 4 | 5 | The Housekeeper module is designed to manage storage by extracting known archive formats and removing unnecessary files to optimize pipeline storage efficiency. 6 | Given storage constraints, the module extracts archives while also deleting files that are no longer needed. 7 | It operates by downloading files from the Coordinator, which provides an endpoint for specific file types like Microsoft Cabinet archives and installer executables. 8 | Extraction is performed using cabextract and 7-Zip, with the extracted files being reintroduced to the pipeline, maintaining a reference to the original file. 9 | 10 | ## File Removal 11 | 12 | In addition to extraction, Housekeeper removes files deemed unnecessary based on the Coordinator's identification results. 13 | Files unlikely to contain drivers, such as most text files (excluding INF files), are deleted. 14 | Successfully extracted archives are removed from the filesystem, though their database entries are preserved to avoid redundant inspections. 15 | This approach ensures efficient storage management while keeping the pipeline's database up-to-date. -------------------------------------------------------------------------------- /Pipeline/Housekeeper/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3.8" 2 | 3 | services: 4 | housekeeper: 5 | build: . 
6 | # volumes: 7 | # - .:/app 8 | #ports: 9 | # - "5000:5000" -------------------------------------------------------------------------------- /Pipeline/Housekeeper/requirements.txt: -------------------------------------------------------------------------------- 1 | requests -------------------------------------------------------------------------------- /Pipeline/Housekeeper/utils.py: -------------------------------------------------------------------------------- 1 | import os 2 | import requests 3 | import time 4 | import hashlib 5 | from urllib3.exceptions import InsecureRequestWarning 6 | 7 | # Suppress only the single warning from urllib3 needed. 8 | requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning) 9 | 10 | COORDINATOR_BASE_URL = 'http://coordinator:5000' 11 | #COORDINATOR_BASE_URL = 'http://COORDINATOR_IP:5000' 12 | 13 | def check_hashes(hashes): 14 | known_files = [] 15 | new_hashes = [] 16 | 17 | for hash in hashes: 18 | url = f'{COORDINATOR_BASE_URL}/driver-id/{hash}' 19 | #print(f"Fetching existing files info for page {page}...") 20 | response = requests.get(url, verify=False) 21 | if response.status_code == 200: 22 | print(f"Hash {hash} already exists.") 23 | known_files.append(hash) 24 | elif response.status_code == 400: 25 | print(f"Hash {hash} is an invalid hash?") 26 | else: 27 | new_hashes.append(hash) 28 | 29 | return new_hashes, known_files 30 | 31 | def calculate_sha256(full_path): 32 | with open(full_path, 'rb') as f: 33 | sha256 = hashlib.sha256() 34 | while True: 35 | data = f.read(65536) # lets read stuff in 64kb chunks! 36 | if not data: 37 | break 38 | sha256.update(data) 39 | return sha256.hexdigest() 40 | 41 | def upload_file(file_path, origin=None): 42 | sha256 = calculate_sha256(file_path) 43 | url = f'{COORDINATOR_BASE_URL}/ogfile/{sha256}' 44 | data = {'origin': origin} if origin else {} 45 | response = requests.post(url, data=data, verify=False) 46 | if response.status_code == 200: 47 | if 'ogfile_id' not in response.json(): 48 | return None, f"Success uploading file '{file_path}' but failed to get new_ogfile id for it" 49 | return response.json()['ogfile_id'], None 50 | elif response.status_code == 400 and 'Invalid hash length' in response.text: 51 | return None, f"[E] Hash {sha256} of {file_path} is an invalid hash?" 52 | elif response.status_code == 404: 53 | # This file does not yet exist, lets upload it 54 | files = {'file': open(file_path, 'rb')} 55 | data = {'origin': origin} if origin else {} 56 | # Suppressing the InsecureRequestWarning 57 | response = requests.post(f"{COORDINATOR_BASE_URL}/ogfile", files=files, data=data, verify=False) 58 | if response.status_code == 409 and 'already exists' in response.text: 59 | if 'ogfile_id' not in response.json(): 60 | return None, f"File '{file_path}' already exists, but new_ogfile id for it not responded" 61 | return response.json()['ogfile_id'], None 62 | elif response.status_code == 500 and 'database is locked' in response.text: 63 | # TODO change backend DB to not get (sqlite3.OperationalError) database is locked 64 | time.sleep(1) 65 | return upload_file(file_path, origin) 66 | elif response.status_code != 200: 67 | return None, f"Failed to upload file '{file_path}'. 
Status code: {response.status_code} and response: {response.text}" 68 | else: 69 | if 'ogfile_id' not in response.json(): 70 | return None, f"Success uploading file '{file_path}' but failed to get new_ogfile id for it" 71 | return response.json()['ogfile_id'], None 72 | elif response.status_code == 409 and 'already exists' in response.text: 73 | if 'ogfile_id' not in response.json(): 74 | return None, f"File '{file_path}' already exists, but new_ogfile id for it not responded" 75 | return response.json()['ogfile_id'], None 76 | else: 77 | return None, f"Failed to get file id for {file_path} hash {sha256}. Status code: {response.status_code} and response: {response.text}" 78 | -------------------------------------------------------------------------------- /Pipeline/Identifier/Dockerfile: -------------------------------------------------------------------------------- 1 | #################################################### 2 | # GOLANG BUILDER 3 | #################################################### 4 | FROM golang:1 as go_builder 5 | RUN apt-get update && apt-get install -y libmagic-dev libc6 6 | COPY . /go/src/github.com/maliceio/malice-fileinfo 7 | WORKDIR /go/src/github.com/maliceio/malice-fileinfo 8 | RUN go build -ldflags "-s -w -X main.Version=v$(cat VERSION) -X main.BuildTime=$(date -u +%Y%m%d)" -o /bin/info 9 | 10 | #################################################### 11 | # FILEINFO BUILDER 12 | #################################################### 13 | FROM ubuntu:jammy 14 | 15 | LABEL maintainer "https://github.com/blacktop" 16 | 17 | LABEL malice.plugin.repository = "https://github.com/malice-plugins/fileinfo.git" 18 | LABEL malice.plugin.category="metadata" 19 | LABEL malice.plugin.mime="*" 20 | LABEL malice.plugin.docker.engine="*" 21 | 22 | # Create a malice user and group first so the IDs get set the same way, even as 23 | # the rest of this may change over time. 24 | RUN groupadd -r malice \ 25 | && useradd --no-log-init -r -g malice malice \ 26 | && mkdir /malware \ 27 | && chown -R malice:malice /malware 28 | 29 | ENV SSDEEP 2.14.1 30 | ENV EXIFTOOL 12.76 31 | 32 | RUN buildDeps='ca-certificates \ 33 | build-essential \ 34 | openssl \ 35 | unzip \ 36 | curl' \ 37 | && set -x \ 38 | && apt-get update -qq \ 39 | && apt-get install -yq --no-install-recommends $buildDeps libmagic-dev libc6 cpanminus \ 40 | && echo "Downloading TRiD and Database..." \ 41 | && curl -Ls http://mark0.net/download/trid_linux_64.zip > /tmp/trid_linux_64.zip \ 42 | && curl -Ls http://mark0.net/download/triddefs.zip > /tmp/triddefs.zip \ 43 | && cd /tmp \ 44 | && unzip trid_linux_64.zip \ 45 | && unzip triddefs.zip \ 46 | && chmod +x trid \ 47 | && mv trid /usr/bin/ \ 48 | && mv triddefs.trd /usr/bin/ \ 49 | && echo "Installing ssdeep..." \ 50 | && curl -Ls https://github.com/ssdeep-project/ssdeep/releases/download/release-$SSDEEP/ssdeep-$SSDEEP.tar.gz > \ 51 | /tmp/ssdeep-$SSDEEP.tar.gz \ 52 | && cd /tmp \ 53 | && tar xzf ssdeep-$SSDEEP.tar.gz \ 54 | && cd ssdeep-$SSDEEP \ 55 | && ./configure \ 56 | && make \ 57 | && make install \ 58 | && echo "Installing exiftool..." \ 59 | && cpanm --notest File::Find Archive::Zip Compress::Raw::Zlib \ 60 | && curl -Ls https://exiftool.org/Image-ExifTool-$EXIFTOOL.tar.gz> \ 61 | /tmp/exiftool.tar.gz \ 62 | && cd /tmp \ 63 | && tar xzf exiftool.tar.gz \ 64 | && cd Image-ExifTool-$EXIFTOOL \ 65 | && perl Makefile.PL \ 66 | && make test \ 67 | && make install \ 68 | && echo "Clean up unnecessary files..." 
\ 69 | && apt-get purge -y --auto-remove $buildDeps \ 70 | && apt-get clean \ 71 | && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* /root/.gnupg 72 | 73 | RUN apt-get update -qq && apt-get install -yq --no-install-recommends ca-certificates gosu \ 74 | && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* 75 | 76 | COPY --from=go_builder /bin/info /bin/info 77 | 78 | #################################################### 79 | ### FROM HERE NEW ADDITIONS to dockerfile ### 80 | RUN mkdir /app 81 | 82 | WORKDIR /app 83 | 84 | # infinite loop to keep the container running 85 | #ENTRYPOINT ["tail", "-f", "/dev/null"] 86 | ENTRYPOINT [ "gosu", "malice", "info" ] -------------------------------------------------------------------------------- /Pipeline/Identifier/LICENSE.old: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2016 Malice.IO 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /Pipeline/Identifier/README.md: -------------------------------------------------------------------------------- 1 | # Identifier Module: File Classification 2 | 3 | ## Overview 4 | 5 | The Identifier module is part of a file classification pipeline designed to complete missing metadata and determine file types, with a focus on identifying Windows drivers. It integrates with the Coordinator module to process and store identification results. Building on the Malice File Info Plugin, the Identifier replaces the original web service with a polling loop for unidentified files, leveraging tools like ExifTool and TrID for accurate file type detection. 6 | 7 | ## Features 8 | 9 | - **File Classification**: Uses heuristic methods to determine if a file is a Windows executable based on TrID results, MIME type, file extension, and architecture. 10 | - **Driver Detection**: Examines the Portable Executable (PE) header to identify Windows drivers by checking for imports from `ntoskrnl.exe` or `wdfldr.sys`. 11 | - **Metadata Extraction**: For files identified as Windows drivers, it extracts and stores key strings such as SDDL strings, "PhysicalMemory," and symbolic device names. 12 | 13 | ## Acknowledgement 14 | The "base image" for this container is [malice-plugin/fileinfo](https://github.com/malice-plugins/fileinfo). 
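
For illustration, the driver check described under Features can be sketched in a few lines of Python with the `pefile` library (a hypothetical re-implementation; the module itself performs this check in Go):

```python
import pefile

def is_windows_driver(path: str) -> bool:
    """Heuristic from the Features section: a PE file that imports from the
    kernel (ntoskrnl.exe) or the KMDF loader (wdfldr.sys) is likely a driver."""
    try:
        pe = pefile.PE(path, fast_load=True)
        # Only the import directory is needed for this check.
        pe.parse_data_directories(
            directories=[pefile.DIRECTORY_ENTRY["IMAGE_DIRECTORY_ENTRY_IMPORT"]]
        )
    except pefile.PEFormatError:
        return False  # not a PE file at all
    imports = {
        entry.dll.decode(errors="ignore").lower()
        for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", [])
    }
    return "ntoskrnl.exe" in imports or "wdfldr.sys" in imports
```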
15 | 16 | ## License 17 | This project is under [GPLv3](../../LICENSE), but because this module is heavily based on malice-plugins/fileinfo, any part of that previous project is still under [LICENSE.old](./LICENSE.old). 18 | -------------------------------------------------------------------------------- /Pipeline/Identifier/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3.8" 2 | 3 | services: 4 | identifier: 5 | build: . -------------------------------------------------------------------------------- /Pipeline/Identifier/go.mod: -------------------------------------------------------------------------------- 1 | module github.com/malice-plugins/fileinfo 2 | 3 | go 1.21.3 4 | 5 | require ( 6 | github.com/gorilla/mux v1.8.1 7 | github.com/malice-plugins/pkgs v1.1.7 8 | github.com/parnurzeal/gorequest v0.2.16 9 | github.com/rakyll/magicmime v0.1.0 10 | github.com/sirupsen/logrus v1.9.3 11 | gopkg.in/resty.v1 v1.12.0 12 | ) 13 | 14 | require ( 15 | github.com/elazarl/goproxy v0.0.0-20231117061959-7cc037d33fb5 // indirect 16 | github.com/pkg/errors v0.9.1 // indirect 17 | github.com/smartystreets/goconvey v1.8.1 // indirect 18 | github.com/stretchr/testify v1.8.4 // indirect 19 | golang.org/x/net v0.0.0-20200320220750-118fecf932d8 // indirect 20 | golang.org/x/sys v0.6.0 // indirect 21 | moul.io/http2curl v1.0.0 // indirect 22 | ) 23 | -------------------------------------------------------------------------------- /Pipeline/Identifier/identifier.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | import ( 4 | "context" 5 | "encoding/json" 6 | "fmt" 7 | "log" 8 | "os" 9 | "io/ioutil" 10 | "time" 11 | "net/http" 12 | "crypto/tls" 13 | 14 | "gopkg.in/resty.v1" 15 | "github.com/malice-plugins/pkgs/utils" 16 | ) 17 | 18 | const ( 19 | coordinatorURL = "http://coordinator:5000" 20 | ) 21 | 22 | type FilesResponse struct { 23 | Files []File `json:"files"` 24 | } 25 | 26 | // File represents information about a file 27 | type File struct { 28 | ID int `json:"id"` 29 | Filename string `json:"filename"` 30 | Path string `json:"path"` 31 | SHA1 string `json:"sha1"` 32 | SHA256 string `json:"sha256"` 33 | Size int `json:"size"` 34 | SSDeep string `json:"ssdeep"` 35 | } 36 | 37 | func main() { 38 | http.DefaultTransport.(*http.Transport).TLSClientConfig = &tls.Config{InsecureSkipVerify: true} 39 | for { 40 | files := getNonIdentifiedFiles() 41 | for _, file := range files { 42 | fileInfo := identifyFile(file) 43 | 44 | // Send the file info to the coordinator as a json 45 | fileInfoJSON, err := json.Marshal(fileInfo) 46 | if err != nil { 47 | log.Fatalf("failed to marshal file info: %v", err) 48 | } 49 | // fmt.Println(string(fileInfoJSON)) 50 | resp, err := resty.R(). 51 | SetHeader("Content-Type", "application/json"). 52 | SetBody(fileInfoJSON).
53 | Post(coordinatorURL + "/files/" + fmt.Sprintf("%d", file.ID)) 54 | if err != nil { 55 | log.Fatalf("failed to send file info: %v", err) 56 | } 57 | if resp.StatusCode() != 200 { 58 | log.Fatalf("failed to send file info: %d and error %s", resp.StatusCode(), resp.Body()) 59 | } 60 | } 61 | time.Sleep(10 * time.Second) 62 | } 63 | } 64 | 65 | func getNonIdentifiedFiles() []File { 66 | var files FilesResponse 67 | resp, err := resty.R().Get(coordinatorURL + "/unidentified-files-info") 68 | if err != nil { 69 | log.Fatalf("failed to fetch non-identified files: %v", err) 70 | } 71 | if resp.StatusCode() != 200 { 72 | log.Fatalf("failed to fetch non-identified files: %d", resp.StatusCode()) 73 | } 74 | err = json.Unmarshal(resp.Body(), &files) 75 | if err != nil { 76 | log.Fatalf("failed to unmarshal response: %v", err) 77 | } 78 | return files.Files 79 | } 80 | 81 | func identifyFile(file File) FileInfo { 82 | // Get the actual file by its ID, save it in tmp folder 83 | resp, err := resty.R().Get(coordinatorURL + "/files/" + fmt.Sprintf("%d", file.ID)) 84 | if err != nil { 85 | log.Fatalf("failed to fetch file %s: %v", file.Filename, err) 86 | } 87 | if resp.StatusCode() != 200 { 88 | log.Fatalf("failed to fetch file %s: %d", file.Filename, resp.StatusCode()) 89 | } 90 | 91 | tmpfile, err := ioutil.TempFile("/tmp", "ident_") 92 | if err != nil { 93 | log.Fatal(err) 94 | } 95 | defer os.Remove(tmpfile.Name()) // clean up 96 | 97 | if _, err = tmpfile.Write(resp.Body()); err != nil { 98 | log.Fatal(err) 99 | } 100 | if err = tmpfile.Close(); err != nil { 101 | log.Fatal(err) 102 | } 103 | 104 | ctx, cancel := context.WithTimeout(context.Background(), time.Duration(60)*time.Second) 105 | defer cancel() 106 | 107 | // Do FileInfo scan 108 | mtx.Lock() 109 | defer mtx.Unlock() 110 | path := tmpfile.Name() 111 | GetFileMimeType(ctx, path) 112 | GetFileDescription(ctx, path) 113 | 114 | fileInfo := FileInfo{ 115 | Magic: fi.Magic, 116 | SSDeep: ParseSsdeepOutput(utils.RunCommand(ctx, "ssdeep", path)), 117 | TRiD: ParseTRiDOutput(utils.RunCommand(ctx, "trid", path)), 118 | Exiftool: ParseExiftoolOutput(utils.RunCommand(ctx, "exiftool", path)), 119 | } 120 | 121 | return fileInfo 122 | } 123 | -------------------------------------------------------------------------------- /Pipeline/Identifier/requirements.txt: -------------------------------------------------------------------------------- 1 | requests -------------------------------------------------------------------------------- /Pipeline/Identifier/scan.go: -------------------------------------------------------------------------------- 1 | package main 2 | // TODO Add LICENSE file back (was MIT before modification) 3 | 4 | import ( 5 | "bytes" 6 | "context" 7 | "crypto/md5" 8 | "encoding/json" 9 | "fmt" 10 | "html/template" 11 | "io/ioutil" 12 | "net/http" 13 | "os" 14 | "strings" 15 | "sync" 16 | "time" 17 | 18 | log "github.com/sirupsen/logrus" 19 | "github.com/gorilla/mux" 20 | "github.com/malice-plugins/pkgs/utils" 21 | "github.com/parnurzeal/gorequest" 22 | "github.com/rakyll/magicmime" 23 | ) 24 | 25 | const ( 26 | name = "fileinfo" 27 | category = "metadata" 28 | ) 29 | 30 | var ( 31 | // Version stores the plugin's version 32 | Version string 33 | // BuildTime stores the plugin's build time 34 | BuildTime string 35 | 36 | fi FileInfo 37 | 38 | mtx sync.Mutex 39 | ) 40 | 41 | type pluginResults struct { 42 | ID string `structs:"id"` 43 | FileInfo FileInfo `structs:"fileinfo"` 44 | } 45 | 46 | // FileMagic is file magic 47 | type 
FileMagic struct { 48 | Mime string `json:"mime" structs:"mime"` 49 | Description string `json:"description" structs:"description"` 50 | } 51 | 52 | // FileInfo json object 53 | type FileInfo struct { 54 | Magic FileMagic `json:"magic" structs:"magic"` 55 | SSDeep string `json:"ssdeep" structs:"ssdeep"` 56 | TRiD []string `json:"trid" structs:"trid"` 57 | Exiftool map[string]string `json:"exiftool" structs:"exiftool"` 58 | MarkDown string `json:"markdown,omitempty" structs:"markdown,omitempty"` 59 | } 60 | 61 | // GetFileMimeType returns the mime-type of a file path 62 | func GetFileMimeType(ctx context.Context, path string) error { 63 | 64 | utils.Assert(magicmime.Open(magicmime.MAGIC_MIME_TYPE | magicmime.MAGIC_SYMLINK | magicmime.MAGIC_ERROR)) 65 | defer magicmime.Close() 66 | 67 | mimetype, err := magicmime.TypeByFile(path) 68 | if err != nil { 69 | fi.Magic.Mime = err.Error() 70 | return err 71 | } 72 | 73 | fi.Magic.Mime = mimetype 74 | return nil 75 | } 76 | 77 | // GetFileDescription returns the textual libmagic type of a file path 78 | func GetFileDescription(ctx context.Context, path string) error { 79 | 80 | utils.Assert(magicmime.Open(magicmime.MAGIC_SYMLINK | magicmime.MAGIC_ERROR)) 81 | defer magicmime.Close() 82 | 83 | magicdesc, err := magicmime.TypeByFile(path) 84 | if err != nil { 85 | fi.Magic.Description = err.Error() 86 | return err 87 | } 88 | 89 | fi.Magic.Description = magicdesc 90 | return nil 91 | } 92 | 93 | // ParseExiftoolOutput convert exiftool output into JSON 94 | func ParseExiftoolOutput(exifout string, err error) map[string]string { 95 | 96 | if err != nil { 97 | m := make(map[string]string) 98 | m["error"] = err.Error() 99 | return m 100 | } 101 | 102 | var ignoreTags = []string{ 103 | "Directory", 104 | "File Name", 105 | "File Permissions", 106 | "File Modification Date/Time", 107 | } 108 | 109 | lines := strings.Split(exifout, "\n") 110 | 111 | log.Debugln("Exiftool lines: ", lines) 112 | 113 | if utils.SliceContainsString("File not found", lines) { 114 | return nil 115 | } 116 | 117 | datas := make(map[string]string, len(lines)) 118 | 119 | for _, line := range lines { 120 | keyvalue := strings.Split(line, ":") 121 | if len(keyvalue) != 2 { 122 | continue 123 | } 124 | if !utils.StringInSlice(strings.TrimSpace(keyvalue[0]), ignoreTags) { 125 | datas[strings.TrimSpace(utils.CamelCase(keyvalue[0]))] = strings.TrimSpace(keyvalue[1]) 126 | } 127 | } 128 | 129 | return datas 130 | } 131 | 132 | // ParseSsdeepOutput convert ssdeep output into JSON 133 | func ParseSsdeepOutput(ssdout string, err error) string { 134 | 135 | if err != nil { 136 | return err.Error() 137 | } 138 | 139 | // Break output into lines 140 | lines := strings.Split(ssdout, "\n") 141 | 142 | log.Debugln("ssdeep lines: ", lines) 143 | 144 | if utils.SliceContainsString("No such file or directory", lines) { 145 | return "" 146 | } 147 | 148 | // Break second line into hash and path 149 | hashAndPath := strings.Split(lines[1], ",") 150 | 151 | return strings.TrimSpace(hashAndPath[0]) 152 | } 153 | 154 | // ParseTRiDOutput convert trid output into JSON 155 | func ParseTRiDOutput(tridout string, err error) []string { 156 | 157 | if err != nil { 158 | return []string{err.Error()} 159 | } 160 | 161 | keepLines := []string{} 162 | 163 | lines := strings.Split(tridout, "\n") 164 | 165 | log.Debugln("TRiD lines: ", lines) 166 | 167 | if utils.SliceContainsString("Error: found no file(s) to analyze!", lines) { 168 | return nil 169 | } 170 | 171 | lines = lines[6:] 172 | 173 | for _, line := range 
lines { 174 | if len(strings.TrimSpace(line)) != 0 { 175 | keepLines = append(keepLines, strings.TrimSpace(line)) 176 | } 177 | } 178 | 179 | return keepLines 180 | } 181 | -------------------------------------------------------------------------------- /Pipeline/Identifier/template.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | const tpl = `{{ if .Magic}}#### Magic 4 | | Field | Value | 5 | |-------------|------------------------| 6 | | Mime | {{.Magic.Mime}} | 7 | | Description | {{.Magic.Description}} | 8 | {{ end -}} 9 | {{- if .SSDeep}} 10 | #### SSDeep 11 | - ` + "`" + `{{.SSDeep}}` + "`" + ` 12 | {{ end -}} 13 | {{- if .TRiD}} 14 | #### TRiD 15 | {{ range .TRiD -}} 16 | - {{ . }} 17 | {{end}} 18 | {{- end }} 19 | {{- if .Exiftool}} 20 | #### Exiftool 21 | | Field | Value | 22 | |-------------|----------------------| 23 | {{- range $key, $value := .Exiftool }} 24 | | {{ $key }} | {{ $value }} | 25 | {{- end }} 26 | {{- end }} 27 | ` 28 | -------------------------------------------------------------------------------- /Pipeline/Importers/README.md: -------------------------------------------------------------------------------- 1 | # Driver Importers 2 | 3 | ## Overview 4 | 5 | This folder includes a set of tools designed to import Windows kernel drivers into the pipeline. 6 | These include the Manual Importer, VirusTotal Importer and CDC Importer. 7 | 8 | ## Importers 9 | 10 | ### Manual Importer 11 | The [Manual Importer](./recursiveFileImporter.py) allows users to recursively search a specified folder for driver files and upload any new files not already present in the pipeline. 12 | The user defines the origin string for each file, and duplicate files are linked to existing entries, updating only the origin if it’s new. 13 | This importer supports collecting drivers from various sources, including manual downloads and operational Windows systems (see the usage example at the end of this README). 14 | 15 | ### VirusTotal Importer 16 | The [VirusTotal Importer](./smartVTscrape.py) utilizes the VirusTotal file download API to retrieve Windows drivers that match specific queries. 17 | This method is optimized to minimize API usage by avoiding redundant downloads. 18 | 19 | ### CDC Importer 20 | The CDC importer is nothing more than a [parser](./gatherMDEfiles.py) for the Microsoft XDR export combined with the VirusTotal importer.
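
As a usage example, the Manual Importer takes the folder to scan as a positional argument plus an optional origin label, matching the two arguments defined via argparse in [recursiveFileImporter.py](./recursiveFileImporter.py) (the folder path and label below are placeholders):

```
python recursiveFileImporter.py ./collected_drivers --origin "manual import"
```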
-------------------------------------------------------------------------------- /Pipeline/Importers/VTresults/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Importers/VTresults/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Importers/cursors/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Importers/cursors/.gitkeep -------------------------------------------------------------------------------- /Pipeline/Importers/gatherMDEfiles.py: -------------------------------------------------------------------------------- 1 | import os 2 | import csv 3 | import json 4 | from sys import argv 5 | 6 | from VTinterface import download_files_individually, get_VT_quotas, get_VT_usage 7 | from utils import download_file, check_hashes 8 | 9 | 10 | def get_hashes_MDE(csv_file_path): 11 | """Reads the MDE export csv file and returns a list of sha1 hashes for all found drivers.""" 12 | hashes = set() 13 | 14 | with open(csv_file_path, newline='') as csvfile: 15 | reader = csv.reader(csvfile, delimiter=',', quotechar='"') 16 | next(reader) 17 | for sha1,Signer,EventCount in reader: 18 | assert len(sha1) == 40 19 | hashes.add(sha1) 20 | 21 | return list(hashes) 22 | 23 | if __name__ == "__main__": 24 | if len(argv) != 2: 25 | print(f"Usage: {argv[0]} ") 26 | exit(1) 27 | 28 | output_dir = 'downloaded_files' 29 | os.makedirs(output_dir, exist_ok=True) 30 | # check the output directory has no files 31 | if len(os.listdir(output_dir)) > 0: 32 | print(f"Output directory {output_dir} is not empty. Exiting.") 33 | exit(1) 34 | 35 | csv_file_path = argv[1] 36 | hashes = get_hashes_MDE(csv_file_path) 37 | print(f"Gotten {len(hashes)} files from {csv_file_path}...") 38 | 39 | # check which already exist 40 | hashes, known_files = check_hashes(hashes) 41 | 42 | remaining_quotas = get_VT_quotas() 43 | print(f"Remaining quotas: {json.dumps(remaining_quotas, indent=3)}") 44 | assert remaining_quotas['api']['daily'] - len(hashes) > 0, f"If you'd download you'd use up other quota." 45 | 46 | api_usage = get_VT_usage() 47 | print(f"This API key usage: {json.dumps(api_usage, indent=3)}") 48 | 49 | if input(f"Do you want to continue and download? Would cost {len(hashes)} api calls. (Yes/No) ") != 'Yes': 50 | print("Exiting...") 51 | exit(0) 52 | else: 53 | # download unknown from VT, if they exist 54 | print(f"Downloading {len(hashes)} files, bc they are new...") 55 | not_found = download_files_individually(hashes, output_dir) # Currently broken + still uses quota?? 56 | # download those that are known from our coordinator 57 | for hash in known_files: 58 | file_path = os.path.join(output_dir, hash) 59 | if not download_file(hash=hash, driver_destination=file_path): 60 | print(f"Failed to download file {hash} from coordinator.") 61 | 62 | # write a csv file with the not found hashes, i.e. 
those we have to get through MDE 63 | if len(not_found) > 0: 64 | with open(os.path.join(output_dir, "not_found_hashes.csv"), "w") as f: 65 | for hash in not_found: 66 | f.write(f"{hash}\n") 67 | print(f"Written {len(not_found)} hashes to not_found_hashes.csv.") 68 | else: 69 | print("All files downloaded successfully.") 70 | 71 | remaining_quotas = get_VT_quotas() 72 | print(f"Remaining quotas: {json.dumps(remaining_quotas, indent=3)}") -------------------------------------------------------------------------------- /Pipeline/Importers/recursiveFileImporter.py: -------------------------------------------------------------------------------- 1 | import os 2 | import argparse 3 | import requests 4 | import time 5 | import hashlib 6 | from utils import upload_file 7 | 8 | def upload_files_to_(folder_path, origin=None): 9 | for filename in os.listdir(folder_path): 10 | file_path = os.path.join(folder_path, filename) 11 | if os.path.isfile(file_path): 12 | upload_file(file_path, origin) 13 | elif os.path.isdir(file_path): 14 | upload_files_to_(file_path, origin) 15 | 16 | def main(): 17 | parser = argparse.ArgumentParser(description="Upload files to /ogfile destination") 18 | parser.add_argument("folder_path", help="Path to the folder containing files to upload") 19 | parser.add_argument("--origin", help="Origin for the files (optional)") 20 | args = parser.parse_args() 21 | 22 | folder_path = args.folder_path 23 | origin = args.origin 24 | 25 | if os.path.exists(folder_path): 26 | upload_files_to_(folder_path, origin) 27 | else: 28 | print("Folder not found.") 29 | 30 | 31 | if __name__ == "__main__": 32 | main() 33 | -------------------------------------------------------------------------------- /Pipeline/Importers/utils.py: -------------------------------------------------------------------------------- 1 | import os 2 | import requests 3 | import time 4 | import hashlib 5 | from urllib3.exceptions import InsecureRequestWarning 6 | 7 | # Suppress only the single warning from urllib3 8 | requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning) 9 | 10 | COORDINATOR_BASE_URL = 'http://COORDINATOR_IP:5000' 11 | 12 | def check_hashes(hashes): 13 | known_files = [] 14 | new_hashes = [] 15 | 16 | for hash in hashes: 17 | url = f'{COORDINATOR_BASE_URL}/driver-id/{hash}' 18 | #print(f"Fetching existing files info for page {page}...") 19 | response = requests.get(url, verify=False) 20 | if response.status_code == 200: 21 | print(f"Hash {hash} already exists.") 22 | known_files.append(hash) 23 | elif response.status_code == 400: 24 | print(f"Hash {hash} is an invalid hash?") 25 | else: 26 | new_hashes.append(hash) 27 | 28 | return new_hashes, known_files 29 | 30 | def download_file(hash, driver_destination): 31 | url = f'{COORDINATOR_BASE_URL}/driver-id/{hash}' 32 | response = requests.get(url, verify=False) 33 | if response.status_code != 200: 34 | print(f"Failed to get driver id for hash {hash}. Status code: {response.status_code}") 35 | return False 36 | driver_id = response.json()['driver_id'] 37 | 38 | # get file_id from drivers endpoint 39 | url = f'{COORDINATOR_BASE_URL}/drivers/{driver_id}' 40 | response = requests.get(url, verify=False) 41 | if response.status_code != 200: 42 | print(f"Failed to get file id for hash {hash}. 
Status code: {response.status_code}") 43 | return False 44 | file_id = response.json()['driver']['file'] 45 | 46 | # download the file 47 | url = f'{COORDINATOR_BASE_URL}/files/{file_id}' 48 | response = requests.get(url, verify=False) 49 | if response.status_code != 200: 50 | print(f"Failed to download the file {file_id}. Status code: {response.status_code}") 51 | return False 52 | # first check if file exists, if so error out 53 | if os.path.exists(driver_destination): 54 | print(f"File {driver_destination} already exists in temp folder. Exiting.") 55 | return False 56 | with open(driver_destination, 'wb') as f: 57 | f.write(response.content) 58 | 59 | return True 60 | 61 | def calculate_sha256(full_path): 62 | with open(full_path, 'rb') as f: 63 | sha256 = hashlib.sha256() 64 | while True: 65 | data = f.read(65536) # lets read stuff in 64kb chunks! 66 | if not data: 67 | break 68 | sha256.update(data) 69 | return sha256.hexdigest() 70 | 71 | def upload_file(file_path, origin=None): 72 | sha256 = calculate_sha256(file_path) 73 | url = f'{COORDINATOR_BASE_URL}/ogfile/{sha256}' 74 | data = {'origin': origin} if origin else {} 75 | # first try to add origin to already existing file 76 | response = requests.post(url, data=data, verify=False) 77 | if response.status_code == 200: 78 | pass 79 | elif response.status_code == 400 and 'Invalid hash length' in response.text: 80 | print(f"Hash {sha256} of {file_path} is an invalid hash?") 81 | elif response.status_code == 404: 82 | # This file does not yet exist, lets upload it 83 | files = {'file': open(file_path, 'rb')} 84 | # Suppressing the InsecureRequestWarning 85 | response = requests.post(f"{COORDINATOR_BASE_URL}/ogfile", files=files, data=data, verify=False) 86 | if response.status_code == 409 and 'already exists' in response.text: 87 | pass 88 | elif response.status_code == 500 and 'database is locked' in response.text: 89 | # change backend DB to not get (sqlite3.OperationalError) database is locked - should not happen anymore 90 | time.sleep(1) 91 | return upload_file(file_path, origin) 92 | elif response.status_code != 200: 93 | print(f"Failed to upload file '{file_path}'. Status code: {response.status_code} and response: {response.text}") 94 | else: 95 | print(f"Failed to get file id for {file_path} hash {sha256}. Status code: {response.status_code} and response: {response.text}") 96 | -------------------------------------------------------------------------------- /Pipeline/Pathfinder/README.md: -------------------------------------------------------------------------------- 1 | # Pathfinder Module: Static Reachability Analysis 2 | 3 | ## Overview 4 | 5 | This is an advanced module built upon and extending VMware Threat Analysis Unit's driver analysis script. 6 | It interacts with the Coordinator module to update driver metadata, including driver frameworks, IOCTL handler addresses, kernel API paths, and WDF functions used. 7 | The module executes the customized IDAPython script using IDA Pro 8.4, with results communicated through a result file managed via Python's subprocess functionality. 8 | 9 | ## Key Features 10 | - **Enhanced Driver Compatibility**: Supports ARM64 drivers by integrating new [type information libraries](./WinARMTil/) and improving recognition of driver IOCTL handlers for AMD64. 11 | - **Improved Script Execution**: Optimizations include replacing global variable debug conditions and reducing redundant decompilation operations to speed up analysis. 
12 | - **Extended Functionality**: Adds features like decompilation context retrieval and WDF import list completion, improving verification speed for Windows Driver Framework drivers. 13 | - **IOCTL Code Extraction**: Implements a new feature to identify Input/Output Control Codes using various checks to aid in fuzzing efforts through the Fuzzifier. 14 | 15 | ## Usage 16 | To use Pathfinder, integrate it with the Coordinator module and run the modified IDAPython script with IDA Pro. 17 | **DO NOT FORGET TO START WITH** `$env:PYTHONOPTIMIZE = 'x'`, as failure to do so causes errors. 18 | The results will be written to a specified location, which Pathfinder will then read and process. 19 | For detailed setup and configuration instructions, refer to the original blog post. 20 | 21 | ## Acknowledgments 22 | 23 | Pathfinder is based on a [modified set of scripts](./VDR/) from VMware TAU's [Threat Analysis Unit - Hunting Vulnerable Kernel Drivers](https://blogs.vmware.com/security/2023/10/hunting-vulnerable-kernel-drivers.html) blog post. 24 | We extend our gratitude to VMware TAU for their work that significantly informed the development of this module. 25 | 26 | ## License 27 | This project is under [GPLv3](../../LICENSE). -------------------------------------------------------------------------------- /Pipeline/Pathfinder/VDR/.env: -------------------------------------------------------------------------------- 1 | PYTHONPATH=%PYTHONPATH%;C:\Program Files\IDA Pro 8.4\python\3 -------------------------------------------------------------------------------- /Pipeline/Pathfinder/VDR/kmdf_re/README.md: -------------------------------------------------------------------------------- 1 | # Reverse Engineering and Bug Hunting on KMDF Drivers 2 | 3 | Link to slides: https://ioactive.com/wp-content/uploads/2018/09/Reverse_Engineering_and_Bug_Hunting_On_KMDF_Drivers.pdf 4 | 5 | kmdf_re is a small IDAPython script that attempts to rename common structures and find usages of interesting KMDF callbacks.
6 | 7 | Presentation given at AsiaSecWest 2018 (https://www.asiasecwest.com) and 44Con 2018 (https://44con.com/) 8 | 9 | ## Author 10 | * [Enrique Nissim](https://twitter.com/kiqueNissim) -------------------------------------------------------------------------------- /Pipeline/Pathfinder/VDR/requirements.txt: -------------------------------------------------------------------------------- 1 | pefile 2 | -------------------------------------------------------------------------------- /Pipeline/Pathfinder/VDR/vdr_arm.md: -------------------------------------------------------------------------------- 1 | # IDA scripting stuff 2 | 3 | [cot_asg = Is Assignment?](https://hex-rays.com/products/ida/support/idapython_docs/ida_hexrays.html#ida_hexrays.cot_asg) 4 | [cot_idx = x[y] indexing](https://hex-rays.com/products/ida/support/idapython_docs/ida_hexrays.html#ida_hexrays.cot_idx) -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/ARMtils/NTAPI_win10_ARM64.til: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Pathfinder/WinARMTil/ARMtils/NTAPI_win10_ARM64.til -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/ARMtils/NTDDK_win10_ARM64.til: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Pathfinder/WinARMTil/ARMtils/NTDDK_win10_ARM64.til -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/ARMtils/NTWDF_win10_ARM64.til: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Pathfinder/WinARMTil/ARMtils/NTWDF_win10_ARM64.til -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/NTAPI_win10_ARM64.til: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Pathfinder/WinARMTil/NTAPI_win10_ARM64.til -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/NTDDK_win10_ARM64.til: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/Pathfinder/WinARMTil/NTDDK_win10_ARM64.til -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/README.md: -------------------------------------------------------------------------------- 1 | # TIL creation for NTDDK 2 | 3 | These type libraries were created by combining [this](https://reverseengineering.stackexchange.com/questions/18669/how-to-make-type-libraries-from-windows-10-sdk-and-ddk) Stack Exchange post, [these](https://hex-rays.com/blog/igors-tip-of-the-week-60-type-libraries/) tips from a Hex-Rays blog, and [this](https://blog.nviso.eu/2023/11/07/generating-ida-type-information-libraries-from-windows-type-libraries/) example blog post on how to use idaclang.
4 | 5 | For NTDDK, a single struct fails to compile: comment out the `_WHEAP_ROW_FAILURE_EVENT` struct in ntddk.h, or use the [ntddk_mod.h](./ntddk_mod.h) directly when [building NTDDK_ARM](./buildNTDDK.bat). 6 | [Building NTWDF ARM](./buildNTWDF.bat) therefore requires the same modification of the original ntddk.h. -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/buildNTAPI.bat: -------------------------------------------------------------------------------- 1 | cls 2 | set ver=10.0.22621.0 3 | set folder=%ProgramFiles(x86)%\Windows Kits\10\Include\%ver% 4 | .\idaclang.exe ^ 5 | -x c++ ^ 6 | -target arm64-pc-windows ^ 7 | -ferror-limit=0 ^ 8 | --idaclang-log-target ^ 9 | --idaclang-tildesc "NTAPI Win10 ARM64" ^ 10 | --idaclang-tilname "NTAPI_win10_ARM64.til" ^ 11 | -I"%folder%\cppwinrt\winrt" ^ 12 | -I"%folder%\km" ^ 13 | -I"%folder%\km\crt" ^ 14 | -I"%folder%\shared" ^ 15 | -I"%folder%\ucrt" ^ 16 | -I"%folder%\um" ^ 17 | -I"%folder%\winrt" ^ 18 | "%folder%\um\Windows.h" -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/buildNTDDK.bat: -------------------------------------------------------------------------------- 1 | cls 2 | set ver=10.0.22621.0 3 | set folder=%ProgramFiles(x86)%\Windows Kits\10\Include\%ver% 4 | .\idaclang.exe ^ 5 | -x c++ ^ 6 | -target arm64-pc-windows ^ 7 | -ferror-limit=0 ^ 8 | --idaclang-log-target ^ 9 | --idaclang-tildesc "NTDDK Win10 ARM64" ^ 10 | --idaclang-tilname "NTDDK_win10_ARM64.til" ^ 11 | -I"%folder%\cppwinrt\winrt" ^ 12 | -I"%folder%\km" ^ 13 | -I"%folder%\km\crt" ^ 14 | -I"%folder%\shared" ^ 15 | -I"%folder%\ucrt" ^ 16 | -I"%folder%\um" ^ 17 | -I"%folder%\winrt" ^ 18 | .\ntddk_mod.h -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/buildNTWDF.bat: -------------------------------------------------------------------------------- 1 | cls 2 | set verWDF=1.33 3 | set folderWDF=%ProgramFiles(x86)%\Windows Kits\10\Include\wdf\kmdf\%verWDF% 4 | set ver=10.0.22621.0 5 | set folder=%ProgramFiles(x86)%\Windows Kits\10\Include\%ver% 6 | .\idaclang.exe ^ 7 | -x c++ ^ 8 | -target arm64-pc-windows ^ 9 | -ferror-limit=0 ^ 10 | --idaclang-log-target ^ 11 | --idaclang-tildesc "NTWDF Win10 ARM64" ^ 12 | --idaclang-tilname "NTWDF_win10_ARM64.til" ^ 13 | -I"%folder%\cppwinrt\winrt" ^ 14 | -I"%folder%\km" ^ 15 | -I"%folder%\km\crt" ^ 16 | -I"%folder%\shared" ^ 17 | -I"%folder%\ucrt" ^ 18 | -I"%folder%\um" ^ 19 | -I"%folder%\winrt" ^ 20 | -I"%folderWDF%" ^ 21 | .\wdf_mod.h -------------------------------------------------------------------------------- /Pipeline/Pathfinder/WinARMTil/wdf_mod.h: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | 4 | #include 5 | #include // for SDDLs -------------------------------------------------------------------------------- /Pipeline/Pathfinder/functionsTree.py: -------------------------------------------------------------------------------- 1 | class FuncNode(): 2 | """Represents a node in the function tree""" 3 | 4 | def __init__(self, addr, name=None, depth=0) -> None: 5 | self.addr = addr 6 | self.name = name 7 | self.depth = depth 8 | self.children = [] 9 | 10 | def getOrAdd(self, addr): 11 | for c in self.children: 12 | if c.addr == addr: 13 | return c 14 | 15 | c = FuncNode(addr, depth=self.depth + 1) 16 | self.children.append(c) 17 | return c 18 | 19 | def __str__(self) -> str: 20 | 
"""Prints the tree starting from this node""" 21 | indent = " " * self.depth 22 | s = f"{indent}0x{self.addr:02X}{' ' + self.name if self.name is not None else ''}{' {' if len(self.children) > 0 else ''}\n" 23 | for c in self.children: 24 | s += str(c) 25 | s += f'{indent}}}\n' if len(self.children) > 0 else '' 26 | return s 27 | 28 | def makeFunctionTree(handler_addr, paths, root_name=None): 29 | root = FuncNode(handler_addr, depth=0) 30 | root.name = root_name 31 | 32 | for p in paths: 33 | # First check if the path is relevant to this tree 34 | if p['path'][0] != handler_addr: 35 | continue 36 | 37 | # Add all intermediate nodes 38 | current = root.getOrAdd(p['path'][1]) 39 | if len(p['path']) > 2: 40 | for addr in p['path'][2:]: 41 | current = current.getOrAdd(addr) 42 | 43 | # Add the leaf node name 44 | current.name = p['name'] 45 | 46 | return root 47 | 48 | def combinedSubfunctions(tree, depth=0): 49 | """Returns a list of all subfunctions that have more than one child on any level """ 50 | if len(tree.children) == 0: 51 | return [] 52 | 53 | subfunctions = [] 54 | for c in tree.children: 55 | if len(c.children) > 1: 56 | subfunctions.append(c) 57 | subfunctions += combinedSubfunctions(c, depth + 1) 58 | 59 | return subfunctions 60 | 61 | if __name__ == "__main__": 62 | ## Test the functionality 63 | import json 64 | RESULT_FILE = '.\\ida_ioctl_res.json' 65 | results = {} 66 | with open(RESULT_FILE, "r") as f: 67 | results = json.loads(f.read()) 68 | 69 | for handler in results['handler_addrs']: 70 | tree = makeFunctionTree(handler, results['target_paths'] + results['helper_paths'], results['handler_type']) 71 | print(tree) 72 | print(f"{len(combinedSubfunctions(tree))} combined subfunctions:") 73 | for sub in combinedSubfunctions(tree): 74 | print(sub) 75 | -------------------------------------------------------------------------------- /Pipeline/Pathfinder/interestingFunctions.csv: -------------------------------------------------------------------------------- 1 | function, value 2 | MmMapIoSpace, 5 3 | MmMapIoSpaceEx, 5 4 | IoCreateDevice, 2 5 | IoCreateSymbolicLink, 5 6 | WdfDeviceCreateDeviceInterface, 5 7 | ZwMapViewOfSection, 4 8 | ZwMapViewOfSectionEx, 4 9 | ZwOpenSection, 3 10 | IoAllocateMdl, 3 11 | SeSinglePrivilegeCheck, 3 12 | MmInitializeMdl, 3 13 | MmBuildMdlForNonPagedPool, 1 14 | MmProbeAndLockPages, 1 15 | MmMapLockedPagesSpecifyCache, 1 16 | MmMapLockedPages, 1 17 | MmGetPhysicalAddress, 2 18 | KeStackAttachProcess, 2 19 | WdfControlDeviceInitAllocate, 2 20 | IoGetRequestorProcessId, 1 21 | PsGetCurrentProcessId, 1 22 | MmGetSystemRoutineAddress, 1 23 | MmMapMemoryDumpMdl, 2 24 | ZwUnmapViewOfSection, 1 25 | MmUnmapIoSpace, 2 26 | IoFreeMdl, 1 27 | WdfRequestRetrieveUnsafeUserOutputBuffer, 4 28 | WdfRequestProbeAndLockUserBufferForWrite, 3 29 | WdfRequestRetrieveUnsafeUserInputBuffer, 4 30 | WdfRequestProbeAndLockUserBufferForRead, 3 31 | WdmlibIoCreateDeviceSecure, 2 32 | WdfIoTargetOpen, 2 33 | WdfIoTargetSendInternalIoctlSynchronously, 1 34 | WdfChildListAddOrUpdateChildDescriptionAsPresent, 1 35 | memcpy, 2 36 | MmCopyMemory, 1 37 | memset, 1 38 | memmov, 2 -------------------------------------------------------------------------------- /Pipeline/Pathfinder/start_multiple_instances.bat: -------------------------------------------------------------------------------- 1 | @echo off 2 | setlocal 3 | 4 | REM Check if the number of instances is provided as an argument 5 | if "%~1"=="" ( 6 | echo Usage: %~nx0 [number_of_instances] [python_script] 7 | exit /b 1 8 | 
) 9 | 10 | REM Check if the Python script path is provided as an argument 11 | if "%~2"=="" ( 12 | echo Usage: %~nx0 [number_of_instances] [python_script] 13 | exit /b 1 14 | ) 15 | 16 | set "NUM_INSTANCES=%~1" 17 | set "PYTHON_SCRIPT=%~2" 18 | set "OPTIMIZATION_LEVEL=x" 19 | 20 | REM Extract the directory of the Python script 21 | for %%F in ("%PYTHON_SCRIPT%") do set "SCRIPT_DIR=%%~dpF" 22 | set "SCRIPT_NAME=%~nx2" 23 | 24 | REM Start the instances, for each also echo the start command 25 | for /L %%i in (0,1,%NUM_INSTANCES%) do ( 26 | echo Start instance %%i 27 | start "" powershell -Command "cd '%SCRIPT_DIR%'; $env:PYTHONOPTIMIZE = '%OPTIMIZATION_LEVEL%'; python3 .\%SCRIPT_NAME% %%i" 28 | ) 29 | 30 | endlocal 31 | -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.11-slim 2 | 3 | RUN pip3 install beautifulsoup4 requests 4 | 5 | RUN mkdir /catalogUpdater 6 | WORKDIR /catalogUpdater 7 | 8 | COPY catalogUpdater.py /catalogUpdater/ 9 | COPY get_microsoft_updates.py /catalogUpdater/ 10 | COPY utils.py /catalogUpdater/ 11 | 12 | # wait 13 | #ENTRYPOINT [ "tail", "-f", "/dev/null" ] 14 | ENTRYPOINT [ "python3", "-O", "catalogUpdater.py" ] -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/LICENCE_get_microsoft_updates: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019, Jordan Borean (@jborean93) 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/README.md: -------------------------------------------------------------------------------- 1 | # UpdateCataloger Module: Fully Automated Importer 2 | 3 | ## Overview 4 | UpdateCataloger is a fully automated tool for periodically searching and importing new Windows drivers from the Microsoft Update Catalog. It utilizes a modified version of the [get_microsoft_updates.py](./get_microsoft_updates.py) script to enhance functionality with caching and logging features. The caching layer minimizes redundant requests by storing previously retrieved metadata, while detailed logging identifies when search queries reach their result limits, aiding in query refinement. 
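As a rough sketch of the caching idea (illustrative names only, not the actual implementation), the cache is conceptually a JSON map of already-seen catalog entries that is consulted before any metadata or download request:

```python
# Minimal sketch of the caching layer, assuming a JSON cache file like
# catalogUpdaterCache.json; function and key names here are illustrative.
import json
import os

CACHE_FILE = "catalogUpdaterCache.json"

def load_cache():
    """Load the map of already-seen catalog entries (empty on first run)."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    return {}

def remember(cache, update_id, metadata):
    """Record a catalog entry so future runs skip its metadata request."""
    cache[update_id] = metadata
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)

cache = load_cache()
if "example-update-id" not in cache:  # hypothetical catalog entry ID
    remember(cache, "example-update-id", {"title": "Example driver update"})
```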
5 | 6 | ## Features 7 | - **Automated Search and Download**: Periodically searches for new driver updates using a comprehensive list of search queries. 8 | - **Caching Layer**: Reduces redundant metadata and download requests by caching previously seen results. 9 | - **Logging**: Identifies search queries hitting result limits, helping to narrow search scopes. 10 | - **Extensive Query Coverage**: Includes all possible vendor IDs (0x0000 to 0xFFFF), a curated list of component manufacturers, and specific device-related keywords. 11 | 12 | ## Acknowledgements 13 | Special thanks to the creators of `get_microsoft_updates.py` for the foundational script on which this module is based. 14 | 15 | ## License 16 | This project is under [GPLv3](../../LICENSE), but the [get_microsoft_updates.py](./get_microsoft_updates.py) file is based on an MIT licensed work, hence under [LICENCE_get_microsoft_updates](./LICENCE_get_microsoft_updates). -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/catalogUpdaterCache.json: -------------------------------------------------------------------------------- 1 | {} -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3.8" 2 | 3 | services: 4 | update-cataloger: 5 | build: . 6 | volumes: 7 | - ./:/catalogUpdater/ 8 | #ports: 9 | # - "5000:5000" -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/utils.py: -------------------------------------------------------------------------------- 1 | import os 2 | import requests 3 | import time 4 | import hashlib 5 | from urllib3.exceptions import InsecureRequestWarning 6 | 7 | # Suppress only the single InsecureRequestWarning from urllib3. 8 | requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning) 9 | 10 | COORDINATOR_BASE_URL = 'http://COORDINATOR_IP:5000' 11 | 12 | def clean_filename(filename): 13 | # remove all characters that might break pathing after download 14 | return filename.replace("/", "_").replace("\\", "_").replace(":", "_")\ 15 | .replace(" ", "_").replace('"', "_").replace("'", "_")\ 16 | .replace("?", "_").replace("*", "_").replace("<", "_")\ 17 | .replace(">", "_").replace("|", "_") 18 | 19 | def check_hashes(hashes): 20 | known_files = [] 21 | new_hashes = [] 22 | 23 | for hash in hashes: 24 | url = f'{COORDINATOR_BASE_URL}/driver-id/{hash}' 25 | #print(f"Fetching existing files info for page {page}...") 26 | response = requests.get(url, verify=False) 27 | if response.status_code == 200: 28 | print(f"Hash {hash} already exists.") 29 | known_files.append(hash) 30 | elif response.status_code == 400: 31 | print(f"Hash {hash} is an invalid hash?") 32 | else: 33 | new_hashes.append(hash) 34 | 35 | return new_hashes, known_files 36 | 37 | def download_file(hash, driver_destination): 38 | url = f'{COORDINATOR_BASE_URL}/driver-id/{hash}' 39 | response = requests.get(url, verify=False) 40 | if response.status_code != 200: 41 | print(f"Failed to get driver id for hash {hash}. Status code: {response.status_code}") 42 | return False 43 | driver_id = response.json()['driver_id'] 44 | 45 | # get file_id from drivers endpoint 46 | url = f'{COORDINATOR_BASE_URL}/drivers/{driver_id}' 47 | response = requests.get(url, verify=False) 48 | if response.status_code != 200: 49 | print(f"Failed to get file id for hash {hash}. 
Status code: {response.status_code}") 50 | return False 51 | file_id = response.json()['driver']['file'] 52 | 53 | # download the file 54 | url = f'{COORDINATOR_BASE_URL}/files/{file_id}' 55 | response = requests.get(url, verify=False) 56 | if response.status_code != 200: 57 | print(f"Failed to download the file {file_id}. Status code: {response.status_code}") 58 | return False 59 | # do not overwrite an already existing destination file 60 | if os.path.exists(driver_destination): 61 | print(f"File {driver_destination} already exists in temp folder. Exiting.") 62 | return False 63 | with open(driver_destination, 'wb') as f: 64 | f.write(response.content) 65 | 66 | return True 67 | 68 | def calculate_sha256(full_path): 69 | with open(full_path, 'rb') as f: 70 | sha256 = hashlib.sha256() 71 | while True: 72 | data = f.read(65536) # let's read the file in 64 KiB chunks! 73 | if not data: 74 | break 75 | sha256.update(data) 76 | return sha256.hexdigest() 77 | 78 | def upload_file(file_path, origin=None): 79 | sha256 = calculate_sha256(file_path) 80 | url = f'{COORDINATOR_BASE_URL}/ogfile/{sha256}' 81 | data = {'origin': origin} if origin else {} 82 | response = requests.post(url, data=data, verify=False) 83 | if response.status_code == 200: 84 | return True, None 85 | elif response.status_code == 400 and 'Invalid hash length' in response.text: 86 | return False, f"[E] Hash {sha256} of {file_path} is an invalid hash?" 87 | elif response.status_code == 404: 88 | # This file does not yet exist, let's upload it 89 | files = {'file': open(file_path, 'rb')} 90 | data = {'origin': origin} if origin else {} 91 | # verify=False triggers the InsecureRequestWarning, which is suppressed globally above 92 | response = requests.post(f"{COORDINATOR_BASE_URL}/ogfile", files=files, data=data, verify=False) 93 | if response.status_code == 409 and 'already exists' in response.text: 94 | return True, None 95 | elif response.status_code == 500 and 'database is locked' in response.text: 96 | # TODO change backend DB to not get (sqlite3.OperationalError) database is locked 97 | time.sleep(1) 98 | return upload_file(file_path, origin) 99 | elif response.status_code != 200: 100 | return False, f"Failed to upload file '{file_path}'. Status code: {response.status_code} and response: {response.text}" 101 | else: 102 | return True, None 103 | else: 104 | return False, f"Failed to get file id for {file_path} hash {sha256}. 
Status code: {response.status_code} and response: {response.text}" 105 | -------------------------------------------------------------------------------- /Pipeline/UpdateCataloger/vendorIDs.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/UpdateCataloger/vendorIDs.txt -------------------------------------------------------------------------------- /Pipeline/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3.9' 2 | 3 | services: 4 | 5 | coordinator-db: 6 | image: postgres:latest 7 | restart: always 8 | container_name: coordinator-db 9 | environment: 10 | POSTGRES_PASSWORD: POSTGRES_PASSWORD 11 | POSTGRES_DB: pipeline 12 | POSTGRES_USER: pipeline 13 | volumes: 14 | - ./storage/postgres:/var/lib/postgresql 15 | - ./storage/postgres/data:/var/lib/postgresql/data 16 | healthcheck: 17 | test: ["CMD-SHELL", "pg_isready -U pipeline"] 18 | interval: 5s 19 | timeout: 5s 20 | retries: 5 21 | 22 | # coordinator-db-admin: 23 | # container_name: coordinator-db-admin 24 | # image: dpage/pgadmin4 25 | # environment: 26 | # - PGADMIN_DEFAULT_EMAIL=user@test.com 27 | # - PGADMIN_DEFAULT_PASSWORD=test 28 | # ports: 29 | # - "5050:80" 30 | # depends_on: 31 | # coordinator-db: 32 | # condition: service_healthy 33 | 34 | coordinator: 35 | container_name: coordinator 36 | depends_on: 37 | coordinator-db: 38 | condition: service_healthy 39 | build: ./Coordinator 40 | volumes: 41 | - ./Coordinator:/app 42 | - ./storage/files:/storage/files 43 | - ./storage/uploads:/storage/uploads 44 | ports: 45 | - "5000:5000" 46 | restart: unless-stopped 47 | healthcheck: 48 | test: curl --fail http://localhost:5000/health || exit 1 49 | interval: 60s 50 | timeout: 15s 51 | retries: 3 52 | start_period: 10s 53 | 54 | frontender: 55 | build: ./Frontender 56 | ports: 57 | - "3000:3000" 58 | 59 | identifier: 60 | build: ./Identifier 61 | restart: unless-stopped 62 | depends_on: 63 | coordinator: 64 | condition: service_healthy 65 | 66 | housekeeper: 67 | build: ./Housekeeper 68 | restart: unless-stopped 69 | depends_on: 70 | coordinator: 71 | condition: service_healthy 72 | 73 | update-cataloger: 74 | build: ./UpdateCataloger 75 | volumes: 76 | - ./UpdateCataloger:/catalogUpdater/ 77 | depends_on: 78 | coordinator: 79 | condition: service_healthy 80 | -------------------------------------------------------------------------------- /Pipeline/storage/files/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/storage/files/.gitkeep -------------------------------------------------------------------------------- /Pipeline/storage/postgres/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/storage/postgres/.gitkeep -------------------------------------------------------------------------------- /Pipeline/storage/uploads/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Toroto006/windows-kernel-driver-pipeline/91c832c2dc33967fd4bbd9af0338b135f9b1fa98/Pipeline/storage/uploads/.gitkeep
-------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Efficient Pipelining for Windows Driver Vulnerability Research 2 | 3 | ## Overview 4 | 5 | This repository contains the source code for the system developed to enhance Windows kernel driver security during my master's thesis. 6 | The system is designed to continuously gather, analyze, and evaluate Windows device drivers for potential vulnerabilities and present these results to security researchers in an efficient way. 7 | 8 | An export of my final presentation can be found at [FinalPresentation](./FinalPresentation.pdf) and the thesis at [MasterThesis](./MasterThesis.pdf). 9 | 10 | ## Motivation 11 | 12 | The Windows operating system's widespread use in business operations necessitates robust security measures, particularly concerning kernel drivers. 13 | These drivers operate with high-level privileges and are a prime target for attacks due to their critical role in interfacing applications with hardware components. 14 | Despite Microsoft's driver signing requirements and use of frameworks like Windows Driver Model, Windows Driver Framework, and Driver Module Framework, vulnerabilities in these drivers can still be exploited by threat actors. 15 | 16 | ## Challenges 17 | 1. **Collection and Centralization**: There is no centralized repository for all Windows drivers, and existing block lists are incomplete, complicating the identification of already known vulnerable drivers. 18 | 2. **Dynamic and Static Analysis**: Executing drivers for dynamic analysis requires specific environments, while static analysis is hindered by driver complexity and lack of ARM architecture support. 19 | 3. **Optimization of Research Time**: Avoiding false positives and negatives is crucial to efficiently allocate security researchers' time and resources. 20 | 21 | ## Solution 22 | 23 | The solution implemented in this thesis involves a system that: 24 | 25 | 1. **Continuous Collection**: Gathers drivers from multiple sources. 26 | 2. **Vulnerability Analysis**: Checks drivers against known vulnerabilities and applies both static and dynamic analysis techniques through IDA Pro scripting and automated fuzz testing with kAFL. 27 | 3. **Prioritization**: Highlights the most likely vulnerable drivers for researchers to focus on. 28 | 29 | The system processed over 27,000 drivers over the course of the thesis. 30 | Of these, 140 were manually reviewed, resulting in the identification of 14 unique drivers with 28 vulnerabilities: four of these drivers were already published, and ten were previously unknown. 31 | 32 | ## Repository Contents 33 | 34 | - [Pipeline](./Pipeline/): The implementation of the driver collection and analysis system. 35 | - [EvaluationScripts](./EvaluationScripts/README.md): Implementation of the statistics and result-figure generation for the thesis. 36 | 37 | ## Industry Collaboration 38 | 39 | The project benefited from collaboration with industry experts, including access to a Security Operations Center (SOC) team and an operationally active Red Team, which provided invaluable resources and insights. 40 | 41 | ## Contribution 42 | Contributions are welcome. Please follow the guidelines outlined in [CONTRIBUTING.md](./CONTRIBUTING.md) for submitting issues, feature requests, or pull requests. 43 | 44 | ## License 45 | 46 | This project is licensed under the GPLv3 License where applicable. 
47 | For more details, refer to the [LICENSE](./LICENSE) file. 48 | Additionally, some components are licensed under the MIT License due to their reliance on prior work that was released under MIT. --------------------------------------------------------------------------------