├── .flake8 ├── .github ├── ISSUE_TEMPLATE │ ├── bug_report.md │ ├── config.yml │ ├── feature_request.md │ └── question.md ├── pull_request_template.md └── workflows │ └── testing.yml ├── .gitignore ├── .pre-commit-config.yaml ├── LICENSE ├── README.md ├── app ├── emu_gui.py ├── emu_svc.py ├── group_filtered_planner.py ├── parsers │ └── vssadmin_shadow.py └── requirements │ ├── check_lightneuron_registered.py │ └── check_registered.py ├── conf └── default.yml ├── data └── .gitkeep ├── download_payloads.sh ├── gui └── views │ └── emu.vue ├── hook.py ├── payloads └── .gitkeep ├── requirements.txt ├── templates └── emu.html ├── tests ├── .gitkeep ├── test_emu_svc.py ├── test_group_filtered_planner.py └── test_vssadmin_shadow_parser.py └── tox.ini /.flake8: -------------------------------------------------------------------------------- 1 | [flake8] 2 | max-line-length = 180 3 | exclude = 4 | .svn, 5 | CVS, 6 | .bzr, 7 | .hg, 8 | .git, 9 | __pycache__, 10 | .tox, 11 | venv, 12 | .venv, 13 | data, 14 | payloads -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: "\U0001F41E Bug report" 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: bug 6 | assignees: wbooth 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | If the bug is related to the adversary emulation plan itself, please report it here: 12 | 13 | (https://github.com/center-for-threat-informed-defense/adversary_emulation_library/issues) 14 | 15 | A clear and concise description of what the bug is. 16 | 17 | **To Reproduce** 18 | Steps to reproduce the behavior: 19 | 1. 20 | 21 | **Expected behavior** 22 | A clear and concise description of what you expected to happen. 23 | 24 | **Screenshots** 25 | If applicable, add screenshots to help explain your problem. 
26 | 27 | **Desktop (please complete the following information):** 28 | - OS: [e.g. Mac, Windows, Kali] 29 | - Browser [e.g. chrome, safari] 30 | - Version [e.g. 2.8.0] 31 | 32 | **Additional context** 33 | Add any other context about the problem here. 34 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/config.yml: -------------------------------------------------------------------------------- 1 | contact_links: 2 | - name: Documentation 3 | url: https://caldera.readthedocs.io/en/latest/ 4 | about: Your question may be answered in the documentation 5 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: "\U0001F680 New Feature Request" 3 | about: Propose a new feature 4 | title: '' 5 | labels: feature 6 | assignees: 'wbooth' 7 | 8 | --- 9 | 10 | **What problem are you trying to solve? Please describe.** 11 | > E.g. I'm always frustrated when [...] 12 | 13 | 14 | **The ideal solution: What should the feature do?** 15 | > A clear and concise description 16 | 17 | 18 | **What category of feature is this?** 19 | 20 | - [ ] UI/UX 21 | - [ ] API 22 | - [ ] Other 23 | 24 | **If you have code or pseudo-code please provide:** 25 | 26 | 27 | ```python 28 | 29 | ``` 30 | 31 | - [ ] Willing to submit a pull request to implement this feature? 32 | 33 | **Additional context** 34 | Add any other context or screenshots about the feature request here. 35 | 36 | Thank you for your contribution! 
37 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/question.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: "\U00002753 Question" 3 | about: Support questions 4 | title: '' 5 | labels: question 6 | assignees: '' 7 | 8 | --- 9 | 10 | 11 | 12 | 13 | 18 | -------------------------------------------------------------------------------- /.github/pull_request_template.md: -------------------------------------------------------------------------------- 1 | ## Description 2 | 3 | (insert summary) 4 | 5 | ## Type of change 6 | 7 | Please delete options that are not relevant. 8 | 9 | - [ ] Bug fix (non-breaking change which fixes an issue) 10 | - [ ] New feature (non-breaking change which adds functionality) 11 | - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected) 12 | - [ ] This change requires a documentation update 13 | 14 | ## How Has This Been Tested? 15 | 16 | Please describe the tests that you ran to verify your changes. 
17 | 18 | 19 | ## Checklist: 20 | 21 | - [ ] My code follows the style guidelines of this project 22 | - [ ] I have performed a self-review of my own code 23 | - [ ] I have made corresponding changes to the documentation 24 | - [ ] I have added tests that prove my fix is effective or that my feature works 25 | -------------------------------------------------------------------------------- /.github/workflows/testing.yml: -------------------------------------------------------------------------------- 1 | name: Code Testing 2 | 3 | on: [push] 4 | 5 | jobs: 6 | build: 7 | runs-on: ubuntu-latest 8 | strategy: 9 | fail-fast: false 10 | matrix: 11 | include: 12 | - python-version: 3.8 13 | toxenv: py38,style,coverage-ci 14 | - python-version: 3.9 15 | toxenv: py39,style,coverage-ci 16 | - python-version: 3.10.9 17 | toxenv: py310,style,coverage-ci 18 | - python-version: 3.11 19 | toxenv: py311,style,coverage-ci 20 | 21 | steps: 22 | - uses: actions/checkout@v2 23 | with: 24 | submodules: recursive 25 | - name: Setup python 26 | uses: actions/setup-python@v2 27 | with: 28 | python-version: ${{ matrix.python-version }} 29 | - name: Install dependencies 30 | run: | 31 | pip install --upgrade virtualenv 32 | pip install tox 33 | - name: Run tests 34 | env: 35 | TOXENV: ${{ matrix.toxenv }} 36 | run: tox 37 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | *.crt 3 | *.pyd 4 | *.DS_Store 5 | *.spec 6 | *.pstat 7 | *.tokens 8 | *__pycache__* 9 | .idea/* 10 | conf/secrets.yml 11 | data/adversary-emulation-plans 12 | data/abilities 13 | data/adversaries 14 | data/sources 15 | data/planners 16 | payloads 17 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | repos: 2 | - repo: 
https://github.com/pycqa/flake8 3 | rev: 5.0.4 4 | hooks: 5 | - id: flake8 6 | additional_dependencies: [flake8-bugbear] 7 | - repo: https://github.com/PyCQA/bandit 8 | rev: 1.7.4 9 | hooks: 10 | - id: bandit 11 | entry: bandit -ll --exclude=tests/ --skip=B303 12 | additional_dependencies: 13 | - importlib-metadata<5; python_version < '3.8' 14 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 
29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # MITRE Caldera Plugin: Emu 2 | 3 | A plugin supplying Caldera with TTPs from the Center for Threat Informed Defense (CTID) Adversary Emulation Plans. 
4 | 5 | # Installation 6 | 7 | Using the Emu plugin with Caldera will enable users to access the adversary profiles contained in the [CTID Adversary Emulation Library](https://github.com/center-for-threat-informed-defense/adversary_emulation_library). 8 | 9 | To run Caldera along with the Emu plugin: 10 | 1. Download Caldera as detailed in the [Installation Guide](https://github.com/mitre/Caldera) 11 | 2. Enable the Emu plugin by adding `- emu` to the list of enabled plugins in `conf/local.yml` or `conf/default.yml` (if running Caldera in insecure mode) 12 | 3. Start Caldera to automatically download the Adversary Emulation Library to the `data` folder of the Emu plugin. 13 | 4. Stop Caldera. 14 | 5. Some adversaries may require additional payloads and executables to be downloaded. Run the `download_payloads.sh` script to download these binaries to the `payloads` directory. 15 | 6. Start Caldera again. The Emu plugin will appear in the left sidebar of the Caldera server, and the Adversary Emulation Library adversary profiles will be available from the Adversary tab. 16 | 17 | # Additional setup 18 | Each emulation plan will have an adversary and a set of facts. Please be sure to select the facts related to the 19 | adversary when starting an operation. 20 | 21 | Because some payloads within the Adversary Emulation Library are encrypted, a Python script is used to automate 22 | the decryption, which requires installation of some dependencies. 
Depending on the host OS, `pyminizip` 23 | can be installed using the following: 24 | 25 | - Ubuntu: `apt-get install zlib1g` 26 | - macOS: `brew install zlib` 27 | - All OSes: `pip3 install -r requirements.txt` 28 | 29 | See the following URL for more information regarding `pyminizip`: https://github.com/smihica/pyminizip 30 | 31 | ## Acknowledgements 32 | 33 | - [Adversary Emulation Library](https://github.com/center-for-threat-informed-defense/adversary_emulation_library) 34 | -------------------------------------------------------------------------------- /app/emu_gui.py: -------------------------------------------------------------------------------- 1 | import logging 2 | 3 | from aiohttp_jinja2 import template 4 | 5 | from app.service.auth_svc import for_all_public_methods, check_authorization 6 | from app.utility.base_world import BaseWorld 7 | 8 | 9 | @for_all_public_methods(check_authorization) 10 | class EmuGUI(BaseWorld): 11 | 12 | def __init__(self, services, name, description): 13 | self.auth_svc = services.get('auth_svc') 14 | self.data_svc = services.get('data_svc') 15 | self.name = name 16 | self.description = description 17 | 18 | self.log = logging.getLogger('emu_gui') 19 | 20 | @template('emu.html') 21 | async def splash(self, request): 22 | return dict() 23 | -------------------------------------------------------------------------------- /app/emu_svc.py: -------------------------------------------------------------------------------- 1 | import glob 2 | import json 3 | import os 4 | import uuid 5 | import yaml 6 | from aiohttp import web 7 | from pathlib import Path 8 | import shutil 9 | from subprocess import DEVNULL, PIPE, STDOUT, check_call, Popen, CalledProcessError 10 | import sys 11 | 12 | from app.utility.base_service import BaseService 13 | from app.utility.base_world import BaseWorld 14 | 15 | 16 | class EmuService(BaseService): 17 | _dynamicically_compiled_payloads = {'sandcat.go-linux', 'sandcat.go-darwin', 'sandcat.go-windows'} 18 | 
_emu_config_path = "conf/default.yml" 19 | 20 | def __init__(self): 21 | self.log = self.add_service('emu_svc', self) 22 | self.emu_dir = os.path.join('plugins', 'emu') 23 | self.repo_dir = os.path.join(self.emu_dir, 'data/adversary-emulation-plans') 24 | self.data_dir = os.path.join(self.emu_dir, 'data') 25 | self.payloads_dir = os.path.join(self.emu_dir, 'payloads') 26 | self.required_payloads = set() 27 | BaseWorld.apply_config('emu', BaseWorld.strip_yml(self._emu_config_path)[0]) 28 | self.evals_c2_host = self.get_config(name='emu', prop='evals_c2_host') 29 | self.evals_c2_port = self.get_config(name='emu', prop='evals_c2_port') 30 | self.app_svc = self.get_service('app_svc') 31 | self.contact_svc = self.get_service('contact_svc') 32 | if not self.app_svc: 33 | self.log.error('App svc not found.') 34 | else: 35 | self.app_svc.application.router.add_route('POST', '/plugins/emu/beacons', self.handle_forwarded_beacon) 36 | 37 | async def handle_forwarded_beacon(self, request): 38 | try: 39 | forwarded_profile = json.loads(await request.read()) 40 | profile = dict() 41 | profile['paw'] = forwarded_profile.get('guid') 42 | profile['contact'] = 'http' 43 | profile['group'] = 'evals' 44 | if 'platform' in forwarded_profile: 45 | profile['platform'] = forwarded_profile.get('platform') 46 | else: 47 | profile['platform'] = 'evals' 48 | if 'hostName' in forwarded_profile: 49 | profile['host'] = forwarded_profile.get('hostName') 50 | if 'user' in forwarded_profile: 51 | profile['username'] = forwarded_profile.get('user') 52 | if 'pid' in forwarded_profile: 53 | profile['pid'] = forwarded_profile.get('pid') 54 | if 'ppid' in forwarded_profile: 55 | profile['ppid'] = forwarded_profile.get('ppid') 56 | await self.contact_svc.handle_heartbeat(**profile) 57 | response = 'Successfully processed forwarded beacon with session ID %s' % profile['paw'] 58 | return web.Response(text=response) 59 | except Exception as e: 60 | error_msg = 'Server error when processing forwarded beacon: 
%s' % e 61 | self.log.error(error_msg) 62 | raise web.HTTPBadRequest(text=error_msg) 63 | 64 | async def clone_repo(self, repo_url=None): 65 | """ 66 | Clone the Adversary Emulation Library repository. You can use a specific URL via 67 | the `repo_url` parameter (e.g. if you want to use a fork). 68 | """ 69 | if not repo_url: 70 | repo_url = 'https://github.com/center-for-threat-informed-defense/adversary_emulation_library' 71 | 72 | if not os.path.exists(self.repo_dir) or not os.listdir(self.repo_dir): 73 | self.log.debug('cloning repo %s' % repo_url) 74 | check_call(['git', 'clone', '--depth', '1', repo_url, self.repo_dir], stdout=DEVNULL, stderr=STDOUT) 75 | self.log.debug('clone complete') 76 | 77 | async def populate_data_directory(self, library_path=None): 78 | """ 79 | Populate the 'data' directory with the Adversary Emulation Library abilities. 80 | """ 81 | if not library_path: 82 | library_path = os.path.join(self.repo_dir, '*') 83 | await self._load_adversaries_and_abilities(library_path) 84 | await self._load_planners(library_path) 85 | 86 | async def decrypt_payloads(self): 87 | path_crypt_script = os.path.join(self.repo_dir, '*', 'Resources', 'utilities', 'crypt_executables.py') 88 | for crypt_script in glob.iglob(path_crypt_script): 89 | plan_path = crypt_script[:crypt_script.rindex('Resources') + len('Resources')] 90 | self.log.debug('attempting to decrypt plan payloads from %s using %s with the password "malware"', 91 | plan_path, crypt_script) 92 | process = Popen([sys.executable, crypt_script, '-i', plan_path, '-p', 'malware', '--decrypt'], stdout=PIPE, stderr=STDOUT) 93 | with process.stdout: 94 | for line in iter(process.stdout.readline, b''): 95 | if b'[-]' in line: 96 | self.log.error(line.decode('UTF-8').rstrip()) 97 | else: 98 | self.log.debug(line.decode('UTF-8').rstrip()) 99 | exit_code = process.wait() 100 | if exit_code != 0: 101 | self.log.error('Decryption script failed with exit code %d', exit_code) 102 | raise CalledProcessError( 103 | returncode=exit_code, 104 | cmd=process.args, 105 | 
stderr=process.stderr 106 | ) 107 | 108 | @staticmethod 109 | def get_adversary_from_filename(filename): 110 | base = os.path.basename(filename) 111 | return os.path.splitext(base)[0] 112 | 113 | """ PRIVATE """ 114 | 115 | async def _load_adversaries_and_abilities(self, library_path): 116 | adv_emu_plan_path = os.path.join(library_path, 'Emulation_Plan', 'yaml', '*.yaml') 117 | await self._load_object(adv_emu_plan_path, 'abilities', self._ingest_emulation_plan) 118 | self._store_required_payloads() 119 | 120 | async def _load_planners(self, library_path): 121 | planner_path = os.path.join(library_path, 'Emulation_Plan', 'yaml', 'planners', '*.yml') 122 | await self._load_object(planner_path, 'planners', self._ingest_planner) 123 | 124 | async def _load_object(self, search_path, object_name, ingestion_func): 125 | total, ingested, errors = 0, 0, 0 126 | for filename in glob.iglob(search_path): 127 | total_obj, ingested_obj, num_errors = await ingestion_func(filename) 128 | total += total_obj 129 | ingested += ingested_obj 130 | errors += num_errors 131 | errors_output = f' and ran into {errors} errors' if errors else '' 132 | self.log.debug(f'Ingested {ingested} {object_name} (out of {total}) from emu plugin{errors_output}') 133 | 134 | async def _ingest_planner(self, filename): 135 | num_planners, num_ingested, num_errors = 0, 0, 0 136 | self.log.debug('Ingesting planner at %s', filename) 137 | try: 138 | planner_contents = self.strip_yml(filename)[0] 139 | if self._is_planner(planner_contents): 140 | num_planners += 1 141 | planner_id = planner_contents['id'] 142 | target_filename = '%s.yml' % planner_id 143 | try: 144 | self._copy_planner(filename, target_filename) 145 | num_ingested += 1 146 | except IOError as e: 147 | self.log.error('Error copying planner file to %s: %s', target_filename, e) 148 | num_errors += 1 149 | else: 150 | self.log.error('Yaml file %s located in planner directory but does not contain a planner.', filename) 151 | num_errors += 1 152 | 
except Exception as e: 153 | self.log.error('Error parsing yaml file %s: %s', filename, e) 154 | num_errors += 1 155 | return num_planners, num_ingested, num_errors 156 | 157 | def _copy_planner(self, source_path, target_filename): 158 | planner_dir = os.path.join(self.data_dir, 'planners') 159 | if not os.path.exists(planner_dir): 160 | os.makedirs(planner_dir) 161 | target_path = os.path.join(planner_dir, target_filename) 162 | shutil.copyfile(source_path, target_path) 163 | self.log.debug('Copied planner to %s', target_path) 164 | 165 | @staticmethod 166 | def _is_planner(data): 167 | return {'id', 'module'}.issubset(set(data.keys())) 168 | 169 | async def _ingest_emulation_plan(self, filename): 170 | self.log.debug('Ingesting emulation plan at %s', filename) 171 | emulation_plan = self.strip_yml(filename)[0] 172 | details = dict() 173 | for entry in emulation_plan: 174 | if 'emulation_plan_details' in entry: 175 | details = entry['emulation_plan_details'] 176 | if not self._is_valid_format_version(entry['emulation_plan_details']): 177 | self.log.error('Yaml file %s does not contain emulation plan details', filename) 178 | return 0, 0, 1 179 | 180 | if 'adversary_name' not in details: 181 | self.log.error('Yaml file %s does not contain adversary info', filename) 182 | return 0, 0, 1 183 | 184 | abilities, adversary_facts, at_total, at_ingested, errors = await self._ingest_abilities(emulation_plan) 185 | await self._save_adversary(id=details.get('id', str(uuid.uuid4())), 186 | name=details.get('adversary_name', filename), 187 | description=details.get('adversary_description', filename), 188 | abilities=abilities) 189 | await self._save_source(details.get('adversary_name', filename), adversary_facts) 190 | return at_total, at_ingested, errors 191 | 192 | async def _ingest_abilities(self, emulation_plan): 193 | """Ingests the abilities in the emulation plan and returns a tuple representing the following: 194 | - list of ingested ability IDs to add to the adversary 
profile 195 | - list of facts required for the adversary profile 196 | - total number of abilities from the emulation plan 197 | - total number of abilities that were successfully ingested 198 | - number of errors""" 199 | at_total, at_ingested, errors = 0, 0, 0 200 | abilities = [] 201 | adversary_facts = [] 202 | for entry in emulation_plan: 203 | if await self._is_ability(entry): 204 | at_total += 1 205 | try: 206 | ability_id, ability_facts = await self._save_ability(entry) 207 | adversary_facts.extend(ability_facts) 208 | abilities.append(ability_id) 209 | at_ingested += 1 210 | except Exception as e: 211 | self.log.error(e) 212 | errors += 1 213 | return abilities, adversary_facts, at_total, at_ingested, errors 214 | 215 | @staticmethod 216 | def _is_valid_format_version(details): 217 | try: 218 | return float(details['format_version']) >= 1.0 219 | except Exception: 220 | return False 221 | 222 | async def _write_adversary(self, data): 223 | d = os.path.join(self.data_dir, 'adversaries') 224 | 225 | if not os.path.exists(d): 226 | os.makedirs(d) 227 | 228 | file_path = os.path.join(d, '%s.yml' % data['id']) 229 | with open(file_path, 'w') as f: 230 | f.write(yaml.dump(data)) 231 | 232 | async def _save_adversary(self, id, name, description, abilities): 233 | adversary = dict( 234 | id=id, 235 | name=name, 236 | description='%s (Emu)' % description, 237 | atomic_ordering=abilities 238 | ) 239 | await self._write_adversary(adversary) 240 | 241 | @staticmethod 242 | async def _is_ability(data): 243 | if {'id', 'platforms'}.issubset(set(data.keys())): 244 | return True 245 | return False 246 | 247 | async def _write_ability(self, data): 248 | d = os.path.join(self.data_dir, 'abilities', data['tactic']) 249 | if not os.path.exists(d): 250 | os.makedirs(d) 251 | file_path = os.path.join(d, '%s.yml' % data['id']) 252 | with open(file_path, 'w') as f: 253 | f.write(yaml.dump([data])) 254 | 255 | @staticmethod 256 | def get_privilege(executors): 257 | try: 258 | for 
ex in executors: 259 | if ex.get('elevation_required'): 260 | return 'Elevated' 261 | return False 262 | except Exception: 263 | return False 264 | 265 | async def _save_ability(self, ab): 266 | """ 267 | Save the ability and return its ID along with any facts derived from its default input arguments. 268 | """ 269 | 270 | ability = dict( 271 | id=ab.pop('id', str(uuid.uuid4())), 272 | name=ab.pop('name', ''), 273 | description=ab.pop('description', ''), 274 | tactic='-'.join(ab.pop('tactic', '').lower().split(' ')), 275 | technique=dict(name=ab.get('technique', dict()).get('name'), 276 | attack_id=ab.pop('technique', dict()).get('attack_id')), 277 | repeatable=ab.pop('repeatable', False), 278 | requirements=ab.pop('requirements', []), 279 | platforms=ab.pop('platforms') 280 | ) 281 | 282 | privilege = self.get_privilege(ab.get('executors')) 283 | if privilege: 284 | ability['privilege'] = privilege 285 | facts = [] 286 | 287 | for platform in ability.get('platforms', dict()).values(): 288 | for executor_details in platform.values(): 289 | self._register_required_payloads(executor_details.get('payloads', [])) 290 | 291 | for fact, details in ab.get('input_arguments', dict()).items(): 292 | if details.get('default'): 293 | facts.append(dict(trait=fact, value=details.get('default'))) 294 | 295 | await self._write_ability(ability) 296 | return ability['id'], facts 297 | 298 | def _register_required_payloads(self, payloads): 299 | self.required_payloads.update( 300 | [payload for payload in payloads if payload not in self._dynamicically_compiled_payloads] 301 | ) 302 | 303 | def _store_required_payloads(self): 304 | self.log.debug('Searching for and storing required payloads.') 305 | for payload in self.required_payloads: 306 | copied = False 307 | found = False 308 | if os.path.exists(os.path.join(self.payloads_dir, payload)): 309 | continue 310 | for path in Path(self.repo_dir).rglob(payload): 311 | found = True 312 | target_path = os.path.join(self.payloads_dir, path.name) 313 | try: 314 | shutil.copyfile(path, target_path) 
315 | copied = True 316 | break 317 | except Exception as e: 318 | self.log.error('Failed to copy payload %s to %s: %s.', payload, target_path, e) 319 | if not found: 320 | self.log.warn('Could not find payload %s within %s.', payload, self.repo_dir) 321 | elif not copied: 322 | self.log.warn('Found payload %s, but could not copy it to the payloads directory.', payload) 323 | 324 | async def _save_source(self, name, facts): 325 | source = dict( 326 | id=str(uuid.uuid5(uuid.NAMESPACE_OID, name)), 327 | name='%s (Emu)' % name, 328 | facts=await self._unique_facts(facts) 329 | ) 330 | await self._write_source(source) 331 | 332 | @staticmethod 333 | async def _unique_facts(facts): 334 | unique_facts = [] 335 | for fact in facts: 336 | if fact not in unique_facts: 337 | unique_facts.append(fact) 338 | return unique_facts 339 | 340 | async def _write_source(self, data): 341 | d = os.path.join(self.data_dir, 'sources') 342 | 343 | if not os.path.exists(d): 344 | os.makedirs(d) 345 | 346 | file_path = os.path.join(d, '%s.yml' % data['id']) 347 | with open(file_path, 'w') as f: 348 | f.write(yaml.dump(data)) 349 | -------------------------------------------------------------------------------- /app/group_filtered_planner.py: -------------------------------------------------------------------------------- 1 | from app.utility.base_world import BaseWorld 2 | 3 | 4 | class LogicalPlanner: 5 | def __init__(self, operation, planning_svc, stopping_conditions=(), filtered_groups_by_ability=None): 6 | self.operation = operation 7 | self.planning_svc = planning_svc 8 | self.stopping_conditions = stopping_conditions 9 | self.stopping_condition_met = False 10 | self.state_machine = ['fetch_and_run_links'] 11 | self.next_bucket = 'fetch_and_run_links' # repeat this bucket until we run out of links. 
12 | self.filtered_groups_by_ability = filtered_groups_by_ability if filtered_groups_by_ability else dict() 13 | self.pending_links = [] 14 | self.current_ability_index = 0 15 | self.log = BaseWorld.create_logger('group_filtered_planner') 16 | 17 | async def execute(self): 18 | await self.planning_svc.execute_planner(self) 19 | 20 | async def fetch_and_run_links(self): 21 | links_to_use = await self._fetch_links() 22 | if links_to_use: 23 | # Each agent will run the next available step. 24 | self.log.debug('Applying %d links', len(links_to_use)) 25 | links_to_wait_for = [await self.operation.apply(link) for link in links_to_use] 26 | await self.operation.wait_for_links_completion(links_to_wait_for) 27 | else: 28 | self.log.debug('No more links to run.') 29 | self.next_bucket = None 30 | 31 | async def _fetch_links(self): 32 | # If we have no pending links, go to the next ability in the adversary profile. 33 | # Determine which agents can run the ability based on filtered_groups_by_ability and then generate 34 | # the pool of links for just that ability. 35 | # If the ability does not generate any runnable links, iterate through the 36 | # atomic ordering until we find an ability that does generate links.
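Editor's aside: the comment block above is the heart of the planner. As a rough, standalone sketch (not part of the plugin — `atomic_ordering`, `links_by_ability`, and `fetch_links` below are invented stand-ins for the operation state and planner methods), the advance-until-an-ability-yields-links loop combined with the one-link-per-agent batching of `_fetch_from_pending_links` might look like:

```python
# Hypothetical standalone model of the planner loop described in the comments
# above; names and data are illustrative, not the plugin's real objects.
atomic_ordering = ['ability-1', 'ability-2', 'ability-3']
links_by_ability = {  # pretend link pools keyed by ability; ability-1/3 yield none
    'ability-2': [('paw1', 'whoami'), ('paw1', 'hostname'), ('paw2', 'whoami')],
}

def fetch_links(pending, index):
    # Advance through the atomic ordering until some ability yields links.
    while not pending:
        if index >= len(atomic_ordering):
            return [], pending, index
        pending = list(links_by_ability.get(atomic_ordering[index], []))
        index += 1
    # Hand out at most one link per agent; the rest stay pending for the next pass.
    assigned, leftover, seen = [], [], set()
    for paw, command in pending:
        if paw not in seen:
            seen.add(paw)
            assigned.append((paw, command))
        else:
            leftover.append((paw, command))
    return assigned, leftover, index

batch, pending, index = fetch_links([], 0)
print(batch)    # [('paw1', 'whoami'), ('paw2', 'whoami')]
print(pending)  # [('paw1', 'hostname')] -- dispatched on the next pass
```

Each `fetch_and_run_links` bucket pass corresponds to one `fetch_links` call here: the leftover `paw1` link runs on the following pass, and the loop terminates once the atomic ordering is exhausted.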
37 | while not self.pending_links: 38 | if self.current_ability_index >= len(self.operation.adversary.atomic_ordering): 39 | return [] 40 | ability_id = self.operation.adversary.atomic_ordering[self.current_ability_index] 41 | self.pending_links = await self._get_pending_links(ability_id) 42 | self.current_ability_index += 1 43 | return self._fetch_from_pending_links() 44 | 45 | async def _get_pending_links(self, ability_id): 46 | valid_agents = self._get_valid_agents_for_ability(ability_id) 47 | potential_links = [] 48 | for agent in valid_agents: 49 | potential_links += await self._get_links(agent=agent) 50 | return [link for link in potential_links if link.ability.ability_id == ability_id] 51 | 52 | def _fetch_from_pending_links(self): 53 | """Return at most one link per agent. Any link that gets assigned will be removed from self.pending_links.""" 54 | assigned_agent_paws = set() 55 | links_to_use = [] 56 | unassigned_links = [] 57 | for link in self.pending_links: 58 | if link.paw not in assigned_agent_paws: 59 | assigned_agent_paws.add(link.paw) 60 | links_to_use.append(link) 61 | else: 62 | # Agent has already been assigned a link from this pool 63 | unassigned_links.append(link) 64 | self.pending_links = unassigned_links 65 | return links_to_use 66 | 67 | def _get_valid_agents_for_ability(self, ability_id): 68 | if ability_id not in self.filtered_groups_by_ability: 69 | return self.operation.agents 70 | valid_agents = [] 71 | for agent in self.operation.agents: 72 | if agent.group in self.filtered_groups_by_ability.get(ability_id, []): 73 | valid_agents.append(agent) 74 | return valid_agents 75 | 76 | async def _get_links(self, agent=None): 77 | return await self.planning_svc.get_links(operation=self.operation, agent=agent) 78 | -------------------------------------------------------------------------------- /app/parsers/vssadmin_shadow.py: -------------------------------------------------------------------------------- 1 | import re 2 | 3 | from 
app.objects.secondclass.c_fact import Fact 4 | from app.objects.secondclass.c_relationship import Relationship 5 | from app.utility.base_parser import BaseParser 6 | 7 | 8 | class Parser(BaseParser): 9 | def parse(self, blob): 10 | relationships = [] 11 | name = self._get_volume_name(blob) 12 | if name: 13 | for mp in self.mappers: 14 | source = self.set_value(mp.source, name, self.used_facts) 15 | target = self.set_value(mp.target, name, self.used_facts) 16 | relationships.append( 17 | Relationship(source=Fact(mp.source, source), 18 | edge=mp.edge, 19 | target=Fact(mp.target, target)) 20 | ) 21 | return relationships 22 | 23 | @staticmethod 24 | def _get_volume_name(blob): 25 | results = re.findall(r'^\s*Shadow Copy Volume Name: (\S+)', blob, re.MULTILINE) 26 | if results: 27 | return results[0] 28 | -------------------------------------------------------------------------------- /app/requirements/check_lightneuron_registered.py: -------------------------------------------------------------------------------- 1 | from plugins.stockpile.app.requirements.base_requirement import BaseRequirement 2 | from app.utility.base_service import BaseService 3 | 4 | 5 | class Requirement(BaseRequirement): 6 | 7 | async def enforce(self, link, operation): 8 | """ 9 | Given a link and the current operation, ensure the ability will only run if the 10 | agent with the given ID/PAW is listed in the Agents tab on the Caldera Server GUI. 
11 | :param link 12 | :param operation 13 | :return: True if it complies, False if it doesn't 14 | """ 15 | agent_paws = [agent.paw for agent in BaseService.get_service('data_svc').ram['agents']] 16 | for uf in link.used: 17 | # Remove the "@" character if it appears in the given fact 18 | # in order to accommodate the LightNeuron implant ID 19 | if uf.value.replace("@", "") in agent_paws: 20 | return True 21 | return False 22 | -------------------------------------------------------------------------------- /app/requirements/check_registered.py: -------------------------------------------------------------------------------- 1 | from plugins.stockpile.app.requirements.base_requirement import BaseRequirement 2 | 3 | 4 | class Requirement(BaseRequirement): 5 | 6 | async def enforce(self, link, operation): 7 | """ 8 | Given a link and the current operation, ensure the ability will only run if the agent with the given ID/PAW is alive. 9 | :param link 10 | :param operation 11 | :return: True if it complies, False if it doesn't 12 | """ 13 | agent_paws = [agent.paw for agent in await operation.active_agents()] 14 | for uf in link.used: 15 | if uf.value in agent_paws: 16 | return True 17 | return False 18 | -------------------------------------------------------------------------------- /conf/default.yml: -------------------------------------------------------------------------------- 1 | evals_c2_host: "127.0.0.1" 2 | evals_c2_port: "9999" -------------------------------------------------------------------------------- /data/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mitre/emu/fdb0bab5c9c3e1f6fcff57cdd2a4d999955cb7b8/data/.gitkeep -------------------------------------------------------------------------------- /download_payloads.sh: -------------------------------------------------------------------------------- 1 | # We are not able to bundle some payloads because their licensing 2 | # prohibits
redistribution (notably sysinternals). This script 3 | # will download non-redistributable payloads. If you're deploying 4 | # the plugin without internet access, you can copy this script to 5 | # an internet-connected host, run it, and then copy the resulting 6 | # payloads back to the emu/payloads directory. 7 | 8 | curl -o payloads/AdFind.zip http://www.joeware.net/downloads/files/AdFind.zip 9 | unzip -P $(unzip -p payloads/AdFind.zip password.txt) payloads/AdFind.zip -d payloads/ 10 | cp payloads/AdFind.exe payloads/adfind.exe 11 | 12 | curl -o payloads/dnscat2.ps1 https://raw.githubusercontent.com/lukebaggett/dnscat2-powershell/master/dnscat2.ps1 13 | 14 | curl -o payloads/NetSess.zip http://www.joeware.net/downloads/files/NetSess.zip 15 | unzip payloads/NetSess.zip -d payloads/ 16 | cp payloads/NetSess.exe payloads/netsess.exe 17 | 18 | curl -o payloads/nbtscan.exe http://unixwiz.net/tools/nbtscan-1.0.35.exe 19 | 20 | curl -o payloads/psexec.exe https://github.com/ropnop/impacket_static_binaries/releases/download/0.9.22.dev-binaries/psexec_windows.exe 21 | cp payloads/psexec.exe payloads/PsExec.exe 22 | 23 | curl -o payloads/putty.exe https://the.earth.li/~sgtatham/putty/latest/w64/putty.exe 24 | 25 | curl -o payloads/secretsdump.exe https://github.com/ropnop/impacket_static_binaries/releases/download/0.9.22.dev-binaries/secretsdump_windows.exe 26 | 27 | curl -o payloads/tcping.exe https://download.elifulkerson.com//files/tcping/0.39/tcping.exe 28 | 29 | curl -o payloads/wce_v1_41beta_universal.zip https://www.ampliasecurity.com/research/wce_v1_41beta_universal.zip 30 | unzip payloads/wce_v1_41beta_universal.zip -d payloads/ 31 | 32 | curl -o payloads/wmiexec.vbs https://raw.githubusercontent.com/Twi1ight/AD-Pentest-Script/master/wmiexec.vbs 33 | 34 | curl -o payloads/psexec_sandworm.py https://raw.githubusercontent.com/SecureAuthCorp/impacket/c328de825265df12ced44d14b36c688cd9973f5c/examples/psexec.py 35 | 36 | curl -o payloads/PSTools.zip
https://web.archive.org/web/20221102141531/http://download.sysinternals.com/files/PSTools.zip 37 | unzip payloads/PSTools.zip -d payloads/PSTools 38 | psexec_md5=$(md5sum payloads/PSTools/PsExec64.exe | awk '{ print $1 }') 39 | if [ "$psexec_md5" = "84858ca42dc54947eea910e8fab5f668" ] 40 | then 41 | target_dir="data/adversary-emulation-plans/turla/Resources/payloads/snake" 42 | mkdir -p "$target_dir" && cp payloads/PSTools/PsExec64.exe $target_dir/PsExec.exe 43 | echo "PsExec64.exe v2.4 copied to Turla payloads directory" 44 | else 45 | echo "PsExec from PSTools.zip with MD5 '$psexec_md5' does not match v2.4 with MD5 of 84858ca42dc54947eea910e8fab5f668" 46 | fi 47 | 48 | curl -o payloads/pscp.exe https://the.earth.li/~sgtatham/putty/latest/w64/pscp.exe 49 | target_dir="data/adversary-emulation-plans/turla/Resources/payloads/carbon" 50 | mkdir -p "$target_dir" && cp payloads/pscp.exe $target_dir/pscp.exe 51 | echo "Pscp.exe copied to Turla payloads directory" 52 | 53 | curl -o payloads/plink.exe https://the.earth.li/~sgtatham/putty/latest/w64/plink.exe 54 | target_dir="data/adversary-emulation-plans/turla/Resources/payloads/carbon" 55 | mkdir -p "$target_dir" && cp payloads/plink.exe $target_dir/plink.exe 56 | echo "Plink.exe copied to Turla payloads directory" 57 | 58 | curl -o payloads/m64.exe https://raw.githubusercontent.com/ParrotSec/mimikatz/master/x64/mimikatz.exe 59 | echo "x64 mimikatz.exe downloaded to payloads directory as m64.exe" -------------------------------------------------------------------------------- /gui/views/emu.vue: -------------------------------------------------------------------------------- [Vue template and script markup not recoverable from this extraction] -------------------------------------------------------------------------------- /hook.py: -------------------------------------------------------------------------------- 1 | import os 2 | import shutil 3 | 4 | from app.utility.base_world import BaseWorld 5 | from plugins.emu.app.emu_svc import EmuService 6 | from plugins.emu.app.emu_gui
import EmuGUI 7 | 8 | name = 'Emu' 9 | description = 'The collection of abilities from the CTID Adversary Emulation Plans' 10 | address = '/plugin/emu/gui' 11 | access = BaseWorld.Access.RED 12 | data_dir = os.path.join('plugins', name.lower(), 'data') 13 | 14 | 15 | async def enable(services): 16 | BaseWorld.apply_config('emu', BaseWorld.strip_yml('plugins/emu/conf/default.yml')[0]) 17 | plugin_svc = EmuService() 18 | emu_gui = EmuGUI(services, name, description) 19 | app = services.get('app_svc').application 20 | app.router.add_route('GET', '/plugin/emu/gui', emu_gui.splash) 21 | 22 | if not os.path.isdir(plugin_svc.repo_dir): 23 | await plugin_svc.clone_repo() 24 | 25 | for directory in ['abilities', 'adversaries', 'sources', 'planners']: 26 | full_path = os.path.join(data_dir, directory) 27 | if os.path.isdir(full_path): 28 | shutil.rmtree(full_path) 29 | 30 | await plugin_svc.decrypt_payloads() 31 | await plugin_svc.populate_data_directory() 32 | -------------------------------------------------------------------------------- /payloads/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mitre/emu/fdb0bab5c9c3e1f6fcff57cdd2a4d999955cb7b8/payloads/.gitkeep -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | pyminizip==0.2.6 2 | -------------------------------------------------------------------------------- /templates/emu.html: -------------------------------------------------------------------------------- 1 |
[templates/emu.html markup was lost in extraction; the recoverable visible text: the plugin title "Emu", the tagline "The collection of abilities from the CTID Adversary Emulation Plans", counters labelled "abilities" / "Abilities" and "adversaries" / "Adversaries", and the note "View or edit these abilities and adversaries on their respective pages."]
30 | 31 | 53 | 54 | 60 | -------------------------------------------------------------------------------- /tests/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mitre/emu/fdb0bab5c9c3e1f6fcff57cdd2a4d999955cb7b8/tests/.gitkeep -------------------------------------------------------------------------------- /tests/test_emu_svc.py: -------------------------------------------------------------------------------- 1 | import glob 2 | import yaml 3 | import shutil 4 | 5 | import asyncio 6 | import pytest 7 | 8 | from pathlib import Path, PosixPath 9 | from unittest.mock import patch, call 10 | 11 | from app.utility.base_world import BaseWorld 12 | from plugins.emu.app.emu_svc import EmuService 13 | 14 | 15 | def async_mock_return(to_return): 16 | mock_future = asyncio.Future() 17 | mock_future.set_result(to_return) 18 | return mock_future 19 | 20 | 21 | @pytest.fixture 22 | def emu_svc(): 23 | return EmuService() 24 | 25 | 26 | @pytest.fixture 27 | def planner_yaml(): 28 | return [yaml.safe_load(''' 29 | --- 30 | id: testid 31 | name: Test Planner 32 | description: | 33 | Test planner 34 | module: plugins.emu.app.test_planner_module 35 | params: 36 | param_name: 37 | key1: 38 | - value1 39 | key2: 40 | - value2 41 | ignore_enforcement_modules: [] 42 | allow_repeatable_abilities: False 43 | ''')] 44 | 45 | 46 | @pytest.fixture 47 | def not_planner_yaml(): 48 | return [yaml.safe_load(''' 49 | --- 50 | name: I am a yaml file 51 | value: But I am not a planner 52 | description: something 53 | ''')] 54 | 55 | 56 | @pytest.fixture 57 | def sample_emu_plan(): 58 | return yaml.safe_load(''' 59 | - emulation_plan_details: 60 | id: planid123 61 | adversary_name: Adversary APTXY 62 | adversary_description: Adversary description 63 | attack_version: 8 64 | format_version: 1.0 65 | 66 | # two payloads, 1 input arg 67 | - id: 1-2-3 68 | name: Ability 1 69 | description: Desc 1 70 | tactic: tactic1 71 | 
technique: 72 | attack_id: technique1 73 | name: technique 1 74 | cti_source: source1 75 | procedure_group: procedure1 76 | procedure_step: step1 77 | platforms: 78 | linux: 79 | sh: 80 | command: test command 81 | payloads: 82 | - payload1A 83 | - payload1B 84 | 85 | input_arguments: 86 | arg1: 87 | description: argdesc1 88 | type: string 89 | default: default1 90 | 91 | # 1 payload, 2 input arguments, 2 executors 92 | - id: 2-3-4 93 | name: Ability 2 94 | description: Desc 2 95 | tactic: tactic2 96 | technique: 97 | attack_id: technique2 98 | name: technique 2 99 | cti_source: source2 100 | procedure_group: procedure2 101 | procedure_step: step2 102 | platforms: 103 | linux: 104 | sh: 105 | command: test command 106 | payloads: 107 | - payload2A 108 | windows: 109 | cmd: 110 | command: test command 111 | 112 | input_arguments: 113 | arg1: 114 | description: argdesc1 115 | type: string 116 | default: default1 117 | arg2: 118 | description: argdesc2 119 | type: string 120 | default: default2 121 | 122 | # 0 payloads, 0 input arguments, 2 uploads, cleanup 123 | - id: 3-4-5 124 | name: Ability 3 125 | description: Desc 3 126 | tactic: tactic3 127 | technique: 128 | attack_id: technique3 129 | name: technique 3 130 | cti_source: source3 131 | procedure_group: procedure3 132 | procedure_step: step3 133 | platforms: 134 | linux: 135 | sh: 136 | command: test command 137 | uploads: 138 | - upload1A 139 | - upload1B 140 | cleanup: cleanup command 141 | ''') 142 | 143 | 144 | class TestEmuSvc: 145 | LIBRARY_GLOB_PATH = 'plugins/emu/data/adversary-emulation-plans/*' 146 | PLANNER_GLOB_PATH = 'plugins/emu/data/adversary-emulation-plans/*/Emulation_Plan/yaml/planners/*.yml' 147 | PLANNER_PATH = 'plugins/emu/data/adversary-emulation-plans/library1/Emulation_Plan/yaml/planners/test_planner.yml' 148 | DEST_PLANNER_PATH = 'plugins/emu/data/planners/testid.yml' 149 | 150 | def test_general_svc_creation(self, emu_svc): 151 | assert emu_svc.emu_dir == 'plugins/emu' 152 | assert 
emu_svc.repo_dir == 'plugins/emu/data/adversary-emulation-plans' 153 | assert emu_svc.data_dir == 'plugins/emu/data' 154 | assert emu_svc.payloads_dir == 'plugins/emu/payloads' 155 | 156 | async def test_ingest_planner(self, emu_svc, planner_yaml): 157 | 158 | with patch.object(BaseWorld, 'strip_yml', return_value=planner_yaml) as new_strip_yml: 159 | with patch.object(shutil, 'copyfile', return_value=None) as new_copyfile: 160 | await emu_svc._ingest_planner(TestEmuSvc.PLANNER_PATH) 161 | new_strip_yml.assert_called_once_with(TestEmuSvc.PLANNER_PATH) 162 | new_copyfile.assert_called_once_with(TestEmuSvc.PLANNER_PATH, TestEmuSvc.DEST_PLANNER_PATH) 163 | 164 | async def test_ingest_bad_planner(self, emu_svc, not_planner_yaml): 165 | with patch.object(BaseWorld, 'strip_yml', return_value=not_planner_yaml) as new_strip_yml: 166 | with patch.object(shutil, 'copyfile', return_value=None) as new_copyfile: 167 | await emu_svc._ingest_planner(TestEmuSvc.PLANNER_PATH) 168 | new_strip_yml.assert_called_once_with(TestEmuSvc.PLANNER_PATH) 169 | new_copyfile.assert_not_called() 170 | 171 | async def test_load_planners(self, emu_svc, planner_yaml): 172 | with patch.object(BaseWorld, 'strip_yml', return_value=planner_yaml) as new_strip_yml: 173 | with patch.object(shutil, 'copyfile', return_value=None) as new_copyfile: 174 | with patch.object(glob, 'iglob', return_value=[TestEmuSvc.PLANNER_PATH]) as new_iglob: 175 | await emu_svc._load_planners(TestEmuSvc.LIBRARY_GLOB_PATH) 176 | new_iglob.assert_called_once_with(TestEmuSvc.PLANNER_GLOB_PATH) 177 | new_strip_yml.assert_called_once_with(TestEmuSvc.PLANNER_PATH) 178 | new_copyfile.assert_called_once_with(TestEmuSvc.PLANNER_PATH, TestEmuSvc.DEST_PLANNER_PATH) 179 | 180 | async def test_load_bad_planners(self, emu_svc, not_planner_yaml): 181 | with patch.object(BaseWorld, 'strip_yml', return_value=not_planner_yaml) as new_strip_yml: 182 | with patch.object(shutil, 'copyfile', return_value=None) as new_copyfile: 183 | with 
patch.object(glob, 'iglob', return_value=[TestEmuSvc.PLANNER_PATH]) as new_iglob: 184 | await emu_svc._load_planners(TestEmuSvc.LIBRARY_GLOB_PATH) 185 | new_iglob.assert_called_once_with(TestEmuSvc.PLANNER_GLOB_PATH) 186 | new_strip_yml.assert_called_once_with(TestEmuSvc.PLANNER_PATH) 187 | new_copyfile.assert_not_called() 188 | 189 | async def test_ingest_abilities(self, emu_svc, sample_emu_plan): 190 | with patch.object(EmuService, '_write_ability', return_value=async_mock_return(None)) as write_ability: 191 | abilities, facts, at_total, at_ingested, errors = await emu_svc._ingest_abilities(sample_emu_plan) 192 | assert write_ability.call_count == 3 193 | assert abilities == ['1-2-3', '2-3-4', '3-4-5'] 194 | assert facts == [ 195 | dict(trait='arg1', value='default1'), 196 | dict(trait='arg1', value='default1'), 197 | dict(trait='arg2', value='default2'), 198 | ] 199 | assert at_total == 3 200 | assert at_ingested == 3 201 | assert errors == 0 202 | assert {'payload1A', 'payload1B', 'payload2A'} == emu_svc.required_payloads 203 | write_ability.assert_has_calls([ 204 | call(dict( 205 | id='1-2-3', name='Ability 1', description='Desc 1', tactic='tactic1', 206 | technique=dict(name='technique 1', attack_id='technique1'), repeatable=False, requirements=[], 207 | platforms=dict(linux=dict(sh=dict(command='test command', payloads=['payload1A', 'payload1B']))), 208 | )), 209 | call(dict( 210 | id='2-3-4', name='Ability 2', description='Desc 2', tactic='tactic2', 211 | technique=dict(name='technique 2', attack_id='technique2'), repeatable=False, requirements=[], 212 | platforms=dict( 213 | linux=dict(sh=dict(command='test command', payloads=['payload2A'])), 214 | windows=dict(cmd=dict(command='test command')), 215 | ), 216 | )), 217 | call(dict( 218 | id='3-4-5', name='Ability 3', description='Desc 3', tactic='tactic3', 219 | technique=dict(name='technique 3', attack_id='technique3'), repeatable=False, requirements=[], 220 | 
platforms=dict(linux=dict(sh=dict(command='test command', uploads=['upload1A', 'upload1B'], 221 | cleanup='cleanup command'))), 222 | )), 223 | ]) 224 | 225 | def test_store_required_payloads(self, emu_svc): 226 | def _rglob(_, target): 227 | return [PosixPath('/path/to/' + target), PosixPath('/path2/to/' + target)] 228 | 229 | emu_svc.required_payloads = {'payload1', 'payload2', 'payload3'} 230 | with patch.object(Path, 'rglob', new=_rglob): 231 | with patch.object(shutil, 'copyfile', return_value=None) as new_copyfile: 232 | emu_svc._store_required_payloads() 233 | assert new_copyfile.call_count == 3 234 | new_copyfile.assert_has_calls([ 235 | call(PosixPath('/path/to/payload1'), 'plugins/emu/payloads/payload1'), 236 | call(PosixPath('/path/to/payload2'), 'plugins/emu/payloads/payload2'), 237 | call(PosixPath('/path/to/payload3'), 'plugins/emu/payloads/payload3'), 238 | ], any_order=True) 239 | 240 | def test_register_required_payloads(self, emu_svc): 241 | payloads = ['payload1', 'payload2', 'payload3', 'sandcat.go-darwin', 'sandcat.go-linux', 'sandcat.go-windows'] 242 | want = {'payload1', 'payload2', 'payload3'} 243 | emu_svc._register_required_payloads(payloads) 244 | assert emu_svc.required_payloads == want 245 | -------------------------------------------------------------------------------- /tests/test_group_filtered_planner.py: -------------------------------------------------------------------------------- 1 | import pytest 2 | 3 | from app.objects.secondclass.c_link import Link 4 | from plugins.emu.app.group_filtered_planner import LogicalPlanner 5 | 6 | 7 | BUCKET_NAME = 'fetch_and_run_links' 8 | 9 | 10 | class DummyOperation: 11 | def __init__(self, dummy_adversary, dummy_agents): 12 | self.adversary = dummy_adversary 13 | self.agents = dummy_agents 14 | 15 | async def wait_for_links_completion(self, _): 16 | return 17 | 18 | async def apply(self, link): 19 | return link 20 | 21 | 22 | class DummyAdversary: 23 | def __init__(self, atomic_ordering): 24 
| self.atomic_ordering = atomic_ordering 25 | 26 | 27 | class DummyAgent: 28 | def __init__(self, paw, group): 29 | self.paw = paw 30 | self.group = group 31 | 32 | 33 | class DummyAbility: 34 | def __init__(self, ability_id): 35 | self.ability_id = ability_id 36 | 37 | 38 | @pytest.fixture 39 | def dummy_agents(): 40 | return [ 41 | DummyAgent('paw1', 'group1'), 42 | DummyAgent('paw2', 'group2'), 43 | DummyAgent('paw3', 'group3'), 44 | DummyAgent('paw4', 'group1'), 45 | DummyAgent('paw5', 'group4'), 46 | ] 47 | 48 | 49 | @pytest.fixture 50 | def pending_links(): 51 | return [ 52 | Link(command='test command', paw='paw1', ability=DummyAbility('123')), 53 | Link(command='test command variant', paw='paw1', ability=DummyAbility('123')), 54 | Link(command='test command', paw='paw2', ability=DummyAbility('123')), 55 | Link(command='test command', paw='paw3', ability=DummyAbility('123')), 56 | ] 57 | 58 | 59 | @pytest.fixture 60 | def potential_links_dict(): 61 | return { 62 | 'paw1': [ 63 | Link(command='test command', paw='paw1', ability=DummyAbility('123')), 64 | Link(command='test command variant', paw='paw1', ability=DummyAbility('123')), 65 | Link(command='test command', paw='paw1', ability=DummyAbility('456')), 66 | Link(command='test command', paw='paw1', ability=DummyAbility('1011')), 67 | ], 68 | 'paw2': [ 69 | Link(command='test command', paw='paw2', ability=DummyAbility('123')), 70 | ], 71 | 'paw3': [ 72 | Link(command='test command', paw='paw3', ability=DummyAbility('1011')), 73 | ], 74 | 'paw4': [ 75 | Link(command='test command', paw='paw4', ability=DummyAbility('456')), 76 | ], 77 | } 78 | 79 | 80 | @pytest.fixture 81 | def potential_links_list(): 82 | return [ 83 | Link(command='test command', paw='paw1', ability=DummyAbility('123')), 84 | Link(command='test command variant', paw='paw1', ability=DummyAbility('123')), 85 | Link(command='test command', paw='paw1', ability=DummyAbility('456')), 86 | Link(command='test command', paw='paw2', 
ability=DummyAbility('123')), 87 | Link(command='test command', paw='paw3', ability=DummyAbility('1011')), 88 | Link(command='test command', paw='paw4', ability=DummyAbility('456')), 89 | Link(command='test command', paw='paw1', ability=DummyAbility('1011')), 90 | ] 91 | 92 | 93 | @pytest.fixture 94 | def sample_abilities(): 95 | return ['123', '456', '789', '1011'] 96 | 97 | 98 | @pytest.fixture 99 | def sample_filter(): 100 | return { 101 | '123': ['group1'], 102 | '789': ['group2'], 103 | '1011': ['group1', 'group3'], 104 | } 105 | 106 | 107 | @pytest.fixture 108 | def generate_planner(potential_links_dict): 109 | async def _get_links_mock(agent): 110 | return potential_links_dict.get(agent.paw, []) 111 | 112 | def _generate_planner(atomic_ordering, agents, filtered_groups_by_ability=None): 113 | operation = DummyOperation(DummyAdversary(atomic_ordering), agents) 114 | planner = LogicalPlanner(operation, None, filtered_groups_by_ability=filtered_groups_by_ability) 115 | planner._get_links = _get_links_mock 116 | return planner 117 | return _generate_planner 118 | 119 | 120 | @pytest.fixture 121 | def planner_without_filter(generate_planner, dummy_agents, sample_abilities): 122 | return generate_planner(sample_abilities, dummy_agents) 123 | 124 | 125 | @pytest.fixture 126 | def filtered_planner(generate_planner, dummy_agents, sample_abilities, sample_filter): 127 | return generate_planner(sample_abilities, dummy_agents, filtered_groups_by_ability=sample_filter) 128 | 129 | 130 | class TestGroupFilteredPlanner: 131 | async def test_fetch_from_pending_links(self, planner_without_filter, pending_links): 132 | planner_without_filter.pending_links = pending_links 133 | links_to_use = planner_without_filter._fetch_from_pending_links() 134 | assert len(links_to_use) == 3 135 | assert len(planner_without_filter.pending_links) == 1 136 | assert links_to_use[0].paw == 'paw1' and links_to_use[0].ability.ability_id == '123' 137 | assert links_to_use[0].command == 'test 
command' 138 | assert links_to_use[1].paw == 'paw2' 139 | assert links_to_use[2].paw == 'paw3' 140 | assert planner_without_filter.pending_links[0].paw == 'paw1' 141 | assert planner_without_filter.pending_links[0].command == 'test command variant' 142 | 143 | async def test_fetch_from_empty_pending_links(self, planner_without_filter): 144 | links_to_use = planner_without_filter._fetch_from_pending_links() 145 | assert not links_to_use 146 | assert not planner_without_filter.pending_links 147 | 148 | async def test_get_valid_agents_for_abil_without_filter(self, planner_without_filter): 149 | assert len(planner_without_filter.operation.agents) == 5 150 | valid_agents = planner_without_filter._get_valid_agents_for_ability('123') 151 | assert len(valid_agents) == 5 152 | 153 | async def test_get_pending_links_without_filter(self, planner_without_filter): 154 | links = await planner_without_filter._get_pending_links('123') 155 | assert len(links) == 3 156 | assert links[0].paw == 'paw1' 157 | assert links[0].ability.ability_id == '123' 158 | assert links[0].command == 'test command' 159 | assert links[1].paw == 'paw1' 160 | assert links[1].ability.ability_id == '123' 161 | assert links[1].command == 'test command variant' 162 | assert links[2].paw == 'paw2' 163 | assert links[2].ability.ability_id == '123' 164 | assert links[2].command == 'test command' 165 | 166 | async def test_fetch_links_without_filter(self, planner_without_filter): 167 | assert not planner_without_filter.pending_links 168 | assert planner_without_filter.current_ability_index == 0 169 | 170 | # first pass 171 | links_to_use = await planner_without_filter._fetch_links() 172 | assert len(planner_without_filter.pending_links) == 1 173 | assert planner_without_filter.pending_links[0].paw == 'paw1' 174 | assert planner_without_filter.pending_links[0].command == 'test command variant' 175 | assert planner_without_filter.pending_links[0].ability.ability_id == '123' 176 | assert 
planner_without_filter.current_ability_index == 1
177 |         assert len(links_to_use) == 2
178 |         assert links_to_use[0].paw == 'paw1'
179 |         assert links_to_use[0].command == 'test command'
180 |         assert links_to_use[0].ability.ability_id == '123'
181 |         assert links_to_use[1].paw == 'paw2'
182 |         assert links_to_use[1].command == 'test command'
183 |         assert links_to_use[1].ability.ability_id == '123'
184 | 
185 |         # second pass - finishes links from first pass
186 |         links_to_use = await planner_without_filter._fetch_links()
187 |         assert len(planner_without_filter.pending_links) == 0
188 |         assert planner_without_filter.current_ability_index == 1
189 |         assert len(links_to_use) == 1
190 |         assert links_to_use[0].paw == 'paw1'
191 |         assert links_to_use[0].command == 'test command variant'
192 |         assert links_to_use[0].ability.ability_id == '123'
193 | 
194 |         # third pass - ability #2
195 |         links_to_use = await planner_without_filter._fetch_links()
196 |         assert len(planner_without_filter.pending_links) == 0
197 |         assert planner_without_filter.current_ability_index == 2
198 |         assert len(links_to_use) == 2
199 |         assert links_to_use[0].paw == 'paw1'
200 |         assert links_to_use[0].command == 'test command'
201 |         assert links_to_use[0].ability.ability_id == '456'
202 |         assert links_to_use[1].paw == 'paw4'
203 |         assert links_to_use[1].command == 'test command'
204 |         assert links_to_use[1].ability.ability_id == '456'
205 | 
206 |         # fourth pass - skip ability #3 and go to #4
207 |         links_to_use = await planner_without_filter._fetch_links()
208 |         assert len(planner_without_filter.pending_links) == 0
209 |         assert planner_without_filter.current_ability_index == 4
210 |         assert len(links_to_use) == 2
211 |         assert links_to_use[0].paw == 'paw1'
212 |         assert links_to_use[0].command == 'test command'
213 |         assert links_to_use[0].ability.ability_id == '1011'
214 |         assert links_to_use[1].paw == 'paw3'
215 |         assert links_to_use[1].command == 'test command'
216 |         assert links_to_use[1].ability.ability_id == '1011'
217 | 
218 |         # fifth pass - end
219 |         links_to_use = await planner_without_filter._fetch_links()
220 |         assert len(planner_without_filter.pending_links) == 0
221 |         assert planner_without_filter.current_ability_index == 4
222 |         assert len(links_to_use) == 0
223 | 
224 |     async def test_get_valid_agents_for_abil_with_filter(self, filtered_planner):
225 |         assert len(filtered_planner.operation.agents) == 5
226 |         valid_agent_paws = [agent.paw for agent in filtered_planner._get_valid_agents_for_ability('123')]
227 |         assert len(valid_agent_paws) == 2
228 |         assert 'paw1' in valid_agent_paws
229 |         assert 'paw4' in valid_agent_paws
230 |         valid_agent_paws = [agent.paw for agent in filtered_planner._get_valid_agents_for_ability('456')]
231 |         assert len(valid_agent_paws) == 5
232 |         valid_agent_paws = [agent.paw for agent in filtered_planner._get_valid_agents_for_ability('789')]
233 |         assert len(valid_agent_paws) == 1
234 |         assert 'paw2' in valid_agent_paws
235 | 
236 |     async def test_get_pending_links_with_filter(self, filtered_planner):
237 |         links = await filtered_planner._get_pending_links('123')
238 |         assert len(links) == 2
239 |         assert links[0].paw == 'paw1'
240 |         assert links[0].ability.ability_id == '123'
241 |         assert links[0].command == 'test command'
242 |         assert links[1].paw == 'paw1'
243 |         assert links[1].ability.ability_id == '123'
244 |         assert links[1].command == 'test command variant'
245 | 
246 |         links = await filtered_planner._get_pending_links('456')
247 |         assert len(links) == 2
248 |         assert links[0].paw == 'paw1'
249 |         assert links[0].ability.ability_id == '456'
250 |         assert links[0].command == 'test command'
251 |         assert links[1].paw == 'paw4'
252 |         assert links[1].ability.ability_id == '456'
253 |         assert links[1].command == 'test command'
254 | 
255 |         links = await filtered_planner._get_pending_links('1011')
256 |         assert len(links) == 2
257 |         assert links[0].paw == 'paw1'
258 |         assert links[0].ability.ability_id == '1011'
259 |         assert links[0].command == 'test command'
260 |         assert links[1].paw == 'paw3'
261 |         assert links[1].ability.ability_id == '1011'
262 |         assert links[1].command == 'test command'
263 | 
264 |     async def test_fetch_links_with_filter(self, filtered_planner):
265 |         assert not filtered_planner.pending_links
266 |         assert filtered_planner.current_ability_index == 0
267 | 
268 |         # first pass
269 |         links_to_use = await filtered_planner._fetch_links()
270 |         assert len(filtered_planner.pending_links) == 1
271 |         assert filtered_planner.pending_links[0].paw == 'paw1'
272 |         assert filtered_planner.pending_links[0].command == 'test command variant'
273 |         assert filtered_planner.pending_links[0].ability.ability_id == '123'
274 |         assert filtered_planner.current_ability_index == 1
275 |         assert len(links_to_use) == 1
276 |         assert links_to_use[0].paw == 'paw1'
277 |         assert links_to_use[0].command == 'test command'
278 |         assert links_to_use[0].ability.ability_id == '123'
279 | 
280 |         # second pass - finishes links from first pass
281 |         links_to_use = await filtered_planner._fetch_links()
282 |         assert len(filtered_planner.pending_links) == 0
283 |         assert filtered_planner.current_ability_index == 1
284 |         assert len(links_to_use) == 1
285 |         assert links_to_use[0].paw == 'paw1'
286 |         assert links_to_use[0].command == 'test command variant'
287 |         assert links_to_use[0].ability.ability_id == '123'
288 | 
289 |         # third pass - ability #2
290 |         links_to_use = await filtered_planner._fetch_links()
291 |         assert len(filtered_planner.pending_links) == 0
292 |         assert filtered_planner.current_ability_index == 2
293 |         assert len(links_to_use) == 2
294 |         assert links_to_use[0].paw == 'paw1'
295 |         assert links_to_use[0].command == 'test command'
296 |         assert links_to_use[0].ability.ability_id == '456'
297 |         assert links_to_use[1].paw == 'paw4'
298 |         assert links_to_use[1].command == 'test command'
299 |         assert links_to_use[1].ability.ability_id == '456'
300 | 
301 |         # fourth pass - skip ability #3 and go to #4
302 |         links_to_use = await filtered_planner._fetch_links()
303 |         assert len(filtered_planner.pending_links) == 0
304 |         assert filtered_planner.current_ability_index == 4
305 |         assert len(links_to_use) == 2
306 |         assert links_to_use[0].paw == 'paw1'
307 |         assert links_to_use[0].command == 'test command'
308 |         assert links_to_use[0].ability.ability_id == '1011'
309 |         assert links_to_use[1].paw == 'paw3'
310 |         assert links_to_use[1].command == 'test command'
311 |         assert links_to_use[1].ability.ability_id == '1011'
312 | 
313 |         # fifth pass - end
314 |         links_to_use = await filtered_planner._fetch_links()
315 |         assert len(filtered_planner.pending_links) == 0
316 |         assert filtered_planner.current_ability_index == 4
317 |         assert len(links_to_use) == 0
318 | 
--------------------------------------------------------------------------------
/tests/test_vssadmin_shadow_parser.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | 
3 | from plugins.emu.app.parsers.vssadmin_shadow import Parser
4 | 
5 | 
6 | @pytest.fixture
7 | def dummy_blob():
8 |     return r"""
9 | vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
10 | (C) Copyright 2001-2013 Microsoft Corp.
11 | 
12 | Successfully created shadow copy for 'C:\'
13 |     Shadow Copy ID: {uuid}
14 |     Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7
15 | """
16 | 
17 | 
18 | class TestVssadminShadowParser:
19 |     def test_parse_blob(self, dummy_blob):
20 |         results = Parser._get_volume_name(dummy_blob)
21 |         assert results == r'\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7'
22 | 
23 |     def test_parse_blob_without_match(self):
24 |         assert not Parser._get_volume_name('no volume name here')
25 | 
--------------------------------------------------------------------------------
/tox.ini:
--------------------------------------------------------------------------------
1 | # tox (https://tox.readthedocs.io/) is a tool for running tests
2 | # in multiple virtualenvs. This configuration file will run the
3 | # test suite on all supported python versions. To use it, "pip install tox"
4 | # and then run "tox" from this directory.
5 | 
6 | [tox]
7 | skipsdist = True
8 | envlist =
9 |     py{38,39,310,311}
10 |     style
11 |     coverage
12 |     bandit
13 | skip_missing_interpreters = true
14 | 
15 | [testenv]
16 | description = run tests
17 | passenv = TOXENV,CI,TRAVIS,TRAVIS_*,CODECOV_*
18 | deps =
19 |     virtualenv!=20.0.22
20 |     pre-commit
21 |     pytest
22 |     pytest-aiohttp
23 |     coverage
24 |     codecov
25 | changedir = /tmp/caldera
26 | commands =
27 |     /usr/bin/git clone https://github.com/mitre/caldera.git --recursive /tmp/caldera
28 |     /bin/rm -rf /tmp/caldera/plugins/emu
29 |     python -m pip install -r /tmp/caldera/requirements.txt
30 |     /usr/bin/cp -R {toxinidir} /tmp/caldera/plugins/emu
31 |     python -m pip install -r /tmp/caldera/plugins/emu/requirements.txt
32 |     coverage run -p -m pytest --tb=short --rootdir=/tmp/caldera /tmp/caldera/plugins/emu/tests -W ignore::DeprecationWarning
33 | allowlist_externals =
34 |     /usr/bin/git
35 |     /usr/bin/cp
36 |     /bin/rm
37 | 
38 | [testenv:style]
39 | deps =
40 |     pre-commit
41 | skip_install = true
42 | changedir = {toxinidir}
43 | commands =
44 |     pre-commit run --all-files --show-diff-on-failure
45 | 
46 | [testenv:coverage]
47 | deps =
48 |     coverage
49 | skip_install = true
50 | changedir = /tmp/caldera
51 | commands =
52 |     coverage combine
53 |     coverage html
54 |     coverage report
55 | 
56 | [testenv:coverage-ci]
57 | deps =
58 |     coveralls
59 |     coverage
60 | skip_install = true
61 | changedir = /tmp/caldera
62 | commands =
63 |     coverage combine
64 |     coverage xml
65 |     coverage report
66 | 
--------------------------------------------------------------------------------