22 | 23 | Download Mappings – Excel 24 |
25 | 26 | You can download a spreadsheet containing the mappings for all sensors or dive into the 27 | details for a specific sensor: 28 | 29 | .. toctree:: 30 | :maxdepth: 1 31 | 32 | mapping_auditd 33 | mapping_cloudtrail 34 | mapping_osquery 35 | mapping_sysmon 36 | mapping_winevtx 37 | mapping_zeek 38 | 39 | .. image:: ../_static/sensors.png 40 | 41 | Visualize Coverage 42 | ------------------ 43 | 44 | .. raw:: html 45 | 46 | 47 | 48 | Open Sensor Coverage in ATT&CK Navigator 49 |
50 | 51 | The Navigator layer connects sensors → data sources → techniques. Each sensor is color 52 | coded to the techniques that it is mapped to. Techniques with multiple sensors are 53 | shaded in green: lighter green means fewer sensors and darker green means more sensors. 54 | 55 | .. image:: ../_static/sensor_comparisons.png 56 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | Sensor Mappings to ATT&CK 2 | ========================= 3 | 4 | The Sensor Mappings to ATT&CK Project (SMAP) is a collection of resources to assist 5 | security operations teams and security leaders with understanding which tools, 6 | capabilities, and events can help provide visibility into real-world adversary behaviors 7 | potentially occurring in their environments. SMAP builds on `MITRE ATT&CK® 8 |10 | 11 | Download CSV 12 | 13 | 14 | Download STIX 15 | 16 | 17 | Open in ATT&CK Navigator 18 |
19 | 20 | .. MAPPINGS_TABLE Generated at: 2025-03-20T10:08:37.277616Z 21 | 22 | Enterprise 23 | ---------- 24 | 25 | .. list-table:: 26 | :widths: 50 50 27 | :header-rows: 1 28 | 29 | * - EVENT 30 | - ATT&CK MAPPING 31 | 32 | * - | **1** 33 | | 34 | | A new process has been created 35 | - | **Data Source:** Process 36 | | **Data Component:** Process Creation 37 | 38 | * - | **2** 39 | | 40 | | A process changed a file creation time 41 | - | **Data Source:** File 42 | | **Data Component:** File Modification 43 | 44 | * - | **3** 45 | | 46 | | Network connection 47 | - | **Data Source:** Network Traffic 48 | | **Data Component:** Network Connection Creation 49 | 50 | * - | **4** 51 | | 52 | | Sysmon service state changed. 53 | - | **Data Source:** Service 54 | | **Data Component:** Service Metadata 55 | 56 | * - | **5** 57 | | 58 | | Process terminated 59 | - | **Data Source:** Process 60 | | **Data Component:** Process Termination 61 | 62 | * - | **6** 63 | | 64 | | Driver loaded 65 | - | **Data Source:** Driver 66 | | **Data Component:** Driver Load 67 | 68 | * - | **7** 69 | | 70 | | Image Loaded 71 | - | **Data Source:** Module 72 | | **Data Component:** Module Load 73 | 74 | * - | **8** 75 | | 76 | | The CreateRemoteThread event detects when a process creates a thread in another process. 
77 | - | **Data Source:** Process 78 | | **Data Component:** Process Modification 79 | 80 | * - | **9** 81 | | 82 | | The RawAccessRead event detects when a process conducts reading operations from the drive using the \.\ denotation 83 | - | **Data Source:** File 84 | | **Data Component:** File Access 85 | 86 | * - | **10** 87 | | 88 | | ProcessAccess 89 | - | **Data Source:** Process 90 | | **Data Component:** Process Access 91 | 92 | * - | **11** 93 | | 94 | | FileCreate 95 | - | **Data Source:** File 96 | | **Data Component:** File Creation 97 | 98 | * - | **12** 99 | | 100 | | RegistryEvent (Object create and delete) 101 | - | **Data Source:** Windows Registry 102 | | **Data Component:** Windows Registry Key Creation 103 | 104 | * - | **12** 105 | | 106 | | RegistryEvent (Object create and delete) 107 | - | **Data Source:** Windows Registry 108 | | **Data Component:** Windows Registry Key Deletion 109 | 110 | * - | **13** 111 | | 112 | | RegistryEvent (Value Set) 113 | - | **Data Source:** Windows Registry 114 | | **Data Component:** Windows Registry Key Modification 115 | 116 | * - | **14** 117 | | 118 | | RegistryEvent (Key and Value Rename) 119 | - | **Data Source:** Windows Registry 120 | | **Data Component:** Windows Registry Key Modification 121 | 122 | * - | **15** 123 | | 124 | | FileCreateStreamHash 125 | - | **Data Source:** File 126 | | **Data Component:** File Creation 127 | 128 | * - | **17** 129 | | 130 | | PipeEvent (Pipe Created) 131 | - | **Data Source:** Named Pipe 132 | | **Data Component:** Named Pipe Metadata 133 | 134 | * - | **18** 135 | | 136 | | PipeEvent (Pipe Connected) 137 | - | **Data Source:** Named Pipe 138 | | **Data Component:** Named Pipe Connection 139 | 140 | * - | **18** 141 | | 142 | | PipeEvent (Pipe Connected) 143 | - | **Data Source:** Named Pipe 144 | | **Data Component:** Named Pipe Metadata 145 | 146 | * - | **19** 147 | | 148 | | WmiEvent (WmiEventFilter activity detected). 
149 | - | **Data Source:** WMI 150 | | **Data Component:** WMI Creation 151 | 152 | * - | **19** 153 | | 154 | | WmiEvent (WmiEventFilter activity detected). 155 | - | **Data Source:** WMI 156 | | **Data Component:** WMI Deletion 157 | 158 | * - | **20** 159 | | 160 | | WmiEvent (WmiEventConsumer activity detected). 161 | - | **Data Source:** WMI 162 | | **Data Component:** WMI Creation 163 | 164 | * - | **20** 165 | | 166 | | WmiEvent (WmiEventConsumer activity detected). 167 | - | **Data Source:** WMI 168 | | **Data Component:** WMI Deletion 169 | 170 | * - | **23** 171 | | 172 | | FileDelete 173 | - | **Data Source:** File 174 | | **Data Component:** File Deletion 175 | 176 | * - | **26** 177 | | 178 | | File Delete logged. 179 | - | **Data Source:** File 180 | | **Data Component:** File Deletion 181 | 182 | * - | **30** 183 | | 184 | | EventID(30) 185 | - | **Data Source:** Process 186 | | **Data Component:** Process Metadata 187 | .. /MAPPINGS_TABLE 188 | -------------------------------------------------------------------------------- /src/util/create_mappings.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import csv 3 | import json 4 | from pathlib import Path 5 | 6 | import pandas as pd 7 | 8 | 9 | def standardize(sheet, *columns_to_exclude): 10 | """Helper method to standardize columns used for STIX data.""" 11 | skip = ["Worksheet Name", "Event Description", "Event ID"] + list(columns_to_exclude) 12 | 13 | for idx, row in sheet.iterrows(): 14 | # Pandas does not have a good way to select only string values in a column, so we iterate by row to do our replacements. 
15 | for col in sheet.columns: 16 | if pd.isna(row[col]): 17 | continue 18 | elif isinstance(row[col], str): 19 | sheet.loc[idx, col] = sheet.loc[idx, col].strip() 20 | if col in skip: 21 | # Only strip out the whitespace from this column, iff row[col] is a string 22 | continue 23 | else: 24 | # Match case 25 | sheet.loc[idx, col] = sheet.loc[idx, col].replace(" ", "_") 26 | sheet.loc[idx, col] = sheet.loc[idx, col].lower() 27 | # Skip non-string rows 28 | sheet.drop_duplicates(inplace=True, ignore_index=True) 29 | # After dropping duplicates, revert changes 30 | for idx, row in sheet.iterrows(): 31 | for col in sheet.columns: 32 | if pd.isna(row[col]): 33 | continue 34 | elif isinstance(row[col], str): 35 | if col in skip: 36 | continue 37 | else: 38 | sheet.loc[idx, col] = sheet.loc[idx, col].replace("_", " ") 39 | sheet.loc[idx, col] = sheet.loc[idx, col].title() 40 | sheet.loc[idx, col] = sheet.loc[idx, col].replace("Wmi", "WMI") 41 | # Skip non-string rows 42 | 43 | 44 | def get_sheets(spreadsheet_location, config_location): 45 | """Helper method to separate combined Excel sheet into individual Dataframes""" 46 | with config_location.open("r", encoding="utf-8") as f: 47 | config = json.load(f) 48 | version = config["attack_version"] 49 | 50 | df = pd.read_excel(spreadsheet_location, sheet_name="Combined Events", usecols="A,C:I") 51 | standardize(df) 52 | 53 | # Merge in the Data Source ID's from the ATT&CK Data Source CSV 54 | datasource_csv_location = spreadsheet_location.parent.parent.parent 55 | data_source_ids = pd.read_csv(Path(datasource_csv_location, f"enterprise-attack-v{version}-datasources.csv"), usecols=[0, 1]) 56 | 57 | df = df.merge(data_source_ids, how="left", left_on="Data Source", right_on="name") 58 | df.drop(columns=["name"], inplace=True) 59 | df.rename(columns={"ID":"Data Source ID"}, inplace=True) 60 | df = df[['Worksheet Name', 'Data Source', 'Data Source ID', 'Data Component', 'Event Description', 61 | 'Event ID', 'Source', 
'Relationship', 'Target']] 62 | df["Data Source ID"] = df["Data Source ID"].apply(lambda n: -1 if pd.isna(n) else n) 63 | # Where a value of `-1` indicates that this is a new ATT&CK Data Source 64 | 65 | worksheet_names = df["Worksheet Name"].dropna().unique() 66 | sheets = [] 67 | for _worksheet in worksheet_names: 68 | _df = df[df["Worksheet Name"] == _worksheet].reset_index(drop=True) 69 | sheets.append((_df, _worksheet)) 70 | return sheets 71 | 72 | 73 | def generate_csv_spreadsheet(sheets, mappings_location): 74 | """Reads the main XLSX mappings file and creates a spreadsheet for the mappings in CSV""" 75 | if not mappings_location.exists(): 76 | mappings_location.mkdir(parents=True) 77 | 78 | for sheet, name in sheets: 79 | with mappings_location.joinpath(f"{name}-sensors-mappings-enterprise.csv").open('w', newline='', encoding='utf-8') as csvfile: 80 | fieldnames = ['EVENT ID', 'EVENT DESCRIPTION', 'ATT&CK DATA SOURCE ID', 'ATT&CK DATA SOURCE', 'ATT&CK DATA COMPONENT', 'SOURCE', 'RELATIONSHIP', 'TARGET'] 81 | dataframe_fields = ['Event ID', 'Event Description', 'Data Source ID', 'Data Source', 'Data Component', 'Source', 'Relationship', 'Target'] 82 | 83 | writer = csv.DictWriter(csvfile, fieldnames=fieldnames) 84 | writer.writeheader() 85 | 86 | for idx, row in sheet.iterrows(): 87 | is_mapped = (pd.notna(row["Data Source"])) and (pd.notna(row["Data Component"])) and (pd.notna(row["Relationship"])) 88 | if not is_mapped: 89 | # Skip this row 90 | continue 91 | csv_row = {} 92 | for i in range(len(fieldnames)): 93 | if pd.isna(row[dataframe_fields[i]]): 94 | csv_row[fieldnames[i]] = None 95 | else: 96 | csv_row[fieldnames[i]] = row[dataframe_fields[i]] 97 | writer.writerow(csv_row) 98 | # Skip any rows without mappable fields 99 | 100 | 101 | def _parse_args(): 102 | ROOT_DIR = Path(__file__).parent.parent.parent 103 | 104 | parser = argparse.ArgumentParser(description="Create mappings from sensors data") 105 | parser.add_argument("-config_location", 106 | 
dest="config_location", 107 | help="filepath to the configuration for the framework", 108 | type=Path, 109 | default=Path(ROOT_DIR, "mappings", "input", "config.json")) 110 | parser.add_argument("-spreadsheet_location", 111 | dest="spreadsheet_location", 112 | help="filepath to the Excel spreadsheet for the mappings", 113 | type=Path, 114 | default=Path(ROOT_DIR, "mappings", "input", 115 | "enterprise", "xlsx", "Sensor ID to Data Source to API v2.xlsx")) 116 | parser.add_argument("-mappings_location", 117 | dest="mappings_location", 118 | help="filepath to the folder to write CSV spreadsheets", 119 | type=Path, 120 | default=Path(ROOT_DIR, "mappings", "input", 121 | "enterprise", "csv")) 122 | return parser.parse_args() 123 | 124 | 125 | def main(): 126 | args = _parse_args() 127 | sheets = get_sheets(args.spreadsheet_location, args.config_location) 128 | generate_csv_spreadsheet(sheets, args.mappings_location) 129 | 130 | 131 | if __name__ == '__main__': 132 | main() 133 | 134 | -------------------------------------------------------------------------------- /docs/_templates/layout.html: -------------------------------------------------------------------------------- 1 | {%- set url_root = pathto('', 1) -%} {%- if url_root == '#' %}{% set url_root = 2 | '' %}{% endif -%} {%- if not embedded and docstitle %} {%- set titlesuffix = " 3 | — "|safe + docstitle|e -%} {%- else %} {%- set titlesuffix = "" -%} {%- 4 | endif -%} {%- set hidetoc = '' %} {%- if meta is defined and meta %} {%- if 5 | 'hidetoc' in meta.keys() %} {%- set hidetoc = meta.get('hidetoc') %} {%- endif 6 | %} {%- endif %} {%- set sphinx_version_info = sphinx_version.split('.') | 7 | map('int') | list -%} 8 | 9 | 11 | 12 | 13 | 14 | {%- if metatags %} {{ metatags }} {%- endif %} 15 | 16 | {% block htmltitle %} 17 |") 53 | file_beginning.append( 54 | f' ' 55 | ) 56 | file_beginning.append( 57 | ' Download CSV' 58 | ) 59 | file_beginning.append("") 60 | file_beginning.append( 61 | f' ' 62 | ) 63 | 
file_beginning.append( 64 | ' Download STIX' 65 | ) 66 | file_beginning.append("") 67 | file_beginning.append( 68 | f' ' 69 | ) 70 | file_beginning.append( 71 | ' Open in ATT&CK Navigator' 72 | ) 73 | file_beginning.append("
") 74 | file_beginning.append("") 75 | file_beginning.append(".. MAPPINGS_TABLE") 76 | file_beginning.append(".. /MAPPINGS_TABLE") 77 | with open(old_doc, "w") as _new_file: 78 | _new_file.write("\n".join(file_beginning)) 79 | 80 | with open(old_doc) as file: 81 | new_doc = insert_docs(file, tables, "MAPPINGS_TABLE") 82 | if new_doc[-1] != "\n": 83 | new_doc += "\n" 84 | # New line at end of file 85 | with open(old_doc, "w") as out: 86 | out.write(new_doc) 87 | 88 | 89 | def generate_table_from_csv(csv_file): 90 | """Parse out table data from `csv_file`.""" 91 | obj_lines = [] 92 | with csv_file.open("r") as file: 93 | reader = csv.DictReader(file) 94 | 95 | for row in reader: 96 | _dict = { 97 | "EVENT ID": row["EVENT ID"], 98 | "EVENT DESCRIPTION": row["EVENT DESCRIPTION"], 99 | "ATT&CK DATA SOURCE": row["ATT&CK DATA SOURCE"], 100 | "ATT&CK DATA COMPONENT": row["ATT&CK DATA COMPONENT"], 101 | } 102 | if _dict not in obj_lines: 103 | obj_lines.append(_dict) 104 | return obj_lines 105 | 106 | 107 | def csv_to_tables(attack_types, mappings_location): 108 | """Goes through the CSV files for `attack_types` and extracts data for Sphinx tables.""" 109 | 110 | sensor_table_lists = {} 111 | # Key = Sensor, Value = [{Attack_Type1: obj_lines}, {Attack_Type2: obj_lines}, ...] 
112 | for attack_type in attack_types: 113 | current_dir = Path(mappings_location, attack_type, "csv") 114 | csv_files = [ 115 | filename for filename in current_dir.iterdir() if filename.is_file() 116 | ] 117 | 118 | # Work on each CSV file 119 | for _csv in csv_files: 120 | _sensor_ = _csv.name.split("-sensors-mappings")[0] 121 | obj_lines = generate_table_from_csv(_csv) 122 | 123 | if _sensor_ in sensor_table_lists: 124 | sensor_table_lists[_sensor_].append({attack_type: obj_lines}) 125 | else: 126 | sensor_table_lists[_sensor_] = [{attack_type: obj_lines}] 127 | return sensor_table_lists 128 | 129 | 130 | def generate_mappings_table(sensor_table_lists): 131 | """Consolidates output from `csv_to_tables` and returns it as a dictionary. Key is a sensor name, Value is a list of table data.""" 132 | sensor_tables = {} 133 | for sensor in sensor_table_lists: 134 | attack_types: list = sensor_table_lists[sensor] 135 | # attack_types is a List of Dictionaries. 136 | # Key is the attack_type, Value is a list of rows (dictionaries) 137 | obj_lines = [""] 138 | for attack_type_dict in attack_types: 139 | attack_type = list(attack_type_dict.keys())[0] 140 | if attack_type == "ics": 141 | attack_type = attack_type.upper() 142 | else: 143 | attack_type = attack_type.title() 144 | 145 | obj_lines.extend(header_generator(attack_type, "-")) 146 | 147 | # Add table header 148 | obj_lines.extend(table_start(sensor)) 149 | 150 | rows = attack_type_dict[attack_type.lower()] 151 | # Check if Event IDs are numerical. 
If so, convert them for sorting 152 | sample = rows[0]["EVENT ID"] 153 | try: 154 | int(sample) 155 | for _dict in rows: 156 | new_event_id = int(_dict.pop("EVENT ID")) 157 | _dict["EVENT ID"] = new_event_id 158 | except ValueError: 159 | pass 160 | for row in sorted(rows, key=lambda i: i["EVENT ID"]): 161 | obj_lines.extend(extract_row_data(row)) 162 | obj_lines.append("") 163 | # Remove the last empty line 164 | if not obj_lines[-1]: 165 | obj_lines.pop() 166 | sensor_tables[sensor] = obj_lines 167 | return sensor_tables 168 | 169 | 170 | def extract_row_data(row: dict): 171 | # Unwrap multi-line inputs 172 | description = " ".join(filter(None, row["EVENT DESCRIPTION"].splitlines())) 173 | 174 | result = [ 175 | f" * - | **{row['EVENT ID']}**", 176 | f" |", 177 | f" | {description}", 178 | f" - | **Data Source:** {row['ATT&CK DATA SOURCE']}", 179 | f" | **Data Component:** {row['ATT&CK DATA COMPONENT']}", 180 | ] 181 | 182 | return result 183 | 184 | 185 | def header_generator(variable_name: str, symbol: str): 186 | """Generates reusable underline for documents based on `variable_name`""" 187 | _header = [] 188 | _header.append(variable_name) 189 | _header.append("".join(f"{symbol}" for _ in range(len(variable_name)))) 190 | _header.append("") 191 | return _header 192 | 193 | 194 | def table_start(sensor): 195 | """Helper function to create the start of a table""" 196 | table_header = [] 197 | table_header.append(".. 
list-table::") 198 | table_header.append(f" :widths: 50 50") 199 | table_header.append(" :header-rows: 1") 200 | table_header.append("") 201 | table_header.append(" * - EVENT") 202 | table_header.append(" - ATT&CK MAPPING") 203 | table_header.append("") 204 | return table_header 205 | 206 | 207 | def _parse_args(): 208 | ROOT_DIR = Path(__file__).parent.parent.parent 209 | parser = argparse.ArgumentParser(description="Generate docs for mappings") 210 | parser.add_argument( 211 | "-docs", 212 | dest="docs_location", 213 | help="Path to documentation files to edit", 214 | type=Path, 215 | default=Path(ROOT_DIR, "docs", "levels"), 216 | ) 217 | parser.add_argument( 218 | "-mappings", 219 | dest="mappings_location", 220 | help="Path to mapping files directory", 221 | type=Path, 222 | default=Path(ROOT_DIR, "mappings", "input"), 223 | ) 224 | return parser.parse_args() 225 | 226 | 227 | def main(): 228 | args = _parse_args() 229 | 230 | attack_types = [ 231 | i 232 | for i in os.listdir(args.mappings_location) 233 | if Path(args.mappings_location, i).is_dir() 234 | ] 235 | sensor_table_lists = csv_to_tables(attack_types, args.mappings_location) 236 | sensor_table_data = generate_mappings_table(sensor_table_lists) 237 | write_doc_files(sensor_table_data, args.docs_location) 238 | 239 | 240 | if __name__ == "__main__": 241 | main() 242 | -------------------------------------------------------------------------------- /docs/example_technique_mappings/windows.rst: -------------------------------------------------------------------------------- 1 | Windows Example Scenarios 2 | ========================= 3 | 4 | Both Windows-based examples explore `Create or Modify System Process (T1543) 5 |