├── CODE_OF_CONDUCT.md ├── CONTRIBUTING.md ├── LICENSE ├── README.md ├── csv_manager_sechub.yaml └── csv_manager_sechub_cdk ├── bin └── cdk_solution.ts ├── cdk.json ├── config.json ├── lambdas ├── csvExporter.py ├── csvExporter.zip ├── csvObjects.py ├── csvUpdater.py └── csvUpdater.zip ├── lib └── sechub_finding_export.ts ├── package.json └── tsconfig.json /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 5 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. 
Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *main* branch. 27 | 2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 38 | 39 | GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute to. As our projects, by default, use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 
49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public github issue. 55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution. 60 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | 3 | Permission is hereby granted, free of charge, to any person obtaining a copy of 4 | this software and associated documentation files (the "Software"), to deal in 5 | the Software without restriction, including without limitation the rights to 6 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of 7 | the Software, and to permit persons to whom the Software is furnished to do so. 8 | 9 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 10 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS 11 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 12 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER 13 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 14 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
15 | 16 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## CSV Manager for AWS Security Hub 2 | 3 | CSV Manager for AWS Security Hub exports Security Hub findings to a CSV file and allows you to mass-update Security Hub findings by modifying that CSV file. For more information, please consult the `README.pdf` file in this repository. 4 | 5 | ## Build 6 | 7 | To build this app, you need to be in the cdk project root folder [`csv_manager_sechub_cdk`](/csv_manager_sechub_cdk/). Then run the following: 8 | 9 | 10 | 11 | 12 | 13 | $ npm install -g aws-cdk 14 | 15 | 16 | $ npm install 17 | 18 | 19 | $ npm run build 20 | 21 | 22 | ### IMPORTANT: You will need to add additional IAM principal(s) that can access the S3 bucket where findings will be exported. To do so, add a valid IAM principal ARN to [`config.json`](/csv_manager_sechub_cdk/config.json). 23 | 24 | ## Deploy 25 | 26 | $ cdk bootstrap aws://<ACCOUNT-ID>/<REGION> 27 | 28 | 29 | $ cdk deploy 30 | 31 | 32 | ## CDK Toolkit 33 | 34 | The [`cdk.json`](/csv_manager_sechub_cdk/cdk.json) file in the root of this repository includes 35 | instructions for the CDK toolkit on how to execute this program. 36 | 37 | After building your TypeScript code, you will be able to run the CDK toolkit's commands as usual: 38 | 39 | $ cdk ls 40 | 41 | 42 | $ cdk synth 43 | 44 | 45 | $ cdk deploy 46 | 47 | 48 | $ cdk diff 49 | 50 | 51 | ## Security 52 | 53 | See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information. 54 | 55 | ## License 56 | 57 | This library is licensed under the MIT-0 License. See the LICENSE file. 
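The IMPORTANT note above asks for a valid IAM principal ARN in `config.json` before deploying. A quick pre-deploy sanity check can catch a malformed ARN early; this is a minimal sketch (not part of the repository), assuming the `principalsJson.principals` layout used by this repo's `config.json`, with the repository's own placeholder ARN as sample input:

```python
import json
import re

# IAM role/user ARNs look like arn:<partition>:iam::<12-digit-account>:role/<name>
ARN_RE = re.compile(r"^arn:aws[\w-]*:iam::\d{12}:(?:role|user)/.+$")

def validate_principals(config_text: str) -> list:
    """Parse config.json text and return its principal ARNs, rejecting malformed ones."""
    principals = json.loads(config_text)["principalsJson"]["principals"]
    bad = [p for p in principals if not ARN_RE.match(p)]
    if bad:
        raise ValueError(f"invalid principal ARN(s): {bad}")
    return principals

sample = '{"principalsJson": {"principals": ["arn:aws:iam::123456789012:role/sechub_admin"]}}'
print(validate_principals(sample))
```

Running a check like this before `cdk deploy` fails fast locally instead of surfacing the problem later as a CloudFormation error.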
58 | 59 | -------------------------------------------------------------------------------- /csv_manager_sechub.yaml: -------------------------------------------------------------------------------- 1 | Parameters: 2 | S3AccessIAMRole: 3 | Type: String 4 | Description: IAM Role to access the CSV objects in S3. 5 | Frequency: 6 | Type: String 7 | Default: cron(0 8 ? * SUN *) 8 | Description: A cron or rate expression for how often the export occurs. 9 | Partition: 10 | Type: String 11 | Default: aws 12 | Description: The partition in which CSV Manager for Security Hub will operate. 13 | Regions: 14 | Type: String 15 | Default: us-east-1 16 | Description: The comma-delimited list of regions in which CSV Manager for Security Hub will operate. 17 | PrimaryRegion: 18 | Type: String 19 | Default: us-east-1 20 | Description: The region in which the S3 bucket and SSM parameters are stored. 21 | FindingsFolder: 22 | Type: String 23 | Default: Findings 24 | Description: Folder that will contain the exported findings. 25 | S3BucketCode: 26 | Type: String 27 | Default: "awsiammedia" 28 | Description: S3 Bucket that will contain Lambda code. 29 | S3KeyExporter: 30 | Type: String 31 | Default: "public/sample/1280-export-sh-findings-to-csv-format/csvExporter.zip" 32 | Description: S3 key prefix that will contain Lambda code for the Exporter. 33 | S3KeyUpdater: 34 | Type: String 35 | Default: "public/sample/1280-export-sh-findings-to-csv-format/csvUpdater.zip" 36 | Description: S3 key prefix that will contain Lambda code for the Updater. 37 | ExpirationPeriod: 38 | Type: Number 39 | Default: 365 40 | Description: Maximum days to retain exported findings. 41 | GlacierTransitionPeriod: 42 | Type: Number 43 | Default: 31 44 | Description: Maximum days before exported findings are moved to AWS Glacier. 
45 | Resources: 46 | s3kmskeyA0E45BCB: 47 | Type: AWS::KMS::Key 48 | Properties: 49 | KeyPolicy: 50 | Version: "2012-10-17" 51 | Statement: 52 | - Action: kms:* 53 | Effect: Allow 54 | Principal: 55 | AWS: 56 | Fn::Join: 57 | - "" 58 | - - "arn:" 59 | - Ref: AWS::Partition 60 | - ":iam::" 61 | - Ref: AWS::AccountId 62 | - :root 63 | Resource: "*" 64 | Description: KMS key for security hub findings in S3. 65 | EnableKeyRotation: false 66 | PendingWindowInDays: 7 67 | UpdateReplacePolicy: Delete 68 | DeletionPolicy: Delete 69 | s3kmskeyAlias2C7CE359: 70 | Type: AWS::KMS::Alias 71 | Properties: 72 | AliasName: alias/sh_export_key 73 | TargetKeyId: 74 | Fn::GetAtt: 75 | - s3kmskeyA0E45BCB 76 | - Arn 77 | securityhubexportbucket0BDF3430: 78 | Type: AWS::S3::Bucket 79 | Properties: 80 | BucketEncryption: 81 | ServerSideEncryptionConfiguration: 82 | - BucketKeyEnabled: true 83 | ServerSideEncryptionByDefault: 84 | KMSMasterKeyID: 85 | Fn::GetAtt: 86 | - s3kmskeyA0E45BCB 87 | - Arn 88 | SSEAlgorithm: aws:kms 89 | LifecycleConfiguration: 90 | Rules: 91 | - ExpirationInDays: 92 | Ref: ExpirationPeriod 93 | Status: Enabled 94 | Transitions: 95 | - StorageClass: GLACIER 96 | TransitionInDays: 97 | Ref: GlacierTransitionPeriod 98 | OwnershipControls: 99 | Rules: 100 | - ObjectOwnership: BucketOwnerEnforced 101 | PublicAccessBlockConfiguration: 102 | BlockPublicAcls: true 103 | BlockPublicPolicy: true 104 | IgnorePublicAcls: true 105 | RestrictPublicBuckets: true 106 | VersioningConfiguration: 107 | Status: Enabled 108 | UpdateReplacePolicy: Retain 109 | DeletionPolicy: Retain 110 | securityhubexportbucketPolicy5AE68C6C: 111 | Type: AWS::S3::BucketPolicy 112 | Properties: 113 | Bucket: 114 | Ref: securityhubexportbucket0BDF3430 115 | PolicyDocument: 116 | Version: "2012-10-17" 117 | Statement: 118 | - Action: s3:* 119 | Condition: 120 | Bool: 121 | aws:SecureTransport: "false" 122 | Effect: Deny 123 | Principal: 124 | AWS: "*" 125 | Resource: 126 | - Fn::GetAtt: 127 | - 
securityhubexportbucket0BDF3430 128 | - Arn 129 | - Fn::Join: 130 | - "" 131 | - - Fn::GetAtt: 132 | - securityhubexportbucket0BDF3430 133 | - Arn 134 | - /* 135 | - Action: 136 | - s3:GetObject* 137 | - s3:ListBucket 138 | - s3:PutObject* 139 | Effect: Allow 140 | Principal: 141 | AWS: 142 | - Ref: S3AccessIAMRole 143 | - Fn::GetAtt: 144 | - secubcsvmanagerrole6674D6A1 145 | - Arn 146 | Resource: 147 | - Fn::GetAtt: 148 | - securityhubexportbucket0BDF3430 149 | - Arn 150 | - Fn::Join: 151 | - "" 152 | - - Fn::GetAtt: 153 | - securityhubexportbucket0BDF3430 154 | - Arn 155 | - /* 156 | secubcsvmanagerrole6674D6A1: 157 | Type: AWS::IAM::Role 158 | Properties: 159 | AssumeRolePolicyDocument: 160 | Version: "2012-10-17" 161 | Statement: 162 | - Action: sts:AssumeRole 163 | Effect: Allow 164 | Principal: 165 | Service: 166 | - ec2.amazonaws.com 167 | - lambda.amazonaws.com 168 | - ssm.amazonaws.com 169 | ManagedPolicyArns: 170 | - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole 171 | RoleName: SecurityHub_CSV_Exporter 172 | secubcsvexporterfunction0B1F1A8E: 173 | Type: AWS::Lambda::Function 174 | Properties: 175 | Code: 176 | S3Bucket: 177 | Ref: S3BucketCode 178 | S3Key: 179 | Ref: S3KeyExporter 180 | Role: 181 | Fn::GetAtt: 182 | - secubcsvmanagerrole6674D6A1 183 | - Arn 184 | Description: Export SecurityHub findings to CSV in S3 bucket. 
185 | Environment: 186 | Variables: 187 | CSV_PRIMARY_REGION: 188 | Ref: PrimaryRegion 189 | FunctionName: SecHubExportStack_545171356966_sh_csv_exporter 190 | Handler: csvExporter.lambdaHandler 191 | MemorySize: 512 192 | ReservedConcurrentExecutions: 100 193 | Runtime: python3.9 194 | Timeout: 900 195 | DependsOn: 196 | - secubcsvmanagerrole6674D6A1 197 | secubcsvupdaterfunction199A043C: 198 | Type: AWS::Lambda::Function 199 | Properties: 200 | Code: 201 | S3Bucket: 202 | Ref: S3BucketCode 203 | S3Key: 204 | Ref: S3KeyUpdater 205 | Role: 206 | Fn::GetAtt: 207 | - secubcsvmanagerrole6674D6A1 208 | - Arn 209 | Description: Update SecurityHub findings from CSV in S3 bucket. 210 | Environment: 211 | Variables: 212 | CSV_PRIMARY_REGION: 213 | Ref: PrimaryRegion 214 | FunctionName: SecHubExportStack_545171356966_sh_csv_updater 215 | Handler: csvUpdater.lambdaHandler 216 | MemorySize: 512 217 | ReservedConcurrentExecutions: 100 218 | Runtime: python3.9 219 | Timeout: 900 220 | DependsOn: 221 | - secubcsvmanagerrole6674D6A1 222 | sechubcsvmanagedpolicy49025002: 223 | Type: AWS::IAM::ManagedPolicy 224 | Properties: 225 | PolicyDocument: 226 | Version: "2012-10-17" 227 | Statement: 228 | - Action: 229 | - iam:CreateServiceLinkedRole 230 | - iam:PassRole 231 | Effect: Allow 232 | Resource: 233 | Fn::Join: 234 | - "" 235 | - - "arn:" 236 | - Ref: AWS::Partition 237 | - ":iam::" 238 | - Ref: AWS::AccountId 239 | - :role/* 240 | Sid: IAMAllow 241 | - Action: 242 | - sts:AssumeRole 243 | - sts:GetCallerIdentity 244 | Effect: Allow 245 | Resource: "*" 246 | Sid: STSAllow 247 | - Action: 248 | - securityhub:BatchUpdateFindings 249 | - securityhub:GetFindings 250 | Effect: Allow 251 | Resource: "*" 252 | Sid: SecurityHubAllow 253 | - Action: 254 | - s3:GetObject 255 | - s3:PutObject 256 | Effect: Allow 257 | Resource: 258 | - Fn::GetAtt: 259 | - securityhubexportbucket0BDF3430 260 | - Arn 261 | - Fn::Join: 262 | - "" 263 | - - Fn::GetAtt: 264 | - securityhubexportbucket0BDF3430 265 
| - Arn 266 | - /* 267 | Sid: S3Allow 268 | - Action: 269 | - kms:Decrypt 270 | - kms:Describe* 271 | - kms:Encrypt 272 | - kms:GenerateDataKey 273 | Effect: Allow 274 | Resource: 275 | Fn::GetAtt: 276 | - s3kmskeyA0E45BCB 277 | - Arn 278 | Sid: KMSAllow 279 | - Action: lambda:InvokeFunction 280 | Effect: Allow 281 | Resource: 282 | - Fn::GetAtt: 283 | - secubcsvexporterfunction0B1F1A8E 284 | - Arn 285 | - Fn::GetAtt: 286 | - secubcsvupdaterfunction199A043C 287 | - Arn 288 | Sid: InvokeLambdaAllow 289 | - Action: 290 | - ssm:GetParameters 291 | - ssm:PutParameter 292 | Effect: Allow 293 | Resource: 294 | Fn::Join: 295 | - "" 296 | - - "arn:" 297 | - Ref: AWS::Partition 298 | - ":ssm:" 299 | - Ref: PrimaryRegion 300 | - ":" 301 | - Ref: AWS::AccountId 302 | - :parameter/csvManager/* 303 | Sid: SSMAllow 304 | Description: "" 305 | ManagedPolicyName: sechub_csv_manager 306 | Path: / 307 | Roles: 308 | - Ref: secubcsvmanagerrole6674D6A1 309 | Rule4C995B7F: 310 | Type: AWS::Events::Rule 311 | Properties: 312 | Description: Invoke Security Hub findings exporter periodically. 
313 | ScheduleExpression: 314 | Ref: Frequency 315 | State: DISABLED 316 | Targets: 317 | - Arn: 318 | Fn::GetAtt: 319 | - secubcsvexporterfunction0B1F1A8E 320 | - Arn 321 | Id: Target0 322 | InputTransformer: 323 | InputPathsMap: 324 | event: $.event 325 | InputTemplate: '{"event":<event>}' 326 | RuleAllowEventRuleSecHubExportStacksecubcsvexporterfunction16C4F2B1D42AAC42: 327 | Type: AWS::Lambda::Permission 328 | Properties: 329 | Action: lambda:InvokeFunction 330 | FunctionName: 331 | Fn::GetAtt: 332 | - secubcsvexporterfunction0B1F1A8E 333 | - Arn 334 | Principal: events.amazonaws.com 335 | SourceArn: 336 | Fn::GetAtt: 337 | - Rule4C995B7F 338 | - Arn 339 | createshexportdocument: 340 | Type: AWS::SSM::Document 341 | Properties: 342 | Content: 343 | schemaVersion: "0.3" 344 | assumeRole: 345 | Fn::GetAtt: 346 | - secubcsvmanagerrole6674D6A1 347 | - Arn 348 | description: Generate a Security Hub Findings Export (CSV Manager for Security Hub) outside of the normal export. 349 | parameters: 350 | Filters: 351 | type: String 352 | description: The canned filter "HighActive" or a JSON-formatted string for the GetFindings API filter. 353 | default: HighActive 354 | Partition: 355 | type: String 356 | description: The partition in which CSV Manager for Security Hub will operate. 357 | default: 358 | Ref: AWS::Partition 359 | Regions: 360 | type: String 361 | description: The comma-separated list of regions in which CSV Manager for Security Hub will operate. 362 | default: 363 | Ref: PrimaryRegion 364 | mainSteps: 365 | - action: aws:invokeLambdaFunction 366 | name: InvokeLambdaforSHFindingExport 367 | inputs: 368 | InvocationType: RequestResponse 369 | FunctionName: 370 | Ref: secubcsvexporterfunction0B1F1A8E 371 | Payload: '{ "filters" : "{{Filters}}" , "partition" : "{{Partition}}", "regions" : "[ {{Regions}} ]"}' 372 | description: Invoke the CSV Manager for Security Hub Lambda function. 
373 | outputs: 374 | - Name: resultCode 375 | Selector: $.Payload.resultCode 376 | Type: Integer 377 | - Name: bucket 378 | Selector: $.Payload.bucket 379 | Type: String 380 | - Name: exportKey 381 | Selector: $.Payload.exportKey 382 | Type: String 383 | isEnd: true 384 | DocumentType: Automation 385 | Name: start_sh_finding_export 386 | updateshexportdocument: 387 | Type: AWS::SSM::Document 388 | Properties: 389 | Content: 390 | schemaVersion: "0.3" 391 | assumeRole: 392 | Fn::GetAtt: 393 | - secubcsvmanagerrole6674D6A1 394 | - Arn 395 | description: Update Security Hub findings (CSV Manager for Security Hub) outside of the normal update cycle. 396 | parameters: 397 | Source: 398 | type: String 399 | description: An S3 URI containing the CSV file to update, e.g. s3://<bucket>/Findings/SecurityHub-20220415-115112.csv 400 | default: "" 401 | PrimaryRegion: 402 | type: String 403 | description: Region to pull the CSV file from. 404 | default: 405 | Ref: PrimaryRegion 406 | mainSteps: 407 | - action: aws:invokeLambdaFunction 408 | name: InvokeLambdaforSHFindingUpdate 409 | inputs: 410 | InvocationType: RequestResponse 411 | FunctionName: 412 | Ref: secubcsvupdaterfunction199A043C 413 | Payload: '{ "input" : "{{Source}}" , "primaryRegion" : "{{PrimaryRegion}}"}' 414 | description: Invoke the CSV Manager for Security Hub updater Lambda function. 415 | outputs: 416 | - Name: resultCode 417 | Selector: $.Payload.resultCode 418 | Type: Integer 419 | isEnd: true 420 | DocumentType: Automation 421 | Name: start_sechub_csv_update 422 | BucketNameParameterBB904042: 423 | Type: AWS::SSM::Parameter 424 | Properties: 425 | Type: String 426 | Value: 427 | Ref: securityhubexportbucket0BDF3430 428 | Description: The S3 bucket where Security Hub findings are exported. 
429 | Name: /csvManager/bucket 430 | KMSKeyParameter0B3310DD: 431 | Type: AWS::SSM::Parameter 432 | Properties: 433 | Type: String 434 | Value: 435 | Fn::GetAtt: 436 | - s3kmskeyA0E45BCB 437 | - Arn 438 | Description: The KMS key encrypting the S3 bucket objects. 439 | Name: /csvManager/key 440 | CodeFolderParameter82BE4E3A: 441 | Type: AWS::SSM::Parameter 442 | Properties: 443 | Type: String 444 | Value: 445 | Ref: S3BucketCode 446 | Description: The folder where CSV Manager for Security Hub code is stored. 447 | Name: /csvManager/folder/code 448 | FindingsFolderParameter017314BA: 449 | Type: AWS::SSM::Parameter 450 | Properties: 451 | Type: String 452 | Value: 453 | Ref: FindingsFolder 454 | Description: The folder where CSV Manager for Security Hub findings are exported. 455 | Name: /csvManager/folder/findings 456 | ArchiveKeyParameter8BD59A44: 457 | Type: AWS::SSM::Parameter 458 | Properties: 459 | Type: String 460 | Value: Not Initialized 461 | Description: The name of the ZIP archive containing CSV Manager for Security Hub Lambda code. 462 | Name: /csvManager/object/codeArchive 463 | PartitionParameter82B291B4: 464 | Type: AWS::SSM::Parameter 465 | Properties: 466 | Type: String 467 | Value: 468 | Ref: Partition 469 | Description: The partition in which CSV Manager for Security Hub will operate. 470 | Name: /csvManager/partition 471 | RegionParameterA71AF3D0: 472 | Type: AWS::SSM::Parameter 473 | Properties: 474 | Type: String 475 | Value: 476 | Ref: Regions 477 | Description: The list of regions in which CSV Manager for Security Hub will operate. 
478 | Name: /csvManager/regionList -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/bin/cdk_solution.ts: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env node 2 | import 'source-map-support/register'; 3 | import * as cdk from 'aws-cdk-lib'; 4 | import { SecHubExportStack } from '../lib/sechub_finding_export'; 5 | 6 | const app = new cdk.App(); 7 | 8 | new SecHubExportStack(app, 'SecHubExportStack', { 9 | env: { 10 | account: process.env.CDK_DEFAULT_ACCOUNT, 11 | region: process.env.CDK_DEFAULT_REGION 12 | } 13 | }); -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/cdk.json: -------------------------------------------------------------------------------- 1 | { 2 | "app": "npx ts-node --prefer-ts-exts bin/cdk_solution.ts", 3 | "watch": { 4 | "include": [ 5 | "**" 6 | ], 7 | "exclude": [ 8 | "README.md", 9 | "cdk*.json", 10 | "**/*.d.ts", 11 | "**/*.js", 12 | "tsconfig.json", 13 | "package*.json", 14 | "yarn.lock", 15 | "node_modules", 16 | "test" 17 | ] 18 | }, 19 | "context": { 20 | "@aws-cdk/aws-apigateway:usagePlanKeyOrderInsensitiveId": true, 21 | "@aws-cdk/core:stackRelativeExports": true, 22 | "@aws-cdk/aws-rds:lowercaseDbIdentifier": true, 23 | "@aws-cdk/aws-lambda:recognizeVersionProps": true, 24 | "@aws-cdk/aws-cloudfront:defaultSecurityPolicyTLSv1.2_2021": true, 25 | "@aws-cdk-containers/ecs-service-extensions:enableDefaultLogDriver": true, 26 | "@aws-cdk/aws-ec2:uniqueImdsv2TemplateName": true, 27 | "@aws-cdk/aws-iam:minimizePolicies": true, 28 | "@aws-cdk/core:target-partitions": [ 29 | "aws", 30 | "aws-cn" 31 | ] 32 | } 33 | } 34 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "principalsJson": { 3 | "principals": [ 4 | 
"arn:aws:iam::123456789012:role/sechub_admin" 5 | ] 6 | } 7 | } -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/lambdas/csvExporter.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/bin/python3 2 | """ 3 | Convert SecurityHub findings to CSV and store in an S3 bucket 4 | 5 | This program can be invoked as an AWS Lambda function or from the command line. 6 | If invoked from the command line, an assumable role is required. If invoked 7 | from Lambda, no parameters are required. 8 | 9 | python3 csvExporter.py 10 | --role-arn=[assumableRoleArn] 11 | --regions=[commaSeparatedRegionList] 12 | --bucket=[s3BucketName] 13 | --filters=[cannedFilterName|jsonObject] 14 | """ 15 | 16 | import json 17 | import argparse 18 | import csv 19 | import sys 20 | import os 21 | import csvObjects as csvo 22 | import logging 23 | import traceback 24 | import re 25 | 26 | # Default regions in list and string form 27 | _DEFAULT_REGION_STRING = "" 28 | _DEFAULT_REGION_LIST = [] #_DEFAULT_REGION_STRING.split(",") 29 | 30 | # Retrieves the name of the current function (for logging purposes) 31 | this = lambda frame=0 : sys._getframe(frame+1).f_code.co_name 32 | 33 | _DEFAULT_LOGGING_LEVEL = logging.INFO 34 | """ Default logging level """ 35 | 36 | # Set up logging 37 | logging.basicConfig(level=_DEFAULT_LOGGING_LEVEL) 38 | 39 | # Retrieve the logging instance 40 | _LOGGER = logging.getLogger() 41 | _LOGGER.setLevel(_DEFAULT_LOGGING_LEVEL) 42 | """ Initialized logging RootLogger instance """ 43 | 44 | ################################################################################ 45 | #### 46 | ################################################################################ 47 | def choose (default=None, *choices): 48 | """ 49 | Choose between an option and an environment variable (the option always 50 | has priority, if specified) 51 | """ 52 | answer = default 53 | 54 | for choice in 
choices: 55 | _LOGGER.debug(f'csvExport.493010i choice {choice}') 56 | 57 | if choice: 58 | answer = choice 59 | break 60 | 61 | return answer 62 | ################################################################################ 63 | #### 64 | ################################################################################ 65 | def getFilters ( candidate = None ): 66 | """ 67 | Process filters, which are specified as a JSON object or as the canned 68 | string "HighActive". If the filter can't be parsed, a message is issued 69 | but a null filter is returned. 70 | """ 71 | if not candidate: 72 | filters = {} 73 | elif candidate != "HighActive": 74 | try: 75 | if type(candidate) is dict: 76 | filters = candidate 77 | else: 78 | filters = json.loads(candidate) 79 | except Exception as thrown: 80 | _LOGGER.error(f'493020e filter parsing failed: {thrown}') 81 | filters = {} 82 | else: 83 | _LOGGER.info("493030i canned HighActive filter selects active high- " + \ 84 | "and critical-severity findings") 85 | filters = { 86 | "SeverityLabel": 87 | [ 88 | {"Value": "CRITICAL", "Comparison": "EQUALS" }, 89 | {"Value": "HIGH", "Comparison": "EQUALS"} 90 | ], 91 | "RecordState": 92 | [ 93 | { "Comparison": "EQUALS", "Value": "ACTIVE"} 94 | ] 95 | } 96 | 97 | return filters 98 | 99 | ################################################################################ 100 | #### Invocation-independent process handler 101 | ################################################################################ 102 | def executor (role=None, region=None, filters=None, bucket=None, limit=0, 103 | retain=False): 104 | """ 105 | Carry out the actions necessary to download and export SecurityHub findings, 106 | whether invoked as a Lambda or from the command line. 
107 | """ 108 | # Get the SSM parameters and a client for further SSM operations 109 | ssmActor = csvo.SsmActor(role=role, region=region) 110 | 111 | # Get a list of Security Hub regions we wish to act on 112 | regions = choose( 113 | os.environ.get("CSV_SECURITYHUB_REGIONLIST"), 114 | re.compile("\s*,\s*").split(getattr(ssmActor, "/csvManager/regionList", region)), 115 | ssmActor.getSupportedRegions(service="securityhub") 116 | ) 117 | 118 | _LOGGER.info("493040i selected SecurityHub regions %s" % regions) 119 | 120 | # Get information about the bucket 121 | folder = getattr(ssmActor, "/csvManager/folder/findings", None) 122 | bucket = bucket if bucket else getattr(ssmActor, "/csvManager/bucket", None) 123 | 124 | _LOGGER.debug(f'493050d writing to s3://{bucket}/{folder}/*') 125 | 126 | # A client to act on the bucket 127 | s3Actor = csvo.S3Actor( 128 | bucket=bucket, 129 | folder=folder, 130 | region=region, 131 | role=role 132 | ) 133 | 134 | # Filename where file can be stored locally 135 | localFile = s3Actor.filePath() 136 | 137 | # Now obtain a client for SecurityHub regions 138 | hubActor = csvo.HubActor( 139 | role=role, 140 | region=regions 141 | ) 142 | 143 | # Obtain the findings for all applicable regions 144 | hubActor.downloadFindings(filters=filters,limit=limit) 145 | 146 | if hubActor.count <= 0: 147 | _LOGGER.warning("493060w no findings downloaded") 148 | else: 149 | _LOGGER.info(f'493070i preparing to write {hubActor.count} findings') 150 | 151 | first = True 152 | 153 | with open(localFile, 'w') as target: 154 | for finding in hubActor.getFinding(): 155 | findingObject = csvo.Finding(finding, actor=hubActor) 156 | 157 | # Start the CSV file with a header 158 | if first: 159 | _LOGGER.debug("493080d finding object %s keys %s" \ 160 | % (findingObject, findingObject.columns)) 161 | 162 | writer = csv.DictWriter(target, 163 | fieldnames=findingObject.columns) 164 | 165 | writer.writeheader() 166 | 167 | # Write the finding 168 | 
writer.writerow(findingObject.rowMap) 169 | 170 | first = False 171 | 172 | # Announce completion of write 173 | _LOGGER.info("493090i findings written to %s" % localFile) 174 | 175 | # Place the object in the S3 bucket 176 | s3Actor.put() 177 | 178 | _LOGGER.info('493100i uploaded to ' + 179 | f's3://{s3Actor.bucket}/{s3Actor.objectKey}') 180 | 181 | # Determine whether to retain the local file or not 182 | if retain: 183 | _LOGGER.warning("493110w local file %s retained" % localFile) 184 | else: 185 | os.unlink(localFile) 186 | 187 | _LOGGER.info("493120i local file deleted") 188 | 189 | # Return details to caller 190 | answer = { 191 | "success" : True , 192 | "message" : "Export succeeded" , 193 | "bucket" : s3Actor.bucket , 194 | "exportKey" : s3Actor.objectKey 195 | } 196 | 197 | return answer 198 | ################################################################################ 199 | #### Lambda handler 200 | ################################################################################ 201 | def lambdaHandler ( event = None, context = None ): 202 | """ 203 | Perform the operations necessary if CsvExporter is invoked as a Lambda 204 | function. 
205 | """ 206 | # The event keys we care about are processed below 207 | role = event.get("role") 208 | region = event.get("region") 209 | if 'filters' in event: 210 | filters=getFilters(event.get("filters", {})) 211 | else: 212 | filters=event 213 | bucket = event.get("bucket") 214 | retain = event.get("retainLocal", False) 215 | limit = event.get("limit", 0) 216 | eventData = event.get("event") 217 | 218 | # If no region is specified it must be obtained from the environment 219 | if not region: 220 | region = os.environ.get("CSV_PRIMARY_REGION") 221 | _LOGGER.info(f"493130i obtained region {region} from environment") 222 | 223 | # This is where we will store the result 224 | answer = {} 225 | 226 | # Determine if Lambda was invoked manually or via an event 227 | if eventData: 228 | eventType = eventData.get("detail-type", "UNKNOWN") 229 | _LOGGER.info("493140i Lambda invoked by %s" % eventType) 230 | else: 231 | _LOGGER.info("493150i Lambda invoked extemporaneously") 232 | 233 | # Perform the real work 234 | try: 235 | result = executor( 236 | role=role, 237 | region=region, 238 | filters=filters, 239 | bucket=bucket, 240 | retain=retain, 241 | limit=limit 242 | ) 243 | 244 | answer = { 245 | "message": result.get("message"), 246 | "bucket": result.get("bucket"), 247 | "exportKey": result.get("exportKey"), 248 | "resultCode": 200 if result.get("success") else 400 249 | } 250 | 251 | # Catch any errors 252 | except Exception as thrown: 253 | errorType = type(thrown).__name__ 254 | errorTrace = traceback.format_tb(thrown.__traceback__, limit=5) 255 | 256 | _LOGGER.error("493160e Lambda failed (%s): %s\n%s" \ 257 | % (errorType, thrown, errorTrace)) 258 | 259 | # str(thrown) keeps the answer JSON-serializable 260 | answer = { 261 | "message" : str(thrown) , 262 | "traceback" : errorTrace , 263 | "bucket" : None , 264 | "exportKey" : None , 265 | "resultCode" : 500 266 | } 267 | 268 | return answer 269 | 
################################################################################ 270 | #### Main body is invoked if this is a command invocation 271 | ################################################################################ 272 | if __name__ == "__main__": 273 | """ 274 | Need to make regions etc. configurable 275 | """ 276 | try: 277 | parser = argparse.ArgumentParser() 278 | parser.add_argument("--role-arn", required=False, dest="roleArn", 279 | help="The assumable role ARN to access SecurityHub") 280 | parser.add_argument("--filters", default='{}', required=False, 281 | help="Filters to apply to findings") 282 | parser.add_argument("--bucket", required=False, 283 | help="S3 bucket to store findings") 284 | parser.add_argument("--limit", required=False, type=int, default=0, 285 | help="Limit number of findings retrieved") 286 | parser.add_argument("--retain-local", action="store_true", 287 | dest="retainLocal", default=False, help="Retain local file") 288 | parser.add_argument("--primary-region", dest="region", required=True, 289 | help="Primary region for operations") 290 | 291 | arguments = parser.parse_args() 292 | 293 | executor( 294 | role=arguments.roleArn, 295 | filters=getFilters(arguments.filters), 296 | bucket=arguments.bucket, 297 | limit=arguments.limit, 298 | retain=arguments.retainLocal, 299 | region=arguments.region 300 | ) 301 | 302 | except Exception as thrown: 303 | _LOGGER.exception("493170t unexpected command invocation error %s" \ 304 | % str(thrown)) 305 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/lambdas/csvExporter.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aws-security-hub-csv-manager/066f9dabce99d510bed90c4e26f00a0f2dd73d32/csv_manager_sechub_cdk/lambdas/csvExporter.zip -------------------------------------------------------------------------------- 
/csv_manager_sechub_cdk/lambdas/csvObjects.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/bin/python3 2 | # Objects used in CsvManager 3 | # Update 20200827 4 | # Update 20210225 - Make it work with GovCloud 5 | 6 | import boto3 7 | import botocore 8 | import time 9 | import re 10 | import logging 11 | from botocore import exceptions 12 | from botocore.exceptions import ClientError 13 | from boto3.session import Session 14 | 15 | # The current supported version 16 | _CURRENT_VERSION = "2021-01-01" 17 | 18 | _DEFAULT_LOGGING_LEVEL = logging.INFO 19 | """ Default logging level """ 20 | 21 | # Set up logging 22 | logging.basicConfig(level=_DEFAULT_LOGGING_LEVEL) 23 | 24 | # Retrieve the logging instance 25 | _LOGGER = logging.getLogger() 26 | _LOGGER.setLevel(_DEFAULT_LOGGING_LEVEL) 27 | 28 | # Simplify the extraction of error detail from botocore.exceptions.ClientError 29 | errorCode = lambda exception: exception.response \ 30 | .get("Error", {}) \ 31 | .get("Code", "INVALID") if getattr(exception, "response", None) \ 32 | else type(exception).__name__ 33 | 34 | ################################################################################ 35 | # 36 | ################################################################################ 37 | class FindingValueError (Exception): 38 | """ 39 | This exception is thrown if some value in a finding is out of bounds, or 40 | if there are other problems in the import or update of a finding. 41 | """ 42 | pass 43 | ################################################################################ 44 | # 45 | ################################################################################ 46 | class FindingColumn: 47 | """ 48 | Map a SecurityHub finding dictionary to a set of CSV columns. This object 49 | represents a single column. 50 | 51 | If you will be using more than one list of FindingColumns, you must start 52 | each successive list with a reset=True parameter.
53 | 54 | Parameters 55 | ---------- 56 | columnName : str 57 | The name assigned to the CSV column 58 | keys : list of str 59 | A list of keys used to access the value for this column in a 60 | securityhub:get_findings API response. 61 | isKey : boolean 62 | True if this column will act as a key to uniquely identify a finding 63 | isUpdatable : boolean 64 | True if this column can be updated using securityhub:batch_update_findings 65 | d2l : callable, None, or other 66 | A transformation between the API dictionary and the CSV record value 67 | d2lParameters : dict 68 | A ** dictionary to be passed to the d2l transform if it is callable 69 | l2d : callable, None, or other 70 | A transformation between the CSV record and API dictionary value 71 | l2dParameters : dict 72 | A ** dictionary to be passed to the l2d transform if it is callable 73 | 74 | Notes 75 | ----- 76 | The d2l and l2d transforms should be combined, since in all cases so far 77 | identified, the transform is the same in both directions. 78 | """ 79 | #--------------------------------------------------------------------------- 80 | def __init__ (self, columnNumber=0, columnName=None, keys=None, isKey=False, 81 | isUpdatable=False, d2l=None, d2lParameters={}, l2d=None, l2dParameters={}): 82 | """ 83 | See class definition for details. 84 | """ 85 | self.columnNumber = columnNumber 86 | self.columnName = columnName 87 | self.keys = keys 88 | self.isKey = isKey 89 | self.isUpdatable = isUpdatable 90 | self.d2lParameters = d2lParameters 91 | self.d2l = d2l 92 | self.l2dParameters = l2dParameters 93 | self.l2d = l2d 94 | self.transform = None 95 | self.parameters = {} 96 | self._value = None 97 | #--------------------------------------------------------------------------- 98 | @property 99 | def rawValue (self): 100 | """ 101 | The raw value that came from the API finding dictionary or the CSV 102 | column, depending on how this finding was initialized.
103 | """ 104 | return self._value 105 | #--------------------------------------------------------------------------- 106 | @property 107 | def value (self): 108 | """ 109 | Return the value transformed according to its source (the API finding 110 | or the CSV column) 111 | """ 112 | return self._value 113 | #--------------------------------------------------------------------------- 114 | @value.setter 115 | def value (self, initializer=None): 116 | """ 117 | The setter is invoked against a dictionary of values, or a list of values 118 | as in: 119 | 120 | self.value = { "key": "value", "key2": "value2" } 121 | self.value = [ "value", "value2" ] 122 | 123 | If a list is specified, values are subject to l2d transformations; if a 124 | dict is specified, values are subject to d2l transformations. 125 | 126 | Object attributes are set to facilitate the processing of the value 127 | when the value property is used. 128 | 129 | 1. If the transform is missing, the original value is used 130 | 2. If the transform is callable, the transform return value is used 131 | 3. 
Else, the transform value is used 132 | """ 133 | _LOGGER.debug("496010d FindingColumn.value %s" % initializer) 134 | 135 | # An API result dict is supplied 136 | if isinstance(initializer, dict): 137 | candidate = self.deep(initializer) 138 | 139 | _LOGGER.debug("496020d processing an API result %s = '%s'" % \ 140 | (self.columnName, candidate)) 141 | 142 | self.transform = self.d2l 143 | self.parameters = self.d2lParameters 144 | # A CSV list is supplied 145 | elif isinstance(initializer, list): 146 | try: 147 | candidate = initializer[self.columnNumber] 148 | except (ValueError, IndexError): 149 | candidate = None 150 | 151 | _LOGGER.debug("496030d processing a CSV column %d (%s) = '%s'" % \ 152 | (self.columnNumber, self.columnName, candidate)) 153 | 154 | self.transform = self.l2d 155 | self.parameters = self.l2dParameters 156 | # Neither is an error 157 | else: 158 | raise FindingValueError("496040t must be passed a list or dict") 159 | 160 | # Case 1 - No transform 161 | if not self.transform: 162 | answer = candidate 163 | 164 | # Case 2 - Callable transform 165 | elif callable(self.transform): 166 | answer = self.transform(candidate, **self.parameters) 167 | 168 | # Case 3 - Fixed transform 169 | else: 170 | answer = self.transform 171 | 172 | _LOGGER.debug("496050d %s source is %s candidate is [%s]" \ 173 | % (self.columnName, type(initializer).__name__, candidate)) 174 | 175 | self._value = answer if answer != '' else None 176 | 177 | _LOGGER.debug("496060d %s transformed is [%s]" \ 178 | % (self.columnName, answer)) 179 | #--------------------------------------------------------------------------- 180 | def key (self): 181 | """ 182 | If this column is a key column, return its value. This method will 183 | return "none" if the self.value setter is not called first! 
184 | """ 185 | return None if not self.isKey else self.value 186 | #--------------------------------------------------------------------------- 187 | def update (self): 188 | """ 189 | If this column is an updatable column, return its value. This method 190 | will return "None" if self.value setter is not called first! 191 | """ 192 | return None if not self.isUpdatable else self.value 193 | #--------------------------------------------------------------------------- 194 | def deep (self, dictionary = {}): 195 | """ 196 | Retrieve a nested item from a dict. For example, given { "A" : { "B" : 1 } } 197 | and a self.keys attribute of ["A", "B"], this will return 1. 198 | """ 199 | for key in self.keys: 200 | dictionary = dictionary.get(key, None) 201 | 202 | if not dictionary: 203 | break 204 | 205 | return dictionary 206 | ################################################################################ 207 | # 208 | ################################################################################ 209 | class FindingColumnMap: 210 | """ 211 | Maps all API values to CSV columns, essentially flattening the API dict 212 | into a list that can be appended to a CSV file. This class automatically 213 | numbers the columns.
214 | """ 215 | #--------------------------------------------------------------------------- 216 | def __init__ (self, map=[]): 217 | """ 218 | See the class definition for details 219 | """ 220 | self.columns = 0 221 | self.itemList = [] 222 | self.itemMap = {} 223 | 224 | for item in map: 225 | if not isinstance(item, FindingColumn): 226 | raise FindingValueError("496070t all FindingColumnMap items " + \ 227 | "must be FindingColumn objects") 228 | 229 | # Set the column number automatically 230 | item.columnNumber = self.columns 231 | 232 | self.itemList.append(item) 233 | self.itemMap[item.columnName] = item 234 | self.columns += 1 235 | 236 | _LOGGER.debug("496080d mapped column number %d to column name %s" 237 | % (item.columnNumber, item.columnName)) 238 | #--------------------------------------------------------------------------- 239 | def __getitem__ (self, item=None): 240 | """ 241 | Return an item if indexed by a column number object[number] or a dict 242 | key object[key]. 243 | """ 244 | if isinstance(item, int) and (item >= 0) and (item < len(self.itemList)): 245 | answer = self.itemList[item] 246 | elif item != None: 247 | answer = self.itemMap[item] 248 | else: 249 | answer = None 250 | 251 | return answer 252 | #--------------------------------------------------------------------------- 253 | def __len__ (self): 254 | """ 255 | Return the length of the column map. 256 | """ 257 | return len(self.itemList) 258 | ################################################################################ 259 | # 260 | ################################################################################ 261 | class Finding: 262 | """ 263 | Represents an AWS SecurityHub finding. 264 | 265 | Parameters 266 | ---------- 267 | initializer : dict or list 268 | * The dictionary for a SecurityHub finding from the 269 | securityhub:get_findings API call.
270 | 271 | * A list containing the values of a CSV record with finding 272 | fields in specific columns 273 | 274 | Attributes 275 | ---------- 276 | mapping : list of FindingColumn 277 | A list of FindingColumn objects mapping the securityhub:get_findings 278 | dictionary to a CSV record with particular column names 279 | rowList : list 280 | A list of values representing a CSV row 281 | rowMap : dict 282 | A dict keyed by the column name, with each value being the value for 283 | that column. 284 | finding : dict 285 | The nested dict returned by the securityhub:get_findings API, or built 286 | from a CSV initializer (see below) 287 | source : type 288 | Set to dict if initialized by a finding dict, and list if it is initialized 289 | from a CSV row list 290 | """ 291 | #--------------------------------------------------------------------------- 292 | def __init__ (self, initializer = None, actor=None): 293 | """ 294 | See class description for details 295 | """ 296 | self.actor = actor 297 | self.mapping = self.fullMap() 298 | self.findingColumn = {} 299 | 300 | # These values must exist in the fullMap 301 | self.Id = None 302 | self.ProductArn = None 303 | 304 | if isinstance(initializer, dict): 305 | self.rowMap = self.mapColumns(initializer) 306 | self.rowList = [ value for key, value in self.rowMap.items() ] 307 | self.finding = initializer 308 | self.source = dict 309 | elif isinstance(initializer, (list, tuple)): 310 | self.rowList = list(initializer) 311 | self.rowMap = dict(zip(self.columns, self.rowList)) 312 | self.finding = self.mapFinding(initializer) 313 | self.source = list 314 | else: 315 | raise FindingValueError("496090s initializer must be dict or list") 316 | #--------------------------------------------------------------------------- 317 | @property 318 | def keys (self): 319 | """ 320 | Return a dict of key values for this finding. Key values are attributes 321 | with the isKey attribute that uniquely identify the finding.
322 | """ 323 | answer = {} 324 | 325 | for map in self.mapping: 326 | if map.isKey: 327 | answer[map.columnName] = getattr(self, map.columnName) 328 | 329 | return answer 330 | #--------------------------------------------------------------------------- 331 | @property 332 | def columns (self): 333 | """ 334 | Return a list of CSV column names. 335 | """ 336 | return [ map.columnName for map in self.mapping ] 337 | #--------------------------------------------------------------------------- 338 | def mapColumns (self, initializer = None): 339 | """ 340 | Convert a SecurityHub findings dictionary to a CSV row 341 | """ 342 | row = {} 343 | 344 | for descriptor in self.mapping: 345 | # There is magic here -- see the FindingColumn class value setter 346 | descriptor.value = initializer 347 | 348 | value = descriptor.value 349 | name = descriptor.columnName 350 | 351 | row[name] = value 352 | 353 | setattr(self, name, value) 354 | self.findingColumn[name] = descriptor 355 | 356 | _LOGGER.debug("496100d mapped %s to column %s with value [%s]" \ 357 | % (descriptor.keys, name, value)) 358 | 359 | return row 360 | #--------------------------------------------------------------------------- 361 | def mapFinding (self, initializer=None): 362 | """ 363 | Convert a CSV Manager record to a SecurityHub finding dictionary 364 | """ 365 | finding = {} 366 | 367 | for value, descriptor in zip(initializer, self.mapping): 368 | # There is magic here -- see the FindingColumn class value setter 369 | descriptor.value = initializer 370 | 371 | value = descriptor.value 372 | name = descriptor.columnName 373 | 374 | setattr(self, name, value) 375 | self.findingColumn[name] = descriptor 376 | 377 | # Now we build up the finding 378 | _LOGGER.debug("496110d %s deep set %s = %s" % \ 379 | (name, descriptor.keys, value)) 380 | 381 | Finding._deepSet(finding, descriptor.keys, value) 382 | 383 | return finding 384 | #---------------------------------------------------------------------------
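The deep-get used by `mapColumns` (via `FindingColumn.deep`) and the deep-set used by `mapFinding` (via `Finding._deepSet`) are mirror operations over key paths like `["Note", "Text"]`. A standalone sketch of that round trip (the helper names here are illustrative, not part of the module):

```python
def deep_get(dictionary, keys):
    """Walk a nested dict by a key list, as FindingColumn.deep does (sketch)."""
    for key in keys:
        # Stop as soon as a level is missing or not a dict
        dictionary = dictionary.get(key) if isinstance(dictionary, dict) else None
        if dictionary is None:
            break
    return dictionary

def deep_set(dictionary, keys, value):
    """Set a value in a nested dict by a key list, as Finding._deepSet does (sketch)."""
    if not keys:
        return dictionary
    key = keys[0]
    if len(keys) == 1:
        dictionary[key] = value
    else:
        # Create (or replace) intermediate levels that are missing or not dicts
        if not isinstance(dictionary.get(key), dict):
            dictionary[key] = {}
        deep_set(dictionary[key], keys[1:], value)
    return dictionary

# Round trip: a CSV "NoteText" column maps to the nested Note.Text API field
finding = deep_set({}, ["Note", "Text"], "reviewed")
```

This is why each `FindingColumn` only needs a `keys` list: the same path serves both directions of the CSV-to-API mapping.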
385 | def getFindingColumn (self, name=None): 386 | """ 387 | Return the FindingColumn object associated with a named column. 388 | """ 389 | return self.findingColumn.get(name) 390 | #--------------------------------------------------------------------------- 391 | @staticmethod 392 | def _deepSet (dictionary={}, keys=[], value=None, skipNone=False): 393 | """ 394 | Given a nested dictionary and a list of keys, set the given value in 395 | the nested dictionary based on the key list. I.e., 396 | _deepSet({}, ["a","b"], 1) results in a dictionary {"a": {"b": 1 }} 397 | """ 398 | 399 | if not keys: 400 | _LOGGER.debug("496120d deepset called with no keys") 401 | elif not skipNone or (value != None): 402 | key = keys[0] 403 | 404 | if len(keys) <= 1: 405 | dictionary[key] = value 406 | else: 407 | keys = keys[1:] 408 | 409 | if not(key in dictionary): 410 | dictionary[key] = {} 411 | elif not isinstance(dictionary[key], dict): 412 | dictionary[key] = {} 413 | 414 | Finding._deepSet(dictionary[key], keys, value, skipNone) 415 | 416 | return dictionary 417 | #--------------------------------------------------------------------------- 418 | def fullMap (self): 419 | """ 420 | This list maps CSV column names to sequences of nested keys in the 421 | Security Hub findings dictionary. See the FindingColumn object for 422 | details.
423 | """ 424 | map = FindingColumnMap([ 425 | FindingColumn( 426 | columnName="Id", 427 | keys=["Id"], 428 | isKey=True 429 | ), 430 | FindingColumn( 431 | columnName="ProductArn", 432 | keys=["ProductArn"], 433 | isKey=True 434 | ), 435 | #### BEGIN Updatable fields using securityhub:batch_update_findings() 436 | FindingColumn( 437 | columnName="Criticality", 438 | keys=["Criticality"], 439 | isUpdatable=True, 440 | d2l=FindingActions.forceInteger, 441 | l2d=FindingActions.forceInteger 442 | ), 443 | FindingColumn( 444 | columnName="Confidence", 445 | keys=["Confidence"], 446 | isUpdatable=True, 447 | d2l=FindingActions.forceInteger, 448 | l2d=FindingActions.forceInteger, 449 | ), 450 | FindingColumn( 451 | columnName="NoteText", 452 | keys=["Note", "Text"], 453 | isUpdatable=True 454 | ), 455 | FindingColumn( 456 | columnName="NoteUpdatedBy", 457 | keys=["Note", "UpdatedBy"], 458 | isUpdatable=True, 459 | d2l=FindingActions.noteUpdater, 460 | d2lParameters={"actor" : self.actor , "finding": self }, 461 | l2d=FindingActions.noteUpdater, 462 | l2dParameters={"actor" : self.actor , "finding": self } 463 | ), 464 | FindingColumn( 465 | columnName="CustomerOwner", 466 | keys=["UserDefinedFields", "Owner"], 467 | isUpdatable=True 468 | ), 469 | FindingColumn( 470 | columnName="CustomerIssue", 471 | keys=["UserDefinedFields", "Issue"], 472 | isUpdatable=True 473 | ), 474 | FindingColumn( 475 | columnName="CustomerTicket", 476 | keys=["UserDefinedFields", "Ticket"], 477 | isUpdatable=True 478 | ), 479 | FindingColumn( 480 | columnName="ProductSeverity", 481 | keys=["Severity", "Product"], 482 | isUpdatable=True, 483 | d2l=FindingActions.checkSeverity, 484 | l2d=FindingActions.checkSeverity, 485 | ), 486 | FindingColumn( 487 | columnName="NormalizedSeverity", 488 | keys=["Severity", "Normalized"], 489 | isUpdatable=True, 490 | d2l=FindingActions.checkSeverity, 491 | l2d=FindingActions.checkSeverity 492 | ), 493 | FindingColumn( 494 | columnName="SeverityLabel", 495 | 
keys=["Severity", "Label"], 496 | isUpdatable=True, 497 | d2l=FindingActions.checkSeverityLabel, 498 | l2d=FindingActions.checkSeverityLabel 499 | ), 500 | FindingColumn( 501 | columnName="VerificationState", 502 | keys=["VerificationState"], 503 | isUpdatable=True, 504 | d2l=FindingActions.checkVerificationState, 505 | l2d=FindingActions.checkVerificationState 506 | ), 507 | FindingColumn( 508 | columnName="Workflow", 509 | keys=["Workflow", "Status"], 510 | isUpdatable=True, 511 | d2l=FindingActions.checkWorkflow, 512 | l2d=FindingActions.checkWorkflow 513 | ), 514 | #### END Updatable fields using securityhub:batch_update_findings() 515 | FindingColumn( 516 | columnName="UpdateVersion", 517 | keys=[], 518 | d2l=str(_CURRENT_VERSION), 519 | l2d=str(_CURRENT_VERSION) 520 | ), 521 | FindingColumn( 522 | columnName="GeneratorId", 523 | keys=["GeneratorId"] 524 | ), 525 | FindingColumn( 526 | columnName="AwsAccountId", 527 | keys=["AwsAccountId"] 528 | ), 529 | FindingColumn( 530 | columnName="Types", 531 | keys=["Types"], 532 | d2l=FindingActions.delist 533 | ) , 534 | FindingColumn( 535 | columnName="FirstObservedAt", 536 | keys=["FirstObservedAt"] 537 | ), 538 | FindingColumn( 539 | columnName="LastObservedAt", 540 | keys=["LastObservedAt"] 541 | ), 542 | FindingColumn( 543 | columnName="CreatedAt", 544 | keys=["CreatedAt"] 545 | ), 546 | FindingColumn( 547 | columnName="UpdatedAt", 548 | keys=["UpdatedAt"] 549 | ), 550 | FindingColumn( 551 | columnName="Title", 552 | keys=["Title"] 553 | ), 554 | FindingColumn( 555 | columnName="Description", 556 | keys=["Description"] 557 | ), 558 | FindingColumn( 559 | columnName="StandardsArn", 560 | keys=["ProductFields", "StandardsArn"] 561 | ), 562 | FindingColumn( 563 | columnName="StandardsSubscriptionArn", 564 | keys=["ProductFields", "StandardsSubscriptionArn"] 565 | ), 566 | FindingColumn( 567 | columnName="ControlId", 568 | keys=["ProductFields", "ControlId"] 569 | ), 570 | FindingColumn( 571 | 
columnName="RecommendationUrl", 572 | keys=["ProductFields", "RecommendationUrl"] 573 | ), 574 | FindingColumn( 575 | columnName="StandardsControlArn", 576 | keys=["ProductFields", "StandardsControlArn"] 577 | ), 578 | FindingColumn( 579 | columnName="ProductName", 580 | keys=["ProductFields", "aws/securityhub/ProductName"] 581 | ), 582 | FindingColumn( 583 | columnName="CompanyName", 584 | keys=["ProductFields", "aws/securityhub/CompanyName"] 585 | ), 586 | FindingColumn( 587 | columnName="Annotation", 588 | keys=["ProductFields", "aws/securityhub/annotation"] 589 | ), 590 | FindingColumn( 591 | columnName="FindingId", 592 | keys=["ProductFields", "aws/securityhub/FindingId"] 593 | ), 594 | FindingColumn( 595 | columnName="Resources", 596 | keys=["Resources"], 597 | d2l=FindingActions.resources 598 | ), 599 | FindingColumn( 600 | columnName="ComplianceStatus", 601 | keys=["Compliance", "Status"] 602 | ), 603 | FindingColumn( 604 | columnName="WorkflowState", 605 | keys=["WorkflowState"] 606 | ), 607 | FindingColumn( 608 | columnName="RecordState", 609 | keys=["RecordState"] 610 | ) 611 | ]) 612 | 613 | return map 614 | ################################################################################ 615 | # 616 | ################################################################################ 617 | class FindingActions: 618 | """ 619 | A class containing static methods used to pre- and post-process values 620 | used in SecurityHub finding dictionaries and CSV records 621 | """ 622 | _SEVERITY_LABELS = [ 623 | "INFORMATIONAL", 624 | "LOW", 625 | "MEDIUM", 626 | "HIGH", 627 | "CRITICAL" 628 | ] 629 | 630 | _VERIFICATION_STATES = [ 631 | "UNKNOWN", 632 | "TRUE_POSITIVE", 633 | "FALSE_POSITIVE", 634 | "BENIGN_POSITIVE" 635 | ] 636 | 637 | _WORKFLOWS = [ 638 | "NEW", 639 | "NOTIFIED", 640 | "RESOLVED", 641 | "SUPPRESSED" 642 | ] 643 | #--------------------------------------------------------------------------- 644 | @staticmethod 645 | def noteUpdater (value=None, 
actor=None, finding=None): 646 | """ 647 | Return the principal ID of the user who is updating a note. This value 648 | is only set if there is a corresponding update to the NoteText 649 | attribute of the finding. 650 | """ 651 | if not isinstance(actor, Actor) or not isinstance(finding, Finding): 652 | _LOGGER.warning("496130w missing actor or finding for '%s'" % value) 653 | answer = None 654 | else: 655 | if finding.NoteText: 656 | answer = actor.principal.get("UserId") 657 | else: 658 | answer = None 659 | 660 | return answer 661 | #--------------------------------------------------------------------------- 662 | @staticmethod 663 | def delist (list = []): 664 | """ 665 | Convert a list into a newline-separated string 666 | """ 667 | return "\n".join(list) 668 | #--------------------------------------------------------------------------- 669 | @staticmethod 670 | def resources (resources = []): 671 | """ 672 | Convert a list of SecurityHub resources to a newline-separated string 673 | """ 674 | answer = [] 675 | 676 | for resource in resources: 677 | _type = resource.get("Type") 678 | _id = resource.get("Id") 679 | _partition = resource.get("Partition") 680 | _region = resource.get("Region") 681 | 682 | answer.append("%s, %s, %s, %s" % (_type, _id, _partition, _region)) 683 | 684 | return "\n".join(answer) 685 | #--------------------------------------------------------------------------- 686 | @staticmethod 687 | def checkSeverity (value=0): 688 | """ 689 | Verify a Security Hub severity value, which must be an integer between 690 | 0 and 100, or a floating point number provided by the source application 691 | """ 692 | if value == None: 693 | answer = None 694 | elif isinstance(value, str): 695 | try: 696 | answer = int(value) 697 | 698 | if (answer < 0) or (answer > 100): 699 | raise FindingValueError( 700 | "%s is not an int between 0 and 100" % value 701 | ) 702 | except: 703 | try: 704 | answer = float(value) 705 | except: 706 | answer = None 707 | elif
isinstance(value, float): 708 | answer = value 709 | elif isinstance(value, int) and (value >= 0) and (value <= 100): 710 | answer = value 711 | else: 712 | raise FindingValueError( 713 | "%d is not a float or int between 0 and 100" % value 714 | ) 715 | 716 | return answer 717 | #--------------------------------------------------------------------------- 718 | @staticmethod 719 | def checkSeverityLabel (value=None): 720 | """ 721 | Verify a Security Hub severity label which must be one of the values in 722 | the list below. The comparison is case insensitive and an uppercase 723 | value is always returned. 724 | 725 | Valid values are stored in FindingActions._SEVERITY_LABELS 726 | """ 727 | if not isinstance(value, str) or value == '': 728 | answer = None 729 | else: 730 | if value.upper() in FindingActions._SEVERITY_LABELS: 731 | answer = value.upper() 732 | else: 733 | raise FindingValueError( 734 | "'%s' is not a valid severity label" % value 735 | ) 736 | 737 | return answer 738 | #--------------------------------------------------------------------------- 739 | @staticmethod 740 | def checkVerificationState (value=None): 741 | """ 742 | Verify a Security Hub verification state which must be one of the values 743 | in the list below. The comparison is case insensitive, converts 744 | whitespace to a single underscore ("_") and an uppercase value is always 745 | returned. 
746 | 747 | Valid values are in FindingActions._VERIFICATION_STATES 748 | """ 749 | if not isinstance(value, str) or value == '': 750 | answer = None 751 | else: 752 | candidate = re.sub(r'\s+', "_", value).upper() 753 | 754 | if candidate in FindingActions._VERIFICATION_STATES: 755 | answer = candidate 756 | else: 757 | raise FindingValueError( 758 | "'%s' is not a valid verification state" % value 759 | ) 760 | 761 | return answer 762 | #--------------------------------------------------------------------------- 763 | @staticmethod 764 | def checkWorkflow (value=None): 765 | """ 766 | Verify a Security Hub workflow label which must be one of the values in 767 | the list below. The comparison is case insensitive and an uppercase 768 | value is always returned. 769 | 770 | Valid values are in FindingActions._WORKFLOWS 771 | """ 772 | if not isinstance(value, str) or value == '': 773 | answer = None 774 | else: 775 | if value.upper() in FindingActions._WORKFLOWS: 776 | answer = value.upper() 777 | else: 778 | raise FindingValueError( 779 | "'%s' is not a valid workflow state" % value 780 | ) 781 | 782 | return answer 783 | #--------------------------------------------------------------------------- 784 | @staticmethod 785 | def forceInteger (value=None): 786 | """ 787 | Force a value to be an integer. 
788 | """ 789 | try: 790 | answer = int(value) 791 | except: 792 | answer = None 793 | 794 | return answer 795 | ################################################################################ 796 | # 797 | ################################################################################ 798 | class ActorException (Exception): 799 | pass 800 | ################################################################################ 801 | # 802 | ################################################################################ 803 | class Actor: 804 | _REGION_MODE_SINGLE = 1 # A simple, single-region client 805 | _REGION_MODE_MULTIPLE = 2 # Requires a ServiceRegionBroker 806 | """ 807 | An abstract class for API functions. The class defines clients in each of 808 | a set of supported regions, and then carries out actions using those 809 | clients. 810 | 811 | The following APIs are used by the abstract class: 812 | sts:AssumeRole 813 | sts:GetCallerIdentity 814 | """ 815 | #--------------------------------------------------------------------------- 816 | def __init__ (self, service=None, region=None, role=None): 817 | """ 818 | See the class definition for details.
819 | """ 820 | self.role = role 821 | self.authorized = False 822 | self.accessKeyId = None 823 | self.accessKey = None 824 | self.sessionToken = None 825 | self.client = {} 826 | self.principal = None 827 | self.service = service 828 | 829 | # Some things depend on whether we were passed a region or a list 830 | if isinstance(region, list): 831 | self.regions = region 832 | self.mode = Actor._REGION_MODE_MULTIPLE 833 | 834 | _LOGGER.debug(f'496140d service {service} set to _REGION_MODE_MULTIPLE') 835 | 836 | elif isinstance(region, str): 837 | self.regions = [ region ] 838 | self.mode = Actor._REGION_MODE_SINGLE 839 | 840 | else: 841 | raise ActorException("496150t region must be a region name or list of regions [%s]" % region) 842 | 843 | # Get authorization 844 | self.authorize(regions=self.regions) 845 | 846 | # Create a client for the specified regions 847 | for region in self.regions: 848 | _LOGGER.debug("496160d create %s client in region %s" 849 | % (self.service, region)) 850 | 851 | self.client[region] = self.getClient(region) 852 | #--------------------------------------------------------------------------- 853 | def getPartition(self, region:str) -> str: 854 | """ 855 | Get an AWS partition name from a given region. 
It should be possible to 856 | do this more elegantly later 857 | """ 858 | candidate = region if region else self.primaryRegion 859 | answer = "aws" 860 | 861 | if re.match(r'^us-gov.*', candidate): 862 | answer = "aws-us-gov" 863 | 864 | _LOGGER.debug(f'496180d mapped region {candidate} to partition {answer}') 865 | 866 | return answer 867 | #--------------------------------------------------------------------------- 868 | def getSupportedRegions(self, region:str=None, service:str=None) -> list: 869 | """ 870 | Return a list of supported regions for this service 871 | """ 872 | region = self.primaryRegion if not region else region 873 | partition = self.getPartition(region) 874 | service = self.service if not service else service 875 | 876 | answer = Session(region_name=region) \ 877 | .get_available_regions( 878 | service, 879 | partition_name=partition 880 | ) 881 | 882 | return answer 883 | #--------------------------------------------------------------------------- 884 | def getClient (self, region:str) -> object: 885 | """ 886 | Create an AWS API client associated with a specific region 887 | """ 888 | try: 889 | client = boto3.client( 890 | self.service, 891 | aws_access_key_id=self.accessKeyId, 892 | aws_secret_access_key=self.accessKey, 893 | aws_session_token=self.sessionToken , 894 | region_name=region 895 | ) 896 | 897 | except Exception as thrown: 898 | _LOGGER.critical(f'496190s error obtaining client for {self.service} ' + 899 | f'in {region}: {thrown}') 900 | client = None 901 | 902 | return client 903 | #--------------------------------------------------------------------------- 904 | @property 905 | def primaryRegion (self): 906 | """ 907 | Return the SRB's primary region or just use the supplied region 908 | """ 909 | return self.regions[0] 910 | #--------------------------------------------------------------------------- 911 | @property 912 | def primaryClient (self): 913 | """ 914 | Return the client for the primary region 915 | """ 916 | return
self.client[self.primaryRegion] 917 | #--------------------------------------------------------------------------- 918 | def authorize (self, regions=None): 919 | """ 920 | If no role is supplied to the actor, the authorization is implicit 921 | through the credentials already in the environment. Otherwise, use 922 | sts:assume_role to gain the privileges associated with the supplied 923 | role ARN. 924 | """ 925 | _LOGGER.debug("496200d request to authorize %s client region %s" 926 | % (self.service, regions[0])) 927 | 928 | # Obtain an STS client 929 | try: 930 | # Obtain an STS client 931 | client = boto3.client("sts", region_name=regions[0]) 932 | _LOGGER.debug("496210d obtained STS client %s" % client) 933 | 934 | # No role supplied - use environment credentials 935 | if not self.role: 936 | self.authorized = True 937 | _LOGGER.debug("496220d authorized from environment") 938 | 939 | # Role supplied, try to assume the role 940 | else: 941 | _LOGGER.debug("496230d attempt to assume role %s" % self.role) 942 | 943 | answer = client.assume_role( 944 | RoleArn=self.role, 945 | RoleSessionName=("%s-access" % self.service) 946 | ) 947 | 948 | self.accessKeyId = answer["Credentials"]["AccessKeyId"] 949 | self.accessKey = answer["Credentials"]["SecretAccessKey"] 950 | self.sessionToken = answer["Credentials"]["SessionToken"] 951 | self.authorized = True 952 | 953 | _LOGGER.info("496240i assumed role %s for service %s" \ 954 | % (self.role, self.service)) 955 | 956 | # Now get the principal name of the authorized identity 957 | self.principal = client.get_caller_identity() 958 | 959 | # Catch client errors 960 | except ClientError as thrown: 961 | _LOGGER.critical(f'496250d threw {errorCode(thrown)}: {thrown}') 962 | self.authorized = False 963 | 964 | # If we got this far, we're authorized 965 | else: 966 | self.authorized = True 967 | 968 | # Complain if we aren't authorized 969 | if not self.authorized: 970 | raise ActorException('496260t authorization failed') 971 
| 972 | return self 973 | ################################################################################ 974 | # 975 | ################################################################################ 976 | class SsmActor (Actor): 977 | """ 978 | Perform systems manager (SSM) actions. The following SSM APIs are used 979 | in this concrete class: 980 | 981 | ssm:PutParameter 982 | ssm:GetParameters 983 | """ 984 | _PARAMETERS = [ 985 | "/csvManager/bucket", 986 | "/csvManager/folder/code", 987 | "/csvManager/folder/findings", 988 | "/csvManager/object/codeArchive", 989 | "/csvManager/partition", 990 | "/csvManager/regionList" 991 | ] 992 | #--------------------------------------------------------------------------- 993 | def __init__ (self, region=None, role=None, resolve=_PARAMETERS): 994 | """ 995 | See the class definition for details. 996 | """ 997 | super().__init__( 998 | service="ssm", 999 | region=region, 1000 | role=role 1001 | ) 1002 | 1003 | if resolve: 1004 | answers = self.getValue(resolve) 1005 | 1006 | for name, value in answers.items(): 1007 | setattr(self, name, value) 1008 | #--------------------------------------------------------------------------- 1009 | # Set an SSM parameter value 1010 | def putValue (self, name=None, description=None, value=None, type="String"): 1011 | """ 1012 | Set the value of an SSM parameter. 1013 | """ 1014 | try: 1015 | answer = self.primaryClient.put_parameter( 1016 | Name=name, 1017 | Description=description, 1018 | Type=type, 1019 | Value=value, 1020 | Overwrite=True 1021 | ) 1022 | 1023 | except Exception as thrown: 1024 | _LOGGER.info(f'496270s cannot set parameter: {thrown}') 1025 | 1026 | answer = None 1027 | 1028 | return answer 1029 | #--------------------------------------------------------------------------- 1030 | # Get SSM parameter values 1031 | def getValue (self, names:list[str]=[]): 1032 | """ 1033 | Retrieve the value of an SSM parameter. 
1034 | """ 1035 | if isinstance(names, list): 1036 | single = False 1037 | answer = {} 1038 | else: 1039 | names = [names] 1040 | single = True 1041 | answer = None 1042 | 1043 | try: 1044 | response = self.primaryClient.get_parameters(Names=names) 1045 | 1046 | _LOGGER.debug("496280d result from ssm:get_parameters %s" % response) 1047 | 1048 | for candidate in response["Parameters"]: 1049 | name = candidate["Name"] 1050 | value = candidate["Value"] 1051 | 1052 | if single: 1053 | answer = value 1054 | else: 1055 | answer[name] = value 1056 | 1057 | _LOGGER.debug(f'496290i SSM {name} = {value}') 1058 | 1059 | for name in response["InvalidParameters"]: 1060 | _LOGGER.info("496300d parameter '%s' not found" % name) 1061 | if not single: answer[name] = None 1062 | 1063 | except Exception as thrown: 1064 | _LOGGER.error("496310e cannot get parameters: %s" % str(thrown)) 1065 | 1066 | if not single: 1067 | for name in names: 1068 | _LOGGER.info("496320d parameter '%s' set to None" % name) 1069 | answer[name] = None 1070 | 1071 | return answer 1072 | ################################################################################ 1073 | # 1074 | ################################################################################ 1075 | class S3Actor(Actor): 1076 | """ 1077 | Perform AWS Simple Storage Service (S3) API operations.
The following S3 1078 | API operations are used by this concrete class: 1079 | 1080 | s3:PutObject 1081 | s3:GetObject 1082 | """ 1083 | _PREFIX = "SecurityHub" 1084 | _SUFFIX = ".csv" 1085 | _FOLDER = "SecurityHub" 1086 | #--------------------------------------------------------------------------- 1087 | def __init__ (self, bucket=None, folder=_FOLDER, prefix=_PREFIX, 1088 | suffix=_SUFFIX, region=None, role=None): 1089 | """ 1090 | See the class definition for details. 1091 | """ 1092 | super().__init__("s3", region=region, role=role) 1093 | 1094 | self.prefix = prefix 1095 | self.suffix = suffix 1096 | self.folder = folder 1097 | self.bucket = bucket 1098 | self._filename = None 1099 | #--------------------------------------------------------------------------- 1100 | def buildFilename (self, bucket=None, folder=None, name=None, 1101 | extension=None): 1102 | """ 1103 | Construct a fully qualified S3 name from the bucket, folder, 1104 | filename, and extension. 1105 | """ 1106 | answer = (bucket if bucket else self.bucket) + "/" + \ 1107 | (folder if folder else self.folder) + "/" + \ 1108 | ((name + "." + extension) if extension else name) 1109 | 1110 | return answer 1111 | #--------------------------------------------------------------------------- 1112 | @property 1113 | def filename (self): 1114 | """ 1115 | Return the unique S3 key associated with this object. 1116 | """ 1117 | if self._filename: 1118 | answer = self._filename 1119 | else: 1120 | answer = self.prefix + "-" + \ 1121 | time.strftime("%Y%m%d-%H%M%S") + \ 1122 | self.suffix 1123 | 1124 | self._filename = answer 1125 | 1126 | return answer 1127 | #--------------------------------------------------------------------------- 1128 | def filePath (self, directory = "/tmp"): 1129 | """ 1130 | Return a local fully qualified file path.
1131 | """ 1132 | return "/".join([directory, self.filename]) 1133 | #--------------------------------------------------------------------------- 1134 | @property 1135 | def objectKey (self): 1136 | """ 1137 | Return an S3 object key from the "folder" and unique filename. 1138 | """ 1139 | return "/".join([self.folder, self.filename]) 1140 | #--------------------------------------------------------------------------- 1141 | def put (self, inputFile = None, outputObject = None ): 1142 | """ 1143 | Store an object in the S3 bucket. The inputFile is read from 1144 | the local filesystem and stored to S3 as outputObject. 1145 | """ 1146 | source = inputFile if inputFile else self.filePath() 1147 | target = outputObject if outputObject else self.objectKey 1148 | 1149 | try: 1150 | with open(source, "rb") as source: 1151 | answer = self.primaryClient.put_object( 1152 | Bucket=self.bucket, 1153 | Key=target, 1154 | Body=source 1155 | ) 1156 | 1157 | except botocore.exceptions.ClientError as thrown: 1158 | answer = None 1159 | _LOGGER.critical("496330s cannot put object %s to bucket %s: %s" \ 1160 | % (target, self.bucket, str(thrown))) 1161 | 1162 | return answer 1163 | #--------------------------------------------------------------------------- 1164 | def parseS3Url (self, url=None): 1165 | """ 1166 | Parse an S3 url into bucket and key components. 
1167 | """ 1168 | # This pattern will match s3://[bucket]/[key] 1169 | pattern = re.compile( 1170 | r'^s3://([^/]+)/(.+)$', 1171 | flags=re.IGNORECASE 1172 | ) 1173 | 1174 | # Perform the match 1175 | match = pattern.match(url) if url else None 1176 | 1177 | if not match: 1178 | answer = None 1179 | else: 1180 | answer = ( match.group(1), match.group(2) ) 1181 | 1182 | return answer 1183 | #--------------------------------------------------------------------------- 1184 | def get (self, file=None, bucket=None, key=None, split=False): 1185 | """ 1186 | Retrieve an object from S3 or a local file and return the entire body. 1187 | 1188 | Parameters 1189 | ---------- 1190 | file : str 1191 | A local file path 1192 | bucket : str 1193 | Mutually exclusive with file, specifies an S3 bucket name 1194 | key : str 1195 | Mutually exclusive with file, specifies an S3 object key 1196 | """ 1197 | # Specifying a local file overrides S3 1198 | if file: 1199 | source = file 1200 | 1201 | try: 1202 | with open(source, "r") as input: 1203 | candidate = input.read() 1204 | 1205 | if not split: 1206 | answer = candidate 1207 | else: 1208 | answer = [ line.strip() for line in candidate.splitlines() ] 1209 | except Exception as thrown: 1210 | answer = None 1211 | _LOGGER.critical("496340s cannot read file %s: %s" \ 1212 | % (source, str(thrown))) 1213 | else: 1214 | bucket = bucket if bucket else self.bucket 1215 | key = key if key else self.objectKey 1216 | 1217 | try: 1218 | response = self.primaryClient.get_object( 1219 | Bucket=bucket, 1220 | Key=key 1221 | ) 1222 | 1223 | candidate = response.get("Body").read().decode("utf-8") 1224 | 1225 | if not split: 1226 | answer = candidate 1227 | else: 1228 | answer = [ line.strip() for line in candidate.splitlines() ] 1229 | except botocore.exceptions.ClientError as thrown: 1230 | answer = None 1231 | _LOGGER.critical("496350s cannot get object %s from bucket %s: %s" \ 1232 | %
(key, bucket, str(thrown))) 1233 | 1234 | return answer 1235 | ################################################################################ 1236 | # 1237 | ################################################################################ 1238 | class HubActor (Actor): 1239 | """ 1240 | Perform Security Hub API actions. The following API actions are used 1241 | by this concrete class: 1242 | 1243 | securityhub:GetFindings 1244 | securityhub:BatchUpdateFindings 1245 | """ 1246 | #--------------------------------------------------------------------------- 1247 | def __init__ (self, region=None, role = None): 1248 | """ 1249 | See class definition for details. 1250 | """ 1251 | super().__init__( 1252 | "securityhub", 1253 | region=region, 1254 | role=role 1255 | ) 1256 | 1257 | self.findings = [] 1258 | self.count = 0 1259 | #--------------------------------------------------------------------------- 1260 | def updateFindings (self, region=None, parameters=None): 1261 | """ 1262 | Update a finding. Parameters are generated by the MinimumUpdateList 1263 | parameterSets method. This method returns the untouched response 1264 | structure from the API call. 1265 | """ 1266 | client = self.getClient(region) 1267 | 1268 | try: 1269 | response = client.batch_update_findings(**parameters) 1270 | except Exception as thrown: 1271 | response = None 1272 | 1273 | _LOGGER.critical("496360t securityhub:batch_update_findings: %s" \ 1274 | % str(thrown)) 1275 | 1276 | return response 1277 | #--------------------------------------------------------------------------- 1278 | def downloadFindings (self, regions=None, filters={}, limit=0): 1279 | """ 1280 | Get findings from Security Hub using the securityhub:get_findings API, 1281 | applying filters as necessary, and limiting results as necessary.
1282 | """ 1283 | regions = regions if regions else self.regions 1284 | 1285 | self.findings = [] 1286 | downloaded = 0 1287 | 1288 | # Get findings for each region 1289 | for region in regions: 1290 | _LOGGER.info(f'496370i retrieving findings from region {region}') 1291 | 1292 | # Get SecurityHub client for this region 1293 | client = self.client[region] 1294 | 1295 | try: 1296 | token = None 1297 | 1298 | while True: 1299 | if not token: 1300 | answer = client.get_findings( 1301 | Filters=filters, 1302 | MaxResults=100 1303 | ) 1304 | else: 1305 | answer = client.get_findings( 1306 | Filters=filters, 1307 | MaxResults=100, 1308 | NextToken=token 1309 | ) 1310 | 1311 | token = answer.get("NextToken", None) 1312 | findings = answer.get("Findings", []) 1313 | downloaded += len(findings) 1314 | 1315 | if (downloaded % 1000) == 0: 1316 | _LOGGER.info("496380i ... %8d findings retrieved" \ 1317 | % downloaded) 1318 | 1319 | self.findings += findings 1320 | 1321 | # This is the last set of findings if there is no "NextToken" 1322 | if not token: 1323 | break 1324 | 1325 | # If we've exceeded the finding limit, we're done 1326 | if (limit != 0) and (downloaded > limit): 1327 | _LOGGER.info("496390i %d findings exceeds limit of %d" \ 1328 | % (downloaded, limit)) 1329 | break 1330 | 1331 | self.count = len(self.findings) 1332 | 1333 | if (limit != 0) and (downloaded > limit): 1334 | break 1335 | 1336 | except client.exceptions.InvalidAccessException as thrown: 1337 | _LOGGER.error('496400e cannot retrieve findings for ' 1338 | + f'region {region}: {thrown.response["Error"]["Message"]}') 1339 | 1340 | _LOGGER.info("496410i retrieved %d total findings from all regions" \ 1341 | % downloaded) 1342 | 1343 | return self.findings 1344 | #--------------------------------------------------------------------------- 1345 | def getFinding (self): 1346 | """ 1347 | Generator yields each successive finding from a previous 1348 | downloadFindings operation.
1349 | """ 1350 | for finding in self.findings: 1351 | yield finding 1352 | ################################################################################ 1353 | # 1354 | ################################################################################ 1355 | class MalformedUpdate (Exception): 1356 | """ 1357 | There were errors in the update request 1358 | """ 1359 | pass 1360 | ################################################################################ 1361 | # 1362 | ################################################################################ 1363 | class FindingUpdate: 1364 | """ 1365 | Represents an update to a Security Hub finding. The update is separated 1366 | into a key signature and an update signature. See MinimumUpdateList for 1367 | more details. 1368 | """ 1369 | #--------------------------------------------------------------------------- 1370 | def __init__ (self, finding=None): 1371 | """ 1372 | See the class description for details. 1373 | """ 1374 | if not isinstance(finding, Finding): 1375 | raise MalformedUpdate("496420t findings must be Finding objects") 1376 | 1377 | self.finding = finding 1378 | self.changes = 0 1379 | self.update = {} 1380 | self.attributes = [] 1381 | self.keys = {} 1382 | 1383 | # Handle the updates - map is assigned to a FindingColumn object 1384 | for column in finding.mapping: 1385 | name = column.columnName 1386 | value = getattr(finding, name) 1387 | 1388 | # Keep track of keys 1389 | if column.isKey: 1390 | _LOGGER.debug("496430d column %s value '%s' is a key" \ 1391 | % (name, value)) 1392 | 1393 | self.keys[name] = getattr(finding, name) 1394 | 1395 | # Do not process non-updatable columns 1396 | if not column.isUpdatable: 1397 | _LOGGER.debug("496440d skipping column %s - not updatable" \ 1398 | % name) 1399 | 1400 | continue 1401 | 1402 | # Do not process empty strings or None 1403 | if (value != 0) and not value: 1404 | _LOGGER.debug("496450d skipping column %s value '%s'" \ 1405 | % (name, value))
1406 | 1407 | continue 1408 | 1409 | _LOGGER.debug("496460d column %s value '%s'" % (name, value)) 1410 | 1411 | # Save the change (we know this is an updatable column) 1412 | self.attributes.append(name) 1413 | 1414 | Finding._deepSet( 1415 | self.update, 1416 | finding.getFindingColumn(name).keys, 1417 | value 1418 | ) 1419 | 1420 | self.changes += 1 1421 | 1422 | setattr(self, name, value) 1423 | 1424 | _LOGGER.debug("496470d finding %s\n\tsignature %s\n\tchanges %d" % \ 1425 | (self.keyString, self.signature, self.changes)) 1426 | #--------------------------------------------------------------------------- 1427 | @property 1428 | def updateRegion (self): 1429 | """ 1430 | Return the region associated with this finding. The region is parsed 1431 | from the Id value of the associated finding, which should be an ARN. 1432 | """ 1433 | identity = getattr(self.finding, "Id", None) 1434 | answer = None 1435 | 1436 | if not identity: 1437 | raise MalformedUpdate("496480t finding does not contain an Id") 1438 | else: 1439 | try: 1440 | answer = identity.split(":")[3] 1441 | except Exception as thrown: 1442 | raise MalformedUpdate("496490t malformed finding ID %s: %s" \ 1443 | % (identity, thrown)) 1444 | 1445 | return answer 1446 | #--------------------------------------------------------------------------- 1447 | @property 1448 | def signature (self): 1449 | """ 1450 | Returns an update signature as a string composed of the sorted 1451 | attribute names to be changed and their proposed values. 
1452 | """ 1453 | answer = [] 1454 | 1455 | for attribute in sorted(self.attributes): 1456 | value = getattr(self, attribute) 1457 | 1458 | if value: 1459 | answer.append("%s=%s" % (attribute, value)) 1460 | 1461 | answer = self.updateRegion + "|".join(answer) 1462 | 1463 | return answer 1464 | #--------------------------------------------------------------------------- 1465 | @property 1466 | def keyString (self): 1467 | """ 1468 | Returns a string containing the key values for the finding as 1469 | key=value pairs separated by vertical bars. 1470 | """ 1471 | answer = [] 1472 | 1473 | for key in sorted(self.keys.keys()): 1474 | answer.append("%s=%s" % (key, self.keys.get(key))) 1475 | 1476 | answer = "|".join(answer) 1477 | 1478 | return answer 1479 | ################################################################################ 1480 | # 1481 | ################################################################################ 1482 | class StartNextUpdateBatch (Exception): 1483 | """ 1484 | Raise this exception when an update set has 100 findings 1485 | """ 1486 | pass 1487 | ################################################################################ 1488 | # 1489 | ################################################################################ 1490 | class MinimumUpdateList: 1491 | """ 1492 | Accumulate finding updates and structure the updates so that multiple 1493 | findings with the same changes will only be submitted once. 1494 | """ 1495 | #--------------------------------------------------------------------------- 1496 | def __init__ (self): 1497 | """ 1498 | See class definition for details. 1499 | """ 1500 | self.update = {} 1501 | self.findings = {} 1502 | self.regions = {} 1503 | self.sets = 0 1504 | #--------------------------------------------------------------------------- 1505 | def add (self, finding=None): 1506 | """ 1507 | Add a finding to the minimum update list. 
Findings are aggregated by 1508 | their update signature (a unique string of keys and values to be 1509 | changed). Multiple findings with the same update signature will be 1510 | submitted as a single update to Security Hub. 1511 | """ 1512 | update = FindingUpdate(finding) 1513 | 1514 | if update.changes > 0: 1515 | region = update.updateRegion 1516 | signature = region + "|" + update.signature 1517 | 1518 | _LOGGER.debug("496500d found signature '%s'" % signature) 1519 | 1520 | # This is the first time we've seen this signature 1521 | if not (signature in self.update): 1522 | _LOGGER.debug("496510d saved update for signature '%s'" \ 1523 | % signature) 1524 | 1525 | self.update[signature] = update 1526 | 1527 | # This is the first time we've seen this finding 1528 | if not (signature in self.findings): 1529 | _LOGGER.debug("496520d initialized findings and region for '%s'" \ 1530 | % signature) 1531 | 1532 | self.findings[signature] = [] 1533 | self.regions[signature] = region 1534 | self.sets += 1 1535 | 1536 | # Track all findings for a signature 1537 | self.findings[signature].append(finding) 1538 | _LOGGER.debug("496530d added finding to '%s'" % signature) 1539 | #--------------------------------------------------------------------------- 1540 | @staticmethod 1541 | def updateCount (update={}): 1542 | """ 1543 | Return the number of finding IDs in this update batch. 1544 | """ 1545 | return len(update.get("FindingIdentifiers", [])) 1546 | #--------------------------------------------------------------------------- 1547 | def parameterSets (self): 1548 | """ 1549 | Generator to yield each update as a set of parameters to the 1550 | securityhub:batch_update_findings API.
1551 | """ 1552 | signatures = 0 1553 | 1554 | # Go through each update signature and findings 1555 | for signature, findings in self.findings.items(): 1556 | region = self.regions[signature] 1557 | update = self.update.get(signature).update 1558 | 1559 | update["FindingIdentifiers"] = [] 1560 | 1561 | _LOGGER.debug(f'496540d signature {signature} update {update}') 1562 | 1563 | signatures += 1 1564 | 1565 | _LOGGER.info(f'496550i processing update set {signatures}...') 1566 | 1567 | # Now loop through findings to build an update set 1568 | for finding in findings: 1569 | update["FindingIdentifiers"].append(finding.keys) 1570 | _LOGGER.debug(f'496560d adding {len(finding.keys)} to update now {self.updateCount(update)}') 1571 | 1572 | # The update set can contain no more than 100 finding IDs -- 1573 | # yield the update set and then clear it out 1574 | if self.updateCount(update) >= 100: 1575 | _LOGGER.debug(f'496570d batch has >=100 finding IDs, yielding to {region}') 1576 | yield region, update 1577 | 1578 | update["FindingIdentifiers"] = [] 1579 | 1580 | # We end up here whenever the number of findings is < 100 1581 | if self.updateCount(update) > 0: 1582 | _LOGGER.debug(f'496580d yielding {self.updateCount(update)} finding IDs to {region}') 1583 | yield region, update 1584 | #---------------------------------------------------------------------------- 1585 | @staticmethod 1586 | def apply (update=None, region=None, actor=None): 1587 | """ 1588 | Static method to apply an update to SecurityHub findings. The update 1589 | must be the value yielded by the parameterSets generator, and the 1590 | actor must be a HubActor object. 1591 | 1592 | This method should probably be moved to HubActor as an object method, 1593 | but it suffices for now.
1594 | """ 1595 | response = {} 1596 | 1597 | if not isinstance(actor, HubActor): 1598 | raise MalformedUpdate("496590t MinimumUpdateList.apply requires a " + \ 1599 | "HubActor object") 1600 | try: 1601 | response = actor.updateFindings(region=region, parameters=update) 1602 | except Exception as thrown: 1603 | response = None 1604 | _LOGGER.critical("496600s unexpected error %s" % str(thrown)) 1605 | 1606 | if (response == None): 1607 | _LOGGER.critical("496610s bad things in MinimumUpdateList.apply") 1608 | 1609 | return response 1610 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/lambdas/csvUpdater.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/bin/python3 2 | """ 3 | Update Security Hub findings en bloc/en masse from a CSV file 4 | 5 | This program can be invoked as an AWS Lambda function or from the command line. 6 | 7 | REVISION 20210225 Make work in GovCloud 8 | REVISION 20210930 Actually make work in GovCloud 9 | """ 10 | 11 | import argparse 12 | import csv 13 | import sys 14 | import os 15 | import re 16 | import logging 17 | import csvObjects as csvo 18 | import traceback 19 | import re 20 | 21 | # Retrieves the name of the current function (for logging purposes) 22 | this = lambda frame=0 : sys._getframe(frame+1).f_code.co_name 23 | 24 | _DEFAULT_LOGGING_LEVEL = logging.INFO 25 | """ Default logging level """ 26 | 27 | # Set up logging 28 | logging.basicConfig(level=_DEFAULT_LOGGING_LEVEL) 29 | 30 | # Retrieve the logging instance 31 | _LOGGER = logging.getLogger() 32 | _LOGGER.setLevel(_DEFAULT_LOGGING_LEVEL) 33 | """ Initialized logging RootLogger instance """ 34 | ################################################################################ 35 | # 36 | ################################################################################ 37 | class InputDiscriminator: 38 | """ 39 | Parse the input parameter into an S3 bucket and key or a 
local file path. 40 | Resulting object has the following properties: 41 | 42 | isLocal - boolean - the input is a local file 43 | bucket - string - if an S3 input, the bucket name 44 | key - string - if an S3 input, the object key 45 | path - string - if a local file, the file path 46 | """ 47 | #--------------------------------------------------------------------------- 48 | def __init__ (self, input=None): 49 | """ See class definition """ 50 | match = re.match(r'^s3://([^/]+)/(.*)', input, re.IGNORECASE) 51 | 52 | if match: 53 | self.isLocal = False 54 | self.bucket = match.group(1) 55 | self.key = match.group(2) 56 | self.path = None 57 | else: 58 | self.isLocal = True 59 | self.bucket = None 60 | self.key = None 61 | self.path = input 62 | ################################################################################ 63 | #### 64 | ################################################################################ 65 | def choose (default=None, *choices): 66 | """ 67 | Choose between an option and an environment variable (the option always 68 | has priority, if specified) 69 | """ 70 | answer = default 71 | 72 | for choice in choices: 73 | if choice: 74 | answer = choice 75 | break 76 | 77 | return answer 78 | ################################################################################ 79 | #### Invocation-independent process handler 80 | ################################################################################ 81 | def executor (role=None, region=None, debug=False, input=None): 82 | """ 83 | Called from either the command or Lambda invocations. Obtains the necessary 84 | API clients, gathers updates from the input CSV file, and then applies 85 | updates using the securityhub:batch_update_findings API. 
86 | """ 87 | processed = [] 88 | unprocessed = [] 89 | 90 | # Get the SSM parameters and a client for further SSM operations 91 | ssmActor = csvo.SsmActor(role=role, region=region) 92 | 93 | # Get a list of Security Hub regions we wish to act on 94 | regions = choose( 95 | os.environ.get("CSV_SECURITYHUB_REGIONLIST"), 96 | re.compile(r"\s*,\s*").split(getattr(ssmActor, "/csvManager/regionList")), 97 | ssmActor.getSupportedRegions(service="securityhub") 98 | ) 99 | 100 | _LOGGER.info("494010i selected SecurityHub regions %s" % regions) 101 | 102 | # Determine if input is S3 or local file 103 | source = InputDiscriminator(input) 104 | 105 | try: 106 | # Will handle input in S3 or local file 107 | s3Actor = csvo.S3Actor( 108 | bucket=source.bucket, 109 | region=regions, 110 | role=role 111 | ) 112 | 113 | # Use SecurityHub to update findings 114 | hubActor = csvo.HubActor( 115 | role=role, 116 | region=regions 117 | ) 118 | 119 | # Determine whether the input is coming from local file or S3 120 | if source.isLocal: 121 | raw = s3Actor.get(file=source.path, split=True) 122 | else: 123 | raw = s3Actor.get(bucket=source.bucket, key=source.key, split=True) 124 | 125 | # Reader for CSV input 126 | reader = csv.reader(raw, delimiter=',') 127 | 128 | # This object creates a minimum set of updates 129 | updates = csvo.MinimumUpdateList() 130 | count = 0 131 | 132 | # Report start of export 133 | _LOGGER.info("494020i processing %d records from CSV" % len(raw)) 134 | 135 | for rowNumber, row in enumerate(reader): 136 | # Skip the column header row 137 | if row[0] == "Id": 138 | continue 139 | 140 | # Process each finding 141 | try: 142 | finding = csvo.Finding(row, actor=hubActor) 143 | 144 | # If there is a problem with the finding, just skip it--user 145 | # can re-run later after corrections 146 | except csvo.FindingValueError as thrown: 147 | _LOGGER.error("494030e row %d error: %s" \ 148 | % (rowNumber + 1, str(thrown))) 149 | 150 | continue 151 | 152 | count += 1 153 |
154 | updates.add(finding) 155 | 156 | # Report progress 157 | if (count % 1000) == 0: 158 | _LOGGER.info("494040i ... %8d findings processed" % count) 159 | 160 | # Report the results of the preprocessing 161 | _LOGGER.info("494050i processed %d findings and identified %d update sets" \ 162 | % (count, updates.sets)) 163 | 164 | # Now apply the updates 165 | if (updates.sets > 0): 166 | _LOGGER.info("494060i processing update sets") 167 | 168 | for region, update in updates.parameterSets(): 169 | # Actually apply the update set -- this is a per-region operation 170 | response = csvo.MinimumUpdateList.apply( 171 | update=update, 172 | region=region, 173 | actor=hubActor 174 | ) 175 | 176 | # Keep track of successes and failures 177 | processed += (response or {}).get("ProcessedFindings", []) 178 | unprocessed += (response or {}).get("UnprocessedFindings", []) 179 | 180 | # Report the results of the update 181 | _LOGGER.info( 182 | "494070i %d findings processed, %d findings not processed" \ 183 | % (len(processed), len(unprocessed)) 184 | ) 185 | 186 | # If some findings were not processed, report those findings 187 | if len(unprocessed) > 0: 188 | _LOGGER.error( 189 | "494080e the following findings were not processed" 190 | ) 191 | 192 | for failed in unprocessed: 193 | finding = failed.get("FindingIdentifier", {}).get("Id") 194 | code = failed.get("ErrorCode") 195 | message = failed.get("ErrorMessage") 196 | 197 | _LOGGER.error("494090e %s\n\t%s - %s" \ 198 | % (finding, code, message)) 199 | 200 | # Handle any errors that arise 201 | except Exception as thrown: 202 | message = "(s) Unexpected executor error %s" % str(thrown) 203 | 204 | if debug: 205 | _LOGGER.exception("494100t %s" % message) 206 | else: 207 | _LOGGER.critical("494110t %s" % message) 208 | 209 | answer = { 210 | "success" : False , 211 | "message" : str(thrown) , 212 | "input" : input 213 | } 214 | 215 | # What to do if there are no errors 216 | else: 217 | answer = { 218 | "processed": processed, 219 |
"unprocessed": unprocessed 220 | } 221 | 222 | if len(unprocessed) == 0: 223 | answer["message"] = "Update succeeded" 224 | answer["success"] = True 225 | else: 226 | if len(processed) > 0: 227 | answer["message"] = "Update partially succeeded" 228 | answer["success"] = True 229 | else: 230 | answer["message"] = "Update failed" 231 | answer["success"] = False 232 | 233 | return answer 234 | 235 | ################################################################################ 236 | #### Lambda handler 237 | ################################################################################ 238 | def lambdaHandler ( event = None, context = None ): 239 | """ 240 | Stub for Lambda handler. 241 | """ 242 | try: 243 | # These data come from the event 244 | roleArn = event.get("roleArn") 245 | input = event.get("input") 246 | debug = event.get("debug") 247 | region = event.get("primaryRegion") 248 | 249 | # Do the work 250 | answer = executor( 251 | role=roleArn, 252 | input=input, 253 | debug=debug, 254 | region=region 255 | ) 256 | 257 | # Handle trouble if it arises 258 | except Exception as thrown: 259 | errorType = type(thrown).__name__ 260 | errorTrace = traceback.format_tb(thrown.__traceback__, limit=5) 261 | message = "lambda raised exception (%s): %s\n%s" % \ 262 | (errorType, thrown, errorTrace) 263 | 264 | response = { 265 | "message": str(thrown), 266 | "traceback": traceback.format_tb(thrown.__traceback__, limit=5), 267 | "input": input, 268 | "resultCode": 500 269 | } 270 | 271 | # No trouble so far 272 | else: 273 | response = { 274 | "message": "Success", 275 | "details": answer, 276 | "input": input, 277 | "resultCode": 200 278 | } 279 | 280 | return response 281 | ################################################################################ 282 | # 283 | ################################################################################ 284 | if __name__ == "__main__": 285 | """ 286 | This section is executed when csvUpdater.py is invoked as a
command-line command. 287 | (need to make regions and other parameters configurable) 288 | """ 289 | try: 290 | parser = argparse.ArgumentParser() 291 | parser.add_argument("--role-arn", dest="roleArn", required=False, 292 | help="The assumable role ARN to access SecurityHub") 293 | parser.add_argument("--input", required=True, 294 | help="S3 or file system file that holds findings to update") 295 | parser.add_argument("--debug", action="store_true", default=False, 296 | help="Provide more debugging details") 297 | parser.add_argument("--primary-region", dest="region", required=True, 298 | help="Primary region for operations") 299 | 300 | arguments = parser.parse_args() 301 | 302 | # Do the work 303 | executor( 304 | role=arguments.roleArn, 305 | input=arguments.input, 306 | region=arguments.region , 307 | debug=arguments.debug 308 | ) 309 | 310 | # Catch trouble 311 | except Exception as thrown: 312 | message = "command invocation raised (%s): %s" % \ 313 | (type(thrown).__name__, thrown) 314 | 315 | # This will generate a traceback 316 | if arguments.debug: 317 | _LOGGER.exception("494120i %s" % message) 318 | 319 | # This will not generate a traceback 320 | else: 321 | _LOGGER.critical("494130i %s" % message) 322 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/lambdas/csvUpdater.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aws-security-hub-csv-manager/066f9dabce99d510bed90c4e26f00a0f2dd73d32/csv_manager_sechub_cdk/lambdas/csvUpdater.zip -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/lib/sechub_finding_export.ts: -------------------------------------------------------------------------------- 1 | import { Stack, StackProps, Duration, CfnParameter, Fn, RemovalPolicy } from 'aws-cdk-lib'; 2 | import * as iam from 'aws-cdk-lib/aws-iam'; 3 | import { Construct } from 
'constructs'; 4 | import { join } from 'path'; 5 | import { Function, Runtime, Code } from 'aws-cdk-lib/aws-lambda'; 6 | import * as events from 'aws-cdk-lib/aws-events'; 7 | import { LambdaFunction } from 'aws-cdk-lib/aws-events-targets'; 8 | import { StringParameter, CfnDocument } from 'aws-cdk-lib/aws-ssm'; 9 | import { Key } from 'aws-cdk-lib/aws-kms'; 10 | import { BlockPublicAccess, Bucket, BucketEncryption, ObjectOwnership, StorageClass } from 'aws-cdk-lib/aws-s3'; 11 | import { principalsJson } from '../config.json'; 12 | 13 | export class SecHubExportStack extends Stack { 14 | constructor(scope: Construct, id: string, props?: StackProps) { 15 | super(scope, id, props); 16 | 17 | // Stack Parameters 18 | const Frequency = new CfnParameter(this, 'Frequency', { 19 | type: 'String', 20 | description: 'A cron or rate expression for how often the export occurs.', 21 | default: 'cron(0 8 ? * SUN *)' 22 | }); 23 | 24 | const Partition = new CfnParameter(this, 'Partition', { 25 | type: 'String', 26 | description: 'The partition in which CSV Manager for Security Hub will operate.', 27 | default: 'aws' 28 | }); 29 | 30 | const Regions = new CfnParameter(this, 'Regions', { 31 | type: 'String', 32 | description: 'The comma-delimited list of regions in which CSV Manager for Security Hub will operate.', 33 | default: this.region 34 | }); 35 | 36 | const PrimaryRegion = new CfnParameter(this, 'PrimaryRegion', { 37 | type: 'String', 38 | description: 'The region in which the S3 bucket and SSM parameters are stored.', 39 | default: this.region 40 | }); 41 | 42 | const FindingsFolder = new CfnParameter(this, 'FindingsFolder', { 43 | type: 'String', 44 | description: 'Folder that will contain exported Security Hub findings.', 45 | default: 'Findings' 46 | }); 47 | 48 | const CodeFolder = new CfnParameter(this, 'CodeFolder', { 49 | type: 'String', 50 | description: 'Folder that will contain Lambda code & CloudFormation templates.', 51 | default: 'Code' 52 | }); 53 | 54
| const ExpirationPeriod = new CfnParameter(this, 'ExpirationPeriod', { 55 | type: 'Number', 56 | description: 'Maximum days to retain exported findings.', 57 | default: 365 58 | }); 59 | 60 | const GlacierTransitionPeriod = new CfnParameter(this, 'GlacierTransitionPeriod', { 61 | type: 'Number', 62 | description: 'Maximum days before exported findings are moved to S3 Glacier.', 63 | default: 31 64 | }); 65 | 66 | // KMS Key for S3 Bucket for Security Hub Export 67 | const s3_kms_key = new Key(this, 's3_kms_key', { 68 | removalPolicy: RemovalPolicy.DESTROY, 69 | pendingWindow: Duration.days(7), 70 | description: 'KMS key for security hub findings in S3.', 71 | enableKeyRotation: false, 72 | alias: 'sh_export_key' 73 | }); 74 | 75 | // S3 Bucket for Security Hub Export 76 | const security_hub_export_bucket = new Bucket(this, 'security_hub_export_bucket', { 77 | removalPolicy: RemovalPolicy.RETAIN, 78 | bucketKeyEnabled: true, 79 | encryption: BucketEncryption.KMS, 80 | encryptionKey: s3_kms_key, 81 | enforceSSL: true, 82 | versioned: true, 83 | blockPublicAccess: BlockPublicAccess.BLOCK_ALL, 84 | objectOwnership: ObjectOwnership.BUCKET_OWNER_ENFORCED, 85 | publicReadAccess: false, 86 | lifecycleRules: [{ 87 | expiration: Duration.days(ExpirationPeriod.valueAsNumber), 88 | transitions: [{ 89 | storageClass: StorageClass.GLACIER, 90 | transitionAfter: Duration.days(GlacierTransitionPeriod.valueAsNumber) 91 | }] 92 | }] 93 | }); 94 | 95 | // Be sure to add valid IAM principals to the principalsJson object in ../config.json.
96 | principalsJson.principals.forEach((principal: string) => { 97 | security_hub_export_bucket.addToResourcePolicy(new iam.PolicyStatement({ 98 | actions: [ 99 | 's3:GetObject*', 100 | 's3:ListBucket', 101 | 's3:PutObject*' 102 | ], 103 | resources: [ 104 | security_hub_export_bucket.bucketArn, 105 | security_hub_export_bucket.arnForObjects('*') 106 | ], 107 | principals: [ 108 | new iam.ArnPrincipal(principal)], 109 | })); 110 | }) 111 | 112 | // IAM role shared by the CSV Manager Lambda functions 113 | const secub_csv_manager_role = new iam.Role(this, 'secub_csv_manager_role', { 114 | assumedBy: new iam.CompositePrincipal( 115 | new iam.ServicePrincipal("lambda.amazonaws.com"), 116 | new iam.ServicePrincipal("ec2.amazonaws.com"), 117 | new iam.ServicePrincipal("ssm.amazonaws.com") 118 | ), 119 | roleName: "SecurityHub_CSV_Exporter", 120 | managedPolicies: [ 121 | iam.ManagedPolicy.fromAwsManagedPolicyName('service-role/AWSLambdaBasicExecutionRole') 122 | ] 123 | }); 124 | 125 | security_hub_export_bucket.addToResourcePolicy(new iam.PolicyStatement({ 126 | actions: [ 127 | 's3:GetObject*', 128 | 's3:ListBucket', 129 | 's3:PutObject*' 130 | ], 131 | resources: [ 132 | security_hub_export_bucket.bucketArn, 133 | security_hub_export_bucket.arnForObjects('*') 134 | ], 135 | principals: [ 136 | new iam.ArnPrincipal(secub_csv_manager_role.roleArn)], 137 | })); 138 | 139 | const sh_csv_exporter_function = new Function(this, 'secub_csv_exporter_function', { 140 | runtime: Runtime.PYTHON_3_9, 141 | functionName: this.stackName + '_' + this.account + '_sh_csv_exporter', 142 | code: Code.fromAsset(join(__dirname, "../lambdas")), 143 | handler: 'csvExporter.lambdaHandler', 144 | description: 'Export SecurityHub findings to CSV in S3 bucket.', 145 | timeout: Duration.seconds(900), 146 | memorySize: 512, 147 | role: secub_csv_manager_role, 148 | reservedConcurrentExecutions: 100, 149 | environment: { 150 | CSV_PRIMARY_REGION:
PrimaryRegion.valueAsString 151 | }, 152 | }); 153 | 154 | const sh_csv_updater_function = new Function(this, 'secub_csv_updater_function', { 155 | runtime: Runtime.PYTHON_3_9, 156 | functionName: this.stackName + '_' + this.account + '_sh_csv_updater', 157 | code: Code.fromAsset(join(__dirname, "../lambdas")), 158 | handler: 'csvUpdater.lambdaHandler', 159 | description: 'Update SecurityHub findings from a CSV file in the S3 bucket.', 160 | timeout: Duration.seconds(900), 161 | memorySize: 512, 162 | role: secub_csv_manager_role, 163 | reservedConcurrentExecutions: 100, 164 | environment: { 165 | CSV_PRIMARY_REGION: PrimaryRegion.valueAsString 166 | }, 167 | }); 168 | 169 | const export_sechub_finding_policy_doc = new iam.PolicyDocument({ 170 | statements: [ 171 | new iam.PolicyStatement({ 172 | sid: "IAMAllow", 173 | effect: iam.Effect.ALLOW, 174 | actions: [ 175 | "iam:CreateServiceLinkedRole", 176 | "iam:PassRole" 177 | ], 178 | resources: [ 179 | Fn.join('', ["arn:", this.partition, ":iam::", this.account, ':role/*']), 180 | ] 181 | }), 182 | new iam.PolicyStatement({ 183 | sid: "STSAllow", 184 | effect: iam.Effect.ALLOW, 185 | actions: [ 186 | "sts:AssumeRole", 187 | "sts:GetCallerIdentity" 188 | ], 189 | resources: [ 190 | '*' 191 | ] 192 | }), 193 | new iam.PolicyStatement({ 194 | sid: "SecurityHubAllow", 195 | effect: iam.Effect.ALLOW, 196 | actions: [ 197 | "securityhub:BatchUpdateFindings", 198 | "securityhub:GetFindings" 199 | ], 200 | resources: [ 201 | '*' 202 | ] 203 | }), 204 | new iam.PolicyStatement({ 205 | sid: "S3Allow", 206 | effect: iam.Effect.ALLOW, 207 | actions: [ 208 | "s3:GetObject", 209 | "s3:PutObject" 210 | ], 211 | resources: [ 212 | security_hub_export_bucket.bucketArn, 213 | security_hub_export_bucket.arnForObjects("*") 214 | ] 215 | }), 216 | new iam.PolicyStatement({ 217 | sid: "KMSAllow", 218 | effect: iam.Effect.ALLOW, 219 | actions: [ 220 | "kms:Decrypt", 221 | "kms:Describe*", 222 | "kms:Encrypt", 223 | "kms:GenerateDataKey" 224 | ], 225 |
resources: [ 226 | s3_kms_key.keyArn 227 | ] 228 | }), 229 | new iam.PolicyStatement({ 230 | sid: "InvokeLambdaAllow", 231 | effect: iam.Effect.ALLOW, 232 | actions: [ 233 | "lambda:InvokeFunction" 234 | ], 235 | resources: [ 236 | sh_csv_exporter_function.functionArn, 237 | sh_csv_updater_function.functionArn 238 | ] 239 | }), 240 | new iam.PolicyStatement({ 241 | sid: "SSMAllow", 242 | effect: iam.Effect.ALLOW, 243 | actions: [ 244 | "ssm:GetParameters", 245 | "ssm:PutParameter" 246 | ], 247 | resources: [ 248 | Fn.join('', ["arn:", this.partition, ':ssm:', this.region, ':', this.account, ':parameter/csvManager/*']), 249 | ] 250 | }), 251 | ], 252 | }); 253 | 254 | new iam.ManagedPolicy(this, 'sechub_csv_managed_policy', { 255 | description: 'Permissions for the CSV Manager for Security Hub Lambda functions.', 256 | document: export_sechub_finding_policy_doc, 257 | managedPolicyName: 'sechub_csv_manager', 258 | roles: [secub_csv_manager_role] 259 | }); 260 | 261 | new events.Rule(this, 'Rule', { 262 | schedule: events.Schedule.expression(Frequency.valueAsString), 263 | enabled: false, 264 | description: "Invoke Security Hub findings exporter periodically.", 265 | targets: [ 266 | new LambdaFunction(sh_csv_exporter_function, { 267 | event: events.RuleTargetInput.fromObject( 268 | { 269 | "event": events.EventField.fromPath('$.event') 270 | } 271 | ) 272 | }) 273 | ] 274 | }); 275 | 276 | // SSM Automation document for on-demand findings exports 277 | new CfnDocument(this, 'create_sh_export_document', { 278 | documentType: 'Automation', 279 | name: 'start_sh_finding_export', 280 | content: { 281 | "schemaVersion": "0.3", 282 | "assumeRole": secub_csv_manager_role.roleArn, 283 | "description": "Generate a Security Hub Findings Export (CSV Manager for Security Hub) outside of the normal export.", 284 | "parameters": { 285 | "Filters": { 286 | "type": "String", 287 | "description": "The canned filter \"HighActive\" or a JSON-formatted string for the GetFindings API filter.", 288 | "default": 'HighActive' 289 | }, 290 | "Partition": { 291 |
"type": "String", 292 | "description": "The partition in which CSV Manager for Security Hub will operate.", 293 | "default": this.partition 294 | }, 295 | "Regions": { 296 | "type": "String", 297 | "description": "The comma-separated list of regions in which CSV Manager for Security Hub will operate.", 298 | "default": PrimaryRegion.valueAsString 299 | } 300 | }, 301 | "mainSteps": [{ 302 | "action": "aws:invokeLambdaFunction", 303 | "name": "InvokeLambdaforSHFindingExport", 304 | "inputs": { 305 | "InvocationType": 'RequestResponse', 306 | "FunctionName": sh_csv_exporter_function.functionName, 307 | "Payload": "{ \"filters\" : \"{{Filters}}\" , \"partition\" : \"{{Partition}}\", \"regions\" : \"[ {{Regions}} ]\"}" 308 | }, 309 | 'description': 'Invoke the CSV Manager for Security Hub Lambda function.', 310 | 'outputs':[ 311 | { 312 | 'Name': 'resultCode', 313 | 'Selector': '$.Payload.resultCode', 314 | 'Type': 'Integer' 315 | }, 316 | { 317 | 'Name': 'bucket', 318 | 'Selector': '$.Payload.bucket', 319 | 'Type': 'String' 320 | }, 321 | { 322 | 'Name': 'exportKey', 323 | 'Selector': '$.Payload.exportKey', 324 | 'Type': 'String' 325 | } 326 | ], 327 | 'isEnd': true 328 | }] 329 | } 330 | }); 331 | 332 | // SSM Automation document for updating findings from an exported CSV file 333 | new CfnDocument(this, 'update_sh_export_document', { 334 | documentType: 'Automation', 335 | name: 'start_sechub_csv_update', 336 | content: { 337 | "schemaVersion": "0.3", 338 | "assumeRole": secub_csv_manager_role.roleArn, 339 | "description": "Update Security Hub findings from a CSV file (CSV Manager for Security Hub) outside of the normal update cycle.", 340 | "parameters": { 341 | "Source": { 342 | "type": "String", 343 | "description": "An S3 URI containing the CSV file to update. e.g.
s3:///Findings/SecurityHub-20220415-115112.csv", 344 | "default": '' 345 | }, 346 | "PrimaryRegion": { 347 | "type": "String", 348 | "description": "Region to pull the CSV file from.", 349 | "default": PrimaryRegion.valueAsString 350 | } 351 | }, 352 | "mainSteps": [{ 353 | "action": "aws:invokeLambdaFunction", 354 | "name": "InvokeLambdaforSHFindingUpdate", 355 | "inputs": { 356 | "InvocationType": 'RequestResponse', 357 | "FunctionName": sh_csv_updater_function.functionName, 358 | "Payload": "{ \"input\" : \"{{Source}}\" , \"primaryRegion\" : \"{{PrimaryRegion}}\"}" 359 | }, 360 | 'description': 'Invoke the CSV Manager for Security Hub update Lambda function.', 361 | 'outputs':[ 362 | { 363 | 'Name': 'resultCode', 364 | 'Selector': '$.Payload.resultCode', 365 | 'Type': 'Integer' 366 | } 367 | ], 368 | 'isEnd':true 369 | }] 370 | } 371 | }); 372 | 373 | //SSM Parameters 374 | new StringParameter(this, 'BucketNameParameter', { 375 | description: 'The S3 bucket where Security Hub findings are exported.', 376 | parameterName: '/csvManager/bucket', 377 | stringValue: security_hub_export_bucket.bucketName, 378 | }); 379 | 380 | new StringParameter(this, 'KMSKeyParameter', { 381 | description: 'The KMS key encrypting the S3 bucket objects.', 382 | parameterName: '/csvManager/key', 383 | stringValue: s3_kms_key.keyArn, 384 | }); 385 | 386 | new StringParameter(this, 'CodeFolderParameter', { 387 | description: 'The folder where CSV Manager for Security Hub code is stored.', 388 | parameterName: '/csvManager/folder/code', 389 | stringValue: CodeFolder.valueAsString, 390 | }); 391 | 392 | new StringParameter(this, 'FindingsFolderParameter', { 393 | description: 'The folder where CSV Manager for Security Hub findings are exported.', 394 | parameterName: '/csvManager/folder/findings', 395 | stringValue: FindingsFolder.valueAsString, 396 | }); 397 | 398 | new StringParameter(this, 'ArchiveKeyParameter', { 399 | description: 'The name of the ZIP archive containing CSV Manager for Security Hub Lambda
code.', 400 | parameterName: '/csvManager/object/codeArchive', 401 | stringValue: 'Not Initialized', 402 | }); 403 | 404 | new StringParameter(this, 'PartitionParameter', { 405 | description: 'The partition in which CSV Manager for Security Hub will operate.', 406 | parameterName: '/csvManager/partition', 407 | stringValue: Partition.valueAsString, 408 | }); 409 | 410 | new StringParameter(this, 'RegionParameter', { 411 | description: 'The list of regions in which CSV Manager for Security Hub will operate.', 412 | parameterName: '/csvManager/regionList', 413 | stringValue: Regions.valueAsString, 414 | }); 415 | 416 | } 417 | } 418 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "cdk_solution", 3 | "version": "0.1.0", 4 | "bin": { 5 | "cdk_solution": "bin/cdk_solution.js" 6 | }, 7 | "scripts": { 8 | "build": "tsc", 9 | "watch": "tsc -w", 10 | "test": "jest", 11 | "cdk": "cdk" 12 | }, 13 | "devDependencies": { 14 | "@types/jest": "^26.0.10", 15 | "@types/node": "10.17.27", 16 | "aws-cdk": "2.19.0", 17 | "jest": "^26.4.2", 18 | "ts-jest": "^26.2.0", 19 | "ts-node": "^9.0.0", 20 | "typescript": "~3.9.7" 21 | }, 22 | "dependencies": { 23 | "aws-cdk-lib": "2.19.0", 24 | "cdk-ssm-document": "^3.1.0", 25 | "constructs": "^10.0.0", 26 | "source-map-support": "^0.5.16" 27 | } 28 | } 29 | -------------------------------------------------------------------------------- /csv_manager_sechub_cdk/tsconfig.json: -------------------------------------------------------------------------------- 1 | { 2 | "compilerOptions": { 3 | "target": "ES2018", 4 | "module": "commonjs", 5 | "lib": [ 6 | "es2018" 7 | ], 8 | "declaration": true, 9 | "strict": true, 10 | "noImplicitAny": true, 11 | "strictNullChecks": true,
12 | "noImplicitThis": true, 13 | "alwaysStrict": true, 14 | "noUnusedLocals": false, 15 | "noUnusedParameters": false, 16 | "noImplicitReturns": true, 17 | "noFallthroughCasesInSwitch": false, 18 | "inlineSourceMap": true, 19 | "inlineSources": true, 20 | "experimentalDecorators": true, 21 | "strictPropertyInitialization": false, 22 | "resolveJsonModule": true, 23 | "typeRoots": [ 24 | "./node_modules/@types" 25 | ] 26 | }, 27 | "exclude": [ 28 | "node_modules", 29 | "cdk.out" 30 | ] 31 | } 32 | --------------------------------------------------------------------------------
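The SSM Automation documents in `sechub_finding_export.ts` build the Lambda payload by splicing `{{Parameter}}` placeholders into a hand-written JSON string. A minimal sketch of how that template expands (plain Python, no AWS calls; the parameter values are illustrative assumptions, not values from the repository):

```python
import json

# Template string as embedded in the start_sh_finding_export SSM document.
PAYLOAD_TEMPLATE = (
    '{ "filters" : "{{Filters}}" , '
    '"partition" : "{{Partition}}", '
    '"regions" : "[ {{Regions}} ]"}'
)


def render_payload(filters: str, partition: str, regions: str) -> dict:
    """Substitute SSM-style {{Param}} placeholders, then parse the result."""
    rendered = (
        PAYLOAD_TEMPLATE
        .replace("{{Filters}}", filters)
        .replace("{{Partition}}", partition)
        .replace("{{Regions}}", regions)
    )
    return json.loads(rendered)


# The regions value reaches the Lambda as the literal string "[ us-east-1 ]",
# not as a JSON list -- the handler has to split it itself.
payload = render_payload("HighActive", "aws", "us-east-1")
```

Note that because the substitution is plain string splicing, a `Filters` value that is itself a JSON-formatted string (the other documented option) would inject unescaped quotes and break the payload; the canned `"HighActive"` filter avoids that problem.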