├── .github ├── ISSUE_TEMPLATE │ ├── config.yml │ ├── question.md │ ├── bug_report.md │ └── feature_request.md ├── CODE_OF_CONDUCT.md ├── PULL_REQUEST_TEMPLATE.md └── workflows │ ├── python-lambda-check.yml │ └── cloudformation-check.yml ├── diagram ├── aft-bootstrap-pipeline.jpg └── aft-bootstrap-pipeline.drawio ├── code ├── aft-setup │ └── terraform │ │ ├── backend.tf │ │ └── main.tf ├── lambda │ └── gen_files.py └── aft-deployment-pipeline.yaml ├── .gitignore ├── LICENSE ├── README.md └── CONTRIBUTING.md /.github/ISSUE_TEMPLATE/config.yml: -------------------------------------------------------------------------------- 1 | blank_issues_enabled: false -------------------------------------------------------------------------------- /diagram/aft-bootstrap-pipeline.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aft-bootstrap-pipeline/main/diagram/aft-bootstrap-pipeline.jpg -------------------------------------------------------------------------------- /code/aft-setup/terraform/backend.tf: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | terraform { 5 | backend "s3" { 6 | region = "" 7 | bucket = "" 8 | key = "aft-setup.tfstate" 9 | } 10 | } 11 | -------------------------------------------------------------------------------- /.github/CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 
5 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *~ 2 | .Trash-* 3 | .DS_Store 4 | .LSOverride 5 | ._* 6 | __pycache__/ 7 | *.py[cod] 8 | *$py.class 9 | .env 10 | .venv 11 | env/ 12 | venv/ 13 | ENV/ 14 | env.bak/ 15 | venv.bak/ 16 | crash.log 17 | **.terraform* 18 | **terraform.rc 19 | **.tfvars 20 | **.tfstate 21 | **.tfstate.* 22 | **.zip 23 | **.not 24 | **tests 25 | **.drawio.bkp 26 | 27 | -------------------------------------------------------------------------------- /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | # Description 2 | 3 | ### Motivation and Context 4 | 5 | 6 | - Resolves # 7 | 8 | ### How was this change tested? 9 | 10 | - [ ] Yes, I have tested the PR using my local account setup (Provide any test evidence report under Additional Notes) 11 | 12 | ### Additional Notes 13 | 14 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/question.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Question 3 | about: I have a Question 4 | --- 5 | 6 | - [ ] ✋ I have searched the open/closed issues and my issue is not listed. 
7 | 8 | #### Please describe your question here 9 | 10 | 11 | 12 | #### Provide a link to the example/module related to the question 13 | 14 | 15 | 16 | #### Additional context 17 | 18 | -------------------------------------------------------------------------------- /.github/workflows/python-lambda-check.yml: -------------------------------------------------------------------------------- 1 | name: Python Lambda Checks 2 | 3 | on: 4 | push: 5 | branches: 6 | - main 7 | 8 | jobs: 9 | lint-and-security-check: 10 | name: Lint and Security Check 11 | runs-on: ubuntu-latest 12 | 13 | steps: 14 | - name: Checkout Repository 15 | uses: actions/checkout@v2 16 | 17 | - name: Set up Python 18 | uses: actions/setup-python@v2 19 | with: 20 | python-version: '3.10' 21 | 22 | - name: Install Pylint 23 | run: pip install pylint 24 | 25 | - name: Run Pylint 26 | run: pylint code/lambda/gen_files.py 27 | 28 | - name: Install Bandit 29 | run: pip install bandit 30 | 31 | - name: Run Bandit 32 | run: bandit -r code/lambda/gen_files.py 33 | -------------------------------------------------------------------------------- /.github/workflows/cloudformation-check.yml: -------------------------------------------------------------------------------- 1 | name: CloudFormation Checks 2 | 3 | on: 4 | push: 5 | branches: 6 | - main 7 | 8 | jobs: 9 | run-checkov-and-cfn-nag: 10 | name: Run Checkov and cfn-nag 11 | runs-on: ubuntu-latest 12 | 13 | steps: 14 | - name: Checkout code 15 | uses: actions/checkout@v2 16 | 17 | - name: Set up Python 18 | uses: actions/setup-python@v2 19 | with: 20 | python-version: '3.x' 21 | 22 | - name: Install Checkov 23 | run: pip install checkov 24 | 25 | - name: Run Checkov 26 | run: checkov -f code/aft-deployment-pipeline.yaml 27 | 28 | - name: Install cfn-nag 29 | run: | 30 | sudo gem install cfn-nag 31 | 32 | - name: Run cfn-nag 33 | run: cfn_nag_scan --input-path code/aft-deployment-pipeline.yaml 34 | 
-------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT No Attribution 2 | 3 | Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy of 6 | this software and associated documentation files (the "Software"), to deal in 7 | the Software without restriction, including without limitation the rights to 8 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of 9 | the Software, and to permit persons to whom the Software is furnished to do so. 10 | 11 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 12 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS 13 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 14 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER 15 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 16 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 17 | 18 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | --- 5 | 6 | ## Description 7 | 8 | Please provide a clear and concise description of the issue you are encountering, and a reproduction of your configuration. 9 | 10 | If your request is for a new feature, please use the `Feature request` template. 11 | 12 | - [ ] ✋ I have searched the open/closed issues and my issue is not listed. 
13 | 14 | ## Versions 15 | 16 | - AWS Control Tower version [Required]: 17 | - Terraform version: 18 | - AFT version: 19 | 20 | ## Reproduction Code [Required] 21 | 22 | 23 | 24 | Steps to reproduce the behavior: 25 | 26 | ## Expected behaviour 27 | 28 | 29 | 30 | ## Actual behaviour 31 | 32 | 33 | 34 | ### Terminal Output Screenshot(s) 35 | 36 | 37 | 38 | ## Additional context 39 | 40 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | --- 5 | 6 | 7 | 8 | ### Community Note 9 | 10 | * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request 11 | * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request 12 | * If you are interested in working on this issue or have submitted a pull request, please leave a comment 13 | 14 | 15 | 16 | #### What is the outcome that you are trying to reach? 17 | 18 | 19 | 20 | #### Describe the solution you would like 21 | 22 | 23 | 24 | #### Describe alternatives you have considered 25 | 26 | 27 | 28 | #### Additional context 29 | 30 | -------------------------------------------------------------------------------- /code/aft-setup/terraform/main.tf: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 
2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | module "aft" { 5 | source = "github.com/aws-ia/terraform-aws-control_tower_account_factory" 6 | 7 | # Required variables 8 | ct_management_account_id = "" 9 | log_archive_account_id = "" 10 | audit_account_id = "" 11 | aft_management_account_id = "" 12 | ct_home_region = "" 13 | 14 | # Optional variables 15 | tf_backend_secondary_region = "" 16 | aft_metrics_reporting = "" 17 | 18 | # AFT Feature flags 19 | aft_feature_cloudtrail_data_events = "" 20 | aft_feature_enterprise_support = "" 21 | aft_feature_delete_default_vpcs_enabled = "" 22 | 23 | # Terraform variables 24 | terraform_version = "" 25 | terraform_distribution = "" 26 | 27 | # VCS variables (only if you are not using AWS CodeCommit) 28 | # vcs_provider = "" 29 | # account_request_repo_name = "/aft-account-request" 30 | # account_customizations_repo_name = "/aft-account-customizations" 31 | # account_provisioning_customizations_repo_name = "/aft-account-provisioning-customizations" 32 | # global_customizations_repo_name = "/aft-global-customizations" 33 | 34 | } 35 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | # Account Factory for Terraform (AFT) bootstrap pipeline 3 | 4 | This repository introduces a streamlined and secure method for deploying Account Factory for Terraform (AFT) within the management account of AWS Organizations. Our approach centers on a CloudFormation template that simplifies and automates the AFT setup process. The template creates a Terraform pipeline designed to be easy to adapt and maintain. 5 | 6 | To protect security and data integrity, the Terraform state file is stored in an S3 bucket with server-side encryption and policies that block public access, safeguarding it against unauthorized access and data breaches. 
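The mechanism behind this is a bootstrap Lambda (shown later in this repository) that renders the Terraform files in memory and packs them into a zip before uploading to S3. A minimal sketch of that in-memory technique follows; the function and parameter names here are illustrative, not the template's actual identifiers:

```python
import io
import zipfile


def build_terraform_zip(state_bucket: str, region: str) -> bytes:
    """Render a minimal backend.tf and pack it into an in-memory zip.

    Mirrors the idea used by the bootstrap Lambda: files are written into
    a BytesIO buffer (no temp files on disk), so the resulting bytes can
    be uploaded straight to S3 with put_object.
    """
    backend_tf = (
        "terraform {\n"
        '  backend "s3" {\n'
        f'    bucket = "{state_bucket}"\n'
        '    key    = "aft-setup.tfstate"\n'
        f'    region = "{region}"\n'
        "  }\n"
        "}\n"
    )
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("terraform/backend.tf", backend_tf)
    return buffer.getvalue()


# Round-trip check: read the archive back and confirm the rendered file.
payload = build_terraform_zip("example-state-bucket", "us-east-1")
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    rendered = zf.read("terraform/backend.tf").decode()
assert "example-state-bucket" in rendered
```

Because the archive never touches disk, the same pattern works inside a short-lived Lambda execution environment without worrying about `/tmp` cleanup.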
7 | 8 | In recognition of the management account's critical role in AWS Control Tower, this solution adheres to AWS best practices. It aims to provide an efficient, secure, and compliant process for AFT deployment. 9 | 10 | The following resources can be deployed by this solution: 11 | 12 | - CodeBuild 13 | - CodeCommit 14 | - CodeConnection 15 | - CodePipeline 16 | - EventBridge 17 | - IAM 18 | - Lambda 19 | - S3 20 | 21 | ## Table of Contents 22 | 23 | - [Diagram](#diagram) 24 | - [Prerequisites](#prerequisites) 25 | - [Limitations](#limitations) 26 | - [Versions](#versions) 27 | - [Security](#security) 28 | - [License](#license) 29 | 30 | ### Diagram 31 | 32 | ![diagram](diagram/aft-bootstrap-pipeline.jpg) 33 | 34 | --- 35 | 36 | ## Prerequisites 37 | 38 | - AWS Accounts: A basic AWS multi-account environment with at least an AWS Management account, AWS Log Archive account, AWS Audit account and one additional account for AFT management. 39 | - AWS Control Tower Setup: An established AWS Control Tower environment is essential. The management account should be properly configured as this template will be deployed within it. 40 | - AWS Account Requirements: Ensure you have the necessary permissions in the AWS management account. You'll need sufficient privileges to create and manage resources like S3 buckets, Lambda functions, IAM roles, and CodePipeline projects. 41 | - Terraform Knowledge: Familiarity with Terraform is crucial. Understanding Terraform's core concepts and workflow is important as the deployment involves generating and managing Terraform configurations. 42 | 43 | ## Limitations 44 | 45 | - Resource Limitations: Be aware of the AWS resource limits in your account. The deployment may create multiple resources, and hitting service limits could impede the process. 46 | - Version Compatibility: The template is designed for specific versions of Terraform and AWS services. Upgrading or changing versions may require template modifications. 
47 | - Self-managed VCSs: This template doesn't support self-managed VCS services, such as GitHub Enterprise. 48 | 49 | ## Versions 50 | 51 | - Terraform: >=1.6.6 52 | - AFT: >=1.11 53 | 54 | ## Security 55 | 56 | See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information. 57 | 58 | ## License 59 | 60 | This library is licensed under the MIT-0 License. See the LICENSE file. 61 | 62 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *main* branch. 27 | 2. 
You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 38 | 39 | GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute to. Since our projects use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public GitHub issue. 
55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution. 60 | -------------------------------------------------------------------------------- /code/lambda/gen_files.py: -------------------------------------------------------------------------------- 1 | """ 2 | This module contains the AWS Lambda function for generating Terraform configuration files. 3 | 4 | The Lambda function defined in this module is triggered by AWS CloudFormation events. 5 | It handles the creation, update, and deletion of resources by generating the necessary Terraform 6 | configuration files and uploading them to an S3 bucket. The function's behavior is controlled by 7 | environment variables and the type of request received from CloudFormation. 8 | 9 | Functions: 10 | handler(event, context): The main function that handles CloudFormation custom resource events. 11 | """ 12 | 13 | import os 14 | import zipfile 15 | import io 16 | 17 | # pylint: disable=import-error 18 | import boto3 19 | import cfnresponse 20 | 21 | 22 | def handler(event, context): 23 | """ 24 | Handle Lambda function invocations for CloudFormation custom resources. 25 | 26 | This function processes CloudFormation custom resource events, generating 27 | Terraform configuration files and uploading them to an S3 bucket, 28 | depending on the environment variables and CloudFormation request type. 29 | 30 | Args: 31 | event (dict): The event passed by the AWS CloudFormation service, 32 | which includes details about the request type (Create, Update, Delete). 33 | context (LambdaContext): Provides runtime information about the Lambda function execution. 34 | 35 | Returns: 36 | None: The function sends a response directly to CloudFormation and does not return a value. 
37 | """ 38 | if event["RequestType"] == "Delete": 39 | cfnresponse.send(event, context, cfnresponse.SUCCESS, {}) 40 | return 41 | 42 | if event["RequestType"] == "Update": 43 | cfnresponse.send(event, context, cfnresponse.SUCCESS, {}) 44 | return 45 | 46 | s3_client = boto3.client("s3") 47 | # pylint: disable=line-too-long 48 | aft_version = ( 49 | "" 50 | if os.environ["AFT_VERSION"] == "latest" 51 | else f"?ref={os.environ['AFT_VERSION']}" 52 | ) 53 | 54 | # Create in-memory zip file 55 | zip_buffer = io.BytesIO() 56 | with zipfile.ZipFile(zip_buffer, "w", zipfile.ZIP_DEFLATED) as zip_file: 57 | 58 | # main.tf content 59 | main_tf_content = f""" 60 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 61 | # SPDX-License-Identifier: Apache-2.0 62 | 63 | module "aft" {{ 64 | source = "github.com/aws-ia/terraform-aws-control_tower_account_factory{aft_version}" 65 | 66 | # Required variables 67 | ct_management_account_id = "{os.environ['MANAGEMENT_ACCOUNT_ID']}" 68 | log_archive_account_id = "{os.environ['LOG_ARCHIVE_ACCOUNT_ID']}" 69 | audit_account_id = "{os.environ['AUDIT_ACCOUNT_ID']}" 70 | aft_management_account_id = "{os.environ['AFT_ACCOUNT_ID']}" 71 | ct_home_region = "{os.environ['HOME_REGION']}" 72 | 73 | # Optional variables 74 | tf_backend_secondary_region = "{os.environ['SECONDARY_REGION']}" 75 | aft_metrics_reporting = "{os.environ['AFT_METRICS_REPORTING']}" 76 | 77 | # AFT Feature flags 78 | aft_feature_cloudtrail_data_events = "{os.environ['AFT_FEATURE_CLOUDTRAIL_DATA_EVENTS']}" 79 | aft_feature_enterprise_support = "{os.environ['AFT_FEATURE_ENTERPRISE_SUPPORT']}" 80 | aft_feature_delete_default_vpcs_enabled = "{os.environ['AFT_FEATURE_DELETE_DEFAULT_VPCS_ENABLED']}" 81 | 82 | # Terraform variables 83 | terraform_version = "{os.environ['TERRAFORM_VERSION']}" 84 | terraform_distribution = "oss" 85 | }} 86 | """ 87 | zip_file.writestr("terraform/main.tf", main_tf_content) 88 | 89 | # backend.tf content 90 | backend_tf_content = f""" 
91 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 92 | # SPDX-License-Identifier: Apache-2.0 93 | 94 | terraform {{ 95 | backend "s3" {{ 96 | bucket = "{os.environ['TF_BACKEND_BUCKET_NAME']}" 97 | key = "aft-setup.tfstate" 98 | region = "{os.environ['HOME_REGION']}" 99 | }} 100 | }} 101 | """ 102 | zip_file.writestr("terraform/backend.tf", backend_tf_content) 103 | 104 | try: 105 | # Move the pointer of zip_buffer to the beginning of the buffer 106 | zip_buffer.seek(0) 107 | # Upload zip file 108 | s3_client.put_object( 109 | Bucket=os.environ["TF_FILES_GEN_BUCKET_NAME"], 110 | Key="terraform_files.zip", 111 | Body=zip_buffer.getvalue(), 112 | ) 113 | 114 | response_data = {} 115 | response_data["BucketName"] = os.environ["TF_FILES_GEN_BUCKET_NAME"] 116 | response_data["FileName"] = "terraform_files.zip" 117 | 118 | cfnresponse.send( 119 | event=event, 120 | context=context, 121 | responseStatus=cfnresponse.SUCCESS, 122 | responseData=response_data, 123 | physicalResourceId="", 124 | noEcho=True, 125 | ) 126 | 127 | except Exception as e: # pylint: disable=broad-exception-caught 128 | print(e) 129 | cfnresponse.send(event, context, cfnresponse.FAILED, {}) 130 | -------------------------------------------------------------------------------- /diagram/aft-bootstrap-pipeline.drawio: -------------------------------------------------------------------------------- -------------------------------------------------------------------------------- /code/aft-deployment-pipeline.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: '2010-09-09' 2 | Description: "AFT bootstrap pipeline" 3 | 4 | Metadata: 5 | AWS::CloudFormation::Interface: 6 | ParameterGroups: 7 | - Label: 8 | default: "AFT (Account Factory for Terraform) bootstrap pipeline" 9 | Parameters: 10 | - pVCSProvider 11 | - pRepositoryName 12 | - pBranchName 13 | - pCodeBuildDockerImage 14 | - Label: 15 | default: "Generate default AFT files? 
[ONLY for CodeCommit]" 16 | Parameters: 17 | - pGenerateAFTFiles 18 | - Label: 19 | default: "AFT deployment file parameters [ONLY for CodeCommit]" 20 | Parameters: 21 | - pLogArchiveAccountID 22 | - pAuditAccountID 23 | - pAFTAccountID 24 | - pAFTMainRegion 25 | - pAFTSecondaryRegion 26 | - pAFTMetricsReporting 27 | - pAFTFeatureCloudTrailDataEvents 28 | - pAFTFeatureEnterpriseSupport 29 | - pAFTFeatureDeleteDefaultVpcsEnabled 30 | - pTerraformVersion 31 | - pAFTVersion 32 | ParameterLabels: 33 | pVCSProvider: 34 | default: VCS Provider 35 | pRepositoryName: 36 | default: Repository Name 37 | pBranchName: 38 | default: Branch Name 39 | pCodeBuildDockerImage: 40 | default: CodeBuild Docker Image 41 | pGenerateAFTFiles: 42 | default: Generate AFT Files 43 | pLogArchiveAccountID: 44 | default: Log Archive Account ID 45 | pAuditAccountID: 46 | default: Audit Account ID 47 | pAFTAccountID: 48 | default: AFT Management Account ID 49 | pAFTMainRegion: 50 | default: AFT Main Region 51 | pAFTSecondaryRegion: 52 | default: AFT Secondary Region 53 | pAFTMetricsReporting: 54 | default: Enable AFT Metrics Reporting 55 | pAFTFeatureCloudTrailDataEvents: 56 | default: Enable AFT CloudTrail Data Events 57 | pAFTFeatureEnterpriseSupport: 58 | default: Enable AFT Enterprise Support 59 | pAFTFeatureDeleteDefaultVpcsEnabled: 60 | default: Enable AFT Delete Default VPCs 61 | pAFTVersion: 62 | default: AFT Version 63 | pTerraformVersion: 64 | default: AFT Terraform Version 65 | 66 | Parameters: 67 | # AFT Bootstrap Pipeline Params 68 | pVCSProvider: 69 | Type: String 70 | Default: GitHub 71 | AllowedValues: 72 | - CodeCommit 73 | - GitHub 74 | - Bitbucket 75 | - GitLab 76 | Description: "Select the VCS provider." 77 | 78 | pRepositoryName: 79 | Type: String 80 | Default: "myorg/aft-setup" 81 | Description: "Enter the source repository name. For external VCS providers, provide the full path, including the organization name. 
(e.g. my-github-org/my-repo)" 82 | 83 | pBranchName: 84 | Type: String 85 | Default: main 86 | Description: "Enter the source repository branch name." 87 | 88 | pCodeBuildDockerImage: 89 | Type: String 90 | Default: "aws/codebuild/amazonlinux2-x86_64-standard:4.0" 91 | Description: "Input the AWS CodeBuild Docker base image version to run the pipeline." 92 | 93 | # Whether to gen files or not 94 | pGenerateAFTFiles: 95 | Type: String 96 | Default: false 97 | AllowedValues: 98 | - true 99 | - false 100 | Description: "Whether to generate the default AFT deployment files. Leave it as false if you don't want to generate the default files." 101 | 102 | # AWS Control Tower Required Params 103 | pLogArchiveAccountID: 104 | Type: String 105 | Description: "Input the Log Archive Account ID in AWS Control Tower." 106 | 107 | pAuditAccountID: 108 | Type: String 109 | Description: "Input the Audit Account ID in AWS Control Tower." 110 | 111 | pAFTAccountID: 112 | Type: String 113 | Description: "Input the AFT Management Account ID." 114 | 115 | pAFTMainRegion: 116 | Type: String 117 | Description: "Input the AFT main region. It must be the same as AWS Control Tower. (e.g. us-east-1)" 118 | 119 | # AFT Optional Params 120 | pAFTSecondaryRegion: 121 | Type: String 122 | Description: "Input the AFT secondary region (e.g. us-west-2)" 123 | 124 | pAFTMetricsReporting: 125 | Type: String 126 | Default: true 127 | AllowedValues: 128 | - true 129 | - false 130 | Description: "Input the AFT metrics reporting option. Select 'true' to enable it. (default: true)" 131 | 132 | # AFT Feature Flags Params 133 | pAFTFeatureCloudTrailDataEvents: 134 | Type: String 135 | Default: false 136 | AllowedValues: 137 | - true 138 | - false 139 | Description: "Input the AFT CloudTrail feature option. Select 'true' to enable it. 
(default: false)" 140 | 141 | pAFTFeatureEnterpriseSupport: 142 | Type: String 143 | Default: false 144 | AllowedValues: 145 | - true 146 | - false 147 | Description: "Input the AFT Enterprise Support feature option. Select 'true' to enable it. (default: false)" 148 | 149 | pAFTFeatureDeleteDefaultVpcsEnabled: 150 | Type: String 151 | Default: false 152 | AllowedValues: 153 | - true 154 | - false 155 | Description: "Input the AFT delete default VPCs feature option. Select 'true' to enable it. (default: false)" 156 | 157 | # Version Params 158 | pAFTVersion: 159 | Type: String 160 | Default: "latest" 161 | Description: "Input the AFT version to be deployed. (default: latest). [see https://github.com/aws-ia/terraform-aws-control_tower_account_factory/releases]" 162 | 163 | pTerraformVersion: 164 | Type: String 165 | Default: "1.6.6" 166 | Description: "Input the Terraform version to be used in the AFT pipelines. (default: 1.6.6)" 167 | 168 | Conditions: 169 | cCreateCodeCommitRepository: !Equals [ !Ref pVCSProvider, "CodeCommit" ] 170 | cGenerateAFTFiles: !And [!Condition cCreateCodeCommitRepository, !Equals [ !Ref pGenerateAFTFiles, true ]] 171 | cUseExternalVCS: !Not [!Condition cCreateCodeCommitRepository] 172 | 173 | Resources: 174 | rTerraformFilesBucket: 175 | #checkov:skip=CKV_AWS_18:There is no need to enable logging on this bucket. 
Condition: cGenerateAFTFiles 177 | Type: AWS::S3::Bucket 178 | DeletionPolicy: Retain 179 | UpdateReplacePolicy: Retain 180 | Properties: 181 | BucketEncryption: 182 | ServerSideEncryptionConfiguration: 183 | - ServerSideEncryptionByDefault: 184 | SSEAlgorithm: 'AES256' 185 | BucketName: !Sub "${AWS::StackName}-tf-files-generator-${AWS::AccountId}" 186 | PublicAccessBlockConfiguration: 187 | BlockPublicAcls: true 188 | BlockPublicPolicy: true 189 | IgnorePublicAcls: true 190 | RestrictPublicBuckets: true 191 | VersioningConfiguration: 192 | Status: Enabled 193 | Metadata: 194 | cfn_nag: 195 | rules_to_suppress: 196 | - id: W35 197 | reason: "There is no need to enable logging on this bucket." 198 | - id: W51 199 | reason: "This bucket will be accessible only within the account to store the terraform files" 200 | 201 | rTerraformFilesGenerator: 202 | #checkov:skip=CKV_AWS_117:This lambda cannot run inside the vpc because it is intended to run in the management account 203 | #checkov:skip=CKV_AWS_116:This lambda doesn't need a DLQ because it runs as part of the CloudFormation deployment 204 | #checkov:skip=CKV_AWS_173:The default AWS encryption is enough for this lambda 205 | #checkov:skip=CKV_AWS_115:This is a one-time Lambda execution, it doesn't need reserved concurrent executions 206 | Condition: cGenerateAFTFiles 207 | DependsOn: rTerraformFilesBucket 208 | Type: AWS::Lambda::Function 209 | Properties: 210 | Handler: index.handler 211 | Role: !GetAtt rTerraformFilesLambdaExecutionRole.Arn 212 | Runtime: python3.10 213 | Timeout: 120 214 | FunctionName: !Sub "${AWS::StackName}-tf-files-generator" 215 | Environment: 216 | Variables: 217 | MANAGEMENT_ACCOUNT_ID: !Ref "AWS::AccountId" 218 | LOG_ARCHIVE_ACCOUNT_ID: !Ref pLogArchiveAccountID 219 | AUDIT_ACCOUNT_ID: !Ref pAuditAccountID 220 | AFT_ACCOUNT_ID: !Ref pAFTAccountID 221 | HOME_REGION: !Ref pAFTMainRegion 222 | SECONDARY_REGION: !Ref pAFTSecondaryRegion 223 | AFT_METRICS_REPORTING: !Ref pAFTMetricsReporting 224 | 
AFT_FEATURE_CLOUDTRAIL_DATA_EVENTS: !Ref pAFTFeatureCloudTrailDataEvents 225 | AFT_FEATURE_ENTERPRISE_SUPPORT: !Ref pAFTFeatureEnterpriseSupport 226 | AFT_FEATURE_DELETE_DEFAULT_VPCS_ENABLED: !Ref pAFTFeatureDeleteDefaultVpcsEnabled 227 | AFT_VERSION: !Ref pAFTVersion 228 | TERRAFORM_VERSION: !Ref pTerraformVersion 229 | TF_BACKEND_BUCKET_NAME: !Sub "${AWS::StackName}-tf-backend-${AWS::AccountId}" 230 | TF_FILES_GEN_BUCKET_NAME: !Sub "${AWS::StackName}-tf-files-generator-${AWS::AccountId}" 231 | Code: 232 | ZipFile: | 233 | """ 234 | This module contains the AWS Lambda function for generating Terraform configuration files. 235 | 236 | The Lambda function defined in this module is triggered by AWS CloudFormation events. 237 | It handles the creation, update, and deletion of resources by generating the necessary Terraform 238 | configuration files and uploading them to an S3 bucket. The function's behavior is controlled by 239 | environment variables and the type of request received from CloudFormation. 240 | 241 | Functions: 242 | handler(event, context): The main function that handles CloudFormation custom resource events. 243 | """ 244 | import os 245 | import zipfile 246 | import io 247 | # pylint: disable=import-error 248 | import boto3 249 | import cfnresponse 250 | 251 | def handler(event, context): 252 | """ 253 | Handle Lambda function invocations for CloudFormation custom resources. 254 | 255 | This function processes CloudFormation custom resource events, generating 256 | Terraform configuration files and uploading them to an S3 bucket, 257 | depending on the environment variables and CloudFormation request type. 258 | 259 | Args: 260 | event (dict): The event passed by the AWS CloudFormation service, 261 | which includes details about the request type (Create, Update, Delete). 262 | context (LambdaContext): Provides runtime information about the Lambda function execution. 
263 | 264 | Returns: 265 | None: The function sends a response directly to CloudFormation and does not return a value. 266 | """ 267 | if event['RequestType'] == 'Delete': 268 | cfnresponse.send(event, context, cfnresponse.SUCCESS, {}) 269 | return 270 | 271 | if event['RequestType'] == 'Update': 272 | cfnresponse.send(event, context, cfnresponse.SUCCESS, {}) 273 | return 274 | 275 | s3_client = boto3.client('s3') 276 | aft_version = "" if os.environ['AFT_VERSION'] == 'latest' else f"?ref={os.environ['AFT_VERSION']}" 277 | 278 | # Create in-memory zip file 279 | zip_buffer = io.BytesIO() 280 | with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zip_file: 281 | 282 | # main.tf content 283 | main_tf_content = f''' 284 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 285 | # SPDX-License-Identifier: Apache-2.0 286 | 287 | module "aft" {{ 288 | source = "github.com/aws-ia/terraform-aws-control_tower_account_factory{aft_version}" 289 | 290 | # Required variables 291 | ct_management_account_id = "{os.environ['MANAGEMENT_ACCOUNT_ID']}" 292 | log_archive_account_id = "{os.environ['LOG_ARCHIVE_ACCOUNT_ID']}" 293 | audit_account_id = "{os.environ['AUDIT_ACCOUNT_ID']}" 294 | aft_management_account_id = "{os.environ['AFT_ACCOUNT_ID']}" 295 | ct_home_region = "{os.environ['HOME_REGION']}" 296 | 297 | # Optional variables 298 | tf_backend_secondary_region = "{os.environ['SECONDARY_REGION']}" 299 | aft_metrics_reporting = "{os.environ['AFT_METRICS_REPORTING']}" 300 | 301 | # AFT Feature flags 302 | aft_feature_cloudtrail_data_events = "{os.environ['AFT_FEATURE_CLOUDTRAIL_DATA_EVENTS']}" 303 | aft_feature_enterprise_support = "{os.environ['AFT_FEATURE_ENTERPRISE_SUPPORT']}" 304 | aft_feature_delete_default_vpcs_enabled = "{os.environ['AFT_FEATURE_DELETE_DEFAULT_VPCS_ENABLED']}" 305 | 306 | # Terraform variables 307 | terraform_version = "{os.environ['TERRAFORM_VERSION']}" 308 | terraform_distribution = "oss" 309 | }} 310 | ''' 311 | 
zip_file.writestr('terraform/main.tf', main_tf_content) 312 | 313 | # backend.tf content 314 | backend_tf_content = f''' 315 | # Copyright Amazon.com, Inc. or its affiliates. All rights reserved. 316 | # SPDX-License-Identifier: Apache-2.0 317 | 318 | terraform {{ 319 | backend "s3" {{ 320 | bucket = "{os.environ['TF_BACKEND_BUCKET_NAME']}" 321 | key = "aft-setup.tfstate" 322 | region = "{os.environ['HOME_REGION']}" 323 | }} 324 | }} 325 | ''' 326 | zip_file.writestr('terraform/backend.tf', backend_tf_content) 327 | 328 | try: 329 | # Move the pointer of zip_buffer to the beginning of the buffer 330 | zip_buffer.seek(0) 331 | # Upload zip file 332 | s3_client.put_object( 333 | Bucket=os.environ['TF_FILES_GEN_BUCKET_NAME'], 334 | Key='terraform_files.zip', 335 | Body=zip_buffer.getvalue() 336 | ) 337 | 338 | response_data = {} 339 | response_data['BucketName'] = os.environ['TF_FILES_GEN_BUCKET_NAME'] 340 | response_data['FileName'] = 'terraform_files.zip' 341 | 342 | cfnresponse.send( 343 | event=event, 344 | context=context, 345 | responseStatus=cfnresponse.SUCCESS, 346 | responseData=response_data, 347 | physicalResourceId='', 348 | noEcho=True 349 | ) 350 | except Exception as e: # pylint: disable=broad-exception-caught 351 | print(e) 352 | cfnresponse.send(event, context, cfnresponse.FAILED, {}) 353 | Metadata: 354 | cfn_nag: 355 | rules_to_suppress: 356 | - id: W89 357 | reason: "This lambda cannot run inside the vpc because it is intended to run in the management account" 358 | 359 | rTerraformFilesLambdaExecutionRole: 360 | Condition: cGenerateAFTFiles 361 | Type: AWS::IAM::Role 362 | Properties: 363 | AssumeRolePolicyDocument: 364 | Version: '2012-10-17' 365 | Statement: 366 | - Effect: Allow 367 | Principal: 368 | Service: [lambda.amazonaws.com] 369 | Action: ['sts:AssumeRole'] 370 | Policies: 371 | - PolicyName: LambdaS3AccessPolicy 372 | PolicyDocument: 373 | Version: '2012-10-17' 374 | Statement: 375 | - Effect: Allow 376 | Action: 377 | - s3:PutObject 378 
| Resource: !Sub 'arn:${AWS::Partition}:s3:::${rTerraformFilesBucket}/*' 379 | - PolicyName: LambdaCloudWatchLogsPolicy 380 | PolicyDocument: 381 | Version: '2012-10-17' 382 | Statement: 383 | - Effect: Allow 384 | Action: 385 | - logs:CreateLogGroup 386 | Resource: !Sub 'arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:*' 387 | - Effect: Allow 388 | Action: 389 | - logs:CreateLogStream 390 | - logs:PutLogEvents 391 | Resource: !Sub 'arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/lambda/${AWS::StackName}-tf-files-generator:*' 392 | 393 | rTerraformFilesGeneratorTrigger: 394 | Condition: cGenerateAFTFiles 395 | Type: Custom::TerraformFilesGenerator 396 | Properties: 397 | ServiceToken: !GetAtt rTerraformFilesGenerator.Arn 398 | 399 | rCodeCommitAftRepo: 400 | Type: AWS::CodeCommit::Repository 401 | Condition: cCreateCodeCommitRepository 402 | DeletionPolicy: Retain 403 | UpdateReplacePolicy: Retain 404 | Properties: 405 | RepositoryName: !Ref pRepositoryName 406 | RepositoryDescription: The AFT setup repository 407 | Code: !If 408 | - cGenerateAFTFiles 409 | - S3: 410 | Bucket: !GetAtt rTerraformFilesGeneratorTrigger.BucketName 411 | Key: !GetAtt rTerraformFilesGeneratorTrigger.FileName 412 | - !Ref AWS::NoValue 413 | 414 | rCodeConnection: 415 | Condition: cUseExternalVCS 416 | Type: AWS::CodeConnections::Connection 417 | Properties: 418 | ConnectionName: "aft-vcs-connection" 419 | ProviderType: !Ref pVCSProvider 420 | 421 | rTerraformBackendBucket: 422 | #checkov:skip=CKV_AWS_18:There is no need to enable access logging on this bucket. 
423 | Type: AWS::S3::Bucket 424 | DeletionPolicy: Retain 425 | UpdateReplacePolicy: Retain 426 | Properties: 427 | BucketEncryption: 428 | ServerSideEncryptionConfiguration: 429 | - ServerSideEncryptionByDefault: 430 | SSEAlgorithm: 'AES256' 431 | BucketName: !Sub "${AWS::StackName}-tf-backend-${AWS::AccountId}" 432 | PublicAccessBlockConfiguration: 433 | BlockPublicAcls: true 434 | BlockPublicPolicy: true 435 | IgnorePublicAcls: true 436 | RestrictPublicBuckets: true 437 | VersioningConfiguration: 438 | Status: Enabled 439 | Metadata: 440 | cfn_nag: 441 | rules_to_suppress: 442 | - id: W35 443 | reason: "There is no need to enable access logging on this bucket." 444 | - id: W51 445 | reason: "This bucket will be accessible only within the account to store the terraform state" 446 | 447 | rCodePipelineArtifactBucket: 448 | #checkov:skip=CKV_AWS_18:There is no need to enable access logging on this bucket. 449 | Type: AWS::S3::Bucket 450 | DeletionPolicy: Retain 451 | UpdateReplacePolicy: Retain 452 | Properties: 453 | BucketEncryption: 454 | ServerSideEncryptionConfiguration: 455 | - ServerSideEncryptionByDefault: 456 | SSEAlgorithm: 'AES256' 457 | BucketName: !Sub "${AWS::StackName}-codepipeline-artifacts-${AWS::AccountId}" 458 | PublicAccessBlockConfiguration: 459 | BlockPublicAcls: true 460 | BlockPublicPolicy: true 461 | IgnorePublicAcls: true 462 | RestrictPublicBuckets: true 463 | VersioningConfiguration: 464 | Status: Enabled 465 | Metadata: 466 | cfn_nag: 467 | rules_to_suppress: 468 | - id: W35 469 | reason: "There is no need to enable access logging on this bucket." 
470 | - id: W51 471 | reason: "This bucket will be accessible only within the account to store the pipeline artifacts" 472 | 473 | rCodePipelineServiceRole: 474 | Type: AWS::IAM::Role 475 | Properties: 476 | AssumeRolePolicyDocument: 477 | Version: 2012-10-17 478 | Statement: 479 | - Effect: Allow 480 | Principal: 481 | Service: codepipeline.amazonaws.com 482 | Action: sts:AssumeRole 483 | Path: / 484 | Policies: 485 | - PolicyName: CodePipelinePermissions 486 | PolicyDocument: 487 | Version: 2012-10-17 488 | Statement: 489 | - Effect: Allow 490 | Action: 491 | - codecommit:GetBranch 492 | - codecommit:GetCommit 493 | - codecommit:UploadArchive 494 | - codecommit:GetUploadArchiveStatus 495 | - codecommit:CancelUploadArchive 496 | Resource: !Sub "arn:${AWS::Partition}:codecommit:${AWS::Region}:${AWS::AccountId}:${pRepositoryName}" 497 | - Effect: Allow 498 | Action: 499 | - codebuild:StartBuild 500 | - codebuild:BatchGetBuilds 501 | Resource: 502 | - !GetAtt rCodeBuildTerraformPlan.Arn 503 | - !GetAtt rCodeBuildTerraformApply.Arn 504 | - Effect: Allow 505 | Action: 506 | - s3:PutObject 507 | - s3:GetObject 508 | - s3:GetObjectVersion 509 | - s3:GetBucketVersioning 510 | Resource: !Sub arn:${AWS::Partition}:s3:::${rCodePipelineArtifactBucket}/* 511 | - !If 512 | - cUseExternalVCS 513 | - Effect: Allow 514 | Action: 515 | - codestar-connections:UseConnection 516 | Resource: !Ref rCodeConnection 517 | - !Ref "AWS::NoValue" 518 | Metadata: 519 | cfn_nag: 520 | rules_to_suppress: 521 | - id: F38 522 | reason: "Wildcard is required for these IAM actions" 523 | - id: W11 524 | reason: "Wildcard is required for these IAM actions" 525 | 526 | rCodeBuildServiceRole: 527 | #checkov:skip=CKV_AWS_111:Wildcard is required for these IAM actions 528 | Type: AWS::IAM::Role 529 | Properties: 530 | AssumeRolePolicyDocument: 531 | Version: 2012-10-17 532 | Statement: 533 | - Effect: Allow 534 | Principal: 535 | Service: codebuild.amazonaws.com 536 | Action: sts:AssumeRole 537 | Path: / 
538 | Policies: 539 | - PolicyName: LogAccess 540 | PolicyDocument: 541 | Version: 2012-10-17 542 | Statement: 543 | - Effect: Allow 544 | Action: 545 | - logs:CreateLogGroup 546 | - logs:CreateLogStream 547 | - logs:PutLogEvents 548 | Resource: !Sub "arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:*" 549 | - PolicyName: S3Access 550 | PolicyDocument: 551 | Version: 2012-10-17 552 | Statement: 553 | - Effect: Allow 554 | Action: 555 | - s3:List* 556 | - s3:GetObject 557 | - s3:GetObjectVersion 558 | - s3:GetBucketVersioning 559 | - s3:PutObject 560 | Resource: 561 | - !Sub "arn:${AWS::Partition}:s3:::${rCodePipelineArtifactBucket}" 562 | - !Sub "arn:${AWS::Partition}:s3:::${rCodePipelineArtifactBucket}/*" 563 | - !Sub "arn:${AWS::Partition}:s3:::${rTerraformBackendBucket}" 564 | - !Sub "arn:${AWS::Partition}:s3:::${rTerraformBackendBucket}/*" 565 | - Effect: Allow 566 | Action: 567 | - s3:DeleteObject 568 | Resource: 569 | - !Sub "arn:${AWS::Partition}:s3:::${rTerraformBackendBucket}/*.tflock" 570 | - PolicyName: CodeBuildPermissions 571 | PolicyDocument: 572 | Version: 2012-10-17 573 | Statement: 574 | - Effect: Allow 575 | Action: 576 | - ec2:CreateNetworkInterfacePermission 577 | Resource: !Sub "arn:${AWS::Partition}:ec2:${AWS::Region}:${AWS::AccountId}:network-interface/*" 578 | Condition: 579 | StringEquals: 580 | ec2:AuthorizedService: "codebuild.amazonaws.com" 581 | - Effect: Allow 582 | Action: 583 | - ec2:CreateNetworkInterface 584 | - ec2:DescribeDhcpOptions 585 | - ec2:DescribeNetworkInterfaces 586 | - ec2:DeleteNetworkInterface 587 | - ec2:DescribeSubnets 588 | - ec2:DescribeSecurityGroups 589 | - ec2:DescribeVpcs 590 | Resource: !Sub "arn:${AWS::Partition}:ec2:${AWS::Region}:${AWS::AccountId}:*" 591 | - Effect: Allow 592 | Action: 593 | - organizations:Describe* 594 | - organizations:List* 595 | Resource: "*" 596 | - PolicyName: IAMPermissions 597 | PolicyDocument: 598 | Version: 2012-10-17 599 | Statement: 600 | - Effect: Allow 601 | 
Action: 602 | - sts:AssumeRole 603 | Resource: !Sub "arn:${AWS::Partition}:iam::*:role/AWSControlTowerExecution" 604 | - Effect: Allow 605 | Action: 606 | - iam:PassRole 607 | Resource: !Sub "arn:${AWS::Partition}:iam::${AWS::AccountId}:role/aft-*" 608 | - Effect: Allow 609 | Action: 610 | - iam:CreateRole 611 | - iam:TagRole 612 | - iam:AttachRolePolicy 613 | - iam:PutRolePolicy 614 | - iam:List* 615 | - iam:Get* 616 | Resource: 617 | - !Sub "arn:${AWS::Partition}:iam::${AWS::AccountId}:role/aft-control-tower-events-rule" 618 | - !Sub "arn:${AWS::Partition}:iam::${AWS::AccountId}:role/AWSAFTExecution" 619 | - !Sub "arn:${AWS::Partition}:iam::${AWS::AccountId}:role/AWSAFTService" 620 | - PolicyName: EventBridgePermissions 621 | PolicyDocument: 622 | Version: 2012-10-17 623 | Statement: 624 | - Effect: Allow 625 | Action: 626 | - events:List* 627 | - events:DeleteRule 628 | - events:DisableRule 629 | - events:EnableRule 630 | - events:PutTargets 631 | - events:RemoveTargets 632 | Resource: 633 | - !Sub "arn:${AWS::Partition}:events:*:${AWS::AccountId}:rule/aft-capture-ct-events" 634 | - Effect: Allow 635 | Action: 636 | - events:DescribeRule 637 | - events:TagResource 638 | - events:UntagResource 639 | - events:PutRule 640 | Resource: "*" 641 | Condition: 642 | StringEqualsIfExists: 643 | events:creatorAccount: "${aws:PrincipalAccount}" 644 | - PolicyName: SSMParameterPermissions 645 | PolicyDocument: 646 | Version: 2012-10-17 647 | Statement: 648 | - Effect: Allow 649 | Action: 650 | - ssm:GetParameter 651 | - ssm:GetParameters 652 | - ssm:GetParametersByPath 653 | Resource: !Sub "arn:${AWS::Partition}:ssm:*:*:parameter/*" 654 | Metadata: 655 | cfn_nag: 656 | rules_to_suppress: 657 | - id: F38 658 | reason: "Wildcard is required for these IAM actions" 659 | - id: W11 660 | reason: "Wildcard is required for these IAM actions" 661 | 662 | rCodeBuildTerraformPlan: 663 | Type: AWS::CodeBuild::Project 664 | Properties: 665 | Artifacts: 666 | Type: CODEPIPELINE 667 | 
Environment: 668 | ComputeType: BUILD_GENERAL1_SMALL 669 | Type: LINUX_CONTAINER 670 | Image: !Ref pCodeBuildDockerImage 671 | PrivilegedMode: true 672 | EnvironmentVariables: 673 | - Name: TERRAFORM_VERSION 674 | Value: !Ref pTerraformVersion 675 | # - Name: REPOSITORY_NAME 676 | # Value: !Ref pRepositoryName 677 | # - Name: REPOSITORY_BRANCH 678 | # Value: !Ref pBranchName 679 | Name: "aft-bootstrap-pipeline-tf-plan" 680 | ServiceRole: !Ref rCodeBuildServiceRole 681 | Source: 682 | Type: CODEPIPELINE 683 | BuildSpec: | 684 | version: 0.2 685 | phases: 686 | install: 687 | commands: 688 | - | 689 | set -e 690 | echo $TERRAFORM_VERSION 691 | echo "Installing Terraform" 692 | cd /tmp 693 | curl -s -o terraform_${TERRAFORM_VERSION}_linux_amd64.zip https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip 694 | unzip -q -o terraform_${TERRAFORM_VERSION}_linux_amd64.zip 695 | mv terraform /usr/local/bin/ 696 | terraform -no-color --version 697 | pre_build: 698 | on-failure: ABORT 699 | commands: 700 | - | 701 | cd $CODEBUILD_SRC_DIR/terraform 702 | echo "Initializing and validating terraform" 703 | terraform fmt -no-color 704 | terraform init -no-color 705 | terraform validate -no-color 706 | build: 707 | on-failure: ABORT 708 | commands: 709 | - | 710 | echo "Running terraform plan" 711 | cd $CODEBUILD_SRC_DIR/terraform && terraform plan -no-color -out output.tfplan 712 | post_build: 713 | commands: 714 | - echo "Terraform plan completed successfully" 715 | artifacts: 716 | files: 717 | - '**/*' 718 | Metadata: 719 | cfn_nag: 720 | rules_to_suppress: 721 | - id: W32 722 | reason: "CodeBuild is using the managed CMK for Amazon Simple Storage Service (Amazon S3) by default" 723 | 724 | rCodeBuildTerraformApply: 725 | Type: AWS::CodeBuild::Project 726 | Properties: 727 | Artifacts: 728 | Type: CODEPIPELINE 729 | Environment: 730 | ComputeType: BUILD_GENERAL1_SMALL 731 | Type: LINUX_CONTAINER 732 | Image: !Ref pCodeBuildDockerImage 733 | PrivilegedMode: true 734 
| EnvironmentVariables: 735 | - Name: TERRAFORM_VERSION 736 | Value: !Ref pTerraformVersion 737 | # - Name: REPOSITORY_NAME 738 | # Value: !Ref pRepositoryName 739 | # - Name: REPOSITORY_BRANCH 740 | # Value: !Ref pBranchName 741 | Name: "aft-bootstrap-pipeline-tf-apply" 742 | ServiceRole: !Ref rCodeBuildServiceRole 743 | Source: 744 | Type: CODEPIPELINE 745 | BuildSpec: | 746 | version: 0.2 747 | phases: 748 | install: 749 | commands: 750 | - | 751 | set -e 752 | echo $TERRAFORM_VERSION 753 | echo "Installing Terraform" 754 | cd /tmp 755 | curl -s -o terraform_${TERRAFORM_VERSION}_linux_amd64.zip https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip 756 | unzip -q -o terraform_${TERRAFORM_VERSION}_linux_amd64.zip 757 | mv terraform /usr/local/bin/ 758 | terraform -no-color --version 759 | build: 760 | on-failure: ABORT 761 | commands: 762 | - | 763 | cd $CODEBUILD_SRC_DIR/terraform 764 | echo "Running terraform apply" 765 | terraform apply -no-color -input=false "output.tfplan" 766 | post_build: 767 | commands: 768 | - echo "AFT setup deployed successfully" 769 | artifacts: 770 | files: 771 | - '**/*' 772 | Metadata: 773 | cfn_nag: 774 | rules_to_suppress: 775 | - id: W32 776 | reason: "CodeBuild is using the managed CMK for Amazon Simple Storage Service (Amazon S3) by default" 777 | 778 | rCodePipeline: 779 | Type: AWS::CodePipeline::Pipeline 780 | Properties: 781 | ArtifactStore: 782 | Type: S3 783 | Location: !Ref rCodePipelineArtifactBucket 784 | Name: "aft-bootstrap-pipeline" 785 | RoleArn: !GetAtt rCodePipelineServiceRole.Arn 786 | Stages: 787 | - Name: Source 788 | Actions: 789 | - !If 790 | - cCreateCodeCommitRepository 791 | - Name: !Ref pVCSProvider 792 | ActionTypeId: 793 | Category: Source 794 | Owner: AWS 795 | Version: 1 796 | Provider: CodeCommit 797 | Configuration: 798 | RepositoryName: !GetAtt rCodeCommitAftRepo.Name 799 | BranchName: !Ref pBranchName 800 
PollForSourceChanges: false 801 | OutputArtifacts: 802 | - Name: code 803 | RunOrder: 1 804 | - Name: !Ref pVCSProvider 805 | ActionTypeId: 806 | Category: Source 807 | Owner: AWS 808 | Version: 1 809 | Provider: CodeStarSourceConnection 810 | Configuration: 811 | ConnectionArn: !Ref rCodeConnection 812 | FullRepositoryId: !Ref pRepositoryName 813 | BranchName: !Ref pBranchName 814 | OutputArtifacts: 815 | - Name: code 816 | RunOrder: 1 817 | - Name: Build 818 | Actions: 819 | - Name: terraform-plan 820 | ActionTypeId: 821 | Category: Build 822 | Owner: AWS 823 | Version: 1 824 | Provider: CodeBuild 825 | Configuration: 826 | ProjectName: !Ref rCodeBuildTerraformPlan 827 | InputArtifacts: 828 | - Name: code 829 | OutputArtifacts: 830 | - Name: plan 831 | RunOrder: 1 832 | - Name: Approval 833 | Actions: 834 | - Name: approve 835 | ActionTypeId: 836 | Category: Approval 837 | Owner: AWS 838 | Version: '1' 839 | Provider: Manual 840 | Configuration: 841 | CustomData: "Check the terraform plan output and approve before the changes are implemented." 
842 | - Name: Deploy 843 | Actions: 844 | - Name: terraform-apply 845 | ActionTypeId: 846 | Category: Build 847 | Owner: AWS 848 | Version: 1 849 | Provider: CodeBuild 850 | Configuration: 851 | ProjectName: !Ref rCodeBuildTerraformApply 852 | InputArtifacts: 853 | - Name: plan 854 | OutputArtifacts: 855 | - Name: apply 856 | RunOrder: 1 857 | 858 | rEventRulePipelineCodeChange: 859 | Type: AWS::Events::Rule 860 | Condition: cCreateCodeCommitRepository 861 | Properties: 862 | Name: aft-bootstrap-pipeline-code-change-trigger 863 | Description: "Rule to trigger the CodePipeline based on code changes" 864 | EventPattern: 865 | source: 866 | - "aws.codecommit" 867 | detail-type: 868 | - "CodeCommit Repository State Change" 869 | resources: 870 | - !Sub "arn:${AWS::Partition}:codecommit:${AWS::Region}:${AWS::AccountId}:${pRepositoryName}" 871 | detail: 872 | event: 873 | - referenceCreated 874 | - referenceUpdated 875 | referenceType: 876 | - branch 877 | referenceName: 878 | - !Ref pBranchName 879 | Targets: 880 | - Arn: !Sub "arn:${AWS::Partition}:codepipeline:${AWS::Region}:${AWS::AccountId}:${rCodePipeline}" 881 | RoleArn: !GetAtt rEventsRuleRole.Arn 882 | Id: aft-bootstrap-pipeline-target 883 | 884 | rEventsRuleRole: 885 | Type: AWS::IAM::Role 886 | Condition: cCreateCodeCommitRepository 887 | Properties: 888 | AssumeRolePolicyDocument: 889 | Version: 2012-10-17 890 | Statement: 891 | - Effect: Allow 892 | Principal: 893 | Service: 894 | - events.amazonaws.com 895 | Action: sts:AssumeRole 896 | Path: / 897 | Policies: 898 | - PolicyName: StartPipelineExecution 899 | PolicyDocument: 900 | Version: 2012-10-17 901 | Statement: 902 | - Effect: Allow 903 | Action: codepipeline:StartPipelineExecution 904 | Resource: !Sub "arn:${AWS::Partition}:codepipeline:${AWS::Region}:${AWS::AccountId}:${rCodePipeline}" 905 | 906 | Outputs: 907 | TerraformBackendBucketName: 908 | Value: !Ref rTerraformBackendBucket --------------------------------------------------------------------------------