├── LICENSE
├── README.md
├── cloudformation-pipeline
│   ├── README.md
│   ├── cfn-devsecops.zip
│   ├── cfn-security-pipeline.yml
│   ├── cloudformation-pipeline-architecture.jpg
│   └── src
│       ├── buildspecs
│       │   ├── cfnnag-buildspec.yml
│       │   ├── cfripper-buildspec.yml
│       │   ├── checkov-createstack-buildspec.yml
│       │   ├── detect-secrets-ejection.py
│       │   ├── lint-buildspec.yaml
│       │   └── secrets-buildspec.yml
│       ├── params
│       │   └── parameters.json
│       └── templates
│           └── template.yml
├── docker-k8s-double-decker
│   └── docker-k8s-double-decker-architecture.jpg
├── docker-pipeline-snyk
│   └── docker-pipeline-snyk-architecture.jpg
├── docker-pipeline-wss
│   ├── README.md
│   ├── docker-devsecops.zip
│   ├── docker-pipeline-wss-architecture.jpg
│   ├── docker-secdevops.yaml
│   └── src
│       ├── artifacts
│       │   ├── Dockerfile
│       │   └── requirements.txt
│       └── buildspecs
│           ├── dagda-buildspec.yaml
│           ├── dagda-parse.py
│           ├── detect-secrets-ejection.py
│           ├── docker-compose.yml
│           ├── findingDetonator.py
│           ├── lint-buildspec.yaml
│           ├── secrets-buildspec.yaml
│           ├── trivy-buildspec.yaml
│           ├── wss-buildspec.yaml
│           └── wss-unified-agent.config
├── flask-pipeline
│   └── flask-pipeline-architecture.jpg
├── golang-pipeline
│   └── .gitkeep
├── golden-ami-pipeline
│   ├── GoldenAMIPipeline_AMZL2_CFN.yaml
│   ├── GoldenAMIPipeline_Ubuntu18_CFN.yaml
│   ├── GoldenAMIPipeline_WS19_CFN.yaml
│   ├── README.md
│   └── golden-ami-pipeline.jpg
├── k8s-pipeline
│   ├── K8s_DevSecOps_Pipeline_CFN.yaml
│   ├── README.md
│   ├── k8s-devsecops.zip
│   ├── k8s-pipeline-architecure.jpg
│   └── src
│       ├── artifacts
│       │   └── deployment.yaml
│       ├── buildspecs
│       │   ├── deployment-buildspec.yaml
│       │   ├── detect-secrets-ejection.py
│       │   ├── polaris-buildspec.yaml
│       │   ├── secrets-buildspec.yaml
│       │   └── skan-buildspec.yaml
│       └── security-hub
│           ├── polarisAsff.py
│           ├── secretsAsff.py
│           └── skanAsff.py
└── terraform-pipeline
    ├── README.md
    ├── src
    │   ├── artifacts
    │   │   ├── main.tf
    │   │   ├── provider.template
    │   │   └── variables.tf
    │   ├── buildspecs
    │   │   ├── checkov-buildspec.yaml
    │   │   ├── detect-secrets-ejection.py
    │   │   ├── secrets-buildspec.yaml
    │   │   ├── terraform-buildspec.yaml
    │   │   ├── tflint-buildspec.yaml
    │   │   └── tfsec-buildspec.yaml
    │   └── security-hub
    │       ├── checkovAsff.py
    │       ├── secretsAsff.py
    │       ├── tflintAsff.py
    │       └── tfsecAsff.py
    ├── terraform-pipeline-architecture.jpg
    ├── tf-devsecops.zip
    └── tf-secdevops.yaml
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types.
34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. 
Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # AWS-DevSecOps-Factory 2 | Sample DevSecOps pipelines (heavily biased on the "Sec") for various stacks and tools using open-source security tools and AWS native services. Stop in, pick what you want, add your own! 3 | 4 | AS OF 6 DEC 2020 THIS REPO IS NOW PUBLIC, IT IS AN UNFINISHED PROJECT!! 5 | 6 | ## Table of Contents 7 | - [Description](https://github.com/jonrau1/AWS-DevSecOps-Factory#description) 8 | - [How to use](https://github.com/jonrau1/AWS-DevSecOps-Factory#how-to-use-this-repository) 9 | - [Capability set](https://github.com/jonrau1/AWS-DevSecOps-Factory#capability-set-this-will-be-subject-to-change) 10 | - [Pipelines](https://github.com/jonrau1/AWS-DevSecOps-Factory#pipelines) 11 | - [FAQ](https://github.com/jonrau1/AWS-DevSecOps-Factory#faq) 12 | 13 | ## Description 14 | The AWS-DevSecOps-Factory is a consolidation of a variety of work I had done to create DevSecOps pipelines using AWS native tools. 
In reality these are more like automated AppSec pipelines that you would bolt on to the start of your release train. That approach requires the least maintenance, and ideally you would commit artifacts supported by the pipeline to be scanned before creating a release candidate from them. This repository will continually grow (hopefully through outside contributions) and is focused on using open-source security tools to deliver functionality. At times I will use commercial or "freemium" tools (commercial tools with a functional free tier). Being an AWS solutions library, all security-related findings from the various tools will be parsed and written to Security Hub to tie your SecOps people and processes closer to your DevSecOps development groups. 15 | 16 | Given that this repository is heavily biased toward security, it is very worthwhile to [give this whitepaper a read](https://d0.awsstatic.com/whitepapers/DevOps/practicing-continuous-integration-continuous-delivery-on-AWS.pdf) for a broader picture of traditional "DevOps" (see, actual testing). 17 | 18 | ## How to use this repository 19 | Each available pipeline has an architecture diagram, and a link to the directory containing the code is provided as a hyperlink. The subdirectories have a more detailed walkthrough of steps, prerequisites and deployment considerations. You can deploy the solution with CloudFormation after uploading a ZIP archive to an S3 bucket, or you can go into the `/src/` subdirectory of each solution to view the raw files (example artifacts, `buildspec`, `appspec` and Python scripts).
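For the CloudFormation pipeline, for example, that deployment flow could look like the sketch below. The bucket and stack names are placeholders, and `CAPABILITY_IAM` is assumed because the template creates IAM roles; the commands are wrapped in a function so they can be reviewed before running.

```shell
# Sketch only — substitute your own bucket and stack names.
deploy_pipeline() {
  # Upload the initial-commit package to S3
  aws s3 cp cfn-devsecops.zip "s3://my-artifact-bucket/cfn-devsecops.zip"
  # Create the pipeline stack from the template in this repository
  aws cloudformation create-stack \
    --stack-name cfn-devsecops-pipeline \
    --template-body file://cfn-security-pipeline.yml \
    --capabilities CAPABILITY_IAM \
    --parameters \
      ParameterKey=InitialCommitBucket,ParameterValue=my-artifact-bucket \
      ParameterKey=InitialCommitKey,ParameterValue=cfn-devsecops
}
```

The `InitialCommitBucket` and `InitialCommitKey` parameter names come from `cfn-security-pipeline.yml`; note that the key is passed without the `.zip` suffix, as the template's parameter description requires.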
20 | 21 | ## Capability set (this will be subject to change) 22 | 23 | ### Security Tools 24 | - **Secret detection**: [Detect-Secrets](https://github.com/Yelp/detect-secrets) 25 | - **Linting**: [TFLint](https://github.com/terraform-linters/tflint), [cfn-python-lint](https://github.com/aws-cloudformation/cfn-python-lint), [Hadolint](https://github.com/hadolint/hadolint) 26 | - **Platform SAST**: [TFSec](https://github.com/liamg/tfsec), [Checkov](https://github.com/bridgecrewio/checkov), [Cfn-nag](https://github.com/stelligent/cfn_nag), [Cfripper](https://github.com/Skyscanner/cfripper), [Polaris](https://github.com/FairwindsOps/polaris), [sKan](https://github.com/alcideio/skan) 27 | - **Code-specific SAST**: [Bandit](https://github.com/PyCQA/bandit), [Gosec](https://github.com/securego/gosec) 28 | - **OSSec / License management**: [Snyk](https://github.com/snyk/snyk), [Whitesource](https://github.com/whitesource/agents), [OWASP DependencyCheck](https://github.com/jeremylong/DependencyCheck) (via Dagda) 29 | - **Vulnerability management**: [Trivy](https://github.com/aquasecurity/trivy), [Dagda](https://github.com/eliasgranderubio/dagda) 30 | - **Anti-virus / anti-malware**: [Dagda](https://github.com/eliasgranderubio/dagda), [ClamAV](https://www.clamav.net/documents/clam-antivirus-user-manual) 31 | - **Operating System Assurance**: [DISA STIG](https://docs.aws.amazon.com/imagebuilder/latest/userguide/image-builder-stig.html), [Amazon Inspector](https://docs.aws.amazon.com/inspector/latest/userguide/inspector_introduction.html) (for CIS Benchmarks) 32 | 33 | ### Development Tools 34 | - **Infrastructure-as-Code**: [AWS CloudFormation](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html) 35 | - **Source code management**: [AWS CodeCommit](https://docs.aws.amazon.com/codecommit/latest/userguide/welcome.html) 36 | - **Continuous integration**: [AWS CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/welcome.html) 37 | -
**Continuous delivery**: [AWS CodePipeline](https://docs.aws.amazon.com/codepipeline/latest/userguide/welcome.html), [AWS EC2 Image Builder](https://aws.amazon.com/image-builder/) 38 | - **Continuous deployment**: [AWS CodeDeploy](https://d0.awsstatic.com/whitepapers/DevOps/practicing-continuous-integration-continuous-delivery-on-AWS.pdf) 39 | - **Secrets management**: [AWS Systems Manager Parameter Store](https://docs.aws.amazon.com/systems-manager/latest/userguide/systems-manager-parameter-store.html) 40 | - **Artifact management**: [AWS S3](https://docs.aws.amazon.com/AmazonS3/latest/gsg/GetStartedWithS3.html) (artifacts in this context refer to code that is shared between your CodePipeline stages) 41 | 42 | ## Pipelines 43 | 44 | ### CloudFormation DevSecOps Pipeline 45 | ![CloudFormation DevSecOps Architecture](/cloudformation-pipeline/cloudformation-pipeline-architecture.jpg) 46 | 47 | [**Start Here**](/cloudformation-pipeline) 48 | 49 | ### Terraform DevSecOps Pipeline 50 | ![Terraform DevSecOps Architecture](/terraform-pipeline/terraform-pipeline-architecture.jpg) 51 | 52 | [**Start Here**](/terraform-pipeline) 53 | 54 | ### Docker image DevSecOps Pipeline (using Whitesource) 55 | ![Docker-DevSecOps-WSS](/docker-pipeline-wss/docker-pipeline-wss-architecture.jpg) 56 | 57 | [**Start Here**](/docker-pipeline-wss) 58 | 59 | ### Docker image DevSecOps Pipeline (using Snyk) 60 | ![Docker-DevSecOps-Snyk](/docker-pipeline-snyk/docker-pipeline-snyk-architecture.jpg) 61 | 62 | [**Start Here**](/docker-pipeline-snyk) 63 | 64 | ### Kubernetes deployment DevSecOps Pipeline 65 | ![Kubernetes DevSecOps Architecture](/k8s-pipeline/k8s-pipeline-architecure.jpg) 66 | 67 | [**Start Here**](/k8s-pipeline) 68 | 69 | ### Docker + Kubernetes 2-stager DevSecOps Pipeline 70 | ![Docker-K8s-Architecture](/docker-k8s-double-decker/docker-k8s-double-decker-architecture.jpg) 71 | 72 | [**Start Here**](/docker-k8s-double-decker) 73 | 74 | ### Flask DevSecOps Pipeline 75 | ![Flask
DevSecOps Architecture](/flask-pipeline/flask-pipeline-architecture.jpg) 76 | 77 | [**Start Here**](/flask-pipeline) 78 | 79 | ### Go application DevSecOps Pipeline 80 | Architecture TODO 81 | 82 | [**Start Here**](/golang-pipeline) 83 | 84 | ### Golden AMI Pipelines 85 | ![Golden AMI Pipelines Architecture](/golden-ami-pipeline/golden-ami-pipeline.jpg) 86 | 87 | [**Start Here**](/golden-ami-pipeline) 88 | 89 | ## FAQ 90 | 91 | ### 1. What is DevSecOps? How different is it from DevOps? 92 | Simply put, DevSecOps stands for “development, security and operations”. From the standpoint of principles and a high-level technology stack (e.g. automation, continuous integration / continuous delivery (CI/CD)), DevSecOps is not much different from DevOps. The goal of this ideology is to make security a shared responsibility in development and operations teams and carry over all the cultural benefits that DevOps seeks to boil up to the top such as agility, innovation, and speed. DevSecOps is synonymous in this aspect with “shifting security left” and that statement can be taken literally: automation and CI/CD see the addition of security-focused tools where the main benefit is you may reduce or totally eliminate any misconfigurations or vulnerabilities from your software by the time it reaches production. 93 | 94 | DevSecOps seeks to address the security gaps that a DevOps culture introduced due either to the inexperience and relative immaturity of the security organization, the DevOps organization, or both. An inexperienced security organization may have hampered velocity due to the introduction of manual stage gates such as threat modeling or long lead times of a traditional application security (AppSec) organization. An immature DevOps organization may have completely sacrificed any security testing in the name of speed only to introduce more vulnerabilities or misconfigurations than if they had taken the time to do it. 95 | 96 | ### 2. What leads to success in DevSecOps?
97 | DevSecOps, like DevOps, is a cultural change and will require adherence to a core set of people, process, and technologies. Before starting your journey, you should consider what your software looks like today and what it may look like 3, 6 or 12 months down the road. If available, you should also record your current technical debt and any vulnerabilities or other deficiencies surfaced by your AppSec, vulnerability management and/or cloud security posture management programs. These considerations must be taken into account as they will define where you will concentrate your work before or during your shift to DevSecOps. For instance, if you have a lot of vulnerabilities due to a high amount of technical debt and your product roadmap is dictating a move to the cloud due to ROI or the perceived competitiveness it may bring to bear, you should take all of that into account. 98 | 99 | Taking that previous example further, you should identify if you will be refactoring completely or upgrading to the latest versions and re-hosting / re-platforming; that will dictate what people need to be hired and what tools need to be purchased and/or built. If you will be undergoing the DevSecOps shift just for the sake of doing it, without the added complexities of refactoring or reducing technical debt, you should benchmark your current posture and vulnerabilities to measure success against. Like DevOps, DevSecOps is also key performance indicator (KPI) driven and those should be created before investing the time in starting. Just adding some tools like Bandit or Black Duck into a toolchain is not “doing DevSecOps” – there must be measurement and accountability. 100 | 101 | Needless to say, you need to “pick the right tool for the right job” – if you are moving from on-premise to a cloud service provider you should invest in learning about cloud-native (e.g. Azure DevOps or AWS CodeBuild) solutions and in picking security and testing tools that align with your platform.
Are you refactoring to .NET Core? You should find tools that support static analysis against .NET Core projects. With that in mind, are you using packages that ***must*** run on Windows? If not, consider learning how to package and deploy software onto Linux using the `dotnet` CLI or `msbuild` - this OS shift can possibly bring ROI over Windows-based deployments. 102 | 103 | Lastly, you should agree on a delivery lifecycle apparatus that will coincide with your DevSecOps journey. If you are traditionally delivering software in slow quarterly iterations, that may not be the right approach to achieve agility – you should consider training your product or project managers in an agile methodology such as Scrum or Kanban. Will you take any technical debt with you? What is your risk appetite – is not breaking builds on Medium CVEs in line with your security program? Will you build in time to focus solely on remediation of vulnerabilities? All these things and more need to be weighed before starting and continually adjusted throughout your journey. 104 | 105 | ### 3. What are guiding principles or concepts about DevSecOps? 106 | As touched on in #2, you should first define some KPIs as well as key risk indicators (KRIs) to set a baseline when you start your DevSecOps journey. These KPIs and KRIs should be allowed to adjust over time as there are many variables that can bring you above or below those bars. Sometimes those KPIs and KRIs can be matched together, such as measuring that you are using the latest patches or major releases while also tracking if that move introduces new vulnerabilities or 0-days. Your KRIs can measure operational risk, cyber risk or both, and that should be in lockstep with how your security program looks at risk, the appetite of risk and the acceptance (if any) of risk to and for the business. 107 | 108 | Next, you should define a set of principles or tenets you are trying to achieve during your journey.
These can be placed onto a Wiki or a Confluence page for your DevSecOps engineers and maybe your larger product ecosystem (or your whole company, preferably). Transparency is a large part of cultural success and setting public tenets or principles is a small piece of that. These can be very high-level such as committing to the minimum necessary usage of packages, the least privilege of all IAM roles / agreeing to never use an AWS managed policy or the strict guidance that there are never to be secrets in code. Ideally these principles, like your KPIs/KRIs, can be adjusted over time but are forward looking (and maybe aspirational). Do you hardcode credentials in your `boto3` clients? Then it may make sense to boldly declare the elimination of secrets and refactoring of your code to support that. 109 | 110 | The last concept that should be tackled is your security toolchain, which is typically made up of AppSec or vulnerability management type tools. There should be some tools that are non-negotiable and tools that are workload dependent. Just like in DevOps where you would have multiple pipelines to create different pieces of your software it is no different in DevSecOps. At a minimum you should seek to implement open-source or commercial static analysis (aka SCA or SAST) tools to find deficiencies or vulnerabilities in your source code and dependencies. You should also consider the usage of linters and utilities to find secrets in your code base; these will help avoid breaking builds due to simple syntactic errors and committing hardcoded credentials into your final product, respectively. Other platform-specific tooling or container security tools are more “boutique” and should only be used if you will use the platforms or intend to. 111 | 112 | ### 4. What does DevSecOps done wrong look like? Are there anti-patterns to look for or pitfalls to avoid? 113 | DevSecOps (or even plain DevOps) done wrong is declaring that you “do it” without the necessary cultural commitment.
Just using a SAST or vulnerability management tool in your pipeline is not DevSecOps; that is more like DevOps with a side of AppSec. DevSecOps is about the saturation of security principles and tools into your development cycles and taking ownership of any vulnerabilities found. Funneling deficiencies into a central organization for remediation (either your AppSec or Technical Operations (TechOps) organizations) is not DevSecOps. This journey comes with knowing that your team is responsible for everything, including security, and sacrificing security, or just running the tests and not acting on them, is not good enough. 114 | 115 | The first anti-pattern is the usage of tools for the sake of usage. Running security tools while overriding the failure exit codes or suppressing all checks is not conducive to a successful DevSecOps journey. The only time a finding should be suppressed or ignored is if it is verified to be a true false positive or a massive performance hit. The latter requires a formal risk acceptance; this is better handled by an enterprise or information risk management group (ERM/IRM) but you should include it as a logged item to revisit. 116 | 117 | Another anti-pattern is running through redundant tools just to pad the usage of security tools. While there is value that can be found from using overlapping SAST tools or supplementing a commercial tool with a similar open-source one, just stacking security tests into your pipeline should be avoided. This can lead to longer build times or the surfacing of false positives or false negatives; you should always examine your tools for a fit and supplement or remove only when there is overwhelming evidence that it will lead to efficiencies. 118 | 119 | A final example anti-pattern is not committing fully to a transparent culture.
While some dislike the “name and shame” mantra that has become associated with gamification or publicizing results, this is an invaluable tool to drive accountability (and even some friendly competition) among teams. This does not always need to be from a centralized organization view; if you have the skills to create dashboards or automated narratives in your own team, you should do it. Dashboarding and transparency let you and everyone else on your team / organization see if you are abiding by your KPIs/KRIs, tenets and principles. This does not need to be a negative – it is perfectly okay to re-base on real world performance. DevSecOps is called a journey for a reason: there are no dogmas within; every team will have a different experience and should remain flexible without sacrificing accountability in the adjustment of the supporting apparatus. 120 | 121 | ### 5. What other elements or codes of practice should I incorporate in my DevSecOps program that aren't necessarily called out? 122 | (Warning, this is heavily opinionated). While not widely talked about, you should consider branching your DevSecOps journey out more broadly to your “traditional” centralized IT organization and other parts of the security organization – especially threat intelligence, IRM/ERM and legal. Your centralized IT organization may be responsible for things such as OS builds, maintaining Active Directory or general networking operations (hello Direct Connect and Transit Gateway). Those organizations should either be encouraged to adopt their own DevSecOps journey / methodology or co-opted directly or in part by your current one. Doing this helps solve two things – the outside dependency the “Ops” folks in your DevSecOps teams may run into (adding a route, opening a firewall rule, updating an IDS signature) and the inclusivity that comes with a positive DevSecOps culture.
It is more likely your network operations team does not have the tools or training to be as agile as you are and not that they hate your guts. Getting these folks into ceremonies or otherwise included can be greatly beneficial for those reasons. 123 | 124 | As far as the security and legal organizations go, there are multiple reasons why you may want to include them. Your threat intelligence group can let you know if there are threat actors or other activity groups with the motivation and capability to target your product; you can use this to implement new tests, adopt a host-based security tool or even accelerate your refactoring to move your application out of the means of the attackers. You can also ask them for indicators of compromise (IOCs) such as hashes, IP addresses or domains to add to your existing perimeter protection tools and to have the “Ops” part of your team monitor for. 125 | 126 | Talking with your ERM/IRM teams can help keep your KRIs in alignment, proactively surface any risk acceptances and keep you in the loop for any risk-based decisions or changes to risk appetite. It is likely that cyber risk is being driven in full or in part by your CISO (and maybe General Counsel / CLO) and that CISO may have gaps in the skills needed to determine the real inherent or residual risk introduced by your decisions. This is the same reason to bring in your legal team – likely they are responsible for writing Master Services Agreements (MSAs), answering 3rd party questionnaires and assessing 3rd party risk. It is helpful to talk to them in plain terms about what your 11-stage pipeline brings (“We are able to reduce or eliminate all vulnerabilities in our cloud environment and software packages”) and how they can answer a questionnaire.
You can also help them with evaluating a new commercial tool you want to bring in, or even lend your expertise to writing more stringent MSAs or reviewing one sent by a 3rd party (customer or another business) who consumes your services. 127 | 128 | /opinion time over 129 | -------------------------------------------------------------------------------- /cloudformation-pipeline/README.md: -------------------------------------------------------------------------------- 1 | # CloudFormation Security Scanning - CodeSuite 2 | Sample implementation of a CloudFormation security testing pipeline using the AWS CodeSuite. CodeCommit is used as the SCM, CodeBuild projects are used as the CI servers and CodePipeline is the CD automation engine which will start builds as new code is pushed (directly or via PR) to the Master branch. The final build stage also creates stacks from the template and separate parameter files. 3 | 4 | This pipeline lints CloudFormation templates using `python-cfn-lint` and looks for regex and high-entropy based secrets/sensitive values using Detect-Secrets. Static security analysis is performed against the templates using CFRipper, Cfn-nag and Checkov. The final stage (Checkov) will also use the AWS CLI to create and deploy the stack(s). 5 | 6 | ## Before you start 7 | All CloudFormation templates will need to be placed in the `/templates` directory regardless of their language. This is done to separate them from external `parameters.json` files, which would otherwise fail builds with syntax errors when the scanners read them. This is also done to future-proof the solution in case you use JSON as your CloudFormation template language. **If you do use JSON** you will need to go through the `buildspec` files and change some of the tool commands to scan JSON templates, as this is set up for YAML only right now.
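To sanity-check that layout before committing, you can stage the three directories and build an archive with `/templates`, `/params` and `/buildspecs` at the root. A hypothetical sketch using Python's `zipfile` module (the file contents here are stand-ins for your real templates, parameter files and buildspecs):

```python
import os
import zipfile

# Stage the expected root-level layout (copy your real files over these stand-ins)
for directory, name, body in [
    ("templates", "template.yml", "AWSTemplateFormatVersion: 2010-09-09\n"),
    ("params", "parameters.json", "[]\n"),
    ("buildspecs", "lint-buildspec.yaml", "version: 0.2\n"),
]:
    os.makedirs(directory, exist_ok=True)
    with open(os.path.join(directory, name), "w") as f:
        f.write(body)

# Archive the three directories at the root -- not the /src parent directory
with zipfile.ZipFile("cfn-devsecops.zip", "w") as zf:
    for directory in ("templates", "params", "buildspecs"):
        for root, _, files in os.walk(directory):
            for name in files:
                zf.write(os.path.join(root, name))

# Confirm every entry sits under templates/, params/ or buildspecs/
with zipfile.ZipFile("cfn-devsecops.zip") as zf:
    assert all(
        n.split("/")[0] in ("templates", "params", "buildspecs")
        for n in zf.namelist()
    )
```

The final assertion is the point: if any entry starts with `src/` or another prefix, the initial commit will not have the structure the buildspecs expect.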
8 | 9 | It is recommended that you create a ZIP archive of your own templates, parameter files, modified CodeBuild `buildspec` files and any other helper scripts. You will need to make an archive with `/templates`, `/params` and `/buildspecs` at the root; **do not** send the whole `/src` directory to a ZIP archive. 10 | 11 | ## Getting Started 12 | Clone this repository and upload `cfn-devsecops.zip` to a bucket of your choosing. Deploy a stack from `cfn-security-pipeline.yml` in your AWS account; all necessary artifacts will be pushed as the first commit to the created CodeCommit repository. 13 | 14 | **Important Note:** Modify the permissions of the CodeBuild Role in `cfn-security-pipeline.yml` to give it permissions for whatever you will be deploying from your deployment stack. For example, if you will be deploying EKS clusters or ECS Services you should modify the permissions to include `eks` or `ecs` actions. -------------------------------------------------------------------------------- /cloudformation-pipeline/cfn-devsecops.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/cloudformation-pipeline/cfn-devsecops.zip -------------------------------------------------------------------------------- /cloudformation-pipeline/cfn-security-pipeline.yml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Deploys a sample CodePipeline to scan and deploy CFN templates 3 | Parameters: 4 | InitialCommitBucket: 5 | Type: String 6 | Description: The name of the S3 bucket containing the package for the initial commit for the DevSecOps pipeline 7 | InitialCommitKey: 8 | Type: String 9 | Description: Name of the package for the initial commit for the DevSecOps pipeline DO NOT include .zip 10 | Default: cfn-devsecops 11 | ExampleStackName: 12 |
Type: String 13 | Description: Name of the stack to be deployed in the final stage. Placeholder value and will likely not scale in prod 14 | Default: SecDevOpsStack 15 | Resources: 16 | DevSecOpsCICDCodeCommit: 17 | Type: AWS::CodeCommit::Repository 18 | Properties: 19 | RepositoryDescription: Contains the CloudFormation templates and other scripts for the DevSecOps CFN scanning CICD pipeline 20 | RepositoryName: cfn-devsecops 21 | Code: 22 | S3: 23 | Bucket: !Ref InitialCommitBucket 24 | Key: !Sub '${InitialCommitKey}.zip' 25 | CodeBuildServiceRole: 26 | Type: AWS::IAM::Role 27 | Properties: 28 | ManagedPolicyArns: 29 | - arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser 30 | Policies: 31 | - PolicyName: CodeBuildServiceRolePolicy 32 | PolicyDocument: 33 | Version: 2012-10-17 34 | Statement: 35 | - Effect: Allow 36 | Action: 37 | - codecommit:GitPull 38 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 39 | - Effect: Allow 40 | Action: 41 | - logs:CreateLogGroup 42 | - logs:CreateLogStream 43 | - logs:PutLogEvents 44 | # these permissions are to create the stack within the pipeline 45 | - s3:CreateBucket 46 | - codecommit:CreateRepository 47 | Resource: '*' 48 | - Effect: Allow 49 | Action: 50 | - s3:GetObject 51 | - s3:GetObjectVersion 52 | - s3:PutObject 53 | - s3:GetBucketAcl 54 | - s3:GetBucketLocation 55 | Resource: 56 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 57 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 58 | - Effect: Allow 59 | Action: 60 | - kms:Decrypt 61 | - ssm:GetParameter 62 | - ssm:GetParameters 63 | - cloudformation:DetectStackSetDrift 64 | - cloudformation:ListExports 65 | - cloudformation:DescribeStackDriftDetectionStatus 66 | - cloudformation:DetectStackDrift 67 | - cloudformation:ListStackSetOperations 68 | - cloudformation:ListStackInstances 69 | - cloudformation:ListTypes 70 | - cloudformation:DescribeStackResource 71 | - cloudformation:UpdateStackSet 72 | - cloudformation:CreateChangeSet 73
| - cloudformation:CreateStackInstances 74 | - cloudformation:ListTypeRegistrations 75 | - cloudformation:ListStackSetOperationResults 76 | - cloudformation:DetectStackResourceDrift 77 | - cloudformation:EstimateTemplateCost 78 | - cloudformation:DescribeStackEvents 79 | - cloudformation:DescribeStackSetOperation 80 | - cloudformation:UpdateStack 81 | - cloudformation:DescribeAccountLimits 82 | - cloudformation:DescribeChangeSet 83 | - cloudformation:CreateStackSet 84 | - cloudformation:ExecuteChangeSet 85 | - cloudformation:ListStackResources 86 | - cloudformation:SetStackPolicy 87 | - cloudformation:ListStacks 88 | - cloudformation:DescribeType 89 | - cloudformation:ListImports 90 | - cloudformation:DescribeStackInstance 91 | - cloudformation:DescribeStackResources 92 | - cloudformation:DeleteStackSet 93 | - cloudformation:DescribeTypeRegistration 94 | - cloudformation:GetTemplateSummary 95 | - cloudformation:DescribeStacks 96 | - cloudformation:DescribeStackResourceDrifts 97 | - cloudformation:GetStackPolicy 98 | - cloudformation:DescribeStackSet 99 | - cloudformation:ListStackSets 100 | - cloudformation:CreateStack 101 | - cloudformation:GetTemplate 102 | - cloudformation:DeleteStack 103 | - cloudformation:ValidateTemplate 104 | - cloudformation:ListChangeSets 105 | - cloudformation:ListTypeVersions 106 | Resource: '*' 107 | AssumeRolePolicyDocument: 108 | Version: 2012-10-17 109 | Statement: 110 | - Effect: Allow 111 | Principal: { Service: codebuild.amazonaws.com } 112 | Action: 113 | - sts:AssumeRole 114 | CodePipelineServiceRole: 115 | Type: AWS::IAM::Role 116 | Properties: 117 | Policies: 118 | - PolicyName: CodePipelineServiceRolePolicy 119 | PolicyDocument: 120 | Version: 2012-10-17 121 | Statement: 122 | - Effect: Allow 123 | Action: 124 | - codecommit:CancelUploadArchive 125 | - codecommit:GetBranch 126 | - codecommit:GetCommit 127 | - codecommit:GetUploadArchiveStatus 128 | - codecommit:UploadArchive 129 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 
130 | - Effect: Allow 131 | Action: 132 | - cloudwatch:* 133 | Resource: '*' 134 | - Effect: Allow 135 | Action: 136 | - s3:GetObject 137 | - s3:GetObjectVersion 138 | - s3:PutObject 139 | - s3:GetBucketAcl 140 | - s3:GetBucketLocation 141 | - s3:PutBucketPolicy 142 | - s3:ListAllMyBuckets 143 | - s3:ListBucket 144 | Resource: 145 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 146 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 147 | - Effect: Allow 148 | Action: 149 | - codebuild:BatchGetBuilds 150 | - codebuild:StartBuild 151 | Resource: 152 | - !GetAtt LintingStage.Arn 153 | - !GetAtt SecretScanStage.Arn 154 | - !GetAtt CfRipperStage.Arn 155 | - !GetAtt CfnNagStage.Arn 156 | - !GetAtt CheckovDeploymentStage.Arn 157 | AssumeRolePolicyDocument: 158 | Version: 2012-10-17 159 | Statement: 160 | - Effect: Allow 161 | Principal: { Service: codepipeline.amazonaws.com } 162 | Action: 163 | - sts:AssumeRole 164 | DevSecOpsCICDCodePipelineArtifactBucket: 165 | Type: AWS::S3::Bucket 166 | Properties: 167 | BucketName: !Sub 'devsecopscicd-artifacts-${AWS::AccountId}' 168 | PublicAccessBlockConfiguration: 169 | BlockPublicAcls: true 170 | BlockPublicPolicy: true 171 | IgnorePublicAcls: true 172 | RestrictPublicBuckets: true 173 | VersioningConfiguration: 174 | Status: Enabled 175 | BucketEncryption: 176 | ServerSideEncryptionConfiguration: 177 | - ServerSideEncryptionByDefault: 178 | SSEAlgorithm: AES256 179 | LintingStage: 180 | Type: AWS::CodeBuild::Project 181 | Properties: 182 | Artifacts: 183 | Type: CODEPIPELINE 184 | Description: Uses python-cfn-lint to lint CFN Templates - Managed by CloudFormation 185 | Environment: 186 | ComputeType: BUILD_GENERAL1_SMALL 187 | Image: aws/codebuild/standard:4.0 188 | PrivilegedMode: True 189 | Type: LINUX_CONTAINER 190 | LogsConfig: 191 | CloudWatchLogs: 192 | Status: ENABLED 193 | Name: CFN-Linter 194 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 195 | Source: 196 | BuildSpec: 
buildspecs/lint-buildspec.yaml 197 | Type: CODEPIPELINE 198 | SecretScanStage: 199 | Type: AWS::CodeBuild::Project 200 | Properties: 201 | Artifacts: 202 | Type: CODEPIPELINE 203 | Description: Uses Yelp's Detect-Secrets to look for any secrets or sensitive material - Managed by CloudFormation 204 | Environment: 205 | ComputeType: BUILD_GENERAL1_SMALL 206 | Image: aws/codebuild/standard:4.0 207 | PrivilegedMode: True 208 | Type: LINUX_CONTAINER 209 | LogsConfig: 210 | CloudWatchLogs: 211 | Status: ENABLED 212 | Name: CFN-Secrets 213 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 214 | Source: 215 | BuildSpec: buildspecs/secrets-buildspec.yml 216 | Type: CODEPIPELINE 217 | CfRipperStage: 218 | Type: AWS::CodeBuild::Project 219 | Properties: 220 | Artifacts: 221 | Type: CODEPIPELINE 222 | Description: Uses CF-Ripper to perform static security analysis of CloudFormation templates - Managed by CloudFormation 223 | Environment: 224 | ComputeType: BUILD_GENERAL1_MEDIUM 225 | Image: aws/codebuild/standard:4.0 226 | PrivilegedMode: True 227 | Type: LINUX_CONTAINER 228 | LogsConfig: 229 | CloudWatchLogs: 230 | Status: ENABLED 231 | Name: CF-Ripper 232 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 233 | Source: 234 | BuildSpec: buildspecs/cfripper-buildspec.yml 235 | Type: CODEPIPELINE 236 | CfnNagStage: 237 | Type: AWS::CodeBuild::Project 238 | Properties: 239 | Artifacts: 240 | Type: CODEPIPELINE 241 | Description: Uses CFN-Nag to perform static security analysis of CloudFormation templates - Managed by CloudFormation 242 | Environment: 243 | ComputeType: BUILD_GENERAL1_MEDIUM 244 | Image: aws/codebuild/standard:4.0 245 | PrivilegedMode: True 246 | Type: LINUX_CONTAINER 247 | LogsConfig: 248 | CloudWatchLogs: 249 | Status: ENABLED 250 | Name: CFN-Nag 251 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 252 | Source: 253 | BuildSpec: buildspecs/cfnnag-buildspec.yml 254 | Type: CODEPIPELINE 255 | CheckovDeploymentStage: 256 | Type: AWS::CodeBuild::Project 257 | Properties:
258 | Artifacts: 259 | Type: CODEPIPELINE 260 | Description: Uses Checkov to perform static security analysis of CloudFormation templates before creating and deploying a stack - Managed by CloudFormation 261 | Environment: 262 | ComputeType: BUILD_GENERAL1_MEDIUM 263 | Image: aws/codebuild/standard:4.0 264 | PrivilegedMode: True 265 | Type: LINUX_CONTAINER 266 | EnvironmentVariables: 267 | - Name: STACK_NAME 268 | Type: PLAINTEXT 269 | Value: !Ref ExampleStackName 270 | LogsConfig: 271 | CloudWatchLogs: 272 | Status: ENABLED 273 | Name: CheckovDeploymentStage 274 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 275 | Source: 276 | BuildSpec: buildspecs/checkov-createstack-buildspec.yml 277 | Type: CODEPIPELINE 278 | DevSecOpsCICDCodePipeline: 279 | Type: AWS::CodePipeline::Pipeline 280 | Properties: 281 | ArtifactStore: 282 | Location: !Ref DevSecOpsCICDCodePipelineArtifactBucket 283 | Type: S3 284 | Name: !Sub 'DevSecOpsCICD-scan-cicd-pipeline-${AWS::AccountId}' 285 | RestartExecutionOnUpdate: True 286 | RoleArn: !GetAtt CodePipelineServiceRole.Arn 287 | Stages: 288 | - 289 | Name: Source 290 | Actions: 291 | - 292 | Name: SourceAction 293 | ActionTypeId: 294 | Category: Source 295 | Owner: AWS 296 | Version: 1 297 | Provider: CodeCommit 298 | Configuration: 299 | RepositoryName: !GetAtt DevSecOpsCICDCodeCommit.Name 300 | BranchName: master 301 | OutputArtifacts: 302 | - 303 | Name: SourceOutput 304 | RunOrder: 1 305 | - 306 | Name: CFN-Linter 307 | Actions: 308 | - 309 | InputArtifacts: 310 | - 311 | Name: SourceOutput 312 | Name: BuildAction 313 | ActionTypeId: 314 | Category: Build 315 | Owner: AWS 316 | Version: 1 317 | Provider: CodeBuild 318 | Configuration: 319 | ProjectName: !Ref LintingStage 320 | PrimarySource: SourceOutput 321 | RunOrder: 2 322 | - 323 | Name: CFN-Secrets 324 | Actions: 325 | - 326 | InputArtifacts: 327 | - 328 | Name: SourceOutput 329 | Name: BuildAction 330 | ActionTypeId: 331 | Category: Build 332 | Owner: AWS 333 | Version: 1 334 | 
Provider: CodeBuild 335 | Configuration: 336 | ProjectName: !Ref SecretScanStage 337 | PrimarySource: SourceOutput 338 | RunOrder: 3 339 | - 340 | Name: CF-Ripper 341 | Actions: 342 | - 343 | InputArtifacts: 344 | - 345 | Name: SourceOutput 346 | Name: BuildAction 347 | ActionTypeId: 348 | Category: Build 349 | Owner: AWS 350 | Version: 1 351 | Provider: CodeBuild 352 | Configuration: 353 | ProjectName: !Ref CfRipperStage 354 | PrimarySource: SourceOutput 355 | RunOrder: 4 356 | - 357 | Name: CFN-Nag 358 | Actions: 359 | - 360 | InputArtifacts: 361 | - 362 | Name: SourceOutput 363 | Name: BuildAction 364 | ActionTypeId: 365 | Category: Build 366 | Owner: AWS 367 | Version: 1 368 | Provider: CodeBuild 369 | Configuration: 370 | ProjectName: !Ref CfnNagStage 371 | PrimarySource: SourceOutput 372 | RunOrder: 5 373 | - 374 | Name: CheckovDeploymentStage 375 | Actions: 376 | - 377 | InputArtifacts: 378 | - 379 | Name: SourceOutput 380 | Name: BuildAction 381 | ActionTypeId: 382 | Category: Build 383 | Owner: AWS 384 | Version: 1 385 | Provider: CodeBuild 386 | Configuration: 387 | ProjectName: !Ref CheckovDeploymentStage 388 | PrimarySource: SourceOutput 389 | RunOrder: 6 -------------------------------------------------------------------------------- /cloudformation-pipeline/cloudformation-pipeline-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/cloudformation-pipeline/cloudformation-pipeline-architecture.jpg -------------------------------------------------------------------------------- /cloudformation-pipeline/src/buildspecs/cfnnag-buildspec.yml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | runtime-versions: 6 | ruby: latest 7 | commands: 8 | - gem install cfn-nag 9 | build: 10 | commands: 11 | - cfn_nag_scan --input-path 
./templates --output-format json 12 | post_build: 13 | commands: 14 | - echo cfn-nag testing completed on `date` -------------------------------------------------------------------------------- /cloudformation-pipeline/src/buildspecs/cfripper-buildspec.yml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | runtime-versions: 6 | python: 3.7 7 | commands: 8 | - pip3 install boto3 9 | - pip3 install cfripper 10 | - cd templates 11 | build: 12 | commands: 13 | - cfripper --format json --resolve ./*.yml 14 | post_build: 15 | commands: 16 | - echo CFRipper testing completed on `date` -------------------------------------------------------------------------------- /cloudformation-pipeline/src/buildspecs/checkov-createstack-buildspec.yml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - pip3 install --upgrade pip 7 | - pip3 install awscli 8 | - pip3 install checkov 9 | pre_build: 10 | commands: 11 | - checkov -d ./templates -o json 12 | build: 13 | commands: 14 | - aws cloudformation create-stack --stack-name $STACK_NAME --template-body file://templates/template.yml --parameters file://params/parameters.json 15 | post_build: 16 | commands: 17 | - echo Checkov scan completed and Stack created on `date` -------------------------------------------------------------------------------- /cloudformation-pipeline/src/buildspecs/detect-secrets-ejection.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | try: 4 | with open('secret-results.json') as json_file: 5 | data = json.load(json_file) 6 | if data['results']: 7 | exit(1) 8 | except Exception as e: 9 | print(e) 10 | exit(1) --------------------------------------------------------------------------------
/cloudformation-pipeline/src/buildspecs/lint-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - pip3 install boto3 7 | - pip3 install cfn-lint 8 | build: 9 | commands: 10 | - cfn-lint ./templates/*.yml -f json 11 | post_build: 12 | commands: 13 | - echo CFN Linting completed on `date` -------------------------------------------------------------------------------- /cloudformation-pipeline/src/buildspecs/secrets-buildspec.yml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update 7 | - apt install jq -y 8 | - pip3 install detect-secrets 9 | - cp ./buildspecs/detect-secrets-ejection.py ./templates/ 10 | - cd templates 11 | build: 12 | commands: 13 | - detect-secrets scan . --all-files > secret-results.json 14 | - jq . secret-results.json 15 | - python3 detect-secrets-ejection.py 16 | post_build: 17 | commands: 18 | - echo detect-secrets scan completed on `date` -------------------------------------------------------------------------------- /cloudformation-pipeline/src/params/parameters.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "ParameterKey": "CodeCommitName", 4 | "ParameterValue": "monorepo" 5 | }, 6 | { 7 | "ParameterKey": "BucketNamePrefix", 8 | "ParameterValue": "devsecops-bucket" 9 | } 10 | ] -------------------------------------------------------------------------------- /cloudformation-pipeline/src/templates/template.yml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Generic CFN for testing 3 | Parameters: 4 | CodeCommitName: 5 | Type: String 6 | Description: Name of your CodeCommit repo 7 | BucketNamePrefix: 8 | Type: String 9 | Description: Name of your S3 bucket prefix. Must be lower case. 
10 | Resources: 11 | DevSecOpsCICDCodeCommit: 12 | Type: AWS::CodeCommit::Repository 13 | Properties: 14 | RepositoryDescription: Created for CFN DevSecOps Pipeline example 15 | RepositoryName: !Sub '${CodeCommitName}-devsecops-sample' 16 | DevSecOpsExampleBucket: 17 | Type: AWS::S3::Bucket 18 | Properties: 19 | #checkov:skip=CKV_AWS_18:this bucket doesn't need access logging 20 | BucketName: !Sub '${BucketNamePrefix}-bucket-${AWS::AccountId}' 21 | PublicAccessBlockConfiguration: 22 | BlockPublicAcls: true 23 | BlockPublicPolicy: true 24 | IgnorePublicAcls: true 25 | RestrictPublicBuckets: true 26 | VersioningConfiguration: 27 | Status: Enabled 28 | BucketEncryption: 29 | ServerSideEncryptionConfiguration: 30 | - ServerSideEncryptionByDefault: 31 | SSEAlgorithm: AES256 -------------------------------------------------------------------------------- /docker-k8s-double-decker/docker-k8s-double-decker-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/docker-k8s-double-decker/docker-k8s-double-decker-architecture.jpg -------------------------------------------------------------------------------- /docker-pipeline-snyk/docker-pipeline-snyk-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/docker-pipeline-snyk/docker-pipeline-snyk-architecture.jpg -------------------------------------------------------------------------------- /docker-pipeline-wss/README.md: -------------------------------------------------------------------------------- 1 | # Docker Security Scanning - CodeSuite 2 | Sample implementation of a Docker security testing pipeline using the AWS CodeSuite.
CodeCommit is used as the SCM, CodeBuild projects are used as the CI servers and CodePipeline is the CD automation engine which will start builds as new code is pushed (directly or via PR) to the Master branch. Elastic Container Registry is used to store the finalized Docker image. 3 | 4 | This pipeline lints Dockerfiles using Hadolint, looks for regex and high-entropy based secrets/sensitive values using Detect-Secrets and performs software composition analysis (dependency vulnerability testing & license inventory) using Whitesource. Dagda is used to perform anti-virus/anti-malware checks and additional vulnerability testing on the layers and dependencies, drawing on various Red Hat and NVD databases and open-source SAST tools such as GoSec and Bandit. Finally, Trivy is used to perform layer vulnerability analysis on most major parent images before the scanned image is pushed to ECR. 5 | 6 | ## Getting Started 7 | Clone this repository and upload `docker-devsecops.zip` to a bucket of your choosing. You can view the individual CodeBuild `buildspec` files in `src/buildspecs/`. 8 | 9 | Create SSM Parameters for your Whitesource API Key and User Key (for your CI user), as these will be needed for parameters in the CloudFormation template. You will also need to know the name of your Product (e.g. Infosec-dev). 10 | ```bash 11 | aws ssm put-parameter --name wssApiKey --type SecureString --value 12 | aws ssm put-parameter --name wssUserKeyParameter --type SecureString --value 13 | ``` 14 | 15 | Deploy a stack from `docker-secdevops.yaml` in your AWS account; all necessary artifacts will be pushed as the first commit to the created CodeCommit repository. 16 | 17 | To test your own Dockerfile and dependencies, push those artifacts to the `/src/artifacts/` directory. 18 | 19 | **Note:** Modify the WSS config file (`wss-unified-agent.config`) to tailor it to your codebase.
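The Detect-Secrets stage ejects the build via a small helper (`src/buildspecs/detect-secrets-ejection.py`). A minimal sketch of that gating pattern, assuming Detect-Secrets writes a `results` map that is empty when nothing is flagged:

```python
import json
import sys

def has_findings(path: str) -> bool:
    """Return True if a detect-secrets results file flags any secrets."""
    with open(path) as json_file:
        data = json.load(json_file)
    # detect-secrets emits {"results": {...}} keyed by file name;
    # an empty map means nothing was flagged
    return bool(data.get("results"))

# Demo with a synthetic results file; the pipeline writes the real one with
# `detect-secrets scan . --all-files > secret-results.json`
with open("secret-results.json", "w") as f:
    json.dump({"results": {}}, f)

if has_findings("secret-results.json"):
    sys.exit(1)  # eject the build
print("no secrets detected")
```

CodeBuild marks a build failed on any non-zero exit code, so the `sys.exit(1)` is what stops the pipeline before the image is pushed to ECR.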
-------------------------------------------------------------------------------- /docker-pipeline-wss/docker-devsecops.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/docker-pipeline-wss/docker-devsecops.zip -------------------------------------------------------------------------------- /docker-pipeline-wss/docker-pipeline-wss-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/docker-pipeline-wss/docker-pipeline-wss-architecture.jpg -------------------------------------------------------------------------------- /docker-pipeline-wss/docker-secdevops.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Deploys a sample CodePipeline to scan Docker images and push them to ECR 3 | Parameters: 4 | InitialCommitBucket: 5 | Type: String 6 | Description: The name of the S3 bucket containing the package for the initial commit for the DevSecOps pipeline 7 | InitialCommitKey: 8 | Type: String 9 | Description: Name of the package for the initial commit for the DevSecOps pipeline DO NOT include .zip 10 | Default: docker-devsecops 11 | WssProductName: 12 | Type: String 13 | Description: Your Whitesource Product Name; this value is case sensitive 14 | WssProjectName: 15 | Type: String 16 | Description: Your Whitesource Project Name to report against 17 | Default: Whitesource-Codebuild 18 | WssUserKeyParameter: 19 | Type: String 20 | Description: The SSM Parameter that contains your WSS User Key 21 | WssApiKeyParameter: 22 | Type: String 23 | Description: The SSM Parameter that contains your WSS API Key 24 | Resources: 25 | DevSecOpsCICDECR: 26 | Type: AWS::ECR::Repository 27 | Properties: 28 | RepositoryName:
docker-devsecops-ecr 29 | DevSecOpsCICDCodeCommit: 30 | Type: AWS::CodeCommit::Repository 31 | Properties: 32 | RepositoryDescription: Contains all artifacts needed for a Docker security scanning DevSecOps pipeline - Managed by CloudFormation 33 | RepositoryName: docker-devsecops 34 | Code: 35 | S3: 36 | Bucket: !Ref InitialCommitBucket 37 | Key: !Sub '${InitialCommitKey}.zip' 38 | CodeBuildServiceRole: 39 | Type: AWS::IAM::Role 40 | Properties: 41 | ManagedPolicyArns: 42 | - arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser 43 | Policies: 44 | - PolicyName: CodeBuildServiceRolePolicy 45 | PolicyDocument: 46 | Version: 2012-10-17 47 | Statement: 48 | - Effect: Allow 49 | Action: 50 | - codecommit:GitPull 51 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 52 | - Effect: Allow 53 | Action: 54 | - s3:GetObject 55 | - s3:GetObjectVersion 56 | - s3:PutObject 57 | - s3:GetBucketAcl 58 | - s3:GetBucketLocation 59 | Resource: 60 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 61 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 62 | - Effect: Allow 63 | Action: 64 | - kms:Decrypt 65 | - ssm:GetParameter 66 | - ssm:GetParameters 67 | - logs:CreateLogGroup 68 | - logs:CreateLogStream 69 | - logs:PutLogEvents 70 | Resource: '*' 71 | - Effect: Allow 72 | Action: 73 | - ecr:GetDownloadUrlForLayer 74 | - ecr:BatchGetImage 75 | - ecr:BatchCheckLayerAvailability 76 | - ecr:PutImage 77 | - ecr:InitiateLayerUpload 78 | - ecr:UploadLayerPart 79 | - ecr:CompleteLayerUpload 80 | - ecr:GetAuthorizationToken 81 | Resource: !GetAtt DevSecOpsCICDECR.Arn 82 | AssumeRolePolicyDocument: 83 | Version: 2012-10-17 84 | Statement: 85 | - Effect: Allow 86 | Principal: { Service: codebuild.amazonaws.com } 87 | Action: 88 | - sts:AssumeRole 89 | CodePipelineServiceRole: 90 | Type: AWS::IAM::Role 91 | Properties: 92 | Policies: 93 | - PolicyName: CodePipelineServiceRolePolicy 94 | PolicyDocument: 95 | Version: 2012-10-17 96 | Statement: 97 | - Effect: 
Allow 98 | Action: 99 | - codecommit:CancelUploadArchive 100 | - codecommit:GetBranch 101 | - codecommit:GetCommit 102 | - codecommit:GetUploadArchiveStatus 103 | - codecommit:UploadArchive 104 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 105 | - Effect: Allow 106 | Action: 107 | - cloudwatch:* 108 | - ssm:GetParameter 109 | - ssm:GetParameters 110 | - kms:Decrypt 111 | Resource: '*' 112 | - Effect: Allow 113 | Action: 114 | - s3:GetObject 115 | - s3:GetObjectVersion 116 | - s3:PutObject 117 | - s3:GetBucketAcl 118 | - s3:GetBucketLocation 119 | - s3:PutBucketPolicy 120 | - s3:ListAllMyBuckets 121 | - s3:ListBucket 122 | Resource: 123 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 124 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 125 | - Effect: Allow 126 | Action: 127 | - codebuild:BatchGetBuilds 128 | - codebuild:StartBuild 129 | Resource: 130 | - !GetAtt LintingStage.Arn 131 | - !GetAtt SecretScanStage.Arn 132 | - !GetAtt WhitesourceStage.Arn 133 | - !GetAtt DagdaStage.Arn 134 | - !GetAtt TrivyStage.Arn 135 | AssumeRolePolicyDocument: 136 | Version: 2012-10-17 137 | Statement: 138 | - Effect: Allow 139 | Principal: { Service: codepipeline.amazonaws.com } 140 | Action: 141 | - sts:AssumeRole 142 | DevSecOpsCICDCodePipelineArtifactBucket: 143 | Type: AWS::S3::Bucket 144 | Properties: 145 | BucketName: !Sub 'docker-devsecopscicd-artifacts-${AWS::AccountId}' 146 | PublicAccessBlockConfiguration: 147 | BlockPublicAcls: true 148 | BlockPublicPolicy: true 149 | IgnorePublicAcls: true 150 | RestrictPublicBuckets: true 151 | VersioningConfiguration: 152 | Status: Enabled 153 | BucketEncryption: 154 | ServerSideEncryptionConfiguration: 155 | - ServerSideEncryptionByDefault: 156 | SSEAlgorithm: AES256 157 | LintingStage: 158 | Type: AWS::CodeBuild::Project 159 | Properties: 160 | Artifacts: 161 | Type: CODEPIPELINE 162 | Description: Uses Hadolint to lint Dockerfiles - Managed by CloudFormation 163 | Environment: 164 |
ComputeType: BUILD_GENERAL1_SMALL 165 | Image: aws/codebuild/standard:4.0 166 | PrivilegedMode: True 167 | Type: LINUX_CONTAINER 168 | LogsConfig: 169 | CloudWatchLogs: 170 | Status: ENABLED 171 | Name: Docker-Linter 172 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 173 | Source: 174 | BuildSpec: buildspecs/lint-buildspec.yaml 175 | Type: CODEPIPELINE 176 | SecretScanStage: 177 | Type: AWS::CodeBuild::Project 178 | Properties: 179 | Artifacts: 180 | Type: CODEPIPELINE 181 | Description: Uses Yelp's Detect-Secrets to look for any secrets or sensitive material - Managed by CloudFormation 182 | Environment: 183 | ComputeType: BUILD_GENERAL1_SMALL 184 | Image: aws/codebuild/standard:4.0 185 | PrivilegedMode: True 186 | Type: LINUX_CONTAINER 187 | LogsConfig: 188 | CloudWatchLogs: 189 | Status: ENABLED 190 | Name: SecretScan 191 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 192 | Source: 193 | BuildSpec: buildspecs/secrets-buildspec.yaml 194 | Type: CODEPIPELINE 195 | WhitesourceStage: 196 | Type: AWS::CodeBuild::Project 197 | Properties: 198 | Artifacts: 199 | Type: CODEPIPELINE 200 | Description: Uses Whitesource to detect vulnerabilities in dependencies and Docker images - Managed by CloudFormation 201 | Environment: 202 | ComputeType: BUILD_GENERAL1_SMALL 203 | Image: aws/codebuild/standard:4.0 204 | PrivilegedMode: True 205 | Type: LINUX_CONTAINER 206 | EnvironmentVariables: 207 | - Name: ECR_REPO_NAME 208 | Type: PLAINTEXT 209 | Value: !Ref DevSecOpsCICDECR 210 | - Name: WSS_API_KEY 211 | Type: PARAMETER_STORE 212 | Value: !Ref WssApiKeyParameter 213 | - Name: WSS_USER_KEY 214 | Type: PARAMETER_STORE 215 | Value: !Ref WssUserKeyParameter 216 | - Name: WSS_PROJECT_NAME 217 | Type: PLAINTEXT 218 | Value: !Ref WssProjectName 219 | - Name: WSS_PRODUCT_NAME 220 | Type: PLAINTEXT 221 | Value: !Ref WssProductName 222 | LogsConfig: 223 | CloudWatchLogs: 224 | Status: ENABLED 225 | Name: WhitesourceStage 226 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 227 | Source:
228 | BuildSpec: buildspecs/wss-buildspec.yaml 229 | Type: CODEPIPELINE 230 | DagdaStage: 231 | Type: AWS::CodeBuild::Project 232 | Properties: 233 | Artifacts: 234 | Type: CODEPIPELINE 235 | Description: Uses Dagda to scan Docker images for malware and vulnerabilities in OS and code dependencies - Managed by CloudFormation 236 | Environment: 237 | ComputeType: BUILD_GENERAL1_LARGE 238 | Image: aws/codebuild/standard:4.0 239 | PrivilegedMode: True 240 | Type: LINUX_CONTAINER 241 | EnvironmentVariables: 242 | - Name: ECR_REPO_NAME 243 | Type: PLAINTEXT 244 | Value: !Ref DevSecOpsCICDECR 245 | LogsConfig: 246 | CloudWatchLogs: 247 | Status: ENABLED 248 | Name: DagdaStage 249 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 250 | Source: 251 | BuildSpec: buildspecs/dagda-buildspec.yaml 252 | Type: CODEPIPELINE 253 | TrivyStage: 254 | Type: AWS::CodeBuild::Project 255 | Properties: 256 | Artifacts: 257 | Type: CODEPIPELINE 258 | Description: Uses Aqua Security's Trivy to scan Docker parent and layer images at runtime - Managed by CloudFormation 259 | Environment: 260 | ComputeType: BUILD_GENERAL1_MEDIUM 261 | Image: aws/codebuild/standard:4.0 262 | PrivilegedMode: True 263 | Type: LINUX_CONTAINER 264 | EnvironmentVariables: 265 | - Name: ECR_REPO_NAME 266 | Type: PLAINTEXT 267 | Value: !Ref DevSecOpsCICDECR 268 | - Name: ECR_REPO_URI 269 | Type: PLAINTEXT 270 | Value: !Sub '${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/${DevSecOpsCICDECR}' 271 | LogsConfig: 272 | CloudWatchLogs: 273 | Status: ENABLED 274 | Name: TrivyStage 275 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 276 | Source: 277 | BuildSpec: buildspecs/trivy-buildspec.yaml 278 | Type: CODEPIPELINE 279 | DevSecOpsCICDCodePipeline: 280 | Type: AWS::CodePipeline::Pipeline 281 | Properties: 282 | ArtifactStore: 283 | Location: !Ref DevSecOpsCICDCodePipelineArtifactBucket 284 | Type: S3 285 | Name: !Sub 'DevSecOpsCICD-scan-cicd-pipeline-${AWS::AccountId}' 286 | RestartExecutionOnUpdate: True 287 |
RoleArn: !GetAtt CodePipelineServiceRole.Arn 288 | Stages: 289 | - 290 | Name: Source 291 | Actions: 292 | - 293 | Name: SourceAction 294 | ActionTypeId: 295 | Category: Source 296 | Owner: AWS 297 | Version: 1 298 | Provider: CodeCommit 299 | Configuration: 300 | RepositoryName: !GetAtt DevSecOpsCICDCodeCommit.Name 301 | BranchName: master 302 | OutputArtifacts: 303 | - 304 | Name: SourceOutput 305 | RunOrder: 1 306 | - 307 | Name: Docker-Linter 308 | Actions: 309 | - 310 | InputArtifacts: 311 | - 312 | Name: SourceOutput 313 | Name: BuildAction 314 | ActionTypeId: 315 | Category: Build 316 | Owner: AWS 317 | Version: 1 318 | Provider: CodeBuild 319 | Configuration: 320 | ProjectName: !Ref LintingStage 321 | PrimarySource: SourceOutput 322 | RunOrder: 2 323 | - 324 | Name: SecretScan 325 | Actions: 326 | - 327 | InputArtifacts: 328 | - 329 | Name: SourceOutput 330 | Name: BuildAction 331 | ActionTypeId: 332 | Category: Build 333 | Owner: AWS 334 | Version: 1 335 | Provider: CodeBuild 336 | Configuration: 337 | ProjectName: !Ref SecretScanStage 338 | PrimarySource: SourceOutput 339 | RunOrder: 3 340 | - 341 | Name: WhitesourceStage 342 | Actions: 343 | - 344 | InputArtifacts: 345 | - 346 | Name: SourceOutput 347 | Name: BuildAction 348 | ActionTypeId: 349 | Category: Build 350 | Owner: AWS 351 | Version: 1 352 | Provider: CodeBuild 353 | Configuration: 354 | ProjectName: !Ref WhitesourceStage 355 | PrimarySource: SourceOutput 356 | RunOrder: 4 357 | - 358 | Name: DagdaStage 359 | Actions: 360 | - 361 | InputArtifacts: 362 | - 363 | Name: SourceOutput 364 | Name: BuildAction 365 | ActionTypeId: 366 | Category: Build 367 | Owner: AWS 368 | Version: 1 369 | Provider: CodeBuild 370 | Configuration: 371 | ProjectName: !Ref DagdaStage 372 | PrimarySource: SourceOutput 373 | RunOrder: 5 374 | - 375 | Name: TrivyStage 376 | Actions: 377 | - 378 | InputArtifacts: 379 | - 380 | Name: SourceOutput 381 | Name: BuildAction 382 | ActionTypeId: 383 | Category: Build 384 | Owner: 
AWS 385 | Version: 1 386 | Provider: CodeBuild 387 | Configuration: 388 | ProjectName: !Ref TrivyStage 389 | PrimarySource: SourceOutput 390 | RunOrder: 6 -------------------------------------------------------------------------------- /docker-pipeline-wss/src/artifacts/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.8.3-alpine3.11 2 | 3 | ENV PYTHONUNBUFFERED=1 4 | 5 | COPY requirements.txt /tmp/requirements.txt 6 | 7 | RUN pip3 install -r /tmp/requirements.txt -------------------------------------------------------------------------------- /docker-pipeline-wss/src/artifacts/requirements.txt: -------------------------------------------------------------------------------- 1 | requests -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/dagda-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update 7 | - apt install -y jq 8 | - apt install -y docker-compose 9 | - pip3 install --upgrade pip 10 | - git clone https://github.com/eliasgranderubio/dagda.git 11 | - rm ./dagda/docker-compose.yml 12 | - cp ./buildspecs/docker-compose.yml ./dagda/ 13 | - cd dagda 14 | - docker-compose build 15 | - docker-compose up -d 16 | - sleep 55 17 | - cd - 18 | - cp ./buildspecs/dagda-parse.py ./artifacts/ 19 | - cp ./buildspecs/findingDetonator.py ./artifacts/ 20 | - cd artifacts 21 | pre_build: 22 | commands: 23 | - export ECR_REPO_NAME=$ECR_REPO_NAME 24 | build: 25 | commands: 26 | - docker exec -t dagda python3 dagda.py vuln --init 27 | - sleep 60 28 | - docker build -t $ECR_REPO_NAME . 
29 | - sleep 2 30 | - docker exec -t dagda python3 dagda.py check --docker_image $ECR_REPO_NAME:latest > dagdaId.json 31 | - sleep 2 32 | - export DAGDA_ID=`python3 dagda-parse.py` 33 | - sleep 420 34 | - docker exec -t dagda python3 dagda.py history $ECR_REPO_NAME:latest --id $DAGDA_ID > dagdaFinding.json 35 | - jq . dagdaFinding.json 36 | - python3 findingDetonator.py 37 | post_build: 38 | commands: 39 | - echo Dagda job completed on `date` -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/dagda-parse.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | with open('dagdaId.json') as jsonFile: 4 | data = json.load(jsonFile) 5 | scanId = str(data['id']) 6 | print(scanId) -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/detect-secrets-ejection.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | try: 4 | with open('secret-results.json') as json_file: 5 | data = json.load(json_file) 6 | if str(data['results']) != '{}': 7 | exit(1) 8 | else: 9 | exit(0) 10 | except Exception as e: 11 | print(e) 12 | raise -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '2' 2 | services: 3 | dagda: 4 | build: . 
5 | image: 3grander/dagda:0.8.0 6 | container_name: dagda 7 | environment: 8 | - DAGDA_HOST=0.0.0.0 9 | - DAGDA_PORT=5000 10 | networks: 11 | - mdb 12 | entrypoint: python dagda.py start -s 0.0.0.0 -p 5000 -m vulndb -mp 27017 13 | ports: 14 | - "5000:5000" 15 | volumes: 16 | - /var/run/docker.sock:/var/run/docker.sock:ro 17 | - /tmp:/tmp 18 | depends_on: 19 | - vulndb 20 | vulndb: 21 | image: mongo 22 | container_name: vulndb 23 | networks: 24 | - mdb 25 | ports: 26 | - "27017:27017" 27 | volumes: 28 | - ./db:/data/db 29 | networks: 30 | mdb: 31 | external: false -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/findingDetonator.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | with open('dagdaFinding.json') as jsonFile: 4 | data = json.load(jsonFile) 5 | dagdaMalware = str(data[0]['static_analysis']['malware_binaries']) 6 | osVulnCount = str(data[0]['static_analysis']['os_packages']['vuln_os_packages']) 7 | pckgVulnCounter = str(data[0]['static_analysis']['prog_lang_dependencies']['vuln_dependencies']) 8 | if dagdaMalware == '[]': 9 | print('Dagda scan completed with no Malware detected! Total number of vulnerable OS packages is ' + osVulnCount + ' Total number of vulnerable code dependencies is ' + pckgVulnCounter) 10 | exit(0) 11 | else: 12 | print('Dagda scan detected Malware! 
Total number of vulnerable OS packages is ' + osVulnCount + ' Total number of vulnerable code dependencies is ' + pckgVulnCounter) 13 | exit(1) -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/lint-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - cd artifacts 7 | - wget https://github.com/hadolint/hadolint/releases/download/v1.18.0/hadolint-Linux-x86_64 8 | - chmod -R +x hadolint-Linux-x86_64 9 | build: 10 | commands: 11 | - ./hadolint-Linux-x86_64 -f json Dockerfile 12 | post_build: 13 | commands: 14 | - echo Hadolint scan completed on `date` -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/secrets-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - pip3 install --upgrade pip 7 | - pip3 install detect-secrets 8 | - cp ./buildspecs/detect-secrets-ejection.py ./artifacts/ 9 | - cd artifacts 10 | build: 11 | commands: 12 | - detect-secrets scan . --all-files > secret-results.json 13 | - jq . 
secret-results.json 14 | - python3 detect-secrets-ejection.py 15 | post_build: 16 | commands: 17 | - echo detect-secrets scan completed on `date` -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/trivy-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update 7 | - pip3 install boto3 8 | - pip3 install awscli 9 | - apt install rpm -y 10 | - wget https://github.com/aquasecurity/trivy/releases/download/v0.9.1/trivy_0.9.1_Linux-64bit.deb 11 | - dpkg -i trivy_0.9.1_Linux-64bit.deb 12 | pre_build: 13 | commands: 14 | - cd artifacts 15 | - $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION) 16 | - docker build -t $ECR_REPO_NAME:latest . 17 | build: 18 | commands: 19 | - trivy -f json --exit-code 0 --severity LOW,MEDIUM,HIGH --quiet --auto-refresh $ECR_REPO_NAME:latest 20 | - trivy -f json --exit-code 1 --severity CRITICAL --quiet --auto-refresh $ECR_REPO_NAME:latest 21 | - docker tag $ECR_REPO_NAME:latest $ECR_REPO_URI:latest 22 | - docker push $ECR_REPO_URI:latest 23 | - echo Docker image pushed to ECR on `date` 24 | post_build: 25 | commands: 26 | - echo Trivy scan completed on `date` -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/wss-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | runtime-versions: 6 | python: latest 7 | java: latest 8 | commands: 9 | - cp ./buildspecs/wss-unified-agent.config ./artifacts/ 10 | - cd artifacts 11 | - curl -LJO https://github.com/whitesource/unified-agent-distribution/releases/latest/download/wss-unified-agent.jar 12 | pre_build: 13 | commands: 14 | - docker build -t $ECR_REPO_NAME:latest . 
15 | build: 16 | commands: 17 | - java -jar wss-unified-agent.jar -apiKey $WSS_API_KEY -userKey $WSS_USER_KEY -product $WSS_PRODUCT_NAME -project $WSS_PROJECT_NAME 18 | post_build: 19 | commands: 20 | - echo WSS scan completed on `date` -------------------------------------------------------------------------------- /docker-pipeline-wss/src/buildspecs/wss-unified-agent.config: -------------------------------------------------------------------------------- 1 | ############################################################### 2 | # WhiteSource Unified-Agent configuration file 3 | ############################################################### 4 | # GENERAL SCAN MODE: Files and Package Managers 5 | ############################################################### 6 | # Organization vitals 7 | ###################### 8 | 9 | apiKey= 10 | #userKey is required if WhiteSource administrator has enabled "Enforce user level access" option 11 | #userKey= 12 | 13 | #projectName= 14 | #projectVersion= 15 | #projectToken= 16 | #projectTag= key:value 17 | 18 | productName= 19 | productVersion= 20 | productToken= 21 | 22 | #projectPerFolder=true 23 | #projectPerFolderIncludes= 24 | #projectPerFolderExcludes= 25 | 26 | #wss.connectionTimeoutMinutes=60 27 | 28 | # Change the below URL to your WhiteSource server. 29 | # Use the 'WhiteSource Server URL' which can be retrieved 30 | # from your 'Profile' page on the 'Server URLs' panel. 31 | # Then, add the '/agent' path to it. 
32 | wss.url=https://saas.whitesourcesoftware.com/agent 33 | #wss.url=https://app.whitesourcesoftware.com/agent 34 | #wss.url=https://app-eu.whitesourcesoftware.com/agent 35 | 36 | ############ 37 | # Policies # 38 | ############ 39 | checkPolicies=true 40 | forceCheckAllDependencies=false 41 | forceUpdate=true 42 | forceUpdate.failBuildOnPolicyViolation=true 43 | #updateInventory=false 44 | 45 | ########### 46 | # General # 47 | ########### 48 | #offline=false 49 | #updateType=APPEND 50 | #ignoreSourceFiles=true 51 | #scanComment= 52 | #failErrorLevel=ALL 53 | #requireKnownSha1=false 54 | 55 | #generateProjectDetailsJson=true 56 | #generateScanReport=true 57 | #scanReportTimeoutMinutes=10 58 | #scanReportFilenameFormat= 59 | 60 | #analyzeFrameworks=true 61 | #analyzeFrameworksReference= 62 | 63 | #updateEmptyProject=false 64 | 65 | #log.files.level= 66 | #log.files.maxFileSize= 67 | #log.files.maxFilesCount= 68 | #log.files.path= 69 | 70 | ######################################## 71 | # Package Manager Dependency resolvers # 72 | ######################################## 73 | #resolveAllDependencies=false 74 | #excludeDependenciesFromNodes=.*commons-io.*,.*maven-model 75 | 76 | #npm.resolveDependencies=false 77 | #npm.ignoreSourceFiles=false 78 | #npm.includeDevDependencies=true 79 | #npm.runPreStep=true 80 | #npm.ignoreNpmLsErrors=true 81 | #npm.ignoreScripts=true 82 | #npm.yarnProject=true 83 | #npm.accessToken= 84 | #npm.identifyByNameAndVersion=true 85 | #npm.yarn.frozenLockfile=true 86 | #npm.resolveMainPackageJsonOnly=true 87 | #npm.removeDuplicateDependencies=false 88 | #npm.resolveAdditionalDependencies=true 89 | #npm.failOnNpmLsErrors = 90 | #npm.projectNameFromDependencyFile = true 91 | #npm.resolveGlobalPackages=true 92 | #npm.resolveLockFile=true 93 | 94 | #bower.resolveDependencies=false 95 | #bower.ignoreSourceFiles=true 96 | #bower.runPreStep=true 97 | 98 | #nuget.resolvePackagesConfigFiles=false 99 | #nuget.resolveCsProjFiles=false 100 | 
#nuget.resolveDependencies=false 101 | #nuget.restoreDependencies=true 102 | #nuget.preferredEnvironment= 103 | #nuget.packagesDirectory= 104 | #nuget.ignoreSourceFiles=false 105 | #nuget.runPreStep=true 106 | #nuget.resolveNuspecFiles=false 107 | #nuget.resolveAssetsFiles=true 108 | 109 | #python.resolveDependencies=false 110 | #python.ignoreSourceFiles=false 111 | #python.ignorePipInstallErrors=true 112 | #python.installVirtualenv=true 113 | #python.resolveHierarchyTree=false 114 | #python.requirementsFileIncludes=requirements.txt 115 | #python.resolveSetupPyFiles=true 116 | #python.runPipenvPreStep=true 117 | #python.pipenvDevDependencies=true 118 | #python.IgnorePipenvInstallErrors=true 119 | #python.resolveGlobalPackages=true 120 | 121 | #maven.ignoredScopes=test provided 122 | #maven.resolveDependencies=false 123 | #maven.ignoreSourceFiles=true 124 | #maven.aggregateModules=true 125 | #maven.ignorePomModules=false 126 | #maven.runPreStep=true 127 | #maven.ignoreMvnTreeErrors=true 128 | #maven.environmentPath= 129 | #maven.m2RepositoryPath= 130 | #maven.downloadMissingDependencies=false 131 | #maven.additionalArguments= 132 | #maven.projectNameFromDependencyFile=true 133 | 134 | #gradle.ignoredScopes= 135 | #gradle.resolveDependencies=false 136 | #gradle.runAssembleCommand=false 137 | #gradle.runPreStep=true 138 | #gradle.ignoreSourceFiles=true 139 | #gradle.aggregateModules=true 140 | #gradle.preferredEnvironment=wrapper 141 | #gradle.localRepositoryPath= 142 | #gradle.wrapperPath= 143 | #gradle.downloadMissingDependencies=false 144 | #gradle.additionalArguments= 145 | #gradle.includedScopes= 146 | #gradle.excludeModules= 147 | #gradle.includeModules= 148 | #gradle.includedConfigurations= 149 | #gradle.ignoredConfigurations= 150 | 151 | #paket.resolveDependencies=false 152 | #paket.ignoredGroups= 153 | #paket.ignoreSourceFiles=false 154 | #paket.runPreStep=true 155 | #paket.exePath= 156 | 157 | #go.resolveDependencies=false 158 | 
#go.collectDependenciesAtRuntime=true 159 | #go.dependencyManager= 160 | #go.ignoreSourceFiles=true 161 | #go.glide.ignoreTestPackages=false 162 | #go.gogradle.enableTaskAlias=true 163 | 164 | #ruby.resolveDependencies=false 165 | #ruby.ignoreSourceFiles=false 166 | #ruby.installMissingGems=true 167 | #ruby.runBundleInstall=true 168 | #ruby.overwriteGemFile=true 169 | 170 | #sbt.resolveDependencies=false 171 | #sbt.ignoreSourceFiles=true 172 | #sbt.aggregateModules=true 173 | #sbt.runPreStep=true 174 | #sbt.includedScopes= 175 | 176 | #php.resolveDependencies=false 177 | #php.runPreStep=true 178 | #php.includeDevDependencies=true 179 | 180 | #html.resolveDependencies=false 181 | 182 | #cocoapods.resolveDependencies=false 183 | #cocoapods.runPreStep=true 184 | #cocoapods.ignoreSourceFiles=false 185 | 186 | #hex.resolveDependencies=false 187 | #hex.runPreStep=true 188 | #hex.ignoreSourceFiles=false 189 | #hex.aggregateModules=true 190 | 191 | #ant.resolveDependencies=false 192 | #ant.pathIdIncludes=.* 193 | #ant.external.parameters= 194 | 195 | #r.resolveDependencies=false 196 | #r.runPreStep=true 197 | #r.ignoreSourceFiles=false 198 | #r.cranMirrorUrl= 199 | #r.packageManager=None 200 | 201 | #cargo.resolveDependencies=false 202 | #cargo.runPreStep=true 203 | #cargo.ignoreSourceFiles=false 204 | 205 | #haskell.resolveDependencies=false 206 | #haskell.runPreStep=true 207 | #haskell.ignoreSourceFiles=false 208 | #haskell.ignorePreStepErrors=true 209 | 210 | #ocaml.resolveDependencies=false 211 | #ocaml.runPrepStep=true 212 | #ocaml.ignoreSourceFiles=false 213 | #ocaml.switchName= 214 | #ocaml.ignoredScopes=none 215 | #ocaml.aggregateModules=true 216 | 217 | ########################################################################################### 218 | # Includes/Excludes Glob patterns - Please use only one exclude line and one include line # 219 | ########################################################################################### 220 | includes=**/*.c 
**/*.cc **/*.cp **/*.cpp **/*.cxx **/*.c++ **/*.h **/*.hpp **/*.hxx 221 | 222 | #includes=**/*.m **/*.mm **/*.js **/*.php 223 | #includes=**/*.jar 224 | #includes=**/*.gem **/*.rb 225 | #includes=**/*.dll **/*.cs **/*.nupkg 226 | #includes=**/*.tgz **/*.deb **/*.gzip **/*.rpm **/*.tar.bz2 227 | #includes=**/*.zip **/*.tar.gz **/*.egg **/*.whl **/*.py 228 | 229 | #Exclude file extensions or specific directories by adding **/*. or **//** 230 | excludes=**/*sources.jar **/*javadoc.jar 231 | 232 | case.sensitive.glob=false 233 | followSymbolicLinks=true 234 | 235 | ###################### 236 | # Archive properties # 237 | ###################### 238 | #archiveExtractionDepth=2 239 | #archiveIncludes=**/*.war **/*.ear 240 | #archiveExcludes=**/*sources.jar 241 | 242 | ############## 243 | # SCAN MODES # 244 | ############## 245 | 246 | # Docker images 247 | ################ 248 | #docker.scanImages=true 249 | #docker.includes=.*.* 250 | #docker.excludes= 251 | #docker.pull.enable=true 252 | #docker.pull.images=.*.* 253 | #docker.pull.maxImages=10 254 | #docker.pull.tags=.*.* 255 | #docker.pull.digest= 256 | #docker.delete.force=true 257 | #docker.login.sudo=true 258 | #docker.projectNameFormat=default 259 | #docker.scanTarFiles=true 260 | 261 | #docker.aws.enable=true 262 | #docker.aws.registryIds= 263 | 264 | #docker.azure.enable=true 265 | #docker.azure.userName= 266 | #docker.azure.userPassword= 267 | #docker.azure.registryNames= 268 | #docker.azure.authenticationType=containerRegistry 269 | #docker.azure.registryAuthenticationParameters=: : 270 | 271 | #docker.artifactory.enable=true 272 | #docker.artifactory.url= 273 | #docker.artifactory.pullUrl= 274 | #docker.artifactory.userName= 275 | #docker.artifactory.userPassword= 276 | #docker.artifactory.repositoriesNames= 277 | #docker.artifactory.dockerAccessMethod= 278 | 279 | #docker.hub.enabled=true 280 | #docker.hub.userName= 281 | #docker.hub.userPassword= 282 | #docker.hub.organizationsNames= 283 | 284 | # Docker 
containers 285 | #################### 286 | #docker.scanContainers=true 287 | #docker.containerIncludes=.*.* 288 | #docker.containerExcludes= 289 | 290 | # Linux package manager settings 291 | ################################ 292 | #scanPackageManager=true 293 | 294 | # Serverless settings 295 | ###################### 296 | #serverless.provider= 297 | #serverless.scanFunctions=true 298 | #serverless.includes= 299 | #serverless.excludes= 300 | #serverless.region= 301 | #serverless.maxFunctions=10 302 | 303 | # Artifactory settings 304 | ######################## 305 | #artifactory.enableScan=true 306 | #artifactory.url= 307 | #artifactory.accessToken= 308 | #artifactory.repoKeys= 309 | #artifactory.userName= 310 | #artifactory.userPassword= 311 | 312 | ################## 313 | # Proxy settings # 314 | ################## 315 | #proxy.host= 316 | #proxy.port= 317 | #proxy.user= 318 | #proxy.pass= 319 | 320 | ################ 321 | # SCM settings # 322 | ################ 323 | #scm.type= 324 | #scm.user= 325 | #scm.pass= 326 | #scm.ppk= 327 | #scm.url= 328 | #scm.branch= 329 | #scm.tag= 330 | #scm.npmInstall= 331 | #scm.npmInstallTimeoutMinutes= 332 | #scm.repositoriesFile= 333 | -------------------------------------------------------------------------------- /flask-pipeline/flask-pipeline-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/flask-pipeline/flask-pipeline-architecture.jpg -------------------------------------------------------------------------------- /golang-pipeline/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/golang-pipeline/.gitkeep -------------------------------------------------------------------------------- 
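The ejection scripts in the buildspecs above (detect-secrets-ejection.py, dagda-parse.py, findingDetonator.py) all follow one pattern: parse the scanner's JSON report and exit nonzero so CodeBuild fails the stage when findings exist. A generalized sketch of that pattern is below; the `should_fail` helper name and the sample report data are hypothetical, not part of the repo:

```python
import json

def should_fail(report_path, finding_keys):
    """Return True when any of the given keys in the scanner's JSON
    report holds a non-empty value, i.e. the stage should eject."""
    with open(report_path) as fh:
        data = json.load(fh)
    return any(data.get(key) for key in finding_keys)

# Demo with a fabricated detect-secrets-style report (hypothetical data):
with open("sample-results.json", "w") as fh:
    json.dump({"results": {"app.py": [{"type": "AWS Access Key"}]}}, fh)

print(should_fail("sample-results.json", ["results"]))  # True
```

In a buildspec this would end with `sys.exit(1 if should_fail(...) else 0)`, which is what makes CodePipeline halt on the failed build action.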
/golden-ami-pipeline/GoldenAMIPipeline_AMZL2_CFN.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Creates a Golden AMI Baking pipeline for Amazon Linux 2 using EC2 Image Builder 3 | Parameters: 4 | PipelinePrefix: 5 | Type: String 6 | Description: A lowercase prefix to be used for all EC2 Image Builder components and other services. Must be lowercase to avoid S3 issues 7 | Default: amzn-linux-golden-pipeline 8 | ParentAMIId: 9 | Type: 'AWS::SSM::Parameter::Value<AWS::EC2::Image::Id>' 10 | Description: Parameter Store parameter containing the AMI ID of an Amazon Linux 2 AMI to serve as the parent for the pipeline 11 | Default: '/aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2' 12 | # https://docs.aws.amazon.com/systems-manager/latest/userguide/parameter-store-public-parameters-ami.html 13 | ImageBuildSecGroupIds: 14 | Type: List<AWS::EC2::SecurityGroup::Id> 15 | Description: Security Group to create the Image Builder infrastructure with 16 | ImageBuilderSubnetId: 17 | Type: AWS::EC2::Subnet::Id 18 | Description: Subnet ID to create the Image Builder infrastructure with 19 | AwsAccountDistroList: 20 | Type: CommaDelimitedList 21 | Description: List of AWS accounts to share the final Golden AMIs with 22 | Resources: 23 | EC2ImageBuilderLogginBucket: 24 | Type: AWS::S3::Bucket 25 | Properties: 26 | BucketName: !Sub '${PipelinePrefix}-logs-bucket-${AWS::AccountId}' 27 | PublicAccessBlockConfiguration: 28 | BlockPublicAcls: true 29 | BlockPublicPolicy: true 30 | IgnorePublicAcls: true 31 | RestrictPublicBuckets: true 32 | BucketEncryption: 33 | ServerSideEncryptionConfiguration: 34 | - ServerSideEncryptionByDefault: 35 | SSEAlgorithm: AES256 36 | EC2ImageBuilderRole: 37 | Type: AWS::IAM::Role 38 | Properties: 39 | RoleName: !Sub '${PipelinePrefix}-EC2ImageBuilderRole' 40 | ManagedPolicyArns: 41 | - arn:aws:iam::aws:policy/EC2InstanceProfileForImageBuilder 42 | - arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore 43 | -
arn:aws:iam::aws:policy/AmazonInspectorFullAccess 44 | Policies: 45 | - PolicyName: !Sub '${PipelinePrefix}-EC2ImageBuilderRolePolicy' 46 | PolicyDocument: 47 | Version: 2012-10-17 48 | Statement: 49 | - Effect: Allow 50 | Action: 51 | - ec2:CreateTags 52 | - ssm:SendCommand 53 | Resource: '*' 54 | - Effect: Allow 55 | Action: 56 | - s3:GetObject 57 | - s3:GetObjectVersion 58 | - s3:PutObject 59 | - s3:GetBucketAcl 60 | - s3:GetBucketLocation 61 | Resource: 62 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}' 63 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}/*' 64 | AssumeRolePolicyDocument: 65 | Version: 2012-10-17 66 | Statement: 67 | - Effect: Allow 68 | Principal: 69 | Service: 70 | - ec2.amazonaws.com 71 | Action: 72 | - sts:AssumeRole 73 | EC2ImageBuilderInstanceProfile: 74 | Type: AWS::IAM::InstanceProfile 75 | Properties: 76 | Roles: 77 | - !Ref EC2ImageBuilderRole 78 | AmazonLinux2GoldenAMIPipeline: 79 | Type: AWS::ImageBuilder::ImagePipeline 80 | Properties: 81 | Name: !Sub '${PipelinePrefix}-ImagePipeline' 82 | Description: Golden AMI Pipeline to create Amazon Linux 2 images - Managed by CloudFormation 83 | ImageRecipeArn: !Ref AmazonLinux2GoldenAMIImageRecipe 84 | DistributionConfigurationArn: !Ref ImageBuilderDistroConfig 85 | InfrastructureConfigurationArn: !Ref ImageBuilderInfraConfig 86 | ImageTestsConfiguration: 87 | ImageTestsEnabled: true 88 | TimeoutMinutes: 90 89 | Schedule: 90 | PipelineExecutionStartCondition: EXPRESSION_MATCH_ONLY 91 | ScheduleExpression: cron(0 0 * * 0) #every week, on Sunday at 00:00 92 | Status: ENABLED 93 | AmazonLinux2GoldenAMIImageRecipe: 94 | Type: AWS::ImageBuilder::ImageRecipe 95 | Properties: 96 | Name: !Sub '${PipelinePrefix}-SecurityImageRecipe' 97 | Description: Applies STIG Low configurations, runs an Inspector CIS test, installs Python3 and the latest security patches for Amazon Linux 2 and then tests Reboot - Managed by CloudFormation 98 | ParentImage: !Ref ParentAMIId 99 | Components: 100 | - 
ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/stig-build-linux-low/2.6.0' 101 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/python-3-linux/1.0.1' 102 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/update-linux/1.0.0' 103 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/inspector-test-linux/1.0.1' 104 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/reboot-test-linux/1.0.0' 105 | Version: 1.0.0 106 | ImageBuilderDistroConfig: 107 | Type: AWS::ImageBuilder::DistributionConfiguration 108 | Properties: 109 | Name: !Sub '${PipelinePrefix}-ImageBuilderDistroConfig' 110 | Description: Amazon Linux 2 Golden AMI distribution configuration - Managed by CloudFormation 111 | Distributions: 112 | - Region: us-east-1 113 | AmiDistributionConfiguration: 114 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 115 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 116 | AmiTags: 117 | AmiTagKey: ami-tag-key 118 | LaunchPermissionConfiguration: 119 | UserIds: !Ref AwsAccountDistroList 120 | - Region: us-east-2 121 | AmiDistributionConfiguration: 122 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 123 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 124 | AmiTags: 125 | AmiTagKey: ami-tag-key 126 | LaunchPermissionConfiguration: 127 | UserIds: !Ref AwsAccountDistroList 128 | - Region: us-west-1 129 | AmiDistributionConfiguration: 130 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 131 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 132 | AmiTags: 133 | AmiTagKey: ami-tag-key 134 | LaunchPermissionConfiguration: 135 | UserIds: !Ref AwsAccountDistroList 136 | - Region: 
us-west-2 137 | AmiDistributionConfiguration: 138 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 139 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 140 | AmiTags: 141 | AmiTagKey: ami-tag-key 142 | LaunchPermissionConfiguration: 143 | UserIds: !Ref AwsAccountDistroList 144 | - Region: eu-west-1 145 | AmiDistributionConfiguration: 146 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 147 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 148 | AmiTags: 149 | AmiTagKey: ami-tag-key 150 | LaunchPermissionConfiguration: 151 | UserIds: !Ref AwsAccountDistroList 152 | - Region: eu-west-2 153 | AmiDistributionConfiguration: 154 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 155 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 156 | AmiTags: 157 | AmiTagKey: ami-tag-key 158 | LaunchPermissionConfiguration: 159 | UserIds: !Ref AwsAccountDistroList 160 | - Region: sa-east-1 161 | AmiDistributionConfiguration: 162 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 163 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 164 | AmiTags: 165 | AmiTagKey: ami-tag-key 166 | LaunchPermissionConfiguration: 167 | UserIds: !Ref AwsAccountDistroList 168 | - Region: ap-south-1 169 | AmiDistributionConfiguration: 170 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 171 | Description: Amazon Linux 2 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 172 | AmiTags: 173 | AmiTagKey: ami-tag-key 174 | LaunchPermissionConfiguration: 175 | UserIds: !Ref AwsAccountDistroList 176 | - Region: ap-southeast-2 177 | AmiDistributionConfiguration: 178 | Name: 'ami-hardened-amazonlinux2-{{ imagebuilder:buildDate }}' 179 | Description: Amazon Linux 2 Golden AMI - 
Created by EC2 Image Builder - Pipeline Managed by CloudFormation 180 | AmiTags: 181 | AmiTagKey: ami-tag-key 182 | LaunchPermissionConfiguration: 183 | UserIds: !Ref AwsAccountDistroList 184 | ImageBuilderInfraConfig: 185 | Type: AWS::ImageBuilder::InfrastructureConfiguration 186 | Properties: 187 | Name: !Sub '${PipelinePrefix}-ImageBuilderInfraConfig' 188 | Description: Amazon Linux 2 Golden AMI infrastructure configuration - Managed by CloudFormation 189 | InstanceProfileName: !Ref EC2ImageBuilderInstanceProfile 190 | Logging: 191 | S3Logs: 192 | S3BucketName: !Ref EC2ImageBuilderLogginBucket 193 | SecurityGroupIds: !Ref ImageBuildSecGroupIds 194 | SubnetId: !Ref ImageBuilderSubnetId 195 | TerminateInstanceOnFailure: true -------------------------------------------------------------------------------- /golden-ami-pipeline/GoldenAMIPipeline_Ubuntu18_CFN.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Creates a Golden AMI Baking pipeline for Ubuntu 18LTS using EC2 Image Builder 3 | Parameters: 4 | PipelinePrefix: 5 | Type: String 6 | Description: A lowercase prefix to be used for all EC2 Image Builder components and other services. 
Must be lowercase to avoid S3 issues 7 | Default: ubuntu18-golden-pipeline 8 | ParentAMIId: 9 | Type: String 10 | Description: AMI ID of an Ubuntu 18LTS AMI to serve as the parent for the pipeline 11 | Default: ami-098f16afa9edf40be 12 | ImageBuildSecGroupIds: 13 | Type: List<AWS::EC2::SecurityGroup::Id> 14 | Description: Security Group to create the Image Builder infrastructure with 15 | ImageBuilderSubnetId: 16 | Type: AWS::EC2::Subnet::Id 17 | Description: Subnet ID to create the Image Builder infrastructure with 18 | AwsAccountDistroList: 19 | Type: CommaDelimitedList 20 | Description: List of AWS accounts to share the final Golden AMIs with 21 | Resources: 22 | EC2ImageBuilderLogginBucket: 23 | Type: AWS::S3::Bucket 24 | Properties: 25 | BucketName: !Sub '${PipelinePrefix}-logs-bucket-${AWS::AccountId}' 26 | PublicAccessBlockConfiguration: 27 | BlockPublicAcls: true 28 | BlockPublicPolicy: true 29 | IgnorePublicAcls: true 30 | RestrictPublicBuckets: true 31 | BucketEncryption: 32 | ServerSideEncryptionConfiguration: 33 | - ServerSideEncryptionByDefault: 34 | SSEAlgorithm: AES256 35 | EC2ImageBuilderRole: 36 | Type: AWS::IAM::Role 37 | Properties: 38 | RoleName: !Sub '${PipelinePrefix}-EC2ImageBuilderRole' 39 | ManagedPolicyArns: 40 | - arn:aws:iam::aws:policy/EC2InstanceProfileForImageBuilder 41 | - arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore 42 | - arn:aws:iam::aws:policy/AmazonInspectorFullAccess 43 | Policies: 44 | - PolicyName: !Sub '${PipelinePrefix}-EC2ImageBuilderRolePolicy' 45 | PolicyDocument: 46 | Version: 2012-10-17 47 | Statement: 48 | - Effect: Allow 49 | Action: 50 | - ec2:CreateTags 51 | - ssm:SendCommand 52 | Resource: '*' 53 | - Effect: Allow 54 | Action: 55 | - s3:GetObject 56 | - s3:GetObjectVersion 57 | - s3:PutObject 58 | - s3:GetBucketAcl 59 | - s3:GetBucketLocation 60 | Resource: 61 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}' 62 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}/*' 63 | AssumeRolePolicyDocument: 64 | Version: 2012-10-17 65 | Statement: 66 | 
- Effect: Allow 67 | Principal: 68 | Service: 69 | - ec2.amazonaws.com 70 | Action: 71 | - sts:AssumeRole 72 | EC2ImageBuilderInstanceProfile: 73 | Type: AWS::IAM::InstanceProfile 74 | Properties: 75 | Roles: 76 | - !Ref EC2ImageBuilderRole 77 | Ubuntu18PipelinePipeline: 78 | Type: AWS::ImageBuilder::ImagePipeline 79 | Properties: 80 | Name: !Sub '${PipelinePrefix}-ImagePipeline' 81 | Description: Golden AMI Pipeline to create Ubuntu 18LTS images - Managed by CloudFormation 82 | ImageRecipeArn: !Ref Ubuntu18PipelineImageRecipe 83 | DistributionConfigurationArn: !Ref ImageBuilderDistroConfig 84 | InfrastructureConfigurationArn: !Ref ImageBuilderInfraConfig 85 | ImageTestsConfiguration: 86 | ImageTestsEnabled: true 87 | TimeoutMinutes: 90 88 | Schedule: 89 | PipelineExecutionStartCondition: EXPRESSION_MATCH_ONLY 90 | ScheduleExpression: cron(0 0 * * 0) #every week, on Sunday at 00:00 91 | Status: ENABLED 92 | Ubuntu18PipelineImageRecipe: 93 | Type: AWS::ImageBuilder::ImageRecipe 94 | Properties: 95 | Name: !Sub '${PipelinePrefix}-SecurityImageRecipe' 96 | Description: Installs Python 3, the AWS CLI, the Inspector Agent and the latest security patches for Ubuntu 18LTS and then tests Reboot - Managed by CloudFormation 97 | ParentImage: !Ref ParentAMIId 98 | Components: 99 | - ComponentArn: !Ref Ubuntu18PipelinePython3Bootstrap 100 | - ComponentArn: !Ref Ubuntu18InstallInspector 101 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/update-linux/1.0.0' 102 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/reboot-test-linux/1.0.0' 103 | Version: 1.0.0 104 | ImageBuilderDistroConfig: 105 | Type: AWS::ImageBuilder::DistributionConfiguration 106 | Properties: 107 | Name: !Sub '${PipelinePrefix}-ImageBuilderDistroConfig' 108 | Description: Ubuntu 18LTS Golden AMI distribution configuration - Managed by CloudFormation 109 | Distributions: 110 | - Region: us-east-1 111 | AmiDistributionConfiguration: 
112 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 113 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 114 | AmiTags: 115 | AmiTagKey: ami-tag-key 116 | LaunchPermissionConfiguration: 117 | UserIds: !Ref AwsAccountDistroList 118 | - Region: us-east-2 119 | AmiDistributionConfiguration: 120 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 121 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 122 | AmiTags: 123 | AmiTagKey: ami-tag-key 124 | LaunchPermissionConfiguration: 125 | UserIds: !Ref AwsAccountDistroList 126 | - Region: us-west-1 127 | AmiDistributionConfiguration: 128 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 129 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 130 | AmiTags: 131 | AmiTagKey: ami-tag-key 132 | LaunchPermissionConfiguration: 133 | UserIds: !Ref AwsAccountDistroList 134 | - Region: us-west-2 135 | AmiDistributionConfiguration: 136 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 137 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 138 | AmiTags: 139 | AmiTagKey: ami-tag-key 140 | LaunchPermissionConfiguration: 141 | UserIds: !Ref AwsAccountDistroList 142 | - Region: eu-west-1 143 | AmiDistributionConfiguration: 144 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 145 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 146 | AmiTags: 147 | AmiTagKey: ami-tag-key 148 | LaunchPermissionConfiguration: 149 | UserIds: !Ref AwsAccountDistroList 150 | - Region: eu-west-2 151 | AmiDistributionConfiguration: 152 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 153 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 154 | AmiTags: 155 | 
AmiTagKey: ami-tag-key 156 | LaunchPermissionConfiguration: 157 | UserIds: !Ref AwsAccountDistroList 158 | - Region: sa-east-1 159 | AmiDistributionConfiguration: 160 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 161 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 162 | AmiTags: 163 | AmiTagKey: ami-tag-key 164 | LaunchPermissionConfiguration: 165 | UserIds: !Ref AwsAccountDistroList 166 | - Region: ap-south-1 167 | AmiDistributionConfiguration: 168 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 169 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 170 | AmiTags: 171 | AmiTagKey: ami-tag-key 172 | LaunchPermissionConfiguration: 173 | UserIds: !Ref AwsAccountDistroList 174 | - Region: ap-southeast-2 175 | AmiDistributionConfiguration: 176 | Name: 'ami-hardened-ubuntu18-{{ imagebuilder:buildDate }}' 177 | Description: Ubuntu 18LTS Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 178 | AmiTags: 179 | AmiTagKey: ami-tag-key 180 | LaunchPermissionConfiguration: 181 | UserIds: !Ref AwsAccountDistroList 182 | ImageBuilderInfraConfig: 183 | Type: AWS::ImageBuilder::InfrastructureConfiguration 184 | Properties: 185 | Name: !Sub '${PipelinePrefix}-ImageBuilderInfraConfig' 186 | Description: Ubuntu 18LTS Golden AMI infrastructure configuration - Managed by CloudFormation 187 | InstanceProfileName: !Ref EC2ImageBuilderInstanceProfile 188 | Logging: 189 | S3Logs: 190 | S3BucketName: !Ref EC2ImageBuilderLogginBucket 191 | SecurityGroupIds: !Ref ImageBuildSecGroupIds 192 | SubnetId: !Ref ImageBuilderSubnetId 193 | TerminateInstanceOnFailure: true 194 | Ubuntu18PipelinePython3Bootstrap: 195 | Type: AWS::ImageBuilder::Component 196 | Properties: 197 | Name: !Sub '${PipelinePrefix}-Python3AwsCliBootstrap' 198 | Description: Python3 and AWS CLI Ubuntu18 bootstrapping - Managed by CloudFormation 199 | Platform: 
Linux 200 | Version: 1.0.0 201 | Data: | 202 | name: UpdateLinux 203 | description: This installs Python 3, AWSCLI and Boto3 onto Ubuntu18 204 | schemaVersion: 1.0 205 | phases: 206 | - name: build 207 | steps: 208 | - name: BuildUpdateKernel 209 | action: ExecuteBash 210 | inputs: 211 | commands: 212 | - | 213 | apt update 214 | apt install -y python3 215 | apt install -y python3-pip 216 | pip3 install boto3 217 | pip3 install awscli 218 | Ubuntu18InstallInspector: 219 | Type: AWS::ImageBuilder::Component 220 | Properties: 221 | Name: !Sub '${PipelinePrefix}-InstallInspector' 222 | Description: Installs the Amazon Inspector Agent on Ubuntu18 - Managed by CloudFormation 223 | Platform: Linux 224 | Version: 1.0.0 225 | Data: | 226 | name: InstallInspector 227 | description: This installs the Amazon Inspector Agent 228 | schemaVersion: 1.0 229 | phases: 230 | - name: build 231 | steps: 232 | - name: BuildUpdateKernel 233 | action: ExecuteBash 234 | inputs: 235 | commands: 236 | - | 237 | apt update 238 | wget https://inspector-agent.amazonaws.com/linux/latest/install 239 | bash install -------------------------------------------------------------------------------- /golden-ami-pipeline/GoldenAMIPipeline_WS19_CFN.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Creates a Golden AMI Baking pipeline for Windows Server 2019 using EC2 Image Builder 3 | Parameters: 4 | PipelinePrefix: 5 | Type: String 6 | Description: A lowercase prefix to be used for all EC2 Image Builder components and other services. 
Must be lowercase to avoid S3 issues 7 | Default: winserver2019-golden-pipeline 8 | ParentAMIId: 9 | Type: String 10 | Description: AMI ID of a Windows Server 2019 AMI to serve as the parent for the pipeline 11 | Default: ami-05bb2dae0b1de90b3 12 | ImageBuildSecGroupIds: 13 | Type: List<AWS::EC2::SecurityGroup::Id> 14 | Description: Security Groups to create the Image Builder infrastructure with 15 | ImageBuilderSubnetId: 16 | Type: AWS::EC2::Subnet::Id 17 | Description: Subnet ID to create the Image Builder infrastructure with 18 | AwsAccountDistroList: 19 | Type: CommaDelimitedList 20 | Description: List of AWS account IDs to share the final Golden AMIs with 21 | Resources: 22 | EC2ImageBuilderLogginBucket: 23 | Type: AWS::S3::Bucket 24 | Properties: 25 | BucketName: !Sub '${PipelinePrefix}-logs-bucket-${AWS::AccountId}' 26 | PublicAccessBlockConfiguration: 27 | BlockPublicAcls: true 28 | BlockPublicPolicy: true 29 | IgnorePublicAcls: true 30 | RestrictPublicBuckets: true 31 | BucketEncryption: 32 | ServerSideEncryptionConfiguration: 33 | - ServerSideEncryptionByDefault: 34 | SSEAlgorithm: AES256 35 | EC2ImageBuilderRole: 36 | Type: AWS::IAM::Role 37 | Properties: 38 | RoleName: !Sub '${PipelinePrefix}-EC2ImageBuilderRole' 39 | ManagedPolicyArns: 40 | - arn:aws:iam::aws:policy/EC2InstanceProfileForImageBuilder 41 | - arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore 42 | - arn:aws:iam::aws:policy/AmazonInspectorFullAccess 43 | Policies: 44 | - PolicyName: !Sub '${PipelinePrefix}-EC2ImageBuilderRolePolicy' 45 | PolicyDocument: 46 | Version: 2012-10-17 47 | Statement: 48 | - Effect: Allow 49 | Action: 50 | - ec2:CreateTags 51 | - ssm:SendCommand 52 | Resource: '*' 53 | - Effect: Allow 54 | Action: 55 | - s3:GetObject 56 | - s3:GetObjectVersion 57 | - s3:PutObject 58 | - s3:GetBucketAcl 59 | - s3:GetBucketLocation 60 | Resource: 61 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}' 62 | - !Sub 'arn:aws:s3:::${EC2ImageBuilderLogginBucket}/*' 63 | AssumeRolePolicyDocument: 64 | Version: 2012-10-17 65 | 
Statement: 66 | - Effect: Allow 67 | Principal: 68 | Service: 69 | - ec2.amazonaws.com 70 | Action: 71 | - sts:AssumeRole 72 | EC2ImageBuilderInstanceProfile: 73 | Type: AWS::IAM::InstanceProfile 74 | Properties: 75 | Roles: 76 | - !Ref EC2ImageBuilderRole 77 | WS2019AMIPipeline: 78 | Type: AWS::ImageBuilder::ImagePipeline 79 | Properties: 80 | Name: !Sub '${PipelinePrefix}-ImagePipeline' 81 | Description: Golden AMI Pipeline to create Windows Server 2019 images - Managed by CloudFormation 82 | ImageRecipeArn: !Ref WS2019AMIImageRecipe 83 | DistributionConfigurationArn: !Ref ImageBuilderDistroConfig 84 | InfrastructureConfigurationArn: !Ref ImageBuilderInfraConfig 85 | ImageTestsConfiguration: 86 | ImageTestsEnabled: true 87 | TimeoutMinutes: 90 88 | Schedule: 89 | PipelineExecutionStartCondition: EXPRESSION_MATCH_ONLY 90 | ScheduleExpression: cron(0 0 * * 0) #every week, on Sunday at 00:00 91 | Status: ENABLED 92 | WS2019AMIImageRecipe: 93 | Type: AWS::ImageBuilder::ImageRecipe 94 | Properties: 95 | Name: !Sub '${PipelinePrefix}-SecurityImageRecipe' 96 | Description: Applies STIG Low configurations, runs a CIS Benchmark scan and installs security patches, Python 3 and PowerShell for Windows Server 2019 and then tests Reboot - Managed by CloudFormation 97 | ParentImage: !Ref ParentAMIId 98 | Components: 99 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/stig-build-windows-low/1.0.0' 100 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/powershell-core-windows/6.2.4' 101 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/python-3-windows/3.8.2' 102 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/update-windows/1.0.0' 103 | - ComponentArn: !Sub 'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/inspector-test-windows/1.0.1' 104 | - ComponentArn: !Sub 
'arn:${AWS::Partition}:imagebuilder:${AWS::Region}:aws:component/reboot-test-windows/1.0.0' 105 | Version: 1.0.0 106 | ImageBuilderDistroConfig: 107 | Type: AWS::ImageBuilder::DistributionConfiguration 108 | Properties: 109 | Name: !Sub '${PipelinePrefix}-ImageBuilderDistroConfig' 110 | Description: Windows Server 2019 Golden AMI distribution configuration - Managed by CloudFormation 111 | Distributions: 112 | - Region: us-east-1 113 | AmiDistributionConfiguration: 114 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 115 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 116 | AmiTags: 117 | AmiTagKey: ami-tag-key 118 | LaunchPermissionConfiguration: 119 | UserIds: !Ref AwsAccountDistroList 120 | - Region: us-east-2 121 | AmiDistributionConfiguration: 122 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 123 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 124 | AmiTags: 125 | AmiTagKey: ami-tag-key 126 | LaunchPermissionConfiguration: 127 | UserIds: !Ref AwsAccountDistroList 128 | - Region: us-west-1 129 | AmiDistributionConfiguration: 130 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 131 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 132 | AmiTags: 133 | AmiTagKey: ami-tag-key 134 | LaunchPermissionConfiguration: 135 | UserIds: !Ref AwsAccountDistroList 136 | - Region: us-west-2 137 | AmiDistributionConfiguration: 138 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 139 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 140 | AmiTags: 141 | AmiTagKey: ami-tag-key 142 | LaunchPermissionConfiguration: 143 | UserIds: !Ref AwsAccountDistroList 144 | - Region: eu-west-1 145 | AmiDistributionConfiguration: 146 | Name: 
'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 147 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 148 | AmiTags: 149 | AmiTagKey: ami-tag-key 150 | LaunchPermissionConfiguration: 151 | UserIds: !Ref AwsAccountDistroList 152 | - Region: eu-west-2 153 | AmiDistributionConfiguration: 154 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 155 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 156 | AmiTags: 157 | AmiTagKey: ami-tag-key 158 | LaunchPermissionConfiguration: 159 | UserIds: !Ref AwsAccountDistroList 160 | - Region: sa-east-1 161 | AmiDistributionConfiguration: 162 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 163 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 164 | AmiTags: 165 | AmiTagKey: ami-tag-key 166 | LaunchPermissionConfiguration: 167 | UserIds: !Ref AwsAccountDistroList 168 | - Region: ap-south-1 169 | AmiDistributionConfiguration: 170 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 171 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 172 | AmiTags: 173 | AmiTagKey: ami-tag-key 174 | LaunchPermissionConfiguration: 175 | UserIds: !Ref AwsAccountDistroList 176 | - Region: ap-southeast-2 177 | AmiDistributionConfiguration: 178 | Name: 'ami-hardened-winserver2019-{{ imagebuilder:buildDate }}' 179 | Description: Windows Server 2019 Golden AMI - Created by EC2 Image Builder - Pipeline Managed by CloudFormation 180 | AmiTags: 181 | AmiTagKey: ami-tag-key 182 | LaunchPermissionConfiguration: 183 | UserIds: !Ref AwsAccountDistroList 184 | ImageBuilderInfraConfig: 185 | Type: AWS::ImageBuilder::InfrastructureConfiguration 186 | Properties: 187 | Name: !Sub '${PipelinePrefix}-ImageBuilderInfraConfig' 188 | Description: Windows Server 2019 
Golden AMI infrastructure configuration - Managed by CloudFormation 189 | InstanceProfileName: !Ref EC2ImageBuilderInstanceProfile 190 | Logging: 191 | S3Logs: 192 | S3BucketName: !Ref EC2ImageBuilderLogginBucket 193 | SecurityGroupIds: !Ref ImageBuildSecGroupIds 194 | SubnetId: !Ref ImageBuilderSubnetId 195 | TerminateInstanceOnFailure: true -------------------------------------------------------------------------------- /golden-ami-pipeline/README.md: -------------------------------------------------------------------------------- 1 | # Golden Amazon Machine Image (AMI) Pipelines 2 | Various implementations of Golden AMI Pipelines built with AWS [EC2 Image Builder](https://docs.aws.amazon.com/imagebuilder/latest/userguide/what-is-image-builder.html) and deployed via CloudFormation. The implementation examples include Amazon Linux 2, Ubuntu 18.04LTS and Windows Server 2019. At a minimum, each Pipeline includes [Recipes and Components](https://docs.aws.amazon.com/imagebuilder/latest/userguide/how-image-builder-works.html) which install the latest security updates. AWS-managed Components do not have feature parity across all OS flavors. Each AMI baking Recipe is described in the **Capability Set** section. 3 | 4 | ## Capability Set 5 | - Amazon Linux 2 (`GoldenAMIPipeline_AMZL2_CFN.yaml`) 6 | - [DISA STIG](https://docs.aws.amazon.com/imagebuilder/latest/userguide/image-builder-stig.html#ie-os-stig) Low configuration. Low (Cat. III) includes "Any vulnerability that degrades measures to protect against loss of confidentiality, availability, or integrity." 7 | - Install Python 3 8 | - Apply Linux security updates 9 | - Inspector CIS Benchmark test. Performs a Center for Internet Security (CIS) security assessment of an instance with the Amazon Inspector service. 
10 | - Reboot test: Tests whether the system can reboot successfully 11 | - Ubuntu 18.04LTS (`GoldenAMIPipeline_Ubuntu18_CFN.yaml`) 12 | - Install Python 3, Pip3, Boto3 and the AWS CLI (custom Component) 13 | - Install Inspector Agent (custom Component) 14 | - Apply Linux security updates 15 | - Reboot test: Tests whether the system can reboot successfully 16 | - Windows Server 2019 (`GoldenAMIPipeline_WS19_CFN.yaml`) 17 | - [DISA STIG](https://docs.aws.amazon.com/imagebuilder/latest/userguide/image-builder-stig.html#ie-os-stig) Low configuration. Low (Cat. III) includes "Any vulnerability that degrades measures to protect against loss of confidentiality, availability, or integrity." 18 | - Install PowerShell Core 6.2.4 19 | - Install Python 3 20 | - Inspector CIS Benchmark test. Performs a Center for Internet Security (CIS) security assessment of an instance with the Amazon Inspector service. 21 | - Reboot test: Tests whether the system can reboot successfully 22 | 23 | ## Getting Started 24 | **Note** Each CloudFormation template will create an EC2 Image Builder pipeline and all related services without sharing them between templates. This means you can have multiple IAM Roles, S3 Buckets and Instance Profiles that do the same thing but are just named differently. 25 | 26 | 1. Create a stack from any of the CloudFormation templates 27 | 2. 
After creation, navigate to the EC2 Image Builder console, select the pipeline and choose **Run** from the **Actions** dropdown -------------------------------------------------------------------------------- /golden-ami-pipeline/golden-ami-pipeline.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/golden-ami-pipeline/golden-ami-pipeline.jpg -------------------------------------------------------------------------------- /k8s-pipeline/K8s_DevSecOps_Pipeline_CFN.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Deploys a sample CodePipeline to scan and deploy Kubernetes configuration to EKS 3 | Parameters: 4 | InitialCommitBucket: 5 | Type: String 6 | Description: The name of the S3 bucket containing the package for the initial commit for the DevSecOps pipeline 7 | InitialCommitKey: 8 | Type: String 9 | Description: Name of the package for the initial commit for the DevSecOps pipeline DO NOT include .zip 10 | Default: k8s-devsecops 11 | EKSClusterName: 12 | Type: String 13 | Description: The name of the EKS cluster you will deploy your config to 14 | Resources: 15 | DevSecOpsCICDCodeCommit: 16 | Type: AWS::CodeCommit::Repository 17 | Properties: 18 | RepositoryDescription: Contains all artifacts needed for a K8s security scanning DevSecOps pipeline - Managed by CloudFormation 19 | RepositoryName: k8s-devsecops 20 | Code: 21 | S3: 22 | Bucket: !Ref InitialCommitBucket 23 | Key: !Sub '${InitialCommitKey}.zip' 24 | CodeBuildServiceRole: 25 | Type: AWS::IAM::Role 26 | Properties: 27 | RoleName: K8sDevSecOps-CodeBuildServiceRole 28 | ManagedPolicyArns: 29 | - arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser 30 | Policies: 31 | - PolicyName: K8sDevSecOps-CodeBuildServiceRolePolicy 32 | PolicyDocument: 33 | Version: 2012-10-17 34 | Statement: 35 | - Effect: 
Allow 36 | Action: 37 | - codecommit:GitPull 38 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 39 | - Effect: Allow 40 | Action: 41 | - s3:GetObject 42 | - s3:GetObjectVersion 43 | - s3:PutObject 44 | - s3:GetBucketAcl 45 | - s3:GetBucketLocation 46 | Resource: 47 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 48 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 49 | - Effect: Allow 50 | Action: 51 | - securityhub:BatchImportFindings 52 | - eks:List* 53 | - eks:Describe* 54 | - logs:CreateLogGroup 55 | - logs:CreateLogStream 56 | - logs:PutLogEvents 57 | Resource: '*' 58 | AssumeRolePolicyDocument: 59 | Version: 2012-10-17 60 | Statement: 61 | - Effect: Allow 62 | Principal: { Service: codebuild.amazonaws.com } 63 | Action: 64 | - sts:AssumeRole 65 | CodePipelineServiceRole: 66 | Type: AWS::IAM::Role 67 | Properties: 68 | RoleName: K8sDevSecOps-CodePipelineServiceRole 69 | Policies: 70 | - PolicyName: K8sDevSecOps-CodePipelineServiceRolePolicy 71 | PolicyDocument: 72 | Version: 2012-10-17 73 | Statement: 74 | - Effect: Allow 75 | Action: 76 | - codecommit:CancelUploadArchive 77 | - codecommit:GetBranch 78 | - codecommit:GetCommit 79 | - codecommit:GetUploadArchiveStatus 80 | - codecommit:UploadArchive 81 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 82 | - Effect: Allow 83 | Action: 84 | - cloudwatch:* 85 | Resource: '*' 86 | - Effect: Allow 87 | Action: 88 | - s3:GetObject 89 | - s3:GetObjectVersion 90 | - s3:PutObject 91 | - s3:GetBucketAcl 92 | - s3:GetBucketLocation 93 | - s3:PutBucketPolicy 94 | - s3:ListAllMyBuckets 95 | - s3:ListBucket 96 | Resource: 97 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 98 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 99 | - Effect: Allow 100 | Action: 101 | - codebuild:BatchGetBuilds 102 | - codebuild:StartBuild 103 | Resource: 104 | - !GetAtt SecretScanStage.Arn 105 | - !GetAtt PolarisStage.Arn 106 | - !GetAtt SkanStage.Arn 107 | - !GetAtt 
K8sDeploymentStage.Arn 108 | AssumeRolePolicyDocument: 109 | Version: 2012-10-17 110 | Statement: 111 | - Effect: Allow 112 | Principal: { Service: codepipeline.amazonaws.com } 113 | Action: 114 | - sts:AssumeRole 115 | DevSecOpsCICDCodePipelineArtifactBucket: 116 | Type: AWS::S3::Bucket 117 | Properties: 118 | BucketName: !Sub 'k8s-devsecopscicd-artifacts-${AWS::AccountId}' 119 | PublicAccessBlockConfiguration: 120 | BlockPublicAcls: true 121 | BlockPublicPolicy: true 122 | IgnorePublicAcls: true 123 | RestrictPublicBuckets: true 124 | VersioningConfiguration: 125 | Status: Enabled 126 | BucketEncryption: 127 | ServerSideEncryptionConfiguration: 128 | - ServerSideEncryptionByDefault: 129 | SSEAlgorithm: AES256 130 | SecretScanStage: 131 | Type: AWS::CodeBuild::Project 132 | Properties: 133 | Artifacts: 134 | Type: CODEPIPELINE 135 | Description: Uses Yelp's Detect-Secrets to look for any secrets or sensitive material - Managed by CloudFormation 136 | Environment: 137 | ComputeType: BUILD_GENERAL1_SMALL 138 | Image: aws/codebuild/standard:5.0 139 | PrivilegedMode: True 140 | Type: LINUX_CONTAINER 141 | LogsConfig: 142 | CloudWatchLogs: 143 | Status: ENABLED 144 | Name: K8sDevSecOps-SecretScan 145 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 146 | Source: 147 | BuildSpec: buildspecs/secrets-buildspec.yaml 148 | Type: CODEPIPELINE 149 | SkanStage: 150 | Type: AWS::CodeBuild::Project 151 | Properties: 152 | Artifacts: 153 | Type: CODEPIPELINE 154 | Description: Uses Alcide's sKan to perform static security analysis of K8s config files and helm charts - Managed by CloudFormation 155 | Environment: 156 | ComputeType: BUILD_GENERAL1_SMALL 157 | Image: aws/codebuild/standard:5.0 158 | PrivilegedMode: True 159 | Type: LINUX_CONTAINER 160 | LogsConfig: 161 | CloudWatchLogs: 162 | Status: ENABLED 163 | Name: K8sDevSecOps-SkanStage 164 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 165 | Source: 166 | BuildSpec: buildspecs/skan-buildspec.yaml 167 | Type: CODEPIPELINE 168 
| PolarisStage: 169 | Type: AWS::CodeBuild::Project 170 | Properties: 171 | Artifacts: 172 | Type: CODEPIPELINE 173 | Description: Uses Fairwinds' Polaris to perform static security analysis of K8s config files and helm charts - Managed by CloudFormation 174 | Environment: 175 | ComputeType: BUILD_GENERAL1_SMALL 176 | Image: aws/codebuild/standard:5.0 177 | PrivilegedMode: True 178 | Type: LINUX_CONTAINER 179 | LogsConfig: 180 | CloudWatchLogs: 181 | Status: ENABLED 182 | Name: K8sDevSecOps-PolarisStage 183 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 184 | Source: 185 | BuildSpec: buildspecs/polaris-buildspec.yaml 186 | Type: CODEPIPELINE 187 | K8sDeploymentStage: 188 | Type: AWS::CodeBuild::Project 189 | Properties: 190 | Artifacts: 191 | Type: CODEPIPELINE 192 | Description: Installs Kubectl, authenticates to EKS and applies your latest configuration spec - Managed by CloudFormation 193 | Environment: 194 | ComputeType: BUILD_GENERAL1_SMALL 195 | Image: aws/codebuild/standard:5.0 196 | PrivilegedMode: True 197 | Type: LINUX_CONTAINER 198 | EnvironmentVariables: 199 | - Name: EKS_CLUSTER_NAME 200 | Type: PLAINTEXT 201 | Value: !Ref EKSClusterName 202 | LogsConfig: 203 | CloudWatchLogs: 204 | Status: ENABLED 205 | Name: K8sDevSecOps-K8sDeploymentStage 206 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 207 | Source: 208 | BuildSpec: buildspecs/deployment-buildspec.yaml 209 | Type: CODEPIPELINE 210 | DevSecOpsCICDCodePipeline: 211 | Type: AWS::CodePipeline::Pipeline 212 | Properties: 213 | ArtifactStore: 214 | Location: !Ref DevSecOpsCICDCodePipelineArtifactBucket 215 | Type: S3 216 | Name: !Sub 'DevSecOpsCICD-scan-cicd-pipeline-${AWS::AccountId}' 217 | RestartExecutionOnUpdate: True 218 | RoleArn: !GetAtt CodePipelineServiceRole.Arn 219 | Stages: 220 | - 221 | Name: Source 222 | Actions: 223 | - 224 | Name: SourceAction 225 | ActionTypeId: 226 | Category: Source 227 | Owner: AWS 228 | Version: 1 229 | Provider: CodeCommit 230 | Configuration: 231 | 
RepositoryName: !GetAtt DevSecOpsCICDCodeCommit.Name 232 | BranchName: master 233 | OutputArtifacts: 234 | - 235 | Name: SourceOutput 236 | RunOrder: 1 237 | - 238 | Name: SecretScan 239 | Actions: 240 | - 241 | InputArtifacts: 242 | - 243 | Name: SourceOutput 244 | Name: BuildAction 245 | ActionTypeId: 246 | Category: Build 247 | Owner: AWS 248 | Version: 1 249 | Provider: CodeBuild 250 | Configuration: 251 | ProjectName: !Ref SecretScanStage 252 | PrimarySource: SourceOutput 253 | RunOrder: 2 254 | - 255 | Name: SkanStage 256 | Actions: 257 | - 258 | InputArtifacts: 259 | - 260 | Name: SourceOutput 261 | Name: BuildAction 262 | ActionTypeId: 263 | Category: Build 264 | Owner: AWS 265 | Version: 1 266 | Provider: CodeBuild 267 | Configuration: 268 | ProjectName: !Ref SkanStage 269 | PrimarySource: SourceOutput 270 | RunOrder: 3 271 | - 272 | Name: PolarisStage 273 | Actions: 274 | - 275 | InputArtifacts: 276 | - 277 | Name: SourceOutput 278 | Name: BuildAction 279 | ActionTypeId: 280 | Category: Build 281 | Owner: AWS 282 | Version: 1 283 | Provider: CodeBuild 284 | Configuration: 285 | ProjectName: !Ref PolarisStage 286 | PrimarySource: SourceOutput 287 | RunOrder: 4 288 | - 289 | Name: K8sDeploymentStage 290 | Actions: 291 | - 292 | InputArtifacts: 293 | - 294 | Name: SourceOutput 295 | Name: BuildAction 296 | ActionTypeId: 297 | Category: Build 298 | Owner: AWS 299 | Version: 1 300 | Provider: CodeBuild 301 | Configuration: 302 | ProjectName: !Ref K8sDeploymentStage 303 | PrimarySource: SourceOutput 304 | RunOrder: 5 -------------------------------------------------------------------------------- /k8s-pipeline/README.md: -------------------------------------------------------------------------------- 1 | # Kubernetes Security Scanning - CodeSuite 2 | Sample implementation of a Kubernetes security testing and deployment pipeline using the AWS CodeSuite. 
CodeCommit is used as the SCM, CodeBuild projects are used as the CI servers and CodePipeline is the CD automation engine which will start builds as new code is pushed (directly or via PR) to the Master branch. This pipeline assumes you have an *existing* EKS cluster and access to the IAM entity that created it. This pipeline also uses an image from Docker Hub instead of from ECR. 3 | 4 | This pipeline will look for regex and high-entropy based secrets/sensitive values using Detect-Secrets. Alcide's sKan and Fairwinds' Polaris are used to perform static analysis on K8s deployments and Helm charts to look for security and best practice violations. The last build stage will authenticate to your EKS Cluster and apply the `deployment.yaml` that was just scanned. 5 | 6 | ## Before you start 7 | **If you have an existing EKS cluster that you have `system:masters` RBAC access into you can skip this section** 8 | 9 | **Note**: If you do not have an EKS cluster and would rather use `eksctl` to create it refer to the [Getting started with eksctl](https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html) section of the Amazon EKS User Guide. If you use `eksctl` or use the below methods, ensure it is with an IAM principal that you can get access to readily to add additional IAM principals to the Cluster RBAC. It is recommended that you take these actions from a Cloud9 IDE with an Instance Profile attached. 10 | 11 | 1. Install or upgrade your AWS CLI to the latest version. Ensure it is at least version `1.16.156`. 12 | ```bash 13 | sudo apt install -y python3 python3-pip 14 | pip3 install awscli 15 | aws --version 16 | ``` 17 | 18 | 2. [Install](https://kubernetes.io/docs/tasks/tools/install-kubectl/) `kubectl` on your system. Ensure it is at least version `1.16`. 
19 | ```bash 20 | curl -LO https://storage.googleapis.com/kubernetes-release/release/`curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt`/bin/linux/amd64/kubectl 21 | chmod +x ./kubectl 22 | sudo mv ./kubectl /usr/local/bin/kubectl 23 | kubectl version --short --client 24 | ``` 25 | 26 | 3. Create the EKS Cluster service role by navigating to the IAM Console, creating a Role and choosing **EKS - Cluster** from the list of service use cases. It will automatically attach the necessary service role permissions. Give your Role a unique name and create it. 27 | 28 | 4. If you do not have a VPC with public and private subnets you can use this [AWS-provided template](https://amazon-eks.s3.us-west-2.amazonaws.com/cloudformation/2020-06-10/amazon-eks-vpc-private-subnets.yaml). You should specify at least one Private and one Public subnet for the next step. You should also have a Security Group that *at least* allows access on HTTPS (TCP 443). 29 | 30 | 5. Create an EKS Cluster with the CLI. Replace the placeholder values with your Role ARN, Subnet IDs and Security Groups. This can take up to 20 minutes to complete and will only provision the Cluster and *not* the Node Groups. If you want to specify other values refer to the create-cluster CLI command [here](https://docs.aws.amazon.com/cli/latest/reference/eks/create-cluster.html). 31 | ```bash 32 | aws eks create-cluster \ 33 | --kubernetes-version 1.16 \ 34 | --name devsecops-demo \ 35 | --role-arn $ROLE_ARN \ 36 | --resources-vpc-config subnetIds=$SUBNET_1,$SUBNET_2,securityGroupIds=$SECURITY_GROUP 37 | ``` 38 | 39 | 6. After your EKS Cluster has finished creating, authenticate to the cluster and interact with it using `kubectl` to ensure everything is working as expected. 40 | ```bash 41 | aws eks --region $AWS_REGION update-kubeconfig --name devsecops-demo 42 | kubectl get svc 43 | ``` 44 | 45 | 7. 
To create an IAM role for Managed Node Groups, navigate to the IAM Console and create a role with the **EC2** common use case. Attach the below AWS Managed Policies and finish creating the role. You should give it a name such as **NodeInstanceRole-DevSecOps** and change the default description. 46 | ```bash 47 | AmazonEKSWorkerNodePolicy 48 | AmazonEKS_CNI_Policy 49 | AmazonEC2ContainerRegistryReadOnly 50 | ``` 51 | 52 | 8. Create a Managed Node Group with the CLI. Replace the placeholder values as necessary. **Note** the subnets should be configured to automatically assign Public IPs, otherwise the health check for the Node Groups will fail. 53 | ```bash 54 | aws eks create-nodegroup \ 55 | --cluster-name devsecops-demo \ 56 | --nodegroup-name devsecops-node-group \ 57 | --subnets $SUBNET_1 $SUBNET_2 \ 58 | --instance-types t3.medium \ 59 | --node-role $NODEGROUP_IAM_ROLE 60 | ``` 61 | 62 | 9. After the Node Group has finished creating you can move onto the next steps. This is important because it will create an AWS authentication ConfigMap in the `kube-system` namespace that you will interact with in Step 4 of the next section. 63 | 64 | ## Getting Started 65 | 1. Clone this repository and upload `k8s-devsecops.zip` to a bucket of your choosing. 66 | 2. Deploy a CloudFormation Stack from `K8s_DevSecOps_Pipeline_CFN.yaml` in your AWS account; all necessary artifacts will be pushed as the first commit to the created CodeCommit repository. **Note** the first deployment will fail because your CodeBuild IAM Role does not have the required RBAC access into your EKS Cluster. 67 | 3. After the Stack has finished creating, navigate to the Resources tab and copy the ARN of the IAM Role for CodeBuild; you may need to select the hyperlink to be taken to the IAM Console - you *may* be able to catch the IAM Role ARN before the Stack finishes creating.
After you have the ARN run the following command to generate an `aws-auth` ConfigMap: 68 | ```bash 69 | aws eks --region $AWS_REGION update-kubeconfig --name $AWS_CLUSTER_NAME 70 | kubectl get configmaps aws-auth -n kube-system -o yaml > aws-auth.yaml 71 | ``` 72 | 4. Open and edit the `aws-auth.yaml` file to add the CodeBuild role to `data.mapRoles` - replace the placeholder values with your Account Number and the name of the CodeBuild role: 73 | ```yaml 74 | - groups: 75 | - system:masters 76 | rolearn: arn:aws:iam::$ACCOUNT_NUMBER:role/K8sDevSecOps-CodeBuildServiceRole 77 | username: K8sDevSecOps-CodeBuildServiceRole 78 | ``` 79 | Your finalized `aws-auth.yaml` should somewhat resemble the following: 80 | ```yaml 81 | apiVersion: v1 82 | data: 83 | mapRoles: | 84 | - groups: 85 | - system:masters 86 | rolearn: arn:aws:iam::$ACCOUNT_NUMBER:role/K8sDevSecOps-CodeBuildServiceRole 87 | username: K8sDevSecOps-CodeBuildServiceRole 88 | - groups: 89 | - system:bootstrappers 90 | - system:nodes 91 | rolearn: arn:aws:iam::$ACCOUNT_NUMBER:role/NodeInstanceRole-DevSecOps 92 | username: system:node:{{EC2PrivateDNSName}} 93 | kind: ConfigMap 94 | metadata: 95 | creationTimestamp: "" 96 | name: aws-auth 97 | namespace: kube-system 98 | ``` 99 | 5. Set the new configuration on your cluster: `kubectl apply -f aws-auth.yaml` 100 | 6. Navigate to the AWS CodePipeline Console, select the K8s pipeline and choose **Release Change** on the top-right of the console. This will manually restart the entire pipeline from the last commit in your source. 101 | 102 | To utilize the scanning tools deployed in this solution, upload all K8s YAML configs or Helm charts into the `/artifacts` subdirectory in the solution. You will need to clone the CodeCommit repository or upload the files manually. Refer to the [Getting started](https://docs.aws.amazon.com/codecommit/latest/userguide/getting-started-topnode.html) section of the AWS CodeCommit User Guide for help with either.
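If you map the CodeBuild role into more than one cluster, the `mapRoles` entry from step 4 can be generated programmatically instead of hand-edited. Below is a minimal, illustrative Python sketch; the account number is a placeholder and the role name is the example one created by this walkthrough's CloudFormation stack:

```python
def codebuild_maproles_entry(account_id: str, role_name: str) -> str:
    """Render an aws-auth mapRoles entry granting system:masters to an IAM role."""
    role_arn = f"arn:aws:iam::{account_id}:role/{role_name}"
    return (
        "- groups:\n"
        "  - system:masters\n"
        f"  rolearn: {role_arn}\n"
        f"  username: {role_name}"
    )

# Placeholder account number; substitute your own values
print(codebuild_maproles_entry("111122223333", "K8sDevSecOps-CodeBuildServiceRole"))
```

The rendered snippet can then be pasted under `data.mapRoles` in `aws-auth.yaml` exactly as shown in step 4.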
103 | 104 | If you use JSON to define your k8s Deployments, or if you want to scan your Helm Charts, you will need to modify all example `*-buildspec.yaml` files in `src/buildspecs/` and push the entire solution again. -------------------------------------------------------------------------------- /k8s-pipeline/k8s-devsecops.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/k8s-pipeline/k8s-devsecops.zip -------------------------------------------------------------------------------- /k8s-pipeline/k8s-pipeline-architecure.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/k8s-pipeline/k8s-pipeline-architecure.jpg -------------------------------------------------------------------------------- /k8s-pipeline/src/artifacts/deployment.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: v1 2 | kind: Namespace 3 | metadata: 4 | name: web 5 | --- 6 | apiVersion: apps/v1 7 | kind: Deployment 8 | metadata: 9 | namespace: web 10 | name: web-proxy 11 | spec: 12 | selector: 13 | matchLabels: 14 | app: web 15 | replicas: 2 16 | template: 17 | metadata: 18 | labels: 19 | app: web 20 | spec: 21 | containers: 22 | - name: nginx 23 | image: nginx@sha256:0efad4d09a419dc6d574c3c3baacb804a530acd61d5eba72cb1f14e1f5ac0c8f 24 | resources: 25 | limits: 26 | memory: "200Mi" 27 | cpu: "700m" 28 | requests: 29 | memory: "200Mi" 30 | cpu: "700m" 31 | ports: 32 | - containerPort: 80 -------------------------------------------------------------------------------- /k8s-pipeline/src/buildspecs/deployment-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt 
update 7 | - curl -LO https://storage.googleapis.com/kubernetes-release/release/`curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt | egrep 'v([0-9]+\.){2}[0-9]+'`/bin/linux/amd64/kubectl 8 | - chmod +x ./kubectl 9 | - mv ./kubectl /usr/local/bin/kubectl 10 | - pip3 install --upgrade pip 11 | - pip3 install awscli 12 | - cd artifacts 13 | pre_build: 14 | commands: 15 | - aws eks --region $AWS_DEFAULT_REGION update-kubeconfig --name $EKS_CLUSTER_NAME 16 | build: 17 | commands: 18 | - kubectl apply -f deployment.yaml 19 | - echo Latest spec pushed to cluster on `date` -------------------------------------------------------------------------------- /k8s-pipeline/src/buildspecs/detect-secrets-ejection.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | try: 4 | with open('secret-results.json') as json_file: 5 | data = json.load(json_file) 6 | if str(data['results']) != '{}': 7 | exit(1) 8 | else: 9 | exit(0) 10 | except Exception as e: 11 | print(e) 12 | raise -------------------------------------------------------------------------------- /k8s-pipeline/src/buildspecs/polaris-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - cp ./security-hub/polarisAsff.py ./artifacts/ 10 | - cd artifacts 11 | - wget https://github.com/FairwindsOps/polaris/releases/download/1.1.0/polaris_1.1.0_linux_amd64.tar.gz 12 | - tar -xzf polaris_1.1.0_linux_amd64.tar.gz 13 | - chmod +x polaris 14 | build: 15 | commands: 16 | - ./polaris audit --format json --output-file polaris-findings.json --set-exit-code-on-danger --audit-path ./deployment.yaml 17 | post_build: 18 | commands: 19 | - jq .
polaris-findings.json 20 | - python3 polarisAsff.py 21 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /k8s-pipeline/src/buildspecs/secrets-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - pip3 install detect-secrets 10 | - cp ./buildspecs/detect-secrets-ejection.py ./artifacts/ 11 | - cp ./security-hub/secretsAsff.py ./artifacts/ 12 | - cd artifacts 13 | build: 14 | commands: 15 | - detect-secrets scan . --all-files > secret-results.json 16 | - jq . secret-results.json 17 | - python3 detect-secrets-ejection.py 18 | - echo detect-secrets scan completed on `date` 19 | post_build: 20 | commands: 21 | - python3 secretsAsff.py 22 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /k8s-pipeline/src/buildspecs/skan-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - cp ./security-hub/skanAsff.py ./artifacts/ 10 | - cd artifacts 11 | - wget https://github.com/alcideio/skan/releases/download/v0.8.0/skan_v0.8.0_linux_amd64 12 | - chmod +x skan_v0.8.0_linux_amd64 13 | build: 14 | commands: 15 | - ./skan_v0.8.0_linux_amd64 manifest --output json --outputfile skan-findings.json -f ./ 16 | post_build: 17 | commands: 18 | - jq . 
skan-findings.json 19 | - python3 skanAsff.py 20 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /k8s-pipeline/src/security-hub/polarisAsff.py: -------------------------------------------------------------------------------- 1 | import json 2 | import boto3 3 | import datetime 4 | import os 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | 14 | with open('polaris-findings.json') as json_file: 15 | data = json.load(json_file) 16 | sourceName = str(data['SourceName']) 17 | 18 | def pod_findings(): 19 | for r in data['Results']: 20 | for key in r['PodResult']['Results']: 21 | if str(r['PodResult']['Results'][key]['Success']) == 'False': 22 | findingId = r['PodResult']['Results'][key]['ID'] 23 | findingDescription = r['PodResult']['Results'][key]['Message'] 24 | findingCategory = r['PodResult']['Results'][key]['Category'] 25 | polarisSev = r['PodResult']['Results'][key]['Severity'] 26 | try: 27 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 28 | response = securityhub.batch_import_findings( 29 | Findings=[ 30 | { 31 | 'SchemaVersion': '2018-10-08', 32 | 'Id': codebuildBuildArn + 'polaris-scan' + findingId, 33 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 34 | 'GeneratorId': codebuildBuildArn, 35 | 'AwsAccountId': awsAccount, 36 | 'Types': [ 'Software and Configuration Checks' ], 37 | 'CreatedAt': iso8601Time, 38 | 'UpdatedAt': iso8601Time, 39 | 'Severity': { 40 | 'Label': 'MEDIUM', 41 | 'Original': polarisSev 42 | }, 43 | 'Title': '[Polaris] Security or network 
misconfigurations identified in Kubernetes configuration file or Helm Chart', 44 | 'Description': 'Polaris identified security or network misconfigurations in Kubernetes configuration file or Helm Chart during build ' + codebuildBuildArn + '. The following check in the ' + findingCategory + ' category failed: ' + findingDescription, 45 | 'ProductFields': { 46 | 'Product Name': 'Polaris' 47 | }, 48 | 'Resources': [ 49 | { 50 | 'Type': 'AwsCodeBuildProject', 51 | 'Id': codebuildBuildArn, 52 | 'Partition': 'aws', 53 | 'Region': awsRegion, 54 | 'Details': { 55 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn }, 56 | 'Other': { 57 | 'findingId': findingId, 58 | 'findingCategory': findingCategory, 59 | 'sourceName': sourceName 60 | } 61 | } 62 | } 63 | ], 64 | 'RecordState': 'ACTIVE', 65 | 'Workflow': {'Status': 'NEW'} 66 | } 67 | ] 68 | ) 69 | print(response) 70 | except Exception as e: 71 | print(e) 72 | else: 73 | pass 74 | 75 | def container_findings(): 76 | for r in data['Results']: 77 | for c in r['PodResult']['ContainerResults']: 78 | for key in c['Results']: 79 | if str(c['Results'][key]['Success']) == 'False': 80 | findingId = c['Results'][key]['ID'] 81 | findingDescription = c['Results'][key]['Message'] 82 | findingCategory = c['Results'][key]['Category'] 83 | polarisSev = c['Results'][key]['Severity'] 84 | try: 85 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 86 | response = securityhub.batch_import_findings( 87 | Findings=[ 88 | { 89 | 'SchemaVersion': '2018-10-08', 90 | 'Id': codebuildBuildArn + 'polaris-scan' + findingId, 91 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 92 | 'GeneratorId': codebuildBuildArn, 93 | 'AwsAccountId': awsAccount, 94 | 'Types': [ 'Software and Configuration Checks' ], 95 | 'CreatedAt': iso8601Time, 96 | 'UpdatedAt': iso8601Time, 97 | 'Severity': { 98 | 'Label': 'MEDIUM', 99 | 'Original': polarisSev 100 |
101 | 'Title': '[Polaris] Security or network misconfigurations identified in Kubernetes configuration file or Helm Chart', 102 | 'Description': 'Polaris identified security or network misconfigurations in Kubernetes configuration file or Helm Chart during build ' + codebuildBuildArn + '. The following check in the ' + findingCategory + ' category failed: ' + findingDescription, 103 | 'ProductFields': { 104 | 'Product Name': 'Polaris' 105 | }, 106 | 'Resources': [ 107 | { 108 | 'Type': 'AwsCodeBuildProject', 109 | 'Id': codebuildBuildArn, 110 | 'Partition': 'aws', 111 | 'Region': awsRegion, 112 | 'Details': { 113 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn }, 114 | 'Other': { 115 | 'findingId': findingId, 116 | 'findingCategory': findingCategory, 117 | 'sourceName': sourceName 118 | } 119 | } 120 | } 121 | ], 122 | 'RecordState': 'ACTIVE', 123 | 'Workflow': {'Status': 'NEW'} 124 | } 125 | ] 126 | ) 127 | print(response) 128 | except Exception as e: 129 | print(e) 130 | else: 131 | pass 132 | 133 | def main(): 134 | pod_findings() 135 | container_findings() 136 | 137 | main() -------------------------------------------------------------------------------- /k8s-pipeline/src/security-hub/secretsAsff.py: -------------------------------------------------------------------------------- 1 | import json 2 | import boto3 3 | import datetime 4 | import os 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | try: 14 | with open('secret-results.json') as json_file: 15 | data = json.load(json_file) 16 | if str(data['results']) == '{}': 17 | pass 18 | else: 19 | secretDetectionCheck = str(data['results']) 20 | secretDetectionCheck = (secretDetectionCheck[:700] + '..') if
len(secretDetectionCheck) > 700 else secretDetectionCheck 21 | secretTimestamp = str(data['generated_at']) 22 | try: 23 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 24 | response = securityhub.batch_import_findings( 25 | Findings=[ 26 | { 27 | 'SchemaVersion': '2018-10-08', 28 | 'Id': codebuildBuildArn + 'detect-secrets-scan' + secretTimestamp, 29 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 30 | 'GeneratorId': codebuildBuildArn, 31 | 'AwsAccountId': awsAccount, 32 | 'Types': [ 33 | 'Sensitive Data Identifications', 34 | 'Effects/Data Exposure' 35 | ], 36 | 'CreatedAt': iso8601Time, 37 | 'UpdatedAt': iso8601Time, 38 | 'Severity': { 'Label': 'CRITICAL' }, 39 | 'Title': 'Detect-Secrets identified sensitive information in source code', 40 | 'Description': 'Detect-Secrets identified sensitive information in source code of build ' + codebuildBuildArn + ' with the following information (may be truncated): ' + secretDetectionCheck, 41 | 'ProductFields': { 42 | 'Product Name': 'Detect-Secrets' 43 | }, 44 | 'Resources': [ 45 | { 46 | 'Type': 'AwsCodeBuildProject', 47 | 'Id': codebuildBuildArn, 48 | 'Partition': 'aws', 49 | 'Region': awsRegion, 50 | 'Details': { 51 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn } 52 | } 53 | } 54 | ], 55 | 'RecordState': 'ACTIVE', 56 | 'Workflow': {'Status': 'NEW'} 57 | } 58 | ] 59 | ) 60 | print(response) 61 | except Exception as e: 62 | print(e) 63 | except Exception as e: 64 | print(e) -------------------------------------------------------------------------------- /k8s-pipeline/src/security-hub/skanAsff.py: -------------------------------------------------------------------------------- 1 | import json 2 | import boto3 3 | import datetime 4 | import os 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = 
sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | 14 | with open('skan-findings.json') as json_file: 15 | data = json.load(json_file) 16 | for key in data['Reports']: 17 | for findings in data['Reports'][key]['Results']: 18 | checkCategory = str(findings['Category']) 19 | checkModule = str(findings['Check']['ModuleId']) 20 | checkGroup = str(findings['Check']['GroupId']) 21 | checkNum = str(findings['Check']['CheckId']) 22 | newCheckId = checkModule + '.' + checkGroup + '.' + checkNum 23 | checkName = str(findings['Check']['CheckTitle']) 24 | checkDescription = str(findings['Message']) 25 | checkDescription = (checkDescription[:700] + '..') if len(checkDescription) > 700 else checkDescription 26 | reccDescription = str(findings['Message']) 27 | reccDescription = (reccDescription[:1000] + '..') if len(reccDescription) > 1000 else reccDescription 28 | reccLink = str(findings['References'][0]) 29 | skanSev = str(findings['Severity']) 30 | if skanSev == 'Critical': 31 | shSev = 'CRITICAL' 32 | elif skanSev == 'High': 33 | shSev = 'HIGH' 34 | elif skanSev == 'Medium': 35 | shSev = 'MEDIUM' 36 | else: 37 | shSev = 'LOW' 38 | try: 39 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 40 | response = securityhub.batch_import_findings( 41 | Findings=[ 42 | { 43 | 'SchemaVersion': '2018-10-08', 44 | 'Id': codebuildBuildArn + 'sKan-scan' + newCheckId, 45 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 46 | 'GeneratorId': codebuildBuildArn, 47 | 'AwsAccountId': awsAccount, 48 | 'Types': [ 'Software and Configuration Checks' ], 49 | 'CreatedAt': iso8601Time, 50 | 'UpdatedAt': iso8601Time, 51 | 'Severity': { 52 | 'Label': shSev, 53 | 'Original': skanSev 54 | }, 55 | 'Title': '[sKan] Security misconfigurations identified in Kubernetes 
configuration file or Helm Chart', 56 | 'Description': 'Alcide sKan identified security misconfigurations in Kubernetes configuration file or Helm Chart during build ' + codebuildBuildArn + ' with the following information (may be truncated): ' + checkDescription, 57 | 'ProductFields': { 58 | 'Product Name': 'sKan' 59 | }, 60 | 'Remediation': { 61 | 'Recommendation': { 62 | 'Text': reccDescription, 63 | 'Url': reccLink 64 | } 65 | }, 66 | 'Resources': [ 67 | { 68 | 'Type': 'AwsCodeBuildProject', 69 | 'Id': codebuildBuildArn, 70 | 'Partition': 'aws', 71 | 'Region': awsRegion, 72 | 'Details': { 73 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn }, 74 | 'Other': { 75 | 'findingId': newCheckId, 76 | 'findingName': checkName, 77 | 'findingCategory': checkCategory 78 | } 79 | } 80 | } 81 | ], 82 | 'RecordState': 'ACTIVE', 83 | 'Workflow': {'Status': 'NEW'} 84 | } 85 | ] 86 | ) 87 | print(response) 88 | except Exception as e: 89 | print(e) -------------------------------------------------------------------------------- /terraform-pipeline/README.md: -------------------------------------------------------------------------------- 1 | # Terraform Security Scanning - CodeSuite 2 | Sample implementation of a Terraform security testing pipeline using the AWS CodeSuite. CodeCommit is used as the SCM, CodeBuild projects are used as the CI servers and CodePipeline is the CD automation engine which will start builds as new code is pushed (directly or via PR) to the Master branch. The final build stage also applies your Terraform state and persists the `.tfstate` files into an S3 backend, using a DynamoDB table for state locking across concurrent builds. 3 | 4 | This pipeline lints Terraform configuration files using TFLint and looks for regex- and high-entropy-based secrets/sensitive values using Detect-Secrets. Static security analysis is performed against the templates using TFSec and Checkov.
A standalone project is used to bootstrap and apply the Terraform state. To avoid interpolation errors due to the `terraform` declaration within `provider.tf` not supporting variables, we use `envsubst` to substitute the CodeBuild environment variables into `provider.tf`. 5 | 6 | All security findings will be sent to AWS Security Hub if any of the tests fail - that way you can keep track of them and use further downstream integrations such as PagerDuty, Azure DevOps or JIRA to take the Security Hub findings and parse them down as alerts / issues / bugs / etc. Those are provided as [add-ons in ElectricEye](https://github.com/jonrau1/ElectricEye/tree/master/add-ons) 7 | 8 | ## Solution architecture 9 | ![Architecture Diagram](./terraform-pipeline-architecture.jpg) 10 | 11 | ## Getting Started 12 | Clone this repository and upload `tf-devsecops.zip` to a bucket of your choosing. Deploy a stack from `tf-security-pipeline.yml` in your AWS account; all necessary artifacts will be pushed as the first commit to the created CodeCommit repository. 13 | 14 | To utilize the scanning utilities deployed in this solution, upload all Terraform configuration files to the `src/artifacts` subdirectory in the solution. You can view the individual CodeBuild buildspecs in `src/buildspecs/` and Security Hub integration scripts in `src/security-hub` 15 | 16 | **Important Note:** Modify the permissions of the CodeBuild Role in `tf-security-pipeline.yml` to give it permissions for whatever you will be deploying with Terraform. In the basic example the CodeBuild role is given `'*'` resource access for S3:CreateBucket and S3:DeleteBucket to apply and destroy the sample bucket in this pipeline.
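The `envsubst` rendering of `provider.template` described above can be reproduced locally to sanity-check the backend configuration before a build runs. A minimal sketch using the stdlib `string.Template` in place of `envsubst`; the variable names mirror `src/artifacts/provider.template`, while the values are invented placeholders:

```python
from string import Template  # stdlib stand-in for envsubst in this sketch

# Example values that the terraform-buildspec exports as environment variables
env = {
    "REGION": "us-east-1",
    "STATE_BUCKET": "my-tf-state-bucket",
    "ENV": "dev",
    "PROJECT_NAME": "tf-devsecops",
    "STATE_DDB": "my-tf-lock-table",
}

# Same shape as src/artifacts/provider.template
provider_template = """provider "aws" {
  region = "$REGION"
}

terraform {
  backend "s3" {
    bucket         = "$STATE_BUCKET"
    key            = "$ENV/$PROJECT_NAME/terraform.tfstate"
    region         = "$REGION"
    dynamodb_table = "$STATE_DDB"
    encrypt        = true
  }
}
"""

# Substitute the variables the way `envsubst < provider.template > provider.tf` would
rendered = Template(provider_template).substitute(env)
print(rendered)
```

Inspecting the rendered output before committing helps catch a missing or misnamed environment variable, which would otherwise only surface when `terraform init` fails in CodeBuild.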
-------------------------------------------------------------------------------- /terraform-pipeline/src/artifacts/main.tf: -------------------------------------------------------------------------------- 1 | resource "aws_s3_bucket" "mybucket" { 2 | bucket_prefix = "my-codebuild-bucket" 3 | acl = "private" 4 | server_side_encryption_configuration { 5 | rule { 6 | apply_server_side_encryption_by_default { 7 | sse_algorithm = "AES256" 8 | } 9 | } 10 | } 11 | } -------------------------------------------------------------------------------- /terraform-pipeline/src/artifacts/provider.template: -------------------------------------------------------------------------------- 1 | provider "aws" { 2 | region = "$REGION" 3 | } 4 | 5 | terraform { 6 | backend "s3" { 7 | bucket = "$STATE_BUCKET" 8 | key = "$ENV/$PROJECT_NAME/terraform.tfstate" 9 | region = "$REGION" 10 | dynamodb_table = "$STATE_DDB" 11 | encrypt = true 12 | } 13 | } -------------------------------------------------------------------------------- /terraform-pipeline/src/artifacts/variables.tf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/terraform-pipeline/src/artifacts/variables.tf -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/checkov-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - pip3 install checkov 10 | - cp ./security-hub/checkovAsff.py ./artifacts/ 11 | - cd artifacts 12 | build: 13 | commands: 14 | ## this showcases how to skip certain Checkov checks which may not be relevant 15 | ## in this case, MFA Delete, Versioning and Access Logging are turned off 16 | ## Only Root users can 
interact with MFA delete anyway - skipping specific checks, rather than forcing an exit 0, still lets builds break on other findings while whitelisting these actions 17 | ## ideally that should be supplemented with an exception recorded in ServiceNow or a similar system... 18 | - checkov --framework terraform --skip-check CKV_AWS_18,CKV_AWS_52,CKV_AWS_21 -d . -o json > checkov-findings.json 19 | - echo Checkov scan completed on `date` 20 | post_build: 21 | commands: 22 | - jq . checkov-findings.json 23 | - python3 checkovAsff.py 24 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/detect-secrets-ejection.py: -------------------------------------------------------------------------------- 1 | import json 2 | 3 | try: 4 | with open('secret-results.json') as json_file: 5 | data = json.load(json_file) 6 | if str(data['results']) != '{}': 7 | exit(1) 8 | else: 9 | exit(0) 10 | except Exception as e: 11 | print(e) 12 | raise -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/secrets-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - pip3 install detect-secrets 10 | - cp ./buildspecs/detect-secrets-ejection.py ./artifacts/ 11 | - cp ./security-hub/secretsAsff.py ./artifacts/ 12 | - cd artifacts 13 | build: 14 | commands: 15 | - detect-secrets scan . --all-files > secret-results.json 16 | - jq .
secret-results.json 17 | - python3 detect-secrets-ejection.py 18 | - echo detect-secrets scan completed on `date` 19 | post_build: 20 | commands: 21 | - python3 secretsAsff.py 22 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/terraform-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update 7 | - apt install -y unzip wget 8 | - wget https://releases.hashicorp.com/terraform/0.12.28/terraform_0.12.28_linux_amd64.zip 9 | - unzip terraform_0.12.28_linux_amd64.zip 10 | - mv terraform /usr/local/bin/ 11 | pre_build: 12 | commands: 13 | - cd artifacts 14 | - export REGION=$AWS_REGION 15 | - export STATE_BUCKET=$TF_STATE_BUCKET 16 | - export STATE_DDB=$TF_STATE_DDB_TABLE 17 | - export ENV=$SDLC_ENV_NAME 18 | - export PROJECT_NAME=$PROJECT_NAME 19 | - envsubst < provider.template > provider.tf 20 | - terraform init 21 | build: 22 | commands: 23 | - terraform $TERRAFORM_COMMAND -auto-approve 24 | - echo Terraform deployment completed on `date` -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/tflint-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - cp ./security-hub/tflintAsff.py ./artifacts/ 10 | - cd artifacts 11 | build: 12 | commands: 13 | - curl -L "$(curl -Ls https://api.github.com/repos/terraform-linters/tflint/releases/latest | grep -o -E "https://.+?_linux_amd64.zip")" -o tflint.zip && unzip tflint.zip && rm tflint.zip 14 | - ./tflint -f=json . 
> tflint-findings.json 15 | - echo TFLint check completed on `date` 16 | post_build: 17 | commands: 18 | - jq . tflint-findings.json 19 | - python3 tflintAsff.py 20 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /terraform-pipeline/src/buildspecs/tfsec-buildspec.yaml: -------------------------------------------------------------------------------- 1 | version: 0.2 2 | 3 | phases: 4 | install: 5 | commands: 6 | - apt update && apt install -y jq 7 | - pip3 install --upgrade pip 8 | - pip3 install boto3 9 | - cp ./security-hub/tfsecAsff.py ./artifacts/ 10 | - cd artifacts 11 | build: 12 | commands: 13 | - wget https://github.com/liamg/tfsec/releases/download/v0.21.0/tfsec-linux-amd64 14 | - chmod +x ./tfsec-linux-amd64 15 | # we are willingly ignoring the bucket logging check because it's annoying 16 | # remove the -e flag if you do not want to ignore any checks 17 | - ./tfsec-linux-amd64 -e AWS002 -f json > tfsec-findings.json 18 | - echo TFSec scan completed on `date` 19 | post_build: 20 | commands: 21 | - jq .
tfsec-findings.json 22 | - python3 tfsecAsff.py 23 | - echo Security Hub script executed on `date` review the findings there -------------------------------------------------------------------------------- /terraform-pipeline/src/security-hub/checkovAsff.py: -------------------------------------------------------------------------------- 1 | import os 2 | import boto3 3 | import json 4 | import datetime 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | 14 | with open('checkov-findings.json') as json_file: 15 | data = json.load(json_file) 16 | checkTest = str(data['results']['failed_checks']) 17 | if checkTest == '[]': 18 | pass 19 | else: 20 | for checks in data['results']['failed_checks']: 21 | checkId = str(checks['check_id']) 22 | checkName = str(checks['check_name']) 23 | checkovSev = str(checks['check_result']['result']) 24 | fileName = str(checks['file_path']) 25 | tfResource = str(checks['resource']) 26 | reccoUrl = str(checks['guideline']) 27 | try: 28 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 29 | response = securityhub.batch_import_findings( 30 | Findings=[ 31 | { 32 | 'SchemaVersion': '2018-10-08', 33 | 'Id': codebuildBuildArn + 'checkov-findings-' + checkId + '-' + tfResource, 34 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 35 | 'GeneratorId': codebuildBuildArn, 36 | 'AwsAccountId': awsAccount, 37 | 'Types': [ 'Software and Configuration Checks' ], 38 | 'CreatedAt': iso8601Time, 39 | 'UpdatedAt': iso8601Time, 40 | 'Severity': { 41 | 'Label': 'LOW', 42 | 'Original': checkovSev 43 | }, 44 | 'Title': '[Checkov] Security misconfiguration identified 
in Terraform source files', 45 | 'Description': 'Checkov has identified security misconfigurations in the Terraform source files of build ' + codebuildBuildArn + '. Check ' + checkId + ' has failed with the following message ' + checkName + ' for filename ' + fileName, 46 | 'ProductFields': { 47 | 'Product Name': 'Checkov' 48 | }, 49 | 'Remediation': { 50 | 'Recommendation': { 51 | 'Text': 'For information on the Checkov failed check refer to this guideline from Bridgecrew', 52 | 'Url': reccoUrl 53 | } 54 | }, 55 | 'Resources': [ 56 | { 57 | 'Type': 'AwsCodeBuildProject', 58 | 'Id': codebuildBuildArn, 59 | 'Partition': 'aws', 60 | 'Region': awsRegion, 61 | 'Details': { 62 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn }, 63 | 'Other': { 64 | 'checkId': checkId, 65 | 'checkName': checkName, 66 | 'fileName': fileName, 67 | 'tfResource': tfResource 68 | } 69 | } 70 | } 71 | ], 72 | 'RecordState': 'ACTIVE', 73 | 'Workflow': {'Status': 'NEW'} 74 | } 75 | ] 76 | ) 77 | print(response) 78 | except Exception as e: 79 | print(e) -------------------------------------------------------------------------------- /terraform-pipeline/src/security-hub/secretsAsff.py: -------------------------------------------------------------------------------- 1 | import json 2 | import boto3 3 | import datetime 4 | import os 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | try: 14 | with open('secret-results.json') as json_file: 15 | data = json.load(json_file) 16 | if str(data['results']) == '{}': 17 | pass 18 | else: 19 | secretDetectionCheck = str(data['results']) 20 | secretDetectionCheck = (secretDetectionCheck[:700] + '..') if len(secretDetectionCheck) > 700 else
secretDetectionCheck 21 | secretTimestamp = str(data['generated_at']) 22 | try: 23 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 24 | response = securityhub.batch_import_findings( 25 | Findings=[ 26 | { 27 | 'SchemaVersion': '2018-10-08', 28 | 'Id': codebuildBuildArn + 'detect-secrets-scan' + secretTimestamp, 29 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 30 | 'GeneratorId': codebuildBuildArn, 31 | 'AwsAccountId': awsAccount, 32 | 'Types': [ 33 | 'Sensitive Data Identifications', 34 | 'Effects/Data Exposure' 35 | ], 36 | 'CreatedAt': iso8601Time, 37 | 'UpdatedAt': iso8601Time, 38 | 'Severity': { 'Label': 'CRITICAL' }, 39 | 'Title': 'Detect-Secrets identified sensitive information in source code', 40 | 'Description': 'Detect-Secrets identified sensitive information in source code of build ' + codebuildBuildArn + ' with the following information (may be truncated): ' + secretDetectionCheck, 41 | 'ProductFields': { 42 | 'Product Name': 'Detect-Secrets' 43 | }, 44 | 'Resources': [ 45 | { 46 | 'Type': 'AwsCodeBuildProject', 47 | 'Id': codebuildBuildArn, 48 | 'Partition': 'aws', 49 | 'Region': awsRegion, 50 | 'Details': { 51 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn } 52 | } 53 | } 54 | ], 55 | 'RecordState': 'ACTIVE', 56 | 'Workflow': {'Status': 'NEW'} 57 | } 58 | ] 59 | ) 60 | print(response) 61 | except Exception as e: 62 | print(e) 63 | except Exception as e: 64 | print(e) -------------------------------------------------------------------------------- /terraform-pipeline/src/security-hub/tflintAsff.py: -------------------------------------------------------------------------------- 1 | import os 2 | import boto3 3 | import json 4 | import datetime 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = 
sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | 14 | with open('tflint-findings.json') as json_file: 15 | data = json.load(json_file) 16 | lintIssues = str(data['issues']) 17 | if lintIssues == '[]': 18 | pass 19 | else: 20 | try: 21 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 22 | response = securityhub.batch_import_findings( 23 | Findings=[ 24 | { 25 | 'SchemaVersion': '2018-10-08', 26 | 'Id': codebuildBuildArn + 'tflint-findings', 27 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 28 | 'GeneratorId': codebuildBuildArn, 29 | 'AwsAccountId': awsAccount, 30 | 'Types': [ 'Software and Configuration Checks' ], 31 | 'CreatedAt': iso8601Time, 32 | 'UpdatedAt': iso8601Time, 33 | 'Severity': { 'Label': 'LOW' }, 34 | 'Title': '[TFLint] Syntax or security issues identified in Terraform source files', 35 | 'Description': 'TFLint has identified syntax or security issues in Terraform source files in source code of build ' + codebuildBuildArn + '. Refer to the CodeBuild logs to view any issues that were found.', 36 | 'ProductFields': { 37 | 'Product Name': 'TFLint' 38 | }, 39 | 'Resources': [ 40 | { 41 | 'Type': 'AwsCodeBuildProject', 42 | 'Id': codebuildBuildArn, 43 | 'Partition': 'aws', 44 | 'Region': awsRegion, 45 | 'Details': { 46 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn } 47 | } 48 | } 49 | ], 50 | 'RecordState': 'ACTIVE', 51 | 'Workflow': {'Status': 'NEW'} 52 | } 53 | ] 54 | ) 55 | print(response) 56 | except Exception as e: 57 | print(e) 58 | lintErrors = str(data['errors']) 59 | if lintErrors == '[]': 60 | pass 61 | else: 62 | try: 63 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 64 | response = securityhub.batch_import_findings( 65 | Findings=[ 66 | { 67 | 
'SchemaVersion': '2018-10-08', 68 | 'Id': codebuildBuildArn + 'tflint-errors', 69 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 70 | 'GeneratorId': codebuildBuildArn, 71 | 'AwsAccountId': awsAccount, 72 | 'Types': [ 'Software and Configuration Checks' ], 73 | 'CreatedAt': iso8601Time, 74 | 'UpdatedAt': iso8601Time, 75 | 'Severity': { 'Label': 'MEDIUM' }, 76 | 'Title': '[TFLint] Syntax or security errors identified in Terraform source files', 77 | 'Description': 'TFLint has identified syntax or security errors in Terraform source files in source code of build ' + codebuildBuildArn + '. Refer to the CodeBuild logs to view any errors that were found.', 78 | 'ProductFields': { 79 | 'Product Name': 'TFLint' 80 | }, 81 | 'Resources': [ 82 | { 83 | 'Type': 'AwsCodeBuildProject', 84 | 'Id': codebuildBuildArn, 85 | 'Partition': 'aws', 86 | 'Region': awsRegion, 87 | 'Details': { 88 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn } 89 | } 90 | } 91 | ], 92 | 'RecordState': 'ACTIVE', 93 | 'Workflow': {'Status': 'NEW'} 94 | } 95 | ] 96 | ) 97 | print(response) 98 | except Exception as e: 99 | print(e) -------------------------------------------------------------------------------- /terraform-pipeline/src/security-hub/tfsecAsff.py: -------------------------------------------------------------------------------- 1 | import os 2 | import boto3 3 | import json 4 | import datetime 5 | # import sechub + sts boto3 client 6 | securityhub = boto3.client('securityhub') 7 | sts = boto3.client('sts') 8 | # retrieve account id from STS 9 | awsAccount = sts.get_caller_identity()['Account'] 10 | # retrieve env vars from codebuild 11 | awsRegion = os.environ['AWS_REGION'] 12 | codebuildBuildArn = os.environ['CODEBUILD_BUILD_ARN'] 13 | 14 | with open('tfsec-findings.json') as json_file: 15 | data = json.load(json_file) 16 | if str(data) == "{'results': None}": 17 | pass 18 | else: 19 | for result in 
data['results']: 20 | ruleId = str(result['rule_id']) 21 | ruleLink = str(result['link']) 22 | fileName = str(result['location']['filename']) 23 | startLine = str(result['location']['start_line']) 24 | endLine = str(result['location']['end_line']) 25 | ruleDescr = str(result['description']) 26 | tfsecSev = str(result['severity']) 27 | try: 28 | iso8601Time = datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat() 29 | response = securityhub.batch_import_findings( 30 | Findings=[ 31 | { 32 | 'SchemaVersion': '2018-10-08', 33 | 'Id': codebuildBuildArn + 'tfsec-findings-' + ruleId + '-' + startLine + '-' + endLine, 34 | 'ProductArn': 'arn:aws:securityhub:' + awsRegion + ':' + awsAccount + ':product/' + awsAccount + '/default', 35 | 'GeneratorId': codebuildBuildArn, 36 | 'AwsAccountId': awsAccount, 37 | 'Types': [ 'Software and Configuration Checks' ], 38 | 'CreatedAt': iso8601Time, 39 | 'UpdatedAt': iso8601Time, 40 | 'Severity': { 41 | 'Label': 'LOW', 42 | 'Original': tfsecSev 43 | }, 44 | 'Title': '[TFSec] Security misconfiguration identified in Terraform source files', 45 | 'Description': 'TFSec has identified security misconfigurations in Terraform source files in source code of build ' + codebuildBuildArn + '. 
Check ' + ruleId + ' has failed with the following message ' + ruleDescr + ' for filename ' + fileName, 46 | 'ProductFields': { 47 | 'Product Name': 'TFSec' 48 | }, 49 | 'Remediation': { 50 | 'Recommendation': { 51 | 'Text': 'For information on the failed check refer to this entry from TFSec', 52 | 'Url': ruleLink 53 | } 54 | }, 55 | 'Resources': [ 56 | { 57 | 'Type': 'AwsCodeBuildProject', 58 | 'Id': codebuildBuildArn, 59 | 'Partition': 'aws', 60 | 'Region': awsRegion, 61 | 'Details': { 62 | 'AwsCodeBuildProject': { 'Name': codebuildBuildArn }, 63 | 'Other': { 64 | 'ruleId': ruleId, 65 | 'ruleDescription': ruleDescr, 66 | 'fileName': fileName 67 | } 68 | } 69 | } 70 | ], 71 | 'RecordState': 'ACTIVE', 72 | 'Workflow': {'Status': 'NEW'} 73 | } 74 | ] 75 | ) 76 | print(response) 77 | except Exception as e: 78 | print(e) -------------------------------------------------------------------------------- /terraform-pipeline/terraform-pipeline-architecture.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/terraform-pipeline/terraform-pipeline-architecture.jpg -------------------------------------------------------------------------------- /terraform-pipeline/tf-devsecops.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jonrau1/AWS-DevSecOps-Factory/c07921704872b2102c47098038300fc5adf4f003/terraform-pipeline/tf-devsecops.zip -------------------------------------------------------------------------------- /terraform-pipeline/tf-secdevops.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Deploys a sample CodePipeline to scan and deploy Terraform configuration files 3 | Parameters: 4 | InitialCommitBucket: 5 | Type: String 6 | Description: The name of the S3 bucket 
containing the package for the initial commit for the DevSecOps pipeline 7 | InitialCommitKey: 8 | Type: String 9 | Description: Name of the package for the initial commit for the DevSecOps pipeline. DO NOT include .zip 10 | Default: tf-devsecops 11 | SDLCEnvName: 12 | Type: String 13 | Description: This populates the SDLC_ENV_NAME value for CodeBuild. It is used to add another subdirectory to the path of the .tfstate file stored in S3 remote state 14 | Default: development 15 | CodeBuildProjectName: 16 | Type: String 17 | Description: This populates the PROJECT_NAME value for CodeBuild. It both sets the name and adds another subdirectory to the path of the .tfstate file stored in S3 remote state 18 | Default: tf-devsecops-codebuild 19 | TerraformApplyCommand: 20 | Type: String 21 | Description: This populates the TERRAFORM_COMMAND value for CodeBuild. If you ever need to roll a deployment back, replace this value with 'destroy' and manually release the change 22 | Default: apply 23 | Resources: 24 | DevSecOpsCICDCodeCommit: 25 | Type: AWS::CodeCommit::Repository 26 | Properties: 27 | RepositoryDescription: Contains all build artifacts for a Terraform DevSecOps pipeline 28 | RepositoryName: tf-devsecops 29 | Code: 30 | S3: 31 | Bucket: !Ref InitialCommitBucket 32 | Key: !Sub '${InitialCommitKey}.zip' 33 | TerraformRemoteS3: 34 | Type: AWS::S3::Bucket 35 | Properties: 36 | BucketName: !Sub 'terraform-devsecopscicd-remote-state-${AWS::AccountId}' 37 | PublicAccessBlockConfiguration: 38 | BlockPublicAcls: true 39 | BlockPublicPolicy: true 40 | IgnorePublicAcls: true 41 | RestrictPublicBuckets: true 42 | VersioningConfiguration: 43 | Status: Enabled 44 | BucketEncryption: 45 | ServerSideEncryptionConfiguration: 46 | - ServerSideEncryptionByDefault: 47 | SSEAlgorithm: AES256 48 | TerraformRemoteDynamoDB: 49 | Type: AWS::DynamoDB::Table 50 | Properties: 51 | AttributeDefinitions: 52 | - 53 | AttributeName: LockID 54 | AttributeType: S 55 | KeySchema: 56 | - 57 | 
AttributeName: LockID 58 | KeyType: HASH 59 | BillingMode: PAY_PER_REQUEST 60 | TableName: Terraform-DevSecOps-State-Locking 61 | CodeBuildServiceRole: 62 | Type: AWS::IAM::Role 63 | Properties: 64 | RoleName: Terraform-DevSecOps-CodeBuildServiceRole 65 | Policies: 66 | - PolicyName: Terraform-DevSecOps-CodeBuildServiceRolePolicy 67 | PolicyDocument: 68 | Version: 2012-10-17 69 | Statement: 70 | - Effect: Allow 71 | Action: 72 | - codecommit:GitPull 73 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 74 | - Effect: Allow 75 | Action: 76 | - dynamodb:GetItem 77 | - dynamodb:PutItem 78 | - dynamodb:DeleteItem 79 | Resource: 80 | - !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/Terraform-DevSecOps-State-Locking' 81 | - Effect: Allow 82 | Action: 83 | - s3:GetObject 84 | - s3:GetObjectVersion 85 | - s3:PutObject 86 | - s3:GetBucketAcl 87 | - s3:GetBucketLocation 88 | - s3:ListBucket 89 | - s3:ListAllMyBuckets 90 | - s3:HeadBucket 91 | - s3:GetBucketLocation 92 | - s3:GetObjectVersion 93 | Resource: 94 | - !Sub 'arn:aws:s3:::${TerraformRemoteS3}' 95 | - !Sub 'arn:aws:s3:::${TerraformRemoteS3}/*' 96 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 97 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 98 | - Effect: Allow 99 | Action: 100 | # S3 permissions needed for the Terraform CRUD 101 | - s3:List* 102 | - s3:HeadBucket 103 | - s3:Get* 104 | - s3:CreateBucket 105 | - s3:DeleteBucket 106 | - s3:PutEncryptionConfiguration 107 | - s3:GetEncryptionConfiguration 108 | - kms:Decrypt 109 | - ssm:GetParameter 110 | - ssm:GetParameters 111 | - logs:CreateLogGroup 112 | - logs:CreateLogStream 113 | - logs:PutLogEvents 114 | Resource: '*' 115 | AssumeRolePolicyDocument: 116 | Version: 2012-10-17 117 | Statement: 118 | - Effect: Allow 119 | Principal: { Service: codebuild.amazonaws.com } 120 | Action: 121 | - sts:AssumeRole 122 | CodePipelineServiceRole: 123 | Type: AWS::IAM::Role 124 | Properties: 125 | RoleName: 
Terraform-DevSecOps-CodePipelineServiceRole 126 | Policies: 127 | - PolicyName: Terraform-DevSecOps-CodePipelineServiceRolePolicy 128 | PolicyDocument: 129 | Version: 2012-10-17 130 | Statement: 131 | - Effect: Allow 132 | Action: 133 | - codecommit:CancelUploadArchive 134 | - codecommit:GetBranch 135 | - codecommit:GetCommit 136 | - codecommit:GetUploadArchiveStatus 137 | - codecommit:UploadArchive 138 | Resource: !GetAtt DevSecOpsCICDCodeCommit.Arn 139 | - Effect: Allow 140 | Action: 141 | - cloudwatch:* 142 | - ssm:GetParameter 143 | - ssm:GetParameters 144 | - kms:Decrypt 145 | Resource: '*' 146 | - Effect: Allow 147 | Action: 148 | - s3:GetObject 149 | - s3:GetObjectVersion 150 | - s3:PutObject 151 | - s3:GetBucketAcl 152 | - s3:GetBucketLocation 153 | - s3:PutBucketPolicy 154 | - s3:ListAllMyBuckets 155 | - s3:ListBucket 156 | Resource: 157 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}' 158 | - !Sub 'arn:aws:s3:::${DevSecOpsCICDCodePipelineArtifactBucket}/*' 159 | - Effect: Allow 160 | Action: 161 | - codebuild:BatchGetBuilds 162 | - codebuild:StartBuild 163 | Resource: 164 | - !GetAtt TFLinter.Arn 165 | - !GetAtt SecretScanStage.Arn 166 | - !GetAtt TfsecStage.Arn 167 | - !GetAtt CheckovStage.Arn 168 | - !GetAtt TFDeploymentStage.Arn 169 | AssumeRolePolicyDocument: 170 | Version: 2012-10-17 171 | Statement: 172 | - Effect: Allow 173 | Principal: { Service: codepipeline.amazonaws.com } 174 | Action: 175 | - sts:AssumeRole 176 | DevSecOpsCICDCodePipelineArtifactBucket: 177 | Type: AWS::S3::Bucket 178 | Properties: 179 | BucketName: !Sub 'terraform-devsecopscicd-artifacts-${AWS::AccountId}' 180 | PublicAccessBlockConfiguration: 181 | BlockPublicAcls: true 182 | BlockPublicPolicy: true 183 | IgnorePublicAcls: true 184 | RestrictPublicBuckets: true 185 | VersioningConfiguration: 186 | Status: Enabled 187 | BucketEncryption: 188 | ServerSideEncryptionConfiguration: 189 | - ServerSideEncryptionByDefault: 190 | SSEAlgorithm: AES256 191 | 
TFLinter: 192 | Type: AWS::CodeBuild::Project 193 | Properties: 194 | Artifacts: 195 | Type: CODEPIPELINE 196 | Description: Uses TFLint to lint Terraform configuration files - Managed by CloudFormation 197 | Environment: 198 | ComputeType: BUILD_GENERAL1_SMALL 199 | Image: aws/codebuild/standard:4.0 200 | PrivilegedMode: True 201 | Type: LINUX_CONTAINER 202 | LogsConfig: 203 | CloudWatchLogs: 204 | Status: ENABLED 205 | Name: TFLinter 206 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 207 | Source: 208 | BuildSpec: buildspecs/tflint-buildspec.yaml 209 | Type: CODEPIPELINE 210 | SecretScanStage: 211 | Type: AWS::CodeBuild::Project 212 | Properties: 213 | Artifacts: 214 | Type: CODEPIPELINE 215 | Description: Uses Yelp's Detect-Secrets to look for any secrets or sensitive material - Managed by CloudFormation 216 | Environment: 217 | ComputeType: BUILD_GENERAL1_SMALL 218 | Image: aws/codebuild/standard:4.0 219 | PrivilegedMode: True 220 | Type: LINUX_CONTAINER 221 | LogsConfig: 222 | CloudWatchLogs: 223 | Status: ENABLED 224 | Name: SecretScan 225 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 226 | Source: 227 | BuildSpec: buildspecs/secrets-buildspec.yaml 228 | Type: CODEPIPELINE 229 | TfsecStage: 230 | Type: AWS::CodeBuild::Project 231 | Properties: 232 | Artifacts: 233 | Type: CODEPIPELINE 234 | Description: Uses TFSec to detect security misconfigurations in Terraform configuration files - Managed by CloudFormation 235 | Environment: 236 | ComputeType: BUILD_GENERAL1_SMALL 237 | Image: aws/codebuild/standard:4.0 238 | PrivilegedMode: True 239 | Type: LINUX_CONTAINER 240 | LogsConfig: 241 | CloudWatchLogs: 242 | Status: ENABLED 243 | Name: TfsecStage 244 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 245 | Source: 246 | BuildSpec: buildspecs/tfsec-buildspec.yaml 247 | Type: CODEPIPELINE 248 | CheckovStage: 249 | Type: AWS::CodeBuild::Project 250 | Properties: 251 | Artifacts: 252 | Type: CODEPIPELINE 253 | Description: Uses Checkov to detect security 
misconfigurations in Terraform configuration files - Managed by CloudFormation 254 | Environment: 255 | ComputeType: BUILD_GENERAL1_SMALL 256 | Image: aws/codebuild/standard:4.0 257 | PrivilegedMode: True 258 | Type: LINUX_CONTAINER 259 | LogsConfig: 260 | CloudWatchLogs: 261 | Status: ENABLED 262 | Name: CheckovStage 263 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 264 | Source: 265 | BuildSpec: buildspecs/checkov-buildspec.yaml 266 | Type: CODEPIPELINE 267 | TFDeploymentStage: 268 | Type: AWS::CodeBuild::Project 269 | Properties: 270 | Artifacts: 271 | Type: CODEPIPELINE 272 | Description: Installs Terraform and applies the state into your AWS Account - Managed by CloudFormation 273 | Environment: 274 | ComputeType: BUILD_GENERAL1_MEDIUM 275 | Image: aws/codebuild/standard:4.0 276 | PrivilegedMode: True 277 | Type: LINUX_CONTAINER 278 | EnvironmentVariables: 279 | - Name: TF_STATE_BUCKET 280 | Type: PLAINTEXT 281 | Value: !Ref TerraformRemoteS3 282 | - Name: TF_STATE_DDB_TABLE 283 | Type: PLAINTEXT 284 | Value: !Ref TerraformRemoteDynamoDB 285 | - Name: SDLC_ENV_NAME 286 | Type: PLAINTEXT 287 | Value: !Ref SDLCEnvName 288 | - Name: PROJECT_NAME 289 | Type: PLAINTEXT 290 | Value: !Ref CodeBuildProjectName 291 | - Name: TERRAFORM_COMMAND 292 | Type: PLAINTEXT 293 | Value: !Ref TerraformApplyCommand 294 | LogsConfig: 295 | CloudWatchLogs: 296 | Status: ENABLED 297 | Name: !Ref CodeBuildProjectName 298 | ServiceRole: !GetAtt CodeBuildServiceRole.Arn 299 | Source: 300 | BuildSpec: buildspecs/terraform-buildspec.yaml 301 | Type: CODEPIPELINE 302 | DevSecOpsCICDCodePipeline: 303 | Type: AWS::CodePipeline::Pipeline 304 | Properties: 305 | ArtifactStore: 306 | Location: !Ref DevSecOpsCICDCodePipelineArtifactBucket 307 | Type: S3 308 | Name: !Sub 'DevSecOpsCICD-tf-cicd-pipeline-${AWS::AccountId}' 309 | RestartExecutionOnUpdate: True 310 | RoleArn: !GetAtt CodePipelineServiceRole.Arn 311 | Stages: 312 | - 313 | Name: Source 314 | Actions: 315 | - 316 | Name: SourceAction 
317 | ActionTypeId: 318 | Category: Source 319 | Owner: AWS 320 | Version: 1 321 | Provider: CodeCommit 322 | Configuration: 323 | RepositoryName: !GetAtt DevSecOpsCICDCodeCommit.Name 324 | BranchName: master 325 | OutputArtifacts: 326 | - 327 | Name: SourceOutput 328 | RunOrder: 1 329 | - 330 | Name: TF-Linter 331 | Actions: 332 | - 333 | InputArtifacts: 334 | - 335 | Name: SourceOutput 336 | Name: BuildAction 337 | ActionTypeId: 338 | Category: Build 339 | Owner: AWS 340 | Version: 1 341 | Provider: CodeBuild 342 | Configuration: 343 | ProjectName: !Ref TFLinter 344 | PrimarySource: SourceOutput 345 | RunOrder: 2 346 | - 347 | Name: SecretScan 348 | Actions: 349 | - 350 | InputArtifacts: 351 | - 352 | Name: SourceOutput 353 | Name: BuildAction 354 | ActionTypeId: 355 | Category: Build 356 | Owner: AWS 357 | Version: 1 358 | Provider: CodeBuild 359 | Configuration: 360 | ProjectName: !Ref SecretScanStage 361 | PrimarySource: SourceOutput 362 | RunOrder: 3 363 | - 364 | Name: TfsecStage 365 | Actions: 366 | - 367 | InputArtifacts: 368 | - 369 | Name: SourceOutput 370 | Name: BuildAction 371 | ActionTypeId: 372 | Category: Build 373 | Owner: AWS 374 | Version: 1 375 | Provider: CodeBuild 376 | Configuration: 377 | ProjectName: !Ref TfsecStage 378 | PrimarySource: SourceOutput 379 | RunOrder: 4 380 | - 381 | Name: CheckovStage 382 | Actions: 383 | - 384 | InputArtifacts: 385 | - 386 | Name: SourceOutput 387 | Name: BuildAction 388 | ActionTypeId: 389 | Category: Build 390 | Owner: AWS 391 | Version: 1 392 | Provider: CodeBuild 393 | Configuration: 394 | ProjectName: !Ref CheckovStage 395 | PrimarySource: SourceOutput 396 | RunOrder: 5 397 | - 398 | Name: TFDeploymentStage 399 | Actions: 400 | - 401 | InputArtifacts: 402 | - 403 | Name: SourceOutput 404 | Name: BuildAction 405 | ActionTypeId: 406 | Category: Build 407 | Owner: AWS 408 | Version: 1 409 | Provider: CodeBuild 410 | Configuration: 411 | ProjectName: !Ref TFDeploymentStage 412 | PrimarySource: SourceOutput 
413 | RunOrder: 6 --------------------------------------------------------------------------------
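The four security-hub scripts above (checkovAsff.py, secretsAsff.py, tflintAsff.py, tfsecAsff.py) all assemble nearly identical ASFF payloads for `batch_import_findings`, differing only in Id, severity, product name, and description. As a minimal sketch, the shared structure could be factored into a pure helper that is unit-testable without AWS credentials; the `build_asff_finding` function and its parameter names below are illustrative, not part of this repository:

```python
import datetime

def build_asff_finding(finding_id, product_name, title, description,
                       aws_account, aws_region, build_arn,
                       severity_label='LOW', types=None):
    """Illustrative helper: assemble an ASFF finding dict matching the
    pattern repeated across the checkov/tflint/tfsec/secrets scripts."""
    iso8601_time = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return {
        'SchemaVersion': '2018-10-08',
        # the scripts key findings off the CodeBuild build ARN plus a suffix
        'Id': build_arn + finding_id,
        'ProductArn': 'arn:aws:securityhub:' + aws_region + ':' + aws_account +
                      ':product/' + aws_account + '/default',
        'GeneratorId': build_arn,
        'AwsAccountId': aws_account,
        'Types': types or ['Software and Configuration Checks'],
        'CreatedAt': iso8601_time,
        'UpdatedAt': iso8601_time,
        'Severity': {'Label': severity_label},
        'Title': title,
        'Description': description,
        'ProductFields': {'Product Name': product_name},
        'Resources': [{
            'Type': 'AwsCodeBuildProject',
            'Id': build_arn,
            'Partition': 'aws',
            'Region': aws_region,
            'Details': {'AwsCodeBuildProject': {'Name': build_arn}}
        }],
        'RecordState': 'ACTIVE',
        'Workflow': {'Status': 'NEW'}
    }
```

Each scanner script would then only parse its own JSON report, build one finding per result, and make a single `securityhub.batch_import_findings(Findings=[...])` call, which also keeps each batch under the API's 100-finding limit in one obvious place.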