├── .github ├── CODEOWNERS ├── ISSUE_TEMPLATE.md ├── PULL_REQUEST_TEMPLATE.md └── workflows │ └── main.yml ├── .gitignore ├── CHAPTER_LIST.txt ├── CONTRIBUTING.md ├── LICENSE ├── Makefile ├── NOTICE ├── README.md ├── ch0_introduction ├── 00_02_what_you_should_know │ └── README.md ├── 00_03_ci-cd_tool_categories │ └── README.md ├── 00_04_pros_and_cons │ └── README.md ├── 00_05_the_experimental_pipeline │ ├── 00_05_the_experimental_pipeline.png │ └── README.md ├── 00_06_about_the_exercise_files │ ├── Makefile │ ├── README.md │ └── sample-application.yml └── README.md ├── ch1_self_hosted ├── 01_01_jenkins │ ├── Jenkins-Install-Blue-Ocean-Plugin-SCR-20231014-labg.png │ ├── Jenkinsfile │ ├── Makefile │ ├── README.md │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── jenkins-cloudformation-template.yml │ ├── requirements.txt │ └── template.html ├── 01_02_bamboo │ ├── Bamboo-Plan-Tasks-SCR-20230916-mare.png │ ├── Bamboo-Plan-Variables-SCR-20230916-mezc.png │ ├── Makefile │ ├── README.md │ ├── bamboo-cloudformation-template.yml │ ├── bamboo-specs │ │ └── bamboo.yml │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 01_03_teamcity │ ├── Makefile │ ├── README.md │ ├── TeamCity-Build-Parameters-SCR-20230916-naii.png │ ├── TeamCity-Pom-XML-after-SCR-20231014-kkpo.png │ ├── TeamCity-Pom-XML-before-SCR-20231014-kkib.png │ ├── TeamCity-Settings-Kts-after-SCR-20231014-klxh.png │ ├── TeamCity-Settings-Kts-before-SCR-20231014-klup.png │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── pom.xml │ ├── requirements.txt │ ├── settings.kts │ ├── teamcity-cloudformation-template.yml │ └── template.html ├── 01_04_self_hosted_tool_summary │ └── README.md └── README.md ├── ch2_software_as_a_service ├── 02_01_travis_ci │ ├── Makefile │ ├── README.md │ ├── Travis-Environment-Variables-SCR-20230916-shdl.png │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ ├── template.html │ └── travis.yml ├── 02_02_circleci │ ├── CicleCI-Trigger-Pipeline-SCR-20230917-pesy.png │ ├── CircleCI-Build-Numbers-SCR-20230917-phnu.png │ ├── CircleCI-Environment-Variables-SCR-20230917-ngsd.png │ ├── CircleCI-Project-Settings-SCR-20230917-ozxw.png │ ├── CircleCI-config-after-SCR-20230917-ouax.png │ ├── CircleCI-config-before-SCR-20230917-otkq.png │ ├── Makefile │ ├── README.md │ ├── config.yml │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 02_03_saas_tool_summary │ └── README.md └── README.md ├── ch3_cloud_service_providers ├── 03_01_aws_codepipeline_codebuild │ ├── Makefile │ ├── README.md │ ├── buildspec.yml │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 03_02_azure_pipelines │ ├── Azure-Pipelines-Variabiles-SCR-20231015-pric.png │ ├── Makefile │ ├── README.md │ ├── azure-pipelines.yml │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 03_03_gcp_cloud_build │ ├── Makefile │ ├── README.md │ ├── cloud-build-project-number.png │ ├── cloud-build-secret-manager-secret-accessor.png │ ├── cloudbuild.yaml │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 03_04_cloud_service_provider_tool_summary │ └── README.md └── README.md ├── ch4_code_repositories ├── 04_01_github_actions │ ├── 0-github-actions-rename-workflow-before.png │ ├── 1-github-actions-rename-workflow-after.png │ ├── Makefile │ ├── README.md │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── pipeline.yml │ 
├── requirements.txt │ └── template.html ├── 04_02_gitlab_ci │ ├── 0-gitlabci-edit-web-ide.png │ ├── 1-gitlabci-upload-files.png │ ├── 2-gitlabci-rename-pipeline-file-before.png │ ├── 3-gitlabci-rename-pipeline-file-after.png │ ├── 4-gitlabci-variables.png │ ├── 5-gitlabci-pipeline.png │ ├── Makefile │ ├── README.md │ ├── data.json │ ├── gitlab-ci.yml │ ├── index.py │ ├── index_test.py │ ├── requirements.txt │ └── template.html ├── 04_03_bitbucket_pipelines │ ├── Makefile │ ├── README.md │ ├── bitbucket-pipelines.yml │ ├── create-access-token-00001.png │ ├── create-access-token-00002.png │ ├── create-access-token-00004.png │ ├── create-access-token-00005.png │ ├── create-access-token-00006.png │ ├── create-access-token-00007.png │ ├── create-access-token-00008.png │ ├── create-access-token-00009.png │ ├── data.json │ ├── index.py │ ├── index_test.py │ ├── pipeline-settings-00001.png │ ├── pipeline-settings-00002.png │ ├── requirements.txt │ ├── run-pipeline-00001.png │ ├── run-pipeline-00002.png │ └── template.html ├── 04_04_code_repository_tool_summary │ └── README.md └── README.md └── ch5_selecting_a_ci_tool ├── 05_01_selecting_the_right_ci_tool ├── 05_01_selecting_the_right_ci_tool.png └── README.md └── README.md /.github/CODEOWNERS: -------------------------------------------------------------------------------- 1 | # Codeowners for these exercise files: 2 | # * (asterisk) denotes "all files and folders" 3 | # Example: * @producer @instructor 4 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 7 | 8 | ## Issue Overview 9 | 10 | 11 | ## Describe your environment 12 | 13 | 14 | ## Steps to Reproduce 15 | 16 | 1. 17 | 2. 18 | 3. 19 | 4. 20 | 21 | ## Expected Behavior 22 | 23 | 24 | ## Current Behavior 25 | 26 | 27 | ## Possible Solution 28 | 29 | 30 | ## Screenshots / Video 31 | 32 | 33 | ## Related Issues 34 | 35 | -------------------------------------------------------------------------------- /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /.github/workflows/main.yml: -------------------------------------------------------------------------------- 1 | name: Copy To Branches 2 | on: 3 | workflow_dispatch: 4 | jobs: 5 | copy-to-branches: 6 | runs-on: ubuntu-latest 7 | steps: 8 | - uses: actions/checkout@v2 9 | with: 10 | fetch-depth: 0 11 | - name: Copy To Branches Action 12 | uses: planetoftheweb/copy-to-branches@v1.2 13 | env: 14 | key: main 15 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | share/python-wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | MANIFEST 28 | 29 | # PyInstaller 30 | # Usually these files are written by a python script from a template 31 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 
32 | *.manifest 33 | *.spec 34 | 35 | # Installer logs 36 | pip-log.txt 37 | pip-delete-this-directory.txt 38 | 39 | # Unit test / coverage reports 40 | htmlcov/ 41 | .tox/ 42 | .nox/ 43 | .coverage 44 | .coverage.* 45 | .cache 46 | nosetests.xml 47 | coverage.xml 48 | *.cover 49 | *.py,cover 50 | .hypothesis/ 51 | .pytest_cache/ 52 | cover/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | .pybuilder/ 76 | target/ 77 | 78 | # Jupyter Notebook 79 | .ipynb_checkpoints 80 | 81 | # IPython 82 | profile_default/ 83 | ipython_config.py 84 | 85 | # pyenv 86 | # For a library or package, you might want to ignore these files since the code is 87 | # intended to run in multiple environments; otherwise, check them in: 88 | # .python-version 89 | 90 | # pipenv 91 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 92 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 93 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 94 | # install all needed dependencies. 95 | #Pipfile.lock 96 | 97 | # poetry 98 | # Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control. 99 | # This is especially recommended for binary packages to ensure reproducibility, and is more 100 | # commonly ignored for libraries. 101 | # https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control 102 | #poetry.lock 103 | 104 | # pdm 105 | # Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control. 106 | #pdm.lock 107 | # pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it 108 | # in version control. 109 | # https://pdm.fming.dev/#use-with-ide 110 | .pdm.toml 111 | 112 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm 113 | __pypackages__/ 114 | 115 | # Celery stuff 116 | celerybeat-schedule 117 | celerybeat.pid 118 | 119 | # SageMath parsed files 120 | *.sage.py 121 | 122 | # Environments 123 | .env 124 | .venv 125 | env/ 126 | venv/ 127 | ENV/ 128 | env.bak/ 129 | venv.bak/ 130 | 131 | # Spyder project settings 132 | .spyderproject 133 | .spyproject 134 | 135 | # Rope project settings 136 | .ropeproject 137 | 138 | # mkdocs documentation 139 | /site 140 | 141 | # mypy 142 | .mypy_cache/ 143 | .dmypy.json 144 | dmypy.json 145 | 146 | # Pyre type checker 147 | .pyre/ 148 | 149 | # pytype static type analyzer 150 | .pytype/ 151 | 152 | # Cython debug symbols 153 | cython_debug/ 154 | 155 | # PyCharm 156 | # JetBrains specific template is maintained in a separate JetBrains.gitignore that can 157 | # be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore 158 | # and can be added to the global gitignore or merged into this file. For a more nuclear 159 | # option (not recommended) you can uncomment the following to ignore the entire idea folder. 
160 | #.idea/ 161 | *.zip 162 | *.bak 163 | .obsidian 164 | -------------------------------------------------------------------------------- /CHAPTER_LIST.txt: -------------------------------------------------------------------------------- 1 | ch0_introduction/00_01_building_your_ci-cd_pipeline 2 | ch0_introduction/00_02_what_you_should_know 3 | ch0_introduction/00_03_ci-cd_tool_categories 4 | ch0_introduction/00_04_pros_and_cons 5 | ch0_introduction/00_05_the_experimental_pipeline 6 | ch0_introduction/00_06_about_the_exercise_files 7 | ch1_self_hosted/01_01_jenkins 8 | ch1_self_hosted/01_02_bamboo 9 | ch1_self_hosted/01_03_teamcity 10 | ch1_self_hosted/01_04_self_hosted_tool_summary 11 | ch2_software_as_a_service/02_01_travis_ci 12 | ch2_software_as_a_service/02_02_circleci 13 | ch2_software_as_a_service/02_03_saas_tool_summary 14 | ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild 15 | ch3_cloud_service_providers/03_02_azure_pipelines 16 | ch3_cloud_service_providers/03_03_gcp_cloud_build 17 | ch3_cloud_service_providers/03_04_cloud_service_provider_tool_summary 18 | ch4_code_repositories/04_01_github_actions 19 | ch4_code_repositories/04_02_gitlab_ci 20 | ch4_code_repositories/04_03_bitbucket_pipelines 21 | ch4_code_repositories/04_04_code_repository_tool_summary 22 | ch5_selecting_a_ci_tool/05_01_selecting_the_right_ci_tool -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | 2 | Contribution Agreement 3 | ====================== 4 | 5 | This repository does not accept pull requests (PRs). All pull requests will be closed. 6 | 7 | However, if any contributions (through pull requests, issues, feedback or otherwise) are provided, as a contributor, you represent that the code you submit is your original work or that of your employer (in which case you represent you have the right to bind your employer). By submitting code (or otherwise providing feedback), you (and, if applicable, your employer) are licensing the submitted code (and/or feedback) to LinkedIn and the open source community subject to the BSD 2-Clause license. 8 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | dirs: 2 | @find . -type d -name 0\* -not -path '*/\.*' | sort 3 | 4 | links: 5 | @for dir in $$(find . -type d -name ch\* | sort); do \ 6 | cd "$$dir" && \ 7 | for i in $$(find . ! -path ./README.md -type f -name README.md | sort); do \ 8 | echo "- [$$(head -1 "$$i" | sed -e 's/# //')]($$i)"; \ 9 | done; \ 10 | cd - >/dev/null; \ 11 | done 12 | 13 | next: 14 | @for dir in $$(find . -type d -name ch\* | sort); do \ 15 | cd "$$dir" && \ 16 | for i in $$(find . ! -path ./README.md -type f -name README.md | sort); do \ 17 | echo "[Next: $$(head -1 "$$i" | sed -e 's/# //')](.$$i)"; \ 18 | done; \ 19 | cd - >/dev/null; \ 20 | done 21 | 22 | edit: 23 | @vim $$(find . -type f -name README.md | sort) 24 | 25 | spell: 26 | -find . -type f -name "*.md" -exec aspell check {} \; 27 | 28 | clean: 29 | -find . -type f -name \*.bak -exec rm {} \; -print 30 | 31 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | Copyright 2023 LinkedIn Corporation 2 | All Rights Reserved. 
3 | 4 | Licensed under the LinkedIn Learning Exercise File License (the "License"). 5 | See LICENSE in the project root for license information. 6 | 7 | Please note, this project may automatically load third party code from external 8 | repositories (for example, NPM modules, Composer packages, or other dependencies). 9 | If so, such third party code may be subject to other license terms than as set 10 | forth above. In addition, such third party code may also depend on and load 11 | multiple tiers of dependencies. Please review the applicable licenses of the 12 | additional dependencies. 13 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Continuous Integration: Tools 2 | This is the repository for the LinkedIn Learning course `Continuous Integration: Tools`. The full course is available from [LinkedIn Learning][lil-course-url]. 3 | 4 | ![Continuous Integration: Tools][lil-thumbnail-url] 5 | 6 | Continuous integration and continuous delivery (CI/CD) practices enable developers to reliably and quickly produce applications at scale—enhancing overall team collaboration in the process. As CI/CD has grown to become one of the most prominent subfields of DevOps, new CI/CD tools have emerged and proliferated across the entire industry. Curious about which tools are right for your team? This course was designed for you. Join instructor Michael Jenkins as he digs into a variety of CI/CD tools and explores several different use case categories, highlighting the pros and cons of each tool, how each tool fits into the wider development landscape, and how to select the tool that will work best for you. Along the way, Michael covers self-hosted options such as Jenkins, Bamboo, and TeamCity; SaaS tools like Travis CI and CircleCI; cloud service providers such as AWS, Azure, and GCP; and code repositories like GitHub, GitLab, and Atlassian Bitbucket. 7 | 8 | ## What You Should Know 9 | This course is intended for both **beginners** and **experienced** developers. 10 | 11 | ### Beginner level 12 | If you are just getting started with continuous integration, consider auditing this course to learn the basics of CI and how the process is implemented in various tools. 13 | 14 | Use the exercise files as a reference for these implementations. 15 | 16 | ### Intermediate level 17 | If you are already familiar with continuous integration, use these exercise files to explore each tool. The `README` file in each lesson will help guide and expedite your exploration. 18 | 19 | It will be helpful if you have some experience with: 20 | 21 | 1. Application development and the software development process. 22 | 2. Building, testing, and deploying applications. 23 | 3. Scripting and using a command-line interface. 24 | 4. Source code management tools like GitHub, Bitbucket, or Gitlab. 25 | 26 | ## Instructions 27 | This repository has folders for each of the videos in the course. 28 | 29 | ## Folders 30 | The folders are structured to correspond to the videos in the course. The naming convention is `CHAPTER#_MOVIE#`. For example, the folder named `02_03` corresponds to the second chapter and the third video in the second chapter. 31 | 32 | ## Installing 33 | 1. To use these exercise files, you must have the following installed: 34 | - `git` 35 | 1. 
Clone this repository into your local machine using the terminal (Mac), CMD (Windows), or a GUI tool like [SourceTree](https://www.sourcetreeapp.com/). 36 | 3. Follow any additional instructions in the `README` files in each folder. 37 | 38 | ## About the Instructor 39 | Michael Jenkins - Software Engineer, Reliability Engineer 40 | 41 | Check out Michael's other courses on [LinkedIn Learning](https://www.linkedin.com/learning/instructors/michael-jenkins). 42 | 43 | [0]: # (Replace these placeholder URLs with actual course URLs) 44 | 45 | [lil-course-url]: https://www.linkedin.com/learning/ 46 | [lil-thumbnail-url]: https://media.licdn.com/dms/image/D4D0DAQF_XV39CwrXiw/learning-public-crop_675_1200/0/1702073152131?e=2147483647&v=beta&t=8XvqzqvYCEOI47pNrJGwVbSzoQM582hCOx-de2bmM4E 47 | 48 | [Next: Ch0 Introduction](./ch0_introduction/README.md) 49 | -------------------------------------------------------------------------------- /ch0_introduction/00_02_what_you_should_know/README.md: -------------------------------------------------------------------------------- 1 | # 00_02 What You Should Know 2 | 3 | ## CI/CD/CD 4 | Here's a summary of continuous integration, continuous delivery, and continuous deployment. 5 | 6 | - **Continuous Integration (CI)** 7 | - Developers work on their code in a local environment and commit their changes to a shared repository regularly. 8 | - Code integration involves combining new code with existing repository code. 9 | - The new code undergoes testing to identify and resolve errors quickly. 10 | 11 | - **Continuous Delivery (CD)** 12 | - A complementary process to continuous integration. 13 | - Allows developers to build, test, and release their software with every new change. 14 | - Includes testing to ensure the final product meets requirements and avoids bugs. 15 | 16 | - **Continuous Deployment** 17 | - Refers to automated deployment without human interaction. 18 | - The application is automatically built, tested, and deployed into a production environment. 19 | 20 | ## Exercise Files 21 | The exercise files contain instructions and details to enhance your learning experience with the tools being discussed. 22 | 23 | To get the most out of the exercise files, it will be helpful if you have some of the following: 24 | 25 | - Familiarity with application development and the software development process. 26 | - Experience in building, testing, and deploying applications. 27 | - Experience with scripting and using a command-line interface. 28 | - Familiarity with source code management tools like GitHub, Bitbucket, or Gitlab. 29 | 30 | [Next: 00_03 CI/CD Tool Categories](../00_03_ci-cd_tool_categories/README.md) 31 | 32 | -------------------------------------------------------------------------------- /ch0_introduction/00_03_ci-cd_tool_categories/README.md: -------------------------------------------------------------------------------- 1 | # 00_03 CI/CD Tool Categories 2 | Generally speaking, we can sort CI/CD tools into four categories. 3 | 4 | - **Self-Hosted Tools:** 5 | - Run on your hardware, which could be a server in your company's data center, a cloud-based virtual machine (VM), or your local workstation. 6 | - You are responsible for installing and maintaining the tool. 7 | 8 | - **Software as a Service (SaaS) Tools:** 9 | - Vendors provide and maintain the tool and allow you to access it. 10 | - No installation or maintenance is required from you. 
11 | 12 | - **Cloud Service Providers:** 13 | - Offer SaaS-based CI/CD tools along with other cloud-based services such as virtual machines, managed services, and storage. 14 | 15 | - **Code Repositories:** 16 | - Primarily known for storing and managing code. 17 | - Many modern repositories also offer CI/CD tools for transforming code into software. 18 | 19 | [Next: 00_04 Pros and Cons](../00_04_pros_and_cons/README.md) 20 | 21 | -------------------------------------------------------------------------------- /ch0_introduction/00_04_pros_and_cons/README.md: -------------------------------------------------------------------------------- 1 | # 00_04 Pros and Cons 2 | Each CI/CD tool category has benefits and drawbacks. Consider these factors when choosing a CI/CD tool category for your projects. 3 | 4 | Security vulnerabilities and vendor lock-in are concerns for every category, so weigh them equally no matter which option you choose. 5 | 6 | - **Self-Hosted CI/CD Tools** 7 | - Pros: 8 | - Offers maximum flexibility, allowing you to specify the entire technology stack, including software, hardware, and network. 9 | - Provides greater control over data flow, reducing concerns about data leaks. 10 | - Cons: 11 | - Requires maintenance and administration, which can be challenging alongside regular duties. 12 | - Scalability is limited to existing infrastructure. 13 | 14 | - **Software as a Service (SaaS) CI/CD Tools** 15 | - Pros: 16 | - Easy to get started; no maintenance required. 17 | - Many free or reasonably priced SaaS CI/CD services available. 18 | - Automatically create triggers from your repository. 19 | - Cons: 20 | - Costs can increase as team size increases or development rate grows. 21 | - Potential security concerns as data is on a system you don't control. 22 | 23 | - **Cloud Service Providers** 24 | - Pros: 25 | - Easy integration with other services from the same provider. 26 | - Offers better control over project access using Identity and Access Management. 27 | - Has access to potentially unlimited resources. 28 | - Cons: 29 | - Setting up CI/CD pipelines can be more complex. 30 | - May lead to vendor lock-in, but this can be a concern in any CI/CD system. 31 | 32 | - **Code Repository-Based CI/CD** 33 | - Pros: 34 | - Combines code repositories and CI/CD tools in one place. 35 | - Plenty of free or low-fee options for starting. 36 | - Cons: 37 | - Becomes expensive at larger scales. 38 | 39 | [Next: 00_05 The Experimental Pipeline](../00_05_the_experimental_pipeline/README.md) 40 | 41 | -------------------------------------------------------------------------------- /ch0_introduction/00_05_the_experimental_pipeline/00_05_the_experimental_pipeline.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch0_introduction/00_05_the_experimental_pipeline/00_05_the_experimental_pipeline.png -------------------------------------------------------------------------------- /ch0_introduction/00_05_the_experimental_pipeline/README.md: -------------------------------------------------------------------------------- 1 | # 00_05 The Experimental Pipeline 2 | As we explore CI/CD tools, it's helpful to have a sample pipeline to work with. We'll be configuring a pipeline that builds, tests, and deploys a web application. Our application is a Python script that serves JSON data over a simple API.
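The stages listed in the next section map directly onto `make` targets that ship with each lesson's exercise files (the Makefiles appear later in this repository). The walk-through below is a sketch added for orientation rather than part of the original course steps; it assumes Python 3 and `zip` are installed, and the `STAGING_FUNCTION_NAME` / `STAGING_URL` shell variables are placeholders for the CloudFormation stack output values described in lesson 00_06.

```bash
# Sketch: run the pipeline stages by hand from a lesson folder,
# e.g. ch1_self_hosted/01_01_jenkins (assumes Python 3 and zip are available).
python3 -m venv local
. ./local/bin/activate

make requirements    # Requirements: install black, pylint, flake8, and awscli
make check lint      # Check: print tool versions, then run pylint and flake8
make test            # Check: run the unit tests in index_test.py
make build           # Build: package index.py, data.json, and template.html into lambda.zip

# The deploy and test stages need AWS credentials plus values from the stack
# outputs (see 00_06); the two variables below are placeholders, not real names.
make deploy FUNCTION="$STAGING_FUNCTION_NAME" PLATFORM="local" VERSION="manual" BUILD_NUMBER="0"
make testdeployment URL="$STAGING_URL" VERSION="manual"
```

Each CI tool configured in the later chapters automates this same sequence.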
3 | 4 | ## Pipeline Stages 5 | The pipeline will have seven stages. Each stage must complete successfully for any following stages to be initiated. 6 | 7 |  8 | 9 | 1. **Requirements**: Install any tools or libraries needed to test, build, and deploy the application. 10 | 11 | 2. **Check**: Lint the code and run integration tests. 12 | 13 | 3. **Build**: Package the code into a deployable artifact. 14 | 15 | 4. **Deploy Staging**: Deploy the code to the staging environment. 16 | 17 | 5. **Test Staging**: Test the staging environment. A successful test will allow the production deployment to proceed. 18 | 19 | 6. **Deploy Production**: Deploy the code to the production environment. 20 | 21 | 7. **Test Production**: Test the production environment to confirm the application has been deployed successfully. 22 | 23 | 24 | ## Your Pipeline Might Be Different 25 | As you’re viewing this course, you might already have a pipeline in mind that you’re looking to implement. Just know that all the stages mentioned here might not apply to your project. But at the least, they should give you an idea of the stages in a typical CI/CD pipeline. 26 | 27 | [Next: 00_06 About the Exercise Files](../00_06_about_the_exercise_files/README.md) 28 | -------------------------------------------------------------------------------- /ch0_introduction/00_06_about_the_exercise_files/Makefile: -------------------------------------------------------------------------------- 1 | # Makefile for linting using cfn-lint. 2 | 3 | # Target: cfn-lint 4 | # Description: This target runs cfn-lint to check the syntax and validity of a CloudFormation template. 5 | # 6 | # Prerequisites: 7 | # - cfn-lint: You must have cfn-lint installed on your system. Install it using pip: 8 | # ``` 9 | # pip install cfn-lint 10 | # ``` 11 | # - A CloudFormation template file named 'sample-application.yml' in the same directory as this Makefile. 12 | # 13 | # Usage: Run `make cfn-lint` to lint the CloudFormation template. 14 | cfn-lint: 15 | cfn-lint -t ./sample-application.yml 16 | -------------------------------------------------------------------------------- /ch0_introduction/00_06_about_the_exercise_files/README.md: -------------------------------------------------------------------------------- 1 | # 00_06 About the Exercise Files 2 | 3 | Exercise files are available to help you follow along with this course. Use the exercise files to create the resources used in the course demonstrations. 4 | 5 | The directory structure follows the course structure, with directories for each chapter and section. 6 | 7 | ## Prerequisites 8 | 1. A [GitHub account](https://github.com/join) is needed to host the code for the sample application. 9 | 2. An [Amazon Web Services account](https://aws.amazon.com/free) is needed to deploy and host the sample application. 10 | 11 | ## Deploying the Sample Application 12 | A sample application is included to use as a deployment target. The goal is to model the experimental pipeline with each CI tool and deploy updates to the sample application. 13 | 14 | Use the provided [CloudFormation template](./sample-application.yml) and the following instructions to manually deploy the sample application in AWS. 15 | 16 | ### 1. Get the exercise files in place on your local system 17 | 1. Open the GitHub repo for this course and download the exercise files by selecting **Code -> Local -> Download ZIP**. 18 | 2. Open the ZIP file and confirm the contents of the file are accessible. 19 | 20 | ### 2.
Deploy the Sample Application 21 | 1. Log into your AWS account. Select the search bar at the top of the page and enter **CloudFormation**. 22 | 1. On the CloudFormation homepage, select **Create stack**. If the button includes a dropdown, select **With new resources (standard)**. 23 | 1. Under "Prerequisite - Prepare template", confirm that "Template is ready" is selected. 24 | 1. Under "Specify template", select **Upload a template file**. Select **Choose file**. Browse to the location where you opened the ZIP file. Navigate to **ch0_introduction/00_06_about_the_exercise_files** and select **sample-application.yml**. Select **Open**. Select **Next**. 25 | 1. Enter a name for the stack under "Stack name". *Note that the name should only include letters (A-Z and a-z), numbers (0-9), and dashes (-)*. Select **Next**. 26 | 1. On the "Configure stack options" screen, keep all options as the default. Scroll to the bottom of the page and select **Next**. 27 | 1. On the "Review" screen, scroll to the bottom of the page and select the **checkbox** next to "I acknowledge that AWS CloudFormation might create IAM resources with custom names". Select **Submit**. 28 | 1. Review the "Events" tab on the stack homepage until *CREATE_COMPLETE* is reported under the "Status" column for the Logical ID that matches your stack name. 29 | 30 | ### 3. Review the Deployed Sample Application and Lambda Functions 31 | 1. On the stack homepage, select the "Outputs" tab. Make a note of the output key, value, and description for: 32 | 33 | - AwsAccessKeyId 34 | - AwsDefaultRegion 35 | - AwsSecretAccessKey 36 | - ProductionFunctionName 37 | - ProductionURL 38 | - StagingFunctionName 39 | - StagingURL 40 | 41 | These values will be needed to configure the experimental pipeline for each CI tool. 42 | 43 | 2. To view the environments for the sample application, select the URL next to **ProductionURL** and **StagingURL**. *Open each link in a new tab*. 44 | 45 | 3. To view the Lambda functions for the sample application, select the "Resources" tab on the stack homepage. Under "Logical ID", locate **LambdaFunctionProduction** and **LambdaFunctionStaging**. Select the link next to each resource in the "Physical ID" column.
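As an optional alternative to copying these values from the console, the same outputs can be listed with the AWS CLI, which is already included in each lesson's `requirements.txt`. This snippet is an added sketch rather than part of the original instructions; `sample-application` is a placeholder for the stack name you chose in step 2.

```bash
# List every stack output (key, value, description) from the command line.
# Assumes the AWS CLI is configured with credentials for the same account.
aws cloudformation describe-stacks \
  --stack-name sample-application \
  --query "Stacks[0].Outputs" \
  --output table
```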
46 | 47 | [Next: Ch1 Self Hosted](../../ch1_self_hosted/README.md) 48 | 49 | -------------------------------------------------------------------------------- /ch0_introduction/README.md: -------------------------------------------------------------------------------- 1 | # Ch0 Introduction 2 | - [00_02 What You Should Know](./00_02_what_you_should_know/README.md) 3 | - [00_03 CI/CD Tool Categories](./00_03_ci-cd_tool_categories/README.md) 4 | - [00_04 Pros and Cons](./00_04_pros_and_cons/README.md) 5 | - [00_05 The Experimental Pipeline](./00_05_the_experimental_pipeline/README.md) 6 | - [00_06 About the Exercise Files](./00_06_about_the_exercise_files/README.md) 7 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/Jenkins-Install-Blue-Ocean-Plugin-SCR-20231014-labg.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_01_jenkins/Jenkins-Install-Blue-Ocean-Plugin-SCR-20231014-labg.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/Jenkinsfile: -------------------------------------------------------------------------------- 1 | pipeline { 2 | agent any 3 | 4 | environment { 5 | AWS_ACCESS_KEY_ID = credentials('AWS_ACCESS_KEY_ID') 6 | AWS_SECRET_ACCESS_KEY = credentials('AWS_SECRET_ACCESS_KEY') 7 | AWS_DEFAULT_REGION = 'UPDATE_THIS_VALUE' 8 | STAGING_FUNCTION_NAME = 'UPDATE_THIS_VALUE' 9 | STAGING_URL = 'UPDATE_THIS_VALUE' 10 | PRODUCTION_FUNCTION_NAME = 'UPDATE_THIS_VALUE' 11 | PRODUCTION_URL = 'UPDATE_THIS_VALUE' 12 | } 13 | 14 | stages { 15 | stage('Requirements') { 16 | steps { 17 | sh(''' 18 | #!/bin/bash 19 | python3 -m venv local 20 | . ./local/bin/activate 21 | make requirements 22 | ''') 23 | } 24 | } 25 | 26 | stage('Check') { 27 | parallel { 28 | stage('Check:Lint') { 29 | steps { 30 | sh(''' 31 | #!/bin/bash 32 | . ./local/bin/activate 33 | make check lint 34 | ''') 35 | } 36 | } 37 | 38 | stage('Check:Test') { 39 | steps { 40 | sh(''' 41 | #!/bin/bash 42 | . 
./local/bin/activate 43 | make test 44 | ''') 45 | } 46 | } 47 | } 48 | } 49 | 50 | stage('Build') { 51 | steps { 52 | sh(''' 53 | #!/bin/bash 54 | make build 55 | ''') 56 | } 57 | } 58 | 59 | stage('Deploy Staging') { 60 | steps { 61 | sh(''' 62 | #!/bin/bash 63 | make deploy \ 64 | PLATFORM="Jenkins" \ 65 | FUNCTION=${STAGING_FUNCTION_NAME} \ 66 | VERSION=${GIT_COMMIT} \ 67 | BUILD_NUMBER=${BUILD_NUMBER} 68 | ''') 69 | } 70 | } 71 | 72 | stage('Test Staging') { 73 | steps { 74 | sh(''' 75 | #!/bin/bash 76 | make testdeployment URL=${STAGING_URL} VERSION=${GIT_COMMIT} 77 | ''') 78 | } 79 | } 80 | 81 | stage('Deploy Production') { 82 | steps { 83 | sh(''' 84 | #!/bin/bash 85 | make deploy \ 86 | PLATFORM="Jenkins" \ 87 | FUNCTION=${PRODUCTION_FUNCTION_NAME} \ 88 | VERSION=${GIT_COMMIT} \ 89 | BUILD_NUMBER=${BUILD_NUMBER} 90 | ''') 91 | } 92 | } 93 | 94 | stage('Test Production') { 95 | steps { 96 | sh(''' 97 | #!/bin/bash 98 | make testdeployment URL=${PRODUCTION_URL} VERSION=${GIT_COMMIT} 99 | ''') 100 | } 101 | } 102 | } 103 | 104 | post { 105 | success { 106 | // Archive the lambda.zip file as an artifact 107 | archiveArtifacts artifacts: 'lambda.zip', allowEmptyArchive: false 108 | } 109 | } 110 | } 111 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_01_jenkins/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 |
4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/Bamboo-Plan-Tasks-SCR-20230916-mare.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_02_bamboo/Bamboo-Plan-Tasks-SCR-20230916-mare.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/Bamboo-Plan-Variables-SCR-20230916-mezc.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_02_bamboo/Bamboo-Plan-Variables-SCR-20230916-mezc.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/bamboo-specs/bamboo.yml: -------------------------------------------------------------------------------- 1 | --- 2 | version: 2 3 | plan: 4 | project-key: EP 5 | key: BUIL 6 | name: Build 7 | stages: 8 | - Default Stage: 9 | manual: false 10 | final: false 11 | jobs: 12 | - Default Job 13 | Default Job: 14 | key: JOB1 15 | tasks: 16 | - checkout: 17 | force-clean-build: false 18 | description: Checkout Default Repository 19 | - script: 20 | interpreter: SHELL 21 | scripts: 22 | - |- 23 | python3 -m venv local 24 | . ./local/bin/activate 25 | make requirements 26 | description: Requirements 27 | - script: 28 | interpreter: SHELL 29 | scripts: 30 | - |- 31 | . 
./local/bin/activate 32 | make check lint test 33 | description: Check-Lint-Test 34 | - command: 35 | executable: Make 36 | argument: clean build 37 | description: Build 38 | - command: 39 | executable: Make 40 | argument: deploy PLATFORM="Bamboo" FUNCTION=${bamboo.STAGING_FUNCTION_NAME} VERSION=${bamboo.planRepository.revision} BUILD_NUMBER=${bamboo.buildNumber} 41 | environment: AWS_ACCESS_KEY_ID=${bamboo.AWS_ACCESS_KEY_ID} AWS_SECRET_ACCESS_KEY=${bamboo.AWS_SECRET_ACCESS_KEY} AWS_DEFAULT_REGION=${bamboo.AWS_DEFAULT_REGION} 42 | description: Deploy Staging 43 | - command: 44 | executable: Make 45 | argument: testdeployment URL=${bamboo.STAGING_URL} VERSION=${bamboo.planRepository.revision} 46 | description: Test Staging 47 | - command: 48 | executable: Make 49 | argument: deploy PLATFORM="Bamboo" FUNCTION=${bamboo.PRODUCTION_FUNCTION_NAME} VERSION=${bamboo.planRepository.revision} BUILD_NUMBER=${bamboo.buildNumber} 50 | environment: AWS_ACCESS_KEY_ID=${bamboo.AWS_ACCESS_KEY_ID} AWS_SECRET_ACCESS_KEY=${bamboo.AWS_SECRET_ACCESS_KEY} AWS_DEFAULT_REGION=${bamboo.AWS_DEFAULT_REGION} 51 | description: Deploy Production 52 | - command: 53 | executable: Make 54 | argument: testdeployment URL=${bamboo.PRODUCTION_URL} VERSION=${bamboo.planRepository.revision} 55 | description: Test Production 56 | artifacts: 57 | - name: lambda.zip 58 | location: . 59 | pattern: '**/*.zip' 60 | shared: true 61 | required: true 62 | artifact-subscriptions: [] 63 | variables: 64 | AWS_ACCESS_KEY_ID: UPDATE_THIS_VALUE 65 | AWS_DEFAULT_REGION: UPDATE_THIS_VALUE 66 | AWS_SECRET_ACCESS_KEY: UPDATE_THIS_VALUE 67 | PRODUCTION_FUNCTION_NAME: UPDATE_THIS_VALUE 68 | PRODUCTION_URL: UPDATE_THIS_VALUE 69 | STAGING_FUNCTION_NAME: UPDATE_THIS_VALUE 70 | STAGING_URL: UPDATE_THIS_VALUE 71 | repositories: 72 | - repo: 73 | scope: global 74 | triggers: 75 | - polling: 76 | period: '180' 77 | branches: 78 | create: manually 79 | delete: never 80 | link-to-jira: true 81 | notifications: [] 82 | labels: [] 83 | dependencies: 84 | require-all-stages-passing: false 85 | enabled-for-branches: true 86 | block-strategy: none 87 | plans: [] 88 | other: 89 | concurrent-build-plugin: system-default 90 | --- 91 | version: 2 92 | plan: 93 | key: EP-BUIL 94 | plan-permissions: 95 | - users: 96 | - admin 97 | permissions: 98 | - view 99 | - edit 100 | - build 101 | - clone 102 | - admin 103 | - view-configuration 104 | - roles: 105 | - logged-in 106 | - anonymous 107 | permissions: 108 | - view 109 | ... 110 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function.
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_02_bamboo/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/TeamCity-Build-Parameters-SCR-20230916-naii.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_03_teamcity/TeamCity-Build-Parameters-SCR-20230916-naii.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/TeamCity-Pom-XML-after-SCR-20231014-kkpo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_03_teamcity/TeamCity-Pom-XML-after-SCR-20231014-kkpo.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/TeamCity-Pom-XML-before-SCR-20231014-kkib.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_03_teamcity/TeamCity-Pom-XML-before-SCR-20231014-kkib.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/TeamCity-Settings-Kts-after-SCR-20231014-klxh.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_03_teamcity/TeamCity-Settings-Kts-after-SCR-20231014-klxh.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/TeamCity-Settings-Kts-before-SCR-20231014-klup.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch1_self_hosted/01_03_teamcity/TeamCity-Settings-Kts-before-SCR-20231014-klup.png -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_03_teamcity/pom.xml: -------------------------------------------------------------------------------- 1 | 2 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch1_self_hosted/01_04_self_hosted_tool_summary/README.md: -------------------------------------------------------------------------------- 1 | # 01_04 Self Hosted Tool Summary 2 | This summary covers the key points about self-hosted CI/CD tools and introduces Jenkins, Bamboo, and TeamCity with some highlights. 3 | 4 | ## Self-Hosted CI/CD Tools 5 | - Offer full control over hardware and software for testing, artifact creation, and application deployment. 6 | - Responsibility includes installing and maintaining the entire stack. 7 | 8 | ## Jenkins 9 | - Open-source and free. 10 | - Supports thousands of plugins. 11 | - Jenkinsfile format is ideal for modeling pipeline stages. 12 | 13 | ## Bamboo 14 | - Known for its tight integration with other Atlassian products. 15 | - Connects to a marketplace for plugins and extensions. 16 | - Provides a web-based interface for creating pipelines. 17 | 18 | ## TeamCity 19 | - Seamlessly integrates with JetBrains IDEs and editors like Visual Studio Code. 20 | - Features project detection for faster configuration. 21 | - Uses Kotlin-based pipeline files, which are approachable for most developers. 22 | 23 | [Next: Ch2 Software as a Service](../../ch2_software_as_a_service/README.md) 24 | -------------------------------------------------------------------------------- /ch1_self_hosted/README.md: -------------------------------------------------------------------------------- 1 | # Ch1 Self Hosted 2 | - [01_01 Jenkins](./01_01_jenkins/README.md) 3 | - [01_02 Bamboo](./01_02_bamboo/README.md) 4 | - [01_03 TeamCity](./01_03_teamcity/README.md) 5 | - [01_04 Self Hosted Tool Summary](./01_04_self_hosted_tool_summary/README.md) 6 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/Travis-Environment-Variables-SCR-20230916-shdl.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_01_travis_ci/Travis-Environment-Variables-SCR-20230916-shdl.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_01_travis_ci/travis.yml: -------------------------------------------------------------------------------- 1 | --- 2 | # Specify the programming language to be used, in this case, Python. 3 | language: python 4 | 5 | # Define the Python version to be used for the build. 6 | python: 7 | - "3.9" 8 | 9 | # Try to speed up builds with caching 10 | cache: pip 11 | 12 | # 'install' represents the 'Requirements' stage of the pipeline. 13 | install: 14 | # Install required Python packages specified in the 'requirements.txt' file. 15 | - pip install --quiet --upgrade --requirement requirements.txt 16 | 17 | # Install AWS CLI 18 | - pip install --quiet --upgrade awscli 19 | 20 | # Each step in the 'script' represents the remaining stages of the pipeline. 21 | script: 22 | 23 | # Check: Run the 'check', 'lint', and 'test' targets defined in the Makefile. 24 | - make check lint test 25 | 26 | # Build: Clean previous builds and initiate a new build. 27 | - make clean build 28 | 29 | # Deploy Staging: Deploy to the staging environment using specified parameters. 30 | - make deploy FUNCTION=${STAGING_FUNCTION_NAME} PLATFORM="Travis CI" VERSION=${TRAVIS_COMMIT} BUILD_NUMBER=${TRAVIS_BUILD_NUMBER} 31 | 32 | # Test Staging: Perform deployment testing for the staging environment. 33 | - make testdeployment URL=${STAGING_URL} VERSION=${TRAVIS_COMMIT} 34 | 35 | # Deploy Production: Deploy to the production environment using specified parameters. 36 | - make deploy FUNCTION=${PRODUCTION_FUNCTION_NAME} PLATFORM="Travis CI" VERSION=${TRAVIS_COMMIT} BUILD_NUMBER=${TRAVIS_BUILD_NUMBER} 37 | 38 | # Test Production: Perform deployment testing for the production environment. 
39 | - make testdeployment URL=${PRODUCTION_URL} VERSION=${TRAVIS_COMMIT} -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CicleCI-Trigger-Pipeline-SCR-20230917-pesy.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CicleCI-Trigger-Pipeline-SCR-20230917-pesy.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CircleCI-Build-Numbers-SCR-20230917-phnu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CircleCI-Build-Numbers-SCR-20230917-phnu.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CircleCI-Environment-Variables-SCR-20230917-ngsd.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CircleCI-Environment-Variables-SCR-20230917-ngsd.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CircleCI-Project-Settings-SCR-20230917-ozxw.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CircleCI-Project-Settings-SCR-20230917-ozxw.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CircleCI-config-after-SCR-20230917-ouax.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CircleCI-config-after-SCR-20230917-ouax.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/CircleCI-config-before-SCR-20230917-otkq.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch2_software_as_a_service/02_02_circleci/CircleCI-config-before-SCR-20230917-otkq.png -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - 
install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/config.yml: -------------------------------------------------------------------------------- 1 | # Use the latest 2.1 version of CircleCI pipeline process engine. 2 | # See: https://circleci.com/docs/configuration-reference 3 | version: 2.1 4 | 5 | # Orbs are reusable packages of CircleCI configuration that you may share across projects, enabling you to create encapsulated, parameterized commands, jobs, and executors that can be used across multiple projects. 6 | # See: https://circleci.com/docs/orb-intro/ 7 | orbs: 8 | # The python orb contains a set of prepackaged CircleCI configuration you can use repeatedly in your configuration files 9 | # Orb commands and jobs help you with common scripting around a language/tool 10 | # so you dont have to copy and paste it everywhere. 
11 | # See the orb documentation here: https://circleci.com/developer/orbs/orb/circleci/python 12 | python: circleci/python@1.5.0 13 | 14 | # Define a job to be invoked later in a workflow. 15 | # See: https://circleci.com/docs/configuration-reference/#jobs 16 | jobs: 17 | integration: # This is the name of the job 18 | # These next lines define a Docker executor: https://circleci.com/docs/executor-types/ 19 | # You can specify an image from Dockerhub or use one of the convenience images from CircleCI's Developer Hub 20 | # A list of available CircleCI Docker convenience images is available here: https://circleci.com/developer/images/image/cimg/python 21 | # The executor is the environment in which the steps below will be executed - below will use a python 3.10.2 container 22 | # Change the version below to your required version of python 23 | docker: 24 | - image: cimg/python:3.10.2 25 | # Checkout the code as the first step. This is a dedicated CircleCI step. 26 | # The python orb's install-packages step will install the dependencies from a Pipfile via Pipenv by default. 27 | # Here we're making sure we just use the system-wide pip. By default it uses the project root's requirements.txt. 28 | # Then run your tests! 29 | # CircleCI will report the results back to your VCS provider. 30 | steps: 31 | - checkout 32 | # This step takes care of the requirements 33 | - python/install-packages: 34 | pkg-manager: pip 35 | - run: 36 | name: install aws CLI 37 | command: pip install --quiet --upgrade awscli 38 | - run: 39 | name: check 40 | command: make check lint test 41 | 42 | build: # New job for build 43 | docker: 44 | - image: cimg/python:3.10.2 45 | steps: 46 | - checkout 47 | - run: 48 | name: install aws cli 49 | command: pip install --quiet --upgrade awscli 50 | - run: 51 | name: build 52 | command: make clean build 53 | - persist_to_workspace: # Persist the lambda.zip to workspace 54 | root: . 55 | paths: 56 | - lambda.zip 57 | 58 | deploy-test-staging: 59 | docker: 60 | - image: cimg/python:3.10.2 61 | steps: 62 | - checkout 63 | - attach_workspace: # Attach workspace to get the lambda.zip 64 | at: . 65 | - run: 66 | name: install aws cli 67 | command: pip install --quiet --upgrade awscli 68 | - run: 69 | name: deploy 70 | command: make deploy FUNCTION=${STAGING_FUNCTION_NAME} PLATFORM="CircleCI" VERSION=${CIRCLE_SHA1} BUILD_NUMBER=${CIRCLE_BUILD_NUM} 71 | - run: 72 | name: test 73 | command: make testdeployment URL=${STAGING_URL} VERSION=${CIRCLE_SHA1} 74 | 75 | deploy-test-production: 76 | docker: 77 | - image: cimg/python:3.10.2 78 | steps: 79 | - checkout 80 | - attach_workspace: # Attach workspace to get the lambda.zip 81 | at: .
82 | - run: 83 | name: install aws cli 84 | command: pip install --quiet --upgrade awscli 85 | - run: 86 | name: deploy 87 | command: make deploy FUNCTION=${PRODUCTION_FUNCTION_NAME} PLATFORM="CircleCI" VERSION=${CIRCLE_SHA1} BUILD_NUMBER=${CIRCLE_BUILD_NUM} 88 | - run: 89 | name: test 90 | command: make testdeployment URL=${PRODUCTION_URL} VERSION=${CIRCLE_SHA1} 91 | 92 | 93 | # Invoke jobs via workflows 94 | # See: https://circleci.com/docs/configuration-reference/#workflows 95 | workflows: 96 | experimental-pipeline: 97 | jobs: 98 | - integration 99 | - build: 100 | requires: 101 | - integration 102 | - deploy-test-staging: 103 | requires: 104 | - build 105 | - deploy-test-production: 106 | requires: 107 | - deploy-test-staging -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_02_circleci/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/02_03_saas_tool_summary/README.md: -------------------------------------------------------------------------------- 1 | # 02_03 SaaS Tool Summary 2 | This summary highlights the advantages of SaaS CI tools and introduces Travis CI and CircleCI along with some highlights. 3 | 4 | - **SaaS CI Tools:** 5 | - Offer easy configuration and maintenance-free operations. 6 | - Provide scalability for increased capacity, such as more compute resources and build minutes, as needed. 7 | 8 | - **Travis CI:** 9 | - Preferred choice for large open source projects. 10 | - Integrates seamlessly with GitHub; support for other code repositories is being added. 11 | - Utilizes intuitive and elegant YAML configuration for pipelines. 12 | 13 | - **CircleCI:** 14 | - Known for speed and efficiency in testing and building applications. 15 | - Offers Orbs for creating reusable pipeline components. 16 | - Includes CircleCI Insights, a feature displaying project history trends. 17 | 18 | [Next: Ch3 Cloud Service Providers](../../ch3_cloud_service_providers/README.md) 19 | -------------------------------------------------------------------------------- /ch2_software_as_a_service/README.md: -------------------------------------------------------------------------------- 1 | # Ch2 Software as a Service 2 | - [02_01 Travis CI](./02_01_travis_ci/README.md) 3 | - [02_02 CircleCI](./02_02_circleci/README.md) 4 | - [02_03 SaaS Tool Summary](./02_03_saas_tool_summary/README.md) 5 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update."
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/buildspec.yml: -------------------------------------------------------------------------------- 1 | # AWS CodeBuild specification version. 2 | version: 0.2 3 | 4 | # Define the shell and environment variables for the build. 5 | env: 6 | shell: bash 7 | variables: 8 | FUNCTION_NAME: 9 | URL: 10 | 11 | # Define the different phases of the build lifecycle. 12 | phases: 13 | 14 | # Install necessary dependencies. 15 | install: 16 | commands: 17 | - python3 -m venv local 18 | - . ./local/bin/activate 19 | - make requirements 20 | 21 | # Pre-build phase for code quality checks and testing. 22 | pre_build: 23 | commands: 24 | - . ./local/bin/activate 25 | - make check lint 26 | - make test 27 | 28 | # Main build phase. 29 | build: 30 | commands: 31 | - make build 32 | 33 | # Post-build phase, usually for deployments or further testing. 34 | post_build: 35 | commands: 36 | - make deploy PLATFORM="AWS CodeBuild" FUNCTION=$FUNCTION_NAME VERSION=$CODEBUILD_RESOLVED_SOURCE_VERSION BUILD_NUMBER=$CODEBUILD_BUILD_NUMBER 37 | - make testdeployment URL=$URL VERSION=$CODEBUILD_RESOLVED_SOURCE_VERSION 38 | 39 | # Specify build artifacts to be stored. 40 | artifacts: 41 | files: 42 | - 'lambda.zip' 43 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 
3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_01_aws_codepipeline_codebuild/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/Azure-Pipelines-Variabiles-SCR-20231015-pric.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch3_cloud_service_providers/03_02_azure_pipelines/Azure-Pipelines-Variabiles-SCR-20231015-pric.png -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/README.md: -------------------------------------------------------------------------------- 1 | # 03_02 Azure Pipelines 2 | Azure DevOps includes a complete set of hosted tools for application development with Pipelines as the main tool for building, testing, and deploying applications. 3 | 4 | - [Azure Pipelines](https://azure.microsoft.com/en-us/products/devops/pipelines/) 5 | 6 | ## Recommended Resources 7 | - [Azure DevOps for Beginners on LinkedIn Learning](https://www.linkedin.com/learning/azure-devops-for-beginners) 8 | 9 | 10 | ## Prerequisites 11 | Having the following items in place before starting this lab will help you have a smooth experience. 12 | 13 | 1. An [Azure DevOps account](https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-sign-up?view=azure-devops) is needed to host the code for this exercise and create the pipeline. 14 | 15 | Follow the instructions on the linked page to create an account. 16 | 17 | 2. An [Amazon Web Services account](https://aws.amazon.com/free) is needed to deploy the sample application used for the deployment target. 18 | 3. The sample application should be in place before starting. See [00_06 About the Exercise Files](../../ch0_introduction/00_06_about_the_exercise_files/README.md) for steps to deploy the sample application. 19 | 4. The exercise files for the course should be downloaded and accessible on your local system. 
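For orientation before working through the steps in the next section: however the project is set up, the pipeline simply calls the same Makefile targets used by every other tool in this course, with AWS credentials and function/URL values supplied as pipeline variables. The sketch below is illustrative only — it is not the course's actual `azure-pipelines.yml` — and it assumes an Ubuntu Microsoft-hosted agent plus the pipeline variables created during configuration (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION, STAGING_FUNCTION_NAME, STAGING_URL, and the PRODUCTION_* equivalents).

```yaml
# Illustrative sketch only -- not the actual azure-pipelines.yml from the exercise files.
trigger:
  - main

pool:
  vmImage: ubuntu-latest  # assumes an Ubuntu Microsoft-hosted agent

steps:
  # Requirements: install the linters and AWS CLI listed in requirements.txt.
  - script: make requirements
    displayName: Requirements

  # Integration: environment check, linting, and unit tests.
  - script: make check lint test
    displayName: Integration

  # Build: package index.py, data.json, and template.html into lambda.zip.
  - script: make clean build
    displayName: Build

  # Deploy to staging and verify the deployed version.
  - script: |
      make deploy FUNCTION=$(STAGING_FUNCTION_NAME) PLATFORM="Azure DevOps" VERSION=$(Build.SourceVersion) BUILD_NUMBER=$(Build.BuildNumber)
      make testdeployment URL=$(STAGING_URL) VERSION=$(Build.SourceVersion)
    displayName: Deploy and test staging
    env:
      # Secret variables are not exposed to scripts automatically, so map them in explicitly.
      AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
      AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
      AWS_DEFAULT_REGION: $(AWS_DEFAULT_REGION)
```

A production stage would repeat the deploy/test pair with the PRODUCTION_* variables. The real pipeline file in the exercise files may structure its stages and jobs differently; treat this only as a map of how the variables and Makefile targets fit together.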
20 | 21 | ## Implement the Experimental Pipeline 22 | To implement the experimental pipeline in Azure Pipelines, you will need to create an Azure DevOps project, add the exercise files, and modify the files for your project if needed. 23 | 24 | Next, you'll configure the project that implements the pipeline. 25 | 26 | And finally, you'll trigger the pipeline to deploy the sample application. 27 | 28 | Before starting these steps, open the Output tab of the Cloudformation stack for the sample application. You'll be referencing values displayed on that tab. 29 | 30 | ### 1. Create an Azure DevOps project for the repo and pipeline 31 | For this exercise, we'll use the repo service included with an Azure DevOps project. 32 | 33 | 1. From the Azure DevOps homepage, select **+ New Project**. 34 | 2. Enter a project name and description. Under "Visibility", select **Public**. Select **Create**. 35 | 3. From the homepage of the new project, Select **Repos**. 36 | 4. At the bottom of the page, under "Initialize 37 | main branch with a README or gitignore", select the option to add a `README` and choose `Python` for the `.gitignore`. Select **Initialize**. 38 | 1. On the new repo page, select the three-dot menu on the upper right and then select **Upload files**. 39 | 2. Select **Browse** and navigate to the location where you downloaded the files for this lesson. Select all of the files and then select **Open**. 40 | 3. Select **Commit**. 41 | 42 | 43 | ### 2. Configure and run the pipeline 44 | 1. Select **Set Up Build** 45 | 1. Select **Variables** -> **New variable** 46 | 1. Add the name and value for each of the following parameters. Select the "+" for each additional variable. *Note: For AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY select the option for "Keep this value secret".* 47 | 48 | - AWS_ACCESS_KEY_ID 49 | - AWS_SECRET_ACCESS_KEY 50 | - AWS_DEFAULT_REGION 51 | - STAGING_FUNCTION_NAME 52 | - STAGING_URL 53 | - PRODUCTION_FUNCTION_NAME 54 | - PRODUCTION_URL 55 | 56 | Your completed variable configuration should appear similar to the following: 57 |  58 | 59 | When all variables are in place select **Save**. 60 | 61 | 1. Run the pipeline by selecting **Run**. 62 | 1. Allow the build to complete. 63 | 1. Open the URLs for the sample application's staging and production environments. For both environments, confirm that the deployment platform is "Azure DevOps" and the build number matches the last successful pipeline number. 64 | 1. If any errors are encountered, review the logs and make corrections as needed. Consider reviewing the configuration steps for the parameters. If you are not able to resolve the errors, please post a question on LinkedIn Learning in the course Q&A section. 65 | 66 | 67 | ## Additional Information 68 | - [Specify jobs in your pipeline](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/phases) 69 | - [Pricing for Azure DevOps](https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/) 70 | 71 | [Next: 03_03 GCP Cloud Build](../03_03_gcp_cloud_build/README.md) 72 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_02_azure_pipelines/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/cloud-build-project-number.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch3_cloud_service_providers/03_03_gcp_cloud_build/cloud-build-project-number.png -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/cloud-build-secret-manager-secret-accessor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch3_cloud_service_providers/03_03_gcp_cloud_build/cloud-build-secret-manager-secret-accessor.png -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/cloudbuild.yaml: -------------------------------------------------------------------------------- 1 | steps: 2 | # Step 0: Create a virtual environment named 'local' 3 | - name: 'python' 4 | args: ['/usr/local/bin/python', '-m', 'venv', 'local'] 5 | 6 | # Step 1: Activate the virtual environment, install AWS CLI and then use 'make' to install the requirements 7 | - name: 'python' 8 | entrypoint: '/usr/bin/bash' 9 | args: 10 | - '-c' 11 | - | 12 | source ./local/bin/activate 13 | pip install awscli 14 | /usr/bin/make requirements 15 | 16 | # Step 2: install zip, check the environment, and lint the code 17 | - name: 'python' 18 | entrypoint: '/usr/bin/bash' 19 | args: 20 | - '-c' 21 | - | 22 | apt update && apt install zip -qq 23 | source ./local/bin/activate 24 | /usr/bin/make check lint 25 | 26 | # Step 3: Run tests 27 | - name: 'python' 28 | entrypoint: '/usr/bin/bash' 29 | args: 30 | - '-c' 31 | - | 32 | source ./local/bin/activate 33 | /usr/bin/make test 34 | 35 | # Step 4: Install zip, activate the virtual environment, and build the project 36 | - name: 'python' 37 | entrypoint: '/usr/bin/bash' 38 | args: 39 | - '-c' 40 | - | 41 | apt update && apt install zip -qq 42 | source ./local/bin/activate 43 | /usr/bin/make build 44 | 45 | # Step 5: Deploy to staging using AWS CLI (AWS secrets are injected from Secret Manager) 46 | - name: 'python' 47 | entrypoint: '/usr/bin/bash' 48 | env: 49 | - 'AWS_DEFAULT_REGION=$_AWS_DEFAULT_REGION' 50 | secretEnv: ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'] 51 | args: 52 | - '-c' 53 | - | 54 | source ./local/bin/activate 55 | /usr/bin/make deploy PLATFORM="Google Cloud Build" \ 56 | FUNCTION=$_STAGING_FUNCTION_NAME \ 57 | VERSION=$COMMIT_SHA \ 58 | BUILD_NUMBER=$BUILD_ID 59 | 60 | # Step 6: Test the deployed function's staging environment 61 | - name: 'python' 62 | entrypoint: '/usr/bin/bash' 63 | args: 64 | - '-c' 65 | - | 66 | /usr/bin/curl -s $_STAGING_URL | /usr/bin/grep $BUILD_ID 67 | 68 | # Step 7: Deploy to production using AWS CLI (AWS secrets are injected from Secret Manager) 69 | - name: 'python' 70 | entrypoint: '/usr/bin/bash' 71 | env: 72 | - 'AWS_DEFAULT_REGION=$_AWS_DEFAULT_REGION' 73 | secretEnv: ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'] 74 | args: 75 | - '-c' 76 | - | 77 | source ./local/bin/activate 78 | /usr/bin/make deploy PLATFORM="Google Cloud Build" \ 79 | FUNCTION=$_PRODUCTION_FUNCTION_NAME \ 80 | VERSION=$COMMIT_SHA \ 81 | BUILD_NUMBER=$BUILD_ID 82 | 83 | # Step 8: Test the deployed function's production environment 84 | - name: 'python' 85 | entrypoint: '/usr/bin/bash' 86 | args: 87 | - '-c' 88 | - | 89 | /usr/bin/curl -s $_PRODUCTION_URL | /usr/bin/grep $BUILD_ID 90 | 91 | # Substitutions: default values 
for environment variables that can be overridden when triggering the build 92 | substitutions: 93 | _AWS_DEFAULT_REGION: UPDATE_THIS_VALUE 94 | _STAGING_FUNCTION_NAME: UPDATE_THIS_VALUE 95 | _STAGING_URL: UPDATE_THIS_VALUE 96 | _PRODUCTION_FUNCTION_NAME: UPDATE_THIS_VALUE 97 | _PRODUCTION_URL: UPDATE_THIS_VALUE 98 | 99 | # Secrets: Fetch AWS secrets from Google Cloud Secret Manager 100 | availableSecrets: 101 | secretManager: 102 | - versionName: projects/$PROJECT_ID/secrets/AWS_ACCESS_KEY_ID/versions/latest 103 | env: 'AWS_ACCESS_KEY_ID' 104 | - versionName: projects/$PROJECT_ID/secrets/AWS_SECRET_ACCESS_KEY/versions/latest 105 | env: 'AWS_SECRET_ACCESS_KEY' -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_03_gcp_cloud_build/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/03_04_cloud_service_provider_tool_summary/README.md: -------------------------------------------------------------------------------- 1 | # 03_04 Cloud Service Provider Tool Summary 2 | This summary highlights the advantages of using CI/CD tools from cloud service providers and introduces specific tools from AWS, Azure, and GCP along with their respective strengths. 3 | 4 | - **Cloud Service Provider CI/CD Tools:** 5 | - Seamlessly integrated into their respective cloud ecosystems. 6 | - Advantage of building and deploying software in the same cloud environment for improved access management and security controls. 7 | - Offers scalability with access to essentially unlimited cloud resources. 8 | 9 | - **AWS CodePipeline and CodeBuild**: 10 | - Both tools work together for CI/CD projects hosted on AWS. 11 | - They help model pipelines with stages, actions, and reusable build components. 12 | 13 | - **Azure Pipelines**: 14 | - Integrates well with the project management features of the Azure DevOps platform. 15 | - Includes features for release management. 16 | 17 | - **Google Cloud Platform (GCP) Cloud Build**: 18 | - Known for its fast run times and simple configuration. 19 | - Ideal for projects based on containers, with native functionality for creating container images. 20 | 21 | [Next: Ch4 Code Repositories](../../ch4_code_repositories/README.md) 22 | -------------------------------------------------------------------------------- /ch3_cloud_service_providers/README.md: -------------------------------------------------------------------------------- 1 | # Ch3 Cloud Service Providers 2 | - [03_01 AWS CodePipeline and CodeBuild](./03_01_aws_codepipeline_codebuild/README.md) 3 | - [03_02 Azure Pipelines](./03_02_azure_pipelines/README.md) 4 | - [03_03 GCP Cloud Build](./03_03_gcp_cloud_build/README.md) 5 | - [03_04 Cloud Service Provider Tool Summary](./03_04_cloud_service_provider_tool_summary/README.md) 6 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/0-github-actions-rename-workflow-before.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_01_github_actions/0-github-actions-rename-workflow-before.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/1-github-actions-rename-workflow-after.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_01_github_actions/1-github-actions-rename-workflow-after.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = 
undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/pipeline.yml: -------------------------------------------------------------------------------- 1 | name: Experimental Pipeline 2 | 3 | # This workflow is triggered on every push and can also be manually dispatched. 4 | on: 5 | push: 6 | workflow_dispatch: 7 | 8 | # Ensures that only one instance of this workflow can run at a time. 
9 | concurrency: 10 | group: ${{ github.workflow }} 11 | 12 | jobs: 13 | 14 | # First job installs dependencies and checks the runner environment 15 | Check: 16 | runs-on: ubuntu-latest 17 | steps: 18 | - name: Checkout code 19 | uses: actions/checkout@v4 20 | 21 | - name: Set up Python environment with caching enabled for pip dependencies 22 | uses: actions/setup-python@v4 23 | with: 24 | python-version: '3.11' 25 | cache: 'pip' 26 | 27 | - name: Install the project requirements 28 | run: make requirements 29 | 30 | - name: Check the environment 31 | run: make check 32 | 33 | - name: Lint the code to ensure code quality 34 | run: make lint 35 | 36 | # Test the code. This job is dependent on the 'Check' job. 37 | Test: 38 | needs: Check 39 | runs-on: ubuntu-latest 40 | steps: 41 | - uses: actions/checkout@v4.1.0 42 | 43 | - name: Set up Python environment with caching enabled for pip dependencies 44 | uses: actions/setup-python@v4 45 | with: 46 | python-version: '3.11' 47 | cache: 'pip' 48 | 49 | - name: Run the tests defined in the project 50 | run: make test 51 | 52 | # Build the code. This job depends on the 'Test' job. 53 | Build: 54 | needs: Test 55 | runs-on: ubuntu-latest 56 | steps: 57 | - uses: actions/checkout@v4.1.0 58 | 59 | # Build the code and create a lambda.zip artifact 60 | - name: Create the lambda.zip artifact 61 | run: make build 62 | 63 | # Archive the lambda.zip file for future use in subsequent jobs 64 | - name: Archive lambda.zip artifact 65 | if: success() 66 | uses: actions/upload-artifact@v3 67 | with: 68 | name: lambda-artifact 69 | path: ./lambda.zip 70 | 71 | # Deploy to Staging environment. Depends on 'Build' job. 72 | Staging: 73 | environment: Staging 74 | needs: Build 75 | runs-on: ubuntu-latest 76 | steps: 77 | - uses: actions/checkout@v4.1.0 78 | 79 | # Set up AWS credentials for deployments 80 | # The credentials are retrieved from the secrets defined in the project settings 81 | - name: Configure AWS Credentials 82 | uses: aws-actions/configure-aws-credentials@v4 83 | with: 84 | aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }} 85 | aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} 86 | aws-region: ${{ vars.AWS_DEFAULT_REGION }} 87 | 88 | # Download the lambda.zip artifact from the 'Build' job 89 | - name: Retrieve lambda.zip 90 | uses: actions/download-artifact@v3 91 | with: 92 | name: lambda-artifact 93 | 94 | # Deploy and test the Staging environment 95 | # Variables will be read from the environment definitions in the repo settings 96 | - name: Deploy 97 | run: make deploy \ 98 | PLATFORM="GitHub Actions" \ 99 | FUNCTION="${{ vars.FUNCTION_NAME }}" \ 100 | VERSION="${GITHUB_SHA}" \ 101 | BUILD_NUMBER="${GITHUB_RUN_NUMBER}" 102 | 103 | - name: Test 104 | run: make testdeployment URL=${{ vars.URL }} VERSION=${GITHUB_SHA} 105 | 106 | # Deploy to Production environment. Depends on 'Staging' job.
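# Because the Staging and Production jobs reference named environments,
# protection rules (such as required reviewers or a wait timer) can be configured
# under the repository's Settings > Environments to pause a deployment until it is approved.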
107 | Production: 108 | needs: Staging 109 | environment: Production 110 | runs-on: ubuntu-latest 111 | steps: 112 | - name: Checkout code 113 | uses: actions/checkout@v4.1.0 114 | 115 | # Configure AWS credentials for Production deployment 116 | # The credentials are retrieved from the secrets defined in the project settings 117 | - name: Configure AWS Credentials 118 | uses: aws-actions/configure-aws-credentials@v4 119 | with: 120 | aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }} 121 | aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} 122 | aws-region: ${{ vars.AWS_DEFAULT_REGION }} 123 | 124 | # Retrieve the lambda.zip artifact for Production deployment 125 | - name: Retrieve lambda.zip 126 | uses: actions/download-artifact@v3 127 | with: 128 | name: lambda-artifact 129 | 130 | # Deploy and test the Production environment 131 | # Variables will be read from the environment definitions in the repo settings 132 | - name: Deploy 133 | run: make deploy \ 134 | PLATFORM="GitHub Actions" \ 135 | FUNCTION="${{ vars.FUNCTION_NAME }}" \ 136 | VERSION="${GITHUB_SHA}" \ 137 | BUILD_NUMBER="${GITHUB_RUN_NUMBER}" 138 | 139 | - name: Test 140 | run: make testdeployment URL=${{ vars.URL }} VERSION=${GITHUB_SHA} 141 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_01_github_actions/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/0-gitlabci-edit-web-ide.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/0-gitlabci-edit-web-ide.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/1-gitlabci-upload-files.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/1-gitlabci-upload-files.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/2-gitlabci-rename-pipeline-file-before.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/2-gitlabci-rename-pipeline-file-before.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/3-gitlabci-rename-pipeline-file-after.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/3-gitlabci-rename-pipeline-file-after.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/4-gitlabci-variables.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/4-gitlabci-variables.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/5-gitlabci-pipeline.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_02_gitlab_ci/5-gitlabci-pipeline.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | 
@echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/gitlab-ci.yml: -------------------------------------------------------------------------------- 1 | # Use the Python 3.11 image as the base image for this pipeline. 2 | image: python:3.11 3 | 4 | # Define the stages of the pipeline in the order they should run. 5 | stages: 6 | - requirements 7 | - check 8 | - test 9 | - build 10 | - staging 11 | - production 12 | 13 | # The requirements job sets up the Python virtual environment and installs necessary dependencies. 14 | Requirements: 15 | stage: requirements 16 | # Cache the virtual environment to speed up subsequent jobs. 17 | cache: 18 | key: venv 19 | paths: 20 | - venv 21 | script: 22 | - python -m venv venv 23 | - venv/bin/pip install --upgrade --requirement requirements.txt 24 | - make requirements 25 | 26 | # The check job performs static code analysis and linting. 27 | Check: 28 | stage: check 29 | needs: ["Requirements"] 30 | cache: 31 | policy: pull 32 | key: venv 33 | paths: 34 | - venv 35 | # Setup necessary utilities for the job. 
36 | before_script: 37 | - apt-get update -y 38 | - apt-get install -qq zip 39 | script: 40 | - source venv/bin/activate 41 | - make check 42 | - make lint 43 | 44 | # The test job runs the project's tests. 45 | Test: 46 | stage: test 47 | needs: ["Check"] 48 | cache: 49 | policy: pull 50 | key: venv 51 | paths: 52 | - venv 53 | script: 54 | - source venv/bin/activate 55 | - make test 56 | 57 | # The build job creates the deployable artifact. 58 | Build: 59 | stage: build 60 | needs: ["Test"] 61 | before_script: 62 | - apt-get update -y 63 | - apt-get install -qq zip 64 | script: 65 | - make build 66 | # Store the lambda.zip as an artifact to be used in subsequent stages. 67 | artifacts: 68 | paths: 69 | - ./lambda.zip 70 | 71 | # The staging job deploys the code to the staging environment. 72 | Staging: 73 | stage: staging 74 | environment: Staging 75 | needs: ["Build"] 76 | # Define environment variables for AWS and the function name. 77 | variables: 78 | AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION 79 | AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID 80 | AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY 81 | FUNCTION: $FUNCTION_NAME 82 | cache: 83 | policy: pull 84 | key: venv 85 | paths: 86 | - venv 87 | script: 88 | - source venv/bin/activate 89 | - make deploy PLATFORM="GitLab CI" FUNCTION=$FUNCTION VERSION=$CI_COMMIT_SHA BUILD_NUMBER=$CI_PIPELINE_ID 90 | # Test the deployed code in the staging environment. 91 | - make testdeployment URL=$CI_ENVIRONMENT_URL VERSION=$CI_COMMIT_SHA 92 | dependencies: 93 | - Build 94 | 95 | # The production job deploys the code to the production environment. 96 | Production: 97 | stage: production 98 | environment: Production 99 | needs: ["Build","Staging"] 100 | variables: 101 | AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION 102 | AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID 103 | AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY 104 | FUNCTION: $FUNCTION_NAME 105 | cache: 106 | policy: pull 107 | key: venv 108 | paths: 109 | - venv 110 | script: 111 | - source venv/bin/activate 112 | - make deploy PLATFORM="GitLab CI" FUNCTION=$FUNCTION VERSION=$CI_COMMIT_SHA BUILD_NUMBER=$CI_PIPELINE_ID 113 | # Test the deployed code in the production environment. 114 | - make testdeployment URL=$CI_ENVIRONMENT_URL VERSION=$CI_COMMIT_SHA 115 | dependencies: 116 | - Build 117 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_02_gitlab_ci/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |Environment | 65 |{environment} | 66 |
Version | 69 |{version} | 70 |
Platform | 73 |{platform} | 74 |
Build # | 77 |{build_number} | 78 |
Returns the documentation page for the application.
83 | 84 | 85 |Returns all data in JSON format.
87 | 88 | 89 |Returns a specific item by its ID in JSON format.
91 | 92 | 93 | 94 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variables to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | # Use the latest version of Python 2 | image: python 3 | 4 | pipelines: 5 | default: 6 | 7 
| # Step 1: Prepare the environment by installing all necessary dependencies. 8 | - step: 9 | name: Requirements 10 | caches: 11 | - pip 12 | script: 13 | - make requirements 14 | 15 | # Step 2: In parallel, conduct code checks and run tests to validate the code quality and functionality. 16 | - parallel: 17 | 18 | # Sub-Step 1: Perform code checks and linting to ensure code quality. 19 | - step: 20 | name: Check 21 | caches: 22 | - pip 23 | script: 24 | - apt-get update -y 25 | - apt-get install -qq zip 26 | - make requirements 27 | - make check 28 | - make lint 29 | 30 | # Sub-Step 2: Execute tests to confirm that the code works as expected. 31 | - step: 32 | name: Test 33 | caches: 34 | - pip 35 | script: 36 | - make requirements 37 | - make test 38 | 39 | # Step 3: Build the application and generate a ZIP artifact for deployment. 40 | - step: 41 | name: Build 42 | script: 43 | - apt-get update -y 44 | - apt-get install -qq zip 45 | - make build 46 | artifacts: 47 | - lambda.zip 48 | 49 | # Step 4: Deploy the application to the staging environment and validate the deployment. 50 | - step: 51 | name: Staging 52 | deployment: staging 53 | caches: 54 | - pip 55 | script: 56 | - make requirements 57 | - make deploy PLATFORM="Bitbucket Pipelines" \ 58 | FUNCTION=$STAGING_FUNCTION_NAME \ 59 | VERSION=$BITBUCKET_COMMIT \ 60 | BUILD_NUMBER=$BITBUCKET_BUILD_NUMBER 61 | - make testdeployment URL=$STAGING_URL VERSION=$BITBUCKET_COMMIT 62 | 63 | # Step 5: If staging is successful, deploy the application to the production environment and validate the deployment. 64 | - step: 65 | name: Production 66 | deployment: production 67 | caches: 68 | - pip 69 | script: 70 | - make requirements 71 | - make deploy PLATFORM="Bitbucket Pipelines" \ 72 | FUNCTION=$PRODUCTION_FUNCTION_NAME \ 73 | VERSION=$BITBUCKET_COMMIT \ 74 | BUILD_NUMBER=$BITBUCKET_BUILD_NUMBER 75 | - make testdeployment URL=$PRODUCTION_URL VERSION=$BITBUCKET_COMMIT 76 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00001.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00001.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00002.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00002.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00004.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00004.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00005.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00005.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00006.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00006.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00007.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00007.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00008.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00008.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00009.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/create-access-token-00009.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | """ 4 | import os 5 | import json 6 | 7 | 8 | def handler(event, context): 9 | """ 10 | The lambda handler function. 
11 | """ 12 | del context # Unused 13 | 14 | if event["rawPath"] == "/": 15 | environment = os.environ.get("ENVIRONMENT", "undefined") 16 | platform = os.environ.get("PLATFORM", "undefined") 17 | version = os.environ.get("VERSION", "undefined") 18 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 19 | 20 | # Read the HTML template 21 | with open("template.html", mode="r", encoding="utf-8") as template_file: 22 | template = template_file.read() 23 | 24 | # Render the template 25 | docs_page = template.format( 26 | environment=environment, 27 | version=version, 28 | platform=platform, 29 | build_number=build_number, 30 | ) 31 | 32 | # Return the rendered template 33 | return { 34 | "statusCode": 200, 35 | "headers": { 36 | "Content-Type": "text/html", 37 | }, 38 | "body": docs_page, 39 | } 40 | 41 | # Read the data from the JSON file 42 | with open("data.json", mode="r", encoding="utf-8") as data_file: 43 | data = json.load(data_file) 44 | 45 | # Route /data: return all data 46 | if event["rawPath"] == "/data": 47 | return { 48 | "statusCode": 200, 49 | "headers": { 50 | "Content-Type": "application/json", 51 | }, 52 | "body": json.dumps(data), 53 | } 54 | 55 | # Route /data/{item_id}: return a single item 56 | # Get the item_id from the event's rawPath 57 | item_id = event["rawPath"][1:] 58 | 59 | # Check the item_id against each item in the data 60 | for item in data: 61 | # Return the item if the item_id matches 62 | if item["id"] == item_id: 63 | return { 64 | "statusCode": 200, 65 | "headers": { 66 | "Content-Type": "application/json", 67 | }, 68 | "body": json.dumps(item), 69 | } 70 | 71 | # Return a 404 if the item_id doesn't match any item 72 | return { 73 | "statusCode": 404, 74 | "headers": { 75 | "Content-Type": "application/json", 76 | }, 77 | "body": json.dumps( 78 | { 79 | "message": f"item_id {item_id} not found", 80 | "event": event, 81 | "item_id": item_id, 82 | } 83 | ), 84 | } 85 | 86 | 87 | def main(): 88 | """ 89 | A main function for testing the handler function. 
90 | """ 91 | # Simulate an event for testing 92 | event = {"rawPath": "/1"} 93 | context = None 94 | 95 | # Call the handler function 96 | response = handler(event, context) 97 | 98 | # Print the response 99 | print(json.dumps(response, indent=2)) 100 | 101 | 102 | if __name__ == "__main__": 103 | main() 104 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/pipeline-settings-00001.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/pipeline-settings-00001.png -------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/pipeline-settings-00002.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/pipeline-settings-00002.png 
-------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/requirements.txt: -------------------------------------------------------------------------------- 1 | black 2 | pylint 3 | flake8 4 | awscli 5 |
-------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/run-pipeline-00001.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/run-pipeline-00001.png
-------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/run-pipeline-00002.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch4_code_repositories/04_03_bitbucket_pipelines/run-pipeline-00002.png
-------------------------------------------------------------------------------- /ch4_code_repositories/04_03_bitbucket_pipelines/template.html: --------------------------------------------------------------------------------
(template text content; the HTML markup is not preserved in this dump)
The Sample Application - {environment}
Environment | {environment}
Version | {version}
Platform | {platform}
Build # | {build_number}
/ : Returns the documentation page for the application.
/data : Returns all data in JSON format.
/{item_id} : Returns a specific item by its ID in JSON format.
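The bitbucket-pipelines.yml above runs every step, including the Production deployment, automatically on each push. A common variation is to limit the pipeline to the main branch and require a manual approval before production. The sketch below is illustrative only and is not part of the exercise files; it assumes the same Makefile targets and the same $PRODUCTION_FUNCTION_NAME and $PRODUCTION_URL variables used above.

```yaml
# Hypothetical variation of bitbucket-pipelines.yml: run only on main,
# and pause for a manual approval before deploying to production.
image: python

pipelines:
  branches:
    main:
      - step:
          name: Build
          script:
            - apt-get update -y && apt-get install -qq zip
            - make build
          artifacts:
            - lambda.zip
      - step:
          name: Production
          deployment: production
          trigger: manual   # a person must resume the pipeline in the Bitbucket UI
          caches:
            - pip
          script:
            - make requirements
            - make deploy PLATFORM="Bitbucket Pipelines" FUNCTION=$PRODUCTION_FUNCTION_NAME VERSION=$BITBUCKET_COMMIT BUILD_NUMBER=$BITBUCKET_BUILD_NUMBER
            - make testdeployment URL=$PRODUCTION_URL VERSION=$BITBUCKET_COMMIT
```

With `trigger: manual`, Bitbucket pauses after the preceding step and waits for someone to resume the pipeline, which adds a simple promotion gate in front of the production deployment.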
-------------------------------------------------------------------------------- /ch4_code_repositories/04_04_code_repository_tool_summary/README.md: -------------------------------------------------------------------------------- 1 | # 04_04 Code Repository Tool Summary 2 | This summary highlights the advantages of using CI tools provided by code repositories and introduces GitHub Actions, GitLab CI, and Bitbucket Pipelines along with their respective strengths. 3 | 4 | The CI tools within code repositories offer a streamlined development experience, advanced analytics, and flexibility in workflow management. Minimal configuration sketches for GitHub Actions and GitLab CI are shown after this summary. 5 | 6 | - **CI Tools in Code Repositories:** 7 | - Unique advantage of residing in the same location as developers' code. 8 | - Productivity benefit of minimizing context switching between tools. 9 | - Advanced pipeline analytics, increased observability of code deployments, and the use of custom pipeline components. 10 | - Hosted Runners: 11 | - Code repo CI tools offer hosted runners for various operating systems. 12 | - There's also the option to use self-hosted runners for more control. 13 | 14 | - **GitHub Actions:** 15 | - Supports multiple workflows managed in separate files. 16 | - Triggered by various activities within a GitHub repository. 17 | - Benefits from the GitHub marketplace with thousands of actions for building pipelines. 18 | 19 | - **GitLab CI:** 20 | - Utilizes a YAML configuration format. 21 | - Provides pipeline functionality for even the most complex projects. 22 | - Includes native tools for analyzing pipeline results and trends. 23 | 24 | - **Bitbucket Pipelines:** 25 | - Strong features for observing pipeline execution and deployments to different environments. 26 | - Integrates seamlessly with Atlassian project management applications. 27 | 28 | [Next: Ch5 Selecting a CI Tool](../../ch5_selecting_a_ci_tool/README.md) 29 |
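To make the notes above concrete, here are two minimal configuration sketches. Both are illustrative only and are not part of the exercise files; the file names, action versions, and image tags are assumptions, and the `make` targets are the ones used throughout this course.

A GitHub Actions workflow lives in its own file under `.github/workflows/` and lists the repository activities that trigger it:

```yaml
# Hypothetical .github/workflows/test.yml
name: Test
on:
  push:               # run on every push
  workflow_dispatch:  # allow manual runs from the Actions tab
jobs:
  test:
    runs-on: ubuntu-latest  # GitHub-hosted runner; a self-hosted runner label could be used instead
    steps:
      - uses: actions/checkout@v4      # marketplace action: check out the repository
      - uses: actions/setup-python@v5  # marketplace action: install Python
        with:
          python-version: "3.12"
      - run: make requirements
      - run: make test
```

GitLab CI keeps its configuration in a single `.gitlab-ci.yml` at the root of the repository, organized into stages and jobs:

```yaml
# Hypothetical minimal .gitlab-ci.yml
image: python:3.12

stages:
  - test

unit-tests:
  stage: test
  script:
    - make requirements
    - make test
```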
-------------------------------------------------------------------------------- /ch4_code_repositories/README.md: -------------------------------------------------------------------------------- 1 | # Ch4 Code Repositories 2 | - [04_01 GitHub Actions](./04_01_github_actions/README.md) 3 | - [04_02 GitLab CI](./04_02_gitlab_ci/README.md) 4 | - [04_03 Bitbucket Pipelines](./04_03_bitbucket_pipelines/README.md) 5 | - [04_04 Code Repository Tool Summary](./04_04_code_repository_tool_summary/README.md) 6 |
-------------------------------------------------------------------------------- /ch5_selecting_a_ci_tool/05_01_selecting_the_right_ci_tool/05_01_selecting_the_right_ci_tool.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/continuous-integration-tools-4490242/2b44c9affe9b98a7706f4343feed8da7d242bcf6/ch5_selecting_a_ci_tool/05_01_selecting_the_right_ci_tool/05_01_selecting_the_right_ci_tool.png
-------------------------------------------------------------------------------- /ch5_selecting_a_ci_tool/05_01_selecting_the_right_ci_tool/README.md: -------------------------------------------------------------------------------- 1 | # 05_01 Selecting the Right CI Tool 2 | This summary provides an overview of the CI/CD tool categories and guidance on choosing the right tool based on individual, team, or enterprise needs. 3 | 4 | ## Tool Categories 5 | Start by considering tools from four categories: 6 | 7 | - **Self-Hosted Tools:** 8 | - Require installation and maintenance effort. 9 | - Offer the most flexibility of the four categories. 10 | 11 | - **Software as a Service (SaaS) Tools:** 12 | - Easy onboarding and modern features, with no installation or maintenance. 13 | 14 | - **Cloud Service Providers:** 15 | - Provide convenient, built-in integration with the provider's other cloud products and services. 16 | 17 | - **Code Repositories:** 18 | - Simplify collaboration by managing CI/CD and code on the same platform. 19 | 20 | ## Selecting CI Tools for Individuals, Teams, and Enterprises 21 |  22 | 23 | - **Individual developers** may prefer SaaS or code repository tools for quick setup. 24 | - **Small teams** looking for scalability and project management capabilities may opt for code repository tools. 25 | - **Businesses and enterprises** have the flexibility to choose from any tool category but may benefit from self-hosted and cloud service provider options for integration and security. 26 | 27 | ## Experiment with Tools and Be Open to Change 28 | - Start with a tool that suits your current needs. 29 | - Be open to trying a different tool if your first choice doesn't fit your requirements. 30 | - Use the exercise files and revisit the course during your CI/CD tool selection process. 31 | 32 | Selecting the right CI/CD tool depends on your specific needs and context. Explore, experiment, and adapt as necessary to find the tool that works best for you. 33 | 34 | [Fin](../../README.md) 35 |
-------------------------------------------------------------------------------- /ch5_selecting_a_ci_tool/README.md: -------------------------------------------------------------------------------- 1 | # Ch5 Selecting a CI Tool 2 | - [05_01 Selecting the Right CI Tool](./05_01_selecting_the_right_ci_tool/README.md) 3 | --------------------------------------------------------------------------------