├── ch3_create_custom_pipes
│   ├── 03_06_challenge_create_a_custom_pipe
│   │   ├── pipe
│   │   │   ├── __init__.py
│   │   │   └── pipe.py
│   │   ├── test
│   │   │   ├── requirements.txt
│   │   │   └── test.py
│   │   ├── requirements.txt
│   │   ├── .editorconfig
│   │   ├── Dockerfile
│   │   ├── pipe.yml
│   │   ├── LICENSE.txt
│   │   ├── bitbucket-pipelines.yml
│   │   ├── .gitignore
│   │   ├── RELEASING.md
│   │   └── README.md
│   ├── 03_07_solution_create_a_custom_pipe
│   │   ├── pipe
│   │   │   ├── __init__.py
│   │   │   └── pipe.py
│   │   ├── test
│   │   │   ├── requirements.txt
│   │   │   └── test.py
│   │   ├── requirements.txt
│   │   ├── .editorconfig
│   │   ├── Dockerfile
│   │   ├── pipe.yml
│   │   ├── LICENSE.txt
│   │   ├── bitbucket-pipelines.yml
│   │   ├── .gitignore
│   │   └── RELEASING.md
│   ├── 03_03_test_a_custom_pipe
│   │   ├── advanced-python
│   │   │   ├── pipe
│   │   │   │   ├── __init__.py
│   │   │   │   └── pipe.py
│   │   │   ├── .gitignore
│   │   │   ├── test
│   │   │   │   ├── requirements.txt
│   │   │   │   ├── __pycache__
│   │   │   │   │   └── test.cpython-310-pytest-7.2.1.pyc
│   │   │   │   └── test.py
│   │   │   ├── requirements.txt
│   │   │   ├── .changes
│   │   │   │   └── next-release
│   │   │   │       └── minor-20181204085425.json
│   │   │   ├── .editorconfig
│   │   │   ├── Dockerfile
│   │   │   ├── pipe.yml
│   │   │   ├── LICENSE.txt
│   │   │   ├── bitbucket-pipelines.yml
│   │   │   └── RELEASING.md
│   │   ├── advanced-bash
│   │   │   ├── .gitignore
│   │   │   ├── .changes
│   │   │   │   └── next-release
│   │   │   │       └── minor-20181204085425.json
│   │   │   ├── .editorconfig
│   │   │   ├── pipe.yml
│   │   │   ├── Dockerfile
│   │   │   ├── pipe
│   │   │   │   └── pipe.sh
│   │   │   ├── test
│   │   │   │   └── test.bats
│   │   │   ├── LICENSE.txt
│   │   │   ├── bitbucket-pipelines.yml
│   │   │   └── RELEASING.md
│   │   ├── simple-python
│   │   │   ├── requirements.txt
│   │   │   ├── Dockerfile
│   │   │   ├── pipe.py
│   │   │   └── bitbucket-pipelines.yml
│   │   └── simple-bash
│   │       ├── pipe.sh
│   │       ├── Dockerfile
│   │       └── bitbucket-pipelines.yml
│   ├── 03_02_develop_a_custom_pipe
│   │   ├── advanced-python
│   │   │   ├── pipe
│   │   │   │   ├── __init__.py
│   │   │   │   └── pipe.py
│   │   │   ├── .gitignore
│   │   │   ├── test
│   │   │   │   ├── requirements.txt
│   │   │   │   └── test.py
│   │   │   ├── requirements.txt
│   │   │   ├── .changes
│   │   │   │   └── next-release
│   │   │   │       └── minor-20181204085425.json
│   │   │   ├── .editorconfig
│   │   │   ├── Dockerfile
│   │   │   ├── pipe.yml
│   │   │   ├── LICENSE.txt
│   │   │   ├── bitbucket-pipelines.yml
│   │   │   ├── RELEASING.md
│   │   │   └── README.md
│   │   ├── advanced-bash
│   │   │   ├── .gitignore
│   │   │   ├── .changes
│   │   │   │   └── next-release
│   │   │   │       └── minor-20181204085425.json
│   │   │   ├── .editorconfig
│   │   │   ├── pipe.yml
│   │   │   ├── Dockerfile
│   │   │   ├── pipe
│   │   │   │   └── pipe.sh
│   │   │   ├── test
│   │   │   │   └── test.bats
│   │   │   ├── LICENSE.txt
│   │   │   ├── bitbucket-pipelines.yml
│   │   │   └── RELEASING.md
│   │   ├── simple-python
│   │   │   ├── requirements.txt
│   │   │   ├── Dockerfile
│   │   │   ├── pipe.py
│   │   │   └── bitbucket-pipelines.yml
│   │   ├── simple-bash
│   │   │   ├── pipe.sh
│   │   │   ├── Dockerfile
│   │   │   └── bitbucket-pipelines.yml
│   │   └── images
│   │       ├── 03-run-generator.png
│   │       ├── 01-check-versions.png
│   │       ├── 02-clone-the-repo.png
│   │       ├── 04-workspace-name.png
│   │       ├── 05-generated-files.png
│   │       ├── SCR-20241217-mjqe.png
│   │       ├── 06-docker-push-error.png
│   │       └── 07-registry-prefixes.png
│   ├── 03_05_use_a_custom_pipe_in_a_pipeline
│   │   ├── images
│   │   │   └── 00-youre-awesome.png
│   │   └── README.md
│   ├── 03_04_deploy_a_custom_pipe_to_a_container_registry
│   │   ├── images
│   │   │   ├── 1-SCR-20241217-lcqn.png
│   │   │   ├── 2-SCR-20241217-ldhz.png
│   │   │   ├── 3-SCR-20241217-lqkt.png
│   │   │   ├── 4-SCR-20241217-lptx.png
│   │   │   ├── 5-SCR-20241217-ntoq.png
│   │   │   └── 6-SCR-20241217-misj.png
│   │   └── README.md
│   └── 03_01_when_to_use_custom_pipes
│       └── README.md
├── ch1_pipeline_optimizations
│   ├── 01_05_cache_dependencies
│   │   ├── python-backend
│   │   │   ├── requirements.txt
│   │   │   ├── app.py
│   │   │   └── app_test.py
│   │   ├── ruby-backend
│   │   │   ├── Gemfile
│   │   │   ├── Gemfile.lock
│   │   │   ├── app.rb
│   │   │   └── app_test.rb
│   │   ├── bitbucket-pipelines.yml
│   │   ├── shared
│   │   │   └── database-schema.json
│   │   └── README.md
│   ├── 01_04_use_conditional_steps
│   │   ├── python-backend
│   │   │   ├── requirements.txt
│   │   │   ├── app.py
│   │   │   └── app_test.py
│   │   ├── ruby-backend
│   │   │   ├── Gemfile
│   │   │   ├── app.rb
│   │   │   └── app_test.rb
│   │   ├── Gemfile.lock
│   │   ├── bitbucket-pipelines.yml
│   │   ├── bitbucket-pipelines-no-conditional-steps.yml
│   │   ├── bitbucket-pipelines-with-conditional-steps.yml
│   │   ├── shared
│   │   │   └── database-schema.json
│   │   └── README.md
│   ├── 01_06_challenge_optimize_a_pipeline
│   │   ├── dev-requirements.txt
│   │   ├── requirements.txt
│   │   ├── images
│   │   │   ├── SCR-20250103-trnw-run-initial-pipeline-1.png
│   │   │   └── SCR-20250103-trwv-run-initial-pipeline-2.png
│   │   ├── bitbucket-pipelines.yml
│   │   ├── Makefile
│   │   ├── .gitignore
│   │   ├── README.md
│   │   └── cluster_analysis.py
│   ├── 01_07_solution_optimize_a_pipeline
│   │   ├── dev-requirements.txt
│   │   ├── requirements.txt
│   │   ├── images
│   │   │   ├── SCR-20250103-trnw-run-initial-pipeline-1.png
│   │   │   └── SCR-20250103-trwv-run-initial-pipeline-2.png
│   │   ├── Makefile
│   │   ├── .gitignore
│   │   ├── bitbucket-pipelines.yml
│   │   └── cluster_analysis.py
│   ├── 01_02_configure_maximum_runtime
│   │   ├── bitbucket-pipelines.yml
│   │   └── README.md
│   ├── 01_03_configure_resource_allocation
│   │   ├── bitbucket-pipelines.yml
│   │   ├── README.md
│   │   ├── 1x-pipelineLog.txt
│   │   └── 2x-pipelineLog.txt
│   └── 01_01_optimizing_pipeline_performance
│       └── README.md
├── ch2_using_pipes_in_pipelines
│   ├── 02_04_use_a_pipe_to_deploy_code_to_aws_lambda
│   │   ├── requirements.txt
│   │   ├── create_function_configuration.py
│   │   ├── index_test.py
│   │   ├── bitbucket-pipelines.yml
│   │   ├── template.html
│   │   ├── Makefile
│   │   └── index.py
│   ├── 02_03_deploy_lambda_functions_in_aws
│   │   ├── images
│   │   │   ├── 05-submit.png
│   │   │   ├── 08-outputs.png
│   │   │   ├── 01-create-stack.png
│   │   │   ├── 10-staging-url.png
│   │   │   ├── 09-production-url.png
│   │   │   ├── 00-resource-diagram.png
│   │   │   ├── 02-upload-template.png
│   │   │   ├── 03-enter-stack-name.png
│   │   │   ├── 07-create-complete.png
│   │   │   ├── 06-create-in-progress.png
│   │   │   └── 04-acknowledge-iam-resources.png
│   │   └── README.md
│   ├── 02_06_solution_use_pipes_in_a_pipeline
│   │   ├── images
│   │   │   ├── 00005_run-pipeline.png
│   │   │   ├── 00004_edit-pipeline.png
│   │   │   ├── 00002_copy-access-token.png
│   │   │   ├── 00006_download-report.png
│   │   │   ├── 00000_create-access-token-0.png
│   │   │   ├── 00001_create-access-token-1.png
│   │   │   ├── 00003_create-repository-variable.png
│   │   │   ├── SCR-20250103-trnw-run-initial-pipeline-1.png
│   │   │   └── SCR-20250103-trwv-run-initial-pipeline-2.png
│   │   └── bitbucket-pipelines.yml
│   ├── 02_05_challenge_use_pipes_in_a_pipeline
│   │   ├── images
│   │   │   ├── SCR-20250103-trnw-run-initial-pipeline-1.png
│   │   │   └── SCR-20250103-trwv-run-initial-pipeline-2.png
│   │   ├── bitbucket-pipelines.yml
│   │   └── README.md
│   ├── 02_02_use_a_pipe_in_a_pipeline_configuration
│   │   ├── bitbucket-pipelines.yml
│   │   ├── Shenanigans
│   │   │   └── bitbucket-pipelines.yml
│   │   └── README.md
│   └── 02_01_getting_to_know_pipes
│       └── README.md
├── ch4_self_hosted_runners
│   ├── 04_07_challenge_deploy_a_self_hosted_runner
│   │   ├── requirements.txt
│   │   ├── images
│   │   │   └── cfn-designer.png
│   │   ├── bitbucket-pipelines.yml
│   │   ├── read-data-file.py
│   │   ├── README.md
│   │   └── get-runner-details.py
│   ├── 04_08_solution_deploy_a_self_hosted_runner
│   │   ├── requirements.txt
│   │   ├── images
│   │   │   └── cfn-designer.png
│   │   ├── bitbucket-pipelines.yml
│   │   ├── read-data-file.py
│   │   └── get-runner-details.py
│   ├── 04_04_deploy_an_ec2_server_in_aws
│   │   └── images
│   │       ├── 05-submit.png
│   │       ├── 08-outputs.png
│   │       ├── 01-create-stack.png
│   │       ├── 02-upload-template.png
│   │       ├── 07-create-complete.png
│   │       ├── 00-resource-diagram.png
│   │       ├── 03-enter-stack-name.png
│   │       ├── 00-ec2-instance-pricing.png
│   │       ├── 06-create-in-progress.png
│   │       ├── 04-acknowledge-iam-resources.png
│   │       ├── 10-elevate-session-to-use-root-account.png
│   │       └── 09-connect-to-instance-via-session-manager.png
│   ├── 04_06_use_self_hosted_runners_in_a_pipeline
│   │   ├── bitbucket-pipelines.yml
│   │   └── README.md
│   ├── 04_03_compare_repository_and_workspace_runners
│   │   └── README.md
│   ├── 04_02_self_hosted_runner_configurations
│   │   └── README.md
│   └── 04_01_when_to_use_self_hosted_runners
│       └── README.md
├── .github
│   ├── CODEOWNERS
│   ├── PULL_REQUEST_TEMPLATE.md
│   ├── workflows
│   │   └── main.yml
│   └── ISSUE_TEMPLATE.md
├── .markdownlint.jsonc
├── CONTRIBUTING.md
├── ch0_intro
│   ├── 00_01_introduction
│   │   └── README.md
│   ├── 00_03_using_the_exercise_files
│   │   └── README.md
│   ├── 00_04_bitbucket_pipelines_review
│   │   └── README.md
│   └── 00_02_what_you_should_know
│       └── README.md
├── ch5_conclusion
│   └── 05_01_next_steps
│       └── README.md
├── CHAPTER_LIST.txt
├── Makefile
├── NOTICE
└── README.md

/ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/pipe/__init__.py:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/pipe/__init__.py:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/pipe/__init__.py:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/pipe/__init__.py:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/.gitignore:
--------------------------------------------------------------------------------
.idea
*.iml
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/.gitignore:
--------------------------------------------------------------------------------
.idea
*.iml
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/.gitignore:
--------------------------------------------------------------------------------
.idea
*.iml
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/.gitignore:
--------------------------------------------------------------------------------
.idea
*.iml
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_05_cache_dependencies/python-backend/requirements.txt:
--------------------------------------------------------------------------------
psycopg2-binary
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/test/requirements.txt:
--------------------------------------------------------------------------------
pytest==7.2.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/test/requirements.txt:
--------------------------------------------------------------------------------
pytest==7.2.*
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_04_use_conditional_steps/python-backend/requirements.txt:
--------------------------------------------------------------------------------
psycopg2-binary
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/test/requirements.txt:
--------------------------------------------------------------------------------
pytest==7.2.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/test/requirements.txt:
--------------------------------------------------------------------------------
pytest==7.2.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-python/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/dev-requirements.txt:
--------------------------------------------------------------------------------
flake8
black
pytest
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/dev-requirements.txt:
--------------------------------------------------------------------------------
flake8
black
pytest
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-python/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/requirements.txt:
--------------------------------------------------------------------------------
bitbucket-pipes-toolkit==4.*
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/requirements.txt:
--------------------------------------------------------------------------------
pandas
scikit-learn
numpy
tensorflow
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/requirements.txt:
--------------------------------------------------------------------------------
pandas
scikit-learn
numpy
tensorflow
--------------------------------------------------------------------------------

/ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/requirements.txt:
--------------------------------------------------------------------------------
black
pylint
flake8
awscli
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-bash/pipe.sh:
--------------------------------------------------------------------------------
#!/usr/bin/env bash

set -e

echo "Hello World"
--------------------------------------------------------------------------------

/ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner/requirements.txt:
--------------------------------------------------------------------------------
boto3
psutil
py-cpuinfo
setuptools
--------------------------------------------------------------------------------

/ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner/requirements.txt:
--------------------------------------------------------------------------------
boto3
psutil
py-cpuinfo
setuptools
--------------------------------------------------------------------------------
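The simple `pipe.sh` above prints a fixed greeting. In practice, Bitbucket passes a pipe's `variables` to the container as environment variables, so a pipe script usually reads them with shell default expansion. A minimal hypothetical sketch of that pattern (`NAME` and `DEBUG` are illustrative variables, not defined by these exercise files):

```shell
#!/usr/bin/env bash
set -e

# Pipe variables arrive as environment variables; optional ones are
# read with ${VAR:-default} so the pipe works when they are omitted.
greet() {
  local name="${NAME:-World}"
  echo "Hello ${name}"
}

# Many pipes support a DEBUG variable that turns on command tracing.
if [ "${DEBUG:-false}" = "true" ]; then
  set -x
fi

greet
```

Running the script with no variables set prints the default greeting; setting `NAME` changes it, mirroring how a pipeline would pass `variables` to the pipe.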
/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-bash/pipe.sh:
--------------------------------------------------------------------------------
#!/usr/bin/env bash

set -e

echo "Hello World"
--------------------------------------------------------------------------------

/.github/CODEOWNERS:
--------------------------------------------------------------------------------
# Codeowners for these exercise files:
# * (asterisk) denotes "all files and folders"
# Example: * @producer @instructor
--------------------------------------------------------------------------------

/.markdownlint.jsonc:
--------------------------------------------------------------------------------
{
  "MD013": false,
  "MD024": false,
  "MD028": false,
  "MD033": false,
  "MD046": false
}
--------------------------------------------------------------------------------

/.github/PULL_REQUEST_TEMPLATE.md:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/.changes/next-release/minor-20181204085425.json:
--------------------------------------------------------------------------------
{
  "type": "minor",
  "description": "Initial release"
}
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/.changes/next-release/minor-20181204085425.json:
--------------------------------------------------------------------------------
{
  "type": "minor",
  "description": "Initial release"
}
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/.changes/next-release/minor-20181204085425.json:
--------------------------------------------------------------------------------
{
  "type": "minor",
  "description": "Initial release"
}
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/.changes/next-release/minor-20181204085425.json:
--------------------------------------------------------------------------------
{
  "type": "minor",
  "description": "Initial release"
}
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_05_cache_dependencies/ruby-backend/Gemfile:
--------------------------------------------------------------------------------
source 'https://rubygems.org'
gem 'pg' # For PostgreSQL connection
gem 'minitest' # For testing
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-bash/Dockerfile:
--------------------------------------------------------------------------------
FROM alpine:3.9

RUN apk add --update --no-cache bash

COPY pipe.sh /

ENTRYPOINT ["/pipe.sh"]
--------------------------------------------------------------------------------

/ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-bash/Dockerfile:
--------------------------------------------------------------------------------
FROM alpine:3.9

RUN apk add --update --no-cache bash

COPY pipe.sh /

ENTRYPOINT ["/pipe.sh"]
--------------------------------------------------------------------------------

/ch1_pipeline_optimizations/01_04_use_conditional_steps/ruby-backend/Gemfile:
--------------------------------------------------------------------------------
source 'https://rubygems.org'
gem 'pg' # For PostgreSQL connection
gem 'minitest' # For testing
--------------------------------------------------------------------------------

(Binary screenshot assets are not inlined in this dump. Each of the images/*.png files listed in the tree above for ch2_using_pipes_in_pipelines, ch3_create_custom_pipes, and ch4_self_hosted_runners maps to a placeholder URL of the form https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ followed by the image's repository path.)
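Several of the 03_02 screenshots listed in the tree (for example 06-docker-push-error.png and 07-registry-prefixes.png) document building the generated pipe image and pushing it to a registry. Once the image is pushed, a pipeline consumes the pipe by image reference; the workspace, pipe name, tag, and NAME variable below are hypothetical:

```yaml
pipelines:
  default:
    - step:
        name: Use the custom pipe
        script:
          - pipe: docker://my-workspace/my-first-pipe:0.1.0
            variables:
              NAME: "Bitbucket"
```

Pipes published to Bitbucket's official directory can instead be referenced by their short `atlassian/...` name, but a custom pipe like the ones in these exercises uses the `docker://` form shown here.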
-------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws/images/06-create-in-progress.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws/images/06-create-in-progress.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00004_edit-pipeline.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00004_edit-pipeline.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/.editorconfig: -------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 2 6 | end_of_line = lf 7 | insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/.editorconfig: -------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 4 6 | end_of_line = lf 7 | insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/.editorconfig: 
-------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 4 6 | end_of_line = lf 7 | insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00002_copy-access-token.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00002_copy-access-token.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00006_download-report.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00006_download-report.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/.editorconfig: -------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 2 6 | end_of_line = lf 7 | insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/.editorconfig: -------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 4 6 | end_of_line = lf 7 
| insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/.editorconfig: -------------------------------------------------------------------------------- 1 | root = true 2 | 3 | [*] 4 | indent_style = space 5 | indent_size = 4 6 | end_of_line = lf 7 | insert_final_newline = true 8 | charset = utf-8 9 | 10 | [*.md] 11 | trim_trailing_whitespace = false 12 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws/images/04-acknowledge-iam-resources.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws/images/04-acknowledge-iam-resources.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00000_create-access-token-0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00000_create-access-token-0.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00001_create-access-token-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00001_create-access-token-1.png 
-------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/1-SCR-20241217-lcqn.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/1-SCR-20241217-lcqn.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/2-SCR-20241217-ldhz.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/2-SCR-20241217-ldhz.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/3-SCR-20241217-lqkt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/3-SCR-20241217-lqkt.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/4-SCR-20241217-lptx.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/4-SCR-20241217-lptx.png -------------------------------------------------------------------------------- 
/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/5-SCR-20241217-ntoq.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/5-SCR-20241217-ntoq.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/6-SCR-20241217-misj.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/images/6-SCR-20241217-misj.png -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_04_deploy_an_ec2_server_in_aws/images/10-elevate-session-to-use-root-account.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch4_self_hosted_runners/04_04_deploy_an_ec2_server_in_aws/images/10-elevate-session-to-use-root-account.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00003_create-repository-variable.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/00003_create-repository-variable.png -------------------------------------------------------------------------------- 
/ch4_self_hosted_runners/04_04_deploy_an_ec2_server_in_aws/images/09-connect-to-instance-via-session-manager.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch4_self_hosted_runners/04_04_deploy_an_ec2_server_in_aws/images/09-connect-to-instance-via-session-manager.png -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/Gemfile.lock: -------------------------------------------------------------------------------- 1 | GEM 2 | remote: https://rubygems.org/ 3 | specs: 4 | minitest (5.25.4) 5 | pg (1.5.9) 6 | 7 | PLATFORMS 8 | aarch64-linux 9 | ruby 10 | 11 | DEPENDENCIES 12 | minitest 13 | pg 14 | 15 | BUNDLED WITH 16 | 2.5.22 17 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png -------------------------------------------------------------------------------- 
/ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/test/__pycache__/test.cpython-310-pytest-7.2.1.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/test/__pycache__/test.cpython-310-pytest-7.2.1.pyc -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/ruby-backend/Gemfile.lock: -------------------------------------------------------------------------------- 1 | GEM 2 | remote: https://rubygems.org/ 3 | specs: 4 | minitest (5.25.4) 5 | pg (1.5.9) 6 | 7 | PLATFORMS 8 | aarch64-linux 9 | ruby 10 | 11 | DEPENDENCIES 12 | minitest 13 | pg 14 | 15 | BUNDLED WITH 16 | 2.5.22 17 | -------------------------------------------------------------------------------- 
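The `Gemfile.lock` above is the file the dependency-caching lesson pins its cache to: a custom cache keyed on the lockfile is reused until the lockfile changes. The sketch below illustrates the idea behind file-based cache keys — hashing lockfile contents so that any version bump produces a new key. This is a conceptual sketch only; Bitbucket Pipelines computes its file-based cache keys internally, and the `bundler` prefix and helper name here are illustrative:

```python
import hashlib

def cache_key(prefix: str, *lockfile_contents: str) -> str:
    """Derive a cache key by hashing lockfile contents: any change to a
    pinned version changes the key, so stale dependencies are not restored."""
    digest = hashlib.sha256()
    for content in lockfile_contents:
        digest.update(content.encode('utf-8'))
    return f"{prefix}-{digest.hexdigest()[:12]}"

# Two illustrative lockfile snapshots; bumping pg 1.5.9 -> 1.5.10
# yields a different key, so the cache is rebuilt.
lock_v1 = "GEM\n  specs:\n    minitest (5.25.4)\n    pg (1.5.9)\n"
lock_v2 = "GEM\n  specs:\n    minitest (5.25.4)\n    pg (1.5.10)\n"

print(cache_key("bundler", lock_v1))
print(cache_key("bundler", lock_v2))
```

In `bitbucket-pipelines.yml` the equivalent mechanism is a custom cache whose `key` lists `Gemfile.lock` under `files`, so the runner performs this invalidation for you.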
/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/images/SCR-20250103-trnw-run-initial-pipeline-1.png -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184/main/ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/images/SCR-20250103-trwv-run-initial-pipeline-2.png -------------------------------------------------------------------------------- 
/ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-python/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.7-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | 11 | ENTRYPOINT ["python3", "/pipe.py"] 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.10-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | 11 | ENTRYPOINT ["python3", "/pipe.py"] 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.10-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | 11 | ENTRYPOINT ["python3", "/pipe.py"] 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-python/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.7-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | 11 | ENTRYPOINT ["python3", "/pipe.py"] 12 | -------------------------------------------------------------------------------- 
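Each of the Dockerfiles above installs the pipe's requirements and sets `python3 /pipe.py` as the entrypoint, so the container's only job is to run the pipe script, which validates its input variables against a schema before doing any work. The sketch below approximates that validation using only the standard library — the names and behavior are illustrative, not the actual `bitbucket_pipes_toolkit` implementation (the real `Pipe` class uses a full schema validator and richer logging):

```python
# Hypothetical, stdlib-only approximation of the schema validation that
# bitbucket_pipes_toolkit.Pipe performs on pipe variables.
SCHEMA = {
    'NAME': {'type': 'string', 'required': True},
    'DEBUG': {'type': 'boolean', 'required': False, 'default': False},
}

def get_variables(schema, env):
    """Resolve pipe variables from an environment mapping against a schema."""
    resolved = {}
    for key, spec in schema.items():
        if key in env:
            value = env[key]
            if spec['type'] == 'boolean':
                # Environment variables are strings; coerce truthy spellings.
                value = value.lower() in ('true', '1', 'yes')
            resolved[key] = value
        elif spec.get('required'):
            # Mirror a pipe's behavior of failing fast on missing input.
            raise SystemExit(f"Validation error: {key} variable missing.")
        else:
            resolved[key] = spec.get('default')
    return resolved

variables = get_variables(SCHEMA, {'NAME': 'World'})
print("INFO: Hello, " + variables['NAME'])
```

In a real pipe the mapping would be `os.environ`, populated by the `variables:` section of the calling `bitbucket-pipelines.yml` step.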
/ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/pipe.yml: -------------------------------------------------------------------------------- 1 | name: example 2 | image: example/example:0.0.0 3 | description: example 4 | category: example 5 | repository: https://bitbucket.org/example/example 6 | vendor: 7 | name: example 8 | website: example 9 | maintainer: 10 | name: example 11 | website: example 12 | tags: 13 | - example 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.10-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | COPY LICENSE.txt README.md pipe.yml / 11 | 12 | ENTRYPOINT ["python3", "/pipe.py"] 13 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/pipe.yml: -------------------------------------------------------------------------------- 1 | name: example 2 | image: example/example:0.0.0 3 | description: example 4 | category: example 5 | repository: https://bitbucket.org/example/example 6 | vendor: 7 | name: example 8 | website: example 9 | maintainer: 10 | name: example 11 | website: example 12 | tags: 13 | - example 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/pipe.yml: -------------------------------------------------------------------------------- 1 | name: example 2 | image: example/example:0.0.0 3 | description: example 4 | category: example 5 | repository: https://bitbucket.org/example/example 6 | vendor: 7 | name: example 8 | website: example 9 | maintainer: 10 | name: example 11 | 
website: example 12 | tags: 13 | - example 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.10-slim 2 | 3 | # install requirements 4 | COPY requirements.txt / 5 | WORKDIR / 6 | RUN pip install --no-cache-dir -r requirements.txt 7 | 8 | # copy the pipe source code 9 | COPY pipe / 10 | COPY LICENSE.txt README.md pipe.yml / 11 | 12 | ENTRYPOINT ["python3", "/pipe.py"] 13 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/pipe.yml: -------------------------------------------------------------------------------- 1 | name: example 2 | image: example/example:0.0.0 3 | description: example 4 | category: example 5 | repository: https://bitbucket.org/example/example 6 | vendor: 7 | name: example 8 | website: example 9 | maintainer: 10 | name: example 11 | website: example 12 | tags: 13 | - example 14 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: python:3.12 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Run Data Analysis 7 | 8 | script: 9 | - pip install -r requirements.txt 10 | - python cluster_analysis.py 11 | 12 | artifacts: 13 | - analysis_report.md 14 | 15 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM alpine:3.9 2 | 3 | RUN apk add --update --no-cache bash 4 | 5 | COPY pipe / 6 | COPY LICENSE.txt pipe.yml README.md / 7 | RUN wget 
--no-verbose -P / https://bitbucket.org/bitbucketpipelines/bitbucket-pipes-toolkit-bash/raw/0.6.0/common.sh 8 | 9 | RUN chmod a+x /*.sh 10 | 11 | ENTRYPOINT ["/pipe.sh"] 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM alpine:3.9 2 | 3 | RUN apk add --update --no-cache bash 4 | 5 | COPY pipe / 6 | COPY LICENSE.txt pipe.yml README.md / 7 | RUN wget --no-verbose -P / https://bitbucket.org/bitbucketpipelines/bitbucket-pipes-toolkit-bash/raw/0.6.0/common.sh 8 | 9 | RUN chmod a+x /*.sh 10 | 11 | ENTRYPOINT ["/pipe.sh"] 12 | -------------------------------------------------------------------------------- /.github/workflows/main.yml: -------------------------------------------------------------------------------- 1 | name: Copy To Branches 2 | on: 3 | workflow_dispatch: 4 | jobs: 5 | copy-to-branches: 6 | runs-on: ubuntu-latest 7 | steps: 8 | - uses: actions/checkout@v2 9 | with: 10 | fetch-depth: 0 11 | - name: Copy To Branches Action 12 | uses: planetoftheweb/copy-to-branches@v1.2 13 | env: 14 | key: main 15 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_02_configure_maximum_runtime/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | options: 2 | max-time: 2 # Pipeline will timeout after 2 minutes 3 | 4 | image: atlassian/default-image:4 5 | 6 | pipelines: 7 | default: 8 | - step: 9 | name: Long Running Step 10 | script: 11 | - echo "Simulate a network timeout" 12 | - sleep 240 # Sleep for 4 minutes 13 | 14 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/bitbucket-pipelines.yml: 
-------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Collect Bitbucket Build Statistics 7 | artifacts: 8 | - "*.txt" 9 | # consider adding a cache for `docker` once the pipes are in place 10 | script: 11 | - echo "Add your solution here..." 12 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-python/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe 2 | 3 | variables = { 4 | 'NAME': {'type': 'string', 'required': True}, 5 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 6 | } 7 | 8 | pipe = Pipe(schema=variables) 9 | name = pipe.get_variable('NAME') 10 | 11 | pipe.log_info("Executing the pipe...") 12 | pipe.log_info("Hello, " + name) 13 | 14 | pipe.success(message="Success!") -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-python/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe 2 | 3 | variables = { 4 | 'NAME': {'type': 'string', 'required': True}, 5 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 6 | } 7 | 8 | pipe = Pipe(schema=variables) 9 | name = pipe.get_variable('NAME') 10 | 11 | pipe.log_info("Executing the pipe...") 12 | pipe.log_info("Hello, " + name) 13 | 14 | pipe.success(message="Success!") -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/pipe/pipe.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | # 3 | # example 4 | # 5 | 6 | source "$(dirname "$0")/common.sh" 7 | 8 | info "Executing the
pipe..." 9 | 10 | # Required parameters 11 | NAME=${NAME:?'NAME variable missing.'} 12 | 13 | # Default parameters 14 | DEBUG=${DEBUG:="false"} 15 | 16 | run echo "${NAME}" 17 | 18 | if [[ "${status}" == "0" ]]; then 19 | success "Success!" 20 | else 21 | fail "Error!" 22 | fi 23 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/pipe/pipe.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | # 3 | # example 4 | # 5 | 6 | source "$(dirname "$0")/common.sh" 7 | 8 | info "Executing the pipe..." 9 | 10 | # Required parameters 11 | NAME=${NAME:?'NAME variable missing.'} 12 | 13 | # Default parameters 14 | DEBUG=${DEBUG:="false"} 15 | 16 | run echo "${NAME}" 17 | 18 | if [[ "${status}" == "0" ]]; then 19 | success "Success!" 20 | else 21 | fail "Error!" 22 | fi 23 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_02_use_a_pipe_in_a_pipeline_configuration/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Use an official Bitbucket pipe 7 | caches: 8 | - docker 9 | script: 10 | - pipe: atlassian/bitbucket-iac-scan:0.5.2 11 | variables: 12 | FILES_TO_SCAN_PATH: . 
13 | SCAN_EXTRA_ARGS: 14 | - "--type=CloudFormation" 15 | 16 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_02_use_a_pipe_in_a_pipeline_configuration/Shenanigans/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Use a pipe from Docker Hub 7 | script: 8 | - pipe: docker://docker.io/hello-world:latest 9 | 10 | - step: 11 | name: Use a pipe from AWS Elastic Container Registry 12 | script: 13 | - pipe: docker://public.ecr.aws/docker/library/hello-world:linux 14 | 15 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/test/test.bats: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bats 2 | 3 | setup() { 4 | DOCKER_IMAGE=${DOCKER_IMAGE:="test/example"} 5 | 6 | echo "Building image..." 7 | docker build -t ${DOCKER_IMAGE}:test . 8 | } 9 | 10 | @test "Dummy test" { 11 | run docker run \ 12 | -e NAME="baz" \ 13 | -v $(pwd):$(pwd) \ 14 | -w $(pwd) \ 15 | ${DOCKER_IMAGE}:test 16 | 17 | echo "Status: $status" 18 | echo "Output: $output" 19 | 20 | [ "$status" -eq 0 ] 21 | } 22 | 23 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/test/test.bats: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bats 2 | 3 | setup() { 4 | DOCKER_IMAGE=${DOCKER_IMAGE:="test/example"} 5 | 6 | echo "Building image..." 7 | docker build -t ${DOCKER_IMAGE}:test . 
8 | } 9 | 10 | @test "Dummy test" { 11 | run docker run \ 12 | -e NAME="baz" \ 13 | -v $(pwd):$(pwd) \ 14 | -w $(pwd) \ 15 | ${DOCKER_IMAGE}:test 16 | 17 | echo "Status: $status" 18 | echo "Output: $output" 19 | 20 | [ "$status" -eq 0 ] 21 | } 22 | 23 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/pipe.yml: -------------------------------------------------------------------------------- 1 | name: JSON Report Generator 2 | image: amazing-mobile-app/json-report-generator:0.0.0 3 | description: This pipe will be used to create JSON reports from data files. 4 | category: Reports 5 | repository: https://bitbucket.org/amazing-mobile-app-finance-team/json-report-generator 6 | vendor: 7 | name: The Amazing Mobile App, Inc. 8 | website: http://example.com/amazing-mobile-app 9 | maintainer: 10 | name: The Amazing Mobile App, Inc. 11 | website: http://example.com/amazing-mobile-app 12 | tags: 13 | - deployment 14 | 15 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/pipe.yml: -------------------------------------------------------------------------------- 1 | name: JSON Report Generator 2 | image: amazing-mobile-app/json-report-generator:0.0.0 3 | description: This pipe will be used to create JSON reports from data files. 4 | category: Reports 5 | repository: https://bitbucket.org/amazing-mobile-app-finance-team/json-report-generator 6 | vendor: 7 | name: The Amazing Mobile App, Inc. 8 | website: http://example.com/amazing-mobile-app 9 | maintainer: 10 | name: The Amazing Mobile App, Inc. 
11 | website: http://example.com/amazing-mobile-app 12 | tags: 13 | - deployment 14 | 15 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | 2 | Contribution Agreement 3 | ====================== 4 | 5 | This repository does not accept pull requests (PRs). All pull requests will be closed. 6 | 7 | However, if any contributions (through pull requests, issues, feedback or otherwise) are provided, as a contributor, you represent that the code you submit is your original work or that of your employer (in which case you represent you have the right to bind your employer). By submitting code (or otherwise providing feedback), you (and, if applicable, your employer) are licensing the submitted code (and/or feedback) to LinkedIn and the open source community subject to the BSD 2-Clause license. 8 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright @ 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright © 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright © 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright © 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright © 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright © 2019 Atlassian Pty Ltd 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/pipe/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe, get_logger 2 | 3 | logger = get_logger() 4 | 5 | schema = { 6 | 'NAME': {'type': 'string', 'required': True}, 7 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 8 | } 9 | 10 | 11 | class DemoPipe(Pipe): 12 | def run(self): 13 | super().run() 14 | 15 | logger.info('Executing the pipe...') 16 | name = self.get_variable('NAME') 17 | 18 | print(name) 19 | 20 | self.success(message="Success!") 21 | 22 | 23 | if __name__ == '__main__': 24 | pipe = DemoPipe(pipe_metadata='/pipe.yml', schema=schema) 25 | pipe.run() 26 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/pipe/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe, get_logger 2 | 3 | 
logger = get_logger() 4 | 5 | schema = { 6 | 'NAME': {'type': 'string', 'required': True}, 7 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 8 | } 9 | 10 | 11 | class DemoPipe(Pipe): 12 | def run(self): 13 | super().run() 14 | 15 | logger.info('Executing the pipe...') 16 | name = self.get_variable('NAME') 17 | 18 | print(name) 19 | 20 | self.success(message="Success!") 21 | 22 | 23 | if __name__ == '__main__': 24 | pipe = DemoPipe(pipe_metadata='/pipe.yml', schema=schema) 25 | pipe.run() 26 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/pipe/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe, get_logger 2 | 3 | logger = get_logger() 4 | 5 | schema = { 6 | 'NAME': {'type': 'string', 'required': True}, 7 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 8 | } 9 | 10 | 11 | class DemoPipe(Pipe): 12 | def run(self): 13 | super().run() 14 | 15 | logger.info('Executing the pipe...') 16 | name = self.get_variable('NAME') 17 | 18 | print(name) 19 | 20 | self.success(message="Success!") 21 | 22 | 23 | if __name__ == '__main__': 24 | pipe = DemoPipe(pipe_metadata='/pipe.yml', schema=schema) 25 | pipe.run() 26 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_03_configure_resource_allocation/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | definitions: 4 | steps: 5 | - step: &resource_allocation_report 6 | name: Resource Allocation Report 7 | script: 8 | - echo "CPUs = $(nproc)" 9 | - echo "RAM = $(grep MemTotal /proc/meminfo | awk '{print $2 / 1024 / 1024 }') GB" 10 | - echo "Disk = $(df --block-size=1G / | awk 'NR==2 {print $2}') GB" 11 | 12 | pipelines: 13 | default: 14 | - step: 15 | name: 
Default Size 16 | size: 1x 17 | <<: *resource_allocation_report 18 | - step: 19 | name: Double Size 20 | size: 2x 21 | <<: *resource_allocation_report 22 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/Makefile: -------------------------------------------------------------------------------- 1 | TARGETS := hello dev-requirements requirements lint black analysis clean all 2 | TAG := $(shell git rev-parse --short HEAD) 3 | 4 | hello: 5 | @echo make [$(TARGETS)] 6 | 7 | dev-requirements: requirements 8 | pip install --quiet --upgrade --requirement dev-requirements.txt 9 | 10 | requirements: 11 | pip install --upgrade pip 12 | pip install --upgrade --requirement requirements.txt 13 | 14 | lint: 15 | flake8 --ignore=E501,W503 *.py 16 | black --diff *.py 17 | 18 | black: 19 | black *.py 20 | 21 | analysis: 22 | python3 ./cluster_analysis.py 23 | 24 | clean: 25 | -rm -vf ./data/*.json 26 | -rm -rvf __pycache__/ 27 | 28 | all: requirements analysis 29 | 30 | .PHONY: $(TARGETS) 31 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/Makefile: -------------------------------------------------------------------------------- 1 | TARGETS := hello dev-requirements requirements lint black analysis clean all 2 | TAG := $(shell git rev-parse --short HEAD) 3 | 4 | hello: 5 | @echo make [$(TARGETS)] 6 | 7 | dev-requirements: requirements 8 | pip install --quiet --upgrade --requirement dev-requirements.txt 9 | 10 | requirements: 11 | pip install --upgrade pip 12 | pip install --upgrade --requirement requirements.txt 13 | 14 | lint: 15 | flake8 --ignore=E501,W503 *.py 16 | black --diff *.py 17 | 18 | black: 19 | black *.py 20 | 21 | analysis: 22 | python3 ./cluster_analysis.py 23 | 24 | clean: 25 | -rm -vf ./data/*.json 26 | -rm -rvf __pycache__/ 27 | 28 | all: requirements analysis 29 | 30 | 
.PHONY: $(TARGETS) 31 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/pipe/pipe.py: -------------------------------------------------------------------------------- 1 | from bitbucket_pipes_toolkit import Pipe, get_logger 2 | 3 | logger = get_logger() 4 | 5 | schema = { 6 | 'CUSTOMER_ID': {'type': 'string', 'required': True}, 7 | 'DEBUG': {'type': 'boolean', 'required': False, 'default': False} 8 | } 9 | 10 | 11 | class DemoPipe(Pipe): 12 | def run(self): 13 | super().run() 14 | 15 | logger.info('Executing the pipe...') 16 | customer_id = self.get_variable('CUSTOMER_ID') 17 | 18 | print(customer_id) 19 | 20 | self.success(message=f"CUSTOMER_ID {customer_id} processed successfully!") 21 | 22 | 23 | if __name__ == '__main__': 24 | pipe = DemoPipe(pipe_metadata='/pipe.yml', schema=schema) 25 | pipe.run() 26 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: 2 | name: python:3.10 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - pip install -r test/requirements.txt 9 | - pytest --verbose test/test*.py --junitxml=test-reports/report.xml 10 | services: 11 | - docker 12 | 13 | push: &push 14 | step: 15 | name: Push and Tag 16 | image: python:3.10 17 | script: 18 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 19 | variables: 20 | REGISTRY_USERNAME: $REGISTRY_USERNAME 21 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 22 | IMAGE: amazing-mobile-app/$BITBUCKET_REPO_SLUG 23 | services: 24 | - docker 25 | 26 | pipelines: 27 | default: 28 | - <<: *test 29 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/bitbucket-pipelines.yml: 
-------------------------------------------------------------------------------- 1 | image: 2 | name: python:3.10 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - pip install -r test/requirements.txt 9 | - pytest --verbose test/test*.py --junitxml=test-reports/report.xml 10 | services: 11 | - docker 12 | 13 | push: &push 14 | step: 15 | name: Push and Tag 16 | image: python:3.10 17 | script: 18 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 19 | variables: 20 | REGISTRY_USERNAME: $REGISTRY_USERNAME 21 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 22 | IMAGE: amazing-mobile-app/$BITBUCKET_REPO_SLUG 23 | services: 24 | - docker 25 | 26 | pipelines: 27 | default: 28 | - <<: *test 29 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Collect Bitbucket Build Statistics 7 | caches: 8 | - docker 9 | artifacts: 10 | - "*.txt" 11 | script: 12 | 13 | - export FILENAME="builds-statistics-$(date +%Y-%m-%d).txt" 14 | 15 | - pipe: atlassian/bitbucket-build-statistics:1.5.3 16 | variables: 17 | BITBUCKET_ACCESS_TOKEN: $STATISTICS_ACCESS_TOKEN 18 | FILENAME: "$FILENAME" 19 | 20 | - pipe: atlassian/bitbucket-upload-file:0.7.4 21 | variables: 22 | BITBUCKET_ACCESS_TOKEN: $STATISTICS_ACCESS_TOKEN 23 | FILENAME: "$FILENAME" 24 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: 2 | name: python:3.10 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - pip install -r test/requirements.txt 9 | - pytest -v test/test.py 10 | services: 11 | - docker 12 | 13 | push: 
&push 14 | step: 15 | name: Push and Tag 16 | image: python:3.10 17 | script: 18 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 19 | variables: 20 | REGISTRY_USERNAME: $REGISTRY_USERNAME 21 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 22 | IMAGE: example/$BITBUCKET_REPO_SLUG 23 | services: 24 | - docker 25 | 26 | pipelines: 27 | default: 28 | - <<: *test 29 | branches: 30 | master: 31 | - <<: *test 32 | - <<: *push 33 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: 2 | name: python:3.10 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - pip install -r test/requirements.txt 9 | - pytest -v test/test.py 10 | services: 11 | - docker 12 | 13 | push: &push 14 | step: 15 | name: Push and Tag 16 | image: python:3.10 17 | script: 18 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 19 | variables: 20 | REGISTRY_USERNAME: $REGISTRY_USERNAME 21 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 22 | IMAGE: example/$BITBUCKET_REPO_SLUG 23 | services: 24 | - docker 25 | 26 | pipelines: 27 | default: 28 | - <<: *test 29 | branches: 30 | master: 31 | - <<: *test 32 | - <<: *push 33 | -------------------------------------------------------------------------------- /ch0_intro/00_01_introduction/README.md: -------------------------------------------------------------------------------- 1 | # 00_01 Introduction 2 | 3 | Hey there, welcome to this course: Advanced Bitbucket Pipelines! 4 | 5 | Mastering advanced CI/CD techniques can significantly speed up your build times and streamline your software development workflows. 6 | 7 | In this course, you'll learn to efficiently use pipeline components including self-hosted runners that provide full control over your build environment. 8 | 9 | This section is just here to get you introduced to the course and the exercise files. 
There's much more in the lessons ahead. 10 | 11 | Happy learning! 12 | 13 | 14 | --- 15 | [← Advanced Bitbucket Pipelines: Automating Deployments & Managing Third Party Integrations](../../README.md) | [00_02 What You Should Know →](../00_02_what_you_should_know/README.md) 16 | 17 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: 2 | name: atlassian/default-image:3 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - npm install -g bats 9 | - chmod a+x test/*.bats 10 | - bats test/test.bats 11 | services: 12 | - docker 13 | 14 | push: &push 15 | step: 16 | name: Push and Tag 17 | image: python:3.10 18 | script: 19 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 20 | variables: 21 | REGISTRY_USERNAME: $REGISTRY_USERNAME 22 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 23 | IMAGE: example/$BITBUCKET_REPO_SLUG 24 | services: 25 | - docker 26 | 27 | pipelines: 28 | default: 29 | - <<: *test 30 | branches: 31 | master: 32 | - <<: *test 33 | - <<: *push 34 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: 2 | name: atlassian/default-image:3 3 | 4 | test: &test 5 | step: 6 | name: Test 7 | script: 8 | - npm install -g bats 9 | - chmod a+x test/*.bats 10 | - bats test/test.bats 11 | services: 12 | - docker 13 | 14 | push: &push 15 | step: 16 | name: Push and Tag 17 | image: python:3.10 18 | script: 19 | - pipe: atlassian/bitbucket-pipe-release:5.6.0 20 | variables: 21 | REGISTRY_USERNAME: $REGISTRY_USERNAME 22 | REGISTRY_PASSWORD: $REGISTRY_PASSWORD 23 | IMAGE: example/$BITBUCKET_REPO_SLUG 24 | services: 25 | - docker 26 | 27 | pipelines: 28 | 
default: 29 | - <<: *test 30 | branches: 31 | master: 32 | - <<: *test 33 | - <<: *push 34 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-bash/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Build and Push 7 | deployment: production 8 | script: 9 | # Build and push image 10 | - VERSION="1.$BITBUCKET_BUILD_NUMBER" 11 | - IMAGE="$DOCKERHUB_USERNAME/$BITBUCKET_REPO_SLUG" 12 | - docker login --username "$DOCKERHUB_USERNAME" --password "${DOCKERHUB_PASSWORD}" 13 | - docker image build -t ${IMAGE}:${VERSION} . 14 | - docker image tag ${IMAGE}:${VERSION} ${IMAGE}:latest 15 | - docker image push ${IMAGE} 16 | # Push tags 17 | - git tag -a "${VERSION}" -m "Tagging for release ${VERSION}" 18 | - git push origin ${VERSION} 19 | services: 20 | - docker 21 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-bash/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Build and Push 7 | deployment: production 8 | script: 9 | # Build and push image 10 | - VERSION="1.$BITBUCKET_BUILD_NUMBER" 11 | - IMAGE="$DOCKERHUB_USERNAME/$BITBUCKET_REPO_SLUG" 12 | - docker login --username "$DOCKERHUB_USERNAME" --password "${DOCKERHUB_PASSWORD}" 13 | - docker image build -t ${IMAGE}:${VERSION} . 
14 | - docker image tag ${IMAGE}:${VERSION} ${IMAGE}:latest 15 | - docker image push ${IMAGE} 16 | # Push tags 17 | - git tag -a "${VERSION}" -m "Tagging for release ${VERSION}" 18 | - git push origin ${VERSION} 19 | services: 20 | - docker 21 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/simple-python/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Build and Push 7 | deployment: production 8 | script: 9 | # Build and push image 10 | - VERSION="1.$BITBUCKET_BUILD_NUMBER" 11 | - IMAGE="$DOCKERHUB_USERNAME/$BITBUCKET_REPO_SLUG" 12 | - docker login --username "$DOCKERHUB_USERNAME" --password "${DOCKERHUB_PASSWORD}" 13 | - docker image build -t ${IMAGE}:${VERSION} . 14 | - docker image tag ${IMAGE}:${VERSION} ${IMAGE}:latest 15 | - docker image push ${IMAGE} 16 | # Push tags 17 | - git tag -a "${VERSION}" -m "Tagging for release ${VERSION}" 18 | - git push origin ${VERSION} 19 | services: 20 | - docker 21 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/simple-python/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Build and Push 7 | deployment: production 8 | script: 9 | # Build and push image 10 | - VERSION="1.$BITBUCKET_BUILD_NUMBER" 11 | - IMAGE="$DOCKERHUB_USERNAME/$BITBUCKET_REPO_SLUG" 12 | - docker login --username "$DOCKERHUB_USERNAME" --password "${DOCKERHUB_PASSWORD}" 13 | - docker image build -t ${IMAGE}:${VERSION} . 
14 | - docker image tag ${IMAGE}:${VERSION} ${IMAGE}:latest 15 | - docker image push ${IMAGE} 16 | # Push tags 17 | - git tag -a "${VERSION}" -m "Tagging for release ${VERSION}" 18 | - git push origin ${VERSION} 19 | services: 20 | - docker 21 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Install Dependencies 7 | runs-on: 8 | - self.hosted 9 | - linux.shell 10 | caches: 11 | - pip 12 | script: 13 | - /usr/bin/pip3 install --requirement requirements.txt 14 | - step: 15 | name: Get Runner Details 16 | runs-on: 17 | - self.hosted 18 | - linux.shell 19 | script: 20 | - /usr/bin/python3 get-runner-details.py | tee runner-details.log 21 | - step: 22 | name: Read Data File 23 | runs-on: 24 | - self.hosted 25 | - linux.shell 26 | script: 27 | - /usr/bin/python3 read-data-file.py | tee data-file.log 28 | artifacts: 29 | - '*.log' 30 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: atlassian/default-image:4 2 | 3 | pipelines: 4 | default: 5 | - step: 6 | name: Install Dependencies 7 | runs-on: 8 | - self.hosted 9 | - linux.shell 10 | caches: 11 | - pip 12 | script: 13 | - /usr/bin/pip3 install --requirement requirements.txt 14 | - step: 15 | name: Get Runner Details 16 | runs-on: 17 | - self.hosted 18 | - linux.shell 19 | script: 20 | - /usr/bin/python3 get-runner-details.py | tee runner-details.log 21 | - step: 22 | name: Read Data File 23 | runs-on: 24 | - self.hosted 25 | - linux.shell 26 | script: 27 | - /usr/bin/python3 read-data-file.py | tee data-file.log 
28 | artifacts: 29 | - '*.log' 30 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/.gitignore: -------------------------------------------------------------------------------- 1 | # These are some examples of commonly ignored file patterns. 2 | # You should customize this list as applicable to your project. 3 | # Learn more about .gitignore: 4 | # https://www.atlassian.com/git/tutorials/saving-changes/gitignore 5 | 6 | # Node artifact files 7 | node_modules/ 8 | dist/ 9 | 10 | # Compiled Java class files 11 | *.class 12 | 13 | # Compiled Python bytecode 14 | *.py[cod] 15 | 16 | # Log files 17 | *.log 18 | 19 | # Package files 20 | *.jar 21 | 22 | # Maven 23 | target/ 24 | dist/ 25 | 26 | # JetBrains IDE 27 | .idea/ 28 | 29 | # Unit test reports 30 | TEST*.xml 31 | 32 | # Generated by MacOS 33 | .DS_Store 34 | 35 | # Generated by Windows 36 | Thumbs.db 37 | 38 | # Applications 39 | *.app 40 | *.exe 41 | *.war 42 | 43 | # Large media files 44 | *.mp4 45 | *.tiff 46 | *.avi 47 | *.flv 48 | *.mov 49 | *.wmv 50 | test-reports/ 51 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/.gitignore: -------------------------------------------------------------------------------- 1 | # These are some examples of commonly ignored file patterns. 2 | # You should customize this list as applicable to your project. 
3 | # Learn more about .gitignore: 4 | # https://www.atlassian.com/git/tutorials/saving-changes/gitignore 5 | 6 | # Node artifact files 7 | node_modules/ 8 | dist/ 9 | 10 | # Compiled Java class files 11 | *.class 12 | 13 | # Compiled Python bytecode 14 | *.py[cod] 15 | 16 | # Log files 17 | *.log 18 | 19 | # Package files 20 | *.jar 21 | 22 | # Maven 23 | target/ 24 | dist/ 25 | 26 | # JetBrains IDE 27 | .idea/ 28 | 29 | # Unit test reports 30 | TEST*.xml 31 | 32 | # Generated by MacOS 33 | .DS_Store 34 | 35 | # Generated by Windows 36 | Thumbs.db 37 | 38 | # Applications 39 | *.app 40 | *.exe 41 | *.war 42 | 43 | # Large media files 44 | *.mp4 45 | *.tiff 46 | *.avi 47 | *.flv 48 | *.mov 49 | *.wmv 50 | test-reports/ 51 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/.gitignore: -------------------------------------------------------------------------------- 1 | analysis_report.md 2 | data/ 3 | 4 | # These are some examples of commonly ignored file patterns. 5 | # You should customize this list as applicable to your project. 
6 | # Learn more about .gitignore: 7 | # https://www.atlassian.com/git/tutorials/saving-changes/gitignore 8 | 9 | # Node artifact files 10 | node_modules/ 11 | dist/ 12 | 13 | # Compiled Java class files 14 | *.class 15 | 16 | # Compiled Python bytecode 17 | *.py[cod] 18 | 19 | # Log files 20 | *.log 21 | 22 | # Package files 23 | *.jar 24 | 25 | # Maven 26 | target/ 27 | dist/ 28 | 29 | # JetBrains IDE 30 | .idea/ 31 | 32 | # Unit test reports 33 | TEST*.xml 34 | 35 | # Generated by MacOS 36 | .DS_Store 37 | 38 | # Generated by Windows 39 | Thumbs.db 40 | 41 | # Applications 42 | *.app 43 | *.exe 44 | *.war 45 | 46 | # Large media files 47 | *.mp4 48 | *.tiff 49 | *.avi 50 | *.flv 51 | *.mov 52 | *.wmv 53 | 54 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/.gitignore: -------------------------------------------------------------------------------- 1 | analysis_report.md 2 | data/ 3 | 4 | # These are some examples of commonly ignored file patterns. 5 | # You should customize this list as applicable to your project. 
6 | # Learn more about .gitignore: 7 | # https://www.atlassian.com/git/tutorials/saving-changes/gitignore 8 | 9 | # Node artifact files 10 | node_modules/ 11 | dist/ 12 | 13 | # Compiled Java class files 14 | *.class 15 | 16 | # Compiled Python bytecode 17 | *.py[cod] 18 | 19 | # Log files 20 | *.log 21 | 22 | # Package files 23 | *.jar 24 | 25 | # Maven 26 | target/ 27 | dist/ 28 | 29 | # JetBrains IDE 30 | .idea/ 31 | 32 | # Unit test reports 33 | TEST*.xml 34 | 35 | # Generated by MacOS 36 | .DS_Store 37 | 38 | # Generated by Windows 39 | Thumbs.db 40 | 41 | # Applications 42 | *.app 43 | *.exe 44 | *.war 45 | 46 | # Large media files 47 | *.mp4 48 | *.tiff 49 | *.avi 50 | *.flv 51 | *.mov 52 | *.wmv 53 | 54 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | definitions: 2 | services: 3 | database: 4 | image: postgres:14.3 5 | environment: 6 | POSTGRES_DB: 'testing' 7 | POSTGRES_USER: 'testing' 8 | POSTGRES_PASSWORD: 'testing' 9 | 10 | pipelines: 11 | default: 12 | - step: 13 | name: Test Python 14 | services: 15 | - database 16 | image: python:3 17 | script: 18 | - echo "# RUNNING PYTHON TESTS" 19 | - cd ./python-backend 20 | - pip install --requirement requirements.txt 21 | - python3 -m unittest --verbose app_test.py 22 | - step: 23 | name: Test Ruby 24 | services: 25 | - database 26 | image: ruby:3 27 | script: 28 | - echo "# RUNNING RUBY TESTS" 29 | - cd ./ruby-backend 30 | - bundle install 31 | - bundle exec ruby app_test.rb 32 | 33 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_06_use_self_hosted_runners_in_a_pipeline/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | definitions: 2 | script: &inspect-runner-environment 3 | - echo "Gathering 
CPU Info..." && lscpu 4 | - echo "Gathering Memory Info..." && free -h 5 | - echo "Disk Usage:" && df -h / 6 | - echo "Installed Tools:" 7 | - java --version || true 8 | - git --version || true 9 | - docker --version || true 10 | - echo "Listing All Commands:" && compgen -c 11 | - echo "Network Information:" && ip addr show 12 | - echo "Environment Variables:" && env 13 | 14 | pipelines: 15 | default: 16 | - step: 17 | name: Use Linux Shell Runner 18 | runs-on: 19 | - self.hosted 20 | - linux.shell 21 | script: *inspect-runner-environment 22 | - step: 23 | name: Use Linux Docker Runner 24 | runs-on: 25 | - self.hosted 26 | - linux 27 | script: *inspect-runner-environment 28 | -------------------------------------------------------------------------------- /ch5_conclusion/05_01_next_steps/README.md: -------------------------------------------------------------------------------- 1 | # 05_01 Next Steps 2 | 3 | Hey! We made it to the end of the course. Thanks for joining me to learn about some of the advanced features in Bitbucket Pipelines. 4 | 5 | As you take your workflows to the next level, remember that this course is here for reference - you can revisit the exercise files, retry the challenges, or take the course all over again. 6 | 7 | And please, ask questions and share your experience in the Q&A section! It's a great place to build community with other Bitbucket fans. 8 | 9 | I had an absolute blast as your instructor, and I'm excited to see the advanced projects you'll create with Bitbucket Pipelines! 
10 | 11 | 12 | --- 13 | [← 04_08 Solution: Deploy a Self-Hosted Runner](../../ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner/README.md) | [Advanced Bitbucket Pipelines: Automating Deployments & Managing Third Party Integrations →](../../README.md) 14 | 15 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/bitbucket-pipelines-no-conditional-steps.yml: -------------------------------------------------------------------------------- 1 | definitions: 2 | services: 3 | database: 4 | image: postgres:14.3 5 | environment: 6 | POSTGRES_DB: 'testing' 7 | POSTGRES_USER: 'testing' 8 | POSTGRES_PASSWORD: 'testing' 9 | 10 | pipelines: 11 | default: 12 | - step: 13 | name: Test Python 14 | services: 15 | - database 16 | image: python:3 17 | script: 18 | - echo "# RUNNING PYTHON TESTS" 19 | - cd ./python-backend 20 | - pip install --requirement requirements.txt 21 | - python3 -m unittest --verbose app_test.py 22 | - step: 23 | name: Test Ruby 24 | services: 25 | - database 26 | image: ruby:3 27 | script: 28 | - echo "# RUNNING RUBY TESTS" 29 | - cd ./ruby-backend 30 | - bundle install 31 | - bundle exec ruby app_test.rb 32 | 33 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | image: python:3.12 2 | 3 | # Prevents the analysis from running 4 | # more than 10 minutes 5 | options: 6 | max-time: 10 7 | 8 | definitions: 9 | caches: 10 | data: ./data 11 | 12 | pipelines: 13 | default: 14 | - step: 15 | name: Run Data Analysis 16 | 17 | caches: 18 | # Minimizes the time spent loading data 19 | # analysis libraries 20 | - pip 21 | 22 | # Reuses any data that has already been 23 | # generated by the script and written 24 | # to the `./data` directory in the workspace 25 | - data 
26 | 27 | # Only runs the analysis when the script has changed. 28 | condition: 29 | changesets: 30 | includePaths: 31 | - cluster_analysis.py 32 | 33 | script: 34 | - pip install -r requirements.txt 35 | - python cluster_analysis.py 36 | 37 | 38 | artifacts: 39 | - analysis_report.md 40 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_01_optimizing_pipeline_performance/README.md: -------------------------------------------------------------------------------- 1 | # 01_01 Optimizing Pipeline Performance and Reducing Build Times 2 | 3 | In this chapter, we’ll cover four strategies for optimizing pipelines: 4 | 5 | 1. **Configuring Maximum Runtime:** Setting sensible time limits to prevent unnecessary resource usage. 6 | 1. **Configuring Runner Sizes:** Choosing the right runner size for your build to balance speed and cost. 7 | 1. **Using Conditional Statements in Pipelines:** Making your pipelines smarter by running only the necessary steps. 8 | 1. **Caching Dependencies:** Leveraging caches to avoid re-downloading or re-building dependencies every time your pipeline runs. 9 | 10 | Each of these strategies can significantly improve your pipeline performance and contribute to faster, more efficient development cycles. Let’s get started and dive into the first topic: **Configuring Maximum Runtime**.
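Taken together, these strategies can all appear in one configuration. The following sketch is illustrative only; the step name, include path, and values are placeholders, not part of the exercise files:

```yaml
options:
  max-time: 10                  # 1. Maximum runtime: stop any step after 10 minutes

pipelines:
  default:
    - step:
        name: Build and Test
        size: 2x                # 2. Runner size: double the default memory
        condition:              # 3. Conditional step: run only when sources change
          changesets:
            includePaths:
              - "src/**"
        caches:
          - pip                 # 4. Cache dependencies between pipeline runs
        script:
          - pip install -r requirements.txt
          - python -m unittest
```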
11 | 12 | 13 | --- 14 | [← 00_04 Bitbucket Pipelines Review](../../ch0_intro/00_04_bitbucket_pipelines_review/README.md) | [01_02 Configure Maximum Runtime →](../01_02_configure_maximum_runtime/README.md) 15 | 16 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/test/test.py: -------------------------------------------------------------------------------- 1 | import os 2 | import subprocess 3 | 4 | docker_image = 'bitbucketpipelines/demo-pipe-python:ci' + os.getenv('BITBUCKET_BUILD_NUMBER', 'local') 5 | 6 | def docker_build(): 7 | """ 8 | Build the docker image for tests. 9 | :return: 10 | """ 11 | args = [ 12 | 'docker', 13 | 'build', 14 | '-t', 15 | docker_image, 16 | '.', 17 | ] 18 | subprocess.run(args, check=True) 19 | 20 | 21 | def setup(): 22 | docker_build() 23 | 24 | def test_no_parameters(): 25 | args = [ 26 | 'docker', 27 | 'run', 28 | docker_image, 29 | ] 30 | 31 | result = subprocess.run(args, check=False, text=True, capture_output=True) 32 | assert result.returncode == 1 33 | assert '✖ Validation errors: \nNAME:\n- required field' in result.stdout 34 | 35 | def test_success(): 36 | args = [ 37 | 'docker', 38 | 'run', 39 | '-e', 'NAME=hello world', 40 | docker_image, 41 | ] 42 | 43 | result = subprocess.run(args, check=False, text=True, capture_output=True) 44 | assert 'hello world' in result.stdout 45 | assert result.returncode == 0 46 | 47 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/test/test.py: -------------------------------------------------------------------------------- 1 | import os 2 | import subprocess 3 | 4 | docker_image = 'bitbucketpipelines/demo-pipe-python:ci' + os.getenv('BITBUCKET_BUILD_NUMBER', 'local') 5 | 6 | def docker_build(): 7 | """ 8 | Build the docker image for tests. 
9 | :return: 10 | """ 11 | args = [ 12 | 'docker', 13 | 'build', 14 | '-t', 15 | docker_image, 16 | '.', 17 | ] 18 | subprocess.run(args, check=True) 19 | 20 | 21 | def setup(): 22 | docker_build() 23 | 24 | def test_no_parameters(): 25 | args = [ 26 | 'docker', 27 | 'run', 28 | docker_image, 29 | ] 30 | 31 | result = subprocess.run(args, check=False, text=True, capture_output=True) 32 | assert result.returncode == 1 33 | assert '✖ Validation errors: \nNAME:\n- required field' in result.stdout 34 | 35 | def test_success(): 36 | args = [ 37 | 'docker', 38 | 'run', 39 | '-e', 'NAME=hello world', 40 | docker_image, 41 | ] 42 | 43 | result = subprocess.run(args, check=False, text=True, capture_output=True) 44 | assert 'hello world' in result.stdout 45 | assert result.returncode == 0 46 | 47 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/test/test.py: -------------------------------------------------------------------------------- 1 | import os 2 | import subprocess 3 | 4 | docker_image = 'bitbucketpipelines/demo-pipe-python:ci' + os.getenv('BITBUCKET_BUILD_NUMBER', 'local') 5 | 6 | def docker_build(): 7 | """ 8 | Build the docker image for tests. 
9 | :return: 10 | """ 11 | args = [ 12 | 'docker', 13 | 'build', 14 | '-t', 15 | docker_image, 16 | '.', 17 | ] 18 | subprocess.run(args, check=True) 19 | 20 | 21 | def setup(): 22 | docker_build() 23 | 24 | def test_no_parameters(): 25 | args = [ 26 | 'docker', 27 | 'run', 28 | docker_image, 29 | ] 30 | 31 | result = subprocess.run(args, check=False, text=True, capture_output=True) 32 | assert result.returncode == 1 33 | assert '✖ Validation errors: \nNAME:\n- required field' in result.stdout 34 | 35 | def test_success(): 36 | args = [ 37 | 'docker', 38 | 'run', 39 | '-e', 'NAME=hello world', 40 | docker_image, 41 | ] 42 | 43 | result = subprocess.run(args, check=False, text=True, capture_output=True) 44 | assert 'hello world' in result.stdout 45 | assert result.returncode == 0 46 | 47 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/test/test.py: -------------------------------------------------------------------------------- 1 | import os 2 | import subprocess 3 | 4 | docker_image = 'bitbucketpipelines/demo-pipe-python:ci' + os.getenv('BITBUCKET_BUILD_NUMBER', 'local') 5 | 6 | def docker_build(): 7 | """ 8 | Build the docker image for tests. 
9 | :return: 10 | """ 11 | args = [ 12 | 'docker', 13 | 'build', 14 | '-t', 15 | docker_image, 16 | '.', 17 | ] 18 | subprocess.run(args, check=True) 19 | 20 | 21 | def setup(): 22 | docker_build() 23 | 24 | def test_no_parameters(): 25 | args = [ 26 | 'docker', 27 | 'run', 28 | docker_image, 29 | ] 30 | 31 | result = subprocess.run(args, check=False, text=True, capture_output=True) 32 | assert result.returncode == 1 33 | assert '✖ Validation errors: \nCUSTOMER_ID:\n- required field' in result.stdout 34 | 35 | def test_success(): 36 | args = [ 37 | 'docker', 38 | 'run', 39 | '-e', 'CUSTOMER_ID=DEC041906', 40 | docker_image, 41 | ] 42 | 43 | result = subprocess.run(args, check=False, text=True, capture_output=True) 44 | assert 'DEC041906' in result.stdout 45 | assert result.returncode == 0 46 | 47 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_03_compare_repository_and_workspace_runners/README.md: -------------------------------------------------------------------------------- 1 | # 04_03 Compare Repository and Workspace Runners 2 | 3 | Self-hosted runners can be set up at the repository level for dedicated, consistent environments or at the workspace level to share resources across multiple projects. 4 | 5 | - Repo-level runners are ideal for custom environments (like Linux-shell, Windows, or macOS). 6 | - Workspace runners, especially using Linux-Docker, prevent dependency conflicts by providing isolated environments for each run. 
7 | 8 | | Repository Runners | Workspace Runners | 9 | |--------------------------------------|-------------------| 10 | | Runner access is limited to the repo | Runner is available to all repos in the workspace | 11 | | Linux Shell, Windows, and macOS Runners | Linux Docker | 12 | | Tools, dependencies, and hardware are consistent | Creates clean environments on each run | 13 | 14 | 15 | --- 16 | [← 04_02 Self-Hosted Runner Configurations](../04_02_self_hosted_runner_configurations/README.md) | [04_04 Deploy an EC2 Server in AWS →](../04_04_deploy_an_ec2_server_in_aws/README.md) 17 | 18 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 7 | 8 | ## Issue Overview 9 | 10 | 11 | ## Describe your environment 12 | 13 | 14 | ## Steps to Reproduce 15 | 16 | 1. 17 | 2. 18 | 3. 19 | 4. 20 | 21 | ## Expected Behavior 22 | 23 | 24 | ## Current Behavior 25 | 26 | 27 | ## Possible Solution 28 | 29 | 30 | ## Screenshots / Video 31 | 32 | 33 | ## Related Issues 34 | 35 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/bitbucket-pipelines-with-conditional-steps.yml: -------------------------------------------------------------------------------- 1 | definitions: 2 | services: 3 | database: 4 | image: postgres:14.3 5 | environment: 6 | POSTGRES_DB: 'testing' 7 | POSTGRES_USER: 'testing' 8 | POSTGRES_PASSWORD: 'testing' 9 | 10 | pipelines: 11 | default: 12 | - step: 13 | name: Test Python 14 | services: 15 | - database 16 | image: python:3 17 | condition: 18 | changesets: 19 | includePaths: 20 | - "shared/**" 21 | - "python-backend/**" 22 | script: 23 | - echo "# RUNNING PYTHON TESTS" 24 | - cd ./python-backend 25 | - pip install --requirement requirements.txt 26 | - python3 -m unittest --verbose app_test.py 27 | - step: 28 | name: Test Ruby 29 | 
services: 30 | - database 31 | image: ruby:3 32 | condition: 33 | changesets: 34 | includePaths: 35 | - "shared/**" 36 | - "ruby-backend/**" 37 | script: 38 | - echo "# RUNNING RUBY TESTS" 39 | - cd ./ruby-backend 40 | - bundle install 41 | - bundle exec ruby app_test.rb 42 | 43 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner/read-data-file.py: -------------------------------------------------------------------------------- 1 | import os 2 | import boto3 3 | 4 | def get_parameter(parameter_name): 5 | """Fetches a parameter value from AWS Systems Manager Parameter Store.""" 6 | try: 7 | ssm_client = boto3.client('ssm', region_name=os.getenv('AWS_REGION', 'us-east-1')) 8 | response = ssm_client.get_parameter(Name=parameter_name, WithDecryption=True) 9 | return response['Parameter']['Value'] 10 | except Exception as e: 11 | print(f"Error fetching parameter {parameter_name}: {e}") 12 | raise 13 | 14 | def read_file(file_path): 15 | """Reads the contents of a file given its file path.""" 16 | try: 17 | with open(file_path, 'r') as file: 18 | return file.read() 19 | except Exception as e: 20 | print(f"Error reading file {file_path}: {e}") 21 | raise 22 | 23 | if __name__ == "__main__": 24 | PARAMETER_NAME = "DATA_FILE" 25 | 26 | print("Fetching parameter from Parameter Store...") 27 | file_path = get_parameter(PARAMETER_NAME) 28 | 29 | print(f"Parameter value: {file_path}") 30 | 31 | print("Reading file contents...") 32 | file_contents = read_file(file_path) 33 | 34 | print("File Contents:") 35 | print(file_contents) 36 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner/read-data-file.py: -------------------------------------------------------------------------------- 1 | import os 2 | import boto3 3 | 4 | def get_parameter(parameter_name): 5 | """Fetches a parameter 
value from AWS Systems Manager Parameter Store.""" 6 | try: 7 | ssm_client = boto3.client('ssm', region_name=os.getenv('AWS_REGION', 'us-east-1')) 8 | response = ssm_client.get_parameter(Name=parameter_name, WithDecryption=True) 9 | return response['Parameter']['Value'] 10 | except Exception as e: 11 | print(f"Error fetching parameter {parameter_name}: {e}") 12 | raise 13 | 14 | def read_file(file_path): 15 | """Reads the contents of a file given its file path.""" 16 | try: 17 | with open(file_path, 'r') as file: 18 | return file.read() 19 | except Exception as e: 20 | print(f"Error reading file {file_path}: {e}") 21 | raise 22 | 23 | if __name__ == "__main__": 24 | PARAMETER_NAME = "DATA_FILE" 25 | 26 | print("Fetching parameter from Parameter Store...") 27 | file_path = get_parameter(PARAMETER_NAME) 28 | 29 | print(f"Parameter value: {file_path}") 30 | 31 | print("Reading file contents...") 32 | file_contents = read_file(file_path) 33 | 34 | print("File Contents:") 35 | print(file_contents) 36 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/ruby-backend/app.rb: -------------------------------------------------------------------------------- 1 | require 'pg' # Assuming you'll use the `pg` gem for PostgreSQL connection 2 | 3 | def connect_to_db 4 | # Connects to the database using environment variables or default values 5 | 6 | host = ENV['DB_HOST'] || 'localhost' 7 | database = ENV['DB_NAME'] || 'testing' 8 | user = ENV['DB_USER'] || 'testing' 9 | password = ENV['DB_PASSWORD'] || 'testing' 10 | 11 | begin 12 | connection = PG::Connection.new(host: host, dbname: database, user: user, password: password) 13 | return connection 14 | rescue PG::Error => e 15 | puts "Error connecting to database: #{e}" 16 | return nil 17 | end 18 | end 19 | 20 | def create_table(connection) 21 | connection.exec("CREATE TABLE IF NOT EXISTS tasks (id SERIAL PRIMARY KEY, description TEXT NOT NULL);") 
22 | end 23 | 24 | def insert_task(connection, description) 25 | connection.exec("INSERT INTO tasks (description) VALUES ($1)", description) 26 | end 27 | 28 | def main 29 | connection = connect_to_db 30 | if connection 31 | create_table(connection) 32 | insert_task(connection, "Learn more about Bitbucket Pipelines for CI/CD") 33 | connection.close 34 | puts "Task added successfully!" 35 | end 36 | end 37 | 38 | if __FILE__ == $PROGRAM_NAME 39 | main 40 | end 41 | 42 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/ruby-backend/app.rb: -------------------------------------------------------------------------------- 1 | require 'pg' # Assuming you'll use the `pg` gem for PostgreSQL connection 2 | 3 | def connect_to_db 4 | # Connects to the database using environment variables or default values 5 | 6 | host = ENV['DB_HOST'] || 'localhost' 7 | database = ENV['DB_NAME'] || 'testing' 8 | user = ENV['DB_USER'] || 'testing' 9 | password = ENV['DB_PASSWORD'] || 'testing' 10 | 11 | begin 12 | connection = PG::Connection.new(host: host, dbname: database, user: user, password: password) 13 | return connection 14 | rescue PG::Error => e 15 | puts "Error connecting to database: #{e}" 16 | return nil 17 | end 18 | end 19 | 20 | def create_table(connection) 21 | connection.exec("CREATE TABLE IF NOT EXISTS tasks (id SERIAL PRIMARY KEY, description TEXT NOT NULL);") 22 | end 23 | 24 | def insert_task(connection, description) 25 | connection.exec("INSERT INTO tasks (description) VALUES ($1)", description) 26 | end 27 | 28 | def main 29 | connection = connect_to_db 30 | if connection 31 | create_table(connection) 32 | insert_task(connection, "Learn more about Bitbucket Pipelines for CI/CD") 33 | connection.close 34 | puts "Task added successfully!" 
35 | end 36 | end 37 | 38 | if __FILE__ == $PROGRAM_NAME 39 | main 40 | end 41 | 42 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_05_use_a_custom_pipe_in_a_pipeline/README.md: -------------------------------------------------------------------------------- 1 | # 03_05 Use a custom pipe in a pipeline 2 | 3 | If you followed along with the previous lessons to: 4 | 5 | - create a custom pipe 6 | - and push the pipe to a container registry 7 | 8 | you can now create a Bitbucket Pipelines configuration that uses the pipe. 9 | 10 | Your configuration should be similar to the following: 11 | 12 | ```yaml 13 | image: atlassian/default-image:4 14 | 15 | pipelines: 16 | default: 17 | - step: 18 | name: Test Custom Pipe 19 | script: 20 | - pipe: YOUR_USER_NAME_HERE/YOUR_PIPE_NAME_HERE:PIPE.VERSION.HERE 21 | variables: 22 | NAME: "$BITBUCKET_REPO_OWNER" # Or any other variables that your pipe needs 23 | ``` 24 | 25 | ## Shenanigans 26 | 27 | ### 03_05.1: The Log Output for the Custom Pipe May Be Truncated 28 | 29 | When the custom pipe developed in the previous lessons is run, its log output may be cut off. 30 | 31 | No worries! Just refresh the browser window and take another look.
32 | 33 | ![You're AWESOME!](./images/00-youre-awesome.png) 34 | 35 | 36 | --- 37 | [← 03_04 Deploy a Custom Pipe to a Container Registry](../03_04_deploy_a_custom_pipe_to_a_container_registry/README.md) | [03_06 Challenge: Develop a Custom Pipe →](../03_06_challenge_create_a_custom_pipe/README.md) 38 | 39 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_02_configure_maximum_runtime/README.md: -------------------------------------------------------------------------------- 1 | # 01_02 Configure Maximum Runtime 2 | 3 | ## Reference 4 | 5 | [Max Time](https://support.atlassian.com/bitbucket-cloud/docs/global-options/#Max-time) 6 | 7 | The `max-time` option sets the maximum length of time (in minutes) that a step can run before timing out. It can be set both in the global `options` property and on individual pipeline steps. The default maximum time for pipeline steps is 120 minutes. 8 | 9 | ## Pipeline Setting 10 | 11 | ```yaml 12 | options: 13 | max-time: 2 # Each step in the pipeline will time out after 2 minutes 14 | ``` 15 | 16 | ## Step Setting 17 | 18 | ```yaml 19 | - step: 20 | max-time: 2 21 | script: 22 | - sleep 120m # This step will time out after 2 minutes 23 | ``` 24 | 25 | ## Using the Exercise Files 26 | 27 | 1. Create a new repo and add the exercise files. 28 | 1. Adding the files may not trigger the pipeline right away. Go to the **Pipelines** menu and select **Run initial pipeline**. 29 | 1. Observe the pipeline being stopped after the amount of time specified in the pipeline configuration.
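The two settings can also be combined. A step-level `max-time` overrides the global value, which otherwise applies to every step; the step names below are illustrative:

```yaml
options:
  max-time: 2                # default limit: each step times out after 2 minutes

pipelines:
  default:
    - step:
        name: Quick Checks   # inherits the global 2-minute limit
        script:
          - echo "Running quick checks"
    - step:
        name: Longer Build
        max-time: 30         # overrides the global value for this step only
        script:
          - echo "Running a longer build"
```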
30 | 31 | 32 | --- 33 | [← 01_01 Optimizing Pipeline Performance and Reducing Build Times](../01_01_optimizing_pipeline_performance/README.md) | [01_03 Configure resource allocation →](../01_03_configure_resource_allocation/README.md) 34 | 35 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_01_getting_to_know_pipes/README.md: -------------------------------------------------------------------------------- 1 | # 02_01 Getting to Know Pipes 2 | 3 | Bitbucket Pipelines pipes are pre-packaged commands that simplify complex CI/CD tasks. They encapsulate intricate operations into portable configuration elements that you can easily reuse across different workspaces, projects, and repositories. 4 | 5 | Key points include: 6 | 7 | - **Pre-packaged & Reusable:** Pipes bundle all necessary code in lightweight Docker containers, allowing you to execute complex commands with minimal setup. 8 | - **Parameterization:** You can customize pipes by passing repository or pipeline variables, tailoring them to the specific needs of your workflow. 9 | - **Wide Integration:** Bitbucket provides dozens of pre-built pipes that integrate with popular cloud platforms, collaboration tools, and third-party services. 10 | - **In-Context Feedback:** Many pipes can output reports directly within the Bitbucket interface, helping developers stay focused and iterate rapidly. 11 | 12 | This unit will demonstrate practical examples of using pipes, showing just how easy it is to integrate these powerful tools into your CI/CD pipelines.
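As a preview of parameterization, calling a pipe is a single `pipe:` reference plus the variables it needs. Here is a sketch using the Slack notification pipe; the version tag shown is illustrative, so check the pipe's repository for the current release:

```yaml
pipelines:
  default:
    - step:
        name: Notify the Team
        script:
          - pipe: atlassian/slack-notify:2.2.0
            variables:
              WEBHOOK_URL: $SLACK_WEBHOOK_URL   # stored as a secured repository variable
              MESSAGE: "Build $BITBUCKET_BUILD_NUMBER finished"
```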
13 | 14 | 15 | --- 16 | [← 01_07 Solution: Optimize a Workflow in Bitbucket Pipelines](../../ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/README.md) | [02_02 Use a pipe in a pipeline configuration →](../02_02_use_a_pipe_in_a_pipeline_configuration/README.md) 17 | 18 | -------------------------------------------------------------------------------- /ch0_intro/00_03_using_the_exercise_files/README.md: -------------------------------------------------------------------------------- 1 | # 00_03 Using the Exercise Files 2 | 3 | To help you get the most out of this course, exercise files are available for you to use. 4 | 5 | Use these files to follow along with demonstrations and as a starting point for challenges. 6 | 7 | The exercise files are located in the following GitHub repository: 8 | 9 | - [Advanced Bitbucket Pipelines: Automating Deployments & Managing Third Party Integrations](https://github.com/LinkedInLearning/advanced-bitbucket-pipelines-3925184.git) 10 | 11 | ## Downloading the Exercise Files 12 | 13 | The best way to use the exercise files is to download the repo as a zip file and extract the archive to your local system. 14 | 15 | 1. In the GitHub web interface, select **Code**. 16 | 1. On the **Local** tab, select **Download ZIP**. 17 | 1. Save the zip file to your local system. 18 | 1. Locate the zip file and extract the contents. 19 | 20 | ## Using the exercise files with the course content 21 | 22 | 1. The exercise files are organized in directories by chapter and video. 23 | 1. Locate the chapter and video that correspond to the course content. 24 | 1. Use the files as needed by adding them to a Bitbucket repo, applying them in the AWS web interface, or modifying them in your local editor. 
25 | 26 | 27 | --- 28 | [← 00_02 What You Should Know](../00_02_what_you_should_know/README.md) | [00_04 Bitbucket Pipelines Review →](../00_04_bitbucket_pipelines_review/README.md) 29 | 30 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | definitions: 2 | caches: 3 | custom-ruby: 4 | path: ruby-backend/vendor/bundle 5 | key: 6 | files: 7 | - ruby-backend/Gemfile.lock 8 | services: 9 | database: 10 | image: postgres:14.3 11 | environment: 12 | POSTGRES_DB: 'testing' 13 | POSTGRES_USER: 'testing' 14 | POSTGRES_PASSWORD: 'testing' 15 | 16 | pipelines: 17 | default: 18 | - step: 19 | name: Test Python 20 | caches: 21 | - pip 22 | services: 23 | - database 24 | image: python:3 25 | condition: 26 | changesets: 27 | includePaths: 28 | - "shared/**" 29 | - "python-backend/**" 30 | script: 31 | - echo "# RUNNING PYTHON TESTS" 32 | - cd ./python-backend 33 | - pip install --requirement requirements.txt 34 | - python3 -m unittest --verbose app_test.py 35 | - step: 36 | name: Test Ruby 37 | caches: 38 | - custom-ruby 39 | services: 40 | - database 41 | image: ruby:3 42 | condition: 43 | changesets: 44 | includePaths: 45 | - "shared/**" 46 | - "ruby-backend/**" 47 | script: 48 | - echo "# RUNNING RUBY TESTS" 49 | - cd ./ruby-backend 50 | - bundle install --path vendor/bundle 51 | - bundle exec ruby app_test.rb 52 | 53 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 
4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally. 10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset: 16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code. For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy! Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version. 33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and the `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally.
10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset: 16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code. For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy! Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version. 33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and the `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-bash/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally. 10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset:
16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code. For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy! Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version. 33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and the `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-bash/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally. 10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset: 16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code.
For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy! Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version. 33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and the `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_03_test_a_custom_pipe/advanced-python/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally. 10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset: 16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code. For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy!
Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version. 33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and the `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/RELEASING.md: -------------------------------------------------------------------------------- 1 | ## Releasing a new version 2 | 3 | This pipe uses an automated release process to bump versions using semantic versioning and generate the CHANGELOG.md file automatically. 4 | 5 | To automate this process, it uses a tool called [`semversioner`](https://pypi.org/project/semversioner/). 6 | 7 | ### Steps to release 8 | 9 | 1) Install semversioner locally. 10 | 11 | ```sh 12 | pip install semversioner 13 | ``` 14 | 15 | 2) During the development phase, every change that needs to be integrated into `master` requires one or more changeset files. You can use semversioner to generate a changeset: 16 | 17 | ```sh 18 | semversioner add-change --type patch --description "Fix security vulnerability with authentication." 19 | ``` 20 | 21 | 3) Make sure you commit the changeset files generated in the `.changes/next-release/` folder with your code. For example: 22 | 23 | ```sh 24 | git add . 25 | git commit -m "BP-234 FIX security issue with authentication" 26 | git push origin 27 | ``` 28 | 29 | 4) That's it! Merge to `master` and enjoy! Bitbucket Pipelines will do the rest: 30 | 31 | - Generate a new version number based on the changeset types `major`, `minor`, `patch`. 32 | - Generate a new file in the `.changes` directory with all the changes for this specific version.
33 | - (Re)generate the CHANGELOG.md file. 34 | - Bump the version number in the `README.md` example and `pipe.yml` metadata. 35 | - Commit and push back to the repository. 36 | - Tag your commit with the new version number. -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_02_use_a_pipe_in_a_pipeline_configuration/README.md: -------------------------------------------------------------------------------- 1 | # 02_02 Use a pipe in a pipeline configuration 2 | 3 | ## Using the exercise files for this lesson 4 | 5 | 1. Create a new repo and add the exercise files 6 | 1. Review the CloudFormation template: [AWS-Lambda-CloudFormation.yml](./AWS-Lambda-CloudFormation.yml) 7 | 1. Review the contents of the pipeline configuration: [bitbucket-pipelines.yml](./bitbucket-pipelines.yml) 8 | 1. Run the pipeline 9 | 1. Review the output of the step that calls the pipe for the [Bitbucket Security: Infrastructure as Code Security Scanner](https://bitbucket.org/atlassian/bitbucket-iac-scan/src/master/) 10 | 1. Review the report generated by the pipeline run 11 | 12 | ## Shenanigans 13 | 14 | ## 02_02.1 Using Publicly Available Containers as Pipes 15 | 16 | Approved pipes are sourced from container registries hosted by Bitbucket. 17 | 18 | > [!IMPORTANT] 19 | > Since pipes are essentially container images, we can run any publicly available container as a pipe! 20 | 21 | Review the contents of this [pipeline configuration](./Shenanigans/bitbucket-pipelines.yml) to see how `hello-world` images from [Docker Hub](https://hub.docker.com/) and the [Amazon Elastic Container Registry Public Gallery](https://gallery.ecr.aws/) are used as pipes in a pipeline. 22 | 23 | For extra credit, run the pipeline to see the results! 
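Running a public image as a pipe can be sketched in a minimal configuration. This is an illustrative example, not the lesson's actual Shenanigans file, and the exact image tags and registry paths are assumptions:

```yaml
pipelines:
  default:
    - step:
        name: Run public container images as pipes
        script:
          # A public Docker Hub image, referenced with the docker:// prefix
          - pipe: docker://hello-world:latest
          # A public image from the Amazon ECR Public Gallery
          - pipe: docker://public.ecr.aws/docker/library/hello-world:latest
```

Because a pipe is just a container, each image's default entrypoint runs as that pipe step.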
24 | 25 | 26 | --- 27 | [← 02_01 Getting to Know Pipes](../02_01_getting_to_know_pipes/README.md) | [02_03 Deploy Lambda Functions in AWS →](../02_03_deploy_lambda_functions_in_aws/README.md) 28 | 29 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_02_self_hosted_runner_configurations/README.md: -------------------------------------------------------------------------------- 1 | # 04_02 Self-Hosted Runner Configurations 2 | 3 | Here are a few key points regarding self-hosted runners: 4 | 5 | - **Runner Options:** You can choose from Linux Docker, Linux Shell, Windows, or macOS runners. 6 | 7 | - **Common System Requirements:** All runners require a 64-bit OS with at least 8 GB of RAM, a recent version of OpenJDK (the Open Java Development Kit), and Git installed. 8 | 9 | - **Runner Labels:** Labels can be added to runners to specify characteristics like operating system or pre-installed tools, which helps in targeting the right runner for specific pipeline tasks. 10 | 11 | - **Focus on Linux Options:** The unit emphasizes Linux Docker and Linux Shell runners due to their broad applicability in CI/CD scenarios. 12 | 13 | - **Linux Docker Runners:** They run steps in isolated container environments, ensuring a clean workspace for each pipeline run, though they don’t have access to the host file system. 14 | 15 | - **Linux Shell Runners:** They operate in persistent environments that maintain state between steps, allowing for faster builds by using pre-installed tools and dependencies. 16 | 17 | The upcoming lessons will cover deploying a server to experiment with these runner types. 
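The runner labels described above are targeted from a pipeline with the `runs-on` step option. A minimal sketch, assuming the default `self.hosted` and `linux` labels that Bitbucket applies to Linux Docker runners:

```yaml
pipelines:
  default:
    - step:
        name: Build on a self-hosted Linux Docker runner
        runs-on:
          - self.hosted   # label present on every self-hosted runner
          - linux         # default label for Linux Docker runners
        script:
          - echo "Running on a self-hosted runner"
```

Every label listed under `runs-on` must match a runner's labels for that runner to pick up the step.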
18 | 19 | ## References 20 | 21 | 22 | - [Set up runners for Linux Shell](https://support.atlassian.com/bitbucket-cloud/docs/set-up-runners-for-linux-shell/) 23 | 24 | 25 | --- 26 | [← 04_01 When to Use Self-Hosted Runners](../04_01_when_to_use_self_hosted_runners/README.md) | [04_03 Compare Repository and Workspace Runners →](../04_03_compare_repository_and_workspace_runners/README.md) 27 | 28 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_02_develop_a_custom_pipe/advanced-python/README.md: -------------------------------------------------------------------------------- 1 | # Bitbucket Pipelines Pipe: example 2 | 3 | example 4 | 5 | ## YAML Definition 6 | 7 | Add the following snippet to the script section of your `bitbucket-pipelines.yml` file: 8 | 9 | ```yaml 10 | - pipe: example/example:0.0.0 11 | variables: 12 | NAME: "" 13 | # DEBUG: "" # Optional 14 | ``` 15 | ## Variables 16 | 17 | | Variable | Usage | 18 | |----------|----------------------------------------------------| 19 | | NAME (*) | The name that will be printed in the logs. | 20 | | DEBUG | Turn on extra debug information. Default: `false`. | 21 | 22 | _(*) = required variable._ 23 | 24 | ## Prerequisites 25 | 26 | ## Examples 27 | 28 | Basic example: 29 | 30 | ```yaml 31 | script: 32 | - pipe: example/example:0.0.0 33 | variables: 34 | NAME: "foobar" 35 | ``` 36 | 37 | Advanced example: 38 | 39 | ```yaml 40 | script: 41 | - pipe: example/example:0.0.0 42 | variables: 43 | NAME: "foobar" 44 | DEBUG: "true" 45 | ``` 46 | 47 | ## Support 48 | If you’d like help with this pipe, or you have an issue or feature request, let us know. 49 | The pipe is maintained by example. 50 | 51 | If you’re reporting an issue, please include: 52 | 53 | - the version of the pipe 54 | - relevant logs and error messages 55 | - steps to reproduce 56 | 57 | ## License 58 | Copyright (c) 2019 Atlassian and others. 59 | Apache 2.0 licensed, see [LICENSE](LICENSE.txt) file. 
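The `NAME` and `DEBUG` variables documented above are declared in the pipe's `pipe.yml` metadata file. A hypothetical sketch for this example pipe; the field values (image tag, repository URL, maintainer) are placeholder assumptions:

```yaml
name: Example
image: example/example:0.0.0
description: Prints a greeting using the NAME variable.
repository: https://bitbucket.org/example/example
maintainer: example@example.com
variables:
  - name: NAME
    default: ''
  - name: DEBUG
    default: 'false'
```

Declaring a `default` makes a variable optional; a variable without a usable default, like `NAME` here, must be supplied by the pipeline that calls the pipe.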
60 | 61 | 62 | --- 63 | [← 03_02 Develop a Custom Pipe](../README.md) | [03_03 Test a Custom Pipe →](../../03_03_test_a_custom_pipe/README.md) 64 | 65 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/python-backend/app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import psycopg2 3 | 4 | 5 | def connect_to_db(): 6 | """Connects to the database using environment variables or default values. 7 | 8 | Returns: 9 | A psycopg2 connection object on success, None on failure. 10 | """ 11 | try: 12 | host = os.getenv("DB_HOST", "localhost") 13 | database = os.getenv("DB_NAME", "testing") 14 | user = os.getenv("DB_USER", "testing") 15 | password = os.getenv("DB_PASSWORD", "testing") 16 | 17 | connection = psycopg2.connect( 18 | host=host, database=database, user=user, password=password 19 | ) 20 | return connection 21 | except Exception as e: 22 | print(f"Error connecting to database: {e}") 23 | return None 24 | 25 | 26 | def create_table(connection): 27 | cursor = connection.cursor() 28 | cursor.execute( 29 | """ 30 | CREATE TABLE IF NOT EXISTS tasks ( 31 | id SERIAL PRIMARY KEY, 32 | description TEXT NOT NULL 33 | ); 34 | """ 35 | ) 36 | connection.commit() 37 | cursor.close() 38 | 39 | 40 | def insert_task(connection, description): 41 | cursor = connection.cursor() 42 | cursor.execute("INSERT INTO tasks (description) VALUES (%s)", (description,)) 43 | connection.commit() 44 | cursor.close() 45 | 46 | 47 | def main(): 48 | connection = connect_to_db() 49 | if connection: 50 | create_table(connection) 51 | insert_task(connection, "Learn more about Bitbucket Pipelines for CI/CD") 52 | connection.close() 53 | print("Task added successfully!") 54 | 55 | 56 | if __name__ == "__main__": 57 | main() 58 | -------------------------------------------------------------------------------- 
/ch1_pipeline_optimizations/01_05_cache_dependencies/python-backend/app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import psycopg2 3 | 4 | 5 | def connect_to_db(): 6 | """Connects to the database using environment variables or default values. 7 | 8 | Returns: 9 | A psycopg2 connection object on success, None on failure. 10 | """ 11 | try: 12 | host = os.getenv("DB_HOST", "localhost") 13 | database = os.getenv("DB_NAME", "testing") 14 | user = os.getenv("DB_USER", "testing") 15 | password = os.getenv("DB_PASSWORD", "testing") 16 | 17 | connection = psycopg2.connect( 18 | host=host, database=database, user=user, password=password 19 | ) 20 | return connection 21 | except Exception as e: 22 | print(f"Error connecting to database: {e}") 23 | return None 24 | 25 | 26 | def create_table(connection): 27 | cursor = connection.cursor() 28 | cursor.execute( 29 | """ 30 | CREATE TABLE IF NOT EXISTS tasks ( 31 | id SERIAL PRIMARY KEY, 32 | description TEXT NOT NULL 33 | ); 34 | """ 35 | ) 36 | connection.commit() 37 | cursor.close() 38 | 39 | 40 | def insert_task(connection, description): 41 | cursor = connection.cursor() 42 | cursor.execute("INSERT INTO tasks (description) VALUES (%s)", (description,)) 43 | connection.commit() 44 | cursor.close() 45 | 46 | 47 | def main(): 48 | connection = connect_to_db() 49 | if connection: 50 | create_table(connection) 51 | insert_task(connection, "Learn more about Bitbucket Pipelines for CI/CD") 52 | connection.close() 53 | print("Task added successfully!") 54 | 55 | 56 | if __name__ == "__main__": 57 | main() 58 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_03_configure_resource_allocation/README.md: -------------------------------------------------------------------------------- 1 | # 01_03 Configure resource allocation 2 | 3 | ## Reference 4 | 5 | 
[Size](https://support.atlassian.com/bitbucket-cloud/docs/global-options/#Size) 6 | 7 | We can use the `size` option to allocate additional CPU, memory, and storage for an entire pipeline, or for a specific step. 8 | 9 | ## Pipeline Resource Allocation 10 | 11 | ```yaml 12 | options: 13 | size: 2x 14 | 15 | pipelines: 16 | default: 17 | - step: 18 | ``` 19 | 20 | ## Step Resource Allocation 21 | 22 | ```yaml 23 | - step: 24 | size: 2x 25 | script: 26 | - echo "This step will run on a 2x instance" 27 | ``` 28 | 29 | ## Available Sizes 30 | 31 | The `size` option can be set to one of the following values: 32 | 33 | | Size | RAM | CPU | Disk | 34 | |------|--------|---------------|--------| 35 | | 1x | 2 GB | 4 Shared | 64 GB | 36 | | 2x | 8 GB | 4 Shared | 64 GB | 37 | | 4x | 16 GB | 8 Dedicated | 256 GB | 38 | | 8x | 32 GB | 16 Dedicated | 256 GB | 39 | 40 | > [!IMPORTANT] > 4x and 8x sizes are only available for Bitbucket accounts associated with a paid plan. 42 | 43 | > [!WARNING] > Increasing size uses additional build minutes. 45 | > Steps sized 2x, 4x, or 8x consume build minutes at the corresponding multiplier for each minute of execution. 46 | 47 | ## Using the Exercise Files 48 | 49 | 1. Create a new repo and add the exercise files 50 | 1. Adding the files may not trigger the pipeline right away. Go to the **Pipelines** menu and select **Run initial pipeline**. 51 | 1. Compare the runtimes of the steps that use different size runners. 
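The pipeline-level and step-level `size` options shown above can be combined; a step-level value overrides the pipeline default. A minimal sketch (step names and scripts are illustrative):

```yaml
options:
  size: 2x                # default size for every step in the pipeline

pipelines:
  default:
    - step:
        name: Regular build
        script:
          - echo "Runs with the pipeline default of 2x"
    - step:
        name: Memory-hungry build
        size: 4x            # step-level override; requires a paid plan
        script:
          - echo "Runs on a 4x instance"
```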
52 | 53 | 54 | --- 55 | [← 01_02 Configure Maximum Runtime](../01_02_configure_maximum_runtime/README.md) | [01_04 Use Conditional Steps →](../01_04_use_conditional_steps/README.md) 56 | 57 | -------------------------------------------------------------------------------- /ch0_intro/00_04_bitbucket_pipelines_review/README.md: -------------------------------------------------------------------------------- 1 | # 00_04 Bitbucket Pipelines Review 2 | 3 | You should be familiar with these essential elements of the Bitbucket Pipelines configuration: 4 | 5 | | Element | Notes | Reference | 6 | |---------|-------|-------| 7 | | `bitbucket-pipelines.yml` | The Pipelines configuration file where the CI/CD process is defined in YAML format. | [Bitbucket Pipelines configuration reference](https://support.atlassian.com/bitbucket-cloud/docs/bitbucket-pipelines-configuration-reference/) | 8 | | Triggers | Define events that start pipelines. | [Pipeline start conditions](https://support.atlassian.com/bitbucket-cloud/docs/pipeline-start-conditions/) | 9 | | Image | Specifies the Docker image used as the environment for the pipeline's execution. | [Docker image options](https://support.atlassian.com/bitbucket-cloud/docs/docker-image-options/) | 10 | | Steps | Define individual tasks in a pipeline, executed sequentially within a stage. | [Step Options](https://support.atlassian.com/bitbucket-cloud/docs/step-options/) | 11 | | Stages | Group steps into logical units to organize and parallelize pipeline execution. | [Stage Options](https://support.atlassian.com/bitbucket-cloud/docs/stage-options/) | 12 | | YAML Anchors | Reusable YAML syntax, useful for defining shared steps or configurations that help avoid duplication. 
| [YAML Anchors](https://support.atlassian.com/bitbucket-cloud/docs/yaml-anchors/) | 13 | 14 | ## Resources 15 | 16 | - **[Validator for bitbucket-pipelines.yml](https://bitbucket.org/product/pipelines/validator)**: Use this tool to validate your `bitbucket-pipelines.yml` file. 17 | 18 | 19 | --- 20 | [← 00_03 Using the Exercise Files](../00_03_using_the_exercise_files/README.md) | [01_01 Optimizing Pipeline Performance and Reducing Build Times →](../../ch1_pipeline_optimizations/01_01_optimizing_pipeline_performance/README.md) 21 | 22 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/ruby-backend/app_test.rb: -------------------------------------------------------------------------------- 1 | require 'minitest/autorun' 2 | require_relative 'app' 3 | 4 | class TestDatabase < Minitest::Test 5 | 6 | def test_connection 7 | connection = connect_to_db 8 | refute_nil connection, "Failed to connect to database" 9 | end 10 | 11 | def test_create_table 12 | connection = connect_to_db 13 | refute_nil connection, "Failed to connect to database" 14 | 15 | cursor = connection.exec("SELECT EXISTS (SELECT 1 FROM information_schema.tables WHERE table_name = 'tasks');") 16 | exists = cursor.getvalue(0, 0) 17 | assert exists, "Table 'tasks' does not exist" 18 | 19 | ensure 20 | connection.close if connection 21 | end 22 | 23 | def test_insert_task 24 | connection = connect_to_db 25 | refute_nil connection, "Failed to connect to database" 26 | 27 | 28 | # FIXME: Ruby dev team, can you please fix the following assertions? 29 | # 30 | # At the moment they are causing errors in the pipeline. 31 | # 32 | # For the Python and Ruby tests to continue together, all tests need 33 | # parity in both code bases. 34 | # 35 | # If you need assistance, please reach out to Yukihiro for help. 
36 | # 37 | # Until these tests are repaired, keep the following lines 38 | # commented out so the pipeline can complete successfully. 39 | # 40 | # Sincerely, 41 | # 42 | # The Amazing Mobile App Python Dev Team 43 | # 44 | 45 | # cursor = connection.exec("SELECT description FROM tasks WHERE description = $1", ["Learn more about using Bitbucket Pipelines for CI/CD"]) 46 | # inserted_task = cursor.first 47 | 48 | # refute_nil inserted_task, "Failed to insert task" 49 | # assert_equal inserted_task['description'], "Learn more about using Bitbucket Pipelines for CI/CD", "Inserted task description is incorrect" 50 | 51 | ensure 52 | connection.close if connection 53 | end 54 | end 55 | 56 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/ruby-backend/app_test.rb: -------------------------------------------------------------------------------- 1 | require 'minitest/autorun' 2 | require_relative 'app' 3 | 4 | class TestDatabase < Minitest::Test 5 | 6 | def test_connection 7 | connection = connect_to_db 8 | refute_nil connection, "Failed to connect to database" 9 | end 10 | 11 | def test_create_table 12 | connection = connect_to_db 13 | refute_nil connection, "Failed to connect to database" 14 | 15 | cursor = connection.exec("SELECT EXISTS (SELECT 1 FROM information_schema.tables WHERE table_name = 'tasks');") 16 | exists = cursor.getvalue(0, 0) 17 | assert exists, "Table 'tasks' does not exist" 18 | 19 | ensure 20 | connection.close if connection 21 | end 22 | 23 | def test_insert_task 24 | connection = connect_to_db 25 | refute_nil connection, "Failed to connect to database" 26 | 27 | 28 | # FIXME: Ruby dev team, can you please fix the following assertions? 29 | # 30 | # At the moment they are causing errors in the pipeline. 31 | # 32 | # For the Python and Ruby tests to continue together, all tests need 33 | # parity in both code bases. 
34 | # 35 | # If you need assistance, please reach out to Yukihiro for help. 36 | # 37 | # Until these tests are repaired, keep the following lines 38 | # commented out so the pipeline can complete successfully. 39 | # 40 | # Sincerely, 41 | # 42 | # The Amazing Mobile App Python Dev Team 43 | # 44 | 45 | # cursor = connection.exec("SELECT description FROM tasks WHERE description = $1", ["Learn more about using Bitbucket Pipelines for CI/CD"]) 46 | # inserted_task = cursor.first 47 | 48 | # refute_nil inserted_task, "Failed to insert task" 49 | # assert_equal inserted_task['description'], "Learn more about using Bitbucket Pipelines for CI/CD", "Inserted task description is incorrect" 50 | 51 | ensure 52 | connection.close if connection 53 | end 54 | end 55 | 56 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/shared/database-schema.json: -------------------------------------------------------------------------------- 1 | { 2 | "database": { 3 | "name": "my_database", 4 | "host": "remote-postgres-service.com", 5 | "port": 5432, 6 | "username": "my_remote_user", 7 | "password": "my_secure_password" 8 | }, 9 | "connections": [ 10 | { 11 | "name": "default_connection", 12 | "database": "my_database", 13 | "driver": "postgresql", 14 | "host": "remote-postgres-service.com", 15 | "port": 5432, 16 | "username": "my_remote_user", 17 | "password": "my_secure_password" 18 | }, 19 | { 20 | "name": "readonly_connection", 21 | "database": "my_database_ro", 22 | "driver": "postgresql", 23 | "host": "remote-postgres-service.com", 24 | "port": 5432, 25 | "username": "my_remote_user_ro", 26 | "password": "my_secure_password_ro" 27 | } 28 | ], 29 | "tables": [ 30 | { 31 | "name": "users", 32 | "fields": [ 33 | {"name": "id", "type": "integer"}, 34 | {"name": "email", "type": "string"} 35 | ] 36 | }, 37 | { 38 | "name": "orders", 39 | "fields": [ 40 | {"name": "id", "type": "integer"}, 41 | 
{"name": "user_id", "type": "integer"}, 42 | {"name": "order_date", "type": "date"} 43 | ] 44 | } 45 | ], 46 | "indexes": [ 47 | { 48 | "table_name": "users", 49 | "index_name": "email_idx", 50 | "fields": ["email"] 51 | }, 52 | { 53 | "table_name": "orders", 54 | "index_name": "user_id_idx", 55 | "fields": ["user_id"] 56 | } 57 | ], 58 | "relationships": [ 59 | { 60 | "name": "one_to_many", 61 | "table1": "users", 62 | "table2": "orders" 63 | }, 64 | { 65 | "name": "many_to_one", 66 | "table1": "orders", 67 | "table2": "users" 68 | } 69 | ], 70 | "schema_version": 3, 71 | "created_at": "2022-01-01T12:00:00.000Z", 72 | "updated_at": "2022-01-02T14:30:00.000Z" 73 | } 74 | 75 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/shared/database-schema.json: -------------------------------------------------------------------------------- 1 | { 2 | "database": { 3 | "name": "my_database", 4 | "host": "remote-postgres-service.com", 5 | "port": 5432, 6 | "username": "my_remote_user", 7 | "password": "my_secure_password" 8 | }, 9 | "connections": [ 10 | { 11 | "name": "default_connection", 12 | "database": "my_database", 13 | "driver": "postgresql", 14 | "host": "remote-postgres-service.com", 15 | "port": 5432, 16 | "username": "my_remote_user", 17 | "password": "my_secure_password" 18 | }, 19 | { 20 | "name": "readonly_connection", 21 | "database": "my_database_ro", 22 | "driver": "postgresql", 23 | "host": "remote-postgres-service.com", 24 | "port": 5432, 25 | "username": "my_remote_user_ro", 26 | "password": "my_secure_password_ro" 27 | } 28 | ], 29 | "tables": [ 30 | { 31 | "name": "users", 32 | "fields": [ 33 | {"name": "id", "type": "integer"}, 34 | {"name": "email", "type": "string"} 35 | ] 36 | }, 37 | { 38 | "name": "orders", 39 | "fields": [ 40 | {"name": "id", "type": "integer"}, 41 | {"name": "user_id", "type": "integer"}, 42 | {"name": "order_date", "type": "date"} 43 | ] 44 | } 
45 | ], 46 | "indexes": [ 47 | { 48 | "table_name": "users", 49 | "index_name": "email_idx", 50 | "fields": ["email"] 51 | }, 52 | { 53 | "table_name": "orders", 54 | "index_name": "user_id_idx", 55 | "fields": ["user_id"] 56 | } 57 | ], 58 | "relationships": [ 59 | { 60 | "name": "one_to_many", 61 | "table1": "users", 62 | "table2": "orders" 63 | }, 64 | { 65 | "name": "many_to_one", 66 | "table1": "orders", 67 | "table2": "users" 68 | } 69 | ], 70 | "schema_version": 3, 71 | "created_at": "2022-01-01T12:00:00.000Z", 72 | "updated_at": "2022-01-02T14:30:00.000Z" 73 | } 74 | 75 | -------------------------------------------------------------------------------- /CHAPTER_LIST.txt: -------------------------------------------------------------------------------- 1 | ch0_intro/00_01_introduction 2 | ch0_intro/00_02_what_you_should_know 3 | ch0_intro/00_03_using_the_exercise_files 4 | ch0_intro/00_04_bitbucket_pipelines_review 5 | ch1_pipeline_optimizations/01_01_optimizing_pipeline_performance 6 | ch1_pipeline_optimizations/01_02_configure_maximum_runtime 7 | ch1_pipeline_optimizations/01_03_configure_resource_allocation 8 | ch1_pipeline_optimizations/01_04_use_conditional_steps 9 | ch1_pipeline_optimizations/01_05_cache_dependencies 10 | ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline 11 | ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline 12 | ch2_using_pipes_in_pipelines/02_01_getting_to_know_pipes 13 | ch2_using_pipes_in_pipelines/02_02_use_a_pipe_in_a_pipeline_configuration 14 | ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws 15 | ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda 16 | ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline 17 | ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline 18 | ch3_create_custom_pipes/03_01_when_to_use_custom_pipes 19 | ch3_create_custom_pipes/03_02_develop_a_custom_pipe 20 | ch3_create_custom_pipes/03_03_test_a_custom_pipe 21 | 
ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry 22 | ch3_create_custom_pipes/03_05_use_a_custom_pipe_in_a_pipeline 23 | ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe 24 | ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe 25 | ch4_self_hosted_runners/04_01_when_to_use_self_hosted_runners 26 | ch4_self_hosted_runners/04_02_self_hosted_runner_configurations 27 | ch4_self_hosted_runners/04_03_compare_repository_and_workspace_runners 28 | ch4_self_hosted_runners/04_04_deploy_an_ec2_server_in_aws 29 | ch4_self_hosted_runners/04_05_install_runners_in_a_workspace 30 | ch4_self_hosted_runners/04_06_use_self_hosted_runners_in_a_pipeline 31 | ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner 32 | ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner 33 | ch5_conclusion/05_01_next_steps 34 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_01_when_to_use_self_hosted_runners/README.md: -------------------------------------------------------------------------------- 1 | # 04_01 When to Use Self-Hosted Runners 2 | 3 | Self-hosted runners give you greater control and customization for your CI/CD pipelines compared to Atlassian’s cloud-hosted runners. 4 | 5 | Following are the key points we'll explore in this chapter: 6 | 7 | - **Enhanced Control & Customization:** You can tailor hardware (CPU, memory, storage) and build environments to meet demanding or specialized requirements. 8 | 9 | - **Security & Compliance:** Running pipelines on private infrastructure allows secure interactions with internal networks and helps safeguard sensitive data to meet regulations like GDPR and HIPAA. 10 | 11 | - **Cost Advantages:** Since build minutes aren’t charged on your own infrastructure and data transfer stays internal, self-hosting can reduce costs for long-running or frequent jobs. 
12 | 13 | - **Operational Challenges:** Self-hosting requires you to manage server deployment, maintenance, scalability, and regular security updates. It also introduces security risks, so it’s recommended only for private repositories and environments with strict access controls. 14 | 15 | - **When to Consider Self-Hosting:** It’s a good fit if you encounter resource limitations on the cloud, have dedicated infrastructure available, need to work with sensitive or internal systems, and are prepared to manage the associated maintenance and security responsibilities. 16 | 17 | Now let's discuss how to deploy and use self-hosted runners in your pipelines. 18 | 19 | ## References 20 | 21 | - [Set up runners for Linux Shell](https://support.atlassian.com/bitbucket-cloud/docs/set-up-runners-for-linux-shell/) 22 | - [Set up runners for Linux Docker](https://support.atlassian.com/bitbucket-cloud/docs/set-up-and-use-runners-for-linux/) 23 | 24 | 25 | --- 26 | [← 03_07 Solution: Develop a custom pipe](../../ch3_create_custom_pipes/03_07_solution_create_a_custom_pipe/README.md) | [04_02 Self-Hosted Runner Configurations →](../04_02_self_hosted_runner_configurations/README.md) 27 | 28 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_06_use_self_hosted_runners_in_a_pipeline/README.md: -------------------------------------------------------------------------------- 1 | # 04_06 Use Self-Hosted Runners in a Pipeline 2 | 3 | Follow the steps in the previous lessons to: 4 | 5 | - Deploy an EC2 instance where self-hosted runners can be installed. 6 | - Install two self-hosted runners: a Linux Shell runner and a Linux Docker runner. 7 | 8 | Then, create a new repo and add the exercise files for this lesson. 9 | 10 | The [pipeline configuration](./bitbucket-pipelines.yml) sets up a script definition that can be used to explore the runners. 
11 | 12 | ```yaml 13 | definitions: 14 | script: &inspect-runner-environment 15 | - echo "Gathering CPU Info..." && lscpu 16 | - echo "Gathering Memory Info..." && free -h 17 | - echo "Disk Usage:" && df -h / 18 | - echo "Installed Tools:" 19 | - java --version || true 20 | - git --version || true 21 | - docker --version || true 22 | - echo "Listing All Commands:" && compgen -c 23 | - echo "Network Information:" && ip addr show 24 | - echo "Environment Variables:" && env 25 | 26 | pipelines: 27 | default: 28 | - step: 29 | name: Use Linux Shell Runner 30 | runs-on: 31 | - self.hosted 32 | - linux.shell 33 | - os.al2023 34 | script: *inspect-runner-environment 35 | - step: 36 | name: Use Linux Docker Runner 37 | runs-on: 38 | - self.hosted 39 | - linux 40 | script: *inspect-runner-environment 41 | ``` 42 | 43 | Run the pipeline and compare the output from both runners. 44 | 45 | > [!TIP] 46 | > If you prefer not to run the pipeline for this lesson, use the following output for your review: 47 | > 48 | > - [Linux Shell Runner Output](./linux-shell-runner-pipelineLog.txt) 49 | > - [Linux Docker Runner Output](./linux-docker-runner-pipelineLog.txt) 50 | 51 | 52 | --- 53 | [← 04_05 Install Runners in a Workspace](../04_05_install_runners_in_a_workspace/README.md) | [04_07 Challenge: Deploy a Self-Hosted Runner →](../04_07_challenge_deploy_a_self_hosted_runner/README.md) 54 | 55 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/python-backend/app_test.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | import psycopg2.errors 3 | from app import connect_to_db, create_table, insert_task 4 | 5 | 6 | class TestDatabase(unittest.TestCase): 7 | 8 | def test_connection(self): 9 | connection = connect_to_db() 10 | self.assertIsNotNone(connection) 11 | if connection: 12 | connection.close() 13 | 14 | def test_create_table(self): 15 | 
connection = connect_to_db() 16 | self.assertIsNotNone(connection) 17 | cursor = connection.cursor() 18 | try: 19 | create_table(connection) 20 | # Check if the table exists (assuming a specific table name) 21 | cursor.execute( 22 | "SELECT EXISTS (SELECT 1 FROM information_schema.tables WHERE table_name = 'tasks');" 23 | ) 24 | exists = cursor.fetchone()[0] 25 | self.assertTrue(exists) 26 | except ( 27 | psycopg2.errors.ProgrammingError 28 | ) as e: # Handle potential table creation errors 29 | self.fail(f"Error creating table: {e}") 30 | finally: 31 | connection.close() 32 | 33 | def test_insert_task(self): 34 | connection = connect_to_db() 35 | self.assertIsNotNone(connection) 36 | 37 | cursor = connection.cursor() 38 | try: 39 | task_description = "Learn more about using Bitbucket Pipelines for CI/CD" 40 | insert_task(connection, task_description) 41 | 42 | cursor.execute( 43 | "SELECT description FROM tasks WHERE description = %s", 44 | (task_description,), 45 | ) 46 | inserted_task = cursor.fetchone() 47 | self.assertIsNotNone(inserted_task) 48 | self.assertEqual( 49 | inserted_task[0], task_description 50 | ) # Verify inserted description 51 | except Exception as e: 52 | self.fail(f"Error inserting task: {e}") 53 | finally: 54 | connection.close() 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/python-backend/app_test.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | import psycopg2.errors 3 | from app import connect_to_db, create_table, insert_task 4 | 5 | 6 | class TestDatabase(unittest.TestCase): 7 | 8 | def test_connection(self): 9 | connection = connect_to_db() 10 | self.assertIsNotNone(connection) 11 | if connection: 12 | connection.close() 13 | 14 | def test_create_table(self): 15 | connection = connect_to_db() 16 | 
self.assertIsNotNone(connection) 17 | cursor = connection.cursor() 18 | try: 19 | create_table(connection) 20 | # Check if the table exists (assuming a specific table name) 21 | cursor.execute( 22 | "SELECT EXISTS (SELECT 1 FROM information_schema.tables WHERE table_name = 'tasks');" 23 | ) 24 | exists = cursor.fetchone()[0] 25 | self.assertTrue(exists) 26 | except ( 27 | psycopg2.errors.ProgrammingError 28 | ) as e: # Handle potential table creation errors 29 | self.fail(f"Error creating table: {e}") 30 | finally: 31 | connection.close() 32 | 33 | def test_insert_task(self): 34 | connection = connect_to_db() 35 | self.assertIsNotNone(connection) 36 | 37 | cursor = connection.cursor() 38 | try: 39 | task_description = "Learn more about using Bitbucket Pipelines for CI/CD" 40 | insert_task(connection, task_description) 41 | 42 | cursor.execute( 43 | "SELECT description FROM tasks WHERE description = %s", 44 | (task_description,), 45 | ) 46 | inserted_task = cursor.fetchone() 47 | self.assertIsNotNone(inserted_task) 48 | self.assertEqual( 49 | inserted_task[0], task_description 50 | ) # Verify inserted description 51 | except Exception as e: 52 | self.fail(f"Error inserting task: {e}") 53 | finally: 54 | connection.close() 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/create_function_configuration.py: -------------------------------------------------------------------------------- 1 | import json 2 | import os 3 | 4 | 5 | def create_function_config_file( 6 | function_name, platform, version, build_number, environment, output_path 7 | ): 8 | """ 9 | Creates a JSON file for AWS Lambda function configuration. 10 | 11 | Args: 12 | function_name (str): Name of the Lambda function. 13 | platform (str): Platform name to include in the configuration. 
14 | version (str): Version number to include in the configuration. 15 | build_number (str): Build number to include in the configuration. 16 | environment (str): Deployment environment (e.g., "production", "staging"). 17 | output_path (str): Path to save the generated JSON file. 18 | 19 | Returns: 20 | str: Path to the generated JSON file. 21 | """ 22 | config = { 23 | "FunctionName": function_name, 24 | "Environment": { 25 | "Variables": { 26 | "PLATFORM": platform, 27 | "VERSION": version, 28 | "BUILD_NUMBER": build_number, 29 | "ENVIRONMENT": environment, 30 | } 31 | }, 32 | } 33 | 34 | # Write the configuration to a JSON file 35 | with open(output_path, "w") as json_file: 36 | json.dump(config, json_file, indent=4) 37 | 38 | print(f"Configuration file created at: {output_path}") 39 | return output_path 40 | 41 | 42 | if __name__ == "__main__": 43 | FUNCTION_NAME = os.getenv("FUNCTION_NAME", "undefined") 44 | VERSION = os.getenv("BITBUCKET_COMMIT", "undefined") 45 | BUILD_NUMBER = os.getenv("BITBUCKET_BUILD_NUMBER", "undefined") 46 | ENVIRONMENT = os.getenv("BITBUCKET_DEPLOYMENT_ENVIRONMENT", "undefined") 47 | PLATFORM = os.getenv("PLATFORM", "Bitbucket") 48 | OUTPUT_PATH = "function_configuration.json" 49 | 50 | create_function_config_file( 51 | function_name=FUNCTION_NAME, 52 | platform=PLATFORM, 53 | version=VERSION, 54 | build_number=BUILD_NUMBER, 55 | environment=ENVIRONMENT, 56 | output_path=OUTPUT_PATH, 57 | ) 58 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/README.md: -------------------------------------------------------------------------------- 1 | # 01_06 Challenge: Optimize a Workflow in Bitbucket Pipelines 2 | 3 | ## Challenge Scenario 4 | 5 | In this challenge you’re the Bitbucket Pipelines expert supporting development on the Amazing Mobile App. 
6 | 7 | The Data Analysis team has developed a Python script to run a cluster analysis using machine learning. This script processes a large dataset stored in a JSON file and generates an analysis report. 8 | 9 | The Data Analysis team is facing challenges with execution time and resource usage, and they’ve asked you to optimize their **Bitbucket Pipelines** workflow. 10 | 11 | Specifically, they need a pipeline that: 12 | 13 | 1. Prevents the analysis from running for more than 10 minutes 14 | 1. Minimizes the time spent loading data analysis libraries 15 | 1. Reuses any data that has already been generated by the script and written to the `./data` directory in the workspace 16 | 1. Only runs the analysis when the [cluster analysis](./cluster_analysis.py) script has changed. 17 | 18 | Review the current pipeline configuration and make changes to optimize the pipeline as requested. 19 | 20 | ## Challenge Tasks 21 | 22 | 1. Log into Bitbucket and create a new repository. 23 | 1. Add the exercise files. 24 | 1. From the **Pipelines** menu, run the pipeline once to enable pipeline settings. 25 | 26 | ![Run initial pipeline, step 1](./images/SCR-20250103-trnw-run-initial-pipeline-1.png) 27 | 28 | ![Run initial pipeline, step 2](./images/SCR-20250103-trwv-run-initial-pipeline-2.png) 29 | 30 | 1. Review the pipeline to get an idea of the pipeline performance before making any changes. 31 | 1. Update the [pipeline configuration](./bitbucket-pipelines.yml) to meet the specifications provided by the Data Analysis team. 32 | 33 | > [!NOTE] 34 | > Use the topics from this chapter to solve the challenge, specifically: `max-time`, `caches`, and `conditions`. 35 | 36 | This challenge should take 10-15 minutes to complete. 
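For reference, the three directives called out in the note above take the following general shape inside a step. The step name, cache name, and script line here are illustrative placeholders, not the full challenge solution:

```yaml
- step:
    name: Run Cluster Analysis
    max-time: 10                  # fail the step if it runs longer than 10 minutes
    caches:
      - pip                       # pre-defined cache for Python packages
      - analysis-data             # custom cache declared under definitions.caches
    condition:
      changesets:
        includePaths:
          - cluster_analysis.py   # only run when this script changes
    script:
      - python cluster_analysis.py
```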
37 | 38 | 39 | --- 40 | [← 01_05 Cache Dependencies](../01_05_cache_dependencies/README.md) | [01_07 Solution: Optimize a Workflow in Bitbucket Pipelines →](../01_07_solution_optimize_a_pipeline/README.md) 41 | 42 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_03_configure_resource_allocation/1x-pipelineLog.txt: -------------------------------------------------------------------------------- 1 | + umask 000 2 | 3 | + GIT_LFS_SKIP_SMUDGE=1 retry 6 git clone --branch="main" --depth 50 https://x-token-auth:$REPOSITORY_OAUTH_ACCESS_TOKEN@bitbucket.org/$BITBUCKET_REPO_FULL_NAME.git $BUILD_DIR 4 | Cloning into '/opt/atlassian/pipelines/agent/build'... 5 | 6 | + git reset --hard c5950fbe99d171ecc8087fe985fd5a0cc78dbfbc 7 | HEAD is now at c5950fb add exercise files 8 | 9 | + git config user.name bitbucket-pipelines 10 | 11 | + git config user.email commits-noreply@bitbucket.org 12 | 13 | + git config push.default current 14 | 15 | + git config http.${BITBUCKET_GIT_HTTP_ORIGIN}.proxy http://localhost:29418/ 16 | 17 | + git remote set-url origin http://bitbucket.org/$BITBUCKET_REPO_FULL_NAME 18 | 19 | + git reflog expire --expire=all --all 20 | 21 | + echo ".bitbucket/pipelines/generated" >> .git/info/exclude 22 | 23 | + chmod 777 $BUILD_DIR 24 | 25 | 26 | Default variables: 27 | BITBUCKET_BRANCH 28 | BITBUCKET_BUILD_NUMBER 29 | BITBUCKET_CLONE_DIR 30 | BITBUCKET_COMMIT 31 | BITBUCKET_GIT_HTTP_ORIGIN 32 | BITBUCKET_GIT_SSH_ORIGIN 33 | BITBUCKET_PIPELINE_UUID 34 | BITBUCKET_PROJECT_KEY 35 | BITBUCKET_PROJECT_UUID 36 | BITBUCKET_REPO_FULL_NAME 37 | BITBUCKET_REPO_IS_PRIVATE 38 | BITBUCKET_REPO_OWNER 39 | BITBUCKET_REPO_OWNER_UUID 40 | BITBUCKET_REPO_SLUG 41 | BITBUCKET_REPO_UUID 42 | BITBUCKET_SSH_KEY_FILE 43 | BITBUCKET_STEP_RUN_NUMBER 44 | BITBUCKET_STEP_TRIGGERER_UUID 45 | BITBUCKET_STEP_UUID 46 | BITBUCKET_WORKSPACE 47 | CI 48 | DOCKER_HOST 49 | PIPELINES_JWT_TOKEN 50 | 51 | Images used: 52 | build : 
docker.io/atlassian/default-image@sha256:673c80ac148d45569aabf3be1ba4ccf4c49443e54c843b70abe51498a5304da6 53 | 54 | Cloud runtime: 2 55 | Arch: X86 56 | + echo "CPUs = $(nproc)" 57 | CPUs = 2 58 | 59 | + echo "RAM = $(grep MemTotal /proc/meminfo | awk '{print $2 / 1024 / 1024 }') GB" 60 | RAM = 5.9534 GB 61 | 62 | + echo "Disk = $(df --block-size=1G / | awk 'NR==2 {print $2}') GB" 63 | Disk = 63 GB 64 | 65 | Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4 66 | Finished scanning for test reports. Found 0 test report files. 67 | Merged test suites, total number tests is 0, with 0 failures and 0 errors. -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_03_configure_resource_allocation/2x-pipelineLog.txt: -------------------------------------------------------------------------------- 1 | + umask 000 2 | 3 | + GIT_LFS_SKIP_SMUDGE=1 retry 6 git clone --branch="main" --depth 50 https://x-token-auth:$REPOSITORY_OAUTH_ACCESS_TOKEN@bitbucket.org/$BITBUCKET_REPO_FULL_NAME.git $BUILD_DIR 4 | Cloning into '/opt/atlassian/pipelines/agent/build'... 
5 | 6 | + git reset --hard c5950fbe99d171ecc8087fe985fd5a0cc78dbfbc 7 | HEAD is now at c5950fb add exercise files 8 | 9 | + git config user.name bitbucket-pipelines 10 | 11 | + git config user.email commits-noreply@bitbucket.org 12 | 13 | + git config push.default current 14 | 15 | + git config http.${BITBUCKET_GIT_HTTP_ORIGIN}.proxy http://localhost:29418/ 16 | 17 | + git remote set-url origin http://bitbucket.org/$BITBUCKET_REPO_FULL_NAME 18 | 19 | + git reflog expire --expire=all --all 20 | 21 | + echo ".bitbucket/pipelines/generated" >> .git/info/exclude 22 | 23 | + chmod 777 $BUILD_DIR 24 | 25 | 26 | Default variables: 27 | BITBUCKET_BRANCH 28 | BITBUCKET_BUILD_NUMBER 29 | BITBUCKET_CLONE_DIR 30 | BITBUCKET_COMMIT 31 | BITBUCKET_GIT_HTTP_ORIGIN 32 | BITBUCKET_GIT_SSH_ORIGIN 33 | BITBUCKET_PIPELINE_UUID 34 | BITBUCKET_PROJECT_KEY 35 | BITBUCKET_PROJECT_UUID 36 | BITBUCKET_REPO_FULL_NAME 37 | BITBUCKET_REPO_IS_PRIVATE 38 | BITBUCKET_REPO_OWNER 39 | BITBUCKET_REPO_OWNER_UUID 40 | BITBUCKET_REPO_SLUG 41 | BITBUCKET_REPO_UUID 42 | BITBUCKET_SSH_KEY_FILE 43 | BITBUCKET_STEP_RUN_NUMBER 44 | BITBUCKET_STEP_TRIGGERER_UUID 45 | BITBUCKET_STEP_UUID 46 | BITBUCKET_WORKSPACE 47 | CI 48 | DOCKER_HOST 49 | PIPELINES_JWT_TOKEN 50 | 51 | Images used: 52 | build : docker.io/atlassian/default-image@sha256:673c80ac148d45569aabf3be1ba4ccf4c49443e54c843b70abe51498a5304da6 53 | 54 | Cloud runtime: 2 55 | Arch: X86 56 | + echo "CPUs = $(nproc)" 57 | CPUs = 4 58 | 59 | + echo "RAM = $(grep MemTotal /proc/meminfo | awk '{print $2 / 1024 / 1024 }') GB" 60 | RAM = 9.95307 GB 61 | 62 | + echo "Disk = $(df --block-size=1G / | awk 'NR==2 {print $2}') GB" 63 | Disk = 63 GB 64 | 65 | Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4 66 | Finished scanning for test reports. Found 0 test report files. 67 | Merged test suites, total number tests is 0, with 0 failures and 0 errors. 
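The two logs above differ only in the resources reported near the end (2 CPUs and roughly 6 GB of RAM at the default size versus 4 CPUs and roughly 10 GB at `2x`). The shell probes marked with `+` can be mirrored in Python; the sketch below is stdlib-only, and the `/proc/meminfo` read assumes a Linux build container like the ones Pipelines uses:

```python
import os
import shutil


def report_resources(meminfo_path="/proc/meminfo"):
    """Collect the figures the pipeline step echoes: CPU count, RAM, disk."""
    report = {"cpus": os.cpu_count()}

    # MemTotal comes from /proc, so this part is Linux-specific;
    # fall back to None on other operating systems.
    report["ram_gb"] = None
    try:
        with open(meminfo_path) as meminfo:
            for line in meminfo:
                if line.startswith("MemTotal:"):
                    kilobytes = int(line.split()[1])
                    report["ram_gb"] = round(kilobytes / 1024 / 1024, 4)
                    break
    except OSError:
        pass

    # Total size of the root filesystem in whole gigabytes,
    # matching `df --block-size=1G /` in the logs.
    total_bytes, _, _ = shutil.disk_usage("/")
    report["disk_gb"] = total_bytes // 1024**3
    return report


if __name__ == "__main__":
    for name, value in report_resources().items():
        print(f"{name} = {value}")
```

Running this inside steps sized `1x` and `2x` should reproduce the doubling seen between the two logs.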
-------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/index_test.py: -------------------------------------------------------------------------------- 1 | from index import handler 2 | from unittest.mock import patch 3 | import unittest 4 | import json 5 | 6 | 7 | class TestHandler(unittest.TestCase): 8 | def setUp(self): 9 | # Set up any test data or configurations you need 10 | self.mock_env = "test_environment" 11 | patch.dict("os.environ", {"ENVIRONMENT": self.mock_env}).start() 12 | 13 | def tearDown(self): 14 | # Clean up after each test if necessary 15 | patch.stopall() 16 | 17 | def test_home_page(self): 18 | # Test the home page 19 | event = {"rawPath": "/"} 20 | response = handler(event, None) 21 | self.assertEqual(response["statusCode"], 200) 22 | self.assertEqual(response["headers"]["Content-Type"], "text/html") 23 | self.assertIn(f"The Sample Application - {self.mock_env}", response["body"]) 24 | 25 | def test_get_all_data(self): 26 | with open("data.json", mode="r", encoding="utf-8") as data_file: 27 | data = json.load(data_file) 28 | 29 | event = {"rawPath": "/data"} 30 | response = handler(event, None) 31 | self.assertEqual(response["statusCode"], 200) 32 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 33 | self.assertEqual(response["body"], json.dumps(data)) 34 | 35 | def test_get_item_by_id(self): 36 | with open("data.json", "r") as f: 37 | data = json.load(f) 38 | 39 | event = {"rawPath": "/1"} 40 | response = handler(event, None) 41 | self.assertEqual(response["statusCode"], 200) 42 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 43 | 44 | # Find the expected item in data based on the given ID 45 | item_id = event["rawPath"][1:] 46 | expected_item = next((item for item in data if item["id"] == item_id), None) 47 | self.assertEqual(response["body"], json.dumps(expected_item)) 48 | 49 | def 
test_invalid_id(self): 50 | event = {"rawPath": "/invalid_id"} 51 | response = handler(event, None) 52 | self.assertEqual(response["statusCode"], 404) 53 | self.assertEqual(response["headers"]["Content-Type"], "application/json") 54 | self.assertIn(f"id {event['rawPath'][1:]} not found", response["body"]) 55 | 56 | 57 | if __name__ == "__main__": 58 | unittest.main() 59 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_01_when_to_use_custom_pipes/README.md: -------------------------------------------------------------------------------- 1 | # 03_01 When to Use Custom Pipes 2 | 3 | ## Reasons to Create a Custom Pipe 4 | 5 | - Developing pipelines that perform the same complex tasks in several steps or need to run complex tasks across multiple repositories 6 | - Vendors that want developers to integrate with their software or platform from Bitbucket Pipelines 7 | - Advanced pipeline solutions with specific dependencies or complex environments 8 | 9 | ## Simple Pipe vs Complete Pipe 10 | 11 | - **Simple pipes** are intended for personal and team use and require: 12 | - A script 13 | - A Dockerfile 14 | - Image deployment and hosting 15 | 16 | - **Complete pipes** are intended for public use and distribution. In addition to the simple pipe requirements, complete pipes require: 17 | - A metadata file 18 | - Documentation 19 | - A test suite 20 | 21 | > [!IMPORTANT] 22 | > Only complete pipes may be considered for distribution via the Atlassian Marketplace. 23 | 24 | ## Pipe Development Tools and Resources 25 | 26 | | Resource | Link | 27 | |----------|------| 28 | | Official documentation for creating Bitbucket Pipes | [Write a pipe for Bitbucket Pipelines](https://support.atlassian.com/bitbucket-cloud/docs/write-a-pipe-for-bitbucket-pipelines/) | 29 | | A Yeoman generator for creating Bitbucket Pipes | [NPM: generator-bitbucket-pipe](https://www.npmjs.com/package/generator-bitbucket-pipe)
[Repo: bitbucket-pipe-release](https://bitbucket.org/atlassian/bitbucket-pipe-release/src/master/)| 30 | | A Python library for developing Bitbucket Pipes | [bitbucket-pipes-toolkit](https://bitbucket.org/bitbucketpipelines/bitbucket-pipes-toolkit/src/master/) | 31 | | A Bash library for developing Bitbucket Pipes | [bitbucket-pipes-toolkit-bash](https://bitbucket.org/bitbucketpipelines/bitbucket-pipes-toolkit-bash/src/master/) | 32 | | A simple pipe example using Bash | [demo-pipe-simple](https://bitbucket.org/bitbucketpipelines/demo-pipe-simple/src/master/) | 33 | | A complete pipe example using Python | [demo-pipe-python](https://bitbucket.org/atlassian/demo-pipe-python/src/master/) | 34 | | Repositories for the Official Bitbucket Pipes | [Bitbucket Pipelines Pipes](https://bitbucket.org/atlassian/workspace/projects/BPP) | 35 | 36 | 37 | --- 38 | [← 02_06 Solution: Use Pipes in a Pipeline](../../ch2_using_pipes_in_pipelines/02_06_solution_use_pipes_in_a_pipeline/README.md) | [03_02 Develop a Custom Pipe →](../03_02_develop_a_custom_pipe/README.md) 39 | 40 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_05_cache_dependencies/README.md: -------------------------------------------------------------------------------- 1 | # 01_05 Cache Dependencies 2 | 3 | **Dependencies**: external components that a software application 4 | relies on to function properly. 5 | 6 | - Examples: 7 | - A Python script that needs data analysis capabilities may depend on the [pandas](https://pandas.pydata.org/) library. 8 | - A JavaScript web application that needs a graphical interface may depend on the [React](https://react.dev/) framework. 
9 | 10 | ## Caching dependencies in pipelines 11 | 12 | ```yaml 13 | - step: 14 | caches: 15 | - CACHE_NAME_1 16 | - CACHE_NAME_2 17 | ``` 18 | 19 | ### Pre-defined caches 20 | 21 | - `docker` 22 | - `composer` 23 | - `dotnetcore` 24 | - `gradle` 25 | - `ivy2` 26 | - `maven` 27 | - `node` 28 | - `pip` 29 | - `sbt` 30 | 31 | ### Custom caches 32 | 33 | Define the cache in the `definitions` section: 34 | 35 | - `path`: identifies where dependencies are stored 36 | - `key` and `files`: the files to check for changes to determine if the cache needs to be recreated. Multiple files can be listed. 37 | 38 | ```yaml 39 | definitions: 40 | caches: 41 | custom-ruby: 42 | path: ruby-backend/vendor/bundle 43 | key: 44 | files: 45 | - ruby-backend/Gemfile.lock 46 | ``` 47 | 48 | Then use the cache by name in a `step`: 49 | 50 | ```yaml 51 | - step: 52 | name: Test Ruby 53 | caches: 54 | - custom-ruby 55 | ``` 56 | 57 | ## References 58 | 59 | - Compressed cache size limit is 1 GB; any larger and the cache will not be uploaded. 60 | 61 | - [Caches - Step Options](https://support.atlassian.com/bitbucket-cloud/docs/step-options/#Caches) 62 | - [Caches - Documentation](https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/) 63 | 64 | ## Using the Exercise Files 65 | 66 | 1. The lesson reuses the repo and exercise files from [01_04 Use Conditional Steps](../01_04_use_conditional_steps/README.md). 67 | 1. Replace the contents of the pipeline configuration with the contents of the `bitbucket-pipelines.yml` from this lesson. 68 | 1. Observe the pipeline activity after committing the new pipeline configuration. 69 | 1. Make a change to a file in `ruby-backend`, such as adding a comment. 70 | 1. Observe the pipeline activity after committing the updated file. 
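Conceptually, `key.files` works like a fingerprint: the listed files are hashed, and when the fingerprint changes the cache is rebuilt. The sketch below illustrates that idea only; it is not Bitbucket's actual implementation:

```python
import hashlib
from pathlib import Path


def cache_key(watched_files):
    """Fingerprint the watched files, mimicking the idea behind `key.files`.

    If any listed file changes, the fingerprint changes, signalling that
    the cached dependencies should be rebuilt.
    """
    digest = hashlib.sha256()
    for path in sorted(watched_files):
        file_path = Path(path)
        digest.update(file_path.name.encode())  # identity of the file
        digest.update(file_path.read_bytes())   # contents of the file
    return digest.hexdigest()
```

With the `custom-ruby` cache above, only `ruby-backend/Gemfile.lock` feeds the key, so commits that touch other files in `ruby-backend` can still reuse the cached bundle.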
71 | 72 | 73 | --- 74 | [← 01_04 Use Conditional Steps](../01_04_use_conditional_steps/README.md) | [01_06 Challenge: Optimize a Workflow in Bitbucket Pipelines →](../01_06_challenge_optimize_a_pipeline/README.md) 75 | 76 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/bitbucket-pipelines.yml: -------------------------------------------------------------------------------- 1 | # Use the latest version of Python 2 | image: python:3 3 | 4 | definitions: 5 | steps: 6 | - step: &integration 7 | name: Run Integration Tests 8 | caches: 9 | - pip 10 | script: 11 | - apt-get update -y 12 | - apt-get install -qq zip 13 | - make requirements 14 | - make check 15 | - make lint 16 | - make test 17 | 18 | - step: &build 19 | name: Build 20 | script: 21 | - apt-get update -y 22 | - apt-get install -qq zip 23 | - make build 24 | artifacts: 25 | - lambda.zip 26 | 27 | - step: &deploy 28 | name: Deploy 29 | caches: 30 | - docker 31 | script: 32 | - apt-get update -y 33 | - apt-get install -qq zip 34 | - python create_function_configuration.py 35 | - cat function_configuration.json 36 | 37 | - pipe: atlassian/aws-lambda-deploy:1.12.0 38 | variables: 39 | FUNCTION_NAME: "$FUNCTION_NAME" 40 | FUNCTION_CONFIGURATION: "./function_configuration.json" 41 | ZIP_FILE: "lambda.zip" 42 | COMMAND: "update" 43 | WAIT: "true" 44 | 45 | - make testdeployment URL=$URL VERSION=$BITBUCKET_COMMIT 46 | 47 | pipelines: 48 | default: 49 | - stage: 50 | name: Integration 51 | condition: 52 | changesets: 53 | includePaths: 54 | - Makefile 55 | - "**/*.py" 56 | - "**/*.json" 57 | - "**/*.html" 58 | - "**/*.txt" 59 | steps: 60 | - step: *integration 61 | - stage: 62 | name: Build 63 | steps: 64 | - step: *build 65 | - stage: 66 | name: Deploy Staging 67 | deployment: Staging 68 | steps: 69 | - step: *build 70 | - step: *deploy 71 | 72 | branches: 73 | main: 74 | - stage: 75 | name: Integration 76 | 
condition: 77 | changesets: 78 | includePaths: 79 | - Makefile 80 | - "**/*.py" 81 | - "**/*.json" 82 | - "**/*.html" 83 | - "**/*.txt" 84 | steps: 85 | - step: *integration 86 | - stage: 87 | name: Build 88 | steps: 89 | - step: *build 90 | - stage: 91 | name: Deploy Production 92 | deployment: Production 93 | trigger: manual 94 | steps: 95 | - step: *build 96 | - step: *deploy 97 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_06_challenge_create_a_custom_pipe/README.md: -------------------------------------------------------------------------------- 1 | # 03_06 Challenge: Develop a Custom Pipe 2 | 3 | ## Challenge Scenario 4 | 5 | The world-renowned Amazing Mobile App is expanding! Their new strategy will allow business partners to integrate with the mobile app directly from their deployment pipelines using APIs. 6 | 7 | The initiative is led by the Amazing Library for Phone Hooks and APIs (ALPHA) Team. The ALPHA Team is planning to base the new integration on credentials that include an API key and a Customer ID. 8 | 9 | Before implementing the API interface, the ALPHA team is asking you to demonstrate how customers can use a pipe to call the ALPHA API. They've asked you to update their code (which was generated using a Python-based Complete Pipe template) to use the `CUSTOMER_ID` parameter instead of the `NAME` parameter. 10 | 11 | All the code you need for this challenge can be found in the exercise files directory for this lesson. 12 | 13 | Review the notes in the following sections for the tools and techniques you’ll use to solve this challenge: 14 | 15 | - [03_02 Develop a Custom Pipe](../03_02_develop_a_custom_pipe/README.md) 16 | - [03_03 Test a custom pipe](../03_03_test_a_custom_pipe/README.md) 17 | 18 | ## Challenge Tasks 19 | 20 | 1. 
**Run the Test Suite**: 21 | 22 | After getting the files in place, run the `pytest` test suite to confirm that the code is working before you make any changes: 23 | 24 | ```bash 25 | pytest --verbose test/test*.py 26 | ``` 27 | 28 | 1. **Explore the Provided Code**: 29 | 30 | Review the following files to see how the `NAME` parameter is being used and tested: 31 | 32 | - `pipe/pipe.py` 33 | - `test/test.py` 34 | 35 | 1. **Update the Code**: 36 | 37 | Complete the following: 38 | 39 | - Replace `NAME` with `CUSTOMER_ID` throughout the `pipe.py` file. 40 | - Adjust any associated variables as needed. 41 | - Update the success message to include the value of `CUSTOMER_ID`. That is, for `CUSTOMER_ID=DEC041906` The message should read as follows: 42 | 43 | ```python 44 | "CUSTOMER_ID DEC041906 processed successfully!" 45 | ``` 46 | 47 | 1. **Update the Tests**: 48 | 49 | Complete the following: 50 | 51 | - Modify `test.py` to replace all instances of `NAME` with `CUSTOMER_ID`. 52 | - Update test cases to validate the new parameter and success message. 53 | 54 | 1. **Validate the Changes**: 55 | 56 | Run the `pytest` test suite again to confirm the changes were applied correctly. 57 | 58 | This challenge should take 10-15 minutes to complete. 59 | 60 | 61 | --- 62 | [← 03_05 Use a custom pipe in a pipeline](../03_05_use_a_custom_pipe_in_a_pipeline/README.md) | [03_07 Solution: Develop a custom pipe →](../03_07_solution_create_a_custom_pipe/README.md) 63 | 64 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | README_FILES := $(shell find . 
-type f -name 'README.md' -not -path './.git/*') 2 | DIRECTORIES := $(shell find $(PWD) -type f -name 'README.md' -exec dirname {} \;) 3 | 4 | hello: 5 | @echo "This makefile has the following tasks:" 6 | @echo "\tlint - lint README files" 7 | @echo "\tspellcheck - spell check README files" 8 | @echo "\tfooter - generate footer links for README files" 9 | @echo "\tpdf - generate PDFs for README files" 10 | @echo "\tclean - remove backup files" 11 | @echo "\tall - run all tasks (except clean)" 12 | 13 | #all: footer lint spellcheck 14 | all: clean footer spellcheck 15 | @echo "Done." 16 | 17 | lint: 18 | #-@docker pull ghcr.io/managedkaos/summarize-markdown-lint:main 19 | #-@docker pull davidanson/markdownlint-cli2:v0.13.0 20 | -@docker run -v $(PWD):/workdir davidanson/markdownlint-cli2:v0.13.0 $(README_FILES) 2>&1 | \ 21 | docker run --interactive ghcr.io/managedkaos/summarize-markdown-lint:main 22 | 23 | rawlint: 24 | -@docker run -v $(PWD):/workdir davidanson/markdownlint-cli2:v0.13.0 $(README_FILES) 2>&1 25 | 26 | spellcheck: 27 | @echo "Spell checking README files..." 28 | @for file in $(README_FILES); do \ 29 | echo "\t$$file"; \ 30 | aspell check --mode=markdown --lang=en $$file; \ 31 | done 32 | 33 | toc: 34 | @echo "Generating table of contents for README files..." 35 | -@docker pull ghcr.io/managedkaos/readme-toc-generator:main 36 | @docker run --rm --volume $(PWD):/data ghcr.io/managedkaos/readme-toc-generator:main 37 | 38 | footer: 39 | @echo "Generating footer links for README files..." 40 | -@docker pull ghcr.io/managedkaos/readme-footer-generator:main 41 | @docker run --rm --volume $(PWD):/data ghcr.io/managedkaos/readme-footer-generator:main 42 | 43 | pdf: $(DIRECTORIES) 44 | 45 | $(DIRECTORIES): 46 | @echo "Processing directory: $@" 47 | @cd $@ && pandoc README.md -o $(notdir $@)-README.pdf 48 | @cd $(PROJECT_HOME) 49 | 50 | wordcount: 51 | @find . -type f -name README.md -exec wc -l {} \; | sort -nr 52 | 53 | chapterlist: 54 | @find . 
-type f -name README.md | sed 's/\/README.md//' | sed 's/\.\///' | sed '/\./d' | sort 55 | 56 | chapterlist-touch: 57 | @cat ./CHAPTER_LIST.txt | while read line; do \ 58 | echo "$$line"; \ 59 | mkdir -p $$line; \ 60 | touch $$line/README.md; \ 61 | done 62 | 63 | overlay: 64 | @find . -type f -name README.md | sort | sed 's/^\.\///' | sed 's/\// > /g' | sed 's/ > README.md//' 65 | 66 | clean: 67 | find . -type f -name \*.pdf -exec rm -vf {} \; 68 | find . -type f -name \*.bak -exec rm -vf {} \; 69 | find . -type f -name \*.new -exec rm -vf {} \; 70 | find . -type d -name .pytest_cache -exec trash {} \; 71 | 72 | nuke: clean 73 | find /tmp/ -type f -name \*.pdf -exec rm -vf {} \; 74 | 75 | .PHONY: hello lint spellcheck toc footer pdf wordcount chapterlist overlay clean nuke $(DIRECTORIES) 76 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/template.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | The Sample Application - {environment} 5 | 59 | 60 | 61 | 

The Sample Application

    Environment   {environment}
    Version       {version}
    Platform      {platform}
    Build #       {build_number}

    GET /        Returns the documentation page for the application.
    GET /data    Returns all data in JSON format.
    GET /{{id}}  Returns a specific item by its ID in JSON format.

[template.html markup (tags, table, and styling) was lost during extraction; only the text content above is recoverable.]

92 | 93 | 94 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_05_challenge_use_pipes_in_a_pipeline/README.md: -------------------------------------------------------------------------------- 1 | # 02_05 Challenge: Use Pipes in a Pipeline 2 | 3 | ## Challenge Scenario 4 | 5 | In this challenge you’re continuing your role as the Bitbucket Pipelines expert supporting the Amazing Mobile App. 6 | 7 | Everyone on the team is excited about using Bitbucket Pipelines to automate their workflows. Management is pleased with the velocity of feature development and it looks like this is going to be an amazing year for the company. 8 | 9 | As planning begins for the next fiscal year, the CFO has come to you and asked if there's a way to get a report of the build minutes being used across all of the company's repositories. Specifically, they need to know which builds are taking the longest amount of time and how many build minutes are being used. They'd also like to capture this information on a regular basis. 10 | 11 | Use your knowledge of Bitbucket Pipelines to automate a solution for the CFO. 12 | 13 | ## Challenge Tasks 14 | 15 | 1. Log into Bitbucket and create a new repository. Add the provided [bitbucket-pipelines.yml](./bitbucket-pipelines.yml) file. 16 | 1. From the **Pipelines** menu, run the pipeline once to enable pipeline settings. 17 | 18 | ![Run initial pipeline, step 1](./images/SCR-20250103-trnw-run-initial-pipeline-1.png) 19 | 20 | ![Run initial pipeline, step 2](./images/SCR-20250103-trwv-run-initial-pipeline-2.png) 21 | 22 | 1. Create a repository access token that you can use to collect the desired information. The token must have the following permissions: 23 | 24 | - `Repositories:Write` 25 | - `Pipelines:Read` 26 | 27 | 1. Store the token as a repository variable named `STATISTICS_ACCESS_TOKEN`. 28 | 1. 
Use the following script element to format the report name: 29 | 30 | ```bash 31 | export FILENAME="builds-statistics-$(date +%Y-%m-%d).txt" 32 | ``` 33 | 34 | 1. Use the following pipes to generate and store the report: 35 | 36 | - [atlassian/bitbucket-build-statistics](https://bitbucket.org/atlassian/bitbucket-build-statistics/src/master/) 37 | 38 | ```yaml 39 | - pipe: atlassian/bitbucket-build-statistics:1.5.3 40 | variables: 41 | BITBUCKET_ACCESS_TOKEN: "$STATISTICS_ACCESS_TOKEN" 42 | FILENAME: "$FILENAME" 43 | ``` 44 | 45 | - [atlassian/bitbucket-upload-file](https://bitbucket.org/atlassian/bitbucket-upload-file/src/master/) 46 | 47 | ```yaml 48 | - pipe: atlassian/bitbucket-upload-file:0.7.4 49 | variables: 50 | BITBUCKET_ACCESS_TOKEN: "$STATISTICS_ACCESS_TOKEN" 51 | FILENAME: "$FILENAME" 52 | ``` 53 | 54 | This challenge should take 10-15 minutes to complete. 55 | 56 | 57 | --- 58 | [← 02_04 Use a pipe to deploy code to AWS Lambda](../02_04_use_a_pipe_to_deploy_code_to_aws_lambda/README.md) | [02_06 Solution: Use Pipes in a Pipeline →](../02_06_solution_use_pipes_in_a_pipeline/README.md) 59 | 60 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_06_challenge_optimize_a_pipeline/cluster_analysis.py: -------------------------------------------------------------------------------- 1 | """ 2 | This script performs clustering analysis on a dataset and generates a report. 3 | 4 | """ 5 | 6 | import os 7 | import json 8 | import pandas as pd 9 | import numpy as np 10 | 11 | # FIXME: These imports need to be added when the analysis code is ready 12 | # import tensorflow as tf 13 | # from sklearn.cluster import KMeans 14 | # from sklearn.metrics import silhouette_score 15 | 16 | 17 | # Step 1: Generate or Load Dataset 18 | def load_dataset(file_path): 19 | """ 20 | Load an existing dataset from a JSON file. 
21 | """ 22 | print("Creating or loading dataset...") 23 | 24 | # FIXME: Add code to download the dataset from Amazing Mobile App HQ 25 | # If no data is in place, generate a random dataset 26 | if not os.path.exists(file_path): 27 | os.makedirs(os.path.dirname(file_path), exist_ok=True) 28 | 29 | data = { 30 | "feature1": np.random.uniform(0, 100, 1000).tolist(), 31 | "feature2": np.random.uniform(0, 100, 1000).tolist(), 32 | "feature3": np.random.randint(0, 50, 1000).tolist(), 33 | "category": np.random.choice(["A", "B", "C"], 1000).tolist(), 34 | } 35 | 36 | with open(file_path, "w") as f: 37 | json.dump(data, f) 38 | print("Dataset created at", file_path) 39 | 40 | else: 41 | print("Dataset exists, loading from", file_path) 42 | 43 | with open(file_path, "r") as f: 44 | data = json.load(f) 45 | 46 | return pd.DataFrame(data) 47 | 48 | 49 | # Step 2: Perform Analysis 50 | def perform_analysis(df): 51 | """ 52 | Perform clustering analysis on the dataset. 53 | """ 54 | print("Performing clustering analysis...") 55 | print("Dataset shape:", df.shape) 56 | 57 | return None, None 58 | 59 | 60 | # Step 3: Generate Report 61 | def generate_report(summary, silhouette_score, output_file): 62 | """ 63 | Generate a report of the clustering analysis. 
64 | """ 65 | print("Generating analysis report...") 66 | 67 | report_content = f""" 68 | # Clustering Analysis Report 69 | 70 | - FIXME: Add analysis report content here 71 | 72 | ## Silhouette Score 73 | 74 | Based on the analysis, the silhouette score is: {silhouette_score} 75 | 76 | ## Cluster Summary Statistics 77 | 78 | - FIXME: Add cluster summary statistics here 79 | """ 80 | 81 | with open(output_file, "w") as f: 82 | f.write(report_content) 83 | print(f"Report generated at {output_file}") 84 | 85 | 86 | # Main Script Execution 87 | if __name__ == "__main__": 88 | dataset_path = "./data/dataset.json" 89 | report_path = "./analysis_report.md" 90 | 91 | # Create or load dataset 92 | dataset = load_dataset(dataset_path) 93 | 94 | # Perform analysis 95 | cluster_summary, silhouette_avg = perform_analysis(dataset) 96 | 97 | # Generate report 98 | generate_report(cluster_summary, silhouette_avg, report_path) 99 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_07_solution_optimize_a_pipeline/cluster_analysis.py: -------------------------------------------------------------------------------- 1 | """ 2 | This script performs clustering analysis on a dataset and generates a report. 3 | 4 | """ 5 | 6 | import os 7 | import json 8 | import pandas as pd 9 | import numpy as np 10 | 11 | # FIXME: These imports need to be added when the analysis code is ready 12 | # import tensorflow as tf 13 | # from sklearn.cluster import KMeans 14 | # from sklearn.metrics import silhouette_score 15 | 16 | 17 | # Step 1: Generate or Load Dataset 18 | def load_dataset(file_path): 19 | """ 20 | Load an existing dataset from a JSON file. 
21 | """ 22 | print("Creating or loading dataset...") 23 | 24 | # FIXME: Add code to download the dataset from Amazing Mobile App HQ 25 | # If no data is in place, generate a random dataset 26 | if not os.path.exists(file_path): 27 | os.makedirs(os.path.dirname(file_path), exist_ok=True) 28 | 29 | data = { 30 | "feature1": np.random.uniform(0, 100, 1000).tolist(), 31 | "feature2": np.random.uniform(0, 100, 1000).tolist(), 32 | "feature3": np.random.randint(0, 50, 1000).tolist(), 33 | "category": np.random.choice(["A", "B", "C"], 1000).tolist(), 34 | } 35 | 36 | with open(file_path, "w") as f: 37 | json.dump(data, f) 38 | print("Dataset created at", file_path) 39 | 40 | else: 41 | print("Dataset exists, loading from", file_path) 42 | 43 | with open(file_path, "r") as f: 44 | data = json.load(f) 45 | 46 | return pd.DataFrame(data) 47 | 48 | 49 | # Step 2: Perform Analysis 50 | def perform_analysis(df): 51 | """ 52 | Perform clustering analysis on the dataset. 53 | """ 54 | print("Performing clustering analysis...") 55 | print("Dataset shape:", df.shape) 56 | 57 | return None, None 58 | 59 | 60 | # Step 3: Generate Report 61 | def generate_report(summary, silhouette_score, output_file): 62 | """ 63 | Generate a report of the clustering analysis. 
64 | """ 65 | print("Generating analysis report...") 66 | 67 | report_content = f""" 68 | # Clustering Analysis Report 69 | 70 | - FIXME: Add analysis report content here 71 | 72 | ## Silhouette Score 73 | 74 | Based on the analysis, the silhouette score is: {silhouette_score} 75 | 76 | ## Cluster Summary Statistics 77 | 78 | - FIXME: Add cluster summary statistics here 79 | """ 80 | 81 | with open(output_file, "w") as f: 82 | f.write(report_content) 83 | print(f"Report generated at {output_file}") 84 | 85 | 86 | # Main Script Execution 87 | if __name__ == "__main__": 88 | dataset_path = "./data/dataset.json" 89 | report_path = "./analysis_report.md" 90 | 91 | # Create or load dataset 92 | dataset = load_dataset(dataset_path) 93 | 94 | # Perform analysis 95 | cluster_summary, silhouette_avg = perform_analysis(dataset) 96 | 97 | # Generate report 98 | generate_report(cluster_summary, silhouette_avg, report_path) 99 | -------------------------------------------------------------------------------- /ch1_pipeline_optimizations/01_04_use_conditional_steps/README.md: -------------------------------------------------------------------------------- 1 | # 01_04 Use Conditional Steps 2 | 3 | - **Monorepos** are single repositories with code from multiple projects 4 | - Inefficient CI/CD for monorepos can cause long pipeline runtimes and waste build minutes processing code that hasn't changed 5 | 6 | ## Using conditions to skip stages and steps 7 | 8 | - [Pipeline Flow Control](https://support.atlassian.com/bitbucket-cloud/docs/step-options/#Flow-control) 9 | 10 | Bitbucket allows us to use a `condition` directive to run portions of a pipeline if certain files have changed. 
11 | 12 | - Use [glob patterns](https://support.atlassian.com/bitbucket-cloud/docs/use-glob-patterns-on-the-pipelines-yaml-file/) to target individual files or directories 13 | - If the event that triggered the pipeline includes changes to the indicated files or files inside the indicated directory, then the stage or step will be run. Otherwise it gets skipped completely. 14 | 15 | ### Conditions in stages 16 | 17 | ```yaml 18 | - stage: 19 | name: Cloud Deployment 20 | condition: 21 | changesets: 22 | includePaths: 23 | - "infrastructure/*.tf" 24 | ``` 25 | 26 | ### Conditions in steps 27 | 28 | ```yaml 29 | - step: 30 | name: Test Python 31 | condition: 32 | changesets: 33 | includePaths: 34 | - "python-backend/**" 35 | ``` 36 | 37 | ## Working with conditional pipelines 38 | 39 | - This directory contains the following files that represent a monorepo. 40 | - Use them to experiment with applying conditionals in a pipeline 41 | 42 | ```text 43 | ├── README.md 44 | ├── bitbucket-pipelines.yml 45 | ├── bitbucket-pipelines-no-conditional-steps.yml 46 | ├── bitbucket-pipelines-with-conditional-steps.yml 47 | ├── python-backend 48 | │   ├── app.py 49 | │   ├── app_test.py 50 | │   └── requirements.txt 51 | ├── ruby-backend 52 | │   ├── app.rb 53 | │   ├── app_test.rb 54 | │   ├── Gemfile 55 | │   └── Gemfile.lock 56 | └── shared 57 | └── database-schema.json 58 | ``` 59 | 60 | ## Using the Exercise Files 61 | 62 | 1. Create a new repo and add the exercise files 63 | 1. Adding the files may not trigger the pipeline right away. Go to the **Pipelines** menu and select **Run initial pipeline**. 64 | 1. The initial pipeline does not include conditional steps. 65 | 1. Replace the initial pipeline with the contents of `bitbucket-pipelines-no-conditional-steps.yml`. 66 | 1. Observe the pipeline activity after committing the new pipeline configuration. 67 | 1. Make a change to a file in either `python-backend` or `ruby-backend`, such as adding a comment. 68 | 1. 
Observe the pipeline activity after committing the updated file. 69 | 1. Make a change to `database-schema.json` in the `shared` directory. 70 | 1. Observe the pipeline activity after committing the updated file. 71 | 72 | 73 | --- 74 | [← 01_03 Configure resource allocation](../01_03_configure_resource_allocation/README.md) | [01_05 Cache Dependencies →](../01_05_cache_dependencies/README.md) 75 | 76 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner/README.md: -------------------------------------------------------------------------------- 1 | # 04_07 Challenge: Deploy a Self-Hosted Runner 2 | 3 | ## Challenge Scenario 4 | 5 | In this challenge you’re continuing your role as the Bitbucket Pipelines expert supporting the Amazing Mobile App. 6 | 7 | The Amazing Mobile App’s DevOps team needs to run a deployment script that accesses sensitive data stored in the company's AWS account. The deployment script needs to access AWS Parameter Store to retrieve the path to the file that contains the data. 8 | 9 | They're concerned about the risk of enabling Bitbucket's runners to have access to private networks and would like to develop a solution that uses their own, self-hosted runners. 10 | 11 | Before building out their solution, they've asked you to develop a proof of concept that shows how a workflow running in Bitbucket Pipelines can access protected resources in AWS using a self-hosted runner. 12 | 13 | ## Challenge Tasks 14 | 15 | Complete the following steps to solve this challenge: 16 | 17 | 1. Use the exercise files to deploy resources that model the needs of the DevOps team. 18 | 19 | **NOTE: If you are following along with the course and have deployed the resources described in [04_04 Deploy an EC2 Server in AWS](../04_04_deploy_an_ec2_server_in_aws/README.md), then you can skip this step. 
Those resources can be reused for this challenge.** 20 | 21 | ![AWS Resources](./images/cfn-designer.png) 22 | 23 | The resources include: 24 | 25 | 1. A parameter store location that contains the path to a data file 26 | 1. An IAM policy and role that allow an EC2 instance to read the parameter 27 | 1. An EC2 instance that assumes the role and can be used as a self-hosted runner 28 | 29 | 1. Deploy a runner on the EC2 instance using the runner type that is most applicable to the needs of the DevOps team. 30 | 31 | **NOTE: If you are following along with the course and have deployed the resources described in [04_05 Install Runners in a Workspace](../04_05_install_runners_in_a_workspace/README.md), then you can skip this step. Those runners can be reused for this challenge.** 32 | 33 | - What type of runner should you deploy: **Linux Shell** or **Linux Docker**? 34 | - Why is the runner you selected the right choice for your solution? 35 | 36 | 1. Create a new repository, add the exercise files, and create repository variables. This includes: 37 | 38 | 1. A script that will read a Parameter Store location and then read a data file. 39 | 1. A pipeline configuration that schedules a step to run on a self-hosted runner and then runs the script. 40 | 1. A repository variable named `AWS_REGION` that contains the region where the runner was deployed. 41 | 42 | 1. Confirm that the runner you deployed and the repository configuration validate the proof of concept. 43 | 44 | This challenge should take 25-30 minutes to complete. 
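For the pipeline configuration mentioned in the challenge tasks, Bitbucket schedules a step onto self-hosted runners with the `runs-on` step option. Here is a minimal sketch; the labels shown are Bitbucket's defaults for a Linux Docker runner, and the script name is a placeholder rather than one of the exercise files:

```yaml
pipelines:
  default:
    - step:
        name: Read protected data via a self-hosted runner
        # Every label listed here must match a label assigned to your runner;
        # self.hosted and linux are applied to Linux Docker runners by default.
        runs-on:
          - self.hosted
          - linux
        script:
          - python3 read-protected-data.py  # placeholder script name
```

If no online runner carries all of the labels listed under `runs-on`, the step will not be picked up, so double-check the labels in your runner settings before running the pipeline.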
45 | 46 | 47 | --- 48 | [← 04_06 Use Self-Hosted Runners in a Pipeline](../04_06_use_self_hosted_runners_in_a_pipeline/README.md) | [04_08 Solution: Deploy a Self-Hosted Runner →](../04_08_solution_deploy_a_self_hosted_runner/README.md) 49 | 50 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/Makefile: -------------------------------------------------------------------------------- 1 | FUNCTION=undefined 2 | PLATFORM=undefined 3 | URL=undefined 4 | VERSION=undefined 5 | BUILD_NUMBER=undefined 6 | CODE=$(shell ls *.py) 7 | 8 | ifneq (,$(findstring -staging,$(FUNCTION))) 9 | ENVIRONMENT = STAGING 10 | else ifneq (,$(findstring -production,$(FUNCTION))) 11 | ENVIRONMENT = PRODUCTION 12 | else 13 | ENVIRONMENT = undefined 14 | endif 15 | 16 | hello: 17 | @echo "Here are the targets for this Makefile:" 18 | @echo " requirements - install the project requirements" 19 | @echo " lint - run linters on the code" 20 | @echo " black - run black to format the code" 21 | @echo " test - run the tests" 22 | @echo " build - build the lambda.zip file" 23 | @echo " deploy - deploy the lambda.zip file to AWS" 24 | @echo " testdeployment - test the deployment" 25 | @echo " clean - remove the lambda.zip file" 26 | @echo " all - clean, lint, black, test, build, and deploy" 27 | @echo 28 | @echo 29 | @echo "You must set the FUNCTION variable to use the deploy target." 30 | @echo "FUNCTION must be set to the name of an existing lambda function to update." 
31 | @echo "For example:" 32 | @echo 33 | @echo " make deploy FUNCTION=sample-application-staging" 34 | @echo 35 | @echo "Optional deploy variables are:" 36 | @echo " VERSION - the version of the code being deployed (default: undefined)" 37 | @echo " PLATFORM - the platform being used for the deployment (default: undefined)" 38 | @echo " BUILD_NUMBER - the build number assigned by the deployment platform (default: undefined)" 39 | @echo " URL - the URL to use for testing the deployment (default: undefined)" 40 | @echo 41 | 42 | requirements: 43 | pip install -U pip 44 | pip install --requirement requirements.txt 45 | 46 | check: 47 | set 48 | zip --version 49 | python --version 50 | pylint --version 51 | flake8 --version 52 | aws --version 53 | 54 | lint: 55 | pylint --exit-zero --errors-only --disable=C0301 --disable=C0326 --disable=R,C $(CODE) 56 | flake8 --exit-zero --ignore=E501,E231 $(CODE) 57 | 58 | 59 | black: 60 | black --diff $(CODE) 61 | 62 | test: 63 | python -m unittest -v index_test 64 | 65 | build: 66 | zip lambda.zip index.py data.json template.html 67 | 68 | deploy: 69 | aws sts get-caller-identity 70 | 71 | aws lambda wait function-active \ 72 | --function-name="$(FUNCTION)" 73 | 74 | aws lambda update-function-configuration \ 75 | --function-name="$(FUNCTION)" \ 76 | --environment "Variables={PLATFORM=$(PLATFORM),VERSION=$(VERSION),BUILD_NUMBER=$(BUILD_NUMBER),ENVIRONMENT=$(ENVIRONMENT)}" 77 | 78 | aws lambda wait function-updated \ 79 | --function-name="$(FUNCTION)" 80 | 81 | aws lambda update-function-code \ 82 | --function-name="$(FUNCTION)" \ 83 | --zip-file=fileb://lambda.zip 84 | 85 | aws lambda wait function-updated \ 86 | --function-name="$(FUNCTION)" 87 | 88 | testdeployment: 89 | curl -s $(URL) | grep $(VERSION) 90 | 91 | clean: 92 | rm -vf lambda.zip 93 | 94 | all: clean lint black test build deploy 95 | 96 | .PHONY: test build deploy all clean 97 | -------------------------------------------------------------------------------- 
/ch2_using_pipes_in_pipelines/02_04_use_a_pipe_to_deploy_code_to_aws_lambda/index.py: -------------------------------------------------------------------------------- 1 | """ 2 | An API that returns data from a JSON file. 3 | 4 | """ 5 | 6 | import os 7 | import json 8 | 9 | 10 | def handler(event, context): 11 | """ 12 | The lambda handler function. 13 | """ 14 | del context # Unused 15 | 16 | if event["rawPath"] == "/": 17 | environment = os.environ.get("ENVIRONMENT", "undefined") 18 | platform = os.environ.get("PLATFORM", "undefined") 19 | version = os.environ.get("VERSION", "undefined") 20 | build_number = os.environ.get("BUILD_NUMBER", "undefined") 21 | 22 | # Read the HTML template 23 | with open("template.html", mode="r", encoding="utf-8") as template_file: 24 | template = template_file.read() 25 | 26 | # Render the template 27 | docs_page = template.format( 28 | environment=environment, 29 | version=version, 30 | platform=platform, 31 | build_number=build_number, 32 | ) 33 | 34 | # Return the rendered template 35 | return { 36 | "statusCode": 200, 37 | "headers": { 38 | "Content-Type": "text/html", 39 | }, 40 | "body": docs_page, 41 | } 42 | 43 | # Read the data from the JSON file 44 | with open("data.json", mode="r", encoding="utf-8") as data_file: 45 | data = json.load(data_file) 46 | 47 | # Route /data: return all data 48 | if event["rawPath"] == "/data": 49 | return { 50 | "statusCode": 200, 51 | "headers": { 52 | "Content-Type": "application/json", 53 | }, 54 | "body": json.dumps(data), 55 | } 56 | 57 | # Route /data/{item_id}: return a single item 58 | # Get the item_id from the event's rawPath 59 | item_id = event["rawPath"][1:] 60 | 61 | # Check the item_id against each item in the data 62 | for item in data: 63 | # Return the item if the item_id matches 64 | if item["id"] == item_id: 65 | return { 66 | "statusCode": 200, 67 | "headers": { 68 | "Content-Type": "application/json", 69 | }, 70 | "body": json.dumps(item), 71 | } 72 | 73 | # Return a 404 
if the item_id doesn't match any item 74 | return { 75 | "statusCode": 404, 76 | "headers": { 77 | "Content-Type": "application/json", 78 | }, 79 | "body": json.dumps( 80 | { 81 | "message": f"item_id {item_id} not found", 82 | "event": event, 83 | "item_id": item_id, 84 | } 85 | ), 86 | } 87 | 88 | 89 | def main(): 90 | """ 91 | A main function for testing the handler function. 92 | """ 93 | # Simulate an event for testing 94 | event = {"rawPath": "/1"} 95 | context = None 96 | 97 | # Call the handler function 98 | response = handler(event, context) 99 | 100 | # Print the response 101 | print(json.dumps(response, indent=2)) 102 | 103 | 104 | if __name__ == "__main__": 105 | main() 106 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | Copyright 2025 LinkedIn Corporation 2 | All Rights Reserved. 3 | 4 | Licensed under the LinkedIn Learning Exercise File License (the "License"). 5 | See LICENSE in the project root for license information. 
6 | 7 | ATTRIBUTIONS: 8 | 9 | -------------------- 10 | **pg** 11 | https://github.com/ged/ruby-pg 12 | License: BSD-2-Clause 13 | https://opensource.org/licenses/BSD-2-Clause 14 | -------------------- 15 | **minitest** 16 | https://github.com/seattlerb/minitest 17 | License: MIT 18 | https://opensource.org/licenses/MIT 19 | -------------------- 20 | **psycopg2** 21 | https://github.com/psycopg/psycopg2 22 | License: LGPL-3.0 23 | https://opensource.org/licenses/LGPL-3.0 24 | -------------------- 25 | **black** 26 | https://github.com/psf/black 27 | License: MIT 28 | https://opensource.org/licenses/MIT 29 | -------------------- 30 | **flake8** 31 | https://github.com/PyCQA/flake8 32 | License: MIT 33 | https://opensource.org/licenses/MIT 34 | -------------------- 35 | **pytest** 36 | https://github.com/pytest-dev/pytest 37 | License: MIT 38 | https://opensource.org/licenses/MIT 39 | -------------------- 40 | **numpy** 41 | https://github.com/numpy/numpy 42 | License: BSD-3-Clause 43 | https://opensource.org/licenses/BSD-3-Clause 44 | -------------------- 45 | **pandas** 46 | https://github.com/pandas-dev/pandas 47 | License: BSD-3-Clause 48 | https://opensource.org/licenses/BSD-3-Clause 49 | -------------------- 50 | **tensorflow** 51 | https://github.com/tensorflow/tensorflow 52 | License: Apache 2.0 53 | https://www.apache.org/licenses/LICENSE-2.0 54 | -------------------- 55 | **scikit-learn** 56 | https://github.com/scikit-learn/scikit-learn 57 | License: BSD-3-Clause 58 | https://opensource.org/licenses/BSD-3-Clause 59 | -------------------- 60 | **pylint** 61 | https://github.com/PyCQA/pylint 62 | License: GPL-2.0 63 | https://opensource.org/licenses/GPL-2.0 64 | -------------------- 65 | **awscli** 66 | https://github.com/aws/aws-cli 67 | License: Apache 2.0 68 | https://www.apache.org/licenses/LICENSE-2.0 69 | -------------------- 70 | **bitbucket-pipes-toolkit** 71 | https://bitbucket.org/atlassian/bitbucket-pipes-toolkit 72 | License: Apache 2.0 73 | 
https://www.apache.org/licenses/LICENSE-2.0 74 | -------------------- 75 | **boto3** 76 | https://github.com/boto/boto3 77 | License: Apache 2.0 78 | https://www.apache.org/licenses/LICENSE-2.0 79 | -------------------- 80 | **setuptools** 81 | https://github.com/pypa/setuptools 82 | License: MIT 83 | https://opensource.org/licenses/MIT 84 | -------------------- 85 | **psutil** 86 | https://github.com/giampaolo/psutil 87 | License: BSD-3-Clause 88 | https://opensource.org/licenses/BSD-3-Clause 89 | -------------------- 90 | **py-cpuinfo** 91 | https://github.com/workhorsy/py-cpuinfo 92 | License: MIT 93 | https://opensource.org/licenses/MIT 94 | -------------------- 95 | 96 | Please note, this project may automatically load third party code from external 97 | repositories (for example, NPM modules, Composer packages, or other dependencies). 98 | If so, such third party code may be subject to other license terms than as set 99 | forth above. In addition, such third party code may also depend on and load 100 | multiple tiers of dependencies. Please review the applicable licenses of the 101 | additional dependencies. 102 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Advanced Bitbucket Pipelines: Automating Deployments & Managing Third Party Integrations 2 | 3 | This is the repository for the LinkedIn Learning course `Advanced Bitbucket Pipelines: Automating Deployments & Managing Third Party Integrations`. The full course is available from [LinkedIn Learning][lil-course-url]. 4 | 5 | ![lil-thumbnail-url] 6 | 7 | ## Course Description 8 | 9 |

In this course, software engineer Michael Jenkins guides you through advanced Bitbucket Pipelines topics for your software projects. Gain a solid understanding of how to develop optimized pipelines that automate deployments, and dive into integrating Bitbucket Pipelines with third-party services in your environment.

10 | Check out this course to find out how you can solve complex CI/CD challenges using Bitbucket Pipelines to reduce build times, leverage the power of pipeline components, and deploy self-hosted runners for more control over your build environments.

11 | 12 | _See the readme file in the main branch for updated instructions and information._ 13 | 14 | ## Message from the instructor 15 | 16 | Hi! I'm glad that you're investing in yourself to learn these advanced topics for Bitbucket Pipelines. 17 | 18 | I'm a firm believer in the power of hands-on learning so the exercise files are packed with demos and challenges that you can use to follow along. 19 | 20 | > [!IMPORTANT] 21 | > Running the demos and challenges _will_ use up quite a few build minutes. If you're using a free account, I highly encourage you to invest in Overage Protection that will add minutes as needed if you go over the 50 minutes per month limit. 22 | > 23 | > You can find more information on the Overage Protection here: [Manage your plan and billing](https://support.atlassian.com/bitbucket-cloud/docs/manage-your-plan-and-billing/) 24 | 25 | I had a great time putting this course together and I hope you enjoy it just as much. 26 | 27 | Cheers! 28 | 29 | - [Michael J.](https://www.linkedin.com/in/michaelpjenkins/) 30 | 31 | ## Instructions 32 | 33 | This repository has directories for each of the videos in the course. 34 | 35 | Download this repo as a zip file and reference the files on your local system. 36 | 37 | In most cases, you will be creating a new repo in Bitbucket and adding specific exercise files to that repo. 38 | 39 | ## Software 40 | 41 | To use these exercise files, you must have the following installed: 42 | 43 | - [Git](https://git-scm.com/) 44 | - [Node.js](https://nodejs.org/) 45 | - [npm](https://www.npmjs.com/) 46 | - [Docker](https://www.docker.com/) 47 | - [Python](https://www.python.org/) 48 | 49 | ## Instructor 50 | 51 | Michael Jenkins 52 | 53 | Senior Systems Engineer 54 | 55 | Check out my other courses on [LinkedIn Learning](https://www.linkedin.com/learning/instructors/michael-jenkins?u=104). 
56 | 57 | [0]: # (Replace these placeholder URLs with actual course URLs) 58 | 59 | [lil-course-url]: https://www.linkedin.com/learning/advanced-bitbucket-pipelines-automating-deployments-and-managing-third-party-integrations 60 | [lil-thumbnail-url]: https://media.licdn.com/dms/image/v2/D4E0DAQEtb5QOp2cq9w/learning-public-crop_675_1200/B4EZT7Ov5NGYAY-/0/1739381735658?e=2147483647&v=beta&t=k03SgGqwASFU3H6ljtjnue6-e7ogrCnApwhMcDZMvz8 61 | 62 | 63 | --- 64 | [00_01 Introduction →](ch0_intro/00_01_introduction/README.md) 65 | 66 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_07_challenge_deploy_a_self_hosted_runner/get-runner-details.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import psutil 3 | import cpuinfo 4 | import platform 5 | import pkg_resources 6 | from datetime import datetime 7 | 8 | 9 | def get_size(bytes, suffix="B"): 10 | """ 11 | Scale bytes to its proper format 12 | e.g: 13 | 1253656 => '1.20MB' 14 | 1253656678 => '1.17GB' 15 | """ 16 | factor = 1024 17 | for unit in ["", "K", "M", "G", "T", "P"]: 18 | if bytes < factor: 19 | return f"{bytes:.2f}{unit}{suffix}" 20 | bytes /= factor 21 | 22 | 23 | def system_information(): 24 | info = [] # List to hold information tuples 25 | 26 | # System Information 27 | uname = platform.uname() 28 | info.append(("System", uname.system)) 29 | info.append(("Node Name", uname.node)) 30 | info.append(("Release", uname.release)) 31 | info.append(("Version", uname.version)) 32 | info.append(("Machine", uname.machine)) 33 | info.append(("Processor", uname.processor)) 34 | info.append(("Processor Brand", cpuinfo.get_cpu_info()["brand_raw"])) 35 | 36 | # CPU Information 37 | info.append(("Physical cores", psutil.cpu_count(logical=False))) 38 | 39 | # Memory Information 40 | svmem = psutil.virtual_memory() 41 | info.append(("Total Memory", get_size(svmem.total))) 42 | 43 | # Disk Information 44 | 
partitions = psutil.disk_partitions() 45 | for partition in partitions: 46 | if partition.mountpoint in ["/", "/boot"]: 47 | try: 48 | partition_usage = psutil.disk_usage(partition.mountpoint) 49 | info.append( 50 | ( 51 | f"Disk: {partition.device} Total Size", 52 | get_size(partition_usage.total), 53 | ) 54 | ) 55 | except PermissionError: 56 | # Skip the partition if permission error 57 | continue 58 | 59 | return info 60 | 61 | 62 | def python_information(): 63 | """Fetches details about the Python environment.""" 64 | info = [] 65 | # Python Executable 66 | info.append(("Python Executable", sys.executable)) 67 | 68 | # Site-Packages Location 69 | info.append(("Site-Packages Location", sys.prefix)) 70 | 71 | # Installed Packages 72 | installed_packages = [(d.project_name, d.version) for d in pkg_resources.working_set] 73 | info.append(("Installed Packages", installed_packages)) 74 | 75 | return info 76 | 77 | if __name__ == "__main__": 78 | print("System Information Report") 79 | print("Generated on:", datetime.now().strftime("%Y-%m-%d %H:%M:%S")) 80 | print("=" * 40) 81 | 82 | system_info = system_information() 83 | python_info = python_information() 84 | 85 | for label, value in system_info: 86 | print(f"{label:<25}: {value}") 87 | 88 | print("=" * 40) 89 | print("Python Environment Report") 90 | print("=" * 40) 91 | 92 | for label, value in python_info: 93 | if label == "Installed Packages": 94 | print(f"{label:<25}:") 95 | for pkg_name, pkg_version in value: 96 | print(f" - {pkg_name} ({pkg_version})") 97 | else: 98 | print(f"{label:<25}: {value}") 99 | 100 | print("=" * 40) 101 | 102 | -------------------------------------------------------------------------------- /ch4_self_hosted_runners/04_08_solution_deploy_a_self_hosted_runner/get-runner-details.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import psutil 3 | import cpuinfo 4 | import platform 5 | import pkg_resources 6 | from datetime import 
datetime 7 | 8 | 9 | def get_size(bytes, suffix="B"): 10 | """ 11 | Scale bytes to its proper format 12 | e.g: 13 | 1253656 => '1.20MB' 14 | 1253656678 => '1.17GB' 15 | """ 16 | factor = 1024 17 | for unit in ["", "K", "M", "G", "T", "P"]: 18 | if bytes < factor: 19 | return f"{bytes:.2f}{unit}{suffix}" 20 | bytes /= factor 21 | 22 | 23 | def system_information(): 24 | info = [] # List to hold information tuples 25 | 26 | # System Information 27 | uname = platform.uname() 28 | info.append(("System", uname.system)) 29 | info.append(("Node Name", uname.node)) 30 | info.append(("Release", uname.release)) 31 | info.append(("Version", uname.version)) 32 | info.append(("Machine", uname.machine)) 33 | info.append(("Processor", uname.processor)) 34 | info.append(("Processor Brand", cpuinfo.get_cpu_info()["brand_raw"])) 35 | 36 | # CPU Information 37 | info.append(("Physical cores", psutil.cpu_count(logical=False))) 38 | 39 | # Memory Information 40 | svmem = psutil.virtual_memory() 41 | info.append(("Total Memory", get_size(svmem.total))) 42 | 43 | # Disk Information 44 | partitions = psutil.disk_partitions() 45 | for partition in partitions: 46 | if partition.mountpoint in ["/", "/boot"]: 47 | try: 48 | partition_usage = psutil.disk_usage(partition.mountpoint) 49 | info.append( 50 | ( 51 | f"Disk: {partition.device} Total Size", 52 | get_size(partition_usage.total), 53 | ) 54 | ) 55 | except PermissionError: 56 | # Skip the partition if permission error 57 | continue 58 | 59 | return info 60 | 61 | 62 | def python_information(): 63 | """Fetches details about the Python environment.""" 64 | info = [] 65 | # Python Executable 66 | info.append(("Python Executable", sys.executable)) 67 | 68 | # Site-Packages Location 69 | info.append(("Site-Packages Location", sys.prefix)) 70 | 71 | # Installed Packages 72 | installed_packages = [(d.project_name, d.version) for d in pkg_resources.working_set] 73 | info.append(("Installed Packages", installed_packages)) 74 | 75 | return info 
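One caveat about the script above: `pkg_resources` is deprecated in recent setuptools releases. On Python 3.8+, the same installed-package listing can be produced with the standard library's `importlib.metadata` instead — a sketch of that replacement, not part of the exercise files:

```python
from importlib.metadata import distributions


def installed_packages():
    """Return sorted (name, version) pairs for every installed distribution."""
    # distributions() yields one entry per installed package, replacing the
    # deprecated pkg_resources.working_set iteration; "or ''" guards the rare
    # case of a distribution with missing Name metadata.
    return sorted(
        (dist.metadata["Name"] or "", dist.version) for dist in distributions()
    )


if __name__ == "__main__":
    for name, version in installed_packages():
        print(f" - {name} ({version})")
```

The output format matches the "Installed Packages" section printed by the course script, so it can be dropped in without changing the report layout.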
76 | 77 | if __name__ == "__main__": 78 | print("System Information Report") 79 | print("Generated on:", datetime.now().strftime("%Y-%m-%d %H:%M:%S")) 80 | print("=" * 40) 81 | 82 | system_info = system_information() 83 | python_info = python_information() 84 | 85 | for label, value in system_info: 86 | print(f"{label:<25}: {value}") 87 | 88 | print("=" * 40) 89 | print("Python Environment Report") 90 | print("=" * 40) 91 | 92 | for label, value in python_info: 93 | if label == "Installed Packages": 94 | print(f"{label:<25}:") 95 | for pkg_name, pkg_version in value: 96 | print(f" - {pkg_name} ({pkg_version})") 97 | else: 98 | print(f"{label:<25}: {value}") 99 | 100 | print("=" * 40) 101 | 102 | -------------------------------------------------------------------------------- /ch2_using_pipes_in_pipelines/02_03_deploy_lambda_functions_in_aws/README.md: -------------------------------------------------------------------------------- 1 | # 02_03 Deploy Lambda Functions in AWS 2 | 3 | In this lab you will use a CloudFormation template to deploy the AWS resources that will be used in the future lessons. 4 | 5 | ![resource diagram](./images/00-resource-diagram.png) 6 | 7 | ## Prerequisites 8 | 9 | Having the following items in place before starting this lab will help you have a smooth experience. 10 | 11 | 1. [Atlassian and Bitbucket accounts](https://bitbucket.org/product) are required to host the code for the sample application. 12 | 1. An [Amazon Web Services account](https://aws.amazon.com/free/) is needed to deploy the sample application used for the deployment target. 13 | 1. The exercise files for this lesson should be downloaded and accessible on your local system. 14 | 15 | ## Deploy the CloudFormation Template 16 | 17 | 1. Log into your AWS account. Access the CloudFormation homepage and select **Create stack**. 18 | 19 | ![create stack](./images/01-create-stack.png) 20 | 21 | 1. 
On the create stack screen, select **Upload a template file** and then select **Choose file**. 22 | 23 | Using your system's file finder, browse to the location where the CloudFormation template is located. 24 | 25 | Select the file [AWS-Lambda-CloudFormation.yml](./AWS-Lambda-CloudFormation.yml). 26 | 27 | Select **Next**. 28 | 29 | ![upload template](./images/02-upload-template.png) 30 | 31 | 1. On the "**Specify stack details**" screen, **enter a stack name**. Select **Next**. 32 | 33 | ![enter stack name](./images/03-enter-stack-name.png) 34 | 35 | 1. On the "**Stack options**" screen, scroll down to the very bottom. 36 | 37 | Acknowledge that AWS CloudFormation might create IAM resources with custom names, and then select **Next**. 38 | 39 | ![acknowledge IAM resources](./images/04-acknowledge-iam-resources.png) 40 | 41 | 1. On the "Review and create" screen, scroll down to the bottom and select **Submit**. 42 | 43 | ![submit](./images/05-submit.png) 44 | 45 | 1. Selecting submit starts the stack creation process. 46 | 47 | ![create in progress](./images/06-create-in-progress.png) 48 | 49 | Wait 2-3 minutes for the stack deployment to complete. 50 | 51 | ![create complete](./images/07-create-complete.png) 52 | 53 | 1. After the creation is complete, select the **Outputs** tab to see the values for: 54 | 55 | - AwsAccessKeyId [^1] 56 | - AwsDefaultRegion 57 | - AwsSecretAccessKey 58 | - ProductionFunctionName 59 | - ProductionURL 60 | - StagingFunctionName 61 | - StagingURL 62 | 63 | > [!WARNING] The values for `AwsAccessKeyId` and `AwsSecretAccessKey` are sensitive. Treat them as you would a username and password. Do not store them in publicly available repositories, files, or documents. 64 | 65 | ![outputs](./images/08-outputs.png) 66 | 67 | 1. To validate the deployment, open the links for the **ProductionURL** and **StagingURL**. 
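The same validation can also be done from a terminal with `curl`; a sketch, where the URL is a placeholder you would replace with the **StagingURL** or **ProductionURL** value from your own stack outputs:

```shell
# Print only the HTTP status code; a healthy deployment returns 200
curl -s -o /dev/null -w "%{http_code}\n" "https://<your-staging-url>.lambda-url.us-east-1.on.aws/"
```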
68 | 69 | ![production url](./images/09-production-url.png) 70 | 71 | ![staging url](./images/10-staging-url.png) 72 | 73 | [^1]: WARNING! The values for `AwsAccessKeyId` and `AwsSecretAccessKey` are sensitive. Treat them as you would a username and password. Do not store them in publicly available repositories, files, or documents. 74 | 75 | 76 | --- 77 | [← 02_02 Use a pipe in a pipeline configuration](../02_02_use_a_pipe_in_a_pipeline_configuration/README.md) | [02_04 Use a pipe to deploy code to AWS Lambda →](../02_04_use_a_pipe_to_deploy_code_to_aws_lambda/README.md) 78 | 79 | -------------------------------------------------------------------------------- /ch0_intro/00_02_what_you_should_know/README.md: -------------------------------------------------------------------------------- 1 | # 00_02 What You Should Know 2 | 3 | To get the most out of this course, there are a few things you should be familiar with. You should also know that this course has built in features to help you out along the way. 4 | 5 | ## Git and Bitbucket 6 | 7 | You should already be familiar with using the following tools: 8 | 9 | - **Git**, a free, open-source version control system that allows users to store code, track changes, and collaborate on projects 10 | - **Bitbucket**, a cloud-based service for hosting git repositories. 11 | 12 | You should already know how to create the following using Bitbucket: 13 | 14 | - Workspaces 15 | - Projects 16 | - Repositories 17 | - Branches 18 | - Pull Requests 19 | - Pipelines 20 | 21 | This course will share some basics and reviews along the way but if you’re unfamiliar with Git or Bitbucket, you can use the following LinkedIn Learning courses to get up to speed or as references. 
22 | 23 | - [Git Essential Training](https://www.linkedin.com/learning/git-essential-training-19417064/get-started-with-git) 24 | 25 | - [Learning Bitbucket](https://www.linkedin.com/learning/learning-bitbucket/streamline-your-code-and-collaboration-with-bitbucket) 26 | 27 | - [Bitbucket Pipelines for CI/CD](https://www.linkedin.com/learning/bitbucket-pipelines-for-ci-cd/from-commit-to-deployment-with-bitbucket-pipelines) 28 | 29 | ## Amazon Web Services (AWS) Account 30 | 31 | If you’re going to follow along by completing the challenges described in the course, you’ll also need an account with **Amazon Web Services**. 32 | 33 | Browse to [aws.amazon.com/free](https://aws.amazon.com/free) and follow the steps there to create an account using the AWS free tier. 34 | 35 | ## Exercise Files 36 | 37 | Exercise files are available for your use and reference. A link to the files is on the homepage of the course. 38 | 39 | The files contain: 40 | 41 | - notes and additional information 42 | - examples for the material we’ll be covering in the course 43 | - and maybe even some trivia and shenanigans. 44 | 45 | If you’re looking for a reference to something in the course, you’ll most likely find it in the exercise files. 46 | 47 | ## Questions and Answers 48 | 49 | If you get stuck on something and you need a bit more help than you can find in the exercise files, please use the **Course Q&A** section. 50 | 51 | Ask your question there and I’ll do my best to help you out. If other folks have gotten stuck in the same place, they might join in the discussion and share their solution. 52 | 53 | Also, please check out the Q&A section to provide answers! As a community, we’ll use the Q&A to collaborate on a positive learning experience for everyone. 54 | 55 | ## Version Information 56 | 57 | This course tracks the versions of software used in demonstrations to ensure a consistent learning experience.
56 | 57 | The content of this course is reviewed periodically and updated for compatibility with newer software versions. 58 | 59 | If you encounter any version discrepancies, please refer to the course’s Q&A section. Also, check the exercise files for additional information. 60 | 61 | This course was created using the following software: 62 | 63 | - Git version 2.40.1 64 | - Node 23.5.0 65 | - NPM 10.9.2 66 | - Yeoman 5.1.0 67 | - generator-bitbucket-pipe 10.9.2 68 | - Docker 69 | - Docker for macOS 27.4.0, build bde2b89 70 | - Docker for Linux 25.0.5, build 5dc9bcc 71 | - OpenJDK Runtime Environment Corretto-23.0.1.8.1 (build 23.0.1+8-FR) 72 | - Amazon Linux al2023-ami-kernel-6.1-x86_64 73 | - AWS Lambda Runtime for Python 3.10 74 | - Bitbucket Pipes 75 | - atlassian/aws-lambda-deploy:1.12.0 76 | - atlassian/bitbucket-iac-scan:0.5.2 77 | - atlassian/bitbucket-build-statistics:1.5.3 78 | - atlassian/bitbucket-upload-file:0.7.4 79 | 80 | 81 | --- 82 | [← 00_01 Introduction](../00_01_introduction/README.md) | [00_03 Using the Exercise Files →](../00_03_using_the_exercise_files/README.md) 83 | 84 | -------------------------------------------------------------------------------- /ch3_create_custom_pipes/03_04_deploy_a_custom_pipe_to_a_container_registry/README.md: -------------------------------------------------------------------------------- 1 | # 03_04 Deploy a Custom Pipe to a Container Registry 2 | 3 | Start by enabling Pipelines in the repository settings. 4 | 5 | Then get set up in Docker Hub. 6 | 7 | ## Setting Up an Image Repository on Docker Hub 8 | 9 | Follow these steps to create a repository, generate an access token, and securely store the registry credentials as repository variables. 10 | 11 | ### 1. Create a Repository on Docker Hub 12 | 13 | 1. Go to [Docker Hub](https://hub.docker.com) and log into your account. 14 | 15 | 1. Click **"Create a repository"** on the **Repositories** page. 
16 | 17 | ![Step 1 - Create Repository Button](./images/1-SCR-20241217-lcqn.png) 18 | 19 | 1. In the **Create repository** form: 20 | 21 | - Enter a **Repository Name** (e.g., `awesome-pipe`). 22 | - Choose **Public Visibility**. 23 | - Click **Create**. 24 | 25 | ![Step 2 - Create Repository Form](./images/2-SCR-20241217-ldhz.png) 26 | 27 | ### 2. Generate a Personal Access Token 28 | 29 | 1. Navigate to **Account Settings**. Under **Security**, select **Access Tokens**. 30 | 2. Click **Create Token** and fill in: 31 | 32 | - **Access token description**: (e.g., `Awesome Pipe`). 33 | - **Expiration date**: Choose duration (e.g., `30 days`). 34 | - **Access permissions**: Select **Read & Write**. 35 | 36 | ![Step 3 - Generate Access Token](./images/3-SCR-20241217-lqkt.png) 37 | 38 | 3. Click **Generate**. 39 | 40 | ### 3. Copy the Access Token 41 | 42 | 1. Once the token is generated, copy it immediately, as it **will not be shown again**. 43 | 2. Use the **Copy** button to store it securely. 44 | 45 | ![Step 4 - Copy Access Token](./images/4-SCR-20241217-lptx.png) 46 | 47 | ### 4. Securely Store Your Credentials in Bitbucket Pipelines 48 | 49 | 1. Navigate to your Bitbucket Repository settings → **Repository Variables**. 50 | 2. Add the following secured variables: 51 | 52 | - **`REGISTRY_USERNAME`**: Your Docker Hub username. 53 | - **`REGISTRY_PASSWORD`**: The personal access token you copied. 54 | 3. Make sure **Secured** is checked to mask sensitive information. 55 | 56 | ![Step 5 - Secure Repository Variables](./images/4-SCR-20241217-lptx.png) 57 | 58 | ## Merge the Branch 59 | 60 | 1. Use a pull request to merge the feature branch into the main branch. 61 | 62 | 1. The merge will trigger a pipeline run that pushes the image to Docker Hub. 63 | 64 | 1. Confirm the pipeline completes successfully. Resolve any errors as needed to get the image pushed.
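The secured variables end up as environment variables in the pipeline, where the release tooling uses them to authenticate and push. A rough sketch of what happens under the hood (the username, pipe name, and version here are hypothetical placeholders, and the `docker` commands are shown commented out since they require a running Docker daemon and real credentials):

```bash
# Hypothetical values — in a real pipeline, REGISTRY_USERNAME and
# REGISTRY_PASSWORD come from the secured repository variables.
REGISTRY_USERNAME="mydockerid"
PIPE_NAME="awesome-pipe"
VERSION="0.1.0"

# Images are tagged <username>/<repository>:<version>.
IMAGE="${REGISTRY_USERNAME}/${PIPE_NAME}:${VERSION}"
echo "${IMAGE}"

# The push step looks roughly like this:
# echo "${REGISTRY_PASSWORD}" | docker login --username "${REGISTRY_USERNAME}" --password-stdin
# docker build -t "${IMAGE}" .
# docker push "${IMAGE}"
```

Because the variables are marked **Secured**, Bitbucket masks their values in the pipeline logs.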
65 | 66 | ## Shenanigans 67 | 68 | ### 03_04.1: Semantic Versioning 69 | 70 | If you see the following error in your pipeline: 71 | 72 | ![Semversioner Error](./images/6-SCR-20241217-misj.png) 73 | 74 | ...it's because a change file with semantic versioning is not in place. 75 | 76 | Run these commands in the root directory of your repo: 77 | 78 | ```bash 79 | pip install semversioner 80 | semversioner add-change --type minor --description "create pipe" 81 | ``` 82 | 83 | - [Bitbucket pipe release v5.6.1 has a Semversioner error](https://community.atlassian.com/t5/Bitbucket-questions/Bitbucket-pipe-release-v5-6-1-has-a-Semversioner-error/qaq-p/2677152) 84 | 85 | #### What's Semantic Versioning? 86 | 87 | Semantic Versioning (SemVer) is a convention for versioning software that uses three numbers separated by dots: 88 | 89 | ```bash 90 | MAJOR.MINOR.PATCH 91 | ``` 92 | 93 | - `MAJOR`: Incremented when backward-incompatible changes are made 94 | - `MINOR`: Incremented when new functionality is added in a backward-compatible way 95 | - `PATCH`: Incremented when bug fixes or minor updates are made 96 | 97 | For example: `1.2.3` means: 98 | 99 | - `MAJOR`: Version 1 of the software has been released 100 | - `MINOR`: There have been two minor releases with new features since version 1 101 | - `PATCH`: There have been three patches (bug fixes) since the last minor release 102 | 103 | 104 | --- 105 | [← 03_03 Test a Custom Pipe](../03_03_test_a_custom_pipe/README.md) | [03_05 Use a custom pipe in a pipeline →](../03_05_use_a_custom_pipe_in_a_pipeline/README.md) 106 | 107 | --------------------------------------------------------------------------------
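The SemVer increment rules described above can be sketched as a small shell function. This is illustrative only — in the release pipeline, Semversioner computes the next version for you from the recorded change files:

```bash
# Bump a MAJOR.MINOR.PATCH version according to the SemVer rules:
# major resets minor and patch, minor resets patch, patch bumps alone.
bump() {
  version=$1
  part=$2
  major=${version%%.*}     # text before the first dot
  rest=${version#*.}       # text after the first dot
  minor=${rest%%.*}
  patch=${rest#*.}
  case $part in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "${major}.$((minor + 1)).0" ;;
    patch) echo "${major}.${minor}.$((patch + 1))" ;;
  esac
}

bump 1.2.3 minor   # prints 1.3.0
bump 1.2.3 major   # prints 2.0.0
```

Running `semversioner add-change --type minor` records the intent; at release time the tool applies the equivalent of `bump <current-version> minor`.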