├── .gitattributes ├── README.md ├── Deploy a Compute Instance with a Remote Startup Script ├── resources-install-web.sh └── Deploy a Compute Instance with a Remote Startup Script.md ├── Build and Secure Networks in Google Cloud ├── Region_Image.png ├── Multiple VPC Networks [GSP211].md ├── VPC Networks - Controlling Access [GSP213] .md └── Build and Secure Networks in Google Cloud Challenge Lab.md ├── Configure Secure RDP using a Windows Bastion Host ├── img1.png ├── img2.png ├── img3.png ├── img4.png ├── img5.png └── Configure Secure RDP using a Windows Bastion Host.md ├── Build and Deploy a Docker Image to a Kubernetes Cluster ├── resources-echo-web.tar.gz └── Build and Deploy a Docker Image to a Kubernetes Cluster.md ├── Scale Out and Update a Containerized Application on a Kubernetes Cluster ├── image1.png └── Scale Out and Update a Containerized Application on a Kubernetes Cluster.md ├── Store, Process, and Manage Data on Google Cloud Challenge Lab .md ├── LICENSE ├── Get Started with Pub-Sub Challenge Lab.md ├── The Basics of Google Cloud Compute Challenge Lab.md ├── Google Cloud Essential Skills Challenge Lab.md ├── Streaming Analytics into BigQuery Challenge Lab.md ├── App Engine 3 Ways Challenge Lab .md ├── Get Started with Eventarc ├── Get Started with Eventarc Challenge Lab .md └── Eventarc for Cloud Run [GSP773] .md ├── Using the Google Cloud Speech API Challenge Lab.md ├── Engineer Data in Google Cloud Challenge Lab.md ├── Integrate with Machine Learning APIs ├── Integrate with Machine Learning APIs Challenge Lab .md └── analyze-images-v2.py ├── Build a Website on Google Cloud ├── Deploy, Scale, and Update Your Website on Google Kubernetes Engine.md ├── Build a Website on Google Cloud Challenge Lab .md ├── Migrating a Monolithic Website to Microservices on Google Kubernetes Engine.md └── Hosting a Web App on Google Cloud Using Compute Engine [GSP662].md ├── Deploy to Kubernetes in Google Cloud Challenge Lab.md ├── Migrate a MySQL Database to Google Cloud SQL Challenge Lab.md ├── Serverless Firebase Development Challenge Lab.md ├── Use APIs to Work with Cloud Storage Challenge Lab.md ├── Cloud Functions 3 Ways Challenge Lab.md ├── Create ML Models with BigQuery ML Challenge Lab.md ├── Create an Internal Load Balancer.md ├── Monitoring in Google Cloud └── Monitoring in Google Cloud Challenge Lab .md ├── Cloud Speech API 3 Ways Challenge Lab .md ├── Insights from Data with BigQuery Challenge Lab.md ├── Analyze Sentiment with Natural Language API Challenge Lab.md ├── All Weeks Moonsoon Skill Badges.md └── Implement DevOps in Google Cloud Challenge Lab.md /.gitattributes: -------------------------------------------------------------------------------- 1 | # Auto detect text files and perform LF normalization 2 | * text=auto 3 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Google Cloud Challenge Labs Solutions 2 | This repository contains solutions to Google Cloud challenge labs. 
3 | -------------------------------------------------------------------------------- /Deploy a Compute Instance with a Remote Startup Script/resources-install-web.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | apt-get update 3 | apt-get install -y apache2 -------------------------------------------------------------------------------- /Build and Secure Networks in Google Cloud/Region_Image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Build and Secure Networks in Google Cloud/Region_Image.png -------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/img1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Configure Secure RDP using a Windows Bastion Host/img1.png -------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/img2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Configure Secure RDP using a Windows Bastion Host/img2.png -------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/img3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Configure Secure RDP using a Windows Bastion Host/img3.png -------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/img4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Configure Secure RDP using a Windows Bastion Host/img4.png -------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/img5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Configure Secure RDP using a Windows Bastion Host/img5.png -------------------------------------------------------------------------------- /Build and Deploy a Docker Image to a Kubernetes Cluster/resources-echo-web.tar.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Build and Deploy a Docker Image to a Kubernetes Cluster/resources-echo-web.tar.gz -------------------------------------------------------------------------------- /Scale Out and Update a Containerized Application on a Kubernetes Cluster/image1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/HEAD/Scale Out and Update a Containerized Application on a Kubernetes Cluster/image1.png -------------------------------------------------------------------------------- /Store, Process, and Manage Data on Google Cloud Challenge Lab .md: 
-------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Store, Process, and Manage Data on Google Cloud: Challenge Lab [ARC100]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/60439?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/QkG3dwAb3MU) 6 | 7 | Run the below commands in the Cloud Shell Terminal. 8 | 9 | ## Task 1. Create a bucket 10 | 11 | ``` 12 | gsutil mb gs://YOUR_BUCKET_NAME 13 | ``` 14 | 15 | ## Task 2. Create a Pub/Sub topic 16 | 17 | ``` 18 | gcloud pubsub topics create YOUR_TOPIC_NAME 19 | ``` 20 | 21 | ## Task 3. Create the thumbnail Cloud Function 22 | 23 | *Follow the Video Instruction.* 24 | 25 | # Congratulations 🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Atul Gupta 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /Get Started with Pub-Sub Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Get Started with Pub/Sub: Challenge Lab [ARC113]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/63246?parent=catalog) 4 | 5 | 6 | 7 | Run the below commands in the Cloud Shell Terminal. 8 | 9 | ``` 10 | export REGION= 11 | ``` 12 | 13 | ``` 14 | export MESSAGE_BODY= 15 | ``` 16 | 17 | * You can find the REGION/LOCATION in the Task 2 (On the Lab Page). 18 | 19 | ## Task 1. Set up Cloud Pub/Sub 20 | 21 | ``` 22 | gcloud pubsub topics create $DEVSHELL_PROJECT_ID-cron-topic 23 | 24 | gcloud pubsub subscriptions create $DEVSHELL_PROJECT_ID-cron-sub --topic $DEVSHELL_PROJECT_ID-cron-topic 25 | ``` 26 | 27 | ## Task 2. Create a Cloud Scheduler job 28 | 29 | ``` 30 | gcloud services enable cloudscheduler.googleapis.com 31 | 32 | gcloud scheduler jobs create pubsub $DEVSHELL_PROJECT_ID-cron-scheduler-job --schedule="* * * * *" --location $REGION --topic $DEVSHELL_PROJECT_ID-cron-topic --message-body="$MESSAGE_BODY" 33 | ``` 34 | 35 | ## Task 3. Verify the results in Cloud Pub/Sub 36 | 37 | ``` 38 | gcloud pubsub subscriptions pull $DEVSHELL_PROJECT_ID-cron-sub --limit 5 39 | ``` 40 | 41 | # Congratulations🎉! You're all done with this Challenge Lab. 
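* Troubleshooting note: the Cloud Scheduler job fires once per minute, so the first pull in Task 3 can come back empty. A minimal sketch, assuming the subscription created in Task 1, to wait and pull again with automatic acknowledgement:

```
# The cron schedule is "* * * * *", so allow up to a minute for a message to land.
sleep 60
gcloud pubsub subscriptions pull $DEVSHELL_PROJECT_ID-cron-sub --limit 5 --auto-ack
```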
-------------------------------------------------------------------------------- /The Basics of Google Cloud Compute Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ### Lab Name - The Basics of Google Cloud Compute: Challenge Lab [ARC120] 2 | ### Lab Link - [CLICK HERE](https://www.cloudskillsboost.google/focuses/65384?parent=catalog) 3 | 4 | ## [YouTube Solution Link](https://youtu.be/7OAMnqoLSuU) 5 | 6 | Run the below commands in the Cloud Shell Terminal. 7 | 8 | ```cmd 9 | export ZONE= 10 | ``` 11 | 12 | ## Task 1. Create a Cloud Storage bucket 13 | 14 | ```cmd 15 | export REGION=${ZONE::-2} 16 | export BUCKET="gs://$DEVSHELL_PROJECT_ID" 17 | gsutil mb -l $REGION gs://$DEVSHELL_PROJECT_ID 18 | ``` 19 | 20 | ## Task 2. Create and attach a persistent disk to a Compute Engine instance 21 | 22 | ```cmd 23 | gcloud compute instances create my-instance --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --tags=http-server 24 | 25 | gcloud compute disks create mydisk --size=200GB \ 26 | --zone=$ZONE 27 | 28 | gcloud compute instances attach-disk my-instance --disk mydisk --zone=$ZONE 29 | ``` 30 | 31 | 32 | ## Task 3. Install a NGINX web server 33 | 34 | ```cmd 35 | gcloud compute ssh my-instance --zone=$ZONE 36 | ``` 37 | 38 | ```cmd 39 | sudo apt-get update 40 | 41 | sudo apt-get install -y nginx 42 | 43 | ps auwx | grep nginx 44 | ``` 45 | 46 | # Congratulations🎉! You are done with this Challenge Lab. -------------------------------------------------------------------------------- /Google Cloud Essential Skills Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Google Cloud Essential Skills: Challenge Lab [GSP101]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/1734?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/WCAxqZcU70g) 6 | 7 | 8 | ### Let's start with defining some variables given by Cloud Skill Boosts 9 | 10 | ``` 11 | export VM_NAME= 12 | ``` 13 | 14 | ``` 15 | export ZONE= 16 | ``` 17 | 18 | ## Task1 Create a Compute Engine instance, add necessary firewall rules. 19 | 20 | ``` 21 | gcloud compute firewall-rules create http-ingress --allow=tcp:80 --source-ranges 0.0.0.0/0 --target-tags http-server --network default 22 | ``` 23 | 24 | ## Task2 Configure Apache2 Web Server in your instance 25 | 26 | ``` 27 | gcloud compute instances create $VM_NAME \ 28 | --zone=$ZONE \ 29 | --machine-type=e2-medium \ 30 | --tags=http-server,https-server \ 31 | --image=projects/debian-cloud/global/images/debian-10-buster-v20220406 \ 32 | --metadata=startup-script=\#\!\ /bin/bash$'\n'apt-get\ update$'\n'apt-get\ install\ apache2\ -y$'\n'service\ --status-all$'\n' 33 | ``` 34 | 35 | ## Task3 Test your server 36 | 37 | ``` 38 | gcloud compute instances describe $VM_NAME \ 39 | --format='get(networkInterfaces[0].accessConfigs[0].natIP)' 40 | ``` 41 | 42 | # Congratulations🎉! You're all done with this Challenge Lab. 
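* Optionally, verify the web server from Cloud Shell. A minimal sketch, assuming the `VM_NAME` and `ZONE` variables exported above (`VM_IP` is just a helper name used here):

```
# Grab the external IP from Task 3's describe command, then request the page.
export VM_IP=$(gcloud compute instances describe $VM_NAME --zone=$ZONE \
  --format='get(networkInterfaces[0].accessConfigs[0].natIP)')
curl -s http://$VM_IP | head -5
```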
-------------------------------------------------------------------------------- /Streaming Analytics into BigQuery Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Streaming Analytics into BigQuery: Challenge Lab [ARC106]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/61948?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/357s47IIwtg) 6 | 7 | Run the following commands in the Cloud Shell Terminal. 8 | 9 | ``` 10 | export BUCKET_NAME= 11 | 12 | export BIGQUERY_DATASET_NAME= 13 | 14 | export TABLE_NAME= 15 | 16 | export TOPIC_NAME= 17 | ``` 18 | 19 | * Before performing the Tasks, please make sure to `Enable the Dataflow API`. If this API is already enabled by default then first `disable` it and then `Enable it again`. 20 | 21 | ## Task 1. Create a Cloud Storage bucket 22 | 23 | ``` 24 | gsutil mb gs://$BUCKET_NAME/ 25 | ``` 26 | 27 | ## Task 2. Create a BigQuery dataset and table 28 | 29 | ``` 30 | bq mk $BIGQUERY_DATASET_NAME 31 | 32 | bq mk \ 33 | --schema data:string -t $BIGQUERY_DATASET_NAME.$TABLE_NAME 34 | ``` 35 | 36 | ## Task 3. Set up a Pub/Sub topic 37 | 38 | ``` 39 | gcloud pubsub topics create $TOPIC_NAME 40 | 41 | gcloud pubsub subscriptions create $TOPIC_NAME-sub --topic $TOPIC_NAME 42 | ``` 43 | 44 | ## Task 4. Run a Dataflow pipeline to stream data from a Pub/Sub topic to BigQuery 45 | 46 | * Follow the *Video Instruction Carefully*. 47 | 48 | ## Task 5. Publish a test message to the topic and validate data in BigQuery 49 | 50 | * Go to *Navigation Menu* > *Bigquery* > Click on `+Create Query`. 51 | 52 | * Run this query - 53 | 54 | ``` 55 | SELECT * FROM '.' 56 | ``` 57 | 58 | # Congratulations 🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /App Engine 3 Ways Challenge Lab .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *App Engine: 3 Ways: Challenge Lab [ARC112]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/63241?parent=catalog) 4 | 5 | ## [YouTube Video Link](https://youtu.be/GpTpeUGBTsU) 6 | 7 | Run the below commands in *SSH on the VM Instance* `lab-setup`. 8 | 9 | ## Task 1 - Enable the Google App Engine Admin API 10 | 11 | ``` 12 | gcloud services enable appengine.googleapis.com 13 | ``` 14 | 15 | ## Task 2 - Download the Hello World app 16 | 17 | *Make sure to download the web application on the VM instance "lab-setup"* 18 | 19 | ``` 20 | git clone https://github.com/GoogleCloudPlatform/php-docs-samples.git 21 | cd php-docs-samples/appengine/standard/helloworld 22 | ``` 23 | 24 | * After cloning the above command in the `SSH of VM-Instance`, come back to your `cloud console` and open the `Cloud Shell Terminal`. 25 | 26 | * And again, `recloning the sample file` using the *above commands*. 27 | 28 | 29 | ## Task 3 - Deploy your application 30 | 31 | ``` 32 | gcloud app deploy 33 | gcloud app browse 34 | ``` 35 | 36 | * Choose *Region* in the list which you have given in your lab. --> (This is the most important thing in this step.) 37 | * If asking "Enter your choice (Y/n)" then --> press y 38 | 39 | 40 | ## Task 4 - Deploy updates to your application 41 | 42 | * Here, in the `index.php`, change the code to *Hello, World!* to `Whatever the message is in the Task 4`. 
43 | * Now, after editing the code, save and exit from the editor by pressing the `Ctrl+X, Y`, and then hit `Enter`. 44 | 45 | ``` 46 | nano index.php 47 | gcloud app deploy 48 | ``` 49 | 50 | # Congratulations🎉!, You're all done with this Challenge lab. -------------------------------------------------------------------------------- /Get Started with Eventarc/Get Started with Eventarc Challenge Lab .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Get Started with Eventarc: Challenge Lab [ARC118]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/63244?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/7UUk5Zqf6Yk) 6 | 7 | Run the following commands in the Cloud Shell Terminal. 8 | 9 | 10 | ``` 11 | export REGION= 12 | ``` 13 | 14 | ``` 15 | gcloud config set run/region "REGION" 16 | 17 | export SERVICE_NAME=pubsub-events 18 | 19 | export IMAGE_NAME="gcr.io/cloudrun/hello" 20 | ``` 21 | 22 | 23 | ## Task 1. Create a Pub/Sub topic 24 | 25 | ``` 26 | gcloud pubsub topics create $DEVSHELL_PROJECT_ID-topic 27 | 28 | gcloud pubsub subscriptions create --topic $DEVSHELL_PROJECT_ID-topic $DEVSHELL_PROJECT_ID-topic-sub 29 | ``` 30 | 31 | 32 | ## Task 2. Create a Cloud Run sink 33 | 34 | ``` 35 | gcloud run deploy ${SERVICE_NAME} \ 36 | --image ${IMAGE_NAME} \ 37 | --allow-unauthenticated \ 38 | --max-instances=3 \ 39 | --region=$REGION 40 | ``` 41 | 42 | * If asking to Enble APIs, then press `y`. 43 | 44 | ## Task 3. Create and test a Pub/Sub event trigger using Eventarc 45 | 46 | ``` 47 | gcloud eventarc triggers create pubsub-events-trigger \ 48 | --location=$REGION \ 49 | --destination-run-service=pubsub-events \ 50 | --destination-run-region=$REGION \ 51 | --transport-topic=$DEVSHELL_PROJECT_ID-topic \ 52 | --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" 53 | 54 | gcloud pubsub topics publish $DEVSHELL_PROJECT_ID-topic --message="Hello there" 55 | ``` 56 | * If asking to Enble APIs, then press `y`. 57 | 58 | 59 | # Congratulations🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Scale Out and Update a Containerized Application on a Kubernetes Cluster/Scale Out and Update a Containerized Application on a Kubernetes Cluster.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Scale Out and Update a Containerized Application on a Kubernetes Cluster [GSP305]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/1739?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/s5tw900ueK4) 6 | 7 | Run the below commands in the Cloud Shell Terminal. 8 | ``` 9 | export ZONE= 10 | ``` 11 | 12 | * Before starting this lab, we need to download and extract the source code of the application which is given in the lab instruction. 13 | 14 | ``` 15 | gsutil cp gs://$DEVSHELL_PROJECT_ID/echo-web-v2.tar.gz . 16 | tar -xvf echo-web-v2.tar.gz 17 | ``` 18 | 19 | ## Task 1: Build a echo-app:v2 tagged Docker Image & Push the image to the Google Container Registry 20 | 21 | ``` 22 | gcloud builds submit --tag gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v2 . 
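# (Optional, hedged) Confirm the v2 image landed in Container Registry before deploying:
# gcloud container images list-tags gcr.io/$DEVSHELL_PROJECT_ID/echo-app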
23 | ``` 24 | 25 | ## Task 2: Deploy the echo-app:v2 application image to the Kubernetes Cluster 26 | 27 | ``` 28 | gcloud container clusters get-credentials echo-cluster --zone $ZONE --project $DEVSHELL_PROJECT_ID 29 | 30 | kubectl create deployment echo-web --image=gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v2 31 | 32 | kubectl expose deployment echo-web --type=LoadBalancer --port=80 --target-port=8000 33 | 34 | kubectl scale deploy echo-web --replicas=2 35 | ``` 36 | 37 | ## Task 4: Validate the application, It should working on v 2.0.0 38 | 39 | ``` 40 | kubectl port-forward service/echo-web 8080:80 41 | ``` 42 | 43 | After the above command, click on `web preview on port 8080` & you can see in your browser that it is showing `version: 2.0.0` 44 | 45 | ![image](image1.png) 46 | 47 | # Congratulations🎉! You're all done with this Challenge Lab. 48 | -------------------------------------------------------------------------------- /Deploy a Compute Instance with a Remote Startup Script/Deploy a Compute Instance with a Remote Startup Script.md: -------------------------------------------------------------------------------- 1 | ## Lab Name - Deploy a Compute Instance with a Remote Startup Script 2 | 3 | ## Lab Link - [Click Here](https://www.cloudskillsboost.google/focuses/1735?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/6iSkXsBtFgU) 6 | ### Run the below commands in the cloud shell terminal. 7 | 8 | ## Task 1: Confirm that a Google Cloud Storage bucket exists that contains a file 9 | 10 | ```cmd 11 | export Zone= 12 | ``` 13 | 14 | ```cmd 15 | gsutil mb -b off gs://$DEVSHELL_PROJECT_ID 16 | 17 | wget https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/main/Deploy%20a%20Compute%20Instance%20with%20a%20Remote%20Startup%20Script/resources-install-web.sh 18 | 19 | gsutil cp resources-install-web.sh gs://$DEVSHELL_PROJECT_ID 20 | ``` 21 | 22 | 23 | ## Task 2: Confirm that a compute instance has been created that has a remote startup script called install-web.sh configured 24 | 25 | ```cmd 26 | gcloud compute instances add-metadata lab-monitor --metadata startup-script-url=gs://$DEVSHELL_PROJECT_ID/resources-install-web.sh --zone $Zone 27 | ``` 28 | 29 | ## Task 3: Confirm that a HTTP access firewall rule exists with tag that applies to that virtual machine 30 | 31 | ```cmd 32 | gcloud compute firewall-rules create http-fw-rule --allow=tcp:80 --source-ranges 0.0.0.0/0 --target-tags allow-http-traffic --network default 33 | 34 | gcloud compute instances add-tags lab-monitor --tags=allow-http-traffic --zone $Zone 35 | 36 | gcloud compute instances reset lab-monitor --zone $Zone 37 | ``` 38 | 39 | ## Task 4: Connect to the server ip-address using HTTP and get a non-error response 40 | 41 | ```cmd 42 | export VM_EXTERNAL_IP=$(gcloud compute instances describe lab-monitor --zone $Zone --format='get(networkInterfaces[0].accessConfigs[0].natIP)') 43 | 44 | curl http://$VM_EXTERNAL_IP 45 | ``` 46 | 47 | # Congratulations🎉! You are done with this Challenge Lab. 48 | -------------------------------------------------------------------------------- /Using the Google Cloud Speech API Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Using the Google Cloud Speech API: Challenge Lab [ARC131]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/65993?parent=catalog) 4 | 5 | 6 | 7 | Run the below commands in the SSH of your VM-Instance. 8 | 9 | ## Task 1. 
Create an API key 10 | 11 | * *Navigation Menu* > *APIs and Services* > *Credentials* > Click `+Create Credentials` > Choose `API KEY`. 12 | 13 | ``` 14 | gcloud compute ssh lab-vm --zone= 15 | ``` 16 | 17 | ``` 18 | export API_KEY= 19 | ``` 20 | 21 | ### Define variables for file names. 22 | 23 | ``` 24 | TASK_2_REQUEST_FILE= 25 | 26 | TASK_2_RESPONSE_FILE= 27 | 28 | TASK_3_REQUEST_FILE= 29 | 30 | TASK_3_RESPONSE_FILE= 31 | ``` 32 | 33 | ## Task 2. Transcribe speech to English text 34 | 35 | ``` 36 | cat > "$TASK_2_REQUEST_FILE" < "$TASK_2_RESPONSE_FILE" 54 | ``` 55 | 56 | ## Task 3. Transcribe speech to Spanish text 57 | 58 | ``` 59 | cat > "$TASK_3_REQUEST_FILE" < "$TASK_3_RESPONSE_FILE" 77 | ``` 78 | 79 | # Congratulations 🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Build and Deploy a Docker Image to a Kubernetes Cluster/Build and Deploy a Docker Image to a Kubernetes Cluster.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Build and Deploy a Docker Image to a Kubernetes Cluster [GSP304]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/1738?parent=catalog) 4 | 5 | ## YouTube Solution Link - `To be Uploaded Soon` 6 | 7 | Run the below commands in the Cloud Shell Terminal. 8 | 9 | ``` 10 | export ZONE= 11 | ``` 12 | 13 | ``` 14 | export MACHINE_TYPE= 15 | ``` 16 | 17 | * Before starting this lab, we need to download the source code of the application which is given in the lab instruction. 18 | 19 | ``` 20 | wget https://github.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/raw/main/Build%20and%20Deploy%20a%20Docker%20Image%20to%20a%20Kubernetes%20Cluster/resources-echo-web.tar.gz 21 | ``` 22 | 23 | * Extract the downloaded application file 24 | 25 | ``` 26 | tar -xvf resources-echo-web.tar.gz 27 | ``` 28 | 29 | ## Task 1: Create a Kubernetes Cluster 30 | 31 | * The below command will take approx 4-5 mints to create cluster, so please be patience. 32 | 33 | ``` 34 | gcloud container clusters create echo-cluster --num-nodes 2 --zone $ZONE --machine-type $MACHINE_TYPE 35 | ``` 36 | 37 | ## Task 2: Build a tagged Docker Image & Push the image to the Google Container Registry 38 | ``` 39 | gcloud builds submit --tag gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v1 . 40 | ``` 41 | 42 | ## Task 3: Deploy the application to the Kubernetes Cluster 43 | 44 | 3.1 Connect to the previously created cluster so deployment of application 45 | 46 | ``` 47 | gcloud container clusters get-credentials echo-cluster --zone $ZONE --project $DEVSHELL_PROJECT_ID 48 | ``` 49 | 50 | 3.2 Deploy the sample application image, which we created in the previous step 51 | 52 | ``` 53 | kubectl create deployment echo-web --image=gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v1 54 | ``` 55 | 56 | 3.3 Expose your deployment by creating a service 57 | 58 | ``` 59 | kubectl expose deployment echo-web --type=LoadBalancer --port=80 --target-port=8000 60 | ``` 61 | 62 | # Congratulations🎉! You're all done with this Challenge Lab. 
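* To sanity-check the deployment, you can wait for the LoadBalancer to be assigned an external IP and then curl the service. A hedged sketch, using the `echo-web` service created above:

```
# Show the service; repeat until EXTERNAL-IP is no longer <pending>.
kubectl get service echo-web

# Request the app on port 80 via the LoadBalancer IP.
curl http://$(kubectl get service echo-web -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
```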
63 | -------------------------------------------------------------------------------- /Engineer Data in Google Cloud Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Engineer Data in Google Cloud: Challenge Lab [GSP327]* 2 | 3 | ## Lab Link - [Click Here](https://www.cloudskillsboost.google/focuses/12379?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/sK_gAUwzPQc) 6 | 7 | ## Task 1: Clean your training data 8 | 9 | - Run this Query (change the values given in lab manual) 10 | 11 | ``` 12 | CREATE OR REPLACE TABLE 13 | taxirides. AS 14 | SELECT 15 | (tolls_amount + fare_amount) AS , 16 | pickup_datetime, 17 | pickup_longitude AS pickuplon, 18 | pickup_latitude AS pickuplat, 19 | dropoff_longitude AS dropofflon, 20 | dropoff_latitude AS dropofflat, 21 | passenger_count AS passengers, 22 | FROM 23 | taxirides.historical_taxi_rides_raw 24 | WHERE 25 | RAND() < 0.001 26 | AND trip_distance > 2 [Change_as_mention_in_lab] 27 | AND fare_amount >= 3.0 [Change_as_mention_in_lab] 28 | AND pickup_longitude > -78 29 | AND pickup_longitude < -70 30 | AND dropoff_longitude > -78 31 | AND dropoff_longitude < -70 32 | AND pickup_latitude > 37 33 | AND pickup_latitude < 45 34 | AND dropoff_latitude > 37 35 | AND dropoff_latitude < 45 36 | AND passenger_count > 2 [Change_as_mention_in_lab] 37 | ``` 38 | 39 | ## Task 2: Create a BQML model 40 | 41 | - Run this Query (change the values given in lab manual) 42 | ``` 43 | CREATE OR REPLACE MODEL taxirides. 44 | TRANSFORM( 45 | * EXCEPT(pickup_datetime) 46 | 47 | , ST_Distance(ST_GeogPoint(pickuplon, pickuplat), ST_GeogPoint(dropofflon, dropofflat)) AS euclidean 48 | , CAST(EXTRACT(DAYOFWEEK FROM pickup_datetime) AS STRING) AS dayofweek 49 | , CAST(EXTRACT(HOUR FROM pickup_datetime) AS STRING) AS hourofday 50 | ) 51 | OPTIONS(input_label_cols=[''], model_type='linear_reg') 52 | AS 53 | 54 | SELECT * FROM taxirides. 55 | ``` 56 | 57 | ## Task 3: Perform a batch prediction on new data 58 | 59 | - Run this Query (change the values given in lab manual) 60 | ``` 61 | CREATE OR REPLACE TABLE taxirides.2015_fare_amount_predictions 62 | AS 63 | SELECT * FROM ML.PREDICT(MODEL taxirides.,( 64 | SELECT * FROM taxirides.report_prediction_data) 65 | ) 66 | ``` 67 | 68 | # Congratulations🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Integrate with Machine Learning APIs/Integrate with Machine Learning APIs Challenge Lab .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Integrate with Machine Learning APIs: Challenge Lab [GSP329]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/12704?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/1syTy7ewtJU) 6 | 7 | Run the below commands in the Cloud Shell Terminal. 8 | 9 | 10 | ``` 11 | export LANGUAGE= 12 | 13 | export LOCAL= 14 | 15 | export BIGQUERY_ROLE= 16 | 17 | export CLOUD_STORAGE_ROLE= 18 | ``` 19 | 20 | ## Task 1. 
Configure a service account to access the Machine Learning APIs, BigQuery, and Cloud Storage 21 | 22 | ``` 23 | gcloud iam service-accounts create sample-sa 24 | 25 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=serviceAccount:sample-sa@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role=$BIGQUERY_ROLE 26 | 27 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=serviceAccount:sample-sa@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role=$CLOUD_STORAGE_ROLE 28 | ``` 29 | 30 | ## Task 2. Create and download a credential file for your service account 31 | 32 | ``` 33 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=serviceAccount:sample-sa@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role=roles/serviceusage.serviceUsageConsumer 34 | 35 | gcloud iam service-accounts keys create sample-sa-key.json --iam-account sample-sa@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com 36 | 37 | export GOOGLE_APPLICATION_CREDENTIALS=${PWD}/sample-sa-key.json 38 | ``` 39 | 40 | ## Task 3. Modify the Python script to extract text from image files 41 | 42 | ``` 43 | wget https://raw.githubusercontent.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/main/Integrate%20with%20Machine%20Learning%20APIs/analyze-images-v2.py 44 | 45 | sed -i "s/'en'/'${LOCAL}'/g" analyze-images-v2.py 46 | ``` 47 | 48 | ## Task 4. Modify the Python script to translate the text using the Translation API 49 | 50 | ``` 51 | python3 analyze-images-v2.py $DEVSHELL_PROJECT_ID $DEVSHELL_PROJECT_ID 52 | ``` 53 | 54 | ## Task 5. Identify the most common language used in the signs in the dataset 55 | 56 | ``` 57 | bq query --use_legacy_sql=false "SELECT locale,COUNT(locale) as lcount FROM image_classification_dataset.image_text_detail GROUP BY locale ORDER BY lcount DESC" 58 | ``` 59 | 60 | # Congratulations🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Build a Website on Google Cloud/Deploy, Scale, and Update Your Website on Google Kubernetes Engine.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Deploy, Scale, and Update Your Website on Google Kubernetes Engine [GSP663]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/10470?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/PT4v_a-QK0g) 6 | 7 | ## Task 1. Create a GKE cluster 8 | 9 | ``` 10 | gcloud services enable container.googleapis.com 11 | 12 | gcloud container clusters create fancy-cluster --num-nodes 3 13 | 14 | gcloud compute instances list 15 | ``` 16 | 17 | ## Task 2. Clone source repository 18 | 19 | ```cd ~ 20 | git clone https://github.com/googlecodelabs/monolith-to-microservices.git 21 | 22 | cd ~/monolith-to-microservices 23 | ./setup.sh 24 | 25 | nvm install --lts 26 | 27 | cd ~/monolith-to-microservices/monolith 28 | npm start 29 | ``` 30 | 31 | * Press `Ctrl+C`. 32 | 33 | ## Task 3. Create Docker container with Cloud Build 34 | 35 | ``` 36 | gcloud services enable cloudbuild.googleapis.com 37 | 38 | cd ~/monolith-to-microservices/monolith 39 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 . 40 | ``` 41 | 42 | ## Task 4. Deploy container to GKE 43 | 44 | ``` 45 | kubectl create deployment monolith --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 46 | 47 | kubectl get all 48 | ``` 49 | 50 | ## Task 5. 
Expose GKE deployment 51 | 52 | ``` 53 | kubectl expose deployment monolith --type=LoadBalancer --port 80 --target-port 8080 54 | 55 | kubectl get service 56 | ``` 57 | 58 | ## Task 6. Scale GKE deployment 59 | ``` 60 | kubectl scale deployment monolith --replicas=3 61 | 62 | kubectl get all 63 | ``` 64 | 65 | ## Task 7. Make changes to the website 66 | 67 | ``` 68 | cd ~/monolith-to-microservices/react-app/src/pages/Home 69 | mv index.js.new index.js 70 | 71 | cat ~/monolith-to-microservices/react-app/src/pages/Home/index.js 72 | 73 | cd ~/monolith-to-microservices/react-app 74 | npm run build:monolith 75 | 76 | cd ~/monolith-to-microservices/monolith 77 | 78 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 . 79 | ``` 80 | 81 | ## Task 8. Update website with zero downtime 82 | 83 | ``` 84 | kubectl set image deployment/monolith monolith=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 85 | 86 | kubectl get pods 87 | 88 | npm start 89 | ``` 90 | 91 | * Press `Ctrl+C`. 92 | 93 | # Congratulation🎉! You're all done with this lab. -------------------------------------------------------------------------------- /Deploy to Kubernetes in Google Cloud Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Deploy to Kubernetes in Google Cloud: Challenge Lab [GSP318]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/10457?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/WrT2V60syiI?si=gmiD_HNr1e6cFVBf) 6 | 7 | 8 | ## Task 1. Create a Docker image and store the Dockerfile 9 | 10 | ``` 11 | export DOCKER_IMAGE_WITH_TAG_NAME= 12 | 13 | export REPOSITORY= 14 | 15 | export PROJECT_ID= 16 | ``` 17 | 18 | ``` 19 | gcloud auth list 20 | 21 | gsutil cat gs://cloud-training/gsp318/marking/setup_marking_v2.sh | bash 22 | 23 | gcloud source repos clone valkyrie-app 24 | 25 | cd valkyrie-app 26 | cat > Dockerfile <` with the *IMAGE-ID* which you got from the above command. 80 | 81 | ``` 82 | docker tag us-central1-docker.pkg.dev/$PROJECT_ID/$REPOSITORY/$DOCKER_IMAGE_WITH_TAG_NAME 83 | ``` 84 | 85 | ``` 86 | docker push us-central1-docker.pkg.dev/$PROJECT_ID/$REPOSITORY/$DOCKER_IMAGE_WITH_TAG_NAME 87 | 88 | sed -i s#IMAGE_HERE#us-central1-docker.pkg.dev/$PROJECT_ID/$REPOSITORY/$DOCKER_IMAGE_WITH_TAG_NAME#g k8s/deployment.yaml 89 | ``` 90 | 91 | ## Task 4. Create and expose a deployment in Kubernetes 92 | 93 | ``` 94 | gcloud container clusters get-credentials valkyrie-dev --zone us-east1-d 95 | kubectl create -f k8s/deployment.yaml 96 | kubectl create -f k8s/service.yaml 97 | ``` 98 | 99 | # Congratulations🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Build and Secure Networks in Google Cloud/Multiple VPC Networks [GSP211].md: -------------------------------------------------------------------------------- 1 | ## Lab Name - *Multiple VPC Networks [GSP211]* 2 | 3 | ## Lab Link - [Click Here](https://www.cloudskillsboost.google/focuses/1230?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/YUTiL0ojLy0) 6 | Run the following commands in the Cloud Shell Terminal. 7 | 8 | ``` 9 | export REGION= 10 | ``` 11 | 12 | ``` 13 | export REGION2= 14 | ``` 15 | 16 | ``` 17 | export ZONE= 18 | ``` 19 | Region Image 20 | 21 | ## Task 1. 
Create custom mode VPC networks with firewall rules 22 | 23 | ``` 24 | gcloud compute networks create managementnet --subnet-mode=custom 25 | 26 | gcloud compute networks subnets create managementsubnet-$REGION --network=managementnet --region=$REGION --range=10.130.0.0/20 27 | ``` 28 | 29 | ``` 30 | gcloud compute networks create privatenet --subnet-mode=custom 31 | 32 | gcloud compute networks subnets create privatesubnet-$REGION --network=privatenet --region=$REGION --range=172.16.0.0/24 33 | 34 | gcloud compute networks subnets create privatesubnet-$REGION2 --network=privatenet --region=$REGION2 --range=172.20.0.0/20 35 | 36 | gcloud compute networks list 37 | ``` 38 | 39 | ``` 40 | gcloud compute firewall-rules create managementnet-allow-icmp-ssh-rdp --direction=INGRESS --priority=1000 --network=managementnet --action=ALLOW --rules=tcp:22,tcp:3389,icmp --source-ranges=0.0.0.0/0 41 | 42 | gcloud compute firewall-rules create privatenet-allow-icmp-ssh-rdp --direction=INGRESS --priority=1000 --network=privatenet --action=ALLOW --rules=icmp,tcp:22,tcp:3389 --source-ranges=0.0.0.0/0 43 | 44 | gcloud compute firewall-rules list --sort-by=NETWORK 45 | ``` 46 | 47 | ## Task 2. Create VM instances 48 | 49 | ``` 50 | gcloud compute instances create managementnet-$REGION-vm --zone=$ZONE --machine-type=e2-micro --network=managementnet --subnet=managementsubnet-$REGION 51 | 52 | gcloud compute instances create privatenet-$REGION-vm --zone=$ZONE --machine-type=e2-micro --subnet=privatesubnet-$REGION 53 | ``` 54 | 55 | ## Task 4. Create a VM instance with multiple network interfaces 56 | 57 | ``` 58 | gcloud compute instances create vm-appliance \ 59 | --zone=$ZONE \ 60 | --machine-type=e2-standard-4 \ 61 | --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=privatesubnet-$REGION \ 62 | --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=managementsubnet-$REGION \ 63 | --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=mynetwork 64 | ``` 65 | 66 | # Congratulations🎉! You've completed this Lab. -------------------------------------------------------------------------------- /Build and Secure Networks in Google Cloud/VPC Networks - Controlling Access [GSP213] .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *VPC Networks - Controlling Access [GSP213]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/1231?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/lqpSxMuyjf0) 6 | 7 | Run the following commands in the Cloud Shell Terminal. 8 | 9 | ``` 10 | export ZONE= 11 | ``` 12 | 13 | ## Task 1. Create the web servers 14 | 15 | ``` 16 | gcloud compute instances create blue \ 17 | --zone=$ZONE \ 18 | --machine-type=e2-medium \ 19 | --tags=web-server \ 20 | 21 | gcloud compute instances create green \ 22 | --zone=$ZONE \ 23 | --machine-type=e2-medium 24 | ``` 25 | 26 | * SSH into the blue server. 27 | 28 | ``` 29 | gcloud compute ssh blue --zone=$ZONE 30 | ``` 31 | 32 | ``` 33 | sudo apt-get install nginx-light -y 34 | 35 | sudo nano /var/www/html/index.nginx-debian.html 36 | ``` 37 | 38 | * Replace the *`
<h1>Welcome to nginx!</h1>`* line with `<h1>Welcome to the blue server!</h1>
`. 39 | 40 | * Press `Ctrl+X, Y`, and then hit `ENTER`. 41 | 42 | ``` 43 | cat /var/www/html/index.nginx-debian.html 44 | 45 | exit 46 | ``` 47 | 48 | * SSH into the green server. 49 | 50 | ``` 51 | gcloud compute ssh green --zone=$ZONE 52 | ``` 53 | 54 | ``` 55 | sudo apt-get install nginx-light -y 56 | 57 | sudo nano /var/www/html/index.nginx-debian.html 58 | ``` 59 | 60 | * Replace the *`
<h1>Welcome to nginx!</h1>`* line with `<h1>Welcome to the green server!</h1>
`. 61 | 62 | * Press `Ctrl+X, Y`, and then hit `ENTER`. 63 | 64 | ``` 65 | cat /var/www/html/index.nginx-debian.html 66 | 67 | exit 68 | ``` 69 | 70 | ## Task 2. Create the firewall rule 71 | 72 | ``` 73 | gcloud compute firewall-rules create allow-http-web-server --direction=INGRESS --priority=1000 --network=default --action=ALLOW --rules=tcp:80,icmp --source-ranges=0.0.0.0/0 --target-tags=web-server 74 | 75 | gcloud compute instances create test-vm --machine-type=e2-micro --subnet=default --zone=$ZONE 76 | ``` 77 | 78 | ## Task 3. Explore the Network and Security Admin roles 79 | 80 | ### Create a service account 81 | 82 | * In the Console, navigate to *Navigation menu* > *IAM & admin* > `Service Accounts`. 83 | 84 | * Notice the Compute Engine default service account. 85 | 86 | * Click `Create service account`. 87 | 88 | * Set the Service account name to `Network-admin` and click `CREATE AND CONTINUE`. 89 | 90 | * For Select a role, select *Compute Engine* > `Compute Network Admin` and click `CONTINUE` then click `DONE`. 91 | 92 | # Congratulation🎉! You're all done with this lab. -------------------------------------------------------------------------------- /Migrate a MySQL Database to Google Cloud SQL Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Link` - *Migrate a MySQL Database to Google Cloud SQL [GSP306]* 2 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/1740?parent=catalog) 3 | 4 | ## [YouTube Solution Link](https://www.youtube.com/watch?v=bxI_-W21LHE) 5 | ## Task 1: Check that there is a Cloud SQL instance 6 | 7 | ```cmd 8 | gcloud sql instances create wordpress --tier=db-n1-standard-1 --activation-policy=ALWAYS --zone=us-central1-a --database-version=MYSQL_5_7 9 | 10 | gcloud sql users set-password --host % root --instance wordpress --password Password1* 11 | ``` 12 | 13 | ## Task 2 & 3: Check that there is a user database on the Cloud SQL instance & Check that the blog instance is authorized to access Cloud SQL 14 | 15 | * Storing Zone of your blog_vm_instance into a variable 16 | ```cmd 17 | export ZONE= 18 | ``` 19 | 20 | * Storing the BLOG_EXTERNAL_IP into a variable. 
21 | ```cmd 22 | export BLOG_VM_EXTERNAL_IP= 23 | ``` 24 | 25 | * Authorize blog VM to your cloudSQL instance 26 | ```cmd 27 | gcloud sql instances patch wordpress --authorized-networks $BLOG_VM_EXTERNAL_IP/32 --quiet 28 | ``` 29 | 30 | * Connect to blog VM using SSH 31 | ``` 32 | gcloud compute ssh blog --zone=$ZONE 33 | ``` 34 | 35 | * Storing CloudSQL instance IP into a variable 36 | ```cmd 37 | export CLOUD_SQL_IP=$(gcloud sql instances describe wordpress --format 'value(ipAddresses.ipAddress)') 38 | ``` 39 | 40 | * Connecting to the cloudSQL instance 41 | ``` 42 | mysql --host=$CLOUD_SQL_IP --user=root --password=Password1* 43 | ``` 44 | 45 | * Creating database & user for your cloud SQL database 46 | 47 | ``` 48 | CREATE DATABASE wordpress; CREATE USER 'blogadmin'@'%' IDENTIFIED BY 'Password1*'; GRANT ALL PRIVILEGES ON wordpress.* TO 'blogadmin'@'%'; FLUSH PRIVILEGES; 49 | ``` 50 | 51 | * All done from here, let's exit 52 | 53 | ``` 54 | exit 55 | ``` 56 | 57 | ``` 58 | sudo mysqldump -u root -pPassword1* wordpress > wordpress_db_backup.sql 59 | 60 | mysql --host=$CLOUD_SQL_IP --user=root -pPassword1* --verbose wordpress < wordpress_db_backup.sql 61 | ``` 62 | 63 | ## Task 4 & 5: Check that wp-config.php points to the Cloud SQL instance and Check that the blog still responds to requests 64 | 65 | * Restart the Apache webserver service 66 | 67 | ```cmd 68 | sudo service apache2 restart 69 | ``` 70 | 71 | * Replace the Database IP configuration from localhost to our cloudSQL instance IP 72 | 73 | ```cmd 74 | sudo sed -i "s/localhost/$CLOUD_SQL_IP/g" /var/www/html/wordpress/wp-config.php 75 | ``` 76 | 77 | # Congratulations!🎉 You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Serverless Firebase Development Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Serverless Firebase Development: Challenge Lab [GSP344]* 2 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/14677?parent=catalog) 3 | 4 | ## [YouTube Solution Link](https://www.youtube.com/watch?v=-TefmOhEEPA) 5 | ### Start by executing the following commands: 6 | 7 | * Run the below Commands 8 | 9 | ```cmd 10 | gcloud config set project $(gcloud projects list --format='value(PROJECT_ID)' --filter='qwiklabs-gcp') 11 | git clone https://github.com/rosera/pet-theory.git 12 | ``` 13 | ## Task 1. Create a Firestore database 14 | 15 | 16 | * We have to Create a Firestore database. 
17 | * Click on Search > Firestore > Choose Native Store > Add location Nam5 (United States) 18 | 19 | ## Task - 2: Firestore Database Populate 20 | 21 | ```cmd 22 | cd pet-theory/lab06/firebase-import-csv/solution 23 | npm install 24 | node index.js netflix_titles_original.csv 25 | ``` 26 | 27 | ## Task - 3: Cloud Build Rest API Staging 28 | 29 | ```cmd 30 | cd ~/pet-theory/lab06/firebase-rest-api/solution-01 31 | npm install 32 | gcloud builds submit --tag gcr.io/$GOOGLE_CLOUD_PROJECT/rest-api:0.1 33 | gcloud beta run deploy --image gcr.io/$GOOGLE_CLOUD_PROJECT/rest-api:0.1 --allow-unauthenticated 34 | ``` 35 | 36 | * Choose `29` and `us-central1` 37 | 38 | 39 | ## Task - 4: Cloud Build Rest API Production 40 | 41 | ```cmd 42 | cd ~/pet-theory/lab06/firebase-rest-api/solution-02 43 | npm install 44 | gcloud builds submit --tag gcr.io/$GOOGLE_CLOUD_PROJECT/rest-api:0.2 45 | gcloud beta run deploy --image gcr.io/$GOOGLE_CLOUD_PROJECT/rest-api:0.2 --allow-unauthenticated 46 | ``` 47 | 48 | ### Goto `Cloud Run` and Click `Netflix-Dataset-Service` then copy *the url*. 49 | 50 | ```cmd 51 | SERVICE_URL= 52 | curl -X GET $SERVICE_URL/2019 53 | ``` 54 | 55 | ## Task - 5: Cloud Build Frontend Staging 56 | 57 | ```cmd 58 | cd ~/pet-theory/lab06/firebase-frontend/public 59 | nano app.js # comment line 3 and uncomment line 4, insert your netflix-dataset-service url 60 | npm install 61 | cd ~/pet-theory/lab06/firebase-frontend 62 | gcloud builds submit --tag gcr.io/$GOOGLE_CLOUD_PROJECT/frontend-staging:0.1 63 | gcloud beta run deploy --image gcr.io/$GOOGLE_CLOUD_PROJECT/frontend-staging:0.1 64 | ``` 65 | 66 | * Choose `29` and `us-central1` 67 | 68 | 69 | ## Task - 6: Cloud Build Frontend Production 70 | 71 | ```cmd 72 | gcloud builds submit --tag gcr.io/$GOOGLE_CLOUD_PROJECT/frontend-production:0.1 73 | gcloud beta run deploy --image gcr.io/$GOOGLE_CLOUD_PROJECT/frontend-production:0.1 74 | ``` 75 | 76 | * Choose `29` and `us-central1` 77 | 78 | # Congratulations🎉! You're all done with this Challenge lab. -------------------------------------------------------------------------------- /Build and Secure Networks in Google Cloud/Build and Secure Networks in Google Cloud Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Build and Secure Networks in Google Cloud: Challenge Lab [GSP322]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/12068?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/yXmMYUARPOE) 6 | 7 | Run the following commands in the Cloud Shell Terminal. 8 | 9 | ### Let's start with defining some variables given by Cloud Skill Boosts 10 | 11 | * For exporting the zone, first go to *Compute Engine* > *VM Instances* > Copy `Any Instance Zone`(both instances zone should be in the same.) 
12 | 13 | ``` 14 | export ZONE= 15 | ``` 16 | 17 | ``` 18 | export SSH_IAP_NETWORK_TAG= 19 | ``` 20 | 21 | ``` 22 | export SSH_INTERNAL_NETWORK_TAG= 23 | ``` 24 | 25 | ``` 26 | export HTTP_NETWORK_TAG= 27 | ``` 28 | 29 | ## Task 1 : Remove the overly permissive rules 30 | 31 | ``` 32 | gcloud compute firewall-rules delete open-access 33 | ``` 34 | 35 | ## Task 2 : Start the bastion host instance 36 | ``` 37 | gcloud compute instances start bastion --zone=$ZONE 38 | ``` 39 | 40 | ## Task 3 : Create a firewall rule that allows SSH (tcp/22) from the IAP service and add network tag on bastion 41 | 42 | ``` 43 | gcloud compute firewall-rules create ssh-ingress --allow=tcp:22 --source-ranges 35.235.240.0/20 --target-tags $SSH_IAP_NETWORK_TAG --network acme-vpc 44 | 45 | gcloud compute instances add-tags bastion --tags=$SSH_IAP_NETWORK_TAG --zone=$ZONE 46 | ``` 47 | 48 | ## Task 4 : Create a firewall rule that allows traffic on HTTP (tcp/80) to any address and add network tag on juice-shop 49 | 50 | ``` 51 | gcloud compute firewall-rules create http-ingress --allow=tcp:80 --source-ranges 0.0.0.0/0 --target-tags $HTTP_NETWORK_TAG --network acme-vpc 52 | 53 | gcloud compute instances add-tags juice-shop --tags=$HTTP_NETWORK_TAG --zone=$ZONE 54 | ``` 55 | 56 | ## Task 5 : Create a firewall rule that allows traffic on SSH (tcp/22) from acme-mgmt-subnet network address and add network tag on juice-shop 57 | 58 | ``` 59 | gcloud compute firewall-rules create internal-ssh-ingress --allow=tcp:22 --source-ranges 192.168.10.0/24 --target-tags $SSH_INTERNAL_NETWORK_TAG --network acme-vpc 60 | 61 | gcloud compute instances add-tags juice-shop --tags=$SSH_INTERNAL_NETWORK_TAG --zone=$ZONE 62 | ``` 63 | 64 | ## Task 6 : SSH to bastion host via IAP and juice-shop via bastion 65 | 66 | ``` 67 | gcloud compute ssh --zone "$ZONE" "bastion" --tunnel-through-iap --project $DEVSHELL_PROJECT_ID 68 | ``` 69 | 70 | * If prompted, type `y` & then hit `Enter Two Times`. You'll see you're successfully login to the bastion VM 71 | 72 | ``` 73 | export JUICE_SHOP_VM_INTERNAL_IP=$(gcloud compute instances describe juice-shop --format='get(networkInterfaces[0].networkIP)') 74 | 75 | gcloud compute ssh juice-shop --internal-ip 76 | ``` 77 | 78 | * If prompted, type `y` & then hit `Enter Two Times`. You'll see you're successfully login to the juice-shop VM from bastion VM. 79 | 80 | ``` 81 | ssh $JUICE_SHOP_VM_INTERNAL_IP 82 | ``` 83 | 84 | # Congratulations🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Use APIs to Work with Cloud Storage Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Use APIs to Work with Cloud Storage: Challenge Lab [ARC125]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/65991?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/qowlfzbHotI) 6 | 7 | Run the following commands in the Cloud Shell Terminal. 8 | 9 | ``` 10 | gcloud services enable fitness.googleapis.com 11 | 12 | export OAUTH2_TOKEN=$(gcloud auth print-access-token) 13 | ``` 14 | 15 | ## Task 1. Create two Cloud Storage buckets 16 | 17 | * This will create and call the Bucket-1. 
18 | 19 | ``` 20 | echo '{ 21 | "name": "'"$DEVSHELL_PROJECT_ID"'-bucket-1", 22 | "location": "us", 23 | "storageClass": "multi_regional" 24 | } 25 | ' > values.json 26 | 27 | curl -X POST --data-binary @values.json \ 28 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 29 | -H "Content-Type: application/json" \ 30 | "https://www.googleapis.com/storage/v1/b?project=$DEVSHELL_PROJECT_ID" 31 | ``` 32 | 33 | * This will create and call the Bucket-2. 34 | 35 | ``` 36 | echo '{ 37 | "name": "'"$DEVSHELL_PROJECT_ID"'-bucket-2", 38 | "location": "us", 39 | "storageClass": "multi_regional" 40 | }' > values.json 41 | 42 | curl -X POST --data-binary @values.json \ 43 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 44 | -H "Content-Type: application/json" \ 45 | "https://www.googleapis.com/storage/v1/b?project=$DEVSHELL_PROJECT_ID" 46 | 47 | ``` 48 | 49 | ## Task 2. Upload an image file to a Cloud Storage Bucket 50 | 51 | ``` 52 | curl -X POST --data-binary @/home/gcpstaging25084_student/demo-image.png \ 53 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 54 | -H "Content-Type: image/png" \ 55 | "https://www.googleapis.com/upload/storage/v1/b/$DEVSHELL_PROJECT_ID-bucket-1/o?uploadType=media&name=demo-image" 56 | ``` 57 | 58 | ## Task 3. Copy a file to another bucket 59 | 60 | ``` 61 | curl -X POST \ 62 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 63 | -H "Content-Type: application/json" \ 64 | -d '{ 65 | "destination": "gs://$DEVSHELL_PROJECT_ID-bucket-2/demo-image" 66 | }' \ 67 | "https://storage.googleapis.com/storage/v1/b/$DEVSHELL_PROJECT_ID-bucket-1/o/demo-image/copyTo/b/$DEVSHELL_PROJECT_ID-bucket-2/o/demo-image" 68 | ``` 69 | 70 | ## Task 4. Make an object (file) publicly accessible 71 | 72 | ``` 73 | echo '{ 74 | "entity": "allUsers", 75 | "role": "READER" 76 | } 77 | ' > values.json 78 | 79 | curl -X POST --data-binary @values.json \ 80 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 81 | -H "Content-Type: application/json" \ 82 | "https://storage.googleapis.com/storage/v1/b/$DEVSHELL_PROJECT_ID-bucket-1/o/demo-image/acl" 83 | ``` 84 | 85 | ## Task 5. Delete the object file and a Cloud Storage bucket (Bucket 1) 86 | 87 | ``` 88 | curl -X DELETE \ 89 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 90 | "https://storage.googleapis.com/storage/v1/b/gs://$DEVSHELL_PROJECT_ID-bucket-1/o/demo-image" 91 | 92 | curl -X DELETE \ 93 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 94 | "https://storage.googleapis.com/storage/v1/b/$DEVSHELL_PROJECT_ID-bucket-1/o/demo-image" 95 | 96 | curl -X DELETE \ 97 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 98 | "https://storage.googleapis.com/storage/v1/b/$DEVSHELL_PROJECT_ID-bucket-1" 99 | ``` 100 | 101 | # Congratulations 🎉! You're all done with this Challenge Lab. -------------------------------------------------------------------------------- /Build a Website on Google Cloud/Build a Website on Google Cloud Challenge Lab .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Build a Website on Google Cloud: Challenge Lab [GSP319]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/11765?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/KsONu06SsDw) 6 | Run the following commands in the Cloud Shell Terminal. 
7 | 8 | ``` 9 | export MONOLITH= 10 | 11 | export CLUSTER= 12 | 13 | export ORDERS= 14 | 15 | export PRODUCTS= 16 | 17 | export FRONTEND= 18 | 19 | gcloud config set compute/zone [ZONE] 20 | 21 | gcloud services enable container.googleapis.com \ 22 | cloudbuild.googleapis.com 23 | ``` 24 | 25 | ## Task 1: Download the monolith code and build your container 26 | 27 | ``` 28 | git clone https://github.com/googlecodelabs/monolith-to-microservices.git 29 | cd ~/monolith-to-microservices 30 | 31 | ./setup.sh 32 | 33 | nvm install --lts 34 | 35 | cd ~/monolith-to-microservices/monolith 36 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/${MONOLITH}:1.0.0 . 37 | ``` 38 | 39 | ## Task 2. Create a kubernetes cluster and deploy the application 40 | 41 | ``` 42 | gcloud container clusters create $CLUSTER --num-nodes 3 43 | 44 | kubectl create deployment $MONOLITH --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/${MONOLITH}:1.0.0 45 | 46 | kubectl expose deployment $MONOLITH --type=LoadBalancer --port 80 --target-port 8080 47 | ``` 48 | 49 | ## Task 3. Create new microservices 50 | 51 | ``` 52 | cd ~/monolith-to-microservices/microservices/src/orders 53 | 54 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/${ORDERS}:1.0.0 . 55 | 56 | cd ~/monolith-to-microservices/microservices/src/products 57 | 58 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/${PRODUCTS}:1.0.0 . 59 | ``` 60 | 61 | ## Task 4. Deploy the new microservices 62 | 63 | ``` 64 | kubectl create deployment $ORDERS --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/${ORDERS}:1.0.0 65 | 66 | kubectl expose deployment $ORDERS --type=LoadBalancer --port 80 --target-port 8081 67 | 68 | kubectl create deployment $PRODUCTS --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/${PRODUCTS}:1.0.0 69 | 70 | kubectl expose deployment $PRODUCTS --type=LoadBalancer --port 80 --target-port 8082 71 | 72 | kubectl get service 73 | ``` 74 | 75 | ## Task 5. Configure and deploy the Frontend microservice 76 | 77 | ``` 78 | export ORDERS_IP=$(kubectl get services -o jsonpath="{.items[1].status.loadBalancer.ingress[0].ip}") 79 | 80 | export PRODUCTS_IP=$(kubectl get services -o jsonpath="{.items[2].status.loadBalancer.ingress[0].ip}") 81 | 82 | cd ~/monolith-to-microservices/react-app 83 | 84 | sed -i "s/localhost:8081/$ORDERS_IP/g" .env 85 | 86 | sed -i "s/localhost:8082/$PRODUCTS_IP/g" .env 87 | 88 | npm run build 89 | ``` 90 | 91 | ## Task 6. Create a containerized version of the Frontend microservice 92 | 93 | ``` 94 | cd ~/monolith-to-microservices/microservices/src/frontend 95 | 96 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/${FRONTEND}:1.0.0 . 97 | ``` 98 | 99 | ## Task 7. Deploy the Frontend microservice 100 | 101 | ``` 102 | kubectl create deployment $FRONTEND --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/${FRONTEND}:1.0.0 103 | 104 | kubectl expose deployment $FRONTEND --type=LoadBalancer --port 80 --target-port 8080 105 | 106 | kubectl get svc 107 | ``` 108 | 109 | # Congratulations🎉! You're all done with this Challenge lab. -------------------------------------------------------------------------------- /Get Started with Eventarc/Eventarc for Cloud Run [GSP773] .md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Eventarc for Cloud Run [GSP773]* 2 | 3 | ## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/15657?parent=catalog) 4 | 5 | 6 | Run the following commands in the Cloud Shell Terminal. 
7 | 8 | 15 | 16 | ``` 17 | export PROJECT_NUMBER="$(gcloud projects list \ 18 | --filter=$(gcloud config get-value project) \ 19 | --format='value(PROJECT_NUMBER)')" 20 | 21 | gcloud projects add-iam-policy-binding $(gcloud config get-value project) \ 22 | --member=serviceAccount:${PROJECT_NUMBER}-compute@developer.gserviceaccount.com \ 23 | --role='roles/eventarc.admin' 24 | 25 | gcloud beta eventarc attributes types list 26 | 27 | gcloud beta eventarc attributes types describe \ 28 | google.cloud.pubsub.topic.v1.messagePublished 29 | 30 | export SERVICE_NAME=event-display 31 | 32 | export IMAGE_NAME="gcr.io/cloudrun/hello" 33 | 34 | gcloud run deploy ${SERVICE_NAME} \ 35 | --image ${IMAGE_NAME} \ 36 | --allow-unauthenticated \ 37 | --max-instances=3 38 | 39 | gcloud beta eventarc attributes types describe \ 40 | google.cloud.pubsub.topic.v1.messagePublished 41 | 42 | gcloud beta eventarc triggers create trigger-pubsub \ 43 | --destination-run-service=${SERVICE_NAME} \ 44 | --matching-criteria="type=google.cloud.pubsub.topic.v1.messagePublished" 45 | 46 | export TOPIC_ID=$(gcloud eventarc triggers describe trigger-pubsub \ 47 | --format='value(transport.pubsub.topic)') 48 | 49 | gcloud eventarc triggers list 50 | 51 | gcloud pubsub topics publish ${TOPIC_ID} --message="Hello there" 52 | 53 | export BUCKET_NAME=$(gcloud config get-value project)-cr-bucket 54 | 55 | gsutil mb -p $(gcloud config get-value project) \ 56 | -l $(gcloud config get-value run/region) \ 57 | gs://${BUCKET_NAME}/ 58 | ``` 59 | 60 | ### Enable Audit Logs 61 | 62 | * From the *Navigation menu*, select *IAM & Admin* > `Audit Logs`. 63 | 64 | * In the list of services, check the box for Google Cloud Storage. 65 | 66 | * On the right hand side, click the `LOG TYPE tab`. Admin Write is selected by default, make sure you also select `Admin Read`, `Data Read`, `Data Write` and then click `Save`. 67 | 68 | ``` 69 | echo "Hello World" > random.txt 70 | 71 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 72 | ``` 73 | 74 | 75 | * In the Cloud Console, go to *Navigation menu* > *Logging* > `Logs Explorer`. 76 | 77 | * Under *Resource*, choose `GCS Bucket` > [Bucket Name] > `Location` then choose `your bucket` and `its location`. 78 | 79 | * Click `Add`. 80 | 81 | * Click on `Run Query`. 82 | 83 | ``` 84 | gcloud beta eventarc attributes types describe google.cloud.audit.log.v1.written 85 | 86 | gcloud beta eventarc triggers create trigger-auditlog \ 87 | --destination-run-service=${SERVICE_NAME} \ 88 | --matching-criteria="type=google.cloud.audit.log.v1.written" \ 89 | --matching-criteria="serviceName=storage.googleapis.com" \ 90 | --matching-criteria="methodName=storage.objects.create" \ 91 | --service-account=${PROJECT_NUMBER}-compute@developer.gserviceaccount.com 92 | 93 | gcloud eventarc triggers list 94 | 95 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 96 | ``` 97 | * The above command will take some time to get the score in the Task 6. 98 | 99 | # Congratulations🎉! You're all done with this lab. 
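* If Task 6 is slow to score, you can confirm the audit-log trigger actually fired by reading the service's recent logs (a sketch; `event-display` is the service deployed earlier):

```
gcloud logging read 'resource.type=cloud_run_revision AND resource.labels.service_name=event-display' --limit 5
```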
-------------------------------------------------------------------------------- /Configure Secure RDP using a Windows Bastion Host/Configure Secure RDP using a Windows Bastion Host.md: --------------------------------------------------------------------------------

## `Lab Name` - *Configure Secure RDP using a Windows Bastion Host [GSP303]*

## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/1737?parent=catalog)

## [YouTube Solution Link](https://www.youtube.com/watch?v=_mBh_nPEdBM)

### Run the below commands in the Cloud Shell Terminal.

```cmd
export Region=
```

```cmd
export Zone=
```

## Task 1: A new non-default VPC has been created

```cmd
gcloud compute networks create securenetwork --subnet-mode custom
```

## Task 2: The new VPC contains a new non-default subnet within it

```cmd
gcloud compute networks subnets create securenetwork-subnet --network=securenetwork --region $Region --range=192.168.16.0/20
```

## Task 3: A firewall rule exists that allows TCP port 3389 traffic (for RDP)

```cmd
gcloud compute firewall-rules create rdp-ingress-fw-rule --allow=tcp:3389 --source-ranges 0.0.0.0/0 --target-tags allow-rdp-traffic --network securenetwork
```

## Task 4: A Windows compute instance called vm-bastionhost exists that has a public ip-address to which the TCP port 3389 firewall rule applies.

```cmd
gcloud compute instances create vm-bastionhost --zone=$Zone --machine-type=e2-medium --network-interface=subnet=securenetwork-subnet --network-interface=subnet=default,no-address --tags=allow-rdp-traffic --image=projects/windows-cloud/global/images/windows-server-2016-dc-v20220513
```

## Task 5: A Windows compute instance called vm-securehost exists that does not have a public ip-address

* The command below creates the instance with `No External IP Address`, which is `required` by Task 5.
```cmd
gcloud compute instances create vm-securehost --zone=$Zone --machine-type=e2-medium --network-interface=subnet=securenetwork-subnet,no-address --network-interface=subnet=default,no-address --tags=allow-rdp-traffic --image=projects/windows-cloud/global/images/windows-server-2016-dc-v20220513
```
### *You can end your lab at Task 5 and check whether the green tick appears beside each task.*
### *If you see the green tick, then congratulations! Your work here was successful.*

* If you also want to perform Task 6, first delete the existing `vm-securehost` (two instances cannot share a name in the same zone), then recreate it with the command below.
```
gcloud compute instances create vm-securehost --zone=$Zone --machine-type=e2-medium --network-interface=subnet=securenetwork-subnet --network-interface=subnet=default,no-address --tags=allow-rdp-traffic --image=projects/windows-cloud/global/images/windows-server-2016-dc-v20220513
```

## Task 6: The vm-securehost is running Microsoft IIS web server software.

```cmd
gcloud compute reset-windows-password vm-securehost --user app_admin --zone $Zone
```

If prompted with `y/n`, type `y` and hit Enter.

* Now copy the generated username and password for the `app_admin` user; you need these to log in through the bastion VM.

## Manual Steps (*Necessary to perform)

* Click on `Add roles and features`.
![Image1](img1.png)
![Image2](img2.png)
![Image3](img3.png)
![Image4](img4.png)
![Image5](img5.png)

* Click *Next* > *Next* > *Install*

# Congratulations🎉! You are done with this Challenge Lab.
-------------------------------------------------------------------------------- /Cloud Functions 3 Ways Challenge Lab.md: --------------------------------------------------------------------------------

## `Lab Name` - *Cloud Functions: 3 Ways: Challenge Lab [ARC104]*

## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/61974?parent=catalog)

Run the below commands in the Cloud Shell Terminal.

* Enable the required APIs.

```
gcloud services enable \
  artifactregistry.googleapis.com \
  cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  eventarc.googleapis.com \
  run.googleapis.com \
  logging.googleapis.com \
  pubsub.googleapis.com
```

* Store values in variables.

```
export REGION=

export HTTP_FUNCTION=

export FUNCTION_NAME=

export BUCKET="gs://$DEVSHELL_PROJECT_ID"
```

## Task 1. Create a Cloud Storage bucket

```
PROJECT_NUMBER=$(gcloud projects list --filter="project_id:$DEVSHELL_PROJECT_ID" --format='value(project_number)')

SERVICE_ACCOUNT=$(gsutil kms serviceaccount -p $PROJECT_NUMBER)

gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \
  --member serviceAccount:$SERVICE_ACCOUNT \
  --role roles/pubsub.publisher

gsutil mb -l $REGION gs://$DEVSHELL_PROJECT_ID
```

## Task 2. Create, deploy, and test a Cloud Storage function (2nd gen)

* Create `index.js` with the standard Functions Framework CloudEvent handler (the entry point below reuses `$FUNCTION_NAME`; keep it consistent with your deploy command) and a minimal `package.json` that pulls in the framework:

```
mkdir ~/$FUNCTION_NAME && cd $_

touch index.js && touch package.json

cat > index.js <<EOF
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('$FUNCTION_NAME', cloudevent => {
  console.log('A new event in your Cloud Storage bucket has been logged!');
  console.log(cloudevent);
});
EOF

cat > package.json <<EOF
{
  "name": "gcf_hello_world",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/functions-framework": "^3.0.0"
  }
}
EOF
```

* Deploy the function as a 2nd-gen Cloud Storage-triggered function on your bucket, using the runtime, region, and trigger settings from your lab instructions, then upload a file to the bucket to test it.

## Task 3. Create, deploy, and test an HTTP function (2nd gen)

* Create the HTTP function; `functions.http` registers the handler, and the response string is the one from the original file:

```
mkdir ~/$HTTP_FUNCTION && cd $_

touch index.js && touch package.json

cat > index.js <<EOF
const functions = require('@google-cloud/functions-framework');

functions.http('$HTTP_FUNCTION', (req, res) => {
  res.status(200).send('subscribe to quikclab');
});
EOF

cat > package.json <<EOF
{
  "name": "gcf_hello_world",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/functions-framework": "^3.0.0"
  }
}
EOF
```

* Deploy it as a 2nd-gen HTTP function (allow unauthenticated invocations so you can test the URL), per your lab instructions.

# Congratulations🎉! You're all done with this Challenge Lab.
-------------------------------------------------------------------------------- /Build a Website on Google Cloud/Deploy, Scale, and Update Your Website on Google Kubernetes Engine.md: --------------------------------------------------------------------------------

## `Lab Name` - *Deploy, Scale, and Update Your Website on Google Kubernetes Engine [GSP663]*

Run the following commands in the Cloud Shell Terminal.

```
gcloud config set compute/zone [ZONE]
```

```
cd ~
git clone https://github.com/googlecodelabs/monolith-to-microservices.git
cd ~/monolith-to-microservices
./setup.sh
```

## Task 2. Create a GKE cluster

```
gcloud services enable container.googleapis.com

gcloud container clusters create fancy-cluster --num-nodes 3 --machine-type=e2-standard-4

gcloud compute instances list
```

## Task 3. Deploy the existing monolith

```
cd ~/monolith-to-microservices
./deploy-monolith.sh

kubectl get service monolith
```

## Task 4. Migrate orders to a microservice

```
cd ~/monolith-to-microservices/microservices/src/orders
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/orders:1.0.0 .

kubectl create deployment orders --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/orders:1.0.0

kubectl get all

kubectl expose deployment orders --type=LoadBalancer --port 80 --target-port 8081

kubectl get service orders

cd ~/monolith-to-microservices/react-app
nano .env.monolith
```

* Replace the `REACT_APP_ORDERS_URL` with the new format, substituting the external IP reported by `kubectl get service orders` for `[ORDERS_IP]`:

```
REACT_APP_ORDERS_URL=http://[ORDERS_IP]/api/orders
REACT_APP_PRODUCTS_URL=/service/products
```

* Press `CTRL+O`, press `ENTER`, then `CTRL+X` to save the file in the nano editor.
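* For example, if `kubectl get service orders` reported an external IP of `203.0.113.10` (an illustrative address), the edited file would read:

```
REACT_APP_ORDERS_URL=http://203.0.113.10/api/orders
REACT_APP_PRODUCTS_URL=/service/products
```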
```
npm run build:monolith

cd ~/monolith-to-microservices/monolith
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 .

kubectl set image deployment/monolith monolith=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0
```

## Task 5. Migrate Products to microservice

```
cd ~/monolith-to-microservices/microservices/src/products
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/products:1.0.0 .

kubectl create deployment products --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/products:1.0.0

kubectl expose deployment products --type=LoadBalancer --port 80 --target-port 8082

kubectl get service products

cd ~/monolith-to-microservices/react-app
nano .env.monolith
```

* Replace both URLs with the new format, substituting the external IPs reported by `kubectl get service` for `[ORDERS_IP]` and `[PRODUCTS_IP]`:

```
REACT_APP_ORDERS_URL=http://[ORDERS_IP]/api/orders
REACT_APP_PRODUCTS_URL=http://[PRODUCTS_IP]/api/products
```

* Press `CTRL+O`, press `ENTER`, then `CTRL+X` to save the file.

```
npm run build:monolith

cd ~/monolith-to-microservices/monolith
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:3.0.0 .

kubectl set image deployment/monolith monolith=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:3.0.0
```

## Task 6. Migrate frontend to microservice

```
cd ~/monolith-to-microservices/react-app
cp .env.monolith .env
npm run build

cd ~/monolith-to-microservices/microservices/src/frontend
gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/frontend:1.0.0 .

kubectl create deployment frontend --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/frontend:1.0.0

kubectl expose deployment frontend --type=LoadBalancer --port 80 --target-port 8080
```

# Congratulations🎉! You're all done with this lab.
-------------------------------------------------------------------------------- /Create ML Models with BigQuery ML Challenge Lab.md: --------------------------------------------------------------------------------

## `Lab Name` - *Create ML Models with BigQuery ML: Challenge Lab [GSP341]*

## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/14294?parent=catalog)

## [YouTube Solution Link](https://youtu.be/tpiil_p4uKk)

## Task 1. Create a dataset to store your machine learning models

```cmd
bq mk austin
```

* *Navigation Menu* -> `BigQuery`.
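* `bq mk austin` creates an empty dataset named `austin` in the current project; you can confirm it exists before running the model queries:

```cmd
bq ls
```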
## Task 2. Create a forecasting BigQuery machine learning model

Run the following in the `BigQuery Console Query Editor`.

* Replace `[YEAR]` in the queries below with the year given in your lab instructions.

```cmd
CREATE OR REPLACE MODEL austin.location_model
OPTIONS
  (model_type='linear_reg', labels=['duration_minutes']) AS
SELECT
    start_station_name,
    EXTRACT(HOUR FROM start_time) AS start_hour,
    EXTRACT(DAYOFWEEK FROM start_time) AS day_of_week,
    duration_minutes,
    address as location
FROM
    `bigquery-public-data.austin_bikeshare.bikeshare_trips` AS trips
JOIN
    `bigquery-public-data.austin_bikeshare.bikeshare_stations` AS stations
ON
    trips.start_station_name = stations.name
WHERE
    EXTRACT(YEAR FROM start_time) = [YEAR]
    AND duration_minutes > 0
```

## Task 3: Create the second machine learning model

* Query

```cmd
CREATE OR REPLACE MODEL austin.subscriber_model
OPTIONS
  (model_type='linear_reg', labels=['duration_minutes']) AS
SELECT
    start_station_name,
    EXTRACT(HOUR FROM start_time) AS start_hour,
    subscriber_type,
    duration_minutes
FROM `bigquery-public-data.austin_bikeshare.bikeshare_trips` AS trips
WHERE EXTRACT(YEAR FROM start_time) = [YEAR]
```

## Task 4: Evaluate the two machine learning models

* 4.1 Query

```cmd
-- Evaluation metrics for location_model
SELECT
  SQRT(mean_squared_error) AS rmse,
  mean_absolute_error
FROM
  ML.EVALUATE(MODEL austin.location_model, (
  SELECT
      start_station_name,
      EXTRACT(HOUR FROM start_time) AS start_hour,
      EXTRACT(DAYOFWEEK FROM start_time) AS day_of_week,
      duration_minutes,
      address as location
  FROM
      `bigquery-public-data.austin_bikeshare.bikeshare_trips` AS trips
  JOIN
      `bigquery-public-data.austin_bikeshare.bikeshare_stations` AS stations
  ON
      trips.start_station_name = stations.name
  WHERE EXTRACT(YEAR FROM start_time) = [YEAR])
)
```

* 4.2 Query

```cmd
-- Evaluation metrics for subscriber_model
SELECT
  SQRT(mean_squared_error) AS rmse,
  mean_absolute_error
FROM
  ML.EVALUATE(MODEL austin.subscriber_model, (
  SELECT
      start_station_name,
      EXTRACT(HOUR FROM start_time) AS start_hour,
      subscriber_type,
      duration_minutes
  FROM
      `bigquery-public-data.austin_bikeshare.bikeshare_trips` AS trips
  WHERE
      EXTRACT(YEAR FROM start_time) = [YEAR])
)
```

## Task 5: Use the subscriber type machine learning model to predict average trip durations

* 5.1 Query

```cmd
SELECT
  start_station_name,
  COUNT(*) AS trips
FROM
  `bigquery-public-data.austin_bikeshare.bikeshare_trips`
WHERE
  EXTRACT(YEAR FROM start_time) = [YEAR]
GROUP BY
  start_station_name
ORDER BY
  trips DESC
```

* 5.2 Query

```cmd
SELECT AVG(predicted_duration_minutes) AS average_predicted_trip_length
FROM ML.predict(MODEL austin.subscriber_model, (
  SELECT
    start_station_name,
    EXTRACT(HOUR FROM start_time) AS start_hour,
    subscriber_type,
    duration_minutes
  FROM
    `bigquery-public-data.austin_bikeshare.bikeshare_trips`
  WHERE
    EXTRACT(YEAR FROM start_time) = [YEAR]
    AND subscriber_type = 'Single Trip'
    AND start_station_name = '21st & Speedway @PCL'))
```
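* A note on reading the Task 5 output: `ML.PREDICT` echoes its input columns and appends a prediction column named `predicted_<label>`, here `predicted_duration_minutes`, which the outer query averages. To eyeball a single prediction first, a sketch with hand-picked feature values (the hour is illustrative):

```cmd
SELECT predicted_duration_minutes
FROM ML.PREDICT(MODEL austin.subscriber_model, (
  SELECT
    '21st & Speedway @PCL' AS start_station_name,
    17 AS start_hour,
    'Single Trip' AS subscriber_type))
```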
# Congratulations🎉! You're done with this Challenge Lab.
-------------------------------------------------------------------------------- /Integrate with Machine Learning APIs/analyze-images-v2.py: --------------------------------------------------------------------------------
import os
import sys

# Import Google Cloud Library modules
from google.cloud import storage, bigquery, language, vision, translate_v2

if ('GOOGLE_APPLICATION_CREDENTIALS' in os.environ):
    if (not os.path.exists(os.environ['GOOGLE_APPLICATION_CREDENTIALS'])):
        print("The GOOGLE_APPLICATION_CREDENTIALS file does not exist.\n")
        exit()
else:
    print("The GOOGLE_APPLICATION_CREDENTIALS environment variable is not defined.\n")
    exit()

if len(sys.argv) < 3:
    print('You must provide parameters for the Google Cloud project ID and Storage bucket')
    print('python3 ' + sys.argv[0] + ' [PROJECT_NAME] [BUCKET_NAME]')
    exit()

project_name = sys.argv[1]
bucket_name = sys.argv[2]

# Set up our GCS, BigQuery, and Natural Language clients
storage_client = storage.Client()
bq_client = bigquery.Client(project=project_name)
nl_client = language.LanguageServiceClient()

# Set up client objects for the vision and translate_v2 API Libraries
vision_client = vision.ImageAnnotatorClient()
translate_client = translate_v2.Client()

# Setup the BigQuery dataset and table objects
dataset_ref = bq_client.dataset('image_classification_dataset')
dataset = bigquery.Dataset(dataset_ref)
table_ref = dataset.table('image_text_detail')
table = bq_client.get_table(table_ref)

# Create an array to store results data to be inserted into the BigQuery table
rows_for_bq = []

# Get a list of the files in the Cloud Storage Bucket
files = storage_client.bucket(bucket_name).list_blobs()
bucket = storage_client.bucket(bucket_name)

print('Processing image files from GCS. This will take a few minutes..')

# Process files from Cloud Storage and save the result to send to BigQuery
for file in files:
    if file.name.endswith('jpg') or file.name.endswith('png'):
        file_content = file.download_as_string()

        # Create a Vision API image object from the file contents
        # Ref: https://googleapis.dev/python/vision/latest/gapic/v1/types.html#google.cloud.vision_v1.types.Image
        image = vision.Image(content=file_content)

        # Detect text in the image and save the response data into an object called response
        # Ref: https://googleapis.dev/python/vision/latest/gapic/v1/api.html#google.cloud.vision_v1.ImageAnnotatorClient.document_text_detection
        response = vision_client.text_detection(image=image)

        # Skip this file if the Vision API found no text in the image
        # (indexing text_annotations[0] would otherwise raise an IndexError)
        if len(response.text_annotations) == 0:
            continue

        # Save the text content found by the vision API into a variable called text_data
        text_data = response.text_annotations[0].description

        # Save the text detection response data in .txt to cloud storage
        file_name = file.name.split('.')[0] + '.txt'
        blob = bucket.blob(file_name)
        # Upload the contents of the text_data string variable to the Cloud Storage file
        blob.upload_from_string(text_data, content_type='text/plain')

        # Extract the description and locale data from the response
        # into variables called desc and locale
        desc = response.text_annotations[0].description
        locale = response.text_annotations[0].locale

        # if the locale is English (en) save the description as the translated_text
        if locale == 'en':
            translated_text = desc
        else:
            # For non-EN locales pass the description data to the translation API
            # ref: https://googleapis.dev/python/translation/latest/client.html#google.cloud.translate_v2.client.Client.translate
            # Set the target_language locale to 'en'
            translation = translate_client.translate(desc, target_language='en')
            translated_text = translation['translatedText']
            print(translated_text)

        # Save the original text read from the image,
        # the locale, translated text, and filename
        rows_for_bq.append((desc, locale, translated_text, file.name))

print('Writing Vision API image data to BigQuery...')
# Write original text, locale and translated text to BQ
errors = bq_client.insert_rows(table, rows_for_bq)

assert errors == []
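# A sketch of how this script is invoked (names follow the checks above):
# GOOGLE_APPLICATION_CREDENTIALS must point at a readable key file, and the
# two positional arguments are the project ID and the bucket name. In this
# challenge lab the bucket is typically named after the project, e.g.:
#
#   python3 analyze-images-v2.py $DEVSHELL_PROJECT_ID $DEVSHELL_PROJECT_ID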
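* A quick worked example of the helper below: with `export ZONE=us-east1-b`, `${ZONE::-2}` drops the trailing `-b` so `REGION=us-east1`, and the last-character checks pick a second zone in the same region, here `NZONE=us-east1-c`. The two instance groups created later then land in different zones of the same region.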
### Get the Zone from Task 2

```
export ZONE=
```

```
export REGION=${ZONE::-2}
last_char="${ZONE: -1}"
if [ "$last_char" == "a" ]; then
  export NZONE="${ZONE%?}b"
elif [ "$last_char" == "b" ]; then
  export NZONE="${ZONE%?}c"
elif [ "$last_char" == "c" ]; then
  export NZONE="${ZONE%?}b"
elif [ "$last_char" == "d" ]; then
  export NZONE="${ZONE%?}b"
fi
```

## Task 1. Configure HTTP and health check firewall rules

* Both rules belong on the `my-internal-app` network, where the backend instances live; health check probes come from `130.211.0.0/22` and `35.191.0.0/16`.

```
gcloud compute firewall-rules create app-allow-http --network my-internal-app --action allow --direction INGRESS --target-tags lb-backend --source-ranges 0.0.0.0/0 --rules tcp:80

gcloud compute firewall-rules create app-allow-health-check --network my-internal-app --action allow --direction INGRESS --target-tags lb-backend --source-ranges 130.211.0.0/22,35.191.0.0/16 --rules tcp
```

## Task 2. Configure instance templates and create instance groups

```
gcloud compute instance-templates create instance-template-1 --machine-type=e2-medium --network=my-internal-app --region $REGION --subnet=subnet-a --tags=lb-backend --metadata=startup-script-url=gs://cloud-training/gcpnet/ilb/startup.sh

gcloud compute instance-templates create instance-template-2 --machine-type=e2-medium --network=my-internal-app --region $REGION --subnet=subnet-b --tags=lb-backend --metadata=startup-script-url=gs://cloud-training/gcpnet/ilb/startup.sh

gcloud compute instance-groups managed create instance-group-1 --base-instance-name=instance-group-1 --template=instance-template-1 --zone=$ZONE --size=1

gcloud compute instance-groups managed set-autoscaling instance-group-1 --zone=$ZONE --cool-down-period=45 --max-num-replicas=5 --min-num-replicas=1 --target-cpu-utilization=0.8

gcloud compute instance-groups managed create instance-group-2 --base-instance-name=instance-group-2 --template=instance-template-2 --zone=$NZONE --size=1

gcloud compute instance-groups managed set-autoscaling instance-group-2 --zone=$NZONE --cool-down-period=45 --max-num-replicas=5 --min-num-replicas=1 --target-cpu-utilization=0.8

gcloud compute instances create utility-vm --zone=$ZONE --machine-type=e2-micro --image-family=debian-10 --image-project=debian-cloud --boot-disk-size=10GB --boot-disk-type=pd-standard --network=my-internal-app --subnet=subnet-a --private-network-ip=10.10.20.50
```

## Task 3. Configure the Internal Load Balancer

```
gcloud compute health-checks create tcp my-ilb-health-check \
    --description="Subscribe To Atul Gupta" \
    --check-interval=5s \
    --timeout=5s \
    --unhealthy-threshold=2 \
    --healthy-threshold=2 \
    --port=80 \
    --proxy-header=NONE

TOKEN=$(gcloud auth application-default print-access-token)
cat > 1.json < 2.json <
```
-------------------------------------------------------------------------------- /Monitoring in Google Cloud/Monitoring in Google Cloud Challenge Lab .md: --------------------------------------------------------------------------------

## `Lab Name` - *Monitoring in Google Cloud: Challenge Lab*

Run the following commands in the Cloud Shell Terminal.

```
gcloud compute ssh [VM Name] --zone=
```

```
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh --also-install
```

```
curl -sSO https://dl.google.com/cloudagents/add-monitoring-agent-repo.sh
sudo bash add-monitoring-agent-repo.sh --also-install
```

```
(cd /etc/stackdriver/collectd.d/ && sudo curl -O https://raw.githubusercontent.com/Stackdriver/stackdriver-agent-service-configs/master/etc/collectd.d/apache.conf)
sudo service stackdriver-agent restart
```

## Task 2.
Add an uptime check for Apache Web Server on the VM 31 | 32 | * In Google Cloud console, select *Navigation menu* > *Monitoring* > *Uptime Check* > click `Create Uptime Check`. 33 | 34 | 35 | * For *Protocol*, select `HTTP`. 36 | * For *Resource Type*, select `Instance`. 37 | * For *Instance*, select `VM_NAME`. 38 | * Click `Continue`. 39 | 40 | * In *Response Validation*, accept the defaults and then click `Continue`. 41 | * In *Alert & Notification*, accept the defaults, and then click `Continue`. 42 | * For *Title*, type `Subscribe_to_Atul_Gupta` 43 | * Click `Create`. 44 | 45 | ## Task 3. Add an alert policy for Apache Web Server 46 | 47 | * In the Cloud Console, in the left menu, Click *Alerting* > click `+Create Policy`. 48 | * Select the time series to be monitored: 49 | * Click *Select a metric* and enter `VM instance` into the Filter Bar. 50 | * In the *Active metric categories list*, select `Apache`. 51 | * In the *Active metrics list*, select `traffic`. 52 | 53 | * In the Transform data section, select the following values: 54 | * *Rolling window*: `1 min` 55 | * *Rolling window function*: `rate` 56 | 57 | 58 | * In the Configure alert trigger section, select the following values and click *Next*: 59 | * Alert trigger: `Any time series violates` 60 | * Threshold position: `Above threshold` 61 | * Threshold value: `3072` 62 | 63 | * Click on the drop down arrow next to Notification Channels, then click on Manage Notification Channels. 64 | * A Notification channels page will open in a new tab. 65 | 66 | * Scroll down the page and click on `ADD NEW for Email`. 67 | 68 | * In the *Create Email Channel dialog box*, enter the given `student email address` in the *Email Address field* and for a Display name enter `Student`. 69 | 70 | * Click on `Save`. 71 | 72 | * Go back to the previous Create alerting policy tab. 73 | 74 | * Click on *Notification Channels again*, then click on the `Refresh icon` to get the display name you mentioned in the previous step. 75 | 76 | * Click on Notification Channels again if necessary, select your Display name and click OK. 77 | 78 | * Mention the Alert name as `Subscribe_to_Atul_Gupta`. 79 | 80 | * Click *Next*. 81 | 82 | * Review the alert and click `Create Policy`. 83 | 84 | 85 | ## Task 4. Create a dashboard and charts for Apache Web Server on the VM 86 | 87 | * In the left menu select Dashboards, and then +Create Dashboard. 88 | * Name the dashboard - `Subscribe_to_Atul_Gupta` 89 | 90 | * Add the *first chart* 91 | 92 | * Click the Line option in the Chart library. 93 | 94 | * Name the chart title - `CPU Load`. 95 | 96 | * Click on *Resource & Metric dropdown*. Disable the `Show only active resources & metrics`. 97 | 98 | * Type `CPU load (1m)` in filter by *resource and metric name* and click on *VM instance* > *Cpu* > Select `CPU load (1m)` and click `Apply`. 99 | 100 | * Leave all other fields at the default value. Refresh the tab to view the graph. 101 | 102 | * Add the second chart 103 | 104 | * Click `+ Add Chart` and select `Line` option in the Chart library. 105 | 106 | * Name this chart - `Received Packets`. 107 | 108 | * Click on *Resource & Metric dropdown*. Disable the `Show only active resources & metrics`. 109 | 110 | * Type `Requests` in filter by *resource and metric name* and click on *VM instance* > `Apache` > Select `Requests` and click `Apply`. 111 | 112 | * Refresh the tab to view the graph. 113 | 114 | * Leave the other fields at their default values. You see the chart data. 115 | 116 | 117 | ## Task 5. 
Create a log-based metric

* In the Console, select *Navigation menu* > *Logging* > `Log-based Metrics`.
* *The Cloud Logging opens in the Console.*
* Click `Create metric`.
* In the *Create logs metric*:
  * Change the *Metric Type* to `Distribution`.
  * Name your *metric* `Subscribe_to_Atul_Gupta`
  * Paste the below code in the `Build Filter`.

```
resource.type="gce_instance"
logName="projects/PROJECT_ID/logs/apache-access"
textPayload:"200"
```
* Replace `PROJECT_ID` with your GCP Project ID.

* For Field name > Enter `textPayload`.

* Enter the below text in the `Regular Expression field`:

```
execution took (\d+)
```

# Congratulations🎉! You're all done with this Challenge Lab.
-------------------------------------------------------------------------------- /Cloud Speech API 3 Ways Challenge Lab .md: --------------------------------------------------------------------------------

## `Lab Name` - *Cloud Speech API 3 Ways: Challenge Lab [ARC132]*

## `Lab Link` - [Click Here](https://www.cloudskillsboost.google/focuses/67215?parent=catalog)

## [YouTube Solution Link](https://youtu.be/ELtMXEyZq5g)

Run the following commands in the Cloud Shell Terminal.

## Task 1. Create an API key

In this task, you create an API key.

* Go to *Navigation Menu* > *APIs and Services* > `Credentials` > Click `Create`, then select `API KEY`.

* Save the `GENERATED API KEY`.

Now, run the below commands in the Cloud Shell Terminal.

```
gcloud compute ssh lab-vm --zone=
```

```
export API_KEY=

TASK_2_FILE_NAME=""

TASK_3_REQUEST_FILE=""

TASK_3_RESPONSE_FILE=""

TASK_4_SENTENCE=""

TASK_4_FILE_NAME=""

TASK_5_SENTENCE=""

TASK_5_FILE_NAME=""
```

```
export PROJECT_ID=$(gcloud config get-value project)

source venv/bin/activate
```

* Create the `synthesize-text.json` file:

```
cat > synthesize-text.json < $TASK_2_FILE_NAME
```

* Create the `tts_decode.py` file:

```
cat > tts_decode.py < "$TASK_3_REQUEST_FILE" < "$TASK_4_FILE_NAME"
        echo "Translation saved to $TASK_4_FILE_NAME:"
        cat "$TASK_4_FILE_NAME"
    fi
fi
```

* URL-decode the sentence and detect its language:

```
decoded_sentence=$(python -c "import urllib.parse; print(urllib.parse.unquote('$TASK_5_SENTENCE'))")

# Make the Language Detection API request using curl
curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d "{\"q\": [\"$decoded_sentence\"]}" \
  "https://translation.googleapis.com/language/translate/v2/detect?key=${API_KEY}" \
  -o "$TASK_5_FILE_NAME"
```

# Congratulations🎉! You're done with this Challenge Lab.
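* To sanity-check the final step, print the saved response; the `detect` endpoint returns JSON whose `data.detections` entries carry the detected language code:

```
cat "$TASK_5_FILE_NAME"
```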
-------------------------------------------------------------------------------- /Insights from Data with BigQuery Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Insights from Data with BigQuery: Challenge Lab [GSP787]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/11988?parent=catalog) 4 | 5 | ## [YouTube Solution Link](https://youtu.be/yCaiNwnAvJU) 6 | 7 | * At first, create a Public Dataset. 8 | 9 | Open Public Dataset. 10 | 11 | In the GCP Console go to `Navigation Menu` > `BigQuery`. 12 | 13 | Click on `+ ADD DATA`. 14 | 15 | Then Explore `Public Datasets` from the left pane. 16 | 17 | Search `covid19_open_data` and Select it. 18 | 19 | ## Task 1. Total confirmed cases 20 | 21 | ```cmd 22 | SELECT 23 | SUM(cumulative_confirmed) AS total_cases_worldwide 24 | FROM 25 | `bigquery-public-data.covid19_open_data.covid19_open_data` 26 | WHERE 27 | date = "2020-06-25" 28 | ``` 29 | 30 | 31 | ## Task 2. Worst affected areas 32 | 33 | ```cmd 34 | with deaths_by_states as ( 35 | SELECT subregion1_name as state, sum(cumulative_deceased) as death_count 36 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` where country_name="United States of America" and date='2020-06-25' and subregion1_name is NOT NULL 37 | group by subregion1_name ) 38 | select count(*) as count_of_states from deaths_by_states where death_count > Enter_states 39 | ``` 40 | 41 | 42 | ## Task 3. Identifying hotspots 43 | 44 | ```cmd 45 | SELECT * FROM ( 46 | SELECT subregion1_name as state, sum(cumulative_confirmed) as total_confirmed_cases 47 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` WHERE country_code="US" AND date='2020-06-25' AND subregion1_name is NOT NULL 48 | GROUP BY subregion1_name ORDER BY total_confirmed_cases DESC ) WHERE total_confirmed_cases > Enter_confirmed_cases_number 49 | ``` 50 | 51 | 52 | ## Task 4.Fatality Ratio 53 | 54 | ```cmd 55 | SELECT SUM(cumulative_confirmed) AS total_confirmed_cases, SUM(cumulative_deceased) AS total_deaths, (SUM(cumulative_deceased)/SUM(cumulative_confirmed))*100 AS case_fatality_ratio 56 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 57 | WHERE country_name="Italy" AND date BETWEEN "2020-04-01" AND "2020-04-30" 58 | ``` 59 | 60 | 61 | ## Task 5.Identifying specific day 62 | 63 | ```cmd 64 | SELECT 65 | date 66 | FROM 67 | `bigquery-public-data.covid19_open_data.covid19_open_data` 68 | WHERE 69 | country_name = 'Italy' 70 | AND cumulative_deceased > Enter_Total_Number 71 | ORDER BY date 72 | LIMIT 1 73 | ``` 74 | 75 | 76 | ## Task 6: Finding days with zero net new cases 77 | 78 | ```cmd 79 | WITH india_cases_by_date AS ( 80 | SELECT 81 | date, 82 | SUM(cumulative_confirmed) AS cases 83 | FROM 84 | `bigquery-public-data.covid19_open_data.covid19_open_data` 85 | WHERE 86 | country_name="India" 87 | AND date between '2020-02-21' and '2020-03-14' 88 | GROUP BY 89 | date 90 | ORDER BY 91 | date ASC 92 | ) 93 | 94 | , india_previous_day_comparison AS 95 | (SELECT 96 | date, 97 | cases, 98 | LAG(cases) OVER(ORDER BY date) AS previous_day, 99 | cases - LAG(cases) OVER(ORDER BY date) AS net_new_cases 100 | FROM india_cases_by_date 101 | ) 102 | SELECT 103 | COUNT(date) 104 | FROM 105 | india_previous_day_comparison 106 | WHERE 107 | net_new_cases = 0 108 | ``` 109 | 110 | 111 | ## Task 7.Doubling rate 112 | 113 | ```cmd 114 | WITH us_cases_by_date AS ( 115 | SELECT 116 | date, 117 | SUM( cumulative_confirmed ) AS cases 118 | 
FROM 119 | `bigquery-public-data.covid19_open_data.covid19_open_data` 120 | WHERE 121 | country_name="United States of America" 122 | AND date between '2020-03-22' and '2020-04-20' 123 | GROUP BY 124 | date 125 | ORDER BY 126 | date ASC 127 | ) 128 | 129 | , us_previous_day_comparison AS 130 | (SELECT 131 | date, 132 | cases, 133 | LAG(cases) OVER(ORDER BY date) AS previous_day, 134 | cases - LAG(cases) OVER(ORDER BY date) AS net_new_cases, 135 | (cases - LAG(cases) OVER(ORDER BY date))*100/LAG(cases) OVER(ORDER BY date) AS percentage_increase 136 | FROM us_cases_by_date 137 | ) 138 | SELECT 139 | Date, 140 | cases AS Confirmed_Cases_On_Day, 141 | previous_day AS Confirmed_Cases_Previous_Day, 142 | percentage_increase AS Percentage_Increase_In_Cases 143 | FROM 144 | us_previous_day_comparison 145 | WHERE 146 | percentage_increase > Enter_Percentage_Here 147 | ``` 148 | 149 | 150 | ## Task 8.Recovery rate 151 | 152 | ```cmd 153 | WITH cases_by_country AS ( 154 | SELECT 155 | country_name AS country, 156 | SUM(cumulative_confirmed) AS cases, 157 | SUM(cumulative_recovered) AS recovered_cases 158 | FROM 159 | `bigquery-public-data.covid19_open_data.covid19_open_data` 160 | WHERE 161 | date="2020-05-10" 162 | GROUP BY 163 | country_name 164 | ) 165 | 166 | , recovered_rate AS ( 167 | SELECT 168 | country, cases, recovered_cases, 169 | (recovered_cases * 100)/cases AS recovery_rate 170 | FROM 171 | cases_by_country 172 | ) 173 | 174 | SELECT country, cases AS confirmed_cases, recovered_cases, recovery_rate 175 | FROM 176 | recovered_rate 177 | WHERE 178 | cases > 50000 179 | ORDER BY recovery_rate DESC 180 | LIMIT Enter_Limit_Here 181 | ``` 182 | 183 | 184 | ## Task 9.CDGR — Cumulative Daily Growth Rate 185 | 186 | ```cmd 187 | WITH 188 | france_cases AS ( 189 | SELECT 190 | date, 191 | SUM(cumulative_confirmed) AS total_cases 192 | FROM 193 | `bigquery-public-data.covid19_open_data.covid19_open_data` 194 | WHERE 195 | country_name="France" 196 | AND date IN ('2020-01-24', 197 | '2020-06-25') 198 | GROUP BY 199 | date 200 | ORDER BY 201 | date) 202 | , summary as ( 203 | SELECT 204 | total_cases AS first_day_cases, 205 | LEAD(total_cases) OVER(ORDER BY date) AS last_day_cases, 206 | DATE_DIFF(LEAD(date) OVER(ORDER BY date),date, day) AS days_diff 207 | FROM 208 | france_cases 209 | LIMIT 1 210 | ) 211 | 212 | select first_day_cases, last_day_cases, days_diff, POWER(last_day_cases/first_day_cases,1/days_diff)-1 as cdgr 213 | from summary 214 | ``` 215 | 216 | 217 | ## Task 10.Create a Datastudio report 218 | 219 | ```cmd 220 | SELECT 221 | date, SUM(cumulative_confirmed) AS country_cases, 222 | SUM(cumulative_deceased) AS country_deaths 223 | FROM 224 | `bigquery-public-data.covid19_open_data.covid19_open_data` 225 | WHERE 226 | date BETWEEN 'Date range' 227 | AND 'Date range' 228 | AND country_name='United States of America' 229 | GROUP BY date 230 | ``` 231 | 232 | # Congratulations🎉! You're done with this challenge lab. -------------------------------------------------------------------------------- /Analyze Sentiment with Natural Language API Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Analyze Sentiment with Natural Language API: Challenge Lab [ARC130]* 2 | 3 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/66586?parent=catalog) 4 | 5 | 6 | 7 | Run the below commands in the SSH of your VM-Instance. 8 | 9 | ## Task 1. 
Create an API key 10 | 11 | * *Navigation Menu* > *APIs and Services* > *Credentials* > Click `+Create Credentials` > Choose `API KEY`. 12 | 13 | ``` 14 | gcloud compute ssh lab-vm --zone= 15 | ``` 16 | 17 | ``` 18 | export API_KEY= 19 | ``` 20 | 21 | 22 | ## Task 2. Set up Google Docs and call the Natural Language API 23 | 24 | * Open this link in Incoginato Mode [Click Here](https://docs.google.com/document/create) 25 | * Login using your Lab Credentials. 26 | * Click `Externsion` > then `Apps Script`. 27 | 28 | * Paste the below code in the `code.js file`. 29 | 30 | ``` 31 | /** 32 | * @OnlyCurrentDoc 33 | * 34 | * The above comment directs Apps Script to limit the scope of file 35 | * access for this add-on. It specifies that this add-on will only 36 | * attempt to read or modify the files in which the add-on is used, 37 | * and not all of the user's files. The authorization request message 38 | * presented to users will reflect this limited scope. 39 | */ 40 | /** 41 | * Creates a menu entry in the Google Docs UI when the document is 42 | * opened. 43 | * 44 | */ 45 | function onOpen() { 46 | var ui = DocumentApp.getUi(); 47 | ui.createMenu('Natural Language Tools') 48 | .addItem('Mark Sentiment', 'markSentiment') 49 | .addToUi(); 50 | } 51 | /** 52 | * Gets the user-selected text and highlights it based on sentiment 53 | * with green for positive sentiment, red for negative, and yellow 54 | * for neutral. 55 | * 56 | */ 57 | function markSentiment() { 58 | var POSITIVE_COLOR = '#00ff00'; // Colors for sentiments 59 | var NEGATIVE_COLOR = '#ff0000'; 60 | var NEUTRAL_COLOR = '#ffff00'; 61 | var NEGATIVE_CUTOFF = -0.2; // Thresholds for sentiments 62 | var POSITIVE_CUTOFF = 0.2; 63 | var selection = DocumentApp.getActiveDocument().getSelection(); 64 | if (selection) { 65 | var string = getSelectedText(); 66 | var sentiment = retrieveSentiment(string); 67 | // Select the appropriate color 68 | var color = NEUTRAL_COLOR; 69 | if (sentiment <= NEGATIVE_CUTOFF) { 70 | color = NEGATIVE_COLOR; 71 | } 72 | if (sentiment >= POSITIVE_CUTOFF) { 73 | color = POSITIVE_COLOR; 74 | } 75 | // Highlight the text 76 | var elements = selection.getSelectedElements(); 77 | for (var i = 0; i < elements.length; i++) { 78 | if (elements[i].isPartial()) { 79 | var element = elements[i].getElement().editAsText(); 80 | var startIndex = elements[i].getStartOffset(); 81 | var endIndex = elements[i].getEndOffsetInclusive(); 82 | element.setBackgroundColor(startIndex, endIndex, color); 83 | } else { 84 | var element = elements[i].getElement().editAsText(); 85 | foundText = elements[i].getElement().editAsText(); 86 | foundText.setBackgroundColor(color); 87 | } 88 | } 89 | } 90 | } 91 | /** 92 | * Returns a string with the contents of the selected text. 93 | * If no text is selected, returns an empty string. 
94 | */ 95 | function getSelectedText() { 96 | var selection = DocumentApp.getActiveDocument().getSelection(); 97 | var string = ""; 98 | if (selection) { 99 | var elements = selection.getSelectedElements(); 100 | for (var i = 0; i < elements.length; i++) { 101 | if (elements[i].isPartial()) { 102 | var element = elements[i].getElement().asText(); 103 | var startIndex = elements[i].getStartOffset(); 104 | var endIndex = elements[i].getEndOffsetInclusive() + 1; 105 | var text = element.getText().substring(startIndex, endIndex); 106 | string = string + text; 107 | } else { 108 | var element = elements[i].getElement(); 109 | // Only translate elements that can be edited as text; skip 110 | // images and other non-text elements. 111 | if (element.editAsText) { 112 | string = string + element.asText().getText(); 113 | } 114 | } 115 | } 116 | } 117 | return string; 118 | } 119 | /** Given a string, will call the Natural Language API and retrieve 120 | * the sentiment of the string. The sentiment will be a real 121 | * number in the range -1 to 1, where -1 is highly negative 122 | * sentiment and 1 is highly positive. 123 | */ 124 | function retrieveSentiment (line) { 125 | var apiKey = "YOUR_API_KEY_HERE"; 126 | var apiEndpoint = 127 | 'https://language.googleapis.com/v1/documents:analyzeSentiment?key=' 128 | + apiKey; 129 | // Create a structure with the text, its language, its type, 130 | // and its encoding 131 | var docDetails = { 132 | language: 'en-us', 133 | type: 'PLAIN_TEXT', 134 | content: line 135 | }; 136 | var nlData = { 137 | document: docDetails, 138 | encodingType: 'UTF8' 139 | }; 140 | // Package all of the options and the data together for the call 141 | var nlOptions = { 142 | method : 'post', 143 | contentType: 'application/json', 144 | payload : JSON.stringify(nlData) 145 | }; 146 | // And make the call 147 | var response = UrlFetchApp.fetch(apiEndpoint, nlOptions); 148 | var data = JSON.parse(response); 149 | var sentiment = 0.0; 150 | // Ensure all pieces were in the returned value 151 | if (data && data.documentSentiment 152 | && data.documentSentiment.score){ 153 | sentiment = data.documentSentiment.score; 154 | } 155 | return sentiment; 156 | } 157 | ``` 158 | * In place of "*YOUR_API_KEY_HERE*", replace it with your created `API_KEY` which you make it in task 1. 159 | 160 | ## Task 3. Analyze syntax and parts of speech with the Natural Language API 161 | 162 | ``` 163 | cat > analyze-request.json < analyze-response.txt 177 | 178 | cat analyze-response.txt 179 | ``` 180 | 181 | ## Task 4. Perform multilingual natural language processing 182 | 183 | ``` 184 | cat > multi-nl-request.json < multi-response.txt 198 | 199 | cat multi-response.txt 200 | ``` 201 | 202 | # Congratulations🎉!, You're all done with this Challenge lab. 
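* A quick way to exercise the same `documents:analyzeSentiment` endpoint that the Apps Script above calls, directly from Cloud Shell (a sketch; the sample sentence is arbitrary):

```
curl -s -X POST \
    -H "Content-Type: application/json" \
    "https://language.googleapis.com/v1/documents:analyzeSentiment?key=${API_KEY}" \
    -d '{"document": {"type": "PLAIN_TEXT", "content": "I love this lab!"}, "encodingType": "UTF8"}'
```

The `documentSentiment.score` in the response is the same value `retrieveSentiment` reads, in the range -1 (highly negative) to 1 (highly positive).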
-------------------------------------------------------------------------------- /All Weeks Moonsoon Skill Badges.md: -------------------------------------------------------------------------------- 1 | ## **Official Arcade Website** - [Click Here](https://go.qwiklabs.com/arcade/) 2 | 3 | ## **Arcade Syllabus Link** - [Click Here](https://rsvp.withgoogle.com/events/arcade-facilitator/syllabus) 4 | 5 | ## *WEEK 1 Moonsoon Skill Badges* 6 | 7 | ### Skill Badge 1 - Create and Manage Cloud Resources 8 | 9 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/120) 10 | 11 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQzSb70j76BbQj_OWGMhJztd) 12 | 13 | 14 | ### Skill Badge 2 - Perform Foundational Infrastructure Tasks in Google Cloud 15 | 16 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/118) 17 | 18 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQzCrhP2ATYMyMr7idIHohJ0) 19 | 20 | ### Skill Badge 3 - Perform Foundational Data, ML, and AI Tasks in Google Cloud 21 | 22 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/117) 23 | 24 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWM6cS57g857OdZhHSlcOTAe) 25 | 26 | ### Skill Badge 4 - Configure Service Accounts and IAM Roles for Google Cloud 27 | 28 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/328) 29 | 30 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWOzVlfbcUBaLvhQvhw8-slD) 31 | 32 | ### Skill Badge 5 - Integrate BigQuery Data and Google Workspace using Apps Script 33 | 34 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/327) 35 | 36 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWPdRFsmsRQ233yf3v3J3hGx) 37 | 38 | ### Skill Badge 6 - Develop with Apps Script and AppSheet 39 | 40 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/324) 41 | 42 | Solution Link - [Click Here] 43 | 44 | ## *WEEK 2 Moonsoon Skill Badges* 45 | 46 | 47 | ### Skill Badge 1 - Set Up and Configure a Cloud Environment in Google Cloud 48 | 49 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/119) 50 | 51 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQx-Bv3dOo1CL1aMlUG_2IQb) 52 | 53 | ### Skill Badge 2 - Create ML Models with BigQuery ML 54 | 55 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/146) 56 | 57 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQw-urV-ysN_yYqyFeOoy-Ul) 58 | 59 | ### Skill Badge 3 - Exploring Data with Looker 60 | 61 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/165) 62 | 63 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWMTFXqbvAGGJPk59Xdk2xch) 64 | 65 | ### Skill Badge 4 - Secure BigLake Data 66 | 67 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/307) 68 | 69 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWNl70H6msRHJ-KNuBYrpKKA) 70 | 71 | ### Skill Badge 5 - Protect Sensitive Data with Data Loss Prevention 72 | 73 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/315) 74 | 75 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQxXccT8rmT92ZQuQJms_4F8) 76 | 77 | ### Skill Badge 6 - Tag and Discover BigLake Data 78 | 79 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/305) 
80 | 81 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWMGjggiFrQHRUXy3cLI3WQc) 82 | 83 | 84 | ## *WEEK 3 Moonsoon Skill Badges* 85 | 86 | 87 | ### Skill Badge 1 - Store, Process, and Manage Data in Google Cloud Console 88 | 89 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/263) 90 | 91 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQx8huI3cjFzbG7iA7Om4qyn) 92 | 93 | ### Skill Badge 2 - Analyze BigQuery data in Connected Sheets 94 | 95 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/274) 96 | 97 | Solution Link - [Click Here] 98 | 99 | ### Skill Badge 3 - Optimize Costs for Google Kubernetes Engine 100 | 101 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/157) 102 | 103 | Solution Link - [Click Here] 104 | 105 | ### Skill Badge 4 - Get Started with Eventarc 106 | 107 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/300) 108 | 109 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQxyTPf1gc-jWH7bn7xw8LXP) 110 | 111 | ### Skill Badge 5 - Get Started with Dataplex 112 | 113 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/291) 114 | 115 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWOL8WAfH5JOL8FEblPI36Zc) 116 | 117 | ### Skill Badge 6 - Get Started with Pub/Sub 118 | 119 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/301) 120 | 121 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQwx0hbACJLszOktqyMY0QIp) 122 | 123 | 124 | ## *WEEK 4 Moonsoon Skill Badges* 125 | 126 | 127 | ### Skill Badge 1 - Automating Infrastructure on Google Cloud with Terraform 128 | 129 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/159) 130 | 131 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQyvOx0WoAVQ45cb124xkJfz) 132 | 133 | ### Skill Badge 2 - Deploy and Manage Cloud Environments with Google Cloud 134 | 135 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/121) 136 | 137 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLHfVKuKwHnWNvlDz3QA7zlU-TkHKMMLz3) 138 | 139 | ### Skill Badge 3 - Ensure Access & Identity in Google Cloud 140 | 141 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/150) 142 | 143 | Solution Link - [Click Here] 144 | 145 | ### Skill Badge 4 - Create a Streaming Data Lake on Cloud Storage 146 | 147 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/289) 148 | 149 | Solution Link - [Click Here] 150 | 151 | ### Skill Badge 5 - Use APIs to Work with Cloud Storage 152 | 153 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/321) 154 | 155 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQyCd6tUmj8pSRWX3Si1sg-v) 156 | 157 | ### Skill Badge 6 - App Engine: 3 Ways 158 | 159 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/285) 160 | 161 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQwA7Jfdz_5Jz_X02Fyu-nug) 162 | 163 | 164 | ## *WEEK 5 Moonsoon Skill Badges* 165 | 166 | 167 | ### Skill Badge 1 - Serverless Firebase Development 168 | 169 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/153) 170 | 171 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQxcMUhlHadIolunMJh5o_0H) 172 | 173 | ### Skill Badge 2 - Cloud 
Architecture: Design, Implement, and Manage 174 | 175 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/124) 176 | 177 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQw9JE9JP6UWX30pifwoohlL) 178 | 179 | ### Skill Badge 3 - Engineer Data in Google Cloud 180 | 181 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/132) 182 | 183 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQyEeqBmqjEnzB22GdrTY56q) 184 | 185 | ### Skill Badge 4 - The Basics of Google Cloud Compute 186 | 187 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/319) 188 | 189 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQy6a8D1-f-uCqj4u5yMnaFB) 190 | 191 | ### Skill Badge 5 - Networking Fundamentals on Google Cloud 192 | 193 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/318) 194 | 195 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQz9juswW5sYZvQ6ay_RX-pW) 196 | 197 | ### Skill Badge 6 - Streaming Analytics into BigQuery 198 | 199 | Quest Link - [Click Here](https://www.cloudskillsboost.google/quests/284) 200 | 201 | Solution Link - [Click Here](https://www.youtube.com/playlist?list=PLVyfLG0N2WQxC4XZ01y1_6HgLNSJg_HQY) 202 | 203 | # ALL DONE🎉 -------------------------------------------------------------------------------- /Implement DevOps in Google Cloud Challenge Lab.md: -------------------------------------------------------------------------------- 1 | ## `Lab Name` - *Implement DevOps in Google Cloud: Challenge Lab [GSP330]* 2 | ## `Lab Link` - [*Click Here*](https://www.cloudskillsboost.google/focuses/13287?parent=catalog) 3 | 4 | Run the following commands in the Cloud Shell Terminal 5 | 6 | * Put your `Qwiklabs USERNAME ID` here. 7 | ``` 8 | export EMAIL= 9 | ``` 10 | 11 | * Storing values in a variable. 12 | ``` 13 | export CLUSTER_NAME=hello-cluster 14 | 15 | export ZONE= 16 | 17 | export REGION= 18 | 19 | export REPO=my-repository 20 | 21 | export PROJECT_ID=$(gcloud config get-value project) 22 | ``` 23 | 24 | ## Task 1. Create the lab resources 25 | 26 | * Enabling Some APIs and adding iam-policy. 27 | ``` 28 | gcloud services enable container.googleapis.com \ 29 | cloudbuild.googleapis.com \ 30 | sourcerepo.googleapis.com 31 | 32 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 33 | --member=serviceAccount:$(gcloud projects describe $PROJECT_ID \ 34 | --format="value(projectNumber)")@cloudbuild.gserviceaccount.com --role="roles/container.developer" 35 | 36 | git config --global user.email $EMAIL 37 | 38 | git config --global user.name student 39 | 40 | gcloud artifacts repositories create $REPO \ 41 | --repository-format=docker \ 42 | --location=$REGION \ 43 | --description="Subscribe to Atul Gupta" 44 | ``` 45 | 46 | * Creating cluster with the following configuration. 47 | ``` 48 | gcloud beta container --project "$PROJECT_ID" clusters create "$CLUSTER_NAME" --zone "$ZONE" --cluster-version latest --release-channel "regular" --machine-type "e2-medium" --subnetwork "projects/$PROJECT_ID/regions/$REGION/subnetworks/default" --enable-autoscaling --min-nodes "2" --max-nodes "6" 49 | 50 | kubectl create namespace prod 51 | 52 | kubectl create namespace dev 53 | ``` 54 | 55 | ## Task 2. Create a repository in Cloud Source Repositories 56 | 57 | * Creating source repos using gcloud command. 
```
gcloud source repos create sample-app

git clone https://source.developers.google.com/p/$PROJECT_ID/r/sample-app

cd ~

gsutil cp -r gs://spls/gsp330/sample-app/* sample-app

cd sample-app/

git init
```

* Commit and push with git:
```
git add .

git commit -m "Subscribe to Atul Gupta"

git push -u origin master

git branch dev

git checkout dev

git push -u origin dev
```

## Task 3. Create the Cloud Build Triggers

* Create the build triggers:
```
gcloud builds triggers create cloud-source-repositories --name="sample-app-prod-deploy" --repo="sample-app" --branch-pattern="^master$" --build-config="cloudbuild.yaml"

gcloud builds triggers create cloud-source-repositories --name="sample-app-dev-deploy" --repo="sample-app" --branch-pattern="^dev$" --build-config="cloudbuild-dev.yaml"
```

## Task 4. Deploy the first versions of the application

```
COMMIT_ID="$(git rev-parse --short=7 HEAD)"

gcloud builds submit --tag="${REGION}-docker.pkg.dev/${PROJECT_ID}/$REPO/hello-cloudbuild:${COMMIT_ID}" .

IMAGE=$(gcloud builds list --format="value(IMAGES)")

sed -i "s/<version>/v1.0/g" cloudbuild-dev.yaml

sed -i "s#<todo>#$IMAGE#g" dev/deployment.yaml
```

* Commit and push with git:
```
git add .

git commit -m "Subscribe to Atul Gupta"

git push -u origin dev

git checkout master

sed -i "s/<version>/v1.0/g" cloudbuild.yaml

sed -i "s#<todo>#$IMAGE#g" prod/deployment.yaml

git add .

git commit -m "Subscribe to Atul Gupta"

git push -u origin master

git checkout dev
```

## Task 5. Deploy the second versions of the application

* Create the `main.go` file:
```
rm -rf main.go

touch main.go

tee main.go <
```

## Task 6. Roll back the production deployment

* In the Cloud console, go to *Cloud Build* > `HISTORY`.
* Click the `SECOND FAILED ONE (dev)` > click `RETRY`.

# Congratulations🎉! You're all done with this Challenge Lab.
-------------------------------------------------------------------------------- /Build a Website on Google Cloud/Hosting a Web App on Google Cloud Using Compute Engine [GSP662].md: --------------------------------------------------------------------------------

## `Lab Name` - *Hosting a Web App on Google Cloud Using Compute Engine [GSP662]*

## `Lab Link` - [*CLICK HERE*](https://www.cloudskillsboost.google/focuses/11952?parent=catalog)

### Run the following commands in the Cloud Shell

```
export ZONE=
```

```
export REGION=
```

```
gcloud services enable compute.googleapis.com

gsutil mb gs://fancy-store-$DEVSHELL_PROJECT_ID
```
## Task 3. Clone source repository

```
git clone https://github.com/googlecodelabs/monolith-to-microservices.git

cd ~/monolith-to-microservices

./setup.sh

nvm install --lts

cd microservices

npm start
```

## Task 4. Create Compute Engine instances

* 4.1

```
touch ~/monolith-to-microservices/startup-script.sh
```
* Now, click *Open Editor* in the Cloud Shell ribbon to open the Code Editor.
* Navigate to the `monolith-to-microservices` folder.
* Add the following code to the `startup-script.sh` file.
## Task 4. Create Compute Engine instances

* 4.1

```
touch ~/monolith-to-microservices/startup-script.sh
```
* Now, click `Open Editor` in the Cloud Shell ribbon to open the Code Editor.
* Navigate to the `monolith-to-microservices` folder.
* Add the following code to the `startup-script.sh` file.

```
#!/bin/bash
# Install logging monitor. The monitor will automatically pick up logs sent to
# syslog.
curl -s "https://storage.googleapis.com/signals-agents/logging/google-fluentd-install.sh" | bash
service google-fluentd restart &
# Install dependencies from apt
apt-get update
apt-get install -yq ca-certificates git build-essential supervisor psmisc
# Install nodejs
mkdir /opt/nodejs
curl https://nodejs.org/dist/v16.14.0/node-v16.14.0-linux-x64.tar.gz | tar xvzf - -C /opt/nodejs --strip-components=1
ln -s /opt/nodejs/bin/node /usr/bin/node
ln -s /opt/nodejs/bin/npm /usr/bin/npm
# Get the application source code from the Google Cloud Storage bucket.
mkdir /fancy-store
gsutil -m cp -r gs://fancy-store-[DEVSHELL_PROJECT_ID]/monolith-to-microservices/microservices/* /fancy-store/
# Install app dependencies.
cd /fancy-store/
npm install
# Create a nodeapp user. The application will run as this user.
useradd -m -d /home/nodeapp nodeapp
chown -R nodeapp:nodeapp /opt/app
# Configure supervisor to run the node app.
cat >/etc/supervisor/conf.d/node-app.conf << EOF
[program:nodeapp]
directory=/fancy-store
command=npm start
autostart=true
autorestart=true
user=nodeapp
environment=HOME="/home/nodeapp",USER="nodeapp",NODE_ENV="production"
stdout_logfile=syslog
stderr_logfile=syslog
EOF
supervisorctl reread
supervisorctl update
```

* Find the text `[DEVSHELL_PROJECT_ID]` in the file and replace it with your `Project ID`.
* Return to your terminal and run these commands.

```
gsutil cp ~/monolith-to-microservices/startup-script.sh gs://fancy-store-$DEVSHELL_PROJECT_ID

cd ~
rm -rf monolith-to-microservices/*/node_modules
gsutil -m cp -r monolith-to-microservices gs://fancy-store-$DEVSHELL_PROJECT_ID/


gcloud compute instances create backend \
    --zone=$ZONE \
    --machine-type=e2-standard-2 \
    --tags=backend \
    --metadata=startup-script-url=https://storage.googleapis.com/fancy-store-$DEVSHELL_PROJECT_ID/startup-script.sh

gcloud compute instances list
```
* Copy the `External IP` for the backend.

* In the Cloud Shell Explorer, navigate to `monolith-to-microservices` > `react-app`.

* In the Code Editor, select View > `Toggle Hidden Files` in order to see the `.env` file.

* In the `.env` file, replace `localhost` in both URLs with the backend's `External IP` you just copied, keeping the ports (`8081` for orders, `8082` for products) as they are, then save the file.

* 4.2

```
cd ~/monolith-to-microservices/react-app
npm install && npm run-script build

cd ~
rm -rf monolith-to-microservices/*/node_modules

gsutil -m cp -r monolith-to-microservices gs://fancy-store-$DEVSHELL_PROJECT_ID/
```

* 4.3

```
gcloud compute instances create frontend \
    --zone=$ZONE \
    --machine-type=e2-standard-2 \
    --tags=frontend \
    --metadata=startup-script-url=https://storage.googleapis.com/fancy-store-$DEVSHELL_PROJECT_ID/startup-script.sh


gcloud compute firewall-rules create fw-fe \
    --allow tcp:8080 \
    --target-tags=frontend

gcloud compute firewall-rules create fw-be \
    --allow tcp:8081-8082 \
    --target-tags=backend

gcloud compute instances list
```
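* The startup scripts take a few minutes to finish, so the app will not answer immediately. One way to wait for it, assuming `[FRONTEND_ADDRESS]` stands in for the frontend's `External IP` from the list above:
```
# Poll every 2 seconds until the frontend starts answering on port 8080,
# then exit with Ctrl+C. [FRONTEND_ADDRESS] is a placeholder, not a real host.
watch -n 2 curl http://[FRONTEND_ADDRESS]:8080
```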
## Task 5. Create managed instance groups

* 5.1

```
gcloud compute instances stop frontend --zone=$ZONE

gcloud compute instances stop backend --zone=$ZONE

gcloud compute instance-templates create fancy-fe \
    --source-instance-zone=$ZONE \
    --source-instance=frontend

gcloud compute instance-templates create fancy-be \
    --source-instance-zone=$ZONE \
    --source-instance=backend

gcloud compute instance-templates list

gcloud compute instances delete backend --zone=$ZONE
```
* If prompted, type `y` and press Enter to confirm the deletion.

* 5.2

```
gcloud compute instance-groups managed create fancy-fe-mig \
    --zone=$ZONE \
    --base-instance-name fancy-fe \
    --size 2 \
    --template fancy-fe

gcloud compute instance-groups managed create fancy-be-mig \
    --zone=$ZONE \
    --base-instance-name fancy-be \
    --size 2 \
    --template fancy-be

gcloud compute instance-groups set-named-ports fancy-fe-mig \
    --zone=$ZONE \
    --named-ports frontend:8080

gcloud compute instance-groups set-named-ports fancy-be-mig \
    --zone=$ZONE \
    --named-ports orders:8081,products:8082
```
* 5.3

```
gcloud compute health-checks create http fancy-fe-hc \
    --port 8080 \
    --check-interval 30s \
    --healthy-threshold 1 \
    --timeout 10s \
    --unhealthy-threshold 3

gcloud compute health-checks create http fancy-be-hc \
    --port 8081 \
    --request-path=/api/orders \
    --check-interval 30s \
    --healthy-threshold 1 \
    --timeout 10s \
    --unhealthy-threshold 3
```

* 5.4

```
gcloud compute firewall-rules create allow-health-check \
    --allow tcp:8080-8081 \
    --source-ranges 130.211.0.0/22,35.191.0.0/16 \
    --network default

gcloud compute instance-groups managed update fancy-fe-mig \
    --zone=$ZONE \
    --health-check fancy-fe-hc \
    --initial-delay 300

gcloud compute instance-groups managed update fancy-be-mig \
    --zone=$ZONE \
    --health-check fancy-be-hc \
    --initial-delay 300
```
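* Once autohealing is attached, you can confirm each group has brought up its two instances (this is the same `list-instances` command the lab reuses in Task 8):
```
gcloud compute instance-groups managed list-instances fancy-fe-mig --zone=$ZONE

gcloud compute instance-groups managed list-instances fancy-be-mig --zone=$ZONE
```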
## Task 6. Create load balancers

* 6.1

```
gcloud compute http-health-checks create fancy-fe-frontend-hc \
    --request-path / \
    --port 8080

gcloud compute http-health-checks create fancy-be-orders-hc \
    --request-path /api/orders \
    --port 8081

gcloud compute http-health-checks create fancy-be-products-hc \
    --request-path /api/products \
    --port 8082
```

* 6.2

```
gcloud compute backend-services create fancy-fe-frontend \
    --http-health-checks fancy-fe-frontend-hc \
    --port-name frontend \
    --global

gcloud compute backend-services create fancy-be-orders \
    --http-health-checks fancy-be-orders-hc \
    --port-name orders \
    --global

gcloud compute backend-services create fancy-be-products \
    --http-health-checks fancy-be-products-hc \
    --port-name products \
    --global

gcloud compute backend-services add-backend fancy-fe-frontend \
    --instance-group fancy-fe-mig \
    --instance-group-zone $ZONE \
    --global

gcloud compute backend-services add-backend fancy-be-orders \
    --instance-group fancy-be-mig \
    --instance-group-zone $ZONE \
    --global

gcloud compute backend-services add-backend fancy-be-products \
    --instance-group fancy-be-mig \
    --instance-group-zone $ZONE \
    --global
```

* 6.3

```
gcloud compute url-maps create fancy-map \
    --default-service fancy-fe-frontend

gcloud compute url-maps add-path-matcher fancy-map \
    --default-service fancy-fe-frontend \
    --path-matcher-name orders \
    --path-rules "/api/orders=fancy-be-orders,/api/products=fancy-be-products"


gcloud compute target-http-proxies create fancy-proxy \
    --url-map fancy-map

gcloud compute forwarding-rules create fancy-http-rule \
    --global \
    --target-http-proxy fancy-proxy \
    --ports 80
```

* 6.4

```
cd ~/monolith-to-microservices/react-app/

gcloud compute forwarding-rules list --global
```
* Copy the load balancer's `IP Address` from the forwarding rule, then edit the `.env` file again so both URLs point at it: `http://[LB_IP]/api/orders` and `http://[LB_IP]/api/products` (no port this time, since the load balancer listens on 80). Then rebuild and redeploy the frontend:

```
npm install && npm run-script build

cd ~
rm -rf monolith-to-microservices/*/node_modules
gsutil -m cp -r monolith-to-microservices gs://fancy-store-$DEVSHELL_PROJECT_ID/

gcloud compute instance-groups managed rolling-action replace fancy-fe-mig \
    --zone=$ZONE \
    --max-unavailable 100%
```

## Task 7. Scaling Compute Engine

```
gcloud compute instance-groups managed set-autoscaling \
  fancy-fe-mig \
  --zone=$ZONE \
  --max-num-replicas 2 \
  --target-load-balancing-utilization 0.60

gcloud compute instance-groups managed set-autoscaling \
  fancy-be-mig \
  --zone=$ZONE \
  --max-num-replicas 2 \
  --target-load-balancing-utilization 0.60

gcloud compute backend-services update fancy-fe-frontend \
    --enable-cdn --global
```
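* With Cloud CDN enabled on the frontend service, repeated requests through the load balancer should start being served from cache. A rough check, treating `[LB_IP]` as a placeholder for the forwarding rule's IP (cache fill can take a minute or two):
```
# Request the site twice; on a CDN cache hit the second response
# typically carries an Age header.
curl -sI http://[LB_IP]

curl -sI http://[LB_IP]
```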
## Task 8. Update the website

* 8.1

```
gcloud compute instances set-machine-type frontend \
    --zone=$ZONE \
    --machine-type e2-small

gcloud compute instance-templates create fancy-fe-new \
    --region=$REGION \
    --source-instance=frontend \
    --source-instance-zone=$ZONE

gcloud compute instance-groups managed rolling-action start-update fancy-fe-mig \
    --zone=$ZONE \
    --version template=fancy-fe-new

watch -n 2 gcloud compute instance-groups managed list-instances fancy-fe-mig \
    --zone=$ZONE
```
* Press `Ctrl+C` to exit `watch` once all instances are listed with the new template.

* 8.2

```
cd ~/monolith-to-microservices/react-app/src/pages/Home

mv index.js.new index.js

cat ~/monolith-to-microservices/react-app/src/pages/Home/index.js

cd ~/monolith-to-microservices/react-app

npm install && npm run-script build

cd ~

rm -rf monolith-to-microservices/*/node_modules

gsutil -m cp -r monolith-to-microservices gs://fancy-store-$DEVSHELL_PROJECT_ID/

gcloud compute instance-groups managed rolling-action replace fancy-fe-mig \
    --zone=$ZONE \
    --max-unavailable=100%
```

# Congratulations🎉! You're all done with this lab.
--------------------------------------------------------------------------------