├── assets ├── badges.png ├── banner.png ├── dsc_logo.png └── prizes_table.png ├── screenshots ├── IAM.png ├── job_1.png ├── job_2.png ├── job_3.png ├── uptime.png ├── cluster.png └── table_1.png ├── LICENSE ├── Engineer Data in Google Cloud: Challenge Lab.md ├── Perform Foundational Infrastructure Tasks in Google Cloud Challenge Lab.md ├── Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab.md ├── Explore Machine Learning Models with Explainable AI: Challenge Lab.md ├── Getting Started: Create and Manage Cloud Resources: Challenge Lab.md ├── Build and Secure Networks in Google Cloud: Challenge Lab.md ├── Deploy to Kubernetes in Google Cloud: Challenge Lab.md ├── Set up and Configure a Cloud Environment in Google Cloud: Challenge Lab.md ├── Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab.md ├── Insights from Data with BigQuery: Challenge Lab.md ├── Integrate with Machine Learning APIs: Challenge Lab.md └── README.md /assets/badges.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/assets/badges.png -------------------------------------------------------------------------------- /assets/banner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/assets/banner.png -------------------------------------------------------------------------------- /assets/dsc_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/assets/dsc_logo.png -------------------------------------------------------------------------------- /screenshots/IAM.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/IAM.png -------------------------------------------------------------------------------- /screenshots/job_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/job_1.png -------------------------------------------------------------------------------- /screenshots/job_2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/job_2.png -------------------------------------------------------------------------------- /screenshots/job_3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/job_3.png -------------------------------------------------------------------------------- /screenshots/uptime.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/uptime.png -------------------------------------------------------------------------------- /assets/prizes_table.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/assets/prizes_table.png -------------------------------------------------------------------------------- /screenshots/cluster.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/cluster.png -------------------------------------------------------------------------------- /screenshots/table_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/GDSC-IIIT-Kalyani/qwiklabs_challenges/HEAD/screenshots/table_1.png -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 Souvik Biswas [Cloud Facilitator] 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. -------------------------------------------------------------------------------- /Engineer Data in Google Cloud: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Engineer Data in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/12379?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | You have started a new role as a Data Engineer for TaxiCab Inc. You are expected to import some historical data to a working BigQuery dataset, and build a basic model that predicts fares based on information available when a new ride starts. Leadership is interested in building an app and estimating for users how much a ride will cost. The source data will be provided in your project. 8 | 9 | As soon as you sit down at your desk and open your new laptop you receive your first assignment: build a basic BQML fare prediction model for leadership. Perform the following tasks to import and clean the data, then build the model and perform batch predictions with new data so that leadership can review model performance and make a go/no-go decision on deploying the app functionality. 
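Before Task 1, it can be worth sanity-checking that the provided source table is in place and seeing how large it is. A minimal check (the table name `taxirides.historical_taxi_rides_raw` is taken from the Task 1 query below):

```sql
SELECT COUNT(*) AS total_rows
FROM taxirides.historical_taxi_rides_raw
```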
10 | 11 | ### Task 1: Clean your training data 12 | 13 | Go to **BigQuery** > Run these queries: 14 | 15 | ```sql 16 | CREATE OR REPLACE TABLE 17 | taxirides.taxi_training_data AS 18 | SELECT 19 | (tolls_amount + fare_amount) AS fare_amount, 20 | pickup_datetime, 21 | pickup_longitude AS pickuplon, 22 | pickup_latitude AS pickuplat, 23 | dropoff_longitude AS dropofflon, 24 | dropoff_latitude AS dropofflat, 25 | passenger_count AS passengers 26 | FROM 27 | taxirides.historical_taxi_rides_raw 28 | WHERE 29 | RAND() < 0.001 30 | AND trip_distance > 0 31 | AND fare_amount >= 2.5 32 | AND pickup_longitude > -78 33 | AND pickup_longitude < -70 34 | AND dropoff_longitude > -78 35 | AND dropoff_longitude < -70 36 | AND pickup_latitude > 37 37 | AND pickup_latitude < 45 38 | AND dropoff_latitude > 37 39 | AND dropoff_latitude < 45 40 | AND passenger_count > 0 41 | ``` 42 | 43 | ### Task 2: Create a BQML model called `taxirides.fare_model` 44 | 45 | ```sql 46 | CREATE OR REPLACE MODEL taxirides.fare_model 47 | TRANSFORM( 48 | * EXCEPT(pickup_datetime) 49 | 50 | , ST_Distance(ST_GeogPoint(pickuplon, pickuplat), ST_GeogPoint(dropofflon, dropofflat)) AS euclidean 51 | , CAST(EXTRACT(DAYOFWEEK FROM pickup_datetime) AS STRING) AS dayofweek 52 | , CAST(EXTRACT(HOUR FROM pickup_datetime) AS STRING) AS hourofday 53 | ) 54 | OPTIONS(input_label_cols=['fare_amount'], model_type='linear_reg') 55 | AS 56 | 57 | SELECT * FROM taxirides.taxi_training_data 58 | ``` 59 | 60 | ### Task 3: Perform a batch prediction on new data 61 | 62 | ```sql 63 | CREATE OR REPLACE TABLE taxirides.2015_fare_amount_predictions 64 | AS 65 | SELECT * FROM ML.PREDICT(MODEL taxirides.fare_model,( 66 | SELECT * FROM taxirides.report_prediction_data) 67 | ) 68 | ``` 69 | 70 | -------------------------------------------------------------------------------- /Perform Foundational Infrastructure Tasks in Google Cloud Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Perform Foundational Infrastructure Tasks in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/quests/118) 4 | 5 | There are no command-line instructions in this lab; you complete everything through the **Google Cloud Platform (GCP)** Console. A walkthrough of each task is available below: 6 | 7 | ## Your challenge 8 | 9 | You are now asked to help a newly formed development team with some of their initial work on a new project around storing and organizing photographs, called memories. You have been asked to assist the memories team with the initial configuration of their application development environment; you receive the following request to complete these tasks: 10 | 11 | - Create a bucket for storing the photographs. 12 | - Create a Pub/Sub topic that will be used by a Cloud Function you create. 13 | - Create a Cloud Function. 14 | - Remove the previous cloud engineer’s access from the memories project. 15 | 16 | Some Jooli Inc. standards you should follow: 17 | 18 | - Create all resources in the **us-east1** region and **us-east1-b** zone, unless otherwise directed. 19 | - Use the project VPCs. 20 | - Naming is normally *team-resource*, e.g. an instance could be named **kraken-webserver1**. 21 | - Allocate cost-effective resource sizes. Projects are monitored and excessive resource use will result in the containing project's termination (and possibly yours), so beware.
This is the guidance the monitoring team is willing to share; unless directed, use **f1-micro** for small Linux VMs and **n1-standard-1** for Windows or other applications such as Kubernetes nodes. 22 | 23 | ## Solving tasks 24 | 25 | ### Task 1: Create a bucket 26 | 27 | 1. Navigation menu > **Cloud Storage** > Browser > Create Bucket 28 | 2. Name your bucket > Enter **GCP Project ID** > Continue 29 | 3. Choose where to store your data > **Region:** us-east1 > Continue 30 | 4. Leave the remaining settings as default 31 | 5. Create 32 | 33 | ### Task 2: Create a Pub/Sub topic 34 | 35 | 1. Navigation menu > **Pub/Sub** > Topics 36 | 2. Create Topic > **Name:** Jooli > Create Topic 37 | 38 | ### Task 3: Create the thumbnail Cloud Function 39 | 40 | 1. Navigation menu > **Cloud Functions** > Create Function 41 | 42 | 2. Use the following config: 43 | 44 | **Name:** CFJooli 45 | **Region:** us-east1 46 | **Trigger:** Cloud Storage 47 | **Event type:** Finalize/Create 48 | **Bucket:** BROWSE > Select the qwiklabs bucket 49 | 50 | 3. Remaining default > Next 51 | 52 | 4. **Runtime:** Node.js 10 53 | **Entry point:** thumbnail 54 | 5. Add the code provided in the lab instructions 55 | 6. Download the image from the URL given in the lab 56 | 7. Navigation menu > **Cloud Storage** > Browser > Select your bucket > Upload files 57 | 58 | 8. Refresh the bucket 59 | 60 | ### Task 4: Remove the previous cloud engineer 61 | 62 | 1. Navigation menu > **IAM & Admin** > IAM 63 | 2. Search for "**Username 2**" > Edit > Delete Role 64 | -------------------------------------------------------------------------------- /Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/quests/121?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 4 | 5 | ## Your challenge 6 | 7 | As soon as you sit down at your desk and open your new laptop you receive the following request to complete these tasks. 8 | 9 | > Do not wait for the lab to provision! You can complete tasks 1 and 2 before the lab provisioning has ended; just ensure the jumphost exists (a quick check is shown below).
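A simple way to confirm the jumphost has been provisioned before you begin (a sketch; the instance name `kraken-jumphost` comes from Task 1 below):

```yaml
gcloud compute instances list --filter="name=kraken-jumphost"
```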
10 | 11 | ### Task 1: Create the production environment 12 | 13 | * Go to **Compute Engine** > **VM Instances** > **kraken-jumphost** > **SSH** 14 | * Run the following there: 15 | 16 | ```yaml 17 | cd /work/dm 18 | 19 | sed -i s/SET_REGION/us-east1/g prod-network.yaml 20 | 21 | gcloud deployment-manager deployments create prod-network --config=prod-network.yaml 22 | 23 | gcloud config set compute/zone us-east1-b 24 | 25 | gcloud container clusters create kraken-prod \ 26 | --num-nodes 2 \ 27 | --network kraken-prod-vpc \ 28 | --subnetwork kraken-prod-subnet 29 | 30 | gcloud container clusters get-credentials kraken-prod 31 | 32 | cd /work/k8s 33 | 34 | for F in $(ls *.yaml); do kubectl create -f $F; done 35 | ``` 36 | 37 | ### Task 2: Setup the Admin instance 38 | 39 | * Run the following from the **Cloud Terminal**: 40 | 41 | ```yaml 42 | gcloud config set compute/zone us-east1-b 43 | 44 | gcloud compute instances create kraken-admin --network-interface="subnet=kraken-mgmt-subnet" --network-interface="subnet=kraken-prod-subnet" 45 | 46 | ``` 47 | 48 | * Go to **Monitoring** > **Alerting** > **Create Alerting Policy** 49 | 50 | * Add Condition (use the **kraken-admin instance ID**) 51 | 52 | * **Manage notification Channel** > Create a new **Email Channel** 53 | 54 | * Click **Done** 55 | 56 | ### Task 3: Verify the Spinnaker deployment 57 | 58 | * Run the following from the **Cloud Terminal**: 59 | 60 | ```yaml 61 | gcloud config set compute/zone us-east1-b 62 | 63 | gcloud container clusters get-credentials spinnaker-tutorial 64 | 65 | DECK_POD=$(kubectl get pods --namespace default -l "cluster=spin-deck" -o jsonpath="{.items[0].metadata.name}") 66 | 67 | kubectl port-forward --namespace default $DECK_POD 8080:9000 >> /dev/null & 68 | ``` 69 | 70 | * **Web Preview** > **Preview on port 8080** 71 | 72 | * Go to **Applications** > **sample** 73 | 74 | * Select **Pipelines** 75 | 76 | * Start **Manual Execution** > Pipeline: **Deploy** > **Run** 77 | 78 | * Click and continue the **PUBSUB** 79 | 80 | * While the **MANUAL START** is running, execute the following: 81 | 82 | ```yaml 83 | gcloud config set compute/zone us-east1-b 84 | 85 | gcloud source repos clone sample-app 86 | 87 | cd sample-app 88 | 89 | touch a 90 | 91 | git config --global user.email "$(gcloud config get-value account)" 92 | 93 | git config --global user.name "Student" 94 | 95 | git commit -a -m "change" 96 | 97 | git tag v1.0.1 98 | 99 | git push --tags 100 | ``` 101 | 102 | * Now, return to the **Spinnaker** page and allow the "**Continue to Deploy?**" message. 103 | 104 | * Again, start another **MANUAL START** deployment (this time it will use the new changes) 105 | 106 | * Wait the deployment to finish. -------------------------------------------------------------------------------- /Explore Machine Learning Models with Explainable AI: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Explore Machine Learning Models with Explainable AI: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/12011?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | You are a curious coder who wants to explore biases in public datasets using the What-If Tool. You decide to pull some mortgage [data](https://www.consumerfinance.gov/data-research/hmda/historic-data/) to train a couple of machine learning models to predict whether an applicant will be granted a loan. 
You specifically want to investigate how the two models perform when they are trained on different proportions of males and females in the datasets, and visualize their differences in the What-If Tool. 8 | 9 | ### Start a JupyterLab Notebook instance 10 | 11 | * Storage > Bucket > Create Bucket > Enter the name as your **Project ID** > CREATE 12 | 13 | * AI Platform > Notebooks > NEW INSTANCE > Use the latest version of TensorFlow > CREATE 14 | 15 | * Wait for it to be created > **Open JupyterLab** 16 | 17 | ### Download the Challenge Notebook 18 | 19 | * Clone the repo 20 | 21 | * Open the Jupyter Notebook file as per the lab instructions 22 | 23 | ### Build and train your models 24 | 25 | * Modify block 10 as follows: 26 | 27 | ```python 28 | model = Sequential() 29 | model.add(layers.Dense(200, input_shape=(input_size,), activation='relu')) 30 | model.add(layers.Dense(50, activation='relu')) 31 | model.add(layers.Dense(20, activation='relu')) 32 | model.add(layers.Dense(1, activation='sigmoid')) 33 | 34 | # The data will come from train_data and train_labels. 35 | model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy']) 36 | model.fit(train_data, train_labels, epochs=10, batch_size=2048, validation_split=0.1) 37 | ``` 38 | 39 | * Modify block 13 as follows: 40 | 41 | ```python 42 | limited_model = Sequential() 43 | limited_model.add(layers.Dense(200, input_shape=(input_size,), activation='relu')) 44 | limited_model.add(layers.Dense(50, activation='relu')) 45 | limited_model.add(layers.Dense(20, activation='relu')) 46 | limited_model.add(layers.Dense(1, activation='sigmoid')) 47 | limited_model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy']) 48 | 49 | # The data will come from limited_train_data and limited_train_labels. 50 | limited_model.fit(limited_train_data, limited_train_labels, epochs=10, batch_size=2048, validation_split=0.1) 51 | ``` 52 | 53 | ### Deploy the models to AI Platform 54 | 55 | * Enter the proper information in block 16. Change `saved_limited_model` to `limited_model` and `saved_completed_model` to `completed_model`. 56 | 57 | * Block 19: 58 | 59 | ```python 60 | !gcloud ai-platform models create $MODEL_NAME --regions $REGION 61 | ``` 62 | 63 | * Block 20: 64 | 65 | ```python 66 | !gcloud ai-platform versions create $VERSION_NAME \ 67 | --model=$MODEL_NAME \ 68 | --framework='Tensorflow' \ 69 | --runtime-version=2.1 \ 70 | --origin=$MODEL_BUCKET/completed_model \ 71 | --staging-bucket=$MODEL_BUCKET \ 72 | --python-version=3.7 73 | ``` 74 | 75 | * Block 21: 76 | 77 | ```python 78 | !gcloud ai-platform models create $LIM_MODEL_NAME --regions $REGION 79 | ``` 80 | 81 | * Block 22: 82 | 83 | ```python 84 | !gcloud ai-platform versions create $VERSION_NAME \ 85 | --model=$LIM_MODEL_NAME \ 86 | --framework='Tensorflow' \ 87 | --runtime-version=2.1 \ 88 | --origin=$MODEL_BUCKET/limited_model \ 89 | --staging-bucket=$MODEL_BUCKET \ 90 | --python-version=3.7 91 | ``` 92 | 93 | ### Use the What-If Tool to explore biases 94 | 95 | Just run the code provided for the What-If Tool; a sketch of its general shape is included below for reference.
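The What-If Tool cell in the notebook generally follows the pattern below. This is only a sketch: the example list (`test_examples`), the feature names, the label column, and the project/model/version variables are assumptions standing in for the values already defined in the lab notebook.

```python
from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

# Compare the model trained on the full dataset with the one trained on the
# limited (skewed) dataset, side by side, inside the What-If Tool.
config_builder = (WitConfigBuilder(test_examples, features)
    .set_ai_platform_model(PROJECT_ID, MODEL_NAME, VERSION_NAME)
    .set_compare_ai_platform_model(PROJECT_ID, LIM_MODEL_NAME, VERSION_NAME)
    .set_target_feature(label_column))  # label column name from the notebook

WitWidget(config_builder, height=800)
```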
96 | -------------------------------------------------------------------------------- /Getting Started: Create and Manage Cloud Resources: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Getting Started: Create and Manage Cloud Resources: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/10258?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | As soon as you sit down at your desk and open your new laptop you receive several requests from the Nucleus team. Read through each description, then create the resources. 8 | 9 | ## Solving tasks 10 | 11 | ### Task 1: Create a project jumphost instance 12 | 13 | * Run the following from the **Cloud Terminal**: 14 | 15 | ```yaml 16 | gcloud compute instances create nucleus-jumphost \ 17 | --network nucleus-vpc \ 18 | --zone us-east1-b \ 19 | --machine-type f1-micro \ 20 | --image-family debian-9 \ 21 | --image-project debian-cloud \ 22 | --scopes cloud-platform \ 23 | --no-address 24 | ``` 25 | ### Task 2: Create a Kubernetes service cluster 26 | 27 | * Run the following from the **Cloud Terminal**: 28 | 29 | ```yaml 30 | gcloud container clusters create nucleus-backend \ 31 | --num-nodes 1 \ 32 | --network nucleus-vpc \ 33 | --region us-east1 34 | 35 | gcloud container clusters get-credentials nucleus-backend \ 36 | --region us-east1 37 | 38 | kubectl create deployment hello-server \ 39 | --image=gcr.io/google-samples/hello-app:2.0 40 | 41 | kubectl expose deployment hello-server \ 42 | --type=LoadBalancer \ 43 | --port 8080 44 | ``` 45 | 46 | ### Task 3: Setup an HTTP load balancer 47 | 48 | * Run the following from the **Cloud Terminal**: 49 | 50 | ```yaml 51 | cat << EOF > startup.sh 52 | #! /bin/bash 53 | apt-get update 54 | apt-get install -y nginx 55 | service nginx start 56 | sed -i -- 's/nginx/Google Cloud Platform - '"\$HOSTNAME"'/' /var/www/html/index.nginx-debian.html 57 | EOF 58 | 59 | gcloud compute instance-templates create web-server-template \ 60 | --metadata-from-file startup-script=startup.sh \ 61 | --network nucleus-vpc \ 62 | --machine-type g1-small \ 63 | --region us-east1 64 | 65 | gcloud compute instance-groups managed create web-server-group \ 66 | --base-instance-name web-server \ 67 | --size 2 \ 68 | --template web-server-template \ 69 | --region us-east1 70 | 71 | gcloud compute firewall-rules create web-server-firewall \ 72 | --allow tcp:80 \ 73 | --network nucleus-vpc 74 | 75 | gcloud compute http-health-checks create http-basic-check 76 | 77 | gcloud compute instance-groups managed \ 78 | set-named-ports web-server-group \ 79 | --named-ports http:80 \ 80 | --region us-east1 81 | 82 | gcloud compute backend-services create web-server-backend \ 83 | --protocol HTTP \ 84 | --http-health-checks http-basic-check \ 85 | --global 86 | 87 | gcloud compute backend-services add-backend web-server-backend \ 88 | --instance-group web-server-group \ 89 | --instance-group-region us-east1 \ 90 | --global 91 | 92 | gcloud compute url-maps create web-server-map \ 93 | --default-service web-server-backend 94 | 95 | gcloud compute target-http-proxies create http-lb-proxy \ 96 | --url-map web-server-map 97 | 98 | gcloud compute forwarding-rules create http-content-rule \ 99 | --global \ 100 | --target-http-proxy http-lb-proxy \ 101 | --ports 80 102 | 103 | gcloud compute forwarding-rules list 104 | ``` 105 | -------------------------------------------------------------------------------- /Build and Secure Networks in Google Cloud: Challenge Lab.md: 
-------------------------------------------------------------------------------- 1 | # Build and Secure Networks in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/quests/128?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 4 | 5 | ## Your challenge 6 | 7 | You need to configure this simple environment securely. Your first challenge is to set up appropriate firewall rules and virtual machine tags. You also need to ensure that SSH is only available to the bastion via IAP. 8 | 9 | For the firewall rules, make sure: 10 | 11 | - The bastion host does not have a public IP address. 12 | - You can only SSH to the bastion and only via IAP. 13 | - You can only SSH to juice-shop via the bastion. 14 | - Only HTTP is open to the world for `juice-shop`. 15 | 16 | Tips and tricks: 17 | 18 | - Pay close attention to the network tags and the associated VPC firewall rules. 19 | - Be specific and limit the size of the VPC firewall rule source ranges. 20 | - Overly permissive permissions will not be marked correct. 21 | 22 | ## Solving tasks 23 | 24 | ### Task 1: Check the firewall rules. Remove the overly permissive rules. 25 | 26 | * Go to **VPC network** > **Firewall** > will see **open-access** 27 | * Use the following command from the cloud console: 28 | 29 | ```yaml 30 | gcloud compute firewall-rules delete open-access 31 | ``` 32 | 33 | ### Task 2: Navigate to Compute Engine in the Cloud Console and identify the bastion host. The instance should be stopped. Start the instance 34 | 35 | * Go to **Compute Engine** > **VM Instances** > Select **bastion** > click on **Start** 36 | 37 | ### Task 3: The bastion host is the one machine authorized to receive external SSH traffic. Create a firewall rule that allows SSH (tcp/22) from the IAP service. The firewall rule should be enabled on bastion via a network tag. 38 | 39 | * Run the following: 40 | * Make sure you replace with the tag provided on the Left Pane. 41 | 42 | ```yaml 43 | gcloud compute firewall-rules create --allow=tcp:22 --source-ranges 35.235.240.0/20 --target-tags --network acme-vpc 44 | 45 | gcloud compute instances add-tags bastion --tags= --zone=us-central1-b 46 | ``` 47 | 48 | ### Task 4: The `juice-shop` server serves HTTP traffic. Create a firewall rule that allows traffic on HTTP (tcp/80) to any address. The firewall rule should be enabled on juice-shop via a network tag 49 | 50 | - Run the following: 51 | * Make sure you replace with the tag provided on the Left Pane. 52 | 53 | ```yaml 54 | gcloud compute firewall-rules create --allow=tcp:80 --source-ranges 0.0.0.0/0 --target-tags --network acme-vpc 55 | 56 | gcloud compute instances add-tags juice-shop --tags= --zone=us-central1-b 57 | ``` 58 | 59 | ### Task 5: You need to connect to `juice-shop` from the bastion using SSH. Create a firewall rule that allows traffic on SSH (tcp/22) from `acme-mgmt-subnet` network address. The firewall rule should be enabled on `juice-shop` via a network tag 60 | 61 | * Run the following: 62 | * Make sure you replace with the tag provided on the Left Pane. 63 | 64 | ```yaml 65 | gcloud compute firewall-rules create --allow=tcp:22 --source-ranges 192.168.10.0/24 --target-tags --network acme-vpc 66 | 67 | gcloud compute instances add-tags juice-shop --tags= --zone=us-central1-b 68 | ``` 69 | 70 | ### Task 6: In the Compute Engine instances page, click the SSH button for the bastion host. 
Once connected, SSH to `juice-shop` 71 | 72 | * Go to **Compute Engine** > **VM instances** > **SSH** to **bastion** host 73 | * Run the following: 74 | 75 | ```yaml 76 | ssh 77 | ``` 78 | 79 | -------------------------------------------------------------------------------- /Deploy to Kubernetes in Google Cloud: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Deploy to Kubernetes in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/10457?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | You are expected to create container images, store the images in a repository, and configure a Jenkins CI/CD pipeline to automate the build for the product. Your know that Kurt, your supervisor, will ask you to complete these tasks: 8 | 9 | - Create a Docker image and store the Dockerfile. 10 | - Test the created Docker image. 11 | - Push the Docker image into the Container Repository. 12 | - Use the image to create and expose a deployment in Kubernetes 13 | - Update the image and push a change to the deployment. 14 | - Create a pipeline in Jenkins to deploy a new version of your image when the source code changes. 15 | 16 | Some Jooli Inc. standards you should follow: 17 | 18 | - Create all resources in the `us-east1` region and `us-east1-b` zone, unless otherwise directed. 19 | - Use the project VPCs. 20 | - Naming is normally *team-resource*, e.g. an instance could be named **kraken-webserver1**. 21 | - Allocate cost effective resource sizes. Projects are monitored and excessive resource use will result in the containing project's termination (and possibly yours), so beware. This is the guidance the monitoring team is willing to share: unless directed, use `n1-standard-1`. 22 | 23 | ### Task 1: Create a Docker image and store the Dockerfile 24 | 25 | ```yaml 26 | gsutil cat gs://cloud-training/gsp318/marking/setup_marking.sh | bash 27 | gcloud source repos clone valkyrie-app 28 | cd valkyrie-app 29 | cat > Dockerfile <> /dev/null & 92 | printf $(kubectl get secret cd-jenkins -o jsonpath="{.data.jenkins-admin-password}" | base64 --decode);echo 93 | 94 | gcloud source repos list 95 | 96 | sed -i "s/green/orange/g" source/html.go 97 | 98 | # Update project in Jenkinsfile 99 | 100 | sed -i "s/YOUR_PROJECT/$GOOGLE_CLOUD_PROJECT/g" Jenkinsfile 101 | git config --global user.email "you@example.com" <------ put from first consol 102 | git config --global user.name "student" <--------- from login status 103 | git add . 104 | git commit -m "built pipeline init" 105 | git push 106 | ``` 107 | 108 | * Now, open the Jenkins window and run the **Build Pipeline**. 109 | 110 | -------------------------------------------------------------------------------- /Set up and Configure a Cloud Environment in Google Cloud: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Set up and Configure a Cloud Environment in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/quests/119?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 4 | 5 | ## Your challenge 6 | 7 | You need to help the team with some of their initial work on a new project. They plan to use WordPress and need you to set up a development environment. Some of the work was already done for you, but other parts require your expert skills. 8 | 9 | As soon as you sit down at your desk and open your new laptop you receive the following request to complete these tasks. 
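Before starting Task 1, it can help to set the default region and zone that the commands in this lab assume (a sketch; the values are the ones used in the tasks below):

```yaml
gcloud config set compute/region us-east1
gcloud config set compute/zone us-east1-b
```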
10 | 11 | ### Task 1: Create development VPC manually 12 | 13 | * Run the following from the **Cloud Terminal**: 14 | 15 | ```yaml 16 | gcloud compute networks create griffin-dev-vpc --subnet-mode custom 17 | 18 | gcloud compute networks subnets create griffin-dev-wp --network=griffin-dev-vpc --region us-east1 --range=192.168.16.0/20 19 | 20 | gcloud compute networks subnets create griffin-dev-mgmt --network=griffin-dev-vpc --region us-east1 --range=192.168.32.0/20 21 | ``` 22 | 23 | ### Task 2: Create production VPC using Deployment Manager 24 | 25 | * Run the following from the **Cloud Terminal**: 26 | 27 | ```yaml 28 | gsutil cp -r gs://cloud-training/gsp321/dm . 29 | 30 | cd dm 31 | 32 | sed -i s/SET_REGION/us-east1/g prod-network.yaml 33 | 34 | gcloud deployment-manager deployments create prod-network \ 35 | --config=prod-network.yaml 36 | ``` 37 | 38 | ### Task 3: Create bastion host 39 | 40 | * Run the following from the **Cloud Terminal**: 41 | 42 | ```yaml 43 | cd .. 44 | 45 | gcloud compute instances create bastion --network-interface=network=griffin-dev-vpc,subnet=griffin-dev-mgmt --network-interface=network=griffin-prod-vpc,subnet=griffin-prod-mgmt --tags=ssh --zone=us-east1-b 46 | 47 | gcloud compute firewall-rules create fw-ssh-dev --source-ranges=0.0.0.0/0 --target-tags ssh --allow=tcp:22 --network=griffin-dev-vpc 48 | 49 | gcloud compute firewall-rules create fw-ssh-prod --source-ranges=0.0.0.0/0 --target-tags ssh --allow=tcp:22 --network=griffin-prod-vpc 50 | ``` 51 | 52 | ### Task 4: Create and configure Cloud SQL Instance 53 | 54 | * Run the following from the **Cloud Terminal**: 55 | 56 | ```yaml 57 | gcloud sql instances create griffin-dev-db --root-password password --region=us-east1 58 | 59 | gcloud sql connect griffin-dev-db 60 | 61 | # Copy paste the following from the lab mannual 62 | CREATE DATABASE wordpress; 63 | GRANT ALL PRIVILEGES ON wordpress.* TO "wp_user"@"%" IDENTIFIED BY "stormwind_rules"; 64 | FLUSH PRIVILEGES; 65 | 66 | # Use the following to get out of the SQL terminal 67 | exit; 68 | ``` 69 | 70 | ### Task 5: Create Kubernetes cluster 71 | 72 | * Run the following from the **Cloud Terminal**: 73 | 74 | ```yaml 75 | gcloud container clusters create griffin-dev \ 76 | --network griffin-dev-vpc \ 77 | --subnetwork griffin-dev-wp \ 78 | --machine-type n1-standard-4 \ 79 | --num-nodes 2 \ 80 | --zone us-east1-b 81 | 82 | gcloud container clusters get-credentials griffin-dev --zone us-east1-b 83 | ``` 84 | 85 | ### Task 6: Prepare the Kubernetes cluster 86 | 87 | * Run the following from the **Cloud Terminal**: 88 | 89 | ```yaml 90 | gsutil cp -r gs://cloud-training/gsp321/wp-k8s . 
91 | 92 | cd wp-k8s 93 | 94 | sed -i s/username_goes_here/wp_user/g wp-env.yaml 95 | 96 | sed -i s/password_goes_here/stormwind_rules/g wp-env.yaml 97 | 98 | kubectl create -f wp-env.yaml 99 | 100 | gcloud iam service-accounts keys create key.json \ 101 | --iam-account=cloud-sql-proxy@$GOOGLE_CLOUD_PROJECT.iam.gserviceaccount.com 102 | 103 | kubectl create secret generic cloudsql-instance-credentials \ 104 | --from-file key.json 105 | ``` 106 | 107 | ### Task 7: Create a WordPress deployment 108 | 109 | * Run the following from the **Cloud Terminal**: 110 | 111 | ```yaml 112 | # Use the following for replace YOUR_SQL_INSTANCE with "griffin-dev-db" 113 | I=$(gcloud sql instances describe griffin-dev-db --format="value(connectionName)") 114 | 115 | sed -i s/YOUR_SQL_INSTANCE/$I/g wp-deployment.yaml 116 | 117 | kubectl create -f wp-deployment.yaml 118 | 119 | kubectl create -f wp-service.yaml 120 | ``` 121 | 122 | ### Task 8: Enable monitoring 123 | 124 | 1. Go to **OPERATIONS** > **Monitoring** 125 | 2. Wait for the workspace creation to complete 126 | 3. Go to **Uptime checks** > **CREATE UPTIME CHECKS** 127 | 4. Now enter the info as below: 128 | 129 | Uptime check 130 | 131 | ### Task 9: Provide access for an additional engineer 132 | 133 | 1. Go to **IAM & Admin** > **ADD** 134 | 2. Enter the second username and give him **Project** > **Editor** access in **Role** 135 | 136 | IAM 137 | -------------------------------------------------------------------------------- /Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/11044?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | As a junior data engineer in Jooli Inc. and recently trained with Google Cloud and a number of data services you have been asked to demonstrate your newly learned skills. The team has asked you to complete the following tasks. 8 | 9 | ### Task 1: Run a simple Dataflow job 10 | 11 | * Navigation menu > Storage > Browser 12 | 13 | * Create a **Storage** Bucket > Enter name as your GCP Project ID > Leave others to default > Create 14 | 15 | * Go to **BigQuery** > Select project ID > Create Dataset > Enter the name as `lab` and click on Create 16 | 17 | * Run the following from the Cloud Shell: 18 | 19 | ```yaml 20 | gsutil cp gs://cloud-training/gsp323/lab.csv . 21 | 22 | cat lab.csv 23 | 24 | gsutil cp gs://cloud-training/gsp323/lab.schema . 25 | 26 | cat lab.schema 27 | ``` 28 | 29 | * Now, create a table inside the `lab` dataset and configure it as follows: 30 | 31 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/screenshots/table_1.png) 32 | 33 | * Click on **Create table** 34 | 35 | * Go to **Dataflow** > Jobs > Create Job from Template 36 | 37 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/screenshots/job_1.png) 38 | 39 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/screenshots/job_2.png) 40 | 41 | * Run the Job. 
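After starting the job, you can watch its progress on the **Dataflow** > Jobs page or check its status from Cloud Shell (a sketch):

```yaml
gcloud dataflow jobs list --status=active
```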
42 | 43 | ### Task 2: Run a simple Dataproc job 44 | 45 | * Go to **Dataproc** > Clusters > Create Cluster 46 | 47 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/screenshots/cluster.png) 48 | 49 | * Select the Created Cluster > Go to **VM Instances** > **SSH** into cluster 50 | 51 | * Run the following command: 52 | 53 | ```yaml 54 | hdfs dfs -cp gs://cloud-training/gsp323/data.txt /data.txt 55 | ``` 56 | 57 | * Exit the **SSH** 58 | 59 | * Submit Job > Configure as given: 60 | 61 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/screenshots/job_3.png) 62 | 63 | * Click on **SUBMIT** 64 | 65 | ### Task 3: Run a simple Dataprep job 66 | 67 | * Go to **Dataprep** > Accept the terms > Login with the same account 68 | 69 | * Import Data > Select GCS > Edit > Enter the path as this: `gs://cloud-training/gsp323/runs.csv` > Import and Wrangle 70 | 71 | * Modify the table as specified in the lab instructions. 72 | 73 | ### Task 4: AI 74 | 75 | **PART 1** 76 | 77 | Use the following commands: 78 | 79 | ```yaml 80 | gcloud iam service-accounts create my-natlang-sa \ 81 | --display-name "my natural language service account" 82 | 83 | gcloud iam service-accounts keys create ~/key.json \ 84 | --iam-account my-natlang-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com 85 | 86 | export GOOGLE_APPLICATION_CREDENTIALS="/home/$USER/key.json" 87 | 88 | gcloud auth activate-service-account my-natlang-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com --key-file=$GOOGLE_APPLICATION_CREDENTIALS 89 | 90 | gcloud ml language analyze-entities --content="Old Norse texts portray Odin as one-eyed and long-bearded, frequently wielding a spear named Gungnir and wearing a cloak and a broad hat." > result.json 91 | 92 | gcloud auth login 93 | (Copy the token from the link provided) 94 | 95 | gsutil cp result.json gs://YOUR_PROJECT-marking/task4-cnl.result 96 | ``` 97 | 98 | **PART 2** 99 | 100 | * Create an API Key by going to IAM > Credentials, and export it as `API_KEY` variable in the Cloud Shell. 101 | 102 | * Create the following JSON file: 103 | 104 | ```yaml 105 | nano request.json 106 | 107 | { 108 | "config": { 109 | "encoding":"FLAC", 110 | "languageCode": "en-US" 111 | }, 112 | "audio": { 113 | "uri":"gs://cloud-training/gsp323/task4.flac" 114 | } 115 | } 116 | ``` 117 | 118 | * Run the following: 119 | 120 | > Replace `YOUR_PROJECT` with your GCP Project ID. 121 | 122 | ```yaml 123 | curl -s -X POST -H "Content-Type: application/json" --data-binary @request.json \ 124 | "https://speech.googleapis.com/v1/speech:recognize?key=${API_KEY}" > result.json 125 | 126 | gsutil cp result.json gs://YOUR_PROJECT-marking/task4-gcs.result 127 | ``` 128 | 129 | **PART 3** 130 | 131 | * Run the following: 132 | 133 | ```yaml 134 | gcloud iam service-accounts create quickstart 135 | 136 | gcloud iam service-accounts keys create key.json --iam-account quickstart@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com 137 | 138 | gcloud auth activate-service-account --key-file key.json 139 | 140 | export ACCESS_TOKEN=$(gcloud auth print-access-token) 141 | ``` 142 | 143 | * Modify the previous JSON file: 144 | 145 | ```yaml 146 | nano request.json 147 | 148 | { 149 | "inputUri":"gs://spls/gsp154/video/chicago.mp4", 150 | "features": [ 151 | "TEXT_DETECTION" 152 | ] 153 | } 154 | ``` 155 | 156 | * Run the following: 157 | 158 | > Replace `YOUR_PROJECT` with your GCP Project ID. 
159 | 160 | ```yaml 161 | curl -s -H 'Content-Type: application/json' \ 162 | -H "Authorization: Bearer $ACCESS_TOKEN" \ 163 | 'https://videointelligence.googleapis.com/v1/videos:annotate' \ 164 | -d @request.json 165 | 166 | 167 | 168 | curl -s -H 'Content-Type: application/json' -H "Authorization: Bearer $ACCESS_TOKEN" 'https://videointelligence.googleapis.com/v1/operations/OPERATION_FROM_PREVIOUS_REQUEST' > result1.json 169 | 170 | 171 | gsutil cp result1.json gs://YOUR_PROJECT-marking/task4-gvi.result 172 | ``` 173 | 174 | 175 | 176 | -------------------------------------------------------------------------------- /Insights from Data with BigQuery: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Insights from Data with BigQuery: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/11988?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | You're part of a public health organization which is tasked with identifying answers to queries related to the Covid-19 pandemic. Obtaining the right answers will help the organization in planning and focusing healthcare efforts and awareness programs appropriately. 8 | 9 | The dataset and table that will be used for this analysis will be : `bigquery-public-data.covid19_open_data.covid19_open_data`. This repository contains country-level datasets of daily time-series data related to COVID-19 globally. It includes data relating to demographics, economy, epidemiology, geography, health, hospitalizations, mobility, government response, and weather. 10 | 11 | * Go to **BigQuery** 12 | * Go to **Resources** > Add Data > Explore public datasets 13 | * Select **COVID-19 Open Data** > **VIEW DATASET** 14 | 15 | ### Query 1: Total Confirmed Cases 16 | 17 | ```sql 18 | SELECT sum(cumulative_confirmed) as total_cases_worldwide FROM `bigquery-public-data.covid19_open_data.covid19_open_data` where date='2020-04-15' 19 | ``` 20 | 21 | ### Query 2: Worst Affected Areas 22 | 23 | ```sql 24 | WITH deaths_by_states as ( 25 | SELECT subregion1_name as state, sum(cumulative_deceased) as death_count 26 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 27 | WHERE country_name="United States of America" and date='2020-04-10' and subregion1_name is NOT NULL 28 | GROUP BY subregion1_name 29 | ) 30 | 31 | SELECT count(*) as count_of_states 32 | FROM deaths_by_states 33 | WHERE death_count > 100 34 | ``` 35 | 36 | ### Query 3: Identifying Hotspots 37 | 38 | ```sql 39 | SELECT subregion1_name as state, sum(cumulative_confirmed) as total_confirmed_cases 40 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 41 | WHERE country_name="United States of America" and date='2020-04-10' and subregion1_name is NOT NULL 42 | GROUP BY subregion1_name 43 | HAVING total_confirmed_cases > 1000 44 | ORDER BY total_confirmed_cases DESC 45 | ``` 46 | 47 | ### Query 4: Fatality Ratio 48 | 49 | ```sql 50 | SELECT sum(cumulative_confirmed) as total_confirmed_cases, sum(cumulative_deceased) as total_deaths, (sum(cumulative_deceased)/sum(cumulative_confirmed))*100 as case_fatality_ratio 51 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 52 | WHERE country_name="Italy" and date BETWEEN "2020-04-01" AND "2020-04-30" 53 | ``` 54 | 55 | ### Query 5: Identifying specific day 56 | 57 | ```sql 58 | SELECT date 59 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 60 | WHERE country_name="Italy" and cumulative_deceased > 10000 61 | ORDER BY date ASC 62 | LIMIT 1 63 
| ``` 64 | 65 | ### Query 6: Finding days with zero net new cases 66 | 67 | ```sql 68 | WITH india_cases_by_date AS ( 69 | SELECT date, SUM( cumulative_confirmed ) AS cases 70 | FROM `bigquery-public-data.covid19_open_data.covid19_open_data` 71 | WHERE country_name ="India" AND date between '2020-02-21' and '2020-03-15' 72 | GROUP BY date 73 | ORDER BY date ASC 74 | ), india_previous_day_comparison AS ( 75 | SELECT date, cases, LAG(cases) OVER(ORDER BY date) AS previous_day, cases - LAG(cases) OVER(ORDER BY date) AS net_new_cases 76 | FROM india_cases_by_date 77 | ) 78 | 79 | SELECT count(*) 80 | FROM india_previous_day_comparison 81 | WHERE net_new_cases = 0 82 | ``` 83 | 84 | ### Query 7: Doubling rate 85 | 86 | ```sql 87 | WITH us_cases_by_date AS ( 88 | SELECT 89 | date, 90 | SUM(cumulative_confirmed) AS cases 91 | FROM 92 | `bigquery-public-data.covid19_open_data.covid19_open_data` 93 | WHERE 94 | country_name="United States of America" 95 | AND date between '2020-03-22' and '2020-04-20' 96 | GROUP BY 97 | date 98 | ORDER BY 99 | date ASC 100 | ) 101 | , us_previous_day_comparison AS 102 | (SELECT 103 | date, 104 | cases, 105 | LAG(cases) OVER(ORDER BY date) AS previous_day, 106 | cases - LAG(cases) OVER(ORDER BY date) AS net_new_cases, 107 | (cases - LAG(cases) OVER(ORDER BY date))*100/LAG(cases) OVER(ORDER BY date) AS percentage_increase 108 | FROM us_cases_by_date 109 | ) 110 | SELECT Date, cases as Confirmed_Cases_On_Day, previous_day as Confirmed_Cases_Previous_Day, percentage_increase as Percentage_Increase_In_Cases 111 | FROM us_previous_day_comparison 112 | WHERE percentage_increase > 10 113 | ``` 114 | 115 | ### Query 8: Recovery rate 116 | 117 | ```sql 118 | WITH cases_by_country AS ( 119 | SELECT 120 | country_name AS country, 121 | sum(cumulative_confirmed) AS cases, 122 | sum(cumulative_recovered) AS recovered_cases 123 | FROM 124 | bigquery-public-data.covid19_open_data.covid19_open_data 125 | WHERE 126 | date = '2020-05-10' 127 | GROUP BY 128 | country_name 129 | ) 130 | , recovered_rate AS 131 | (SELECT 132 | country, cases, recovered_cases, 133 | (recovered_cases * 100)/cases AS recovery_rate 134 | FROM cases_by_country 135 | ) 136 | SELECT country, cases AS confirmed_cases, recovered_cases, recovery_rate 137 | FROM recovered_rate 138 | WHERE cases > 50000 139 | ORDER BY recovery_rate desc 140 | LIMIT 10 141 | ``` 142 | 143 | ### Query 9: CDGR - Cumulative Daily Growth Rate 144 | 145 | ```sql 146 | WITH 147 | france_cases AS ( 148 | SELECT 149 | date, 150 | SUM(cumulative_confirmed) AS total_cases 151 | FROM 152 | `bigquery-public-data.covid19_open_data.covid19_open_data` 153 | WHERE 154 | country_name="France" 155 | AND date IN ('2020-01-24', 156 | '2020-05-10') 157 | GROUP BY 158 | date 159 | ORDER BY 160 | date) 161 | , summary as ( 162 | SELECT 163 | total_cases AS first_day_cases, 164 | LEAD(total_cases) OVER(ORDER BY date) AS last_day_cases, 165 | DATE_DIFF(LEAD(date) OVER(ORDER BY date),date, day) AS days_diff 166 | FROM 167 | france_cases 168 | LIMIT 1 169 | ) 170 | 171 | select first_day_cases, last_day_cases, days_diff, POW((last_day_cases/first_day_cases),(1/days_diff))-1 as cdgr 172 | from summary 173 | ``` 174 | 175 | ### Create a Datastudio report 176 | 177 | ```sql 178 | SELECT 179 | date, SUM(cumulative_confirmed) AS country_cases, 180 | SUM(cumulative_deceased) AS country_deaths 181 | FROM 182 | `bigquery-public-data.covid19_open_data.covid19_open_data` 183 | WHERE 184 | date BETWEEN '2020-03-15' 185 | AND '2020-04-30' 186 | AND country_name ="United 
States of America" 187 | GROUP BY date 188 | ``` 189 | 190 | * Open EXPLORE DATA 191 | * Select **Explore with Data Studio** 192 | * AUTHORIZE 193 | * You can view the chart here -------------------------------------------------------------------------------- /Integrate with Machine Learning APIs: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | # Integrate with Machine Learning APIs: Challenge Lab 2 | 3 | > Launch the lab [here](https://google.qwiklabs.com/focuses/12704?parent=catalog) 4 | 5 | ## Your challenge 6 | 7 | You have started a new role as a member of the Analytics team for Jooli Inc. You are expected to help with the development and assessment of data sets for your company's Machine Learning projects. Common tasks include preparing, cleaning, and analyzing diverse data sets. 8 | 9 | ### Task 1: Configure a service account to access the Machine Learning APIs, BigQuery, and Cloud Storage 10 | 11 | * Go to **Cloud Shell** and add the `PROJECT` environment variable. 12 | 13 | ```yaml 14 | export PROJECT= 15 | ``` 16 | 17 | * Go to **IAM & Admin** > Service Accounts > CREATE SERVICE ACCOUNT 18 | 19 | * Enter Service account name as `my-account@` > CREATE 20 | 21 | * Enter the two roles as **BigQuery Admin** and **Cloud Storage Admin** 22 | 23 | * Click CONTINUE 24 | 25 | ### Task 2: Create and download a credential file for your Service Account 26 | 27 | * Click on three dots > Create key > Key type: JSON > CREATE 28 | 29 | ### Task 3: Modify the Python script to extract text from image files 30 | 31 | * Three dots on Cloud Shell > Upload file > Select the key file 32 | 33 | * Find the script in Storage > Select the bucket > analyze-images.py > Download 34 | 35 | * Modify as follow: 36 | 37 | ```python 38 | # Dataset: image_classification_dataset 39 | # Table name: image_text_detail 40 | import os 41 | import sys 42 | 43 | # Import Google Cloud Library modules 44 | from google.cloud import storage, bigquery, language, vision 45 | client = vision.ImageAnnotatorClient() 46 | 47 | if ('GOOGLE_APPLICATION_CREDENTIALS' in os.environ): 48 | if (not os.path.exists(os.environ['GOOGLE_APPLICATION_CREDENTIALS'])): 49 | print ("The GOOGLE_APPLICATION_CREDENTIALS file does not exist.\n") 50 | exit() 51 | else: 52 | print ("The GOOGLE_APPLICATION_CREDENTIALS environment variable is not defined.\n") 53 | exit() 54 | 55 | if len(sys.argv)<3: 56 | print('You must provide parameters for the Google Cloud project ID and Storage bucket') 57 | print ('python3 '+sys.argv[0]+ '[PROJECT_NAME] [BUCKET_NAME]') 58 | exit() 59 | 60 | project_name = sys.argv[1] 61 | bucket_name = sys.argv[2] 62 | 63 | # Set up our GCS, BigQuery, and Natural Language clients 64 | storage_client = storage.Client() 65 | bq_client = bigquery.Client(project=project_name) 66 | nl_client = language.LanguageServiceClient() 67 | 68 | # Set up client objects for the vision and translate_v2 API Libraries 69 | vision_client = vision.ImageAnnotatorClient() 70 | 71 | # Setup the BigQuery dataset and table objects 72 | dataset_ref = bq_client.dataset('image_classification_dataset') 73 | dataset = bigquery.Dataset(dataset_ref) 74 | table_ref = dataset.table('image_text_detail') 75 | table = bq_client.get_table(table_ref) 76 | 77 | # Create an array to store results data to be inserted into the BigQuery table 78 | rows_for_bq = [] 79 | 80 | # Get a list of the files in the Cloud Storage Bucket 81 | files = storage_client.bucket(bucket_name).list_blobs() 82 | bucket = 
storage_client.bucket(bucket_name) 83 | 84 | print('Processing image files from GCS. This will take a few minutes..') 85 | 86 | # Process files from Cloud Storage and save the result to send to BigQuery 87 | for file in files: 88 | if file.name.endswith('jpg') or file.name.endswith('png'): 89 | file_content = file.download_as_string() 90 | 91 | # TBD: Create a Vision API image object called image_object 92 | image_object = vision.Image(content=file_content) 93 | # Ref: https://googleapis.dev/python/vision/latest/gapic/v1/types.html#google.cloud.vision_v1.types.Image 94 | 95 | 96 | # TBD: Detect text in the image and save the response data into an object called response 97 | response = client.text_detection(image=image_object) 98 | # Ref: https://googleapis.dev/python/vision/latest/gapic/v1/api.html#google.cloud.vision_v1.ImageAnnotatorClient.document_text_detection 99 | 100 | 101 | # Save the text content found by the vision API into a variable called text_data 102 | text_data = response.text_annotations[0].description 103 | 104 | # Save the text detection response data in .txt to cloud storage 105 | file_name = file.name.split('.')[0] + '.txt' 106 | blob = bucket.blob(file_name) 107 | # Upload the contents of the text_data string variable to the Cloud Storage file 108 | blob.upload_from_string(text_data, content_type='text/plain') 109 | 110 | # Extract the description and locale data from the response file 111 | # into variables called desc and locale 112 | # using response object properties e.g. response.text_annotations[0].description 113 | desc = response.text_annotations[0].description 114 | locale = response.text_annotations[0].locale 115 | 116 | # if the locale is English (en) save the description as the translated_txt 117 | if locale == 'en': 118 | translated_text = desc 119 | else: 120 | # TBD: For non EN locales pass the description data to the translation API 121 | # ref: https://googleapis.dev/python/translation/latest/client.html#google.cloud.translate_v2.client.Client.translate 122 | # Set the target_language locale to 'en') 123 | from google.cloud import translate_v2 as translate 124 | translate_client = translate.Client() 125 | translation = translate_client.translate(desc, target_language='en',format_='text') 126 | translated_text = translation['translatedText'] 127 | print(translated_text) 128 | 129 | # if there is response data save the original text read from the image, 130 | # the locale, translated text, and filename 131 | if len(response.text_annotations) > 0: 132 | rows_for_bq.append((desc, locale, translated_text, file.name)) 133 | 134 | print('Writing Vision API image data to BigQuery...') 135 | # Write original text, locale and translated text to BQ 136 | # TBD: When the script is working uncomment the next line to upload results to BigQuery 137 | errors = bq_client.insert_rows(table, rows_for_bq) 138 | assert errors == [] 139 | ``` 140 | 141 | * Three dots Cloud Shell > Upload the file. 142 | 143 | * Run in Cloud Shell: 144 | 145 | ```yaml 146 | export GOOGLE_APPLICATION_CREDENTIALS=key.json 147 | 148 | python3 analyze-images.py $DEVSHELL_PROJECT_ID $DEVSHELL_PROJECT_ID 149 | ``` 150 | 151 | ### Task 4: Modify the Python script to translate the text using the Translation API 152 | 153 | Already implemented in the previous code. 
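For reference, the Translation API call used in the script above behaves like this standalone sketch (the sample string is purely illustrative):

```python
from google.cloud import translate_v2 as translate

translate_client = translate.Client()
# Translate an arbitrary non-English string into English; the result is a dict.
result = translate_client.translate("Achtung, Baustelle", target_language='en', format_='text')
print(result['translatedText'], result['detectedSourceLanguage'])
```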
154 | 155 | ### Task 5: Identify the most common non-English language used in the signs in the data set 156 | 157 | * Go to **BigQuery** 158 | 159 | * Run the query: 160 | 161 | ```sql 162 | SELECT locale,COUNT(locale) as lcount FROM image_classification_dataset.image_text_detail GROUP BY locale ORDER BY lcount DESC 163 | ``` 164 | 165 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ![Banner](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/assets/banner.png) 2 | 3 | This repository contains the some important information related to **30 Days of Google Cloud** program as well as some of the solutions to the **Challenge Labs**! 4 | 5 | > **[UPDATE 29/10/2020]**: All the solutions for challenge labs of the `Data Science & Machine Learning Track` & `Cloud Engineering Track` are uploaded. 6 | 7 | ## Challenge Lab Solutions 8 | 9 | These are the solutions to the Challenge Labs of the **Cloud Engineering Track**. 10 | 11 | > Please try the labs yourself and take help from here only if you get stuck. 12 | 13 | 1. [Getting Started: Create and Manage Cloud Resources: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Getting%20Started:%20Create%20and%20Manage%20Cloud%20Resources:%20Challenge%20Lab.md) 14 | 2. [Perform Foundational Infrastructure Tasks in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Perform%20Foundational%20Infrastructure%20Tasks%20in%20Google%20Cloud%20Challenge%20Lab.md) 15 | 3. [Setup and Configure a Cloud Environment in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Set%20up%20and%20Configure%20a%20Cloud%20Environment%20in%20Google%20Cloud:%20Challenge%20Lab.md) 16 | 4. [Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Deploy%20and%20Manage%20Cloud%20Environments%20with%20Google%20Cloud:%20Challenge%20Lab.md) 17 | 5. [Build and Secure Networks in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Build%20and%20Secure%20Networks%20in%20Google%20Cloud:%20Challenge%20Lab.md) 18 | 6. [Deploy to Kubernetes in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Deploy%20to%20Kubernetes%20in%20Google%20Cloud:%20Challenge%20Lab.md) 19 | 20 | Solutions to the Challenge Labs of the **Data Science & Machine Learning Track**. 21 | 22 | 1. [Getting Started: Create and Manage Cloud Resources: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Getting%20Started:%20Create%20and%20Manage%20Cloud%20Resources:%20Challenge%20Lab.md) 23 | 2. [Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Perform%20Foundational%20Data%2C%20ML%2C%20and%20AI%20Tasks%20in%20Google%20Cloud:%20Challenge%20Lab.md) 24 | 3. [Insights from Data with BigQuery: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Insights%20from%20Data%20with%20BigQuery:%20Challenge%20Lab.md) 25 | 4. [Engineer Data in Google Cloud: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Engineer%20Data%20in%20Google%20Cloud:%20Challenge%20Lab.md) 26 | 5. 
[Integrate with Machine Learning APIs: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Integrate%20with%20Machine%20Learning%20APIs:%20Challenge%20Lab.md) 27 | 6. [Explore Machine Learning Models with Explainable AI: Challenge Lab](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/blob/master/Explore%20Machine%20Learning%20Models%20with%20Explainable%20AI:%20Challenge%20Lab.md) 28 | 29 | ## Resources 30 | 31 | * [[IMPORTANT] 30 Days of Google Cloud Program description page](https://events.withgoogle.com/30daysofgooglecloud/) 32 | * [[IMPORTANT] Google Cloud Docs](https://cloud.google.com/docs) 33 | * [Google Cloud Blog](https://cloud.google.com/blog/) 34 | * [[Article] Migration to Google Cloud Platform (GCP)](https://blog.hike.in/migration-to-google-cloud-platform-gcp-17c397e564b8) 35 | 36 | ## Program Syllabus 37 | 38 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/assets/badges.png) 39 | 40 | * Track 1: Cloud Engineering Track 41 | 42 | * [Getting Started: Create and Manage Cloud Resources](https://google.qwiklabs.com/quests/120) 43 | * [Perform Foundational Infrastructure Tasks in Google Cloud](https://google.qwiklabs.com/quests/118) 44 | * [Setup and Configure a Cloud Environment in Google Cloud](https://google.qwiklabs.com/quests/119?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 45 | * [Deploy and Manage Cloud Environments with Google Cloud](https://google.qwiklabs.com/quests/121?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 46 | * [Build and Secure Networks in Google Cloud](https://google.qwiklabs.com/quests/128?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 47 | * [Deploy to Kubernetes in Google Cloud](https://google.qwiklabs.com/quests/116?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 48 | 49 | * Track 2: Data Science and Machine Learning Track 50 | * [Getting Started: Create and Manage Cloud Resources](https://google.qwiklabs.com/quests/120) 51 | * [Perform Foundational Data, ML, and AI Tasks in Google Cloud](https://google.qwiklabs.com/quests/117?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 52 | * [Insights from Data with BigQuery](https://google.qwiklabs.com/quests/123) 53 | * [Engineer Data in Google Cloud](https://google.qwiklabs.com/quests/132) 54 | * [Integrate with Machine Learning APIs](https://google.qwiklabs.com/quests/136?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 55 | * [Explore Machine Learning Models with Explainable AI](https://google.qwiklabs.com/quests/126?utm_source=google&utm_medium=lp&utm_campaign=gcpskills) 56 | 57 | ## Rules for availing the prizes 58 | 59 | To earn prizes in the program, you need to **complete either of the track**! 60 | 61 | See the tracks and the prizes associated with them below. Your progress is recorded on a daily basis and will be evaluate at the end of the program i.e. **5th November 11:59 PM IST**. 62 | 63 | You will receive another form in the **last week of October** for you to confirm the tracks that you have finished and reconfirm your t-shirt size and shipping address. 64 | 65 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/assets/prizes_table.png) 66 | 67 | > For more information refer [here](https://events.withgoogle.com/30daysofgooglecloud/prize-rules/#content). 68 | 69 | ## Support 70 | 71 | * Email: dscsupport@qwiklabs.com 72 | * There is online chat support available 24/7 for **Qwiklabs**. 
Select the Department as "**30 Days of Google Cloud**" 73 | 74 | ## License 75 | 76 | Copyright (c) 2020 Souvik Biswas [Cloud Facilitator] 77 | 78 | Permission is hereby granted, free of charge, to any person obtaining a copy 79 | of this software and associated documentation files (the "Software"), to deal 80 | in the Software without restriction, including without limitation the rights 81 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 82 | copies of the Software, and to permit persons to whom the Software is 83 | furnished to do so, subject to the following conditions: 84 | 85 | The above copyright notice and this permission notice shall be included in all 86 | copies or substantial portions of the Software. 87 | 88 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 89 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 90 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 91 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 92 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 93 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 94 | SOFTWARE. 95 | 96 | ![](https://github.com/DSC-IIIT-Kalyani/qwiklabs_challenges/raw/master/assets/dsc_logo.png) 97 | --------------------------------------------------------------------------------