├── 2024 Bracketology with Google Machine Learning
│   └── quicklabgsp461.sh
├── 2024 Collect Metrics from Exporters using the Managed Service for Prometheus
│   └── quicklabgsp1026.sh
├── 2024 Securing Compute Engine Applications and Resources using BeyondCorp Enterprise BCE
│   └── quicklabgsp1033.sh
├── 2024 Troubleshooting and Solving Data Join Pitfalls
│   └── quicklabgsp412.sh
├── API Gateway Qwik Start
│   └── quicklabgsp872.sh
├── APIs Explorer App Engine
│   ├── GSP422.md
│   └── quicklabgsp422.sh
├── APIs Explorer Cloud SQL
│   ├── GSP423.md
│   └── quicklabgsp423.sh
├── APIs Explorer Cloud Storage
│   ├── demo-image1-copy.png
│   ├── demo-image1.png
│   ├── demo-image2.png
│   └── quicklabgsp421.sh
├── APIs Explorer Compute Engine
│   ├── GSP293.md
│   └── quicklabgsp293.sh
├── APIs Explorer Create and Update a Cluster
│   ├── GSP288.md
│   └── quicklabgsp288.sh
├── APIs Explorer Qwik Start
│   └── quicklabgsp277.sh
├── AUGUST Configure Secure RDP using a Windows Bastion Host.md
├── AUGUST Create ML Models with BigQuery ML Challenge Lab
│   ├── newquicklabgsp341.sh
│   └── quicklabgsp341.sh
├── AUGUST Serverless Firebase Development Challenge Lab
│   ├── GSP344.md
│   ├── quicklab.sh
│   └── quicklabgsp344.sh
├── Activity Add and manage users with Linux commands
│   └── quicklab.sh
├── Activity Filter a SQL query.md
├── Activity- Examine input:output in the Linux shell
│   └── quicklab.sh
├── Activity- Filter with AND, OR, and NOT
├── Activity: Apply more filters in SQL
├── Activity: Complete a SQL join
├── Activity: Decrypt an encrypted message.md
├── Activity: Filter with grep
├── Activity: Find files with Linux commands
├── Activity: Get help in the command line
│   └── quicklab.md
├── Activity: Install software in a Linux.md
├── Activity: Manage authorization
├── Activity: Manage files with Linux commands.md
├── Activity: Perform a SQL query.md
├── Adding a Phone Gateway to a Virtual Agent
│   ├── GSP793.md
│   ├── pigeon-travel-gsp-793-cloud-function
│   │   ├── index.js
│   │   ├── package.json
│   │   └── quicklab.zip
│   └── quicklabgsp793.sh
├── Administering an AlloyDB Database
│   └── GSP1086.md
├── Alerting in Google Cloud
│   └── quicklab.sh
├── AlloyDB Database Fundamentals
│   └── GSP1083.md
├── Analyze Customer Reviews with Gemini Using Python Notebooks
│   ├── GSP1249.md
│   └── quicklabgsp1249.sh
├── Analyze Customer Reviews with Gemini Using SQL
│   ├── GSP1246.md
│   └── quicklabgsp1246.sh
├── Analyze Images with the Cloud Vision API Challenge Lab
│   └── quicklabarc122.sh
├── Analyze Sentiment with Natural Language API Challenge Lab
│   └── quicklabarc130.sh
├── Analyze and activate your data with Looker Enterprise.md
├── Analyze data with Gemini assistance
│   └── quicklab.sh
├── Analyzing Movie Posters in BigQuery with Remote Models
│   ├── GSP1247.md
│   └── quicklabgsp1247.sh
├── Analyzing Network Traffic with VPC Flow Logs
│   ├── Quicklab.md
│   └── quicklab.sh
├── Analyzing and Visualizing Data in Looker
├── Answering Complex Questions Using Native Derived Tables with LookML
├── App Dev Deploying the Application into Kubernetes Engine Node.js
│   ├── Quicklab.md
│   └── quicklab.sh
├── App Dev Deploying the Application into Kubernetes Engine Python
│   └── quicklabgsp188.sh
├── App Dev Processing Cloud PubSub Data using Cloud Functions Node.js
│   └── quicklab.sh
├── App Dev Setting up a Development Environment Java
│   ├── Quicklab.md
│   └── quicklab.sh
├── App Dev Setting up a Development Environment Node.js
│   ├── Quicklab.md
│   └── quicklab.sh
├── App Dev Setting up a Development Environment Python
│   └── quicklab.sh
├── App Dev Storing Application Data in Cloud Datastore Python
│   ├── GSP184.md
│   └── quicklabgsp184.sh
├── App Dev Storing Image and Video Files in Cloud Storage Python
│   └── quicklabgsp185.sh
├── App DevStoring Image and Video Files in Cloud Storage v11
│   ├── Quicklab.md
│   ├── cloudstorage.js
│   └── quicklab.sh
├── App Engine 3 Ways Challenge Lab
│   ├── ARC112.md
│   └── quicklabarc112.sh
├── App Engine Qwik Start Go
│   └── quicklabgsp070.sh
├── App Engine Qwik Start Java
│   └── quicklabgsp068.sh
├── App Engine Qwik Start PHP
│   └── quicklabgsp069.sh
├── App Engine Qwik Start Python
│   └── quicklabgsp067.sh
├── App Engine: 3 Ways: Challenge Lab
├── AppSheet to Google Chat using Webhooks from Automation Bots
│   └── GSP756.md
├── Application Load Balancer with Cloud Armor
│   ├── GSP215.md
│   └── quicklabgsp215.sh
├── Apply RFM method to segment customer data
│   ├── Quicklab.md
│   └── quicklab.sh
├── Arcade Hero
│   ├── mdfiles
│   │   ├── ARC1200.md
│   │   ├── ARC1201.md
│   │   ├── ARC1202.md
│   │   ├── ARC1203.md
│   │   ├── ARC1204.md
│   │   ├── ARC1208.md
│   │   ├── ARC1209.md
│   │   ├── ARC1210.md
│   │   ├── ARC1211.md
│   │   ├── ARC1221.md
│   │   ├── ARC1223.md
│   │   ├── ARC233.md
│   │   └── ARC236.md
│   ├── quicklabarc1200.sh
│   ├── quicklabarc1201.sh
│   ├── quicklabarc1202.sh
│   ├── quicklabarc1203.sh
│   ├── quicklabarc1204.sh
│   ├── quicklabarc1208.sh
│   ├── quicklabarc1209.sh
│   ├── quicklabarc1210.sh
│   ├── quicklabarc1211.sh
│   ├── quicklabarc1221.sh
│   ├── quicklabarc1223.sh
│   ├── quicklabarc124.sh
│   ├── quicklabarc127.sh
│   ├── quicklabarc130.sh
│   ├── quicklabarc230.sh
│   ├── quicklabarc231.sh
│   ├── quicklabarc232.sh
│   ├── quicklabarc233.sh
│   ├── quicklabarc234.sh
│   ├── quicklabarc235.sh
│   ├── quicklabarc236.sh
│   └── quicklabarc237.sh
├── Assessing Data Quality with Dataplex
│   └── quicklabgsp1158.sh
├── August 2023 Optimize Costs for Google Kubernetes Engine Challenge Lab
│   └── quicklabgsp343.sh
├── August Automate Interactions with Contact Center AI Challenge Lab.md
├── August Cloud SQL Loading data into Google Cloud SQL
│   └── quicklabarc116.sh
├── Authentication Authorization and Identity with Vault
│   ├── GSP1005.md
│   └── quicklabgsp1005.sh
├── Automate Data Capture at Scale with Document AI Challenge Lab
│   ├── GSP367.md
│   └── quicklabgsp367.sh
├── Automate Interactions with Contact Center AI: Challenge Lab
├── Automatically Deploy Python Web Apps from Version Control to Cloud Run
│   ├── GSP1204.md
│   ├── task1.sh
│   └── task2.sh
├── Automatically Generate a PDF and send it by Email
├── Automating Infrastructure on Google Cloud with Terraform: Challenge Lab
├── Automating the Deployment of Infrastructure Using Terraform
│   ├── instance
│   │   └── main.tf
│   ├── mynetwork.tf
│   ├── provider.tf
│   ├── quicklab.sh
│   ├── terraform.tfstate
│   └── variables.tf
├── Autoscaling TensorFlow Model Deployments with TF Serving and Kubernetes
│   ├── GSP777.md
│   └── quicklabgsp777.sh
├── Autoscaling an Instance Group with Custom Cloud Monitoring Metrics
│   └── quicklabgsp087.sh
├── Awwvision Cloud Vision API from a Kubernetes Cluster
│   └── quicklabgsp066.sh
├── Bastion Host
│   ├── Quicklab.md
│   └── quicklab.sh
├── BigLake Qwik Start
│   └── quicklabgsp1040.sh
├── BigQuery Core Challenge Lab
│   ├── Quicklab.md
│   └── quicklab.sh
├── BigQuery Machine Learning using Soccer Data
│   └── quicklabgsp851.sh
├── BigQuery Qwik Start Command Line
│   ├── names.zip
│   └── quicklabgsp071.sh
├── BigQuery Soccer Data Analysis
│   └── quicklabgsp849.sh
├── BigQuery Soccer Data Analytical Insight
│   └── quicklabsgsp850.sh
├── BigQuery Soccer Data Ingestion
│   └── quicklabgsp848.sh
├── BigQuery: Qwik Start - Console
├── Bigtable Qwik Start Command Line
│   └── quicklabgsp099.sh
├── Block.one: Creating a Multi Node EOSIO Blockchain
├── Bot Management with Google Cloud Armor and reCAPTCHA
│   └── quicklabgsp877.sh
├── Bracketology with Google Machine Learning
├── Build Apps with Gemini Code Assist
│   ├── Quicklab.md
│   ├── convert.html
│   ├── index.html
│   ├── task1.sh
│   └── task2.sh
├── Build Google Cloud Infrastructure for AWS Professionals Challenge Lab
│   ├── GSP511.md
│   └── quicklabgsp511.sh
├── Build Google Cloud Infrastructure for Azure Professionals Challenge Lab
│   └── quicklabgsp512.sh
├── Build Infrastructure with Terraform on Google Cloud Challenge Lab
│   └── quicklab345.sh
├── Build LangChain Applications using Vertex AI Challenge Lab
│   └── retrieval_augmentation_generation_challenge.ipynb
├── Build LookML Objects in Looker: Challenge Lab
├── Build a BI Dashboard Using Looker Studio and BigQuery
│   └── quicklabgsp403.sh
├── Build a Chat Application using the PaLM 2 API on Cloud Run
│   ├── GSP1201.md
│   └── quicklabgsp1201.sh
├── Build a Data Mesh with Dataplex Challenge Lab
│   └── quicklabgsp514.sh
├── Build a Data Warehouse with BigQuery Challenge Lab
│   ├── GSP340.md
│   └── quicklabgsp340.sh
├── Build a Generative AI solution using a RAG Framework Challenge Lab L400
│   └── quicklab.ipynb
├── Build a Multi-Modal GenAI Application Challenge Lab
│   ├── Quicklab.md
│   └── quicklab.sh
├── Build a RAG Solution with Firebase Genkit
│   ├── Quicklab.md
│   ├── index.ts
│   ├── task1.sh
│   └── task4.sh
├── Build a Resilient, Asynchronous System with Cloud Run and Pub:Sub
│   ├── GSP650.md
│   └── quicklabgsp650.sh
├── Build a Secure Google Cloud Network Challenge Lab
│   ├── GSP322.md
│   └── quicklabgsp322.sh
├── Build a Serverless App with Cloud Run that Creates PDF Files
│   └── quicklabgsp644.sh
├── Build a Serverless Web App with Firebase
│   ├── Build a Serverless Web App with Firebase.md
│   ├── GSP643.md
│   ├── quicklab.sh
│   ├── quicklab11.sh
│   └── quicklabgsp643.sh
├── Build a Website on Google Cloud: Challenge Lab.md
├── Build a Website on Google Cloud: Challenge Lab
│   ├── quicklab.md
│   └── quicklabgsp319.sh
├── Build an AI Image Generator app using Imagen on Vertex AI
│   └── quicklab.sh
├── Build an AI Image Recognition app using Gemini on Vertex AI
│   └── quicklab.sh
├── Build an application to send Chat Prompts using the Gemini model
│   └── quicklab.sh
├── Build and Deploy Machine Learning Solutions with Vertex AI Challenge Lab
│   └── vertex-challenge-lab.ipynb
├── Build and Deploy a Docker Image to a Kubernetes Cluster
│   └── quicklabgsp304.sh
├── Build and Execute MySQL PostgreSQL and SQLServer to Data Catalog Connectors
│   └── quicklabgsp814.sh
├── Build and Optimize Data Warehouses with BigQuery Challenge Lab
│   └── quicklabgsp340.sh
├── Build and Optimize Data Warehouses with BigQuery: Challenge Lab
├── Build and Secure Networks in Google Cloud: Challenge Lab
├── Building Event-Driven Orchestration on Google Cloud
│   ├── Quicklab.md
│   └── quicklab.sh
├── Building Transformations and Preparing Data with Wrangler in Cloud Data Fusion
│   └── quicklab.sh
├── Building Virtual Agent Fulfillment
│   ├── Building Virtual Agent Fulfillment.md
│   ├── quicklab
│   │   ├── index.js
│   │   └── package.json
│   ├── quicklabgsp792.sh
│   └── quicklabgsp792.zip
├── Building a High throughput VPN
│   └── quicklabgsp062.sh
├── Building an Application with MongoDB Atlas and Natural Language API hosted on Cloud Run.md
├── CI CD for a TFX pipeline
├── CICD on Google Cloud Challenge Lab
│   ├── GSP393.md
│   └── quicklabgsp393.sh
├── CICD pipeline using GCP
│   └── quicklab.sh
├── Caching Content with Cloud CDN
│   ├── Quicklab.md
│   └── quicklab.sh
├── Change firewall rules using Terraform and Cloud Shell
│   └── quicklab.sh
├── Check my progress
│   └── bookmark.md
├── Classify Images of Clouds in the Cloud with AutoML Images
│   └── quicklabgsp223.sh
├── Classify Images with TensorFlow Convolutional Neural Networks
│   └── CLS_Vertex_AI_CNN_fmnist-v1.0.0.ipynb
├── Classify Images with TensorFlow on Google Cloud Challenge Lab
│   └── quicklab.ipynb
├── Classify Text into Categories with the Natural Language API
│   └── quicklabgsp063.sh
├── Clean Up Unused IP Addresses
│   ├── GSP646.md
│   └── quicklabgsp646.sh
├── Clean Up Unused and Orphaned Persistent Disks
│   ├── GSP648.md
│   ├── gsp648.sh
│   └── quicklabgsp648.sh
├── Cloud Armor Preconfigured WAF Rules
│   └── quicklabgsp879.sh
├── Cloud Audit Logs
│   └── quicklab.sh
├── Cloud CDN
│   └── quicklabgsp217.sh
├── Cloud Composer Copying BigQuery Tables Across Different Locations
│   └── quicklabgsp283.sh
├── Cloud Composer Qwik Start Command Line
│   └── quicklabgsp606.sh
├── Cloud DNS Traffic Steering using Geolocation Policy
│   └── quicklabgsp1008.sh
├── Cloud Data Loss Prevention API Qwik Start
│   ├── GSP107.md
│   └── quicklabgsp107.sh
├── Cloud Datastore v15
│   ├── Quicklab.md
│   └── quicklab.sh
├── Cloud Digital Leader Learning Path
│   ├── 01 Digital Transformation with Google Cloud
│   │   ├── 01 Why Cloud Technology is Transforming Business.md
│   │   ├── 02 Fundamental Cloud Concepts.md
│   │   └── 03 Cloud Computing Benefits and Models.md
│   ├── 02 Exploring Data Transformation with Google Cloud
│   │   ├── 01 The Value of Data.md
│   │   ├── 02 Google Cloud Data Management Solutions.md
│   │   └── 03 Making Data Useful and Accessible.md
│   ├── 03 Innovating with Google Cloud Artificial Intelligence
│   │   ├── 01 AI and ML Fundamentals.md
│   │   └── 02 Google Cloud’s AI and ML Solutions.md
│   ├── 04 Modernize Infrastructure and Applications with Google Cloud
│   │   ├── 01 Modernizing Infrastructure in the Cloud.md
│   │   └── 02 Modernizing Applications in the Cloud.md
│   ├── 05 Trust and Security with Google Cloud
│   │   ├── 01 Trust and Security in the Cloud.md
│   │   ├── 02 Google’s Trusted Infrastructure.md
│   │   └── 03 Google Cloud’s Trust Principles and Compliance.md
│   └── 06 Scaling with Google Cloud Operations
│       ├── 01 Financial Governance and Managing Cloud Costs.md
│       ├── 02 Operational Excellence and Reliability at Scale.md
│       └── 03 Sustainability with Google Cloud.md
├── Cloud Endpoints Qwik Start
│   └── quicklabgsp164.sh
├── Cloud Filestore Qwik Start
│   └── quicklabgsp244.sh
├── Cloud Foundation Toolkit CFT Training Cloud Function
│   └── quicklabgsp803.sh
├── Cloud Foundation Toolkit CFT Training Load Balancer
│   └── quicklabgsp802.sh
├── Cloud Functions 2nd Gen- Qwik Start.md
├── Cloud Functions 2nd Gen: Qwik Start
├── Cloud Functions 2nd GenQwik Start
│   ├── Cloud Functions 2nd Gen- Qwik Start.md
│   ├── task1.sh
│   └── task2.sh
├── Cloud Functions Qwik Start Console
│   ├── GSP081.md
│   └── quicklabsgsp081.sh
├── Cloud Functions Qwik Start Command Line
│   └── quicklabgsp080.sh
├── Cloud Functions: 3 Ways: Challenge Lab
├── Cloud IAM Qwik Start
│   └── quicklabgsp064.sh
├── Cloud IAM v15
│   ├── Quicklab.md
│   └── quicklab.sh
├── Cloud Logging on Kubernetes Engine
│   └── quicklabgsp483.sh
├── Cloud Monitoring Qwik Start
│   └── quicklabgsp089.sh
├── Cloud Natural Language API Qwik Start
│   └── quicklabgsp097.sh
├── Cloud Operations for GKE
│   ├── GSP497.md
│   └── quicklabgsp497.sh
├── Cloud Run Functions Qwik Start Console
│   ├── GSP081.md
│   ├── GSP1089.md
│   ├── newquicklabgsp081.sh
│   └── quicklabgsp081.sh
├── Cloud SQL for MySQL Qwik Start
│   └── quicklabgsp151.sh
├── Cloud SQL for PostgreSQL Qwik Start
│   └── quicklabgsp152.sh
├── Cloud SQL with Cloud Run [APPRUN].md
├── Cloud SQL with Terraform
│   └── quicklabgsp234.sh
├── Cloud Scheduler: Qwik Start
├── Cloud Spanner Database Fundamentals
│   └── quicklabgsp1048.sh
├── Cloud Spanner - Database Fundamentals
├── Cloud Spanner Qwik Start
│   ├── GSP102.md
│   └── quicklabgsp102.sh
├── Cloud Speech API 3 Ways: Challenge Lab.md
├── Cloud Storage AWS
│   ├── Quicklab.md
│   ├── task1.sh
│   └── task2.sh
├── Cloud Storage Qwik Start CLISDK
│   └── quicklabgsp074.sh
├── Cloud Storage Qwik Start Cloud Console
│   └── quicklabgsp073.sh
├── Cloud Storage
│   ├── task1.sh
│   └── task2.sh
├── Collect Metrics from Exporters using the Managed Service for Prometheus
├── Collect process and store data in BigQuery
│   └── quicklab.sh
├── Compute Engine Qwik Start Windows
│   └── quicklabgsp093.sh
├── Configure Secure RDP using a Windows Bastion Host
├── Configure Service Accounts and IAM for Google Cloud: Challenge Lab
├── Configuring IAM Permissions with gCloud - Azure.md
├── Configuring IAM Permissions with gcloud
├── Configuring IAM Permissions with gcloud - AWS.md
├── Configuring IAM Permissions with gcloud updated
│   └── quicklabgsp647.sh
├── Configuring Liveness and Readiness Probes
│   └── quicklab.sh
├── Configuring Networks via gcloud
│   └── quicklabgsp630.sh
├── Configuring Pod Autoscaling and NodePools
├── Configuring Traffic Blocklisting with Google Cloud Armor
│   ├── Quicklab.md
│   └── quicklab.sh
├── Configuring Traffic Management with a Load Balancer
│   └── quicklab.sh
├── Configuring Using and Auditing VM Service Accounts and Scopes
│   ├── Quicklab.md
│   └── quicklab.sh
├── Configuring VPC Firewalls
│   ├── Quicklab.md
│   └── quicklab.sh
├── Configuring VPC Network Peering
│   ├── Quicklab.md
│   ├── quicklab.sh
│   ├── task1.sh
│   ├── task2.sh
│   ├── task3.sh
│   ├── task4.sh
│   └── task5.sh
├── Configuring an HTTP Load Balancer with Autoscaling Azure
│   └── quicklab.sh
├── Configuring an HTTP Load Balancer with Autoscaling
│   └── quicklab.sh
├── Configuring an HTTP Load Balancer with Google Cloud Armor
├── Configuring an Internal Load Balancer
├── Configuring and Using Cloud Logging and Cloud Monitoring
│   └── quicklab.sh
├── Configuring and Viewing Cloud Audit Logs
│   ├── Quicklab.md
│   └── quicklab.sh
├── Connect Cloud Run Functions
│   ├── Quicklab.md
│   └── quicklab.sh
├── Connect an App to a Cloud SQL for PostgreSQL Instance
│   ├── GSP919.md
│   └── quicklabgsp919.sh
├── Connect and Configure Data for your AppSheet App
│   ├── README.md
│   └── quicklab.xlsx
├── Connect to Cloud SQL from an Application in Google Kubernetes Engine
│   ├── GSP449.md
│   ├── automate_create.exp
│   └── quicklabgsp449.sh
├── Connecting Cloud Functions
│   └── quicklab.sh
├── Continuous Delivery Pipelines with Spinnaker and Kubernetes Engine
│   └── quicklabgsp114.sh
├── Continuous Delivery with Google Cloud Deploy
│   ├── GSP1079.md
│   └── quicklabgsp1079.sh
├── Continuous Delivery with Jenkins in Kubernetes Engine
│   ├── GSP051.md
│   └── quicklabgsp051.sh
├── Controlling Access to VPC Networks
│   ├── Quicklab.md
│   └── quicklab.sh
├── Conversational Agents Managing Environments
│   └── quicklab.blob
├── Cost Optimization and Data Tiering with BigLake and Cloud Storage
│   ├── Quicklab.md
│   └── quicklabgsp267.sh
├── Create Analyze BigQuery data in Connected Sheets: Challenge Lab
├── Create ML Models with BigQuery ML Challenge Lab
├── Create Synthetic Speech Using quicklab
│   └── quicklabgsp222.sh
├── Create VM template and Automate deployment
├── Create a Custom Network and Apply Firewall Rules
│   ├── GSP159.md
│   └── quicklabgsp159.sh
├── Create a RAG Application with BigQuery
│   ├── GSP1289.md
│   └── quicklabgsp1289.sh
├── Create a Secure Data Lake on Cloud Storage Challenge Lab
│   ├── ARC119.md
│   ├── ARC119form4.md
│   ├── ARC199Form1&2.md
│   ├── ARC199form3.md
│   ├── quicklab.sh
│   ├── quicklabarc119.sh
│   └── quicklabform3.sh
├── Create a Secure Data Lake on Cloud Storage: Challenge Lab
├── Create a Streaming Data Lake on Cloud Storage Challenge Lab
│   ├── ARC110.md
│   ├── quicklab.sh
│   └── quicklabarc110.sh
├── Create a VPC using Cloud Shell
│   ├── Quicklab.md
│   └── quicklab.sh
├── Create an Internal Load Balancer
├── Create and Manage AlloyDB Instances Challenge Lab
│   ├── GSP395.md
│   └── quicklabgsp395.sh
├── Create and Manage Cloud Resources: Challenge Lab
├── Create and Manage Cloud Spanner Databases Challenge Lab
│   ├── Customer_List_500.csv
│   └── quicklabgsp381.sh
├── Create and Test a Document AI Processor
│   └── quicklabgsp924.sh
├── Create synthetic speech from text using the Text to Speech API
│   └── quicklabgsp074.sh
├── Create, Modify and Remove Files and Folders in Linux
├── Creating Cloud SQL Databases
│   └── quicklab.sh
├── Creating Cross region Load Balancing
│   ├── GSP157.md
│   └── quicklabgsp157.sh
├── Creating Databases in Compute Engine
│   └── quicklab.sh
├── Creating Date Partitioned Tables in BigQuery
│   └── quicklabgsp414.sh
├── Creating Derived Tables Using LookML
│   └── GSP858.md
├── Creating Dynamic Secrets for Google Cloud with Vault
│   ├── GSP1007.md
│   └── quicklabgsp1007.sh
├── Creating Google Kubernetes Engine Deployments AWS
│   ├── Quicklab.md
│   └── quicklab.sh
├── Creating Google Kubernetes Engine Deployments Azure
│   └── quicklab.sh
├── Creating Google Kubernetes Engine Deployments
│   └── quicklab.sh
├── Creating Measures and Dimensions Using LookML
│   └── GSP890.md
├── Creating PDFs with Go and Cloud Run
│   ├── quicklabgsp762.sh
│   └── server.go
├── Creating Permanent Tables and AccessControlled Views in BigQuery
│   └── quicklabgsp410.sh
├── Creating Resource Dependencies with Terraform
│   └── quicklab.sh
├── Creating Services and Ingress Resources
│   └── quicklab.sh
├── Creating a Containerized Application with Buildpacks
│   ├── Quicklab.md
│   └── quicklab.sh
├── Creating a Data Warehouse Through Joins and Unions
│   └── quicklabgsp413.sh
├── Creating a Looker Modeled Query and Working with Quick Start
├── Creating a Persistent Disk
│   └── quicklabgsp004.sh
├── Creating a Real-time Data Pipeline using Eventarc and MongoDB Atla
├── Creating a Remote Backend
│   └── quicklab.sh
├── Creating a Streaming Data Pipeline for a Real-Time Dashboard with Dataflow
│   ├── index.js
│   └── quicklabgsp644.sh
├── Creating a VPC Network with Gemini
│   └── quicklab.sh
├── Creating a Virtual Machine
│   └── quicklabgsp001.sh
├── Creating and Alerting on Logs based Metrics
│   └── quicklabgsp091.sh
├── Creating and Populating a Bigtable Instance
│   └── quicklabgsp1054.sh
├── Creating dynamic SQL derived tables with LookML and Liquid
│   └── GSP932.md
├── Dart Functions Framework
│   └── quicklabgsp889.sh
├── Data Analytics SME Academy Loading Data into Google Cloud SQL
│   └── quicklabgsp196.sh
├── Data Catalog Qwik Start
│   ├── GSP729.md
│   └── quicklabgsp729.sh
├── Dataflow Qwik Start Python
│   └── quicklabgsp207.sh
├── Dataflow Qwik Start Templates
│   └── quicklabgsp192.sh
├── Dataplex Qwik Start Command Line
│   └── quicklabgsp1144.sh
├── Dataplex Qwik Start Console
│   ├── Dataplex Qwik Start Console.md
│   ├── quicklabtask1.sh
│   └── quicklabtask2.sh
├── Dataproc Qwik Start Command Line
│   ├── GSP104.md
│   └── quicklabgsp104.sh
├── Dataproc Qwik Start Console
│   └── quicklabgsp103.sh
├── De-identifying DICOM Data with the Healthcare API
├── Debugging Apps on Google Kubernetes Engine new
│   └── quicklabgsp736.sh
├── Debugging Apps on Google Kubernetes Engine.md
├── Defending Edge Cache with Cloud Armor
│   ├── Defending Edge Cache with Cloud Armor.md
│   ├── quicklabtask1.sh
│   └── quicklabtask2.sh
├── Deidentify DICOM Data with the Healthcare API
│   ├── GSP626.md
│   └── quicklabgsp626.sh
├── Deidentifying DICOM Data with the Healthcare API
│   └── quicklabgsp626.sh
├── Deploy Go Apps on Google Cloud Serverless Platforms
│   ├── GSP702.md
│   └── quicklabgsp702.sh
├── Deploy Google Cloud Framework Data Foundation for SAP: Challenge Lab
├── Deploy Kubernetes Applications on Google Cloud Challenge Lab
│   └── quicklabgsp318.sh
├── Deploy Kubernetes Load Balancer Service with Terraform
│   └── quicklabgsp233.sh
├── Deploy Microsoft SQL Server to Compute Engine
│   ├── GSP031.md
│   └── quicklabgsp031.sh
├── Deploy Scale and Update Your Website on Google Kubernetes Engine
│   └── quicklabgsp663.sh
├── Deploy Your Website on Cloud Run
│   └── quicklabgsp659.sh
├── Deploy a Compute Instance with a Remote Startup Script Challenge Lab
│   ├── GSP301.md
│   ├── quicklabgsp301.sh
│   └── resources-install-web.sh
├── Deploy a Generative AI solution using a RAG Framework to Google Cloud Challenge Lab L400
│   ├── main.py
│   ├── quicklab.sh
│   └── requirements.txt
├── Deploy a Hugo Website with Cloud Build and Firebase Pipeline
│   ├── GSP747.md
│   ├── quicklab.md
│   ├── quicklabtask1.sh
│   ├── quicklabtask2.sh
│   ├── task1.sh
│   ├── task2.sh
│   └── task3.sh
├── Deploy a Modern Web App connected to a Cloud Spanner Instance
│   ├── GSP1051.md
│   └── quicklabgsp1051.sh
├── Deploy a Streamlit App Integrated with Gemini Pro on Cloud Run
│   └── quicklabgsp1229.sh
├── Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab
├── Deploy and Test a Visual Inspection AI Component Anomaly Detection Solution
│   ├── GSP896.md
│   └── quicklabgsp896.sh
├── Deploy and Test a Visual Inspection AI Cosmetic Anomaly Detection Solution
│   └── quicklabgsp898.sh
├── Deploy and Troubleshoot a Website Challenge Lab
│   ├── GSP101.md
│   └── quicklabgsp101.sh
├── Deploy to Kubernetes in Google Cloud: Challenge Lab
├── Deploying Apps to Google Cloud
│   └── quicklab.sh
├── Deploying GKE Autopilot Clusters from Cloud Shell
│   └── quicklab.sh
├── Deploying GKE Autopilot Clusters
│   ├── Quicklab.md
│   └── quicklab.sh
├── Deploying Google Kubernetes Engine AWS
│   ├── Quicklab.md
│   └── quicklab.sh
├── Deploying Google Kubernetes Engine
│   └── quicklab.sh
├── Deploying Redis Enterprise for GKE and Serverless App on Anthos Bare Metal
│   └── quicklabgsp938.sh
├── Deploying a Containerized Application on Cloud Run
│   ├── Quicklab.md
│   └── quicklab.sh
├── Deploying a FaultTolerant Microsoft Active Directory Environment
│   └── quicklabgsp118.sh
├── Deploying a Python Flask Web Application to App Engine Flexible
│   └── quicklabgsp023.sh
├── Deploying the Oracle on Bare Metal Solution
│   └── quicklab.sh
├── Derive Insights from BigQuery Data: Challenge Lab
│   ├── GSP787.md
│   ├── lab.md
│   └── quicklabgsp787.sh
├── Designing and Querying Bigtable Schemas
│   └── quicklabgsp1053.sh
├── Detect Labels Faces and Landmarks in Images with the Cloud Vision API
│   ├── city.png
│   ├── donuts.png
│   ├── quicklabgsp037.sh
│   └── selfie.png
├── Detect Manufacturing Defects using Visual Inspection AI Challenge Lab
│   ├── GSP366.md
│   └── quicklabgsp366.sh
├── Detect Manufacturing Defects using Visual Inspection AI: Challenge Lab
├── Detect and Investigate Threats with Security Command Center.md
├── Detect and Investigate Threats with Security Command Center
│   ├── GSP1125.md
│   └── quicklabgsp1125.sh
├── Develop GenAI Apps with Gemini and Google Maps
│   └── agent-v1.0.0.ipynb
├── Develop GenAI Apps with Gemini and Streamlit Challenge Lab
│   ├── Dockerfile.txt
│   ├── GSP517.md
│   ├── bonustask.sh
│   ├── chef.py
│   ├── prompt.ipynb
│   ├── requirements.txt
│   ├── task1.sh
│   └── task2.sh
├── Develop No-Code Chat Apps with AppSheet
│   └── quicklabs.xlsx
├── Develop Serverless Applications on Cloud Run Challenge Lab
│   ├── quicklab.md
│   └── quicklabgsp328.sh
├── Develop an App with Vertex AI Gemini 10 Pro
│   ├── Quicklab.md
│   ├── gemini-app
│   │   ├── Dockerfile.txt
│   │   ├── app.py
│   │   ├── app_tab1.py
│   │   ├── app_tab2.py
│   │   ├── app_tab3.py
│   │   ├── app_tab4.py
│   │   ├── requirements.txt
│   │   └── response_utils.py
│   └── quicklab.sh
├── Develop an app with Gemini
│   ├── Develop an app with Gemini.md
│   ├── app.py
│   ├── inventory.py
│   └── quicklab.sh
├── Develop and Deploy Cloud Run Functions
│   ├── Quicklab.md
│   └── quicklab.sh
├── Develop with Apps Script and AppSheet Challenge Lab
│   ├── ARC126.md
│   └── quicklabarc126.xlsx
├── Develop your Google Cloud Network Challenge Lab
│   ├── GSP321.md
│   └── quicklabgsp321.sh
├── Developing a REST API with Go and Cloud Run
│   └── quicklabgsp761.sh
├── Developing and Deploying Cloud Functions
│   └── quicklab.sh
├── Dialogflow CX Managing Environments
│   ├── quicklab.blob
│   └── quicklab.md
├── Dialogflow CX with Generative Fallbacks
│   └── divebooker-agent.zip
├── Dialogflow Logging and Monitoring in Operations Suite
│   ├── GSP794.md
│   ├── pigeon-travel-gsp-794-cloud-function.zip
│   └── quicklab794.zip
├── Discover and Protect Sensitive Data Across Your Ecosystem Challenge Lab
│   ├── GSP522.md
│   ├── deidentify-model-response-challenge-lab-v1.0.0.ipynb
│   └── quicklabgsp522.sh
├── Distributed Load Testing Using Kubernetes
│   ├── GSP182.md
│   └── quicklabgsp182.sh
├── ETL Processing on Google Cloud Using Dataflow and BigQuery Python
│   ├── GSP290.md
│   └── quicklabgsp290.sh
├── ETL Processing on Google Cloud Using Dataflow and BigQuery.md
├── Embedding Maps in Looker BI.md
├── Employing Best Practices for Improving the Usability of LookML Projects.md
├── Enabling Sensitive Data Protection Discovery for Cloud Storage
│   ├── GSP1281.md
│   └── quicklabgsp1281.sh
├── Engineer Data in Google Cloud Challenge Lab.md
├── Enhancing User Interactivity in Looker with Liquid
├── Ensure Access & Identity in Google Cloud Challenge Lab
│   └── quicklabgsp342.sh
├── Entity and Sentiment Analysis with the Natural Language API
├── Eventarc for Cloud Run
├── Exemplar: Add and manage users with Linux commands
├── Exemplar: Manage authorization
├── Exemplar: Manage files with Linux commands
├── Explore Generative AI with the Vertex AI Gemini API Challenge Lab
│   ├── gemini-explorer-challenge-v1.0.0.ipynb
│   ├── gemini-explorer-challenge-v2.0.0.ipynb
│   └── gemini-explorer-challenge.ipynb
├── Explore and Evaluate Models using Model Garden
│   └── quicklabgsp1166.sh
├── Explore false positives through incident detection
│   ├── Quicklab.md
│   └── quicklab.sh
├── Exploring Cost-optimization for GKE Virtual Machines
│   ├── GSP767.md
│   └── quicklabgsp767.sh
├── Exploring Data with Looker: Challenge Lab
├── Exploring Dataset Metadata Between Projects with Data Catalog
│   ├── GSP789.md
│   └── quicklabgsp789.sh
├── Exploring IAM
│   ├── quicklab.sh
│   └── sample.txt
├── Exploring Your Ecommerce Dataset with SQL in Google BigQuery
│   └── quicklabgsp407.sh
├── Extract Analyze and Translate Text from Images with the Cloud ML APIs
│   ├── quicklabgsp075.sh
│   └── sign.jpg
├── Finish a Puppet deployment
├── Fix errors with a crashing script
├── Fix errors with a crashing script.md
├── Fraud Detection on Financial Transactions with Machine Learning on Google Cloud
│   ├── GSP774.md
│   └── quicklabgsp774.sh
├── Function Calling with Firebase Genkit
│   ├── Quicklab.md
│   └── quicklab.sh
├── Fundamentals of Cloud Logging
│   ├── GSP610.md
│   └── quicklabgsp610.sh
├── GKE Workload Optimization.md
├── Gating Deployments with Binary Authorization
│   └── quicklabgsp1183.sh
├── Gemini for Data Scientists
│   └── quicklab.ipynb
├── Generative AI Search Challenge Lab
│   └── quicklab.ipynb
├── Generative AI with Vertex AI Getting Started
│   ├── intro_palm_api.ipynb
│   └── intro_prompt_design.ipynb
├── Generative AI with Vertex AI Prompt Design
│   └── intro_prompt_design.ipynb
├── Generative AI with Vertex AI Text Prompt Design Challenge Lab
│   └── text_prompt_design_challenge_lab-v1.0.0.ipynb
├── Get Started with Cloud Storage Challenge Lab
│   ├── ARC111.md
│   └── quicklabarc111.sh
├── Get Started with Eventarc: Challenge Lab
├── Get Started with Looker: Challenge Lab
├── Get Started with PubSub Challenge Lab
│   ├── ARC113.md
│   ├── newquicklabarc113.sh
│   └── quicklabarc113.sh
├── Get Started with PubSub: Challenge Lab
├── Get Started with Sensitive Data Protection Challenge Lab
│   ├── ARC116.md
│   └── quicklabarc116.sh
├── Get Started with Vertex AI Studio
│   ├── GSP1154.md
│   ├── quicklabtask1.json
│   ├── quicklabtask2.json
│   ├── quicklabtask3.json
│   └── quicklabtask4.json
├── Getting Started with API Gateway Challenge Lab
│   └── quicklabarc109.sh
├── Getting Started with BigQuery GIS for Data Analysts
│   ├── GSP866.md
│   └── quicklabgsp866.sh
├── Getting Started with BigQuery Machine Learning
│   └── quicklabgsp247.sh
├── Getting Started with Cloud IDS
├── Getting Started with Cloud KMS
│   └── quicklabgsp079.sh
├── Getting Started with Cloud Shell and gcloud
│   ├── GSP002.md
│   └── quicklabgsp002.sh
├── Getting Started with Firebase Genkit
│   ├── Quicklab.md
│   ├── index.ts
│   ├── task1.sh
│   └── task2.sh
├── Getting Started with Liquid to Customize the Looker User Experience
├── Getting Started with MongoDB Atlas on Google Cloud
├── Getting Started with Security Command Center
│   └── quicklabgsp1124.sh
├── Getting Started with Splunk Cloud GDI on Google Cloud
├── Getting Started with VPC Networking and Google Compute Engine Windows
│   └── quicklab.sh
├── Getting started with Flutter Development
│   └── quicklab.md
├── Git Merges
├── Google Chat Bot - Apps Script.md
├── Google Chat Bot - Apps Script
│   └── GSP250.md
├── Google Cloud Fundamentals Getting Started with App Engine
│   ├── Quicklab.md
│   └── quicklab.sh
├── Google Cloud Fundamentals Getting Started with Cloud Storage and Cloud SQL
│   └── quicklab.sh
├── Google Cloud Fundamentals Getting Started with GKE
│   ├── Quicklab.md
│   └── quicklab.sh
├── Google Cloud Fundamentals: Getting Started with BigQuery
│   ├── Quicklab.md
│   └── quicklab.sh
├── Google Cloud Packet Mirroring with OpenSource IDS
│   └── quicklabgsp474.sh
├── Google Cloud SDK Qwik Start Redha tCentos
│   └── quicklabgsp122.sh
├── Google Cloud Speech to Text API Qwik Start
│   └── quicklabgsp119.sh
├── Google Cloud Storage Bucket Lock
│   ├── Google Cloud Storage - Bucket Lock.md
│   └── quicklabgsp297.sh
├── Google Kubernetes Engine Pipeline using Cloud Build
│   ├── GSP1077.md
│   ├── app-cloudbuild.yaml
│   ├── env-cloudbuild.yaml
│   └── quicklabgsp1077.sh
├── Google Kubernetes Engine Qwik Start
│   └── quicklabgsp100.sh
├── Google Kubernetes Engine Security Binary Authorization
│   └── quicklabgsp479.sh
├── Google Kubernetes Engine: Qwik Start ACE
│   ├── Quicklab.md
│   └── quicklab.sh
├── HTTP Google Cloud Functions in Go
│   ├── GSP602.md
│   └── quicklabgsp602.sh
├── HTTP Load Balancer with Cloud Armor
├── HTTP Load Balancer with Cloud Armor AUGUST
│   └── quicklabgsp215.sh
├── HTTPS Content-Based Load Balancer with Terraform
│   ├── GSP206.md
│   ├── main.tf
│   └── quicklabgsp206.sh
├── Hello Node Kubernetes
│   └── quicklabgsp005.sh
├── Hosting a Web App on Google Cloud Using Compute Engine
├── Hosting a Web App on Google Cloud Using Compute Engine AWS
│   ├── task1.sh
│   └── task2.sh
├── Hosting a Web App on Google Cloud Using Compute Engine Azure
│   ├── task1.sh
│   └── task2.sh
├── Hosting a Web App on Google Cloud Using Compute Engine updated
│   └── quicklabgsp662
│       ├── task1.sh
│       └── task2.sh
├── How to Use a Network Policy on Google Kubernetes Engine
│   ├── GSP480.md
│   └── quicklabgsp480.sh
├── IAM Custom Roles
│   └── quicklabgsp190.sh
├── IAP-secured Web App User
│   ├── GSP1033.md
│   └── quicklabgsp1033.sh
├── Identify Application Vulnerabilities with Security Command Center.md
├── Identify Damaged Car Parts with Vertex AutoML Vision
│   └── payload.json
├── Identify Horses or Humans with TensorFlow and Vertex AI
│   └── CLS_Vertex_AI_CNN_horse_or_human-v1.0.0.ipynb
├── Identifying Damaged Car Parts with Vertex AutoML Vision
│   └── quicklab.sh
├── Implement Cloud Security Fundamentals on Google Cloud Challenge Lab
│   ├── GSP342.md
│   └── quicklabgsp342.sh
├── Implement DevOps Workflows in Google Cloud Challenge Lab
│   └── quicklabgsp330.sh
├── Implement DevOps in Google Cloud: Challenge Lab
├── Implement Load Balancing on Compute Engine Challenge Lab
│   ├── quicklab.md
│   └── quicklabgsp313.sh
├── Implement Private Google Access and Cloud NAT Azure
│   ├── Quicklab.md
│   └── quicklab.sh
├── Implement Private Google Access and Cloud NAT
│   └── quicklab.sh
├── Implement Unit Testing.md
├── Implement continuous delivery with Gemini
│   └── quicklab.sh
├── Implement the User Experience for your AppSheet App
│   └── GSP1029.md
├── Implementing Canary Releases of TensorFlow Model Deployments with Kubernetes and Anthos Service Mesh
│   └── quicklabgsp778.sh
├── Implementing Cloud SQL AWS
│   └── Quicklab.md
├── Implementing Cloud SQL
│   └── quicklab.sh
├── Implementing Least Privilege IAM Policy Bindings in Cloud Run APPRUN
│   └── quicklab.sh
├── Implementing Role-Based Access Control with Google Kubernetes Engine
├── Importing Data to a Firestore Database
│   ├── createTestData.js
│   ├── importTestData.js
│   └── quicklabgsp642.sh
├── Infrastructure as Code with Terraform
│   └── quicklabgsp750.sh
├── Ingesting DICOM Data with the Healthcare API
│   └── quicklabgsp615.sh
├── Ingesting FHIR Data with the Healthcare API
│   ├── Ingesting FHIR Data with the Healthcare API.md
│   └── quicklabgsp457.sh
├── Ingesting HL7v2 Data with the Healthcare API
│   └── quicklabgsp628.sh
├── Ingesting New Datasets into BigQuery
│   ├── products.csv
│   ├── quicklab.csv
│   └── quicklabgsp411.sh
├── Insights from Data with BigQuery: Challenge Lab
├── Inspect Rich Documents with Gemini Multimodality and Multimodal RAG Challenge Lab
│   └── inspect_rich_documents_w_gemini_multimodality_and_multimodal_rag-v1.0.0.ipynb
├── Installing Updating and Removing Software in Linux
│   └── quicklab.sh
├── Installing, Updating, and Removing Software in Windows
├── Integrate BigQuery Data and Google Workspace using Apps Script: Challenge Lab
├── Integrate with Machine Learning APIs: Challenge Lab
├── Integrating Cloud Functions with Firestore
│   ├── Integrating Cloud Functions with Firestore.md
│   ├── quicklab.sh
│   ├── quicklabtask3.sh
│   ├── quicklabtask4.sh
│   └── quicklabtask5.sh
├── Interact with Terraform Modules
│   └── quicklabgsp751.sh
├── Interconnecting Networks Challenge Lab
│   ├── Quicklab.md
│   └── quicklab.sh
├── Internal Load Balancer
│   └── quicklabgsp041.sh
├── Introduction to APIs in Google Cloud
│   └── quicklabgsp294.sh
├── Introduction to BigQuery SQL translation
│   ├── Quicklab.md
│   └── quicklab.sh
├── Introduction to Cloud Bigtable Java
│   └── quicklabgsp1038.sh
├── Introduction to Cloud Dataproc Hadoop and Spark on Google Cloud
│   ├── GSP123.md
│   └── quicklabgsp123.sh
├── Introduction to Computer Vision with TensorFlow
│   ├── callback_model.ipynb
│   ├── model.ipynb
│   └── updated_model.ipynb
├── Introduction to Computer Vision with TensorFlow 2024
│   ├── callback_model.py
│   ├── model.py
│   ├── quicklabgsp631.sh
│   ├── update_model_4.py
│   ├── updated_model_1.py
│   ├── updated_model_2.py
│   ├── updated_model_3.py
│   └── vertex-lab
│       ├── GSP631.md
│       ├── callback_model.ipynb
│       ├── model.ipynb
│       └── updated_model.ipynb
├── Introduction to Docker
│   └── quicklabgsp055.sh
├── Introduction to Migration Center Assessments
├── Introduction to SQL for BigQuery and Cloud SQL
│   ├── end_station_name.csv
│   ├── quicklabgsp281.sh
│   └── start_station_name.csv
├── July updated Create an Internal Load Balancer
│   └── quicklabgsp216.sh
├── Kubernetes Basics v16
│   ├── Quicklab.md
│   └── quicklab.sh
├── Kubernetes Engine Qwik Start
│   └── quicklabgsp100.sh
├── Load Balancing and Auto scaling Challenge Lab
│   ├── Quicklab.md
│   └── quicklab.sh
├── Loading Data into Google Cloud SQL
│   └── quicklabgsp196.sh
├── Loading Taxi Data into Google Cloud SQL
│   ├── Quiclab.md
│   └── quicklab.sh
├── Loading Your Own Data into BigQuery
│   ├── GSP865.md
│   ├── nyc_tlc_yellow_trips_2018_subset_1.csv
│   └── quicklabgsp865.sh
├── Log Analysis Using Regular Expressions
├── Log Analytics on Google Cloud
│   ├── GSP1088.md
│   └── quicklabgsp1088.sh
├── Looker Data Explorer - Qwik Start
├── Looker Developer - Qwik Start
├── Looker Developer - Qwik Start.md
├── Looker Functions and Operators
├── MS Cloud Foundation Toolkit CFT Training Getting Started
│   └── quicklabgsp798.sh
├── Machine Learning with TensorFlow in Vertex AI
│   └── quicklab.ipynb
├── Manage Bigtable on Google Cloud Challenge Lab
│   ├── quicklab.md
│   └── quicklabgsp380.sh
├── Manage Data Models in Looker Challenge Lab.md
├── Manage Kubernetes in Google Cloud Challenge Lab.md
├── Manage Kubernetes in Google Cloud: Challenge Lab
├── Manage PostgreSQL Databases on Cloud SQL Challenge Lab.md
├── Manage PostgreSQL Databases on Cloud SQL: Challenge Lab
├── Managed Service for Microsoft Active Directory
│   └── quicklabgsp847.sh
├── Managing Deployments Using Kubernetes Engine
│   └── quicklabgsp053.sh
├── Managing Terraform State
│   └── quicklabgsp752.sh
├── Managing Vault Tokens
│   └── quicklabgsp1006.sh
├── Managing a GKE Multi-tenant Cluster with Namespaces
│   └── quicklabgsp766.sh
├── Material Components for Flutter Basics.md
├── Measuring and Improving Speech Accuracy
│   └── quicklabgsp758.sh
├── Migrate MySQL Data to Cloud SQL using Database Migration Service Challenge Lab
│   └── GSP351.md
├── Migrate a MySQL Database to Google Cloud SQL.md
├── Modular Load Balancing with Terraform Regional Load Balancer
│   ├── GSP191.md
│   └── quicklabgsp191.sh
├── Modularizing LookML Code with Extends
├── MongoDB Atlas with Natural Language API and Cloud Run
├── Monitor an Apache Web Server using Ops Agent
│   ├── GSP1108.md
│   └── quicklabgsp1108.sh
├── Monitor and Log with Google Cloud Operations Suite Challenge Lab
│   ├── quicklab.md
│   └──
quicklabgsp338.sh ├── Monitor and Manage Google Cloud Resources Challenge Lab ├── ARC101.md └── quicklabarc101.sh ├── Monitoring Applications in Google Cloud └── quicklab.sh ├── Monitoring a Compute Engine using Ops Agent AWS └── quicklab.sh ├── Monitoring a Compute Engine using Ops Agent Azure └── quicklab.sh ├── Monitoring and Logging for Cloud Functions └── quicklabgsp092.sh ├── Monitoring and Logging for Cloud Run Functions ├── GSP092.md └── quicklabgsp092.sh ├── Monitoring in Google Cloud Challenge Lab ├── ARC115.md └── quicklabarc115.sh ├── Monitoring in Google Cloud: Challenge Lab ├── Movie Recommendations in BigQuery ML 25 └── quicklab.sh ├── Multiple VPC Networks lab └── quicklabgsp315.sh ├── Multiple VPC Networks └── quicklabgsp211.sh ├── NEW Manage Kubernetes in Google Cloud Challenge Lab ├── quicklab.md └── quicklabgsp510.sh ├── Navigate BigQuery └── quicklab.sh ├── Navigate Dataplex ├── Quicklab.md └── quicklab.sh ├── Navigate Security Decisions with Duet AI └── quicklab.sh ├── Network Tiers Optimizing Network Spend └── quicklabgsp219.sh ├── Networking 101 ├── GSP016.md ├── quicklab016.sh └── spequicklabgsp016.sh ├── Networking Fundamentals on Google Cloud: Challenge Lab ├── ONDEMAND Cloud Monitoring Qwik Start ├── Quicklab.md └── quicklab.sh ├── ONDEMAND Cloud Run Functions Qwik Start Command Line ├── Quicklab.md └── quicklab.sh ├── Offloading Financial Mainframe Data into BigQuery and Elastic Search ├── GSP1153.md └── quicklabgsp1153.sh ├── Online Data Migration to BigQuery using Striim ├── Optical Character Recognition OCR with Document AI Python └── quicklabgsp1138.sh ├── Optimize Costs for Google Kubernetes Engine: Challenge Lab ├── Optimizing Cost with Google Cloud Storage ├── GSP649.md └── quicklabgsp649.sh ├── Optimizing Performance of LookML Queries ├── Optimizing your BigQuery Queries for Performance 25 └── quicklab.sh ├── Orchestrating LLMs with LangChain Challenge Lab ├── orchestration_llms_with_langchain_challenge_notebook-v1.0.0.ipynb └── 
orchestration_llms_with_langchain_challenge_notebook.ipynb ├── Orchestrating the Cloud with Kubernetes AWS └── quicklabgsp1120.sh ├── Orchestrating the Cloud with Kubernetes Azure └── quicklabgsp1121.sh ├── Orchestrating the Cloud with Kubernetes └── quicklabgsp021.sh ├── Partitioning and formatting a Disk Drive in Linux ├── Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab ├── Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab [APRIL 2023] ├── Perform Foundational Infrastructure Tasks in Google Cloud Challenge Lab └── quicklabgsp315.sh ├── Perform Predictive Data Analysis in BigQuery Challenge Lab .md ├── Pre-Assessment GCP Labs.md ├── Predict Bike Trip Duration with a Regression Model in BQML 25 └── quicklab.sh ├── Predict Soccer Match Outcomes with BigQuery ML Challenge Lab ├── GSP374.md └── quicklabgsp374.sh ├── Predict Taxi Fare with a BigQuery ML Forecasting Model └── quicklabgsp246.sh ├── Predict Visitor Purchases with a Classification Model in BigQuery ML ├── GSP229.md └── quicklabgsp229.sh ├── Prepare Data for ML APIs on Google Cloud Challenge Lab ├── GSP323.md ├── quicklabgsp323.sh ├── quicklabtask1.sh └── quicklabtask2.sh ├── Process Documents with Python Using the Document AI API ├── Process Documents with Python Using the Document AI API.md ├── quickklabgsp925.sh └── quicklab.sh ├── Prompt Design in Vertex AI Challenge Lab ├── Cymbal Product Analysis.json ├── Cymbal Tagline Generator Template.json ├── GSP519.md ├── image-analysis.ipynb └── tagline-generator.ipynb ├── Protect Cloud Traffic with BeyondCorp Enterprise (BCE) Security: Challenge Lab.md ├── Protecting Sensitive Data in Gen AI Model Responses ├── deidentify-model-response-v1.0.0.ipynb └── test.sh ├── Provision Cloud Infrastructure with Duet AI └── quicklab.sh ├── Provision Cloud Infrastructure with Gemini ├── Quicklab.md └── quicklab.sh ├── PubSub Lite Qwik Start └── quicklabgsp832.sh ├── PubSub Qwik Start Python └── quicklabgsp094.sh ├── Rate 
Limiting with Cloud Armor ├── Rate Limiting with Cloud Armor new ├── GSP975.md └── quicklabgsp975.sh ├── Redacting Critical Data with Sensitive Data Protection ├── GSP864.md └── quicklabgsp864.sh ├── Reduce Costs for the Managed Service for Prometheus └── quicklabgsp1027.sh ├── Rent a VM to Process Earthquake Data ├── quicklab.sh └── quicklabgsp008.sh ├── Resource Monitoring ├── Quicklab.md └── quicklab.sh ├── Responding to Cloud Logging Messages with Cloud Functions ├── Running Databases in GKE ├── Quicklab.md └── quicklab.sh ├── Running Pipelines on Vertex AI 25 └── quicklab.sh ├── Running Windows Containers on Compute Engine ├── GSP153.md └── quicklabgsp153.bat ├── Running a Containerized App on Google Kubernetes Engine └── quicklabgsp015.sh ├── Running a Dedicated Ethereum RPC Node in Google Cloud ├── GSP1116.md └── quicklabgsp1116.sh ├── Running a MongoDB Database in Kubernetes with StatefulSets ├── GSP022.md ├── mongo-statefulset.yaml └── quicklabgsp022.sh ├── SAP Landing Zone: Add and Configure Storage to SAP VMs ├── Scale Out and Update a Containerized Application on a Kubernetes Cluster └── quicklabgsp305.sh ├── Scaling Microservices Applications: Migration to Redis Enterprise on Google Cloud ├── GSP1177.md └── quicklabgsp1177.sh ├── Scanning User generated Content Using the Cloud Video Intelligence and Cloud Vision APIs └── quicklabgsp138.sh ├── Secure Builds with Cloud Build ├── GSP1184.md └── quicklabgsp1184.sh ├── Secure Software Supply Chain Using Cloud Build & Cloud Deploy to Deploy Containerized Applications ├── GSP1092.md └── quicklabgsp1092.sh ├── Securing Cloud Applications with Identity Aware Proxy IAP using ZeroTrust ├── quicklabgsp946.sh ├── quicklabtask1.sh └── quicklabtask2.sh ├── Securing Compute Engine Applications and Resources using BeyondCorp Enterprise (BCE) ├── Securing Compute Engine Applications and Resources using BeyondCorp Enterprise BCE ├── Securing Compute Engine Applications with BeyondCorp Enterprise └── quicklab.sh ├── 
Securing Container Builds ├── GSP1185.md ├── pom.xml └── quicklabgsp1185.sh ├── Securing Google Kubernetes Engine with Cloud IAM and Pod Security Admission.md ├── Securing Virtual Machines using BeyondCorp Enterprise BCE.md ├── September Create VM template and Automate deployment └── quicklab.sh ├── Serverless Data Analysis with Dataflow Side Inputs java └── quicklab.sh ├── Serverless Data Processing with Dataflow - CI CD with Dataflow ├── Serverless Data Processing with Dataflow Advanced Streaming Analytics Pipeline with Cloud Dataflow Java └── quicklab.sh ├── Serverless Data Processing with Dataflow Monitoring Logging and Error Reporting for Dataflow Jobs └── quicklab.sh ├── Serverless Data Processing with Dataflow Writing an ETL Pipeline using Apache Beam and Dataflow Python ├── Quicklab.md ├── my_pipeline.py └── quicklab.sh ├── Serverless Firebase Development: Challenge Lab ├── Service Accounts and Roles Fundamentals ├── GSP199.md └── quicklabgsp199.sh ├── Service Directory Qwik Start └── quicklabgsp732.sh ├── Service Monitoring ├── Quicklab.md └── quicklab.sh ├── Set Up Network and HTTP Load Balancers └── quicklabgsp007.sh ├── Set Up a Google Cloud Network: Challenge Lab ├── GSP314.md ├── quicklab.md ├── quicklabgsp314.sh ├── quicklabip.sh ├── quicklabtask1.sh ├── quicklabtask2.sh ├── quicklabtask3.sh └── quicklabtask4.sh ├── Set Up a Query based Alert By Using MQL Qwik Start └── quicklabgsp1109.sh ├── Set Up an App Dev Environment on Google Cloud Challenge Lab ├── GSP315.md └── quicklabgsp315.sh ├── Set Up and Configure a Cloud Environment in Google Cloud Azure Challenge Lab └── quicklabgsp512.sh ├── Set Up and Configure a Cloud Environment in Google Cloud AWS Challenge Lab └── quicklabgsp511.sh ├── Set Up and Configure a Cloud Environment in Google Cloud Challenge Lab └── quicklabgsp321.sh ├── Setting Up Cost Control with Quota └── quicklabgsp651.sh ├── Setting Up Network and HTTP Load Balancers ACE ├── Quicklab.md └── quicklab.sh ├── Setting up Jenkins on 
Kubernetes Engine └── quicklabgsp117.sh ├── Simplify Network Connectivity for AlloyDB for PostgreSQL Challenge Lab ├── GCC040.md └── quicklabgcc040.sh ├── SingleStore on Google Cloud ├── GSP1096.md └── quicklabgsp1096.sh ├── Speaking with a Webpage Streaming Speech Transcripts ├── task1.sh └── task2.sh ├── Speech to Text Transcription with the Cloud Speech API ├── GSP048.md ├── task1.sh └── task2.sh ├── Store Process and Manage Data on Google Cloud Command Line Challenge Lab ├── index.js └── package.json ├── Store Process and Manage Data on Google Cloud Challenge Lab ├── ARC100.md ├── index.js ├── package.json └── quicklabarc100.sh ├── Store Process and Manage Data on Google Cloud Command Line Challenge Lab ├── ARC102.md └── quicklabarc102.sh ├── Store, Process, and Manage Data on Google Cloud: Challenge Lab.md ├── Stream Processing with Cloud Pub:Sub and Dataflow Qwik Start ├── quicklabgsp903.sh └── quicklabpythongsp903.sh ├── Streaming Analytics into BigQuery Challenge Lab └── quicklabarc106.sh ├── Streaming Analytics into BigQuery: Challenge Lab ├── Streaming Data Processing Streaming Data Pipelines into Bigtable.md ├── Streaming Data to Bigtable ├── Streaming Data to Bigtable.md ├── task1.sh ├── task2.sh └── task3.sh ├── Streaming HL7 to FHIR Data with Dataflow and the Healthcare API └── quicklabgsp894.sh ├── Summarize Text using SQL and LLMs in BigQuery ML └── quicklabgsp835.sh ├── TFX on Cloud AI Platform Pipelines └── lab-02.ipynb ├── TOC Challenge Lab Configuring Google Cloud Networking └── quicklab.sh ├── TOC Challenge Lab Configuring VPN Connectivity between Google Cloud and AWS └── quicklab.sh ├── TOC Challenge Lab Logging and Monitoring.md ├── Tagging Dataplex Assets ├── TensorFlow Qwik Start └── quicklabgsp637.sh ├── Terraform Fundamentals └── quicklabgsp156.sh ├── Test Network Latency Between VMs ├── GSP161.md └── quicklabgsp161.sh ├── The Basics of Google Cloud Compute: Challenge Lab ├── Transacting Digital Assets with Multi-Party Computation and 
Confidential Space ├── GSP1128.md ├── mpc-ethereum-demo │ ├── Dockerfile │ ├── credential-config.js │ ├── index.js │ ├── kms-decrypt.js │ ├── mpc.js │ └── package.json └── quicklabgsp1128.sh ├── Troubleshooting Common SQL Errors with BigQuery └── quicklabgsp408.sh ├── Troubleshooting Data Models in Looker ├── Troubleshooting and Solving Data Join Pitfalls ├── Understanding and Combining GKE Autoscaling Strategies └── quicklabgsp768.sh ├── Upgrading Google Kubernetes Engine Clusters └── quicklab.sh ├── Use APIs to Work with Cloud Storage Challenge Lab ├── ARC125.md ├── quicklabarc125.sh └── world.jpeg ├── Use Dataproc Serverless for Spark to Load BigQuery ├── Quicklab.md └── quicklab.sh ├── Use Functions Formulas and Charts in Google Sheets Challenge Lab ├── GSP379.md ├── On the Rise Bakery Business Challenge.xlsx └── Staff Roles.pptx ├── Use Functions, Formulas, and Charts in Google Sheets: Challenge Lab ├── Use Go Code to Work with Google Cloud Data Sources ├── Use Machine Learning APIs on Google Cloud Challenge Lab ├── GSP329.md └── quicklabgsp329.sh ├── User Authentication IdentityAware Proxy ├── GSP499.md ├── quicklabgsp499.sh └── quicklabgsp499sh.sh ├── Using BigQuery in the Google Cloud Console └── quicklabgsp406.sh ├── Using Cloud PubSub with Cloud Run APPRUN └── quicklab.sh ├── Using Cloud Trace on Kubernetes Engine ├── GSP484.md └── quicklabgsp484.sh ├── Using Elastic Stack to Monitor Google Cloud ├── task1.sh └── task2.sh ├── Using Gemini Throughout the Software Development Lifecycle ├── Quicklab.md ├── index.ts ├── quicklabtask1.sh └── quicklabtask2.sh ├── Using Logs to Help You Track Down an Issue in Linux └── quicklab.sh ├── Using OpenTSDB to Monitor TimeSeries Data on Cloud Platform └── quicklabgsp142.sh ├── Using Prometheus for Monitoring on Google Cloud Qwik Start ├── GSP1024.md └── quicklabgsp1024.sh ├── Using Ruby on Rails with Cloud SQL for PostgreSQL on Cloud Run ├── GSP943.md └── quicklabgsp943.sh ├── Using Specialized Processors with Document 
AI Python ├── GSP1140.md └── quicklabgsp1140.sh ├── Using Terraform to Create Clients and Servers └── quicklab.sh ├── Using gsutil to Perform Operations on Buckets and Objects ├── GSP130.md └── quicklabgsp130.sh ├── Using the Google Cloud Speech API: Challenge Lab ├── Using the Natural Language API from Google Docs ├── Utilize the Streamlit Framework with Cloud Run and the Gemini API in Vertex AI ├── GSP1229.md └── quicklabgsp1229.sh ├── VM Migration Planning ├── network.tf └── quicklabgsp616.sh ├── VPC Flow Logs Analyzing Network Traffic └── quicklabgsp212.sh ├── VPC Network Peering └── quicklabgsp193.sh ├── VPC Networking Cloud HAVPN └── quicklabgsp619.sh ├── VPC Networking Fundamentals ├── VPC Networking └── quicklab.sh ├── VPC Networks Controlling Access └── quicklabgsp213.sh ├── VPC Networks - Controlling Access ├── Validating Policies for Terraform on Google Cloud └── quicklabgsp1021.sh ├── Vertex AI PaLM API Qwik Start └── quicklabgsp1155.sh ├── Vertex Pipelines Qwik Start └── quicklab.ipynb ├── Video Intelligence Qwik Start └── quicklabgsp154.sh ├── Virtual Networking v15 ├── Quicklab.md └── quicklab.sh ├── Virtual Private Networks VPN └── quicklab.sh ├── Visualizing Billing Data with Looker Studio └── quicklabgsp622.sh ├── Weather Data in BigQuery └── quicklabgsp009.sh ├── Web Security Scanner Qwik Start └── quicklabgsp112.sh ├── Web Security Scanner: Qwik Start.md ├── Work with Python Scripts.md ├── Working with Artifact Registry └── quicklabgsp1076.sh ├── Working with Cloud Build └── quicklab.sh ├── Working with Google Kubernetes Engine Secrets and ConfigMaps ├── Working with JSON Arrays and Structs in BigQuery └── quicklabgsp416.sh ├── Working with Virtual Machines ├── backup.sh ├── quicklab.sh └── world_session.lock ├── Working with multiple VPC networks ├── Quicklab.md └── quicklab.sh ├── Working with the Google Cloud Console and Cloud Shell ├── Working with the Google Cloud Console and Cloud Shell Azure └── Quicklab.md ├── [2023] Automating the 
Deployment of Networks with Terraform └── quicklabgsp460.sh ├── [2023] Create and Manage Cloud Resources: Challenge Lab.md ├── [2024] Detect and Investigate Threats with Security Command Center.md ├── [Updated] Deploy and Manage Cloud Environments with Google Cloud: Challenge Lab ├── gsutil for Storage Configuration ├── GSP695.md └── quicklab.sh └── mini lab ├── Quicklab.md ├── quicklab1.sh ├── quicklab2.sh ├── quicklab3.sh ├── quicklab4.sh ├── quicklab5.sh ├── quicklab6.sh ├── quicklabbig1.sh ├── quicklabbig2.sh ├── quicklabbig3.sh ├── quicklabbig4.sh ├── quicklabbig5.sh └── quicklablabbig6.sh /APIs Explorer App Engine/quicklabgsp422.sh: -------------------------------------------------------------------------------- 1 | 2 | echo "" 3 | echo "" 4 | 5 | read -p "Enter REGION: " REGION 6 | 7 | gcloud auth list 8 | 9 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples 10 | 11 | cd ~/python-docs-samples/appengine/standard_python3/hello_world 12 | 13 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 14 | 15 | gcloud app create --project $PROJECT_ID --region=$REGION 16 | 17 | echo "Y" | gcloud app deploy app.yaml --project $PROJECT_ID 18 | 19 | echo "Y" | gcloud app deploy -v v1 -------------------------------------------------------------------------------- /APIs Explorer Cloud Storage/demo-image1-copy.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/APIs Explorer Cloud Storage/demo-image1-copy.png -------------------------------------------------------------------------------- /APIs Explorer Cloud Storage/demo-image1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/APIs Explorer Cloud Storage/demo-image1.png -------------------------------------------------------------------------------- 
/APIs Explorer Cloud Storage/demo-image2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/APIs Explorer Cloud Storage/demo-image2.png -------------------------------------------------------------------------------- /APIs Explorer Cloud Storage/quicklabgsp421.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/APIs%20Explorer%20Cloud%20Storage/demo-image1-copy.png 5 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/APIs%20Explorer%20Cloud%20Storage/demo-image1.png 6 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/APIs%20Explorer%20Cloud%20Storage/demo-image2.png 7 | 8 | 9 | gcloud alpha services api-keys create --display-name="quicklab" 10 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=quicklab") 11 | export API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 12 | echo $API_KEY 13 | 14 | gsutil mb gs://$DEVSHELL_PROJECT_ID 15 | 16 | gsutil mb gs://$DEVSHELL_PROJECT_ID-quicklab 17 | 18 | gsutil cp demo-image1.png gs://$DEVSHELL_PROJECT_ID 19 | 20 | gsutil cp demo-image2.png gs://$DEVSHELL_PROJECT_ID 21 | 22 | gsutil cp demo-image1-copy.png gs://$DEVSHELL_PROJECT_ID-quicklab -------------------------------------------------------------------------------- /APIs Explorer Compute Engine/quicklabgsp293.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | export REGION=$(gcloud compute project-info describe \ 5 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 6 | 7 | export ZONE=$(gcloud compute project-info describe \ 8 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 9 | 10 | 11 | 
PROJECT_ID=`gcloud config get-value project` 12 | 13 | export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)") 14 | 15 | 16 | 17 | gcloud compute instances create instance-1 \ 18 | --project=$PROJECT_ID \ 19 | --zone=$ZONE \ 20 | --machine-type=n1-standard-1 \ 21 | --image-family=debian-11 \ 22 | --image-project=debian-cloud \ 23 | --boot-disk-type=pd-standard \ 24 | --boot-disk-device-name=instance-1 25 | 26 | 27 | 28 | gcloud compute instances delete instance-1 \ 29 | --project=$PROJECT_ID \ 30 | --zone=$ZONE --quiet 31 | -------------------------------------------------------------------------------- /APIs Explorer Create and Update a Cluster/quicklabgsp288.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | echo "" 4 | echo "" 5 | 6 | read -p "Enter ZONE: " ZONE 7 | 8 | export REGION="${ZONE%-*}" 9 | 10 | gcloud services enable dataproc.googleapis.com 11 | 12 | gcloud dataproc clusters create my-cluster \ 13 | --region=$REGION \ 14 | --zone=$ZONE \ 15 | --image-version=2.0-debian10 \ 16 | --optional-components=JUPYTER \ 17 | --project=$DEVSHELL_PROJECT_ID 18 | 19 | gcloud dataproc jobs submit spark \ 20 | --cluster=my-cluster \ 21 | --region=$REGION \ 22 | --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \ 23 | --class=org.apache.spark.examples.SparkPi \ 24 | --project=$DEVSHELL_PROJECT_ID \ 25 | -- \ 26 | 1000 27 | 28 | gcloud dataproc clusters update my-cluster \ 29 | --region=$REGION \ 30 | --num-workers=3 \ 31 | --project=$DEVSHELL_PROJECT_ID -------------------------------------------------------------------------------- /APIs Explorer Qwik Start/quicklabgsp277.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud config set project $DEVSHELL_PROJECT_ID 4 | gsutil mb gs://$DEVSHELL_PROJECT_ID-bucket 5 | gsutil bucketpolicyonly set off gs://$DEVSHELL_PROJECT_ID-bucket 6 | gsutil iam ch allUsers:objectViewer 
gs://$DEVSHELL_PROJECT_ID-bucket 7 | 8 | 9 | wget https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Use%20APIs%20to%20Work%20with%20Cloud%20Storage%20Challenge%20Lab/world.jpeg 10 | 11 | mv world.jpeg demo-image.jpg 12 | 13 | gsutil cp demo-image.jpg gs://$DEVSHELL_PROJECT_ID-bucket 14 | 15 | gsutil acl ch -u allUsers:R gs://$DEVSHELL_PROJECT_ID-bucket/demo-image.jpg -------------------------------------------------------------------------------- /Activity Add and manage users with Linux commands/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | sudo useradd researcher9 7 | 8 | sudo usermod -aG research_team researcher9 9 | 10 | sudo chown researcher9:research_team /home/researcher2/projects/project_r.txt 11 | 12 | sudo usermod -aG sales_team researcher9 13 | 14 | 15 | 16 | -------------------------------------------------------------------------------- /Activity Filter a SQL query.md: -------------------------------------------------------------------------------- 1 | 2 | ### Activity: Filter a SQL query 3 | 4 | 5 | 6 | 7 | ```bash 8 | SELECT device_id, operating_system 9 | FROM machines; 10 | 11 | 12 | SELECT device_id, operating_system 13 | FROM machines 14 | WHERE operating_system = 'OS 2'; 15 | 16 | 17 | SELECT * 18 | FROM employees 19 | WHERE department = 'Finance'; 20 | 21 | 22 | SELECT * 23 | FROM employees 24 | WHERE department = 'Sales'; 25 | 26 | 27 | 28 | SELECT * 29 | FROM employees 30 | WHERE office = 'South-109'; 31 | 32 | 33 | SELECT * 34 | FROM employees 35 | WHERE office LIKE 'South%'; 36 | ``` 37 | 38 | 39 | ### Congratulations !!! 
40 | -------------------------------------------------------------------------------- /Activity- Examine input:output in the Linux shell/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | echo hello 4 | echo "hello" 5 | echo "Quicklab" 6 | 7 | expr 32 - 8 8 | expr 3500 \* 12 9 | 10 | clear 11 | 12 | -------------------------------------------------------------------------------- /Activity- Filter with AND, OR, and NOT : -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | SELECT * 5 | FROM log_in_attempts 6 | WHERE login_time > '18:00' AND success = 0; 7 | 8 | 9 | 10 | SELECT * 11 | FROM log_in_attempts 12 | WHERE login_date = '2022-05-09' OR login_date = '2022-05-08'; 13 | 14 | 15 | 16 | SELECT * 17 | FROM log_in_attempts 18 | WHERE NOT country LIKE 'MEX%'; 19 | 20 | 21 | 22 | SELECT * 23 | FROM employees 24 | WHERE department = 'Marketing' AND office LIKE 'East-%'; 25 | 26 | 27 | SELECT * 28 | FROM employees 29 | WHERE department = 'Finance' OR department = 'Sales'; 30 | 31 | 32 | SELECT * 33 | FROM employees 34 | WHERE NOT department = 'Information Technology'; 35 | 36 | -------------------------------------------------------------------------------- /Activity: Apply more filters in SQL: -------------------------------------------------------------------------------- 1 | 2 | 3 | SELECT * 4 | FROM log_in_attempts 5 | WHERE login_date > '2022-05-09'; 6 | 7 | 8 | SELECT * 9 | FROM log_in_attempts 10 | WHERE login_date >= '2022-05-09'; 11 | 12 | 13 | SELECT * 14 | FROM log_in_attempts 15 | WHERE login_date BETWEEN '2022-05-09' AND '2022-05-11'; 16 | 17 | 18 | SELECT * 19 | FROM log_in_attempts 20 | WHERE login_time < '07:00:00'; 21 | 22 | 23 | SELECT * 24 | FROM log_in_attempts 25 | WHERE login_time >= '06:00:00' AND login_time < '07:00:00'; 26 | 27 | 28 | SELECT event_id, username, login_date 29 | FROM log_in_attempts 30 | WHERE event_id >= 100; 31 | 32 | 33 | SELECT 
event_id, username, login_date 34 | FROM log_in_attempts 35 | WHERE event_id BETWEEN 100 AND 150; 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | -------------------------------------------------------------------------------- /Activity: Complete a SQL join: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | SELECT * 6 | FROM machines; 7 | 8 | 9 | SELECT * 10 | FROM machines 11 | INNER JOIN employees ON machines.device_id = employees.device_id; 12 | 13 | 14 | SELECT * 15 | FROM machines 16 | LEFT JOIN employees ON machines.device_id = employees.device_id; 17 | 18 | 19 | SELECT * 20 | FROM machines 21 | RIGHT JOIN employees ON machines.device_id = employees.device_id; 22 | 23 | 24 | SELECT * 25 | FROM employees 26 | INNER JOIN log_in_attempts ON employees.username = log_in_attempts.username; 27 | 28 | 29 | 30 | 31 | 32 | -------------------------------------------------------------------------------- /Activity: Decrypt an encrypted message.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | ## Activity: Decrypt an encrypted message 5 | 6 | 7 | ***Start the lab and run the below commands*** 8 | 9 | 10 | ```bash 11 | cat README.txt 12 | 13 | cd caesar 14 | 15 | cat .leftShift3 | tr "d-za-cD-ZA-C" "a-zA-Z" 16 | 17 | cd ~ 18 | 19 | openssl aes-256-cbc -pbkdf2 -a -d -in Q1.encrypted -out Q1.recovered -k ettubrute 20 | 21 | cat Q1.recovered 22 | ``` 23 | 24 | ## Congratulations !!! 
25 | -------------------------------------------------------------------------------- /Activity: Filter with grep: -------------------------------------------------------------------------------- 1 | 2 | 3 | cd /home/analyst/logs 4 | 5 | grep "error" server_logs.txt 6 | 7 | 8 | cd /home/analyst/reports/users 9 | 10 | ls | grep "Q1" 11 | 12 | ls | grep access 13 | 14 | 15 | grep -i "Human Resources" Q4_added_users.txt 16 | 17 | grep "Human Resources" Q4_added_users.txt 18 | 19 | grep "jhill" Q2_deleted_users.txt 20 | -------------------------------------------------------------------------------- /Activity: Find files with Linux commands: -------------------------------------------------------------------------------- 1 | 2 | 3 | pwd 4 | 5 | ls 6 | 7 | cd /home/analyst/reports 8 | 9 | ls -d /home/analyst/reports 10 | 11 | cd /home/analyst/reports/users 12 | 13 | cat Q1_added_users.txt 14 | 15 | cd /home/analyst/logs 16 | 17 | head -n 10 server_logs.txt 18 | 19 | 20 | -------------------------------------------------------------------------------- /Activity: Get help in the command line/quicklab.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## Activity: Get help in the command line 4 | 5 | 6 | ## 7 | 8 | 9 | ```bash 10 | whatis cat 11 | ``` 12 | ## 13 | ```bash 14 | man cat 15 | ``` 16 | ## 17 | #### Press Q to exit this manual page. 18 | 19 | ## 20 | ```bash 21 | apropos -a first part file 22 | ``` 23 | ## 24 | 25 | ```bash 26 | man useradd 27 | ``` 28 | 29 | #### Press Q to exit this manual page. 30 | 31 | ```bash 32 | whatis rm 33 | 34 | whatis rmdir 35 | 36 | apropos -a "create new group" 37 | ``` 38 | 39 | 40 | ## Congratulations !!! 
41 | -------------------------------------------------------------------------------- /Activity: Install software in a Linux.md: -------------------------------------------------------------------------------- 1 | ## Activity: Install software in a Linux distribution 2 | 3 | 4 | 5 | 6 | ***Start the lab and run the below commands*** 7 | 8 | 9 | ```bash 10 | apt 11 | 12 | sudo apt install suricata 13 | 14 | suricata --version 15 | ``` 16 | ### 17 | 18 | ***Check the score for task 1*** 19 | 20 | ```bash 21 | sudo apt remove suricata -y 22 | 23 | suricata --version 24 | ``` 25 | 26 | ***Check the score for task 2*** 27 | 28 | ```bash 29 | sudo apt install tcpdump 30 | ``` 31 | #### 32 | ***Check the score for task 3*** 33 | 34 | 35 | ```bash 36 | apt list --installed 37 | 38 | apt list --installed | grep tcpdump 39 | ``` 40 | 41 | #### 42 | ***Check the score for task 4*** 43 | 44 | ```bash 45 | sudo apt install suricata -y 46 | ``` 47 | #### 48 | ***Check the score for task 5*** 49 | 50 | ### 51 | ## Congratulations !!! 
52 | 53 | -------------------------------------------------------------------------------- /Activity: Manage authorization: -------------------------------------------------------------------------------- 1 | 2 | 3 | cd projects/ 4 | 5 | ls -a 6 | 7 | chmod g-rw project_m.txt 8 | 9 | chmod g-rw project_m.txt 10 | 11 | chmod o-w project_k.txt 12 | 13 | chmod 440 .project_x.txt 14 | 15 | 16 | ls -ld /home/researcher2/projects/drafts 17 | 18 | chmod g-x /home/researcher2/projects/drafts 19 | 20 | -------------------------------------------------------------------------------- /Activity: Perform a SQL query.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | ## Activity: Perform a SQL query 7 | 8 | 9 | ***Start the lab and run the below commands*** 10 | 11 | 12 | ```bash 13 | SELECT device_id, email_client 14 | FROM machines; 15 | 16 | SELECT device_id, operating_system, OS_patch_date 17 | FROM machines; 18 | 19 | 20 | 21 | SELECT event_id, country 22 | FROM log_in_attempts 23 | WHERE country = 'Australia'; 24 | 25 | 26 | SELECT username, login_date, login_time 27 | FROM log_in_attempts 28 | OFFSET 4 LIMIT 1; 29 | 30 | 31 | SELECT * 32 | FROM log_in_attempts; 33 | 34 | 35 | SELECT * 36 | FROM log_in_attempts 37 | ORDER BY login_date; 38 | 39 | SELECT * 40 | FROM log_in_attempts 41 | ORDER BY login_date, login_time; 42 | ``` 43 | 44 | ## Congratulations !!! 
45 | -------------------------------------------------------------------------------- /Adding a Phone Gateway to a Virtual Agent/pigeon-travel-gsp-793-cloud-function/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "dialogflowFirebaseFulfillment", 3 | "description": "This is the default fulfillment for a Dialogflow agents using Cloud Functions for Firebase", 4 | "version": "0.0.1", 5 | "private": true, 6 | "license": "Apache Version 2.0", 7 | "author": "Google Inc.", 8 | "engines": { 9 | "node": "8" 10 | }, 11 | "scripts": { 12 | "start": "firebase serve --only functions:dialogflowFirebaseFulfillment", 13 | "deploy": "firebase deploy --only functions:dialogflowFirebaseFulfillment" 14 | }, 15 | "dependencies": { 16 | "actions-on-google": "^2.2.0", 17 | "firebase-admin": "^5.13.1", 18 | "firebase-functions": "^2.0.2", 19 | "dialogflow": "^0.6.0", 20 | "dialogflow-fulfillment": "^0.5.0" 21 | } 22 | } -------------------------------------------------------------------------------- /Adding a Phone Gateway to a Virtual Agent/pigeon-travel-gsp-793-cloud-function/quicklab.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Adding a Phone Gateway to a Virtual Agent/pigeon-travel-gsp-793-cloud-function/quicklab.zip -------------------------------------------------------------------------------- /Analyze Sentiment with Natural Language API Challenge Lab/quicklabarc130.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | cat > analyze-request.json < analyze-response.txt 18 | 19 | 20 | 21 | 22 | 23 | cat > multi-nl-request.json < multi-response.txt 39 | 40 | 41 | -------------------------------------------------------------------------------- /Analyzing and Visualizing Data in Looker: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | TASK 1:- 4 | 5 | # Place in `faa` model 6 | explore: +airports { 7 | query: start_from_here { 8 | measures: [average_elevation] 9 | } 10 | } 11 | 12 | 13 | 14 | TASK 2:- 15 | 16 | # Place in `faa` model 17 | explore: +airports { 18 | query: start_from_here { 19 | dimensions: [facility_type] 20 | measures: [average_elevation, count] 21 | } 22 | } 23 | 24 | 25 | 26 | 27 | TASK 3:- 28 | 29 | # Place in `faa` model 30 | explore: +flights { 31 | query: start_from_here { 32 | dimensions: [depart_week] 33 | measures: [cancelled_count] 34 | filters: [flights.depart_date: "2004"] 35 | } 36 | } 37 | 38 | 39 | 40 | TASK 4:- 41 | 42 | # Place in `faa` model 43 | explore: +flights { 44 | query: start_from_here { 45 | dimensions: [depart_week, distance_tiered] 46 | measures: [count] 47 | filters: [flights.depart_date: "2003"] 48 | } 49 | } 50 | 51 | 52 | 53 | 54 | 55 | 56 | -------------------------------------------------------------------------------- /App DevStoring Image and Video Files in Cloud Storage v11/quicklab.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | 3 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 4 | 5 | cd ~/training-data-analyst/courses/developingapps/nodejs/cloudstorage/start 6 | 7 | . 
prepare_environment.sh 8 | 9 | gsutil mb gs://$DEVSHELL_PROJECT_ID-media 10 | 11 | export GCLOUD_BUCKET=$DEVSHELL_PROJECT_ID-media 12 | 13 | cd ~/training-data-analyst/courses/developingapps/nodejs/cloudstorage/start/server/gcp 14 | 15 | rm cloudstorage.js 16 | 17 | ##uploading files here 18 | 19 | wget https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/App%20DevStoring%20Image%20and%20Video%20Files%20in%20Cloud%20Storage%20v11/cloudstorage.js 20 | 21 | npm start 22 | -------------------------------------------------------------------------------- /App Engine Qwik Start Go/quicklabgsp070.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | gcloud services enable appengine.googleapis.com 3 | 4 | git clone https://github.com/GoogleCloudPlatform/golang-samples.git 5 | 6 | cd golang-samples/appengine/go11x/helloworld 7 | 8 | sudo apt-get install google-cloud-sdk-app-engine-go 9 | 10 | sleep 30 11 | 12 | gcloud app create --region=$REGION 13 | 14 | gcloud app deploy --quiet 15 | -------------------------------------------------------------------------------- /App Engine Qwik Start Java/quicklabgsp068.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud services enable appengine.googleapis.com 3 | 4 | sleep 10 5 | 6 | git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git 7 | 8 | cd java-docs-samples/appengine-java8/helloworld 9 | 10 | mvn clean 11 | mvn package 12 | 13 | timeout 30 mvn appengine:run & 14 | 15 | sleep 30 16 | 17 | gcloud app create --region=$REGION 18 | 19 | sed -i "s/myProjectId/$DEVSHELL_PROJECT_ID/g" pom.xml 20 | 21 | mvn package appengine:deploy 22 | 23 | gcloud app browse 24 | -------------------------------------------------------------------------------- /App Engine Qwik Start PHP/quicklabgsp069.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | 
gcloud services enable appengine.googleapis.com 3 | 4 | git clone https://github.com/GoogleCloudPlatform/php-docs-samples.git 5 | 6 | cd php-docs-samples/appengine/standard/helloworld 7 | 8 | sleep 30 9 | 10 | gcloud app create --region=$REGION 11 | 12 | gcloud app deploy --quiet 13 | -------------------------------------------------------------------------------- /App Engine Qwik Start Python/quicklabgsp067.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | gcloud services enable appengine.googleapis.com 3 | 4 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git 5 | 6 | cd python-docs-samples/appengine/standard_python3/hello_world 7 | 8 | sudo apt install python3 -y 9 | sudo apt install python3.11-venv -y 10 | python3 -m venv create myvenv 11 | source myvenv/bin/activate 12 | 13 | 14 | sed -i '32c\ return "Hello, Cruel World!"' main.py 15 | 16 | sleep 30 17 | 18 | gcloud app create --region=$REGION 19 | 20 | gcloud app deploy --quiet 21 | -------------------------------------------------------------------------------- /App Engine: 3 Ways: Challenge Lab: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable appengine.googleapis.com 4 | 5 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git 6 | 7 | cd python-docs-samples/appengine/standard_python3/hello_world 8 | 9 | gcloud app deploy 10 | 11 | 12 | nano main.py 13 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1200.sh: -------------------------------------------------------------------------------- 1 | export REGION=$(gcloud compute project-info describe \ 2 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 3 | 4 | PROJECT_ID=`gcloud config get-value project` 5 | BUCKET=${PROJECT_ID}-bucket 6 | gsutil mb -l $REGION gs://${BUCKET} 7 | 
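The script above derives the bucket name by suffixing `-bucket` onto the project ID. Since `gsutil mb` rejects malformed names, a cheap local sanity check can run first — a sketch with a placeholder project ID (the regex covers only the common lowercase/digit/dash rules, not the full GCS naming spec, which also permits dots and underscores in some cases):

```bash
# Placeholder; the real script reads this via `gcloud config get-value project`.
PROJECT_ID="example-project-123"
BUCKET="${PROJECT_ID}-bucket"

# Common-case GCS bucket naming rules: 3-63 chars, lowercase letters,
# digits, and dashes, starting and ending with a letter or digit.
if printf '%s' "$BUCKET" | grep -Eq '^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$'; then
  echo "ok: gs://${BUCKET}"
else
  echo "invalid bucket name: ${BUCKET}" >&2
fi
```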
-------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1201.sh: -------------------------------------------------------------------------------- 1 | export REGION=$(gcloud compute project-info describe \ 2 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 3 | 4 | export ZONE=$(gcloud compute project-info describe \ 5 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 6 | 7 | PROJECT_ID=`gcloud config get-value project` 8 | 9 | bq mk --location=$REGION sports 10 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1202.sh: -------------------------------------------------------------------------------- 1 | gcloud compute networks create staging --subnet-mode=auto 2 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1203.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION=$(gcloud compute project-info describe \ 4 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 5 | 6 | export ZONE=$(gcloud compute project-info describe \ 7 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 8 | 9 | PROJECT_ID=`gcloud config get-value project` 10 | 11 | 12 | 13 | users=($(gcloud projects get-iam-policy $PROJECT_ID --format=json | jq -r '.bindings[] | select(.role=="roles/viewer") | .members[]' | sed 's/user://g')) 14 | 15 | # Assign each user to a separate variable dynamically 16 | for i in "${!users[@]}"; do 17 | eval "USER$((i+1))=${users[i]}" 18 | done 19 | 20 | # # Print all stored users 21 | # echo "Users with Viewer role:" 22 | # for i in "${!users[@]}"; do 23 | # eval "echo User$((i+1)): \$USER$((i+1))" 24 | # done 25 | 26 | 27 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 28 | --member="user:$USER3" \ 29 | --role="roles/editor" 30 | 31 
| sleep 20 32 | 33 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 34 | --member="user:$USER1" \ 35 | --role="roles/compute.admin" 36 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1204.sh: -------------------------------------------------------------------------------- 1 | gcloud pubsub topics create sports_topic 2 | 3 | gcloud pubsub topics create app_topic 4 | 5 | gcloud pubsub subscriptions create app_subscription --topic=app_topic 6 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1208.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | # STEP 1: Set region 5 | read -p "Export REGION :- " REGION 6 | 7 | 8 | # Step 1.2: Set variables 9 | REPO_NAME="container-registry" 10 | FORMAT="DOCKER" 11 | POLICY_NAME="Grandfather" 12 | KEEP_COUNT=3 13 | 14 | # Step 2: Create the Artifact Registry repository 15 | gcloud artifacts repositories create $REPO_NAME \ 16 | --repository-format=$FORMAT \ 17 | --location=$REGION \ 18 | --description="Docker repo for container images" 19 | 20 | # Step 3: Create cleanup policy named 'Grandfather' to keep only the latest 3 versions 21 | # gcloud artifacts policies create $POLICY_NAME \ 22 | # --repository=$REPO_NAME \ 23 | # --location=$REGION \ 24 | # --package-type=$FORMAT \ 25 | # --keep-count=$KEEP_COUNT \ 26 | # --action=DELETE 27 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1209.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | # STEP 1: Set region 5 | read -p "Export REGION :- " REGION 6 | 7 | 8 | # Step 1.2: Set variables 9 | REPO_NAME="container-registry" 10 | FORMAT="DOCKER" 11 | POLICY_NAME="Grandfather" 12 | KEEP_COUNT=3 13 | 14 | # Step 2: Create the Artifact Registry repository 15 | gcloud artifacts 
repositories create $REPO_NAME \ 16 | --repository-format=$FORMAT \ 17 | --location=$REGION \ 18 | --description="Docker repo for container images" 19 | 20 | # Step 3: Create cleanup policy named 'Grandfather' to keep only the latest 3 versions 21 | # gcloud artifacts policies create $POLICY_NAME \ 22 | # --repository=$REPO_NAME \ 23 | # --location=$REGION \ 24 | # --package-type=$FORMAT \ 25 | # --keep-count=$KEEP_COUNT \ 26 | # --action=DELETE 27 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1210.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | # STEP 1: Set region 5 | read -p "Export REGION :- " REGION 6 | 7 | 8 | # Step 1.2: Set variables 9 | REPO_NAME="container-registry" 10 | FORMAT="DOCKER" 11 | POLICY_NAME="Grandfather" 12 | KEEP_COUNT=3 13 | 14 | # Step 2: Create the Artifact Registry repository 15 | gcloud artifacts repositories create $REPO_NAME \ 16 | --repository-format=$FORMAT \ 17 | --location=$REGION \ 18 | --description="Docker repo for container images" 19 | 20 | # Step 3: Create cleanup policy named 'Grandfather' to keep only the latest 3 versions 21 | # gcloud artifacts policies create $POLICY_NAME \ 22 | # --repository=$REPO_NAME \ 23 | # --location=$REGION \ 24 | # --package-type=$FORMAT \ 25 | # --keep-count=$KEEP_COUNT \ 26 | # --action=DELETE 27 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1211.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | # STEP 1: Set region 5 | read -p "Export REGION :- " REGION 6 | 7 | 8 | # Step 1.2: Set variables 9 | REPO_NAME="container-registry" 10 | FORMAT="DOCKER" 11 | POLICY_NAME="Grandfather" 12 | KEEP_COUNT=3 13 | 14 | # Step 2: Create the Artifact Registry repository 15 | gcloud artifacts repositories create $REPO_NAME \ 16 | --repository-format=$FORMAT 
\ 17 | --location=$REGION \ 18 | --description="Docker repo for container images" 19 | 20 | # Step 3: Create cleanup policy named 'Grandfather' to keep only the latest 3 versions 21 | # gcloud artifacts policies create $POLICY_NAME \ 22 | # --repository=$REPO_NAME \ 23 | # --location=$REGION \ 24 | # --package-type=$FORMAT \ 25 | # --keep-count=$KEEP_COUNT \ 26 | # --action=DELETE 27 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc1221.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | # STEP 1: Set region 5 | read -p "Export REGION :- " REGION 6 | 7 | # STEP 2: Create working directory 8 | mkdir ~/hello-go && cd ~/hello-go 9 | 10 | # STEP 3: Create main.go file 11 | cat > main.go < go.mod < index.js < { 15 | res.status(200).send('HTTP with Node.js in GCF 2nd gen!'); 16 | }); 17 | EOF_END 18 | 19 | 20 | cat > package.json < index.js <<'EOF_END' 8 | const functions = require('@google-cloud/functions-framework'); 9 | 10 | // Register an HTTP function with the Functions Framework that will be executed 11 | // when you make an HTTP request to the deployed function's endpoint. 
12 | functions.http('helloGET', (req, res) => { 13 | res.send('Subscribe to Quicklab!'); 14 | }); 15 | EOF_END 16 | 17 | cat > package.json < Function.cs <<'EOF_END' 9 | using Google.Cloud.Functions.Framework; 10 | using Microsoft.AspNetCore.Http; 11 | using System.Threading.Tasks; 12 | 13 | namespace HelloWorld; 14 | 15 | public class Function : IHttpFunction 16 | { 17 | public async Task HandleAsync(HttpContext context) 18 | { 19 | await context.Response.WriteAsync("Hello World!", context.RequestAborted); 20 | } 21 | } 22 | EOF_END 23 | 24 | cat > HelloHttp.csproj <<'EOF_END' 25 | 26 | 27 | Exe 28 | net6.0 29 | 30 | 31 | 32 | 33 | 34 | 35 | EOF_END 36 | 37 | # HelloWorld.Function entry point 38 | 39 | 40 | gcloud functions deploy cf-demo \ 41 | --gen2 \ 42 | --entry-point=HelloWorld.Function \ 43 | --runtime=dotnet6 \ 44 | --region=$REGION \ 45 | --source=. \ 46 | --trigger-http \ 47 | --allow-unauthenticated \ 48 | --quiet 49 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc236.sh: -------------------------------------------------------------------------------- 1 | PROJECT_ID=$(gcloud config get-value project) 2 | PROJECT_NUMBER=$(gcloud projects list --filter="project_id:$PROJECT_ID" --format='value(project_number)') 3 | 4 | git clone https://github.com/GoogleCloudPlatform/golang-samples.git 5 | cd golang-samples/functions/functionsv2/hellostorage/ 6 | 7 | 8 | deploy_function() { 9 | gcloud functions deploy "$SERVICE_NAME" \ 10 | --runtime=go121 \ 11 | --region="$REGION" \ 12 | --source=.
\ 13 | --entry-point=HelloStorage \ 14 | --trigger-bucket="$DEVSHELL_PROJECT_ID-bucket" 15 | 16 | } 17 | 18 | # Variables 19 | SERVICE_NAME="cf-demo" 20 | 21 | # Loop until the Cloud Function is deployed 22 | while true; do 23 | # Run the deployment command 24 | deploy_function 25 | 26 | # Check if Cloud Function is deployed 27 | if gcloud functions describe "$SERVICE_NAME" --region "$REGION" &> /dev/null; then 28 | echo "Cloud Function is deployed. Exiting the loop." 29 | break 30 | else 31 | echo "Waiting for Cloud Function to be deployed..." 32 | echo "In the meantime, subscribe to Quicklab at https://www.youtube.com/@quick_lab." 33 | sleep 10 34 | fi 35 | done 36 | 37 | -------------------------------------------------------------------------------- /Arcade Hero/quicklabarc237.sh: -------------------------------------------------------------------------------- 1 | gcloud config set compute/region $REGION 2 | export PROJECT_ID=$(gcloud config get-value project) 3 | 4 | 5 | git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git 6 | cd nodejs-docs-samples/functions/v2/helloPubSub/ 7 | 8 | gcloud functions deploy cf-demo \ 9 | --gen2 \ 10 | --runtime=nodejs20 \ 11 | --region=$REGION \ 12 | --source=. 
\ 13 | --entry-point=helloPubSub \ 14 | --trigger-topic=cf_topic \ 15 | --quiet 16 | -------------------------------------------------------------------------------- /August Cloud SQL Loading data into Google Cloud SQL/quicklabarc116.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | cat > $JSON_FILE_NAME < main.py < /tmp/tabledef.json 24 | 25 | 26 | bq mk --external_table_definition=/tmp/tabledef.json --project_id=$DEVSHELL_PROJECT_ID demo_dataset.biglake_table 27 | 28 | bq mk --external_table_definition=/tmp/tabledef.json --project_id=$DEVSHELL_PROJECT_ID demo_dataset.external_table 29 | 30 | bq show --schema --format=prettyjson demo_dataset.external_table > /tmp/schema 31 | 32 | bq update --external_table_definition=/tmp/tabledef.json --schema=/tmp/schema demo_dataset.external_table 33 | 34 | 35 | 36 | 37 | 38 | 39 | -------------------------------------------------------------------------------- /BigQuery Qwik Start Command Line/names.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/BigQuery Qwik Start Command Line/names.zip -------------------------------------------------------------------------------- /BigQuery Qwik Start Command Line/quicklabgsp071.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | bq show bigquery-public-data:samples.shakespeare 4 | 5 | bq query --use_legacy_sql=false \ 6 | 'SELECT 7 | word, 8 | SUM(word_count) AS count 9 | FROM 10 | `bigquery-public-data`.samples.shakespeare 11 | WHERE 12 | word LIKE "%raisin%" 13 | GROUP BY 14 | word' 15 | 16 | bq query --use_legacy_sql=false \ 17 | 'SELECT 18 | word 19 | FROM 20 | `bigquery-public-data`.samples.shakespeare 21 | WHERE 22 | word = "huzzah"' 23 | 24 | bq mk babynames 25 | 26 | wget 
https://github.com/quiccklabs/Labs_solutions/raw/master/BigQuery%20Qwik%20Start%20%20Command%20Line/names.zip 27 | 28 | unzip names.zip 29 | 30 | bq load babynames.names2010 yob2010.txt name:string,gender:string,count:integer 31 | 32 | bq query "SELECT name,count FROM babynames.names2010 WHERE gender = 'F' ORDER BY count DESC LIMIT 5" 33 | 34 | bq query "SELECT name,count FROM babynames.names2010 WHERE gender = 'M' ORDER BY count ASC LIMIT 5" 35 | 36 | 37 | 38 | 39 | -------------------------------------------------------------------------------- /BigQuery: Qwik Start - Console: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | bq query --use_legacy_sql=false \ 5 | ' 6 | #standardSQL 7 | SELECT 8 | weight_pounds, state, year, gestation_weeks 9 | FROM 10 | `bigquery-public-data.samples.natality` 11 | ORDER BY weight_pounds DESC LIMIT 10; 12 | ' 13 | 14 | bq mk babynames 15 | 16 | 17 | 18 | gsutil cp gs://spls/gsp072/baby-names.zip .
19 | 20 | unzip baby-names.zip 21 | 22 | bq load --autodetect --source_format=CSV babynames.names_2014 gs://spls/gsp072/baby-names/yob2014.txt name:string,gender:string,count:integer 23 | 24 | 25 | bq query --use_legacy_sql=false \ 26 | ' 27 | #standardSQL 28 | SELECT 29 | name, count 30 | FROM 31 | `babynames.names_2014` 32 | WHERE 33 | gender = "M" 34 | ORDER BY count DESC LIMIT 5; 35 | ' 36 | 37 | -------------------------------------------------------------------------------- /Bigtable Qwik Start Command Line/quicklabgsp099.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable bigtable.googleapis.com bigtableadmin.googleapis.com 4 | 5 | sleep 10 6 | 7 | gcloud bigtable instances create quickstart-instance \ 8 | --display-name="Quickstart instance" \ 9 | --cluster-storage-type=SSD \ 10 | --cluster-config=id=quickstart-instance-c1,zone=$ZONE,nodes=1 11 | 12 | 13 | echo project = `gcloud config get-value project` > ~/.cbtrc 14 | 15 | echo instance = quickstart-instance >> ~/.cbtrc 16 | 17 | 18 | cbt createtable my-table 19 | 20 | cbt ls 21 | 22 | cbt createfamily my-table cf1 23 | 24 | cbt ls my-table 25 | 26 | cbt set my-table r1 cf1:c1=test-value 27 | 28 | cbt read my-table 29 | 30 | cbt deletetable my-table 31 | 32 | 33 | -------------------------------------------------------------------------------- /Build a BI Dashboard Using Looker Studio and BigQuery/quicklabgsp403.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | bq mk Reports 4 | 5 | 6 | bq query \ 7 | --use_legacy_sql=false \ 8 | --destination_table=$DEVSHELL_PROJECT_ID:Reports.Trees \ 9 | --replace=false \ 10 | --nouse_cache \ 11 | "SELECT 12 | TIMESTAMP_TRUNC(plant_date, MONTH) as plant_month, 13 | COUNT(tree_id) AS total_trees, 14 | species, 15 | care_taker, 16 | address, 17 | site_info 18 | FROM 19 | \`bigquery-public-data.san_francisco_trees.street_trees\` 20 | WHERE 21 | address IS NOT 
NULL 22 | AND plant_date >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 365 DAY) 23 | AND plant_date < TIMESTAMP_TRUNC(CURRENT_TIMESTAMP(), DAY) 24 | GROUP BY 25 | plant_month, 26 | species, 27 | care_taker, 28 | address, 29 | site_info" 30 | 31 | 32 | 33 | echo -e "\033[1;32mClick Here\033[0m https://datastudio.google.com/" 34 | -------------------------------------------------------------------------------- /Build a Chat Application using the PaLM 2 API on Cloud Run/quicklabgsp1201.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gsutil cp -R gs://spls/gsp1201/chat-flask-cloudrun . 5 | 6 | cd chat-flask-cloudrun 7 | 8 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 9 | 10 | 11 | export AR_REPO='chat-app-repo' 12 | export SERVICE_NAME='chat-flask-app' 13 | 14 | gcloud artifacts repositories create "$AR_REPO" --location="$REGION" --repository-format=Docker 15 | 16 | gcloud builds submit --tag "$REGION-docker.pkg.dev/$PROJECT_ID/$AR_REPO/$SERVICE_NAME" 17 | 18 | gcloud run deploy "$SERVICE_NAME" --port=8080 --image="$REGION-docker.pkg.dev/$PROJECT_ID/$AR_REPO/$SERVICE_NAME:latest" --allow-unauthenticated --region=$REGION --platform=managed --project=$PROJECT_ID --set-env-vars=GCP_PROJECT=$PROJECT_ID,GCP_REGION=$REGION -------------------------------------------------------------------------------- /Build a Data Mesh with Dataplex Challenge Lab/quicklabgsp514.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | cat > dq-customer-orders.yaml < ~/genkit-intro/data/questions.json 4 | [ 5 | { 6 | "input": { 7 | "question": "What's on the menu today?" 8 | } 9 | }, 10 | { 11 | "input": { 12 | "question": "Are there any burgers in the menu today?" 
13 | } 14 | } 15 | ] 16 | EOF 17 | 18 | 19 | cd ~/genkit-intro 20 | npx genkit eval:flow ragMenuQuestion --input data/questions.json --output application_result.json 21 | 22 | gcloud storage cp -r ~/genkit-intro/data gs://$DEVSHELL_PROJECT_ID 23 | gcloud storage cp ~/genkit-intro/application_result.json gs://$DEVSHELL_PROJECT_ID 24 | gcloud storage cp ~/genkit-intro/src/index.ts gs://$DEVSHELL_PROJECT_ID 25 | -------------------------------------------------------------------------------- /Build a Serverless Web App with Firebase/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | git clone https://github.com/rosera/pet-theory.git 3 | 4 | cd pet-theory/lab02/ 5 | 6 | npm i && npm audit fix --force 7 | 8 | gcloud firestore databases create --location=$REGION 9 | 10 | echo "https://console.cloud.google.com/firebase?referrer=search&project=$DEVSHELL_PROJECT_ID" 11 | -------------------------------------------------------------------------------- /Building Virtual Agent Fulfillment/quicklab/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "dialogflowFirebaseFulfillment", 3 | "description": "This is the default fulfillment for a Dialogflow agents using Cloud Functions for Firebase", 4 | "version": "0.0.1", 5 | "private": true, 6 | "license": "Apache Version 2.0", 7 | "author": "Google Inc.", 8 | "engines": { 9 | "node": "10" 10 | }, 11 | "scripts": { 12 | "start": "firebase serve --only functions:dialogflowFirebaseFulfillment", 13 | "deploy": "firebase deploy --only functions:dialogflowFirebaseFulfillment" 14 | }, 15 | "dependencies": { 16 | "actions-on-google": "^2.2.0", 17 | "firebase-admin": "^5.13.1", 18 | "firebase-functions": "^2.0.2", 19 | "dialogflow": "^0.6.0", 20 | "dialogflow-fulfillment": "^0.5.0" 21 | } 22 | } -------------------------------------------------------------------------------- /Building Virtual Agent Fulfillment/quicklabgsp792.sh: 
-------------------------------------------------------------------------------- 1 | 2 | export PROJECT_NUMBER=$(gcloud projects describe $DEVSHELL_PROJECT_ID --format='value(projectNumber)') 3 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 4 | 5 | 6 | gcloud services enable dialogflow.googleapis.com 7 | gcloud services disable cloudfunctions.googleapis.com 8 | gcloud services enable cloudfunctions.googleapis.com 9 | gcloud services enable firestore.googleapis.com 10 | 11 | 12 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 13 | --member="serviceAccount:service-$PROJECT_NUMBER@gcf-admin-robot.iam.gserviceaccount.com" \ 14 | --role="roles/artifactregistry.reader" 15 | 16 | 17 | gcloud firestore databases create --location=nam5 18 | 19 | 20 | # Print Firestore in yellow 21 | echo -e "\033[1;33mFirestore\033[0m" 22 | 23 | # Print the URL in blue and bold 24 | echo -e "\033[1;34m(https://console.cloud.google.com/firestore/databases/-default-/data/panel)\033[0m" 25 | -------------------------------------------------------------------------------- /Building Virtual Agent Fulfillment/quicklabgsp792.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Building Virtual Agent Fulfillment/quicklabgsp792.zip -------------------------------------------------------------------------------- /CI CD for a TFX pipeline: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | !gcloud builds submit --timeout=15m --tag {IMAGE_URI} tfx-cli 5 | 6 | 7 | 8 | !gcloud builds submit . 
--timeout=15m --config cloudbuild.yaml --substitutions {SUBSTITUTIONS} 9 | 10 | 11 | 12 | cd mlops-on-gcp/workshops/tfx-caip-tf23/ 14 | 15 | git remote -v 16 | 17 | git remote add upstream 19 | 20 | 21 | 22 | -------------------------------------------------------------------------------- /Change firewall rules using Terraform and Cloud Shell/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | git clone https://github.com/terraform-google-modules/docs-examples.git 6 | 7 | cd docs-examples/firewall_basic/ 8 | 9 | terraform init 10 | 11 | terraform apply --auto-approve 12 | 13 | terraform init 14 | 15 | terraform apply --auto-approve 16 | -------------------------------------------------------------------------------- /Check my progress/bookmark.md: -------------------------------------------------------------------------------- 1 | 2 | ## Bookmark Code 3 | 4 | ```javascript 5 | javascript:(function () { 6 | const removeLeaderboard = document.querySelector('.js-lab-leaderboard'); 7 | const showScore = document.querySelector('.games-labs'); 8 | 9 | removeLeaderboard.remove(); 10 | showScore.className = "lab-show l-full no-nav application-new lab-show l-full no-nav " 11 | })(); 12 | ``` 13 | -------------------------------------------------------------------------------- /Classify Images of Clouds in the Cloud with AutoML Images/quicklabgsp223.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gsutil mb -p $GOOGLE_CLOUD_PROJECT \ 5 | -c standard \ 6 | -l us \ 7 | gs://$GOOGLE_CLOUD_PROJECT-vcm/ 8 | 9 | export BUCKET=$GOOGLE_CLOUD_PROJECT-vcm 10 | 11 | gsutil -m cp -r gs://spls/gsp223/images/* gs://${BUCKET} 12 | 13 | gsutil cp gs://spls/gsp223/data.csv .
14 | 15 | sed -i -e "s/placeholder/${BUCKET}/g" ./data.csv 16 | 17 | gsutil cp ./data.csv gs://${BUCKET} 18 | 19 | 20 | 21 | -------------------------------------------------------------------------------- /Cloud Audit Logs/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | gsutil mb gs://$DEVSHELL_PROJECT_ID 7 | 8 | echo "Hello World!" > sample.txt 9 | gsutil cp sample.txt gs://$DEVSHELL_PROJECT_ID 10 | 11 | gcloud compute networks create mynetwork --subnet-mode=auto 12 | 13 | gcloud compute instances create default-us-vm \ 14 | --zone=$ZONE --network=mynetwork --machine-type=e2-medium 15 | 16 | gcloud logging read \ 17 | "logName=projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity" 18 | 19 | gcloud logging read \ 20 | "logName=projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access" 21 | 22 | 23 | -------------------------------------------------------------------------------- /Cloud Digital Leader Learning Path/01 Digital Transformation with Google Cloud/02 Fundamental Cloud Concepts.md: -------------------------------------------------------------------------------- 1 | # Fundamental Cloud Concepts 2 | 3 | 1. **A financial services organization has bank branches in a number of countries, and has built an application that needs to run in different configurations based on the local regulations of each country. How can cloud infrastructure help achieve this goal?** 4 | 5 | - **Flexibility of infrastructure configuration.** 6 | 7 | 2. **An organization wants to ensure they have redundancy of their resources so their application remains available in the event of a disaster. How can they ensure this happens?** 8 | 9 | - **By putting resources in different zones.** 10 | 11 | 3. **An organization has shifted from a CapEx to OpEx based spending model. Which of these statements is true?** 12 | 13 | - **They will only pay for what they use.** 14 | 15 | 4. 
**An organization wants to innovate using the latest technologies, but also has compliance needs that specify data must be stored in specific locations. Which cloud approach would best suit their needs?** 16 | 17 | - **Hybrid Cloud** 18 | 19 | 5. **Which network performance metric describes the amount of data a network can transfer in a given amount of time?** 20 | 21 | - **Bandwidth** 22 | -------------------------------------------------------------------------------- /Cloud Digital Leader Learning Path/01 Digital Transformation with Google Cloud/03 Cloud Computing Benefits and Models.md: -------------------------------------------------------------------------------- 1 | # Cloud Computing Benefits and Models 2 | 3 | 1. **Which option best describes a benefit of Infrastructure as a Service (IaaS)?** 4 | 5 | - **It’s efficient**, as IaaS resources are available when needed and resources aren’t wasted by overbuilding capacity. 6 | 7 | 2. **Which cloud computing service model offers a develop-and-deploy environment to build cloud applications?** 8 | 9 | - **Platform as a Service (PaaS)** 10 | 11 | 3. **An organization wants to move their collaboration software to the cloud, but due to limited IT staff one of their main drivers is having low maintenance needs. Which cloud computing model would best suit their requirements?** 12 | 13 | - **Software as a Service (SaaS)** 14 | 15 | 4. 
**In the cloud computing shared responsibility model, what types of content are customers always responsible for, regardless of the computing model chosen?** 16 | 17 | - **The customer is responsible for securing anything that they create within the cloud, such as the configurations, access policies, and user data.** 18 | -------------------------------------------------------------------------------- /Cloud Digital Leader Learning Path/02 Exploring Data Transformation with Google Cloud/03 Making Data Useful and Accessible.md: -------------------------------------------------------------------------------- 1 | # Making Data Useful and Accessible 2 | 3 | 1. **Streaming analytics is the processing and analyzing of data records continuously instead of in batches. Which option is a source of streaming data?** 4 | 5 | - **Temperature sensors** 6 | 7 | 2. **What is Google Cloud’s distributed messaging service that can receive messages from various device streams such as gaming events, Internet of Things (IoT) devices, and application streams?** 8 | 9 | - **Pub/Sub** 10 | 11 | 3. **What feature of Looker makes it easy to integrate into existing workflows and share with multiple teams at an organization?** 12 | 13 | - **It’s 100% web based.** 14 | 15 | 4. **What Google Cloud business intelligence platform is designed to help individuals and teams analyze, visualize, and share data?** 16 | 17 | - **Looker** 18 | 19 | 5. **Which statement is true about Dataflow?** 20 | 21 | - **It handles infrastructure setup and maintenance for processing pipelines.** 22 | 23 | 6. 
**What does ETL stand for in the context of data processing?** 24 | 25 | - **Extract, transform, and load** 26 | -------------------------------------------------------------------------------- /Cloud Digital Leader Learning Path/05 Trust and Security with Google Cloud/03 Google Cloud’s Trust Principles and Compliance.md: -------------------------------------------------------------------------------- 1 | # Google Cloud’s Trust Principles and Compliance 2 | 3 | 1. **Which is one of Google Cloud’s seven trust principles?** 4 | 5 | - **All customer data is encrypted by default.** 6 | 7 | 2. **Which report provides a way for Google Cloud to share data about how the policies and actions of governments and corporations affect privacy, security, and access to information?** 8 | 9 | - **Transparency reports** 10 | 11 | 3. **Which Google Cloud feature allows users to control their data's physical location?** 12 | 13 | - **Regions** 14 | 15 | 4. **Where can you find details about certifications and compliance standards met by Google Cloud?** 16 | 17 | - **Compliance resource center** 18 | 19 | 5. **Which term describes the concept that data is subject to the laws and regulations of the country where it resides?** 20 | 21 | - **Data sovereignty** 22 | -------------------------------------------------------------------------------- /Cloud Digital Leader Learning Path/06 Scaling with Google Cloud Operations/03 Sustainability with Google Cloud.md: -------------------------------------------------------------------------------- 1 | # Sustainability with Google Cloud 2 | 3 | 1. **Google's data centers were the first to achieve ISO 14001 certification. What is this standard’s purpose?** 4 | 5 | - **It’s a framework for an organization to enhance its environmental performance through improving resource efficiency and reducing waste.** 6 | 7 | 2. **Kaluza is an electric vehicle smart-charging solution. 
How does it use BigQuery and Looker Studio?** 8 | 9 | - **It uses BigQuery and Looker Studio to create dashboards that provide granular operational insights.** 10 | 11 | 3. **What sustainability goal does Google aim to achieve by the year 2030?** 12 | 13 | - **To be the first major company to operate completely carbon free.** 14 | 15 | -------------------------------------------------------------------------------- /Cloud Endpoints Qwik Start/quicklabgsp164.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gsutil cp gs://spls/gsp164/endpoints-quickstart.zip . 5 | unzip endpoints-quickstart.zip 6 | 7 | cd endpoints-quickstart 8 | 9 | cd scripts 10 | 11 | ./deploy_api.sh 12 | 13 | ./deploy_app.sh 14 | 15 | ./query_api.sh 16 | 17 | ./query_api.sh JFK 18 | 19 | ./deploy_api.sh ../openapi_with_ratelimit.yaml 20 | 21 | ./deploy_app.sh 22 | 23 | gcloud alpha services api-keys create --display-name="quicklab" 24 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=quicklab") 25 | export API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 26 | 27 | 28 | ./query_api_with_key.sh $API_KEY 29 | 30 | timeout 15s ./generate_traffic_with_key.sh $API_KEY && ./query_api_with_key.sh $API_KEY 31 | 32 | -------------------------------------------------------------------------------- /Cloud Filestore Qwik Start/quicklabgsp244.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud compute instances create nfs-client --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD 
--scopes=https://www.googleapis.com/auth/devstorage.read_only,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write,https://www.googleapis.com/auth/servicecontrol,https://www.googleapis.com/auth/service.management.readonly,https://www.googleapis.com/auth/trace.append --tags=http-server --create-disk=auto-delete=yes,boot=yes,device-name=nfs-client,image=projects/debian-cloud/global/images/debian-11-bullseye-v20231010,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 6 | 7 | 8 | gcloud services enable file.googleapis.com 9 | 10 | 11 | gcloud filestore instances create nfs-server --zone=$ZONE --tier=BASIC_HDD --file-share=name="vol1",capacity=1TB --network=name="default" -------------------------------------------------------------------------------- /Cloud Functions 2nd GenQwik Start/task2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Check the score for Task 6 first, then run the commands below 4 | SLOW_URL=$(gcloud functions describe slow-function --region $REGION --gen2 --format="value(serviceConfig.uri)") 5 | 6 | hey -n 10 -c 10 $SLOW_URL 7 | 8 | 9 | gcloud run services delete slow-function --region $REGION --quiet 10 | 11 | gcloud functions deploy slow-concurrent-function \ 12 | --gen2 \ 13 | --runtime go116 \ 14 | --entry-point HelloWorld \ 15 | --source . 
\ 16 | --region $REGION \ 17 | --trigger-http \ 18 | --allow-unauthenticated \ 19 | --min-instances 1 \ 20 | --max-instances 4 \ 21 | --quiet 22 | 23 | 24 | 25 | gcloud run deploy slow-concurrent-function \ 26 | --image=$REGION-docker.pkg.dev/$DEVSHELL_PROJECT_ID/gcf-artifacts/slow--concurrent--function:version_1 \ 27 | --concurrency=100 \ 28 | --cpu=1 \ 29 | --max-instances=4 \ 30 | --region=$REGION \ 31 | --project=$DEVSHELL_PROJECT_ID \ 32 | && gcloud run services update-traffic slow-concurrent-function --to-latest --region=$REGION 33 | 34 | -------------------------------------------------------------------------------- /Cloud IAM Qwik Start/quicklabgsp064.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gsutil mb -l us -b on gs://$DEVSHELL_PROJECT_ID 6 | 7 | 8 | echo "subscribe to quicklab " > sample.txt 9 | 10 | 11 | gsutil cp sample.txt gs://$DEVSHELL_PROJECT_ID 12 | 13 | 14 | gcloud projects remove-iam-policy-binding $DEVSHELL_PROJECT_ID \ 15 | --member=user:$USERNAME_2 \ 16 | --role=roles/viewer 17 | 18 | 19 | 20 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 21 | --member=user:$USERNAME_2 \ 22 | --role=roles/storage.objectViewer 23 | -------------------------------------------------------------------------------- /Cloud Natural Language API Qwik Start/quicklabgsp097.sh: -------------------------------------------------------------------------------- 1 | 2 | ZONE="$(gcloud compute instances list --project=$DEVSHELL_PROJECT_ID --format='value(ZONE)')" 3 | 4 | 5 | export GOOGLE_CLOUD_PROJECT=$(gcloud config get-value core/project) 6 | gcloud iam service-accounts create my-natlang-sa \ 7 | --display-name "my natural language service account" 8 | 9 | gcloud iam service-accounts keys create ~/key.json \ 10 | --iam-account my-natlang-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com 11 | 12 | export GOOGLE_APPLICATION_CREDENTIALS="/home/USER/key.json" 13 | 14 | gcloud compute ssh --zone 
"$ZONE" "linux-instance" --project "$DEVSHELL_PROJECT_ID" --quiet --command "gcloud ml language analyze-entities --content='Michelangelo Caravaggio, Italian painter, is known for \"The Calling of Saint Matthew\".' > result.json" 15 | 16 | -------------------------------------------------------------------------------- /Cloud Operations for GKE/quicklabgsp497.sh: -------------------------------------------------------------------------------- 1 | export REGION=$(gcloud compute project-info describe \ 2 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 3 | 4 | export ZONE=$(gcloud compute project-info describe \ 5 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 6 | 7 | 8 | PROJECT_ID=`gcloud config get-value project` 9 | 10 | export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)") 11 | 12 | 13 | gcloud config set compute/region $REGION 14 | gcloud config set compute/zone $ZONE 15 | 16 | 17 | gsutil cp gs://spls/gsp497/gke-monitoring-tutorial.zip . 18 | unzip gke-monitoring-tutorial.zip 19 | 20 | cd gke-monitoring-tutorial 21 | 22 | make create 23 | 24 | make teardown 25 | -------------------------------------------------------------------------------- /Cloud SQL for PostgreSQL Qwik Start/quicklabgsp152.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud sql instances create myinstance \ 6 | --database-version=POSTGRES_15 \ 7 | --tier=db-custom-2-7680 \ 8 | --region=$REGION \ 9 | --storage-type=SSD \ 10 | --storage-size=100GB 11 | -------------------------------------------------------------------------------- /Cloud SQL with Terraform/quicklabgsp234.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | mkdir sql-with-terraform 5 | cd sql-with-terraform 6 | gsutil cp -r gs://spls/gsp234/gsp234.zip . 
7 | 8 | unzip gsp234.zip 9 | 10 | terraform init 11 | 12 | terraform plan -out=tfplan 13 | 14 | terraform apply tfplan 15 | 16 | 17 | 18 | -------------------------------------------------------------------------------- /Cloud Scheduler: Qwik Start: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable cloudscheduler.googleapis.com 4 | 5 | gcloud pubsub topics create cron-topic 6 | 7 | gcloud pubsub subscriptions create cron-sub --topic cron-topic 8 | 9 | -------------------------------------------------------------------------------- /Cloud Spanner - Database Fundamentals: -------------------------------------------------------------------------------- 1 | 2 | gcloud spanner instances create banking-instance \ 3 | --config=regional-us-central1 \ 4 | --description="quicklab_1" \ 5 | --nodes=1 6 | 7 | gcloud spanner databases create banking-db --instance=banking-instance 8 | 9 | gcloud spanner instances create banking-instance-2 \ 10 | --config=regional-us-central1 \ 11 | --description="quicklab_2" \ 12 | --nodes=2 13 | 14 | gcloud spanner databases create banking-db-2 --instance=banking-instance-2 15 | -------------------------------------------------------------------------------- /Cloud Spanner Qwik Start/quicklabgsp102.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud spanner instances create test-instance \ 5 | --config=regional-$REGION \ 6 | --description="Test Instance" \ 7 | --nodes=1 8 | 9 | 10 | gcloud spanner databases create example-db --instance=test-instance 11 | 12 | gcloud spanner databases ddl update example-db --instance=test-instance \ 13 | --ddl="CREATE TABLE Singers ( 14 | SingerId INT64 NOT NULL, 15 | FirstName STRING(1024), 16 | LastName STRING(1024), 17 | SingerInfo BYTES(MAX), 18 | BirthDate DATE, 19 | ) PRIMARY KEY(SingerId);" 20 | -------------------------------------------------------------------------------- /Cloud 
Storage AWS/task2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | # 2nd Project ID 4 | 5 | export REGION="${ZONE%-*}" 6 | 7 | gsutil mb -p $DEVSHELL_PROJECT_ID -c STANDARD -l $REGION -b on gs://$DEVSHELL_PROJECT_ID-2 8 | 9 | gsutil uniformbucketlevelaccess set off gs://$DEVSHELL_PROJECT_ID-2 10 | 11 | echo "subscribe to quicklab" > test.txt 12 | 13 | gsutil cp test.txt gs://$DEVSHELL_PROJECT_ID-2 14 | 15 | 16 | 17 | 18 | # Create the service account 19 | gcloud iam service-accounts create cross-project-storage --display-name "Cross-Project Storage Account" 20 | 21 | # Grant Storage Object Viewer role to the service account 22 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member="serviceAccount:cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.objectViewer" 23 | 24 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member="serviceAccount:cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.objectAdmin" 25 | 26 | # Generate and download the JSON key file 27 | gcloud iam service-accounts keys create credentials.json --iam-account=cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com 28 | -------------------------------------------------------------------------------- /Cloud Storage Qwik Start CLISDK/quicklabgsp074.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud config set compute/region $REGION 6 | 7 | 8 | gsutil mb gs://$DEVSHELL_PROJECT_ID 9 | 10 | curl https://upload.wikimedia.org/wikipedia/commons/thumb/a/a4/Ada_Lovelace_portrait.jpg/800px-Ada_Lovelace_portrait.jpg --output ada.jpg 11 | 12 | gsutil cp ada.jpg gs://$DEVSHELL_PROJECT_ID 13 | 14 | gsutil cp -r gs://$DEVSHELL_PROJECT_ID/ada.jpg . 
15 | 16 | gsutil cp gs://$DEVSHELL_PROJECT_ID/ada.jpg gs://$DEVSHELL_PROJECT_ID/image-folder/ 17 | 18 | gsutil acl ch -u AllUsers:R gs://$DEVSHELL_PROJECT_ID/ada.jpg -------------------------------------------------------------------------------- /Cloud Storage Qwik Start Cloud Console/quicklabgsp073.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud config set compute/region $REGION 6 | 7 | 8 | gsutil mb gs://$DEVSHELL_PROJECT_ID 9 | 10 | curl https://upload.wikimedia.org/wikipedia/commons/thumb/a/a4/Ada_Lovelace_portrait.jpg/800px-Ada_Lovelace_portrait.jpg --output ada.jpg 11 | 12 | mv ada.jpg kitten.png 13 | 14 | gsutil cp kitten.png gs://$DEVSHELL_PROJECT_ID 15 | 16 | gsutil cp -r gs://$DEVSHELL_PROJECT_ID/kitten.png . 17 | 18 | gsutil cp gs://$DEVSHELL_PROJECT_ID/kitten.png gs://$DEVSHELL_PROJECT_ID/image-folder/ 19 | 20 | 21 | gsutil iam ch allUsers:objectViewer gs://$DEVSHELL_PROJECT_ID 22 | -------------------------------------------------------------------------------- /Cloud Storage/task2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | # 2nd Project ID 4 | 5 | export REGION="${ZONE%-*}" 6 | 7 | gsutil mb -p $DEVSHELL_PROJECT_ID -c STANDARD -l $REGION -b on gs://$DEVSHELL_PROJECT_ID-2 8 | 9 | gsutil uniformbucketlevelaccess set off gs://$DEVSHELL_PROJECT_ID-2 10 | 11 | echo "subscribe to quicklab" > test.txt 12 | 13 | gsutil cp test.txt gs://$DEVSHELL_PROJECT_ID-2 14 | 15 | 16 | 17 | 18 | # Create the service account 19 | gcloud iam service-accounts create cross-project-storage --display-name "Cross-Project Storage Account" 20 | 21 | # Grant Storage Object Viewer role to the service account 22 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member="serviceAccount:cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.objectViewer" 23 | 24 | gcloud projects add-iam-policy-binding 
$DEVSHELL_PROJECT_ID --member="serviceAccount:cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.objectAdmin" 25 | 26 | # Generate and download the JSON key file 27 | gcloud iam service-accounts keys create credentials.json --iam-account=cross-project-storage@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com 28 | -------------------------------------------------------------------------------- /Compute Engine Qwik Start Windows/quicklabgsp093.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud compute instances create quicklab --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --scopes=https://www.googleapis.com/auth/devstorage.read_only,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write,https://www.googleapis.com/auth/servicecontrol,https://www.googleapis.com/auth/service.management.readonly,https://www.googleapis.com/auth/trace.append --create-disk=auto-delete=yes,boot=yes,device-name=quicklab,image=projects/windows-cloud/global/images/windows-server-2022-dc-v20230913,mode=rw,size=50,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 4 | 5 | sleep 60 6 | 7 | gcloud compute instances get-serial-port-output quicklab --zone=$ZONE 8 | 9 | echo "Y" | gcloud compute reset-windows-password quicklab --zone $ZONE --user admin 10 | -------------------------------------------------------------------------------- /Configuring Liveness and Readiness Probes/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | source <(kubectl completion bash) 8 | 9 | 
10 | gcloud container clusters get-credentials $CLUSTER_NAME --zone $ZONE 11 | 12 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 13 | 14 | 15 | ln -s ~/training-data-analyst/courses/ak8s/v1.1 ~/ak8s 16 | 17 | cd ~/ak8s/Probes/ 18 | 19 | 20 | kubectl create -f exec-liveness.yaml 21 | 22 | kubectl get pod liveness-exec 23 | 24 | 25 | kubectl create -f readiness-deployment.yaml 26 | 27 | kubectl get service readiness-demo-svc 28 | 29 | kubectl exec readiness-demo-pod -- touch /tmp/healthz 30 | 31 | kubectl describe pod readiness-demo-pod | grep ^Conditions -A 5 -------------------------------------------------------------------------------- /Configuring Networks via gcloud/quicklabgsp630.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | export REGION="${ZONE%-*}" 5 | 6 | gcloud compute networks create labnet --subnet-mode=custom 7 | 8 | gcloud compute networks subnets create labnet-sub \ 9 | --network labnet \ 10 | --region "$REGION" \ 11 | --range 10.0.0.0/28 12 | 13 | gcloud compute networks list 14 | 15 | gcloud compute firewall-rules create labnet-allow-internal \ 16 | --network=labnet \ 17 | --action=ALLOW \ 18 | --rules=icmp,tcp:22 \ 19 | --source-ranges=0.0.0.0/0 20 | 21 | gcloud compute networks create privatenet --subnet-mode=custom 22 | 23 | 24 | gcloud compute networks subnets create private-sub \ 25 | --network=privatenet \ 26 | --region="$REGION" \ 27 | --range 10.1.0.0/28 28 | 29 | 30 | gcloud compute firewall-rules create privatenet-deny \ 31 | --network=privatenet \ 32 | --action=DENY \ 33 | --rules=icmp,tcp:22 \ 34 | --source-ranges=0.0.0.0/0 35 | 36 | gcloud compute firewall-rules list --sort-by=NETWORK 37 | 38 | 39 | gcloud compute instances create pnet-vm \ 40 | --zone="$ZONE" \ 41 | --machine-type=n1-standard-1 \ 42 | --subnet=private-sub 43 | 44 | 45 | 46 | gcloud compute instances create lnet-vm \ 47 | --zone="$ZONE" \ 48 | --machine-type=n1-standard-1 \ 49 | 
--subnet=labnet-sub 50 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Configuring%20VPC%20Network%20Peering/task1.sh 5 | 6 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Configuring%20VPC%20Network%20Peering/task2.sh 7 | 8 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Configuring%20VPC%20Network%20Peering/task3.sh 9 | 10 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Configuring%20VPC%20Network%20Peering/task4.sh 11 | 12 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Configuring%20VPC%20Network%20Peering/task5.sh 13 | 14 | chmod +x task1.sh 15 | chmod +x task2.sh 16 | chmod +x task3.sh 17 | chmod +x task4.sh 18 | chmod +x task5.sh 19 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/task1.sh: -------------------------------------------------------------------------------- 1 | #TASK 1 2 | 3 | ZONE=$(gcloud compute instances list --format="value(ZONE)" | tail -n 2 | head -n 1) 4 | 5 | # Retrieve the internal and external IPs and store them in variables 6 | INTERNAL_IP_privatenet=$(gcloud compute instances describe privatenet-us-vm --zone=$ZONE --format="get(networkInterfaces[0].networkIP)") 7 | EXTERNAL_IP_privatenet=$(gcloud compute instances describe privatenet-us-vm --zone=$ZONE --format="get(networkInterfaces[0].accessConfigs[0].natIP)") 8 | 9 | 10 | BOLD_GREEN="\033[1;32m" 11 | BOLD_RED="\033[1;31m" # Changed to bold red 12 | RESET="\033[0m" 13 | 14 | echo -e "${BOLD_RED}External:- ${BOLD_GREEN}$EXTERNAL_IP_privatenet${RESET}" 15 | echo -e "${BOLD_RED}Internal IP: :- 
${BOLD_GREEN}$INTERNAL_IP_privatenet${RESET}" 16 | 17 | 18 | gcloud compute ssh --zone "$ZONE" "mynet-us-vm" --project "$DEVSHELL_PROJECT_ID" --quiet 19 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/task2.sh: -------------------------------------------------------------------------------- 1 | #TASK 2 2 | 3 | ZONE=$(gcloud compute instances list --format="value(ZONE)" | tail -n 2 | head -n 1) 4 | 5 | 6 | gcloud compute networks peerings create peering-1-2 \ 7 | --network=mynetwork \ 8 | --peer-network=privatenet \ 9 | --auto-create-routes 10 | 11 | 12 | gcloud compute networks peerings create peering-2-1 \ 13 | --network=privatenet \ 14 | --peer-network=mynetwork \ 15 | --auto-create-routes 16 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/task3.sh: -------------------------------------------------------------------------------- 1 | #TASK 3 2 | ZONE=$(gcloud compute instances list --format="value(ZONE)" | tail -n 2 | head -n 1) 3 | 4 | INTERNAL_IP_privatenet=$(gcloud compute instances describe privatenet-us-vm --zone=$ZONE --format="get(networkInterfaces[0].networkIP)") 5 | 6 | BOLD_GREEN="\033[1;32m" 7 | BOLD_RED="\033[1;31m" # Changed to bold red 8 | RESET="\033[0m" 9 | 10 | echo -e "${BOLD_RED}INTERNAL_IP:- ${BOLD_GREEN}$INTERNAL_IP_privatenet${RESET}" 11 | 12 | gcloud compute ssh --zone "$ZONE" "mynet-us-vm" --project "$DEVSHELL_PROJECT_ID" --quiet 13 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/task4.sh: -------------------------------------------------------------------------------- 1 | #TASK 4 2 | ZONE=$(gcloud compute instances list --format="value(ZONE)" | tail -n 2 | head -n 1) 3 | 4 | INTERNAL_IP_mynet_us_vm=$(gcloud compute instances describe mynet-us-vm --zone=$ZONE --format="get(networkInterfaces[0].networkIP)") 5 | 6 | BOLD_GREEN="\033[1;32m" 7 
| BOLD_RED="\033[1;31m" # Changed to bold red 8 | RESET="\033[0m" 9 | 10 | echo -e "${BOLD_RED}INTERNAL_IP :- ${BOLD_GREEN}$INTERNAL_IP_mynet_us_vm${RESET}" 11 | 12 | gcloud compute ssh --zone "$ZONE" "privatenet-us-vm" --project "$DEVSHELL_PROJECT_ID" --quiet 13 | -------------------------------------------------------------------------------- /Configuring VPC Network Peering/task5.sh: -------------------------------------------------------------------------------- 1 | #TASK 5 2 | 3 | ZONE=$(gcloud compute instances list --format="value(ZONE)" | tail -n 2 | head -n 1) 4 | 5 | gcloud compute networks peerings delete peering-1-2 --network=mynetwork 6 | 7 | INTERNAL_IP_privatenet=$(gcloud compute instances describe privatenet-us-vm --zone=$ZONE --format="get(networkInterfaces[0].networkIP)") 8 | 9 | BOLD_GREEN="\033[1;32m" 10 | BOLD_RED="\033[1;31m" # Changed to bold red 11 | RESET="\033[0m" 12 | 13 | echo -e "${BOLD_RED}INTERNAL_IP :- ${BOLD_GREEN}$INTERNAL_IP_privatenet${RESET}" 14 | 15 | gcloud compute ssh --zone "$ZONE" "mynet-us-vm" --project "$DEVSHELL_PROJECT_ID" --quiet 16 | -------------------------------------------------------------------------------- /Connect and Configure Data for your AppSheet App/quicklab.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Connect and Configure Data for your AppSheet App/quicklab.xlsx -------------------------------------------------------------------------------- /Connect to Cloud SQL from an Application in Google Kubernetes Engine/automate_create.exp: -------------------------------------------------------------------------------- 1 | #!/usr/bin/expect -f 2 | 3 | set timeout -1 4 | 5 | set pg_email [lindex $argv 0] 6 | set dbadmin_pass "dbadmin" 7 | set user_pass "teJMfkwoewWG" 8 | 9 | spawn ./create.sh dbadmin $pg_email 10 | 11 | expect "Enter a password for dbadmin" 12 | send 
"$dbadmin_pass\r" 13 | 14 | expect "Enter a password for $pg_email" 15 | send "$user_pass\r" 16 | 17 | expect eof 18 | -------------------------------------------------------------------------------- /Connect to Cloud SQL from an Application in Google Kubernetes Engine/quicklabgsp449.sh: -------------------------------------------------------------------------------- 1 | export REGION=$(gcloud compute project-info describe \ 2 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 3 | 4 | export ZONE=$(gcloud compute project-info describe \ 5 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 6 | 7 | 8 | PROJECT_ID=`gcloud config get-value project` 9 | 10 | export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)") 11 | 12 | 13 | gcloud config set compute/region $REGION 14 | gcloud config set compute/zone $ZONE 15 | 16 | gsutil cp gs://spls/gsp449/gke-cloud-sql-postgres-demo.tar.gz . 17 | tar -xzvf gke-cloud-sql-postgres-demo.tar.gz 18 | 19 | cd gke-cloud-sql-postgres-demo 20 | 21 | PG_EMAIL=$(gcloud config get-value account) 22 | 23 | ./create.sh dbadmin $PG_EMAIL 24 | -------------------------------------------------------------------------------- /Conversational Agents Managing Environments/quicklab.blob: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Conversational Agents Managing Environments/quicklab.blob -------------------------------------------------------------------------------- /Create Analyze BigQuery data in Connected Sheets: Challenge Lab: -------------------------------------------------------------------------------- 1 | IF THIS VIDEO WAS HELPFUL, PLEASE LIKE IT SO YOUTUBE PROMOTES IT TO OTHERS AND HELPS THEM SOLVE THIS LAB 2 | 3 | 4 | 5 | 6 | TASK 2:- 7 | 8 | =COUNTIF(tlc_yellow_trips_2022!airport_fee, "1") 9 | 10 | 11 | TASK 5:- 12 | 
13 | =IF(fare_amount>0,tip_amount/fare_amount*100,0) 14 | 15 | 16 | 17 | 18 | THANKS FOR WATCHING :O 19 | -------------------------------------------------------------------------------- /Create Synthetic Speech Using quicklab/quicklabgsp222.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud services enable texttospeech.googleapis.com --project=$DEVSHELL_PROJECT_ID 5 | 6 | sudo apt-get install -y virtualenv 7 | 8 | python3 -m venv venv 9 | 10 | source venv/bin/activate 11 | 12 | 13 | gcloud iam service-accounts create tts-qwiklab 14 | 15 | export PROJECT_ID=$(gcloud config get-value project) 16 | 17 | gcloud iam service-accounts keys create tts-qwiklab.json --iam-account tts-qwiklab@$PROJECT_ID.iam.gserviceaccount.com 18 | 19 | export GOOGLE_APPLICATION_CREDENTIALS=tts-qwiklab.json 20 | 21 | 22 | 23 | -------------------------------------------------------------------------------- /Create a Secure Data Lake on Cloud Storage Challenge Lab/quicklabform3.sh: -------------------------------------------------------------------------------- 1 | 2 | echo "" 3 | echo "" 4 | echo "Please export the values." 
5 | 6 | 7 | # Prompt user to input the region 8 | read -p "Enter REGION: " REGION 9 | 10 | 11 | #TASK 1 12 | bq mk --location=US Raw_data 13 | 14 | bq load --source_format=AVRO Raw_data.public-data gs://spls/gsp1145/users.avro 15 | 16 | 17 | # TASK 2 18 | 19 | 20 | gcloud dataplex zones create temperature-raw-data \ 21 | --lake=public-lake \ 22 | --location=$REGION \ 23 | --type=RAW \ 24 | --resource-location-type=SINGLE_REGION \ 25 | --display-name="temperature-raw-data" 26 | 27 | #TASK 3 28 | 29 | 30 | gcloud dataplex assets create customer-details-dataset \ 31 | --location=$REGION \ 32 | --lake=public-lake \ 33 | --zone=temperature-raw-data \ 34 | --resource-type=BIGQUERY_DATASET \ 35 | --resource-name=projects/$DEVSHELL_PROJECT_ID/datasets/customer_reference_data \ 36 | --display-name="Customer Details Dataset" \ 37 | --discovery-enabled 38 | 39 | 40 | -------------------------------------------------------------------------------- /Create a Secure Data Lake on Cloud Storage: Challenge Lab: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION= 4 | 5 | export USER2= 6 | 7 | gcloud services enable dataplex.googleapis.com 8 | 9 | gsutil mb -l $REGION gs://"$DEVSHELL_PROJECT_ID-bucket" 10 | 11 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=user:$USER2 --role=roles/serviceusage.serviceUsageAdmin 12 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member=user:$USER2 --role=roles/dataplex.admin 13 | 14 | 15 | -------------------------------------------------------------------------------- /Create a VPC using Cloud Shell/quicklab.sh: -------------------------------------------------------------------------------- 1 | export REGION=$(gcloud compute project-info describe \ 2 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 3 | 4 | export ZONE=$(gcloud compute project-info describe \ 5 | 
--format="value(commonInstanceMetadata.items[google-compute-default-zone])") 6 | 7 | 8 | PROJECT_ID=`gcloud config get-value project` 9 | 10 | export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)") 11 | 12 | 13 | # echo "" 14 | # echo "" 15 | # echo "Please export the values." 16 | 17 | 18 | 19 | gcloud compute networks create labnet --subnet-mode=custom 20 | 21 | gcloud compute networks subnets create labnet-sub \ 22 | --network labnet \ 23 | --region $REGION \ 24 | --range 10.0.0.0/28 25 | -------------------------------------------------------------------------------- /Create and Manage AlloyDB Instances Challenge Lab/quicklabgsp395.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud beta alloydb clusters create lab-cluster \ 4 | --password=Change3Me \ 5 | --network=peering-network \ 6 | --region=$REGION \ 7 | --project=$DEVSHELL_PROJECT_ID 8 | 9 | gcloud beta alloydb instances create lab-instance \ 10 | --instance-type=PRIMARY \ 11 | --cpu-count=2 \ 12 | --region=$REGION \ 13 | --cluster=lab-cluster \ 14 | --project=$DEVSHELL_PROJECT_ID 15 | 16 | gcloud alloydb instances create lab-instance-rp1 \ 17 | --cluster=lab-cluster \ 18 | --region=$REGION \ 19 | --instance-type=READ_POOL \ 20 | --cpu-count=2 \ 21 | --read-pool-node-count=2 22 | 23 | gcloud beta alloydb backups create lab-backup --region=$REGION --cluster=lab-cluster -------------------------------------------------------------------------------- /Create synthetic speech from text using the Text to Speech API/quicklabgsp074.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud config set compute/region $REGION 5 | 6 | gsutil mb gs://$DEVSHELL_PROJECT_ID 7 | 8 | curl https://upload.wikimedia.org/wikipedia/commons/thumb/a/a4/Ada_Lovelace_portrait.jpg/800px-Ada_Lovelace_portrait.jpg --output ada.jpg 9 | 10 | gsutil cp ada.jpg gs://$DEVSHELL_PROJECT_ID 11 | 12 | 
gsutil cp -r gs://$DEVSHELL_PROJECT_ID/ada.jpg . 13 | 14 | gsutil cp gs://$DEVSHELL_PROJECT_ID/ada.jpg gs://$DEVSHELL_PROJECT_ID/image-folder/ 15 | 16 | gsutil acl ch -u AllUsers:R gs://$DEVSHELL_PROJECT_ID/ada.jpg -------------------------------------------------------------------------------- /Create, Modify and Remove Files and Folders in Linux: -------------------------------------------------------------------------------- 1 | 2 | mkdir dir_name 3 | cd /home/user/Documents 4 | mkdir red blue green yellow magenta 5 | cd /home/user/Pictures 6 | mv .apple .banana .broccoli .milk /home/user/Documents/Hidden 7 | mv /home/user/Movies/Europe\ Pictures /home/user/Pictures 8 | cd /home/user/Pictures 9 | mv /home/user/Images/Vacation.JPG . 10 | cd /home/user/Music 11 | rm Best_of_the_90s 80s_jams Classical 12 | rmdir Rock 13 | grep -rw /home/user/Downloads -e "vacation" 14 | mv /home/user/Downloads/Iceland /home/user/Downloads/Japan /home/user/Documents -------------------------------------------------------------------------------- /Creating a Persistent Disk/quicklabgsp004.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud compute instances create gcelab --zone $ZONE --machine-type e2-standard-2 5 | 6 | gcloud compute disks create mydisk --size=200GB \ 7 | --zone $ZONE 8 | 9 | gcloud compute instances attach-disk gcelab --disk mydisk --zone $ZONE 10 | 11 | 12 | gcloud compute ssh gcelab --zone $ZONE --quiet --command "sudo mkdir /mnt/mydisk && 13 | sudo mkfs.ext4 -F -E lazy_itable_init=0,lazy_journal_init=0,discard /dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 && 14 | sudo mount -o discard,defaults /dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 /mnt/mydisk && 15 | echo '/dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 /mnt/mydisk ext4 defaults 1 1' | sudo tee -a /etc/fstab" 16 | 17 | 18 | 19 | 
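The persistent-disk command above appends a mount line to `/etc/fstab` every time it runs, so re-running the script duplicates the entry. A guard like the following makes the append idempotent; this is a generic sketch that uses a temp file in place of `/etc/fstab`, and `add_fstab_entry` is a helper name introduced here, not part of the lab:

```shell
# Append an fstab entry only if it is not already present.
FSTAB="$(mktemp)"   # stand-in for /etc/fstab in this sketch
ENTRY='/dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 /mnt/mydisk ext4 defaults 1 1'

add_fstab_entry() {
  # grep -qxF: quiet, whole-line, fixed-string match
  grep -qxF "$1" "$2" || printf '%s\n' "$1" >> "$2"
}

add_fstab_entry "$ENTRY" "$FSTAB"
add_fstab_entry "$ENTRY" "$FSTAB"   # second call is a no-op
```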
-------------------------------------------------------------------------------- /Creating a Real-time Data Pipeline using Eventarc and MongoDB Atla: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | API_KEY = z2e6t1j0G8r1cw1oPnLnvd4rgaG4kwDefpF8msf97HiCA8Sx1bMaHv0YBCKsz3aL 5 | 6 | 7 | URL = https://ap-south-1.aws.data.mongodb-api.com/app/data-oonhe/endpoint/data/v1 8 | 9 | 10 | 11 | -------------------------------------------------------------------------------- /Creating and Populating a Bigtable Instance/quicklabgsp1054.sh: -------------------------------------------------------------------------------- 1 | #GSP1054 2 | 3 | 4 | 5 | 6 | 7 | 8 | export REGION="${ZONE%-*}" 9 | 10 | gcloud auth list 11 | 12 | gcloud services disable dataflow.googleapis.com 13 | gcloud services enable dataflow.googleapis.com 14 | 15 | 16 | 17 | gsutil mb gs://$DEVSHELL_PROJECT_ID 18 | 19 | 20 | gcloud bigtable instances tables create UserSessions \ 21 | --instance=personalized-sales \ 22 | --project=$DEVSHELL_PROJECT_ID \ 23 | --column-families=Interactions,Sales 24 | 25 | sleep 100 26 | 27 | gcloud dataflow jobs run import-usersessions --gcs-location gs://dataflow-templates-$REGION/latest/GCS_SequenceFile_to_Cloud_Bigtable --region $REGION --staging-location gs://$DEVSHELL_PROJECT_ID/temp --parameters bigtableProject=$DEVSHELL_PROJECT_ID,bigtableInstanceId=personalized-sales,bigtableTableId=UserSessions,sourcePattern=gs://cloud-training/OCBL377/retail-interactions-sales-00000-of-00001,mutationThrottleLatencyMs=0 28 | 29 | 30 | 31 | echo -e "\e[1m\e[33mClick here: https://console.cloud.google.com/dataflow/jobs?project=$DEVSHELL_PROJECT_ID&walkthrough_id=dataflow_index\e[0m" 32 | 33 | 34 | -------------------------------------------------------------------------------- /Data Analytics SME Academy Loading Data into Google Cloud SQL/quicklabgsp196.sh: -------------------------------------------------------------------------------- 1 | 2 | 
git clone \ 3 | https://github.com/GoogleCloudPlatform/data-science-on-gcp/ 4 | 5 | cd data-science-on-gcp/03_sqlstudio 6 | 7 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 8 | export BUCKET=${PROJECT_ID}-ml 9 | 10 | gsutil cp create_table.sql \ 11 | gs://$BUCKET/create_table.sql 12 | 13 | gcloud sql instances create flights \ 14 | --database-version=POSTGRES_13 --cpu=2 --memory=8GiB \ 15 | --region=$REGION --root-password=Passw0rd 16 | 17 | 18 | export ADDRESS=$(curl -s http://ipecho.net/plain)/32 19 | 20 | gcloud sql instances patch flights --authorized-networks $ADDRESS --quiet 21 | 22 | gcloud sql databases create bts --instance=flights 23 | -------------------------------------------------------------------------------- /Dataflow Qwik Start Templates/quicklabgsp192.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | gcloud services disable dataflow.googleapis.com 7 | 8 | gcloud services enable dataflow.googleapis.com 9 | 10 | sleep 20 11 | 12 | bq mk taxirides 13 | 14 | bq mk \ 15 | --time_partitioning_field timestamp \ 16 | --schema ride_id:string,point_idx:integer,latitude:float,longitude:float,\ 17 | timestamp:timestamp,meter_reading:float,meter_increment:float,ride_status:string,\ 18 | passenger_count:integer -t taxirides.realtime 19 | 20 | 21 | gsutil mb gs://$DEVSHELL_PROJECT_ID/ 22 | 23 | sleep 20 24 | 25 | gcloud dataflow jobs run iotflow \ 26 | --gcs-location gs://dataflow-templates-$REGION/latest/PubSub_to_BigQuery \ 27 | --region $REGION \ 28 | --worker-machine-type e2-medium \ 29 | --staging-location gs://$DEVSHELL_PROJECT_ID/temp \ 30 | --parameters inputTopic=projects/pubsub-public-data/topics/taxirides-realtime,outputTableSpec=$DEVSHELL_PROJECT_ID:taxirides.realtime -------------------------------------------------------------------------------- /Dataplex Qwik Start Console/Dataplex Qwik Start Console.md: 
-------------------------------------------------------------------------------- 1 | ## Dataplex: Qwik Start - Console 2 | 3 | ## 4 | 5 | ``` 6 | export REGION= 7 | ``` 8 | 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Dataplex%20Qwik%20Start%20Console/quicklabtask1.sh 12 | 13 | sudo chmod +x quicklabtask1.sh 14 | 15 | ./quicklabtask1.sh 16 | ``` 17 | ### ***Now Check the score for First 3 Task & then go ahead with next Commands.*** 18 | 19 | ## 20 | 21 | 22 | 23 | ``` 24 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Dataplex%20Qwik%20Start%20Console/quicklabtask2.sh 25 | 26 | sudo chmod +x quicklabtask2.sh 27 | 28 | ./quicklabtask2.sh 29 | ``` 30 | 31 | ### Congratulation!!! -------------------------------------------------------------------------------- /Dataplex Qwik Start Console/quicklabtask2.sh: -------------------------------------------------------------------------------- 1 | 2 | LAKE_ID="sensors" 3 | ZONE_ID="temperature-raw-data" 4 | ASSET_ID="measurements" 5 | 6 | 7 | # Detach (delete) the asset 8 | gcloud dataplex assets delete $ASSET_ID --lake=$LAKE_ID --zone=$ZONE_ID --location=$REGION --quiet 9 | 10 | # Delete the zone 11 | gcloud dataplex zones delete $ZONE_ID --lake=$LAKE_ID --location=$REGION --quiet 12 | 13 | # Delete the lake 14 | gcloud dataplex lakes delete $LAKE_ID --location=$REGION --quiet 15 | -------------------------------------------------------------------------------- /Dataproc Qwik Start Command Line/quicklabgsp104.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud config set dataproc/region $REGION 4 | 5 | PROJECT_ID=$(gcloud config get-value project) && \ 6 | gcloud config set project $PROJECT_ID 7 | 8 | PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)') 9 | 10 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 11 | 
--member=serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com \ 12 | --role=roles/storage.admin 13 | 14 | gcloud compute networks subnets update default --region=$REGION --enable-private-ip-google-access 15 | 16 | gcloud dataproc clusters create example-cluster --worker-boot-disk-size 500 --worker-machine-type=e2-standard-4 --master-machine-type=e2-standard-4 --quiet 17 | 18 | gcloud dataproc jobs submit spark --cluster example-cluster \ 19 | --class org.apache.spark.examples.SparkPi \ 20 | --jars file:///usr/lib/spark/examples/jars/spark-examples.jar -- 1000 -------------------------------------------------------------------------------- /Debugging Apps on Google Kubernetes Engine.md: -------------------------------------------------------------------------------- 1 | ## Debugging Apps on Google Kubernetes Engine 2 | 3 | ## 4 | 5 | 6 | ### ```Now you have to export the ZONE from Setup and requirements task``` 7 | 8 | ```bash 9 | export ZONE= 10 | ``` 11 | 12 | #### 13 | 14 | ``` 15 | gcloud config set compute/zone $ZONE 16 | 17 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 18 | 19 | gcloud container clusters get-credentials central --zone $ZONE 20 | 21 | git clone https://github.com/xiangshen-dk/microservices-demo.git 22 | 23 | cd microservices-demo 24 | 25 | kubectl apply -f release/kubernetes-manifests.yaml 26 | 27 | sleep 30 28 | 29 | gcloud logging metrics create Error_Rate_SLI \ 30 | --description="subscribe to quicklab" \ 31 | --log-filter="resource.type=\"k8s_container\" severity=ERROR labels.\"k8s-pod/app\": \"recommendationservice\"" 32 | ``` 33 | 34 | 35 | 36 | 37 | -------------------------------------------------------------------------------- /Deploy Google Cloud Framework Data Foundation for SAP: Challenge Lab: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT_ID= 4 | 5 | TASK 1:- 6 | 7 | export PROJECT_ID=$(gcloud config get-value project) 8 | gcloud 
services enable bigquery.googleapis.com \ 9 | cloudbuild.googleapis.com \ 10 | composer.googleapis.com \ 11 | storage-component.googleapis.com \ 12 | cloudresourcemanager.googleapis.com 13 | 14 | 15 | TASK 2:- 16 | 17 | bq mk --dataset CDC_PROCESSED 18 | bq mk --dataset SAP_REPLICATED_DATA 19 | bq mk --dataset SAP_REPORTING 20 | 21 | 22 | 23 | TASK 4:- 24 | 25 | gsutil mb -l US gs://$PROJECT_ID-sap-cortex 26 | export GCS_BUCKET=$PROJECT_ID-sap-cortex 27 | 28 | 29 | TASK 5:- 30 | 31 | cd ~ 32 | 33 | git clone --recurse-submodules https://github.com/GoogleCloudPlatform/cortex-data-foundation 34 | 35 | cd cortex-data-foundation 36 | 37 | gcloud builds submit --project $PROJECT_ID \ 38 | --substitutions \ 39 | _PJID_SRC=$PROJECT_ID,_PJID_TGT=$PROJECT_ID,_DS_CDC=CDC_PROCESSED,_DS_RAW=SAP_REPLICATED_DATA,_DS_REPORTING=SAP_REPORTING,_GCS_BUCKET=$GCS_BUCKET,_TGT_BUCKET=$GCS_BUCKET,_TEST_DATA=true,_DEPLOY_CDC=true 40 | 41 | 42 | 43 | 44 | 45 | -------------------------------------------------------------------------------- /Deploy Kubernetes Load Balancer Service with Terraform/quicklabgsp233.sh: -------------------------------------------------------------------------------- 1 | 2 | export REGION="${ZONE%-*}" 3 | 4 | gsutil -m cp -r gs://spls/gsp233/* . 
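The `export REGION="${ZONE%-*}"` line above derives the region from the zone with POSIX parameter expansion: `%-*` strips the shortest trailing `-…` suffix (the zone letter). A minimal illustration, with an example zone value:

```shell
ZONE="us-central1-a"    # example value; the labs export ZONE beforehand
REGION="${ZONE%-*}"     # shortest trailing "-*" match removed
echo "$REGION"          # us-central1
```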
5 | 6 | cd tf-gke-k8s-service-lb 7 | 8 | terraform init 9 | 10 | terraform apply -var="region=$REGION" -var="location=$ZONE" --auto-approve 11 | 12 | 13 | -------------------------------------------------------------------------------- /Deploy Microsoft SQL Server to Compute Engine/quicklabgsp031.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud compute instances create sqlserver-lab --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --scopes=https://www.googleapis.com/auth/devstorage.read_only,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write,https://www.googleapis.com/auth/service.management.readonly,https://www.googleapis.com/auth/servicecontrol,https://www.googleapis.com/auth/trace.append --create-disk=auto-delete=yes,boot=yes,device-name=sqlserver-lab,image=projects/windows-sql-cloud/global/images/sql-2016-web-windows-2016-dc-v20240711,mode=rw,size=50,type=pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 6 | 7 | gcloud compute reset-windows-password sqlserver-lab --zone=$ZONE --quiet -------------------------------------------------------------------------------- /Deploy Scale and Update Your Website on Google Kubernetes Engine/quicklabgsp663.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud config set compute/zone $ZONE 6 | 7 | gcloud services enable container.googleapis.com 8 | gcloud services enable cloudbuild.googleapis.com 9 | 10 | gcloud container clusters create fancy-cluster --num-nodes 3 11 | 12 | cd ~ 13 | 14 | git clone https://github.com/googlecodelabs/monolith-to-microservices.git 15 | 
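Re-running this script fails at `git clone` once the target directory exists. A small guard can skip the clone on re-runs; this is a sketch, and `need_clone` is a helper name introduced here, not part of the lab:

```shell
# Return success (0) when the directory does not yet contain a git checkout.
need_clone() { [ ! -d "$1/.git" ]; }

if need_clone "$HOME/monolith-to-microservices"; then
  echo "would clone monolith-to-microservices"
fi
```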
16 | cd ~/monolith-to-microservices 17 | 18 | ./setup.sh 19 | 20 | nvm install --lts 21 | 22 | cd ~/monolith-to-microservices/monolith 23 | 24 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 . 25 | 26 | kubectl create deployment monolith --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 27 | 28 | kubectl expose deployment monolith --type=LoadBalancer --port 80 --target-port 8080 29 | 30 | kubectl scale deployment monolith --replicas=3 31 | 32 | cd ~/monolith-to-microservices/react-app/src/pages/Home 33 | mv index.js.new index.js 34 | 35 | cd ~/monolith-to-microservices/react-app 36 | npm run build:monolith 37 | 38 | cd ~/monolith-to-microservices/monolith 39 | 40 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 . 41 | 42 | kubectl set image deployment/monolith monolith=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 43 | 44 | -------------------------------------------------------------------------------- /Deploy a Compute Instance with a Remote Startup Script Challenge Lab/resources-install-web.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | apt-get update 3 | apt-get install -y apache2 -------------------------------------------------------------------------------- /Deploy a Generative AI solution using a RAG Framework to Google Cloud Challenge Lab L400/requirements.txt: -------------------------------------------------------------------------------- 1 | flask 2 | Jinja2 3 | pytest 4 | pyyaml 5 | google-cloud-aiplatform 6 | google-cloud-logging 7 | firebase-admin 8 | langchain_community 9 | pypdf 10 | -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/GSP747.md: -------------------------------------------------------------------------------- 1 | ## Deploy a Hugo Website with Cloud Build and Firebase Pipeline 2 | 3 | ## 4 | 5 | 
[![Screenshot-2024-06-30-at-1-46-19-AM.png](https://i.postimg.cc/1z2471v2/Screenshot-2024-06-30-at-1-46-19-AM.png)](https://postimg.cc/hJ8Sh6W1) 6 | 7 | 8 | 9 | ```bash 10 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Deploy%20a%20Hugo%20Website%20with%20Cloud%20Build%20and%20Firebase%20Pipeline/task1.sh 11 | 12 | source task1.sh 13 | ``` 14 | ***Now Check the score for Task 2 & then go ahead with next Commands.*** 15 | 16 | ## 17 | ```bash 18 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Deploy%20a%20Hugo%20Website%20with%20Cloud%20Build%20and%20Firebase%20Pipeline/task2.sh 19 | 20 | source task2.sh 21 | ``` 22 | 23 | [![Screenshot-2024-06-29-at-2-47-57-AM.png](https://i.postimg.cc/0NR7rmsb/Screenshot-2024-06-29-at-2-47-57-AM.png)](https://postimg.cc/k2s2p2Dm) 24 | 25 | ```bash 26 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Deploy%20a%20Hugo%20Website%20with%20Cloud%20Build%20and%20Firebase%20Pipeline/task3.sh 27 | 28 | source task3.sh 29 | ``` 30 | 31 | # 32 | 33 | 34 | 35 | ### Congratulation!!! 
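The steps above download a script with `curl` and then execute it; if the fetch fails or returns an empty body, `source` still runs whatever landed on disk. A defensive wrapper refuses to source a missing or empty file; this is a sketch, and `safe_source` is a name introduced here:

```shell
safe_source() {
  [ -s "$1" ] || { echo "refusing to source missing/empty file: $1" >&2; return 1; }
  . "$1"
}

# Usage: safe_source task1.sh
```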
36 | -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/quicklab.md: -------------------------------------------------------------------------------- 1 | ## Deploy a Hugo Website with Cloud Build and Firebase Pipeline 2 | 3 | ## 4 | 5 | [![Screenshot-2024-06-30-at-1-46-19-AM.png](https://i.postimg.cc/1z2471v2/Screenshot-2024-06-30-at-1-46-19-AM.png)](https://postimg.cc/hJ8Sh6W1) 6 | 7 | 8 | 9 | ``` 10 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Deploy%20a%20Hugo%20Website%20with%20Cloud%20Build%20and%20Firebase%20Pipeline/quicklabtask1.sh 11 | 12 | source quicklabtask1.sh 13 | ``` 14 | ***Now Check the score for Task 2 & then go ahead with next Commands.*** 15 | 16 | ## 17 | ``` 18 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Deploy%20a%20Hugo%20Website%20with%20Cloud%20Build%20and%20Firebase%20Pipeline/quicklabtask2.sh 19 | 20 | source quicklabtask2.sh 21 | ``` 22 | 23 | [![Screenshot-2024-06-29-at-2-47-57-AM.png](https://i.postimg.cc/0NR7rmsb/Screenshot-2024-06-29-at-2-47-57-AM.png)](https://postimg.cc/k2s2p2Dm) 24 | 25 | # 26 | 27 | 28 | 29 | ### Congratulation!!! 
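Note that `quicklabtask1.sh` below ends with `hugo server`, which runs in the foreground and blocks the shell. If the rest of a session needs to continue, a generic background-run helper works; this is a sketch, and `run_bg` is a name introduced here:

```shell
# Start a long-running command in the background, detached from the terminal,
# and print its PID so it can be stopped later with `kill`.
run_bg() {
  nohup "$@" > /tmp/run_bg.log 2>&1 &
  echo $!
}

# Example: PID="$(run_bg /tmp/hugo server -D --bind 0.0.0.0 --port 8080)"
```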
30 | -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/quicklabtask1.sh: -------------------------------------------------------------------------------- 1 | 2 | cd ~ 3 | /tmp/installhugo.sh 4 | 5 | sudo apt-get update -y 6 | sudo apt-get install git -y 7 | 8 | cd ~ 9 | gcloud source repos create my_hugo_site 10 | gcloud source repos clone my_hugo_site 11 | 12 | 13 | cd ~ 14 | /tmp/hugo new site my_hugo_site --force 15 | 16 | cd ~/my_hugo_site 17 | git clone \ 18 | https://github.com/rhazdon/hugo-theme-hello-friend-ng.git themes/hello-friend-ng 19 | echo 'theme = "hello-friend-ng"' >> config.toml 20 | 21 | sudo rm -r themes/hello-friend-ng/.git 22 | sudo rm themes/hello-friend-ng/.gitignore 23 | 24 | cd ~/my_hugo_site 25 | /tmp/hugo server -D --bind 0.0.0.0 --port 8080 26 | -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/quicklabtask2.sh: -------------------------------------------------------------------------------- 1 | 2 | curl -sL https://firebase.tools | bash 3 | 4 | cd ~/my_hugo_site 5 | firebase init 6 | 7 | #add screenshot 8 | 9 | /tmp/hugo && firebase deploy 10 | 11 | 12 | git config --global user.name "hugo" 13 | git config --global user.email "hugo@blogger.com" 14 | 15 | cd ~/my_hugo_site 16 | echo "resources" >> .gitignore 17 | 18 | git add . 19 | git commit -m "Add app to Cloud Source Repositories" 20 | git push -u origin master 21 | 22 | cd ~/my_hugo_site 23 | cp /tmp/cloudbuild.yaml . 24 | 25 | 26 | echo -e "options:\n logging: CLOUD_LOGGING_ONLY" >> cloudbuild.yaml 27 | 28 | 29 | 30 | gcloud alpha builds triggers import --source=/tmp/trigger.yaml 31 | 32 | cd ~/my_hugo_site 33 | 34 | sed -i "3c\title = 'Blogging with Hugo and Cloud Build'" config.toml 35 | 36 | git add . 
37 | git commit -m "I updated the site title" 38 | git push -u origin master 39 | 40 | sleep 20 41 | 42 | gcloud builds list 43 | 44 | gcloud builds log $(gcloud builds list --format='value(ID)' --filter=$(git rev-parse HEAD)) 45 | 46 | gcloud builds log $(gcloud builds list --format='value(ID)' --filter=$(git rev-parse HEAD)) | grep "Hosting URL" -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/task2.sh: -------------------------------------------------------------------------------- 1 | curl -sL https://firebase.tools | bash 2 | 3 | cd ~/my_hugo_site 4 | firebase init 5 | 6 | #add screenshot 7 | 8 | /tmp/hugo && firebase deploy 9 | 10 | 11 | git config --global user.name "hugo" 12 | git config --global user.email "hugo@blogger.com" 13 | 14 | cd ~/my_hugo_site 15 | echo "resources" >> .gitignore 16 | 17 | git add . 18 | git commit -m "Add app to Cloud Source Repositories" 19 | git push -u origin master 20 | 21 | cd ~/my_hugo_site 22 | cp /tmp/cloudbuild.yaml . 
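The sibling `quicklabtask2.sh` appends an `options:` block with `logging: CLOUD_LOGGING_ONLY` to `cloudbuild.yaml` unconditionally, which duplicates the block on re-runs (the option is needed in some Cloud Build configurations, for example with user-specified service accounts). An idempotent append looks like this; the sketch writes to a temp file rather than the real `cloudbuild.yaml`:

```shell
cfg="$(mktemp)"    # stand-in for cloudbuild.yaml in this sketch
printf 'steps:\n- name: gcr.io/cloud-builders/git\n' > "$cfg"

# Append the options block only if one is not already present.
append_logging_option() {
  grep -q '^options:' "$1" || printf 'options:\n  logging: CLOUD_LOGGING_ONLY\n' >> "$1"
}

append_logging_option "$cfg"
append_logging_option "$cfg"    # no-op on the second call
```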
23 | 24 | gcloud builds connections create github cloud-build-connection --project=$PROJECT_ID --region=$REGION 25 | 26 | gcloud builds connections describe cloud-build-connection --region=$REGION 27 | -------------------------------------------------------------------------------- /Deploy a Streamlit App Integrated with Gemini Pro on Cloud Run/quicklabgsp1229.sh: -------------------------------------------------------------------------------- 1 | #GSP1229 2 | 3 | 4 | 5 | 6 | 7 | gcloud auth list 8 | 9 | git clone https://github.com/GoogleCloudPlatform/generative-ai.git 10 | cd generative-ai/gemini/sample-apps/gemini-streamlit-cloudrun 11 | 12 | python3 -m venv gemini-streamlit 13 | source gemini-streamlit/bin/activate 14 | pip install -r requirements.txt 15 | 16 | GCP_PROJECT=$DEVSHELL_PROJECT_ID 17 | GCP_REGION=$REGION 18 | 19 | 20 | AR_REPO='gemini-repo' 21 | SERVICE_NAME='gemini-streamlit-app' 22 | gcloud artifacts repositories create "$AR_REPO" --location="$GCP_REGION" --repository-format=Docker 23 | gcloud builds submit --tag "$GCP_REGION-docker.pkg.dev/$GCP_PROJECT/$AR_REPO/$SERVICE_NAME" 24 | 25 | gcloud run deploy "$SERVICE_NAME" \ 26 | --port=8080 \ 27 | --image="$GCP_REGION-docker.pkg.dev/$GCP_PROJECT/$AR_REPO/$SERVICE_NAME" \ 28 | --allow-unauthenticated \ 29 | --region=$GCP_REGION \ 30 | --platform=managed \ 31 | --project=$GCP_PROJECT \ 32 | --set-env-vars=GCP_PROJECT=$GCP_PROJECT,GCP_REGION=$GCP_REGION -------------------------------------------------------------------------------- /Deploying GKE Autopilot Clusters/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION=$(gcloud compute project-info describe \ 4 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 5 | 6 | export ZONE=$(gcloud compute project-info describe \ 7 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 8 | 9 | PROJECT_ID=`gcloud config get-value project` 10 | 
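The scripts in this repo mix backtick command substitution (as in the `PROJECT_ID` line above) with the modern `$(...)` form. Both produce the same value, but `$(...)` nests without escaping and is generally preferred; a minimal comparison:

```shell
# Both substitutions yield the same result...
OLD_STYLE=`echo demo-project`
NEW_STYLE=$(echo demo-project)

# ...but only $() nests cleanly:
NESTED=$(echo "prefix-$(echo inner)")
echo "$NESTED"   # prefix-inner
```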
11 | 12 | 13 | export my_region=$REGION 14 | 15 | export my_cluster=autopilot-cluster-1 16 | 17 | 18 | gcloud container clusters create-auto $my_cluster --region $my_region 19 | 20 | gcloud container clusters get-credentials $my_cluster --region $my_region 21 | 22 | kubectl config view 23 | 24 | kubectl cluster-info 25 | 26 | kubectl config current-context 27 | 28 | kubectl config get-contexts 29 | 30 | kubectl config use-context gke_${DEVSHELL_PROJECT_ID}_us-central1_autopilot-cluster-1 31 | 32 | 33 | kubectl create deployment nginx-1 --image=nginx:latest 34 | 35 | 36 | -------------------------------------------------------------------------------- /Deploying the Oracle on Bare Metal Solution/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | export BMSPROJ=$PROJECT_ID_1 3 | 4 | export BASTIONPROJ=$PROJECT_ID_2 5 | 6 | gcloud config set project $BASTIONPROJ 7 | 8 | gcloud compute networks peerings create bms-peer-bastion --network bms-bastion-net --peer-network bms-db-net --peer-project $BMSPROJ 9 | 10 | gcloud config set project $BMSPROJ 11 | 12 | gcloud compute networks peerings create bms-peer-db --network bms-db-net --peer-network bms-bastion-net --peer-project $BASTIONPROJ 13 | 14 | 15 | -------------------------------------------------------------------------------- /Designing and Querying Bigtable Schemas/quicklabgsp1053.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | echo project = `gcloud config get-value project` \ 4 | >> ~/.cbtrc 5 | 6 | cbt listinstances 7 | 8 | echo instance = personalized-sales \ 9 | >> ~/.cbtrc 10 | 11 | 12 | cat ~/.cbtrc 13 | 14 | cbt createtable test-sessions 15 | 16 | cbt createfamily test-sessions Interactions 17 | 18 | cbt createfamily test-sessions Sales 19 | 20 | cbt ls test-sessions 21 | 22 | 23 | cbt read test-sessions 24 | 25 | -------------------------------------------------------------------------------- /Detect Labels 
Faces and Landmarks in Images with the Cloud Vision API/city.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Detect Labels Faces and Landmarks in Images with the Cloud Vision API/city.png -------------------------------------------------------------------------------- /Detect Labels Faces and Landmarks in Images with the Cloud Vision API/donuts.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Detect Labels Faces and Landmarks in Images with the Cloud Vision API/donuts.png -------------------------------------------------------------------------------- /Detect Labels Faces and Landmarks in Images with the Cloud Vision API/selfie.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Detect Labels Faces and Landmarks in Images with the Cloud Vision API/selfie.png -------------------------------------------------------------------------------- /Develop GenAI Apps with Gemini and Streamlit Challenge Lab/Dockerfile.txt: -------------------------------------------------------------------------------- 1 | FROM python:3.8 2 | 3 | EXPOSE 8080 4 | WORKDIR /app 5 | 6 | COPY . 
./ 7 | 8 | RUN pip install -r requirements.txt 9 | 10 | ENTRYPOINT ["streamlit", "run", "chef.py", "--server.port=8080", "--server.address=0.0.0.0"] -------------------------------------------------------------------------------- /Develop GenAI Apps with Gemini and Streamlit Challenge Lab/bonustask.sh: -------------------------------------------------------------------------------- 1 | 2 | cd generative-ai/gemini/sample-apps/gemini-streamlit-cloudrun 3 | pip install google-cloud-aiplatform 4 | 5 | streamlit run chef.py \ 6 | --browser.serverAddress=localhost \ 7 | --server.enableCORS=false \ 8 | --server.enableXsrfProtection=false \ 9 | --server.port 8080 10 | -------------------------------------------------------------------------------- /Develop GenAI Apps with Gemini and Streamlit Challenge Lab/requirements.txt: -------------------------------------------------------------------------------- 1 | streamlit==1.44.1 2 | google-genai==1.9.0 3 | google-cloud-logging 4 | google-cloud-aiplatform 5 | -------------------------------------------------------------------------------- /Develop GenAI Apps with Gemini and Streamlit Challenge Lab/task2.sh: -------------------------------------------------------------------------------- 1 | cd generative-ai/gemini/sample-apps/gemini-streamlit-cloudrun 2 | 3 | gcloud services enable run.googleapis.com 4 | 5 | 6 | sed -i "s/FROM python:3.8/FROM python:3.9/g" Dockerfile 7 | echo "google-cloud-aiplatform" >> requirements.txt 8 | 9 | pip install --no-cache-dir -r requirements.txt 10 | 11 | 12 | AR_REPO='chef-repo' 13 | SERVICE_NAME='chef-streamlit-app' 14 | export PROJECT="$DEVSHELL_PROJECT_ID" 15 | gcloud artifacts repositories create "$AR_REPO" --location="$REGION" --repository-format=Docker 16 | gcloud builds submit --tag "$REGION-docker.pkg.dev/$DEVSHELL_PROJECT_ID/$AR_REPO/$SERVICE_NAME" 17 | 18 | gcloud builds submit --tag "$REGION-docker.pkg.dev/$PROJECT/$AR_REPO/$SERVICE_NAME" 19 | 20 | 21 | gcloud run deploy "$SERVICE_NAME" 
\ 22 | --port=8080 \ 23 | --image="$REGION-docker.pkg.dev/$PROJECT/$AR_REPO/$SERVICE_NAME" \ 24 | --allow-unauthenticated \ 25 | --region=$REGION \ 26 | --platform=managed \ 27 | --project=$DEVSHELL_PROJECT_ID \ 28 | --set-env-vars=GCP_PROJECT=$DEVSHELL_PROJECT_ID,GCP_REGION=$REGION 29 | 30 | 31 | -------------------------------------------------------------------------------- /Develop No-Code Chat Apps with AppSheet/quicklabs.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Develop No-Code Chat Apps with AppSheet/quicklabs.xlsx -------------------------------------------------------------------------------- /Develop Serverless Applications on Cloud Run Challenge Lab/quicklab.md: -------------------------------------------------------------------------------- 1 | 2 | # Develop Serverless Applications on Cloud Run: Challenge Lab 3 | 4 | ## Environment Variables 5 | 6 | Set the following environment variables: 7 | 8 | ```bash 9 | export REGION= 10 | 11 | export TASK_1_SERVICES_NAME= 12 | 13 | export TASK_2_SERVICES_NAME= 14 | 15 | export TASK_3_SERVICES_NAME= 16 | 17 | export TASK_4_SERVICES_NAME= 18 | 19 | export TASK_5_SERVICES_NAME= 20 | 21 | export TASK_6_SERVICES_NAME= 22 | 23 | export TASK_7_SERVICES_NAME= 24 | ``` 25 | 26 | ## Script Execution 27 | 28 | 29 | 30 | ```bash 31 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Develop%20Serverless%20Applications%20on%20Cloud%20Run%20Challenge%20Lab/quicklabgsp328.sh 32 | 33 | sudo chmod +x quicklabgsp328.sh 34 | 35 | ./quicklabgsp328.sh 36 | ``` 37 | 38 | ## Congratulation!!! 
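The challenge-lab script above assumes every `TASK_N_SERVICES_NAME` variable was exported first; if one is empty, the gcloud commands fail partway through. A pre-flight check catches that early; this is a sketch, and `require_vars` is a helper name introduced here:

```shell
# Fail fast when any named environment variable is unset or empty.
require_vars() {
  local missing=0 v val
  for v in "$@"; do
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      echo "missing required variable: $v" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage: require_vars REGION TASK_1_SERVICES_NAME || exit 1
```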
39 | -------------------------------------------------------------------------------- /Develop an App with Vertex AI Gemini 10 Pro/gemini-app/Dockerfile.txt: -------------------------------------------------------------------------------- 1 | FROM python:3.8 2 | 3 | EXPOSE 8080 4 | WORKDIR /app 5 | 6 | COPY . ./ 7 | 8 | RUN pip install -r requirements.txt 9 | 10 | ENTRYPOINT ["streamlit", "run", "app.py", "--server.port=8080", "--server.address=0.0.0.0"] 11 | 12 | -------------------------------------------------------------------------------- /Develop an App with Vertex AI Gemini 10 Pro/gemini-app/requirements.txt: -------------------------------------------------------------------------------- 1 | streamlit 2 | google-cloud-aiplatform==1.38.1 3 | google-cloud-logging==3.6.0 4 | 5 | -------------------------------------------------------------------------------- /Develop an app with Gemini/Develop an app with Gemini.md: -------------------------------------------------------------------------------- 1 | ## **Develop an app with Gemini** 2 | 3 | ## 4 | 5 | ``` 6 | export REGION= 7 | ``` 8 | 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Develop%20an%20app%20with%20Gemini/quicklab.sh 12 | 13 | sudo chmod +x quicklab.sh 14 | 15 | ./quicklab.sh 16 | ``` 17 | ### ```Check the score for TASK 1.``` 18 | 19 | 20 | 21 | ### ***```Perfrom Task 2 from lab instruction as shown in video.```*** 22 | 23 | ### ```Check the score for TASK 2.``` 24 | 25 | 26 | 27 | ### **Click here to download these files:** 28 | 29 | - ***File 1:*** **[inventory.py](https://github.com/quiccklabs/Labs_solutions/blob/master/Develop%20an%20app%20with%20Gemini/inventory.py)** 30 | - ***File 2:*** **[app.py](https://github.com/quiccklabs/Labs_solutions/blob/master/Develop%20an%20app%20with%20Gemini/app.py)** 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | ### **```Note: It might take up to 5 minutes to update the score.```** 41 | 42 | ### Congratulation!!! 
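The note above says the score can take up to 5 minutes to update, and the same eventual-consistency pattern applies to many of the checks in these labs. A generic retry helper looks like this; it is a sketch, `retry` is a name introduced here, and the delay is kept short for illustration:

```shell
# retry MAX_ATTEMPTS CMD [ARGS...] -- rerun CMD until it succeeds or attempts run out.
retry() {
  local max="$1" n=0; shift
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$max" ] && return 1
    sleep 1
  done
}

# Usage: retry 30 gcloud run services describe "$SERVICE_NAME" --region "$REGION"
```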
-------------------------------------------------------------------------------- /Develop an app with Gemini/inventory.py: -------------------------------------------------------------------------------- 1 | inventory = [ 2 | { 3 | "productid": "P001", 4 | "onhandqty": "10" 5 | }, 6 | { 7 | "productid": "P002", 8 | "onhandqty": "20" 9 | }, 10 | { 11 | "productid": "P003", 12 | "onhandqty": "30" 13 | } 14 | ] -------------------------------------------------------------------------------- /Develop an app with Gemini/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | PROJECT_ID=$(gcloud config get-value project) 5 | 6 | echo "PROJECT_ID=${PROJECT_ID}" 7 | echo "REGION=${REGION}" 8 | 9 | USER=$(gcloud config get-value account 2> /dev/null) 10 | echo "USER=${USER}" 11 | 12 | gcloud services enable cloudaicompanion.googleapis.com --project ${PROJECT_ID} 13 | gcloud services enable run.googleapis.com 14 | 15 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/cloudaicompanion.user 16 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/serviceusage.serviceUsageViewer -------------------------------------------------------------------------------- /Develop with Apps Script and AppSheet Challenge Lab/quicklabarc126.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Develop with Apps Script and AppSheet Challenge Lab/quicklabarc126.xlsx -------------------------------------------------------------------------------- /Dialogflow CX Managing Environments/quicklab.blob: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Dialogflow CX Managing Environments/quicklab.blob 
-------------------------------------------------------------------------------- /Dialogflow CX with Generative Fallbacks/divebooker-agent.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Dialogflow CX with Generative Fallbacks/divebooker-agent.zip -------------------------------------------------------------------------------- /Dialogflow Logging and Monitoring in Operations Suite/pigeon-travel-gsp-794-cloud-function.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Dialogflow Logging and Monitoring in Operations Suite/pigeon-travel-gsp-794-cloud-function.zip -------------------------------------------------------------------------------- /Dialogflow Logging and Monitoring in Operations Suite/quicklab794.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Dialogflow Logging and Monitoring in Operations Suite/quicklab794.zip -------------------------------------------------------------------------------- /Entity and Sentiment Analysis with the Natural Language API: -------------------------------------------------------------------------------- 1 | 2 | 3 | export API_KEY= 4 | 5 | cat > request.json < result.json 17 | 18 | 19 | 20 | 21 | 22 | 23 | -------------------------------------------------------------------------------- /Exemplar: Add and manage users with Linux commands: -------------------------------------------------------------------------------- 1 | sudo useradd researcher9 2 | 3 | sudo usermod -g research_team researcher9 4 | 5 | sudo useradd researcher9 -g research_team 6 | 7 | sudo chown researcher9 /home/researcher2/projects/project_r.txt 8 | 9 | 
sudo userdel researcher9 10 | 11 | sudo groupdel researcher9 12 | -------------------------------------------------------------------------------- /Exemplar: Manage authorization: -------------------------------------------------------------------------------- 1 | cd projects 2 | 3 | ls -l 4 | 5 | ls -la 6 | 7 | chmod o-w project_k.txt 8 | 9 | ls -l 10 | 11 | chmod g-r project_m.txt 12 | 13 | ls -la 14 | 15 | chmod u-w,g-w,g+r .project_x.txt 16 | 17 | chmod g-x drafts 18 | -------------------------------------------------------------------------------- /Exemplar: Manage files with Linux commands: -------------------------------------------------------------------------------- 1 | 2 | mkdir logs 3 | 4 | ls 5 | 6 | rmdir temp 7 | 8 | ls 9 | 10 | cd /home/analyst/notes 11 | 12 | cd notes 13 | 14 | mv Q3patches.txt /home/analyst/reports/ 15 | 16 | ls /home/analyst/reports 17 | 18 | rm tempnotes.txt 19 | 20 | ls 21 | 22 | touch tasks.txt 23 | 24 | ls 25 | 26 | cat > tasks.txt < redact-request.json < 30" 19 | -------------------------------------------------------------------------------- /Getting Started with Cloud Shell and gcloud/quicklabgsp002.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | echo "" 5 | echo "" 6 | 7 | read -p "ENTER ZONE:- " ZONE 8 | 9 | gcloud compute instances create gcelab2 --machine-type e2-medium --zone $ZONE 10 | 11 | gcloud compute instances add-tags gcelab2 --zone $ZONE --tags http-server,https-server 12 | 13 | gcloud compute firewall-rules create default-allow-http --direction=INGRESS --priority=1000 --network=default --action=ALLOW --rules=tcp:80 --source-ranges=0.0.0.0/0 --target-tags=http-server 14 | -------------------------------------------------------------------------------- /Getting Started with Firebase Genkit/task1.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | read -p "ENTER REGION:- " REGION 5 | 6 | 7 | 
export GCLOUD_PROJECT=$DEVSHELL_PROJECT_ID 8 | export GCLOUD_LOCATION=$REGION 9 | 10 | gcloud auth list 11 | 12 | mkdir genkit-intro && cd genkit-intro 13 | npm init -y 14 | 15 | npm install -D genkit-cli@0.9.12 16 | 17 | npm install genkit@0.9.12 --save 18 | npm install @genkit-ai/vertexai@0.9.12 @genkit-ai/google-cloud@0.9.12 express --save 19 | 20 | mkdir src && cd src 21 | 22 | ## Add file 23 | 24 | wget https://raw.githubusercontent.com/quiccklabs/Labs_solutions/refs/heads/master/Getting%20Started%20with%20Firebase%20Genkit/index.ts 25 | 26 | sed -i "s/us-west1/$REGION/g" index.ts 27 | 28 | cd .. 29 | 30 | gcloud storage cp ~/genkit-intro/package.json gs://$GCLOUD_PROJECT 31 | gcloud storage cp ~/genkit-intro/src/index.ts gs://$GCLOUD_PROJECT 32 | 33 | 34 | cd ~/genkit-intro 35 | npx genkit start -- npx tsx src/index.ts 36 | -------------------------------------------------------------------------------- /Getting Started with Firebase Genkit/task2.sh: -------------------------------------------------------------------------------- 1 | cd ~/genkit-intro 2 | npx genkit flow:run menuSuggestionFlow '"French"' -s | tee -a output.txt 3 | 4 | gcloud storage cp -r ~/genkit-intro/output.txt gs://$DEVSHELL_PROJECT_ID 5 | gcloud storage cp -r ~/genkit-intro/src/index.ts gs://$DEVSHELL_PROJECT_ID 6 | -------------------------------------------------------------------------------- /Getting Started with MongoDB Atlas on Google Cloud: -------------------------------------------------------------------------------- 1 | Value 1:- AtSYKoZAUCRewN9s4lFXkAVtQHszWQSmfg9bKf2BBrUWXP12r4bXSL1wycxhLNIR 2 | 3 | 4 | Value 2:- https://asia-south1.gcp.realm.mongodb.com/api/client/v2.0/app/bakery-zfvdi/graphql 5 | 6 | 7 | Please like share & Subscribe if this video helpful for you 8 | -------------------------------------------------------------------------------- /Getting started with Flutter Development/quicklab.md: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | # Getting Started with Flutter Development 5 | 6 | ## Steps 7 | 8 | 1. **Create a new Flutter project:** 9 | ```sh 10 | flutter create startup_namer 11 | ``` 12 | 13 | 2. **Navigate into the project directory:** 14 | ```sh 15 | cd startup_namer 16 | ``` 17 | 18 | 3. **Update the main.dart file:** 19 | ```sh 20 | sed -i '34c\ home: const MyHomePage(title: "Flutter is awesome!"),' lib/main.dart 21 | ``` 22 | 23 | 4. **Run the Flutter project:** 24 | ```sh 25 | fwr 26 | ``` 27 | 28 | ## Congratulation !!! 29 | -------------------------------------------------------------------------------- /Git Merges: -------------------------------------------------------------------------------- 1 | 2 | 3 | sudo curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add - 4 | 5 | sudo sed -i s/deb.debian.org/archive.debian.org/g /etc/apt/sources.list 6 | 7 | sudo sed -i 's|security.debian.org|archive.debian.org/debian-security/|g' /etc/apt/sources.list 8 | 9 | sudo sed -i '/stretch-updates/d' /etc/apt/sources.list 10 | 11 | sudo sed -i s/deb.debian.org/archive.debian.org/g /etc/apt/sources.list.d/backports.list 12 | 13 | sudo apt update 14 | 15 | sudo apt-get install git -y 16 | 17 | 18 | 19 | -------------------------------------------------------------------------------- /Google Cloud Fundamentals Getting Started with App Engine/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud app create --project=$DEVSHELL_PROJECT_ID 4 | 5 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples 6 | 7 | cd python-docs-samples/appengine/standard_python3/hello_world 8 | 9 | cat > Dockerfile < request.json < result.json -------------------------------------------------------------------------------- /Google Cloud Storage Bucket Lock/Google Cloud Storage - Bucket Lock.md: 
-------------------------------------------------------------------------------- 1 | 2 | ## **Google Cloud Storage - Bucket Lock** 3 | 4 | ## **Just Copy & Paste the commands** 5 | 6 | ``` 7 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Google%20Cloud%20Storage%20Bucket%20Lock/quicklabgsp297.sh 8 | 9 | sudo chmod +x quicklabgsp297.sh 10 | 11 | ./quicklabgsp297.sh 12 | ``` 13 | 14 | 15 | ## Congratulation -------------------------------------------------------------------------------- /Google Kubernetes Engine Qwik Start/quicklabgsp100.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud config set compute/zone $ZONE 5 | 6 | gcloud container clusters create --machine-type=e2-medium --zone=$ZONE lab-cluster 7 | 8 | gcloud container clusters get-credentials lab-cluster 9 | 10 | kubectl create deployment hello-server --image=gcr.io/google-samples/hello-app:1.0 11 | 12 | kubectl expose deployment hello-server --type=LoadBalancer --port 8080 13 | 14 | 15 | echo "Y" | gcloud container clusters delete lab-cluster -------------------------------------------------------------------------------- /Google Kubernetes Engine: Qwik Start ACE/quicklab.sh: -------------------------------------------------------------------------------- 1 | echo "" 2 | echo "" 3 | 4 | read -p "ENTER ZONE: " ZONE 5 | 6 | gcloud config set compute/zone $ZONE 7 | 8 | gcloud container clusters create --machine-type=e2-medium --zone=$ZONE lab-cluster 9 | 10 | gcloud container clusters get-credentials lab-cluster 11 | 12 | kubectl create deployment hello-server --image=gcr.io/google-samples/hello-app:1.0 13 | 14 | kubectl expose deployment hello-server --type=LoadBalancer --port 8080 15 | -------------------------------------------------------------------------------- /HTTP Google Cloud Functions in Go/quicklabgsp602.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | 4 | 
5 | export REGION=$(gcloud compute project-info describe \ 6 | --format="value(commonInstanceMetadata.items[google-compute-default-region])") 7 | 8 | export ZONE=$(gcloud compute project-info describe \ 9 | --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 10 | 11 | PROJECT_ID=`gcloud config get-value project` 12 | 13 | 14 | 15 | gcloud services enable run.googleapis.com 16 | 17 | gcloud services enable cloudfunctions.googleapis.com 18 | 19 | curl -LO https://github.com/GoogleCloudPlatform/golang-samples/archive/main.zip 20 | 21 | unzip main.zip 22 | 23 | cd golang-samples-main/functions/codelabs/gopher 24 | 25 | gcloud functions deploy HelloWorld --gen2 --runtime go121 --trigger-http --region $REGION --quiet 26 | 27 | gcloud functions deploy Gopher --gen2 --runtime go121 --trigger-http --region $REGION --quiet 28 | -------------------------------------------------------------------------------- /HTTPS Content-Based Load Balancer with Terraform/quicklabgsp206.sh: -------------------------------------------------------------------------------- 1 | 2 | echo "" 3 | echo "" 4 | 5 | read -p "Enter Group1_Region :- " group1_region 6 | read -p "Enter Group2_Region :- " group2_region 7 | read -p "Enter Group3_Region :- " group3_region 8 | 9 | 10 | git clone https://github.com/terraform-google-modules/terraform-google-lb-http.git 11 | 12 | cd ~/terraform-google-lb-http/examples/multi-backend-multi-mig-bucket-https-lb 13 | 14 | rm -rf main.tf 15 | 16 | wget https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/HTTPS%20Content-Based%20Load%20Balancer%20with%20Terraform/main.tf 17 | 18 | 19 | cat > variables.tf < importTestData.js 18 | 19 | npm install faker@5.5.3 20 | 21 | curl https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Importing%20Data%20to%20a%20Firestore%20Database/createTestData.js > createTestData.js 22 | 23 | 24 | node createTestData 1000 25 | 26 | node importTestData customers_1000.csv 27 | 28 | npm 
install csv-parse 29 | 30 | node createTestData 20000 31 | node importTestData customers_20000.csv 32 | 33 | 34 | 35 | 36 | -------------------------------------------------------------------------------- /Installing Updating and Removing Software in Linux/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | dpkg -s firefox-esr 4 | 5 | dpkg -s gimp 6 | 7 | dpkg -s vlc 8 | 9 | echo 'Y' | sudo apt-get install -f 10 | 11 | dpkg -s vlc 12 | 13 | sudo apt-get update 14 | 15 | echo 'Y' | sudo apt-get install firefox-esr 16 | 17 | dpkg -s firefox-esr 18 | 19 | echo 'Y' | sudo apt-get remove gimp 20 | 21 | dpkg -s gimp -------------------------------------------------------------------------------- /Installing, Updating, and Removing Software in Windows: -------------------------------------------------------------------------------- 1 | 2 | 3 | Set-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1')) 4 | 5 | 6 | choco install firefox -y 7 | 8 | choco upgrade vlc -y 9 | 10 | 11 | choco install gimp -y 12 | 13 | choco uninstall gimp -y 14 | -------------------------------------------------------------------------------- /Introduction to APIs in Google Cloud/quicklabgsp294.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable fitness.googleapis.com 4 | 5 | cat > values.json < main.tf < main.tf < token_policies.txt <= '2023-06-01' 13 | AND o.created_at < '2023-07-01'; 14 | " 15 | 16 | 17 | 18 | 19 | bq query --use_legacy_sql=false \ 20 | " 21 | SELECT 22 | first_name, 23 | last_name, 24 | team_name, 25 | sum(points) as total_points 26 | FROM \`bigquery-public-data.ncaa_basketball.mbb_players_games_sr\` 27 | group by first_name, last_name, team_name 28 | order by 
total_points desc; 29 | " 30 | 31 | 32 | 33 | 34 | 35 | bq query --use_legacy_sql=false \ 36 | " 37 | WITH rankings AS ( 38 | SELECT 39 | RANK() OVER (ORDER BY points DESC) AS ranking, 40 | first_name, 41 | last_name, 42 | team_name, 43 | points 44 | FROM 45 | \`bigquery-public-data.ncaa_basketball.mbb_players_games_sr\` 46 | ) 47 | SELECT 48 | ranking , 49 | first_name, 50 | last_name, 51 | team_name, 52 | points 53 | FROM 54 | rankings 55 | WHERE 56 | ranking<=10 57 | ORDER BY 58 | ranking; 59 | " -------------------------------------------------------------------------------- /Navigate Dataplex/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | bq query --use_legacy_sql=false \ 3 | ' 4 | SELECT * FROM `thelook_gcda.products` WHERE brand IS NOT NULL limit 10; 5 | ' 6 | 7 | 8 | 9 | 10 | bq query --use_legacy_sql=false \ 11 | ' 12 | SELECT * FROM `thelook_gcda.orders` limit 10; 13 | ' 14 | 15 | 16 | bq query --use_legacy_sql=false \ 17 | ' 18 | SELECT * FROM `thelook_gcda.order_items` limit 10; 19 | ' 20 | 21 | 22 | bq query --use_legacy_sql=false \ 23 | ' 24 | SELECT * FROM `ghcn_daily.ghcnd_1763` limit 10; 25 | ' 26 | 27 | 28 | bq query --use_legacy_sql=false \ 29 | ' 30 | SELECT 31 | name AS product_name, 32 | id AS product_id 33 | FROM 34 | thelook_gcda.products 35 | LIMIT 10; 36 | ' 37 | -------------------------------------------------------------------------------- /Navigate Security Decisions with Duet AI/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | PROJECT_ID=$(gcloud config get-value project) 4 | echo "PROJECT_ID=${PROJECT_ID}" 5 | echo "REGION=${REGION}" 6 | 7 | USER=$(gcloud config get-value account 2> /dev/null) 8 | echo "USER=${USER}" 9 | 10 | gcloud services enable cloudaicompanion.googleapis.com --project ${PROJECT_ID} 11 | 12 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/cloudaicompanion.user 13 
| gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/serviceusage.serviceUsageViewer 14 | 15 | gcloud container clusters create test --region=$REGION --num-nodes=1 16 | 17 | git clone https://github.com/GoogleCloudPlatform/microservices-demo && cd microservices-demo 18 | 19 | kubectl apply -f ./release/kubernetes-manifests.yaml 20 | 21 | kubectl get service frontend-external | awk '{print $4}' 22 | 23 | 24 | gcloud container clusters update test --region "$REGION" --enable-master-authorized-networks -------------------------------------------------------------------------------- /Online Data Migration to BigQuery using Striim: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud config set compute/zone us-central1-a 5 | export COMPUTE_ZONE=us-central1-a 6 | 7 | export USER_PWD=QUICKLABuser 8 | 9 | export ROOT_PWD=QUICKLABroot 10 | 11 | export PROJECT_ID=$(gcloud config get-value project) 12 | CSQL_NAME=striim-sql-src 13 | CSQL_USERNAME=striim-user 14 | 15 | 16 | 17 | gcloud sql instances create $CSQL_NAME \ 18 | --root-password=$ROOT_PWD \ 19 | --zone=$COMPUTE_ZONE \ 20 | --enable-bin-log 21 | 22 | 23 | gcloud sql users create $CSQL_USERNAME \ 24 | --instance $CSQL_NAME \ 25 | --password $USER_PWD \ 26 | --host=% 27 | 28 | 29 | STRIIMVM_NAME=striim-1-vm 30 | STRIIMVM_ZONE=us-central1-a 31 | 32 | 33 | gcloud sql instances patch $CSQL_NAME --authorized-networks=$(gcloud compute instances describe $STRIIMVM_NAME --format='get(networkInterfaces[0].accessConfigs[0].natIP)' --zone=$STRIIMVM_ZONE) 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | -------------------------------------------------------------------------------- /Partitioning and formatting a Disk Drive in Linux: -------------------------------------------------------------------------------- 1 | Hello guys this is the very simple lab you just need to take care of one thing here. 
So let's get started. 2 | 3 | Now look closely at which disk holds the root volume: it may be on sda or on sdb. 4 | 5 | If the root volume is on sda, 6 | partition the other disk (sdb), and vice versa. 7 | 8 | Keep running the command until you see "No partition yet"; otherwise you won't get the score. 9 | 10 | If you face any issue or have any doubt, please comment and we will help you :P 11 | 12 | 13 | 14 | If the root volume is on sda: 15 | 16 | lsblk 17 | 18 | sudo fdisk /dev/sdb 19 | 20 | press d, then press Enter twice 21 | 22 | sudo mkfs -t ext4 /dev/sdb2 23 | sudo mount /dev/sdb2 /home/my_drive 24 | 25 | 26 | If the root volume is on sdb: 27 | 28 | 29 | lsblk 30 | 31 | sudo fdisk /dev/sda 32 | 33 | press d, then press Enter twice 34 | 35 | sudo mkfs -t ext4 /dev/sda2 36 | sudo mount /dev/sda2 /home/my_drive 37 | 38 | 39 | 40 | -------------------------------------------------------------------------------- /Predict Soccer Match Outcomes with BigQuery ML Challenge Lab/GSP374.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## Predict Soccer Match Outcomes with BigQuery ML: Challenge Lab 4 | 5 | ## OPEN YOUR CLOUD SHELL 6 | 7 | ## ONLY USE THE ONLINE NOTEPAD I WAS USING 🙏🏻 8 | 9 | 10 | ### EXPORT Values 11 | 12 | ```bash 13 | export EVENT_NAME= 14 | 15 | export TABLE_NAME= 16 | 17 | export VALUE_X_1= 18 | 19 | export VALUE_Y_1= 20 | 21 | export VALUE_X_2= 22 | 23 | export VALUE_Y_2= 24 | 25 | export FUNCTION_1= 26 | 27 | export FUNCTION_2= 28 | 29 | export MODEL_NAME= 30 | ``` 31 | 32 | 33 | ### 34 | ### 35 | 36 | ## NOW JUST COPY AND PASTE ON YOUR CLOUD SHELL 37 | 38 | ```bash 39 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Predict%20Soccer%20Match%20Outcomes%20with%20BigQuery%20ML%20Challenge%20Lab/quicklabgsp374.sh 40 | 41 | sudo chmod +x quicklabgsp374.sh 42 | 43 | ./quicklabgsp374.sh 44 | ``` 45 | 46 | 47 | #### Perform Tasks 4.1 and 4.2 manually as explained in the video. 48 | 49 | 50 | 51 | 52 | ## Congratulations!!!
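The exported lab values above are left blank for you to fill in. One way to catch a forgotten export before any bq/gcloud command runs is bash's `${VAR:?message}` expansion, which aborts with the message when the variable is unset or empty. A minimal sketch with placeholder values (the real ones come from your lab instructions):

```shell
#!/bin/bash
export EVENT_NAME="free_kick"      # placeholder for illustration only
export MODEL_NAME="soccer_model"   # placeholder for illustration only

# ${VAR:?msg} fails fast if VAR was never exported, instead of
# silently running the rest of the script with empty values.
SUMMARY="Training ${MODEL_NAME:?set MODEL_NAME first} on ${EVENT_NAME:?set EVENT_NAME first}"
echo "$SUMMARY"
```

If `MODEL_NAME` were unset, the script would stop immediately with "set MODEL_NAME first" rather than producing a broken BigQuery model name.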
53 | -------------------------------------------------------------------------------- /Process Documents with Python Using the Document AI API/Process Documents with Python Using the Document AI API.md: -------------------------------------------------------------------------------- 1 | 2 | # Process Documents with Python Using the Document AI API 3 | 4 | 5 | #### **Run these commands in Cloud Shell** 6 | 7 | ```bash 8 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Process%20Documents%20with%20Python%20Using%20the%20Document%20AI%20API/quickklabgsp925.sh 9 | 10 | sudo chmod +x quickklabgsp925.sh 11 | 12 | ./quickklabgsp925.sh 13 | 14 | ``` 15 | 16 | 17 | ### **Run these commands in the Jupyter notebook** 18 | 19 | ```bash 20 | curl -LO raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Process%20Documents%20with%20Python%20Using%20the%20Document%20AI%20API/quicklab.sh 21 | 22 | sudo chmod +x quicklab.sh 23 | 24 | ./quicklab.sh 25 | 26 | ``` 27 | 28 | ## Now wait up to 5 minutes 29 | 30 | ## Congratulations!!! 31 | -------------------------------------------------------------------------------- /Process Documents with Python Using the Document AI API/quicklab.sh: -------------------------------------------------------------------------------- 1 | gsutil cp gs://cloud-training/gsp925/*.ipynb .
2 | python -m pip install --upgrade google-cloud-core google-cloud-documentai google-cloud-storage prettytable --user 3 | gsutil cp gs://cloud-training/gsp925/health-intake-form.pdf form.pdf 4 | 5 | export PROJECT_ID="$(gcloud config get-value core/project)" 6 | export BUCKET="${PROJECT_ID}"_doc_ai_async 7 | gsutil mb gs://${BUCKET} 8 | gsutil -m cp gs://cloud-training/gsp925/async/*.* gs://${BUCKET}/input 9 | -------------------------------------------------------------------------------- /Protect Cloud Traffic with BeyondCorp Enterprise (BCE) Security: Challenge Lab.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## Protect Cloud Traffic with BeyondCorp Enterprise (BCE) Security: Challenge Lab 4 | 5 | 6 | 7 | ``` 8 | export REGION= 9 | ``` 10 | ``` 11 | gcloud auth list 12 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git 13 | cd python-docs-samples/appengine/standard_python3/hello_world/ 14 | gcloud app create --project=$(gcloud config get-value project) --region=$REGION 15 | gcloud app deploy --quiet 16 | export AUTH_DOMAIN=$(gcloud config get-value project).uc.r.appspot.com 17 | echo $AUTH_DOMAIN 18 | ``` 19 | 20 | -------------------------------------------------------------------------------- /Protecting Sensitive Data in Gen AI Model Responses/test.sh: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Provision Cloud Infrastructure with Duet AI/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | PROJECT_ID=$(gcloud config get-value project) 3 | echo "PROJECT_ID=${PROJECT_ID}" 4 | echo "REGION=${REGION}" 5 | 6 | 7 | USER=$(gcloud config get-value account 2> /dev/null) 8 | echo "USER=${USER}" 9 | 10 | gcloud services enable cloudaicompanion.googleapis.com --project ${PROJECT_ID} 11 | 12 | gcloud projects 
add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/cloudaicompanion.user 13 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/serviceusage.serviceUsageViewer 14 | 15 | sleep 20 16 | 17 | gcloud container clusters create-auto duet-ai-demo --region $REGION 18 | 19 | kubectl create deployment hello-server --image=us-docker.pkg.dev/google-samples/containers/gke/hello-app:1.0 20 | 21 | kubectl expose deployment hello-server --type LoadBalancer --port 80 --target-port 8080 22 | 23 | kubectl get service hello-server 24 | -------------------------------------------------------------------------------- /Provision Cloud Infrastructure with Gemini/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | PROJECT_ID=$(gcloud config get-value project) 4 | 5 | echo "PROJECT_ID=${PROJECT_ID}" 6 | echo "REGION=${REGION}" 7 | 8 | USER=$(gcloud config get-value account 2> /dev/null) 9 | echo "USER=${USER}" 10 | 11 | gcloud services enable cloudaicompanion.googleapis.com --project ${PROJECT_ID} 12 | 13 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/cloudaicompanion.user 14 | gcloud projects add-iam-policy-binding ${PROJECT_ID} --member user:${USER} --role=roles/serviceusage.serviceUsageViewer 15 | 16 | gcloud container clusters create-auto gemini-demo --region $REGION 17 | 18 | kubectl create deployment hello-server --image=us-docker.pkg.dev/google-samples/containers/gke/hello-app:1.0 19 | 20 | kubectl expose deployment hello-server --type LoadBalancer --port 80 --target-port 8080 -------------------------------------------------------------------------------- /PubSub Lite Qwik Start/quicklabgsp832.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud services enable pubsublite.googleapis.com 3 | 4 | sleep 30 5 | 6 | pip3 install --upgrade google-cloud-pubsublite 7 | 8 | 9 | 10 | gcloud 
pubsub lite-topics create my-lite-topic \ 11 | --zone=$REGION-a --partitions=1 \ 12 | --per-partition-bytes=30GiB --message-retention-period=2w 13 | 14 | gcloud pubsub lite-subscriptions create my-lite-subscription \ 15 | --project=$DEVSHELL_PROJECT_ID \ 16 | --zone=$REGION-a \ 17 | --topic=my-lite-topic \ 18 | --delivery-requirement=deliver-after-stored 19 | -------------------------------------------------------------------------------- /PubSub Qwik Start Python/quicklabgsp094.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | sudo apt-get install -y virtualenv 6 | 7 | python3 -m venv venv 8 | 9 | source venv/bin/activate 10 | 11 | pip install --upgrade google-cloud-pubsub 12 | 13 | git clone https://github.com/googleapis/python-pubsub.git 14 | 15 | cd python-pubsub/samples/snippets 16 | 17 | python publisher.py -h 18 | 19 | python publisher.py $GOOGLE_CLOUD_PROJECT create MyTopic 20 | 21 | python subscriber.py $GOOGLE_CLOUD_PROJECT create MyTopic MySub 22 | 23 | -------------------------------------------------------------------------------- /Rent a VM to Process Earthquake Data/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | #!/bin/bash 4 | 5 | # Update the package list 6 | sudo apt-get update 7 | 8 | # Install git 9 | sudo apt-get -y -qq install git 10 | 11 | # Install python-mpltoolkits.basemap 12 | sudo apt-get install python-mpltoolkits.basemap -y 13 | 14 | # Check Git version 15 | git --version 16 | 17 | # Clone the repository 18 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 19 | 20 | # Navigate to the appropriate directory 21 | cd training-data-analyst/CPB100/lab2b 22 | 23 | # Run the necessary scripts 24 | bash ingest.sh 25 | bash install_missing.sh 26 | python3 transform.py 27 | 28 | # List the files in the directory 29 | ls -l 30 | 31 | export DEVSHELL_PROJECT_ID=$(gcloud config get-value project) 
32 | 33 | # Copy files to Google Cloud Storage 34 | gsutil cp earthquakes.* gs://$DEVSHELL_PROJECT_ID/earthquakes/ 35 | -------------------------------------------------------------------------------- /Rent a VM to Process Earthquake Data/quicklabgsp008.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud compute instances create instance-1 --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --scopes=https://www.googleapis.com/auth/cloud-platform --create-disk=auto-delete=yes,boot=yes,device-name=instance-1,image=projects/debian-cloud/global/images/debian-10-buster-v20230809,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 4 | 5 | sleep 10 6 | 7 | gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID \ 8 | --location="${ZONE%-*}" \ 9 | --default-storage-class=REGIONAL \ 10 | --project=$DEVSHELL_PROJECT_ID 11 | 12 | 13 | 14 | gcloud compute ssh --zone "$ZONE" "instance-1" --project "$DEVSHELL_PROJECT_ID" --quiet --command 'bash -c "$(curl -fsSL https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Rent%20a%20VM%20to%20Process%20Earthquake%20Data/quicklab.sh)"' 15 | 16 | 17 | 18 | -------------------------------------------------------------------------------- /Responding to Cloud Logging Messages with Cloud Functions: -------------------------------------------------------------------------------- 1 | 2 | TASK 1-2 3 | 4 | 5 | gcloud pubsub topics create vm-audit-logs 6 | 7 | export ZONE=us-central1-c 8 | 9 | gcloud compute instances create --zone $ZONE instance-1 10 | 11 | gcloud compute instances stop --zone $ZONE instance-1 12 | 13 | 14 | 15 | 16 | 
TASK 4:- 17 | 18 | resource.type="gce_instance" 19 | log_name="projects/YOUR_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity" 20 | protoPayload.methodName="v1.compute.instances.insert" 21 | operation.last = "true" 22 | 23 | 24 | 25 | 26 | -------------------------------------------------------------------------------- /Running Pipelines on Vertex AI 25/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT_NUMBER="$(gcloud projects describe $DEVSHELL_PROJECT_ID --format='get(projectNumber)')" 4 | 5 | 6 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 7 | --member=serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com \ 8 | --role=roles/storage.admin 9 | 10 | sleep 70 11 | 12 | gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID 13 | touch emptyfile1 14 | touch emptyfile2 15 | gcloud storage cp emptyfile1 gs://$DEVSHELL_PROJECT_ID/pipeline-output/emptyfile1 16 | gcloud storage cp emptyfile2 gs://$DEVSHELL_PROJECT_ID/pipeline-input/emptyfile2 17 | 18 | 19 | wget https://storage.googleapis.com/cloud-training/dataengineering/lab_assets/ai_pipelines/basic_pipeline.json 20 | 21 | sed -i 's/PROJECT_ID/$DEVSHELL_PROJECT_ID/g' basic_pipeline.json 22 | 23 | tail -20 basic_pipeline.json 24 | 25 | gcloud storage cp basic_pipeline.json gs://$DEVSHELL_PROJECT_ID/pipeline-input/basic_pipeline.json 26 | -------------------------------------------------------------------------------- /Running Windows Containers on Compute Engine/quicklabgsp153.bat: -------------------------------------------------------------------------------- 1 | @echo off 2 | mkdir my-windows-app 3 | cd my-windows-app 4 | mkdir content 5 | 6 | echo ^^^Windows containers^^^^Windows containers are cool!^^^ > content\index.html 7 | 8 | echo FROM mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2019 > Dockerfile 9 | echo RUN powershell -NoProfile -Command Remove-Item -Recurse C:\inetpub\wwwroot\* >> 
Dockerfile 10 | echo WORKDIR /inetpub/wwwroot >> Dockerfile 11 | echo COPY content/ . >> Dockerfile 12 | 13 | docker build -t gcr.io/dotnet-atamel/iis-site-windows . 14 | docker images 15 | -------------------------------------------------------------------------------- /Running a Containerized App on Google Kubernetes Engine/quicklabgsp015.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud config set compute/zone $ZONE 5 | 6 | gcloud container clusters create hello-world 7 | 8 | git clone https://github.com/GoogleCloudPlatform/kubernetes-engine-samples 9 | 10 | cd kubernetes-engine-samples/quickstarts/hello-app 11 | 12 | docker build -t gcr.io/$DEVSHELL_PROJECT_ID/hello-app:1.0 . 13 | 14 | gcloud docker -- push gcr.io/$DEVSHELL_PROJECT_ID/hello-app:1.0 15 | 16 | kubectl create deployment hello-app --image=gcr.io/$DEVSHELL_PROJECT_ID/hello-app:1.0 17 | 18 | kubectl expose deployment hello-app --name=hello-app --type=LoadBalancer --port=80 --target-port=8080 -------------------------------------------------------------------------------- /Running a MongoDB Database in Kubernetes with StatefulSets/quicklabgsp022.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud config set compute/zone $ZONE 4 | 5 | gcloud container clusters create hello-world --num-nodes=2 6 | 7 | gsutil -m cp -r gs://spls/gsp022/mongo-k8s-sidecar . 
8 | 9 | cd ./mongo-k8s-sidecar/example/StatefulSet/ 10 | 11 | kubectl apply -f googlecloud_ssd.yaml 12 | 13 | rm mongo-statefulset.yaml 14 | 15 | ## add file 16 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Running%20a%20MongoDB%20Database%20in%20Kubernetes%20with%20StatefulSets/mongo-statefulset.yaml 17 | 18 | kubectl apply -f mongo-statefulset.yaml 19 | 20 | kubectl get statefulset 21 | 22 | sleep 120 23 | 24 | kubectl scale --replicas=5 statefulset mongo 25 | 26 | sleep 100 27 | 28 | kubectl scale --replicas=3 statefulset mongo 29 | 30 | sleep 60 31 | 32 | kubectl get pods 33 | -------------------------------------------------------------------------------- /Scale Out and Update a Containerized Application on a Kubernetes Cluster/quicklabgsp305.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | gsutil cp gs://sureskills-ql/challenge-labs/ch05-k8s-scale-and-update/echo-web-v2.tar.gz . 8 | 9 | tar xvzf echo-web-v2.tar.gz 10 | 11 | gcloud builds submit --tag gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v2 . 
12 | 13 | gcloud container clusters get-credentials echo-cluster --zone $ZONE 14 | 15 | kubectl create deployment echo-web --image=gcr.io/qwiklabs-resources/echo-app:v1 16 | 17 | kubectl expose deployment echo-web --type=LoadBalancer --port 80 --target-port 8000 18 | 19 | kubectl patch deployment echo-web -p '{"spec": {"template": {"spec": {"containers": [{"name": "echo-app", "image": "gcr.io/qwiklabs-resources/echo-app:v2"}]}}}}' 20 | 21 | 22 | kubectl scale deploy echo-web --replicas=2 23 | 24 | 25 | -------------------------------------------------------------------------------- /Securing Cloud Applications with Identity Aware Proxy IAP using ZeroTrust/quicklabgsp946.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | git clone https://github.com/googlecodelabs/user-authentication-with-iap.git 3 | cd user-authentication-with-iap 4 | 5 | cd 1-HelloWorld 6 | 7 | gcloud app create --project=$(gcloud config get-value project) --region=$REGION 8 | 9 | sed -i "15c\runtime: python38" app.yaml 10 | sleep 30 11 | gcloud app deploy --quiet 12 | 13 | 14 | cd ~/user-authentication-with-iap/2-HelloUser 15 | sed -i "15c\runtime: python38" app.yaml 16 | sleep 30 17 | gcloud app deploy --quiet 18 | -------------------------------------------------------------------------------- /Securing Cloud Applications with Identity Aware Proxy IAP using ZeroTrust/quicklabtask1.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | git clone https://github.com/googlecodelabs/user-authentication-with-iap.git 3 | cd user-authentication-with-iap 4 | 5 | cd 1-HelloWorld 6 | 7 | gcloud app create --project=$(gcloud config get-value project) --region=$REGION 8 | 9 | sed -i "15c\runtime: python38" app.yaml 10 | sleep 30 11 | gcloud app deploy --quiet 12 | -------------------------------------------------------------------------------- /Securing Cloud Applications with Identity Aware 
Proxy IAP using ZeroTrust/quicklabtask2.sh: -------------------------------------------------------------------------------- 1 | cd ~/user-authentication-with-iap/2-HelloUser 2 | sed -i "15c\runtime: python38" app.yaml 3 | sleep 30 4 | gcloud app deploy --quiet 5 | -------------------------------------------------------------------------------- /Serverless Data Analysis with Dataflow Side Inputs java/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud services disable dataflow.googleapis.com 3 | 4 | gcloud services enable dataflow.googleapis.com 5 | 6 | sleep 30 7 | 8 | 9 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 10 | 11 | gsutil mb gs://$DEVSHELL_PROJECT_ID 12 | 13 | export BUCKET=$DEVSHELL_PROJECT_ID 14 | 15 | cd ~/training-data-analyst/courses/data_analysis/lab2/javahelp 16 | ./run_oncloud3.sh $DEVSHELL_PROJECT_ID $BUCKET JavaProjectsThatNeedHelp 17 | 18 | -------------------------------------------------------------------------------- /Service Directory Qwik Start/quicklabgsp732.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | gcloud services enable networkservices.googleapis.com 6 | 7 | gcloud services enable servicedirectory.googleapis.com 8 | 9 | gcloud service-directory namespaces create example-namespace --location=$REGION 10 | 11 | gcloud service-directory services create example-service --namespace=example-namespace --location=$REGION 12 | 13 | gcloud service-directory endpoints create example-endpoint \ 14 | --namespace=example-namespace \ 15 | --service=example-service \ 16 | --address=0.0.0.0 \ 17 | --port=80 \ 18 | --location=$REGION 19 | 20 | 21 | gcloud dns --project=$DEVSHELL_PROJECT_ID managed-zones create example-zone-name --description="" --dns-name="myzone.example.com." 
--visibility="private" --networks="https://compute.googleapis.com/compute/v1/projects/$DEVSHELL_PROJECT_ID/global/networks/default" --service-directory-namespace="https://servicedirectory.googleapis.com/v1/projects/$DEVSHELL_PROJECT_ID/locations/$REGION/namespaces/example-namespace" 22 | 23 | 24 | 25 | -------------------------------------------------------------------------------- /Set Up a Google Cloud Network: Challenge Lab/quicklabip.sh: -------------------------------------------------------------------------------- 1 | export ZONE="$(gcloud compute instances list --project=$DEVSHELL_PROJECT_ID --format='value(ZONE)')" 2 | export INT_IP=$(gcloud compute instances describe antern-postgresql-vm --zone=$ZONE \ 3 | --format='get(networkInterfaces[0].networkIP)') 4 | echo -e "\033[1;34mVM Internal IP:\033[0m \033[1;33m$INT_IP\033[0m" 5 | -------------------------------------------------------------------------------- /Set Up a Google Cloud Network: Challenge Lab/quicklabtask1.sh: -------------------------------------------------------------------------------- 1 | gcloud services enable datamigration.googleapis.com 2 | gcloud services enable servicenetworking.googleapis.com 3 | 4 | export ZONE="$(gcloud compute instances list --project=$DEVSHELL_PROJECT_ID --format='value(ZONE)')" 5 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 6 | gcloud compute ssh --zone "$ZONE" "antern-postgresql-vm" --project "$DEVSHELL_PROJECT_ID" --quiet 7 | -------------------------------------------------------------------------------- /Set Up a Google Cloud Network: Challenge Lab/quicklabtask2.sh: -------------------------------------------------------------------------------- 1 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 2 | --member=user:$USER1 \ 3 | --role=roles/cloudsql.instanceUser 4 | 5 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 6 | --member=user:$USER2 \ 7 | --role=roles/cloudsql.admin 8 | 9 | 10 | gcloud projects add-iam-policy-binding 
$DEVSHELL_PROJECT_ID \ 11 | --member=user:$USER2 \ 12 | --role=roles/compute.networkAdmin 13 | 14 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 15 | --member=user:$USER2 \ 16 | --role=roles/compute.securityAdmin 17 | 18 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 19 | --member=user:$USER2 \ 20 | --role=roles/logging.admin 21 | 22 | 23 | # Remove Viewer role 24 | gcloud projects remove-iam-policy-binding $DEVSHELL_PROJECT_ID \ 25 | --member=user:$USER3 \ 26 | --role=roles/viewer 27 | 28 | # Add Editor role 29 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 30 | --member=user:$USER3 \ 31 | --role=roles/editor 32 | 33 | gcloud projects remove-iam-policy-binding $DEVSHELL_PROJECT_ID \ 34 | --member=user:$USER3 \ 35 | --role=roles/editor 36 | 37 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 38 | --member=user:$USER3 \ 39 | --role=roles/viewer 40 | -------------------------------------------------------------------------------- /Set Up a Google Cloud Network: Challenge Lab/quicklabtask3.sh: -------------------------------------------------------------------------------- 1 | gcloud compute networks create $VPC_NAME --project=$DEVSHELL_PROJECT_ID --subnet-mode=custom --mtu=1460 --bgp-routing-mode=regional && gcloud compute networks subnets create $SUBNET1 --project=$DEVSHELL_PROJECT_ID --range=10.10.10.0/24 --stack-type=IPV4_ONLY --network=$VPC_NAME --region=$REGION1 && gcloud compute networks subnets create $SUBNET2 --project=$DEVSHELL_PROJECT_ID --range=10.10.20.0/24 --stack-type=IPV4_ONLY --network=$VPC_NAME --region=$REGION2 2 | 3 | gcloud compute --project=$DEVSHELL_PROJECT_ID firewall-rules create $RULE_NAME1 --direction=INGRESS --priority=65535 --network=$VPC_NAME --action=ALLOW --rules=tcp:22 --source-ranges=0.0.0.0/0 4 | 5 | gcloud compute --project=$DEVSHELL_PROJECT_ID firewall-rules create $RULE_NAME2 --direction=INGRESS --priority=65535 --network=$VPC_NAME --action=ALLOW --rules=tcp:3389 
--source-ranges=0.0.0.0/0 6 | 7 | gcloud compute --project=$DEVSHELL_PROJECT_ID firewall-rules create $RULE_NAME3 --direction=INGRESS --priority=65535 --network=$VPC_NAME --action=ALLOW --rules=icmp --source-ranges=0.0.0.0/0 8 | -------------------------------------------------------------------------------- /Set Up a Google Cloud Network: Challenge Lab/quicklabtask4.sh: -------------------------------------------------------------------------------- 1 | bq mk --dataset --project_id=$DEVSHELL_PROJECT_ID gke_app_errors_sink 2 | 3 | gcloud logging sinks create $SINK_NAME --project=$DEVSHELL_PROJECT_ID bigquery.googleapis.com/projects/$DEVSHELL_PROJECT_ID/datasets/gke_app_errors_sink --log-filter='resource.type="k8s_container" 4 | severity=ERROR' 5 | 6 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 7 | --member=user:$USER1 \ 8 | --role=roles/bigquery.dataViewer 9 | 10 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \ 11 | --member=user:$USER2 \ 12 | --role=roles/bigquery.admin 13 | -------------------------------------------------------------------------------- /Setting Up Cost Control with Quota/quicklabgsp651.sh: -------------------------------------------------------------------------------- 1 | 2 | bq query --use_legacy_sql=false \ 3 | " 4 | SELECT 5 | w1mpro_ep, 6 | mjd, 7 | load_id, 8 | frame_id 9 | FROM 10 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 11 | ORDER BY 12 | mjd ASC 13 | LIMIT 500 14 | " 15 | 16 | 17 | gcloud alpha services quota list --service=bigquery.googleapis.com --consumer=projects/${DEVSHELL_PROJECT_ID} --filter="usage" 18 | 19 | gcloud alpha services quota update --consumer=projects/${DEVSHELL_PROJECT_ID} --service bigquery.googleapis.com --metric bigquery.googleapis.com/quota/query/usage --value 262144 --unit 1/d/{project}/{user} --force 20 | 21 | gcloud alpha services quota list --service=bigquery.googleapis.com --consumer=projects/${DEVSHELL_PROJECT_ID} --filter="usage" 22 | 
23 | bq query --use_legacy_sql=false --nouse_cache \ 24 | " 25 | SELECT 26 | w1mpro_ep, 27 | mjd, 28 | load_id, 29 | frame_id 30 | FROM 31 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 32 | ORDER BY 33 | mjd ASC 34 | LIMIT 500 35 | " 36 | 37 | -------------------------------------------------------------------------------- /Setting up Jenkins on Kubernetes Engine/quicklabgsp117.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud auth list 5 | 6 | 7 | gcloud config set compute/zone $ZONE 8 | 9 | git clone https://github.com/GoogleCloudPlatform/continuous-deployment-on-kubernetes.git 10 | 11 | cd continuous-deployment-on-kubernetes 12 | 13 | gcloud container clusters create jenkins-cd \ 14 | --num-nodes 2 \ 15 | --scopes "https://www.googleapis.com/auth/projecthosting,cloud-platform" 16 | 17 | gcloud container clusters get-credentials jenkins-cd 18 | 19 | helm repo add jenkins https://charts.jenkins.io 20 | 21 | helm repo update 22 | 23 | helm upgrade --install -f jenkins/values.yaml myjenkins jenkins/jenkins 24 | -------------------------------------------------------------------------------- /Speaking with a Webpage Streaming Speech Transcripts/task2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud compute ssh "speaking-with-a-webpage" --zone "$ZONE" --project "$DEVSHELL_PROJECT_ID" --quiet --command "pkill -f 'java.*jetty'" 5 | 6 | sleep 5 7 | 8 | gcloud compute ssh "speaking-with-a-webpage" --zone "$ZONE" --project "$DEVSHELL_PROJECT_ID" --quiet --command "cd ~/speaking-with-a-webpage/02-webaudio && mvn clean jetty:run" 9 | -------------------------------------------------------------------------------- /Speech to Text Transcription with the Cloud Speech API/task1.sh: -------------------------------------------------------------------------------- 1 | cat > prepare_disk.sh <<'EOF_END' 2 | 3 | gcloud services enable 
apikeys.googleapis.com 4 | 5 | gcloud alpha services api-keys create --display-name="quicklab" 6 | 7 | KEY=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=quicklab") 8 | 9 | API_KEY=$(gcloud alpha services api-keys get-key-string $KEY --format="value(keyString)") 10 | 11 | # NOTE: the request body and curl call below were reconstructed (the originals were lost in extraction); they follow the standard Speech-to-Text recognize quickstart 12 | cat > request.json <<EOF 13 | { 14 | "config": { 15 | "encoding": "FLAC", 16 | "languageCode": "en-US" 17 | }, 18 | "audio": { 19 | "uri": "gs://cloud-samples-data/speech/brooklyn_bridge.flac" 20 | } 21 | } 22 | EOF 23 | 24 | 25 | 26 | curl -s -X POST -H "Content-Type: application/json" --data-binary @request.json "https://speech.googleapis.com/v1/speech:recognize?key=${API_KEY}" > result.json 27 | 28 | cat result.json 29 | 30 | EOF_END 31 | 32 | export ZONE=$(gcloud compute instances list linux-instance --format 'csv[no-heading](zone)') 33 | 34 | gcloud compute scp prepare_disk.sh linux-instance:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 35 | 36 | gcloud compute ssh linux-instance --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/prepare_disk.sh" 37 | -------------------------------------------------------------------------------- /Speech to Text Transcription with the Cloud Speech API/task2.sh: -------------------------------------------------------------------------------- 1 | cat > prepare_disk.sh <<'EOF_END' 2 | 3 | KEY=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=quicklab") 4 | 5 | API_KEY=$(gcloud alpha services api-keys get-key-string $KEY --format="value(keyString)") 6 | 7 | rm -f request.json 8 | 9 | # NOTE: reconstructed request body and curl call (originals lost in extraction); assumed to be the lab's French transcription example 10 | cat >> request.json <<EOF 11 | { 12 | "config": { 13 | "encoding": "FLAC", 14 | "languageCode": "fr" 15 | }, 16 | "audio": { 17 | "uri": "gs://cloud-samples-data/speech/corbeau_renard.flac" 18 | } 19 | } 20 | EOF 21 | 22 | 23 | 24 | curl -s -X POST -H "Content-Type: application/json" --data-binary @request.json "https://speech.googleapis.com/v1/speech:recognize?key=${API_KEY}" > result.json 25 | 26 | cat result.json 27 | 28 | EOF_END 29 | 30 | export ZONE=$(gcloud compute instances list linux-instance --format 'csv[no-heading](zone)') 31 | 32 | gcloud compute scp prepare_disk.sh linux-instance:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 33 | 34 | gcloud compute ssh linux-instance --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/prepare_disk.sh" 35 | -------------------------------------------------------------------------------- /Store Process and Manage Data on Google Cloud Command Line Challenge Lab/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "thumbnails", 3 | "version": "1.0.0", 4 | 
"description": "Create Thumbnail of uploaded image", 5 | "scripts": { 6 | "start": "node index.js" 7 | }, 8 | "dependencies": { 9 | "@google-cloud/pubsub": "^2.0.0", 10 | "@google-cloud/storage": "^5.0.0", 11 | "fast-crc32c": "1.0.4", 12 | "imagemagick-stream": "4.1.1" 13 | }, 14 | "devDependencies": {}, 15 | "engines": { 16 | "node": ">=4.3.2" 17 | } 18 | } -------------------------------------------------------------------------------- /Store Process and Manage Data on Google Cloud Challenge Lab/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "thumbnails", 3 | "version": "1.0.0", 4 | "description": "Create Thumbnail of uploaded image", 5 | "scripts": { 6 | "start": "node index.js" 7 | }, 8 | "dependencies": { 9 | "@google-cloud/functions-framework": "^3.0.0", 10 | "@google-cloud/pubsub": "^2.0.0", 11 | "@google-cloud/storage": "^5.0.0", 12 | "fast-crc32c": "1.0.4", 13 | "imagemagick-stream": "4.1.1" 14 | }, 15 | "devDependencies": {}, 16 | "engines": { 17 | "node": ">=4.3.2" 18 | } 19 | } -------------------------------------------------------------------------------- /Streaming Analytics into BigQuery: Challenge Lab: -------------------------------------------------------------------------------- 1 | 2 | export BUCKET_NAME= 3 | 4 | export DATASET_NAME= 5 | 6 | export TABLE_NAME= 7 | 8 | export TOPIC_NAME= 9 | 10 | 11 | 12 | gsutil mb gs://$BUCKET_NAME 13 | 14 | bq mk $DATASET_NAME 15 | 16 | bq mk --table \ 17 | $DEVSHELL_PROJECT_ID:$DATASET_NAME.$TABLE_NAME \ 18 | data:string 19 | 20 | gcloud pubsub topics create $TOPIC_NAME 21 | 22 | gcloud pubsub subscriptions create $TOPIC_NAME-sub --topic=$TOPIC_NAME 23 | 24 | -------------------------------------------------------------------------------- /Streaming Data to Bigtable/Streaming Data to Bigtable.md: -------------------------------------------------------------------------------- 1 | 2 | # ```Streaming Data to Bigtable``` 3 | 4 | ### Task 1. 
Create a Bigtable instance and table:- ```Needs to be completed using the console``` 5 | 6 | ``` 7 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Streaming%20Data%20to%20Bigtable/task1.sh 8 | 9 | sudo chmod +x task1.sh 10 | 11 | ./task1.sh 12 | ``` 13 | 14 | ## Open another Cloud Shell and run the commands below: 15 | 16 | ``` 17 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Streaming%20Data%20to%20Bigtable/task2.sh 18 | 19 | sudo chmod +x task2.sh 20 | 21 | ./task2.sh 22 | ``` 23 | 24 | ## Run this command only after you have received a score for Tasks 1-4; otherwise you won't get full credit 25 | 26 | ``` 27 | curl -LO https://raw.githubusercontent.com/quiccklabs/Labs_solutions/master/Streaming%20Data%20to%20Bigtable/task3.sh 28 | 29 | sudo chmod +x task3.sh 30 | 31 | ./task3.sh 32 | ``` 33 | 34 | 35 | 36 | ## ```Congratulations!!!``` 37 | -------------------------------------------------------------------------------- /Streaming Data to Bigtable/task1.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | echo project = `gcloud config get-value project` \ 5 | >> ~/.cbtrc 6 | 7 | echo instance = sandiego \ 8 | >> ~/.cbtrc 9 | 10 | cbt createtable current_conditions \ 11 | families="lane" 12 | 13 | cat > prepare_disk.sh <<'EOF_END' 14 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 15 | source /training/project_env.sh 16 | /training/sensor_magic.sh 17 | EOF_END 18 | 19 | ZONE="$(gcloud compute instances list --project=$DEVSHELL_PROJECT_ID --format='value(ZONE)' | head -n 1)" 20 | 21 | gcloud compute scp prepare_disk.sh training-vm:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 22 | 23 | gcloud compute ssh training-vm --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/prepare_disk.sh" 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | -------------------------------------------------------------------------------- /Streaming Data to 
Bigtable/task2.sh: -------------------------------------------------------------------------------- 1 | 2 | cat > prepare_disk_1.sh <<'EOF_END' 3 | source /training/project_env.sh 4 | cd ~/training-data-analyst/courses/streaming/process/sandiego 5 | ./run_oncloud.sh $DEVSHELL_PROJECT_ID $BUCKET CurrentConditions --bigtable 6 | EOF_END 7 | 8 | ZONE="$(gcloud compute instances list --project=$DEVSHELL_PROJECT_ID --format='value(ZONE)' | head -n 1)" 9 | 10 | 11 | gcloud compute scp prepare_disk_1.sh training-vm:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 12 | 13 | gcloud compute ssh training-vm --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/prepare_disk_1.sh" 14 | 15 | 16 | -------------------------------------------------------------------------------- /Streaming Data to Bigtable/task3.sh: -------------------------------------------------------------------------------- 1 | echo "Listing all Dataflow jobs:" 2 | gcloud dataflow jobs list 3 | 4 | 5 | export JOB_ID=$(gcloud dataflow jobs list --status=active --format="value(id)" | head -n 1) 6 | 7 | gcloud dataflow jobs cancel $JOB_ID 8 | 9 | cbt deletetable current_conditions 10 | 11 | echo -e "\e[1m\e[34mhttps://console.cloud.google.com/dataflow/jobs?referrer=search&project=$DEVSHELL_PROJECT_ID\e[0m" 12 | -------------------------------------------------------------------------------- /TOC Challenge Lab Logging and Monitoring.md: -------------------------------------------------------------------------------- 1 | 2 | # TOC Challenge Lab: Logging and Monitoring 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | ### Activate your Cloud Shell :- 17 | 18 | ### Make sure you are doing all the tasks in the 2nd PROJECT_ID 19 | 20 | 21 | ```bash 22 | export PROJECT_ID_2= 23 | ``` 24 | 25 | 26 | ```bash 27 | gcloud config set project $PROJECT_ID_2 28 | 29 | gcloud compute instances create cymbal-instance2 \ 30 | --zone us-central1-f \ 31 | --machine-type=e2-micro \ 32 | --image-family debian-11 \ 33 | 
--image-project debian-cloud 34 | 35 | gcloud logging buckets create $PROJECT_ID_2 --location global --retention-days 365 36 | ``` 37 | 38 | ### Congratulations!!! -------------------------------------------------------------------------------- /Tagging Dataplex Assets: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable dataplex.googleapis.com 4 | 5 | gcloud services enable datacatalog.googleapis.com 6 | 7 | export REGION= 8 | 9 | gcloud dataplex lakes create orders-lake \ 10 | --location=$REGION \ 11 | --display-name="Orders Lake" 12 | 13 | gcloud dataplex zones create customer-curated-zone \ 14 | --location=$REGION \ 15 | --lake=orders-lake \ 16 | --display-name="Customer Curated Zone" \ 17 | --resource-location-type=SINGLE_REGION \ 18 | --type=CURATED \ 19 | --discovery-enabled \ 20 | --discovery-schedule="0 * * * *" 21 | 22 | 23 | bq mk --location=$REGION --dataset customers 24 | 25 | gcloud dataplex assets create customer-details-dataset \ 26 | --location=$REGION \ 27 | --lake=orders-lake \ 28 | --zone=customer-curated-zone \ 29 | --display-name="Customer Details Dataset" \ 30 | --resource-type=BIGQUERY_DATASET \ 31 | --resource-name=projects/$DEVSHELL_PROJECT_ID/datasets/customers \ 32 | --discovery-enabled 33 | 34 | -------------------------------------------------------------------------------- /TensorFlow Qwik Start/quicklabgsp637.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | pip install google-cloud-logging 4 | 5 | pip install --upgrade protobuf 6 | 7 | pip install --upgrade tensorflow 8 | 9 | python --version 10 | 11 | python -c "import tensorflow;print(tensorflow.__version__)" 12 | 13 | 14 | cat > model.py <<'EOF' 15 | import logging 16 | import google.cloud.logging as cloud_logging 17 | from google.cloud.logging.handlers import CloudLoggingHandler 18 | from google.cloud.logging_v2.handlers import setup_logging 19 | 20 | cloud_logger = 
logging.getLogger('cloudLogger') 21 | cloud_logger.setLevel(logging.INFO) 22 | cloud_logger.addHandler(CloudLoggingHandler(cloud_logging.Client())) 23 | cloud_logger.addHandler(logging.StreamHandler()) 24 | 25 | import tensorflow as tf 26 | import numpy as np 27 | 28 | xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float) 29 | ys = np.array([-2.0, 1.0, 4.0, 7.0, 10.0, 13.0], dtype=float) 30 | 31 | model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, input_shape=[1])]) 32 | 33 | model.compile(optimizer=tf.keras.optimizers.SGD(), loss=tf.keras.losses.MeanSquaredError()) 34 | 35 | model.fit(xs, ys, epochs=500) 36 | cloud_logger.info(str(model.predict(np.array([10.0])))) 37 | EOF 38 | 39 | python model.py 40 | -------------------------------------------------------------------------------- /Terraform Fundamentals/quicklabgsp156.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud auth list 5 | 6 | cat > instance.tf < { 4 | try { 5 | // Create the unsigned transaction object 6 | const unsignedTransaction = { 7 | nonce: 0, 8 | gasLimit: 21000, 9 | gasPrice: '0x09184e72a000', 10 | to: '0x0000000000000000000000000000000000000000', 11 | value: '0x00', 12 | data: '0x', 13 | }; 14 | 15 | // Sign the transaction 16 | const signedTransaction = await signTransaction(unsignedTransaction); 17 | 18 | // Submit the transaction to Ganache 19 | const transaction = await submitTransaction(signedTransaction); 20 | 21 | // Write the transaction receipt 22 | uploadFromMemory(transaction); 23 | 24 | return transaction; 25 | } catch (e) { 26 | console.log(e); 27 | uploadFromMemory(e); 28 | } 29 | }; 30 | 31 | await signAndSubmitTransaction(); 32 | -------------------------------------------------------------------------------- /Transacting Digital Assets with Multi-Party Computation and Confidential Space/mpc-ethereum-demo/package.json: -------------------------------------------------------------------------------- 1 | { 2 
| "name": "gcp-mpc-ethereum-demo", 3 | "version": "1.0.0", 4 | "description": "Demo for GCP multi-party-compute on Confidential Space", 5 | "main": "index.js", 6 | "scripts": { 7 | "start": "node index.js" 8 | }, 9 | "type": "module", 10 | "dependencies": { 11 | "@google-cloud/kms": "^3.2.0", 12 | "@google-cloud/storage": "^6.9.2", 13 | "ethers": "^5.7.2", 14 | "fast-crc32c": "^2.0.0" 15 | }, 16 | "author": "", 17 | "license": "ISC" 18 | } 19 | -------------------------------------------------------------------------------- /Use APIs to Work with Cloud Storage Challenge Lab/world.jpeg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Use APIs to Work with Cloud Storage Challenge Lab/world.jpeg -------------------------------------------------------------------------------- /Use Functions Formulas and Charts in Google Sheets Challenge Lab/On the Rise Bakery Business Challenge.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Use Functions Formulas and Charts in Google Sheets Challenge Lab/On the Rise Bakery Business Challenge.xlsx -------------------------------------------------------------------------------- /Use Functions Formulas and Charts in Google Sheets Challenge Lab/Staff Roles.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/quiccklabs/Labs_solutions/d72b296e05c55ad79acaa8ce5195181d1877061a/Use Functions Formulas and Charts in Google Sheets Challenge Lab/Staff Roles.pptx -------------------------------------------------------------------------------- /Use Functions, Formulas, and Charts in Google Sheets: Challenge Lab: -------------------------------------------------------------------------------- 1 | 2 | 3 | Task 1:- 
4 | 5 | =PROPER(A2) 6 | 7 | =ISEMAIL(E2) 8 | 9 | 10 | Task 2:- 11 | 12 | =SUBSTITUTE(H2,"Oct-5","Oct-10") 13 | 14 | =VLOOKUP(A2, A2:E16, 5, False) 15 | 16 | Task 4:- 17 | 18 | =AVERAGE(C2:C101) 19 | 20 | =MEDIAN(C2:C101) 21 | 22 | =MAX(c2:c101)-MIN(c2:c101) 23 | 24 | =STDEV(C2:C101) 25 | 26 | 27 | Task 5:- 28 | 29 | 'Customer Ratings'!A1:C101 30 | 31 | 32 | -------------------------------------------------------------------------------- /Using BigQuery in the Google Cloud Console/quicklabgsp406.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | bq query --use_legacy_sql=false \ 5 | " 6 | SELECT 7 | name, gender, 8 | SUM(number) AS total 9 | FROM 10 | \`bigquery-public-data.usa_names.usa_1910_2013\` 11 | GROUP BY 12 | name, gender 13 | ORDER BY 14 | total DESC 15 | LIMIT 16 | 10 17 | " 18 | 19 | bq mk babynames 20 | 21 | 22 | bq mk --table \ 23 | --schema "name:string,count:integer,gender:string" \ 24 | $DEVSHELL_PROJECT_ID:babynames.names_2014 25 | 26 | 27 | bq query --use_legacy_sql=false \ 28 | " 29 | SELECT 30 | name, count 31 | FROM 32 | \`babynames.names_2014\` 33 | WHERE 34 | gender = 'M' 35 | ORDER BY count DESC LIMIT 5 36 | " -------------------------------------------------------------------------------- /Using Cloud Trace on Kubernetes Engine/quicklabgsp484.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | git clone https://github.com/GoogleCloudPlatform/gke-tracing-demo 6 | cd gke-tracing-demo/terraform 7 | 8 | rm provider.tf 9 | 10 | cat > provider.tf < terraform.tfvars < requirements.txt < app.yaml < requirements.txt < text.txt 4 | 5 | gcloud storage cp text.txt gs://$GOOGLE_CLOUD_PROJECT-2 6 | 7 | -------------------------------------------------------------------------------- /gsutil for Storage Configuration/quicklab.sh: -------------------------------------------------------------------------------- 1 | 2 | git clone 
https://github.com/GoogleCloudPlatform/training-data-analyst 3 | cd training-data-analyst/blogs 4 | PROJECT_ID=`gcloud config get-value project` 5 | BUCKET=${PROJECT_ID}-bucket 6 | gsutil mb -c multi_regional gs://${BUCKET} 7 | gsutil -m cp -r endpointslambda gs://${BUCKET} 8 | gsutil ls gs://${BUCKET}/* 9 | mv endpointslambda/Apache2_0License.txt endpointslambda/old.txt 10 | rm endpointslambda/aeflex-endpoints/app.yaml 11 | gsutil -m rsync -d -r endpointslambda gs://${BUCKET}/endpointslambda 12 | gsutil ls gs://${BUCKET}/* 13 | gsutil -m acl set -R -a public-read gs://${BUCKET} 14 | gsutil cp -s nearline ghcn/ghcn_on_bq.ipynb gs://${BUCKET} -------------------------------------------------------------------------------- /mini lab/quicklab1.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT=$(gcloud projects list --format="value(PROJECT_ID)") 4 | 5 | gcloud storage buckets update gs://$PROJECT-bucket --no-uniform-bucket-level-access 6 | 7 | gcloud storage buckets update gs://$PROJECT-bucket --web-main-page-suffix=index.html --web-error-page=error.html 8 | 9 | gcloud storage objects update gs://$PROJECT-bucket/index.html --add-acl-grant=entity=AllUsers,role=READER 10 | 11 | gcloud storage objects update gs://$PROJECT-bucket/error.html --add-acl-grant=entity=AllUsers,role=READER -------------------------------------------------------------------------------- /mini lab/quicklab4.sh: -------------------------------------------------------------------------------- 1 | export PROJECT=$(gcloud projects list --format="value(PROJECT_ID)") 2 | 3 | gsutil setmeta -h "Content-Type:text/html" gs://$PROJECT-bucket/index.html 4 | -------------------------------------------------------------------------------- /mini lab/quicklab5.sh: -------------------------------------------------------------------------------- 1 | export PROJECT=$(gcloud projects list --format="value(PROJECT_ID)") 2 | 3 | gsutil iam get 
gs://$PROJECT-urgent 4 | 5 | gsutil iam ch -d allUsers gs://$PROJECT-urgent 6 | gsutil iam ch -d allAuthenticatedUsers gs://$PROJECT-urgent 7 | 8 | gsutil iam get gs://$PROJECT-urgent 9 | -------------------------------------------------------------------------------- /mini lab/quicklab6.sh: -------------------------------------------------------------------------------- 1 | export PROJECT_ID=$(gcloud projects list --format="value(PROJECT_ID)") 2 | 3 | export BUCKET=$PROJECT_ID-bucket 4 | 5 | 6 | cat > quicklab.json <