├── BigQuery Qwik Start Command Line README.md ├── Build and Deploy Machine Learning Solutions with Vertex AI: Challenge Lab ├── techcps.md └── vertex-challenge-lab.ipynb ├── Cloud Armor Preconfigured WAF Rules GSP879 README.md ├── Cloud Endpoints Qwik Start GSP164.md ├── Creating a Containerized Application with Buildpacks └── techcps.sh ├── Creating a Data Transformation Pipeline with Cloud Dataprep └── flow_Ecommerce_Analytics_Pipeline.zip ├── Creating a Data Warehouse Through Joins and Unions └── techcpsgsp413.sh ├── Creating a Gmail Add-on └── techcps249.md ├── Creating a Looker Modeled Query and Working with Quick Start └── techcps984.md ├── Creating a Persistent Disk └── techcps004.sh ├── Creating a Virtual Machine └── techcpsgsp001.sh ├── Creating and Alerting on Logs-based Metrics ├── techcps.md └── techcps091.sh ├── Creating and Populating a Bigtable Instance ├── techcps.md └── techcps1054.sh ├── Data Analytics SME Academy - Loading Data into Google Cloud SQL └── techcps196.sh ├── Data Loss Prevention: Qwik Start - JSON └── techcps107.sh ├── Dataflow Qwik Start - Python GSP207 README.md ├── Dataplex: Qwik Start - Command Line └── techcpsgsp1144.sh ├── Dataplex: Qwik Start - Console └── techcpsgsp1143.sh ├── Dataproc: Qwik Start - Command Line ├── techcps.md └── techcps104.sh ├── Dataproc: Qwik Start - Console ├── techcps103.md └── techcps103.sh ├── Debugging Apps on Google Kubernetes Engine └── techcpsgsp736.sh ├── Deploy Go Apps on Google Cloud Serverless Platforms ├── techcps702.sh └── techcpsgsp702.md ├── Deploy Kubernetes Applications on Google Cloud: Challenge Lab ├── techcps.md └── techcps318.sh ├── Deploy Kubernetes Load Balancer Service with Terraform └── techcps233.sh ├── Deploy Microsoft SQL Server to Compute Engine └── techcps031.sh ├── Deploy Your Website on Cloud Run GSP659.md ├── Deploy a Compute Instance with a Remote Startup Script: Challenge Lab └── techcps301.sh ├── Deploy a Hugo Website with Cloud Build and Firebase Pipeline ├── techcps1.sh ├── techcps2.sh └── techcps747.md ├── Deploy a Modern Web App connected to a Cloud Spanner Instance └── techcps1051.sh ├── Deploy a Streamlit App Integrated with Gemini Pro on Cloud Run └── techcps1229.sh ├── Deploy and Test a Visual Inspection AI Component Anomaly Detection Solution ├── techcps.md └── techcps896.sh ├── Deploy and Test a Visual Inspection AI Cosmetic Anomaly Detection Solution ├── techcps.md └── techcps898.sh ├── Deploy and Troubleshoot a Website: Challenge Lab ├── techcps.md └── techcps101.sh ├── Deploy, Scale, and Update Your Website on Google Kubernetes Engine └── techcpsgsp663.sh ├── Deploying Google Kubernetes Engine └── techcpslab.md ├── Deploying Redis Enterprise for GKE and Serverless App on Anthos Bare Metal └── techcpsgsp938.sh ├── Deploying a Containerized Application on Cloud Run └── techcps.sh ├── Deploying a Fault-Tolerant Microsoft Active Directory Environment ├── techcps.md └── techcps118.sh ├── Deploying a Python Flask Web Application to App Engine Flexible └── techcps023.sh ├── Deploying an Open Source Cassandra™ Database using the GCP Marketplace └── techcps704.md ├── Derive Insights from BigQuery Data: Challenge Lab └── techcps787.md ├── Designing and Querying Bigtable Schemas ├── techcps.md ├── techcps.sh └── techcps1053.sh ├── Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037 ├── city.png ├── donuts.png ├── selfie.png └── techcpsgsp037.md ├── Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API ├── city.png ├── donuts.png ├── selfie.png 
├── techcps.md └── techcps037.sh ├── Detect Manufacturing Defects using Visual Inspection AI: Challenge Lab └── techcps366.sh ├── Detect and Investigate Threats with Security Command Center └── techcpsgsp1125.md ├── Develop GenAI Apps with Gemini and Streamlit: Challenge Lab └── techcps517.md ├── Develop Serverless Applications on Cloud Run: Challenge Lab ├── techcps.md └── techcps328.sh ├── Develop Serverless Apps with Firebase: Challenge Lab └── techcpsgsp344.sh ├── Develop an App with Vertex AI Gemini 1.0 Pro ├── gemini-app │ ├── Dockerfile.txt │ ├── app.py │ ├── app_tab1.py │ ├── app_tab2.py │ ├── app_tab3.py │ ├── app_tab4.py │ ├── requirements.txt │ └── response_utils.py ├── techcps.md └── techcps.sh ├── Develop your Google Cloud Network: Challenge Lab └── techcps321.sh ├── Developing a REST API with Go and Cloud Run ├── main.go ├── techcps.md └── techcps761.sh ├── Developing and Deploying Cloud Functions └── techcps.sh ├── Dialogflow CX Bot Building Basics └── techcps928.blob ├── Dialogflow CX Enable IVR Features for your Voice Agent └── techcps967.blob ├── Dialogflow CX Managing Environments └── techcps929.blob ├── Dialogflow CX Parameter Manipulation └── techcps949.blob ├── Distributed Load Testing Using Kubernetes ├── techcps.md └── techcps182.sh ├── ETL Processing on Google Cloud Using Dataflow and BigQuery Python ├── techcps290.md └── techcps290.sh ├── Embedding Maps in Look └── techcps.md ├── Employing Best Practices for Improving the Usability of LookML Projects └── techcps1020.md ├── Engineer Data for Predictive Modeling with BigQuery ML: Challenge Lab └── techcps327.sh ├── Engineer Data in Google Cloud ├── Cloud Composer Copying BigQuery Tables Across Different Locations │ └── techcpsgsp283.sh └── Lab Solutions.md ├── Ensure Access & Identity in Google Cloud ├── Lab Solutions.md └── Setting up a Private Kubernetes Cluster │ └── techcpsgsp178.sh ├── Entity and Sentiment Analysis with the Natural Language API └── techcps038.sh ├── Eventarc for Cloud Run ├── techcps1.sh ├── techcps2.sh └── techcps773.md ├── Examining Billing data with BigQuery ├── techcps.md └── techcps.sh ├── Explore and Evaluate Models using Model Garden └── techcps1166.sh ├── Exploring IAM ├── sample.txt └── techcps.sh ├── Exploring NCAA Data with BigQuery └── techcpsgsp160.sh ├── Exploring Your Ecommerce Dataset with SQL in Google BigQuery └── techcpsgsp407.sh ├── Filtering Explores with LookML └── techcps892.md ├── GKE Workload Optimization ├── techcps1.sh ├── techcps2.sh └── techcpsgsp769.md ├── Get Started with TensorFlow on Google Cloud Challenge Lab └── cnn_challenge_lab.ipynb ├── Get Started with Vertex AI Studio ├── techcps1.json ├── techcps1154.md ├── techcps2.json ├── techcps3.json └── techcps4.json ├── Getting Started with BigQuery Machine Learning └── techcpsgsp247.sh ├── Getting Started with Cloud KMS └── techcpsgsp079.sh ├── Getting Started with Cloud Shell and gcloud └── techcpsgsp002.sh ├── Getting Started with Security Command Center ├── techcps.sh └── techcpsgsp1124.md ├── Getting Started with VPC Networking and Google Compute Engine └── techcps.sh ├── Getting Started with the Vertex AI Gemini API and Python SDK ├── intro_gemini_python-v1.0.0.ipynb └── intro_gemini_python.ipynb ├── Getting started with Flutter Development └── techcps885.md ├── Google AppSheet Getting Started └── Techcps.xlsx ├── Google Chat Bot - Apps Script └── techcps250.md ├── Google Cloud Packet Mirroring with OpenSource IDS ├── techcps.sh └── techcpsgsp474.sh ├── Google Cloud PubSub Qwik Start - Command Line GSP095.md 
├── Google Cloud SDK Qwik Start - RedhatCentos ├── techcps.sh └── techcps122.md ├── Google Cloud Speech-to-Text API: Qwik Start └── techcpsgsp119.sh ├── Google Kubernetes Engine Pipeline using Cloud Build └── techcps1077.sh ├── Google Sheets Getting Started ├── exported-data.csv ├── important-data.xlsx └── techcps469.md ├── Google Workspace for Education: Getting Started └── techcps978.csv ├── Hello Node Kubernetes └── techcpsgsp005.sh ├── Hosting a Web App on Google Cloud Using Compute Engine ├── startup-script.sh ├── techcps.sh └── techcps1.sh ├── How to Use a Network Policy on Google Kubernetes Engine ├── techcps.md └── techcps480.sh ├── IAM Custom Roles └── techcps190.sh ├── Identify Application Vulnerabilities with Security Command Center ├── app.py ├── techcps.sh └── techcps1.sh ├── Identify Damaged Car Parts with Vertex AutoML Vision ├── payload.json └── techcps972.sh ├── Implement CICD Pipelines on Google Cloud: Challenge Lab └── techcps393.sh ├── Implement Cloud Security Fundamentals on Google Cloud: Challenge Lab ├── techcps.md └── techcps342.sh ├── Implement DevOps Workflows in Google Cloud: Challenge Lab └── techcps330.sh ├── Implement Private Google Access and Cloud NAT └── techcps.sh ├── Implement continuous delivery with Gemini └── techcps.sh ├── Implementing Canary Releases of TensorFlow Model Deployments with Kubernetes and Anthos Service Mesh └── techcps778.sh ├── Implementing Least Privilege IAM Policy Bindings in Cloud Run [APPRUN] ├── techcps.sh └── techcpslab.sh ├── Importing Data to a Firestore Database ├── createTestData.js ├── importTestData.js └── techcpsgsp642.sh ├── Improving Network Performance I └── techcps045.sh ├── Infrastructure as Code with Terraform ├── techcps.sh └── techcpsgsp750.sh ├── Ingesting DICOM Data with the Healthcare API └── techcps615.sh ├── Ingesting FHIR Data with the Healthcare API ├── techcps.sh └── techcps457.md ├── Ingesting New Datasets into BigQuery ├── products.csv ├── techcps.csv ├── techcps.md └── techcps411.sh ├── Inspect Rich Documents with Gemini Multimodality and Multimodal RAG Challenge Lab ├── inspect_rich_documents_w_gemini_multimodality_and_multimodal_rag-v1.0.0.ipynb └── inspect_rich_documents_w_gemini_multimodality_and_multimodal_rag.ipynb ├── Integrating Cloud Functions with Firestore ├── techcps.md └── techcps.sh ├── Internal Load Balancer └── techcps041.sh ├── Introduction to APIs in Google Cloud ├── demo-image.png ├── techcps.md └── techcps294.sh ├── Introduction to Computer Vision with TensorFlow ├── callback_model.py ├── model.py ├── requirements.txt ├── techcps.md ├── techcps.sh ├── techcps631.sh ├── updated_model_1.py ├── updated_model_2.py └── updated_model_3.py ├── Introduction to Docker └── techcpsgsp055.sh ├── Introduction to SQL for BigQuery and Cloud SQL ├── end_station_data.csv ├── start_station_data.csv └── techcpsgsp281.sh ├── Loading Your Own Data into BigQuery ├── nyc_tlc_yellow_trips_2018_subset_1.csv ├── techcps.md └── techcps865.sh ├── Log Analytics on Google Cloud ├── techcps.md └── techcps1088.sh ├── Looker Data Explorer - Qwik Start └── techcps718.md ├── Looker Developer: Qwik Start └── techcps891.md ├── Manage Bigtable on Google Cloud: Challenge Lab └── techcps380.sh ├── Manage Kubernetes in Google Cloud: Challenge Lab ├── techcps.sh └── techcps510.md ├── Manage and Secure Distributed Services with GKE Managed Service Mesh ├── techcps1.sh ├── techcps1242.md └── techcps2.sh ├── Managing Vault Tokens └── techcpsgsp1006.md ├── Managing a GKE Multi-tenant Cluster with Namespaces └── techcpsgsp766.md 
├── Migrate MySQL Data to Cloud SQL using Database Migration Service: Challenge Lab └── techcps351.md ├── Migrate a MySQL Database to Google Cloud SQL: Challenge Lab ├── techcps.md └── techcps306.sh ├── Migrating a Monolithic Website to Microservices on Google Kubernetes Engine └── techcps699.sh ├── Mitigate Threats and Vulnerabilities with Security Command Center: Challenge Lab ├── findings.jsonl ├── techcps1.sh ├── techcps2.sh └── techcps382.md ├── Monitor an Apache Web Server using Ops Agent ├── techcps.md └── techcps1108.sh ├── Monitor and Log with Google Cloud Observability: Challenge Lab ├── techcps.md └── techcps338.sh ├── Monitoring Multiple Projects with Cloud Monitoring ├── techcps.md └── techcps090.sh ├── Monitoring and Logging for Cloud Functions ├── techcps.md ├── techcps092.md └── techcps092.sh ├── Monitoring and Managing Bigtable Health and Performance └── techcps1056.sh ├── Movie Recommendations in BigQuery ML 2.5 └── techcps.sh ├── Networking 101 ├── techcps.md ├── techcps.sh └── techcps016.sh ├── Optical Character Recognition OCR with Document AI Python ├── techcps.md └── techcps1138.sh ├── Optimize Costs for Google Kubernetes Engine: Challenge Lab ├── techcps.md └── techcps343.sh ├── Optimizing your BigQuery Queries for Performance 2.5 └── techcps.sh ├── Perform Foundational Data, ML, and AI Tasks in Google Cloud Challenge Lab GSP323 README.md ├── Perform Foundational Infrastructure Tasks in Google Cloud Challenge Lab GSP315README.md ├── Predict Bike Trip Duration with a Regression Model in BQML 2.5 └── techcps.sh ├── Predict Soccer Match Outcomes with BigQuery ML: Challenge Lab ├── techcps.md └── techcps.sh ├── Predict Taxi Fare with a BigQuery ML Forecasting Model └── techcpsgsp246.sh ├── Predict Visitor Purchases with a Classification Model in BigQuery ML ├── Techcps229.md └── techcpsgsp229.sh ├── Prepare Data for ML APIs on Google Cloud: Challenge Lab ├── techcps.md └── techcps323.sh ├── Prompt Design in Vertex AI Challenge Lab ├── Cymbal Product Analysis.json ├── Cymbal Tagline Generator Template.json ├── image-analysis.ipynb ├── tagline-generator.ipynb └── techcps519.md ├── Protect Cloud Traffic with BeyondCorp Enterprise BCE Security: Challenge Lab └── techcps373.sh ├── Provision Services with Google Cloud Marketplace ├── techcps.md └── techcps003.sh ├── Pub Sub Lite: Qwik Start └── techcps832.sh ├── Reduce Costs for the Managed Service for Prometheus └── techcps1027.sh ├── Reinforcement Learning: Qwik Start └── techcps691.sh ├── Running Windows Containers on Compute Engine └── techcps153.bat ├── Running a MongoDB Database in Kubernetes with StatefulSets └── techcps022.sh ├── Scale Out and Update a Containerized Application on a Kubernetes Cluster: Challenge Lab └── techcpsgsp305.sh ├── Scanning User-generated Content Using the Cloud Video Intelligence and Cloud Vision APIs GSP138 README.md ├── Secure Builds with Cloud Build └── techcps1184.sh ├── Secure Software Delivery: Challenge Lab ├── techcps.md └── techcps521.sh ├── Securing Cloud Applications with Identity Aware Proxy IAP using Zero-Trust ├── techcps.md └── techcps946.sh ├── Securing Compute Engine Applications and Resources using BeyondCorp Enterprise BCE └── techcps1033.sh ├── Securing Container Builds ├── techcps.md └── techcps1185.sh ├── Securing Google Cloud with CFT Scorecard └── techcps698.sh ├── Securing Virtual Machines using Chrome Enterprise Premium ├── techcps.md └── techcps1036.sh ├── Serverless Cloud Run Development: Challenge Lab └── techcpsgsp328.md ├── Serverless Data Processing with 
Dataflow - Monitoring, Logging and Error Reporting for Dataflow Jobs └── techcps.sh ├── Service Accounts and Roles: Fundamentals ├── techcps1.sh └── techcps2.sh ├── Service Directory: Qwik Start └── techcps732.sh ├── Set Up Network and HTTP Load Balancers └── techcpsgsp007.sh ├── Set Up a Google Cloud Network: Challenge Lab ├── techcps.md └── techcps314.sh ├── Set Up an App Dev Environment on Google Cloud: Challenge Lab ├── techcps.md └── techcps315.sh ├── Setting Up Cost Control with Quota ├── techcps.md └── techcps651.sh ├── Setting up Jenkins on Kubernetes Engine ├── techcps.md └── techcps117.sh ├── Speaking with a Webpage - Streaming Speech Transcripts ├── techcps.sh ├── techcps1.sh └── techcps125.md ├── Speech to Text Transcription with the Cloud Speech API └── tehcps048.md ├── Speech-to-Text API: Qwik Start └── techcps119.sh ├── Stream Processing with Cloud PubSub and Dataflow: Qwik Start └── techcps903.sh ├── Streaming HL7 to FHIR Data with Dataflow and the Healthcare API ├── techcps.md └── techcps894.sh ├── Summarize Text using SQL and LLMs in BigQuery ML └── techcps835.sh ├── Tagging Dataplex Assets ├── techcps.md └── techcps1145.sh ├── TensorFlow: Qwik Start ├── model.ipynb ├── techcps.md ├── techcps637.md └── techcps637.sh ├── Terraform Fundamentals └── techcpsgsp156.sh ├── Test Network Latency Between VMs ├── techcps.sh └── techcps161.md ├── Transacting Digital Assets with Multi-Party Computation and Confidential Space ├── mpc-ethereum-demo │ ├── Dockerfile │ ├── credential-config.js │ ├── index.js │ ├── kms-decrypt.js │ ├── mpc.js │ └── package.json ├── techcps.md └── techcps1128.sh ├── Translate Text with the Cloud Translation API ├── techcps.md └── techcps049.sh ├── Troubleshooting Common SQL Errors with BigQuery ├── techcps.md └── techcps408.sh ├── Troubleshooting Data Models in Looker └── techcpsgsp1019.md ├── Troubleshooting and Solving Data Join Pitfalls └── techcpsgsp412.sh ├── Understanding and Combining GKE Autoscaling Strategies ├── techcps.md └── techcps768.sh ├── Use Charts in Google Sheets ├── On the Rise Bakery Sales and Locations.xlsx ├── On the Rise Bakery.pptx └── techcps1061.md ├── Use Cloud Run Functions to Load BigQuery ├── techcps.md └── techcps.sh ├── Use Functions, Formulas, and Charts in Google Sheets Challenge Lab ├── On the Rise Bakery Business Challenge.xlsx └── Staff Roles.pptx ├── Use Go Code to Work with Google Cloud Data Sources └── techcpsgsp701.sh ├── Using BigQuery and Cloud Logging to Analyze BigQuery Usage ├── techcps.sh └── techcpsgsp617.md ├── Using BigQuery in the Google Cloud Console GSP406 ├── code.md └── techcpsgsp406.sh ├── Using Cloud PubSub with Cloud Run APPRUN ├── techcps.md └── techcps.sh ├── Using Cloud Trace on Kubernetes Engine └── techcpsgsp484.sh ├── Using OpenTSDB to Monitor Time-Series Data on Cloud Platform └── techcps142.sh ├── Using Specialized Processors with Document AI Python ├── techcps.md └── techcps1140.sh ├── VPC Flow Logs - Analyzing Network Traffic ├── techcps.md └── techcps212.sh ├── VPC Network Peering └── techcps193.sh ├── VPC Networking Fundamentals └── techcps210.sh ├── VPC Networking └── techcps.sh ├── VPC Networking: Cloud HA-VPN └── techcps619.sh ├── Validate Data in Google Sheets ├── techcps1062.md └── techcps1062.xlsx ├── Validating Policies for Terraform on Google Cloud └── techcpsgsp1021.md ├── Vertex AI PaLM API Qwik Start GSP1155 README.md ├── Vertex AI: Qwik Start ├── lab_exercise.ipynb └── techcps917.sh ├── Virtual Private Networks (VPN) README.md ├── Visualize the 10,000 Bitcoin Pizza 
Transaction Using BigQuery and Vertex AI Workbench ├── techcps.md ├── techcps604.sh └── visualizing_the_10000_pizza_bitcoin_network.ipynb ├── Visualizing Billing Data with Looker Studio └── techcps622.sh ├── Weather Data in BigQuery ├── techcps.sh └── techcpsgsp009.md ├── Web Security Scanner: Qwik Start └── techcps112.sh ├── Working with Artifact Registry └── techcps1076.sh ├── Working with JSON, Arrays, and Structs in BigQuery └── techcpsgsp416.sh ├── Working with Virtual Machines ├── cp.md └── techcps.sh ├── Working with the Google Cloud Console and Cloud Shell └── techcps.sh └── cptrick ├── techcps1184.md ├── techcps142.md ├── techcps190.md ├── techcps373.md └── techcps850.md /BigQuery Qwik Start Command Line README.md: -------------------------------------------------------------------------------- 1 | 2 | # BigQuery: Qwik Start - Command Line [GSP071] 3 | 4 | * In the GCP Console, open Cloud Shell and enter the following commands: 5 | 6 | ``` 7 | bq show bigquery-public-data:samples.shakespeare 8 | bq help query 9 | bq query --use_legacy_sql=false \ 10 | 'SELECT 11 | word, 12 | SUM(word_count) AS count 13 | FROM 14 | `bigquery-public-data`.samples.shakespeare 15 | WHERE 16 | word LIKE "%raisin%" 17 | GROUP BY 18 | word' 19 | bq query --use_legacy_sql=false \ 20 | 'SELECT 21 | word 22 | FROM 23 | `bigquery-public-data`.samples.shakespeare 24 | WHERE 25 | word = "huzzah"' 26 | bq ls 27 | bq ls bigquery-public-data: 28 | bq mk babynames 29 | bq ls 30 | curl -LO http://www.ssa.gov/OACT/babynames/names.zip 31 | ls 32 | unzip names.zip 33 | ls 34 | bq load babynames.names2010 yob2010.txt name:string,gender:string,count:integer 35 | bq ls babynames 36 | bq show babynames.names2010 37 | bq query "SELECT name,count FROM babynames.names2010 WHERE gender = 'F' ORDER BY count DESC LIMIT 5" 38 | bq query "SELECT name,count FROM babynames.names2010 WHERE gender = 'M' ORDER BY count ASC LIMIT 5" 39 | bq rm -r babynames 40 | ``` 41 | 42 | # Congratulations, you're all done with the lab 😄 43 | # If the video helped you complete your lab, please like and subscribe 44 | # Thanks for watching :) 45 | -------------------------------------------------------------------------------- /Build and Deploy Machine Learning Solutions with Vertex AI: Challenge Lab/techcps.md: -------------------------------------------------------------------------------- 1 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 2 | -------------------------------------------------------------------------------- /Cloud Endpoints Qwik Start GSP164.md: -------------------------------------------------------------------------------- 1 | 2 | # Cloud Endpoints: Qwik Start [GSP164] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | * In the GCP Console, open Cloud Shell and enter the following commands: 7 | 8 | ``` 9 | gcloud auth list 10 | gcloud config list project 11 | gsutil cp gs://spls/gsp164/endpoints-quickstart.zip .
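# The archive fetched above contains the sample API plus the helper scripts (deploy_api.sh, deploy_app.sh, query_api.sh) that the rest of this block runs.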
12 | unzip endpoints-quickstart.zip 13 | cd endpoints-quickstart/scripts 14 | 15 | ./deploy_api.sh 16 | 17 | ./deploy_app.sh 18 | 19 | ./query_api.sh 20 | 21 | ./query_api.sh JFK 22 | 23 | ./deploy_api.sh ../openapi_with_ratelimit.yaml 24 | 25 | ./deploy_app.sh 26 | 27 | gcloud alpha services api-keys create --display-name="techcps" 28 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=techcps") 29 | export API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 30 | ./query_api_with_key.sh $API_KEY 31 | ./generate_traffic_with_key.sh $API_KEY 32 | ./query_api_with_key.sh $API_KEY 33 | ``` 34 | 35 | # Congratulations, you're all done with the lab 😄 36 | 37 | # Thanks for watching :) 38 | -------------------------------------------------------------------------------- /Creating a Data Transformation Pipeline with Cloud Dataprep/flow_Ecommerce_Analytics_Pipeline.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Creating a Data Transformation Pipeline with Cloud Dataprep/flow_Ecommerce_Analytics_Pipeline.zip -------------------------------------------------------------------------------- /Creating a Persistent Disk/techcps004.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | gcloud config set compute/zone $ZONE 5 | export REGION=${ZONE%-*} 6 | gcloud config set compute/region $REGION 7 | 8 | 9 | gcloud compute instances create gcelab --zone $ZONE --machine-type e2-standard-2 10 | 11 | 12 | gcloud compute disks create mydisk --size=200GB \ 13 | --zone $ZONE 14 | 15 | gcloud compute instances attach-disk gcelab --disk mydisk --zone $ZONE 16 | 17 | gcloud compute instances create gcelab2 --machine-type e2-medium --zone=$ZONE 18 | 19 | 20 | gcloud compute ssh gcelab --zone $ZONE --quiet --command "sudo mkdir /mnt/mydisk && 21 | sudo mkfs.ext4 -F -E lazy_itable_init=0,lazy_journal_init=0,discard /dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 && 22 | sudo mount -o discard,defaults /dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 /mnt/mydisk && 23 | echo '/dev/disk/by-id/scsi-0Google_PersistentDisk_persistent-disk-1 /mnt/mydisk ext4 defaults 1 1' | sudo tee -a /etc/fstab" 24 | 25 | 26 | -------------------------------------------------------------------------------- /Creating a Virtual Machine/techcpsgsp001.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud compute instances create gcelab --zone=$ZONE --machine-type=e2-medium --boot-disk-size=10GB --image-family=debian-11 --image-project=debian-cloud --create-disk=size=10GB,type=pd-balanced --tags=http-server 3 | 4 | gcloud compute instances create gcelab2 --machine-type e2-medium --zone=$ZONE 5 | 6 | gcloud compute firewall-rules create allow-http --action=ALLOW --direction=INGRESS --rules=tcp:80 --source-ranges=0.0.0.0/0 --target-tags=http-server 7 | 8 | gcloud compute ssh gcelab --zone=$ZONE --quiet --command "sudo apt-get update && sudo apt-get install -y nginx && ps auwx | grep nginx " 9 | -------------------------------------------------------------------------------- /Creating and Alerting on Logs-based Metrics/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to 
[Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | 6 | ## 🚨Export the ZONE name correctly: 7 | 8 | ``` 9 | export ZONE= 10 | ``` 11 | 12 | ## 🚨Copy and run the below commands: 13 | 14 | ``` 15 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Creating%20and%20Alerting%20on%20Logs-based%20Metrics/techcps091.sh 16 | sudo chmod +x techcps091.sh 17 | ./techcps091.sh 18 | 19 | ``` 20 | 21 | ## Congratulations, you're all done with the lab 😄 22 | 23 | # Thanks for watching :) 24 | -------------------------------------------------------------------------------- /Creating and Populating a Bigtable Instance/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | 6 | ``` 7 | export ZONE= 8 | ``` 9 | ``` 10 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Creating%20and%20Populating%20a%20Bigtable%20Instance/techcps1054.sh 11 | sudo chmod +x techcps1054.sh 12 | ./techcps1054.sh 13 | ``` 14 | 15 | # Note: Check your progress on Tasks 1-3 16 | ## Do not run the next commands until you get a score on Tasks 1 to 3 17 | 18 | ``` 19 | cbt -instance personalized-sales ls 20 | 21 | cbt -instance personalized-sales deletetable UserSessions 22 | 23 | cbt deleteinstance personalized-sales 24 | 25 | ``` 26 | 27 | ## Congratulations, you're all done with the lab 😄 28 | 29 | # Thanks for watching :) 30 | -------------------------------------------------------------------------------- /Creating and Populating a Bigtable Instance/techcps1054.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION="${ZONE%-*}" 4 | 5 | # Disable Dataflow API 6 | gcloud services disable dataflow.googleapis.com 7 | 8 | # Enable Dataflow API 9 | gcloud services enable dataflow.googleapis.com 10 | 11 | 12 | gcloud bigtable instances tables create UserSessions --project=$DEVSHELL_PROJECT_ID --instance=personalized-sales --column-families=Interactions,Sales 13 | 14 | gsutil mb -l US gs://$DEVSHELL_PROJECT_ID 15 | 16 | sleep 60 17 | 18 | 19 | #!/bin/bash 20 | 21 | deploy_function() { 22 | gcloud dataflow jobs run import-usersessions \ 23 | --region=$REGION \ 24 | --gcs-location gs://dataflow-templates-$REGION/latest/GCS_SequenceFile_to_Cloud_Bigtable \ 25 | --staging-location gs://$DEVSHELL_PROJECT_ID/temp \ 26 | --parameters bigtableProject=$DEVSHELL_PROJECT_ID,bigtableInstanceId=personalized-sales,bigtableTableId=UserSessions,sourcePattern=gs://cloud-training/OCBL377/retail-interactions-sales-00000-of-00001,mutationThrottleLatencyMs=0 27 | } 28 | 29 | deploy_success=false 30 | 31 | while [ "$deploy_success" = false ]; do 32 | if deploy_function; then 33 | echo "Dataflow job launched successfully. (https://www.youtube.com/@techcps)" 34 | deploy_success=true 35 | else 36 | echo "Deployment failed, retrying; please subscribe to techcps (https://www.youtube.com/@techcps).."
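# The Dataflow job launch can fail while the freshly re-enabled API finishes propagating, so the loop waits 10 seconds and retries until the job starts.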
37 | sleep 10 38 | fi 39 | done 40 | 41 | 42 | 43 | 44 | echo project = `gcloud config get-value project` \ 45 | >> ~/.cbtrc 46 | 47 | echo instance = personalized-sales \ 48 | >> ~/.cbtrc 49 | 50 | cat ~/.cbtrc 51 | 52 | 53 | -------------------------------------------------------------------------------- /Data Analytics SME Academy - Loading Data into Google Cloud SQL/techcps196.sh: -------------------------------------------------------------------------------- 1 | 2 | git clone \ 3 | https://github.com/GoogleCloudPlatform/data-science-on-gcp/ 4 | 5 | cd data-science-on-gcp/03_sqlstudio 6 | 7 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 8 | export BUCKET=${PROJECT_ID}-ml 9 | 10 | gsutil cp create_table.sql \ 11 | gs://$BUCKET/create_table.sql 12 | 13 | 14 | gcloud sql instances create flights \ 15 | --database-version=POSTGRES_13 --cpu=2 --memory=8GiB \ 16 | --region=us-east4 --root-password=Passw0rd 17 | 18 | export ADDRESS=$(curl -s http://ipecho.net/plain)/32 19 | 20 | gcloud sql instances patch flights --authorized-networks $ADDRESS --quiet 21 | 22 | gcloud sql databases create bts --instance=flights 23 | 24 | 25 | -------------------------------------------------------------------------------- /Data Loss Prevention: Qwik Start - JSON/techcps107.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 6 | 7 | cat > inspect-request.json < new-inspect-file.json < Settings 6 | * Enable all three of these GCP services: 7 | 8 | > **Cloud Functions** 9 | 10 | > **Cloud Run** 11 | 12 | > **App Engine** 13 | 14 | ## 🚨Export the variable correctly: 15 | ``` 16 | export REGION= 17 | ``` 18 | 19 | ## 🚨Copy and run the below commands in Cloud Shell: 20 | ``` 21 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Deploy%20Go%20Apps%20on%20Google%20Cloud%20Serverless%20Platforms/techcps702.sh 22 | sudo chmod +x techcps702.sh 23 | ./techcps702.sh 24 | ``` 25 | 26 | ## Congratulations, you're all done with the lab 😄 27 | 28 | # Thanks for watching :) 29 | -------------------------------------------------------------------------------- /Deploy Kubernetes Applications on Google Cloud: Challenge Lab/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | 6 | ## 🚨 Export the variables correctly 7 | 8 | ``` 9 | export DOCKER_IMAGE= 10 | export TAG_NAME= 11 | export REPO_NAME= 12 | export ZONE= 13 | ``` 14 | 15 | ## 🚨 Copy and run the below commands 16 | 17 | ``` 18 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Deploy%20Kubernetes%20Applications%20on%20Google%20Cloud%3A%20Challenge%20Lab/techcps318.sh 19 | sudo chmod +x techcps318.sh 20 | ./techcps318.sh 21 | ``` 22 | 23 | ## Congratulations, you're all done with the lab 😄 24 | 25 | # Thanks for watching :) 26 | -------------------------------------------------------------------------------- /Deploy Kubernetes Applications on Google Cloud: Challenge Lab/techcps318.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | export REGION="${ZONE%-*}" 6 | 7 | gsutil cat gs://cloud-training/gsp318/marking/setup_marking_v2.sh | bash 8 | gcloud source repos clone valkyrie-app 9 | cd valkyrie-app 10 |
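# The next command writes the Dockerfile used to build the valkyrie-app image.
# A minimal sketch of the kind of Dockerfile this step produces (an assumption
# for illustration, not verbatim from this repo):
#   FROM golang:1.10
#   WORKDIR /go/src/app
#   COPY source .
#   RUN go install -v
#   ENTRYPOINT ["app", "-single=false", "-frontend=false"]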
cat > Dockerfile <> config.toml 30 | 31 | 32 | 33 | 34 | sudo rm -r themes/hello-friend-ng/.git 35 | sudo rm themes/hello-friend-ng/.gitignore 36 | 37 | 38 | 39 | 40 | cd ~/my_hugo_site 41 | /tmp/hugo server -D --bind 0.0.0.0 --port 8080 42 | 43 | 44 | 45 | -------------------------------------------------------------------------------- /Deploy a Hugo Website with Cloud Build and Firebase Pipeline/techcps2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | curl -sL https://firebase.tools | bash 5 | 6 | 7 | 8 | cd ~/my_hugo_site 9 | firebase init 10 | 11 | 12 | /tmp/hugo && firebase deploy 13 | 14 | 15 | 16 | git config --global user.name "hugo" 17 | git config --global user.email "hugo@blogger.com" 18 | 19 | 20 | 21 | cd ~/my_hugo_site 22 | echo "resources" >> .gitignore 23 | 24 | 25 | 26 | git add . 27 | git commit -m "Add app to Cloud Source Repositories" 28 | git push -u origin master 29 | 30 | 31 | 32 | cd ~/my_hugo_site 33 | cp /tmp/cloudbuild.yaml . 34 | 35 | 36 | echo -e "options:\n logging: CLOUD_LOGGING_ONLY" >> cloudbuild.yaml 37 | 38 | 39 | 40 | gcloud alpha builds triggers import --source=/tmp/trigger.yaml 41 | 42 | 43 | gcloud alpha builds triggers import --source=/tmp/trigger.yaml 44 | 45 | 46 | 47 | cd ~/my_hugo_site 48 | 49 | 50 | 51 | # Edit the file config.toml 52 | 53 | sed -i "3c\title = 'Blogging with Hugo and Cloud Build'" config.toml 54 | 55 | 56 | 57 | git add . 58 | git commit -m "I updated the site title" 59 | git push -u origin master 60 | 61 | 62 | 63 | sleep 15 64 | 65 | 66 | 67 | gcloud builds list 68 | 69 | 70 | 71 | gcloud builds log $(gcloud builds list --format='value(ID)' --filter=$(git rev-parse HEAD)) 72 | 73 | 74 | 75 | gcloud builds log $(gcloud builds list --format='value(ID)' --filter=$(git rev-parse HEAD)) | grep "Hosting URL" 76 | 77 | -------------------------------------------------------------------------------- /Deploy a Modern Web App connected to a Cloud Spanner Instance/techcps1051.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud config set project $DEVSHELL_PROJECT_ID 6 | 7 | gcloud services enable spanner.googleapis.com artifactregistry.googleapis.com containerregistry.googleapis.com run.googleapis.com 8 | 9 | sleep 20 10 | 11 | 12 | git clone https://github.com/GoogleCloudPlatform/training-data-analyst 13 | 14 | 15 | cd training-data-analyst/courses/cloud-spanner/omegatrade/backend 16 | 17 | 18 | cat > .env < 28 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** 29 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** 30 | - **Join our [Telegram Channel](https://t.me/Techcps)** 31 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** 32 | 33 | ### Thanks for watching and stay connected :) 34 | -------------------------------------------------------------------------------- /Deploy and Troubleshoot a Website: Challenge Lab/techcps101.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | export PROJECT_ID=$(gcloud config get-value project) 5 | 6 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 7 | 8 | gcloud config set compute/zone $ZONE 9 | 10 | gcloud compute instances create $VM_NAME \ 11 | --zone=$ZONE --project=$DEVSHELL_PROJECT_ID \ 12 | --machine-type=f1-micro \ 13 |
--network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default \ 14 | --metadata=startup-script='#!/bin/bash 15 | apt-get update 16 | apt-get install apache2 -y 17 | systemctl start apache2 18 | systemctl enable apache2' \ 19 | --tags=http-server,https-server \ 20 | --create-disk=auto-delete=yes,boot=yes,device-name=$VM_NAME,image=projects/debian-cloud/global/images/debian-12-bookworm-v20240312,mode=rw,size=10,type=pd-balanced \ 21 | --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring \ 22 | --reservation-affinity=any 23 | 24 | gcloud compute firewall-rules create allow-http --action=ALLOW --direction=INGRESS --target-tags=http-server --source-ranges=0.0.0.0/0 --rules=tcp:80 --description="Allow incoming HTTP traffic" 25 | 26 | 27 | IP_CP=$(gcloud compute instances list --filter="name=('$VM_NAME')" --zones="$ZONE" --format='value(EXTERNAL_IP)') 28 | 29 | 30 | curl http://$IP_CP 31 | -------------------------------------------------------------------------------- /Deploy, Scale, and Update Your Website on Google Kubernetes Engine/techcpsgsp663.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | gcloud config list project 4 | gcloud config set compute/zone $ZONE 5 | 6 | gcloud services enable container.googleapis.com 7 | 8 | gcloud container clusters create fancy-cluster --num-nodes 3 9 | 10 | gcloud compute instances list 11 | 12 | cd ~ 13 | 14 | git clone https://github.com/googlecodelabs/monolith-to-microservices.git 15 | 16 | cd ~/monolith-to-microservices 17 | 18 | ./setup.sh 19 | 20 | nvm install --lts 21 | 22 | gcloud services enable cloudbuild.googleapis.com 23 | 24 | cd ~/monolith-to-microservices/monolith 25 | 26 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 . 27 | 28 | kubectl create deployment monolith --image=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:1.0.0 29 | 30 | kubectl get all 31 | 32 | kubectl expose deployment monolith --type=LoadBalancer --port 80 --target-port 8080 33 | 34 | kubectl get service 35 | 36 | kubectl scale deployment monolith --replicas=3 37 | 38 | kubectl get all 39 | 40 | cd ~/monolith-to-microservices/react-app/src/pages/Home 41 | mv index.js.new index.js 42 | 43 | cat ~/monolith-to-microservices/react-app/src/pages/Home/index.js 44 | 45 | cd ~/monolith-to-microservices/react-app 46 | npm run build:monolith 47 | 48 | cd ~/monolith-to-microservices/monolith 49 | 50 | gcloud builds submit --tag gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 . 
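# The kubectl set image call below performs a rolling update to the 2.0.0 image: Kubernetes replaces the monolith pods one at a time, so the site stays reachable throughout.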
51 | 52 | kubectl set image deployment/monolith monolith=gcr.io/${GOOGLE_CLOUD_PROJECT}/monolith:2.0.0 53 | 54 | kubectl get pods 55 | 56 | -------------------------------------------------------------------------------- /Deploying Google Kubernetes Engine/techcpslab.md: -------------------------------------------------------------------------------- 1 | 2 | # Deploying Google Kubernetes Engine 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | ``` 7 | export ZONE= 8 | ``` 9 | 10 | ``` 11 | gcloud auth list 12 | gcloud config list project 13 | gcloud container clusters create standard-cluster-1 --zone=$ZONE --num-nodes=3 14 | gcloud container clusters resize standard-cluster-1 --zone=$ZONE --num-nodes=4 15 | ``` 16 | ## NOTE: Kubernetes Engine > Workloads > Click Create Deployment 17 | * Click Continue > Continue > Deploy 18 | 19 | ## Congratulations, you're all done with the lab 😄 20 | 21 | # Thanks for watching :) 22 | -------------------------------------------------------------------------------- /Deploying a Fault-Tolerant Microsoft Active Directory Environment/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | ## 💡 Lab Link: [Deploying a Fault-Tolerant Microsoft Active Directory Environment - GSP118](https://www.cloudskillsboost.google/focuses/1817?parent=catalog) 3 | 4 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/O2EZhqO83_c) 5 | 6 | --- 7 | 8 | ## 🚨Export the variables from Task 1 9 | 10 | ## 🚨Copy and run the below commands in Cloud Shell: 11 | 12 | ``` 13 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Deploying%20a%20Fault-Tolerant%20Microsoft%20Active%20Directory%20Environment/techcps118.sh 14 | sudo chmod +x techcps118.sh 15 | ./techcps118.sh 16 | ``` 17 | 18 | ## Congratulations, you're all done with the lab 😄 19 | 20 | --- 21 | 22 | ### 🌐 Join our Community 23 | 24 | - **Join our [Discussion Group](https://t.me/Techcpschat)** 25 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** 26 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** 27 | - **Join our [Telegram Channel](https://t.me/Techcps)** 28 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** 29 | 30 | ### Thanks for watching :) 31 | -------------------------------------------------------------------------------- /Deploying a Fault-Tolerant Microsoft Active Directory Environment/techcps118.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud auth list 5 | 6 | gcloud config set compute/region ${region1} 7 | 8 | gcloud compute networks create ${vpc_name} \ 9 | --description "VPC network to deploy Active Directory" \ 10 | --subnet-mode custom 11 | 12 | 13 | gcloud compute networks subnets create private-ad-zone-1 \ 14 | --network ${vpc_name} \ 15 | --range 10.1.0.0/24 \ 16 | --region ${region1} 17 | 18 | 19 | gcloud compute networks subnets create private-ad-zone-2 \ 20 | --network ${vpc_name} \ 21 | --range 10.2.0.0/24 \ 22 | --region ${region2} 23 | 24 | 25 | gcloud compute firewall-rules create allow-internal-ports-private-ad \ 26 | --network ${vpc_name} \ 27 | --allow tcp:1-65535,udp:1-65535,icmp \ 28 | --source-ranges 10.1.0.0/24,10.2.0.0/24 29 | 30 | 31 | gcloud compute firewall-rules create allow-rdp \ 32 | --network ${vpc_name} \ 33 | --allow tcp:3389 \ 34 | --source-ranges 0.0.0.0/0 35 | 36 | gcloud compute instances create
ad-dc1 --machine-type e2-standard-2 \ 37 | --boot-disk-type pd-ssd \ 38 | --boot-disk-size 50GB \ 39 | --image-family windows-2016 --image-project windows-cloud \ 40 | --network ${vpc_name} \ 41 | --zone ${zone_1} --subnet private-ad-zone-1 \ 42 | --private-network-ip=10.1.0.100 43 | 44 | 45 | export project_id=$(gcloud config get-value project) 46 | gcloud config set compute/region ${region2} 47 | 48 | 49 | gcloud compute instances create ad-dc2 --machine-type e2-standard-2 \ 50 | --boot-disk-size 50GB \ 51 | --boot-disk-type pd-ssd \ 52 | --image-family windows-2016 --image-project windows-cloud \ 53 | --can-ip-forward \ 54 | --network ${vpc_name} \ 55 | --zone ${zone_2} \ 56 | --subnet private-ad-zone-2 \ 57 | --private-network-ip=10.2.0.100 58 | 59 | 60 | 61 | -------------------------------------------------------------------------------- /Derive Insights from BigQuery Data: Challenge Lab/techcps787.md: -------------------------------------------------------------------------------- 1 | 2 | # Derive Insights from BigQuery Data: Challenge Lab [GSP787] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 5 | 6 | 7 | ``` 8 | export DATE_CP1=2020-04-10 9 | 10 | 11 | export DEATH_COUNT_CP2=100 12 | export DATE_CP2=2020-04-10 13 | 14 | 15 | export DEATH_COUNT_CP3=3000 16 | export DATE_CP3=2020-04-10 17 | 18 | 19 | export DATE_X_CP4=2020-05-01 20 | export DATE_Y_CP4=2020-05-31 21 | 22 | 23 | export DEATHS_CROSS_CP5=14000 24 | 25 | 26 | export DATE_X_CP6=2020-02-22 27 | export DATE_Y_CP6=2020-03-13 28 | 29 | 30 | export DOUBLING_RATE_CP7=10 31 | export DATE_X_CP7=2020-03-22 32 | export DATE_Y_CP7=2020-04-20 33 | 34 | 35 | export LIMIT_CP8=10 36 | export DATE_CP8=2020-05-10 37 | 38 | 39 | export DATE_X_CP9=2020-01-24 40 | export DATE_Y_CP9=2020-04-10 41 | 42 | 43 | export DATE_X_CP10=2020-03-24 44 | export DATE_Y_CP10=2020-04-22 45 | 46 | 47 | 48 | curl -LO raw.githubusercontent.com/Techcps/GSP/master/Derive%20Insights%20from%20BigQuery%20Data%3A%20Challenge%20Lab/techcps787.sh 49 | sudo chmod +x techcps787.sh 50 | ./techcps787.sh 51 | 52 | ``` 53 | ## Congratulations, you're all done with the lab 😄 54 | 55 | # Thanks for watching :) 56 | -------------------------------------------------------------------------------- /Designing and Querying Bigtable Schemas/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | 6 | ``` 7 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Designing%20and%20Querying%20Bigtable%20Schemas/techcps1053.sh 8 | sudo chmod +x techcps1053.sh 9 | ./techcps1053.sh 10 | 11 | ``` 12 | 13 | # Note: Check your progress on the first Check my progress of Task 3 14 | # Do not run the next commands until you get a score on the first Check my progress of Task 3 15 | 16 | ``` 17 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Designing%20and%20Querying%20Bigtable%20Schemas/techcps.sh 18 | sudo chmod +x techcps.sh 19 | ./techcps.sh 20 | 21 | ``` 22 | 23 | ## Congratulations, you're all done with the lab 😄 24 | 25 | # Thanks for watching :) 26 | -------------------------------------------------------------------------------- /Designing and Querying Bigtable Schemas/techcps.sh:
-------------------------------------------------------------------------------- 1 | 2 | 3 | cbt set test-sessions green1939#1638940844260 Interactions:red_hat=seen 4 | 5 | cbt set test-sessions blue2737#1638940844260 Sales:sale=blue_dress#blue_jacket 6 | 7 | cbt read test-sessions 8 | 9 | cbt deletetable test-sessions 10 | 11 | -------------------------------------------------------------------------------- /Designing and Querying Bigtable Schemas/techcps1053.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | echo project = `gcloud config get-value project` \ 5 | >> ~/.cbtrc 6 | 7 | cbt listinstances 8 | 9 | echo instance = personalized-sales \ 10 | >> ~/.cbtrc 11 | 12 | 13 | cat ~/.cbtrc 14 | 15 | cbt ls 16 | 17 | cbt createtable test-sessions 18 | 19 | cbt createfamily test-sessions Interactions 20 | 21 | cbt createfamily test-sessions Sales 22 | 23 | cbt ls test-sessions 24 | 25 | 26 | -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/city.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/city.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/donuts.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/donuts.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/selfie.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/selfie.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037/techcpsgsp037.md: -------------------------------------------------------------------------------- 1 | 2 | # Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API GSP037 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | * In the GCP Console open the Cloud Shell and enter the following commands: 7 | 8 | ``` 9 | gcloud alpha services api-keys create --display-name="techcps" 10 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=techcps") 11 | export API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 12 | export PROJECT_ID=$(gcloud config list --format 'value(core.project)') 13 | gsutil mb gs://$PROJECT_ID 14 | ``` 15 | 16 | # Note: Go to Bucket and upload all three images then run the following commands 17 | 18 | ``` 19 | gcloud storage objects update gs://$PROJECT_ID/donuts.png --add-acl-grant=entity=AllUsers,role=READER 20 | gcloud storage objects update gs://$PROJECT_ID/selfie.png --add-acl-grant=entity=AllUsers,role=READER 21 | gcloud storage objects 
update gs://$PROJECT_ID/city.png --add-acl-grant=entity=AllUsers,role=READER 22 | ``` 23 | 24 | # Congratulations, you're all done with the lab 😄 25 | 26 | # Thanks for watching :) 27 | -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/city.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/city.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/donuts.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/donuts.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/selfie.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/selfie.png -------------------------------------------------------------------------------- /Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API/techcps037.sh: -------------------------------------------------------------------------------- 1 | #new! 2 | 3 | gcloud auth list 4 | 5 | gcloud alpha services api-keys create --display-name="techcps" 6 | 7 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=techcps") 8 | 9 | export API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 10 | 11 | export PROJECT_ID=$(gcloud config list --format 'value(core.project)') 12 | 13 | gsutil mb gs://$PROJECT_ID 14 | 15 | 16 | curl -L -o donuts.png https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Detect%20Labels%2C%20Faces%2C%20and%20Landmarks%20in%20Images%20with%20the%20Cloud%20Vision%20API/donuts.png 17 | 18 | curl -L -o selfie.png https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Detect%20Labels%2C%20Faces%2C%20and%20Landmarks%20in%20Images%20with%20the%20Cloud%20Vision%20API/selfie.png 19 | 20 | curl -L -o city.png https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Detect%20Labels%2C%20Faces%2C%20and%20Landmarks%20in%20Images%20with%20the%20Cloud%20Vision%20API/city.png 21 | 22 | 23 | gsutil cp donuts.png gs://$PROJECT_ID/donuts.png 24 | 25 | gsutil cp selfie.png gs://$PROJECT_ID/selfie.png 26 | 27 | gsutil cp city.png gs://$PROJECT_ID/city.png 28 | 29 | gcloud storage objects update gs://$PROJECT_ID/donuts.png --add-acl-grant=entity=AllUsers,role=READER 30 | 31 | gcloud storage objects update gs://$PROJECT_ID/selfie.png --add-acl-grant=entity=AllUsers,role=READER 32 | 33 | gcloud storage objects update gs://$PROJECT_ID/city.png --add-acl-grant=entity=AllUsers,role=READER 34 | 35 | -------------------------------------------------------------------------------- /Detect Manufacturing Defects using Visual Inspection AI: Challenge Lab/techcps366.sh: -------------------------------------------------------------------------------- 1 | 2 | export PROJECT_ID=$(gcloud config
get-value core/project) 3 | 4 | export container_registry=gcr.io/ql-shared-resources-test/defect_solution@sha256:776fd8c65304ac017f5b9a986a1b8189695b7abbff6aa0e4ef693c46c7122f4c 5 | 6 | export VISERVING_CPU_DOCKER_WITH_MODEL=${container_registry} 7 | export HTTP_PORT=8602 8 | export LOCAL_METRIC_PORT=8603 9 | 10 | docker pull ${VISERVING_CPU_DOCKER_WITH_MODEL} 11 | 12 | docker run -v /secrets:/secrets --rm -d --name $container_name \ 13 | --network="host" \ 14 | -p ${HTTP_PORT}:8602 \ 15 | -p ${LOCAL_METRIC_PORT}:8603 \ 16 | -t ${VISERVING_CPU_DOCKER_WITH_MODEL} 17 | 18 | docker container ls 19 | 20 | gsutil cp gs://cloud-training/gsp895/prediction_script.py . 21 | 22 | gsutil mb gs://${PROJECT_ID} 23 | gsutil -m cp gs://cloud-training/gsp897/cosmetic-test-data/*.png \ 24 | gs://${PROJECT_ID}/cosmetic-test-data/ 25 | gsutil cp gs://${PROJECT_ID}/cosmetic-test-data/IMG_07703.png . 26 | 27 | python3 ./prediction_script.py --input_image_file=./IMG_07703.png --port=8602 --output_result_file="$defective" 28 | 29 | gsutil cp gs://${PROJECT_ID}/cosmetic-test-data/IMG_0769.png . 30 | 31 | python3 ./prediction_script.py --input_image_file=./IMG_0769.png --port=8602 --output_result_file="$non_defective" 32 | 33 | -------------------------------------------------------------------------------- /Develop an App with Vertex AI Gemini 1.0 Pro/gemini-app/Dockerfile.txt: -------------------------------------------------------------------------------- 1 | FROM python:3.8 2 | 3 | EXPOSE 8080 4 | WORKDIR /app 5 | 6 | COPY . ./ 7 | 8 | RUN pip install -r requirements.txt 9 | 10 | ENTRYPOINT ["streamlit", "run", "app.py", "--server.port=8080", "--server.address=0.0.0.0"] 11 | -------------------------------------------------------------------------------- /Develop an App with Vertex AI Gemini 1.0 Pro/gemini-app/app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import streamlit as st 3 | from app_tab1 import render_story_tab 4 | from vertexai.preview.generative_models import GenerativeModel 5 | import vertexai 6 | import logging 7 | from google.cloud import logging as cloud_logging 8 | 9 | # configure logging 10 | logging.basicConfig(level=logging.INFO) 11 | # attach a Cloud Logging handler to the root logger 12 | log_client = cloud_logging.Client() 13 | log_client.setup_logging() 14 | 15 | PROJECT_ID = os.environ.get('PROJECT_ID') # Your Qwiklabs Google Cloud Project ID 16 | LOCATION = os.environ.get('REGION') # Your Qwiklabs Google Cloud Project Region 17 | vertexai.init(project=PROJECT_ID, location=LOCATION) 18 | 19 | @st.cache_resource 20 | def load_models(): 21 | text_model_pro = GenerativeModel("gemini-pro") 22 | multimodal_model_pro = GenerativeModel("gemini-pro-vision") 23 | return text_model_pro, multimodal_model_pro 24 | 25 | st.header("Vertex AI Gemini API", divider="rainbow") 26 | text_model_pro, multimodal_model_pro = load_models() 27 | 28 | tab1, tab2, tab3, tab4 = st.tabs(["Story", "Marketing Campaign", "Image Playground", "Video Playground"]) 29 | 30 | with tab1: 31 | render_story_tab(text_model_pro) 32 | 33 | 34 | from app_tab2 import render_mktg_campaign_tab 35 | 36 | with tab2: 37 | render_mktg_campaign_tab(text_model_pro) 38 | 39 | 40 | from app_tab3 import render_image_playground_tab 41 | 42 | with tab3: 43 | render_image_playground_tab(multimodal_model_pro) 44 | 45 | 46 | from app_tab4 import render_video_playground_tab 47 | 48 | with tab4: 49 | render_video_playground_tab(multimodal_model_pro) 50 | 
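# Local run sketch (an assumption, not one of the lab steps): export the two
# environment variables the app reads, then serve it with Streamlit, e.g.
#   PROJECT_ID=<your-project> REGION=us-central1 streamlit run app.py --server.port=8080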
-------------------------------------------------------------------------------- /Develop an App with Vertex AI Gemini 1.0 Pro/gemini-app/requirements.txt: -------------------------------------------------------------------------------- 1 | streamlit 2 | google-cloud-aiplatform==1.38.1 3 | google-cloud-logging==3.6.0 4 | -------------------------------------------------------------------------------- /Developing a REST API with Go and Cloud Run/techcps761.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | export PROJECT_ID=$(gcloud config get-value project) 6 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 7 | 8 | gcloud services enable cloudbuild.googleapis.com cloudfunctions.googleapis.com run.googleapis.com 9 | 10 | sleep 30 11 | 12 | gcloud config set project $(gcloud projects list --format='value(PROJECT_ID)' --filter='qwiklabs-gcp') 13 | 14 | git clone https://github.com/rosera/pet-theory.git && cd pet-theory/lab08 15 | 16 | cat > main.go < Dockerfile < ${TRIP_DISTANCE_VALUE} 20 | AND fare_amount >= ${FARE_AMOUNT_VALUE} 21 | AND pickup_longitude > -78 22 | AND pickup_longitude < -70 23 | AND dropoff_longitude > -78 24 | AND dropoff_longitude < -70 25 | AND pickup_latitude > 37 26 | AND pickup_latitude < 45 27 | AND dropoff_latitude > 37 28 | AND dropoff_latitude < 45 29 | AND passenger_count > ${PASSENGER_COUNT_VALUE} 30 | " 31 | 32 | 33 | 34 | # Construct the SQL query with variable interpolation 35 | SQL_QUERY=$(cat < request.json < result.json 24 | 25 | 26 | -------------------------------------------------------------------------------- /Eventarc for Cloud Run/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | gcloud config set project $DEVSHELL_PROJECT_ID 4 | 5 | gcloud config set run/region $REGION 6 | 7 | gcloud config set run/platform managed 8 | 9 | gcloud config set eventarc/location $REGION 10 | 11 | # Task 2 12 | 13 | export PROJECT_NUMBER="$(gcloud projects list \ 14 | --filter=$(gcloud config get-value project) \ 15 | --format='value(PROJECT_NUMBER)')" 16 | 17 | 18 | gcloud projects add-iam-policy-binding $(gcloud config get-value project) \ 19 | --member=serviceAccount:${PROJECT_NUMBER}-compute@developer.gserviceaccount.com \ 20 | --role='roles/eventarc.admin' 21 | 22 | 23 | # Task 3 24 | 25 | gcloud eventarc providers list 26 | 27 | gcloud eventarc providers describe \ 28 | pubsub.googleapis.com 29 | 30 | 31 | # Task 4 32 | 33 | 34 | export SERVICE_NAME=event-display 35 | 36 | export IMAGE_NAME="gcr.io/cloudrun/hello" 37 | 38 | gcloud run deploy ${SERVICE_NAME} \ 39 | --image ${IMAGE_NAME} \ 40 | --allow-unauthenticated \ 41 | --max-instances=3 42 | 43 | 44 | # Task 5 45 | 46 | gcloud eventarc providers describe \ 47 | pubsub.googleapis.com 48 | 49 | gcloud eventarc triggers create trigger-pubsub \ 50 | --destination-run-service=${SERVICE_NAME} \ 51 | --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" 52 | 53 | export TOPIC_ID=$(gcloud eventarc triggers describe trigger-pubsub \ 54 | --format='value(transport.pubsub.topic)') 55 | 56 | echo ${TOPIC_ID} 57 | 58 | gcloud eventarc triggers list 59 | 60 | gcloud pubsub topics publish ${TOPIC_ID} --message="Hello there" 61 | 62 | 63 | 64 | export BUCKET_NAME=$(gcloud config get-value project)-cr-bucket 65 | 66 | 67 | gsutil mb -p $(gcloud config get-value project) \ 68 | -l $(gcloud config get-value run/region) \ 69 | gs://${BUCKET_NAME}/ 70 | 71 | 72 | echo " Click this 
link to open Audit Logs: [https://console.cloud.google.com/iam-admin/audit?referrer=search&cloudshell=true&project=$DEVSHELL_PROJECT_ID]" 73 | -------------------------------------------------------------------------------- /Eventarc for Cloud Run/techcps2.sh: -------------------------------------------------------------------------------- 1 | 2 | export PROJECT_NUMBER="$(gcloud projects list \ 3 | --filter=$(gcloud config get-value project) \ 4 | --format='value(PROJECT_NUMBER)')" 5 | 6 | 7 | export SERVICE_NAME=event-display 8 | 9 | export IMAGE_NAME="gcr.io/cloudrun/hello" 10 | 11 | export BUCKET_NAME=$(gcloud config get-value project)-cr-bucket 12 | 13 | 14 | gcloud eventarc triggers delete trigger-pubsub 15 | 16 | 17 | echo "Hello World" > random.txt 18 | 19 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 20 | 21 | gcloud eventarc providers describe cloudaudit.googleapis.com 22 | 23 | 24 | gcloud eventarc triggers create trigger-auditlog \ 25 | --destination-run-service=${SERVICE_NAME} \ 26 | --event-filters="type=google.cloud.audit.log.v1.written" \ 27 | --event-filters="serviceName=storage.googleapis.com" \ 28 | --event-filters="methodName=storage.objects.create" \ 29 | --service-account=${PROJECT_NUMBER}-compute@developer.gserviceaccount.com 30 | 31 | 32 | gcloud eventarc triggers list 33 | 34 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 35 | 36 | sleep 5 37 | 38 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 39 | 40 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 41 | 42 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 43 | sleep 5 44 | gsutil cp random.txt gs://${BUCKET_NAME}/random.txt 45 | 46 | -------------------------------------------------------------------------------- /Explore and Evaluate Models using Model Garden/techcps1166.sh: -------------------------------------------------------------------------------- 1 | 2 | export PROJECT_ID=$(gcloud config get-value project) 3 | 4 | #!/bin/bash 5 | 6 | BLUE='\033[0;32m' 7 | GOLD='\033[0;34m' 8 | NC='\033[0m' 9 | 10 | echo -e "${BLUE}Open text-bison:${NC}" 11 | python -c "print(' https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/text-bison?project=$PROJECT_ID')" 12 | echo "CP, link 1" 13 | 14 | echo "subscribe to techcps" 15 | 16 | echo -e "${GOLD}Open owlvit-base-patch32:${NC}" 17 | python -c "print(' https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/owlvit-base-patch32?project=$PROJECT_ID')" 18 | echo "CP, link 2" 19 | 20 | echo "subscribe to techcps, https://www.youtube.com/@techcps" 21 | 22 | echo -e "${BLUE}Open bert-base:${NC}" 23 | python -c "print(' https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/bert-base?project=$PROJECT_ID')" 24 | echo "CP, link 3" 25 | 26 | -------------------------------------------------------------------------------- /Exploring IAM/sample.txt: -------------------------------------------------------------------------------- 1 | Please like share & subscribe to Techcps (https://www.youtube.com/@techcps) 2 | -------------------------------------------------------------------------------- /Exploring IAM/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 5 | 6 | gsutil mb -l us gs://$PROJECT_ID 7 | 8 | curl -LO https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/main/Exploring%20IAM/sample.txt 9 | 10 | gsutil cp sample.txt gs://$PROJECT_ID 11 | 12 | gcloud
projects remove-iam-policy-binding $PROJECT_ID --member=user:$USERNAME_2 --role=roles/viewer 13 | 14 | gcloud projects add-iam-policy-binding $PROJECT_ID --member=user:$USERNAME_2 --role=roles/storage.objectViewer 15 | 16 | gcloud iam service-accounts create read-bucket-objects --description="please like share & subscribe to techcps" --display-name="read-bucket-objects" 17 | 18 | gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:read-bucket-objects@$PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.objectViewer" 19 | 20 | gcloud iam service-accounts add-iam-policy-binding read-bucket-objects@$PROJECT_ID.iam.gserviceaccount.com --member=domain:altostrat.com --role=roles/iam.serviceAccountUser 21 | 22 | gcloud projects add-iam-policy-binding $PROJECT_ID --member=domain:altostrat.com --role=roles/compute.instanceAdmin.v1 23 | 24 | gcloud compute instances create demoiam --zone=$ZONE --machine-type e2-micro --image-family debian-11 --image-project debian-cloud --service-account read-bucket-objects@$PROJECT_ID.iam.gserviceaccount.com 25 | 26 | -------------------------------------------------------------------------------- /GKE Workload Optimization/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | gcloud config list project 4 | gcloud config set compute/zone $ZONE 5 | gcloud container clusters create test-cluster --num-nodes=3 --enable-ip-alias 6 | 7 | cat << EOF > gb_frontend_pod.yaml 8 | apiVersion: v1 9 | kind: Pod 10 | metadata: 11 | labels: 12 | app: gb-frontend 13 | name: gb-frontend 14 | spec: 15 | containers: 16 | - name: gb-frontend 17 | image: gcr.io/google-samples/gb-frontend-amd64:v5 18 | resources: 19 | requests: 20 | cpu: 100m 21 | memory: 256Mi 22 | ports: 23 | - containerPort: 80 24 | EOF 25 | 26 | kubectl apply -f gb_frontend_pod.yaml 27 | 28 | cat << EOF > gb_frontend_cluster_ip.yaml 29 | apiVersion: v1 30 | kind: Service 31 | metadata: 32 | name: gb-frontend-svc 33 | annotations: 34 | cloud.google.com/neg: '{"ingress": true}' 35 | spec: 36 | type: ClusterIP 37 | selector: 38 | app: gb-frontend 39 | ports: 40 | - port: 80 41 | protocol: TCP 42 | targetPort: 80 43 | EOF 44 | 45 | kubectl apply -f gb_frontend_cluster_ip.yaml 46 | cat << EOF > gb_frontend_ingress.yaml 47 | apiVersion: networking.k8s.io/v1 48 | kind: Ingress 49 | metadata: 50 | name: gb-frontend-ingress 51 | spec: 52 | defaultBackend: 53 | service: 54 | name: gb-frontend-svc 55 | port: 56 | number: 80 57 | EOF 58 | 59 | kubectl apply -f gb_frontend_ingress.yaml 60 | sleep 417 61 | BACKEND_SERVICE=$(gcloud compute backend-services list | grep NAME | cut -d ' ' -f2) 62 | gcloud compute backend-services get-health $BACKEND_SERVICE --global 63 | BACKEND_SERVICE=$(gcloud compute backend-services list | grep NAME | cut -d ' ' -f2) 64 | gcloud compute backend-services get-health $BACKEND_SERVICE --global 65 | kubectl get ingress gb-frontend-ingress 66 | -------------------------------------------------------------------------------- /GKE Workload Optimization/techcpsgsp769.md: -------------------------------------------------------------------------------- 1 | 2 | # GKE Workload Optimization [GSP769] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 5 | 6 | ``` 7 | export ZONE= 8 | ``` 9 | ``` 10 | curl -LO 
raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/GKE%20Workload%20Optimization/techcps1.sh 11 | sudo chmod +x techcps1.sh 12 | ./techcps1.sh 13 | ``` 14 | # NOTE: Check the progress on Task 1 & do not run the next commands until you get a green tick 15 | 16 | ``` 17 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/GKE%20Workload%20Optimization/techcps2.sh 18 | sudo chmod +x techcps2.sh 19 | ./techcps2.sh 20 | 21 | ``` 22 | 23 | ## Congratulations, you're all done with the lab 😄 24 | 25 | # Thanks for watching :) 26 | -------------------------------------------------------------------------------- /Get Started with Vertex AI Studio/techcps2.json: -------------------------------------------------------------------------------- 1 | { 2 | "title": "Untitled prompt", 3 | "description": "", 4 | "parameters": { 5 | "groundingPromptConfig": { 6 | "disabled": true, 7 | "groundingConfig": { 8 | "sources": [ 9 | { 10 | "type": "VERTEX_AI_SEARCH" 11 | } 12 | ] 13 | } 14 | }, 15 | "stopSequences": [], 16 | "temperature": 0.9, 17 | "tokenLimits": 2048, 18 | "topP": 1 19 | }, 20 | "type": "structured", 21 | "context": "", 22 | "inputPrefixes": [ 23 | "" 24 | ], 25 | "outputPrefixes": [ 26 | "" 27 | ], 28 | "examples": [ 29 | { 30 | "inputs": [ 31 | "A well-made and entertaining film" 32 | ], 33 | "outputs": [ 34 | "positive" 35 | ] 36 | }, 37 | { 38 | "inputs": [ 39 | "I fell asleep after 10 minutes\t" 40 | ], 41 | "outputs": [ 42 | "negative" 43 | ] 44 | }, 45 | { 46 | "inputs": [ 47 | "The movie was ok\t" 48 | ], 49 | "outputs": [ 50 | "neutral" 51 | ] 52 | } 53 | ], 54 | "testData": [ 55 | { 56 | "inputs": [ 57 | "" 58 | ] 59 | } 60 | ], 61 | "model": "gemini-1.0-pro-001" 62 | } -------------------------------------------------------------------------------- /Get Started with Vertex AI Studio/techcps3.json: -------------------------------------------------------------------------------- 1 | { 2 | "title": "Untitled prompt", 3 | "description": "", 4 | "parameters": { 5 | "candidateCount": 1, 6 | "groundingPromptConfig": { 7 | "disabled": true, 8 | "groundingConfig": { 9 | "sources": [ 10 | { 11 | "type": "VERTEX_AI_SEARCH" 12 | } 13 | ] 14 | } 15 | }, 16 | "stopSequences": [], 17 | "temperature": 0.9, 18 | "tokenLimits": 1024, 19 | "topP": 1 20 | }, 21 | "type": "chat", 22 | "context": { 23 | "parts": [ 24 | { 25 | "text": "Your name is Roy.\nYou are a support technician of an IT department.\nYou only respond with \"Have you tried turning it off and on again?\" to any queries." 26 | } 27 | ] 28 | }, 29 | "examples": [], 30 | "messages": [ 31 | { 32 | "author": "user", 33 | "content": { 34 | "parts": [ 35 | { 36 | "text": "My computer is so slow" 37 | } 38 | ] 39 | } 40 | }, 41 | { 42 | "content": { 43 | "parts": [ 44 | { 45 | "text": " Have you tried turning it off and on again?"
46 | } 47 | ] 48 | }, 49 | "author": "bot", 50 | "citationMetadata": { 51 | "citations": [] 52 | } 53 | } 54 | ], 55 | "model": "chat-bison" 56 | } -------------------------------------------------------------------------------- /Getting Started with Cloud Shell and gcloud/techcpsgsp002.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud compute instances create gcelab2 --project=$DEVSHELL_PROJECT_ID --zone $ZONE --machine-type e2-medium 3 | -------------------------------------------------------------------------------- /Getting Started with Security Command Center /techcps.sh: -------------------------------------------------------------------------------- 1 | gcloud auth list 2 | gcloud services enable securitycenter.googleapis.com --project=$DEVSHELL_PROJECT_ID 3 | sleep 17 4 | gcloud scc muteconfigs create muting-pga-findings --project=$DEVSHELL_PROJECT_ID --description="Mute rule for VPC Flow Logs" --filter="category=\"FLOW_LOGS_DISABLED\"" 5 | gcloud compute networks create scc-lab-net --subnet-mode=auto 6 | gcloud compute firewall-rules update default-allow-rdp --source-ranges=35.235.240.0/20 7 | gcloud compute firewall-rules update default-allow-ssh --source-ranges=35.235.240.0/20 8 | -------------------------------------------------------------------------------- /Getting Started with Security Command Center /techcpsgsp1124.md: -------------------------------------------------------------------------------- 1 | 2 | # Getting Started with Security Command Center [GSP1124] 3 | 4 | # If the video helped you complete your lab, please like and subscribe. https://www.youtube.com/@techcps 5 | 6 | * In the GCP Console open the Cloud Shell and run the following commands: 7 | 8 | ``` 9 | gcloud auth list 10 | gcloud config list project 11 | # Enable the Google Cloud Security Command Center service 12 | gcloud services enable securitycenter.googleapis.com 13 | sleep 17 14 | gcloud scc muteconfigs create muting-pga-findings \ 15 | --project=$DEVSHELL_PROJECT_ID \ 16 | --description="Mute rule for VPC Flow Logs" \ 17 | --filter="category=\"FLOW_LOGS_DISABLED\"" 18 | gcloud compute networks create scc-lab-net --subnet-mode=auto 19 | gcloud compute firewall-rules update default-allow-rdp --source-ranges=35.235.240.0/20 20 | gcloud compute firewall-rules update default-allow-ssh --source-ranges=35.235.240.0/20 21 | ``` 22 | 23 | # Congratulations, you're all done with the lab 😄 24 | 25 | # Thanks for watching :) 26 | -------------------------------------------------------------------------------- /Getting Started with VPC Networking and Google Compute Engine/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | export REGION_1=${ZONE_1%-*} 3 | 4 | export REGION_2=${ZONE_2%-*} 5 | 6 | # Delete all the default network firewall rules 7 | gcloud compute firewall-rules delete default-allow-icmp --quiet 8 | gcloud compute firewall-rules delete default-allow-rdp --quiet 9 | gcloud compute firewall-rules delete default-allow-ssh --quiet 10 | gcloud compute firewall-rules delete default-allow-internal --quiet 11 | 12 | # Delete the default VPC network 13 | gcloud compute networks delete default --quiet 14 | 15 | # Create an auto mode VPC network with firewall rules 16 | gcloud compute networks create mynetwork --project=$DEVSHELL_PROJECT_ID --subnet-mode auto 17 | gcloud compute firewall-rules create allow-all --network mynetwork --project=$DEVSHELL_PROJECT_ID --allow all 18 | 19 | # 
Create a VM instance 20 | gcloud compute instances create mynet-us-vm --zone=$ZONE_1 --project=$DEVSHELL_PROJECT_ID --machine-type=e2-micro --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=mynetwork --create-disk=auto-delete=yes,boot=yes,device-name=mynet-us-vm,image=projects/debian-cloud/global/images/debian-11-bullseye-v20231010,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE_1/diskTypes/pd-balanced 21 | 22 | gcloud compute instances create mynet-eu-vm --zone=$ZONE_2 --project=$DEVSHELL_PROJECT_ID --machine-type=e2-micro --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=mynetwork --create-disk=auto-delete=yes,boot=yes,device-name=mynet-eu-vm,image=projects/debian-cloud/global/images/debian-11-bullseye-v20231010,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE_2/diskTypes/pd-balanced 23 | 24 | -------------------------------------------------------------------------------- /Getting started with Flutter Development/techcps885.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | ## Press Ctrl+Shift+V to run the command 6 | ## Allow the pop-up message 7 | 8 | ``` 9 | flutter create startup_namer 10 | 11 | cd startup_namer 12 | 13 | sed -i '34c\ home: const MyHomePage(title: "Flutter is awesome!"),' lib/main.dart 14 | 15 | fwr 16 | ``` 17 | 18 | ## Congratulations, you're all done with the lab 😄 19 | 20 | # Thanks for watching :) 21 | -------------------------------------------------------------------------------- /Google AppSheet Getting Started/Techcps.xlsx: https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Google AppSheet Getting Started/Techcps.xlsx -------------------------------------------------------------------------------- /Google Cloud Packet Mirroring with OpenSource IDS/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | sudo apt-get update -y 3 | 4 | sudo apt-get install libpcre3-dbg libpcre3-dev autoconf automake libtool libpcap-dev libnet1-dev libyaml-dev zlib1g-dev libcap-ng-dev libmagic-dev libjansson-dev libjansson4 -y 5 | 6 | sudo apt-get install libnspr4-dev -y 7 | 8 | sudo apt-get install libnss3-dev -y 9 | 10 | sudo apt-get install liblz4-dev -y 11 | 12 | sudo apt install rustc cargo -y 13 | 14 | sudo add-apt-repository ppa:oisf/suricata-stable -y 15 | 16 | sudo apt-get update -y 17 | 18 | sudo apt-get install suricata -y 19 | 20 | suricata -V 21 | 22 | sudo systemctl stop suricata 23 | 24 | sudo cp /etc/suricata/suricata.yaml /etc/suricata/suricata.backup 25 | 26 | wget https://storage.googleapis.com/tech-academy-enablement/GCP-Packet-Mirroring-with-OpenSource-IDS/suricata.yaml 27 | 28 | wget https://storage.googleapis.com/tech-academy-enablement/GCP-Packet-Mirroring-with-OpenSource-IDS/my.rules 29 | 30 | sudo mkdir /etc/suricata/poc-rules 31 | 32 | sudo cp my.rules /etc/suricata/poc-rules/my.rules 33 | 34 | sudo cp suricata.yaml /etc/suricata/suricata.yaml 35 | 36 | sudo systemctl start suricata 37 | 38 | sudo systemctl restart suricata 39 | 40 | cat /etc/suricata/poc-rules/my.rules 41 | 42 | -------------------------------------------------------------------------------- /Google Cloud PubSub Qwik Start - 
Command Line GSP095.md: -------------------------------------------------------------------------------- 1 | 2 | # Google Cloud Pub/Sub: Qwik Start - Command Line [GSP095] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | 7 | ``` 8 | gcloud auth list 9 | gcloud config list project 10 | gcloud pubsub topics create myTopic 11 | gcloud pubsub subscriptions create --topic myTopic mySubscription 12 | ``` 13 | 14 | ## Congratulations, you're all done with the lab 😄 15 | 16 | # Thanks for watching :) -------------------------------------------------------------------------------- /Google Cloud SDK Qwik Start - RedhatCentos/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud compute instances create techcps \ 3 | --project=$DEVSHELL_PROJECT_ID \ 4 | --zone=$ZONE --machine-type=e2-medium \ 5 | --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default \ 6 | --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD \ 7 | --scopes=https://www.googleapis.com/auth/devstorage.read_only,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write,https://www.googleapis.com/auth/servicecontrol,https://www.googleapis.com/auth/service.management.readonly,https://www.googleapis.com/auth/trace.append --tags=http-server --create-disk=auto-delete=yes,boot=yes,device-name=techcps,image=projects/centos-cloud/global/images/centos-7-v20231010,mode=rw,size=20,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 8 | 9 | 10 | gcloud compute ssh techcps --zone=$ZONE --quiet --command "# Update YUM with Cloud SDK repo information: 11 | sudo tee -a /etc/yum.repos.d/google-cloud-sdk.repo << EOM 12 | [google-cloud-sdk] 13 | name=Google Cloud SDK 14 | baseurl=https://packages.cloud.google.com/yum/repos/cloud-sdk-el7-x86_64 15 | enabled=1 16 | gpgcheck=1 17 | repo_gpgcheck=0 18 | gpgkey=https://packages.cloud.google.com/yum/doc/yum-key.gpg 19 | https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg 20 | EOM 21 | 22 | # The indentation for the 2nd line of gpgkey is important. 
23 | 24 | # Install the Cloud SDK 25 | sudo yum install google-cloud-sdk -y && gcloud init --console-only" 26 | 27 | -------------------------------------------------------------------------------- /Google Cloud SDK Qwik Start - RedhatCentos/techcps122.md: -------------------------------------------------------------------------------- 1 | 2 | # Google Cloud SDK: Qwik Start - Redhat/Centos [GSP122] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 5 | 6 | * In the GCP Console open the Cloud Shell and enter the following commands: 7 | 8 | ``` 9 | export ZONE= 10 | ``` 11 | ``` 12 | gcloud compute instances create techcps \ 13 | --project=$DEVSHELL_PROJECT_ID \ 14 | --zone=$ZONE --machine-type=e2-medium \ 15 | --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default \ 16 | --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD \ 17 | --scopes=https://www.googleapis.com/auth/devstorage.read_only,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write,https://www.googleapis.com/auth/servicecontrol,https://www.googleapis.com/auth/service.management.readonly,https://www.googleapis.com/auth/trace.append --tags=http-server --create-disk=auto-delete=yes,boot=yes,device-name=techcps,image=projects/centos-cloud/global/images/centos-7-v20231010,mode=rw,size=20,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 18 | ``` 19 | 20 | ## Perform task 2 & 3 using lab instruction 21 | 22 | # Congratulations, you're all done with the lab 😄 23 | 24 | # Thanks for watching :) 25 | -------------------------------------------------------------------------------- /Google Cloud Speech-to-Text API: Qwik Start/techcpsgsp119.sh: -------------------------------------------------------------------------------- 1 | 2 | cat > request.json < result.json 19 | -------------------------------------------------------------------------------- /Google Sheets Getting Started/exported-data.csv: -------------------------------------------------------------------------------- 1 | Timestamp,Department,Desired range ,Actual amount (ex: 34000),,"Ask the following questions in ""Explore""" 2 | 1/7/2019 13:07:51,Agriculture,"less than $25,000",10500,,department with the highest actual amount 3 | 1/7/2019 13:08:44,Board of equalization,"$25,001 - $50,000",25600,,how many unique departments 4 | 1/7/2019 13:09:02,Childrens services,"$50,001 - $75,000",56000,, 5 | 1/7/2019 13:09:22,Commerce and Insurance,"less than $25,000",11000,, 6 | 1/7/2019 13:09:36,Commission on Aging and Disability,"$75,001 - $100,000",199000,, 7 | 1/7/2019 13:09:57,Commission on Children and Youth,"$100,00 - $250,000",240000,, 8 | 1/7/2019 13:10:15,Economic and Community Development,"$75,001 - $100,000",98000,, 9 | 1/7/2019 13:10:32,Environment and Conservation,"$100,00 - $250,000",129000,, 10 | 1/7/2019 13:10:49,Housing Development Agency,"$250,001 - $500,000",280000,, 11 | 1/7/2019 13:11:04,Human Rights Commission,"$25,001 - $50,000",47000,, 12 | 1/7/2019 13:11:18,Mental Health and Substance Abuse Services,"$100,00 - $250,000",118900,, 13 | 1/7/2019 13:14:20,Economic and Community Development,"less than $25,000",17000,, 14 | 1/7/2019 16:11:01,Environment and Conservation,"less than $25,000",20000,, 15 | 
1/9/2019 0:23:30,Environment and Conservation,"$250,001 - $500,000",348998,, -------------------------------------------------------------------------------- /Google Sheets Getting Started/important-data.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Google Sheets Getting Started/important-data.xlsx -------------------------------------------------------------------------------- /Google Sheets Getting Started/techcps469.md: -------------------------------------------------------------------------------- 1 | 2 | ## 💡 Lab Link: [Google Sheets: Getting Started - GSP49](https://www.cloudskillsboost.google/focuses/5828?parent=catalog) 3 | 4 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/-abpaniEL4Y) 5 | 6 | --- 7 | 8 | ## 🚀 Open a sample spreadsheet [Explore this data](https://docs.google.com/spreadsheets/d/19iLO-XbrqqWRuqphkXTax0lFn71NW6crJK504JvAxoU/edit#gid=599358521) 9 | 10 | --- 11 | 12 | ## 🚨 Download the below 2 files: 13 | 14 | - 🚀 **Exported Data File [Click here!](https://github.com/Techcps/GSP-Short-Trick/blob/main/Google%20Sheets%20Getting%20Started/exported-data.csv)** 15 | 16 | - 🚀 **Important Data [Click here!](https://github.com/Techcps/GSP-Short-Trick/blob/main/Google%20Sheets%20Getting%20Started/important-data.xlsx)** 17 | 18 | --- 19 | 20 | ## 🚀 Open Google Drive in a new browser tab [Click here](https://drive.google.com/) 21 | 22 | ### Congratulations, you're all done with the lab 😄 23 | 24 | --- 25 | 26 | ### 🌐 Join our Community 27 | 28 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 29 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 30 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 31 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 32 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 33 | 34 | ### Thanks for watching :) 35 | -------------------------------------------------------------------------------- /Google Workspace for Education: Getting Started/techcps978.csv: -------------------------------------------------------------------------------- 1 | First Name [Required],Last Name [Required],Email Address [Required],Password [Required],Password Hash Function [UPLOAD ONLY],Org Unit Path [Required],New Primary Email [UPLOAD ONLY],Recovery Email,Home Secondary Email,Work Secondary Email,Recovery Phone [MUST BE IN THE E.164 FORMAT],Work Phone,Home Phone,Mobile Phone,Work Address,Home Address,Employee ID,Employee Type,Employee Title,Manager Email,Department,Cost Center,Building ID,Floor Name,Floor Section,Change Password at Next Sign-In,New Status [UPLOAD ONLY],Advanced Protection Program enrollment 2 | rohit,sharma,rohit@goog-test.reseller.gappslabs.co.s-4l199ep2.qwiklabs-gsuite.net,password123,,/Teachers,,,,,,,,,,,,,,,,,,,,,, 3 | virat,kohli,virat@goog-test.reseller.gappslabs.co.s-4l199ep2.qwiklabs-gsuite.net,password124,,/Teachers,,,,,,,,,,,,,,,,,,,,,, 4 | melie,kerr,melie@goog-test.reseller.gappslabs.co.s-4l199ep2.qwiklabs-gsuite.net,password125,,/Teachers,,,,,,,,,,,,,,,,,,,,,, 5 | -------------------------------------------------------------------------------- /Hello Node Kubernetes/techcpsgsp005.sh: -------------------------------------------------------------------------------- 1 | 2 | export PROJECT_ID=$(gcloud config get-value project) 3 | 4 | cat > server.js < Dockerfile 
</etc/supervisor/conf.d/node-app.conf << EOF 32 | [program:nodeapp] 33 | directory=/fancy-store 34 | command=npm start 35 | autostart=true 36 | autorestart=true 37 | user=nodeapp 38 | environment=HOME="/home/nodeapp",USER="nodeapp",NODE_ENV="production" 39 | stdout_logfile=syslog 40 | stderr_logfile=syslog 41 | EOF 42 | 43 | supervisorctl reread 44 | supervisorctl update 45 | -------------------------------------------------------------------------------- /How to Use a Network Policy on Google Kubernetes Engine/techcps480.sh: -------------------------------------------------------------------------------- 1 | 2 | # Set text styles 3 | YELLOW=$(tput setaf 3) 4 | BOLD=$(tput bold) 5 | RESET=$(tput sgr0) 6 | 7 | echo "Please set the below values correctly" 8 | read -p "${YELLOW}${BOLD}Enter the ZONE: ${RESET}" ZONE 9 | 10 | gcloud config set compute/zone $ZONE 11 | 12 | export REGION="${ZONE%-*}" 13 | 14 | gcloud config set compute/region $REGION 15 | 16 | gsutil cp -r gs://spls/gsp480/gke-network-policy-demo . 17 | 18 | 19 | cd gke-network-policy-demo 20 | 21 | chmod -R 755 * 22 | 23 | 24 | echo "y" | make setup-project 25 | 26 | echo "yes" | make tf-apply 27 | 28 | echo "export ZONE=$ZONE" > cp.sh 29 | 30 | source cp.sh 31 | 32 | cat > techcps.sh <<'EOF_CP' 33 | 34 | source /tmp/cp.sh 35 | 36 | sudo apt-get install google-cloud-sdk-gke-gcloud-auth-plugin -y 37 | 38 | echo "export USE_GKE_GCLOUD_AUTH_PLUGIN=True" >> ~/.bashrc 39 | 40 | source ~/.bashrc 41 | 42 | gcloud container clusters get-credentials gke-demo-cluster --zone $ZONE 43 | 44 | kubectl apply -f ./manifests/hello-app/ 45 | 46 | kubectl apply -f ./manifests/network-policy.yaml 47 | 48 | kubectl delete -f ./manifests/network-policy.yaml 49 | 50 | kubectl create -f ./manifests/network-policy-namespaced.yaml 51 | 52 | kubectl -n hello-apps apply -f ./manifests/hello-app/hello-client.yaml 53 | 54 | EOF_CP 55 | 56 | 57 | gcloud compute scp cp.sh gke-demo-bastion:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 58 | 59 | gcloud compute scp techcps.sh gke-demo-bastion:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 60 | 61 | gcloud compute ssh gke-demo-bastion --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/techcps.sh" 62 | 63 | -------------------------------------------------------------------------------- /IAM Custom Roles/techcps190.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud config set compute/region $REGION 5 | 6 | # gcloud iam list-testable-permissions //cloudresourcemanager.googleapis.com/projects/$DEVSHELL_PROJECT_ID 7 | 8 | 9 | cat > role-definition.yaml < new-role-definition.yaml <": ">", 10 | "<": "<", 11 | } 12 | 13 | @app.route('/', methods=["GET", "POST"]) 14 | def input(): 15 | global input_string 16 | if flask.request.method == "GET": 17 | return flask.render_template("input.html") 18 | else: 19 | input_string = flask.request.form.get("input") 20 | return flask.redirect("output") 21 | 22 | 23 | @app.route('/output') 24 | def output(): 25 | output_string = "".join([html_escape_table.get(c, c) for c in input_string]) 26 | # output_string = input_string 27 | return flask.render_template("output.html", output=output_string) 28 | 29 | if __name__ == '__main__': 30 | app.run(host='0.0.0.0', port=8080) 31 | -------------------------------------------------------------------------------- /Identify Application Vulnerabilities with Security Command Center/techcps.sh: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | export REGION="${ZONE%-*}" 5 | 6 | gcloud services enable websecurityscanner.googleapis.com 7 | 8 | gcloud compute addresses create xss-test-ip-address --region=$REGION 9 | 10 | gcloud compute addresses describe xss-test-ip-address \ 11 | --region=$REGION --format="value(address)" 12 | 13 | gcloud compute instances create xss-test-vm-instance \ 14 | --address=xss-test-ip-address --no-service-account \ 15 | --no-scopes --machine-type=e2-micro --zone=$ZONE \ 16 | --metadata=startup-script='apt-get update; apt-get install -y python3-flask' 17 | 18 | gcloud compute firewall-rules create enable-wss-scan \ 19 | --direction=INGRESS --priority=1000 \ 20 | --network=default --action=ALLOW \ 21 | --rules=tcp:8080 --source-ranges=0.0.0.0/0 22 | 23 | sleep 15 24 | 25 | CP_IP=$(gcloud compute instances describe xss-test-vm-instance --zone=$ZONE --project=$DEVSHELL_PROJECT_ID --format="get(networkInterfaces[0].accessConfigs[0].natIP)") 26 | 27 | gcloud alpha web-security-scanner scan-configs create --display-name=techcps --starting-urls=http://$CP_IP:8080 28 | 29 | 30 | SCAN_CONFIG=$(gcloud alpha web-security-scanner scan-configs list --project=$DEVSHELL_PROJECT_ID --format="value(name)") 31 | 32 | gcloud alpha web-security-scanner scan-runs start $SCAN_CONFIG 33 | 34 | sleep 5 35 | 36 | 37 | gcloud compute ssh xss-test-vm-instance --zone $ZONE --project=$DEVSHELL_PROJECT_ID --quiet --command "gsutil cp gs://cloud-training/GCPSEC-ScannerAppEngine/flask_code.tar . && tar xvf flask_code.tar && python3 app.py" 38 | 39 | 40 | 41 | 42 | -------------------------------------------------------------------------------- /Identify Application Vulnerabilities with Security Command Center/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | cat > app.py <": ">", 13 | "<": "<", 14 | } 15 | 16 | @app.route('/', methods=["GET", "POST"]) 17 | def input(): 18 | global input_string 19 | if flask.request.method == "GET": 20 | return flask.render_template("input.html") 21 | else: 22 | input_string = flask.request.form.get("input") 23 | return flask.redirect("output") 24 | 25 | 26 | @app.route('/output') 27 | def output(): 28 | output_string = "".join([html_escape_table.get(c, c) for c in input_string]) 29 | # output_string = input_string 30 | return flask.render_template("output.html", output=output_string) 31 | 32 | if __name__ == '__main__': 33 | app.run(host='0.0.0.0', port=8080) 34 | EOF_CP 35 | 36 | 37 | python3 app.py 38 | 39 | 40 | -------------------------------------------------------------------------------- /Identify Damaged Car Parts with Vertex AutoML Vision/techcps972.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 4 | export BUCKET=$PROJECT_ID 5 | 6 | gsutil mb -p $PROJECT_ID \ 7 | -c standard \ 8 | -l us-central1 \ 9 | gs://${BUCKET} 10 | 11 | 12 | gsutil -m cp -r gs://car_damage_lab_images/* gs://${BUCKET} 13 | 14 | gsutil cp gs://car_damage_lab_metadata/data.csv . 
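# Hedged sanity check (added; not part of the original lab script): preview the
# downloaded CSV so you can confirm the gs://car_damage_lab_images paths before
# the sed command below rewrites them to point at your own bucket.
head -n 5 ./data.csv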
15 | 16 | sed -i -e "s/car_damage_lab_images/${BUCKET}/g" ./data.csv 17 | 18 | 19 | gsutil cp ./data.csv gs://${BUCKET} 20 | 21 | sleep 30 22 | 23 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/main/Identify%20Damaged%20Car%20Parts%20with%20Vertex%20AutoML%20Vision/payload.json 24 | 25 | AUTOML_PROXY=$(gcloud run services describe automl-proxy --region=us-central1 --format="value(status.url)") 26 | 27 | INPUT_DATA_FILE=payload.json 28 | 29 | 30 | curl -X POST -H "Content-Type: application/json" $AUTOML_PROXY/v1 -d "@${INPUT_DATA_FILE}" 31 | 32 | 33 | -------------------------------------------------------------------------------- /Implement Private Google Access and Cloud NAT/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION=${ZONE%-*} 4 | 5 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 6 | 7 | gcloud compute networks create privatenet --project=$PROJECT_ID --subnet-mode custom 8 | 9 | gcloud compute networks subnets create privatenet-us --project=$PROJECT_ID --network=privatenet --region=$REGION --range=10.130.0.0/20 10 | 11 | gcloud compute firewall-rules create privatenet-allow-ssh --project=$PROJECT_ID --network=privatenet --action=ALLOW --rules=tcp:22 --source-ranges=35.235.240.0/20 12 | 13 | gcloud compute instances create vm-internal --zone=$ZONE --project=$PROJECT_ID --machine-type=e2-medium --image-project=debian-cloud --image-family=debian-11 --boot-disk-size=10GB --boot-disk-type=pd-balanced --boot-disk-device-name=vm-internal --create-disk=mode=rw,size=10GB,type=pd-standard --network=privatenet --subnet=privatenet-us --no-address 14 | 15 | gsutil mb gs://$PROJECT_ID-techcps 16 | 17 | gcloud storage cp gs://cloud-training/gcpnet/private/access.svg gs://$PROJECT_ID-techcps 18 | 19 | gcloud storage cp gs://$PROJECT_ID-techcps/*.svg . 
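# Hedged verification step (added; not part of the original lab script): confirm
# that access.svg landed in both the new bucket and the local working directory.
gsutil ls gs://$PROJECT_ID-techcps
ls -l *.svg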
20 | 21 | gcloud compute networks subnets update privatenet-us --region=$REGION --enable-private-ip-google-access 22 | 23 | gcloud compute routers create nat-router --network=privatenet --region=$REGION 24 | 25 | gcloud compute routers get-status nat-router --region=$REGION 26 | 27 | -------------------------------------------------------------------------------- /Importing Data to a Firestore Database/createTestData.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const faker = require('faker'); 3 | const { Logging } = require("@google-cloud/logging"); 4 | 5 | function getRandomCustomerEmail(firstName, lastName) { 6 | const provider = faker.internet.domainName(); 7 | const email = faker.internet.email(firstName, lastName, provider); 8 | return email.toLowerCase(); 9 | } 10 | 11 | // Logging constants and client initialization 12 | const logName = "pet-theory-logs-createTestData"; 13 | const logging = new Logging(); 14 | const log = logging.log(logName); 15 | const resource = { 16 | type: "global", 17 | }; 18 | 19 | async function createTestData(recordCount) { 20 | const fileName = `customers_${recordCount}.csv`; 21 | var f = fs.createWriteStream(fileName); 22 | f.write('id,name,email,phone\n') 23 | for (let i=0; i { 21 | console.log(`Write: ${record}`); 22 | const docRef = db.collection("customers").doc(record.email); 23 | batch.set(docRef, record, { merge: true }); 24 | }); 25 | 26 | batch.commit() 27 | .then(() => { 28 | console.log('Batch executed'); 29 | }) 30 | .catch(err => { 31 | console.log(`Batch error: ${err}`); 32 | }); 33 | } 34 | 35 | async function importCsv(csvFilename) { 36 | const parser = csv.parse({ columns: true, delimiter: ',' }, async function (err, records) { 37 | if (err) { 38 | console.error('Error parsing CSV:', err); 39 | return; 40 | } 41 | try { 42 | console.log(`Call write to Firestore`); 43 | await writeToFirestore(records); 44 | const success_message = `Success: importTestData - Wrote ${records.length} records`; 45 | const entry = log.entry({ resource: resource }, { message: `${success_message}` }); 46 | log.write([entry]); 47 | console.log(`Wrote ${records.length} records`); 48 | } catch (e) { 49 | console.error(e); 50 | process.exit(1); 51 | } 52 | }); 53 | 54 | await fs.createReadStream(csvFilename).pipe(parser); 55 | } 56 | 57 | if (process.argv.length < 3) { 58 | console.error('Please include a path to a csv file'); 59 | process.exit(1); 60 | } 61 | 62 | importCsv(process.argv[2]).catch(e => console.error(e)); 63 | -------------------------------------------------------------------------------- /Importing Data to a Firestore Database/techcpsgsp642.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | gcloud config list project 4 | 5 | git clone https://github.com/rosera/pet-theory 6 | cd pet-theory/lab01 7 | npm install @google-cloud/firestore 8 | npm install @google-cloud/logging 9 | 10 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Importing%20Data%20to%20a%20Firestore%20Database/importTestData.js 11 | 12 | npm install faker@5.5.3 13 | 14 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Importing%20Data%20to%20a%20Firestore%20Database/createTestData.js 15 | 16 | node createTestData 1000 17 | node importTestData customers_1000.csv 18 | npm install csv-parse 19 | -------------------------------------------------------------------------------- /Improving Network Performance 
I/techcps045.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud compute firewall-rules create iperf-testing --direction=INGRESS --network=default --action=ALLOW --priority=1000 --rules=tcp:5001,udp:5001 --source-ranges=0.0.0.0/0 --target-tags=all-instances 4 | 5 | 6 | echo "https://console.cloud.google.com/net-security/firewall-manager/firewall-policies/details/iperf-testing?project=$DEVSHELL_PROJECT_ID" 7 | 8 | 9 | -------------------------------------------------------------------------------- /Ingesting FHIR Data with the Healthcare API/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export REGION=$LOCATION 4 | 5 | gcloud services enable healthcare.googleapis.com 6 | 7 | 8 | bq --location=$REGION mk --dataset --description HCAPI-dataset $PROJECT_ID:$DATASET_ID 9 | 10 | bq --location=$REGION mk --dataset --description HCAPI-dataset-de-id $PROJECT_ID:de_id 11 | 12 | 13 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 14 | --member=serviceAccount:service-$PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com \ 15 | --role=roles/bigquery.dataEditor 16 | gcloud projects add-iam-policy-binding $PROJECT_ID \ 17 | --member=serviceAccount:service-$PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com \ 18 | --role=roles/bigquery.jobUser 19 | 20 | 21 | gcloud healthcare datasets create $DATASET_ID \ 22 | --location=$LOCATION 23 | 24 | 25 | gcloud pubsub topics create fhir-topic 26 | 27 | 28 | gcloud healthcare fhir-stores create ${FHIR_STORE_ID} --project=$DEVSHELL_PROJECT_ID --dataset=${DATASET_ID} --location=${LOCATION} --version=R4 --pubsub-topic=projects/${PROJECT_ID}/topics/${TOPIC} --enable-update-create --disable-referential-integrity 29 | 30 | 31 | gcloud healthcare fhir-stores create de_id --project=$DEVSHELL_PROJECT_ID --dataset=${DATASET_ID} --location=${LOCATION} --version=R4 --pubsub-topic=projects/${PROJECT_ID}/topics/${TOPIC} --enable-update-create --disable-referential-integrity 32 | 33 | 34 | gcloud healthcare fhir-stores import gcs $FHIR_STORE_ID \ 35 | --dataset=$DATASET_ID \ 36 | --location=$LOCATION \ 37 | --gcs-uri=gs://spls/gsp457/fhir_devdays_gcp/fhir1/* \ 38 | --content-structure=BUNDLE_PRETTY 39 | 40 | 41 | gcloud healthcare fhir-stores export bq $FHIR_STORE_ID \ 42 | --dataset=$DATASET_ID \ 43 | --location=$LOCATION \ 44 | --bq-dataset=bq://$PROJECT_ID.$DATASET_ID \ 45 | --schema-type=analytics 46 | 47 | 48 | -------------------------------------------------------------------------------- /Ingesting FHIR Data with the Healthcare API/techcps457.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | # NOTE: Go to Task 1 and Export the variables 6 | 7 | ``` 8 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Ingesting%20FHIR%20Data%20with%20the%20Healthcare%20API/techcps.sh 9 | sudo chmod +x techcps.sh 10 | ./techcps.sh 11 | 12 | ``` 13 | ## After above command get executed follow the video instructions 14 | 15 | ``` 16 | gcloud healthcare fhir-stores export bq de_id \ 17 | --dataset=$DATASET_ID \ 18 | --location=$LOCATION \ 19 | --bq-dataset=bq://$PROJECT_ID.de_id \ 20 | --schema-type=analytics 21 | ``` 22 | 23 | ``` 24 | SELECT 25 | id AS patient_id, 26 | name[safe_offset(0)].given AS given_name, 27 | 
name[safe_offset(0)].family AS family, 28 | birthDate AS birth_date 29 | FROM dataset1.Patient LIMIT 10 30 | ``` 31 | 32 | ## Congratulations, you're all done with the lab 😄 33 | 34 | # Thanks for watching :) 35 | -------------------------------------------------------------------------------- /Ingesting New Datasets into BigQuery/techcps411.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | export PROJECT_ID=$(gcloud config get-value project) 5 | 6 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 7 | 8 | bq mk --dataset ecommerce 9 | 10 | gsutil mb gs://$DEVSHELL_PROJECT_ID/ 11 | 12 | 13 | curl -LO https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Ingesting%20New%20Datasets%20into%20BigQuery/products.csv 14 | 15 | curl -LO https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Ingesting%20New%20Datasets%20into%20BigQuery/techcps.csv 16 | 17 | 18 | gsutil cp products.csv gs://$DEVSHELL_PROJECT_ID/ 19 | 20 | gsutil cp techcps.csv gs://$DEVSHELL_PROJECT_ID/ 21 | 22 | 23 | bq --location=US load --source_format=CSV --autodetect --skip_leading_rows=1 ecommerce.products gs://$DEVSHELL_PROJECT_ID/products.csv 24 | 25 | bq --location=US load --source_format=CSV --autodetect --skip_leading_rows=1 ecommerce.products gs://data-insights-course/exports/products.csv 26 | 27 | 28 | bq query --use_legacy_sql=false \ 29 | " 30 | #standardSQL 31 | SELECT 32 | *, 33 | SAFE_DIVIDE(orderedQuantity,stockLevel) AS ratio 34 | FROM 35 | ecommerce.products 36 | WHERE 37 | # include products that have been ordered and 38 | # are 80% through their inventory 39 | orderedQuantity > 0 40 | AND SAFE_DIVIDE(orderedQuantity,stockLevel) >= .8 41 | ORDER BY 42 | restockingLeadTime DESC 43 | " 44 | 45 | cat > external_table_definition.json < 22 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 23 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 24 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 25 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 26 | 27 | ### Thanks for watching and stay connected :) 28 | -------------------------------------------------------------------------------- /Introduction to APIs in Google Cloud/techcps294.sh: -------------------------------------------------------------------------------- 1 | # GSP294! 
2 | 3 | # Authenticate and get the token 4 | export OAUTH2_TOKEN=$(gcloud auth print-access-token) 5 | 6 | export DEVSHELL_PROJECT_ID=$(gcloud config get-value project) 7 | 8 | gcloud services enable storage.googleapis.com 9 | 10 | gcloud config set project $DEVSHELL_PROJECT_ID 11 | 12 | 13 | # Create the JSON file 14 | cat > values.json << EOF 15 | { 16 | "name": "${DEVSHELL_PROJECT_ID}-bucket", 17 | "location": "US", 18 | "storageClass": "MULTI_REGIONAL" 19 | } 20 | EOF 21 | 22 | # Create the bucket 23 | curl -X POST --data-binary @values.json \ 24 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 25 | -H "Content-Type: application/json" \ 26 | "https://www.googleapis.com/storage/v1/b?project=$DEVSHELL_PROJECT_ID" 27 | 28 | 29 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20APIs%20in%20Google%20Cloud/demo-image.png 30 | 31 | 32 | # Verify if the image is downloaded 33 | if [ -f demo-image.png ]; then 34 | export OBJECT=$(realpath demo-image.png) 35 | 36 | export BUCKET_NAME=$DEVSHELL_PROJECT_ID-bucket 37 | 38 | export OAUTH2_TOKEN=$(gcloud auth print-access-token) 39 | 40 | curl -X POST --data-binary @$OBJECT \ 41 | -H "Authorization: Bearer $OAUTH2_TOKEN" \ 42 | -H "Content-Type: image/png" \ 43 | "https://www.googleapis.com/upload/storage/v1/b/$BUCKET_NAME/o?uploadType=media&name=demo-image" 44 | else 45 | echo "File not downloaded. Check the URL or file path." 46 | echo "Please like share and subscribe to techcps(https://www.youtube.com/@techcps)." 47 | fi 48 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/callback_model.py: -------------------------------------------------------------------------------- 1 | # Import and configure logging 2 | import logging 3 | import google.cloud.logging as cloud_logging 4 | from google.cloud.logging.handlers import CloudLoggingHandler 5 | from google.cloud.logging_v2.handlers import setup_logging 6 | exp_logger = logging.getLogger('expLogger') 7 | exp_logger.setLevel(logging.INFO) 8 | exp_logger.addHandler(CloudLoggingHandler(cloud_logging.Client(), name="callback")) 9 | 10 | # Import tensorflow_datasets 11 | import tensorflow_datasets as tfds 12 | # Import numpy 13 | import numpy as np 14 | # Import TensorFlow 15 | import tensorflow as tf 16 | # Define Callback 17 | class myCallback(tf.keras.callbacks.Callback): 18 | def on_epoch_end(self, epoch, logs={}): 19 | if(logs.get('sparse_categorical_accuracy')>0.84): 20 | exp_logger.info("\nReached 84% accuracy so cancelling training!") 21 | self.model.stop_training = True 22 | callbacks = myCallback() 23 | # Define, load and configure data 24 | (ds_train, ds_test), info = tfds.load('fashion_mnist', split=['train', 'test'], with_info=True, as_supervised=True) 25 | # Define batch size 26 | BATCH_SIZE = 32 27 | # Normalizing and batch processing of data 28 | ds_train = ds_train.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 29 | ds_test = ds_test.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 30 | # Define the model 31 | model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), 32 | tf.keras.layers.Dense(64, activation=tf.nn.relu), 33 | tf.keras.layers.Dense(10, activation=tf.nn.softmax)]) 34 | # Compile data 35 | model.compile(optimizer = tf.keras.optimizers.Adam(), 36 | loss = tf.keras.losses.SparseCategoricalCrossentropy(), 37 | metrics=[tf.keras.metrics.SparseCategoricalAccuracy()]) 38 | model.fit(ds_train, epochs=5, 
callbacks=[callbacks]) 39 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/requirements.txt: -------------------------------------------------------------------------------- 1 | google-cloud-logging 2 | tensorflow 3 | tensorflow-datasets 4 | numpy>=1.16.5,<1.23.0 5 | scipy 6 | protobuf==3.20.* 7 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/callback_model.py 3 | 4 | python callback_model.py 5 | 6 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_1.py 7 | 8 | python updated_model_1.py 9 | 10 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_2.py 11 | 12 | python updated_model_2.py 13 | 14 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_3.py 15 | 16 | python updated_model_3.py 17 | 18 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/techcps631.sh: -------------------------------------------------------------------------------- 1 | 2 | python --version 3 | 4 | python -c "import tensorflow;print(tensorflow.__version__)" 5 | 6 | pip install tensorflow tensorflow-datasets google-cloud-logging numpy 7 | 8 | pip3 install --upgrade pip 9 | 10 | /usr/bin/python3 -m pip install -U google-cloud-logging --user 11 | 12 | /usr/bin/python3 -m pip install -U pylint --user 13 | 14 | pip install --upgrade tensorflow 15 | 16 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/requirements.txt 17 | 18 | /usr/bin/python3 -m pip install -r requirements.txt 19 | 20 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/model.py 21 | 22 | python model.py 23 | 24 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/CP1/model.py 25 | 26 | python model.py 27 | 28 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/CP/model.py 29 | 30 | python model.py 31 | 32 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/callback_model.py 33 | 34 | python callback_model.py 35 | 36 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_1.py 37 | 38 | python updated_model_1.py 39 | 40 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_2.py 41 | 42 | python updated_model_2.py 43 | 44 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Introduction%20to%20Computer%20Vision%20with%20TensorFlow/updated_model_3.py 45 | 46 | python updated_model_3.py 47 | 
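48 | # Hedged follow-up (added; not part of the original script): the model scripts
49 | # above log through CloudLoggingHandler under the log names "callback" and
50 | # "updated", so the entries can be checked from Cloud Shell (this assumes
51 | # $DEVSHELL_PROJECT_ID is set, as it is by default in Cloud Shell):
52 | gcloud logging read "logName=projects/$DEVSHELL_PROJECT_ID/logs/updated" --limit=10
53 | gcloud logging read "logName=projects/$DEVSHELL_PROJECT_ID/logs/callback" --limit=10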
-------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/updated_model_1.py: -------------------------------------------------------------------------------- 1 | # Import and configure logging 2 | import logging 3 | import google.cloud.logging as cloud_logging 4 | from google.cloud.logging.handlers import CloudLoggingHandler 5 | from google.cloud.logging_v2.handlers import setup_logging 6 | up_logger = logging.getLogger('upLogger') 7 | up_logger.setLevel(logging.INFO) 8 | up_logger.addHandler(CloudLoggingHandler(cloud_logging.Client(), name="updated")) 9 | 10 | # Import tensorflow_datasets 11 | import tensorflow_datasets as tfds 12 | # Import numpy 13 | import numpy as np 14 | # Import TensorFlow 15 | import tensorflow as tf 16 | 17 | # Define, load and configure data 18 | (ds_train, ds_test), info = tfds.load('fashion_mnist', split=['train', 'test'], with_info=True, as_supervised=True) 19 | # Define batch size 20 | BATCH_SIZE = 32 21 | # Normalizing and batch processing of data 22 | ds_train = ds_train.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 23 | ds_test = ds_test.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 24 | # Define the model 25 | model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), 26 | tf.keras.layers.Dense(128, activation=tf.nn.relu), 27 | tf.keras.layers.Dense(10, activation=tf.nn.softmax)]) 28 | # Compile data 29 | model.compile(optimizer = tf.keras.optimizers.Adam(), 30 | loss = tf.keras.losses.SparseCategoricalCrossentropy(), 31 | metrics=[tf.keras.metrics.SparseCategoricalAccuracy()]) 32 | model.fit(ds_train, epochs=5) 33 | # Logs model summary 34 | model.summary(print_fn=up_logger.info) 35 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/updated_model_2.py: -------------------------------------------------------------------------------- 1 | # Import and configure logging 2 | import logging 3 | import google.cloud.logging as cloud_logging 4 | from google.cloud.logging.handlers import CloudLoggingHandler 5 | from google.cloud.logging_v2.handlers import setup_logging 6 | up_logger = logging.getLogger('upLogger') 7 | up_logger.setLevel(logging.INFO) 8 | up_logger.addHandler(CloudLoggingHandler(cloud_logging.Client(), name="updated")) 9 | 10 | # Import tensorflow_datasets 11 | import tensorflow_datasets as tfds 12 | # Import numpy 13 | import numpy as np 14 | # Import TensorFlow 15 | import tensorflow as tf 16 | 17 | # Define, load and configure data 18 | (ds_train, ds_test), info = tfds.load('fashion_mnist', split=['train', 'test'], with_info=True, as_supervised=True) 19 | # Define batch size 20 | BATCH_SIZE = 32 21 | # Normalizing and batch processing of data 22 | ds_train = ds_train.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 23 | ds_test = ds_test.map(lambda x, y: (tf.cast(x, tf.float32)/255.0, y)).batch(BATCH_SIZE) 24 | # Define the model 25 | model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), 26 | tf.keras.layers.Dense(64, activation=tf.nn.relu), 27 | tf.keras.layers.Dense(64, activation=tf.nn.relu), 28 | tf.keras.layers.Dense(10, activation=tf.nn.softmax)]) 29 | # Compile data 30 | model.compile(optimizer = tf.keras.optimizers.Adam(), 31 | loss = tf.keras.losses.SparseCategoricalCrossentropy(), 32 | metrics=[tf.keras.metrics.SparseCategoricalAccuracy()]) 33 | model.fit(ds_train, epochs=5) 34 | # Logs model summary 35 
| model.summary(print_fn=up_logger.info) 36 | -------------------------------------------------------------------------------- /Introduction to Computer Vision with TensorFlow/updated_model_3.py: -------------------------------------------------------------------------------- 1 | # Import and configure logging 2 | import logging 3 | import google.cloud.logging as cloud_logging 4 | from google.cloud.logging.handlers import CloudLoggingHandler 5 | from google.cloud.logging_v2.handlers import setup_logging 6 | up_logger = logging.getLogger('upLogger') 7 | up_logger.setLevel(logging.INFO) 8 | up_logger.addHandler(CloudLoggingHandler(cloud_logging.Client(), name="updated")) 9 | 10 | # Import tensorflow_datasets 11 | import tensorflow_datasets as tfds 12 | # Import numpy 13 | import numpy as np 14 | # Import TensorFlow 15 | import tensorflow as tf 16 | 17 | # Define, load and configure data 18 | (ds_train, ds_test), info = tfds.load('fashion_mnist', split=['train', 'test'], with_info=True, as_supervised=True) 19 | # Define batch size 20 | BATCH_SIZE = 32 21 | 22 | 23 | # Normalizing and batch processing of data 24 | ds_train = ds_train.batch(BATCH_SIZE) 25 | ds_test = ds_test.batch(BATCH_SIZE) 26 | 27 | # Define the model 28 | model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), 29 | tf.keras.layers.Dense(64, activation=tf.nn.relu), 30 | tf.keras.layers.Dense(10, activation=tf.nn.softmax)]) 31 | 32 | # Compile data 33 | model.compile(optimizer = tf.keras.optimizers.Adam(), 34 | loss = tf.keras.losses.SparseCategoricalCrossentropy(), 35 | metrics=[tf.keras.metrics.SparseCategoricalAccuracy()]) 36 | model.fit(ds_train, epochs=5) 37 | # Logs model summary 38 | model.summary(print_fn=up_logger.info) 39 | 40 | # Print out max value to see the changes 41 | image_batch, labels_batch = next(iter(ds_train)) 42 | t_image_batch, t_labels_batch = next(iter(ds_test)) 43 | up_logger.info("training images max " + str(np.max(image_batch[0]))) 44 | up_logger.info("test images max " + str(np.max(t_image_batch[0]))) 45 | -------------------------------------------------------------------------------- /Introduction to SQL for BigQuery and Cloud SQL/techcpsgsp281.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT_ID=$(gcloud info --format='value(config.project)') 4 | export BUCKET_NAME=$PROJECT_ID 5 | 6 | bq query --use_legacy_sql=false 'SELECT start_station_name, COUNT(*) AS num FROM `bigquery-public-data.london_bicycles.cycle_hire` GROUP BY start_station_name ORDER BY num DESC' 7 | 8 | bq query --use_legacy_sql=false 'SELECT end_station_name, COUNT(*) AS num FROM `bigquery-public-data.london_bicycles.cycle_hire` GROUP BY end_station_name ORDER BY num DESC' 9 | 10 | gsutil mb gs://$PROJECT_ID 11 | 12 | curl -LO https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/main/Introduction%20to%20SQL%20for%20BigQuery%20and%20Cloud%20SQL/start_station_data.csv 13 | 14 | curl -LO https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/main/Introduction%20to%20SQL%20for%20BigQuery%20and%20Cloud%20SQL/end_station_data.csv 15 | 16 | gsutil cp *start_station_data.csv gs://$PROJECT_ID/ 17 | gsutil cp *end_station_data.csv gs://$PROJECT_ID/ 18 | 19 | gcloud sql instances create my-demo --tier=db-n1-standard-2 --region=$REGION --root-password=Password 20 | 21 | gcloud sql databases create bike --instance=my-demo 22 | -------------------------------------------------------------------------------- /Loading Your Own Data into BigQuery/techcps865.sh: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | bq mk --dataset nyctaxi 4 | 5 | wget https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Loading%20Your%20Own%20Data%20into%20BigQuery/nyc_tlc_yellow_trips_2018_subset_1.csv 6 | 7 | 8 | bq load \ 9 | --source_format=CSV \ 10 | --autodetect \ 11 | nyctaxi.2018trips \ 12 | nyc_tlc_yellow_trips_2018_subset_1.csv 13 | 14 | 15 | 16 | bq query --use_legacy_sql=false \ 17 | " 18 | #standardSQL 19 | SELECT 20 | * 21 | FROM 22 | nyctaxi.2018trips 23 | ORDER BY 24 | fare_amount DESC 25 | LIMIT 5 26 | " 27 | 28 | bq load \ 29 | --source_format=CSV \ 30 | --autodetect \ 31 | --noreplace \ 32 | nyctaxi.2018trips \ 33 | gs://cloud-training/OCBL013/nyc_tlc_yellow_trips_2018_subset_2.csv 34 | 35 | 36 | 37 | bq query --use_legacy_sql=false \ 38 | "#standardSQL 39 | CREATE TABLE 40 | nyctaxi.january_trips AS 41 | SELECT 42 | * 43 | FROM 44 | nyctaxi.2018trips 45 | WHERE 46 | EXTRACT(Month 47 | FROM 48 | pickup_datetime)=1; 49 | " 50 | 51 | 52 | bq query --use_legacy_sql=false \ 53 | "#standardSQL 54 | SELECT 55 | * 56 | FROM 57 | nyctaxi.january_trips 58 | ORDER BY 59 | trip_distance DESC 60 | LIMIT 61 | 1 62 | " 63 | 64 | -------------------------------------------------------------------------------- /Log Analytics on Google Cloud/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## 💡 Lab Link: [Log Analytics on Google Cloud - GSP1088](https://www.cloudskillsboost.google/focuses/49749?parent=catalog) 4 | 5 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/1IUNZevC2qI) 6 | 7 | --- 8 | 9 | ## 🚨Export the ZONE Name correctly 10 | ``` 11 | export ZONE= 12 | ``` 13 | 14 | ## 🚨Copy and run the below commands in Cloud Shell: 15 | 16 | ``` 17 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Log%20Analytics%20on%20Google%20Cloud/techcps1088.sh 18 | sudo chmod +x techcps1088.sh 19 | ./techcps1088.sh 20 | ``` 21 | 22 | ### Congratulations, you're all done with the lab 😄 23 | 24 | --- 25 | 26 | ### 🌐 Join our Community 27 | 28 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 29 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 30 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 31 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 32 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 33 | 34 | ### Thanks for watching :) 35 | 36 | -------------------------------------------------------------------------------- /Log Analytics on Google Cloud/techcps1088.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | export PROJECT_ID=$(gcloud config get-value project) 5 | 6 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 7 | 8 | gcloud config set compute/zone $ZONE 9 | 10 | export REGION=${ZONE%-*} 11 | gcloud config set compute/region $REGION 12 | 13 | gcloud container clusters get-credentials day2-ops --region $REGION 14 | 15 | git clone https://github.com/GoogleCloudPlatform/microservices-demo.git 16 | 17 | 18 | cd microservices-demo 19 | 20 | kubectl apply -f release/kubernetes-manifests.yaml 21 | 22 | sleep 45 23 | 24 | kubectl get pods 25 | 26 | 27 | export EXTERNAL_IP=$(kubectl get service frontend-external -o jsonpath="{.status.loadBalancer.ingress[0].ip}") 28 | echo $EXTERNAL_IP 29 | 30 | curl -o /dev/null -s -w "%{http_code}\n" 
http://${EXTERNAL_IP} 31 | 32 | 33 | gcloud logging buckets update _Default --project=$DEVSHELL_PROJECT_ID --location=global --enable-analytics 34 | 35 | 36 | gcloud logging sinks create day2ops-sink \ 37 | logging.googleapis.com/projects/$DEVSHELL_PROJECT_ID/locations/global/buckets/day2ops-log \ 38 | --log-filter='resource.type="k8s_container"' \ 39 | --include-children --format='json' 40 | 41 | 42 | echo "Click this link to open Log bucket! [https://console.cloud.google.com/logs/storage/bucket?cloudshell=true&project=$DEVSHELL_PROJECT_ID]" 43 | 44 | 45 | -------------------------------------------------------------------------------- /Manage Kubernetes in Google Cloud: Challenge Lab/techcps510.md: -------------------------------------------------------------------------------- 1 | 2 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 3 | 4 | ## Go to Logging > Logs-based Metrics 5 | ## Metric type: Counter 6 | 7 | ## Log Metric Name: 8 | ``` 9 | pod-image-errors 10 | ``` 11 | 12 | ## In the filter builder box, add the following query: 13 | ``` 14 | resource.type="k8s_pod" 15 | severity=WARNING 16 | ``` 17 | 18 | ## Tap here to open the [Online Notepad](https://www.rapidtables.com/tools/notepad.html#) 19 | 20 | ## EXPORT all of the below variables: 21 | 22 | ``` 23 | export CLUSTER_NAME= 24 | 25 | export ZONE= 26 | 27 | export NAMESPACE= 28 | 29 | export INTERVAL= 30 | 31 | export REPO_NAME= 32 | 33 | export SERVICE_NAME= 34 | ``` 35 | 36 | ``` 37 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Manage%20Kubernetes%20in%20Google%20Cloud%3A%20Challenge%20Lab/techcps.sh 38 | sudo chmod +x techcps.sh 39 | ./techcps.sh 40 | ``` 41 | 42 | ## Congratulations, you're all done with the lab 😄 43 | 44 | # Thanks for watching :) 45 | 46 | -------------------------------------------------------------------------------- /Managing Vault Tokens/techcpsgsp1006.md: -------------------------------------------------------------------------------- 1 | 2 | # Managing Vault Tokens [GSP1006] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | ``` 7 | curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo apt-key add - 8 | sudo apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" 9 | sudo apt-get update 10 | sudo apt-get install vault 11 | vault server -dev 12 | ``` 13 | ## Open a new Cloud Shell tab 14 | 15 | ``` 16 | export VAULT_ADDR='http://127.0.0.1:8200' 17 | vault auth enable approle 18 | vault write auth/approle/role/jenkins policies="jenkins" period="24h" 19 | vault read -format=json auth/approle/role/jenkins/role-id \ 20 | | jq -r ".data.role_id" > role_id.txt 21 | vault write -f -format=json auth/approle/role/jenkins/secret-id | jq -r ".data.secret_id" > secret_id.txt 22 | vault write auth/approle/login role_id=$(cat role_id.txt) \ 23 | secret_id=$(cat secret_id.txt) 24 | ``` 25 | ## REPLACE YOUR TOKEN ID 26 | 27 | ``` 28 | vault token lookup 29 | ``` 30 | ``` 31 | vault token lookup -format=json | jq -r .data.policies > token_policies.txt 32 | ``` 33 | ``` 34 | export PROJECT_ID=$(gcloud config get-value project) 35 | gsutil cp token_policies.txt gs://$PROJECT_ID 36 | ``` 37 | 38 | ## Congratulations, you're all done with the lab 😄 39 | 40 | # Thanks for watching :) 41 | -------------------------------------------------------------------------------- /Mitigate Threats and
Vulnerabilities with Security Command Center: Challenge Lab/findings.jsonl: -------------------------------------------------------------------------------- 1 | Please like share and subscribe to techcps 2 | -------------------------------------------------------------------------------- /Mitigate Threats and Vulnerabilities with Security Command Center: Challenge Lab/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | ZONE=$(gcloud compute project-info describe --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 3 | 4 | # gcloud scc muteconfigs create muting-flow-log-findings --project=$DEVSHELL_PROJECT_ID --location=global --description="Rule for muting VPC Flow Logs" --filter="category=\"FLOW_LOGS_DISABLED\"" --type=STATIC 5 | 6 | # gcloud scc muteconfigs create muting-audit-logging-findings --project=$DEVSHELL_PROJECT_ID --location=global --description="Rule for muting audit logs" --filter="category=\"AUDIT_LOGGING_DISABLED\"" --type=STATIC 7 | 8 | # gcloud scc muteconfigs create muting-admin-sa-findings --project=$DEVSHELL_PROJECT_ID --location=global --description="Rule for muting admin service account findings" --filter="category=\"ADMIN_SERVICE_ACCOUNT\"" --type=STATIC 9 | 10 | gcloud compute firewall-rules delete default-allow-rdp --quiet && gcloud compute firewall-rules delete default-allow-ssh --quiet 11 | 12 | gcloud compute firewall-rules create default-allow-rdp --source-ranges=35.235.240.0/20 --allow=tcp:3389 --priority=65534 --description="Allow RDP traffic from 35.235.240.0/20" 13 | 14 | gcloud compute firewall-rules create default-allow-ssh --source-ranges=35.235.240.0/20 --allow=tcp:22 --priority=65534 --description="Allow SSH traffic from 35.235.240.0/20" 15 | 16 | echo "" 17 | 18 | echo "" 19 | 20 | echo "Click this link to open [https://console.cloud.google.com/compute/instancesEdit/zones/$ZONE/instances/cls-vm?project=$DEVSHELL_PROJECT_ID]" 21 | 22 | echo "" 23 | 24 | -------------------------------------------------------------------------------- /Mitigate Threats and Vulnerabilities with Security Command Center: Challenge Lab/techcps2.sh: -------------------------------------------------------------------------------- 1 | 2 | ZONE=$(gcloud compute project-info describe --format="value(commonInstanceMetadata.items[google-compute-default-zone])") 3 | 4 | REGION="${ZONE%-*}" 5 | 6 | External_IP=$(gcloud compute instances describe cls-vm --zone $ZONE --project $DEVSHELL_PROJECT_ID --format='get(networkInterfaces[0].accessConfigs[0].natIP)') 7 | 8 | gsutil mb -p $DEVSHELL_PROJECT_ID -c STANDARD -l $REGION -b on gs://scc-export-bucket-$DEVSHELL_PROJECT_ID 9 | 10 | gsutil uniformbucketlevelaccess set off gs://scc-export-bucket-$DEVSHELL_PROJECT_ID 11 | 12 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Mitigate%20Threats%20and%20Vulnerabilities%20with%20Security%20Command%20Center%3A%20Challenge%20Lab/findings.jsonl 13 | 14 | gsutil cp findings.jsonl gs://scc-export-bucket-$DEVSHELL_PROJECT_ID 15 | 16 | 17 | echo "" 18 | 19 | echo "" 20 | 21 | echo "Copy this External IP http://$External_IP:8080" 22 | 23 | 24 | echo "" 25 | 26 | echo "" 27 | 28 | echo "Click this link to open [https://console.cloud.google.com/security/web-scanner/scanConfigs/edit?project=$DEVSHELL_PROJECT_ID]" 29 | 30 | echo "" 31 | -------------------------------------------------------------------------------- /Monitor an Apache Web Server using Ops Agent/techcps.md:
-------------------------------------------------------------------------------- 1 | 2 | ## 💡 Lab Link: [Monitor an Apache Web Server using Ops Agent - GSP1108](https://www.cloudskillsboost.google/focuses/56596?parent=catalog) 3 | 4 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/3vNqet2BGr4) 5 | 6 | --- 7 | 8 | ## 🚨Copy and run the below commands in Cloud Shell: 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Monitor%20an%20Apache%20Web%20Server%20using%20Ops%20Agent/techcps1108.sh 12 | sudo chmod +x techcps1108.sh 13 | ./techcps1108.sh 14 | ``` 15 | 16 | ### Congratulations, you're all done with the lab 😄 17 | 18 | --- 19 | 20 | ### 🌐 Join our Community 21 | 22 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 23 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 24 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 25 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 26 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 27 | 28 | ### Thanks for watching :) 29 | -------------------------------------------------------------------------------- /Monitoring Multiple Projects with Cloud Monitoring/techcps090.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Set text styles 4 | YELLOW=$(tput setaf 3) 5 | BOLD=$(tput bold) 6 | RESET=$(tput sgr0) 7 | 8 | echo "Please set the below values correctly" 9 | read -p "${YELLOW}${BOLD}Enter the ZONE: ${RESET}" ZONE 10 | 11 | # Export variables after collecting input 12 | export ZONE 13 | 14 | gcloud compute instances create instance2 --zone=$ZONE --machine-type=e2-medium 15 | 16 | 17 | cat > techcps.json < index.js < { 30 | let message = req.query.message || req.body.message || 'Hello World!'; 31 | res.status(200).send(message); 32 | }; 33 | EOF_CP 34 | 35 | cat > package.json < 32 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 33 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 34 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 35 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 36 | 37 | ### Thanks for watching :) 38 | -------------------------------------------------------------------------------- /Networking 101/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | export PROJECT_ID=$(gcloud config get-value project) 6 | 7 | gcloud compute networks create taw-custom-network --subnet-mode custom 8 | 9 | gcloud compute networks subnets create subnet-$REGION1 \ 10 | --network taw-custom-network \ 11 | --region $REGION1 \ 12 | --range 10.0.0.0/16 13 | 14 | gcloud compute networks subnets create subnet-$REGION2 \ 15 | --network taw-custom-network \ 16 | --region $REGION2 \ 17 | --range 10.1.0.0/16 18 | 19 | gcloud compute networks subnets create subnet-$REGION3 \ 20 | --network taw-custom-network \ 21 | --region $REGION3 \ 22 | --range 10.2.0.0/16 23 | 24 | 25 | 26 | gcloud compute networks subnets list \ 27 | --network taw-custom-network 28 | 29 | gcloud compute firewall-rules create nw101-allow-http \ 30 | --allow tcp:80 --network taw-custom-network --source-ranges 0.0.0.0/0 \ 31 | --target-tags http 32 | 33 | gcloud compute firewall-rules create "nw101-allow-icmp" --allow icmp --network "taw-custom-network" 
--target-tags rules 34 | 35 | gcloud compute firewall-rules create "nw101-allow-internal" --allow tcp:0-65535,udp:0-65535,icmp --network "taw-custom-network" --source-ranges "10.0.0.0/16","10.2.0.0/16","10.1.0.0/16" 36 | 37 | gcloud compute firewall-rules create "nw101-allow-ssh" --allow tcp:22 --network "taw-custom-network" --target-tags "ssh" 38 | 39 | gcloud compute firewall-rules create "nw101-allow-rdp" --allow tcp:3389 --network "taw-custom-network" 40 | 41 | 42 | -------------------------------------------------------------------------------- /Predict Soccer Match Outcomes with BigQuery ML: Challenge Lab/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | # Predict Soccer Match Outcomes with BigQuery ML: Challenge Lab [GSP374] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 5 | 6 | ``` 7 | export EVENT_NAME= 8 | 9 | export TABLE_NAME= 10 | 11 | export MODEL_NAME= 12 | 13 | export CP_VALUE_X1= 14 | 15 | export CP_VALUE_Y1= 16 | 17 | export CP_VALUE_X2= 18 | 19 | export CP_VALUE_Y2= 20 | 21 | export CP_VALUE= 22 | 23 | export FUNCTION_NAME_1= 24 | 25 | export FUNCTION_NAME_2= 26 | 27 | 28 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Predict%20Soccer%20Match%20Outcomes%20with%20BigQuery%20ML%3A%20Challenge%20Lab/techcps.sh 29 | sudo chmod +x techcps.sh 30 | ./techcps.sh 31 | 32 | ``` 33 | 34 | ## Congratulations, you're all done with the lab 😄 35 | 36 | # Thanks for watching :) 37 | 38 | -------------------------------------------------------------------------------- /Prepare Data for ML APIs on Google Cloud: Challenge Lab/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | 6 | ## 🚨 Export the below variables name correctly: 7 | 8 | ``` 9 | export DATASET_NAME= 10 | 11 | export BUCKET_CP1= 12 | 13 | export REGION= 14 | 15 | export TABLE_NAME= 16 | 17 | export BUCKET_CP3= 18 | 19 | export BUCKET_CP4= 20 | 21 | export API_KEY= 22 | ``` 23 | 24 | ## 🚨 25 | 26 | ``` 27 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Prepare%20Data%20for%20ML%20APIs%20on%20Google%20Cloud%3A%20Challenge%20Lab/techcps323.sh 28 | sudo chmod +x techcps323.sh 29 | ./techcps323.sh 30 | ``` 31 | 32 | ## Congratulations, you're all done with the lab 😄 33 | 34 | # Thanks for watching :) 35 | -------------------------------------------------------------------------------- /Prompt Design in Vertex AI Challenge Lab/Cymbal Tagline Generator Template.json: -------------------------------------------------------------------------------- 1 | { 2 | "title": "Cymbal Tagline Generator Template", 3 | "description": "", 4 | "parameters": { 5 | "groundingPromptConfig": { 6 | "disabled": true, 7 | "groundingConfig": { 8 | "sources": [ 9 | { 10 | "type": "WEB" 11 | } 12 | ] 13 | } 14 | }, 15 | "stopSequences": [], 16 | "temperature": 1, 17 | "tokenLimits": 8192, 18 | "topP": 0.95 19 | }, 20 | "systemInstruction": { 21 | "parts": [ 22 | { 23 | "text": "Cymbal Direct is partnering with an outdoor gear retailer. They're launching a new line of products designed to encourage young people to explore the outdoors. Help them create catchy taglines for this product line." 
24 | } 25 | ] 26 | }, 27 | "inputPrefixes": [ 28 | "" 29 | ], 30 | "outputPrefixes": [ 31 | "" 32 | ], 33 | "examples": [ 34 | { 35 | "inputs": [ 36 | "Write a tagline for a durable backpack designed for hikers that makes them feel prepared. Consider styles like minimalist.\t" 37 | ], 38 | "outputs": [ 39 | "Built for the Journey: Your Adventure Essentials." 40 | ] 41 | } 42 | ], 43 | "testData": [ 44 | { 45 | "inputs": [ 46 | "Write a tagline for a durable backpack designed for hikers that makes them feel prepared. Consider styles like minimalist.\t" 47 | ] 48 | } 49 | ], 50 | "type": "freeform", 51 | "prompt": { 52 | "parts": [] 53 | }, 54 | "model": "gemini-1.5-pro-001" 55 | } -------------------------------------------------------------------------------- /Protect Cloud Traffic with BeyondCorp Enterprise BCE Security: Challenge Lab/techcps373.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud services enable iap.googleapis.com 6 | 7 | gcloud config set project $DEVSHELL_PROJECT_ID 8 | 9 | git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git 10 | cd python-docs-samples/appengine/standard_python3/hello_world/ 11 | gcloud app create --project=$(gcloud config get-value project) --region=$REGION 12 | 13 | gcloud app deploy --quiet 14 | 15 | 16 | 17 | export AUTH_DOMAIN=$(gcloud config get-value project).uc.r.appspot.com 18 | echo $AUTH_DOMAIN 19 | 20 | -------------------------------------------------------------------------------- /Provision Services with Google Cloud Marketplace/techcps003.sh: -------------------------------------------------------------------------------- 1 | 2 | export ZONE=$(gcloud compute instances list --filter="name=nginxstack-1-vm" --format="value(zone)") 3 | 4 | 5 | cat > cp.sh <<'EOF_CP' 6 | 7 | ps aux | grep nginx 8 | 9 | EOF_CP 10 | 11 | gcloud compute scp cp.sh nginxstack-1-vm:/tmp --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet 12 | 13 | gcloud compute ssh nginxstack-1-vm --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --quiet --command="bash /tmp/cp.sh" 14 | 15 | -------------------------------------------------------------------------------- /Pub Sub Lite: Qwik Start/techcps832.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | gcloud config set compute/region $REGION 5 | gcloud services enable pubsublite.googleapis.com --project=$DEVSHELL_PROJECT_ID 6 | 7 | sleep 25 8 | 9 | pip3 install --upgrade google-cloud-pubsublite 10 | 11 | gcloud pubsub lite-topics create my-lite-topic --project=$DEVSHELL_PROJECT_ID --zone=$REGION-b --partitions=1 --per-partition-bytes=30GiB --message-retention-period=2w 12 | 13 | gcloud pubsub lite-subscriptions create my-lite-subscription --project=$DEVSHELL_PROJECT_ID --zone=$REGION-b --topic=my-lite-topic --delivery-requirement=deliver-after-stored 14 | 15 | -------------------------------------------------------------------------------- /Reinforcement Learning: Qwik Start/techcps691.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud services enable aiplatform.googleapis.com storage-component.googleapis.com dataflow.googleapis.com artifactregistry.googleapis.com dataplex.googleapis.com compute.googleapis.com dataform.googleapis.com notebooks.googleapis.com datacatalog.googleapis.com visionai.googleapis.com 4 | 5 | sleep 30 6 | 7 | 8 | # Create a new notebook instance 9 | gcloud notebooks instances create 
my-notebook --location=$ZONE --vm-image-project=deeplearning-platform-release --vm-image-family=tf-2-11-cu113-notebooks --machine-type=e2-standard-2 10 | 11 | echo "Please subscribe to Techcps [https://www.youtube.com/@techcps].." 12 | 13 | echo "----------------Click the link below------------------" 14 | 15 | echo "Click the link here https://console.cloud.google.com/vertex-ai/workbench/user-managed?cloudshell=true&project=$DEVSHELL_PROJECT_ID" 16 | 17 | 18 | -------------------------------------------------------------------------------- /Running Windows Containers on Compute Engine/techcps153.bat: -------------------------------------------------------------------------------- 1 | 2 | @echo off 3 | mkdir my-windows-app 4 | cd my-windows-app 5 | mkdir content 6 | 7 | echo ^<html^>^<body^>^<h1^>Windows containers^</h1^>^<p^>Windows containers are cool!^</p^>^</body^>^</html^> > content\index.html 8 | 9 | 10 | echo FROM mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2019 > Dockerfile 11 | echo RUN powershell -NoProfile -Command Remove-Item -Recurse C:\inetpub\wwwroot\* >> Dockerfile 12 | echo WORKDIR /inetpub/wwwroot >> Dockerfile 13 | echo COPY content/ . >> Dockerfile 14 | 15 | docker build -t gcr.io/dotnet-atamel/iis-site-windows . 16 | docker images 17 | -------------------------------------------------------------------------------- /Scale Out and Update a Containerized Application on a Kubernetes Cluster: Challenge Lab/techcpsgsp305.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gsutil cp gs://$DEVSHELL_PROJECT_ID/echo-web-v2.tar.gz . 4 | tar -xzvf echo-web-v2.tar.gz 5 | 6 | gcloud builds submit --tag gcr.io/$DEVSHELL_PROJECT_ID/echo-app:v2 . 7 | 8 | gcloud container clusters get-credentials echo-cluster --zone=$ZONE 9 | 10 | kubectl create deployment echo-web --image=gcr.io/qwiklabs-resources/echo-app:v2 11 | 12 | kubectl expose deployment echo-web --type=LoadBalancer --port 80 --target-port 8000 13 | 14 | kubectl scale deploy echo-web --replicas=2 15 | -------------------------------------------------------------------------------- /Securing Cloud Applications with Identity Aware Proxy IAP using Zero-Trust/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | ## 🚨Export the variable name correctly: 6 | ``` 7 | export REGION= 8 | ``` 9 | 10 | ## 🚨Copy and run the below commands in Cloud Shell: 11 | ``` 12 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Securing%20Cloud%20Applications%20with%20Identity%20Aware%20Proxy%20IAP%20using%20Zero-Trust/techcps946.sh 13 | sudo chmod +x techcps946.sh 14 | ./techcps946.sh 15 | ``` 16 | 17 | 18 | ## Congratulations, you're all done with the lab 😄 19 | 20 | # Thanks for watching :) 21 | -------------------------------------------------------------------------------- /Securing Cloud Applications with Identity Aware Proxy IAP using Zero-Trust/techcps946.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | echo "Click this link to open new tab https://console.cloud.google.com/security/iap/getStarted?cloudshell=true&project=$DEVSHELL_PROJECT_ID" 6 | 7 | export PROJECT_ID=$(gcloud config get-value project) 8 | 9 | gcloud config set compute/region $REGION 10 | 11 | git clone
https://github.com/googlecodelabs/user-authentication-with-iap.git 12 | cd user-authentication-with-iap 13 | 14 | cd 1-HelloWorld 15 | 16 | gcloud app create --project=$(gcloud config get-value project) --region=$REGION 17 | 18 | sed -i "15c\runtime: python38" app.yaml 19 | 20 | sleep 25 21 | 22 | #!/bin/bash 23 | 24 | deploy_app() { 25 | gcloud app deploy --quiet 26 | } 27 | 28 | deploy_success=false 29 | 30 | while [ "$deploy_success" = false ]; do 31 | if deploy_app; then 32 | echo "App deployed successfully." 33 | deploy_success=true 34 | else 35 | echo "Deployment failed. Retrying in 10 seconds..." 36 | echo "Please subscribe to techcps (https://www.youtube.com/@techcps)." 37 | sleep 10 38 | fi 39 | done 40 | 41 | 42 | cd ~/user-authentication-with-iap/2- 43 | 44 | sed -i "15c\runtime: python38" app.yaml 45 | 46 | sleep 25 47 | 48 | #!/bin/bash 49 | 50 | deploy_app() { 51 | gcloud app deploy --quiet 52 | } 53 | 54 | deploy_success=false 55 | 56 | while [ "$deploy_success" = false ]; do 57 | if deploy_app; then 58 | echo "App deployed successfully." 59 | deploy_success=true 60 | else 61 | echo "Deployment failed. Retrying in 10 seconds..." 62 | echo "Please subscribe to techcps (https://www.youtube.com/@techcps)." 63 | sleep 10 64 | fi 65 | done 66 | 67 | echo "Congratulations, you're all done with the lab" 68 | echo "Please like share and subscribe to techcps(https://www.youtube.com/@techcps)..." 69 | -------------------------------------------------------------------------------- /Securing Google Cloud with CFT Scorecard/techcps698.sh: -------------------------------------------------------------------------------- 1 | 2 | export GOOGLE_PROJECT=$DEVSHELL_PROJECT_ID 3 | export CAI_BUCKET_NAME=cai-$GOOGLE_PROJECT 4 | 5 | gcloud services enable cloudasset.googleapis.com \ 6 | --project $GOOGLE_PROJECT 7 | 8 | 9 | gcloud beta services identity create --service=cloudasset.googleapis.com --project=$GOOGLE_PROJECT 10 | 11 | gcloud projects add-iam-policy-binding ${GOOGLE_PROJECT} \ 12 | --member=serviceAccount:service-$(gcloud projects list --filter="$GOOGLE_PROJECT" --format="value(PROJECT_NUMBER)")@gcp-sa-cloudasset.iam.gserviceaccount.com \ 13 | --role=roles/storage.admin 14 | 15 | git clone https://github.com/forseti-security/policy-library.git 16 | 17 | cp policy-library/samples/storage_denylist_public.yaml policy-library/policies/constraints/ 18 | 19 | gsutil mb -l $REGION -p $GOOGLE_PROJECT gs://$CAI_BUCKET_NAME 20 | 21 | 22 | # Export resource data 23 | gcloud asset export \ 24 | --output-path=gs://$CAI_BUCKET_NAME/resource_inventory.json \ 25 | --content-type=resource \ 26 | --project=$GOOGLE_PROJECT 27 | 28 | # Export IAM data 29 | gcloud asset export \ 30 | --output-path=gs://$CAI_BUCKET_NAME/iam_inventory.json \ 31 | --content-type=iam-policy \ 32 | --project=$GOOGLE_PROJECT 33 | 34 | # Export org policy data 35 | gcloud asset export \ 36 | --output-path=gs://$CAI_BUCKET_NAME/org_policy_inventory.json \ 37 | --content-type=org-policy \ 38 | --project=$GOOGLE_PROJECT 39 | 40 | # Export access policy data 41 | gcloud asset export \ 42 | --output-path=gs://$CAI_BUCKET_NAME/access_policy_inventory.json \ 43 | --content-type=access-policy \ 44 | --project=$GOOGLE_PROJECT 45 | 46 | -------------------------------------------------------------------------------- /Service Accounts and Roles: Fundamentals/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | export PROJECT_ID=$(gcloud config get-value project) 5 | 6 
| export REGION=${ZONE%-*} 7 | gcloud config set compute/region $REGION 8 | 9 | gcloud iam service-accounts create my-sa-123 --display-name "my service account" 10 | 11 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member serviceAccount:my-sa-123@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --role roles/editor 12 | 13 | # Create a service account 14 | gcloud iam service-accounts create bigquery-qwiklab --description="please like share & subscribe to techcps" --display-name="bigquery-qwiklab" 15 | 16 | # Assign BigQuery Data Viewer role 17 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member="serviceAccount:bigquery-qwiklab@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/bigquery.dataViewer" 18 | 19 | # Assign BigQuery User role 20 | gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID --member="serviceAccount:bigquery-qwiklab@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com" --role="roles/bigquery.user" 21 | 22 | # Create a VM instance with the following information: 23 | gcloud compute instances create bigquery-instance --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=default --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --service-account=bigquery-qwiklab@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com --scopes=https://www.googleapis.com/auth/cloud-platform --create-disk=auto-delete=yes,boot=yes,device-name=bigquery-instance,image=projects/debian-cloud/global/images/debian-11-bullseye-v20231010,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any 24 | 25 | -------------------------------------------------------------------------------- /Service Accounts and Roles: Fundamentals/techcps2.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | export PROJECT_ID=$(gcloud config get-value project) 4 | 5 | # Install the necessary dependencies by running the following commands 6 | sudo apt-get update 7 | sudo apt-get install -y git python3-pip 8 | pip3 install --upgrade pip 9 | pip3 install google-cloud-bigquery 10 | pip3 install pyarrow 11 | pip3 install pandas 12 | pip3 install db-dtypes 13 | 14 | # create the example Python file 15 | echo " 16 | from google.auth import compute_engine 17 | from google.cloud import bigquery 18 | credentials = compute_engine.Credentials( 19 | service_account_email='bigquery-qwiklab@$PROJECT_ID.iam.gserviceaccount.com') 20 | query = ''' 21 | SELECT 22 | year, 23 | COUNT(1) as num_babies 24 | FROM 25 | publicdata.samples.natality 26 | WHERE 27 | year > 2000 28 | GROUP BY 29 | year 30 | ''' 31 | client = bigquery.Client( 32 | project='$PROJECT_ID', 33 | credentials=credentials) 34 | print(client.query(query).to_dataframe()) 35 | " > query.py 36 | 37 | # The application now uses the permissions that are associated with this service account. 
Run the query with the following Python command 38 | python3 query.py 39 | 40 | -------------------------------------------------------------------------------- /Service Directory: Qwik Start/techcps732.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud services enable servicedirectory.googleapis.com 6 | 7 | sleep 5 8 | 9 | gcloud service-directory namespaces create example-namespace --project=$DEVSHELL_PROJECT_ID --location=$REGION 10 | 11 | gcloud service-directory services create example-service --project=$DEVSHELL_PROJECT_ID --location=$REGION --namespace=example-namespace 12 | 13 | gcloud service-directory endpoints create example-endpoint --location=$REGION --project=$DEVSHELL_PROJECT_ID --namespace=example-namespace --service=example-service --address=0.0.0.0 --port=80 14 | 15 | gcloud dns --project=$DEVSHELL_PROJECT_ID managed-zones create example-zone-name --description="subscribe to techcps" --dns-name="myzone.example.com." --visibility="private" --networks="https://compute.googleapis.com/compute/v1/projects/$DEVSHELL_PROJECT_ID/global/networks/default" --service-directory-namespace="https://servicedirectory.googleapis.com/v1/projects/$DEVSHELL_PROJECT_ID/locations/$REGION/namespaces/example-namespace" 16 | 17 | -------------------------------------------------------------------------------- /Set Up an App Dev Environment on Google Cloud: Challenge Lab/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 3 | 4 | 5 | ## 🚨 Export the variables name correctly 6 | 7 | ``` 8 | export ZONE= 9 | 10 | export TOPIC_NAME= 11 | 12 | export FUNCTION_NAME= 13 | 14 | export USER2= 15 | ``` 16 | 17 | ## 🚨 Copy and run the below commands 18 | 19 | ``` 20 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Set%20Up%20an%20App%20Dev%20Environment%20on%20Google%20Cloud%3A%20Challenge%20Lab/techcps315.sh 21 | sudo chmod +x techcps315.sh 22 | ./techcps315.sh 23 | ``` 24 | 25 | ## Congratulations, you're all done with the lab 😄 26 | 27 | # Thanks for watching :) 28 | 29 | -------------------------------------------------------------------------------- /Setting Up Cost Control with Quota/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | ## 💡 Lab Link: [Setting Up Cost Control with Quota - GSP651](https://www.cloudskillsboost.google/focuses/7847?parent=catalog) 3 | 4 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/EpJtATA9BZ8) 5 | 6 | --- 7 | 8 | ## 🚨Copy and run the below commands in Cloud Shell: 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Setting%20Up%20Cost%20Control%20with%20Quota/techcps651.sh 12 | sudo chmod +x techcps651.sh 13 | ./techcps651.sh 14 | ``` 15 | 16 | ## Congratulations, you're all done with the lab 😄 17 | 18 | --- 19 | 20 | ### 🌐 Join our Community 21 | 22 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 23 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 24 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 25 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 26 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 27 | 28 | ### Thanks for 
watching :) 29 | -------------------------------------------------------------------------------- /Setting Up Cost Control with Quota/techcps651.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | 4 | bq query --use_legacy_sql=false \ 5 | " 6 | SELECT 7 | w1mpro_ep, 8 | mjd, 9 | load_id, 10 | frame_id 11 | FROM 12 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 13 | ORDER BY 14 | mjd ASC 15 | LIMIT 500 16 | " 17 | 18 | gcloud alpha services quota list --service=bigquery.googleapis.com --consumer=projects/${DEVSHELL_PROJECT_ID} --filter="usage" 19 | 20 | 21 | gcloud alpha services quota update --consumer=projects/${DEVSHELL_PROJECT_ID} --service bigquery.googleapis.com --metric bigquery.googleapis.com/quota/query/usage --value 262144 --unit 1/d/{project}/{user} --force 22 | 23 | 24 | gcloud alpha services quota list --service=bigquery.googleapis.com --consumer=projects/${DEVSHELL_PROJECT_ID} --filter="usage" 25 | 26 | 27 | bq query --use_legacy_sql=false --nouse_cache \ 28 | " 29 | SELECT 30 | w1mpro_ep, 31 | mjd, 32 | load_id, 33 | frame_id 34 | FROM 35 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 36 | ORDER BY 37 | mjd ASC 38 | LIMIT 500 39 | " 40 | 41 | 42 | 43 | bq query --use_legacy_sql=false --nouse_cache \ 44 | " 45 | SELECT 46 | w1mpro_ep, 47 | mjd, 48 | load_id, 49 | frame_id 50 | FROM 51 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 52 | ORDER BY 53 | mjd ASC 54 | LIMIT 500 55 | " 56 | 57 | sleep 5 58 | 59 | bq query --use_legacy_sql=false --nouse_cache \ 60 | " 61 | SELECT 62 | w1mpro_ep, 63 | mjd, 64 | load_id, 65 | frame_id 66 | FROM 67 | \`bigquery-public-data.wise_all_sky_data_release.mep_wise\` 68 | ORDER BY 69 | mjd ASC 70 | LIMIT 500 71 | " 72 | 73 | 74 | 75 | -------------------------------------------------------------------------------- /Setting up Jenkins on Kubernetes Engine/techcps117.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | export PROJECT_ID=$(gcloud config get-value project) 6 | 7 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 8 | 9 | gcloud config set compute/zone $ZONE 10 | 11 | git clone https://github.com/GoogleCloudPlatform/continuous-deployment-on-kubernetes.git 12 | 13 | cd continuous-deployment-on-kubernetes 14 | 15 | gcloud container clusters create jenkins-cd \ 16 | --num-nodes 2 --zone=$ZONE \ 17 | --scopes "https://www.googleapis.com/auth/projecthosting,cloud-platform" 18 | 19 | gcloud container clusters list 20 | 21 | gcloud container clusters get-credentials jenkins-cd 22 | 23 | kubectl cluster-info 24 | 25 | helm repo add jenkins https://charts.jenkins.io 26 | 27 | helm repo update 28 | 29 | helm upgrade --install -f jenkins/values.yaml myjenkins jenkins/jenkins 30 | 31 | 32 | -------------------------------------------------------------------------------- /Speaking with a Webpage - Streaming Speech Transcripts/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud compute instances create speaking-with-a-webpage --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-medium --image=debian-11-bullseye-v20230509 --image-project=debian-cloud --scopes=https://www.googleapis.com/auth/cloud-platform --tags=http-server,https-server 6 | 7 | 8 | sleep 15 9 | 10 | gcloud compute ssh "speaking-with-a-webpage" --zone=$ZONE --project=$DEVSHELL_PROJECT_ID --quiet --command "sudo apt update && 
sudo apt install git -y && sudo apt-get install -y maven openjdk-11-jdk && git clone https://github.com/googlecodelabs/speaking-with-a-webpage.git && gcloud compute firewall-rules create dev-ports --allow=tcp:8443 --source-ranges=0.0.0.0/0 && cd ~/speaking-with-a-webpage/01-hello-https && mvn clean jetty:run" 11 | -------------------------------------------------------------------------------- /Speaking with a Webpage - Streaming Speech Transcripts/techcps1.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud compute ssh "speaking-with-a-webpage" --zone=$ZONE --project=$DEVSHELL_PROJECT_ID --quiet --command "pkill -f 'java.*jetty'" 4 | 5 | sleep 10 6 | 7 | gcloud compute ssh "speaking-with-a-webpage" --zone=$ZONE --project=$DEVSHELL_PROJECT_ID --quiet --command "cd ~/speaking-with-a-webpage/02-webaudio && mvn clean jetty:run" 8 | 9 | 10 | -------------------------------------------------------------------------------- /Speaking with a Webpage - Streaming Speech Transcripts/techcps125.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | ## EXPORT the zone 6 | ``` 7 | export ZONE= 8 | ``` 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Speaking%20with%20a%20Webpage%20-%20Streaming%20Speech%20Transcripts/techcps.sh 12 | sudo chmod +x techcps.sh 13 | ./techcps.sh 14 | ``` 15 | ## Press CTRL+C to stop the server when you see this kind of output 16 | 17 | ![techcps](https://github.com/Techcps/GSP-Short-Trick/assets/104138529/96ddd4e2-f7b1-4c37-aa95-a6f3eb9ff11b) 18 | 19 | ## Check your progress on task 1-3 && do not run the next command until you get the score 20 | 21 | ## EXPORT the zone 22 | ``` 23 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Speaking%20with%20a%20Webpage%20-%20Streaming%20Speech%20Transcripts/techcps1.sh 24 | sudo chmod +x techcps1.sh 25 | ./techcps1.sh 26 | ``` 27 | 28 | ## Congratulations, you're all done with the lab 😄 29 | 30 | # Thanks for watching :) 31 | -------------------------------------------------------------------------------- /Speech to Text Transcription with the Cloud Speech API/tehcps048.md: -------------------------------------------------------------------------------- 1 | 2 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 3 | 4 | 5 | ``` 6 | gcloud services enable apikeys.googleapis.com 7 | gcloud alpha services api-keys create --display-name="techcps" 8 | CP=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=techcps") 9 | API_KEY=$(gcloud alpha services api-keys get-key-string $CP --format="value(keyString)") 10 | 11 | cat > request.json < result.json 25 | cat result.json 26 | ``` 27 | 28 | ## 🚨 Check your progress on Task 1-3 29 | ## ❌ Do not run the next command until you get the score on Task 1-3 30 | 31 | ``` 32 | cat > request.json < result.json 46 | cat result.json 47 | ``` 48 | 49 | ## Congratulations, you're all done with the lab 😄 50 | 51 | # Thanks for watching :) 52 | -------------------------------------------------------------------------------- /Speech-to-Text API: Qwik Start/techcps119.sh: -------------------------------------------------------------------------------- 
1 | 2 | 3 | cat > request.json < result.json 20 | 21 | -------------------------------------------------------------------------------- /Streaming HL7 to FHIR Data with Dataflow and the Healthcare API/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 4 | 5 | ``` 6 | export ZONE= 7 | ``` 8 | ``` 9 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Streaming%20HL7%20to%20FHIR%20Data%20with%20Dataflow%20and%20the%20Healthcare%20API/techcps894.sh 10 | sudo chmod +x techcps894.sh 11 | ./techcps894.sh 12 | 13 | ``` 14 | 15 | ## Congratulations, you're all done with the lab 😄 16 | 17 | # Thanks for watching :) 18 | -------------------------------------------------------------------------------- /Tagging Dataplex Assets/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## 💡 Lab Link: [Tagging Dataplex Assets - GSP1145](https://www.cloudskillsboost.google/focuses/62711?parent=catalog) 4 | 5 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/Qn0QCuninxs) 6 | 7 | --- 8 | 9 | ## 🚨Export the REGION Name correctly 10 | ``` 11 | export REGION= 12 | ``` 13 | 14 | ## 🚨Copy and run the below commands in Cloud Shell: 15 | 16 | ``` 17 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Tagging%20Dataplex%20Assets/techcps1145.sh 18 | sudo chmod +x techcps1145.sh 19 | ./techcps1145.sh 20 | ``` 21 | 22 | ## Congratulations, you're all done with the lab 😄 23 | 24 | --- 25 | 26 | ### 🌐 Join our Community 27 | 28 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 29 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 30 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 31 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 32 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 33 | 34 | ### Thanks for watching :) 35 | 36 | -------------------------------------------------------------------------------- /Tagging Dataplex Assets/techcps1145.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud config set compute/region $REGION 6 | 7 | export PROJECT_ID=$(gcloud config get-value project) 8 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 9 | 10 | gcloud services enable dataplex.googleapis.com datacatalog.googleapis.com --project=$DEVSHELL_PROJECT_ID 11 | 12 | sleep 10 13 | 14 | gcloud dataplex lakes create orders-lake --location=$REGION --display-name="Orders Lake" 15 | 16 | gcloud dataplex zones create customer-curated-zone --location=$REGION --display-name="Customer Curated Zone" --lake=orders-lake --resource-location-type=SINGLE_REGION --type=CURATED --discovery-enabled --discovery-schedule="0 * * * *" 17 | 18 | gcloud dataplex assets create customer-details-dataset --location=$REGION --display-name="Customer Details Dataset" --lake=orders-lake --zone=customer-curated-zone --resource-type=BIGQUERY_DATASET --resource-name=projects/$DEVSHELL_PROJECT_ID/datasets/customers --discovery-enabled 19 | 20 | gcloud data-catalog tag-templates create protected_data_template --location=$REGION --display-name="Protected Data Template" --field=id=protected_data_flag,display-name="Protected Data Flag",type='enum(YES|NO)' 21 
| 22 | 23 | echo "Click the below link, Please like share and subscribe to Techcps" 24 | 25 | echo "https://console.cloud.google.com/dataplex/search?project=$DEVSHELL_PROJECT_ID&cloudshell=true&q=customer_details" 26 | 27 | -------------------------------------------------------------------------------- /TensorFlow: Qwik Start/techcps637.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | pip install google-cloud-logging 4 | 5 | pip install --upgrade protobuf 6 | 7 | pip install --upgrade tensorflow 8 | 9 | python --version 10 | 11 | python -c "import tensorflow;print(tensorflow.__version__)" 12 | 13 | 14 | cat > model.py <<'EOF_CP' 15 | import logging 16 | import google.cloud.logging as cloud_logging 17 | from google.cloud.logging.handlers import CloudLoggingHandler 18 | from google.cloud.logging_v2.handlers import setup_logging 19 | 20 | cloud_logger = logging.getLogger('cloudLogger') 21 | cloud_logger.setLevel(logging.INFO) 22 | cloud_logger.addHandler(CloudLoggingHandler(cloud_logging.Client())) 23 | cloud_logger.addHandler(logging.StreamHandler()) 24 | 25 | import tensorflow as tf 26 | import numpy as np 27 | 28 | xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float) 29 | ys = np.array([-2.0, 1.0, 4.0, 7.0, 10.0, 13.0], dtype=float) 30 | 31 | model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, input_shape=[1])]) 32 | 33 | model.compile(optimizer=tf.keras.optimizers.SGD(), loss=tf.keras.losses.MeanSquaredError()) 34 | 35 | model.fit(xs, ys, epochs=500) 36 | cloud_logger.info(str(model.predict(np.array([10.0])))) 37 | EOF_CP 38 | 39 | python model.py 40 | 41 | -------------------------------------------------------------------------------- /Terraform Fundamentals/techcpsgsp156.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | terraform 4 | 5 | cat > instance.tf < { 4 | try { 5 | // Create the unsigned transaction object 6 | const unsignedTransaction = { 7 | nonce: 0, 8 | gasLimit: 21000, 9 | gasPrice: '0x09184e72a000', 10 | to: '0x0000000000000000000000000000000000000000', 11 | value: '0x00', 12 | data: '0x', 13 | }; 14 | 15 | // Sign the transaction 16 | const signedTransaction = await signTransaction(unsignedTransaction); 17 | 18 | // Submit the transaction to Ganache 19 | const transaction = await submitTransaction(signedTransaction); 20 | 21 | // Write the transaction receipt 22 | uploadFromMemory(transaction); 23 | 24 | return transaction; 25 | } catch (e) { 26 | console.log(e); 27 | uploadFromMemory(e); 28 | } 29 | }; 30 | 31 | await signAndSubmitTransaction(); 32 | -------------------------------------------------------------------------------- /Transacting Digital Assets with Multi-Party Computation and Confidential Space/mpc-ethereum-demo/kms-decrypt.js: -------------------------------------------------------------------------------- 1 | import {KeyManagementServiceClient} from '@google-cloud/kms'; 2 | import {credentialConfig} from './credential-config.js'; 3 | 4 | import crc32c from 'fast-crc32c'; 5 | 6 | const projectId = process.env.MPC_PROJECT_ID; 7 | const locationId = 'global'; 8 | const keyRingId = 'mpc-keys'; 9 | const keyId = 'mpc-key'; 10 | 11 | // Instantiates a client 12 | const client = new KeyManagementServiceClient({ 13 | credentials: credentialConfig, 14 | }); 15 | 16 | // Build the key name 17 | const keyName = client.cryptoKeyPath(projectId, locationId, keyRingId, keyId); 18 | 19 | export const decryptSymmetric = async (ciphertext) => {
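// Integrity-checked decryption, per the Cloud KMS data-integrity guidelines
// linked below: a CRC32C checksum of the ciphertext is sent with the decrypt
// request so KMS can verify it arrived intact, and the returned plaintext is
// re-checksummed against decryptResponse.plaintextCrc32c before it is used.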
20 | const ciphertextCrc32c = crc32c.calculate(ciphertext); 21 | const [decryptResponse] = await client.decrypt({ 22 | name: keyName, 23 | ciphertext, 24 | ciphertextCrc32c: { 25 | value: ciphertextCrc32c, 26 | }, 27 | }); 28 | 29 | // Optional, but recommended: perform integrity verification on decryptResponse. 30 | // For more details on ensuring E2E in-transit integrity to and from Cloud KMS visit: 31 | // https://cloud.google.com/kms/docs/data-integrity-guidelines 32 | if ( 33 | crc32c.calculate(decryptResponse.plaintext) !== 34 | Number(decryptResponse.plaintextCrc32c.value) 35 | ) { 36 | throw new Error('Decrypt: response corrupted in-transit'); 37 | } 38 | 39 | const plaintext = decryptResponse.plaintext.toString(); 40 | 41 | return plaintext; 42 | }; 43 | -------------------------------------------------------------------------------- /Transacting Digital Assets with Multi-Party Computation and Confidential Space/mpc-ethereum-demo/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "gcp-mpc-ethereum-demo", 3 | "version": "1.0.0", 4 | "description": "Demo for GCP multi-party-compute on Confidential Space", 5 | "main": "index.js", 6 | "scripts": { 7 | "start": "node index.js" 8 | }, 9 | "type": "module", 10 | "dependencies": { 11 | "@google-cloud/kms": "^3.2.0", 12 | "@google-cloud/storage": "^6.9.2", 13 | "ethers": "^5.7.2", 14 | "fast-crc32c": "^2.0.0" 15 | }, 16 | "author": "", 17 | "license": "ISC" 18 | } 19 | -------------------------------------------------------------------------------- /Translate Text with the Cloud Translation API/techcps049.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | gcloud auth list 4 | 5 | gcloud services enable apikeys.googleapis.com 6 | 7 | gcloud alpha services api-keys create --display-name="techcps" 8 | 9 | KEY_NAME=$(gcloud alpha services api-keys list --format="value(name)" --filter "displayName=techcps") 10 | 11 | API_KEY=$(gcloud alpha services api-keys get-key-string $KEY_NAME --format="value(keyString)") 12 | 13 | echo $API_KEY 14 | 15 | -------------------------------------------------------------------------------- /Troubleshooting Common SQL Errors with BigQuery/techcps.md: -------------------------------------------------------------------------------- 1 | 2 | ## 💡 Lab Link: [Troubleshooting Common SQL Errors with BigQuery - GSP408](https://www.cloudskillsboost.google/focuses/3642?parent=catalog) 3 | 4 | ## 🚀 Lab Solution [Watch Here](https://youtu.be/MkDRIeWszsI) 5 | 6 | --- 7 | 8 | ## 🚨Copy and run the below commands in Cloud Shell: 9 | 10 | ``` 11 | curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Troubleshooting%20Common%20SQL%20Errors%20with%20BigQuery/techcps408.sh 12 | sudo chmod +x techcps408.sh 13 | ./techcps408.sh 14 | ``` 15 | 16 | ### Congratulations, you're all done with the lab 😄 17 | 18 | --- 19 | 20 | ### 🌐 Join our Community 21 | 22 | - **Join our [Discussion Group](https://t.me/Techcpschat)** icon 23 | - **Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)** icon 24 | - **Join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)** icon 25 | - **Join our [Telegram Channel](https://t.me/Techcps)** icon 26 | - **Follow us on [LinkedIn](https://www.linkedin.com/company/techcps/)** icon 27 | 28 | ### Thanks for watching :) 29 | -------------------------------------------------------------------------------- /Use Charts in Google Sheets/On the Rise 
Bakery Sales and Locations.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Use Charts in Google Sheets/On the Rise Bakery Sales and Locations.xlsx -------------------------------------------------------------------------------- /Use Charts in Google Sheets/On the Rise Bakery.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Use Charts in Google Sheets/On the Rise Bakery.pptx -------------------------------------------------------------------------------- /Use Functions, Formulas, and Charts in Google Sheets Challenge Lab/On the Rise Bakery Business Challenge.xlsx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Use Functions, Formulas, and Charts in Google Sheets Challenge Lab/On the Rise Bakery Business Challenge.xlsx -------------------------------------------------------------------------------- /Use Functions, Formulas, and Charts in Google Sheets Challenge Lab/Staff Roles.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Use Functions, Formulas, and Charts in Google Sheets Challenge Lab/Staff Roles.pptx -------------------------------------------------------------------------------- /Use Go Code to Work with Google Cloud Data Sources/techcpsgsp701.sh: -------------------------------------------------------------------------------- 1 | 2 | gcloud auth list 3 | gcloud config list project 4 | export PROJECT_ID=$(gcloud info --format="value(config.project)") 5 | go version 6 | git clone https://github.com/GoogleCloudPlatform/DIY-Tools.git 7 | gcloud firestore import gs://$PROJECT_ID-firestore/prd-back 8 | export PROJECT_ID=$(gcloud info --format="value(config.project)") 9 | export PREVIEW_URL=[REPLACE_WITH_WEB_PREVIEW_URL] 10 | echo $PREVIEW_URL/fs/$PROJECT_ID/symbols/product/symbol 11 | # please like share & subscribe to techcps 12 | cd ~/DIY-Tools/gcp-data-drive/cmd/webserver 13 | gcloud app deploy app.yaml --project $PROJECT_ID -q 14 | export TARGET_URL=https://$(gcloud app describe --format="value(defaultHostname)") 15 | curl $TARGET_URL/fs/$PROJECT_ID/symbols/product/symbol 16 | curl $TARGET_URL/fs/$PROJECT_ID/symbols/product/symbol/008888166900 17 | curl $TARGET_URL/bq/$PROJECT_ID/publicviews/ca_zip_codes 18 | -------------------------------------------------------------------------------- /Using BigQuery and Cloud Logging to Analyze BigQuery Usage/techcpsgsp617.md: -------------------------------------------------------------------------------- 1 | 2 | # Using BigQuery and Cloud Logging to Analyze BigQuery Usage [GSP617] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Channel](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A) 5 | 6 | ``` 7 | bq mk bq_logs 8 | bq query --use_legacy_sql=false "SELECT current_date()" 9 | ``` 10 | ``` 11 | resource.type="bigquery_resource" 12 | protoPayload.methodName="jobservice.jobcompleted" 13 | ``` 14 | # Create Sink name: JobComplete 15 | ``` 16 | curl -LO 
raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Using%20BigQuery%20and%20Cloud%20Logging%20to%20Analyze%20BigQuery%20Usage/techcps.sh 17 | sudo chmod +x techcps.sh 18 | ./techcps.sh 19 | ``` 20 | 21 | ## Congratulations, you're all done with the lab 😄 22 | 23 | # Thanks for watching :) 24 | -------------------------------------------------------------------------------- /Using BigQuery in the Google Cloud Console GSP406/code.md: -------------------------------------------------------------------------------- 1 | 2 | # Using BigQuery in the Google Cloud Console [GSP406] 3 | 4 | # Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) 5 | 6 | * In the GCP Console open the Cloud Shell and upload the downloaded file & run the following commands: 7 | 8 | ``` 9 | sudo chmod +x techcpsgsp406.sh 10 | ./techcpsgsp406.sh 11 | ``` 12 | 13 | # Congratulations, you're all done with the lab 😄 14 | 15 | # Thanks for watching :) 16 | -------------------------------------------------------------------------------- /Using BigQuery in the Google Cloud Console GSP406/techcpsgsp406.sh: -------------------------------------------------------------------------------- 1 | 2 | bq query --use_legacy_sql=false \ 3 | " 4 | SELECT 5 | name, gender, 6 | SUM(number) AS total 7 | FROM 8 | \`bigquery-public-data.usa_names.usa_1910_2013\` 9 | GROUP BY 10 | name, gender 11 | ORDER BY 12 | total DESC 13 | LIMIT 14 | 10 15 | " 16 | bq mk babynames 17 | bq mk --table \ 18 | --schema "name:string,count:integer,gender:string" \ 19 | $DEVSHELL_PROJECT_ID:babynames.names_2014 20 | 21 | bq query --use_legacy_sql=false \ 22 | " 23 | SELECT 24 | name, count 25 | FROM 26 | \`babynames.names_2014\` 27 | WHERE 28 | gender = 'M' 29 | ORDER BY count DESC LIMIT 5 30 | " 31 | 32 | 33 | -------------------------------------------------------------------------------- /Using Cloud PubSub with Cloud Run APPRUN/techcps.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | gcloud auth list 5 | 6 | export PROJECT_ID=$(gcloud config get-value project) 7 | 8 | export PROJECT_ID=$DEVSHELL_PROJECT_ID 9 | 10 | gcloud services disable pubsub.googleapis.com --force 11 | 12 | 13 | 14 | sleep 20 15 | 16 | 17 | 18 | 19 | gcloud services enable pubsub.googleapis.com --project=$DEVSHELL_PROJECT_ID 20 | gcloud services enable run.googleapis.com --project=$DEVSHELL_PROJECT_ID 21 | 22 | sleep 20 23 | 24 | gcloud config set compute/region $REGION 25 | 26 | 27 | gcloud run deploy store-service \ 28 | --image gcr.io/qwiklabs-resources/gsp724-store-service \ 29 | --region $REGION \ 30 | --allow-unauthenticated 31 | 32 | 33 | gcloud run deploy order-service \ 34 | --image gcr.io/qwiklabs-resources/gsp724-order-service \ 35 | --region $REGION \ 36 | --no-allow-unauthenticated 37 | 38 | 39 | gcloud pubsub topics create ORDER_PLACED 40 | 41 | gcloud iam service-accounts create pubsub-cloud-run-invoker \ 42 | --display-name "Order Initiator" 43 | 44 | gcloud iam service-accounts list --filter="Order Initiator" 45 | 46 | sleep 20 47 | 48 | gcloud run services add-iam-policy-binding order-service --region $REGION \ 49 | --member=serviceAccount:pubsub-cloud-run-invoker@$GOOGLE_CLOUD_PROJECT.iam.gserviceaccount.com \ 50 | --role=roles/run.invoker --platform managed 51 | 52 | PROJECT_NUMBER=$(gcloud projects list \ 53 | --filter="qwiklabs-gcp" \ 54 | --format='value(PROJECT_NUMBER)') 55 | 56 | 57 | sleep 30 58 | 59 | 60 | ORDER_SERVICE_URL=$(gcloud run services describe order-service \ 61 | --region
ORDER_SERVICE_URL=$(gcloud run services describe order-service \
  --region $REGION \
  --format="value(status.address.url)")

gcloud pubsub subscriptions create order-service-sub \
  --topic ORDER_PLACED \
  --push-endpoint=$ORDER_SERVICE_URL \
  --push-auth-service-account=pubsub-cloud-run-invoker@$GOOGLE_CLOUD_PROJECT.iam.gserviceaccount.com
--------------------------------------------------------------------------------
/Using Cloud Trace on Kubernetes Engine/techcpsgsp484.sh:
--------------------------------------------------------------------------------

# Requires ZONE to be set first, e.g. export ZONE=us-central1-a
export REGION=${ZONE%-*}

gcloud config set compute/region $REGION
gcloud config set compute/zone $ZONE

git clone https://github.com/GoogleCloudPlatform/gke-tracing-demo
cd gke-tracing-demo/terraform

# Drop the pinned provider version so terraform init can select a current one
sed -i '/version = "~> 2.10.0"/d' provider.tf
terraform init
../scripts/generate-tfvars.sh
terraform plan
terraform apply -auto-approve

kubectl apply -f tracing-demo-deployment.yaml
echo http://$(kubectl get svc tracing-demo -n default -o jsonpath='{.status.loadBalancer.ingress[0].ip}')?string=CustomMessage
--------------------------------------------------------------------------------
/VPC Flow Logs - Analyzing Network Traffic/techcps.md:
--------------------------------------------------------------------------------

# Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps) & join our [WhatsApp Community](https://whatsapp.com/channel/0029Va9nne147XeIFkXYv71A)

# Set ZONE
```
export ZONE=
```

```
curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/VPC%20Flow%20Logs%20-%20Analyzing%20Network%20Traffic/techcps212.sh
sudo chmod +x techcps212.sh
./techcps212.sh
```

# Generate traffic against the web server so the flow logs have something to record
```
CP_IP=$(gcloud compute instances describe web-server --zone=$ZONE --format='get(networkInterfaces[0].accessConfigs[0].natIP)')

export MY_SERVER=$CP_IP

for ((i=1;i<=50;i++)); do curl $MY_SERVER; done
```

## Congratulations, you're all done with the lab 😄

# Thanks for watching :)
--------------------------------------------------------------------------------
/VPC Network Peering/techcps193.sh:
--------------------------------------------------------------------------------

# Requires ZONE_1/ZONE_2 and PROJECT_ID1/PROJECT_ID2 to be set first
export REGION_1="${ZONE_1%-*}"
export REGION_2="${ZONE_2%-*}"

gcloud config set project $PROJECT_ID1

# Create the custom network, subnet, VM, and firewall rule in project A
gcloud compute networks create network-a --subnet-mode custom

gcloud compute networks subnets create network-a-subnet --network network-a \
  --range 10.0.0.0/16 --region $REGION_1

sleep 5

gcloud compute instances create vm-a --zone $ZONE_1 --network network-a --subnet network-a-subnet --machine-type e2-small

gcloud compute firewall-rules create network-a-fw --network network-a --allow tcp:22,icmp

# Second terminal
gcloud config set project $PROJECT_ID2

# Create the custom network, subnet, VM, and firewall rule in project B
gcloud compute networks create network-b --subnet-mode custom

gcloud compute networks subnets create network-b-subnet --network network-b \
  --range 10.8.0.0/16 --region $REGION_2

sleep 5

gcloud compute instances create vm-b --zone $ZONE_2 --network network-b --subnet network-b-subnet --machine-type e2-small

gcloud compute firewall-rules create network-b-fw --network network-b --allow tcp:22,icmp
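# A peering only becomes ACTIVE once both sides below have created it; while
# only peer-ab exists it shows as INACTIVE. To verify the state afterwards,
# for example:
#   gcloud compute networks peerings list --network=network-a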
# First terminal
gcloud config set project $PROJECT_ID1

gcloud compute networks peerings create peer-ab \
  --network=network-a \
  --peer-project=$PROJECT_ID2 \
  --peer-network=network-b

# Second terminal
gcloud config set project $PROJECT_ID2

gcloud compute networks peerings create peer-ba \
  --network=network-b \
  --peer-project=$PROJECT_ID1 \
  --peer-network=network-a
--------------------------------------------------------------------------------
/VPC Networking Fundamentals/techcps210.sh:
--------------------------------------------------------------------------------

gcloud auth list

# Requires ZONE to be set first, e.g. export ZONE=us-central1-a

gcloud compute networks create mynetwork --project=$DEVSHELL_PROJECT_ID --subnet-mode=auto --bgp-routing-mode=regional --mtu=1460

gcloud compute instances create mynet-us-vm --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-micro --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=mynetwork --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --create-disk=auto-delete=yes,boot=yes,device-name=mynet-us-vm,image=projects/debian-cloud/global/images/debian-11-bullseye-v20230509,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any

gcloud compute instances create mynet-second-vm --project=$DEVSHELL_PROJECT_ID --zone=$ZONE --machine-type=e2-micro --network-interface=network-tier=PREMIUM,stack-type=IPV4_ONLY,subnet=mynetwork --metadata=enable-oslogin=true --maintenance-policy=MIGRATE --provisioning-model=STANDARD --create-disk=auto-delete=yes,boot=yes,device-name=mynet-second-vm,image=projects/debian-cloud/global/images/debian-11-bullseye-v20230509,mode=rw,size=10,type=projects/$DEVSHELL_PROJECT_ID/zones/$ZONE/diskTypes/pd-balanced --no-shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring --labels=goog-ec-src=vm_add-gcloud --reservation-affinity=any
--------------------------------------------------------------------------------
/Validate Data in Google Sheets/techcps1062.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Techcps/GSP-Short-Trick/0bed5c988e3db8690d4ca5a0bbcb7d6560076817/Validate Data in Google Sheets/techcps1062.xlsx
--------------------------------------------------------------------------------
/Validating Policies for Terraform on Google Cloud/techcpsgsp1021.md:
--------------------------------------------------------------------------------

# Validating Policies for Terraform on Google Cloud [GSP1021]

# Please like share & subscribe to [Techcps](https://www.youtube.com/@techcps)

```
gcloud auth list
gcloud config list project
git clone https://github.com/GoogleCloudPlatform/policy-library.git
cd policy-library/
cp samples/iam_service_accounts_only.yaml policies/constraints
cat policies/constraints/iam_service_accounts_only.yaml

cat > main.tf <<EOF
...
EOF

... > ./tfplan.json
sudo apt-get install google-cloud-sdk-terraform-tools
gcloud beta terraform vet tfplan.json --policy-library=.
cd policies/constraints

cat > iam_service_accounts_only.yaml <<EOF
...
```
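For reference, the standard `gcloud beta terraform vet` flow that produces the `tfplan.json` consumed above, as a minimal sketch (the plan file name `test.tfplan` is an assumption, not taken from the lab):

```
terraform init
terraform plan -out=test.tfplan
# Convert the binary plan to the JSON format that terraform vet expects
terraform show -json ./test.tfplan > ./tfplan.json
gcloud beta terraform vet tfplan.json --policy-library=.
```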
--------------------------------------------------------------------------------
/Weather Data in BigQuery/techcps.md:
--------------------------------------------------------------------------------

# Weather Data in BigQuery

* Go to BigQuery > Click Create dataset named: demos

```
SELECT
  -- Create a timestamp from the date components.
  timestamp(concat(year,"-",mo,"-",da)) as timestamp,
  -- Replace numerical null values with actual nulls
  AVG(IF (temp=9999.9, null, temp)) AS temperature,
  AVG(IF (visib=999.9, null, visib)) AS visibility,
  AVG(IF (wdsp="999.9", null, CAST(wdsp AS Float64))) AS wind_speed,
  AVG(IF (gust=999.9, null, gust)) AS wind_gust,
  AVG(IF (prcp=99.99, null, prcp)) AS precipitation,
  AVG(IF (sndp=999.9, null, sndp)) AS snow_depth
FROM
  `bigquery-public-data.noaa_gsod.gsod20*`
WHERE
  CAST(YEAR AS INT64) > 2008
  AND (stn="725030" OR -- La Guardia
       stn="744860") -- JFK
GROUP BY timestamp
```

# NOTE: In the query EDITOR section, click More > Query settings.

> Destination: select Set a destination table for query results

> Dataset: Type demos and select your dataset.

> Table Id: Type nyc_weather

> Results size: check Allow large results (no size limit)

> Click SAVE and RUN the query

# In the GCP Console, activate your Cloud Shell and run the following commands:

```
curl -LO raw.githubusercontent.com/Techcps/GSP-Short-Trick/master/Weather%20Data%20in%20BigQuery/techcps.sh
sudo chmod +x techcps.sh
./techcps.sh
```

## Congratulations, you're all done with the lab 😄

# Thanks for watching :)
--------------------------------------------------------------------------------
/Web Security Scanner: Qwik Start/techcps112.sh:
--------------------------------------------------------------------------------

# Fetch the sample Flask app and retarget it at the Python 3.9 runtime
gsutil -m cp -r gs://spls/gsp067/python-docs-samples .

cd python-docs-samples/appengine/standard_python3/hello_world

sed -i "s/python37/python39/g" app.yaml

cat > requirements.txt <<EOF
...
EOF

cat > app.yaml <