
Perform Foundational Data, ML, and AI Tasks in Google Cloud: Challenge Lab [GSP323]


Link to the Video: https://youtu.be/lkI8WscliSs

STEPS

(the exact names, regions, and values will be unique for everyone — use the ones shown on your lab page)


Storage → Buckets → Create bucket with your Cloud Storage Bucket Name. For Access control select Fine-grained → press Create
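If you prefer Cloud Shell, a roughly equivalent sketch (YOUR_BUCKET_NAME is a placeholder for your lab's bucket name):

# create the bucket from Cloud Shell; -b off disables uniform
# bucket-level access, i.e. keeps access control fine-grained
gsutil mb -b off gs://YOUR_BUCKET_NAME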


BigQuery → select your project → Create dataset → name = BigQuery Dataset Name
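Or from Cloud Shell (DATASET_NAME stands in for your BigQuery Dataset Name):

# create the dataset in the current project
bq mk DATASET_NAME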


open cloud shell


gsutil cp gs://cloud-training/gsp323/lab.csv .


cat lab.csv


gsutil cp gs://cloud-training/gsp323/lab.schema .


cat lab.schema


copy the printed BigQuery schema and save it somewhere; you will paste it into the table editor shortly


go to bigquery

go to dataset lab

create table

Create table from = Google Cloud Storage

select file from GCS bucket = cloud-training/gsp323/lab.csv

table name = customers

enable Edit as text

paste the BigQuery schema that you copied earlier

press Create
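As an alternative to the console form, a hedged Cloud Shell sketch, assuming your dataset is named lab and lab.schema is still in your home directory from the earlier step:

# load the CSV into a new table using the downloaded schema file
# (add --skip_leading_rows=1 if the file turns out to have a header row)
bq load --source_format=CSV lab.customers gs://cloud-training/gsp323/lab.csv lab.schema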


Go to dataproc

clusters

create cluster → Cluster on Compute Engine → Create

Region = the Region given in your lab instructions

leave the rest at their default values and press Create
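The same cluster can be created from Cloud Shell; a minimal sketch (the cluster name here is hypothetical, and REGION is your lab's Region):

# create a Dataproc cluster on Compute Engine with default settings
gcloud dataproc clusters create example-cluster --region=REGION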

click on the cluster name

go to vm instances

open SSH on the VM instance that has the same name as the cluster and the role "Master"


run this command in the SSH session:

hdfs dfs -cp gs://cloud-training/gsp323/data.txt /data.txt
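To confirm the copy landed, you can list the HDFS root:

hdfs dfs -ls /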


Go to dataflow

Go to jobs

create job



job name = lab-transform

region = Region

dataflow template = Text Files on Cloud Storage to BigQuery [BATCH]

JavaScript UDF path in Cloud Storage = gs://cloud-training/gsp323/lab.js

JSON path = gs://cloud-training/gsp323/lab.schema

JavaScript UDF name = transform

BigQuery output table = BigQuery Output Table (unique for everyone)

Cloud Storage input path = gs://cloud-training/gsp323/lab.csv

Temporary BigQuery directory = gs://Cloud Storage Bucket Name/bigquery_temp

Temporary location = gs://Cloud Storage Bucket Name/temp


RUN JOB
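If the console form misbehaves, here is a hedged gcloud sketch of the same job (parameter names follow the public GCS_Text_to_BigQuery template; REGION, YOUR_BUCKET_NAME, and YOUR_BIGQUERY_OUTPUT_TABLE are placeholders for your lab's values):

gcloud dataflow jobs run lab-transform \
  --region=REGION \
  --gcs-location=gs://dataflow-templates/latest/GCS_Text_to_BigQuery \
  --staging-location=gs://YOUR_BUCKET_NAME/temp \
  --parameters=javascriptTextTransformGcsPath=gs://cloud-training/gsp323/lab.js,\
JSONPath=gs://cloud-training/gsp323/lab.schema,\
javascriptTextTransformFunctionName=transform,\
inputFilePattern=gs://cloud-training/gsp323/lab.csv,\
outputTable=YOUR_BIGQUERY_OUTPUT_TABLE,\
bigQueryLoadingTemporaryDirectory=gs://YOUR_BUCKET_NAME/bigquery_temp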


TASK 1 IS NOW COMPLETE




Go to dataproc

jobs

submit job

region = Region

job type = spark

Main class or jar = org.apache.spark.examples.SparkPageRank

Arguments = /data.txt

Jar files = file:///usr/lib/spark/examples/jars/spark-examples.jar

SUBMIT
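The equivalent Cloud Shell submission, as a sketch (CLUSTER_NAME is whatever your cluster is called, REGION is your lab's Region):

gcloud dataproc jobs submit spark \
  --cluster=CLUSTER_NAME \
  --region=REGION \
  --class=org.apache.spark.examples.SparkPageRank \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  -- /data.txt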


TASK 2 IS NOW COMPLETE




Go to dataprep

accept all the conditions and move ahead

Create a blank flow

Connect to your data

import datasets

GCS

gs://cloud-training/gsp323/runs.csv [in search box type this]

runs.csv

import and add to flow


edit recipe


column 10 (state)

right click → filter rows → on column values → contains

pattern to match = FAILURE → Action: Delete matching rows → Add


column 9 (score)

right click → filter rows → on column values → contains

pattern to match = /(^0$|^0\.0$)/ → Action: Delete matching rows → Add


column 2 rename = runid

column 3 rename = userid

column 4 rename = labid

column 5 rename = lab_title

column 6 rename = start

column 7 rename = end

column 8 rename = time

column 9 rename = score

column 10 rename = state


RUN JOB

then press RUN JOB again on the next screen


TASK 3 IS NOW COMPLETE




Go to cloud shell


gcloud iam service-accounts create my-natlang-sa \
  --display-name "my natural language service account"

gcloud iam service-accounts keys create ~/key.json \
  --iam-account my-natlang-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com


export GOOGLE_APPLICATION_CREDENTIALS="/home/$USER/key.json"


gcloud auth activate-service-account my-natlang-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com --key-file=$GOOGLE_APPLICATION_CREDENTIALS


gcloud ml language analyze-entities --content="Old Norse texts portray Odin as one-eyed and long-bearded, frequently wielding a spear named Gungnir and wearing a cloak and a broad hat." > result.json


gcloud auth login (optional)


Copy the token from the link provided  (optional)


gsutil cp result.json <CNL LINK>

(replace <CNL LINK> with the upload path given in your lab instructions; the same goes for the <GCS LINK> and <GVI LINK> placeholders later on)


if the command above gives an error: go to IAM

search for the principal name: PROJECT_ID@PROJECT_ID.iam.gserviceaccount.com

add the role Storage Object Admin

re-run the command above; it should now succeed
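The same role can also be granted from Cloud Shell; a sketch mirroring the principal named above (adjust the member address if yours differs):

# grant Storage Object Admin on the project to the service account
gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT \
  --member=serviceAccount:$GOOGLE_CLOUD_PROJECT@$GOOGLE_CLOUD_PROJECT.iam.gserviceaccount.com \
  --role=roles/storage.objectAdmin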


nano request.json


👇👇👇👇👇👇👇👇👇


{ "config": { "encoding":"FLAC", "languageCode": "en-US" }, "audio": { "uri":"gs://cloud-training/gsp323/task4.flac" } }





curl -s -X POST -H "Content-Type: application/json" --data-binary @request.json \
  "https://speech.googleapis.com/v1/speech:recognize?key=${API_KEY}" > result.json


gsutil cp result.json <GCS LINK>


gcloud iam service-accounts create quickstart


gcloud iam service-accounts keys create key.json --iam-account quickstart@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com


gcloud auth activate-service-account --key-file key.json


export ACCESS_TOKEN=$(gcloud auth print-access-token)


nano request.json 


👇👇👇👇👇👇👇👇👇


{ "inputUri":"gs://spls/gsp154/video/chicago.mp4", "features": [ "TEXT_DETECTION" ] }




curl -s -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  'https://videointelligence.googleapis.com/v1/videos:annotate' \
  -d @request.json
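The annotate call replies with an operation name in its "name" field, and that value feeds the next request. If jq is available in Cloud Shell (it normally is), a sketch for capturing it:

# re-issue the annotate request and keep only the operation name;
# the trailing segment of this value is what replaces
# OPERATION_FROM_PREVIOUS_REQUEST below
OPERATION_NAME=$(curl -s -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  'https://videointelligence.googleapis.com/v1/videos:annotate' \
  -d @request.json | jq -r .name)
echo $OPERATION_NAME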



curl -s -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  'https://videointelligence.googleapis.com/v1/operations/OPERATION_FROM_PREVIOUS_REQUEST' > result1.json



gsutil cp result1.json <GVI LINK>




Congratulations, you have completed the Challenge Lab!


SAKURA SATIO


