
Explore Machine Learning Models with Explainable AI: Challenge Lab [GSP324]


Link to the video: https://youtu.be/gIJmNrIwR4k


STEPS

OPEN "Notebooks" (AI Platform)

Click New Instance.


Select "Tensorflow enterprise 2.3" > without GPUs


Region: us-central1 (Iowa)

Zone: us-central1-a


Click Create.


When the instance is ready, click Open JupyterLab.


Open a Terminal from the Launcher.


git clone https://github.com/GoogleCloudPlatform/training-data-analyst


In the file browser, navigate to the folder training-data-analyst/quests/dei.


Open the notebook file what-if-tool-challenge.ipynb.


Run the notebook's initial cells to download and import the hmda_2017_ny_all-records_labels dataset.


*To run a cell, either click the Run button or press Shift+Enter.*

[*] next to a cell means it is still running.



# Imports for the model-building cell (already available if you ran the
# notebook's setup cells; repeated here so the cell is self-contained).
from tensorflow.keras.models import Sequential
from tensorflow.keras import layers

# Train your first model on the complete dataset. input_size (the length of the
# feature array you'll feed into the model for each example) is defined in an
# earlier notebook cell.
model = Sequential()
model.add(layers.Dense(200, input_shape=(input_size,), activation='relu'))
model.add(layers.Dense(50, activation='relu'))
model.add(layers.Dense(20, activation='relu'))
model.add(layers.Dense(1, activation='sigmoid'))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
model.fit(train_data, train_labels, epochs=10, batch_size=2048, validation_split=0.1)
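Once training finishes, it is worth exporting the model right away. The line below is a minimal sketch, not part of the lab's given code, and the directory name is my own assumption; in TF 2.3, model.save() with a bare path writes a SavedModel directory, which is the format AI Platform expects at deployment.

# Export the trained model as a SavedModel directory (name is an assumption).
model.save('saved_complete_model')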


# Train your second model on the limited dataset. Use `limited_train_data` for your data and `limited_train_labels` for your labels.


limited_model = Sequential()
limited_model.add(layers.Dense(200, input_shape=(input_size,), activation='relu'))
limited_model.add(layers.Dense(50, activation='relu'))
limited_model.add(layers.Dense(20, activation='relu'))
limited_model.add(layers.Dense(1, activation='sigmoid'))
limited_model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
limited_model.fit(limited_train_data, limited_train_labels, epochs=10, batch_size=2048, validation_split=0.1)
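The same export applies to the second model; as above, the directory name is an assumption rather than part of the lab's code.

# Export the limited model alongside the first (name is an assumption).
limited_model.save('saved_limited_model')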




In the Cloud Console, go to Cloud Storage > Create bucket and name the bucket after your GCP project ID.
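If you prefer to stay in the notebook instead of the console, the bucket can also be created from a cell. A minimal sketch, assuming gsutil is available on the instance (it is preinstalled on AI Platform Notebooks); the us-central1 location simply matches the instance region chosen earlier.

# Hypothetical console alternative: create the bucket from a notebook cell.
# Replace the placeholder with your actual project ID before running.
!gsutil mb -l us-central1 gs://<PUT_PROJECT_ID>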





# Fill out this information (GCP_PROJECT must be a quoted string):
GCP_PROJECT = '<PUT_PROJECT_ID>'
MODEL_BUCKET = 'gs://<PUT_PROJECT_ID>'
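With the bucket created and the variables set, the exported SavedModel directories have to land in the bucket before the models can be deployed. A sketch assuming the directory names from the export steps above; IPython expands $MODEL_BUCKET from the Python variable defined in this cell.

# Copy the SavedModels to the bucket (local paths assume the export sketches above).
!gsutil cp -r ./saved_complete_model $MODEL_BUCKET/saved_complete_model
!gsutil cp -r ./saved_limited_model $MODEL_BUCKET/saved_limited_model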




Create the Complete AI Platform model

Use the following to create your first AI Platform model:


Model Name = complete_model

Version Name = v1

Python version = 3.7

Framework = TensorFlow

Framework version = 2.3.1

ML Runtime version = 2.3
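The console's New Model / New Version forms are the lab's intended path, but the same settings can be applied from a notebook cell with gcloud. A sketch only; the --origin path assumes the SavedModel was copied to the bucket as in the step above.

# Sketch of the console steps as gcloud commands (--origin path is an assumption).
!gcloud ai-platform models create complete_model --regions us-central1
!gcloud ai-platform versions create v1 --model complete_model --origin $MODEL_BUCKET/saved_complete_model --framework tensorflow --python-version 3.7 --runtime-version 2.3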


Create the Limited AI Platform model

Use the following to create your second AI Platform model:


Model Name = limited_model

Version Name = v1

Python version = 3.7

Framework = TensorFlow

Framework version = 2.3.1

ML Runtime version = 2.3
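The gcloud equivalent mirrors the complete model with only the names changed; again a sketch under the same path assumption.

# Same sketch for the limited model (--origin path is an assumption).
!gcloud ai-platform models create limited_model --regions us-central1
!gcloud ai-platform versions create v1 --model limited_model --origin $MODEL_BUCKET/saved_limited_model --framework tensorflow --python-version 3.7 --runtime-version 2.3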


Congratulations, you have completed the lab!

SAKURA SATIO


