
Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 65

You are running a workload that processes very sensitive data, which data scientists will use downstream to train further models. The security team has strict requirements around data handling and encryption, approved workloads, and separation of duties for the users of the workload's output. You need to build the environment to support these requirements. What should you do?

Options:

A.

Use Confidential Computing on an N2D VM instance to process that data and output the results to a CMEK-encrypted Cloud Storage bucket. Assign the Storage Object Viewer role to the data scientist service account. Manage access to this service account by using Workload Identity pools.

B.

Use Confidential Computing within Confidential Space, and assign the workload operator role to the Confidential VM's service account. Assign the data collaborator role to the data scientist service account. Manage user access to these service accounts by using attestations and Workload Identity pools.

C.

Use Dataflow with Confidential Computing enabled to process the data and stream the results to a CMEK encrypted Cloud Storage bucket. Assign a storage object viewer role to the data scientist service account. Manage access to this service account by using Workload Identity pools.

D.

Use Dataproc with Confidential Computing enabled to process the data and stream the results to a CMEK-encrypted Cloud Storage bucket. Assign the Storage Object Viewer role to the data scientist service account. Manage access to this service account by using Workload Identity pools.
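A detail worth remembering for questions like this one: Confidential VMs are only available on machine families with the required hardware support (N2D, among others). The sketch below is a toy check of that constraint; the family list is an illustrative assumption, not an exhaustive result from any Google API.

```python
# Toy check: which machine-type families support Confidential VM (AMD SEV).
# The family set below is an assumption for illustration only.
CONFIDENTIAL_FAMILIES = {"n2d", "c2d"}

def supports_confidential_vm(machine_type: str) -> bool:
    """Return True if the machine type's family is in the supported set."""
    family = machine_type.split("-")[0].lower()
    return family in CONFIDENTIAL_FAMILIES

print(supports_confidential_vm("n2d-standard-4"))  # True
print(supports_confidential_vm("e2-medium"))       # False
```

This is why option A names an N2D instance specifically: a general-purpose E2 machine type could not be launched as a Confidential VM.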

Question 66

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
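The core of the scan-then-route pattern in option A is a simple decision: once a DLP inspection has returned its findings, any file with PII goes to the restricted bucket. The sketch below models only that routing decision in plain Python; the bucket names are made up, and the actual DLP scan and object move are out of scope here.

```python
# Illustrative routing logic for a scan-then-move function.
# Bucket names are hypothetical examples.
SHARED_BUCKET = "logs-shared"          # analysts + administrator
RESTRICTED_BUCKET = "logs-admin-only"  # administrator only

def destination_bucket(pii_findings: int) -> str:
    """Route files with any PII findings to the restricted bucket."""
    return RESTRICTED_BUCKET if pii_findings > 0 else SHARED_BUCKET

print(destination_bucket(0))  # logs-shared
print(destination_bucket(3))  # logs-admin-only
```

Note how this differs from option B: the file is only ever written to one bucket, so PII never transits the shared location.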

Question 67

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

The Cloud Storage bucket in Project A can only be readable from Project B.

The Cloud Storage bucket in Project A cannot be accessed from outside the network.

Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

Options:

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.

B.

Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the service perimeter configuration.

C.

Enable Private Google Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.

D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.
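The behavior a service perimeter gives you (option B) can be summarized as a simple rule: an API call succeeds only if both the caller's project and the project owning the data sit inside the perimeter. The sketch below is a toy model of that rule, not the VPC Service Controls API; the project IDs are hypothetical.

```python
# Toy model of a service perimeter around two projects.
# Project IDs are hypothetical examples.
PERIMETER = {"project-a", "project-b"}

def storage_access_allowed(caller_project: str, target_project: str) -> bool:
    """Both the caller and the data owner must be inside the perimeter."""
    return caller_project in PERIMETER and target_project in PERIMETER

print(storage_access_allowed("project-b", "project-a"))  # True: read from B
print(storage_access_allowed("external", "project-a"))   # False: outside caller
print(storage_access_allowed("project-a", "external"))   # False: no copy out
```

The third case is what distinguishes a perimeter from firewall rules or VPC Peering: even a legitimate insider cannot copy the data to an external bucket, satisfying the exfiltration requirement.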

Question 68

A company migrated its entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.

What should you do?

Options:

A.

Use Resource Manager on the organization level.

B.

Use Forseti Security to automate inventory snapshots.

C.

Use Stackdriver to create a dashboard across all projects.

D.

Use Security Command Center to view all assets across the organization.
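The idea behind automated inventory snapshots (option B) is that answering "what was running at time T" reduces to looking up the most recent snapshot taken at or before T. In practice a tool such as Forseti or Cloud Asset Inventory records the snapshots; the sketch below is a minimal in-memory stand-in for that lookup, with made-up instance names.

```python
# Minimal stand-in for snapshot-based inventory history.
# Snapshots must be recorded in increasing timestamp order.
from bisect import bisect_right

class InventoryHistory:
    def __init__(self):
        self._times = []      # sorted snapshot timestamps
        self._snapshots = []  # instance sets, parallel to _times

    def record(self, ts: int, instances: set) -> None:
        """Store a point-in-time snapshot of running instances."""
        self._times.append(ts)
        self._snapshots.append(set(instances))

    def running_at(self, ts: int) -> set:
        """Return the most recent snapshot taken at or before ts."""
        i = bisect_right(self._times, ts)
        return self._snapshots[i - 1] if i else set()

h = InventoryHistory()
h.record(100, {"vm-1", "vm-2"})
h.record(200, {"vm-2", "vm-3"})
print(sorted(h.running_at(150)))  # ['vm-1', 'vm-2']
```

Dashboards (option C) and live asset views show current state; only a recorded snapshot history can answer the "at any point in time" part of the requirement.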