
Google Professional-Cloud-Security-Engineer Dumps

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

Options:

A.

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Question 2

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.

Enable Access Transparency Logging.

B.

Deploy resources only to regions permitted by data residency requirements

C.

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.

Deploy Assured Workloads.

Question 3

You are creating an internal App Engine application that needs to access a user’s Google Drive on the user’s behalf. Your company does not want to rely on the current user’s credentials. It also wants to follow Google-recommended practices.

What should you do?

Options:

A.

Create a new Service account, and give all application users the role of Service Account User.

B.

Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User.

C.

Use a dedicated G Suite Admin account, and authenticate the application’s operations with these G Suite credentials.

D.

Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.

Question 4

Your organization is using Active Directory and wants to configure Security Assertion Markup Language (SAML). You must set up and enforce single sign-on (SSO) for all users.

What should you do?

Options:

A.

1. Manage SAML profile assignments.

2. Enable OpenID Connect (OIDC) in your Active Directory (AD) tenant.

3. Verify the domain.

B.

1. Create a new SAML profile.

2. Upload the X.509 certificate.

3. Enable the change password URL.

4. Configure Entity ID and ACS URL in your IdP.

C.

1. Create a new SAML profile.

2. Populate the sign-in and sign-out page URLs.

3. Upload the X.509 certificate.

4. Configure Entity ID and ACS URL in your IdP.

D.

1. Configure prerequisites for OpenID Connect (OIDC) in your Active Directory (AD) tenant.

2. Verify the AD domain.

3. Decide which users should use SAML.

4. Assign the pre-configured profile to the selected organizational units (OUs) and groups.

Question 5

You work for an organization in a regulated industry that has strict data protection requirements. The organization backs up their data in the cloud. To comply with data privacy regulations, this data can only be stored for a specific length of time and must be deleted after this specific period.

You want to automate the compliance with this regulation while minimizing storage costs. What should you do?

Options:

A.

Store the data in a persistent disk, and delete the disk at expiration time.

B.

Store the data in a Cloud Bigtable table, and set an expiration time on the column families.

C.

Store the data in a BigQuery table, and set the table's expiration time.

D.

Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.
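
For reference, the Object Lifecycle Management feature mentioned in option D can also be configured programmatically. The following is a minimal sketch, assuming the google-cloud-storage Python client; the bucket name and retention period are placeholders:

    from google.cloud import storage  # assumes the google-cloud-storage client library

    BUCKET_NAME = "regulated-backups"  # placeholder bucket name
    RETENTION_DAYS = 365               # placeholder: the regulated storage period in days

    client = storage.Client()
    bucket = client.get_bucket(BUCKET_NAME)

    # Add a lifecycle rule that deletes objects once they exceed the allowed age.
    bucket.add_lifecycle_delete_rule(age=RETENTION_DAYS)
    bucket.patch()  # persist the updated lifecycle configuration

    print(list(bucket.lifecycle_rules))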

Question 6

An organization's security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud Platform (GCP), and where Google's responsibility lies. They are mostly running workloads using Google Cloud's Platform-as-a-Service (PaaS) offerings, primarily App Engine.

Which one of these areas in the technology stack would they need to focus on as their primary responsibility when using App Engine?

Options:

A.

Configuring and monitoring VPC Flow Logs

B.

Defending against XSS and SQLi attacks

C.

Managing the latest updates and security patches for the Guest OS

D.

Encrypting all stored data

Question 7

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP. The organization's on-premises production environment is going to be the next phase for migration to GCP. Stable networking connectivity between the on-premises environment and GCP is also being implemented.

Which GCP solution should the organization use?

Options:

A.

BigQuery using a data pipeline job with continuous updates via Cloud VPN

B.

Cloud Storage using a scheduled task and gsutil via Cloud Interconnect

C.

Compute Engine Virtual Machines using Persistent Disk via Cloud Interconnect

D.

Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Question 8

An employer wants to track how bonus compensations have changed over time to identify employee outliers and correct earning disparities. This task must be performed without exposing the sensitive compensation data for any individual and must be reversible to identify the outlier.

Which Cloud Data Loss Prevention API technique should you use to accomplish this?

Options:

A.

Generalization

B.

Redaction

C.

CryptoHashConfig

D.

CryptoReplaceFfxFpeConfig
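
As background on the technique named in option D, the sketch below de-identifies a compensation column with format-preserving encryption (CryptoReplaceFfxFpeConfig), which keeps the transformation reversible for whoever holds the key. It assumes the google-cloud-dlp Python client; the project ID, key material, and table contents are placeholders, and in practice the key would be KMS-wrapped rather than generated inline:

    import os
    from google.cloud import dlp_v2  # assumes the google-cloud-dlp client library

    PROJECT_ID = "my-project"  # placeholder

    dlp = dlp_v2.DlpServiceClient()
    response = dlp.deidentify_content(
        request={
            "parent": f"projects/{PROJECT_ID}",
            "item": {
                "table": {
                    "headers": [{"name": "employee_id"}, {"name": "bonus"}],
                    "rows": [{"values": [{"string_value": "E123"}, {"string_value": "12000"}]}],
                }
            },
            "deidentify_config": {
                "record_transformations": {
                    "field_transformations": [
                        {
                            "fields": [{"name": "bonus"}],
                            "primitive_transformation": {
                                "crypto_replace_ffx_fpe_config": {
                                    # Illustrative unwrapped key; use a KMS-wrapped key in practice.
                                    "crypto_key": {"unwrapped": {"key": os.urandom(32)}},
                                    "common_alphabet": "NUMERIC",
                                }
                            },
                        }
                    ]
                }
            },
        }
    )
    print(response.item.table)  # bonus values are replaced with reversible tokens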

Question 9

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

  • Schedule key rotation for sensitive data.
  • Control which region the encryption keys for sensitive data are stored in.
  • Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

Options:

A.

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.
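
Two of the capabilities this question hinges on, choosing the region where key material lives and scheduling rotation, are properties of a Cloud KMS key. A minimal sketch with the google-cloud-kms Python client follows; the project, location, and key names are placeholders:

    import time
    from google.cloud import kms  # assumes the google-cloud-kms client library

    PROJECT_ID = "my-project"   # placeholder
    LOCATION = "europe-west4"   # placeholder: controls where the key material is stored

    client = kms.KeyManagementServiceClient()
    parent = f"projects/{PROJECT_ID}/locations/{LOCATION}"

    # Key ring pinned to a specific region.
    key_ring = client.create_key_ring(
        request={"parent": parent, "key_ring_id": "sensitive-data", "key_ring": {}}
    )

    # Symmetric key that rotates automatically every 90 days.
    ninety_days = 60 * 60 * 24 * 90
    crypto_key = client.create_crypto_key(
        request={
            "parent": key_ring.name,
            "crypto_key_id": "sensitive-data-key",
            "crypto_key": {
                "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
                "rotation_period": {"seconds": ninety_days},
                "next_rotation_time": {"seconds": int(time.time()) + ninety_days},
            },
        }
    )
    print(crypto_key.name)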

Question 10

You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege.

What should you do?

Options:

A.

Configure an ingress policy for the perimeter in Project A and allow access for the service account in Project B to collect messages.

B.

Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.

C.

Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.

D.

Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.

Question 11

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organizational resource that has hundreds of projects.

Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources.

Which type of access should your team grant to meet this requirement?

Options:

A.

Organization Administrator

B.

Security Reviewer

C.

Organization Role Administrator

D.

Organization Policy Administrator

Question 12

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

Options:

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question 13

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Question 14

Your customer has an on-premises Public Key Infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer frontends. Because it involves many manual processes, the on-premises PKI should be minimally affected, and the solution needs to scale.

What should you do?

Options:

A.

Use Certificate Manager to issue Google-managed public certificates and configure it at the HTTP load balancers in your infrastructure as code (IaC).

B.

Use Certificate Manager to import certificates issued from the on-premises PKI for the frontends. Leverage the gcloud tool for importing.

C.

Use a subordinate CA in the Google Certificate Authority Service from the on-premises PKI system to issue certificates for the load balancers.

D.

Use the web applications with PKCS12 certificates issued from a subordinate CA based on OpenSSL on-premises. Use the gcloud tool for importing. Use the external TCP/UDP Network load balancer instead of an external HTTP load balancer.

Question 15

For compliance reasons, an organization needs to ensure that in-scope PCI Kubernetes Pods reside on “in-scope” Nodes only. These Nodes can only contain the “in-scope” Pods.

How should the organization achieve this objective?

Options:

A.

Add a nodeSelector field to the pod configuration to only use the Nodes labeled inscope: true.

B.

Create a node pool with the label inscope: true and a Pod Security Policy that only allows the Pods to run on Nodes with that label.

C.

Place a taint on the Nodes with the label inscope: true and effect NoSchedule and a toleration to match in the Pod configuration.

D.

Run all in-scope Pods in the namespace “in-scope-pci”.
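
The options above combine standard Kubernetes scheduling controls: node labels matched by a nodeSelector, and taints with matching tolerations. A minimal sketch using the official Kubernetes Python client is shown below; the namespace, Pod name, and image are placeholders, while the inscope: true label and taint key come from the options:

    from kubernetes import client, config  # assumes the official kubernetes Python client

    config.load_kube_config()  # or config.load_incluster_config() inside a cluster

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="pci-worker", namespace="in-scope-pci"),
        spec=client.V1PodSpec(
            containers=[client.V1Container(name="app", image="gcr.io/my-project/app:latest")],
            # Schedule only onto Nodes carrying the label inscope: true ...
            node_selector={"inscope": "true"},
            # ... and tolerate the taint that keeps all other Pods off those Nodes.
            tolerations=[
                client.V1Toleration(key="inscope", operator="Equal", value="true", effect="NoSchedule")
            ],
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace="in-scope-pci", body=pod)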

Question 16

Your team needs to configure their Google Cloud Platform (GCP) environment so they can centralize the control over networking resources like firewall rules, subnets, and routes. They also have an on-premises environment where resources need access back to the GCP resources through a private VPN connection. The networking resources will need to be controlled by the network security team.

Which type of networking design should your team use to meet these requirements?

Options:

A.

Shared VPC Network with a host project and service projects

B.

Grant Compute Admin role to the networking team for each engineering project

C.

VPC peering between all engineering projects using a hub and spoke model

D.

Cloud VPN Gateway between all engineering projects using a hub and spoke model

Question 17

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

Options:

A.

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.

Question 18

A company is using Google Kubernetes Engine (GKE) with container images of a mission-critical application. The company wants to scan the images for known security issues and securely share the report with the security team without exposing them outside Google Cloud.

What should you do?

Options:

A.

1. Enable Container Threat Detection in the Security Command Center Premium tier.

2. Upgrade all clusters that are not on a supported version of GKE to the latest possible GKE version.

3. View and share the results from the Security Command Center.

B.

1. Use an open source tool in Cloud Build to scan the images.

2. Upload reports to publicly accessible buckets in Cloud Storage by using gsutil.

3. Share the scan report link with your security department.

C.

1. Enable vulnerability scanning in the Artifact Registry settings.

2. Use Cloud Build to build the images.

3. Push the images to the Artifact Registry for automatic scanning.

4. View the reports in the Artifact Registry.

D.

1. Get a GitHub subscription.

2. Build the images in Cloud Build and store them in GitHub for automatic scanning.

3. Download the report from GitHub and share it with the security team.

Question 19

Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises.

What should you do?

Options:

A.

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.

B.

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.

C.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.

D.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.

Question 20

Your organization must comply with the regulation to keep instance logging data within Europe. Your workloads will be hosted in the Netherlands in region europe-west4 in a new project. You must configure Cloud Logging to keep your data in the country.

What should you do?

Options:

A.

Configure the organization policy constraint gcp.resourceLocations to europe-west4.

B.

Set the logging storage region to europe-west4 by using the gcloud CLI logging settings update.

C.

Create a new log bucket in europe-west4, and redirect the _Default bucket to the new bucket.

D.

Configure log sink to export all logs into a Cloud Storage bucket in europe-west4.

Question 21

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.

Enforce 2-factor authentication in GSuite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using GSuite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.

Question 22

Your company’s new CEO recently sold two of the company’s divisions. Your Director asks you to help migrate the Google Cloud projects associated with those divisions to a new organization node. Which preparation steps are necessary before this migration occurs? (Choose two.)

Options:

A.

Remove all project-level custom Identity and Access Management (IAM) roles.

B.

Disallow inheritance of organization policies.

C.

Identify inherited Identity and Access Management (IAM) roles on projects to be migrated.

D.

Create a new folder for all projects to be migrated.

E.

Remove the specific migration projects from any VPC Service Controls perimeters and bridges.

Question 23

Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren't compromised by boot-level or kernel-level malware. Also, you need to ensure that data in use on the VMs cannot be read by the underlying host system by using a hardware-based solution.

What should you do?

Options:

A.

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring.

2. Create a Cloud Run function to check the VM settings and generate metrics, and run the function regularly.

B.

1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium.

2. Monitor the findings in SCC.

C.

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring.

2. Activate Confidential Computing.

3. Enforce these actions by using organization policies.

D.

1. Use secure hardened images from the Google Cloud Marketplace.

2. When deploying the images, activate the Confidential Computing option.

3. Enforce the use of the correct images and Confidential Computing by using organization policies.

Question 24

You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:

• Manage the data encryption key (DEK) outside the Google Cloud boundary.

• Maintain full control of encryption keys through a third-party provider.

• Encrypt the sensitive data before uploading it to Cloud Storage

• Decrypt the sensitive data during processing in the Compute Engine VMs

• Encrypt the sensitive data in memory while in use in the Compute Engine VMs

What should you do?

Choose 2 answers

Options:

A.

Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets

B.

Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.

C.

Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage and decrypt the sensitive data after it is downloaded into your VMs

D.

Create Confidential VMs to access the sensitive data.

E.

Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.

Question 25

You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must:

• Protect data at rest with full lifecycle management on cryptographic keys

• Implement a separate key management provider from data management

• Provide visibility into all encryption key requests

What services should be included in the data warehouse implementation?

Choose 2 answers

Options:

A.

Customer-managed encryption keys

B.

Customer-Supplied Encryption Keys

C.

Key Access Justifications

D.

Access Transparency and Approval

E.

Cloud External Key Manager

Question 26

You have created an OS image that is hardened per your organization’s security standards and is being stored in a project managed by the security team. As a Google Cloud administrator, you need to make sure all VMs in your Google Cloud organization can only use that specific OS image while minimizing operational overhead. What should you do? (Choose two.)

Options:

A.

Grant users the compute.imageUser role in their own projects.

B.

Grant users the compute.imageUser role in the OS image project.

C.

Store the image in every project that is spun up in your organization.

D.

Set up an image access organization policy constraint, and list the security team managed project in the projects allow list.

E.

Remove VM instance creation permission from users of the projects, and only allow you and your team to create VM instances.

Question 27

You have a highly sensitive BigQuery workload that contains personally identifiable information (PII) that you want to ensure is not accessible from the internet. To prevent data exfiltration, only requests from authorized IP addresses are allowed to query your BigQuery tables.

What should you do?

Options:

A.

Use a service perimeter and create an access level based on the authorized source IP address as the condition.

B.

Use Google Cloud Armor security policies defining an allowlist of authorized IP addresses at the global HTTPS load balancer.

C.

Use the Restrict allowed Google Cloud APIs and services organization policy constraint along with Cloud Data Loss Prevention (DLP).

D.

Use the Restrict Resource service usage organization policy constraint along with Cloud Data Loss Prevention (DLP).

Question 28

You want to update your existing VPC Service Controls perimeter with a new access level. You need to avoid breaking the existing perimeter with this change, and ensure the least disruptions to users while minimizing overhead. What should you do?

Options:

A.

Create an exact replica of your existing perimeter. Add your new access level to the replica. Update the original perimeter after the access level has been vetted.

B.

Update your perimeter with a new access level that never matches. Update the new access level to match your desired state one condition at a time to avoid being overly permissive.

C.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter configuration. Update the perimeter configuration after the access level has been vetted.

D.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter dry run configuration. Update the perimeter configuration after the access level has been vetted.

Question 29

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

Options:

A.

Google Cloud Armor's preconfigured rules in preview mode

B.

Prepopulated VPC firewall rules in monitor mode

C.

The inherent protections of Google Front End (GFE)

D.

Cloud Load Balancing firewall rules

E.

VPC Service Controls in dry run mode

Question 30

You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?

Options:

A.

Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.

B.

Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.

C.

Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.

D.

Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.

Question 31

What are the steps to encrypt data using envelope encryption?

Options:

A.

Generate a data encryption key (DEK) locally.

Use a key encryption key (KEK) to wrap the DEK.

Encrypt data with the KEK.

Store the encrypted data and the wrapped KEK.

B.

Generate a key encryption key (KEK) locally.

Use the KEK to generate a data encryption key (DEK).

Encrypt data with the DEK.

Store the encrypted data and the wrapped DEK.

C.

Generate a data encryption key (DEK) locally.

Encrypt data with the DEK.

Use a key encryption key (KEK) to wrap the DEK.

Store the encrypted data and the wrapped DEK.

D.

Generate a key encryption key (KEK) locally.

Generate a data encryption key (DEK) locally.

Encrypt data with the KEK.

Store the encrypted data and the wrapped DEK.
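
To make the envelope encryption flow concrete, here is a minimal local sketch using the cryptography package's Fernet primitive for both keys. It is purely illustrative; in a real deployment the key encryption key (KEK) would be held in a key management service such as Cloud KMS and never leave it:

    from cryptography.fernet import Fernet  # assumes the "cryptography" package; illustration only

    # 1. Generate a data encryption key (DEK) locally.
    dek = Fernet.generate_key()

    # 2. Encrypt the data with the DEK.
    ciphertext = Fernet(dek).encrypt(b"sensitive payload")

    # 3. Wrap (encrypt) the DEK with a key encryption key (KEK).
    #    In a real system the KEK would stay inside a KMS.
    kek = Fernet.generate_key()
    wrapped_dek = Fernet(kek).encrypt(dek)

    # 4. Store the encrypted data together with the wrapped DEK.
    stored = {"ciphertext": ciphertext, "wrapped_dek": wrapped_dek}

    # To read the data: unwrap the DEK with the KEK, then decrypt the ciphertext.
    plaintext = Fernet(Fernet(kek).decrypt(stored["wrapped_dek"])).decrypt(stored["ciphertext"])
    assert plaintext == b"sensitive payload"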

Question 32

While migrating your organization’s infrastructure to GCP, a large number of users will need to access the GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

Options:

A.

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Question 33

The security operations team needs access to the security-related logs for all projects in their organization. They have the following requirements:

Follow the least privilege model by having only view access to logs.

Have access to Admin Activity logs.

Have access to Data Access logs.

Have access to Access Transparency logs.

Which Identity and Access Management (IAM) role should the security operations team be granted?

Options:

A.

roles/logging.privateLogViewer

B.

roles/logging.admin

C.

roles/viewer

D.

roles/logging.viewer

Question 34

Your organization operates Virtual Machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.

What should you do?

Options:

A.

Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to update with critical patches daily.

B.

Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.

C.

Assign public IPs to the VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to run OS updates at night during low-activity periods.

D.

Copy the latest patches to a Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.

Question 35

You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization’s compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?

Options:

A.

Organization Policy Service constraints

B.

Shielded VM instances

C.

Access control lists

D.

Geolocation access controls

E.

Google Cloud Armor

Question 36

You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:

Provide granular access to secrets

Give you control over the rotation schedules for the encryption keys that wrap your secrets

Maintain environment separation

Provide ease of management

Which approach should you take?

Options:

A.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.

2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.

3. Use customer-managed encryption keys to encrypt secrets.

B.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.

2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.

3. Use Google-managed encryption keys to encrypt secrets.

C.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.

2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.

3. Use Google-managed encryption keys to encrypt secrets.

D.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.

2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.

3. Use customer-managed encryption keys to encrypt secrets.
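
For context on the secret-level IAM granularity that several options mention, the sketch below creates one secret and grants access to it alone, assuming the google-cloud-secret-manager Python client. The project ID, secret ID, and service account are placeholders, and CMEK configuration on the replication policy is omitted:

    from google.cloud import secretmanager  # assumes the google-cloud-secret-manager client library

    PROJECT_ID = "prod-secrets-project"  # placeholder
    SECRET_ID = "payments-api-key"       # placeholder
    MEMBER = "serviceAccount:payments-app@prod-app.iam.gserviceaccount.com"  # placeholder

    client = secretmanager.SecretManagerServiceClient()

    # Create the secret (automatic replication; CMEK can be set on the replication policy instead).
    secret = client.create_secret(
        request={
            "parent": f"projects/{PROJECT_ID}",
            "secret_id": SECRET_ID,
            "secret": {"replication": {"automatic": {}}},
        }
    )

    # Grant access on this one secret only (secret-level IAM binding).
    policy = client.get_iam_policy(request={"resource": secret.name})
    policy.bindings.add(role="roles/secretmanager.secretAccessor", members=[MEMBER])
    client.set_iam_policy(request={"resource": secret.name, "policy": policy})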

Question 37

Your company plans to move most of its IT infrastructure to Google Cloud. They want to leverage their existing on-premises Active Directory as an identity provider for Google Cloud. Which two steps should you take to integrate the company’s on-premises Active Directory with Google Cloud and configure access management? (Choose two.)

Options:

A.

Use Identity Platform to provision users and groups to Google Cloud.

B.

Use Cloud Identity SAML integration to provision users and groups to Google Cloud.

C.

Install Google Cloud Directory Sync and connect it to Active Directory and Cloud Identity.

D.

Create Identity and Access Management (IAM) roles with permissions corresponding to each Active Directory group.

E.

Create Identity and Access Management (IAM) groups with permissions corresponding to each Active Directory group.

Question 38

You need to provide a corporate user account in Google Cloud for each of your developers and operational staff who need direct access to GCP resources. Corporate policy requires you to maintain the user identity in a third-party identity management provider and leverage single sign-on. You learn that a significant number of users are using their corporate domain email addresses for personal Google accounts, and you need to follow Google recommended practices to convert existing unmanaged users to managed accounts.

Which two actions should you take? (Choose two.)

Options:

A.

Use Google Cloud Directory Sync to synchronize your local identity management system to Cloud Identity.

B.

Use the Google Admin console to view which managed users are using a personal account for their recovery email.

C.

Add users to your managed Google account and force users to change the email addresses associated with their personal accounts.

D.

Use the Transfer Tool for Unmanaged Users (TTUU) to find users with conflicting accounts and ask them to transfer their personal Google accounts.

E.

Send an email to all of your employees and ask those users with corporate email addresses for personal Google accounts to delete the personal accounts immediately.

Question 39

You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?

Options:

A.

Titan Security Keys

B.

Google prompt

C.

Google Authenticator app

D.

Cloud HSM keys

Question 40

A large financial institution is moving its Big Data analytics to Google Cloud Platform. They want to have maximum control over the encryption process of data stored at rest in BigQuery.

What technique should the institution use?

Options:

A.

Use Cloud Storage as a federated Data Source.

B.

Use a Cloud Hardware Security Module (Cloud HSM).

C.

Customer-managed encryption keys (CMEK).

D.

Customer-supplied encryption keys (CSEK).

Question 41

A company’s application is deployed with a user-managed Service Account key. You want to use Google-recommended practices to rotate the key.

What should you do?

Options:

A.

Open Cloud Shell and run gcloud iam service-accounts enable-auto-rotate --iam-account=IAM_ACCOUNT.

B.

Open Cloud Shell and run gcloud iam service-accounts keys rotate --iam-account=IAM_ACCOUNT --key=NEW_KEY.

C.

Create a new key, and use the new key in the application. Delete the old key from the Service Account.

D.

Create a new key, and use the new key in the application. Store the old key on the system as a backup key.
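
The create-new-then-delete-old rotation flow described in the options can be scripted against the IAM API. A rough sketch using the google-api-python-client discovery interface follows; the service account email and old key name are placeholders, and redeploying the application with the new key is only indicated by a comment:

    from googleapiclient import discovery  # assumes google-api-python-client with default credentials

    SA_EMAIL = "app-sa@my-project.iam.gserviceaccount.com"  # placeholder
    OLD_KEY = "projects/my-project/serviceAccounts/app-sa@my-project.iam.gserviceaccount.com/keys/OLD_KEY_ID"  # placeholder

    iam = discovery.build("iam", "v1")
    resource = f"projects/-/serviceAccounts/{SA_EMAIL}"

    # 1. Create a new user-managed key; the response carries the private key material.
    new_key = iam.projects().serviceAccounts().keys().create(name=resource, body={}).execute()

    # 2. Roll the new key out to the application and verify it works (not shown).

    # 3. Delete the old key so it can no longer be used.
    iam.projects().serviceAccounts().keys().delete(name=OLD_KEY).execute()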

Question 42

You have stored company approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team requires deploying a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.

1. Update the perimeter.

2. Configure the egressTo field to set identityType to any_identity.

3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.

Allow the external project by using the organization policy constraint constraints/compute.trustedImageProjects.

C.

1. Update the perimeter.

2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

3. Configure the egressFrom field to set identityType to any_identity.

D.

1. Update the perimeter.

2. Configure the ingressFrom field to set identityType to any_identity.

3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

Question 43

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

Options:

A.

Create a Log Sink at the organization level with a Pub/Sub destination.

B.

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.

Enable Data Access audit logs at the organization level to apply to all projects.

D.

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
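
An organization-level aggregated sink with includeChildren and a Pub/Sub destination (as in options A and B) can be created through the Logging API. The following sketch assumes the google-cloud-logging client library; the organization ID, topic, and filter are placeholders that would need to be tailored to the audit logs in scope:

    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client  # assumes google-cloud-logging
    from google.cloud.logging_v2.types import LogSink

    ORG_ID = "123456789012"  # placeholder organization ID
    TOPIC = "pubsub.googleapis.com/projects/siem-project/topics/audit-logs"  # placeholder destination

    sink = LogSink(
        name="org-audit-to-siem",
        destination=TOPIC,
        # Placeholder filter: Admin Activity audit logs; extend it for the login events in scope.
        filter='logName:"cloudaudit.googleapis.com%2Factivity"',
        include_children=True,  # aggregate logs from every folder and project in the organization
    )

    created = ConfigServiceV2Client().create_sink(
        request={"parent": f"organizations/{ORG_ID}", "sink": sink}
    )
    # Grant created.writer_identity the Pub/Sub Publisher role on the topic afterwards.
    print(created.writer_identity)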

Question 44

In a shared security responsibility model for IaaS, which two layers of the stack does the customer share responsibility for? (Choose two.)

Options:

A.

Hardware

B.

Network Security

C.

Storage Encryption

D.

Access Policies

E.

Boot

Question 45

You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your Google Cloud VPCs based on packet header information. However, you want the capability to explore network flows and their payload to aid investigations. Which Google Cloud product should you use?

Options:

A.

Marketplace IDS

B.

VPC Flow Logs

C.

VPC Service Controls logs

D.

Packet Mirroring

E.

Google Cloud Armor Deep Packet Inspection

Question 46

Your organization processes sensitive health information. You want to ensure that data is encrypted while in use by the virtual machines (VMs). You must create a policy that is enforced across the entire organization.

What should you do?

Options:

A.

Implement an organization policy that ensures that all VM resources created across your organization use customer-managed encryption keys (CMEK) protection.

B.

Implement an organization policy that ensures all VM resources created across your organization are Confidential VM instances.

C.

Implement an organization policy that ensures that all VM resources created across your organization use Cloud External Key Manager (EKM) protection.

D.

No action is necessary because Google encrypts data while it is in use by default.

Question 47

Which two implied firewall rules are defined on a VPC network? (Choose two.)

Options:

A.

A rule that allows all outbound connections

B.

A rule that denies all inbound connections

C.

A rule that blocks all inbound port 25 connections

D.

A rule that blocks all outbound connections

E.

A rule that allows all inbound port 80 connections

Question 48

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

Options:

A.

External Key Manager

B.

Customer-supplied encryption keys

C.

Hardware Security Module

D.

Confidential Computing and Istio

E.

Client-side encryption

Question 49

Your organization's customers must scan and upload their contract and driver's license to a web portal in Cloud Storage. You must remove all personally identifiable information (PII) from files that are older than 12 months. You must also archive the anonymized files for retention purposes.

What should you do?

Options:

A.

Set a time to live (TTL) of 12 months for the files in the Cloud Storage bucket that removes PII and moves the files to the archive storage class.

B.

Create a Cloud Data Loss Prevention (DLP) inspection job that de-identifies PII in files created more than 12 months ago and archives them to another Cloud Storage bucket. Delete the original files.

C.

Schedule a Cloud Key Management Service (KMS) rotation period of 12 months for the encryption keys of the Cloud Storage files containing PII to de-identify them. Delete the original keys.

D.

Configure the Autoclass feature of the Cloud Storage bucket to de-identify PII. Archive the files that are older than 12 months. Delete the original files.

Question 50

Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.

How should your team design this network?

Options:

A.

Create an ingress firewall rule to allow access only from the application to the database using firewall tags.

B.

Create a different subnet for the frontend application and database to ensure network isolation.

C.

Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.

D.

Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.

Question 51

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.

Configure GCDS and use GCDS search rules to sync these users.

B.

Use the transfer tool to migrate unmanaged users.

C.

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Question 52

You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?

Options:

A.

Change the access control model for the bucket

B.

Update your sink with the correct bucket destination.

C.

Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

D.

Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

Question 53

You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization. What should you do?

Options:

A.

Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.

B.

Use organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.

C.

Use organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.

D.

Use organization policy constraints/iam.disableServiceAccountCreation boolean to disable the creation of new service accounts.
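
The boolean constraint named in option D is normally enforced from the console or gcloud, but it can also be set through the Organization Policy v2 API. A hedged sketch follows, assuming the google-cloud-org-policy client library; the organization ID is a placeholder:

    from google.cloud import orgpolicy_v2  # assumes the google-cloud-org-policy client library

    ORG_ID = "123456789012"  # placeholder organization ID
    CONSTRAINT = "iam.disableServiceAccountCreation"

    client = orgpolicy_v2.OrgPolicyClient()
    # Create a policy that enforces the boolean constraint across the organization.
    client.create_policy(
        request={
            "parent": f"organizations/{ORG_ID}",
            "policy": {
                "name": f"organizations/{ORG_ID}/policies/{CONSTRAINT}",
                "spec": {"rules": [{"enforce": True}]},
            },
        }
    )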

Question 54

You are part of a security team that wants to ensure that a Cloud Storage bucket in Project A can only be readable from Project B. You also want to ensure that data in the Cloud Storage bucket cannot be accessed from or copied to Cloud Storage buckets outside the network, even if the user has the correct credentials.

What should you do?

Options:

A.

Enable VPC Service Controls, create a perimeter with Project A and B, and include Cloud Storage service.

B.

Enable Domain Restricted Sharing Organization Policy and Bucket Policy Only on the Cloud Storage bucket.

C.

Enable Private Access in Project A and B networks with strict firewall rules to allow communication between the networks.

D.

Enable VPC Peering between Project A and B networks with strict firewall rules to allow communication between the networks.

Question 55

Which type of load balancer should you use to maintain client IP by default while using the standard network tier?

Options:

A.

SSL Proxy

B.

TCP Proxy

C.

Internal TCP/UDP

D.

TCP/UDP Network

Question 56

Your company requires the security and network engineering teams to identify all network anomalies within and across VPCs, internal traffic from VMs to VMs, traffic between end locations on the internet and VMs, and traffic between VMs to Google Cloud services in production. Which method should you use?

Options:

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.

Question 57

You are a security administrator at your company and are responsible for managing access controls (identification, authentication, and authorization) on Google Cloud. Which Google-recommended best practices should you follow when configuring authentication and authorization? (Choose two.)

Options:

A.

Use Google default encryption.

B.

Manually add users to Google Cloud.

C.

Provision users with basic roles using Google's Identity and Access Management (IAM) service.

D.

Use SSO/SAML integration with Cloud Identity for user authentication and user lifecycle management.

E.

Provide granular access with predefined roles.

Question 58

Your organization is transitioning to Google Cloud. You want to ensure that only trusted container images are deployed on Google Kubernetes Engine (GKE) clusters in a project. The containers must be deployed from a centrally managed Container Registry and signed by a trusted authority.

What should you do?

Choose 2 answers

Options:

A.

Configure the Binary Authorization policy with respective attestations for the project.

B.

Create a custom organization policy constraint to enforce Binary Authorization for Google Kubernetes Engine (GKE).

C.

Enable Container Threat Detection in the Security Command Center (SCC) for the project.

D.

Configure the trusted image organization policy constraint for the project.

E.

Enable Pod Security standards and set them to Restricted.

Question 59

You are working with a client that is concerned about control of their encryption keys for sensitive data. The client does not want to store encryption keys at rest in the same cloud service provider (CSP) as the data that the keys are encrypting. Which Google Cloud encryption solutions should you recommend to this client? (Choose two.)

Options:

A.

Customer-supplied encryption keys.

B.

Google default encryption

C.

Secret Manager

D.

Cloud External Key Manager

E.

Customer-managed encryption keys

Question 60

Your organization is using GitHub Actions as a continuous integration and delivery (CI/CD) platform. You must enable access to Google Cloud resources from the CI/CD pipelines in the most secure way.

What should you do?

Options:

A.

Create a service account key and add it to the GitHub pipeline configuration file.

B.

Create a service account key and add it to the GitHub repository content.

C.

Configure a Google Kubernetes Engine cluster that uses Workload Identity to supply credentials to GitHub.

D.

Configure workload identity federation to use GitHub as an identity pool provider.

Question 61

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.

Admin Activity

B.

System Event

C.

Access Transparency

D.

Data Access

Question 62

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.

Question 63

You run applications on Cloud Run. You already enabled container analysis for vulnerability scanning. However, you are concerned about the lack of control on the applications that are deployed. You must ensure that only trusted container images are deployed on Cloud Run.

What should you do?

Choose 2 answers

Options:

A.

Enable Binary Authorization on the existing Kubernetes cluster.

B.

Set the organization policy constraint constraints/run.allowedBinaryAuthorizationPolicies to the list of allowed Binary Authorization policy names.

C.

Set the organization policy constraint constraints/compute.trustedImageProjects to the list of projects that contain the trusted container images.

D.

Enable Binary Authorization on the existing Cloud Run service.

E.

Use Cloud Run breakglass to deploy an image that meets the Binary Authorization policy by default.

Question 64

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

Options:

A.

BigQuery datasets

B.

Cloud Storage buckets

C.

StackDriver logging

D.

Cloud Pub/Sub topics

Question 65

You want to make sure that your organization’s Cloud Storage buckets cannot have data publicly available to the internet. You want to enforce this across all Cloud Storage buckets. What should you do?

Options:

A.

Remove Owner roles from end users, and configure Cloud Data Loss Prevention.

B.

Remove Owner roles from end users, and enforce domain restricted sharing in an organization policy.

C.

Configure uniform bucket-level access, and enforce domain restricted sharing in an organization policy.

D.

Remove *.setIamPolicy permissions from all roles, and enforce domain restricted sharing in an organization policy.
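
Option C's uniform bucket-level access is a per-bucket setting (the organization-wide guardrail itself comes from organization policies such as domain restricted sharing or public access prevention). Below is a small sketch of the bucket-level part with the google-cloud-storage Python client, using a placeholder bucket name:

    from google.cloud import storage  # assumes the google-cloud-storage client library

    BUCKET_NAME = "corp-internal-data"  # placeholder bucket name

    client = storage.Client()
    bucket = client.get_bucket(BUCKET_NAME)

    # Disable per-object ACLs so access is governed by IAM only.
    bucket.iam_configuration.uniform_bucket_level_access_enabled = True

    # Reject allUsers / allAuthenticatedUsers grants on this bucket
    # (attribute available in newer library versions).
    bucket.iam_configuration.public_access_prevention = "enforced"

    bucket.patch()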

Question 66

You are using Security Command Center (SCC) to protect your workloads and receive alerts for suspected security breaches at your company. You need to detect cryptocurrency mining software.

Which SCC service should you use?

Options:

A.

Container Threat Detection

B.

Web Security Scanner

C.

Rapid Vulnerability Detection

D.

Virtual Machine Threat Detection

Question 67

You need to connect your organization's on-premises network with an existing Google Cloud environment that includes one Shared VPC with two subnets named Production and Non-Production. You are required to:

Use a private transport link.

Configure access to Google Cloud APIs through private API endpoints originating from on-premises environments.

Ensure that Google Cloud APIs are only consumed via VPC Service Controls.

What should you do?

Options:

A.

1. Set up a Cloud VPN link between the on-premises environment and Google Cloud.

2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

B.

1. Set up a Partner Interconnect link between the on-premises environment and Google Cloud.

2. Configure private access using the private.googleapis.com domains in on-premises DNS configurations.

C.

1. Set up a Direct Peering link between the on-premises environment and Google Cloud.

2. Configure private access for both VPC subnets.

D.

1. Set up a Dedicated Interconnect link between the on-premises environment and Google Cloud.

2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

Question 68

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports, including the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.